WorldWideScience

Sample records for applying geostatistical analysis

  1. Geostatistics applied to uranium mineral

    International Nuclear Information System (INIS)

    The concepts of geostatistics are introduced within the framework of the theory of regionalized variables, in order to apply them to the reserve-estimation problem, since through this theory the structural characteristics of the variable under consideration can be studied. We therefore suggest a geostatistical study of the deposit before mining work begins, in order to obtain an estimate of the reserves. Finally, a geostatistical study of the sedimentary-type uranium deposit called ''La Coma'' is carried out

  2. UNCERT: geostatistics, uncertainty analysis and visualization software applied to groundwater flow and contaminant transport modeling

    International Nuclear Information System (INIS)

    UNCERT is a 2D and 3D geostatistics, uncertainty analysis and visualization software package applied to ground water flow and contaminant transport modeling. It is a collection of modules that provides tools for linear regression, univariate statistics, semivariogram analysis, inverse-distance gridding, trend-surface analysis, simple and ordinary kriging, and discrete conditional indicator simulation. Graphical user interfaces for MODFLOW and MT3D, ground water flow and contaminant transport models, are provided for streamlined data input and result analysis. Visualization tools are included for displaying data input and output. These include, but are not limited to, 2D and 3D scatter plots, histograms, box-and-whisker plots, 2D contour maps, surface renderings of 2D gridded data and 3D views of gridded data. By design, UNCERT's graphical user interface and visualization tools facilitate model design and analysis. There are few built-in restrictions on data set sizes, and each module (with two exceptions) can be run in either graphical or batch mode. UNCERT is in the public domain and is available from the World Wide Web with complete on-line and printable (PDF) documentation. UNCERT is written in ANSI C with a small amount of FORTRAN 77, for UNIX workstations running X Windows and Motif (or Lesstif). This article discusses the features of each module and demonstrates how they can be used individually and in combination. The tools are applicable to a wide range of fields and are currently used by researchers in ground water, mining, mathematics, chemistry and geophysics, among other disciplines. (Copyright (c) 1999 Elsevier Science B.V., Amsterdam. All rights reserved.)

  3. UNCERT: geostatistics, uncertainty analysis and visualization software applied to groundwater flow and contaminant transport modeling

    Science.gov (United States)

    Wingle, William L.; Poeter, Eileen P.; McKenna, Sean A.

    1999-05-01

    UNCERT is a 2D and 3D geostatistics, uncertainty analysis and visualization software package applied to ground water flow and contaminant transport modeling. It is a collection of modules that provides tools for linear regression, univariate statistics, semivariogram analysis, inverse-distance gridding, trend-surface analysis, simple and ordinary kriging, and discrete conditional indicator simulation. Graphical user interfaces for MODFLOW and MT3D, ground water flow and contaminant transport models, are provided for streamlined data input and result analysis. Visualization tools are included for displaying data input and output. These include, but are not limited to, 2D and 3D scatter plots, histograms, box-and-whisker plots, 2D contour maps, surface renderings of 2D gridded data and 3D views of gridded data. By design, UNCERT's graphical user interface and visualization tools facilitate model design and analysis. There are few built-in restrictions on data set sizes, and each module (with two exceptions) can be run in either graphical or batch mode. UNCERT is in the public domain and is available from the World Wide Web with complete on-line and printable (PDF) documentation. UNCERT is written in ANSI C with a small amount of FORTRAN 77, for UNIX workstations running X Windows and Motif (or Lesstif). This article discusses the features of each module and demonstrates how they can be used individually and in combination. The tools are applicable to a wide range of fields and are currently used by researchers in ground water, mining, mathematics, chemistry and geophysics, among other disciplines.

  4. Geostatistical methods applied to field model residuals

    DEFF Research Database (Denmark)

    Maule, Fox; Mosegaard, K.; Olsen, Nils

    The geomagnetic field varies on a variety of time- and length-scales, which are only rudimentarily considered in most present field models. The part of the observed field that cannot be explained by a given model, the model residuals, is often considered an estimate of the data uncertainty (which consists of measurement errors and unmodelled signal), and is typically assumed to be uncorrelated and Gaussian distributed. We have applied geostatistical methods to analyse the residuals of the Oersted(09d/04) field model [http://www.dsri.dk/Oersted/Field_models/IGRF_2005_candidates/], which is based on 5 years of Ørsted and CHAMP data, and includes secular variation and acceleration, as well as low-degree external (magnetospheric) and induced fields. The analysis is done in order to find the statistical behaviour of the space-time structure of the residuals, as a proxy for the data covariances...

  5. Computational system for geostatistical analysis

    Directory of Open Access Journals (Sweden)

    Vendrusculo Laurimar Gonçalves

    2004-01-01

    Full Text Available Geostatistics identifies the spatial structure of variables representing several phenomena, and its use is becoming more intense in agricultural activities. This paper describes a computer program, based on Windows interfaces (Borland Delphi), which performs spatial analyses of datasets through geostatistical tools: classical statistical calculations, averages, cross- and directional semivariograms, simple kriging estimates and jackknifing calculations. A published dataset of soil carbon and nitrogen was used to validate the system. The system proved useful for the geostatistical analysis process, replacing the manipulation of computational routines in an MS-DOS environment. The Windows development approach allowed the user to model the semivariogram graphically with a greater degree of interactivity, a functionality rarely available in similar programs. Given its rapid prototyping and the simplicity with which correlated routines can be incorporated, the main advantage of the Delphi environment is that it permits this system to evolve.

  6. Geostatistics

    International Nuclear Information System (INIS)

    The use of statistical methodology in the Hydrogeochemical and Stream Sediment Reconnaissance Program of the National Uranium Resource Evaluation (NURE) Program is useful in identifying local and regional trends related to uranium provinces and districts. Additionally, appropriate statistical summarization of the data is important in any subsequent evaluation. Since as many as 190,000 samples are to be collected during the Oak Ridge Program, it is necessary to computerize most statistical procedures to enable timely release of the data. These automated procedures are developed through the interaction of geologists, statisticians, and computer personnel. This team effort is designed to efficiently produce meaningful geologic results that are statistically sound. The following discussion will present a brief explanation of the regional technique of weighted sum contouring and a localized technique of cluster analysis. In both of these methods, the geologist constructs a geochemical model by selecting appropriate uranium related parameters and assigning the relative importance to each parameter in the model. The model can then be evaluated by relating the patterns defined by the analysis to the geologic formations and any known uranium mineralization

  7. Geostatistics and Analysis of Spatial Data

    OpenAIRE

    Nielsen, Allan Aasbjerg

    2007-01-01

    This note deals with geostatistical measures for spatial correlation, namely the auto-covariance function and the semi-variogram, as well as deterministic and geostatistical methods for spatial interpolation, namely inverse distance weighting and kriging. Some semi-variogram models are mentioned, specifically the spherical, the exponential and the Gaussian models. Equations to carry out simple and ordinary kriging are deduced. Other types of kriging are mentioned, and references to international literature, Internet addresses and state-of-the-art software in the field are given...
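For concreteness, the three semi-variogram models named in this note can be sketched in a few lines of NumPy. This is a minimal illustration, not code from the note itself; the parameterization (nugget c0, partial sill c, range a, with the conventional practical-range factor of 3 in the exponential and Gaussian forms) is one common convention among several.

```python
import numpy as np

# Spherical, exponential and Gaussian semi-variogram models.
# h: lag distance(s); c0: nugget; c: partial sill; a: (practical) range.
# By definition gamma(0) = 0; the nugget appears as a jump for h > 0.

def spherical(h, c0, c, a):
    h = np.asarray(h, dtype=float)
    g = np.where(h < a, c0 + c * (1.5 * h / a - 0.5 * (h / a) ** 3), c0 + c)
    return np.where(h == 0.0, 0.0, g)

def exponential(h, c0, c, a):
    h = np.asarray(h, dtype=float)
    return np.where(h == 0.0, 0.0, c0 + c * (1.0 - np.exp(-3.0 * h / a)))

def gaussian(h, c0, c, a):
    h = np.asarray(h, dtype=float)
    return np.where(h == 0.0, 0.0, c0 + c * (1.0 - np.exp(-3.0 * (h / a) ** 2)))
```

The spherical model reaches its sill exactly at h = a, while the exponential and Gaussian models only approach it asymptotically, which is why a "practical range" convention is applied to the latter two.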

  8. Geostatistics applied to estimation of uranium bearing ore reserves

    International Nuclear Information System (INIS)

    A computer-assisted method for assessing uranium-bearing ore deposit reserves is analyzed. Determinations of quality-thickness, i.e. quality multiplied by the thickness of the mineralization, were obtained for each drill-hole layer by means of the theory of regionalized variables. Geostatistical results were derived using a Fortran computer program on a DEC 20/40 system. (author)

  9. Geostatistics and Analysis of Spatial Data

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg

    2007-01-01

    This note deals with geostatistical measures for spatial correlation, namely the auto-covariance function and the semi-variogram, as well as deterministic and geostatistical methods for spatial interpolation, namely inverse distance weighting and kriging. Some semi-variogram models are mentioned, specifically the spherical, the exponential and the Gaussian models. Equations to carry out simple and ordinary kriging are deduced. Other types of kriging are mentioned, and references to international literature, Internet addresses and state-of-the-art software in the field are given. A very simple example to illustrate the computations and a more realistic example with height data from an area near Slagelse, Denmark, are given. Finally, a series of attractive characteristics of kriging are mentioned, and a simple sampling strategic consideration is given based on the dependence of the kriging variance of...

  10. Diagnostic techniques applied in geostatistics for agricultural data analysis

    Directory of Open Access Journals (Sweden)

    Joelmir André Borssoi

    2009-12-01

    The structural modeling of spatial dependence, using a geostatistical approach, is an indispensable tool to determine the parameters that define this structure, which are applied in the interpolation of values at unsampled points by kriging techniques. However, parameter estimation can be greatly affected by the presence of atypical observations in the sampled data. The purpose of this study was to use diagnostic techniques for Gaussian spatial linear models in geostatistics to evaluate the sensitivity of the maximum likelihood and restricted maximum likelihood estimators to small perturbations in the data. For this purpose, studies with simulated and experimental data were conducted. Results with simulated data showed that the diagnostic techniques were efficient at identifying the perturbation in the data. The results with real data indicated that atypical values among the sampled data may have a strong influence on thematic maps, thus changing the spatial dependence structure. The application of diagnostic techniques should be part of any geostatistical analysis, to ensure a better quality of the information drawn from thematic maps.

  11. Geostatistical analysis and kriging of Hexachlorocyclohexane residues in topsoil from Tianjin, China

    International Nuclear Information System (INIS)

    A previously published data set of HCH isomer concentrations in topsoil samples from Tianjin, China, was subjected to geospatial analysis. Semivariograms were calculated and modeled using geostatistical techniques. Parameters of the semivariogram models were analyzed and compared for four HCH isomers. Two-dimensional ordinary block kriging was applied to the HCH isomer data set for mapping purposes. Dot maps and gray-scale raster maps of HCH concentrations were produced from the kriging results. The appropriateness of the kriging procedure for mapping purposes was evaluated based on the kriging errors and kriging variances. It was found that ordinary block kriging can be applied to interpolate HCH concentrations in Tianjin topsoil with acceptable accuracy for mapping purposes. - Geostatistical analysis and kriging were applied to HCH concentrations in the topsoil of Tianjin, China, for mapping purposes
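As background for the kriging workflow this record describes, a bare-bones ordinary (point) kriging step can be sketched as follows. The coordinates, values and exponential-variogram parameters here are invented for illustration; block kriging, as used in the study, would additionally average the right-hand-side semivariances over each block.

```python
import numpy as np

def gamma_exp(h, sill=1.0, rng=50.0):
    """Illustrative exponential semivariogram (no nugget)."""
    return sill * (1.0 - np.exp(-3.0 * h / rng))

def ordinary_kriging(xy, z, target):
    """Ordinary kriging estimate and kriging variance at one point."""
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    # Kriging system: pairwise semivariances bordered by the
    # unbiasedness constraint (weights sum to one; mu is the
    # Lagrange multiplier in the last row/column).
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = gamma_exp(d)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = gamma_exp(np.linalg.norm(xy - target, axis=1))
    w = np.linalg.solve(A, b)
    estimate = w[:n] @ z
    variance = w @ b  # equals sum_i w_i * gamma(x_i, x0) + mu
    return estimate, variance

# Four samples at the corners of a square, estimated at the centre:
xy = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
z = np.array([1.0, 2.0, 3.0, 4.0])
est, var = ordinary_kriging(xy, z, np.array([5.0, 5.0]))
```

By symmetry all four weights equal 0.25, so the estimate is the sample mean; the second output is the kriging variance, the quantity used above to judge the appropriateness of the maps.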

  12. Data analysis for radiological characterisation: Geostatistical and statistical complementarity

    International Nuclear Information System (INIS)

    Radiological characterisation may cover a large range of evaluation objectives during a decommissioning and dismantling (D and D) project: removal of doubt, delineation of contaminated materials, monitoring of the decontamination work and the final survey. At each stage, collecting the data needed to draw sound conclusions is a considerable challenge. In particular, two radiological characterisation stages require an advanced sampling process and data analysis, namely the initial categorization and optimisation of the materials to be removed, and the final survey to demonstrate compliance with clearance levels. The latter is widely used and well developed in national guides and norms, using random sampling designs and statistical data analysis. By contrast, a more complex evaluation methodology has to be implemented for the initial radiological characterisation, both for the sampling design and for the data analysis. The geostatistical framework is an efficient way to satisfy the radiological characterisation requirements, providing a sound decision-making approach for the decommissioning and dismantling of nuclear premises. The relevance of the geostatistical methodology relies on the presence of spatial continuity in the radiological contamination. Geostatistics thus provides reliable methods for activity estimation, uncertainty quantification and risk analysis, leading to a sound classification of radiological waste (surfaces and volumes). In this way, the radiological characterisation of contaminated premises can be divided into three steps. First, the most exhaustive facility analysis possible provides historical and qualitative information. Then, a systematic (exhaustive or not) surface survey of the contamination is implemented on a regular grid. Finally, in order to assess activity levels and contamination depths, destructive samples are collected at several locations within the premises (based on the surface survey results) and analysed. Combined with...

  13. Mercury emissions from coal combustion in Silesia, analysis using geostatistics

    Science.gov (United States)

    Zasina, Damian; Zawadzki, Jaroslaw

    2015-04-01

    Data provided by the UNEP report on mercury [1] show that solid fuel combustion is a significant source of mercury emission to air. Silesia, located in southwestern Poland, is notably affected by mercury emission because it is one of the most industrialized Polish regions: a place of coal mining, metal production, stone mining, mineral quarrying and chemical industry. Moreover, Silesia is a region with high population density. People are exposed to severe risk from mercury emitted by both industrial and domestic sources (i.e. small household furnaces). Small sources contribute significantly to the total emission of mercury. Official statistical analyses, including those prepared for international purposes [2], did not provide data on the spatial distribution of the mercury emitted to air, although a number of analyses of the Polish public power and energy sector have been prepared so far [3; 4]. The distribution of locations exposed to mercury emission from small domestic sources is an interesting matter, merging information from various sources: statistical, economic and environmental. This paper presents a geostatistical approach to the distribution of mercury emission from coal combustion. The analysed data are organized on two independent levels: an individual, bottom-up level derived from the national emission reporting system [5; 6], and a top-down level of regional data calculated from official statistics [7]. The analysis to be presented will compare the spatial distributions of mercury emission using data derived from the sources mentioned above. The investigation will cover three voivodeships of Poland: Lower Silesian, Opole and Silesian, using selected geostatistical methodologies including ordinary kriging [8]. References [1] UNEP. Global Mercury Assessment 2013: Sources, Emissions, Releases and Environmental Transport. UNEP Chemicals Branch, Geneva, Switzerland, 2013. [2] NCEM. Poland's Informative Inventory Report 2014. NCEM at the IEP-NRI, 2014. http

  14. Geostatistical techniques applied to mapping limnological variables and quantify the uncertainty associated with estimates

    Directory of Open Access Journals (Sweden)

    Cristiano Cigagna

    2015-12-01

    Abstract Aim: This study aimed to map the concentrations of limnological variables in a reservoir, employing geostatistical semivariogram techniques and kriging estimates for unsampled locations, as well as calculating the uncertainty associated with the estimates. Methods: Twenty-seven sampling points were established, distributed on a regular mesh. The concentrations of chlorophyll-a, total nitrogen and total phosphorus were then determined. Subsequently, a spatial variability analysis was performed, the semivariogram function was modeled for all variables, and variographic mathematical models were established. The main geostatistical estimation technique was ordinary kriging. A dense grid of points was estimated for each variable, forming the basis of the interpolated maps. Results: Semivariogram analysis identified the random component as not significant for the estimation of chlorophyll-a, and as significant for total nitrogen and total phosphorus. Geostatistical maps were produced from the kriging for each variable, and the respective standard deviations of the estimates were calculated. These measurements allowed us to map the concentrations of limnological variables throughout the reservoir. The calculated standard deviations indicated the quality of the estimates and, consequently, the reliability of the final product. Conclusions: The use of kriging to estimate a dense mesh of points, together with the error dispersion (standard deviation of the estimate), made it possible to produce quality, reliable maps of the estimated variables. Concentrations of limnological variables were in general higher in the lacustrine zone and decreased towards the riverine zone. Chlorophyll-a and total nitrogen were correlated when comparing the grids generated by kriging. Although the use of kriging is more laborious compared to other interpolation methods, this...

  15. Geostatistical case studies

    International Nuclear Information System (INIS)

    The objective of this volume of contributed chapters is to present a series of applications of geostatistics. These range from a careful variographic analysis of uranium data, through detailed studies of geologically complex deposits, right up to the latest nonlinear methods applied to deposits with highly skewed data distributions. Applications of new techniques, such as the external drift method for combining well data with seismic information, have also been included. The volume emphasizes geostatistics in practice. Notation has been kept to a minimum and mathematical details have been relegated to annexes

  16. Reducing spatial uncertainty in climatic maps through geostatistical analysis

    Science.gov (United States)

    Pesquer, Lluís; Ninyerola, Miquel; Pons, Xavier

    2014-05-01

    [...] RMS error values (mm) obtained from the independent test set (20% of the samples) when applying the different interpolation methods/parameters are shown below, in the order IDW (exponent = 1.5, 2, 2.5, 3) / SPT (tension = 100, 125, 150, 175, 200) / OK:
    LOOCV: 92.5; 80.2; 74.2; 72.3 / 181.6; 90.6; 75.7; 71.1; 69.4; 68.8
    RS: 101.2; 89.6; 83.9; 81.9 / 115.1; 92.4; 84.0; 81.4; 80.9; 81.1 / 81.1
    EU: 57.4; 51.3; 53.1; 55.5 / 59.1; 57.1; 55.9; 55.0; 54.3 / 51.8
    A3D: 48.3; 49.8; 52.5; 62.2 / 57.1; 54.4; 52.5; 51.2; 50.2 / 49.7
    To study these results, a geostatistical analysis of uncertainty has been carried out. Main results: variogram analysis of the error (using the test set) shows that the total sill is reduced (50% EU, 60% A3D) when using the two new approaches, while the spatialized standard deviation model calculated from the OK shows significantly lower values when compared to the RS. In conclusion, A3D and EU greatly improve on LOOCV and RS, whereas A3D slightly improves on EU. Also, LOOCV shows only slightly better results than RS, suggesting that the non-random split increases the power of both fitting-test steps. * Ninyerola, Pons, Roure. A methodological approach of climatological modelling of air temperature and precipitation through GIS techniques. IJC, 2000; 20:1823-1841.
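Of the interpolators compared in this record, IDW is the simplest; a generic textbook sketch (with the power parameter whose values 1.5 to 3 the RMS comparison varies) might look like this. This is not the authors' implementation, and the sample coordinates and values are invented.

```python
import numpy as np

def idw(xy, z, targets, exponent=2.0, eps=1e-12):
    """Inverse distance weighting: each prediction is a weighted
    average of the samples, with weights 1 / distance**exponent.
    eps guards against division by zero at sample locations."""
    d = np.linalg.norm(targets[:, None, :] - xy[None, :, :], axis=-1)
    w = 1.0 / np.maximum(d, eps) ** exponent
    return (w @ z) / w.sum(axis=1)

# Two samples; predict at a sample location and at the midpoint.
xy = np.array([[0.0, 0.0], [1.0, 0.0]])
z = np.array([1.0, 3.0])
pred = idw(xy, z, np.array([[0.0, 0.0], [0.5, 0.0]]), exponent=2.0)
```

Larger exponents localize the interpolation around nearby samples, which is why the RMS figures above change systematically with the exponent.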

  17. Geostatistics applied to the study of the spatial distribution of Tibraca limbativentris in flooded rice fields

    Directory of Open Access Journals (Sweden)

    Juliano de Bastos Pazini

    2015-06-01

    Tibraca limbativentris (rice stem bug) is an insect highly injurious to the rice crop in Brazil. The aim of this research was to define the spatial distribution of T. limbativentris and to improve the sampling process by means of geostatistical techniques and the construction of prediction maps in a flooded rice field located in the "Planalto da Campanha" region, Rio Grande do Sul (RS), Brazil. The experiments were conducted in rice crops in the municipality of Itaqui, RS, in the crop years 2009/10, 2010/11 and 2011/12, counting fortnightly the number of nymphs and adults on a georeferenced grid with points spaced at 50 m in the first year and at 10 m in the other years. A geostatistical analysis was performed by fitting semivariograms and interpolating the numeric data by kriging, to verify the spatial dependence and for the subsequent population mapping. The results obtained indicated that the rice stem bug, T. limbativentris, has a strong spatial dependence. The prediction maps allow the population density of the pest to be estimated and the spatial distribution in flooded rice fields to be visualized, enabling improvement of the traditional sampling method for the rice stem bug.

  18. A geostatistical method applied to the geochemical study of the Chichinautzin Volcanic Field in Mexico

    Science.gov (United States)

    Robidoux, P.; Roberge, J.; Urbina Oviedo, C. A.

    2011-12-01

    The origin of magmatism and the role of the subducted Cocos Plate in the Chichinautzin volcanic field (CVF), Mexico, is still a subject of debate. It has been established that mafic magmas of alkali type (subduction) and calc-alkali type (OIB) are produced in the CVF, and the two groups cannot be related by simple fractional crystallization. Therefore, many geochemical studies have been done, and many models have been proposed. The main goal of the work presented here is to provide a new tool for the visualization and interpretation of geochemical data using geostatistics and geospatial analysis techniques. It contains a complete geodatabase built from referenced samples over the 2500 km2 area of the CVF and its neighbouring stratovolcanoes (Popocatepetl, Iztaccihuatl and Nevado de Toluca). From this database, maps of different geochemical markers were produced to visualize the geochemical signature geographically, to test the statistical distribution with a cartographic technique, and to highlight any spatial correlations. The distribution and regionalization of the geochemical signatures can be viewed in two-dimensional space using specific spatial analysis tools from a Geographic Information System (GIS). The model of spatial distribution is tested with Linear Decrease (LD) and Inverse Distance Weighted (IDW) interpolation techniques because they best represent the geostatistical characteristics of the geodatabase. We found that the ratios Ba/Nb, Nb/Ta and Th/Nb show a first-order tendency, meaning visible spatial variation over a large-scale area. Monogenetic volcanoes in the center of the CVF have distinct values compared to those of the Popocatepetl-Iztaccihuatl polygenetic complex, which are spatially well defined. Inside the Valley of Mexico, a large number of monogenetic cones in the eastern portion of the CVF have ratios similar to the Iztaccihuatl and Popocatepetl complex. Other ratios, like alkalis vs SiO2, V/Ti, La/Yb and Zr/Y, show different spatial tendencies. In that case, second...

  19. Bayesian Analysis of Geostatistical Models With an Auxiliary Lattice

    KAUST Repository

    Park, Jincheol

    2012-04-01

    The Gaussian geostatistical model has been widely used for modeling spatial data. However, this model suffers from a severe difficulty in computation: it requires users to invert a large covariance matrix. This is infeasible when the number of observations is large. In this article, we propose an auxiliary lattice-based approach for tackling this difficulty. By introducing an auxiliary lattice to the space of observations and defining a Gaussian Markov random field on the auxiliary lattice, our model completely avoids the requirement of matrix inversion. It is remarkable that the computational complexity of our method is only O(n), where n is the number of observations. Hence, our method can be applied to very large datasets with reasonable computational (CPU) times. The numerical results indicate that our model can approximate Gaussian random fields very well in terms of predictions, even for those with long correlation lengths. For real data examples, our model can generally outperform conventional Gaussian random field models in both prediction errors and CPU times. Supplemental materials for the article are available online. © 2012 American Statistical Association, Institute of Mathematical Statistics, and Interface Foundation of North America.

  20. 3D Geostatistical Modeling and Uncertainty Analysis in a Carbonate Reservoir, SW Iran

    OpenAIRE

    Mohammad Reza Kamali; Azadeh Omidvar; Ezatallah Kazemzadeh

    2013-01-01

    The aim of geostatistical reservoir characterization is to utilize a wide variety of data, at different scales and accuracies, to construct reservoir models that are able to represent geological heterogeneities and also to quantify uncertainties by producing a number of equiprobable models. Since all geostatistical methods used in the estimation of reservoir parameters are inexact, modeling the “estimation error” in the form of an uncertainty analysis is very important. In this paper, the definition of ...

  1. Geostatistical analysis using K-splines in the geoadditive model

    OpenAIRE

    Vandendijck, Yannick; Faes, Christel; Hens, Niel

    2015-01-01

    In geostatistics, both kriging and smoothing splines are commonly used to predict a quantity of interest. The geoadditive model proposed by Kammann and Wand (2003) represents a fusion of kriging and penalized spline additive models. The fact that the underlying spatial covariance structure is poorly estimated using geoadditive models is a drawback. We describe K-splines, an extension of geoadditive models such that estimation of the underlying spatial process parameters and predictions ...

  2. Geostatistical analysis of prevailing groundwater conditions and potential solute migration at Elstow, Bedfordshire

    International Nuclear Information System (INIS)

    A geostatistical approach is applied in a study of the potential migration of contaminants from a hypothetical waste disposal facility near Elstow, Bedfordshire. A deterministic numerical model of groundwater flow in the Kellaways Sands formation and adjacent layers is coupled with geostatistical simulation of the heterogeneous transmissivity field of this principal formation. A particle tracking technique is used to predict the migration pathways for alternative realisations of flow. Alternative statistical descriptions of the spatial structure of the transmissivity field are implemented and the temporal and spatial distributions of escape of contaminants to the biosphere are investigated. (author)

  3. Combining Geostatistics with Moran’s I Analysis for Mapping Soil Heavy Metals in Beijing, China

    Directory of Open Access Journals (Sweden)

    Bao-Guo Li

    2012-03-01

    Production of high-quality interpolation maps of heavy metals is important for risk assessment of environmental pollution. In this paper, the spatial correlation information obtained from Moran's I analysis was used to supplement traditional geostatistics. From the Moran's I analysis, four characteristic distances were obtained and used as the active lag distance for calculating the semivariance. Validation of the optimality of the semivariance demonstrated that using as the active lag distance the two distances at which Moran's I and the standardized Moran's I, Z(I), reached a maximum can improve the fitting accuracy of the semivariance. Spatial interpolation was then produced based on the two distances and their nested model. Comparative analysis of the estimation accuracy, and of the measured and predicted pollution status, showed that the method combining geostatistics with Moran's I analysis was better than traditional geostatistics. Thus, Moran's I analysis is a useful complement to geostatistics for improving the spatial interpolation accuracy of heavy metals.
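A minimal sketch of the global Moran's I statistic used in this record, assuming a precomputed spatial weight matrix with zero diagonal. The weight schemes below are illustrative assumptions; the paper derives its characteristic distances by evaluating the statistic over a range of lag distances.

```python
import numpy as np

def morans_i(values, weights):
    """Global Moran's I: (n / S0) * (z'Wz) / (z'z), where z is the
    centred value vector, W the spatial weight matrix (zero diagonal)
    and S0 the sum of all weights."""
    z = values - values.mean()
    n = len(values)
    return (n * (z @ weights @ z)) / (weights.sum() * (z @ z))

# Adjacency on a 4-node path and a 4-node ring: a smooth gradient is
# positively autocorrelated; an alternating pattern on the ring is
# perfectly negatively autocorrelated (I = -1).
path_w = np.array([[0, 1, 0, 0],
                   [1, 0, 1, 0],
                   [0, 1, 0, 1],
                   [0, 0, 1, 0]], dtype=float)
ring_w = np.array([[0, 1, 0, 1],
                   [1, 0, 1, 0],
                   [0, 1, 0, 1],
                   [1, 0, 1, 0]], dtype=float)
i_gradient = morans_i(np.array([1.0, 2.0, 3.0, 4.0]), path_w)
i_checker = morans_i(np.array([1.0, -1.0, 1.0, -1.0]), ring_w)
```

In the paper's setting, the weight matrix encodes which sample pairs fall within a given lag distance, and the lags maximizing I and Z(I) become the active lag distances for the semivariance.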

  4. GEOSTATISTICS APPLIED TO THE STUDY OF SOIL PHYSIOCHEMICAL CHARACTERISTICS IN SEASONAL DECIDUOUS FOREST AREAS

    OpenAIRE

    Eleandro J. Brun; Carlos R. S. da Silva; Sandro Vaccaro; Rubens M. Rondon Neto

    2010-01-01

    Geostatistical methods were used to identify the magnitude and structure of the spatial variability of some physicochemical attributes of soils under seasonal deciduous forest areas, which were called mature forest, secondary forest and “capoeirão”. The areas, located in Santa Tereza, RS, were sampled during 2002 and 2003, comprising the soil classes Argiluvic Chernosol, Cambisol Ta and Litholic Neosol. Systematic sampling was performed on a regular grid with point spacing varying...

  5. A Practical Primer on Geostatistics

    Science.gov (United States)

    Olea, Ricardo A.

    2009-01-01

    [...] significant methodological implications. HISTORICAL REMARKS As a discipline, geostatistics was firmly established in the 1960s by the French engineer Georges Matheron, who was interested in the appraisal of ore reserves in mining. Geostatistics did not develop overnight. Like other disciplines, it has built on previous results, many of which were formulated with different objectives in various fields. PIONEERS Seminal ideas conceptually related to what today we call geostatistics or spatial statistics are found in the work of several pioneers, including: 1940s: A.N. Kolmogorov in turbulent flow and N. Wiener in stochastic processing; 1950s: D. Krige in mining; 1960s: B. Matérn in forestry and L.S. Gandin in meteorology. CALCULATIONS Serious applications of geostatistics require the use of digital computers. Although for most geostatistical techniques rudimentary implementation from scratch is fairly straightforward, coding programs from scratch is recommended only as part of a practice that may help users gain a better grasp of the formulations. SOFTWARE For professional work, the reader should employ software packages that have been thoroughly tested to handle any sampling scheme, that run as efficiently as possible, and that offer graphic capabilities for the analysis and display of results. This primer primarily employs the Stanford Geostatistical Modeling Software (SGeMS) - recently developed at the Energy Resources Engineering Department at Stanford University - as a way to show how to obtain results practically. This applied side of the primer should not be interpreted as the notes being a manual for the use of SGeMS. The main objective of the primer is to help the reader gain an understanding of the fundamental concepts and tools in geostatistics. ORGANIZATION OF THE PRIMER The chapters of greatest importance are those covering kriging and simulation. All other materials are peripheral and are included for better comprehension of th...

  6. Bayesian Geostatistical Design

    DEFF Research Database (Denmark)

    Diggle, Peter; Lophaven, Søren Nymand

    2006-01-01

    This paper describes the use of model-based geostatistics for choosing the set of sampling locations, collectively called the design, to be used in a geostatistical analysis. Two types of design situation are considered. These are retrospective design, which concerns the addition of sampling...

  7. Incorporating temporal variability to improve geostatistical analysis of satellite-observed CO2 in China

    Institute of Scientific and Technical Information of China (English)

    ZENG ZhaoCheng; LEI LiPing; GUO LiJie; ZHANG Li; ZHANG Bing

    2013-01-01

Observations of atmospheric carbon dioxide (CO2) from satellites offer new data sources for understanding global carbon cycling. The correlation structure of satellite-observed CO2 can be analyzed and modeled by geostatistical methods, and CO2 values at unsampled locations can be predicted with a correlation model. Conventional geostatistical analysis investigates only the spatial correlation of CO2 and does not consider temporal variation in the satellite-observed CO2 data. In this paper, a spatiotemporal geostatistical method that incorporates temporal variability is implemented and assessed for analyzing the spatiotemporal correlation structure and predicting monthly CO2 in China. The spatiotemporal correlation is estimated and modeled by a product-sum variogram model with a global nugget component. The variogram result indicates a significant degree of temporal correlation within satellite-observed CO2 data sets in China. Prediction of monthly CO2 using the spatiotemporal variogram model and a space-time kriging procedure is implemented. The prediction is compared with a spatial-only geostatistical prediction approach using a cross-validation technique. The spatiotemporal approach gives better results, with a higher correlation coefficient (r2) and lower mean absolute prediction error and root mean square error. Moreover, the monthly mapping result generated from the spatiotemporal approach has less prediction uncertainty and more detailed spatial variation of CO2 than that from the spatial-only approach.
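The product-sum space-time variogram with a global nugget described above can be sketched as follows. This is a hypothetical illustration: the spherical component models, the parameter values and the coupling coefficient `k` are assumptions, not the authors' fitted values.

```python
import numpy as np

def spherical(h, sill, rng):
    """Spherical variogram component (no nugget): rises to sill at range rng."""
    h = np.minimum(np.asarray(h, float) / rng, 1.0)
    return sill * (1.5 * h - 0.5 * h ** 3)

def product_sum(hs, ht, s_sill, s_rng, t_sill, t_rng, k, nugget=0.0):
    """Product-sum space-time variogram:
    gamma(hs, ht) = gamma_s(hs) + gamma_t(ht) - k * gamma_s(hs) * gamma_t(ht),
    plus a global nugget applied to all nonzero lags."""
    gs = spherical(hs, s_sill, s_rng)   # spatial component
    gt = spherical(ht, t_sill, t_rng)   # temporal component
    g = gs + gt - k * gs * gt
    at_origin = (np.asarray(hs) == 0) & (np.asarray(ht) == 0)
    return np.where(at_origin, 0.0, nugget + g)
```

At the origin the variogram is zero by definition; at lags beyond both ranges it levels off at the combined sill `s_sill + t_sill - k * s_sill * t_sill` plus the nugget.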

  8. Geostatistical and stochastic inverse analysis of the Wolfcamp aquifer

    International Nuclear Information System (INIS)

The quantification of uncertainty is critical in both the performance assessment and licensing of a high-level waste repository. Under expected conditions, repository performance is dominated by continuous processes, with far-field flow being one of these key processes. We present a kriging analysis of potentiometric data in the Wolfcamp aquifer of the Palo Duro Basin in northern Texas, combined with a stochastic inverse analysis using adjoint sensitivity theory. These results may be used to identify additional data needs. A careful development of the steps of the kriging analysis is presented. Universal kriging, which combines fitting a polynomial trend surface with ordinary kriging, is shown to provide a good fit to the isotropic potentiometric data. The necessary adjoint sensitivity theory is applied to back-solve for the hydraulic conductivities via an iterative stochastic inverse method
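The universal kriging used here combines a polynomial trend surface with ordinary kriging of the residuals. A minimal ordinary-kriging sketch is shown below, with hypothetical data and an assumed linear variogram rather than the report's fitted model:

```python
import numpy as np

def ordinary_kriging(coords, z, x0, variogram):
    """Ordinary kriging estimate at x0, given observation coordinates, values z,
    and a variogram function gamma(h) applied elementwise to distances."""
    coords = np.asarray(coords, float)
    z = np.asarray(z, float)
    n = len(coords)
    # Pairwise distances among observations.
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    # Kriging system with a Lagrange multiplier enforcing weights summing to 1:
    # [ Gamma  1 ] [w ]   [gamma0]
    # [ 1'     0 ] [mu] = [  1   ]
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = variogram(d)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = variogram(np.linalg.norm(coords - np.asarray(x0, float), axis=-1))
    w = np.linalg.solve(A, b)
    return float(w[:n] @ z)
```

For two observations and a target at their midpoint, the symmetry of the system gives equal weights of 0.5, so the estimate is the average of the two values.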

  9. Applied analysis

    CERN Document Server

    Lanczos, Cornelius

    2010-01-01

Basic text for graduate and advanced undergraduate students; deals with the search for roots of algebraic equations encountered in vibration and flutter problems and in those of static and dynamic stability. Other topics are devoted to matrices and eigenvalue problems, large-scale linear systems, harmonic analysis and data analysis, and more.

  10. Multivariate analysis and geostatistics of the fertility of a humic rhodic hapludox under coffee cultivation

    Directory of Open Access Journals (Sweden)

    Samuel de Assis Silva

    2012-04-01

Full Text Available The spatial variability of soil and plant properties exerts great influence on the yield of agricultural crops. This study analyzed the spatial variability of the fertility of a Humic Rhodic Hapludox under Arabica coffee, using principal component analysis, cluster analysis and geostatistics in combination. The experiment was carried out in an area under Coffea arabica L., variety Catucai 20/15 - 479. The soil was sampled at a depth of 0.20 m, at 50 points of a sampling grid. The following chemical properties were determined: P, K+, Ca2+, Mg2+, Na+, S, Al3+, pH, H + Al, SB, t, T, V, m, OM, Na saturation index (SSI), remaining phosphorus (P-rem), and micronutrients (Zn, Fe, Mn, Cu and B). The data were analyzed with descriptive statistics, followed by principal component and cluster analyses. Geostatistics was used to check and quantify the degree of spatial dependence of properties represented by the principal components. The principal component analysis allowed a dimensional reduction of the problem, providing interpretable components with little information loss. Despite the information loss characteristic of principal component analysis, the combination of this technique with geostatistical analysis was efficient for quantifying and determining the structure of spatial dependence of soil fertility. In general, the availability of soil mineral nutrients was low and the levels of acidity and exchangeable Al were high.
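The dimensional-reduction step can be sketched generically as principal component analysis by singular value decomposition of the standardized property matrix; the leading component scores would then be passed to variogram analysis. This is a generic illustration, not the authors' software or parameterization:

```python
import numpy as np

def principal_components(X, k=2):
    """PCA via SVD of the standardized data matrix (rows = sample points,
    columns = soil properties). Returns the first k component scores."""
    X = np.asarray(X, float)
    # Standardize each property to zero mean and unit variance.
    Xs = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    U, S, Vt = np.linalg.svd(Xs, full_matrices=False)
    # Project onto the first k principal directions.
    return Xs @ Vt[:k].T
```

The returned score columns are mutually orthogonal, which is what makes separate variogram models for each retained component meaningful.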

  11. Evolution of the soil surface roughness using geostatistical analysis

    Directory of Open Access Journals (Sweden)

    Eva Vidal Vázquez

    2010-01-01

Full Text Available The objective of this work was to investigate the decay of initial surface roughness induced by simulated rainfall under different soil residue covers and to compare classical statistical indices with geostatistical parameters. A conventionally tilled loamy soil with low structural stability, and thus prone to crusting, was placed in 1 m² microplots. Each microplot received three successive rainfall events, bringing the cumulative rainfall to 25 mm, 50 mm and 75 mm at 65 mm h-1 intensity. Five treatments without replication were tested with different corn straw quantities (0, 1, 2, 3 and 4 Mg ha-1). Soil surface microrelief was measured at the initial stage and after each simulated rainfall event. Five treatments and four surface stages were monitored, resulting in 20 data sets. Point elevation data were taken at 0.03 m intervals using a pinmeter. Digital elevation models were generated and analysed using semivariograms. All data sets showed spatial dependence, and spherical models were fitted to the experimental semivariograms. A very significant relationship was found between the random roughness index, RR, and the sill of the semivariogram (C0+C1). All the treatments showed a clear trend of sill value reduction with increasing precipitation. However, roughness decay was lower in treatments with higher straw cover (3 and 4 Mg ha-1). Therefore, residue cover limited soil surface roughness decline. The control treatment, without straw, showed the lowest nugget effect (C0), which means the lowest spatial discontinuity of all treatments in this study. The range of spatial dependence (a) also showed a trend to decrease with increasing cumulative rain, which was most apparent in treatments without or with relatively low straw cover (0, 1 and 2 Mg ha-1). The suitability of using sill variance and range for describing patterns of soil surface microrelief decline is discussed.
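The two descriptors compared above, the random roughness index RR and the experimental semivariogram, can be sketched for a one-dimensional elevation transect. This is a simplified illustration (the study worked on full 2-D digital elevation models, and RR is often computed after detrending, which is omitted here):

```python
import numpy as np

def random_roughness(z):
    """RR index (simplified): standard deviation of point elevations about their mean."""
    return float(np.asarray(z, float).std(ddof=1))

def empirical_semivariogram(x, z, lags, tol):
    """1-D experimental semivariogram:
    gamma(h) = 0.5 * mean((z_i - z_j)^2) over pairs separated by approximately h."""
    x, z = np.asarray(x, float), np.asarray(z, float)
    d = np.abs(x[:, None] - x[None, :])        # pairwise separations
    sq = 0.5 * (z[:, None] - z[None, :]) ** 2  # semivariance of each pair
    gam = []
    for h in lags:
        m = (np.abs(d - h) <= tol) & (d > 0)   # pairs falling in this lag class
        gam.append(sq[m].mean() if m.any() else np.nan)
    return np.array(gam)
```

A spherical model fitted to the output yields the nugget (C0), sill (C0+C1) and range (a) that the paper relates to RR and to roughness decline.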

  12. Imprecise (fuzzy) information in geostatistics

    Energy Technology Data Exchange (ETDEWEB)

    Bardossy, A.; Bogardi, I.; Kelly, W.E.

    1988-05-01

A methodology based on fuzzy set theory for the utilization of imprecise data in geostatistics is presented. A common problem preventing a broader use of geostatistics has been an insufficient amount of accurate measurement data. In certain cases, additional but uncertain (soft) information is available and can be encoded as subjective probabilities, and then the soft kriging method can be applied (Journel, 1986). In other cases, a fuzzy encoding of soft information may be more realistic and simplify the numerical calculations. Imprecise (fuzzy) spatial information on the possible variogram is integrated into a single variogram which is used in a fuzzy kriging procedure. The overall uncertainty of prediction is represented by the estimation variance and the calculated membership function for each kriged point. The methodology is applied to the permeability prediction of a soil liner for hazardous waste containment. The available number of hard measurement data (20) was not enough for a classical geostatistical analysis. An additional 20 soft data points made it possible to prepare kriged contour maps using the fuzzy geostatistical procedure.

  13. Technology demonstration: geostatistical and hydrologic analysis of salt areas. Assessment of effectiveness of geologic isolation systems

    International Nuclear Information System (INIS)

The Office of Nuclear Waste Isolation (ONWI) requested Pacific Northwest Laboratory (PNL) to: (1) use geostatistical analyses to evaluate the adequacy of hydrologic data from three salt regions, each of which contains a potential nuclear waste repository site; and (2) demonstrate a methodology that allows quantification of the value of additional data collection. The three regions examined are the Paradox Basin in Utah, the Permian Basin in Texas, and the Mississippi Study Area. Additional and new data became available to ONWI during and following these analyses; therefore, this report must be considered a demonstration of methodology that would apply as illustrated had the complete data sets been available. A combination of geostatistical and hydrologic analyses was used for this demonstration. Geostatistical analyses provided an optimal estimate of the potentiometric surface from the available data, a measure of the uncertainty of that estimate, and a means for selecting and evaluating the location of future data. The hydrologic analyses included the calculation of transmissivities, flow paths, travel times, and ground-water flow rates from hypothetical repository sites. Simulation techniques were used to evaluate the effect of optimally located future data on the potentiometric surface, flow lines, travel times, and flow rates. Data availability, quality, quantity, and conformance with model assumptions differed in each of the salt areas. Report highlights for the three locations are given

  14. Geostatistical analysis of potentiometric data in Wolfcamp aquifer of the Palo Duro Basin, Texas

    International Nuclear Information System (INIS)

    This report details a geostatistical analysis of potentiometric data from the Wolfcamp aquifer in the Palo Duro Basin, Texas. Such an analysis is a part of an overall uncertainty analysis for a high-level waste repository in salt. Both an expected potentiometric surface and the associated standard error surface are produced. The Wolfcamp data are found to be well explained by a linear trend with a superimposed spherical semivariogram. A cross-validation of the analysis confirms this. In addition, the cross-validation provides a point-by-point check to test for possible anomalous data

  15. Geostatistical methods for radiological evaluation and risk analysis of contaminated premises

    International Nuclear Information System (INIS)

Full text: At the end of process equipment dismantling, the complete decontamination of nuclear facilities requires the radiological assessment of residual activity levels of building structures. As stated by the IAEA, 'Segregation and characterization of contaminated materials are the key elements of waste minimization'. From this point of view, setting up an appropriate evaluation methodology is of primordial importance. The radiological characterization of contaminated premises can be divided into three steps. First, a facility analysis that is as exhaustive as possible provides historical, functional and qualitative information. Then, a systematic (exhaustive or not) control of the emergent signal is performed by means of in situ measurement methods such as a surface control device combined with in situ gamma spectrometry. In addition, in order to assess the contamination depth, samples can be collected from boreholes at several locations within the premises and analyzed. Combined with historical information and emergent signal maps, such data improve and reinforce the preliminary waste zoning. In order to provide reliable estimates while avoiding supplementary investigation costs, there is therefore a crucial need for sampling optimization methods together with appropriate data processing techniques. The relevance of the geostatistical methodology relies on the presence of spatial continuity in the radiological contamination. In this case, geostatistics provides reliable methods for activity estimation, uncertainty quantification and risk analysis, which are essential decision-making tools for decommissioning and dismantling projects of nuclear installations. Moreover, the ability of this geostatistical framework to provide answers to several key issues that generally occur during the clean-up preparation phase is discussed: How to optimise the investigation costs? How to deal with data quality issues?
How to consistently take into account auxiliary information such as historical

  16. Geostatistical analysis of field hydraulic conductivity in compacted clay

    Energy Technology Data Exchange (ETDEWEB)

    Rogowski, A.S.; Simmons, D.E.

    1988-05-01

Hydraulic conductivity (K) of fractured or porous materials is intimately associated with water flow and chemical transport. Basic concepts imply uniform flux through a homogeneous cross-sectional area. If flow were to occur only through part of the area, actual rates could be considerably different. Because laboratory values of K in compacted clays seldom agree with field estimates, questions arise as to what the true values of K are and how they should be estimated. Hydraulic conductivity values were measured on a 10 x 25 m elevated bridge-like platform. A constant water level was maintained for 1 yr over a 0.3-m thick layer of compacted clay, and inflow and outflow rates were monitored using 10 x 25 grids of 0.3-m diameter infiltration rings and outflow drains subtending approximately 1 x 1 m blocks of compacted clay. Variography of inflow and outflow data established relationships between cores and blocks of clay, respectively. Because the distributions of outflow rates were much lower than, and bore little resemblance to, the distributions of breakthrough rates based on tracer studies, the presence of macropores and preferential flow through the macropores was suspected. Subsequently, probability kriging was applied to reevaluate the distribution of flux rates and the possible location of macropores. Sites exceeding a threshold outflow of 100 x 10^-9 m/s were classified as outliers and were assumed to probably contain a significant population of macropores. Different sampling schemes were examined. Variogram analysis of outflows with and without outliers suggested the adequacy of sampling the site at 50 randomly chosen locations. Because of the potential contribution of macropores to pollutant transport and the practical necessity of extrapolating small-plot values to larger areas, conditional simulations with and without outliers were carried out.

  17. Geostatistical analysis of groundwater level using Euclidean and non-Euclidean distance metrics and variable variogram fitting criteria

    Science.gov (United States)

    Theodoridou, Panagiota G.; Karatzas, George P.; Varouchakis, Emmanouil A.; Corzo Perez, Gerald A.

    2015-04-01

Groundwater level is important information in hydrological modelling. Geostatistical methods are often employed to map the free surface of an aquifer. In geostatistical analysis using kriging techniques, the selection of the optimal variogram model is very important for optimal method performance. This work compares three different criteria (the least squares sum method, the Akaike Information Criterion and Cressie's Indicator) for assessing how well the theoretical variogram fits the experimental one, and investigates the impact on the prediction results. Moreover, five different distance functions (Euclidean, Minkowski, Manhattan, Canberra, and Bray-Curtis) are applied to calculate the distance between observations, which affects both the variogram calculation and the kriging estimator. Cross-validation analysis in terms of ordinary kriging is applied by sequentially using a different distance metric and the above three variogram fitting criteria. The spatial dependence of the observations in the tested dataset is studied by fitting classical variogram models and the Matérn model. The proposed comparison analysis was performed for a data set of 250 hydraulic head measurements distributed over an alluvial aquifer that covers an area of 210 km2. The study area is located in the Prefecture of Drama, which belongs to the Water District of East Macedonia (Greece). This area was selected in terms of hydro-geological data availability and geological homogeneity. The analysis showed that a combination of the Akaike Information Criterion for the variogram fitting assessment and the Bray-Curtis distance metric provided the most accurate cross-validation results. The power-law variogram model provided the best fit to the experimental data. The aforementioned approach for the specific dataset in terms of the ordinary kriging method improves the prediction efficiency in comparison to the classical Euclidean distance metric. Therefore, maps of the spatial
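The five distance functions compared in the study can be sketched as follows. The Minkowski order `r` is an assumption here; the abstract does not state the value used:

```python
import numpy as np

def distance(p, q, metric="euclidean", r=3):
    """Distance between two coordinate vectors under the five metrics
    named in the study (Minkowski order r is an assumed default)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    d = np.abs(p - q)
    if metric == "euclidean":
        return float(np.sqrt((d ** 2).sum()))
    if metric == "manhattan":
        return float(d.sum())
    if metric == "minkowski":
        return float((d ** r).sum() ** (1.0 / r))
    if metric == "canberra":
        s = np.abs(p) + np.abs(q)
        return float(np.where(s > 0, d / s, 0.0).sum())  # skip 0/0 terms
    if metric == "bray_curtis":
        tot = np.abs(p + q).sum()
        return float(d.sum() / tot) if tot else 0.0
    raise ValueError(f"unknown metric: {metric}")
```

Swapping the metric changes every pairwise separation, and hence both the experimental variogram and the kriging weights, which is the effect the cross-validation in the paper measures.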

  18. Error modeling based on geostatistics for uncertainty analysis in crop mapping using Gaofen-1 multispectral imagery

    Science.gov (United States)

    You, Jiong; Pei, Zhiyuan

    2015-01-01

With the development of remote sensing technology, its applications in agricultural monitoring systems, crop mapping accuracy, and spatial distribution are increasingly being explored by administrators and users. Uncertainty in crop mapping is profoundly affected by the spatial pattern of spectral reflectance values obtained from the applied remote sensing data. Errors in remotely sensed crop cover information, and their propagation into derivative products, need to be quantified and handled correctly. Therefore, this study discusses methods of error modeling for uncertainty characterization in crop mapping using GF-1 multispectral imagery. An error modeling framework based on geostatistics is proposed, which introduces the sequential Gaussian simulation algorithm to explore the relationship between classification errors and the spectral signature of the remote sensing data source. On this basis, a misclassification probability model is developed to produce a spatially explicit classification error probability surface for the map of a crop, realizing the uncertainty characterization for crop mapping. In this process, trend surface analysis was carried out to generate a spatially varying mean response and the corresponding residual response with spatial variation for the spectral bands of GF-1 multispectral imagery. Variogram models were employed to measure the spatial dependence in the spectral bands and the derived misclassification probability surfaces. Simulated spectral data and classification results were quantitatively analyzed. Through experiments using data sets from a region of low rolling country in the Yangtze River valley, it was found that GF-1 multispectral imagery can be used for crop mapping with good overall performance, the proposed error modeling framework can quantify the uncertainty in crop mapping, and the misclassification probability model can summarize the spatial variation in map accuracy and is helpful for
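The trend-surface step, which decomposes each spectral band into a spatially varying mean plus a residual, can be sketched with a first-order polynomial fit by least squares. The polynomial degree is an assumption; the abstract does not specify the order used:

```python
import numpy as np

def trend_surface(x, y, z):
    """First-order trend surface z = a + b*x + c*y fitted by least squares.
    Returns the coefficients (the spatially varying mean) and the residuals
    (the component passed on to variogram modeling / simulation)."""
    x, y, z = (np.asarray(v, float) for v in (x, y, z))
    G = np.column_stack([np.ones_like(x), x, y])   # design matrix [1, x, y]
    coef, *_ = np.linalg.lstsq(G, z, rcond=None)
    return coef, z - G @ coef
```

On data that are exactly planar the fit recovers the plane and leaves zero residuals; on real band values the residuals carry the spatial variation that the variogram models describe.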

  19. Geostatistical analysis of heavy metal distributions in soils; Geostatistische Analyse von Schwermetallverteilungen in Boeden

    Energy Technology Data Exchange (ETDEWEB)

    Geiler, H.; Dengel, H.S.; Donsbach, A.; Maurer, W. [Siegen Univ. (Gesamthochschule) (Germany). Analytische Chemie II; Knoblich, K.; Aschenbrenner, F. [Giessen Univ. (Germany). Inst. fuer Angewandte Geowissenschaften; Ostermann, R. [Siegen Univ. (Gesamthochschule) (Germany). Hochschulrechenzentrum

    1998-11-01

Soil samples were taken from a test area of 1 km² in a regular pattern at specified depths. The samples were analyzed with respect to heavy metal concentrations. The geologically homogeneous area is located close to the urban area of Siegen and covered with different forest vegetations and meadows. There is no direct influence of industrial emissions. The soil samples were taken at three different depths up to 1 m below the surface and digested with aqua regia. The analysis comprised cadmium, chromium, copper, nickel, lead and zinc. Geostatistics was applied in order to ascertain the spatial distribution of heavy metal concentrations in the soil. This was accomplished by fitting the data to a suitable semivariogram and using kriging as an interpolation algorithm. As a result of the investigations, a minimum number of soil samples can be determined that is sufficient to meet a practical degree of accuracy concerning the reproducibility of the pattern of heavy metal concentrations in the soil. (orig.) [German abstract, translated:] The soil of a 1 km² model area was sampled on a regular grid at specified depths and analyzed for its heavy metal content. The site, on the outskirts of Siegen, has a geologically uniform structure. The vegetation consists of deciduous and coniferous forest as well as meadow. The area is not under direct industrial influence. From three sampling depths down to 1 m, 100 soil samples each were analyzed for cadmium, chromium, copper, nickel, lead and zinc after aqua regia digestion. Geostatistical methods (fitting to semivariograms, estimation by kriging) were used to capture the spatial correlation and the distribution patterns of the measured heavy metal concentrations. This made it possible, for practical accuracy requirements of area-wide soil assessment, to determine a minimum number of sampling points sufficient for a representative statement about the

  20. Geostatistical models for air pollution

    International Nuclear Information System (INIS)

The objective of this paper is to present geostatistical models applied to the spatial characterisation of air pollution phenomena. A concise presentation of the geostatistical methodologies is illustrated with practical examples. The case study was conducted at an underground copper mine located in the south of Portugal, where a biomonitoring program using lichens has been implemented. Given the characteristics of lichens as indicators of air pollution, it was possible to gather a great amount of data in space, which enabled the development and application of geostatistical methodologies. The advantages of using geostatistical models, compared with deterministic models, as environmental control tools are highlighted. (author)

  1. Geostatistical analysis of potentiometric data in the Pennsylvanian aquifer of the Palo Duro Basin, Texas

    International Nuclear Information System (INIS)

This report details a geostatistical analysis of potentiometric data from the Pennsylvanian aquifer in the Palo Duro Basin, Texas. Such an analysis is a part of an overall uncertainty analysis for a high-level waste repository in salt. Both an expected potentiometric surface and the associated standard error surface are produced. The Pennsylvanian data are found to be well explained by a linear trend with a superimposed spherical semivariogram. A cross-validation of the analysis confirms this. In addition, the cross-validation provides a point-by-point check to test for possible anomalous data. The analysis is restricted to that portion of the Pennsylvanian aquifer that lies to the southwest of the Amarillo Uplift. The Pennsylvanian is absent in some areas across the uplift, and data to the northeast were not used in this analysis. The surfaces produced in that analysis are included for comparison. 9 refs., 15 figs

  2. Geostatistical analysis of soil moisture distribution in a part of Solani River catchment

    Science.gov (United States)

    Kumar, Kamal; Arora, M. K.; Hariprasad, K. S.

    2016-03-01

The aim of this paper is to estimate soil moisture at the spatial level by applying geostatistical techniques to point observations of soil moisture in parts of the Solani River catchment in Haridwar district of India. Undisturbed soil samples were collected at 69 locations with a soil core sampler at a depth of 0-10 cm from the soil surface. Of these, discrete soil moisture observations at 49 locations were used to generate a spatial soil moisture distribution map of the region. Two geostatistical techniques, namely moving average and kriging, were adopted. Root mean square error (RMSE) between observed and estimated soil moisture at the remaining 20 locations was determined to assess the accuracy of the estimated soil moisture. Both techniques resulted in low RMSE at a small limiting distance, which increased with the limiting distance. The root mean square error varied from 7.42 to 9.77 for the moving average method, while for kriging it varied from 7.33 to 9.99, indicating similar performance of the two techniques.
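The moving-average estimator within a limiting distance, and the RMSE score used to compare it against kriging at held-out points, can be sketched as follows (hypothetical data, not the study's observations):

```python
import numpy as np

def moving_average(x_obs, z_obs, x0, limit):
    """Moving-average estimate at x0: mean of observations within the
    limiting distance; NaN if no observation falls inside it."""
    d = np.linalg.norm(np.asarray(x_obs, float) - np.asarray(x0, float), axis=1)
    inside = d <= limit
    return float(np.asarray(z_obs, float)[inside].mean()) if inside.any() else float("nan")

def rmse(pred, obs):
    """Root mean square error between predicted and observed values."""
    pred, obs = np.asarray(pred, float), np.asarray(obs, float)
    return float(np.sqrt(np.mean((pred - obs) ** 2)))
```

Repeating the estimate at each of the 20 validation locations for a sequence of limiting distances reproduces the kind of RMSE-versus-distance comparison reported above.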

  3. Geostatistical Analysis on the Temporal Patterns of the Yellow Rice Borer, Tryporyza incertulas

    Institute of Scientific and Technical Information of China (English)

    YUAN Zhe-ming; WANG Zhi; HU Xiang-yue

    2005-01-01

In order to understand the temporal pattern of the larval population of the yellow rice borer, Tryporyza incertulas, and provide valuable information for its forecast model, data series for the population of each generation and of the over-wintered larvae from 1960 to 1990 in Dingcheng District, Changde City, Hunan Province, were analyzed with geostatistics. The year-to-year data series of the total number, the 1st generation, the 3rd generation and the over-wintered larvae displayed rather strong autocorrelation and predictability. The generation-to-generation data series, and the year-to-year series of the 2nd and 4th generations, however, demonstrated poor autocorrelation, especially the 4th generation, whose autocorrelation degree was zero. The population dynamics of the yellow rice borer were obviously intermittent. A remarkable cycle of four generations, one year, was observed in the generation-to-generation population series. Omitting a given generation, or interposing the over-wintered larvae, resulted in only a slight change in the autocorrelation of the whole generation-to-generation data series. Cropping system, food, climate and natural enemies therefore played more important roles in regulating the population dynamics than the base number of larvae. The basic techniques of geostatistics applied in analyzing temporal population dynamics are outlined.

  4. Geostatistical analysis of Landsat-TM lossy compression images in a high-performance computing environment

    Science.gov (United States)

    Pesquer, Lluís; Cortés, Ana; Serral, Ivette; Pons, Xavier

    2011-11-01

The main goal of this study is to characterize the effects of lossy image compression procedures on the spatial patterns of remotely sensed images, as well as to test the performance of job distribution tools specifically designed for obtaining geostatistical parameters (variograms) in a High Performance Computing (HPC) environment. For this purpose, radiometrically and geometrically corrected Landsat-5 TM images from April, July, August and September 2006 were compressed using two different methods: Band-Independent Fixed-Rate (BIFR) and three-dimensional Discrete Wavelet Transform (3d-DWT), applied to the JPEG 2000 standard. For both methods, a wide range of compression ratios (2.5:1, 5:1, 10:1, 50:1, 100:1, 200:1 and 400:1, from soft to hard compression) were compared. Variogram analyses conclude that all compression ratios maintain the variogram shapes and that the higher ratios (more than 100:1) reduce variance in the sill parameter by about 5%. Moreover, the parallel solution in a distributed environment demonstrates that HPC offers a suitable scientific test bed for time-demanding execution processes, as in geostatistical analyses of remote sensing images.

  5. A geostatistical analysis of IBTS data for age 2 North Sea haddock ( Melanogrammus aeglefinus ) considering daylight effects

    DEFF Research Database (Denmark)

    Wieland, Kai; Rivoirard, J.

    2001-01-01

    A geostatistical analysis of age 2 North Sea haddock catches from the 1st quarter IBTS (International Bottom Trawl Survey) 1983-1997 is presented. IBTS standard abundance indices are routinely calculated in a way that does not account explicitly for the spatial distribution and night hauls are...

  6. 4th International Geostatistics Congress

    CERN Document Server

    1993-01-01

The contributions in this book were presented at the Fourth International Geostatistics Congress held in Tróia, Portugal, in September 1992. They provide a comprehensive account of the current state of the art of geostatistics, including recent theoretical developments and new applications. In particular, readers will find descriptions and applications of the more recent methods of stochastic simulation together with data integration techniques applied to the modelling of hydrocarbon reservoirs. In other fields there are stationary and non-stationary geostatistical applications to geology, climatology, pollution control, soil science, hydrology and human sciences. The papers also provide an insight into new trends in geostatistics, particularly the increasing interaction with many other scientific disciplines. This book is a significant reference work for practitioners of geostatistics both in academia and industry.

  7. Geostatistical stability analysis of co-depositional sand-thickened tailings embankments

    Energy Technology Data Exchange (ETDEWEB)

    Elkateb, T. [Thurber Engineering Ltd., Edmonton, AB (Canada); Chalaturnyk, R.; Robertson, P.K. [Alberta Univ., Edmonton, AB (Canada). Dept. of Civil and Environmental Engineering

    2003-07-01

Co-deposition is a novel technique for the disposal of thickened tailings, in which pockets of tailings are randomly distributed within a larger mass of sand. The oil sands industry of Alberta is currently considering using this technique. This paper describes an attempt to assess the engineering behaviour of this tailings disposal system in a probabilistic analysis framework. Several realizations of co-depositional embankments were generated using geostatistical theories. In turn, the stability of the disposal system, expressed in terms of factors of safety against shear failure and the associated vertical deformations, was assessed using these realizations and FLAC software. Failure probabilities and vertical displacements proved sensitive to embankment characteristics such as embankment height, side slopes, and the undrained shear strength of the thickened tailings. The authors proposed an allowable failure probability of 17 per cent for these embankments to avoid irreparable excessive deformations. 11 refs., 1 tab., 8 figs.

  8. Thin sand modeling based on geostatistic, uncertainty and risk analysis in Zuata Principal field, Orinoco oil belt

    Energy Technology Data Exchange (ETDEWEB)

    Cardona, W.; Aranaga, R.; Siu, P.; Perez, L. [PDVSA Petroleos de Venezuela SA, Caracas (Venezuela, Bolivarian Republic of)

    2009-07-01

The geological modelling of the Zuata Principal field in Venezuela, particularly Junin Block 2 in the Orinoco oil belt, is a challenge because of the presence of thin sand bodies in an unexploited zone. This paper presented the results obtained from a horizontal well that contacted 96 per cent of pay count sand in the field. Geostatistical modelling and sensitivity analysis were used for planning the well. The model was generated by processing and interpreting information from production and exploratory fishbones. Information provided by nearby wildcat wells suggested that the proposed area was not prospective. However, information provided by several exploratory fishbones offered some possibility of draining additional reserves. From the available information, facies models and uncertainty analyses were made to statistically determine the best option, notably whether to drill additional stratwells to obtain a more accurate characterization or to apply the already obtained model for drilling a production well in the investigated area. The study showed that geological uncertainty depends not only on how much information is available, but also on how this information can be processed and interpreted. Decision analysis provides a rational basis for dealing with risk and uncertainties. 4 refs., 7 tabs., 7 figs., 1 appendix.

  9. Handbook of Applied Analysis

    CERN Document Server

    Papageorgiou, Nikolaos S

    2009-01-01

    Offers an examination of important theoretical methods and procedures in applied analysis. This book details the important theoretical trends in nonlinear analysis and applications to different fields. It is suitable for those working on nonlinear analysis.

  10. The geostatistical approach for structural and stratigraphic framework analysis of offshore NW Bonaparte Basin, Australia

    International Nuclear Information System (INIS)

    Geostatistics is a statistical approach based on the study of temporal and spatial trends, which exploits spatial relationships to model known information about a variable at unsampled locations. The statistical technique known as kriging was used for petrophysical and facies analysis; it assumes a spatial relationship to model the geological continuity between the known data and the unknown, producing a single best guess of the unknown. Kriging is also known as an optimal interpolation technique because it generates the best linear unbiased estimate of each horizon. The idea is to construct a numerical model of the lithofacies and rock properties that honors the available data, and to integrate it with interpreted seismic sections, a tectonostratigraphic chart with a short-term sea-level curve, and the regional tectonics of the study area to establish the structural and stratigraphic growth history of the NW Bonaparte Basin. Models built with the kriging technique were used to estimate parameters such as horizons, facies, and porosities in the study area. Variograms were used to identify spatial relationships in the data, which help reveal the depositional history of the North West (NW) Bonaparte Basin.
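
    The "best linear unbiased estimate" property of ordinary kriging mentioned above can be sketched in a few lines. This is a generic 1-D example with a spherical variogram; the coordinates, values, and variogram parameters are invented and unrelated to the Bonaparte Basin data.

```python
import numpy as np

# Minimal ordinary-kriging sketch in 1-D with a spherical variogram.
# Data values and variogram parameters are illustrative assumptions.
def spherical(h, nugget=0.0, sill=1.0, rng=10.0):
    h = np.asarray(h, dtype=float)
    g = nugget + (sill - nugget) * (1.5 * h / rng - 0.5 * (h / rng) ** 3)
    # variogram is 0 at zero lag and flattens at the sill beyond the range
    return np.where(h >= rng, sill, np.where(h == 0.0, 0.0, g))

def ordinary_krige(x, z, x0, **vario):
    n = len(x)
    # Kriging system: [Gamma 1; 1^T 0] [w; mu] = [gamma0; 1]
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = spherical(np.abs(x[:, None] - x[None, :]), **vario)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = spherical(np.abs(x - x0), **vario)
    w = np.linalg.solve(A, b)
    return float(w[:n] @ z)  # BLUE at the unsampled location x0

x = np.array([0.0, 3.0, 7.0, 12.0])
z = np.array([1.2, 1.5, 0.9, 1.1])
est = ordinary_krige(x, z, 5.0, nugget=0.05, sill=1.0, rng=10.0)
print(f"kriged estimate at x=5: {est:.3f}")
```

    Because the weights are constrained to sum to one, shifting all data by a constant shifts the estimate by the same constant, which is the unbiasedness property the abstract alludes to.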

  11. Geostatistical analysis of tritium, groundwater age and other noble gas derived parameters in California.

    Science.gov (United States)

    Visser, A; Moran, J E; Hillegonds, Darren; Singleton, M J; Kulongoski, Justin T; Belitz, Kenneth; Esser, B K

    2016-03-15

    Key characteristics of California groundwater systems related to aquifer vulnerability, sustainability, recharge locations and mechanisms, and anthropogenic impact on recharge are revealed in a spatial geostatistical analysis of a unique data set of tritium, noble gas and other isotopic analyses unprecedented in size at nearly 4000 samples. The correlation length of key groundwater residence time parameters varies from tens of kilometers (³H; age) to the order of a hundred kilometers (terrigenic ⁴He; ¹⁴C; tritiogenic ³He). The correlation length of parameters related to climate, topography and atmospheric processes is on the order of several hundred kilometers (recharge temperature; δ¹⁸O). Young groundwater ages that highlight regional recharge areas are located in the eastern San Joaquin Valley, in the southern Santa Clara Valley Basin, in the upper LA basin and along unlined canals carrying Colorado River water, showing that much of the recent recharge in central and southern California is dominated by river recharge and managed aquifer recharge. Modern groundwater is found in wells with the top open intervals below 60 m depth in the southeastern San Joaquin Valley, Santa Clara Valley and Los Angeles basin, as the result of intensive pumping and/or managed aquifer recharge operations. PMID:26803267

  12. The geostatistical approach for structural and stratigraphic framework analysis of offshore NW Bonaparte Basin, Australia

    Energy Technology Data Exchange (ETDEWEB)

    Wahid, Ali, E-mail: ali.wahid@live.com; Salim, Ahmed Mohamed Ahmed, E-mail: mohamed.salim@petronas.com.my; Yusoff, Wan Ismail Wan, E-mail: wanismail-wanyusoff@petronas.com.my [Universiti Teknologi PETRONAS, Bandar Seri Iskandar, 32610 Tronoh, Perak (Malaysia); Gaafar, Gamal Ragab, E-mail: gaafargr@gmail.com [Petroleum Engineering Division, PETRONAS Carigali Sdn Bhd, Kuala Lumpur (Malaysia)

    2016-02-01

    Geostatistics or statistical approach is based on the studies of temporal and spatial trend, which depend upon spatial relationships to model known information of variable(s) at unsampled locations. The statistical technique known as kriging was used for petrophycial and facies analysis, which help to assume spatial relationship to model the geological continuity between the known data and the unknown to produce a single best guess of the unknown. Kriging is also known as optimal interpolation technique, which facilitate to generate best linear unbiased estimation of each horizon. The idea is to construct a numerical model of the lithofacies and rock properties that honor available data and further integrate with interpreting seismic sections, techtonostratigraphy chart with sea level curve (short term) and regional tectonics of the study area to find the structural and stratigraphic growth history of the NW Bonaparte Basin. By using kriging technique the models were built which help to estimate different parameters like horizons, facies, and porosities in the study area. The variograms were used to determine for identification of spatial relationship between data which help to find the depositional history of the North West (NW) Bonaparte Basin.

  13. Spatial analysis of lettuce downy mildew using geostatistics and geographic information systems.

    Science.gov (United States)

    Wu, B M; van Bruggen, A H; Subbarao, K V; Pennings, G G

    2001-02-01

    The epidemiology of lettuce downy mildew has been investigated extensively in coastal California. However, the spatial patterns of the disease and the distance that Bremia lactucae spores can be transported have not been determined. During 1995 to 1998, we conducted several field- and valley-scale surveys to determine spatial patterns of this disease in the Salinas Valley. Geostatistical analyses of the survey data at both scales showed that the influence range of downy mildew incidence at one location on incidence at other locations was between 80 and 3,000 m. A linear relationship was detected between semivariance and lag distance at the field scale, although no single statistical model could fit the semivariograms at the valley scale. Spatial interpolation by the inverse distance weighting method with a power of 2 resulted in plausible estimates of incidence throughout the valley. Cluster analysis in geographic information systems on the interpolated disease incidence from different dates demonstrated that the Salinas Valley could be divided into two areas, north and south of Salinas City, with high and low disease pressure, respectively. Seasonal and spatial trends along the valley suggested that the distinction between the downy mildew conducive and nonconducive areas might be determined by environmental factors. PMID:18944386
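
    Inverse distance weighting with a power of 2, the interpolation scheme used for the valley-scale incidence maps, is compact enough to sketch directly. The coordinates and incidence values below are made up for illustration, not taken from the survey data.

```python
import numpy as np

# Inverse-distance-weighted interpolation (power 2): each prediction is a
# weighted average of observations, with weights falling off as 1/d^2.
def idw(xy_obs, v_obs, xy_new, power=2.0, eps=1e-12):
    d = np.linalg.norm(xy_new[:, None, :] - xy_obs[None, :, :], axis=2)
    w = 1.0 / (d + eps) ** power       # eps guards against d == 0
    w /= w.sum(axis=1, keepdims=True)  # normalize weights per target point
    return w @ v_obs

xy = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
inc = np.array([0.10, 0.40, 0.20, 0.80])  # hypothetical incidence at 4 sites
grid = np.array([[0.5, 0.5], [0.1, 0.1]])
print(idw(xy, inc, grid))
```

    At the central point all four sites are equidistant, so the prediction is simply the mean incidence; near a site, that site's value dominates.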

  14. Geostatistical analysis of variations in soil salinity in a typical irrigation area in Xinjiang, northwest China

    Institute of Scientific and Technical Information of China (English)

    Mamattursun Eziz; Mihrigul Anwar; XinGuo Li

    2016-01-01

    Characterizing the spatial and temporal variability of soil salinity is tremendously important for a variety of agronomic and environmental concerns in arid irrigation areas. This paper examines the characteristics and the spatial and temporal variations of soil salinization in the Ili River Irrigation Area by applying a geostatistical approach. Results showed that: (1) soil salinity varied widely, from a minimum of 0.10 g/kg to a maximum of 28.10 g/kg, and was concentrated mainly in the surface soil layer. The dominant anions were SO₄²⁻ and Cl⁻, and the dominant cations were Na⁺ and Ca²⁺; (2) the salinity of the root zone soil layer for different land use types was in the following order: grassland > cropland > forestland, and for different periods: March > June > September; (3) the spherical model was the most suitable variogram model for the salinity of the 0–3 cm and 3–20 cm soil layers in March and June and of the 3–20 cm soil layer in September, while the exponential model was the most suitable for the 0–3 cm soil layer in September. Relatively strong spatial and temporal structure existed for soil salinity owing to low nugget effects; and (4) maps of kriged soil salinity showed higher values in the central parts of the study area and lower values in the marginal parts, with salinity tending to increase from the margins toward the center. Kriging is very helpful for detecting problematic areas and is a good tool for soil resources management. Managing the appropriate use of soil and water resources in such areas is very important for sustainable agriculture, and more attention should be paid to these areas to prevent future problems.
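
    The two variogram models the study fitted, spherical and exponential, can be written out as simple functions. The nugget, sill, and range values below are placeholders, not the fitted soil-salinity parameters.

```python
import numpy as np

# Spherical and exponential variogram models (nugget shown as the h -> 0+
# limit). Parameter values here are illustrative placeholders only.
def spherical(h, nugget, sill, a):
    h = np.asarray(h, dtype=float)
    g = nugget + (sill - nugget) * (1.5 * h / a - 0.5 * (h / a) ** 3)
    return np.where(h >= a, sill, g)  # flat at the sill beyond range a

def exponential(h, nugget, sill, a):
    h = np.asarray(h, dtype=float)
    # approaches the sill asymptotically; ~95% of it near h = 3a
    return nugget + (sill - nugget) * (1.0 - np.exp(-h / a))

h = np.linspace(0.0, 50.0, 6)
print(spherical(h, 0.1, 1.0, 30.0))
print(exponential(h, 0.1, 1.0, 10.0))
```

    A low nugget-to-sill ratio, as reported in the abstract, indicates strong spatial structure: most of the variance is spatially correlated rather than random at short range.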

  15. Geostatistics for high resolution geomorphometry: from spatial continuity to surface texture

    Science.gov (United States)

    Trevisani, Sebastiano

    2015-04-01

    This presentation introduces the use of geostatistics in the context of high-resolution geomorphometry. Applying geostatistics to geomorphometry permits a shift in perspective, moving our attention more toward the description of spatial continuity than toward the inference of a spatial continuity model. This change in perspective opens interesting directions for applying geostatistical methods in geomorphometry. Geostatistical methodologies have been extensively applied and adapted in the context of remote sensing, leading to many interesting applications aimed at analyzing the complex patterns that characterize imagery, among them the analysis of image texture. In fact, the analysis of image texture reverts to the analysis of surface texture when the analyzed image is a raster representation of a digital terrain model. The main idea is to use spatial-continuity indices as multiscale and directional descriptors of surface texture, including the important aspect of surface roughness. In this context we introduce some examples of applying geostatistics to image analysis and surface texture characterization. We also show that, in the presence of complex morphological settings, alternative indices of spatial continuity are needed that are less sensitive to the hotspots and non-stationarity that often characterize surface morphology. This introduction is mainly dedicated to univariate geostatistics; however, the same concepts can be exploited by means of multivariate as well as multipoint geostatistics.

  16. Geostatistical interpolation model selection based on ArcGIS and spatio-temporal variability analysis of groundwater level in piedmont plains, northwest China.

    Science.gov (United States)

    Xiao, Yong; Gu, Xiaomin; Yin, Shiyang; Shao, Jingli; Cui, Yali; Zhang, Qiulan; Niu, Yong

    2016-01-01

    Based on geostatistical theory and the ArcGIS geostatistical module, data from 30 groundwater level observation wells were used to estimate the decline of the groundwater level in the Beijing piedmont. Seven interpolation methods (inverse distance weighted interpolation, global polynomial interpolation, local polynomial interpolation, tension spline interpolation, ordinary kriging interpolation, simple kriging interpolation and universal kriging interpolation) were used to interpolate the groundwater level between 2001 and 2013. Cross-validation, absolute error and the coefficient of determination (R²) were applied to evaluate the accuracy of the different methods. The results show that the simple kriging method gave the best fit. The analysis of spatial and temporal variability suggests that the nugget effects from 2001 to 2013 were increasing, which means the spatial correlation weakened gradually under the influence of human activities. The spatial variability in the middle areas of the alluvial-proluvial fan is relatively higher than in the top and bottom areas. Owing to changes in land use, the groundwater level also varies temporally: the average decline rate of the groundwater level between 2007 and 2013 increased compared with 2001-2006. Urban development and population growth cause over-exploitation in residential and industrial areas, where the decline rate of the groundwater level is relatively high, as it is in river areas; by contrast, the decrease in farmland area and the development of water-saving irrigation have reduced the quantity of water used by agriculture, so the decline rate of the groundwater level in agricultural areas is not significant. PMID:27104113
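
    The method-ranking step, cross-validating each interpolator and comparing error statistics, can be sketched with leave-one-out cross-validation. The two interpolators and the synthetic well data below are stand-ins; the study compared seven ArcGIS methods on real observation wells.

```python
import numpy as np

# Leave-one-out cross-validation (LOO-CV) by RMSE for two toy interpolators,
# mirroring how interpolation methods are ranked. Data are synthetic.
def idw_predict(xy, v, target, power=2.0):
    d = np.linalg.norm(xy - target, axis=1)
    w = 1.0 / np.maximum(d, 1e-12) ** power
    return float(w @ v / w.sum())

def nn_predict(xy, v, target):
    return float(v[np.argmin(np.linalg.norm(xy - target, axis=1))])

def loo_rmse(predict, xy, v):
    errs = []
    for i in range(len(v)):
        mask = np.arange(len(v)) != i  # drop well i, predict it back
        errs.append(predict(xy[mask], v[mask], xy[i]) - v[i])
    return float(np.sqrt(np.mean(np.square(errs))))

rng = np.random.default_rng(0)
xy = rng.uniform(0.0, 10.0, size=(30, 2))
head = 50.0 - 0.8 * xy[:, 0] + rng.normal(0.0, 0.3, 30)  # trend + noise

for name, f in [("IDW", idw_predict), ("nearest", nn_predict)]:
    print(f"{name}: LOO-RMSE = {loo_rmse(f, xy, head):.3f}")
```

    The method with the lowest cross-validated RMSE (simple kriging, in the study) is then used for the final interpolated surfaces.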

  17. Applied longitudinal analysis

    CERN Document Server

    Fitzmaurice, Garrett M; Ware, James H

    2012-01-01

    Praise for the First Edition: ". . . [this book] should be on the shelf of everyone interested in . . . longitudinal data analysis." -Journal of the American Statistical Association. Features newly developed topics and applications of the analysis of longitudinal data. Applied Longitudinal Analysis, Second Edition presents modern methods for analyzing data from longitudinal studies and now features the latest state-of-the-art techniques. The book emphasizes practical, rather than theoretical, aspects of methods for the analysis of diverse types of lo

  18. Analysis of vadose zone tritium transport from an underground storage tank release using numerical modeling and geostatistics

    International Nuclear Information System (INIS)

    Numerical and geostatistical analyses show that the artificial smoothing effect of kriging removes high-permeability flow paths from hydrogeologic data sets, reducing simulated contaminant transport rates in heterogeneous vadose zone systems. Therefore, kriging alone is not recommended for estimating the spatial distribution of soil hydraulic properties for contaminant transport analysis at vadose zone sites. Vadose zone transport is modeled more effectively by combining kriging with stochastic simulation to better represent the high degree of spatial variability usually found in the hydraulic properties of field soils. However, kriging is a viable technique for estimating the initial mass distribution of contaminants in the subsurface.

  19. GEOSTATISTICS FOR WASTE MANAGEMENT: A USER'S MANUAL FOR THE GEOPACK (VERSION 1.0) GEOSTATISTICAL SOFTWARE SYSTEM

    Science.gov (United States)

    GEOPACK, a comprehensive user-friendly geostatistical software system, was developed to help in the analysis of spatially correlated data. The software system was developed to be used by scientists, engineers, regulators, etc., with little experience in geostatistical techniques...

  20. Geostatistical upscaling of rain gauge data to support uncertainty analysis of lumped urban hydrological models

    OpenAIRE

    Muthusamy, Manoranjan; Schellart, Alma; TAIT, Simon; B. M. Heuvelink, Gerard

    2016-01-01

    In this study we develop a method to estimate the spatially averaged rainfall intensity, together with the associated level of uncertainty, using geostatistical upscaling. Rainfall data collected from a cluster of eight paired rain gauges in a 400 m × 200 m urban catchment are used in combination with spatial stochastic simulation to obtain optimal predictions of the spatially averaged rainfall intensity at any point in time within the urban catchment. The uncertainty in the prediction of...

  1. Genetic Geostatistical Framework for Spatial Analysis of Fine-Scale Genetic Heterogeneity in Modern Populations: Results from the KORA Study.

    Science.gov (United States)

    Diaz-Lacava, A N; Walier, M; Holler, D; Steffens, M; Gieger, C; Furlanello, C; Lamina, C; Wichmann, H E; Becker, T

    2015-01-01

    Aiming to investigate fine-scale patterns of genetic heterogeneity in modern humans from a geographic perspective, a genetic geostatistical approach framed within a geographic information system is presented. A sample collected for prospective studies in a small area of southern Germany was analyzed. No indication of genetic heterogeneity had been detected in previous analyses. Socio-demographic and genotypic data of German citizens were analyzed (212 SNPs; n = 728). Genetic heterogeneity was evaluated with observed heterozygosity (H_O). Best-fitting spatial autoregressive models were identified, using socio-demographic variables as covariates. Spatial analysis included surface interpolation and geostatistics of observed and predicted patterns. Prediction accuracy was quantified. Spatial autocorrelation was detected for both socio-demographic and genetic variables. Augsburg City and eastern suburban areas showed higher H_O values. The selected model gave the best predictions in suburban areas. Fine-scale patterns of genetic heterogeneity were observed. In accordance with the literature, more urbanized areas showed higher levels of admixture. This approach proved effective for detecting and analyzing subtle patterns of genetic heterogeneity within small areas, and it is scalable in the number of loci, even up to whole-genome analysis. It may be applicable to investigating the underlying genetic history that is, at least partially, embedded in geographic data. PMID:26258132
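
    The genetic summary statistic used above, observed heterozygosity (H_O), is simply the per-individual fraction of heterozygous genotype calls. The toy genotype matrix below is invented; genotypes are coded 0/1/2 as the count of the alternate allele.

```python
import numpy as np

# Observed heterozygosity (H_O): fraction of heterozygous calls (code 1)
# per individual across SNPs. Genotype values are toy data.
def observed_heterozygosity(genotypes):
    g = np.asarray(genotypes)
    return (g == 1).mean(axis=1)  # per-individual share of heterozygous SNPs

geno = np.array([
    [0, 1, 2, 1, 0],  # individual 1: 2 of 5 SNPs heterozygous
    [1, 1, 1, 0, 2],  # individual 2: 3 of 5
])
print(observed_heterozygosity(geno))
```

    These per-individual H_O values are what the study then interpolated and modeled spatially across the sampling area.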

  2. Applied Behavior Analysis

    Science.gov (United States)

    Szapacs, Cindy

    2006-01-01

    Teaching strategies that work for typically developing children often do not work for those diagnosed with an autism spectrum disorder. However, teaching strategies that work for children with autism do work for typically developing children. In this article, the author explains how the principles and concepts of Applied Behavior Analysis can be…

  3. Model-based geostatistics

    CERN Document Server

    Diggle, Peter J

    2007-01-01

    Model-based geostatistics refers to the application of general statistical principles of modeling and inference to geostatistical problems. This volume provides a treatment of model-based geostatistics with an emphasis on statistical methods and applications. It also features analyses of datasets from a range of scientific contexts.

  4. IN SITU NON-INVASIVE SOIL CARBON ANALYSIS: SAMPLE SIZE AND GEOSTATISTICAL CONSIDERATIONS

    International Nuclear Information System (INIS)

    I discuss a new approach for quantitative carbon analysis in soil based on INS. Although this INS method is not simple, it offers critical advantages not available with other newly emerging modalities. The key advantages of the INS system include the following: (1) It is a non-destructive method, i.e., no samples of any kind are taken. A neutron generator placed above the ground irradiates the soil, stimulating carbon characteristic gamma-ray emission that is counted by a detection system also placed above the ground. (2) The INS system can undertake multielemental analysis, so expanding its usefulness. (3) It can be used either in static or scanning modes. (4) The volume sampled by the INS method is large with a large footprint; when operating in a scanning mode, the sampled volume is continuous. (5) Except for a moderate initial cost of about $100,000 for the system, no additional expenses are required for its operation over two to three years, after which the NG has to be replenished with a new tube at an approximate cost of $10,000, regardless of the number of sites analyzed. In light of these characteristics, the INS system appears invaluable for monitoring changes in the carbon content in the field. For this purpose no calibration is required; by establishing a carbon index, changes in carbon yield can be followed over time in exactly the same location, thus giving a percent change. On the other hand, with calibration, it can be used to determine the carbon stock in the ground, thus estimating the soil's carbon inventory. However, this requires revising the standard practices for deciding upon the number of sites required to attain a given confidence level, in particular for the purposes of upward scaling. Geostatistical considerations should then be incorporated to properly account for the averaging effects of the large volumes sampled by the INS system, which would require revising standard practices in the field for determining the number of spots to be

  5. IN SITU NON-INVASIVE SOIL CARBON ANALYSIS: SAMPLE SIZE AND GEOSTATISTICAL CONSIDERATIONS.

    Energy Technology Data Exchange (ETDEWEB)

    WIELOPOLSKI, L.

    2005-04-01

    I discuss a new approach for quantitative carbon analysis in soil based on INS. Although this INS method is not simple, it offers critical advantages not available with other newly emerging modalities. The key advantages of the INS system include the following: (1) It is a non-destructive method, i.e., no samples of any kind are taken. A neutron generator placed above the ground irradiates the soil, stimulating carbon characteristic gamma-ray emission that is counted by a detection system also placed above the ground. (2) The INS system can undertake multielemental analysis, so expanding its usefulness. (3) It can be used either in static or scanning modes. (4) The volume sampled by the INS method is large with a large footprint; when operating in a scanning mode, the sampled volume is continuous. (5) Except for a moderate initial cost of about $100,000 for the system, no additional expenses are required for its operation over two to three years, after which the NG has to be replenished with a new tube at an approximate cost of $10,000, regardless of the number of sites analyzed. In light of these characteristics, the INS system appears invaluable for monitoring changes in the carbon content in the field. For this purpose no calibration is required; by establishing a carbon index, changes in carbon yield can be followed over time in exactly the same location, thus giving a percent change. On the other hand, with calibration, it can be used to determine the carbon stock in the ground, thus estimating the soil's carbon inventory. However, this requires revising the standard practices for deciding upon the number of sites required to attain a given confidence level, in particular for the purposes of upward scaling. Geostatistical considerations should then be incorporated to properly account for the averaging effects of the large volumes sampled by the INS system, which would require revising standard practices in the field for determining the number of spots to

  6. The use of meshless method and geostatistical analysis in transport modelling

    International Nuclear Information System (INIS)

    The disposal of radioactive waste in geological formations is of great importance for nuclear safety. A number of key geosphere processes need to be considered when predicting the movement of radionuclides through the geosphere. The goal of this research is to investigate the influence of the spatial variability of geo-hydrological data on the reliability and accuracy of computational modelling. We chose the Kansa meshless method, which uses radial basis functions, as the mathematical solution technique. The aim of this study is to determine the average and sample variance of radionuclide concentration with regard to the spatial variability of hydraulic conductivity modelled by a geostatistical approach. (author)

  7. Relationship between exports, imports, and economic growth in France: evidence from cointegration analysis and Granger causality with using geostatistical models

    OpenAIRE

    Amiri, Arshia; Gerdtham, Ulf-G

    2011-01-01

    This paper introduces a new way of investigating linear and nonlinear Granger causality between exports, imports and economic growth in France over the period 1961-2006 using geostatistical models (kriging and inverse distance weighting). Geostatistical methods are the standard methods for forecasting locations and making maps in water engineering, the environment, environmental pollution, mining, ecology, geology and geography. Although this is the first time that geostatistics knowle...

  8. Applied multivariate statistical analysis

    CERN Document Server

    Härdle, Wolfgang Karl

    2015-01-01

    Focusing on high-dimensional applications, this 4th edition presents the tools and concepts used in multivariate data analysis in a style that is also accessible for non-mathematicians and practitioners.  It surveys the basic principles and emphasizes both exploratory and inferential statistics; a new chapter on Variable Selection (Lasso, SCAD and Elastic Net) has also been added.  All chapters include practical exercises that highlight applications in different multivariate data analysis fields: in quantitative financial studies, where the joint dynamics of assets are observed; in medicine, where recorded observations of subjects in different locations form the basis for reliable diagnoses and medication; and in quantitative marketing, where consumers’ preferences are collected in order to construct models of consumer behavior.  All of these examples involve high to ultra-high dimensions and represent a number of major fields in big data analysis. The fourth edition of this book on Applied Multivariate ...

  9. Precipitation estimation in mountainous terrain using multivariate geostatistics. Part I: structural analysis

    Science.gov (United States)

    Hevesi, Joseph A.; Istok, Jonathan D.; Flint, Alan L.

    1992-01-01

    Values of average annual precipitation (AAP) are desired for hydrologic studies within a watershed containing Yucca Mountain, Nevada, a potential site for a high-level nuclear-waste repository. Reliable values of AAP are not yet available for most areas within this watershed because of a sparsity of precipitation measurements and the need to obtain measurements over a sufficient length of time. To estimate AAP over the entire watershed, historical precipitation data and station elevations were obtained from a network of 62 stations in southern Nevada and southeastern California. Multivariate geostatistics (cokriging) was selected as an estimation method because of a significant (p = 0.05) correlation of r = 0.75 between the natural log of AAP and station elevation. A sample direct variogram for the transformed variable, TAAP = ln[(AAP) × 1000], was fitted with an isotropic, spherical model defined by a small nugget value of 5000, a range of 190 000 ft, and a sill value equal to the sample variance of 163 151. Elevations for 1531 additional locations were obtained from topographic maps to improve the accuracy of cokriged estimates. A sample direct variogram for elevation was fitted with an isotropic model consisting of a nugget value of 5500 and three nested transition structures: a Gaussian structure with a range of 61 000 ft, a spherical structure with a range of 70 000 ft, and a quasi-stationary, linear structure. The use of an isotropic, stationary model for elevation was considered valid within a sliding-neighborhood radius of 120 000 ft. The problem of fitting a positive-definite, nonlinear model of coregionalization to an inconsistent sample cross variogram for TAAP and elevation was solved by a modified use of the Cauchy-Schwarz inequality. A selected cross-variogram model consisted of two nested structures: a Gaussian structure with a range of 61 000 ft and a spherical structure with a range of 190 000 ft.
Cross validation was used for model selection and for
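
    The nested variogram described for elevation (a nugget plus Gaussian and spherical transition structures) can be sketched as a sum of component models. The nugget of 5500 and the ranges of 61,000 ft and 70,000 ft follow the abstract; the partial sills assigned to each structure are assumed, and the quasi-stationary linear structure is omitted.

```python
import numpy as np

# Nested variogram sketch: nugget + Gaussian + spherical structures.
# Ranges follow the abstract; partial sills (c) are assumed values.
def gaussian_struct(h, c, a):
    # rises smoothly, reaching ~95% of c near the practical range a
    return c * (1.0 - np.exp(-3.0 * (np.asarray(h, float) / a) ** 2))

def spherical_struct(h, c, a):
    h = np.asarray(h, dtype=float)
    g = c * (1.5 * h / a - 0.5 * (h / a) ** 3)
    return np.where(h >= a, c, g)  # flat at c beyond the range a

def nested_variogram(h, nugget=5500.0):
    return (nugget
            + gaussian_struct(h, c=40_000.0, a=61_000.0)   # assumed sill
            + spherical_struct(h, c=60_000.0, a=70_000.0))  # assumed sill

h = np.array([0.0, 30_000.0, 70_000.0, 120_000.0])
print(nested_variogram(h))
```

    Summing permissible component models in this way keeps the combined model permissible, which is why nested structures are a standard way to capture variability acting at several spatial scales.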

  10. Geostatistical analysis of groundwater chemistry in Japan. Evaluation of the base case groundwater data set

    International Nuclear Information System (INIS)

    Groundwater chemistry is one of the important geological environment characteristics for the performance assessment of a high-level radioactive waste disposal system. This report describes the results of a geostatistical analysis of groundwater chemistry in Japan. Over 15,000 separate groundwater analyses of deep Japanese groundwaters have been collected for the purpose of evaluating the range of geochemical conditions for geological radioactive waste repositories in Japan. Groundwater chemistry is significant for issues such as radioelement solubility limits, sorption, corrosion of the overpack, behavior of compacted clay buffers, and many other factors involved in safety assessment. It is important, therefore, that a small but representative set of groundwater types be identified so that defensible models and data for generic repository performance assessment can be established. Principal component analysis (PCA) is used to categorize representative deep groundwater types from this extensive data set. PCA is a multivariate statistical analysis technique, similar to factor analysis or eigenvector analysis, designed to provide the best possible resolution of the variability within multivariate data sets. PCA allows the graphical inspection of the most important similarities (clustering) and differences among samples, based on simultaneous consideration of all variables in the dataset, in a low-dimensionality plot. It also allows the analyst to determine the reasons behind any pattern that is observed. In this study, PCA has been aided by hierarchical cluster analysis (HCA), in which statistical indices of similarity among multiple samples are used to distinguish distinct clusters of samples. HCA allows the natural, a priori, grouping of data into clusters showing similar attributes and is graphically represented in a dendrogram. Pirouette is the multivariate statistical software package used to conduct the PCA and HCA for the Japanese groundwater dataset.
An audit of the initial 15,000 sample dataset on the basis of
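
    The PCA step described above can be sketched with a bare-bones eigendecomposition of standardized data. The 5-sample, 3-analyte matrix below is invented for illustration; the report itself used the Pirouette package on the full groundwater dataset.

```python
import numpy as np

# Bare-bones PCA: standardize variables, eigendecompose the covariance of
# the standardized data, sort components by variance. Toy data only.
def pca(X):
    Z = (X - X.mean(axis=0)) / X.std(axis=0)    # standardize each analyte
    evals, evecs = np.linalg.eigh(np.cov(Z.T))  # symmetric eigen-decomposition
    order = np.argsort(evals)[::-1]             # descending variance
    scores = Z @ evecs[:, order]                # principal-component scores
    return evals[order], scores

X = np.array([[1.0, 200.0, 7.1],   # hypothetical analyte concentrations
              [1.2, 210.0, 7.0],
              [5.0,  90.0, 8.2],
              [5.5,  80.0, 8.3],
              [3.1, 150.0, 7.6]])
evals, scores = pca(X)
explained = evals / evals.sum()
print(f"PC1 explains {explained[0]:.0%} of variance")
```

    Plotting the first two columns of `scores` gives exactly the kind of low-dimensionality cluster plot the abstract describes; HCA would then group samples by distance in that score space.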

  11. Analysis of Geostatistical and Deterministic Techniques in the Spatial Variation of Groundwater Depth in the North-western part of Bangladesh

    Directory of Open Access Journals (Sweden)

    Ibrahim Hassan

    2016-06-01

    Various geostatistical and deterministic techniques were used to analyse the spatial variation of groundwater depths. Two geostatistical methods, ordinary kriging and co-kriging, with four semivariogram models (spherical, exponential, circular and Gaussian), and four deterministic methods, inverse distance weighted (IDW), global polynomial interpolation (GPI), local polynomial interpolation (LPI) and radial basis function (RBF), were used to estimate groundwater depths. The study area comprises three northwestern districts of Bangladesh. Groundwater depth data recorded from 132 observation wells in the study area over a period of 6 years (2004 to 2009) were considered for the analysis. The spatial interpolation of groundwater depths was then performed using the best-fit model, a geostatistical model selected by comparing the RMSE values predicted by the geostatistical and deterministic models and the empirical semivariogram models. Of the four semivariogram models, the spherical semivariogram with the co-kriging model was considered the best-fitted model for the study area. A sensitivity analysis conducted on the input parameters shows that the inputs have a strong influence on groundwater levels, and the statistical indicators RMSE and ME suggest that co-kriging with percolation works best in predicting the average groundwater table of the study area.
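
    Before any of the candidate models (spherical, exponential, circular, Gaussian) can be fitted, an empirical semivariogram must be computed from the well data. A minimal estimator is sketched below on synthetic groundwater depths; the coordinates, values, and lag bins are invented.

```python
import numpy as np

# Empirical semivariogram: for each lag bin, average 0.5*(v_i - v_j)^2
# over all point pairs whose separation falls in the bin. Synthetic data.
def empirical_semivariogram(xy, v, bins):
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    sq = 0.5 * (v[:, None] - v[None, :]) ** 2  # semivariance per pair
    iu = np.triu_indices(len(v), k=1)          # keep each pair once
    d, sq = d[iu], sq[iu]
    idx = np.digitize(d, bins)
    gamma = [sq[idx == k].mean() for k in range(1, len(bins))
             if np.any(idx == k)]
    return np.array(gamma)

rng = np.random.default_rng(1)
xy = rng.uniform(0.0, 100.0, size=(40, 2))
depth = 10.0 + 0.05 * xy[:, 0] + rng.normal(0.0, 0.2, 40)  # trend + noise
gamma = empirical_semivariogram(xy, depth, bins=np.linspace(0.0, 100.0, 6))
print(np.round(gamma, 3))
```

    The model (spherical, exponential, etc.) that best fits these binned values is then carried forward into kriging or co-kriging, as in the study's model-selection step.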

  12. Bayesian Geostatistical Analysis and Ecoclimatic Determinants of Corynebacterium pseudotuberculosis Infection among Horses

    Science.gov (United States)

    Boysen, Courtney; Davis, Elizabeth G.; Beard, Laurie A.; Lubbers, Brian V.; Raghavan, Ram K.

    2015-01-01

    Kansas witnessed an unprecedented outbreak of Corynebacterium pseudotuberculosis infection among horses, a disease commonly referred to as pigeon fever, during fall 2012. Bayesian geostatistical models were developed to identify key environmental and climatic risk factors associated with C. pseudotuberculosis infection in horses. Positive infection status among horses (cases) was determined by positive test results for characteristic abscess formation, positive bacterial culture on purulent material obtained from a lanced abscess (n = 82), or positive serologic evidence of exposure to the organism (≥1:512) (n = 11). Horses negative for these tests (n = 172) (controls) were considered free of infection. Information pertaining to horse demographics and stabled location was obtained through review of medical records and/or contact with horse owners via telephone. Covariate information for environmental and climatic determinants was obtained from USDA (soil attributes), USGS (land use/land cover), and NASA MODIS and NASA Prediction of Worldwide Renewable Resources (climate). Candidate covariates were screened using univariate regression models followed by Bayesian geostatistical models with and without covariates. The best-performing model indicated a protective effect for higher soil moisture content (OR = 0.53, 95% CrI = 0.25, 0.71), and detrimental effects for higher land surface temperature (≥35°C) (OR = 2.81, 95% CrI = 2.21, 3.85) and habitat fragmentation (OR = 1.31, 95% CrI = 1.27, 2.22) on C. pseudotuberculosis infection status in horses, while age, gender and breed had no effect. The preventative and ecoclimatic significance of these findings is discussed. PMID:26473728

  13. Geostatistical Analysis of Spatial Variability of Mineral Abundance and Kd in Frenchman Flat, NTS, Alluvium

    Energy Technology Data Exchange (ETDEWEB)

    Carle, S F; Zavarin, M; Pawloski, G A

    2002-11-01

    LLNL hydrologic source term modeling at the Cambric site (Pawloski et al., 2000) showed that retardation of radionuclide transport is sensitive to the distribution and amount of radionuclide-sorbing minerals. While all mineralogic information available near the Cambric site was used in these early simulations (11 mineral abundance analyses from UE-5n and 9 from RNM-1), these older data sets were qualitative in nature, with detection limits too high to accurately measure many of the important radionuclide-sorbing minerals (e.g. iron oxide). Also, the sparse nature of the mineral abundance data permitted only a hypothetical description of the spatial distribution of radionuclide-sorbing minerals. Yet the modeling results predicted that the spatial distribution of sorbing minerals would strongly affect radionuclide transport. Clearly, additional data are needed to improve understanding of mineral abundances and their spatial distributions if model predictions in Frenchman Flat are to be defensible. This report evaluates new high-resolution quantitative X-ray diffraction (XRD) data on mineral distributions and abundances from core samples recently collected from drill hole ER-5-4. The 94 samples from ER-5-4 were collected at various spacings to enable evaluation of spatial variability at scales from as small as 0.3 meters up to hundreds of meters. Additional XRD analyses obtained from drill holes UE-5n, ER-5-3, and U-11g-1 are used to augment the evaluation of vertical spatial variability and permit some evaluation of lateral spatial variability. A total of 163 samples are evaluated. The overall goal of this study is to understand and characterize the spatial variation of sorbing minerals in Frenchman Flat alluvium using geostatistical techniques, with consideration for the potential impact on reactive transport of radionuclides. To achieve this goal requires an effort to ensure that plausible geostatistical models are used to
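
The multi-scale spatial-variability evaluation described above rests on the experimental semivariogram. As an illustrative sketch (synthetic 1-D sample depths, not the ER-5-4 XRD data), it can be computed with binned lags:

```python
import numpy as np

def experimental_semivariogram(x, v, lag_edges):
    """Experimental semivariogram gamma(h) = 0.5 * mean((v_i - v_j)^2)
    over pairs whose separation falls in each lag bin."""
    x = np.asarray(x, float)
    v = np.asarray(v, float)
    h = np.abs(x[:, None] - x[None, :])        # pairwise separations
    sq = 0.5 * (v[:, None] - v[None, :]) ** 2  # half squared differences
    iu = np.triu_indices(len(x), k=1)          # count each pair once
    h, sq = h[iu], sq[iu]
    gamma, counts = [], []
    for lo, hi in zip(lag_edges[:-1], lag_edges[1:]):
        m = (h >= lo) & (h < hi)
        gamma.append(sq[m].mean() if m.any() else np.nan)
        counts.append(int(m.sum()))
    return np.array(gamma), np.array(counts)
```

Running this with lag bins at several scales (sub-meter and tens of meters) mirrors the report's strategy of mixing closely spaced and widely spaced core samples.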

  14. A Resampling-Based Stochastic Approximation Method for Analysis of Large Geostatistical Data

    KAUST Repository

    Liang, Faming

    2013-03-01

    The Gaussian geostatistical model has been widely used in modeling of spatial data. However, it is challenging to computationally implement this method because it requires the inversion of a large covariance matrix, particularly when there is a large number of observations. This article proposes a resampling-based stochastic approximation method to address this challenge. At each iteration of the proposed method, a small subsample is drawn from the full dataset, and then the current estimate of the parameters is updated accordingly under the framework of stochastic approximation. Since the proposed method makes use of only a small proportion of the data at each iteration, it avoids inverting large covariance matrices and thus is scalable to large datasets. The proposed method also leads to a general parameter estimation approach, maximum mean log-likelihood estimation, which includes the popular maximum (log)-likelihood estimation (MLE) approach as a special case and is expected to play an important role in analyzing large datasets. Under mild conditions, it is shown that the estimator resulting from the proposed method converges in probability to a set of parameter values of equivalent Gaussian probability measures, and that the estimator is asymptotically normally distributed. To the best of the authors' knowledge, the present study is the first one on asymptotic normality under infill asymptotics for general covariance functions. The proposed method is illustrated with large datasets, both simulated and real. Supplementary materials for this article are available online. © 2013 American Statistical Association.
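
The abstract's core idea, updating parameter estimates from small random subsamples so that only small covariance matrices are ever factorized, can be sketched as follows. This is a toy Robbins-Monro illustration with an exponential covariance and finite-difference gradients, not the authors' implementation; the gradient clipping and step-size schedule are ad hoc assumptions made only to keep the sketch stable:

```python
import numpy as np

def subsample_loglik(theta, xy, z, idx):
    """Gaussian log-likelihood of a subsample under an exponential covariance
    C(h) = sigma2 * exp(-h / rho), with theta = (log sigma2, log rho)."""
    sigma2, rho = np.exp(theta)
    pts, obs = xy[idx], z[idx]
    h = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    K = sigma2 * np.exp(-h / rho) + 1e-8 * np.eye(len(idx))
    _, logdet = np.linalg.slogdet(K)
    return -0.5 * (logdet + obs @ np.linalg.solve(K, obs))

def resampling_sa(xy, z, theta0, m=20, iters=200, seed=1):
    """Robbins-Monro ascent on finite-difference gradients of the subsample
    log-likelihood; only m-by-m covariance matrices are ever factorized."""
    rng = np.random.default_rng(seed)
    theta = np.array(theta0, float)
    eps = 1e-4
    for t in range(1, iters + 1):
        idx = rng.choice(len(z), size=m, replace=False)
        grad = np.zeros_like(theta)
        for j in range(theta.size):
            e = np.zeros_like(theta)
            e[j] = eps
            grad[j] = (subsample_loglik(theta + e, xy, z, idx)
                       - subsample_loglik(theta - e, xy, z, idx)) / (2 * eps)
        grad = np.clip(grad, -5.0, 5.0)   # crude safeguard for the sketch
        theta += (0.1 / t**0.7) * grad    # decreasing step sizes
    return np.exp(theta)                  # (sigma2, rho)
```

Each iteration costs O(m^3) regardless of the full dataset size, which is the scalability argument the article makes.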

  15. Integration of GIS Using GEOSTAtistical INterpolation Techniques (Kriging) (GEOSTAINT-K) in Deterministic Models for Landslide Susceptibility Analysis (LSA) at Kota Kinabalu, Sabah, Malaysia

    OpenAIRE

    Rodeano Roslee; Tajul Anuar Jamaluddin; Mustapa Abd Talip

    2012-01-01

    A practical application of GEOSTAtistical INterpolation Techniques (Kriging) (GEOSTAINT-K) in a deterministic model for landslide susceptibility analysis (LSA) was used to calculate the factor of safety (FOS) and failure probabilities for the area of Kota Kinabalu, Sabah. In this paper, the LSA value is expressed by a FOS value, which is the ratio of the forces that make the slope fail to those that prevent the slope from failing. A geotechnical engineering properties database has be...
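
The abstract does not reproduce its FOS formulation. A common deterministic kernel for this kind of analysis is the infinite-slope model, sketched below as an assumption about the general approach (all parameter values in the usage are illustrative, not from the paper):

```python
import math

def infinite_slope_fos(c_eff, phi_deg, gamma, depth, beta_deg, pore_pressure=0.0):
    """Standard infinite-slope factor of safety: resisting shear strength
    (cohesion + frictional strength on the slip plane) over driving shear."""
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    normal = gamma * depth * math.cos(beta) ** 2 - pore_pressure
    resisting = c_eff + normal * math.tan(phi)
    driving = gamma * depth * math.sin(beta) * math.cos(beta)
    return resisting / driving
```

With zero cohesion and no pore pressure this reduces to tan(phi)/tan(beta), the textbook dry frictional case; FOS below 1 flags a cell as susceptible.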

  16. Geostatistical Modeling of Pore Velocity

    Energy Technology Data Exchange (ETDEWEB)

    Devary, J.L.; Doctor, P.G.

    1981-06-01

    A significant part of evaluating a geologic formation as a nuclear waste repository involves the modeling of contaminant transport in the surrounding media in the event the repository is breached. The commonly used contaminant transport models are deterministic. However, the spatial variability of hydrologic field parameters introduces uncertainties into contaminant transport predictions. This paper discusses the application of geostatistical techniques to the modeling of spatially varying hydrologic field parameters required as input to contaminant transport analyses. Kriging estimation techniques were applied to Hanford Reservation field data to calculate hydraulic conductivity and the ground-water potential gradients. These quantities were statistically combined to estimate the groundwater pore velocity and to characterize the pore velocity estimation error. Combining geostatistical modeling techniques with product error propagation techniques results in an effective stochastic characterization of groundwater pore velocity, a hydrologic parameter required for contaminant transport analyses.
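
The statistical combination described above (conductivity and gradient combined into pore velocity, with the estimation error propagated through the product) can be illustrated with a first-order delta-method sketch; treating the three estimates as independent is a simplifying assumption, and the porosity term is added here for completeness:

```python
import math

def pore_velocity_with_error(K, sK, i, si, n, sn):
    """Pore velocity v = K * i / n and its first-order (delta-method)
    standard error, treating K (conductivity), i (gradient) and
    n (porosity) as independent estimates with standard errors sK, si, sn."""
    v = K * i / n
    rel = math.sqrt((sK / K) ** 2 + (si / i) ** 2 + (sn / n) ** 2)
    return v, abs(v) * rel
```

For products and quotients the relative variances simply add, which is why kriging both the conductivity field and the potential gradient (each with its own estimation error) yields a stochastic characterization of the velocity field.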

  17. Applied functional analysis

    CERN Document Server

    Oden, J Tinsley

    2010-01-01

    The textbook is designed to drive a crash course for beginning graduate students majoring in something besides mathematics, introducing mathematical foundations that lead to classical results in functional analysis. More specifically, Oden and Demkowicz want to prepare students to learn the variational theory of partial differential equations, distributions, and Sobolev spaces and numerical analysis with an emphasis on finite element methods. The 1996 first edition has been used in a rather intensive two-semester course. -Book News, June 2010

  18. Applied functional analysis

    CERN Document Server

    Griffel, DH

    2002-01-01

    A stimulating introductory text, this volume examines many important applications of functional analysis to mechanics, fluid mechanics, diffusive growth, and approximation. Detailed enough to impart a thorough understanding, the text is also sufficiently straightforward for those unfamiliar with abstract analysis. Its four-part treatment begins with distribution theory and discussions of Green's functions. Essentially independent of the preceding material, the second and third parts deal with Banach spaces, Hilbert space, spectral theory, and variational techniques. The final part outlines the

  19. Reducing complexity of inverse problems using geostatistical priors

    DEFF Research Database (Denmark)

    Hansen, Thomas Mejer; Mosegaard, Klaus; Cordua, Knud Skou

    posterior sample, can be reduced significantly using informed priors based on geostatistical models. We discuss two approaches to include such geostatistically based prior information. One is based on a parametric description of the prior likelihood that applies to 2-point based statistical models, and...

  20. Seismic forecast using geostatistics

    International Nuclear Information System (INIS)

    The main idea of this research direction consists in a special way of constructing a new type of mathematical function, defined as a correlation between a computed statistical quantity and another physical quantity. This type of function, called a 'position function', was taken over by the authors of this study in the field of seismology with the hope of solving, at least partially, the difficult problem of seismic forecast. The geostatistical method of analysis focuses on the process of energy accumulation in a given seismic area, completing this analysis with a so-called loading function. This function, in fact a temporal function, describes the process of energy accumulation during a seismic cycle in a given seismic area. It was possible to discover a law of evolution of the seismic cycles that was materialized in a so-called characteristic function. This special function helps to forecast the magnitude and the occurrence time of the largest earthquake in the analysed area. Since 2000, the authors have moved to a new stage of testing: real-time analysis, in order to verify the quality of the method. Five large-earthquake forecasts have been made. (authors)

  1. Geostatistical Analysis of Tritium, 3H/3He Age and Noble Gas Derived Parameters in California Groundwater

    Science.gov (United States)

    Visser, A.; Singleton, M. J.; Moran, J. E.; Fram, M. S.; Kulongoski, J. T.; Esser, B. K.

    2014-12-01

    Key characteristics of California groundwater systems related to aquifer vulnerability, sustainability, recharge locations and mechanisms, and anthropogenic impact on recharge are revealed in a spatial geostatistical analysis of the data set of tritium, dissolved noble gas and helium isotope analyses collected for the California State Water Resources Control Board's Groundwater Ambient Monitoring and Assessment (GAMA) and California Aquifer Susceptibility (CAS) programs. Over 4,000 tritium and noble gas analyses are available from wells across California. 25% of the analyzed samples contained less than 1 pCi/L of tritium, indicating recharge occurred before 1950. The correlation length of tritium concentration is 120 km. Nearly 50% of the wells show a significant component of terrigenic helium. Over 50% of these samples show a terrigenic helium isotope ratio (Rter) that is significantly higher than the radiogenic helium isotope ratio (Rrad = 2×10⁻⁸). Rter values of more than three times the atmospheric isotope ratio (Ra = 1.384×10⁻⁶) are associated with known faults and volcanic provinces in Northern California. In the Central Valley, Rter varies from radiogenic to 2.25 Ra, complicating 3H/3He dating. Rter was mapped by kriging, showing a correlation length of less than 50 km. The locally predicted Rter was used to separate tritiogenic from atmospheric and terrigenic 3He. Regional groundwater recharge areas, indicated by young groundwater ages, are located in the southern Santa Clara Basin, the upper LA Basin, and the eastern San Joaquin Valley, and along unlined canals carrying Colorado River water. Recharge in California is dominated by agricultural return flows, river recharge and managed aquifer recharge rather than by precipitation excess. The combined application of noble gases and other groundwater tracers reveals the impact of engineered groundwater recharge and proves invaluable for the study of complex groundwater systems. This work was performed under the

  2. Satellite Magnetic Residuals Investigated With Geostatistical Methods

    DEFF Research Database (Denmark)

    Fox Maule, Cathrine; Mosegaard, Klaus; Olsen, Nils

    2005-01-01

    The geomagnetic field varies on a variety of time- and length scales, which are only rudimentarily considered in most present field models. The part of the observed field that cannot be explained by a given model, the model residuals, is often considered as an estimate of the data uncertainty...... (which consists of measurement errors and unmodeled signal), and is typically assumed to be uncorrelated and Gaussian distributed. We have applied geostatistical methods to analyze the residuals of the Oersted (09d/04) field model (www.dsri.dk/Oersted/Field models/IGRF 2005 candidates/), which is based...... on 5 years of Ørsted and CHAMP data and includes secular variation and acceleration, as well as low-degree external (magnetospheric) and induced fields. The analysis is done in order to find the statistical behavior of the space-time structure of the residuals, as a proxy for the data covariances. Once...

  3. Women in applied behavior analysis

    OpenAIRE

    McSweeney, Frances K.; Donahoe, Patricia; Swindell, Samantha

    2000-01-01

    The status of women in applied behavior analysis was examined by comparing the participation of women in the Journal of Applied Behavior Analysis (JABA) to their participation in three similar journals. For all journals, the percentage of articles with at least one female author, the percentage of authors who are female, and the percentage of articles with a female first author increased from 1978 to 1997. Participation by women in JABA was equal to or greater than participation by women in t...

  4. Geostatistical Methods in R

    Directory of Open Access Journals (Sweden)

    Adéla Volfová

    2012-10-01

    Full Text Available Geostatistics is a scientific field which provides methods for processing spatial data.  In our project, geostatistics is used as a tool for describing spatial continuity and making predictions of some natural phenomena. An open source statistical project called R is used for all calculations. Listeners will be provided with a brief introduction to R and its geostatistical packages and basic principles of kriging and cokriging methods. Heavy mathematical background is omitted due to its complexity. In the second part of the presentation, several examples are shown of how to make a prediction in the whole area of interest where observations were made in just a few points. Results of these methods are compared.
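
The kriging predictor that such packages implement can be written compactly. Here is a minimal ordinary-kriging sketch in 1-D with an exponential variogram (numpy rather than R, purely for illustration; the variogram parameters are assumptions, where a real workflow would fit them to an experimental semivariogram first):

```python
import numpy as np

def ordinary_kriging(x_obs, z_obs, x_new, sill=1.0, rng_par=1.0, nugget=0.0):
    """Ordinary kriging in 1-D with an exponential variogram
    gamma(h) = nugget + sill * (1 - exp(-h / rng_par))."""
    def gamma(h):
        return nugget + sill * (1.0 - np.exp(-h / rng_par))
    n = len(x_obs)
    # Kriging system: variogram matrix bordered by the unbiasedness constraint.
    H = np.abs(x_obs[:, None] - x_obs[None, :])
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = gamma(H)
    A[n, n] = 0.0
    preds = []
    for x0 in np.atleast_1d(x_new):
        b = np.ones(n + 1)
        b[:n] = gamma(np.abs(x_obs - x0))
        w = np.linalg.solve(A, b)
        preds.append(float(w[:n] @ z_obs))
    return np.array(preds)
```

Two defining properties fall out of the system directly: with no nugget the predictor interpolates the data exactly, and the weights always sum to one, so a constant field is reproduced everywhere.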

  5. Chapter J: Issues and challenges in the application of geostatistics and spatial-data analysis to the characterization of sand-and-gravel resources

    Science.gov (United States)

    Hack, Daniel R.

    2005-01-01

    Sand-and-gravel (aggregate) resources are a critical component of the Nation's infrastructure, yet aggregate-mining technologies lag far behind those of metalliferous mining and other sectors. Deposit-evaluation and site-characterization methodologies are antiquated, and few serious studies of the potential applications of spatial-data analysis and geostatistics have been published. However, because of commodity usage and the necessary proximity of a mine to end use, aggregate-resource exploration and evaluation differ fundamentally from comparable activities for metalliferous ores. Acceptable practices, therefore, can reflect this cruder scale. The increasing use of computer technologies is colliding with the need for sand-and-gravel mines to modernize and improve their overall efficiency of exploration, mine planning, scheduling, automation, and other operations. The emergence of megaquarries in the 21st century will also be a contributing factor. Preliminary research into the practical applications of exploratory-data analysis (EDA) has been promising. For example, EDA was used to develop a linear-regression equation to forecast freeze-thaw durability from absorption values for Lower Paleozoic carbonate rocks mined for crushed aggregate from quarries in Oklahoma. Applications of EDA within a spatial context, a method of spatial-data analysis, have also been promising, as with the investigation of undeveloped sand-and-gravel resources in the sedimentary deposits of Pleistocene Lake Bonneville, Utah. Formal geostatistical investigations of sand-and-gravel deposits are quite rare, and the primary focus of those studies that have been completed is on the spatial characterization of deposit thickness and its subsequent effect on ore reserves. A thorough investigation of a gravel deposit in an active aggregate-mining area in central Essex, U.K., emphasized the problems inherent in the geostatistical characterization of particle-size-analysis data. Beyond such factors

  6. GEOSTATISTICS FOR WASTE MANAGEMENT: A USER'S MANUAL FOR THE GEOPAK (VERSION 1.0) GEOSTATISTICAL SOFTWARE SYSTEM

    Science.gov (United States)

    A comprehensive, user-friendly geostatistical software system called GEOPACk has been developed. he purpose of this software is to make available the programs necessary to undertake a geostatistical analysis of spatially correlated data. he programs were written so that they can ...

  8. Analysis of field-scale spatial correlations and variations of soil nutrients using geostatistics.

    Science.gov (United States)

    Liu, Ruimin; Xu, Fei; Yu, Wenwen; Shi, Jianhan; Zhang, Peipei; Shen, Zhenyao

    2016-02-01

    Spatial correlations and soil nutrient variations are important for soil nutrient management. They help to reduce the negative impacts of agricultural nonpoint source pollution. Based on available nitrogen (AN), available phosphorus (AP), and available potassium (AK) soil nutrient data sampled in 2010, the spatial correlations were analyzed and the probabilities of nutrient abundance or deficiency were discussed. This paper presents a statistical approach to spatial analysis, the spatial correlation analysis (SCA), which was originally developed for describing heterogeneity in the presence of correlated variation and is based on ordinary kriging (OK) results. Indicator kriging (IK) was used to assess the risk of soil nutrient overabundance relative to crop needs. The kriged results showed a distinct spatial variability in the concentrations of all three soil nutrients. High concentrations of all three soil nutrients were found near Anzhou. As the distance from the center of town increased, the concentrations of the soil nutrients gradually decreased. Spatially, the relationship between AN and AP was negative, and the relationship between AP and AK was not clear. The IK results showed that there were few areas at risk of AN and AP overabundance. However, almost the entire study region was at risk of AK overabundance. Based on the soil nutrient distribution results, it is clear that the spatial variability of the soil nutrients differed throughout the study region. This spatial soil nutrient variability might be caused by different fertilizer types and different fertilizing practices. PMID:26832723
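
Indicator kriging, as used above, estimates exceedance probabilities by interpolating 0/1 indicator transforms of the data at a threshold. The sketch below keeps the indicator idea but substitutes simple inverse-distance weights for kriging weights, purely to keep the illustration short; the threshold and coordinates are invented:

```python
import numpy as np

def indicator_transform(z, threshold):
    """1 where the nutrient concentration exceeds the crop-need threshold."""
    return (np.asarray(z) > threshold).astype(float)

def exceedance_probability(xy_obs, z_obs, xy0, threshold, power=2.0):
    """Weighted average of the 0/1 indicators (IDW weights standing in for
    indicator-kriging weights) estimates P(Z > threshold) at xy0."""
    ind = indicator_transform(z_obs, threshold)
    d = np.linalg.norm(xy_obs - xy0, axis=1)
    if np.any(d == 0):
        return float(ind[np.argmin(d)])
    w = 1.0 / d ** power
    return float(np.sum(w * ind) / np.sum(w))
```

Because the interpolated value is a weighted average of zeros and ones, it always lands in [0, 1] and reads directly as an overabundance risk map.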

  9. Conversation Analysis in Applied Linguistics

    DEFF Research Database (Denmark)

    Kasper, Gabriele; Wagner, Johannes

    2014-01-01

    For the last decade, conversation analysis (CA) has increasingly contributed to several established fields in applied linguistics. In this article, we will discuss its methodological contributions. The article distinguishes between basic and applied CA. Basic CA is a sociological endeavor concerned...... been driven by applied work. After laying out CA's standard practices of data treatment and analysis, this article takes up the role of comparison as a fundamental analytical strategy and reviews recent developments into cross-linguistic and cross-cultural directions. The remaining article focuses...... on learning and development. In conclusion, we address some emerging themes in the relationship of CA and applied linguistics, including the role of multilingualism, standard social science methods as research objects, CA's potential for direct social intervention, and increasing efforts to complement CA...

  10. Applied analysis and differential equations

    CERN Document Server

    Cârjă, Ovidiu

    2007-01-01

    This volume contains refereed research articles written by experts in the field of applied analysis, differential equations and related topics. Well-known leading mathematicians worldwide and prominent young scientists cover a diverse range of topics, including the most exciting recent developments. A broad range of topics of recent interest are treated: existence, uniqueness, viability, asymptotic stability, viscosity solutions, controllability and numerical analysis for ODE, PDE and stochastic equations. The scope of the book is wide, ranging from pure mathematics to various applied fields such as classical mechanics, biomedicine, and population dynamics.

  11. Mapping, Bayesian Geostatistical Analysis and Spatial Prediction of Lymphatic Filariasis Prevalence in Africa

    Science.gov (United States)

    Slater, Hannah; Michael, Edwin

    2013-01-01

    There is increasing interest to control or eradicate the major neglected tropical diseases. Accurate modelling of the geographic distributions of parasitic infections will be crucial to this endeavour. We used 664 community-level infection prevalence data collated from the published literature in conjunction with eight environmental variables, altitude and population density, and a multivariate Bayesian generalized linear spatial model that allows explicit accounting for spatial autocorrelation and incorporation of uncertainty in input data and model parameters, to construct the first spatially-explicit map describing LF prevalence distribution in Africa. We also ran the best-fit model against predictions made by the HADCM3 and CCCMA climate models for 2050 to predict the likely distributions of LF under future climate and population changes. We show that LF prevalence is strongly influenced by spatial autocorrelation between locations but is only weakly associated with environmental covariates. Infection prevalence, however, is found to be related to variations in population density. All associations with key environmental/demographic variables appear to be complex and non-linear. LF prevalence is predicted to be highly heterogeneous across Africa, with high prevalences (>20%) estimated to occur primarily along coastal West and East Africa, and lowest prevalences predicted for the central part of the continent. Error maps, however, indicate a need for further surveys to overcome problems with data scarcity in the latter and other regions. Analysis of future changes in prevalence indicates that population growth rather than climate change per se will represent the dominant factor in the predicted increase/decrease and spread of LF on the continent. We indicate that these results could play an important role in aiding the development of strategies that are best able to achieve the goals of parasite elimination locally and globally in a manner that may also account

  12. Accuracy and uncertainty analysis of soil Bbf spatial distribution estimation at a coking plant-contaminated site based on normalization geostatistical technologies.

    Science.gov (United States)

    Liu, Geng; Niu, Junjie; Zhang, Chao; Guo, Guanlin

    2015-12-01

    Data distributions are usually severely skewed by the presence of hot spots in contaminated sites, which causes difficulties for accurate geostatistical data transformation. Three typical normal distribution transformation methods, the normal score, Johnson, and Box-Cox transformations, were applied to benzo(b)fluoranthene data from a large-scale coking plant-contaminated site in north China to compare their effects on spatial interpolation. All three transformations decreased the skewness and kurtosis of the benzo(b)fluoranthene data, and all the transformed data passed the Kolmogorov-Smirnov test threshold. Cross validation showed that Johnson ordinary kriging had a minimum root-mean-square error of 1.17 and a mean error of 0.19, which was more accurate than the other two models. The areas with fewer sampling points and with high levels of contamination showed the largest prediction standard errors on the Johnson ordinary kriging prediction map. We introduce an ideal normal transformation method prior to geostatistical estimation for severely skewed data, which enhances the reliability of risk estimation and improves the accuracy of remediation boundary determination. PMID:26300353

  13. Geostatistical Analysis of Winter Rainfall for 2013 in Eastern Black Sea Basin, Turkey (comparison of the past status and future projections)

    Science.gov (United States)

    Ustaoglu, Beyza

    2014-05-01

    Rainfall is one of the most important climatic factors for environmental studies. Several methods (Thiessen polygons, inverse distance weighting (IDW), kriging, etc.) have been used by researchers for spatial interpolation of rainfall data. Kriging is a geostatistical method based on spatial correlation between neighbouring observations to predict attribute values at unsampled locations. The study area, the Eastern Black Sea Basin, has one of the highest rainfall accumulations in Turkey according to the measured station data (1942 - 2011). The Eastern Black Sea Basin is the only basin in Turkey with an increased amount of winter (October, November, December) rainfall in 2013 in comparison to the long-term mean and the previous year's winter rainfall. Regarding the future projections (Ustaoglu, 2011), this basin has one of the strongest increasing trends according to the A2 scenario analysis obtained from the RegCM3 regional climate model for the ten-year periods from 2011 to 2100. In this study, the 2013 winter rainfall in the basin is highlighted and compared with the past and future rainfall conditions of the basin. Keywords: Geostatistical Analysis, Winter Rainfall, Eastern Black Sea Basin

  14. Applied survival analysis using R

    CERN Document Server

    Moore, Dirk F

    2016-01-01

    Applied Survival Analysis Using R covers the main principles of survival analysis, gives examples of how it is applied, and teaches how to put those principles to use to analyze data using R as a vehicle. Survival data, where the primary outcome is time to a specific event, arise in many areas of biomedical research, including clinical trials, epidemiological studies, and studies of animals. Many survival methods are extensions of techniques used in linear regression and categorical data, while other aspects of this field are unique to survival data. This text employs numerous actual examples to illustrate survival curve estimation, comparison of survivals of different groups, proper accounting for censoring and truncation, model variable selection, and residual analysis. Because explaining survival analysis requires more advanced mathematics than many other statistical topics, this book is organized with basic concepts and most frequently used procedures covered in earlier chapters, with more advanced topics...

  15. Integration of geologic interpretation into geostatistical simulation

    Energy Technology Data Exchange (ETDEWEB)

    Carle, S.F.

    1997-06-01

    Embedded Markov chain analysis has been used to quantify geologic interpretation of juxtapositional tendencies of geologic facies. Such interpretations can also be translated into continuous-lag Markov chain models of spatial variability for use in geostatistical simulation of facies architecture.
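
An embedded Markov chain quantifies juxtapositional tendencies as a matrix of upward-transition probabilities with self-transitions excluded. A minimal sketch over a hypothetical facies log:

```python
import numpy as np

def transition_probabilities(sequence, states):
    """Embedded Markov chain: count upward facies transitions along a log
    and normalize each row to probabilities (juxtapositional tendencies)."""
    index = {s: k for k, s in enumerate(states)}
    counts = np.zeros((len(states), len(states)))
    for a, b in zip(sequence[:-1], sequence[1:]):
        if a != b:                # embedded chain: self-transitions excluded
            counts[index[a], index[b]] += 1
    rows = counts.sum(axis=1, keepdims=True)
    return np.divide(counts, rows, out=np.zeros_like(counts), where=rows > 0)
```

The row-normalized matrix is exactly the kind of interpreted juxtapositional summary that can then be translated into a continuous-lag Markov chain model for simulation.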

  16. Overview and technical and practical aspects for use of geostatistics in hazardous-, toxic-, and radioactive-waste-site investigations

    International Nuclear Information System (INIS)

    Technical and practical aspects of applying geostatistics are developed for individuals involved in investigation at hazardous-, toxic-, and radioactive-waste sites. Important geostatistical concepts, such as variograms and ordinary, universal, and indicator kriging, are described in general terms for introductory purposes and in more detail for practical applications. Variogram modeling using measured ground-water elevation data is described in detail to illustrate principles of stationarity, anisotropy, transformations, and cross validation. Several examples of kriging applications are described using ground-water-level elevations, bedrock elevations, and ground-water-quality data. A review of contemporary literature and selected public domain software associated with geostatistics also is provided, as is a discussion of alternative methods for spatial modeling, including inverse distance weighting, triangulation, splines, trend-surface analysis, and simulation

  17. Geostatistical analysis of disease data: accounting for spatial support and population density in the isopleth mapping of cancer mortality risk using area-to-point Poisson kriging

    Directory of Open Access Journals (Sweden)

    Goovaerts Pierre

    2006-11-01

    Full Text Available Abstract Background Geostatistical techniques that account for spatially varying population sizes and spatial patterns in the filtering of choropleth maps of cancer mortality were recently developed. Their implementation was facilitated by the initial assumption that all geographical units are the same size and shape, which allowed the use of geographic centroids in semivariogram estimation and kriging. Another implicit assumption was that the population at risk is uniformly distributed within each unit. This paper presents a generalization of Poisson kriging whereby the size and shape of administrative units, as well as the population density, are incorporated into the filtering of noisy mortality rates and the creation of isopleth risk maps. An innovative procedure to infer the point-support semivariogram of the risk from aggregated rates (i.e. areal data) is also proposed. Results The novel methodology is applied to age-adjusted lung and cervix cancer mortality rates recorded for white females in two contrasted county geographies: (1) the state of Indiana, which consists of 92 counties of fairly similar size and shape, and (2) four states in the Western US (Arizona, California, Nevada and Utah) forming a set of 118 counties that are vastly different geographical units. Area-to-point (ATP) Poisson kriging produces risk surfaces that are less smooth than the maps created by a naïve point kriging of empirical Bayesian smoothed rates. The coherence constraint of ATP kriging also ensures that the population-weighted average of risk estimates within each geographical unit equals the areal datum for this unit. Simulation studies showed that the new approach yields more accurate predictions and confidence intervals than point kriging of areal data where all counties are simply collapsed into their respective polygon centroids. Its benefit over point kriging increases as the county geography becomes more heterogeneous. Conclusion A major limitation of choropleth

  18. The use of geostatistics in the study of floral phenology of Vulpia geniculata (L.) link.

    Science.gov (United States)

    León Ruiz, Eduardo J; García Mozo, Herminia; Domínguez Vilches, Eugenio; Galán, Carmen

    2012-01-01

    Traditionally, phenology studies have focused on changes through time, but there are many instances in ecological research where it is necessary to interpolate among spatially stratified samples. The combined use of Geographical Information Systems (GIS) and geostatistics can be an essential tool for spatial analysis in phenological studies. Geostatistics is a family of statistics that describe correlations through space/time, and it can be used both for quantifying spatial correlation and for interpolating unsampled points. In the present work, estimations based upon geostatistics and GIS mapping have enabled the construction of spatial models that reflect the phenological evolution of Vulpia geniculata (L.) Link throughout the study area during the sampling season. Ten sampling points, scattered throughout the city and the low mountains of the "Sierra de Córdoba", were chosen for weekly phenological monitoring during the flowering season. The phenological data were interpolated by applying the traditional geostatistical method of kriging, which was used to elaborate weekly estimations of V. geniculata phenology in unsampled areas. Finally, the application of geostatistics and GIS to create phenological maps could be an essential complement to pollen aerobiological studies, given the increased interest in obtaining automatic aerobiological forecasting maps. PMID:22629169

  19. Classification of resources with the aid of geostatistics

    International Nuclear Information System (INIS)

    The book gives an introduction to applied geostatistics. Examples of application in uranium exploration, hard coal mining, and exploration of marine ore sludges are described. A GDMB classification of resources is presented as well as a list of the basic concepts of geostatistics in German, English and French. Separate documentary analyses have been carried out on the chapters on uranium exploration and coal mining. (MSK)

  20. A comparison of geostatistically-based inverse techniques for use in performance assessment analysis at the WIPP site results from test case No. 1

    International Nuclear Information System (INIS)

    The groundwater flow pathway in the Culebra Dolomite aquifer at the Waste Isolation Pilot Plant (WIPP) has been identified as a potentially important pathway for radionuclide migration to the accessible environment. Consequently, uncertainties in the models used to describe flow and transport in the Culebra need to be addressed. A 'Geostatistics Test Problem' is being developed as a series of highly complex synthetic data sets; the intent is for the ensemble of these data sets to span the range of possible conceptual models of groundwater flow at the WIPP site. The Test Problem analysis approach is to use a comparison of the probabilistic groundwater travel time (GWTT) estimates produced by each technique as the basis for the evaluation. Participants are given observations of head and transmissivity and are asked to develop stochastic models of groundwater flow for the synthetic system. Cumulative distribution functions (CDFs) of groundwater flow (computed via particle tracking) are constructed using the head and transmissivity data generated through the application of each technique; one semi-analytical method generates the CDFs of groundwater flow directly. This paper describes the results from Test Case No. 1. Of the five techniques compared, those based on the linearized form of the groundwater flow equation exhibited less bias and less spread in their GWTT distribution functions; the semi-analytical method had the least bias. While the results are not sufficient to make generalizations about which techniques may be better suited for the WIPP PA (only one test case has been exercised), analysis of the data from this test case provides some indication of the relative importance of other aspects of the flow modeling (besides inverse method or geostatistical approach) in PA. These ancillary analyses examine the effect of gridding and the effect of boundary conditions on the groundwater travel time estimates
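
    The GWTT comparison rests on building empirical CDFs from an ensemble of particle-tracking travel times. A minimal sketch with synthetic, hypothetically lognormal travel times (the distributional assumption is illustrative only):

    ```python
    import numpy as np

    def empirical_cdf(samples):
        """Empirical CDF of groundwater travel times: one value per
        particle-tracking realization."""
        t = np.sort(np.asarray(samples, dtype=float))
        p = np.arange(1, t.size + 1) / t.size
        return t, p

    # Hypothetical ensemble of travel times (years), lognormal by assumption.
    gen = np.random.default_rng(42)
    gwtt = gen.lognormal(mean=np.log(1.0e4), sigma=0.5, size=1000)
    t, p = empirical_cdf(gwtt)
    median_gwtt = t[np.searchsorted(p, 0.5)]
    ```

    Each inverse technique would contribute its own ensemble, and the resulting CDFs (bias, spread, quantiles such as the median above) are what the test case compares.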

  1. Geostatistical estimation of uranium ore reserves

    International Nuclear Information System (INIS)

    Since the early 1960s geostatistics have been applied for uranium ore reserves calculation, and, as in the case for other minerals, has been considerably developed. This is because of the ability of geostatistics to quantify clearly the main ore reserve questions, i.e. which are the geological (or in situ) ore reserves; what are the effects of a mining selection (recoverable reserves); and what is the precision of these estimates. These different concepts are presented in this paper as applied to uranium ore deposits. First, the specific problem of uranium is analysed, which is the importance of indirect measurements of grade by radiometry logging, which introduces imprecisions generally higher in the uranium grades and tonnages than those coming from the ore reserve calculation itself. (author)

  2. 7th International Geostatistics Congress

    CERN Document Server

    Deutsch, Clayton

    2005-01-01

    The conference proceedings consist of approximately 120 technical papers presented at the Seventh International Geostatistics Congress held in Banff, Alberta, Canada in 2004. All the papers were reviewed by an international panel of leading geostatisticians. The five major sections are: theory, mining, petroleum, environmental and other applications. The first section showcases new and innovative ideas in the theoretical development of geostatistics as a whole; these ideas will have large impact on (1) the directions of future geostatistical research, and (2) the conventional approaches to heterogeneity modelling in a wide range of natural resource industries. The next four sections are focused on applications and innovations relating to the use of geostatistics in specific industries. Historically, mining, petroleum and environmental industries have embraced the use of geostatistics for uncertainty characterization, so these three industries are identified as major application areas. The last section is open...

  3. rasterEngine: an easy-to-use R function for applying complex geostatistical models to raster datasets in a parallel computing environment

    Science.gov (United States)

    Greenberg, J. A.

    2013-12-01

    As geospatial analyses progress in tandem with increasing availability of large complex geographic data sets and high performance computing (HPC), there is an increasing gap in the ability of end-user tools to take advantage of these advances. Specifically, the practical implementation of complex statistical models on large gridded geographic datasets (e.g. remote sensing analysis, species distribution mapping, topographic transformations, and local neighborhood analyses) currently requires a significant knowledge base. A user must be proficient in the chosen model as well as the nuances of scientific programming, raster data models, memory management, parallel computing, and system design. This is further complicated by the fact that many of the cutting-edge analytical tools were developed for non-geospatial datasets and are not part of standard GIS packages, but are available in scientific computing languages such as R and MATLAB. We present a computing function 'rasterEngine' written in the R scientific computing language and part of the CRAN package 'spatial.tools' with these challenges in mind. The goal of rasterEngine is to allow a user to quickly develop and apply analytical models within the R computing environment to arbitrarily large gridded datasets, taking advantage of available parallel computing resources, and without requiring a deep understanding of HPC and raster data models. We provide several examples of rasterEngine being used to solve common grid based analyses, including remote sensing image analyses, topographic transformations, and species distribution modeling. With each example, the parallel processing performance results are presented.
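
    rasterEngine itself is an R function in the 'spatial.tools' package; the chunk-dispatch idea it implements can be sketched in a few lines of Python (a simplified analogue, not the package's code). NDVI here stands in for an arbitrary per-chunk model:

    ```python
    import numpy as np
    from concurrent.futures import ThreadPoolExecutor

    def ndvi(chunk):
        """Per-chunk model: NDVI from a (2, rows, cols) [red, nir] array."""
        red, nir = chunk[0], chunk[1]
        return (nir - red) / (nir + red + 1e-12)

    def raster_engine(raster, fn, n_chunks=4, workers=2):
        """Split a (bands, rows, cols) raster into row-wise chunks and apply
        fn to each chunk in parallel. Threads suffice for this sketch because
        NumPy releases the GIL during array arithmetic."""
        chunks = np.array_split(raster, n_chunks, axis=1)
        with ThreadPoolExecutor(max_workers=workers) as pool:
            results = list(pool.map(fn, chunks))
        return np.concatenate(results, axis=0)

    # Hypothetical 2-band raster: red = 1, nir = 3 everywhere -> NDVI = 0.5.
    demo = np.stack([np.ones((100, 80)), 3.0 * np.ones((100, 80))])
    out = raster_engine(demo, ndvi)
    ```

    The point, as in rasterEngine, is that the user supplies only the per-chunk function; chunking, dispatch, and reassembly are handled by the wrapper. A production version would also handle edge buffers for neighborhood operations, which this sketch omits.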

  4. Modelling infiltration and geostatistical analysis of spatial variability of sorptivity and transmissivity in a flood spreading area

    Energy Technology Data Exchange (ETDEWEB)

    Haghighi-Fashi, F.; Sharifi, F.; Kamali, K.

    2014-06-01

    Knowledge of infiltration characteristics is useful in hydrological studies of agricultural soils. Soil hydraulic parameters such as steady infiltration rate, sorptivity, and transmissivity can exhibit appreciable spatial variability. The main objectives of this study were to examine several mathematical models of infiltration and to analyze the spatial variability of observed final infiltration rate, estimated sorptivity and estimated transmissivity in flood spreading and control areas in Ilam province, Iran. The suitability of geostatistics to describe such spatial variability was assessed using data from 30 infiltration measurements sampled along three lines. The Horton model provided the most accurate simulation of infiltration considering all measurements, while the Philip two-term model was less accurate. A comparison of the measured values and the estimated final infiltration rates showed that the Kostiakov-Lewis, Kostiakov, and SCS models could not estimate the final infiltration rate as well as the Horton model. The estimated sorptivity and transmissivity parameters of the Philip two-term model and the final infiltration rate had spatial structure, and were considered to be structural variables over the transect pattern. The Gaussian model provided the best-fit theoretical variogram for these three parameters. Variogram ranges were 99 m and 88 m for sorptivity and final infiltration rate, and 686 m (spherical model) and 384 m (Gaussian model) for transmissivity. The sorptivity, transmissivity and final infiltration attributes showed a high degree of spatial dependence, being 0.99, 0.81 and 1, respectively. Results showed that kriging could be used to predict the studied parameters in the study area. (Author)
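
    The Horton model favored in this study has the form f(t) = fc + (f0 − fc)·exp(−k·t). For a fixed decay constant k the model is linear in (fc, f0 − fc), so a simple grid search over k plus least squares recovers all three parameters. A sketch on noise-free synthetic data (the parameter values are hypothetical, not the study's):

    ```python
    import numpy as np

    def horton(t, f0, fc, k):
        """Horton infiltration model: f(t) = fc + (f0 - fc) * exp(-k * t)."""
        return fc + (f0 - fc) * np.exp(-k * t)

    def fit_horton(t, f, k_grid):
        """Fit f0, fc, k by grid-searching k; for fixed k the model is
        linear in (fc, f0 - fc), so ordinary least squares does the rest."""
        best = None
        for k in k_grid:
            X = np.column_stack([np.ones_like(t), np.exp(-k * t)])
            coef, *_ = np.linalg.lstsq(X, f, rcond=None)
            sse = float(np.sum((X @ coef - f) ** 2))
            if best is None or sse < best[0]:
                best = (sse, k, coef)
        _, k, (fc, amp) = best
        return {"f0": fc + amp, "fc": fc, "k": k}

    # Hypothetical noise-free infiltration curve (mm/h over hours).
    t = np.linspace(0.1, 5.0, 40)
    f = horton(t, f0=60.0, fc=12.0, k=1.5)
    fit = fit_horton(t, f, k_grid=np.linspace(0.5, 3.0, 26))
    ```

    With field data one would fit each model (Horton, Kostiakov, Philip two-term, ...) this way and compare residuals, which is essentially the model comparison the abstract reports.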

  5. Spatial variability of isoproturon mineralizing activity within an agricultural field: Geostatistical analysis of simple physicochemical and microbiological soil parameters

    International Nuclear Information System (INIS)

    We assessed the spatial variability of isoproturon mineralization in relation to that of physicochemical and biological parameters in fifty soil samples regularly collected along a sampling grid delimited across a 0.36 ha field plot (40 x 90 m). Only faint relationships were observed between isoproturon mineralization and the soil pH, microbial C biomass, and organic nitrogen. Considerable spatial variability was observed for six of the nine parameters tested (isoproturon mineralization rates, organic nitrogen, genetic structure of the microbial communities, soil pH, microbial biomass and equivalent humidity). The map of isoproturon mineralization rates was similar to those of soil pH, microbial biomass, and organic nitrogen but different from those of the structure of the microbial communities and equivalent humidity. Geostatistics revealed that the spatial heterogeneity in the rate of degradation of isoproturon corresponded to that of soil pH and microbial biomass. - In-field spatial variation of isoproturon mineralization mainly results from the spatial heterogeneity of soil pH and microbial C biomass
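
    The spatial-structure analysis behind studies like this one starts from the empirical semivariogram. A minimal Matheron-estimator sketch on a hypothetical one-dimensional transect (real field data would use the 2-D grid of fifty samples):

    ```python
    import numpy as np

    def empirical_semivariogram(xy, z, lags, tol=0.1):
        """Matheron estimator: gamma(h) = mean of 0.5*(z_i - z_j)^2 over all
        pairs whose separation distance falls within tol of lag h."""
        d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
        half_sq = 0.5 * (z[:, None] - z[None, :]) ** 2
        iu = np.triu_indices(len(z), k=1)        # each pair counted once
        d, half_sq = d[iu], half_sq[iu]
        out = []
        for h in lags:
            sel = np.abs(d - h) < tol
            out.append(half_sq[sel].mean() if sel.any() else np.nan)
        return np.array(out)

    # Hypothetical transect: 10 points 1 m apart, z increasing linearly.
    pts = np.column_stack([np.arange(10.0), np.zeros(10)])
    vals = np.arange(10.0)
    gamma = empirical_semivariogram(pts, vals, lags=[1.0, 2.0, 3.0])
    ```

    A theoretical model (spherical, Gaussian, ...) is then fitted to these empirical points, and the nugget-to-sill ratio quantifies the degree of spatial dependence reported for each soil parameter.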

  6. Geostatistics for fracture characterization

    International Nuclear Information System (INIS)

    As the critical role of fractures has become more apparent in fluid flow and contaminant transport studies, the characterization of fracture networks has received considerable attention in a wide variety of applications such as nuclear waste repository design. The application of geostatistics to fracture characterization has traditionally involved modelling fractures as thin disks; assumptions about the frequency, orientation, length and width of these disks allow the construction of a 3D model of the fracture network. This paper examines alternatives whose statistical parameters are more relevant for contaminant transport studies and are also easier to infer and validate. A new algorithm for conditional simulation is presented, one that is able to honor multipoint statistics through annealing. By honoring statistics that cannot be captured with two-point spatial covariances, this algorithm offers an important new tool not only for the specific problem of fracture characterization but also for the more general problem of spatial simulation
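
    The annealing idea is to perturb a candidate field until a chosen spatial statistic matches a target, occasionally accepting uphill moves to escape local minima. A deliberately tiny 1-D toy version (the statistic here is a lag-1 semivariance, not the multipoint statistics of the paper):

    ```python
    import numpy as np

    def anneal_lag1(z, target_gamma, n_iter=5000, t0=0.5, seed=0):
        """Toy simulation-by-annealing: swap pairs of values in a 1-D field so
        its lag-1 semivariance moves toward a target, with a Metropolis
        acceptance rule and a linearly cooling temperature."""
        rng = np.random.default_rng(seed)
        z = np.asarray(z, dtype=float).copy()
        gamma = lambda v: 0.5 * np.mean((v[1:] - v[:-1]) ** 2)
        obj = (gamma(z) - target_gamma) ** 2
        for i in range(n_iter):
            a, b = rng.integers(0, z.size, size=2)
            z[a], z[b] = z[b], z[a]
            new = (gamma(z) - target_gamma) ** 2
            temp = max(t0 * (1.0 - i / n_iter), 1e-12)
            if new <= obj or rng.random() < np.exp(-(new - obj) / temp):
                obj = new                      # accept the swap
            else:
                z[a], z[b] = z[b], z[a]        # reject: undo the swap
        return z, obj

    start = np.arange(20.0)                    # smooth field, low lag-1 variance
    sim, final_obj = anneal_lag1(start, target_gamma=20.0)
    ```

    Because moves are value swaps, the simulated field always honors the sample histogram exactly; the annealing only rearranges values to honor the spatial statistic, which is the property that makes the approach attractive for multipoint constraints.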

  7. Geostatistical analysis of the spatial pattern of the citrus red mite, Panonychus citri (McGregor) (Acarina: Tetranychidae), in a citrus orchard

    Institute of Scientific and Technical Information of China (English)

    李志强; 梁广文; 岑伊静

    2008-01-01

    The citrus red mite, Panonychus citri (McGregor), is a key pest of citrus. Geostatistical methods were applied to study the spatial pattern of the citrus red mite population in a citrus orchard, using the spatial analysis software Variowin 2.1. The results indicated that the spatial pattern of the citrus red mite population can be described geostatistically: the semivariograms mainly fitted Gaussian models, with ranges of 1.1-21.0 m. The population showed an aggregated distribution, with relatively strong aggregation in March, August and September. The spatial pattern dynamics showed two occurrence peaks, in April and October; in October in particular, the population diffused rapidly. March and September are therefore the two crucial stages for monitoring and control of the citrus red mite.

  8. The technical drift of applied behavior analysis

    OpenAIRE

    Hayes, Steven C.; Rincover, Arnold; Solnick, Jay V.

    1980-01-01

    Four dimensions (applied, analytic, general, conceptual) were selected from Baer, Wolf, and Risley's (1968) seminal article on the nature of applied behavior analysis and were monitored throughout the first 10 volumes of the Journal of Applied Behavior Analysis. Each of the experimental articles in Volumes 1 through 6 and the first half of Volumes 7 through 10 was rated on each of these dimensions. The trends showed that applied behavior analysis is becoming a more purely technical effort, wi...

  9. A comparison of geostatistically based inverse techniques for use in performance assessment analysis at the Waste Isolation Pilot Plant Site: Results from Test Case No. 1

    Energy Technology Data Exchange (ETDEWEB)

    Zimmerman, D.A. [GRAM, Inc., Albuquerque, NM (United States); Gallegos, D.P. [Sandia National Labs., Albuquerque, NM (United States)

    1993-10-01

    The groundwater flow pathway in the Culebra Dolomite aquifer at the Waste Isolation Pilot Plant (WIPP) has been identified as a potentially important pathway for radionuclide migration to the accessible environment. Consequently, uncertainties in the models used to describe flow and transport in the Culebra need to be addressed. A ``Geostatistics Test Problem`` is being developed to evaluate a number of inverse techniques that may be used for flow calculations in the WIPP performance assessment (PA). The Test Problem is actually a series of test cases, each being developed as a highly complex synthetic data set; the intent is for the ensemble of these data sets to span the range of possible conceptual models of groundwater flow at the WIPP site. The Test Problem analysis approach is to use a comparison of the probabilistic groundwater travel time (GWTT) estimates produced by each technique as the basis for the evaluation. Participants are given observations of head and transmissivity (possibly including measurement error) or other information such as drawdowns from pumping wells, and are asked to develop stochastic models of groundwater flow for the synthetic system. Cumulative distribution functions (CDFs) of groundwater flow (computed via particle tracking) are constructed using the head and transmissivity data generated through the application of each technique; one semi-analytical method generates the CDFs of groundwater flow directly. This paper describes the results from Test Case No. 1.

  10. A comparison of geostatistically based inverse techniques for use in performance assessment analysis at the Waste Isolation Pilot Plant Site: Results from Test Case No. 1

    International Nuclear Information System (INIS)

    The groundwater flow pathway in the Culebra Dolomite aquifer at the Waste Isolation Pilot Plant (WIPP) has been identified as a potentially important pathway for radionuclide migration to the accessible environment. Consequently, uncertainties in the models used to describe flow and transport in the Culebra need to be addressed. A ''Geostatistics Test Problem'' is being developed to evaluate a number of inverse techniques that may be used for flow calculations in the WIPP performance assessment (PA). The Test Problem is actually a series of test cases, each being developed as a highly complex synthetic data set; the intent is for the ensemble of these data sets to span the range of possible conceptual models of groundwater flow at the WIPP site. The Test Problem analysis approach is to use a comparison of the probabilistic groundwater travel time (GWTT) estimates produced by each technique as the basis for the evaluation. Participants are given observations of head and transmissivity (possibly including measurement error) or other information such as drawdowns from pumping wells, and are asked to develop stochastic models of groundwater flow for the synthetic system. Cumulative distribution functions (CDFs) of groundwater flow (computed via particle tracking) are constructed using the head and transmissivity data generated through the application of each technique; one semi-analytical method generates the CDFs of groundwater flow directly. This paper describes the results from Test Case No. 1

  11. Essentials of applied dynamic analysis

    CERN Document Server

    Jia, Junbo

    2014-01-01

    This book presents up-to-date knowledge of dynamic analysis in the engineering world. To facilitate the understanding of the topics by readers with various backgrounds, general principles are linked to their applications from different angles. Special topics of interest, such as statistics of motions and loading, damping modeling and measurement, nonlinear dynamics, fatigue assessment, vibration and buckling under axial loading, structural health monitoring, human body vibrations, and vehicle-structure interactions, are also presented. The target readers include industry professionals in civil, marine and mechanical engineering, as well as researchers and students in this area.

  12. Applied systems analysis. Pt. 7

    International Nuclear Information System (INIS)

    For the evaluation of future electricity demand in the Federal Republic of Germany, determining factors are derived for analysis with a multi-sectoral energy model. The development of electricity use is calculated for two cases of economic growth in the industry, private household, commercial and transportation sectors. The present situation in the planning and construction of new power plants makes a future electricity shortage conceivable. For several restrictions on the use of primary energy carriers for electricity generation, the possibilities of covering demand are analysed. Furthermore, the effects that different strategies of extending power plant capacity may have on the economy and the environment are discussed. (orig.)

  13. Applied systems analysis. No. 22

    International Nuclear Information System (INIS)

    Based on a detailed analysis of demand in the Cologne/Frankfurt area, the quantity of system products for this region that, in view of technical conditions and entrepreneurial aspects, appeared marketable at cost parity with competing energy supplies was ascertained. Based on these data, the technical components of the system, siting and piping were fixed, and capital and operating costs were determined. To judge the economics, the key figures of net present value, internal rate of return and cost recovery rate were determined from the cost difference between the nuclear long-distance energy system and alternative facilities. Furthermore, specific production costs, associated prices and contribution margins are presented for each product. (orig.)

  14. Spatial Downscaling of TRMM Precipitation Using Geostatistics and Fine Scale Environmental Variables

    Directory of Open Access Journals (Sweden)

    No-Wook Park

    2013-01-01

    Full Text Available A geostatistical downscaling scheme is presented that can generate fine scale precipitation information from coarse scale Tropical Rainfall Measuring Mission (TRMM) data by incorporating auxiliary fine scale environmental variables. Within the geostatistical framework, the TRMM precipitation data are first decomposed into trend and residual components. Quantitative relationships between coarse scale TRMM data and environmental variables are then estimated via regression analysis and used to derive trend components at a fine scale. Next, the residual components, which are the differences between the trend components and the original TRMM data, are downscaled to the target fine scale via area-to-point kriging. The trend and residual components are finally added to generate fine scale precipitation estimates. Stochastic simulation is also applied to the residual components in order to generate multiple alternative realizations and to compute uncertainty measures. In an experiment using a digital elevation model (DEM) and normalized difference vegetation index (NDVI), the geostatistical downscaling scheme generated downscaling results that reflected detailed characteristics, with better predictive performance than downscaling without the environmental variables. Multiple realizations and uncertainty measures from simulation also provided useful information for interpretation and further environmental modeling.
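
    The trend-plus-residual decomposition described here can be sketched end to end. In this minimal version the trend is a linear regression on elevation and the residuals are carried to the fine grid by inverse-distance weighting, a simple stand-in for the paper's area-to-point kriging; all data are hypothetical:

    ```python
    import numpy as np

    def downscale(coarse_precip, coarse_elev, coarse_xy, fine_elev, fine_xy):
        """Trend + residual downscaling sketch: (1) regress coarse precipitation
        on elevation, (2) spread the residuals to the fine grid by
        inverse-distance weighting, (3) recombine trend and residual."""
        X = np.column_stack([np.ones_like(coarse_elev), coarse_elev])
        beta, *_ = np.linalg.lstsq(X, coarse_precip, rcond=None)
        resid = coarse_precip - X @ beta
        d = np.linalg.norm(fine_xy[:, None, :] - coarse_xy[None, :, :], axis=2)
        w = 1.0 / np.maximum(d, 1e-6) ** 2
        w /= w.sum(axis=1, keepdims=True)
        return beta[0] + beta[1] * fine_elev + w @ resid

    # Hypothetical coarse cells whose precipitation is exactly linear in elevation.
    cxy = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
    celev = np.array([100.0, 200.0, 300.0, 400.0])
    cprecip = 2.0 + 0.01 * celev           # residuals will be ~zero
    fine = downscale(cprecip, celev, cxy, np.array([250.0]), np.array([[5.0, 5.0]]))
    ```

    Replacing the IDW step with area-to-point kriging, as the paper does, additionally guarantees that fine-scale estimates aggregate back to the coarse TRMM values.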

  15. The basic importance of applied behavior analysis

    OpenAIRE

    Epling, W. Frank; Pierce, W. David

    1986-01-01

    We argue that applied behavior analysis is relevant to basic research. Modification studies, and a broad range of investigations that focus on the precipitating and maintaining conditions of socially significant human behavior, have basic importance. Applied behavior analysis may aid basic researchers in the design of externally valid experiments and thereby enhance the theoretical significance of basic research for understanding human behavior. Applied research with humans, directed at cultu...

  16. Applied behavior analysis and statistical process control?

    OpenAIRE

    Hopkins, B. L.

    1995-01-01

    This paper examines Pfadt and Wheeler's (1995) suggestions that the methods of statistical process control (SPC) be incorporated into applied behavior analysis. The research strategies of SPC are examined and compared to those of applied behavior analysis. I argue that the statistical methods that are a part of SPC would likely reduce applied behavior analysts' intimate contacts with the problems with which they deal and would, therefore, likely yield poor treatment and research decisions. Ex...

  17. Geostatistical and stratigraphic analysis of deltaic reservoirs from the Reconcavo Basin, Brazil; Analise estratigrafica e geoestatistica de reservatorios deltaicos da Bacia do Reconcavo (BA)

    Energy Technology Data Exchange (ETDEWEB)

    Soares, Carlos Moreira

    1997-07-01

    This study presents the characterization of the external geometry of deltaic oil reservoirs, including the description of their areal distribution using geostatistical tools such as variography and kriging. A high-resolution stratigraphic study was developed over a 25 km{sup 2} area, using data from 276 closely-spaced wells of an oil-producing field from the Reconcavo Basin, northeastern Brazil. The studied succession records the progressive lacustrine transgression of a deltaic environment. Core data and stratigraphic cross sections suggest that the oil reservoirs are mostly amalgamated delta-front lobes and, subordinately, crevasse deposits. Some important geometrical elements were recognized by the detailed variographic analysis developed for each stratigraphic unit (zone). The average width for the groups of deltaic lobes of one zone was measured from the variographic feature informally named the hole effect. This procedure was not possible for the other zones due to the intense lateral amalgamation of sandstones, indicated by many variographic nested structures. Kriged net-sand maps for the main zones suggest a NNW-SSE orientation for the deltaic lobes, as well as their common amalgamation and compensation arrangements. High-resolution stratigraphic analyses should include a more regional characterization of the depositional system that comprises the studied succession. On the other hand, geostatistical studies should be developed only after recognition of the depositional processes acting in the study area and of the geological meaning of the variable to be treated, including its spatial variability scales as a function of sand body thickness, orientation and amalgamation. (author)

  18. Multivariate Geostatistical Analysis of Uncertainty for the Hydrodynamic Model of a Geological Trap for Carbon Dioxide Storage. Case study: Multilayered Geological Structure Vest Valcele, ROMANIA

    Science.gov (United States)

    Scradeanu, D.; Pagnejer, M.

    2012-04-01

    The purpose of this work is to evaluate the uncertainty of the hydrodynamic model for a multilayered geological structure, a potential trap for carbon dioxide storage. The hydrodynamic model is based on a conceptual model of the multilayered hydrostructure with three components: 1) spatial model; 2) parametric model and 3) energy model. The data needed for the three components of the conceptual model are obtained from 240 boreholes explored by geophysical logging and seismic investigation (for the first two components) and an experimental water injection test (for the last one). The hydrodynamic model is a finite difference numerical model based on a 3D stratigraphic model with nine stratigraphic units (Badenian and Oligocene) and a 3D multiparameter model (porosity, permeability, hydraulic conductivity, storage coefficient, leakage etc.). The uncertainty of the two 3D models was evaluated using multivariate geostatistical tools: a) the cross-semivariogram for structural analysis, especially the study of anisotropy, and b) cokriging to reduce estimation variances in the specific situation where a variable is cross-correlated with one or more undersampled variables. Important differences between univariate and bivariate anisotropy were identified. The minimised uncertainty of the parametric model (by cokriging) was transferred to the hydrodynamic model. The uncertainty distribution of the pressures generated by the water injection test was additionally filtered by the sensitivity of the numerical model. The resulting relative errors of the pressure distribution in the hydrodynamic model are 15-20%. The scientific research was performed in the frame of the European FP7 project "A multiple space and time scale approach for the quantification of deep saline formation for CO2 storage (MUSTANG)".

  19. Applied analysis mathematical methods in natural science

    CERN Document Server

    Senba, Takasi

    2004-01-01

    This book provides a general introduction to applied analysis; vector analysis with physical motivation, calculus of variation, Fourier analysis, eigenfunction expansion, distribution, and so forth, including a catalogue of mathematical theories, such as basic analysis, topological spaces, complex function theory, real analysis, and abstract analysis. This book also gives fundamental ideas of applied mathematics to discuss recent developments in nonlinear science, such as mathematical modeling of reinforced random motion of particles, semi-conductor device equation in applied physics, and chemotaxis in

  20. Applied Behavior Analysis and Statistical Process Control?

    Science.gov (United States)

    Hopkins, B. L.

    1995-01-01

    Incorporating statistical process control (SPC) methods into applied behavior analysis is discussed. It is claimed that SPC methods would likely reduce applied behavior analysts' intimate contacts with problems and would likely yield poor treatment and research decisions. Cases and data presented by Pfadt and Wheeler (1995) are cited as examples.…

  1. A geostatistical analysis of IBTS data for age 2 North Sea haddock ( Melanogrammus aeglefinus ) considering daylight effects

    DEFF Research Database (Denmark)

    Wieland, Kai; Rivoirard, J.

    2001-01-01

    included in the estimation without any correction for possible daylight effects. In the present study, ordinary kriging was used to correct for sampling irregularities, and external drift kriging with a day/night indicator or a cosine function of time of day was applied to account additionally for diurnal differences in the catch rates. Only minor differences between the standard indices and the abundance estimates obtained by ordinary kriging were found. In contrast, the external drift kriging, particularly with time of day, yielded higher estimates of mean abundance for all years, with the differences to ordinary kriging being most pronounced for years characterized by a high portion of night hauls and a low mean catch rate at night. This demonstrates that external drift kriging with a day/night indicator, but preferably with time of day, is capable of compensating successfully for daylight effects and...

  2. A Comparison of the Effectiveness of Using the Meshless Method and the Finite Difference Method in Geostatistical Analysis of Transport Modeling

    OpenAIRE

    Vrankar, Leopold; Turk, Goran; Runovc, Franc

    2005-01-01

    Disposal of radioactive waste in geological formations is a great concern with regard to nuclear safety. The general reliability and accuracy of transport modeling depends predominantly on input data such as hydraulic conductivity, water velocity, radioactive inventory, and hydrodynamic dispersion. The most important input data are obtained from field measurements, but they are not always available. One way to study the spatial variability of hydraulic conductivity is geostatistics. The nu...

  3. GEOSTATISTICAL ANALYSIS OF SURFACE TEMPERATURE AND IN-SITU SOIL MOISTURE USING LST TIME-SERIES FROM MODIS

    OpenAIRE

    Sohrabinia, M.; W. Rack; P. Zawar-Reza

    2012-01-01

    The objective of this analysis is to provide a quantitative estimate of the fluctuations of land surface temperature (LST) with varying near surface soil moisture (SM) on different land-cover (LC) types. The study area is located in the Canterbury Plains in the South Island of New Zealand. Time series of LST from the MODerate resolution Imaging Spectro-radiometer (MODIS) have been analysed statistically to study the relationship between the surface skin temperature and near-surface S...

  4. Geostatistical Analysis of Surface Temperature and In-Situ Soil Moisture Using LST Time-Series from Modis

    Science.gov (United States)

    Sohrabinia, M.; Rack, W.; Zawar-Reza, P.

    2012-07-01

    The objective of this analysis is to provide a quantitative estimate of the fluctuations of land surface temperature (LST) with varying near surface soil moisture (SM) on different land-cover (LC) types. The study area is located in the Canterbury Plains in the South Island of New Zealand. Time series of LST from the MODerate resolution Imaging Spectro-radiometer (MODIS) have been analysed statistically to study the relationship between the surface skin temperature and near-surface SM. In-situ measurements of the skin temperature and surface SM with a quasi-experimental design over multiple LC types are used for validation. Correlations between MODIS LST and in-situ SM, as well as in-situ surface temperature and SM, are calculated. The in-situ measurements and MODIS data are collected from various LC types. Pearson's r correlation coefficient and linear regression are used to fit the MODIS LST and surface skin temperature with near-surface SM. There was no significant correlation between time series of MODIS LST and near-surface SM in the initial analysis; however, careful analysis of the data showed significant correlation between the two parameters. Night-time series of the in-situ surface temperature and SM from a 12 hour period over Irrigated-Crop, Mixed-Grass, Forest, Barren and Open-Grass showed inverse correlations of -0.47, -0.68, -0.74, -0.88 and -0.93, respectively. These results indicated that the relationship between near-surface SM and LST over short terms (12 to 24 hours) is strong; however, remotely sensed LST with higher temporal resolution is required to establish this relationship on such time scales. This method can be used to study near-surface SM using more frequent LST observations from a geostationary satellite over the study area.
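
    The Pearson's r statistic used throughout this record is a one-liner. A sketch on a hypothetical 12-hour night-time series in which skin temperature falls while near-surface soil moisture rises, mimicking the inverse correlations reported:

    ```python
    import numpy as np

    def pearson_r(x, y):
        """Pearson's correlation coefficient between two equal-length series."""
        x = np.asarray(x, dtype=float)
        y = np.asarray(y, dtype=float)
        xm, ym = x - x.mean(), y - y.mean()
        return float((xm @ ym) / np.sqrt((xm @ xm) * (ym @ ym)))

    # Hypothetical 12-hour series: temperature decreases, moisture increases.
    hours = np.arange(12.0)
    temp = 15.0 - 0.8 * hours        # degrees C
    moisture = 0.20 + 0.01 * hours   # volumetric fraction
    r = pearson_r(temp, moisture)
    ```

    Because both synthetic series are exactly linear in time, r is exactly -1 here; field series such as those in the study yield the intermediate values reported (-0.47 to -0.93).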

  5. Applied regression analysis a research tool

    CERN Document Server

    Pantula, Sastry; Dickey, David

    1998-01-01

    Least squares estimation, when used appropriately, is a powerful research tool. A deeper understanding of the regression concepts is essential for achieving optimal benefits from a least squares analysis. This book builds on the fundamentals of statistical methods and provides appropriate concepts that will allow a scientist to use least squares as an effective research tool. Applied Regression Analysis is aimed at the scientist who wishes to gain a working knowledge of regression analysis. The basic purpose of this book is to develop an understanding of least squares and related statistical methods without becoming excessively mathematical. It is the outgrowth of more than 30 years of consulting experience with scientists and many years of teaching an applied regression course to graduate students. Applied Regression Analysis serves as an excellent text for a service course on regression for non-statisticians and as a reference for researchers. It also provides a bridge between a two-semester introduction to...

  6. Applying economic analysis to technical assistance projects

    OpenAIRE

    McMahon, Gary

    1997-01-01

    The author recommends using more quantitative economic analysis in appraising technical assistance loans and loan components. After giving a brief history of technical assistance and the problems commonly associated with it, he describes classifications of technical assistance, proposes a new typology to be used for project appraisal, suggests methods for screening projects, and discusses different levels of economic analysis. He shows how the typology and economic analysis could be applied t...

  7. Applied Systems Analysis: A Genetic Approach

    OpenAIRE

    Majone, G.

    1980-01-01

    The International Institute for Applied Systems Analysis is preparing a "Handbook of Systems Analysis," which will appear in three volumes: Volume 1, "Overview," is aimed at a widely varied audience of producers and users of systems analysis; Volume 2, "Methods," is aimed at systems analysts who need basic knowledge of methods in which they are not expert; the volume contains introductory overviews of such methods; Volume 3, "Cases," contains descriptions of actual systems analyses that illus...

  8. Geostatistics - bloodhound of uranium exploration

    International Nuclear Information System (INIS)

    Geostatistics makes possible the efficient use of the information contained in core samples obtained by diamond drilling. The probability that a core represents the true content of a deposit, and the likely content of an orebody between two core samples can both be estimated using geostatistical methods. A confidence interval can be given for the mean grade of a deposit. The use of a computer is essential in the calculation of the continuity function, the variogram, when as many as 800,000 core samples may be involved. The results may be used to determine where additional samples need to be taken, and to develop a picture of the probable grades throughout the deposit. The basic mathematical model is about 15 years old, but applications to different types of deposit require various adaptations. The Ecole Polytechnique is currently developing methods for uranium deposits. (LL)
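    The continuity function mentioned above, the variogram, can be estimated from core samples with a short sketch. The classical (Matheron) estimator averages squared differences of sample pairs separated by roughly the same distance; the drill-line coordinates and uranium grades below are invented for illustration, not deposit data.

```python
def empirical_variogram(coords, values, lags, tol):
    """Classical estimator: gamma(h) is half the mean squared difference
    of sample pairs whose separation is within tol of lag h."""
    gammas = []
    for h in lags:
        sq_diffs = []
        for i in range(len(coords)):
            for j in range(i + 1, len(coords)):
                d = abs(coords[i] - coords[j])
                if abs(d - h) <= tol:
                    sq_diffs.append((values[i] - values[j]) ** 2)
        gammas.append(0.5 * sum(sq_diffs) / len(sq_diffs) if sq_diffs else float("nan"))
    return gammas

# Hypothetical uranium grades along a drill line (metres, % U3O8)
coords = [0, 10, 20, 30, 40, 50, 60, 70]
grades = [0.11, 0.13, 0.10, 0.18, 0.21, 0.16, 0.25, 0.23]
print(empirical_variogram(coords, grades, [10, 20, 30], 0.5))
```

    In practice the estimated points are then fitted with a model (spherical, exponential, etc.) whose range and sill guide where additional samples are needed.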

  9. Geostatistical Analysis of Spatial Isotope (δ18O, δ2H and 3H) Variability of Groundwater Across Morocco

    International Nuclear Information System (INIS)

    Environmental isotopes are increasingly being used for a variety of applications in the fields of the Earth's water cycle and climate change. This paper reports the first national-level survey of δ18O, δ2H, 3H and 14C in groundwater across Morocco, including the analysis of the spatial distribution of stable (δ18O and δ2H) and radioactive (tritium and carbon-14) isotopes used to assess eleven groundwater basins distributed across Morocco. The interpolations were carried out using ESRI ArcGIS 9.2 with the Spatial Analyst extension. The methods used are ordinary kriging and inverse distance weighting (IDW). The maps showing the spatial variability of tritium and radiocarbon in the basins are used to visualize the presence of modern and old groundwater, while the stable isotope maps show that the age of groundwater, the type (shallow or deep groundwater), the distance of the basin from the sea (Atlantic and Mediterranean) and the altitude are the main factors influencing the isotopic composition of groundwater. These thematic maps will provide a valuable contribution to sustainable management of groundwater resources for drinking water supplies, agriculture and industry, which is a prime concern in countries dominated by arid and semiarid climates such as Morocco. (author)
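    One of the two interpolators named above, inverse distance weighting, is simple enough to sketch directly. The well coordinates and δ18O values below are hypothetical illustrations, not survey data.

```python
def idw(points, target, power=2.0):
    """Inverse distance weighted estimate at `target` from (x, y, value) data."""
    num = den = 0.0
    for x, y, v in points:
        d2 = (x - target[0]) ** 2 + (y - target[1]) ** 2
        if d2 == 0.0:
            return v  # exact hit on a sample location
        w = 1.0 / d2 ** (power / 2.0)
        num += w * v
        den += w
    return num / den

# Hypothetical delta-18O values (permil) at four wells on a 10 x 10 grid
wells = [(0, 0, -5.1), (10, 0, -5.8), (0, 10, -6.2), (10, 10, -6.9)]
print(round(idw(wells, (5, 5)), 2))  # -6.0: equidistant, so the plain average
```

    Unlike ordinary kriging, IDW fixes the weights by distance alone; kriging derives them from a fitted variogram, which is why the two methods are often compared on the same data set.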

  10. Geostatistical analysis of microrelief of an oxisol as a function of tillage and cumulative rainfall Análise geoestatística do microrrelevo de um Latossolo em função do preparo do solo e da precipitação acumulada

    Directory of Open Access Journals (Sweden)

    Eva Vidal Vázquez

    2009-04-01

    Surface roughness can be influenced by the type and intensity of soil tillage, among other factors. In tilled soils, microrelief may decay considerably as rainfall accumulates. Geostatistics provides tools that may be useful to study the dynamics of soil surface variability. The objective of this study was to show how geostatistics can be applied to analyze soil microrelief variability. Data were taken on an Oxisol under six tillage treatments, namely disk harrow, disk plow, chisel plow, disk harrow + disk level, disk plow + disk level and chisel plow + disk level. Measurements were made initially just after tillage and subsequently after cumulative natural rainfall events. Duplicate measurements were taken in each treatment on each sampling date, yielding a total of 48 experimental surfaces. A pin microrelief meter was used for the surface roughness measurements. The plot area was 1.35 × 1.35 m and the sample spacing was 25 mm, yielding a total of 3,025 data points per measurement. Before geostatistical analysis, the trend was removed from the experimental data by two methods for comparison. Models were fitted to the semivariograms of each surface and the model parameters were analyzed. The trend-removal method affected the geostatistical results. The geostatistical dependence ratio showed that spatial dependence improved for most of the surfaces as the amount of cumulative rainfall increased.

  11. Geostatistics for radiological characterization: overview and application cases

    International Nuclear Information System (INIS)

    The objective of radiological characterization is to find a suitable balance between gathering data (constrained by cost, deadlines, accessibility or radiation) and managing the issues (waste volumes, levels of activity or exposure). It is necessary to have enough information to have confidence in the results without multiplying useless data. Geostatistical processing of data considers all available pieces of information: historical data, non-destructive measurements and laboratory analyses of samples. The spatial structure modelling is then used to produce maps and to estimate the extent of radioactive contamination (surface and depth). Quantifications of local and global uncertainties are powerful decision-making tools for better management of remediation projects at contaminated sites, and for decontamination and dismantling projects at nuclear facilities. They can be used to identify hot spots, estimate contamination of surfaces and volumes, classify radioactive waste according to thresholds, estimate source terms, and so on. The spatial structure of radioactive contamination makes the optimization of sampling (number and position of data points) particularly important. Geostatistical methodology can help determine the initial mesh size and reduce estimation uncertainties. Several case studies are presented to illustrate why and how geostatistics can be applied to a range of radiological characterizations where the investigated units can represent very small areas (a few m2 or a few m3) or very large sites (at a country scale). The focus is then put on the experience gained over years in the use of geostatistics and sampling optimization. (author)

  12. Geostatistical enhancement of european hydrological predictions

    Science.gov (United States)

    Pugliese, Alessio; Castellarin, Attilio; Parajka, Juraj; Arheimer, Berit; Bagli, Stefano; Mazzoli, Paolo; Montanari, Alberto; Blöschl, Günter

    2016-04-01

    Geostatistical Enhancement of European Hydrological Prediction (GEEHP) is a research experiment developed within the EU-funded SWITCH-ON project, which proposes to conduct comparative experiments in a virtual laboratory in order to share water-related information and tackle changes in the hydrosphere for operational needs (http://www.water-switch-on.eu). The main objective of GEEHP deals with the prediction of streamflow indices and signatures in ungauged basins at different spatial scales. In particular, among several possible hydrological signatures we focus in our experiment on the prediction of flow-duration curves (FDCs) along the stream-network, which has attracted increasing scientific attention in the last decades due to the large number of practical and technical applications of the curves (e.g. hydropower potential estimation, riverine habitat suitability and ecological assessments, etc.). We apply a geostatistical procedure based on Top-kriging, which has recently been shown to be a particularly reliable and easy-to-use regionalization approach, employing two different types of streamflow data: pan-European E-HYPE simulations (http://hypeweb.smhi.se/europehype) and observed daily streamflow series collected in two pilot study regions, i.e. Tyrol (merging data from Austrian and Italian stream gauging networks) and Sweden. The merger of the two study regions results in a rather large area (~450000 km2) and might be considered as a proxy for a pan-European application of the approach. In a first phase, we implement a bidirectional validation, i.e. E-HYPE catchments are set as training sites to predict FDCs at the same sites where observed data are available, and vice-versa. Such a validation procedure reveals (1) the usability of the proposed approach for predicting the FDCs over the entire river network of interest using alternatively observed data and E-HYPE simulations and (2) the accuracy of E-HYPE-based predictions of FDCs in ungauged sites. In a

  13. Applied mathematics analysis of the multibody systems

    Science.gov (United States)

    Sahin, H.; Kar, A. K.; Tacgin, E.

    2012-08-01

    A methodology is developed for the analysis of multibody systems and applied to a vehicle as a case study. A previous study emphasized the derivation of the multibody dynamics equations of motion for a bogie [2]. In this work, we have developed a guideline for the analysis of the dynamic behavior of multibody systems, mainly for validation and verification of the realistic mathematical model and partly for the design of alternative optimum vehicle parameters.

  14. Positive Behavior Support and Applied Behavior Analysis

    Science.gov (United States)

    Johnston, J. M.; Foxx, R. M.; Jacobson, J. W.; Green, G.; Mulick, J. A.

    2006-01-01

    This article reviews the origins and characteristics of the positive behavior support (PBS) movement and examines those features in the context of the field of applied behavior analysis (ABA). We raise a number of concerns about PBS as an approach to delivery of behavioral services and its impact on how ABA is viewed by those in human services. We…

  15. Applied Behavior Analysis: Beyond Discrete Trial Teaching

    Science.gov (United States)

    Steege, Mark W.; Mace, F. Charles; Perry, Lora; Longenecker, Harold

    2007-01-01

    We discuss the problem of autism-specific special education programs representing themselves as Applied Behavior Analysis (ABA) programs when the only ABA intervention employed is Discrete Trial Teaching (DTT), and often for limited portions of the school day. Although DTT has many advantages to recommend its use, it is not well suited to teach…

  16. A Bayesian Markov geostatistical model for estimation of hydrogeological properties

    International Nuclear Information System (INIS)

    A geostatistical methodology based on Markov-chain analysis and Bayesian statistics was developed for probability estimations of hydrogeological and geological properties in the siting process of a nuclear waste repository. The probability estimates have practical use in decision-making on issues such as siting, investigation programs, and construction design. The methodology is nonparametric which makes it possible to handle information that does not exhibit standard statistical distributions, as is often the case for classified information. Data do not need to meet the requirements on additivity and normality as with the geostatistical methods based on regionalized variable theory, e.g., kriging. The methodology also has a formal way for incorporating professional judgments through the use of Bayesian statistics, which allows for updating of prior estimates to posterior probabilities each time new information becomes available. A Bayesian Markov Geostatistical Model (BayMar) software was developed for implementation of the methodology in two and three dimensions. This paper gives (1) a theoretical description of the Bayesian Markov Geostatistical Model; (2) a short description of the BayMar software; and (3) an example of application of the model for estimating the suitability for repository establishment with respect to the three parameters of lithology, hydraulic conductivity, and rock quality designation index (RQD) at 400--500 meters below ground surface in an area around the Aespoe Hard Rock Laboratory in southeastern Sweden
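    The Bayesian updating step described above, revising prior estimates to posterior probabilities each time new information becomes available, can be sketched for a single discrete property such as lithology class. The classes, prior and likelihood values below are invented for illustration and are not taken from BayMar.

```python
def bayes_update(prior, likelihood):
    """Posterior = prior x likelihood, renormalised over the classes."""
    unnorm = {c: prior[c] * likelihood[c] for c in prior}
    z = sum(unnorm.values())
    return {c: p / z for c, p in unnorm.items()}

# Hypothetical prior belief about the rock class in a depth interval
prior = {"granite": 0.5, "diorite": 0.3, "greenstone": 0.2}
# Hypothetical likelihood of newly observed borehole data under each class
likelihood = {"granite": 0.8, "diorite": 0.3, "greenstone": 0.1}

posterior = bayes_update(prior, likelihood)
print({c: round(p, 3) for c, p in posterior.items()})
```

    Because the update is nonparametric over discrete classes, it places no distributional requirements on the data, which is the property the methodology exploits for classified geological information.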

  17. Caldwell University's Department of Applied Behavior Analysis.

    Science.gov (United States)

    Reeve, Kenneth F; Reeve, Sharon A

    2016-05-01

    Since 2004, faculty members at Caldwell University have developed three successful graduate programs in Applied Behavior Analysis (i.e., PhD, MA, non-degree programs), increased program faculty from two to six members, developed and operated an on-campus autism center, and begun a stand-alone Applied Behavior Analysis Department. This paper outlines a number of strategies used to advance these initiatives, including those associated with an extensive public relations campaign. We also outline challenges that have limited our programs' growth. These strategies, along with a consideration of potential challenges, might prove useful in guiding academicians who are interested in starting their own programs in behavior analysis. PMID:27606194

  18. Functional Data Analysis Applied in Chemometrics

    DEFF Research Database (Denmark)

    Muller, Martha

    nutritional status and metabolic phenotype. We want to understand how metabolomic spectra can be analysed using functional data analysis to detect the in uence of dierent factors on specic metabolites. These factors can include, for example, gender, diet culture or dietary intervention. In Paper I we apply...... novelty of heart plots for spectral data. The important aspect of registration, also called warping or alignment, emerges from both the chemometric and statistical perspectives. In Paper III we apply functional registration in the context of biomechanics, specically to data from a juggling experiment. The...

  19. Evaluating the function of applied behavior analysis: a bibliometric analysis.

    OpenAIRE

    Critchfield, Thomas S

    2002-01-01

    Analysis of scholarly citations involving behavioral journals reveals that, consistent with its mission, applied behavior analysis research frequently references the basic behavioral literature but, as some have suspected, exerts narrow scholarly influence.

  20. Book review: Spatial statistics and geostatistics

    OpenAIRE

    Clift, Hamish

    2013-01-01

    "Spatial Statistics and Geostatistics." Yongwan Chun and Daniel A. Griffith. SAGE. January 2013. --- This book aims to explain and demonstrate techniques in spatial sampling, local statistics, and advanced topics including Bayesian methods, Monte Carlo simulation, error and uncertainty. Spatial Statistics and Geostatistics is highly recommended to researchers in geography, environmental science, health and epidemiology, population and demography, and planning, writes Hamish Clift.

  1. Geostatistics approach to radon potential mapping

    International Nuclear Information System (INIS)

    Soil-gas radon potential assessment is an important component of most national radon programs. With regard to recent intensive advances in all fields of science and increasing demands on data accuracy, there is an urgent need to support current approaches with alternative methods. An important finding of this study is the benefit of variographic analysis based on random sampling for prediction of the overall radon potential, as a counterpart to the widely used regular sampling. In addition, temporal variability might be a crucial factor affecting the consequent accuracy of radon risk assessment. We hope that the introduced combination of geostatistical tools, results of long-term radon activity monitoring and, in general, the treatment of uncertainties affecting radon potential/risk assessment can bring a synergic effect providing more exhaustive data treatment. (authors)

  2. Defining applied behavior analysis: An historical analogy

    OpenAIRE

    Deitz, Samuel M.

    1982-01-01

    This article examines two criteria for a definition of applied behavior analysis. The criteria are derived from a 19th century attempt to establish medicine as a scientific field. The first criterion, experimental determinism, specifies the methodological boundaries of an experimental science. The second criterion, philosophic doubt, clarifies the tentative nature of facts and theories derived from those facts. Practices which will advance the science of behavior are commented upon within eac...

  3. Positive Behavior Support and Applied Behavior Analysis

    OpenAIRE

    Johnston, J M; Foxx, Richard M; Jacobson, John W.; Green, Gina; Mulick, James A.

    2006-01-01

    This article reviews the origins and characteristics of the positive behavior support (PBS) movement and examines those features in the context of the field of applied behavior analysis (ABA). We raise a number of concerns about PBS as an approach to delivery of behavioral services and its impact on how ABA is viewed by those in human services. We also consider the features of PBS that have facilitated its broad dissemination and how ABA might benefit from emulating certain practices of the P...

  4. Current measurement in applied behavior analysis

    OpenAIRE

    Springer, Bonnie; Brown, Tom; Duncan, Philip K.

    1981-01-01

    The analysis of behavior began with a form of data, rate of responding, which allowed for efficient study and for the description of the basic principles of behavior. Especially important were the facts that rate of responding was a direct reflection of fundamental properties of behavior, and that rate of responding was measured continuously within an experimental session. As behavior analysts moved from purely experimental to applied settings, discontinuous, time-based methods of measurement...

  5. The role of geostatistics in medical geology

    Science.gov (United States)

    Goovaerts, Pierre

    2014-05-01

    Since its development in the mining industry, geostatistics has emerged as the primary tool for spatial data analysis in various fields, ranging from earth and atmospheric sciences, to agriculture, soil science, remote sensing, and more recently environmental exposure assessment. In the last few years, these tools have been tailored to the field of medical geography or spatial epidemiology, which is concerned with the study of spatial patterns of disease incidence and mortality and the identification of potential 'causes' of disease, such as environmental exposure, diet and unhealthy behaviors, economic or socio-demographic factors. On the other hand, medical geology is an emerging interdisciplinary scientific field studying the relationship between natural geological factors and their effects on human and animal health. This paper provides an introduction to the field of medical geology with an overview of geostatistical methods available for the analysis of geological and health data. Key concepts are illustrated using the mapping of groundwater arsenic concentrations across eleven Michigan counties and the exploration of its relationship to the incidence of prostate cancer at the township level. Arsenic in drinking-water is a major problem and has received much attention because of the large human population exposed and the extremely high concentrations (e.g. 600 to 700 μg/L) recorded in many instances. Few studies have however assessed the risks associated with exposure to low levels of arsenic (say different private wells that were sampled between 1993 and 2002, 2) prostate cancer incidence recorded at the township level over the period 1985-2002, and 3) block-group population density that served as proxy for urbanization and use of regulated public water supply versus use of potentially contaminated private wells in rural areas.

  6. Activation analysis as applied to environmental substances

    International Nuclear Information System (INIS)

    The historical background of activation analysis as applied to environmental problems is first briefly described. Then, the present state of its utilization for environmental samples, mainly atmospheric airborne particles and human hair, is reviewed. The problem with irradiation reactors is also mentioned. In the activation analysis of environmental substances, instrumental neutron activation analysis (INAA) with reactor thermal neutrons is the principal method; in addition, there are methods using bremsstrahlung, etc. INAA is most effectively used for atmospheric airborne particles and trace elements in human hair. In Japan, INAA is currently employed by the Environmental Agency in its national air pollution surveillance network for metallic pollutants. The problem with reactors is their limited capacity for thermal neutron irradiation. (Mori, K.)

  7. Applied research in uncertainty modeling and analysis

    CERN Document Server

    Ayyub, Bilal

    2005-01-01

    Uncertainty has been a concern to engineers, managers, and scientists for many years. For a long time uncertainty has been considered synonymous with random, stochastic, statistic, or probabilistic. Since the early sixties views on uncertainty have become more heterogeneous. In the past forty years numerous tools that model uncertainty, above and beyond statistics, have been proposed by several engineers and scientists. The tool/method to model uncertainty in a specific context should really be chosen by considering the features of the phenomenon under consideration, not independent of what is known about the system and what causes uncertainty. In this fascinating overview of the field, the authors provide broad coverage of uncertainty analysis/modeling and its application. Applied Research in Uncertainty Modeling and Analysis presents the perspectives of various researchers and practitioners on uncertainty analysis and modeling outside their own fields and domain expertise. Rather than focusing explicitly on...

  8. Tropospheric Delay Raytracing Applied in VLBI Analysis

    Science.gov (United States)

    MacMillan, D. S.; Eriksson, D.; Gipson, J. M.

    2013-12-01

    Tropospheric delay modeling error continues to be one of the largest sources of error in VLBI analysis. For standard operational solutions, we use the VMF1 elevation-dependent mapping functions derived from ECMWF data. These mapping functions assume that tropospheric delay at a site is azimuthally symmetric. As this assumption does not reflect reality, we have determined the raytrace delay along the signal path through the troposphere for each VLBI quasar observation. We determined the troposphere refractivity fields from the pressure, temperature, specific humidity and geopotential height fields of the NASA GSFC GEOS-5 numerical weather model. We discuss results from analysis of the CONT11 R&D and the weekly operational R1+R4 experiment sessions. When applied in VLBI analysis, baseline length repeatabilities were better for 66-72% of baselines with raytraced delays than with VMF1 mapping functions. Vertical repeatabilities were better for 65% of sites.

  9. Strategic decision analysis applied to borehole seismology

    International Nuclear Information System (INIS)

    Strategic Decision Analysis (SDA) is the evolving body of knowledge on how to achieve high quality in the decisions that shape an organization's future. SDA comprises philosophy, process concepts, methodology, and tools for making good decisions. It specifically incorporates many concepts and tools from economic evaluation and risk analysis. Chevron Petroleum Technology Company (CPTC) has applied SDA to evaluate and prioritize a number of its most important and most uncertain R and D projects, including borehole seismology. Before SDA, there were significant issues and concerns about the value to CPTC of continuing to work on borehole seismology. The SDA process created a cross-functional team of experts to structure and evaluate this project. A credible economic model was developed, discrete risks and continuous uncertainties were assessed, and an extensive sensitivity analysis was performed. The results, even applied to a very restricted drilling program for a few years, were good enough to demonstrate the value of continuing the project. This paper explains the SDA philosophy, concepts, and process, and demonstrates the methodology and tools using the borehole seismology project example. SDA is useful in the upstream industry not just in R and D/technology decisions, but also in major exploration and production decisions. Since a major challenge for upstream companies today is to create and realize value, the SDA approach should have very broad applicability

  10. Wavelet analysis applied to the IRAS cirrus

    Science.gov (United States)

    Langer, William D.; Wilson, Robert W.; Anderson, Charles H.

    1994-01-01

    The structure of infrared cirrus clouds is analyzed with Laplacian pyramid transforms, a form of non-orthogonal wavelets. Pyramid and wavelet transforms provide a means to decompose images into their spatial frequency components such that all spatial scales are treated in an equivalent manner. The multiscale transform analysis is applied to IRAS 100 micrometer maps of cirrus emission in the north Galactic pole region to extract features on different scales. In the maps we identify filaments, fragments and clumps by separating all connected regions. These structures are analyzed with respect to their Hausdorff dimension for evidence of the scaling relationships in the cirrus clouds.
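    A Laplacian pyramid of the kind used above can be sketched in one dimension: each level stores the band-pass detail lost between a signal and its blurred, downsampled copy, so all spatial scales are separated and the original is exactly recoverable. The 3-tap kernel and the signal below are illustrative assumptions, not the filters used in the paper.

```python
def downsample(sig):
    """Blur with a simple [1/4, 1/2, 1/4] kernel, then keep every other sample."""
    padded = [sig[0]] + sig + [sig[-1]]
    blurred = [0.25 * padded[i - 1] + 0.5 * padded[i] + 0.25 * padded[i + 1]
               for i in range(1, len(padded) - 1)]
    return blurred[::2]

def upsample(sig, n):
    """Linear interpolation back to length n (n assumed twice len(sig))."""
    out = []
    for i in range(n):
        pos = i / 2
        lo = min(int(pos), len(sig) - 1)
        hi = min(lo + 1, len(sig) - 1)
        frac = pos - lo
        out.append((1 - frac) * sig[lo] + frac * sig[hi])
    return out

def laplacian_pyramid(sig, levels):
    """Each level holds the band-pass detail; the last entry is the residual."""
    pyramid, cur = [], sig
    for _ in range(levels):
        small = downsample(cur)
        detail = [a - b for a, b in zip(cur, upsample(small, len(cur)))]
        pyramid.append(detail)
        cur = small
    pyramid.append(cur)  # low-frequency residual
    return pyramid

sig = [1.0, 2.0, 4.0, 8.0, 8.0, 4.0, 2.0, 1.0]
pyr = laplacian_pyramid(sig, 2)
print([len(level) for level in pyr])  # [8, 4, 2]
```

    Summing each detail level back onto the upsampled residual reconstructs the signal exactly, which is the property that lets structures such as filaments and clumps be isolated per scale and then analyzed independently.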

  11. A Cyclostationarity Analysis Applied to Scaled Images

    Czech Academy of Sciences Publication Activity Database

    Saic, Stanislav; Mahdian, Babak

    Berlin : Springer, 2009 - (Leung, C.; Lee, M.; Chan, J.), s. 683-690 ISBN 978-3-642-10682-8. - (Lecture Notes in Computer Science. 5864). [ICONIP 2009. International Conference on Neural Information Processing /16./. Bangkok (TH), 01.12.2009-05.12.2009] R&D Projects: GA ČR GA102/08/0470 Institutional research plan: CEZ:AV0Z10750506 Keywords : Interpolation * scaling * cyclostationary * authentication * image forensics Subject RIV: IN - Informatics, Computer Science http://library.utia.cas.cz/separaty/2010/ZOI/babak-a cyclostationarity analysis applied to scaled images.pdf

  12. What happened to analysis in applied behavior analysis?

    OpenAIRE

    Pierce, W. David; Epling, W. Frank

    1980-01-01

    This paper addresses the current help-oriented focus of researchers in applied behavior analysis. Evidence from a recent volume of JABA suggests that analytic behavior is at low levels in applied analysis while cure-help behavior is at high strength. This low proportion of scientific behavior is apparently related to cure-help contingencies set by institutions and agencies of help and the editorial policies of JABA itself. These contingencies have favored the flight to real people and a conce...

  13. 2nd European Conference on Geostatistics for Environmental Applications

    CERN Document Server

    Soares, Amílcar; Froidevaux, Roland

    1999-01-01

    The Second European Conference on Geostatistics for Environmental Applications took place in Valencia, November 18-20, 1998. Two years have passed since the first meeting in Lisbon and the geostatistical community has kept active in the environmental field. In these days of congress inflation, we feel that continuity can only be achieved by ensuring quality in the papers. For this reason, all papers in the book have been reviewed by at least two referees, and care has been taken to ensure that the reviewer comments have been incorporated in the final version of the manuscript. We are thankful to the members of the scientific committee for their timely review of the scripts. All in all, there are three keynote papers from experts in soil science, climatology and ecology and 43 contributed papers, providing a good indication of the status of geostatistics as applied in the environmental field all over the world. We feel now confident that the geoENV conference series, seeded around a coffee table almost six...

  14. Evaluating factorial kriging for seismic attributes filtering: a geostatistical filter applied to reservoir characterization; Avaliacao da krigagem fatorial na filtragem de atributos sismicos: um filtro geoestatistico aplicado a caracterizacao de reservatorios

    Energy Technology Data Exchange (ETDEWEB)

    Mundim, Evaldo Cesario

    1999-02-01

    In this dissertation, Factorial Kriging analysis for the filtering of seismic attributes applied to reservoir characterization is considered. Factorial Kriging works in the spatial domain in a similar way to Spectral Analysis in the frequency domain. The incorporation of filtered attributes via External Drift Kriging and Collocated Cokriging into reservoir characterization estimates is discussed. Its relevance for the reservoir porous volume calculation is also evaluated, based on a comparative analysis of the volume risk curves derived from stochastic conditional simulations with a collocated variable and stochastic conditional simulations with external drift. The results prove Factorial Kriging to be an efficient technique for the filtering of seismic attribute images, in which geologic features are enhanced. The attribute filtering improves the correlation between the attributes and the well data, and the estimates of the reservoir properties. The differences between the estimates obtained by External Drift Kriging and Collocated Cokriging are also reduced. (author)

  15. Artificial intelligence technologies applied to terrain analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wright, J.C. (Army Training and Doctrine Command, Fort Monroe, VA (USA)); Powell, D.R. (Los Alamos National Lab., NM (USA))

    1990-01-01

    The US Army Training and Doctrine Command is currently developing, in cooperation with Los Alamos National Laboratory, a Corps level combat simulation to support military analytical studies. This model emphasizes high resolution modeling of the command and control processes, with particular attention to architectural considerations that enable extension of the model. A planned future extension is the inclusion of a computer-based planning capability for command echelons that can be dynamically invoked during the execution of the model. Command and control is the process through which the activities of military forces are directed, coordinated, and controlled to achieve the stated mission. To perform command and control the commander must understand the mission, perform terrain analysis, and understand his own situation and capabilities as well as the enemy situation and his probable actions. To support computer based planning, data structures must be available to support the computer's ability to "understand" the mission, terrain, own capabilities, and enemy situation. The availability of digitized terrain makes it feasible to apply artificial intelligence technologies to emulate the terrain analysis process, producing data structures for use in planning. The work derived thus far to support the understanding of terrain is the topic of this paper. 13 refs., 5 figs., 6 tabs.

  16. Epithermal neutron activation analysis in applied microbiology

    International Nuclear Information System (INIS)

    Some results from applying epithermal neutron activation analysis at FLNP JINR, Dubna, Russia, in medical, environmental and industrial biotechnology are reviewed. In the biomedical experiments biomass from the blue-green alga Spirulina platensis (S. platensis) has been used as a matrix for the development of pharmaceutical substances containing such essential trace elements as selenium, chromium and iodine. The feasibility of target-oriented introduction of these elements into S. platensis biocomplexes, retaining its protein composition and natural beneficial properties, was shown. The effect of mercury absorption on the growth dynamics of S. platensis and other bacterial strains was observed. Detoxification of Cr and Hg by Arthrobacter globiformis 151B was demonstrated. Microbial synthesis of technologically important silver nanoparticles by the novel actinomycete strain Streptomyces glaucus 71 MD and the blue-green alga S. platensis was characterized by a combined use of transmission electron microscopy, scanning electron microscopy and energy-dispersive X-ray analysis. It was established that the tested actinomycete S. glaucus 71 MD produces silver nanoparticles extracellularly when acted upon by the silver nitrate solution, which offers a great advantage over an intracellular process of synthesis from the point of view of applications. The synthesis of silver nanoparticles by S. platensis proceeded differently under short-term and long-term silver action. (author)

  17. Improved Accuracy of Chlorophyll-a Concentration Estimates from MODIS Imagery Using a Two-Band Ratio Algorithm and Geostatistics: As Applied to the Monitoring of Eutrophication Processes over Tien Yen Bay (Northern Vietnam)

    Directory of Open Access Journals (Sweden)

    Nguyen Thi Thu Ha

    2013-12-01

    Full Text Available Sea eutrophication is a natural process of water enrichment caused by increased nutrient loading that severely affects coastal ecosystems by decreasing water quality. The degree of eutrophication can be assessed by chlorophyll-a concentration. This study aims to develop a remote sensing method suitable for estimating chlorophyll-a concentrations in tropical coastal waters with abundant phytoplankton using Moderate Resolution Imaging Spectroradiometer (MODIS/Terra) imagery and to improve the spatial resolution of MODIS/Terra-based estimation from 1 km to 100 m by geostatistics. A model based on the ratio of green and blue band reflectance (rGBr) is proposed considering the bio-optical property of chlorophyll-a. Tien Yen Bay in northern Vietnam, a typical phytoplankton-rich coastal area, was selected as a case study site. The superiority of rGBr over two existing representative models, based on the blue-green band ratio and the red-near infrared band ratio, was demonstrated by a high correlation of the estimated chlorophyll-a concentrations at 40 sites with values measured in situ. Ordinary kriging was then shown to be highly capable of predicting the concentration for regions of the image covered by clouds and, thus, without sea surface data. Resultant space-time maps of concentrations over a year clarified that Tien Yen Bay is characterized by natural eutrophic waters, because the average of chlorophyll-a concentrations exceeded 10 mg/m3 in the summer. The temporal changes of chlorophyll-a concentrations were consistent with average monthly air temperatures and precipitation. Consequently, a combination of rGBr and ordinary kriging can effectively monitor water quality in tropical shallow waters.
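The two-band ratio approach described above can be sketched as a power-law regression between the green/blue reflectance ratio and measured chlorophyll-a. The sketch below is a hypothetical illustration, not the authors' calibration: the model form, coefficients, synthetic reflectances and function names are all assumptions.

```python
import numpy as np

def fit_band_ratio_model(r_green, r_blue, chla_insitu):
    """Fit a power-law model chla = a * (R_green / R_blue)**b by least
    squares in log-log space, where the power law becomes a line."""
    ratio = np.asarray(r_green) / np.asarray(r_blue)
    b, log_a = np.polyfit(np.log(ratio), np.log(chla_insitu), 1)
    return float(np.exp(log_a)), float(b)

def predict_chla(a, b, r_green, r_blue):
    """Apply the fitted power-law model to new reflectance pairs."""
    return a * (np.asarray(r_green) / np.asarray(r_blue)) ** b

# Synthetic calibration set generated from an assumed model chla = 5 * ratio**2
rng = np.random.default_rng(0)
r_blue = rng.uniform(0.01, 0.05, 40)
r_green = r_blue * rng.uniform(1.0, 3.0, 40)
chla = 5.0 * (r_green / r_blue) ** 2.0
a, b = fit_band_ratio_model(r_green, r_blue, chla)
```

Fitting in log-log space turns the power law into a straight line, so a single `np.polyfit` call recovers both coefficients.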

  18. Social network analysis applied to team sports analysis

    CERN Document Server

    Clemente, Filipe Manuel; Mendes, Rui Sousa

    2016-01-01

    Explaining how graph theory and social network analysis can be applied to team sports analysis, this book presents useful approaches, models and methods that can be used to characterise the overall properties of team networks and identify the prominence of each team player. Exploring the different possible network metrics that can be utilised in sports analysis, their possible applications and variances from situation to situation, the respective chapters present an array of illustrative case studies. Identifying the general concepts of social network analysis and network centrality metrics, readers are shown how to generate a methodological protocol for data collection. As such, the book provides a valuable resource for students of the sport sciences, sports engineering, applied computation and the social sciences.

  19. Delineation of Management Zones in Precision Agriculture by Integration of Proximal Sensing with Multivariate Geostatistics. Examples of Sensor Data Fusion

    Directory of Open Access Journals (Sweden)

    Annamaria Castrignanò

    2015-07-01

    Full Text Available Fundamental to the philosophy of Precision Agriculture (PA) is the concept of matching inputs to needs. Recent research in PA has focused on the use of Management Zones (MZ), field areas characterised by homogeneous attributes in landscape and soil conditions. Proximal sensing (such as Electromagnetic Induction (EMI), Ground Penetrating Radar (GPR) and X-ray fluorescence) can complement direct sampling, and a multi-sensor platform can enable us to map soil features unambiguously. Several methods of multi-sensor data analysis have been developed to determine the location of subfield areas. Modern geostatistical techniques, treating variables as continua in a joint attribute and geographic space, offer the potential to analyse such data effectively. The objective of the paper is to show the potential of multivariate geostatistics to create MZ in the perspective of PA by integrating field data from different types of sensors, describing two case studies. In particular, in the first case study, cokriging and factorial cokriging were employed to produce thematic maps of soil trace elements and to delineate homogeneous zones, respectively. In the second case, a multivariate geostatistical data-fusion technique (multi collocated cokriging) was applied to different geophysical sensor data (GPR and EMI) for stationary estimation of soil water content and for delineating within-field zones with different wetting degrees. The results have shown that linking sensors of different types improves the overall assessment of soil, and that sensor data fusion could be effectively applied to delineate MZs in Precision Agriculture. However, techniques of data integration are urgently required as a result of the proliferation of data from different sources.

  20. Application of Geostatistics to the resolution of structural problems in homogeneous rocky massifs

    International Nuclear Information System (INIS)

    The nature and possibilities of application of intrinsic functions to structural research and to the delimitation of the areas of influence in an ore deposit are briefly described. The main models to which the different distributions may be assimilated are shown: 'logarithmic' and 'linear' among those with no sill value, and, on the other hand, 'spherical', 'exponential' and 'gaussian' among those having a sill level, which allows the establishment of a range value liable to separate the field of independent samples from that of non-independent ones. Thereafter, as an original contribution to applied geostatistics, the author postulates 1) the application of the 'fracturing rank' as a regionalized variable, after verifying its validity through strict probabilistic methodologies, and 2) a methodological extension of the conventional criterion of 'rock quality designation' to the analysis of the quality and degree of structural discontinuity in the rock surface. Finally, some examples are given of these applications. (M.E.L.)
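The 'rock quality designation' (RQD) criterion that the author extends is conventionally computed, following Deere, as the percentage of a drill-core run made up of intact pieces at least 10 cm long. A minimal sketch with hypothetical core data; the descriptive class boundaries are the commonly quoted ones, not taken from this paper:

```python
def rock_quality_designation(piece_lengths_cm, run_length_cm, threshold_cm=10.0):
    """Deere's RQD: percentage of a core run made up of intact pieces at
    least `threshold_cm` long."""
    sound = sum(p for p in piece_lengths_cm if p >= threshold_cm)
    return 100.0 * sound / run_length_cm

def rqd_class(rqd):
    """Conventional descriptive classes for RQD percentages."""
    if rqd < 25.0:
        return "very poor"
    if rqd < 50.0:
        return "poor"
    if rqd < 75.0:
        return "fair"
    if rqd < 90.0:
        return "good"
    return "excellent"

# Hypothetical 100 cm core run with six recovered pieces (lengths in cm)
pieces = [25.0, 12.0, 8.0, 30.0, 5.0, 20.0]
rqd = rock_quality_designation(pieces, 100.0)
```

Here the pieces of at least 10 cm sum to 87 cm, so the run scores an RQD of 87%, in the "good" class.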

  1. Colilert® applied to food analysis

    Directory of Open Access Journals (Sweden)

    Maria José Rodrigues

    2014-06-01

    Full Text Available Colilert® (IDEXX) was originally developed for the simultaneous enumeration of coliforms and E. coli in water samples and has been used for the quality control routine of drinking, swimming pool, fresh, coastal and waste waters (Grossi et al., 2013). The Colilert® culture medium contains the indicator nutrient 4-Methylumbelliferyl-β-D-Glucuronide (MUG). MUG acts as a substrate for the E. coli enzyme β-glucuronidase, from which a fluorescent compound is produced. A positive MUG result produces fluorescence when viewed under an ultraviolet lamp. If the test fluorescence is equal to or greater than that of the control, the presence of E. coli has been confirmed (Lopez-Roldan et al., 2013). The present work aimed to apply Colilert® to the enumeration of E. coli in different foods, through the comparison of results against the reference method (ISO 16649-2, 2001) for E. coli food analysis. The study was divided in two stages. During the first stage, ten different types of foods were analyzed with Colilert®; these included pastry, raw meat, ready-to-eat meals, yogurt, raw seabream and salmon, and cooked shrimp. From these, the following were approved: pastry with custard; raw minced pork; "caldo-verde" soup; raw vegetable salad (lettuce and carrots); and solid yogurt. The approved foods presented a better insertion in the tray, the colour of the wells was lighter and the UV reading was easier. In the second stage the foods were artificially contaminated with 2 log/g of E. coli (ATCC 25922) and analyzed. Colilert® proved to be an accurate method and the counts were similar to the ones obtained with the reference method. In the present study, the Colilert® method did not reveal either false-positive or false-negative results; however, sometimes the results were difficult to read due to the presence of green fluorescence in some wells. Generally, Colilert® was an easy and rapid method, but less objective and more expensive than the reference method.

  2. Geostatistical Study of Precipitation on the Island of Crete

    Science.gov (United States)

    Agou, Vasiliki D.; Varouchakis, Emmanouil A.; Hristopulos, Dionissios T.

    2015-04-01

    Understanding and predicting the spatiotemporal patterns of precipitation in the Mediterranean islands is an important topic of research, which is emphasized by alarming long-term predictions for increased drought conditions [4]. The analysis of records from drought-prone areas around the world has demonstrated that precipitation data are non-Gaussian. Typically, such data are fitted to the gamma distribution function and then transformed into a normalized index, the so-called Standardized Precipitation Index (SPI) [5]. The SPI can be defined for different time scales and has been applied to data from various regions [2]. Precipitation maps can be constructed using the stochastic method of Ordinary Kriging [1]. Such mathematical tools help to better understand the space-time variability and to plan water resources management. We present preliminary results of an ongoing investigation of the space-time precipitation distribution on the island of Crete (Greece). The study spans the time period from 1948 to 2012 and extends over an area of 8336 km2. The data comprise monthly precipitation measured at 56 stations. Analysis of the data showed that the most severe drought occurred in 1950 followed by 1989, whereas the wettest year was 2002 followed by 1977. A spatial trend was observed, with the spatially averaged annual precipitation in the West measured at about 450 mm higher than in the East. Analysis of the data also revealed strong correlations between the precipitation in the western and eastern parts of the island. In addition to longitude, elevation (masl) was determined to be an important factor that exhibits strong linear correlation with precipitation. The precipitation data exhibit wet and dry periods with strong variability even during the wet period. Thus, fitting the data to specific probability distribution models has proved challenging. Different time scales, e.g. monthly, biannual, and annual have been investigated. Herein we focus on annual
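The SPI construction summarized above (fit a gamma distribution to the precipitation totals, then push the gamma CDF through the standard normal quantile function) can be sketched as follows. This is a self-contained illustration, not the authors' code: it uses Thom's approximation for the gamma shape parameter, a series expansion for the regularized incomplete gamma function, and it ignores the zero-precipitation correction that operational SPI applies for dry periods.

```python
import math
from statistics import NormalDist

def _reg_lower_gamma(a, x):
    """Regularized lower incomplete gamma function P(a, x), series form."""
    if x <= 0.0:
        return 0.0
    term = 1.0 / a
    total = term
    for n in range(1, 500):
        term *= x / (a + n)
        total += term
        if term < total * 1e-12:
            break
    return total * math.exp(-x + a * math.log(x) - math.lgamma(a))

def spi(values):
    """Standardized Precipitation Index for strictly positive totals:
    fit a gamma distribution (Thom's approximation for the shape
    parameter), then map each gamma CDF value through the standard
    normal quantile function."""
    mean = sum(values) / len(values)
    mean_log = sum(math.log(v) for v in values) / len(values)
    a_stat = math.log(mean) - mean_log
    shape = (1.0 + math.sqrt(1.0 + 4.0 * a_stat / 3.0)) / (4.0 * a_stat)
    scale = mean / shape
    norm = NormalDist()
    return [norm.inv_cdf(_reg_lower_gamma(shape, v / scale)) for v in values]
```

By construction the index is monotone in the precipitation total, with values near zero for near-average totals and negative values flagging drought.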

  3. Moving Forward: Positive Behavior Support and Applied Behavior Analysis

    Science.gov (United States)

    Tincani, Matt

    2007-01-01

    A controversy has emerged about the relationship between positive behavior support and applied behavior analysis. Some behavior analysts suggest that positive behavior support and applied behavior analysis are the same (e.g., Carr & Sidener, 2002). Others argue that positive behavior support is harmful to applied behavior analysis (e.g., Johnston,…

  4. Introduction: Conversation Analysis in Applied Linguistics

    Science.gov (United States)

    Sert, Olcay; Seedhouse, Paul

    2011-01-01

    This short, introductory paper presents an up-to-date account of works within the field of Applied Linguistics which have been influenced by a Conversation Analytic paradigm. The article reviews recent studies in classroom interaction, materials development, proficiency assessment and language teacher education. We believe that the publication of…

  5. SRS 2010 Vegetation Inventory GeoStatistical Mapping Results for Custom Reaction Intensity and Total Dead Fuels.

    Energy Technology Data Exchange (ETDEWEB)

    Edwards, Lloyd A. [Leading Solutions, LLC.]; Paresol, Bernard [U.S. Department of Agriculture, Forest Service, Pacific Northwest Research Station, Portland, OR]

    2014-09-01

    This report of the geostatistical analysis results for the fire fuels response variables custom reaction intensity and total dead fuels is part of the SRS 2010 vegetation inventory project. For a detailed description of the project, theory and background, including sample design, methods, and results, please refer to the USDA Forest Service Savannah River Site internal report "SRS 2010 Vegetation Inventory GeoStatistical Mapping Report" (Edwards & Parresol 2013).

  6. Subject selection in applied behavior analysis

    OpenAIRE

    Homer, Andrew L.; Peterson, Lizette; Wonderlich, Stephen A.

    1983-01-01

    Past researchers have commented on the role of specifying relevant subject characteristics in determining the generality of experimental findings. Knowledge of subject selection criteria is important in interpreting and replicating research results. Such knowledge, as compared with many other historical and demographic characteristics of the subject, is likely to be related to a procedure's effectiveness. Data indicated that the majority of articles published in the Journal of Applied Behavio...

  7. Applied modal analysis of wind turbine blades

    DEFF Research Database (Denmark)

    Pedersen, H.B.; Kristensen, O.J.D.

    2003-01-01

    In this project modal analysis has been used to determine the natural frequencies, damping and the mode shapes for wind turbine blades. Different methods to measure the position and adjust the direction of the measuring points are discussed. Different equipment for mounting the accelerometers is...... unloaded wind turbine blade. During this campaign the modal analysis is performed on a blade mounted in a horizontal and a vertical position respectively. Finally the results obtained from modal analysis carried out on a wind turbine blade are compared with results obtained from the Stig Øyes blade_EV1...
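For a single well-separated mode, the natural frequency and damping ratio that such a campaign estimates can be illustrated with the logarithmic-decrement method applied to a free-decay response. The sketch below is a generic single-degree-of-freedom illustration on a synthetic signal, not the measurement procedure used in the report; the 2.5 Hz frequency and 2% damping are assumed values.

```python
import math

def modal_parameters(signal, dt):
    """Estimate natural frequency [Hz] and damping ratio from the first
    two positive peaks of a single-mode free-decay response, using the
    logarithmic decrement."""
    peaks = [(i, v) for i, v in enumerate(signal[1:-1], start=1)
             if v > signal[i - 1] and v > signal[i + 1] and v > 0.0]
    (i1, p1), (i2, p2) = peaks[0], peaks[1]
    period = (i2 - i1) * dt                     # damped period Td
    delta = math.log(p1 / p2)                   # logarithmic decrement
    zeta = delta / math.sqrt(4.0 * math.pi ** 2 + delta ** 2)
    f_damped = 1.0 / period
    f_natural = f_damped / math.sqrt(1.0 - zeta ** 2)
    return f_natural, zeta

# Synthetic free decay: 2.5 Hz mode with 2% critical damping (assumed)
fn_true, zeta_true, dt = 2.5, 0.02, 0.001
wn = 2.0 * math.pi * fn_true
wd = wn * math.sqrt(1.0 - zeta_true ** 2)
x = [math.exp(-zeta_true * wn * k * dt) * math.sin(wd * k * dt)
     for k in range(4000)]
fn_est, zeta_est = modal_parameters(x, dt)
```

On clean data the peak spacing recovers the damped period and the peak-amplitude ratio recovers the damping; measured blade responses would first need band-pass filtering to isolate one mode.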

  8. Geostatistics Portal - an Integrated System for the Dissemination of Geo-Statistical Data

    Directory of Open Access Journals (Sweden)

    Igor Kuzma

    2013-09-01

    Full Text Available A wide range of applicability of spatial statistical data for managing and planning various human activities in the environment or monitoring the trends of different phenomena in space and time requires an adequate response from data providers. The Statistical Office of the Republic of Slovenia (SURS) has a long tradition of processing geo-referenced statistical data that can be point located or aggregated to an optional (administrative) spatial unit and, in line with the increasing need for geo-referenced statistical data of high resolution, SURS followed the users' needs by developing various services that are a part of an integrated system for the dissemination of geo-statistical data. The article discusses the production of geo-statistical data in Slovenia with the focus on the grid data, related confidentiality issues and the system for the dissemination of geo-statistical data, i.e. the Geostatistics portal.

  9. Applied time series analysis and innovative computing

    CERN Document Server

    Ao, Sio-Iong

    2010-01-01

    This text is a systematic, state-of-the-art introduction to the use of innovative computing paradigms as an investigative tool for applications in time series analysis. It includes frontier case studies based on recent research.

  10. Applied bioinformatics: Genome annotation and transcriptome analysis

    DEFF Research Database (Denmark)

    Gupta, Vikas

    Next generation sequencing (NGS) has revolutionized the field of genomics and its wide range of applications has resulted in the genome-wide analysis of hundreds of species and the development of thousands of computational tools. This thesis represents my work on NGS analysis of four species, Lotus...... japonicus (Lotus), Vaccinium corymbosum (blueberry), Stegodyphus mimosarum (spider) and Trifolium occidentale (clover). From a bioinformatics data analysis perspective, my work can be divided into three parts; genome annotation, small RNA, and gene expression analysis. Lotus is a legume of significant...... agricultural and biological importance. Its capacity to form symbiotic relationships with rhizobia and microrrhizal fungi has fascinated researchers for years. Lotus has a small genome of approximately 470 Mb and a short life cycle of 2 to 3 months, which has made Lotus a model legume plant for many molecular...

  11. Applied quantitative analysis in the social sciences

    CERN Document Server

    Petscher, Yaacov; Compton, Donald L

    2013-01-01

    To say that complex data analyses are ubiquitous in the education and social sciences might be an understatement. Funding agencies and peer-review journals alike require that researchers use the most appropriate models and methods for explaining phenomena. Univariate and multivariate data structures often require the application of more rigorous methods than basic correlational or analysis of variance models. Additionally, though a vast set of resources may exist on how to run analysis, difficulties may be encountered when explicit direction is not provided as to how one should run a model

  12. Science, Skepticism, and Applied Behavior Analysis

    OpenAIRE

    Normand, Matthew P.

    2008-01-01

    Pseudoscientific claims concerning medical and psychological treatments of all varieties are commonplace. As behavior analysts, a sound skeptical approach to our science and practice is essential. The present paper offers an overview of science and skepticism and discusses the relationship of skepticism to behavior analysis, with an emphasis on the types of issues concerning behavior analysts in practice.

  13. Instrumentation and computational techniques and resulting geostatistical characterization of INEL vadose zone basalt

    International Nuclear Information System (INIS)

    The geostatistical characterization of the Eastern Snake River Plains (ESRP) basalt found in the vadose zone at the Idaho National Engineering Laboratory (INEL) depends heavily upon instrumentation and computer programs developed by INEL Geoscience Unit personnel. Laboratory measurements of permeability, porosity, density, capillary pressure-pore-size distributions, and surface area were carried out with equipment developed and fabricated at INEL, and computation of the characterization parameters was carried out using personal computers, with computer programs specifically written for this purpose by the authors. Geostatistical analysis results from Hell's Half Acre were also used to develop distribution models for medial/distal length:width:thickness substrate elements. 3 refs., 10 figs

  14. Reliability analysis applied to structural tests

    Science.gov (United States)

    Diamond, P.; Payne, A. O.

    1972-01-01

    The application of reliability theory to predict, from structural fatigue test data, the risk of failure of a structure under service conditions because its load-carrying capability is progressively reduced by the extension of a fatigue crack, is considered. The procedure is applicable to both safe-life and fail-safe structures and, for a prescribed safety level, it will enable an inspection procedure to be planned or, if inspection is not feasible, it will evaluate the life to replacement. The theory has been further developed to cope with the case of structures with initial cracks, such as can occur in modern high-strength materials which are susceptible to the formation of small flaws during the production process. The method has been applied to a structure of high-strength steel and the results are compared with those obtained by the current life estimation procedures. This has shown that the conventional methods can be unconservative in certain cases, depending on the characteristics of the structure and the design operating conditions. The suitability of the probabilistic approach to the interpretation of the results from full-scale fatigue testing of aircraft structures is discussed and the assumptions involved are examined.
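The probabilistic approach can be illustrated, in much simplified form, by a load-strength interference calculation: the structure fails when the applied load exceeds its (progressively reduced) residual strength. The sketch below assumes both quantities are independent and normally distributed, an illustrative simplification rather than the paper's crack-growth model, and all numeric values are assumptions.

```python
from statistics import NormalDist

def failure_probability(strength_mean, strength_sd, load_mean, load_sd):
    """Load-strength interference with independent normal variables:
    P(failure) = P(load > strength)
               = Phi((load_mean - strength_mean) / sqrt(sd_s**2 + sd_l**2))."""
    margin_mean = strength_mean - load_mean
    margin_sd = (strength_sd ** 2 + load_sd ** 2) ** 0.5
    return NormalDist().cdf(-margin_mean / margin_sd)
```

As fatigue cracking erodes `strength_mean` over the service life, the computed risk rises, which is the quantity an inspection schedule or replacement life is chosen to control.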

  15. Geospatial interpolation and mapping of tropospheric ozone pollution using geostatistics.

    Science.gov (United States)

    Kethireddy, Swatantra R; Tchounwou, Paul B; Ahmad, Hafiz A; Yerramilli, Anjaneyulu; Young, John H

    2014-01-01

    Tropospheric ozone (O3) pollution is a major problem worldwide, including in the United States of America (USA), particularly during the summer months. Ozone oxidative capacity and its impact on human health have attracted the attention of the scientific community. In the USA, sparse spatial observations for O3 may not provide a reliable source of data over a geo-environmental region. Geostatistical Analyst in ArcGIS has the capability to interpolate values in unmonitored geo-spaces of interest. In this study of eastern Texas O3 pollution, hourly episodes for spring and summer 2012 were selectively identified. To visualize the O3 distribution, geostatistical techniques were employed in ArcMap. Using ordinary Kriging, geostatistical layers of O3 for all the studied hours were predicted and mapped at a spatial resolution of 1 kilometer. A decent level of prediction accuracy was achieved and was confirmed from cross-validation results. The mean prediction error was close to 0, the root mean-standardized-prediction error was close to 1, and the root mean square and average standard errors were small. O3 pollution map data can be further used in analysis and modeling studies. Kriging results and O3 decadal trends indicate that the populace in Houston-Sugar Land-Baytown, Dallas-Fort Worth-Arlington, Beaumont-Port Arthur, San Antonio, and Longview are repeatedly exposed to high levels of O3-related pollution, and are prone to the corresponding respiratory and cardiovascular health effects. Optimization of the monitoring network proves to be an added advantage for the accurate prediction of exposure levels. PMID:24434594

  16. Geospatial Interpolation and Mapping of Tropospheric Ozone Pollution Using Geostatistics

    Directory of Open Access Journals (Sweden)

    Swatantra R. Kethireddy

    2014-01-01

    Full Text Available Tropospheric ozone (O3) pollution is a major problem worldwide, including in the United States of America (USA), particularly during the summer months. Ozone oxidative capacity and its impact on human health have attracted the attention of the scientific community. In the USA, sparse spatial observations for O3 may not provide a reliable source of data over a geo-environmental region. Geostatistical Analyst in ArcGIS has the capability to interpolate values in unmonitored geo-spaces of interest. In this study of eastern Texas O3 pollution, hourly episodes for spring and summer 2012 were selectively identified. To visualize the O3 distribution, geostatistical techniques were employed in ArcMap. Using ordinary Kriging, geostatistical layers of O3 for all the studied hours were predicted and mapped at a spatial resolution of 1 kilometer. A decent level of prediction accuracy was achieved and was confirmed from cross-validation results. The mean prediction error was close to 0, the root mean-standardized-prediction error was close to 1, and the root mean square and average standard errors were small. O3 pollution map data can be further used in analysis and modeling studies. Kriging results and O3 decadal trends indicate that the populace in Houston-Sugar Land-Baytown, Dallas-Fort Worth-Arlington, Beaumont-Port Arthur, San Antonio, and Longview are repeatedly exposed to high levels of O3-related pollution, and are prone to the corresponding respiratory and cardiovascular health effects. Optimization of the monitoring network proves to be an added advantage for the accurate prediction of exposure levels.
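The ordinary Kriging predictor used in this study (via ArcGIS Geostatistical Analyst) can be sketched from first principles: the weights solve a linear system that includes a Lagrange multiplier forcing them to sum to one. The implementation below is a minimal illustration with an assumed spherical semivariogram (the sill, range, coordinates and values are arbitrary), not the ArcGIS algorithm.

```python
import numpy as np

def spherical(h, sill, range_):
    """Spherical semivariogram model with zero nugget."""
    h = np.minimum(h, range_)
    return sill * (1.5 * h / range_ - 0.5 * (h / range_) ** 3)

def ordinary_kriging(coords, values, target, sill=1.0, range_=10.0):
    """Ordinary Kriging prediction at one target location.

    Solves [[Gamma, 1], [1^T, 0]] [w; mu] = [gamma0; 1], where the last
    row/column is the Lagrange multiplier enforcing sum(w) = 1."""
    n = len(values)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    A = np.empty((n + 1, n + 1))
    A[:n, :n] = spherical(d, sill, range_)
    A[:n, n] = A[n, :n] = 1.0
    A[n, n] = 0.0
    b = np.empty(n + 1)
    b[:n] = spherical(np.linalg.norm(coords - target, axis=1), sill, range_)
    b[n] = 1.0
    w = np.linalg.solve(A, b)
    return float(w[:n] @ values)

# Four assumed sample sites and values (purely illustrative)
coords = np.array([[0.0, 0.0], [0.0, 4.0], [3.0, 0.0], [5.0, 5.0]])
values = np.array([1.0, 2.0, 1.5, 3.0])
```

With a zero nugget, ordinary Kriging is an exact interpolator: predicting at a sampled location returns the sampled value, which is a convenient sanity check before cross-validation.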

  17. Assessment of spatial distribution of fallout radionuclides through geostatistics concept.

    Science.gov (United States)

    Mabit, L; Bernard, C

    2007-01-01

    After introducing the geostatistics concept and its utility in environmental science, and especially in Fallout Radionuclide (FRN) spatialisation, a case study of cesium-137 ((137)Cs) redistribution at the field scale using geostatistics is presented. On a Canadian agricultural field, geostatistics coupled with a Geographic Information System (GIS) was used to test three different techniques of interpolation [Ordinary Kriging (OK), Inverse Distance Weighting power one (IDW1) and two (IDW2)] to create a (137)Cs map and to establish a radioisotope budget. Following the optimization of variographic parameters, an experimental semivariogram was developed to determine the spatial dependence of (137)Cs. It was adjusted to a spherical isotropic model with a range of 30 m and a very small nugget effect. This (137)Cs semivariogram showed a good autocorrelation (R(2)=0.91) and was well structured ('nugget-to-sill' ratio of 4%). It also showed that the sampling strategy was adequate to capture the spatial correlation of (137)Cs. The spatial redistribution of (137)Cs was estimated by Ordinary Kriging and IDW to produce contour maps. A radioisotope budget was established for the 2.16 ha agricultural field under investigation. It was estimated that around 2 x 10(7)Bq of (137)Cs were missing (around 30% of the total initial fallout) and were exported by physical processes (runoff and erosion processes) from the area under investigation. The cross-validation analysis showed that in the case of spatially structured data, OK is a better interpolation method than IDW1 or IDW2 for the assessment of potential radioactive contamination and/or pollution. PMID:17673340
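The experimental semivariogram and the 'nugget-to-sill' ratio reported above can be computed with Matheron's classical estimator. A minimal sketch; the transect, lag bins and the data values are illustrative assumptions, not the study's measurements.

```python
import numpy as np

def empirical_semivariogram(coords, values, lag_edges):
    """Matheron's classical estimator: for every lag bin, half the mean
    squared difference over all sample pairs whose separation distance
    falls inside the bin."""
    coords = np.asarray(coords, dtype=float)
    values = np.asarray(values, dtype=float)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    iu = np.triu_indices(len(values), k=1)        # count each pair once
    dist = d[iu]
    sqdiff = (values[:, None] - values[None, :])[iu] ** 2
    gamma = []
    for lo, hi in zip(lag_edges[:-1], lag_edges[1:]):
        mask = (dist >= lo) & (dist < hi)
        gamma.append(0.5 * sqdiff[mask].mean() if mask.any() else np.nan)
    return np.array(gamma)

def nugget_to_sill_ratio(nugget, sill):
    """Percentage ratio; < 25% is commonly read as strong spatial
    dependence, 25-75% as moderate, > 75% as weak."""
    return 100.0 * nugget / sill

# Illustrative transect: z increases linearly along x, so the true
# semivariogram is gamma(h) = h**2 / 2
coords = [[i, 0.0] for i in range(10)]
values = list(range(10))
gamma = empirical_semivariogram(coords, values, [0.5, 1.5, 2.5, 3.5])
```

A fitted model whose nugget is only 4% of the sill, as in the study above, lands well inside the "strong spatial dependence" band of this conventional classification.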

  18. Assessment of spatial distribution of fallout radionuclides through geostatistics concept

    International Nuclear Information System (INIS)

    After introducing the geostatistics concept and its utility in environmental science, and especially in Fallout Radionuclide (FRN) spatialisation, a case study of cesium-137 (137Cs) redistribution at the field scale using geostatistics is presented. On a Canadian agricultural field, geostatistics coupled with a Geographic Information System (GIS) was used to test three different techniques of interpolation [Ordinary Kriging (OK), Inverse Distance Weighting power one (IDW1) and two (IDW2)] to create a 137Cs map and to establish a radioisotope budget. Following the optimization of variographic parameters, an experimental semivariogram was developed to determine the spatial dependence of 137Cs. It was adjusted to a spherical isotropic model with a range of 30 m and a very small nugget effect. This 137Cs semivariogram showed a good autocorrelation (R2 = 0.91) and was well structured ('nugget-to-sill' ratio of 4%). It also showed that the sampling strategy was adequate to capture the spatial correlation of 137Cs. The spatial redistribution of 137Cs was estimated by Ordinary Kriging and IDW to produce contour maps. A radioisotope budget was established for the 2.16 ha agricultural field under investigation. It was estimated that around 2 x 107 Bq of 137Cs were missing (around 30% of the total initial fallout) and were exported by physical processes (runoff and erosion processes) from the area under investigation. The cross-validation analysis showed that in the case of spatially structured data, OK is a better interpolation method than IDW1 or IDW2 for the assessment of potential radioactive contamination and/or pollution

  19. Artificial intelligence applied to process signal analysis

    Energy Technology Data Exchange (ETDEWEB)

    Corsberg, D.

    1986-01-01

    Many space station processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect of the human/machine interface is the analysis and display of process information. Human operators can be overwhelmed by large clusters of alarms that inhibit their ability to diagnose and respond to a disturbance. Using artificial intelligence techniques and a knowledge-based approach to this problem, the power of the computer can be used to filter and analyze plant sensor data. This will provide operators with a better description of the process state. Once a process state is recognized, automatic action could be initiated and proper system response monitored. 8 refs.
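The knowledge-based alarm-filtering idea can be sketched as a small rule base that suppresses alarms already explained by an active upstream cause, so operators see probable root causes first. The alarm names and cause relationships below are hypothetical, not taken from the paper.

```python
def filter_alarms(active, causes):
    """Suppress consequential alarms: drop any active alarm that is a
    known downstream effect of another alarm that is also active, so the
    display emphasizes probable root causes."""
    active_set = set(active)
    return [a for a in active if not (causes.get(a, set()) & active_set)]

# Hypothetical causal relationships for a cooling loop
causes = {
    "LOW_FLOW": {"PUMP_TRIP"},                # a tripped pump starves flow
    "HIGH_TEMP": {"LOW_FLOW", "PUMP_TRIP"},   # either condition heats the loop
}
```

With all three alarms active, only `PUMP_TRIP` survives the filter; if `HIGH_TEMP` fires alone, nothing explains it away and it is shown unchanged.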

  20. Toward applied behavior analysis of life aloft

    Science.gov (United States)

    Brady, J. V.

    1990-01-01

    This article deals with systems at multiple levels, at least from cell to organization. It also deals with learning, decision making, and other behavior at multiple levels. Technological development of a human behavioral ecosystem appropriate to space environments requires an analytic and synthetic orientation, explicitly experimental in nature, dictated by scientific and pragmatic considerations, and closely approximating procedures of established effectiveness in other areas of natural science. The conceptual basis of such an approach has its roots in environmentalism, which has two main features: (1) knowledge comes from experience rather than from innate ideas, divine revelation, or other obscure sources; and (2) action is governed by consequences rather than by instinct, reason, will, beliefs, attitudes or even the currently fashionable cognitions. Without an experimentally derived data base founded upon such a functional analysis of human behavior, the overgenerality of "ecological systems" approaches renders them incapable of ensuring the successful establishment of enduring space habitats. Without an experimentally derived functional account of individual behavioral variability, a natural science of behavior cannot exist. And without a natural science of behavior, the social sciences will necessarily remain in their current status as disciplines of less than optimal precision or utility. Such a functional analysis of human performance should provide an operational account of behavior change in a manner similar to the way in which Darwin's approach to natural selection accounted for the evolution of phylogenetic lines (i.e., in descriptive, nonteleological terms). Similarly, as Darwin's account has subsequently been shown to be consonant with information obtained at the cellular level, so too should behavior principles ultimately prove to be in accord with an account of ontogenetic adaptation at a biochemical level. It would thus seem obvious that the most

  1. Goals Analysis Procedure Guidelines for Applying the Goals Analysis Process

    Science.gov (United States)

    Motley, Albert E., III

    2000-01-01

    One of the key elements to successful project management is the establishment of the "right set of requirements", requirements that reflect the true customer needs and are consistent with the strategic goals and objectives of the participating organizations. A viable set of requirements implies that each individual requirement is a necessary element in satisfying the stated goals and that the entire set of requirements, taken as a whole, is sufficient to satisfy the stated goals. Unfortunately, it is the author's experience that during project formulation phases, many of the Systems Engineering customers do not conduct a rigorous analysis of the goals and objectives that drive the system requirements. As a result, the Systems Engineer is often provided with requirements that are vague, incomplete, and internally inconsistent. To complicate matters, most systems development methodologies assume that the customer provides unambiguous, comprehensive and concise requirements. This paper describes the specific steps of a Goals Analysis process applied by Systems Engineers at the NASA Langley Research Center during the formulation of requirements for research projects. The objective of Goals Analysis is to identify and explore all of the influencing factors that ultimately drive the system's requirements.

  2. Neutron activation analysis applied to archaeological problems

    International Nuclear Information System (INIS)

    Among the various techniques, the main analytical methods used to characterize ceramics are undoubtedly XRF and INAA. The principles of NAA differ from those of XRF in that samples are irradiated by thermal neutrons from a nuclear reactor. During irradiation, a few neutrons are captured by the nuclei of atoms in the specimen. This process, called activation, causes some of the nuclei to become unstable. During and after neutron irradiation, these unstable nuclei emit γ rays with unique energies at rates defined by the characteristic half-lives of the radioactive nuclei. Identification of the radioactive nucleus is possible by measuring the γ ray energies. Determination of their intensities permits quantitative analysis of the elements in the sample. The use of NAA in ceramics, with a combination of two or three irradiation, decay and measurement strategies, allows the determination of the elements Ba, Ce, Cl, Co, Cs, Dy, Eu, Fe, Hf, K, La, Lu, Mn, Na, Nd, Rb, Sb, Sc, Sm, Sr, Ta, Tb, Th, U, Yb, Zn and Zr. In general, XRF is more available, more rapid and less expensive than NAA. However, NAA offers a far greater number of elements, more sensitivity, superior precision and greater accuracy than XRF. On the other hand, NAA can be performed on extremely small samples (5-10 mg), meaning that sampling requires only minor damage to valuable artefacts.
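
The quantitative step described above, relating measured γ-ray intensities to element amounts, rests on the standard activation/decay relation. The following sketch uses purely illustrative parameters (not values from the record):

```python
import math

def induced_activity(n_atoms, flux, sigma_cm2, half_life_s, t_irr_s, t_decay_s):
    """Induced activity (Bq) from the standard activation equation:
    A = N * phi * sigma * (1 - exp(-lambda * t_irr)) * exp(-lambda * t_decay)."""
    lam = math.log(2.0) / half_life_s
    return (n_atoms * flux * sigma_cm2
            * (1.0 - math.exp(-lam * t_irr_s))   # build-up during irradiation
            * math.exp(-lam * t_decay_s))        # decay before counting

# Illustrative numbers only: 1e18 atoms of a nuclide with a 10-barn cross
# section and 1-hour half-life, irradiated for one half-life in a
# 1e13 n/cm^2/s thermal flux and counted after one hour of decay.
a = induced_activity(1e18, 1e13, 10e-24, 3600.0, 3600.0, 3600.0)
```

Comparing such computed (or standard-calibrated) activities for each γ line is what turns the spectrum into element concentrations.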

  3. APPLYING EXPRESSION ANALYSIS IN METAPHORICAL VOCABULARY COMPREHENSION

    Directory of Open Access Journals (Sweden)

    Биљана Б. Радић-Бојанић

    2012-02-01

    Metaphorical vocabulary of the English language is often a problem for EFL students both in receptive and productive skills. One possible way of overcoming this difficulty is conscious development of learning strategies, which aid comprehension and production and lead to the autonomy of students in the domain of the use of metaphorical expressions in the English language. One of the strategies that can be used in metaphorical vocabulary comprehension is analyzing expressions, which implies the parsing of a multi-word chunk into words, the analysis of each individual element and checking whether all the elements of the multi-word chunk have a literal or metaphorical meaning. The paper is based on a one-year research project conducted at the Department of English at the Faculty of Philosophy in Novi Sad, during which English language students were interviewed in order to determine how and to what extent they analyze metaphorical expressions as a strategy in understanding metaphorical EFL vocabulary.

  4. Applied research of environmental monitoring using instrumental neutron activation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Young Sam; Moon, Jong Hwa; Chung, Young Ju

    1997-08-01

    This technical report is written as a guide book for applied research on environmental monitoring using Instrumental Neutron Activation Analysis. The contents are as follows: sampling and sample preparation of airborne particulate matter, analytical methodologies, data evaluation and interpretation, and basic statistical methods of data analysis applied in environmental pollution studies. (author). 23 refs., 7 tabs., 9 figs.

  5. Hands on applied finite element analysis application with ANSYS

    CERN Document Server

    Arslan, Mehmet Ali

    2015-01-01

    Hands on Applied Finite Element Analysis Application with Ansys is truly an extraordinary book that offers practical ways of tackling FEA problems in machine design and analysis. In this book, a careful selection of 35 example problems is presented, offering students the opportunity to apply their knowledge to real engineering FEA problem solutions by guiding them with real-life hands-on experience.

  6. Validating spatial structure in canopy water content using geostatistics

    Science.gov (United States)

    Sanderson, E. W.; Zhang, M. H.; Ustin, S. L.; Rejmankova, E.; Haxo, R. S.

    1995-01-01

    Heterogeneity in ecological phenomena is scale dependent and affects the hierarchical structure of image data. AVIRIS pixels average reflectance produced by complex absorption and scattering interactions between biogeochemical composition, canopy architecture, view and illumination angles, species distributions, and plant cover, as well as other factors. These scales affect validation of pixel reflectance, typically performed by relating pixel spectra to ground measurements acquired at scales of 1 m(exp 2) or less (e.g., field spectra, foliage and soil samples, etc.). As image analyses become more sophisticated, such as those for detection of canopy chemistry, better validation becomes a critical problem. This paper presents a methodology for bridging between point measurements and pixels using geostatistics. Geostatistics has been extensively used in geological or hydrogeological studies but has received little application in ecological studies. The key criterion for kriging estimation is that the phenomenon varies in space and that an underlying controlling process produces spatial correlation between the measured data points. Ecological variation meets this requirement because communities vary along environmental gradients such as soil moisture, nutrient availability, or topography.
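
The spatial correlation identified above as the key criterion for kriging is usually quantified with an empirical semivariogram. A minimal sketch of the classical (Matheron) estimator, on a toy transect rather than AVIRIS data:

```python
import numpy as np

def empirical_semivariogram(coords, values, lags, tol):
    """Classical (Matheron) estimator: gamma(h) is half the mean squared
    difference over all data pairs whose separation falls within h +/- tol."""
    coords = np.asarray(coords, float)
    values = np.asarray(values, float)
    dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    iu = np.triu_indices(len(values), k=1)              # count each pair once
    d, sq = dist[iu], (values[:, None] - values[None, :])[iu] ** 2
    gammas = []
    for h in lags:
        mask = np.abs(d - h) < tol
        gammas.append(0.5 * sq[mask].mean() if mask.any() else np.nan)
    return np.array(gammas)

# Toy transect with a linear trend: the semivariogram grows with lag distance,
# i.e. nearby points are more alike than distant ones.
x = np.arange(10.0)
coords = np.c_[x, np.zeros_like(x)]
z = 0.5 * x
gam = empirical_semivariogram(coords, z, lags=[1.0, 2.0, 3.0], tol=0.5)
```

A fitted model of this curve then supplies the weights used when kriging point measurements up to pixel scale.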

  7. MULTIVARIATE GEOSTATISTICAL METHODS FOR MAPPING SOIL SALINITY

    OpenAIRE

    BİLGİLİ, A.V.; ÇULLU, M.A.; AYDEMİR, A.; Turan, V; SÖNMEZ, O.; AYDEMİR, S.; Kaya, C.

    2012-01-01

    Degradation of land by salinity under an arid climate and poor drainage conditions can be inevitable. In the Harran plain, salt-affected areas cover 10% of the total irrigated area and are mainly located in the low-lying parts of the plain, where elevation ranges from 350 to 400 m. Soil salinity shows high spatial variability, which requires intensive sampling and laboratory analyses. Geostatistical techniques such as simple or ordinary kriging can be used in explaining this spatial var...

  8. Geostatistical Evaluation of Fracture Frequency and Crushing

    OpenAIRE

    Séguret, Serge Antoine; Guajardo Moreno, Cristian; Freire Rivera, Ramon

    2014-01-01

    This work details how to estimate the Fracture Frequency (FF), the ratio of a number of fractures divided by a sample length. The difficulty is that often a part of the sample cannot be analyzed by the geologist because it is crushed, a characteristic of the rock strength that must also be considered for the Rock Mass Rating. After analyzing the usual practices, the paper describes the (geo)statistical link between fracturing and crushing and the resulting method to ob...

  9. Soil penetration resistance analysis by multivariate and geostatistical methods Análisis de la resistencia a la penetración del suelo mediante métodos geoestadísticos y multivariados

    Directory of Open Access Journals (Sweden)

    Cecilia Medina

    2012-02-01

    The penetration resistance (PR) is a soil attribute that allows identifying areas with restrictions due to compaction, which results in mechanical impedance to root growth and reduced crop yield. The aim of this study was to characterize the PR of an agricultural soil by geostatistical and multivariate analysis. Sampling was done randomly at 90 points, down to 0.60 m depth. Spatial distribution models of PR were determined, and areas with mechanical impedance to root growth were delineated. The PR showed a random distribution at 0.55 and 0.60 m depth. PR at the other depths analyzed showed spatial dependence, with fits to exponential and spherical models. Cluster analysis of the sampling points allowed establishing areas with compaction problems, identified in the maps obtained by kriging interpolation. Principal component analysis identified three soil layers, of which the middle layer showed the highest values of PR.

  10. Applied behavior analysis: New directions from the laboratory

    OpenAIRE

    Epling, W. Frank; Pierce, W. David

    1983-01-01

    Applied behavior analysis began when laboratory-based principles were extended to humans in order to change socially significant behavior. Recent laboratory findings may have applied relevance; however, the majority of basic researchers have not clearly communicated the practical implications of their work. The present paper samples some of the new findings and attempts to demonstrate their applied importance. Schedule-induced behavior which occurs as a by-product of contingencies of reinforce...

  11. Soil exchangeable cations: A geostatistical study from Russia

    Directory of Open Access Journals (Sweden)

    Tayfun Aşkın

    2012-01-01

    In the present study, geostatistical techniques were applied to assess the spatial variability of exchangeable cations, namely calcium (Ex-Ca²⁺), magnesium (Ex-Mg²⁺), potassium (Ex-K⁺) and sodium (Ex-Na⁺), in the tilled layer of a Perm State Agricultural Academy farm site in the Perm region, West Urals, Russia. A 250x100 m plot (approximately 2.35 ha) was divided into grids with 25x25 m spacing, giving 51 sampling points at 0-0.2 m depth. Soil reaction (pH) was the least variable property, while Ex-K⁺ was the most variable. The greatest range of influence (237.6 m) occurred for Ex-Ca²⁺ and the least range (49.7 m) for Ex-Mg²⁺.

  12. Animal Research in the "Journal of Applied Behavior Analysis"

    Science.gov (United States)

    Edwards, Timothy L.; Poling, Alan

    2011-01-01

    This review summarizes the 6 studies with nonhuman animal subjects that have appeared in the "Journal of Applied Behavior Analysis" and offers suggestions for future research in this area. Two of the reviewed articles described translational research in which pigeons were used to illustrate and examine behavioral phenomena of applied significance…

  13. Dimensional Analysis with space discrimination applied to Fickian difussion phenomena

    International Nuclear Information System (INIS)

    Dimensional analysis with space discrimination is applied to Fickian diffusion phenomena in order to transform its partial differential equations into ordinary ones, and also to obtain Fick's second law in a dimensionless form. (Author)
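
The abstract gives no equations; for reference, Fick's second law and its standard Boltzmann similarity reduction (the kind of PDE-to-ODE transformation the abstract refers to) read:

```latex
\frac{\partial c}{\partial t} = D\,\frac{\partial^{2} c}{\partial x^{2}},
\qquad
\eta = \frac{x}{2\sqrt{Dt}},\quad c(x,t) = f(\eta)
\;\Longrightarrow\;
f''(\eta) + 2\eta\, f'(\eta) = 0 .
```

The dimensionless similarity variable η collapses x and t into a single independent variable, so the partial differential equation becomes an ordinary one.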

  14. Introduction to This Special Issue on Geostatistics and Geospatial Techniques in Remote Sensing

    Science.gov (United States)

    Atkinson, Peter; Quattrochi, Dale A.; Goodman, H. Michael (Technical Monitor)

    2000-01-01

    Geology for allowing us to use C&G as a vehicle to convey how geostatistics and geospatial techniques can be used to analyze remote sensing and other types of spatial data. We see this special issue of C&G, and its complementary issue of PE&RS, as a testament to the vitality and interest in the application of geostatistical and geospatial techniques in remote sensing. We also see these special journal issues as the beginning of a fruitful, and hopefully long-term, relationship between American and British geographers and other researchers interested in geostatistical and geospatial techniques applied to remote sensing and other spatial data.

  15. Combined assimilation of streamflow and satellite soil moisture with the particle filter and geostatistical modeling

    Science.gov (United States)

    Yan, Hongxiang; Moradkhani, Hamid

    2016-08-01

    Assimilation of satellite soil moisture and streamflow data into a distributed hydrologic model has received increasing attention over the past few years. This study provides a detailed analysis of the joint and separate assimilation of streamflow and Advanced Scatterometer (ASCAT) surface soil moisture into a distributed Sacramento Soil Moisture Accounting (SAC-SMA) model, with the use of the recently developed particle filter-Markov chain Monte Carlo (PF-MCMC) method. Performance is assessed over the Salt River Watershed in Arizona, which is one of the watersheds without anthropogenic effects in the Model Parameter Estimation Experiment (MOPEX). A total of five data assimilation (DA) scenarios are designed, and the effects of the locations of streamflow gauges and of the ASCAT soil moisture on the predictions of soil moisture and streamflow are assessed. In addition, a geostatistical model is introduced to overcome the significant bias and spatial discontinuity of the satellite soil moisture. The results indicate that: (1) solely assimilating outlet streamflow can lead to biased soil moisture estimation; (2) when the study area can only be partially covered by the satellite data, the geostatistical approach can estimate the soil moisture for those uncovered grid cells; (3) joint assimilation of streamflow and soil moisture from geostatistical modeling can further improve the surface soil moisture prediction. This study recommends the geostatistical model as a helpful tool to aid the remote sensing technique and the hydrologic DA study.
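
The PF-MCMC machinery used in the study is elaborate; the core idea of particle filtering can be sketched with a plain bootstrap filter on a toy state-space model (all model choices and parameters below are illustrative, not the SAC-SMA setup):

```python
import numpy as np

def bootstrap_particle_filter(obs, n_particles, propagate, likelihood, init, rng):
    """Minimal bootstrap particle filter: propagate particles through the
    state model, weight them by the observation likelihood, then resample."""
    particles = init(n_particles, rng)
    means = []
    for y in obs:
        particles = propagate(particles, rng)                 # forecast step
        w = likelihood(y, particles)                          # weight by observation
        w /= w.sum()
        idx = rng.choice(n_particles, size=n_particles, p=w)  # multinomial resampling
        particles = particles[idx]
        means.append(particles.mean())
    return np.array(means)

# Toy linear-Gaussian model: x_t = 0.9 x_{t-1} + N(0, 0.5), y_t = x_t + N(0, 0.5).
rng = np.random.default_rng(0)
obs = np.full(50, 2.0)  # constant observations pull the state towards 2
est = bootstrap_particle_filter(
    obs, 500,
    propagate=lambda p, r: 0.9 * p + r.normal(0.0, 0.5, p.shape),
    likelihood=lambda y, p: np.exp(-0.5 * ((y - p) / 0.5) ** 2),
    init=lambda n, r: r.normal(0.0, 1.0, n),
    rng=rng)
final = est[-10:].mean()
```

The PF-MCMC variant replaces the plain resampling step with MCMC moves to fight particle degeneracy, but the forecast-weight-resample cycle is the same.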

  16. Gentle teaching and applied behavior analysis: a critical review.

    OpenAIRE

    Jones, R S; McCaughey, R E

    1992-01-01

    In recent years, there has been a growing controversy surrounding gentle teaching. This paper explores the nature of this controversy with particular reference to the relationship between gentle teaching and applied behavior analysis. Advantages and disadvantages of this approach are discussed, and it is suggested that gentle teaching and applied behavior analysis need not be regarded as mutually exclusive approaches to working with persons with mental retardation.

  17. Treatment integrity in applied behavior analysis with children.

    OpenAIRE

    F. M. Gresham; Gansle, K A; Noell, G H

    1993-01-01

    Functional analysis of behavior depends upon accurate measurement of both independent and dependent variables. Quantifiable and controllable operations that demonstrate these functional relationships are necessary for a science of human behavior. Failure to implement independent variables with integrity threatens the internal and external validity of experiments. A review of all applied behavior analysis studies with children as subjects that have been published in the Journal of Applied Beha...

  18. Reservoir Modeling Combining Geostatistics with Markov Chain Monte Carlo Inversion

    DEFF Research Database (Denmark)

    Zunino, Andrea; Lange, Katrine; Melnikova, Yulia;

    2014-01-01

    geostatistics. The geostatistical algorithm learns the multiple-point statistics from prototype models, then generates proposal models which are tested by a Metropolis sampler. The solution of the inverse problem is finally represented by a collection of reservoir models in terms of facies and porosity, which...
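
The accept/reject step that "tests" proposal models in the record above is the standard Metropolis rule. A minimal sketch on a toy one-dimensional target (the paper applies it to full reservoir models with a seismic-data likelihood):

```python
import math, random

def metropolis(logp, proposal, x0, n_steps, rng):
    """Generic Metropolis sampler: propose a new model, accept it with
    probability min(1, p(new)/p(current)), otherwise keep the current one."""
    x, lp = x0, logp(x0)
    chain = []
    for _ in range(n_steps):
        x_new = proposal(x, rng)
        lp_new = logp(x_new)
        if math.log(rng.random()) < lp_new - lp:  # accept/reject step
            x, lp = x_new, lp_new
        chain.append(x)
    return chain

# Toy target: a standard normal density, with symmetric random-walk proposals.
rng = random.Random(0)
chain = metropolis(lambda x: -0.5 * x * x,
                   lambda x, r: x + r.uniform(-1.0, 1.0),
                   0.0, 20000, rng)
mean = sum(chain) / len(chain)
var = sum((c - mean) ** 2 for c in chain) / len(chain)
```

In the paper's setting, `proposal` is the multiple-point geostatistical simulator and `logp` the data misfit, so the chain is a collection of data-consistent reservoir models.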

  19. Modeling spatial variability of sand-lenses in clay till settings using transition probability and multiple-point geostatistics

    DEFF Research Database (Denmark)

    Kessler, Timo Christian; Nilsson, Bertel; Klint, Knud Erik;

    2010-01-01

    of sand-lenses in clay till. Sand-lenses mainly account for horizontal transport and are prioritised in this study. Based on field observations, the distribution has been modeled using two different geostatistical approaches. One method uses a Markov chain model calculating the transition probabilities...... the geology of e.g. a contaminated site, it is not always possible to gather enough information to build a representative geological model. Mapping in analogue geological settings and applying geostatistical tools to simulate spatial variability of heterogeneities can improve ordinary geological models...

  20. Applied data analysis and modeling for energy engineers and scientists

    CERN Document Server

    Reddy, T Agami

    2011-01-01

    ""Applied Data Analysis and Modeling for Energy Engineers and Scientists"" discusses mathematical models, data analysis, and decision analysis in modeling. The approach taken in this volume focuses on the modeling and analysis of thermal systems in an engineering environment, while also covering a number of other critical areas. Other material covered includes the tools that researchers and engineering professionals will need in order to explore different analysis methods, use critical assessment skills and reach sound engineering conclusions. The book also covers process and system design and

  1. Application of the geostatistical analyses to uranium geology

    International Nuclear Information System (INIS)

    A method for treating uranium geological data of different types using geostatistical analyses (kriging analysis and/or factor kriging analysis) is described in the paper. The original data are stored in a data bank and can be retrieved for analysis. The uranium reserves are estimated by kriging analysis. A complete system of programs suitable for uranium reserve estimation is developed, beginning with the input of original gamma-logging data, which are converted into ore grades in the computer and used to calculate uranium grade variograms and, then, to estimate uranium reserves. A case example is presented. In order to develop a new method of analysis for regional geophysical and geochemical data processing, factor kriging analysis, combining the essential ideas of kriging analysis and principal component analysis, is used. This method enables the regionalized variables to be split into components of different frequency intervals corresponding to different ranges; then, on the basis of principal component analysis, several (usually two) major ranges are determined in order to infer the geological structure related to these major ranges. According to the formula of the magnetic frequency spectrum, and by using the two-dimensional inverse Fourier transform, the covariance function and variogram are derived. Preliminary results obtained by treatment and analysis of a large number of airborne magnetic data using factor kriging analysis are given. (author). 4 refs, 7 figs
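
The kriging estimation underlying such reserve calculations solves a small linear system for the interpolation weights. A minimal ordinary-kriging sketch with an illustrative covariance model (not the deposit's fitted variogram) and made-up grades:

```python
import numpy as np

def ordinary_kriging(coords, values, target, cov):
    """Ordinary kriging: solve the system [C 1; 1' 0][w; mu] = [c0; 1]
    for the weights w (which sum to 1), then estimate z* = w . z.
    `cov` maps separation distance to covariance."""
    coords = np.asarray(coords, float)
    values = np.asarray(values, float)
    n = len(values)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    A = np.zeros((n + 1, n + 1))
    A[:n, :n] = cov(d)                 # data-to-data covariances
    A[:n, n] = A[n, :n] = 1.0          # unbiasedness constraint
    b = np.append(cov(np.linalg.norm(coords - np.asarray(target, float), axis=1)), 1.0)
    w = np.linalg.solve(A, b)[:n]
    return float(w @ values)

# Illustrative exponential covariance and grades at three sample points.
cov = lambda h: np.exp(-h / 2.0)
pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
grades = [1.0, 2.0, 3.0]
est = ordinary_kriging(pts, grades, (0.5, 0.5), cov)
```

Note that kriging is an exact interpolator: estimating at a sampled location returns the measured grade.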

  2. Quantitative Analysis of the Interdisciplinarity of Applied Mathematics.

    Science.gov (United States)

    Xie, Zheng; Duan, Xiaojun; Ouyang, Zhenzheng; Zhang, Pengyuan

    2015-01-01

    The increasing use of mathematical techniques in scientific research leads to the interdisciplinarity of applied mathematics. This viewpoint is validated quantitatively here by statistical and network analysis on the corpus PNAS 1999-2013. A network describing the interdisciplinary relationships between disciplines in a panoramic view is built based on the corpus. Specific network indicators show the hub role of applied mathematics in interdisciplinary research. The statistical analysis on the corpus content finds that algorithms, a primary topic of applied mathematics, positively correlates, increasingly co-occurs, and has an equilibrium relationship in the long-run with certain typical research paradigms and methodologies. The finding can be understood as an intrinsic cause of the interdisciplinarity of applied mathematics. PMID:26352604

  3. Methodology, the matching law, and applied behavior analysis

    OpenAIRE

    Vyse, Stuart A.

    1986-01-01

    The practical value of the quantitative analysis of behavior is limited by two methodological characteristics of this area of research: the use of (a) steady-state strategies and (b) relative vs. absolute response rates. Applied behavior analysts are concerned with both transition-state and steady-state behavior, and applied interventions are typically evaluated by their effects on absolute response rates. Quantitative analyses of behavior will have greater practical value when methods are de...

  4. Applied Methods for Analysis of Economic Structure and Change

    OpenAIRE

    Anderstig, Christer

    1988-01-01

    The thesis comprises five papers and an introductory overview of applied models and methods. The papers concern interdependences and interrelations in models applied to empirical analyses of various problems related to production, consumption, location and trade. Among different definitions of 'structural analysis', one refers to the study of the properties of economic models on the assumption of invariant structural relations; this definition is close to what is aimed at in the present case....

  5. ANIMAL RESEARCH IN THE JOURNAL OF APPLIED BEHAVIOR ANALYSIS

    OpenAIRE

    Edwards, Timothy L.; Poling, Alan

    2011-01-01

    This review summarizes the 6 studies with nonhuman animal subjects that have appeared in the Journal of Applied Behavior Analysis and offers suggestions for future research in this area. Two of the reviewed articles described translational research in which pigeons were used to illustrate and examine behavioral phenomena of applied significance (say–do correspondence and fluency), 3 described interventions that changed animals' behavior (self-injury by a baboon, feces throwing and spitting by...

  6. Negative reinforcement in applied behavior analysis: an emerging technology.

    OpenAIRE

    Iwata, B A

    1987-01-01

    Although the effects of negative reinforcement on human behavior have been studied for a number of years, a comprehensive body of applied research does not exist at this time. This article describes three aspects of negative reinforcement as it relates to applied behavior analysis: behavior acquired or maintained through negative reinforcement, the treatment of negatively reinforced behavior, and negative reinforcement as therapy. A consideration of research currently being done in these area...

  7. The distribution of arsenic in shallow alluvial groundwater under agricultural land in central Portugal: insights from multivariate geostatistical modeling.

    Science.gov (United States)

    Andrade, A I A S S; Stigter, T Y

    2013-04-01

    In this study multivariate and geostatistical methods are jointly applied to model the spatial and temporal distribution of arsenic (As) concentrations in shallow groundwater as a function of physicochemical, hydrogeological and land use parameters, as well as to assess the related uncertainty. The study site is located in the Mondego River alluvial body in Central Portugal, where maize, rice and some vegetable crops dominate. In a first analysis scatter plots are used, followed by the application of principal component analysis to two different data matrices, of 112 and 200 samples, with the aim of detecting associations between As levels and other quantitative parameters. In the following phase explanatory models of As are created through factorial regression based on correspondence analysis, integrating both quantitative and qualitative parameters. Finally, these are combined with indicator-geostatistical techniques to create maps indicating the predicted probability of As concentrations in groundwater exceeding the current global drinking water guideline of 10 μg/l. These maps further allow assessing the uncertainty and representativeness of the monitoring network. A clear effect of the redox state on the presence of As is observed, and together with significant correlations with dissolved oxygen, nitrate, sulfate, iron, manganese and alkalinity, points towards the reductive dissolution of Fe (hydr)oxides as the essential mechanism of As release. The association of high As values with rice crop, known to promote reduced environments due to ponding, further corroborates this hypothesis. An additional source of As from fertilizers cannot be excluded, as the correlation with As is higher where rice is associated with vegetables, normally associated with higher fertilization rates. 
The best explanatory model of As occurrence integrates the parameters season, crop type, well and water depth, nitrate and Eh, though a model without the last two parameters also gives
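
The indicator-geostatistical step mentioned above starts from a simple 0/1 coding of the data against the 10 μg/l guideline; kriging those indicators yields the exceedance-probability maps. A sketch with made-up concentrations:

```python
import numpy as np

def indicator_transform(values, threshold):
    """Indicator coding used in indicator kriging: 1 where the value exceeds
    the threshold, 0 otherwise. Kriging these indicators estimates the
    probability of exceeding the threshold at unsampled locations."""
    return (np.asarray(values, float) > threshold).astype(float)

# Made-up As concentrations in ug/l, coded against the 10 ug/l guideline.
as_ugl = np.array([2.0, 8.0, 15.0, 40.0, 5.0, 12.0])
ind = indicator_transform(as_ugl, 10.0)
p_exceed = ind.mean()  # naive global exceedance proportion
```

Kriging the indicator values (rather than simply averaging them, as done here) weights each sample by its spatial configuration, which is what makes the resulting probability maps location-specific.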

  8. Geostatistical prediction of flow-duration curves

    Science.gov (United States)

    Pugliese, A.; Castellarin, A.; Brath, A.

    2013-11-01

    We present in this study an adaptation of Topological kriging (or Top-kriging), which makes the geostatistical procedure capable of predicting flow-duration curves (FDCs) in ungauged catchments. Previous applications of Top-kriging mainly focused on the prediction of point streamflow indices (e.g. flood quantiles, low-flow indices, etc.). In this study Top-kriging is used to predict FDCs in ungauged sites as a weighted average of standardised empirical FDCs through the traditional linear-weighting scheme of kriging methods. Our study focuses on the prediction of period-of-record FDCs for 18 unregulated catchments located in Central Italy, for which daily streamflow series with length from 5 to 40 yr are available, together with information on climate referring to the same time-span of each daily streamflow sequence. Empirical FDCs are standardised by a reference streamflow value (i.e. mean annual flow, or mean annual precipitation times the catchment drainage area) and the overall deviation of the curves from this reference value is then used for expressing the hydrological similarity between catchments and for deriving the geostatistical weights. We performed an extensive leave-one-out cross-validation to quantify the accuracy of the proposed technique, and to compare it to traditional regionalisation models that were recently developed for the same study region. The cross-validation points out that Top-kriging is a reliable approach for predicting FDCs, which can significantly outperform traditional regional models in ungauged basins.
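
An empirical flow-duration curve, the object being predicted above, is just the sorted daily flows paired with their exceedance probabilities. A sketch with illustrative data, including the standardisation by a reference flow described in the abstract:

```python
import numpy as np

def flow_duration_curve(daily_flows):
    """Empirical FDC: flows sorted in decreasing order paired with their
    exceedance probabilities (Weibull plotting position i/(n+1))."""
    q = np.sort(np.asarray(daily_flows, float))[::-1]
    p_exc = np.arange(1, len(q) + 1) / (len(q) + 1.0)
    return p_exc, q

# Standardising by the mean flow, as in the study, makes curves from
# different catchments comparable and averageable; flows here are made up.
flows = np.array([1.0, 3.0, 2.0, 8.0, 4.0, 2.0, 1.0])
p, q = flow_duration_curve(flows / flows.mean())
```

Top-kriging then predicts the standardised curve at an ungauged site as a weighted average of such empirical curves from hydrologically similar gauged catchments.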

  9. Geostatistical sampling optimization and waste characterization of contaminated premises

    International Nuclear Information System (INIS)

    At the end of process equipment dismantling, the complete decontamination of nuclear facilities requires a radiological assessment of the building structure residual activity. From this point of view, the set up of an appropriate evaluation methodology is of crucial importance. The radiological characterization of contaminated premises can be divided into three steps. First, the most exhaustive facility analysis provides historical and qualitative information. Then, a systematic (exhaustive) control of the emergent signal is commonly performed using in situ measurement methods such as surface controls combined with in situ gamma spectrometry. Finally, in order to assess the contamination depth, samples are collected at several locations within the premises and analyzed. Combined with historical information and emergent signal maps, such data allow the definition of a preliminary waste zoning. The exhaustive control of the emergent signal with surface measurements usually leads to inaccurate estimates, because of several factors: varying position of the measuring device, subtraction of an estimate of the background signal, etc. In order to provide reliable estimates while avoiding supplementary investigation costs, there is therefore a crucial need for sampling optimization methods together with appropriate data processing techniques. The initial activity usually presents a spatial continuity within the premises, with preferential contamination of specific areas or existence of activity gradients. Taking into account this spatial continuity is essential to avoid bias while setting up the sampling plan. In such a case, Geostatistics provides methods that integrate the contamination spatial structure. After the characterization of this spatial structure, most probable estimates of the surface activity at un-sampled locations can be derived using kriging techniques. 
Variants of these techniques also give access to estimates of the uncertainty associated to the spatial

  10. Spectral analysis and filter theory in applied geophysics

    CERN Document Server

    Buttkus, Burkhard

    2000-01-01

    This book is intended to be an introduction to the fundamentals and methods of spectral analysis and filter theory and their applications in geophysics. The principles and theoretical basis of the various methods are described, their efficiency and effectiveness evaluated, and instructions provided for their practical application. Besides the conventional methods, newer methods are discussed, such as the spectral analysis of random processes by fitting models to the observed data, maximum-entropy spectral analysis and maximum-likelihood spectral analysis, the Wiener and Kalman filtering methods, homomorphic deconvolution, and adaptive methods for nonstationary processes. Multidimensional spectral analysis and filtering, as well as multichannel filters, are given extensive treatment. The book provides a survey of the state-of-the-art of spectral analysis and filter theory. The importance and possibilities of spectral analysis and filter theory in geophysics for data acquisition, processing an...

  11. A methodological approach for the geostatistical characterization of radiological contaminations in nuclear facilities

    International Nuclear Information System (INIS)

    The decommissioning of nuclear sites represents huge industrial and financial challenges. At the end of process equipment dismantling, the complete decontamination of nuclear facilities requires the radiological assessment of residual activity levels of the building structures (waste zoning). From this point of view, the set-up of an appropriate evaluation methodology is of prime importance. The developed radiological characterization is divided into three steps: historical and functional analysis, radiation mapping and in-depth investigations. Combined with relevant data analysis and processing tools, this methodology aims at optimizing the investigation costs and the radiological waste volumes. In a former CEA nuclear facility, a substantial radiation survey was performed at the beginning of the thesis in order to test the methodology. The relevance of geostatistics was also illustrated: exploratory data analysis, spatial structure analysis through the variogram, mapping, risk analysis, and iso-factorial modeling to take the structuring of extreme values into account. Destructive concrete samples are then integrated, in addition to radiation data, to complete the characterization through the geostatistical multivariate approach. Finally, waste segregation results from the risk of exceeding given radiological thresholds. An inventory of the spatial structures of radiological contaminations in nuclear facilities is provided within the geostatistical framework. It leads to sampling recommendations in order to improve the overall characterization strategy and optimize the investigation costs. (author)

  12. Assessment of effectiveness of geologic isolation systems: geostatistical modeling of pore velocity

    International Nuclear Information System (INIS)

    A significant part of evaluating a geologic formation as a nuclear waste repository involves the modeling of contaminant transport in the surrounding media in the event the repository is breached. The commonly used contaminant transport models are deterministic. However, the spatial variability of hydrologic field parameters introduces uncertainties into contaminant transport predictions. This paper discusses the application of geostatistical techniques to the modeling of spatially varying hydrologic field parameters required as input to contaminant transport analyses. Kriging estimation techniques were applied to Hanford Reservation field data to calculate hydraulic conductivity and ground-water potential gradients. These quantities were statistically combined to estimate the groundwater pore velocity and to characterize the pore velocity estimation error. Combining geostatistical modeling techniques with product error propagation techniques results in an effective stochastic characterization of groundwater pore velocity, a hydrologic parameter required for contaminant transport analyses.
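    The product error propagation step can be sketched with the first-order (delta-method) rule that relative variances of independent factors add. All values below are hypothetical, not Hanford data:

```python
import numpy as np

# Pore velocity v = K * i / n (conductivity K, head gradient i, porosity n).
K, sd_K = 1.0e-4, 2.0e-5      # m/s: kriging estimate and its standard error
i, sd_i = 1.0e-3, 1.5e-4      # dimensionless gradient and its standard error
n = 0.25                      # effective porosity, treated as known here

v = K * i / n
# First-order propagation for a product of independent factors:
# (sd_v / v)^2 ~= (sd_K / K)^2 + (sd_i / i)^2
rel_var = (sd_K / K) ** 2 + (sd_i / i) ** 2
sd_v = abs(v) * np.sqrt(rel_var)
```

    The kriging variance supplies `sd_K` and `sd_i` at each grid node, so the same arithmetic yields a pore-velocity error map.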

  13. Geostatistical simulations for radon indoor with a nested model including the housing factor.

    Science.gov (United States)

    Cafaro, C; Giovani, C; Garavaglia, M

    2016-01-01

    The definition of radon-prone areas is the subject of much research in radioecology, since radon is considered a leading cause of lung tumours; authorities therefore ask for support in developing an appropriate sanitary prevention strategy. In this paper, we use geostatistical tools to elaborate a definition accounting for some of the available information about the dwellings. Co-kriging is the proper geostatistical interpolator for refining predictions by using external covariates. A priori, co-kriging is not guaranteed to significantly improve the results obtained by common lognormal kriging. Here, however, the multivariate approach reduces the cross-validation residual variance to an extent which is deemed satisfying. Furthermore, with the application of Monte Carlo simulations, the paradigm provides a more conservative definition of radon-prone areas than the one previously obtained by lognormal kriging. PMID:26547362

  14. AGEFIS: Applied General Equilibrium for FIScal Policy Analysis

    OpenAIRE

    Arief Anshory Yusuf; Djoni Hartono; Wawan Hermawan; Yayan

    2008-01-01

    AGEFIS (Applied General Equilibrium model for FIScal Policy Analysis) is a Computable General Equilibrium (CGE) model designed specifically, but not exclusively, to analyze various aspects of fiscal policy in Indonesia. It is, to date, the first fully SAM-based Indonesian CGE model solved with Gempack. This paper describes the structure of the model and illustrates its application.

  15. Positive Behavior Support and Applied Behavior Analysis: A Familial Alliance

    Science.gov (United States)

    Dunlap, Glen; Carr, Edward G.; Horner, Robert H.; Zarcone, Jennifer R.; Schwartz, Ilene

    2008-01-01

    Positive behavior support (PBS) emerged in the mid-1980s as an approach for understanding and addressing problem behaviors. PBS was derived primarily from applied behavior analysis (ABA). Over time, however, PBS research and practice has incorporated evaluative methods, assessment and intervention procedures, and conceptual perspectives associated…

  16. Applied Behavior Analysis Is a Science And, Therefore, Progressive

    Science.gov (United States)

    Leaf, Justin B.; Leaf, Ronald; McEachin, John; Taubman, Mitchell; Ala'i-Rosales, Shahla; Ross, Robert K.; Smith, Tristram; Weiss, Mary Jane

    2016-01-01

    Applied behavior analysis (ABA) is a science and, therefore, involves progressive approaches and outcomes. In this commentary we argue that the spirit and the method of science should be maintained in order to avoid reductionist procedures, stifled innovation, and rote, unresponsive protocols that become increasingly removed from meaningful…

  17. B. F. Skinner's Contributions to Applied Behavior Analysis

    Science.gov (United States)

    Morris, Edward K.; Smith, Nathaniel G.; Altus, Deborah E.

    2005-01-01

    Our paper reviews and analyzes B. F. Skinner's contributions to applied behavior analysis in order to assess his role as the field's originator and founder. We found, first, that his contributions fall into five categories: the style and content of his science, his interpretations of typical and atypical human behavior, the implications he drew…

  18. Progressive-Ratio Schedules and Applied Behavior Analysis

    Science.gov (United States)

    Poling, Alan

    2010-01-01

    Establishing appropriate relations between the basic and applied areas of behavior analysis has been of long and persistent interest to the author. In this article, the author illustrates that there is a direct relation between how hard an organism will work for access to an object or activity, as indexed by the largest ratio completed under a…

  19. The spread of behavior analysis to the applied fields 1

    OpenAIRE

    Fraley, Lawrence E.

    1981-01-01

    This paper reviews the status of applied behavioral science as it exists in the various behavioral fields and considers the role of the Association for Behavior Analysis in serving those fields. The confounding effects of the traditions of psychology are discussed. Relevant issues are exemplified in the fields of law, communications, psychology, and education, but broader generalization is implied.

  20. Applying Frequency Map Analysis to the Australian Synchrotron Storage Ring

    CERN Document Server

    Tan, Yaw-Ren E; Le Blanc, Gregory Scott

    2005-01-01

    The technique of frequency map analysis has been applied to study the transverse dynamic aperture of the Australian Synchrotron Storage Ring. The results have been used to set the strengths of sextupoles to optimise the dynamic aperture. The effects of the allowed harmonics in the quadrupoles and dipole edge effects are discussed.

  1. Ore reserve evaluation, through geostatistical methods, in sector C-09, Pocos de Caldas, MG-Brazil

    International Nuclear Information System (INIS)

    In sector C-09, Pocos de Caldas in the state of Minas Gerais, geostatistical techniques have been used to evaluate the tonnage of U3O8 and associated minerals and to delimit ore from sterile areas. The calculation of reserves was based on borehole information, including the results of chemical and/or radiometric analysis. Two- and three-dimensional evaluations were made following the existing geological models. Initially, the evaluation was based on chemical analysis using the more classical geostatistical technique of kriging. This was followed by a second evaluation using the more recent technique of co-kriging, which permitted the incorporation of radiometric information in the calculations. The correlation between ore grade and radiometry was studied using the method of cross-covariance. Following restrictions imposed by mining considerations, a probabilistic selection was made of blocks of appropriate dimensions so as to evaluate the grade-tonnage curve for each panel. (Author)
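    The grade-tonnage curve mentioned above is simple arithmetic once block grades are estimated: for each cutoff, sum the tonnage of blocks above the cutoff and average their grades. A sketch with invented block values (not sector C-09 data):

```python
import numpy as np

block_grades = np.array([0.02, 0.05, 0.08, 0.11, 0.03, 0.07])  # % U3O8 per block
block_tonnes = 1000.0                                          # tonnes per block

def grade_tonnage(grades, cutoff):
    """Tonnage and mean grade of the blocks at or above the cutoff grade."""
    above = grades[grades >= cutoff]
    tonnage = len(above) * block_tonnes
    mean_grade = above.mean() if len(above) else 0.0
    return tonnage, mean_grade

tonnage, mean_grade = grade_tonnage(block_grades, cutoff=0.05)
```

    Repeating this over a range of cutoffs traces the full curve for a panel; the probabilistic selection in the paper replaces the hard threshold with the probability that a block exceeds it.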

  2. A geostatistical approach to predicting sulfur content in the Pittsburgh coal bed

    Energy Technology Data Exchange (ETDEWEB)

    Watson, William D. [US Department of Energy, Energy Information Administration, 950 L'Enfant Plaza South Building EI-52, 1000 Independence Avenue SW, 20585 Washington, DC (United States); Ruppert, Leslie F.; Bragg, Linda J.; Tewalt, Susan J. [US Geological Survey, National Center, MS 956, 20192 Reston, VA (United States)

    2001-12-01

    The US Geological Survey (USGS) is completing a national assessment of coal resources in the five top coal-producing regions in the US. Point-located data provide measurements on coal thickness and sulfur content. The sample data and their geologic interpretation represent the most regionally complete and up-to-date assessment of what is known about top-producing US coal beds. The sample data are analyzed using a combination of geologic and Geographic Information System (GIS) models to estimate tonnages and qualities of the coal beds. Traditionally, GIS practitioners use contouring to represent geographical patterns of 'similar' data values. The tonnage and grade of coal resources are then assessed by using the contour lines as references for interpolation. An assessment taken to this point is only indicative of resource quantity and quality. Data users may benefit from a statistical approach that would allow them to better understand the uncertainty and limitations of the sample data. To develop a quantitative approach, geostatistics were applied to the data on coal sulfur content from samples taken in the Pittsburgh coal bed (located in the eastern US, in the southwestern part of the state of Pennsylvania, and in adjoining areas in the states of Ohio and West Virginia). Geostatistical methods that account for regional and local trends were applied to blocks 2.7 mi (4.3 km) on a side. The data and geostatistics support conclusions concerning the average sulfur content and its degree of reliability at regional- and economic-block scale over the large, contiguous part of the Pittsburgh outcrop, but not to a mine scale. To validate the method, a comparison was made with the sulfur contents in sample data taken from 53 coal mines located in the study area. The comparison showed a high degree of similarity between the sulfur content in the mine samples and the sulfur content represented by the geostatistically derived contours.

  3. Structural reliability analysis applied to pipeline risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gardiner, M. [GL Industrial Services, Loughborough (United Kingdom); Mendes, Renato F.; Donato, Guilherme V.P. [PETROBRAS S.A., Rio de Janeiro, RJ (Brazil)

    2009-07-01

    Quantitative Risk Assessment (QRA) of pipelines requires two main components to be provided. These are models of the consequences that follow from some loss of containment incident, and models for the likelihood of such incidents occurring. This paper describes how PETROBRAS have used Structural Reliability Analysis for the second of these, to provide pipeline- and location-specific predictions of failure frequency for a number of pipeline assets. This paper presents an approach to estimating failure rates for liquid and gas pipelines, using Structural Reliability Analysis (SRA) to analyze the credible basic mechanisms of failure such as corrosion and mechanical damage. SRA is a probabilistic limit state method: for a given failure mechanism it quantifies the uncertainty in parameters to mathematical models of the load-resistance state of a structure and then evaluates the probability of load exceeding resistance. SRA can be used to benefit the pipeline risk management process by optimizing in-line inspection schedules, and as part of the design process for new construction in pipeline rights of way that already contain multiple lines. A case study is presented to show how the SRA approach has recently been used on PETROBRAS pipelines and the benefits obtained from it. (author)

  4. Research in applied mathematics, numerical analysis, and computer science

    Science.gov (United States)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.

  5. Statistics and geostatistics: Kriging and use of hemivariogram functions in the structural investigation of uranium deposits

    International Nuclear Information System (INIS)

    After presenting some general conceptual considerations regarding the theory of regionalized variables, the paper deals with specific applications of the intrinsic dispersion law to the determination, description and quantification of structures. It then briefly describes two uranium deposits in Cordoba province, the study of which yielded the basic data and parameters for compiling the geostatistical results presented. Before taking up the matter of structural interpretations, it refers briefly to the mathematical relationship between the number of sampling points available and the number of directions that can be investigated by the variogram method and also emphasizes the need for quantifying regionalization concepts on the basis of a table of absolute dimensionalities. In the case of the ''Rodolfo'' deposit it presents and comments on the hemivariograms for concentrations, thicknesses and accumulations, drawing attention at the same time to the existence of significant nest-like phenomena (gigogne structures). In this connection there is also a discussion of the case of iterative lenticular mineralization on a natural and a simulated model. The ''Schlagintweit'' deposit is dealt with in the same way, with descriptions and evaluations of the subjacent structures revealed by the hemivariographic analysis of grades, mineralization thicknesses and accumulations. This is followed by some considerations on the possibility of applying Krige and Matheron correctors in the moderation of anomalous mineralized thicknesses. In conclusion, the paper presents a ''range ellipse'' for grades; this is designed to supplement the grid of sampling points for the ''Rodolfo'' deposit by means of Matheronian kriging techniques. (author)

  6. The Split-Apply-Combine Strategy for Data Analysis

    Directory of Open Access Journals (Sweden)

    Hadley Wickham

    2011-04-01

    Full Text Available Many data analysis problems involve the application of a split-apply-combine strategy, where you break up a big problem into manageable pieces, operate on each piece independently and then put all the pieces back together. This insight gives rise to a new R package that allows you to smoothly apply this strategy, without having to worry about the type of structure in which your data is stored. The paper includes two case studies showing how these insights make it easier to work with batting records for veteran baseball players and a large 3D array of spatio-temporal ozone measurements.
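    The same strategy the paper implements for R can be sketched in Python with pandas as a stand-in (this is an analogy, not the paper's own code; the toy batting records are invented):

```python
import pandas as pd

# Toy batting records: split by player, apply a summary, combine into one table.
df = pd.DataFrame({
    "player": ["a", "a", "b", "b", "b"],
    "hits":   [10, 20, 5, 15, 10],
})
summary = (
    df.groupby("player")                                 # split
      .agg(total=("hits", "sum"), mean=("hits", "mean")) # apply
      .reset_index()                                     # combine
)
```

    Each group is processed independently, which is why the strategy parallelizes so naturally.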

  7. Interannual Changes in Biomass Affect the Spatial Aggregations of Anchovy and Sardine as Evidenced by Geostatistical and Spatial Indicators.

    Directory of Open Access Journals (Sweden)

    Marco Barra

    Full Text Available Geostatistical techniques were applied and a series of spatial indicators were calculated (occupation, aggregation, location, dispersion, spatial autocorrelation and overlap) to characterize the spatial distributions of European anchovy and sardine during summer. Two ecosystems were compared for this purpose, both located in the Mediterranean Sea: the Strait of Sicily (upwelling area) and the North Aegean Sea (continental shelf area, influenced by freshwater). Although the biomass of anchovy and sardine presented high interannual variability in both areas, the location of the centres of gravity and the main spatial patches of their populations were very similar between years. The size of the patches representing the dominant part of the abundance (80%) was mostly ecosystem- and species-specific. Occupation (area of presence) appears to be shaped by the extent of suitable habitats in each ecosystem whereas aggregation patterns (how the populations are distributed within the area of presence) were species-specific and related to levels of population biomass. In the upwelling area, both species showed consistently higher occupation values compared to the continental shelf area. Certain characteristics of the spatial distribution of sardine (e.g. spreading area, overlapping with anchovy) differed substantially between the two ecosystems. Principal component analysis of geostatistical and spatial indicators revealed that biomass was significantly related to a suite of, rather than single, spatial indicators. At the spatial scale of our study, strong correlations emerged between biomass and the first principal component axis with highly positive loadings for occupation, aggregation and patchiness, independently of species and ecosystem. Overlapping between anchovy and sardine increased with the increase of sardine biomass but decreased with the increase of anchovy. This contrasting pattern was attributed to the location of the respective major patches.
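    Two of the location/dispersion indicators used above follow standard definitions: the centre of gravity is the abundance-weighted mean position and the inertia is the abundance-weighted variance of positions. A sketch on toy acoustic-sample data (the positions and abundances are invented):

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0])   # sample positions, e.g. km east
y = np.array([0.0, 0.0, 1.0, 1.0])   # km north
z = np.array([1.0, 4.0, 4.0, 1.0])   # abundance at each sample

w = z / z.sum()                                        # abundance weights
cg = np.array([np.sum(w * x), np.sum(w * y)])          # centre of gravity
inertia = np.sum(w * ((x - cg[0]) ** 2 + (y - cg[1]) ** 2))  # dispersion
```

    Tracking `cg` and `inertia` year by year is what lets the paper say the centres of gravity stayed put while biomass fluctuated.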

  8. Simultaneous inversion of petrophysical parameters based on geostatistical a priori information

    Institute of Scientific and Technical Information of China (English)

    Yin Xing-Yao; Sun Rui-Ying; Wang Bao-Li; Zhang Guang-Zhi

    2014-01-01

    The high-resolution nonlinear simultaneous inversion of petrophysical parameters is based on Bayesian statistics and combines petrophysics with geostatistical a priori information. We used the fast Fourier transform-moving average (FFT-MA) and gradual deformation method (GDM) to obtain a reasonable variogram by using structural analysis and geostatistical a priori information of petrophysical parameters. Subsequently, we constructed the likelihood function according to the statistical petrophysical model. Finally, we used the Metropolis algorithm to sample the posterior probability density and complete the inversion of the petrophysical parameters. We used the proposed method to process data from an oil field in China and found a good, high-resolution match between inversion results and real data. In addition, the direct inversion of petrophysical parameters avoids error accumulation, decreases uncertainty, and increases computational efficiency.
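    The Metropolis sampling step can be sketched on a one-dimensional toy target (a standard normal); the paper's sampler works the same way but on a far richer petrophysical posterior:

```python
import numpy as np

rng = np.random.default_rng(0)

def log_post(theta):
    """Unnormalized log posterior density; toy N(0, 1) target."""
    return -0.5 * theta ** 2

theta = 0.0
samples = []
for _ in range(20000):
    prop = theta + rng.normal(scale=1.0)          # symmetric random-walk proposal
    # Accept with probability min(1, post(prop) / post(theta)).
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop
    samples.append(theta)
samples = np.array(samples[2000:])                # discard burn-in
```

    Because only ratios of the posterior appear, the normalizing constant never needs to be computed, which is what makes the approach practical for inversion.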

  9. Determination of 137Cs contamination depth distribution in building structures using geostatistical modeling of ISOCS measurements

    International Nuclear Information System (INIS)

    Decommissioning of nuclear building structures usually leads to large amounts of low level radioactive waste. Using a reliable method to determine the contamination depth is indispensable prior to the start of decontamination works and also for minimizing the radioactive waste volume and the total workload. The method described in this paper is based on geostatistical modeling of in situ gamma-ray spectroscopy measurements using the multiple photo peak method. The method has been tested on the floor of the waste gas surge tank room within the BR3 (Belgian Reactor 3) decommissioning project and has delivered adequate results. - Highlights: • 137Cs depth contamination was determined using the multiple photo peak method. • Geostatistical modeling was used to determine treatment depth areas and perform risk analysis. • Results were evaluated using laser scanning and long term gamma-ray spectroscopy. • Waste volume was reduced by about 1/3 compared to a more traditional approach.
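    The depth estimate behind such multi-peak methods rests on differential attenuation: a low-energy and a high-energy gamma line from the same buried source are attenuated differently, so their measured ratio encodes depth. An illustrative calculation (the coefficients and count ratios are made-up values, not BR3 data):

```python
import numpy as np

mu_low, mu_high = 0.60, 0.15   # linear attenuation in concrete, 1/cm (low/high energy)
R0 = 2.0                       # emission-rate ratio low/high at zero depth
R = 0.8                        # measured count-rate ratio

# R = R0 * exp(-(mu_low - mu_high) * d)  =>  d = ln(R0 / R) / (mu_low - mu_high)
d = np.log(R0 / R) / (mu_low - mu_high)   # contamination depth, cm
```

    The geostatistical model then interpolates such point depth estimates across the floor and turns them into treatment-depth areas with quantified risk.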

  10. Preliminary evaluation of uranium deposits. A geostatistical study of drilling density in Wyoming solution fronts

    International Nuclear Information System (INIS)

    Studies of a roll-front uranium deposit in Shirley Basin, Wyoming, indicate that preliminary evaluation of the reserve potential of an ore body is possible with less drilling than currently practiced in industry. Estimating ore reserves from sparse drilling is difficult because most reserve calculation techniques do not give the accuracy of the estimate. A study of several deposits with a variety of drilling densities shows that geostatistics consistently provides a method of assessing the accuracy of an ore reserve estimate. Geostatistics provides the geologist with an additional descriptive technique - one which is valuable in the economic assessment of a uranium deposit. Closely spaced drilling on past properties provides both geological and geometric insight into the occurrence of uranium in roll-front type deposits. Just as the geological insight assists in locating new ore bodies and siting preferential drill locations, the geometric insight can be applied mathematically to evaluate the accuracy of a new ore reserve estimate. By expressing the geometry in numerical terms, geostatistics extracts important geological characteristics and uses this information to aid in describing the unknown characteristics of a property. (author)
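    The "accuracy of the estimate" that geostatistics supplies is the kriging variance. A minimal ordinary-kriging sketch for one target block from three drill holes, assuming NumPy; the exponential covariance model and all numbers are illustrative:

```python
import numpy as np

def cov(h, sill=1.0, rng_=10.0):
    """Exponential covariance model (illustrative parameters)."""
    return sill * np.exp(-h / rng_)

pts = np.array([[0.0, 0.0], [5.0, 0.0], [0.0, 5.0]])   # drill-hole locations
z = np.array([1.2, 0.8, 1.0])                          # grades
target = np.array([2.0, 2.0])                          # block to estimate

n = len(pts)
# Ordinary kriging system: data covariances bordered by the unbiasedness row.
A = np.ones((n + 1, n + 1))
A[:n, :n] = cov(np.linalg.norm(pts[:, None] - pts[None, :], axis=-1))
A[n, n] = 0.0
b = np.ones(n + 1)
b[:n] = cov(np.linalg.norm(pts - target, axis=1))

sol = np.linalg.solve(A, b)
weights, mu = sol[:n], sol[n]        # kriging weights and Lagrange multiplier
estimate = weights @ z
krig_var = cov(0.0) - b[:n] @ weights - mu   # estimation variance
```

    Repeating this as holes are added shows directly how fast `krig_var` shrinks with drilling density, which is the basis for drilling less.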

  11. Boston Society's 11th Annual Applied Pharmaceutical Analysis conference.

    Science.gov (United States)

    Lee, Violet; Liu, Ang; Groeber, Elizabeth; Moghaddam, Mehran; Schiller, James; Tweed, Joseph A; Walker, Gregory S

    2016-02-01

    Boston Society's 11th Annual Applied Pharmaceutical Analysis conference, Hyatt Regency Hotel, Cambridge, MA, USA, 14-16 September 2015. The Boston Society's 11th Annual Applied Pharmaceutical Analysis (APA) conference took place at the Hyatt Regency hotel in Cambridge, MA, on 14-16 September 2015. The 3-day conference affords pharmaceutical professionals, academic researchers and industry regulators the opportunity to collectively participate in meaningful and relevant discussions impacting the areas of pharmaceutical drug development. The APA conference was organized in three workshops encompassing the disciplines of regulated bioanalysis, discovery bioanalysis (encompassing new and emerging technologies) and biotransformation. The conference included a short course titled 'Bioanalytical considerations for the clinical development of antibody-drug conjugates (ADCs)', an engaging poster session, several panel and round table discussions and over 50 diverse talks from leading industry and academic scientists. PMID:26853375

  12. Geostatistical inference using crosshole ground-penetrating radar

    DEFF Research Database (Denmark)

    Looms, Majken C; Hansen, Thomas Mejer; Cordua, Knud Skou; Nielsen, Lars; Jensen, Karsten H.; Binley, Andrew

    2010-01-01

    High-resolution tomographic images obtained from crosshole geophysical measurements have the potential to provide valuable information about the geostatistical properties of unsaturated-zone hydrologic-state variables such as moisture content. Under drained or quasi-steady-state conditions, the...... wave velocity structure, which may diminish the utility of these images for geostatistical inference. We have used a linearized stochastic inversion technique to infer the geostatistical properties of the subsurface radar wave velocity distribution using crosshole GPR traveltimes directly. Expanding on...... realizations of the subsurface are used to evaluate the uncertainty of the inversion estimate. We have explored the full potential of the geostatistical inference method using several synthetic models of varying correlation structures and have tested the influence of different assumptions concerning the choice...

  13. Numerical continuation applied to landing gear mechanism analysis

    OpenAIRE

    Knowles, J.; Krauskopf, B; Lowenberg, MH

    2010-01-01

    A method of investigating quasi-static mechanisms is presented and applied to an overcentre mechanism and to a nose landing gear mechanism. The method uses static equilibrium equations along with equations describing the geometric constraints in the mechanism. In the spirit of bifurcation analysis, solutions to these steady-state equations are then continued numerically in parameters of interest. Results obtained from the bifurcation method agree with the equivalent results obtained from two ...

  14. Emerging Opportunities in Higher Education : Applied Behavior Analysis and Autism

    OpenAIRE

    Ala'i-Rosales, Shahla; Roll-Pettersson, Lise; Pinkelman, Sarah

    2010-01-01

    The growing number of children diagnosed with autism and the recognized importance of evidence-based interventions have substantially increased the need for well-trained applied behavior analysts. Relative to public/consumer demand, there are very few higher education programs that are equipped to train behavior analysts specializing in autism. Worldwide, there are only a few programs accredited by Association for Behavior Analysis International (ABAI), that have course sequences approved by...

  15. B. F. Skinner's contributions to applied behavior analysis

    OpenAIRE

    Morris, Edward K.; Smith, Nathaniel G; Altus, Deborah E

    2005-01-01

    Our paper reviews and analyzes B. F. Skinner's contributions to applied behavior analysis in order to assess his role as the field's originator and founder. We found, first, that his contributions fall into five categories: the style and content of his science, his interpretations of typical and atypical human behavior, the implications he drew from his science for application, his descriptions of possible applications, and his own applications to nonhuman and human behavior. Second, we foun...

  16. Developing an interdisciplinary master's program in applied behavior analysis

    OpenAIRE

    Lowenkron, Barry; Mitchell, Lynda

    1995-01-01

    At many universities, faculty interested in behavior analysis are spread across disciplines. This makes it difficult to develop behavior-analytically oriented programs and impedes regular contact among colleagues who share common interests. However, this separation by disciplines can be a source of strength if it is used to develop interdisciplinary programs. In this article we describe how a bottom-up strategy was used to develop two complementary interdisciplinary MS programs in appli...

  17. Wavelet Analysis Applied to the Research on Heroin Detection

    International Nuclear Information System (INIS)

    Wavelet analysis is applied to process the energy spectrum signal from drug detection by energy-scattering X-ray. In the paper, we put forward a new adaptive correlation denoising algorithm, which achieves a good filtering effect. Also, a simple and effective peak-seeking method is designed, which can locate both strong and weak peaks. Finally, comparison of the experimental results with XRD and PDF data shows a low relative error. (authors)
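    A minimal sketch of threshold-based peak seeking of the kind described, assuming NumPy; the rule (a point is a peak if it exceeds both neighbours and a noise-dependent threshold) and the synthetic spectrum are ours, not the paper's algorithm:

```python
import numpy as np

def find_peaks(y, k=2.5):
    """Indices of local maxima exceeding mean + k * std of the signal."""
    y = np.asarray(y, float)
    thresh = y.mean() + k * y.std()
    idx = np.where((y[1:-1] > y[:-2]) &
                   (y[1:-1] > y[2:]) &
                   (y[1:-1] > thresh))[0] + 1
    return idx

# Synthetic spectrum: one strong and one weak Gaussian line, no noise.
x = np.linspace(0, 10, 500)
spectrum = (np.exp(-((x - 3) ** 2) / 0.01) +
            0.5 * np.exp(-((x - 7) ** 2) / 0.01))
peaks = find_peaks(spectrum, k=2.5)
```

    On denoised data a simple rule like this suffices; the adaptive correlation denoising step is what makes the weak peaks stand clear of the threshold.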

  18. An applied ethics analysis of best practice tourism entrepreneurs

    OpenAIRE

    Power, Susann

    2015-01-01

    Ethical entrepreneurship and, by extension, wider best practice are noble goals for the future of tourism. However, questions arise as to which concepts, such as values, motivations, actions and challenges, underpin these goals. This thesis seeks to answer these questions and, in so doing, develop an applied ethics analysis for best practice entrepreneurs in tourism. The research is situated in sustainable tourism, which is ethically very complex and has thus far been dominated by the economic, social a...

  19. Recent reinforcement-schedule research and applied behavior analysis

    OpenAIRE

    Lattal, Kennon A.; Neef, Nancy A.

    1996-01-01

    Reinforcement schedules are considered in relation to applied behavior analysis by examining several recent laboratory experiments with humans and other animals. The experiments are drawn from three areas of contemporary schedule research: behavioral history effects on schedule performance, the role of instructions in schedule performance of humans, and dynamic schedules of reinforcement. All of the experiments are discussed in relation to the role of behavioral history in current schedule pe...

  20. Applied behavior analysis at West Virginia University: A brief history

    OpenAIRE

    Hawkins, Robert P.; Chase, Philip N.; Scotti, Joseph R.

    1993-01-01

    The development of an emphasis on applied behavior analysis in the Department of Psychology at West Virginia University is traced. The emphasis began primarily in the early 1970s, under the leadership of Roger Maley and Jon Krapfl, and has continued to expand and evolve with the participation of numerous behavior analysts and behavior therapists, both inside and outside the department. The development has been facilitated by several factors: establishment of a strong behavioral emphasis in th...

  1. Apply Functional Modelling to Consequence Analysis in Supervision Systems

    OpenAIRE

    Zhang, Xinxin; Lind, Morten; Gola, Giulio; Ravn, Ole

    2013-01-01

    This paper will first present the purpose and goals of applying a functional modelling approach to consequence analysis by adopting Multilevel Flow Modelling (MFM). MFM models describe a complex system at multiple abstraction levels in both the means-end dimension and the whole-part dimension, and contain causal relations between functions and goals. A rule-based system can be developed to trace the causal relations and perform consequence propagation. This paper will illustrate how to use MFM for cons...
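    Tracing causal relations to propagate consequences is, at its core, reachability over a causal graph. A toy sketch in that spirit (the graph, event names, and states are invented and far simpler than an MFM model):

```python
# Causal relations: each event maps to the events it directly causes.
causes = {
    "pump_stops": ["flow_low"],
    "flow_low": ["tank_level_low", "cooling_degraded"],
    "cooling_degraded": ["temp_high"],
}

def propagate(event):
    """Collect every downstream consequence reachable from the given event."""
    seen, stack = [], [event]
    while stack:
        e = stack.pop()
        for c in causes.get(e, []):
            if c not in seen:
                seen.append(c)
                stack.append(c)
    return seen

consequences = propagate("pump_stops")
```

    An MFM rule base adds semantics to the edges (means-end, whole-part, influence types), but the propagation machinery follows this same pattern.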

  2. Applying DEA sensitivity analysis to efficiency measurement of Vietnamese universities

    Directory of Open Access Journals (Sweden)

    Thi Thanh Huyen Nguyen

    2015-11-01

    Full Text Available The primary purpose of this study is to measure the technical efficiency of 30 doctorate-granting universities (universities or higher education institutes with PhD training programs) in Vietnam, applying the sensitivity analysis of data envelopment analysis (DEA). The study uses eight input-output specifications, obtained by replacing as well as aggregating/disaggregating variables. The measurement results allow us to examine the sensitivity of the efficiency of these universities to the sets of variables. The findings also show the impact of variables on their efficiency and its “sustainability”.
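    The underlying DEA computation solves one small linear program per decision-making unit (DMU). A sketch of the input-oriented CCR envelopment model using SciPy's `linprog`, with an invented toy data set (one input, one output), not the paper's Vietnamese sample:

```python
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0], [4.0], [4.0]])   # inputs, one row per DMU
Y = np.array([[1.0], [2.0], [1.0]])   # outputs, one row per DMU

def ccr_efficiency(o):
    """Input-oriented CCR efficiency of DMU o: min theta s.t. a convex
    combination of peers uses <= theta * inputs and >= outputs of DMU o."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.r_[1.0, np.zeros(n)]                     # variables: [theta, lambdas]
    A_in = np.hstack([-X[o].reshape(m, 1), X.T])    # X^T lam - theta*x_o <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])     # -Y^T lam <= -y_o
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[o]],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

eff = [ccr_efficiency(o) for o in range(len(X))]
```

    Efficiency 1 marks a DMU on the frontier; sensitivity analysis reruns this under each of the eight input-output specifications and compares the scores.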

  3. Magnetic Solid Phase Extraction Applied to Food Analysis

    Directory of Open Access Journals (Sweden)

    Israel S. Ibarra

    2015-01-01

    Full Text Available Magnetic solid phase extraction has been used as pretreatment technique for the analysis of several compounds because of its advantages when it is compared with classic methods. This methodology is based on the use of magnetic solids as adsorbents for preconcentration of different analytes from complex matrices. Magnetic solid phase extraction minimizes the use of additional steps such as precipitation, centrifugation, and filtration which decreases the manipulation of the sample. In this review, we describe the main procedures used for synthesis, characterization, and application of this pretreatment technique which were applied in food analysis.

  4. Harmonic and applied analysis from groups to signals

    CERN Document Server

    Mari, Filippo; Grohs, Philipp; Labate, Demetrio

    2015-01-01

    This contributed volume explores the connection between the theoretical aspects of harmonic analysis and the construction of advanced multiscale representations that have emerged in signal and image processing. It highlights some of the most promising mathematical developments in harmonic analysis in the last decade brought about by the interplay among different areas of abstract and applied mathematics. This intertwining of ideas is considered starting from the theory of unitary group representations and leading to the construction of very efficient schemes for the analysis of multidimensional data. After an introductory chapter surveying the scientific significance of classical and more advanced multiscale methods, chapters cover such topics as: an overview of Lie theory focused on common applications in signal analysis, including the wavelet representation of the affine group, the Schrödinger representation of the Heisenberg group, and the metaplectic representation of the symplectic group; an introduction ...

  5. Applied behavior analysis at West Virginia University: A brief history.

    Science.gov (United States)

    Hawkins, R P; Chase, P N; Scotti, J R

    1993-01-01

    The development of an emphasis on applied behavior analysis in the Department of Psychology at West Virginia University is traced. The emphasis began primarily in the early 1970s, under the leadership of Roger Maley and Jon Krapfl, and has continued to expand and evolve with the participation of numerous behavior analysts and behavior therapists, both inside and outside the department. The development has been facilitated by several factors: establishment of a strong behavioral emphasis in the three Clinical graduate programs; change of the graduate program in Experimental Psychology to a program in basic Behavior Analysis; development of nonclinical applied behavior analysis within the Behavior Analysis program; establishment of a joint graduate program with Educational Psychology; establishment of a Community/Systems graduate program; and organization of numerous conferences. Several factors are described that seem to assure a stable role for behavior analysis in the department: a stable and supportive "culture" within the department; American Psychological Association accreditation of the clinical training; a good reputation both within the university and in psychology; and a broader community of behavior analysts and behavior therapists. PMID:16795816

  6. A conceptual sedimentological-geostatistical model of aquifer heterogeneity based on outcrop studies

    Energy Technology Data Exchange (ETDEWEB)

    Davis, J.M.

    1994-01-01

    Three outcrop studies were conducted in deposits of different depositional environments. At each site, permeability measurements were obtained with an air-minipermeameter developed as part of this study. In addition, the geological units were mapped with either surveying, photographs, or both. Geostatistical analysis of the permeability data was performed to estimate the characteristics of the probability distribution function and the spatial correlation structure. The information obtained from the geological mapping was then compared with the results of the geostatistical analysis for any relationships that may exist. The main field site was located in the Albuquerque Basin of central New Mexico at an outcrop of the Pliocene-Pleistocene Sierra Ladrones Formation. The second study was conducted on the walls of waste pits in alluvial fan deposits at the Nevada Test Site. The third study was conducted on an outcrop of an eolian deposit (Miocene) south of Socorro, New Mexico. The results of the three studies were then used to construct a conceptual model relating depositional environment to geostatistical models of heterogeneity. The model presented is largely qualitative but provides a basis for further hypothesis formulation and testing.
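    The permeability analysis described above rests on estimating a semivariogram from spatially indexed measurements. A minimal sketch of the classical (Matheron) estimator on a synthetic 1-D transect (all data and bin choices below are hypothetical, not the study's):

```python
import numpy as np

def empirical_semivariogram(x, z, lags, tol):
    """Classical (Matheron) semivariance estimator for 1-D sample positions x
    and values z, averaged within distance bins centred on `lags`."""
    x = np.asarray(x, float)
    z = np.asarray(z, float)
    # all pairwise separation distances and squared increments
    d = np.abs(x[:, None] - x[None, :])
    sq = (z[:, None] - z[None, :]) ** 2
    iu = np.triu_indices(len(x), k=1)   # count each pair once
    d, sq = d[iu], sq[iu]
    gamma = []
    for h in lags:
        m = np.abs(d - h) <= tol
        gamma.append(0.5 * sq[m].mean() if m.any() else np.nan)
    return np.array(gamma)

# synthetic transect: a smooth spatial signal plus measurement noise
rng = np.random.default_rng(0)
x = np.arange(0, 50, 1.0)
z = np.sin(x / 8.0) + 0.1 * rng.standard_normal(x.size)
lags = np.array([1.0, 2.0, 4.0, 8.0])
gamma = empirical_semivariogram(x, z, lags, tol=0.5)
```

    For spatially correlated data the estimated semivariance rises with lag distance, which is what a fitted model (spherical, exponential, ...) then summarizes.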

  7. A conceptual sedimentological-geostatistical model of aquifer heterogeneity based on outcrop studies

    International Nuclear Information System (INIS)

    Three outcrop studies were conducted in deposits of different depositional environments. At each site, permeability measurements were obtained with an air-minipermeameter developed as part of this study. In addition, the geological units were mapped with either surveying, photographs, or both. Geostatistical analysis of the permeability data was performed to estimate the characteristics of the probability distribution function and the spatial correlation structure. The information obtained from the geological mapping was then compared with the results of the geostatistical analysis for any relationships that may exist. The main field site was located in the Albuquerque Basin of central New Mexico at an outcrop of the Pliocene-Pleistocene Sierra Ladrones Formation. The second study was conducted on the walls of waste pits in alluvial fan deposits at the Nevada Test Site. The third study was conducted on an outcrop of an eolian deposit (Miocene) south of Socorro, New Mexico. The results of the three studies were then used to construct a conceptual model relating depositional environment to geostatistical models of heterogeneity. The model presented is largely qualitative but provides a basis for further hypothesis formulation and testing.

  8. Optimizing Groundwater Monitoring Networks Using Integrated Statistical and Geostatistical Approaches

    Directory of Open Access Journals (Sweden)

    Jay Krishna Thakur

    2015-08-01

    Full Text Available The aim of this work is to investigate new approaches using methods based on statistics and geostatistics for the spatio-temporal optimization of groundwater monitoring networks. The formulated and integrated methods were tested with the groundwater quality data set of Bitterfeld/Wolfen, Germany. Spatially, the monitoring network was optimized using geostatistical methods. Temporal optimization of the monitoring network was carried out using Sen’s method (1968). For geostatistical network optimization, a geostatistical spatio-temporal algorithm was used to identify redundant wells in 2- and 2.5-D Quaternary and Tertiary aquifers. Influences of interpolation block width, dimension, contaminant association, groundwater flow direction and aquifer homogeneity on statistical and geostatistical methods for monitoring network optimization were analysed. The integrated approach shows 37% and 28% redundancy in the monitoring networks in the Quaternary and Tertiary aquifers, respectively. The geostatistical method also recommends 41 and 22 new monitoring wells in the Quaternary and Tertiary aquifers, respectively. In the temporal optimization, an overall optimized sampling interval was recommended in terms of the lower quartile (238 days), median (317 days) and upper quartile (401 days) in the research area of Bitterfeld/Wolfen. The demonstrated methods for improving groundwater monitoring networks can be used in real monitoring network optimization, with due consideration given to the influencing factors.
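    The temporal optimization above uses Sen's (1968) slope estimator: the median of the slopes over all pairs of observations, a robust trend measure. A small sketch on a hypothetical declining concentration series (times and values invented for illustration):

```python
import numpy as np

def sens_slope(t, y):
    """Sen's (1968) slope estimator: the median of the slopes computed
    between every pair of samples. Robust to outliers and usable on an
    irregular sampling time axis."""
    t = np.asarray(t, float)
    y = np.asarray(y, float)
    i, j = np.triu_indices(len(t), k=1)
    slopes = (y[j] - y[i]) / (t[j] - t[i])
    return np.median(slopes)

# hypothetical concentration series (days, mg/L) with a declining trend
t = np.array([0.0, 90.0, 200.0, 290.0, 400.0, 520.0])
y = np.array([10.2, 9.6, 9.1, 8.4, 7.9, 7.1])
slope = sens_slope(t, y)   # mg/L per day; negative for a declining plume
```

    The sign and magnitude of the Sen slope can then inform how frequently a well actually needs to be sampled.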

  9. Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling

    Science.gov (United States)

    Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2005-01-01

    Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).
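    The probabilistic approach described replaces a single deterministic run with many runs over sampled inputs, turning one power number into a capability distribution. A toy Monte Carlo sketch of the idea (the model, distributions and numbers below are invented stand-ins, not SPACE or ISS values):

```python
import numpy as np

def power_capability(solar_eff, array_area, insolation, losses):
    """Toy stand-in for a power-capability model: output in watts."""
    return solar_eff * array_area * insolation * (1.0 - losses)

rng = np.random.default_rng(42)
n = 20_000
# sample uncertain inputs instead of fixing them at nominal values
eff   = rng.normal(0.14, 0.005, n)     # cell efficiency (hypothetical)
area  = rng.normal(375.0, 2.0, n)      # array area, m^2 (hypothetical)
insol = rng.normal(1367.0, 5.0, n)     # insolation, W/m^2
loss  = rng.uniform(0.08, 0.12, n)     # distribution losses

p = power_capability(eff, area, insol, loss)
p_nominal = power_capability(0.14, 375.0, 1367.0, 0.10)  # deterministic answer
p5, p95 = np.percentile(p, [5, 95])    # capability band, not a single number
```

    The deterministic result falls inside the band, but the band itself is what the single-valued analysis could not provide.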

  10. Tomogram-based comparison of geostatistical models: Application to the Macrodispersion Experiment (MADE) site

    Science.gov (United States)

    Linde, Niklas; Lochbühler, Tobias; Dogan, Mine; Van Dam, Remke L.

    2015-12-01

    We propose a new framework to compare alternative geostatistical descriptions of a given site. Multiple realizations of each of the considered geostatistical models and their corresponding tomograms (based on inversion of noise-contaminated simulated data) are used as a multivariate training image. The training image is scanned with a direct sampling algorithm to obtain conditional realizations of hydraulic conductivity that are not only in agreement with the geostatistical model, but also honor the spatially varying resolution of the site-specific tomogram. Model comparison is based on the quality of the simulated geophysical data from the ensemble of conditional realizations. The tomogram in this study is obtained by inversion of cross-hole ground-penetrating radar (GPR) first-arrival travel time data acquired at the MAcro-Dispersion Experiment (MADE) site in Mississippi (USA). Various heterogeneity descriptions ranging from multi-Gaussian fields to fields with complex multiple-point statistics inferred from outcrops are considered. Under the assumption that the relationship between porosity and hydraulic conductivity inferred from local measurements is valid, we find that conditioned multi-Gaussian realizations and derivatives thereof can explain the cross-hole geophysical data. A training image based on an aquifer analog from Germany was found to be in better agreement with the geophysical data than the one based on the local outcrop, which appears to under-represent high hydraulic conductivity zones. These findings are only based on the information content in a single resolution-limited tomogram and extending the analysis to tracer or higher resolution surface GPR data might lead to different conclusions (e.g., that discrete facies boundaries are necessary). Our framework makes it possible to identify inadequate geostatistical models and petrophysical relationships, effectively narrowing the space of possible heterogeneity representations.

  11. Geostatistical integration and uncertainty in pollutant concentration surface under preferential sampling

    Directory of Open Access Journals (Sweden)

    Laura Grisotto

    2016-04-01

    Full Text Available In this paper the focus is on environmental statistics, with the aim of estimating the concentration surface and related uncertainty of an air pollutant. We used air quality data recorded by a network of monitoring stations within a Bayesian framework to overcome difficulties in accounting for prediction uncertainty and to integrate information provided by deterministic models based on emissions, meteorology and chemico-physical characteristics of the atmosphere. Several authors have proposed such integration, but all the proposed approaches rely on the representativeness and completeness of existing air pollution monitoring networks. We considered the situation in which the spatial process of interest and the sampling locations are not independent. This is known in the literature as the preferential sampling problem, which, if ignored in the analysis, can bias geostatistical inferences. We developed a Bayesian geostatistical model to account for preferential sampling with the main interest in statistical integration and uncertainty. We used PM10 data arising from the air quality network of the Environmental Protection Agency of Lombardy Region (Italy) and numerical outputs from the deterministic model. We specified an inhomogeneous Poisson process for the sampling location intensities and a shared spatial random component model for the dependence between the spatial location of monitors and the pollution surface. We found greater predicted standard deviation differences in areas not properly covered by the air quality network. In conclusion, in this context inferences on prediction uncertainty may be misleading when geostatistical modelling does not take into account preferential sampling.
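    The bias that preferential sampling can introduce is easy to demonstrate by simulation: if monitors sit preferentially where pollution is high, the naive network mean overestimates the regional mean. A sketch with an invented 1-D pollution surface (not the Lombardy data):

```python
import numpy as np

rng = np.random.default_rng(1)

# invented pollution surface on a 1-D transect: high near a source at s = 0
s = np.linspace(0.0, 10.0, 201)
surface = 50.0 * np.exp(-s / 3.0) + 10.0          # true mean level along s
true_mean = surface.mean()

# preferential design: monitors placed mostly where pollution is high
p = surface / surface.sum()
pref_idx = rng.choice(s.size, size=25, replace=False, p=p)
# uniform design for comparison
unif_idx = rng.choice(s.size, size=25, replace=False)

noise = rng.normal(0.0, 1.0, 25)                  # measurement noise
pref_mean = (surface[pref_idx] + noise).mean()
unif_mean = (surface[unif_idx] + noise).mean()

bias_pref = pref_mean - true_mean   # positive: the network oversamples hot spots
bias_unif = unif_mean - true_mean   # fluctuates around zero
```

    A model that jointly describes the monitor locations and the surface, as in the paper, is what removes this kind of bias.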

  12. Geostatistical integration and uncertainty in pollutant concentration surface under preferential sampling.

    Science.gov (United States)

    Grisotto, Laura; Consonni, Dario; Cecconi, Lorenzo; Catelan, Dolores; Lagazio, Corrado; Bertazzi, Pier Alberto; Baccini, Michela; Biggeri, Annibale

    2016-01-01

    In this paper the focus is on environmental statistics, with the aim of estimating the concentration surface and related uncertainty of an air pollutant. We used air quality data recorded by a network of monitoring stations within a Bayesian framework to overcome difficulties in accounting for prediction uncertainty and to integrate information provided by deterministic models based on emissions, meteorology and chemico-physical characteristics of the atmosphere. Several authors have proposed such integration, but all the proposed approaches rely on the representativeness and completeness of existing air pollution monitoring networks. We considered the situation in which the spatial process of interest and the sampling locations are not independent. This is known in the literature as the preferential sampling problem, which, if ignored in the analysis, can bias geostatistical inferences. We developed a Bayesian geostatistical model to account for preferential sampling with the main interest in statistical integration and uncertainty. We used PM10 data arising from the air quality network of the Environmental Protection Agency of Lombardy Region (Italy) and numerical outputs from the deterministic model. We specified an inhomogeneous Poisson process for the sampling location intensities and a shared spatial random component model for the dependence between the spatial location of monitors and the pollution surface. We found greater predicted standard deviation differences in areas not properly covered by the air quality network. In conclusion, in this context inferences on prediction uncertainty may be misleading when geostatistical modelling does not take into account preferential sampling. PMID:27087040

  13. Computer tomography data on soil structural and hydraulic parameters assessed for spatial continuity by semivariance geostatistics

    International Nuclear Information System (INIS)

    Visual observations on the spatial distribution, at 1-cm intervals, of bulk density (ρ), porosity (ε), fractal dimension (D), water content (θ), and unsaturated hydraulic conductivity (Kus) in uniformly packed soil columns showed randomness. We explore the use of semivariance geostatistics to clarify the issue of randomness and continuity in the spatial distribution of ρ, ε, D, θ, and Kus data obtained using a custom-built gamma scanner and computed tomography technique. Semivariance increased with increasing lag distance, and plots of semivariance versus lag distance produced spherical semivariograms for most of the soil parameters investigated. This indicated that even though randomness existed in the spatial distribution of the soil parameters, there were specific trends in their spatial continuity. Higher spatial continuity, in water-stable aggregates, was characterised by smaller values of semi-, sill- and nugget-variances and larger values of span. Opposite trends were observed for unstable aggregates. Wetting in unstable aggregates produced further reductions in span and increases in the other geostatistical parameters, indicating that wetting decreased spatial continuity. The results indicate that geostatistical analysis is useful to clarify the issue of randomness at very small scales and to quantify and discriminate the influence of differences in structural stability and wetting-induced changes in the spatial continuity of soil parameters, particularly ε. Copyright (1998) CSIRO Australia
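    The spherical semivariogram referred to above is fully described by a nugget, a sill and a span (range). A sketch of the model, with hypothetical parameter values echoing the stable/unstable-aggregate contrast described (the numbers are illustrative, not the study's):

```python
import numpy as np

def spherical(h, nugget, sill, span):
    """Spherical semivariogram: rises from the nugget and levels off at the
    sill once the lag h reaches the span (range)."""
    h = np.asarray(h, float)
    g = nugget + (sill - nugget) * (1.5 * h / span - 0.5 * (h / span) ** 3)
    return np.where(h >= span, sill, g)

h = np.linspace(0.0, 10.0, 101)
# stable aggregates: small nugget and sill, large span (high continuity)
g_stable = spherical(h, nugget=0.01, sill=0.05, span=8.0)
# unstable aggregates: larger variances, shorter span (low continuity)
g_unstable = spherical(h, nugget=0.05, sill=0.20, span=3.0)
```

    Comparing the fitted nugget, sill and span across treatments is exactly how the abstract discriminates structural stability and wetting effects.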

  14. Cladistic analysis applied to the classification of volcanoes

    Science.gov (United States)

    Hone, D. W. E.; Mahony, S. H.; Sparks, R. S. J.; Martin, K. T.

    2007-11-01

    Cladistics is a systematic method of classification that groups entities on the basis of sharing similar characteristics in the most parsimonious manner. Here cladistics is applied to the classification of volcanoes using a dataset of 59 Quaternary volcanoes and 129 volcanic edifices of the Tohoku region, Northeast Japan. Volcano and edifice characteristics recorded in the database include attributes of volcano size, chemical composition, dominant eruptive products, volcano morphology, dominant landforms, volcano age and eruptive history. Without characteristics related to time the volcanic edifices divide into two groups, with characters related to volcano size, dominant composition and edifice morphology being the most diagnostic. Analysis including time-based characteristics yields four groups, with a good correlation between these groups and the two groups from the analysis without time for 108 out of 129 volcanic edifices. Thus when characters are slightly changed the volcanoes still form similar groupings. Analysis of the volcanoes both with and without time yields three groups based on compositional, eruptive-product and morphological characters. Spatial clusters of volcanic centres have been recognised in the Tohoku region by Tamura et al. (Earth Planet Sci Lett 197:105-106, 2002). The groups identified by cladistic analysis are distributed unevenly between the clusters, indicating a tendency for individual clusters to form similar kinds of volcanoes with distinctive but coherent styles of volcanism. The uneven distribution of volcano types between clusters can be explained by variations in dominant magma compositions through time, which are reflected in eruption products and volcanic landforms. Cladistic analysis can be a useful tool for elucidating dynamic igneous processes and could be applied to other regions and globally. Our exploratory study indicates that cladistics has promise as a method for classifying volcanoes and potentially elucidating dynamic igneous processes.

  15. Study on geological environment model using geostatistics method

    International Nuclear Information System (INIS)

    The purpose of this study is to develop a geostatistical procedure for modeling geological environments and to evaluate the quantitative relationship between the amount of information and the reliability of the model, using the data sets obtained in the surface-based investigation phase (Phase 1) of the Horonobe Underground Research Laboratory Project. The study runs for three years, from FY2004 to FY2006, and this report covers the research in FY2005, the second year of the three-year study. In the FY2005 research, the hydrogeological model was built, as in the FY2004 research, using the data obtained from the deep boreholes (HDB-6, 7 and 8) and the ground magnetotelluric (AMT) survey executed in FY2004, in addition to the data sets used in the first year of the study. Above all, the relationship between the amount of information and the reliability of the model was demonstrated through a comparison of the models at each step, each of which corresponds to the investigation stage in a given fiscal year. Furthermore, a statistical test was applied to detect differences in the basic statistics of various data due to geological features, with a view to incorporating the geological information into the modeling procedures. (author)
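    The statistical testing mentioned for detecting differences in basic statistics between geological features can be sketched with standard two-sample tests (the unit names, distributions and values below are hypothetical, not Horonobe data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)

# hypothetical log10 hydraulic conductivity samples from two geological units
unit_a = rng.normal(-7.5, 0.4, 60)   # e.g. a sandier horizon
unit_b = rng.normal(-8.3, 0.5, 45)   # e.g. a muddier horizon

# do the basic statistics differ between the units?
t_stat, t_p = stats.ttest_ind(unit_a, unit_b, equal_var=False)  # Welch t-test on means
ks_stat, ks_p = stats.ks_2samp(unit_a, unit_b)                  # whole-distribution test
```

    Small p-values would argue for treating the units as separate populations in the geostatistical model rather than pooling them.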

  16. An interactive Bayesian geostatistical inverse protocol for hydraulic tomography

    Science.gov (United States)

    Fienen, Michael N.; Clemo, Tom; Kitanidis, Peter K.

    2008-01-01

    Hydraulic tomography is a powerful technique for characterizing heterogeneous hydrogeologic parameters. An explicit trade-off between characterization based on measurement misfit and subjective characterization using prior information is presented. We apply a Bayesian geostatistical inverse approach that is well suited to accommodate a flexible model with the level of complexity driven by the data and explicitly considering uncertainty. Prior information is incorporated through the selection of a parameter covariance model characterizing continuity and providing stability. Often, discontinuities in the parameter field, typically caused by geologic contacts between contrasting lithologic units, necessitate subdivision into zones across which there is no correlation among hydraulic parameters. We propose an interactive protocol in which zonation candidates are implied from the data and are evaluated using cross validation and expert knowledge. Uncertainty introduced by limited knowledge of dynamic regional conditions is mitigated by using drawdown rather than native head values. An adjoint state formulation of MODFLOW-2000 is used to calculate sensitivities which are used both for the solution to the inverse problem and to guide protocol decisions. The protocol is tested using synthetic two-dimensional steady state examples in which the wells are located at the edge of the region of interest.

  17. Ion-exchange resin separation applied to activation analysis (1963)

    International Nuclear Information System (INIS)

    The separation techniques based on ion-exchange resins have been used in this study for carrying out activation analyses of about thirty impurities. A separation process has been developed so as to standardise these analyses and to render their execution a matter of routine. The separation yields obtained are excellent and make it possible to carry out analyses on samples having a large activation cross-section when working inside a reinforced fume-cupboard. This technique has been applied to the analysis of impurities in tantalum, iron, gallium, germanium, terphenyl, and tungsten. The extension of this process to other impurities and to other matrices is now being studied. (authors)

  18. Application of indicator geostatistics to the Gorleben data set

    International Nuclear Information System (INIS)

    Analysis and understanding of the groundwater flow in the neighborhood of a radioactive waste repository play important roles in a performance assessment. Such analyses rely on numerical modelling in order to study the flow and transport over the very long times that are relevant. This paper describes a trial application of Indicator geostatistical methods to borehole data from the Gorleben site. Indicator variograms have been estimated from the site data and fitted to theoretical models. Indicator Kriging calculations have been carried out for a region between two boreholes, and for planes lying across and along the Gorleben erosion channel. The results of Indicator Kriging share some features with the suggested geological interpretation supplied with the data, but also show some potentially significant differences, particularly in the continuity of the clay layers. The sensitivity of the Kriged fields to various parameter choices has been investigated, as has the distribution of the Kriging variance. Consideration is now being given to possible sources of the observed discrepancies and thus to making greater use of stratigraphic information in the analysis of the data.
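    Indicator Kriging as used above works on 0/1 transforms of the data (e.g. 1 = clay logged at a depth), so the kriged value is interpretable as a probability of clay. A minimal ordinary-kriging sketch on invented borehole indicators (the coordinates and exponential variogram model are hypothetical, not Gorleben values):

```python
import numpy as np

def exp_variogram(h, sill=1.0, rang=5.0):
    """Exponential variogram model (hypothetical parameters)."""
    return sill * (1.0 - np.exp(-np.asarray(h, float) / rang))

def ordinary_kriging(xy, ind, x0, variogram):
    """Ordinary kriging of an indicator variable (1 = clay, 0 = not clay).
    Returns the kriged estimate, interpretable as P(clay) at x0, and the
    kriging variance."""
    n = ind.size
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = variogram(d)
    A[n, n] = 0.0                       # Lagrange-multiplier row/column
    b = np.ones(n + 1)
    b[:n] = variogram(np.linalg.norm(xy - x0, axis=1))
    w = np.linalg.solve(A, b)           # weights plus Lagrange multiplier
    est = float(w[:n] @ ind)
    var = float(w @ b)                  # kriging variance, incl. Lagrange term
    return est, var

# invented borehole indicator picks at one depth slice (coordinates in metres)
xy = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 4.0], [4.0, 4.0], [2.0, 5.0]])
ind = np.array([1.0, 1.0, 0.0, 0.0, 0.0])
p_clay, kvar = ordinary_kriging(xy, ind, np.array([2.0, 1.0]), exp_variogram)
```

    The target point sits nearer the two clay picks, so the estimate exceeds 0.5; mapping the kriging variance is what the abstract refers to when discussing its distribution.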

  19. Applied research and development of neutron activation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Yong Sam; Moon, Jong Hwa; Kim, Sun Ha; Baek, Sung Ryel; Kim, Young Gi; Jung, Hwan Sung; Park, Kwang Won; Kang, Sang Hun; Lim, Jong Myoung

    2003-05-01

    The aims of this project are to establish the quality control system of Neutron Activation Analysis (NAA), in response to increased industrial demand for a standard analytical method, and to prepare and identify the standard operation procedure of NAA through practical testing for different analytical items. The R and D implementations of the analytical quality system using the neutron irradiation facility and gamma-ray measurement system, and the automation of the NAA facility in the HANARO research reactor, are as follows: 1) establishment of an NAA quality control system for the maintenance of best measurement capability and the promotion of utilization of the HANARO research reactor; 2) improvement of analytical sensitivity for industrially applied technologies and establishment of certified standard procedures; 3) standardization and development of Prompt Gamma-ray Activation Analysis (PGAA) technology.

  20. Applied research and development of neutron activation analysis

    International Nuclear Information System (INIS)

    The aims of this project are to establish the quality control system of Neutron Activation Analysis (NAA), in response to increased industrial demand for a standard analytical method, and to prepare and identify the standard operation procedure of NAA through practical testing for different analytical items. The R and D implementations of the analytical quality system using the neutron irradiation facility and gamma-ray measurement system, and the automation of the NAA facility in the HANARO research reactor, are as follows: 1) establishment of an NAA quality control system for the maintenance of best measurement capability and the promotion of utilization of the HANARO research reactor; 2) improvement of analytical sensitivity for industrially applied technologies and establishment of certified standard procedures; 3) standardization and development of Prompt Gamma-ray Activation Analysis (PGAA) technology.

  1. Evaluating host rock representativeness through geostatistics

    International Nuclear Information System (INIS)

    The Nuclear Waste Policy Act of 1982 (NWPA) assigns the U.S. Department of Energy (DOE) the responsibility of locating, constructing, operating, closing and decommissioning a nuclear waste repository. Prior to submittal of a license application to the U.S. Nuclear Regulatory Commission, the DOE is required by 10 CFR Part 60 to conduct a program of site characterization for the site to be described in such application. Site characterization includes an exploration and research program in the field and laboratory to determine the geologic conditions and ranges of those parameters which are necessary for determining the suitability of the site as a geologic repository. The information thus collected should be able to establish with reasonable assurance that the public and environment will be adequately protected from the hazards associated with a repository at the site. This paper demonstrates by example a geostatistical approach which, on the basis of the existing limited data base, can be used to evaluate (1) whether existing information on a certain parameter is adequate to be considered representative, (2) if additional information is required, where the additional sampling or testing should be performed, and (3) whether the newly acquired data along with the existing information constitute representative data.
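    One simple way to ask whether existing information on a parameter is "adequate to be considered representative" is to bootstrap a statistic of interest and inspect the width of its confidence interval. A sketch on invented hydraulic conductivity data (not site data):

```python
import numpy as np

rng = np.random.default_rng(7)

# hypothetical log10 hydraulic conductivity measurements (m/s) from the
# existing limited data base for a candidate host rock
logk = np.array([-9.1, -8.7, -9.4, -8.9, -9.0, -9.6, -8.5, -9.2, -9.3, -8.8])

# bootstrap the mean: a narrow interval suggests the existing data may be
# representative; a wide one argues for additional sampling or testing
boot = np.array([rng.choice(logk, size=logk.size, replace=True).mean()
                 for _ in range(5000)])
lo, hi = np.percentile(boot, [2.5, 97.5])
width = hi - lo
```

    A geostatistical version of the same question would additionally use the kriging variance map to say where the extra samples buy the most information.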

  2. Empirical modal decomposition applied to cardiac signals analysis

    Science.gov (United States)

    Beya, O.; Jalil, B.; Fauvet, E.; Laligant, O.

    2010-01-01

    In this article, we present the method of empirical modal decomposition (EMD) applied to the analysis and denoising of electrocardiogram (ECG) and phonocardiogram (PCG) signals. The objective of this work is to detect cardiac anomalies of a patient automatically. As these anomalies are localized in time, the localization of all the events should be preserved precisely. Methods based on the Fourier transform lose this localization property [13]; the wavelet transform (WT) makes it possible to overcome the localization problem, but the interpretation remains difficult and does not characterize the signal precisely. In this work we propose to apply EMD, which has very useful properties for pseudo-periodic signals. The second section describes the EMD algorithm. In the third part we present the results obtained on phonocardiogram (PCG) and electrocardiogram (ECG) test signals, together with their analysis and interpretation. Finally, we introduce an adaptation of the EMD algorithm which seems to be very efficient for denoising.
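    The core of EMD is the sifting loop: repeatedly subtract the mean of the upper and lower extrema envelopes until the fastest oscillation (the first intrinsic mode function) remains. A minimal sketch on a synthetic signal (not the paper's ECG/PCG data, and a simplified stopping rule):

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.signal import argrelextrema

def sift(x, t, n_iter=8):
    """One EMD sifting pass: repeatedly subtract the mean of the cubic-spline
    envelopes through the maxima and minima, leaving an approximation of the
    first intrinsic mode function (IMF)."""
    h = x.copy()
    for _ in range(n_iter):
        imax = argrelextrema(h, np.greater)[0]
        imin = argrelextrema(h, np.less)[0]
        if imax.size < 2 or imin.size < 2:
            break                       # not enough extrema to build envelopes
        upper = CubicSpline(t[imax], h[imax])(t)
        lower = CubicSpline(t[imin], h[imin])(t)
        h = h - 0.5 * (upper + lower)   # remove the local envelope mean
    return h

# synthetic test signal: a fast component riding on a slow drift
t = np.linspace(0.0, 1.0, 1000)
fast = 0.5 * np.sin(2 * np.pi * 25 * t)
slow = np.sin(2 * np.pi * 2 * t)
imf1 = sift(fast + slow, t)
residue = (fast + slow) - imf1          # mostly the slow drift
```

    Because the extraction is data-driven, each event keeps its position in time, which is what the abstract argues Fourier methods lose.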

  3. Applying cluster analysis to physics education research data

    Science.gov (United States)

    Springuel, R. Padraic

    One major thrust of Physics Education Research (PER) is the identification of student ideas about specific physics concepts, both correct ideas and those that differ from the expert consensus. Typically the research process of eliciting the spectrum of student ideas involves the administration of specially designed questions to students. One major analysis task in PER is the sorting of these student responses into thematically coherent groups, a process which has previously been done by eye in PER. This thesis explores the possibility of using cluster analysis to perform the task in a more rigorous and less time-intensive fashion while making fewer assumptions about what the students are doing. Since this technique has not previously been used in PER, a summary of the various kinds of cluster analysis is included, as well as a discussion of which might be appropriate for the task of sorting student responses into groups. Two example data sets (one based on the Force and Motion Conceptual Evaluation (DICE), the other looking at acceleration in two dimensions (A2D)) are examined in depth to demonstrate how cluster analysis can be applied to PER data and the various considerations which must be taken into account when doing so. In both cases, the techniques described in this thesis found 5 groups which contained about 90% of the students in the data set. The results of this application are compared to previous research on the topics covered by the two examples to demonstrate that cluster analysis can effectively uncover the same patterns in student responses that have already been identified.
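    A sketch of how cluster analysis might sort coded student responses into groups, here using plain k-means with a deterministic farthest-first initialization (the response patterns are invented, and the thesis itself may well use a different clustering method):

```python
import numpy as np

def farthest_first(X, k):
    """Deterministic seeding: start at row 0, then repeatedly add the point
    farthest from the centers chosen so far."""
    idx = [0]
    for _ in range(k - 1):
        d = np.min(np.linalg.norm(X[:, None, :] - X[idx][None, :, :], axis=-1), axis=1)
        idx.append(int(d.argmax()))
    return X[idx].astype(float)

def kmeans(X, k, n_iter=50):
    """Plain k-means: assign each response vector to the nearest center,
    then move each center to the mean of its members."""
    centers = farthest_first(X, k)
    labels = np.zeros(len(X), dtype=int)
    for _ in range(n_iter):
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=-1)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

# hypothetical coded responses (rows = students, columns = questions):
# two archetypal answer patterns plus a little noise
rng = np.random.default_rng(3)
proto_a = np.array([1.0, 1.0, 0.0, 1.0, 0.0])
proto_b = np.array([0.0, 0.0, 1.0, 0.0, 1.0])
X = np.vstack([proto_a + 0.1 * rng.standard_normal(5) for _ in range(20)] +
              [proto_b + 0.1 * rng.standard_normal(5) for _ in range(20)])
labels, centers = kmeans(X, k=2)
```

    The recovered centers play the role of the "thematically coherent groups" that were previously identified by eye.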

  4. Geostatistical description of geological heterogeneity in clayey till as input for improved characterization of contaminated sites

    DEFF Research Database (Denmark)

    Kessler, Timo Christian; Klint, K.E.S.; Renard, P.;

    2010-01-01

    In low-permeability clay tills subsurface transport is governed by preferential flow in sand lenses and fractures. A proper geological model requires the integration of these features, i.e. the spatial distribution of the geological heterogeneities. Detailed mapping of sand lenses has been done a...... a clay till outcrop in Denmark to characterise the shapes and the spatial variability. Further, geostatistics were applied to simulate the distribution and to develop a heterogeneity model that can be incorporated into an existing geological model of, for example, a contaminated site....

  5. Spatial Pattern of Great Lakes Estuary Processes from Water Quality Sensing and Geostatistical Methods

    Science.gov (United States)

    Xu, W.; Minsker, B. S.; Bailey, B.; Collingsworth, P.

    2014-12-01

    Mixing of river and lake water can alter water temperature, conductivity, and other properties that influence ecological processes in freshwater estuaries of the Great Lakes. This study uses geostatistical methods to rapidly visualize and understand water quality sampling results and enable adaptive sampling to remove anomalies and explore interesting phenomena in more detail. Triaxus, a towed undulating sensor package, was used to collect various physical and biological water-quality parameters in three estuary areas of Lake Michigan in Summer 2011. Given the particular sampling pattern, data quality assurance and quality control (QA/QC) processes, including sensor synchronization, upcast and downcast separation, and spatial outlier removal, are first applied. An automated kriging interpolation approach that accounts for trend and anisotropy is then proposed to estimate data on a gridded map for direct visualization. Other methods are explored with the data to gain more insight into water quality processes. Local G statistics serve as a supplementary tool to direct visualization: the method identifies statistically significant high-value zones (hot spots) and low-value zones (cold spots) in water chemistry across the estuaries, including locations of water sources and intrusions. In addition, chlorophyll concentration distributions differ among sites. To further understand the interactions and differences between river and lake water, the K-means clustering algorithm is used to cluster the water spatially based on temperature and specific conductivity. Statistical analysis indicates that clusters with significant river water can be identified by higher turbidity, specific conductivity, and chlorophyll concentrations. Different ratios between zooplankton biomass and density indicate different zooplankton structure across clusters. All of these methods can contribute to improved near real-time analysis of future sampling activity.
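    The local G statistic mentioned above (here in the Getis-Ord Gi* variant with binary distance weights) turns each location's neighbourhood sum into a z-score, flagging hot and cold spots. A sketch on an invented grid with one high-value patch (not the Lake Michigan data):

```python
import numpy as np

def local_g_star(coords, x, dist):
    """Getis-Ord Gi* with binary distance weights (self included): a z-score
    per location; large positive = hot spot, large negative = cold spot."""
    n = x.size
    xbar = x.mean()
    s = np.sqrt((x ** 2).mean() - xbar ** 2)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    w = (d <= dist).astype(float)       # binary weights; d_ii = 0 includes self
    wi = w.sum(axis=1)
    num = w @ x - xbar * wi
    den = s * np.sqrt((n * wi - wi ** 2) / (n - 1))
    return num / den

# invented 10x10 grid of a water-quality variable with one high-value patch
gx, gy = np.meshgrid(np.arange(10), np.arange(10))
coords = np.column_stack([gx.ravel(), gy.ravel()]).astype(float)
x = np.ones(100)
x[(gx.ravel() < 3) & (gy.ravel() < 3)] = 5.0    # 3x3 "hot" corner
z = local_g_star(coords, x, dist=1.5)
hot = int(z.argmax())                           # index of the strongest hot spot
```

    Locations with |z| above the usual normal cutoffs (e.g. 1.96) would be mapped as statistically significant hot or cold spots.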

  6. Image analysis technique applied to lock-exchange gravity currents

    Science.gov (United States)

    Nogueira, Helena I. S.; Adduce, Claudia; Alves, Elsa; Franca, Mário J.

    2013-04-01

    An image analysis technique is used to estimate the two-dimensional instantaneous density field of unsteady gravity currents produced by full-depth lock-release of saline water. An experiment reproducing a gravity current was performed in a 3.0 m long, 0.20 m wide and 0.30 m deep Perspex flume with horizontal smooth bed and recorded with a 25 Hz CCD video camera under controlled light conditions. Using dye concentration as a tracer, a calibration procedure was established for each pixel in the image relating the amount of dye uniformly distributed in the tank and the greyscale values in the corresponding images. The results are evaluated and corrected by applying the mass conservation principle within the experimental tank. The procedure is a simple way to assess the time-varying density distribution within the gravity current, allowing the investigation of gravity current dynamics and mixing processes.
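The per-pixel calibration step can be sketched as an ordinary least-squares fit relating observed greyscale values to known dye concentrations; the concentrations and greyscale readings below are hypothetical, and a real calibration would be repeated independently for every pixel in the image.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

# Hypothetical calibration data for one pixel: greyscale values recorded
# for known, uniformly mixed dye concentrations (g/l).
concentrations = [0.0, 0.5, 1.0, 1.5, 2.0]
greyscales = [250, 210, 172, 130, 92]  # image darkens as dye is added

# Fit the inverse mapping directly: greyscale -> concentration.
a, b = fit_line(greyscales, concentrations)

def greyscale_to_concentration(g):
    return a * g + b

# A frame pixel reading of 150 maps to an intermediate concentration.
c = greyscale_to_concentration(150)
```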

  7. Applied research and development of neutron activation analysis

    International Nuclear Information System (INIS)

    This report presents the results of research and development in the following areas: improvement of neutron irradiation facilities, counting systems, and development of an automation system and capsules for NAA in HANARO; improvement of analytical procedures and establishment of an analytical quality control and assurance system; and applied research and development in environment, industry and human health and its standardization. For identification and standardization of analytical methods, environmental, biological and polymer samples were analyzed and the uncertainty of measurement was estimated. Data intercomparison and proficiency tests were also performed. Using airborne particulate matter as an environmental indicator, trace element concentrations in samples collected at urban and rural sites were determined, and statistical calculations and factor analysis were carried out to investigate emission sources. An international cooperation research project was carried out for the utilization of nuclear techniques

  8. Downside Risk analysis applied to Hedge Funds universe

    CERN Document Server

    Perello, J

    2006-01-01

    Hedge Funds are considered one of the fastest-growing portfolio management sectors of the past decade. Optimal Hedge Fund management requires high-precision risk evaluation and appropriate risk metrics. The classic CAPM theory and its Sharpe ratio fail to capture some crucial aspects due to the strongly non-Gaussian character of Hedge Fund statistics. A possible way out of this problem, while keeping CAPM simplicity, is the so-called Downside Risk analysis. One important benefit lies in distinguishing between good and bad returns, that is, returns greater (or lower) than the investor's goal. We study several risk indicators using the Gaussian case as a benchmark and apply them to the Credit Suisse/Tremont Investable Hedge Fund Index data.
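A minimal sketch of the core Downside Risk quantities, assuming a simple target-return convention; the monthly returns are invented for illustration, and the indicators shown here (downside deviation and a Sortino-style ratio) are generic examples, not necessarily the exact indicators studied in the paper.

```python
import math

def downside_deviation(returns, target=0.0):
    """Root-mean-square of shortfalls below the investor's target return."""
    shortfalls = [min(r - target, 0.0) for r in returns]
    return math.sqrt(sum(s * s for s in shortfalls) / len(returns))

def sortino_ratio(returns, target=0.0):
    """Mean excess return over target, scaled by downside deviation."""
    mean_excess = sum(returns) / len(returns) - target
    return mean_excess / downside_deviation(returns, target)

# Hypothetical monthly returns: two series can share a mean and variance
# yet differ in downside behaviour, which is what these metrics separate.
monthly = [0.02, 0.01, -0.03, 0.015, -0.01, 0.025]
dd = downside_deviation(monthly, target=0.0)
sr = sortino_ratio(monthly, target=0.0)
```

Unlike the ordinary standard deviation used in the Sharpe ratio, only the two negative months contribute to `dd`.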

  9. Applied research and development of neutron activation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Yong Sam; Moon, Jong Hwa; Kim, Sun Ha; Bak, Sung Ryel; Park, Yong Chul; Kim, Young Ki; Chung, Hwan Sung; Park, Kwang Won; Kang, Sang Hun

    2000-05-01

    This report presents the results of research and development in the following areas: improvement of neutron irradiation facilities, counting systems, and development of an automation system and capsules for NAA in HANARO; improvement of analytical procedures and establishment of an analytical quality control and assurance system; and applied research and development in environment, industry and human health and its standardization. For identification and standardization of analytical methods, environmental, biological and polymer samples were analyzed and the uncertainty of measurement was estimated. Data intercomparison and proficiency tests were also performed. Using airborne particulate matter as an environmental indicator, trace element concentrations in samples collected at urban and rural sites were determined, and statistical calculations and factor analysis were carried out to investigate emission sources. An international cooperation research project was carried out for the utilization of nuclear techniques.

  10. Evaluation of stationary and non-stationary geostatistical models for inferring hydraulic conductivity values at Aespoe

    International Nuclear Information System (INIS)

    This report describes the comparison of stationary and non-stationary geostatistical models for the purpose of inferring block-scale hydraulic conductivity values from packer tests at Aespoe. The comparison between models is made through the evaluation of cross-validation statistics for three experimental designs. The first experiment consisted of a 'Delete-1' test previously used at Finnsjoen. The second test consisted of 'Delete-10%' and the third test was a 'Delete-50%' test. Preliminary data analysis showed that the 3 m and 30 m packer test data can be treated as a sample from a single population for the purposes of geostatistical analyses. Analysis of the 3 m data does not indicate that there are any systematic statistical changes with depth, rock type, fracture zone vs non-fracture zone or other mappable factor. Directional variograms are ambiguous to interpret due to the clustered nature of the data, but do not show any obvious anisotropy that should be accounted for in geostatistical analysis. Stationary analysis suggested that there exists a sizeable spatially uncorrelated component ('Nugget Effect') in the 3 m data, on the order of 60% of the observed variance for the various models fitted. Four different nested models were automatically fit to the data. Results for all models in terms of cross-validation statistics were very similar for the first set of validation tests. Non-stationary analysis established that both the order of drift and the order of the intrinsic random functions is low. This study also suggests that conventional cross-validation studies and automatic variogram fitting are not necessarily evaluating how well a model will infer block scale hydraulic conductivity values. 20 refs, 20 figs, 14 tabs
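The 'Delete-1' cross-validation design can be sketched as follows; for brevity this uses inverse-distance weighting as the estimator rather than the kriging models evaluated in the report, and the borehole depths and log-conductivity values are hypothetical.

```python
def idw_estimate(x, xs, zs, power=2.0):
    """Inverse-distance-weighted estimate at location x from samples (xs, zs)."""
    num = den = 0.0
    for xi, zi in zip(xs, zs):
        d = abs(x - xi)
        if d == 0:
            return zi
        w = 1.0 / d ** power
        num += w * zi
        den += w
    return num / den

def delete_one_cv(xs, zs):
    """Delete-1 cross-validation: re-estimate each sample from all the
    others and collect the estimation errors."""
    errors = []
    for i in range(len(xs)):
        rest_x = xs[:i] + xs[i + 1:]
        rest_z = zs[:i] + zs[i + 1:]
        errors.append(idw_estimate(xs[i], rest_x, rest_z) - zs[i])
    return errors

# Hypothetical 1-D profile along a borehole: depth (m) vs log10 hydraulic
# conductivity from 3 m packer tests.
depths = [3.0, 6.0, 9.0, 12.0, 15.0, 18.0]
logk = [-7.1, -7.3, -6.9, -7.0, -7.4, -7.2]
errs = delete_one_cv(depths, logk)
mean_err = sum(errs) / len(errs)
```

Summary statistics of `errs` (mean error, error variance) are the cross-validation statistics the report compares across models.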

  11. Applying Conjoint Analysis to Study Attitudes of Thai Government Organisations

    Directory of Open Access Journals (Sweden)

    Natee Suriyanon

    2012-11-01

    Full Text Available This article presents the application of choice-based conjoint analysis to analyse the attitude of Thai government organisations towards the restriction of the contractor's right to claim compensation for unfavourable effects from undesirable events. The analysis reveals that the organisations want to restrict only 6 out of 14 types of the claiming rights that were studied. The right that they want to restrict most is the right to claim for additional direct costs due to force majeure. They are willing to pay between 0.087% - 0.210% of the total project direct cost for restricting each type of contractor right. The total additional cost for restricting all six types of rights that the organisations are willing to pay is 0.882%. The last section of this article applies the knowledge gained from a choice-based conjoint analysis experiment to the analysis of the standard contract of the Thai government. The analysis reveals three types of rights where Thai government organisations are willing to forego restrictions, but the present standard contract does not grant such rights.

  12. Multivariate Statistical Analysis Applied in Wine Quality Evaluation

    Directory of Open Access Journals (Sweden)

    Jieling Zou

    2015-08-01

    Full Text Available This study applies multivariate statistical approaches to wine quality evaluation. With 27 red wine samples, four factors were identified out of 12 parameters by principal component analysis, explaining 89.06% of the total variance of the data. As iterative weights calculated by the BP neural network revealed little difference from weights determined by the information entropy method, the latter was chosen to measure the importance of indicators. Weighted cluster analysis performs well in classifying the sample group further into two sub-clusters. The second cluster of red wine samples, compared with the first, was lighter in color, tasted thinner and had a fainter bouquet. The weighted TOPSIS method was used to evaluate the quality of wine in each sub-cluster. With the scores obtained, each sub-cluster was divided into three grades. On the whole, the quality of the lighter red wine was slightly better than the darker category. This study shows the necessity and usefulness of multivariate statistical techniques in both wine quality evaluation and parameter selection.
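The entropy-weighting and TOPSIS steps can be sketched as below; the three wine indicators and their scores are hypothetical, and all criteria are assumed larger-is-better for simplicity.

```python
import math

def entropy_weights(matrix):
    """Information-entropy weights for the columns of a (samples x criteria) matrix."""
    n, m = len(matrix), len(matrix[0])
    k = 1.0 / math.log(n)
    divergences = []
    for j in range(m):
        col = [row[j] for row in matrix]
        total = sum(col)
        p = [c / total for c in col]
        e = -k * sum(pi * math.log(pi) for pi in p if pi > 0)
        divergences.append(1.0 - e)  # low entropy -> high discriminating power
    s = sum(divergences)
    return [d / s for d in divergences]

def topsis(matrix, weights):
    """Closeness of each sample to the ideal solution (larger-is-better criteria)."""
    m = len(matrix[0])
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(m)]
    v = [[w * row[j] / norms[j] for j, w in enumerate(weights)] for row in matrix]
    best = [max(col) for col in zip(*v)]
    worst = [min(col) for col in zip(*v)]
    scores = []
    for row in v:
        dp = math.sqrt(sum((a - b) ** 2 for a, b in zip(row, best)))
        dm = math.sqrt(sum((a - b) ** 2 for a, b in zip(row, worst)))
        scores.append(dm / (dp + dm))
    return scores

# Hypothetical wine samples scored on three larger-is-better indicators
# (e.g. colour intensity, taste, bouquet).
samples = [[8.0, 7.5, 6.0], [5.0, 6.0, 5.5], [9.0, 8.0, 7.0]]
w = entropy_weights(samples)
scores = topsis(samples, w)
```

Sample 3 dominates on every indicator, so it receives the highest closeness score; sample 2 is worst on every indicator and receives the lowest.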

  13. To Apply or Not to Apply: A Survey Analysis of Grant Writing Costs and Benefits

    CERN Document Server

    von Hippel, Ted

    2015-01-01

    We surveyed 113 astronomers and 82 psychologists active in applying for federally funded research on their grant-writing history between January, 2009 and November, 2012. We collected demographic data, effort levels, success rates, and perceived non-financial benefits from writing grant proposals. We find that the average proposal takes 116 PI hours and 55 CI hours to write; although time spent writing was not related to whether the grant was funded. Effort did translate into success, however, as academics who wrote more grants received more funding. Participants indicated modest non-monetary benefits from grant writing, with psychologists reporting a somewhat greater benefit overall than astronomers. These perceptions of non-financial benefits were unrelated to how many grants investigators applied for, the number of grants they received, or the amount of time they devoted to writing their proposals. We also explored the number of years an investigator can afford to apply unsuccessfully for research grants a...

  14. Correlation network analysis applied to complex biofilm communities.

    Directory of Open Access Journals (Sweden)

    Ana E Duran-Pinedo

    Full Text Available The complexity of the human microbiome makes it difficult to reveal organizational principles of the community and even more challenging to generate testable hypotheses. It has been suggested that in the gut microbiome species such as Bacteroides thetaiotaomicron are keystone in maintaining the stability and functional adaptability of the microbial community. In this study, we investigate the interspecies associations in a complex microbial biofilm applying systems biology principles. Using correlation network analysis we identified bacterial modules that represent important microbial associations within the oral community. We used dental plaque as a model community because of its high diversity and the well known species-species interactions that are common in the oral biofilm. We analyzed samples from healthy individuals as well as from patients with periodontitis, a polymicrobial disease. Using results obtained by checkerboard hybridization on cultivable bacteria we identified modules that correlated well with microbial complexes previously described. Furthermore, we extended our analysis using the Human Oral Microbe Identification Microarray (HOMIM, which includes a large number of bacterial species, among them uncultivated organisms present in the mouth. Two distinct microbial communities appeared in healthy individuals while there was one major type in disease. Bacterial modules in all communities did not overlap, indicating that bacteria were able to effectively re-associate with new partners depending on the environmental conditions. We then identified hubs that could act as keystone species in the bacterial modules. Based on those results we then cultured a not-yet-cultivated microorganism, Tannerella sp. OT286 (clone BU063. After two rounds of enrichment by a selected helper (Prevotella oris OT311 we obtained colonies of Tannerella sp. OT286 growing on blood agar plates. This system-level approach would open the possibility of
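A bare-bones version of correlation-network module detection: compute pairwise Pearson correlations between abundance profiles, keep edges above a threshold, and read off connected components as modules. The taxa names, profiles and threshold below are illustrative assumptions only.

```python
import math

def pearson(a, b):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = math.sqrt(sum((x - ma) ** 2 for x in a))
    vb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (va * vb)

def correlation_modules(profiles, threshold=0.8):
    """Group taxa whose abundance profiles correlate above `threshold`
    into modules (connected components of the correlation graph)."""
    taxa = list(profiles)
    adj = {t: set() for t in taxa}
    for i, t1 in enumerate(taxa):
        for t2 in taxa[i + 1:]:
            if pearson(profiles[t1], profiles[t2]) >= threshold:
                adj[t1].add(t2)
                adj[t2].add(t1)
    modules, seen = [], set()
    for t in taxa:
        if t in seen:
            continue
        stack, comp = [t], set()
        while stack:
            u = stack.pop()
            if u in comp:
                continue
            comp.add(u)
            stack.extend(adj[u] - comp)
        seen |= comp
        modules.append(comp)
    return modules

# Hypothetical abundance profiles across 5 samples for 4 taxa:
# A and B co-vary, C and D co-vary, and the two pairs are unrelated.
profiles = {
    "A": [1, 2, 3, 4, 5],
    "B": [2, 4, 6, 8, 10],
    "C": [5, 1, 4, 2, 3],
    "D": [10, 2, 8, 4, 6],
}
mods = correlation_modules(profiles)
```

In a real analysis, hub taxa (highest degree within a module) would be the keystone-species candidates.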

  15. Geostatistical analysis for soil moisture content under the no tillage cropping system

    Directory of Open Access Journals (Sweden)

    Célia Regina Grego

    2006-08-01

    Full Text Available Experiments in agriculture usually consider the topsoil properties to be uniform in space and, for this reason, often make inadequate use of the results. The objective of this study was to assess the variability of soil moisture content using geostatistical techniques. The experiment was carried out on a Rhodic Ferralsol (Typic Haplorthox) in Campinas, SP, Brazil, in an area of 3.42 ha cultivated under the no tillage system, and the sampling was made in a grid of 102 points spaced 10 m x 20 m. Access tubes were inserted down to one meter at each evaluation point in order to measure soil moisture contents (cm³ cm-3) at depths of 30, 60 and 90 cm with a neutron moisture gauge. Samplings were made between August and September of 2003 and in January 2004. The soil moisture content for each sampling date was analyzed using classical statistics, in order to appropriately describe the central tendency and dispersion of the data, and then using geostatistics to describe the spatial variability. The comparison of the spatial variability between different samplings was made by examining scaled semivariograms. Water content was mapped using values interpolated with punctual kriging. The semivariograms showed that, at the 60 cm depth, soil water content had moderate spatial dependence with ranges between 90 and 110 m. However, no spatial dependence was found at the 30 and 90 cm depths in 2003. The sampling density was insufficient for an adequate characterization of the spatial variability of soil moisture contents at the 30 and 90 cm depths.
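The experimental semivariogram underlying such an analysis can be sketched with the classical Matheron estimator, which averages half the squared differences between pairs of samples separated by a given lag; the grid coordinates and moisture values below are hypothetical.

```python
def experimental_semivariogram(coords, values, lag, tol):
    """Classical (Matheron) semivariance estimate for one lag class:
    half the mean squared difference over pairs whose separation
    distance falls within `tol` of `lag`."""
    pairs = []
    for i in range(len(coords)):
        for j in range(i + 1, len(coords)):
            dx = coords[i][0] - coords[j][0]
            dy = coords[i][1] - coords[j][1]
            h = (dx * dx + dy * dy) ** 0.5
            if abs(h - lag) <= tol:
                pairs.append((values[i] - values[j]) ** 2)
    return 0.5 * sum(pairs) / len(pairs) if pairs else None

# Hypothetical soil-moisture readings (cm3/cm3) on a small 10 m grid,
# with a gentle spatial trend so semivariance grows with lag.
coords = [(x * 10.0, y * 10.0) for x in range(4) for y in range(4)]
values = [0.20 + 0.005 * (x + y) for x in range(4) for y in range(4)]
g10 = experimental_semivariogram(coords, values, lag=10.0, tol=0.5)
g30 = experimental_semivariogram(coords, values, lag=30.0, tol=0.5)
```

Repeating this for several lags gives the experimental semivariogram to which a model (with nugget, sill and range) is then fitted before kriging.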

  16. Geostatistical Analysis on Spatial Distribution of Macropterous Whitebacked Planthopper, Sogatella furcifera (Horváth)

    Institute of Scientific and Technical Information of China (English)

    闫香慧; 黄燕

    2012-01-01

    The whitebacked planthopper, Sogatella furcifera (Horváth), is a major migratory pest of rice in China and causes serious damage to rice production. To understand the dynamic process of aggregation and dispersal after immigration and the spatial distribution pattern, and to provide a theoretical basis for integrated control, the semivariogram from geostatistics was applied to systematic field survey data collected in Xiushan county in 2008. Spatial variability curve models were established in the east-west and north-south directions for the period from immigration to emigration, and the spatial data were interpolated and mapped by kriging with the Surfer 8.0 software. The variograms showed that the higher the density of macropterous S. furcifera, the larger the variation of the spatial variable. On average, 37.6% of the spatial variation was caused by random factors and 62.4% by spatial autocorrelation, and the random component of the spatial variation became greater as the rice grew. At each survey date the spatial correlation range in the east-west direction was smaller than in the north-south direction, averaging 12.86 m and 28.85 m respectively. Kriging interpolation showed that the aggregation patches of the macropterous population were longer in the north-south direction than in the east-west direction; that is, north-south was the main direction of aggregation and dispersal of macropterous S. furcifera.

  17. Applying DNA computation to intractable problems in social network analysis.

    Science.gov (United States)

    Chen, Rick C S; Yang, Stephen J H

    2010-09-01

    From ancient times to the present day, social networks have played an important role in the formation of various organizations for a range of social behaviors. As such, social networks inherently describe the complicated relationships between elements around the world. Based on mathematical graph theory, social network analysis (SNA) has been developed in and applied to various fields such as Web 2.0 for Web applications and product developments in industries, etc. However, some definitions of SNA, such as finding a clique, N-clique, N-clan, N-club and K-plex, are NP-complete problems, which are not easily solved via traditional computer architecture. These challenges have restricted the uses of SNA. This paper provides DNA-computing-based approaches with inherently high information density and massive parallelism. Using these approaches, we aim to solve the three primary problems of social networks: N-clique, N-clan, and N-club. The accuracy and feasible time complexities discussed in the paper demonstrate that DNA computing can be used to facilitate the development of SNA. PMID:20566337

  18. SUCCESS CONCEPT ANALYSIS APPLIED TO THE INFORMATION TECHNOLOGY PROJECT MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Cassio C. Montenegro Duarte

    2012-05-01

    Full Text Available This study evaluates the concept of success in project management that is applicable to the IT universe, starting from the classical theory associated with the techniques of project management. It therefore applies theoretical analysis to the context of information technology in enterprises as well as the classic literature of traditional project management, focusing on its application in business information technology. From the literature reviewed in the first part of the study, four propositions were prepared, which formed the basis for field research with three large companies that develop Information Technology projects. The methodology adopted was a multiple case study. Empirical evidence suggests that the concept of success found in the classical project management literature applies to the management environment of IT projects. The findings showed that it is possible to create a standard model for IT projects and replicate it in future derivative projects, which depends on the learning acquired at the end of a long and continuous process and on the sponsorship of senior management, which ultimately results in the model's merger into the company culture.

  19. Applied genre analysis: a multi-perspective model

    Directory of Open Access Journals (Sweden)

    Vijay K Bhatia

    2002-04-01

    Full Text Available Genre analysis can be viewed from two different perspectives: it may be seen as a reflection of the complex realities of the world of institutionalised communication, or it may be seen as a pedagogically effective and convenient tool for the design of language teaching programmes, often situated within simulated contexts of classroom activities. This paper makes an attempt to understand and resolve the tension between these two seemingly contentious perspectives to answer the question: "Is generic description a reflection of reality, or a convenient fiction invented by applied linguists?". The paper also discusses issues related to the nature and use of linguistic description in a genre-based educational enterprise, claiming that instead of using generic descriptions as models for linguistic reproduction of conventional forms to respond to recurring social contexts, as is often the case in many communication based curriculum contexts, they can be used as analytical resource to understand and manipulate complex inter-generic and multicultural realisations of professional discourse, which will enable learners to use generic knowledge to respond to novel social contexts and also to create new forms of discourse to achieve pragmatic success as well as other powerful human agendas.

  20. To apply or not to apply: a survey analysis of grant writing costs and benefits.

    Directory of Open Access Journals (Sweden)

    Ted von Hippel

    Full Text Available We surveyed 113 astronomers and 82 psychologists active in applying for federally funded research on their grant-writing history between January, 2009 and November, 2012. We collected demographic data, effort levels, success rates, and perceived non-financial benefits from writing grant proposals. We find that the average proposal takes 116 PI hours and 55 CI hours to write; although time spent writing was not related to whether the grant was funded. Effort did translate into success, however, as academics who wrote more grants received more funding. Participants indicated modest non-monetary benefits from grant writing, with psychologists reporting a somewhat greater benefit overall than astronomers. These perceptions of non-financial benefits were unrelated to how many grants investigators applied for, the number of grants they received, or the amount of time they devoted to writing their proposals. We also explored the number of years an investigator can afford to apply unsuccessfully for research grants and our analyses suggest that funding rates below approximately 20%, commensurate with current NIH and NSF funding, are likely to drive at least half of the active researchers away from federally funded research. We conclude with recommendations and suggestions for individual investigators and for department heads.

  1. Setting events in applied behavior analysis: Toward a conceptual and methodological expansion

    OpenAIRE

    Wahler, Robert G.; Fox, James J.

    1981-01-01

    The contributions of applied behavior analysis as a natural science approach to the study of human behavior are acknowledged. However, it is also argued that applied behavior analysis has provided limited access to the full range of environmental events that influence socially significant behavior. Recent changes in applied behavior analysis to include analysis of side effects and social validation represent ways in which the traditional applied behavior analysis conceptual and methodological...

  2. Reducing uncertainty in geostatistical description with well testing pressure data

    Energy Technology Data Exchange (ETDEWEB)

    Reynolds, A.C.; He, Nanqun [Univ. of Tulsa, OK (United States); Oliver, D.S. [Chevron Petroleum Technology Company, La Habra, CA (United States)

    1997-08-01

    Geostatistics has proven to be an effective tool for generating realizations of reservoir properties conditioned to static data, e.g., core and log data and geologic knowledge. Due to the lack of closely spaced data in the lateral directions, there will be significant variability in reservoir descriptions generated by geostatistical simulation, i.e., significant uncertainty in the reservoir descriptions. In past work, we have presented procedures based on inverse problem theory for generating reservoir descriptions (rock property fields) conditioned to pressure data and geostatistical information represented as prior means for log-permeability and porosity and variograms. Although we have shown that the incorporation of pressure data reduces the uncertainty below the level contained in the geostatistical model based only on static information (the prior model), our previous results did not explicitly account for uncertainties in the prior means and the parameters defining the variogram model. In this work, we investigate how pressure data can help detect errors in the prior means. If errors in the prior means are large and are not taken into account, realizations conditioned to pressure data represent incorrect samples of the a posteriori probability density function for the rock property fields, whereas, if the uncertainty in the prior mean is incorporated properly into the model, one obtains realistic realizations of the rock property fields.

  3. New trends in applied harmonic analysis sparse representations, compressed sensing, and multifractal analysis

    CERN Document Server

    Cabrelli, Carlos; Jaffard, Stephane; Molter, Ursula

    2016-01-01

    This volume is a selection of written notes corresponding to courses taught at the CIMPA School: "New Trends in Applied Harmonic Analysis: Sparse Representations, Compressed Sensing and Multifractal Analysis". New interactions between harmonic analysis and signal and image processing have seen striking development in the last 10 years, and several technological deadlocks have been solved through the resolution of deep theoretical problems in harmonic analysis. New Trends in Applied Harmonic Analysis focuses on two particularly active areas that are representative of such advances: multifractal analysis, and sparse representation and compressed sensing. The contributions are written by leaders in these areas, and cover both theoretical aspects and applications. This work should prove useful not only to PhD students and postdocs in mathematics and signal and image processing, but also to researchers working on related topics.

  4. Factor Analysis Applied to the VFY-218 RCS Data

    Science.gov (United States)

    Woo, Alex; Chancellor, Marisa K. (Technical Monitor)

    1997-01-01

    Presents a statistical factor analysis of computer simulations and measurement data for the VFY-218 configuration. Factor analysis attempts to quantify the statistical grouping of measurements and simulations.

  5. Accuracy evaluation of different statistical and geostatistical censored data imputation approaches (Case study: Sari Gunay gold deposit

    Directory of Open Access Journals (Sweden)

    Babak Ghane

    2016-06-01

    Full Text Available Most geochemical datasets include missing data in different proportions, and this may cause a significant problem in geostatistical modeling or multivariate analysis of the data. Therefore, it is common to impute the missing data in most geochemical studies. In this study, three approaches called half detection (HD), multiple imputation (MI), and cosimulation based on Markov model 2 (MM2) are used to impute the censored data. Because the new datasets have to satisfy the underlying structure of the original data, the Multidimensional Scaling (MDS) approach has been used to explore the validity of the different imputation methods. An additive log-ratio (alr) transformation was performed to open the closed compositional data prior to applying the MDS method. Experiments showed that, based on the MDS approach, MI and MM2 could not satisfy the original underlying structure of the dataset as well as the HD approach, because these two approaches produced values higher than the detection limits of the variables.
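The half detection (HD) approach is the simplest of the three: each censored value is replaced by half the detection limit. A minimal sketch, assuming censored assays are encoded as 0.0 and all numbers are hypothetical:

```python
def half_detection_impute(values, detection_limit):
    """Half-detection (HD) imputation: values reported below the
    detection limit are replaced by half the detection limit."""
    return [v if v >= detection_limit else detection_limit / 2.0
            for v in values]

# Hypothetical Au assays (ppm) with a 0.05 ppm detection limit;
# censored readings are encoded here as 0.0.
dl = 0.05
assays = [0.12, 0.0, 0.30, 0.0, 0.08]
imputed = half_detection_impute(assays, dl)
```

Unlike MI or MM2, HD can never produce an imputed value above the detection limit, which is why it preserves the low tail of the compositional structure.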

  6. APPLICATION OF BAYESIAN AND GEOSTATISTICAL MODELING TO THE ENVIRONMENTAL MONITORING OF CS-137 AT THE IDAHO NATIONAL LABORATORY

    Energy Technology Data Exchange (ETDEWEB)

    Kara G. Eby

    2010-08-01

    At the Idaho National Laboratory (INL) Cs-137 concentrations above the U.S. Environmental Protection Agency risk-based threshold of 0.23 pCi/g may increase the risk of human mortality due to cancer. As a leader in nuclear research, the INL has been conducting nuclear activities for decades. Elevated anthropogenic radionuclide levels including Cs-137 are a result of atmospheric weapons testing, the Chernobyl accident, and nuclear activities occurring at the INL site. Therefore environmental monitoring and long-term surveillance of Cs-137 is required to evaluate risk. However, due to the large land area involved, frequent and comprehensive monitoring is limited. Developing a spatial model that predicts Cs-137 concentrations at unsampled locations will enhance the spatial characterization of Cs-137 in surface soils, provide guidance for an efficient monitoring program, and pinpoint areas requiring mitigation strategies. The predictive model presented herein is based on applied geostatistics using a Bayesian analysis of environmental characteristics across the INL site, which provides kriging spatial maps of both Cs-137 estimates and prediction errors. Comparisons are presented of two different kriging methods, showing that the use of secondary information (i.e., environmental characteristics) can provide improved prediction performance in some areas of the INL site.

  7. Can Artificial Neural Networks be Applied in Seismic Predicition? Preliminary Analysis Applying Radial Topology. Case: Mexico

    CERN Document Server

    Mota-Hernandez, Cinthya; Alvarado-Corona, Rafael

    2014-01-01

    Tectonic earthquakes of high magnitude can cause considerable losses in terms of human lives, economy and infrastructure, among others. According to an evaluation published by the U.S. Geological Survey, 30 earthquakes have greatly impacted Mexico from the end of the XIX century to this one. Based upon data from the National Seismological Service, in the period between January 1, 2006 and May 1, 2013 there occurred 5,826 earthquakes whose magnitude was greater than 4.0 on the Richter scale (25.54% of the total earthquakes registered on the national territory), with the Pacific Plate and the Cocos Plate being the most important ones. This document describes the development of an Artificial Neural Network (ANN) based on radial topology which seeks to generate a prediction, with an error margin lower than 20%, of the probability of a future earthquake. One of the main questions is: can artificial neural networks be applied in seismic forecast...
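A radial-topology network rests on Gaussian radial basis units, each responding most strongly near its center; the forward pass can be sketched as below. The centers, widths, weights and input features are toy values, not parameters from the study.

```python
import math

def rbf_forward(x, centers, widths, weights, bias):
    """Forward pass of a radial-basis-function network with Gaussian units:
    output = bias + sum_j w_j * exp(-||x - c_j||^2 / (2 s_j^2))."""
    acts = [math.exp(-sum((xi - ci) ** 2 for xi, ci in zip(x, c))
                     / (2.0 * s * s))
            for c, s in zip(centers, widths)]
    return bias + sum(w * a for w, a in zip(weights, acts))

# Hypothetical toy setup: two hidden units centred on (latitude, longitude,
# depth)-like normalized feature vectors; all numbers are illustrative only.
centers = [(0.0, 0.0, 0.0), (1.0, 1.0, 1.0)]
widths = [1.0, 1.0]
weights = [0.8, 0.2]
bias = 0.0
p = rbf_forward((0.0, 0.0, 0.0), centers, widths, weights, bias)
```

At the first center, the first unit activates fully while the second contributes only exp(-3/2) of its weight; training would fit the weights (and possibly centers/widths) to historical catalog data.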

  8. Demonstration of a geostatistical approach to physically consistent downscaling of climate modeling simulations

    KAUST Repository

    Jha, Sanjeev Kumar

    2013-01-01

    A downscaling approach based on multiple-point geostatistics (MPS) is presented. The key concept underlying MPS is to sample spatial patterns from within training images, which can then be used in characterizing the relationship between different variables across multiple scales. The approach is used here to downscale climate variables including skin surface temperature (TSK), soil moisture (SMOIS), and latent heat flux (LH). The performance of the approach is assessed by applying it to data derived from a regional climate model of the Murray-Darling basin in southeast Australia, using model outputs at two spatial resolutions of 50 and 10 km. The data used in this study cover the period from 1985 to 2006, with 1985 to 2005 used for generating the training images that define the relationships of the variables across the different spatial scales. Subsequently, the spatial distributions for the variables in the year 2006 are determined at 10 km resolution using the 50 km resolution data as input. The MPS geostatistical downscaling approach reproduces the spatial distribution of TSK, SMOIS, and LH at 10 km resolution with the correct spatial patterns over different seasons, while providing uncertainty estimates through the use of multiple realizations. The technique has the potential to not only bridge issues of spatial resolution in regional and global climate model simulations but also in feature sharpening in remote sensing applications through image fusion, filling gaps in spatial data, evaluating downscaled variables with available remote sensing images, and aggregating/disaggregating hydrological and groundwater variables for catchment studies.
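The core MPS idea of sampling patterns from a training image can be caricatured in one dimension: find the coarse-scale window that best matches the conditioning pattern and copy the aligned fine-scale value. The paired series below are hypothetical, and real MPS operates on 2-D/3-D neighborhoods with multiple stochastic realizations rather than a single best match.

```python
def best_match(training_coarse, training_fine, pattern):
    """Find the window in the coarse training series most similar to
    `pattern` (squared-difference distance) and return the fine-scale
    value aligned with the window centre."""
    n = len(pattern)
    best_i, best_d = None, float("inf")
    for i in range(len(training_coarse) - n + 1):
        window = training_coarse[i:i + n]
        d = sum((a - b) ** 2 for a, b in zip(window, pattern))
        if d < best_d:
            best_i, best_d = i, d
    return training_fine[best_i + n // 2]

# Hypothetical paired coarse/fine training series (e.g. 50 km and 10 km
# temperature anomalies along the same transect, 1985-2005).
coarse = [0.0, 1.0, 2.0, 3.0, 2.0, 1.0, 0.0]
fine = [0.1, 1.2, 2.1, 3.3, 1.9, 1.1, -0.1]

# Downscale a new coarse pattern (the "2006" data in the study's setup).
value = best_match(coarse, fine, pattern=[1.0, 2.0, 3.0])
```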

  9. Accounting for Transport Parameter Uncertainty in Geostatistical Groundwater Contaminant Release History Estimation

    Science.gov (United States)

    Ostrowski, J.; Shlomi, S.; Michalak, A.

    2007-12-01

    The process of estimating the release history of a contaminant in groundwater relies on coupling a limited number of concentration measurements with a groundwater flow and transport model in an inverse modeling framework. The information provided by available measurements is generally not sufficient to fully characterize the unknown release history; therefore, an accurate assessment of the estimation uncertainty is required. The modeler's level of confidence in the transport parameters, expressed as pdfs, can be incorporated into the inverse model to improve the accuracy of the release estimates. In this work, geostatistical inverse modeling is used in conjunction with Monte Carlo sampling of transport parameters to estimate groundwater contaminant release histories. Concentration non-negativity is enforced using a Gibbs sampling algorithm based on a truncated normal distribution. The method is applied to two one-dimensional test cases: a hypothetical dataset commonly used in validating contaminant source identification methods, and data collected from a tetrachloroethylene and trichloroethylene plume at the Dover Air Force Base in Delaware. The estimated release histories and associated uncertainties are compared to results from a geostatistical inverse model where uncertainty in transport parameters is ignored. Results show that the a posteriori uncertainty associated with the model that accounts for parameter uncertainty is higher, but that this model provides a more realistic representation of the release history based on available data. This modified inverse modeling technique has many applications, including assignment of liability in groundwater contamination cases, characterization of groundwater contamination, and model calibration.
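The non-negativity constraint enforced through a truncated normal distribution can be illustrated with a simple rejection sampler; a full Gibbs sampler would draw each concentration conditionally on the others, which is omitted here, and the mean and standard deviation below are invented for illustration.

```python
import random

def truncated_normal(mu, sigma, lower=0.0, rng=random):
    """Rejection sampler for a normal(mu, sigma) truncated to [lower, inf).
    This enforces non-negativity, as in the Gibbs step described above."""
    while True:
        x = rng.gauss(mu, sigma)
        if x >= lower:
            return x

rng = random.Random(42)  # seeded for reproducibility
samples = [truncated_normal(1.0, 2.0, rng=rng) for _ in range(2000)]
```

Note that truncation shifts the sample mean above the untruncated mean mu, since the negative tail is discarded; rejection sampling is adequate when the truncation point is not far into the tail, otherwise specialized samplers are preferred.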

  10. Video analysis applied to volleyball didactics to improve sport skills

    OpenAIRE

    Raiola, Gaetano; Parisi, Fabio; Giugno, Ylenia; Di Tore, Pio Alfredo

    2013-01-01

The feedback method is increasingly used in learning new skills and improving performance. "Recent research, however, showed that the more objective and quantitative the feedback is, the greater its effect on performance". Video analysis, which is the analysis of sports performance by watching video recordings, is used primarily to quantify the performance of athletes through notational analysis. It may be useful to combine the quantitative and qualitative analysis of the single ges...

  11. Applications of stochastic models and geostatistical analyses to study sources and spatial patterns of soil heavy metals in a metalliferous industrial district of China

    International Nuclear Information System (INIS)

An extensive soil survey was conducted to study pollution sources and delineate contamination of heavy metals in one of the metalliferous industrial bases in the karst areas of southwest China. A total of 597 topsoil samples were collected and the concentrations of five heavy metals, namely Cd, As (metalloid), Pb, Hg and Cr, were analyzed. Stochastic models including a conditional inference tree (CIT) and a finite mixture distribution model (FMDM) were applied to identify the sources and partition the contributions from natural and anthropogenic sources for heavy metals in topsoils of the study area. Regression trees for Cd, As, Pb and Hg were shown to depend mostly on indicators of anthropogenic activities such as industrial type and distance from the urban area, while the regression tree for Cr was found to be mainly influenced by geogenic characteristics. The FMDM analysis showed that the geometric means of the modeled background values for Cd, As, Pb, Hg and Cr were close to their background values previously reported for the study area, while contamination by Cd and Hg was widespread in the study area, imposing potentially detrimental effects on organisms through the food chain. Finally, the probabilities of single and multiple heavy metals exceeding the threshold values derived from the FMDM were estimated using indicator kriging (IK) and multivariate indicator kriging (MVIK). High probabilities of exceeding the thresholds were associated with metalliferous production and atmospheric deposition of heavy metals transported from the urban and industrial areas. Geostatistics coupled with stochastic models provides an effective way to delineate multiple heavy metal pollution and facilitate improved environmental management. - Highlights: • Conditional inference tree can identify variables controlling metal distribution. • Finite mixture distribution model can partition natural and anthropogenic sources. • Geostatistics with stochastic models

  12. Applications of stochastic models and geostatistical analyses to study sources and spatial patterns of soil heavy metals in a metalliferous industrial district of China

    Energy Technology Data Exchange (ETDEWEB)

    Zhong, Buqing; Liang, Tao, E-mail: liangt@igsnrr.ac.cn; Wang, Lingqing; Li, Kexin

    2014-08-15

An extensive soil survey was conducted to study pollution sources and delineate contamination of heavy metals in one of the metalliferous industrial bases in the karst areas of southwest China. A total of 597 topsoil samples were collected and the concentrations of five heavy metals, namely Cd, As (metalloid), Pb, Hg and Cr, were analyzed. Stochastic models including a conditional inference tree (CIT) and a finite mixture distribution model (FMDM) were applied to identify the sources and partition the contributions from natural and anthropogenic sources for heavy metals in topsoils of the study area. Regression trees for Cd, As, Pb and Hg were shown to depend mostly on indicators of anthropogenic activities such as industrial type and distance from the urban area, while the regression tree for Cr was found to be mainly influenced by geogenic characteristics. The FMDM analysis showed that the geometric means of the modeled background values for Cd, As, Pb, Hg and Cr were close to their background values previously reported for the study area, while contamination by Cd and Hg was widespread in the study area, imposing potentially detrimental effects on organisms through the food chain. Finally, the probabilities of single and multiple heavy metals exceeding the threshold values derived from the FMDM were estimated using indicator kriging (IK) and multivariate indicator kriging (MVIK). High probabilities of exceeding the thresholds were associated with metalliferous production and atmospheric deposition of heavy metals transported from the urban and industrial areas. Geostatistics coupled with stochastic models provides an effective way to delineate multiple heavy metal pollution and facilitate improved environmental management. - Highlights: • Conditional inference tree can identify variables controlling metal distribution. • Finite mixture distribution model can partition natural and anthropogenic sources. • Geostatistics with stochastic models
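The indicator-kriging step described in the abstract starts from an indicator transform: each observation becomes 1 if it exceeds the threshold and 0 otherwise, and the exceedance probability at an unsampled location is a weighted average of those indicators. The sketch below substitutes inverse-distance weights for true kriging weights to stay brief; coordinates, values, and the threshold are all invented for illustration.

```python
def exceedance_probability(samples, threshold, target, power=2.0):
    """Indicator-transform estimate of P(value > threshold) at `target`.
    Inverse-distance weights stand in for kriging weights for brevity.
    samples: list of ((x, y), value)."""
    num = den = 0.0
    for (x, y), v in samples:
        d2 = (x - target[0]) ** 2 + (y - target[1]) ** 2
        if d2 == 0.0:
            # Exact data location: indicator is known with certainty
            return 1.0 if v > threshold else 0.0
        w = 1.0 / d2 ** (power / 2)
        num += w * (1.0 if v > threshold else 0.0)
        den += w
    return num / den

# Hypothetical metal concentrations at four sample points
obs = [((0, 0), 0.8), ((1, 0), 1.4), ((0, 1), 1.2), ((2, 2), 0.3)]
p = exceedance_probability(obs, threshold=1.0, target=(0.5, 0.5))
```

The weighted average of 0/1 indicators always lands in [0, 1], so the output can be read directly as a probability map value, which is what makes IK/MVIK convenient for delineating contaminated zones.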

  13. Empirical modeling and data analysis for engineers and applied scientists

    CERN Document Server

    Pardo, Scott A

    2016-01-01

    This textbook teaches advanced undergraduate and first-year graduate students in Engineering and Applied Sciences to gather and analyze empirical observations (data) in order to aid in making design decisions. While science is about discovery, the primary paradigm of engineering and "applied science" is design. Scientists are in the discovery business and want, in general, to understand the natural world rather than to alter it. In contrast, engineers and applied scientists design products, processes, and solutions to problems. That said, statistics, as a discipline, is mostly oriented toward the discovery paradigm. Young engineers come out of their degree programs having taken courses such as "Statistics for Engineers and Scientists" without any clear idea as to how they can use statistical methods to help them design products or processes. Many seem to think that statistics is only useful for demonstrating that a device or process actually does what it was designed to do. Statistics courses emphasize creati...

  14. Least Weighted Squares Applied to Robust Image Analysis

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

Olomouc: Palacký University, 2011 - (Fišerová, E.; Talašová, J.). s. 31-31 ISBN 978-80-244-2684-6. [ODAM 2011. Olomoucian Days of Applied Mathematics International Conference. 26.01.2011-28.01.2011, Olomouc] R&D Projects: GA MŠk(CZ) 1M06014 Institutional research plan: CEZ:AV0Z10300504 Keywords : least weighted squares * robust estimation * multivariate statistics * high dimension Subject RIV: BB - Applied Statistics, Operational Research

  15. An Analysis of the Economy Principle Applied in Cyber Language

    Institute of Scientific and Technical Information of China (English)

    肖钰敏

    2015-01-01

With the development of network technology, cyber language, a new social dialect, is widely used in our life. The author analyzes how the economy principle is applied in cyber language from three aspects: word-formation, syntax and non-linguistic symbols. The author also collects, summarizes and analyzes the relevant language materials to prove the economy principle's real existence in chat rooms and to explain why the economy principle is applied so widely in cyber space.

  16. Geostatistic in Reservoir Characterization: from estimation to simulation methods

    Directory of Open Access Journals (Sweden)

    Mata Lima, H.

    2005-12-01

Full Text Available This article reviews the different geostatistical methods available to estimate and simulate petrophysical properties (porosity and permeability) of a reservoir. Different geostatistical techniques that allow the combination of hard and soft data are considered, and the main reasons to prefer geostatistical simulation over estimation are discussed. Uncertainty in reservoir characterization due to the variogram assumption is also treated: the variogram is a strict mathematical equation and can lead to serious oversimplification in the description of the natural processes or phenomena under consideration. Multiple-point geostatistics methods based on the concept of training images, suggested by Strebelle (2000) and Caers (2003) owing to the variogram's limitation in capturing complex heterogeneity, are another subject presented. This article intends to provide a review of geostatistical methods to serve the interest of students and researchers.
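The variogram that the article identifies as a limiting assumption is, in practice, estimated from data with the classical Matheron estimator: half the mean squared increment over all point pairs whose separation falls in a lag bin. A minimal sketch, with toy 1-D data invented for illustration:

```python
import math

def empirical_semivariogram(points, values, lags, tol):
    """Matheron estimator: gamma(h) = mean squared increment / 2 over
    all pairs whose separation distance is within `tol` of each lag."""
    gammas = []
    for h in lags:
        acc, n = 0.0, 0
        for i in range(len(points)):
            for j in range(i + 1, len(points)):
                d = math.dist(points[i], points[j])
                if abs(d - h) <= tol:
                    acc += (values[i] - values[j]) ** 2
                    n += 1
        gammas.append(acc / (2 * n) if n else float("nan"))
    return gammas

# Toy transect: six collinear points with a linear trend in the values
line = [(float(i), 0.0) for i in range(6)]
vals = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
gam = empirical_semivariogram(line, vals, lags=[1.0, 2.0], tol=0.1)
```

For a perfect linear trend the squared increments grow quadratically with lag (here 0.5 at lag 1, 2.0 at lag 2), which is the kind of unbounded behavior that signals a trend rather than a stationary field.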

  17. Signed directed social network analysis applied to group conflict

    DEFF Research Database (Denmark)

    Zheng, Quan; Skillicorn, David; Walther, Olivier

    2015-01-01

    are both positive and negative), can be combined. This combination is particularly appropriate for intelligence, terrorism, and law enforcement applications. We illustrate by applying the novel embedding technique to datasets describing conflict in North-West Africa, and show how unusual interactions...

  18. An applied general equilibrium model for Dutch agribusiness policy analysis.

    NARCIS (Netherlands)

    Peerlings, J.H.M.

    1993-01-01

    The purpose of this thesis was to develop a basic static applied general equilibrium (AGE) model to analyse the effects of agricultural policy changes on Dutch agribusiness. In particular the effects on inter-industry transactions, factor demand, income, and trade are of interest.The model is fairly

  19. System Analysis Applying to Talent Resource Development Research

    Institute of Scientific and Technical Information of China (English)

    WANG Peng-tao; ZHENG Gang

    2001-01-01

In talent resource development research, the most important elements of talent resource forecasting and optimization are the structure of the talent resource, the required numbers and the talent quality. The article establishes a factor reconstruction analysis forecast and a talent quality model based on system reconstruction analysis, which determines the most effective factor levels in the system, as presented by G. J. Klir and B. Jones, and performs a dynamic analysis of a worked example.

  20. How Has Applied Behavior Analysis and Behavior Therapy Changed?: An Historical Analysis of Journals

    Science.gov (United States)

    O'Donohue, William; Fryling, Mitch

    2007-01-01

    Applied behavior analysis and behavior therapy are now nearly a half century old. It is interesting to ask if and how these disciplines have changed over time, particularly regarding some of their key internal controversies (e.g., role of cognitions). We examined the first five years and the 2000-2004 five year period of the "Journal of Applied…

  1. Modal analysis applied to circular, rectangular, and coaxial waveguides

    Science.gov (United States)

    Hoppe, D. J.

    1988-01-01

    Recent developments in the analysis of various waveguide components and feedhorns using Modal Analysis (Mode Matching Method) are summarized. A brief description of the theory is presented, and the important features of the method are pointed out. Specific examples in circular, rectangular, and coaxial waveguides are included, with comparisons between the theory and experimental measurements. Extensions to the methods are described.

  2. An applied general equilibrium model for Dutch agribusiness policy analysis.

    OpenAIRE

    Peerlings, J.H.M.

    1993-01-01

    The purpose of this thesis was to develop a basic static applied general equilibrium (AGE) model to analyse the effects of agricultural policy changes on Dutch agribusiness. In particular the effects on inter-industry transactions, factor demand, income, and trade are of interest.The model is fairly general and could be used to analyse a great variety of agricultural policy changes. However, generality requires that the model should be adapted and extended for special research questions. This...

  3. A behavioral prescription for promoting applied behavior analysis within pediatrics.

    OpenAIRE

    Allen, K D; Barone, V J; Kuhn, B R

    1993-01-01

    In recent decades, pediatric medicine has undergone a shift in focus from infectious diseases to the effects of behavior on the health and development of children. At the same time, behavior analysts have increasingly evaluated the direct application of their technology to the development and maintenance of child health behavior. Unfortunately, applied behavior analysts have developed their technology parallel to, rather than collaboratively with, pediatricians and, as a result, are not recog...

  4. Multisectorial models applied to the environment: an analysis for catalonia

    OpenAIRE

    Pié Dols, Laia

    2010-01-01

The objective of this doctoral thesis is to apply the different multisectorial models available to analyse the impact that the introduction of policies designed to reduce greenhouse gas emissions and save energy would have on the Catalan economy, while at the same time improving the environmental competitiveness of both individual companies and the economy as a whole. For the purposes of this thesis I have analysed the six greenhouse gases that are regulated by the K...

  5. Analysis of OFDM Applied to Powerline High Speed Digital Communication

    Institute of Scientific and Technical Information of China (English)

    ZHUANG Jian; YANG Gong-xu

    2003-01-01

The low-voltage powerline is becoming a powerful solution for home networking, building automation, and internet access as a result of its wide distribution, easy access and low maintenance. The characteristics of the powerline channel are very complicated because it is an open network. This article analyses the characteristics of the powerline channel, introduces the basics of OFDM (Orthogonal Frequency Division Multiplexing), and studies OFDM applied to powerline high-speed digital communication.

  6. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    Science.gov (United States)

    Shortle, John F.; Allocco, Michael

    2005-01-01

    This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of applying engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and / or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

  7. Stochastic Local Interaction (SLI) model: Bridging machine learning and geostatistics

    Science.gov (United States)

    Hristopulos, Dionissios T.

    2015-12-01

    Machine learning and geostatistics are powerful mathematical frameworks for modeling spatial data. Both approaches, however, suffer from poor scaling of the required computational resources for large data applications. We present the Stochastic Local Interaction (SLI) model, which employs a local representation to improve computational efficiency. SLI combines geostatistics and machine learning with ideas from statistical physics and computational geometry. It is based on a joint probability density function defined by an energy functional which involves local interactions implemented by means of kernel functions with adaptive local kernel bandwidths. SLI is expressed in terms of an explicit, typically sparse, precision (inverse covariance) matrix. This representation leads to a semi-analytical expression for interpolation (prediction), which is valid in any number of dimensions and avoids the computationally costly covariance matrix inversion.
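For a single prediction location, the precision-matrix formulation described above reduces to the Gaussian conditional-mean identity z0 = -(1/Q00) * sum_j Q0j * zj; with off-diagonal entries Q0j = -k(d0j) built from a compactly supported kernel and Q00 equal to the sum of the kernel weights, this collapses to a kernel-weighted local average. The sketch below makes the simplifying assumptions of a zero-mean field and a fixed Epanechnikov-like kernel; the kernel choice, bandwidth, and data are illustrative, not the paper's.

```python
import math

def sli_predict(points, values, target, bandwidth):
    """SLI-flavored prediction: build one row of a sparse precision
    matrix from compactly supported kernel weights and apply the
    Gaussian conditional mean (zero-mean field assumed)."""
    ks = []
    for p in points:
        d = math.dist(p, target)
        # Quadratic kernel with compact support: k = 0 beyond the bandwidth,
        # which is what keeps the precision matrix sparse
        ks.append(max(0.0, 1.0 - (d / bandwidth) ** 2))
    q00 = sum(ks)
    if q00 == 0.0:
        return 0.0  # no data points interact within the bandwidth
    return sum(k * z for k, z in zip(ks, values)) / q00

z_hat = sli_predict([(0.0, 0.0), (1.0, 0.0), (5.0, 5.0)],
                    [2.0, 4.0, 100.0],
                    target=(0.5, 0.0), bandwidth=2.0)
```

The distant outlier at (5, 5) falls outside the kernel support and contributes nothing, illustrating how local interactions avoid the global covariance inversion of standard kriging.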

  8. 4th European Conference on Geostatistics for Environmental Applications

    CERN Document Server

    Carrera, Jesus; Gómez-Hernández, José

    2004-01-01

The fourth edition of the European Conference on Geostatistics for Environmental Applications (geoENV IV) took place in Barcelona, November 27-29, 2002. As proof of the increasing interest in environmental issues within the geostatistical community, the conference attracted over 100 participants, mostly Europeans (up to 10 European countries were represented), but also from other countries in the world. Only 46 contributions, selected out of around 100 submitted papers, were invited to be presented orally during the conference. Additionally, 30 authors were invited to present their work in poster format during a special session. All oral and poster contributors were invited to submit their work to be considered for publication in this Kluwer series. All papers underwent a reviewing process, which consisted of two reviewers for oral presentations and one reviewer for posters. The book opens with one keynote paper by Philippe Naveau. It is followed by 40 papers that correspond to those presented orally d...

  9. Multi-Criteria GIS Methodology Focused on the Location of Optimal Places for Small Hydro Power Via Hydrological and Geostatistic Aplications; Metodologia SIG Multicriterio Enfocada a la Localizacion de Enclaves Optimos para Centrales de Minihidroelectricas mediante Aplicaciones Hidrologicas y Geoestadisticas

    Energy Technology Data Exchange (ETDEWEB)

    Paz, C. de la

    2013-02-01

The main objective of this research is the development of a location methodology for siting optimization of small hydro power (SHP) centrals. To achieve this goal, a Multi-Criteria Evaluation (MCE) methodology was developed and implemented through tools in a GIS environment: spatial analysis, geostatistical analysis, and hydrology. This methodology includes two different models based on the same MCE process. The substantial difference between the two models lies in the input data and the tools applied to estimate the energy resource and the principal factor of the methodology (caudal, or accumulated flow). The first model is generated from caudal data obtained in the study area (El Bierzo), and the second one from pluviometric data and a Digital Terrain Model (DTM). Both models yield viability maps showing the areas best suited to locating SHP facilities. As an additional objective, the study allows the results of the two models to be contrasted in order to evaluate their similarity. (Author)

  10. Applying measurement-based probabilistic timing analysis to buffer resources

    OpenAIRE

    Kosmidis L.; Vardanega T.; Abella J.; Quinones E.; Cazorla F.J.

    2013-01-01

    The use of complex hardware makes it difficult for current timing analysis techniques to compute trustworthy and tight worst-case execution time (WCET) bounds. Those techniques require detailed knowledge of the internal operation and state of the platform, at both the software and hardware level. Obtaining that information for modern hardware platforms is increasingly difficult. Measurement-Based Probabilistic Timing Analysis (MBPTA) reduces the cost of acquiring the knowledge needed for comp...

  11. Toward farm-based policy analysis: concepts applied in Haiti

    OpenAIRE

    Martinez, Juan Carlos; Sain, Gustavo; Yates, Michael

    1991-01-01

    Many policies - on the delivery of inputs or on marketing systems, credit, or extension - influence the potential utilization of new technologies. Through 'farm-based policy analysis' it is possible to use data generated in on-farm research (OFR) to identify policy constraints to the use of new technologies, and to effectively communicate that information to policy makers. This paper describes a tentative framework for farm-based policy analysis and suggests a sequence of five steps for the a...

  12. Applying Galois compliance for data analysis in information systems

    Directory of Open Access Journals (Sweden)

    Kozlov Sergey

    2016-03-01

Full Text Available The article deals with data analysis in information systems. The author discloses the possibility of using Galois compliance to identify characteristics of the information system structure, and reveals the specifics of applying Galois compliance to the analysis of information system content with the use of invariants of graph theory. Aspects of introducing the mathematical apparatus of Galois compliance for research into the interrelations between elements of an adaptive training information system for individual testing are analyzed.

  13. Modeling Paradigms Applied to the Analysis of European Air Quality

    OpenAIRE

    Makowski, M.

    2000-01-01

    The paper presents an overview of various modeling paradigms applicable to the analysis of complex decision-making that can be represented by large non-linear models. Such paradigms are illustrated by their application to the analysis of a model that helps to identify and analyze various cost-effective policy options aimed at improving European air quality. Also presented is the application of this model to support intergovernmental negotiations.

  14. Geostatistic in the evaluation of a granitic formation as a nuclear repository site selection method

    International Nuclear Information System (INIS)

As a result of several preliminary investigations, an area of about 12 km2 was preselected at Sierra del Medio in northern Patagonia (Chubut Province) for the emplacement of a nuclear spent fuel repository. This detailed research included the following technical items: remote sensing imagery analysis, vertical photointerpretation, geophysical and geological surveys by field crews, petrographic classification of macro- and micro-samples, regional geomorphological and hydrogeological surveys, medium-scale photogrammetric mapping, large-scale field mapping, preliminary borehole drilling up to 280 meters deep with joint-discontinuity evaluations, chemical water and rock sampling, and studies of tectonics and modern volcanic activity, with the greatest attention given to seismological research. A joint census programme was developed through a regular sampling grid with systematically increased spatial density; the values obtained were extrapolated by means of geostatistically supported methods. This paper presents the geostatistical evaluation of surface rock fracture behaviour at the preliminarily selected site and the selection of a ''less fractured area'' of about 600x1000 meters. Furthermore, the location of four 700 m deep boreholes has been proposed as final repository-depth data gathering

  15. Spatial and temporal groundwater level variation geostatistical modeling in the city of Konya, Turkey.

    Science.gov (United States)

    Cay, Tayfun; Uyan, Mevlut

    2009-12-01

Groundwater is one of the most important resources used for drinking, utility and irrigation purposes in the city of Konya, Turkey, as in many areas. The purpose of this study is to evaluate spatial and temporal changes in the groundwater level by using geostatistical methods based on data from 91 groundwater wells during the period 1999 to 2003. Geostatistical methods have been used widely as a convenient tool for making decisions on the management of groundwater levels. To evaluate the spatial and temporal changes in the groundwater level, a vector-based geographic information system software package, ArcGIS 9.1 (Environmental Systems Research Institute, Redlands, California), was used to apply an ordinary kriging method, with cross-validation leading to the estimation of groundwater levels. The average range of the variogram (spherical model) used in the spatial analysis was approximately 2150 m. Ordinary kriging underestimated groundwater level drops by 17%. Cross-validation errors were within an acceptable level. The kriging model also helps to detect risk-prone areas for groundwater abstraction. PMID:20099631
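The ordinary-kriging workflow described above can be sketched minimally: fit a spherical variogram, assemble the kriging system with the unbiasedness constraint, solve for the weights, and combine the observations. The sketch uses the ~2150 m range reported in the abstract; the nugget, sill, well positions, and level values are invented for illustration.

```python
import math

def spherical(h, nugget=0.0, sill=1.0, range_m=2150.0):
    """Spherical variogram model (range ~2150 m as reported above)."""
    if h == 0.0:
        return 0.0
    if h >= range_m:
        return nugget + sill
    r = h / range_m
    return nugget + sill * (1.5 * r - 0.5 * r ** 3)

def solve(A, b):
    """Tiny Gauss-Jordan elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c and M[r][c] != 0.0:
                f = M[r][c] / M[c][c]
                for k in range(c, n + 1):
                    M[r][k] -= f * M[c][k]
    return [M[i][n] / M[i][i] for i in range(n)]

def ordinary_kriging(points, values, target):
    """Solve the OK system [gamma(xi,xj) 1; 1 0][w; mu] = [gamma(xi,x0); 1]."""
    n = len(points)
    A = [[spherical(math.dist(points[i], points[j])) for j in range(n)] + [1.0]
         for i in range(n)]
    A.append([1.0] * n + [0.0])  # unbiasedness: weights sum to 1
    b = [spherical(math.dist(p, target)) for p in points] + [1.0]
    w = solve(A, b)[:n]
    return sum(wi * z for wi, z in zip(w, values)), w

# Hypothetical wells on a 1 km spacing with made-up level values
pts = [(0.0, 0.0), (1000.0, 0.0), (2000.0, 0.0)]
lvl = [10.0, 12.0, 14.0]
est, w = ordinary_kriging(pts, lvl, (500.0, 0.0))
est_at_data, _ = ordinary_kriging(pts, lvl, (0.0, 0.0))
```

Two properties worth checking: the weights sum to one (the unbiasedness constraint), and with no nugget the predictor is exact at data locations, reproducing the observed value there.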

  16. Geostatistical Borehole Image-Based Mapping of Karst-Carbonate Aquifer Pores.

    Science.gov (United States)

    Sukop, Michael C; Cunningham, Kevin J

    2016-03-01

    Quantification of the character and spatial distribution of porosity in carbonate aquifers is important as input into computer models used in the calculation of intrinsic permeability and for next-generation, high-resolution groundwater flow simulations. Digital, optical, borehole-wall image data from three closely spaced boreholes in the karst-carbonate Biscayne aquifer in southeastern Florida are used in geostatistical experiments to assess the capabilities of various methods to create realistic two-dimensional models of vuggy megaporosity and matrix-porosity distribution in the limestone that composes the aquifer. When the borehole image data alone were used as the model training image, multiple-point geostatistics failed to detect the known spatial autocorrelation of vuggy megaporosity and matrix porosity among the three boreholes, which were only 10 m apart. Variogram analysis and subsequent Gaussian simulation produced results that showed a realistic conceptualization of horizontal continuity of strata dominated by vuggy megaporosity and matrix porosity among the three boreholes. PMID:26174850
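The "variogram analysis and subsequent Gaussian simulation" step can be sketched in its simplest unconditional form: build a covariance matrix from a permissible model, Cholesky-factor it, and multiply by independent standard normal draws. The exponential covariance model, range, and point layout below are illustrative assumptions, not the study's parameters, and real sequential Gaussian simulation additionally conditions on the borehole data, which is omitted here.

```python
import math
import random

def gaussian_simulation(points, variogram_range, sill=1.0, seed=7):
    """Unconditional Gaussian simulation via Cholesky: draw z ~ N(0, I)
    and return L z, where C = L L^T is the covariance matrix built from
    an exponential model C(h) = sill * exp(-3 h / range)."""
    rng = random.Random(seed)
    n = len(points)
    C = [[sill * math.exp(-3.0 * math.dist(points[i], points[j]) / variogram_range)
          for j in range(n)] for i in range(n)]
    # Cholesky factorization C = L L^T (lower triangular)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            L[i][j] = math.sqrt(C[i][i] - s) if i == j else (C[i][j] - s) / L[j][j]
    z = [rng.gauss(0.0, 1.0) for _ in range(n)]
    return [sum(L[i][k] * z[k] for k in range(i + 1)) for i in range(n)]

line_pts = [(float(i), 0.0) for i in range(5)]
field = gaussian_simulation(line_pts, variogram_range=10.0)
```

Cholesky simulation scales as O(n^3), which is why sequential approaches are preferred for grids of realistic size; the point here is only the link between a fitted covariance model and a correlated realization.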

  17. Proximal soil sensors and geostatistical tools in precision agriculture applications

    OpenAIRE

    Shaddad, Sameh

    2014-01-01

    Recognition of spatial variability is very important in precision agriculture applications. The use of proximal soil sensors and geostatistical techniques is highly recommended worldwide to detect spatial variation not only in fields but also within-field (micro-scale). This study involves, as a first step, the use of visible and near infrared (vis-NIR) spectroscopy to estimate soil key properties (6) and obtain high resolution maps that allow us to model the spatial variability in the soil. ...

  18. Applied network security monitoring collection, detection, and analysis

    CERN Document Server

    Sanders, Chris

    2013-01-01

    Applied Network Security Monitoring is the essential guide to becoming an NSM analyst from the ground up. This book takes a fundamental approach to NSM, complete with dozens of real-world examples that teach you the key concepts of NSM. Network security monitoring is based on the principle that prevention eventually fails. In the current threat landscape, no matter how much you try, motivated attackers will eventually find their way into your network. At that point, it is your ability to detect and respond to that intrusion that can be the difference between a small incident and a major di

  19. Inexpensive rf modeling and analysis techniques as applied to cyclotrons

    International Nuclear Information System (INIS)

    A review and expansion of the circuit analogy method of modeling and analysing multiconductor TEM mode rf resonators is described. This method was used to predict the performance of the NSCL K500 and K1200 cyclotron resonators and the results compared well to the measured performance. The method is currently being applied as the initial stage of the design process to optimize the performance of the rf resonators for a proposed K250 cyclotron for medical applications. Although this technique requires an experienced rf modeller, the input files tend to be simple and small, the software is very inexpensive or free, and the computer runtimes are nearly instantaneous

  20. Applied risk analysis to the future Brazilian electricity generation matrix

    Energy Technology Data Exchange (ETDEWEB)

    Maues, Jair; Fernandez, Eloi; Correa, Antonio

    2010-09-15

    This study compares energy conversion systems for the generation of electrical power, with an emphasis on the Brazilian energy matrix. The financial model applied in this comparison is based on the Portfolio Theory, developed by Harry Markowitz. The risk-return ratio related to the electrical generation mix predicted in the National Energy Plan - 2030, published in 2006 by the Brazilian Energy Research Office, is evaluated. The increase of non-traditional renewable energy in this expected electrical generating mix, specifically, residues of sugar cane plantations and wind energy, reduce not only the risk but also the average cost of the kilowatt-hour generated.
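Markowitz's mean-variance evaluation of a generation mix reduces to a dot product for the expected return and a quadratic form for the risk. The sketch below is a minimal illustration; the three-source mix, returns, and covariance matrix are entirely invented and do not reflect the Brazilian matrix data discussed in the abstract.

```python
import math

def portfolio_risk_return(weights, returns, cov):
    """Markowitz mean-variance: expected return w.r and risk sqrt(w' C w)."""
    mean = sum(w * r for w, r in zip(weights, returns))
    var = sum(weights[i] * cov[i][j] * weights[j]
              for i in range(len(weights)) for j in range(len(weights)))
    return mean, math.sqrt(var)

# Hypothetical mix: hydro, sugar-cane residue, wind (illustrative numbers only)
w = [0.6, 0.2, 0.2]
r = [0.08, 0.10, 0.12]
C = [[0.010, 0.002, 0.001],
     [0.002, 0.020, 0.000],
     [0.001, 0.000, 0.030]]
mean, risk = portfolio_risk_return(w, r, C)
```

Because the sources are weakly correlated, the portfolio risk comes out well below the weighted average of the individual standard deviations, which is the diversification effect the abstract attributes to adding residue and wind generation to the mix.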

  1. Fractal and geostatistical methods for modeling of a fracture network

    International Nuclear Information System (INIS)

The modeling of fracture networks is useful for fluid flow and rock mechanics studies. About 6600 fracture traces were recorded on drifts of a uranium mine in a granite massif. The traces have an extension of 0.20-20 m. The network was studied by fractal and by geostatistical methods but can be considered neither as a fractal with a constant dimension nor a set of purely randomly located fractures. Two kinds of generalization of conventional models can still provide more flexibility for the characterization of the network: (a) a nonscaling fractal model with variable similarity dimension (for a 2-D network of traces, the dimension varying from 2 for the 10-m scale to 1 for the centimeter scale); (b) a parent-daughter model with a regionalized density; the geostatistical study allows a 3-D model to be established where: fractures are assumed to be discs; fractures are grouped in clusters or swarms; and fracturation density is regionalized (with two ranges at about 30 and 300 m). The fractal model is easy to fit and to simulate along a line, but 2-D and 3-D simulations are more difficult. The geostatistical model is more complex, but easy to simulate, even in 3-D
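The parent-daughter model described above can be sketched as a Neyman-Scott style clustered point process: parent swarm centers are placed uniformly, and each scatters a set of daughter fracture centers around it. All parameters below are invented for illustration, and the disc-shaped fractures and regionalized density of the full 3-D model are omitted.

```python
import random

def parent_daughter(n_parents, daughters_per_parent, spread, extent, seed=1):
    """Neyman-Scott style cluster process in 2-D: uniform parent centers,
    each spawning Gaussian-scattered daughter points (fracture centers).
    A fixed daughter count per parent stands in for a Poisson count."""
    rng = random.Random(seed)
    pts = []
    for _ in range(n_parents):
        px, py = rng.uniform(0, extent), rng.uniform(0, extent)
        for _ in range(daughters_per_parent):
            pts.append((rng.gauss(px, spread), rng.gauss(py, spread)))
    return pts

# Illustrative parameters: 5 swarms of 20 fractures in a 300 m square,
# with a 10 m scatter around each swarm center
swarm = parent_daughter(n_parents=5, daughters_per_parent=20,
                        spread=10.0, extent=300.0)
```

The resulting point set is clustered rather than uniformly random, which matches the abstract's observation that the traces are neither a constant-dimension fractal nor purely random.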

  2. Fractal and geostatistical methods for modeling of a fracture network

    Energy Technology Data Exchange (ETDEWEB)

    Chiles, J.P.

    1988-08-01

    The modeling of fracture networks is useful for fluid flow and rock mechanics studies. About 6600 fracture traces were recorded on drifts of a uranium mine in a granite massif. The traces have an extension of 0.20-20 m. The network was studied by fractal and by geostatistical methods but can be considered neither as a fractal with a constant dimension nor a set of purely randomly located fractures. Two kinds of generalization of conventional models can still provide more flexibility for the characterization of the network: (a) a nonscaling fractal model with variable similarity dimension (for a 2-D network of traces, the dimension varying from 2 for the 10-m scale to 1 for the centimeter scale); (b) a parent-daughter model with a regionalized density; the geostatistical study allows a 3-D model to be established where: fractures are assumed to be discs; fractures are grouped in clusters or swarms; and fracturation density is regionalized (with two ranges at about 30 and 300 m). The fractal model is easy to fit and to simulate along a line, but 2-D and 3-D simulations are more difficult. The geostatistical model is more complex, but easy to simulate, even in 3-D.

  3. Large area forest inventory using Landsat ETM+: A geostatistical approach

    Science.gov (United States)

    Meng, Qingmin; Cieszewski, Chris; Madden, Marguerite

    Large area forest inventory is important for understanding and managing forest resources and ecosystems. Remote sensing, the Global Positioning System (GPS), and geographic information systems (GIS) provide new opportunities for forest inventory. This paper develops a new systematic geostatistical approach for predicting forest parameters, using integrated Landsat 7 Enhanced Thematic Mapper Plus (ETM+) images, GPS, and GIS. Forest parameters, such as basal area, height, health conditions, biomass, or carbon, can be incorporated as a response variable, and the geostatistical approach can be used to predict parameter values for uninventoried points. Using basal area as the response and Landsat ETM+ images of pine stands in Georgia as auxiliary data, this approach includes univariate kriging (ordinary kriging and universal kriging) and multivariable kriging (co-kriging and regression kriging). The combination of bands 4, 3, and 2, as well as the combination of bands 5, 4, and 3, the normalized difference vegetation index (NDVI), and principal components (PCs) were used in this study with co-kriging and regression kriging. Validation based on 200 randomly sampled field-inventory points withheld from model fitting was used to evaluate kriging performance and demonstrated that band combination 543 performed better than band combination 432, NDVI, and PCs. Regression kriging produced the smallest errors and the highest R-squared, making it the best geostatistical method for spatial prediction of pine basal area.
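
    Ordinary kriging, one of the univariate methods listed above, predicts an uninventoried point as a weighted sum of observations, with weights solved from a variogram-based linear system under a unit-sum constraint. A minimal sketch follows, with an assumed spherical variogram and made-up observation values (not the Georgia basal-area data).

```python
import numpy as np

def spherical_variogram(h, nugget=0.0, sill=1.0, rng=1.0):
    """Spherical variogram model; the parameters here are illustrative,
    not fitted to the study's data."""
    h = np.asarray(h, dtype=float)
    g = np.where(h < rng,
                 nugget + (sill - nugget) * (1.5 * h / rng - 0.5 * (h / rng) ** 3),
                 sill)
    return np.where(h == 0, 0.0, g)

def ordinary_krige(xy, z, xy0, variogram):
    """Ordinary kriging prediction at a single location xy0."""
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    # Kriging system with a Lagrange multiplier enforcing unit weight sum.
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = variogram(d)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = variogram(np.linalg.norm(xy - xy0, axis=1))
    w = np.linalg.solve(A, b)[:n]
    return float(w @ z)

xy = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
z = np.array([10.0, 12.0, 11.0, 13.0])
vario = lambda h: spherical_variogram(h, sill=2.0, rng=2.0)
print(ordinary_krige(xy, z, np.array([0.5, 0.5]), vario))
```

    At the symmetric center point all four weights are equal, so the prediction is the plain mean of the observations; regression kriging adds a trend fitted on auxiliary variables (e.g. ETM+ bands) and kriges the residuals.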

  4. Granger Causality Between Exports, Imports and GDP in France: Evidence from Using Geostatistical Models

    OpenAIRE

    Arshia Amiri; Ulf-G Gerdtham

    2012-01-01

    This paper introduces a new way of investigating linear and nonlinear Granger causality between exports, imports and economic growth in France over the period 1961-2006 using geostatistical models (kriging and inverse distance weighting). Geostatistical methods are standard methods for spatial forecasting and mapping in water engineering, environmental science, environmental pollution, mining, ecology, geology and geography. However, this is the first time that geostatistics knowledg...
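
    Inverse distance weighting, the second interpolator named above, is simpler than kriging: each prediction is an average of the observations weighted by inverse distance raised to a power. A minimal sketch with hypothetical station data:

```python
import numpy as np

def idw(xy, z, xy0, power=2.0):
    """Inverse distance weighted interpolation at one location xy0."""
    d = np.linalg.norm(np.asarray(xy, dtype=float) - np.asarray(xy0, dtype=float),
                       axis=1)
    if np.any(d == 0):                 # exact hit: return the observed value
        return float(np.asarray(z)[d == 0][0])
    w = 1.0 / d ** power
    return float(w @ np.asarray(z) / w.sum())

# Hypothetical observations at four stations (illustrative only).
xy = [(0, 0), (2, 0), (0, 2), (2, 2)]
z = [1.0, 3.0, 3.0, 5.0]
print(idw(xy, z, (1, 1)))  # equidistant from all stations -> plain mean, 3.0
```

    Larger `power` values localize the estimate around the nearest observations; `power=2` is the common default.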

  5. The colour analysis method applied to homogeneous rocks

    Science.gov (United States)

    Halász, Amadé; Halmai, Ákos

    2015-12-01

    Computer-aided colour analysis can facilitate cyclostratigraphic studies. Here we report on a case study involving the development of a digital colour analysis method for examination of the Boda Claystone Formation which is the most suitable in Hungary for the disposal of high-level radioactive waste. Rock type colours are reddish brown or brownish red, or any shade between brown and red. The method presented here could be used to differentiate similar colours and to identify gradual transitions between these; the latter are of great importance in a cyclostratigraphic analysis of the succession. Geophysical well-logging has demonstrated the existence of characteristic cyclic units, as detected by colour and natural gamma. Based on our research, colour, natural gamma and lithology correlate well. For core Ib-4, these features reveal the presence of orderly cycles with thicknesses of roughly 0.64 to 13 metres. Once the core has been scanned, this is a time- and cost-effective method.
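
    A digital colour analysis of a scanned core can be reduced to a per-depth mean colour curve, which is then smoothed so gradual transitions between similar shades become visible. The sketch below is a generic illustration on a synthetic image, not the cited study's method; the array shapes and parameter values are assumptions.

```python
import numpy as np

def depth_colour_log(core_image, window=5):
    """Collapse a scanned core image (rows = depth, columns = width, RGB)
    to a per-depth mean colour curve, then smooth it with a moving average
    so gradual reddish-brown / brownish-red transitions stand out."""
    profile = core_image.mean(axis=1)          # (depth, 3): mean RGB per row
    kernel = np.ones(window) / window
    return np.column_stack(
        [np.convolve(profile[:, c], kernel, mode='same') for c in range(3)])

# Synthetic stand-in for a scanned core: a gradual red-to-brown transition.
depth = np.linspace(0, 1, 100)
img = np.zeros((100, 20, 3))
img[:, :, 0] = 0.6                          # red channel constant with depth
img[:, :, 1] = 0.2 + 0.3 * depth[:, None]   # green rises with depth
print(depth_colour_log(img).shape)          # (100, 3)
```

    Cyclic units then appear as quasi-periodic oscillations in the smoothed curve, which can be cross-checked against natural gamma logs as the record describes.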

  6. Applying real options analysis to assess cleaner energy development strategies

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, Ching-Tsung [Graduate Institute of Environmental Engineering, National Taiwan University, 71, Chou-Shan Road, Taipei 106, Taiwan (China); Lo, Shang-Lien, E-mail: sllo@ntu.edu.tw [Graduate Institute of Environmental Engineering, National Taiwan University, 71, Chou-Shan Road, Taipei 106, Taiwan (China); Lin, Tyrone T. [Department of International Business, National Dong Hwa University, 1, Sec. 2, Da Hsueh Road, Shou-Feng, Hualien 974, Taiwan (China)

    2011-10-15

    The energy industry, which accounts for the largest portion of CO2 emissions, is facing the issue of compliance with the national clean energy policy. The methodology for evaluating the energy mix policy is crucial because of the lead time embedded in power generation facility investments and the uncertainty of future electricity demand. In this paper, a modified binomial model based on sequential compound options, which may account for the lead time and uncertainty as a whole, is established, and a numerical example evaluating the optional strategies and the strategic value of the cleaner energy policy is also presented. It is found that the optimal decision at some nodes in the binomial tree is path dependent, which differs from the standard sequential compound option model with a lead time or time-lag concept. The proposed modified binomial sequential compound real options model can be generalized and extensively applied to general decision problems that deal with the long lead time of many government policies as well as capital-intensive investments. - Highlights: > Introducing a flexible strategic management approach for government policy making. > Developing a modified binomial real options model based on sequential compound options. > Proposing an innovative model for managing long-term policy with lead time. > Applying to evaluate the options of various scenarios of cleaner energy strategies.

  7. Applying real options analysis to assess cleaner energy development strategies

    International Nuclear Information System (INIS)

    The energy industry, which accounts for the largest portion of CO2 emissions, is facing the issue of compliance with the national clean energy policy. The methodology for evaluating the energy mix policy is crucial because of the lead time embedded in power generation facility investments and the uncertainty of future electricity demand. In this paper, a modified binomial model based on sequential compound options, which may account for the lead time and uncertainty as a whole, is established, and a numerical example evaluating the optional strategies and the strategic value of the cleaner energy policy is also presented. It is found that the optimal decision at some nodes in the binomial tree is path dependent, which differs from the standard sequential compound option model with a lead time or time-lag concept. The proposed modified binomial sequential compound real options model can be generalized and extensively applied to general decision problems that deal with the long lead time of many government policies as well as capital-intensive investments. - Highlights: → Introducing a flexible strategic management approach for government policy making. → Developing a modified binomial real options model based on sequential compound options. → Proposing an innovative model for managing long-term policy with lead time. → Applying to evaluate the options of various scenarios of cleaner energy strategies.
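
    The backbone of such a model is backward induction on a binomial tree with risk-neutral probabilities. The sketch below values a single (European-style) option to invest; the paper's model chains such valuations into sequential compound options with lead time, which this simplified version deliberately omits. All numbers are illustrative.

```python
import numpy as np

def binomial_option_value(v0, invest, u, d, r, steps):
    """Value a (European-style) option to invest, payoff max(V - I, 0),
    on a recombining binomial tree of project value V, by backward
    induction with risk-neutral probabilities."""
    p = (1 + r - d) / (u - d)                  # risk-neutral up-probability
    j = np.arange(steps + 1)
    terminal = v0 * u ** (steps - j) * d ** j  # project values at maturity
    option = np.maximum(terminal - invest, 0.0)
    for _ in range(steps):
        option = (p * option[:-1] + (1 - p) * option[1:]) / (1 + r)
    return float(option[0])

# One-step check: V moves 100 -> 120 or 100 -> 80, investment cost 100.
print(binomial_option_value(100.0, 100.0, u=1.2, d=0.8, r=0.0, steps=1))  # 10.0
```

    A compound version would make the terminal payoff of one tree the value of the option priced on the next, which is where the path dependence noted in the abstract arises.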

  8. Apply Functional Modelling to Consequence Analysis in Supervision Systems

    DEFF Research Database (Denmark)

    Zhang, Xinxin; Lind, Morten; Gola, Giulio; Ravn, Ole

    2013-01-01

    causal relations between functions and goals. A rule base system can be developed to trace the causal relations and perform consequence propagations. This paper will illustrate how to use MFM for consequence reasoning by using rule base technology and describe the challenges for integrating functional...... consequence analysis to practical or online applications in supervision systems. It will also suggest a multiagent solution as the integration architecture for developing tools to facilitate the utilization results of functional consequence analysis. Finally a prototype of the multiagent reasoning system will...

  9. Reliability analysis of reactor systems by applying probability method

    International Nuclear Information System (INIS)

    The probability method chosen for analysing reactor system reliability is considered realistic since it is based on verified experimental data; in essence, it is a statistical method. The probability method developed takes into account the probability distributions of the permitted levels of relevant parameters and their particular influence on the reliability of the system as a whole. The proposed method is rather general and was applied to the thermal safety analysis of a reactor system. This analysis makes it possible to examine the basic properties of the system under different operating conditions; expressed in the form of probabilities, the results show the reliability of the system as a whole as well as the reliability of each component
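
    The approach described can be illustrated by Monte Carlo sampling: draw each relevant parameter from its distribution and estimate the probability that all stay within permitted levels. The parameter names, distributions and limits below are illustrative assumptions, not values from the study.

```python
import numpy as np

def system_reliability(n_samples=100_000, seed=0):
    """Monte Carlo estimate of the probability that every relevant
    parameter stays within its permitted level.  Parameter names,
    distributions and limits are illustrative assumptions."""
    rng = np.random.default_rng(seed)
    coolant_temp = rng.normal(300.0, 5.0, n_samples)   # degC
    clad_temp = rng.normal(550.0, 20.0, n_samples)     # degC
    within_limits = (coolant_temp < 320.0) & (clad_temp < 620.0)
    return within_limits.mean()

print(system_reliability())  # close to 1; the complement is the failure probability
```

    Per-component reliabilities come from the individual `< limit` conditions; the joint condition gives the system-level figure.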

  10. Applied Bibliometrics: Using Citation Analysis in the Journal Submission Process.

    Science.gov (United States)

    Robinson, Michael D.

    1991-01-01

    Discusses the use of citation analysis as an effective tool for scholars to determine what journals would be appropriate for publication of their work. Calculating citation distance is explained, and a study with economics journals is described that computed citation distance between previously published articles and journals in the field. (12…

  11. System analysis applied for controlling the quality of metallurgical rollers

    Directory of Open Access Journals (Sweden)

    L. Wojtynek

    2010-01-01

    Full Text Available In this work, system analysis is described for a foundry where a quality management system has been implemented. A generalized model of the foundry's production system is presented, taking the company's surroundings and a process-oriented approach into account.

  12. Analysis of modulated optical reflectance applied to magnetoelectric nanomaterials

    International Nuclear Information System (INIS)

    Structural and defectoscopic photothermal analysis with high spatial resolution is performed on ferromagnetic LSMO film samples. The modulated optical reflectance of the film surface at the laser focus is indicative of its magnetoelectric properties and is found to be proportional to the thermal variations of free-carrier density. (authors)

  13. USB apply to field X-ray fluorescence analysis

    International Nuclear Information System (INIS)

    This article analyzes the feasibility of applying USB and GPS to field X-ray fluorescence analysis, focuses on the hardware and firmware design of the USB interface and multi-channel analyzer (MCA), and briefly discusses the device driver design and the PC application software design. (authors)

  14. Applying an Activity System to Online Collaborative Group Work Analysis

    Science.gov (United States)

    Choi, Hyungshin; Kang, Myunghee

    2010-01-01

    This study determines whether an activity system provides a systematic framework to analyse collaborative group work. Using an activity system as a unit of analysis, the research examined learner behaviours, conflicting factors and facilitating factors while students engaged in collaborative work via asynchronous computer-mediated communication.…

  15. Applying AI tools to operational space environmental analysis

    Science.gov (United States)

    Krajnak, Mike; Jesse, Lisa; Mucks, John

    1995-01-01

    The U.S. Air Force and National Oceanic and Atmospheric Administration (NOAA) space environmental operations centers are facing increasingly complex challenges in meeting the needs of their growing user community. These centers provide current space environmental information and short-term forecasts of geomagnetic activity. Recent advances in modeling and data access have provided sophisticated tools for making accurate and timely forecasts, but have introduced new problems associated with handling and analyzing large quantities of complex data. AI (Artificial Intelligence) techniques have been considered as potential solutions to some of these problems. Fielding AI systems has proven more difficult than expected, in part because of operational constraints. Using systems which have been demonstrated successfully in the operational environment will provide a basis for a useful data fusion and analysis capability. Our approach uses a general-purpose AI system already in operational use within the military intelligence community, called the Temporal Analysis System (TAS). TAS is an operational suite of tools supporting data processing, data visualization, historical analysis, situation assessment and predictive analysis. TAS includes expert system tools to analyze incoming events for indications of particular situations and to predict future activity. The expert system operates on a knowledge base of temporal patterns encoded using a knowledge representation called Temporal Transition Models (TTMs) and an event database maintained by the other TAS tools. The system also includes a robust knowledge acquisition and maintenance tool for creating TTMs using a graphical specification language. The ability to manipulate TTMs in a graphical format gives non-computer specialists an intuitive way of accessing and editing the knowledge base. To support space environmental analyses, we used TAS's ability to define domain-specific event analysis abstractions. The prototype system defines

  16. CARVEDILOL POPULATION PHARMACOKINETIC ANALYSIS - APPLIED VALIDATION PROCEDURE

    Directory of Open Access Journals (Sweden)

    Aleksandra Catić-Đorđević

    2013-09-01

    Full Text Available Carvedilol is a nonselective beta-blocker/alpha-1 blocker used for the treatment of essential hypertension, chronic stable angina, unstable angina and ischemic left ventricular dysfunction. The aim of this study was to describe a carvedilol population pharmacokinetic (PK) analysis as well as the validation of the analytical procedure, which is an important step in this approach. In contemporary clinical practice, population PK analysis is often more important than the standard PK approach in setting up a mathematical model that describes the PK parameters. It also includes the variables of particular importance in the drug's pharmacokinetics, such as sex, body mass, dosage, pharmaceutical form, pathophysiological state, disease associated with the organism or the presence of a specific polymorphism in the isoenzyme important for biotransformation of the drug. One of the most frequently used approaches in population PK analysis is nonlinear mixed-effects modeling (NONMEM). The analytical method used in the data collection period is of great importance for the implementation of a population PK analysis of carvedilol in order to obtain reliable data that can be useful in clinical practice. High performance liquid chromatography (HPLC) analysis of carvedilol is used to confirm the identity of the drug, provide quantitative results and monitor the efficacy of the therapy. Analytical procedures used in other studies could not be fully implemented in our research, as it was necessary to perform certain modifications and validation of the method with the aim of using the obtained results for a population pharmacokinetic analysis. The validation process is a logical terminal phase of analytical procedure development that ensures the applicability of the procedure itself. The goal of validation is to ensure consistency of the method and accuracy of results, and to confirm the selection of the analytical method for a given sample
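
    The structural backbone of such a population PK analysis is a compartmental model with log-normal between-subject variability on parameters like clearance and volume. The sketch below simulates profiles from a one-compartment oral-absorption model; all parameter values are illustrative assumptions, not carvedilol estimates from the study.

```python
import numpy as np

def simulate_population_pk(n_subjects=50, dose=25.0, seed=1):
    """Simulate concentration-time profiles from a one-compartment oral
    model with log-normal between-subject variability on clearance (CL)
    and volume (V) -- the structural + random-effects form a NONMEM
    analysis would fit.  All parameter values are illustrative."""
    rng = np.random.default_rng(seed)
    t = np.linspace(0.25, 24, 48)                        # hours
    cl = 30.0 * np.exp(rng.normal(0, 0.3, n_subjects))   # L/h, typical 30
    v = 100.0 * np.exp(rng.normal(0, 0.2, n_subjects))   # L, typical 100
    ka = 1.5                                             # 1/h absorption rate
    ke = cl / v                                          # 1/h elimination rate
    conc = (dose * ka / (v[:, None] * (ka - ke[:, None]))
            * (np.exp(-ke[:, None] * t) - np.exp(-ka * t)))
    return t, conc

t, conc = simulate_population_pk()
print(conc.shape)  # (50, 48): one concentration-time profile per subject
```

    A population analysis runs this logic in reverse: given observed concentrations, it estimates the typical values, the between-subject variances, and covariate effects (sex, body mass, genotype) on CL and V.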

  17. Computational modeling applied to stress gradient analysis for metallic alloys

    International Nuclear Information System (INIS)

    Nowadays, composite materials, including materials reinforced by particles, are the center of researchers' attention. There are problems with stress measurements in these materials, connected with the superficial stress gradient caused by the difference in stress state between particles on the surface and in the matrix of the composite material. Computer simulation of the diffraction profile formed by the superficial layers of the material makes it possible to simulate the diffraction experiment and to resolve the problem of stress measurements when the stress state is characterized by a strong gradient. The aim of this paper is the application of a computer simulation technique, initially developed for homogeneous materials, to diffraction line simulation of composite materials and alloys. Specifically, we applied this technique to silumin fabricated by powder metallurgy. (author)

  18. Diffusing wave spectroscopy applied to material analysis and process control

    International Nuclear Information System (INIS)

    Diffusing Wave Spectroscopy (DWS) was studied as a method of laboratory analysis of sub-micron particles, and developed as a prospective in-line, industrial, process control sensor capable of near real-time feedback. No sample pre-treatment was required and measurement was via a non-invasive, flexible, dip-in probe. DWS relies on the concept of the diffusive migration of light, as opposed to the ballistic scatter model used in conventional dynamic light scattering. The specific requirements of the optoelectronic hardware, data analysis methods and light scattering model were studied experimentally and, where practical, theoretically, resulting in a novel technique for the analysis of particle suspensions and emulsions of volume fractions between 0.01 and 0.4. Operation at high concentrations made the technique insensitive to dust and contamination. The pure homodyne (autodyne) experimental arrangement described was resilient to environmental disturbances, unlike many other systems which utilise optical fibres or heterodyne operation. Pilot and subsequent prototype development led to a highly accurate method of size ranking, suitable for analysis of a wide range of suspensions and emulsions. The technique was shown to operate on real industrial samples with statistical variance as low as 0.3% with minimal software processing. Whilst the application studied was the analysis of TiO2 suspensions, a diverse range of materials including polystyrene beads, cell pastes and industrial cutting-fluid emulsions were tested. Results suggest that, whilst all sizing should be comparative to suitable standards, concentration effects may be minimised and even completely modelled out in many applications. Adhesion to the optical probe was initially a significant problem but was minimised after the evaluation and use of suitable non-stick coating materials. Unexpected behaviour in the correlation in the region of short decay times led to consideration of the effects of rotational diffusion.
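
    The quantity a DWS correlator measures is the normalized intensity autocorrelation function g2(tau); in DWS its decay reflects the distribution of diffusive photon path lengths rather than single-scattering dynamics. A minimal sketch of the estimator on a synthetic intensity trace (not a real DWS signal) follows.

```python
import numpy as np

def intensity_autocorrelation(intensity, max_lag):
    """Normalized intensity autocorrelation
    g2(tau) = <I(t) I(t+tau)> / <I>^2, the quantity a correlator measures."""
    i = np.asarray(intensity, dtype=float)
    mean_sq = i.mean() ** 2
    return np.array([np.mean(i[:len(i) - lag] * i[lag:]) / mean_sq
                     for lag in range(max_lag)])

# Synthetic fluctuating intensity (illustrative stand-in for a DWS trace).
rng = np.random.default_rng(0)
i = 1.0 + 0.3 * rng.standard_normal(100_000)
g2 = intensity_autocorrelation(i, 10)
print(g2[0], g2[5])  # g2(0) > 1; uncorrelated noise decays to ~1
```

    For a real DWS measurement the shape of the decay, interpreted through the diffusive light-transport model, yields the particle dynamics and hence the size ranking described in the record.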

  19. Biomass changes and geostatistical analysis of Xiao Hinggan region in recent 30 years

    Institute of Scientific and Technical Information of China (English)

    毛学刚; 李明泽; 范文义; 姜欢欢

    2011-01-01

    Based on remote sensing data from three periods (the 1980s, the 1990s, and after 2000) and plot data from forest resource inventories over the same periods, the forest biomass of the Xiao Hinggan region was estimated using a remote sensing information model. Combining GIS and geostatistics, this paper studies the temporal changes, spatial autocorrelation and heterogeneity of forest biomass in the Xiao Hinggan region over the three periods. Results indicated that overall biomass fluctuated in the research area from the 1980s to the 2000s. Low-grade biomass, with relatively low biological value, was dominant in the 1980s, and low-value biomass showed a contiguous distribution with a high degree of spatial autocorrelation. However, the influence of random factors on medium and higher biomass increased, indicating continuously strengthening man-made interference. In the 1990s the dominant class in the study area was medium biomass, which evolved from the low-grade biomass dominant in the 1980s. The changes over those 10 years showed that overall biomass tended to recover: because the data were mainly concentrated in the late 1990s, when the Natural Forest Protection Project (NFPP) had been launched and was moving forest conditions in a good direction, the overall biological value increased. After 2000 the spatial autocorrelation of overall biomass in the research area was not high, but medium and higher biomass behaved similarly and changed evenly in every direction. Medium biomass was distributed widely, while high-value biomass occurred as small, fragmented patches; the spatial variability caused by random factors such as man-made disturbance was similar in magnitude to that caused by spatial autocorrelation, suggesting a trend toward stability.

  20. LAMQS analysis applied to ancient Egyptian bronze coins

    Science.gov (United States)

    Torrisi, L.; Caridi, F.; Giuffrida, L.; Torrisi, A.; Mondio, G.; Serafino, T.; Caltabiano, M.; Castrizio, E. D.; Paniz, E.; Salici, A.

    2010-05-01

    Some Egyptian bronze coins, dated to the 6th-7th centuries A.D., are analyzed through different physical techniques in order to compare their composition and morphology and to identify their origin and type of manufacture. The investigations were performed using micro-invasive analyses such as Laser Ablation and Mass Quadrupole Spectrometry (LAMQS), X-ray Fluorescence (XRF), Laser Induced Breakdown Spectroscopy (LIBS), Electronic (SEM) and Optical Microscopy, Surface Profile Analysis (SPA) and density measurements. Results indicate that the coins have a similar bulk composition, but significant differences are evident in the constituents of the patina, bulk alloy composition, isotopic ratios, density and surface morphology. The results are in agreement with the archaeological expectations, indicating that the coins were produced at two different Egyptian sites: Alexandria and Antinoupolis. A group of fake coins produced in Alexandria in the same historical period is also identified.

  1. Current Human Reliability Analysis Methods Applied to Computerized Procedures

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring

    2012-06-01

    Computerized procedures (CPs) are an emerging technology within nuclear power plant control rooms. While CPs have been implemented internationally in advanced control rooms, to date no US nuclear power plant has implemented CPs in its main control room (Fink et al., 2009). Yet CPs are a reality of new plant builds and are an area of considerable interest to existing plants, which see advantages in enhanced ease of use and easier records management, since the need to update hardcopy procedures is eliminated. The overall intent of this paper is to provide a characterization of human reliability analysis (HRA) issues for computerized procedures. It is beyond the scope of this document to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper serves as a review of current HRA as it may be used for the analysis and review of computerized procedures.

  2. LAMQS analysis applied to ancient Egyptian bronze coins

    Energy Technology Data Exchange (ETDEWEB)

    Torrisi, L., E-mail: lorenzo.torrisi@unime.i [Dipartimento di Fisica dell' Universita di Messina, Salita Sperone, 31, 98166 Messina (Italy); Caridi, F.; Giuffrida, L.; Torrisi, A. [Dipartimento di Fisica dell' Universita di Messina, Salita Sperone, 31, 98166 Messina (Italy); Mondio, G.; Serafino, T. [Dipartimento di Fisica della Materia ed Ingegneria Elettronica dell' Universita di Messina, Salita Sperone, 31, 98166 Messina (Italy); Caltabiano, M.; Castrizio, E.D. [Dipartimento di Lettere e Filosofia dell' Universita di Messina, Polo Universitario dell' Annunziata, 98168 Messina (Italy); Paniz, E.; Salici, A. [Carabinieri, Reparto Investigazioni Scientifiche, S.S. 114, Km. 6, 400 Tremestieri, Messina (Italy)

    2010-05-15

    Some Egyptian bronze coins, dated to the 6th-7th centuries A.D., are analyzed through different physical techniques in order to compare their composition and morphology and to identify their origin and type of manufacture. The investigations were performed using micro-invasive analyses such as Laser Ablation and Mass Quadrupole Spectrometry (LAMQS), X-ray Fluorescence (XRF), Laser Induced Breakdown Spectroscopy (LIBS), Electronic (SEM) and Optical Microscopy, Surface Profile Analysis (SPA) and density measurements. Results indicate that the coins have a similar bulk composition, but significant differences are evident in the constituents of the patina, bulk alloy composition, isotopic ratios, density and surface morphology. The results are in agreement with the archaeological expectations, indicating that the coins were produced at two different Egyptian sites: Alexandria and Antinoupolis. A group of fake coins produced in Alexandria in the same historical period is also identified.

  3. LAMQS analysis applied to ancient Egyptian bronze coins

    International Nuclear Information System (INIS)

    Some Egyptian bronze coins, dated to the 6th-7th centuries A.D., are analyzed through different physical techniques in order to compare their composition and morphology and to identify their origin and type of manufacture. The investigations were performed using micro-invasive analyses such as Laser Ablation and Mass Quadrupole Spectrometry (LAMQS), X-ray Fluorescence (XRF), Laser Induced Breakdown Spectroscopy (LIBS), Electronic (SEM) and Optical Microscopy, Surface Profile Analysis (SPA) and density measurements. Results indicate that the coins have a similar bulk composition, but significant differences are evident in the constituents of the patina, bulk alloy composition, isotopic ratios, density and surface morphology. The results are in agreement with the archaeological expectations, indicating that the coins were produced at two different Egyptian sites: Alexandria and Antinoupolis. A group of fake coins produced in Alexandria in the same historical period is also identified.

  4. Nested sampling applied in Bayesian room-acoustics decay analysis.

    Science.gov (United States)

    Jasa, Tomislav; Xiang, Ning

    2012-11-01

    Room-acoustic energy decays often exhibit single-rate or multiple-rate characteristics in a wide variety of rooms/halls. Both the energy decay order and decay parameter estimation are of practical significance in architectural acoustics applications, representing two different levels of Bayesian probabilistic inference. This paper discusses a model-based sound energy decay analysis within a Bayesian framework utilizing the nested sampling algorithm. The nested sampling algorithm is specifically developed to evaluate the Bayesian evidence required for determining the energy decay order with decay parameter estimates as a secondary result. Taking the energy decay analysis in architectural acoustics as an example, this paper demonstrates that two different levels of inference, decay model-selection and decay parameter estimation, can be cohesively accomplished by the nested sampling algorithm. PMID:23145609
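
    The Bayesian evidence that nested sampling computes is the likelihood integrated over the prior. For a one-parameter decay model that integral can be brute-forced on a grid, which makes the quantity concrete even though it does not scale the way nested sampling does; the sketch below uses that grid substitute on synthetic single-rate decay data, with assumed noise and prior ranges.

```python
import numpy as np

def log_evidence_single_decay(t, y, sigma, decay_rates):
    """Brute-force (grid) Bayesian evidence for a single-rate decay model
    y = exp(-k t), integrating a Gaussian likelihood over a uniform prior
    on k.  Nested sampling evaluates the same quantity far more
    efficiently, especially for multi-rate decay models."""
    k = np.asarray(decay_rates)
    resid = y[None, :] - np.exp(-k[:, None] * t[None, :])
    loglike = (-0.5 * np.sum((resid / sigma) ** 2, axis=1)
               - len(t) * np.log(sigma * np.sqrt(2 * np.pi)))
    # Uniform prior: evidence = mean likelihood over the grid (log-sum-exp).
    m = loglike.max()
    return float(m + np.log(np.mean(np.exp(loglike - m))))

rng = np.random.default_rng(0)
t = np.linspace(0, 2, 40)
y = np.exp(-1.3 * t) + rng.normal(0, 0.02, t.size)
grid = np.linspace(0.1, 3.0, 300)
print(log_evidence_single_decay(t, y, 0.02, grid))
```

    Comparing such evidences between single-rate and multi-rate models is the first (model-selection) level of inference in the record; the posterior over k within the chosen model is the second (parameter-estimation) level.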

  5. Thermodynamic analysis applied to a food-processing plant

    Energy Technology Data Exchange (ETDEWEB)

    Ho, J.C.; Chandratilleke, T.T.

    1987-01-01

    Two production lines of a multi-product, food-processing plant are selected for energy auditing and analysis. Thermodynamic analysis showed that the first-law and second-law efficiencies are 81.5% and 26.1% for the instant-noodles line and 23.6% and 7.9% for the malt-beverage line. These efficiency values are dictated primarily by the major energy-consuming sub-processes of each production line. Improvements in both first-law and second-law efficiencies are possible for the plant if steam heating is replaced by gaseous or liquid fuels, the steam ejectors used to create vacuum are replaced by a mechanical pump, and the cooler surroundings are employed to assist in the cooling process.
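
    The two efficiency figures quoted are straightforward ratios: first-law efficiency divides useful energy out by energy in, while second-law efficiency divides useful exergy (available work) by exergy in. A trivial sketch, using the instant-noodles line's 81.5% and 26.1% as the worked numbers (the absolute kW values are illustrative):

```python
def first_law_efficiency(useful_energy, energy_input):
    """Ratio of useful energy delivered to energy input."""
    return useful_energy / energy_input

def second_law_efficiency(useful_exergy, exergy_input):
    """Ratio of useful exergy (available work) to exergy input; for heating
    duties it falls well below the first-law figure, since low-grade heat
    carries little exergy."""
    return useful_exergy / exergy_input

# Illustrative scale: per 100 kW of input, 81.5 kW of heat is delivered,
# of which only 26.1 kW counts as exergy.
print(first_law_efficiency(81.5, 100.0))   # 0.815
print(second_law_efficiency(26.1, 100.0))  # 0.261
```

    The large gap between the two ratios is exactly why the record recommends direct fuel firing over steam for heating duties.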

  6. Environmental analysis applied to schools. Methodologies for data acquisition

    International Nuclear Information System (INIS)

    Environmental analysis is the basis of environmental management for organizations and is considered the first step in EMAS. It allows organizations to identify and deal with relevant issues and to gain a clear knowledge of their environmental performance. Schools can be counted among such organizations. Nevertheless, the complexity of environmental issues and applicable regulations makes this first step very difficult for a school that wants to implement an environmental management system (EMAS, ISO 14001, etc.). An instrument has therefore been defined that is simple yet complete and coherent with the reference standards, allowing schools to shape their own process for drawing up the initial environmental review. This instrument consists essentially of cards that, when completed, facilitate the drafting of the environmental analysis report

  7. Consumers' Demand for Pork Quality: Applying Semantic Network Analysis

    OpenAIRE

    Carola Grebitus; Maike Bruhn

    2006-01-01

    Consideration of consumers' demand for food quality entails several aspects. Quality itself is a complex and dynamic concept, and constantly evolving technical progress may cause changes in consumers' judgment of quality. To improve our understanding of the factors influencing the demand for quality, food quality must be defined and measured from the consumer's perspective (Cardello, 1995). The present analysis addresses the issue of food quality, focusing on pork—the food that respondents ...

  8. Positive behavior support: Expanding the application of applied behavior analysis

    OpenAIRE

    Anderson, Cynthia M.; Freeman, Kurt A.

    2000-01-01

    Positive behavior support (PBS) is an approach to providing services to individuals who exhibit challenging behavior. Since its inception in the early 1990s, PBS has received increasing attention from the behavior-analytic community. Some behavior analysts have embraced this approach, but others have voiced questions and concerns. In this paper we describe the framework of PBS and show that it is consistent with the tenets of behavior analysis. Also, we illustrate how the framework of PBS mig...

  9. Ion beam analysis techniques applied to large scale pollution studies

    International Nuclear Information System (INIS)

    Ion Beam Analysis (IBA) techniques are ideally suited to analysing the thousands of filter papers a year that may originate from a large scale aerosol sampling network. They are fast, multi-elemental and, for the most part, non-destructive, so other analytical methods such as neutron activation and ion chromatography can be performed afterwards. ANSTO, in collaboration with the NSW EPA, Pacific Power and the Universities of NSW and Macquarie, has established a large area fine aerosol sampling network covering nearly 80,000 square kilometres of NSW with 25 fine particle samplers. This network, known as ASP, was funded by the Energy Research and Development Corporation (ERDC) and commenced sampling on 1 July 1991. The cyclone sampler at each site has a 2.5 μm particle diameter cut-off and runs for 24 hours every Sunday and Wednesday, using one Gillman 25 mm diameter stretched Teflon filter for each day. These filters are ideal targets for ion beam analysis work. Currently ANSTO receives 300 filters per month from this network for analysis using its accelerator-based ion beam techniques on the 3 MV Van de Graaff accelerator. One week a month of accelerator time is dedicated to this analysis. Four simultaneous accelerator-based IBA techniques are used at ANSTO to analyse for the following 24 elements: H, C, N, O, F, Na, Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Cu, Ni, Co, Zn, Br and Pb. The IBA techniques proved invaluable in identifying sources of fine particles and their spatial and seasonal variations across the large area sampled by the ASP network. 3 figs

  10. Applying ABC analysis to the Navy's inventory management system

    OpenAIRE

    May, Benjamin

    2014-01-01

    Approved for public release; distribution is unlimited. ABC Analysis is an inventory categorization technique used to classify and prioritize inventory items in an effort to better allocate business resources. A items are defined as the inventory items considered extremely important to the business, requiring strict oversight and control. B items are important to the business, but do not require the tight controls and oversight required of the A items. C items are marginally important to the...
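
The cumulative-value classification described above can be sketched in a few lines. This is an illustrative reading of ABC analysis, not code from the thesis; the items, dollar values and 80%/95% cut-offs are invented defaults.

```python
# Illustrative ABC classification: rank items by annual dollar usage and
# assign classes by cumulative share of total value. The 80%/95% cut-offs
# are conventional defaults, not values prescribed by the thesis.

def abc_classify(usage, a_cut=0.80, b_cut=0.95):
    """usage: dict of item -> annual dollar usage. Returns item -> 'A'|'B'|'C'."""
    total = sum(usage.values())
    ranked = sorted(usage, key=usage.get, reverse=True)
    classes, cumulative = {}, 0.0
    for item in ranked:
        cumulative += usage[item] / total
        if cumulative <= a_cut:
            classes[item] = 'A'      # high-value items: strict oversight
        elif cumulative <= b_cut:
            classes[item] = 'B'      # moderate controls
        else:
            classes[item] = 'C'      # marginal items: loose controls
    return classes

inventory = {'valve': 55000, 'pump': 20000, 'gasket': 12000,
             'bolt': 7000, 'washer': 4000, 'label': 2000}
classes = abc_classify(inventory)
```

With these invented values, the two highest-usage items absorb 75% of the total and land in class A, the next two in B, and the rest in C.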

  11. Weighted gene coexpression network analysis strategies applied to mouse weight

    OpenAIRE

    Fuller, Tova F; Ghazalpour, Anatole; Aten, Jason E.; Drake, Thomas A; Lusis, Aldons J.; Horvath, Steve

    2007-01-01

    Systems-oriented genetic approaches that incorporate gene expression and genotype data are valuable in the quest for genetic regulatory loci underlying complex traits. Gene coexpression network analysis lends itself to identification of entire groups of differentially regulated genes—a highly relevant endeavor in finding the underpinnings of complex traits that are, by definition, polygenic in nature. Here we describe one such approach based on liver gene expression and genotype data from an ...

  12. Ion beam analysis techniques applied to large scale pollution studies

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, D.D.; Bailey, G.; Martin, J.; Garton, D.; Noorman, H.; Stelcer, E.; Johnson, P. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1993-12-31

    Ion Beam Analysis (IBA) techniques are ideally suited to analysing the thousands of filter papers a year that may originate from a large scale aerosol sampling network. They are fast, multi-elemental and, for the most part, non-destructive, so other analytical methods such as neutron activation and ion chromatography can be performed afterwards. ANSTO, in collaboration with the NSW EPA, Pacific Power and the Universities of NSW and Macquarie, has established a large area fine aerosol sampling network covering nearly 80,000 square kilometres of NSW with 25 fine particle samplers. This network, known as ASP, was funded by the Energy Research and Development Corporation (ERDC) and commenced sampling on 1 July 1991. The cyclone sampler at each site has a 2.5 μm particle diameter cut-off and runs for 24 hours every Sunday and Wednesday, using one Gillman 25 mm diameter stretched Teflon filter for each day. These filters are ideal targets for ion beam analysis work. Currently ANSTO receives 300 filters per month from this network for analysis using its accelerator-based ion beam techniques on the 3 MV Van de Graaff accelerator. One week a month of accelerator time is dedicated to this analysis. Four simultaneous accelerator-based IBA techniques are used at ANSTO to analyse for the following 24 elements: H, C, N, O, F, Na, Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Cu, Ni, Co, Zn, Br and Pb. The IBA techniques proved invaluable in identifying sources of fine particles and their spatial and seasonal variations across the large area sampled by the ASP network. 3 figs.

  13. Applying importance-performance analysis to evaluate banking service quality

    OpenAIRE

    André Luís Policani Freitas; Alline Sardinha Cordeiro Morais

    2012-01-01

    In an increasingly competitive market, the identification of the most important aspects and the measurement of service quality as perceived by the customers are important actions taken by organizations which seek the competitive advantage. In particular, this scenario is typical of Brazilian banking sector. In this context, this article presents an exploratory case study in which the Importance-Performance Analysis (IPA) was used to identify the strong and the weak points related to services ...

  14. Improving Credit Scorecard Modeling Through Applying Text Analysis

    Directory of Open Access Journals (Sweden)

    Omar Ghailan

    2016-04-01

    In credit card scoring and loan management, the prediction of an applicant's future behavior is an important decision-support tool and a key factor in reducing the risk of loan default. Many data mining and classification approaches have been developed for credit scoring. To the best of our knowledge, building a credit scorecard by analyzing the textual data in the application form has not been explored so far. This paper proposes a comprehensive credit scorecard modeling technique that improves credit scorecards through textual data analysis. The study uses a sample of loan application forms from a financial institution providing loan services in Yemen, which represents a real-world credit scoring and loan management situation. The sample contains a set of Arabic textual attributes describing the applicants. A credit scoring model based on text mining pre-processing and logistic regression techniques is proposed and evaluated through a comparison with a group of credit scorecard modeling techniques that use only the numeric attributes in the application form. The results show that adding textual attribute analysis achieves higher classification effectiveness and outperforms the traditional numeric-only data analysis techniques.
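
The core idea, augmenting numeric form fields with features derived from free text before fitting a logistic regression, can be sketched as below. This is a hedged toy illustration, not the paper's model: the applicants, words, labels, and the plain bag-of-words and gradient-descent choices are invented stand-ins (the study used Arabic text and its own pre-processing).

```python
import numpy as np

# Toy sketch: combine a numeric feature with bag-of-words counts from a
# free-text field, then fit logistic regression by batch gradient descent.
# All applicants, words and labels are invented for illustration.
texts = ['stable salary government job', 'no collateral unpaid debts',
         'stable business owner', 'unpaid loans no salary']
numeric = np.array([[0.8], [0.3], [0.7], [0.2]])   # e.g. scaled income
labels = np.array([1.0, 0.0, 1.0, 0.0])            # 1 = loan repaid

vocab = sorted({w for t in texts for w in t.split()})
bow = np.array([[t.split().count(w) for w in vocab] for t in texts], float)
X = np.hstack([numeric, bow, np.ones((len(texts), 1))])  # bias column

w = np.zeros(X.shape[1])
for _ in range(2000):                         # plain batch gradient descent
    p = 1.0 / (1.0 + np.exp(-X @ w))          # predicted repayment probability
    w -= 0.5 * X.T @ (p - labels) / len(labels)

p = 1.0 / (1.0 + np.exp(-X @ w))
```

On this separable toy sample the fitted scores separate the repaid and defaulted applications, which is the effect the paper reports when textual attributes are added.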

  15. Integration of dynamical data in a geostatistical model of reservoir; Integration des donnees dynamiques dans un modele geostatistique de reservoir

    Energy Technology Data Exchange (ETDEWEB)

    Costa Reis, L.

    2001-01-01

    In this thesis we have developed a methodology for the integrated characterization of heterogeneous reservoirs, from geologic modeling to history matching. This methodology is applied to the PBR reservoir, situated in the Campos Basin, offshore Brazil, which has been producing since June 1979. This work is an extension of two other theses concerning geologic and geostatistical modeling of the PBR reservoir from well data and seismic information. We extended the geostatistical litho-type model to the whole reservoir by using a particular approach to the non-stationary truncated Gaussian simulation method. This approach facilitated the application of the gradual deformation method to history matching. The main stages of the methodology for dynamic data integration in a geostatistical reservoir model are presented. We constructed a reservoir model, and the initial difficulties in the history matching led us to modify some choices in the geological, geostatistical and flow models. These difficulties show the importance of dynamic data integration in reservoir modeling. The petrophysical property assignment within the litho-types was done using well test data. We used an inversion procedure to evaluate the petrophysical parameters of the litho-types. Up-scaling is a necessary stage to reduce the flow simulation time. We compared several up-scaling methods and show that the passage from the fine geostatistical model to the coarse flow model should be done very carefully. The choice of the fitting parameter depends on the objective of the study. In the case of the PBR reservoir, where water is injected in order to improve the oil recovery, the water rate of the producing wells is directly related to the reservoir heterogeneity. Thus, the water rate was chosen as the fitting parameter. We obtained significant improvements in the history matching of the PBR reservoir, first by using a method we have proposed, called patchwork. This method allows us to build a coherent

  16. Framework for applying probabilistic safety analysis in nuclear regulation

    International Nuclear Information System (INIS)

    The traditional regulatory framework has served well to assure the protection of public health and safety. It has been recognized, however, that in a few circumstances this deterministic framework has led to extensive expenditure on matters that have little to do with the safe and reliable operation of the plant. The development of plant-specific PSAs has offered a new and powerful analytical tool for evaluating the safety of the plant. Using PSA insights as an aid to decision making in the regulatory process is now known as 'risk-based' or 'risk-informed' regulation. Numerous activities in the U.S. nuclear industry are focusing on applying this new approach to modify regulatory requirements. In addition, other approaches to regulation are in the developmental phase and are being evaluated. One is based on performance monitoring and results, and is known as performance-based regulation. The other, called the blended approach, combines traditional deterministic principles with PSA insights and performance results. (author)

  17. Image analysis technique applied to lock-exchange gravity currents

    OpenAIRE

    Nogueira, Helena; Adduce, Claudia; Alves, Elsa; Franca, Rodrigues Pereira Da; Jorge, Mario

    2013-01-01

    An image analysis technique is used to estimate the two-dimensional instantaneous density field of unsteady gravity currents produced by full-depth lock-release of saline water. An experiment reproducing a gravity current was performed in a 3.0 m long, 0.20 m wide and 0.30 m deep Perspex flume with horizontal smooth bed and recorded with a 25 Hz CCD video camera under controlled light conditions. Using dye concentration as a tracer, a calibration procedure was established for each pixel in th...

  18. Statistical analysis applied to safety culture self-assessment

    International Nuclear Information System (INIS)

    Interviews and opinion surveys are instruments used to assess the safety culture in an organization as part of the Safety Culture Enhancement Programme. Specific statistical tools are used to analyse the survey results. This paper presents an example of an opinion survey with the corresponding application of the statistical analysis and the conclusions obtained. Survey validation, frequency statistics, the Kolmogorov-Smirnov non-parametric test, Student's t-test and ANOVA means comparison tests, and the LSD post-hoc multiple comparison test are discussed. (author)

  19. Risk Analysis of the applied RFID system: Project Stolpen

    OpenAIRE

    Grunzke, Richard

    2007-01-01

    This thesis presents a risk analysis of an RFID system for a logistical application. The system works as follows: around Karlstad in Sweden there are three new weighing machines for lorries. The load weight is measured for the police, to control overweight, and for logistical reasons such as issuing invoices and optimising the supply chain. The lorries do not have to stop to be weighed; they have to drive slowly over the weighing machine, so the loss of time is minimal. The lorries will be i...

  20. Methods of economic analysis applied to fusion research. Final report

    International Nuclear Information System (INIS)

    In this and previous efforts ECON has provided economic assessment of a fusion research program. This phase of study focused on two tasks, the first concerned with the economics of fusion in an economy that relies heavily upon synthetic fuels, and the second concerned with the overall economic effects of pursuing soft energy technologies instead of hard technologies. This report is organized in two parts, the first entitled An Economic Analysis of Coproduction of Fusion-Electric Energy and Other Products, and the second entitled Arguments Associated with the Choice of Potential Energy Futures

  1. Laser induced breakdown spectroscopy (LIBS) applied to plutonium analysis

    International Nuclear Information System (INIS)

    A Laser Induced Breakdown Spectroscopy (LIBS) system has been developed specifically for the quantitative analysis of gallium in plutonium dioxide in support of the MOX fuel development program. The advantage of this system is no sample preparation and the capability to analyze extremely small samples. Success in this application has prompted an expansion of the technique to other areas, including determination of plutonium isotopic ratios. This paper will present recent results for gallium content in PuO2 after processing via thermally induced gallium removal (TIGR). Data will also be presented for the determination of the plutonium 239/240 isotopic ratio

  2. Applying temporal network analysis to the venture capital market

    Science.gov (United States)

    Zhang, Xin; Feng, Ling; Zhu, Rongqian; Stanley, H. Eugene

    2015-10-01

    Using complex network theory to study the investment relationships of venture capital firms has produced a number of significant results. However, previous studies have often neglected the temporal properties of those relationships, which in real-world scenarios play a pivotal role. Here we examine the time-evolving dynamics of venture capital investment in China by constructing temporal networks to represent (i) investment relationships between venture capital firms and portfolio companies and (ii) the syndication ties between venture capital investors. The evolution of the networks exhibits rich variations in centrality, connectivity and local topology. We demonstrate that a temporal network approach provides a dynamic and comprehensive analysis of real-world networks.
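
A minimal illustration of the snapshot idea, computing a firm's degree separately in each time window, might look as follows. The firm names and timestamped ties are invented; the paper's networks and centrality measures are much richer.

```python
from collections import defaultdict

# Hypothetical timestamped syndication ties between venture capital firms;
# grouping edges by year gives one network snapshot per year, and degree
# is then computed per snapshot rather than on the aggregated network.
edges = [
    (2008, 'VC1', 'VC2'), (2008, 'VC1', 'VC3'),
    (2009, 'VC1', 'VC2'), (2009, 'VC2', 'VC3'), (2009, 'VC3', 'VC4'),
    (2010, 'VC2', 'VC4'),
]

def degree_by_year(edges):
    snapshots = defaultdict(lambda: defaultdict(int))
    for year, a, b in edges:
        snapshots[year][a] += 1   # each tie raises both endpoints' degree
        snapshots[year][b] += 1
    return {year: dict(deg) for year, deg in snapshots.items()}

deg = degree_by_year(edges)
```

Tracking how a firm's per-snapshot degree rises and falls over the years is the simplest instance of the "rich variations in centrality" the abstract describes.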

  3. Principles of Micellar Electrokinetic Capillary Chromatography Applied in Pharmaceutical Analysis

    Directory of Open Access Journals (Sweden)

    Árpád Gyéresi

    2013-02-01

    Since its introduction, capillary electrophoresis has shown great potential in areas where electrophoretic techniques had rarely been used before, including the analysis of pharmaceutical substances. The large majority of pharmaceutical substances are neutral from an electrophoretic point of view, so separations by classic capillary zone electrophoresis, where separation is based on the differences between the analytes' own electrophoretic mobilities, are hard to achieve. Micellar electrokinetic capillary chromatography, a hybrid method that combines chromatographic and electrophoretic separation principles, extends the applicability of capillary electrophoretic methods to neutral analytes. In micellar electrokinetic capillary chromatography, surfactants are added to the buffer solution at concentrations above their critical micelle concentration, so micelles form; these micelles undergo electrophoretic migration like any other charged particle. The separation is based on the differential partitioning of an analyte between a two-phase system: the mobile aqueous phase and the micellar pseudostationary phase. The present paper aims to summarize the basic separation principles and practical applications of micellar electrokinetic capillary chromatography, with particular attention to those relevant to pharmaceutical analysis.

  4. Dynamical Systems Analysis Applied to Working Memory Data

    Directory of Open Access Journals (Sweden)

    Fidan Gasimova

    2014-07-01

    In the present paper we investigate weekly fluctuations in working memory capacity (WMC) assessed over a period of two years. We use dynamical systems analysis, specifically a second-order linear differential equation, to model weekly variability in WMC in a sample of 112 9th graders. In our longitudinal data we use a B-spline imputation method to deal with missing data. The results show a significant negative frequency parameter in the data, indicating a cyclical pattern in weekly memory updating performance across time. We use a multilevel modeling approach to capture individual differences in model parameters and find that a higher initial performance level and a slower improvement on the memory updating (MU) task are associated with a slower frequency of oscillation. Additionally, we conduct a simulation study examining the analysis procedure's performance using different numbers of B-spline knots and values of time delay embedding dimensions. Results show that the number of knots in the B-spline imputation influences accuracy more than the number of embedding dimensions.
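
The role of the frequency parameter can be illustrated with a small simulation of a second-order linear differential equation x''(t) = η·x(t) + ζ·x'(t): a negative η yields cycles with period 2π/√(−η). The parameter values and the semi-implicit Euler scheme below are illustrative choices, not taken from the study.

```python
# Sketch of the oscillator behind a second-order linear differential
# equation model, x''(t) = eta*x(t) + zeta*x'(t). A negative frequency
# parameter eta produces a cyclical trajectory; values are illustrative.
eta, zeta = -4.0, 0.0        # undamped case; angular frequency sqrt(-eta) = 2
dt, steps = 1e-3, 20000
x, v, t = 1.0, 0.0, 0.0
crossings = []               # times of downward zero crossings
for _ in range(steps):
    v += (eta * x + zeta * v) * dt   # semi-implicit Euler keeps energy bounded
    x_new = x + v * dt
    if x > 0 >= x_new:               # trajectory crosses zero going down
        crossings.append(t + dt)
    x = x_new
    t += dt

# consecutive downward crossings are one full cycle apart:
period = crossings[1] - crossings[0]   # should approximate 2*pi/sqrt(-eta) = pi
```

The estimated period matches the analytic value π to within the integration step, showing how a fitted η translates directly into an oscillation frequency.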

  5. High precision methods of neutron activation analysis applied to geochemistry

    International Nuclear Information System (INIS)

    Neutron activation analysis is a technique for measuring abundances of chemical elements which differs from other methods in that it is based upon nuclear reactions instead of chemistry. This characteristic has special relevance for geochemistry because of its inherent sensitivity for trace elements which cannot be reached by other methods. 99% of the earth's crust is made up of just 8 elements, whereas the remaining 1% must accommodate the 70-odd rock-building trace elements, of which about half can be measured by neutron activation analysis. In recent years there has been much interest in these trace elements because they encompass diverse chemical properties. The present discussion of the technique is based upon more than 15 years of experience at the Lawrence Berkeley Laboratory and The Hebrew University of Jerusalem. This is not meant to intimate that the practices in other laboratories do not merit attention. Perhaps our approach differs from other published work in the emphasis given to sources of error and learning how to control them

  6. Geostatistical and Fractal Characteristics of Soil Moisture Patterns from Plot to Catchment Scale Datasets

    Science.gov (United States)

    Korres, Wolfgang; Reichenau, Tim G.; Fiener, Peter; Koyama, Christian N.; Bogena, Heye R.; Cornelissen, Thomas; Baatz, Roland; Herbst, Michael; Diekkrüger, Bernd; Vereecken, Harry; Schneider, Karl

    2016-04-01

    Soil moisture and its spatio-temporal pattern is a key variable in hydrology, meteorology and agriculture. The aim of the current study is to analyze spatio-temporal soil moisture patterns of 9 datasets from the Rur catchment (Western Germany) with a total area of 2364 km², consisting of a low mountain range (forest and grassland) and a loess plain dominated by arable land. Data was acquired across a variety of land use types, on different spatial scales (plot to mesoscale catchment) and with different methods (field measurements, remote sensing, and modelling). All datasets were analyzed using the same methodology. In a geostatistical analysis sill and range of the theoretical variogram were inferred. Based on this analysis, three groups of datasets with similar characteristics in the autocorrelation structure were identified: (i) modelled and measured datasets from a forest sub-catchment (influenced by soil properties and topography), (ii) remotely sensed datasets from the cropped part of the total catchment (influenced by the land-use structure of the cropped area), and (iii) modelled datasets from the cropped part of the Rur catchment (influenced by large scale variability of soil properties). A fractal analysis revealed that soil moisture patterns of all datasets show a multi-fractal behavior (varying fractal dimensions, patterns are only self-similar over certain ranges of scales), with at least one scale break and generally high fractal dimensions (high spatial variability). Corresponding scale breaks were found in various datasets and the factors explaining these scale breaks are consistent with the findings of the geostatistical analysis. The joined analysis of the different datasets showed that small differences in soil moisture dynamics, especially at maximum porosity and wilting point in the soils, can have a large influence on the soil moisture patterns and their autocorrelation structure.
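
The sill-and-range analysis above rests on the empirical semivariogram; a minimal one-dimensional sketch is shown below. The transect positions and the sin-shaped "soil moisture" surrogate are invented for illustration and are not the study's data.

```python
import numpy as np

# Minimal empirical semivariogram sketch for a 1-D transect:
# gamma(h) = 0.5 * mean squared difference over all pairs whose
# separation falls in the lag bin. Positions and values are synthetic.
def empirical_variogram(pos, z, bin_edges):
    d = np.abs(pos[:, None] - pos[None, :])       # pairwise separations
    sq = 0.5 * (z[:, None] - z[None, :]) ** 2     # pairwise semivariances
    iu = np.triu_indices(len(pos), k=1)           # count each pair once
    d, sq = d[iu], sq[iu]
    gamma = []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        mask = (d >= lo) & (d < hi)
        gamma.append(sq[mask].mean())
    return np.array(gamma)

pos = np.linspace(0.0, 20.0, 400)   # synthetic transect positions
z = np.sin(pos)                     # smooth "soil moisture" surrogate
gamma = empirical_variogram(pos, z, np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0]))
```

For this smooth field the semivariance rises steadily with lag before levelling off, which is exactly the sill-and-range behaviour inferred from the theoretical variograms in the study.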

  7. Applying Hybrid-Quantity Analysis in the Asia Semiconductor Industry

    Directory of Open Access Journals (Sweden)

    Chin-Yuan Fan

    2013-08-01

    The semiconductor market has gradually transitioned from advanced countries to the Asia-Pacific region. Since the 1980s, Taiwan has been developing its own semiconductor industry and, after 20 years of effort, has become one of the world's major exporters of semiconductor products. Therefore, positioning Taiwan in relation to other countries for competitive advantage, as defined by technology and industrial development, requires a better understanding of the developmental trends of the semiconductor technology of major competing countries in the Asia-Pacific region. This can further provide the Taiwanese government with additional strategic development proposals. We used a combination of patent data and data-mining methods [multidimensional scaling (MDS) analysis and K-means clustering] to explore competing technological and strategic-group relationships within the semiconductor industry in the Asia-Pacific region. We assessed the relative technological advantages of various organizations and proposed additional technology development strategy recommendations to the Taiwanese semiconductor industry.
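
The clustering step can be illustrated with a plain 2-means pass in NumPy, as a stand-in for the K-means stage (the MDS projection and the real patent indicators are omitted). The organization vectors and the deterministic farthest-point initialization are illustrative assumptions.

```python
import numpy as np

# Sketch of the K-means clustering step only, on hypothetical 2-D
# patent-indicator vectors (one row per organization). Deterministic
# farthest-point initialization replaces the usual random seeding.
def two_means(X, iters=20):
    c0 = X[0]
    c1 = X[np.argmax(np.linalg.norm(X - c0, axis=1))]  # farthest point
    centers = np.stack([c0, c1]).astype(float)
    for _ in range(iters):
        # assign each organization to its nearest center
        labels = np.argmin(
            np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2), axis=1)
        for k in range(2):                     # recompute cluster centroids
            if np.any(labels == k):
                centers[k] = X[labels == k].mean(axis=0)
    return labels

X = np.array([[0.1, 0.2], [0.0, 0.0], [0.2, 0.1],    # strategic group 1
              [5.0, 5.1], [5.2, 4.9], [4.9, 5.0]])   # strategic group 2
labels = two_means(X)
```

On these two well-separated toy groups the pass recovers the intended strategic-group partition, mirroring how K-means separates organizations with similar patent profiles.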

  8. Applying importance-performance analysis to evaluate banking service quality

    Directory of Open Access Journals (Sweden)

    André Luís Policani Freitas

    2012-11-01

    In an increasingly competitive market, the identification of the most important aspects of service quality and its measurement as perceived by customers are important actions taken by organizations that seek competitive advantage. In particular, this scenario is typical of the Brazilian banking sector. In this context, this article presents an exploratory case study in which Importance-Performance Analysis (IPA) was used to identify the strong and weak points of the services provided by a bank. In order to check the reliability of the questionnaire, Cronbach's alpha and correlation analyses were used. The results are presented and some actions have been defined in order to improve the quality of services.
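
The IPA grid itself is simple to sketch: each attribute is placed in a quadrant by comparing its mean importance and mean performance against the grand means. The attribute names and ratings below are invented, not the bank's survey data.

```python
# Illustrative Importance-Performance Analysis grid. Attributes above the
# mean importance but below the mean performance are the "weak points" the
# method is designed to expose. Names and scores are invented.
def ipa_quadrants(scores):
    """scores: dict attribute -> (mean importance, mean performance)."""
    imp_mid = sum(i for i, _ in scores.values()) / len(scores)
    perf_mid = sum(p for _, p in scores.values()) / len(scores)
    quadrant = {}
    for attr, (imp, perf) in scores.items():
        if imp >= imp_mid and perf < perf_mid:
            quadrant[attr] = 'concentrate here'       # weak point
        elif imp >= imp_mid and perf >= perf_mid:
            quadrant[attr] = 'keep up the good work'  # strong point
        elif imp < imp_mid and perf < perf_mid:
            quadrant[attr] = 'low priority'
        else:
            quadrant[attr] = 'possible overkill'
    return quadrant

scores = {'queue time': (4.8, 2.1), 'ATM network': (4.5, 4.4),
          'branch decor': (2.0, 4.0), 'brochures': (2.2, 2.5)}
result = ipa_quadrants(scores)
```

Here "queue time" lands in the concentrate-here quadrant: customers rate it as highly important but poorly performed, which is the kind of finding that drives the improvement actions mentioned in the abstract.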

  9. Neutron activation analysis applied to nutritional and foodstuff studies

    International Nuclear Information System (INIS)

    Neutron Activation Analysis (NAA) has been successfully used on a regular basis in several areas of nutrition and foodstuffs. NAA has become an important and useful research tool due to the methodology's advantages. These include high accuracy, small sample quantities and no chemical treatment. This technique allows the determination of important elements directly related to human health. NAA also provides data concerning essential and toxic concentrations in foodstuffs and specific diets. In this paper some studies in the area of nutrition carried out at the Neutron Activation Laboratory of IPEN/CNEN-SP are presented: a Brazilian total diet study (nutritional element dietary intakes of the Sao Paulo state population), a study of trace elements in maternal milk, and the determination of essential trace elements in some edible mushrooms. (author)

  10. Sensitivity Analysis Applied in Design of Low Energy Office Building

    DEFF Research Database (Denmark)

    Heiselberg, Per; Brohus, Henrik

    2008-01-01

    Building performance can be expressed by different indicators, such as primary energy use, environmental load and/or indoor environmental quality, and a building performance simulation can provide the decision maker with a quantitative measure of the extent to which an integrated design solution satisfies the design requirements and objectives. In the design of sustainable buildings it is beneficial to identify the most important design parameters in order to develop alternative design solutions more efficiently, or to reach optimized design solutions. A sensitivity analysis makes it possible to identify the most important parameters in relation to building performance and to focus the design and optimization of sustainable buildings on these fewer, but most important, parameters. The sensitivity analyses will typically be performed at a reasonably early stage of the building design process, where it
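
A one-at-a-time screening, one common form of such a sensitivity analysis, can be sketched as follows. The surrogate energy model and its parameters are invented for illustration; the paper's analysis uses real building performance simulations.

```python
# Hedged sketch of one-at-a-time (OAT) sensitivity screening: perturb each
# design parameter by +/-10% around a baseline and rank parameters by the
# spread they induce in a toy annual-energy-use surrogate. The model and
# parameter names are invented, not the authors' simulation.
def energy_use(u_value, window_ratio, infiltration):
    # illustrative linear surrogate (kWh/m2/year), not a building simulation
    return 40.0 * u_value + 25.0 * window_ratio + 60.0 * infiltration

baseline = {'u_value': 0.3, 'window_ratio': 0.4, 'infiltration': 0.5}

def oat_sensitivity(model, base, rel=0.10):
    spread = {}
    for name in base:
        lo = dict(base); lo[name] = base[name] * (1 - rel)
        hi = dict(base); hi[name] = base[name] * (1 + rel)
        spread[name] = abs(model(**hi) - model(**lo))
    return sorted(spread, key=spread.get, reverse=True)

ranking = oat_sensitivity(energy_use, baseline)
```

The ranking identifies the parameter whose variation moves the performance indicator most, which is precisely the information used to focus design effort on the few most important parameters.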

  11. Downside Risk analysis applied to the Hedge Funds universe

    Science.gov (United States)

    Perelló, Josep

    2007-09-01

    Hedge Funds are considered one of the fastest-growing portfolio management sectors of the past decade. Optimal Hedge Fund management requires appropriate risk metrics. The classic CAPM theory and its Sharpe ratio fail to capture some crucial aspects due to the strongly non-Gaussian character of Hedge Fund statistics. A possible way out of this problem, while keeping the CAPM simplicity, is the so-called Downside Risk analysis. One important benefit lies in distinguishing between good and bad returns, that is, returns greater or lower than the investor's goal. We revisit the most popular Downside Risk indicators and provide new analytical results on them. We compute these measures using the Credit Suisse/Tremont Investable Hedge Fund Index data, with the Gaussian case as a benchmark. In this way, an unusual transversal reading of the existing Downside Risk measures is provided.
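
A basic Downside Risk indicator, the target semideviation, penalizes only returns below the investor's goal. The sketch below uses an invented return series and target; it is not the index data analyzed in the paper.

```python
import numpy as np

# Target semideviation: only returns below the investor's goal contribute
# to risk, unlike the symmetric standard deviation behind the Sharpe ratio.
# The monthly return series and the 1% target are invented.
def downside_deviation(returns, target=0.01):
    shortfall = np.minimum(returns - target, 0.0)   # only "bad" returns count
    return np.sqrt(np.mean(shortfall ** 2))

returns = np.array([0.03, -0.02, 0.01, 0.04, -0.05, 0.02])
dd = downside_deviation(returns)

# a Sortino-style ratio: excess return per unit of downside risk
sortino_like = (returns.mean() - 0.01) / dd
```

Because gains above the target are ignored, the same series can look acceptable under a symmetric volatility measure yet risky under the downside measure, which is the distinction between good and bad returns the abstract emphasizes.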

  12. Map on predicted deposition of Cs-137 in Spanish soils from geostatistical analyses

    International Nuclear Information System (INIS)

    Knowledge of the distribution of 137Cs deposition over Spanish mainland soils, along with geographical, physical and morphological terrain information, enables the 137Cs background content in soil to be established. This could be useful as a tool in a hypothetical situation involving an accidental radioactive discharge, or in soil erosion studies. A Geographic Information System (GIS) allows the gathering of all the mentioned information. In this work, gamma measurements of 137Cs in 34 Spanish mainland soils, rainfall data taken from 778 weather stations, soil types, and geographical and physical terrain information were input into a GIS. Geostatistical techniques were applied to interpolate values of 137Cs activity at unsampled places, obtaining prediction maps of 137Cs deposition. Geostatistical methods were used to model the spatial continuity of the data: through semivariance and cross-covariance functions the spatial correlation of such data can be studied and described. Ordinary and simple kriging techniques were carried out to map spatial patterns of 137Cs deposition, and ordinary and simple co-kriging were used to improve the prediction map through a second related variable, namely rainfall. To choose the best prediction map of 137Cs deposition, the spatial dependence of the variable, the correlation coefficient and the prediction errors were evaluated using the different models previously mentioned. The best result for the 137Cs deposition map was obtained when applying the co-kriging techniques. - Highlights: ► Implementation of 137Cs activity data, in Spanish soils, in a GIS. ► Prediction models of 137Cs fallout were performed with kriging techniques. ► More accurate prediction surfaces were obtained using co-kriging techniques. ► Rainfall is the second variable used in the co-kriging technique.
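
Ordinary kriging can be sketched in one dimension as below, assuming an exponential covariance model with invented sill and range (the study fitted its own variogram models to the Spanish 137Cs data). Note the Lagrange-multiplier row enforcing unit-sum weights, and that the predictor honours the data exactly at sample locations.

```python
import numpy as np

# Minimal ordinary-kriging sketch in 1-D. The exponential covariance model
# and its sill/range, and the sample activities, are invented stand-ins for
# the fitted variogram and measured 137Cs data of the study.
def ordinary_krige(xs, zs, x0, sill=1.0, rng=2.0):
    cov = lambda h: sill * np.exp(-np.abs(h) / rng)   # exponential model
    n = len(xs)
    K = np.ones((n + 1, n + 1))
    K[:n, :n] = cov(xs[:, None] - xs[None, :])        # data-data covariances
    K[n, n] = 0.0                 # Lagrange row/column enforcing sum(w) = 1
    k = np.append(cov(xs - x0), 1.0)                  # data-target covariances
    w = np.linalg.solve(K, k)[:n]                     # kriging weights
    return w @ zs

xs = np.array([0.0, 1.0, 3.0, 4.5])   # sample locations
zs = np.array([2.0, 2.5, 1.0, 1.5])   # invented activities at the samples
z_at_sample = ordinary_krige(xs, zs, 1.0)   # exact interpolation at a datum
z_between = ordinary_krige(xs, zs, 2.0)     # prediction at an unsampled place
```

Co-kriging extends the same system with cross-covariances to a secondary variable (rainfall in the study), which is what improved the deposition map.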

  13. Geostatistical Evaluation of Spring Water Quality in an Urbanizing Carbonate Aquifer

    Science.gov (United States)

    McGinty, A.; Welty, C.

    2003-04-01

    As part of an investigation of the impacts of urbanization on the hydrology and ecology of Valley Creek watershed near Philadelphia, Pennsylvania, we have analyzed the chemical composition of 110 springs to assess the relative influence of geology and anthropogenic activities on water quality. The 60 km^2 watershed is underlain by productive fractured rock aquifers composed of Cambrian and Ordovician carbonate rocks in the central valley and Cambrian crystalline and siliciclastic rocks (quartzite and phyllite) in the north and south hills that border the valley. All tributaries of the surface water system originate in the crystalline and siliciclastic hills. The watershed is covered by 17% impervious area and contains 6 major hazardous waste sites, one active quarrying operation and one golf course; 25% of the area utilizes septic systems for sewage disposal. We identified 172 springs, 110 of which had measurable flow rates ranging from 0.002 to 5 l/s. The mapped surficial geology appears as an anisotropic pattern, with long bands of rock formations paralleling the geographic orientation of the valley. Mapped development appears as a more isotropic pattern, characterized by isolated patches of land use that are not coincident with the evident geologic pattern. Superimposed upon these characteristics is a dense array of depressions and shallow sinkholes in the carbonate rocks, and a system of major faults at several formation contacts. We used indicator geostatistics to quantitatively characterize the spatial extent of the major geologic formations and patterns of land use. Maximum correlation scales for the rock types corresponded with strike direction and ranged from 1000 to 3000 m. Anisotropy ratios ranged from 2 to 4. Land-use correlation scales were generally smaller (200 to 500 m) with anisotropy ratios of around 1.2, i.e., nearly isotropic as predicted. Geostatistical analysis of spring water quality parameters related to geology (pH, specific conductance

  14. Applied climate-change analysis: the climate wizard tool.

    Directory of Open Access Journals (Sweden)

    Evan H Girvetz

    BACKGROUND: Although the message of "global climate change" is catalyzing international action, it is local and regional changes that directly affect people and ecosystems and are of immediate concern to scientists, managers, and policy makers. A major barrier preventing informed climate-change adaptation planning is the difficulty of accessing, analyzing, and interpreting climate-change information. To address this problem, we developed a powerful, yet easy to use, web-based tool called Climate Wizard (http://ClimateWizard.org) that provides non-climate specialists with simple analyses and innovative graphical depictions for conveying how climate has changed and is projected to change within specific geographic areas throughout the world. METHODOLOGY/PRINCIPAL FINDINGS: To demonstrate the Climate Wizard, we explored historic trends and future departures (anomalies) in temperature and precipitation globally, and within specific latitudinal zones and countries. We found the greatest temperature increases during 1951-2002 occurred in northern hemisphere countries (especially during January-April), but the latitude of greatest temperature change varied throughout the year, sinusoidally ranging from approximately 50 degrees N during February-March to 10 degrees N during August-September. Precipitation decreases occurred most commonly in countries between 0-20 degrees N, and increases mostly occurred outside of this latitudinal region. Similarly, a quantile ensemble analysis based on projections from 16 General Circulation Models (GCMs) for 2070-2099 identified the median projected change within countries, which showed both latitudinal and regional patterns in projected temperature and precipitation change. CONCLUSIONS/SIGNIFICANCE: The results of these analyses are consistent with those reported by the Intergovernmental Panel on Climate Change, but at the same time, they provide examples of how Climate Wizard can be used to explore regionally- and temporally

  15. Quantitative methods applied in the analysis of teenagers' problems

    Directory of Open Access Journals (Sweden)

    Constanţa Popescu

    2015-12-01

    Full Text Available The theme of the article is the study of teenagers' problems based on quantitative methods. The scientific approach is divided into two parts: a knowledge part and a practical part. In the first part we describe the problems of adolescents based on national and international literature, and in the second part we use quantitative methods (diagnosis, regression and investigation) to achieve an in-depth analysis of the topic addressed. Through the diagnosis we highlight changes in the number of adolescents as well as in their problems: poverty and delinquency. Regression functions are used to show the nature, direction and intensity of the relationship between a number of causal variables and the outcome variable. The investigation aims to identify the extent to which the cultural values of the country leave their mark on the perception of the importance of family and friends for teens. The main conclusion of the research is that although the number of Romanian teenagers has decreased, their problems still persist.

  16. Quantitative phase imaging applied to laser damage detection and analysis.

    Science.gov (United States)

    Douti, Dam-Bé L; Chrayteh, Mhamad; Aknoun, Sherazade; Doualle, Thomas; Hecquet, Christophe; Monneret, Serge; Gallais, Laurent

    2015-10-01

    We investigate phase imaging as a measurement method for laser damage detection and analysis of laser-induced modification of optical materials. Experiments have been conducted with a wavefront sensor based on lateral shearing interferometry associated with a high-magnification optical microscope. The system has been used for the in-line observation of optical thin films and bulk samples, laser irradiated under two different conditions: 500 fs pulses at 343 and 1030 nm, and millisecond to second irradiation with a CO2 laser at 10.6 μm. We investigate the measurement of the laser-induced damage threshold of optical materials by detection of phase changes and show that the technique achieves high sensitivity, with optical path difference measurements lower than 1 nm. Additionally, the quantitative information on the refractive index or surface modification of the samples under test that is provided by the system has been compared to classical metrology instruments used for laser damage or laser ablation characterization (an atomic force microscope, a differential interference contrast microscope, and an optical surface profiler). An accurate in-line measurement of the morphology of laser-ablated sites, from a few nanometers to hundreds of microns in depth, is shown. PMID:26479612
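
The sub-nanometre optical path sensitivity quoted above translates directly into a phase sensitivity via delta_phi = 2*pi*OPD/lambda. A minimal numeric check, with an illustrative 1 nm optical path difference at the record's 343 nm wavelength:

```python
import math

# Phase change corresponding to an optical path difference (OPD):
# delta_phi = 2*pi*OPD/lambda. Values are illustrative, not from the paper.
wavelength_nm = 343.0
opd_nm = 1.0                     # the ~1 nm sensitivity level quoted above

delta_phi = 2 * math.pi * opd_nm / wavelength_nm
print(f"{delta_phi * 1e3:.1f} mrad")   # ~18.3 mrad at 343 nm
```

So a 1 nm OPD corresponds to a phase resolution of roughly 18 mrad at the shorter irradiation wavelength.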

  17. Perturbation Method of Analysis Applied to Substitution Measurements of Buckling

    International Nuclear Information System (INIS)

    Calculations with two-group perturbation theory on substitution experiments with homogenized regions show that a condensation of the results into a one-group formula is possible, provided that a transition region is introduced in a proper way. In heterogeneous cores the transition region comes in as a consequence of a new cell concept. By making use of progressive substitutions the properties of the transition region can be regarded as fitting parameters in the evaluation procedure. The thickness of the region is approximately equal to the sum of (1/(1/τ + 1/L²))^(1/2) for the test and reference regions. Consequently a region where L² >> τ, e.g. D₂O, contributes with √τ to the thickness. In cores where τ >> L², e.g. H₂O assemblies, the thickness of the transition region is determined by L. Experiments on rod lattices in D₂O and on test regions of D₂O alone (where B² = -1/L²) are analysed. The lattice measurements, where the pitches differed by a factor of √2, gave excellent results, whereas the determination of the diffusion length in D₂O by this method was not quite successful. Even regions containing only one test element can be used in a meaningful way in the analysis.
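
The per-region contribution to the transition-region thickness and its two limits can be checked numerically. The τ and L² values below are illustrative only, chosen so that one case has L² >> τ (D₂O-like) and the other τ >> L² (H₂O-like); they are not the experiment's measured constants.

```python
import math

def migration_contribution(tau, L2):
    """One region's contribution to the transition-region thickness:
    (1/(1/tau + 1/L^2))**0.5, from the two-group condensation above."""
    return math.sqrt(1.0 / (1.0 / tau + 1.0 / L2))

# Illustrative values in cm^2 (not measured data):
tau_d2o, L2_d2o = 120.0, 1.0e4   # L^2 >> tau: contribution tends toward sqrt(tau)
tau_h2o, L2_h2o = 27.0, 8.0      # tau >> L^2: contribution tends toward L

print(migration_contribution(tau_d2o, L2_d2o))   # close to sqrt(120) ~ 10.9
print(migration_contribution(tau_h2o, L2_h2o))   # tending toward sqrt(8) ~ 2.8
```

The harmonic combination makes the smaller of τ and L² dominate, which is exactly the limiting behaviour the abstract describes for D₂O and H₂O cores.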

  18. Ion Beam Analysis applied to laser-generated plasmas

    International Nuclear Information System (INIS)

    This paper presents the research activity on Ion Beam Analysis methods performed at the Tandetron Laboratory (LT) of the Institute of Nuclear Physics AS CR, Rez, Czech Republic. Recently, many groups have been paying attention to implantation by laser-generated plasma. This process allows a controllable amount of energetic ions to be inserted into the surface layers of different materials, modifying the physical and chemical properties of the surface material. Different substrates are implanted by ions accelerated from plasma by the terawatt iodine laser, at a nominal intensity of 10¹⁵ W/cm², at the PALS Research Infrastructure AS CR, in the Czech Republic. This regime of laser-matter interaction generates multi-MeV proton beams and multi-charged ions that are tightly confined in time (hundreds of ps) and space (source radius of a few microns). These ion beams have a much lower transverse temperature, a much shorter duration and a much higher current than those obtainable from conventional accelerators. The implementation of proton and ion acceleration driven by ultra-short high-intensity lasers is exhibited by adopting suitable irradiation conditions as well as tailored targets. An overview of implanted targets and their morphological and structural characterizations is presented and discussed

  19. Ion Beam Analysis applied to laser-generated plasmas

    Science.gov (United States)

    Cutroneo, M.; Macková, A.; Havranek, V.; Malinsky, P.; Torrisi, L.; Kormunda, M.; Barchuk, M.; Ullschmied, J.; Dudzak, R.

    2016-04-01

    This paper presents the research activity on Ion Beam Analysis methods performed at the Tandetron Laboratory (LT) of the Institute of Nuclear Physics AS CR, Rez, Czech Republic. Recently, many groups have been paying attention to implantation by laser-generated plasma. This process allows a controllable amount of energetic ions to be inserted into the surface layers of different materials, modifying the physical and chemical properties of the surface material. Different substrates are implanted by ions accelerated from plasma by the terawatt iodine laser, at a nominal intensity of 10¹⁵ W/cm², at the PALS Research Infrastructure AS CR, in the Czech Republic. This regime of laser-matter interaction generates multi-MeV proton beams and multi-charged ions that are tightly confined in time (hundreds of ps) and space (source radius of a few microns). These ion beams have a much lower transverse temperature, a much shorter duration and a much higher current than those obtainable from conventional accelerators. The implementation of proton and ion acceleration driven by ultra-short high-intensity lasers is exhibited by adopting suitable irradiation conditions as well as tailored targets. An overview of implanted targets and their morphological and structural characterizations is presented and discussed.

  20. Applied and computational harmonic analysis on graphs and networks

    Science.gov (United States)

    Irion, Jeff; Saito, Naoki

    2015-09-01

    In recent years, the advent of new sensor technologies and social network infrastructure has provided huge opportunities and challenges for analyzing data recorded on such networks. In the case of data on regular lattices, computational harmonic analysis tools such as the Fourier and wavelet transforms have well-developed theories and proven track records of success. It is therefore quite important to extend such tools from the classical setting of regular lattices to the more general setting of graphs and networks. In this article, we first review basics of graph Laplacian matrices, whose eigenpairs are often interpreted as the frequencies and the Fourier basis vectors on a given graph. We point out, however, that such an interpretation is misleading unless the underlying graph is either an unweighted path or cycle. We then discuss our recent effort of constructing multiscale basis dictionaries on a graph, including the Hierarchical Graph Laplacian Eigenbasis Dictionary and the Generalized Haar-Walsh Wavelet Packet Dictionary, which are viewed as generalizations of the classical hierarchical block DCTs and the Haar-Walsh wavelet packets, respectively, to the graph setting. Finally, we demonstrate the usefulness of our dictionaries by using them to simultaneously segment and denoise 1-D noisy signals sampled on regular lattices, a problem where classical tools have difficulty.
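
The abstract's caveat that Laplacian eigenpairs behave as "frequencies" only on an unweighted path or cycle can be verified directly: for a path graph, the Laplacian eigenvalues are exactly 4·sin²(kπ/2n) and the eigenvectors form a DCT-like basis. A minimal numpy check:

```python
import numpy as np

n = 8
# Unweighted path graph: adjacency matrix A and combinatorial Laplacian L = D - A.
A = np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)
L = np.diag(A.sum(axis=1)) - A

evals, evecs = np.linalg.eigh(L)        # ascending eigenvalues

# For the path, eigenvalues are 4*sin^2(k*pi/(2n)), k = 0..n-1 -- the sense
# in which eigenpairs are "graph frequencies" and a Fourier basis.
expected = 4 * np.sin(np.arange(n) * np.pi / (2 * n)) ** 2
print(np.allclose(evals, expected))     # True
```

The zero eigenvalue (constant eigenvector) plays the role of the DC component; on general weighted graphs this clean frequency interpretation breaks down, which motivates the multiscale dictionaries the article constructs.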

  1. Statistical analysis and Kalman filtering applied to nuclear materials accountancy

    International Nuclear Information System (INIS)

    Much theoretical research has been carried out on the development of statistical methods for nuclear material accountancy. In practice, physical, financial and time constraints mean that the techniques must be adapted to give an optimal performance in plant conditions. This thesis aims to bridge the gap between theory and practice, to show the benefits to be gained from a knowledge of the facility operation. Four different aspects are considered; firstly, the use of redundant measurements to reduce the error on the estimate of the mass of heavy metal in an 'accountancy tank' is investigated. Secondly, an analysis of the calibration data for the same tank is presented, establishing bounds for the error and suggesting a means of reducing them. Thirdly, a plant-specific method of producing an optimal statistic from the input, output and inventory data, to help decide between 'material loss' and 'no loss' hypotheses, is developed and compared with existing general techniques. Finally, an application of the Kalman Filter to materials accountancy is developed, to demonstrate the advantages of state-estimation techniques. The results of the analyses and comparisons illustrate the importance of taking into account a complete and accurate knowledge of the plant operation, measurement system, and calibration methods, to derive meaningful results from statistical tests on materials accountancy data, and to give a better understanding of critical random and systematic error sources. The analyses were carried out on the head-end of the Fast Reactor Reprocessing Plant, where fuel from the prototype fast reactor is cut up and dissolved. However, the techniques described are general in their application. (author)
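
The state-estimation idea in the final part of the thesis can be illustrated with a scalar Kalman filter tracking an inventory through known transfers and noisy measurements. Everything below (plant size, noise levels, the no-loss scenario) is a made-up toy, not the Fast Reactor Reprocessing Plant data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy plant: true inventory evolves by known net transfers (inputs - outputs)
# and is measured each accounting period with noise; a scalar Kalman filter
# tracks the inventory estimate and its variance.
n_periods, meas_sd, proc_sd = 20, 2.0, 0.5
transfers = rng.normal(0.0, 1.0, n_periods)          # known net transfers (kg)
true_inv = 100.0 + np.cumsum(transfers)              # no-loss scenario
meas = true_inv + rng.normal(0.0, meas_sd, n_periods)

x, P = 100.0, 25.0                                   # prior estimate, variance
for u, z in zip(transfers, meas):
    x, P = x + u, P + proc_sd**2                     # predict using known transfer
    K = P / (P + meas_sd**2)                         # Kalman gain
    x, P = x + K * (z - x), (1 - K) * P              # update with measurement

print(f"final estimate {x:.1f} kg vs true {true_inv[-1]:.1f} kg (sd {P**0.5:.2f})")
```

A persistent gap between the filtered estimate and the book inventory implied by the transfers would be the statistic feeding a 'material loss' versus 'no loss' test; here the posterior variance shrinks well below the prior, which is the benefit the thesis attributes to state estimation.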

  2. Improving the flash flood frequency analysis applying dendrogeomorphological evidences

    Science.gov (United States)

    Ruiz-Villanueva, V.; Ballesteros, J. A.; Bodoque, J. M.; Stoffel, M.; Bollschweiler, M.; Díez-Herrero, A.

    2009-09-01

    Flash floods are one of the natural hazards that cause major damage worldwide. Especially in Mediterranean areas they provoke high economic losses every year. In mountain areas with high stream gradients, flood events are characterized by extremely high flow and debris transport rates. Flash flood analysis in mountain areas presents specific scientific challenges. On one hand, there is a lack of information on precipitation and discharge due to a lack of spatially well distributed gauge stations with long records. On the other hand, gauge stations may not record correctly during extreme events when they are damaged or the discharge exceeds the recordable level. In this case, no systematic data allow improvement of the understanding of the spatial and temporal occurrence of the process. Since historic documentation is normally scarce or even completely missing in mountain areas, tree-ring analysis can provide an alternative approach. Flash floods may influence trees in different ways: (1) tilting of the stem through the unilateral pressure of the flowing mass or individual boulders; (2) root exposure through erosion of the banks; (3) injuries and scars caused by boulders and wood transported in the flow; (4) decapitation of the stem and resulting candelabra growth through the severe impact of boulders; (5) stem burial through deposition of material. The trees react to these disturbances with specific growth changes such as abrupt change of the yearly increment and anatomical changes like reaction wood or callus tissue. In this study, we sampled 90 cross sections and 265 increment cores of trees heavily affected by past flash floods in order to date past events and to reconstruct recurrence intervals in two torrent channels located in the Spanish Central System. The first study site is located along the Pelayo River, a torrent in natural conditions. Based on the external disturbances of trees and their geomorphological position, 114 Pinus pinaster (Ait

  3. Exploratory Factor Analysis as a Construct Validation Tool: (Mis)applications in Applied Linguistics Research

    Science.gov (United States)

    Karami, Hossein

    2015-01-01

    Factor analysis has been frequently exploited in applied research to provide evidence about the underlying factors in various measurement instruments. A close inspection of a large number of studies published in leading applied linguistic journals shows that there is a misconception among applied linguists as to the relative merits of exploratory…

  4. The schism between experimental and applied behavior analysis: Is it real and who cares? 1

    OpenAIRE

    Poling, Alan; Picker, Mitchell; Grossett, Deborah; Hall-Johnson, Earl; Holbrook, Maurice

    1981-01-01

    This paper addresses the relationship between the experimental analysis of behavior and applied behavior analysis. Citation data indicate that across time the Journal of the Experimental Analysis of Behavior, and other experimental sources, have been referenced increasingly infrequently in the Journal of Applied Behavior Analysis, Behavior Therapy, and Behavior Research and Therapy. Such sources are now rarely cited in these journals, and never have been regularly referenced in Behavior Modif...

  5. Two-point versus multiple-point geostatistics: the ability of geostatistical methods to capture complex geobodies and their facies associations—an application to a channelized carbonate reservoir, southwest Iran

    Science.gov (United States)

    Hashemi, Seyyedhossein; Javaherian, Abdolrahim; Ataee-pour, Majid; Khoshdel, Hossein

    2014-12-01

    Facies models try to explain facies architectures, which have a primary control on the subsurface heterogeneities and the fluid flow characteristics of a given reservoir. In the process of facies modeling, geostatistical methods are implemented to integrate different sources of data into a consistent model. The facies models should describe facies interactions and the shape and geometry of the geobodies as they occur in reality. Two distinct categories of geostatistical techniques are two-point and multiple-point (geo)statistics (MPS). In this study, both of the aforementioned categories were applied to generate facies models. A sequential indicator simulation (SIS) and a truncated Gaussian simulation (TGS) represented the two-point geostatistical methods, and a single normal equation simulation (SNESIM) was selected as the MPS representative. The dataset from an extremely channelized carbonate reservoir located in southwest Iran was applied to these algorithms to analyze their performance in reproducing complex curvilinear geobodies. The SNESIM algorithm needs consistent training images (TI) in which all possible facies architectures that are present in the area are included. The TI model was founded on the data acquired from modern occurrences. These analogies delivered vital information about the possible channel geometries and facies classes that are typically present in such environments. The MPS results were conditioned to both soft and hard data. Soft facies probabilities were acquired from a neural network workflow. In this workflow, seismic-derived attributes were implemented as the input data. Furthermore, MPS realizations were conditioned to hard data to guarantee the exact positioning and continuity of the channel bodies. A geobody extraction workflow was implemented to extract the most certain parts of the channel bodies from the seismic data. These extracted parts of the channel bodies were applied to the simulation workflow as hard data.
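
The sequential indicator simulation (SIS) mentioned in this record can be sketched in a grossly simplified 1-D form: visit unsimulated cells in random order, estimate the facies probability from already-known neighbours (a crude stand-in for the full indicator-kriging weights), and draw the facies from that probability while honouring the hard data. All values below are invented.

```python
import numpy as np

rng = np.random.default_rng(4)
n, window = 100, 5
facies = np.full(n, -1)                     # -1 marks unsimulated cells
facies[[10, 50, 90]] = [1, 0, 1]            # hard data: conditioned cells

# Random simulation path over the unsimulated cells only.
for i in rng.permutation(np.where(facies < 0)[0]):
    lo, hi = max(0, i - window), min(n, i + window + 1)
    known = facies[lo:hi][facies[lo:hi] >= 0]
    # Neighbourhood frequency as a crude local probability of facies 1
    # (real SIS would use indicator kriging weights here).
    p1 = known.mean() if known.size else 0.5
    facies[i] = rng.random() < p1           # draw 0/1 from the local probability

print(facies[:20])
```

Because conditioned cells are never revisited, the hard data survive in every realization, which is the conditioning property the abstract emphasizes; what this toy cannot reproduce is the curvilinear channel continuity that motivates MPS in the first place.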

  6. Two-point versus multiple-point geostatistics: the ability of geostatistical methods to capture complex geobodies and their facies associations—an application to a channelized carbonate reservoir, southwest Iran

    International Nuclear Information System (INIS)

    Facies models try to explain facies architectures, which have a primary control on the subsurface heterogeneities and the fluid flow characteristics of a given reservoir. In the process of facies modeling, geostatistical methods are implemented to integrate different sources of data into a consistent model. The facies models should describe facies interactions and the shape and geometry of the geobodies as they occur in reality. Two distinct categories of geostatistical techniques are two-point and multiple-point (geo)statistics (MPS). In this study, both of the aforementioned categories were applied to generate facies models. A sequential indicator simulation (SIS) and a truncated Gaussian simulation (TGS) represented the two-point geostatistical methods, and a single normal equation simulation (SNESIM) was selected as the MPS representative. The dataset from an extremely channelized carbonate reservoir located in southwest Iran was applied to these algorithms to analyze their performance in reproducing complex curvilinear geobodies. The SNESIM algorithm needs consistent training images (TI) in which all possible facies architectures that are present in the area are included. The TI model was founded on the data acquired from modern occurrences. These analogies delivered vital information about the possible channel geometries and facies classes that are typically present in such environments. The MPS results were conditioned to both soft and hard data. Soft facies probabilities were acquired from a neural network workflow. In this workflow, seismic-derived attributes were implemented as the input data. Furthermore, MPS realizations were conditioned to hard data to guarantee the exact positioning and continuity of the channel bodies. A geobody extraction workflow was implemented to extract the most certain parts of the channel bodies from the seismic data. These extracted parts of the channel bodies were applied to the simulation workflow as hard data.

  7. Beyond Time out and Table Time: Today's Applied Behavior Analysis for Students with Autism

    Science.gov (United States)

    Boutot, E. Amanda; Hume, Kara

    2012-01-01

    Recent mandates related to the implementation of evidence-based practices for individuals with autism spectrum disorder (ASD) require that autism professionals both understand and are able to implement practices based on the science of applied behavior analysis (ABA). The use of the term "applied behavior analysis" and its related concepts…

  8. Geostatistical ore reserve estimation for a roll-front type uranium deposit (practitioner's guide)

    International Nuclear Information System (INIS)

    This report comprises two parts. Part I contains illustrative examples of each phase of a geostatistical study using a roll-front type uranium deposit. Part II contains five computer programs and comprehensive users' manuals for these programs which are necessary to make a practical geostatistical study

  9. Spatial continuity measures for probabilistic and deterministic geostatistics

    Energy Technology Data Exchange (ETDEWEB)

    Isaaks, E.H.; Srivastava, R.M.

    1988-05-01

    Geostatistics has traditionally used a probabilistic framework, one in which expected values or ensemble averages are of primary importance. The less familiar deterministic framework views geostatistical problems in terms of spatial integrals. This paper outlines the two frameworks and examines the issue of which spatial continuity measure, the covariance C(h) or the variogram γ(h), is appropriate for each framework. Although C(h) and γ(h) were defined originally in terms of spatial integrals, the convenience of probabilistic notation made the expected value definitions more common. These now classical expected value definitions entail a linear relationship between C(h) and γ(h); the spatial integral definitions do not. In a probabilistic framework, where available sample information is extrapolated to domains other than the one which was sampled, the expected value definitions are appropriate; furthermore, within a probabilistic framework, reasons exist for preferring the variogram to the covariance function. In a deterministic framework, where available sample information is interpolated within the same domain, the spatial integral definitions are appropriate and no reasons are known for preferring the variogram. A case study on a Wiener-Lévy process demonstrates differences between the two frameworks and shows that, for most estimation problems, the deterministic viewpoint is more appropriate. Several case studies on real data sets reveal that the sample covariance function reflects the character of spatial continuity better than the sample variogram. From both theoretical and practical considerations, clearly for most geostatistical problems, direct estimation of the covariance is better than the traditional variogram approach.
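
The linear relationship entailed by the expected-value definitions, gamma(h) = C(0) - C(h) under second-order stationarity, is easy to check on simulated data. The AR(1) series below is a made-up stand-in for samples along a transect, not one of the paper's case studies.

```python
import numpy as np

# Stationary 1-D series (AR(1)) standing in for samples along a transect.
rng = np.random.default_rng(2)
z = np.empty(500)
z[0] = rng.normal()
for i in range(1, 500):
    z[i] = 0.8 * z[i - 1] + rng.normal()

def sample_variogram(z, h):
    """Classical estimator: gamma(h) = 0.5 * mean((z(x+h) - z(x))**2)."""
    d = z[h:] - z[:-h]
    return 0.5 * np.mean(d * d)

def sample_covariance(z, h):
    """Sample covariance C(h), with the overall mean removed."""
    m = z.mean()
    return np.mean((z[h:] - m) * (z[:-h] - m))

# Under the expected-value (stationary) definitions gamma(h) = C(0) - C(h),
# so the two columns below should roughly agree:
for h in (1, 2, 5):
    print(h, round(sample_variogram(z, h), 2),
          round(np.var(z) - sample_covariance(z, h), 2))
```

On data with trends or other nonstationarity the two estimators diverge, which is one practical face of the probabilistic-versus-deterministic distinction the paper draws.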

  10. Geostatistical Characteristics of the Structure of Spatial Variation of Electrical Power in the National 110 kV Network Including Results of Variogram Model Components Filtering

    OpenAIRE

    Barbara Namysłowska-Wilczyńska; Artur Wilczyński

    2015-01-01

    The paper provides results of analysing the surface variability of electrical power using two geostatistical methods – lognormal kriging and ordinary kriging. The aim of the research work was to characterize and identify in detail the structure of electrical load variability at nodes of the 110 kV network over the whole territory of Poland, on the basis of results from the kriging techniques applied. The paper proposes the methodology using two techniques of mode...

  11. Geostatistical uncertainty of assessing air quality using high-spatial-resolution lichen data: A health study in the urban area of Sines, Portugal.

    Science.gov (United States)

    Ribeiro, Manuel C; Pinho, P; Branquinho, C; Llop, Esteve; Pereira, Maria J

    2016-08-15

    In most studies correlating health outcomes with air pollution, personal exposure assignments are based on measurements collected at air-quality monitoring stations not coinciding with health data locations. In such cases, interpolators are needed to predict air quality in unsampled locations and to assign personal exposures. Moreover, a measure of the spatial uncertainty of exposures should be incorporated, especially in urban areas where concentrations vary at short distances due to changes in land use and pollution intensity. These studies are limited by the lack of literature comparing exposure uncertainty derived from distinct spatial interpolators. Here, we addressed these issues with two interpolation methods: regression Kriging (RK) and ordinary Kriging (OK). These methods were used to generate air-quality simulations with a geostatistical algorithm. For each method, the geostatistical uncertainty was drawn from generalized linear model (GLM) analysis. We analyzed the association between air quality and birth weight. Personal health data (n=227) and exposure data were collected in Sines (Portugal) during 2007-2010. Because air-quality monitoring stations in the city do not offer high-spatial-resolution measurements (n=1), we used lichen data as an ecological indicator of air quality (n=83). We found no significant difference in the fit of GLMs with any of the geostatistical methods. With RK, however, the models tended to fit better more often and worse less often. Moreover, the geostatistical uncertainty results showed a marginally higher mean and precision with RK. Combined with lichen data and land-use data of high spatial resolution, RK is a more effective geostatistical method for relating health outcomes with air quality in urban areas. This is particularly important in small cities, which generally do not have expensive air-quality monitoring stations with high spatial resolution. Further, alternative ways of linking human activities with their

  12. Geostatistical modelling of carbon monoxide levels in Khartoum State (Sudan) - GIS pilot based study

    International Nuclear Information System (INIS)

    The objective of this study is to develop a digital GIS model that can evaluate, predict and visualize carbon monoxide (CO) levels in Khartoum state. To achieve this aim, sample data were collected, processed and managed to generate a dynamic GIS model of carbon monoxide levels in the study area. Parametric data collected from the field and the analyses carried out throughout this study show that CO emissions were lower than the allowable ambient air quality standards released by the National Environment Protection Council (NEPC-USA) for 1998. However, this pilot study found that CO emissions in Omdurman city were the highest. This pilot study shows that GIS and geostatistical modeling can be used as a powerful tool to produce maps of exposure. (authors)

  13. Introduction to this Special Issue on Geostatistics and Scaling of Remote Sensing

    Science.gov (United States)

    Quattrochi, Dale A.

    1999-01-01

    The germination of this special PE&RS issue began at the Royal Geographical Society (with the Institute of British Geographers) (RGS-IBG) annual meeting in January 1997, held at the University of Exeter in Exeter, England. The cold and snow of an English winter were greatly tempered by the friendly and cordial discussions that ensued at the meeting on possible ways to foster both dialog and research across "the Big Pond" between geographers in the US and the UK on the use of geostatistics and geospatial techniques for remote sensing of land surface processes. It was decided that one way to stimulate and enhance cooperation on the application of geostatistics and geospatial methods in remote sensing was to hold parallel sessions on these topics at appropriate meeting venues in 1998 in both the US and the UK. Selected papers given at these sessions would be published as a special issue of PE&RS on the US side, and as a special issue of Computers and Geosciences (C&G) on the UK side, to highlight the commonality in research on geostatistics and geospatial methods in remote sensing and spatial data analysis on both sides of the Atlantic Ocean. As a consequence, a session on "Geostatistics and Geospatial Techniques for Remote Sensing of Land Surface Processes" was held at the Association of American Geographers (AAG) annual meeting in Boston, Massachusetts in March 1998, sponsored by the AAG's Remote Sensing Specialty Group (RSSG). A similar session was held at the RGS-IBG annual meeting in Guildford, Surrey, England in January 1998, organized by the Modeling and Advanced Techniques Special Interest Group (MAT SIG) of the Remote Sensing Society (RSS). The six papers that, in part, comprise this issue of PE&RS are the US complement to such a dual journal publication effort. Both of us are co-editors of each of the journal special issues, with the lead editor of each journal being from their respective side of the Atlantic where the journals are published. The special

  14. Computer-based modelling and analysis in engineering geology

    OpenAIRE

    Giles, David

    2014-01-01

    This body of work presents the research and publications undertaken under a general theme of computer-based modelling and analysis in engineering geology. Papers are presented on geotechnical data management, data interchange, Geographical Information Systems, surface modelling, geostatistical methods, risk-based modelling, knowledge-based systems, remote sensing in engineering geology and on the integration of computer applications into applied geoscience teaching. The work highlights my...

  15. Inclusive Elementary Classroom Teacher Knowledge of and Attitudes toward Applied Behavior Analysis and Autism Spectrum Disorder and Their Use of Applied Behavior Analysis

    Science.gov (United States)

    McCormick, Jennifer A.

    2011-01-01

    The purpose of this study was to examine inclusive elementary teacher knowledge and attitude toward Autism Spectrum Disorder (ASD) and applied behavior analysis (ABA) and their use of ABA. Furthermore, this study examined if knowledge and attitude predicted use of ABA. A survey was developed and administered through a web-based program. Of the…

  16. Cross-covariance functions for multivariate geostatistics

    KAUST Repository

    Genton, Marc G.

    2015-05-01

    Continuously indexed datasets with multiple variables have become ubiquitous in the geophysical, ecological, environmental and climate sciences, and pose substantial analysis challenges to scientists and statisticians. For many years, scientists developed models that aimed at capturing the spatial behavior for an individual process; only within the last few decades has it become commonplace to model multiple processes jointly. The key difficulty is in specifying the cross-covariance function, that is, the function responsible for the relationship between distinct variables. Indeed, these cross-covariance functions must be chosen to be consistent with marginal covariance functions in such a way that the second-order structure always yields a nonnegative definite covariance matrix. We review the main approaches to building cross-covariance models, including the linear model of coregionalization, convolution methods, the multivariate Matérn and nonstationary and space-time extensions of these among others. We additionally cover specialized constructions, including those designed for asymmetry, compact support and spherical domains, with a review of physics-constrained models. We illustrate select models on a bivariate regional climate model output example for temperature and pressure, along with a bivariate minimum and maximum temperature observational dataset; we compare models by likelihood value as well as via cross-validation co-kriging studies. The article closes with a discussion of unsolved problems. © Institute of Mathematical Statistics, 2015.
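
The consistency requirement this record describes, that cross-covariances combine with marginal covariances into a nonnegative definite matrix, is guaranteed by construction in the linear model of coregionalization (LMC), the first approach the review covers. A minimal bivariate sketch with made-up coefficients and ranges:

```python
import numpy as np

# Linear model of coregionalization (LMC) for two variables: the matrix-valued
# covariance is C(h) = sum_k B_k * rho_k(h), with coregionalization matrices
# B_k = a_k a_k^T (positive semidefinite) and valid correlation functions rho_k.
a = [np.array([1.0, 0.6]), np.array([0.3, 0.9])]   # made-up coefficients
B = [np.outer(ak, ak) for ak in a]
ranges = [1.0, 3.0]                                # made-up correlation ranges

def rho(h, r):
    return np.exp(-np.abs(h) / r)                  # exponential correlation

# Joint covariance of both variables at n sites is sum_k R_k kron B_k,
# which is automatically nonnegative definite -- the required consistency.
x = np.linspace(0.0, 10.0, 25)
H = np.abs(x[:, None] - x[None, :])
Sigma = sum(np.kron(rho(H, r), Bk) for Bk, r in zip(B, ranges))

print(np.linalg.eigvalsh(Sigma).min() > -1e-10)    # True: a valid model
```

The price of this guaranteed validity is flexibility: every cross-covariance inherits its shape from the shared rho_k terms, which is why the review also treats multivariate Matérn and the other richer constructions.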

  17. Testing geostatistical methods to combine radar and rain gauges for precipitation mapping in a mountainous region

    Science.gov (United States)

    Erdin, R.; Frei, C.; Sideris, I.; Kuensch, H.-R.

    2010-09-01

There is an increasing demand for accurate mapping of precipitation at a spatial resolution of kilometers. Radar and rain gauges - the two main precipitation measurement systems - exhibit complementary strengths and weaknesses. Radar offers high spatial and temporal resolution but lacks accuracy of absolute values, whereas rain gauges provide accurate values at their specific point location but suffer from poor spatial representativeness. Methods of geostatistical mapping have been proposed to combine radar and rain gauge data for quantitative precipitation estimation (QPE). The aim is to combine the respective strengths and compensate for the respective weaknesses of the two observation platforms. Several studies have demonstrated the potential of these methods over topography of moderate complexity, but their performance remains unclear for high-mountain regions where rainfall patterns are complex, the representativeness of rain gauge measurements is limited and radar observations are obstructed. In this study we examine the potential and limitations of two frequently used geostatistical mapping methods for the territory of Switzerland, where the mountain chain of the Alps poses particular challenges to QPE. The two geostatistical methods explored are kriging with external drift (KED) using radar as the drift variable and ordinary kriging of radar errors (OKRE). The radar data is a composite from three C-band radars using a constant Z-R relationship, advanced correction processing for visibility, ground clutter and beam shielding, and a climatological bias adjustment. The rain gauge data originates from an automatic network with a typical inter-station distance of 25 km. Both combination methods are applied to a set of case examples representing typical rainfall situations in the Alps, with their inherent challenges, at daily and hourly time resolution. The quality of precipitation estimates is assessed by several skill scores calculated from cross validation errors at
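The KED formulation referenced in this record can be sketched in a few lines of linear algebra. The exponential covariance model, the gauge coordinates and the radar values below are illustrative assumptions, not data from the study:

```python
import numpy as np

def exp_cov(h, sill=1.0, corr_range=20.0):
    """Covariance implied by an exponential variogram model (assumed here)."""
    return sill * np.exp(-h / corr_range)

def ked_weights(xy, radar_at_gauges, x0, radar_at_x0):
    """Kriging with external drift for one target cell: the mean is modelled
    as a + b * radar, so the drift matrix has a constant and a radar column."""
    n = len(xy)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    F = np.column_stack([np.ones(n), radar_at_gauges])   # drift functions at gauges
    A = np.zeros((n + 2, n + 2))
    A[:n, :n] = exp_cov(d)
    A[:n, n:] = F
    A[n:, :n] = F.T
    b = np.zeros(n + 2)
    b[:n] = exp_cov(np.linalg.norm(xy - x0, axis=1))
    b[n], b[n + 1] = 1.0, radar_at_x0                    # unbiasedness constraints
    return np.linalg.solve(A, b)[:n]

rng = np.random.default_rng(0)
gauges = rng.uniform(0, 50, size=(8, 2))    # hypothetical gauge coordinates (km)
radar_g = rng.uniform(0.5, 5.0, size=8)     # hypothetical radar estimates (mm)
w = ked_weights(gauges, radar_g, np.array([25.0, 25.0]), 2.0)
print(w.sum())   # the constraint rows force the weights to sum to 1
```

The estimate at the target cell would then be `w @ gauge_values`; OKRE, by contrast, would apply ordinary kriging to the gauge-minus-radar residuals and add the result back to the radar field.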

  18. Geostatistics and cost-effective environmental remediation

    International Nuclear Information System (INIS)

    Numerous sites within the U.S. Department of Energy (DOE) complex have been contaminated with various radioactive and hazardous materials by defense-related activities during the post-World War II era. The perception is that characterization and remediation of these contaminated sites will be too costly using currently available technology. Consequently, the DOE Office of Technology Development has funded development of a number of alternative processes for characterizing and remediating these sites. The former Feed-Materials Processing Center near Fernald, Ohio (USA), was selected for demonstrating several innovative technologies. Contamination at the Fernald site consists principally of particulate uranium and derivative compounds in surficial soil. A field-characterization demonstration program was conducted during the summer of 1994 specifically to demonstrate the relative economic performance of seven proposed advanced-characterization tools for measuring uranium activity of in-situ soils. These innovative measurement technologies are principally radiation detectors of varied designs. Four industry-standard measurement technologies, including conventional, regulatory-agency-accepted soil sampling followed by laboratory geochemical analysis, were also demonstrated during the program for comparative purposes. A risk-based economic-decision model has been used to evaluate the performance of these alternative characterization tools. The decision model computes the dollar value of an objective function for each of the different characterization approaches. The methodology not only can assist site operators to choose among engineering alternatives for site characterization and/or remediation, but also can provide an objective and quantitative basis for decisions with respect to the completeness of site characterization

  19. Soil moisture estimation by assimilating L-band microwave brightness temperature with geostatistics and observation localization.

    Science.gov (United States)

    Han, Xujun; Li, Xin; Rigon, Riccardo; Jin, Rui; Endrizzi, Stefano

    2015-01-01

Observations can be used to reduce model uncertainties through data assimilation, but how can assimilation proceed at locations not covered by observations, when coverage is limited by spatial availability or instrument capability? Two commonly used strategies are first described: covariance localization (CL) and observation localization (OL). Compared with CL, OL is easier to parallelize and more efficient for large-scale analysis. This paper evaluates OL for soil moisture profile characterization, using a geostatistical semivariogram to fit the spatial correlation structure of synthetic L-band microwave brightness temperature measurements. The fitted semivariogram model and the local ensemble transform Kalman filter algorithm are combined to weight and assimilate the observations within a local region surrounding the land surface model grid cell to be analyzed. Six scenarios were compared: 1_Obs with only the nearest observation assimilated, 5_Obs with up to five nearest local observations assimilated, and 9_Obs with up to nine; scenarios with up to 16, 25, and 36 local observations were also compared. The results indicate that assimilating more local observations improves the estimates, with an upper bound of nine observations in this case. This study demonstrates the potential of the geostatistical correlation representation in OL to improve data assimilation of catchment-scale soil moisture using synthetic L-band microwave brightness temperature, which cannot fully cover the study area in space due to vegetation effects. PMID:25635771
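A minimal sketch of the observation-localization idea, with an assumed exponential correlation function standing in for the fitted semivariogram model (the coordinates, correlation length and cap on local observations are all hypothetical):

```python
import numpy as np

def localization_weights(obs_xy, grid_xy, corr_len=30.0, k=9):
    """Pick the k nearest observations to a grid cell and weight them with an
    assumed exponential correlation function (stand-in for a fitted semivariogram)."""
    d = np.linalg.norm(obs_xy - grid_xy, axis=1)
    nearest = np.argsort(d)[:k]
    return nearest, np.exp(-d[nearest] / corr_len)

# hypothetical observation locations (km) and one model grid cell
obs_xy = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 20.0], [40.0, 40.0], [5.0, 5.0]])
idx, w = localization_weights(obs_xy, np.array([2.0, 2.0]), k=3)
print(idx, np.round(w, 3))   # nearest observations first, weights decay with distance
```

In the record's terms, `k` plays the role of the 1/5/9/16/25/36 nearest-observation caps, and the weights taper each observation's influence in the local analysis of that grid cell.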

  20. Model Proposition for the Fiscal Policies Analysis Applied in Economic Field

    Directory of Open Access Journals (Sweden)

    Larisa Preda

    2007-05-01

This paper presents a study of fiscal policy as applied to economic development. Correlations between macroeconomic and fiscal indicators constitute the first step in our analysis. The next step is the proposal of a new model for fiscal and budgetary choices. This model is applied to data from the Romanian case.

  1. Research in progress in applied mathematics, numerical analysis, and computer science

    Science.gov (United States)

    1990-01-01

    Research conducted at the Institute in Science and Engineering in applied mathematics, numerical analysis, and computer science is summarized. The Institute conducts unclassified basic research in applied mathematics in order to extend and improve problem solving capabilities in science and engineering, particularly in aeronautics and space.

  2. Risk analysis for confined space entries: Critical analysis of four tools applied to three risk scenarios.

    Science.gov (United States)

    Burlet-Vienney, Damien; Chinniah, Yuvin; Bahloul, Ali; Roberge, Brigitte

    2016-06-01

Investigation reports of fatal confined space accidents nearly always point to a problem of identifying or underestimating risks. This paper compares four different risk analysis tools developed for confined spaces by applying them to three hazardous scenarios. The tools were: 1. a checklist without risk estimation (Tool A), 2. a checklist with a risk scale (Tool B), 3. a risk calculation without a formal hazard identification stage (Tool C), and 4. a questionnaire followed by a risk matrix (Tool D). Each tool's structure and practical application were studied. Tools A and B gave crude results comparable to those of the more analytic tools in less time. Their main limitations were a lack of contextual information for the identified hazards and a greater dependency on the user's expertise and ability to tackle hazards of a different nature. Tools C and D used more systematic approaches than Tools A and B by supporting risk reduction based on the description of the risk factors. Tool D is distinctive because of 1. its comprehensive structure with respect to the steps suggested in risk management, 2. its dynamic approach to hazard identification, and 3. its use of data resulting from the risk analysis. PMID:26864350

  3. Performance analysis of high quality parallel preconditioners applied to 3D finite element structural analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kolotilina, L.; Nikishin, A.; Yeremin, A.; and others

    1994-12-31

    The solution of large systems of linear equations is a crucial bottleneck when performing 3D finite element analysis of structures. Also, in many cases the reliability and robustness of iterative solution strategies, and their efficiency when exploiting hardware resources, fully determine the scope of industrial applications which can be solved on a particular computer platform. This is especially true for modern vector/parallel supercomputers with large vector length and for modern massively parallel supercomputers. Preconditioned iterative methods have been successfully applied to industrial-class finite element analysis of structures. The construction and application of high quality preconditioners constitutes a high percentage of the total solution time, and parallel implementation of high quality preconditioners on such architectures is a formidable challenge. Two common types of existing preconditioners are implicit and explicit. Implicit preconditioners (e.g. incomplete factorizations of several types) are generally of high quality but require the solution of lower and upper triangular systems of equations per iteration, which is difficult to parallelize without deteriorating the convergence rate. Explicit preconditioners (e.g. polynomial or Jacobi-like preconditioners) require sparse matrix-vector multiplications and can be parallelized, but their preconditioning qualities are less than desirable. The authors present results of numerical experiments with Factorized Sparse Approximate Inverses (FSAI) for symmetric positive definite linear systems. These are high quality preconditioners that possess a large resource of parallelism by construction without increasing the serial complexity.

  4. Applying static code analysis to firewall policies for the purpose of anomaly detection

    OpenAIRE

    Zaliva, Vadim

    2011-01-01

    Treating modern firewall policy languages as imperative, special-purpose programming languages, in this article we apply static code analysis techniques for the purpose of anomaly detection. We first abstract a policy in a common firewall policy language into an intermediate language, and then apply anomaly detection algorithms to it. The contributions made by this work are: 1. An analysis of various control flow instructions in popular firewall policy languages ...

  5. Geostatistical methods for rock mass quality prediction using borehole and geophysical survey data

    Science.gov (United States)

    Chen, J.; Rubin, Y.; Sege, J. E.; Li, X.; Hehua, Z.

    2015-12-01

    For long, deep tunnels, the number of geotechnical borehole investigations during the preconstruction stage is generally limited. Yet tunnels are often constructed in geological structures with complex geometries, and in which the rock mass is fragmented from past structural deformations. Tunnel Geology Prediction (TGP) is a geophysical technique widely used during tunnel construction in China to ensure safety during construction and to prevent geological disasters. In this paper, geostatistical techniques were applied in order to integrate seismic velocity from TGP and borehole information into spatial predictions of RMR (Rock Mass Rating) in unexcavated areas. This approach is intended to apply conditional probability methods to transform seismic velocities to directly observed RMR values. The initial spatial distribution of RMR, inferred from the boreholes, was updated by including geophysical survey data in a co-kriging approach. The method applied to a real tunnel project shows significant improvements in rock mass quality predictions after including geophysical survey data, leading to better decision-making for construction safety design.

  6. Estimating geostatistical parameters and spatially-variable hydraulic conductivity within a catchment system using an ensemble smoother

    Directory of Open Access Journals (Sweden)

    R. T. Bailey

    2011-10-01

    be used to identify the geostatistical parameter values of the aquifer system. In general, water table data have a much greater ability than streamflow data to condition K. Future research includes applying the methodology to an actual regional study site.

  7. Estimating geostatistical parameters and spatially-variable hydraulic conductivity within a catchment system using an ensemble smoother

    Directory of Open Access Journals (Sweden)

    R. T. Bailey

    2012-02-01

    be used to identify the geostatistical parameter values of the aquifer system. In general, water table data have a much greater ability than streamflow data to condition K. Future research includes applying the methodology to an actual regional study site.

  8. A review of the technology and process on integrated circuits failure analysis applied in communications products

    Science.gov (United States)

    Ming, Zhimao; Ling, Xiaodong; Bai, Xiaoshu; Zong, Bo

    2016-02-01

    The failure analysis of integrated circuits plays a very important role in improving the reliability of communications products. This paper introduces the failure analysis technologies and process for integrated circuits used in communications products. Available techniques include optical microscopy, infrared microscopy, acoustic microscopy, liquid crystal hot-spot detection, microanalysis, electrical measurement, microprobing, chemical etching and ion etching. Integrated circuit failure analysis depends on the accurate confirmation and analysis of the chip failure mode, the search for the root cause of failure, the summary of the failure mechanism and the implementation of improvement measures. Through failure analysis, the reliability of integrated circuits and the product yield can be improved.

  9. A Geostatistical Approach to Indoor Surface Sampling Strategies

    DEFF Research Database (Denmark)

    Schneider, Thomas; Petersen, Ole Holm; Nielsen, Allan Aasbjerg; Windfeld, Kristian

    1990-01-01

    Particulate surface contamination is of concern in production industries such as food processing, aerospace, electronics and semiconductor manufacturing. There is also an increased awareness that surface contamination should be monitored in industrial hygiene surveys. A conceptual and theoretical framework for designing sampling strategies is thus developed. The distribution and spatial correlation of surface contamination can be characterized using concepts from geostatistical science, where spatial application of statistics is most developed. The theory is summarized, and particulate surface contamination sampled from small areas on a table has been used to illustrate the method. First, the spatial correlation is modelled and the parameters estimated from the data. Next, it is shown how the contamination at positions not measured can be estimated with kriging, a minimum mean square error method ...
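Kriging as a minimum mean square error estimator can be sketched for 1-D data; the spherical variogram parameters and the sample values below are invented for illustration. With a zero-nugget variogram the estimator reproduces the data exactly at sampled positions:

```python
import numpy as np

def spherical_gamma(h, sill=1.0, a=10.0):
    """Spherical variogram (no nugget); parameters are assumed for illustration."""
    h = np.asarray(h, dtype=float)
    g = sill * (1.5 * h / a - 0.5 * (h / a) ** 3)
    return np.where(h < a, g, sill)

def ordinary_kriging(x, z, x0):
    """Ordinary kriging of 1-D data: solve [G 1; 1' 0][w; mu] = [g0; 1]."""
    n = len(x)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = spherical_gamma(np.abs(x[:, None] - x[None, :]))
    A[n, n] = 0.0
    b = np.append(spherical_gamma(np.abs(x - x0)), 1.0)
    w = np.linalg.solve(A, b)[:n]
    return float(w @ z)

x = np.array([0.0, 3.0, 7.0, 12.0])   # hypothetical sample positions on a table (cm)
z = np.array([5.0, 7.0, 6.0, 4.0])    # hypothetical contamination values
est = ordinary_kriging(x, z, 3.0)
print(est)   # exact interpolation: returns the datum measured at x = 3.0
```

At a sampled position the right-hand side coincides with a column of the kriging matrix, so the weight vector collapses onto that sample and the estimate equals the measured value.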

  10. A Geostatistical Approach to Indoor Surface Sampling Strategies

    DEFF Research Database (Denmark)

    Schneider, Thomas; Petersen, Ole Holm; Nielsen, Allan Aasbjerg;

    1990-01-01

    A conceptual and theoretical framework for designing sampling strategies is thus developed. The distribution and spatial correlation of surface contamination can be characterized using concepts from geostatistical science, where spatial application of statistics is most developed. The theory is summarized, and particulate surface contamination sampled from small areas on a table has been used to illustrate the method. First, the spatial correlation is modelled and the parameters estimated from the data. Next, it is shown how the contamination at positions not measured can be estimated with kriging, a minimum mean square error method, using the global information. Then methods for choosing a proper sampling area for a single sample of dust on a table are given. The global contamination of an object is determined by a maximum likelihood estimator. Finally, it is shown how specified experimental goals can be included to determine a...

  11. The use of sequential indicator simulation to characterize geostatistical uncertainty

    International Nuclear Information System (INIS)

    Sequential indicator simulation (SIS) is a geostatistical technique designed to aid in the characterization of uncertainty about the structure or behavior of natural systems. This report discusses a simulation experiment designed to study the quality of uncertainty bounds generated using SIS. The results indicate that, while SIS may produce reasonable uncertainty bounds in many situations, factors like the number and location of available sample data, the quality of variogram models produced by the user, and the characteristics of the geologic region to be modeled, can all have substantial effects on the accuracy and precision of estimated confidence limits. It is recommended that users of SIS conduct validation studies for the technique on their particular regions of interest before accepting the output uncertainty bounds

  12. Principal Component Geostatistical Approach for large-dimensional inverse problems

    Science.gov (United States)

    Kitanidis, P. K.; Lee, J.

    2014-07-01

    The quasi-linear geostatistical approach is designed for weakly nonlinear underdetermined inverse problems, such as hydraulic tomography and electrical resistivity tomography. It provides best estimates as well as measures for uncertainty quantification. However, its textbook implementation involves iterations to reach an optimum and requires the determination of the Jacobian matrix, i.e., the derivative of the observation function with respect to the unknown. Although there are elegant methods for the determination of the Jacobian, the cost is high when the number of unknowns, m, and the number of observations, n, are high. It is also wasteful to compute the Jacobian for points away from the optimum. Irrespective of the issue of computing derivatives, the computational cost of implementing the method is generally of the order of m²n, though there are methods to reduce it. In this work, we present an implementation that utilizes a Gauss-Newton method that is matrix-free in terms of the Jacobian and improves the scalability of the geostatistical inverse problem. Each iteration requires K runs of the forward problem, where K is not only much smaller than m but can even be smaller than n. The computational and storage cost of the inverse procedure scales roughly linearly with m instead of m² as in the textbook approach. For problems of very large m, this implementation constitutes a dramatic reduction in computational cost compared to the textbook approach. Results illustrate the validity of the approach and provide insight into the conditions under which this method performs best.
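The matrix-free idea can be illustrated with a finite-difference Jacobian-vector product: each product costs one extra run of the forward model, so K products cost K runs and the full m-column Jacobian is never formed. The toy forward model below is an assumption for illustration, not the observation operator of the paper:

```python
import numpy as np

def jvp(f, x, v, eps=1e-6):
    """Finite-difference Jacobian-vector product: J(x) @ v without forming J."""
    return (f(x + eps * v) - f(x)) / eps

def forward(m):
    """Toy nonlinear observation operator (a stand-in for a forward model run)."""
    return np.array([m[0] ** 2 + m[1], np.sin(m[1]) + m[2]])

m = np.array([1.0, 0.5, 2.0])   # current estimate of the unknowns
out = jvp(forward, m, np.array([1.0, 0.0, 0.0]))
print(out)   # J times e1: approximately [2*m0, 0] = [2, 0]
```

A Krylov solver inside each Gauss-Newton step needs only such products, which is what makes the per-iteration cost a count of forward runs rather than an m x n matrix assembly.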

  13. Critical Analysis of a Website: A Critique based on Critical Applied Linguistics and Critical Discourse Analysis

    Directory of Open Access Journals (Sweden)

    Rina Agustina

    2013-05-01

    E-learning resources are easily found by browsing the internet; most are free of charge and provide various learning materials. Spellingcity.com is one such e-learning website for teaching and learning English spelling, vocabulary and writing, offering various games and activities for young learners, 6- to 8-year-olds in particular. This paper aims to analyse the website from two different views: (1) critical applied linguistics (CAL) and (2) critical discourse analysis (CDA). After analysing the website using CAL and CDA, it was found that the website is adequate for beginners: it provides fun learning through games and challenges learners to test their vocabulary. Despite these strengths, several issues require further thought in terms of learners' broad knowledge; for example, some of the learning materials focus on states in America, which is quite difficult for EFL learners without adequate general knowledge. Thus, the findings imply that the website can be used as supporting learning material, accompanying textbooks and vocabulary exercise books.

  14. Improving Skill Development: An Exploratory Study Comparing a Philosophical and an Applied Ethical Analysis Technique

    Science.gov (United States)

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-01-01

    This exploratory study compares and contrasts two types of critical thinking techniques; one is a philosophical and the other an applied ethical analysis technique. The two techniques analyse an ethically challenging situation involving ICT that a recent media article raised to demonstrate their ability to develop the ethical analysis skills of…

  15. An Objective Comparison of Applied Behavior Analysis and Organizational Behavior Management Research

    Science.gov (United States)

    Culig, Kathryn M.; Dickinson, Alyce M.; McGee, Heather M.; Austin, John

    2005-01-01

    This paper presents an objective review, analysis, and comparison of empirical studies targeting the behavior of adults published in Journal of Applied Behavior Analysis (JABA) and Journal of Organizational Behavior Management (JOBM) between 1997 and 2001. The purpose of the comparisons was to identify similarities and differences with respect to…

  16. Sociosexuality Education for Persons with Autism Spectrum Disorders Using Principles of Applied Behavior Analysis

    Science.gov (United States)

    Wolfe, Pamela S.; Condo, Bethany; Hardaway, Emily

    2009-01-01

    Applied behavior analysis (ABA) has emerged as one of the most effective empirically based strategies for instructing individuals with autism spectrum disorders (ASD). Four ABA-based strategies that have been found effective are video modeling, visual strategies, social script fading, and task analysis. Individuals with ASD often struggle with…

  17. Applied Behavior Analysis: Its Impact on the Treatment of Mentally Retarded Emotionally Disturbed People.

    Science.gov (United States)

    Matson, Johnny L.; Coe, David A.

    1992-01-01

    This article reviews applications of the applied behavior analysis ideas of B. F. Skinner and others to persons with both mental retardation and emotional disturbance. The review examines implications of behavior analysis for operant conditioning and radical behaviorism, schedules of reinforcement, and emotion and mental illness. (DB)

  18. Geographical distribution of the annual mean radon concentrations in primary schools of Southern Serbia – application of geostatistical methods

    International Nuclear Information System (INIS)

    Between 2008 and 2011 a survey of radon (222Rn) was performed in schools of several districts of Southern Serbia. Some results have been published previously (Žunić et al., 2010; Carpentieri et al., 2011; Žunić et al., 2013). This article concentrates on the geographical distribution of the measured Rn concentrations. Applying geostatistical methods, we generate “school radon maps” of expected concentrations and of estimated probabilities that a concentration threshold is exceeded. The resulting maps show a clearly structured spatial pattern which appears related to the geological background. In particular, in areas with vulcanite and granitoid rocks, elevated radon (Rn) concentrations can be expected. The “school radon map” can therefore be considered a proxy to a map of the geogenic radon potential, and allows identification of radon-prone areas, i.e. areas in which higher Rn concentrations can be expected for natural reasons. It must be stressed that the “radon hazard”, or potential risk, estimated this way has to be distinguished from the actual radon risk, which is a function of exposure. This in turn may require (depending on the target variable which is supposed to measure risk) considering demographic and sociological reality, i.e. population density, distribution of building styles and living habits. -- Highlights: • A map of Rn concentrations in primary schools of Southern Serbia. • Application of geostatistical methods. • Correlation with geology found. • Can serve as proxy to identify radon prone areas

  19. Geostatistical interpolation of daily rainfall at catchment scale: the use of several variogram models in the Ourthe and Ambleve catchments, Belgium

    Directory of Open Access Journals (Sweden)

    S. Ly

    2011-07-01

    Spatial interpolation of precipitation data is of great importance for hydrological modelling. Geostatistical methods (kriging) are widely applied in spatial interpolation from point measurements to continuous surfaces. The first step in kriging computation is semi-variogram modelling, which usually uses only one variogram model for all of the data. The objective of this paper was to develop different algorithms of spatial interpolation for daily rainfall on 1 km² regular grids in the catchment area and to compare the results of geostatistical and deterministic approaches. This study relied on 30 yr of daily rainfall data from 70 raingages in the hilly landscape of the Ourthe and Ambleve catchments in Belgium (2908 km²). This area lies between 35 and 693 m in elevation and consists of river networks which are tributaries of the Meuse River. For the geostatistical algorithms, seven semi-variogram models (logarithmic, power, exponential, Gaussian, rational quadratic, spherical and penta-spherical) were fitted to the daily sample semi-variogram on a daily basis. These seven variogram models were also adopted to avoid negative interpolated rainfall. The elevation, extracted from a digital elevation model, was incorporated into multivariate geostatistics. Seven validation raingages and cross validation were used to compare the interpolation performance of these algorithms applied to different densities of raingages. We found that, among the seven variogram models used, the Gaussian model was most frequently the best fit. Using seven variogram models can avoid negative daily rainfall in ordinary kriging. Negative kriging estimates were observed more for convective than for stratiform rain. The performance of the different methods varied slightly according to the density of raingages, particularly between 8 and 70 raingages, but it was much different for interpolation using 4 raingages. Spatial interpolation with the geostatistical and
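The first step named in this record, fitting a sample semi-variogram, starts from the empirical (Matheron) estimator sketched below on synthetic data; the field, station coordinates and lag bins are all invented for illustration:

```python
import numpy as np

def empirical_semivariogram(xy, z, bin_edges):
    """Matheron estimator: average of 0.5*(z_i - z_j)**2 over pairs in each lag bin."""
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    sq = 0.5 * (z[:, None] - z[None, :]) ** 2
    i, j = np.triu_indices(len(z), k=1)          # each pair of stations once
    h, g = d[i, j], sq[i, j]
    lags, gammas = [], []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        m = (h >= lo) & (h < hi)
        if m.any():
            lags.append(h[m].mean())
            gammas.append(g[m].mean())
    return np.array(lags), np.array(gammas)

rng = np.random.default_rng(42)
xy = rng.uniform(0.0, 100.0, size=(200, 2))               # synthetic "raingage" sites (km)
z = np.sin(xy[:, 0] / 15.0) + 0.1 * rng.normal(size=200)  # correlated field + nugget noise
lags, gammas = empirical_semivariogram(xy, z, np.arange(0.0, 60.0, 5.0))
print(np.round(gammas, 3))   # semivariance grows with lag for a correlated field
```

A model (spherical, Gaussian, exponential, ...) would then be fitted to the `(lags, gammas)` pairs, which is the per-day fitting step the record describes for its seven candidate models.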

  20. Geostatistical methods for the integrated information; Metodos geoestadisticos para la integracion de informacion

    Energy Technology Data Exchange (ETDEWEB)

    Cassiraga, E.F.; Gomez-Hernandez, J.J. [Departamento de Ingenieria Hidraulica y Medio Ambiente, Universidad Politecnica de Valencia, Valencia (Spain)

    1996-10-01

    The main objective of this report is to describe the different geostatistical techniques for integrating geophysical and hydrological parameters. We analyze the characteristics of the estimation methods used in other studies.

  1. Continuous Wavelet and Hilbert-Huang Transforms Applied for Analysis of Active and Reactive Power Consumption

    OpenAIRE

    Avdakovic Samir; Bosovic Adnan

    2014-01-01

    Analysis of power consumption presents a very important issue for power distribution system operators. Some power system processes such as planning, demand forecasting, development, etc., require a complete understanding of the behaviour of power consumption for the observed area, which requires appropriate techniques for the analysis of available data. In this paper, two different time-frequency techniques are applied to the analysis of hourly values of active and reactive power consumption from one real ...

  2. Statistical tool for soil biology : X. Geostatistical analysis

    OpenAIRE

    Rossi, Jean-Pierre; Lavelle, P.; Tondoh, J.E.

    1995-01-01

    Soil organisms generally display spatial distribution patterns at various scales. Classical methods for studying spatial distribution are based on various aggregation indices and on the analysis of frequency distributions. These methods do not take the position of the sampling points into account and consequently provide no information on the spatial distribution of organisms at scales larger than the sampling unit. ...

  3. Bayesian geostatistical prediction of the intensity of infection with Schistosoma mansoni in East Africa

    OpenAIRE

    Archie C.A. Clements; MOYEED, RANA; Brooker, Simon

    2006-01-01

    A Bayesian geostatistical model was developed to predict the intensity of infection with Schistosoma mansoni in East Africa. Epidemiological data from purposively-designed and standardized surveys were available for 31,458 schoolchildren (90% aged between 6-16 years) from 459 locations across the region and used in combination with remote sensing environmental data to identify factors associated with spatial variation in infection patterns. The geostatistical model explicitly takes into accou...

  4. Trends in applied econometrics software development 1985-2008, an analysis of Journal of Applied Econometrics research articles, software reviews, data and code

    OpenAIRE

    Ooms, M.

    2008-01-01

    Trends in software development for applied econometrics emerge from an analysis of the research articles and software reviews of the Journal of Applied Econometrics, appearing since 1986. The data and code archive of the journal provides more specific information on software use for applied econometrics since 1995. GAUSS, Stata, MATLAB and Ox have been the most important software packages after 2001. I compare these higher-level programming languages and R in somewhat more detail. An increasing numbe...

  5. August Dvorak (1894-1975): Early expressions of applied behavior analysis and precision teaching

    OpenAIRE

    Joyce, Bonnie; Moxley, Roy A.

    1988-01-01

    August Dvorak is best known for his development of the Dvorak keyboard. However, Dvorak also adapted and applied many behavioral and scientific management techniques to the field of education. Taken collectively, these techniques are representative of many of the procedures currently used in applied behavior analysis, in general, and especially in precision teaching. The failure to consider Dvorak's instructional methods may explain some of the discrepant findings in studies which compare the...

  6. Comparison of factor analysis and phase analysis methods applied to cardiac scintigraphy

    International Nuclear Information System (INIS)

    Since 1982, factor analysis has been proposed as an alternative to phase analysis for the processing of equilibrium cardiac scintigraphy. Three factor analysis algorithms have been described, and the clinical evaluation of these methods has been carried out in 128 patients with coronary artery disease and compared with phase analysis. The study indicates that factor analysis methods are not more accurate than phase analysis for the detection of abnormalities of ventricular contraction.

  7. Use of geostatistics for remediation planning to transcend urban political boundaries

    International Nuclear Information System (INIS)

    Soil remediation plans are often dictated by areas of jurisdiction or property lines instead of scientific information. This study exemplifies how geostatistically interpolated surfaces can substantially improve remediation planning. Ordinary kriging, ordinary co-kriging, and inverse distance weighting spatial interpolation methods were compared for analyzing surface and sub-surface soil sample data originally collected by the US EPA and researchers at the University at Buffalo in Hickory Woods, an industrial–residential neighborhood in Buffalo, NY, where both lead and arsenic contamination is present. Past clean-up efforts estimated contamination levels from point samples, but parcel and agency jurisdiction boundaries were used to define remediation sites, rather than geostatistical models estimating the spatial behavior of the contaminants in the soil. Residents were understandably dissatisfied with the arbitrariness of the remediation plan. In this study we show how geostatistical mapping and participatory assessment can make soil remediation scientifically defensible, socially acceptable, and economically feasible. - Highlights: ► Point samples and property boundaries do not appropriately determine the extent of soil contamination. ► Kriging and co-kriging provide best concentration estimates for mapping soil contamination and refining clean-up sites. ► Maps provide a visual representation of geostatistical results to communities to aid in geostatistical decision making. ► Incorporating community input into the assessment of neighborhoods is good public policy practice. - Using geostatistical interpolation and mapping results to involve the affected community can substantially improve remediation planning and promote its long-term effectiveness.
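Of the three interpolators compared in this record, inverse distance weighting is the simplest to sketch; the sample coordinates and lead concentrations below are hypothetical, not the Hickory Woods data:

```python
import numpy as np

def idw(xy, z, x0, power=2.0):
    """Inverse distance weighting; returns the datum itself at a sampled location."""
    d = np.linalg.norm(xy - x0, axis=1)
    if np.any(d == 0.0):
        return float(z[d == 0.0][0])
    w = 1.0 / d ** power
    return float(w @ z / w.sum())

# hypothetical sample locations (m) and lead concentrations (ppm) - not the site data
xy = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
pb = np.array([120.0, 300.0, 60.0])
est = idw(xy, pb, np.array([5.0, 0.0]))
print(round(est, 1))   # lies between the sampled minimum and maximum
```

Unlike kriging, IDW ignores the spatial correlation structure of the contaminant and provides no estimation variance, which is one reason the study favours the kriging surfaces for delineating clean-up boundaries.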

  8. Assessing Landscape-Scale Soil Moisture Distribution Using Auxiliary Sensing Technologies and Multivariate Geostatistics

    Science.gov (United States)

    Landrum, C.; Castrignanò, A.; Mueller, T.; Zourarakis, D.; Zhu, J.

    2013-12-01

    from 2010 LiDAR returns collected at ≤1 m nominal pulse spacing. Exploratory statistics revealed 12 variables that best associate with soil moisture, including slope, elevation, calcium, organic matter, clay, sand and geoelectric measurements (ERa for each date). A linear combination of basic variogram functions, called the linear model of coregionalization (LMC), was fitted to the matrix of direct and cross experimental variograms of the 12 variables studied. The LMC consisted of 3 basic components: nugget, spherical (short-range scale = 40 m) and exponential (long-range scale = 250 m), which explained 17%, 22% and 60% of the total measured variation, respectively. Applying principal component analysis to the coregionalization matrix at each spatial scale produced a set of regionalized factors summarizing the variation at that scale. Mapping the regionalized factors decomposes the total measured variation into scale-dependent synthetic homogeneous zones that lend insight into the properties influencing soil moisture variation. Results suggest that soil texture and organic matter drive the soil moisture variation under the moisture regimes observed. This study shows the potential of using ERa and multivariate geostatistics to develop soil moisture management strategies under water-stressed conditions.
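    The nested structure described above (nugget plus spherical plus exponential) can be written as a sum of basic variogram functions. A minimal sketch on a unit-sill scale, with partial sills 0.17/0.22/0.60 chosen only to mirror the reported variance shares; the ranges 40 m and 250 m come from the abstract, while the parameterization of the exponential by its practical range is an assumption:

```python
import numpy as np

def spherical(h, a):
    """Spherical basic structure with unit sill and range a."""
    h = np.minimum(np.asarray(h, dtype=float), a)
    return 1.5 * h / a - 0.5 * (h / a) ** 3

def nested_variogram(h, c0=0.17, c1=0.22, c2=0.60, a1=40.0, a2=250.0):
    """Nugget + spherical (range a1) + exponential (practical range a2);
    partial sills c0, c1, c2 mirror the reported 17/22/60% shares."""
    h = np.asarray(h, dtype=float)
    return (c0 * (h > 0)                           # nugget effect
            + c1 * spherical(h, a1)                # short-range structure
            + c2 * (1.0 - np.exp(-3.0 * h / a2)))  # long-range structure

print(nested_variogram([0.0, 40.0, 1e6]))  # 0 at the origin; ~0.99 total sill
```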

  9. Assimilation of Satellite Soil Moisture observation with the Particle Filter-Markov Chain Monte Carlo and Geostatistical Modeling

    Science.gov (United States)

    Moradkhani, Hamid; Yan, Hongxiang

    2016-04-01

    Soil moisture simulation and prediction are increasingly used to characterize agricultural droughts, but the process suffers from data scarcity and quality issues. Satellite soil moisture observations can be used to improve model predictions through data assimilation. Remote sensing products, however, are typically discontinuous in spatial and temporal coverage, while simulated soil moisture products are potentially biased owing to errors in forcing data and parameters and to deficiencies in model physics. This study provides a detailed analysis of the joint and separate assimilation of streamflow and Advanced Scatterometer (ASCAT) surface soil moisture into a fully distributed hydrologic model, using the recently developed particle filter-Markov chain Monte Carlo (PF-MCMC) method. A geostatistical model is introduced to overcome the satellite soil moisture discontinuity issue where the satellite data do not cover the whole study region or are significantly biased, or where the dominant land cover is dense vegetation. The results indicate that joint assimilation of soil moisture and streamflow has minimal effect on improving the streamflow prediction; however, the surface soil moisture field is significantly improved. The combination of data assimilation and the geostatistical approach can further improve the surface soil moisture prediction.
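    The PF-MCMC scheme referenced above builds on the standard bootstrap (SIR) particle filter. A toy sketch for a one-dimensional linear-Gaussian state-space model; all model settings are illustrative and unrelated to the hydrologic model of the study:

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter(obs, n=500, a=0.9, q=0.5, r=0.5):
    """Bootstrap (SIR) particle filter for x_t = a*x_{t-1} + N(0, q),
    y_t = x_t + N(0, r); returns the posterior-mean state estimates."""
    x = rng.normal(0.0, 1.0, n)                      # initial particle cloud
    means = []
    for y in obs:
        x = a * x + rng.normal(0.0, np.sqrt(q), n)   # propagate particles
        w = np.exp(-0.5 * (y - x) ** 2 / r)          # Gaussian likelihood
        w /= w.sum()
        means.append(np.sum(w * x))                  # weighted posterior mean
        x = rng.choice(x, size=n, p=w)               # multinomial resampling
    return np.array(means)

# Synthetic truth and noisy observations from the same toy model
T = 50
truth = np.zeros(T)
for t in range(1, T):
    truth[t] = 0.9 * truth[t - 1] + rng.normal(0.0, np.sqrt(0.5))
obs = truth + rng.normal(0.0, np.sqrt(0.5), T)
est = particle_filter(obs)
```

    With enough particles, the filtered estimate is closer to the truth than the raw observations; the MCMC move step of PF-MCMC (not sketched here) additionally rejuvenates the particle set to counter degeneracy.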

  10. GEOSTATISTICAL MODEL EVALUATION OF LIMING ON OSIJEK-BARANYA COUNTY EXAMPLE

    Directory of Open Access Journals (Sweden)

    Vladimir Vukadinović

    2008-12-01

    Full Text Available Unfavourable soil pH is a main cause of imbalances in mineral nutrition, which can lead to many plant-growth problems such as chlorosis and necrosis of leaves and fruit. Liming, as a measure for improving acid soils, must therefore be conducted very carefully, based on detailed chemical soil analyses. This paper presents a segment of a computer model for liming recommendation, using the example of Osijek-Baranya County. The liming recommendations were obtained by a geostatistical interpolation method, kriging. A total of 9023 soil samples were analyzed in the period 2003-2007. The average substitution acidity was pH 5.49 (minimum 3.41, maximum 8.20). Kriging showed that 241 379 ha (58.3% of the area of Osijek-Baranya County) were acid soils: 90 593 ha have a substitution acidity lower than 4.5, and 150 786 ha have a pH(KCl) between 4.5 and 5.5. Besides carbocalk, other slow-acting materials can be recommended for liming, especially for vineyards and orchards.

  11. Large to intermediate-scale aquifer heterogeneity in fine-grain dominated alluvial fans (Cenozoic As Pontes Basin, northwestern Spain): insight based on three-dimensional geostatistical reconstruction

    Science.gov (United States)

    Falivene, O.; Cabrera, L.; Sáez, A.

    2007-08-01

    Facies reconstructions are used in hydrogeology to improve the interpretation of aquifer permeability distribution. In the absence of sufficient data to define the heterogeneity due to geological processes, uncertainties in the distribution of aquifer hydrofacies and characteristics may appear. Geometric and geostatistical methods are used to understand and model aquifer hydrofacies distribution, providing models to improve comprehension and development of aquifers. However, these models require input statistical parameters that can be difficult to infer from the study site. A three-dimensional reconstruction of a kilometer-scale, fine-grain-dominated Cenozoic alluvial fan, derived from more than 200 continuously cored, closely spaced, and regularly distributed wells, is presented. The facies distributions were reconstructed using a genetic stratigraphic subdivision and a deterministic geostatistical algorithm. The reconstruction is only slightly affected by variations in the geostatistical input parameters because of the high-density data set. Analysis of the reconstruction allowed identification, in the proximal to medial alluvial fan zones, of several laterally extensive sand bodies with relatively high permeability; these sand bodies were quantified in terms of volume, mean thickness, maximum area, and maximum equivalent diameter. These quantifications provide trends and geological scenarios for input statistical parameters to model aquifer systems in similar alluvial fan depositional settings.

  12. Efficient Geostatistical Inversion under Transient Flow Conditions in Heterogeneous Porous Media

    Science.gov (United States)

    Klein, Ole; Cirpka, Olaf A.; Bastian, Peter; Ippisch, Olaf

    2014-05-01

    a reasonable range. Transient inversion, however, requires time series of measurements and therefore typically leads to a large number of observations, under which circumstances the existing methods become infeasible. We present an extension of the existing inversion methods to transient flow regimes. Our approach uses a Conjugate Gradients scheme preconditioned with the prior covariance matrix Q_YY to avoid both multiplications with the inverse of Q_YY and the explicit assembly of the sensitivity matrix H; instead, one combined adjoint model run is used for all observations at once. As the computing time of our approach is largely independent of the number of measurements used for inversion, the presented method can be applied to large data sets. This facilitates the treatment of applications with variable boundary conditions (nearby rivers, precipitation). We integrate the geostatistical inversion method into the software framework DUNE, enabling the use of high-performance-computing techniques and full parallelization. The feasibility of our approach is demonstrated through the joint inversion of several synthetic data sets in two and three dimensions, e.g. estimation of hydraulic conductivity from hydraulic head values and tracer concentrations, and the scalability of the new method is analyzed. A comparison of the new method with existing geostatistical inversion approaches highlights its advantages and drawbacks and demonstrates scenarios in which our scheme can be beneficial.
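    The core numerical idea, conjugate gradients preconditioned by the prior covariance so that only products with Q_YY (never its inverse) are needed, can be sketched as follows. The toy system assembles all matrices explicitly for demonstration only; in the actual method the matrix-vector products would come from adjoint model runs:

```python
import numpy as np

def pcg(apply_A, b, apply_M, tol=1e-10, maxit=200):
    """Preconditioned conjugate gradients: solve A x = b using only
    matrix-vector products; apply_M applies the preconditioner, here
    multiplication by the prior covariance (no inverse ever formed)."""
    x = np.zeros_like(b)
    r = b - apply_A(x)
    z = apply_M(r)
    p = z.copy()
    rz = r @ z
    for _ in range(maxit):
        Ap = apply_A(p)
        alpha = rz / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = apply_M(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# Toy SPD system A = Q^-1 + H^T H (assembled explicitly only for the demo)
rng = np.random.default_rng(1)
n = 30
L = rng.normal(size=(n, n))
Q = L @ L.T + n * np.eye(n)               # prior covariance, SPD
H = rng.normal(size=(5, n))
A = np.linalg.inv(Q) + H.T @ H
b = rng.normal(size=n)
x = pcg(lambda v: A @ v, b, lambda r: Q @ r)
print(np.linalg.norm(A @ x - b))          # residual is tiny
```

    Because the preconditioned operator is the identity plus a low-rank term here, CG converges in only a handful of iterations, which is the point of preconditioning with the prior covariance.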

  13. The Geostatistical Framework for Spatial Prediction%空间预测的地统计学框架

    Institute of Scientific and Technical Information of China (English)

    张景雄; 姚娜

    2008-01-01

    Geostatistics provides a coherent framework for spatial prediction and uncertainty assessment, whereby spatial dependence, as quantified by variograms, is utilized for best linear unbiased estimation of a regionalized variable at unsampled locations. Geostatistics for prediction of continuous regionalized variables is reviewed, with the key methods underlying the derivation of the major variants of univariate kriging described in an easy-to-follow manner. This paper will contribute to the demystification and, hence, popularization of geostatistics in geoinformatics communities.
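    The best linear unbiased estimator mentioned above solves the ordinary kriging system, with a Lagrange multiplier enforcing that the weights sum to one. A minimal sketch with an assumed exponential variogram and made-up data:

```python
import numpy as np

def ordinary_kriging(xy, z, x0, sill=1.0, a=10.0):
    """Ordinary kriging at x0 with an exponential variogram
    gamma(h) = sill*(1 - exp(-h/a)); a Lagrange multiplier enforces
    that the weights sum to one (unbiasedness)."""
    gamma = lambda h: sill * (1.0 - np.exp(-h / a))
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    K = np.empty((n + 1, n + 1))
    K[:n, :n] = gamma(d)
    K[n, :] = 1.0
    K[:, n] = 1.0
    K[n, n] = 0.0
    rhs = np.append(gamma(np.linalg.norm(xy - x0, axis=1)), 1.0)
    w = np.linalg.solve(K, rhs)[:n]
    return w @ z, w

xy = np.array([[0.0, 0.0], [2.0, 0.0], [0.0, 2.0]])
z = np.array([1.0, 3.0, 2.0])
est, w = ordinary_kriging(xy, z, np.array([1.0, 1.0]))
print(est, w.sum())  # weights sum to 1; the estimator is exact at data points
```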

  14. Geostatistical application for the variability study of the Callovo-Oxfordian unit

    International Nuclear Information System (INIS)

    impact of the model and therefore the importance of the data analyses. As the correlation between seismic and log impedances is estimated only at core-drill locations, the correlation coefficient is defined accurately but only locally. Moreover, a correlation coefficient close to 1 requires good knowledge of the seismic impedance. A detailed data analysis has therefore been performed. The results of this analysis are as follows: - The values of the seismic impedance and log impedance increase towards the limits of the Callovo-Oxfordian unit; the correlation between the two impedances decreases if the edges of the unit are excluded. - The distribution of the impedances reveals a subdivision into two sub-units. These sub-units have been studied separately: they appear to have different geostatistical behaviors, the lower sub-unit being more homogeneous than the upper one. - Low values of the seismic impedance in the lower sub-unit are observed in the south-east part of the study zone. - Application of this methodology to 3D seismic impedances. - Validation of the variogram model could be more precise with 3D information. - Study of the relationship between the impedances and the geological characteristics (content of carbonates, clay and silt) before addressing the case of the geomechanical variability. (authors)

  15. Eigenvalues Analysis Applied to the Stability Study of a Variable Speed Pump Turbine Unit

    OpenAIRE

    Han, Michel; Kawkabani, Basile; Simond, Jean-Jacques

    2012-01-01

    This paper presents an eigenvalue analysis method for an induction machine model based on phase variables a, b, c. The approach is validated by comparison with a d,q-axis approach and applied to a variable-speed pump-turbine unit using a simplified model of the converters. The proposed approach permits the study and analysis of the interaction between the mechanical, electric and regulation parts of the system. Index Terms—control systems, numerical simulation, induction motors, AC-DC power converte...

  16. On the relation between applied behavior analysis and positive behavioral support

    OpenAIRE

    Carr, James E.; Sidener, Tina M

    2002-01-01

    Anderson and Freeman (2000) recently defined positive behavioral support (PBS) as a systematic approach to the delivery of clinical and educational services that is rooted in behavior analysis. However, the recent literature contains varied definitions of PBS as well as discrepant notions regarding the relation between applied behavior analysis and PBS. After summarizing common definitional characteristics of PBS from the literature, we conclude that PBS is comprised almost exclusively of tec...

  17. Dimensional analysis and extended hydrodynamic theory applied to long-rod penetration of ceramics

    OpenAIRE

    Clayton, J. D.

    2016-01-01

    Principles of dimensional analysis are applied in a new interpretation of penetration of ceramic targets subjected to hypervelocity impact. The analysis results in a power series representation – in terms of inverse velocity – of normalized depth of penetration that reduces to the hydrodynamic solution at high impact velocities. Specifically considered are test data from four literature sources involving penetration of confined thick ceramic targets by tungsten long rod projectiles. The ceram...

  18. Applying Multiscale Entropy to the Complexity Analysis of Rainfall-Runoff Relationships

    OpenAIRE

    Chien-Ming Chou

    2012-01-01

    This paper presents a novel framework for the complexity analysis of rainfall, runoff, and runoff coefficient (RC) time series using multiscale entropy (MSE). The MSE analysis of RC time series was used to investigate changes in the complexity of rainfall-runoff processes due to human activities. Firstly, a coarse graining process was applied to a time series. The sample entropy was then computed for each coarse-grained time series, and plotted as a function of the scale factor. The proposed ...
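    The two steps described, coarse graining followed by sample entropy at each scale factor, can be sketched as follows. Note that the tolerance r is fixed from the original series, which is why the entropy of uncorrelated noise declines with scale; the data here are synthetic white noise, not rainfall-runoff records:

```python
import numpy as np

def coarse_grain(x, scale):
    """Average consecutive non-overlapping windows of length `scale`."""
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r): negative log of the ratio of (m+1)-point to m-point
    template matches within tolerance r, excluding self-matches."""
    def matches(mm):
        t = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        c = 0
        for i in range(len(t)):
            d = np.max(np.abs(t - t[i]), axis=1)
            c += int(np.sum(d <= r)) - 1      # subtract the self-match
        return c
    b, a = matches(m), matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else float("inf")

rng = np.random.default_rng(2)
noise = rng.normal(size=2000)
r = 0.2 * np.std(noise)                       # tolerance fixed at scale 1
mse = [sample_entropy(coarse_grain(noise, s), 2, r) for s in (1, 2, 4)]
print(mse)  # for white noise, entropy decreases as the scale factor grows
```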

  19. Does size matter? Separations on guard columns for fast sample analysis applied to bioenergy research

    OpenAIRE

    Bauer, Stefan; Ibanez, Ana B

    2015-01-01

    Background Increasing sample throughput is needed when large numbers of samples have to be processed. In chromatography, one strategy is to reduce column length for decreased analysis time. Therefore, the feasibility of analyzing samples simply on a guard column was explored using refractive index and ultraviolet detection. Results from the guard columns were compared to the analyses using the standard 300 mm Aminex HPX-87H column which is widely applied to the analysis of samples from many b...

  20. Structural Integrity Analysis of the RBMK Reactor Critical Structures Applying Probabilistic Methods

    OpenAIRE

    Dundulis, Gintautas; Kulak, Ronald; Alzbutas, Robertas; Uspuras, Eugenijus

    2010-01-01

    The probability-based approach that integrates deterministic and probabilistic methods was developed to analyse failures of NPP buildings and components. This methodology was applied to safety analysis of the Ignalina NPP. The application of this methodology to two postulated accidents (pipe whip impact and aircraft crash) is presented in this chapter. The NEPTUNE software system was used for the deterministic transient analysis of the pipe whip impact and aircraft crash accidents. Many det...

  1. Spatial variability of selected physicochemical parameters within peat deposits in small valley mire: a geostatistical approach

    Directory of Open Access Journals (Sweden)

    Pawłowski Dominik

    2014-12-01

    Full Text Available Geostatistical methods for 2D and 3D modelling of the spatial variability of selected physicochemical properties of biogenic sediments were applied to a small valley mire in order to identify the processes that lead to the formation of various types of peat. A sequential Gaussian simulation was performed to reproduce the statistical distribution of the input data (pH and organic matter) and their semivariances, as well as to honour the data values, yielding more ‘realistic’ models that show microscale spatial variability despite the input sample cores being sparsely distributed in the X-Y space of the study area. The stratigraphy of peat deposits in the Ldzań mire records the long-term evolution of water conditions, which is associated with variability in water supply over time. Ldzań is a fen (a rheotrophic mire with a through-flow of groundwater). Additionally, the vicinity of the Grabia River is marked by seasonal inundations of the southwest part of the mire and an increased proportion of mineral matter in the peat. In turn, the upper peat layers of part of the central Ldzań mire are rather spongy, and these peat-forming phytocoenoses probably formed during permanent waterlogging.
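    A sequential Gaussian simulation like the one described honours the data while reproducing the target distribution and spatial covariance. A heavily simplified 1-D sketch, using simple kriging with an exponential covariance and data assumed already transformed to normal scores; all values are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(3)

def sgs_1d(grid_x, cond_x, cond_z, a=5.0):
    """Sequential Gaussian simulation on a 1-D grid: visit nodes along a
    random path, simple-krige from conditioning data plus previously
    simulated nodes (exponential covariance exp(-h/a), zero mean,
    unit variance), and draw from the conditional normal."""
    known_x = list(cond_x)
    known_z = list(cond_z)
    sim = {}
    for x0 in rng.permutation(grid_x):
        xs = np.array(known_x)
        C = np.exp(-np.abs(xs[:, None] - xs[None, :]) / a)
        c0 = np.exp(-np.abs(xs - x0) / a)
        w = np.linalg.solve(C, c0)
        mean = w @ np.array(known_z)
        var = max(1.0 - w @ c0, 0.0)          # simple-kriging variance
        sim[x0] = rng.normal(mean, np.sqrt(var))
        known_x.append(x0)
        known_z.append(sim[x0])
    return np.array([sim[x] for x in grid_x])

grid = np.arange(0.0, 20.0, 1.0)
real = sgs_1d(grid, cond_x=[0.5, 18.5], cond_z=[1.5, -1.0])
print(real.round(2))  # one conditional realization on the 20-node grid
```

    Running the function repeatedly yields multiple equally probable realizations, whose spread quantifies spatial uncertainty away from the conditioning cores.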

  2. Geostatistical investigations for suitable mapping of the water table: the Bordeaux case (France)

    Science.gov (United States)

    Guekie simo, Aubin Thibaut; Marache, Antoine; Lastennet, Roland; Breysse, Denys

    2016-02-01

    Methodologies have been developed to establish realistic water-table maps using geostatistical methods: ordinary kriging (OK), cokriging (CoK), collocated cokriging (CoCoK), and kriging with external drift (KED). In hilly terrain, when piezometric data are sparsely distributed over large areas, the water-table maps obtained by these methods provide exact water levels at monitoring wells but can fail to represent the groundwater flow system, manifested through an interpolated water table lying above the topography. A methodology is developed to rebuild water-table maps for urban areas at the city scale. The interpolation methodology is presented and applied in a case study where water levels are monitored at a set of 47 points in a partly urban domain covering 25.6 km2 close to Bordeaux city, France. To select the best method, a geographic information system was used to visualize the surfaces reconstructed with each method, and a cross-validation was carried out to evaluate the predictive performance of each kriging method. KED proves to be the most accurate and yields a better description of the local fluctuations induced by the topography (natural occurrence of ridges and valleys).
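    Cross-validation of competing interpolators, as used above to select KED, can be sketched by leave-one-out prediction at each monitoring well. The sketch below uses a simple IDW interpolator and synthetic water levels rather than the Bordeaux data:

```python
import numpy as np

rng = np.random.default_rng(4)

def idw_est(xy, z, q, power):
    """IDW estimate at a single query point q."""
    d = np.linalg.norm(xy - q, axis=1)
    w = 1.0 / np.maximum(d, 1e-12) ** power
    return np.sum(w * z) / np.sum(w)

def loo_rmse(xy, z, power):
    """Leave-one-out cross-validation: predict each monitoring point
    from all the others and score the prediction errors."""
    errs = [z[i] - idw_est(np.delete(xy, i, axis=0), np.delete(z, i),
                           xy[i], power)
            for i in range(len(z))]
    return float(np.sqrt(np.mean(np.square(errs))))

# Hypothetical water levels: smooth trend plus noise at 47 wells
xy = rng.uniform(0.0, 10.0, size=(47, 2))
z = 50.0 - 0.8 * xy[:, 0] + 0.3 * xy[:, 1] + rng.normal(0.0, 0.2, 47)
for p in (1.0, 2.0, 4.0):
    print(p, loo_rmse(xy, z, p))  # choose the variant with the lowest RMSE
```

    The same loop works for any interpolator (OK, CoK, KED); whichever yields the lowest cross-validation RMSE is preferred.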

  3. Spatial distribution of indoor radon in Triveneto (Northern Italy): A geostatistical approach

    International Nuclear Information System (INIS)

    The study of the spatial distribution of indoor radon has attracted considerable interest in recent years, and geostatistical techniques have proved particularly promising. The present work presents the results of a study in which around 4000 indoor radon measurements from Veneto, Friuli Venezia-Giulia and Alto Adige, collected during sampling campaigns performed in dwellings and in schools, were analyzed. After the definition of a common data set, the spatial distribution of the phenomenon was studied by examining the experimental variograms. Declustering techniques were applied. Predictive maps were produced using simulation techniques; they allow determination of the probabilities of exceeding defined concentration levels, and hence of the 'radon-prone' areas. Systematic results regarding the validation of these maps are reported. This methodological study indicates how the geographical variability of the phenomenon can be understood by seeking correlations among indoor radon, geological characteristics (i.e. lithology, morphology, tectonics, soil gas) and building-specific features, which can significantly influence radon concentrations. (authors)

  4. Study on the spatial pattern of rainfall erosivity based on geostatistics in Hebei Province,China

    Institute of Scientific and Technical Information of China (English)

    Mingxin MEN; Zhenrong YU; Hao XU

    2008-01-01

    The objective of this article was to study the spatial distribution pattern of rainfall erosivity. Precipitation data from each climatological station in Hebei Province, China were collected, analyzed and modeled with SPSS and ArcGIS. A simple model for estimating rainfall erosivity was developed based on the weather station data, and the annual average rainfall erosivity was calculated with this model. The prediction errors, statistical feature values and prediction maps obtained using different interpolation methods were compared. The results indicated that the second-order ordinary kriging method performed better than both the zero- and first-order ordinary kriging methods. Within the second-order trend method, the Gaussian semivariogram model performed better than interpolation with the spherical or exponential models. Applying geostatistics to study the spatial pattern of rainfall erosivity will help to evaluate soil erosion risk accurately and quantitatively. Our research also provides digital maps that can assist policy making in regional soil and water conservation planning and management strategies.

  5. A Numerical Procedure for Model Identifiability Analysis Applied to Enzyme Kinetics

    DEFF Research Database (Denmark)

    Daele, Timothy, Van; Van Hoey, Stijn; Gernaey, Krist;

    2015-01-01

    structure evaluation by assessing the local identifiability characteristics of the parameters. Moreover, such a procedure should be generic, so that it can be applied independently of the structure of the model. We hereby apply a numerical identifiability approach based on the work of Walter and Pronzato (1997), which can easily be set up for any type of model. In this paper the proposed approach is applied to the forward reaction rate of the enzyme kinetics proposed by Shin and Kim (1998). Structural identifiability analysis showed that no local structural model problems were occurring. In contrast, the practical identifiability analysis revealed that high values of the forward rate parameter Vf led to identifiability problems. These problems were even more pronounced at higher substrate concentrations, which illustrates the importance of a proper experimental design to avoid...

  6. Using Applied Behaviour Analysis as Standard Practice in a UK Special Needs School

    Science.gov (United States)

    Foran, Denise; Hoerger, Marguerite; Philpott, Hannah; Jones, Elin Walker; Hughes, J. Carl; Morgan, Jonathan

    2015-01-01

    This article describes how applied behaviour analysis can be implemented effectively and affordably in a maintained special needs school in the UK. Behaviour analysts collaborate with classroom teachers to provide early intensive behaviour education for young children with autism spectrum disorders (ASD), and function based behavioural…

  7. Lovaas Model of Applied Behavior Analysis. What Works Clearinghouse Intervention Report

    Science.gov (United States)

    What Works Clearinghouse, 2010

    2010-01-01

    The "Lovaas Model of Applied Behavior Analysis" is a type of behavioral therapy that initially focuses on discrete trials: brief periods of one-on-one instruction, during which a teacher cues a behavior, prompts the appropriate response, and provides reinforcement to the child. Children in the program receive an average of 35 to 40 hours of…

  8. A National UK Census of Applied Behavior Analysis School Provision for Children with Autism

    Science.gov (United States)

    Griffith, G. M.; Fletcher, R.; Hastings, R. P.

    2012-01-01

    Over more than a decade, specialist Applied Behavior Analysis (ABA) schools or classes for children with autism have developed in the UK and Ireland. However, very little is known internationally about how ABA is defined in practice in school settings, the characteristics of children supported in ABA school settings, and the staffing structures…

  9. Applied Behavior Analysis Programs for Autism: Sibling Psychosocial Adjustment during and Following Intervention Use

    Science.gov (United States)

    Cebula, Katie R.

    2012-01-01

    Psychosocial adjustment in siblings of children with autism whose families were using a home-based, applied behavior analysis (ABA) program was compared to that of siblings in families who were not using any intensive autism intervention. Data gathered from parents, siblings and teachers indicated that siblings in ABA families experienced neither…

  10. Evolution of Applied Behavior Analysis in the Treatment of Individuals With Autism

    Science.gov (United States)

    Wolery, Mark; Barton, Erin E.; Hine, Jeffrey F.

    2005-01-01

    Two issues of each volume of the Journal of Applied Behavior Analysis were reviewed to identify research reports focusing on individuals with autism. The identified articles were analyzed to describe the ages of individuals with autism, the settings in which the research occurred, the nature of the behaviors targeted for intervention, and the…

  11. Applied Behavior Analysis in Autism Spectrum Disorders: Recent Developments, Strengths, and Pitfalls

    Science.gov (United States)

    Matson, Johnny L.; Turygin, Nicole C.; Beighley, Jennifer; Rieske, Robert; Tureck, Kimberly; Matson, Michael L.

    2012-01-01

    Autism has become one of the most heavily researched topics in the field of mental health and education. While genetics has been the most studied of all topics, applied behavior analysis (ABA) has also received a great deal of attention, and has arguably yielded the most promising results of any research area to date. The current paper provides a…

  12. A Self-Administered Parent Training Program Based upon the Principles of Applied Behavior Analysis

    Science.gov (United States)

    Maguire, Heather M.

    2012-01-01

    Parents often respond to challenging behavior exhibited by their children in such a way that unintentionally strengthens it. Applied behavior analysis (ABA) is a research-based science that has been proven effective in remediating challenging behavior in children. Although many parents could benefit from using strategies from the field of ABA with…

  13. INDEPENDENT COMPONENT ANALYSIS (ICA) APPLIED TO LONG BUNCH BEAMS IN THE LOS ALAMOS PROTON STORAGE RING

    Energy Technology Data Exchange (ETDEWEB)

    Kolski, Jeffrey S. [Los Alamos National Laboratory; Macek, Robert J. [Los Alamos National Laboratory; McCrady, Rodney C. [Los Alamos National Laboratory; Pang, Xiaoying [Los Alamos National Laboratory

    2012-05-14

    Independent component analysis (ICA) is a powerful blind source separation (BSS) method. Compared to the typical BSS method, principal component analysis (PCA), which is the BSS foundation of the well-known model independent analysis (MIA), ICA is more robust to noise, coupling, and nonlinearity. ICA of turn-by-turn beam position data has been used to measure the transverse betatron phase and amplitude functions, dispersion function, linear coupling, sextupole strength, and nonlinear beam dynamics. We apply ICA in a new way to slices along the bunch and discuss the source signals identified as betatron motion and longitudinal beam structure.
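    A compact numpy sketch of the FastICA algorithm (whitening followed by a symmetric fixed-point iteration with a tanh contrast), demonstrated on two synthetic mixed signals rather than beam position data; this is one common ICA variant, not necessarily the implementation used in the study:

```python
import numpy as np

rng = np.random.default_rng(5)

def fast_ica(X, n_iter=200):
    """FastICA sketch: center and whiten the mixtures, then run the
    symmetric fixed-point iteration with a tanh contrast, decorrelating
    the unmixing matrix after each update."""
    X = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(np.cov(X))              # whitening transform
    Z = (E @ np.diag(d ** -0.5) @ E.T) @ X
    n, T = Z.shape
    W = rng.normal(size=(n, n))
    for _ in range(n_iter):
        G = np.tanh(W @ Z)
        W = G @ Z.T / T - np.diag((1.0 - G ** 2).mean(axis=1)) @ W
        U, _, Vt = np.linalg.svd(W)
        W = U @ Vt                                # symmetric decorrelation
    return W @ Z                                  # estimated sources

# Two synthetic sources mixed by an "unknown" matrix
t = np.linspace(0.0, 8.0 * np.pi, 2000)
S = np.vstack([np.sin(t), np.sign(np.sin(2.3 * t))])
X = np.array([[1.0, 0.6], [0.4, 1.0]]) @ S
Y = fast_ica(X)  # rows match the sources up to order, sign and scale
```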

  14. Preliminary Hazard Analysis applied to Uranium Hexafluoride - UF6 production plant

    International Nuclear Information System (INIS)

    The purpose of this paper is to present the results of the Preliminary Hazard Analysis applied to the UF6 production process, which is part of the UF6 Conversion Plant. The Conversion Plant is designed to produce highly purified UF6 in accordance with nuclear-grade standards. This Preliminary Hazard Analysis is the first step in the Risk Management Studies, which are currently under development. The analysis evaluated the impact of the production process on plant operators, members of the public, equipment, systems and installations, as well as on the environment. (author)

  15. Kriging in the Shadows: Geostatistical Interpolation for Remote Sensing

    Science.gov (United States)

    Rossi, Richard E.; Dungan, Jennifer L.; Beck, Louisa R.

    1994-01-01

    It is often useful to estimate obscured or missing remotely sensed data. Traditional interpolation methods, such as nearest-neighbor or bilinear resampling, do not take full advantage of the spatial information in the image. An alternative method, a geostatistical technique known as indicator kriging, is described and demonstrated using a Landsat Thematic Mapper image in southern Chiapas, Mexico. The image was first classified into pasture and nonpasture land cover. For each pixel that was obscured by cloud or cloud shadow, the probability that it was pasture was assigned by the algorithm. An exponential omnidirectional variogram model was used to characterize the spatial continuity of the image for use in the kriging algorithm. Assuming a cutoff probability level of 50%, the error was shown to be 17% with no obvious spatial bias but with some tendency to categorize nonpasture as pasture (overestimation). While this is a promising result, the method's practical application in other missing data problems for remotely sensed images will depend on the amount and spatial pattern of the unobscured pixels and missing pixels and the success of the spatial continuity model used.
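    Indicator kriging as described reduces to coding each classified pixel as 0/1 and kriging the indicators, so the estimate can be read as a probability. A minimal sketch with an assumed exponential indicator variogram and toy pixel locations, not the Chiapas imagery:

```python
import numpy as np

def indicator_krige(xy, classes, x0, a=3.0):
    """Indicator kriging sketch: code pixels as 1 (pasture) / 0 (other),
    krige the indicators with an exponential variogram, and read the
    estimate as the probability that x0 is pasture."""
    ind = (classes == 1).astype(float)
    gamma = lambda h: 1.0 - np.exp(-h / a)
    n = len(ind)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    K = np.empty((n + 1, n + 1))
    K[:n, :n] = gamma(d)
    K[n, :] = 1.0
    K[:, n] = 1.0
    K[n, n] = 0.0
    rhs = np.append(gamma(np.linalg.norm(xy - x0, axis=1)), 1.0)
    w = np.linalg.solve(K, rhs)[:n]
    return float(np.clip(w @ ind, 0.0, 1.0))  # order-relation correction

xy = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [3.0, 3.0]])
cls = np.array([1, 1, 1, 0])
print(indicator_krige(xy, cls, np.array([0.5, 0.5])))  # high: among pasture
```

    Thresholding the returned probability at 50%, as in the study, converts the map of probabilities into a hard pasture/non-pasture classification for the obscured pixels.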

  16. Soil Organic Carbon Mapping by Geostatistics in Europe Scale

    Science.gov (United States)

    Aksoy, E.; Panagos, P.; Montanarella, L.

    2013-12-01

    Accuracy in assessing the distribution of soil organic carbon (SOC) is an important issue because SOC is a soil component that plays key roles in the functions of both natural ecosystems and agricultural systems. The SOC content varies from place to place and is strongly related to climate variables (temperature and rainfall), terrain features, soil texture, parent material, vegetation, land-use types, and human management (management and degradation) at different spatial scales. Geostatistical techniques allow the prediction of soil properties using soil information and environmental covariates. In this study, the distribution of SOC was predicted with the regression-kriging method at the European scale. The prediction combined soil samples collected in the LUCAS (European Land Use/Cover Area frame statistical Survey) and BioSoil projects with local soil data collected from six different critical zone observatories (CZOs) in Europe, using ten spatial predictors (slope, aspect, elevation, CTI, CORINE land-cover classification, parent material, texture, WRB soil classification, and annual average temperature and precipitation). Significant correlation between the covariates and the organic carbon dependent variable was found. Moreover, investigating the contribution of the local watershed-scale dataset to the regional European-scale dataset was an important challenge.
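    Regression-kriging combines a regression on the covariates with kriging of the regression residuals. A minimal sketch with a single hypothetical covariate (elevation) standing in for the ten predictors, simple kriging of the residuals, and an assumed exponential covariance; the data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(6)

def regression_kriging(xy, cov, z, xy0, cov0, a=5.0):
    """Regression-kriging sketch: fit a linear trend on the covariate,
    simple-krige the residuals (exponential covariance, zero mean),
    and add trend and residual estimates at the target location."""
    X = np.column_stack([np.ones(len(z)), cov])
    beta, *_ = np.linalg.lstsq(X, z, rcond=None)
    resid = z - X @ beta
    C = np.exp(-np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2) / a)
    c0 = np.exp(-np.linalg.norm(xy - xy0, axis=1) / a)
    w = np.linalg.solve(C + 1e-9 * np.eye(len(z)), c0)  # jitter for stability
    return float(beta[0] + beta[1] * cov0 + w @ resid)

# Hypothetical SOC (%) controlled by elevation plus noise at 40 sites
xy = rng.uniform(0.0, 10.0, size=(40, 2))
elev = rng.uniform(100.0, 500.0, 40)
soc = 4.0 - 0.005 * elev + rng.normal(0.0, 0.1, 40)
pred = regression_kriging(xy, elev, soc, xy[0], elev[0])
print(pred, soc[0])  # near-exact reproduction at a sampled location
```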

  17. A geostatistical study of the Ventersdorp Contact and Vaal Reefs

    International Nuclear Information System (INIS)

    Selective removal of gold and uranium from Witwatersrand ores, with waste or barren rock being reduced to a minimum, has become economically attractive in the light of ever-increasing mining costs. The feasibility of selective mining by means of a mechanical device known as the 'rock-cutter' is investigated in the present pilot geostatistical survey by a quantitative study of the interrelationships between gold, uranium and other minerals present in the Vaal Reef at Hartebeestfontein and Zandpan gold mines and also in the Ventersdorp Contact Reef at Venterspost gold mine. In the reefs examined, radioactivity, due mainly to the presence of uraninite, can be used with a high degree of confidence to localize areas also rich in gold, and therefore to position the rock-cutter. The phyllosilicates chlorite, muscovite and pyrophyllite as well as the heavy minerals pyrite, chromite and zircon are not sufficiently well correlated to either gold or uranium to provide meaningful information for positioning the rock-cutter

  18. Geostatistical interpolation for modelling SPT data in northern Izmir

    Indian Academy of Sciences (India)

    Selim Altun; A Burak Göktepe; Alper Sezer

    2013-12-01

    In this study, the aim was to map corrected Standard Penetration Test (SPT) values in the Karşıyaka city center by a kriging approach. Six maps were prepared by this geostatistical approach at depths of 3, 6, 9, 13.5, 18 and 25.5 m. Borehole test results obtained from 388 boreholes in central Karşıyaka were used to model the spatial variation of (N1)60cs values in an area of 5.5 km2. Corrections for depth, hammer energy, rod length, sampler, borehole diameter and fines content were applied to the data in hand. At various depths, the prepared variograms and the kriging method were used together to model the variation of corrected SPT data in the region, which enabled the estimation of missing data. The results revealed that the estimation ability of the models was acceptable, as validated by a number of parameters as well as by comparisons of actual and estimated data. Outcomes of this study can be used in microzonation studies, site response analyses, and calculation of the bearing capacity of subsoils in the region, as well as in deriving parameters that are empirically related to the corrected SPT number.

  19. Applying Fuzzy and Probabilistic Uncertainty Concepts to the Material Flow Analysis of Palladium in Austria

    DEFF Research Database (Denmark)

    Laner, David; Rechberger, Helmut; Astrup, Thomas Fruergaard

    2015-01-01

    Material flow analysis (MFA) is a widely applied tool to investigate resource and recycling systems of metals and minerals. Owing to data limitations and restricted system understanding, MFA results are inherently uncertain. To demonstrate the systematic implementation of uncertainty analysis in MFA, two mathematical concepts for the quantification of uncertainties were applied to Austrian palladium (Pd) resource flows and evaluated: (1) uncertainty ranges expressed by fuzzy sets and (2) uncertainty ranges defined by normal distributions given as mean values and standard deviations. […] the Austrian resource system, whereas approximately 70% of the Pd in the end-of-life (EOL) consumer products is recovered in waste management. In conclusion, systematic uncertainty analysis is an integral part of MFA required to provide robust decision support in resource management.

  20. Characterisation of contaminated metals using an advanced statistical toolbox - Geostatistical characterisation of contaminated metals: methodology and illustrations

    International Nuclear Information System (INIS)

    Radiological characterisation plays an important role in the process to recycle contaminated or potentially contaminated metals. It is a platform for planning, identification of the extent and nature of contamination, assessing potential risk impacts, cost estimation, radiation protection, management of material arising from decommissioning, as well as for the clearance of materials and the disposal of the generated secondary waste as radioactive waste. Key issues in radiological characterisation are identification of objectives, development of a measurement and sampling strategy (probabilistic, judgmental or a combination thereof), knowledge management, traceability, recording and processing of obtained information. By applying an advanced combination of statistical and geostatistical methods, better performance can be achieved at a lower cost. This paper describes the benefits of the available methods in the different stages of the characterisation, treatment and clearance processes, aiming for reliable results in line with the data quality objectives. (authors)

  1. National youth sedentary behavior and physical activity daily patterns using latent class analysis applied to accelerometry

    OpenAIRE

    Evenson, Kelly R.; Wen, Fang; Hales, Derek; Herring, Amy H.

    2016-01-01

    Background: Applying latent class analysis (LCA) to accelerometry can help elucidate underlying patterns. This study described the patterns of accelerometer-determined sedentary behavior and physical activity among youth by applying LCA to a nationally representative United States (US) sample. Methods: Using 2003–2006 National Health and Nutrition Examination Survey data, 3998 youths aged 6–17 years wore an ActiGraph 7164 accelerometer for one week, providing ≥3 days of wear for ≥8 h/day from 6…

  2. Research in progress in applied mathematics, numerical analysis, fluid mechanics, and computer science

    Science.gov (United States)

    1994-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period October 1, 1993 through March 31, 1994. The major categories of the current ICASE research program are: (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest to LaRC, including acoustics and combustion; (3) experimental research in transition and turbulence and aerodynamics involving LaRC facilities and scientists; and (4) computer science.

  3. Structure analysis of interstellar clouds: II. Applying the Delta-variance method to interstellar turbulence

    CERN Document Server

    Ossenkopf, V; Stutzki, J

    2008-01-01

    The Delta-variance analysis is an efficient tool for measuring the structural scaling behaviour of interstellar turbulence in astronomical maps. In paper I we proposed essential improvements to the Delta-variance analysis. In this paper we apply the improved Delta-variance analysis to i) a hydrodynamic turbulence simulation with prominent density and velocity structures, ii) an observed intensity map of rho Oph with irregular boundaries and variable uncertainties of the different data points, and iii) a map of the turbulent velocity structure in the Polaris Flare affected by the intensity dependence on the centroid velocity determination. The tests confirm the extended capabilities of the improved Delta-variance analysis. Prominent spatial scales were accurately identified and artifacts from a variable reliability of the data were removed. The analysis of the hydrodynamic simulations showed that, after the injection of a turbulent velocity structure, the most prominent density structures are produced on a sca...

  4. Critical Analysis of a Website: A Critique based on Critical Applied Linguistics and Critical Discourse Analysis

    OpenAIRE

    Rina Agustina

    2013-01-01

    E-learning is easily found through browsing the internet, is mostly free of charge and provides various learning materials. Spellingcity.com is one such e-learning website for teaching and learning English spelling, vocabulary and writing, which offers various games and activities for young learners, 6 to 8 year old learners in particular. Having considered those constraints, this paper aimed to analyse the website from two different views: (1) critical applied linguistics (...

  5. A Review of Temporal Aspects of Hand Gesture Analysis Applied to Discourse Analysis and Natural Conversation

    Directory of Open Access Journals (Sweden)

    Renata C. B. Madeo

    2013-08-01

    Lately, there has been an increasing interest in hand gesture analysis systems. Recent works have employed pattern recognition techniques and have focused on the development of systems with more natural user interfaces. These systems may use gestures to control interfaces or recognize sign language gestures, which can provide systems with multimodal interaction; or consist in multimodal tools to help psycholinguists understand new aspects of discourse analysis and to automate laborious tasks. Gestures are characterized by several aspects, mainly by movements and sequences of postures. Since data referring to movements or sequences carry temporal information, this paper presents a literature review about temporal aspects of hand gesture analysis, focusing on applications related to natural conversation and psycholinguistic analysis, using the Systematic Literature Review methodology. In our results, we organized works according to type of analysis, methods, highlighting the use of Machine Learning techniques, and applications.

  6. Comparative study of the geostatistical ore reserve estimation method over the conventional methods

    International Nuclear Information System (INIS)

    Part I contains a comprehensive treatment of the comparative study of the geostatistical ore reserve estimation method versus conventional methods. The conventional methods chosen for comparison were: (a) the polygon method, (b) the inverse of the distance squared method, and (c) a method similar to (b) but allowing different weights in different directions. Briefly, the overall result from this comparative study favors the use of geostatistics in most cases, because the method has lived up to its theoretical claims. A good exposition of the theory of geostatistics, the adopted study procedures, conclusions and recommended future research is given in Part I. Part II of this report contains the results of the second and third study objectives, which are to assess the potential benefits that can be derived by introducing the geostatistical method into the current state of the art in uranium reserve estimation, and to help generate acceptance of the new method by practitioners through illustrative examples, assuming its superiority and practicality. These are given in the form of illustrative examples on the use of geostatistics and the accompanying computer program user's guide
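    The inverse-of-the-distance-squared method, one of the conventional estimators compared in the report, is simple enough to sketch directly; the drill-hole coordinates and grades below are hypothetical.

    ```python
    import numpy as np

    def idw(pts, vals, x0, power=2.0, eps=1e-12):
        """Inverse-distance weighting; power=2 gives the inverse of the
        distance squared method compared in the report."""
        d = np.linalg.norm(pts - x0, axis=1)
        if d.min() < eps:                  # query coincides with a sample
            return vals[d.argmin()]
        w = 1.0 / d ** power
        return (w @ vals) / w.sum()        # convex combination of samples

    # hypothetical drill-hole grades at the corners of a unit block
    pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
    vals = np.array([10.0, 20.0, 30.0, 40.0])
    est = idw(pts, vals, np.array([0.25, 0.25]))
    ```

    Unlike kriging, the weights here depend only on distance, not on the spatial structure of the variable, which is the core of the comparison made in the report.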

  7. The detection of thermophilous forest hotspots in Poland using geostatistical interpolation of plant richness

    Directory of Open Access Journals (Sweden)

    Marcin Kiedrzyński

    2014-07-01

    Attempts to study biodiversity hotspots on a regional scale should combine compositional and functionalist criteria. The detection of hotspots in this study uses one ecologically similar group of high conservation value species as hotspot indicators, as well as focal habitat indicators, to detect the distribution of suitable environmental conditions. The method is assessed with reference to thermophilous forests in Poland – key habitats for many rare and relict species. Twenty-six high conservation priority species were used as hotspot indicators, and ten plant taxa characteristic of the Quercetalia pubescenti-petraeae phytosociological order were used as focal habitat indicators. Species distribution data were based on a 10 × 10 km grid. The number of species per grid square was interpolated by the ordinary kriging geostatistical method. Our analysis largely determined the distribution of areas with concentrations of thermophilous forest flora, but also regional disjunctions and geographical barriers. Indicator species richness can be interpreted as a reflection of the actual state of habitat conditions. It can also be used to determine the location of potential species refugia and possible past and future migration routes.
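    The empirical semivariogram that precedes any ordinary kriging run, such as the interpolation of species counts per grid square described above, can be estimated as in this sketch; the grid, surface and bin layout are synthetic assumptions.

    ```python
    import numpy as np

    def empirical_semivariogram(pts, vals, n_lags=6, max_lag=None):
        """Classical (Matheron) semivariogram estimator over distance bins."""
        d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
        sq = 0.5 * (vals[:, None] - vals[None, :]) ** 2
        iu = np.triu_indices(len(pts), k=1)      # count each pair once
        d, sq = d[iu], sq[iu]
        if max_lag is None:
            max_lag = d.max()
        edges = np.linspace(0.0, max_lag, n_lags + 1)
        lags, gammas = [], []
        for lo, hi in zip(edges[:-1], edges[1:]):
            m = (d > lo) & (d <= hi)
            if m.any():
                lags.append(d[m].mean())         # mean lag of the bin
                gammas.append(sq[m].mean())      # semivariance of the bin
        return np.array(lags), np.array(gammas)

    # synthetic smooth 'richness' surface on an 8 x 8 grid of squares
    coords = np.array([[i, j] for i in range(8) for j in range(8)], dtype=float)
    field = np.sin(coords[:, 0] / 3.0) + np.cos(coords[:, 1] / 3.0)
    lags, gammas = empirical_semivariogram(coords, field, n_lags=5)
    ```

    For a spatially continuous surface like this one, the semivariance grows with lag distance; a model fitted to these points would then feed the kriging system.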

  8. Usage of multivariate geostatistics in interpolation processes for meteorological precipitation maps

    Science.gov (United States)

    Gundogdu, Ismail Bulent

    2015-09-01

    Long-term meteorological data are very important both for the evaluation of meteorological events and for the analysis of their effects on the environment. Prediction maps which are constructed by different interpolation techniques often provide explanatory information. Conventional techniques, such as surface spline fitting, global and local polynomial models, and inverse distance weighting may not be adequate. Multivariate geostatistical methods can be more significant, especially when studying secondary variables, because secondary variables might directly affect the precision of prediction. In this study, the mean annual and mean monthly precipitations from 1984 to 2014 for 268 meteorological stations in Turkey have been used to construct country-wide maps. Besides linear regression, the inverse square distance and ordinary co-Kriging (OCK) have been used and compared to each other. Also elevation, slope, and aspect data for each station have been taken into account as secondary variables, whose use has reduced errors by up to a factor of three. OCK gave the smallest errors (1.002 cm) when aspect was included.

  9. [Clustering analysis applied to near-infrared spectroscopy analysis of Chinese traditional medicine].

    Science.gov (United States)

    Liu, Mu-qing; Zhou, De-cheng; Xu, Xin-yuan; Sun, Yao-jie; Zhou, Xiao-li; Han, Lei

    2007-10-01

    The present article discusses clustering analysis used in the near-infrared (NIR) spectroscopy analysis of Chinese traditional medicines, which provides a new method for their classification. The samples selected in the authors' research, whose absorption spectra were measured in seconds by a multi-channel NIR spectrometer developed in the authors' lab, were safrole, eucalypt oil, laurel oil, turpentine, clove oil and three samples of costmary oil from different suppliers. The spectra in the range of 0.70–1.7 μm were measured with air as background, and the results indicated that they are quite distinct. A qualitative mathematical model was set up, and cluster analysis based on the spectra was carried out with different clustering methods for optimization, yielding a cluster correlation coefficient of 0.9742. This indicated that cluster analysis of the group of samples is practicable. The calculated classification of the 8 samples accorded well with their characteristics; in particular, the three samples of costmary oil fell into the closest classification of the clustering analysis. PMID:18306778
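    A minimal sketch of shape-based clustering of NIR spectra, in the spirit of the analysis above; the synthetic spectra, noise level, and the choice of correlation distance with average linkage are assumptions, not the authors' protocol.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import fcluster, linkage
    from scipy.spatial.distance import pdist

    rng = np.random.default_rng(0)
    wl = np.linspace(0.7, 1.7, 50)                 # wavelength axis, micrometres
    shape_a = np.exp(-((wl - 1.0) / 0.1) ** 2)     # two synthetic band shapes
    shape_b = np.exp(-((wl - 1.4) / 0.1) ** 2)
    spectra = np.vstack(
        [shape_a + 0.02 * rng.standard_normal(50) for _ in range(4)]
        + [shape_b + 0.02 * rng.standard_normal(50) for _ in range(4)]
    )

    # correlation distance groups spectra of similar shape regardless of scale
    Z = linkage(pdist(spectra, metric="correlation"), method="average")
    labels = fcluster(Z, t=2, criterion="maxclust")
    ```

    Spectra sharing an absorption band end up in the same cluster, mirroring how the three costmary oil samples grouped together in the study.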

  10. Applying behavior analysis to school violence and discipline problems: Schoolwide positive behavior support.

    Science.gov (United States)

    Anderson, Cynthia M; Kincaid, Donald

    2005-01-01

    School discipline is a growing concern in the United States. Educators frequently are faced with discipline problems ranging from infrequent but extreme problems (e.g., shootings) to less severe problems that occur at high frequency (e.g., bullying, insubordination, tardiness, and fighting). Unfortunately, teachers report feeling ill prepared to deal effectively with discipline problems in schools. Further, research suggests that many commonly used strategies, such as suspension, expulsion, and other reactive strategies, are not effective for ameliorating discipline problems and may, in fact, make the situation worse. The principles and technology of behavior analysis have been demonstrated to be extremely effective for decreasing problem behavior and increasing social skills exhibited by school children. Recently, these principles and techniques have been applied at the level of the entire school, in a movement termed schoolwide positive behavior support. In this paper we review the tenets of schoolwide positive behavior support, demonstrating the relation between this technology and applied behavior analysis. PMID:22478439

  11. Common cause evaluations in applied risk analysis of nuclear power plants

    International Nuclear Information System (INIS)

    Qualitative and quantitative approaches were developed for the evaluation of common cause failures (CCFs) in nuclear power plants and were applied to the analysis of the auxiliary feedwater systems of several pressurized water reactors (PWRs). Key CCF variables were identified through a survey of experts in the field and a review of failure experience in operating PWRs. These variables were classified into categories of high, medium, and low defense against a CCF. Based on the results, a checklist was developed for analyzing CCFs of systems. Several known techniques for quantifying CCFs were also reviewed. The information provided valuable insights into the development of a new model for estimating CCF probabilities, which is an extension of and improvement over the Beta Factor method. As applied to the analysis of the PWR auxiliary feedwater systems, the method yielded much more realistic values than the original Beta Factor method for a one-out-of-three system
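    The original Beta Factor method that the new model extends can be illustrated with a toy calculation; the failure probability and beta value below are hypothetical, not values from the study.

    ```python
    # Hypothetical per-demand failure data for a three-train safety system;
    # the report's extended model is not reproduced here.
    lam = 1.0e-3      # failure probability per demand of a single train
    beta = 0.1        # fraction of failures assumed common cause

    p_ind = (1.0 - beta) * lam      # independent contribution per train
    p_ccf = beta * lam              # common-cause event disabling all trains

    # a 1-out-of-3 system fails if all trains fail independently,
    # or a single common-cause event takes out every train at once
    p_sys = p_ind ** 3 + p_ccf
    p_no_ccf = lam ** 3             # naive estimate ignoring common cause
    ```

    Even a modest beta dominates the redundant-system failure probability, which is why ignoring CCFs makes one-out-of-three systems look unrealistically reliable.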

  12. Stability analysis of multi-infeed HVDC system applying VSC-HVDC

    DEFF Research Database (Denmark)

    Liu, Yan; Chen, Zhe

    2010-01-01

    This paper presents a general model of a dual-infeed HVDC system applying VSC-HVDC, which can be used as an element in a large multi-infeed HVDC system. The model may have different structures under different grid faults because of the action of breakers. Hence, the power flow of the system based on this model is analyzed under steady state and different kinds of grid-fault transient situations. Two main control methods applied on the VSC-HVDC link in this dual-infeed HVDC system are investigated, and a comparative analysis of them under transient situations is presented. A simulation model is built in PSCAD/EMTDC to verify the theoretical analysis. Simulation results indicate that this dual-infeed HVDC system can realize higher stability than a single-infeed HVDC system, and that different control strategies on a VSC-HVDC link may result in different influences on AC voltage and active power oscillation during …

  13. TAPPS Release 1: Plugin-Extensible Platform for Technical Analysis and Applied Statistics

    Directory of Open Access Journals (Sweden)

    Justin Sam Chew

    2016-01-01

    We present the first release of TAPPS (Technical Analysis and Applied Statistics System), a Python implementation of a thin software platform aimed at technical analyses and applied statistics. The core of TAPPS is a container for 2-dimensional data frame objects and a TAPPS command language. The TAPPS language is not meant to be a programming language for script and plugin development but for operational purposes. In this respect, the TAPPS language takes on the flavor of SQL rather than R, resulting in a shallower learning curve. All analytical functions are implemented as plugins. This results in a well-defined plugin system, which enables rapid development and incorporation of analysis functions. TAPPS Release 1 is released under the GNU General Public License 3 for academic and non-commercial use. The TAPPS code repository can be found at http://github.com/mauriceling/tapps.

  14. What are the contemporary trends in more widely comprehended fields of thermodynamics and applied thermal analysis?

    Czech Academy of Sciences Publication Activity Database

    Šesták, Jaroslav

    Warszawa : Institute of Physical Chemistry of PAS, 2005 - (Gierycz, P.; Zieborak-Tomaszkiewicz, I.), s. 239-259 ISBN 83-920719-4-8. [Third Summer School of Thermodynamics. Zakopane (PL), 11.09.2005-17.09.2005] R&D Projects: GA ČR(CZ) GA522/04/0384 Institutional research plan: CEZ:AV0Z10100521 Keywords : applied thermal analysis Subject RIV: BJ - Thermodynamics

  15. The ZEW combined microsimulation-CGE model : innovative tool for applied policy analysis

    OpenAIRE

    Clauss, Markus; Schubert, Stefanie

    2009-01-01

    This contribution describes the linkage of microsimulation models and computable general equilibrium (CGE) models using two already established models, "STSM" and "PACE-L", used by the Centre for European Economic Research. This state-of-the-art research method for applied policy analysis combines the advantages of both model types: on the one hand, microsimulation models allow for detailed labor supply and distributional effects due to policy measures, as individual household data is us...

  16. Hypergraph Modelling and Graph Clustering Process Applied to Co-word Analysis

    OpenAIRE

    Polanco, Xavier; San Juan, Eric

    2007-01-01

    We argue that any document set can be modelled as a hypergraph, and we apply a graph clustering process as a way of analysis. A variant of single-link clustering is presented, and we assert that it is better suited to extracting interesting clusters formed along easily interpretable paths of associated items than algorithms based on detecting high-density regions. We propose a methodology that involves the extraction of similarity graphs from the indexed dataset represented as a hypergraph. ...

  17. Applied behavior analysis as intervention for autism: definition, features and philosophical concepts

    Directory of Open Access Journals (Sweden)

    Síglia Pimentel Höher Camargo

    2013-11-01

    Autism spectrum disorder (ASD) is a lifelong pervasive developmental disorder with no known causes or cure. However, educational and behavioral interventions with a foundation in applied behavior analysis (ABA) have been shown to improve a variety of skill areas such as communication, social, academic, and adaptive behaviors of individuals with ASD. The goal of this work is to present the definition, features and philosophical concepts that underlie ABA and make this science an effective intervention method for people with autism.

  18. A comparative analysis of three metaheuristic methods applied to fuzzy cognitive maps learning

    OpenAIRE

    Bruno A. Angélico; Márcio Mendonça; Taufik Abrão; Arruda, Lúcia Valéria R. de

    2013-01-01

    This work analyses the performance of three different population-based metaheuristic approaches applied to fuzzy cognitive map (FCM) learning in qualitative control of processes. Fuzzy cognitive maps permit the inclusion of prior specialist knowledge in the control rule. In particular, Particle Swarm Optimization (PSO), a Genetic Algorithm (GA) and an Ant Colony Optimization (ACO) are considered for obtaining appropriate weight matrices for learning the FCM. A statistical convergence analysis ...

  19. English Language Assessment in the Colleges of Applied Sciences in Oman: Thematic Document Analysis

    OpenAIRE

    Fatma Al Hajri

    2014-01-01

    Proficiency in the English language and how it is measured have become central issues in higher education research, as English is increasingly used as a medium of instruction and a criterion for admission to education. This study evaluated the English language assessment in the Foundation Programme at the Colleges of Applied Sciences in Oman. It used thematic analysis in studying 118 documents on language assessment. Three main findings were reported: compatibility between what was ta...

  20. An Analysis of Research Production in Corpus Linguistics Applied to Translation

    OpenAIRE

    Candel-Mora, Miguel Ángel; Vargas Sierra, Chelo

    2013-01-01

    The aim of the paper is to analyze with data the consolidation of corpus methods in translation and to specify which issues are under research and the features that characterize these studies. To that end, different contributions to corpus linguistics research, teaching and practice of translation were compiled to build a sufficiently representative sample: 389 bibliographic records on corpus linguistics applied to translation. This study deals with the identification and analysis of differen...

  1. Sensitivity analysis applied to the disposal of low and medium-level waste

    International Nuclear Information System (INIS)

    The second French centre for disposal of radioactive waste will become operational in the early 1990s. In order to model the site, a sensitivity analysis and calculations of the radiological consequences are applied to the disposal of the waste. These techniques point out the main radionuclides to be taken into account, identify the sensitive parameters of the different barriers and evaluate the safety margins achieved through structural design and the application of general safety requirements. (author)

  2. Reliability analysis of protection systems in NPP applying fault-tree analysis method

    International Nuclear Information System (INIS)

    This paper demonstrates the applicability and limits of dependability analysis in nuclear power plants (NPPs), based on the reactor protection refurbishment project (RRP) at NPP Paks, and illustrates case studies from the reliability analysis for NPP Paks. It also investigates solutions for the connection between the data acquisition and subsystem control units (TSs) and the voter units (VTs), analyzes the influence of the voting at the VT computer level, and studies the effects of the testing procedures on the dependability parameters. (author)

  3. The x-rays fluorescence applied to the analysis of alloys

    International Nuclear Information System (INIS)

    This work is based on the use of X-ray fluorescence, a non-destructive testing technique, with the purpose of establishing a routine method for controlling the conformity of the industrial samples used. The analysis combines the Rasberry-Heinrich and Claisse-Thinh algorithms, together with the numerical implementation of techniques unusual in this type of analysis, such as linear programming applied to the solution of overdetermined systems of equations, and the use of relaxation methods to facilitate convergence to the solutions. (author)

  4. Classical linear-control analysis applied to business-cycle dynamics and stability

    Science.gov (United States)

    Wingrove, R. C.

    1983-01-01

    Linear control analysis is applied as an aid in understanding the fluctuations of business cycles in the past, and to examine monetary policies that might improve stabilization. The analysis shows how different policies change the frequency and damping of the economic system dynamics, and how they modify the amplitude of the fluctuations that are caused by random disturbances. Examples are used to show how policy feedbacks and policy lags can be incorporated, and how different monetary strategies for stabilization can be analytically compared. Representative numerical results are used to illustrate the main points.
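    The idea that a policy feedback changes the damping of the economic system's dynamics can be illustrated with a toy second-order system; the model, parameters and feedback gain below are illustrative assumptions, not the paper's economy model.

    ```python
    import numpy as np

    def simulate(k_policy, n=400, dt=0.05, omega=1.0, zeta=0.05):
        """Toy 'economy': x'' + (2*zeta*omega + k_policy)*x' + omega**2 * x = 0.
        A policy feedback proportional to the rate of change adds damping."""
        x, v = 1.0, 0.0                      # initial deviation from trend
        path = np.empty(n)
        for i in range(n):
            a = -(2.0 * zeta * omega + k_policy) * v - omega ** 2 * x
            v += a * dt                      # semi-implicit Euler step
            x += v * dt
            path[i] = x
        return path

    free = simulate(0.0)     # weakly damped business cycle
    damped = simulate(0.5)   # stabilizing policy feedback
    ```

    The feedback leaves the natural frequency roughly unchanged but raises the effective damping ratio, so fluctuations caused by an initial disturbance die out much faster.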

  5. Sensitivity and uncertainty analysis applied to a repository in rock salt

    International Nuclear Information System (INIS)

    This document describes the sensitivity and uncertainty analysis with UNCSAM, as applied to a repository in rock salt for the EVEREST project. UNCSAM is a dedicated software package for sensitivity and uncertainty analysis, which was already used within the preceding PROSA project. UNCSAM provides a flexible interface to EMOSECN by substituting the sampled values into the various input files to be used by EMOSECN; the model calculations for this repository were performed with the EMOSECN code. Preceding the sensitivity and uncertainty analysis, a number of preparations were carried out to supply EMOSECN with the probabilistic input data. For post-processing the EMOSECN results, the characteristic output signals were processed. For the sensitivity and uncertainty analysis with UNCSAM, the stochastic input, i.e. the sampled values, and the output of the various EMOSECN runs have been analyzed. (orig.)
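    A common ingredient of such sensitivity analyses is ranking sampled inputs by their rank correlation with the output signal; the toy release model below stands in for the EMOSECN calculations, which are not reproduced here.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 2000
    # hypothetical sampled inputs for a toy release model (not EMOSECN)
    leach = rng.lognormal(mean=-2.0, sigma=0.5, size=n)   # leach rate
    kd = rng.lognormal(mean=1.0, sigma=0.8, size=n)       # sorption coefficient
    dose = leach / (1.0 + kd)                             # toy output signal

    def rank_corr(a, b):
        """Spearman rank correlation without external dependencies."""
        ra = a.argsort().argsort().astype(float)
        rb = b.argsort().argsort().astype(float)
        ra -= ra.mean()
        rb -= rb.mean()
        return (ra @ rb) / np.sqrt((ra @ ra) * (rb @ rb))

    s_leach = rank_corr(leach, dose)   # a higher leach rate raises the output
    s_kd = rank_corr(kd, dose)         # stronger sorption lowers it
    ```

    Inputs with rank correlations near zero can often be fixed at nominal values, concentrating further analysis on the parameters that actually drive the output uncertainty.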

  6. Sequential analysis applied to clinical trials in dentistry: a systematic review.

    Science.gov (United States)

    Bogowicz, P; Flores-Mir, C; Major, P W; Heo, G

    2008-01-01

    Clinical trials employ sequential analysis for the ethical and economic benefits it brings. In dentistry, as in other fields, resources are scarce and efforts are made to ensure that patients are treated ethically. The objective of this systematic review was to characterise the use of sequential analysis for clinical trials in dentistry. We searched various databases from 1900 through to January 2008. Articles were selected for review if they were clinical trials in the field of dentistry that had applied some form of sequential analysis. Selection was carried out independently by two of the authors. We included 18 trials from various specialties, which involved many different interventions. We conclude that sequential analysis seems to be underused in this field but that there are sufficient methodological resources in place for future applications. Evidence-Based Dentistry (2008) 9, 55-62. doi:10.1038/sj.ebd.6400587. PMID:18584009
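    The classic form of sequential analysis used in such trials is Wald's sequential probability ratio test (SPRT); a minimal implementation, with illustrative design parameters, looks like this.

    ```python
    import math

    def sprt(observations, p0=0.5, p1=0.7, alpha=0.05, beta=0.2):
        """Wald's SPRT for H0: p = p0 vs H1: p = p1 on Bernoulli outcomes.
        Returns (decision, number of observations used)."""
        upper = math.log((1.0 - beta) / alpha)   # accept-H1 boundary
        lower = math.log(beta / (1.0 - alpha))   # accept-H0 boundary
        llr = 0.0
        for i, success in enumerate(observations, start=1):
            if success:
                llr += math.log(p1 / p0)
            else:
                llr += math.log((1.0 - p1) / (1.0 - p0))
            if llr >= upper:
                return "accept H1", i
            if llr <= lower:
                return "accept H0", i
        return "continue", len(observations)
    ```

    The ethical and economic appeal is exactly this early stopping: a run of clearly favourable (or unfavourable) outcomes crosses a boundary well before a fixed-sample trial would end, so fewer patients are exposed to the inferior treatment.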

  7. Applying observations of work activity in designing prototype data analysis tools

    Energy Technology Data Exchange (ETDEWEB)

    Springmeyer, R.R.

    1993-07-06

    Designers, implementers, and marketers of data analysis tools typically have different perspectives than users. Consequently, data analysts often find themselves using tools focused on graphics and programming concepts rather than concepts which reflect their own domain and the context of their work. Some user studies focus on usability tests late in development; others observe work activity, but fail to show how to apply that knowledge in design. This paper describes a methodology for applying observations of data analysis work activity in prototype tool design. The approach can be used both in designing improved data analysis tools and in customizing visualization environments to specific applications. We present an example of user-centered design for a prototype tool to cull large data sets. We revisit the typical graphical approach of animating a large data set from the point of view of an analyst who is culling data. Field evaluations using the prototype tool not only revealed valuable usability information, but also initiated in-depth discussions about users' work, tools, technology, and requirements.

  8. Continuous Wavelet and Hilbert-Huang Transforms Applied for Analysis of Active and Reactive Power Consumption

    Directory of Open Access Journals (Sweden)

    Avdakovic Samir

    2014-08-01

    Analysis of power consumption presents a very important issue for power distribution system operators. Some power system processes, such as planning, demand forecasting and development, require a complete understanding of the behaviour of power consumption for the observed area, which requires appropriate techniques for the analysis of available data. In this paper, two different time-frequency techniques are applied to the analysis of hourly values of active and reactive power consumption from one real power distribution transformer substation in the urban part of the city of Sarajevo. Using the continuous wavelet transform (CWT) with the wavelet power spectrum and the global wavelet spectrum, some properties of the analysed time series are determined. Then, empirical mode decomposition (EMD) and the Hilbert-Huang transform (HHT) are applied to the analysis of the same time series. The results showed that both approaches can provide very useful information about the behaviour of power consumption for the observed time interval and different period (frequency) bands. It can also be noticed that the results obtained by the global wavelet spectrum and the marginal Hilbert spectrum are very similar, confirming that both approaches could be used to identify the main properties of active and reactive power consumption time series.
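    A wavelet power spectrum of a load-like signal can be computed with a hand-rolled continuous wavelet transform; the Ricker (Mexican hat) wavelet and the synthetic 24-sample daily cycle below are assumptions for illustration, not the paper's data or wavelet choice.

    ```python
    import numpy as np

    def ricker(points, a):
        """Ricker (Mexican hat) wavelet of width a, sampled at `points` points."""
        t = np.arange(points) - (points - 1) / 2.0
        amp = 2.0 / (np.sqrt(3.0 * a) * np.pi ** 0.25)
        return amp * (1.0 - (t / a) ** 2) * np.exp(-0.5 * (t / a) ** 2)

    def cwt_power(signal, widths):
        """Continuous wavelet transform power by direct convolution."""
        out = np.empty((len(widths), len(signal)))
        for i, a in enumerate(widths):
            w = ricker(min(10 * int(a), len(signal)), a)
            out[i] = np.convolve(signal, w, mode="same")
        return out ** 2

    # synthetic hourly load with a 24-sample daily cycle over two weeks
    t = np.arange(24 * 14)
    load = np.sin(2.0 * np.pi * t / 24.0)
    widths = np.arange(1, 16)
    power = cwt_power(load, widths)
    best = widths[power.mean(axis=1).argmax()]   # scale of dominant periodicity
    ```

    Averaging the power over time yields a global wavelet spectrum; its peak scale corresponds to the dominant period of the consumption series, here the daily cycle.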

  9. Unsupervised classification of multivariate geostatistical data: Two algorithms

    Science.gov (United States)

    Romary, Thomas; Ors, Fabien; Rivoirard, Jacques; Deraisme, Jacques

    2015-12-01

    With the increasing development of remote sensing platforms and the evolution of sampling facilities in the mining and oil industries, spatial datasets are becoming increasingly large, describe a growing number of variables and cover wider and wider areas. Therefore, it is often necessary to split the domain of study to account for radically different behaviors of the natural phenomenon over the domain and to simplify the subsequent modeling step. The definition of these areas can be seen as a problem of unsupervised classification, or clustering, where we try to divide the domain into areas that are homogeneous with respect to the values taken by the variables in hand. The application of classical clustering methods, designed for independent observations, does not ensure the spatial coherence of the resulting classes. Image segmentation methods, based on e.g. Markov random fields, are not adapted to irregularly sampled data. Other existing approaches, based on mixtures of Gaussian random functions estimated via the expectation-maximization algorithm, are limited to reasonable sample sizes and a small number of variables. In this work, we propose two algorithms based on adaptations of classical algorithms to multivariate geostatistical data. Both algorithms are model-free and can handle large volumes of multivariate, irregularly spaced data. The first proceeds by agglomerative hierarchical clustering, where spatial coherence is ensured by a proximity condition imposed for two clusters to merge. This proximity condition relies on a graph organizing the data in coordinate space. The hierarchical algorithm can then be seen as a graph-partitioning algorithm. Following this interpretation, a spatial version of the spectral clustering algorithm is also proposed. The performance of both algorithms is assessed on toy examples and a mining dataset.
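    The first algorithm's idea, agglomerative merging restricted to spatially neighbouring clusters, can be sketched as follows; the kNN graph construction and the difference-of-means linkage are simplifications for illustration, not the authors' exact algorithm.

    ```python
    import numpy as np

    def constrained_agglomerative(coords, values, n_clusters, k=4):
        """Agglomerative clustering on values where two clusters may merge
        only if they are connected in a k-nearest-neighbour spatial graph."""
        n = len(coords)
        d_sp = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
        order = np.argsort(d_sp, axis=1)[:, 1:k + 1]   # k nearest, excluding self
        adj = np.zeros((n, n), dtype=bool)
        for i in range(n):
            adj[i, order[i]] = True
        adj |= adj.T                                   # symmetrize the graph
        labels = np.arange(n)
        while len(np.unique(labels)) > n_clusters:
            labs = np.unique(labels)
            best, pair = np.inf, None
            for ai in range(len(labs)):
                for bi in range(ai + 1, len(labs)):
                    ma = labels == labs[ai]
                    mb = labels == labs[bi]
                    if not adj[np.ix_(ma, mb)].any():
                        continue                       # not spatial neighbours
                    dv = abs(values[ma].mean() - values[mb].mean())
                    if dv < best:
                        best, pair = dv, (labs[ai], labs[bi])
            if pair is None:
                break                                  # graph is disconnected
            labels[labels == pair[1]] = pair[0]
        return labels

    # two spatially separated patches with distinct values
    coords = np.array([[i, j] for i in range(3) for j in range(3)]
                      + [[i + 10.0, j] for i in range(3) for j in range(3)])
    values = np.array([0.0] * 9 + [10.0] * 9)
    labels = constrained_agglomerative(coords, values, n_clusters=2)
    ```

    Because merges are only allowed along edges of the spatial graph, the resulting classes are spatially coherent patches rather than scattered sets of similar values.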

  10. Geostatistical inspired metamodeling and optimization of nanoscale analog circuits

    Science.gov (United States)

    Okobiah, Oghenekarho

The current trend towards miniaturization of modern consumer electronic devices significantly affects their design. The demand for efficient all-in-one appliances leads to smaller, yet more complex and powerful nanoelectronic devices. The increasing complexity in the design of such nanoscale Analog/Mixed-Signal Systems-on-Chip (AMS-SoCs) presents difficult challenges to designers. One promising method for mitigating this design effort is the use of metamodeling (surrogate modeling) techniques, which significantly reduce the time needed for simulation, design space exploration and optimization. This dissertation addresses several issues of metamodel-based design exploration for nanoelectronic AMS-SoCs. A surrogate modeling technique is proposed that uses geostatistical Kriging prediction methods to create the metamodels. Kriging prediction takes the correlation between input parameters into account when predicting a performance point; we propose exploiting this property for accurate modeling of process variation effects in designs in the deep nanometer region. Different Kriging methods, such as simple and ordinary Kriging, have been explored in this work. We also propose another metamodeling technique, the Kriging-Bootstrapped Neural Network, which combines the accuracy and process-variation awareness of Kriging with artificial neural network models for ultra-fast, accurate and process-aware metamodeling. The proposed methodologies combine Kriging metamodels with selected algorithms for ultra-fast layout optimization: the Gravitational Search Algorithm (GSA), Simulated Annealing Optimization (SAO), and Ant Colony Optimization (ACO). Experimental results demonstrate that the proposed Kriging-metamodel-based methodologies can perform the optimizations with minimal computational burden compared to traditional (SPICE-based) design flows.
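As a rough illustration of the surrogate idea (not the dissertation's actual flow), a Kriging-style metamodel can be sketched with scikit-learn's Gaussian-process regressor, which is mathematically equivalent to simple kriging. The `spice_sim` function below is an invented stand-in for an expensive circuit simulation, and the kernel and sample counts are arbitrary.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Hypothetical performance function standing in for a SPICE run, e.g.
# a gain figure versus two normalized transistor sizing parameters.
def spice_sim(x):
    return np.sin(3 * x[:, 0]) * np.cos(2 * x[:, 1]) + 0.5 * x[:, 0]

rng = np.random.default_rng(1)
X_train = rng.uniform(0, 1, size=(40, 2))   # sampled design points
y_train = spice_sim(X_train)

# GP regression with an RBF kernel is the machine-learning form of
# (simple) kriging: predictions interpolate the training simulations
# and come with a standard deviation usable for adaptive sampling.
gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF([0.2, 0.2]),
                              normalize_y=True).fit(X_train, y_train)

X_test = rng.uniform(0, 1, size=(200, 2))
y_pred, y_std = gp.predict(X_test, return_std=True)
print(float(np.mean(np.abs(y_pred - spice_sim(X_test)))))
```

An optimizer (GSA, SAO, ACO, ...) would then query `gp.predict` instead of the simulator, which is where the reported speed-up comes from.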

  11. A geostatistical autopsy of the Austrian indoor radon survey (1992-2002)

    International Nuclear Information System (INIS)

Indoor radon (Rn) concentrations have been measured intensively in many countries to assess the burden of disease associated with exposure to this radioactive gas. So-called radon-risk maps have consequently been produced to delineate areas with high levels. Geostatistical techniques are nowadays commonly used to map a range of environmental variables, in particular to generate probability maps of exceeding a given threshold. However, very few case studies in which indoor radon measurements are investigated using geostatistical techniques have been published so far. By analyzing around 12,000 Rn measurements made in Austrian ground floors during a 10-year survey, we aim here to review and discuss the potential of geostatistics for mapping an environmental variable that shows very strong local variability. In particular, we show how kriging of the scale components can shed new light on the various factors behind the very high spatial variability of indoor radon.
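The exceedance-probability maps mentioned above follow directly from kriging output: under a lognormal assumption for indoor Rn, each grid node's kriged estimate and kriging variance yield a probability of exceeding a reference level. A minimal sketch with invented numbers (the 300 Bq/m³ threshold is a common reference level, not necessarily the one used in the survey):

```python
import numpy as np
from scipy.stats import norm

# Kriging returns, at each grid node, an estimate and a kriging variance.
# Under a lognormal assumption, the probability of exceeding a threshold
# follows directly from the Gaussian CDF in log space.
log_estimate = np.array([4.8, 5.5, 6.1])      # kriged ln(Rn) values (hypothetical)
log_variance = np.array([0.30, 0.20, 0.25])   # kriging variances (hypothetical)

threshold = np.log(300.0)                     # e.g. 300 Bq/m^3 reference level
p_exceed = 1.0 - norm.cdf((threshold - log_estimate) / np.sqrt(log_variance))
print(np.round(p_exceed, 3))
```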

  12. Use of geostatistical modeling to capture complex geology in finite-element analyses

    International Nuclear Information System (INIS)

This paper summarizes a number of transient thermal analyses performed for a representative two-dimensional cross section of volcanic tuffs at Yucca Mountain using the finite-element, nonlinear heat-conduction code COYOTE-II. In addition to conventional design analyses, in which material properties are formulated as a single uniform material and as horizontally layered, internally uniform materials, an attempt was made to increase the resemblance of the thermal property field to the actual geology by creating two fairly complex, geologically realistic models. The first model was created by digitizing an existing two-dimensional geologic cross section of Yucca Mountain. The second model was created using conditional geostatistical simulation. Direct mapping of geostatistically generated material property fields onto finite-element computational meshes was demonstrated to yield temperature fields approximately equivalent to those generated through more conventional procedures. Moreover, the ability to use the geostatistical models offers a means of simplifying the physical-process analyses.
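The "direct mapping" step can be sketched as a nearest-neighbor lookup from the simulated property grid to the element centroids. The coordinates and conductivity values below are placeholders, not Yucca Mountain data, and a real workflow would use the actual simulation grid and mesh.

```python
import numpy as np
from scipy.spatial import cKDTree

# Geostatistical realization on its own grid: node coordinates plus a
# simulated thermal conductivity at each node (values are placeholders).
rng = np.random.default_rng(2)
sim_xy = rng.uniform(0, 100, size=(500, 2))
sim_k = rng.lognormal(mean=0.5, sigma=0.3, size=500)

# Finite-element mesh: assign each element the property of the nearest
# simulation node -- a direct mapping onto the computational mesh.
elem_centroids = rng.uniform(0, 100, size=(200, 2))
tree = cKDTree(sim_xy)
_, nearest = tree.query(elem_centroids)
elem_k = sim_k[nearest]
print(elem_k.shape)
```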

  13. Use of geostatistics for remediation planning to transcend urban political boundaries.

    Science.gov (United States)

    Milillo, Tammy M; Sinha, Gaurav; Gardella, Joseph A

    2012-11-01

    Soil remediation plans are often dictated by areas of jurisdiction or property lines instead of scientific information. This study exemplifies how geostatistically interpolated surfaces can substantially improve remediation planning. Ordinary kriging, ordinary co-kriging, and inverse distance weighting spatial interpolation methods were compared for analyzing surface and sub-surface soil sample data originally collected by the US EPA and researchers at the University at Buffalo in Hickory Woods, an industrial-residential neighborhood in Buffalo, NY, where both lead and arsenic contamination is present. Past clean-up efforts estimated contamination levels from point samples, but parcel and agency jurisdiction boundaries were used to define remediation sites, rather than geostatistical models estimating the spatial behavior of the contaminants in the soil. Residents were understandably dissatisfied with the arbitrariness of the remediation plan. In this study we show how geostatistical mapping and participatory assessment can make soil remediation scientifically defensible, socially acceptable, and economically feasible. PMID:22771352
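Of the interpolators compared in the study, inverse distance weighting is the simplest to state. A small NumPy sketch with invented lead concentrations follows; kriging and co-kriging additionally require a fitted variogram model, which is what makes them geostatistical.

```python
import numpy as np

def idw(xy_obs, z_obs, xy_grid, power=2.0, eps=1e-12):
    """Inverse-distance-weighted estimate of z at each grid point."""
    d = np.linalg.norm(xy_grid[:, None, :] - xy_obs[None, :, :], axis=2)
    w = 1.0 / (d + eps) ** power
    return (w * z_obs).sum(axis=1) / w.sum(axis=1)

# Hypothetical soil-lead samples (ppm) at four locations (not the
# Hickory Woods data).
xy_obs = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
z_obs = np.array([400.0, 120.0, 250.0, 90.0])

# Estimate at the centre (equidistant from all samples -> their mean)
# and near the most contaminated corner.
grid = np.array([[5.0, 5.0], [1.0, 1.0]])
print(idw(xy_obs, z_obs, grid))
```

Unlike kriging, IDW provides no estimation variance, which is precisely the accuracy measure the remediation argument relies on.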

  14. Geostatistical Prediction of Ocean Outfall Plume Characteristics Based on an Autonomous Underwater Vehicle

    Directory of Open Access Journals (Sweden)

    Patrícia Alexandra Gregório Ramos

    2013-07-01

    Full Text Available Geostatistics has been successfully used to analyse and characterize the spatial variability of environmental properties. Besides providing estimated values at unsampled locations, geostatistics measures the accuracy of the estimate, which is a significant advantage over traditional methods used to assess pollution. This work uses universal block kriging to model and map the spatial distribution of salinity measurements gathered by an Autonomous Underwater Vehicle in a sea outfall monitoring campaign. The aim is to distinguish the effluent plume from the receiving waters, characterizing its spatial variability in the vicinity of the discharge and estimating dilution. The results demonstrate that geostatistical methodology can provide good estimates of the dispersion of effluents, which are valuable in assessing the environmental impact and managing sea outfalls. Moreover, since accurate measurements of the plume’s dilution are rare, these studies may be very helpful in the future to validate dispersion models.

  15. Neutron activation analysis as applied to instrumental analysis of trace elements from seawater

    International Nuclear Information System (INIS)

    Particulate matter collected from the coastal area delimited by the mouth of the river Volturno and the Sabaudia lake has been analyzed by instrumental neutron activation analysis for its content of twenty-two trace elements. The results for surface water and bottom water are reported separately, thus evidencing the effect of sampling depth on the concentration of many elements. The necessity of accurately 'cleaning' the filters before use is stressed

16. Guidelines for depth data collection in rivers when applying interpolation techniques (kriging) for river restoration

    Directory of Open Access Journals (Sweden)

    M. Rivas-Casado

    2007-05-01

Full Text Available River restoration appraisal requires the implementation of monitoring programmes that assess the river site before and after the restoration project. However, little work has yet been done to design effective and efficient sampling strategies. Three main variables need to be considered when designing monitoring programmes: space, time and scale. The aim of this paper is to describe the methodology applied to analyse the variation of depth in space, scale and time so that more comprehensive monitoring programmes can be developed. Geostatistical techniques were applied to study the spatial dimension (sampling strategy and density), spectral analysis was used to study the scale at which depth shows cyclic patterns, and descriptive statistics were used to assess the temporal variation. A brief set of guidelines is summarised in the conclusion.
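The geostatistical step rests on the empirical semivariogram of depth along the channel. A minimal sketch of the classical (Matheron) estimator on a synthetic depth transect follows; positions, depths and lag bins are invented, not the paper's survey data.

```python
import numpy as np

def empirical_variogram(x, z, lags, tol):
    """Classical (Matheron) semivariogram estimator for 1-D positions x."""
    dx = np.abs(x[:, None] - x[None, :])
    dz2 = (z[:, None] - z[None, :]) ** 2
    gamma = []
    for h in lags:
        # Each pair counted once (upper triangle), binned by lag distance.
        mask = np.triu(np.abs(dx - h) <= tol, k=1)
        gamma.append(0.5 * dz2[mask].mean())
    return np.array(gamma)

# Hypothetical depth transect: spatially correlated depths along a reach.
rng = np.random.default_rng(3)
x = np.sort(rng.uniform(0, 100, 200))            # chainage (m)
z = np.sin(x / 10.0) + rng.normal(0, 0.1, 200)   # depth anomaly (m)

lags = np.arange(2.0, 20.0, 2.0)
print(np.round(empirical_variogram(x, z, lags, tol=1.0), 3))
```

The range at which the variogram levels off suggests the maximum useful sampling spacing, which is the link back to monitoring-programme design.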

  17. Criticality analysis of thermal reactors for two energy groups applying Monte Carlo and neutron Albedo method

    International Nuclear Information System (INIS)

The Albedo method applied to criticality calculations for nuclear reactors is characterized by following the neutron currents, allowing detailed analyses of the physics of neutron interactions with the core-reflector system through the determination of the probabilities of reflection, absorption and transmission, and hence a detailed appreciation of the variation of the effective neutron multiplication factor, keff. In the present work, motivated by the excellent results presented in dissertations on thermal reactors and shielding, the Albedo method is described for the criticality analysis of thermal reactors using two energy groups, admitting variable core coefficients for each re-entrant current. Using the Monte Carlo code KENO IV, the relation was analyzed between the total fraction of neutrons absorbed in the reactor core and the fraction of neutrons absorbed in the core without ever having entered the reflector. The one-dimensional deterministic code ANISN (ANIsotropic SN transport code) and the diffusion method were used for comparison and analysis of the results obtained by the Albedo method. The keff results determined by the Albedo method for the analyzed reactor type showed excellent agreement: relative errors smaller than 0.78% with respect to ANISN and smaller than 0.35% with respect to the diffusion method, demonstrating the effectiveness of the Albedo method applied to criticality analysis. The ease of application, simplicity and clarity of the Albedo method make it a valuable instrument for neutronic calculations in non-multiplying and multiplying media. (author)
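For orientation, the quantity being compared, keff, reduces in an infinite medium to the standard two-group relation below. This is textbook reactor physics, not the Albedo method itself, and the cross sections are illustrative values, not taken from the paper.

```python
# Two-group infinite-medium multiplication factor. Fast neutrons are
# removed by absorption (sa1) or by slowing down to the thermal group
# (s12); thermal neutrons are absorbed (sa2); both groups fission.
# All values below are illustrative macroscopic cross sections (1/cm).
nu_sf1, nu_sf2 = 0.008, 0.135   # nu * Sigma_f per group
sa1, sa2 = 0.010, 0.100         # absorption
s12 = 0.025                     # fast-to-thermal scattering

# Production per fast-group removal: fast fissions plus thermal
# fissions fed by the neutrons that slow down and are absorbed.
k_inf = (nu_sf1 + nu_sf2 * s12 / sa2) / (sa1 + s12)
print(round(k_inf, 4))
```

A finite reflected core lowers this toward keff through leakage, which is exactly the effect the Albedo method captures by bookkeeping the currents at the core-reflector interface.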

  18. An evaluation of an operating BWR piping system damping during earthquake by applying auto regressive analysis

    International Nuclear Information System (INIS)

The observation of equipment and piping systems installed in an operating nuclear power plant during earthquakes is very important for evaluating and confirming the adequacy and the safety margin expected at the design stage. By analyzing observed earthquake records, valuable data on the behavior of these systems during earthquakes can be obtained, and information about their seismic design parameters can be extracted. From this viewpoint, an earthquake observation system was installed in the reactor building of an operating plant. To date, records of three earthquakes have been obtained with this system. In this paper, an example of the analysis of earthquake records is shown; the main purpose of the analysis was the evaluation of the vibration modes, natural frequencies and damping factors of the piping system. Prior to the earthquake record analysis, an eigenvalue analysis of the piping system was performed. Auto-regressive analysis was applied to the observed acceleration time histories obtained from a piping system installed in an operating BWR. The results of the earthquake record analysis agreed well with the results of the eigenvalue analysis. (Kako, I.)
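The auto-regressive step can be illustrated on a synthetic single-mode record: an AR(2) model fitted by least squares has characteristic roots that map back to the mode's natural frequency and damping factor. The 5 Hz / 2% damping values and the noiseless free-decay signal are invented for the sketch, not taken from the plant records.

```python
import numpy as np

# Synthetic free-decay record of one piping mode: 5 Hz, 2% damping.
dt, zeta, fn = 0.01, 0.02, 5.0
wn = 2 * np.pi * fn
wd = wn * np.sqrt(1 - zeta**2)
t = np.arange(0, 10, dt)
x = np.exp(-zeta * wn * t) * np.cos(wd * t)

# Least-squares AR(2) fit: x[n] = a1*x[n-1] + a2*x[n-2].
A = np.column_stack([x[1:-1], x[:-2]])
a1, a2 = np.linalg.lstsq(A, x[2:], rcond=None)[0]

# Map the AR characteristic roots back to continuous-time poles:
# z = exp(s*dt) with s = -zeta*wn +/- i*wd, so |s| gives wn and
# -Re(s)/|s| gives the damping ratio.
z = np.roots([1.0, -a1, -a2])
s = np.log(z[0]) / dt
wn_est = abs(s)
zeta_est = -s.real / wn_est
print(round(float(wn_est / (2 * np.pi)), 3), round(float(zeta_est), 4))
```

On measured multi-mode records the AR order is raised and the dominant conjugate root pairs are picked out, but the pole-mapping step is the same.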

  19. Dimensional analysis and extended hydrodynamic theory applied to long-rod penetration of ceramics

    Directory of Open Access Journals (Sweden)

    J.D. Clayton

    2016-08-01

Full Text Available Principles of dimensional analysis are applied in a new interpretation of penetration of ceramic targets subjected to hypervelocity impact. The analysis results in a power series representation, in terms of inverse velocity, of normalized depth of penetration that reduces to the hydrodynamic solution at high impact velocities. Specifically considered are test data from four literature sources involving penetration of confined thick ceramic targets by tungsten long rod projectiles. The ceramics are AD-995 alumina, aluminum nitride, silicon carbide, and boron carbide. Test data can be accurately represented by the linear form of the power series, whereby the same value of a single fitting parameter applies remarkably well for all four ceramics. Comparison of the present model with others in the literature (e.g., Tate's theory demonstrates a target resistance stress that depends on impact velocity, linearly in the limiting case. Comparison of the present analysis with recent research involving penetration of thin ceramic tiles at lower typical impact velocities confirms the importance of target properties related to fracture and shear strength at the Hugoniot Elastic Limit (HEL) only in the latter. In contrast, in the former (i.e., hypervelocity and thick-target experiments), the current analysis demonstrates that penetration depth depends predominantly on target mass density. Such comparisons suggest transitions from microstructure-controlled to density-controlled penetration resistance with increasing impact velocity and ceramic target thickness.
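The linear form of the power series can be sketched numerically: normalized penetration P/L approaches the hydrodynamic limit sqrt(rho_p/rho_t) as 1/v goes to zero, and a line fit in 1/v recovers the single fitting parameter. The densities are typical handbook values, and the data points are generated from the assumed form rather than taken from the cited tests.

```python
import numpy as np

# Hydrodynamic limit of long-rod penetration: P/L -> sqrt(rho_p/rho_t).
rho_p, rho_t = 17600.0, 3900.0        # tungsten rod, ceramic target (kg/m^3)
hydro = np.sqrt(rho_p / rho_t)

# Illustrative (not measured) data generated from the linear-in-1/v form
# P/L = hydro * (1 - c/v), then recovered by a least-squares line fit.
c_true = 800.0                         # single fitting parameter (m/s)
v = np.array([1500.0, 2000.0, 2500.0, 3000.0, 4000.0])  # impact velocities
p_over_l = hydro * (1.0 - c_true / v)

slope, intercept = np.polyfit(1.0 / v, p_over_l, 1)
print(round(float(intercept), 3), round(float(-slope / intercept), 1))
```

The intercept of the fitted line estimates the hydrodynamic limit and the slope-to-intercept ratio returns the fitting parameter, mirroring how the paper reads its test data.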

  20. A Bayesian geostatistical approach for evaluating the uncertainty of contaminant mass discharges from point sources

    DEFF Research Database (Denmark)

    Troldborg, Mads; Nowak, Wolfgang; Binning, Philip John;

    multilevel control plane. The method decouples the flow and transport simulation and has the advantage of avoiding the heavy computational burden of three-dimensional numerical flow and transport simulation coupled with geostatistical inversion. It may therefore be of practical relevance to practitioners...... hydraulic gradient across the control plane and are consistent with measurements of both hydraulic conductivity and head at the site. An analytical macro-dispersive transport solution is employed to simulate the mean concentration distribution across the control plane, and a geostatistical model of the Box-Cox...
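The mass-discharge estimate underlying such an analysis can be sketched as a Monte Carlo sum of Darcy flux times concentration over control-plane cells. All distributions below are invented stand-ins for the paper's conditioned geostatistical fields, and the independent lognormal draws ignore the spatial correlation a real geostatistical model would impose.

```python
import numpy as np

# Mass discharge through a control plane: sum over cells of q_i*C_i*A_i,
# with Darcy flux q = K * i (i = hydraulic gradient). Uncertainty is
# propagated by Monte Carlo over realizations of K and concentration.
rng = np.random.default_rng(4)
n_cells, n_real = 50, 2000
cell_area = 0.5 * 0.5                 # m^2 per control-plane cell
gradient = 0.005                      # mean hydraulic gradient (assumed)

K = rng.lognormal(mean=np.log(1e-4), sigma=0.8, size=(n_real, n_cells))  # m/s
C = rng.lognormal(mean=np.log(0.02), sigma=1.0, size=(n_real, n_cells))  # kg/m^3

discharge = (K * gradient * C * cell_area).sum(axis=1) * 86400.0  # kg/day
print(np.round(np.percentile(discharge, [5, 50, 95]), 4))
```

Reporting the 5th/50th/95th percentiles of the discharge distribution, rather than a single number, is the practical payoff of the uncertainty analysis.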