WorldWideScience

Sample records for quantitative spatial analysis

  1. Quantitative analysis of spatial variability of geotechnical parameters

    Science.gov (United States)

    Fang, Xing

    2018-04-01

    Geotechnical parameters are the basic input parameters of geotechnical engineering design, and they show strong regional characteristics. The spatial variability of geotechnical parameters is now widely recognized and is gradually being introduced into the reliability analysis of geotechnical engineering. Based on geostatistical theory, the spatial variability of geotechnical parameters is quantitatively analyzed, the parameters are evaluated, and the correlation coefficients between them are calculated. A residential district surveyed by the Tianjin Survey Institute was selected as the research object. There are 68 boreholes in this area and 9 mechanically distinct strata. The parameters are water content, natural unit weight, void ratio, liquid limit, plasticity index, liquidity index, compressibility coefficient, compressive modulus, internal friction angle, cohesion and the SP index. The correlation coefficients of the geotechnical parameters are calculated according to the principles of statistical correlation, and from these coefficients the relationships among the parameters are obtained.
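
    The correlation step described above reduces to a parameter-by-parameter correlation matrix over the borehole records. A minimal sketch is given below; the stand-in values and the column subset are invented for illustration and are not the survey data.

```python
# Sketch: pairwise Pearson correlation of geotechnical parameters across boreholes.
# The values below are synthetic stand-ins; real input would come from the survey database.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n_samples = 68  # e.g. one record per borehole for a given stratum

df = pd.DataFrame({
    "water_content": rng.normal(25, 3, n_samples),
    "void_ratio": rng.normal(0.75, 0.08, n_samples),
    "liquid_limit": rng.normal(32, 4, n_samples),
    "plasticity_index": rng.normal(14, 2, n_samples),
    "compressive_modulus": rng.normal(5.5, 1.0, n_samples),
    "cohesion": rng.normal(18, 4, n_samples),
})

# Correlation coefficients between parameters (the "statistical correlation" step).
print(df.corr(method="pearson").round(2))
```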

  2. A Quantitative Three-Dimensional Image Analysis Tool for Maximal Acquisition of Spatial Heterogeneity Data.

    Science.gov (United States)

    Allenby, Mark C; Misener, Ruth; Panoskaltsis, Nicki; Mantalaris, Athanasios

    2017-02-01

    Three-dimensional (3D) imaging techniques provide spatial insight into environmental and cellular interactions and are implemented in various fields, including tissue engineering, but have been restricted by limited quantification tools that misrepresent or underutilize the cellular phenomena captured. This study develops image postprocessing algorithms pairing complex Euclidean metrics with Monte Carlo simulations to quantitatively assess cell and microenvironment spatial distributions while utilizing, for the first time, the entire 3D image captured. Although current methods only analyze a central fraction of presented confocal microscopy images, the proposed algorithms can utilize 210% more cells to calculate 3D spatial distributions that can span a 23-fold longer distance. These algorithms seek to leverage the high sample cost of 3D tissue imaging techniques by extracting maximal quantitative data throughout the captured image.
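
    As an illustration of pairing a Euclidean metric with Monte Carlo simulation for spatial-distribution testing, the sketch below compares observed 3-D nearest-neighbour distances against randomised point sets. The volume dimensions, point counts and uniform-randomisation null model are assumptions for illustration, not the authors' exact algorithm.

```python
# Sketch: compare the observed mean 3-D nearest-neighbour distance of cell centroids
# against a Monte Carlo envelope of uniformly random points in the same imaged volume.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(1)
volume = np.array([200.0, 200.0, 50.0])           # image dimensions in microns (assumed)
cells = rng.uniform(0, 1, (500, 3)) * volume      # stand-in for segmented cell centroids

def mean_nn_distance(points):
    tree = cKDTree(points)
    d, _ = tree.query(points, k=2)                # k=2: the first neighbour is the point itself
    return d[:, 1].mean()

observed = mean_nn_distance(cells)

# Monte Carlo reference distribution under complete spatial randomness.
simulated = np.array([
    mean_nn_distance(rng.uniform(0, 1, cells.shape) * volume)
    for _ in range(999)
])
p_value = (np.sum(simulated <= observed) + 1) / (len(simulated) + 1)
print(f"observed mean NN distance: {observed:.2f} um, clustering p-value: {p_value:.3f}")
```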

  3. Quantitative evaluation of the risk induced by dominant geomorphological processes on different land uses, based on GIS spatial analysis models

    Science.gov (United States)

    Ştefan, Bilaşco; Sanda, Roşca; Ioan, Fodorean; Iuliu, Vescan; Sorin, Filip; Dănuţ, Petrea

    2017-12-01

    Maramureş Land is mostly characterized by agricultural and forestry land use due to its specific configuration of topography and its specific pedoclimatic conditions. Taking into consideration the trend of the last century from the perspective of land management, a decrease in the surface of agricultural lands to the advantage of built-up and grass lands, as well as an accelerated decrease in the forest cover due to uncontrolled and irrational forest exploitation, has become obvious. The field analysis performed on the territory of Maramureş Land has highlighted a high frequency of two geomorphologic processes — landslides and soil erosion — which have a major negative impact on land use due to their rate of occurrence. The main aim of the present study is the GIS modeling of the two geomorphologic processes, determining a state of vulnerability (the USLE model for soil erosion and a quantitative model based on the morphometric characteristics of the territory, derived from the HG. 447/2003) and their integration in a complex model of cumulated vulnerability identification. The modeling of the risk exposure was performed using a quantitative approach based on models and equations of spatial analysis, which were developed with modeled raster data structures and primary vector data, through a matrix highlighting the correspondence between vulnerability and land use classes. The quantitative analysis of the risk was performed by taking into consideration the exposure classes as modeled databases and the land price as a primary alphanumeric database using spatial analysis techniques for each class by means of the attribute table. The spatial results highlight the territories at high risk from geomorphologic processes with a high degree of occurrence, and they represent a useful tool in the process of spatial planning.
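
    The core overlay step described above amounts to a per-cell lookup of a vulnerability class and a land-use class in a correspondence matrix. A minimal sketch with invented class codes and an illustrative correspondence matrix is shown below.

```python
# Sketch of the matrix-based overlay: a vulnerability raster and a land-use raster are
# combined through a correspondence (lookup) matrix to yield a risk-exposure class per
# cell. The class codes and the matrix values are assumptions for illustration.
import numpy as np

vulnerability = np.array([[0, 1, 2],
                          [1, 2, 2],
                          [0, 0, 1]])            # 0 = low, 1 = medium, 2 = high vulnerability
land_use = np.array([[0, 0, 1],
                     [1, 2, 2],
                     [0, 1, 2]])                 # 0 = forest, 1 = agricultural, 2 = built-up

# correspondence[v, lu] -> risk class (0 = low ... 3 = very high), chosen for illustration
correspondence = np.array([[0, 0, 1],
                           [0, 1, 2],
                           [1, 2, 3]])

risk = correspondence[vulnerability, land_use]   # vectorised per-cell lookup
print(risk)
```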

  5. Problems in implementation of the spatial plan of the Republic of Srpska until 2015: Quantitative analysis

    Directory of Open Access Journals (Sweden)

    Bijelić Branislav

    2017-01-01

    Full Text Available The implementation of spatial plans in the Republic of Srpska is certainly the weakest phase of the process of spatial planning in this entity. It is particularly evident in the case of the Spatial Plan of the Republic of Srpska until 2015, which is the highest strategic spatial planning document in the Republic of Srpska. More precisely, the implementation of spatial plans has been defined as the carrying out of spatial planning documents, i.e. planning propositions as defined in the spatial plans. For the purpose of this paper, a quantitative analysis of the implementation of the planning propositions envisioned by this document has been carried out. The difference between what was planned and what was implemented at the end of the planning period (ex-post evaluation of planning decisions) is presented in this paper. The weighting factor is defined for each thematic field and planning proposition, where the main criterion for determining the weighting factor is the share of the planning proposition and thematic field in the estimated total costs of the plan (financial criterion). The paper has also tackled the issue of the implementation of the Spatial Plan of Bosnia and Herzegovina for the period 1981 - 2000, as well as of the Spatial Plan of the Republic of Srpska 1996 - 2001 - Phased Plan for the period 1996 - 2001, as the previous strategic spatial planning documents of the highest rank covering the area of the Republic of Srpska. The research results have proven the primary hypothesis of the paper that the level of the implementation of the Spatial Plan of the Republic of Srpska until 2015 is less than 10%.
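
    The weighted evaluation described above can be reproduced in outline by weighting each planning proposition by its share of the plan's estimated total cost and averaging the realised fractions. The propositions and figures in the sketch below are invented placeholders, not values taken from the plan.

```python
# Sketch of the weighted ex-post evaluation: each planning proposition gets a weight
# proportional to its share of the plan's estimated total cost, and the overall
# implementation level is the weight-averaged share actually realised.
propositions = [
    # (name, estimated cost share of total plan, fraction implemented by end of period)
    ("motorway network",       0.30, 0.05),
    ("regional water supply",  0.20, 0.15),
    ("housing programme",      0.25, 0.10),
    ("waste infrastructure",   0.25, 0.08),
]

total_share = sum(share for _, share, _ in propositions)
implementation_level = sum(share * done for _, share, done in propositions) / total_share
print(f"overall implementation level: {implementation_level:.1%}")
```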

  6. Mutual misunderstanding and avoidance, misrepresentations and disciplinary politics: spatial science and quantitative analysis in (United Kingdom) geographical curricula

    DEFF Research Database (Denmark)

    Johnston, Ron; Harris, Richard J; Jones, Kelvyn

    2014-01-01

    are those commonly categorised by such terms as 'spatial science' and 'quantitative analysis'. Critics of these areas often write as if the type of work undertaken in the 1960s–1970s still characterises them today, with little appreciation of contemporary activities. This article responds to such claims … by presenting the current nature of work in those areas – very different from that of several decades ago – and makes the case for their inclusion in curricula so that students (most of whom will not proceed to research in the areas) can appreciate the underlying principles of quantitative analyses … and their important role in the formation of an informed citizenry in data-driven, evidence-based policy societies…

  7. Diagnostic performance of semi-quantitative and quantitative stress CMR perfusion analysis: a meta-analysis.

    Science.gov (United States)

    van Dijk, R; van Assen, M; Vliegenthart, R; de Bock, G H; van der Harst, P; Oudkerk, M

    2017-11-27

    Stress cardiovascular magnetic resonance (CMR) perfusion imaging is a promising modality for the evaluation of coronary artery disease (CAD) due to high spatial resolution and absence of radiation. Semi-quantitative and quantitative analysis of CMR perfusion are based on signal-intensity curves produced during the first-pass of gadolinium contrast. Multiple semi-quantitative and quantitative parameters have been introduced. Diagnostic performance of these parameters varies extensively among studies and standardized protocols are lacking. This study aims to determine the diagnostic accuracy of semi-quantitative and quantitative CMR perfusion parameters, compared to multiple reference standards. Pubmed, WebOfScience, and Embase were systematically searched using predefined criteria (3272 articles). A check for duplicates was performed (1967 articles). Eligibility and relevance of the articles were determined by two reviewers using pre-defined criteria. The primary data extraction was performed independently by two researchers with the use of a predefined template. Differences in extracted data were resolved by discussion between the two researchers. The quality of the included studies was assessed using the 'Quality Assessment of Diagnostic Accuracy Studies Tool' (QUADAS-2). True positives, false positives, true negatives, and false negatives were extracted/calculated from the articles. The principal summary measures used to assess diagnostic accuracy were sensitivity, specificity, and area under the receiver operating curve (AUC). Data were pooled according to analysis territory, reference standard and perfusion parameter. Twenty-two articles were eligible based on the predefined study eligibility criteria. The pooled diagnostic accuracy for segment-, territory- and patient-based analyses showed good diagnostic performance with sensitivity of 0.88, 0.82, and 0.83, specificity of 0.72, 0.83, and 0.76 and AUC of 0.90, 0.84, and 0.87, respectively. In per territory
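
    For the data-extraction step, per-study sensitivity, specificity and a simple pooled estimate follow directly from the 2x2 counts. The sketch below uses invented study counts and a naive fixed-effect pooling; the published meta-analysis would rely on a proper bivariate random-effects model.

```python
# Sketch: per-study sensitivity/specificity from extracted 2x2 counts and a simple
# pooled estimate (fixed-effect pooling, for illustration only).
import numpy as np

# rows: (true positives, false positives, true negatives, false negatives) per study
studies = np.array([
    [80, 20, 60, 10],
    [45, 15, 70,  8],
    [95, 30, 55, 12],
])

tp, fp, tn, fn = studies.T
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
pooled_sens = tp.sum() / (tp.sum() + fn.sum())
pooled_spec = tn.sum() / (tn.sum() + fp.sum())
print("per-study sensitivity:", np.round(sensitivity, 2))
print("per-study specificity:", np.round(specificity, 2))
print(f"pooled sensitivity {pooled_sens:.2f}, pooled specificity {pooled_spec:.2f}")
```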

  8. Backyard housing in Gauteng: An analysis of spatial dynamics

    African Journals Online (AJOL)

    Backyard housing in Gauteng: An analysis of spatial dynamics. Yasmin Shapurjee ... Drawing on quantitative geo-demographic data from GeoTerraImage (GTI). (2010), Knowledge .... a fundamental role in absorbing demand for low-income ...

  9. The quantitative modelling of human spatial habitability

    Science.gov (United States)

    Wise, J. A.

    1985-01-01

    A model for the quantitative assessment of human spatial habitability is presented in the space station context. The visual aspect assesses how interior spaces appear to the inhabitants. This aspect concerns criteria such as sensed spaciousness and the affective (emotional) connotations of settings' appearances. The kinesthetic aspect evaluates the available space in terms of its suitability to accommodate human movement patterns, as well as the postural and anthropometric changes due to microgravity. Finally, social logic concerns how the volume and geometry of available space either affirms or contravenes established social and organizational expectations for spatial arrangements. Here, the criteria include privacy, status, social power, and proxemics (the uses of space as a medium of social communication).

  10. High spatial resolution quantitative MR images: an experimental study of dedicated surface coils

    International Nuclear Information System (INIS)

    Gensanne, D; Josse, G; Lagarde, J M; Vincensini, D

    2006-01-01

    Measuring spin-spin relaxation times (T2) by quantitative MR imaging represents a potentially efficient tool to evaluate the physicochemical properties of various media. However, noise in MR images is responsible for uncertainties in the determination of T2 relaxation times, which limits the accuracy of parametric tissue analysis. The required signal-to-noise ratio (SNR) depends on the T2 relaxation behaviour specific to each tissue. Thus, we have previously shown that keeping the uncertainty in T2 measurements within a limit of 10% implies that SNR values be greater than 100 and 300 for mono- and biexponential T2 relaxation behaviours, respectively. Noise reduction can be obtained either by increasing the voxel size (i.e., at the expense of spatial resolution) or by using high sensitivity dedicated surface coils (which allows us to increase SNR without deteriorating spatial resolution in an excessive manner). However, surface coil sensitivity is heterogeneous, i.e., it (and hence the SNR) decreases with increasing depth, and the more so as the coil radius is smaller. The use of surface coils is therefore limited to the analysis of superficial structures such as the hypodermic tissue analysed here. The aim of this work was to determine the maximum limits of spatial resolution and depth compatible with reliable in vivo T2 quantitative MR images using dedicated surface coils available on various clinical MR scanners. The average thickness of adipose tissue is around 15 mm, and the results obtained have shown that obtaining reliable biexponential relaxation analysis requires a minimum achievable voxel size of 13 mm³ for a conventional volume birdcage coil and only of 1.7 mm³ for the smallest available surface coil (23 mm in diameter). Further improvement in spatial resolution allowing us to detect fine details in MR images without deteriorating parametric T2 images can be obtained by image filtering. By using the non-linear selective blurring filter described in a
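
    To illustrate why biexponential T2 analysis demands a higher SNR than monoexponential analysis, the sketch below fits a two-compartment decay to noisy multi-echo data. The echo times, relaxation times and SNR are assumed values, not those of the study.

```python
# Sketch: fitting a biexponential T2 decay to multi-echo signal intensities with scipy.
import numpy as np
from scipy.optimize import curve_fit

def biexp(te, a1, t2_1, a2, t2_2):
    return a1 * np.exp(-te / t2_1) + a2 * np.exp(-te / t2_2)

rng = np.random.default_rng(2)
te = np.arange(10, 330, 10.0)                     # echo times in ms (assumed)
truth = biexp(te, 0.6, 45.0, 0.4, 140.0)          # two-compartment decay (assumed)
snr = 300
signal = truth + rng.normal(0, truth[0] / snr, te.size)

p0 = (0.5, 30.0, 0.5, 100.0)                      # rough starting guess
params, cov = curve_fit(biexp, te, signal, p0=p0, maxfev=10000)
errors = np.sqrt(np.diag(cov))
for name, val, err in zip(("A1", "T2_1", "A2", "T2_2"), params, errors):
    print(f"{name} = {val:7.2f} +/- {err:.2f}")
```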

  11. High resolution or optimum resolution? Spatial analysis of the Federmesser site at Andernach, Germany

    NARCIS (Netherlands)

    Stapert, D; Street, M

    1997-01-01

    This paper discusses spatial analysis at site level. It is suggested that spatial analysis has to proceed at several levels, from global to more detailed questions, and that optimum resolution should be established when applying any quantitative methods in this field. As an example, the ring and

  12. Quantitative analysis chemistry

    International Nuclear Information System (INIS)

    Ko, Wansuk; Lee, Choongyoung; Jun, Kwangsik; Hwang, Taeksung

    1995-02-01

    This book covers quantitative analytical chemistry. It is divided into ten chapters, which deal with the basic concepts and meaning of analytical chemistry and SI units, chemical equilibrium, basic preparation for quantitative analysis, an introduction to volumetric analysis, an outline of acid-base titration with experimental examples, chelate titration, oxidation-reduction titration (including an introduction, titration curves, and diazotization titration), precipitation titration, electrometric titration and quantitative analysis.

  13. An innovative expression model of human health risk based on the quantitative analysis of soil metals sources contribution in different spatial scales.

    Science.gov (United States)

    Zhang, Yimei; Li, Shuai; Wang, Fei; Chen, Zhuang; Chen, Jie; Wang, Liqun

    2018-09-01

    Toxicity of heavy metals from industrialization poses a critical concern, and analysis of the sources associated with potential human health risks is of unique significance. Assessing the human health risk of pollution sources (factored health risk) concurrently in the whole region and in each sub region can provide more instructive information to protect specific potential victims. In this research, we establish a new expression model of human health risk based on quantitative analysis of source contributions at different spatial scales. The larger scale grids and their spatial codes are used to initially identify the level of pollution risk, the type of pollution source and the sensitive population at high risk. The smaller scale grids and their spatial codes are used to identify the contribution of various sources of pollution to each sub region (larger grid) and to assess the health risks posed by each source for each sub region. The results of the case study show that, for children (sensitive populations, taking schools and residential areas as the major regions of activity), the major pollution sources are the abandoned lead-acid battery plant (ALP), traffic emissions and agricultural activity. The new models and results of this research present effective spatial information and a useful model for quantifying the hazards of source categories to human health at complex industrial systems in the future. Copyright © 2018 Elsevier Ltd. All rights reserved.

  14. Quantitative studies with the gamma-camera: correction for spatial and energy distortion

    International Nuclear Information System (INIS)

    Soussaline, F.; Todd-Pokropek, A.E.; Raynaud, C.

    1977-01-01

    The gamma camera sensitivity distribution is an important source of error in quantitative studies. In addition, spatial distortion produces apparent variations in count density which degrade quantitative studies. The flood field image takes into account both effects and is influenced by the pile-up of the tail distribution. It is essential to measure each of these parameters separately. These were investigated using a point source displaced by a special scanning table with two X, Y stepping motors of 10 micron precision. The spatial distribution of the sensitivity, the spatial distortion and the photopeak in the field of view were measured and compared for different setups of the camera and PM gains. For well-tuned cameras, the sensitivity is fairly constant, while the variations appearing in the flood field image are primarily due to spatial distortion, the former being more dependent than the latter on the energy window setting. This indicates why conventional flood field uniformity correction must not be applied. A correction technique to improve the results in quantitative studies has been tested using a continuously matched energy window at every point within the field. A method for correcting spatial distortion is also proposed, where, after an adequately sampled measurement of this error, a transformation can be applied to calculate the true position of events. Knowledge of the magnitude of these parameters is essential in the routine use and design of detector systems.

  15. Spatial data analysis for exploration of regional scale geothermal resources

    Science.gov (United States)

    Moghaddam, Majid Kiavarz; Noorollahi, Younes; Samadzadegan, Farhad; Sharifi, Mohammad Ali; Itoi, Ryuichi

    2013-10-01

    Defining a comprehensive conceptual model of the resources sought is one of the most important steps in geothermal potential mapping. In this study, Fry analysis as a spatial distribution method and 5% well existence, distance distribution, weights of evidence (WofE), and evidential belief function (EBFs) methods as spatial association methods were applied comparatively to known geothermal occurrences, and to publicly-available regional-scale geoscience data in Akita and Iwate provinces within the Tohoku volcanic arc, in northern Japan. Fry analysis and rose diagrams revealed similar directional patterns of geothermal wells and volcanoes, NNW-, NNE-, NE-trending faults, hot springs and fumaroles. Among the spatial association methods, WofE defined a conceptual model corresponding to real-world conditions, approved with the aid of expert opinion. The results of the spatial association analyses quantitatively indicated that the known geothermal occurrences are strongly spatially associated with geological features such as volcanoes, craters, NNW-, NNE-, NE-direction faults and geochemical features such as hot springs, hydrothermal alteration zones and fumaroles. The geophysical evidence includes temperature gradients over 100 °C/km and heat flow over 100 mW/m². In general, geochemical and geophysical data were better evidence layers than geological data for exploring geothermal resources. The spatial analyses of the case study area suggested that quantitative knowledge of hydrothermal geothermal resources was significantly useful for further exploration and for geothermal potential mapping in the case study region. The results can also be extended to regions with nearly similar characteristics.
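
    As an illustration of the weights-of-evidence step, the sketch below computes W+, W- and the contrast for a single binary evidence layer from cell counts; the counts are invented and do not correspond to the Tohoku data.

```python
# Sketch of a weights-of-evidence (WofE) calculation for one binary evidence layer
# (e.g., "within 2 km of a NNE-trending fault") against known geothermal occurrences.
import numpy as np

n_total = 100_000        # total grid cells in the study area (assumed)
n_deposit = 60           # cells containing a known geothermal occurrence (assumed)
n_evidence = 20_000      # cells where the evidence pattern is present (assumed)
n_overlap = 45           # occurrence cells that also fall on the evidence pattern (assumed)

p_b_given_d     = n_overlap / n_deposit
p_b_given_not_d = (n_evidence - n_overlap) / (n_total - n_deposit)

w_plus  = np.log(p_b_given_d / p_b_given_not_d)
w_minus = np.log((1 - p_b_given_d) / (1 - p_b_given_not_d))
contrast = w_plus - w_minus      # strength of the spatial association
print(f"W+ = {w_plus:.2f}, W- = {w_minus:.2f}, contrast = {contrast:.2f}")
```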

  16. Towards Quantitative Spatial Models of Seabed Sediment Composition.

    Directory of Open Access Journals (Sweden)

    David Stephens

    Full Text Available There is a need for fit-for-purpose maps for accurately depicting the types of seabed substrate and habitat and the properties of the seabed for the benefits of research, resource management, conservation and spatial planning. The aim of this study is to determine whether it is possible to predict substrate composition across a large area of seabed using legacy grain-size data and environmental predictors. The study area includes the North Sea up to approximately 58.44°N and the United Kingdom's parts of the English Channel and the Celtic Seas. The analysis combines outputs from hydrodynamic models as well as optical remote sensing data from satellite platforms and bathymetric variables, which are mainly derived from acoustic remote sensing. We build a statistical regression model to make quantitative predictions of sediment composition (fractions of mud, sand and gravel) using the random forest algorithm. The compositional data is analysed on the additive log-ratio scale. An independent test set indicates that approximately 66% and 71% of the variability of the two log-ratio variables are explained by the predictive models. A EUNIS substrate model, derived from the predicted sediment composition, achieved an overall accuracy of 83% and a kappa coefficient of 0.60. We demonstrate that it is feasible to spatially predict the seabed sediment composition across a large area of continental shelf in a repeatable and validated way. We also highlight the potential for further improvements to the method.
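
    A minimal sketch of the described workflow (additive log-ratio transformation of the mud/sand/gravel fractions followed by random-forest regression on environmental predictors) is given below, using synthetic data in place of the legacy grain-size dataset; the predictor dimensions are illustrative.

```python
# Sketch: alr transform of compositional responses plus random-forest regression.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n = 1000
X = rng.normal(size=(n, 4))                        # e.g. depth, wave stress, current speed, reflectance

# Synthetic compositional response (mud, sand, gravel fractions summing to 1).
raw = np.exp(rng.normal(size=(n, 3)) + X[:, :3] * 0.5)
comp = raw / raw.sum(axis=1, keepdims=True)

# Additive log-ratio transform with gravel as the denominator component.
alr = np.log(comp[:, :2] / comp[:, 2:3])           # two log-ratio responses

X_tr, X_te, y_tr, y_te = train_test_split(X, alr, test_size=0.3, random_state=0)
model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)
print(f"overall R^2 on held-out data: {model.score(X_te, y_te):.2f}")

# Back-transform predictions to fractions via the inverse alr.
pred = model.predict(X_te)
frac = np.column_stack([np.exp(pred), np.ones(len(pred))])
frac /= frac.sum(axis=1, keepdims=True)
print("first predicted (mud, sand, gravel):", np.round(frac[0], 2))
```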

  17. New quantitative approaches reveal the spatial preference of nuclear compartments in mammalian fibroblasts.

    Science.gov (United States)

    Weston, David J; Russell, Richard A; Batty, Elizabeth; Jensen, Kirsten; Stephens, David A; Adams, Niall M; Freemont, Paul S

    2015-03-06

    The nuclei of higher eukaryotic cells display compartmentalization and certain nuclear compartments have been shown to follow a degree of spatial organization. To date, the study of nuclear organization has often involved simple quantitative procedures that struggle with both the irregularity of the nuclear boundary and the problem of handling replicate images. Such studies typically focus on inter-object distance, rather than spatial location within the nucleus. The concern of this paper is the spatial preference of nuclear compartments, for which we have developed statistical tools to quantitatively study and explore nuclear organization. These tools combine replicate images to generate 'aggregate maps' which represent the spatial preferences of nuclear compartments. We present two examples of different compartments in mammalian fibroblasts (WI-38 and MRC-5) that demonstrate new knowledge of spatial preference within the cell nucleus. Specifically, the spatial preference of RNA polymerase II is preserved across normal and immortalized cells, whereas PML nuclear bodies exhibit a change in spatial preference from avoiding the centre in normal cells to exhibiting a preference for the centre in immortalized cells. In addition, we show that SC35 splicing speckles are excluded from the nuclear boundary and localize throughout the nucleoplasm and in the interchromatin space in non-transformed WI-38 cells. This new methodology is thus able to reveal the effect of large-scale perturbation on spatial architecture and preferences that would not be obvious from single cell imaging.

  18. Dahl (S × R) rat congenic strain analysis confirms and defines a chromosome 17 spatial navigation quantitative trait locus to <10 Mbp.

    Science.gov (United States)

    Herrera, Victoria L; Pasion, Khristine A; Tan, Glaiza A; Ruiz-Opazo, Nelson

    2013-01-01

    A quantitative trait locus (QTL) linked with ability to find a platform in the Morris Water Maze (MWM) was located on chromosome 17 (Nav-5 QTL) using intercross between Dahl S and Dahl R rats. We developed two congenic strains, S.R17A and S.R17B introgressing Dahl R-chromosome 17 segments into Dahl S chromosome 17 region spanning putative Nav-5 QTL. Performance analysis of S.R17A, S.R17B and Dahl S rats in the Morris water maze (MWM) task showed a significantly decreased spatial navigation performance in S.R17B congenic rats when compared with Dahl S controls (P = 0.02). The S.R17A congenic segment did not affect MWM performance delimiting Nav-5 to the chromosome 17 65.02-74.66 Mbp region. Additional fine mapping is necessary to identify the specific gene variant accounting for Nav-5 effect on spatial learning and memory in Dahl rats.

  20. Spatial access priority mapping (SAPM) with fishers: a quantitative GIS method for participatory planning.

    Science.gov (United States)

    Yates, Katherine L; Schoeman, David S

    2013-01-01

    Spatial management tools, such as marine spatial planning and marine protected areas, are playing an increasingly important role in attempts to improve marine management and accommodate conflicting needs. Robust data are needed to inform decisions among different planning options, and early inclusion of stakeholder involvement is widely regarded as vital for success. One of the biggest stakeholder groups, and the most likely to be adversely impacted by spatial restrictions, is the fishing community. In order to take their priorities into account, planners need to understand spatial variation in their perceived value of the sea. Here a readily accessible, novel method for quantitatively mapping fishers' spatial access priorities is presented. Spatial access priority mapping, or SAPM, uses only basic functions of standard spreadsheet and GIS software. Unlike the use of remote-sensing data, SAPM actively engages fishers in participatory mapping, documenting rather than inferring their priorities. By so doing, SAPM also facilitates the gathering of other useful data, such as local ecological knowledge. The method was tested and validated in Northern Ireland, where over 100 fishers participated in a semi-structured questionnaire and mapping exercise. The response rate was excellent, 97%, demonstrating fishers' willingness to be involved. The resultant maps are easily accessible and instantly informative, providing a very clear visual indication of which areas are most important for the fishers. The maps also provide quantitative data, which can be used to analyse the relative impact of different management options on the fishing industry and can be incorporated into planning software, such as MARXAN, to ensure that conservation goals can be met at minimum negative impact to the industry. This research shows how spatial access priority mapping can facilitate the early engagement of fishers and the ready incorporation of their priorities into the decision-making process
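
    One plausible way to aggregate such priority data (not necessarily the published SAPM scoring scheme) is to let each respondent spread a fixed budget of priority points over grid cells and normalise the per-cell totals, as sketched below with invented numbers.

```python
# Sketch: aggregating fishers' spatial access priorities onto a planning grid.
# Respondent counts, grid size and the point-budget rule are assumptions.
import numpy as np

rng = np.random.default_rng(8)
n_fishers, n_cells, budget = 100, 400, 100      # 100 respondents, 20 x 20 grid, 100 points each

priority = np.zeros((n_fishers, n_cells))
for f in range(n_fishers):
    used = rng.choice(n_cells, size=rng.integers(3, 12), replace=False)  # cells this fisher marked
    weights = rng.random(used.size)
    priority[f, used] = budget * weights / weights.sum()                 # allocate the point budget

# Aggregate across fishers and normalise to a 0-1 access-priority surface.
cell_totals = priority.sum(axis=0)
surface = (cell_totals / cell_totals.max()).reshape(20, 20)
print("top-priority cell:", np.unravel_index(surface.argmax(), surface.shape),
      "share of all points:", round(cell_totals.max() / cell_totals.sum(), 3))
```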

  1. Issues in Quantitative Analysis of Ultraviolet Imager (UVI) Data: Airglow

    Science.gov (United States)

    Germany, G. A.; Richards, P. G.; Spann, J. F.; Brittnacher, M. J.; Parks, G. K.

    1999-01-01

    The GGS Ultraviolet Imager (UVI) has proven to be especially valuable in correlative substorm, auroral morphology, and extended statistical studies of the auroral regions. Such studies are based on knowledge of the location, spatial, and temporal behavior of auroral emissions. More quantitative studies, based on absolute radiometric intensities from UVI images, require a more intimate knowledge of the instrument behavior and data processing requirements and are inherently more difficult than studies based on relative knowledge of the oval location. In this study, UVI airglow observations are analyzed and compared with model predictions to illustrate issues that arise in quantitative analysis of UVI images. These issues include instrument calibration, long term changes in sensitivity, and imager flat field response as well as proper background correction. Airglow emissions are chosen for this study because of their relatively straightforward modeling requirements and because of their implications for thermospheric compositional studies. The analysis issues discussed here, however, are identical to those faced in quantitative auroral studies.

  2. Prospects for higher spatial resolution quantitative X-ray analysis using transition element L-lines

    Science.gov (United States)

    Statham, P.; Holland, J.

    2014-03-01

    Lowering electron beam kV reduces electron scattering and improves spatial resolution of X-ray analysis. However, a previous round robin analysis of steels at 5 - 6 kV using Lα-lines for the first row transition elements gave poor accuracies. Our experiments on SS63 steel using Lα-lines show similar biases in Cr and Ni that cannot be corrected with changes to self-absorption coefficients or carbon coating. The inaccuracy may be caused by different probabilities for emission and anomalous self-absorption for the Lα-line between specimen and pure element standard. Analysis using Ll(L3-M1)-lines gives more accurate results for SS63, plausibly because the M1-shell is not so vulnerable to the atomic environment as the unfilled M4,5-shell. However, Ll-intensities are very weak and WDS analysis may be impractical for some applications. EDS with a large area SDD offers orders of magnitude faster analysis and achieves similar results to WDS analysis with Lα-lines, but poorer energy resolution precludes the use of Ll-lines in most situations. EDS analysis of K-lines at low overvoltage is an alternative strategy for improving spatial resolution that could give higher accuracy. The trade-off between low kV and low overvoltage is explored in terms of sensitivity for element detection for different elements.

  3. Quantitative investment analysis

    CERN Document Server

    DeFusco, Richard

    2007-01-01

    In the "Second Edition" of "Quantitative Investment Analysis," financial experts Richard DeFusco, Dennis McLeavey, Jerald Pinto, and David Runkle outline the tools and techniques needed to understand and apply quantitative methods to today's investment process.

  4. Geographic information systems, remote sensing, and spatial analysis activities in Texas, 2002-07

    Science.gov (United States)

    Pearson, D.K.; Gary, R.H.; Wilson, Z.D.

    2007-01-01

    Geographic information system (GIS) technology has become an important tool for scientific investigation, resource management, and environmental planning. A GIS is a computer-aided system capable of collecting, storing, analyzing, and displaying spatially referenced digital data. GIS technology is particularly useful when analyzing a wide variety of spatial data such as with remote sensing and spatial analysis. Remote sensing involves collecting remotely sensed data, such as satellite imagery, aerial photography, or radar images, and analyzing the data to gather information or investigate trends about the environment or the Earth's surface. Spatial analysis combines remotely sensed, thematic, statistical, quantitative, and geographical data through overlay, modeling, and other analytical techniques to investigate specific research questions. It is the combination of data formats and analysis techniques that has made GIS an essential tool in scientific investigations. This document presents information about the technical capabilities and project activities of the U.S. Geological Survey (USGS) Texas Water Science Center (TWSC) GIS Workgroup from 2002 through 2007.

  5. A spatially explicit and quantitative vulnerability assessment of ecosystem service change in Europe

    NARCIS (Netherlands)

    Metzger, M.J.; Schröter, D.; Leemans, R.; Cramer, W.

    2008-01-01

    Environmental change alters ecosystem functioning and may put the provision of services to human at risk. This paper presents a spatially explicit and quantitative assessment of the corresponding vulnerability for Europe, using a new framework designed to answer multidisciplinary policy relevant

  6. Quantitative protein localization signatures reveal an association between spatial and functional divergences of proteins.

    Science.gov (United States)

    Loo, Lit-Hsin; Laksameethanasan, Danai; Tung, Yi-Ling

    2014-03-01

    Protein subcellular localization is a major determinant of protein function. However, this important protein feature is often described in terms of discrete and qualitative categories of subcellular compartments, and therefore it has limited applications in quantitative protein function analyses. Here, we present Protein Localization Analysis and Search Tools (PLAST), an automated analysis framework for constructing and comparing quantitative signatures of protein subcellular localization patterns based on microscopy images. PLAST produces human-interpretable protein localization maps that quantitatively describe the similarities in the localization patterns of proteins and major subcellular compartments, without requiring manual assignment or supervised learning of these compartments. Using the budding yeast Saccharomyces cerevisiae as a model system, we show that PLAST is more accurate than existing, qualitative protein localization annotations in identifying known co-localized proteins. Furthermore, we demonstrate that PLAST can reveal protein localization-function relationships that are not obvious from these annotations. First, we identified proteins that have similar localization patterns and participate in closely-related biological processes, but do not necessarily form stable complexes with each other or localize at the same organelles. Second, we found an association between spatial and functional divergences of proteins during evolution. Surprisingly, as proteins with common ancestors evolve, they tend to develop more diverged subcellular localization patterns, but still occupy similar numbers of compartments. This suggests that divergence of protein localization might be more frequently due to the development of more specific localization patterns over ancestral compartments than the occupation of new compartments. PLAST enables systematic and quantitative analyses of protein localization-function relationships, and will be useful to elucidate protein

  7. Pattern decomposition and quantitative-phase analysis in pulsed neutron transmission

    International Nuclear Information System (INIS)

    Steuwer, A.; Santisteban, J.R.; Withers, P.J.; Edwards, L.

    2004-01-01

    Neutron diffraction methods provide accurate quantitative insight into material properties, with applications ranging from fundamental physics to applied engineering research. Neutron radiography or tomography, on the other hand, are useful tools for the non-destructive spatial imaging of materials or engineering components, but are less accurate with respect to any quantitative analysis. It is possible to combine the advantages of diffraction and radiography using pulsed neutron transmission in a novel way. Using a pixellated detector at a time-of-flight source it is possible to collect 2D 'images' containing a great deal of interesting information in the thermal regime. This, together with the unprecedented intensities available at spallation sources and improvements in computing power, allows for a re-assessment of transmission methods. It opens the possibility of simultaneous imaging of diverse material properties such as strain or temperature, as well as the variation in attenuation, and can assist in the determination of phase volume fraction. Spatial and time resolution (for dynamic experiments) are limited only by the detector technology and the intensity of the source. In this example, phase information contained in the cross-section is extracted from Bragg edges using an approach similar to pattern decomposition.

  8. Perspectives on spatial data analysis

    CERN Document Server

    Rey, Sergio

    2010-01-01

    This book takes both a retrospective and prospective view of the field of spatial analysis by combining selected reprints of classic articles by Arthur Getis with current observations by leading experts in the field. Four main aspects are highlighted, dealing with spatial analysis, pattern analysis, local statistics as well as illustrative empirical applications. Researchers and students will gain an appreciation of Getis' methodological contributions to spatial analysis and the broad impact of the methods he has helped pioneer on an impressively broad array of disciplines including spatial epidemiology, demography, economics, and ecology. The volume is a compilation of high impact original contributions, as evidenced by citations, and the latest thinking on the field by leading scholars. This makes the book ideal for advanced seminars and courses in spatial analysis as well as a key resource for researchers seeking a comprehensive overview of recent advances and future directions in the field.

  9. Professional analysis in spatial planning

    Directory of Open Access Journals (Sweden)

    Andrej Černe

    2005-12-01

    Full Text Available Spatial analysis contributes to the accomplishment of the three basic aims of spatial planning: it is a basic element for setting spatial policies, concepts and strategies; it gives basic information to inhabitants, land owners, investors and planners; and it helps in implementing spatial policies, strategies, plans, programmes and projects. Analyses in planning are generally devoted to: understanding current circumstances and emerging conditions within planning decisions; determining priorities among open questions and their solutions; and formulating general principles for further development.

  10. Real-time and quantitative isotropic spatial resolution susceptibility imaging for magnetic nanoparticles

    Science.gov (United States)

    Pi, Shiqiang; Liu, Wenzhong; Jiang, Tao

    2018-03-01

    The magnetic transparency of biological tissue allows the magnetic nanoparticle (MNP) to be a promising functional sensor and contrast agent. The complex susceptibility of MNPs, strongly influenced by particle concentration, the excitation magnetic field and their surrounding microenvironment, provides significant implications for biomedical applications. Therefore, magnetic susceptibility imaging of high spatial resolution will give more detailed information during the process of MNP-aided diagnosis and therapy. In this study, we present a novel spatial magnetic susceptibility extraction method for MNPs under a gradient magnetic field, a low-frequency drive magnetic field, and a weak high-frequency magnetic field. Based on this novel method, magnetic particle susceptibility imaging (MPSI) with millimeter-level spatial resolution (<3 mm) was achieved using our homemade imaging system. Corroborated by the experimental results, the MPSI shows real-time (1 s per frame acquisition) and quantitative abilities, and isotropic high resolution.

  11. A quantitative method for determining spatial discriminative capacity

    Directory of Open Access Journals (Sweden)

    Dennis Robert G

    2008-03-01

    Full Text Available Abstract Background The traditional two-point discrimination (TPD) test, a widely used tactile spatial acuity measure, has been criticized as being imprecise because it is based on subjective criteria and involves a number of non-spatial cues. The results of a recent study showed that as two stimuli were delivered simultaneously, vibrotactile amplitude discrimination became worse when the two stimuli were positioned relatively close together and was significantly degraded when the probes were within a subject's two-point limen. The impairment of amplitude discrimination with decreasing inter-probe distance suggested that the metric of amplitude discrimination could possibly provide a means of objective and quantitative measurement of spatial discrimination capacity. Methods A two alternative forced-choice (2AFC) tracking procedure was used to assess a subject's ability to discriminate the amplitude difference between two stimuli positioned at near-adjacent skin sites. Two 25 Hz flutter stimuli, identical except for a constant difference in amplitude, were delivered simultaneously to the hand dorsum. The stimuli were initially spaced 30 mm apart, and the inter-stimulus distance was modified on a trial-by-trial basis based on the subject's performance of discriminating the stimulus with higher intensity. The experiment was repeated via sequential, rather than simultaneous, delivery of the same vibrotactile stimuli. Results Results obtained from this study showed that the performance of the amplitude discrimination task was significantly degraded when the stimuli were delivered simultaneously and were near a subject's two-point limen. In contrast, subjects were able to correctly discriminate between the amplitudes of the two stimuli when they were sequentially delivered at all inter-probe distances (including those within the two-point limen), and improved when an adapting stimulus was delivered prior to simultaneously delivered stimuli. Conclusion
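
    The adaptive tracking idea can be illustrated with a simulated 2AFC staircase in which the inter-probe distance shrinks after correct responses and grows after errors. The psychometric function, step size and 2-down/1-up rule below are assumptions for illustration, not the study's exact protocol.

```python
# Sketch: a 2AFC staircase converging near the distance where discrimination breaks down.
import numpy as np

rng = np.random.default_rng(4)

def p_correct(distance_mm, limen_mm=10.0):
    # Hypothetical psychometric function: performance falls toward chance (0.5)
    # as the inter-probe distance approaches the two-point limen.
    return 0.5 + 0.49 / (1.0 + np.exp(-(distance_mm - limen_mm) / 2.0))

distance, step = 30.0, 2.0        # starting inter-probe distance and step size (mm)
track, consecutive_correct = [], 0
for trial in range(80):
    track.append(distance)
    if rng.random() < p_correct(distance):
        consecutive_correct += 1
        if consecutive_correct == 2:              # 2-down / 1-up rule (~70.7% correct)
            distance = max(1.0, distance - step)
            consecutive_correct = 0
    else:
        distance += step
        consecutive_correct = 0

print(f"estimated discrimination threshold ~ {np.mean(track[-30:]):.1f} mm")
```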

  12. Quantitative analysis of receptor imaging

    International Nuclear Information System (INIS)

    Fu Zhanli; Wang Rongfu

    2004-01-01

    Model-based methods for quantitative analysis of receptor imaging, including kinetic, graphical and equilibrium methods, are introduced in detail. Some technical problems facing quantitative analysis of receptor imaging, such as the correction for in vivo metabolism of the tracer, the radioactivity contribution from blood volume within the ROI, and the estimation of the nondisplaceable ligand concentration, are also reviewed briefly.

  13. A matter of ephemerality: the study of Kel Tadrart Tuareg (southwest Libya) campsites via quantitative spatial analysis

    Directory of Open Access Journals (Sweden)

    Stefano Biagetti

    2016-03-01

    Full Text Available We examined the settlement structure from the Kel Tadrart Tuareg, a small pastoral society from southwest Libya. Our objective was to apply spatial analysis to establish the statistical significance of specific patterns in the settlement layout. In particular, we examined whether there is a separation between domestic and livestock spaces, and whether particular residential features dedicated to guests are spatially isolated. We used both established statistical techniques and newly developed bespoke analyses to test our hypotheses, and then discuss the results in the light of possible applications to other case studies.

  14. Quantitative analysis of real-time radiographic systems

    International Nuclear Information System (INIS)

    Barker, M.D.; Condon, P.E.; Barry, R.C.; Betz, R.A.; Klynn, L.M.

    1988-01-01

    A method was developed which yields quantitative information on the spatial resolution, contrast sensitivity, image noise, and focal spot size from real time radiographic images. The method uses simple image quality indicators and computer programs which make it possible to readily obtain quantitative performance measurements of single or multiple radiographic systems. It was used for x-ray and optical images to determine which component of the system was not operating up to standard. Focal spot size was monitored by imaging a bar pattern. This paper constitutes the second progress report on the development of the camera and radiation image quality indicators

  15. Spatial analysis and planning under imprecision

    CERN Document Server

    Leung, Y

    1988-01-01

    The book deals with complexity, imprecision, human valuation, and uncertainty in spatial analysis and planning, providing a systematic exposition of a new philosophical and theoretical foundation for spatial analysis and planning under imprecision. Regional concepts and regionalization, spatial preference-utility-choice structures, spatial optimization with single and multiple objectives, and dynamic spatial systems and their controls are analyzed in sequence. The analytical framework is based on fuzzy set theory. Basic concepts of fuzzy set theory are first discussed. Many numerical examples and emp

  16. Spatial analysis statistics, visualization, and computational methods

    CERN Document Server

    Oyana, Tonny J

    2015-01-01

    An introductory text for the next generation of geospatial analysts and data scientists, Spatial Analysis: Statistics, Visualization, and Computational Methods focuses on the fundamentals of spatial analysis using traditional, contemporary, and computational methods. Outlining both non-spatial and spatial statistical concepts, the authors present practical applications of geospatial data tools, techniques, and strategies in geographic studies. They offer a problem-based learning (PBL) approach to spatial analysis-containing hands-on problem-sets that can be worked out in MS Excel or ArcGIS-as well as detailed illustrations and numerous case studies. The book enables readers to: Identify types and characterize non-spatial and spatial data Demonstrate their competence to explore, visualize, summarize, analyze, optimize, and clearly present statistical data and results Construct testable hypotheses that require inferential statistical analysis Process spatial data, extract explanatory variables, conduct statisti...

  18. SRXRF analysis with spatial resolution of dental calculus

    Science.gov (United States)

    Sánchez, Héctor Jorge; Pérez, Carlos Alberto; Grenón, Miriam

    2000-09-01

    This work presents elemental-composition studies of dental calculus by X-ray fluorescence analysis using synchrotron radiation. The intrinsic characteristics of synchrotron light allow for a semi-quantitative analysis with spatial resolution. The experiments were carried out in the high-vacuum station of the XRF beamline at the Synchrotron Light National Laboratory (Campinas, Brazil). All the measurements were performed in conventional geometry (45°+45°) and the micro-collimation was attained via a pair of orthogonal slits mounted in the beamline. In this way, pixels of 50 μm×50 μm were obtained keeping a high flux of photons on the sample. Samples of human dental calculus were measured in different positions along their growing axis, in order to determine variations of the compositions in the pattern of deposit. Intensity ratios of minor elements and traces were obtained, and linear profiles and surface distributions were determined. As a general summary, we can conclude that μXRF experiments with spatial resolution on dental calculus are feasible with simple collimation and adequate positioning systems, keeping a high flux of photons. These results open interesting perspectives for the future station of the line, devoted to μXRF, which will reach resolutions of the order of 10 μm.

  19. Recent developments in spatial analysis spatial statistics, behavioural modelling, and computational intelligence

    CERN Document Server

    Getis, Arthur

    1997-01-01

    In recent years, spatial analysis has become an increasingly active field, as evidenced by the establishment of educational and research programs at many universities. Its popularity is due mainly to new technologies and the development of spatial data infrastructures. This book illustrates some recent developments in spatial analysis, behavioural modelling, and computational intelligence. World-renowned spatial analysts explain and demonstrate their new and insightful models and methods. The applications are in areas of societal interest such as the spread of infectious diseases, migration behaviour, and retail and agricultural location strategies. In addition, there is emphasis on the uses of new technologies for the analysis of spatial data through the application of neural network concepts.

  20. Using semi-variogram analysis for providing spatially distributed information on soil surface condition for land surface modeling

    Science.gov (United States)

    Croft, Holly; Anderson, Karen; Kuhn, Nikolaus J.

    2010-05-01

    The ability to quantitatively and spatially assess soil surface roughness is important in geomorphology and land degradation studies. Soils can experience rapid structural degradation in response to land cover changes, resulting in increased susceptibility to erosion and a loss of Soil Organic Matter (SOM). Changes in soil surface condition can also alter sediment detachment, transport and deposition processes, infiltration rates and surface runoff characteristics. Deriving spatially distributed quantitative information on soil surface condition for inclusion in hydrological and soil erosion models is therefore paramount. However, due to the time and resources involved in using traditional field sampling techniques, there is a lack of spatially distributed information on soil surface condition. Laser techniques can provide data for a rapid three dimensional representation of the soil surface at a fine spatial resolution. This provides the ability to capture changes at the soil surface associated with aggregate breakdown, flow routing, erosion and sediment re-distribution. Semi-variogram analysis of the laser data can be used to represent spatial dependence within the dataset; providing information about the spatial character of soil surface structure. This experiment details the ability of semi-variogram analysis to spatially describe changes in soil surface condition. Soil for three soil types (silt, silt loam and silty clay) was sieved to produce aggregates between 1 mm and 16 mm in size and placed evenly in sample trays (25 x 20 x 2 cm). Soil samples for each soil type were exposed to five different durations of artificial rainfall, to produce progressively structurally degraded soil states. A calibrated laser profiling instrument was used to measure surface roughness over a central 10 x 10 cm plot of each soil state, at 2 mm sample spacing. The laser data were analysed within a geostatistical framework, where semi-variogram analysis quantitatively represented
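
    The semi-variogram itself is straightforward to compute from the gridded laser heights: half the mean squared height difference as a function of lag distance. The sketch below uses a synthetic surface in place of the instrument data; the grid size and lag bins are assumptions.

```python
# Sketch: empirical semivariogram of surface heights on a regular 2 mm grid.
import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.default_rng(5)
nx = ny = 50                                   # 50 x 50 points at 2 mm spacing = 10 x 10 cm plot
spacing = 2.0                                  # mm
x, y = np.meshgrid(np.arange(nx) * spacing, np.arange(ny) * spacing)
z = np.sin(x / 15.0) + 0.3 * rng.normal(size=x.shape)   # stand-in roughness field

coords = np.column_stack([x.ravel(), y.ravel()])
heights = z.ravel()

# Pairwise lag distances and squared height differences (subsampled for speed).
idx = rng.choice(len(heights), 800, replace=False)
lags = pdist(coords[idx])
sq_diff = pdist(heights[idx, None], metric="sqeuclidean")

# Bin into lag classes: gamma(h) = 0.5 * mean squared difference at lag h.
bins = np.arange(0, 50, 4.0)
which = np.digitize(lags, bins)
gamma = [0.5 * sq_diff[which == b].mean() for b in range(1, len(bins))]
for lo, g in zip(bins[:-1], gamma):
    print(f"lag {lo:4.0f}-{lo + 4:.0f} mm: semivariance {g:.3f}")
```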

  1. A new methodology of spatial cross-correlation analysis.

    Science.gov (United States)

    Chen, Yanguang

    2015-01-01

    Spatial correlation modeling comprises both spatial autocorrelation and spatial cross-correlation processes. The spatial autocorrelation theory has been well-developed. It is necessary to advance the method of spatial cross-correlation analysis to supplement the autocorrelation analysis. This paper presents a set of models and analytical procedures for spatial cross-correlation analysis. By analogy with Moran's index newly expressed in a spatial quadratic form, a theoretical framework is derived for geographical cross-correlation modeling. First, two sets of spatial cross-correlation coefficients are defined, including a global spatial cross-correlation coefficient and local spatial cross-correlation coefficients. Second, a pair of scatterplots of spatial cross-correlation is proposed, and the plots can be used to visually reveal the causality behind spatial systems. Based on the global cross-correlation coefficient, Pearson's correlation coefficient can be decomposed into two parts: direct correlation (partial correlation) and indirect correlation (spatial cross-correlation). As an example, the methodology is applied to the relationships between China's urbanization and economic development to illustrate how to model spatial cross-correlation phenomena. This study is an introduction to developing the theory of spatial cross-correlation, and future geographical spatial analysis might benefit from these models and indexes.
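
    By analogy with Moran's index written as a quadratic form, a global spatial cross-correlation coefficient can be sketched as below with a row-standardised weights matrix. This is one common formulation offered for illustration; it is not claimed to be the paper's exact definition, and the toy data are invented.

```python
# Sketch: a Moran-type global spatial cross-correlation coefficient as a quadratic form.
import numpy as np

def row_standardise(W):
    W = np.array(W, dtype=float)
    np.fill_diagonal(W, 0.0)
    return W / W.sum(axis=1, keepdims=True)

def spatial_cross_correlation(x, y, W):
    zx = (x - x.mean()) / x.std()
    zy = (y - y.mean()) / y.std()
    return zx @ row_standardise(W) @ zy / len(x)

# Toy example: 5 regions on a line, each connected to its neighbours.
W = np.array([[0, 1, 0, 0, 0],
              [1, 0, 1, 0, 0],
              [0, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [0, 0, 0, 1, 0]])
urbanisation = np.array([0.3, 0.4, 0.5, 0.7, 0.8])
gdp_per_cap = np.array([1.0, 1.3, 1.6, 2.2, 2.6])

print(f"spatial cross-correlation: {spatial_cross_correlation(urbanisation, gdp_per_cap, W):.3f}")
print(f"spatial autocorrelation (Moran-type): {spatial_cross_correlation(urbanisation, urbanisation, W):.3f}")
```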

  2. Digital Holographic Microscopy: Quantitative Phase Imaging and Applications in Live Cell Analysis

    Science.gov (United States)

    Kemper, Björn; Langehanenberg, Patrik; Kosmeier, Sebastian; Schlichthaber, Frank; Remmersmann, Christian; von Bally, Gert; Rommel, Christina; Dierker, Christian; Schnekenburger, Jürgen

    The analysis of complex processes in living cells creates a high demand for fast and label-free methods for online monitoring. Widely used fluorescence methods require specific labeling and are often restricted to chemically fixated samples. Thus, methods that offer label-free and minimally invasive detection of live cell processes and cell state alterations are of particular interest. In combination with light microscopy, digital holography provides label-free, multi-focus quantitative phase imaging of living cells. In overview, several methods for digital holographic microscopy (DHM) are presented. First, different experimental setups for the recording of digital holograms and the modular integration of DHM into common microscopes are described. Then the numerical processing of digitally captured holograms is explained. This includes the description of spatial and temporal phase shifting techniques, spatial filtering based reconstruction, holographic autofocusing, and the evaluation of self-interference holograms. Furthermore, the usage of partial coherent light and multi-wavelength approaches is discussed. Finally, potentials of digital holographic microscopy for quantitative cell imaging are illustrated by results from selected applications. It is shown that DHM can be used for automated tracking of migrating cells and cell thickness monitoring as well as for refractive index determination of cells and particles. Moreover, the use of DHM for label-free analysis in fluidics and micro-injection monitoring is demonstrated. The results show that DHM is a highly relevant method that allows novel insights in dynamic cell biology, with applications in cancer research and for drugs and toxicity testing.
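
    The spatial-filtering reconstruction mentioned above can be sketched as follows: Fourier transform the off-axis hologram, isolate one sideband, re-centre it, and inverse transform to recover the quantitative phase. The carrier frequency, mask size and synthetic object below are assumptions for illustration, not parameters of the described setups.

```python
# Sketch: spatial-filtering reconstruction of a synthetic off-axis hologram.
import numpy as np

N = 256
x, y = np.meshgrid(np.arange(N), np.arange(N))

# Synthetic object phase (a cell-like phase bump) and an off-axis plane reference wave.
phase = 2.0 * np.exp(-((x - N / 2) ** 2 + (y - N / 2) ** 2) / (2 * 30 ** 2))
object_wave = np.exp(1j * phase)
fc = 40 / N                                          # carrier frequency (cycles/pixel, assumed)
reference = np.exp(1j * 2 * np.pi * fc * (x + y))
hologram = np.abs(object_wave + reference) ** 2      # recorded intensity

# Spatial filtering: keep a window around one sideband (the image term) in the Fourier plane.
spectrum = np.fft.fftshift(np.fft.fft2(hologram))
cx = cy = N // 2 - 40                                # sideband centre, matching fc above
r = 20
mask = np.zeros_like(spectrum)
mask[cy - r:cy + r, cx - r:cx + r] = 1.0
sideband = spectrum * mask

# Re-centre the sideband and inverse transform; the argument is the recovered phase.
sideband = np.roll(sideband, (40, 40), axis=(0, 1))
field = np.fft.ifft2(np.fft.ifftshift(sideband))
recovered = np.angle(field)
print("recovered peak phase ~", round(float(recovered.max()), 2), "rad (expected ~2)")
```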

  3. A precategorical spatial-data metamodel

    OpenAIRE

    Steven A Roberts; G Brent Hall; Paul H Calamai

    2006-01-01

    Increasing recognition of the extent and speed of habitat fragmentation and loss, particularly in the urban fringe, is driving the need to analyze regional landscape structure qualitatively and quantitatively for decision support in land-use planning and environmental-policy implementation. The spatial analysis required in this area is not well served by existing spatial-data models. In this paper a new theoretical spatial-data metamodel is introduced as a tool for addressing such needs and a...

  4. Automated high resolution full-field spatial coherence tomography for quantitative phase imaging of human red blood cells

    Science.gov (United States)

    Singla, Neeru; Dubey, Kavita; Srivastava, Vishal; Ahmad, Azeem; Mehta, D. S.

    2018-02-01

    We developed an automated high-resolution full-field spatial coherence tomography (FF-SCT) microscope for quantitative phase imaging that is based on spatial, rather than temporal, coherence gating. Red and green laser light was used to obtain quantitative phase images of unstained human red blood cells (RBCs). This study uses morphological parameters of unstained RBC phase images to distinguish between normal and infected cells. We recorded a single interferogram with the FF-SCT microscope for the red and the green wavelength and averaged the two phase images to further reduce noise artifacts. To distinguish anemia-infected cells from normal cells, different morphological features were extracted, and these features were used to train a machine-learning ensemble model to classify RBCs with high accuracy.

  5. A New Methodology of Spatial Cross-Correlation Analysis

    Science.gov (United States)

    Chen, Yanguang

    2015-01-01

    Spatial correlation modeling comprises both spatial autocorrelation and spatial cross-correlation processes. The spatial autocorrelation theory has been well-developed. It is necessary to advance the method of spatial cross-correlation analysis to supplement the autocorrelation analysis. This paper presents a set of models and analytical procedures for spatial cross-correlation analysis. By analogy with Moran’s index newly expressed in a spatial quadratic form, a theoretical framework is derived for geographical cross-correlation modeling. First, two sets of spatial cross-correlation coefficients are defined, including a global spatial cross-correlation coefficient and local spatial cross-correlation coefficients. Second, a pair of scatterplots of spatial cross-correlation is proposed, and the plots can be used to visually reveal the causality behind spatial systems. Based on the global cross-correlation coefficient, Pearson’s correlation coefficient can be decomposed into two parts: direct correlation (partial correlation) and indirect correlation (spatial cross-correlation). As an example, the methodology is applied to the relationships between China’s urbanization and economic development to illustrate how to model spatial cross-correlation phenomena. This study is an introduction to developing the theory of spatial cross-correlation, and future geographical spatial analysis might benefit from these models and indexes. PMID:25993120
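
    A minimal numerical sketch of the quadratic-form idea described above: the function below computes a Moran-style global spatial cross-correlation coefficient for two variables observed on the same spatial units. The weight normalization and the toy contiguity matrix are illustrative assumptions, not necessarily Chen's exact estimator.

      import numpy as np

      def global_spatial_cross_correlation(x, y, W):
          """Moran-like global cross-correlation between two variables observed
          on the same n spatial units, given a spatial weights matrix W.
          Sketch of the quadratic-form idea only, not Chen's exact formula."""
          x = np.asarray(x, dtype=float)
          y = np.asarray(y, dtype=float)
          zx = (x - x.mean()) / x.std()
          zy = (y - y.mean()) / y.std()
          W = np.asarray(W, dtype=float)
          W = W / W.sum()              # normalize weights so they sum to 1
          return float(zx @ W @ zy)    # spatial quadratic form z_x' W z_y

      # toy example: 4 regions on a line, rook contiguity
      W = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], float)
      print(global_spatial_cross_correlation([1, 2, 3, 4], [2, 1, 4, 3], W))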

  6. Quantitative Motion Analysis of Tai Chi Chuan: The Upper Extremity Movement

    Directory of Open Access Journals (Sweden)

    Tsung-Jung Ho

    2018-01-01

    A quantitative and reproducible analysis of the standard body movement in Tai Chi Chuan (TCC) was performed in this study. We aimed to provide a reference of the upper extremities for standardizing TCC practice. Microsoft Kinect was used to record motion during the practice of TCC. The preparation form and eight essential forms of TCC performed by an instructor and 101 practitioners were analyzed in this study. The instructor completed an entire TCC practice cycle and performed the cycle 12 times. An entire cycle of TCC was performed by the practitioners and images were recorded for statistical analysis. The instructor's repeated performances showed high similarity to the first practice cycle (Pearson correlation coefficient r = 0.71~0.84). Among the 9 forms, the lay form had the highest similarity (rmean = 0.90) and the push form had the lowest similarity (rmean = 0.52). For the practitioners, the ward off form (rmean = 0.51) and the roll back form (rmean = 0.45) had the highest similarity, with moderate correlation. We used Microsoft Kinect to record the spatial coordinates of the upper extremity joints during the practice of TCC and used the data to perform quantitative and qualitative analysis of the joint positions and the elbow joint angle.
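
    The similarity scores above are Pearson correlations between recorded joint trajectories. The sketch below assumes a simple per-coordinate correlation of two pre-aligned (frames x 3) joint tracks; the authors' actual preprocessing (alignment, resampling) is not specified in the abstract.

      import numpy as np

      def trajectory_similarity(ref, test):
          """Mean Pearson correlation over x, y, z between two joint
          trajectories of shape (T, 3). A minimal stand-in for the
          similarity score; preprocessing details are assumptions."""
          ref, test = np.asarray(ref, float), np.asarray(test, float)
          rs = [np.corrcoef(ref[:, k], test[:, k])[0, 1] for k in range(3)]
          return float(np.mean(rs))

      # toy example: two noisy versions of the same elbow trajectory
      t = np.linspace(0, 2 * np.pi, 200)
      ref = np.c_[np.sin(t), np.cos(t), 0.1 * t]
      test = ref + np.random.default_rng(0).normal(0, 0.05, ref.shape)
      print(round(trajectory_similarity(ref, test), 3))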

  7. Spatial heterogeneity analysis of brain activation in fMRI

    Directory of Open Access Journals (Sweden)

    Lalit Gupta

    2014-01-01

    In many brain diseases it can be qualitatively observed that spatial patterns in blood oxygenation level dependent (BOLD) activation maps appear more diffusively distributed than in healthy controls. However, measures that can quantitatively characterize this spatial distributiveness in individual subjects are lacking. In this study, we propose a number of spatial heterogeneity measures to characterize brain activation maps. The proposed methods focus on different aspects of heterogeneity, including the shape of activated regions (compactness), the complexity of their distribution (fractal dimension and co-occurrence matrix), and the gappiness between activated regions (lacunarity). To this end, functional MRI derived activation maps of a language and a motor task were obtained in language-impaired children with Rolandic epilepsy and compared to age-matched healthy controls. Group analysis of the activation maps revealed no significant differences between patients and controls for both tasks. However, for the language task the activation maps in patients appeared more heterogeneous than in controls. Lacunarity was the best measure to discriminate activation patterns of patients from controls (sensitivity 74%, specificity 70%) and illustrates the increased irregularity of gaps between activated regions in patients. The combination of heterogeneity measures and a support vector machine approach yielded a further increase in sensitivity and specificity to 78% and 80%, respectively. This illustrates that activation distributions in impaired brains can be complex and more heterogeneous than in normal brains and cannot be captured fully by a single quantity. In conclusion, heterogeneity analysis has potential to robustly characterize the increased distributiveness of brain activation in individual patients.
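
    Of the measures listed, lacunarity is singled out as the most discriminative. A rough gliding-box sketch of lacunarity for a binary activation map is given below; the exact normalization and box-size range used in the paper are assumptions here.

      import numpy as np

      def lacunarity(binary_map, box):
          """Gliding-box lacunarity of a 2D binary activation map for one box
          size: Lambda(r) = var(mass)/mean(mass)**2 + 1, where 'mass' is the
          number of activated pixels in each r-by-r window. Sketch only."""
          a = np.asarray(binary_map, bool).astype(int)
          n0, n1 = a.shape
          masses = np.array([a[i:i + box, j:j + box].sum()
                             for i in range(n0 - box + 1)
                             for j in range(n1 - box + 1)], float)
          m = masses.mean()
          return float(masses.var() / m**2 + 1.0) if m > 0 else np.nan

      rng = np.random.default_rng(1)
      act = rng.random((64, 64)) > 0.8          # toy "activation" map
      print([round(lacunarity(act, r), 3) for r in (2, 4, 8)])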

  8. Combination and Integration of Qualitative and Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Philipp Mayring

    2001-02-01

    In this paper, I am going to outline ways of combining qualitative and quantitative steps of analysis on five levels. On the technical level, programs for the computer-aided analysis of qualitative data offer various combinations. Where the data are concerned, the employment of categories (for instance by using qualitative content analysis) allows for combining qualitative and quantitative forms of data analysis. On the individual level, the creation of types and the inductive generalisation of cases allow for proceeding from individual case material to quantitative generalisations. As for research design, different models can be distinguished (preliminary study, generalisation, elaboration, triangulation) which combine qualitative and quantitative steps of analysis. Where the logic of research is concerned, it can be shown that an extended process model which combines qualitative and quantitative research can be appropriate and thus lead to an integration of the two approaches. URN: urn:nbn:de:0114-fqs010162

  9. Quantitative spatial analysis of the mouse brain lipidome by pressurized liquid extraction surface analysis

    DEFF Research Database (Denmark)

    Almeida, Reinaldo; Berzina, Zane; Christensen, Eva Arnspang

    2015-01-01

    extracted directly from tissue sections. PLESA uses a sealed and pressurized sampling probe that enables the use of chloroform-containing extraction solvents for efficient in situ lipid microextraction with a spatial resolution of 400 μm. Quantification of lipid species is achieved by the inclusion...

  10. Evaluation of the Possibility of Applying Spatial 3D Imaging Using X-Ray Computed Tomography Reconstruction Methods for Quantitative Analysis of Multiphase Materials / Rentgenowska Analiza Ilościowa Materiałów Wielofazowych Z Wykorzystaniem Przestrzennego Obrazowania (3D Przy Użyciu Metod Rekonstrukcji Tomografii Komputerowej

    Directory of Open Access Journals (Sweden)

    Matysik P.

    2015-12-01

    In this paper the possibility of using X-ray computed tomography (CT) in quantitative metallographic studies of homogeneous and composite materials is presented. Samples of spheroidal cast iron, an Fe-Ti powder mixture compact and an epoxy composite reinforced with glass fibers were subjected to comparative structural tests. Volume fractions of each of the phase structure components were determined by conventional methods, using scanning electron microscopy (SEM) and X-ray diffraction (XRD) quantitative analysis methods. These results were compared with those obtained by the method of spatial analysis of the reconstructed CT image. Based on the comparative analysis, taking into account the selectivity of the data verification methods and the accuracy of the obtained results, the authors conclude that the computed tomography method is suitable for quantitative analysis of several types of structural materials.
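
    Once a CT volume has been segmented into phases, the volume fractions reduce to voxel counting. The sketch below illustrates only that counting step on a synthetic labeled volume; the segmentation itself, which is the hard part, is assumed to be done already.

      import numpy as np

      def phase_volume_fractions(labeled_volume):
          """Volume fraction of each phase in a segmented (labeled) 3D CT
          volume, estimated by voxel counting. Label 0 is treated as a
          phase like any other here."""
          labels, counts = np.unique(labeled_volume, return_counts=True)
          fractions = counts / labeled_volume.size
          return dict(zip(labels.tolist(), fractions.tolist()))

      # toy 3-phase volume: 0 = matrix, 1 = graphite nodules, 2 = porosity
      rng = np.random.default_rng(0)
      vol = rng.choice([0, 1, 2], size=(50, 50, 50), p=[0.85, 0.12, 0.03])
      print(phase_volume_fractions(vol))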

  11. Energy Dispersive Spectrometry and Quantitative Analysis Short Course. Introduction to X-ray Energy Dispersive Spectrometry and Quantitative Analysis

    Science.gov (United States)

    Carpenter, Paul; Curreri, Peter A. (Technical Monitor)

    2002-01-01

    This course will cover practical applications of the energy-dispersive spectrometer (EDS) to x-ray microanalysis. Topics covered will include detector technology, advances in pulse processing, resolution and performance monitoring, detector modeling, peak deconvolution and fitting, qualitative and quantitative analysis, compositional mapping, and standards. An emphasis will be placed on use of the EDS for quantitative analysis, with discussion of typical problems encountered in the analysis of a wide range of materials and sample geometries.

  12. Multispectral colour analysis for quantitative evaluation of pseudoisochromatic color deficiency tests

    Science.gov (United States)

    Ozolinsh, Maris; Fomins, Sergejs

    2010-11-01

    Multispectral color analysis was used for spectral scanning of Ishihara and Rabkin color deficiency test book images. It was done using tunable liquid-crystal (LC) filters built into the Nuance II analyzer. Multispectral analysis retains information on both the spatial and the spectral content of the tests. Images were taken in the range of 420-720 nm with a 10 nm step. We calculated retinal neural activity charts taking into account cone sensitivity functions, and processed the charts to find the visibility of latent symbols in the color deficiency plates using a cross-correlation technique. In this way a quantitative measure is found for each diagnostic plate for three different color deficiency carrier types: protanopes, deuteranopes and tritanopes. Multispectral color analysis also allows the CIE xyz color coordinates of pseudoisochromatic plate design elements to be determined and statistical analysis of these data to be performed to compare the color quality of available color deficiency test books.
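
    A rough sketch of the "retinal neural activity chart" step: projecting a multispectral image stack onto cone sensitivity functions. The Gaussian sensitivity curves in the code are crude placeholders for the real cone fundamentals used in the paper.

      import numpy as np

      def cone_response_maps(stack, wavelengths):
          """Project a multispectral stack (H, W, N_bands) onto approximate
          L, M, S cone sensitivity curves. The Gaussian curves below are
          placeholders, not the actual cone fundamentals."""
          peaks = {"L": 565.0, "M": 540.0, "S": 445.0}
          width = 35.0
          wl = np.asarray(wavelengths, float)
          maps = {}
          for name, peak in peaks.items():
              s = np.exp(-0.5 * ((wl - peak) / width) ** 2)   # placeholder sensitivity
              maps[name] = np.tensordot(stack, s, axes=([2], [0]))
          return maps

      wl = np.arange(420, 721, 10)                      # 420-720 nm, 10 nm step
      stack = np.random.default_rng(0).random((32, 32, wl.size))
      resp = cone_response_maps(stack, wl)
      print({k: v.shape for k, v in resp.items()})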

  13. Inspection, visualisation and analysis of quantitative proteomics data

    OpenAIRE

    Gatto, Laurent

    2016-01-01

    Material from the Quantitative Proteomics and Data Analysis Course, 4-5 April 2016, Queen Hotel, Chester, UK. Table D - Inspection, visualisation and analysis of quantitative proteomics data, Laurent Gatto (University of Cambridge).

  14. Quantitative Moessbauer analysis

    International Nuclear Information System (INIS)

    Collins, R.L.

    1978-01-01

    The quantitative analysis of Moessbauer data, as in the measurement of Fe³⁺/Fe²⁺ concentration, has not been possible because of the different mean square velocities ⟨x²⟩ of Moessbauer nuclei at chemically different sites. A method is now described which, based on Moessbauer data at several temperatures, permits the comparison of absorption areas at ⟨x²⟩ = 0. (Auth.)

  15. Progress in spatial analysis methods and applications

    CERN Document Server

    Páez, Antonio; Buliung, Ron N; Dall'erba, Sandy

    2010-01-01

    This book brings together developments in spatial analysis techniques, including spatial statistics, econometrics, and spatial visualization, and applications to fields such as regional studies, transportation and land use, population and health.

  16. Quantitative Data Analysis--In the Graduate Curriculum

    Science.gov (United States)

    Albers, Michael J.

    2017-01-01

    A quantitative research study collects numerical data that must be analyzed to help draw the study's conclusions. Teaching quantitative data analysis is not teaching number crunching, but teaching a way of critical thinking for how to analyze the data. The goal of data analysis is to reveal the underlying patterns, trends, and relationships of a…

  17. Quantitative fluorescence loss in photobleaching for analysis of protein transport and aggregation

    Directory of Open Access Journals (Sweden)

    Wüstner Daniel

    2012-11-01

    Background: Fluorescence loss in photobleaching (FLIP) is a widely used imaging technique, which provides information about protein dynamics in various cellular regions. In FLIP, a small cellular region is repeatedly illuminated by an intense laser pulse, while images are taken with reduced laser power with a time lag between the bleaches. Despite its popularity, tools are lacking for quantitative analysis of FLIP experiments. Typically, the user defines regions of interest (ROIs) for further analysis, which is subjective and does not allow for comparing different cells and experimental settings. Results: We present two complementary methods to detect and quantify protein transport and aggregation in living cells from FLIP image series. In the first approach, a stretched exponential (StrExp) function is fitted to the fluorescence loss (FL) inside and outside the bleached region. We show by reaction-diffusion simulations that the StrExp function can describe both binding/barrier-limited and diffusion-limited FL kinetics. By pixel-wise regression of that function to the FL kinetics of enhanced green fluorescent protein (eGFP), we determined in a user-unbiased manner from which cellular regions eGFP can be replenished in the bleached area. Spatial variation in the parameters calculated from the StrExp function allows for detecting diffusion barriers for eGFP in the nucleus and cytoplasm of living cells. Polyglutamine (polyQ) disease proteins like mutant huntingtin (mtHtt) can form large aggregates called inclusion bodies (IBs). The second method combines single particle tracking with multi-compartment modelling of FL kinetics in moving IBs to determine exchange rates of eGFP-tagged mtHtt protein (eGFP-mtHtt) between aggregates and the cytoplasm. This method is self-calibrating since it relates the FL inside and outside the bleached regions. It makes it therefore possible to compare release kinetics of eGFP-mtHtt between different cells and
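
    A minimal sketch of the first method, fitting a stretched exponential (StrExp) to a fluorescence-loss curve with SciPy; the synthetic data and starting parameters below are illustrative only, not values from the paper.

      import numpy as np
      from scipy.optimize import curve_fit

      def strexp(t, i0, tau, h):
          """Stretched exponential: I(t) = I0 * exp(-(t/tau)**h)."""
          return i0 * np.exp(-(t / tau) ** h)

      # synthetic FLIP-like fluorescence-loss curve (arbitrary units)
      t = np.linspace(0.1, 100, 80)
      rng = np.random.default_rng(2)
      data = strexp(t, 1.0, 25.0, 0.7) + rng.normal(0, 0.02, t.size)

      popt, _ = curve_fit(strexp, t, data, p0=(1.0, 10.0, 1.0),
                          bounds=([0, 1e-3, 0.1], [10, 1e3, 2.0]))
      print("I0=%.2f  tau=%.1f  h=%.2f" % tuple(popt))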

  18. Original methods of quantitative analysis developed for diverse samples in various research fields. Quantitative analysis at NMCC

    International Nuclear Information System (INIS)

    Sera, Koichiro

    2003-01-01

    Nishina Memorial Cyclotron Center (NMCC) has been opened for nationwide-common utilization of positron nuclear medicine (PET) and PIXE since April 1993. At the present time, nearly 40 subjects of PIXE in various research fields are pursued here, and more than 50,000 samples have been analyzed up to the present. In order to perform quantitative analyses of diverse samples, technical developments in sample preparation, measurement and data analysis have been continuously carried out. Especially, a 'standard-free method for quantitative analysis' made it possible to perform analysis of infinitesimal samples, powdered samples and untreated bio samples, which could not be well analyzed quantitatively in the past. The 'standard-free method' and a 'powdered internal standard method' made the process for target preparation much easier. It has been confirmed that results obtained by these methods show satisfactory accuracy and reproducibility, preventing any ambiguity coming from complicated target preparation processes. (author)

  19. Hierarchical modeling and analysis for spatial data

    CERN Document Server

    Banerjee, Sudipto; Gelfand, Alan E

    2003-01-01

    Among the many uses of hierarchical modeling, their application to the statistical analysis of spatial and spatio-temporal data from areas such as epidemiology and environmental science has proven particularly fruitful. Yet to date, the few books that address the subject have been either too narrowly focused on specific aspects of spatial analysis, or written at a level often inaccessible to those lacking a strong background in mathematical statistics. Hierarchical Modeling and Analysis for Spatial Data is the first accessible, self-contained treatment of hierarchical methods, modeling, and data analysis.

  20. Quantitative neuroanatomy of all Purkinje cells with light sheet microscopy and high-throughput image analysis

    Directory of Open Access Journals (Sweden)

    Ludovico eSilvestri

    2015-05-01

    Characterizing the cytoarchitecture of the mammalian central nervous system on a brain-wide scale is becoming a compelling need in neuroscience. For example, realistic modeling of brain activity requires the definition of quantitative features of large neuronal populations in the whole brain. Quantitative anatomical maps will also be crucial for classifying the cytoarchitectonic abnormalities associated with neuronal pathologies in a highly reproducible and reliable manner. In this paper, we apply recent advances in optical microscopy and image analysis to characterize the spatial distribution of Purkinje cells across the whole cerebellum. Light sheet microscopy was used to image, with micron-scale resolution, a fixed and cleared cerebellum of an L7-GFP transgenic mouse, in which all Purkinje cells are fluorescently labeled. A fast and scalable algorithm for fully automated cell identification was applied to the image to extract the positions of all the fluorescent Purkinje cells. This vectorized representation of the cell population allows a thorough characterization of the complex three-dimensional distribution of the neurons, highlighting the presence of gaps inside the lamellar organization of Purkinje cells, whose density is believed to play a significant role in autism spectrum disorders. Furthermore, clustering analysis of the localized somata permits dividing the whole cerebellum into groups of Purkinje cells with high spatial correlation, suggesting new possibilities of anatomical partitioning. The quantitative approach presented here can be extended to study the distribution of different cell types in many brain regions and across the whole encephalon, providing a robust base for building realistic computational models of the brain and for unbiased morphological tissue screening in the presence of pathologies and/or drug treatments.
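
    The clustering step can be illustrated with a density-based algorithm on a toy set of 3D soma coordinates. DBSCAN is used here purely as an illustrative stand-in; the abstract does not specify which clustering algorithm the authors actually employed.

      import numpy as np
      from sklearn.cluster import DBSCAN

      # Toy stand-in for the vectorized cell positions: an (N, 3) array of
      # soma coordinates in micrometres, drawn around three made-up centres.
      rng = np.random.default_rng(3)
      centres = np.array([[0, 0, 0], [300, 50, 0], [100, 400, 80]], float)
      somata = np.vstack([c + rng.normal(0, 30, (200, 3)) for c in centres])

      # density-based grouping of somata into spatially correlated clusters
      labels = DBSCAN(eps=40.0, min_samples=10).fit_predict(somata)
      print("clusters found:", len(set(labels) - {-1}),
            "noise points:", int((labels == -1).sum()))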

  1. Geographic information systems, remote sensing, and spatial analysis activities in Texas, 2008-09

    Science.gov (United States)

    ,

    2009-01-01

    Geographic information system (GIS) technology has become an important tool for scientific investigation, resource management, and environmental planning. A GIS is a computer-aided system capable of collecting, storing, analyzing, and displaying spatially referenced digital data. GIS technology is useful for analyzing a wide variety of spatial data. Remote sensing involves collecting remotely sensed data, such as satellite imagery, aerial photography, or radar images, and analyzing the data to gather information or investigate trends about the environment or the Earth's surface. Spatial analysis combines remotely sensed, thematic, statistical, quantitative, and geographical data through overlay, modeling, and other analytical techniques to investigate specific research questions. It is the combination of data formats and analysis techniques that has made GIS an essential tool in scientific investigations. This fact sheet presents information about the technical capabilities and project activities of the U.S. Geological Survey (USGS) Texas Water Science Center (TWSC) GIS Workgroup during 2008 and 2009. After a summary of GIS Workgroup capabilities, brief descriptions of activities by project at the local and national levels are presented. Projects are grouped by the fiscal year (October-September 2008 or 2009) the project ends and include overviews, project images, and Internet links to additional project information and related publications or articles.

  2. Quantitative analysis of untreated bio-samples

    International Nuclear Information System (INIS)

    Sera, K.; Futatsugawa, S.; Matsuda, K.

    1999-01-01

    A standard-free method of quantitative analysis for untreated samples has been developed. For hair samples, measurements were performed by irradiating a few hairs, as received, with a proton beam, and quantitative analysis was carried out by means of a standard-free method developed by the authors. First, quantitative values of the concentration of zinc were derived; then the concentrations of other elements were obtained by regarding zinc as an internal standard. As a result, the values of sulphur concentration for 40 samples agree well with the average value for a typical Japanese subject and with each other within 20%, and the validity of the present method could be confirmed. Accuracy was also confirmed by comparing the results with those obtained by the usual internal standard method. For the purpose of surface analysis of a bone sample, a very small incidence angle of the proton beam was used, so that both the energy loss of the projectile and the self-absorption of X-rays become negligible. As a result, consistent values of concentration for many elements were obtained by the standard-free method
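
    The zinc internal-standard step reduces to a simple ratio once the zinc concentration and the relative sensitivities are known. The sketch below shows only that generic relation; the NMCC standard-free procedure that supplies the zinc value itself is not reproduced, and all numbers are illustrative.

      def concentration_by_internal_standard(yield_x, yield_zn, sens_x, sens_zn,
                                             conc_zn):
          """Concentration of element X relative to a zinc internal standard:
          C_X = C_Zn * (Y_X / Y_Zn) * (S_Zn / S_X),
          where Y are measured X-ray yields and S are relative sensitivities
          (detection efficiency times production cross-section)."""
          return conc_zn * (yield_x / yield_zn) * (sens_zn / sens_x)

      # illustrative numbers only (yields in counts, concentration in ug/g)
      print(concentration_by_internal_standard(yield_x=1500, yield_zn=5200,
                                               sens_x=0.8, sens_zn=1.0,
                                               conc_zn=150.0))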

  3. Quantitative analysis of boron by neutron radiography

    International Nuclear Information System (INIS)

    Bayuelken, A.; Boeck, H.; Schachner, H.; Buchberger, T.

    1990-01-01

    The quantitative determination of boron in ores by chemical analysis techniques is a lengthy process. As nuclear techniques like X-ray fluorescence and activation analysis are not applicable to boron, only the neutron radiography technique, which exploits the high neutron absorption cross section of this element, can be applied for quantitative determinations. This paper describes preliminary tests and calibration experiments carried out at a 250 kW TRIGA reactor. (orig.) [de
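
    The quantitation rests on boron's large neutron absorption cross section, i.e. on Beer-Lambert attenuation of the neutron beam. A minimal sketch, assuming a pure 10B absorber and ignoring matrix attenuation and beam hardening; the transmission value is made up.

      import numpy as np

      # Neutron transmission through a boron-bearing layer follows
      # T = I/I0 = exp(-sigma * N_B * t), so the boron areal density is
      # N_B * t = -ln(T) / sigma.
      SIGMA_B10 = 3840e-24          # thermal absorption cross-section of 10B, cm^2

      def boron_areal_density(transmission, sigma=SIGMA_B10):
          """Atoms per cm^2 of the absorbing isotope from measured transmission."""
          return -np.log(transmission) / sigma

      print("%.3e atoms/cm^2" % boron_areal_density(0.85))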

  4. Quantitative analysis of drug distribution by ambient mass spectrometry imaging method with signal extinction normalization strategy and inkjet-printing technology.

    Science.gov (United States)

    Luo, Zhigang; He, Jingjing; He, Jiuming; Huang, Lan; Song, Xiaowei; Li, Xin; Abliz, Zeper

    2018-03-01

    Quantitative mass spectrometry imaging (MSI) is a robust approach that provides both quantitative and spatial information for drug candidate research. However, because of complicated signal suppression and interference, acquiring accurate quantitative information from MSI data remains a challenge, especially for whole-body tissue samples. Ambient MSI techniques using spray-based ionization appear to be ideal for pharmaceutical quantitative MSI analysis. However, this is more challenging, as it involves almost no sample preparation and is more susceptible to ion suppression/enhancement. Herein, based on our previously developed air flow-assisted desorption electrospray ionization (AFADESI)-MSI technology, an ambient quantitative MSI method was introduced by integrating inkjet-printing technology with normalization of the signal extinction coefficient (SEC) using the target compound itself. The method uses a single calibration curve to quantify multiple tissue types. Basic blue 7 and an antitumor drug candidate (S-(+)-deoxytylophorinidine, CAT) were chosen to initially validate the feasibility and reliability of the quantitative MSI method. Rat tissue sections (heart, kidney, and brain) from animals administered CAT were then analyzed. The quantitative MSI results were cross-validated against LC-MS/MS data from the same tissues. The consistency suggests that the approach can rapidly obtain quantitative MSI data without introducing interference into the in-situ environment of the tissue sample, and has the potential to provide a high-throughput, economical and reliable approach for drug discovery and development. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Spatial structure of neuropeptide allatostatin-4

    International Nuclear Information System (INIS)

    Veliyeva, L.I.; Aliyev, E.Z.

    2011-01-01

    The spatial structure of the neuropeptide allatostatin-4, which belongs to the allatostatin family, was determined by the method of conformational analysis. On the basis of the calculated intramolecular conformational energies, a quantitative assessment was made of the stability of the molecule's possible conformational states in a dipolar medium.

  6. Quantitative image analysis of synovial tissue

    NARCIS (Netherlands)

    van der Hall, Pascal O.; Kraan, Maarten C.; Tak, Paul Peter

    2007-01-01

    Quantitative image analysis is a form of imaging that includes microscopic histological quantification, video microscopy, image analysis, and image processing. Hallmarks are the generation of reliable, reproducible, and efficient measurements via strict calibration and step-by-step control of the

  7. Spatial analysis of various multiplex cinema types

    Directory of Open Access Journals (Sweden)

    Young-Seo Park

    2016-03-01

    This study identifies the spatial characteristics and relationships of each used space according to the multiplex type. Multiplexes are classified according to their screen rooms and circulation systems, and each used space is quantitatively analyzed. The multiplex type based on screen rooms and circulation systems influences the relationship and characteristics of each used space in various ways. In particular, the structure of the used space of multiplexes has a significant effect on profit generation and audience convenience.

  8. Quantitative Auger analysis of Nb-Ge superconducting alloys

    International Nuclear Information System (INIS)

    Buitrago, R.H.

    1980-01-01

    The feasibility of using Auger electron analysis for quantitative analysis was investigated by studying Nb3Ge thin-film Auger data with different approaches. A method based on elemental standards gave quantitative values consistent with reported Nb-Ge data. Alloy sputter yields were also calculated, and the results were consistent with those for the pure elements

  9. A method for the quantitative metallographic analysis of nuclear fuels (Programme QMA)

    International Nuclear Information System (INIS)

    Moreno, A.; Sari, C.

    1978-01-01

    A method is described for the quantitative analysis of features such as voids, cracks, phases, inclusions and grains distributed on random plane sections of fuel materials. An electronic image analyzer, Quantimet, attached to a MM6 Leitz microscope was used to measure size, area, perimeter and shape of features dispersed in a matrix. The apparatus is driven by a computer which calculates the size, area and perimeter distribution, form factors and orientation of the features as well as the inclusion content of the matrix expressed in weight per cent. A computer programme, QMA, executes the spatial correction of the measured two-dimensional sections and delivers the true distribution of feature sizes in a three-dimensional system

  10. [Quantitative data analysis for live imaging of bone].

    Science.gov (United States)

    Seno, Shigeto

    Because bone is a hard tissue, it has been difficult to observe the interior of bone tissue in the living state. With the progress of microscopy and fluorescent probe technology in recent years, it has become possible to observe the various activities of the cells that constitute bone. On the other hand, the quantitative increase in data and the diversification and complexity of the images make it difficult to perform quantitative analysis by visual inspection. The development of a methodology for processing microscopic images and analyzing the data has therefore been awaited. In this article, we introduce the research field of bioimage informatics, which lies at the boundary between biology and information science, and then outline basic image processing technology for quantitative analysis of live imaging data of bone.

  11. Growth of solid domains in model membranes: quantitative image analysis reveals a strong correlation between domain shape and spatial position

    DEFF Research Database (Denmark)

    Bernchou, Uffe; Ipsen, John Hjort; Simonsen, Adam Cohen

    2009-01-01

    To analyze this effect, the nucleation points were used as generators in a Voronoi construction. Associated with each generator is a Voronoi polygon that contains all points closer to this generator than to any other. Through a detailed quantitative analysis of the Voronoi cells and the domains, we have

  12. Quantitative phase analysis by neutron diffraction

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Chang Hee; Song, Su Ho; Lee, Jin Ho; Shim, Hae Seop [Korea Atomic Energy Research Institute, Taejon (Korea)]

    1999-06-01

    This study applies quantitative phase analysis (QPA) by neutron diffraction to the round robin samples provided by the International Union of Crystallography (IUCr). We measured neutron diffraction patterns for mixed samples which have several different weight percentages and their unique characteristic features. The neutron diffraction method has been known to be superior to complementary methods such as X-ray or synchrotron diffraction, but it is still accepted as highly reliable only under limited conditions or for limited samples. Neutron diffraction has strong capability especially for oxides, owing to the scattering cross-section of oxygen, and it can become an even stronger tool for the analysis of industrial materials with these quantitative phase analysis techniques. By doing this study, we hope not only to perform one of the instrument performance tests on our HRPD but also to improve our ability in the analysis of neutron diffraction data by comparing our QPA results with others from advanced reactor facilities. 14 refs., 4 figs., 6 tabs. (Author)
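
    Quantitative phase analysis from refined powder diffraction data is commonly carried out with the Hill-Howard scale-factor relation. The sketch below assumes that relation and uses made-up numbers, since the abstract does not report the actual refinement results.

      def rietveld_weight_fractions(scale, Z, M, V):
          """Hill-Howard relation used in Rietveld quantitative phase analysis:
          W_i = S_i (Z M V)_i / sum_j S_j (Z M V)_j,
          with S the refined scale factor, Z formula units per cell, M the
          formula mass and V the unit-cell volume of each phase."""
          zmv = [s * z * m * v for s, z, m, v in zip(scale, Z, M, V)]
          total = sum(zmv)
          return [x / total for x in zmv]

      # two-phase toy mixture; all values are illustrative only
      print(rietveld_weight_fractions(scale=[1.2e-4, 3.4e-4],
                                      Z=[6, 4], M=[101.96, 78.07],
                                      V=[254.8, 163.0]))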

  13. Spatial Econometric data analysis: moving beyond traditional models

    NARCIS (Netherlands)

    Florax, R.J.G.M.; Vlist, van der A.J.

    2003-01-01

    This article appraises recent advances in the spatial econometric literature. It serves as the introduction to a collection of new papers on spatial econometric data analysis brought together in this special issue, dealing specifically with new extensions to the spatial econometric modeling

  14. Variability of apparently homogeneous soilscapes in São Paulo state, Brazil: I. spatial analysis

    Directory of Open Access Journals (Sweden)

    M. van Den Berg

    2000-06-01

    The spatial variability of strongly weathered soils under sugarcane and soybean/wheat rotation was quantitatively assessed on 33 fields in two regions of São Paulo State, Brazil: Araras (15 fields with sugarcane) and Assis (11 fields with sugarcane and seven fields with soybean/wheat rotation). The statistical methods used were: nested analysis of variance (for 11 fields), semivariance analysis, and analysis of variance within and between fields. Spatial levels from 50 m to several km were analyzed. Results are discussed with reference to a previously published study carried out in the surroundings of Passo Fundo (RS). Similar variability patterns were found for clay content, organic C content and cation exchange capacity. The fields studied are quite homogeneous with respect to these relatively stable soil characteristics. The spatial variability of other characteristics (resin-extractable P, pH, base and Al saturation, and also soil colour) varies with region and/or land use management. Soil management for sugarcane seems to have induced modifications to greater depths than for soybean/wheat rotation. Surface layers of soils under soybean/wheat present relatively little variation, apparently as a result of very intensive soil management. The major part of within-field variation occurs at short distances (< 50 m) in all study areas. Hence, little extra information would be gained by increasing the sampling density from, say, 1/km² to 1/50 m². For many purposes, the soils in the study regions can be mapped with the same observation density, but the residual variance will not be the same in all areas. Bulk sampling may help to reveal spatial patterns between 50 and 1,000 m.
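
    Semivariance analysis, one of the methods listed, starts from the classical empirical semivariogram. A minimal sketch using the Matheron estimator on synthetic sample points; the binning and the toy clay-content field are illustrative assumptions.

      import numpy as np

      def empirical_semivariogram(coords, values, lags):
          """Classical (Matheron) estimator:
          gamma(h) = 1/(2 N(h)) * sum (z_i - z_j)^2 over pairs separated by
          distances in each lag bin. Isotropy is assumed for simplicity."""
          coords = np.asarray(coords, float)
          z = np.asarray(values, float)
          d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
          iu = np.triu_indices(len(z), k=1)
          dist, sq = d[iu], (z[:, None] - z[None, :])[iu] ** 2
          gammas = []
          for lo, hi in zip(lags[:-1], lags[1:]):
              sel = (dist >= lo) & (dist < hi)
              gammas.append(0.5 * sq[sel].mean() if sel.any() else np.nan)
          return gammas

      rng = np.random.default_rng(4)
      pts = rng.uniform(0, 1000, (150, 2))                  # sample locations, m
      clay = 30 + 0.01 * pts[:, 0] + rng.normal(0, 2, 150)  # toy clay content, %
      print(np.round(empirical_semivariogram(pts, clay, np.arange(0, 600, 100)), 2))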

  15. Quantitative reconstruction from a single diffraction-enhanced image

    International Nuclear Information System (INIS)

    Paganin, D.M.; Lewis, R.A.; Kitchen, M.

    2003-01-01

    We develop an algorithm for using a single diffraction-enhanced image (DEI) to obtain a quantitative reconstruction of the projected thickness of a single-material sample which is embedded within a substrate of approximately constant thickness. This algorithm is used to quantitatively map inclusions in a breast phantom, from a single synchrotron DEI image. In particular, the reconstructed images quantitatively represent the projected thickness in the bulk of the sample, in contrast to DEI images which greatly emphasise sharp edges (high spatial frequencies). In the context of an ultimate aim of improved methods for breast cancer detection, the reconstructions are potentially of greater diagnostic value compared to the DEI data. Lastly, we point out that the methods of analysis presented here are also applicable to the quantitative analysis of differential interference contrast (DIC) images

  16. Nanoscale nuclear architecture for cancer diagnosis by spatial-domain low-coherence quantitative phase microscopy

    Science.gov (United States)

    Wang, Pin; Bista, Rajan K.; Khalbuss, Walid E.; Qiu, Wei; Staton, Kevin D.; Zhang, Lin; Brentnall, Teresa A.; Brand, Randall E.; Liu, Yang

    2011-03-01

    Alterations in nuclear architecture are the hallmark diagnostic characteristic of cancer cells. In this work, we show that the nuclear architectural characteristics quantified by spatial-domain low-coherence quantitative phase microscopy (SL-QPM) are more sensitive for the identification of cancer cells than conventional cytopathology. We demonstrated the importance of nuclear architectural characteristics both in an animal model of intestinal carcinogenesis, the APC/Min mouse, and in human cytology specimens with colorectal cancer, by identifying cancer in cytologically noncancerous-appearing cells. The determination of nanoscale nuclear architecture using this simple and practical optical instrument is a significant advance towards cancer diagnosis.

  17. Quantitative standard-less XRF analysis

    International Nuclear Information System (INIS)

    Ulitzka, S.

    2002-01-01

    For most analytical tasks in the mining and associated industries matrix-matched calibrations are used for the monitoring of ore grades and process control. In general, such calibrations are product specific (iron ore, bauxite, alumina, mineral sands, cement etc.) and apply to a relatively narrow concentration range but give the best precision and accuracy for those materials. A wide range of CRMs is available and for less common materials synthetic standards can be made up from 'pure' chemicals. At times, analysis of materials with varying matrices (powders, scales, fly ash, alloys, polymers, liquors, etc.) and diverse physical shapes (non-flat, metal drillings, thin layers on substrates etc.) is required that could also contain elements which are not part of a specific calibration. A qualitative analysis can provide information about the presence of certain elements and the relative intensities of element peaks in a scan can give a rough idea about their concentrations. More often however, quantitative values are required. The paper will look into the basics of quantitative standardless analysis and show results for some well-defined CRMs. Copyright (2002) Australian X-ray Analytical Association Inc

  18. Quantitative analysis of oxygen depth distribution by means of deuteron reaction

    International Nuclear Information System (INIS)

    Dyumin, A.N.; Eremin, V.K.; Konnikov, S.G.

    1993-01-01

    The possibilities of using the deuteron reaction for quantitative determination of oxygen depth distribution profiles in HTSC structures, in layers up to 10⁴ Å, are investigated and realized experimentally. It is concluded that, when profiling the oxygen content in the near-surface layers, a spatial resolution of 150 Å is achieved.

  19. Model Interpretation of Topological Spatial Analysis for the Visually Impaired (Blind) Implemented in Google Maps

    Directory of Open Access Journals (Sweden)

    Marcelo Franco Porto

    2013-06-01

    Technological innovations promote the availability of geographic information on the Internet through Web GIS such as Google Earth and Google Maps. These systems contribute to the teaching and diffusion of geographical knowledge, which encourages recognition of the space we live in and leads to the creation of a spatial identity. In these products available on the Web, the interpretation and analysis of spatial information gives priority to one of the human senses: vision. Because this representation of information is transmitted visually (images and vectors), a portion of the population is excluded from part of this knowledge, since categories of analysis of geographic data such as borders, territory, and space can only be understood by people who can see. This paper deals with the development of a model of interpretation of topological spatial analysis based on the synthesis of voice and sounds that can be used by the visually impaired (blind). The implementation of a prototype in Google Maps and the usability tests performed are also examined. For the development work it was necessary to define the model of topological spatial analysis, focusing on computational implementation, which allows users to interpret the spatial relationships of regions (countries, states and municipalities), recognizing their limits, neighborhoods and extension, beyond their own spatial relationships. With this goal in mind, several interface and usability guidelines were drawn up for use by the visually impaired (blind). We conducted a detailed study of the Google Maps API (Application Programming Interface), which was the environment selected for prototype development, and studied the information available to the users of that system. The prototype implementing the proposed model was developed based on the synthesis of voice and sounds, in the C# language and the .NET environment. To measure the efficiency and effectiveness of the prototype, usability

  20. Spatial and temporal epidemiological analysis in the Big Data era.

    Science.gov (United States)

    Pfeiffer, Dirk U; Stevens, Kim B

    2015-11-01

    Concurrent with global economic development in the last 50 years, the opportunities for the spread of existing diseases and emergence of new infectious pathogens have increased substantially. The activities associated with the enormously intensified global connectivity have resulted in large amounts of data being generated, which in turn provides opportunities for generating knowledge that will allow more effective management of animal and human health risks. This so-called Big Data has, more recently, been accompanied by the Internet of Things which highlights the increasing presence of a wide range of sensors, interconnected via the Internet. Analysis of this data needs to exploit its complexity, accommodate variation in data quality and take advantage of its spatial and temporal dimensions, where available. Apart from the development of hardware technologies and networking/communication infrastructure, it is necessary to develop appropriate data management tools that make this data accessible for analysis. This includes relational databases, geographical information systems and most recently, cloud-based data storage such as Hadoop distributed file systems. While the development in analytical methodologies has not quite caught up with the data deluge, important advances have been made in a number of areas, including spatial and temporal data analysis where the spectrum of analytical methods ranges from visualisation and exploratory analysis, to modelling. While there used to be a primary focus on statistical science in terms of methodological development for data analysis, the newly emerged discipline of data science is a reflection of the challenges presented by the need to integrate diverse data sources and exploit them using novel data- and knowledge-driven modelling methods while simultaneously recognising the value of quantitative as well as qualitative analytical approaches. Machine learning regression methods, which are more robust and can handle

  1. Reactor applications of quantitative diffraction analysis

    International Nuclear Information System (INIS)

    Feguson, I.F.

    1976-09-01

    Current work in quantitative diffraction analysis was presented under the main headings of: thermal systems, fast reactor systems, SGHWR applications and irradiation damage. Preliminary results are included on a comparison of various new instrumental methods of boron analysis as well as preliminary new results on Zircaloy corrosion, and materials transfer in liquid sodium. (author)

  2. Spatially explicit multi-criteria decision analysis for managing vector-borne diseases

    Science.gov (United States)

    2011-01-01

    The complex epidemiology of vector-borne diseases creates significant challenges in the design and delivery of prevention and control strategies, especially in light of rapid social and environmental changes. Spatial models for predicting disease risk based on environmental factors such as climate and landscape have been developed for a number of important vector-borne diseases. The resulting risk maps have proven value for highlighting areas for targeting public health programs. However, these methods generally only offer technical information on the spatial distribution of disease risk itself, which may be incomplete for making decisions in a complex situation. In prioritizing surveillance and intervention strategies, decision-makers often also need to consider spatially explicit information on other important dimensions, such as the regional specificity of public acceptance, population vulnerability, resource availability, intervention effectiveness, and land use. There is a need for a unified strategy for supporting public health decision making that integrates available data for assessing spatially explicit disease risk, with other criteria, to implement effective prevention and control strategies. Multi-criteria decision analysis (MCDA) is a decision support tool that allows for the consideration of diverse quantitative and qualitative criteria using both data-driven and qualitative indicators for evaluating alternative strategies with transparency and stakeholder participation. Here we propose a MCDA-based approach to the development of geospatial models and spatially explicit decision support tools for the management of vector-borne diseases. We describe the conceptual framework that MCDA offers as well as technical considerations, approaches to implementation and expected outcomes. We conclude that MCDA is a powerful tool that offers tremendous potential for use in public health decision-making in general and vector-borne disease management in particular

  3. Spatially explicit multi-criteria decision analysis for managing vector-borne diseases

    Directory of Open Access Journals (Sweden)

    Hongoh Valerie

    2011-12-01

    The complex epidemiology of vector-borne diseases creates significant challenges in the design and delivery of prevention and control strategies, especially in light of rapid social and environmental changes. Spatial models for predicting disease risk based on environmental factors such as climate and landscape have been developed for a number of important vector-borne diseases. The resulting risk maps have proven value for highlighting areas for targeting public health programs. However, these methods generally only offer technical information on the spatial distribution of disease risk itself, which may be incomplete for making decisions in a complex situation. In prioritizing surveillance and intervention strategies, decision-makers often also need to consider spatially explicit information on other important dimensions, such as the regional specificity of public acceptance, population vulnerability, resource availability, intervention effectiveness, and land use. There is a need for a unified strategy for supporting public health decision making that integrates available data for assessing spatially explicit disease risk, with other criteria, to implement effective prevention and control strategies. Multi-criteria decision analysis (MCDA) is a decision support tool that allows for the consideration of diverse quantitative and qualitative criteria using both data-driven and qualitative indicators for evaluating alternative strategies with transparency and stakeholder participation. Here we propose a MCDA-based approach to the development of geospatial models and spatially explicit decision support tools for the management of vector-borne diseases. We describe the conceptual framework that MCDA offers as well as technical considerations, approaches to implementation and expected outcomes. We conclude that MCDA is a powerful tool that offers tremendous potential for use in public health decision-making in general and vector
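
    One common MCDA aggregation rule is a weighted linear combination of standardized criterion rasters. The sketch below assumes that rule with made-up layers and weights; the framework proposed in the abstract is broader, covering stakeholder-derived weights and other aggregation rules.

      import numpy as np

      def weighted_linear_combination(criteria, weights):
          """Simple MCDA weighted overlay: each criterion is a raster scaled
          to 0-1 (1 = most favourable for intervention); weights are
          normalized to sum to 1."""
          criteria = [np.asarray(c, float) for c in criteria]
          w = np.asarray(weights, float)
          w = w / w.sum()
          return sum(wi * ci for wi, ci in zip(w, criteria))

      # toy 0-1 rasters: disease risk, population vulnerability, acceptance
      rng = np.random.default_rng(5)
      risk, vuln, accept = (rng.random((40, 40)) for _ in range(3))
      priority = weighted_linear_combination([risk, vuln, accept], [0.5, 0.3, 0.2])
      print(priority.shape, float(priority.min()), float(priority.max()))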

  4. Evaluating Motion. Spatial User Behavior in Virtual Environments

    DEFF Research Database (Denmark)

    Drachen, Anders; Canossa, Alessandro

    2011-01-01

    User-behaviour analysis has only recently been adapted to the context of the virtual world domain and remains limited in its application. Behaviour analysis is based on instrumentation data: automated, detailed, quantitative information about user behaviour within the virtual environment (VE) of digital games. A key advantage of the method in comparison with existing user-research methods, such as usability- and playability-testing, is that it permits very large sample sizes. Furthermore, games are in the vast majority of cases based on spatial VEs within which the players operate and through which they experience the games. Therefore, spatial behaviour analyses are useful to game research and design. In this paper, spatial analysis methods are introduced and arguments posed for their use in user-behaviour analysis. Case studies involving data from thousands of players are used to exemplify

  5. Spatially variable stage-driven groundwater-surface water interaction inferred from time-frequency analysis of distributed temperature sensing data

    Science.gov (United States)

    Mwakanyamale, Kisa; Slater, Lee; Day-Lewis, Frederick D.; Elwaseif, Mehrez; Johnson, Carole D.

    2012-01-01

    Characterization of groundwater-surface water exchange is essential for improving understanding of contaminant transport between aquifers and rivers. Fiber-optic distributed temperature sensing (FODTS) provides rich spatiotemporal datasets for quantitative and qualitative analysis of groundwater-surface water exchange. We demonstrate how time-frequency analysis of FODTS and synchronous river stage time series from the Columbia River adjacent to the Hanford 300-Area, Richland, Washington, provides spatial information on the strength of stage-driven exchange of uranium contaminated groundwater in response to subsurface heterogeneity. Although used in previous studies, the stage-temperature correlation coefficient proved an unreliable indicator of the stage-driven forcing on groundwater discharge in the presence of other factors influencing river water temperature. In contrast, S-transform analysis of the stage and FODTS data definitively identifies the spatial distribution of discharge zones and provided information on the dominant forcing periods (≥2 d) of the complex dam operations driving stage fluctuations and hence groundwater-surface water exchange at the 300-Area.

  6. Quantitative analysis of myocardial tissue with digital autofluorescence microscopy

    Directory of Open Access Journals (Sweden)

    Thomas Jensen

    2016-01-01

    Background: The opportunity for automated histological analysis offered by whole slide scanners implies an ever increasing importance of digital pathology. To go beyond conventional pathology, however, digital pathology may need a basic histological starting point similar to that of hematoxylin and eosin staining in conventional pathology. This study presents an automated fluorescence-based microscopy approach providing highly detailed morphological data from unstained microsections. These data may provide a basic histological starting point from which further digital analysis, including staining, may benefit. Methods: This study explores the inherent tissue fluorescence, also known as autofluorescence, as a means to quantitate cardiac tissue components in histological microsections. Data acquisition using a commercially available whole slide scanner and an image-based quantitation algorithm are presented. Results: It is shown that the autofluorescence intensity of unstained microsections at two different wavelengths is a suitable starting point for automated digital analysis of myocytes, fibrous tissue, lipofuscin, and the extracellular compartment. The output of the method is absolute quantitation along with accurate outlines of the above-mentioned components. The digital quantitations are verified by comparison to point grid quantitations performed on the microsections after Van Gieson staining. Conclusion: The presented method is aptly described as a prestain multicomponent quantitation and outlining tool for histological sections of cardiac tissue. The main perspective is the opportunity for combination with digital analysis of stained microsections, for which the method may provide an accurate digital framework.

  7. Quantitative image analysis of cellular heterogeneity in breast tumors complements genomic profiling.

    Science.gov (United States)

    Yuan, Yinyin; Failmezger, Henrik; Rueda, Oscar M; Ali, H Raza; Gräf, Stefan; Chin, Suet-Feung; Schwarz, Roland F; Curtis, Christina; Dunning, Mark J; Bardwell, Helen; Johnson, Nicola; Doyle, Sarah; Turashvili, Gulisa; Provenzano, Elena; Aparicio, Sam; Caldas, Carlos; Markowetz, Florian

    2012-10-24

    Solid tumors are heterogeneous tissues composed of a mixture of cancer and normal cells, which complicates the interpretation of their molecular profiles. Furthermore, tissue architecture is generally not reflected in molecular assays, rendering this rich information underused. To address these challenges, we developed a computational approach based on standard hematoxylin and eosin-stained tissue sections and demonstrated its power in a discovery and validation cohort of 323 and 241 breast tumors, respectively. To deconvolute cellular heterogeneity and detect subtle genomic aberrations, we introduced an algorithm based on tumor cellularity to increase the comparability of copy number profiles between samples. We next devised a predictor for survival in estrogen receptor-negative breast cancer that integrated both image-based and gene expression analyses and significantly outperformed classifiers that use single data types, such as microarray expression signatures. Image processing also allowed us to describe and validate an independent prognostic factor based on quantitative analysis of spatial patterns between stromal cells, which are not detectable by molecular assays. Our quantitative, image-based method could benefit any large-scale cancer study by refining and complementing molecular assays of tumor samples.

  8. Quantitative tradeoffs between spatial, temporal, and thermometric resolution of nonresonant Raman thermometry for dynamic experiments.

    Science.gov (United States)

    McGrane, Shawn D; Moore, David S; Goodwin, Peter M; Dattelbaum, Dana M

    2014-01-01

    The ratio of Stokes to anti-Stokes nonresonant spontaneous Raman can provide an in situ thermometer that is noncontact, independent of any material specific parameters or calibrations, can be multiplexed spatially with line imaging, and can be time resolved for dynamic measurements. However, spontaneous Raman cross sections are very small, and thermometric measurements are often limited by the amount of laser energy that can be applied without damaging the sample or changing its temperature appreciably. In this paper, we quantitatively detail the tradeoff space between spatial, temporal, and thermometric accuracy measurable with spontaneous Raman. Theoretical estimates are pinned to experimental measurements to form realistic expectations of the resolution tradeoffs appropriate to various experiments. We consider the effects of signal to noise, collection efficiency, laser heating, pulsed laser ablation, and blackbody emission as limiting factors, provide formulae to help choose optimal conditions and provide estimates relevant to planning experiments along with concrete examples for single-shot measurements.
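
    The thermometer itself is the Boltzmann factor in the anti-Stokes/Stokes ratio. A minimal sketch of inverting that ratio for temperature follows; the exponent on the frequency prefactor (3 vs. 4) depends on the detection convention, so it is left as a parameter, and the example numbers are illustrative.

      import numpy as np

      H = 6.62607015e-34      # Planck constant, J s
      C = 2.99792458e10       # speed of light in cm/s, so wavenumbers stay in cm^-1
      KB = 1.380649e-23       # Boltzmann constant, J/K

      def raman_temperature(ratio_as_s, nu_laser_cm, nu_mode_cm, exponent=4):
          """Temperature from the anti-Stokes/Stokes intensity ratio,
          R = ((nu0+num)/(nu0-num))**p * exp(-h c num / (kB T)).
          p = 4 is a common convention for intensity detection; p = 3 is also
          used for photon counting."""
          prefactor = ((nu_laser_cm + nu_mode_cm) /
                       (nu_laser_cm - nu_mode_cm)) ** exponent
          return H * C * nu_mode_cm / (KB * np.log(prefactor / ratio_as_s))

      # example: 532 nm laser (~18797 cm^-1), 1000 cm^-1 mode, measured R = 0.012
      print("%.0f K" % raman_temperature(0.012, 18797.0, 1000.0))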

  9. Spatial analysis of the electrical energy demand in Greece

    International Nuclear Information System (INIS)

    Tyralis, Hristos; Mamassis, Nikos; Photis, Yorgos N.

    2017-01-01

    The Electrical Energy Demand (EED) of the agricultural, commercial and industrial sector in Greece, as well as its use for domestic activities, public and municipal authorities and street lighting are analysed spatially using Geographical Information System and spatial statistical methods. The analysis is performed on data which span from 2008 to 2012 and have annual temporal resolution and spatial resolution down to the NUTS (Nomenclature of Territorial Units for Statistics) level 3. The aim is to identify spatial patterns of the EED and its transformations such as the ratios of the EED to socioeconomic variables, i.e. the population, the total area, the population density and the Gross Domestic Product (GDP). Based on the analysis, Greece is divided in five regions, each one with a different development model, i.e. Attica and Thessaloniki which are two heavily populated major poles, Thessaly and Central Greece which form a connected geographical region with important agricultural and industrial sector, the islands and some coastal areas which are characterized by an important commercial sector and the rest Greek areas. The spatial patterns can provide additional information for policy decision about the electrical energy management and better representation of the regional socioeconomic conditions. - Highlights: • We visualize spatially the Electrical Energy Demand (EED) in Greece. • We apply spatial analysis methods to the EED data. • Spatial patterns of the EED are identified. • Greece is classified in five distinct groups, based on the analysis. • The results can be used for optimal planning of the electric system.

  10. Automated Spatial Brain Normalization and Hindbrain White Matter Reference Tissue Give Improved [(18)F]-Florbetaben PET Quantitation in Alzheimer's Model Mice.

    Science.gov (United States)

    Overhoff, Felix; Brendel, Matthias; Jaworska, Anna; Korzhova, Viktoria; Delker, Andreas; Probst, Federico; Focke, Carola; Gildehaus, Franz-Josef; Carlsen, Janette; Baumann, Karlheinz; Haass, Christian; Bartenstein, Peter; Herms, Jochen; Rominger, Axel

    2016-01-01

    Preclinical PET studies of β-amyloid (Aβ) accumulation are of growing importance, but comparisons between research sites require standardized and optimized methods for quantitation. Therefore, we aimed to evaluate systematically (1) the impact of an automated algorithm for spatial brain normalization and (2) intensity scaling methods based on different reference regions for Aβ-PET in a large dataset of transgenic mice. PS2APP mice in a 6-week longitudinal setting (N = 37) and another set of PS2APP mice at a histologically assessed narrow range of Aβ burden (N = 40) were investigated by [(18)F]-florbetaben PET. Manual spatial normalization by three readers at different training levels was performed prior to application of an automated brain spatial normalization, and inter-reader agreement was assessed by Fleiss' kappa (κ). For this method, the impact of templates at different pathology stages was investigated. Four different reference regions for brain uptake normalization were used to calculate frontal cortical standardized uptake value ratios (SUVRCTX∕REF), relative to raw SUVCTX. Results were compared on the basis of longitudinal stability (Cohen's d) and in reference to gold-standard histopathological quantitation (Pearson's R). Application of the automated brain spatial normalization resulted in nearly perfect agreement (all κ≥0.99) between different readers, with constant or improved correlation with histology. Templates based on an inappropriate pathology stage resulted in up to 2.9% systematic bias for SUVRCTX∕REF. All SUVRCTX∕REF methods performed better than SUVCTX both with regard to longitudinal stability (d≥1.21 vs. d = 0.23) and histological gold-standard agreement (R≥0.66 vs. R≥0.31). Voxel-wise analysis suggested a physiologically implausible longitudinal decrease by global mean scaling. The hindbrain white matter reference (R mean = 0.75) was slightly superior to the brainstem (R mean = 0.74) and the cerebellum (R mean = 0.73). Automated
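
    The SUVR endpoint used above reduces to the ratio of mean uptake in a target volume to mean uptake in a reference volume. The sketch below is a minimal illustration of that calculation; the array and mask names are hypothetical, and the real analysis first spatially normalizes each scan to a template before the VOIs are applied.

```python
import numpy as np

def suvr(pet_volume, target_mask, reference_mask):
    """Standardized uptake value ratio of a target VOI to a reference VOI.

    `pet_volume` is a 3-D array of SUV (or activity) values; the masks are
    boolean arrays of the same shape. In the study above, the target was the
    frontal cortex and the reference e.g. hindbrain white matter.
    """
    suv_target = pet_volume[target_mask].mean()
    suv_reference = pet_volume[reference_mask].mean()
    return suv_target / suv_reference
```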

  11. Spatial analysis of weed patterns

    NARCIS (Netherlands)

    Heijting, S.

    2007-01-01

    Keywords: Spatial analysis, weed patterns, Mead’s test, space-time correlograms, 2-D correlograms, dispersal, Generalized Linear Models, heterogeneity, soil, Taylor’s power law. Weeds in agriculture occur in patches. This thesis is a contribution to the characterization of this patchiness, to its

  12. Quantitative risk analysis of a space shuttle subsystem

    International Nuclear Information System (INIS)

    Frank, M.V.

    1989-01-01

    In an attempt to investigate risk management methods other than qualitative analysis techniques, NASA has funded pilot quantitative risk analyses for space shuttle subsystems. The authors performed one such study of two shuttle subsystems with McDonnell Douglas Astronautics Company. The subsystems were the auxiliary power units (APU) on the orbiter, and the hydraulic power units on the solid rocket booster. The technology and results of the APU study are presented in this paper. Drawing from a rich in-flight database as well as from a wealth of tests and analyses, the study quantitatively assessed the risk of APU-initiated scenarios on the shuttle during all phases of a flight mission. Damage states of interest were loss of crew/vehicle, aborted mission, and launch scrub. A quantitative risk analysis approach to deciding on important items for risk management was contrasted with the current NASA failure mode and effects analysis/critical item list approach

  13. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    Science.gov (United States)

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. An analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media or plate-count technique on the estimation of microbial counts, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpretation errors) was established. The most appropriate method of statistical analysis of such data was ANOVA, which enabled not only the effects of individual factors but also their interactions to be estimated. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, which is appropriate for traditional plate count methods. Copyright © 2015 Elsevier B.V. All rights reserved.

  14. Special issues in quantitation of brain receptors and related markers by emission computed tomography

    International Nuclear Information System (INIS)

    Links, J.M.

    1998-01-01

    Emission computed tomography provides an opportunity to quantify neurotransmitter-neuroreceptor systems in vivo. In order to do so, very high image quality and quantitative accuracy are required. Quantitation of receptor systems involves considerations of physical effects (such as finite spatial resolution, scatter, and attenuation), instrumentation design (such as spatial sampling), image processing (such as filtering), and data analysis (such as kinetic modeling). Appropriate application of these considerations can lead to useful results, but emerging approaches promise even greater levels of accuracy and precision

  15. Quantitative Analysis of cardiac SPECT

    International Nuclear Information System (INIS)

    Nekolla, S.G.; Bengel, F.M.

    2004-01-01

    The quantitative analysis of myocardial SPECT images is a powerful tool to extract the highly specific radiotracer uptake in these studies. When compared to normal databases, the uptake values can be calibrated on an individual basis, which substantially increases the reproducibility of the analysis. Based on the development over the last three decades, starting from planar scintigraphy, this paper discusses the methods used today, incorporating the changes due to tomographic image acquisition. Finally, the limitations of these approaches, as well as the consequences of the most recent hardware developments, commercial analysis packages and a wider view of the description of the left ventricle, are discussed. (orig.)

  16. Spatial Uncertainty Model for Visual Features Using a Kinect™ Sensor

    Directory of Open Access Journals (Sweden)

    Jae-Han Park

    2012-06-01

    Full Text Available This study proposes a mathematical uncertainty model for the spatial measurement of visual features using Kinect™ sensors. This model can provide qualitative and quantitative analysis for the utilization of Kinect™ sensors as 3D perception sensors. In order to achieve this objective, we derived the propagation relationship of the uncertainties between the disparity image space and the real Cartesian space with the mapping function between the two spaces. Using this propagation relationship, we obtained the mathematical model for the covariance matrix of the measurement error, which represents the uncertainty for spatial position of visual features from Kinect™ sensors. In order to derive the quantitative model of spatial uncertainty for visual features, we estimated the covariance matrix in the disparity image space using collected visual feature data. Further, we computed the spatial uncertainty information by applying the covariance matrix in the disparity image space and the calibrated sensor parameters to the proposed mathematical model. This spatial uncertainty model was verified by comparing the uncertainty ellipsoids for spatial covariance matrices and the distribution of scattered matching visual features. We expect that this spatial uncertainty model and its analyses will be useful in various Kinect™ sensor applications.

  17. Spatial uncertainty model for visual features using a Kinect™ sensor.

    Science.gov (United States)

    Park, Jae-Han; Shin, Yong-Deuk; Bae, Ji-Hun; Baeg, Moon-Hong

    2012-01-01

    This study proposes a mathematical uncertainty model for the spatial measurement of visual features using Kinect™ sensors. This model can provide qualitative and quantitative analysis for the utilization of Kinect™ sensors as 3D perception sensors. In order to achieve this objective, we derived the propagation relationship of the uncertainties between the disparity image space and the real Cartesian space with the mapping function between the two spaces. Using this propagation relationship, we obtained the mathematical model for the covariance matrix of the measurement error, which represents the uncertainty for spatial position of visual features from Kinect™ sensors. In order to derive the quantitative model of spatial uncertainty for visual features, we estimated the covariance matrix in the disparity image space using collected visual feature data. Further, we computed the spatial uncertainty information by applying the covariance matrix in the disparity image space and the calibrated sensor parameters to the proposed mathematical model. This spatial uncertainty model was verified by comparing the uncertainty ellipsoids for spatial covariance matrices and the distribution of scattered matching visual features. We expect that this spatial uncertainty model and its analyses will be useful in various Kinect™ sensor applications.
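
    The covariance propagation at the heart of the two records above can be sketched as a first-order (Jacobian-based) transformation of the disparity-space covariance into Cartesian space. The pinhole/stereo mapping and parameter names below are assumptions made for illustration; the published model may use a different disparity-to-depth relation and calibration.

```python
import numpy as np

def disparity_to_xyz(u, v, d, fx, fy, cx, cy, baseline):
    """Map a pixel (u, v) with disparity d to Cartesian (x, y, z).

    Hypothetical pinhole/stereo model with focal lengths fx, fy, principal
    point (cx, cy) and baseline.
    """
    z = fx * baseline / d
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.array([x, y, z])

def propagate_covariance(u, v, d, cov_uvd, **params):
    """First-order propagation: Sigma_xyz ~= J Sigma_uvd J^T,
    with the Jacobian J of the mapping estimated numerically."""
    f0 = disparity_to_xyz(u, v, d, **params)
    eps = 1e-4
    J = np.zeros((3, 3))
    for i, delta in enumerate(np.eye(3) * eps):
        f1 = disparity_to_xyz(u + delta[0], v + delta[1], d + delta[2], **params)
        J[:, i] = (f1 - f0) / eps
    return J @ cov_uvd @ J.T

# Illustrative numbers only (pixels and millimetres are notional):
cov_xyz = propagate_covariance(320.0, 240.0, 600.0, np.diag([0.25, 0.25, 1.0]),
                               fx=580.0, fy=580.0, cx=320.0, cy=240.0, baseline=75.0)
print(cov_xyz)
```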

  18. Good practices for quantitative bias analysis.

    Science.gov (United States)

    Lash, Timothy L; Fox, Matthew P; MacLehose, Richard F; Maldonado, George; McCandless, Lawrence C; Greenland, Sander

    2014-12-01

    Quantitative bias analysis serves several objectives in epidemiological research. First, it provides a quantitative estimate of the direction, magnitude and uncertainty arising from systematic errors. Second, the acts of identifying sources of systematic error, writing down models to quantify them, assigning values to the bias parameters and interpreting the results combat the human tendency towards overconfidence in research results, syntheses and critiques, and the inferences that rest upon them. Finally, by suggesting aspects that dominate uncertainty in a particular research result or topic area, bias analysis can guide efficient allocation of sparse research resources. The fundamental methods of bias analyses have been known for decades, and there have been calls for more widespread use for nearly as long. There was a time when some believed that bias analyses were rarely undertaken because the methods were not widely known and because automated computing tools were not readily available to implement the methods. These shortcomings have been largely resolved. We must, therefore, contemplate other barriers to implementation. One possibility is that practitioners avoid the analyses because they lack confidence in the practice of bias analysis. The purpose of this paper is therefore to describe what we view as good practices for applying quantitative bias analysis to epidemiological data, directed towards those familiar with the methods. We focus on answering questions often posed to those of us who advocate incorporation of bias analysis methods into teaching and research. These include the following. When is bias analysis practical and productive? How does one select the biases that ought to be addressed? How does one select a method to model biases? How does one assign values to the parameters of a bias model? How does one present and interpret a bias analysis? We hope that our guide to good practices for conducting and presenting bias analyses will encourage

  19. Quantitative phase analysis in industrial research

    International Nuclear Information System (INIS)

    Ahmad Monshi

    1996-01-01

    X-ray diffraction (XRD) is the only technique able to identify phases; all the other analytical techniques give information about the elements. Quantitative phase analysis of minerals and industrial products is logically the next step after a qualitative examination and is of great importance in industrial research. Since the application of XRD in industry early in this century, workers have been trying to develop quantitative XRD methods. In this paper some of the important methods are briefly discussed and partly compared. These methods are Internal Standard, Known Additions, Double Dilution, External Standard, Direct Comparison, Diffraction Absorption and Ratio of Slopes

  20. The quantitative modelling of human spatial habitability

    Science.gov (United States)

    Wise, James A.

    1988-01-01

    A theoretical model for evaluating human spatial habitability (HuSH) in the proposed U.S. Space Station is developed. Optimizing the fitness of the space station environment for human occupancy will help reduce environmental stress due to long-term isolation and confinement in its small habitable volume. The development of tools that operationalize the behavioral bases of spatial volume for visual, kinesthetic, and social logic considerations is suggested. This report further calls for systematic scientific investigations of how much real and how much perceived volume people need in order to function normally and with minimal stress in space-based settings. The theoretical model presented in this report can be applied to any size or shape of interior, at any scale of consideration, from the Space Station as a whole to an individual enclosure or work station. Using as a point of departure the Isovist model developed by Dr. Michael Benedikt of the U. of Texas, the report suggests that spatial habitability can become as amenable to careful assessment as engineering and life support concerns.

  1. Crash rates analysis in China using a spatial panel model

    Directory of Open Access Journals (Sweden)

    Wonmongo Lacina Soro

    2017-10-01

    Full Text Available The consideration of spatial externalities in traffic safety analysis is of paramount importance for the success of road safety policies. Yet nearly all spatial dependence studies of crash rates are performed within the framework of single-equation spatial cross-sectional studies. The present study extends the spatial cross-sectional scheme to a spatial fixed-effects panel model estimated using the maximum likelihood method. The spatial units are the 31 administrative regions of mainland China over the period 2004–2013. The presence of neighborhood effects is evidenced through the Moran's I statistic. Consistent with previous studies, the analysis reveals that omitting the spatial effects in traffic safety analysis is likely to bias the estimation results. The spatial and error lags are all positive and statistically significant, suggesting similar crash rate patterns in neighboring regions. Some other explanatory variables, such as freight traffic, the length of paved roads and the population aged 65 and above, are related to higher rates, while the opposite trend is observed for the Gross Regional Product, the urban unemployment rate and passenger traffic.
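
    The neighborhood-effect test mentioned above rests on Moran's I. A minimal sketch of the global statistic for regional crash rates is given below; a real analysis (e.g. with PySAL) would add permutation-based inference and a properly constructed contiguity or distance-based weight matrix.

```python
import numpy as np

def morans_i(x, w):
    """Global Moran's I for regional values x and a spatial weight matrix w.

    A minimal sketch assuming a binary contiguity matrix that is
    row-standardized here; isolated regions (all-zero rows) are not handled.
    """
    x = np.asarray(x, dtype=float)
    w = np.asarray(w, dtype=float)
    w = w / w.sum(axis=1, keepdims=True)      # row-standardize
    z = x - x.mean()
    n, s0 = len(x), w.sum()
    return (n / s0) * (z @ w @ z) / (z @ z)
```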

  2. Spatial analysis of NDVI readings with different sampling density

    Science.gov (United States)

    Advanced remote sensing technologies provide researchers with an innovative way of collecting spatial data for use in precision agriculture. Sensor information and spatial analysis together allow for a complete understanding of the spatial complexity of a field and its crop. The objective of the study was...

  3. Spatial Analysis Methods of Road Traffic Collisions

    DEFF Research Database (Denmark)

    Loo, Becky P. Y.; Anderson, Tessa Kate

    Spatial Analysis Methods of Road Traffic Collisions centers on the geographical nature of road crashes, and uses spatial methods to provide a greater understanding of the patterns and processes that cause them. Written by internationally known experts in the field of transport geography, the book...... outlines the key issues in identifying hazardous road locations (HRLs), considers current approaches used for reducing and preventing road traffic collisions, and outlines a strategy for improved road safety. The book covers spatial accuracy, validation, and other statistical issues, as well as link...

  4. Quantitative Analysis of Rat Dorsal Root Ganglion Neurons Cultured on Microelectrode Arrays Based on Fluorescence Microscopy Image Processing.

    Science.gov (United States)

    Mari, João Fernando; Saito, José Hiroki; Neves, Amanda Ferreira; Lotufo, Celina Monteiro da Cruz; Destro-Filho, João-Batista; Nicoletti, Maria do Carmo

    2015-12-01

    Microelectrode Arrays (MEA) are devices for long-term electrophysiological recording of extracellular spontaneous or evoked activity in in vitro neuron cultures. This work proposes and develops a framework for quantitative and morphological analysis of neuron cultures on MEAs by processing their corresponding images, acquired by fluorescence microscopy. The neurons are segmented from the fluorescence channel images using a combination of segmentation by thresholding, the watershed transform, and object classification. The positions of the microelectrodes are obtained from the transmitted light channel images using the circular Hough transform. The proposed method was applied to images of dissociated cultures of rat dorsal root ganglion (DRG) neuronal cells. The morphological and topological quantitative analysis produced information regarding the state of the culture, such as population count, neuron-to-neuron and neuron-to-microelectrode distances, soma morphologies, neuron sizes, and neuron and microelectrode spatial distributions. Most analyses of microscopy images taken from neuronal cultures on MEAs consider only simple qualitative aspects; the proposed framework instead aims to standardize the image processing and to compute useful quantitative measures for integrated image-signal studies and further computational simulations. As the results show, the implemented microelectrode identification method is robust, and so are the implemented neuron segmentation and classification methods (with a correct segmentation rate of up to 84%). The quantitative information retrieved by the method is highly relevant to assist the integrated signal-image study of recorded electrophysiological signals as well as the physical aspects of the neuron culture on the MEA. Although the experiments deal with DRG cell images, cortical and hippocampal cell images could also be processed with small adjustments to the image processing parameter estimation.
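
    The segmentation steps named above (thresholding, watershed transform, circular Hough transform) can be sketched with scikit-image. The functions below follow the general pipeline only; the thresholding rule, radii and minimum distances are placeholder choices, not the parameters used by the authors.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage import feature, filters, measure, segmentation, transform

def segment_neurons(fluorescence):
    """Rough neuron segmentation: Otsu threshold + distance-transform watershed."""
    mask = fluorescence > filters.threshold_otsu(fluorescence)
    distance = ndi.distance_transform_edt(mask)
    peaks = feature.peak_local_max(distance, labels=measure.label(mask), min_distance=5)
    markers = np.zeros(distance.shape, dtype=int)
    markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
    return segmentation.watershed(-distance, markers, mask=mask)

def find_microelectrodes(brightfield, radii=np.arange(8, 20)):
    """Locate circular microelectrodes with the circular Hough transform."""
    edges = feature.canny(brightfield)
    accumulator = transform.hough_circle(edges, radii)
    _, cx, cy, r = transform.hough_circle_peaks(accumulator, radii, total_num_peaks=60)
    return np.column_stack([cx, cy, r])
```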

  5. Micro-computer system for quantitative image analysis of damage microstructure

    International Nuclear Information System (INIS)

    Kohyama, A.; Kohno, Y.; Satoh, K.; Igata, N.

    1984-01-01

    Quantitative image analysis of radiation-induced damage microstructure is very important in evaluating material behavior in a radiation environment. However, few improvements have been seen in the quantitative analysis of damage microstructure in recent decades. The objective of this work is to develop a new system for quantitative image analysis of damage microstructure which could improve the accuracy and efficiency of data sampling and processing and could provide new information about the mutual relations among dislocations, precipitates, cavities, grain boundaries, etc. In this system, data sampling is done with an X-Y digitizer. The cavity microstructure in dual-ion irradiated 316 SS is analyzed and the effectiveness of the system is discussed. (orig.)

  6. Research progress and hotspot analysis of spatial interpolation

    Science.gov (United States)

    Jia, Li-juan; Zheng, Xin-qi; Miao, Jin-li

    2018-02-01

    In this paper, the literature on spatial interpolation published between 1982 and 2017 and indexed in the Web of Science core database is used as the data source, and a visualization analysis is carried out on the co-country network, co-category network, co-citation network and keyword co-occurrence network. It is found that spatial interpolation research has experienced three stages: slow development, steady development and rapid development. Eleven clustering groups interact with one another and converge mainly on spatial interpolation theory, the practical application and case studies of spatial interpolation, and the accuracy and efficiency of spatial interpolation. Finding the optimal spatial interpolation method is the frontier and hot spot of the research. Spatial interpolation research has formed a theoretical basis and a research system framework; it is strongly interdisciplinary and widely used in various fields.

  7. Quantitative analysis of the chromatin of lymphocytes: an assay on comparative structuralism.

    Science.gov (United States)

    Meyer, F

    1980-01-01

    With 26 letters we can form all the words we use, and with a few words it is possible to form an infinite number of different meaningful sentences. In our case, the letters will be a few simple neighborhood image transformations and area measurements. The paper shows how, by iterating these transformations, it is possible to obtain a good quantitative description of the nuclear structure of Feulgen-stained lymphocytes (CLL and normal). The fact that we restricted ourselves to a small number of image transformations made it possible to construct an image analysis system (TAS) able to do these transformations very quickly. We will see, successively, how to segment the nucleus itself, the chromatin, and the interchromatinic channels, how openings and closings lead to size and spatial distribution curves, and how skeletons may be used for measuring the lengths of interchromatinic channels.
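
    The size-distribution idea described above (successive openings of the segmented chromatin with structuring elements of increasing size) is commonly called a granulometry. The sketch below shows the principle with scikit-image; the disk-shaped structuring element and maximum radius are illustrative choices, not those of the TAS system.

```python
import numpy as np
from skimage import morphology

def granulometry(binary_chromatin, max_radius=15):
    """Size distribution by successive morphological openings.

    Returns the fraction of foreground area removed at each opening radius,
    a simple size-distribution curve for the segmented structure.
    """
    areas = [binary_chromatin.sum()]
    for r in range(1, max_radius + 1):
        opened = morphology.opening(binary_chromatin, morphology.disk(r))
        areas.append(opened.sum())
    areas = np.array(areas, dtype=float)
    return -np.diff(areas) / areas[0]
```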

  8. Quantitative Structure-Activity Relationship Analysis of the ...

    African Journals Online (AJOL)

    Erah

    Quantitative Structure-Activity Relationship Analysis of the Anticonvulsant ... Two types of molecular descriptors, including the 2D autocorrelation ..... It is based on the simulation of natural .... clustering anticonvulsant, antidepressant, and.

  9. Asymptotic analysis of spatial discretizations in implicit Monte Carlo

    International Nuclear Information System (INIS)

    Densmore, Jeffery D.

    2009-01-01

    We perform an asymptotic analysis of spatial discretizations in Implicit Monte Carlo (IMC). We consider two asymptotic scalings: one that represents a time step that resolves the mean-free time, and one that corresponds to a fixed, optically large time step. We show that only the latter scaling results in a valid spatial discretization of the proper diffusion equation, and thus we conclude that IMC only yields accurate solutions when using optically large spatial cells if time steps are also optically large. We demonstrate the validity of our analysis with a set of numerical examples.

  10. Regional Convergence of Income: Spatial Analysis

    Directory of Open Access Journals (Sweden)

    Vera Ivanovna Ivanova

    2014-12-01

    Full Text Available Russia has a huge territory and strong interregional heterogeneity, so we can assume that geographical factors have a significant impact on the pace of economic growth in Russian regions. The article therefore focuses on the following issues: (1) the correlation between comparative advantages of geographical location and differences in growth rates; (2) the impact of more developed regions on their neighbors; and (3) the correlation between the economic growth of regions and their spatial interaction. The article is devoted to an empirical analysis of regional per capita incomes from 1996 to 2012 and explores the dynamics of the spatial autocorrelation of a regional development indicator. It is shown that there is a problem in measuring the intensity of spatial dependence: the value of Moran’s index varies greatly depending on the choice of the distance matrix. In addition, with the help of spatial econometrics the author tests the following hypotheses: (1) there is convergence between regions for the specified period; (2) the process of beta convergence is explained by the spatial arrangement of regions; and (3) market size has a positive impact on regional growth. The author empirically confirmed all three hypotheses

  11. Quantitative analysis of geomorphic processes using satellite image data at different scales

    Science.gov (United States)

    Williams, R. S., Jr.

    1985-01-01

    When aerial and satellite photographs and images are used in the quantitative analysis of geomorphic processes, either through direct observation of active processes or by analysis of landforms resulting from inferred active or dormant processes, a number of limitations in the use of such data must be considered. Active geomorphic processes work at different scales and rates. Therefore, the capability of imaging an active or dormant process depends primarily on the scale of the process and the spatial-resolution characteristic of the imaging system. Scale is an important factor in recording continuous and discontinuous active geomorphic processes, because what is not recorded will not be considered or even suspected in the analysis of orbital images. If the geomorphic process, or the landform change caused by the process, is less than 200 m in the x or y dimension, then it will not be recorded. Although the scale factor is critical in the recording of discontinuous active geomorphic processes, the repeat interval of orbital-image acquisition of a planetary surface is also a consideration, in order to capture a recurring short-lived geomorphic process or to record changes caused by either a continuous or a discontinuous geomorphic process.

  12. Data from quantitative label free proteomics analysis of rat spleen

    Directory of Open Access Journals (Sweden)

    Khadar Dudekula

    2016-09-01

    Full Text Available The dataset presented in this work has been obtained using a label-free quantitative proteomic analysis of rat spleen. A robust method for extraction of proteins from rat spleen tissue and LC-MS-MS analysis was developed using a urea and SDS-based buffer. Different fractionation methods were compared. A total of 3484 different proteins were identified from the pool of all experiments run in this study (a total of 2460 proteins with at least two peptides). A total of 1822 proteins were identified from nine non-fractionated pulse gels, while 2288 and 2864 proteins were identified by SDS-PAGE fractionation into three and five fractions, respectively. The proteomics data are deposited with the ProteomeXchange Consortium via PRIDE (PXD003520); Progenesis and MaxQuant outputs are presented in the supporting information. The lists of proteins generated under the different fractionation regimes allow the nature of the identified proteins and the variability in the quantitative analysis associated with the different sampling strategies to be assessed, and allow a proper number of replicates to be defined for future quantitative analyses. Keywords: Spleen, Rat, Protein extraction, Label-free quantitative proteomics

  13. Quantitative data analysis in education a critical introduction using SPSS

    CERN Document Server

    Connolly, Paul

    2007-01-01

    This book provides a refreshing and user-friendly guide to quantitative data analysis in education for students and researchers. It assumes absolutely no prior knowledge of quantitative methods or statistics. Beginning with the very basics, it provides the reader with the knowledge and skills necessary to be able to undertake routine quantitative data analysis to a level expected of published research. Rather than focusing on teaching statistics through mathematical formulae, the book places an emphasis on using SPSS to gain a real feel for the data and an intuitive grasp of t

  14. Quantitative analysis method for ship construction quality

    Directory of Open Access Journals (Sweden)

    FU Senzong

    2017-03-01

    Full Text Available The excellent performance of a ship is assured by the accurate evaluation of its construction quality. For a long time, research into the construction quality of ships has mainly focused on qualitative analysis due to a shortage of process data, which results from limited samples, varied process types and non-standardized processes. Aiming at predicting and controlling the influence of the construction process on the construction quality of ships, this article proposes a quantitative reliability analysis workflow for the ship construction process together with a fuzzy calculation method. Based on the process-quality factor model proposed by the Function-Oriented Quality Control (FOQC) method, we combine fuzzy mathematics with the expert grading method to deduce formulations for calculating the fuzzy process reliability of the ordinal connection model, the series connection model and the mixed connection model. The quantitative analysis method is applied to the analysis of the process reliability of a ship's shaft-gearbox installation, which proves the applicability and effectiveness of the method. The analysis results can be a useful reference for setting key quality inspection points and optimizing key processes.

  15. Quantitative X-ray analysis of pigments

    International Nuclear Information System (INIS)

    Araujo, M. Marrocos de

    1987-01-01

    The 'matrix-flushing' and 'adiabatic principle' methods have been applied to the quantitative analysis, through X-ray diffraction patterns, of mixtures of pigments and extenders frequently used in the paint industry. The results obtained have shown the usefulness of these methods, but their accuracy still needs to be improved. (Author) [pt

  16. Quantifying spatial heterogeneity from images

    International Nuclear Information System (INIS)

    Pomerantz, Andrew E; Song Yiqiao

    2008-01-01

    Visualization techniques are extremely useful for characterizing natural materials with complex spatial structure. Although many powerful imaging modalities exist, simple display of the images often does not convey the underlying spatial structure. Instead, quantitative image analysis can extract the most important features of the imaged object in a manner that is easier to comprehend and to compare from sample to sample. This paper describes the formulation of the heterogeneity spectrum to show the extent of spatial heterogeneity as a function of length scale for all length scales to which a particular measurement is sensitive. This technique is especially relevant for describing materials that simultaneously present spatial heterogeneity at multiple length scales. In this paper, the heterogeneity spectrum is applied for the first time to images from optical microscopy. The spectrum is measured for thin section images of complex carbonate rock cores showing heterogeneity at several length scales in the range 10-10 000 μm.
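
    A crude stand-in for the heterogeneity spectrum described above is the spread of block means as a function of block size: a well-mixed image loses contrast quickly as blocks grow, while an image structured at a given length scale keeps its contrast up to that scale. The block-averaging sketch below is a simplification of the published formulation, and the choice of scales is arbitrary.

```python
import numpy as np

def heterogeneity_spectrum(image, scales):
    """Heterogeneity versus length scale by block averaging.

    For each block size s (in pixels), the image is tiled into s-by-s blocks
    and the standard deviation of the block means is recorded.
    """
    spectrum = []
    for s in scales:
        ny, nx = (image.shape[0] // s) * s, (image.shape[1] // s) * s
        blocks = image[:ny, :nx].reshape(ny // s, s, nx // s, s).mean(axis=(1, 3))
        spectrum.append(blocks.std())
    return np.array(spectrum)
```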

  17. UTOOLS: microcomputer software for spatial analysis and landscape visualization.

    Science.gov (United States)

    Alan A. Ager; Robert J. McGaughey

    1997-01-01

    UTOOLS is a collection of programs designed to integrate various spatial data in a way that allows versatile spatial analysis and visualization. The programs were designed for watershed-scale assessments in which a wide array of resource data must be integrated, analyzed, and interpreted. UTOOLS software combines raster, attribute, and vector data into "spatial...

  18. Quantitative Imaging in Cancer Evolution and Ecology

    Science.gov (United States)

    Grove, Olya; Gillies, Robert J.

    2013-01-01

    Cancer therapy, even when highly targeted, typically fails because of the remarkable capacity of malignant cells to evolve effective adaptations. These evolutionary dynamics are both a cause and a consequence of cancer system heterogeneity at many scales, ranging from genetic properties of individual cells to large-scale imaging features. Tumors of the same organ and cell type can have remarkably diverse appearances in different patients. Furthermore, even within a single tumor, marked variations in imaging features, such as necrosis or contrast enhancement, are common. Similar spatial variations recently have been reported in genetic profiles. Radiologic heterogeneity within tumors is usually governed by variations in blood flow, whereas genetic heterogeneity is typically ascribed to random mutations. However, evolution within tumors, as in all living systems, is subject to Darwinian principles; thus, it is governed by predictable and reproducible interactions between environmental selection forces and cell phenotype (not genotype). This link between regional variations in environmental properties and cellular adaptive strategies may permit clinical imaging to be used to assess and monitor intratumoral evolution in individual patients. This approach is enabled by new methods that extract, report, and analyze quantitative, reproducible, and mineable clinical imaging data. However, most current quantitative metrics lack spatialness, expressing quantitative radiologic features as a single value for a region of interest encompassing the whole tumor. In contrast, spatially explicit image analysis recognizes that tumors are heterogeneous but not well mixed and defines regionally distinct habitats, some of which appear to harbor tumor populations that are more aggressive and less treatable than others. By identifying regional variations in key environmental selection forces and evidence of cellular adaptation, clinical imaging can enable us to define intratumoral

  19. Factors influencing the spatial extent of mobile source air pollution impacts: a meta-analysis

    Directory of Open Access Journals (Sweden)

    Levy Jonathan I

    2007-05-01

    Full Text Available Abstract Background There has been growing interest among exposure assessors, epidemiologists, and policymakers in the concept of "hot spots", or more broadly, the "spatial extent" of impacts from traffic-related air pollutants. This review attempts to quantitatively synthesize findings about the spatial extent under various circumstances. Methods We include both the peer-reviewed literature and government reports, and focus on four significant air pollutants: carbon monoxide, benzene, nitrogen oxides, and particulate matter (including both ultrafine particle counts and fine particle mass). From the identified studies, we extracted information about significant factors that would be hypothesized to influence the spatial extent within the study, such as the study type (e.g., monitoring, air dispersion modeling, GIS-based epidemiological studies), focus on concentrations or health risks, pollutant under study, background concentration, emission rate, and meteorological factors, as well as the study's implicit or explicit definition of spatial extent. We supplement this meta-analysis with results from some illustrative atmospheric dispersion modeling. Results We found that pollutant characteristics and background concentrations best explained variability in previously published spatial extent estimates, with a modifying influence of local meteorology, once some extreme values based on health risk estimates were removed from the analysis. As hypothesized, inert pollutants with high background concentrations had the largest spatial extent (often demonstrating no significant gradient), and pollutants formed in near-source chemical reactions (e.g., nitrogen dioxide) had a larger spatial extent than pollutants depleted in near-source chemical reactions or removed through coagulation processes (e.g., nitrogen oxide and ultrafine particles). Our illustrative dispersion model illustrated the complex interplay of spatial extent definitions, emission rates

  20. Novel approach in quantitative analysis of shearography method

    International Nuclear Information System (INIS)

    Wan Saffiey Wan Abdullah

    2002-01-01

    The application of laser interferometry in industrial non-destructive testing and material characterization is becoming more prevalent, since this method provides non-contact, full-field inspection of the test object. However, its application has so far been limited to qualitative analysis; the current trend is to develop the method further through the introduction of quantitative analysis, which attempts to characterize the examined defect in detail over a range of object sizes. The growing commercial demand for quantitative analysis in NDT and material characterization is driving the quality of optical and analysis instrumentation. However, very little attention is currently being paid to understanding, quantifying and compensating for the numerous error sources that are inherent in interferometers. This paper presents a comparison of measurement analysis using the established theoretical approach and a new approach that takes into account divergent illumination and other geometrical factors. Differences between the measurement systems can be attributed to these error factors. (Author)

  1. Modeling fixation locations using spatial point processes.

    Science.gov (United States)

    Barthelmé, Simon; Trukenbrod, Hans; Engbert, Ralf; Wichmann, Felix

    2013-10-01

    Whenever eye movements are measured, a central part of the analysis has to do with where subjects fixate and why they fixated where they did. To a first approximation, a set of fixations can be viewed as a set of points in space; this implies that fixations are spatial data and that the analysis of fixation locations can be beneficially thought of as a spatial statistics problem. We argue that thinking of fixation locations as arising from point processes is a very fruitful framework for eye-movement data, helping turn qualitative questions into quantitative ones. We provide a tutorial introduction to some of the main ideas of the field of spatial statistics, focusing especially on spatial Poisson processes. We show how point processes help relate image properties to fixation locations. In particular we show how point processes naturally express the idea that image features' predictability for fixations may vary from one image to another. We review other methods of analysis used in the literature, show how they relate to point process theory, and argue that thinking in terms of point processes substantially extends the range of analyses that can be performed and clarifies their interpretation.
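
    The point-process view described above can be made concrete by simulating fixations from an inhomogeneous spatial Poisson process whose intensity is proportional to an image-based predictor (for example a saliency map). The function below is a minimal sketch with hypothetical variable names; dedicated packages (e.g. spatstat in R) provide the full fitting and inference machinery.

```python
import numpy as np

def simulate_inhomogeneous_poisson(intensity_map, rng=None):
    """Simulate fixation locations from an inhomogeneous spatial Poisson process.

    `intensity_map` gives the expected number of fixations per pixel; the
    simulation draws a Poisson count per pixel and jitters the points within
    each pixel so locations are continuous.
    """
    rng = np.random.default_rng(rng)
    counts = rng.poisson(intensity_map)
    ys, xs = np.nonzero(counts)
    points = np.repeat(np.column_stack([xs, ys]), counts[ys, xs], axis=0)
    return points + rng.uniform(0.0, 1.0, size=points.shape)
```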

  2. Geospatial analysis platform: Supporting strategic spatial analysis and planning

    CSIR Research Space (South Africa)

    Naude, A

    2008-11-01

    Full Text Available Whilst there have been rapid advances in satellite imagery and related fine resolution mapping and web-based interfaces (e.g. Google Earth), the development of capabilities for strategic spatial analysis and planning support has lagged behind...

  3. Spatial evolutionary epidemiology of spreading epidemics.

    Science.gov (United States)

    Lion, S; Gandon, S

    2016-10-26

    Most spatial models of host-parasite interactions either neglect the possibility of pathogen evolution or consider that this process is slow enough for epidemiological dynamics to reach an equilibrium on a fast timescale. Here, we propose a novel approach to jointly model the epidemiological and evolutionary dynamics of spatially structured host and pathogen populations. Starting from a multi-strain epidemiological model, we use a combination of spatial moment equations and quantitative genetics to analyse the dynamics of mean transmission and virulence in the population. A key insight of our approach is that, even in the absence of long-term evolutionary consequences, spatial structure can affect the short-term evolution of pathogens because of the build-up of spatial differentiation in mean virulence. We show that spatial differentiation is driven by a balance between epidemiological and genetic effects, and this quantity is related to the effect of kin competition discussed in previous studies of parasite evolution in spatially structured host populations. Our analysis can be used to understand and predict the transient evolutionary dynamics of pathogens and the emergence of spatial patterns of phenotypic variation. © 2016 The Author(s).

  4. Rings and sectors: intrasite spatial analysis of Stone Age sites

    NARCIS (Netherlands)

    Stapert, Durk

    1992-01-01

    This thesis deals with intrasite spatial analysis: the analysis of spatial patterns on site level. My main concern has been to develop a simple method for analysing Stone Age sites of a special type: those characterised by the presence of a hearth closely associated in space with an artefact

  5. Quantitative methods for the analysis of electron microscope images

    DEFF Research Database (Denmark)

    Skands, Peter Ulrik Vallø

    1996-01-01

    The topic of this thesis is a general introduction to quantitative methods for the analysis of digital microscope images. The images presented have primarily been acquired from Scanning Electron Microscopes (SEM) and interferometer microscopes (IFM). The topic is approached through several examples...... foundation of the thesis falls in the areas of: 1) Mathematical Morphology; 2) Distance transforms and applications; and 3) Fractal geometry. Image analysis opens in general the possibility of quantitative and statistically well-founded measurement of digital microscope images. Herein lie also the conditions...

  6. Parameter determination for quantitative PIXE analysis using genetic algorithms

    International Nuclear Information System (INIS)

    Aspiazu, J.; Belmont-Moreno, E.

    1996-01-01

    For biological and environmental samples, the PIXE technique is particularly advantageous for elemental analysis, but quantitative analysis implies carrying out complex calculations that require knowledge of more than a dozen parameters. Using a genetic algorithm, the authors give here an account of the procedure to obtain the best values of the parameters necessary to fit the efficiency of an X-ray detector. The values of some variables involved in quantitative PIXE analysis were manipulated in a similar way to how genetic information is treated in a biological process. The authors ran the algorithm until it reproduced, within the confidence interval, the elemental concentrations corresponding to a reference material
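
    The parameter search described above can be illustrated with a very small real-coded genetic algorithm. The sketch below is generic: `loss` would wrap the comparison between measured and modelled detector efficiency, and the population size, mutation scale and selection scheme are arbitrary illustrative choices, not the authors' settings.

```python
import numpy as np

def genetic_fit(loss, bounds, pop_size=60, generations=200, rng=None):
    """Minimal real-coded genetic algorithm for parameter fitting.

    `loss` maps a parameter vector to a misfit value; `bounds` is a list of
    (low, high) pairs, one per parameter.
    """
    rng = np.random.default_rng(rng)
    lo, hi = np.array(bounds).T
    pop = rng.uniform(lo, hi, size=(pop_size, len(bounds)))
    for _ in range(generations):
        fitness = np.array([loss(p) for p in pop])
        parents = pop[np.argsort(fitness)[: pop_size // 2]]            # selection
        pairs = parents[rng.integers(len(parents), size=(pop_size - len(parents), 2))]
        children = pairs.mean(axis=1)                                  # crossover
        children += rng.normal(0.0, 0.05 * (hi - lo), size=children.shape)  # mutation
        pop = np.clip(np.vstack([parents, children]), lo, hi)
    return pop[np.argmin([loss(p) for p in pop])]

# Toy example: fit a two-parameter efficiency model eff(E) = a * exp(-b / E)
energies = np.array([2.0, 4.0, 8.0, 16.0])
measured = 0.9 * np.exp(-3.0 / energies)
loss = lambda p: np.sum((p[0] * np.exp(-p[1] / energies) - measured) ** 2)
print(genetic_fit(loss, bounds=[(0.1, 2.0), (0.1, 10.0)]))
```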

  7. Stereological analysis of spatial structures

    DEFF Research Database (Denmark)

    Hansen, Linda Vadgård

    The thesis deals with stereological analysis of spatial structures. One area of focus has been to improve the precision of well-known stereological estimators by including information that is available via automatic image analysis. Furthermore, the thesis presents a stochastic model for star......-shaped three-dimensional objects using the radial function. It appears that the model is highly flexible in the sense that it can be used to describe an object with an arbitrarily irregular surface. Results on the distribution of well-known local stereological volume estimators are provided.

  8. Improved Dynamic Analysis method for quantitative PIXE and SXRF element imaging of complex materials

    International Nuclear Information System (INIS)

    Ryan, C.G.; Laird, J.S.; Fisher, L.A.; Kirkham, R.; Moorhead, G.F.

    2015-01-01

    The Dynamic Analysis (DA) method in the GeoPIXE software provides a rapid tool to project quantitative element images from PIXE and SXRF imaging event data both for off-line analysis and in real-time embedded in a data acquisition system. Initially, it assumes uniform sample composition, background shape and constant model X-ray relative intensities. A number of image correction methods can be applied in GeoPIXE to correct images to account for chemical concentration gradients, differential absorption effects, and to correct images for pileup effects. A new method, applied in a second pass, uses an end-member phase decomposition obtained from the first pass, and DA matrices determined for each end-member, to re-process the event data with each pixel treated as an admixture of end-member terms. This paper describes the new method and demonstrates through examples and Monte-Carlo simulations how it better tracks spatially complex composition and background shape while still benefitting from the speed of DA.

  9. Improved Dynamic Analysis method for quantitative PIXE and SXRF element imaging of complex materials

    Energy Technology Data Exchange (ETDEWEB)

    Ryan, C.G., E-mail: chris.ryan@csiro.au; Laird, J.S.; Fisher, L.A.; Kirkham, R.; Moorhead, G.F.

    2015-11-15

    The Dynamic Analysis (DA) method in the GeoPIXE software provides a rapid tool to project quantitative element images from PIXE and SXRF imaging event data both for off-line analysis and in real-time embedded in a data acquisition system. Initially, it assumes uniform sample composition, background shape and constant model X-ray relative intensities. A number of image correction methods can be applied in GeoPIXE to correct images to account for chemical concentration gradients, differential absorption effects, and to correct images for pileup effects. A new method, applied in a second pass, uses an end-member phase decomposition obtained from the first pass, and DA matrices determined for each end-member, to re-process the event data with each pixel treated as an admixture of end-member terms. This paper describes the new method and demonstrates through examples and Monte-Carlo simulations how it better tracks spatially complex composition and background shape while still benefitting from the speed of DA.
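
    The core of the Dynamic Analysis step described in the two records above is a linear projection of each pixel's spectrum onto element concentrations through a precomputed DA matrix. The sketch below shows only that matrix product; the array shapes and names are assumptions, calibration and live-time factors are omitted, and the real GeoPIXE implementation operates on individual events rather than binned spectra.

```python
import numpy as np

def project_element_maps(pixel_spectra, da_matrix):
    """Project per-pixel spectra to element maps with a DA matrix.

    `pixel_spectra` is (ny, nx, n_channels) of accumulated counts and
    `da_matrix` is (n_elements, n_channels); the result is (ny, nx, n_elements).
    """
    ny, nx, nch = pixel_spectra.shape
    flat = pixel_spectra.reshape(-1, nch)
    maps = flat @ da_matrix.T
    return maps.reshape(ny, nx, -1)
```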

  10. ImaEdge - a platform for quantitative analysis of the spatiotemporal dynamics of cortical proteins during cell polarization.

    Science.gov (United States)

    Zhang, Zhen; Lim, Yen Wei; Zhao, Peng; Kanchanawong, Pakorn; Motegi, Fumio

    2017-12-15

    Cell polarity involves the compartmentalization of the cell cortex. The establishment of cortical compartments arises from the spatial bias in the activity and concentration of cortical proteins. The mechanistic dissection of cell polarity requires the accurate detection of dynamic changes in cortical proteins, but the fluctuations of cell shape and the inhomogeneous distributions of cortical proteins greatly complicate the quantitative extraction of their global and local changes during cell polarization. To address these problems, we introduce an open-source software package, ImaEdge, which automates the segmentation of the cortex from time-lapse movies, and enables quantitative extraction of cortical protein intensities. We demonstrate that ImaEdge enables efficient and rigorous analysis of the dynamic evolution of cortical PAR proteins during Caenorhabditis elegans embryogenesis. It is also capable of accurate tracking of varying levels of transgene expression and discontinuous signals of the actomyosin cytoskeleton during multiple rounds of cell division. ImaEdge provides a unique resource for quantitative studies of cortical polarization, with the potential for application to many types of polarized cells. This article has an associated First Person interview with the first authors of the paper. © 2017. Published by The Company of Biologists Ltd.

  11. Quantitative analysis of γ-oryzanol content in cold pressed rice bran oil by TLC-image analysis method.

    Science.gov (United States)

    Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana

    2014-02-01

    To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Both assays provided good linearity, accuracy, reproducibility and selectivity for determination of γ-oryzanol. The TLC-densitometric and TLC-image analysis methods provided a similar reproducibility, accuracy and selectivity for the quantitative determination of γ-oryzanol in cold pressed rice bran oil. A statistical comparison of the quantitative determinations of γ-oryzanol in samples did not show any statistically significant difference between TLC-densitometric and TLC-image analysis methods. As both methods were found to be equal, they therefore can be used for the determination of γ-oryzanol in cold pressed rice bran oil.
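
    Two ingredients of the validation above, a calibration line and a paired comparison of the two quantification methods, can be sketched with SciPy. All numbers below are made up for illustration; they are not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical calibration: integrated band intensities of gamma-oryzanol
# standards (arbitrary units) versus amount applied per band (micrograms).
standard_amount = np.array([1.0, 2.0, 4.0, 6.0, 8.0])
standard_signal = np.array([120.0, 235.0, 470.0, 700.0, 940.0])

slope, intercept, r, _, _ = stats.linregress(standard_amount, standard_signal)
print(f"linearity r^2 = {r**2:.4f}")

def quantify(signal):
    """Convert a measured band intensity to an amount via the fitted line."""
    return (signal - intercept) / slope

# Paired comparison of the two methods on the same (hypothetical) samples
densitometric = np.array([3.1, 4.2, 2.8, 5.0, 3.7])
image_analysis = np.array([3.0, 4.3, 2.9, 4.8, 3.8])
t_stat, p_value = stats.ttest_rel(densitometric, image_analysis)
print(f"paired t-test p = {p_value:.3f}")
```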

  12. Label-free cell-cycle analysis by high-throughput quantitative phase time-stretch imaging flow cytometry

    Science.gov (United States)

    Mok, Aaron T. Y.; Lee, Kelvin C. M.; Wong, Kenneth K. Y.; Tsia, Kevin K.

    2018-02-01

    Biophysical properties of cells could complement and correlate biochemical markers to characterize a multitude of cellular states. Changes in cell size, dry mass and subcellular morphology, for instance, are relevant to cell-cycle progression, which is prevalently evaluated by DNA-targeted fluorescence measurements. Quantitative-phase microscopy (QPM) is among the effective biophysical phenotyping tools that can quantify cell sizes and the sub-cellular dry mass density distribution of single cells at high spatial resolution. However, the limited camera frame rate, and thus imaging throughput, makes QPM incompatible with high-throughput flow cytometry - a gold standard in multiparametric cell-based assay. Here we present a high-throughput approach for label-free analysis of the cell cycle based on quantitative-phase time-stretch imaging flow cytometry at a throughput of > 10,000 cells/s. Our time-stretch QPM system enables sub-cellular resolution even at high speed, allowing us to extract a multitude (at least 24) of single-cell biophysical phenotypes (from both amplitude and phase images). Those phenotypes can be combined to track cell-cycle progression based on a t-distributed stochastic neighbor embedding (t-SNE) algorithm. Using multivariate analysis of variance (MANOVA) discriminant analysis, cell-cycle phases can also be predicted label-free with high accuracy at >90% in G1 and G2 phase, and >80% in S phase. We anticipate that high-throughput label-free cell cycle characterization could open new approaches for large-scale single-cell analysis, bringing new mechanistic insights into complex biological processes including disease pathogenesis.
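
    The label-free phase prediction described above combines a low-dimensional embedding for visualization with a discriminant classifier. The sketch below uses scikit-learn's t-SNE and linear discriminant analysis as stand-ins (the paper used MANOVA discriminant analysis); the feature matrix and phase labels are randomly generated placeholders, not real measurements.

```python
import numpy as np
from sklearn.manifold import TSNE
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Placeholder feature table: one row per cell, columns are biophysical
# phenotypes (size, dry mass density, texture, ...) from amplitude and
# phase images; labels are cell-cycle phases.
features = np.random.rand(1000, 24)
phases = np.random.choice(["G1", "S", "G2"], size=1000)

# 2-D embedding to visualize cell-cycle progression
embedding = TSNE(n_components=2, perplexity=30).fit_transform(features)

# Label-free phase prediction with a discriminant classifier
clf = LinearDiscriminantAnalysis()
print(cross_val_score(clf, features, phases, cv=5).mean())
```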

  13. Quantitative imaging of turbulent and reacting flows

    Energy Technology Data Exchange (ETDEWEB)

    Paul, P.H. [Sandia National Laboratories, Livermore, CA (United States)

    1993-12-01

    Quantitative digital imaging, using planar laser light scattering techniques is being developed for the analysis of turbulent and reacting flows. Quantitative image data, implying both a direct relation to flowfield variables as well as sufficient signal and spatial dynamic range, can be readily processed to yield two-dimensional distributions of flowfield scalars and in turn two-dimensional images of gradients and turbulence scales. Much of the development of imaging techniques to date has concentrated on understanding the requisite molecular spectroscopy and collision dynamics to be able to determine how flowfield variable information is encoded into the measured signal. From this standpoint the image is seen as a collection of single point measurements. The present effort aims at realizing necessary improvements in signal and spatial dynamic range, signal-to-noise ratio and spatial resolution in the imaging system as well as developing excitation/detection strategies which provide for a quantitative measure of particular flowfield scalars. The standard camera used for the study is an intensified CCD array operated in a conventional video format. The design of the system was based on detailed modeling of signal and image transfer properties of fast UV imaging lenses, image intensifiers and CCD detector arrays. While this system is suitable for direct scalar imaging, derived quantities (e.g. temperature or velocity images) require an exceptionally wide dynamic range imaging detector. To apply these diagnostics to reacting flows also requires a very fast shuttered camera. The authors have developed and successfully tested a new type of gated low-light level detector. This system relies on fast switching of proximity focused image-diode which is direct fiber-optic coupled to a cooled CCD array. Tests on this new detector show significant improvements in detection limit, dynamic range and spatial resolution as compared to microchannel plate intensified arrays.

  14. Is there a critical lesion site for unilateral spatial neglect? A meta-analysis using activation likelihood estimation.

    Directory of Open Access Journals (Sweden)

    Pascal eMolenberghs

    2012-04-01

    Full Text Available The critical lesion site responsible for the syndrome of unilateral spatial neglect has been debated for more than a decade. Here we performed an activation likelihood estimation (ALE) to provide for the first time an objective quantitative index of the consistency of lesion sites across anatomical group studies of spatial neglect. The analysis revealed several distinct regions in which damage has consistently been associated with spatial neglect symptoms. Lesioned clusters were located in several cortical and subcortical regions of the right hemisphere, including the middle and superior temporal gyrus, inferior parietal lobule, intraparietal sulcus, precuneus, middle occipital gyrus, caudate nucleus and posterior insula, as well as in the white matter pathway corresponding to the posterior part of the superior longitudinal fasciculus. Further analyses suggested that separate lesion sites are associated with impairments in different behavioural tests, such as line bisection and target cancellation. Similarly, specific subcomponents of the heterogeneous neglect syndrome, such as extinction and allocentric and personal neglect, are associated with distinct lesion sites. Future progress in delineating the neuropathological correlates of spatial neglect will depend upon the development of more refined measures of perceptual and cognitive functions than those currently available in the clinical setting.

  15. Research on the spatial analysis method of seismic hazard for island

    International Nuclear Information System (INIS)

    Jia, Jing; Jiang, Jitong; Zheng, Qiuhong; Gao, Huiying

    2017-01-01

    Seismic hazard analysis (SHA) is a key component of earthquake disaster prevention for island engineering: at the micro scale its results provide parameters for seismic design, and at the macro scale it is prerequisite work for the earthquake and comprehensive disaster prevention planning within island conservation planning, during the exploitation and construction of both inhabited and uninhabited islands. The existing seismic hazard analysis methods are compared in terms of their application, and their applicability and limitations for islands are analysed. A specialized spatial analysis method of seismic hazard for islands (SAMSHI) is then given to support further work on earthquake disaster prevention planning, based on spatial analysis tools in GIS and a fuzzy comprehensive evaluation model. The basic spatial database of SAMSHI includes fault data, historical earthquake records, geological data and Bouguer gravity anomaly data, which are the data sources for the 11 indices of the fuzzy comprehensive evaluation model; these indices are calculated by the spatial analysis model constructed on ArcGIS’s Model Builder platform. (paper)

  16. Research on the spatial analysis method of seismic hazard for island

    Science.gov (United States)

    Jia, Jing; Jiang, Jitong; Zheng, Qiuhong; Gao, Huiying

    2017-05-01

    Seismic hazard analysis (SHA) is a key component of earthquake disaster prevention for island engineering: at the micro scale its results provide parameters for seismic design, and at the macro scale it is prerequisite work for the earthquake and comprehensive disaster prevention planning within island conservation planning, during the exploitation and construction of both inhabited and uninhabited islands. The existing seismic hazard analysis methods are compared in terms of their application, and their applicability and limitations for islands are analysed. A specialized spatial analysis method of seismic hazard for islands (SAMSHI) is then given to support further work on earthquake disaster prevention planning, based on spatial analysis tools in GIS and a fuzzy comprehensive evaluation model. The basic spatial database of SAMSHI includes fault data, historical earthquake records, geological data and Bouguer gravity anomaly data, which are the data sources for the 11 indices of the fuzzy comprehensive evaluation model; these indices are calculated by the spatial analysis model constructed on ArcGIS’s Model Builder platform.
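
    The fuzzy comprehensive evaluation step named in the two records above amounts to weighting per-index membership degrees and aggregating them into hazard-grade scores for each spatial unit. The sketch below uses the common weighted-average operator; the indices, weights and number of grades are illustrative placeholders, not the 11 indices used by SAMSHI.

```python
import numpy as np

def fuzzy_comprehensive_evaluation(memberships, weights):
    """Weighted fuzzy comprehensive evaluation for one spatial unit.

    `memberships` is (n_indices, n_hazard_grades): the degree to which each
    index (e.g. fault density, historical seismicity) supports each hazard
    grade; `weights` is (n_indices,). Returns normalized grade scores.
    """
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()
    grade_scores = weights @ np.asarray(memberships, dtype=float)
    return grade_scores / grade_scores.sum()

# Example: 3 hypothetical indices, 4 hazard grades for one grid cell
m = [[0.1, 0.3, 0.4, 0.2],
     [0.0, 0.2, 0.5, 0.3],
     [0.2, 0.4, 0.3, 0.1]]
print(fuzzy_comprehensive_evaluation(m, weights=[0.5, 0.3, 0.2]))
```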

  17. Quantitative scenario analysis of low and intermediate level radioactive repository

    International Nuclear Information System (INIS)

    Lee, Keon Jae; Lee, Sang Yoon; Park, Keon Baek; Song, Min Cheon; Lee, Ho Jin

    1998-03-01

    The derivation of scenarios for a hypothetical radioactive waste disposal facility is conducted through sub-component characteristic analysis and conceptual modeling. Quantitative analysis of the constructed scenarios is then carried out in terms of annual effective dose equivalent. The study proceeds sequentially through the performance assessment of the radioactive waste disposal facility: groundwater flow analysis, source term analysis, groundwater transport, surface water transport, and dose and pathways. The routine program module VAM2D-PAGAN-GENII is used for the quantitative scenario analysis. Detailed data used in this module come from experimental data for Korean territory and default data provided within the module. Where data required for code execution are missing, they are estimated through reasonable engineering judgement.

  18. Quantitative radiography

    International Nuclear Information System (INIS)

    Brase, J.M.; Martz, H.E.; Waltjen, K.E.; Hurd, R.L.; Wieting, M.G.

    1986-01-01

    Radiographic techniques have been used in nondestructive evaluation primarily to develop qualitative information (i.e., defect detection). This project applies and extends the techniques developed in medical x-ray imaging, particularly computed tomography (CT), to develop quantitative information (both spatial dimensions and material quantities) on the three-dimensional (3D) structure of solids. Accomplishments in FY 86 include (1) improvements in experimental equipment - an improved microfocus system that will give 20-μm resolution and has potential for increased imaging speed, and (2) development of a simple new technique for displaying 3D images so as to clearly show the structure of the object. Image reconstruction and data analysis for a series of synchrotron CT experiments conducted by LLNL's Chemistry Department has begun

  19. Microchromatography of hemoglobins. VIII. A general qualitative and quantitative method in plastic drinking straws and the quantitative analysis of Hb-F.

    Science.gov (United States)

    Schroeder, W A; Pace, L A

    1978-03-01

    The microchromatographic procedure for the quantitative analysis of the hemoglobin components in a hemolysate uses columns of DEAE-cellulose in a plastic drinking straw with a glycine-KCN-NaCl developer. Not only may the method be used for the quantitative analysis of Hb-F but also for the analysis of the varied components in mixtures of hemoglobins.

  20. Quantitative genetic analysis of total glucosinolate, oil and protein ...

    African Journals Online (AJOL)

    Quantitative genetic analysis of total glucosinolate, oil and protein contents in Ethiopian mustard (Brassica carinata A. Braun) ... Seeds were analyzed using HPLC (glucosinolates), NMR (oil) and NIRS (protein). Analyses of variance, Hayman's method of diallel analysis and a mixed linear model of genetic analysis were ...

  1. Quantiprot - a Python package for quantitative analysis of protein sequences.

    Science.gov (United States)

    Konopka, Bogumił M; Marciniak, Marta; Dyrka, Witold

    2017-07-17

    The field of protein sequence analysis is dominated by tools rooted in substitution matrices and alignments. A complementary approach is provided by methods of quantitative characterization. A major advantage of this approach is that quantitative properties define a multidimensional solution space in which sequences can be related to each other and differences can be meaningfully interpreted. Quantiprot is a software package in Python which provides a simple and consistent interface to multiple methods for quantitative characterization of protein sequences. The package can be used to calculate dozens of characteristics directly from sequences or using physico-chemical properties of amino acids. Besides basic measures, Quantiprot performs quantitative analysis of recurrence and determinism in the sequence, calculates the distribution of n-grams and computes the Zipf's law coefficient. We propose three main fields of application of the Quantiprot package. First, quantitative characteristics can be used in alignment-free similarity searches and in clustering of large and/or divergent sequence sets. Second, a feature space defined by quantitative properties can be used in comparative studies of protein families and organisms. Third, the feature space can be used for evaluating generative models, where a large number of sequences generated by the model can be compared to actually observed sequences.
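
    As a rough illustration of the kind of quantitative characteristics mentioned above (and not using the Quantiprot API itself), the sketch below counts overlapping n-grams in a toy protein sequence and fits a Zipf's-law coefficient to their rank-frequency curve.

```python
# A sketch (not the Quantiprot API) of two quantitative characteristics: the
# n-gram distribution of a protein sequence and a Zipf's-law coefficient
# fitted to the rank-frequency curve of those n-grams. The sequence is a toy.
from collections import Counter
import numpy as np

sequence = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQAPILSRVGDGTQDNLSGAEKAVQVKVKALPDAQ"

def ngram_counts(seq, n=2):
    """Count overlapping n-grams of amino-acid letters."""
    return Counter(seq[i:i + n] for i in range(len(seq) - n + 1))

counts = ngram_counts(sequence, n=2)

# Zipf's law: frequency ~ rank^(-s). Estimate s from the slope of
# log(frequency) versus log(rank).
freqs = np.array(sorted(counts.values(), reverse=True), dtype=float)
ranks = np.arange(1, len(freqs) + 1)
slope, intercept = np.polyfit(np.log(ranks), np.log(freqs), 1)

print("distinct 2-grams:", len(counts))
print("estimated Zipf coefficient:", round(-slope, 3))
```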

  2. Spatially explicit spectral analysis of point clouds and geospatial data

    Science.gov (United States)

    Buscombe, Daniel D.

    2015-01-01

    The increasing use of spatially explicit analyses of high-resolution spatially distributed data (imagery and point clouds) for the purposes of characterising spatial heterogeneity in geophysical phenomena necessitates the development of custom analytical and computational tools. In recent years, such analyses have become the basis of, for example, automated texture characterisation and segmentation, roughness and grain size calculation, and feature detection and classification, from a variety of data types. In this work, much use has been made of statistical descriptors of localised spatial variations in amplitude variance (roughness), however the horizontal scale (wavelength) and spacing of roughness elements is rarely considered. This is despite the fact that the ratio of characteristic vertical to horizontal scales is not constant and can yield important information about physical scaling relationships. Spectral analysis is a hitherto under-utilised but powerful means to acquire statistical information about relevant amplitude and wavelength scales, simultaneously and with computational efficiency. Further, quantifying spatially distributed data in the frequency domain lends itself to the development of stochastic models for probing the underlying mechanisms which govern the spatial distribution of geological and geophysical phenomena. The software package PySESA (Python program for Spatially Explicit Spectral Analysis) has been developed for generic analyses of spatially distributed data in both the spatial and frequency domains. Developed predominantly in Python, it accesses libraries written in Cython and C++ for efficiency. It is open source and modular, therefore readily incorporated into, and combined with, other data analysis tools and frameworks with particular utility for supporting research in the fields of geomorphology, geophysics, hydrography, photogrammetry and remote sensing. The analytical and computational structure of the toolbox is described.

  3. Spatially explicit spectral analysis of point clouds and geospatial data

    Science.gov (United States)

    Buscombe, Daniel

    2016-01-01

    The increasing use of spatially explicit analyses of high-resolution spatially distributed data (imagery and point clouds) for the purposes of characterising spatial heterogeneity in geophysical phenomena necessitates the development of custom analytical and computational tools. In recent years, such analyses have become the basis of, for example, automated texture characterisation and segmentation, roughness and grain size calculation, and feature detection and classification, from a variety of data types. In this work, much use has been made of statistical descriptors of localised spatial variations in amplitude variance (roughness), however the horizontal scale (wavelength) and spacing of roughness elements is rarely considered. This is despite the fact that the ratio of characteristic vertical to horizontal scales is not constant and can yield important information about physical scaling relationships. Spectral analysis is a hitherto under-utilised but powerful means to acquire statistical information about relevant amplitude and wavelength scales, simultaneously and with computational efficiency. Further, quantifying spatially distributed data in the frequency domain lends itself to the development of stochastic models for probing the underlying mechanisms which govern the spatial distribution of geological and geophysical phenomena. The software package PySESA (Python program for Spatially Explicit Spectral Analysis) has been developed for generic analyses of spatially distributed data in both the spatial and frequency domains. Developed predominantly in Python, it accesses libraries written in Cython and C++ for efficiency. It is open source and modular, therefore readily incorporated into, and combined with, other data analysis tools and frameworks with particular utility for supporting research in the fields of geomorphology, geophysics, hydrography, photogrammetry and remote sensing. The analytical and computational structure of the toolbox is described
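
    A minimal, library-agnostic sketch of the core idea (this is not PySESA code): compute the 2D power spectrum of a gridded surface patch and radially average it, so that roughness amplitude can be related to horizontal wavelength. The synthetic surface, grid spacing and bin count below are assumptions for illustration only.

```python
# Radially averaged power spectrum of a gridded surface patch (illustrative).
import numpy as np

rng = np.random.default_rng(0)
n = 128
dx = 0.1  # grid spacing in metres (assumed)
x, y = np.meshgrid(np.arange(n) * dx, np.arange(n) * dx)
surface = 0.05 * np.sin(2 * np.pi * x / 2.0) + 0.01 * rng.standard_normal((n, n))

# Detrend (remove the mean) and compute the 2D power spectrum.
z = surface - surface.mean()
power = np.abs(np.fft.fftshift(np.fft.fft2(z))) ** 2

# Radial wavenumber (cycles per metre) of every spectral bin.
k = np.fft.fftshift(np.fft.fftfreq(n, d=dx))
kx, ky = np.meshgrid(k, k)
kr = np.hypot(kx, ky)

# Radially averaged spectrum: mean power inside concentric wavenumber annuli.
edges = np.linspace(0, kr.max(), 40)
centres = 0.5 * (edges[:-1] + edges[1:])
which = np.digitize(kr.ravel(), edges)
radial_power = np.array([power.ravel()[which == i + 1].mean()
                         for i in range(len(centres))])

dominant_k = centres[np.nanargmax(radial_power)]
print("dominant wavelength ~", round(1.0 / dominant_k, 2), "m")
```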

  4. The usefulness of 3D quantitative analysis with using MRI for measuring osteonecrosis of the femoral head

    International Nuclear Information System (INIS)

    Hwang, Ji Young; Lee, Sun Wha; Park, Youn Soo

    2006-01-01

    We wanted to evaluate the usefulness of MRI 3D quantitative analysis for measuring osteonecrosis of the femoral head, in comparison with MRI 2D quantitative analysis and quantitative analysis of the specimen. Over a period of 3 months at our hospital, 14 femoral head specimens with osteonecrosis were obtained after total hip arthroplasty. The patients' preoperative MRIs were retrospectively reviewed for quantitative analysis of the size of the necrosis. Each necrotic fraction of the femoral head was measured by 2D quantitative analysis using mid-coronal and mid-sagittal MR images, and by 3D quantitative analysis using serial continuous coronal MR images and 3D reconstruction software. The necrotic fraction of the specimen was physically measured by the fluid displacement method. The necrotic fraction according to MRI 2D or 3D quantitative analysis was compared with that of the specimen by using Spearman's correlation test. On the correlative analysis, the necrotic fraction by MRI 2D quantitative analysis and quantitative analysis of the specimen showed moderate correlation (r = 0.657); on the other hand, the necrotic fraction by MRI 3D quantitative analysis and quantitative analysis of the specimen demonstrated a strong correlation (r = 0.952) (p < 0.05). MRI 3D quantitative analysis was more accurate than 2D quantitative analysis using MRI for measuring osteonecrosis of the femoral head. Therefore, it may be useful for predicting the clinical outcome and deciding the proper treatment option.
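
    The comparison step described above reduces to a rank correlation between two sets of necrotic fractions; a minimal sketch with hypothetical fraction values (not the study's data) is shown below.

```python
# Spearman rank correlation between specimen-measured and MRI-estimated
# necrotic fractions. The values are hypothetical placeholders.
from scipy.stats import spearmanr

specimen_fraction = [0.12, 0.25, 0.31, 0.40, 0.18, 0.55, 0.47]   # fluid-displacement measurement
mri_3d_fraction   = [0.10, 0.27, 0.33, 0.38, 0.20, 0.52, 0.50]   # 3D MRI estimate

rho, p_value = spearmanr(specimen_fraction, mri_3d_fraction)
print(f"Spearman rho = {rho:.3f}, p = {p_value:.4f}")
```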

  5. Application of Fourier analysis to multispectral/spatial recognition

    Science.gov (United States)

    Hornung, R. J.; Smith, J. A.

    1973-01-01

    One approach for investigating spectral response from materials is to consider spatial features of the response. This might be accomplished by considering the Fourier spectrum of the spatial response. The Fourier Transform may be used in a one-dimensional to multidimensional analysis of more than one channel of data. The two-dimensional transform represents the Fraunhofer diffraction pattern of the image in optics and has certain invariant features. Physically the diffraction pattern contains spatial features which are possibly unique to a given configuration or classification type. Different sampling strategies may be used to either enhance geometrical differences or extract additional features.

  6. What Really Happens in Quantitative Group Research? Results of a Content Analysis of Recent Quantitative Research in "JSGW"

    Science.gov (United States)

    Boyle, Lauren H.; Whittaker, Tiffany A.; Eyal, Maytal; McCarthy, Christopher J.

    2017-01-01

    The authors conducted a content analysis on quantitative studies published in "The Journal for Specialists in Group Work" ("JSGW") between 2012 and 2015. This brief report provides a general overview of the current practices of quantitative group research in counseling. The following study characteristics are reported and…

  7. Quantitative proteomic analysis of post-translational modifications of human histones

    DEFF Research Database (Denmark)

    Beck, Hans Christian; Nielsen, Eva C; Matthiesen, Rune

    2006-01-01

    , and H4 in a site-specific and dose-dependent manner. This unbiased analysis revealed that a relative increase in acetylated peptide from the histone variants H2A, H2B, and H4 was accompanied by a relative decrease of dimethylated Lys(57) from histone H2B. The dose-response results obtained...... by quantitative proteomics of histones from HDACi-treated cells were consistent with Western blot analysis of histone acetylation, cytotoxicity, and dose-dependent expression profiles of p21 and cyclin A2. This demonstrates that mass spectrometry-based quantitative proteomic analysis of post-translational...

  8. Data from quantitative label free proteomics analysis of rat spleen.

    Science.gov (United States)

    Dudekula, Khadar; Le Bihan, Thierry

    2016-09-01

    The dataset presented in this work was obtained using a label-free quantitative proteomic analysis of rat spleen. A robust method for extraction of proteins from rat spleen tissue and LC-MS/MS analysis was developed using a urea- and SDS-based buffer. Different fractionation methods were compared. A total of 3484 different proteins were identified from the pool of all experiments run in this study (2460 proteins with at least two peptides). A total of 1822 proteins were identified from nine non-fractionated pulse gels, while 2288 and 2864 proteins were identified by SDS-PAGE fractionation into three and five fractions, respectively. The proteomics data are deposited in the ProteomeXchange Consortium via PRIDE (PXD003520); Progenesis and MaxQuant outputs are presented in the supporting information. The lists of proteins generated under different fractionation regimes allow assessing the nature of the identified proteins and the variability in the quantitative analysis associated with the different sampling strategies, and allow defining a proper number of replicates for future quantitative analysis.

  9. A synchrotron-based local computed tomography combined with data-constrained modelling approach for quantitative analysis of anthracite coal microstructure

    International Nuclear Information System (INIS)

    Chen, Wen Hao; Yang, Sam Y. S.; Xiao, Ti Qiao; Mayo, Sherry C.; Wang, Yu Dan; Wang, Hai Peng

    2014-01-01

    A quantitative local computed tomography combined with data-constrained modelling has been developed. The method distinctly improves the spatial resolution and the composition resolution in a sample larger than the field of view, for quantitative characterization of three-dimensional distributions of material compositions and voids. Quantifying three-dimensional spatial distributions of pores and material compositions in samples is a key materials characterization challenge, particularly in samples where compositions are distributed across a range of length scales, and where such compositions have similar X-ray absorption properties, such as in coal. Consequently, obtaining detailed information within sub-regions of a multi-length-scale sample by conventional approaches may not provide the resolution and level of detail one might desire. Herein, an approach for quantitative high-definition determination of material compositions from X-ray local computed tomography combined with a data-constrained modelling method is proposed. The approach is capable of dramatically improving the spatial resolution and of revealing finer details within a region of interest of a sample larger than the field of view than can be obtained using conventional techniques. A coal sample containing distributions of porosity and several mineral compositions is employed to demonstrate the approach. The optimal experimental parameters are pre-analyzed. The quantitative results demonstrated that the approach can reveal significantly finer details of compositional distributions in the sample region of interest. The elevated spatial resolution is crucial for coal-bed methane reservoir evaluation and understanding the transformation of the minerals during coal processing. The method is generic and can be applied for three-dimensional compositional characterization of other materials.

  10. Spatial gene expression quantification: a tool for analysis of in situ hybridizations in sea anemone Nematostella vectensis

    Directory of Open Access Journals (Sweden)

    Botman Daniel

    2012-10-01

    Background: Spatial gene expression quantification is required for modeling gene regulation in developing organisms. The fruit fly Drosophila melanogaster is the model system most widely applied for spatial gene expression analysis due to its unique embryonic properties: the shape does not change significantly during its early cleavage cycles and most genes are differentially expressed along a straight axis. This system of development is quite exceptional in the animal kingdom. In the sea anemone Nematostella vectensis the embryo changes its shape during early development; there are cell divisions and cell movement, as in most other metazoans. Nematostella is an attractive case study for spatial gene expression since its transparent body wall makes it accessible to various imaging techniques. Findings: Our new quantification method produces standardized gene expression profiles from raw or annotated Nematostella in situ hybridizations by measuring the expression intensity along its cell layer. The procedure is based on digital morphologies derived from high-resolution fluorescence pictures. Additionally, complete descriptions of nonsymmetric expression patterns have been constructed by transforming the gene expression images into a three-dimensional representation. Conclusions: We created a standard format for gene expression data, which enables quantitative analysis of in situ hybridizations from embryos with various shapes in different developmental stages. The obtained expression profiles are suitable as input for optimization of gene regulatory network models, and for correlation analysis of genes from dissimilar Nematostella morphologies. This approach is potentially applicable to many other metazoan model organisms and may also be suitable for processing data from three-dimensional imaging techniques.

  11. Quantitative risk analysis for landslides ‒ Examples from Bíldudalur, NW-Iceland

    Directory of Open Access Journals (Sweden)

    R. Bell

    2004-01-01

    Although various methods to carry out quantitative landslide risk analyses are available, applications are still rare and mostly dependent on the occurrence of disasters. In Iceland, two catastrophic snow avalanches killed 34 people in 1995. As a consequence, the Ministry of the Environment issued a new regulation on hazard zoning due to snow avalanches and landslides in 2000, which aims to prevent people from living or working within the areas most at risk until 2010. The regulation requires landslide and snow avalanche risk analyses to be carried out; however, a method to calculate landslide risk adapted to Icelandic conditions is still missing. The ultimate goal of this study is therefore to develop such a method for landslides, focussing on debris flows and rock falls, and to test it in Bíldudalur, NW-Iceland. Risk analysis, besides risk evaluation and risk management, is part of the holistic concept of risk assessment. Within this study only risk analysis is considered, focussing on the risks to life. To calculate landslide risk, the spatial and temporal probability of occurrence of potentially damaging events, as well as the distribution of the elements at risk in space and time, considering also changing vulnerabilities, must be determined. Within this study a new raster-based approach is developed: all existing vector data are transferred into raster data using a resolution of 1 m x 1 m, and the specific attribute data are attributed to the grid cells, resulting in specific raster data layers for each input parameter. The calculation of the landslide risk follows a function of the input parameters hazard, damage potential of the elements at risk, vulnerability, probability of the spatial impact, probability of the temporal impact and probability of the seasonal occurrence. Finally, results are upscaled to a resolution of 20 m x 20 m and are presented as individual risk to life and object risk to life for each process. Within the quantitative …
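
    A schematic sketch of the per-cell raster computation described above is given below. The exact functional form used in the study is not reproduced; the multiplicative combination of hazard, impact probabilities, vulnerability and damage potential is a common simplification, and all grids are random placeholders.

```python
# Schematic raster risk-to-life computation on a 1 m x 1 m grid, upscaled to
# 20 m x 20 m by block averaging. All inputs are random placeholders.
import numpy as np

rng = np.random.default_rng(1)
shape = (100, 100)                       # 100 m x 100 m toy raster

hazard            = rng.uniform(0.0, 0.01, shape)   # annual probability of a damaging event
p_spatial_impact  = rng.uniform(0.0, 1.0, shape)    # probability the event reaches the cell
p_temporal_impact = rng.uniform(0.0, 1.0, shape)    # probability a person is present
p_seasonal        = rng.uniform(0.5, 1.0, shape)    # probability of seasonal occurrence
vulnerability     = rng.uniform(0.0, 1.0, shape)    # probability of loss of life if hit
persons_at_risk   = rng.integers(0, 3, shape)       # damage potential (people per cell)

# Individual risk to life per cell (probability per year), then object risk.
individual_risk = hazard * p_spatial_impact * p_temporal_impact * p_seasonal * vulnerability
object_risk = individual_risk * persons_at_risk

# Upscale from 1 m to 20 m cells by block-averaging, as in the final risk maps.
coarse = object_risk.reshape(5, 20, 5, 20).mean(axis=(1, 3))
print("max individual risk:", individual_risk.max())
print("coarse (20 m) object-risk grid shape:", coarse.shape)
```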

  12. Quantitative analysis of some brands of chloroquine tablets ...

    African Journals Online (AJOL)

    Quantitative analysis of some brands of chloroquine tablets marketed in Maiduguri using spectrophotometric ... and compared with that of the standard, wavelength of maximum absorbance at 331nm for chloroquine. ...

  13. Accurate quantitative XRD phase analysis of cement clinkers

    International Nuclear Information System (INIS)

    Kern, A.

    2002-01-01

    Full text: Knowledge about the absolute phase abundance in cement clinkers is a requirement for both research and quality control. Traditionally, quantitative analysis of cement clinkers has been carried out by theoretical normative calculation from chemical analysis using the so-called Bogue method, or by optical microscopy. Chemical analysis, mostly performed by X-ray fluorescence (XRF), therefore forms the basis of cement plant control by providing information for proportioning raw materials, adjusting kiln and burning conditions, as well as cement mill feed proportioning. In addition, XRF is of highest importance with respect to the environmentally relevant control of waste recovery raw materials and alternative fuels, as well as filters, plants and sewage. However, the performance of clinkers and cements is governed by the mineralogy and not the elemental composition, and the deficiencies and inherent errors of the Bogue method as well as of microscopic point counting are well known. With XRD and Rietveld analysis a full quantitative analysis of cement clinkers can be performed, providing detailed mineralogical information about the product. Until recently, several disadvantages prevented the frequent application of the Rietveld method in the cement industry. As the measurement of a full pattern is required, extended measurement times made an integration of this method into existing automation environments difficult. In addition, several drawbacks of existing Rietveld software, such as complexity, low performance and severe numerical instability, were prohibitive for automated use. The latest developments in on-line instrumentation, as well as dedicated Rietveld software for quantitative phase analysis (TOPAS), now make a decisive breakthrough possible. TOPAS not only allows the analysis of extremely complex phase mixtures in the shortest time possible, but also a fully automated online phase analysis for production control and quality management, free of any human interaction.

  14. Detecting the Land-Cover Changes Induced by Large-Physical Disturbances Using Landscape Metrics, Spatial Sampling, Simulation and Spatial Analysis

    Directory of Open Access Journals (Sweden)

    Hone-Jay Chu

    2009-08-01

    The objectives of the study are to integrate conditional Latin Hypercube Sampling (cLHS), sequential Gaussian simulation (SGS) and spatial analysis of remotely sensed images, to monitor the effects of large chronological disturbances on spatial characteristics of landscape changes, including spatial heterogeneity and variability. The multiple NDVI images demonstrate that spatial patterns of disturbed landscapes were successfully delineated by spatial analyses such as the variogram, Moran's I and landscape metrics in the study area. The hybrid method delineates the spatial patterns and spatial variability of landscapes caused by these large disturbances. The cLHS approach is applied to select samples from Normalized Difference Vegetation Index (NDVI) images derived from SPOT HRV images in the Chenyulan watershed of Taiwan, and SGS with sufficient samples is then used to generate maps of the NDVI images. Finally, the simulated NDVI maps are verified using indices such as the correlation coefficient and the mean absolute error (MAE). The statistics and spatial structures of the multiple NDVI images therefore present a very robust behavior, which advocates the use of the index for the quantification of landscape spatial patterns and land cover change. In addition, the results transferred by Open Geospatial techniques can be accessed from web-based and end-user applications for watershed management.
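
    One of the spatial-structure descriptors named above, the empirical semivariogram, can be computed from point samples as in the sketch below; the coordinates and NDVI-like values are synthetic stand-ins for the cLHS-selected samples.

```python
# Isotropic empirical semivariogram from point samples (synthetic data).
import numpy as np

rng = np.random.default_rng(2)
n = 300
coords = rng.uniform(0, 1000, size=(n, 2))          # sample locations in metres
values = np.sin(coords[:, 0] / 200.0) + 0.1 * rng.standard_normal(n)  # toy NDVI-like field

# Pairwise distances and semivariances.
d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
g = 0.5 * (values[:, None] - values[None, :]) ** 2

# Average semivariance in distance bins (lags), using upper-triangle pairs only.
iu = np.triu_indices(n, k=1)
lags = np.linspace(0, 500, 11)
lag_centres = 0.5 * (lags[:-1] + lags[1:])
which = np.digitize(d[iu], lags)
semivariogram = np.array([g[iu][which == i + 1].mean() if np.any(which == i + 1) else np.nan
                          for i in range(len(lag_centres))])

for h, s in zip(lag_centres, semivariogram):
    print(f"lag {h:5.0f} m  semivariance {s:.4f}")
```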

  15. Sex differences in visual-spatial working memory: A meta-analysis.

    Science.gov (United States)

    Voyer, Daniel; Voyer, Susan D; Saint-Aubin, Jean

    2017-04-01

    Visual-spatial working memory measures are widely used in clinical and experimental settings. Furthermore, it has been argued that the male advantage in spatial abilities can be explained by a sex difference in visual-spatial working memory. Therefore, sex differences in visual-spatial working memory have important implication for research, theory, and practice, but they have yet to be quantified. The present meta-analysis quantified the magnitude of sex differences in visual-spatial working memory and examined variables that might moderate them. The analysis used a set of 180 effect sizes from healthy males and females drawn from 98 samples ranging in mean age from 3 to 86 years. Multilevel meta-analysis was used on the overall data set to account for non-independent effect sizes. The data also were analyzed in separate task subgroups by means of multilevel and mixed-effects models. Results showed a small but significant male advantage (mean d = 0.155, 95 % confidence interval = 0.087-0.223). All the tasks produced a male advantage, except for memory for location, where a female advantage emerged. Age of the participants was a significant moderator, indicating that sex differences in visual-spatial working memory appeared first in the 13-17 years age group. Removing memory for location tasks from the sample affected the pattern of significant moderators. The present results indicate a male advantage in visual-spatial working memory, although age and specific task modulate the magnitude and direction of the effects. Implications for clinical applications, cognitive model building, and experimental research are discussed.
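
    The basic aggregation behind such a meta-analysis is an inverse-variance weighted mean effect size; the sketch below shows this step with invented per-study values and omits the multilevel and mixed-effects modelling used in the actual analysis.

```python
# Inverse-variance weighted mean effect size (Cohen's d) with a 95% CI.
# Effect sizes and variances are invented for illustration.
import numpy as np

d = np.array([0.21, 0.05, 0.30, -0.10, 0.18, 0.25])       # per-study effect sizes
var = np.array([0.02, 0.015, 0.03, 0.025, 0.01, 0.02])    # per-study sampling variances

w = 1.0 / var                                   # inverse-variance weights
d_mean = np.sum(w * d) / np.sum(w)
se = np.sqrt(1.0 / np.sum(w))
ci = (d_mean - 1.96 * se, d_mean + 1.96 * se)

print(f"weighted mean d = {d_mean:.3f}, 95% CI = ({ci[0]:.3f}, {ci[1]:.3f})")
```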

  16. Global scaling for semi-quantitative analysis in FP-CIT SPECT.

    Science.gov (United States)

    Kupitz, D; Apostolova, I; Lange, C; Ulrich, G; Amthauer, H; Brenner, W; Buchert, R

    2014-01-01

    Semi-quantitative characterization of dopamine transporter availability from single photon emission computed tomography (SPECT) with 123I-ioflupane (FP-CIT) is based on uptake ratios relative to a reference region. The aim of this study was to evaluate the whole brain as reference region for semi-quantitative analysis of FP-CIT SPECT. The rationale was that this might reduce the statistical noise associated with the estimation of non-displaceable FP-CIT uptake. 150 FP-CIT SPECTs were categorized as neurodegenerative or non-neurodegenerative by an expert. Semi-quantitative analysis of specific binding ratios (SBR) was performed with a custom-made tool based on the Statistical Parametric Mapping software package using predefined regions of interest (ROIs) in the anatomical space of the Montreal Neurological Institute. The following reference regions were compared: predefined ROIs for the frontal and occipital lobes and the whole brain (without striata, thalamus and brainstem). Tracer uptake in the reference region was characterized by the mean, median or 75th percentile of its voxel intensities. The area under the receiver operating characteristic curve (AUC) was used as the performance measure. The highest AUC of 0.973 was achieved by the SBR of the putamen with the 75th percentile in the whole brain as reference. The lowest AUC for the putamen SBR, 0.937, was obtained with the mean in the frontal lobe as reference. We recommend the 75th percentile in the whole brain as reference for semi-quantitative analysis in FP-CIT SPECT. This combination provided the best agreement of the semi-quantitative analysis with visual evaluation of the SPECT images by an expert and, therefore, is appropriate to support less experienced physicians.
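
    The specific binding ratio computation described above can be sketched as follows; the voxel intensities are random placeholders rather than real SPECT data, and the ROI definition step is omitted.

```python
# Specific binding ratio (SBR) of the putamen referenced to the 75th percentile
# of whole-brain (non-striatal) voxel intensities. Voxel values are synthetic.
import numpy as np

rng = np.random.default_rng(3)
putamen_voxels     = rng.normal(loc=40.0, scale=4.0, size=500)     # counts in putamen ROI
whole_brain_voxels = rng.normal(loc=10.0, scale=2.0, size=50000)   # reference region voxels

reference = np.percentile(whole_brain_voxels, 75)   # non-displaceable uptake estimate
sbr_putamen = (putamen_voxels.mean() - reference) / reference

print(f"reference (75th percentile) = {reference:.2f}")
print(f"putamen SBR = {sbr_putamen:.2f}")
```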

  17. Spatially resolved element analysis of historical violin varnishes by use of μPIXE.

    Science.gov (United States)

    von Bohlen, Alex; Röhrs, Stefan; Salomon, Joseph

    2007-02-01

    External μPIXE has been used for characterisation of small samples of varnish from historical violins, and of pieces of varnished wood from historical and modern stringed instruments. To obtain spatially resolved information about the distribution of elements across the varnish layers, single-spot analysis, line-scans, and area-mapping were performed. A local resolution of approximately 20 μm was obtained from the 3 MeV, 1 nA proton micro-probe. Results from simultaneous multi-element determination of Na, Mg, Al, Si, P, S, K, Ca, Ti, V, Cr, Mn, Fe, Co, Ni, Cu, Zn, As, Br, Rb, Sr, Ag, Cd, Sn, Ba, and Pb in historical varnishes are presented. Semi-quantitative evaluation of line-scans recorded on diverse historical varnishes is reported. The applied method is discussed in detail and the results obtained are critically reviewed and compared with those in the literature.

  18. Spatial analysis of hemorrhagic fever with renal syndrome in China

    Directory of Open Access Journals (Sweden)

    Yang Hong

    2006-04-01

    Background: Hemorrhagic fever with renal syndrome (HFRS) is endemic, with high incidence, in many provinces of mainland China, although integrated intervention measures including rodent control, environment management and vaccination have been implemented for over ten years. In this study, we conducted a geographic information system (GIS)-based spatial analysis of the distribution of HFRS cases for the whole country, with the objective of informing priority areas for public health planning and resource allocation. Methods: Annualized average incidence at the county level was calculated using HFRS cases reported during 1994–1998 in mainland China. GIS-based spatial analyses were conducted to detect spatial autocorrelation and clusters of HFRS incidence at the county level throughout the country. Results: The spatial distribution of HFRS cases in mainland China from 1994 to 1998 was mapped at the county level in terms of crude incidence, excess hazard and spatially smoothed incidence. The spatial distribution of HFRS cases was nonrandom and clustered, with Moran's I = 0.5044 (p = 0.001). Spatial cluster analyses suggested that 26 and 39 areas were at increased risk of HFRS (p < …). Conclusion: The application of GIS, together with spatial statistical techniques, provides a means to quantify explicit HFRS risks and to further identify environmental factors responsible for the increasing disease risks. We demonstrate a new perspective of integrating such spatial analysis tools into the epidemiologic study and risk assessment of HFRS.
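
    The global spatial autocorrelation test reported above rests on Moran's I; a compact sketch with synthetic county centroids, incidence values and a simple distance-band weight matrix (all assumptions, not the study's data) is given below.

```python
# Global Moran's I for county-level incidence with binary distance-band weights.
import numpy as np

rng = np.random.default_rng(4)
n = 200
coords = rng.uniform(0, 100, size=(n, 2))        # county centroids (arbitrary units)
incidence = rng.gamma(shape=2.0, scale=3.0, size=n)

# Binary distance-band spatial weights (neighbours within 15 units), zero diagonal.
d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
W = ((d > 0) & (d < 15)).astype(float)

z = incidence - incidence.mean()
S0 = W.sum()
moran_I = (n / S0) * (z @ W @ z) / (z @ z)
expected_I = -1.0 / (n - 1)

print(f"Moran's I = {moran_I:.4f} (expectation under no autocorrelation: {expected_I:.4f})")
```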

  19. Multi-spatial analysis of aeolian dune-field patterns

    Science.gov (United States)

    Ewing, Ryan C.; McDonald, George D.; Hayes, Alex G.

    2015-07-01

    Aeolian dune-fields are composed of different spatial scales of bedform patterns that respond to changes in environmental boundary conditions over a wide range of time scales. This study examines how variations in spatial scales of dune and ripple patterns found within dune fields are used in environmental reconstructions on Earth, Mars and Titan. Within a single bedform type, different spatial scales of bedforms emerge as a pattern evolves from an initial state into a well-organized pattern, such as with the transition from protodunes to dunes. Additionally, different types of bedforms, such as ripples, coarse-grained ripples and dunes, coexist at different spatial scales within a dune-field. Analysis of dune-field patterns at the intersection of different scales and types of bedforms at different stages of development provides a more comprehensive record of sediment supply and wind regime than analysis of a single scale and type of bedform. Interpretations of environmental conditions from any scale of bedform, however, are limited to environmental signals associated with the response time of that bedform. Large-scale dune-field patterns integrate signals over long-term climate cycles and reveal little about short-term variations in wind or sediment supply. Wind ripples respond instantly to changing conditions, but reveal little about longer-term variations in wind or sediment supply. Recognizing the response time scales across different spatial scales of bedforms maximizes environmental interpretations from dune-field patterns.

  20. Quantitative possibility analysis. Present status in ESCA

    International Nuclear Information System (INIS)

    Brion, D.

    1981-01-01

    A short review of recent developments in the quantification of X-ray photoelectron spectroscopy (ESCA) is presented. The basic equations are recalled. Each parameter involved (photoionisation, inelastic mean free paths, the 'response function' of the instruments, intensity measurement) is discussed separately in relation to the accuracy and the precision of the method. Other topics are considered, such as roughness, surface contamination, matrix effects and inhomogeneous composition. Some aspects of quantitative ESCA analysis and AES analysis are compared [fr]

  1. Uncertainties in elemental quantitative analysis by PIXE

    International Nuclear Information System (INIS)

    Montenegro, E.C.; Baptista, G.B.; Paschoa, A.S.; Barros Leite, C.V.

    1979-01-01

    The effects of the degree of non-uniformity of the particle beam, matrix composition and matrix thickness in a quantitative elemental analysis by particle induced X-ray emission (PIXE) are discussed and a criterion to evaluate the resulting degree of uncertainty in the mass determination by this method is established. (Auth.)

  2. Quantitative analysis of γ-oryzanol content in cold pressed rice bran oil by TLC-image analysis method

    OpenAIRE

    Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana

    2014-01-01

    Objective: To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. Methods: TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Results: Both assays provided good linearity, accuracy, reproducibility and selectivity for dete...

  3. Quantitative diagnosis of bladder cancer by morphometric analysis of HE images

    Science.gov (United States)

    Wu, Binlin; Nebylitsa, Samantha V.; Mukherjee, Sushmita; Jain, Manu

    2015-02-01

    In clinical practice, histopathological analysis of biopsied tissue is the main method for bladder cancer diagnosis and prognosis. The diagnosis is performed by a pathologist based on the morphological features in the image of a hematoxylin and eosin (HE) stained tissue sample. This manuscript proposes algorithms to perform morphometric analysis on the HE images, quantify the features in the images, and discriminate bladder cancers of different grades, i.e. high grade and low grade. The nuclei are separated from the background and from other types of cells, such as red blood cells (RBCs) and immune cells, using manual outlining, color deconvolution and image segmentation. A mask of nuclei is generated for each image for quantitative morphometric analysis. The features of the nuclei in the mask image, including size, shape, orientation, and their spatial distributions, are measured. To quantify local clustering and alignment of nuclei, we propose a 1-nearest-neighbor (1-NN) algorithm which measures nearest neighbor distance and nearest neighbor parallelism. The global distributions of the features are measured using statistics of the proposed parameters. A linear support vector machine (SVM) algorithm is used to classify the high grade and low grade bladder cancers. The results show that using a particular group of nuclei, such as large ones, and combining multiple parameters achieves better discrimination. This study shows the proposed approach can potentially help expedite pathological diagnosis by triaging potentially suspicious biopsies.
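
    The proposed 1-NN measurements can be sketched with a k-d tree query, as below; the nucleus centroids and orientations are synthetic rather than taken from segmented HE images, and the parallelism score shown is one plausible formulation.

```python
# 1-nearest-neighbour distance and a simple parallelism score for nuclei.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(5)
n = 150
centroids = rng.uniform(0, 512, size=(n, 2))          # nucleus centroids in pixels
orientations = rng.uniform(0, np.pi, size=n)          # major-axis angles in radians

tree = cKDTree(centroids)
dist, idx = tree.query(centroids, k=2)                # k=2: self plus nearest neighbour
nn_distance = dist[:, 1]
nn_index = idx[:, 1]

# Parallelism: |cos| of the angle between a nucleus and its nearest neighbour
# (1 = parallel major axes, 0 = perpendicular).
nn_parallelism = np.abs(np.cos(orientations - orientations[nn_index]))

print("mean 1-NN distance (px):", round(nn_distance.mean(), 2))
print("mean 1-NN parallelism:", round(nn_parallelism.mean(), 3))
```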

  4. Evaluation of breast lesions by contrast enhanced ultrasound: Qualitative and quantitative analysis

    International Nuclear Information System (INIS)

    Wan Caifeng; Du Jing; Fang Hua; Li Fenghua; Wang Lin

    2012-01-01

    Objective: To evaluate and compare the diagnostic performance of qualitative, quantitative and combined analysis for characterization of breast lesions in contrast enhanced ultrasound (CEUS), with histological results used as the reference standard. Methods: Ninety-one patients with 91 breast lesions rated BI-RADS 3–5 at US or mammography underwent CEUS. All lesions underwent qualitative and quantitative enhancement evaluation. Receiver operating characteristic (ROC) curve analysis was performed to evaluate the diagnostic performance of the different analytical methods for discrimination between benign and malignant breast lesions. Results: Histopathologic analysis of the 91 lesions revealed 44 benign and 47 malignant lesions. For qualitative analysis, benign and malignant lesions differed significantly in enhancement patterns (p < …). The areas under the ROC curve were … (Az1), 0.768 (Az2) and 0.926 (Az3), respectively. The values of Az1 and Az3 were significantly higher than that of Az2 (p = 0.024 and p = 0.008, respectively), but there was no significant difference between Az1 and Az3 (p = 0.625). Conclusions: The diagnostic performance of qualitative and combined analysis was significantly higher than that of quantitative analysis. Although quantitative analysis has the potential to differentiate benign from malignant lesions, it has not yet improved the final diagnostic accuracy.

  5. New Approach to Quantitative Analysis by Laser-induced Breakdown Spectroscopy

    International Nuclear Information System (INIS)

    Lee, D. H.; Kim, T. H.; Yun, J. I.; Jung, E. C.

    2009-01-01

    Laser-induced breakdown spectroscopy (LIBS) has been studied as the technique of choice in some particular situations, such as screening, in situ measurement, process monitoring and hostile environments. In particular, LIBS can provide qualitative and quantitative analysis of radioactive high level waste (HLW) glass under restricted experimental conditions. Several ways have been suggested to obtain quantitative information from LIBS. One approach is to use the absolute intensities of each element; another is to use the elemental emission intensities relative to the intensity of an internal standard element whose concentration in the specimen is already known. However, these methods are not applicable to unknown samples. In the present work, we introduce a new approach to LIBS quantitative analysis that uses the Hα (656.28 nm) emission line as an external standard.

  6. Analysis of spatial diffusion of ferric ions in PVA-GTA gel dosimeters through magnetic resonance imaging

    Energy Technology Data Exchange (ETDEWEB)

    Marrale, Maurizio [Dipartimento di Fisica e Chimica, Universitá di Palermo, Viale delle Scienze, Edificio 18, 90128 Palermo (Italy); Istituto Nazionale di Fisica Nucleare (INFN) – Gruppo V Sezione di Catania, Via Santa Sofia, 64, 95123 Catania (Italy); ATeN Center, Università di Palermo, Viale delle Scienze, Edificio 18, 90128 Palermo (Italy); Collura, Giorgio [Dipartimento di Fisica e Chimica, Universitá di Palermo, Viale delle Scienze, Edificio 18, 90128 Palermo (Italy); Istituto Nazionale di Fisica Nucleare (INFN) – Gruppo V Sezione di Catania, Via Santa Sofia, 64, 95123 Catania (Italy); Gallo, Salvatore, E-mail: salvatore.gallo05@unipa.it [Dipartimento di Fisica e Chimica, Universitá di Palermo, Viale delle Scienze, Edificio 18, 90128 Palermo (Italy); Istituto Nazionale di Fisica Nucleare (INFN) – Gruppo V Sezione di Catania, Via Santa Sofia, 64, 95123 Catania (Italy); Dipartimento di Fisica, Universitá di Milano, Via Giovanni Celoria 16, 20133 Milano (Italy); Nici, Stefania [Dipartimento di Fisica e Chimica, Universitá di Palermo, Viale delle Scienze, Edificio 18, 90128 Palermo (Italy); Tranchina, Luigi [ATeN Center, Università di Palermo, Viale delle Scienze, Edificio 18, 90128 Palermo (Italy); Abbate, Boris Federico [U.O.C. Fisica Sanitaria, A.R.N.A.S., Ospedale Civico Palermo, Piazza Nicola Leotta 4, 90127 Palermo (Italy); Marineo, Sandra; Caracappa, Santo [Istituto Zooprofilattico Sperimentale della Sicilia (IZS), Via Gino Marinuzzi, 3, 90129 Palermo (Italy); and others

    2017-04-01

    Highlights: • Analysis of ferric ion diffusion throughout the gel matrix in PVA-GTA samples. • Measurements with a preclinical 7T MRI scanner with a spatial resolution of 200 μm. • The diffusion process is much slower for PVA-GTA gels than for agarose ones. - Abstract: This work focused on the analysis of the temporal diffusion of ferric ions through PVA-GTA gel dosimeters. PVA-GTA gel samples, partly exposed to 6 MV X-rays in order to create an initial steep gradient, were mapped using magnetic resonance imaging on a 7T MRI scanner for small animals. Multiple images of the gels were acquired over several hours after irradiation and were analyzed to quantitatively extract the signal profile. The spatial resolution achieved is 200 μm, which makes this technique particularly suitable for the analysis of steep gradients of ferric ion concentration. The results obtained with PVA-GTA gels were compared with those achieved with agarose gels, which are a standard dosimetric gel formulation. The analysis showed that the diffusion process is much slower (by more than a factor of five) for PVA-GTA gels than for agarose ones. Furthermore, it is noteworthy that the diffusion coefficient value obtained through the MRI analysis is consistent with that obtained in a separate study (Marini et al., submitted for publication) using a totally independent method, spectrophotometry. This is a valuable result, highlighting that the good dosimetric features of this gel matrix can not only be reproduced but also be measured through independent experimental techniques based on different physical principles.

  7. Global sensitivity analysis for models with spatially dependent outputs

    International Nuclear Information System (INIS)

    Iooss, B.; Marrel, A.; Jullien, M.; Laurent, B.

    2011-01-01

    The global sensitivity analysis of a complex numerical model often calls for the estimation of variance-based importance measures, named Sobol' indices. Meta-model-based techniques have been developed in order to replace the CPU time-expensive computer code with an inexpensive mathematical function, which predicts the computer code output. The common meta-model-based sensitivity analysis methods are well suited for computer codes with scalar outputs. However, in the environmental domain, as in many areas of application, the numerical model outputs are often spatial maps, which may also vary with time. In this paper, we introduce an innovative method to obtain a spatial map of Sobol' indices with a minimal number of numerical model computations. It is based upon the functional decomposition of the spatial output onto a wavelet basis and the meta-modeling of the wavelet coefficients by the Gaussian process. An analytical example is presented to clarify the various steps of our methodology. This technique is then applied to a real hydrogeological case: for each model input variable, a spatial map of Sobol' indices is thus obtained. (authors)
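
    The paper's spatial extension (wavelet decomposition of the output map plus Gaussian-process meta-modelling) is not reproduced here; the sketch below only illustrates the underlying quantity, a first-order Sobol' index, estimated by a standard Monte Carlo pick-freeze scheme on the Ishigami test function.

```python
# First-order Sobol' indices for a scalar-output test model (Ishigami function),
# estimated with a standard Saltelli-type Monte Carlo scheme.
import numpy as np

def model(x):
    # Ishigami test function, a common benchmark for sensitivity analysis.
    a, b = 7.0, 0.1
    return np.sin(x[:, 0]) + a * np.sin(x[:, 1]) ** 2 + b * x[:, 2] ** 4 * np.sin(x[:, 0])

rng = np.random.default_rng(6)
N, k = 20000, 3
A = rng.uniform(-np.pi, np.pi, size=(N, k))
B = rng.uniform(-np.pi, np.pi, size=(N, k))

fA, fB = model(A), model(B)
var = np.var(np.concatenate([fA, fB]))

first_order = []
for i in range(k):
    ABi = A.copy()
    ABi[:, i] = B[:, i]           # "pick-freeze": column i taken from B
    fABi = model(ABi)
    Si = np.mean(fB * (fABi - fA)) / var   # Saltelli estimator of S_i
    first_order.append(Si)

print("first-order Sobol' indices:", np.round(first_order, 3))
```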

  8. Quantitative 3D Analysis of Nuclear Morphology and Heterochromatin Organization from Whole-Mount Plant Tissue Using NucleusJ.

    Science.gov (United States)

    Desset, Sophie; Poulet, Axel; Tatout, Christophe

    2018-01-01

    Image analysis is a classical way to study nuclear organization. While nuclear organization used to be investigated by colorimetric or fluorescent labeling of DNA or specific nuclear compartments, new methods in microscopy imaging now enable qualitative and quantitative analyses of chromatin pattern, and nuclear size and shape. Several procedures have been developed to prepare samples in order to collect 3D images for the analysis of spatial chromatin organization, but only few preserve the positional information of the cell within its tissue context. Here, we describe a whole mount tissue preparation procedure coupled to DNA staining using the PicoGreen ® intercalating agent suitable for image analysis of the nucleus in living and fixed tissues. 3D Image analysis is then performed using NucleusJ, an open source ImageJ plugin, which allows for quantifying variations in nuclear morphology such as nuclear volume, sphericity, elongation, and flatness as well as in heterochromatin content and position in respect to the nuclear periphery.

  9. Quantitative atom probe analysis of nanostructure containing clusters and precipitates with multiple length scales

    International Nuclear Information System (INIS)

    Marceau, R.K.W.; Stephenson, L.T.; Hutchinson, C.R.; Ringer, S.P.

    2011-01-01

    A model Al-3Cu-(0.05 Sn) (wt%) alloy containing a bimodal distribution of relatively shear-resistant θ' precipitates and shearable GP zones is considered in this study. It has recently been shown that the addition of the GP zones to such microstructures can lead to significant increases in strength without a decrease in the uniform elongation. In this study, atom probe tomography (APT) has been used to quantitatively characterise the evolution of the GP zones and the solute distribution in the bimodal microstructure as a function of applied plastic strain. Recent nuclear magnetic resonance (NMR) analysis has clearly shown strain-induced dissolution of the GP zones, which is supported by the current APT data with additional spatial information. There is significant repartitioning of Cu from the GP zones into the solid solution during deformation. A new approach for cluster finding in APT data has been used to quantitatively characterise the evolution of the sizes and shapes of the Cu containing features in the solid solution solute as a function of applied strain. -- Research highlights: → A new approach for cluster finding in atom probe tomography (APT) data has been used to quantitatively characterise the evolution of the sizes and shapes of the Cu containing features with multiple length scales. → In this study, a model Al-3Cu-(0.05 Sn) (wt%) alloy containing a bimodal distribution of relatively shear-resistant θ' precipitates and shearable GP zones is considered. → APT has been used to quantitatively characterise the evolution of the GP zones and the solute distribution in the bimodal microstructure as a function of applied plastic strain. → It is clearly shown that there is strain-induced dissolution of the GP zones with significant repartitioning of Cu from the GP zones into the solid solution during deformation.

  10. Analysis of vaginal microbicide film hydration kinetics by quantitative imaging refractometry.

    Science.gov (United States)

    Rinehart, Matthew; Grab, Sheila; Rohan, Lisa; Katz, David; Wax, Adam

    2014-01-01

    We have developed a quantitative imaging refractometry technique, based on holographic phase microscopy, as a tool for investigating microscopic structural changes in water-soluble polymeric materials. Here we apply the approach to analyze the structural degradation of vaginal topical microbicide films due to water uptake. We implemented transmission imaging of 1-mm diameter film samples loaded into a flow chamber with a 1.5×2 mm field of view. After water was flooded into the chamber, interference images were captured and analyzed to obtain high resolution maps of the local refractive index and subsequently the volume fraction and mass density of film material at each spatial location. Here, we compare the hydration dynamics of a panel of films with varying thicknesses and polymer compositions, demonstrating that quantitative imaging refractometry can be an effective tool for evaluating and characterizing the performance of candidate microbicide film designs for anti-HIV drug delivery.

  11. 3D spatially-adaptive canonical correlation analysis: Local and global methods.

    Science.gov (United States)

    Yang, Zhengshi; Zhuang, Xiaowei; Sreenivasan, Karthik; Mishra, Virendra; Curran, Tim; Byrd, Richard; Nandy, Rajesh; Cordes, Dietmar

    2018-04-01

    Local spatially-adaptive canonical correlation analysis (local CCA) with spatial constraints has been introduced to fMRI multivariate analysis for improved modeling of activation patterns. However, current algorithms require complicated spatial constraints that have only been applied to 2D local neighborhoods because the computational time would be exponentially increased if the same method is applied to 3D spatial neighborhoods. In this study, an efficient and accurate line search sequential quadratic programming (SQP) algorithm has been developed to efficiently solve the 3D local CCA problem with spatial constraints. In addition, a spatially-adaptive kernel CCA (KCCA) method is proposed to increase accuracy of fMRI activation maps. With oriented 3D spatial filters anisotropic shapes can be estimated during the KCCA analysis of fMRI time courses. These filters are orientation-adaptive leading to rotational invariance to better match arbitrary oriented fMRI activation patterns, resulting in improved sensitivity of activation detection while significantly reducing spatial blurring artifacts. The kernel method in its basic form does not require any spatial constraints and analyzes the whole-brain fMRI time series to construct an activation map. Finally, we have developed a penalized kernel CCA model that involves spatial low-pass filter constraints to increase the specificity of the method. The kernel CCA methods are compared with the standard univariate method and with two different local CCA methods that were solved by the SQP algorithm. Results show that SQP is the most efficient algorithm to solve the local constrained CCA problem, and the proposed kernel CCA methods outperformed univariate and local CCA methods in detecting activations for both simulated and real fMRI episodic memory data. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. Multicomponent quantitative spectroscopic analysis without reference substances based on ICA modelling.

    Science.gov (United States)

    Monakhova, Yulia B; Mushtakova, Svetlana P

    2017-05-01

    A fast and reliable spectroscopic method for multicomponent quantitative analysis of targeted compounds with overlapping signals in complex mixtures has been established. The innovative analytical approach is based on the preliminary chemometric extraction of qualitative and quantitative information from UV-vis and IR spectral profiles of a calibration system using independent component analysis (ICA). Using this quantitative model and ICA resolution results of spectral profiling of "unknown" model mixtures, the absolute analyte concentrations in multicomponent mixtures and authentic samples were then calculated without reference solutions. Good recoveries generally between 95% and 105% were obtained. The method can be applied to any spectroscopic data that obey the Beer-Lambert-Bouguer law. The proposed method was tested on analysis of vitamins and caffeine in energy drinks and aromatic hydrocarbons in motor fuel with 10% error. The results demonstrated that the proposed method is a promising tool for rapid simultaneous multicomponent analysis in the case of spectral overlap and the absence/inaccessibility of reference materials.
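
    The ICA resolution step can be illustrated as below with synthetic Gaussian bands standing in for overlapping component spectra; the subsequent calibration that converts resolved profiles into absolute concentrations is not shown, and FastICA's usual sign and scale ambiguities are ignored.

```python
# Resolving overlapping component spectra from mixture spectra with FastICA.
# Spectra and mixing proportions are synthetic, not the study's data.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(7)
wavelength = np.linspace(200, 400, 400)

def band(centre, width):
    return np.exp(-0.5 * ((wavelength - centre) / width) ** 2)

pure = np.vstack([band(260, 12), band(300, 20), band(330, 8)])   # 3 "pure" spectra
mixing = rng.uniform(0.1, 1.0, size=(10, 3))                     # 10 mixtures
mixtures = mixing @ pure + 0.01 * rng.standard_normal((10, len(wavelength)))

ica = FastICA(n_components=3, random_state=0)
sources = ica.fit_transform(mixtures.T)      # resolved spectral profiles (columns)
print("resolved component matrix shape:", sources.shape)   # (400, 3)
```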

  13. Spatial Assessment of Road Traffic Injuries in the Greater Toronto Area (GTA: Spatial Analysis Framework

    Directory of Open Access Journals (Sweden)

    Sina Tehranchi

    2017-03-01

    This research presents a Geographic Information Systems (GIS) and spatial analysis approach based on the global spatial autocorrelation of road traffic injuries for identifying spatial patterns. Local spatial autocorrelation was also used to identify traffic injuries at the local level. Data for this research were acquired from the Canadian Institute for Health Information (CIHI) for 2004 and 2011. Moran's I statistics were used to examine spatial patterns of road traffic injuries in the Greater Toronto Area (GTA). An assessment with the Getis-Ord Gi* statistic followed, to identify hot spots and cold spots within the study area. The results revealed that Peel and Durham have the highest rate of collisions between motor vehicles. A geographically weighted regression (GWR) analysis was conducted to test the relationships between the dependent variable, the number of road traffic injury incidents, and independent variables such as the number of seniors, low education, unemployment, vulnerable groups, people smoking and drinking, urban density and average median income. The results of this model suggested that the number of seniors and low education have a very strong correlation with the number of road traffic injury incidents.
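
    The hot-spot step mentioned above uses the Getis-Ord Gi* statistic; a brief sketch with synthetic area centroids, injury counts and a binary distance-band weight matrix (self-included, as Gi* requires) follows. All values are placeholders.

```python
# Getis-Ord Gi* hot-spot statistic per area, with binary distance-band weights.
import numpy as np

rng = np.random.default_rng(8)
n = 150
coords = rng.uniform(0, 50, size=(n, 2))              # area centroids (km)
injuries = rng.poisson(lam=20, size=n).astype(float)  # injury counts per area

d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
W = (d < 5).astype(float)                              # includes the focal area itself

x_bar = injuries.mean()
S = np.sqrt((injuries ** 2).mean() - x_bar ** 2)       # population standard deviation

wx = W @ injuries
w_sum = W.sum(axis=1)
w_sq_sum = (W ** 2).sum(axis=1)

gi_star = (wx - x_bar * w_sum) / (
    S * np.sqrt((n * w_sq_sum - w_sum ** 2) / (n - 1))
)

print("hot spots (Gi* > 1.96):", int(np.sum(gi_star > 1.96)))
print("cold spots (Gi* < -1.96):", int(np.sum(gi_star < -1.96)))
```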

  14. ImatraNMR: Novel software for batch integration and analysis of quantitative NMR spectra

    Science.gov (United States)

    Mäkelä, A. V.; Heikkilä, O.; Kilpeläinen, I.; Heikkinen, S.

    2011-08-01

    Quantitative NMR spectroscopy is a useful and important tool for the analysis of various mixtures. Recently, in addition to traditional quantitative 1D 1H and 13C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain actual usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, there are many processing packages available from spectrometer manufacturers and third-party developers, and most of them are capable of analyzing and integrating quantitative spectra. However, they are mainly aimed at processing a single spectrum or a few spectra, and are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present a novel software tool, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to being capable of analyzing large numbers of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs such as Matlab. The software is written in Java and should thus run on any platform providing Java Runtime Environment version 1.6 or newer; however, it has currently only been tested on Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use, and is provided with source code upon request.

  15. Spatio-temporal Analysis of Hydrological Drought at Catchment Scale Using a Spatially-distributed Hydrological Model

    NARCIS (Netherlands)

    Mercado, Vitali Diaz; Perez, Gerald Corzo; Solomatine, Dimitri; Lanen, Van Henny A.J.

    2016-01-01

    Lately, droughts have become more intense and much more severe around the globe, causing more deaths than other hazards in the past century. Drought can be characterized quantitatively in terms of its spatial extent, intensity and duration by using drought indicators. Several indicators have been developed in

  16. Spatial compression algorithm for the analysis of very large multivariate images

    Science.gov (United States)

    Keenan, Michael R [Albuquerque, NM

    2008-07-15

    A method for spatially compressing data sets enables the efficient analysis of very large multivariate images. The spatial compression algorithms use a wavelet transformation to map an image into a compressed image containing a smaller number of pixels that retain the original image's information content. Image analysis can then be performed on a compressed data matrix consisting of a reduced number of significant wavelet coefficients. Furthermore, a block algorithm can be used for performing common operations more efficiently. The spatial compression algorithms can be combined with spectral compression algorithms to provide further computational efficiencies.

  17. Using Spatial Semantics and Interactions to Identify Urban Functional Regions

    Directory of Open Access Journals (Sweden)

    Yandong Wang

    2018-03-01

    Full Text Available The spatial structures of cities have changed dramatically with rapid socio-economic development, in ways that are not well understood. To support urban structural analysis and rational planning, we propose a framework to identify urban functional regions and quantitatively explore the intensity of the interactions between them, thus increasing the understanding of urban structures. A method for the identification of functional regions via spatial semantics is proposed, which involves two steps: (1) the study area is classified into three types of functional regions using taxi origin/destination (O/D) flows; and (2) the spatial semantics for the three types of functional regions are demonstrated based on point-of-interest (POI) categories. To validate the existence of urban functional regions, we quantitatively explored the intensity of the interactions between them. A case study using POI data and taxi trajectory data from Beijing validates the proposed framework. The results show that the proposed framework can be used to identify urban functional regions and promotes an enhanced understanding of urban structures.
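
    Step (1) of the framework, classifying zones into functional types from O/D flows, could be approximated with an off-the-shelf clustering step such as the scikit-learn sketch below; the per-zone flow features and POI categories are entirely hypothetical, and the paper's own classification method may differ.

      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(42)
      # Hypothetical per-zone features: [morning outflow, morning inflow,
      #                                  evening outflow, evening inflow], normalized.
      flows = rng.random((60, 4))
      flows /= flows.sum(axis=1, keepdims=True)

      # Three functional region types, mirroring the record's classification step.
      labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(flows)

      # Hypothetical POI category shares used to attach semantics to each cluster.
      poi_share = rng.random((60, 3))        # e.g. residential / office / leisure
      for k in range(3):
          dominant = poi_share[labels == k].mean(axis=0).argmax()
          print(f"cluster {k}: dominant POI category index {dominant}")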

  18. Investigation of Spatial Data with Open Source Social Network Analysis and Geographic Information Systems Applications

    Science.gov (United States)

    Sabah, L.; Şimşek, M.

    2017-11-01

    Social networks capture the real social experience of individuals in the online environment. In this environment, people use symbolic gestures and mimics and share thoughts and content. Social network analysis is the visualization of complex and large quantities of data so that the overall picture emerges. It is the understanding, development, and quantitative and qualitative analysis of the relations in social networks by means of graph theory. Social networks are expressed in the form of nodes and edges: nodes are people or organizations, and edges are relationships between nodes. Relations can be directed or undirected, weighted or unweighted. The purpose of this study is to examine the effects of social networks on the evaluation of personal data with spatial coordinates. For this, the cluster sizes and the geographic area covered by the locations that individuals share through the frequently used location-sharing feature of social networks have been studied.
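
    A minimal sketch of the node/edge representation described above, with hypothetical users, coordinates and tie weights; networkx supplies the graph measures, and the geographic "reach" is simply the largest neighbour distance in coordinate units.

      import math
      import networkx as nx

      positions = {"ana": (31.0, 40.8), "bo": (31.1, 40.7), "cem": (31.4, 40.9)}
      edges = [("ana", "bo", 5), ("bo", "cem", 2), ("ana", "cem", 1)]  # weighted ties

      G = nx.Graph()
      for name, (lon, lat) in positions.items():
          G.add_node(name, lon=lon, lat=lat)
      for u, v, w in edges:
          G.add_edge(u, v, weight=w)

      print("degree centrality:", nx.degree_centrality(G))

      # Rough geographic reach of each user's ego network (in coordinate units).
      for node in G:
          reach = max(math.dist(positions[node], positions[nbr]) for nbr in G[node])
          print(node, "max neighbour distance:", round(reach, 2))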

  19. Multi-component time, spatial and frequency analysis of Paleoclimatic Data

    Science.gov (United States)

    Cristiano, Luigia; Stampa, Johannes; Feeser, Ingo; Dörfler, Walter; Meier, Thomas

    2017-04-01

    The investigation of paleoclimatic data offers a powerful tool for understanding the impact of extreme climatic events, as well as gradual climatic variations, on human development and cultural changes. The current global record of paleoclimatic data is relatively rich but is generally not uniformly structured or regionally distributed. A general characteristic of reconstructed paleoclimatic time series is a non-constant sampling interval and data resolution, together with the presence of gaps in the record. Our database consists of pollen concentrations from annually laminated lake sediments at two sites in Northern Germany. Such data offer the possibility of high-resolution palynological and sedimentological analyses on a well-constrained time scale. Specifically, we are interested in investigating the time dependence of proxies, and the temporal and spatial correlation of the different observables with respect to each other. We present here a quantitative analysis of the pollen data in the time and frequency domains. In particular, we are interested in understanding the complexity of the system and the causes of sudden as well as slow changes in the time dependence of the observables. We also show our approach for handling the non-uniform sampling interval and the broad frequency content characterizing paleoclimatic databases. In particular, we worked on the development of a robust data analysis to answer key questions about the correlation between rapid climatic changes and changes in human habits, and to quantitatively elaborate a model for the processed data. Here we present preliminary results, on synthetic as well as real data, for data visualization, for trend identification with a smoothing procedure, and for the identification of sharp changes in the data as a function of time with an autoregressive approach. In addition to that we use the cross-correlation and cross-spectrum by applying the Multiple Filtering Technique
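
    One standard way to obtain a spectrum from an irregularly sampled, gappy record such as the pollen series described here is the Lomb-Scargle periodogram; the sketch below applies scipy to a synthetic series and illustrates only that general approach, not the authors' pipeline.

      import numpy as np
      from scipy.signal import lombscargle

      rng = np.random.default_rng(3)
      t = np.sort(rng.uniform(0, 2000, 300))     # ages in years, unevenly spaced
      y = np.sin(2 * np.pi * t / 210) + 0.3 * rng.standard_normal(t.size)

      periods = np.linspace(50, 500, 400)        # candidate periods in years
      ang_freqs = 2 * np.pi / periods            # lombscargle expects angular frequencies
      power = lombscargle(t, y - y.mean(), ang_freqs)

      print("strongest period ~", round(periods[power.argmax()], 1), "years")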

  20. Modern methodology and applications in spatial-temporal modeling

    CERN Document Server

    Matsui, Tomoko

    2015-01-01

    This book provides a modern introductory tutorial on specialized methodological and applied aspects of spatial and temporal modeling. The areas covered involve a range of topics which reflect the diversity of this domain of research across a number of quantitative disciplines. For instance, the first chapter deals with non-parametric Bayesian inference via a recently developed framework known as kernel mean embedding which has had a significant influence in machine learning disciplines. The second chapter takes up non-parametric statistical methods for spatial field reconstruction and exceedance probability estimation based on Gaussian process-based models in the context of wireless sensor network data. The third chapter presents signal-processing methods applied to acoustic mood analysis based on music signal analysis. The fourth chapter covers models that are applicable to time series modeling in the domain of speech and language processing. This includes aspects of factor analysis, independent component an...

  1. Quantitative subsurface analysis using frequency modulated thermal wave imaging

    Science.gov (United States)

    Subhani, S. K.; Suresh, B.; Ghali, V. S.

    2018-01-01

    Quantitative depth analysis of an anomaly with enhanced depth resolution is a challenging task in estimating the depth of subsurface anomalies using thermography. Frequency modulated thermal wave imaging, introduced earlier, provides complete depth scanning of the object by stimulating it with a suitable band of frequencies and then analyzing the subsequent thermal response with a suitable post-processing approach to resolve subsurface details. However, the conventional Fourier-transform-based methods used for post-processing unscramble the frequencies with a limited frequency resolution and therefore yield only a finite depth resolution. The spectral zooming provided by the chirp z-transform offers enhanced frequency resolution, which can further improve the depth resolution to axially explore the finest subsurface features. Quantitative depth analysis with this augmented depth resolution is proposed to provide a closer estimate of the actual depth of a subsurface anomaly. This manuscript experimentally validates the enhanced depth resolution using non-stationary thermal wave imaging and offers a first, unique solution for quantitative depth estimation in frequency modulated thermal wave imaging.
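
    The spectral-zooming idea can be illustrated without the chirp z-transform by evaluating the DTFT directly on a fine grid inside a narrow band, which likewise decouples the in-band frequency resolution from the FFT grid; the signal and band below are hypothetical and this is only a brute-force stand-in for the chirp-z approach.

      import numpy as np

      def zoom_spectrum(x, fs, f_lo, f_hi, n_bins=512):
          """Brute-force DTFT of x on a fine frequency grid between f_lo and f_hi (Hz)."""
          n = np.arange(x.size)
          freqs = np.linspace(f_lo, f_hi, n_bins)
          kernel = np.exp(-2j * np.pi * np.outer(freqs, n) / fs)
          return freqs, kernel @ x

      fs = 100.0                                  # hypothetical sampling rate, Hz
      t = np.arange(0, 20, 1 / fs)
      x = np.sin(2 * np.pi * 1.02 * t) + np.sin(2 * np.pi * 1.07 * t)  # two close tones

      freqs, spec = zoom_spectrum(x, fs, 0.9, 1.2)
      print("strongest in-band component near", round(freqs[np.abs(spec).argmax()], 3), "Hz")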

  2. Quantitative analysis by nuclear magnetic resonance spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Wainai, T; Mashimo, K [Nihon Univ., Tokyo. Coll. of Science and Engineering

    1976-04-01

    Recent papers on practical quantitative analysis by nuclear magnetic resonance spectroscopy (NMR) are reviewed. Specifically, the determination of moisture in liquid N/sub 2/O/sub 4/ as an oxidizing agent for rocket propulsion, the analysis of hydroperoxides, quantitative analysis using a shift reagent, the analysis of aromatic sulfonates, and the determination of acids and bases are reviewed. Attention is paid to accuracy. The sweep velocity and RF level, in addition to the other factors, must be at optimal conditions to eliminate errors, particularly when computation is done by machine. A higher sweep velocity is preferable in view of the S/N ratio, but it should be limited to about 30 Hz/s. The relative error in the measurement of peak areas is generally 1%, but when dilute samples are measured and integrated, the error becomes roughly an order of magnitude smaller. If impurities are treated carefully, the water content in N/sub 2/O/sub 4/ can be determined with an accuracy of about 0.002%. The comparison of peak heights is as accurate as that of peak areas when the uniformity of the magnetic field and T/sub 2/ are not in question. When the chemical shift moves with content, the substance can be determined from the position of the chemical shift. Oil and water contents in rape-seed, peanuts, and sunflower seed are determined by measuring T/sub 1/ with 90 deg pulses.

  3. Spatial pattern recognition of seismic events in South West Colombia

    Science.gov (United States)

    Benítez, Hernán D.; Flórez, Juan F.; Duque, Diana P.; Benavides, Alberto; Lucía Baquero, Olga; Quintero, Jiber

    2013-09-01

    Recognition of seismogenic zones in geographical regions supports seismic hazard studies. This recognition is usually based on visual, qualitative and subjective analysis of data. Spatial pattern recognition provides a well-founded means to obtain relevant information from large amounts of data. The purpose of this work is to identify and classify spatial patterns in instrumental data of the South West Colombian seismic database. In this research, clustering tendency analysis validates whether the seismic database possesses a clustering structure. A non-supervised fuzzy clustering algorithm creates groups of seismic events. Given the sensitivity of fuzzy clustering algorithms to the initial centroid positions, we proposed a methodology for initializing centroids that generates partitions which are stable with respect to centroid initialization. As a result of this work, a public software tool provides the user with the routines developed for the clustering methodology. The analysis of the seismogenic zones obtained reveals meaningful spatial patterns in South-West Colombia. The clustering analysis provides a quantitative location and dispersion of seismogenic zones that facilitates seismological interpretations of seismic activity in South West Colombia.
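
    A compact fuzzy c-means sketch in numpy; the deterministic quantile-based seeding stands in for the paper's centroid-initialization methodology (which is not detailed in the record), and the epicentre coordinates are synthetic.

      import numpy as np

      def fuzzy_c_means(X, c, m=2.0, n_iter=100):
          """Minimal fuzzy c-means; X is an (n, d) array of event coordinates."""
          qs = np.linspace(0.1, 0.9, c)
          centroids = np.quantile(X, qs, axis=0)          # deterministic seeding
          for _ in range(n_iter):
              d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2) + 1e-12
              u = 1.0 / d ** (2.0 / (m - 1.0))
              u /= u.sum(axis=1, keepdims=True)           # membership matrix (n, c)
              um = u ** m
              centroids = (um.T @ X) / um.sum(axis=0)[:, None]
          return centroids, u

      rng = np.random.default_rng(7)
      events = np.vstack([rng.normal((-77.0, 3.5), 0.2, (100, 2)),   # synthetic epicentres
                          rng.normal((-76.3, 2.2), 0.3, (120, 2))])
      centroids, memberships = fuzzy_c_means(events, c=2)
      print("cluster centroids:\n", centroids.round(2))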

  4. Analysis of vaginal microbicide film hydration kinetics by quantitative imaging refractometry.

    Directory of Open Access Journals (Sweden)

    Matthew Rinehart

    Full Text Available We have developed a quantitative imaging refractometry technique, based on holographic phase microscopy, as a tool for investigating microscopic structural changes in water-soluble polymeric materials. Here we apply the approach to analyze the structural degradation of vaginal topical microbicide films due to water uptake. We implemented transmission imaging of 1-mm diameter film samples loaded into a flow chamber with a 1.5×2 mm field of view. After water was flooded into the chamber, interference images were captured and analyzed to obtain high resolution maps of the local refractive index and subsequently the volume fraction and mass density of film material at each spatial location. Here, we compare the hydration dynamics of a panel of films with varying thicknesses and polymer compositions, demonstrating that quantitative imaging refractometry can be an effective tool for evaluating and characterizing the performance of candidate microbicide film designs for anti-HIV drug delivery.
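
    The conversion from a local refractive-index map to volume fraction and mass density can be sketched with a simple linear mixing rule; the optical constants below are illustrative placeholders, not values from this study.

      import numpy as np

      N_WATER, N_POLYMER = 1.333, 1.530    # hypothetical refractive indices
      RHO_POLYMER = 1.20                   # hypothetical dry-film density, g/cm^3

      def film_maps(n_map):
          """Index map -> (volume fraction, mass density), assuming
          n = phi * n_polymer + (1 - phi) * n_water."""
          phi = np.clip((n_map - N_WATER) / (N_POLYMER - N_WATER), 0.0, 1.0)
          return phi, phi * RHO_POLYMER

      n_map = np.array([[1.49, 1.45], [1.40, 1.35]])   # toy 2x2 refractive-index map
      phi, rho = film_maps(n_map)
      print(phi.round(2), rho.round(2), sep="\n")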

  5. Stable Isotope Quantitative N-Glycan Analysis by Liquid Separation Techniques and Mass Spectrometry.

    Science.gov (United States)

    Mittermayr, Stefan; Albrecht, Simone; Váradi, Csaba; Millán-Martín, Silvia; Bones, Jonathan

    2017-01-01

    Liquid phase separation analysis and subsequent quantitation remains a challenging task for protein-derived oligosaccharides due to their inherent structural complexity and diversity. Incomplete resolution or co-detection of multiple glycan species complicates peak area-based quantitation and associated statistical analysis when optical detection methods are used. The approach outlined herein describes the utilization of stable isotope variants of commonly used fluorescent tags that allow for mass-based glycan identification and relative quantitation following separation by liquid chromatography (LC) or capillary electrophoresis (CE). Comparability assessment of glycoprotein-derived oligosaccharides is performed by derivatization with commercially available isotope variants of 2-aminobenzoic acid or aniline and analysis by LC- and CE-mass spectrometry. Quantitative information is attained from the extracted ion chromatogram/electropherogram ratios generated from the light and heavy isotope clusters.

  6. Spatial Analysis GIS Model for Identifying the Risk Induced by Landslides. A Case Study: A.T.U. of Șieu

    Directory of Open Access Journals (Sweden)

    Dorel Colniţă

    2016-11-01

    Full Text Available The risk induced by landslides on residential infrastructure, transport infrastructure and agricultural land causes local management problems that need to be addressed by reducing negative effects and decreasing the frequency of their occurrence. This study concerns the development and implementation of a model for identifying the risk induced by landslides through the analysis of the spatial probability of landslide occurrence in the administrative territorial unit of Șieu, following the semi-quantitative method governed in Romania by G.D. no. 447/2003; by then assessing the exposure of housing infrastructure to landslides, it was possible to assign landslides to risk classes. The entire approach was based on GIS spatial analysis, creating a specific detailed database of the causing and triggering factors of landslides and, not least, a database of risk receptors, represented in this study by the buildings of the villages associated with the studied administrative territorial unit. The final result of the model highlights the classification of buildings into qualitative landslide risk classes, revealing the elements of infrastructure that need pre- and post-event protection measures.

  7. Advanced spatial metrics analysis in cellular automata land use and cover change modeling

    International Nuclear Information System (INIS)

    Zamyatin, Alexander; Cabral, Pedro

    2011-01-01

    This paper proposes an approach for a more effective definition of cellular automata transition rules for landscape change modeling using an advanced spatial metrics analysis. This approach considers a four-stage methodology based on: (i) the search for the appropriate spatial metrics with minimal correlations; (ii) the selection of the appropriate neighborhood size; (iii) the selection of the appropriate technique for spatial metrics application; and (iv) the analysis of the contribution level of each spatial metric for joint use. The case study uses an initial set of 7 spatial metrics of which 4 are selected for modeling. Results show a better model performance when compared to modeling without any spatial metrics or with the initial set of 7 metrics.
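
    Stage (i), choosing spatial metrics with minimal correlations, can be sketched as a greedy selection on the correlation matrix; the metric names and values below are synthetic and the threshold is arbitrary.

      import numpy as np

      def select_low_correlation(values, names, max_abs_corr=0.7):
          """Greedily keep metrics whose pairwise |r| with already-kept metrics stays low."""
          corr = np.corrcoef(values, rowvar=False)
          kept = []
          for j in range(len(names)):
              if all(abs(corr[j, k]) < max_abs_corr for k in kept):
                  kept.append(j)
          return [names[j] for j in kept]

      rng = np.random.default_rng(0)
      base = rng.random((200, 3))
      values = np.column_stack([base[:, 0],
                                0.95 * base[:, 0] + 0.05 * base[:, 1],  # near-duplicate metric
                                base[:, 1],
                                base[:, 2]])
      names = ["patch_density", "edge_density", "shannon_diversity", "contagion"]
      print(select_low_correlation(values, names))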

  8. Quantitative analysis of myocardial tissue with digital autofluorescence microscopy

    DEFF Research Database (Denmark)

    Jensen, Thomas; Holten-Rossing, Henrik; Svendsen, Ida M H

    2016-01-01

    BACKGROUND: The opportunity offered by whole slide scanners of automated histological analysis implies an ever increasing importance of digital pathology. To go beyond the importance of conventional pathology, however, digital pathology may need a basic histological starting point similar to that of hematoxylin and eosin staining in conventional pathology. This study presents an automated fluorescence-based microscopy approach providing highly detailed morphological data from unstained microsections. This data may provide a basic histological starting point from which further digital analysis, including … staining, may benefit. METHODS: This study explores the inherent tissue fluorescence, also known as autofluorescence, as a means to quantitate cardiac tissue components in histological microsections. Data acquisition using a commercially available whole slide scanner and an image-based quantitation algorithm …

  9. Spatial and Angular Moment Analysis of Continuous and Discretized Transport Problems

    International Nuclear Information System (INIS)

    Brantley, Patrick S.; Larsen, Edward W.

    2000-01-01

    A new theoretical tool for analyzing continuous and discretized transport equations is presented. This technique is based on a spatial and angular moment analysis of the analytic transport equation, which yields exact expressions for the 'center of mass' and 'squared radius of gyration' of the particle distribution. Essentially the same moment analysis is applied to discretized particle transport problems to determine numerical expressions for the center of mass and squared radius of gyration. Because this technique makes no assumption about the optical thickness of the spatial cells or about the amount of absorption in the system, it is applicable to problems that cannot be analyzed by a truncation analysis or an asymptotic diffusion limit analysis. The spatial differencing schemes examined (weighted-diamond, lumped linear discontinuous, and multiple balance) yield a numerically consistent expression for computing the squared radius of gyration plus an error term that depends on the mesh spacing, quadrature constants, and material properties of the system. The numerical results presented suggest that the relative accuracy of spatial differencing schemes for different types of problems can be assessed by comparing the magnitudes of these error terms
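
    For a discrete spatial distribution the two quantities named above have direct formulas; a one-dimensional numpy sketch with a toy flux profile (not the transport problems analyzed in the record):

      import numpy as np

      def moments(x, phi):
          """Centre of mass and squared radius of gyration of a 1-D distribution phi(x)."""
          total = np.trapz(phi, x)
          x_bar = np.trapz(x * phi, x) / total
          r2 = np.trapz((x - x_bar) ** 2 * phi, x) / total
          return x_bar, r2

      x = np.linspace(-5, 5, 501)
      phi = np.exp(-x ** 2 / 2)                  # toy particle distribution
      x_bar, r2 = moments(x, phi)
      print(round(x_bar, 3), round(r2, 3))       # ~0 and ~1 for a unit Gaussian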

  10. Quantitative analysis of target components by comprehensive two-dimensional gas chromatography

    NARCIS (Netherlands)

    Mispelaar, V.G. van; Tas, A.C.; Smilde, A.K.; Schoenmakers, P.J.; Asten, A.C. van

    2003-01-01

    Quantitative analysis using comprehensive two-dimensional (2D) gas chromatography (GC) is still rarely reported. This is largely due to a lack of suitable software. The objective of the present study is to generate quantitative results from a large GC x GC data set, consisting of 32 chromatograms.

  11. SDAR 1.0: A New Quantitative Toolkit to Analyze Stratigraphic Data

    Science.gov (United States)

    Ortiz, John; Moreno, Carlos; Cardenas, Andres; Jaramillo, Carlos

    2015-04-01

    complex tasks of statistical analysis (i.e., the R language provides more than a hundred spatial libraries that allow users to explore various geostatistical and spatial analyses). Consequently, SDAR allows a deeper exploration of the stratigraphic data collected in the field, and it will allow the geoscientific community in the near future to develop complex analyses related to the distribution in space and time of rock sequences, such as lithofacial correlations, through a multivariate comparison between empirical SCs and quantitative lithofacial models established from modern sedimentary environments.

  12. Role of image analysis in quantitative characterisation of nuclear fuel materials

    International Nuclear Information System (INIS)

    Dubey, J.N.; Rao, T.S.; Pandey, V.D.; Majumdar, S.

    2005-01-01

    Image analysis is one of the important techniques widely used for materials characterization. It provides a quantitative estimation of the microstructural features present in the material. This information is valuable for establishing the criteria for taking the fuel to high burn-up. The Radiometallurgy Division has been carrying out development and fabrication of plutonium-bearing fuels for different types of reactors, viz. Purnima, the Fast Breeder Test Reactor (FBTR), the Prototype Fast Breeder Reactor (PFBR), the Boiling Water Reactor (BWR), the Advanced Heavy Water Reactor (AHWR), the Pressurised Heavy Water Reactor (PHWR) and the KAMINI reactor. Image analysis has been carried out on microstructures of PHWR, AHWR, FBTR and KAMINI fuels. Samples were prepared as per standard ASTM metallographic procedures. Digital images of the microstructures of these specimens were obtained using a CCD camera attached to the optical microscope. These images are stored on a computer and used for the detection and analysis of features of interest with image analysis software. The quantitative image analysis technique has been standardised and used for finding out the type of porosity and its size, shape and distribution in the above sintered oxide and carbide fuels. This technique has also been used for the quantitative estimation of the different phases present in KAMINI fuel. Image analysis results have been summarised and presented in this paper. (author)
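
    A generic sketch of porosity quantification on a thresholded micrograph (synthetic here) using scipy's labelling tools; it only illustrates the kind of measurement described, not the division's standardized procedure, and the pixel calibration is hypothetical.

      import numpy as np
      from scipy import ndimage

      rng = np.random.default_rng(9)
      pores = rng.random((256, 256)) > 0.995                 # synthetic thresholded pore mask
      pores = ndimage.binary_dilation(pores, iterations=2)   # grow the pores a little

      labels, n_pores = ndimage.label(pores)
      sizes = ndimage.sum(pores, labels, index=range(1, n_pores + 1))  # pixels per pore

      pixel_area_um2 = 0.25                                  # hypothetical calibration
      print(f"pores: {n_pores}, porosity: {100 * pores.mean():.2f}%, "
            f"mean pore area: {sizes.mean() * pixel_area_um2:.1f} um^2")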

  13. Research approach for forming a new typology of spatial planning theory

    Directory of Open Access Journals (Sweden)

    Bulajić Vladan

    2011-01-01

    Full Text Available This paper suggests a research approach for the classification of theoretical contributions in the scientific domain of spatial planning. A typology is a multidimensional classification; in effect, it is a framework for understanding the subject area, theory and practice, ideas and methodologies. A complex approach is needed to organize the complex and diverse domain of spatial planning theory, which has been shaped by different schools of thought and by the influences of related scientific disciplines. It is suggested that the research approach become a bridge between two cultures, in other words a synthesis of qualitative and quantitative methods of typology construction. Through the analysis of existing, quantitatively derived typologies, the chosen concepts will be improved and completed with computerized statistical analysis of the appropriate bibliometric data. Moreover, the procedure in the opposite direction will also be used, connecting empirical types with their conceptual counterparts. With this approach, the main aim is to achieve a comprehensive classification scheme that will form part of the platform for integrating interdisciplinary approaches in the spatial planning domain. This conception of the research belongs to a wider approach whose aim is to use scientific innovation and imagination to solve the problems and challenges that spatial planning faces. The forming of the new typology is the first step in that direction.

  14. Micro photometer's automation for quantitative spectrograph analysis

    International Nuclear Information System (INIS)

    Gutierrez E, C.Y.A.

    1996-01-01

    A microphotometer is used to increase the sharpness of dark spectral lines. By analyzing these lines, the content of a sample and its concentration can be determined; this analysis is known as quantitative spectrographic analysis. Quantitative spectrographic analysis is carried out in three steps, as follows. 1. Emulsion calibration. This consists of gauging a photographic emulsion to determine the intensity variations in terms of the incident radiation. For the emulsion calibration procedure, a least-squares fit to the data obtained is applied to obtain a graph. It is then possible to determine the density of a dark spectral line against the incident light intensity shown by the microphotometer. 2. Working curves. The values of known concentrations of an element are plotted against incident light intensity. Since the sample contains several elements, it is necessary to find a working curve for each of them. 3. Analytical results. The calibration curve and working curves are compared and the concentration of the studied element is determined. The automatic data acquisition, calculation and reporting of results is done by means of a computer (PC) and a computer program. The signal conditioning circuits have the function of delivering TTL (Transistor Transistor Logic) levels to make communication between the microphotometer and the computer possible. Data calculation is done using a computer program

  15. [Rapid analysis of suppositories by quantitative 1H NMR spectroscopy].

    Science.gov (United States)

    Abramovich, R A; Kovaleva, S A; Goriainov, S V; Vorob'ev, A N; Kalabin, G A

    2012-01-01

    Rapid analysis of suppositories with ibuprofen and arbidol by quantitative 1H NMR spectroscopy was performed. Optimal conditions for the analysis were developed. The results are useful for design of rapid methods for quality control of suppositories with different components
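
    The quantitation behind such assays usually rests on the standard internal-standard qNMR relation; a worked example with hypothetical integrals and a hypothetical reference standard (the molar mass of ibuprofen is 206.28 g/mol), not the values from this study:

      # m_a = m_std * (I_a / I_std) * (N_std / N_a) * (M_a / M_std) * P_std
      I_a, I_std = 3.05, 1.00      # normalized integrals of analyte and standard signals
      N_a, N_std = 2, 9            # protons contributing to each signal
      M_a, M_std = 206.28, 180.16  # molar masses, g/mol (ibuprofen; hypothetical standard)
      m_std, P_std = 10.0, 0.998   # mg of standard weighed in, and its purity

      m_a = m_std * (I_a / I_std) * (N_std / N_a) * (M_a / M_std) * P_std
      print(f"analyte in the NMR tube: {m_a:.2f} mg")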

  16. Quantitative EDXS analysis of organic materials using the ζ-factor method

    International Nuclear Information System (INIS)

    Fladischer, Stefanie; Grogger, Werner

    2014-01-01

    In this study we successfully applied the ζ-factor method to perform quantitative X-ray analysis of organic thin films consisting of light elements. With its ability to intrinsically correct for X-ray absorption, this method significantly improved the quality of the quantification as well as the accuracy of the results compared to conventional techniques in particular regarding the quantification of light elements. We describe in detail the process of determining sensitivity factors (ζ-factors) using a single standard specimen and the involved parameter optimization for the estimation of ζ-factors for elements not contained in the standard. The ζ-factor method was then applied to perform quantitative analysis of organic semiconducting materials frequently used in organic electronics. Finally, the results were verified and discussed concerning validity and accuracy. - Highlights: • The ζ-factor method is used for quantitative EDXS analysis of light elements. • We describe the process of determining ζ-factors from a single standard in detail. • Organic semiconducting materials are successfully quantified

  17. Spatial recurrence analysis: A sensitive and fast detection tool in digital mammography

    International Nuclear Information System (INIS)

    Prado, T. L.; Galuzio, P. P.; Lopes, S. R.; Viana, R. L.

    2014-01-01

    Efficient diagnostics of breast cancer requires fast digital mammographic image processing. Many breast lesions, both benign and malignant, are barely visible to the untrained eye and require accurate and reliable methods of image processing. We propose a new method of digital mammographic image analysis that meets both needs. It uses the concept of spatial recurrence as the basis of a spatial recurrence quantification analysis, which is the spatial extension of the well-known time recurrence analysis. The recurrence-based quantifiers are able to reveal breast lesions as effectively as the best standard image processing methods available, but with better control over the spurious fragments in the image
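
    The flavour of a spatial recurrence quantifier can be conveyed by thresholding pairwise distances between image patches and reporting a recurrence rate; this is a crude stand-in on a synthetic image, not the authors' method.

      import numpy as np

      def spatial_recurrence_rate(image, patch=4, eps=0.1):
          """Fraction of patch pairs whose intensity vectors lie within eps * max distance."""
          h, w = image.shape
          vecs = np.array([image[i:i + patch, j:j + patch].ravel()
                           for i in range(0, h - patch + 1, patch)
                           for j in range(0, w - patch + 1, patch)], dtype=float)
          vecs = (vecs - vecs.mean()) / (vecs.std() + 1e-12)
          d = np.linalg.norm(vecs[:, None, :] - vecs[None, :, :], axis=2)
          n = len(vecs)
          return (np.sum(d < eps * d.max()) - n) / (n * (n - 1))   # exclude the diagonal

      rng = np.random.default_rng(5)
      img = rng.random((64, 64))
      img[20:28, 30:38] += 1.5                 # a bright synthetic "lesion"
      print(round(spatial_recurrence_rate(img), 3))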

  18. Quantitative determination of absorbed hydrogen in oxidised zircaloy by means of neutron radiography

    International Nuclear Information System (INIS)

    Grosse, M.; Lehmann, E.; Vontobel, P.; Steinbrueck, M.

    2006-01-01

    Hydrogen absorbed in steam-oxidised zircaloy can be determined quantitatively by means of neutron radiography. Correlation parameters between the total cross section and the hydrogen content, as well as the oxide layer thickness, were determined quantitatively. At H/Zr atomic ratios lower than 1.0, linear correlations between the hydrogen content and the total cross section exist. The total cross section of Zr is lower and the effect of hydrogen is higher in radiography measurements with a cold neutron spectrum than with a thermal spectrum. A Be filter reduces the effects of lower-wavelength and epithermal neutrons and extends the linear correlations to higher H/Zr atomic ratios. Due to the better possibilities for background correction, the neutron image should be detected by a CCD camera for proper quantitative analysis with a medium spatial resolution of about 0.1 mm. A higher spatial resolution, but with larger uncertainties in the quantitative hydrogen determination, is achieved by measurements with imaging plates. The effect of oxide layers on the total cross section is much smaller than the effect of hydrogen. The measured total cross section depends linearly on the oxide layer thickness

  19. Quantitative analysis by computer controlled X-ray fluorescence spectrometer

    International Nuclear Information System (INIS)

    Balasubramanian, T.V.; Angelo, P.C.

    1981-01-01

    X-ray fluorescence spectroscopy has become a widely accepted method in the metallurgical field for the analysis of both minor and major elements. As encountered in many other analytical techniques, the problem of matrix effects, generally known as interelemental effects, must be dealt with effectively in order to make the analysis accurate. There are several methods by which the effects of the matrix on the analyte are minimised or corrected for, and mathematical correction is one of them. In this method, the characteristic secondary X-ray intensities are measured from standard samples and correction coefficients, if any, for interelemental effects are evaluated by mathematical calculation. This paper describes attempts to evaluate the correction coefficients for interelemental effects by multiple linear regression programmes using a computer for the quantitative analysis of stainless steel and a nickel-base cast alloy. The quantitative results obtained using this method for a standard stainless steel sample are compared with the certified values. (author)
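
    The mathematical-correction step can be sketched as an ordinary least-squares fit of certified concentrations against the measured line intensities; all numbers below are invented calibration data, not the paper's standards.

      import numpy as np

      # Columns: hypothetical Fe, Cr and Ni line intensities for four standards.
      intensities = np.array([[0.82, 0.35, 0.11],
                              [0.78, 0.41, 0.10],
                              [0.74, 0.47, 0.12],
                              [0.70, 0.52, 0.14]])
      cr_certified = np.array([17.2, 18.1, 19.0, 19.8])      # wt.% Cr

      # Fit Cr = b0 + b1*I_Fe + b2*I_Cr + b3*I_Ni; the cross terms absorb
      # interelemental (matrix) effects.
      A = np.column_stack([np.ones(len(cr_certified)), intensities])
      coef, *_ = np.linalg.lstsq(A, cr_certified, rcond=None)
      print("correction coefficients:", coef.round(3))
      print("back-predicted Cr for standard 1:", round(A[0] @ coef, 2), "wt.%")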

  20. PIQMIe: A web server for semi-quantitative proteomics data management and analysis

    NARCIS (Netherlands)

    A. Kuzniar (Arnold); R. Kanaar (Roland)

    2014-01-01

    textabstractWe present the Proteomics Identifications and Quantitations Data Management and Integration Service or PIQMIe that aids in reliable and scalable data management, analysis and visualization of semi-quantitative mass spectrometry based proteomics experiments. PIQMIe readily integrates

  1. Critical appraisal of semi-quantitative analysis of 2-deoxyglucose autoradiograms

    Energy Technology Data Exchange (ETDEWEB)

    Kelly, P T; McCulloch, J [Glasgow Univ. (UK)

    1983-06-13

    Semi-quantitative analysis (e.g. optical density ratios) of (/sup 14/C)2-deoxyglucose autoradiograms is widely used in neuroscience research. The authors demonstrate that a fixed ratio of /sup 14/C-concentrations in the CNS does not yield a constant optical density ratio but is dependent upon the exposure time in the preparation of the autoradiograms and the absolute amounts of /sup 14/C from which the concentration ratio is derived. The failure of a fixed glucose utilization ratio to result in a constant optical density ratio represents a major interpretative difficulty in investigations where only semi-quantitative analysis of (/sup 14/C)2-deoxyglucose autoradiograms is undertaken.

  2. Quantitative-genetic analysis of wing form and bilateral asymmetry ...

    Indian Academy of Sciences (India)

    Unknown

    Keywords (fragment): … lines; Procrustes analysis; wing shape; wing size. Abstract fragments: Models of stochastic gene expression predict that intrinsic noise … Quantitative parameters of wing size and shape asymmetries … the residuals of a regression on centroid size produced …

  3. Quantitative Analysis of Complex Tropical Forest Stands: A Review ...

    African Journals Online (AJOL)

    The importance of data analysis in quantitative assessment of natural resources … Data collection design is an important process in complex forest statistical … Ideally, the sample size should be equal among groups and sufficiently large.

  4. Transportation and quantitative analysis of socio-economic development of relations

    Science.gov (United States)

    Chen, Yun

    2017-12-01

    Transportation has a close relationship with socio-economic development. This article selects indicators that can measure the development of transportation and of the socio-economy and uses correlation analysis, regression analysis, transportation intensity analysis and transport elasticity analysis to analyze the relationship between them quantitatively, so that it can provide practical guidance for future national development planning.
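
    The transport-elasticity part of such an analysis reduces to a simple ratio of relative changes; a worked example with made-up figures:

      # Elasticity = (% change in transport volume) / (% change in GDP) over one period.
      def transport_elasticity(t0, t1, g0, g1):
          return ((t1 - t0) / t0) / ((g1 - g0) / g0)

      freight = (410.0, 452.0)    # hypothetical freight turnover at start and end of period
      gdp = (68.9, 74.4)          # hypothetical GDP over the same period
      print(round(transport_elasticity(*freight, *gdp), 2))   # >1: transport grows faster than GDP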

  5. Quantitative high-resolution genomic analysis of single cancer cells.

    Science.gov (United States)

    Hannemann, Juliane; Meyer-Staeckling, Sönke; Kemming, Dirk; Alpers, Iris; Joosse, Simon A; Pospisil, Heike; Kurtz, Stefan; Görndt, Jennifer; Püschel, Klaus; Riethdorf, Sabine; Pantel, Klaus; Brandt, Burkhard

    2011-01-01

    During cancer progression, specific genomic aberrations arise that can determine the scope of the disease and can be used as predictive or prognostic markers. The detection of specific gene amplifications or deletions in single blood-borne or disseminated tumour cells that may give rise to the development of metastases is of great clinical interest but technically challenging. In this study, we present a method for quantitative high-resolution genomic analysis of single cells. Cells were isolated under permanent microscopic control followed by high-fidelity whole genome amplification and subsequent analyses by fine tiling array-CGH and qPCR. The assay was applied to single breast cancer cells to analyze the chromosomal region centred by the therapeutical relevant EGFR gene. This method allows precise quantitative analysis of copy number variations in single cell diagnostics.

  6. Time dependent analysis of Xenon spatial oscillations in small power reactors

    International Nuclear Information System (INIS)

    Decco, Claudia Cristina Ghirardello

    1997-01-01

    This work presents a time-dependent analysis of xenon spatial oscillations, studying the influence of the power density distribution, the type of reactivity perturbation, the power level and the core size, using one-dimensional and three-dimensional analyses with the MID2 and CITATION codes, respectively. It is concluded that small pressurized water reactors with a height smaller than 1.5 m are stable and do not exhibit xenon spatial oscillations. (author)

  7. Quantitative Proteomic Analysis of Sulfolobus solfataricus Membrane Proteins

    NARCIS (Netherlands)

    Pham, T.K.; Sierocinski, P.; Oost, van der J.; Wright, P.C.

    2010-01-01

    A quantitative proteomic analysis of the membrane of the archaeon Sulfolobus solfataricus P2 using iTRAQ was successfully demonstrated in this technical note. The estimated number of membrane proteins of this organism is 883 (predicted based on Gravy score), corresponding to 30 % of the total

  8. Spectro-spatial analysis of wave packet propagation in nonlinear acoustic metamaterials

    Science.gov (United States)

    Zhou, W. J.; Li, X. P.; Wang, Y. S.; Chen, W. Q.; Huang, G. L.

    2018-01-01

    The objective of this work is to analyze wave packet propagation in weakly nonlinear acoustic metamaterials and to reveal the interior nonlinear wave mechanism through spectro-spatial analysis. The spectro-spatial analysis is based on a full-scale transient analysis of the finite system, by which dispersion curves are generated from the transmitted waves and also verified by the perturbation method (the L-P method). We found that the spectro-spatial analysis can provide detailed information about the solitary wave in the short-wavelength region which cannot be captured by the L-P method. It is also found that the optical wave modes in the nonlinear metamaterial are sensitive to the parameters of the nonlinear constitutive relation. Specifically, a significant frequency shift phenomenon is found in the middle-wavelength region of the optical wave branch, which makes this frequency region behave like a band gap for transient waves. This special frequency shift is then used to design a direction-biased waveguide device, and its efficiency is shown by numerical simulations.

  9. Assessment of tuberculosis spatial hotspot areas in Antananarivo, Madagascar, by combining spatial analysis and genotyping.

    Science.gov (United States)

    Ratovonirina, Noël Harijaona; Rakotosamimanana, Niaina; Razafimahatratra, Solohery Lalaina; Raherison, Mamy Serge; Refrégier, Guislaine; Sola, Christophe; Rakotomanana, Fanjasoa; Rasolofo Razanamparany, Voahangy

    2017-08-14

    Tuberculosis (TB) remains a public health problem in Madagascar. A crucial element of TB control is the development of an easy and rapid method for orienting TB control strategies in the country. Our main objective was to develop a TB spatial hotspot identification method by combining spatial analysis and a TB genotyping method in Antananarivo. Sputa of new pulmonary TB cases from 20 TB diagnosis and treatment centers (DTCs) in Antananarivo were collected from August 2013 to May 2014 for culture. Mycobacterium tuberculosis complex (MTBC) clinical isolates were typed by spoligotyping on a Luminex® 200 platform. All TB patients were localized according to their neighborhood of residence, and the spatial distributions of all pulmonary TB patients and of patients with genotypically clustered isolates were scanned with the Kulldorff spatial scanning method to identify significant spatial clustering. Areas exhibiting spatial clustering of patients with genotypically clustered isolates were considered hotspot TB areas for transmission. Overall, 467 new cases were included in the study, and 394 spoligotypes were obtained (84.4%). New TB cases were distributed in 133 of the 192 Fokontany (administrative neighborhoods) of Antananarivo (1 to 15 clinical patients per Fokontany) and patients with genotypically clustered isolates were distributed in 127 of the 192 Fokontany (1 to 13 per Fokontany). A single spatial focal point of epidemics was detected when ignoring genotypic data (p = 0.039). One Fokontany of this focal point and three additional ones were detected to be spatially clustered when taking genotypes into account (p … Madagascar and will allow better TB control strategies by public health authorities.

  10. Quantitative X-ray analysis of biological fluids: the microdroplet technique

    International Nuclear Information System (INIS)

    Roinel, N.

    1988-01-01

    X-ray microanalysis can be used to quantitatively determine the elemental composition of microvolumes of biological fluids. This article describes the various steps in preparation of microdroplets for analysis: The manufacturing of micropipettes, the preparation of the specimen support, the deposition of droplets on the support, shock-freezing, and lyophilization. Examples of common artifacts (incomplete rehydration prior to freezing or partial rehydration after lyophilization) are demonstrated. Analysis can be carried out either by wavelength-dispersive analysis, which is the most sensitive method, or by energy-dispersive analysis, which is more commonly available. The minimum detectable concentration is 0.05 mmol.liter-1 for 0.1-nl samples analyzed by wavelength-dispersive spectrometry and 0.5-1 mmol.liter-1 for samples analyzed by energy-dispersive spectrometry. A major problem, especially in wavelength-dispersive analysis, where high beam currents are used, is radiation damage to the specimen; in particular chloride (but also other elements) can be lost. Quantitative analysis requires the use of standard solutions with elemental concentration in the same range as those present in the specimen

  11. A quantitative method for zoning of protected areas and its spatial ecological implications.

    Science.gov (United States)

    Del Carmen Sabatini, María; Verdiell, Adriana; Rodríguez Iglesias, Ricardo M; Vidal, Marta

    2007-04-01

    Zoning is a key prescriptive tool for the administration and management of protected areas. However, the lack of zoning is common for most protected areas in developing countries and, as a consequence, many protected areas are not effective in achieving the goals for which they were created. In this work, we introduce a quantitative method to expeditiously zone protected areas and we evaluate its ecological implications on hypothetical zoning cases. A real-world application is reported for the Talampaya National Park, a UNESCO World Heritage Site located in Argentina. Our method is a modification of the zoning forest model developed by Bos [Bos, J., 1993. Zoning in forest management: a quadratic assignment problem solved by simulated annealing. Journal of Environmental Management 37, 127-145.]. The main innovations involve a quadratic function of the distance between land units, non-reciprocal weights for adjacent land uses (mathematically represented by a non-symmetric matrix), and the possibility of imposing a connectivity constraint. Due to its intrinsic spatial dimension, the zoning problem belongs to the NP-hard class, i.e. a solution can only be obtained in non-polynomial time [Nemhauser, G., Wolsey, L., 1988. Integer and Combinatorial Optimization. John Wiley, New York.]. For that purpose, we applied a simulated annealing heuristic implemented as a FORTRAN language routine. Our innovations were effective in achieving zoning designs more compatible with biological diversity protection. The quadratic distance term facilitated the delineation of core zones for elements of significance; the connectivity constraint minimized fragmentation; non-reciprocal land use weightings contributed to better representing management decisions, and influenced mainly the edge and shape of zones. This quantitative method can assist the zoning process within protected areas by offering many zonation scheme alternatives with minimum cost, time and effort. This ability provides a new tool to
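
    A toy simulated-annealing sketch of the zoning idea: land uses are assigned to grid units and a cost built from non-reciprocal use-compatibility weights and an inverse-squared distance term is minimized. The weights, grid and cooling schedule are all hypothetical, and the distance term is only one possible reading of the "quadratic function of distance"; the authors' FORTRAN routine is not reproduced here.

      import math
      import random

      random.seed(1)
      n_units, uses = 25, ["core", "buffer", "use"]            # 5x5 grid of land units
      coords = [(i // 5, i % 5) for i in range(n_units)]

      # Non-reciprocal weights: cost of the use of unit i being near the use of unit j.
      W = {("core", "use"): 4.0, ("use", "core"): 1.0, ("core", "buffer"): 0.5,
           ("buffer", "core"): 0.5, ("buffer", "use"): 1.0, ("use", "buffer"): 1.5}

      def cost(assign):
          c = 0.0
          for i in range(n_units):
              for j in range(n_units):
                  if i != j:
                      d2 = (coords[i][0] - coords[j][0]) ** 2 + (coords[i][1] - coords[j][1]) ** 2
                      c += W.get((assign[i], assign[j]), 0.0) / d2
          return c

      assign = [random.choice(uses) for _ in range(n_units)]
      current, temp = cost(assign), 5.0
      for _ in range(4000):
          i = random.randrange(n_units)
          old = assign[i]
          assign[i] = random.choice(uses)
          candidate = cost(assign)
          if candidate < current or random.random() < math.exp((current - candidate) / temp):
              current = candidate                      # accept the move
          else:
              assign[i] = old                          # revert it
          temp *= 0.999                                # geometric cooling
      print("final zoning cost:", round(current, 1))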

  12. A spatial epidemiological analysis of self-rated mental health in the slums of Dhaka

    Directory of Open Access Journals (Sweden)

    Müller Daniel

    2011-05-01

    Full Text Available Abstract Background The deprived physical environments present in slums are well known to have adverse health effects on their residents. However, little is known about the health effects of the social environments in slums. Moreover, quantitative spatial analyses of the mental health status of slum residents at the neighbourhood level are still rare. The aim of this paper is to study self-rated mental health data in several slums of Dhaka, Bangladesh, accounting for neighbourhood social and physical associations using spatial statistics. We hypothesised that mental health would show a significant spatial pattern in different population groups, and that the spatial patterns would relate to spatially correlated health-determining factors (HDF). Methods We applied a spatial epidemiological approach, including non-spatial ANOVA/ANCOVA, as well as global and local univariate and bivariate Moran's I statistics. The WHO-5 Well-being Index was used as a measure of self-rated mental health. Results We found that poor mental health (WHO-5 scores … Conclusions Spatial patterns of mental health were detected and could be partly explained by spatially correlated HDF. We thereby showed that the socio-physical neighbourhood was significantly associated with health status, i.e., mental health at one location was spatially dependent on the mental health and HDF prevalent at neighbouring locations. Furthermore, the spatial patterns point to severe health disparities both within and between the slums. In addition to examining health outcomes, the methodology used here is also applicable to the residuals of regression models, for example helping to avoid violating the assumption of data independence that underlies many statistical approaches. We assume that similar spatial structures can be found in other studies focussing on neighbourhood effects on health, and therefore argue for a more widespread incorporation of spatial statistics in epidemiological studies.

  13. Method of quantitative x-ray diffractometric analysis of Ta-Ta2C system

    International Nuclear Information System (INIS)

    Gavrish, A.A.; Glazunov, M.P.; Korolev, Yu.M.; Spitsyn, V.I.; Fedoseev, G.K.

    1976-01-01

    The system Ta-Ta2C has been considered because of specific features of the diffraction patterns of its components, namely, the overlapping of the most intensive reflexes of both phases. The method of the standard binary system has been used for quantitative analysis. Because of the overlapping of the intensive reflexes dsub(101)=2.36 A (Ta2C) and dsub(110)=2.33 A (Ta), other intensive reflexes have been used for the quantitative determination of Ta2C and Ta: dsub(103)=1.404 A for tantalum subcarbide and dsub(211)=1.35 A for tantalum. In addition, the Ta and Ta2C phases have been determined quantitatively with the use of another pair of reflexes: dsub(102)=1.82 A for Ta2C and dsub(200)=1.65 A for tantalum. The agreement between the results obtained in the quantitative phase analysis is good. To increase the reliability and accuracy of the quantitative determination of Ta and Ta2C, it is expedient to carry out the analysis with the use of the two above-mentioned pairs of reflexes located in different regions of the diffraction spectrum. Thus, a procedure for the quantitative analysis of Ta and Ta2C in different ratios has been developed, taking into account the specific features of the diffraction patterns of these components as well as the tendency of Ta2C to develop texture during preparation
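
    With the standard binary-system method, the intensity ratio of the chosen reflex pair maps to a mass ratio through a calibration constant determined from mixtures of known composition; a small sketch with invented intensities and an invented constant:

      # I_Ta2C / I_Ta = K * w / (1 - w), with K fixed from standards of known composition.
      K = 1.35                                   # hypothetical calibration constant

      def ta2c_mass_fraction(i_ta2c, i_ta):
          r = i_ta2c / (K * i_ta)
          return r / (1.0 + r)                   # mass fraction of Ta2C

      print(round(ta2c_mass_fraction(i_ta2c=820.0, i_ta=1260.0), 3))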

  14. A Quantitative Analysis of the Behavioral Checklist of the Movement ABC Motor Test

    Science.gov (United States)

    Ruiz, Luis Miguel; Gomez, Marta; Graupera, Jose Luis; Gutierrez, Melchor; Linaza, Jose Luis

    2007-01-01

    The fifth section of the Henderson and Sugden's Movement ABC Checklist is part of the general Checklist that accompanies The Movement ABC Battery. The authors maintain that the analysis of this section must be mainly qualitative instead of quantitative. The main objective of this study was to employ a quantitative analysis of this behavioural…

  15. Quantitative assessment of in-solution digestion efficiency identifies optimal protocols for unbiased protein analysis

    DEFF Research Database (Denmark)

    Leon, Ileana R; Schwämmle, Veit; Jensen, Ole N

    2013-01-01

    a combination of qualitative and quantitative LC-MS/MS methods and statistical data analysis. In contrast to previous studies we employed both standard qualitative as well as data-independent quantitative workflows to systematically assess trypsin digestion efficiency and bias using mitochondrial protein … conditions (buffer, RapiGest, deoxycholate, urea), and two methods for removal of detergents prior to analysis of peptides (acid precipitation or phase separation with ethyl acetate). Our data-independent quantitative LC-MS/MS workflow quantified over 3700 distinct peptides with 96% completeness between all … protocols and replicates, with an average 40% protein sequence coverage and an average of 11 peptides identified per protein. Systematic quantitative and statistical analysis of physicochemical parameters demonstrated that deoxycholate-assisted in-solution digestion combined with phase transfer allows …

  16. Chemical Fingerprint Analysis and Quantitative Analysis of Rosa rugosa by UPLC-DAD

    Directory of Open Access Journals (Sweden)

    Sanawar Mansur

    2016-12-01

    Full Text Available A method based on ultra performance liquid chromatography with a diode array detector (UPLC-DAD) was developed for quantitative analysis of five active compounds and chemical fingerprint analysis of Rosa rugosa. Ten batches of R. rugosa collected from different plantations in the Xinjiang region of China were used to establish the fingerprint. The feasibility and advantages of the used UPLC fingerprint were verified for its similarity evaluation by systematically comparing chromatograms with professional analytical software recommended by the State Food and Drug Administration (SFDA) of China. In quantitative analysis, the five compounds showed good regression (R2 = 0.9995) within the test ranges, and the recovery of the method was in the range of 94.2%–103.8%. The similarities of liquid chromatography fingerprints of 10 batches of R. rugosa were more than 0.981. The developed UPLC fingerprint method is simple, reliable, and validated for the quality control and identification of R. rugosa. Additionally, simultaneous quantification of five major bioactive ingredients in the R. rugosa samples was conducted to interpret the consistency of the quality test. The results indicated that the UPLC fingerprint, as a characteristic distinguishing method combining similarity evaluation and quantification analysis, can be successfully used to assess the quality and to identify the authenticity of R. rugosa.

  17. Abundant Topological Outliers in Social Media Data and Their Effect on Spatial Analysis.

    Science.gov (United States)

    Westerholt, Rene; Steiger, Enrico; Resch, Bernd; Zipf, Alexander

    2016-01-01

    Twitter and related social media feeds have become valuable data sources to many fields of research. Numerous researchers have thereby used social media posts for spatial analysis, since many of them contain explicit geographic locations. However, despite its widespread use within applied research, a thorough understanding of the underlying spatial characteristics of these data is still lacking. In this paper, we investigate how topological outliers influence the outcomes of spatial analyses of social media data. These outliers appear when different users contribute heterogeneous information about different phenomena simultaneously from similar locations. As a consequence, various messages representing different spatial phenomena are captured closely to each other, and are at risk to be falsely related in a spatial analysis. Our results reveal indications for corresponding spurious effects when analyzing Twitter data. Further, we show how the outliers distort the range of outcomes of spatial analysis methods. This has significant influence on the power of spatial inferential techniques, and, more generally, on the validity and interpretability of spatial analysis results. We further investigate how the issues caused by topological outliers are composed in detail. We unveil that multiple disturbing effects are acting simultaneously and that these are related to the geographic scales of the involved overlapping patterns. Our results show that at some scale configurations, the disturbances added through overlap are more severe than at others. Further, their behavior turns into a volatile and almost chaotic fluctuation when the scales of the involved patterns become too different. Overall, our results highlight the critical importance of thoroughly considering the specific characteristics of social media data when analyzing them spatially.

  18. Spatial analysis of 4,5-dichloro-2-n-octyl-4-isothiazolin-3-one (Sea-Nine 211) concentrations and probabilistic risk to marine organisms in Hiroshima Bay, Japan

    International Nuclear Information System (INIS)

    Mochida, Kazuhiko; Hano, Takeshi; Onduka, Toshimitsu; Ichihashi, Hideki; Amano, Haruna; Ito, Mana; Ito, Katsutoshi; Tanaka, Hiroyuki; Fujii, Kazunori

    2015-01-01

    We analyzed the spatial distribution of an antifouling biocide, 4,5-dichloro-2-n-octyl-4-isothiazolin-3-one (Sea-Nine 211), in the surface water and sediments of Hiroshima Bay, Japan, to determine the extent of contamination by this biocide. A quantitative estimate of the environmental concentration distribution (ECD) and species sensitivity distributions (SSDs) for marine organisms were derived by using a Bayesian statistical model to carry out a probabilistic ecological risk analysis, such as calculation of the expected potentially affected fraction (EPAF). The spatial distribution analysis supported the notion that Sea-Nine 211 is used mainly for treatment of ship hulls in Japan. The calculated EPAF suggests that up to approximately 0.45% of marine species are affected by the toxicity of Sea-Nine 211 in Hiroshima Bay. In addition, estimation of the ecological risk with a conventional risk quotient method indicated that the risk was a cause for concern in Hiroshima Bay. - Highlights: • Spatial distribution analysis exhibits the dynamics of Sea-Nine 211 in Hiroshima Bay. • Probabilistic ecological risk of the biocide was estimated with a Bayesian approach. • Approximately up to 0.45% of marine species were possibly influenced by the toxicity. • The risk analysis concludes that Sea-Nine 211 should be a priority for further work. - Spatial distribution of an antifouling biocide and quantification of its ecological risk were elucidated
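
    The EPAF can be approximated by averaging the species sensitivity distribution over draws from the exposure distribution; the lognormal parameters below are illustrative only, not the fitted values from this study.

      import numpy as np
      from scipy.stats import lognorm

      ecd = lognorm(s=0.9, scale=0.02)   # hypothetical exposure (ECD), ug/L
      ssd = lognorm(s=1.1, scale=0.80)   # hypothetical species sensitivity distribution, ug/L

      # EPAF = E_{C ~ ECD}[ SSD_cdf(C) ], estimated by Monte Carlo.
      rng = np.random.default_rng(11)
      concentrations = ecd.rvs(size=50_000, random_state=rng)
      epaf = ssd.cdf(concentrations).mean()
      print(f"EPAF ~ {100 * epaf:.3f}% of species affected")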

  19. Phase analysis in duplex stainless steel: comparison of EBSD and quantitative metallography methods

    International Nuclear Information System (INIS)

    Michalska, J; Chmiela, B

    2014-01-01

    The purpose of the research was to work out the qualitative and quantitative analysis of phases in DSS in the as-received state and after thermal aging. For qualitative purposes, SEM observations, EDS analyses and electron backscatter diffraction (EBSD) methods were employed. Quantitative analysis of phases was performed by two methods: EBSD and classical quantitative metallography. A juxtaposition of different etchants for revealing the microstructure and a brief review of sample preparation methods for EBSD studies are presented. Different ways of sample preparation were tested, and based on these results a detailed methodology for DSS phase analysis was developed, including surface finishing, selective etching methods and image acquisition. The advantages and disadvantages of the applied methods are pointed out, and the accuracy of the phase analysis performed by both methods is compared

  20. Comparative study of standard space and real space analysis of quantitative MR brain data.

    Science.gov (United States)

    Aribisala, Benjamin S; He, Jiabao; Blamire, Andrew M

    2011-06-01

    To compare the robustness of region of interest (ROI) analysis of magnetic resonance imaging (MRI) brain data in real space with analysis in standard space and to test the hypothesis that standard space image analysis introduces more partial volume effect errors compared to analysis of the same dataset in real space. Twenty healthy adults with no history or evidence of neurological diseases were recruited; high-resolution T(1)-weighted, quantitative T(1), and B(0) field-map measurements were collected. Algorithms were implemented to perform analysis in real and standard space and used to apply a simple standard ROI template to quantitative T(1) datasets. Regional relaxation values and histograms for both gray and white matter tissues classes were then extracted and compared. Regional mean T(1) values for both gray and white matter were significantly lower using real space compared to standard space analysis. Additionally, regional T(1) histograms were more compact in real space, with smaller right-sided tails indicating lower partial volume errors compared to standard space analysis. Standard space analysis of quantitative MRI brain data introduces more partial volume effect errors biasing the analysis of quantitative data compared to analysis of the same dataset in real space. Copyright © 2011 Wiley-Liss, Inc.

  1. Statistical shape analysis using 3D Poisson equation--A quantitatively validated approach.

    Science.gov (United States)

    Gao, Yi; Bouix, Sylvain

    2016-05-01

    Statistical shape analysis has been an important area of research with applications in biology, anatomy, neuroscience, agriculture, paleontology, etc. Unfortunately, the proposed methods are rarely quantitatively evaluated, and as shown in recent studies, when they are evaluated, significant discrepancies exist in their outputs. In this work, we concentrate on the problem of finding the consistent location of deformation between two population of shapes. We propose a new shape analysis algorithm along with a framework to perform a quantitative evaluation of its performance. Specifically, the algorithm constructs a Signed Poisson Map (SPoM) by solving two Poisson equations on the volumetric shapes of arbitrary topology, and statistical analysis is then carried out on the SPoMs. The method is quantitatively evaluated on synthetic shapes and applied on real shape data sets in brain structures. Copyright © 2016 Elsevier B.V. All rights reserved.

  2. 3D Auger quantitative depth profiling of individual nanoscaled III–V heterostructures

    Energy Technology Data Exchange (ETDEWEB)

    Hourani, W. [Univ. Grenoble Alpes, F-38000 Grenoble (France); CEA, LETI, MINATEC Campus, F-38054 Grenoble (France); Gorbenko, V. [Univ. Grenoble Alpes, F-38000 Grenoble (France); CEA, LETI, MINATEC Campus, F-38054 Grenoble (France); Univ. Grenoble Alpes, LTM, CNRS, F-38000 Grenoble (France); Barnes, J.-P.; Guedj, C. [Univ. Grenoble Alpes, F-38000 Grenoble (France); CEA, LETI, MINATEC Campus, F-38054 Grenoble (France); Cipro, R.; Moeyaert, J.; David, S.; Bassani, F.; Baron, T. [Univ. Grenoble Alpes, LTM, CNRS, F-38000 Grenoble (France); Martinez, E., E-mail: eugenie.martinez@cea.fr [Univ. Grenoble Alpes, F-38000 Grenoble (France); CEA, LETI, MINATEC Campus, F-38054 Grenoble (France)

    2016-11-15

    Highlights: • The nanoscale chemical characterization of III–V heterostructures is performed using Auger depth profiling below decananometric spatial resolution. • Reliable indium quantification is achieved on planar structures for thicknesses down to 9 nm. • Quantitative 3D compositional depth profiles are obtained on patterned structures, with sufficient lateral resolution to analyze one single trench. • The Auger intrinsic spatial resolution is estimated around 150–200 nm using a comparison with HAADF-STEM. • Auger and SIMS provide reliable in-depth chemical analysis of such complex 3D heterostructures, in particular regarding indium quantification. - Abstract: The nanoscale chemical characterization of III–V heterostructures is performed using Auger depth profiling below decananometric spatial resolution. This technique is successfully applied to quantify the elemental composition of planar and patterned III–V heterostructures containing InGaAs quantum wells. Reliable indium quantification is achieved on planar structures for thicknesses down to 9 nm. Quantitative 3D compositional depth profiles are obtained on patterned structures, for trench widths down to 200 nm. The elemental distributions obtained in averaged and pointed mode are compared. For this last case, we show that Zalar rotation during sputtering is crucial for a reliable indium quantification. Results are confirmed by comparisons with secondary ion mass spectrometry, photoluminescence spectroscopy, transmission electron microscopy and electron dispersive X-ray spectroscopy. The Auger intrinsic spatial resolution is quantitatively measured using an original methodology based on the comparison with high angle annular dark field scanning transmission electron microscopy measurements at the nanometric scale.

  3. SPATIAL ANALYSIS AND DECISION ASSISTANCE (SADA) TRAINING COURSE

    Science.gov (United States)

    Spatial Analysis and Decision Assistance (SADA) is a Windows freeware program that incorporates tools from environmental assessment into an effective problem-solving environment. SADA was developed by the Institute for Environmental Modeling at the University of Tennessee and inc...

  4. Spatial Data Analysis: Recommendations for Educational Infrastructure in Sindh

    Directory of Open Access Journals (Sweden)

    Abdul Aziz Ansari

    2017-06-01

    Full Text Available Analysing the education infrastructure has become a crucial activity in imparting quality teaching and resources to students. Identifying the facilities required to improve the current education status and future schools is an important analytical component. This is best achieved through a Geographical Information System (GIS) analysis of the spatial distribution of schools. In this work, we will execute GIS analytics on the rural and urban school distributions in Sindh, Pakistan. Using a reliable dataset collected by an international survey team, GIS analysis is done with respect to: (1) school locations, (2) school facilities (water, sanitation, classrooms, etc.), and (3) students' results. We will carry out the analysis at district level by presenting several spatial results. Correlational analysis of highly influential factors that may impact educational performance will generate recommendations for planning and development in weak areas, providing useful insights regarding effective utilization of resources and new locations to build future schools. The time series analysis will predict future results, which may be verified through further observation and data collection.

  5. [Ecological risk assessment of land use based on exploratory spatial data analysis (ESDA): a case study of Haitan Island, Fujian Province].

    Science.gov (United States)

    Wu, Jian; Chen, Peng; Wen, Chao-Xiang; Fu, Shi-Feng; Chen, Qing-Hui

    2014-07-01

    As a novel environment management tool, ecological risk assessment has provided a new perspective for the quantitative evaluation of ecological effects of land-use change. In this study, Haitan Island in Fujian Province was taken as a case. Based on the Landsat TM images obtained in 1990, SPOT5 RS images obtained in 2010, the general layout planning map of Pingtan Comprehensive Experimental Zone in 2030, as well as field investigation data, we established an ecological risk index to measure ecological endpoints. By using the spatial autocorrelation and semivariance analyses of Exploratory Spatial Data Analysis (ESDA), the ecological risk of Haitan Island under different land-use situations was assessed, including the past (1990), present (2010) and future (2030), and the potential risk and its changing trend were analyzed. The results revealed that the ecological risk index showed an obvious scale effect, with strong positive correlation within 3000 meters. High-high (HH) and low-low (LL) aggregations were the predominant types in the spatial distribution of the ecological risk index. The ecological risk index showed significant isotropic characteristics, and its spatial distribution was consistent with the Anselin Local Moran's I (LISA) distribution during the same period. Dramatic spatial distribution change of each ecological risk area was found among 1990, 2010 and 2030, and the fluctuation trend and amplitude of different ecological risk areas were diverse. The low ecological risk area showed a rise-to-fall trend while the medium and high ecological risk areas showed a fall-to-rise trend. In the planning period, due to intensive anthropogenic disturbance, the high ecological risk area spread throughout the whole region. To reduce the ecological risk in land-use and maintain the regional ecological security, the following ecological risk control strategies could be adopted, i.e., optimizing the spatial pattern of land resources, protecting the key ecoregions and controlling the scale of
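    The spatial autocorrelation measure at the heart of the ESDA workflow above is global Moran's I. A minimal NumPy sketch is given below; the four-zone values and the binary contiguity matrix are toy assumptions rather than the Haitan Island data, and libraries such as PySAL provide production implementations.

```python
# Global Moran's I for zone values y and a binary contiguity weight matrix W.
import numpy as np

def morans_i(y, W):
    y = np.asarray(y, dtype=float)
    z = y - y.mean()                       # deviations from the mean
    W = np.asarray(W, dtype=float)
    num = (W * np.outer(z, z)).sum()       # cross-products of neighbouring deviations
    den = (z ** 2).sum()
    return len(y) / W.sum() * num / den

# toy data: 4 zones along a transect, adjacent zones are neighbours
y = [0.9, 0.8, 0.2, 0.1]                   # e.g. ecological risk index per zone
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]])
print(morans_i(y, W))                      # positive value -> spatial clustering of risk
```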

  6. Deriving Quantitative Crystallographic Information from the Wavelength-Resolved Neutron Transmission Analysis Performed in Imaging Mode

    Directory of Open Access Journals (Sweden)

    Hirotaka Sato

    2017-12-01

    Full Text Available Current status of Bragg-edge/dip neutron transmission analysis/imaging methods is presented. The method can visualize real-space distributions of bulk crystallographic information in a crystalline material over a large area (~10 cm) with high spatial resolution (~100 μm). Furthermore, by using suitable spectrum analysis methods for wavelength-dependent neutron transmission data, quantitative visualization of the crystallographic information can be achieved. For example, crystallographic texture imaging, crystallite size imaging and crystalline phase imaging with texture/extinction corrections are carried out by the Rietveld-type (wide wavelength bandwidth) profile fitting analysis code, RITS (Rietveld Imaging of Transmission Spectra). By using the single Bragg-edge analysis mode of RITS, evaluations of crystal lattice plane spacing (d-spacing) relating to macro-strain and d-spacing distribution’s FWHM (full width at half maximum) relating to micro-strain can be achieved. Macro-strain tomography is performed by a new conceptual CT (computed tomography) image reconstruction algorithm, the tensor CT method. Crystalline grains and their orientations are visualized by a fast determination method of grain orientation for Bragg-dip neutron transmission spectrum. In this paper, these imaging examples with the spectrum analysis methods and the reliabilities evaluated by optical/electron microscope and X-ray/neutron diffraction are presented. In addition, the status at compact accelerator driven pulsed neutron sources is also presented.

  7. A method for three-dimensional quantitative observation of the microstructure of biological samples

    Science.gov (United States)

    Wang, Pengfei; Chen, Dieyan; Ma, Wanyun; Wu, Hongxin; Ji, Liang; Sun, Jialin; Lv, Danyu; Zhang, Lu; Li, Ying; Tian, Ning; Zheng, Jinggao; Zhao, Fengying

    2009-07-01

    Contemporary biology has developed into the era of cell biology and molecular biology, and researchers now try to study the mechanisms of all kinds of biological phenomena at the microcosmic level. Accurate description of the microstructure of biological samples is an exigent need in many biomedical experiments. This paper introduces a method for 3-dimensional quantitative observation of the microstructure of vital biological samples based on two-photon laser scanning microscopy (TPLSM). TPLSM is a novel kind of fluorescence microscopy, which excels in low optical damage, high resolution, deep penetration depth and suitability for 3-dimensional (3D) imaging. Fluorescently stained samples were observed by TPLSM, and afterward their original shapes were obtained through 3D image reconstruction. The spatial distribution of all objects in the samples, as well as their volumes, could be derived by image segmentation and mathematical calculation. Thus a 3-dimensional, quantitative depiction of the microstructure of the samples was finally derived. We applied this method to the quantitative analysis of the spatial distribution of chromosomes in meiotic mouse oocytes at metaphase, with satisfactory results.
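    As a rough illustration of the segmentation-plus-measurement step described above, the sketch below thresholds a 3D stack, labels connected objects and reports their volumes and centroids. The placeholder random stack, the threshold rule and the voxel dimensions are assumptions, not the authors' pipeline.

```python
# Hedged sketch: quantify objects in a 3D fluorescence stack.
import numpy as np
from scipy import ndimage as ndi

def quantify_objects(stack, voxel_volume_um3=0.1 * 0.1 * 0.5):  # assumed voxel size
    threshold = stack.mean() + 2 * stack.std()                  # simple global threshold
    binary = stack > threshold
    labels, n_obj = ndi.label(binary)                           # 3D connected components
    idx = np.arange(1, n_obj + 1)
    voxels_per_object = ndi.sum(binary, labels, index=idx)      # voxel counts
    centroids = ndi.center_of_mass(binary, labels, idx)         # spatial distribution
    return voxels_per_object * voxel_volume_um3, centroids

stack = np.random.poisson(5, size=(40, 256, 256)).astype(float)  # placeholder stack
volumes, centroids = quantify_objects(stack)
print(len(volumes), "objects detected")
```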

  8. Quantitative analysis of background parenchymal enhancement in whole breast on MRI: Influence of menstrual cycle and comparison with a qualitative analysis.

    Science.gov (United States)

    Jung, Yongsik; Jeong, Seong Kyun; Kang, Doo Kyoung; Moon, Yeorae; Kim, Tae Hee

    2018-06-01

    We quantitatively analyzed background parenchymal enhancement (BPE) in the whole breast according to menstrual cycle and compared it with a qualitative analysis method. A data set of breast magnetic resonance imaging (MRI) from 273 breast cancer patients was used. For quantitative analysis, we used semiautomated in-house software written in MATLAB. For each voxel of the whole breast, the software calculated BPE using the following equation: [(signal intensity [SI] at 1 min 30 s after contrast injection - baseline SI)/baseline SI] × 100%. In total, 53 patients had minimal, 108 mild, 87 moderate, and 25 marked BPE. On quantitative analysis, mean BPE values were 33.1% in the minimal, 42.1% in the mild, 59.1% in the moderate, and 81.9% in the marked BPE group, showing a significant difference (p = .009 for minimal vs. mild). The qualitative BPE grade correlated with quantitative BPE (r = 0.63). Quantitative analysis of BPE correlated well with the qualitative BPE grade. Quantitative BPE values were lowest in the second week and highest in the fourth week. Copyright © 2018 Elsevier B.V. All rights reserved.
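    The voxel-wise equation quoted above translates directly into code; in the sketch below the array names and the whole-breast mask are hypothetical, not the authors' software.

```python
# Mean whole-breast BPE from a pre-contrast and a 1 min 30 s post-contrast volume.
import numpy as np

def mean_bpe(baseline, post_90s, breast_mask, eps=1e-6):
    baseline = baseline.astype(float)
    bpe = (post_90s - baseline) / np.maximum(baseline, eps) * 100.0  # percent enhancement
    return bpe[breast_mask].mean()

# usage (hypothetical arrays): mean_bpe(pre, post, mask) -> e.g. ~33% for "minimal" BPE
```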

  9. Statistical analysis of long term spatial and temporal trends of ...

    Indian Academy of Sciences (India)

    Statistical analysis of long term spatial and temporal trends of temperature ... CGCM3; HadCM3; modified Mann–Kendall test; statistical analysis; Sutlej basin. ... Water Resources Systems Division, National Institute of Hydrology, Roorkee 247 ...

  10. Towards quantitative laser-induced breakdown spectroscopy analysis of soil samples

    International Nuclear Information System (INIS)

    Bousquet, B.; Sirven, J.-B.; Canioni, L.

    2007-01-01

    A quantitative analysis of chromium in soil samples is presented. Different emission lines related to chromium are studied in order to select the best one for quantitative features. Important matrix effects are demonstrated from one soil to the other, preventing any prediction of concentration in different soils on the basis of a univariate calibration curve. Finally, a classification of the LIBS data based on a series of Principal Component Analyses (PCA) is applied to a reduced dataset of selected spectral lines related to the major chemical elements in the soils. LIBS data of heterogeneous soils appear to be widely dispersed, which leads to a reconsideration of the sampling step in the analysis process
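    A minimal sketch of the classification step described above, restricting the spectra to a few selected emission lines before PCA; the variable names, channel indices and random spectra are placeholders, not the paper's data.

```python
# PCA-based exploration of LIBS spectra on a reduced set of spectral lines.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

spectra = np.random.rand(60, 2048)            # placeholder: 60 LIBS spectra
selected_channels = [120, 340, 865, 1290]     # hypothetical major-element lines
X = spectra[:, selected_channels]

scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
# widely dispersed scores for a heterogeneous soil would motivate revisiting
# the sampling step, as the abstract suggests
print(scores[:5])
```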

  11. Quantitative high-resolution genomic analysis of single cancer cells.

    Directory of Open Access Journals (Sweden)

    Juliane Hannemann

    Full Text Available During cancer progression, specific genomic aberrations arise that can determine the scope of the disease and can be used as predictive or prognostic markers. The detection of specific gene amplifications or deletions in single blood-borne or disseminated tumour cells that may give rise to the development of metastases is of great clinical interest but technically challenging. In this study, we present a method for quantitative high-resolution genomic analysis of single cells. Cells were isolated under permanent microscopic control followed by high-fidelity whole genome amplification and subsequent analyses by fine tiling array-CGH and qPCR. The assay was applied to single breast cancer cells to analyze the chromosomal region centred by the therapeutical relevant EGFR gene. This method allows precise quantitative analysis of copy number variations in single cell diagnostics.

  12. Quantitative risk analysis as a basis for emergency planning

    Energy Technology Data Exchange (ETDEWEB)

    Yogui, Regiane Tiemi Teruya [Bureau Veritas do Brasil, Rio de Janeiro, RJ (Brazil); Macedo, Eduardo Soares de [Instituto de Pesquisas Tecnologicas (IPT), Sao Paulo, SP (Brazil)

    2009-07-01

    Several environmental accidents occurred in Brazil and around the world during the 1970s and 1980s. This strongly motivated preparation for emergencies in the chemical and petrochemical industries. Environmental accidents affect the environment and the communities neighboring the industrial facilities. The present study aims to support and provide guidance for developing Emergency Planning from the data obtained in Quantitative Risk Analysis, elaborated according to Technical Standard P4.261/03 from CETESB (Sao Paulo Environmental Agency). It was observed during the research that the data generated in these studies need complementation and deeper analysis so that they can be used in Emergency Plans. The main issues analyzed and discussed in this study were the reevaluation of hazard identification for the emergency plans, the consequence and vulnerability analysis for response planning, risk communication, and the preparation of communities exposed to manageable risks to respond to emergencies. As a result, the study intends to improve the interpretation and use of the data derived from the Quantitative Risk Analysis to develop the emergency plans. (author)

  13. Image Chunking: Defining Spatial Building Blocks for Scene Analysis.

    Science.gov (United States)

    1987-04-01

    Image Chunking: Defining Spatial Building Blocks for Scene Analysis. James V. Mahoney, MIT Artificial Intelligence Laboratory, Technical Report 980.

  14. A Strategy for Identifying Quantitative Trait Genes Using Gene Expression Analysis and Causal Analysis

    Directory of Open Access Journals (Sweden)

    Akira Ishikawa

    2017-11-01

    Full Text Available Large numbers of quantitative trait loci (QTL) affecting complex diseases and other quantitative traits have been reported in humans and model animals. However, the genetic architecture of these traits remains elusive due to the difficulty in identifying causal quantitative trait genes (QTGs) for common QTL with relatively small phenotypic effects. A traditional strategy based on techniques such as positional cloning does not always enable identification of a single candidate gene for a QTL of interest because it is difficult to narrow down a target genomic interval of the QTL to a very small interval harboring only one gene. A combination of gene expression analysis and statistical causal analysis can greatly reduce the number of candidate genes. This integrated approach provides causal evidence that one of the candidate genes is a putative QTG for the QTL. Using this approach, I have recently succeeded in identifying a single putative QTG for resistance to obesity in mice. Here, I outline the integration approach and discuss its usefulness using my studies as an example.

  15. A Strategy for Identifying Quantitative Trait Genes Using Gene Expression Analysis and Causal Analysis.

    Science.gov (United States)

    Ishikawa, Akira

    2017-11-27

    Large numbers of quantitative trait loci (QTL) affecting complex diseases and other quantitative traits have been reported in humans and model animals. However, the genetic architecture of these traits remains elusive due to the difficulty in identifying causal quantitative trait genes (QTGs) for common QTL with relatively small phenotypic effects. A traditional strategy based on techniques such as positional cloning does not always enable identification of a single candidate gene for a QTL of interest because it is difficult to narrow down a target genomic interval of the QTL to a very small interval harboring only one gene. A combination of gene expression analysis and statistical causal analysis can greatly reduce the number of candidate genes. This integrated approach provides causal evidence that one of the candidate genes is a putative QTG for the QTL. Using this approach, I have recently succeeded in identifying a single putative QTG for resistance to obesity in mice. Here, I outline the integration approach and discuss its usefulness using my studies as an example.

  16. Quantitative analysis of patient-specific dosimetric IMRT verification

    International Nuclear Information System (INIS)

    Budgell, G J; Perrin, B A; Mott, J H L; Fairfoul, J; Mackay, R I

    2005-01-01

    Patient-specific dosimetric verification methods for IMRT treatments are variable, time-consuming and frequently qualitative, preventing evidence-based reduction in the amount of verification performed. This paper addresses some of these issues by applying a quantitative analysis parameter to the dosimetric verification procedure. Film measurements in different planes were acquired for a series of ten IMRT prostate patients, analysed using the quantitative parameter, and compared to determine the most suitable verification plane. Film and ion chamber verification results for 61 patients were analysed to determine long-term accuracy, reproducibility and stability of the planning and delivery system. The reproducibility of the measurement and analysis system was also studied. The results show that verification results are strongly dependent on the plane chosen, with the coronal plane particularly insensitive to delivery error. Unexpectedly, no correlation could be found between the levels of error in different verification planes. Longer term verification results showed consistent patterns which suggest that the amount of patient-specific verification can be safely reduced, provided proper caution is exercised: an evidence-based model for such reduction is proposed. It is concluded that dose/distance to agreement (e.g., 3%/3 mm) should be used as a criterion of acceptability. Quantitative parameters calculated for a given criterion of acceptability should be adopted in conjunction with displays that show where discrepancies occur. Planning and delivery systems which cannot meet the required standards of accuracy, reproducibility and stability to reduce verification will not be accepted by the radiotherapy community
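    The dose/distance-to-agreement criterion recommended above (e.g., 3%/3 mm) is commonly evaluated with a gamma-type index. The simplified 1-D sketch below illustrates the idea with toy profiles; it is not the verification software used in the study, and the grid spacing and normalisation are assumptions.

```python
# Simplified 1-D gamma-type dose/distance-to-agreement test (3%/3 mm).
import numpy as np

def gamma_1d(measured, calculated, spacing_mm=1.0, dose_crit=0.03, dist_crit_mm=3.0):
    x = np.arange(len(calculated)) * spacing_mm
    ref_dose = calculated.max()                       # dose differences normalised to the maximum
    gammas = np.empty(len(measured))
    for i, (xi, dm) in enumerate(zip(x, measured)):
        dose_term = (calculated - dm) / (dose_crit * ref_dose)
        dist_term = (x - xi) / dist_crit_mm
        gammas[i] = np.sqrt(dose_term ** 2 + dist_term ** 2).min()
    return gammas

calc = np.exp(-((np.arange(100) - 50) / 15.0) ** 2)   # toy calculated profile
meas = calc * 1.02                                    # 2% systematic offset
g = gamma_1d(meas, calc)
print("pass rate:", np.mean(g <= 1.0))                # fraction of points meeting 3%/3 mm
```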

  17. Fluorescent foci quantitation for high-throughput analysis

    Directory of Open Access Journals (Sweden)

    Elena Ledesma-Fernández

    2015-06-01

    Full Text Available A number of cellular proteins localize to discrete foci within cells, for example DNA repair proteins, microtubule organizing centers, P bodies or kinetochores. It is often possible to measure the fluorescence emission from tagged proteins within these foci as a surrogate for the concentration of that specific protein. We wished to develop tools that would allow quantitation of fluorescence foci intensities in high-throughput studies. As proof of principle we have examined the kinetochore, a large multi-subunit complex that is critical for the accurate segregation of chromosomes during cell division. Kinetochore perturbations lead to aneuploidy, which is a hallmark of cancer cells. Hence, understanding kinetochore homeostasis and regulation are important for a global understanding of cell division and genome integrity. The 16 budding yeast kinetochores colocalize within the nucleus to form a single focus. Here we have created a set of freely-available tools to allow high-throughput quantitation of kinetochore foci fluorescence. We use this ‘FociQuant’ tool to compare methods of kinetochore quantitation and we show proof of principle that FociQuant can be used to identify changes in kinetochore protein levels in a mutant that affects kinetochore function. This analysis can be applied to any protein that forms discrete foci in cells.
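    A minimal foci-quantitation sketch in the spirit of the tool described above (not the published FociQuant code): threshold a fluorescence image, label the foci and report the integrated intensity of each focus. The image source in the usage line is hypothetical.

```python
# Per-focus intensity quantitation from a single fluorescence channel.
import numpy as np
from skimage import filters, measure

def quantify_foci(image):
    mask = image > filters.threshold_otsu(image)            # separate foci from background
    labels = measure.label(mask)                             # one label per focus
    props = measure.regionprops(labels, intensity_image=image)
    # integrated intensity = mean intensity * area, one value per focus
    return [p.mean_intensity * p.area for p in props]

# usage (hypothetical image): intensities = quantify_foci(kinetochore_channel)
```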

  18. Analysing and Correcting the Differences between Multi-Source and Multi-Scale Spatial Remote Sensing Observations

    Science.gov (United States)

    Dong, Yingying; Luo, Ruisen; Feng, Haikuan; Wang, Jihua; Zhao, Jinling; Zhu, Yining; Yang, Guijun

    2014-01-01

    Differences exist among analysis results of agriculture monitoring and crop production based on remote sensing observations that are obtained at different spatial scales from multiple remote sensors in the same time period and processed by the same algorithms, models or methods. These differences can be quantitatively described mainly from three aspects, i.e. multiple remote sensing observations, crop parameter estimation models, and spatial scale effects of surface parameters. Our research proposes a new method to analyse and correct the differences between multi-source and multi-scale spatial remote sensing surface reflectance datasets, aiming to provide references for further studies in agricultural applications with multiple remotely sensed observations from different sources. The new method was constructed on the basis of the physical and mathematical properties of multi-source and multi-scale reflectance datasets. Statistical theory was used to extract statistical characteristics of the multiple surface reflectance datasets and to quantitatively analyse the spatial variations of these characteristics at multiple spatial scales. Then, taking the surface reflectance at the small spatial scale as the baseline data, Gaussian distribution theory was used to correct the multiple surface reflectance datasets based on the obtained physical characteristics, mathematical distribution properties, and their spatial variations. The proposed method was verified with two sets of multiple satellite images obtained in two experimental fields located in Inner Mongolia and Beijing, China, with different degrees of homogeneity of the underlying surfaces. Experimental results indicate that differences of surface reflectance datasets at multiple spatial scales could be effectively corrected over non-homogeneous underlying surfaces, which provides a database for further multi-source and multi-scale crop growth monitoring and yield prediction, and their corresponding
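    The core correction idea, aligning the statistical distribution of a coarse-scale reflectance band to the fine-scale baseline by matching Gaussian parameters, can be sketched as follows. The band names in the usage line are hypothetical and the paper's full model is more elaborate than this two-parameter matching.

```python
# Align one reflectance band to a baseline band by matching mean and standard deviation.
import numpy as np

def match_to_baseline(coarse_band, baseline_band):
    """Rescale a reflectance band so its Gaussian parameters match the baseline band."""
    z = (coarse_band - coarse_band.mean()) / coarse_band.std()   # standardise
    return z * baseline_band.std() + baseline_band.mean()        # re-express in baseline terms

# usage (hypothetical arrays): corrected = match_to_baseline(coarse_red, baseline_red)
```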

  19. A simple approach to quantitative analysis using three-dimensional spectra based on selected Zernike moments.

    Science.gov (United States)

    Zhai, Hong Lin; Zhai, Yue Yuan; Li, Pei Zhen; Tian, Yue Li

    2013-01-21

    A very simple approach to quantitative analysis is proposed based on the technology of digital image processing using three-dimensional (3D) spectra obtained by high-performance liquid chromatography coupled with a diode array detector (HPLC-DAD). As region-based shape features of a grayscale image, Zernike moments with an inherent invariance property were employed to establish the linear quantitative models. This approach was applied to the quantitative analysis of three compounds in mixed samples using 3D HPLC-DAD spectra, and three linear models were obtained, respectively. The correlation coefficients (R^2) for training and test sets were more than 0.999, and the statistical parameters and strict validation supported the reliability of the established models. The analytical results suggest that the Zernike moment selected by stepwise regression can be used in the quantitative analysis of target compounds. Our study provides a new idea for quantitative analysis using 3D spectra, which can be extended to the analysis of other 3D spectra obtained by different methods or instruments.
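    A sketch of the idea: describe each 3D-spectrum-derived grayscale image by its Zernike moments and regress concentration on one selected moment. The mahotas call is one commonly available implementation, not necessarily the authors' code, and the images, concentrations and chosen moment index are placeholders.

```python
# Zernike-moment descriptors plus a univariate linear calibration model.
import numpy as np
import mahotas

def zernike_descriptor(gray_image, radius=64, degree=8):
    return mahotas.features.zernike_moments(gray_image, radius, degree=degree)

images = [np.random.rand(128, 128) for _ in range(10)]     # placeholder 3D-spectrum images
concentrations = np.linspace(1.0, 10.0, 10)                # placeholder reference values

Z = np.array([zernike_descriptor(im) for im in images])
k = 3                                                       # hypothetical moment chosen by stepwise regression
slope, intercept = np.polyfit(Z[:, k], concentrations, 1)   # linear quantitative model
print(slope, intercept)
```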

  20. Spatial analysis methods and land-use planning models for rural areas

    Directory of Open Access Journals (Sweden)

    Patrizia Tassinari

    2009-10-01

    Full Text Available The work presents a brief report of the main results of a study carried out by the Spatial Engineering Division of the Department of Agricultural Economics and Engineering of the University of Bologna, within a broader PRIN 2005 research project concerning landscape and economic analysis, planning and programming. In particular, the study focuses on the design of spatial analysis methods aimed at building knowledge frameworks of the various natural and anthropic resources of rural areas. The goal is to increase the level of spatial and information detail of common databases, thus allowing higher accuracy and effectiveness of the analyses needed to achieve the goals of new generation spatial and agriculture planning. Specific in-depth analyses allowed to define techniques useful in order to reduce the increase in survey costs. Moreover, the work reports the main results regarding a multicriteria model for the analysis of the countryside defined by the research. Such model is aimed to assess the various agricultural, environmental and landscape features, vocations, expressions and attitudes, and support the definition and implementation of specific and targeted planning and programming policies.

  1. Single particle transfer for quantitative analysis with total-reflection X-ray fluorescence spectrometry

    International Nuclear Information System (INIS)

    Esaka, Fumitaka; Esaka, Konomi T.; Magara, Masaaki; Sakurai, Satoshi; Usuda, Shigekazu; Watanabe, Kazuo

    2006-01-01

    The technique of single particle transfer was applied to quantitative analysis with total-reflection X-ray fluorescence (TXRF) spectrometry. The technique was evaluated by performing quantitative analysis of individual Cu particles with diameters between 3.9 and 13.2 μm. Direct quantitative analysis of the Cu particles transferred onto a Si carrier gave a discrepancy between measured and calculated Cu amounts due to the absorption of incident and fluorescent X-rays within the particle. After correction for these absorption effects, the Cu amounts in individual particles could be determined with a deviation within 10.5%. When the Cu particles were dissolved with HNO3 solution prior to the TXRF analysis, the deviation improved to within 3.8%. In this case, no correction for absorption effects was needed for quantification.

  2. Quantitative analysis of thallium-201 myocardial scintigraphy

    International Nuclear Information System (INIS)

    Kanemoto, Nariaki; Hoer, G.; Johost, S.; Maul, F.-D.; Standke, R.

    1981-01-01

    A method for the quantitative analysis of thallium-201 myocardial scintigraphy using a computer-assisted technique is described. The calculated indices are the washout factor, vitality index and redistribution factor. The washout factor is the ratio of counts at a certain period of time after exercise to those immediately after exercise. This value is necessary for evaluating redistribution to ischemic areas in serial imaging, correcting for the Tl-201 washout from the myocardium under the assumption that washout is constant over the whole myocardium. The vitality index is the ratio between the Tl-201 uptake in the region of interest and the maximum uptake. The redistribution factor is the ratio of the redistribution in the region of interest on serial imaging after exercise to that immediately after exercise. Four examples of exercise Tl-201 myocardial scintigrams and the quantitative analyses before and after percutaneous transluminal coronary angioplasty are presented. (author)
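    One plausible formalisation of the three indices described above, with C_myo denoting whole-myocardium counts, C_ROI regional counts, C_max the counts in the maximally perfused region, and t the delayed imaging time; the exact normalisations used in the paper may differ.

```latex
% washout factor (global washout), vitality index and redistribution factor;
% the redistribution factor is subsequently corrected by dividing by the
% washout factor, under the assumption of uniform washout:
\begin{align*}
\text{washout factor}        &= \frac{C_{\mathrm{myo}}(t)}{C_{\mathrm{myo}}(0)} \\[2pt]
\text{vitality index}        &= \frac{C_{\mathrm{ROI}}(0)}{C_{\mathrm{max}}(0)} \\[2pt]
\text{redistribution factor} &= \frac{C_{\mathrm{ROI}}(t)}{C_{\mathrm{ROI}}(0)}
\end{align*}
```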

  3. Capacity analysis of spectrum sharing spatial multiplexing MIMO systems

    KAUST Repository

    Yang, Liang

    2014-12-01

    This paper considers a spectrum sharing (SS) multiple-input multiple-output (MIMO) system operating in a Rayleigh fading environment. First the capacity of a single-user SS spatial multiplexing system is investigated in two scenarios that assume different receivers. To explicitly show the capacity scaling law of SS MIMO systems, some approximate capacity expressions for the two scenarios are derived. Next, we extend our analysis to a multiple user system with zero-forcing receivers (ZF) under spatially-independent scheduling and analyze the sum-rate. Furthermore, we provide an asymptotic sum-rate analysis to investigate the effects of different parameters on the multiuser diversity gain. Our results show that the secondary system with a smaller number of transmit antennas Nt and a larger number of receive antennas Nr can achieve higher capacity at lower interference temperature Q, but at high Q the capacity follows the scaling law of the conventional MIMO systems. However, for a ZF SS spatial multiplexing system, the secondary system with small Nt and large Nr can achieve the highest capacity throughout the entire region of Q. For a ZF SS spatial multiplexing system with scheduling, the asymptotic sum-rate scales like Ntlog2(Q(KNtNp-1)/Nt), where Np denotes the number of antennas of the primary receiver and K represents the number of secondary transmitters.

  4. Violence in public transportation: an approach based on spatial analysis.

    Science.gov (United States)

    Sousa, Daiane Castro Bittencourt de; Pitombo, Cira Souza; Rocha, Samille Santos; Salgueiro, Ana Rita; Delgado, Juan Pedro Moreno

    2017-12-11

    To carry out a spatial analysis of the occurrence of acts of violence (specifically robberies) in public transportation, identifying the regions of greater incidence, using geostatistics, and possible causes with the aid of a multicriteria analysis in the Geographic Information System. The unit of analysis is the traffic analysis zone of the survey named Origem-Destino, carried out in Salvador, state of Bahia, in 2013. The robberies recorded by the Department of Public Security of Bahia in 2013 were located and made compatible with the limits of the traffic analysis zones and, later, associated with the respective centroids. After determining the regions with the highest probability of robbery, we carried out a geographic analysis of the possible causes in the region with the highest robbery potential, considering the factors analyzed using a multicriteria analysis in a Geographic Information System environment. The execution of the two steps of this study allowed us to identify areas corresponding to the greater probability of occurrence of robberies in public transportation. In addition, the three most vulnerable road sections (Estrada da Liberdade, Rua Pero Vaz, and Avenida General San Martin) were identified in these areas. In these sections, the factors that most contribute with the potential for robbery in buses are: F1 - proximity to places that facilitate escape, F3 - great movement of persons, and F2 - absence of policing, respectively. Indicator Kriging (geostatistical estimation) can be used to construct a spatial probability surface, which can be a useful tool for the implementation of public policies. The multicriteria analysis in the Geographic Information System environment allowed us to understand the spatial factors related to the phenomenon under analysis.

  5. A GIS-based disaggregate spatial watershed analysis using RADAR data

    International Nuclear Information System (INIS)

    Al-Hamdan, M.

    2002-01-01

    Hydrology is the study of water in all its forms, origins, and destinations on the earth. This paper develops a novel modeling technique using a geographic information system (GIS) to facilitate watershed hydrological routing using RADAR data. The RADAR rainfall data, segmented to 4 km by 4 km blocks, divides the watershed into several sub basins which are modeled independently. A case study for the GIS-based disaggregate spatial watershed analysis using RADAR data is provided for South Fork Cowikee Creek near Batesville, Alabama. All the data necessary to complete the analysis is maintained in the ArcView GIS software. This paper concludes that the GIS-Based disaggregate spatial watershed analysis using RADAR data is a viable method to calculate hydrological routing for large watersheds. (author)

  6. Analysis of extent and spatial pattern change of mangrove ecosystem in Mangunharjo Sub-district from 2007 to 2017

    Science.gov (United States)

    Nugraha, S. B.; Sidiq, W. A. B. N.; Setyowati, D. L.; Martuti, N. K. T.

    2018-03-01

    This study aims to determine changes in the extent and spatial patterns of mangrove ecosystems in Mangunharjo Sub-district in 2007, 2012 and 2017. The main data source of this research is DigitalGlobe imagery of Mangunharjo Sub-district and the surrounding area. The extent and spatial pattern of the mangrove ecosystem were obtained from visual interpretation of the time-series imagery, with accuracy tested against field survey data, and the analysis was then conducted quantitatively and qualitatively. The time-series analysis shows an increase in mangrove forest area in Mangunharjo Sub-district from 2007 to 2017. In the first five years (2007-2012), the area of the mangrove ecosystem increased from 9.01 ha to 19.78 ha, and in the next five years (2012-2017) it increased significantly from 19.78 ha to 68.47 ha. In terms of spatial pattern, in 2007-2012 the mangrove ecosystems were distributed along the river borders and ponds, while in 2012-2017 they clustered to form a distinct area at the estuary. The increase in mangrove area in Mangunharjo Sub-district is the result of collaborative efforts among various parties, from government institutions to individuals and companies, which launched mangrove ecosystem recovery programs especially in the coastal area of Semarang City. A healthier mangrove ecosystem is expected to help restore the coastal area of Semarang City and prevent environmental damage due to abrasion, seawater intrusion, and tidal flooding.

  7. Quantitative Analysis of the Effect of Iterative Reconstruction Using a Phantom: Determining the Appropriate Blending Percentage

    Science.gov (United States)

    Kim, Hyun Gi; Lee, Young Han; Choi, Jin-Young; Park, Mi-Suk; Kim, Myeong-Jin; Kim, Ki Whang

    2015-01-01

    Purpose To investigate the optimal blending percentage of adaptive statistical iterative reconstruction (ASIR) in a reduced radiation dose while preserving a degree of image quality and texture that is similar to that of standard-dose computed tomography (CT). Materials and Methods The CT performance phantom was scanned with standard and dose reduction protocols including reduced mAs or kVp. Image quality parameters including noise, spatial, and low-contrast resolution, as well as image texture, were quantitatively evaluated after applying various blending percentages of ASIR. The optimal blending percentage of ASIR that preserved image quality and texture compared to standard dose CT was investigated in each radiation dose reduction protocol. Results As the percentage of ASIR increased, noise and spatial-resolution decreased, whereas low-contrast resolution increased. In the texture analysis, an increasing percentage of ASIR resulted in an increase of angular second moment, inverse difference moment, and correlation and in a decrease of contrast and entropy. The 20% and 40% dose reduction protocols with 20% and 40% ASIR blending, respectively, resulted in an optimal quality of images with preservation of the image texture. Conclusion Blending the 40% ASIR to the 40% reduced tube-current product can maximize radiation dose reduction and preserve adequate image quality and texture. PMID:25510772
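    The texture measures named above (angular second moment, inverse difference moment/homogeneity, correlation, contrast, entropy) are grey-level co-occurrence matrix (GLCM) features. The sketch below computes them with scikit-image on a placeholder ROI; it is not the authors' analysis code, and older scikit-image releases spell the functions greycomatrix/greycoprops.

```python
# GLCM texture features of a CT region of interest.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

roi = (np.random.rand(64, 64) * 255).astype(np.uint8)      # placeholder phantom ROI
glcm = graycomatrix(roi, distances=[1], angles=[0], levels=256,
                    symmetric=True, normed=True)

features = {
    "ASM": graycoprops(glcm, "ASM")[0, 0],                  # angular second moment
    "homogeneity": graycoprops(glcm, "homogeneity")[0, 0],  # inverse difference moment
    "correlation": graycoprops(glcm, "correlation")[0, 0],
    "contrast": graycoprops(glcm, "contrast")[0, 0],
}
p = glcm[:, :, 0, 0]
features["entropy"] = -np.sum(p[p > 0] * np.log2(p[p > 0]))  # GLCM entropy
print(features)
```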

  8. Spatial GHG Inventory: Analysis of Uncertainty Sources. A Case Study for Ukraine

    International Nuclear Information System (INIS)

    Bun, R.; Gusti, M.; Kujii, L.; Tokar, O.; Tsybrivskyy, Y.; Bun, A.

    2007-01-01

    A geoinformation technology for creating spatially distributed greenhouse gas inventories based on a methodology provided by the Intergovernmental Panel on Climate Change and special software linking input data, inventory models, and a means for visualization are proposed. This technology opens up new possibilities for qualitative and quantitative spatially distributed presentations of inventory uncertainty at the regional level. Problems concerning uncertainty and verification of the distributed inventory are discussed. A Monte Carlo analysis of uncertainties in the energy sector at the regional level is performed, and a number of simulations concerning the effectiveness of uncertainty reduction in some regions are carried out. Uncertainties in activity data have a considerable influence on overall inventory uncertainty, for example, the inventory uncertainty in the energy sector declines from 3.2 to 2.0% when the uncertainty of energy-related statistical data on fuels combusted in the energy industries declines from 10 to 5%. Within the energy sector, the 'energy industries' subsector has the greatest impact on inventory uncertainty. The relative uncertainty in the energy sector inventory can be reduced from 2.19 to 1.47% if the uncertainty of specific statistical data on fuel consumption decreases from 10 to 5%. The 'energy industries' subsector has the greatest influence in the Donetsk oblast. Reducing the uncertainty of statistical data on electricity generation in just three regions - the Donetsk, Dnipropetrovsk, and Luhansk oblasts - from 7.5 to 4.0% results in a decline from 2.6 to 1.6% in the uncertainty in the national energy sector inventory
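    A toy Monte Carlo sketch of the sensitivity described above: the relative uncertainty of a sector's emissions E = A x EF shrinks as the uncertainty of the activity data A shrinks. All numbers are illustrative, not the Ukrainian inventory values, and the 95% half-interval is taken as the relative uncertainty.

```python
# Monte Carlo propagation of activity-data and emission-factor uncertainty.
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

def relative_uncertainty(activity_rel_unc):
    A = rng.normal(1.0, activity_rel_unc / 2, N)     # normalised activity data (95% ~ 2 sigma)
    EF = rng.normal(1.0, 0.03 / 2, N)                # emission factor, assumed +/-3%
    E = A * EF
    half_ci = (np.percentile(E, 97.5) - np.percentile(E, 2.5)) / 2
    return 100 * half_ci / E.mean()

for u in (0.10, 0.05):                               # +/-10% vs +/-5% activity data
    print(f"activity +/-{u:.0%} -> emissions +/-{relative_uncertainty(u):.1f}%")
```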

  9. Quantitative Susceptibility Mapping of Human Brain Reflects Spatial Variation in Tissue Composition

    Science.gov (United States)

    Li, Wei; Wu, Bing; Liu, Chunlei

    2011-01-01

    Image phase from gradient echo MRI provides a unique contrast that reflects brain tissue composition variations, such as iron and myelin distribution. Phase imaging is emerging as a powerful tool for the investigation of functional brain anatomy and disease diagnosis. However, the quantitative value of phase is compromised by its nonlocal and orientation dependent properties. There is an increasing need for reliable quantification of magnetic susceptibility, the intrinsic property of tissue. In this study, we developed a novel and accurate susceptibility mapping method that is also phase-wrap insensitive. The proposed susceptibility mapping method utilized two complementary equations: (1) the Fourier relationship of phase and magnetic susceptibility; and (2) the first-order partial derivative of the first equation in the spatial frequency domain. In numerical simulation, this method reconstructed the susceptibility map almost free of streaking artifact. Further, the iterative implementation of this method allowed for high quality reconstruction of susceptibility maps of human brain in vivo. The reconstructed susceptibility map provided excellent contrast of iron-rich deep nuclei and white matter bundles from surrounding tissues. Further, it also revealed anisotropic magnetic susceptibility in brain white matter. Hence, the proposed susceptibility mapping method may provide a powerful tool for the study of brain physiology and pathophysiology. Further elucidation of anisotropic magnetic susceptibility in vivo may allow us to gain more insight into the white matter microarchitectures. PMID:21224002
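    For reference, the Fourier-domain relation underlying the first of the two equations above is the standard dipole-kernel expression (constants and sign conventions vary between papers); the method additionally uses the first spatial-frequency derivative of this relation, which helps stabilise the inversion near the conical zeros of the kernel.

```latex
% Phi: Fourier transform of the unwrapped tissue phase, chi: Fourier transform
% of the susceptibility distribution, gamma: gyromagnetic ratio, TE: echo time,
% k_z: spatial-frequency component along B_0.
\begin{equation*}
\frac{\Phi(\mathbf{k})}{\gamma\, B_{0}\,\mathrm{TE}}
  = \left(\frac{1}{3}-\frac{k_{z}^{2}}{|\mathbf{k}|^{2}}\right)\chi(\mathbf{k})
\end{equation*}
```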

  10. Quantitative pulsed eddy current analysis

    International Nuclear Information System (INIS)

    Morris, R.A.

    1975-01-01

    The potential of pulsed eddy current testing for furnishing more information than conventional single-frequency eddy current methods has been known for some time. However, a fundamental problem has been analyzing the pulse shape with sufficient precision to produce accurate quantitative results. Accordingly, the primary goal of this investigation was to: demonstrate ways of digitizing the short pulses encountered in PEC testing, and to develop empirical analysis techniques that would predict some of the parameters (e.g., depth) of simple types of defect. This report describes a digitizing technique using a computer and either a conventional nuclear ADC or a fast transient analyzer; the computer software used to collect and analyze pulses; and some of the results obtained. (U.S.)

  11. Analysis Of Influence Of Spatial Planning On Performance Of Regional Development At Waropen District. Papua Indonesia

    Directory of Open Access Journals (Sweden)

    Suwandi

    2015-08-01

    Full Text Available The various problems in regional spatial planning in Waropen District, Papua, show that the Spatial Plan (RTRW) of Waropen District drafted in 2010 has not contributed positively to the settlement of spatial planning problems. This is most likely caused by inconsistency in the spatial planning. This study examined the consistency of spatial planning as well as its relation to regional development performance. The method used to assess the consistency of the preparation of the Spatial Plan (RTRW) is comparative table analysis followed by verbal logic analysis. To determine whether the preparation of the Spatial Plan (RTRW) has paid attention to synergy with the surrounding regions (inter-regional context), a map overlay was conducted, followed by verbal logic analysis. To determine regional development performance, a Principal Components Analysis (PCA) was carried out. The results showed that inconsistencies in the spatial planning had caused a variety of problems that resulted in decreased regional development performance. The main problems that should receive more attention are infrastructure development, economic growth, transportation aspects, and new properties.

  12. Quantitative surface analysis using deuteron-induced nuclear reactions

    International Nuclear Information System (INIS)

    Afarideh, Hossein

    1991-01-01

    The nuclear reaction analysis (NRA) technique consists of looking at the energies of the reaction products, which uniquely identify the particular elements present in the sample, and analysing the yield/energy distribution to reveal depth profiles. A summary of the basic features of the nuclear reaction analysis technique is given; in particular, emphasis is placed on quantitative light element determination using (d,p) and (d,alpha) reactions. The experimental apparatus is also described. Finally, a set of (d,p) spectra for the elements Z=3 to Z=17 using 2 MeV incident deuterons is included, together with examples of further applications of the (d,alpha) spectra. (author)

  13. Spatial Analysis of Accident Spots Using Weighted Severity Index ...

    African Journals Online (AJOL)

    ADOWIE PERE

    Spatial Analysis of Accident Spots Using Weighted Severity Index (WSI) and ... pedestrians avoiding the use of pedestrian bridges/aid even when they are available. ..... not minding an unforeseen obstruction, miscalculations and wrong break.

  14. Evaluation of shear wave elastography for differential diagnosis of breast lesions: A new qualitative analysis versus conventional quantitative analysis.

    Science.gov (United States)

    Ren, Wei-Wei; Li, Xiao-Long; Wang, Dan; Liu, Bo-Ji; Zhao, Chong-Ke; Xu, Hui-Xiong

    2018-04-13

    To evaluate a special kind of ultrasound (US) shear wave elastography for differential diagnosis of breast lesions, using a new qualitative analysis (i.e. the elasticity score in the travel time map) compared with conventional quantitative analysis. From June 2014 to July 2015, 266 pathologically proven breast lesions were enrolled in this study. The maximum, mean, median, minimum, and standard deviation of shear wave speed (SWS) values (m/s) were assessed. The elasticity score, a new qualitative feature, was evaluated in the travel time map. The area under the receiver operating characteristic (AUROC) curves were plotted to evaluate the diagnostic performance of both qualitative and quantitative analyses for differentiation of breast lesions. Among all quantitative parameters, SWS-max showed the highest AUROC (0.805; 95% CI: 0.752, 0.851) compared with SWS-mean (0.786; 95% CI:0.732, 0.834; P = 0.094), SWS-median (0.775; 95% CI:0.720, 0.824; P = 0.046), SWS-min (0.675; 95% CI:0.615, 0.731; P = 0.000), and SWS-SD (0.768; 95% CI:0.712, 0.817; P = 0.074). The AUROC of qualitative analysis in this study obtained the best diagnostic performance (0.871; 95% CI: 0.825, 0.909, compared with the best parameter of SWS-max in quantitative analysis, P = 0.011). The new qualitative analysis of shear wave travel time showed the superior diagnostic performance in the differentiation of breast lesions in comparison with conventional quantitative analysis.

  15. A book review of Spatial data analysis in ecology and agriculture using R

    Science.gov (United States)

    Spatial Data Analysis in Ecology and Agriculture Using R is a valuable resource to assist agricultural and ecological researchers with spatial data analyses using the R statistical software (www.r-project.org). Special emphasis is on spatial data sets; however, the text also provides ample guidance ...

  16. Spatial-temporal data model and fractal analysis of transportation network in GIS environment

    Science.gov (United States)

    Feng, Yongjiu; Tong, Xiaohua; Li, Yangdong

    2008-10-01

    How to organize transportation data characterized as multi-time, multi-scale, multi-resolution and multi-source is one of the fundamental problems of GIS-T development. A spatial-temporal data model for GIS-T is proposed based on the Spatial-Temporal Object Model. Transportation network data are systematically managed using dynamic segmentation technologies, and a spatial-temporal database is built to integrally store multi-temporal geographical data for transportation. Based on the spatial-temporal database, the spatial analysis functions of GIS-T are substantially extended. A fractal module is developed to improve the analysis of the intensity, density, structure and connectivity of the transportation network, based on the validation and evaluation of topologic relations. Integrating fractal analysis with GIS-T strengthens the spatial analysis functions and enriches the approaches to data mining and knowledge discovery for transportation networks. Finally, the feasibility of the model and methods is tested through the Guangdong Geographical Information Platform for Highway Project.
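    A common fractal measure for such networks is the box-counting dimension; the sketch below estimates it for a rasterised road network. The raster itself is a placeholder, not data from the Guangdong platform.

```python
# Box-counting fractal dimension of a binary network raster.
import numpy as np

def box_counting_dimension(binary_img, box_sizes=(2, 4, 8, 16, 32, 64)):
    counts = []
    for s in box_sizes:
        h, w = binary_img.shape
        trimmed = binary_img[:h - h % s, :w - w % s]     # crop so boxes tile exactly
        th, tw = trimmed.shape
        blocks = trimmed.reshape(th // s, s, tw // s, s)
        counts.append(blocks.any(axis=(1, 3)).sum())     # boxes touched by the network
    # slope of log(count) versus log(1/size) is the box-counting dimension
    slope, _ = np.polyfit(np.log(1.0 / np.array(box_sizes)), np.log(counts), 1)
    return slope

network = np.zeros((256, 256), dtype=bool)
network[128, :] = True                                   # toy "road": a single straight line
print(box_counting_dimension(network))                   # close to 1 for a line
```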

  17. Comparative analysis of time efficiency and spatial resolution between different EIT reconstruction algorithms

    International Nuclear Information System (INIS)

    Kacarska, Marija; Loskovska, Suzana

    2002-01-01

    In this paper, a comparative analysis of different EIT algorithms is presented. The analysis covers the spatial and temporal resolution of the images obtained by several different algorithms. The discussion also considers the dependence of spatial resolution on the data acquisition method. The results show that conventional applied-current EIT is more powerful than induced-current EIT. (Author)

  18. MCM - 2 and Ki - 67 as proliferation markers in renal cell carcinoma: A quantitative and semi - quantitative analysis.

    Science.gov (United States)

    Mehdi, Muhammad Zain; Nagi, Abdul Hanan; Naseem, Nadia

    2016-01-01

    Fuhrman nuclear grade is the most important histological parameter for predicting prognosis in a patient with renal cell carcinoma (RCC). However, it suffers from inter-observer and intra-observer variation, giving rise to the need for a parameter that not only correlates with nuclear grade but is also objective and reproducible. Proliferation is a measure of the aggressiveness of a tumour and is strongly correlated with Fuhrman nuclear grade, clinical survival and recurrence in RCC. Ki-67 is conventionally used to assess proliferation. Mini-chromosome maintenance 2 (MCM-2) is a lesser known marker of proliferation that identifies a greater proliferating fraction. This study was designed to assess the prognostic significance of MCM-2 by comparing it with Fuhrman nuclear grade and Ki-67. n=50 cases of various ages, stages, histological subtypes and grades of RCC were selected for this study. Immunohistochemical staining using Ki-67 (MIB-1, mouse monoclonal antibody, Dako) and MCM-2 (mouse monoclonal antibody, Thermo) was performed on the paraffin-embedded blocks in the Department of Morbid Anatomy and Histopathology, University of Health Sciences, Lahore. Labeling indices (LI) were determined by two pathologists independently using quantitative and semi-quantitative analysis. Statistical analysis was carried out using SPSS 20.0. The Kruskal-Wallis test was used to determine the correlation of the proliferation markers with grade, and Pearson's correlation was used to determine the correlation between the two proliferation markers. The labeling index of MCM-2 (median=24.29%) was found to be much higher than that of Ki-67 (median=13.05%). Both markers were significantly related with grade (p=0.00; Kruskal-Wallis test). The LI of MCM-2 was found to correlate significantly with the LI of Ki-67 (r=0.0934; p=0.01, Pearson's correlation). Results of the semi-quantitative analysis correlated well with the quantitative analysis. Both Ki-67 and MCM-2 are markers of proliferation which are closely linked to grade. Therefore, they

  19. Quantitative Myocardial Perfusion Imaging Versus Visual Analysis in Diagnosing Myocardial Ischemia: A CE-MARC Substudy.

    Science.gov (United States)

    Biglands, John D; Ibraheem, Montasir; Magee, Derek R; Radjenovic, Aleksandra; Plein, Sven; Greenwood, John P

    2018-05-01

    This study sought to compare the diagnostic accuracy of visual and quantitative analyses of myocardial perfusion cardiovascular magnetic resonance against a reference standard of quantitative coronary angiography. Visual analysis of perfusion cardiovascular magnetic resonance studies for assessing myocardial perfusion has been shown to have high diagnostic accuracy for coronary artery disease. However, only a few small studies have assessed the diagnostic accuracy of quantitative myocardial perfusion. This retrospective study included 128 patients randomly selected from the CE-MARC (Clinical Evaluation of Magnetic Resonance Imaging in Coronary Heart Disease) study population such that the distribution of risk factors and disease status was proportionate to the full population. Visual analysis results of cardiovascular magnetic resonance perfusion images, by consensus of 2 expert readers, were taken from the original study reports. Quantitative myocardial blood flow estimates were obtained using Fermi-constrained deconvolution. The reference standard for myocardial ischemia was a quantitative coronary x-ray angiogram stenosis severity of ≥70% diameter in any coronary artery of >2 mm diameter, or ≥50% in the left main stem. Diagnostic performance was calculated using receiver-operating characteristic curve analysis. The area under the curve for visual analysis was 0.88 (95% confidence interval: 0.81 to 0.95) with a sensitivity of 81.0% (95% confidence interval: 69.1% to 92.8%) and specificity of 86.0% (95% confidence interval: 78.7% to 93.4%). For quantitative stress myocardial blood flow the area under the curve was 0.89 (95% confidence interval: 0.83 to 0.96) with a sensitivity of 87.5% (95% confidence interval: 77.3% to 97.7%) and specificity of 84.5% (95% confidence interval: 76.8% to 92.3%). There was no statistically significant difference between the diagnostic performance of quantitative and visual analyses (p = 0.72). Incorporating rest myocardial

  20. A critical appraisal of semi-quantitative analysis of 2-deoxyglucose autoradiograms

    International Nuclear Information System (INIS)

    Kelly, P.T.; McCulloch, J.

    1983-01-01

    Semi-quantitative analysis (e.g. optical density ratios) of [14C]2-deoxyglucose autoradiograms is widely used in neuroscience research. The authors demonstrate that a fixed ratio of 14C concentrations in the CNS does not yield a constant optical density ratio but is dependent upon the exposure time in the preparation of the autoradiograms and the absolute amounts of 14C from which the concentration ratio is derived. The failure of a fixed glucose utilization ratio to result in a constant optical density ratio represents a major interpretative difficulty in investigations where only semi-quantitative analysis of [14C]2-deoxyglucose autoradiograms is undertaken. (Auth.)

  1. Quantitative analysis of culture using millions of digitized books.

    Science.gov (United States)

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva Presser; Veres, Adrian; Gray, Matthew K; Pickett, Joseph P; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A; Aiden, Erez Lieberman

    2011-01-14

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of 'culturomics,' focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. Culturomics extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities.

  2. Quantitative analysis of culture using millions of digitized books

    Science.gov (United States)

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva P.; Veres, Adrian; Gray, Matthew K.; Pickett, Joseph P.; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A.; Aiden, Erez Lieberman

    2011-01-01

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of ‘culturomics’, focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. ‘Culturomics’ extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities. PMID:21163965

  3. Quantitative mass spectrometric analysis of glycoproteins combined with enrichment methods.

    Science.gov (United States)

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive and high-throughput analysis of the enriched glycoproteome, for quantitative assays as well as qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of a certain disease, the development of efficient analysis tools for aberrant glycoproteins is very important for a deep understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes protein glycosylation-targeting enrichment technologies, mainly employing solid-phase extraction methods such as hydrazide capturing, lectin-specific capturing, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with these enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recently published studies. © 2014 The Authors. Mass Spectrometry Reviews Published by Wiley Periodicals, Inc.

  4. Quantitative analysis of macro-ARG using IP system

    International Nuclear Information System (INIS)

    Nakajima, Eiichi; Kawai, Kenji; Furuta, Yoshitake

    1997-01-01

    Recent progress in imaging plate (IP) systems allows us to analyze autoradiographic images quantitatively. In 'whole-body autoradiography', a method that clarifies the distribution of a radioisotope or labeled compound in the tissues and organs of a freeze-dried whole-body section of small animals such as rats and mice, the sections are pressed against an IP for exposure; the IP is then scanned by a Bio-Imaging Analyzer (Fuji Photo Film Co., Ltd.) and a digital autoradiographic image is obtained. Quantitative data concerning the activity in different tissues can be obtained using an isotope scale as a reference source. The fading effect, application of the IP system to the distribution of receptor-binding ARG, analysis of radio-spots on TLC, and radioactive concentration in liquids such as blood are also discussed. (author)

  5. Potential Application of Quantitative Prostate-specific Antigen Analysis in Forensic Examination of Seminal Stains

    Directory of Open Access Journals (Sweden)

    Zhenping Liu

    2015-01-01

    Full Text Available The aims of this study are to use quantitative analysis of prostate-specific antigen (PSA) in seminal stain examination and to explore the practical value of this analysis in forensic science. For a comprehensive analysis, vaginal swabs from 48 rape cases were tested both by a PSA fluorescence analyzer (i-CHROMA Reader) and by a conventional PSA strip test. To confirm the results of these PSA tests, seminal DNA was tested following differential extraction. Compared to the PSA strip test, the PSA rapid quantitative fluorescence analyzer provided more accurate and sensitive results. More importantly, individualized schemes based on quantitative PSA results can be developed to improve quality and procedural efficiency in the forensic inspection of seminal samples prior to DNA analysis.

  6. On the Need for Quantitative Bias Analysis in the Peer-Review Process.

    Science.gov (United States)

    Fox, Matthew P; Lash, Timothy L

    2017-05-15

    Peer review is central to the process through which epidemiologists generate evidence to inform public health and medical interventions. Reviewers thereby act as critical gatekeepers to high-quality research. They are asked to carefully consider the validity of the proposed work or research findings by paying careful attention to the methodology and critiquing the importance of the insight gained. However, although many have noted problems with the peer-review system for both manuscripts and grant submissions, few solutions have been proposed to improve the process. Quantitative bias analysis encompasses all methods used to quantify the impact of systematic error on estimates of effect in epidemiologic research. Reviewers who insist that quantitative bias analysis be incorporated into the design, conduct, presentation, and interpretation of epidemiologic research could substantially strengthen the process. In the present commentary, we demonstrate how quantitative bias analysis can be used by investigators and authors, reviewers, funding agencies, and editors. By utilizing quantitative bias analysis in the peer-review process, editors can potentially avoid unnecessary rejections, identify key areas for improvement, and improve discussion sections by shifting from speculation on the impact of sources of error to quantification of the impact those sources of bias may have had. © The Author 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  7. Immune adherence: a quantitative and kinetic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sekine, T [National Cancer Center, Tokyo (Japan). Research Inst.

    1978-09-01

    Quantitative and kinetic analysis of the immune-adherence reaction (IA) between C3b fragments and IA receptors as an agglutination reaction is difficult. Analysis is possible, however, by use of radio-iodinated bovine serum albumin as antigen at low concentrations (less than 200 ng/ml) and optimal concentration of antibody to avoid precipitation of antigen-antibody complexes with human erythrocytes without participation of complement. Antigen and antibody are reacted at 37 °C, complement is added, the mixture incubated and human erythrocytes added; after further incubation, ice-cold EDTA containing buffer is added and the erythrocytes centrifuged and assayed for radioactivity. Control cells reacted with heated guinea pig serum retained less than 5% of the added radioactivity. The method facilitates measurement of IA reactivity and permits more detailed analysis of the mechanism underlying the reaction.

  8. Tip-Enhanced Raman Voltammetry: Coverage Dependence and Quantitative Modeling.

    Science.gov (United States)

    Mattei, Michael; Kang, Gyeongwon; Goubert, Guillaume; Chulhai, Dhabih V; Schatz, George C; Jensen, Lasse; Van Duyne, Richard P

    2017-01-11

    Electrochemical atomic force microscopy tip-enhanced Raman spectroscopy (EC-AFM-TERS) was employed for the first time to observe nanoscale spatial variations in the formal potential, E0', of a surface-bound redox couple. TERS cyclic voltammograms (TERS CVs) of single Nile Blue (NB) molecules were acquired at different locations spaced 5-10 nm apart on an indium tin oxide (ITO) electrode. Analysis of TERS CVs at different coverages was used to verify the observation of single-molecule electrochemistry. The resulting TERS CVs were fit to the Laviron model for surface-bound electroactive species to quantitatively extract the formal potential E0' at each spatial location. Histograms of single-molecule E0' at each coverage indicate that the electrochemical behavior of the cationic oxidized species is less sensitive to local environment than the neutral reduced species. This information is not accessible using purely electrochemical methods or ensemble spectroelectrochemical measurements. We anticipate that quantitative modeling and measurement of site-specific electrochemistry with EC-AFM-TERS will have a profound impact on our understanding of the role of nanoscale electrode heterogeneity in applications such as electrocatalysis, biological electron transfer, and energy production and storage.
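
    As an illustrative sketch only (not the authors' code): if the surface-bound couple is assumed to follow a simple Nernstian sigmoid, a formal potential E0' can be extracted from a voltammetric response by nonlinear least squares. The potentials, signal values and the single-electron assumption below are all hypothetical.

        import numpy as np
        from scipy.optimize import curve_fit

        F, R, T = 96485.0, 8.314, 298.0      # Faraday constant, gas constant, temperature (K)

        def nernst_sigmoid(E, E0, n, amp):
            """Fraction-oxidized response of a surface-bound redox couple vs. potential."""
            return amp / (1.0 + np.exp(-n * F * (E - E0) / (R * T)))

        # Hypothetical TERS intensity vs. applied potential (V) for one tip location
        E = np.linspace(-0.6, 0.0, 25)
        signal = nernst_sigmoid(E, -0.30, 1.0, 1.0) + 0.02 * np.random.default_rng(0).normal(size=E.size)

        (E0_fit, n_fit, amp_fit), _ = curve_fit(nernst_sigmoid, E, signal, p0=(-0.25, 1.0, 1.0))
        print(f"Fitted formal potential E0' = {E0_fit:.3f} V")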

  9. CMEIAS-Aided Microscopy of the Spatial Ecology of Individual Bacterial Interactions Involving Cell-to-Cell Communication within Biofilms

    Directory of Open Access Journals (Sweden)

    Frank B. Dazzo

    2012-05-01

    Full Text Available This paper describes how the quantitative analytical tools of CMEIAS image analysis software can be used to investigate in situ microbial interactions involving cell-to-cell communication within biofilms. Various spatial pattern analyses applied to the data extracted from the 2-dimensional coordinate positioning of individual bacterial cells at single-cell resolution indicate that microbial colonization within natural biofilms is not a spatially random process, but rather involves strong positive interactions between communicating cells that influence their neighbors’ aggregated colonization behavior. Geostatistical analysis of the data provides statistically defensible estimates of the micrometer scale and interpolation maps of the spatial heterogeneity and local intensity at which these microbial interactions autocorrelate with their spatial patterns of distribution. Including in situ image analysis in cell communication studies fills an important gap in understanding the spatially dependent microbial ecophysiology that governs the intensity of biofilm colonization and its unique architecture.
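
    The kind of single-cell spatial pattern analysis described here can be illustrated with a Clark-Evans nearest-neighbour index computed from 2-D cell coordinates. This is a generic sketch (CMEIAS itself provides many more statistics), and the coordinates and field-of-view size are random placeholders.

        import numpy as np
        from scipy.spatial import cKDTree

        def clark_evans_ratio(points, area):
            """Clark-Evans aggregation index for a 2-D point pattern in a region of known area.
            R < 1 suggests clustering, R close to 1 randomness, R > 1 regular spacing."""
            pts = np.asarray(points, dtype=float)
            tree = cKDTree(pts)
            d, _ = tree.query(pts, k=2)                  # k=2: first hit is the point itself
            observed = d[:, 1].mean()                    # mean nearest-neighbour distance
            expected = 0.5 / np.sqrt(len(pts) / area)    # expectation under complete spatial randomness
            return observed / expected

        # Hypothetical cell centroids (micrometres) in a 100 x 100 um field of view
        rng = np.random.default_rng(3)
        cells = rng.uniform(0, 100, size=(200, 2))
        print(f"Clark-Evans R = {clark_evans_ratio(cells, area=100 * 100):.2f}")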

  10. Quantitative Determination of Aluminum in Deodorant Brands: A Guided Inquiry Learning Experience in Quantitative Analysis Laboratory

    Science.gov (United States)

    Sedwick, Victoria; Leal, Anne; Turner, Dea; Kanu, A. Bakarr

    2018-01-01

    The monitoring of metals in commercial products is essential for protecting public health against the hazards of metal toxicity. This article presents a guided inquiry (GI) experimental lab approach in a quantitative analysis lab class that enabled students to determine the levels of aluminum in deodorant brands. The utility of a GI experimental…

  11. Spatial analysis of Schistosomiasis in Hubei Province, China: a GIS-based analysis of Schistosomiasis from 2009 to 2013.

    Directory of Open Access Journals (Sweden)

    Yan-Yan Chen

    Full Text Available Schistosomiasis remains a major public health problem in China. The major endemic areas are located in the lake and marshland regions of southern China, particularly in areas along the middle and lower reaches of the Yangtze River. Spatial analytical techniques are often used in epidemiology to identify spatial clusters in disease regions. This study assesses the spatial distribution of schistosomiasis and explores high-risk regions in Hubei Province, China, to provide guidance on schistosomiasis control in marshland regions. In this study, spatial autocorrelation methodologies, including global Moran's I and local Getis-Ord statistics, were utilized to describe and map spatial clusters and areas where human Schistosoma japonicum infection is prevalent at the county level in Hubei Province. In addition, a linear logistic regression model was used to determine the characteristics of spatial autocorrelation over time. The infection rates of S. japonicum decreased from 2009 to 2013. The global autocorrelation analysis results on the infection rate of S. japonicum for the five years showed statistical significance (Moran's I > 0, P < 0.01), which suggested that spatial clusters were present in the distribution of S. japonicum infection from 2009 to 2013. Local autocorrelation analysis results showed that the number of highly aggregated areas ranged from eight to eleven within the five-year analysis period. The highly aggregated areas were mainly distributed in eight counties. The spatial distribution of human S. japonicum infections did not exhibit a temporal change at the county level in Hubei Province. The risk factors that influence human S. japonicum transmission may not have changed after achieving the national criterion of infection control. The findings indicated that spatial-temporal surveillance of S. japonicum transmission plays a significant role in schistosomiasis control. Timely and integrated prevention should be continued, especially in the Yangtze
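
    As an illustration of the global spatial autocorrelation statistic used here, the sketch below (Python, not from the study) computes Moran's I from a vector of county-level infection rates and a contiguity-based spatial weights matrix; both the weights and the rates are hypothetical.

        import numpy as np

        def morans_i(x, W):
            """Global Moran's I for values x and an n x n spatial weights matrix W."""
            x = np.asarray(x, dtype=float)
            n = x.size
            z = x - x.mean()                   # deviations from the mean
            s0 = W.sum()                       # sum of all weights
            return (n * (z @ W @ z)) / (s0 * (z @ z))

        # Hypothetical example: 5 counties with rook-contiguity weights
        W = np.array([[0, 1, 0, 0, 1],
                      [1, 0, 1, 0, 0],
                      [0, 1, 0, 1, 0],
                      [0, 0, 1, 0, 1],
                      [1, 0, 0, 1, 0]], dtype=float)
        rates = [0.8, 0.9, 0.2, 0.1, 0.7]      # hypothetical infection rates (%)
        print(morans_i(rates, W))              # values > 0 suggest positive spatial clustering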

  12. Analysis of Regional Unemployment in Russia and Germany: Spatial-Econometric Approach

    Directory of Open Access Journals (Sweden)

    Elena Vyacheslavovna Semerikova

    2015-06-01

    Full Text Available The study was supported by the Government of the Russian Federation, grant No. 11.G34.31.0059. This paper analyzes regional unemployment in Russia and Germany in 2005-2010 and addresses the issue of choosing the right specification of spatial-econometric models. The analysis, based on data for 75 Russian and 370 German regions, showed that for Germany the choice of the spatial weighting matrix has a more significant influence on the parameter estimates than for Russia, presumably because of stronger linkages between regional labor markets in Germany compared to Russia. The authors also propose an algorithm for choosing between spatial matrices and demonstrate its application on simulated Russian data. They found that (1) the deviation of the results from the true ones increases when the spatial dependence between regions is higher, and (2) the matrix of inverse distances is preferable to the boundary matrix for the analysis of regional unemployment in Russia (because of its lower mean squared error). The authors also plan to apply the proposed algorithm to simulated data for Germany. These results allow spatial dependence to be accounted for more correctly when modeling regional unemployment, which is important for making proper regional policy.
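
    To make the choice between weighting matrices concrete, here is a minimal Python sketch (not the authors' code) that builds the two kinds of spatial weights compared in the paper: a row-standardized inverse-distance matrix and a row-standardized boundary (contiguity) matrix. The region centroids and adjacency pattern are hypothetical.

        import numpy as np

        def inverse_distance_weights(coords):
            """Row-standardized inverse-distance spatial weights from region centroids."""
            coords = np.asarray(coords, dtype=float)
            d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
            with np.errstate(divide="ignore"):
                w = 1.0 / d
            np.fill_diagonal(w, 0.0)           # a region is not its own neighbour
            return w / w.sum(axis=1, keepdims=True)

        def row_standardize(boundary):
            """Row-standardize a 0/1 boundary (contiguity) matrix."""
            b = np.asarray(boundary, dtype=float)
            return b / b.sum(axis=1, keepdims=True)

        # Hypothetical centroids of four regions (x, y in km) and their adjacency
        coords = [(0, 0), (100, 0), (0, 120), (150, 130)]
        W_dist = inverse_distance_weights(coords)
        W_bound = row_standardize([[0, 1, 1, 0],
                                   [1, 0, 0, 1],
                                   [1, 0, 0, 1],
                                   [0, 1, 1, 0]])
        print(W_dist.round(2), W_bound, sep="\n")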

  13. Quantitative data analysis with SPSS release 8 for Windows a guide for social scientists

    CERN Document Server

    Bryman, Alan

    2002-01-01

    The latest edition of this best-selling introduction to Quantitative Data Analysis through the use of a computer package has been completely updated to accommodate the needs of users of SPSS Release 8 for Windows. Like its predecessor, it provides a non-technical approach to quantitative data analysis and a user-friendly introduction to the widely used SPSS for Windows. It assumes no previous familiarity with either statistics or computing but takes the reader step-by-step through the techniques, reinforced by exercises for further practice. Techniques explained in Quantitative Data Analysis with SPSS Release 8 for Windows include: * correlation * simple and multiple regression * multivariate analysis of variance and covariance * factor analysis The book also covers issues such as sampling, statistical significance, conceptualization and measurement and the selection of appropriate tests. For further information or to download the book's datasets, please visit the website: http://www.routledge.com/textbooks/...

  14. Extreme Precipitation Estimation with Typhoon Morakot Using Frequency and Spatial Analysis

    Directory of Open Access Journals (Sweden)

    Hone-Jay Chu

    2011-01-01

    Full Text Available Typhoon Morakot lashed Taiwan and produced copious amounts of precipitation in 2009. From the point of view of hydrological statistics, the impact of the precipitation from Typhoon Morakot can be analyzed and discussed using frequency analysis. The frequency curve, fitted mathematically to historically observed data, can be used to estimate the probability of exceedance for runoff events of a certain magnitude. This study integrates frequency analysis and spatial analysis to assess the effect of the Typhoon Morakot event on rainfall frequency in the Gaoping River basin of southern Taiwan. First, extreme rainfall data were collected at sixteen stations for durations of 1, 3, 6, 12, and 24 hours, and an appropriate probability distribution was selected to analyze the impact of the extreme hydrological event. Spatial rainfall patterns for a return period of 200 yr with 24-hr duration, with and without Typhoon Morakot, were then estimated. Results show that for long durations the rainfall amounts estimated by frequency analysis differ significantly with and without the event. Furthermore, spatial analysis shows that extreme rainfall for a return period of 200 yr is highly dependent on topography and is smaller in the southwest than in the east. The results not only demonstrate the distinct effect of Typhoon Morakot on frequency analysis, but could also provide a reference for future planning of hydrological engineering.
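
    For readers unfamiliar with the frequency-analysis step, the sketch below estimates a 200-yr, 24-hr rainfall return level from an annual-maximum series. The data are hypothetical and the Gumbel distribution is chosen only as a common example; the study itself selects an appropriate distribution from the station records.

        import numpy as np
        from scipy import stats

        # Hypothetical annual-maximum 24-hr rainfall series for one station (mm)
        annual_max = np.array([310, 250, 420, 180, 520, 610, 290, 340, 700, 380,
                               450, 270, 330, 560, 240, 400, 610, 350, 300, 480])

        # Fit a Gumbel (EV1) distribution, commonly used in rainfall frequency analysis
        loc, scale = stats.gumbel_r.fit(annual_max)

        # 200-yr return level = quantile with annual non-exceedance probability 1 - 1/200
        return_period = 200
        rainfall_200yr = stats.gumbel_r.ppf(1 - 1.0 / return_period, loc=loc, scale=scale)
        print(f"Estimated 200-yr, 24-hr rainfall: {rainfall_200yr:.0f} mm")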

  15. Quantitative analysis on electrooculography (EOG) for neurodegenerative disease

    Science.gov (United States)

    Liu, Chang-Chia; Chaovalitwongse, W. Art; Pardalos, Panos M.; Seref, Onur; Xanthopoulos, Petros; Sackellares, J. C.; Skidmore, Frank M.

    2007-11-01

    Many studies have documented abnormal horizontal and vertical eye movements in human neurodegenerative disease as well as during altered states of consciousness (including drowsiness and intoxication) in healthy adults. Eye movement measurement may therefore play an important role in measuring the progress of neurodegenerative diseases and the state of alertness in healthy individuals. Several techniques exist for measuring eye movement: the infrared detection technique (IR), video-oculography (VOG), the scleral eye coil, and EOG. Among these recording techniques, EOG is a major modality for monitoring abnormal eye movement. In this real-time quantitative analysis study, methods that capture the characteristics of eye movement are proposed to accurately categorize the state of neurodegenerative subjects. The EOG recordings were taken while 5 subjects watched a short (>120 s) animation clip. In response to the animated clip the participants executed a number of eye movements, including vertical smooth pursuit (SVP), horizontal smooth pursuit (HVP) and random saccades (RS). Detection of abnormalities in ocular movement may improve our diagnosis and understanding of neurodegenerative disease and altered states of consciousness. A standard real-time quantitative analysis will improve detection and provide a better understanding of the pathology of these disorders.

  16. Quantitative XPS analysis of high Tc superconductor surfaces

    International Nuclear Information System (INIS)

    Jablonski, A.; Sanada, N.; Suzuki, Y.; Fukuda, Y.; Nagoshi, M.

    1993-01-01

    The procedure of quantitative XPS analysis involving relative sensitivity factors is the most convenient to apply to high-Tc superconductor surfaces because it does not require standards. However, a considerable limitation of such an approach is its relatively low accuracy. In the present work, a modification of the relative sensitivity factor approach that accounts for matrix and instrumental effects is proposed. The accuracy of this modification when applied to binary metal alloys is 2% or better. A quantitative XPS analysis was made for surfaces of the compounds Bi2Sr2CuO6, Bi2Sr2CaCu2O8, and YBa2Cu3Oy. The surface composition determined for the polycrystalline samples corresponds reasonably well to the bulk stoichiometry. A slight deficiency of oxygen was found for the Bi-based compounds. The surface exposed on cleavage of the Bi2Sr2CaCu2O8 single crystal was found to be enriched with bismuth, which indicates that the cleavage occurs along the BiO planes. This result is in agreement with the STM studies published in the literature.
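
    As a minimal illustration of the relative sensitivity factor step mentioned here (without the matrix and instrumental corrections that the paper adds), atomic fractions follow from peak areas divided by sensitivity factors and then normalized. All peak areas and sensitivity factors below are hypothetical placeholders.

        def xps_atomic_fractions(intensities, sensitivity_factors):
            """Atomic fractions from peak areas using relative sensitivity factors."""
            corrected = {el: i / sensitivity_factors[el] for el, i in intensities.items()}
            total = sum(corrected.values())
            return {el: c / total for el, c in corrected.items()}

        # Hypothetical peak areas and sensitivity factors for a Bi-Sr-Ca-Cu-O surface
        areas = {"Bi4f": 9.5e4, "Sr3d": 2.1e4, "Ca2p": 1.2e4, "Cu2p": 6.8e4, "O1s": 4.0e4}
        rsf   = {"Bi4f": 9.14, "Sr3d": 1.48, "Ca2p": 1.58, "Cu2p": 4.80, "O1s": 0.66}
        for element, fraction in xps_atomic_fractions(areas, rsf).items():
            print(f"{element}: {100 * fraction:.1f} at.%")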

  17. New quantitative methods for mineral and porosity mapping in clayey materials: application to the compacted bentonites of engineered barriers

    International Nuclear Information System (INIS)

    Pret, D.

    2003-12-01

    Clayey materials are well known for their very low permeability and for the textural changes they undergo between the dry and hydrated states. Their porous network is classically investigated in the dry state using bulk measurements, but the relationship between porosity and mineral spatial heterogeneities in the hydrated state is poorly understood. The limits of textural analysis make it difficult to understand the migration of solute species into compacted bentonites (as used in nuclear waste repositories). The goal of this work is to improve analysis techniques for hydrated clayey materials in order to provide a multi-scale quantitative petrography. The bentonite samples are impregnated with a resin whose properties are close to those of water. The classical petrographic study reveals strong heterogeneities in the spatial and size distributions of porosity and minerals. SEM image analysis allows quantification and simple mapping of pores and minerals in unaltered bentonites. Nevertheless, as alterations are expected in the repository context, two methods for the analysis of all types of materials have also been developed. Two dedicated software tools permit the treatment of autoradiographs and of chemical element maps obtained with an electron microprobe. The results are quantitative maps highlighting spatial porosity heterogeneities from the decimetric down to the micrometric scale. All pore sizes are taken into account, including clay interlayer spaces. Moreover, an accurate mineral map is also supplied over millimetric areas with a spatial resolution close to one micrometer. From a wider point of view, this work provides new complementary tools for the textural analysis of fine-grained materials and for the improvement of migration modelling of solute species. (author)

  18. Image-based quantification and mathematical modeling of spatial heterogeneity in ESC colonies.

    Science.gov (United States)

    Herberg, Maria; Zerjatke, Thomas; de Back, Walter; Glauche, Ingmar; Roeder, Ingo

    2015-06-01

    Pluripotent embryonic stem cells (ESCs) have the potential to differentiate into cells of all three germ layers. This unique property has been extensively studied on the intracellular, transcriptional level. However, ESCs typically form clusters of cells with distinct size and shape, and establish spatial structures that are vital for the maintenance of pluripotency. Even though it is recognized that the cells' arrangement and local interactions play a role in fate decision processes, the relations between transcriptional and spatial patterns have not yet been studied. We present a systems biology approach which combines live-cell imaging, quantitative image analysis, and multiscale, mathematical modeling of ESC growth. In particular, we develop quantitative measures of the morphology and of the spatial clustering of ESCs with different expression levels and apply them to images of both in vitro and in silico cultures. Using the same measures, we are able to compare model scenarios with different assumptions on cell-cell adhesions and intercellular feedback mechanisms directly with experimental data. Applying our methodology to microscopy images of cultured ESCs, we demonstrate that the emerging colonies are highly variable regarding both morphological and spatial fluorescence patterns. Moreover, we can show that most ESC colonies contain only one cluster of cells with high self-renewing capacity. These cells are preferentially located in the interior of a colony structure. The integrated approach combining image analysis with mathematical modeling allows us to reveal potential transcription factor related cellular and intercellular mechanisms behind the emergence of observed patterns that cannot be derived from images directly. © 2015 International Society for Advancement of Cytometry.

  19. SCGICAR: Spatial concatenation based group ICA with reference for fMRI data analysis.

    Science.gov (United States)

    Shi, Yuhu; Zeng, Weiming; Wang, Nizhuan

    2017-09-01

    With the rapid development of big data, multi-subject functional magnetic resonance imaging (fMRI) data analysis is becoming more and more important. As a blind source separation technique, group independent component analysis (GICA) has been widely applied to multi-subject fMRI data analysis. However, spatially concatenated GICA is rarely used compared with temporally concatenated GICA because of its disadvantages. In this paper, to overcome these issues, and because the ability of GICA for fMRI data analysis can be improved by adding a priori information, we propose a novel spatial concatenation based GICA with reference (SCGICAR) method that takes advantage of prior information extracted from the group subjects; a multi-objective optimization strategy is then used to implement the method. Finally, the post-processing steps of principal component analysis and anti-reconstruction are used to obtain the group spatial component and the individual temporal component in the group, respectively. The experimental results show that the proposed SCGICAR method performs better on both single-subject and multi-subject fMRI data analysis than classical methods. It can not only detect more accurate spatial and temporal components for each subject of the group, but also obtain a better group component in both the temporal and spatial domains. These results demonstrate that the proposed SCGICAR method has its own advantages in comparison with classical methods and better reflects the commonness of subjects in the group. Copyright © 2017 Elsevier B.V. All rights reserved.
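
    For orientation, the sketch below shows plain spatially concatenated group ICA with scikit-learn, i.e., the baseline that SCGICAR builds on; it does not implement the reference signals, multi-objective optimization or anti-reconstruction steps of the proposed method, and the data are random placeholders.

        import numpy as np
        from sklearn.decomposition import FastICA

        # Placeholder data: 3 subjects, each with 120 time points and 500 voxels
        rng = np.random.default_rng(0)
        subjects = [rng.standard_normal((120, 500)) for _ in range(3)]

        # Spatial concatenation: stack subjects along the voxel axis, so the decomposition
        # shares time courses across the group while keeping subject-specific spatial maps
        X = np.concatenate(subjects, axis=1)               # shape (120, 1500)

        ica = FastICA(n_components=10, random_state=0, max_iter=1000)
        group_timecourses = ica.fit_transform(X)           # (120, 10) shared temporal components
        concat_maps = ica.mixing_                          # (1500, 10) concatenated spatial maps
        subject_maps = np.split(concat_maps, 3, axis=0)    # one (500, 10) map set per subject
        print(group_timecourses.shape, subject_maps[0].shape)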

  20. Quantitative Analysis of Complex Tropical Forest Stands: A Review ...

    African Journals Online (AJOL)

    The importance of data analysis in quantitative assessment of natural resources remains significant in the sustainable management of complex tropical forest resources. Analyses of data from complex tropical forest stands have not been easy or clear due to improper data management. It is pivotal to practical researches ...

  1. [A spatially explicit analysis of traffic accidents involving pedestrians and cyclists in Berlin].

    Science.gov (United States)

    Lakes, Tobia

    2017-12-01

    In many German cities and counties, sustainable mobility concepts that strengthen pedestrian and cyclist traffic are promoted. From the perspectives of urban development, traffic planning and public healthcare, a spatially differentiated analysis of traffic accident data is decisive. The objectives were: 1) the identification of spatial and temporal patterns of the distribution of accidents involving cyclists and pedestrians, 2) the identification of hotspots and exploration of possible underlying causes, and 3) the critical discussion of benefits and challenges of the results and the derivation of conclusions. Spatio-temporal distributions of data from accident statistics in Berlin involving pedestrians and cyclists from 2011 to 2015 were analysed with geographic information systems (GIS). While the total number of pedestrian and cyclist accidents remains relatively stable, the spatial distribution analysis shows that there are significant spatial clusters (hotspots) of traffic accidents with a strong concentration in the inner city area. In a critical discussion, the benefits of geographic concepts are identified, such as spatially explicit health data (in this case traffic accident data), the importance of integrating other data sources for evaluating the health impact of areas (traffic accident statistics of the police), and the possibilities and limitations of spatial-temporal data analysis (spatial point-density analyses) for deriving decision-supported recommendations and for evaluating policy measures of health prevention and of health-relevant urban development.

  2. Geo-Nested Analysis: Mixed-Methods Research with Spatially Dependent Data

    NARCIS (Netherlands)

    Harbers, I.; Ingram, M.C.

    Mixed-methods designs, especially those where cases selected for small-N analysis (SNA) are nested within a large-N analysis (LNA), have become increasingly popular. Yet, since the LNA in this approach assumes that units are independently distributed, such designs are unable to account for spatial

  3. Spatial data analysis and integration for regional-scale geothermal potential mapping, West Java, Indonesia

    Energy Technology Data Exchange (ETDEWEB)

    Carranza, Emmanuel John M.; Barritt, Sally D. [Department of Earth Systems Analysis, International Institute for Geo-information Science and Earth Observation (ITC), Enschede (Netherlands); Wibowo, Hendro; Sumintadireja, Prihadi [Laboratory of Volcanology and Geothermal, Geology Department, Institute of Technology Bandung (ITB), Bandung (Indonesia)

    2008-06-15

    Conceptual modeling and predictive mapping of potential for geothermal resources at the regional-scale in West Java are supported by analysis of the spatial distribution of geothermal prospects and thermal springs, and their spatial associations with geologic features derived from publicly available regional-scale spatial data sets. Fry analysis shows that geothermal occurrences have regional-scale spatial distributions that are related to Quaternary volcanic centers and shallow earthquake epicenters. Spatial frequency distribution analysis shows that geothermal occurrences have strong positive spatial associations with Quaternary volcanic centers, Quaternary volcanic rocks, quasi-gravity lows, and NE-, NNW-, WNW-trending faults. These geological features, with their strong positive spatial associations with geothermal occurrences, constitute spatial recognition criteria of regional-scale geothermal potential in a study area. Application of data-driven evidential belief functions in GIS-based predictive mapping of regional-scale geothermal potential resulted in delineation of high potential zones occupying 25% of West Java, which is a substantial reduction of the search area for further exploration of geothermal resources. The predicted high potential zones delineate about 53-58% of the training geothermal areas and 94% of the validated geothermal occurrences. The results of this study demonstrate the value of regional-scale geothermal potential mapping in: (a) data-poor situations, such as West Java, and (b) regions with geotectonic environments similar to the study area. (author)

  4. Quantitative, depth-resolved determination of particle motion using multi-exposure, spatial frequency domain laser speckle imaging.

    Science.gov (United States)

    Rice, Tyler B; Kwan, Elliott; Hayakawa, Carole K; Durkin, Anthony J; Choi, Bernard; Tromberg, Bruce J

    2013-01-01

    Laser Speckle Imaging (LSI) is a simple, noninvasive technique for rapid imaging of particle motion in scattering media such as biological tissue. LSI is generally used to derive a qualitative index of relative blood flow because of the unknown impact of several variables that affect speckle contrast. These variables may include optical absorption and scattering coefficients, multi-layer dynamics including static, non-ergodic regions, and systematic effects such as laser coherence length. In order to account for these effects and move toward quantitative, depth-resolved LSI, we have developed a method that combines Monte Carlo modeling, multi-exposure speckle imaging (MESI), spatial frequency domain imaging (SFDI), and careful instrument calibration. Monte Carlo models were used to generate total and layer-specific fractional momentum transfer distributions. This information was used to predict speckle contrast as a function of exposure time, spatial frequency, layer thickness, and layer dynamics. To verify with experimental data, controlled phantom experiments with characteristic tissue optical properties were performed using a structured light speckle imaging system. Three main geometries were explored: 1) a diffusive dynamic layer beneath a static layer, 2) a static layer beneath a diffusive dynamic layer, and 3) directed flow (a tube) submerged in a dynamic scattering layer. Data fits were performed using the Monte Carlo model, which accurately reconstructed the type of particle flow (diffusive or directed) in each layer, the layer thickness, and absolute flow speeds to within 15% or better.
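
    The sketch below fits the widely used single-dynamic-layer multi-exposure speckle model to hypothetical contrast data to recover a correlation time. It is a simplified stand-in for the layered Monte Carlo model developed in the paper, and all exposure times and contrast values are invented.

        import numpy as np
        from scipy.optimize import curve_fit

        def speckle_contrast_sq(T, tau_c, beta):
            """Single-layer MESI model: squared speckle contrast vs. exposure time T."""
            x = T / tau_c
            return beta * (np.exp(-2 * x) - 1 + 2 * x) / (2 * x**2)

        # Hypothetical multi-exposure measurements (exposure time in s, contrast K)
        T = np.array([0.05e-3, 0.1e-3, 0.5e-3, 1e-3, 5e-3, 10e-3, 25e-3])
        K = np.array([0.62, 0.55, 0.33, 0.25, 0.12, 0.09, 0.06])

        (tau_c, beta), _ = curve_fit(speckle_contrast_sq, T, K**2, p0=(1e-3, 0.5))
        print(f"Correlation time tau_c = {tau_c * 1e3:.2f} ms")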

  5. Non-standard spatial statistics and spatial econometrics

    CERN Document Server

    Griffith, Daniel A

    2011-01-01

    Spatial statistics and spatial econometrics are recent sprouts of the tree "spatial analysis with measurement". Still, several general themes have emerged. Exploring selected fields of possible interest is tantalizing, and this is what the authors aim here.

  6. Spatial analysis and characteristics of pig farming in Thailand.

    Science.gov (United States)

    Thanapongtharm, Weerapong; Linard, Catherine; Chinson, Pornpiroon; Kasemsuwan, Suwicha; Visser, Marjolein; Gaughan, Andrea E; Epprech, Michael; Robinson, Timothy P; Gilbert, Marius

    2016-10-06

    In Thailand, pig production intensified significantly during the last decade, with many economic, epidemiological and environmental implications. Strategies toward more sustainable future developments are currently investigated, and these could be informed by a detailed assessment of the main trends in the pig sector and of how different production systems are geographically distributed. This study had two main objectives. First, we aimed to describe the main trends and geographic patterns of pig production systems in Thailand in terms of pig type (native, breeding, and fattening pigs), farm scale (smallholder and large-scale farming systems) and type of farming system (farrow-to-finish, nursery, and finishing systems), based on a very detailed 2010 census. Second, we aimed to study the statistical spatial association between the distributions of these different types of pig farming and a set of spatial variables describing access to feed and markets. Over the last decades, the pig population gradually increased, with a continuously increasing number of pigs per holder, suggesting a continuing intensification of the sector. The different pig-production systems showed very contrasted geographical distributions. The spatial distribution of large-scale pig farms corresponds with that of commercial pig breeds, and spatial analysis conducted using Random Forest distribution models indicated that these were concentrated in lowland urban or peri-urban areas, close to means of transportation, facilitating supply to major markets such as provincial capitals and the Bangkok Metropolitan region. Conversely, the smallholders were distributed throughout the country, with higher densities located in highland, remote, and rural areas, where they supply local rural markets. A limitation of the study was that pig farming systems were defined from the number of animals per farm, resulting in their possible misclassification, but this should have a limited impact on the main patterns revealed.
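
    To illustrate the kind of Random Forest distribution model referred to here, the sketch below relates a binary farm-presence label to a few accessibility covariates; the covariates, their effects and the data are all synthetic placeholders, not the study's variables.

        import numpy as np
        import pandas as pd
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        # Synthetic pixel-level predictors (all hypothetical): elevation (m), travel time
        # to the nearest city (min), and human population density (per km2)
        rng = np.random.default_rng(1)
        n = 2000
        X = pd.DataFrame({
            "elevation": rng.uniform(0, 1500, n),
            "travel_time": rng.uniform(5, 600, n),
            "pop_density": rng.lognormal(4, 1, n),
        })
        # Synthetic label: large-scale farms favour accessible, low-lying, peri-urban pixels
        logit = 1.0 - 0.004 * X["elevation"] - 0.01 * X["travel_time"] + 0.0015 * X["pop_density"]
        y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

        rf = RandomForestClassifier(n_estimators=200, random_state=0)
        auc = cross_val_score(rf, X, y, cv=5, scoring="roc_auc").mean()
        print(f"Cross-validated AUC on synthetic data: {auc:.2f}")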

  7. Colorectal carcinoma: Ex vivo evaluation using 3-T high-spatial-resolution quantitative T2 mapping and its correlation with histopathologic findings.

    Science.gov (United States)

    Yamada, Ichiro; Yoshino, Norio; Hikishima, Keigo; Miyasaka, Naoyuki; Yamauchi, Shinichi; Uetake, Hiroyuki; Yasuno, Masamichi; Saida, Yukihisa; Tateishi, Ukihide; Kobayashi, Daisuke; Eishi, Yoshinobu

    2017-05-01

    In this study, we aimed to evaluate the feasibility of determining the mural invasion depths of colorectal carcinomas using high-spatial-resolution (HSR) quantitative T2 mapping on a 3-T magnetic resonance (MR) scanner. Twenty colorectal specimens containing adenocarcinomas were imaged on a 3-T MR system equipped with a 4-channel phased-array surface coil. HSR quantitative T2 maps were acquired using a spin-echo sequence with a repetition time/echo time of 7650/22.6-361.6 ms (16 echoes), 87 × 43.5-mm field of view, 2-mm section thickness, 448 × 224 matrix, and one signal average. HSR fast spin-echo T2-weighted images were also acquired. Differences between the T2 values (ms) of the tumor tissue, the colorectal wall layers, and fibrosis were measured, and the MR images and histopathologic findings were compared. In all specimens (20/20, 100%), the HSR quantitative T2 maps clearly depicted an 8-layer normal colorectal wall in which the T2 values of each layer differed from those of the adjacent layer(s) (P < …). The T2 maps and histopathologic data yielded the same findings regarding the tumor invasion depth. Our results indicate that 3-T HSR quantitative T2 mapping is useful for distinguishing colorectal wall layers and differentiating tumor and fibrotic tissues. Accordingly, this technique could be used to determine mural invasion by colorectal carcinomas with a high level of accuracy. Copyright © 2017 Elsevier Inc. All rights reserved.
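
    As background on how a quantitative T2 map is computed, the sketch below fits a mono-exponential spin-echo decay to one pixel's multi-echo signal. The echo-time range matches the 22.6-361.6 ms, 16-echo protocol described above, but the signal values are simulated, not taken from the study.

        import numpy as np
        from scipy.optimize import curve_fit

        def t2_decay(te, s0, t2):
            """Mono-exponential spin-echo signal decay."""
            return s0 * np.exp(-te / t2)

        te = np.linspace(22.6, 361.6, 16)      # 16 echo times (ms)

        def fit_t2(signal, te=te):
            (s0, t2), _ = curve_fit(t2_decay, te, signal, p0=(signal[0], 80.0))
            return t2

        # Hypothetical pixel signal with T2 around 65 ms plus noise
        signal = t2_decay(te, 1000.0, 65.0) + np.random.default_rng(2).normal(0, 10, te.size)
        print(f"Estimated T2 = {fit_t2(signal):.1f} ms")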

  8. Quantitative analysis of eyes and other optical systems in linear optics.

    Science.gov (United States)

    Harris, William F; Evans, Tanya; van Gool, Radboud D

    2017-05-01

    To show that 14-dimensional spaces of augmented point P and angle Q characteristics, matrices obtained from the ray transference, are suitable for quantitative analysis although only the latter define an inner-product space and only on it can one define distances and angles. The paper examines the nature of the spaces and their relationships to other spaces including symmetric dioptric power space. The paper makes use of linear optics, a three-dimensional generalization of Gaussian optics. Symmetric 2 × 2 dioptric power matrices F define a three-dimensional inner-product space which provides a sound basis for quantitative analysis (calculation of changes, arithmetic means, etc.) of refractive errors and thin systems. For general systems the optical character is defined by the dimensionally-heterogeneous 4 × 4 symplectic matrix S, the transference, or if explicit allowance is made for heterocentricity, the 5 × 5 augmented symplectic matrix T. Ordinary quantitative analysis cannot be performed on them because matrices of neither of these types constitute vector spaces. Suitable transformations have been proposed but because the transforms are dimensionally heterogeneous the spaces are not naturally inner-product spaces. The paper obtains 14-dimensional spaces of augmented point P and angle Q characteristics. The 14-dimensional space defined by the augmented angle characteristics Q is dimensionally homogeneous and an inner-product space. A 10-dimensional subspace of the space of augmented point characteristics P is also an inner-product space. The spaces are suitable for quantitative analysis of the optical character of eyes and many other systems. Distances and angles can be defined in the inner-product spaces. The optical systems may have multiple separated astigmatic and decentred refracting elements. © 2017 The Authors Ophthalmic & Physiological Optics © 2017 The College of Optometrists.

  9. Quantitative X ray analysis system. User's manual and guide to X ray fluorescence technique

    International Nuclear Information System (INIS)

    2009-01-01

    This guide covers the trimmed and re-arranged version 3.6 of the Quantitative X ray Analysis System (QXAS) software package, which includes the most frequently used methods of quantitative analysis. QXAS is a comprehensive quantitative analysis package that has been developed by the IAEA through research and technical contracts. Additional development has also been carried out in the IAEA Laboratories in Seibersdorf, where QXAS was extensively tested. New in this version of the manual are descriptions of Voigt-profile peak fitting, of the backscatter fundamental-parameters and emission-transmission methods of chemical composition analysis, an expanded chapter on X ray fluorescence physics, and a completely revised and enlarged set of practical examples of the use of the QXAS software package. The analytical data accompanying this manual were collected in the IAEA Seibersdorf Laboratories in 2006/2007.

  10. Quantitative proteomic analysis of intact plastids.

    Science.gov (United States)

    Shiraya, Takeshi; Kaneko, Kentaro; Mitsui, Toshiaki

    2014-01-01

    Plastids are specialized cell organelles in plant cells that are differentiated into various forms including chloroplasts, chromoplasts, and amyloplasts, and fulfill important functions in maintaining the overall cell metabolism and sensing environmental factors such as sunlight. It is therefore important to grasp the mechanisms of differentiation and functional changes of plastids in order to enhance the understanding of vegetality. In this chapter, details of a method for the extraction of intact plastids that makes analysis possible while maintaining the plastid functions are provided; in addition, a quantitative shotgun method for analyzing the composition and changes in the content of proteins in plastids as a result of environmental impacts is described.

  11. Quantitative analysis of elastography images in the detection of breast cancer

    International Nuclear Information System (INIS)

    Landoni, V.; Francione, V.; Marzi, S.; Pasciuti, K.; Ferrante, F.; Saracca, E.; Pedrini, M.; Strigari, L.; Crecco, M.; Di Nallo, A.

    2012-01-01

    Purpose: The aim of this study was to develop a quantitative method for breast cancer diagnosis based on elastosonography images in order to reduce unnecessary biopsies whenever possible. The proposed method was validated by correlating the results of the quantitative analysis with the diagnosis assessed by histopathologic examination. Material and methods: 109 images of breast lesions (50 benign and 59 malignant) were acquired with the traditional B-mode technique and in elastographic modality. Images in Digital Imaging and Communications in Medicine (DICOM) format were exported into software, written in Visual Basic, developed especially for this study. The lesion was contoured and the mean grey value and softness inside the region of interest (ROI) were calculated. The correlations between variables were investigated and receiver operating characteristic (ROC) curve analysis was performed to assess the diagnostic accuracy of the proposed method. Pathologic results were used as the reference standard. Results: Both the mean grey value and the softness inside the ROI were statistically different between the two populations of lesions (benign versus malignant) at the t test: p < 0.0001. The area under the curve (AUC) was 0.924 (0.834-0.973) for the mean grey value and 0.917 (0.826-0.970) for the softness. Conclusions: Quantitative elastosonography is a promising ultrasound technique for the detection of breast cancer, but large prospective trials are necessary to determine whether quantitative image analysis can help to overcome some pitfalls of the method.
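
    The quantitative step described here (a mean grey value per lesion ROI followed by ROC analysis) can be sketched as below; this is not the authors' Visual Basic tool, the image features and pathology labels are invented, and the sign convention simply assumes that malignant lesions appear darker (stiffer) on the elastogram.

        import numpy as np
        from sklearn.metrics import roc_auc_score

        def mean_grey_in_roi(image, roi_mask):
            """Mean grey value of the pixels inside a boolean ROI mask."""
            return float(image[roi_mask].mean())

        # Hypothetical per-lesion mean grey values and pathology labels (1 = malignant)
        mean_grey = np.array([180, 60, 170, 55, 150, 40, 160, 75, 45, 165], dtype=float)
        malignant = np.array([0, 1, 0, 1, 0, 1, 0, 1, 1, 0])

        # Malignant lesions are assumed stiffer (darker), so negate the score for ROC analysis
        auc = roc_auc_score(malignant, -mean_grey)
        print(f"AUC = {auc:.3f}")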

  12. Renal geology (quantitative renal stone analysis) by 'Fourier transform infrared spectroscopy'.

    Science.gov (United States)

    Singh, Iqbal

    2008-01-01

    To prospectively determine precise stone composition (quantitative analysis) by using infrared spectroscopy in patients with urinary stone disease presenting to our clinic, and to identify a method of stone analysis suitable for use in a clinical setting. After a routine and detailed metabolic workup, stone samples from 50 patients with urolithiasis satisfying the entry criteria were subjected to Fourier transform infrared spectroscopic analysis, after adequate sample homogenization, at a single testing center. A mixture of calcium oxalate monohydrate and dihydrate stones was most commonly encountered, in 35 patients (71%), followed by calcium phosphate, carbonate apatite, magnesium ammonium hexahydrate and xanthine stones. Fourier transform infrared spectroscopy provides an accurate, reliable quantitative method of stone analysis. It also helps in maintaining a large computerized reference library. Knowledge of precise stone composition may allow the institution of appropriate prophylactic therapy despite the absence of any detectable metabolic abnormalities, which may prevent or delay stone recurrence.
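
    One common way to turn an FTIR spectrum into a quantitative composition, shown here only as an illustrative sketch with synthetic Gaussian "spectra", is to express the stone spectrum as a non-negative combination of reference component spectra and normalize the weights.

        import numpy as np
        from scipy.optimize import nnls

        # Hypothetical reference FTIR spectra (absorbance at shared wavenumbers); columns are
        # calcium oxalate monohydrate (COM), calcium oxalate dihydrate (COD), carbonate apatite
        wavenumbers = np.linspace(400, 4000, 200)
        references = np.column_stack([
            np.exp(-((wavenumbers - 1620) / 60) ** 2),    # COM-like band
            np.exp(-((wavenumbers - 1650) / 80) ** 2),    # COD-like band
            np.exp(-((wavenumbers - 1030) / 50) ** 2),    # apatite-like band
        ])

        # Hypothetical stone spectrum: 70% COM, 25% COD, 5% apatite
        stone = references @ np.array([0.70, 0.25, 0.05])

        weights, _ = nnls(references, stone)              # non-negative least squares
        composition = weights / weights.sum()
        print({name: round(100 * c, 1) for name, c in
               zip(["COM", "COD", "apatite"], composition)})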

  13. Spatially Resolved Analysis of Bragg Selectivity

    Directory of Open Access Journals (Sweden)

    Tina Sabel

    2015-11-01

    Full Text Available This paper addresses the inherent control of optical shrinkage in photosensitive polymers by means of spatially resolved analysis of volume holographic phase gratings. Point-by-point scanning of the local material response to the Gaussian intensity distribution of the recording beams is accomplished. The derived information on the local grating period and grating slant is evaluated by mapping optical shrinkage in the lateral plane as well as through the depth of the layer. The influence of recording intensity, exposure duration and material viscosity on the Bragg selectivity is investigated.

  14. Quantitative proteomic analysis of ibuprofen-degrading Patulibacter sp. strain I11

    DEFF Research Database (Denmark)

    Almeida, Barbara; Kjeldal, Henrik; Lolas, Ihab Bishara Yousef

    2013-01-01

    Ibuprofen is the third most consumed pharmaceutical drug in the world. Several isolates have been shown to degrade ibuprofen, but very little is known about the biochemistry of this process. This study investigates the degradation of ibuprofen by Patulibacter sp. strain I11 by quantitative proteomics: the proteome was identified and quantified by gel-based shotgun proteomics, and in total 251 unique proteins were quantitated using this approach. Biological process and pathway analysis indicated a number of proteins that were up-regulated in response to active degradation of ibuprofen, some of which are known to be involved in the degradation of aromatic compounds. Data analysis revealed that several of these proteins are likely involved in ibuprofen degradation by Patulibacter sp. strain I11.

  15. A magneto-optical microscope for quantitative measurement of magnetic microstructures.

    Science.gov (United States)

    Patterson, W C; Garraud, N; Shorman, E E; Arnold, D P

    2015-09-01

    An optical system is presented to quantitatively map the stray magnetic fields of microscale magnetic structures, with field resolution down to 50 μT and spatial resolution down to 4 μm. The system uses a magneto-optical indicator film (MOIF) in conjunction with an upright reflective polarizing light microscope to generate optical images of the magnetic field component perpendicular to the image plane. A novel single-light-path construction and a discrete multi-image polarimetry processing method are used to extract quantitative areal field measurements from the optical images. The integrated system, including the equipment, image analysis software, and experimental methods, is described. MOIFs with three different magnetic field ranges are calibrated, and the entire system is validated by measurement of the field patterns from two calibration samples.

  16. Analysis association of milk fat and protein percent in quantitative ...

    African Journals Online (AJOL)

    Analysis association of milk fat and protein percent in quantitative trait locus ... African Journal of Biotechnology ... Protein and fat percent, as contents of milk, are high-priority criteria for financial aims and for selection programs in dairy cattle.

  17. Contribution of the surface contamination of uranium materials to the quantitative analysis results by electron probe microbeam analysis

    International Nuclear Information System (INIS)

    Bonino, O.; Fournier, C.; Fucili, C.; Dugne, O.; Merlet, C.

    2000-01-01

    The analytical testing of uranium materials is necessary for quality research and development in nuclear industry applications (enrichment, safety studies, fuel, etc.). Electron Probe Microbeam Analysis with Wavelength Dispersive Spectrometry (EPMA-WDS) is a dependable non-destructive analytical technique: the characteristic X-ray signal is measured to identify and quantify the sample components, and the analyzed volume is about one cubic micron. The surface contamination of uranium materials modifies and contributes to the quantitative EPMA-WDS results, and this contribution is not representative of the bulk. A thin oxidized layer appears in the first instants after preparation (burnishing, cleaning), as well as a carbon contamination layer due to the metallographic preparation and to carbon cracking under the electron probe. Several analytical difficulties subsequently arise, including an overlap between the carbon Kα line and the uranium N IV-O VI line. The sensitivity and accuracy of the quantification of light elements such as carbon and oxygen are also reduced by the presence of uranium. The aim of this study was to improve the accuracy of quantitative EPMA-WDS analysis of uranium materials by taking account of the contribution of surface contamination. The first part of this paper is devoted to the study of the contaminated surfaces of the uranium materials U, UFe2 and U6Fe a few hours after preparation. These oxidation conditions were selected so as to reproduce the contamination surfaces occurring under microprobe analytical conditions. The surface characterization techniques were SIMS and Auger spectroscopy. The contaminated surfaces consist of successive layers: a carbon layer, an oxidized iron layer, followed by an iron-depleted layer (only in UFe2 and U6Fe), and a ternary oxide layer (U-Fe-O for UFe2 and U6Fe, and UO2+x for uranium). The second part of the paper addresses the estimation of the errors in quantitative

  18. A new quantitative analysis on nitriding kinetics in the oxidized Zry-4 at 900-1200 °C

    Energy Technology Data Exchange (ETDEWEB)

    Park, Sanggi [ACT Co. Ltd., Daejeon (Korea, Republic of)

    2016-10-15

    Two major roles of nitrogen in the degradation of zirconium-based cladding have been identified: mechanical degradation of the cladding, and additional chemical heat release. It has long been known that accelerated oxidation can occur in air due to nitrogen; in addition, significant uptake of nitrogen can also occur. The nitriding of pre-oxidized zirconium-based alloys leads to microporous and less coherent oxide scales. This paper aims to quantitatively investigate the nitriding mechanism and kinetics by proposing a new methodology that couples mass balance analysis with optical microscope image processing analysis. The new quantitative analysis methodology is described in chapter 2 and the investigation of the nitriding kinetics is performed in chapter 3. The experimental details have been reported previously; since only a qualitative analysis was performed in that earlier work, a quantitative analysis is performed in this paper. The nitriding kinetics and mechanism were quantitatively analyzed by the two newly proposed analysis methods: mass balance analysis and optical microscope image processing analysis. Using these combined methods, the mass gain curves and the optical micrographs were analyzed in detail, and the mechanisms of the accelerated, stabilized and saturated nitriding behaviors were clarified. This paper has two distinctive achievements: 1) the development of effective quantitative analysis methods using only the two main results of the oxidation tests, with no detailed analytical sample measurements (e.g. TEM, EPMA and so on) required, which can reduce the cost and effort of the post-test investigation; and 2) the first identification of the nitriding behaviors and their accurate quantitative analysis. Based on these quantitative analysis results on the nitriding kinetics, the new findings will contribute significantly to the understanding of air oxidation behaviors and model

  19. Functional characterization and quantitative expression analysis of two GnRH-related peptide receptors in the mosquito, Aedes aegypti.

    Science.gov (United States)

    Oryan, Alireza; Wahedi, Azizia; Paluzzi, Jean-Paul V

    2018-03-04

    To cope with stressful events such as flight, organisms have evolved various regulatory mechanisms, often involving control by endocrine-derived factors. In insects, two stress-related factors include the gonadotropin-releasing hormone-related peptides adipokinetic hormone (AKH) and corazonin (CRZ). AKH is a pleiotropic hormone best known as a substrate liberator of proteins, lipids, and carbohydrates. Although a universal function has not yet been elucidated, CRZ has been shown to have roles in pigmentation and ecdysis, or to act as a cardiostimulatory factor. While both these neuropeptides and their respective receptors (AKHR and CRZR) have been characterized in several organisms, details on their specific roles within the disease vector, Aedes aegypti, remain largely unexplored. Here, we obtained three A. aegypti AKHR transcript variants and further identified the A. aegypti CRZR receptor. Receptor expression using a heterologous functional assay revealed that these receptors exhibit a highly specific response to their native ligands. Developmental quantitative expression analysis of CRZR revealed enrichment during the pupal and adult stages. In adults, quantitative spatial expression analysis revealed CRZR transcript in a variety of organs including head, thoracic ganglia, primary reproductive organs (ovary and testis), as well as male carcass. This suggests that CRZ may play a role in ecdysis, and neuronal expression of CRZR indicates a possible role for CRZ within the nervous system. Quantitative developmental expression analysis of AKHR identified significant transcript enrichment in early adult stages. AKHR transcript was observed in the head, thoracic ganglia, accessory reproductive tissues and the carcass of adult females, while it was detected in the abdominal ganglia and enriched significantly in the carcass of adult males, which supports the known function of AKH in energy metabolism. Collectively, given the enrichment of CRZR and AKHR in the primary and

  20. Comparative analysis of elements and models of implementation in local-level spatial plans in Serbia

    Directory of Open Access Journals (Sweden)

    Stefanović Nebojša

    2017-01-01

    Full Text Available Implementation of local-level spatial plans is of paramount importance to the development of the local community. This paper aims to demonstrate the importance of and offer further directions for research into the implementation of spatial plans by presenting the results of a study on models of implementation. The paper describes the basic theoretical postulates of a model for implementing spatial plans. A comparative analysis of the application of elements and models of implementation of plans in practice was conducted based on the spatial plans for the local municipalities of Arilje, Lazarevac and Sremska Mitrovica. The analysis includes four models of implementation: the strategy and policy of spatial development; spatial protection; the implementation of planning solutions of a technical nature; and the implementation of rules of use, arrangement and construction of spaces. The main results of the analysis are presented and used to give recommendations for improving the elements and models of implementation. Final deliberations show that models of implementation are generally used in practice and combined in spatial plans. Based on the analysis of how models of implementation are applied in practice, a general conclusion concerning the complex character of the local level of planning is presented and elaborated. [Project of the Serbian Ministry of Education, Science and Technological Development, Grant no. TR 36035: Spatial, Environmental, Energy and Social Aspects of Developing Settlements and Climate Change - Mutual Impacts and Grant no. III 47014: The Role and Implementation of the National Spatial Plan and Regional Development Documents in Renewal of Strategic Research, Thinking and Governance in Serbia

  1. Patient-specific coronary blood supply territories for quantitative perfusion analysis

    Science.gov (United States)

    Zakkaroff, Constantine; Biglands, John D.; Greenwood, John P.; Plein, Sven; Boyle, Roger D.; Radjenovic, Aleksandra; Magee, Derek R.

    2018-01-01

    Myocardial perfusion imaging, coupled with quantitative perfusion analysis, provides an important diagnostic tool for the identification of ischaemic heart disease caused by coronary stenoses. The accurate mapping between coronary anatomy and under-perfused areas of the myocardium is important for diagnosis and treatment. However, in the absence of the actual coronary anatomy during the reporting of perfusion images, areas of ischaemia are allocated to a coronary territory based on a population-derived 17-segment American Heart Association (AHA) model of coronary blood supply. This work presents a solution for the fusion of 2D Magnetic Resonance (MR) myocardial perfusion images and 3D MR angiography data with the aim to improve the detection of ischaemic heart disease. The key contribution of this work is a novel method for the mediated spatiotemporal registration of perfusion and angiography data and a novel method for the calculation of patient-specific coronary supply territories. The registration method uses 4D cardiac MR cine series spanning the complete cardiac cycle in order to overcome the under-constrained nature of non-rigid slice-to-volume perfusion-to-angiography registration. This is achieved by separating out the deformable registration problem and solving it through phase-to-phase registration of the cine series. The use of patient-specific blood supply territories in quantitative perfusion analysis (instead of the population-based model of coronary blood supply) has the potential of increasing the accuracy of perfusion analysis. Quantitative perfusion analysis diagnostic accuracy evaluation with patient-specific territories against the AHA model demonstrates the value of the mediated spatiotemporal registration in the context of ischaemic heart disease diagnosis. PMID:29392098

  2. Quantitative analysis of γ–oryzanol content in cold pressed rice bran oil by TLC–image analysis method

    Directory of Open Access Journals (Sweden)

    Apirak Sakunpak

    2014-02-01

    Conclusions: The TLC-densitometric and TLC-image analysis methods provided similar reproducibility, accuracy and selectivity for the quantitative determination of γ-oryzanol in cold pressed rice bran oil. A statistical comparison of the quantitative determinations of γ-oryzanol in samples did not show any statistically significant difference between the TLC-densitometric and TLC-image analysis methods. As both methods were found to be equivalent, they can therefore be used for the determination of γ-oryzanol in cold pressed rice bran oil.

  3. Terrain feature recognition for synthetic aperture radar (SAR) imagery employing spatial attributes of targets

    Science.gov (United States)

    Iisaka, Joji; Sakurai-Amano, Takako

    1994-08-01

    This paper describes an integrated approach to terrain feature detection and several methods to estimate spatial information from SAR (synthetic aperture radar) imagery. Spatial information about image features as well as spatial association are key elements in terrain feature detection. After applying a small-feature-preserving despeckling operation, spatial information such as edginess, texture (smoothness), region-likeness and line-likeness of objects, target sizes, and target shapes was estimated. A trapezoid-shaped fuzzy membership function was then assigned to each spatial feature attribute, and fuzzy classification logic was employed to detect terrain features. Terrain features such as urban areas, mountain ridges, lakes and other water bodies as well as vegetated areas were successfully identified from a sub-image of a JERS-1 SAR image. In the course of shape analysis, a quantitative method was developed to classify spatial patterns by expanding a spatial pattern through the use of a series of pattern primitives.
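
    The trapezoid membership assignment described above is standard fuzzy-set machinery and can be illustrated with a minimal sketch; the attribute names, breakpoints and the fuzzy AND combination below are hypothetical illustrations, not values from the paper.

      import numpy as np

      def trapezoid_membership(x, a, b, c, d):
          # Membership rises linearly from a to b, stays at 1 between b and c, and falls from c to d.
          x = np.asarray(x, dtype=float)
          rise = np.clip((x - a) / (b - a), 0.0, 1.0)
          fall = np.clip((d - x) / (d - c), 0.0, 1.0)
          return np.minimum(rise, fall)

      # Hypothetical spatial attributes for three image regions, combined by a fuzzy AND (minimum)
      edginess = np.array([0.10, 0.45, 0.80])
      region_likeness = np.array([0.70, 0.30, 0.20])
      urban_score = np.minimum(trapezoid_membership(edginess, 0.2, 0.4, 0.9, 1.0),
                               trapezoid_membership(region_likeness, 0.1, 0.3, 0.8, 0.95))

    In a fuller pipeline one such membership would be evaluated per attribute and per terrain class, with the highest combined membership deciding the class label.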

  4. Optimal climate policy is a utopia. From quantitative to qualitative cost-benefit analysis

    International Nuclear Information System (INIS)

    Van den Bergh, Jeroen C.J.M.

    2004-01-01

    The dominance of quantitative cost-benefit analysis (CBA) and optimality concepts in the economic analysis of climate policy is criticised. Among other things, it is argued to be based on a misplaced interpretation of policy for a complex climate-economy system as being analogous to individual inter-temporal welfare optimisation. The transfer of quantitative CBA and optimality concepts reflects an overly ambitious approach that does more harm than good. An alternative approach is to focus attention on extreme events, structural change and complexity. It is argued that a qualitative rather than a quantitative CBA that takes account of these aspects can support the adoption of a minimax regret approach or precautionary principle in climate policy. This means: implement stringent GHG reduction policies as soon as possible.

  5. Method of quantitative analysis of superconducting metal-containing composite materials

    International Nuclear Information System (INIS)

    Bogomolov, V.N.; Zhuravlev, V.V.; Petranovskij, V.P.; Pimenov, V.A.

    1990-01-01

    A technique for quantitative analysis of superconducting metal-containing composite materials, SnO2-InSn, WO3-InW and ZnO-InZn in particular, has been developed. The method of determining the metal content in a composite is based on the dependence of the superconducting transition temperature on alloy composition. Sensitivity of temperature determination: 0.02 K; error of analysis for the InSn system: 0.5%

  6. Meta-analysis for quantitative microbiological risk assessments and benchmarking data

    NARCIS (Netherlands)

    Besten, den H.M.W.; Zwietering, M.H.

    2012-01-01

    Meta-analysis studies are increasingly being conducted in the food microbiology area to quantitatively integrate the findings of many individual studies on specific questions or kinetic parameters of interest. Meta-analyses provide global estimates of parameters and quantify their variabilities, and

  7. Visual and Quantitative Analysis Methods of Respiratory Patterns for Respiratory Gated PET/CT.

    Science.gov (United States)

    Son, Hye Joo; Jeong, Young Jin; Yoon, Hyun Jin; Park, Jong-Hwan; Kang, Do-Young

    2016-01-01

    We integrated visual and quantitative methods for analyzing the stability of respiration using four methods: phase space diagrams, Fourier spectra, Poincaré maps, and Lyapunov exponents. Respiratory patterns of 139 patients were grouped based on the combination of the regularity of amplitude, period, and baseline positions. Visual grading was done by inspecting the shape of the diagram and classified into two states: regular and irregular. Quantitation was done by measuring the standard deviation of the x and v coordinates of the Poincaré map (SDx, SDv), the height of the fundamental peak (A1) in the Fourier spectrum, or the difference between maximal upward and downward drift. Each group showed a characteristic pattern on visual analysis. There were differences in the quantitative parameters (SDx, SDv, A1, and MUD-MDD) among the four groups (one-way ANOVA, p = 0.0001 for MUD-MDD, SDx, and SDv; p = 0.0002 for A1). In ROC analysis, the cutoff values were 0.11 for SDx (AUC: 0.982, p quantitative indices of respiratory stability and determining a quantitative cutoff value for differentiating regular and irregular respiration.
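
    The quantitative indices named in the abstract can be sketched as follows; the respiration trace, sampling rate and the exact construction of the Poincaré section are assumptions for illustration rather than the authors' implementation.

      import numpy as np

      fs = 25.0                                        # sampling rate in Hz (assumed)
      t = np.arange(0, 120, 1 / fs)
      x = np.sin(2 * np.pi * 0.25 * t) + 0.05 * np.random.randn(t.size)   # synthetic respiration trace

      v = np.gradient(x, 1 / fs)                       # velocity coordinate of the phase-space diagram

      # Fundamental Fourier peak A1 and its frequency f1
      amp = np.abs(np.fft.rfft(x - x.mean())) * 2 / x.size
      freqs = np.fft.rfftfreq(x.size, 1 / fs)
      k1 = 1 + amp[1:].argmax()
      A1, f1 = amp[k1], freqs[k1]

      # Poincaré map: sample the (x, v) trajectory once per nominal breathing cycle
      stride = int(round(fs / f1))
      px, pv = x[::stride], v[::stride]
      SDx, SDv = px.std(), pv.std()                    # larger spread suggests irregular respiration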

  8. Visuo-Spatial Performance in Autism: A Meta-Analysis

    Science.gov (United States)

    Muth, Anne; Hönekopp, Johannes; Falter, Christine M.

    2014-01-01

    Visuo-spatial skills are believed to be enhanced in autism spectrum disorders (ASDs). This meta-analysis tests the current state of evidence for Figure Disembedding, Block Design, Mental Rotation and Navon tasks in ASD and neurotypicals. Block Design (d = 0.32) and Figure Disembedding (d = 0.26) showed superior performance for ASD with large…

  9. GProX, a user-friendly platform for bioinformatics analysis and visualization of quantitative proteomics data.

    Science.gov (United States)

    Rigbolt, Kristoffer T G; Vanselow, Jens T; Blagoev, Blagoy

    2011-08-01

    Recent technological advances have made it possible to identify and quantify thousands of proteins in a single proteomics experiment. As a result of these developments, the analysis of data has become the bottleneck of proteomics experiments. To provide the proteomics community with a user-friendly platform for comprehensive analysis, inspection and visualization of quantitative proteomics data we developed the Graphical Proteomics Data Explorer (GProX)(1). The program requires no special bioinformatics training, as all functions of GProX are accessible within its user-friendly graphical interface, which will be intuitive to most users. Basic features facilitate the uncomplicated management and organization of large data sets and complex experimental setups as well as the inspection and graphical plotting of quantitative data. These are complemented by readily available high-level analysis options such as database querying, clustering based on abundance ratios, feature enrichment tests for e.g. GO terms and pathway analysis tools. A number of plotting options for visualization of quantitative proteomics data are available and most analysis functions in GProX create customizable high quality graphical displays in both vector and bitmap formats. The generic import requirements allow data originating from essentially all mass spectrometry platforms, quantitation strategies and software to be analyzed in the program. GProX represents a powerful approach to proteomics data analysis providing proteomics experimenters with a toolbox for bioinformatics analysis of quantitative proteomics data. The program is released as open-source and can be freely downloaded from the project webpage at http://gprox.sourceforge.net.

  10. Analysis of archaeological ceramics by total-reflection X-ray fluorescence: Quantitative approaches

    International Nuclear Information System (INIS)

    Fernandez-Ruiz, R.; Garcia-Heras, M.

    2008-01-01

    This paper reports the quantitative methodologies developed for the compositional characterization of archaeological ceramics by total-reflection X-ray fluorescence at two levels. A first quantitative level which comprises an acid leaching procedure, and a second selective level, which seeks to increase the number of detectable elements by eliminating the iron present in the acid leaching procedure. Total-reflection X-ray fluorescence spectrometry has been compared, at a quantitative level, with Instrumental Neutron Activation Analysis in order to test its applicability to the study of this kind of materials. The combination of a solid chemical homogenization procedure previously reported with the quantitative methodologies here presented allows the total-reflection X-ray fluorescence to analyze 29 elements with acceptable analytical recoveries and accuracies

  11. Analysis of archaeological ceramics by total-reflection X-ray fluorescence: Quantitative approaches

    Energy Technology Data Exchange (ETDEWEB)

    Fernandez-Ruiz, R. [Servicio Interdepartamental de Investigacion, Facultad de Ciencias, Universidad Autonoma de Madrid, Modulo C-9, Laboratorio de TXRF, Crta. Colmenar, Km 15, Cantoblanco, E-28049, Madrid (Spain)], E-mail: ramon.fernandez@uam.es; Garcia-Heras, M. [Grupo de Arqueometria de Vidrios y Materiales Ceramicos, Instituto de Historia, Centro de Ciencias Humanas y Sociales, CSIC, C/ Albasanz, 26-28, 28037 Madrid (Spain)

    2008-09-15

    This paper reports the quantitative methodologies developed for the compositional characterization of archaeological ceramics by total-reflection X-ray fluorescence at two levels. A first quantitative level which comprises an acid leaching procedure, and a second selective level, which seeks to increase the number of detectable elements by eliminating the iron present in the acid leaching procedure. Total-reflection X-ray fluorescence spectrometry has been compared, at a quantitative level, with Instrumental Neutron Activation Analysis in order to test its applicability to the study of this kind of materials. The combination of a solid chemical homogenization procedure previously reported with the quantitative methodologies here presented allows the total-reflection X-ray fluorescence to analyze 29 elements with acceptable analytical recoveries and accuracies.

  12. Study on methods of quantitative analysis of the biological thin samples in EM X-ray microanalysis

    International Nuclear Information System (INIS)

    Zhang Detian; Zhang Xuemin; He Kun; Yang Yi; Zhang Sa; Wang Baozhen

    2000-01-01

    Objective: To study methods for the quantitative analysis of biological thin samples. Methods: Hall theory was used to address qualitative analysis, background subtraction, the separation of overlapping peaks, external radiation and aberrations of spectra. Results: Reliable qualitative analysis and precise quantitative analysis were achieved. Conclusion: The methods for the analysis of biological thin samples in EM X-ray microanalysis can be used in biomedical research

  13. Development of spatial data guidelines and standards: spatial data set documentation to support hydrologic analysis in the U.S. Geological Survey

    Science.gov (United States)

    Fulton, James L.

    1992-01-01

    Spatial data analysis has become an integral component in many surface and sub-surface hydrologic investigations within the U.S. Geological Survey (USGS). Currently, one of the largest costs in applying spatial data analysis is the cost of developing the needed spatial data. Therefore, guidelines and standards are required for the development of spatial data in order to allow for data sharing and reuse; this eliminates costly redevelopment. In order to attain this goal, the USGS is expanding efforts to identify guidelines and standards for the development of spatial data for hydrologic analysis. Because of the variety of project and database needs, the USGS has concentrated on developing standards for documenting spatial data sets to aid in the assessment of data set quality and compatibility of different data sets. An interim data set documentation standard (1990) has been developed that provides a mechanism for associating a wide variety of information with a data set, including data about source material, data automation and editing procedures used, projection parameters, data statistics, descriptions of features and feature attributes, information on organizational contacts, lists of operations performed on the data, and free-form comments and notes about the data, made at various times in the evolution of the data set. The interim data set documentation standard has been automated using a commercial geographic information system (GIS) and data set documentation software developed by the USGS. Where possible, USGS-developed software is used to enter data into the data set documentation file automatically. The GIS software closely associates a data set with its data set documentation file; the documentation file is retained with the data set whenever it is modified, copied, or transferred to another computer system. The Water Resources Division of the USGS is continuing to develop spatial data and data processing standards, with emphasis on standards needed to support

  14. Cross-Industry Spatially Localized Innovation Networks

    Directory of Open Access Journals (Sweden)

    Aleksandr Evseevich Karlik

    2016-12-01

    Full Text Available This article’s objective is to develop a conceptual approach to the study of the key decision-making factors and regularities of cross-industry spatially localized innovation networks, applying quantitative and qualitative data from the St. Petersburg Innovation and Technology Cluster of Machinery Manufacturing and Metalworking. The paper builds on previous research findings which conclude that such networks present both opportunities and constraints for innovation. The hypothesis is that in clusters, which represent a special type of these networks, spatial proximity partly offsets the negative impact of industrial distance. The authors propose a structural and logical model of strategic decision-making to analyze these effects on innovation. They specify the network’s influences on performance: cognitive diversity; knowledge and expertise; structural autonomy and equivalence. The model is applied to a spatially localized cross-industry cluster and then refined in accordance with the obtained results to account for resource flows. This allowed the dynamics of innovation activity to be taken into account and practical implications to be developed for the particular business context. The analysis identified the peculiarities of spatially localized cross-industry innovation cooperation from the perspective of combining tangible resources, information and other intangible resources for the renewal of mature industries. The research results can be used in business as well as in industrial and regional economic policy. In the conclusion, the article outlines future research directions: a comprehensive empirical study analyzing data on the factors of cross-industry cooperation identified in this paper, with testing of causal relations; and the development of an approach to the study of spatially localized networks based on the exchange of primary resources within an economic system stability framework.

  15. Quantitative x-ray fractographic analysis of fatigue fractures

    International Nuclear Information System (INIS)

    Saprykin, Yu.V.

    1983-01-01

    The study deals with the quantitative X-ray fractographic investigation of fatigue fractures of sharply notched samples tested at various stresses and temperatures, with the purpose of establishing a connection between material crack resistance parameters and the local plastic instability zones that restrain and control crack growth. In the fatigue fractures of notched Kh18N9T steel samples tested at +20 and -196 deg C, a zone of sharp ring-notch effect is singled out, analogous to the zone in which the crack growth rate is controlled by microshifting mechanisms. The size of the notch effect zone in the investigated steel is unambiguously related to the stress amplitude. This makes it possible to determine the stress value from the results of quantitative fractographic analysis of notched sample fractures. The possibility of determining one of the threshold values of the cyclic fracture toughness of the material from the results of fatigue testing and fractography of notched sample fractures is shown. A correlation between the size of the h_s crack effect zone in the notched sample, the material yield limit δ and the cyclic fracture toughness characteristic K_s has been found. Such a correlation widens the possibilities of quantitative diagnostics of fractures by the methods of X-ray fractography

  16. Wavelength Selection Method Based on Differential Evolution for Precise Quantitative Analysis Using Terahertz Time-Domain Spectroscopy.

    Science.gov (United States)

    Li, Zhi; Chen, Weidong; Lian, Feiyu; Ge, Hongyi; Guan, Aihong

    2017-12-01

    Quantitative analysis of component mixtures is an important application of terahertz time-domain spectroscopy (THz-TDS) and has attracted broad interest in recent research. Although the accuracy of quantitative analysis using THz-TDS is affected by a host of factors, wavelength selection from the sample's THz absorption spectrum is the most crucial component. The raw spectrum consists of signals from the sample and scattering and other random disturbances that can critically influence the quantitative accuracy. For precise quantitative analysis using THz-TDS, the signal from the sample needs to be retained while the scattering and other noise sources are eliminated. In this paper, a novel wavelength selection method based on differential evolution (DE) is investigated. By performing quantitative experiments on a series of binary amino acid mixtures using THz-TDS, we demonstrate the efficacy of the DE-based wavelength selection method, which yields an error rate below 5%.
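
    The selection step can be sketched with SciPy's differential_evolution driving a cross-validated calibration model; the spectra and concentrations below are synthetic placeholders, and the thresholded-weight encoding of the wavelength subset is an assumption rather than the authors' exact scheme.

      import numpy as np
      from scipy.optimize import differential_evolution
      from sklearn.linear_model import LinearRegression
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      X = rng.normal(size=(40, 30))                               # placeholder THz absorbances (samples x wavelengths)
      y = X[:, 3] + 0.5 * X[:, 17] + 0.1 * rng.normal(size=40)    # placeholder concentrations

      def cost(weights):
          mask = weights > 0.5                          # keep wavelengths whose weight exceeds 0.5
          if mask.sum() < 2:
              return 1e3                                # penalize empty or degenerate selections
          rmse = -cross_val_score(LinearRegression(), X[:, mask], y,
                                  scoring="neg_root_mean_squared_error", cv=5).mean()
          return rmse + 0.01 * mask.sum()               # mild penalty favours compact wavelength sets

      result = differential_evolution(cost, bounds=[(0.0, 1.0)] * X.shape[1],
                                      maxiter=15, popsize=5, seed=0, polish=False)
      selected_wavelength_indices = np.flatnonzero(result.x > 0.5)

    Any calibration model (for example partial least squares) could replace the linear regression inside the cost function; the differential evolution wrapper stays the same.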

  17. Quantitative texture analysis of electrodeposited line patterns

    DEFF Research Database (Denmark)

    Pantleon, Karen; Somers, Marcel A.J.

    2005-01-01

    Free-standing line patterns of Cu and Ni were manufactured by electrochemical deposition into lithographically prepared patterns. Electrodeposition was carried out on top of a highly oriented Au-layer physically vapor deposited on glass. Quantitative texture analysis carried out by means of x-ray diffraction for both the substrate layer and the electrodeposits yielded experimental evidence for epitaxy between Cu and Au. An orientation relation between film and substrate was discussed with respect to various concepts of epitaxy. While the conventional mode of epitaxy fails for the Cu...

  18. Joint analysis of binary and quantitative traits with data sharing and outcome-dependent sampling.

    Science.gov (United States)

    Zheng, Gang; Wu, Colin O; Kwak, Minjung; Jiang, Wenhua; Joo, Jungnam; Lima, Joao A C

    2012-04-01

    We study the analysis of a joint association between a genetic marker with both binary (case-control) and quantitative (continuous) traits, where the quantitative trait values are only available for the cases due to data sharing and outcome-dependent sampling. Data sharing becomes common in genetic association studies, and the outcome-dependent sampling is the consequence of data sharing, under which a phenotype of interest is not measured for some subgroup. The trend test (or Pearson's test) and F-test are often, respectively, used to analyze the binary and quantitative traits. Because of the outcome-dependent sampling, the usual F-test can be applied using the subgroup with the observed quantitative traits. We propose a modified F-test by also incorporating the genotype frequencies of the subgroup whose traits are not observed. Further, a combination of this modified F-test and Pearson's test is proposed by Fisher's combination of their P-values as a joint analysis. Because of the correlation of the two analyses, we propose to use a Gamma (scaled chi-squared) distribution to fit the asymptotic null distribution for the joint analysis. The proposed modified F-test and the joint analysis can also be applied to test single trait association (either binary or quantitative trait). Through simulations, we identify the situations under which the proposed tests are more powerful than the existing ones. Application to a real dataset of rheumatoid arthritis is presented. © 2012 Wiley Periodicals, Inc.
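
    A minimal sketch of the combination step is given below; the trend test and the modified F-test themselves are not reproduced, the correlation between the two statistics is simply assumed, and the Gamma calibration follows the scaled chi-squared idea described in the abstract.

      import numpy as np
      from scipy import stats

      def combined_p(p_trend, p_F, rho=0.3, n_sim=100_000, seed=1):
          # Fisher combination of two correlated p-values with a Gamma-fitted null.
          # rho is an assumed correlation between the underlying z-statistics;
          # in practice it would be estimated from the data or by simulation.
          fisher_obs = -2.0 * (np.log(p_trend) + np.log(p_F))

          # Simulate the null: correlated standard-normal statistics -> two-sided p-values
          rng = np.random.default_rng(seed)
          z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n_sim)
          p = 2.0 * stats.norm.sf(np.abs(z))
          fisher_null = -2.0 * np.log(p).sum(axis=1)

          # Fit a Gamma (scaled chi-squared) distribution to the simulated null statistics
          shape, loc, scale = stats.gamma.fit(fisher_null, floc=0.0)
          return stats.gamma.sf(fisher_obs, shape, loc=loc, scale=scale)

      # Example: trend-test p-value 0.01 and modified F-test p-value 0.04
      print(combined_p(0.01, 0.04))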

  19. Investment appraisal using quantitative risk analysis.

    Science.gov (United States)

    Johansson, Henrik

    2002-07-01

    Investment appraisal concerned with investments in fire safety systems is discussed. Particular attention is directed at evaluating, in terms of the Bayesian decision theory, the risk reduction that investment in a fire safety system involves. It is shown how the monetary value of the change from a building design without any specific fire protection system to one including such a system can be estimated by use of quantitative risk analysis, the results of which are expressed in terms of a Risk-adjusted net present value. This represents the intrinsic monetary value of investing in the fire safety system. The method suggested is exemplified by a case study performed in an Avesta Sheffield factory.
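
    The risk-adjusted net present value idea can be sketched with a small Monte Carlo calculation; every number below (investment cost, fire frequencies, loss distribution, discount rate) is invented for illustration and is not taken from the case study.

      import numpy as np

      rng = np.random.default_rng(42)
      n, years, r = 50_000, 20, 0.05            # simulations, horizon in years, discount rate (assumed)
      investment = 250_000.0                    # up-front cost of the fire safety system (assumed)

      # Annual fire frequencies with/without the system, and lognormal fire losses (assumed)
      freq_without, freq_with = 0.04, 0.01
      loss = rng.lognormal(mean=13.0, sigma=1.0, size=(n, years))
      fires_without = rng.poisson(freq_without, size=(n, years))
      fires_with = rng.poisson(freq_with, size=(n, years))

      discount = 1.0 / (1.0 + r) ** np.arange(1, years + 1)
      pv_loss_without = (fires_without * loss * discount).sum(axis=1)
      pv_loss_with = (fires_with * loss * discount).sum(axis=1)

      # Risk-adjusted NPV: discounted expected risk reduction minus the investment cost
      risk_adjusted_npv = (pv_loss_without - pv_loss_with).mean() - investment

    A positive value would indicate that, under the assumed distributions, the discounted risk reduction outweighs the cost of installing the system.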

  20. Spatial and temporal analysis of mass movement using dendrochronology

    NARCIS (Netherlands)

    Braam, R.R.; Weiss, E.E.J.; Burrough, P.A.

    1987-01-01

    Tree growth and inclination on sloping land is affected by mass movement. Suitable analysis of tree growth and tree form can therefore provide considerable information on mass movement activity. This paper reports a new, automated method for studying the temporal and spatial aspects of mass

  1. Analysis of Spatial Concepts, Spatial Skills and Spatial Representations in New York State Regents Earth Science Examinations

    Science.gov (United States)

    Kastens, Kim A.; Pistolesi, Linda; Passow, Michael J.

    2014-01-01

    Research has shown that spatial thinking is important in science in general, and in Earth Science in particular, and that performance on spatially demanding tasks can be fostered through instruction. Because spatial thinking is rarely taught explicitly in the U.S. education system, improving spatial thinking may be "low-hanging fruit" as…

  2. GProX, a User-Friendly Platform for Bioinformatics Analysis and Visualization of Quantitative Proteomics Data

    DEFF Research Database (Denmark)

    Rigbolt, Kristoffer T G; Vanselow, Jens T; Blagoev, Blagoy

    2011-01-01

    -friendly platform for comprehensive analysis, inspection and visualization of quantitative proteomics data we developed the Graphical Proteomics Data Explorer (GProX)(1). The program requires no special bioinformatics training, as all functions of GProX are accessible within its graphical user-friendly interface...... such as database querying, clustering based on abundance ratios, feature enrichment tests for e.g. GO terms and pathway analysis tools. A number of plotting options for visualization of quantitative proteomics data is available and most analysis functions in GProX create customizable high quality graphical...... displays in both vector and bitmap formats. The generic import requirements allow data originating from essentially all mass spectrometry platforms, quantitation strategies and software to be analyzed in the program. GProX represents a powerful approach to proteomics data analysis providing proteomics...

  3. Quantitative Analysis of Hepatitis C NS5A Viral Protein Dynamics on the ER Surface

    Directory of Open Access Journals (Sweden)

    Markus M. Knodel

    2018-01-01

    Full Text Available Exploring biophysical properties of virus-encoded components and their requirement for virus replication is an exciting new area of interdisciplinary virological research. To date, spatial resolution has only rarely been analyzed in computational/biophysical descriptions of virus replication dynamics. However, it is widely acknowledged that intracellular spatial dependence is a crucial component of virus life cycles. The hepatitis C virus-encoded NS5A protein is an endoplasmic reticulum (ER)-anchored viral protein and an essential component of the virus replication machinery. Therefore, we simulate NS5A dynamics on realistic reconstructed, curved ER surfaces by means of surface partial differential equations (sPDE) upon unstructured grids. We match the in silico NS5A diffusion constant such that the NS5A sPDE simulation data reproduce experimental NS5A fluorescence recovery after photobleaching (FRAP) time series data. This parameter estimation yields the NS5A diffusion constant. Such parameters are needed for spatial models of HCV dynamics, which we are developing in parallel but remain qualitative at this stage. Thus, our present study likely provides the first quantitative biophysical description of the movement of a viral component. Our spatio-temporally resolved ansatz paves new ways for understanding intricate spatially defined processes central to specific aspects of virus life cycles.

  4. Quantitative Analysis of Hepatitis C NS5A Viral Protein Dynamics on the ER Surface.

    Science.gov (United States)

    Knodel, Markus M; Nägel, Arne; Reiter, Sebastian; Vogel, Andreas; Targett-Adams, Paul; McLauchlan, John; Herrmann, Eva; Wittum, Gabriel

    2018-01-08

    Exploring biophysical properties of virus-encoded components and their requirement for virus replication is an exciting new area of interdisciplinary virological research. To date, spatial resolution has only rarely been analyzed in computational/biophysical descriptions of virus replication dynamics. However, it is widely acknowledged that intracellular spatial dependence is a crucial component of virus life cycles. The hepatitis C virus-encoded NS5A protein is an endoplasmic reticulum (ER)-anchored viral protein and an essential component of the virus replication machinery. Therefore, we simulate NS5A dynamics on realistic reconstructed, curved ER surfaces by means of surface partial differential equations (sPDE) upon unstructured grids. We match the in silico NS5A diffusion constant such that the NS5A sPDE simulation data reproduce experimental NS5A fluorescence recovery after photobleaching (FRAP) time series data. This parameter estimation yields the NS5A diffusion constant. Such parameters are needed for spatial models of HCV dynamics, which we are developing in parallel but remain qualitative at this stage. Thus, our present study likely provides the first quantitative biophysical description of the movement of a viral component. Our spatio-temporally resolved ansatz paves new ways for understanding intricate spatially defined processes central to specific aspects of virus life cycles.

  5. Quantitative Analysis of Hepatitis C NS5A Viral Protein Dynamics on the ER Surface

    Science.gov (United States)

    Nägel, Arne; Reiter, Sebastian; Vogel, Andreas; McLauchlan, John; Herrmann, Eva; Wittum, Gabriel

    2018-01-01

    Exploring biophysical properties of virus-encoded components and their requirement for virus replication is an exciting new area of interdisciplinary virological research. To date, spatial resolution has only rarely been analyzed in computational/biophysical descriptions of virus replication dynamics. However, it is widely acknowledged that intracellular spatial dependence is a crucial component of virus life cycles. The hepatitis C virus-encoded NS5A protein is an endoplasmic reticulum (ER)-anchored viral protein and an essential component of the virus replication machinery. Therefore, we simulate NS5A dynamics on realistic reconstructed, curved ER surfaces by means of surface partial differential equations (sPDE) upon unstructured grids. We match the in silico NS5A diffusion constant such that the NS5A sPDE simulation data reproduce experimental NS5A fluorescence recovery after photobleaching (FRAP) time series data. This parameter estimation yields the NS5A diffusion constant. Such parameters are needed for spatial models of HCV dynamics, which we are developing in parallel but remain qualitative at this stage. Thus, our present study likely provides the first quantitative biophysical description of the movement of a viral component. Our spatio-temporally resolved ansatz paves new ways for understanding intricate spatially defined processes central to specific aspects of virus life cycles. PMID:29316722

  6. Quantitative Analysis of Hepatitis C NS5A Viral Protein Dynamics on the ER Surface

    KAUST Repository

    Knodel, Markus

    2018-01-08

    Exploring biophysical properties of virus-encoded components and their requirement for virus replication is an exciting new area of interdisciplinary virological research. To date, spatial resolution has only rarely been analyzed in computational/biophysical descriptions of virus replication dynamics. However, it is widely acknowledged that intracellular spatial dependence is a crucial component of virus life cycles. The hepatitis C virus-encoded NS5A protein is an endoplasmic reticulum (ER)-anchored viral protein and an essential component of the virus replication machinery. Therefore, we simulate NS5A dynamics on realistic reconstructed, curved ER surfaces by means of surface partial differential equations (sPDE) upon unstructured grids. We match the in silico NS5A diffusion constant such that the NS5A sPDE simulation data reproduce experimental NS5A fluorescence recovery after photobleaching (FRAP) time series data. This parameter estimation yields the NS5A diffusion constant. Such parameters are needed for spatial models of HCV dynamics, which we are developing in parallel but remain qualitative at this stage. Thus, our present study likely provides the first quantitative biophysical description of the movement of a viral component. Our spatio-temporally resolved ansatz paves new ways for understanding intricate spatially defined processes central to specific aspects of virus life cycles.

  7. A Geospatial Cyberinfrastructure for Urban Economic Analysis and Spatial Decision-Making

    Directory of Open Access Journals (Sweden)

    Michael F. Goodchild

    2013-05-01

    Full Text Available Urban economic modeling and effective spatial planning are critical tools towards achieving urban sustainability. However, in practice, many technical obstacles, such as information islands, poor documentation of data and lack of software platforms to facilitate virtual collaboration, are challenging the effectiveness of decision-making processes. In this paper, we report on our efforts to design and develop a geospatial cyberinfrastructure (GCI) for urban economic analysis and simulation. This GCI provides an operational graphic user interface, built upon a service-oriented architecture to allow (1) widespread sharing and seamless integration of distributed geospatial data; (2) an effective way to address the uncertainty and positional errors encountered in fusing data from diverse sources; (3) the decomposition of complex planning questions into atomic spatial analysis tasks and the generation of a web service chain to tackle such complex problems; and (4) capturing and representing provenance of geospatial data to trace its flow in the modeling task. The Greater Los Angeles Region serves as the test bed. We expect this work to contribute to effective spatial policy analysis and decision-making through the adoption of advanced GCI and to broaden the application coverage of GCI to include urban economic simulations.

  8. A Spatial Analysis of Tourism Activity in Romania

    Directory of Open Access Journals (Sweden)

    Daniela Luminita Constantin

    2018-02-01

    Full Text Available Location is a key concept in tourism sector analysis, given the dependence of this activity on the natural, built, cultural and social characteristics of a certain territory. As a result, the tourist zoning is an important instrument for delimiting tourist areas in accordance with multiple criteria, so as to lay the foundations for finding the most suitable solutions of turning to good account the resources in this field. The modern approaches proposed in this paper use a series of analytical tools that combine GIS and spatial agglomeration analysis based techniques. They can be also employed in order to examine and explain the differences between tourist zones (and sub-zones) in terms of economic and social results and thus to suggest realistic ways to improve the efficiency and effectiveness of tourist activities in various geographical areas. In the described context this paper proposes an interdisciplinary perspective (spatial statistics and Geographical Information Systems) for analysing the tourism activity in Romania, mainly aiming to identify the agglomerations of companies acting in this industry and assess their performance and contribution to the economic development of the corresponding regions. It also intends to contribute to a better understanding of the way in which tourism related business activities develop, in order to enhance appropriate support networks. Territorial and spatial statistics, as well as GIS based analyses are applied, using data about all companies acting in tourism industry in Romania provided by the National Authority for Tourism as well as data from the Environmental Systems Research Institute (ESRI).

  9. Spatially resolved δ13C analysis using laser ablation isotope ratio mass spectrometry

    Science.gov (United States)

    Moran, J.; Riha, K. M.; Nims, M. K.; Linley, T. J.; Hess, N. J.; Nico, P. S.

    2014-12-01

    Inherent geochemical, organic matter, and microbial heterogeneity over small spatial scales can complicate studies of carbon dynamics through soils. Stable isotope analysis has a strong history of helping to track substrate turnover, delineate rhizosphere activity zones, and identify transitions in vegetation cover, but most traditional isotope approaches are limited in spatial resolution by a combination of physical separation techniques (manual dissection) and IRMS instrument sensitivity. We coupled laser ablation sampling with isotope measurement via IRMS to enable spatially resolved analysis over solid surfaces. Once a targeted sample region is ablated, the resulting particulates are entrained in a helium carrier gas and passed through a combustion reactor where carbon is converted to CO2. Cryotrapping of the resulting CO2 enables a reduction in carrier gas flow, which improves overall measurement sensitivity versus traditional, high flow sample introduction. Currently we are performing sample analysis at 50 μm resolution, require 65 ng C per analysis, and achieve measurement precision consistent with other continuous flow techniques. We will discuss applications of the laser ablation IRMS (LA-IRMS) system to microbial communities and fish ecology studies to demonstrate the merits of this technique and how similar analytical approaches can be transitioned to soil systems. Preliminary efforts at analyzing soil samples will be used to highlight strengths and limitations of the LA-IRMS approach, paying particular attention to sample preparation requirements, spatial resolution, sample analysis time, and the types of questions most conducive to analysis via LA-IRMS.
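
    For context, the δ13C values such measurements report follow the usual delta notation relative to a reference standard; the small sketch below uses invented isotope ratios and quotes the reference ratio only approximately.

      R_VPDB = 0.01118                        # approximate 13C/12C ratio of the VPDB standard

      def delta13C(r_sample, r_standard=R_VPDB):
          # Delta notation in per mil: d13C = (Rsample / Rstandard - 1) * 1000
          return (r_sample / r_standard - 1.0) * 1000.0

      # Hypothetical 13C/12C ratios measured at two 50 um ablation spots
      for r in (0.01095, 0.01122):
          print(f"R = {r:.5f}  ->  d13C = {delta13C(r):+.1f} per mil")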

  10. Survival Prediction in Pancreatic Ductal Adenocarcinoma by Quantitative Computed Tomography Image Analysis.

    Science.gov (United States)

    Attiyeh, Marc A; Chakraborty, Jayasree; Doussot, Alexandre; Langdon-Embry, Liana; Mainarich, Shiana; Gönen, Mithat; Balachandran, Vinod P; D'Angelica, Michael I; DeMatteo, Ronald P; Jarnagin, William R; Kingham, T Peter; Allen, Peter J; Simpson, Amber L; Do, Richard K

    2018-04-01

    Pancreatic cancer is a highly lethal cancer with no established a priori markers of survival. Existing nomograms rely mainly on post-resection data and are of limited utility in directing surgical management. This study investigated the use of quantitative computed tomography (CT) features to preoperatively assess survival for pancreatic ductal adenocarcinoma (PDAC) patients. A prospectively maintained database identified consecutive chemotherapy-naive patients with CT angiography and resected PDAC between 2009 and 2012. Variation in CT enhancement patterns was extracted from the tumor region using texture analysis, a quantitative image analysis tool previously described in the literature. Two continuous survival models were constructed, with 70% of the data (training set) using Cox regression, first based only on preoperative serum cancer antigen (CA) 19-9 levels and image features (model A), and then on CA19-9, image features, and the Brennan score (composite pathology score; model B). The remaining 30% of the data (test set) were reserved for independent validation. A total of 161 patients were included in the analysis. Training and test sets contained 113 and 48 patients, respectively. Quantitative image features combined with CA19-9 achieved a c-index of 0.69 [integrated Brier score (IBS) 0.224] on the test data, while combining CA19-9, imaging, and the Brennan score achieved a c-index of 0.74 (IBS 0.200) on the test data. We present two continuous survival prediction models for resected PDAC patients. Quantitative analysis of CT texture features is associated with overall survival. Further work includes applying the model to an external dataset to increase the sample size for training and to determine its applicability.
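
    The modelling pattern described (Cox regression on CA19-9 plus texture features, validated by a concordance index on a held-out set) can be sketched as below, assuming the third-party lifelines package; the DataFrame is synthetic and the feature names are placeholders, not the study's variables.

      import numpy as np
      import pandas as pd
      from lifelines import CoxPHFitter
      from lifelines.utils import concordance_index

      rng = np.random.default_rng(0)
      n = 161
      df = pd.DataFrame({
          "ca19_9": rng.lognormal(3.0, 1.0, n),            # preoperative CA19-9 (synthetic)
          "texture_entropy": rng.normal(0.0, 1.0, n),      # CT texture feature (synthetic)
          "texture_contrast": rng.normal(0.0, 1.0, n),     # CT texture feature (synthetic)
          "months": rng.exponential(24.0, n),              # overall survival time
          "event": rng.integers(0, 2, n),                  # 1 = death observed, 0 = censored
      })

      train, test = df.iloc[:113], df.iloc[113:]           # roughly the 70/30 split used in the study

      cph = CoxPHFitter()
      cph.fit(train, duration_col="months", event_col="event")

      # Higher partial hazard means shorter expected survival, hence the minus sign
      risk = cph.predict_partial_hazard(test)
      c_index = concordance_index(test["months"], -risk, test["event"])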

  11. Limitations for qualitative and quantitative neutron activation analysis using reactor neutrons

    International Nuclear Information System (INIS)

    El-Abbady, W.H.; El-Tanahy, Z.H.; El-Hagg, A.A.; Hassan, A.M.

    1999-01-01

    In this work, the most important limitations for qualitative and quantitative analysis using reactor neutrons for activation are reviewed. Each limitation is discussed using different examples of activated samples. Photopeak estimation, nuclear reactions interference and neutron flux measurements are taken into consideration. Solutions for high accuracy evaluation in neutron activation analysis applications are given. (author)

  12. Image analysis

    International Nuclear Information System (INIS)

    Berman, M.; Bischof, L.M.; Breen, E.J.; Peden, G.M.

    1994-01-01

    This paper provides an overview of modern image analysis techniques pertinent to materials science. The usual approach in image analysis contains two basic steps: first, the image is segmented into its constituent components (e.g. individual grains), and second, measurement and quantitative analysis are performed. Usually, the segmentation part of the process is the harder of the two. Consequently, much of the paper concentrates on this aspect, reviewing both fundamental segmentation tools (commonly found in commercial image analysis packages) and more advanced segmentation tools. There is also a review of the most widely used quantitative analysis methods for measuring the size, shape and spatial arrangements of objects. Many of the segmentation and analysis methods are demonstrated using complex real-world examples. Finally, there is a discussion of hardware and software issues. 42 refs., 17 figs
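
    The two-step workflow the review describes (segment first, then measure) can be sketched with scikit-image on one of its bundled sample images; Otsu thresholding merely stands in for whatever segmentation tool a real materials image would require.

      import numpy as np
      from skimage import data, filters, measure

      image = data.coins()                                   # stand-in for a micrograph
      binary = image > filters.threshold_otsu(image)         # step 1: segmentation
      labels = measure.label(binary)

      # step 2: measurement and quantitative analysis of the segmented objects
      props = measure.regionprops(labels)
      areas = np.array([p.area for p in props])
      eccentricities = np.array([p.eccentricity for p in props])
      print(f"{len(props)} objects, mean area {areas.mean():.0f} px, "
            f"mean eccentricity {eccentricities.mean():.2f}")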

  13. Program for the quantitative and qualitative analysis of

    International Nuclear Information System (INIS)

    Tepelea, V.; Purice, E.; Dan, R.; Calcev, G.; Domnisan, M.; Galis, V.; Teodosiu, G.; Debert, C.; Mocanu, N.; Nastase, M.

    1985-01-01

    A computer code for processing of data from neutron activation analysis is described. The code is capable of qualitative and quantitative analysis of regular spectra from neutron irradiated samples, measured by a Ge(Li) detector. Multichannel analysers with 1024 channels, such as the TN 1705 or the Romanian-made MCA 79, and an ITC interface can be used. The code is implemented on FELIX M118 and FELIX M216 microcomputers. Spectrum processing is performed off-line, after storing the data on a floppy disk. The background is assumed to be a polynomial of first, second or third degree. Qualitative analysis is performed by recursive least-squares Gaussian curve fitting. The elements are identified using a polynomial relation between energy and channel, obtained by calibration with a standard sample.
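
    The original FELIX code is not available, but the two core steps described, least-squares Gaussian fitting of a photopeak over a polynomial background and a polynomial energy-channel calibration, can be sketched with SciPy on a synthetic spectrum; the calibration lines and all numbers are hypothetical.

      import numpy as np
      from scipy.optimize import curve_fit

      # Synthetic photopeak on a linear background (counts vs. channel)
      ch = np.arange(400, 460)
      true = 50 + 0.1 * ch + 800 * np.exp(-0.5 * ((ch - 430) / 3.0) ** 2)
      counts = np.random.poisson(true).astype(float)

      def peak_model(x, a0, a1, area, centre, sigma):
          gauss = area / (sigma * np.sqrt(2 * np.pi)) * np.exp(-0.5 * ((x - centre) / sigma) ** 2)
          return a0 + a1 * x + gauss

      popt, pcov = curve_fit(peak_model, ch, counts, p0=[50, 0.1, 5000, 430, 3.0])
      net_area, centre = popt[2], popt[3]            # net peak area is the quantitative measure

      # Energy calibration from standard-sample peaks: polynomial relation energy = f(channel)
      cal_channels = np.array([121.5, 430.2, 861.7])
      cal_energies = np.array([121.8, 411.8, 834.8])   # keV, hypothetical calibration lines
      coeffs = np.polyfit(cal_channels, cal_energies, 1)
      peak_energy_keV = np.polyval(coeffs, centre)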

  14. Positive Analysis of Invasive Species Control as a Dynamic Spatial Process

    OpenAIRE

    Buyuktahtakin, Esra; Feng, Zhuo; Olsson, Aaryn; Frisvold, George B.; Szidarovszky, Ferenc

    2010-01-01

    This paper models control of invasive buffelgrass (Pennisetum ciliare), a fire-prone African bunchgrass spreading rapidly across the southern Arizona desert as a spatial dynamic process. Buffelgrass spreads over a gridded landscape. Weed carrying capacity, treatment costs, and damages vary over grid cells. Damage from buffelgrass depends on its spatial distribution in relation to valued resources. We conduct positive analysis of recommended heuristic strategies for buffelgrass control, evalua...

  15. Disturbed Intracardiac Flow Organization After Atrioventricular Septal Defect Correction as Assessed With 4D Flow Magnetic Resonance Imaging and Quantitative Particle Tracing

    NARCIS (Netherlands)

    Calkoen, Emmeline E.; de Koning, Patrick J. H.; Blom, Nico A.; Kroft, Lucia J. M.; de Roos, Albert; Wolterbeek, Ron; Roest, Arno A. W.; Westenberg, Jos J. M.

    2015-01-01

    Objectives Four-dimensional (3 spatial directions and time) velocity-encoded flow magnetic resonance imaging with quantitative particle tracing analysis allows assessment of left ventricular (LV) blood flow organization. Corrected atrioventricular septal defect (AVSD) patients have an abnormal left

  16. HIGH SPATIAL-RESOLUTION IMAGING OF TE INCLUSIONS IN CZT MATERIAL

    International Nuclear Information System (INIS)

    CAMARDA, G.S.; BOLOTNIKOV, A.E.; CARINI, G.A.; CUI, Y.; KOHMAN, K.T.; LI, L.; JAMES, R.B.

    2006-01-01

    We present new results from our studies of defects in current single-crystal CdZnTe material. Our previous measurements, carried out on thin (∼1 mm) and long (>12 mm) CZT detectors, indicated that small (1-20 μm) Te inclusions can significantly degrade the device's energy resolution and detection efficiency. We are conducting detailed studies of the effects of Te inclusions by employing different characterization techniques with better spatial resolution, such as quantitative fluorescence mapping, X-ray micro-diffraction, and TEM. Also, IR microscopy and gamma-mapping with pulse-shape analysis with higher spatial resolution generated more accurate results in the areas surrounding the micro-defects (Te inclusions). Our results reveal how the performance of CdZnTe detectors is influenced by Te inclusions, such as their spatial distribution, concentration, and size. We also discuss a model of charge transport through areas populated with Te inclusions

  17. Augmentation of Explicit Spatial Configurations by Knowledge-Based Inference on Geometric Fields

    Directory of Open Access Journals (Sweden)

    Dan Tappan

    2009-04-01

    Full Text Available A spatial configuration of a rudimentary, static, real-world scene with known objects (animals) and properties (positions and orientations) contains a wealth of syntactic and semantic spatial information that can contribute to a computational understanding far beyond what its quantitative details alone convey. This work presents an approach that (1) quantitatively represents what a configuration explicitly states, (2) integrates this information with implicit, commonsense background knowledge of its objects and properties, (3) infers additional, contextually appropriate, commonsense spatial information from and about their interrelationships, and (4) augments the original representation with this combined information. A semantic network represents explicit, quantitative information in a configuration. An inheritance-based knowledge base of relevant concepts supplies implicit, qualitative background knowledge to support semantic interpretation. Together, these structures provide a simple, nondeductive, constraint-based, geometric logical formalism to infer substantial implicit knowledge for intrinsic and deictic frames of spatial reference.

  18. Development of quantitative x-ray microtomography

    International Nuclear Information System (INIS)

    Deckman, H.W.; Dunsmuir, J.A.; D'Amico, K.L.; Ferguson, S.R.; Flannery, B.P.

    1990-01-01

    The authors have developed several x-ray microtomography systems which function as quantitative three dimensional x-ray microscopes. In this paper the authors describe the evolutionary path followed from making the first high resolution experimental microscopes to later generations which can be routinely used for investigating materials. Developing the instrumentation for reliable quantitative x-ray microscopy using synchrotron and laboratory based x-ray sources has led to other imaging modalities for obtaining temporal and spatial two dimensional information

  19. Multivariate analysis of quantitative traits can effectively classify rapeseed germplasm

    Directory of Open Access Journals (Sweden)

    Jankulovska Mirjana

    2014-01-01

    Full Text Available In this study, the use of different multivariate approaches to classify rapeseed genotypes based on quantitative traits has been presented. Tree regression analysis, PCA analysis and two-way cluster analysis were applied in order to describe and understand the extent of genetic variability in spring rapeseed genotype by trait data. The traits which highly influenced seed and oil yield in rapeseed were successfully identified by the tree regression analysis. The principal predictor for both response variables was the number of pods per plant (NP). NP and 1000 seed weight could help in the selection of high yielding genotypes. High values for both traits and oil content could lead to high oil yielding genotypes. These traits may serve as indirect selection criteria and can lead to improvement of seed and oil yield in rapeseed. Quantitative traits that explained most of the variability in the studied germplasm were classified using principal component analysis. In this data set, five PCs were identified, out of which the first three PCs explained 63% of the total variance. It helped in facilitating the choice of variables based on which the genotypes’ clustering could be performed. The two-way cluster analysis simultaneously clustered genotypes and quantitative traits. The final number of clusters was determined using a bootstrapping technique. This approach provided a clear overview of the variability of the analyzed genotypes. The genotypes that have similar performance regarding the traits included in this study can be easily detected on the heatmap. Genotypes grouped in the clusters 1 and 8 had high values for seed and oil yield, and relatively short vegetative growth duration period and those in cluster 9, combined moderate to low values for vegetative growth duration and moderate to high seed and oil yield. These genotypes should be further exploited and implemented in the rapeseed breeding program. The combined application of these multivariate methods
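
    The multivariate steps (standardize the genotype-by-trait matrix, extract principal components, cluster the genotypes) can be sketched as below; the trait names and values are synthetic, and agglomerative clustering stands in for the paper's two-way clustering with bootstrapped cluster counts.

      import numpy as np
      import pandas as pd
      from sklearn.preprocessing import StandardScaler
      from sklearn.decomposition import PCA
      from sklearn.cluster import AgglomerativeClustering

      rng = np.random.default_rng(5)
      traits = ["pods_per_plant", "seed_weight_1000", "oil_content", "days_to_maturity", "plant_height"]
      data = pd.DataFrame(rng.normal(size=(30, len(traits))), columns=traits)   # 30 synthetic genotypes

      Z = StandardScaler().fit_transform(data)

      pca = PCA(n_components=3)
      scores = pca.fit_transform(Z)
      print("variance explained:", pca.explained_variance_ratio_.round(2))

      # Cluster genotypes on the standardized traits (two-way clustering would also cluster the traits)
      labels = AgglomerativeClustering(n_clusters=4).fit_predict(Z)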

  20. Spatial Intensity Duration Frequency Relationships Using Hierarchical Bayesian Analysis for Urban Areas

    Science.gov (United States)

    Rupa, Chandra; Mujumdar, Pradeep

    2016-04-01

    In urban areas, quantification of extreme precipitation is important in the design of storm water drains and other infrastructure. Intensity Duration Frequency (IDF) relationships are generally used to obtain design return level for a given duration and return period. Due to lack of availability of extreme precipitation data for sufficiently large number of years, estimating the probability of extreme events is difficult. Typically, a single station data is used to obtain the design return levels for various durations and return periods, which are used in the design of urban infrastructure for the entire city. In an urban setting, the spatial variation of precipitation can be high; the precipitation amounts and patterns often vary within short distances of less than 5 km. Therefore it is crucial to study the uncertainties in the spatial variation of return levels for various durations. In this work, the extreme precipitation is modeled spatially using the Bayesian hierarchical analysis and the spatial variation of return levels is studied. The analysis is carried out with Block Maxima approach for defining the extreme precipitation, using Generalized Extreme Value (GEV) distribution for Bangalore city, Karnataka state, India. Daily data for nineteen stations in and around Bangalore city is considered in the study. The analysis is carried out for summer maxima (March - May), monsoon maxima (June - September) and the annual maxima rainfall. In the hierarchical analysis, the statistical model is specified in three layers. The data layer models the block maxima, pooling the extreme precipitation from all the stations. In the process layer, the latent spatial process characterized by geographical and climatological covariates (lat-lon, elevation, mean temperature etc.) which drives the extreme precipitation is modeled and in the prior level, the prior distributions that govern the latent process are modeled. Markov Chain Monte Carlo (MCMC) algorithm (Metropolis Hastings
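
    The full spatial hierarchical model is beyond a short sketch, but its non-spatial building block, fitting a GEV to block maxima at a single station and reading off return levels, can be illustrated with SciPy; the annual maxima below are synthetic, and note that SciPy's shape parameter c equals minus the conventional GEV shape ξ.

      import numpy as np
      from scipy.stats import genextreme

      # Synthetic annual maximum daily rainfall for one station (mm); real data would replace this
      annual_max = genextreme.rvs(c=-0.1, loc=60.0, scale=15.0, size=40, random_state=7)

      # Fit the GEV by maximum likelihood
      c_hat, loc_hat, scale_hat = genextreme.fit(annual_max)

      # T-year return level = quantile with non-exceedance probability 1 - 1/T
      for T in (10, 50, 100):
          level = genextreme.ppf(1.0 - 1.0 / T, c_hat, loc=loc_hat, scale=scale_hat)
          print(f"{T:>3d}-year return level: {level:.1f} mm")

    In the hierarchical setting, the three GEV parameters would themselves be modelled as latent spatial fields driven by the covariates mentioned in the abstract rather than fitted independently per station.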

  1. Quantitative method of X-ray diffraction phase analysis of building materials

    International Nuclear Information System (INIS)

    Czuba, J.; Dziedzic, A.

    1978-01-01

    Quantitative method of X-ray diffraction phase analysis of building materials, with use of internal standard, has been presented. The errors committed by determining the content of particular phases have been also given. (author)

  2. [Human body meridian spatial decision support system for clinical treatment and teaching of acupuncture and moxibustion].

    Science.gov (United States)

    Wu, Dehua

    2016-01-01

    At present, the spatial position and distribution of human body meridians are expressed only to a limited extent in decision support systems (DSS) for acupuncture and moxibustion, which prevents effective quantitative analysis of their spatial range and makes it difficult to provide decision-makers with a realistic spatial decision environment. Focusing on this limited spatial expression in acupuncture and moxibustion DSS, a design for a human body meridian spatial DSS was proposed on the basis of a geographic information system combined with DSS technology. A 4-layer service-oriented architecture was adopted, and a data center integrated development platform was taken as the system development environment. The spatial data of human body meridians were organized hierarchically via a directory tree. The structured query language (SQL) server was used to achieve unified management of spatial data and attribute data. Architecture, configuration and plug-in development model technologies were integrated to achieve data inquiry, buffer analysis and program evaluation in the human body meridian spatial DSS. The research results show that the human body meridian spatial DSS can realistically reflect the spatial characteristics of the position and distribution of human body meridians and meet the constantly changing demands of users. It has powerful spatial analysis functions and assists with scientific decision-making in the clinical treatment and teaching of acupuncture and moxibustion. It is a new attempt at informatization research on human body meridians.

  3. Light-Addressable Potentiometric Sensors for Quantitative Spatial Imaging of Chemical Species.

    Science.gov (United States)

    Yoshinobu, Tatsuo; Miyamoto, Ko-Ichiro; Werner, Carl Frederik; Poghossian, Arshak; Wagner, Torsten; Schöning, Michael J

    2017-06-12

    A light-addressable potentiometric sensor (LAPS) is a semiconductor-based chemical sensor, in which a measurement site on the sensing surface is defined by illumination. This light addressability can be applied to visualize the spatial distribution of pH or the concentration of a specific chemical species, with potential applications in the fields of chemistry, materials science, biology, and medicine. In this review, the features of this chemical imaging sensor technology are compared with those of other technologies. Instrumentation, principles of operation, and various measurement modes of chemical imaging sensor systems are described. The review discusses and summarizes state-of-the-art technologies, especially with regard to the spatial resolution and measurement speed; for example, a high spatial resolution in a submicron range and a readout speed in the range of several tens of thousands of pixels per second have been achieved with the LAPS. The possibility of combining this technology with microfluidic devices and other potential future developments are discussed.

  4. The Structure of Spatial Ability Items: A Faceted Analysis.

    Science.gov (United States)

    Guttman, Ruth; Shoham, Ilana

    1982-01-01

    Eight spatial tests assembled with a mapping sentence of four content facets (rule type, dimensionality, presence or absence of rotation, and test format) were administered to 800 individuals. Smallest Space Analysis of an intercorrelation matrix yielded three facets which formed distinct regions in a two-dimensional projection of a…

  5. Application of magnetic carriers to two examples of quantitative cell analysis

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Chen; Qian, Zhixi; Choi, Young Suk; David, Allan E. [Department of Chemical Engineering, 212 Ross Hall, Auburn University, Auburn, AL 36849 (United States); Todd, Paul, E-mail: pwtodd@hotmail.com [Techshot, Inc., 7200 Highway 150, Greenville, IN 47124 (United States); Hanley, Thomas R. [Department of Chemical Engineering, 212 Ross Hall, Auburn University, Auburn, AL 36849 (United States)

    2017-04-01

    The use of magnetophoretic mobility as a surrogate for fluorescence intensity in quantitative cell analysis was investigated. The objectives of quantitative fluorescence flow cytometry include establishing a level of labeling for the setting of parameters in fluorescence activated cell sorters (FACS) and the determination of levels of uptake of fluorescently labeled substrates by living cells. Likewise, the objectives of quantitative magnetic cytometry include establishing a level of labeling for the setting of parameters in flowing magnetic cell sorters and the determination of levels of uptake of magnetically labeled substrates by living cells. The magnetic counterpart to fluorescence intensity is magnetophoretic mobility, defined as the velocity imparted to a suspended cell per unit of magnetic ponderomotive force. A commercial velocimeter available for making this measurement was used to demonstrate both applications. Cultured Gallus lymphoma cells were immunolabeled with commercial magnetic beads and shown to have adequate magnetophoretic mobility to be separated by a novel flowing magnetic separator. Phagocytosis of starch nanoparticles having magnetic cores by cultured Chinese hamster ovary cells, a CHO line, was quantified on the basis of magnetophoretic mobility. - Highlights: • Commercial particle tracking velocimetry measures magnetophoretic mobility of labeled cells. • Magnetically labeled tumor cells were shown to have adequate mobility for capture in a specific sorter. • The kinetics of nonspecific endocytosis of magnetic nanomaterials by CHO cells was characterized. • Magnetic labeling of cells can be used like fluorescence flow cytometry for quantitative cell analysis.

  6. Quantitative 3D analysis of bone in hip osteoarthritis using clinical computed tomography.

    Science.gov (United States)

    Turmezei, Tom D; Treece, Graham M; Gee, Andrew H; Fotiadou, Anastasia F; Poole, Kenneth E S

    2016-07-01

    To assess the relationship between proximal femoral cortical bone thickness and radiological hip osteoarthritis using quantitative 3D analysis of clinical computed tomography (CT) data. Image analysis was performed on clinical CT imaging data from 203 female volunteers with a technique called cortical bone mapping (CBM). Colour thickness maps were created for each proximal femur. Statistical parametric mapping was performed to identify statistically significant differences in cortical bone thickness that corresponded with the severity of radiological hip osteoarthritis. Kellgren and Lawrence (K&L) grade, minimum joint space width (JSW) and a novel CT-based osteophyte score were also blindly assessed from the CT data. For each increase in K&L grade, cortical thickness increased by up to 25 % in distinct areas of the superolateral femoral head-neck junction and superior subchondral bone plate. For increasing severity of CT osteophytes, the increase in cortical thickness was more circumferential, involving a wider portion of the head-neck junction, with up to a 7 % increase in cortical thickness per increment in score. Results were not significant for minimum JSW. These findings indicate that quantitative 3D analysis of the proximal femur can identify changes in cortical bone thickness relevant to structural hip osteoarthritis. • CT is being increasingly used to assess bony involvement in osteoarthritis • CBM provides accurate and reliable quantitative analysis of cortical bone thickness • Cortical bone is thicker at the superior femoral head-neck with worse osteoarthritis • Regions of increased thickness co-locate with impingement and osteophyte formation • Quantitative 3D bone analysis could enable clinical disease prediction and therapy development.

  7. [Quantitative Analysis of Heavy Metals in Water with LIBS Based on Signal-to-Background Ratio].

    Science.gov (United States)

    Hu, Li; Zhao, Nan-jing; Liu, Wen-qing; Fang, Li; Zhang, Da-hai; Wang, Yin; Meng, De Shuo; Yu, Yang; Ma, Ming-jun

    2015-07-01

    There are many factors that influence the precision and accuracy of quantitative analysis with LIBS technology. In-depth analysis showed that the background spectrum and the characteristic lines follow approximately the same trend as the plasma conditions change with temperature, so signal-to-background ratio (S/B) measurement combined with regression analysis can compensate for changes in spectral line intensity caused by system parameters such as laser power and the spectral efficiency of the receiving optics. Because the measurement data were limited and nonlinear, support vector machine (SVM) regression was used. The experimental results showed that the method improves the stability and accuracy of quantitative LIBS analysis; the relative standard deviation and average relative error of the test set were 4.7% and 9.5%, respectively. Data fitting based on the signal-to-background ratio (S/B) is less susceptible to matrix elements, background spectrum variations, etc., and provides a data-processing reference for real-time online quantitative LIBS analysis.
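
    As a rough illustration of the regression step described above, the sketch below calibrates concentration against signal-to-background ratio with support vector regression; the synthetic concentrations, noise level and SVR settings are assumptions, not values from the study.

        # Minimal sketch: support vector regression of concentration on the
        # LIBS signal-to-background ratio (synthetic data, assumed ranges).
        import numpy as np
        from sklearn.svm import SVR
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        conc = rng.uniform(1.0, 100.0, size=80)             # mg/L, assumed range
        sb_ratio = 0.05 * conc + rng.normal(0, 0.2, 80)     # assumed S/B trend plus noise

        X_train, X_test, y_train, y_test = train_test_split(
            sb_ratio.reshape(-1, 1), conc, test_size=0.25, random_state=0)

        model = SVR(kernel="rbf", C=100.0, epsilon=0.5).fit(X_train, y_train)
        pred = model.predict(X_test)

        rel_err = np.abs(pred - y_test) / y_test
        print("average relative error: %.1f%%" % (100 * rel_err.mean()))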

  8. Correlative SEM SERS for quantitative analysis of dimer nanoparticles.

    Science.gov (United States)

    Timmermans, F J; Lenferink, A T M; van Wolferen, H A G M; Otto, C

    2016-11-14

    A Raman microscope integrated with a scanning electron microscope was used to investigate plasmonic structures by correlative SEM-SERS analysis. The integrated Raman-SEM microscope combines high-resolution electron microscopy information with SERS signal enhancement from selected nanostructures with adsorbed Raman reporter molecules. Correlative analysis is performed for dimers of two gold nanospheres. Dimers were selected on the basis of SEM images from multi-aggregate samples. The effects of the orientation of the dimer with respect to the polarization state of the laser light and of the particle gap size on the Raman signal intensity are observed. Additionally, calculations are performed to simulate the electric near-field enhancement. These simulations are based on the morphologies observed by electron microscopy. In this way the experiments are compared with the enhancement factor calculated with near-field simulations and are subsequently used to quantify the SERS enhancement factor. Large differences between experimentally observed and calculated enhancement factors are regularly detected, a phenomenon caused by nanoscale differences between the real and 'simplified' simulated structures. Quantitative SERS experiments reveal the structure-induced enhancement factor, ranging from ∼200 to ∼20 000, averaged over the full nanostructure surface. The results demonstrate correlative Raman-SEM microscopy for the quantitative analysis of plasmonic particles and structures, thus enabling a new analytical method in the field of SERS and plasmonics.

  9. Quantitative analysis method for niobium in lead zirconate titanate

    International Nuclear Information System (INIS)

    Hara, Hideo; Hashimoto, Toshio

    1986-01-01

    Lead zirconate titanate (PZT) is a strong dielectric ceramic having piezoelectric and pyroelectric properties, and is the most widely used piezoelectric material. It is also a main component of lead lanthanum zirconate titanate (PLZT), a typical electro-optical conversion element. Since these materials were developed, various electronic parts utilizing their piezoelectric characteristics have been put into practical use. The characteristics can be tuned by changing the composition of PZT and the kinds and amounts of additives. Among the additives, niobium creates metal-ion vacancies in the crystals; the formation of these vacancies eases the movement of domain walls within crystal grains and increases resistivity. Accordingly, it is necessary to determine the niobium content accurately for research and development, quality control and process control. The quantitative analysis methods for niobium used so far each have drawbacks; therefore, the authors examined the quantitative analysis of niobium in PZT using an inductively coupled plasma emission spectrometer, an instrument that has developed remarkably in recent years. As a result, a method was established of dissolving a specimen with hydrochloric acid and hydrofluoric acid, and masking unstable lead with disodium ethylenediaminetetraacetate and fluoride ions with boric acid. The apparatus, reagents, the experiment and the results are reported. (Kako, I.)

  10. Spatial Analysis of Depots for Advanced Biomass Processing

    Energy Technology Data Exchange (ETDEWEB)

    Hilliard, Michael R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Brandt, Craig C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Webb, Erin [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Sokhansanj, Shahabaddine [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Eaton, Laurence M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Martinez Gonzalez, Maria I. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2014-09-01

    The objective of this work was to perform a spatial analysis of the total feedstock cost at the conversion reactor for biomass supplied by a conventional system and an advanced system with depots to densify biomass into pellets. From these cost estimates, the conditions (feedstock cost and availability) for which advanced processing depots make it possible to achieve cost and volume targets can be identified.

  11. Quality Assessments of Long-Term Quantitative Proteomic Analysis of Breast Cancer Xenograft Tissues

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Jian-Ying; Chen, Lijun; Zhang, Bai; Tian, Yuan; Liu, Tao; Thomas, Stefani N.; Chen, Li; Schnaubelt, Michael; Boja, Emily; Hiltket, Tara; Kinsinger, Christopher; Rodriguez, Henry; Davies, Sherri; Li, Shunqiang; Snider, Jacqueline E.; Erdmann-Gilmore, Petra; Tabb, David L.; Townsend, Reid; Ellis, Matthew; Rodland, Karin D.; Smith, Richard D.; Carr, Steven A.; Zhang, Zhen; Chan, Daniel W.; Zhang, Hui

    2017-09-21

    The identification of protein biomarkers requires large-scale analysis of human specimens to achieve statistical significance. In this study, we evaluated the long-term reproducibility of an iTRAQ (isobaric tags for relative and absolute quantification) based quantitative proteomics strategy using one channel for universal normalization across all samples. A total of 307 liquid chromatography tandem mass spectrometric (LC-MS/MS) analyses were completed, generating 107 one-dimensional (1D) LC-MS/MS datasets and 8 offline two-dimensional (2D) LC-MS/MS datasets (25 fractions for each set) for human-in-mouse breast cancer xenograft tissues representative of basal and luminal subtypes. Such large-scale studies require the implementation of robust metrics to assess the contributions of technical and biological variability in the qualitative and quantitative data. Accordingly, we developed a quantification confidence score based on the quality of each peptide-spectrum match (PSM) to remove quantification outliers from each analysis. After combining confidence score filtering and statistical analysis, reproducible protein identification and quantitative results were achieved from LC-MS/MS datasets collected over a 16 month period.

  12. A resilience-oriented approach for quantitatively assessing recurrent spatial-temporal congestion on urban roads.

    Directory of Open Access Journals (Sweden)

    Junqing Tang

    Full Text Available Traffic congestion brings not only delay and inconvenience but also other associated national concerns, such as greenhouse gases, air pollutants, road safety issues and risks. Identification, measurement, tracking, and control of urban recurrent congestion are vital for building a livable and smart community. A considerable amount of work has contributed to tackling the problem. Several methods, such as time-based approaches and level of service, can be effective for characterizing congestion on urban streets. However, few congestion-quantification studies have taken a systemic perspective. Resilience, on the other hand, is an emerging concept that focuses on comprehensive systemic performance and characterizes the ability of a system to cope with disturbance and to recover its functionality. In this paper, we treated recurrent congestion as an internal disturbance and proposed a modified metric inspired by the well-applied "R4" resilience-triangle framework. We constructed the metric with generic dimensions from both resilience engineering and transport science to quantify recurrent congestion based on spatial-temporal traffic patterns, and compared it with two other approaches in freeway and signal-controlled arterial cases. Results showed that the metric can effectively capture congestion patterns in the study area and provides a quantitative benchmark for comparison. The comparison also suggested not only good measurement performance for the proposed metric but also its capability to account for the discharging process of congestion. The sensitivity tests showed that the proposed metric is robust against parameter perturbation within the Robustness Range (RR), but the number of identified congestion patterns can be influenced by the existence of ϵ. In addition, the Elasticity Threshold (ET) and the spatial dimension of the cell-based platform significantly affect the congestion results, both in the detected number and
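
    The core of a resilience-triangle style measure is the area between nominal and degraded performance over a congestion event; the sketch below shows only that idea on a synthetic speed profile and is not the paper's modified R4-based metric with its additional dimensions.

        # Area between free-flow performance and an observed speed dip,
        # normalised to a 0-1 resilience index (illustrative values only).
        import numpy as np

        dt = 5.0                                             # minutes between observations
        t = np.arange(0, 120, dt)
        free_flow = 60.0                                     # km/h, assumed nominal level
        speed = free_flow - 35.0 * np.exp(-((t - 45) / 25.0) ** 2)   # synthetic congestion dip

        loss = np.sum(np.clip(free_flow - speed, 0.0, None)) * dt    # km/h * min lost
        resilience = 1.0 - loss / (free_flow * (t[-1] - t[0]))
        print(f"performance loss area: {loss:.0f}, resilience index: {resilience:.3f}")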

  13. Brain regions associated with cognitive impairment in patients with Parkinson disease: quantitative analysis of cerebral blood flow using 123I iodoamphetamine SPECT.

    Science.gov (United States)

    Hattori, Naoya; Yabe, Ichiro; Hirata, Kenji; Shiga, Tohru; Sakushima, Ken; Tsuji-Akimoto, Sachiko; Sasaki, Hidenao; Tamaki, Nagara

    2013-05-01

    Cognitive impairment is a representative neuropsychiatric presentation that accompanies Parkinson disease (PD). The purpose of this study was to localize the cerebral regions associated with cognitive impairment in patients with PD using quantitative SPECT. Thirty-two patients with PD (mean [SD] age, 75 [8] years; 25 women; Hoehn-Yahr scores from 2 to 5) underwent quantitative brain SPECT using 123I iodoamphetamine. Parametric images of regional cerebral blood flow (rCBF) were spatially normalized to the standard brain atlas. First, voxel-by-voxel comparison between patients with PD with versus without cognitive impairment was performed to visualize the overall trend of regional differences. Next, the individual quantitative rCBF values were extracted in representative cortical regions using a standard region-of-interest template to compare the quantitative rCBF values. Patients with cognitive impairment showed trends of lower rCBF in the left frontal and temporal cortices as well as in the bilateral medial frontal and anterior cingulate cortices in the voxel-by-voxel analyses. Region-of-interest-based analysis demonstrated significantly lower rCBF in the bilateral anterior cingulate cortices (right, 25.8 [5.5] vs 28.9 [5.7] mL per 100 g/min; left, 25.8 [5.8] vs 29.1 [5.7] mL per 100 g/min), with lower rCBF also observed in the left frontal and temporal cortices as well as in the bilateral medial frontal and anterior cingulate cortices. The results suggested dysexecutive function as an underlying mechanism of cognitive impairment in patients with PD.

  14. Absorption correction factor in X-ray fluorescent quantitative analysis

    International Nuclear Information System (INIS)

    Pimjun, S.

    1994-01-01

    An experiment on the absorption correction factor in X-ray fluorescence quantitative analysis was carried out. Standard samples were prepared from mixtures of Fe2O3 and tapioca flour at various concentrations of Fe2O3 ranging from 5% to 25%. Unknown samples were kaolin containing 3.5% to 50% Fe2O3. Kaolin samples were diluted with tapioca flour in order to reduce the absorption of FeKα and make them easier to prepare. Pressed samples, 0.150 /cm2 and 2.76 cm in diameter, were used in the experiment. The absorption correction factor is related to the total mass absorption coefficient (χ), which varies with sample composition. For a known sample, χ can be conveniently calculated by formula; in an unknown sample, however, χ can be determined by the emission-transmission method. It was found that the relationship between the corrected FeKα intensity and the content of Fe2O3 in these samples was linear. This result indicates that the correction factor can be used to improve the accuracy of the measured X-ray intensity. Therefore, this correction factor is essential in the quantitative analysis of the elements in any sample by the X-ray fluorescence technique

  15. Quantitative analysis of the secretion of the MCP family of chemokines by muscle cells

    DEFF Research Database (Denmark)

    Henningsen, Jeanette; Pedersen, Bente Klarlund; Kratchmarova, Irina

    2011-01-01

    The Stable Isotope Labeling by Amino acids in Cell culture (SILAC) method for quantitative analysis resulted in the identification and generation of quantitative profiles of 59 growth factors and cytokines, including 9 classical chemokines. The members of the CC chemokine family of proteins such as monocyte chemotactic proteins 1, 2...

  16. Qualitative and quantitative reliability analysis of safety systems

    International Nuclear Information System (INIS)

    Karimi, R.; Rasmussen, N.; Wolf, L.

    1980-05-01

    A code has been developed for the comprehensive analysis of a fault tree. The code, designated UNRAC (UNReliability Analysis Code), calculates the following characteristics of an input fault tree: (1) minimal cut sets; (2) top event unavailability as a point estimate and/or in time-dependent form; (3) quantitative importance of each component involved; and (4) error bounds on the top event unavailability. UNRAC can analyze fault trees with any kind of gates (EOR, NAND, NOR, AND, OR), up to a maximum of 250 components and/or gates. The code was benchmarked against WAMCUT, MODCUT, KITT, BIT-FRANTIC, and PL-MODT. The results showed that UNRAC produces results more consistent with the KITT results than either BIT-FRANTIC or PL-MODT. Overall it is demonstrated that UNRAC is an efficient, easy-to-use code with the advantage of being able to do a complete fault tree analysis with this single code. Applications of fault tree analysis to safety studies of nuclear reactors are considered
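
    For readers unfamiliar with the quantities listed above, the sketch below shows how a point-estimate top-event unavailability and a simple component importance can be computed from minimal cut sets; the cut sets and component unavailabilities are illustrative and not taken from UNRAC or its benchmarks.

        # Top-event unavailability from minimal cut sets: min-cut upper bound
        # and rare-event approximation, plus a Fussell-Vesely style importance.
        from math import prod

        q = {"pump_A": 1e-3, "pump_B": 1e-3, "valve": 5e-4, "power": 2e-4}
        min_cut_sets = [{"pump_A", "pump_B"}, {"valve"}, {"pump_A", "power"}]

        cut_q = [prod(q[c] for c in cs) for cs in min_cut_sets]
        top_upper = 1.0 - prod(1.0 - x for x in cut_q)       # min-cut upper bound
        top_rare = sum(cut_q)                                # rare-event approximation
        print(top_upper, top_rare)

        # Share of top-event unavailability carried by cut sets containing each component.
        for comp in q:
            fv = sum(x for cs, x in zip(min_cut_sets, cut_q) if comp in cs) / top_rare
            print(comp, round(fv, 3))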

  17. Mapcurves: a quantitative method for comparing categorical maps.

    Science.gov (United States)

    William W. Hargrove; M. Hoffman Forrest; Paul F. Hessburg

    2006-01-01

    We present Mapcurves, a quantitative goodness-of-fit (GOF) method that unambiguously shows the degree of spatial concordance between two or more categorical maps. Mapcurves graphically and quantitatively evaluate the degree of fit among any number of maps and quantify a GOF for each polygon, as well as the entire map. The Mapcurve method indicates a perfect fit even if...
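
    A toy version of a Mapcurves-style score is sketched below, assuming the goodness-of-fit for a category in map A sums, over the categories of map B, the product of the shared fraction in each map; see Hargrove et al. (2006) for the definitive formulation.

        # Mapcurves-style GOF between two categorical rasters held as integer arrays.
        import numpy as np

        map_a = np.random.default_rng(1).integers(0, 4, size=(100, 100))
        map_b = np.roll(map_a, shift=3, axis=0)              # a shifted copy stands in for map B

        def gof(a, b, cat):
            in_a = (a == cat)
            score = 0.0
            for cat_b in np.unique(b):
                shared = np.count_nonzero(in_a & (b == cat_b))
                if shared:
                    score += (shared / np.count_nonzero(b == cat_b)) * (shared / in_a.sum())
            return score

        for cat in np.unique(map_a):
            print(cat, round(gof(map_a, map_b, cat), 3))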

  18. Evidence for fish dispersal from spatial analysis of stream network topology

    Science.gov (United States)

    Hitt, N.P.; Angermeier, P.L.

    2008-01-01

    Developing spatially explicit conservation strategies for stream fishes requires an understanding of the spatial structure of dispersal within stream networks. We explored spatial patterns of stream fish dispersal by evaluating how the size and proximity of connected streams (i.e., stream network topology) explained variation in fish assemblage structure and how this relationship varied with local stream size. We used data from the US Environmental Protection Agency's Environmental Monitoring and Assessment Program in wadeable streams of the Mid-Atlantic Highlands region (n = 308 sites). We quantified stream network topology with a continuous analysis based on the rate of downstream flow accumulation from sites and with a discrete analysis based on the presence of mainstem river confluences (i.e., basin area >250 km2) within 20 fluvial km (fkm) from sites. Continuous variation in stream network topology was related to local species richness within a distance of ~10 fkm, suggesting an influence of fish dispersal within this spatial grain. This effect was explained largely by catostomid species, cyprinid species, and riverine species, but was not explained by zoogeographic regions, ecoregions, sampling period, or spatial autocorrelation. Sites near mainstem river confluences supported greater species richness and abundance of catostomid, cyprinid, and ictalurid fishes than did sites >20 fkm from such confluences. Assemblages at sites on the smallest streams were not related to stream network topology, consistent with the hypothesis that local stream size regulates the influence of regional dispersal. These results demonstrate that the size and proximity of connected streams influence the spatial distribution of fish and suggest that these influences can be incorporated into the designs of stream bioassessments and reserves to enhance management efficacy. © 2008 by The North American Benthological Society.

  19. Quantitative analysis of regional myocardial performance in coronary artery disease

    Science.gov (United States)

    Stewart, D. K.; Dodge, H. T.; Frimer, M.

    1975-01-01

    Findings from a group of subjects with significant coronary artery stenosis and from a group of controls are presented, obtained with a quantitative method for the study of regional myocardial performance based on frame-by-frame analysis of biplane left ventricular angiograms. Particular emphasis was placed upon the analysis of wall motion in terms of normalized segment dimensions and the timing and velocity of contraction. The results were compared with the subjective assessment method used clinically.

  20. Visualization and quantitative analysis of the CSF pulsatile flow with cine MR phase imaging

    International Nuclear Information System (INIS)

    Katayama, Shinji; Itoh, Takahiko; Kinugasa, Kazushi; Asari, Shoji; Nishimoto, Akira; Tsuchida, Shohei; Ono, Atsushi; Ikezaki, Yoshikazu; Yoshitome, Eiji.

    1991-01-01

    The visualization and quantitative analysis of the CSF pulsatile flow were performed on ten healthy volunteers with cine MR phase imaging, a combination of the phase-contrast technique and the cardiac-gating technique. The velocities appropriate for the visualization and quantitative analysis of the CSF pulsatile flow ranged from 6.0 cm/sec to 15.0 cm/sec. The applicability of this method for quantitative analysis was proven with a steady-flow phantom. Phase images clearly demonstrated a to-and-fro motion of the CSF flow in the anterior subarachnoid space and in the posterior subarachnoid space. The flow pattern of CSF in healthy volunteers depends on the cardiac cycle. In the anterior subarachnoid space, the cephalic CSF flow continued until 70 msec after the R-wave of the ECG and then reversed to caudal. At 130-190 msec, the caudal CSF flow reached its maximum velocity; thereafter it reversed again to cephalic. The same pattern appeared in the following phase, but with decreased amplitude. The cephalic flow peaked at 370-430 msec, while the caudal flow peaked at 490-550 msec. The flow pattern of the CSF flow in the posterior subarachnoid space was almost identical to that in the anterior subarachnoid space. Cine MR phase imaging is thus useful for the visualization and quantitative analysis of the CSF pulsatile flow. (author)

  1. Quantitative tomography of hydrogen precharged and uncharged Al-Zn-Mg-Cu alloy after tensile fracture

    Energy Technology Data Exchange (ETDEWEB)

    Gupta, C., E-mail: joy_gupta71@yahoo.co.in [Department of Mechanical Engineering, Toyohashi University of Technology, Toyohashi, Aichi 441-8150 (Japan); Toda, H.; Fujioka, T.; Kobayashi, M. [Department of Mechanical Engineering, Toyohashi University of Technology, Toyohashi, Aichi 441-8150 (Japan); Hoshino, H. [Department of Mechanical Engineering, Toyohashi University of Technology, Toyohashi, Aichi 441-8150 (Japan); Japan Synchrotron Radiation Institute, Sayo-Gun, Hyogo (Japan); Uesugi, K.; Takeuchi, A.; Suzuki, Y. [Japan Synchrotron Radiation Institute, Sayo-Gun, Hyogo (Japan)

    2016-07-18

    Quantitative tomography is carried out on datasets derived from a tensile fracture sample of electrochemically precharged Al-Zn-Mg-Cu alloy in the underaged condition and its uncharged counterpart. It is shown that precharging, which induces a transition of the tensile fracture mode from ductile to brittle, results in a significant increase in micro-damage content in the regions near the fracture surfaces. Using quantitative tomography analysis based on spatial mapping of the morphologically segmented micro-damage content of the datasets, it is found that the precharged sample contains an inhomogeneous distribution of micro-pores near grain boundaries. It is also shown that the spatial architecture of micro-pores in the dataset is not influenced by the plastic zone of the intergranular cracks lying along the grain boundaries. In contrast, the micro-pores in the tomographic dataset of the uncharged sample are shown to be present near intermetallic particles. It is therefore rationalized that the spatial architecture of micro-pores originates from particle cracking during ductile fracture in the uncharged sample dataset, and from the tendency for damage enhancement by the synergism of hydrogen exposure near grain boundaries and localization of deformation in the precharged sample dataset.

  2. Quantitative electron microscope autoradiography: application of multiple linear regression analysis

    International Nuclear Information System (INIS)

    Markov, D.V.

    1986-01-01

    A new method for the analysis of high resolution EM autoradiographs is described. It identifies labelled cell organelle profiles in sections on a strictly statistical basis and provides accurate estimates for their radioactivity without the need to make any assumptions about their size, shape and spatial arrangement. (author)

  3. Variable selection based near infrared spectroscopy quantitative and qualitative analysis on wheat wet gluten

    Science.gov (United States)

    Lü, Chengxu; Jiang, Xunpeng; Zhou, Xingfan; Zhang, Yinqiao; Zhang, Naiqian; Wei, Chongfeng; Mao, Wenhua

    2017-10-01

    Wet gluten is a useful quality indicator for wheat, and short-wave near infrared spectroscopy (NIRS) is a high-performance technique with the advantages of being economical, rapid and nondestructive. To study the feasibility of short-wave NIRS for analyzing wet gluten directly from wheat seed, 54 representative wheat seed samples were collected and scanned by spectrometer. Eight spectral pretreatment methods and a genetic algorithm (GA) variable selection method were used to optimize the analysis. Both quantitative and qualitative models of wet gluten were built by partial least squares regression and discriminant analysis. For quantitative analysis, normalization is the optimal pretreatment method; 17 wet-gluten-sensitive variables are selected by GA, and the GA model performs better than the all-variable model, with R2V = 0.88 and RMSEV = 1.47. For qualitative analysis, automatic weighted least squares baseline correction is the optimal pretreatment method, and the all-variable models perform better than the GA models. The correct classification rates for the 3 classes of 30% wet gluten content are 95.45%, 84.52% and 90.00%, respectively. The short-wave NIRS technique shows potential for both quantitative and qualitative analysis of wet gluten in wheat seed.
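
    A minimal sketch of the quantitative branch described above (normalisation followed by a partial least squares calibration) is given below; the spectra, reference values and number of PLS components are synthetic assumptions, and the GA variable selection step is omitted.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_predict

        rng = np.random.default_rng(2)
        spectra = rng.normal(size=(54, 250))                      # 250 assumed wavelengths
        gluten = 25 + 3 * spectra[:, 40] + rng.normal(0, 1, 54)   # synthetic reference values, %

        # vector normalisation of each spectrum (one possible pretreatment)
        spectra = spectra / np.linalg.norm(spectra, axis=1, keepdims=True)

        pls = PLSRegression(n_components=5)
        pred = cross_val_predict(pls, spectra, gluten, cv=6).ravel()
        rmsecv = np.sqrt(np.mean((pred - gluten) ** 2))
        r2 = np.corrcoef(pred, gluten)[0, 1] ** 2
        print(f"RMSECV = {rmsecv:.2f}, R2 = {r2:.2f}")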

  4. Scientific aspects of urolithiasis: quantitative stone analysis and crystallization experiments

    International Nuclear Information System (INIS)

    Wandt, M.A.E.

    1986-03-01

    The theory, development and results of three quantitative analytical procedures are described and the crystallization experiments in a rotary evaporator are presented. Of the different methods of quantitative X-ray powder diffraction analyses, the 'internal standard method' and a microanalytical technique were identified as the two most useful procedures for the quantitative analysis of urinary calculi. 'Reference intensity ratios' for 6 major stone phases were determined and were used in the analysis of 20 calculi by the 'internal standard method'. Inductively coupled plasma atomic emission spectroscopic (ICP-AES) methods were also investigated, developed and used in this study. Various procedures for the digestion of calculi were tested and a mixture of HNO 3 and HC1O 4 was eventually found to be the most successful. The major elements Ca, Mg, and P in 41 calculi were determined. For the determination of trace elements, a new microwave-assisted digestion procedure was developed and used for the digestion of 100 calculi. Fluoride concentrations in two stone collections were determined using a fluoride-ion sensitive electrode and the HNO 3 /HC1O 4 digestion prodecure used for the ICP study. A series of crystallization experiments involving a standard reference artificial urine was carried out in a rotary evaporator. The effect of pH and urine composition was studied by varying the former and by including uric acid, urea, creatinine, MgO, methylene blue, chondroitin sulphate A, and fluoride in the reference solution. Crystals formed in these experiments were subjected to qualitative and semi-quantitative X-ray powder diffraction analyses. Scanning electron microscopy of several deposits was also carried out. Similar deposits to those observed in calculi were obtained with the fast evaporator. The results presented suggest that this system provides a simple, yet very useful means for studying the crystallization characteristics of urine solutions

  5. Quantitative analysis of food and feed samples with droplet digital PCR.

    Directory of Open Access Journals (Sweden)

    Dany Morisset

    Full Text Available In this study, the applicability of droplet digital PCR (ddPCR) for routine analysis in food and feed samples was demonstrated with the quantification of genetically modified organisms (GMOs). Real-time quantitative polymerase chain reaction (qPCR) is currently used for quantitative molecular analysis of the presence of GMOs in products. However, its use is limited for detecting and quantifying very small numbers of DNA targets, as in some complex food and feed matrices. Using a ddPCR duplex assay, we have measured the absolute numbers of MON810 transgene and hmg maize reference gene copies in DNA samples. Key performance parameters of the assay were determined. The ddPCR system is shown to offer precise absolute and relative quantification of targets, without the need for calibration curves. The sensitivity (five target DNA copies) of the ddPCR assay compares well with those of individual qPCR assays and of the chamber digital PCR (cdPCR) approach. It offers a dynamic range over four orders of magnitude, greater than that of cdPCR. Moreover, when compared to qPCR, the ddPCR assay showed better repeatability at low target concentrations and a greater tolerance to inhibitors. Finally, ddPCR throughput and cost are advantageous relative to those of qPCR for routine GMO quantification. It is thus concluded that ddPCR technology can be applied for routine quantification of GMOs, or any other domain where quantitative analysis of food and feed samples is needed.
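
    The Poisson counting that underlies droplet digital quantification can be sketched as follows; the droplet counts and the per-droplet volume are assumptions for illustration, not values from the study.

        # Copies per reaction from the fraction of positive droplets, and the
        # transgene/reference ratio used for GMO quantification.
        import math

        def copies(positive, total, droplet_volume_ul=0.00085):    # ~0.85 nL per droplet, assumed
            lam = -math.log(1.0 - positive / total)                # mean target copies per droplet
            return lam * total, lam / droplet_volume_ul            # total copies, copies per microlitre

        mon810 = copies(positive=620, total=15000)
        hmg = copies(positive=11800, total=15000)
        print("MON810: %.0f copies, hmg: %.0f copies, ratio: %.3f"
              % (mon810[0], hmg[0], mon810[0] / hmg[0]))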

  6. Spatial correlation analysis of urban traffic state under a perspective of community detection

    Science.gov (United States)

    Yang, Yanfang; Cao, Jiandong; Qin, Yong; Jia, Limin; Dong, Honghui; Zhang, Aomuhan

    2018-05-01

    Understanding the spatial correlation of urban traffic state is essential for identifying the evolution patterns of urban traffic state. However, the distribution of traffic state always has characteristics of large spatial span and heterogeneity. This paper adapts the concept of community detection to the correlation network of urban traffic state and proposes a new perspective to identify the spatial correlation patterns of traffic state. In the proposed urban traffic network, the nodes represent road segments, and an edge between a pair of nodes is added depending on the result of significance test for the corresponding correlation of traffic state. Further, the process of community detection in the urban traffic network (named GWPA-K-means) is applied to analyze the spatial dependency of traffic state. The proposed method extends the traditional K-means algorithm in two steps: (i) redefines the initial cluster centers by two properties of nodes (the GWPA value and the minimum shortest path length); (ii) utilizes the weight signal propagation process to transfer the topological information of the urban traffic network into a node similarity matrix. Finally, numerical experiments are conducted on a simple network and a real urban road network in Beijing. The results show that GWPA-K-means algorithm is valid in spatial correlation analysis of traffic state. The network science and community structure analysis perform well in describing the spatial heterogeneity of traffic state on a large spatial scale.
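
    The network-construction step described above can be sketched as follows; the speed series are synthetic, the significance threshold is an assumption, and a generic modularity-based community method stands in for the paper's GWPA-K-means.

        import numpy as np
        import networkx as nx
        from scipy.stats import pearsonr
        from networkx.algorithms.community import greedy_modularity_communities

        rng = np.random.default_rng(3)
        speeds = rng.normal(size=(30, 288))                  # 30 segments x 288 five-minute slots
        speeds[:15] += np.sin(np.linspace(0, 6.28, 288))     # give half the segments a shared pattern

        g = nx.Graph()
        g.add_nodes_from(range(len(speeds)))
        for i in range(len(speeds)):
            for j in range(i + 1, len(speeds)):
                r, p = pearsonr(speeds[i], speeds[j])
                if p < 0.01 and r > 0:                       # edge only if the correlation is significant
                    g.add_edge(i, j, weight=r)

        for k, community in enumerate(greedy_modularity_communities(g)):
            print(f"community {k}: {sorted(community)}")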

  7. A ubiquitous method for street scale spatial data collection and analysis in challenging urban environments: mapping health risks using spatial video in Haiti.

    Science.gov (United States)

    Curtis, Andrew; Blackburn, Jason K; Widmer, Jocelyn M; Morris, J Glenn

    2013-04-15

    Fine-scale and longitudinal geospatial analysis of health risks in challenging urban areas is often limited by the lack of other spatial layers even if case data are available. Underlying population counts, residential context, and associated causative factors such as standing water or trash locations are often missing unless collected through logistically difficult, and often expensive, surveys. The lack of spatial context also hinders the interpretation of results and the design of intervention strategies structured around analytical insights. This paper offers a ubiquitous spatial data collection approach using spatial video that can be used to improve analysis and involve participatory collaborations. A case study is used to illustrate this approach, with three health risks mapped at the street scale for a coastal community in Haiti. Spatial video was used to collect street- and building-scale information, including standing water, trash accumulation, presence of dogs, cohort-specific population characteristics, and other cultural phenomena. These data were digitized into Google Earth and then coded and analyzed in a GIS using kernel density and spatial filtering approaches. The concentrations of these risks around area schools, which are sometimes sources of diarrheal disease infection because of the high concentration of children and variable sanitary practices, show the utility of the method. In addition, schools offer potential locations for cholera education interventions. Previously unavailable fine-scale health risk data vary in concentration across the town, with some schools being proximate to greater concentrations of the mapped risks. The spatial video is also used to validate coded data and location-specific risks within these "hotspots". Spatial video is a tool that can be used in any environment to improve local area health analysis and intervention. The process is rapid and can be repeated in study sites through time to track spatio
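
    The kernel-density step mentioned above can be sketched as follows for digitised risk points; the coordinates and the school location are hypothetical stand-ins for the Google Earth data.

        import numpy as np
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(8)
        risk_points = rng.normal(loc=[500.0, 500.0], scale=60.0, size=(200, 2))   # metres, synthetic

        kde = gaussian_kde(risk_points.T)                    # density surface over the risk points
        school = np.array([[520.0], [480.0]])                # hypothetical school location
        print(f"relative risk density at the school: {kde(school)[0]:.2e}")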

  8. Risk prediction, safety analysis and quantitative probability methods - a caveat

    International Nuclear Information System (INIS)

    Critchley, O.H.

    1976-01-01

    Views are expressed on the use of quantitative techniques for the determination of value judgements in nuclear safety assessments, hazard evaluation, and risk prediction. Caution is urged when attempts are made to quantify value judgements in the field of nuclear safety. Criteria are given for the meaningful application of reliability methods, but doubts are expressed about their application to safety analysis, risk prediction and design guidance for experimental or prototype plant. Doubts are also expressed about some concomitant methods of population dose evaluation. The complexities of new designs of nuclear power plants make the problem of safety assessment more difficult, but some possible approaches are suggested as alternatives to the quantitative techniques criticized. (U.K.)

  9. Examination of quantitative accuracy of PIXE analysis for atmospheric aerosol particle samples. PIXE analysis of NIST air particulate on filter media

    International Nuclear Information System (INIS)

    Saitoh, Katsumi; Sera, Koichiro

    2005-01-01

    In order to confirm accuracy of the direct analysis of filter samples containing atmospheric aerosol particles collected on a polycarbonate membrane filter by PIXE, we carried out PIXE analysis on a National Institute of Standards and Technology (NIST, USA) air particulate on filter media (SRM 2783). For 16 elements with NIST certified values determined by PIXE analysis - Na, Mg, Al, Si, S, K, Ca, Ti, V, Cr, Mn, Fe, Ni, Cu, Zn and Pb - quantitative values were 80-110% relative to NIST certified values except for Na, Al, Si and Ni. Quantitative values of Na, Al and Si were 140-170% relative to NIST certified values, which were all high, and Ni was 64%. One possible reason why the quantitative values of Na, Al and Si were higher than the NIST certified values could be the difference in the X-ray spectrum analysis method used. (author)

  10. QuASAR: quantitative allele-specific analysis of reads.

    Science.gov (United States)

    Harvey, Chris T; Moyerbrailean, Gregory A; Davis, Gordon O; Wen, Xiaoquan; Luca, Francesca; Pique-Regi, Roger

    2015-04-15

    Expression quantitative trait loci (eQTL) studies have discovered thousands of genetic variants that regulate gene expression, enabling a better understanding of the functional role of non-coding sequences. However, eQTL studies are costly, requiring large sample sizes and genome-wide genotyping of each sample. In contrast, analysis of allele-specific expression (ASE) is becoming a popular approach to detect the effect of genetic variation on gene expression, even within a single individual. This is typically achieved by counting the number of RNA-seq reads matching each allele at heterozygous sites and testing the null hypothesis of a 1:1 allelic ratio. In principle, when genotype information is not readily available, it could be inferred from the RNA-seq reads directly. However, there are currently no existing methods that jointly infer genotypes and conduct ASE inference, while considering uncertainty in the genotype calls. We present QuASAR, quantitative allele-specific analysis of reads, a novel statistical learning method for jointly detecting heterozygous genotypes and inferring ASE. The proposed ASE inference step takes into consideration the uncertainty in the genotype calls, while including parameters that model base-call errors in sequencing and allelic over-dispersion. We validated our method with experimental data for which high-quality genotypes are available. Results for an additional dataset with multiple replicates at different sequencing depths demonstrate that QuASAR is a powerful tool for ASE analysis when genotypes are not available. http://github.com/piquelab/QuASAR. fluca@wayne.edu or rpique@wayne.edu Supplementary Material is available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
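
    The basic read-count test described above (not QuASAR itself, which additionally models genotype uncertainty, base-call error and over-dispersion) can be sketched with a plain binomial test; the counts are illustrative, and scipy >= 1.7 is assumed for binomtest.

        from scipy.stats import binomtest

        ref_reads, alt_reads = 63, 37                        # assumed counts at one heterozygous site
        result = binomtest(ref_reads, ref_reads + alt_reads, p=0.5)   # test the 1:1 null
        print(f"allelic ratio = {ref_reads / (ref_reads + alt_reads):.2f}, p-value = {result.pvalue:.3g}")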

  11. Quantitative analysis of the epithelial lining architecture in radicular cysts and odontogenic keratocysts

    Directory of Open Access Journals (Sweden)

    Landini Gabriel

    2006-02-01

    Full Text Available Abstract Background This paper describes a quantitative analysis of the cyst lining architecture in radicular cysts (of inflammatory aetiology) and odontogenic keratocysts (thought to be developmental or neoplastic), including the latter's 2 counterparts: solitary and associated with the Basal Cell Naevus Syndrome (BCNS). Methods Epithelial linings from 150 images (from 9 radicular cysts, 13 solitary keratocysts and 8 BCNS keratocysts) were segmented into theoretical cells using a semi-automated partition based on the intensity of the haematoxylin stain, which defined exclusive areas relative to each detected nucleus. Various morphometrical parameters were extracted from these "cells" and epithelial layer membership was computed using a systematic clustering routine. Results Statistically significant differences were observed across the 3 cyst types at both the morphological and architectural levels of the lining. Case-wise discrimination between radicular cysts and keratocysts was highly accurate (with an error of just 3.3%). However, the odontogenic keratocyst subtypes could not be reliably separated into the original classes, achieving discrimination rates only slightly above random allocation (60%). Conclusion The methodology presented is able to provide new measures of epithelial architecture and may help to characterise and compare tissue spatial organisation, as well as provide useful procedures for automating certain aspects of histopathological diagnosis.

  12. Quantitative Surface Analysis by Xps (X-Ray Photoelectron Spectroscopy: Application to Hydrotreating Catalysts

    Directory of Open Access Journals (Sweden)

    Beccat P.

    1999-07-01

    Full Text Available XPS is an ideal technique for providing the chemical composition of the extreme surface of solid materials, and it is widely applied to the study of catalysts. In this article, we show that a quantitative approach, based upon the fundamental expression of the XPS signal, has enabled us to obtain a consistent set of response factors for the elements of the periodic table. In-depth preliminary work was necessary to determine precisely the transmission function of the spectrometer used at IFP. The set of response factors obtained enables routine quantitative analysis with approximately 20% relative accuracy, which is quite acceptable for an analysis of this nature. Using this quantitative approach, we have developed an analytical method specific to hydrotreating catalysts that yields the sulphiding degree of molybdenum reliably and reproducibly. The use of this method is illustrated by two examples for which XPS spectroscopy has provided information sufficiently accurate and quantitative to help understand the reactivity differences between certain MoS2/Al2O3 or NiMoS/Al2O3-type hydrotreating catalysts.

  13. Cell Matrix Remodeling Ability Shown by Image Spatial Correlation

    Science.gov (United States)

    Chiu, Chi-Li; Digman, Michelle A.; Gratton, Enrico

    2013-01-01

    Extracellular matrix (ECM) remodeling is a critical step of many biological and pathological processes. However, most of the studies to date lack a quantitative method to measure ECM remodeling at a scale comparable to cell size. Here, we applied image spatial correlation to collagen second harmonic generation (SHG) images to quantitatively evaluate the degree of collagen remodeling by cells. We propose a simple statistical method based on spatial correlation functions to determine the size of high collagen density area around cells. We applied our method to measure collagen remodeling by two breast cancer cell lines (MDA-MB-231 and MCF-7), which display different degrees of invasiveness, and a fibroblast cell line (NIH/3T3). We found distinct collagen compaction levels of these three cell lines by applying the spatial correlation method, indicating different collagen remodeling ability. Furthermore, we quantitatively measured the effect of Latrunculin B and Marimastat on MDA-MB-231 cell line collagen remodeling ability and showed that significant collagen compaction level decreases with these treatments. PMID:23935614
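
    The kind of image spatial correlation the analysis above builds on can be sketched with FFTs; the image is synthetic, and the radially decaying correlation profile is what would be read off to estimate the size of dense regions.

        import numpy as np

        img = np.random.default_rng(4).random((256, 256))
        img[100:140, 100:140] += 2.0                         # a synthetic "dense collagen" patch

        d = img - img.mean()
        power = np.abs(np.fft.fft2(d)) ** 2
        corr = np.fft.ifft2(power).real
        corr = np.fft.fftshift(corr) / corr.flat[0]          # normalise so the zero-lag value is 1

        centre = np.array(corr.shape) // 2
        profile = [corr[centre[0], centre[1] + r] for r in range(60)]   # correlation vs lag
        print([round(v, 3) for v in profile[:10]])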

  14. Global and 3D spatial assessment of neuroinflammation in rodent models of Multiple Sclerosis.

    Directory of Open Access Journals (Sweden)

    Shashank Gupta

    Full Text Available Multiple Sclerosis (MS) is a progressive autoimmune inflammatory and demyelinating disease of the central nervous system (CNS). T cells play a key role in the progression of neuroinflammation in MS and also in the experimental autoimmune encephalomyelitis (EAE) animal models of the disease. A technology for quantitative and three-dimensional (3D) spatial assessment of inflammation in this and other CNS inflammatory conditions is much needed. Here we present a procedure for 3D spatial assessment and global quantification of the development of neuroinflammation based on Optical Projection Tomography (OPT). Applying this approach to the analysis of rodent models of MS, we provide global quantitative data on the major inflammatory component as a function of the clinical course. Our data demonstrate a strong correlation between the development and progression of neuroinflammation and clinical disease in several mouse models and a rat model of MS, refining the information regarding the spatial dynamics of the inflammatory component in EAE. This method provides a powerful tool to investigate the effect of environmental and genetic forces and for assessing the therapeutic effects of drug therapy in animal models of MS and other neuroinflammatory/neurodegenerative disorders.

  15. Spatial analysis on human brucellosis incidence in mainland China: 2004–2010

    Science.gov (United States)

    Zhang, Junhui; Yin, Fei; Zhang, Tao; Yang, Chao; Zhang, Xingyu; Feng, Zijian; Li, Xiaosong

    2014-01-01

    Objectives China has experienced a sharply increasing rate of human brucellosis in recent years. Effective spatial monitoring of human brucellosis incidence is very important for successful implementation of control and prevention programmes. The purpose of this paper is to apply exploratory spatial data analysis (ESDA) methods and the empirical Bayes (EB) smoothing technique to monitor county-level incidence rates for human brucellosis in mainland China from 2004 to 2010 by examining spatial patterns. Methods ESDA methods were used to characterise spatial patterns of EB smoothed incidence rates for human brucellosis based on county-level data obtained from the China Information System for Disease Control and Prevention (CISDCP) in mainland China from 2004 to 2010. Results EB smoothed incidence rates for human brucellosis were spatially dependent during 2004–2010. The local Moran test identified statistically significant high-risk clusters of human brucellosis incidence. PMID:24713215
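
    The global statistic at the heart of ESDA, Moran's I, can be computed from scratch as below; the grid, rates and rook-contiguity weights are a toy stand-in for the county-level EB smoothed rates, and the local Moran test is not reproduced.

        import numpy as np

        rng = np.random.default_rng(5)
        rates = rng.poisson(5, size=(10, 10)).astype(float)  # toy "county" incidence rates
        rates[:4, :4] += 15                                  # a high-incidence cluster

        n_rows, n_cols = rates.shape
        x = rates.ravel()
        n = x.size
        w = np.zeros((n, n))
        for r in range(n_rows):                              # rook contiguity weights
            for c in range(n_cols):
                i = r * n_cols + c
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < n_rows and 0 <= cc < n_cols:
                        w[i, rr * n_cols + cc] = 1.0

        z = x - x.mean()
        moran_i = (n / w.sum()) * (z @ w @ z) / (z @ z)
        print(f"Moran's I = {moran_i:.3f}, expected under independence = {-1 / (n - 1):.3f}")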

  16. RiskChanges Spatial Decision Support system for the analysis of changing multi-hazard risk

    Science.gov (United States)

    van Westen, Cees; Zhang, Kaixi; Bakker, Wim; Andrejchenko, Vera; Berlin, Julian; Olyazadeh, Roya; Cristal, Irina

    2015-04-01

    quantitative analysis (using different hazard types, temporal scenarios and vulnerability curves), resulting in risk curves. The platform does not include a component to calculate hazard maps; existing hazard maps are used as input data for the risk component. The second component of the SDSS is a risk reduction planning component, which forms the core of the platform. This component includes the definition of risk reduction alternatives (related to disaster response planning, risk reduction measures and spatial planning), links back to the risk assessment module to calculate the new level of risk if the measure is implemented, and includes a cost-benefit (or cost-effectiveness/Spatial Multi-Criteria Evaluation) component to compare the alternatives and decide on the optimal one. The third component of the SDSS is a temporal scenario component, which allows future scenarios to be defined in terms of climate change, land use change and population change, and the time periods for which these scenarios will be made. The component doesn't generate these scenarios but uses input maps for the effect of the scenarios on the hazard and asset maps. The last component is a communication and visualization component, which can compare scenarios and alternatives, not only in the form of maps, but also in other forms (risk curves, tables, graphs)

  17. A borax fusion technique for quantitative X-ray fluorescence analysis

    NARCIS (Netherlands)

    van Willigen, J.H.H.G.; Kruidhof, H.; Dahmen, E.A.M.F.

    1971-01-01

    A borax fusion technique to cast glass discs for quantitative X-ray analysis is described in detail. The method is based on the “nonwetting” properties of a Pt/Au alloy towards molten borax, on the favourable composition of the flux and finally on the favourable form of the casting mould. The

  18. Spatially-Heterodyned Holography

    Science.gov (United States)

    Thomas, Clarence E [Knoxville, TN; Hanson, Gregory R [Clinton, TN

    2006-02-21

    A method of recording a spatially low-frequency heterodyne hologram, including spatially heterodyne fringes for Fourier analysis, includes: splitting a laser beam into a reference beam and an object beam; interacting the object beam with an object; focusing the reference beam and the object beam at a focal plane of a digital recorder to form a spatially low-frequency heterodyne hologram including spatially heterodyne fringes for Fourier analysis; digital recording the spatially low-frequency heterodyne hologram; Fourier transforming axes of the recorded spatially low-frequency heterodyne hologram including spatially heterodyne fringes in Fourier space to sit on top of a heterodyne carrier frequency defined by an angle between the reference beam and the object beam; cutting off signals around an origin; and performing an inverse Fourier transform.
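
    The Fourier-space steps listed in the claim can be sketched numerically as below; the hologram is simulated rather than digitally recorded, and the carrier frequency and window size are assumptions.

        import numpy as np

        N = 512
        y, x = np.mgrid[0:N, 0:N]
        obj_phase = 2 * np.pi * np.exp(-((x - N / 2) ** 2 + (y - N / 2) ** 2) / (2 * 60.0 ** 2))
        carrier = 2 * np.pi * (0.15 * x + 0.10 * y)          # set by the reference/object beam angle
        hologram = 1 + np.cos(carrier + obj_phase)           # low-frequency heterodyne fringes

        spectrum = np.fft.fftshift(np.fft.fft2(hologram))

        # keep a window around the +1 order sitting on the carrier, cutting off the origin
        cy, cx = N // 2 + int(0.10 * N), N // 2 + int(0.15 * N)
        mask = np.zeros_like(spectrum)
        mask[cy - 40:cy + 40, cx - 40:cx + 40] = 1.0
        sideband = np.roll(spectrum * mask, (N // 2 - cy, N // 2 - cx), axis=(0, 1))

        recovered = np.fft.ifft2(np.fft.ifftshift(sideband)) # complex object wave
        print(np.angle(recovered).shape)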

  19. Spatial analysis of the Rio de Janeiro metropolitan area and social and environmental management issues

    DEFF Research Database (Denmark)

    Ribeiro, Gustavo

    2005-01-01

    and infrastructural data. Through these three levels of spatial analysis it is possible to develop and to support a more comprehensive study of urban development of the Rio de Janeiro Metropolitan Area. The aim of this study is (a) to develop an alternative spatial analysis leading to a more comprehensive understanding of the urban development process and its correlation not just with political-administrative borders but also with ecological systems; (b) to identify the correlations between infrastructure and socio-economic data in the Rio de Janeiro Metropolitan Area; (c) to evaluate urban development dynamics in the period between 1990 and 2000, based on the application of the above-mentioned data to the three spatial levels in question. This paper highlights the role of the hydrographical systems of the Rio de Janeiro Metropolitan Area as an important spatial unit of analysis to understand the metropolitan urban

  20. Urbanization and Land Use Changes in Peri-Urban Area using Spatial Analysis Methods (Case Study: Ciawi Urban Areas, Bogor Regency)

    Science.gov (United States)

    Cahya, D. L.; Martini, E.; Kasikoen, K. M.

    2018-02-01

    Urbanization is shown by the increasing percentage of the population in urban areas. In Indonesia, the percentage of urban population increased dramatically from 17.42% (1971) to 42.15% (2010). This resulted in increased demand for housing. Limited land in the city area pushes residents to look for alternative residential locations in the peri-urban areas. This is accompanied by a process of land conversion from green areas into built-up areas. Continuous land conversion in peri-urban areas is becoming increasingly widespread. Bogor Regency, as part of the Jakarta Metropolitan Area, is experiencing rapid development. This regency has experienced very rapid land-use change from agricultural areas into urban built-up areas. The aim of this research is to analyze the effect of urbanization on land use changes in peri-urban areas using spatial analysis methods. This research uses a case study of the Ciawi Urban Area, which is experiencing rapid development, and follows a descriptive quantitative approach. The data used are primary data (field survey) and secondary data (maps). Land use change is analyzed using a Geographic Information System (GIS) as the spatial analysis method. The effect of urbanization on land use changes in the Ciawi Urban Area from 2013 to 2015 is significant. The reduction of farm land is around -4.00% and of wetland around -2.51%. The increase in area for hotels/villas/resorts is around 3.10%. Based on this research, the local government (Bogor Regency) should be alert to land use changes that do not comply with the land use plan and should consistently apply the spatial plan.

  1. ANSI/ASHRAE/IES Standard 90.1-2013 Preliminary Determination: Quantitative Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Halverson, Mark A.; Rosenberg, Michael I.; Wang, Weimin; Zhang, Jian; Mendon, Vrushali V.; Athalye, Rahul A.; Xie, YuLong; Hart, Reid; Goel, Supriya

    2014-03-01

    This report provides a preliminary quantitative analysis to assess whether buildings constructed according to the requirements of ANSI/ASHRAE/IES Standard 90.1-2013 would result in energy savings compared with buildings constructed to ANSI/ASHRAE/IES Standard 90.1-2010.

  2. Complex pedigree analysis to detect quantitative trait loci in dairy cattle

    NARCIS (Netherlands)

    Bink, M.C.A.M.

    1998-01-01

    In dairy cattle, many quantitative traits of economic importance show phenotypic variation. For breeding purposes the analysis of this phenotypic variation and uncovering the contribution of genetic factors is very important. Usually, the individual gene effects contributing to the

  3. Explorative analysis of long time series of very high resolution spatial rainfall

    DEFF Research Database (Denmark)

    Thomassen, Emma Dybro; Sørup, Hjalte Jomo Danielsen; Scheibel, Marc

    2017-01-01

    For each method a set of 17 variables is used to describe the properties of each event, e.g. duration, maximum volumes, spatial coverage and heterogeneity, and movement of cells. A total of 5-9 dimensions can be found in the data, which can be interpreted as a rough indication of how many independent... simple scaling across the set of variables, i.e. the level of each variable varies significantly, but not the overall structure of the spatial precipitation. The analysis shows that there is good potential for making a spatial weather generator for high spatio-temporal precipitation for precipitation...

  4. Laser-induced breakdown spectroscopy for in situ qualitative and quantitative analysis of mineral ores

    International Nuclear Information System (INIS)

    Pořízka, P.; Demidov, A.; Kaiser, J.; Keivanian, J.; Gornushkin, I.; Panne, U.; Riedel, J.

    2014-01-01

    In this work, the potential of laser-induced breakdown spectroscopy (LIBS) for discrimination and analysis of geological materials was examined. The research was focused on classification of mineral ores using their LIBS spectra prior to quantitative determination of copper. Quantitative analysis is not a trivial task in LIBS measurement because intensities of emission lines in laser-induced plasmas (LIP) are strongly affected by the sample matrix (matrix effect). To circumvent this effect, typically matrix-matched standards are used to obtain matrix-dependent calibration curves. If the sample set consists of a mixture of different matrices, even in this approach, the corresponding matrix has to be known prior to the downstream data analysis. For this categorization, the multielemental character of LIBS spectra can be of help. In this contribution, a principal component analysis (PCA) was employed on the measured data set to discriminate individual rocks as individual matrices against each other according to their overall elemental composition. Twenty-seven igneous rock samples were analyzed in the form of fine dust, classified and subsequently quantitatively analyzed. Two different LIBS setups in two laboratories were used to prove the reproducibility of classification and quantification. A superposition of partial calibration plots constructed from the individual clustered data displayed a large improvement in precision and accuracy compared to the calibration plot constructed from all ore samples. The classification of mineral samples with complex matrices can thus be recommended prior to LIBS system calibration and quantitative analysis. - Highlights: • Twenty seven igneous rocks were measured on different LIBS systems. • Principal component analysis (PCA) was employed for classification. • The necessity of the classification of the rock (ore) samples prior to the quantification analysis is stressed. • Classification based on the whole LIP spectra and
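
    The classify-then-calibrate idea can be sketched as below with synthetic spectra; PCA scores are clustered into matrix groups and a separate calibration line is fitted per group. The channel indices, concentrations and cluster count are assumptions, not values from the study.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.cluster import KMeans
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(6)
        spectra, cu = [], []
        for matrix_offset in (0.0, 1.5, 3.0):                # three assumed ore matrices
            base = rng.normal(matrix_offset, 0.1, 400)
            for _ in range(9):
                c = rng.uniform(0.5, 5.0)                    # Cu concentration, wt.%
                s = base + rng.normal(0, 0.05, 400)
                s[120] += 0.4 * c                            # assumed Cu emission channel
                spectra.append(s)
                cu.append(c)
        spectra, cu = np.array(spectra), np.array(cu)

        scores = PCA(n_components=3).fit_transform(spectra)
        labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)

        for k in range(3):                                   # one calibration line per matrix cluster
            m = labels == k
            Xk = spectra[m, 120].reshape(-1, 1)
            reg = LinearRegression().fit(Xk, cu[m])
            print(f"cluster {k}: slope = {reg.coef_[0]:.2f}, R2 = {reg.score(Xk, cu[m]):.2f}")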

  5. Laser-induced breakdown spectroscopy for in situ qualitative and quantitative analysis of mineral ores

    Energy Technology Data Exchange (ETDEWEB)

    Pořízka, P. [BAM, Federal Institute for Materials Research and Testing, Richard Willstätter-Straße 11, D-12489 Berlin (Germany); Institute of Physical Engineering, Faculty of Mechanical Engineering, Brno University of Technology, Technická 2896/2, 61669 Brno (Czech Republic); Demidov, A. [BAM, Federal Institute for Materials Research and Testing, Richard Willstätter-Straße 11, D-12489 Berlin (Germany); Kaiser, J. [Institute of Physical Engineering, Faculty of Mechanical Engineering, Brno University of Technology, Technická 2896/2, 61669 Brno (Czech Republic); Keivanian, J. [Institute for Mining, Technical University Clausthal, Erzstraße 18, 38678 Clausthal-Zellerfeld (Germany); Gornushkin, I. [BAM, Federal Institute for Materials Research and Testing, Richard Willstätter-Straße 11, D-12489 Berlin (Germany); Panne, U. [BAM, Federal Institute for Materials Research and Testing, Richard Willstätter-Straße 11, D-12489 Berlin (Germany); Chemistry Department, Humboldt Univerisät zu Berlin, Brook-Taylor-Straße 2, D-12489 Berlin (Germany); Riedel, J., E-mail: jens.riedel@bam.de [BAM, Federal Institute for Materials Research and Testing, Richard Willstätter-Straße 11, D-12489 Berlin (Germany)

    2014-11-01

    In this work, the potential of laser-induced breakdown spectroscopy (LIBS) for discrimination and analysis of geological materials was examined. The research was focused on classification of mineral ores using their LIBS spectra prior to quantitative determination of copper. Quantitative analysis is not a trivial task in LIBS measurement because intensities of emission lines in laser-induced plasmas (LIP) are strongly affected by the sample matrix (matrix effect). To circumvent this effect, typically matrix-matched standards are used to obtain matrix-dependent calibration curves. If the sample set consists of a mixture of different matrices, even in this approach, the corresponding matrix has to be known prior to the downstream data analysis. For this categorization, the multielemental character of LIBS spectra can be of help. In this contribution, a principal component analysis (PCA) was employed on the measured data set to discriminate individual rocks as individual matrices against each other according to their overall elemental composition. Twenty-seven igneous rock samples were analyzed in the form of fine dust, classified and subsequently quantitatively analyzed. Two different LIBS setups in two laboratories were used to prove the reproducibility of classification and quantification. A superposition of partial calibration plots constructed from the individual clustered data displayed a large improvement in precision and accuracy compared to the calibration plot constructed from all ore samples. The classification of mineral samples with complex matrices can thus be recommended prior to LIBS system calibration and quantitative analysis. - Highlights: • Twenty seven igneous rocks were measured on different LIBS systems. • Principal component analysis (PCA) was employed for classification. • The necessity of the classification of the rock (ore) samples prior to the quantification analysis is stressed. • Classification based on the whole LIP spectra and

  6. Quantitative ferromagnetic resonance analysis of CD 133 stem cells labeled with iron oxide nanoparticles

    International Nuclear Information System (INIS)

    Gamarra, L F; Pavon, L F; Marti, L C; Moreira-Filho, C A; Amaro, E Jr; Pontuschka, W M; Mamani, J B; Costa-Filho, A J; Vieira, E D

    2008-01-01

    The aim of this work is to provide a quantitative method for analysis of the concentration of superparamagnetic iron oxide nanoparticles (SPION), determined by means of ferromagnetic resonance (FMR), with the nanoparticles coupled to a specific antibody (AC 133), and thus to express the antigenic labeling evidence for the stem cells CD 133+. The FMR efficiency and sensitivity were proven adequate for detecting and quantifying the low amounts of iron content in the CD 133+ cells (∼6.16 × 10^5 pg in a volume of 2 μl containing 4.5 × 10^11 SPION). The quantitative method led to the result of 1.70 × 10^-13 mol of Fe (9.5 pg), or 7.0 × 10^6 nanoparticles per cell. For the quantification analysis via the FMR technique it was necessary to carry out a preliminary quantitative visualization of iron oxide-labeled cells in order to ensure that the nanoparticles coupled to the antibodies are indeed tied to the antigen at the stem cell surface and that the cellular morphology was conserved, as proof of the validity of this method. The quantitative analysis by means of FMR is necessary for determining the signal intensity for the study of molecular imaging by means of magnetic resonance imaging (MRI)
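
    The per-cell figures quoted above follow from simple unit arithmetic. The short check below reproduces the molar amount of iron from the reported 9.5 pg per cell; the iron mass per individual SPION used for the particle count is a hypothetical placeholder, not a value from the study.

        # Back-of-envelope check of the per-cell iron figures quoted in the abstract.
        M_FE = 55.845                    # molar mass of iron, g/mol
        fe_per_cell_pg = 9.5             # pg Fe per cell (reported value)

        mol_fe_per_cell = fe_per_cell_pg * 1e-12 / M_FE
        print(f"{mol_fe_per_cell:.2e} mol Fe per cell")     # ~1.7e-13 mol, consistent with the text

        # The particle count additionally needs the Fe mass of a single SPION;
        # the value below is an illustrative assumption only.
        fe_per_spion_pg = 1.4e-6
        print(f"~{fe_per_cell_pg / fe_per_spion_pg:.1e} SPION per cell")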

  7. Review of Department of Defense Education Activity (DODEA) Schools. Volume II: Quantitative Analysis of Educational Quality

    National Research Council Canada - National Science Library

    Anderson, Lowell

    2000-01-01

    This volume compiles, and presents in integrated form, IDA's quantitative analysis of educational quality provided by DoD's dependent schools. It covers the quantitative aspects of volume I in greater...

  8. Spatial pattern of Amazonian timber species using cartesian and spatial coordinates method

    Directory of Open Access Journals (Sweden)

    Tiago Monteiro Condé

    2016-06-01

    Full Text Available Geographic information systems (GIS) applied to forest analysis permit the recognition and analysis of spatial patterns of species in two and three dimensions. The aim of this study was to demonstrate the efficiency of the Cartesian and spatial coordinates method (MCCE), a method of correcting the UTM coordinates of tree locations according to their field (Cartesian) coordinates (X, Y), combined with the natural neighbor index (ANND), in recognizing and analysing the spatial distribution patterns of four commercial timber species under forest management in Caracaraí, Roraima State, Brazil. Simulations were performed on 9 ha, divided into 100 plots of 100 m2 each. Collected data were DBH > 10 cm, commercial and total heights, Cartesian coordinates (X, Y) and spatial coordinates (UTM). Random spatial patterns were observed in Eschweilera bracteosa and Manilkara huberi, whereas dispersed and rare spatial patterns were observed in Dinizia excelsa and Cedrelinga cateniformis. The MCCE proved to be an efficient method for recognizing and analysing the spatial patterns of native species of the Amazon rain forest, and forest planning becomes easier through 2D and 3D simulations.
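
    The natural neighbor index used above is closely related to the classical Clark-Evans nearest-neighbour ratio; whether the paper uses exactly this formulation is an assumption, so the sketch below should be read as a generic illustration (R < 1 suggests clustering, R ≈ 1 randomness, R > 1 dispersion). The stem map is simulated, not the Caracaraí inventory data.

        import numpy as np
        from scipy.spatial import cKDTree

        def nearest_neighbour_index(xy, area):
            """Clark-Evans ratio of observed to expected mean nearest-neighbour distance."""
            d, _ = cKDTree(xy).query(xy, k=2)            # k=2: the first neighbour is the point itself
            observed = d[:, 1].mean()
            expected = 0.5 / np.sqrt(len(xy) / area)     # expectation under complete spatial randomness
            return observed / expected

        # Hypothetical stem map on a 300 m x 300 m (9 ha) block.
        rng = np.random.default_rng(1)
        trees = rng.uniform(0, 300, size=(120, 2))
        print(f"R = {nearest_neighbour_index(trees, 300 * 300):.2f}")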

  9. Spatial Analysis of Political Capital Citation Using Remote Sensing ...

    African Journals Online (AJOL)

    Spatial Analysis of Political Capital Citation Using Remote Sensing and GIS; A Case Study of Lokoja.

  10. Multi-criteria decision analysis and spatial statistic: an approach to determining human vulnerability to vector transmission of Trypanosoma cruzi

    Directory of Open Access Journals (Sweden)

    Diego Montenegro

    Full Text Available BACKGROUND Chagas disease (CD), caused by the protozoan Trypanosoma cruzi, is a neglected human disease. It is endemic to the Americas and is estimated to have an economic impact, including lost productivity and disability, of 7 billion dollars per year on average. OBJECTIVES To assess vulnerability to vector-borne transmission of T. cruzi in domiciliary environments within an area undergoing domiciliary vector interruption of T. cruzi in Colombia. METHODS Multi-criteria decision analysis [preference ranking method for enrichment evaluation (PROMETHEE) and geometrical analysis for interactive assistance (GAIA) methods] and spatial statistics were performed on data from a socio-environmental questionnaire and an entomological survey. In the construction of multi-criteria descriptors, decision-making processes and indicators of five determinants of the CD vector pathway were summarily defined, including: (1) house indicator (HI); (2) triatominae indicator (TI); (3) host/reservoir indicator (Ho/RoI); (4) ecotope indicator (EI); and (5) socio-cultural indicator (S-CI). FINDINGS Determination of vulnerability to CD is mostly influenced by TI, with 44.96% of the total weight in the model, while the lowest contribution was from S-CI, with 7.15%. The five indicators comprise 17 indices, and include 78 of the original 104 priority criteria and variables. The PROMETHEE and GAIA methods proved very efficient for prioritisation and quantitative categorisation of socio-environmental determinants and for better determining which criteria should be considered for interrupting the man-T. cruzi-vector relationship in endemic areas of the Americas. Through the analysis of spatial autocorrelation it is clear that there is a spatial dependence in establishing categories of vulnerability, therefore, the effect of neighbors’ setting (border areas) on local values should be incorporated into disease management for establishing programs of surveillance and control of CD via vector

  11. Multi-criteria decision analysis and spatial statistic: an approach to determining human vulnerability to vector transmission of Trypanosoma cruzi.

    Science.gov (United States)

    Montenegro, Diego; Cunha, Ana Paula da; Ladeia-Andrade, Simone; Vera, Mauricio; Pedroso, Marcel; Junqueira, Angela

    2017-10-01

    Chagas disease (CD), caused by the protozoan Trypanosoma cruzi, is a neglected human disease. It is endemic to the Americas and is estimated to have an economic impact, including lost productivity and disability, of 7 billion dollars per year on average. To assess vulnerability to vector-borne transmission of T. cruzi in domiciliary environments within an area undergoing domiciliary vector interruption of T. cruzi in Colombia. Multi-criteria decision analysis [preference ranking method for enrichment evaluation (PROMETHEE) and geometrical analysis for interactive assistance (GAIA) methods] and spatial statistics were performed on data from a socio-environmental questionnaire and an entomological survey. In the construction of multi-criteria descriptors, decision-making processes and indicators of five determinants of the CD vector pathway were summarily defined, including: (1) house indicator (HI); (2) triatominae indicator (TI); (3) host/reservoir indicator (Ho/RoI); (4) ecotope indicator (EI); and (5) socio-cultural indicator (S-CI). Determination of vulnerability to CD is mostly influenced by TI, with 44.96% of the total weight in the model, while the lowest contribution was from S-CI, with 7.15%. The five indicators comprise 17 indices, and include 78 of the original 104 priority criteria and variables. The PROMETHEE and GAIA methods proved very efficient for prioritisation and quantitative categorisation of socio-environmental determinants and for better determining which criteria should be considered for interrupting the man-T. cruzi-vector relationship in endemic areas of the Americas. Through the analysis of spatial autocorrelation it is clear that there is a spatial dependence in establishing categories of vulnerability, therefore, the effect of neighbors' setting (border areas) on local values should be incorporated into disease management for establishing programs of surveillance and control of CD via vector. The study model proposed here is flexible and
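
    For readers unfamiliar with PROMETHEE II, the core of the method is a pairwise outranking computation that can be sketched in a few lines. The indicator scores below are random placeholders, the 'usual' preference function is only one of several options, and the weights are merely inspired by the percentages quoted above; none of this reproduces the paper's actual decision model.

        import numpy as np

        def promethee_ii(scores, weights):
            """Net outranking flows (PROMETHEE II) using the 'usual' preference function."""
            n_alt, n_crit = scores.shape
            pi = np.zeros((n_alt, n_alt))                 # aggregated preference of a over b
            for j in range(n_crit):
                d = scores[:, j, None] - scores[None, :, j]
                pi += weights[j] * (d > 0)                # usual criterion: 1 if strictly better
            phi_plus = pi.sum(axis=1) / (n_alt - 1)
            phi_minus = pi.sum(axis=0) / (n_alt - 1)
            return phi_plus - phi_minus                   # net flow; here, higher = more vulnerable

        # Hypothetical scores of 6 dwellings on the 5 indicators (HI, TI, Ho/RoI, EI, S-CI).
        rng = np.random.default_rng(8)
        scores = rng.random((6, 5))
        weights = np.array([0.20, 0.45, 0.15, 0.13, 0.07])   # TI weighted highest, S-CI lowest
        print(np.argsort(-promethee_ii(scores, weights)))    # dwellings ranked by vulnerability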

  12. TEMPORAL AND SPATIAL ANALYSIS OF EXTREME RAINFALL ON THE SLOPE AREA OF MT. MERAPI

    Directory of Open Access Journals (Sweden)

    Dhian Dharma Prayuda

    2015-02-01

    Full Text Available Rainfall has temporal and spatial characteristics with distinct patterns that are affected by the topographic and climatological variations of an area. The intensity of extreme rainfall is one of the important characteristics related to the triggering of debris flow. This research discusses the analysis of short-duration rainfall data on the southern and western slopes of Mt. Merapi. Hourly rainfall measured at 14 stations over the last 27 years was used as the analysis input. The rainfall intensity-duration-frequency (IDF) relationship was derived using the empirical formulas of the Sherman, Kimijima, Haspers, and Mononobe methods. The characteristics of extreme rainfall intensity were analysed by spatial interpolation using the Inverse Distance Weighted (IDW) method. The results show that the IDF of rainfall in the research area fits Sherman's formula. In addition, the spatial distribution pattern of maximum rainfall intensity was assessed on the basis of areal rainfall, and the difference between spatial maps of one-hour extreme rainfall based on the isolated-event and non-isolated-event methods was evaluated. The results of this preliminary study are expected to serve as inputs for the establishment of a debris-flow early warning system on the slopes of Mt. Merapi.
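
    For context, the Sherman intensity-duration relationship is commonly written as I = a / (t + b)^n; a minimal least-squares fit of those parameters from duration-intensity pairs could look like the sketch below. The data points are invented and are not the Mt. Merapi records, and the exact parameterisation used in the paper may differ.

        import numpy as np
        from scipy.optimize import curve_fit

        def sherman(t, a, b, n):
            """Sherman IDF curve: rainfall intensity as a function of duration t (hours)."""
            return a / (t + b) ** n

        # Hypothetical duration (h) / intensity (mm/h) pairs.
        t_obs = np.array([0.5, 1, 2, 3, 6, 12, 24], dtype=float)
        i_obs = np.array([95, 70, 48, 38, 24, 15, 9], dtype=float)

        (a, b, n), _ = curve_fit(sherman, t_obs, i_obs, p0=(100.0, 0.5, 0.8), maxfev=10000)
        print(f"a = {a:.1f}, b = {b:.2f}, n = {n:.2f}")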

  13. Spin echo SPI methods for quantitative analysis of fluids in porous media.

    Science.gov (United States)

    Li, Linqing; Han, Hui; Balcom, Bruce J

    2009-06-01

    Fluid density imaging is highly desirable in a wide variety of porous media measurements. The SPRITE class of MRI methods has proven to be robust and general in its ability to generate density images in porous media; however, the short encoding times required, with correspondingly high magnetic field gradient strengths and filter widths, and low flip angle RF pulses, yield sub-optimal S/N images, especially at low static field strength. This paper explores two implementations of pure phase encode spin echo 1D imaging, with application to a proposed new petroleum reservoir core analysis measurement. In the first implementation of the pulse sequence, we modify the spin echo single point imaging (SE-SPI) technique to acquire the k-space origin data point, with a near zero evolution time, from the free induction decay (FID) following a 90 degree excitation pulse. Subsequent k-space data points are acquired by separately phase encoding individual echoes in a multi-echo acquisition. T2 attenuation of the echo train yields an image convolution which causes blurring. The T2 blur effect is moderate for porous media with T2 lifetime distributions longer than 5 ms. As a robust, high S/N, and fast 1D imaging method, this method will be highly complementary to SPRITE techniques for the quantitative analysis of fluid content in porous media. In the second implementation of the SE-SPI pulse sequence, modification of the basic measurement permits fast determination of spatially resolved T2 distributions in porous media through separately phase encoding each echo in a multi-echo CPMG pulse train. An individual T2-weighted image may be acquired from each echo. The echo time (TE) of each T2-weighted image may be reduced to 500 μs or less. These profiles can be fit to extract a T2 distribution from each pixel employing a variety of standard inverse Laplace transform methods. Fluid content 1D images are produced as an essential by-product of determining the
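
    As a much-simplified stand-in for the pixel-wise inverse Laplace inversion described above, the sketch below fits a single-exponential T2 decay to each pixel's echo train; the echo spacing, noise level and number of pixels are placeholders, and a real porous-media profile would generally require a multi-exponential or regularised inversion.

        import numpy as np
        from scipy.optimize import curve_fit

        def echo_decay(te, m0, t2):
            return m0 * np.exp(-te / t2)

        n_echoes, n_pixels = 32, 64
        te = 0.5e-3 * np.arange(1, n_echoes + 1)            # 500 microsecond echo spacing
        rng = np.random.default_rng(2)
        true_t2 = rng.uniform(5e-3, 80e-3, n_pixels)        # hypothetical T2 values (s)
        echoes = (np.exp(-te[:, None] / true_t2[None, :])
                  + 0.01 * rng.standard_normal((n_echoes, n_pixels)))

        t2_map = np.empty(n_pixels)
        for px in range(n_pixels):
            (m0, t2), _ = curve_fit(echo_decay, te, echoes[:, px], p0=(1.0, 20e-3))
            t2_map[px] = t2
        print(f"median fitted T2: {np.median(t2_map) * 1e3:.1f} ms")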

  14. Bovine spongiform encephalopathy and spatial analysis of the feed industry.

    Science.gov (United States)

    Paul, Mathilde; Abrial, David; Jarrige, Nathalie; Rican, Stéphane; Garrido, Myriam; Calavas, Didier; Ducrot, Christian

    2007-06-01

    In France, despite the ban of meat-and-bone meal (MBM) in cattle feed, bovine spongiform encephalopathy (BSE) was detected in hundreds of cattle born after the ban. To study the role of MBM, animal fat, and dicalcium phosphate on the risk for BSE after the feed ban, we conducted a spatial analysis of the feed industry. We used data from 629 BSE cases as well as data on use of each byproduct and market area of the feed factories. We mapped risk for BSE in 951 areas supplied by the same factories and connection with use of byproducts. A disease map of BSE with covariates was built with the hierarchical Bayesian modeling methods, based on Poisson distribution with spatial smoothing. Only use of MBM was spatially linked to risk for BSE, which highlights cross-contamination as the most probable source of infection after the feed ban.

  15. The effect of spatial micro-CT image resolution and surface complexity on the morphological 3D analysis of open porous structures

    Energy Technology Data Exchange (ETDEWEB)

    Pyka, Grzegorz, E-mail: gregory.pyka@mtm.kuleuven.be [Department of Metallurgy and Materials Engineering, KU Leuven, Kasteelpark Arenberg 44 – PB2450, B-3001 Leuven (Belgium); Kerckhofs, Greet [Department of Metallurgy and Materials Engineering, KU Leuven, Kasteelpark Arenberg 44 – PB2450, B-3001 Leuven (Belgium); Biomechanics Research Unit, Université de Liege, Chemin des Chevreuils 1 - BAT 52/3, B-4000 Liège (Belgium); Schrooten, Jan; Wevers, Martine [Department of Metallurgy and Materials Engineering, KU Leuven, Kasteelpark Arenberg 44 – PB2450, B-3001 Leuven (Belgium)

    2014-01-15

    In material science microfocus X-ray computed tomography (micro-CT) is one of the most popular non-destructive techniques to visualise and quantify the internal structure of materials in 3D. Despite constant system improvements, state-of-the-art micro-CT images can still hold several artefacts typical for X-ray CT imaging that hinder further image-based processing, structural and quantitative analysis. For example spatial resolution is crucial for an appropriate characterisation as the voxel size essentially influences the partial volume effect. However, defining the adequate image resolution is not a trivial aspect and understanding the correlation between scan parameters like voxel size and the structural properties is crucial for comprehensive material characterisation using micro-CT. Therefore, the objective of this study was to evaluate the influence of the spatial image resolution on the micro-CT based morphological analysis of three-dimensional (3D) open porous structures with a high surface complexity. In particular the correlation between the local surface properties and the accuracy of the micro-CT-based macro-morphology of 3D open porous Ti6Al4V structures produced by selective laser melting (SLM) was targeted and revealed for rough surfaces a strong dependence of the resulting structure characteristics on the scan resolution. Reducing the surface complexity by chemical etching decreased the sensitivity of the overall morphological analysis to the spatial image resolution and increased the detection limit. This study showed that scan settings and image processing parameters need to be customized to the material properties, morphological parameters under investigation and the desired final characteristics (in relation to the intended functional use). Customization of the scan resolution can increase the reliability of the micro-CT based analysis and at the same time reduce its operating costs. - Highlights: • We examine influence of the image resolution

  16. Strategic Placing of Field Hospitals Using Spatial Analysis

    OpenAIRE

    Rydén, Magnus

    2011-01-01

    Humanitarian help organisations today may benefit from improving their location analysis when placing field hospitals in countries hit by a disaster or catastrophe. The main objective of this thesis is to develop and evaluate a spatial decision support method for the strategic placing of field hospitals for two time perspectives, long term (months) and short term (weeks). Specifically, the possibility of combining existing infrastructure and satellite data is examined to derive a suitability map f...

  17. Quantitative analysis of intermolecular interactions in orthorhombic rubrene

    Directory of Open Access Journals (Sweden)

    Venkatesha R. Hathwar

    2015-09-01

    Full Text Available Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility, which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and the energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ...Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI) analysis and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H—H interactions. The electron density features of H—H bonding, and the interaction energy of molecular dimers connected by H—H interactions, clearly demonstrate the importance of these weak interactions in the stabilization of the crystal structure. The quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K, suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations.

  18. Quantitative analysis of tellurium in simple substance sulfur

    International Nuclear Information System (INIS)

    Arikawa, Yoshiko

    1976-01-01

    The MIBK extraction-bismuthiol-2 absorptiometric method for the quantitative analysis of tellurium was studied. The method and its limitations were compared with those of the atomic absorption method. The time required to boil the solution in order to decompose excess hydrogen peroxide and to reduce tellurium from the 6+ to the 4+ valence state was examined. The experiments showed that decomposition was faster in alkaline solution: 30 minutes in alkaline solution and 40 minutes in acid solution were needed to reach constant absorption. A method for analyzing samples containing less than 5 ppm of tellurium was also studied. The experiments revealed that samples containing a very small amount of tellurium can be analyzed when concentration by extraction is carried out on sample solutions divided into one-gram portions, because it is difficult to treat several grams of sample at one time. This method is also suitable for the quantitative analysis of selenium. The method showed a good addition effect and reproducibility within a relative error of 5%. A comparison between the calibration curve of the standard solution of tellurium(IV) reacted with bismuthiol-2 and the calibration curve obtained from the extraction of tellurium(IV) with MIBK indicated that the extraction is complete. The results of the bismuthiol-2 method and the atomic absorption method agreed well for the same sample. (Iwakiri, K.)

  19. Quantitative analysis of titanium concentration using calibration-free laser-induced breakdown spectroscopy (LIBS)

    Science.gov (United States)

    Zaitun; Prasetyo, S.; Suliyanti, M. M.; Isnaeni; Herbani, Y.

    2018-03-01

    Laser-induced breakdown spectroscopy (LIBS) can be used for quantitative and qualitative analysis. Calibration-free LIBS (CF-LIBS) is a method to quantitatively analyze the concentration of elements in a sample under local thermodynamic equilibrium conditions without using matrix-matched calibration standards. In this study, we apply CF-LIBS to the quantitative analysis of Ti in a TiO2 sample. TiO2 powder was mixed with polyvinyl alcohol and formed into pellets. An Nd:YAG pulsed laser at a wavelength of 1064 nm was focused onto the sample to generate a plasma. The plasma spectrum was recorded using a spectrophotometer and then compared with NIST spectral lines to determine energy levels and other parameters. The plasma temperature obtained using a Boltzmann plot is 8127.29 K and the calculated electron density is 2.49×10^16 cm^-3. Finally, the concentration of Ti in the TiO2 sample obtained in this study is 97%, which is in close agreement with the sample certificate.
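
    The plasma temperature quoted above comes from a Boltzmann plot, i.e. a linear fit of ln(I*lambda/(g_k*A_ki)) against the upper-level energy E_k, whose slope equals -1/(k_B*T). The sketch below uses invented line parameters purely to show the shape of the calculation; in practice the line data would be taken from the NIST database as described.

        import numpy as np

        K_B_EV = 8.617333262e-5   # Boltzmann constant, eV/K

        # Hypothetical lines: intensity, wavelength (nm), g_k, A_ki (s^-1), E_k (eV).
        lines = np.array([
            [1200.0, 498.17, 11, 6.6e7, 3.32],
            [ 950.0, 499.11,  9, 5.3e7, 3.30],
            [ 400.0, 453.32,  9, 8.9e7, 4.52],
            [ 310.0, 466.76,  7, 6.3e7, 4.48],
        ])
        I, lam, g, A, E = lines.T

        y = np.log(I * lam / (g * A))            # Boltzmann-plot ordinate
        slope, intercept = np.polyfit(E, y, 1)   # slope = -1 / (k_B * T)
        print(f"Boltzmann-plot temperature: {-1.0 / (slope * K_B_EV):.0f} K")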

  20. Quantitative research.

    Science.gov (United States)

    Watson, Roger

    2015-04-01

    This article describes the basic tenets of quantitative research. The concepts of dependent and independent variables are addressed and the concept of measurement and its associated issues, such as error, reliability and validity, are explored. Experiments and surveys – the principal research designs in quantitative research – are described and key features explained. The importance of the double-blind randomised controlled trial is emphasised, alongside the importance of longitudinal surveys, as opposed to cross-sectional surveys. Essential features of data storage are covered, with an emphasis on safe, anonymous storage. Finally, the article explores the analysis of quantitative data, considering what may be analysed and the main uses of statistics in analysis.

  1. Activated sludge characterization through microscopy: A review on quantitative image analysis and chemometric techniques

    Energy Technology Data Exchange (ETDEWEB)

    Mesquita, Daniela P. [IBB-Institute for Biotechnology and Bioengineering, Centre of Biological Engineering, Universidade do Minho, Campus de Gualtar, 4710-057 Braga (Portugal); Amaral, A. Luís [IBB-Institute for Biotechnology and Bioengineering, Centre of Biological Engineering, Universidade do Minho, Campus de Gualtar, 4710-057 Braga (Portugal); Instituto Politécnico de Coimbra, ISEC, DEQB, Rua Pedro Nunes, Quinta da Nora, 3030-199 Coimbra (Portugal); Ferreira, Eugénio C., E-mail: ecferreira@deb.uminho.pt [IBB-Institute for Biotechnology and Bioengineering, Centre of Biological Engineering, Universidade do Minho, Campus de Gualtar, 4710-057 Braga (Portugal)

    2013-11-13

    Graphical abstract: -- Highlights: •Quantitative image analysis shows potential to monitor activated sludge systems. •Staining techniques increase the potential for detection of operational problems. •Chemometrics combined with quantitative image analysis is valuable for process monitoring. -- Abstract: In wastewater treatment processes, and particularly in activated sludge systems, efficiency is quite dependent on the operating conditions, and a number of problems may arise due to sludge structure and proliferation of specific microorganisms. In fact, bacterial communities and protozoa identification by microscopy inspection is already routinely employed in a considerable number of cases. Furthermore, quantitative image analysis techniques have been increasingly used throughout the years for the assessment of aggregates and filamentous bacteria properties. These procedures are able to provide an ever growing amount of data for wastewater treatment processes in which chemometric techniques can be a valuable tool. However, the determination of microbial communities’ properties remains a current challenge in spite of the great diversity of microscopy techniques applied. In this review, activated sludge characterization is discussed highlighting the aggregates structure and filamentous bacteria determination by image analysis on bright-field, phase-contrast, and fluorescence microscopy. An in-depth analysis is performed to summarize the many new findings that have been obtained, and future developments for these biological processes are further discussed.

  2. A quantitative analysis of the causes of the global climate change research distribution

    DEFF Research Database (Denmark)

    Pasgaard, Maya; Strange, Niels

    2013-01-01

    investigates whether the need for knowledge on climate changes in the most vulnerable regions of the world is met by the supply of knowledge measured by scientific research publications from the last decade. A quantitative analysis of more than 15,000 scientific publications from 197 countries investigates...... the poorer, fragile and more vulnerable regions of the world. A quantitative keywords analysis of all publications shows that different knowledge domains and research themes dominate across regions, reflecting the divergent global concerns in relation to climate change. In general, research on climate change...... the distribution of climate change research and the potential causes of this distribution. More than 13 explanatory variables representing vulnerability, geographical, demographical, economical and institutional indicators are included in the analysis. The results show that the supply of climate change knowledge...

  3. Where do overweight women in Ghana live? Answers from exploratory spatial data analysis

    Directory of Open Access Journals (Sweden)

    Fidelia A.A. Dake

    2012-03-01

    Full Text Available Contextual influence on health outcomes is increasingly becoming an important area of research. Analytical techniques such as spatial analysis help explain the variations and dynamics in health inequalities across different contexts and among different population groups. This paper explores spatial clustering in body mass index among Ghanaian women by analysing data from the 2008 Ghana Demographic and Health Survey using exploratory spatial data analysis techniques. Overweight was a more common occurrence in urban areas than in rural areas. Close to a quarter of the clusters in Ghana, mostly those in the southern sector, contained women who were overweight. Women who lived in clusters where the women were overweight were more likely to live around other clusters where the women were also overweight. The results suggest that the urban environment could be a contributing factor to the high levels of obesity in urban areas of Ghana. There is a need for researchers to include a spatial dimension in obesity research in Ghana, paying particular attention to the urban environment.
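
    The abstract does not state which exploratory spatial data analysis statistic was used, so the sketch below should be taken only as a generic example of how spatial clustering of a health outcome can be quantified with global Moran's I (values well above 0 indicate that neighbouring clusters have similar BMI levels). The coordinates and BMI values are simulated.

        import numpy as np
        from scipy.spatial import cKDTree

        def morans_i(values, coords, k=5):
            """Global Moran's I with row-standardised k-nearest-neighbour weights."""
            n = len(values)
            z = values - values.mean()
            _, idx = cKDTree(coords).query(coords, k=k + 1)   # first neighbour is the point itself
            w = np.zeros((n, n))
            for i, neighbours in enumerate(idx[:, 1:]):
                w[i, neighbours] = 1.0 / k                    # row-standardised weights
            return (n / w.sum()) * (z @ w @ z) / (z @ z)

        rng = np.random.default_rng(3)
        coords = rng.uniform(0, 100, size=(300, 2))              # hypothetical survey clusters
        bmi = 24 + 0.05 * coords[:, 1] + rng.normal(0, 1, 300)   # mild spatial gradient
        print(f"Moran's I = {morans_i(bmi, coords):.3f}")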

  4. SPATIAL ANALYSIS TO SUPPORT GEOGRAPHIC TARGETING OF GENOTYPES TO ENVIRONMENTS

    Directory of Open Access Journals (Sweden)

    Glenn eHyman

    2013-03-01

    Full Text Available Crop improvement efforts have benefited greatly from advances in available data, computing technology and methods for targeting genotypes to environments. These advances support the analysis of genotype by environment interactions to understand how well a genotype adapts to environmental conditions. This paper reviews the use of spatial analysis to support crop improvement research aimed at matching genotypes to their most appropriate environmental niches. Better data sets are now available on soils, weather and climate, elevation, vegetation, crop distribution and local conditions where genotypes are tested in experimental trial sites. The improved data are now combined with spatial analysis methods to compare environmental conditions across sites, create agro-ecological region maps and assess environment change. Climate, elevation and vegetation data sets are now widely available, supporting analyses that were much more difficult even five or ten years ago. While detailed soil data for many parts of the world remains difficult to acquire for crop improvement studies, new advances in digital soil mapping are likely to improve our capacity. Site analysis and matching and regional targeting methods have advanced in parallel to data and technology improvements. All these developments have increased our capacity to link genotype to phenotype and point to a vast potential to improve crop adaptation efforts.

  5. Computer aided approach to qualitative and quantitative common cause failure analysis for complex systems

    International Nuclear Information System (INIS)

    Cate, C.L.; Wagner, D.P.; Fussell, J.B.

    1977-01-01

    Common cause failure analysis, also called common mode failure analysis, is an integral part of a complete system reliability analysis. Existing methods of computer aided common cause failure analysis are extended by allowing analysis of the complex systems often encountered in practice. The methods aid in identifying potential common cause failures and also address quantitative common cause failure analysis

  6. A Spatial Analysis of the Potato Cyst Nematode Globodera pallida in Idaho.

    Science.gov (United States)

    Dandurand, Louise-Marie; Contina, Jean Bertrand; Knudsen, Guy R

    2018-03-13

    The potato cyst nematode (PCN), Globodera pallida, is a globally regulated and quarantine potato pest. It was detected for the first time in the U.S. in the state of Idaho in 2006. A spatial analysis was performed to: (i) understand the spatial arrangement of PCN infested fields in southern Idaho using spatial point pattern analysis; and (ii) evaluate the potential threat of PCN for entry to new areas using spatial interpolation techniques. Data point locations, cyst numbers and egg viability values for each infested field were collected by USDA-APHIS during 2006-2014. Results showed the presence of spatially clustered PCN infested fields (P = 0.003). We determined that the spread of PCN grew in diameter from the original center of infestation toward the southwest as an ellipsoidal-shaped cluster. Based on the aggregated spatial pattern of distribution and the low extent level of PCN infested fields in southern Idaho, we determined that PCN spread followed a contagion effect scenario, where nearby infested fields contributed to the infestation of new fields, probably through soil contaminated agricultural equipment or tubers. We determined that the recent PCN presence in southern Idaho is unlikely to be associated with new PCN entry from outside the state of Idaho. The relative aggregation of PCN infested fields, the low number of cysts recovered, and the low values in egg viability facilitate quarantine activities and confine this pest to a small area, which, in 2017, is estimated to be 1,233 hectares. The tools and methods provided in this study should facilitate comprehensive approaches to improve PCN control and eradication programs as well as to raise public awareness about this economically important potato pest.

  7. Tuberculosis DALY-Gap: Spatial and Quantitative Comparison of Disease Burden Across Urban Slum and Non-slum Census Tracts.

    Science.gov (United States)

    Marlow, Mariel A; Maciel, Ethel Leonor Noia; Sales, Carolina Maia Martins; Gomes, Teresa; Snyder, Robert E; Daumas, Regina Paiva; Riley, Lee W

    2015-08-01

    To quantitatively assess disease burden due to tuberculosis between populations residing in and outside of urban informal settlements in Rio de Janeiro, Brazil, we compared disability-adjusted life years (DALYs), or "DALY-gap." Using the 2010 Brazilian census definition of informal settlements as aglomerados subnormais (AGSN), we allocated tuberculosis (TB) DALYs to AGSN vs non-AGSN census tracts based on geocoded addresses of TB cases reported to the Brazilian Information System for Notifiable Diseases in 2005 and 2010. DALYs were calculated based on the 2010 Global Burden of Disease methodology. DALY-gap was calculated as the difference between age-adjusted DALYs/100,000 population between AGSN and non-AGSN. Total TB DALY in Rio in 2010 was 16,731 (266 DALYs/100,000). DALYs were higher in AGSN census tracts (306 vs 236 DALYs/100,000), yielding a DALY-gap of 70 DALYs/100,000. Attributable DALY fraction for living in an AGSN was 25.4%. DALY-gap was highest for males 40-59 years of age (501 DALYs/100,000) and in census tracts with <60% electricity (12,327 DALYs/100,000). DALY-gap comparison revealed spatial and quantitative differences in TB burden between slum vs non-slum census tracts that were not apparent using traditional measures of incidence and mortality. This metric could be applied to compare TB burden or burden for other diseases in mega-cities with large informal settlements for more targeted resource allocation and evaluation of intervention programs.
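
    The DALY-gap itself is just a difference of age-adjusted rates. The sketch below uses the rates quoted in the abstract; note that the final "excess fraction" line is a naive ratio and is not necessarily how the paper's 25.4% attributable fraction was computed.

        # DALY-gap: difference in age-adjusted DALY rates between slum (AGSN) and non-slum tracts.
        rate_agsn = 306.0      # DALYs per 100,000 in AGSN tracts (reported)
        rate_non_agsn = 236.0  # DALYs per 100,000 in non-AGSN tracts (reported)

        daly_gap = rate_agsn - rate_non_agsn
        print(f"DALY-gap = {daly_gap:.0f} DALYs/100,000")          # 70, as in the abstract

        # Simple excess fraction among AGSN residents (illustrative only).
        print(f"excess fraction = {daly_gap / rate_agsn:.1%}")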

  8. Schools, Air Pollution, and Active Transportation: An Exploratory Spatial Analysis of Calgary, Canada.

    Science.gov (United States)

    Bertazzon, Stefania; Shahid, Rizwan

    2017-07-25

    An exploratory spatial analysis investigates the location of schools in Calgary (Canada) in relation to air pollution and active transportation options. Air pollution exhibits marked spatial variation throughout the city, along with distinct spatial patterns in summer and winter; however, all school locations lie within low to moderate pollution levels. Conversely, the study shows that almost half of the schools lie in low walkability locations; likewise, transitability is low for 60% of schools, and only bikability is widespread, with 93% of schools in very bikable locations. School locations are subsequently categorized by pollution exposure and active transportation options. This analysis identifies and maps schools according to two levels of concern: schools in car-dependent locations and relatively high pollution; and schools in locations conducive of active transportation, yet exposed to relatively high pollution. The findings can be mapped and effectively communicated to the public, health practitioners, and school boards. The study contributes with an explicitly spatial approach to the intra-urban public health literature. Developed for a moderately polluted city, the methods can be extended to more severely polluted environments, to assist in developing spatial public health policies to improve respiratory outcomes, neurodevelopment, and metabolic and attention disorders in school-aged children.

  9. Schools, Air Pollution, and Active Transportation: An Exploratory Spatial Analysis of Calgary, Canada

    Science.gov (United States)

    Bertazzon, Stefania; Shahid, Rizwan

    2017-01-01

    An exploratory spatial analysis investigates the location of schools in Calgary (Canada) in relation to air pollution and active transportation options. Air pollution exhibits marked spatial variation throughout the city, along with distinct spatial patterns in summer and winter; however, all school locations lie within low to moderate pollution levels. Conversely, the study shows that almost half of the schools lie in low walkability locations; likewise, transitability is low for 60% of schools, and only bikability is widespread, with 93% of schools in very bikable locations. School locations are subsequently categorized by pollution exposure and active transportation options. This analysis identifies and maps schools according to two levels of concern: schools in car-dependent locations and relatively high pollution; and schools in locations conducive of active transportation, yet exposed to relatively high pollution. The findings can be mapped and effectively communicated to the public, health practitioners, and school boards. The study contributes with an explicitly spatial approach to the intra-urban public health literature. Developed for a moderately polluted city, the methods can be extended to more severely polluted environments, to assist in developing spatial public health policies to improve respiratory outcomes, neurodevelopment, and metabolic and attention disorders in school-aged children. PMID:28757577

  10. Residual analysis for spatial point processes

    DEFF Research Database (Denmark)

    Baddeley, A.; Turner, R.; Møller, Jesper

    We define residuals for point process models fitted to spatial point pattern data, and propose diagnostic plots based on these residuals. The techniques apply to any Gibbs point process model, which may exhibit spatial heterogeneity, interpoint interaction and dependence on spatial covariates. Ou...... or covariate effects. Q-Q plots of the residuals are effective in diagnosing interpoint interaction. Some existing ad hoc statistics of point patterns (quadrat counts, scan statistic, kernel smoothed intensity, Berman's diagnostic) are recovered as special cases....

  11. [Spatial tendency of urban land use in new Yinzhou Town of Ningbo City, Zhejiang Province of East China].

    Science.gov (United States)

    Jiang, Wen-Wei; Guo, Hui-Hui; Mei, Yan-Xia

    2012-03-01

    By adopting gradient analysis combined with an analysis of urban land use degree, this paper studied the spatial layout characteristics of residential and industrial land in the new Yinzhou Town, and explored the location characteristics of the various urban land uses by selecting public green land, public facilities, and roads as the location advantage factors. Gradient analysis could be effectively connected with the spatial layout of urban land use and quantitatively depict the spatial character of urban land use. In the new town, there was a new urban spatial center, mostly within a radius of 2 km; that is, the urban core area had an obvious location advantage along the cross-shaped axes of urban development. South of Yinzhou Avenue, the urban hinterland will soon be developed. In the future land use of the new town, the focus will be the reasonable transition of industrial land after the adjustment of the industrial structure, the highly efficient and intensive use of commercial land constrained by the compulsory conditions of the urban core area, and the protection of agricultural land in the southeastern urban-rural fringe.

  12. Quantitative phase analysis of a highly textured industrial sample using a Rietveld profile analysis

    International Nuclear Information System (INIS)

    Shin, Eunjoo; Huh, Moo-Young; Seong, Baek-Seok; Lee, Chang-Hee

    2001-01-01

    For the quantitative phase analysis of highly textured two-phase materials, samples with known weight fractions of zirconium and aluminum were prepared. Strong texture components prevailed in both the zirconium and the aluminum sheet. The diffraction patterns of the samples were measured by neutron diffraction and refined by the Rietveld method. The preferred orientation correction of the diffraction patterns was carried out by means of pole figures recalculated from the ODF. The present Rietveld analysis of various samples with different weight fractions showed that the absolute error of the calculated weight fractions was less than 7.1%. (author)

  13. The Quantitative Analysis of Chennai Automotive Industry Cluster

    Science.gov (United States)

    Bhaskaran, Ethirajan

    2016-07-01

    Chennai, also called the Detroit of India due to the presence of an automotive industry producing over 40% of India's vehicles and components, is the focus of this study. During 2001-2002, the Automotive Component Industries (ACI) in the Ambattur, Thirumalizai and Thirumudivakkam Industrial Estates, Chennai, faced problems with infrastructure, technology, procurement, production and marketing. The objective is to study the quantitative performance of the Chennai automotive industry cluster before (2001-2002) and after (2008-2009) the Cluster Development Approach (CDA). The methodology adopted is the collection of primary data from 100 ACI units using a quantitative questionnaire and analysis using Correlation Analysis (CA), Regression Analysis (RA), the Friedman Test (FMT), and the Kruskal-Wallis Test (KWT). The CA computed for the different sets of variables reveals a high degree of relationship between the variables studied. The RA models constructed establish a strong relationship between the dependent variable and a host of independent variables; the models proposed here reveal the approximate relationship in a closer form. The KWT shows that there is no significant difference between the three location clusters with respect to Net Profit, Production Cost, Marketing Costs, Procurement Costs and Gross Output, which supports the conclusion that each location has contributed uniformly to the development of the automobile component cluster. The FMT shows that there is no significant difference between industrial units in respect of costs such as Production, Infrastructure, Technology, Marketing and Net Profit. To conclude, the automotive industries have fully utilized the physical infrastructure and centralised facilities by adopting the CDA and now export their products to North America, South America, Europe, Australia, Africa and Asia. The value chain analysis models have been implemented in all the cluster units. This Cluster Development Approach (CDA) model can be implemented in industries of underdeveloped and developing countries for cost reduction and productivity
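
    A compact illustration of the test battery named above (correlation, regression, Friedman and Kruskal-Wallis tests) is given below using scipy on invented unit-level data; none of the numbers are from the survey, and the grouping into three locations is purely illustrative.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(4)
        n = 100                                      # hypothetical ACI units
        net_profit = rng.normal(50, 10, n)
        production_cost = 0.6 * net_profit + rng.normal(0, 5, n)
        marketing_cost = rng.normal(30, 8, n)
        location = rng.integers(0, 3, n)             # three industrial-estate locations

        r, p_r = stats.pearsonr(net_profit, production_cost)            # correlation analysis
        slope, intercept, r_val, p_reg, se = stats.linregress(net_profit, production_cost)

        groups = [net_profit[location == k] for k in range(3)]
        h_stat, p_kw = stats.kruskal(*groups)                           # Kruskal-Wallis across locations
        f_stat, p_fr = stats.friedmanchisquare(net_profit, production_cost, marketing_cost)

        print(f"Pearson r = {r:.2f} (p = {p_r:.3f}); regression slope = {slope:.2f}")
        print(f"Kruskal-Wallis p = {p_kw:.3f}; Friedman p = {p_fr:.3f}")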

  14. Multivariate reference technique for quantitative analysis of fiber-optic tissue Raman spectroscopy.

    Science.gov (United States)

    Bergholt, Mads Sylvest; Duraipandian, Shiyamala; Zheng, Wei; Huang, Zhiwei

    2013-12-03

    We report a novel method making use of multivariate reference signals of fused silica and sapphire Raman signals generated from a ball-lens fiber-optic Raman probe for quantitative analysis of in vivo tissue Raman measurements in real time. Partial least-squares (PLS) regression modeling is applied to extract the characteristic internal reference Raman signals (e.g., shoulder of the prominent fused silica boson peak (~130 cm^-1); distinct sapphire ball-lens peaks (380, 417, 646, and 751 cm^-1)) from the ball-lens fiber-optic Raman probe for quantitative analysis of fiber-optic Raman spectroscopy. To evaluate the analytical value of this novel multivariate reference technique, a rapid Raman spectroscopy system coupled with a ball-lens fiber-optic Raman probe is used for in vivo oral tissue Raman measurements (n = 25 subjects) under 785 nm laser excitation powers ranging from 5 to 65 mW. An accurate linear relationship (R^2 = 0.981) with a root-mean-square error of cross validation (RMSECV) of 2.5 mW can be obtained for predicting the laser excitation power changes based on a leave-one-subject-out cross-validation, which is superior to the normal univariate reference method (RMSE = 6.2 mW). A root-mean-square error of prediction (RMSEP) of 2.4 mW (R^2 = 0.985) can also be achieved for laser power prediction in real time when we applied the multivariate method independently on the five new subjects (n = 166 spectra). We further apply the multivariate reference technique for quantitative analysis of gelatin tissue phantoms that gives rise to an RMSEP of ~2.0% (R^2 = 0.998) independent of laser excitation power variations. This work demonstrates that multivariate reference technique can be advantageously used to monitor and correct the variations of laser excitation power and fiber coupling efficiency in situ for standardizing the tissue Raman intensity to realize quantitative analysis of tissue Raman measurements in vivo, which is particularly appealing in
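
    The multivariate reference idea can be schematised as regressing the known laser powers on each spectrum with partial least squares and reporting a cross-validated error. The spectra below are simulated with a few power-dependent reference bands; band positions, noise and the number of components are placeholders and do not correspond to the probe's actual silica and sapphire signals.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_predict

        rng = np.random.default_rng(5)
        n_spectra, n_channels = 166, 600
        power = rng.uniform(5, 65, n_spectra)                  # laser excitation power, mW

        # Simulated reference bands whose intensity scales with laser power, plus noise.
        channels = np.arange(n_channels)
        bands = sum(np.exp(-0.5 * ((channels - c) / 6.0) ** 2) for c in (80, 200, 330, 450))
        spectra = power[:, None] * bands[None, :] + rng.normal(0, 2.0, (n_spectra, n_channels))

        pls = PLSRegression(n_components=3)
        pred = cross_val_predict(pls, spectra, power, cv=5).ravel()
        rmsecv = np.sqrt(np.mean((pred - power) ** 2))
        print(f"RMSECV = {rmsecv:.2f} mW")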

  15. Three-way methods for the analysis of qualitative and quantitative two-way data.

    NARCIS (Netherlands)

    Kiers, Hendrik Albert Lambertus

    1989-01-01

    A problem often occurring in exploratory data analysis is how to summarize large numbers of variables in terms of a smaller number of dimensions. When the variables are quantitative, one may resort to Principal Components Analysis (PCA). When qualitative (categorical) variables are involved, one may

  16. Toward standardized quantitative image quality (IQ) assessment in computed tomography (CT): A comprehensive framework for automated and comparative IQ analysis based on ICRU Report 87.

    Science.gov (United States)

    Pahn, Gregor; Skornitzke, Stephan; Schlemmer, Hans-Peter; Kauczor, Hans-Ulrich; Stiller, Wolfram

    2016-01-01

    Based on the guidelines from "Report 87: Radiation Dose and Image-quality Assessment in Computed Tomography" of the International Commission on Radiation Units and Measurements (ICRU), a software framework for automated quantitative image quality analysis was developed and its usability for a variety of scientific questions demonstrated. The extendable framework currently implements the calculation of the recommended Fourier image quality (IQ) metrics modulation transfer function (MTF) and noise-power spectrum (NPS), and additional IQ quantities such as noise magnitude, CT number accuracy, uniformity across the field-of-view, contrast-to-noise ratio (CNR) and signal-to-noise ratio (SNR) of simulated lesions for a commercially available cone-beam phantom. Sample image data were acquired with different scan and reconstruction settings on CT systems from different manufacturers. Spatial resolution is analyzed in terms of edge-spread function, line-spread-function, and MTF. 3D NPS is calculated according to ICRU Report 87, and condensed to 2D and radially averaged 1D representations. Noise magnitude, CT numbers, and uniformity of these quantities are assessed on large samples of ROIs. Low-contrast resolution (CNR, SNR) is quantitatively evaluated as a function of lesion contrast and diameter. Simultaneous automated processing of several image datasets allows for straightforward comparative assessment. The presented framework enables systematic, reproducible, automated and time-efficient quantitative IQ analysis. Consistent application of the ICRU guidelines facilitates standardization of quantitative assessment not only for routine quality assurance, but for a number of research questions, e.g. the comparison of different scanner models or acquisition protocols, and the evaluation of new technology or reconstruction methods. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
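
    As one example of the Fourier metrics listed above, the noise-power spectrum of a uniform region can be estimated by squaring the magnitude of the 2D FFT of detrended ROIs and averaging over an ensemble. The sketch below follows the general ICRU 87 recipe only loosely (simple mean detrending, synthetic white noise) and is not the framework's implementation.

        import numpy as np

        def nps_2d(rois, pixel_size_mm):
            """Ensemble-averaged 2D noise-power spectrum from a stack of uniform ROIs."""
            n_roi, ny, nx = rois.shape
            detrended = rois - rois.mean(axis=(1, 2), keepdims=True)   # simple mean detrending
            dft = np.fft.fftshift(np.fft.fft2(detrended), axes=(1, 2))
            return (np.abs(dft) ** 2).mean(axis=0) * pixel_size_mm**2 / (nx * ny)

        rng = np.random.default_rng(6)
        rois = rng.normal(0, 10, size=(64, 128, 128))     # synthetic uncorrelated-noise ROIs
        nps = nps_2d(rois, pixel_size_mm=0.5)

        # For white noise the integral of the NPS approximates the pixel variance (~100 here).
        freq_step = 1.0 / (0.5 * 128)
        print(f"NPS integral ~ {nps.sum() * freq_step**2:.1f}")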

  17. Identifying the relevant features of the National Digital Cadastral Database (NDCDB) for spatial analysis by using the Delphi Technique

    Science.gov (United States)

    Halim, N. Z. A.; Sulaiman, S. A.; Talib, K.; Ng, E. G.

    2018-02-01

    This paper explains the process carried out in identifying the relevant features of the National Digital Cadastral Database (NDCDB) for spatial analysis. The research was initially a part of a larger research exercise to identify the significance of NDCDB from the legal, technical, role and land-based analysis perspectives. The research methodology of applying the Delphi technique is substantially discussed in this paper. A heterogeneous panel of 14 experts was created to determine the importance of NDCDB from the technical relevance standpoint. Three statements describing the relevant features of NDCDB for spatial analysis were established after three rounds of consensus building. It highlighted the NDCDB’s characteristics such as its spatial accuracy, functions, and criteria as a facilitating tool for spatial analysis. By recognising the relevant features of NDCDB for spatial analysis in this study, practical application of NDCDB for various analysis and purpose can be widely implemented.

  18. Integration of Qualitative and Quantitative Methods: Building and Interpreting Clusters from Grounded Theory and Discourse Analysis

    Directory of Open Access Journals (Sweden)

    Aldo Merlino

    2007-01-01

    Full Text Available Qualitative methods present a wide spectrum of application possibilities as well as opportunities for combining qualitative and quantitative methods. In the social sciences, fruitful theoretical discussions and a great deal of empirical research have taken place. This article introduces an empirical investigation which demonstrates the logic of combining methodologies as well as the collection and interpretation, both sequential and simultaneous, of qualitative and quantitative data. Specifically, the investigation process is described, beginning with a grounded theory methodology and its combination with the techniques of structural semiotics discourse analysis to generate, in a first phase, an instrument for quantitative measurement and to understand, in a second phase, the clusters obtained by quantitative analysis. This work illustrates how qualitative methods allow for the comprehension of the discursive and behavioral elements under study, and how they function as support for making sense of and giving meaning to quantitative data. URN: urn:nbn:de:0114-fqs0701219

  19. Winston-Lutz Test: A quantitative analysis

    International Nuclear Information System (INIS)

    Pereira, Aline Garcia; Nandi, Dorival Menegaz; Saraiva, Crystian Wilian Chagas

    2017-01-01

    Objective: To describe a method of quantitative analysis for the Winston-Lutz test. Materials and methods: The research is a qualitative exploratory study. The materials used were portal film, the Winston-Lutz test tools and the Omni Pro software. Sixteen portal films were used as samples and were analyzed by five different technicians to measure the deviation between the radiation and mechanical isocenters. Results: Among the results, two combinations with offset values greater than 1 mm were identified. In addition, when the method developed here was compared with the one previously studied, the data obtained were very close, with a maximum percentage deviation of 32.5%, which demonstrates its efficacy in reducing dependence on the performer. Conclusion: The results show that the method is reproducible and practical, which constitutes one of the fundamental factors for its implementation. (author)
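
    The quantity reported in a Winston-Lutz analysis is the distance between the centre of the ball-bearing shadow (a surrogate for the mechanical isocentre) and the centre of the radiation field on each portal image. A minimal sketch is given below, assuming both centres have already been segmented to sub-pixel coordinates; the pixel size and the centre coordinates are placeholders, not measurements from the study.

        import numpy as np

        def isocenter_deviation(ball_center_px, field_center_px, pixel_mm):
            """2D deviation (mm) between ball-bearing and radiation-field centres on one image."""
            delta = (np.asarray(field_center_px) - np.asarray(ball_center_px)) * pixel_mm
            return float(np.hypot(*delta))

        pixel_mm = 0.39
        # Hypothetical centre pairs (ball, field) for a few gantry/collimator/couch combinations.
        measurements = [((512.2, 511.8), (513.9, 512.6)),
                        ((511.7, 512.4), (510.5, 511.1)),
                        ((512.0, 512.0), (514.6, 512.3))]

        deviations = [isocenter_deviation(b, f, pixel_mm) for b, f in measurements]
        print([f"{d:.2f} mm" for d in deviations])
        print(f"max deviation: {max(deviations):.2f} mm (a 1 mm tolerance is commonly used)")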

  20. Aspects of the incorporation of spatial data into radioecological and restoration analysis

    International Nuclear Information System (INIS)

    Beresford, N.A.; Wright, S.M.; Howard, B.J.; Crout, N.M.J.; Arkhipov, A.; Voigt, G.

    2002-01-01

    In the last decade geographical information systems have been increasingly used to incorporate spatial data into radioecological analysis. This has allowed the development of models with spatially variable outputs. Two main approaches have been adopted in the development of spatial models. Empirical Tag-based models applied across a range of spatial scales utilize underlying soil type maps and readily available radioecological data. Soil processes can also be modelled to allow the dynamic prediction of radionuclide soil-to-plant transfer. We discuss a dynamic semi-mechanistic radiocaesium soil-to-plant transfer model, which utilizes readily available spatially variable soil parameters. Both approaches allow the identification of areas that may be vulnerable to radionuclide deposition, therefore enabling the targeting of intervention measures. Improved estimates of radionuclide fluxes and ingestion doses can be achieved by incorporating spatially varying inputs such as agricultural production and dietary habits into these models. In this paper, aspects of such models, including data requirements, implementation and outputs, are discussed and critically evaluated. The relative merits and disadvantages of the two spatial model approaches adopted within radioecology are discussed. We consider the usefulness of such models to aid decision-makers and assess the requirements and potential of further application within radiological protection. (author)

  1. A scanning electron microscope method for automated, quantitative analysis of mineral matter in coal

    Energy Technology Data Exchange (ETDEWEB)

    Creelman, R.A.; Ward, C.R. [R.A. Creelman and Associates, Epping, NSW (Australia)

    1996-07-01

    Quantitative mineralogical analysis has been carried out on a series of nine coal samples from Australia, South Africa and China using a newly-developed automated image analysis system coupled to a scanning electron microscope. The image analysis system (QEM*SEM) gathers X-ray spectra and backscattered electron data from a number of points on a conventional grain-mount polished section under the SEM, and interprets the data from each point in mineralogical terms. The cumulative data in each case were integrated to provide a volumetric modal analysis of the species present in the coal samples, expressed as percentages of the respective coals' mineral matter. The QEM*SEM results were compared to data obtained from the same samples using other methods of quantitative mineralogical analysis, namely X-ray diffraction of the low-temperature oxygen-plasma ash and normative calculation from the (high-temperature) ash analysis and carbonate CO2 data. Good agreement was obtained from all three methods for quartz in the coals, and also for most of the iron-bearing minerals. The correlation between results from the different methods was less strong, however, for individual clay minerals, or for minerals such as calcite, dolomite and phosphate species that made up only relatively small proportions of the mineral matter. The image analysis approach, using the electron microscope for mineralogical studies, has significant potential as a supplement to optical microscopy in quantitative coal characterisation. 36 refs., 3 figs., 4 tabs.

  2. Quantitative analysis technique for Xenon in PWR spent fuel by using WDS

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, H. M.; Kim, D. S.; Seo, H. S.; Ju, J. S.; Jang, J. N.; Yang, Y. S.; Park, S. D. [KAERI, Daejeon (Korea, Republic of)

    2012-01-15

    This study includes three processes. First, peak centering of the X-ray line was performed after diffraction for the Xenon La1 line was set up, and the Xe La1 peak was identified in a PWR spent fuel sample. Second, standard intensities of Xe were obtained by interpolation of the La1 intensities from a series of elements on each side of xenon, and Xe intensities across the radial direction of a PWR spent fuel sample were then measured by WDS-SEM. Third, the electron and X-ray depth distributions for quantitative electron probe microanalysis were simulated with the CASINO Monte Carlo program to perform the matrix correction for the PWR spent fuel sample. Finally, a method and procedure for the local quantitative analysis of Xenon were developed in this study.

  3. Quantitative analysis technique for Xenon in PWR spent fuel by using WDS

    International Nuclear Information System (INIS)

    Kwon, H. M.; Kim, D. S.; Seo, H. S.; Ju, J. S.; Jang, J. N.; Yang, Y. S.; Park, S. D.

    2012-01-01

    This study includes three processes. First, peak centering of the X-ray line was performed after diffraction for the Xenon La1 line was set up, and the Xe La1 peak was identified in a PWR spent fuel sample. Second, standard intensities of Xe were obtained by interpolation of the La1 intensities from a series of elements on each side of xenon, and Xe intensities across the radial direction of a PWR spent fuel sample were then measured by WDS-SEM. Third, the electron and X-ray depth distributions for quantitative electron probe microanalysis were simulated with the CASINO Monte Carlo program to perform the matrix correction for the PWR spent fuel sample. Finally, a method and procedure for the local quantitative analysis of Xenon were developed in this study

  4. Visuo-Spatial Performance in Autism: A Meta-analysis

    OpenAIRE

    Muth, Anne; Honekopp, Johannes; Falter, Christine

    2014-01-01

    Visuo-spatial skills are believed to be enhanced in autism spectrum disorders (ASDs). This meta-analysis tests the current state of evidence for Figure Disembedding, Block Design, Mental Rotation and Navon tasks in ASD and neurotypicals. Block Design (d = 0.32) and Figure Disembedding (d = 0.26) showed superior performance for ASD with large heterogeneity that is unaccounted for. No clear differences were found for Mental Rotation. ASD samples showed a stronger local processing preference for...

  5. Feasibility of high-resolution quantitative perfusion analysis in patients with heart failure.

    Science.gov (United States)

    Sammut, Eva; Zarinabad, Niloufar; Wesolowski, Roman; Morton, Geraint; Chen, Zhong; Sohal, Manav; Carr-White, Gerry; Razavi, Reza; Chiribiri, Amedeo

    2015-02-12

    Cardiac magnetic resonance (CMR) is playing an expanding role in the assessment of patients with heart failure (HF). The assessment of myocardial perfusion status in HF can be challenging due to left ventricular (LV) remodelling and wall thinning, coexistent scar and respiratory artefacts. The aim of this study was to assess the feasibility of quantitative CMR myocardial perfusion analysis in patients with HF. A group of 58 patients with heart failure (HF; left ventricular ejection fraction, LVEF ≤ 50%) and 33 patients with normal LVEF (LVEF >50%), referred for suspected coronary artery disease, were studied. All subjects underwent quantitative first-pass stress perfusion imaging using adenosine according to standard acquisition protocols. The feasibility of quantitative perfusion analysis was then assessed using high-resolution, 3 T kt perfusion and voxel-wise Fermi deconvolution. 30/58 (52%) subjects in the HF group had underlying ischaemic aetiology. Perfusion abnormalities were seen amongst patients with ischaemic HF and patients with normal LV function. No regional perfusion defect was observed in the non-ischaemic HF group. Good agreement was found between visual and quantitative analysis across all groups. Absolute stress perfusion rate, myocardial perfusion reserve (MPR) and endocardial-epicardial MPR ratio identified areas with abnormal perfusion in the ischaemic HF group (p = 0.02; p = 0.04; p = 0.02, respectively). In the Normal LV group, MPR and endocardial-epicardial MPR ratio were able to distinguish between normal and abnormal segments (p = 0.04; p = 0.02 respectively). No significant differences of absolute stress perfusion rate or MPR were observed comparing visually normal segments amongst groups. Our results demonstrate the feasibility of high-resolution voxel-wise perfusion assessment in patients with HF.

  6. A novel iris transillumination grading scale allowing flexible assessment with quantitative image analysis and visual matching.

    Science.gov (United States)

    Wang, Chen; Brancusi, Flavia; Valivullah, Zaheer M; Anderson, Michael G; Cunningham, Denise; Hedberg-Buenz, Adam; Power, Bradley; Simeonov, Dimitre; Gahl, William A; Zein, Wadih M; Adams, David R; Brooks, Brian

    2018-01-01

    To develop a sensitive scale of iris transillumination suitable for clinical and research use, with the capability of either quantitative analysis or visual matching of images. Iris transillumination photographic images were used from 70 study subjects with ocular or oculocutaneous albinism. Subjects represented a broad range of ocular pigmentation. A subset of images was subjected to image analysis and ranking by both expert and nonexpert reviewers. Quantitative ordering of images was compared with ordering by visual inspection. Images were binned to establish an 8-point scale. Ranking consistency was evaluated using the Kendall rank correlation coefficient (Kendall's tau). Visual ranking results were assessed using Kendall's coefficient of concordance (Kendall's W) analysis. There was a high degree of correlation among the image analysis, expert-based and non-expert-based image rankings. Pairwise comparisons of the quantitative ranking with each reviewer generated an average Kendall's tau of 0.83 ± 0.04 (SD). Inter-rater correlation was also high with Kendall's W of 0.96, 0.95, and 0.95 for nonexpert, expert, and all reviewers, respectively. The current standard for assessing iris transillumination is expert assessment of clinical exam findings. We adapted an image-analysis technique to generate quantitative transillumination values. Quantitative ranking was shown to be highly similar to a ranking produced by both expert and nonexpert reviewers. This finding suggests that the image characteristics used to quantify iris transillumination do not require expert interpretation. Inter-rater rankings were also highly similar, suggesting that varied methods of transillumination ranking are robust in terms of producing reproducible results.
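
    The two ranking statistics used in this record are standard and easy to reproduce; the sketch below computes Kendall's tau against a quantitative ordering and Kendall's W across raters for an invented 8-image example (no ties assumed).

```python
import numpy as np
from scipy.stats import kendalltau

# Illustrative data: 3 reviewers ranking 8 binned iris images (1 = least
# transillumination). The study's actual rankings are not reproduced here.
ranks = np.array([
    [1, 2, 3, 4, 5, 6, 7, 8],   # quantitative image-analysis ordering
    [1, 3, 2, 4, 5, 6, 8, 7],   # expert reviewer
    [2, 1, 3, 4, 6, 5, 7, 8],   # non-expert reviewer
])

# Pairwise agreement with the quantitative ordering (Kendall's tau).
for i in (1, 2):
    tau, p = kendalltau(ranks[0], ranks[i])
    print(f"reviewer {i}: tau = {tau:.2f} (p = {p:.3f})")

# Kendall's coefficient of concordance (W) across all m raters and n items.
m, n = ranks.shape
rank_sums = ranks.sum(axis=0)
s = ((rank_sums - rank_sums.mean()) ** 2).sum()
w = 12.0 * s / (m ** 2 * (n ** 3 - n))
print(f"Kendall's W = {w:.2f}")
```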

  7. Lab-on-capillary: a rapid, simple and quantitative genetic analysis platform integrating nucleic acid extraction, amplification and detection.

    Science.gov (United States)

    Fu, Yu; Zhou, Xiaoming; Xing, Da

    2017-12-05

    In this work, we describe for the first time a genetic diagnosis platform employing a polydiallyldimethylammonium chloride (PDDA)-modified capillary and a liquid-based thermalization system for rapid, simple and quantitative DNA analysis with minimal user interaction. Positively charged PDDA is modified on the inner surface of the silicon dioxide capillary by using an electrostatic self-assembly approach that allows the negatively charged DNA to be separated from the lysate in less than 20 seconds. The capillary loaded with the PCR mix is incorporated in the thermalization system, which can achieve on-site real-time PCR. This system is based on the circulation of pre-heated liquids in the chamber, allowing for high-speed thermalization of the capillary and fast amplification. Multiple targets can be simultaneously analysed with multiplex spatial melting. Starting with live Escherichia coli (E. coli) cells in milk, as a realistic sample, the current method can achieve DNA extraction, amplification, and detection within 40 min.

  8. a Temporal and Spatial Analysis of Urban Heat Island in Basin City Utilizing Remote Sensing Techniques

    Science.gov (United States)

    Chang, Hsiao-Tung

    2016-06-01

    The urban heat island (UHI) has become a key factor in the deterioration of the urban ecological environment. A spatial-temporal analysis of the UHI of a basin city, together with a quantitative evaluation of the effect of rapid urbanization, provides a theoretical foundation for mitigating the UHI effect. Based on Landsat 8, ETM+ and TM images of the Taipei basin from 1990 to 2015, this article retrieves the land surface temperature (LST) around the summer solstice of each year and then analyses the spatial-temporal pattern and evolution of the UHI in the Taipei basin over that period. The results show that, as the built-up district expanded, the UHI area grew steadily from the city centre towards the suburbs. In addition to higher temperatures in the city centre, the UHI of the Taipei basin also shows relatively high temperatures gathered along the boundaries at the foot of the surrounding mountains, a pattern referred to here as a "sinking heat island". From 1990 to 2000, the areas of stronger UHI differed markedly with land use type change driven by public infrastructure works; over the following 15 years, up to 2015, the building density of the urban area increased gradually and the UHI intensified in step with urban land use density. The UHI hot spots in the Taipei basin show the same characteristics. The results suggest that anthropogenic heat release probably plays a significant role in the UHI effect and must be considered in urban planning adaptation strategies.

  9. A TEMPORAL AND SPATIAL ANALYSIS OF URBAN HEAT ISLAND IN BASIN CITY UTILIZING REMOTE SENSING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    H.-T. Chang

    2016-06-01

    The urban heat island (UHI) has become a key factor in the deterioration of the urban ecological environment. A spatial-temporal analysis of the UHI of a basin city, together with a quantitative evaluation of the effect of rapid urbanization, provides a theoretical foundation for mitigating the UHI effect. Based on Landsat 8, ETM+ and TM images of the Taipei basin from 1990 to 2015, this article retrieves the land surface temperature (LST) around the summer solstice of each year and then analyses the spatial-temporal pattern and evolution of the UHI in the Taipei basin over that period. The results show that, as the built-up district expanded, the UHI area grew steadily from the city centre towards the suburbs. In addition to higher temperatures in the city centre, the UHI of the Taipei basin also shows relatively high temperatures gathered along the boundaries at the foot of the surrounding mountains, a pattern referred to here as a "sinking heat island". From 1990 to 2000, the areas of stronger UHI differed markedly with land use type change driven by public infrastructure works; over the following 15 years, up to 2015, the building density of the urban area increased gradually and the UHI intensified in step with urban land use density. The UHI hot spots in the Taipei basin show the same characteristics. The results suggest that anthropogenic heat release probably plays a significant role in the UHI effect and must be considered in urban planning adaptation strategies.

  10. Quantitative security analysis for programs with low input and noisy output

    NARCIS (Netherlands)

    Ngo, Minh Tri; Huisman, Marieke

    Classical quantitative information flow analysis often considers a system as an information-theoretic channel, where private data are the only inputs and public data are the outputs. However, for systems where an attacker is able to influence the initial values of public data, these should also be

  11. Quantitative multiphase analysis of archaeological bronzes by neutron diffraction

    CERN Document Server

    Siano, S; Celli, M; Pini, R; Salimbeni, R; Zoppi, M; Kockelmann, W A; Iozzo, M; Miccio, M; Moze, O

    2002-01-01

    In this paper, we report the first investigation of the potential of neutron diffraction to characterize archaeological bronze artifacts. The feasibility of phase and structural analysis was first demonstrated on standardised specimens with a typical bronze alloy composition, realised through different hardening and annealing cycles that simulate possible ancient working techniques. The resulting Bragg peak widths depended strongly on the working treatment, thus providing an important analytical handle for investigating ancient fabrication techniques. The diagnostic criteria developed on the standardised specimens were then applied to two Etruscan museum pieces. Quantitative multiphase analysis by Rietveld refinement of the diffraction patterns was successfully demonstrated. Furthermore, the analysis of patterns associated with different elements of the artifacts also pointed to some distinctive prospects for neutron diffraction diagnostics in archaeometric applications. (orig.)

  12. Quantitative ferromagnetic resonance analysis of CD 133 stem cells labeled with iron oxide nanoparticles

    Energy Technology Data Exchange (ETDEWEB)

    Gamarra, L F; Pavon, L F; Marti, L C; Moreira-Filho, C A; Amaro, E Jr [Instituto Israelita de Ensino e Pesquisa Albert Einstein, IIEPAE, Sao Paulo 05651-901 (Brazil); Pontuschka, W M; Mamani, J B [Instituto de Fisica, Universidade de Sao Paulo, Sao Paulo 05315-970 (Brazil); Costa-Filho, A J; Vieira, E D [Instituto de Fisica de Sao Carlos, Universidade de Sao Paulo, Sao Carlos 13560-970 (Brazil)], E-mail: lgamarra@einstein.br

    2008-05-21

    The aim of this work is to provide a quantitative method for analysing the concentration of superparamagnetic iron oxide nanoparticles (SPION), determined by means of ferromagnetic resonance (FMR), with the nanoparticles coupled to a specific antibody (AC133), and thus to express the antigenic labeling evidence for the CD133⁺ stem cells. The FMR efficiency and sensitivity proved adequate for detecting and quantifying the low amounts of iron in the CD133⁺ cells (≈6.16 × 10⁵ pg in a volume of 2 μl containing 4.5 × 10¹¹ SPION). The quantitative method yielded 1.70 × 10⁻¹³ mol of Fe (9.5 pg), or 7.0 × 10⁶ nanoparticles per cell. For the quantification analysis via the FMR technique it was necessary to carry out a preliminary quantitative visualization of the iron oxide-labeled cells in order to ensure that the nanoparticles coupled to the antibodies are indeed bound to the antigen at the stem cell surface and that the cellular morphology was conserved, as proof of the validity of this method. The quantitative analysis by means of FMR is necessary for determining the signal intensity in molecular imaging studies by magnetic resonance imaging (MRI).

  13. X-ray quantitative analysis on spallation response in high purity copper under sweeping detonation

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Yang, E-mail: yangyanggroup@163.com [School of Material Science and Engineering, Central South University, Changsha 410083 (China); Institute of Fluid Physics, China Academy of Engineering Physics, Mianyang 621900 (China); National Key Laboratory of Explosion Science and Technology, Beijing Institute of Technology, Beijing 100081 (China); Key Laboratory of Nonferrous Metals Material Science and Engineering of Ministry of Education, Central South University, Changsha 410083 (China); Chen, Jixiong; Peng, Zhiqiang [School of Material Science and Engineering, Central South University, Changsha 410083 (China); Guo, Zhaoliang; Tang, Tiegang; Hu, Haibo [Institute of Fluid Physics, China Academy of Engineering Physics, Mianyang 621900 (China); Hu, Yanan [Shanghai Institute of Applied Physics, Chinese Academy of Sciences, Shanghai 201800 (China)

    2016-06-14

    The spall behavior of high-purity copper plates with different heat treatment histories was characterized quantitatively in 3-D using X-ray computed tomography (XRCT). The effects of shock stress and grain size on the spatial distribution and morphology of voids in incipiently spalled samples are discussed. The results reveal that, in samples with similar microstructure, the range of the void distribution decreases with increasing shock stress. Characteristic parameters (such as the mean elongation, mean flatness and mean sphericity of the voids) were determined using XRCT as functions of shock stress and grain size. The quantitative analyses of the spallation datasets yield functional relationships between the microscopic parameters of the spallation voids (such as volume and frequency) and the microstructure. The XRCT observations show that voids are more prone to coalescence in the thermo-mechanically treated (TMT) sample, although the final maximum and mean void volumes are smaller than those of the annealed sample. This is attributed to the smaller grain size of the TMT sample, which provides more nucleation sites for voids, bringing the voids closer together and making coalescence easier, so that flat voids ultimately form.

  14. A hybrid spatial-spectral denoising method for infrared hyperspectral images using 2DPCA

    Science.gov (United States)

    Huang, Jun; Ma, Yong; Mei, Xiaoguang; Fan, Fan

    2016-11-01

    The traditional noise reduction methods for 3-D infrared hyperspectral images typically operate independently in either the spatial or spectral domain, and such methods overlook the relationship between the two domains. To address this issue, we propose a hybrid spatial-spectral method in this paper to link both domains. First, principal component analysis and bivariate wavelet shrinkage are performed in the 2-D spatial domain. Second, 2-D principal component analysis transformation is conducted in the 1-D spectral domain to separate the basic components from detail ones. The energy distribution of noise is unaffected by orthogonal transformation; therefore, the signal-to-noise ratio of each component is used as a criterion to determine whether a component should be protected from over-denoising or denoised with certain 1-D denoising methods. This study implements the 1-D wavelet shrinking threshold method based on Stein's unbiased risk estimator, and the quantitative results on publicly available datasets demonstrate that our method can improve denoising performance more effectively than other state-of-the-art methods can.
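
    The following sketch is not the authors' pipeline, only a minimal illustration of the two ingredients they combine: a principal component transform along the spectral axis and soft wavelet shrinkage of the noisier component images in the spatial domain. The cube, wavelet choice and number of protected components are assumptions.

```python
import numpy as np
import pywt

rng = np.random.default_rng(0)
cube = rng.normal(0.0, 1.0, (64, 64, 40))       # assumed hyperspectral cube (y, x, band)

# Spectral step: PCA across bands separates "basic" from "detail" components.
flat = cube.reshape(-1, cube.shape[2])
mean = flat.mean(axis=0)
_, _, vt = np.linalg.svd(flat - mean, full_matrices=False)
scores = (flat - mean) @ vt.T                    # component images, ordered by variance

# Spatial step: soft wavelet shrinkage of the low-variance (noise-dominated) components.
def denoise_image(img, wavelet="db4", level=2):
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745        # robust noise estimate
    thr = sigma * np.sqrt(2.0 * np.log(img.size))             # universal threshold
    shrunk = [coeffs[0]] + [
        tuple(pywt.threshold(c, thr, mode="soft") for c in lvl) for lvl in coeffs[1:]
    ]
    return pywt.waverec2(shrunk, wavelet)[: img.shape[0], : img.shape[1]]

n_keep = 5                                       # assumed number of protected components
for k in range(n_keep, scores.shape[1]):
    band = scores[:, k].reshape(cube.shape[:2])
    scores[:, k] = denoise_image(band).ravel()

denoised = (scores @ vt + mean).reshape(cube.shape)
print(denoised.shape)
```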

  15. Assessment of flood susceptible areas using spatially explicit, probabilistic multi-criteria decision analysis

    Science.gov (United States)

    Tang, Zhongqian; Zhang, Hua; Yi, Shanzhen; Xiao, Yangfan

    2018-03-01

    GIS-based multi-criteria decision analysis (MCDA) is increasingly used to support flood risk assessment. However, conventional GIS-MCDA methods fail to adequately represent spatial variability and are accompanied by considerable uncertainty. It is, thus, important to incorporate spatial variability and uncertainty into GIS-based decision analysis procedures. This research develops a spatially explicit, probabilistic GIS-MCDA approach for the delineation of potentially flood susceptible areas. The approach integrates the probabilistic and the local ordered weighted averaging (OWA) methods via Monte Carlo simulation, to take into account the uncertainty related to criteria weights, the spatial heterogeneity of preferences and the risk attitude of the analyst. The approach is applied to a pilot study for Gucheng County, central China, heavily affected by the hazardous 2012 flood. A GIS database of six geomorphological and hydrometeorological factors for the evaluation of susceptibility was created. Moreover, uncertainty and sensitivity analyses were performed to investigate the robustness of the model. The results indicate that the ensemble method improves the robustness of the model outcomes with respect to variation in criteria weights and identifies which criteria weights are most responsible for the variability of model outcomes. Therefore, the proposed approach is an improvement over the conventional deterministic method and provides a more rational, objective and unbiased tool for flood susceptibility evaluation.
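
    A minimal sketch of the probabilistic OWA idea described above: criterion weights are drawn repeatedly (Monte Carlo) while order weights encode the analyst's risk attitude, yielding an ensemble mean and an uncertainty map. Sizes, weights and scores are invented; this is not the study's implementation.

```python
import numpy as np

rng = np.random.default_rng(42)
n_cells, n_criteria, n_draws = 1000, 6, 500      # assumed sizes, not from the study

# Standardized criterion scores per raster cell (higher = more flood susceptible).
criteria = rng.random((n_cells, n_criteria))

def owa(values, criterion_weights, order_weights):
    """Ordered weighted averaging: apply criterion weights, then order weights
    to the weighted values sorted from largest to smallest in each cell."""
    weighted = values * criterion_weights
    idx = np.argsort(-weighted, axis=1)          # descending order per cell
    return np.take_along_axis(weighted, idx, axis=1) @ order_weights

# Order weights encode the analyst's risk attitude (here, mildly risk-averse).
order_w = np.array([0.25, 0.2, 0.2, 0.15, 0.1, 0.1])

# Monte Carlo over uncertain criterion weights (Dirichlet keeps them summing to 1).
susceptibility = np.empty((n_draws, n_cells))
for d in range(n_draws):
    crit_w = rng.dirichlet(np.ones(n_criteria))
    susceptibility[d] = owa(criteria, crit_w, order_w)

mean_score = susceptibility.mean(axis=0)         # ensemble susceptibility surface
score_sd = susceptibility.std(axis=0)            # per-cell uncertainty map
print(mean_score[:5], score_sd[:5])
```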

  16. A spatial cluster analysis of tractor overturns in Kentucky from 1960 to 2002.

    Directory of Open Access Journals (Sweden)

    Daniel M Saman

    Full Text Available Agricultural tractor overturns without rollover protective structures are the leading cause of farm fatalities in the United States. To our knowledge, no studies have incorporated the spatial scan statistic in identifying high-risk areas for tractor overturns. The aim of this study was to determine whether tractor overturns cluster in certain parts of Kentucky and identify factors associated with tractor overturns.A spatial statistical analysis using Kulldorff's spatial scan statistic was performed to identify county clusters at greatest risk for tractor overturns. A regression analysis was then performed to identify factors associated with tractor overturns.The spatial analysis revealed a cluster of higher than expected tractor overturns in four counties in northern Kentucky (RR = 2.55 and 10 counties in eastern Kentucky (RR = 1.97. Higher rates of tractor overturns were associated with steeper average percent slope of pasture land by county (p = 0.0002 and a greater percent of total tractors with less than 40 horsepower by county (p<0.0001.This study reveals that geographic hotspots of tractor overturns exist in Kentucky and identifies factors associated with overturns. This study provides policymakers a guide to targeted county-level interventions (e.g., roll-over protective structures promotion interventions with the intention of reducing tractor overturns in the highest risk counties in Kentucky.

  17. Quantitative analysis of rat Ig (sub)classes binding to cell surface antigens

    International Nuclear Information System (INIS)

    Nilsson, R.; Brodin, T.; Sjoegren, H.-O.

    1982-01-01

    An indirect ¹²⁵I-labeled protein A assay for detection of cell surface-bound rat immunoglobulins is presented. The assay is quantitative and rapid and detects as little as 1 ng of cell surface-bound Ig. It discriminates between antibodies belonging to different IgG subclasses, IgM and IgA. The authors describe the production and specificity control of the reagents used and show that the test can be used for quantitative analysis. A large number of sera from untreated rats are tested to evaluate the frequency of falsely positive responses and variation due to age, sex and strain of rat. With this test it is relatively easy to quantitate the binding of classes and subclasses of rat immunoglobulins in a small volume (6 μl) of untreated serum. (Auth.)

  18. Spatial analysis of digital technologies and impact on socio - cultural ...

    African Journals Online (AJOL)

    The objective of this study was to determine the spatial distribution of digital technologies and ascertain whether digital technologies have significant impact on socio - cultural values or not. Moran's index and Getis and Ord's statistic were used for cluster and hotspots analysis. The unique locations of digital technologies ...
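
    The record relies on Moran's index for cluster analysis; a minimal sketch of global Moran's I with an assumed binary contiguity matrix is given below (the study's data and weights are not reproduced).

```python
import numpy as np

def morans_i(x, w):
    """Global Moran's I for attribute x and spatial weight matrix w."""
    x = np.asarray(x, dtype=float)
    z = x - x.mean()
    w = np.asarray(w, dtype=float)
    num = (w * np.outer(z, z)).sum()
    return (len(x) / w.sum()) * num / (z @ z)

# Illustrative example: 5 locations with binary adjacency weights.
x = [10.0, 12.0, 9.0, 30.0, 28.0]
w = np.array([
    [0, 1, 1, 0, 0],
    [1, 0, 1, 0, 0],
    [1, 1, 0, 1, 0],
    [0, 0, 1, 0, 1],
    [0, 0, 0, 1, 0],
])
print(f"Moran's I = {morans_i(x, w):.3f}")   # values above E[I] = -1/(n-1) suggest clustering
```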

  19. Preclinical evaluation of spatial frequency domain-enabled wide-field quantitative imaging for enhanced glioma resection

    Science.gov (United States)

    Sibai, Mira; Fisher, Carl; Veilleux, Israel; Elliott, Jonathan T.; Leblond, Frederic; Roberts, David W.; Wilson, Brian C.

    2017-07-01

    5-Aminolevulinic acid-induced protoporphyrin IX (PpIX) fluorescence-guided resection (FGR) enables maximum safe resection of glioma by providing real-time tumor contrast. However, the subjective visual assessment and the variable intrinsic optical attenuation of tissue limit this technique to reliably delineating only high-grade tumors that display strong fluorescence. We have previously shown, using a fiber-optic probe, that quantitative assessment using noninvasive point spectroscopic measurements of the absolute PpIX concentration in tissue further improves the accuracy of FGR, extending it to surgically curable low-grade glioma. More recently, we have shown that implementing spatial frequency domain imaging with a fluorescent-light transport model enables recovery of two-dimensional images of [PpIX], alleviating the need for time-consuming point sampling of the brain surface. We present first results of this technique modified for in vivo imaging on an RG2 rat brain tumor model. Despite moderate errors of 14% and 19% in retrieving the absorption and reduced scattering coefficients in the subdiffusive regime, the recovered [PpIX] maps agree to within 10% with the point [PpIX] values measured by the fiber-optic probe, validating its potential as an extension of, or an alternative to, point sampling during glioma resection.

  20. Digital Holography, a metrological tool for quantitative analysis: Trends and future applications

    Science.gov (United States)

    Paturzo, Melania; Pagliarulo, Vito; Bianco, Vittorio; Memmolo, Pasquale; Miccio, Lisa; Merola, Francesco; Ferraro, Pietro

    2018-05-01

    A review on the last achievements of Digital Holography is reported in this paper, showing that this powerful method can be a key metrological tool for the quantitative analysis and non-invasive inspection of a variety of materials, devices and processes. Nowadays, its range of applications has been greatly extended, including the study of live biological matter and biomedical applications. This paper overviews the main progresses and future perspectives of digital holography, showing new optical configurations and investigating the numerical issues to be tackled for the processing and display of quantitative data.

  1. Artificial neural network for on-site quantitative analysis of soils using laser induced breakdown spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    El Haddad, J. [Univ. Bordeaux, LOMA, UMR 5798, F-33400 Talence (France); CNRS, LOMA, UMR 5798, F-33400 Talence (France); Villot-Kadri, M.; Ismaël, A.; Gallou, G. [IVEA Solution, Centre Scientifique d' Orsay, Bât 503, 91400 Orsay (France); Michel, K.; Bruyère, D.; Laperche, V. [BRGM, Service Métrologie, Monitoring et Analyse, 3 avenue Claude Guillemin, B.P 36009, 45060 Orléans Cedex (France); Canioni, L. [Univ. Bordeaux, LOMA, UMR 5798, F-33400 Talence (France); CNRS, LOMA, UMR 5798, F-33400 Talence (France); Bousquet, B., E-mail: bruno.bousquet@u-bordeaux1.fr [Univ. Bordeaux, LOMA, UMR 5798, F-33400 Talence (France); CNRS, LOMA, UMR 5798, F-33400 Talence (France)

    2013-01-01

    Nowadays, due to environmental concerns, fast on-site quantitative analyses of soils are required. Laser-induced breakdown spectroscopy is a serious candidate to address this challenge and is especially well suited for multi-elemental analysis of heavy metals. However, saturation and matrix effects prevent a simple treatment of the LIBS data, namely through a regular calibration curve. This paper details the limits of this approach and consequently emphasizes the advantage of using artificial neural networks, which are well suited to non-linear and multivariate calibration. This advanced method of data analysis is evaluated in the case of real soil samples and on-site LIBS measurements. The selection of the LIBS data used as input data for the network is described in detail and, finally, resulting errors of prediction lower than 20% for aluminum, calcium, copper and iron demonstrate the efficiency of artificial neural networks for on-site quantitative LIBS analysis of soils. - Highlights: ► We perform on-site quantitative LIBS analysis of soil samples. ► We demonstrate that univariate analysis is not convenient. ► We exploit artificial neural networks for LIBS analysis. ► Spectral lines other than the ones from the analyte must be introduced.

  2. Artificial neural network for on-site quantitative analysis of soils using laser induced breakdown spectroscopy

    International Nuclear Information System (INIS)

    El Haddad, J.; Villot-Kadri, M.; Ismaël, A.; Gallou, G.; Michel, K.; Bruyère, D.; Laperche, V.; Canioni, L.; Bousquet, B.

    2013-01-01

    Nowadays, due to environmental concerns, fast on-site quantitative analyses of soils are required. Laser-induced breakdown spectroscopy is a serious candidate to address this challenge and is especially well suited for multi-elemental analysis of heavy metals. However, saturation and matrix effects prevent a simple treatment of the LIBS data, namely through a regular calibration curve. This paper details the limits of this approach and consequently emphasizes the advantage of using artificial neural networks, which are well suited to non-linear and multivariate calibration. This advanced method of data analysis is evaluated in the case of real soil samples and on-site LIBS measurements. The selection of the LIBS data used as input data for the network is described in detail and, finally, resulting errors of prediction lower than 20% for aluminum, calcium, copper and iron demonstrate the efficiency of artificial neural networks for on-site quantitative LIBS analysis of soils. - Highlights: ► We perform on-site quantitative LIBS analysis of soil samples. ► We demonstrate that univariate analysis is not convenient. ► We exploit artificial neural networks for LIBS analysis. ► Spectral lines other than the ones from the analyte must be introduced.
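
    The two records above describe a neural-network calibration replacing univariate calibration curves. The sketch below shows the general multivariate-calibration idea with scikit-learn's MLPRegressor on synthetic line intensities; the element set, network size and data are assumptions, not the authors' configuration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
n_samples, n_lines = 200, 30          # assumed: 30 selected LIBS line intensities

# Synthetic training set: concentrations of Al, Ca, Cu, Fe (wt%) and line
# intensities with a mild non-linearity standing in for matrix/saturation effects.
conc = rng.uniform(0, 10, (n_samples, 4))
mixing = rng.random((4, n_lines))
intensities = np.tanh(0.05 * conc @ mixing) + rng.normal(0, 0.01, (n_samples, n_lines))

# Multivariate calibration: spectra in, element concentrations out.
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0),
)
model.fit(intensities[:150], conc[:150])

pred = model.predict(intensities[150:])
rel_err = np.abs(pred - conc[150:]) / np.maximum(conc[150:], 1e-6)
print(f"median relative prediction error: {np.median(rel_err):.1%}")
```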

  3. The spatial kinetic analysis of accelerator-driven subcritical reactor

    International Nuclear Information System (INIS)

    Takahashi, H.; An, Y.; Chen, X.

    1998-02-01

    Operating an accelerator-driven reactor in a subcritical condition allows a more flexible choice of reactor materials and design parameters. Deep subcriticality is sometimes chosen on the basis of point-kinetics analysis. However, when a large reactor is operated deeply subcritical with a localized spallation source, the power distribution has a strong spatial dependence, and point kinetics does not provide a proper analysis for reactor safety. In order to analyze the spatially and energy-dependent kinetic behavior of the subcritical reactor, the authors developed a computation code composed of two parts: the first creates the group cross sections, and the second solves the multi-group kinetic diffusion equations. The reactor parameters, such as the fission and scattering cross sections and the energy transfer among the energy groups and regions, are calculated using a code adapted from the Monte Carlo codes MCNPA and LAHET instead of the usual deterministic codes such as ANISN and TWOTRAN. Thus the complicated geometry of the accelerator-driven reactor core can be taken into account precisely. The authors analyzed the subcritical minor actinide transmutor studied by the Japan Atomic Energy Research Institute (JAERI) using this code.

  4. Quantitative analysis of the ATV data base, Stage 2

    International Nuclear Information System (INIS)

    Stenquist, C.; Kjellbert, N.A.

    1981-01-01

    A supplementary study of the Swedish ATV data base was carried out. The study was limited to an analysis of the quantitative coverage of component failures from 1979 through 1980. The results indicate that the coverage of component failures is about 75-80 per cent, judged against the failure reports and work order sheets at the reactor sites together with SKI's "Safety Related Occurrences". In general there has been an improvement compared to previous years. (Auth.)

  5. Quantitative analysis of culture using millions of digitized books

    OpenAIRE

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva P.; Veres, Adrian; Gray, Matthew K.; Pickett, Joseph P.; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A.; Aiden, Erez Lieberman

    2010-01-01

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of ‘culturomics’, focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pu...

  6. Quantitative Analysis of Culture Using Millions of Digitized Books

    OpenAIRE

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva Presser; Veres, Adrian; Gray, Matthew K.; Google Books Team; Pickett, Joseph; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A.; Aiden, Erez Lieberman

    2011-01-01

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of ‘culturomics,’ focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pu...

  7. Quantitative comparison of performance analysis techniques for modular and generic network-on-chip

    Directory of Open Access Journals (Sweden)

    M. C. Neuenhahn

    2009-05-01

    NoC-specific parameters have a huge impact on the performance and implementation costs of a NoC. Hence, performance and cost evaluation of these parameter-dependent NoCs is crucial in the different design stages, but the requirements on performance analysis differ from stage to stage. In an early design stage an analysis technique featuring reduced complexity and limited accuracy can be applied, whereas in subsequent design stages more accurate techniques are required.

    In this work, several performance analysis techniques at different levels of abstraction are presented and quantitatively compared. These techniques include a static performance analysis using timing models, a Colored Petri Net-based approach, VHDL- and SystemC-based simulators and an FPGA-based emulator. From NoC experiments with sizes from 9 to 36 functional units and various traffic patterns, the characteristics of these techniques concerning accuracy, complexity and effort are derived.

    The performance analysis techniques discussed here are quantitatively evaluated and finally assigned to the appropriate design-stages in an automated NoC-design-flow.

  8. Modern Spatial Rainfall Rate is well Correlated with Coretop δ2Hdinosterol in the South Pacific Convergence Zone: a Tool for Quantitative Reconstructions

    Science.gov (United States)

    Maloney, A. E.; Nelson, D. B.; Sachs, J. P.; Hassall, J. D.; Sear, D. A.; Langdon, P. G.; Prebble, M.; Richey, J. N.; Schabetsberger, R.; Sichrowsky, U.; Hope, G.

    2016-02-01

    The South Pacific Convergence Zone (SPCZ) is the Southern Hemisphere's most prominent precipitation feature, extending southeastward 3000 km from Papua New Guinea to French Polynesia. Determining how the SPCZ responded to climate variations before the instrumental record requires the use of indirect indicators of rainfall. The link between the hydrogen isotopic composition of water fluxes through the hydrologic cycle, lake water, and molecular fossil 2H/1H ratios makes hydrogen isotopes a promising tool for improving our understanding of this important climate feature. An analysis of coretop sediment from freshwater lakes in the SPCZ region indicates that there is a strong spatial relationship between δ2Hdinosterol and mean annual precipitation rate. The objectives of this research are to use 2H/1H ratios of the biomarker dinosterol to develop an empirical relationship between δ2Hdinosterol and modern environmental rainfall rates so that we may quantitatively reconstruct several aspects of the SPCZ's hydrological system during the late Holocene. The analysis includes lake sediment coretops from the Solomon Islands, Wallis Island, Vanuatu, Tahiti, Samoa, New Caledonia, and the Cook Islands. These islands span a range of average modern precipitation rates from 3 to 7 mm/day, and the coretop sediment δ2Hdinosterol values range from -240‰ to -320‰. Applying this regional coretop calibration to dated sediment cores reveals that the mean annual position and/or intensity of the SPCZ has not been static during the past 2000 years.
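
    A coretop calibration of this kind reduces to an empirical regression that can be inverted downcore. The numbers below are illustrative points spanning the ranges quoted in the abstract, not the measured dataset.

```python
import numpy as np

# Illustrative coretop calibration points: mean annual precipitation rate
# (mm/day) versus sediment delta-2H of dinosterol (per mil).
rain = np.array([3.0, 3.8, 4.5, 5.2, 6.0, 6.7, 7.0])
d2h = np.array([-244.0, -258.0, -266.0, -281.0, -295.0, -309.0, -318.0])

# Empirical linear calibration d2H = a * rain + b, fitted by least squares.
a, b = np.polyfit(rain, d2h, 1)
print(f"d2H_dinosterol = {a:.1f} * rain + {b:.1f}")

# Inverting the calibration turns a downcore d2H value into an estimated
# rainfall rate for quantitative reconstruction.
def reconstruct_rain(d2h_downcore):
    return (np.asarray(d2h_downcore) - b) / a

print(reconstruct_rain([-250.0, -300.0]))       # mm/day estimates
```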

  9. Analysis of syntactic and semantic features for fine-grained event-spatial understanding in outbreak news reports

    Directory of Open Access Journals (Sweden)

    Chanlekha Hutchatai

    2010-03-01

    Background: Previous studies have suggested that epidemiological reasoning needs a fine-grained modelling of events, especially of their spatial and temporal attributes. While the temporal analysis of events has been studied intensively, far less attention has been paid to their spatial analysis. This article aims at filling that gap concerning automatic analysis of event-spatial attributes in order to support health surveillance and epidemiological reasoning. Results: In this work, we propose a methodology that provides a detailed analysis of each event reported in news articles in order to recover the most specific locations where it occurs. Various features for recognizing the spatial attributes of events were studied and incorporated into models trained with several machine learning techniques. The best performance for spatial attribute recognition is very promising: 85.9% F-score (86.75% precision / 85.1% recall). Conclusions: We extended our work on event-spatial attribute recognition by focusing on the machine learning techniques CRF, SVM and decision trees. Our approach avoided the costly development of an external knowledge base by employing feature sources that can be acquired locally from the analyzed document. The results showed that the CRF model performed best. Our study indicated that the nearest location and the previous event location are the most important features for the CRF and SVM models, while the location extracted from the verb's subject is the most important for the decision tree model.
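
    A minimal sketch of the locally derived features the record highlights (nearest location, previous event location, verb-subject location) fed to a decision tree; the feature names, toy examples and labels are invented, and the study's actual CRF/SVM feature templates are not reproduced.

```python
from sklearn.feature_extraction import DictVectorizer
from sklearn.tree import DecisionTreeClassifier

# Each candidate (event mention, location mention) pair gets a feature dict built
# only from the analyzed document itself (no external knowledge base).
examples = [
    {"nearest_location": True,  "previous_event_location": False, "verb_subject_location": True},
    {"nearest_location": False, "previous_event_location": True,  "verb_subject_location": False},
    {"nearest_location": True,  "previous_event_location": True,  "verb_subject_location": False},
    {"nearest_location": False, "previous_event_location": False, "verb_subject_location": False},
]
labels = [1, 1, 1, 0]          # 1 = this location is a spatial attribute of the event

vec = DictVectorizer(sparse=False)
X = vec.fit_transform(examples)
clf = DecisionTreeClassifier(random_state=0).fit(X, labels)

test = {"nearest_location": True, "previous_event_location": False, "verb_subject_location": False}
print(clf.predict(vec.transform([test])))
```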

  10. Analysis of spatial pattern of settlements in the federal capital ...

    African Journals Online (AJOL)

    Human settlements are important, seemingly static but dynamic, features of the cultural landscape that have attracted several studies due to the important role they play in human life. This paper examined the spatial distribution of settlements in the Federal Capital Territory (FCT) of Nigeria. The analysis uses vector based ...

  11. Attenuated total internal reflection Fourier transform infrared spectroscopy: a quantitative approach for kidney stone analysis.

    Science.gov (United States)

    Gulley-Stahl, Heather J; Haas, Jennifer A; Schmidt, Katherine A; Evan, Andrew P; Sommer, André J

    2009-07-01

    The impact of kidney stone disease is significant worldwide, yet methods for quantifying stone components remain limited. A new approach requiring minimal sample preparation for the quantitative analysis of kidney stone components has been investigated utilizing attenuated total internal reflection Fourier transform infrared spectroscopy (ATR-FT-IR). Calcium oxalate monohydrate (COM) and hydroxylapatite (HAP), two of the most common constituents of urinary stones, were used for quantitative analysis. Calibration curves were constructed using integrated band intensities of four infrared absorptions versus concentration (weight %). The correlation coefficients of the calibration curves range from 0.997 to 0.93. The limits of detection range from 0.07 ± 0.02% COM/HAP, where COM is the analyte and HAP is the matrix, to 0.26 ± 0.07% HAP/COM, where HAP is the analyte and COM is the matrix. This study shows that linear calibration curves can be generated for the quantitative analysis of stone mixtures provided the system is well understood, especially with respect to particle size.

  12. Estimation of spatial uncertainties of tomographic velocity models

    Energy Technology Data Exchange (ETDEWEB)

    Jordan, M.; Du, Z.; Querendez, E. [SINTEF Petroleum Research, Trondheim (Norway)

    2012-12-15

    This research project aims to evaluate the possibility of assessing the spatial uncertainties in tomographic velocity model building in a quantitative way. The project is intended to serve as a test of whether accurate and specific uncertainty estimates (e.g., in meters) can be obtained. The project is based on Monte Carlo-type perturbations of the velocity model obtained from the tomographic inversion, guided by the diagonal and off-diagonal elements of the resolution and covariance matrices. The implementation and testing of this method was based on the SINTEF in-house stereotomography code, using small synthetic 2D data sets. To test the method, the calculation and output of the covariance and resolution matrices were implemented, and software to perform the error estimation was created. The work included the creation of 2D synthetic data sets, the implementation and testing of the software to conduct the tests (output of the covariance and resolution matrices, which are not implicitly provided by stereotomography), application to the synthetic data sets, analysis of the test results, and the creation of the final report. The results show that this method can be used to estimate the spatial errors in tomographic images quantitatively, and they agree with the known errors for our synthetic models. However, the method can only be applied to structures in the model where the change in seismic velocity is larger than the predicted error of the velocity parameter amplitudes. In addition, the analysis is dependent on the tomographic method, e.g., regularization and parameterization. The conducted tests were very successful and we believe that this method could be developed further to be applied to third party tomographic images.

  13. Reliability Analysis of 6-Component Star Markov Repairable System with Spatial Dependence

    Directory of Open Access Journals (Sweden)

    Liying Wang

    2017-01-01

    Star repairable systems with spatial dependence consist of a center component and several peripheral components. The peripheral components are arranged around the center component, and the performance of each component depends on its spatial “neighbors.” A vector Markov process is adapted to describe the performance of the system. The state space and transition rate matrix corresponding to the 6-component star Markov repairable system with spatial dependence are derived via a probability analysis method. Several reliability indices, such as the availability and the probabilities of visiting the safety, degradation, alert, and failed state sets, are obtained by the Laplace transform method, and a numerical example is provided to illustrate the results.

  14. Developments in Dynamic Analysis for quantitative PIXE true elemental imaging

    International Nuclear Information System (INIS)

    Ryan, C.G.

    2001-01-01

    Dynamic Analysis (DA) is a method for projecting quantitative major and trace element images from PIXE event data-streams (off-line or on-line) obtained using the Nuclear Microprobe. The method separates full elemental spectral signatures to produce images that strongly reject artifacts due to overlapping elements, detector effects (such as escape peaks and tailing) and background. The images are also quantitative, stored in ppm-charge units, enabling images to be directly interrogated for the concentrations of all elements in areas of the images. Recent advances in the method include the correction for changing X-ray yields due to varying sample compositions across the image area and the construction of statistical variance images. The resulting accuracy of major element concentrations extracted directly from these images is better than 3% relative as determined from comparisons with electron microprobe point analysis. These results are complemented by error estimates derived from the variance images together with detection limits. This paper provides an update of research on these issues, introduces new software designed to make DA more accessible, and illustrates the application of the method to selected geological problems.
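
    Dynamic Analysis projects each event through a precomputed transform; the sketch below shows only the underlying idea of resolving overlapping elemental signatures by (non-negative) least squares, with invented reference spectra and yields.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(3)
n_channels = 256

def gaussian_line(center, width=4.0, amp=1.0):
    ch = np.arange(n_channels)
    return amp * np.exp(-0.5 * ((ch - center) / width) ** 2)

# Invented elemental signatures with overlapping peaks (e.g., alpha/beta pairs).
signatures = np.column_stack([
    gaussian_line(80) + 0.15 * gaussian_line(92),
    gaussian_line(88) + 0.15 * gaussian_line(101),
    gaussian_line(140) + 0.15 * gaussian_line(158),
])

true_yields = np.array([50.0, 20.0, 5.0])       # proportional to concentration x charge
spectrum = signatures @ true_yields
spectrum = rng.poisson(spectrum + 2.0)          # counting statistics plus flat background

# Resolve the overlapping signatures; yields map to ppm-charge with known factors.
yields, residual = nnls(signatures, spectrum.astype(float))
print("fitted yields:", np.round(yields, 1), " residual:", round(residual, 1))
```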

  15. Analytical applications of a recycled flow nuclear magnetic resonance system: quantitative analysis of slowly relaxing nuclei

    International Nuclear Information System (INIS)

    Laude, D.A. Jr.; Lee, R.W.K.; Wilkins, C.L.

    1985-01-01

    The utility of a recycled flow system for the efficient quantitative analysis of NMR spectra is demonstrated. Requisite conditions are first established for the quantitative flow experiment and then applied to a variety of compounds. An application of the technique to determination of the average polymer chain length for a silicone polymer by quantitative flow ²⁹Si NMR is also presented. 10 references, 4 figures, 3 tables

  16. Tools for the quantitative analysis of sedimentation boundaries detected by fluorescence optical analytical ultracentrifugation.

    Directory of Open Access Journals (Sweden)

    Huaying Zhao

    Fluorescence optical detection in sedimentation velocity analytical ultracentrifugation allows the study of macromolecules at nanomolar concentrations and below. This has significant promise, for example, for the study of systems of high-affinity protein interactions. Here we describe adaptations of the direct boundary modeling analysis approach implemented in the software SEDFIT that were developed to accommodate unique characteristics of the confocal fluorescence detection system. These include spatial gradients of signal intensity due to scanner movements out of the plane of rotation, temporal intensity drifts due to instability of the laser and fluorophores, and masking of the finite excitation and detection cone by the sample holder. In an extensive series of experiments with enhanced green fluorescent protein ranging from low nanomolar to low micromolar concentrations, we show that the experimental data provide sufficient information to determine the parameters required for first-order approximation of the impact of these effects on the recorded data. Systematic deviations of fluorescence optical sedimentation velocity data analyzed using conventional sedimentation models developed for absorbance and interference optics are largely removed after these adaptations, resulting in excellent fits that highlight the high precision of fluorescence sedimentation velocity data, thus allowing a more detailed quantitative interpretation of the signal boundaries that is otherwise not possible for this system.

  17. A Spatial Analysis of Poverty in Kigali, Rwanda using indicators of ...

    African Journals Online (AJOL)

    A Spatial Analysis of Poverty in Kigali, Rwanda using indicators of household ... conducted by the National Institute of Statistics of Rwanda in 2000-2001. ... The third region of low poverty incident has between 4-12% of its population poor.

  18. Application of harmonic analysis in quantitative heart scintigraphy

    International Nuclear Information System (INIS)

    Fischer, P.; Knopp, R.; Breuel, H.P.

    1979-01-01

    Quantitative scintigraphy of the heart after equilibrium distribution of a radioactive tracer permits the measurement of time-activity curves in the left ventricle during a representative heart cycle with great statistical accuracy. By applying Fourier analysis, additional criteria can be obtained for evaluating the volume curve as a whole, since the entire information contained in the volume curve is completely described by its Fourier spectrum. Resynthesis after Fourier transformation appears to be an ideal method of smoothing because it converges, in the least-squares sense, for the type of function concerned. (orig./MG) [de
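
    The Fourier smoothing described above amounts to truncating the harmonic spectrum of the cyclic volume curve and resynthesizing it. The sketch below uses an invented time-activity curve; the first-harmonic amplitude and phase are the usual global-function descriptors.

```python
import numpy as np

# Invented left-ventricular time-activity curve over one representative cycle
# (counts per frame); a real study would use the gated equilibrium data.
counts = np.array([100, 92, 78, 62, 52, 48, 50, 60, 74, 86, 95, 99], dtype=float)

# Fourier analysis of the cyclic curve: keep the mean and the first few harmonics.
spectrum = np.fft.rfft(counts)
n_harmonics = 3
truncated = np.zeros_like(spectrum)
truncated[: n_harmonics + 1] = spectrum[: n_harmonics + 1]

# Resynthesis yields a smoothed volume curve; the amplitude and phase of the
# first harmonic are the usual descriptors of global ventricular function.
smooth = np.fft.irfft(truncated, n=counts.size)
amp1 = 2 * np.abs(spectrum[1]) / counts.size
phase1 = np.angle(spectrum[1])
print(np.round(smooth, 1))
print(f"first-harmonic amplitude {amp1:.1f} counts, phase {phase1:.2f} rad")
```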

  19. Quantitative x-ray fluorescent analysis using fundamental parameters

    International Nuclear Information System (INIS)

    Sparks, C.J. Jr.

    1976-01-01

    A monochromatic source of x-rays for sample excitation permits the use of pure elemental standards and relatively simple calculations to convert the measured fluorescent intensities to an absolute basis of weight per unit weight of sample. Only the mass absorption coefficients of the sample for the exciting and the fluorescent radiation need be determined. Besides the direct measurement of these absorption coefficients in the sample, other techniques are considered which require fewer sample manipulations and measurements. These fundamental parameters methods permit quantitative analysis without recourse to the time-consuming process of preparing nearly identical standards
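
    For monochromatic excitation and a pure-element standard, the fundamental-parameters approach reduces, to first order and ignoring secondary fluorescence, to scaling the measured intensity ratio by the ratio of total mass attenuation terms. The coefficients, angles and intensities below are placeholders, not values from the paper.

```python
import numpy as np

def total_attenuation(mu_exc, mu_fluo, psi_in_deg=45.0, psi_out_deg=45.0):
    """Combined term mu(E0)/sin(psi_in) + mu(Ei)/sin(psi_out), in cm^2/g."""
    return (mu_exc / np.sin(np.radians(psi_in_deg))
            + mu_fluo / np.sin(np.radians(psi_out_deg)))

# Placeholder mass absorption coefficients (cm^2/g) at the exciting energy E0
# and at the analyte line energy Ei, for the pure element and for the sample.
mu_pure = total_attenuation(mu_exc=48.0, mu_fluo=120.0)
mu_sample = total_attenuation(mu_exc=35.0, mu_fluo=95.0)

# Measured fluorescent intensities (arbitrary units).
i_sample, i_pure_standard = 820.0, 2400.0

# First-order fundamental-parameters estimate of the weight fraction:
# C_i ~= (I_sample / I_pure) * (mu*_sample / mu*_pure)
c_i = (i_sample / i_pure_standard) * (mu_sample / mu_pure)
print(f"estimated weight fraction: {c_i:.3f}")
```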

  20. Analysis of Participatory Processes in the Formulation of Spatial Plan for Nature Park Medvednica

    Directory of Open Access Journals (Sweden)

    Nataša Lovrić

    2011-12-01

    Background and Purpose: This research aims to assess the stakeholders' influence on the spatial planning of Nature Park Medvednica, a mountainous protected area adjacent to Zagreb, the capital city of Croatia, which is trying to withstand the pressure of urbanization. Because the spatial plan required by Croatian law did not exist, the area of the park was significantly reduced in 2009. This kind of research has not yet been done for NP Medvednica, and it will contribute to the process of developing a spatial plan for the park. Material and Methods: The study was conducted in the framework of stakeholder analysis, for which a series of in-depth interviews with stakeholders was performed and documents concerning the spatial plan were analysed. The data gained were processed in the MAXQDA software for qualitative analysis. Results and Conclusion: The gathered data explain the disadvantages of the three processes of formulating the spatial plan and provide a possible theoretical explanation, or model, that can be applied to any decision-making process involving stakeholders in natural resource management within a given political and cultural context. The past and current spatial planning situation of NP Medvednica is described, and the issues and stakeholders concerning the creation of the spatial plan are identified. The key conflict areas that affect the formulation of the spatial plan were detected and examined. The level of participation of stakeholders, in the context of the fulfilment of their own interests, was assessed, as well as the influence of the participation of different stakeholder groups on the formulation of the spatial plan. In order to achieve proper citizen and stakeholder participation, some changes in the legislation should take place.

  1. Quantitative analysis of thorium-containing materials using an Industrial XRF analyzer

    International Nuclear Information System (INIS)

    Hasikova, J.; Titov, V.; Sokolov, A.

    2014-01-01

    Thorium (Th) as a nuclear fuel is clean and safe and offers significant advantages over uranium. The technology for several types of thorium reactor is proven but must still be developed on a commercial scale. If thorium nuclear reactors are commercialized, thorium raw materials will be in demand, and mining and processing companies producing Th and rare earth elements will require prompt and reliable methods and instrumentation for quantitative on-line Th analysis. The potential applicability of the CON-X series X-ray fluorescence conveyor analyzer is discussed for quantitative or semi-quantitative on-line measurement of Th in several types of Th-bearing materials. A laboratory study of several minerals (zircon sands and limestone as unconventional Th resources, monazite concentrate as an associated Th resource, and uranium ore residues after extraction as a waste product) was performed, and the analyzer was tested for on-line quantitative measurement of Th content along with other major and minor components. The Th concentration range in zircon sand is 50-350 ppm; at this level its detection limit is estimated at 25-50 ppm for 5-minute measurements, depending on the type of material. An on-site test of the CON-X analyzer for continuous analysis of thorium traces along with other elements in zircon sand showed that the accuracy of the Th measurements is within 20% relative. When the Th content is higher than 1%, as in the monazite ore concentrate (5-8% ThO₂), the accuracy of the Th determination is within 1% relative. Although a preliminary on-site test is recommended in order to assess system feasibility at a large scale, the results show that the industrial conveyor XRF analyzer CON-X series can be used effectively for analytical control of mining and processing streams of Th-bearing materials. (author)

  2. Some selected quantitative methods of thermal image analysis in Matlab.

    Science.gov (United States)

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images and shows their practical implementation in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods, for the skin of a human foot and of the face. The full source code of the developed application is provided as an attachment. (Figure: the main window of the program during dynamic analysis of the foot thermal image.) © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Quantitative CT analysis of small pulmonary vessels in lymphangioleiomyomatosis

    International Nuclear Information System (INIS)

    Ando, Katsutoshi; Tobino, Kazunori; Kurihara, Masatoshi; Kataoka, Hideyuki; Doi, Tokuhide; Hoshika, Yoshito; Takahashi, Kazuhisa; Seyama, Kuniaki

    2012-01-01

    Background: Lymphangioleiomyomatosis (LAM) is a destructive lung disease that shares clinical, physiologic, and radiologic features with chronic obstructive pulmonary disease (COPD). This study aims to identify those features that are unique to LAM by using quantitative CT analysis. Methods: We measured the total cross-sectional areas of small pulmonary vessels (CSA) of less than 5 mm² and of 5–10 mm² and calculated the corresponding percentages of the lung area (%CSA) in 50 LAM and 42 COPD patients. The extent of cystic destruction (LAA%) and the mean parenchymal CT value were also calculated and correlated with pulmonary function. Results: The diffusing capacity for carbon monoxide/alveolar volume (DLCO/VA %predicted) was similar for both groups (LAM, 44.4 ± 19.8% vs. COPD, 45.7 ± 16.0%, p = 0.763), but less tissue damage occurred in LAM than in COPD (LAA% 21.7 ± 16.3% vs. 29.3 ± 17.0%; p < 0.05). At a comparable degree of cystic destruction, DLCO/VA %predicted, %CSA and mean parenchymal CT value were still greater for LAM than COPD (p < 0.05). Conclusions: Quantitative CT analysis revealing a correlation between cystic destruction and CSA in COPD but not LAM indicates that this approach successfully reflects the different mechanisms governing the two pathologic courses. Such determination of small pulmonary vessel density may serve to differentiate LAM from COPD even in patients with severe lung destruction.

  4. Spatial statistical analysis of basal stem root disease under natural field epidemic of oil palm

    Science.gov (United States)

    Kamu, Assis; Phin, Chong Khim; Seman, Idris Abu; Wan, Hoong Hak; Mun, Ho Chong

    2015-02-01

    Oil palm, scientifically known as Elaeis guineensis Jacq., is the most important commodity crop in Malaysia and has greatly contributed to the economic growth of the country. As far as disease is concerned, Basal Stem Rot (BSR), caused by Ganoderma boninense, remains the most important disease in the industry. BSR is the most widely studied oil palm disease in Malaysia, with the most information available. However, there are still limited studies on the spatial and temporal pattern or distribution of the disease, especially under natural field epidemic conditions in oil palm plantations. The objective of this study is to identify the spatial pattern of BSR disease under a natural field epidemic using two geospatial analytical techniques: quadrat analysis for the first-order properties of the point pattern and nearest-neighbour analysis (NNA) for its second-order properties. Two study sites with trees of different ages were selected. Both sites are located in Tawau, Sabah, and are managed by the same company. The results showed that at least one of the point pattern analyses used, namely NNA (i.e. the second-order properties of the point pattern), confirmed that the disease exhibits complete spatial randomness. This suggests that the disease does not spread from tree to tree and that the age of the palms does not play a significant role in determining the spatial pattern of the disease. Knowledge of the spatial pattern of the disease should help disease management programmes and the industry in the future. Statistical modelling is expected to help in identifying the right model to estimate the yield loss of oil palm due to BSR disease in the future.
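
    The nearest-neighbour analysis mentioned above is commonly summarized by the Clark-Evans index R; the sketch below computes it for invented palm coordinates (edge effects ignored).

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(7)

# Illustrative diseased-palm coordinates (metres) inside a 200 m x 200 m block;
# real survey coordinates from the study are not reproduced here.
pts = rng.uniform(0, 200, (150, 2))
area = 200.0 * 200.0

# Mean observed nearest-neighbour distance.
tree = cKDTree(pts)
d, _ = tree.query(pts, k=2)          # k=2: the first neighbour is the point itself
mean_obs = d[:, 1].mean()

# Clark-Evans index: R = observed / expected under complete spatial randomness.
density = len(pts) / area
mean_exp = 0.5 / np.sqrt(density)
r = mean_obs / mean_exp
print(f"R = {r:.2f}  (R ~ 1: random, R < 1: clustered, R > 1: regular)")
```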

  5. Large-scale quantitative analysis of painting arts.

    Science.gov (United States)

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-12-11

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of paintings has made rapid progress, researchers have reached a point where it is possible to perform statistical analysis of a large-scale database of artistic paintings and build a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images: the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety in the medieval period. Interestingly, the increase of the roughness exponent as painting techniques such as chiaroscuro and sfumato advanced is consistent with historical circumstances.

  6. Operation Iraqi Freedom 04 - 06: Opportunities to Apply Quantitative Methods to Intelligence Analysis

    National Research Council Canada - National Science Library

    Hansen, Eric C

    2005-01-01

    The purpose of this presentation is to illustrate the need for a quantitative analytical capability within organizations and staffs that provide intelligence analysis to Army, Joint, and Coalition Force headquarters...

  7. Toward best practices in data processing and analysis for intact biotherapeutics by MS in quantitative bioanalysis.

    Science.gov (United States)

    Kellie, John F; Kehler, Jonathan R; Karlinsey, Molly Z; Summerfield, Scott G

    2017-12-01

    Typically, quantitation of biotherapeutics from biological matrices by LC-MS is based on a surrogate peptide approach to determine molecule concentration. Recent efforts have focused on quantitation of the intact protein molecules or larger mass subunits of monoclonal antibodies. To date, there has been limited guidance for large or intact protein mass quantitation in quantitative bioanalysis. Intact- and subunit-level analyses of biotherapeutics from biological matrices are performed in the 12-25 kDa mass range, and quantitation data are presented. Linearity, bias and other metrics are reported, along with recommendations on the viability of existing quantitation approaches. This communication is intended to start a discussion around intact protein data analysis and processing, recognizing that other published contributions will be required.

  8. Quantitative detection of mass concentration of sand-dust storms via wind-profiling radar and analysis of the Z–M relationship

    Science.gov (United States)

    Wang, Minzhong; Ming, Hu; Ruan, Zheng; Gao, Lianhui; Yang, Di

    2018-02-01

    With the aim of achieving quantitative monitoring of sand-dust storms in real time, a wind-profiling radar was applied to monitor and study the course of four sand-dust storms in the Tazhong area of the Taklimakan Desert. Evaluation and analysis of the spatial-temporal distribution of the reflectivity factor show that it ranges from 2 to 18 dBz under sand-dust storm weather. Using the echo power spectra of the radar's vertical beams, the sand-dust particle size spectrum and the sand-dust mass concentration at altitudes of 600–1500 m are retrieved. This study shows that the sand-dust mass concentration reaches 700 μg/m³ under blowing sand weather, 2000 μg/m³ under sand-dust storm weather, and 400 μg/m³ under floating dust weather. The following equations are established to represent the relationship between the reflectivity factor and the sand-dust mass concentration: Z = 20713.5 M^0.995 under floating dust weather, Z = 22988.3 M^1.006 under blowing sand weather, and Z = 24584.2 M^1.013 under sand-dust storm weather. The retrieval results of this paper are largely consistent with previous monitoring results obtained by other researchers; thus, wind-profiling radar can serve as a new reference device for quantitatively monitoring sand-dust storms.
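
    The reported Z-M power laws can be inverted to retrieve mass concentration from a measured reflectivity. In the sketch below, M is assumed to be in g/m³ (an assumption that makes the retrievals consistent with the concentrations quoted in the abstract, but not stated in the record).

```python
import numpy as np

# Z-M power laws reported in the record; Z is the linear reflectivity factor and
# M is assumed to be in g/m^3 (see note above).
ZM = {
    "floating dust":   (20713.5, 0.995),
    "blowing sand":    (22988.3, 1.006),
    "sand-dust storm": (24584.2, 1.013),
}

def mass_concentration_ug_m3(dbz, weather):
    """Invert Z = a * M^b to retrieve the dust mass concentration from dBZ."""
    a, b = ZM[weather]
    z_linear = 10.0 ** (dbz / 10.0)
    return (z_linear / a) ** (1.0 / b) * 1e6     # g/m^3 -> ug/m^3

for dbz in (2.0, 10.0, 18.0):
    m = mass_concentration_ug_m3(dbz, "sand-dust storm")
    print(f"{dbz:4.1f} dBZ -> ~{m:7.0f} ug/m^3")
```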

  9. [Spatial analysis of childhood obesity and overweight in Peru, 2014].

    Science.gov (United States)

    Hernández-Vásquez, Akram; Bendezú-Quispe, Guido; Díaz-Seijas, Deysi; Santero, Marilina; Minckas, Nicole; Azañedo, Diego; Antiporta, Daniel A

    2016-01-01

    To estimate the regional prevalences and identify district-level spatial patterns of overweight and obesity in children under five years of age in Peru during 2014. Analysis of the information reported to the Nutritional Status Information System (SIEN) on the number of cases of overweight and obesity in children under five years recorded during 2014. Regional prevalences of overweight and obesity and their respective 95% confidence intervals were calculated. Moran's index was used to detect clusters of districts with a high prevalence of overweight and/or obesity. Data from 1834 districts and 2,318,980 children under five years were analyzed. 158,738 cases (6.84%; 95% CI: 6.81 to 6.87) were classified as overweight and 56,125 (2.42%; 95% CI: 2.40 to 2.44) as obese. The highest prevalences of overweight were identified in the regions of Tacna (13.9%), Moquegua (11.8%), Callao (10.4%), Lima (10.2%) and Ica (9.3%), and the same regions showed the highest prevalences of obesity, with 5.3%, 4.3%, 4.0%, 4.0% and 3.8%, respectively. The spatial analysis found clusters of high-prevalence districts in 10% of all districts for both overweight and obesity: 199 districts for overweight (126 urban and 73 rural) and 184 for obesity (136 urban and 48 rural). The highest prevalences of overweight and obesity were identified in the Peruvian coastal regions, and these regions predominantly exhibited spatial clustering of districts with high prevalence of overweight and obesity.
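
    A minimal sketch of the global Moran's I statistic that underlies this kind of clustering analysis is shown below; the contiguity matrix and prevalence values are toy inputs, not the Peruvian district data.

    import numpy as np

    def morans_i(x, w):
        """Global Moran's I: x is a 1-D array of district values, w an NxN spatial weights matrix."""
        x = np.asarray(x, dtype=float)
        w = np.asarray(w, dtype=float)
        z = x - x.mean()
        num = np.sum(w * np.outer(z, z))
        return (len(x) / w.sum()) * num / np.sum(z ** 2)

    # Toy example: four districts on a line, neighbours share an edge.
    w = np.array([[0, 1, 0, 0],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [0, 0, 1, 0]], dtype=float)
    prevalence = np.array([0.12, 0.11, 0.03, 0.02])
    print(morans_i(prevalence, w))   # positive value -> spatial clustering of similar values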

  10. Ratio of slopes method for quantitative analysis in ceramic bodies

    International Nuclear Information System (INIS)

    Zainal Arifin Ahmad; Ahmad Fauzi Mohd Noor; Radzali Othman; Messer, P.F.

    1996-01-01

    A quantitative X-ray diffraction analysis technique developed at the University of Sheffield was adopted, rather than the previously widely used internal standard method, to determine the amounts of the phases present in a reformulated whiteware porcelain and a BaTiO3 electrochemical material. This method, although it still employs an internal standard, was found to be very easy and accurate. The required weight fraction of a phase in the mixture to be analysed is determined from the ratio of the slopes of two linear plots, designated the analysis and reference lines, both passing through the origin and fitted by the least-squares method
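
    A sketch of the ratio-of-slopes idea is given below, under the assumption that each line is a least-squares fit constrained through the origin; the intensity-concentration data are hypothetical, and the mapping of the slope ratio to a weight fraction follows the paper's calibration, which is only indicated schematically here.

    import numpy as np

    def slope_through_origin(x, y):
        """Least-squares slope of y = k * x constrained to pass through the origin."""
        x, y = np.asarray(x, float), np.asarray(y, float)
        return np.sum(x * y) / np.sum(x * x)

    # Hypothetical data points for the analysis and reference lines.
    x_ref = np.array([0.1, 0.2, 0.3, 0.4])
    y_ref = np.array([1.10, 2.00, 3.10, 3.90])
    x_ana = np.array([0.1, 0.2, 0.3, 0.4])
    y_ana = np.array([0.55, 1.05, 1.45, 2.05])

    ratio = slope_through_origin(x_ana, y_ana) / slope_through_origin(x_ref, y_ref)
    print(f"ratio of slopes = {ratio:.3f}")   # proportional to the phase weight fraction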

  11. Quantitative descriptive analysis and principal component analysis for sensory characterization of Indian milk product cham-cham.

    Science.gov (United States)

    Puri, Ritika; Khamrui, Kaushik; Khetra, Yogesh; Malhotra, Ravinder; Devraja, H C

    2016-02-01

    Promising development and expansion of the market for cham-cham, a traditional Indian dairy product, is expected in the coming years with the organized production of this milk product by some large dairies. The objective of this study was to document the extent of variation in the sensory properties of market samples of cham-cham collected from four different locations known for their excellence in cham-cham production, and to find out which attributes govern most of the variation in the sensory scores of this product, using quantitative descriptive analysis (QDA) and principal component analysis (PCA). QDA revealed statistically significant differences in the sensory attributes of cham-cham among the market samples. PCA identified four significant principal components that accounted for 72.4% of the variation in the sensory data. The factor scores of each of the four principal components, which primarily correspond to sweetness/shape/dryness of interior, surface appearance/surface dryness, rancid and firmness attributes, specify the location of each market sample along each of the axes in 3-D graphs. These findings demonstrate the utility of quantitative descriptive analysis for identifying and measuring the attributes of cham-cham that contribute most to its sensory acceptability.
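
    A minimal PCA sketch of the kind used on a samples-by-attributes sensory score matrix is given below; the score matrix is randomly generated and the attribute count is arbitrary, so it only illustrates the variance-explained and factor-score computations.

    import numpy as np

    rng = np.random.default_rng(1)
    scores = rng.uniform(1, 9, size=(12, 8))      # 12 market samples x 8 sensory attributes

    centered = scores - scores.mean(axis=0)
    u, s, vt = np.linalg.svd(centered, full_matrices=False)
    explained = s ** 2 / np.sum(s ** 2)           # fraction of variance per principal component
    factor_scores = centered @ vt.T               # sample coordinates on the principal components

    print("variance explained by first 4 PCs:", explained[:4].round(3))
    print("factor scores of sample 0 on PC1-PC4:", factor_scores[0, :4].round(2))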

  12. Spatial Thinking in Atmospheric Science Education

    Science.gov (United States)

    McNeal, P. M.; Petcovic, H. L.; Ellis, T. D.

    2016-12-01

    Atmospheric science is a STEM discipline that involves the visualization of three-dimensional processes from two-dimensional maps, interpretation of computer-generated graphics and hand plotting of isopleths. Thus, atmospheric science draws heavily upon spatial thinking. Research has shown that spatial thinking ability can be a predictor of early success in STEM disciplines and substantial evidence demonstrates that spatial thinking ability is improved through various interventions. Therefore, identification of the spatial thinking skills and cognitive processes used in atmospheric science is the first step toward development of instructional strategies that target these skills and scaffold the learning of students in atmospheric science courses. A pilot study of expert and novice meteorologists identified mental animation and disembedding as key spatial skills used in the interpretation of multiple weather charts and images. Using this as a starting point, we investigated how these spatial skills, together with expertise, domain specific knowledge, and working memory capacity affect the ability to produce an accurate forecast. Participants completed a meteorology concept inventory, experience questionnaire and psychometric tests of spatial thinking ability and working memory capacity prior to completing a forecasting task. A quantitative analysis of the collected data investigated the effect of the predictor variables on the outcome task. A think-aloud protocol with individual participants provided a qualitative look at processes such as task decomposition, rule-based reasoning and the formation of mental models in an attempt to understand how individuals process this complex data and describe outcomes of particular meteorological scenarios. With our preliminary results we aim to inform atmospheric science education from a cognitive science perspective. The results point to a need to collaborate with the atmospheric science community broadly, such that multiple

  13. Research on spatial Model and analysis algorithm for nuclear weapons' damage effects

    International Nuclear Information System (INIS)

    Liu Xiaohong; Meng Tao; Du Maohua; Wang Weili; Ji Wanfeng

    2011-01-01

    To realize three-dimensional visualization of nuclear weapons' damage effects, and taking into account the characteristics of the damage-effects data, a new model, the MRPCT model, is proposed; it can model the three-dimensional spatial data of nuclear weapons' damage effects. To save memory, a linear coding method is used to store the MRPCT model. Spatial analysis of the damage effects is then carried out on the basis of Morton codes. (authors)
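
    The MRPCT model itself is not described in the abstract, so the sketch below shows only the linear-coding step it mentions: a 3-D Morton (Z-order) code that interleaves the bits of the x, y, z cell indices so that spatially close cells tend to receive close linear keys.

    def part_bits(v: int) -> int:
        """Spread the lower 10 bits of v so that two zero bits separate consecutive bits."""
        v &= 0x000003FF
        v = (v ^ (v << 16)) & 0xFF0000FF
        v = (v ^ (v << 8)) & 0x0300F00F
        v = (v ^ (v << 4)) & 0x030C30C3
        v = (v ^ (v << 2)) & 0x09249249
        return v

    def morton3d(x: int, y: int, z: int) -> int:
        """Interleave the bits of x, y and z into a single linear key."""
        return (part_bits(z) << 2) | (part_bits(y) << 1) | part_bits(x)

    print(morton3d(1, 0, 0), morton3d(0, 1, 0), morton3d(0, 0, 1))  # 1, 2, 4
    print(morton3d(3, 3, 3))                                        # 63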

  14. Spatially adaptive mixture modeling for analysis of FMRI time series.

    Science.gov (United States)

    Vincent, Thomas; Risser, Laurent; Ciuciu, Philippe

    2010-04-01

    Within-subject analysis in fMRI essentially addresses two problems: the detection of brain regions eliciting evoked activity and the estimation of the underlying dynamics. In Makni et al. (2005) and Makni et al. (2008), a detection-estimation framework has been proposed to tackle these problems jointly, since they are connected to one another. In the Bayesian formalism, detection is achieved by modeling activating and nonactivating voxels through independent mixture models (IMM) within each region, while hemodynamic response estimation is performed at a regional scale in a nonparametric way. Instead of IMMs, in this paper we take advantage of spatial mixture models (SMM) for their nonlinear spatial regularizing properties. The proposed method is unsupervised and spatially adaptive in the sense that the amount of spatial correlation is automatically tuned from the data, and this setting automatically varies across brain regions. In addition, the level of regularization is specific to each experimental condition, since both the signal-to-noise ratio and the activation pattern may vary across stimulus types in a given brain region. These aspects require the precise estimation of multiple partition functions of underlying Ising fields. This is addressed efficiently by first using path sampling for a small subset of fields and then a recently developed fast extrapolation technique for the large remaining set. Simulation results emphasize that detection relying on supervised SMM outperforms its IMM counterpart and that unsupervised spatial mixture models achieve similar results without any hand-tuning of the correlation parameter. On real datasets, the gain is illustrated in a localizer fMRI experiment: brain activations appear more spatially resolved using SMM in comparison with classical general linear model (GLM)-based approaches, while estimating a specific parcel-based HRF shape. Our approach therefore validates the treatment of unsmoothed fMRI data without fixed GLM

  15. Infrared spectroscopy reveals both qualitative and quantitative differences in equine subchondral bone during maturation

    Science.gov (United States)

    Kobrina, Yevgeniya; Isaksson, Hanna; Sinisaari, Miikka; Rieppo, Lassi; Brama, Pieter A.; van Weeren, René; Helminen, Heikki J.; Jurvelin, Jukka S.; Saarakkala, Simo

    2010-11-01

    The collagen phase in bone is known to undergo major changes during growth and maturation. The objective of this study is to clarify whether Fourier transform infrared (FTIR) microspectroscopy, coupled with cluster analysis, can detect quantitative and qualitative changes in the collagen matrix of subchondral bone in horses during maturation and growth. Equine subchondral bone samples (n = 29) from the proximal joint surface of the first phalanx are prepared from two sites subjected to different loading conditions. Three age groups are studied: newborn (0 days old), immature (5 to 11 months old), and adult (6 to 10 years old) horses. Spatial collagen content and collagen cross-link ratio are quantified from the spectra. Additionally, normalized second derivative spectra of samples are clustered using the k-means clustering algorithm. In quantitative analysis, collagen content in the subchondral bone increases rapidly between the newborn and immature horses. The collagen cross-link ratio increases significantly with age. In qualitative analysis, clustering is able to separate newborn and adult samples into two different groups. The immature samples display some nonhomogeneity. In conclusion, this is the first study showing that FTIR spectral imaging combined with clustering techniques can detect quantitative and qualitative changes in the collagen matrix of subchondral bone during growth and maturation.
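
    A hedged sketch of the clustering step described above: normalized second-derivative spectra grouped by k-means. The spectra are synthetic, and the smoothing window, polynomial order and cluster count are illustrative choices rather than the study's settings.

    import numpy as np
    from scipy.signal import savgol_filter
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    spectra = rng.normal(size=(29, 400)).cumsum(axis=1)      # stand-in FTIR spectra

    # Second derivative by Savitzky-Golay filtering, then vector normalization per spectrum.
    second_deriv = savgol_filter(spectra, window_length=15, polyorder=3, deriv=2, axis=1)
    normalized = second_deriv / np.linalg.norm(second_deriv, axis=1, keepdims=True)

    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(normalized)
    print(labels)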

  16. Digital Speckle Photography of Subpixel Displacements of Speckle Structures Based on Analysis of Their Spatial Spectra

    Science.gov (United States)

    Maksimova, L. A.; Ryabukho, P. V.; Mysina, N. Yu.; Lyakin, D. V.; Ryabukho, V. P.

    2018-04-01

    We have investigated the capabilities of the method of digital speckle interferometry for determining subpixel displacements of a speckle structure formed by a displaceable or deformable object with a scattering surface. Analysis of the spatial spectra of speckle structures makes it possible to perform measurements with subpixel accuracy and to extend the lower boundary of the measurement range of speckle-structure displacements down to subpixel values. The method is realized on the basis of digital recording of the images of the undisplaced and displaced speckle structures, their spatial frequency analysis using numerically specified constant phase shifts, and correlation analysis of the spatial spectra of the speckle structures. Transformation into the frequency domain makes it possible to obtain the measured quantities with subpixel accuracy, either from the shift of the interference-pattern minimum in the diffraction halo produced by introducing an additional phase shift into the complex spatial spectrum of the speckle structure, or from the slope of the linear plot of the accumulated phase difference over the field of the complex spatial spectrum of the displaced speckle structure. The capabilities of the method have been investigated in a full-scale physical experiment.
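
    A simplified, related technique (not the authors' exact procedure) is sketched below: estimating a shift between two speckle images from the phase of their cross spectrum, i.e., exploiting the same linear phase ramp in the spatial spectrum that the paper works with.

    import numpy as np

    def subpixel_shift(img_a, img_b):
        """Estimate (dy, dx) such that img_b is img_a shifted by (dy, dx), from the cross-spectrum phase."""
        h, w = img_a.shape
        cross = np.fft.fft2(img_a) * np.conj(np.fft.fft2(img_b))
        phase = np.angle(cross)
        fy, fx = np.meshgrid(np.fft.fftfreq(h), np.fft.fftfreq(w), indexing="ij")
        mask = (np.abs(fy) < 0.1) & (np.abs(fx) < 0.1)   # low frequencies, where the ramp is unwrapped
        design = 2 * np.pi * np.column_stack([fy[mask], fx[mask]])
        shift, *_ = np.linalg.lstsq(design, phase[mask], rcond=None)
        return shift                                      # (dy, dx) in pixels

    rng = np.random.default_rng(0)
    base = rng.normal(size=(128, 128))
    shifted = np.roll(base, 3, axis=0)                    # integer shift used only as a quick check
    print(subpixel_shift(base, shifted).round(2))         # approximately [3. 0.]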

  17. Landsat analysis of tropical forest succession employing a terrain model

    Science.gov (United States)

    Barringer, T. H.; Robinson, V. B.; Coiner, J. C.; Bruce, R. C.

    1980-01-01

    Landsat multispectral scanner (MSS) data have yielded a dual classification of rain forest and shadow in an analysis of a semi-deciduous forest on Mindoro Island, Philippines. Both a spatial terrain model, using a fifth-order polynomial trend surface analysis to quantitatively estimate the general spatial variation in the data set, and a spectral terrain model, based on the MSS data, have been set up. A discriminant analysis using both sets of data has suggested that shadowing effects may be due primarily to local variations in the spectral regions and can therefore be compensated for through the decomposition of the spatial variation in both the elevation and MSS data.
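
    A generic trend-surface fit of the kind referred to above can be written as an ordinary least-squares regression on polynomial terms of the map coordinates; the sketch below uses a second-order surface and synthetic data to keep the design matrix small, whereas the study used a fifth-order surface.

    import numpy as np

    def trend_surface_design(x, y, order=2):
        """Design matrix with columns x**i * y**j for all i + j <= order."""
        return np.column_stack([x**i * y**j
                                for i in range(order + 1)
                                for j in range(order + 1 - i)])

    rng = np.random.default_rng(0)
    x, y = rng.uniform(0, 1, 200), rng.uniform(0, 1, 200)
    z = 3 + 2 * x - y + 0.5 * x * y + rng.normal(scale=0.05, size=200)   # synthetic surface

    design = trend_surface_design(x, y, order=2)
    coeffs, *_ = np.linalg.lstsq(design, z, rcond=None)
    residuals = z - design @ coeffs       # local variation left after removing the regional trend
    print(coeffs.round(2), residuals.std().round(3))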

  18. Quantitative analysis of Esophageal Transit of Radionuclide in Patients with Dermatomyositis-Polymyositis

    International Nuclear Information System (INIS)

    Chung, June Key; Lee, Myung Chul; Koh, Chang Soon; Lee, Myung Hae

    1989-01-01

    Esophageal transit of radionuclide was quantitatively analyzed in 29 patients with dermatomyositis-polymyositis. Fourteen patients (48.3%) showed retention of tracer in the oropharynx. The mean percent retention in the oropharynx was 15.5±16.6%. Esophageal dysfunction was found in 19 patients (65.5%); among them, 4 showed mild, 12 moderate and 3 severe esophageal dysfunction. Dysphagia was found in 11 patients (37.9%) and was closely related to the percent retention in the oropharynx. Quantitative analysis of esophageal transit of radionuclide appears to be a useful technique for the evaluation of dysphagia in patients with dermatomyositis-polymyositis.

  19. Geo-Spatial Social Network Analysis of Social Media to Mitigate Disasters

    Science.gov (United States)

    Carley, K. M.

    2017-12-01

    Understanding the spatial layout of human activity can afford a better understanding of many phenomena, such as local culture, the spread of ideas, and the scope of a disaster. Today, social media is one of the key sensors for acquiring information on socio-cultural activity, some of it with cues as to geo-location. We ask: what can be learned by putting such data on maps? For example, are people who chat online more likely to be near each other? Can Twitter data support disaster planning or early warning? In this talk, such issues are examined using data collected via Twitter and analyzed using ORA. ORA is a network analysis and visualization system. It supports not just social networks (who is interacting with whom), but also high-dimensional networks with many types of nodes (e.g. people, organizations, resources, activities …) and relations, geo-spatial network analysis, dynamic network analysis, & geo-temporal analysis. Using ORA, lessons learned from five case studies are considered: the Arab Spring, tsunami warning in Padang, Indonesia, Twitter around Fukushima in Japan, Typhoon Haiyan (Yolanda), & regional conflict. Using the Padang, Indonesia data, we characterize the strengths and limitations of social media data to support disaster planning & early warning, identify at-risk areas & issues of concern, and estimate where people are and which areas are impacted. Using the Fukushima, Japan data, social media is used to estimate geo-spatial regularities in movement and communication that can inform disaster response and risk estimation. Using the Arab Spring data, we find that the spread of bots & extremists varies by country and time, to the extent that using Twitter to understand who is important or what ideas are critical can be compromised. Bots and extremists can exploit disaster messaging to create havoc and facilitate criminal activity, e.g. human trafficking. Event-discovery mechanisms that support isolating geo-epicenters of key events therefore become crucial. Spatial inference

  20. Spatial bedrock erosion distribution in a natural gorge

    Science.gov (United States)

    Beer, A. R.; Turowski, J. M.; Kirchner, J. W.

    2015-12-01

    Quantitative analysis of morphological evolution in both terrestrial and planetary landscapes is of increasing interest in the geosciences. In mountainous regions, bedrock channel formation as a consequence of the interaction of uplift and erosion processes is fundamental to the entire surface evolution. Hence, an accurate description of bedrock channel development is important for landscape modelling. To verify existing concepts developed in the lab and to analyse how in situ channel erosion rates depend on the interrelations of discharge, sediment transport and topography, there is a need for highly resolved topographic field data. We analyse bedrock erosion over two years in a bedrock gorge downstream of the Gorner glacier above the town of Zermatt, Switzerland. At the study site, the Gornera stream cuts through a roche moutonnée in serpentine rock of 25 m length, 5 m width and 8 m depth. We surveyed bedrock erosion rates using repeat terrestrial laser scanning (TLS) with an average point spacing of 5 mm. Bedrock erosion rates in the direction of the individual surface normals were studied directly on the scanned point clouds by applying the M3C2 algorithm (Lague et al., 2013, ISPRS). The surveyed erosion patterns were compared to a simple stream erosivity visualisation obtained from painted bedrock sections at the study location. Spatially distributed erosion rates on bedrock surfaces, based on millions of scan points, allow deduction of millimeter-scale mean annual values of lateral erosion, incision and downstream erosion on protruding streambed surfaces. The erosion rate at a specific surface point is shown to depend on the position of this point in the channel's cross section, its height above the streambed and its spatial orientation to the streamflow. Abrasion by impacting bedload was likely the spatially dominant erosion process, as confirmed by the observed patterns along the painted bedrock sections. However, a single plucking event accounted for the half

  1. Quantitative imaging biomarkers: the application of advanced image processing and analysis to clinical and preclinical decision making.

    Science.gov (United States)

    Prescott, Jeffrey William

    2013-02-01

    The importance of medical imaging for clinical decision making has been steadily increasing over the last four decades. Recently, there has also been an emphasis on medical imaging for preclinical decision making, i.e., for use in pharmaceutical and medical device development. There is also a drive towards quantification of imaging findings by using quantitative imaging biomarkers, which can improve the sensitivity, specificity, accuracy and reproducibility of imaged characteristics used for diagnostic and therapeutic decisions. An important component of the discovery, characterization, validation and application of quantitative imaging biomarkers is the extraction of information and meaning from images through image processing and subsequent analysis. However, many advanced image processing and analysis methods are not applied directly to questions of clinical interest, i.e., for diagnostic and therapeutic decision making, which is a consideration that should be closely linked to the development of such algorithms. This article is meant to address these concerns. First, quantitative imaging biomarkers are introduced by providing definitions and concepts. Then, potential applications of advanced image processing and analysis to areas of quantitative imaging biomarker research are described; specifically, research into osteoarthritis (OA), Alzheimer's disease (AD) and cancer is presented. Then, challenges in quantitative imaging biomarker research are discussed. Finally, a conceptual framework for integrating clinical and preclinical considerations into the development of quantitative imaging biomarkers and their computer-assisted methods of extraction is presented.

  2. Quantitative analysis of microtubule orientation in interdigitated leaf pavement cells.

    Science.gov (United States)

    Akita, Kae; Higaki, Takumi; Kutsuna, Natsumaro; Hasezawa, Seiichiro

    2015-01-01

    Leaf pavement cells are shaped like a jigsaw puzzle in most dicotyledon species. Molecular genetic studies have identified several genes required for pavement cell morphogenesis and proposed that microtubules play crucial roles in the interdigitation of pavement cells. In this study, we performed quantitative analysis of cortical microtubule orientation in leaf pavement cells in Arabidopsis thaliana. We captured confocal images of cortical microtubules in cotyledon leaf epidermis expressing GFP-tubulinβ and quantitatively evaluated the microtubule orientations relative to the pavement cell growth axis using original image processing techniques. Our results showed that microtubules kept parallel orientations to the growth axis during pavement cell growth. In addition, we showed that immersion treatment of seed cotyledons in solutions containing tubulin polymerization and depolymerization inhibitors decreased pavement cell complexity. Treatment with oryzalin and colchicine inhibited the symmetric division of guard mother cells.

  3. An iterative approach to case study analysis: insights from qualitative analysis of quantitative inconsistencies

    Directory of Open Access Journals (Sweden)

    Allain J Barnett

    2016-09-01

    Large-N comparative studies have helped common-pool resource scholars gain general insights into the factors that influence collective action and governance outcomes. However, these studies are often limited by missing data and suffer from the methodological limitation that important information is lost when we reduce textual information to quantitative data. This study was motivated by nine case studies that appeared to be inconsistent with the expectation that the presence of Ostrom’s Design Principles increases the likelihood of successful common-pool resource governance. These cases highlight the limitations of coding and analysing large-N case studies. We examine two issues: (1) the challenge of missing data and (2) potential approaches that rely on context (which is often lost in the coding process) to address inconsistencies between empirical observations and theoretical predictions. For the latter, we conduct a post-hoc qualitative analysis of a large-N comparative study to explore two types of inconsistencies: (1) cases where evidence for nearly all design principles was found, but the available evidence led to the assessment that the CPR system was unsuccessful, and (2) cases where the CPR system was deemed successful despite limited or no evidence for the design principles. We describe the challenges inherent in coding complex and dynamically changing common-pool resource systems for the presence or absence of design principles and in determining “success” in large-N comparative analysis. Finally, we illustrate how, in some cases, our qualitative analysis revealed that the identity of the absent design principles explained the inconsistencies, hence de facto reconciling such apparent inconsistencies with theoretical predictions. This analysis demonstrates the value of combining quantitative and qualitative analysis, and of using mixed-methods approaches iteratively to build comprehensive methodological and theoretical approaches to understanding

  4. Analysis of thrips distribution: application of spatial statistics and Kriging

    Science.gov (United States)

    John Aleong; Bruce L. Parker; Margaret Skinner; Diantha Howard

    1991-01-01

    Kriging is a statistical technique that provides predictions for spatially and temporally correlated data. Observations of thrips distribution and density in Vermont soils are made in both space and time. Traditional statistical analysis of such data assumes that the counts taken over space and time are independent, which is not necessarily true. Therefore, to analyze...
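
    For illustration, a minimal ordinary-kriging predictor with an assumed spherical variogram is sketched below; the variogram parameters and the sample data are synthetic, not fitted to actual thrips counts.

    import numpy as np

    def spherical_variogram(h, sill=1.0, var_range=30.0):
        """Spherical semivariogram with zero nugget; parameters are illustrative."""
        h = np.asarray(h, dtype=float)
        inside = sill * (1.5 * h / var_range - 0.5 * (h / var_range) ** 3)
        return np.where(h < var_range, inside, sill)

    def ordinary_krige(coords, values, target):
        """Solve the ordinary kriging system and return the prediction at `target`."""
        n = len(values)
        d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
        a = np.ones((n + 1, n + 1))
        a[:n, :n] = spherical_variogram(d)
        a[-1, -1] = 0.0                                     # Lagrange multiplier row/column
        b = np.ones(n + 1)
        b[:n] = spherical_variogram(np.linalg.norm(coords - target, axis=1))
        weights = np.linalg.solve(a, b)[:n]
        return weights @ values

    rng = np.random.default_rng(0)
    coords = rng.uniform(0, 20, size=(30, 2))               # sample locations in a field
    values = np.sin(coords[:, 0] / 5) + rng.normal(scale=0.1, size=30)
    print(ordinary_krige(coords, values, np.array([10.0, 10.0])))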

  5. Quantitative chromatography in the analysis of labelled compounds 1. Quantitative paper chromotography of amino acids by A spot comparison technique

    International Nuclear Information System (INIS)

    Barakat, M.F.; Farag, A.N.; El-Gharbawy, A.A.

    1974-01-01

    For the determination of the specific activity of labelled compounds separated by paper sheet chromatography, it was found essential to perfect the quantitative aspect of the paper chromatographic technique. So far, paper chromatography has been used mainly as a separation tool, and its use for quantification of the separated materials is far less studied. In the present work, the quantitative analysis of amino acids by paper sheet chromatography has been carried out by methods that depend on the use of relative spot-area values for correcting the experimental data obtained. The results obtained were good and reproducible. The main advantage of the proposed technique is its extreme simplicity. No complicated equipment or procedures are necessary

  6. Quantitative charge-tags for sterol and oxysterol analysis.

    Science.gov (United States)

    Crick, Peter J; William Bentley, T; Abdel-Khalik, Jonas; Matthews, Ian; Clayton, Peter T; Morris, Andrew A; Bigger, Brian W; Zerbinati, Chiara; Tritapepe, Luigi; Iuliano, Luigi; Wang, Yuqin; Griffiths, William J

    2015-02-01

    Global sterol analysis is challenging owing to the extreme diversity of sterol natural products, the tendency of cholesterol to dominate in abundance over all other sterols, and the lack of a strong chromophore or readily ionized functional group in the sterol structure. We developed a method to overcome these challenges by using different isotope-labeled versions of the Girard P reagent (GP) as quantitative charge-tags for the LC-MS analysis of sterols, including oxysterols. Sterols/oxysterols in plasma were extracted in ethanol containing deuterated internal standards, separated by C18 solid-phase extraction, and derivatized with GP, with or without prior oxidation of 3β-hydroxy to 3-oxo groups. By use of different isotope-labeled GPs, it was possible to analyze in a single LC-MS analysis both sterols/oxysterols that naturally possess a 3-oxo group and those with a 3β-hydroxy group. Intra- and interassay CVs were determined; the method permits analysis of sterols/oxysterols in a single analytical run and can be used to identify inborn errors of cholesterol synthesis and metabolism. © 2014 American Association for Clinical Chemistry.

  7. Quantitative CT analysis of small pulmonary vessels in lymphangioleiomyomatosis

    Energy Technology Data Exchange (ETDEWEB)

    Ando, Katsutoshi, E-mail: kando@juntendo.ac.jp [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Tobino, Kazunori [Department of Respiratory Medicine, Iizuka Hospital, 3-83 Yoshio-Machi, Iizuka-City, Fukuoka 820-8505 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Kurihara, Masatoshi; Kataoka, Hideyuki [Pneumothorax Center, Nissan Tamagawa Hospital, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Doi, Tokuhide [Fukuoka Clinic, 7-18-11 Umeda, Adachi-Ku, Tokyo 123-0851 (Japan); Hoshika, Yoshito [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Takahashi, Kazuhisa [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); Seyama, Kuniaki [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan)

    2012-12-15

    Background: Lymphangioleiomyomatosis (LAM) is a destructive lung disease that shares clinical, physiologic, and radiologic features with chronic obstructive pulmonary disease (COPD). This study aims to identify features that are unique to LAM by using quantitative CT analysis. Methods: We measured the total cross-sectional area of small pulmonary vessels (CSA) of less than 5 mm² and of 5-10 mm², and calculated the percentages of those areas relative to the lung area (%CSA), in 50 LAM and 42 COPD patients. The extent of cystic destruction (LAA%) and the mean parenchymal CT value were also calculated and correlated with pulmonary function. Results: The diffusing capacity for carbon monoxide/alveolar volume (DLco/VA %predicted) was similar for both groups (LAM, 44.4 ± 19.8% vs. COPD, 45.7 ± 16.0%, p = 0.763), but less tissue damage occurred in LAM than in COPD (LAA% 21.7 ± 16.3% vs. 29.3 ± 17.0%; p < 0.05). Pulmonary function correlated negatively with LAA% (p < 0.001) in both groups, yet the correlation with %CSA was significant only in COPD (p < 0.001). When the same analysis was conducted in two groups with equal levels of LAA% and DLco/VA %predicted, %CSA and the mean parenchymal CT value were still greater for LAM than for COPD (p < 0.05). Conclusions: Quantitative CT analysis revealing a correlation between cystic destruction and CSA in COPD but not in LAM indicates that this approach successfully reflects the different mechanisms governing the two pathologic courses. Such determinations of small pulmonary vessel density may serve to differentiate LAM from COPD even in patients with severe lung destruction.

  8. Spatially resolved spectroscopy analysis of the XMM-Newton large program on SN1006

    Science.gov (United States)

    Li, Jiang-Tao; Decourchelle, Anne; Miceli, Marco; Vink, Jacco; Bocchino, Fabrizio

    2016-04-01

    We perform analysis of the XMM-Newton large program on SN1006 based on our newly developed methods of spatially resolved spectroscopy. We extract spectra from meshes at two resolutions. The high-resolution meshes (3596 meshes) are used to roughly decompose the thermal and non-thermal components and to characterize the spatial distributions of different parameters, such as the temperature, the abundances of different elements, the ionization age and the electron density of the thermal component, as well as the photon index and cutoff frequency of the non-thermal component. The low-resolution meshes (583 meshes), on the other hand, focus on the interior region dominated by the thermal emission and have enough counts to characterize the Si lines well. We fit the spectra from the low-resolution meshes with different models, in order to decompose the multiple plasma components at different thermal and ionization states and compare their spatial distributions. In this poster, we will present the initial results of this project.

  9. Urban Transmission of American Cutaneous Leishmaniasis in Argentina: Spatial Analysis Study

    Science.gov (United States)

    Gil, José F.; Nasser, Julio R.; Cajal, Silvana P.; Juarez, Marisa; Acosta, Norma; Cimino, Rubén O.; Diosque, Patricio; Krolewiecki, Alejandro J.

    2010-01-01

    We used kernel density and scan statistics to examine the spatial distribution of cases of pediatric and adult American cutaneous leishmaniasis in an urban disease-endemic area in Salta Province, Argentina. Spatial analysis was used for the whole population and stratified by women > 14 years of age (n = 159), men > 14 years of age (n = 667), and children < 15 years of age (n = 213). Although kernel density for adults encompassed nearly the entire city, distribution in children was most prevalent in the peripheral areas of the city. Scan statistic analysis for adult males, adult females, and children found 11, 2, and 8 clusters, respectively. Clusters for children had the highest odds ratios (P < 0.05) and were located in proximity of plantations and secondary vegetation. The data from this study provide further evidence of the potential urban transmission of American cutaneous leishmaniasis in northern Argentina. PMID:20207869

  10. Optofluidic time-stretch quantitative phase microscopy.

    Science.gov (United States)

    Guo, Baoshan; Lei, Cheng; Wu, Yi; Kobayashi, Hirofumi; Ito, Takuro; Yalikun, Yaxiaer; Lee, Sangwook; Isozaki, Akihiro; Li, Ming; Jiang, Yiyue; Yasumoto, Atsushi; Di Carlo, Dino; Tanaka, Yo; Yatomi, Yutaka; Ozeki, Yasuyuki; Goda, Keisuke

    2018-03-01

    Innovations in optical microscopy have opened new windows onto scientific research, industrial quality control, and medical practice over the last few decades. One such innovation is optofluidic time-stretch quantitative phase microscopy - an emerging method for high-throughput quantitative phase imaging that builds on the interference between temporally stretched signal and reference pulses, using the dispersive properties of light in both the spatial and temporal domains in an interferometric configuration on a microfluidic platform. It achieves continuous acquisition of both intensity and phase images with a high throughput of more than 10,000 particles or cells per second by overcoming the speed limitations of conventional quantitative phase imaging methods. The applications enabled by such capabilities are versatile and include the characterization of cancer cells and microalgal cultures. In this paper, we review the principles and applications of optofluidic time-stretch quantitative phase microscopy and discuss its future perspective. Copyright © 2017 Elsevier Inc. All rights reserved.

  11. Physical aspects of quantitative particles analysis by X-ray fluorescence and electron microprobe techniques

    International Nuclear Information System (INIS)

    Markowicz, A.

    1986-01-01

    The aim of this work is to present both the physical fundamentals and recent advances in the quantitative analysis of particles by X-ray fluorescence (XRF) and electron microprobe (EPXMA) techniques. A method of correction for the particle-size effect in XRF analysis is described and theoretically evaluated. New atomic-number and absorption correction procedures in EPXMA of individual particles are proposed. The applicability of these two correction methods is evaluated for a wide range of elemental compositions, X-ray energies and sample thicknesses. Also, a theoretical model for the composition and thickness dependence of the Bremsstrahlung background generated in multielement bulk specimens, as well as in thin films and particles, is presented and experimentally evaluated. Finally, the limitations and possible further improvements in quantitative particle analysis by XRF and EPXMA are discussed. 109 refs. (author)

  12. ANALYSIS OF PRO-POOR GROWTH AMONG THE MUNICIPAL DISTRICTS OF THE STATE OF CEARÁ - BRAZIL: SPATIAL APPROACH

    Directory of Open Access Journals (Sweden)

    Wellington Ribeiro Justo

    2014-04-01

    This article investigates pro-poor growth among the municipal districts of the State of Ceará in 2003. It first reviews the recent literature on the topic and on spatial econometrics. It then carries out a spatial analysis of the variables through maps and, in a more robust way, through Exploratory Spatial Data Analysis (AEDE), using the LISA methodology (Local Indicators of Spatial Association) and Moran's I statistic. The income-poverty and inequality-poverty elasticities are estimated. The tests indicated the need to incorporate into the estimates variables that capture spatial externalities. The results suggest pro-poor growth in the municipal districts of Ceará, in agreement with aggregate results for the state reported in the recent literature.

  13. Characterising Ageing in the Human Brainstem Using Quantitative Multimodal MRI Analysis

    Directory of Open Access Journals (Sweden)

    Christian eLambert

    2013-08-01

    Ageing is ubiquitous to the human condition. The MRI correlates of healthy ageing have been extensively investigated using a range of modalities, including volumetric MRI, quantitative MRI and DTI. Despite this, the reported brainstem-related changes remain sparse. This is, in part, due to the technical and methodological limitations in quantitatively assessing and statistically analysing this region. By utilising a new method of brainstem segmentation, a large cohort of 100 healthy adults was assessed in this study for the effects of ageing within the human brainstem in vivo. Using quantitative MRI (qMRI), tensor-based morphometry (TBM) and voxel-based quantification (VBQ), the volumetric and quantitative changes across healthy adults between 19 and 75 years were characterised. In addition to the increased R2* in the substantia nigra corresponding to increasing iron deposition with age, several novel findings were reported in the current study. These include selective volumetric loss of the brachium conjunctivum, with a corresponding decrease in magnetisation transfer (MT) and increase in proton density (PD), accounting for the previously described midbrain shrinkage. Additionally, we found increases in R1 and PD in several pontine and medullary structures. We consider these changes in the context of well-characterised, functional age-related changes and propose potential biophysical mechanisms. This study provides a detailed quantitative analysis of the internal architecture of the brainstem and provides a baseline for further studies of neurodegenerative diseases that are characterised by early, pre-clinical involvement of the brainstem, such as Parkinson’s and Alzheimer’s diseases.

  14. Spatial memory enhances the evacuation efficiency of virtual pedestrians under poor visibility condition

    Science.gov (United States)

    Ma, Yi; Lee, Eric Wai Ming; Shi, Meng; Kwok Kit Yuen, Richard

    2018-03-01

    Spatial memory is a critical navigation aid for disoriented evacuees during evacuation under adverse environmental conditions such as dark or smoky conditions. Owing to the complexity of memory, it is challenging to understand the effect of spatial memory on pedestrian evacuation quantitatively. In this study, we propose a simple method to quantitatively represent an evacuee's spatial memory of the emergency exit, model the evacuation of pedestrians under the guidance of spatial memory, and investigate the effect of the evacuee's spatial memory on the evacuation from theoretical and physical perspectives. The results show that (i) a good memory can significantly assist the evacuation of pedestrians under poor visibility conditions, and the evacuation always succeeds when the degree of memory exceeds a threshold (φ > 0.5); (ii) the effect of memory is superior to that of “follow-the-crowd” behaviour under the same environmental conditions; and (iii) in the case of multiple exits, the difference in the degree of memory between evacuees has a significant effect on evacuation under poor visibility conditions (the greater the difference, the faster the evacuation). Our study provides new quantitative insight into the effect of spatial memory on crowd evacuation under poor visibility conditions. Project supported by the Research Grants Council of the Hong Kong Special Administrative Region, China (Grant No. 11203615).

  15. Quantitative X-ray fluorescence analysis at the ESRF ID18F microprobe

    CERN Document Server

    Vekemans, B; Somogyi, A; Drakopoulos, M; Kempenaers, L; Simionovici, A; Adams, F

    2003-01-01

    The new ID18F end-station at the European synchrotron radiation facility (ESRF) in Grenoble (France) is dedicated to sensitive and accurate quantitative micro-X-ray fluorescence (XRF) analysis at the ppm level with accuracy better than 10% for elements with atomic numbers above 18. For accurate quantitative analysis, given a high level of instrumental stability, major steps are the extraction and conversion of experimental X-ray line intensities into elemental concentrations. For this purpose a two-step quantification approach was adopted. In the first step, the collected XRF spectra are deconvoluted on the basis of a non-linear least-squares fitting algorithm (AXIL). The extracted characteristic line intensities are then used as input for a detailed Monte Carlo (MC) simulation code dedicated to XRF spectroscopy taking into account specific experimental conditions (excitation/detection) as well as sample characteristics (absorption and enhancement effects, sample topology, heterogeneity etc.). The iterative u...

  16. The Study on the Quantitative Analysis in LPG Tank's Fire and Explosion

    Energy Technology Data Exchange (ETDEWEB)

    Bae, S.J.; Kim, B.J. [Department of chemical Engineering, Soongsil University, Seoul (Korea)

    1999-04-01

    Fires and explosions at chemical plants damage not only the plants themselves but also people in or near the accident site and the neighborhood of the plant. For that reason, chemical process safety management has become important. One of the safety management methods, called quantitative analysis, is used to reduce and prevent accidents. The results of a quantitative analysis can be used to arrange equipment, evaluate the minimum safety distance and prepare safety equipment. In this study we developed a computer program to make quantitative analysis of such accidents easy to perform. The output of the program is the magnitude of the effects of fire (pool fire and fireball) and explosion (UVCE and BLEVE). Thermal radiation is used as the measure of fire magnitude and overpressure as the measure of explosion magnitude. In the case of a BLEVE, the flight distance of fragments can be evaluated. Probit analysis was also carried out in every case. As a case study, the Buchun LPG explosion accident in Korea was analysed with the program developed here. The simulation results showed that the permissible distance was 800 m, and the probit analysis showed that the first-degree burn, second-degree burn, and death distances are 450, 280 and 260 m, respectively. The simulation results showed good agreement with the results from the SAFER program made by DuPont. 13 refs., 4 figs., 2 tabs.
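
    The probit step mentioned above can be illustrated as follows; the constants are the commonly cited Eisenberg-type values for fatality from thermal radiation and are given only as an example, since the paper does not state which probit equations its program uses.

    import math

    def probit_to_probability(pr):
        """Convert a probit value to a probability via the standard normal CDF."""
        return 0.5 * (1.0 + math.erf((pr - 5.0) / math.sqrt(2.0)))

    def thermal_fatality_probability(intensity_w_m2, exposure_s):
        """Probit Pr = -14.9 + 2.56 * ln(t * I**(4/3) / 1e4), with I in W/m2 and t in seconds."""
        dose = exposure_s * intensity_w_m2 ** (4.0 / 3.0) / 1.0e4
        pr = -14.9 + 2.56 * math.log(dose)
        return probit_to_probability(pr)

    # Example: 20 s exposure at several thermal radiation intensities.
    for intensity in (5_000.0, 15_000.0, 35_000.0):
        print(intensity, round(thermal_fatality_probability(intensity, 20.0), 3))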

  17. Comparison of different surface quantitative analysis methods. Application to corium

    International Nuclear Information System (INIS)

    Guilbaud, N.; Blin, D.; Perodeaud, Ph.; Dugne, O.; Gueneau, Ch.

    2000-01-01

    In case of a severe hypothetical accident in a pressurized water reactor, the reactor assembly melts partially or completely. The material formed, called corium, flows out and spreads at the bottom of the reactor. To limit and control the consequences of such an accident, the specifications of the O-U-Zr basic system must be known accurately. To achieve this goal, the corium mix was melted by electron bombardment at very high temperature (3000 K), followed by quenching of the ingot in the Isabel 1 evaporator. Metallographic analyses were then required to validate the thermodynamic databases used by the Thermo-Calc software. The study consists of defining an overall surface quantitative analysis method that is fast and reliable, in order to determine the overall corium composition. The analyzed ingot originated from a (U + Fe + Y + UO2 + ZrO2) mix with a total mass of 2253.7 grams. Several successive heatings at moderate power were performed before a very brief plateau at very high temperature, so that the ingot was formed progressively and without any evaporation liable to modify its initial composition. The central zone of the ingot was then analyzed by qualitative and quantitative global surface methods, to yield the volume composition of the analyzed zone. Corium sample analysis is very complex because of the variety and number of elements present, and also because of the presence of oxygen in a matrix of heavy elements such as uranium. Three different global quantitative surface analysis methods were used: global EDS analysis (Energy Dispersive Spectrometry) with SEM, global WDS analysis (Wavelength Dispersive Spectrometry) with EPMA, and the coupling of image analysis with EDS or WDS point spectroscopic analyses. The difficulties encountered during the study arose from sample preparation (corium is very sensitive to oxidation) and from the choice of the acquisition parameters of the images and analyses. The corium sample studied consisted of two zones displaying

  18. Spatial and Climate Literacy: Connecting Urban and Rural Students

    Science.gov (United States)

    Boger, R. A.; Low, R.; Mandryk, C.; Gorokhovich, Y.

    2013-12-01

    Through a collaboration between the University of Nebraska-Lincoln (UNL), Brooklyn College, and Lehman College, four independent but linked modules were developed and piloted in courses offered at Brooklyn College and UNL simultaneously. Module content includes climate change science and literacy principles, using geospatial technologies (GIS, GPS and remote sensing) as a vehicle to explore issues associated with global, regional, and local climate change in a concrete, quantitative and visual way using Internet resources available through NASA, NOAA, USGS, and a variety of universities and organizations. The materials take an Earth system approach and incorporate sustainability, resilience, water and watersheds, weather and climate, and food security topics throughout the semester. The research component of the project focuses on understanding the role of spatial literacy and authentic inquiry-based experiences in climate change understanding and in improving confidence in teaching science. In particular, engaging learners in both climate change science and GIS simultaneously provides opportunities to examine questions about the role that data manipulation, mental representation, and spatial literacy play in students' abilities to understand the consequences and impacts of climate change. Pre- and post-surveys were designed to discern relationships between spatial cognitive processes and effective acquisition of climate change science concepts in virtual learning environments, as well as the alignment of teachers' mental models of the nature of science and climate system dynamics with scientific models. The courses will again be offered simultaneously in Spring 2014 at Brooklyn College and UNL. Evaluation research will continue to examine the connections between spatial and climate literacy and teachers' mental models (via qualitative textual analysis using MAXQDA text analysis, and UCINET social network analysis programs) as well as how urban-rural learning interactions may

  19. Quantitative Analysis in Nuclear Medicine Imaging

    CERN Document Server

    2006-01-01

    This book provides a review of image analysis techniques as they are applied in the field of diagnostic and therapeutic nuclear medicine. Driven in part by the remarkable increase in computing power and its ready and inexpensive availability, this is a relatively new yet rapidly expanding field. Likewise, although the use of radionuclides for diagnosis and therapy has origins dating back almost to the discovery of natural radioactivity itself, radionuclide therapy and, in particular, targeted radionuclide therapy have only recently emerged as a promising approach for therapy of cancer and, to a lesser extent, other diseases. An effort has, therefore, been made to place the reviews provided in this book in a broader context. The effort to do this is reflected by the inclusion of introductory chapters that address basic principles of nuclear medicine imaging, followed by an overview of issues that are closely related to quantitative nuclear imaging and its potential role in diagnostic and therapeutic applications. ...

  20. A Systematic Approach for Quantitative Analysis of Multidisciplinary Design Optimization Framework

    Science.gov (United States)

    Kim, Sangho; Park, Jungkeun; Lee, Jeong-Oog; Lee, Jae-Woo

    An efficient Multidisciplinary Design and Optimization (MDO) framework for an aerospace engineering system should use and integrate distributed resources such as various analysis codes, optimization codes, Computer Aided Design (CAD) tools, Data Base Management Systems (DBMS), etc. in a heterogeneous environment, and needs to provide user-friendly graphical user interfaces. In this paper, we propose a systematic approach for determining a reference MDO framework and for evaluating MDO frameworks. The proposed approach incorporates two well-known methods, the Analytic Hierarchy Process (AHP) and Quality Function Deployment (QFD), in order to provide a quantitative analysis of the qualitative criteria of MDO frameworks. The identification and hierarchy of the framework requirements and the corresponding solutions for the reference MDO frameworks, the general one and the aircraft-oriented one, were carefully investigated. The reference frameworks were also quantitatively identified using AHP and QFD. An assessment of three in-house frameworks was then performed. The results produced clear and useful guidelines for the improvement of the in-house MDO frameworks and showed the feasibility of the proposed approach for evaluating an MDO framework without human interference.
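
    An AHP priority-weight calculation of the kind used in the proposed approach is sketched below; the three criteria and the pairwise judgments are hypothetical, and the random-index table covers only small matrices.

    import numpy as np

    RANDOM_INDEX = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}   # Saaty's random consistency indices

    def ahp_weights(pairwise):
        """Return (weights, consistency_ratio) for a reciprocal pairwise comparison matrix."""
        pairwise = np.asarray(pairwise, dtype=float)
        eigvals, eigvecs = np.linalg.eig(pairwise)
        k = np.argmax(eigvals.real)                               # principal eigenvalue/eigenvector
        weights = np.abs(eigvecs[:, k].real)
        weights /= weights.sum()
        n = pairwise.shape[0]
        ci = (eigvals[k].real - n) / (n - 1)                      # consistency index
        return weights, ci / RANDOM_INDEX[n]

    # Hypothetical criteria: integration of codes, user interface, extensibility.
    judgments = [[1.0, 3.0, 5.0],
                 [1 / 3, 1.0, 2.0],
                 [1 / 5, 1 / 2, 1.0]]
    w, cr = ahp_weights(judgments)
    print(w.round(3), round(cr, 3))   # a consistency ratio below ~0.1 indicates acceptable judgments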