WorldWideScience

Sample records for quantitative spatial analysis

  1. An Integrative Platform for Three-dimensional Quantitative Analysis of Spatially Heterogeneous Metastasis Landscapes

    Science.gov (United States)

    Guldner, Ian H.; Yang, Lin; Cowdrick, Kyle R.; Wang, Qingfei; Alvarez Barrios, Wendy V.; Zellmer, Victoria R.; Zhang, Yizhe; Host, Misha; Liu, Fang; Chen, Danny Z.; Zhang, Siyuan

    2016-04-01

    Metastatic microenvironments are spatially and compositionally heterogeneous. This seemingly stochastic heterogeneity presents researchers with great challenges in elucidating factors that determine metastatic outgrowth. Herein, we develop and implement an integrative platform that will enable researchers to obtain novel insights from intricate metastatic landscapes. Our two-segment platform begins with whole tissue clearing, staining, and imaging to globally delineate metastatic landscape heterogeneity with spatial and molecular resolution. The second segment of our platform applies our custom-developed SMART 3D (Spatial filtering-based background removal and Multi-chAnnel forest classifiers-based 3D ReconsTruction), a multi-faceted image analysis pipeline, permitting quantitative interrogation of functional implications of heterogeneous metastatic landscape constituents, from subcellular features to multicellular structures, within our large three-dimensional (3D) image datasets. Coupling whole tissue imaging of brain metastasis animal models with SMART 3D, we demonstrate the capability of our integrative pipeline to reveal and quantify volumetric and spatial aspects of brain metastasis landscapes, including diverse tumor morphology, heterogeneous proliferative indices, metastasis-associated astrogliosis, and vasculature spatial distribution. Collectively, our study demonstrates the utility of our novel integrative platform to reveal and quantify the global spatial and volumetric characteristics of the 3D metastatic landscape with unparalleled accuracy, opening new opportunities for unbiased investigation of novel biological phenomena in situ.

  2. A method for quantitative analysis of spatially variable physiological processes across leaf surfaces.

    Science.gov (United States)

    Aldea, Mihai; Frank, Thomas D; DeLucia, Evan H

    2006-11-01

    Many physiological processes are spatially variable across leaf surfaces. While maps of photosynthesis, stomatal conductance, gene expression, water transport, and the production of reactive oxygen species (ROS) for individual leaves are readily obtained, analytical methods for quantifying spatial heterogeneity and combining information gathered from the same leaf but with different instruments are not widely used. We present a novel application of tools from the field of geographical imaging to the multivariate analysis of physiological images. Procedures for registration and resampling, cluster analysis, and classification provide a general framework for the analysis of spatially resolved physiological data. Two experiments were conducted to illustrate the utility of this approach. Quantitative analysis of images of chlorophyll fluorescence and the production of ROS following simultaneous exposure of soybean leaves to atmospheric O3 and soybean mosaic virus revealed that areas of the leaf where the operating quantum efficiency of PSII was depressed also experienced an accumulation of ROS. This correlation suggests a causal relationship between oxidative stress and inhibition of photosynthesis. Overlaying maps of leaf surface temperature and chlorophyll fluorescence following a photoinhibition treatment indicated that areas with low operating quantum efficiency of PSII also experienced reduced stomatal conductance (high temperature). While each of these experiments explored the covariance of two processes by overlaying independent images gathered with different instruments, the same procedures can be used to analyze the covariance of information from multiple images. The application of tools from geographic image analysis to physiological processes occurring over small spatial scales will help reveal the mechanisms generating spatial variation across leaves.
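The overlay analysis described above can be sketched numerically: once two maps of the same leaf are co-registered on a common pixel grid, pixel-wise correlation quantifies whether regions of depressed PSII efficiency coincide with ROS accumulation. A minimal numpy sketch on entirely hypothetical synthetic maps (not the authors' data or software):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical co-registered maps of the same leaf (arbitrary units):
# a stressed patch where PSII efficiency is depressed and ROS is elevated.
phi_psii = np.full((64, 64), 0.75)
ros = np.full((64, 64), 1.0)
phi_psii[20:40, 20:40] -= 0.3          # stressed patch: lower quantum efficiency
ros[20:40, 20:40] += 2.0               # same patch: ROS accumulation
phi_psii += rng.normal(0, 0.02, phi_psii.shape)   # measurement noise
ros += rng.normal(0, 0.1, ros.shape)

# Pixel-wise covariance of the two layers (negative: ROS up where PSII is down)
r = np.corrcoef(phi_psii.ravel(), ros.ravel())[0, 1]
print(f"pixel-wise Pearson r = {r:.2f}")
```

In practice the registration and resampling step would come first, so that images gathered with different instruments share a common pixel grid before any per-pixel statistics are computed.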

  3. Accuracy improvement of quantitative analysis by spatial confinement in laser-induced breakdown spectroscopy.

    Science.gov (United States)

    Guo, L B; Hao, Z Q; Shen, M; Xiong, W; He, X N; Xie, Z Q; Gao, M; Li, X Y; Zeng, X Y; Lu, Y F

    2013-07-29

    To improve the accuracy of quantitative analysis in laser-induced breakdown spectroscopy, the plasma produced by a Nd:YAG laser from steel targets was confined by a cavity. A number of elements with low concentrations, such as vanadium (V), chromium (Cr), and manganese (Mn), in the steel samples were investigated. After the optimization of the cavity dimension and laser fluence, significant enhancement factors of 4.2, 3.1, and 2.87 in the emission intensity of V, Cr, and Mn lines, respectively, were achieved at a laser fluence of 42.9 J/cm² using a hemispherical cavity (diameter: 5 mm). More importantly, the correlation coefficient of the V I 440.85 nm/Fe I 438.35 nm intensity ratio was increased from 0.946 (without the cavity) to 0.981 (with the cavity); similar results for Cr I 425.43/Fe I 425.08 nm and Mn I 476.64/Fe I 492.05 nm were also obtained. Therefore, it was demonstrated that the accuracy of quantitative analysis for low-concentration elements in steel samples was improved, because the plasma became more uniform under spatial confinement. The results of this study provide a new pathway for improving the accuracy of quantitative analysis by LIBS.
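The calibration improvement can be illustrated with the intensity-ratio approach the abstract describes: correlating a V/Fe line-intensity ratio against certified concentrations, with and without the cavity. The numbers below are hypothetical placeholders, not the study's measurements; only the shape of the comparison (tighter scatter with the cavity, hence a higher correlation coefficient) mirrors the reported result:

```python
import numpy as np

# Hypothetical five-standard calibration for V in steel: intensity ratio of
# V I 440.85 nm to Fe I 438.35 nm versus certified V concentration (wt%).
conc_v = np.array([0.05, 0.10, 0.20, 0.40, 0.80])
ratio_no_cavity = np.array([0.15, 0.10, 0.40, 0.50, 1.05])  # noisier plasma
ratio_cavity = np.array([0.08, 0.15, 0.30, 0.61, 1.20])     # confined plasma

# Linear correlation of ratio vs. concentration, as in a calibration curve
r_no_cavity = np.corrcoef(conc_v, ratio_no_cavity)[0, 1]
r_cavity = np.corrcoef(conc_v, ratio_cavity)[0, 1]
print(f"r without cavity: {r_no_cavity:.3f}; with cavity: {r_cavity:.3f}")
```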

  4. Spatially-Resolved Proteomics: Rapid Quantitative Analysis of Laser Capture Microdissected Alveolar Tissue Samples

    Energy Technology Data Exchange (ETDEWEB)

    Clair, Geremy; Piehowski, Paul D.; Nicola, Teodora; Kitzmiller, Joseph A.; Huang, Eric L.; Zink, Erika M.; Sontag, Ryan L.; Orton, Daniel J.; Moore, Ronald J.; Carson, James P.; Smith, Richard D.; Whitsett, Jeffrey A.; Corley, Richard A.; Ambalavanan, Namasivayam; Ansong, Charles

    2016-12-22

    Global proteomics approaches allow characterization of whole tissue lysates to an impressive depth. However, it is now increasingly recognized that to better understand the complexity of multicellular organisms, global protein profiling of specific, spatially defined regions/substructures of tissues (i.e., spatially resolved proteomics) is essential. Laser capture microdissection (LCM) enables microscopic isolation of defined regions of tissues, preserving crucial spatial information. However, current proteomics workflows entail several manual sample preparation steps and are challenged by the microscopic, mass-limited samples generated by LCM, which impacts measurement robustness, quantification, and throughput. Here, we coupled LCM with a fully automated sample preparation workflow that, with a single manual step, allows protein extraction, tryptic digestion, peptide cleanup, and LC-MS/MS analysis of proteomes from microdissected tissues. Benchmarking against the current state of the art in ultrasensitive global proteomic analysis, our approach demonstrated significant improvements in quantification and throughput. Using our LCM-SNaPP proteomics approach, we characterized, to a depth of more than 3,400 proteins, the ontogeny of protein changes during normal lung development in laser capture microdissected alveolar tissue containing ~4,000 cells per sample. Importantly, the data revealed quantitative changes for 350 low-abundance transcription factors and signaling molecules, confirming earlier transcript-level observations and defining seven modules of coordinated transcription factor/signaling molecule expression patterns, suggesting that a complex network of temporal regulatory control directs normal lung development, with epigenetic regulation fine-tuning pre-natal developmental processes.
Our LCM-proteomics approach facilitates efficient, spatially-resolved, ultrasensitive global proteomics analyses in high-throughput that will be enabling for several clinical and

  5. Quantitative spatial analysis of the mouse brain lipidome by pressurized liquid extraction surface analysis

    DEFF Research Database (Denmark)

    Almeida, Reinaldo; Berzina, Zane; Christensen, Eva Arnspang

    2015-01-01

    Here we describe a novel surface sampling technique termed pressurized liquid extraction surface analysis (PLESA), which in combination with a dedicated high-resolution shotgun lipidomics routine enables both quantification and in-depth structural characterization of molecular lipid species … of internal lipid standards in the extraction solvent. The analysis of lipid microextracts by nanoelectrospray ionization provides long-lasting ion spray which in conjunction with a hybrid ion trap-orbitrap mass spectrometer enables identification and quantification of molecular lipid species using a method with successive polarity shifting, high-resolution Fourier transform mass spectrometry (FTMS), and fragmentation analysis. We benchmarked the performance of the PLESA approach for in-depth lipidome analysis by comparing it to conventional lipid extraction of excised tissue homogenates and by mapping the spatial …

  6. Spatial stochastic simulation offers potential as a quantitative method for pest risk analysis.

    Science.gov (United States)

    Rafoss, Trond

    2003-08-01

    Pest risk analysis represents an emerging field of risk analysis that evaluates the potential risks of the introduction and establishment of plant pests into a new geographic location and then assesses the management options to reduce those potential risks. Development of new and adapted methodology is required to answer questions concerning pest risk analysis of exotic plant pests. This research describes a new method for predicting the potential establishment and spread of a plant pest into new areas using a case study, Ralstonia solanacearum, a bacterial disease of potato. This method combines current quantitative methodologies, stochastic simulation, and geographic information systems with knowledge of pest biology and environmental data to derive new information about pest establishment potential in a geographical region where a pest had not been introduced. This proposed method extends an existing methodology for matching pest characteristics with environmental conditions by modeling and simulating dissemination behavior of a pest organism. Issues related to integrating spatial variables into risk analysis models are further discussed in this article.
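The general idea of combining a suitability surface with stochastic dissemination can be sketched as a simple grid simulation: repeated stochastic realizations of neighbor-to-neighbor spread yield an establishment-probability map. This is a toy illustration under invented parameters (the transmission factor `beta` and the suitability gradient), not the model used in the study:

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_spread(suitability, start, steps=40, beta=0.5):
    """One stochastic realization of pest spread on a suitability grid.
    Each step, every infested cell infests each 4-neighbour with
    probability beta * suitability of that neighbour."""
    infested = np.zeros_like(suitability, dtype=bool)
    infested[start] = True
    rows, cols = suitability.shape
    for _ in range(steps):
        for i, j in np.argwhere(infested):
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < rows and 0 <= nj < cols and not infested[ni, nj]:
                    if rng.random() < beta * suitability[ni, nj]:
                        infested[ni, nj] = True
    return infested

# Hypothetical suitability surface: low in the north (row 0), high in the south
suit = np.linspace(0.1, 0.9, 30)[:, None] * np.ones((30, 30))

# Monte Carlo over 50 realizations gives an establishment-probability map
risk = np.mean([simulate_spread(suit, (15, 15)) for _ in range(50)], axis=0)
print("establishment probability, north vs south edge:",
      risk[0].mean().round(2), risk[-1].mean().round(2))
```

The same skeleton extends naturally to GIS-derived suitability layers and to dispersal kernels beyond nearest neighbors.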

  7. Quantitative analysis of spatial sampling error in the infant and adult electroencephalogram.

    Science.gov (United States)

    Grieve, Philip G; Emerson, Ronald G; Isler, Joseph R; Stark, Raymond I

    2004-04-01

    The purpose of this report was to determine the required number of electrodes to record the infant and adult electroencephalogram (EEG) with a specified amount of spatial sampling error. We first developed mathematical theory that governs the spatial sampling of EEG data distributed on a spherical approximation to the scalp. We then used a concentric sphere model of current flow in the head to simulate realistic EEG data. Quantitative spatial sampling error was calculated for the simulated EEG, with additive measurement noise, for 64, 128, and 256 electrodes equally spaced over the surface of the sphere corresponding to the coverage of the human scalp by commercially available "geodesic" electrode arrays. We found the sampling error for the infant to be larger than that for the adult. For example, a sampling error of less than 10% for the adult was obtained with a 64-electrode array but a 256-electrode array was needed for the infant to achieve the same level of error. With the addition of measurement noise, with power 10 times less than that of the EEG, the sampling error increased to 25% for both the infant and adult, for these numbers of electrodes. These results show that accurate measurement of the spatial properties of the infant EEG requires more electrodes than for the adult.

  8. Spatial Quantitation of Drugs in tissues using Liquid Extraction Surface Analysis Mass Spectrometry Imaging

    Science.gov (United States)

    Swales, John G.; Strittmatter, Nicole; Tucker, James W.; Clench, Malcolm R.; Webborn, Peter J. H.; Goodwin, Richard J. A.

    2016-11-01

    Liquid extraction surface analysis mass spectrometry imaging (LESA-MSI) has been shown to be an effective tissue profiling and imaging technique, producing robust and reliable qualitative distribution images of an analyte or analytes in tissue sections. Here, we expand the use of LESA-MSI beyond qualitative analysis to a quantitative analytical technique by employing a mimetic tissue model previously shown to be applicable for MALDI-MSI quantitation. Liver homogenate was used to generate a viable and molecularly relevant control matrix for spiked drug standards which can be frozen, sectioned and subsequently analyzed for the generation of calibration curves to quantify unknown tissue section samples. The effects of extraction solvent composition, tissue thickness and solvent/tissue contact time were explored prior to any quantitative studies in order to optimize the LESA-MSI method across several different chemical entities. The use of an internal standard to normalize regional differences in ionization response across tissue sections was also investigated. Data are presented comparing quantitative results generated by LESA-MSI to LC-MS/MS. Subsequent analysis of adjacent tissue sections using DESI-MSI is also reported.
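Quantitation against a mimetic tissue model reduces to fitting a calibration line through the spiked-standard responses and inverting it for unknown sections. A minimal sketch with hypothetical peak-area ratios (the internal-standard normalization is assumed already applied; none of these values come from the study):

```python
import numpy as np

# Hypothetical calibration standards: drug spiked into liver homogenate (ng/g)
spiked = np.array([0.0, 50.0, 100.0, 250.0, 500.0])
# Analyte / internal-standard peak-area ratios measured by LESA-MSI
ratios = np.array([0.01, 0.26, 0.52, 1.24, 2.51])

# Linear calibration curve through the mimetic-model standards
slope, intercept = np.polyfit(spiked, ratios, 1)

# Quantify an unknown tissue section from its normalized response
unknown_ratio = 0.90
conc = (unknown_ratio - intercept) / slope
print(f"estimated concentration: {conc:.0f} ng/g")
```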

  9. Spatial Quantitation of Drugs in tissues using Liquid Extraction Surface Analysis Mass Spectrometry Imaging.

    Science.gov (United States)

    Swales, John G; Strittmatter, Nicole; Tucker, James W; Clench, Malcolm R; Webborn, Peter J H; Goodwin, Richard J A

    2016-11-24

    Liquid extraction surface analysis mass spectrometry imaging (LESA-MSI) has been shown to be an effective tissue profiling and imaging technique, producing robust and reliable qualitative distribution images of an analyte or analytes in tissue sections. Here, we expand the use of LESA-MSI beyond qualitative analysis to a quantitative analytical technique by employing a mimetic tissue model previously shown to be applicable for MALDI-MSI quantitation. Liver homogenate was used to generate a viable and molecularly relevant control matrix for spiked drug standards which can be frozen, sectioned and subsequently analyzed for the generation of calibration curves to quantify unknown tissue section samples. The effects of extraction solvent composition, tissue thickness and solvent/tissue contact time were explored prior to any quantitative studies in order to optimize the LESA-MSI method across several different chemical entities. The use of a internal standard to normalize regional differences in ionization response across tissue sections was also investigated. Data are presented comparing quantitative results generated by LESA-MSI to LC-MS/MS. Subsequent analysis of adjacent tissue sections using DESI-MSI is also reported.

  10. A tool for the quantitative spatial analysis of mammary gland epithelium

    Energy Technology Data Exchange (ETDEWEB)

    Ortiz de Solorzano, Carlos; Fernandez-Gonzalez, Rodrigo

    2004-04-09

    In this paper we present a method for the spatial analysis of complex cellular systems based on a multiscale study of neighborhood relationships. A function to measure those relationships, M, is introduced. The refined Relative Neighborhood Graph is then presented as a method to establish vicinity relationships within layered cellular structures, and particularized to epithelial cell nuclei in the mammary gland. Finally, the method is illustrated with two examples that show interactions within one population of epithelial cells and between two different populations.
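The Relative Neighborhood Graph mentioned above connects two points only if no third point is closer to both of them than they are to each other. A small self-contained sketch of the classical (unrefined) RNG criterion, on made-up nuclei centroids rather than the paper's data:

```python
import numpy as np
from itertools import combinations

def relative_neighborhood_graph(pts):
    """Edges (i, j) such that no third point k satisfies
    max(d(i,k), d(j,k)) < d(i,j) -- the classical RNG criterion."""
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    n = len(pts)
    edges = []
    for i, j in combinations(range(n), 2):
        if not any(max(d[i, k], d[j, k]) < d[i, j]
                   for k in range(n) if k not in (i, j)):
            edges.append((i, j))
    return edges

# Hypothetical epithelial nuclei centroids (arbitrary units)
pts = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.1], [0.4, 2.0]])
print(relative_neighborhood_graph(pts))
```

The paper's refined RNG restricts vicinity relationships to layered structures; the sketch above shows only the base criterion it builds on.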

  11. Stochastic optical reconstruction microscopy-based relative localization analysis (STORM-RLA) for quantitative nanoscale assessment of spatial protein organization.

    Science.gov (United States)

    Veeraraghavan, Rengasayee; Gourdie, Robert G

    2016-11-07

    The spatial association between proteins is crucial to understanding how they function in biological systems. Colocalization analysis of fluorescence microscopy images is widely used to assess this. However, colocalization analysis performed on two-dimensional images with diffraction-limited resolution merely indicates that the proteins are within 200-300 nm of each other in the xy-plane and within 500-700 nm of each other along the z-axis. Here we demonstrate a novel three-dimensional quantitative analysis applicable to single-molecule positional data: stochastic optical reconstruction microscopy-based relative localization analysis (STORM-RLA). This method offers significant advantages: 1) STORM imaging affords 20-nm resolution in the xy-plane; 2) STORM-RLA provides quantitative assessment of the frequency and degree of overlap between clusters of colabeled proteins; and 3) STORM-RLA also calculates the precise distances between both overlapping and nonoverlapping clusters in three dimensions. Thus STORM-RLA represents a significant advance in the high-throughput quantitative assessment of the spatial organization of proteins. © 2016 Veeraraghavan and Gourdie. This article is distributed by The American Society for Cell Biology under license from the author(s). Two months after publication it is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).
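The cluster-to-cluster distance calculation at the heart of this kind of relative localization analysis can be illustrated with nearest-neighbor distances between two sets of 3D cluster centroids. The coordinates below are invented, and this sketch omits the overlap-frequency part of STORM-RLA:

```python
import numpy as np

# Hypothetical 3D cluster centroids (nm) for two labelled proteins
protein_a = np.array([[0, 0, 0], [100, 0, 0], [0, 200, 50]], dtype=float)
protein_b = np.array([[10, 5, 0], [300, 300, 300]], dtype=float)

# Full pairwise distance matrix, then each A cluster's nearest B cluster
d = np.linalg.norm(protein_a[:, None, :] - protein_b[None, :, :], axis=-1)
nearest = d.min(axis=1)
print(nearest.round(1))   # distance (nm) from each A cluster to nearest B
```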

  12. Quantitative carbon analysis in coal by combining data processing and spatial confinement in laser-induced breakdown spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Li, Xiongwei [State Key Lab of Power Systems, Department of Thermal Engineering, Tsinghua-BP Clean Energy Center, Tsinghua University, Beijing (China); Guodian New Energy Technology Research Institute, Beijing (China); Yin, Hualiang [State Key Lab of Power Systems, Department of Thermal Engineering, Tsinghua-BP Clean Energy Center, Tsinghua University, Beijing (China); Wang, Zhe, E-mail: zhewang@tsinghua.edu.cn [State Key Lab of Power Systems, Department of Thermal Engineering, Tsinghua-BP Clean Energy Center, Tsinghua University, Beijing (China); Fu, Yangting; Li, Zheng; Ni, Weidou [State Key Lab of Power Systems, Department of Thermal Engineering, Tsinghua-BP Clean Energy Center, Tsinghua University, Beijing (China)

    2015-09-01

    Online measurement of carbon content of coal is important for coal-fired power plants to realize the combustion optimization of coal-fired boilers. Given that the measurement of carbon content of coal using laser-induced breakdown spectroscopy (LIBS) suffers from low measurement accuracy because of matrix effects, our previous study proposed a combination model to improve the measurement accuracy of carbon content of coal. The spatial confinement method, which utilizes the spectral emissions of laser-induced plasmas spatially confined by cavities for quantitative analysis, has the potential to improve quantitative analysis performance. In the present study, the combination model was used for coal measurement with cylindrical cavity confinement to further improve the measurement accuracy of carbon content of coal. Results showed that measurement accuracy was improved when the combination model was used with the spatial confinement method. The coefficient of determination, root-mean-square error of prediction, average relative error, and average absolute error for the combination model with cylindrical cavity confinement were 0.99, 1.35%, 1.66%, and 1.08%, respectively, whereas the values for the combination model without cylindrical cavity confinement were 0.99, 1.63%, 1.82%, and 1.27%, respectively. This is the first time that the average absolute error of carbon measurement in coal analysis by LIBS has come close to 1.0%, the critical requirement set for the traditional chemical analysis method by the Chinese national standard. These results indicate that LIBS has significant application potential for coal analysis. - Highlights: • The spatial confinement method is applied to the measurement of carbon content in coal. • The previously proposed combination model is used with the spatial confinement method. • The final result is, for the first time, close to the critical reproducibility requirement of the Chinese national standard.
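The four figures of merit quoted above (coefficient of determination, root-mean-square error of prediction, average relative error, average absolute error) are standard prediction statistics. A short sketch with hypothetical reference and predicted carbon contents, not the paper's data, shows how each is computed:

```python
import numpy as np

def prediction_metrics(y_ref, y_pred):
    """R^2, RMSE of prediction, average relative error (%), and average
    absolute error, as used to compare the calibration models."""
    ss_res = np.sum((y_ref - y_pred) ** 2)
    ss_tot = np.sum((y_ref - y_ref.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    rmsep = np.sqrt(np.mean((y_ref - y_pred) ** 2))
    are = np.mean(np.abs(y_ref - y_pred) / y_ref) * 100.0
    aae = np.mean(np.abs(y_ref - y_pred))
    return r2, rmsep, are, aae

# Hypothetical reference vs. LIBS-predicted carbon content (wt%)
y_ref = np.array([55.0, 60.0, 65.0, 70.0, 75.0])
y_pred = np.array([54.0, 61.2, 64.5, 71.0, 74.2])
r2, rmsep, are, aae = prediction_metrics(y_ref, y_pred)
print(f"R2={r2:.3f} RMSEP={rmsep:.2f} ARE={are:.2f}% AAE={aae:.2f}")
```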

  13. Quantitative analysis of illusory movement : spatial filtering and line localization in the human visual system

    NARCIS (Netherlands)

    Jansonius, Nomdo M.; Stam, Lucas; de Jong, Tim; Pijpker, Ben A.

    2014-01-01

    A narrow bar or line (width around 1 arcmin) between two fields of which the luminances are sinusoidally and in counterphase modulated in time appears to make an oscillatory movement. It is possible to annihilate this illusory movement with a real movement and thus to analyze this phenomenon quantitatively.

  14. Quantitative Analysis of Spatial Protein-protein Proximity in Fluorescence Confocal Microscopy

    Science.gov (United States)

    Wu, Yong; Liu, Yi-Kuang; Eghbali, Mansoureh; Stefani, Enrico

    2009-02-01

    To quantify spatial protein-protein proximity (colocalization) in fluorescence microscopic images, cross-correlation and autocorrelation functions were decomposed into fast and slowly decaying components. The fast component results from clusters of proteins specifically labeled and the slow one from background/image heterogeneity. We show that the calculation of the protein-protein proximity index and the correlation coefficient are more reliably determined by extracting the fast-decaying component.
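The decomposition into fast- and slowly-decaying components can be sketched on a synthetic correlation function: estimate the slow background from the large-lag tail (where the fast, cluster-specific component has died out), then subtract it to isolate the fast component. This is an illustrative reconstruction under assumed exponential forms, not the authors' fitting procedure:

```python
import numpy as np

# Synthetic cross-correlation: fast component (specifically labeled clusters)
# plus a slowly decaying component (background/image heterogeneity)
r = np.arange(0, 50.0)                 # lag in pixels
corr = 0.6 * np.exp(-r / 2.0) + 0.3 * np.exp(-r / 40.0)

# The slow component dominates at large lags: fit it there (log-linear),
# then subtract it everywhere to recover the fast component.
tail = r >= 20
b, log_a = np.polyfit(r[tail], np.log(corr[tail]), 1)
slow_est = np.exp(log_a) * np.exp(b * r)
fast_est = corr - slow_est
print(f"estimated fast amplitude at r=0: {fast_est[0]:.2f}")
```

With the fast component isolated, a proximity index can be computed from its zero-lag amplitude without contamination from image heterogeneity.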

  15. Quantitative analysis and modelling of hepatic iron stores using stereology and spatial statistics.

    Science.gov (United States)

    Ghugre, N R; Gonzalez-Gomez, I; Shimada, H; Coates, T D; Wood, J C

    2010-06-01

    Hepatic iron overload is a common clinical problem resulting from hyperabsorption syndromes and from chronic transfusion therapy. Not only does iron loading vary between reticuloendothelial stores and hepatocytes, but iron is heterogeneously distributed within hepatocytes as well. Since the accessibility of iron particles to chelation may depend, in part, on their distribution, we sought to characterize the shape and scale of iron deposition in humans with transfusional iron overload. Toward this end, we performed a histological analysis of iron stores in liver biopsy specimens of 20 patients (1.3-57.8 mg iron/g dry tissue weight) with aid of electron and light microscopy. We estimated distributions related to variability in siderosomal size, proximity of iron centres and inter-cellular iron loading. These distributions could be well modelled by Gamma distribution functions over most of the pathologic range of iron concentrations. Thus, for a given liver iron burden, a virtual iron-overloaded liver could be created that served as a model for the true histologic appearance. Such a model may be helpful for understanding the mechanics of iron loading or in predicting response to iron removal therapy.
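Fitting Gamma distribution functions to iron-deposit statistics can be sketched with a method-of-moments estimate (shape k = mean²/variance, scale θ = variance/mean). The data here are simulated draws under assumed parameters, not the biopsy measurements:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical siderosome sizes (arbitrary units) drawn from a Gamma law
k_true, theta_true = 2.5, 0.8
sizes = rng.gamma(k_true, theta_true, size=5000)

# Method-of-moments Gamma fit: k = mean^2 / var, theta = var / mean
m, v = sizes.mean(), sizes.var()
k_hat, theta_hat = m * m / v, v / m
print(f"k ~ {k_hat:.2f} (true {k_true}), theta ~ {theta_hat:.2f} (true {theta_true})")
```

The fitted parameters are what would let a "virtual iron-overloaded liver" be generated for a given total iron burden.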

  16. A matter of ephemerality: the study of Kel Tadrart Tuareg (southwest Libya) campsites via quantitative spatial analysis

    Directory of Open Access Journals (Sweden)

    Stefano Biagetti

    2016-03-01

    We examined the settlement structure of the Kel Tadrart Tuareg, a small pastoral society from southwest Libya. Our objective was to apply spatial analysis to establish the statistical significance of specific patterns in the settlement layout. In particular, we examined whether there is a separation between domestic and livestock spaces, and whether particular residential features dedicated to guests are spatially isolated. We used both established statistical techniques and newly developed bespoke analyses to test our hypotheses, and then discussed the results in the light of possible applications to other case studies.

  17. Quantitative investment analysis

    CERN Document Server

    DeFusco, Richard

    2007-01-01

    In the "Second Edition" of "Quantitative Investment Analysis," financial experts Richard DeFusco, Dennis McLeavey, Jerald Pinto, and David Runkle outline the tools and techniques needed to understand and apply quantitative methods to today's investment process.

  18. Spatial modulation spectroscopy for imaging and quantitative analysis of single dye-doped organic nanoparticles inside cells

    Science.gov (United States)

    Devadas, Mary Sajini; Devkota, Tuphan; Guha, Samit; Shaw, Scott K.; Smith, Bradley D.; Hartland, Gregory V.

    2015-05-01

    Imaging of non-fluorescent nanoparticles in complex biological environments, such as the cell cytosol, is a challenging problem. For metal nanoparticles, Rayleigh scattering methods can be used, but for organic nanoparticles, such as dye-doped polymer beads or lipid nanoparticles, light scattering does not provide good contrast. In this paper, spatial modulation spectroscopy (SMS) is used to image single organic nanoparticles doped with non-fluorescent, near-IR croconaine dye. SMS is a quantitative imaging technique that yields the absolute extinction cross-section of the nanoparticles, which can be used to determine the number of dye molecules per particle. SMS images were recorded for particles within EMT-6 breast cancer cells. The measurements allowed mapping of the nanoparticle location and the amount of dye in a single cell. The results demonstrate how SMS can facilitate efforts to optimize dye-doped nanoparticles for effective photothermal therapy of cancer.
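Converting a measured extinction cross-section into a dye count divides the particle's cross-section by the per-molecule absorption cross-section, σ = 1000·ln(10)·ε/N_A (with ε the molar absorptivity in M⁻¹ cm⁻¹). The values below are hypothetical placeholders, not the paper's measurements:

```python
import math

# Hypothetical inputs: particle cross-section from SMS, dye molar absorptivity
sigma_particle_cm2 = 1.0e-10    # cm^2, single-particle extinction from SMS
epsilon = 2.0e5                 # L mol^-1 cm^-1, assumed near-IR dye value
N_A = 6.022e23                  # Avogadro's number

# Per-molecule absorption cross-section: sigma = 1000 * ln(10) * eps / N_A
sigma_molecule_cm2 = 1000 * math.log(10) * epsilon / N_A
n_dyes = sigma_particle_cm2 / sigma_molecule_cm2
print(f"~{n_dyes:.0f} dye molecules per particle")
```

This simple division assumes the particle's extinction is dominated by dye absorption rather than scattering, which is the regime the paper targets for non-scattering organic particles.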

  19. The quantitative modelling of human spatial habitability

    Science.gov (United States)

    Wise, J. A.

    1985-01-01

    A model for the quantitative assessment of human spatial habitability is presented in the space station context. The visual aspect assesses how interior spaces appear to the inhabitants. This aspect concerns criteria such as sensed spaciousness and the affective (emotional) connotations of settings' appearances. The kinesthetic aspect evaluates the available space in terms of its suitability to accommodate human movement patterns, as well as the postural and anthropometric changes due to microgravity. Finally, social logic concerns how the volume and geometry of available space either affirms or contravenes established social and organizational expectations for spatial arrangements. Here, the criteria include privacy, status, social power, and proxemics (the uses of space as a medium of social communication).

  20. Multivariate Quantitative Chemical Analysis

    Science.gov (United States)

    Kinchen, David G.; Capezza, Mary

    1995-01-01

    Technique of multivariate quantitative chemical analysis devised for use in determining relative proportions of two components mixed and sprayed together onto object to form thermally insulating foam. Potentially adaptable to other materials, especially in process-monitoring applications in which necessary to know and control critical properties of products via quantitative chemical analyses of products. In addition to chemical composition, also used to determine such physical properties as densities and strengths.

  2. Quantitative Hydrocarbon Surface Analysis

    Science.gov (United States)

    Douglas, Vonnie M.

    2000-01-01

    The elimination of ozone depleting substances, such as carbon tetrachloride, has resulted in the use of new analytical techniques for cleanliness verification and contamination sampling. The last remaining application at Rocketdyne which required a replacement technique was the quantitative analysis of hydrocarbons by infrared spectrometry. This application, which previously utilized carbon tetrachloride, was successfully modified using the SOC-400, a compact portable FTIR manufactured by Surface Optics Corporation. This instrument can quantitatively measure and identify hydrocarbons from solvent flush of hardware as well as directly analyze the surface of metallic components without the use of ozone depleting chemicals. Several sampling accessories are utilized to perform analysis for various applications.

  3. Geologic spatial analysis

    Energy Technology Data Exchange (ETDEWEB)

    Thiessen, R.L.; Eliason, J.R.

    1989-01-01

    This report describes the development of geologic spatial analysis research which focuses on conducting comprehensive three-dimensional analysis of regions using geologic data sets that can be referenced by latitude, longitude, and elevation/depth. (CBS)

  4. Spatial Data Analysis.

    Science.gov (United States)

    Banerjee, Sudipto

    2016-01-01

    With increasing accessibility to geographic information systems (GIS) software, statisticians and data analysts routinely encounter scientific data sets with geocoded locations. This has generated considerable interest in statistical modeling for location-referenced spatial data. In public health, spatial data routinely arise as aggregates over regions, such as counts or rates over counties, census tracts, or some other administrative delineation. Such data are often referred to as areal data. This review article provides a brief overview of statistical models that account for spatial dependence in areal data. It does so in the context of two applications: disease mapping and spatial survival analysis. Disease maps are used to highlight geographic areas with high and low prevalence, incidence, or mortality rates of a specific disease and the variability of such rates over a spatial domain. They can also be used to detect hot spots or spatial clusters that may arise owing to common environmental, demographic, or cultural effects shared by neighboring regions. Spatial survival analysis refers to the modeling and analysis for geographically referenced time-to-event data, where a subject is followed up to an event (e.g., death or onset of a disease) or is censored, whichever comes first. Spatial survival analysis is used to analyze clustered survival data when the clustering arises from geographical regions or strata. Illustrations are provided in these application domains.

  5. The quantitative modelling of human spatial habitability

    Science.gov (United States)

    Wise, James A.

    1988-01-01

    A theoretical model for evaluating human spatial habitability (HuSH) in the proposed U.S. Space Station is developed. Optimizing the fitness of the Space Station environment for human occupancy will help reduce environmental stress due to long-term isolation and confinement in its small habitable volume. The development of tools that operationalize the behavioral bases of spatial volume for visual, kinesthetic, and social logic considerations is suggested. The report further calls for systematic scientific investigations of how much real, and how much perceived, volume people need in order to function normally and with minimal stress in space-based settings. The theoretical model can be applied to an interior of any size or shape, at any scale of consideration, from the Space Station as a whole to an individual enclosure or work station. Using as a point of departure the Isovist model developed by Dr. Michael Benedikt of the U. of Texas, the report suggests that spatial habitability can become as amenable to careful assessment as engineering and life support concerns.

  6. Quantitative Techniques in Volumetric Analysis

    Science.gov (United States)

    Zimmerman, John; Jacobsen, Jerrold J.

    1996-12-01

    Quantitative Techniques in Volumetric Analysis is a visual library of techniques used in making volumetric measurements. This 40-minute VHS videotape is designed as a resource for introducing students to proper volumetric methods and procedures. The entire tape, or relevant segments, can also be used to review procedures used in subsequent experiments that rely on the traditional art of quantitative analysis laboratory practice. The techniques included are: quantitative transfer of a solid with a weighing spoon; quantitative transfer of a solid with a finger-held weighing bottle; quantitative transfer of a solid with a paper-strap-held bottle; quantitative transfer of a solid with a spatula; examples of common quantitative weighing errors; quantitative transfer of a solid from dish to beaker to volumetric flask; quantitative transfer of a solid from dish to volumetric flask; volumetric transfer pipet; a complete acid-base titration; and hand technique variations. The conventional view of contemporary quantitative chemical measurement tends to focus on instrumental systems, computers, and robotics. In this view, the analyst is relegated to placing standards and samples on a tray; a robotic arm delivers a sample to the analysis center, while a computer controls the analysis conditions and records the results. In spite of this, it is rare to find an analysis process that does not rely on some aspect of more traditional quantitative analysis techniques, such as careful dilution to the mark of a volumetric flask. Clearly, errors in a classical step will affect the quality of the final analysis. Because of this, it is still important for students to master the key elements of the traditional art of quantitative chemical analysis laboratory practice. Some aspects of chemical analysis, like careful rinsing to insure quantitative transfer, are often an automated part of an instrumental process that must be understood by the

  7. Quantitative Seismic Amplitude Analysis

    OpenAIRE

    Dey, A. K.

    2011-01-01

    The Seismic Value Chain quantifies the cyclic interaction between seismic acquisition, imaging and reservoir characterization. Modern seismic innovation to address the global imbalance in hydrocarbon supply and demand requires such cyclic interaction of both feed-forward and feed-back processes. Currently, the seismic value chain paradigm is in a feed-forward mode. Modern seismic data now have the potential to yield the best images in terms of spatial resolution, amplitude accuracy, and incre...

  8. The Kohs' blocks test as an important instrument to investigate the visuo-spatial impairments in myotonic dystrophy: part I. Quantitative and qualitative analysis

    Directory of Open Access Journals (Sweden)

    WIGG CRISTINA MARIA DUARTE

    1999-01-01

    This study presents the performance of 39 cases of myotonic dystrophy (21 females and 18 males, age range 9 to 70 years) on the Kohs' blocks test, in which patients must reproduce figures from models previously shown to them. Some of the patients had some kind of professional activity, while others had never held a professional occupation. The patients showed considerable difficulty in performing the test. Some cases constructed figures entirely different from the presented drawings, reflecting visuo-spatial and constructional disabilities. Performance was insufficient in 71.4% of the cases; these cases solved less than 50% of the test. The levels of analysis and synthesis were severely impaired. A total of 18 cases scored less than 10 points, not reaching 20% of the test. The results show the sensitivity of this test in detecting visuo-spatial impairment in myotonic dystrophy.

  9. Quantitative Risk Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Helms, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-10

    The US energy sector is vulnerable to multiple hazards including both natural disasters and malicious attacks from an intelligent adversary. The question that utility owners, operators and regulators face is how to prioritize their investments to mitigate the risks from a hazard that can have the most impact on the asset of interest. In order to be able to understand their risk landscape and develop a prioritized mitigation strategy, they must quantify risk in a consistent way across all hazards their asset is facing. Without being able to quantitatively measure risk, it is not possible to defensibly prioritize security investments or evaluate trade-offs between security and functionality. Development of a methodology that will consistently measure and quantify risk across different hazards is needed.
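    The cross-hazard quantification the methodology calls for can be illustrated with annualized risk, event frequency times expected consequence, which puts natural disasters and malicious attacks on one comparable scale. A minimal sketch with hypothetical numbers, not figures from the report:

```python
# Hypothetical hazard profile for one asset:
# (annual occurrence frequency, expected loss per event in $M).
hazards = {
    "hurricane":    (0.20, 50.0),
    "cyber attack": (0.50, 8.0),
    "earthquake":   (0.02, 300.0),
}

# Annualized risk = frequency x consequence, a single metric that makes
# dissimilar hazards comparable for prioritizing mitigation investments.
annualized_risk = {name: freq * loss for name, (freq, loss) in hazards.items()}
priority = sorted(annualized_risk, key=annualized_risk.get, reverse=True)
```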

  10. Professional analysis in spatial planning

    Directory of Open Access Journals (Sweden)

    Andrej Černe

    2005-12-01

    Spatial analysis contributes to the accomplishment of the three basic aims of spatial planning: it is a basic element in setting spatial policies, concepts, and strategies; it gives basic information to inhabitants, land owners, investors, and planners; and it helps in carrying out spatial policies, strategies, plans, programmes, and projects. Analyses in planning are generally devoted to: understanding current circumstances and emerging conditions within planning decisions; determining the priorities of open questions and their solutions; and formulating general principles for further development.

  11. Quantitative Evaluation of Spatial Openness of Built Environment Using Visual Impact Analysis

    Institute of Scientific and Technical Information of China (English)

    钮心毅; 徐方

    2011-01-01

    This paper presents a quantitative approach for evaluating the spatial openness of the built environment using visual impact analysis. First, the impact of the built environment on human visual perception is divided into two-dimensional and three-dimensional visual impact. On this basis, we extend Yoshinobu Ashihara's D/H index for exterior architectural space and develop a visual impact analysis model that combines the two-dimensional and three-dimensional visual impacts to calculate the spatial openness of the built environment. Using GIS-based visibility analysis, two quantitative indices, Visible Ratio and Average Visual Obstructive Distance, are proposed for measuring spatial openness quantitatively. The approach can be applied in urban design, in particular for quantitative evaluation of how different building forms and layouts affect open space, providing a technical means of evaluating urban design proposals.
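    The two indices can be sketched from hypothetical sight-line samples around one viewpoint: for each sampled direction we record the distance at which the line of sight is first blocked by a building, or NaN if it stays open to the study boundary. The sampling scheme and values below are assumptions, not the paper's GIS implementation:

```python
import numpy as np

# First-blocking distance (m) per sampled sight line; NaN = unobstructed.
blocked_at = np.array([35.0, np.nan, 12.0, np.nan, 80.0, 50.0])

open_line = np.isnan(blocked_at)
visible_ratio = open_line.mean()               # share of unobstructed lines
avg_obstructive_dist = np.nanmean(blocked_at)  # mean first-blocking distance
```

    A more open site yields a higher Visible Ratio and a larger average distance before the view is cut off.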

  12. Monotowns: A Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Shastitko Andrei

    2016-06-01

    The authors propose an empirical analysis of the current situation in monotowns. The study questions the perceived seriousness of the 'monotown problem' as well as the actual challenges it presents. The authors use cluster analysis to divide monotowns into groups for further structural comparison. Structural differences in the available databases limit the possibilities of empirical analysis, so alternative approaches are required; the authors consider possible reasons for the limitations identified. Special attention is paid to the monotowns that were granted the status of advanced development territories. A comparative analysis makes it possible to study their general characteristics and socioeconomic indicators. The authors apply the theory of opportunistic behaviour to describe potential problems caused by the lack of unified criteria for granting monotowns the status of advanced development territories. The article identifies the main stakeholders and the character of their interaction; it describes a conceptual model built on the principal/agent interactions, and identifies the parametric space of mutually beneficial cooperation. The solution to the principal/agent problem suggested in the article contributes to the development of an alternative approach to the current situation and a rational approach to overcoming the 'monotown problem'.

  13. Perspectives on spatial data analysis

    CERN Document Server

    Rey, Sergio

    2010-01-01

    This book takes both a retrospective and prospective view of the field of spatial analysis by combining selected reprints of classic articles by Arthur Getis with current observations by leading experts in the field. Four main aspects are highlighted, dealing with spatial analysis, pattern analysis, local statistics as well as illustrative empirical applications. Researchers and students will gain an appreciation of Getis' methodological contributions to spatial analysis and the broad impact of the methods he has helped pioneer on an impressively broad array of disciplines including spatial epidemiology, demography, economics, and ecology. The volume is a compilation of high impact original contributions, as evidenced by citations, and the latest thinking on the field by leading scholars. This makes the book ideal for advanced seminars and courses in spatial analysis as well as a key resource for researchers seeking a comprehensive overview of recent advances and future directions in the field.

  14. Quantitative analysis of glycated proteins.

    Science.gov (United States)

    Priego-Capote, Feliciano; Ramírez-Boo, María; Finamore, Francesco; Gluck, Florent; Sanchez, Jean-Charles

    2014-02-07

    The proposed protocol presents a comprehensive approach for large-scale qualitative and quantitative analysis of glycated proteins (GP) in complex biological samples including biological fluids and cell lysates such as plasma and red blood cells. The method, named glycation isotopic labeling (GIL), is based on the differential labeling of proteins with isotopic [(13)C6]-glucose, which supports quantitation of the resulting glycated peptides after enzymatic digestion with endoproteinase Glu-C. The key principle of the GIL approach is the detection of doublet signals for each glycated peptide in MS precursor scanning (glycated peptide with in vivo [(12)C6]- and in vitro [(13)C6]-glucose). The mass shift of the doublet signals is +6, +3 or +2 Da depending on the peptide charge state and the number of glycation sites. The intensity ratio between doublet signals generates quantitative information of glycated proteins that can be related to the glycemic state of the studied samples. Tandem mass spectrometry with high-energy collisional dissociation (HCD-MS2) and data-dependent methods with collision-induced dissociation (CID-MS3 neutral loss scan) are used for qualitative analysis.
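    The doublet spacing the protocol detects follows directly from the labeling scheme: a nominal +6 Da per [(13)C6]-glucose site, divided by the peptide's charge state. A simplified sketch using the nominal shift rather than the exact isotopic mass difference:

```python
def doublet_mz_shift(n_sites: int, charge: int) -> float:
    """m/z spacing between the in vivo [12C6]- and in vitro [13C6]-glucose
    forms of a glycated peptide: nominal +6 Da per glycation site,
    divided by the charge state (simplified nominal-mass sketch)."""
    return 6.0 * n_sites / charge
```

    For a single glycation site this reproduces the +6, +3, and +2 spacings quoted above at charge states 1, 2, and 3.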

  15. Submarine Pipeline Routing Risk Quantitative Analysis

    Institute of Scientific and Technical Information of China (English)

    徐慧; 于莉; 胡云昌; 王金英

    2004-01-01

    A new method for quantitative analysis of submarine pipeline routing risk is presented, developing the study from qualitative to quantitative analysis. The characteristics of the potential risks of the submarine pipeline system are considered, and grey-mode identification theory is used. The study comprises three parts: establishing the index system for quantitative routing risk analysis, establishing the grey-mode identification model for quantitative routing risk analysis, and establishing the standard for interpreting the mode identification result. A computed example shows that this model can directly and concisely reflect the hazard degree of a routing, and it supports routing selection in the future.

  16. Progress in spatial analysis methods and applications

    CERN Document Server

    Páez, Antonio; Buliung, Ron N; Dall'erba, Sandy

    2010-01-01

    This book brings together developments in spatial analysis techniques, including spatial statistics, econometrics, and spatial visualization, and applications to fields such as regional studies, transportation and land use, population and health.

  17. Regulation of the spatial code for BDNF mRNA isoforms in the rat hippocampus following pilocarpine-treatment: a systematic analysis using laser microdissection and quantitative real-time PCR.

    Science.gov (United States)

    Baj, Gabriele; Del Turco, Domenico; Schlaudraff, Jessica; Torelli, Lucio; Deller, Thomas; Tongiorgi, Enrico

    2013-05-01

    Brain-derived neurotrophic factor (BDNF) is essential for neuronal survival, differentiation, and plasticity and is one of those genes that generate multiple mRNAs with different alternatively spliced 5'UTRs. The functional significance of many BDNF transcripts, each producing the same protein, is emerging. On the basis of the analysis of the four most abundant brain BDNF transcripts, we recently proposed the "spatial code hypothesis of BDNF splice variants" according to which the BDNF transcripts, through their differential subcellular localization in soma or dendrites, represent a mechanism to synthesize the protein at distinct locations and produce local effects. In this study, using laser microdissection of hippocampal laminae and reverse transcription-quantitative real-time PCR (RT-qPCR), we analyzed all known BDNF mRNA variants at resting conditions or following 3 h pilocarpine-induced status epilepticus. In untreated rats, we found dendritic enrichment of BDNF transcripts encoding exons 6 and 7 in CA1; exons 1, 6, and 9a in CA3; and exons 5, 6, 7, and 8 in DG. Considering the low abundance of the other transcripts, exon 6 was the main transcript in dendrites under resting conditions. Pilocarpine treatment induced an increase of BDNF transcripts encoding exons 4 and 6 in all dendritic laminae and, additionally, of exon 2 in CA1 stratum radiatum and exons 2, 3, 9a in DG molecular layer while the other transcripts were decreased in dendrites, suggesting restriction to the soma. These results support the hypothesis of a spatial code to differentially regulate BDNF in the somatic or dendritic compartment under conditions of pilocarpine-induced status epilepticus and, furthermore, highlight the existence of subfield-specific differences.
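    Relative transcript levels from RT-qPCR data of the kind used here are commonly computed with the 2^-ddCt method: normalize the target Ct to a reference gene in each condition, then compare conditions. This is illustrative; the study's exact normalization scheme is not specified here:

```python
def fold_change(ct_target_treated: float, ct_ref_treated: float,
                ct_target_control: float, ct_ref_control: float) -> float:
    """Relative RT-qPCR quantification by the standard 2^-ddCt method:
    dCt = Ct(target) - Ct(reference) per condition, ddCt = dCt(treated)
    - dCt(control), fold change = 2^-ddCt."""
    d_ct_treated = ct_target_treated - ct_ref_treated
    d_ct_control = ct_target_control - ct_ref_control
    return 2.0 ** -(d_ct_treated - d_ct_control)
```

    A hypothetical BDNF transcript with Ct values of 24 (treated) and 26 (control) against a reference gene at Ct 20 in both conditions gives a 4-fold increase.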

  18. A New Methodology of Spatial Crosscorrelation Analysis

    CERN Document Server

    Chen, Yanguang

    2015-01-01

    The idea of spatial crosscorrelation was conceived of long ago. However, unlike the related spatial autocorrelation, the theory and method of spatial crosscorrelation analysis have remained undeveloped. This paper presents a set of models and working methods for spatial crosscorrelation analysis. By analogy with Moran's index newly expressed in a spatial quadratic form and by means of mathematical reasoning, I derive a theoretical framework for geographical crosscorrelation analysis. First, two sets of spatial crosscorrelation coefficients are defined, including a global spatial crosscorrelation coefficient and a set of local spatial crosscorrelation coefficients. Second, a pair of scatterplots of spatial crosscorrelation is proposed, and different scatterplots show different relationships between correlated variables. Based on the spatial crosscorrelation coefficient, Pearson's correlation coefficient can be decomposed into two parts: direct correlation (partial crosscorrelation) and indirect correlation (sp...
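    The global coefficient can be sketched by analogy with Moran's I in quadratic form: standardize both variables, normalize the spatial weights to sum to one, and evaluate z_x^T W z_y. This is a simplified reading of the construction described above, not the paper's exact formulation:

```python
import numpy as np

def spatial_crosscorr(x, y, w):
    """Global spatial cross-correlation coefficient: z_x^T W z_y with
    standardized variables and sum-to-one spatial weights."""
    x, y, w = np.asarray(x, float), np.asarray(y, float), np.asarray(w, float)
    w = w.copy()
    np.fill_diagonal(w, 0.0)          # no self-neighbours
    w = w / w.sum()                   # global sum-to-one normalization
    zx = (x - x.mean()) / x.std()
    zy = (y - y.mean()) / y.std()
    return float(zx @ w @ zy)

# With y = x this reduces to a Moran-type autocorrelation coefficient.
line = np.array([[0, 1, 0, 0],
                 [1, 0, 1, 0],
                 [0, 1, 0, 1],
                 [0, 0, 1, 0]], float)   # rook adjacency on a line of 4 cells
r = spatial_crosscorr([1, 2, 3, 4], [1, 2, 3, 4], line)
```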

  19. Quantitative analysis of protein turnover in plants.

    Science.gov (United States)

    Nelson, Clark J; Li, Lei; Millar, A Harvey

    2014-03-01

    Proteins are constantly being synthesised and degraded as plant cells age and as plants grow, develop and adapt the proteome. Given that plants develop through a series of events from germination to fruiting and even undertake whole organ senescence, an understanding of protein turnover as a fundamental part of this process in plants is essential. Both synthesis and degradation processes are spatially separated in a cell across its compartmented structure. The majority of protein synthesis occurs in the cytosol, while synthesis of specific components occurs inside plastids and mitochondria. Degradation of proteins occurs in both the cytosol, through the action of the plant proteasome, and in organelles and lytic structures through different protease classes. Tracking the specific synthesis and degradation rate of individual proteins can be undertaken using stable isotope feeding and the ability of peptide MS to track labelled peptide fractions over time. Mathematical modelling can be used to follow the isotope signature of newly synthesised protein as it accumulates and natural abundance proteins as they are lost through degradation. Different technical and biological constraints govern the potential for the use of (13)C, (15)N, (2)H and (18)O for these experiments in complete labelling and partial labelling strategies. Future development of quantitative protein turnover analysis will involve analysis of protein populations in complexes and subcellular compartments, assessing the effect of PTMs and integrating turnover studies into wider system biology study of plants.
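    The isotope-signature modelling described above can be sketched with the simplest steady-state case: the labelled (newly synthesised) fraction of a protein pool after stable-isotope feeding follows f(t) = 1 - exp(-k t), where k is the turnover rate constant. Synthetic, noiseless data:

```python
import numpy as np

# Hypothetical labelled fractions of one protein over a feeding time course.
t = np.array([0.5, 1.0, 2.0, 4.0])         # days of labelling
k_true = 0.7                                # per day (hypothetical protein)
frac_labelled = 1.0 - np.exp(-k_true * t)

# Log-linearize, ln(1 - f) = -k t, and solve for k by least squares.
k_est = -np.linalg.lstsq(t[:, None], np.log(1.0 - frac_labelled),
                         rcond=None)[0][0]
```

    With real peptide MS data the labelled fractions are noisy, so the fit's residuals, rather than recovering k exactly, bound the turnover estimate.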

  20. Evaluation of the Possibility of Applying Spatial 3D Imaging Using X-Ray Computed Tomography Reconstruction Methods for Quantitative Analysis of Multiphase Materials / Rentgenowska Analiza Ilościowa Materiałów Wielofazowych Z Wykorzystaniem Przestrzennego Obrazowania (3D Przy Użyciu Metod Rekonstrukcji Tomografii Komputerowej

    Directory of Open Access Journals (Sweden)

    Matysik P.

    2015-12-01

    In this paper the possibility of using X-ray computed tomography (CT) in quantitative metallographic studies of homogeneous and composite materials is presented. Samples of spheroidal cast iron, an Fe-Ti powder-mixture compact, and an epoxy composite reinforced with glass fibers were subjected to comparative structural tests. The volume fractions of each phase component were determined by conventional methods, using scanning electron microscopy (SEM) and quantitative X-ray diffraction (XRD) analysis. These results were compared with those obtained by spatial analysis of the reconstructed CT image. Based on the comparative analysis, taking into account the selectivity of the data verification methods and the accuracy of the obtained results, the authors conclude that computed tomography is suitable for quantitative analysis of several types of structural materials.
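    The CT-based quantitation reduces to a simple voxel count once the reconstructed volume is segmented so that each voxel carries a phase label: the volume fraction of a phase is its voxel count over the total. A minimal sketch on a synthetic labelled volume, not real tomographic data:

```python
import numpy as np

# Synthetic segmented CT volume: each voxel holds a phase id
# (e.g. 0 = matrix, 1 and 2 = minority phases).
rng = np.random.default_rng(0)
labels = rng.integers(0, 3, size=(64, 64, 64))

# Volume fraction per phase = voxel count / total voxels, the quantity
# compared against the SEM and XRD estimates.
counts = np.bincount(labels.ravel(), minlength=3)
volume_fractions = counts / labels.size
```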

  1. Quantitative analysis of saccadic search strategy

    NARCIS (Netherlands)

    Over, E.A.B.

    2007-01-01

    This thesis deals with the quantitative analysis of saccadic search strategy. The goal of the research presented was twofold: 1) to quantify overall characteristics of fixation location and saccade direction, and 2) to identify search strategies, with the use of a quantitative description of eye mov

  3. Omnidirectional Analysis of Spatial Manipulator

    Directory of Open Access Journals (Sweden)

    Yuquan Leng

    2015-01-01

    Space manipulators are mainly used in spatial loading tasks. Given the diversity of spatial loads, the installation positions of test loads, and the utilization ratio of a test platform, the space manipulator must evaluate its own position and attitude. This paper proposes the Point Omnidirectional Coefficient (POC), based on a unit attitude sphere/circle, to describe the attitude of the end-effector. It evaluates any point in the attainable space of the manipulator, combines this with the manipulator's position information, and obtains the relationship between position and attitude for all points in the attainable space. The paper presents the mapping between the sphere surface and the plane for mission attitude constraints, and a method for calculating the volume of point spaces, including the attainable space, the omnidirectional space, and the mission attitude space. Furthermore, a Manipulator Omnidirectional Coefficient, defined with or without a mission, is proposed for evaluating manipulator performance. Analysis and simulation of 3D and 2D manipulators show that the theoretical approach is feasible, and relationships among link lengths, joint angles, attainable space, and the Manipulator Omnidirectional Coefficient are derived to guide design.

  4. Spatial analysis statistics, visualization, and computational methods

    CERN Document Server

    Oyana, Tonny J

    2015-01-01

    An introductory text for the next generation of geospatial analysts and data scientists, Spatial Analysis: Statistics, Visualization, and Computational Methods focuses on the fundamentals of spatial analysis using traditional, contemporary, and computational methods. Outlining both non-spatial and spatial statistical concepts, the authors present practical applications of geospatial data tools, techniques, and strategies in geographic studies. They offer a problem-based learning (PBL) approach to spatial analysis-containing hands-on problem-sets that can be worked out in MS Excel or ArcGIS-as well as detailed illustrations and numerous case studies. The book enables readers to: Identify types and characterize non-spatial and spatial data Demonstrate their competence to explore, visualize, summarize, analyze, optimize, and clearly present statistical data and results Construct testable hypotheses that require inferential statistical analysis Process spatial data, extract explanatory variables, conduct statisti...

  5. Quantitative Analysis of Face Symmetry.

    Science.gov (United States)

    Tamir, Abraham

    2015-06-01

    The major objective of this article was to report quantitatively the degree of symmetry of human faces for images taken from the Internet. From the original image of a person, which appears in the center of each triplet, two symmetric combinations were constructed: one based on the left part of the image and its mirror image (left-left), the other on the right part of the image and its mirror image (right-right). Computer software that determines the length, surface area, and perimeter of any geometric shape yielded the following measurements for each triplet: face perimeter and area; distance between the pupils; mouth length, perimeter, and area; nose length and face length, usually below the ears; and the area and perimeter of the pupils. Then, for each of these measurements, the value C, which characterizes the degree of symmetry of the real image with respect to the right-right and left-left combinations, was calculated; C appears below each image on the right-hand side. A high value of C indicates low symmetry, and as the value decreases, the symmetry increases. The magnitude on the left relates to the pupils and compares the difference between the area and perimeter of the two pupils. The major conclusion is that the human face is asymmetric to some degree; the degree of asymmetry is reported quantitatively under each portrait.
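    One plausible form of such a coefficient, offered purely as an illustrative assumption since the article's exact formula is not given here, is the relative difference between a measurement taken on the left-left and right-right composites:

```python
def symmetry_coefficient(left_left: float, right_right: float) -> float:
    """Relative difference between a measurement (area, perimeter, length)
    on the left-left and right-right composite images. Zero means perfect
    symmetry; larger values mean stronger asymmetry. Hypothetical
    definition, not the article's published formula."""
    mean = (left_left + right_right) / 2.0
    return abs(left_left - right_right) / mean
```

    For example, mouth areas of 12 and 8 units in the two composites give a coefficient of 0.4, while identical measurements give 0.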

  6. A quantitative measurement of spatial order in ventricular fibrillation

    CERN Document Server

    Bayly, P V; Wolf, P D; Greenside, H S; Smith, W M; Ideker, R E

    1993-01-01

    As an objective measurement of spatial order in ventricular fibrillation (VF), spatial correlation functions and their characteristic lengths were estimated from epicardial electrograms of pigs in VF. The correlation length of the VF in pigs was found to be approximately 4-10 mm, varying as fibrillation progressed. The degree of correlation decreased in the first 4 seconds after fibrillation then increased over the next minute. The correlation length is much smaller than the scale of the heart, suggesting that many independent regions of activity exist on the epicardium at any one time. On the other hand, the correlation length is 4 to 10 times the interelectrode spacing, indicating that some coherence is present. These results imply that the heart behaves during VF as a high-dimensional, but not random, system involving many spatial degrees of freedom, which may explain the lack of convergence of fractal dimension estimates reported in the literature. Changes in the correlation length also suggest that VF re...
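    The correlation-length estimate can be sketched as follows: given a spatial correlation function sampled at increasing electrode separations, take the separation at which correlation first drops below 1/e. The decay below is synthetic, with a 6 mm length inside the 4-10 mm range reported for VF:

```python
import numpy as np

# Synthetic spatial correlation function over electrode separations (mm),
# decaying as exp(-d / L) with L = 6 mm.
sep_mm = np.linspace(0.0, 20.0, 201)
corr = np.exp(-sep_mm / 6.0)

# Characteristic length: first separation where correlation < 1/e.
corr_length = sep_mm[np.argmax(corr < np.exp(-1.0))]
```

    Tracking this length over time, as in the study, distinguishes many small independent regions (short length) from more coherent activity (length approaching the heart's scale).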

  7. 鄱阳湖地区土地覆盖空间分布格局与景观特征分析%Quantitative Analysis on Spatial Distribution of Land Cover Pattern and Landscape Features in Poyang Lake Region

    Institute of Scientific and Technical Information of China (English)

    冉盈盈; 王卷乐; 张永杰; 李玉洁; 周玉洁

    2012-01-01

    Poyang Lake is the largest freshwater lake in China. It plays an important role in ecosystem services such as flood regulation and storage, water conservation, etc. Grasping the spatial pattern and laws of land cover distribution in the Poyang Lake region is important for evaluating its environmental protection and ecosystem services. Based on HJ satellite images and Poyang Lake land cover data for 2005, this paper derived land cover data for the Poyang Lake region in 2010 using GIS and RS technology, and quantitatively analyzed the region's land cover spatial distribution pattern in 2010. The results show that: (1) based on HJ images and existing land cover data, regional land cover data can be updated dynamically; using the 2010 HJ images together with the 2005 land cover data, the 2010 land cover data were obtained by manual visual interpretation, and verification against measured GPS points put the accuracy of the land cover data at 80.4%; (2) landscape ecology, GIS, and statistical concepts and methods support quantitative analysis of land cover spatial characteristics, and the patch area, patch shape index, and adjacency index of different land cover types reflect the land cover spatial characteristics of the Poyang Lake region from different aspects; (3) landscape indices can quantitatively explain the shape, size, number, and spatial combination of land cover, and can reflect the characteristics of the ecological environment and the status of human activities and the socio-economy; the area indices show that rural settlements have the highest degree of fragmentation and dispersion, and the shape indices reflect that the shape of water bodies tends toward regularization owing to conversion from lake to land and to flood control projects; and (4) the adjacency indices of land cover types quantitatively describe the material exchange and interaction of adjacent land cover types and reflect the formation mechanism and

  8. High spatial resolution quantitative MR images: an experimental study of dedicated surface coils

    Energy Technology Data Exchange (ETDEWEB)

    Gensanne, D [Laboratoire de Chimie Bioinorganique Medicale, Imagerie therapeutique et diagnostique, JE 2400-CNRS FR 2599, Universite Paul Sabatier, 118, route de Narbonne, 31062 Toulouse Cedex (France); Josse, G [Centre Europeen de Recherche et d' Evaluation sur la Peau et les Epitheliums de Revetement, Institut de Recherche Pierre Fabre, 2, rue Viguerie, BP 3071 31025 Toulouse Cedex 3 (France); Lagarde, J M [Centre Europeen de Recherche et d' Evaluation sur la Peau et les Epitheliums de Revetement, Institut de Recherche Pierre Fabre, 2, rue Viguerie, BP 3071 31025 Toulouse Cedex 3 (France); Vincensini, D [Laboratoire de Chimie Bioinorganique Medicale, Imagerie therapeutique et diagnostique, JE 2400-CNRS FR 2599, Universite Paul Sabatier, 118, route de Narbonne, 31062 Toulouse Cedex (France)

    2006-06-07

    Measuring spin-spin relaxation times (T{sub 2}) by quantitative MR imaging is a potentially efficient tool for evaluating the physicochemical properties of various media. However, noise in MR images causes uncertainties in the determination of T{sub 2} relaxation times, which limits the accuracy of parametric tissue analysis. The required signal-to-noise ratio (SNR) depends on the T{sub 2} relaxation behaviour specific to each tissue. We have previously shown that keeping the uncertainty in T{sub 2} measurements within a limit of 10% requires SNR values greater than 100 and 300 for mono- and biexponential T{sub 2} relaxation behaviours, respectively. Noise reduction can be obtained either by increasing the voxel size (i.e., at the expense of spatial resolution) or by using high-sensitivity dedicated surface coils (which increase SNR without excessively degrading spatial resolution). However, surface coil sensitivity is heterogeneous: it, and hence SNR, decreases with increasing depth, and the more so as the coil radius is smaller. The use of surface coils is therefore limited to the analysis of superficial structures such as the hypodermic tissue analysed here. The aim of this work was to determine the maximum limits of spatial resolution and depth compatible with reliable in vivo T{sub 2} quantitative MR images using dedicated surface coils available on various clinical MR scanners. The average thickness of adipose tissue is around 15 mm, and the results obtained show that reliable biexponential relaxation analysis requires a minimum achievable voxel size of 13 mm{sup 3} for a conventional volume birdcage coil but only 1.7 mm{sup 3} for the smallest available surface coil (23 mm in diameter). Further improvement in spatial resolution, allowing fine details to be detected in MR images without degrading parametric T{sub 2} images, can be obtained by image filtering. By using the non
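    The mono-exponential T2 estimation underlying such quantitative images can be sketched by sampling S(TE) = S0 exp(-TE/T2) at several echo times and recovering T2 from a log-linear least-squares fit. The tissue values below are hypothetical and the data noiseless; with noise, the fitted T2 scatters, which is why the SNR thresholds discussed above matter:

```python
import numpy as np

# Hypothetical multi-echo spin-echo acquisition.
te_ms = np.array([10.0, 30.0, 60.0, 90.0, 120.0])   # echo times (ms)
s0, t2_true = 1000.0, 45.0                           # hypothetical tissue
signal = s0 * np.exp(-te_ms / t2_true)

# Fit ln S = ln S0 - TE / T2 and invert the slope for T2.
slope, intercept = np.polyfit(te_ms, np.log(signal), 1)
t2_est = -1.0 / slope
```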

  9. Quantitative analysis of Boehm's GC

    Institute of Scientific and Technical Information of China (English)

    GUAN Xue-tao; ZHANG Yuan-rui; GOU Xiao-gang; CHENG Xu

    2003-01-01

    The term garbage collection describes the automated process of finding previously allocated memory that is no longer in use, in order to make that memory available to satisfy subsequent allocation requests. We have reviewed existing papers and implementations of GC, and in particular analyzed Boehm's C code, a real-time mark-sweep GC running under Linux and the ANSI C standard. In this paper, we quantitatively analyze the performance of different configurations of Boehm's collector subjected to different workloads. The reported measurements demonstrate that a refined garbage collector is a viable alternative to traditional explicit memory-management techniques, even for low-level languages. It is more a trade-off for a given system than an all-or-nothing proposition.

  10. Quantitative analysis of qualitative images

    Science.gov (United States)

    Hockney, David; Falco, Charles M.

    2005-03-01

    We show optical evidence demonstrating that artists as early as Jan van Eyck and Robert Campin (c1425) used optical projections as aids for producing their paintings. We have also found optical evidence within works by later artists, including Bermejo (c1475), Lotto (c1525), Caravaggio (c1600), de la Tour (c1650), Chardin (c1750) and Ingres (c1825), demonstrating a continuum in the use of optical projections by artists, along with an evolution in the sophistication of that use. However, even for paintings where we have been able to extract unambiguous, quantitative evidence of the direct use of optical projections for producing certain features, this does not mean that the paintings are effectively photographs. Because the hand and mind of the artist are intimately involved in the creation process, understanding these complex images requires more than can be obtained from applying the equations of geometrical optics alone.

  11. A quantitative framework for assessing spatial flows of ecosystem services

    NARCIS (Netherlands)

    Serna Chavez, H.M.; Schulp, C.J.E.; van Bodegom, P.M.; Bouten, W.; Verburg, P.H.; Davidson, M.D.

    2014-01-01

    Spatial disconnections between the locations where ecosystem services are produced and where they are used are common. To date, most ecosystem service assessments have relied on static indicators of provision and have often not incorporated relations with the corresponding beneficiaries or benefiting areas.

  12. Cancer detection by quantitative fluorescence image analysis.

    Science.gov (United States)

    Parry, W L; Hemstreet, G P

    1988-02-01

    Quantitative fluorescence image analysis is a rapidly evolving biophysical cytochemical technology with the potential for multiple clinical and basic research applications. We report the application of this technique for bladder cancer detection and discuss its potential usefulness as an adjunct to methods currently used by urologists for the diagnosis and management of bladder cancer. Quantitative fluorescence image analysis is a cytological method that incorporates 2 diagnostic techniques, quantitation of nuclear deoxyribonucleic acid and morphometric analysis, in a single semiautomated system to facilitate the identification of rare events, that is, individual cancer cells. When compared to routine cytopathology for the detection of bladder cancer in symptomatic patients, quantitative fluorescence image analysis demonstrated greater sensitivity (76 versus 33 per cent) for the detection of low grade transitional cell carcinoma. The specificity of quantitative fluorescence image analysis in a small control group was 94 per cent, and with the manual method for quantitation of absolute nuclear fluorescence intensity in the screening of high risk asymptomatic subjects the specificity was 96.7 per cent. The more familiar flow cytometry is another fluorescence technique for measurement of nuclear deoxyribonucleic acid. However, rather than identifying individual cancer cells, flow cytometry identifies cellular pattern distributions, that is, the ratio of normal to abnormal cells. Numerous studies by others have shown that flow cytometry is a sensitive method to monitor patients with diagnosed urological disease.
Based upon results in separate quantitative fluorescence image analysis and flow cytometry studies, it appears that these 2 fluorescence techniques may be complementary tools for urological screening, diagnosis and management, and that they also may be useful separately or in combination to elucidate the oncogenic process, determine the biological potential of tumors

  13. Issues in Quantitative Analysis of Ultraviolet Imager (UVI) Data: Airglow

    Science.gov (United States)

    Germany, G. A.; Richards, P. G.; Spann, J. F.; Brittnacher, M. J.; Parks, G. K.

    1999-01-01

    The GGS Ultraviolet Imager (UVI) has proven to be especially valuable in correlative substorm, auroral morphology, and extended statistical studies of the auroral regions. Such studies are based on knowledge of the location, spatial, and temporal behavior of auroral emissions. More quantitative studies, based on absolute radiometric intensities from UVI images, require a more intimate knowledge of the instrument behavior and data processing requirements and are inherently more difficult than studies based on relative knowledge of the oval location. In this study, UVI airglow observations are analyzed and compared with model predictions to illustrate issues that arise in quantitative analysis of UVI images. These issues include instrument calibration, long term changes in sensitivity, and imager flat field response as well as proper background correction. Airglow emissions are chosen for this study because of their relatively straightforward modeling requirements and because of their implications for thermospheric compositional studies. The analysis issues discussed here, however, are identical to those faced in quantitative auroral studies.

  15. Spatial analysis and planning under imprecision

    CERN Document Server

    Leung, Y

    1988-01-01

    The book deals with complexity, imprecision, human valuation, and uncertainty in spatial analysis and planning, providing a systematic exposition of a new philosophical and theoretical foundation for spatial analysis and planning under imprecision. Regional concepts and regionalization, spatial preference-utility-choice structures, spatial optimization with single and multiple objectives, and dynamic spatial systems and their controls are analyzed in sequence. The analytical framework is based on fuzzy set theory. Basic concepts of fuzzy set theory are first discussed. Many numerical examples and emp

  16. Quantitative protein localization signatures reveal an association between spatial and functional divergences of proteins.

    Science.gov (United States)

    Loo, Lit-Hsin; Laksameethanasan, Danai; Tung, Yi-Ling

    2014-03-01

    Protein subcellular localization is a major determinant of protein function. However, this important protein feature is often described in terms of discrete and qualitative categories of subcellular compartments, and therefore it has limited applications in quantitative protein function analyses. Here, we present Protein Localization Analysis and Search Tools (PLAST), an automated analysis framework for constructing and comparing quantitative signatures of protein subcellular localization patterns based on microscopy images. PLAST produces human-interpretable protein localization maps that quantitatively describe the similarities in the localization patterns of proteins and major subcellular compartments, without requiring manual assignment or supervised learning of these compartments. Using the budding yeast Saccharomyces cerevisiae as a model system, we show that PLAST is more accurate than existing, qualitative protein localization annotations in identifying known co-localized proteins. Furthermore, we demonstrate that PLAST can reveal protein localization-function relationships that are not obvious from these annotations. First, we identified proteins that have similar localization patterns and participate in closely-related biological processes, but do not necessarily form stable complexes with each other or localize at the same organelles. Second, we found an association between spatial and functional divergences of proteins during evolution. Surprisingly, as proteins with common ancestors evolve, they tend to develop more diverged subcellular localization patterns, but still occupy similar numbers of compartments. This suggests that divergence of protein localization might be more frequently due to the development of more specific localization patterns over ancestral compartments than the occupation of new compartments. 
PLAST enables systematic and quantitative analyses of protein localization-function relationships, and will be useful to elucidate protein

  18. Quantitative histogram analysis of images

    Science.gov (United States)

    Holub, Oliver; Ferreira, Sérgio T.

    2006-11-01

    A routine for histogram analysis of images has been written in the object-oriented, graphical development environment LabVIEW. The program converts an RGB bitmap image into an intensity-linear greyscale image according to selectable conversion coefficients. This greyscale image is subsequently analysed by plots of the intensity histogram and probability distribution of brightness, and by calculation of various parameters, including average brightness, standard deviation, variance, minimal and maximal brightness, mode, skewness and kurtosis of the histogram, and the median of the probability distribution. The program allows interactive selection of specific regions of interest (ROI) in the image and definition of lower and upper threshold levels (e.g., to permit the removal of a constant background signal). The results of the analysis of multiple images can be conveniently saved and exported for plotting in other programs, which allows fast analysis of relatively large sets of image data. The program file accompanies this manuscript together with a detailed description of two application examples: the analysis of fluorescence microscopy images, specifically of tau-immunofluorescence in primary cultures of rat cortical and hippocampal neurons, and the quantification of protein bands by Western blot. The possibilities and limitations of this kind of analysis are discussed. Program summary: Title of program: HAWGC. Catalogue identifier: ADXG_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXG_v1_0. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. Computers: Mobile Intel Pentium III, AMD Duron. Installations: no installation necessary; executable file together with the necessary files for the LabVIEW Run-time engine. Operating systems or monitors under which the program has been tested: Windows ME/2000/XP. Programming language used: LabVIEW 7.0. Memory required to execute with typical data: ~16 MB for starting and ~160 MB used for
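    The histogram parameters listed above can be reproduced outside LabVIEW in a few lines. The following NumPy sketch is not the HAWGC code itself; the RGB-to-grey coefficients and the optional threshold handling are assumptions mirroring the description.

```python
import numpy as np

def grey_stats(rgb, coeffs=(0.299, 0.587, 0.114), lo=None, hi=None):
    """Convert an RGB image to greyscale with selectable coefficients, apply
    optional lower/upper thresholds, and return basic histogram statistics."""
    grey = rgb.astype(float) @ np.asarray(coeffs)   # weighted channel sum
    vals = grey.ravel()
    if lo is not None:
        vals = vals[vals >= lo]                     # e.g. drop constant background
    if hi is not None:
        vals = vals[vals <= hi]
    mean, std = vals.mean(), vals.std()
    centered = vals - mean
    return {"mean": mean, "std": std, "min": vals.min(), "max": vals.max(),
            "median": np.median(vals),
            "skewness": (centered**3).mean() / std**3,
            "kurtosis": (centered**4).mean() / std**4 - 3.0}

rng = np.random.default_rng(1)
img = rng.integers(0, 256, size=(64, 64, 3))
print(grey_stats(img, lo=10.0))   # ignore pixels darker than a background level
```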

  19. Quantitative single shot and spatially resolved plasma wakefield diagnostics

    CERN Document Server

    Kasim, Muhammad Firmansyah; Ceurvorst, Luke; Levy, Matthew C; Ratan, Naren; Sadler, James; Bingham, Robert; Burrows, Philip N; Trines, Raoul; Wing, Matthew; Norreys, Peter

    2015-01-01

    Diagnosing plasma conditions can give great advantages in optimizing plasma wakefield accelerator experiments. One possible method is that of photon acceleration. By propagating a laser probe pulse through a plasma wakefield and extracting the imposed frequency modulation, one can obtain an image of the density modulation of the wakefield. In order to diagnose the wakefield parameters at a chosen point in the plasma, the probe pulse crosses the plasma at oblique angles relative to the wakefield. In this paper, mathematical expressions relating the frequency modulation of the laser pulse and the wakefield density profile of the plasma for oblique crossing angles are derived. Multidimensional particle-in-cell simulation results presented in this paper confirm that the frequency modulation profiles and the density modulation profiles agree to within 10%. Limitations to the accuracy of the measurement are discussed in this paper. This technique opens new possibilities to quantitatively diagnose the plasma wakefie...

  20. Local models for spatial analysis

    CERN Document Server

    Lloyd, Christopher D

    2010-01-01

    Focusing on solutions, this second edition provides guidance to a wide variety of real-world problems. The text presents a complete introduction to key concepts and a clear mapping of the methods discussed. It also explores connections between methods. New chapters address spatial patterning in single variables and spatial relations. In addition, every chapter now includes links to key related studies. The author clearly distinguishes between local and global methods and provides more detailed coverage of geographical weighting, image texture measures, local spatial autocorrelation, and multic

  1. Early Development of Spatial-Numeric Associations: Evidence from Spatial and Quantitative Performance of Preschoolers

    Science.gov (United States)

    Opfer, John E.; Thompson, Clarissa A.; Furlong, Ellen E.

    2010-01-01

    Numeric magnitudes often bias adults' spatial performance. Partly because the direction of this bias (left-to-right versus right-to-left) is culture-specific, it has been assumed that the orientation of spatial-numeric associations is a late development, tied to reading practice or schooling. Challenging this assumption, we found that preschoolers…

  2. (Un)awareness of unilateral spatial neglect: a quantitative evaluation of performance in visuo-spatial tasks.

    Science.gov (United States)

    Ronchi, Roberta; Bolognini, Nadia; Gallucci, Marcello; Chiapella, Laura; Algeri, Lorella; Spada, Maria Simonetta; Vallar, Giuseppe

    2014-12-01

    Right-brain-damaged patients with unilateral spatial neglect are usually unaware (anosognosic) about their spatial deficits. However, in the scientific literature there is a lack of systematic and quantitative evaluation of this kind of unawareness, despite the negative impact of anosognosia on rehabilitation programs. This study investigated anosognosia for neglect-related impairments at different clinical tasks, by means of a quantitative assessment. Patients were tested in two different conditions (before and after execution of each task), in order to evaluate changes in the level of awareness of neglect-related behaviours triggered by task execution. Twenty-nine right-brain-damaged patients (17 with left spatial neglect) and 27 neurologically unimpaired controls entered the study. Anosognosia for spatial deficits is not pervasive, with different tasks evoking different degrees of awareness about neglect symptoms. Indeed, patients showed a largely preserved awareness about their performance in complex visuo-motor spatial and reading tasks; conversely, they were impaired in evaluating their spatial difficulties in line bisection and drawing from memory, showing over-estimation of their performance. The selectivity of the patients' unawareness of specific manifestations of spatial neglect is further supported by their preserved awareness of performance at a linguistic task, and by the absence of anosognosia for hemiplegia. This evidence indicates that discrete processes are involved in the aware monitoring of cognitive and motor performance, which can be selectively compromised by brain damage. Awareness of spatial difficulties is supported by a number of distinct components, and influenced by the specific skills required to perform a given task.

  3. Christhin: Quantitative Analysis of Thin Layer Chromatography

    CERN Document Server

    Barchiesi, Maximiliano; Renaudo, Carlos; Rossi, Pablo; Pramparo, María de Carmen; Nepote, Valeria; Grosso, Nelson Ruben; Gayol, María Fernanda

    2012-01-01

    Manual for Christhin 0.1.36. Christhin (Chromatography Riser Thin) is software developed for the quantitative analysis of data obtained from thin-layer chromatographic techniques (TLC). Once installed on your computer, the program is very easy to use, and provides data quickly and accurately. This manual describes the program, and reading it should be enough to use it properly.

  4. Quantitative texture analysis of electrodeposited line patterns

    DEFF Research Database (Denmark)

    Pantleon, Karen; Somers, Marcel A.J.

    2005-01-01

    Free-standing line patterns of Cu and Ni were manufactured by electrochemical deposition into lithographically prepared patterns. Electrodeposition was carried out on top of a highly oriented Au-layer physically vapor deposited on glass. Quantitative texture analysis carried out by means of x...

  6. Quantitative analysis of arm movement smoothness

    Science.gov (United States)

    Szczesna, Agnieszka; Błaszczyszyn, Monika

    2017-07-01

    The paper deals with the problem of quantitative smoothness analysis of motion data. We investigated the values of movement units, fluidity, and jerk for the healthy and the paralyzed arm of patients with hemiparesis after stroke, while the patients performed a drinking task. To validate the approach, the movements of 24 patients were captured using an optical motion capture system.
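    Jerk-based smoothness can be sketched numerically. The snippet below uses one common scaling, the dimensionless integrated squared jerk (Hogan and Sternad); it is not the paper's exact movement-unit or fluidity computation, and the trajectories are synthetic.

```python
import numpy as np

def dimensionless_jerk(pos, dt):
    """Smoothness of a 1-D point-to-point movement; more negative = jerkier."""
    vel = np.gradient(pos, dt)
    acc = np.gradient(vel, dt)
    jerk = np.gradient(acc, dt)
    duration = dt * (len(pos) - 1)
    v_peak = np.abs(vel).max()
    integral = (jerk**2).sum() * dt                 # integrated squared jerk
    return -(duration**3 / v_peak**2) * integral

dt = 0.01
t = np.arange(0.0, 1.0 + dt, dt)
smooth = 10 * (10*t**3 - 15*t**4 + 6*t**5)          # minimum-jerk reach profile
shaky = smooth + 0.05 * np.sin(40 * np.pi * t)      # same reach with tremor
print(dimensionless_jerk(smooth, dt), dimensionless_jerk(shaky, dt))
```

    The tremor term drives the metric strongly negative, which is the qualitative contrast one would expect between a healthy and an affected arm.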

  7. Seniors' Online Communities: A Quantitative Content Analysis

    Science.gov (United States)

    Nimrod, Galit

    2010-01-01

    Purpose: To examine the contents and characteristics of seniors' online communities and to explore their potential benefits to older adults. Design and Methods: Quantitative content analysis of a full year's data from 14 leading online communities using a novel computerized system. The overall database included 686,283 messages. Results: There was…

  8. A quantitative approach to scar analysis.

    Science.gov (United States)

    Khorasani, Hooman; Zheng, Zhong; Nguyen, Calvin; Zara, Janette; Zhang, Xinli; Wang, Joyce; Ting, Kang; Soo, Chia

    2011-02-01

    Analysis of collagen architecture is essential to wound healing research. However, to date no consistent methodologies exist for quantitatively assessing dermal collagen architecture in scars. In this study, we developed a standardized approach for quantitative analysis of scar collagen morphology by confocal microscopy using fractal dimension and lacunarity analysis. Full-thickness wounds were created on adult mice, closed by primary intention, and harvested at 14 days after wounding for morphometrics and standard Fourier transform-based scar analysis as well as fractal dimension and lacunarity analysis. In addition, transmission electron microscopy was used to evaluate collagen ultrastructure. We demonstrated that fractal dimension and lacunarity analysis were superior to Fourier transform analysis in discriminating scar versus unwounded tissue in a wild-type mouse model. To fully test the robustness of this scar analysis approach, a fibromodulin-null mouse model that heals with increased scar was also used. Fractal dimension and lacunarity analysis effectively discriminated unwounded fibromodulin-null versus wild-type skin as well as healing fibromodulin-null versus wild-type wounds, whereas Fourier transform analysis failed to do so. Furthermore, fractal dimension and lacunarity data also correlated well with transmission electron microscopy collagen ultrastructure analysis, adding to their validity. These results demonstrate that fractal dimension and lacunarity are more sensitive than Fourier transform analysis for quantification of scar morphology. Copyright © 2011 American Society for Investigative Pathology. Published by Elsevier Inc. All rights reserved.
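    Fractal dimension by box counting (one standard way to compute the quantity used above) can be sketched as follows. This is an illustrative NumPy implementation, not the authors' confocal analysis code, and the box sizes are arbitrary.

```python
import numpy as np

def box_count_dimension(mask, sizes=(2, 4, 8, 16, 32)):
    """Box-counting fractal dimension of a binary image: the slope of
    log(occupied boxes) versus log(1 / box size)."""
    counts = []
    for s in sizes:
        h = mask.shape[0] // s * s                    # crop to a multiple of s
        w = mask.shape[1] // s * s
        blocks = mask[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(blocks.any(axis=(1, 3)).sum())  # boxes containing any pixel
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
    return slope

# Sanity checks: a filled square is ~2-dimensional, a straight line ~1-dimensional.
square = np.ones((128, 128), dtype=bool)
line = np.zeros((128, 128), dtype=bool)
line[64, :] = True
print(box_count_dimension(square), box_count_dimension(line))
```

    Lacunarity analysis reuses the same box machinery, summarizing the variability of box occupancy rather than its scaling with box size.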

  9. Spatially quantitative seafloor habitat mapping: example from the northern South Carolina inner continental shelf

    Science.gov (United States)

    Ojeda, Germán Y.; Gayes, Paul T.; Van Dolah, Robert F.; Schwab, William C.

    2004-03-01

    Naturally occurring hard bottom areas provide the geological substrate that can support diverse assemblages of sessile benthic organisms, which in turn attract many reef-dwelling fish species. Alternatively, defining the location and extent of bottom sand bodies is relevant for potential nourishment projects as well as for ensuring that transient sediment does not affect reef habitats, particularly in sediment-starved continental margins. Furthermore, defining sediment transport pathways documents the effects these mobile bedforms have on proximal reef habitats. Thematic mapping of these substrates is therefore crucial in safeguarding critical habitats and offshore resources of coastal nations. This study presents the results of a spatially quantitative mapping approach based on classification of sidescan-sonar imagery. By using bottom video for image-to-ground control, digital image textural features for pattern recognition, and an artificial neural network for rapid, quantitative, multivariable decision-making, this approach resulted in recognition rates of hard bottom as high as 87%. The recognition of sand bottom was less successful (31%). This approach was applied to a large (686 km{sup 2}), high-quality, 2-m resolution sidescan-sonar mosaic of the northern South Carolina inner continental shelf. Results of this analysis indicate that both surficial sand and hard bottoms of variable extent are present over the study area. In total, 59% of the imaged area was covered by hard bottom, while 41% was covered by sand. Qualitative spatial correlation between bottom type and bathymetry appears possible from comparison of our interpretive map and the available bathymetry. Hard bottom areas tend to be located on flat, low-lying areas, and sandy bottoms tend to reside on areas of positive relief. 
Published bio-erosion rates were used to calculate the potential sediment input from the mapped hard bottom areas, rendering sediment volumes that may be as high as 0.8 million m{sup 3}/yr for

  10. Hierarchical modeling and analysis for spatial data

    CERN Document Server

    Banerjee, Sudipto; Gelfand, Alan E

    2003-01-01

    Among the many uses of hierarchical modeling, its application to the statistical analysis of spatial and spatio-temporal data from areas such as epidemiology and environmental science has proven particularly fruitful. Yet to date, the few books that address the subject have been either too narrowly focused on specific aspects of spatial analysis, or written at a level often inaccessible to those lacking a strong background in mathematical statistics. Hierarchical Modeling and Analysis for Spatial Data is the first accessible, self-contained treatment of hierarchical methods, modeling, and dat

  11. Quantitative multivariate analysis of dynamic multicellular morphogenic trajectories.

    Science.gov (United States)

    White, Douglas E; Sylvester, Jonathan B; Levario, Thomas J; Lu, Hang; Streelman, J Todd; McDevitt, Todd C; Kemp, Melissa L

    2015-07-01

    Interrogating fundamental cell biology principles that govern tissue morphogenesis is critical to better understanding of developmental biology and engineering novel multicellular systems. Recently, functional micro-tissues derived from pluripotent embryonic stem cell (ESC) aggregates have provided novel platforms for experimental investigation; however elucidating the factors directing emergent spatial phenotypic patterns remains a significant challenge. Computational modelling techniques offer a unique complementary approach to probe mechanisms regulating morphogenic processes and provide a wealth of spatio-temporal data, but quantitative analysis of simulations and comparison to experimental data is extremely difficult. Quantitative descriptions of spatial phenomena across multiple systems and scales would enable unprecedented comparisons of computational simulations with experimental systems, thereby leveraging the inherent power of computational methods to interrogate the mechanisms governing emergent properties of multicellular biology. To address these challenges, we developed a portable pattern recognition pipeline consisting of: the conversion of cellular images into networks, extraction of novel features via network analysis, and generation of morphogenic trajectories. This novel methodology enabled the quantitative description of morphogenic pattern trajectories that could be compared across diverse systems: computational modelling of multicellular structures, differentiation of stem cell aggregates, and gastrulation of cichlid fish. Moreover, this method identified novel spatio-temporal features associated with different stages of embryo gastrulation, and elucidated a complex paracrine mechanism capable of explaining spatiotemporal pattern kinetic differences in ESC aggregates of different sizes.
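    The first pipeline stage, converting cellular images into networks, can be caricatured with a proximity graph over cell centroids. The radius threshold and the particular features below are assumptions for illustration, not the features extracted in the paper.

```python
import numpy as np

def network_features(points, radius):
    """Build a proximity network (edge when two cells lie within `radius`)
    and summarize it with simple structural features."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    adj = (d < radius) & ~np.eye(len(points), dtype=bool)   # no self-loops
    deg = adj.sum(axis=1)
    n = len(points)
    return {"n_cells": n,
            "mean_degree": deg.mean(),
            "max_degree": int(deg.max()),
            "edge_density": adj.sum() / (n * (n - 1))}

rng = np.random.default_rng(2)
clustered = rng.normal(0.0, 1.0, size=(100, 2))    # a tight aggregate
dispersed = rng.uniform(-5.0, 5.0, size=(100, 2))  # scattered cells
print(network_features(clustered, 1.0)["mean_degree"],
      network_features(dispersed, 1.0)["mean_degree"])
```

    Tracking such features over time yields the kind of morphogenic trajectories that the pipeline compares across systems.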

  12. Residual analysis for spatial point processes

    DEFF Research Database (Denmark)

    Baddeley, A.; Turner, R.; Møller, Jesper

    We define residuals for point process models fitted to spatial point pattern data, and propose diagnostic plots based on these residuals. The techniques apply to any Gibbs point process model, which may exhibit spatial heterogeneity, interpoint interaction and dependence on spatial covariates. Our residuals generalise the well-known residuals for point processes in time, used in signal processing and survival analysis. An important difference is that the conditional intensity or hazard rate of the temporal point process must be replaced by the Papangelou conditional intensity $\lambda$ of the spatial process. Residuals are ascribed to locations in the empty background, as well as to data points of the point pattern. We obtain variance formulae, and study standardised residuals. There is also an analogy between our spatial residuals and the usual residuals for (non-spatial) generalised linear models...

  13. High resolution or optimum resolution? Spatial analysis of the Federmesser site at Andernach, Germany

    NARCIS (Netherlands)

    Stapert, D; Street, M

    1997-01-01

    This paper discusses spatial analysis at the site level. It is suggested that spatial analysis has to proceed at several levels, from global to more detailed questions, and that an optimum resolution should be established when applying any quantitative method in this field. As an example, the ring and sec

  14. A Quantitative Corpus-Based Approach to English Spatial Particles: Conceptual Symmetry and Its Pedagogical Implications

    Science.gov (United States)

    Chen, Alvin Cheng-Hsien

    2014-01-01

    The present study aims to investigate how conceptual symmetry plays a role in the use of spatial particles in English and to further examine its pedagogical implications via a corpus-based evaluation of the course books in senior high schools in Taiwan. More specifically, we adopt a quantitative corpus-based approach to investigate whether bipolar…

  15. Method for Quantitative Determination of Spatial Polymer Distribution in Alginate Beads Using Raman Spectroscopy

    NARCIS (Netherlands)

    Heinemann, Matthias; Meinberg, Holger; Büchs, Jochen; Koß, Hans-Jürgen; Ansorge-Schumacher, Marion B.

    2005-01-01

    A new method based on Raman spectroscopy is presented for non-invasive, quantitative determination of the spatial polymer distribution in alginate beads of approximately 4 mm diameter. With the experimental setup, a two-dimensional image is created along a thin measuring line through the bead

  17. Residual analysis for spatial point processes

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus Plenge

    2005-01-01

    Discussion of the paper "Residual analysis for spatial point processes" by A. Baddeley, M. Hazelton, J. Møller and R. Turner. Journal of the Royal Statistical Society, Series B, vol. 67, pages 617-666, 2005.

  18. Spatial data exploratory analysis and usability

    Directory of Open Access Journals (Sweden)

    D Josselin

    2003-02-01

    Full Text Available In this article, we intend to show how useful Exploratory Spatial Data Analysis (ESDA) is in improving spatial data usability. We first outline a general framework for usability using conceptual modelling, including Data, Users and Methodologies. We then organize keywords into classes and define their relations; a central ternary relation is highlighted to describe usability. In the second section, we present ESDA and its fundamentals, i.e. robustness and ways to handle data, together with the related graphic tools. We also describe the software package ARPEGE. Through a concrete example, we demonstrate and discuss its relevance for exploratory spatial data analysis and usability.

  19. European Identity in Russian Regions Bordering on Finland: Quantitative Analysis

    OpenAIRE

    A. O. Domanov

    2014-01-01

    The quantitative analysis of an opinion poll conducted in October 2013 in three Russian cities located near the Finnish border (St-Petersburg, Kronstadt and Vyborg) explores the European identity of their citizens. This area was chosen to illustrate the crucial importance of space interpretation in spatial identity formation by using a critical geopolitical approach. The study shows how different images of space on the same territory act as intermediate variables between objective territorial chara...

  20. Quantitative image analysis of celiac disease.

    Science.gov (United States)

    Ciaccio, Edward J; Bhagat, Govind; Lewis, Suzanne K; Green, Peter H

    2015-03-07

    We outline the use of quantitative techniques that are currently used for analysis of celiac disease. Image processing techniques can be useful to statistically analyze the pixel data of endoscopic images acquired with standard or videocapsule endoscopy. It is shown how current techniques have evolved to become more useful for gastroenterologists who seek to understand celiac disease and to screen for it in suspected patients. New directions for focus in the development of methodology for diagnosis and treatment of this disease are suggested. It is evident that there are yet broad areas where there is potential to expand the use of quantitative techniques for improved analysis in suspected or known celiac disease patients.

  2. Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis

    Science.gov (United States)

    Shortle, J. F.; Allocco, M.

    2005-01-01

    Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods. For example, there may be little relevant data available. This paper proposes a framework for using quantitative hazard analysis to prioritize hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty," that describes the complexity in modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize hazard scenarios for which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.

  3. Influence analysis in quantitative trait loci detection.

    Science.gov (United States)

    Dou, Xiaoling; Kuriki, Satoshi; Maeno, Akiteru; Takada, Toyoyuki; Shiroishi, Toshihiko

    2014-07-01

    This paper presents systematic methods for the detection of influential individuals that affect the log odds (LOD) score curve. We derive general formulas of influence functions for profile likelihoods and introduce them into two standard quantitative trait locus detection methods: the interval mapping method and single marker analysis. Besides influence analysis on specific LOD scores, we also develop influence analysis methods on the shape of the LOD score curves. A simulation-based method is proposed to assess the significance of the influence of the individuals. These methods are shown to be useful in the influence analysis of a real dataset of an experimental population from an F2 mouse cross. By receiver operating characteristic analysis, we confirm that the proposed methods show better performance than existing diagnostics.

  4. Spatial access priority mapping (SAPM) with fishers: a quantitative GIS method for participatory planning.

    Science.gov (United States)

    Yates, Katherine L; Schoeman, David S

    2013-01-01

    Spatial management tools, such as marine spatial planning and marine protected areas, are playing an increasingly important role in attempts to improve marine management and accommodate conflicting needs. Robust data are needed to inform decisions among different planning options, and early inclusion of stakeholder involvement is widely regarded as vital for success. One of the biggest stakeholder groups, and the most likely to be adversely impacted by spatial restrictions, is the fishing community. In order to take their priorities into account, planners need to understand spatial variation in their perceived value of the sea. Here a readily accessible, novel method for quantitatively mapping fishers' spatial access priorities is presented. Spatial access priority mapping, or SAPM, uses only basic functions of standard spreadsheet and GIS software. Unlike the use of remote-sensing data, SAPM actively engages fishers in participatory mapping, documenting rather than inferring their priorities. By so doing, SAPM also facilitates the gathering of other useful data, such as local ecological knowledge. The method was tested and validated in Northern Ireland, where over 100 fishers participated in a semi-structured questionnaire and mapping exercise. The response rate was excellent, 97%, demonstrating fishers' willingness to be involved. The resultant maps are easily accessible and instantly informative, providing a very clear visual indication of which areas are most important for the fishers. The maps also provide quantitative data, which can be used to analyse the relative impact of different management options on the fishing industry and can be incorporated into planning software, such as MARXAN, to ensure that conservation goals can be met at minimum negative impact to the industry. This research shows how spatial access priority mapping can facilitate the early engagement of fishers and the ready incorporation of their priorities into the decision-making process.
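    As a rough sketch of the kind of aggregation SAPM performs in a spreadsheet, the snippet below sums simple rank-based weights per grid cell. The linear weighting scheme and the function name are illustrative assumptions, not the paper's exact formula.

```python
def access_priority(rankings, n_cells):
    """Aggregate fishers' ranked grid-cell choices into per-cell scores.

    Each fisher supplies cell ids ordered from most to least important;
    a cell ranked r-th out of k contributes (k - r)/k (an assumed linear
    weighting), and contributions are summed across fishers.
    """
    scores = [0.0] * n_cells
    for ranked_cells in rankings:
        k = len(ranked_cells)
        for rank, cell in enumerate(ranked_cells):
            scores[cell] += (k - rank) / k
    return scores
```

    The resulting per-cell scores can then be joined to a grid layer in GIS software for mapping, or exported to planning tools such as MARXAN.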

  6. Quantitative resilience analysis through control design.

    Energy Technology Data Exchange (ETDEWEB)

    Sunderland, Daniel; Vugrin, Eric D.; Camphouse, Russell Chris (Sandia National Laboratories, Carlsbad, NM)

    2009-09-01

    Critical infrastructure resilience has become a national priority for the U.S. Department of Homeland Security. System resilience has been studied for several decades in many different disciplines, but no standards or unifying methods exist for critical infrastructure resilience analysis. Few quantitative resilience methods exist, and those that do tend to be rather simplistic and, hence, not capable of sufficiently assessing all aspects of critical infrastructure resilience. This report documents the results of a late-start Laboratory Directed Research and Development (LDRD) project that investigated the development of quantitative resilience analysis through the application of control design methods. Specifically, we conducted a survey of infrastructure models to assess what types of control design might be applicable for critical infrastructure resilience assessment. As a result of this survey, we developed a decision process that directs the resilience analyst to the control method most likely applicable to the system under consideration. Furthermore, we developed optimal control strategies for two sets of representative infrastructure systems to demonstrate how control methods could be used to assess the resilience of the systems to catastrophic disruptions. We present recommendations for future work to continue the development of quantitative resilience analysis methods.

  7. Spatial Analysis Methods of Road Traffic Collisions

    DEFF Research Database (Denmark)

    Loo, Becky P. Y.; Anderson, Tessa Kate

    Spatial Analysis Methods of Road Traffic Collisions centers on the geographical nature of road crashes, and uses spatial methods to provide a greater understanding of the patterns and processes that cause them. Written by internationally known experts in the field of transport geography, the book outlines the key issues in identifying hazardous road locations (HRLs), considers current approaches used for reducing and preventing road traffic collisions, and outlines a strategy for improved road safety. The book covers spatial accuracy, validation, and other statistical issues, as well as link...

  8. Geostatistics and spatial analysis in biological anthropology.

    Science.gov (United States)

    Relethford, John H

    2008-05-01

    A variety of methods have been used to make evolutionary inferences based on the spatial distribution of biological data, including reconstructing population history and detection of the geographic pattern of natural selection. This article provides an examination of geostatistical analysis, a method used widely in geology but which has not often been applied in biological anthropology. Geostatistical analysis begins with the examination of a variogram, a plot showing the relationship between a biological distance measure and the geographic distance between data points and which provides information on the extent and pattern of spatial correlation. The results of variogram analysis are used for interpolating values of unknown data points in order to construct a contour map, a process known as kriging. The methods of geostatistical analysis and discussion of potential problems are applied to a large data set of anthropometric measures for 197 populations in Ireland. The geostatistical analysis reveals two major sources of spatial variation. One pattern, seen for overall body and craniofacial size, shows an east-west cline most likely reflecting the combined effects of past population dispersal and settlement. The second pattern is seen for craniofacial height and shows an isolation by distance pattern reflecting rapid spatial changes in the midlands region of Ireland, perhaps attributable to the genetic impact of the Vikings. The correspondence of these results with other analyses of these data and the additional insights generated from variogram analysis and kriging illustrate the potential utility of geostatistical analysis in biological anthropology.
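    The first step of the workflow described above, the empirical variogram, is easy to prototype: half the mean squared difference of the variable between pairs of points, binned by geographic distance. The sketch below is a generic implementation with an assumed equal-width binning scheme, not the study's exact procedure.

```python
import numpy as np

def empirical_variogram(coords, values, n_bins=10):
    """Empirical semivariogram: for each distance bin, half the mean
    squared difference of `values` over all point pairs in that bin."""
    n = len(values)
    # pairwise geographic distances and squared value differences
    d = np.sqrt(((coords[:, None, :] - coords[None, :, :]) ** 2).sum(-1))
    sq = (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(n, k=1)            # each pair counted once
    dists, sqdiff = d[iu], sq[iu]
    edges = np.linspace(0, dists.max(), n_bins + 1)
    lag, gamma = [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        m = (dists >= lo) & (dists < hi)
        if m.any():
            lag.append(dists[m].mean())
            gamma.append(0.5 * sqdiff[m].mean())   # semivariance
    return np.array(lag), np.array(gamma)
```

    A model curve (e.g. spherical or exponential) fitted to the resulting lag/semivariance pairs would then supply the weights for kriging interpolation.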

  9. Inspection, visualisation and analysis of quantitative proteomics data

    OpenAIRE

    Gatto, Laurent

    2016-01-01

    Material from the Quantitative Proteomics and Data Analysis Course, 4-5 April 2016, Queen Hotel, Chester, UK. Table D - Inspection, visualisation and analysis of quantitative proteomics data, Laurent Gatto (University of Cambridge)

  10. Representation of abstract quantitative rules applied to spatial and numerical magnitudes in primate prefrontal cortex.

    Science.gov (United States)

    Eiselt, Anne-Kathrin; Nieder, Andreas

    2013-04-24

    Processing quantity information based on abstract principles is central to intelligent behavior. Neural correlates of quantitative rule selectivity have been identified previously in the prefrontal cortex (PFC). However, whether individual neurons represent rules applied to multiple magnitude types is unknown. We recorded from PFC neurons while monkeys switched between "greater than/less than" rules applied to spatial and numerical magnitudes. A majority of rule-selective neurons responded only to the quantitative rules applied to one specific magnitude type. However, another population of neurons generalized the magnitude principle and represented the quantitative rules related to both magnitudes. This indicates that the primate brain uses rule-selective neurons specialized in guiding decisions related to a specific magnitude type only, as well as generalizing neurons that respond abstractly to the overarching concept "magnitude rules."

  11. Quantitative analysis of spirality in elliptical galaxies

    CERN Document Server

    Dojcsak, Levente

    2013-01-01

    We use an automated galaxy morphology analysis method to quantitatively measure the spirality of galaxies classified manually as elliptical. The data set used for the analysis consists of 60,518 galaxy images with redshift obtained by the Sloan Digital Sky Survey (SDSS) and classified manually by Galaxy Zoo, as well as the RC3 and NA10 catalogues. We measure the spirality of the galaxies by using the Ganalyzer method, which transforms the galaxy image to its radial intensity plot to detect galaxy spirality that is in many cases difficult to notice by manual observation of the raw galaxy image. Experimental results using manually classified elliptical and S0 galaxies with redshift <0.3 suggest that galaxies classified manually as elliptical and S0 exhibit a nonzero signal for the spirality. These results suggest that the human eye observing the raw galaxy image might not always be the most effective way of detecting spirality and curves in the arms of galaxies.
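    A minimal sketch of the radial-intensity transform that Ganalyzer builds on: sampling image intensity on concentric circles around the galaxy centre, so that spiral arms appear as intensity peaks whose angular position shifts with radius. The function name and sampling densities are illustrative assumptions, not Ganalyzer's implementation.

```python
import numpy as np

def radial_intensity_plot(img, n_radii=20, n_theta=72):
    """Sample intensity on concentric circles about the image centre,
    returning a (radius x angle) array; nearest-pixel sampling."""
    cy, cx = (np.array(img.shape) - 1) / 2
    rmax = min(cy, cx)
    radii = np.linspace(1, rmax, n_radii)
    thetas = np.linspace(0, 2 * np.pi, n_theta, endpoint=False)
    ys = (cy + radii[:, None] * np.sin(thetas)).round().astype(int)
    xs = (cx + radii[:, None] * np.cos(thetas)).round().astype(int)
    return img[ys, xs]                  # shape (n_radii, n_theta)
```

    For a purely elliptical light distribution the peak positions stay fixed across rows of this array; a systematic drift of the peaks with radius is the spirality signal.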

  12. Quantitative laryngeal electromyography: turns and amplitude analysis.

    Science.gov (United States)

    Statham, Melissa McCarty; Rosen, Clark A; Nandedkar, Sanjeev D; Munin, Michael C

    2010-10-01

    Laryngeal electromyography (LEMG) is primarily a qualitative examination, with no standardized approach to interpretation. The objectives of our study were to establish quantitative norms for motor unit recruitment in controls and to compare with interference pattern analysis in patients with unilateral vocal fold paralysis (VFP). Retrospective case-control study. We performed LEMG of the thyroarytenoid-lateral cricoarytenoid muscle complex (TA-LCA) in 21 controls and 16 patients with unilateral VFP. Our standardized protocol used a concentric needle electrode with subjects performing variable-force TA-LCA contraction. To quantify the interference pattern density, we measured turns and mean amplitude per turn for ≥10 epochs (each 500 milliseconds). Logarithmic regression analysis between amplitude and turns was used to calculate slope and intercept. Standard deviation was calculated to further define the confidence interval, enabling generation of a linear-scale graphical "cloud" of activity containing ≥90% of data points for controls and patients. Median age of controls and patients was similar (50.7 vs. 48.5 years). In controls, TA-LCA amplitude with variable contraction ranged from 145-1112 μV, and regression analysis comparing mean amplitude per turn to root-mean-square amplitude demonstrated high correlation (R = 0.82). In controls performing variable contraction, median turns per second was significantly higher compared to patients (450 vs. 290, P = .002). We present the first interference pattern analysis of the TA-LCA in healthy adults and patients with unilateral VFP. Our findings indicate that motor unit recruitment can be quantitatively measured within the TA-LCA. Additionally, patients with unilateral VFP had significantly reduced turns when compared with controls.

  13. Spatially quantitative models for vulnerability analyses and resilience measures in flood risk management: Case study Rafina, Greece

    Science.gov (United States)

    Karagiorgos, Konstantinos; Chiari, Michael; Hübl, Johannes; Maris, Fotis; Thaler, Thomas; Fuchs, Sven

    2013-04-01

    We will address spatially quantitative models for vulnerability analyses in flood risk management in the catchment of Rafina, 25 km east of Athens, Greece, and potential measures to reduce damage costs. The evaluation of flood damage losses is relatively advanced. Nevertheless, major problems arise since no market prices are available for the evaluation process. Moreover, there is a particular gap in quantifying the damages and the expenditures necessary for implementing mitigation measures with respect to flash floods. The key issue is to develop prototypes for assessing flood losses and the impact of mitigation measures on flood resilience by adjusting a vulnerability model, and to further develop the method in a Mediterranean region influenced by both mountain and coastal characteristics of land development. The objective of this study is to create a spatial and temporal analysis of the vulnerability factors based on a method combining spatially explicit loss data, data on the value of exposed elements at risk, and data on flood intensities. In this contribution, a methodology for the development of a flood damage assessment as a function of the process intensity and the degree of loss is presented. It is shown that (1) such relationships for defined object categories are dependent on site-specific and process-specific characteristics, but there is a correlation between process types that have similar characteristics; (2) existing semi-quantitative approaches of vulnerability assessment for elements at risk can be improved based on the proposed quantitative method; and (3) the concept of risk can be enhanced with respect to a standardised and comprehensive implementation by applying the vulnerability functions to be developed within the proposed research. Therefore, loss data were collected from responsible administrative bodies and analysed on an object level. The model used is based on a basin-scale approach as well as data on elements at risk exposed

  14. Quantitative Analysis in Nuclear Medicine Imaging

    CERN Document Server

    2006-01-01

    This book provides a review of image analysis techniques as they are applied in the field of diagnostic and therapeutic nuclear medicine. Driven in part by the remarkable increase in computing power and its ready and inexpensive availability, this is a relatively new yet rapidly expanding field. Likewise, although the use of radionuclides for diagnosis and therapy has origins dating back almost to the discovery of natural radioactivity itself, radionuclide therapy and, in particular, targeted radionuclide therapy has only recently emerged as a promising approach for therapy of cancer and, to a lesser extent, other diseases. An effort has, therefore, been made to place the reviews provided in this book in a broader context. This effort is reflected by the inclusion of introductory chapters that address basic principles of nuclear medicine imaging, followed by an overview of issues that are closely related to quantitative nuclear imaging and its potential role in diagnostic and therapeutic applications. ...

  15. Automatic quantitative morphological analysis of interacting galaxies

    CERN Document Server

    Shamir, Lior; Wallin, John

    2013-01-01

    The large number of galaxies imaged by digital sky surveys reinforces the need for computational methods for analyzing galaxy morphology. While the morphology of most galaxies can be associated with a stage on the Hubble sequence, morphology of galaxy mergers is far more complex due to the combination of two or more galaxies with different morphologies and the interaction between them. Here we propose a computational method based on unsupervised machine learning that can quantitatively analyze morphologies of galaxy mergers and associate galaxies by their morphology. The method works by first generating multiple synthetic galaxy models for each galaxy merger, and then extracting a large set of numerical image content descriptors for each galaxy model. These numbers are weighted using Fisher discriminant scores, and then the similarities between the galaxy mergers are deduced using a variation of Weighted Nearest Neighbor analysis such that the Fisher scores are used as weights. The similarities between the ga...

  16. Lacunarity analysis: A general technique for the analysis of spatial patterns

    Energy Technology Data Exchange (ETDEWEB)

    Plotnick, R.E. [Department of Geological Sciences, University of Illinois at Chicago, 845 West Taylor Street, Chicago, Illinois 60607-7059 (United States); Gardner, R.H. [University of Maryland, Appalachian Environmental Laboratory, Frostburg, Maryland 21532 (United States); Hargrove, W.W. [Environmental Sciences Division, Oak Ridge National Laboratory, Oak Ridge, Tennessee 37831-6038 (United States); Prestegaard, K. [Department of Geology, University of Maryland, College Park, Maryland 20742 (United States); Perlmutter, M. [Energy Systems Division, Argonne National Laboratory, 9700 South Cass Avenue, Argonne, Illinois 60439 (United States)

    1996-05-01

    Lacunarity analysis is a multiscaled method for describing patterns of spatial dispersion. It can be used with both binary and quantitative data in one, two, and three dimensions. Although originally developed for fractal objects, the method is more general and can be readily used to describe nonfractal and multifractal patterns. Lacunarity analysis is broadly applicable to many data sets used in the natural sciences; we illustrate its application to both geological and ecological data. © 1996 The American Physical Society.
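    For two-dimensional binary data, the standard gliding-box lacunarity Λ(r) = E[S²]/E[S]², where S is the occupied count in an r × r box slid over every position of the map, can be computed compactly with a summed-area table. The sketch below follows that textbook definition; it is not the authors' specific implementation.

```python
import numpy as np

def lacunarity(binary_map, box_sizes):
    """Gliding-box lacunarity of a 2-D binary array for each box size r:
    Lambda(r) = E[S^2] / E[S]^2 over all r x r box positions."""
    out = {}
    a = np.asarray(binary_map, dtype=float)
    for r in box_sizes:
        # box masses via a 2-D cumulative sum (summed-area table)
        c = np.zeros((a.shape[0] + 1, a.shape[1] + 1))
        c[1:, 1:] = a.cumsum(0).cumsum(1)
        s = (c[r:, r:] - c[:-r, r:] - c[r:, :-r] + c[:-r, :-r]).ravel()
        out[r] = (s ** 2).mean() / s.mean() ** 2
    return out
```

    A translationally uniform map gives Λ(r) = 1 at every scale; gappier, more clustered maps give larger values, and the shape of Λ(r) versus r is the multiscale signature the method exploits.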

  17. Quantitative analysis of biological tissues using Fourier transform-second-harmonic generation imaging

    Science.gov (United States)

    Ambekar Ramachandra Rao, Raghu; Mehta, Monal R.; Toussaint, Kimani C., Jr.

    2010-02-01

    We demonstrate the use of Fourier transform-second-harmonic generation (FT-SHG) imaging of collagen fibers as a means of performing quantitative analysis of obtained images of selected spatial regions in porcine trachea, ear, and cornea. Two quantitative markers, preferred orientation and maximum spatial frequency, are proposed for differentiating structural information between various spatial regions of interest in the specimens. The ear shows consistent maximum spatial frequency and orientation, as also observed in its real-space image. However, there are observable changes in the orientation and minimum feature size of fibers in the trachea, indicating a more random organization. Finally, the analysis is applied to a 3D image stack of the cornea. It is shown that the standard deviation of the orientation is sensitive to the randomness in fiber orientation. Regions with variations in the maximum spatial frequency, but with relatively constant orientation, suggest that maximum spatial frequency is useful as an independent quantitative marker. We emphasize that FT-SHG is a simple, yet powerful, tool for extracting information from images that is not obvious in real space. This technique can be used as a quantitative biomarker to assess the structure of collagen fibers that may change due to damage from disease or physical injury.
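    The kind of spectral marker described, a preferred orientation extracted from the two-dimensional power spectrum, can be sketched as below. This is a generic FFT-based estimate of the dominant spatial-frequency direction, not the authors' exact FT-SHG pipeline.

```python
import numpy as np

def preferred_orientation(img):
    """Angle (degrees, in [0, 180)) of the strongest non-DC spatial
    frequency in the 2-D power spectrum; for a stripe pattern this is
    the direction along which intensity varies (normal to the stripes)."""
    f = np.fft.fftshift(np.fft.fft2(img - img.mean()))
    p = np.abs(f) ** 2
    cy, cx = np.array(p.shape) // 2
    p[cy, cx] = 0.0                       # suppress any residual DC term
    iy, ix = np.unravel_index(np.argmax(p), p.shape)
    return np.degrees(np.arctan2(iy - cy, ix - cx)) % 180.0
```

    Real fiber images call for averaging the spectrum over an angular window rather than taking a single peak, but the principle, reading orientation and feature size off the Fourier plane, is the same.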

  18. Nonlinear dynamics and quantitative EEG analysis.

    Science.gov (United States)

    Jansen, B H

    1996-01-01

    Quantitative, computerized electroencephalogram (EEG) analysis appears to be based on a phenomenological approach to EEG interpretation, and is primarily rooted in linear systems theory. A fundamentally different approach to computerized EEG analysis, however, is making its way into the laboratories. The basic idea, inspired by recent advances in the area of nonlinear dynamics and chaos theory, is to view an EEG as the output of a deterministic system of relatively simple complexity, but containing nonlinearities. This suggests that studying the geometrical dynamics of EEGs, and the development of neurophysiologically realistic models of EEG generation may produce more successful automated EEG analysis techniques than the classical, stochastic methods. A review of the fundamentals of chaos theory is provided. Evidence supporting the nonlinear dynamics paradigm to EEG interpretation is presented, and the kind of new information that can be extracted from the EEG is discussed. A case is made that a nonlinear dynamic systems viewpoint to EEG generation will profoundly affect the way EEG interpretation is currently done.

  19. Quantitative structure-activity relationship of botanical sesquiterpenes: spatial and contact repellency to the yellow fever mosquito, Aedes aegypti.

    Science.gov (United States)

    Paluch, Gretchen; Grodnitzky, Justin; Bartholomay, Lyric; Coats, Joel

    2009-08-26

    The plant terpenoids encompass a diversity of structures and have many functional roles in nature, including protection against pest arthropods. Previous studies in this laboratory have identified naturally occurring sesquiterpenes contained in essential oils from two plants, amyris (Amyris balsamifera) and Siam-wood (Fokienia hodginsii), that are significantly repellent to a spectrum of arthropod pests. In efforts to further examine the biological activity of this class of compounds, 12 of these plant-derived sesquiterpenes have been isolated, purified, and assayed for spatial and contact repellency against the yellow fever mosquito, Aedes aegypti. These data were used to develop quantitative structure-activity relationships that identified key properties of the sesquiterpene molecule, including electronic and structural parameters that were used to predict optimal repellent activity. There were notable similarities in the models developed for spatial repellency over five time points and for contact repellency. Vapor pressure was an important component of all repellency models. Initial levels of spatial repellency were also related to polarizability of the molecule and lowest unoccupied molecular orbital (LUMO) energy, whereas the equation for late spatial repellency was dependent on other electronic features, including Mulliken population and electrotopological state descriptors. The model identified for contact repellency was the best fit and most significant model in this analysis and showed a relationship with vapor pressure, Mulliken population, and total energy.

  20. Spectral Analysis of Spatial Series Data of Pathologic Tissue: A Study on Small Intestine in ICR Mouse

    Science.gov (United States)

    Mise, Keiji; Sumi, Ayako; Kobayashi, Nobumichi; Torigoe, Toshihiko; Ohtomo, Norio

    2009-01-01

    We examined the usefulness of spectral analysis for investigating quantitatively the spatial pattern of pathologic tissue. To interpret the results obtained from real tissue, we constructed a two-dimensional spatial model of the tissue. Spectral analysis was applied to the spatial series data, which were obtained from the real tissue and model. From the results of spectral analysis, spatial patterns of the tissue and model were characterized quantitatively in reference to the frequencies and powers of the spectral peaks in power spectral densities (PSDs). The results for the model were essentially consistent with those for the tissue. It was concluded that the model was capable of adequately explaining the spatial pattern of the tissue. It is anticipated that spectral analysis will become a useful tool for characterizing the spatial pattern of the tissue quantitatively, resulting in an automated first screening of pathological specimens.
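    For a one-dimensional spatial series, the spectral characterization reduces to locating peaks in a power spectral density. The sketch below uses a plain periodogram and returns the period of the strongest non-DC peak; it is a simplified stand-in for whatever PSD estimator the study employed.

```python
import numpy as np

def dominant_spatial_period(series):
    """Periodogram PSD of a 1-D spatial series; returns the period
    (in sampling units) of the strongest non-zero-frequency peak."""
    x = np.asarray(series, float)
    x = x - x.mean()                       # remove the DC component
    psd = np.abs(np.fft.rfft(x)) ** 2 / len(x)
    freqs = np.fft.rfftfreq(len(x), d=1.0)
    k = 1 + int(np.argmax(psd[1:]))        # skip the zero-frequency bin
    return 1.0 / freqs[k]
```

    Comparing the peak frequencies and powers obtained from real tissue against those from a synthetic model of the tissue is exactly the kind of quantitative comparison the abstract describes.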

  1. Quantitative risk analysis preoperational of gas pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Manfredi, Carlos; Bispo, Gustavo G.; Esteves, Alvaro [Gie S.A., Buenos Aires (Argentina)

    2009-07-01

    The purpose of this analysis is to predict how individual risk and general public security can be affected by the operation of a gas pipeline. If the individual or societal risks are found to be intolerable compared with international standards, measures are recommended to mitigate the risk associated with the operation to levels compatible with best practices in the industry. Quantitative risk analysis calculates the probability of occurrence of an event based on its frequency of occurrence, and it requires a complex mathematical treatment. The present work aims to develop a calculation methodology based on the previously mentioned publication. This methodology centres on defining the frequencies of occurrence of events according to a database representative of each case under study. It also establishes the consequences according to the particular characteristics of each area and the different possible interferences with the gas pipeline under study. For each interference, a typical curve of ignition probability is developed as a function of the distance to the pipe. (author)
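    The core arithmetic of such a quantitative risk analysis, an event frequency multiplied by conditional probabilities of escalation, can be sketched as follows. The exponential ignition-probability curve and every parameter value here are illustrative placeholders, not values from the study.

```python
import math

def individual_risk(failure_rate_per_km_yr, interacting_km, p_ignition, p_fatality):
    """Location-specific individual risk (per year): release frequency of
    the pipeline length that can affect the location, times the conditional
    probabilities of ignition and of a fatal outcome."""
    release_freq = failure_rate_per_km_yr * interacting_km
    return release_freq * p_ignition * p_fatality

def ignition_probability(distance_m, p0=0.3, scale_m=50.0):
    """Hypothetical distance-decay ignition curve, standing in for the
    per-interference curves described in the abstract."""
    return p0 * math.exp(-distance_m / scale_m)
```

    For example, a 2 km interacting segment with an assumed failure rate of 1e-4 per km-year, ignition probability 0.3 and fatality probability 0.5 gives an individual risk of 3e-5 per year, which would then be compared against tolerability criteria.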

  2. A computational tool for quantitative analysis of vascular networks.

    Directory of Open Access Journals (Sweden)

    Enrique Zudaire

    Full Text Available Angiogenesis is the generation of mature vascular networks from pre-existing vessels. Angiogenesis is crucial during the organism's development, for wound healing and for the female reproductive cycle. Several murine experimental systems are well suited for studying developmental and pathological angiogenesis. They include the embryonic hindbrain, the post-natal retina and allantois explants. In these systems vascular networks are visualised by appropriate staining procedures followed by microscopic analysis. Nevertheless, quantitative assessment of angiogenesis is hampered by the lack of readily available, standardized metrics and software analysis tools. Non-automated protocols are widely used and are, in general, time- and labour-intensive, prone to human error and do not permit computation of complex spatial metrics. We have developed a lightweight, user-friendly software tool, AngioTool, which allows for quick, hands-off and reproducible quantification of vascular networks in microscopic images. AngioTool computes several morphological and spatial parameters including the area covered by a vascular network, the number of vessels, vessel length, vascular density and lacunarity. In addition, AngioTool calculates the so-called "branching index" (branch points per unit area), providing a measurement of the sprouting activity of a specimen of interest. We have validated AngioTool using images of embryonic murine hindbrains, post-natal retinas and allantois explants. AngioTool is open source and can be downloaded free of charge.
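
    A branching index of this kind can be sketched as follows; this is not AngioTool's actual implementation, just a minimal illustration that flags skeleton pixels with three or more 4-connected neighbours as branch points:

```python
import numpy as np
from scipy.ndimage import convolve

def branching_index(skeleton, pixel_area=1.0):
    """Branch points per unit image area on a binary vessel skeleton
    (simplified: 4-connected neighbour count, hypothetical definition)."""
    skeleton = skeleton.astype(bool)
    kernel = np.array([[0, 1, 0],
                       [1, 0, 1],
                       [0, 1, 0]])                      # 4-neighbourhood
    neighbors = convolve(skeleton.astype(int), kernel, mode="constant")
    branch_points = skeleton & (neighbors >= 3)
    return branch_points.sum() / (skeleton.size * pixel_area)

# toy skeleton: a "+" junction has exactly one pixel with 4 neighbours
img = np.zeros((7, 7), dtype=bool)
img[3, :] = True
img[:, 3] = True
bi = branching_index(img)
print(bi)  # 1 branch point / 49 pixels
```

    Real skeletons would first require segmentation and thinning of the stained vessel image.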

  3. A Quantitative Calculation Method of Composite Spatial Direction Similarity Concerning Scale Differences

    Directory of Open Access Journals (Sweden)

    CHEN Zhanlong

    2016-03-01

    Full Text Available This article introduces a new model for direction relations between multiple spatial objects at multiple scales, together with a corresponding similarity assessment method. The model improves on the direction relation matrix, which quantitatively models direction relations at the object scale. Using the idea of decomposition and the optimum solution of the transportation problem, it computes the minimum conversion cost between direction matrices, i.e. the distance between a pair of matrices. This quantifies the difference between a pair of directions and finally yields similarity values between arbitrary pairs of multiple spatial objects for comparison. Experiments on calculating similarity between objects at different scales show that the presented method is efficient, accurate, and capable of obtaining results consistent with human cognition.
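
    The "minimum conversion cost" step is a balanced transportation problem. The sketch below (a generic linear-programming formulation, not the paper's exact construction; the cost matrix here is a made-up Manhattan distance between cells) computes the distance between two normalized direction matrices:

```python
import numpy as np
from scipy.optimize import linprog

def matrix_distance(A, B, cost):
    """Minimum total cost of converting direction matrix A into B,
    posed as a balanced transportation problem (illustrative setup)."""
    a, b = A.ravel(), B.ravel()
    n = a.size
    # variables x[i*n + j]: mass moved from cell i of A to cell j of B
    A_eq = np.zeros((2 * n, n * n))
    for i in range(n):
        A_eq[i, i * n:(i + 1) * n] = 1   # supply: sum_j x[i, j] = a[i]
        A_eq[n + i, i::n] = 1            # demand: sum_i x[i, j] = b[j]
    b_eq = np.concatenate([a, b])
    res = linprog(cost.ravel(), A_eq=A_eq, b_eq=b_eq,
                  bounds=(0, None), method="highs")
    return res.fun

A = np.array([[0, 0, 0], [0, 1, 0], [0, 0, 0]], float)  # mass in centre cell
B = np.array([[0, 0, 0], [0, 0, 1], [0, 0, 0]], float)  # mass one cell right
pos = np.array([(r, c) for r in range(3) for c in range(3)], float)
cost = np.abs(pos[:, None, :] - pos[None, :, :]).sum(-1)  # Manhattan distance
dist = matrix_distance(A, B, cost)
print(dist)  # all mass moves one cell: 1.0
```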

  4. Research on Spatial Distribution of Rural Co-operative Financing Housing and the Factors in Guangzhou: Based on GIS Spatial Analysis and SPSS Quantitative Statistics

    Institute of Scientific and Technical Information of China (English)

    吕凤琴; 袁奇峰; 陈世栋

    2015-01-01

    Real-estate development in rural areas is prohibited by policy, yet, driven by external demand, small-property-rights housing that originated as rural co-operative financing housing has spread across urban fringe areas and has become a focus of academic attention. Combining spatial analysis methods with mathematical models on the ArcGIS 10.0 and SPSS platforms, and taking Guangzhou's Baiyun District as a case, this paper maps the spatial distribution of rural co-operative financing housing projects of the 1990s and, by building statistical models around the district's development characteristics, reveals the socio-economic factors behind their development. The results indicate that the spatial distribution of rural co-operative financing housing in Baiyun District shows strong traffic orientation, spatial agglomeration and regional differences, and that policy, location and farmers' per-capita net income are the main factors driving its spatial differentiation.

  5. Spatial methods in areal administrative data analysis

    Directory of Open Access Journals (Sweden)

    Haijun Ma

    2006-12-01

    Full Text Available Administrative data often arise as electronic copies of paid bills generated by insurance companies, including the Medicare and Medicaid programs. Such data are widely seen and analyzed in public health, as in investigations of cancer control, health service accessibility, and spatial epidemiology. In areas like political science and education, administrative data are also important. Administrative data are sometimes more readily available as summaries over administrative units (county, zip code, etc.) in a particular set determined by geopolitical boundaries, or what statisticians refer to as areal data. However, the spatial dependence often present in administrative data is commonly ignored by health services researchers. This can lead to problems in estimating the true underlying spatial surface, including inefficient use of data and biased conclusions. In this article, we review hierarchical statistical modeling and boundary analysis (wombling) methods for areal-level spatial data that can be easily carried out using freely available statistical computing packages. We also propose a new edge-domain method designed to detect geographical boundaries corresponding to abrupt changes in the areal-level surface. We illustrate our methods using county-level breast cancer late detection data from the state of Minnesota.
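
    The core intuition behind wombling can be sketched very simply: rank shared borders by the absolute difference in the areal surface across them. This toy example (hypothetical rates and adjacency, not the Minnesota data) illustrates the idea; the paper's hierarchical methods additionally model uncertainty in each rate:

```python
# Hypothetical areal data: a rate per county and a county adjacency list.
rates = {"A": 0.12, "B": 0.14, "C": 0.35, "D": 0.31}
adjacency = [("A", "B"), ("B", "C"), ("C", "D"), ("A", "D")]

# Crude boundary analysis: large cross-border differences suggest an
# abrupt change (a boundary) in the underlying spatial surface.
bvals = sorted(((abs(rates[u] - rates[v]), u, v) for u, v in adjacency),
               reverse=True)
for diff, u, v in bvals:
    print(f"border {u}-{v}: |difference in rate| = {diff:.2f}")
```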

  6. Applying Knowledge of Quantitative Design and Analysis

    Science.gov (United States)

    Baskas, Richard S.

    2011-01-01

    This study compared and contrasted two quantitative scholarly articles in relation to their research designs. Their designs were analyzed by the comparison of research references and research specific vocabulary to describe how various research methods were used. When researching and analyzing quantitative scholarly articles, it is imperative to…

  7. Quantitative color analysis for capillaroscopy image segmentation.

    Science.gov (United States)

    Goffredo, Michela; Schmid, Maurizio; Conforto, Silvia; Amorosi, Beatrice; D'Alessio, Tommaso; Palma, Claudio

    2012-06-01

    This communication introduces a novel approach for quantitatively evaluating the role of color space decomposition in digital nailfold capillaroscopy analysis. It is clinically recognized that alterations of the capillary pattern at the periungual skin region are directly related to dermatologic and rheumatic diseases. The proposed algorithm for the segmentation of digital capillaroscopy images is optimized with respect to the choice of color space and the contrast variation. Since the color space is a critical factor for segmenting low-contrast images, an exhaustive comparison between different color channels is conducted and a novel color channel combination is presented. Results from images of 15 healthy subjects are compared with annotated data, i.e. selected images approved by clinicians. From this comparison, a set of figures of merit is extracted that highlights the algorithm's capability to correctly segment capillaries, their shape and their number. Experimental tests show that the optimized procedure for capillary segmentation, based on a novel color channel combination, reaches average accuracy values higher than 0.8 and extracts capillaries whose shape and granularity are acceptable. The obtained results are particularly encouraging for future developments on the classification of capillary patterns with respect to dermatologic and rheumatic diseases.
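
    Why the choice of color channel matters can be shown on a toy image. The combination below (`G - 0.5R`) is a made-up example, not the paper's channel combination; it simply demonstrates that a linear mix of channels can raise the contrast of a dark-red structure on a pinkish background beyond any single channel:

```python
import numpy as np

rng = np.random.default_rng(5)
# Toy stand-in for a capillaroscopy image: dark-red capillary band
# (rows 30-33) on pinkish skin, plus sensor noise.
img = np.full((64, 64, 3), [220.0, 160.0, 160.0])
img[30:34] = [150.0, 60.0, 70.0]
img += rng.normal(0, 5, img.shape)

r, g, b = img[..., 0], img[..., 1], img[..., 2]
channels = {"R": r, "G": g, "B": b, "G-0.5R": g - 0.5 * r}  # last: made-up mix
contrasts = {}
for name, ch in channels.items():
    fg, bg = ch[30:34].mean(), ch[:30].mean()
    contrasts[name] = abs(fg - bg) / abs(bg)   # relative contrast
    print(f"{name}: relative contrast = {contrasts[name]:.3f}")
```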

  8. Quantitative gold nanoparticle analysis methods: A review.

    Science.gov (United States)

    Yu, Lei; Andriola, Angelo

    2010-08-15

    Research and development in the area of gold nanoparticle (AuNP) preparation, characterization, and applications have burgeoned in recent years. Many of the techniques and protocols are very mature, but two major concerns accompany the mass production and consumption of AuNP-based products. First, how many AuNPs exist in a dispersion? Second, where are the AuNPs after digestion by the environment, and how many are there? To answer these two questions, reliable and reproducible methods are needed to analyze the existence and the population of AuNPs in samples. This review summarizes the most recent chemical and particle quantitative analysis methods that have been used to characterize the concentration (in moles of gold per liter) or population (in particles per mL) of AuNPs. The methods summarized include mass spectrometry, electroanalytical methods, spectroscopic methods, and particle counting methods. These methods may count the number of AuNPs directly or analyze the total concentration of elemental gold in an AuNP dispersion.

  9. A Quantitative Analysis on Spatial Estimation Error of Regional Aboveground Forest Carbon Distribution

    Institute of Scientific and Technical Information of China (English)

    赵平安; 张茂震; 陈金星; 金雨菲; 郭含茹; 何卫安

    2013-01-01

    Spatial estimation of regional aboveground forest carbon distribution involves several error sources, which directly affect the precision of the estimation results. This paper identifies the error sources in estimating regional aboveground forest carbon distribution by combining national forest inventory (NFI) plot data with Landsat Thematic Mapper (TM) imagery, quantifies them as relative errors, and applies an uncertainty synthesis and allocation method to obtain the total spatial estimation error and the proportion contributed by each source. Error analysis based on 2004 NFI plot data and Landsat TM imagery from Lin'an Municipality, Zhejiang Province showed that the total spatial estimation error was 10.15%, apportioned as sampling error (63.5%), locating error from remote-image coordinate correction (22.9%), tree height measurement error (9.4%), biomass model error (3.4%) and DBH measurement error (0.8%).
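
    One common form of uncertainty synthesis combines independent relative errors in quadrature, with each source's share given by its squared contribution. The component magnitudes below are back-calculated from the reported total and proportions under that quadrature assumption; the paper's exact synthesis method may differ:

```python
import numpy as np

# Relative errors (%) of each source, assumed independent and
# combined in quadrature (values back-derived from the abstract).
sources = {
    "sampling": 8.09,
    "image coordinate correction": 4.86,
    "tree height measurement": 3.11,
    "biomass model": 1.87,
    "DBH measurement": 0.91,
}
total = np.sqrt(sum(e ** 2 for e in sources.values()))
print(f"total relative error: {total:.2f}%")
for name, e in sources.items():
    print(f"{name}: {100 * e**2 / total**2:.1f}% of total")
```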

  10. Quantitative Risk Analysis: Method And Process

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-03-01

    Full Text Available Recent and past studies (King III report, 2009: 73-75; Stoney, 2007; Committee of Sponsoring Organisations (COSO), 2004; Bartell, 2003; Liebenberg and Hoyt, 2003; Reason, 2000; Markowitz, 1957) lament that although the introduction of risk quantification to enhance objectivity in finance, for instance, developed in parallel with its development in the manufacturing industry, the same is not true in Higher Education Institutions (HEIs). In this regard, the objective of the paper was to demonstrate the method and process of Quantitative Risk Analysis (QRA) through the likelihood of occurrence of risk (phase I). This paper is the first of a two-phase study, which sampled one hundred (100) risk analysts at a university in the greater Eastern Cape Province of South Africa. The likelihood of occurrence of risk was analysed by logistic regression and percentages to investigate whether there was a significant difference between groups (analysts) in respect of QRA. The Hosmer and Lemeshow test was non-significant, with chi-square (X2) = 8.181 and p = 0.300, indicating a good model fit, since the data did not deviate significantly from the model. The study concluded that, to derive an overall likelihood rating indicating the probability that a potential risk may be exercised within the construct of an associated threat environment, the following governing factors must be considered: (1) threat-source motivation and capability, (2) nature of the vulnerability, and (3) existence and effectiveness of current controls (method and process).

  11. Quantitative risk analysis of a maritime petrochemical terminal

    Energy Technology Data Exchange (ETDEWEB)

    Ferreira, Leandro Silveira; Leal, Cesar A. [Universidade Federal do Rio Grande do Sul (UFRGS), Porto Alegre, RS (Brazil). Programa de Pos-Graduacao em Engenharia Mecanica (PROMEC)]. E-mail: leandro19889900@yahoo.com.br

    2008-07-01

    This work applies a computer program (RISKAN) developed for industrial risk quantification studies and reviews the models used in the program. As part of the evaluation, the program was applied to estimate the risks of a marine terminal for the storage of petrochemical products in the city of Rio Grande, Brazil. Thus, a Quantitative Risk Analysis of the terminal was performed, both for the workers and for the nearby population, with acceptability verified against the tolerability limits established by the State Licensing Agency (FEPAM-RS). In the risk analysis methodology used internationally, societal risk results are most commonly presented graphically as F-N curves, while individual risk is usually presented as iso-risk curves traced on the map of the plant area. At the beginning of the study, a historical accident analysis and a Preliminary Risk Analysis were carried out to aid identification of the possible accident scenarios related to the activities at the terminal. After the initiating events were identified, their frequencies or probabilities of occurrence were estimated, followed by calculations of the physical effects and fatalities, using models published by the Prins Maurits Laboratory and the American Institute of Chemical Engineers implemented within the program. The average societal risk obtained was 8.7x10^-7 fatality/year for the external population and 3.2x10^-4 fatality/year for the internal population (people working inside the terminal). The accident scenario contributing most to societal risk was death from exposure to thermal radiation caused by pool fire, accounting for 84.3% of the total estimated for external populations and 82.9% for the people inside the terminal.

  12. Spatial heterogeneity analysis of brain activation in fMRI.

    Science.gov (United States)

    Gupta, Lalit; Besseling, René M H; Overvliet, Geke M; Hofman, Paul A M; de Louw, Anton; Vaessen, Maarten J; Aldenkamp, Albert P; Ulman, Shrutin; Jansen, Jacobus F A; Backes, Walter H

    2014-01-01

    In many brain diseases it can be qualitatively observed that spatial patterns in blood oxygenation level dependent (BOLD) activation maps appear more (diffusively) distributed than in healthy controls. However, measures that can quantitatively characterize this spatial distributiveness in individual subjects are lacking. In this study, we propose a number of spatial heterogeneity measures to characterize brain activation maps. The proposed methods focus on different aspects of heterogeneity, including the shape (compactness), complexity in the distribution of activated regions (fractal dimension and co-occurrence matrix), and gappiness between activated regions (lacunarity). To this end, functional MRI derived activation maps of a language and a motor task were obtained in language impaired children with (Rolandic) epilepsy and compared to age-matched healthy controls. Group analysis of the activation maps revealed no significant differences between patients and controls for both tasks. However, for the language task the activation maps in patients appeared more heterogeneous than in controls. Lacunarity was the best measure to discriminate activation patterns of patients from controls (sensitivity 74%, specificity 70%) and illustrates the increased irregularity of gaps between activated regions in patients. The combination of heterogeneity measures and a support vector machine approach yielded further increase in sensitivity and specificity to 78% and 80%, respectively. This illustrates that activation distributions in impaired brains can be complex and more heterogeneous than in normal brains and cannot be captured fully by a single quantity. In conclusion, heterogeneity analysis has potential to robustly characterize the increased distributiveness of brain activation in individual patients.
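
    Of the proposed measures, lacunarity is the easiest to sketch. The gliding-box definition below (E[m^2]/E[m]^2 of box masses, a common formulation; the study's exact variant may differ) yields 1 for a gap-free map and grows with gappiness:

```python
import numpy as np

def lacunarity(binary, box):
    """Gliding-box lacunarity E[m^2]/E[m]^2, where m is the mass
    (count of active pixels) inside each box position."""
    h, w = binary.shape
    masses = np.array([binary[r:r + box, c:c + box].sum()
                       for r in range(h - box + 1)
                       for c in range(w - box + 1)], float)
    return masses.var() / masses.mean() ** 2 + 1.0

uniform = np.ones((32, 32))       # activation everywhere: no gaps
clustered = np.zeros((32, 32))    # activation in one corner: large gaps
clustered[:8, :8] = 1.0
lac_u = lacunarity(uniform, 4)
lac_c = lacunarity(clustered, 4)
print(lac_u)  # 1.0
print(lac_c)  # > 1: gappy, heterogeneous pattern
```

    Applied to thresholded BOLD activation maps, a higher lacunarity reflects more irregular gaps between activated regions, as reported for the patients here.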

  13. Spatial heterogeneity analysis of brain activation in fMRI

    Directory of Open Access Journals (Sweden)

    Lalit Gupta

    2014-01-01

    Full Text Available In many brain diseases it can be qualitatively observed that spatial patterns in blood oxygenation level dependent (BOLD) activation maps appear more (diffusively) distributed than in healthy controls. However, measures that can quantitatively characterize this spatial distributiveness in individual subjects are lacking. In this study, we propose a number of spatial heterogeneity measures to characterize brain activation maps. The proposed methods focus on different aspects of heterogeneity, including the shape (compactness), complexity in the distribution of activated regions (fractal dimension and co-occurrence matrix), and gappiness between activated regions (lacunarity). To this end, functional MRI derived activation maps of a language and a motor task were obtained in language impaired children with (Rolandic) epilepsy and compared to age-matched healthy controls. Group analysis of the activation maps revealed no significant differences between patients and controls for both tasks. However, for the language task the activation maps in patients appeared more heterogeneous than in controls. Lacunarity was the best measure to discriminate activation patterns of patients from controls (sensitivity 74%, specificity 70%) and illustrates the increased irregularity of gaps between activated regions in patients. The combination of heterogeneity measures and a support vector machine approach yielded further increase in sensitivity and specificity to 78% and 80%, respectively. This illustrates that activation distributions in impaired brains can be complex and more heterogeneous than in normal brains and cannot be captured fully by a single quantity. In conclusion, heterogeneity analysis has potential to robustly characterize the increased distributiveness of brain activation in individual patients.

  14. Wide-field quantitative imaging of tissue microstructure using sub-diffuse spatial frequency domain imaging.

    Science.gov (United States)

    McClatchy, David M; Rizzo, Elizabeth J; Wells, Wendy A; Cheney, Philip P; Hwang, Jeeseong C; Paulsen, Keith D; Pogue, Brian W; Kanick, Stephen C

    2016-06-20

    Localized measurements of scattering in biological tissue provide sensitivity to microstructural morphology but have limited utility to wide-field applications, such as surgical guidance. This study introduces sub-diffusive spatial frequency domain imaging (sd-SFDI), which uses high spatial frequency illumination to achieve wide-field sampling of localized reflectances. Model-based inversion recovers macroscopic variations in the reduced scattering coefficient (μs′) and the phase function backscatter parameter (γ). Measurements in optical phantoms show quantitative imaging of user-tuned phase-function-based contrast with accurate decoupling of parameters that define both the density and the size-scale distribution of scatterers. Measurements of fresh ex vivo breast tissue samples revealed, for the first time, unique clustering of sub-diffusive scattering properties for different tissue types. The results support that sd-SFDI provides maps of microscopic structural biomarkers that cannot be obtained with diffuse wide-field imaging and characterizes spatial variations not resolved by point-based optical sampling.

  15. The quantitative evaluation of cholinergic markers in spatial memory improvement induced by nicotine-bucladesine combination in rats.

    Science.gov (United States)

    Azami, Kian; Etminani, Maryam; Tabrizian, Kaveh; Salar, Fatemeh; Belaran, Maryam; Hosseini, Asieh; Hosseini-Sharifabad, Ali; Sharifzadeh, Mohammad

    2010-06-25

    We previously showed that post-training intra-hippocampal infusion of a nicotine-bucladesine combination enhanced spatial memory retention in the Morris water maze. Here we investigated the role of cholinergic markers in memory improvement induced by the nicotine-bucladesine combination. We assessed the expression of choline acetyltransferase (ChAT) and vesicular acetylcholine transporter (VAChT) in the CA1 region of the hippocampus and the medial septal area (MSA) of the brain. Post-training bilateral infusion of a low concentration of either nicotine or bucladesine alone into the CA1 region of the hippocampus did not affect spatial memory significantly. Quantitative immunostaining analysis of optical density in CA1 regions and evaluation of immunopositive neurons in the medial septal area of brain sections from all combination groups revealed a significant increase compared with nicotine alone, in a concentration-dependent manner. Also, the increase in the optical density and amount of ChAT and VAChT immunostaining correlated with the decrease in escape latency and travelled distance in rats treated with nicotine and a low dose of bucladesine. Taken together, these results suggest that significant increases in ChAT and VAChT protein expression in the CA1 region and medial septal area are possible mechanisms of the spatial memory improvement induced by the nicotine-bucladesine combination.

  16. Spatial variation in school performance, a local analysis of socio ...

    African Journals Online (AJOL)

    Setup

    Poor pass rates of matric learners at secondary schools in South Africa has been a concern for ... family or individual on a hierarchal social structure based on their access to, ... For a long time the complexities of spatial data ... Systems (GIS) has had an effect on quantitative geography and this ability to apply quantitative.

  17. Spatially Resolved Analysis of Bragg Selectivity

    Directory of Open Access Journals (Sweden)

    Tina Sabel

    2015-11-01

    Full Text Available This paper targets an inherent control of optical shrinkage in photosensitive polymers, contributing by means of spatially resolved analysis of volume holographic phase gratings. Point by point scanning of the local material response to the Gaussian intensity distribution of the recording beams is accomplished. Derived information on the local grating period and grating slant is evaluated by mapping of optical shrinkage in the lateral plane as well as through the depth of the layer. The influence of recording intensity, exposure duration and the material viscosity on the Bragg selectivity is investigated.

  18. Quantitative Data Analysis--In the Graduate Curriculum

    Science.gov (United States)

    Albers, Michael J.

    2017-01-01

    A quantitative research study collects numerical data that must be analyzed to help draw the study's conclusions. Teaching quantitative data analysis is not teaching number crunching, but teaching a way of critical thinking for how to analyze the data. The goal of data analysis is to reveal the underlying patterns, trends, and relationships of a…

  19. The Curriculum in Quantitative Analysis: Results of a Survey.

    Science.gov (United States)

    Locke, David C.; Grossman, William E. L.

    1987-01-01

    Reports on the results of a survey of college level instructors of quantitative analysis courses. Discusses what topics are taught in such courses, how much weight is given to these topics, and which experiments are used in the laboratory. Poses some basic questions about the curriculum in quantitative analysis. (TW)

  20. A New Methodology of Spatial Cross-Correlation Analysis

    Science.gov (United States)

    Chen, Yanguang

    2015-01-01

    Spatial correlation modeling comprises both spatial autocorrelation and spatial cross-correlation processes. The spatial autocorrelation theory has been well-developed. It is necessary to advance the method of spatial cross-correlation analysis to supplement the autocorrelation analysis. This paper presents a set of models and analytical procedures for spatial cross-correlation analysis. By analogy with Moran’s index newly expressed in a spatial quadratic form, a theoretical framework is derived for geographical cross-correlation modeling. First, two sets of spatial cross-correlation coefficients are defined, including a global spatial cross-correlation coefficient and local spatial cross-correlation coefficients. Second, a pair of scatterplots of spatial cross-correlation is proposed, and the plots can be used to visually reveal the causality behind spatial systems. Based on the global cross-correlation coefficient, Pearson’s correlation coefficient can be decomposed into two parts: direct correlation (partial correlation) and indirect correlation (spatial cross-correlation). As an example, the methodology is applied to the relationships between China’s urbanization and economic development to illustrate how to model spatial cross-correlation phenomena. This study is an introduction to developing the theory of spatial cross-correlation, and future geographical spatial analysis might benefit from these models and indexes. PMID:25993120
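
    The quadratic-form idea can be sketched directly: standardize both variables, unitize the spatial weights, and evaluate the bilinear form. This is an illustrative formulation consistent with the general idea (notation and scaling in the paper may differ); for y = x it reduces to a Moran's I computation:

```python
import numpy as np

def spatial_cross_corr(x, y, W):
    """Global spatial cross-correlation as a quadratic/bilinear form,
    by analogy with Moran's I (illustrative sketch)."""
    zx = (x - x.mean()) / x.std()
    zy = (y - y.mean()) / y.std()
    return zx @ (W / W.sum()) @ zy   # unitized weights

# four areal units in a row; rook (chain) adjacency weights
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], float)
x = np.array([1.0, 2.0, 3.0, 4.0])   # e.g. urbanization level
y = np.array([2.0, 3.0, 4.0, 5.0])   # e.g. economic output
rc = spatial_cross_corr(x, y, W)
print(rc)                            # positive: spatially co-varying
print(spatial_cross_corr(x, x, W))   # Moran's I of x itself
```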

  1. Combination and Integration of Qualitative and Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Philipp Mayring

    2001-02-01

    Full Text Available In this paper, I am going to outline ways of combining qualitative and quantitative steps of analysis on five levels. On the technical level, programs for the computer-aided analysis of qualitative data offer various combinations. Where the data are concerned, the employment of categories (for instance by using qualitative content analysis) allows for combining qualitative and quantitative forms of data analysis. On the individual level, the creation of types and the inductive generalisation of cases allow for proceeding from individual case material to quantitative generalisations. As for research design, different models can be distinguished (preliminary study, generalisation, elaboration, triangulation) which combine qualitative and quantitative steps of analysis. Where the logic of research is concerned, it can be shown that an extended process model which combined qualitative and quantitative research can be appropriate and thus lead to an integration of the two approaches. URN: urn:nbn:de:0114-fqs010162

  2. Visual and quantitative evaluation of selected image combination schemes in ultrasound spatial compound scanning

    DEFF Research Database (Denmark)

    Wilhjelm, Jens E.; Jensen, M.S.; Jespersen, S.K.

    2004-01-01

    Multi-angle spatial compound images are normally generated by averaging the recorded single-angle images (SAIs). To exploit possible advantages associated with alternative combination schemes, this paper investigates the effect of both the number of angles (Nθ) and the operator (mean, median, mean-excluding-maximum (mem), root-mean-square (rms), geometric mean and maximum) on image quality (tissue delineation and artifacts), speckle signal-to-noise ratio (SNRs) and contrast. The evaluation is based on in vitro SAIs (±21° in steps of Δθ = 7°) of formalin-fixed porcine tissue containing adipose, connective and muscular tissue. Image quality increased with the number of angles up to ±14°, after which the improvements became debatable. The mem and median operators, which try to render the images more quantitatively correct by suppressing strong echoes from specular
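
    The operators under comparison are simple per-pixel reductions over the stack of SAIs. The sketch below (synthetic Rayleigh-like speckle with a made-up specular echo, not the paper's in vitro data) shows how mem and median suppress a strong echo present in only one angle, while mean and maximum preserve it:

```python
import numpy as np

rng = np.random.default_rng(3)
# 7 single-angle images (SAIs) with Rayleigh-like speckle; one angle
# receives a strong specular echo in a small region.
sais = rng.rayleigh(scale=1.0, size=(7, 64, 64))
sais[3, 20:30, 20:30] += 5.0

ops = {
    "mean":   lambda s: s.mean(axis=0),
    "median": lambda s: np.median(s, axis=0),
    "mem":    lambda s: (s.sum(axis=0) - s.max(axis=0)) / (s.shape[0] - 1),
    "rms":    lambda s: np.sqrt((s ** 2).mean(axis=0)),
    "max":    lambda s: s.max(axis=0),
}
echo = {name: op(sais)[20:30, 20:30].mean() for name, op in ops.items()}
for name, v in echo.items():
    print(f"{name}: mean level in echo region = {v:.2f}")
```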

  3. An information theory analysis of spatial decisions in cognitive development.

    Science.gov (United States)

    Scott, Nicole M; Sera, Maria D; Georgopoulos, Apostolos P

    2015-01-01

    Performance in a cognitive task can be considered as the outcome of a decision-making process operating across various knowledge domains or aspects of a single domain. Therefore, an analysis of these decisions in various tasks can shed light on the interplay and integration of these domains (or elements within a single domain) as they are associated with specific task characteristics. In this study, we applied an information theoretic approach to assess quantitatively the gain of knowledge across various elements of the cognitive domain of spatial, relational knowledge, as a function of development. Specifically, we examined changing spatial relational knowledge from ages 5 to 10 years. Our analyses consisted of a two-step process. First, we performed a hierarchical clustering analysis on the decisions made in 16 different tasks of spatial relational knowledge to determine which tasks were performed similarly at each age group as well as to discover how the tasks clustered together. We next used two measures of entropy to capture the gradual emergence of order in the development of relational knowledge. These measures of "cognitive entropy" were defined based on two independent aspects of chunking, namely (1) the number of clusters formed at each age group, and (2) the distribution of tasks across the clusters. We found that both measures of entropy decreased with age in a quadratic fashion and were positively and linearly correlated. The decrease in entropy and, therefore, gain of information during development was accompanied by improved performance. These results document, for the first time, the orderly and progressively structured "chunking" of decisions across the development of spatial relational reasoning and quantify this gain within a formal information-theoretic framework.
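
    The second entropy aspect, the distribution of tasks across clusters, is a Shannon entropy over cluster occupancies. The sketch below uses made-up cluster assignments (not the study's data) to show how chunking 16 tasks into fewer clusters lowers the entropy:

```python
import numpy as np

def distribution_entropy(labels):
    """Shannon entropy (bits) of how items distribute across clusters --
    a sketch of one of the two 'cognitive entropy' aspects described above."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

# 16 tasks: scattered across many clusters vs chunked into two
scattered = list(range(8)) * 2        # 8 clusters of 2 tasks each
chunked = [0] * 8 + [1] * 8           # 2 clusters of 8 tasks each
h_scattered = distribution_entropy(scattered)
h_chunked = distribution_entropy(chunked)
print(h_scattered)  # 3.0 bits
print(h_chunked)    # 1.0 bit
```

    The decrease from 3.0 to 1.0 bits mirrors the reported emergence of order: as knowledge consolidates with age, decisions chunk into fewer, larger clusters.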

  4. An Information Theory Analysis of Spatial Decisions in Cognitive Development

    Directory of Open Access Journals (Sweden)

    Nicole M Scott

    2015-02-01

    Full Text Available Performance in a cognitive task can be considered as the outcome of a decision-making process operating across various knowledge domains or aspects of a single domain. Therefore, an analysis of these decisions in various tasks can shed light on the interplay and integration of these domains (or elements within a single domain) as they are associated with specific task characteristics. In this study, we applied an information theoretic approach to assess quantitatively the gain of knowledge across various elements of the cognitive domain of spatial, relational knowledge, as a function of development. Specifically, we examined changing spatial relational knowledge from ages five to ten years. Our analyses consisted of a two-step process. First, we performed a hierarchical clustering analysis on the decisions made in 16 different tasks of spatial relational knowledge to determine which tasks were performed similarly at each age group as well as to discover how the tasks clustered together. We next used two measures of entropy to capture the gradual emergence of order in the development of relational knowledge. These measures of cognitive entropy were defined based on two independent aspects of chunking, namely (1) the number of clusters formed at each age group, and (2) the distribution of tasks across the clusters. We found that both measures of entropy decreased with age in a quadratic fashion and were positively and linearly correlated. The decrease in entropy and, therefore, gain of information during development was accompanied by improved performance. These results document, for the first time, the orderly and progressively structured chunking of decisions across the development of spatial relational reasoning and quantify this gain within a formal information-theoretic framework.

  5. Some Epistemological Considerations Concerning Quantitative Analysis

    Science.gov (United States)

    Dobrescu, Emilian

    2008-01-01

    This article presents the author's address at the 2007 "Journal of Applied Quantitative Methods" ("JAQM") prize awarding festivity. The festivity was included in the opening of the 4th International Conference on Applied Statistics, November 22, 2008, Bucharest, Romania. In the address, the author reflects on three theses that…

  6. Optical multiresolution analysis with spatial localization

    Science.gov (United States)

    Mazzaferri, Javier; Ledesma, Silvia

    2010-05-01

    Multiresolution analysis is very useful for characterization of textures, segmentation tasks, and feature enhancement. The development of optical methods to perform such procedures is highly promising for real-time applications. Usually, optical implementations of multiresolution analysis consist of decomposing the input scene into different frequency bands, obtaining various filtered versions of the scene. However, under certain circumstances it can be useful to provide just one version of the scene in which different filters are applied in different regions. This procedure could be especially interesting for biological and medical applications in situations where the approximate localization of the scale information is known a priori. In this paper we present a fully optical method to perform multiresolution analysis with spatial localization. By means of the proposed technique, the multi-scale analysis is performed at once in a single image. The experimental set-up consists of a double-pass convergent optical processor. The first stage of the device performs the multiple band decomposition, while the second stage confines the information of each band to different regions of the object and recombines it to achieve the desired operation. Numerical simulations and experimental results, which demonstrate the very good performance of the method, are presented.

  7. Spatial Data Mining using Cluster Analysis

    Directory of Open Access Journals (Sweden)

    Ch.N.Santhosh Kumar

    2012-09-01

    Full Text Available Data mining, also referred to as Knowledge Discovery in Databases (KDD), is the process of nontrivial extraction of implicit, previously unknown and potentially useful information, such as knowledge rules, descriptions, regularities, and major trends, from large databases. Data mining has evolved into a multidisciplinary field, drawing on database technology, machine learning, artificial intelligence, neural networks, information retrieval, and so on. In principle, data mining should be applicable to the different kinds of data and databases used in many different applications, including relational databases, transactional databases, data warehouses, object-oriented databases, and special application-oriented databases such as spatial databases, temporal databases, multimedia databases, and time-series databases. Spatial data mining, also called spatial mining, is data mining applied to spatial data or spatial databases. Spatial data are data that have a spatial or location component, and they convey information that is more complex than classical data. A spatial database stores spatial data represented by spatial data types together with the spatial relationships among them. Spatial data mining encompasses various tasks, including spatial classification, spatial association rule mining, spatial clustering, characteristic rules, discriminant rules, and trend detection. This paper presents how spatial data mining is achieved using clustering.
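The clustering algorithm the paper uses is not specified in this excerpt; purely as an illustration of clustering applied to spatial data, a naive k-means over 2-D coordinates (all point data hypothetical) might look like:

```python
import numpy as np

def kmeans(points, k, iters=50, seed=0):
    """Naive k-means over 2-D spatial coordinates (x/y or lon/lat)."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        # assign each point to its nearest center
        d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # move each center to the mean of its assigned points
        for j in range(k):
            if (labels == j).any():
                centers[j] = points[labels == j].mean(axis=0)
    return labels, centers

# two well-separated hypothetical spatial clusters
pts = np.array([[0.0, 0.0], [0.1, 0.2], [0.2, 0.1],
                [5.0, 5.0], [5.1, 5.2], [4.9, 5.1]])
labels, _ = kmeans(pts, k=2)
print(labels)  # first three points share one label, last three the other
```

Real spatial clustering often prefers density-based methods (e.g. DBSCAN), which do not require choosing k and can find irregularly shaped clusters.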

  8. Quantitative analysis and parametric display of regional myocardial mechanics

    Science.gov (United States)

    Eusemann, Christian D.; Bellemann, Matthias E.; Robb, Richard A.

    2000-04-01

    Quantitative assessment of regional heart motion has significant potential for more accurate diagnosis of heart disease and/or cardiac irregularities. Local heart motion may be studied from medical imaging sequences. Using functional parametric mapping, regional myocardial motion during a cardiac cycle can be color mapped onto a deformable heart model to obtain a better understanding of the structure-to-function relationships in the myocardium, including regional patterns of akinesis or diskinesis associated with ischemia or infarction. In this study, 3D reconstructions were obtained from the Dynamic Spatial Reconstructor at 15 time points throughout one cardiac cycle of pre-infarct and post-infarct hearts. Deformable models were created from the 3D images for each time point of the cardiac cycles. From these polygonal models, regional excursions and velocities of each vertex representing a unit of myocardium were calculated for successive time intervals. The calculated results were visualized through model animations and/or specially formatted static images. The time point of regional maximum velocity and excursion of myocardium through the cardiac cycle was displayed using color mapping. The absolute values of regional maximum velocity and maximum excursion were displayed in a similar manner. Using animations, the local myocardial velocity changes were visualized as color changes on the cardiac surface during the cardiac cycle. Moreover, the magnitude and direction of motion for individual segments of myocardium could be displayed. Comparison of these dynamic parametric displays suggests that the ability to encode quantitative functional information on dynamic cardiac anatomy enhances the diagnostic value of 4D images of the heart. Myocardial mechanics quantified this way adds a new dimension to the analysis of cardiac functional disease, including regional patterns of akinesis and diskinesis associated with ischemia and infarction. Similarly, disturbances in

  9. Quantitative analysis with the optoacoustic/ultrasound system OPUS

    Science.gov (United States)

    Haisch, Christoph; Zell, Karin; Sperl, Jonathan; Vogel, Mika W.; Niessner, Reinhard

    2009-02-01

    The OPUS (Optoacoustic plus Ultrasound) system is a combination of a medical ultrasound scanner with a high-repetition-rate, wavelength-tunable laser system and a suitable triggering interface to synchronize the laser and the ultrasound system. The pulsed laser generates an optoacoustic (OA), or photoacoustic (PA), signal which is detected by the ultrasound system. Alternatively, imaging in conventional ultrasound mode can be performed. Both imaging modes can be superimposed. The laser light is coupled into the tissue laterally, parallel to the ultrasound transducer, which does not require any major modification to the transducer or the ultrasound beam forming. This was a basic requirement for the instrument, as the intention of the project was to establish the optoacoustic imaging modality as an add-on to a conventional standard ultrasound instrument. We believe that this approach may foster the introduction of OA imaging as a routine tool in medical diagnosis. Another key aspect of the project was to exploit the capabilities of OA imaging for quantitative analysis. The intention of the presented work is to summarize all steps necessary to extract from the PA raw data the significant information required for the quantification of local absorber distributions. We show results of spatially resolved absorption measurements in scattering samples and a comparison of four different image reconstruction algorithms, regarding their influence on lateral resolution as well as on the signal-to-noise ratio for different sample depths and absorption values. The reconstruction algorithms are based on Fourier transformation, on a generalized 2D Hough transformation, on circular back-projection, and on the classical delay-and-sum approach implemented in most ultrasound scanners. Furthermore, we discuss the influence of a newly developed laser source combining diode and flash lamp pumping. Compared to all-flash-lamp pumped systems it features a significantly higher

  10. Residual analysis for spatial point processes

    DEFF Research Database (Denmark)

    Baddeley, A.; Turner, R.; Møller, Jesper

    We define residuals for point process models fitted to spatial point pattern data, and propose diagnostic plots based on these residuals. The techniques apply to any Gibbs point process model, which may exhibit spatial heterogeneity, interpoint interaction and dependence on spatial covariates. Ou...

  11. European Identity in Russian Regions Bordering on Finland: Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    A. O. Domanov

    2014-01-01

    Full Text Available The quantitative analysis of an opinion poll conducted in October 2013 in three Russian cities located near the Finnish border (St. Petersburg, Kronstadt and Vyborg) explores the European identity of their citizens. This area was chosen to illustrate the crucial importance of space interpretation in spatial identity formation by using a critical geopolitical approach. The study shows how different images of space on the same territory act as intermediate variables between objective territorial characteristics and citizens’ identities. As the geographical position at the border of Russia provides the citizens with geopolitical alternatives to identify their location as a fortress defending the nation (as in the case of Kronstadt) or a bridge between cultures, the given study allows us to compare the reasons for these geopolitical choices of inhabitants. Furthermore, the research aims at bridging the gap in the studies of European and multiple identity in Russian regions and provides a Northwest Russian perspective on the perpetual discussion about the subjective eastern border of Europe.

  12. Structural and quantitative analysis of Equisetum alkaloids.

    Science.gov (United States)

    Cramer, Luise; Ernst, Ludger; Lubienski, Marcus; Papke, Uli; Schiebel, Hans-Martin; Jerz, Gerold; Beuerle, Till

    2015-08-01

    Equisetum palustre L. is known for its toxicity to livestock. Several studies in the past addressed the isolation and identification of the responsible alkaloids. So far, palustrine (1) and N(5)-formylpalustrine (2) are known alkaloids of E. palustre. An HPLC-ESI-MS/MS method in combination with a simple sample work-up was developed to identify and quantitate Equisetum alkaloids. Besides the two known alkaloids, six related alkaloids were detected in different Equisetum samples. The structure of the alkaloid palustridiene (3) was derived by comprehensive 1D and 2D NMR experiments. N(5)-Acetylpalustrine (4) was also thoroughly characterized by NMR for the first time. The structure of N(5)-formylpalustridiene (5) is proposed based on mass spectrometry results. Twenty-two E. palustre samples were screened with this method, and in most cases the set of all eight alkaloids was detected in all parts of the plant. A high variability of the alkaloid content and distribution was found depending on plant organ, plant origin and season, ranging from 88 to 597 mg/kg dry weight. However, palustrine (1) and palustridiene (3) always represented the main alkaloids. For the first time, a comprehensive identification, quantitation and distribution analysis of Equisetum alkaloids was achieved.

  13. Energy Dispersive Spectrometry and Quantitative Analysis Short Course. Introduction to X-ray Energy Dispersive Spectrometry and Quantitative Analysis

    Science.gov (United States)

    Carpenter, Paul; Curreri, Peter A. (Technical Monitor)

    2002-01-01

    This course will cover practical applications of the energy-dispersive spectrometer (EDS) to x-ray microanalysis. Topics covered will include detector technology, advances in pulse processing, resolution and performance monitoring, detector modeling, peak deconvolution and fitting, qualitative and quantitative analysis, compositional mapping, and standards. An emphasis will be placed on use of the EDS for quantitative analysis, with discussion of typical problems encountered in the analysis of a wide range of materials and sample geometries.

  14. A quantitative method for zoning of protected areas and its spatial ecological implications.

    Science.gov (United States)

    Del Carmen Sabatini, María; Verdiell, Adriana; Rodríguez Iglesias, Ricardo M; Vidal, Marta

    2007-04-01

    Zoning is a key prescriptive tool for the administration and management of protected areas. However, the lack of zoning is common for most protected areas in developing countries and, as a consequence, many protected areas are not effective in achieving the goals for which they were created. In this work, we introduce a quantitative method to expeditiously zone protected areas and we evaluate its ecological implications on hypothetical zoning cases. A real-world application is reported for the Talampaya National Park, a UNESCO World Heritage Site located in Argentina. Our method is a modification of the zoning forest model developed by Bos [Bos, J., 1993. Zoning in forest management: a quadratic assignment problem solved by simulated annealing. Journal of Environmental Management 37, 127-145.]. Main innovations involve a quadratic function of distance between land units, non-reciprocal weights for adjacent land uses (mathematically represented by a non-symmetric matrix), and the possibility of imposing a connectivity constraint. Due to its intrinsic spatial dimension, the zoning problem belongs to the NP-hard class, i.e. a solution can only be obtained in non-polynomial time [Nemhauser, G., Wolsey, L., 1988. Integer and Combinatorial Optimization. John Wiley, New York.]. For that purpose, we applied a simulated annealing heuristic implemented as a FORTRAN language routine. Our innovations were effective in achieving zoning designs more compatible with biological diversity protection. The quadratic distance term facilitated the delineation of core zones for elements of significance; the connectivity constraint minimized fragmentation; non-reciprocal land use weightings contributed to better representing management decisions, and influenced mainly the edge and shape of zones. This quantitative method can assist the zoning process within protected areas by offering many zonation scheme alternatives with minimum cost, time and effort. This ability provides a new tool to
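The full model (quadratic distance terms, connectivity constraint, FORTRAN implementation) is not reproduced in this excerpt; a minimal sketch of the core idea, simulated annealing over land-use assignments with a non-symmetric adjacency weight matrix, might look like the following, where the weights, strip layout and cooling schedule are all invented for illustration:

```python
import math
import random

# Hypothetical non-reciprocal incompatibility weights between adjacent land
# uses (row = use of one unit, column = use of its neighbour); W[a][b] != W[b][a]
W = [[0.0, 1.0, 4.0],   # 0 = core zone
     [2.0, 0.0, 1.0],   # 1 = buffer zone
     [5.0, 1.0, 0.0]]   # 2 = intensive-use zone

def cost(z, adj):
    """Total incompatibility over all ordered pairs of adjacent land units."""
    return sum(W[z[i]][z[j]] for i, j in adj)

def anneal(z0, adj, t0=5.0, cooling=0.999, steps=5000, seed=1):
    """Simulated annealing over zonings; swapping units keeps area quotas fixed."""
    rng = random.Random(seed)
    z, t = list(z0), t0
    c = cost(z, adj)
    best, best_c = list(z), c
    for _ in range(steps):
        i, j = rng.randrange(len(z)), rng.randrange(len(z))
        z[i], z[j] = z[j], z[i]                     # propose a swap
        c_new = cost(z, adj)
        if c_new <= c or rng.random() < math.exp((c - c_new) / t):
            c = c_new                               # accept (always if better)
            if c < best_c:
                best, best_c = list(z), c
        else:
            z[i], z[j] = z[j], z[i]                 # reject: undo the swap
        t *= cooling
    return best, best_c

# 12 land units along a strip, 4 units of each use, initially interleaved
z0 = [0, 1, 2] * 4
adj = [(i, i + 1) for i in range(11)] + [(i + 1, i) for i in range(11)]
best, best_c = anneal(z0, adj)
print(best_c <= cost(z0, adj))  # True: annealing never ends worse than the start
```

On this toy strip the optimum groups like uses into contiguous blocks with the cheap buffer use in the middle; the actual model adds the quadratic distance term and connectivity constraint on top of such an adjacency cost.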

  15. China ASON Network Migration Scenarios and Their Quantitative Analysis

    Institute of Scientific and Technical Information of China (English)

    Soichiro Araki; Itaru Nishioka; Yoshihiko Suemura

    2003-01-01

    This paper proposes two migration scenarios from China ring networks to ASON mesh networks. In our quantitative analysis with an ASON/GMPLS simulator, a subnetwork protection scheme achieved the best-balanced performance in resource utilization and restoration time.

  16. China ASON Network Migration Scenarios and Their Quantitative Analysis

    Institute of Scientific and Technical Information of China (English)

    Guoying Zhang; Soichiro Araki; Itaru Nishioka; Yoshihiko Suemura

    2003-01-01

    This paper proposes two migration scenarios from China ring networks to ASON mesh networks. In our quantitative analysis with an ASON/GMPLS simulator, a subnetwork protection scheme achieved the best-balanced performance in resource utilization and restoration time.

  17. Quantitative and qualitative analysis of sterols/sterolins and ...

    African Journals Online (AJOL)

    STORAGESEVER

    2008-06-03

    Jun 3, 2008 ... Quantitative and qualitative analysis of sterols/sterolins ... method was developed to identify and quantify sterols (especially β-sitosterol) in chloroform extracts of ... Studies with phytosterols, especially β-sitosterol, have.

  18. INFORMAL HOUSING IN GREECE: A QUANTITATIVE SPATIAL ANALYSIS

    OpenAIRE

    Serafeim POLYZOS; MINETOS, Dionysios

    2009-01-01

    During the last 50 years in Greece, growing demand for urban (residential and industrial) space has resulted in unplanned residential development and informal dwelling construction at the expense of agricultural and forest land uses. Despite the fact that the post-war challenge faced by the state in providing minimal housing for its citizens has been met, the informal settlements phenomenon still proceeds. This situation tends to become an acute problem with serious economic, social and envir...

  19. Spatial and Quantitative Comparison of Satellite-Derived Land Cover Products over China

    Institute of Scientific and Technical Information of China (English)

    GAO Hao; JIA Gen-Suo

    2012-01-01

    Because land cover plays an important role in global climate change studies, assessing the agreement among different land cover products is critical. Significant discrepancies have been reported among satellite-derived land cover products, especially at the regional scale. Different classification schemes are a key obstacle to the comparison of products and are considered the main factor behind the disagreement among the different products. Using a feature-based overlap metric, we investigated the degree of spatial agreement and quantified the overall and class-specific agreement among the Moderate Resolution Imaging Spectroradiometer (MODIS), Global Land Cover 2000 (GLC2000), and National Land Cover/Use Datasets (NLCD) products, and we assessed the products against ground reference data at the regional scale over China. The areas with a low degree of agreement mostly occurred in heterogeneous terrain and transition zones, while the areas with a high degree of agreement occurred in major plains and areas with homogeneous vegetation. The overall agreement of the MODIS and GLC2000 products was 50.8% and 52.9%, and the overall accuracy was 50.3% and 41.9%, respectively. Class-specific agreement or accuracy varied significantly. The high-agreement classes are water, grassland, cropland, snow and ice, and bare areas, whereas the classes with low agreement are shrubland and wetland in both MODIS and GLC2000. These characteristics of spatial patterns and quantitative agreement could be partly explained by the complex landscapes, mixed vegetation, low separability of spectro-temporal-texture signals, and coarse pixels. Differences in class definitions among the classification schemes also affect the agreement. Each product had its advantages and limitations, but neither the overall accuracy nor the class-specific accuracy could meet the requirements of climate modeling.

  20. Quantitative Models and Analysis for Reactive Systems

    DEFF Research Database (Denmark)

    Thrane, Claus

    phones and websites. Acknowledging that now more than ever, systems come in contact with the physical world, we need to revise the way we construct models and verification algorithms, to take into account the behavior of systems in the presence of approximate, or quantitative information, provided...... by the environment in which they are embedded. This thesis studies the semantics and properties of a model-based framework for re- active systems, in which models and specifications are assumed to contain quantifiable information, such as references to time or energy. Our goal is to develop a theory of approximation......, by studying how small changes to our models affect the verification results. A key source of motivation for this work can be found in The Embedded Systems Design Challenge [HS06] posed by Thomas A. Henzinger and Joseph Sifakis. It contains a call for advances in the state-of-the-art of systems verification...

  1. Quantitative Models and Analysis for Reactive Systems

    DEFF Research Database (Denmark)

    Thrane, Claus

    phones and websites. Acknowledging that now more than ever, systems come in contact with the physical world, we need to revise the way we construct models and verification algorithms, to take into account the behavior of systems in the presence of approximate, or quantitative information, provided......, allowing verification procedures to quantify judgements, on how suitable a model is for a given specification — hence mitigating the usual harsh distinction between satisfactory and non-satisfactory system designs. This information, among other things, allows us to evaluate the robustness of our framework......, by studying how small changes to our models affect the verification results. A key source of motivation for this work can be found in The Embedded Systems Design Challenge [HS06] posed by Thomas A. Henzinger and Joseph Sifakis. It contains a call for advances in the state-of-the-art of systems verification...

  2. Towards a quantitative OCT image analysis.

    Directory of Open Access Journals (Sweden)

    Marina Garcia Garrido

    Full Text Available Optical coherence tomography (OCT) is an invaluable diagnostic tool for the detection and follow-up of retinal pathology in patients and experimental disease models. However, as morphological structures and layering in health as well as their alterations in disease are complex, segmentation procedures have not yet reached a satisfactory level of performance. Therefore, raw images and qualitative data are commonly used in clinical and scientific reports. Here, we assess the value of OCT reflectivity profiles as a basis for a quantitative characterization of the retinal status in a cross-species comparative study. Spectral-Domain Optical Coherence Tomography (OCT), confocal Scanning-Laser Ophthalmoscopy (SLO), and Fluorescein Angiography (FA) were performed in mice (Mus musculus), gerbils (Gerbillus perpallidus), and cynomolgus monkeys (Macaca fascicularis) using the Heidelberg Engineering Spectralis system, and additional SLOs and FAs were obtained with the HRA I (same manufacturer). Reflectivity profiles were extracted from 8-bit greyscale OCT images using the ImageJ software package (http://rsb.info.nih.gov/ij/). Reflectivity profiles obtained from OCT scans of all three animal species correlated well with ex vivo histomorphometric data. Each of the retinal layers showed a typical pattern that varied in relative size and degree of reflectivity across species. In general, plexiform layers showed a higher level of reflectivity than nuclear layers. A comparison of reflectivity profiles from specialized retinal regions (e.g. visual streak in gerbils, fovea in non-human primates) with respective regions of human retina revealed multiple similarities. In a model of Retinitis Pigmentosa (RP), the value of reflectivity profiles for the follow-up of therapeutic interventions was demonstrated. OCT reflectivity profiles provide a detailed, quantitative description of retinal layers and structures including specialized retinal regions. Our results highlight the
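The paper extracts its profiles with ImageJ; an equivalent depth profile can be sketched in a few lines of NumPy, here on a synthetic B-scan rather than real OCT data:

```python
import numpy as np

def reflectivity_profile(oct_bscan):
    """Average an 8-bit greyscale OCT B-scan across columns (A-scans),
    yielding one mean reflectivity value per retinal depth (row)."""
    img = np.asarray(oct_bscan, dtype=float)
    return img.mean(axis=1)

# Synthetic 8-bit B-scan: a bright band (e.g. a plexiform layer) above a dim one
bscan = np.zeros((6, 4), dtype=np.uint8)
bscan[1:3, :] = 200   # highly reflective band
bscan[4:6, :] = 40    # weakly reflective band
profile = reflectivity_profile(bscan)
print(profile.tolist())  # [0.0, 200.0, 200.0, 0.0, 40.0, 40.0]
```

Plotting such a profile against depth reproduces the layer-by-layer reflectivity pattern the study compares across species.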

  3. Quantitative data analysis in education a critical introduction using SPSS

    CERN Document Server

    Connolly, Paul

    2007-01-01

    This book provides a refreshing and user-friendly guide to quantitative data analysis in education for students and researchers. It assumes absolutely no prior knowledge of quantitative methods or statistics. Beginning with the very basics, it provides the reader with the knowledge and skills necessary to be able to undertake routine quantitative data analysis to a level expected of published research. Rather than focusing on teaching statistics through mathematical formulae, the book places an emphasis on using SPSS to gain a real feel for the data and an intuitive grasp of t

  4. Joint association analysis of bivariate quantitative and qualitative traits.

    Science.gov (United States)

    Yuan, Mengdie; Diao, Guoqing

    2011-11-29

    Univariate genome-wide association analysis of quantitative and qualitative traits has been investigated extensively in the literature. In the presence of correlated phenotypes, it is more intuitive to analyze all phenotypes simultaneously. We describe an efficient likelihood-based approach for the joint association analysis of quantitative and qualitative traits in unrelated individuals. We assume a probit model for the qualitative trait, under which an unobserved latent variable and a prespecified threshold determine the value of the qualitative trait. To jointly model the quantitative and qualitative traits, we assume that the quantitative trait and the latent variable follow a bivariate normal distribution. The latent variable is allowed to be correlated with the quantitative phenotype. Simultaneous modeling of the quantitative and qualitative traits allows us to make more precise inference on the pleiotropic genetic effects. We derive likelihood ratio tests for the testing of genetic effects. An application to the Genetic Analysis Workshop 17 data is provided. The new method yields reasonable power and meaningful results for the joint association analysis of the quantitative trait Q1 and the qualitative trait disease status at SNPs with not too small MAF.

  5. Spatial analysis of NDVI readings with different sampling density

    Science.gov (United States)

    Advanced remote sensing technologies provide research an innovative way of collecting spatial data for use in precision agriculture. Sensor information and spatial analysis together allow for a complete understanding of the spatial complexity of a field and its crop. The objective of the study was...

  6. Spatial Econometric data analysis: moving beyond traditional models

    NARCIS (Netherlands)

    Florax, R.J.G.M.; Vlist, van der A.J.

    2003-01-01

    This article appraises recent advances in the spatial econometric literature. It serves as the introduction to a collection of new papers on spatial econometric data analysis brought together in this special issue, dealing specifically with new extensions to the spatial econometric modeling perspective

  7. Spatial contiguity analysis. A method for describing spatial structures of seismic data

    Energy Technology Data Exchange (ETDEWEB)

    Faraj, A. [Institut Francais du Petrole, 1-4 Av. de Bois-Preau, 92500 Rueil Malmaison, Paris (France); Cailly, F. [Beicip Franlab, 232 Av. Napoleon Bonaparte, 92500 Rueil Malmaison, Paris (France)

    2001-11-01

    We apply spatial contiguity analysis (SCA) to study spatial structures contained in seismic images. Compared to classical methods such as principal component analysis (PCA), SCA is more efficient for the multivariate description and spatial filtering of this kind of image. We present SCA according to the geostatistical formalism defined by Matheron. A preliminary spatial analysis of the initial variables is required. Performed with the help of variogram curves, it highlights the spatial properties of these variables and defines the contiguity distance and direction for applying SCA. A series of mathematical tools is defined. These quantify the information held by the initial variables and factorial components in terms of variance and spatial variability, and exhibit spatial structures in the data on different scales. The method is applied to analyse a seismic data set. We compare PCA and SCA results. This data set gives us the opportunity to show the interest of a preliminary spatial analysis of the initial variables, and the effects of spatial direction and distance on the decomposition of the data into elementary structures.
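The variogram curves used for the preliminary spatial analysis can be sketched with a classical (Matheron-style) empirical semivariogram; the transect coordinates and sample values below are invented for illustration:

```python
import numpy as np

def empirical_variogram(coords, values, lags, tol):
    """Classical empirical semivariogram:
    gamma(h) = mean of 0.5 * (z_i - z_j)^2 over pairs at distance ~ h."""
    coords = np.asarray(coords, dtype=float)
    values = np.asarray(values, dtype=float)
    # pairwise distance matrix between all sample locations
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    gammas = []
    for h in lags:
        # upper triangle only, so each unordered pair is counted once
        mask = np.triu(np.abs(d - h) <= tol, k=1)
        i, j = np.nonzero(mask)
        gammas.append(0.5 * np.mean((values[i] - values[j]) ** 2) if i.size else np.nan)
    return np.array(gammas)

# 1-D transect of hypothetical "seismic attribute" samples
coords = [[x, 0.0] for x in range(6)]
values = [0.0, 1.0, 0.0, 1.0, 0.0, 1.0]   # alternating pattern
gam = empirical_variogram(coords, values, lags=[1, 2], tol=0.1)
print(gam.tolist())  # lag-1 pairs differ by 1, lag-2 pairs are equal -> [0.5, 0.0]
```

The lag and direction at which the variogram shows structure are exactly what define the contiguity distance and direction that SCA then exploits.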

  8. Spatial and temporal epidemiological analysis in the Big Data era.

    Science.gov (United States)

    Pfeiffer, Dirk U; Stevens, Kim B

    2015-11-01

    Concurrent with global economic development in the last 50 years, the opportunities for the spread of existing diseases and the emergence of new infectious pathogens have increased substantially. The activities associated with the enormously intensified global connectivity have resulted in large amounts of data being generated, which in turn provides opportunities for generating knowledge that will allow more effective management of animal and human health risks. This so-called Big Data has, more recently, been accompanied by the Internet of Things, which highlights the increasing presence of a wide range of sensors interconnected via the Internet. Analysis of this data needs to exploit its complexity, accommodate variation in data quality, and take advantage of its spatial and temporal dimensions, where available. Apart from the development of hardware technologies and networking/communication infrastructure, it is necessary to develop appropriate data management tools that make this data accessible for analysis. These include relational databases, geographical information systems and, most recently, cloud-based data storage such as Hadoop distributed file systems. While the development of analytical methodologies has not quite caught up with the data deluge, important advances have been made in a number of areas, including spatial and temporal data analysis, where the spectrum of analytical methods ranges from visualisation and exploratory analysis to modelling. While there used to be a primary focus on statistical science in terms of methodological development for data analysis, the newly emerged discipline of data science reflects the challenges presented by the need to integrate diverse data sources and exploit them using novel data- and knowledge-driven modelling methods, while simultaneously recognising the value of quantitative as well as qualitative analytical approaches. Machine learning regression methods, which are more robust and can handle

  9. Spatial Analysis Of Human Capital Structures

    Directory of Open Access Journals (Sweden)

    Gajdos Artur

    2014-12-01

    Full Text Available The main purpose of this paper is to analyse the interdependence between labour productivity and the occupational structure of human capital in a spatial cross-section. Research indicates (see Fischer 2009) the possibility of assessing the impact of the quality of human capital (measured by means of the level of education) on labour productivity in a spatial cross-section.

  10. Multiple quantitative trait analysis using bayesian networks.

    Science.gov (United States)

    Scutari, Marco; Howell, Phil; Balding, David J; Mackay, Ian

    2014-09-01

    Models for genome-wide prediction and association studies usually target a single phenotypic trait. However, in animal and plant genetics it is common to record information on multiple phenotypes for each individual that will be genotyped. Modeling traits individually disregards the fact that they are most likely associated due to pleiotropy and shared biological basis, thus providing only a partial, confounded view of genetic effects and phenotypic interactions. In this article we use data from a Multiparent Advanced Generation Inter-Cross (MAGIC) winter wheat population to explore Bayesian networks as a convenient and interpretable framework for the simultaneous modeling of multiple quantitative traits. We show that they are equivalent to multivariate genetic best linear unbiased prediction (GBLUP) and that they are competitive with single-trait elastic net and single-trait GBLUP in predictive performance. Finally, we discuss their relationship with other additive-effects models and their advantages in inference and interpretation. MAGIC populations provide an ideal setting for this kind of investigation because the very low population structure and large sample size result in predictive models with good power and limited confounding due to relatedness.
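Bayesian network learning itself needs a dedicated library, but the multivariate GBLUP baseline the article compares against can be sketched through its well-known ridge-regression equivalence (a standard identity, not code from the paper); the genotype and effect matrices below are simulated:

```python
import numpy as np

def multi_trait_ridge(X, Y, lam=1e-6):
    """Marker-effect estimates for several traits at once via ridge regression,
    which is equivalent in prediction to GBLUP under standard assumptions."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ Y)

rng = np.random.default_rng(0)
X = rng.standard_normal((40, 5))   # 40 lines x 5 markers (toy genotype matrix)
B = rng.standard_normal((5, 2))    # true marker effects for 2 correlated traits
Y = X @ B                          # noise-free phenotypes, for the sketch only
B_hat = multi_trait_ridge(X, Y)
print(np.allclose(B_hat, B, atol=1e-3))  # True: effects recovered per trait
```

Fitting both trait columns against the same genotype matrix is what the single-trait models in the comparison forgo; the Bayesian network adds to this an explicit graph of trait-trait and marker-trait dependencies.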

  11. Spatial positioning : method development for spatial analysis of interaction in buildings

    OpenAIRE

    Markhede, Henrik

    2010-01-01

    In offices, knowledge sharing largely depends on everyday face-to-face interaction patterns. These interaction patterns may depend on how employees move through the office space. This thesis explores how these spatial relations influence individual choices with respect to employee movements or routes. Space syntax related research has shown a strong relationship between spatial configuration and pedestrian movement in cities, yet the field of space syntax has not applied spatial analysis to the o...

  12. Robust geostatistical analysis of spatial data

    Science.gov (United States)

    Papritz, Andreas; Künsch, Hans Rudolf; Schwierz, Cornelia; Stahel, Werner A.

    2013-04-01

    Most geostatistical software tools rely on non-robust algorithms. This is unfortunate, because outlying observations are the rule rather than the exception, in particular in environmental data sets. Outliers affect the modelling of the large-scale spatial trend, the estimation of the spatial dependence of the residual variation and the predictions by kriging. Identifying outliers manually is cumbersome and requires expertise because one needs parameter estimates to decide which observation is a potential outlier. Moreover, inference after the rejection of some observations is problematic. A better approach is to use robust algorithms that automatically prevent outlying observations from having undue influence. Former studies on robust geostatistics focused on robust estimation of the sample variogram and ordinary kriging without external drift. Furthermore, Richardson and Welsh (1995) proposed a robustified version of (restricted) maximum likelihood ([RE]ML) estimation for the variance components of a linear mixed model, which was later used by Marchant and Lark (2007) for robust REML estimation of the variogram. We propose here a novel method for robust REML estimation of the variogram of a Gaussian random field that is possibly contaminated by independent errors from a long-tailed distribution. It is based on robustification of the estimating equations for Gaussian REML estimation (Welsh and Richardson, 1997). Besides robust estimates of the parameters of the external drift and of the variogram, the method also provides standard errors for the estimated parameters, robustified kriging predictions at both sampled and non-sampled locations, and kriging variances. Apart from presenting our modelling framework, we shall present selected simulation results by which we explored the properties of the new method. This will be complemented by an analysis of a data set on heavy metal contamination of the soil in the vicinity of a metal smelter.
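
    To illustrate the general idea of outlier-resistant variogram estimation (not the robust REML method proposed in the abstract), the sketch below computes the Cressie-Hawkins robust empirical variogram on synthetic data with gross outliers; all data and bin choices are invented for the example.

```python
import numpy as np

def robust_variogram(coords, values, bin_edges):
    """Cressie-Hawkins robust empirical variogram: fourth power of the
    mean square-root absolute difference, with a small-sample bias
    correction, binned by lag distance."""
    d = np.sqrt(((coords[:, None, :] - coords[None, :, :]) ** 2).sum(-1))
    iu = np.triu_indices(len(values), k=1)
    lags = d[iu]
    diffs = np.abs(values[iu[0]] - values[iu[1]])
    gamma = np.full(len(bin_edges) - 1, np.nan)
    for b in range(len(bin_edges) - 1):
        m = (lags >= bin_edges[b]) & (lags < bin_edges[b + 1])
        n_pairs = m.sum()
        if n_pairs:
            mean_root = np.mean(np.sqrt(diffs[m]))
            gamma[b] = 0.5 * mean_root ** 4 / (0.457 + 0.494 / n_pairs)
    return gamma

rng = np.random.default_rng(0)
pts = rng.uniform(0, 10, size=(200, 2))
z = np.sin(pts[:, 0]) + 0.1 * rng.standard_normal(200)
z[:5] += 10.0                                # gross outliers
gam = robust_variogram(pts, z, np.linspace(0, 5, 6))
```

    Compared with the classical Matheron estimator, the square-root transform keeps the few contaminated pairs from dominating each bin.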

  13. Quantitative interferometric microscopic flow cytometer with expanded principal component analysis method

    Science.gov (United States)

    Wang, Shouyu; Jin, Ying; Yan, Keding; Xue, Liang; Liu, Fei; Li, Zhenhua

    2014-11-01

    Quantitative interferometric microscopy is used in biological and medical fields, and a wealth of applications have been proposed to detect different kinds of biological samples. Here, we develop a phase-detecting cytometer based on quantitative interferometric microscopy with an expanded principal component analysis phase retrieval method to obtain phase distributions of red blood cells with a spatial resolution of ~1.5 μm. Since the expanded principal component analysis method is a time-domain phase retrieval algorithm, it avoids the disadvantages of traditional frequency-domain algorithms. Additionally, the phase retrieval method realizes high-speed phase imaging from multiple microscopic interferograms captured by a CCD camera while the biological cells are scanned across the field of view. We believe this method can be a powerful tool to quantitatively measure the phase distributions of different biological samples in biological and medical fields.
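
    To make time-domain (PCA-based) phase retrieval concrete, here is a minimal sketch of classical PCA phase retrieval from a stack of randomly phase-shifted interferograms; the "expanded" variant of the abstract and all instrument specifics are beyond this toy example, and the phase map is synthetic.

```python
import numpy as np

# Sketch: given interferograms I_k = A + B*cos(phi + delta_k) with
# unknown shifts delta_k, the two leading principal components of the
# mean-subtracted stack are in quadrature, so phi is recovered (up to
# a sign and offset) via arctan2 of the two leading spatial modes.
rng = np.random.default_rng(1)
x = np.linspace(-1, 1, 64)
X, Y = np.meshgrid(x, x)
phi = 4 * np.pi * (X ** 2 + Y ** 2)           # synthetic phase map
deltas = rng.uniform(0, 2 * np.pi, 12)        # unknown phase shifts
stack = np.stack([100 + 50 * np.cos(phi + d) for d in deltas])

D = stack.reshape(len(deltas), -1)
D = D - D.mean(axis=0)                        # remove the background A
U, s, Vt = np.linalg.svd(D, full_matrices=False)
phase = np.arctan2(Vt[1], Vt[0]).reshape(phi.shape)  # wrapped estimate
```

    With noise-free data the stack has exactly rank two, which is why the first two singular vectors suffice to recover the wrapped phase.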

  14. Recent developments in spatial analysis spatial statistics, behavioural modelling, and computational intelligence

    CERN Document Server

    Getis, Arthur

    1997-01-01

    In recent years, spatial analysis has become an increasingly active field, as evidenced by the establishment of educational and research programs at many universities. Its popularity is due mainly to new technologies and the development of spatial data infrastructures. This book illustrates some recent developments in spatial analysis, behavioural modelling, and computational intelligence. World-renowned spatial analysts explain and demonstrate their new and insightful models and methods. The applications are in areas of societal interest such as the spread of infectious diseases, migration behaviour, and retail and agricultural location strategies. In addition, there is an emphasis on the use of new technologies for the analysis of spatial data through the application of neural network concepts.

  15. Applied quantitative analysis in the social sciences

    CERN Document Server

    Petscher, Yaacov; Compton, Donald L

    2013-01-01

    To say that complex data analyses are ubiquitous in the education and social sciences might be an understatement. Funding agencies and peer-review journals alike require that researchers use the most appropriate models and methods for explaining phenomena. Univariate and multivariate data structures often require the application of more rigorous methods than basic correlational or analysis of variance models. Additionally, though a vast set of resources may exist on how to run an analysis, difficulties may be encountered when explicit direction is not provided as to how one should run a model.

  16. Quantitative analysis of probabilistic BPMN workflows

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2012-01-01

    We present a framework for modelling and analysis of real-world business workflows. We present a formalised core subset of the Business Process Modelling and Notation (BPMN) and then proceed to extend this language with probabilistic nondeterministic branching and general-purpose reward annotations...... of events, reward-based properties and best- and worst-case scenarios. We develop a simple example of a medical workflow and demonstrate the utility of this analysis in the accurate provisioning of drug stocks. Finally, we suggest a path to building upon these techniques to cover the entire BPMN language, allow...

  17. The quantitative failure of human reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, C.T.

    1995-07-01

    This philosophical treatise examines the merits of Human Reliability Analysis (HRA) in the context of the nuclear power industry. In fact, the author attacks historic and current HRA as having failed to inform policy makers who make decisions based on the risk that humans contribute to system performance. He argues for an HRA based on Bayesian (fact-based) inferential statistics, which advocates a systems analysis process that employs cogent heuristics when using opinion, and tempers itself with a rational debate over the weight given to subjective and empirical probabilities.

  18. Quantitative analysis of cascade impactor samples - revisited

    Science.gov (United States)

    Orlić, I.; Chiam, S. Y.; Sanchez, J. L.; Tang, S. M.

    1999-04-01

    Concentrations of aerosols collected in Singapore during the three-month-long haze period that affected the whole South-East Asian region in 1997 are reported. Aerosol samples were continuously collected using a fine aerosol sampler (PM2.5) and occasionally with a single-orifice cascade impactor (CI) sampler. Our results show that in the fine fraction (<2.5 μm) the concentrations of two well-known biomass burning products, i.e. K and S, were generally increased by a factor of 2-3 compared to the non-hazy periods. However, a discrepancy was noticed, at least for elements with lower atomic number (Ti and below), between the results obtained by the fine aerosol sampler and the cascade impactor. Careful analysis by means of nuclear microscopy, in particular by the scanning transmission ion microscopy (STIM) technique, revealed that the thicknesses of the lower CI stages exceeded thick-target limits for 2 MeV protons. Detailed depth profiles of all CI stages were therefore measured using the STIM technique, and concentrations were corrected for absorption and proton energy loss. After correcting the results for the actual sample thickness, the concentrations of all major elements (S, Cl, K, Ca) agreed much better with the PM2.5 results. The importance of implementing thick-target corrections in the analysis of CI samples, especially those collected in urban environments, is emphasized. The broad-beam PIXE analysis approach is certainly not adequate in these cases.

  19. Quantitative analysis of Li by PIGE technique

    Science.gov (United States)

    Fonseca, M.; Mateus, R.; Santos, C.; Cruz, J.; Silva, H.; Luis, H.; Martins, L.; Jesus, A. P.

    2017-09-01

    In this work, the cross section of the reaction 7Li(p,pγ)7Li (γ - 478 keV) was measured in the proton energy range 2.0-4.2 MeV. The measurements were carried out at the 3 MV Tandem Accelerator at the CTN/IST Laboratory in Lisbon. To validate the obtained results, calculated gamma-ray yields were compared, at several proton energy values, with experimental yields for thick samples made of inorganic compounds containing lithium. In order to quantify the light elements present in the samples, we used a standard-free method for PIGE in thick samples, based on a code - Emitted Radiation Yield Analysis (ERYA) - which integrates the nuclear reaction excitation function along the depth of the sample. We also demonstrated the capacity of the technique for the analysis of Li ores, such as spodumene, lithium muscovite and holmquistite, and of Li alloys for plasma-facing materials, showing that this is a reliable and accurate method for PIGE analysis of Li in thick samples.
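
    The standard-free approach hinges on the thick-target yield integral Y(E0) ∝ ∫ σ(E)/S(E) dE, which codes like ERYA evaluate along the sample depth. The sketch below shows the numerical form with toy stand-ins for the cross section and stopping power; neither function represents evaluated nuclear data.

```python
import numpy as np

# Toy excitation function (mb) with a reaction threshold at 0.5 MeV.
def sigma(E_MeV):
    return np.maximum(E_MeV - 0.5, 0.0) ** 1.5

# Toy stopping power, decreasing with energy as real S(E) roughly does.
def stopping_power(E_MeV):
    return 50.0 / np.sqrt(E_MeV + 0.1)

def thick_target_yield(E0, n=2000):
    """Integrate sigma(E)/S(E) from (near) zero up to the beam energy E0;
    the proportionality constants (detector efficiency, solid angle,
    atomic density) are omitted."""
    E = np.linspace(1e-3, E0, n)
    return np.trapz(sigma(E) / stopping_power(E), E)

y_30 = thick_target_yield(3.0)   # yield at 3.0 MeV beam energy
y_42 = thick_target_yield(4.2)   # yield at 4.2 MeV beam energy
```

    The yield grows monotonically with beam energy because each increment of E0 adds a non-negative contribution to the integral.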

  20. Quantitative MRI analysis of dynamic enhancement of focal liver lesions

    Directory of Open Access Journals (Sweden)

    S. S. Bagnenko

    2012-01-01

    Full Text Available In our study 45 patients with different focal liver lesions (110 nodules) were examined using a high-field MR system (1.5 T). During this investigation, quantitative MRI analysis of the dynamic enhancement of various hepatic lesions and parenchymatous organs of the abdomen was performed. It was shown that quantitative evaluation of enhanced MRI improves understanding of vascular transformation processes in pathologic hepatic foci and in the liver itself, which is important for the differential diagnosis of these diseases.

  1. Book review: Statistical Analysis and Modelling of Spatial Point Patterns

    DEFF Research Database (Denmark)

    Møller, Jesper

    2009-01-01

    Statistical Analysis and Modelling of Spatial Point Patterns by J. Illian, A. Penttinen, H. Stoyan and D. Stoyan. Wiley (2008), ISBN 9780470014912.

  2. Rings and sector : intrasite spatial analysis of stone age sites.

    NARCIS (Netherlands)

    Stapert, Durk

    1992-01-01

    This thesis deals with intrasite spatial analysis: the analysis of spatial patterns at the site level. My main concern has been to develop a simple method for analysing Stone Age sites of a special type: those characterised by the presence of a hearth closely associated in space with an artefact scatter.

  3. Quantitative and high spatial resolution d{sub 33} measurement of piezoelectric bulk and thin films

    Energy Technology Data Exchange (ETDEWEB)

    Shetty, Smitha, E-mail: sus44@psu.edu; Yang, Jung In; Trolier-McKinstry, Susan [Department of Materials Science and Engineering, The Pennsylvania State University, University Park, Pennsylvania 16802 (United States); Stitt, Joe [Materials Research Institute, The Pennsylvania State University, University Park, Pennsylvania 16802 (United States)

    2015-11-07

    A single-beam laser interferometer based on a modified Mirau detection scheme with a vertical resolution of ∼5 pm was developed for localized d{sub 33} measurements on patterned piezoelectric films. The tool provides high spatial resolution (∼2 μm), essential for understanding scaling and processing effects in piezoelectric materials. This approach enables quantitative information on d{sub 33}, currently difficult to obtain with local measurement techniques such as piezoresponse force microscopy. The interferometer is built in a custom microscope and employs a phase lock-in technique in order to detect sub-Angstrom displacements. d{sub 33} measurements on single crystal 0.67PbMg{sub 0.33}Nb{sub 0.67}O{sub 3}-0.33PbTiO{sub 3} and bulk PbZrTiO{sub 3}-5A ceramics demonstrated agreement within <3% with measurements using a double-beam laser interferometer. Substrate bending contributions to out-of-plane strain, observed in thin continuous PbZr{sub 0.52}Ti{sub 0.48}O{sub 3} films grown on Si substrates, are reduced for electrode diameters smaller than 100 μm. Direct scanning across room-temperature and 150 °C poled 5 μm and 10 μm features etched in 0.5 μm thick PbZr{sub 0.52}Ti{sub 0.48}O{sub 3} films doped with 1% Nb confirmed minimal substrate contributions to the effective d{sub 33,f}. Furthermore, enhanced d{sub 33,f} values were observed along the feature edges due to partial declamping from the substrate, thus validating the application of single-beam interferometry on finely patterned electrodes.

  4. Financial indicators for municipalities: a quantitative analysis

    Directory of Open Access Journals (Sweden)

    Srečko Devjak

    2009-12-01

    From the characterization of local authority financing models and structures in Portugal and Slovenia, a set of financial and generic budget indicators has been established. These indicators may be used in a comparative analysis considering the Bragança District in Portugal and municipalities of similar population size in Slovenia. The research identified significant differences in terms of financing sources, due to discrepancies between the financing models and competences of municipalities in each country. The results show that, for the economy indicator, Portuguese and Slovenian municipalities had similar ranking behaviour in 2003, but in 2004 this behaviour changed.

  5. QUANTITATIVE ANALYSIS OF DRAWING TUBES MICROSTRUCTURE

    Directory of Open Access Journals (Sweden)

    Maroš Martinkovič

    2009-05-01

    The final properties of formed pieces are affected by production conditions, primarily those of mechanical working. Applying stereological methods to the statistical reconstruction of the three-dimensional structure of material plastically deformed by bulk forming enables a detailed analysis of changes in the material structure. The microstructure of cold-drawn tubes from STN 411353 steel was analyzed. Grain boundary orientation was measured on perpendicular and parallel sections of tubes with different degrees of deformation. Macroscopic deformation leads to grain boundary deformation, and the two were compared.

  6. Event History Analysis in Quantitative Genetics

    DEFF Research Database (Denmark)

    Maia, Rafael Pimentel

    Event history analysis is a class of statistical methods specially designed to analyze time-to-event characteristics, e.g. the time until death. The aim of the thesis was to present adequate multivariate versions of mixed survival models that properly represent the genetic aspects related to a given...... time-to-event characteristic of interest. Real genetic longevity studies based on female animals of different species (sows, dairy cows, and sheep) exemplify the use of the methods. Moreover, these studies allow some genetic mechanisms related to the length of the productive life to be understood...

  7. Chromatic Image Analysis For Quantitative Thermal Mapping

    Science.gov (United States)

    Buck, Gregory M.

    1995-01-01

    Chromatic image analysis system (CIAS) developed for use in noncontact measurements of temperatures on aerothermodynamic models in hypersonic wind tunnels. Based on concept of temperature coupled to shift in color spectrum for optical measurement. Video camera images fluorescence emitted by phosphor-coated model at two wavelengths. Temperature map of model then computed from relative brightnesses in video images of model at those wavelengths. Eliminates need for intrusive, time-consuming, contact temperature measurements by gauges, making it possible to map temperatures on complex surfaces in timely manner and at reduced cost.

  8. Segmentation and Quantitative Analysis of Epithelial Tissues.

    Science.gov (United States)

    Aigouy, Benoit; Umetsu, Daiki; Eaton, Suzanne

    2016-01-01

    Epithelia are tissues that regulate exchanges with the environment. They are very dynamic and can acquire virtually any shape; at the cellular level, they are composed of cells tightly connected by junctions. Most often epithelia are amenable to live imaging; however, the large number of cells composing an epithelium and the absence of informatics tools dedicated to epithelial analysis have largely prevented tissue-scale studies. Here we present Tissue Analyzer, a free tool that can be used to segment and analyze epithelial cells and monitor tissue dynamics.

  9. A Comparative Assessment of Greek Universities' Efficiency Using Quantitative Analysis

    Science.gov (United States)

    Katharaki, Maria; Katharakis, George

    2010-01-01

    In part due to the increased demand for higher education, typical evaluation frameworks for universities often address the key issue of available resource utilisation. This study seeks to estimate the efficiency of 20 public universities in Greece through quantitative analysis (including performance indicators, data envelopment analysis (DEA) and…

  11. STUDY ON SPATIAL GROWTH PROCESS OF CIRCLE-CORE MODE AT URBAN CENTRAL AREA: QUANTITATIVE ANALYSIS ON THIRTY YEARS' SUCCESSION OF CENTRAL AREA OF NANJING

    Institute of Scientific and Technical Information of China (English)

    杨俊宴; 史北祥

    2012-01-01

    Taking the spatial growth of Nanjing's Xinjiekou central district since 1978 as the case, and based on vector data accumulated over many years, this paper quantitatively delimits the spatial scope of the central area at each period with the Murphy Index. Spatial matrix analysis, land-use sector analysis and circle composition analysis are then applied to analyze in detail the general growth characteristics, fractal evolution and circle succession of the central area, and the land-use composition, functional structure and spatial morphology of each period are further studied. From this, the circle-core pattern of spatial structure of the city centre is summarized. On this basis, the multi-core structure of a mega-city's central area is divided into the main core, shadow circle, sub-core circle, auxiliary circle and affiliated transportation system. Finally, the article reveals the characteristics of the spatial growth process of the circle-core pattern in the central areas of mega-cities.
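
    For context, the Murphy Index delimits a central business district (CBD) block by block from two ratios: the central business height index (CBHI, central-business floor area over ground area) and the central business intensity index (CBII, central-business share of total floor area), with a block conventionally counted as CBD when CBHI ≥ 1 and CBII ≥ 50%. A worked sketch with hypothetical figures:

```python
# Worked sketch of the Murphy index test used to delimit a central
# business district block by block. All figures are hypothetical.
def murphy_indices(cb_floor_area, total_floor_area, ground_area):
    cbhi = cb_floor_area / ground_area                # height index
    cbii = 100.0 * cb_floor_area / total_floor_area   # intensity index, %
    return cbhi, cbii

def is_cbd_block(cb_floor_area, total_floor_area, ground_area):
    cbhi, cbii = murphy_indices(cb_floor_area, total_floor_area, ground_area)
    return cbhi >= 1.0 and cbii >= 50.0               # Murphy's thresholds

inside = is_cbd_block(cb_floor_area=24_000, total_floor_area=30_000,
                      ground_area=5_000)              # CBHI 4.8, CBII 80%
outside = is_cbd_block(cb_floor_area=2_000, total_floor_area=30_000,
                       ground_area=5_000)             # CBHI 0.4, CBII ~6.7%
```

    Applying this test to every block in each period's vector data yields the period-by-period CBD boundaries from which the growth analysis proceeds.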

  12. Structural model analysis of multiple quantitative traits.

    Directory of Open Access Journals (Sweden)

    Renhua Li

    2006-07-01

    We introduce a method for the analysis of multilocus, multitrait genetic data that provides an intuitive and precise characterization of genetic architecture. We show that it is possible to infer the magnitude and direction of causal relationships among multiple correlated phenotypes and illustrate the technique using body composition and bone density data from mouse intercross populations. Using these techniques we are able to distinguish genetic loci that affect adiposity from those that affect overall body size and thus reveal a shortcoming of standardized measures such as body mass index that are widely used in obesity research. The identification of causal networks sheds light on the nature of genetic heterogeneity and pleiotropy in complex genetic systems.

  13. Quantitative Analysis of Seismicity in Iran

    Science.gov (United States)

    Raeesi, Mohammad; Zarifi, Zoya; Nilfouroushan, Faramarz; Boroujeni, Samar Amini; Tiampo, Kristy

    2017-03-01

    We use historical and recent major earthquakes and GPS geodetic data to compute seismic strain rate, geodetic slip deficit, static stress drop, the parameters of the magnitude-frequency distribution and geodetic strain rate in the Iranian Plateau to identify seismically mature fault segments and regions. Our analysis suggests that 11 fault segments are in the mature stage of the earthquake cycle, with the possibility of generating major earthquakes. These faults primarily are located in the north and the east of Iran. Four seismically mature regions in southern Iran with the potential for damaging strong earthquakes are also identified. We also delineate four additional fault segments in Iran that can generate major earthquakes without robust clues to their maturity. The most important fault segment in this study is the strike-slip system near the capital city of Tehran, with the potential to cause more than one million fatalities.
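
    One of the magnitude-frequency parameters mentioned above, the Gutenberg-Richter b-value, is commonly estimated with Aki's maximum-likelihood formula; the sketch below applies it to a synthetic catalogue, not to Iranian data.

```python
import numpy as np

# Aki (1965) maximum-likelihood b-value: b = log10(e) / (mean(M) - Mc),
# where Mc is the completeness magnitude of the catalogue. We draw
# synthetic magnitudes from the exponential law implied by b_true = 1.
rng = np.random.default_rng(7)
Mc = 4.0                                  # completeness magnitude
b_true = 1.0
beta = b_true * np.log(10.0)              # exponential rate parameter
mags = Mc + rng.exponential(1.0 / beta, size=10_000)

b_hat = np.log10(np.e) / (mags.mean() - Mc)
```

    With a 10,000-event catalogue the estimate should fall within a few percent of the true value of 1.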

  15. Quantitative assessment of p-glycoprotein expression and function using confocal image analysis.

    Science.gov (United States)

    Hamrang, Zahra; Arthanari, Yamini; Clarke, David; Pluen, Alain

    2014-10-01

    P-glycoprotein is implicated in clinical drug resistance; thus, rapid quantitative analysis of its expression and activity is of paramount importance to the design and success of novel therapeutics. The scope for the application of quantitative imaging and image analysis tools in this field is reported here at "proof of concept" level. P-glycoprotein expression was utilized as a model for quantitative immunofluorescence and subsequent spatial intensity distribution analysis (SpIDA). Following expression studies, p-glycoprotein inhibition as a function of verapamil concentration was assessed in two cell lines using live cell imaging of intracellular Calcein retention and a routine monolayer fluorescence assay. Intercellular and sub-cellular distributions in the expression of the p-glycoprotein transporter between parent and MDR1-transfected Madin-Darby Canine Kidney cell lines were examined. We have demonstrated that quantitative imaging can provide dose-response parameters while permitting direct microscopic analysis of intracellular fluorophore distributions in live and fixed samples. Analysis with SpIDA offers the ability to detect heterogeneity in the distribution of labeled species, and in conjunction with live cell imaging and immunofluorescence staining may be applied to the determination of pharmacological parameters or analysis of biopsies, providing a rapid prognostic tool.
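
    As an illustration of how mean intensities from such imaging can yield a dose-response parameter, here is a sketch fitting a Hill curve to hypothetical verapamil data by grid search; the concentrations, signal values, and resulting IC50 are invented for the example and do not come from the study.

```python
import numpy as np

# Hypothetical normalised Calcein retention vs. verapamil concentration.
conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])        # uM, hypothetical
signal = np.array([0.08, 0.18, 0.42, 0.70, 0.88, 0.96])  # hypothetical

def hill(c, ic50, n):
    """Hill equation: fractional response at concentration c."""
    return c ** n / (ic50 ** n + c ** n)

# Simple grid search over (IC50, Hill slope) minimising squared error.
grid_ic50 = np.linspace(0.1, 10, 500)
grid_n = np.linspace(0.5, 3, 100)
sse = [(i, n, ((signal - hill(conc, i, n)) ** 2).sum())
       for i in grid_ic50 for n in grid_n]
ic50, n_hill, _ = min(sse, key=lambda t: t[2])
```

    A least-squares routine would normally replace the grid search; the grid keeps the sketch dependency-free.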

  16. A prototype auto-human support system for spatial analysis

    Institute of Scientific and Technical Information of China (English)

    LI Lianfa; WANG Jinfeng

    2006-01-01

    Spatial analysis is a multidisciplinary field that involves multiple influential factors, variation and uncertainty, and modeling of geospatial data is a complex procedure affected by spatial context, mechanism and assumptions. To make spatial modeling easier, scholars have codified a great deal of knowledge, from exploratory data analysis (EDA) and model specification, through model fitting and diagnosis, to model interpretation. Various software packages have also improved particular functionalities of spatial analysis, e.g. EDA through dynamic linking (GeoDa) and robust statistical calculation (R). However, there are few programs for spatial analysis that can automatically deal with unstructured declarative issues and uncertainty in machine modeling using domain knowledge. In this context, this paper presents a prototype support system for spatial analysis that can automatically apply experience and knowledge from experts to deal with complexity and uncertainty in modeling. The knowledge base component, the major contribution of the system, supported by the expert system shell, codes and stores declarative modeling knowledge, e.g. spatial context, mechanisms and prior knowledge, to deal with declarative issues during the modeling procedure. With its open architecture, the system integrates the functionalities of other components, e.g. GIS visualization, DBMS, and robust calculation, in an interactive environment. An application case of spatial sampling, and the design and implementation of spatial modeling with such a system, is demonstrated.

  17. Purification of Spatial Tests: An IRT Analysis of Spatial and Reasoning Components in "Spatial" Tests.

    Science.gov (United States)

    Zimowski, Michele F.; Wothke, Werner

    Tests of spatial ability were analyzed for their analog (visuospatial) and nonanalog (verbal reasoning) components, using factor analyses of items and test scores. The self-selected sample consisted of over 2000 clients (average age about 26 or 27) employing the Johnson O'Connor Research Foundation's aptitude evaluation services in 12 metropolitan…

  18. Quantitative analysis of forest fire extinction efficiency

    Directory of Open Access Journals (Sweden)

    Miguel E. Castillo-Soto

    2015-08-01

    Aim of study: To evaluate the economic extinction efficiency of forest fires, based on the study of fire combat undertaken by aerial and terrestrial means. Area of study, materials and methods: Approximately 112,000 hectares in Chile. Records of 5,876 forest fires that occurred between 1998 and 2009 were analyzed. The area further provides a validation sector for the results, by incorporating databases for the years 2010 and 2012. The criteria used for measuring extinction efficiency were the economic value of forestry resources, Contraction Factor analysis and the definition of the extinction cost function. Main results: It is possible to establish a relationship between burnt area, extinction costs and economic losses. The method proposed may be used and adapted to other fire situations, requiring unit costs for aerial and terrestrial operations, the economic value of the property to be protected and speed attributes of fire spread in free advance. Research highlights: The determination of extinction efficiency in forest fire containment work and the potential projection of losses, for different types of plant fuel and local conditions favoring the spread of fire, broaden the admissible ranges of a, φ and Ce considerably.

  19. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    Science.gov (United States)

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media and plate-count techniques on the estimation of microbial counts, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpreting errors) was established. The most appropriate method of statistical analysis for such data was ANOVA, which enabled not only the effects of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, appropriate for traditional plate-count methods.
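
    The mathematical combination of uncertainty components the abstract refers to is, in its simplest form, addition in quadrature of independent relative uncertainties; the component values below are hypothetical, chosen only to illustrate a result landing under the 35% figure.

```python
import math

# Combine independent relative uncertainty components in quadrature:
# u_combined = sqrt(u1^2 + u2^2 + ...). Component values are invented.
components = {
    "microorganism type": 0.20,
    "product matrix": 0.15,
    "reading/interpretation": 0.10,
}
combined = math.sqrt(sum(u ** 2 for u in components.values()))  # ~0.269
```

    Correlated components would instead require covariance terms, which the quadrature shortcut omits.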

  20. Analysis of a spatially deconvolved solar pore

    CERN Document Server

    Quintero Noda, C.; Ruiz Cobo, B.; Suematsu, Y.; Katsukawa, Y.; Ichimoto, K.

    2016-01-01

    Solar pores are active regions with large magnetic field strengths and apparently simple magnetic configurations. Their properties resemble those found for the sunspot umbra, although pores do not show a penumbra. Therefore, solar pores present themselves as an intriguing phenomenon that is not completely understood. We examine in this work a solar pore observed with Hinode/SP using two state-of-the-art techniques. The first is the spatial deconvolution of the spectropolarimetric data, which removes the stray-light contamination induced by the spatial point spread function of the telescope. The second is the inversion of the Stokes profiles assuming local thermodynamic equilibrium, which lets us infer the atmospheric physical parameters. After applying these techniques, we found that the spatial deconvolution method does not introduce artefacts, even at the edges of the magnetic structure, where large horizontal gradients are detected in the atmospheric parameters. Moreover, we also describe the p...

  1. Generalised recurrence plot analysis for spatial data

    Energy Technology Data Exchange (ETDEWEB)

    Marwan, Norbert [Institute of Physics, University of Potsdam, 14415 Potsdam (Germany)]. E-mail: marwan@agnld.uni-potsdam.de; Kurths, Juergen [Institute of Physics, University of Potsdam, 14415 Potsdam (Germany); Saparin, Peter [Department of Biomaterials, Max Planck Institute of Colloids and Interfaces, 14424 Potsdam-Golm (Germany)

    2007-01-08

    Recurrence plot based methods are highly efficient and widely accepted tools for the investigation of time series or one-dimensional data. We present an extension of recurrence plots and their quantifications in order to study recurrent structures in higher-dimensional spatial data. The capability of this extension is illustrated on prototypical 2D models. Next, the tested and proven approach is applied to assess the bone structure from CT images of human proximal tibia. We find that the spatial structures in trabecular bone become more recurrent during bone loss in osteoporosis.
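
    The core construction, extended here from time series to spatial data, is a thresholded pairwise distance matrix. The sketch below builds such a recurrence matrix for a toy 2D field (a random stand-in for the CT bone images) using a fixed-recurrence-rate threshold.

```python
import numpy as np

# Build a recurrence matrix for a 2D field: compare every site's value
# with every other and mark pairs closer than eps as recurrent. The
# threshold eps is chosen to fix the recurrence rate near 10%.
rng = np.random.default_rng(3)
img = rng.standard_normal((16, 16))            # toy spatial field
v = img.reshape(-1, 1)                         # one value per site
dist = np.abs(v - v.T)                         # pairwise distances
eps = np.quantile(dist, 0.1)                   # fixed-rate threshold
R = (dist <= eps).astype(int)                  # recurrence matrix
recurrence_rate = R.mean()
```

    Quantifications such as determinism or laminarity are then computed from diagonal and vertical structures in R, exactly as in the one-dimensional case.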

  2. Spatial analysis of oil reservoirs using DFA of geophysical data

    Directory of Open Access Journals (Sweden)

    R. A. Ribeiro

    2014-04-01

    We employ the detrended fluctuation analysis (DFA) technique to investigate spatial properties of an oil reservoir. This reservoir is situated in Bacia de Namorados, RJ, Brazil. The data correspond to well logs of the following geophysical quantities: sonic, gamma ray, density, porosity and electrical resistivity, measured in 56 wells. We tested the hypothesis of constructing spatial models using data from fluctuation analysis over well logs. To verify this hypothesis we compare the matrix of distances among well logs with the differences among DFA exponents of the geophysical quantities, using the spatial correlation function and the Mantel test. Our data analysis suggests that the sonic profile is a good candidate to represent spatial structures. We then apply the clustering analysis technique to the sonic profile to identify these spatial patterns. In addition, we use the Mantel test to search for correlations among the DFA exponents of the geophysical quantities.
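
    A minimal sketch of the DFA exponent computation applied to a well log may help: integrate the mean-subtracted signal, detrend it piecewise, and fit the log-log scaling of the fluctuation function. White noise stands in for a real log here and should give an exponent near 0.5.

```python
import numpy as np

def dfa_exponent(x, scales):
    """First-order DFA: fluctuation function F(n) over window sizes n,
    and the scaling exponent alpha from a log-log fit."""
    y = np.cumsum(x - np.mean(x))              # integrated profile
    F = []
    for n in scales:
        n_seg = len(y) // n
        segs = y[: n_seg * n].reshape(n_seg, n)
        t = np.arange(n)
        res = []
        for seg in segs:
            c = np.polyfit(t, seg, 1)          # local linear detrend
            res.append(np.mean((seg - np.polyval(c, t)) ** 2))
        F.append(np.sqrt(np.mean(res)))
    slope, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return slope

rng = np.random.default_rng(42)
alpha = dfa_exponent(rng.standard_normal(4096),
                     scales=[8, 16, 32, 64, 128, 256])
```

    Comparing such exponents across wells, as the abstract does with the Mantel test, asks whether wells with similar scaling behaviour are also spatially close.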

  3. Quantitative methods for the analysis of electron microscope images

    DEFF Research Database (Denmark)

    Skands, Peter Ulrik Vallø

    1996-01-01

    The topic of this thesis is a general introduction to quantitative methods for the analysis of digital microscope images. The images presented have primarily been acquired from Scanning Electron Microscopes (SEM) and interferometer microscopes (IFM). The topic is approached through several examples...... foundation of the thesis falls in the areas of: 1) mathematical morphology; 2) distance transforms and applications; and 3) fractal geometry. Image analysis opens in general the possibility of quantitative and statistically well-founded measurement of digital microscope images. Herein lie also the conditions...

  4. A pastoral landscape for millennia: Investigating pastoral mobility in northeastern Jordan using quantitative spatial analyses

    Science.gov (United States)

    Meister, Julia; Knitter, Daniel; Krause, Jan; Müller-Neuhof, Bernd; Schütt, Brigitta

    2017-04-01

    Northeastern Jordan is one of the few remaining regions in the Middle East where pastoral nomadism is still practiced. In this desert region, pastoral mobility is an adapted form of land use, able to cope with low rainfall rates, great seasonal and annual rainfall variations, and thus heterogeneous vegetation and water availability. During winter, herders and their livestock move into the desert; in summer they move to the desert margins, to places with perennial water supply. The system of mobile pastoralism was introduced during the Early Late Neolithic. Within the basaltic region of northeastern Jordan, there is a dense distribution of archaeological remains; some of them can be linked to pastoral groups due to the herders' ancient practice of building agglomerations of sub-circular enclosures ('clustered enclosures') made of basalt boulders for corralling their flocks and for domestic activities. The resulting features provide an excellent opportunity to investigate a pastoral landscape that has been frequently used by herders during the last eight to nine millennia. In this study, 9118 clustered enclosures in the northeastern Jordanian basalt desert have been systematically recorded using satellite imagery. In order to investigate potential migration or communication routes, grazing lands and social interactions of former pastoralists, we examine their first- and second-order characteristics using distance- and density-based approaches to point pattern analysis, integrating geomorphometric and geomorphological site properties. The results of this spatial analysis are combined with available archaeological data and a review of traditional herding practices in northeastern Jordan. Overall, the results demonstrate that the observed spatial distribution of clustered enclosures is influenced locally by natural characteristics but regionally by cultural practices.
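    The distance-based side of such point pattern analyses can be illustrated with the classic Clark-Evans nearest-neighbour index (a generic sketch, not necessarily the study's exact method): compare the mean nearest-neighbour distance of recorded sites against its expectation under complete spatial randomness.

```python
import math

def clark_evans_index(points, area):
    """Ratio of mean nearest-neighbour distance to its expectation
    0.5 / sqrt(n / area) under complete spatial randomness.
    R < 1 suggests clustering, R > 1 regular spacing (edge effects ignored)."""
    n = len(points)
    total = 0.0
    for i, (xi, yi) in enumerate(points):
        total += min(math.hypot(xi - xj, yi - yj)
                     for j, (xj, yj) in enumerate(points) if j != i)
    return (total / n) / (0.5 / math.sqrt(n / area))

# A regular 4 x 4 grid of sites in a 16-unit area: R = 2 (perfectly dispersed)
grid = [(float(x), float(y)) for x in range(4) for y in range(4)]
r_index = clark_evans_index(grid, area=16.0)
```

    Second-order characteristics (e.g. Ripley's K) extend this by counting neighbours within every distance r, not just the nearest one.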

  5. Selective presynaptic terminal remodeling induced by spatial, but not cued, learning: a quantitative confocal study

    Science.gov (United States)

    McGonigal, R.; Tabatadze, N.; Routtenberg, A.

    2011-01-01

    The hippocampal mossy fibers (MFs) are capable of behaviorally-selective, use-dependent structural remodeling. Indeed, we previously observed a new layer of Timm’s staining induced in the stratum oriens (SO) in CA3 after spatial but not cued water maze learning (Rekart et al., Learn. Mem. 2007; 14:416–421). This led to the prediction that there is a learning-specific induction of presynaptic terminal plasticity of MF axons. The present study confirms this prediction, demonstrating, at the confocal level of analysis, terminal-specific and behavior-selective presynaptic structural plasticity linked to long-term memory. Male adult Wistar rats were trained for 5d to locate a hidden or visible platform in a water maze and a retention test was performed 7d later. MF terminal subtypes, specifically identified by an antibody to zinc transporter 3 (ZnT3), were counted from confocal z-stacks in the stratum lucidum (SL) and the SO. In hidden platform trained rats there was a significant increase in the number of large MF terminals (LMTs, 2.5–10 µm diameter, >2 µm² area) compared to controls both in the proximal SL (p CA3 pyramidal neurons, while SMTs are known to target inhibitory interneurons. The present findings highlight the pivotal role in memory of presynaptic structural plasticity. Because the ‘sprouting’ observed is specific to the LMT, with no detectable change in the number of the SMT, learning may enhance net excitatory input to CA3 pyramidal neurons. Given the sparse coding of the MF-CA3 connection, and the role that granule cells play in pattern separation, the remodeling observed here may be expected to have a major impact on the long-term integration of spatial context into memory. PMID:22180136

  6. Selective presynaptic terminal remodeling induced by spatial, but not cued, learning: a quantitative confocal study.

    Science.gov (United States)

    McGonigal, R; Tabatadze, N; Routtenberg, A

    2012-06-01

    The hippocampal mossy fibers (MFs) are capable of behaviorally selective, use-dependent structural remodeling. Indeed, we previously observed a new layer of Timm's staining induced in the stratum oriens (SO) in CA3 after spatial but not cued water maze learning (Rekart et al., (2007) Learn Mem 14:416-421). This led to the prediction that there is a learning-specific induction of presynaptic terminal plasticity of MF axons. This study confirms this prediction, demonstrating, at the confocal level of analysis, terminal-specific and behavior-selective presynaptic structural plasticity linked to long-term memory. Male adult Wistar rats were trained for 5 days to locate a hidden or visible platform in a water maze and a retention test was performed 7 days later. MF terminal subtypes, specifically identified by an antibody to zinc transporter 3 (ZnT3), were counted from confocal z-stacks in the stratum lucidum (SL) and the SO. In hidden platform trained rats, there was a significant increase in the number of large MF terminals (LMTs, 2.5-10 μm diameter, >2 μm² area) compared to controls both in the proximal SL (P CA3 pyramidal neurons, while SMTs are known to target inhibitory interneurons. The present findings highlight the pivotal role in memory of presynaptic structural plasticity. Because the "sprouting" observed is specific to the LMT, with no detectable change in the number of the SMT, learning may enhance net excitatory input to CA3 pyramidal neurons. Given the sparse coding of the MF-CA3 connection, and the role that granule cells play in pattern separation, the remodeling observed here may be expected to have a major impact on the long-term integration of spatial context into memory.

  7. Exploiting spatial descriptions in visual scene analysis.

    Science.gov (United States)

    Ziegler, Leon; Johannsen, Katrin; Swadzba, Agnes; De Ruiter, Jan P; Wachsmuth, Sven

    2012-08-01

    The reliable automatic visual recognition of indoor scenes with complex object constellations using only sensor data is a nontrivial problem. In order to improve the construction of an accurate semantic 3D model of an indoor scene, we exploit human-produced verbal descriptions of the relative location of pairs of objects. This requires the ability to deal with different spatial reference frames (RF) that humans use interchangeably. In German, both the intrinsic and relative RF are used frequently, which often leads to ambiguities in referential communication. We assume that there are certain regularities that help in specific contexts. In a first experiment, we investigated how speakers of German describe spatial relationships between different pieces of furniture. This gave us important information about the distribution of the RFs used for furniture-predicate combinations, and by implication also about the preferred spatial predicate. The results of this experiment are compiled into a computational model that extracts partial orderings of spatial arrangements between furniture items from verbal descriptions. In the implemented system, the visual scene is initially scanned by a 3D camera system. From the 3D point cloud, we extract point clusters that suggest the presence of certain furniture objects. We then integrate the partial orderings extracted from the verbal utterances incrementally and cumulatively with the estimated probabilities about the identity and location of objects in the scene, and also estimate the probable orientation of the objects. This allows the system to significantly improve both the accuracy and richness of its visual scene representation.

  8. Sequential spatial processes for image analysis

    NARCIS (Netherlands)

    M.N.M. van Lieshout (Marie-Colette)

    2009-01-01

    We give a brief introduction to sequential spatial processes. We discuss their definition, formulate a Markov property, and indicate why such processes are natural tools in tackling high-level vision problems. We focus on the problem of tracking a variable number of moving objects through ...

  9. Second order analysis for spatial Hawkes processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Torrisi, Giovanni Luca

    We derive summary statistics for stationary Hawkes processes which can be considered as spatial versions of classical Hawkes processes. Particularly, we derive the intensity, the pair correlation function and the Bartlett spectrum. Our results for Gaussian fertility rates and the extension...... to marked Hawkes processes are discussed....

  10. Analysis of a spatially deconvolved solar pore

    Science.gov (United States)

    Quintero Noda, C.; Shimizu, T.; Ruiz Cobo, B.; Suematsu, Y.; Katsukawa, Y.; Ichimoto, K.

    2016-08-01

    Solar pores are active regions with large magnetic field strengths and apparently simple magnetic configurations. Their properties resemble those found for the sunspot umbra, although pores do not show a penumbra. Therefore, solar pores present themselves as an intriguing phenomenon that is not completely understood. In this work we examine a solar pore observed with Hinode/SP using two state-of-the-art techniques. The first is the spatial deconvolution of the spectropolarimetric data, which removes the stray-light contamination induced by the spatial point spread function of the telescope. The second is the inversion of the Stokes profiles assuming local thermodynamic equilibrium, which lets us infer the atmospheric physical parameters. After applying these techniques, we found that the spatial deconvolution method does not introduce artefacts, even at the edges of the magnetic structure, where large horizontal gradients are detected in the atmospheric parameters. Moreover, we also describe the physical properties of the magnetic structure at different heights, finding that, in the inner part of the solar pore, the temperature is lower than outside, the magnetic field strength is larger than 2 kG and unipolar, and the line-of-sight velocity is almost zero. At neighbouring pixels, we found low magnetic field strengths of the same polarity and strong downward motions that only occur in the low photosphere, below the continuum optical depth log τ = -1. Finally, we studied the spatial relation between different atmospheric parameters at different heights, corroborating the physical properties described above.

  11. Accuracy of Image Analysis in Quantitative Study of Cement Paste

    Directory of Open Access Journals (Sweden)

    Feng Shu-Xia

    2016-01-01

    Full Text Available Quantitative study of cement paste, especially blended cement paste, has been a hot and difficult issue over the years, and the technique of backscattered electron image analysis has shown unique advantages in this field. This paper compared the test results for cement hydration degree, Ca(OH)2 content and pore size distribution in pure pastes obtained by image analysis and by other methods. The accuracy of quantitative study by image analysis was then analyzed. The results showed that the image analysis technique displayed higher accuracy in quantifying cement hydration degree and Ca(OH)2 content than the non-evaporable water test and thermal analysis, respectively.

  12. Geostatistics and Analysis of Spatial Data

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg

    2007-01-01

    This note deals with geostatistical measures for spatial correlation, namely the auto-covariance function and the semi-variogram, as well as deterministic and geostatistical methods for spatial interpolation, namely inverse distance weighting and kriging. Some semi-variogram models are mentioned......, specifically the spherical, the exponential and the Gaussian models. Equations to carry out simple and ordinary kriging are deduced. Other types of kriging are mentioned, and references to international literature, Internet addresses and state-of-the-art software in the field are given. A very simple example...... to illustrate the computations and a more realistic example with height data from an area near Slagelse, Denmark, are given. Finally, a series of attractive characteristics of kriging are mentioned, and a simple sampling strategic consideration is given based on the dependence of the kriging variance...
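    The empirical semi-variogram that underlies the kriging methods described in this note can be sketched as follows (illustrative pure Python with toy inputs, not the note's examples): for each distance bin h, average half the squared value differences over all sample pairs whose separation falls in that bin.

```python
import math

def empirical_semivariogram(points, values, bins):
    """gamma(h) = sum over pairs in distance bin h of (z_i - z_j)^2 / (2 N(h)).
    `bins` is a list of (lo, hi) separation intervals; returns one gamma per
    bin (None where a bin contains no pairs)."""
    sums = [0.0] * len(bins)
    counts = [0] * len(bins)
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            d = math.dist(points[i], points[j])
            for k, (lo, hi) in enumerate(bins):
                if lo <= d < hi:
                    sums[k] += (values[i] - values[j]) ** 2
                    counts[k] += 1
                    break
    return [s / (2 * c) if c else None for s, c in zip(sums, counts)]

# Three samples on a line with a linear trend in the data values:
# gamma grows with separation, as expected for spatially correlated data
gamma = empirical_semivariogram(
    [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)], [0.0, 1.0, 2.0],
    bins=[(0.5, 1.5), (1.5, 2.5)])
```

    A model (spherical, exponential or Gaussian, as listed above) is then fitted to these empirical values, and the fitted model supplies the weights used in simple or ordinary kriging.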

  13. Spatial distribution analysis on climatic variables in northeast China

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Information ecology is a new research area of modern ecology. This paper describes spatial distribution analysis methods for four kinds of climatic variables, i.e. temperature, precipitation, relative humidity and sunshine fraction, in Northeast China. First, digital terrain models were built from large-scale maps and vector data. Then trend surface analysis and interpolation methods were used to analyze the spatial distribution of these four kinds of climatic variables at three temporal scales: (1) monthly data; (2) mean monthly data over thirty years; and (3) mean annual data over thirty years. An ecological information system was used for graphical analysis of the spatial distribution of these climate variables.

  14. Generalised Recurrence Plot Analysis for Spatial Data

    OpenAIRE

    Marwan, Norbert; Kurths, Juergen; Saparin, Peter

    2006-01-01

    Recurrence plot based methods are highly efficient and widely accepted tools for the investigation of time series or one-dimensional data. We present an extension of recurrence plots and their quantifications in order to study recurrent structures in higher-dimensional spatial data. The capability of this extension is illustrated on prototypical 2D models. Next, the tested and proven approach is applied to assess the bone structure from CT images of human proximal tibia. We find that the ...

  15. Analysis of Spatial Disparities and Driving Factors of Energy Consumption Change in China Based on Spatial Statistics

    OpenAIRE

    Hualin Xie; Guiying Liu; Qu Liu; Peng Wang

    2014-01-01

    The changes of spatial pattern in energy consumption have an impact on global climate change. Based on the spatial autocorrelation analysis and the auto-regression model of spatial statistics, this study has explored the spatial disparities and driving forces in energy consumption changes in China. The results show that the global spatial autocorrelation of energy consumption change in China is significant during the period 1990–2010, and the trend of spatial clustering of energy consumption ...

  16. Quantitative analysis of microtubule transport in growing nerve processes

    DEFF Research Database (Denmark)

    Ma*, Ytao; Shakiryanova*, Dinara; Vardya, Irina;

    2004-01-01

    the translocation of MT plus ends in the axonal shaft by expressing GFP-EB1 in Xenopus embryo neurons in culture. Formal quantitative analysis of MT assembly/disassembly indicated that none of the MTs in the axonal shaft were rapidly transported. Our results suggest that transport of axonal MTs is not required...

  17. Quantitative Analysis of Complex Tropical Forest Stands: A Review ...

    African Journals Online (AJOL)


    The importance of data analysis in quantitative assessment of natural resources ... 350 km long extends from the eastern border of Sierra Leone all the way to Ghana. ... consider whether data will likely fit the assumptions of a selected model. ... These tests are not alternatives to parametric tests, but rather are a means of ...

  18. Analysis of Forecasting Sales By Using Quantitative And Qualitative Methods

    Directory of Open Access Journals (Sweden)

    B. Rama Sanjeeva Sresta,

    2016-09-01

    Full Text Available This paper focuses on the analysis of forecasting sales using quantitative and qualitative methods. This forecast should help create a model for measuring success and setting goals from financial and operational viewpoints. The resulting model should tell whether we have met our goals with respect to measures, targets, and initiatives.

  19. Insights Into Quantitative Biology: analysis of cellular adaptation

    OpenAIRE

    Agoni, Valentina

    2013-01-01

    In recent years, many powerful techniques have emerged to measure protein interactions as well as gene expression. Much progress has been made since the introduction of these techniques, but not toward the quantitative analysis of the resulting data. In this paper we show how to study cellular adaptation and how to detect cellular subpopulations. Moreover, we go deeper into analyzing the dynamics of signal transduction pathways.

  20. Quantitating the subtleties of microglial morphology with fractal analysis.

    Science.gov (United States)

    Karperien, Audrey; Ahammer, Helmut; Jelinek, Herbert F

    2013-01-01

    It is well established that microglial form and function are inextricably linked. In recent years, the traditional view that microglial form ranges between "ramified resting" and "activated amoeboid" has been emphasized through advancing imaging techniques that point to microglial form being highly dynamic even within the currently accepted morphological categories. Moreover, microglia adopt meaningful intermediate forms between categories, with considerable crossover in function and varying morphologies as they cycle, migrate, wave, phagocytose, and extend and retract fine and gross processes. From a quantitative perspective, it is problematic to measure such variability using traditional methods, but one way of quantitating such detail is through fractal analysis. The techniques of fractal analysis have been used for quantitating microglial morphology, to categorize gross differences but also to differentiate subtle differences (e.g., amongst ramified cells). Multifractal analysis in particular is one technique of fractal analysis that may be useful for identifying intermediate forms. Here we review current trends and methods of fractal analysis, focusing on box counting analysis, including lacunarity and multifractal analysis, as applied to microglial morphology.
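    The box-counting analysis this review centres on can be sketched as follows (a generic illustration, not the authors' software): overlay grids of box size s on the binarized cell outline, count occupied boxes N(s), and estimate the fractal dimension D from the slope of log N(s) versus log s.

```python
import math

def box_counting_dimension(pixels, sizes=(1, 2, 4, 8)):
    """Cover the pixel set with boxes of each size s, count occupied boxes
    N(s), and return D from the least-squares slope of log N(s) ~ -D log s."""
    log_s = [math.log(s) for s in sizes]
    log_n = [math.log(len({(x // s, y // s) for x, y in pixels}))
             for s in sizes]
    xm = sum(log_s) / len(log_s)
    ym = sum(log_n) / len(log_n)
    return -(sum((x - xm) * (y - ym) for x, y in zip(log_s, log_n))
             / sum((x - xm) ** 2 for x in log_s))

# Sanity checks: a filled square is 2-D, a straight line of pixels 1-D;
# a ramified microglial silhouette falls somewhere in between
square = [(x, y) for x in range(16) for y in range(16)]
line = [(x, 0) for x in range(16)]
```

    Lacunarity analysis reuses the same grids but looks at the variance of box occupancy rather than the count, and multifractal analysis generalizes the single exponent D to a spectrum.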

  1. Quantitating the Subtleties of Microglial Morphology with Fractal Analysis

    Directory of Open Access Journals (Sweden)

    Audrey Karperien

    2013-01-01

    Full Text Available It is well established that microglial form and function are inextricably linked. In recent years, the traditional view that microglial form ranges between "ramified resting" and "activated amoeboid" has been emphasized through advancing imaging techniques that point to microglial form being highly dynamic even within the currently accepted morphological categories. Moreover, microglia adopt meaningful intermediate forms between categories, with considerable crossover in function and varying morphologies as they cycle, migrate, wave, phagocytose, and extend and retract fine and gross processes. From a quantitative perspective, it is problematic to measure such variability using traditional methods, but one way of quantitating such detail is through fractal analysis. The techniques of fractal analysis have been used for quantitating microglial morphology, to categorize gross differences but also to differentiate subtle differences (e.g., amongst ramified cells). Multifractal analysis in particular is one technique of fractal analysis that may be useful for identifying intermediate forms. Here we review current trends and methods of fractal analysis, focusing on box counting analysis, including lacunarity and multifractal analysis, as applied to microglial morphology.

  2. Spatial analysis of hemorrhagic fever with renal syndrome in China

    Directory of Open Access Journals (Sweden)

    Yang Hong

    2006-04-01

    Full Text Available Abstract Background Hemorrhagic fever with renal syndrome (HFRS) is endemic in many provinces with high incidence in mainland China, although integrated intervention measures including rodent control, environment management and vaccination have been implemented for over ten years. In this study, we conducted a geographic information system (GIS)-based spatial analysis of the distribution of HFRS cases for the whole country, with the objective of informing priority areas for public health planning and resource allocation. Methods Annualized average incidence at the county level was calculated using HFRS cases reported during 1994–1998 in mainland China. GIS-based spatial analyses were conducted to detect spatial autocorrelation and clusters of HFRS incidence at the county level throughout the country. Results The spatial distribution of HFRS cases in mainland China from 1994 to 1998 was mapped at the county level in terms of crude incidence, excess hazard and spatially smoothed incidence. The spatial distribution of HFRS cases was nonrandom and clustered, with Moran's I = 0.5044 (p = 0.001). Spatial cluster analyses suggested that 26 and 39 areas were at increased risks of HFRS (p Conclusion The application of GIS, together with spatial statistical techniques, provides a means to quantify explicit HFRS risks and to further identify environmental factors responsible for the increasing disease risks. We demonstrate a new perspective of integrating such spatial analysis tools into the epidemiologic study and risk assessment of HFRS.
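    The global Moran's I statistic reported above (I = 0.5044) is computed, in general, as sketched below (a generic pure-Python illustration with a toy adjacency matrix, not the study's county data): a weighted cross-product of deviations from the mean, normalized by the total weight and the variance.

```python
def morans_i(values, weights):
    """Global Moran's I:
    I = (n / S0) * sum_ij w_ij (y_i - ybar)(y_j - ybar) / sum_i (y_i - ybar)^2,
    where S0 is the sum of all spatial weights w_ij. Positive I indicates
    that neighbouring units carry similar values (spatial clustering)."""
    n = len(values)
    ybar = sum(values) / n
    dev = [y - ybar for y in values]
    s0 = sum(map(sum, weights))
    num = sum(weights[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    return (n / s0) * num / sum(d * d for d in dev)

# Four counties in a row with smoothly increasing incidence:
# neighbours are similar, so I is positive
incidence = [1.0, 2.0, 3.0, 4.0]
adjacency = [[0, 1, 0, 0],
             [1, 0, 1, 0],
             [0, 1, 0, 1],
             [0, 0, 1, 0]]
i_stat = morans_i(incidence, adjacency)
```

    In practice the significance of I (the study's p = 0.001) is assessed by random permutation of the values over the spatial units.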

  3. Quantitative transverse flow assessment using OCT speckle decorrelation analysis

    Science.gov (United States)

    Liu, Xuan; Huang, Yong; Ramella-Roman, Jessica C.; Kang, Jin U.

    2013-03-01

    In this study, we demonstrate the use of inter-A-scan speckle decorrelation analysis of optical coherence tomography (OCT) to assess fluid flow. This method allows quantitative measurement of fluid flow in a plane normal to the scanning beam. To validate this method, OCT images were obtained from a microfluidic channel with bovine milk flowing at different speeds. We also imaged a blood vessel in an in vivo animal model and performed speckle analysis to assess blood flow.
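    The decorrelation measure underlying such speckle analysis can be sketched generically (an illustrative toy, not the authors' processing chain): one minus the normalized correlation between successive A-scan intensity profiles, which grows as transverse flow displaces scatterers between scans.

```python
import math

def decorrelation(scan_a, scan_b):
    """1 - Pearson correlation of two successive A-scan intensity profiles:
    0 for identical speckle, larger as flow rearranges scatterers between
    acquisitions."""
    n = len(scan_a)
    ma = sum(scan_a) / n
    mb = sum(scan_b) / n
    cov = sum((a - ma) * (b - mb) for a, b in zip(scan_a, scan_b))
    va = sum((a - ma) ** 2 for a in scan_a)
    vb = sum((b - mb) ** 2 for b in scan_b)
    return 1.0 - cov / math.sqrt(va * vb)
```

    A calibration on known flow speeds (as with the milk phantom above) is then needed to map decorrelation values to velocities.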

  4. Spatial Analysis Methods for Health Promotion and Education.

    Science.gov (United States)

    Chaney, Robert A; Rojas-Guyler, Liliana

    2016-05-01

    This article provides a review of spatial analysis methods for use in health promotion and education research and practice. Spatial analysis seeks to describe or make inferences about variables with respect to the places where they occur. This includes geographic differences, proximity issues, and access to resources. This is important for understanding how health outcomes differ from place to place, and for understanding some of the environmental underpinnings of health outcomes data by placing them in the context of geographic location. This article seeks to promote spatial analysis as a viable tool for health promotion and education research and practice. Four commonly used spatial analysis techniques are described in the text: descriptive mapping, global spatial autocorrelation, cluster detection and identification, and spatial regression analysis. An illustrative example of motor vehicle collisions in a large metropolitan city is presented using these techniques. This article provides useful information for health promotion and education researchers and practitioners seeking to examine research questions from a spatial perspective.

  5. Quantitative numerical analysis of transient IR-experiments on buildings

    Science.gov (United States)

    Maierhofer, Ch.; Wiggenhauser, H.; Brink, A.; Röllig, M.

    2004-12-01

    Impulse-thermography has been established as a fast and reliable tool in many areas of non-destructive testing. In recent years several investigations have been done to apply active thermography to civil engineering. For quantitative investigations in this area of application, finite difference calculations have been performed for systematic studies on the influence of environmental conditions, heating power and time, defect depth and size and thermal properties of the bulk material (concrete). The comparison of simulated and experimental data enables the quantitative analysis of defects.

  6. Quantitative analysis of culture using millions of digitized books.

    Science.gov (United States)

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva Presser; Veres, Adrian; Gray, Matthew K; Pickett, Joseph P; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A; Aiden, Erez Lieberman

    2011-01-14

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of 'culturomics,' focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. Culturomics extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities.

  7. Quantitative analysis of culture using millions of digitized books

    Science.gov (United States)

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva P.; Veres, Adrian; Gray, Matthew K.; Pickett, Joseph P.; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A.; Aiden, Erez Lieberman

    2011-01-01

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of ‘culturomics’, focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. ‘Culturomics’ extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities. PMID:21163965

  8. Quantitative x-ray magnetic circular dichroism mapping with high spatial resolution full-field magnetic transmission soft x-ray spectro-microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Robertson, MacCallum J. [Center for X-ray Optics, Lawrence Berkeley National Laboratory, Berkeley, California 94720 (United States); Physics Department, University of California, Berkeley, California 94720 (United States); Agostino, Christopher J. [Physics Department, University of California, Berkeley, California 94720 (United States); National Center for Electron Microscopy, Lawrence Berkeley National Laboratory, Berkeley, California 94720 (United States); N' Diaye, Alpha T. [Advanced Light Source, Lawrence Berkeley National Laboratory, Berkeley, California 94720 (United States); Chen, Gong [National Center for Electron Microscopy, Lawrence Berkeley National Laboratory, Berkeley, California 94720 (United States); Im, Mi-Young [Center for X-ray Optics, Lawrence Berkeley National Laboratory, Berkeley, California 94720 (United States); Department of Emerging Materials Science, Daegu Gyeongbuk Institute of Science and Technology, Daegu 711-873 (Korea, Republic of); Fischer, Peter, E-mail: PJFischer@lbl.gov [Center for X-ray Optics, Lawrence Berkeley National Laboratory, Berkeley, California 94720 (United States); Physics Department, University of California, Santa Cruz, California 94056 (United States)

    2015-05-07

    The spectroscopic analysis of X-ray magnetic circular dichroism (XMCD), which serves as strong and element-specific magnetic contrast in full-field magnetic transmission soft x-ray microscopy, is shown to provide information on the local distribution of spin (S) and orbital (L) magnetic moments down to a spatial resolution of 25 nm limited by the x-ray optics used in the x-ray microscope. The spatially resolved L/S ratio observed in a multilayered (Co 0.3 nm/Pt 0.5 nm) × 30 thin film exhibiting a strong perpendicular magnetic anisotropy decreases significantly in the vicinity of domain walls, indicating a non-uniform spin configuration in the vertical profile of a domain wall across the thin film. Quantitative XMCD mapping with x-ray spectro-microscopy will become an important characterization tool for systems with topological or engineered magnetization inhomogeneities.

  9. Stellwagen Bank National Marine Sanctuary - Internal Wave Analysis Spatial Extent

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This feature class contains the spatial extent of the internal wave analysis. This area of interest was defined in the interest of time. A cursory review of the 66 SAR...

  10. Quantitative analysis of a frequency-domain nonlinearity indicator.

    Science.gov (United States)

    Reichman, Brent O; Gee, Kent L; Neilsen, Tracianne B; Miller, Kyle G

    2016-05-01

    In this paper, quantitative understanding of a frequency-domain nonlinearity indicator is developed. The indicator is derived from an ensemble-averaged, frequency-domain version of the generalized Burgers equation, which can be rearranged in order to directly compare the effects of nonlinearity, absorption, and geometric spreading on the pressure spectrum level with frequency and distance. The nonlinear effect is calculated using pressure-squared-pressure quadspectrum. Further theoretical development has given an expression for the role of the normalized quadspectrum, referred to as Q/S by Morfey and Howell [AIAA J. 19, 986-992 (1981)], in the spatial rate of change of the pressure spectrum level. To explore this finding, an investigation of the change in level for initial sinusoids propagating as plane waves through inviscid and thermoviscous media has been conducted. The decibel change with distance, calculated through Q/S, captures the growth and decay of the harmonics and indicates that the most significant changes in level occur prior to sawtooth formation. At large distances, the inviscid case results in a spatial rate of change that is uniform across all harmonics. For thermoviscous media, large positive nonlinear gains are observed but offset by absorption, which leads to a greater overall negative spatial rate of change for higher harmonics.

  11. Spotsizer: High-throughput quantitative analysis of microbial growth

    Science.gov (United States)

    Jeffares, Daniel C.; Arzhaeva, Yulia; Bähler, Jürg

    2017-01-01

    Microbial colony growth can serve as a useful readout in assays for studying complex genetic interactions or the effects of chemical compounds. Although computational tools for acquiring quantitative measurements of microbial colonies have been developed, their utility can be compromised by inflexible input image requirements, non-trivial installation procedures, or complicated operation. Here, we present the Spotsizer software tool for automated colony size measurements in images of robotically arrayed microbial colonies. Spotsizer features a convenient graphical user interface (GUI), has both single-image and batch-processing capabilities, and works with multiple input image formats and different colony grid types. We demonstrate how Spotsizer can be used for high-throughput quantitative analysis of fission yeast growth. The user-friendly Spotsizer tool provides rapid, accurate, and robust quantitative analyses of microbial growth in a high-throughput format. Spotsizer is freely available at https://data.csiro.au/dap/landingpage?pid=csiro:15330 under a proprietary CSIRO license. PMID:27712582

  12. Local spatial frequency analysis for computer vision

    Science.gov (United States)

    Krumm, John; Shafer, Steven A.

    1990-01-01

    A sense of vision is a prerequisite for a robot to function in an unstructured environment. However, real-world scenes contain many interacting phenomena that lead to complex images which are difficult to interpret automatically. Typical computer vision research proceeds by analyzing various effects in isolation (e.g., shading, texture, stereo, defocus), usually on images devoid of realistic complicating factors. This leads to specialized algorithms which fail on real-world images. Part of this failure is due to the dichotomy of useful representations for these phenomena. Some effects are best described in the spatial domain, while others are more naturally expressed in frequency. In order to resolve this dichotomy, we present the combined space/frequency representation which, for each point in an image, shows the spatial frequencies at that point. Within this common representation, we develop a set of simple, natural theories describing phenomena such as texture, shape, aliasing and lens parameters. We show these theories lead to algorithms for shape from texture and for dealiasing image data. The space/frequency representation should be a key aid in untangling the complex interaction of phenomena in images, allowing automatic understanding of real-world scenes.
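The idea of a space/frequency representation can be sketched in one dimension: at each position, a short windowed segment is transformed to show "the spatial frequencies at that point". This is an illustrative toy, not the authors' formulation:

```python
import math, cmath

# Minimal 1D sketch of a space/frequency representation: a windowed DFT
# magnitude centred at a chosen sample position (illustrative only).

def local_spectrum(signal, center, width):
    """Magnitude DFT of a window of `width` samples centred at `center`."""
    half = width // 2
    seg = [signal[(center + i) % len(signal)] for i in range(-half, half)]
    n = len(seg)
    return [abs(sum(seg[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n)))
            for k in range(n)]

# A pure spatial frequency: 2 cycles per 8-sample window
sig = [math.cos(2 * math.pi * 2 * t / 8) for t in range(32)]
spec = local_spectrum(sig, 16, 8)
peak = max(range(len(spec)), key=lambda k: spec[k])
print(peak)  # strongest bin at k = 2 (with its real-signal alias at k = 6)
```

Sliding the window over every position yields the full representation; for images the same construction applies with a 2D window and 2D DFT.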

  13. Analysis of spatially deconvolved polar faculae

    Science.gov (United States)

    Quintero Noda, C.; Suematsu, Y.; Ruiz Cobo, B.; Shimizu, T.; Asensio Ramos, A.

    2016-07-01

    Polar faculae are bright features that can be detected in solar limb observations, and they are related to magnetic field concentrations. Although a large number of works have studied them, some questions about their nature, such as their magnetic properties at different heights, are still open. Thus, we aim to improve the understanding of solar polar faculae. To that end, we infer the vertical stratification of the temperature, gas pressure, line-of-sight velocity and magnetic field vector of polar faculae regions. We performed inversions of the Stokes profiles observed with Hinode/Spectropolarimeter after removing the stray light contamination produced by the spatial point spread function of the telescope. Moreover, after solving the azimuth ambiguity, we transform the magnetic field vector to local solar coordinates. The obtained results reveal that the polar faculae are constituted by hot plasma with low line-of-sight velocities and single-polarity magnetic fields in the kilogauss range that are nearly perpendicular to the solar surface. We also found that the spatial location of these magnetic fields is slightly shifted with respect to the continuum observations towards the disc centre. We believe that this is due to the hot wall effect, which allows the detection of photons that come from deeper layers located closer to the solar limb.

  14. A strategy to apply quantitative epistasis analysis on developmental traits.

    Science.gov (United States)

    Labocha, Marta K; Yuan, Wang; Aleman-Meza, Boanerges; Zhong, Weiwei

    2017-05-15

    Genetic interactions are key to understanding complex traits and evolution. Epistasis analysis is an effective method to map genetic interactions. Large-scale quantitative epistasis analysis has been well established for single cells. However, there is a substantial lack of such studies in multicellular organisms and their complex phenotypes such as development. Here we present a method to extend quantitative epistasis analysis to developmental traits. In the nematode Caenorhabditis elegans, we applied RNA interference on mutants to inactivate two genes, used an imaging system to quantitatively measure phenotypes, and developed a set of statistical methods to extract genetic interactions from phenotypic measurements. Using two different C. elegans developmental phenotypes, body length and sex ratio, as examples, we showed that this method could accommodate various metazoan phenotypes with performance comparable to that of methods used in single-cell growth studies. Compared with qualitative observations, this quantitative epistasis method enabled the detection of new interactions involving subtle phenotypes. For example, several sex-ratio genes were found to interact with brc-1 and brd-1, the orthologs of the human breast cancer genes BRCA1 and BARD1, respectively. We confirmed the brc-1 interactions with the following genes in DNA damage response: C34F6.1, him-3 (ortholog of HORMAD1, HORMAD2), sdc-1, and set-2 (ortholog of SETD1A, SETD1B, KMT2C, KMT2D), validating the effectiveness of our method in detecting genetic interactions. We developed a reliable, high-throughput method for quantitative epistasis analysis of developmental phenotypes.
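A common way to score epistasis from quantitative phenotypes is the multiplicative model: the double perturbation is compared against the product of the single-perturbation phenotypes. The values below are invented, and the paper's own statistics are more elaborate; this is only a sketch of the underlying idea:

```python
# Illustrative epistasis score under the multiplicative null model
# (hypothetical numbers; not the paper's actual statistical pipeline).

def epistasis_score(w_a, w_b, w_ab):
    """Deviation of the double perturbation from the multiplicative
    expectation w_a * w_b; a value near 0 suggests no interaction."""
    return w_ab - w_a * w_b

# Phenotypes normalized so that wild type = 1.0
print(epistasis_score(0.8, 0.9, 0.72))  # ~0.0: no interaction
print(epistasis_score(0.8, 0.9, 0.40))  # strongly negative: aggravating
```

In practice each score would come with replicate-based error estimates and a significance test before an interaction is called.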

  15. Quantitative Monitoring for Enhanced Geothermal Systems Using Double-Difference Waveform Inversion with Spatially-Variant Total-Variation Regularization

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Youzuo [Los Alamos National Laboratory; Huang, Lianjie [Los Alamos National Laboratory; Zhang, Zhigang [Los Alamos National Laboratory

    2011-01-01

    Double-difference waveform inversion is a promising tool for quantitative monitoring of enhanced geothermal systems (EGS). The method uses time-lapse seismic data to jointly invert for reservoir changes. Because of the ill-posedness of waveform inversion, it is a great challenge to obtain reservoir changes accurately and efficiently, particularly when using time-lapse seismic reflection data. To improve the inversion accuracy and robustness, we incorporate a spatially-variant total-variation regularization scheme into double-difference waveform inversion. The new regularization scheme employs different regularization parameters in different regions of the model to obtain an optimal regularization in each area. We compare the results obtained using a spatially-variant parameter with those obtained using a constant regularization parameter. With the spatially-variant regularization scheme, the target monitoring regions are well reconstructed and the image noise is significantly reduced outside the monitoring regions. Our numerical examples demonstrate that the spatially-variant total-variation regularization scheme provides the flexibility to regularize local regions based on a priori spatial information without increasing the computational cost or the computer memory requirement.
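The notion of a spatially-variant total-variation penalty can be sketched directly: each model cell carries its own regularization weight, so smoothing can be concentrated where prior information demands it. This toy (anisotropic TV on a 2D grid, forward differences) is illustrative, not the paper's implementation:

```python
# Sketch of a spatially-variant total-variation (TV) penalty: each cell
# (i, j) has its own weight lam[i][j] (illustrative toy, not the paper's code).

def tv_penalty(model, lam):
    """Weighted anisotropic TV: sum_ij lam_ij * (|dx_ij| + |dy_ij|)."""
    h, w = len(model), len(model[0])
    total = 0.0
    for i in range(h):
        for j in range(w):
            dx = model[i][j + 1] - model[i][j] if j + 1 < w else 0.0
            dy = model[i + 1][j] - model[i][j] if i + 1 < h else 0.0
            total += lam[i][j] * (abs(dx) + abs(dy))
    return total

m = [[0.0, 1.0],
     [0.0, 2.0]]
flat = [[1.0, 1.0], [1.0, 1.0]]    # constant regularization everywhere
focus = [[1.0, 0.0], [1.0, 0.0]]   # penalize roughness in left column only
print(tv_penalty(m, flat))   # 4.0
print(tv_penalty(m, focus))  # 3.0
```

A constant-parameter scheme is recovered when all entries of `lam` are equal; choosing smaller weights inside the monitoring region lets sharp time-lapse changes survive the regularization there.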

  16. Probabilistic Quantitative Precipitation Forecasting using a Two-Stage Spatial Model

    Science.gov (United States)

    2008-04-08

    ...decision via a truncation; the second process drives precipitation amounts via an anamorphosis or transformation function (Chilès and Delfiner 1999, p... spatially varying anamorphosis or transformation function (Chilès and Delfiner 1999, p. 381). The anamorphosis has the advantage of retaining the appro... Chilès, J.-P., and Delfiner, P. (1999), Geostatistics: Modeling Spatial Uncertainty, Wiley, 695 pp. Diebold, F. X., Gunther, T. A., and Tay, A. S. (1998), "Evaluating density...

  17. Analysis of spatially deconvolved polar faculae

    CERN Document Server

    Noda, C Quintero; Cobo, B Ruiz; Shimizu, T; Ramos, A Asensio

    2016-01-01

    Polar faculae are bright features that can be detected in solar limb observations, and they are related to magnetic field concentrations. Although a large number of works have studied them, some questions about their nature, such as their magnetic properties at different heights, are still open. Thus, we aim to improve the understanding of solar polar faculae. To that end, we infer the vertical stratification of the temperature, gas pressure, line-of-sight velocity and magnetic field vector of polar faculae regions. We performed inversions of the Stokes profiles observed with Hinode/SP after removing the stray light contamination produced by the spatial point spread function of the telescope. Moreover, after solving the azimuth ambiguity, we transform the magnetic field vector to local solar coordinates. The obtained results reveal that the polar faculae are constituted by hot plasma with low line-of-sight velocities and single-polarity magnetic fields in the kilogauss range that are nearly perpendicular to the solar surface.

  18. Deflection analysis of rectangular spatial coverage truss

    Directory of Open Access Journals (Sweden)

    M.N. Kirsanov

    2015-02-01

    Full Text Available An elastic, statically determinate spatial truss of regular type, simulating a coverage rectangular in plan, was considered. In the plane of the base the truss has two axes of symmetry. Four supports (a spherical hinge, a cylindrical hinge and two vertical rods) are located at its corners. An analytic solution was found for the forces in the rods of the truss. Using the Maxwell-Mohr formula, the dependence of the deflection at the center of the truss under a concentrated force was derived. This formula contains five parameters of the problem: three linear dimensions and the numbers of hinges on the two lateral sides. To determine the desired relations, a recursion problem in two parameters was solved by means of the computer mathematics system Maple. It was shown that the dependence of the deflection on the number of panels and the height of the truss exhibits a minimum, allowing the size of the structure to be optimized.

  19. Theory and application for retrieval and fusion of spatial and temporal quantitative information from complex natural environment

    Institute of Scientific and Technical Information of China (English)

    JIN Yaqiu

    2007-01-01

    This paper briefly presents the research progress of the State Major Basic Research Project 2001CB309400, "Theory and Application for Retrieval and Fusion of Spatial and Temporal Quantitative Information from Complex Natural Environment". Based on the rapid advancement of synthetic aperture radar (SAR) imagery technology, information theory of fully polarimetric scattering and applications in polarimetric SAR remote sensing are developed. To promote the modeling of passive microwave remote sensing, the vector (polarized) radiative transfer theory (VRT) of complex natural media, such as inhomogeneous, multi-layered and 3-dimensional VRT, is developed. With these theoretical progresses, data validation and retrieval algorithms for some typical events and characteristic parameters of earth terrain surfaces, atmosphere, and oceans from operational and experimental remote sensing satellites are studied. Employing remote sensing, radiative transfer simulation, geographic information systems (GIS), land hydrological process, and data assimilation, the Chinese land data assimilation system (CLDAS) is established. Towards the future development of China's microwave meteorological satellites, employing remote sensing data of currently available SSM/I (special sensor microwave/imager), AMSU (advanced microwave sounding unit), MTI (microwave temperature imager), etc., together with ground-based measurements, several operational algorithms and databases for atmospheric precipitation, water vapor and liquid water in clouds, and other hydrological applications are developed. To advance China's SAR and InSAR (interferometric SAR) technologies, software platforms for the image processing and analysis of ERS (European remote sensing), Radarsat SAR, and Chinese SAR data, etc., are accomplished. Based on the research of multi-information fusion, some simulations, identification, and information extractions of the targets from complex background clutter scenes are studied. Some

  20. Quantitative nanoscale analysis in 3D using electron tomography

    Energy Technology Data Exchange (ETDEWEB)

    Kuebel, Christian [Karlsruhe Institute of Technology, INT, 76344 Eggenstein-Leopoldshafen (Germany)

    2011-07-01

    State-of-the-art electron tomography has been established as a powerful tool to image complex structures with nanometer resolution in 3D. STEM tomography in particular is used extensively in materials science in such diverse areas as catalysis, semiconductor materials, and polymer composites, mainly providing qualitative information on the morphology, shape and distribution of materials. However, an increasing number of studies need quantitative information, e.g. surface area, fractal dimensions, particle distribution or porosity. A quantitative analysis is typically performed after segmenting the tomographic data, which is one of the main sources of error for the quantification. In addition to noise, systematic errors due to the missing wedge and due to artifacts from the reconstruction algorithm itself are responsible for these segmentation errors, and improved algorithms are needed. This presentation will provide an overview of the possibilities and limitations of quantitative nanoscale analysis by electron tomography. Using catalysts and nanocomposites as application examples, intensities and intensity variations observed in 3D volumes reconstructed by WBP and SIRT will be quantitatively compared to those from alternative reconstruction algorithms; implications for the quantification of electron (or X-ray) tomographic data will be discussed and illustrated for the quantification of particle size distributions, particle correlations, surface area, and fractal dimensions in 3D.

  1. Some selected quantitative methods of thermal image analysis in Matlab.

    Science.gov (United States)

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images, and shows the practical implementation of these image analysis methods in Matlab. The algorithm enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods, for the skin areas of a human foot and face. The full source code of the developed application is also provided as an attachment. The graphical abstract shows the main window of the program during dynamic analysis of the foot thermal image.

  2. Kernel parameter dependence in spatial factor analysis

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg

    2010-01-01

    ...feature space via the kernel function and then performing a linear analysis in that space. In this paper we shall apply a kernel version of maximum autocorrelation factor (MAF) [7, 8] analysis to irregularly sampled stream sediment geochemistry data from South Greenland and illustrate the dependence...

  3. Data from quantitative label free proteomics analysis of rat spleen

    Directory of Open Access Journals (Sweden)

    Khadar Dudekula

    2016-09-01

    Full Text Available The dataset presented in this work has been obtained using a label-free quantitative proteomic analysis of rat spleen. A robust method for extraction of proteins from rat spleen tissue and LC-MS-MS analysis was developed using a urea and SDS-based buffer. Different fractionation methods were compared. A total of 3484 different proteins were identified from the pool of all experiments run in this study (a total of 2460 proteins with at least two peptides). A total of 1822 proteins were identified from nine non-fractionated pulse gels, while 2288 and 2864 proteins were identified by SDS-PAGE fractionation into three and five fractions, respectively. The proteomics data are deposited in the ProteomeXchange Consortium via PRIDE (PXD003520); Progenesis and MaxQuant outputs are presented in the supporting information. The generated list of proteins under different regimes of fractionation allows assessing the nature of the identified proteins and the variability in the quantitative analysis associated with the different sampling strategies, and allows defining a proper number of replicates for future quantitative analyses.

  4. Data from quantitative label free proteomics analysis of rat spleen.

    Science.gov (United States)

    Dudekula, Khadar; Le Bihan, Thierry

    2016-09-01

    The dataset presented in this work has been obtained using a label-free quantitative proteomic analysis of rat spleen. A robust method for extraction of proteins from rat spleen tissue and LC-MS-MS analysis was developed using a urea and SDS-based buffer. Different fractionation methods were compared. A total of 3484 different proteins were identified from the pool of all experiments run in this study (a total of 2460 proteins with at least two peptides). A total of 1822 proteins were identified from nine non-fractionated pulse gels, while 2288 and 2864 proteins were identified by SDS-PAGE fractionation into three and five fractions, respectively. The proteomics data are deposited in the ProteomeXchange Consortium via PRIDE (PXD003520); Progenesis and MaxQuant outputs are presented in the supporting information. The generated list of proteins under different regimes of fractionation allows assessing the nature of the identified proteins and the variability in the quantitative analysis associated with the different sampling strategies, and allows defining a proper number of replicates for future quantitative analyses.

  5. An improved quantitative analysis method for plant cortical microtubules.

    Science.gov (United States)

    Lu, Yi; Huang, Chenyang; Wang, Jia; Shang, Peng

    2014-01-01

    The arrangement of plant cortical microtubules can reflect the physiological state of cells. However, little attention has been paid to the quantitative image analysis of plant cortical microtubules so far. In this paper, the Bidimensional Empirical Mode Decomposition (BEMD) algorithm was applied to preprocess the original microtubule images. The Intrinsic Mode Function 1 (IMF1) image obtained by the decomposition was then selected for texture analysis based on the Grey-Level Co-occurrence Matrix (GLCM) algorithm. To further verify its reliability, the proposed texture analysis method was used to distinguish different images of Arabidopsis microtubules. The results showed that the BEMD algorithm preserved edges while reducing noise, and that the geometrical characteristics of the texture were distinct. Four texture parameters extracted by GLCM clearly reflected the different arrangements in the two images of cortical microtubules. In summary, the results indicate that this method is feasible and effective for the quantitative image analysis of plant cortical microtubules. It not only provides a new quantitative approach for the comprehensive study of the role played by microtubules in cell life activities but also supplies a reference for other similar studies.
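A grey-level co-occurrence matrix simply counts how often pairs of grey levels occur at a fixed pixel offset; texture parameters such as contrast are then moments of that matrix. A minimal sketch for one horizontal offset (illustrative; real pipelines average several offsets and extract more parameters):

```python
# Minimal grey-level co-occurrence matrix (GLCM) for a single offset,
# plus the "contrast" texture parameter (illustrative sketch).

def glcm(image, levels, dx=1, dy=0):
    """Co-occurrence counts of grey-level pairs at offset (dx, dy)."""
    mat = [[0] * levels for _ in range(levels)]
    for y in range(len(image)):
        for x in range(len(image[0])):
            ny, nx = y + dy, x + dx
            if 0 <= ny < len(image) and 0 <= nx < len(image[0]):
                mat[image[y][x]][image[ny][nx]] += 1
    return mat

def contrast(mat):
    """Intensity-difference moment: sum_ij p_ij * (i - j)^2."""
    total = sum(sum(row) for row in mat)
    return sum(mat[i][j] * (i - j) ** 2
               for i in range(len(mat)) for j in range(len(mat))) / total

img = [[0, 0, 1, 1],
       [0, 0, 1, 1],
       [2, 2, 3, 3],
       [2, 2, 3, 3]]
print(contrast(glcm(img, 4)))  # low: horizontal neighbours differ by <= 1
```

Energy, correlation, and homogeneity are computed from the same matrix in an analogous way.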

  6. An Improved Quantitative Analysis Method for Plant Cortical Microtubules

    Directory of Open Access Journals (Sweden)

    Yi Lu

    2014-01-01

    Full Text Available The arrangement of plant cortical microtubules can reflect the physiological state of cells. However, little attention has been paid to the quantitative image analysis of plant cortical microtubules so far. In this paper, the Bidimensional Empirical Mode Decomposition (BEMD) algorithm was applied to preprocess the original microtubule images. The Intrinsic Mode Function 1 (IMF1) image obtained by the decomposition was then selected for texture analysis based on the Grey-Level Co-occurrence Matrix (GLCM) algorithm. To further verify its reliability, the proposed texture analysis method was used to distinguish different images of Arabidopsis microtubules. The results showed that the BEMD algorithm preserved edges while reducing noise, and that the geometrical characteristics of the texture were distinct. Four texture parameters extracted by GLCM clearly reflected the different arrangements in the two images of cortical microtubules. In summary, the results indicate that this method is feasible and effective for the quantitative image analysis of plant cortical microtubules. It not only provides a new quantitative approach for the comprehensive study of the role played by microtubules in cell life activities but also supplies a reference for other similar studies.

  7. Quantitative risk analysis of oil storage facilities in seismic areas.

    Science.gov (United States)

    Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto

    2005-08-31

    Quantitative risk analysis (QRA) of industrial facilities has to take into account multiple hazards threatening critical equipment. Nevertheless, engineering procedures able to quantitatively evaluate the effect of seismic action are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other relevant natural hazards, either directly or through cascade effects ('domino effects'). The issue of integrating structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) is addressed in this paper by a representative case study of an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined for both building-like and non-building-like industrial components, have been crossed with the outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in southern Italy. Once the seismic failure probabilities had been quantified, consequence analysis was performed for those events which may be triggered by loss of containment following seismic action. Results are combined by means of a specifically developed code in terms of local risk contour plots, i.e. the contour line for the probability of fatal injuries at any point (x, y) in the analysed area. Finally, a comparison with a QRA obtained by considering only process-related top events is reported for reference.
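The "crossing" of a hazard analysis with fragility curves amounts to summing, over intensity levels, the probability of each level times the conditional failure probability at that level. All numbers below are invented for illustration; they are not the paper's site or tank data:

```python
# Toy combination of a seismic hazard distribution with a component
# fragility curve to obtain an annual failure probability (invented values).

def annual_failure_probability(p_im, p_fail_given_im):
    """Discrete total-probability sum: P(fail) = sum_i p(IM_i) * P(fail | IM_i)."""
    return sum(p * f for p, f in zip(p_im, p_fail_given_im))

ims = [0.1, 0.3, 0.5, 0.7]            # peak ground acceleration bins (g)
p_im = [0.90, 0.08, 0.015, 0.005]     # annual probability of each IM bin
p_fail = [0.0, 0.05, 0.4, 0.9]        # tank fragility at each IM bin
print(annual_failure_probability(p_im, p_fail))  # ~0.0145 per year
```

The resulting failure probability then feeds the consequence analysis (release, ignition, domino effects) that produces the local risk contours described in the abstract.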

  8. Spatial analysis of BSE cases in the Netherlands

    Directory of Open Access Journals (Sweden)

    Brus Dick J

    2008-06-01

    Full Text Available Background: In many of the European countries affected by Bovine Spongiform Encephalopathy (BSE), case clustering patterns have been observed. Most of these patterns have been interpreted in terms of heterogeneities in exposure of cattle to the BSE agent. Here we investigate whether spatial clustering is present in the Dutch BSE case data. Results: We have found three spatial case clusters in the Dutch BSE epidemic. The clusters are geographically distinct and each cluster appears in a different birth cohort. When testing all birth cohorts together, only one significant cluster was detected. The fact that we found stronger spatial clustering when using a cohort-based analysis is consistent with the evidence that most BSE infections occur in animals less than 12 or 18 months old. Conclusion: Significant spatial case clustering is present in the Dutch BSE epidemic. The spatial clusters of BSE cases are most likely due to time-dependent heterogeneities in exposure related to feed production.

  9. Qualitative and quantitative stability analysis of penta-rhythmic circuits

    Science.gov (United States)

    Schwabedal, Justus T. C.; Knapper, Drake E.; Shilnikov, Andrey L.

    2016-12-01

    Inhibitory circuits of relaxation oscillators are often-used models for dynamics of biological networks. We present a qualitative and quantitative stability analysis of such a circuit constituted by three generic oscillators (of a Fitzhugh-Nagumo type) as its nodes coupled reciprocally. Depending on inhibitory strengths, and parameters of individual oscillators, the circuit exhibits polyrhythmicity of up to five simultaneously stable rhythms. With methods of bifurcation analysis and phase reduction, we investigate qualitative changes in stability of these circuit rhythms for a wide range of parameters. Furthermore, we quantify robustness of the rhythms maintained under random perturbations by monitoring phase diffusion in the circuit. Our findings allow us to describe how circuit dynamics relate to dynamics of individual nodes. We also find that quantitative and qualitative stability properties of polyrhythmicity do not always align.
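Monitoring phase diffusion as a robustness measure can be sketched with a toy model: under weak noise the variance of the accumulated phase grows roughly linearly, Var[phi(t)] ≈ 2Dt, so the diffusion coefficient D is estimated from that growth. This is an illustrative caricature, not the circuit model of the paper:

```python
import random

# Toy phase-diffusion estimate: each cycle the oscillator phase receives a
# random kick; D is read off from the linear growth of the phase variance
# (illustrative; the paper's circuits and perturbations are far richer).

def estimate_diffusion(n_trials, n_steps, sigma, seed=0):
    rng = random.Random(seed)
    finals = []
    for _ in range(n_trials):
        phi = 0.0
        for _ in range(n_steps):
            phi += rng.gauss(0.0, sigma)   # random phase kick per cycle
        finals.append(phi)
    mean = sum(finals) / n_trials
    var = sum((p - mean) ** 2 for p in finals) / n_trials
    return var / (2 * n_steps)             # Var[phi] ≈ 2 * D * t

print(estimate_diffusion(2000, 100, 0.1))  # ≈ sigma**2 / 2 = 0.005
```

A rhythm with smaller D keeps its phase relationships longer under perturbation, which is the sense in which diffusion quantifies robustness.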

  10. Quantitative analysis of myocardial tissue with digital autofluorescence microscopy

    DEFF Research Database (Denmark)

    Jensen, Thomas; Holten-Rossing, Henrik; Svendsen, Ida M H;

    2016-01-01

    BACKGROUND: The opportunity offered by whole slide scanners of automated histological analysis implies an ever increasing importance of digital pathology. To go beyond the importance of conventional pathology, however, digital pathology may need a basic histological starting point similar to that of hematoxylin and eosin staining in conventional pathology. This study presents an automated fluorescence-based microscopy approach providing highly detailed morphological data from unstained microsections. This data may provide a basic histological starting point from which further digital analysis including staining may benefit. METHODS: This study explores the inherent tissue fluorescence, also known as autofluorescence, as a means to quantitate cardiac tissue components in histological microsections. Data acquisition using a commercially available whole slide scanner and an image-based quantitation algorithm...

  11. Quantitative and qualitative analysis and interpretation of CT perfusion imaging.

    Science.gov (United States)

    Valdiviezo, Carolina; Ambrose, Marietta; Mehra, Vishal; Lardo, Albert C; Lima, Joao A C; George, Richard T

    2010-12-01

    Coronary artery disease (CAD) remains the leading cause of death in the United States. Rest and stress myocardial perfusion imaging has an important role in the non-invasive risk stratification of patients with CAD. However, diagnostic accuracies have been limited, which has led to the development of several myocardial perfusion imaging techniques. Among them, myocardial computed tomography perfusion imaging (CTP) is especially interesting as it has the unique capability of providing anatomic- as well as coronary stenosis-related functional data when combined with computed tomography angiography (CTA). The primary aim of this article is to review the qualitative, semi-quantitative, and quantitative analysis approaches to CTP imaging. In doing so, we will describe the image data required for each analysis and discuss the advantages and disadvantages of each approach.

  12. QuantUM: Quantitative Safety Analysis of UML Models

    Directory of Open Access Journals (Sweden)

    Florian Leitner-Fischer

    2011-07-01

    Full Text Available When developing a safety-critical system it is essential to obtain an assessment of different design alternatives. In particular, an early safety assessment of the architectural design of a system is desirable. In spite of the plethora of available formal quantitative analysis methods, it is still difficult for software and system architects to integrate these techniques into their everyday work. This is mainly due to the lack of methods that can be directly applied to architecture-level models, for instance given as UML diagrams. It is also necessary that the description methods used do not require a profound knowledge of formal methods. Our approach bridges this gap and improves the integration of quantitative safety analysis methods into the development process. All inputs of the analysis are specified at the level of a UML model. This model is then automatically translated into the analysis model, and the results of the analysis are subsequently represented at the level of the UML model. Thus the analysis model and the formal methods used during the analysis are hidden from the user. We illustrate the usefulness of our approach using an industrial-strength case study.

  13. Advance in orientation microscopy: quantitative analysis of nanocrystalline structures.

    Science.gov (United States)

    Seyring, Martin; Song, Xiaoyan; Rettenmayr, Markus

    2011-04-26

    The special properties of nanocrystalline materials are generally accepted to be a consequence of the high density of planar defects (grain and twin boundaries) and their characteristics. Until now, however, nanograin structures have not been characterized in similar detail and with similar statistical relevance as coarse-grained materials, due to the lack of an appropriate method. In the present paper, a novel method based on quantitative nanobeam diffraction in transmission electron microscopy (TEM) is presented to determine the misorientation of adjacent nanograins and subgrains. The spatial resolution of twin boundaries is substantially higher than that observed in bright-field TEM images; small-angle grain boundaries are prominent; and there is an obvious dependence of the grain boundary characteristics on the grain size distribution and mean grain size.

  14. Country Risk Analysis: A Survey of the Quantitative Methods

    OpenAIRE

    Hiranya K Nath

    2008-01-01

    With globalization and financial integration, there has been rapid growth of international lending and foreign direct investment (FDI). In view of this emerging trend, country risk analysis has become extremely important for the international creditors and investors. This paper briefly discusses the concepts and definitions, and presents a survey of the quantitative methods that are used to address various issues related to country risk. It also gives a summary review of selected empirical st...

  15. Combining qualitative and quantitative spatial and temporal information in a hierarchical structure: Approximate reasoning for plan execution monitoring

    Science.gov (United States)

    Hoebel, Louis J.

    1993-01-01

    The problem of plan generation (PG) and the problem of plan execution monitoring (PEM), including updating, queries, and resource-bounded replanning, have different reasoning and representation requirements. PEM requires the integration of qualitative and quantitative information. PEM is the receiving of data about the world in which a plan or agent is executing. The problem is to quickly determine the relevance of the data, the consistency of the data with respect to the expected effects, and whether execution should continue. Only the spatial and temporal aspects of the plan are addressed for relevance in this work. Current temporal reasoning systems are deficient in computational aspects or expressiveness. This work presents a hybrid qualitative and quantitative system that is fully expressive in its assertion language while offering certain computational efficiencies. To proceed, methods incorporating approximate reasoning using hierarchies, notions of locality, constraint expansion, and absolute parameters are needed, and these are shown to be useful for the anytime nature of PEM.

  16. Spatially explicit spectral analysis of point clouds and geospatial data

    Science.gov (United States)

    Buscombe, Daniel

    2016-01-01

    The increasing use of spatially explicit analyses of high-resolution spatially distributed data (imagery and point clouds) for the purposes of characterising spatial heterogeneity in geophysical phenomena necessitates the development of custom analytical and computational tools. In recent years, such analyses have become the basis of, for example, automated texture characterisation and segmentation, roughness and grain size calculation, and feature detection and classification, from a variety of data types. In this work, much use has been made of statistical descriptors of localised spatial variations in amplitude variance (roughness), however the horizontal scale (wavelength) and spacing of roughness elements is rarely considered. This is despite the fact that the ratio of characteristic vertical to horizontal scales is not constant and can yield important information about physical scaling relationships. Spectral analysis is a hitherto under-utilised but powerful means to acquire statistical information about relevant amplitude and wavelength scales, simultaneously and with computational efficiency. Further, quantifying spatially distributed data in the frequency domain lends itself to the development of stochastic models for probing the underlying mechanisms which govern the spatial distribution of geological and geophysical phenomena. The software package PySESA (Python program for Spatially Explicit Spectral Analysis) has been developed for generic analyses of spatially distributed data in both the spatial and frequency domains. Developed predominantly in Python, it accesses libraries written in Cython and C++ for efficiency. It is open source and modular, therefore readily incorporated into, and combined with, other data analysis tools and frameworks with particular utility for supporting research in the fields of geomorphology, geophysics, hydrography, photogrammetry and remote sensing. 
The analytical and computational structure of the toolbox is described.
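
The core idea of such spatially explicit spectral analysis can be sketched in a few lines of numpy. The function below is an illustration only (it is not the PySESA API, which the abstract does not show): it estimates the dominant roughness wavelength of a gridded surface from its two-dimensional power spectrum.

```python
import numpy as np

# Illustrative sketch (not the PySESA API): estimate the dominant
# roughness wavelength of a gridded surface from its 2-D power spectrum.
def dominant_wavelength(z, dx=1.0):
    """Return the wavelength (same units as dx) carrying the most power."""
    z = z - z.mean()                          # remove the mean elevation
    F = np.fft.fft2(z)
    P = np.abs(F) ** 2                        # (unnormalised) power spectrum
    fy = np.fft.fftfreq(z.shape[0], d=dx)
    fx = np.fft.fftfreq(z.shape[1], d=dx)
    fr = np.hypot(*np.meshgrid(fy, fx, indexing="ij"))  # radial frequency
    mask = fr > 0                             # ignore the DC component
    f_peak = fr[mask][np.argmax(P[mask])]
    return 1.0 / f_peak

# A synthetic "surface": a sinusoid of 10-unit wavelength plus noise.
x = np.arange(128) * 1.0
z = (np.sin(2 * np.pi * x / 10.0)[None, :]
     + 0.1 * np.random.default_rng(0).normal(size=(128, 128)))
w = dominant_wavelength(z)   # close to 10
```

Because the spectrum is computed once by FFT, amplitude and wavelength statistics come out of the same transform, which is the computational-efficiency point the abstract makes.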

  17. Spatially explicit spectral analysis of point clouds and geospatial data

    Science.gov (United States)

    Buscombe, Daniel D.

    2015-01-01

The increasing use of spatially explicit analyses of high-resolution spatially distributed data (imagery and point clouds) for the purposes of characterising spatial heterogeneity in geophysical phenomena necessitates the development of custom analytical and computational tools. In recent years, such analyses have become the basis of, for example, automated texture characterisation and segmentation, roughness and grain size calculation, and feature detection and classification, from a variety of data types. In this work, much use has been made of statistical descriptors of localised spatial variations in amplitude variance (roughness), however the horizontal scale (wavelength) and spacing of roughness elements is rarely considered. This is despite the fact that the ratio of characteristic vertical to horizontal scales is not constant and can yield important information about physical scaling relationships. Spectral analysis is a hitherto under-utilised but powerful means to acquire statistical information about relevant amplitude and wavelength scales, simultaneously and with computational efficiency. Further, quantifying spatially distributed data in the frequency domain lends itself to the development of stochastic models for probing the underlying mechanisms which govern the spatial distribution of geological and geophysical phenomena. The software package PySESA (Python program for Spatially Explicit Spectral Analysis) has been developed for generic analyses of spatially distributed data in both the spatial and frequency domains. Developed predominantly in Python, it accesses libraries written in Cython and C++ for efficiency. It is open source and modular, therefore readily incorporated into, and combined with, other data analysis tools and frameworks with particular utility for supporting research in the fields of geomorphology, geophysics, hydrography, photogrammetry and remote sensing. The analytical and computational structure of the toolbox is described.

  18. A quantitative framework to estimate the relative importance of environment, spatial variation and patch connectivity in driving community composition.

    Science.gov (United States)

    Monteiro, Viviane F; Paiva, Paulo C; Peres-Neto, Pedro R

    2017-03-01

Perhaps the most widely used quantitative approach in metacommunity ecology is the estimation of the importance of local environment vs. spatial structuring using the variation partitioning framework. Contrary to metapopulation models, however, current empirical studies of metacommunity structure using variation partitioning assume a space-for-dispersal substitution due to the lack of analytical frameworks that incorporate patch connectivity predictors of dispersal dynamics. Here, a method is presented that allows estimating the relative importance of environment, spatial variation and patch connectivity in driving community composition variation within metacommunities. The proposed approach is illustrated by a study designed to understand the factors driving the structure of a soft-bottom marine polychaete metacommunity. Using a standard variation partitioning scheme (i.e. where only environmental and spatial predictors are used), only about 13% of the variation in metacommunity structure was explained. With the connectivity set of predictors, the total amount of explained variation increased to 51%. These results highlight the importance of considering predictors of patch connectivity rather than just spatial predictors. Given that information on connectivity can be estimated by commonly available data on species distributions for a number of taxa, the framework presented here can be readily applied to past studies as well, facilitating a more robust evaluation of the factors contributing to metacommunity structure.
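
The variation-partitioning logic can be illustrated with a small regression sketch. Everything below is invented for illustration (real applications use partial redundancy analysis on community matrices, not plain least squares on one response): the variance of a community variable y is partitioned between an environmental predictor set E and a spatial/connectivity set S via R-squared values.

```python
import numpy as np

# Toy variation partitioning: unique and shared fractions of variance
# explained by two predictor sets, from OLS R-squared values.
def r2(y, X):
    X1 = np.column_stack([np.ones(len(y)), X])   # add intercept
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - resid.var() / y.var()

rng = np.random.default_rng(1)
E = rng.normal(size=(200, 2))        # environmental predictors
S = rng.normal(size=(200, 2))        # spatial / patch-connectivity predictors
y = E @ [1.0, 0.5] + S @ [0.8, 0.0] + rng.normal(size=200)

R_E, R_S, R_ES = r2(y, E), r2(y, S), r2(y, np.hstack([E, S]))
pure_E = R_ES - R_S                  # fraction [a]: environment alone
pure_S = R_ES - R_E                  # fraction [c]: space/connectivity alone
shared = R_E + R_S - R_ES            # fraction [b]: shared
```

Replacing the columns of S with connectivity indices (rather than raw spatial coordinates) is exactly the substitution the abstract argues for.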

  19. A kernel version of spatial factor analysis

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg

    2009-01-01

Based on work by Pearson in 1901, Hotelling in 1933 introduced principal component analysis (PCA). PCA is often used for general feature generation and linear orthogonalization or compression by dimensionality reduction of correlated multivariate data; see Jolliffe for a comprehensive description of PCA and related techniques. An interesting dilemma in reduction of dimensionality of data is the desire to obtain simplicity for better understanding, visualization and interpretation of the data on the one hand, and the desire to retain sufficient detail for adequate representation on the other hand. [...] The kernel version of PCA handles nonlinearities by implicitly transforming data into high (even infinite) dimensional feature space via the kernel function and then performing a linear analysis in that space. In this paper we shall apply kernel versions of PCA, maximum autocorrelation factor (MAF) analysis...
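
A minimal kernel PCA, shown here only to make the "implicit transformation via the kernel function" concrete; this is a generic RBF-kernel sketch on synthetic data, not Nielsen's implementation.

```python
import numpy as np

# Minimal kernel PCA with an RBF kernel: build the kernel matrix, centre it
# in feature space, and take its leading eigenvectors as component scores.
def rbf_kernel_pca(X, gamma=1.0, n_components=2):
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # pairwise sq. distances
    K = np.exp(-gamma * sq)                               # RBF kernel matrix
    n = len(X)
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one            # centre in feature space
    vals, vecs = np.linalg.eigh(Kc)                       # ascending eigenvalues
    vals, vecs = vals[::-1], vecs[:, ::-1]                # sort descending
    # scores: eigenvectors scaled by the square root of their eigenvalues
    return vecs[:, :n_components] * np.sqrt(np.maximum(vals[:n_components], 0))

X = np.random.default_rng(0).normal(size=(50, 3))
scores = rbf_kernel_pca(X, gamma=0.5)   # shape (50, 2)
```

With a linear kernel this reduces to ordinary PCA, which is the sense in which the kernel version generalizes it.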

  20. Quantitative Proteomic Approaches for Analysis of Protein S-Nitrosylation.

    Science.gov (United States)

    Qu, Zhe; Greenlief, C Michael; Gu, Zezong

    2016-01-01

    S-Nitrosylation is a redox-based post-translational modification of a protein in response to nitric oxide (NO) signaling, and it participates in a variety of processes in diverse biological systems. The significance of this type of protein modification in health and diseases is increasingly recognized. In the central nervous system, aberrant S-nitrosylation, due to excessive NO production, is known to cause protein misfolding, mitochondrial dysfunction, transcriptional dysregulation, and neuronal death. This leads to an altered physiological state and consequently contributes to pathogenesis of neurodegenerative disorders. To date, much effort has been made to understand the mechanisms underlying protein S-nitrosylation, and several approaches have been developed to unveil S-nitrosylated proteins from different organisms. Interest in determining the dynamic changes of protein S-nitrosylation under different physiological and pathophysiological conditions has underscored the need for the development of quantitative proteomic approaches. Currently, both gel-based and gel-free mass spectrometry-based quantitative methods are widely used, and they each have advantages and disadvantages but may also be used together to produce complementary data. This review evaluates current available quantitative proteomic techniques for the analysis of protein S-nitrosylation and highlights recent advances, with emphasis on applications in neurodegenerative diseases. An important goal is to provide a comprehensive guide of feasible quantitative proteomic methodologies for examining protein S-nitrosylation in research to yield insights into disease mechanisms, diagnostic biomarkers, and drug discovery.

  1. Comprehensive Quantitative Analysis of SQ Injection Using Multiple Chromatographic Technologies.

    Science.gov (United States)

    Chau, Siu-Leung; Huang, Zhi-Bing; Song, Yan-Gang; Yue, Rui-Qi; Ho, Alan; Lin, Chao-Zhan; Huang, Wen-Hua; Han, Quan-Bin

    2016-08-19

Quality control of Chinese medicine injections remains a challenge due to our poor knowledge of their complex chemical profile. This study aims to investigate the chemical composition of one of the best-selling injections, Shenqi Fuzheng (SQ) injection (SQI), via a full component quantitative analysis. A total of 15 representative small molecular components of SQI were simultaneously determined using ultra-high performance liquid chromatography (UHPLC) coupled with quadrupole tandem time-of-flight mass spectrometry (Q-TOF-MS); saccharide composition of SQI was also quantitatively determined by high performance liquid chromatography (HPLC) with evaporative light scattering detector (ELSD) on an amino column before and after acid hydrolysis. The existence of polysaccharides was also examined on a gel permeation chromatography column. The method was well validated in terms of linearity, sensitivity, precision, accuracy and stability, and was successfully applied to analyze 13 SQI samples. The results demonstrate that up to 94.69% (w/w) of this injection product is quantitatively determined, in which small molecules and monosaccharide/sucrose account for 0.18%-0.21% and 53.49%-58.2%, respectively. The quantitative information contributes to accumulating scientific evidence to better understand the therapeutic efficacy and safety of complex Chinese medicine injections.

  2. Comprehensive Quantitative Analysis of SQ Injection Using Multiple Chromatographic Technologies

    Directory of Open Access Journals (Sweden)

    Siu-Leung Chau

    2016-08-01

Full Text Available Quality control of Chinese medicine injections remains a challenge due to our poor knowledge of their complex chemical profile. This study aims to investigate the chemical composition of one of the best-selling injections, Shenqi Fuzheng (SQ) injection (SQI), via a full component quantitative analysis. A total of 15 representative small molecular components of SQI were simultaneously determined using ultra-high performance liquid chromatography (UHPLC) coupled with quadrupole tandem time-of-flight mass spectrometry (Q-TOF-MS); saccharide composition of SQI was also quantitatively determined by high performance liquid chromatography (HPLC) with evaporative light scattering detector (ELSD) on an amino column before and after acid hydrolysis. The existence of polysaccharides was also examined on a gel permeation chromatography column. The method was well validated in terms of linearity, sensitivity, precision, accuracy and stability, and was successfully applied to analyze 13 SQI samples. The results demonstrate that up to 94.69% (w/w) of this injection product is quantitatively determined, in which small molecules and monosaccharide/sucrose account for 0.18%–0.21% and 53.49%–58.2%, respectively. The quantitative information contributes to accumulating scientific evidence to better understand the therapeutic efficacy and safety of complex Chinese medicine injections.

  3. A kernel version of spatial factor analysis

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg

    2009-01-01

Schölkopf et al. introduce kernel PCA. Shawe-Taylor and Cristianini is an excellent reference for kernel methods in general. Bishop and Press et al. describe kernel methods among many other subjects. Nielsen and Canty use kernel PCA to detect change in univariate airborne digital camera images. The kernel version of PCA handles nonlinearities by implicitly transforming data into high (even infinite) dimensional feature space via the kernel function and then performing a linear analysis in that space. In this paper we shall apply kernel versions of PCA, maximum autocorrelation factor (MAF) analysis...

  4. Portable (handheld) clinical device for quantitative spectroscopy of skin, utilizing spatial frequency domain reflectance techniques

    Science.gov (United States)

    Saager, Rolf B.; Dang, An N.; Huang, Samantha S.; Kelly, Kristen M.; Durkin, Anthony J.

    2017-09-01

Spatial Frequency Domain Spectroscopy (SFDS) is a technique for quantifying in vivo tissue optical properties. SFDS employs structured light patterns that are projected onto tissues using a spatial light modulator, such as a digital micromirror device. In combination with appropriate models of light propagation, this technique can be used to quantify tissue optical properties (absorption, μa, and scattering, μs', coefficients) and chromophore concentrations. Here we present a handheld implementation of an SFDS device that employs line (one dimensional) imaging. This instrument can measure 1088 spatial locations that span a 3 cm line as opposed to our original benchtop SFDS system that only collects a single 1 mm diameter spot. This imager, however, retains the spectral resolution (~1 nm) and range (450-1000 nm) of our original benchtop SFDS device. In the context of homogeneous turbid media, we demonstrate that this new system matches the spectral response of our original system to within 1% across a typical range of spatial frequencies (0-0.35 mm⁻¹). With the new form factor, the device has tremendously improved mobility and portability, allowing for greater ease of use in a clinical setting. A smaller size also enables access to different tissue locations, which increases the flexibility of the device. The design of this portable system not only enables SFDS to be used in clinical settings but also enables visualization of properties of layered tissues such as skin.
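
The demodulation step behind structured-illumination measurements like this can be sketched with the standard three-phase formula from the spatial frequency domain imaging literature (a generic textbook relation, not this instrument's actual processing chain): three sinusoidal patterns shifted by 0, 2π/3 and 4π/3 are projected, and the AC (modulated) and DC (planar) reflectance amplitudes are recovered per pixel.

```python
import numpy as np

# Standard three-phase SFDI demodulation: recover AC and DC reflectance
# amplitudes from three phase-shifted sinusoidal illumination frames.
def demodulate(i1, i2, i3):
    ac = (np.sqrt(2.0) / 3.0) * np.sqrt(
        (i1 - i2) ** 2 + (i2 - i3) ** 2 + (i3 - i1) ** 2)
    dc = (i1 + i2 + i3) / 3.0
    return ac, dc

# Simulated line scan: reflectance 0.6 (DC) with 0.3 modulation (AC).
x = np.linspace(0, 30, 1000)                   # position, mm
f = 0.2                                        # spatial frequency, cycles/mm
true_ac, true_dc = 0.3, 0.6
phases = (0.0, 2 * np.pi / 3, 4 * np.pi / 3)
frames = [true_dc + true_ac * np.cos(2 * np.pi * f * x + p) for p in phases]
ac, dc = demodulate(*frames)                   # recovers 0.3 and 0.6
```

Repeating this at several spatial frequencies, then inverting a light-propagation model, yields the absorption and scattering coefficients the abstract refers to.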

  5. An Analysis of Perturbed Quantization Steganography in the Spatial Domain

    Science.gov (United States)

    2005-03-01

"An Analysis of Perturbed Quantization Steganography in the Spatial Domain", a thesis by Matthew D. Spisak (AFIT/GIA/ENG/05-04), Department of the Air Force, Air University, presented to the Faculty, Department of Electrical and... The surviving text fragment notes that this form of steganography is also common with audio [KaP00], illustrated by Figure 1, "Least Significant Bit Substitution".
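
Least-significant-bit substitution, the baseline technique named in the excerpt's Figure 1, can be shown in a few lines. This is a generic illustration of LSB embedding, not the perturbed-quantization scheme the thesis itself analyses.

```python
import numpy as np

# LSB substitution: hide a bit payload in the least significant bits of
# 8-bit cover samples. Each cover sample changes by at most 1.
def embed(cover, bits):
    stego = cover.copy()
    stego[:len(bits)] = (stego[:len(bits)] & 0xFE) | bits   # overwrite LSBs
    return stego

def extract(stego, n):
    return stego[:n] & 1            # read the LSB plane back

cover = np.arange(16, dtype=np.uint8) * 10          # toy cover samples
payload = np.array([1, 0, 1, 1, 0, 0, 1, 0], dtype=np.uint8)
stego = embed(cover, payload)
recovered = extract(stego, len(payload))
```

The near-invisibility of a ±1 change per sample is what makes the LSB plane attractive for embedding, and statistical detection of exactly such changes is the subject of steganalysis work like this thesis.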

  6. A Spatial Analysis of Obesity in West Virginia

    OpenAIRE

    Anura Amarasinghe; Gerard D'Souza; Cheryl Brown; Tatiana Borisova

    2006-01-01

A spatial panel data analysis at the county level examines how individual food consumption, recreational, and lifestyle choices, against a backdrop of changing demographic, built environment, and policy factors, lead to obesity. Results suggest that obesity tends to be spatially autocorrelated; in addition to hereditary factors and lifestyle choices, it is also caused by sprawl and lack of land use planning. Policy measures which stimulate educational attainment, poverty alleviation, and p...

  7. Spatial filtering efficiency of monostatic biaxial lidar: analysis and applications

    OpenAIRE

    Agishev, Ravil R.; Comerón Tejero, Adolfo

    2002-01-01

Results of lidar modeling based on spatial-angular filtering efficiency criteria are presented. Their analysis shows that the low spatial-angular filtering efficiency of traditional visible and near-infrared systems is an important cause of a low signal-to-background-radiation ratio (SBR) at the photodetector input. The low SBR may be responsible for considerable measurement errors and the ensuing low accuracy of the retrieval of atmospheric optical parameters. As shown, the most effec...

  9. An Introduction to Spatial Analysis in Social Science Research

    Directory of Open Access Journals (Sweden)

    Yanqing Xu

    2015-02-01

    Full Text Available For years researchers have recognized the need to consider environmental and contextual variables in the social and behavioral sciences. Multilevel models have grown in popularity in large part because they provide a means to explicitly model the influence of context on many individual level processes. However, in applications of these and other statistical models that incorporate context into the analysis, rarely is physical location or distance between entities considered. In this paper we discuss a variety of spatial analysis techniques and their applications in educational and psychological research. We provide examples with the SAS software package and other more specialized spatial analysis software.

  10. Quantitative analysis for nonlinear fluorescent spectra based on edges matching

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

A novel spectra-edge-matching approach is proposed for the quantitative analysis of the nonlinear fluorescence spectra of air impurities excited by a femtosecond laser. The fluorescence spectra are first denoised and compressed, both by wavelet transform, and several peak groups are then picked from each spectrum according to an intensity threshold and used to extract spectral features through principal component analysis. The first two principal components are found to cover up to 98% of the total information and are sufficient for the final concentration analysis. The analysis reveals a monotone relationship between spectral intensity and the concentration of the air impurities, suggesting that femtosecond-laser-induced fluorescence spectroscopy, together with the proposed spectral analysis method, can become a powerful tool for monitoring environmental pollutants.
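
The dimensionality-reduction step described here, where two principal components capture nearly all of the spectral variance, can be sketched on synthetic spectra (the two emission bands and "concentrations" below are invented; only the PCA step mirrors the abstract).

```python
import numpy as np

# PCA of a set of spectra via SVD of the mean-centred data matrix:
# return the fraction of variance captured by the first k components.
def pca_explained(spectra, k=2):
    Xc = spectra - spectra.mean(axis=0)
    _, s, _ = np.linalg.svd(Xc, full_matrices=False)
    var = s ** 2
    return var[:k].sum() / var.sum()

rng = np.random.default_rng(2)
wl = np.linspace(300, 800, 200)                  # wavelength grid, nm
band1 = np.exp(-((wl - 450) / 30) ** 2)          # two underlying emission bands
band2 = np.exp(-((wl - 620) / 40) ** 2)
conc = rng.uniform(0.5, 2.0, size=(40, 2))       # varying "concentrations"
spectra = conc @ np.vstack([band1, band2]) + 0.01 * rng.normal(size=(40, 200))

frac = pca_explained(spectra, k=2)               # close to 1 for 2-band data
```

Because the spectra are (noisy) mixtures of only two bands, two components suffice, which is the same situation the abstract reports for its measured spectra.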

  11. Efficiency analysis of control algorithms in spatially distributed systems with chaotic behavior

    Directory of Open Access Journals (Sweden)

    Korus Łukasz

    2014-12-01

Full Text Available The paper presents results of an examination of control algorithms for the purpose of controlling chaos in spatially distributed systems such as the coupled map lattice (CML). The mathematical definition of the CML, its stability analysis, and basic numerical results exposing the complex, spatiotemporal and chaotic behavior of the CML were presented in an earlier paper. The main purpose of this article is to compare the efficiency of simple classical algorithms for controlling chaos in spatially distributed systems such as CMLs. The comparison is based on the qualitative and quantitative evaluation methods proposed in the previous paper: the indirect Lyapunov method, Lyapunov exponents and the net direction phase indicator. The paper closes with conclusions useful for designing more efficient chaos-control algorithms for spatially distributed systems.
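
A coupled map lattice with a simple pinning-style control can be sketched as follows. The parameter values (logistic r = 4, coupling eps = 0.3, control gain 0.8, target 0.5) are illustrative only and are not taken from the paper.

```python
import numpy as np

# Coupled map lattice: local logistic maps with diffusive nearest-neighbour
# coupling (periodic boundary), plus a feedback term pulling sites toward 0.5.
def step(x, r=4.0, eps=0.3, gain=0.0):
    f = r * x * (1.0 - x)                                   # local logistic map
    coupled = (1 - eps) * f + 0.5 * eps * (np.roll(f, 1) + np.roll(f, -1))
    return np.clip(coupled + gain * (0.5 - coupled), 0.0, 1.0)

x = np.random.default_rng(3).uniform(size=64)
for _ in range(200):
    x = step(x)                    # free-running: spatiotemporal chaos
spread_free = x.std()              # large spatial spread

for _ in range(200):
    x = step(x, gain=0.8)          # controlled: contraction toward a fixed point
spread_ctrl = x.std()              # spread collapses toward zero
```

The spatial standard deviation before and after switching the control on is a crude stand-in for the quantitative efficiency indicators (Lyapunov exponents, net direction phase) the paper actually uses.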

  12. Quantitative gait analysis following hemispherotomy for Rasmussen's encephalitis

    Directory of Open Access Journals (Sweden)

    Santhosh George Thomas

    2007-01-01

Full Text Available Peri-insular hemispherotomy is a form of disconnective hemispherectomy involving complete disconnection of all ascending/descending and commissural connections of one hemisphere. We report a case of a seven-and-a-half-year-old child with intractable epilepsy due to Rasmussen's encephalitis who underwent peri-insular hemispherotomy and achieved complete freedom from seizures. Quantitative gait analysis with surface electromyographs was used to describe the changes in the kinematic and kinetic parameters of gait 18 months after surgery. The focus of this paper is to highlight the utility of gait analysis following hemispherotomy with a view to directing postsurgical motor training and rehabilitation.

  13. Spatial analysis of snail distribution in Jiangning county

    Institute of Scientific and Technical Information of China (English)

    ZHANG Zhi-ying; ZHOU Yun; XU De-zhong; SUN Zhi-dong; ZHOU Xiao-nong; GONG Zi-li

    2002-01-01

Objective: To explore the spatial distribution of Oncomelania snails in Jiangning County. Methods: Cluster analysis and spatial scan statistics were performed on the density of living snails in habitats and their Schistosoma japonicum infection rates. Results: Although the area of snail habitats and the density of living snails in marshland in 2000 were significantly higher than in the mountainous areas of Jiangning County, habitats in the mountains were more numerous and sporadically distributed. Cluster analysis classified the snail habitats into 4 classes in marshland and 3 classes in mountainous areas. Although most habitats had low densities of living and infected snails, some habitats combined high snail density with high infection rates, which is important for the transmission of schistosomiasis and warrants vigilance. The spatial scan statistic detected 2 significant spatial aggregations of living snails in marshland and 4 in mountainous areas (p < 0.01), as well as 2 significant aggregations of infected snails in marshland. Conclusion: The significant spatial aggregations of living and infected snails indicate that some habitat factors favour the survival of snails and the transmission of schistosomiasis.
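
The spatial scan statistic used here can be illustrated in simplified form: every window of adjacent habitats is scored with a Poisson likelihood ratio comparing the rate inside versus outside. This is a one-dimensional sketch with invented counts; the real study scanned two-dimensional habitat locations and assessed significance by Monte Carlo replication.

```python
import numpy as np

# Simplified Kulldorff-style scan along a chain of habitats: find the
# window with the highest Poisson log-likelihood ratio for an elevated rate.
def scan(counts, sizes):
    C, P = counts.sum(), sizes.sum()
    best_llr, best_win = 0.0, None
    n = len(counts)
    for i in range(n):
        for j in range(i + 1, n + 1):
            c, p = counts[i:j].sum(), sizes[i:j].sum()
            if 0 < c < C:
                lam_in, lam_out = c / p, (C - c) / (P - p)
                if lam_in > lam_out:          # scan for high-rate clusters only
                    llr = (c * np.log(lam_in) + (C - c) * np.log(lam_out)
                           - C * np.log(C / P))
                    if llr > best_llr:
                        best_llr, best_win = llr, (i, j)
    return best_llr, best_win

counts = np.array([2, 3, 2, 20, 25, 18, 3, 2])   # living snails per habitat
sizes = np.full(8, 100.0)                         # surveyed area per habitat
llr, window = scan(counts, sizes)                 # picks the three hot habitats
```

In a full analysis the observed maximum LLR would be compared against maxima from permuted data to obtain the p-values the abstract reports.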

  14. Spatial analysis of neglected diseases in Brazil, 2007 to 2009

    Directory of Open Access Journals (Sweden)

    Joyce Mendes de Andrade Schramm

    2016-07-01

Full Text Available This paper describes a set of epidemiological information on the spatial distribution of selected neglected diseases in Brazil from 2007 to 2009, together with health care infrastructure and socio-economic indicators for 2010. It is an ecological spatial-analysis study based on the incidence of tuberculosis, visceral leishmaniasis, American cutaneous leishmaniasis and malaria, and the prevalence of leprosy and schistosomiasis. Maps of the spatial distribution of prevalence and incidence rates were drawn, as well as cluster detection maps, applying spatial statistical analysis techniques. A thematic map with the total distribution of priority municipalities was developed for each disease. Malaria and schistosomiasis had the highest incidence and prevalence rates, respectively. All diseases analyzed showed dependence in the spatial correlation measure. A total of 1,630 Brazilian municipalities (29%) were considered by the Ministry of Health as priorities to receive control actions for at least one of the studied diseases. The North and Northeast regions concentrate municipalities with at least three simultaneous diseases, overlapping with the lowest socio-economic indicators. Spatial analysis studies may contribute to better planning and organization of health care and services, aiming to reduce the existing gap in scientific knowledge on neglected diseases.

  15. Application of Integration of Spatial Statistical Analysis with GIS to Regional Economic Analysis

    Institute of Scientific and Technical Information of China (English)

    CHEN Fei; DU Daosheng

    2004-01-01

This paper summarizes several spatial statistical analysis methods for measuring spatial autocorrelation and spatial association, and discusses criteria for the identification of spatial association through the use of the global Moran Coefficient, Local Moran and Local Geary. Furthermore, a user-friendly statistical module, combining spatial statistical analysis methods with GIS visual techniques, is developed in Arcview using Avenue. An example is also given to show the usefulness of this module in identifying and quantifying the underlying spatial association patterns between economic units.
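
The global Moran coefficient the paper builds on is a textbook formula; the sketch below computes it for a chain of regions with rook-adjacency weights, using invented values purely to show the sign convention (smooth trends give positive autocorrelation, checkerboard patterns negative).

```python
import numpy as np

# Global Moran's I for a variable x on regions with spatial weight matrix W.
def morans_i(x, W):
    n = len(x)
    z = x - x.mean()
    num = n * (W * np.outer(z, z)).sum()
    den = W.sum() * (z ** 2).sum()
    return num / den

# Chain of 10 regions; neighbours share an edge (symmetric binary weights).
n = 10
W = np.zeros((n, n))
idx = np.arange(n - 1)
W[idx, idx + 1] = W[idx + 1, idx] = 1.0

trend = np.arange(n, dtype=float)        # smoothly increasing values
alternating = np.array([1.0, 0.0] * 5)   # checkerboard values
i_trend = morans_i(trend, W)             # strongly positive
i_alt = morans_i(alternating, W)         # strongly negative
```

Local Moran and Local Geary, also named in the abstract, decompose this global sum into per-region contributions to locate where the association is concentrated.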

  16. A Quantitative Method for Microtubule Analysis in Fluorescence Images.

    Science.gov (United States)

    Lan, Xiaodong; Li, Lingfei; Hu, Jiongyu; Zhang, Qiong; Dang, Yongming; Huang, Yuesheng

    2015-12-01

    Microtubule analysis is of significant value for a better understanding of normal and pathological cellular processes. Although immunofluorescence microscopic techniques have proven useful in the study of microtubules, comparative results commonly rely on a descriptive and subjective visual analysis. We developed an objective and quantitative method based on image processing and analysis of fluorescently labeled microtubular patterns in cultured cells. We used a multi-parameter approach by analyzing four quantifiable characteristics to compose our quantitative feature set. Then we interpreted specific changes in the parameters and revealed the contribution of each feature set using principal component analysis. In addition, we verified that different treatment groups could be clearly discriminated using principal components of the multi-parameter model. High predictive accuracy of four commonly used multi-classification methods confirmed our method. These results demonstrated the effectiveness and efficiency of our method in the analysis of microtubules in fluorescence images. Application of the analytical methods presented here provides information concerning the organization and modification of microtubules, and could aid in the further understanding of structural and functional aspects of microtubules under normal and pathological conditions.

  17. Quantitative Phosphoproteomic Analysis of T-Cell Receptor Signaling.

    Science.gov (United States)

    Ahsan, Nagib; Salomon, Arthur R

    2017-01-01

TCR signaling critically depends on protein phosphorylation across many proteins. Localization of each phosphorylation event relative to the T-cell receptor (TCR) and canonical T-cell signaling proteins will provide clues about the structure of TCR signaling networks. Quantitative phosphoproteomic analysis by mass spectrometry provides a wide-scale view of cellular phosphorylation networks. However, analysis of phosphorylation by mass spectrometry is still challenging due to the low abundance of phosphorylated proteins relative to all proteins and the extraordinary diversity of phosphorylation sites across the proteome. Highly selective enrichment of phosphorylated peptides is essential to provide the most comprehensive view of the phosphoproteome. Optimization of phosphopeptide enrichment methods coupled with highly sensitive mass spectrometry workflows significantly improves the sequencing depth of the phosphoproteome to over 10,000 unique phosphorylation sites from complex cell lysates. Here we describe a step-by-step method for phosphoproteomic analysis that has achieved widespread success for identification of serine, threonine, and tyrosine phosphorylation. Reproducible quantification of relative phosphopeptide abundance is provided by intensity-based label-free quantitation. An ideal set of mass spectrometry analysis parameters is also provided that optimize the yield of identified sites. We also provide guidelines for the bioinformatic analysis of this type of data to assess the quality of the data and to comply with proteomic data reporting requirements.

  18. What Really Happens in Quantitative Group Research? Results of a Content Analysis of Recent Quantitative Research in "JSGW"

    Science.gov (United States)

    Boyle, Lauren H.; Whittaker, Tiffany A.; Eyal, Maytal; McCarthy, Christopher J.

    2017-01-01

    The authors conducted a content analysis on quantitative studies published in "The Journal for Specialists in Group Work" ("JSGW") between 2012 and 2015. This brief report provides a general overview of the current practices of quantitative group research in counseling. The following study characteristics are reported and…

  19. Multivariate analysis of quantitative traits can effectively classify rapeseed germplasm

    Directory of Open Access Journals (Sweden)

    Jankulovska Mirjana

    2014-01-01

Full Text Available In this study, the use of different multivariate approaches to classify rapeseed genotypes based on quantitative traits is presented. Tree regression analysis, PCA analysis and two-way cluster analysis were applied in order to describe and understand the extent of genetic variability in spring rapeseed genotype-by-trait data. The traits which highly influenced seed and oil yield in rapeseed were successfully identified by the tree regression analysis. The principal predictor for both response variables was number of pods per plant (NP). NP and 1000-seed weight could help in the selection of high-yielding genotypes. High values for both traits together with high oil content could lead to high-oil-yielding genotypes. These traits may serve as indirect selection criteria and can lead to improvement of seed and oil yield in rapeseed. Quantitative traits that explained most of the variability in the studied germplasm were classified using principal component analysis. In this data set, five PCs were identified, out of which the first three PCs explained 63% of the total variance. This facilitated the choice of variables on which the genotypes' clustering could be based. The two-way cluster analysis simultaneously clustered genotypes and quantitative traits. The final number of clusters was determined using a bootstrapping technique. This approach provided a clear overview of the variability of the analyzed genotypes. The genotypes that have similar performance regarding the traits included in this study can be easily detected on the heatmap. Genotypes grouped in clusters 1 and 8 had high values for seed and oil yield and relatively short vegetative growth duration, and those in cluster 9 combined moderate to low values for vegetative growth duration with moderate to high seed and oil yield. These genotypes should be further exploited and implemented in the rapeseed breeding program. The combined application of these multivariate methods...

  20. Mini-Column Ion-Exchange Separation and Atomic Absorption Quantitation of Nickel, Cobalt, and Iron: An Undergraduate Quantitative Analysis Experiment.

    Science.gov (United States)

    Anderson, James L.; And Others

    1980-01-01

    Presents an undergraduate quantitative analysis experiment, describing an atomic absorption quantitation scheme that is fast, sensitive and comparatively simple relative to other titration experiments. (CS)

  1. Segregation Analysis on Genetic System of Quantitative Traits in Plants

    Institute of Scientific and Technical Information of China (English)

    Gai Junyi

    2006-01-01

Based on the traditional polygene inheritance model of quantitative traits, the author suggests the major gene and polygene mixed inheritance model. The model was considered as a general one, while the pure major gene and pure polygene inheritance models were specific cases of the general model. Based on the proposed theory, the author established the segregation analysis procedure to study the genetic system of quantitative traits of plants. At present, this procedure can be used to evaluate the genetic effect of individual major genes (up to two to three major genes), the collective genetic effect of polygene, and their heritability values. This paper introduces how the procedure was established, its main achievements, and its applications. An example is given to illustrate the steps, methods, and effectiveness of the procedure.
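
The statistical core of a major-gene-plus-polygene segregation analysis is a finite mixture model: a major gene splits the population into genotype classes, and polygenic plus environmental effects spread each class out normally. The toy sketch below fits a two-component Gaussian mixture by EM on simulated phenotypes; it is a generic EM illustration of that idea, not Gai's actual procedure, and all parameter values are invented.

```python
import numpy as np

# EM for a two-component Gaussian mixture with a shared variance:
# the two component means play the role of major-gene genotype classes.
def em_two_gaussians(y, iters=200):
    mu = np.array([y.min(), y.max()])      # crude initial means
    sigma, w = y.std(), 0.5
    for _ in range(iters):
        # E-step: posterior probability each observation is from component 2
        d1 = np.exp(-0.5 * ((y - mu[0]) / sigma) ** 2)
        d2 = np.exp(-0.5 * ((y - mu[1]) / sigma) ** 2)
        r = w * d2 / ((1 - w) * d1 + w * d2)
        # M-step: update means, shared variance and mixing weight
        mu = np.array([((1 - r) * y).sum() / (1 - r).sum(),
                       (r * y).sum() / r.sum()])
        sigma = np.sqrt(((1 - r) * (y - mu[0]) ** 2
                         + r * (y - mu[1]) ** 2).sum() / len(y))
        w = r.mean()
    return mu, sigma, w

rng = np.random.default_rng(4)
g = rng.random(300) < 0.5                               # 1:1 major-gene segregation
y = np.where(g, 30.0, 20.0) + rng.normal(0, 2.0, 300)   # polygene/environment noise
mu, sigma, w = em_two_gaussians(y)   # recovers means near 20 and 30
```

The estimated gap between the component means corresponds to the major-gene effect, and the within-component variance to the polygenic plus environmental variance; heritability-style ratios follow from those estimates.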

  2. Analysis of quantitative pore features based on mathematical morphology

    Institute of Scientific and Technical Information of China (English)

    QI Heng-nian; CHEN Feng-nong; WANG Hang-jun

    2008-01-01

Wood identification is a basic technique of wood science and industry. Pore features are among the most important identification features for hardwoods. We have used a method based on the analysis of quantitative pore features, which differs from traditional qualitative methods. We apply mathematical morphology methods such as dilation and erosion, opening and closing transformations of wood cross-sections, image repairing, noise filtering and edge detection to segment the pores from their background. The mean square errors (MSE) of the pores are then computed to describe their distribution. Our experiment shows that, using the MSE of pores, the pore features are easily classified into the same three basic types as in traditional qualitative methods. This quantitative method improves wood identification considerably.
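
The morphological operations named here can be sketched in pure numpy with a 3x3 structuring element (the image below is synthetic and boundary handling uses wrap-around via np.roll; only the operations themselves mirror the abstract).

```python
import numpy as np

# 3x3 binary morphology via array shifts: erosion = all neighbours set,
# dilation = any neighbour set; opening removes specks smaller than the
# structuring element. Note np.roll wraps at the image border.
def shifts(img):
    return np.stack([np.roll(np.roll(img, dy, axis=0), dx, axis=1)
                     for dy in (-1, 0, 1) for dx in (-1, 0, 1)])

def erode(img):
    return shifts(img).all(axis=0)

def dilate(img):
    return shifts(img).any(axis=0)

def opening(img):                   # erosion followed by dilation
    return dilate(erode(img))

img = np.zeros((20, 20), dtype=bool)
img[4:9, 4:9] = True                # one large 5x5 "pore"
img[14, 14] = True                  # single-pixel noise speck
opened = opening(img)               # keeps the pore, removes the speck
```

After segmentation like this, the labelled pore regions can be summarized by spread statistics (the paper's MSE of pore positions) to separate the basic pore-arrangement types.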

  3. Contributions to the Analysis of Spatial and Spatial-Temporal Data

    Energy Technology Data Exchange (ETDEWEB)

    Hoest, G.

    1996-12-31

    This doctoral thesis addresses some problems in the analysis of spatial and spatial-temporal data and discusses prediction, prediction errors and identification of emission sources. European sulphur data are used as illustration. In an investigation of a spatial-temporal decomposition model for improving estimates of spatial interpolation (prediction) errors from monitoring data, the estimates were improved compared to estimates obtained by the method known as Kriging (an extension of the Wiener-Kolmogorov theory from time series to spatial processes), although the interpolated values were quite similar. A study of a random process model with an unknown, slowly varying trend and a correlated residual process is performed, using both trend estimation (smoothing) and prediction. Local polynomial methods are extended to continuous random processes. A new approach to non-parametric smoothing and to non-parametric Kriging is described. Finally, a statistical method for verifying reported sulphur emissions from European countries is presented. The method combines meteorological modeling, prior information on sulphur emissions and measurements of sulphate depositions within a Bayesian framework. 101 refs., 33 figs., 7 tabs.
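    A minimal sketch of the Kriging-style spatial prediction the thesis builds on, assuming a Gaussian covariance model and a plug-in constant mean (simple kriging); the thesis's actual models, including the spatial-temporal decomposition and trend estimation, are considerably richer.

```python
import numpy as np

def simple_kriging(xy_obs, z_obs, xy_new, sill=1.0, length=1.0, nugget=1e-10):
    """Simple kriging with an (assumed) Gaussian covariance model.

    Weights solve C w = c0; the prediction is mu + w . (z - mu) and the
    kriging variance is sill - w . c0 (Wiener-Kolmogorov-style predictor).
    """
    def cov(a, b):
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return sill * np.exp(-d2 / (2 * length**2))

    C = cov(xy_obs, xy_obs) + nugget * np.eye(len(xy_obs))
    c0 = cov(xy_obs, xy_new)
    w = np.linalg.solve(C, c0)                  # kriging weights, (n_obs, n_new)
    mu = z_obs.mean()                           # plug-in constant mean (assumption)
    pred = mu + w.T @ (z_obs - mu)
    var = sill - np.einsum('ij,ij->j', w, c0)   # kriging (prediction) variance
    return pred, var
```

    At an observed location the predictor interpolates exactly and the kriging variance collapses to (nearly) zero, which is the behaviour the thesis's prediction-error estimates refine.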

  4. Three-dimensional modeling and quantitative analysis of gap junction distributions in cardiac tissue.

    Science.gov (United States)

    Lackey, Daniel P; Carruth, Eric D; Lasher, Richard A; Boenisch, Jan; Sachse, Frank B; Hitchcock, Robert W

    2011-11-01

    Gap junctions play a fundamental role in intercellular communication in cardiac tissue. Various types of heart disease including hypertrophy and ischemia are associated with alterations of the spatial arrangement of gap junctions. Previous studies applied two-dimensional optical and electron-microscopy to visualize gap junction arrangements. In normal cardiomyocytes, gap junctions were primarily found at cell ends, but can be found also in more central regions. In this study, we extended these approaches toward three-dimensional reconstruction of gap junction distributions based on high-resolution scanning confocal microscopy and image processing. We developed methods for quantitative characterization of gap junction distributions based on analysis of intensity profiles along the principal axes of myocytes. The analyses characterized gap junction polarization at cell ends and higher-order statistical image moments of intensity profiles. The methodology was tested in rat ventricular myocardium. Our analysis yielded novel quantitative data on gap junction distributions. In particular, the analysis demonstrated that the distributions exhibit significant variability with respect to polarization, skewness, and kurtosis. We suggest that this methodology provides a quantitative alternative to current approaches based on visual inspection, with applications in particular in characterization of engineered and diseased myocardium. Furthermore, we propose that these data provide improved input for computational modeling of cardiac conduction.

  5. Mathematical Analysis of Urban Spatial Networks

    CERN Document Server

    Blanchard, Philippe

    2009-01-01

    Cities can be considered to be among the largest and most complex artificial networks created by human beings. Due to the numerous and diverse human-driven activities, urban network topology and dynamics can differ quite substantially from that of natural networks and so call for an alternative method of analysis. The intent of the present monograph is to lay down the theoretical foundations for studying the topology of compact urban patterns, using methods from spectral graph theory and statistical physics. These methods are demonstrated as tools to investigate the structure of a number of real cities with widely differing properties: medieval German cities, the webs of city canals in Amsterdam and Venice, and a modern urban structure such as found in Manhattan. Last but not least, the book concludes by providing a brief overview of possible applications that will eventually lead to a useful body of knowledge for architects, urban planners and civil engineers.

  6. [Spatial exploratory analysis of road accidents in Ciudad Juarez, Mexico].

    Science.gov (United States)

    Hernández Hernández, Vladimir

    2012-05-01

    To develop a tool for the exploratory study of road accidents in Ciudad Juarez, Chihuahua, Mexico, using only the spatial geographical variable (location). This observational, cross-sectional study uses a Geographic Information System to explore the spatial nature of 13,305 road accidents recorded during 2008 and 2009 in Ciudad Juarez. Indicators approximating the traffic flow were constructed, incorporating two variables: indices of the level of urbanization and population density. The global spatial autocorrelation value was positive, indicating the presence of groupings, which were identified through spatial association indicators. There are road-risk clusters located in areas with a high level of urbanization, low population density, and a high traffic flow. Exploratory analysis of spatial data is a phase that precedes the use of multivariate techniques with a broader scope. The application of exploratory analysis techniques in itself makes it possible to standardize spatial groupings, identify global autocorrelation, and indicate the direction of the variables under study.
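    Global spatial autocorrelation of the kind reported above is commonly measured with Moran's I (positive for clustering, negative for dispersion). The abstract does not name the exact statistic, so this numpy sketch of Moran's I is an assumption about the method.

```python
import numpy as np

def morans_i(values, W):
    """Global Moran's I for spatial autocorrelation.

    I = (n / S0) * (z' W z) / (z' z), where z holds deviations from the
    mean, W is the spatial weights matrix, and S0 is the sum of all
    weights. Values near +1 indicate clustering of similar values;
    values near -1 indicate a checkerboard-like dispersion.
    """
    z = values - values.mean()
    S0 = W.sum()
    n = len(values)
    return (n / S0) * (z @ W @ z) / (z @ z)
```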

  7. Quantitative risk analysis as a basis for emergency planning

    Energy Technology Data Exchange (ETDEWEB)

    Yogui, Regiane Tiemi Teruya [Bureau Veritas do Brasil, Rio de Janeiro, RJ (Brazil); Macedo, Eduardo Soares de [Instituto de Pesquisas Tecnologicas (IPT), Sao Paulo, SP (Brazil)

    2009-07-01

    Several environmental accidents happened in Brazil and around the world during the 1970s and 1980s. This strongly motivated preparation for emergencies in the chemical and petrochemical industries. Environmental accidents affect the environment and the communities neighboring the industrial facilities. The present study aims to support and guide the development of emergency planning based on the data obtained in a Quantitative Risk Analysis elaborated according to Technical Standard P4.261/03 from CETESB (Sao Paulo Environmental Agency). It was observed during the research that the data generated in these studies need complementation and deeper analysis before they can be used in emergency plans. The main issues analyzed and discussed in this study were the re-evaluation of hazard identification for the emergency plans, the consequence and vulnerability analysis for response planning, risk communication, and the preparation of communities exposed to manageable risks to respond to emergencies. As a result, the study intends to improve the interpretation and use of the data deriving from the Quantitative Risk Analysis to develop emergency plans. (author)

  8. Quantitative analysis of in vivo confocal microscopy images: a review.

    Science.gov (United States)

    Patel, Dipika V; McGhee, Charles N

    2013-01-01

    In vivo confocal microscopy (IVCM) is a non-invasive method of examining the living human cornea. The recent trend towards quantitative studies using IVCM has led to the development of a variety of methods for quantifying image parameters. When selecting IVCM images for quantitative analysis, it is important to be consistent regarding the location, depth, and quality of images. All images should be de-identified, randomized, and calibrated prior to analysis. Numerous image analysis software packages are available, each with its own advantages and disadvantages. Criteria for analyzing corneal epithelium, sub-basal nerves, keratocytes, endothelium, and immune/inflammatory cells have been developed, although there is inconsistency among research groups regarding parameter definitions. The quantification of stromal nerve parameters, however, remains a challenge. Most studies report lower inter-observer repeatability compared with intra-observer repeatability, and observer experience is known to be an important factor. Standardization of IVCM image analysis through the use of a reading center would be crucial for any future large, multi-centre clinical trials using IVCM.

  9. Spatial cognition and crime: the study of mental models of spatial relations in crime analysis.

    Science.gov (United States)

    Luini, Lorenzo P; Scorzelli, Marco; Mastroberardino, Serena; Marucci, Francesco S

    2012-08-01

    Several studies have employed different algorithms to investigate criminals' spatial behaviour and to identify the mental models and cognitive strategies related to it. So far, a number of geographic profiling (GP) software packages have been implemented to analyse mobility and its relation to the way criminals use the spatial environment when committing a crime. Since crimes are usually perpetrated in the offender's high-awareness areas, these cognitive maps can be used to map the criminal's operating area and help investigators circumscribe search areas. The aim of the present study was to verify the accuracy of simple statistical analyses in predicting the spatial mobility of a group of 30 non-criminal subjects. Results showed that statistics such as the Mean Centre and Standard Distance were accurate in elaborating a GP for each subject according to the mobility area provided. Future analyses will use mobility information from criminal subjects and location-based software to verify whether there is a cognitive spatial strategy employed when planning and committing a crime.
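    The Mean Centre and Standard Distance statistics mentioned above have standard definitions and are straightforward to compute; a minimal numpy sketch:

```python
import numpy as np

def mean_centre(points):
    """Mean centre: the centroid of the incident coordinates (n x 2)."""
    return points.mean(axis=0)

def standard_distance(points):
    """Standard distance: root-mean-square distance of the points to
    the mean centre, summarizing the spread of the mobility area as a
    circle around that centre."""
    c = mean_centre(points)
    return float(np.sqrt(((points - c) ** 2).sum(axis=1).mean()))
```

    In geographic profiling, the circle of radius `standard_distance` around the mean centre is one simple way to delimit a search area from a set of incident locations.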

  10. Image registration and analysis for quantitative myocardial perfusion: application to dynamic circular cardiac CT

    Energy Technology Data Exchange (ETDEWEB)

    Isola, A A [Philips Research Laboratories, X-ray Imaging Systems Department, Weisshausstrasse 2, D-52066 Aachen (Germany); Schmitt, H; Van Stevendaal, U; Grass, M [Philips Research Laboratories, Sector Digital Imaging, Roentgenstrasse 24-26, D-22335 Hamburg (Germany); Begemann, P G [Department of Radiology, University Hospital Hamburg-Eppendorf, Martinistrasse 52, D-20246 Hamburg (Germany); Coulon, P [Philips Healthcare France, 33 rue de Verdun, F-92150 Suresnes Cedex (France); Boussel, L, E-mail: Alfonso.Isola@Philips.com [Department of Radiology, Louis Pradel Hospital, CREATIS, UMR CNRS 5515, INSERM U630, Lyon (France)

    2011-09-21

    Large area detector computed tomography systems with fast rotating gantries enable volumetric dynamic cardiac perfusion studies. Prospectively, ECG-triggered acquisitions limit the data acquisition to a predefined cardiac phase and thereby reduce x-ray dose and limit motion artefacts. Even in the case of highly accurate prospective triggering and stable heart rate, spatial misalignment of the cardiac volumes acquired and reconstructed per cardiac cycle may occur due to small motion pattern variations from cycle to cycle. These misalignments reduce the accuracy of the quantitative analysis of myocardial perfusion parameters on a per voxel basis. An image-based solution to this problem is elastic 3D image registration of dynamic volume sequences with variable contrast, as it is introduced in this contribution. After circular cone-beam CT reconstruction of cardiac volumes covering large areas of the myocardial tissue, the complete series is aligned with respect to a chosen reference volume. The results of the registration process and the perfusion analysis with and without registration are evaluated quantitatively in this paper. The spatial alignment leads to improved quantification of myocardial perfusion for three different pig data sets.

  11. Image registration and analysis for quantitative myocardial perfusion: application to dynamic circular cardiac CT

    Science.gov (United States)

    Isola, A. A.; Schmitt, H.; van Stevendaal, U.; Begemann, P. G.; Coulon, P.; Boussel, L.; Grass, M.

    2011-09-01

    Large area detector computed tomography systems with fast rotating gantries enable volumetric dynamic cardiac perfusion studies. Prospectively, ECG-triggered acquisitions limit the data acquisition to a predefined cardiac phase and thereby reduce x-ray dose and limit motion artefacts. Even in the case of highly accurate prospective triggering and stable heart rate, spatial misalignment of the cardiac volumes acquired and reconstructed per cardiac cycle may occur due to small motion pattern variations from cycle to cycle. These misalignments reduce the accuracy of the quantitative analysis of myocardial perfusion parameters on a per voxel basis. An image-based solution to this problem is elastic 3D image registration of dynamic volume sequences with variable contrast, as it is introduced in this contribution. After circular cone-beam CT reconstruction of cardiac volumes covering large areas of the myocardial tissue, the complete series is aligned with respect to a chosen reference volume. The results of the registration process and the perfusion analysis with and without registration are evaluated quantitatively in this paper. The spatial alignment leads to improved quantification of myocardial perfusion for three different pig data sets.

  12. Improved SPECT quantitation using fully three-dimensional iterative spatially variant scatter response compensation.

    Science.gov (United States)

    Beekman, F J; Kamphuis, C; Viergever, M A

    1996-01-01

    The quality and quantitative accuracy of iteratively reconstructed SPECT images improves when better point spread function (PSF) models of the gamma camera are used during reconstruction. Here, inclusion in the PSF model of photon crosstalk between different slices caused by limited gamma camera resolution and scatter is examined. A three-dimensional (3-D) projector back-projector (proback) has been developed which models both the distance dependent detector point spread function and the object shape-dependent scatter point spread function of single photon emission computed tomography (SPECT). A table occupying only a few megabytes of memory is sufficient to represent this scatter model. The contents of this table are obtained by evaluating an analytical expression for object shape-dependent scatter. The proposed approach avoids the huge memory requirements of storing the full transition matrix needed for 3-D reconstruction including object shape-dependent scatter. In addition, the method avoids the need for lengthy Monte Carlo simulations to generate such a matrix. In order to assess the quantitative accuracy of the method, reconstructions of a water filled cylinder containing regions of different activity levels and of simulated 3-D brain projection data have been evaluated for technetium-99m. It is shown that fully 3-D reconstruction including complete detector response and object shape-dependent scatter modeling clearly outperforms simpler methods that lack a complete detector response and/or a complete scatter response model. Fully 3-D scatter correction yields the best quantitation of volumes of interest and the best contrast-to-noise curves.

  13. Spatially resolved quantitative mapping of thermomechanical properties and phase transition temperatures using scanning probe microscopy

    Science.gov (United States)

    Jesse, Stephen; Kalinin, Sergei V; Nikiforov, Maxim P

    2013-07-09

    An approach for the thermomechanical characterization of phase transitions in polymeric materials (polyethyleneterephthalate) by band excitation acoustic force microscopy is developed. This methodology allows the independent measurement of resonance frequency, Q factor, and oscillation amplitude of a tip-surface contact area as a function of tip temperature, from which the thermal evolution of tip-surface spring constant and mechanical dissipation can be extracted. A heating protocol maintained a constant tip-surface contact area and constant contact force, thereby allowing for reproducible measurements and quantitative extraction of material properties including temperature dependence of indentation-based elastic and loss moduli.

  14. Quantitative Determination of Spatial Protein-protein Proximity in Fluorescence Confocal Microscopy

    CERN Document Server

    Wu, Yong; Ou, Jimmy; Li, Min; Toro, Ligia; Stefani, Enrico

    2009-01-01

    To quantify spatial protein-protein proximity (colocalization) in fluorescence microscopic images, cross-correlation and autocorrelation functions were decomposed into fast and slowly decaying components. The fast component results from clusters of proteins specifically labeled and the slow one from background/image heterogeneity. We show that the calculation of the protein-protein proximity index and the correlation coefficient are more reliably determined by extracting the fast-decaying component. This new method is illustrated by analyzing colocalization in both simulated and biological images.

  15. Quantitative phosphoproteomic analysis using iTRAQ method.

    Science.gov (United States)

    Asano, Tomoya; Nishiuchi, Takumi

    2014-01-01

    The MAPK (mitogen-activated protein kinase) cascade plays important roles in plant perception of, and reaction to, developmental and environmental cues. Phosphoproteomics is useful for identifying target proteins regulated by a MAPK-dependent signaling pathway. Here, we introduce quantitative phosphoproteomic analysis using a chemical labeling method. The isobaric tag for relative and absolute quantitation (iTRAQ) method is an MS-based technique to quantify protein expression among up to eight different samples in one experiment. In this technique, peptides are labeled with stable isotope-coded covalent tags. We performed quantitative phosphoproteomics comparing Arabidopsis wild type and a stress-responsive mapkk mutant after phytotoxin treatment. To comprehensively identify the downstream phosphoproteins of the MAPKK, total proteins were extracted from phytotoxin-treated wild-type and mapkk mutant plants. The phosphoproteins were purified with the Pro-Q(®) Diamond Phosphoprotein Enrichment Kit and digested with trypsin. The resulting peptides were labeled with iTRAQ reagents and were quantified and identified on a MALDI TOF/TOF analyzer. We identified many phosphoproteins whose levels were decreased in the mapkk mutant compared with wild type.

  16. A quantitative analysis of IRAS maps of molecular clouds

    Science.gov (United States)

    Wiseman, Jennifer J.; Adams, Fred C.

    1994-01-01

    We present an analysis of IRAS maps of five molecular clouds: Orion, Ophiuchus, Perseus, Taurus, and Lupus. For the classification and description of these astrophysical maps, we use a newly developed technique which considers all maps of a given type to be elements of a pseudometric space. For each physical characteristic of interest, this formal system assigns a distance function (a pseudometric) to the space of all maps: this procedure allows us to measure quantitatively the difference between any two maps and to order the space of all maps. We thus obtain a quantitative classification scheme for molecular clouds. In this present study we use the IRAS continuum maps at 100 and 60 micrometer(s) to produce column density (or optical depth) maps for the five molecular cloud regions given above. For this sample of clouds, we compute the 'output' functions which measure the distribution of density, the distribution of topological components, the self-gravity, and the filamentary nature of the clouds. The results of this work provide a quantitative description of the structure in these molecular cloud regions. We then order the clouds according to the overall environmental 'complexity' of these star-forming regions. Finally, we compare our results with the observed populations of young stellar objects in these clouds and discuss the possible environmental effects on the star-formation process. Our results are consistent with the recently stated conjecture that more massive stars tend to form in more 'complex' environments.
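    The pseudometric idea above can be illustrated with a toy example: the L1 distance between the normalized density histograms of two maps is a pseudometric, since two spatially different maps with identical density distributions get distance zero. This is a simple stand-in for illustration only, not the authors' actual output functions.

```python
import numpy as np

def density_pseudometric(map_a, map_b, bins=32):
    """A pseudometric on column-density maps: L1 distance between their
    normalized density histograms. Maps with identical density
    distributions get distance 0 even if they differ spatially, hence
    a pseudometric rather than a metric."""
    lo = min(map_a.min(), map_b.min())
    hi = max(map_a.max(), map_b.max())
    ha, _ = np.histogram(map_a, bins=bins, range=(lo, hi), density=True)
    hb, _ = np.histogram(map_b, bins=bins, range=(lo, hi), density=True)
    return float(np.abs(ha - hb).sum() * (hi - lo) / bins)
```

    With a family of such distance functions, one per physical characteristic, maps can be compared pairwise and ordered, which is the classification strategy the paper formalizes.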

  17. Simulating realistic predator signatures in quantitative fatty acid signature analysis

    Science.gov (United States)

    Bromaghin, Jeffrey F.

    2015-01-01

    Diet estimation is an important field within quantitative ecology, providing critical insights into many aspects of ecology and community dynamics. Quantitative fatty acid signature analysis (QFASA) is a prominent method of diet estimation, particularly for marine mammal and bird species. Investigators using QFASA commonly use computer simulation to evaluate statistical characteristics of diet estimators for the populations they study. Similar computer simulations have been used to explore and compare the performance of different variations of the original QFASA diet estimator. In both cases, computer simulations involve bootstrap sampling prey signature data to construct pseudo-predator signatures with known properties. However, bootstrap sample sizes have been selected arbitrarily and pseudo-predator signatures therefore may not have realistic properties. I develop an algorithm to objectively establish bootstrap sample sizes that generates pseudo-predator signatures with realistic properties, thereby enhancing the utility of computer simulation for assessing QFASA estimator performance. The algorithm also appears to be computationally efficient, resulting in bootstrap sample sizes that are smaller than those commonly used. I illustrate the algorithm with an example using data from Chukchi Sea polar bears (Ursus maritimus) and their marine mammal prey. The concepts underlying the approach may have value in other areas of quantitative ecology in which bootstrap samples are post-processed prior to their use.
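    The pseudo-predator construction described above can be sketched as follows: for each prey type, bootstrap-sample prey signatures, average them, and mix the averages according to a known diet. Here the bootstrap sample sizes are plain inputs; the paper's contribution is an algorithm for choosing them objectively. All names and the data layout are illustrative assumptions.

```python
import numpy as np

def pseudo_predator(prey_sigs, diet, n_boot, rng):
    """Construct one pseudo-predator fatty acid signature.

    prey_sigs: dict mapping prey type -> (n_animals, n_fatty_acids) array
    diet:      dict mapping prey type -> known diet proportion (sums to 1)
    n_boot:    dict mapping prey type -> bootstrap sample size
    """
    sig = 0.0
    for prey, props in prey_sigs.items():
        # Resample prey animals with replacement, then average their signatures.
        idx = rng.integers(0, len(props), size=n_boot[prey])
        sig = sig + diet[prey] * props[idx].mean(axis=0)
    return sig / np.sum(sig)   # renormalize to a proportion vector
```

    Feeding such signatures, whose true diets are known, to a QFASA estimator is how the simulation studies in the paper assess estimator bias and variance.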

  18. Chromatin immunoprecipitation: optimization, quantitative analysis and data normalization

    Directory of Open Access Journals (Sweden)

    Peterhansel Christoph

    2007-09-01

    Full Text Available Abstract Background Chromatin remodeling, histone modifications and other chromatin-related processes play a crucial role in gene regulation. A very useful technique to study these processes is chromatin immunoprecipitation (ChIP). ChIP is widely used for a few model systems, including Arabidopsis, but establishment of the technique for other organisms is still remarkably challenging. Furthermore, quantitative analysis of the precipitated material and normalization of the data are often underestimated, negatively affecting data quality. Results We developed a robust ChIP protocol, using maize (Zea mays) as a model system, and present a general strategy to systematically optimize this protocol for any type of tissue. We propose endogenous controls for active and for repressed chromatin, and discuss various other controls that are essential for successful ChIP experiments. We found that the use of quantitative PCR (QPCR) is crucial for obtaining high-quality ChIP data and we explain why. The method of data normalization has a major impact on the quality of ChIP analyses. Therefore, we analyzed different normalization strategies, resulting in a thorough discussion of the advantages and drawbacks of the various approaches. Conclusion Here we provide a robust ChIP protocol and a strategy to optimize the protocol for any type of tissue; we argue that quantitative real-time PCR (QPCR) is the best method to analyze the precipitates, and present comprehensive insights into data normalization.
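    As a concrete example of the normalization strategies such work compares, one widely used approach is percent-of-input normalization of ChIP-qPCR Ct values. This is a sketch under the usual assumption of a fixed amplification efficiency; it is a common method in the field, not necessarily the authors' preferred one.

```python
import math

def percent_input(ct_ip, ct_input, input_fraction=0.01, efficiency=2.0):
    """Percent-of-input normalization for ChIP-qPCR (sketch).

    The input Ct, measured on a small saved fraction of the chromatin,
    is first adjusted down by log_E(1/fraction) to represent 100% of
    the input; the IP signal is then expressed relative to that.
    """
    adjusted_input = ct_input - math.log(1.0 / input_fraction, efficiency)
    return 100.0 * efficiency ** (adjusted_input - ct_ip)
```

    For example, with a 1% input measured at Ct 20 and an IP sample at Ct 25, the enrichment comes out on the order of a few hundredths of a percent of input, which is why weakly enriched loci demand the careful controls the paper discusses.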

  19. Fluorescent foci quantitation for high-throughput analysis

    Science.gov (United States)

    Ledesma-Fernández, Elena; Thorpe, Peter H.

    2015-01-01

    A number of cellular proteins localize to discrete foci within cells, for example DNA repair proteins, microtubule organizing centers, P bodies or kinetochores. It is often possible to measure the fluorescence emission from tagged proteins within these foci as a surrogate for the concentration of that specific protein. We wished to develop tools that would allow quantitation of fluorescence foci intensities in high-throughput studies. As proof of principle we have examined the kinetochore, a large multi-subunit complex that is critical for the accurate segregation of chromosomes during cell division. Kinetochore perturbations lead to aneuploidy, which is a hallmark of cancer cells. Hence, understanding kinetochore homeostasis and regulation are important for a global understanding of cell division and genome integrity. The 16 budding yeast kinetochores colocalize within the nucleus to form a single focus. Here we have created a set of freely-available tools to allow high-throughput quantitation of kinetochore foci fluorescence. We use this ‘FociQuant’ tool to compare methods of kinetochore quantitation and we show proof of principle that FociQuant can be used to identify changes in kinetochore protein levels in a mutant that affects kinetochore function. This analysis can be applied to any protein that forms discrete foci in cells. PMID:26290880

  20. Quantitative multiphase analysis of archaeological bronzes by neutron diffraction

    CERN Document Server

    Siano, S; Celli, M; Pini, R; Salimbeni, R; Zoppi, M; Kockelmann, W A; Iozzo, M; Miccio, M; Moze, O

    2002-01-01

    In this paper, we report the first investigation on the potentials of neutron diffraction to characterize archaeological bronze artifacts. The preliminary feasibility of phase and structural analysis was demonstrated on standardised specimens with a typical bronze alloy composition. These were realised through different hardening and annealing cycles, simulating possible ancient working techniques. The Bragg peak widths that resulted were strictly dependent on the working treatment, thus providing an important analytical element to investigate ancient making techniques. The diagnostic criteria developed on the standardised specimens were then applied to study two Etruscan museum pieces. Quantitative multiphase analysis by Rietveld refinement of the diffraction patterns was successfully demonstrated. Furthermore, the analysis of patterns associated with different artifact elements also yielded evidence for some peculiar perspective of the neutron diffraction diagnostics in archeometric applications. (orig.)

  1. QUANTITATIVE METHODOLOGY FOR STABILITY ANALYSIS OF NONLINEAR ROTOR SYSTEMS

    Institute of Scientific and Technical Information of China (English)

    ZHENG Hui-ping; XUE Yu-sheng; CHEN Yu-shu

    2005-01-01

    Rotor-bearing systems, which are widely applied in industry, are nonlinear dynamic systems with multiple degrees of freedom. Modern concepts of design and maintenance call for quantitative stability analysis. Using a trajectory-based, stability-preserving dimension reduction, a quantitative stability analysis method for rotor systems is presented. First, an n-dimensional nonlinear non-autonomous rotor system is decoupled into n subsystems after numerical integration. Each of them has only one degree of freedom and contains time-varying parameters to represent all other state variables. In this way, the n-dimensional trajectory is mapped into a set of one-dimensional trajectories. The dynamic central point (DCP) of a subsystem is then defined on the extended phase plane, namely, the force-position plane. Characteristics of curves on the extended phase plane and of the DCP's kinetic energy difference sequence for general motion in rotor systems are studied. The corresponding stability margins of the trajectory are evaluated quantitatively. By means of the margin and its sensitivity analysis, the critical parameters of the period-doubling bifurcation and the Hopf bifurcation in a flexible rotor supported by two short journal bearings with nonlinear suspension are determined.

  2. Quantitative Analysis of Polarimetric Model-Based Decomposition Methods

    Directory of Open Access Journals (Sweden)

    Qinghua Xie

    2016-11-01

    Full Text Available In this paper, we analyze the robustness of the parameter inversion provided by general polarimetric model-based decomposition methods from the perspective of a quantitative application. The general model and algorithm we have studied is the method proposed recently by Chen et al., which makes use of the complete polarimetric information and outperforms traditional decomposition methods in terms of feature extraction from land covers. Nevertheless, a quantitative analysis of the parameters retrieved by that approach suggests that further investigation is required in order to fully confirm the links between a physically-based model (i.e., approaches derived from the Freeman–Durden concept) and its outputs as intermediate products, before any biophysical parameter retrieval is addressed. To this aim, we propose some modifications to the optimization algorithm employed for model inversion, including redefined boundary conditions, transformation of variables, and a different strategy for value initialization. A number of Monte Carlo simulation tests for typical scenarios are carried out and show that the parameter estimation accuracy of the proposed method is significantly increased with respect to the original implementation. Fully polarimetric airborne datasets at L-band acquired by the German Aerospace Center's (DLR's) experimental synthetic aperture radar (E-SAR) system were also used for testing purposes. The results show different qualitative descriptions of the same cover from six different model-based methods. According to the Bragg coefficient ratio (i.e., β), they are prone to provide wrong numerical inversion results, which could prevent any subsequent quantitative characterization of specific areas in the scene. Besides the particular improvements proposed over an existing polarimetric inversion method, this paper is aimed at pointing out the necessity of quantitatively checking the accuracy of model-based PolSAR techniques for a

  3. Quantitative analysis of sideband coupling in photoinduced force microscopy

    Science.gov (United States)

    Jahng, Junghoon; Kim, Bongsu; Lee, Eun Seong; Potma, Eric Olaf

    2016-11-01

    We present a theoretical and experimental analysis of the cantilever motions detected in photoinduced force microscopy (PiFM) using the sideband coupling detection scheme. In sideband coupling, the cantilever dynamics are probed at a combination frequency of a fundamental mechanical eigenmode and the modulation frequency of the laser beam. Using this detection mode, we develop a method for reconstructing the modulated photoinduced force gradient from experimental parameters in a quantitative manner. We show evidence, both theoretically and experimentally, that the sideband coupling detection mode provides PiFM images with superior contrast compared to images obtained when detecting the cantilever motions directly at the laser modulation frequency.

  4. Quantitative and comparative analysis of hyperspectral data fusion performance

    Institute of Scientific and Technical Information of China (English)

    王强; 张晔; 李硕; 沈毅

    2002-01-01

    Hyperspectral data fusion has been a key technique in hyperspectral data processing in recent years. Many fusion methods have been proposed, but little research has been done to evaluate the performance of different data fusion methods. In order to meet this urgent need, quantitative correlation analysis (QCA) is proposed to analyse and compare the performance of different fusion methods directly from the data before and after fusion. Experimental results show that the new method is effective and that the results of the comparison are in agreement with the results of application.
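    The abstract does not spell out the QCA formula, so as an illustrative assumption, a simple correlation-based comparison of each original hyperspectral band with the fused product can look like this:

```python
import numpy as np

def band_correlations(cube, fused):
    """Pearson correlation of each original band with the fused image.

    A simple correlation-based score in the spirit of quantitative
    correlation analysis (the exact QCA metric is not specified in the
    abstract; this is an assumption).
    cube: (bands, H, W) hyperspectral stack; fused: (H, W) fused image.
    """
    f = fused.ravel() - fused.mean()
    out = []
    for band in cube:
        b = band.ravel() - band.mean()
        out.append(float(b @ f / np.sqrt((b @ b) * (f @ f))))
    return out
```

    A fusion method that preserves the spectral content of the original bands would yield consistently high correlations; comparing these scores across methods gives a direct, data-driven ranking.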

  5. A Spatial Lattice Model Applied for Meteorological Visualization and Analysis

    Directory of Open Access Journals (Sweden)

    Mingyue Lu

    2017-03-01

    Full Text Available Meteorological information has obvious spatial-temporal characteristics. Although it is meaningful to employ a geographic information system (GIS) to visualize and analyze meteorological information for better identification and forecasting of weather, so as to reduce losses from meteorological disasters, modeling meteorological information with a GIS is still difficult because meteorological elements generally have no stable shape or clear boundary. To date, there are still few GIS models that can satisfy the requirements of both meteorological visualization and analysis. In this article, a spatial lattice model based on sampling particles is proposed to support both the representation and the analysis of meteorological information. In this model, a spatial sampling particle is regarded as the basic element; it contains the meteorological information, the location where the particle is placed, and the time mark. The location information is generally represented using a point. As these points can be extended to a surface in two dimensions and a voxel in three dimensions, a space occupied by such surfaces or voxels can be represented by spatial sampling particles with their point locations and meteorological information. The full meteorological space can then be represented by arranging numerous particles with their point locations in a certain structure and resolution, i.e., the spatial lattice model, and extended at a higher resolution when necessary. For practical use, the meteorological space is logically classified into three types of spaces, namely the projection surface space, the curved surface space, and the stereoscopic space, and application-oriented spatial lattice models with different organization forms of spatial sampling particles are designed to support the representation, inquiry, and analysis of meteorological information within the three types of spaces. Cases
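    One minimal way to sketch the particle-lattice idea is a grid keyed by discretized location, each cell holding a sampling particle with its values and time mark. The class and field names, the 0.25° step, and the sample values below are illustrative assumptions, not the authors' implementation.

```python
from dataclasses import dataclass

@dataclass
class SampleParticle:
    lon: float
    lat: float
    level: float      # vertical coordinate for the stereoscopic space
    time: str
    values: dict      # e.g. {"temperature": ..., "humidity": ...}

class SpatialLattice:
    """Regular lattice of sampling particles, keyed by discretized location."""
    def __init__(self, lon0, lat0, step):
        self.lon0, self.lat0, self.step = lon0, lat0, step
        self.cells = {}   # (i, j, k) -> SampleParticle

    def key(self, lon, lat, level):
        return (round((lon - self.lon0) / self.step),
                round((lat - self.lat0) / self.step),
                int(level))

    def put(self, p):
        self.cells[self.key(p.lon, p.lat, p.level)] = p

    def query(self, lon, lat, level, field):
        p = self.cells.get(self.key(lon, lat, level))
        return None if p is None else p.values.get(field)

grid = SpatialLattice(lon0=110.0, lat0=30.0, step=0.25)
grid.put(SampleParticle(110.25, 30.5, 0, "2017-03-01T00:00", {"temperature": 12.4}))
print(grid.query(110.26, 30.49, 0, "temperature"))  # nearest cell -> 12.4
```

A query snaps to the nearest lattice cell, so any location inside an occupied cell returns that cell's particle; refining the lattice is just a smaller `step`.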

  6. FRACTAL ANALYSIS APPLIED TO SPATIAL STRUCTURE OF CHINA'S VEGETATION

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Based on fractal theory, the spatial structure of China's vegetation is analyzed quantitatively in this paper. The main conclusions are as follows. 1) Size–frequency relationships of patch area and patch shape index exist objectively for China's vegetation. 2) Perimeter–area relationships also exist objectively for China's vegetation. 3) The fractal dimension of evergreen needleleaf forests on mountains in subtropical and tropical zones is the largest, while that of deciduous broadleaf and evergreen needleleaf mixed forests in the temperate zone is the smallest, reflecting the most complex and the simplest spatial structures, respectively. 4) The fractal dimensions of China's vegetation types tend to decrease from the subtropics towards both sides. 5) The stability of the spatial structure is greatest for deciduous broadleaf and evergreen needleleaf mixed forests in the temperate zone, and smallest for double-cropping rice, double-cropping rice with temperate-like grain, and tropical evergreen economic tree plantations and orchards. 6) The stability of the spatial structure of China's vegetation tends to decrease from the temperate zone towards both sides. These findings are pertinent to understanding the formation, evolution, dynamics and complexity of vegetation ecosystems.
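    The perimeter–area relationship invoked above is commonly written P ∝ A^(D/2), so the boundary fractal dimension D is twice the slope of a log P versus log A regression over many patches. The sketch below recovers a known D from synthetic patches; the dimension, patch counts, and noise level are invented.

```python
import math
import random

def fit_slope(xs, ys):
    """Least-squares slope of ys on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

random.seed(1)
D_true = 1.4   # hypothetical boundary dimension of one vegetation type
areas = [10 ** random.uniform(1, 4) for _ in range(200)]
# Perimeters follow P ~ A**(D/2) with multiplicative log-normal noise.
perims = [3.5 * a ** (D_true / 2) * math.exp(random.gauss(0, 0.05)) for a in areas]

slope = fit_slope([math.log(a) for a in areas], [math.log(p) for p in perims])
D_est = 2 * slope
print(round(D_est, 2))
```

A smooth (e.g. circular) patch boundary gives D near 1; more convoluted boundaries push D towards 2, which is the sense in which a larger fractal dimension signals a more complex spatial structure.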

  7. Geographic information systems, remote sensing, and spatial analysis activities in Texas, 2002-07

    Science.gov (United States)

    Pearson, D.K.; Gary, R.H.; Wilson, Z.D.

    2007-01-01

    Geographic information system (GIS) technology has become an important tool for scientific investigation, resource management, and environmental planning. A GIS is a computer-aided system capable of collecting, storing, analyzing, and displaying spatially referenced digital data. GIS technology is particularly useful when analyzing a wide variety of spatial data such as with remote sensing and spatial analysis. Remote sensing involves collecting remotely sensed data, such as satellite imagery, aerial photography, or radar images, and analyzing the data to gather information or investigate trends about the environment or the Earth's surface. Spatial analysis combines remotely sensed, thematic, statistical, quantitative, and geographical data through overlay, modeling, and other analytical techniques to investigate specific research questions. It is the combination of data formats and analysis techniques that has made GIS an essential tool in scientific investigations. This document presents information about the technical capabilities and project activities of the U.S. Geological Survey (USGS) Texas Water Science Center (TWSC) GIS Workgroup from 2002 through 2007.
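    The overlay analysis mentioned above can be sketched minimally: two co-registered grids are combined cell by cell to answer a research question. The land-cover codes, elevation values, and flood-level criterion below are invented for illustration, not USGS data.

```python
# Toy raster overlay: combine a land-cover grid with an elevation grid to
# flag cells that are both forest and above a hypothetical flood level.
FOREST, WATER, URBAN = 1, 2, 3

landcover = [
    [FOREST, FOREST, URBAN],
    [WATER,  FOREST, URBAN],
    [WATER,  WATER,  FOREST],
]
elevation = [
    [120, 95, 80],
    [60, 110, 85],
    [55, 50, 130],
]

flood_level = 100
flagged = [
    [(landcover[r][c] == FOREST and elevation[r][c] > flood_level)
     for c in range(3)]
    for r in range(3)
]
count = sum(cell for row in flagged for cell in row)
print(count)  # number of forest cells above the flood level
```

Real GIS overlays work the same way at scale, with thematic, remotely sensed, and statistical layers resampled to a common grid before being combined.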

  8. Quantitative chemical analysis of ocular melanosomes in the TEM.

    Science.gov (United States)

    Eibl, O; Schultheiss, S; Blitgen-Heinecke, P; Schraermeyer, U

    2006-01-01

    Melanosomes in retinal tissues of a human, monkey and rat were analyzed by EDX in the TEM. Samples were prepared by ultramicrotomy at different thicknesses, mounted on Al grids and analyzed in a Zeiss 912 TEM equipped with an Omega filter and an EDX detector with ultrathin window. Melanosomes consist of C and O as main components, with mole fractions of about 90 and 3-10 at.%, respectively, and small mole fraction ratios, between 2 and 0.1 at.%, of Na, Mg, K, Si, P, S, Cl and Ca. All elements were measured quantitatively by standardless EDX with high precision. Mole fractions of the transition metals Fe, Cu and Zn were also measured. For Fe a mole fraction ratio of less than 0.1 at.% was found, which gives the melanin its paramagnetic properties; its mole fraction is, however, close to or below the minimum detectable mass fraction of the equipment used. Only in the human eye, and only in the retinal pigment epithelium (RPE), were the mole fractions of Zn (0.1 at.%, or 5000 μg/g) and Cu clearly above the minimum detectable mass fraction. In the rat and monkey eye the mole fraction of Zn was at or below the minimum detectable mass fraction and could not be measured quantitatively. The obtained results yielded the chemical composition of the melanosomes in the choroidal tissue and the retinal pigment epithelium of the three species. The results of the chemical analysis are discussed by means of mole fraction correlation diagrams. Similarities and differences between the species are outlined. Correlation behavior was found to hold across species, e.g. the Ca-O correlation, indicating that Ca is bound to oxygen-rich sites in the melanin. These are the first quantitative analyses of melanosomes by EDX reported so far. The quantitative chemical analysis should open a deeper understanding of the metabolic processes in the eye that are of central importance for the understanding of a large number of eye-related diseases. The chemical analysis also

  9. Spatial regression analysis of traffic crashes in Seoul.

    Science.gov (United States)

    Rhee, Kyoung-Ah; Kim, Joon-Ki; Lee, Young-ihn; Ulfarsson, Gudmundur F

    2016-06-01

    Traffic crashes can be spatially correlated events and the analysis of the distribution of traffic crash frequency requires evaluation of parameters that reflect spatial properties and correlation. Typically this spatial aspect of crash data is not used in everyday practice by planning agencies and this contributes to a gap between research and practice. A database of traffic crashes in Seoul, Korea, in 2010 was developed at the traffic analysis zone (TAZ) level with a number of GIS developed spatial variables. Practical spatial models using available software were estimated. The spatial error model was determined to be better than the spatial lag model and an ordinary least squares baseline regression. A geographically weighted regression model provided useful insights about localization of effects. The results found that an increased length of roads with speed limit below 30 km/h and a higher ratio of residents below age of 15 were correlated with lower traffic crash frequency, while a higher ratio of residents who moved to the TAZ, more vehicle-kilometers traveled, and a greater number of access points with speed limit difference between side roads and mainline above 30 km/h all increased the number of traffic crashes. This suggests, for example, that better control or design for merging lower speed roads with higher speed roads is important. A key result is that the length of bus-only center lanes had the largest effect on increasing traffic crashes. This is important as bus-only center lanes with bus stop islands have been increasingly used to improve transit times. Hence the potential negative safety impacts of such systems need to be studied further and mitigated through improved design of pedestrian access to center bus stop islands.
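    The spatial error model preferred above takes the form y = Xβ + u with u = λWu + ε, i.e. the disturbances themselves are spatially correlated. The small synthetic sketch below (grid, weights, and λ are invented) shows how such a process makes each cell's disturbance resemble the mean of its neighbours, which is what distinguishes it from independent errors.

```python
import random

side = 12
n = side * side
random.seed(3)

def neighbours(i):
    """Rook-contiguity neighbours of cell i on a side x side grid."""
    r, c = divmod(i, side)
    out = []
    if r > 0: out.append(i - side)
    if r < side - 1: out.append(i + side)
    if c > 0: out.append(i - 1)
    if c < side - 1: out.append(i + 1)
    return out

lam = 0.7  # hypothetical spatial error parameter
eps = [random.gauss(0, 1) for _ in range(n)]

# Solve u = lam*W*u + eps by fixed-point iteration (W row-standardised,
# so |lam| < 1 guarantees convergence).
u = eps[:]
for _ in range(200):
    u = [eps[i] + lam * sum(u[j] for j in neighbours(i)) / len(neighbours(i))
         for i in range(n)]

def neigh_corr(z):
    """Correlation between each cell and the mean of its neighbours."""
    m = sum(z) / n
    zn = [sum(z[j] for j in neighbours(i)) / len(neighbours(i)) for i in range(n)]
    mn = sum(zn) / n
    cov = sum((z[i] - m) * (zn[i] - mn) for i in range(n))
    vz = sum((z[i] - m) ** 2 for i in range(n))
    vn = sum((x - mn) ** 2 for x in zn)
    return cov / (vz * vn) ** 0.5

print(round(neigh_corr(u), 2), round(neigh_corr(eps), 2))
```

The spatially filtered disturbances show strong neighbour correlation while the raw iid draws do not; in practice this pattern in OLS residuals is what motivates choosing a spatial error specification over a plain regression.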

  10. Spatial and temporal analysis of mass movement using dendrochronology

    NARCIS (Netherlands)

    Braam, R.R.; Weiss, E.E.J.; Burrough, P.A.

    1987-01-01

    Tree growth and inclination on sloping land are affected by mass movement. Suitable analysis of tree growth and tree form can therefore provide considerable information on mass movement activity. This paper reports a new, automated method for studying the temporal and spatial aspects of mass movement.

  12. Visuo-Spatial Performance in Autism: A Meta-Analysis

    Science.gov (United States)

    Muth, Anne; Hönekopp, Johannes; Falter, Christine M.

    2014-01-01

    Visuo-spatial skills are believed to be enhanced in autism spectrum disorders (ASDs). This meta-analysis tests the current state of evidence for Figure Disembedding, Block Design, Mental Rotation and Navon tasks in ASD and neurotypicals. Block Design (d = 0.32) and Figure Disembedding (d = 0.26) showed superior performance for ASD with large…

  13. Tuberculosis and living conditions in Salvador, Brazil: a spatial analysis.

    Science.gov (United States)

    Erazo, Carlos; Pereira, Susan M; Costa, Maria da Conceição N; Evangelista-Filho, Delsuc; Braga, José Ueleres; Barreto, Mauricio L

    2014-07-01

    To investigate spatial tuberculosis (TB) distribution patterns and the association between living conditions and incidence of the disease in Salvador, Bahia, Brazil. An ecological study with neighborhood as the unit of analysis. Data was collected from the Notifiable Diseases Information System (Sistema de Informação de Agravos de Notificação, SINAN) and the Brazilian Institute of Geography and Statistics (Instituto Brasileiro de Geografia e Estatística, IBGE). Rates of TB incidence were transformed and smoothed. Spatial analysis was applied to identify spatial auto-correlation and "hotspot" areas of high and low risk. The relationship between TB and living conditions was confirmed by spatial linear regression. The incidence of TB in Salvador displayed heterogeneous patterns, with higher rates occurring in neighborhoods with poor living conditions in 1995 - 1996. Over the study period, disease occurrence declined, particularly in less-privileged strata. In 2004 - 2005, the association between living conditions and TB was no longer observed. The heterogeneous spatial distribution of TB in Salvador previously reflected inequalities related to living conditions. Improvements in such conditions and health care for the less privileged may have contributed to observed changes.

  14. Quantitative analysis of intermolecular interactions in orthorhombic rubrene

    Directory of Open Access Journals (Sweden)

    Venkatesha R. Hathwar

    2015-09-01

    Full Text Available Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility, which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first-principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K, obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and the energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ...Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI) analysis and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H—H interactions. The electron density features of H—H bonding and the interaction energy of molecular dimers connected by H—H interactions clearly demonstrate the importance of these weak interactions in the stabilization of the crystal structure. The quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K, suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations.

  15. Quantitative analysis on electrooculography (EOG) for neurodegenerative disease

    Science.gov (United States)

    Liu, Chang-Chia; Chaovalitwongse, W. Art; Pardalos, Panos M.; Seref, Onur; Xanthopoulos, Petros; Sackellares, J. C.; Skidmore, Frank M.

    2007-11-01

    Many studies have documented abnormal horizontal and vertical eye movements in human neurodegenerative disease, as well as during altered states of consciousness (including drowsiness and intoxication) in healthy adults. Eye movement measurement may therefore play an important role in measuring the progress of neurodegenerative diseases and the state of alertness in healthy individuals. Several techniques are available for measuring eye movement: infrared detection (IR), video-oculography (VOG), the scleral eye coil, and EOG. Among these, EOG is a major source for monitoring abnormal eye movement. In this real-time quantitative analysis study, methods that capture the characteristics of the eye movement were proposed to accurately categorize the state of neurodegenerative subjects. The EOG recordings were taken while 5 subjects watched a short (>120 s) animation clip. In response to the animated clip the participants executed a number of eye movements, including vertical smooth pursuit (SVP), horizontal smooth pursuit (HVP) and random saccades (RS). Detection of abnormalities in ocular movement may improve our diagnosis and understanding of neurodegenerative disease and altered states of consciousness. A standard real-time quantitative analysis will improve detection and provide a better understanding of the pathology of these disorders.
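    A common first step in quantitative eye-movement analysis, shown here as a generic sketch rather than the authors' pipeline, is velocity-threshold saccade detection on the position trace: samples whose velocity exceeds a threshold are flagged as saccadic. The trace, sampling rate, and threshold below are synthetic.

```python
fs = 250.0  # assumed sampling rate, Hz

# Synthetic horizontal EOG trace: fixation, a rapid 10-unit saccade, fixation.
trace = [0.0] * 100 + [i * 2.5 for i in range(1, 5)] + [10.0] * 100

# Finite-difference velocity in units per second.
velocity = [(trace[i + 1] - trace[i]) * fs for i in range(len(trace) - 1)]
threshold = 300.0  # units/s; tuned per recording in practice

saccade_samples = [i for i, v in enumerate(velocity) if abs(v) > threshold]
print(saccade_samples)
```

Smooth pursuit produces sustained sub-threshold velocity while saccades produce brief supra-threshold bursts, so the same trace can be segmented into the pursuit and saccade events the study analyses.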

  16. Quantitative analysis in outcome assessment of instrumented lumbosacral arthrodesis.

    Science.gov (United States)

    Champain, Sabina; Mazel, Christian; Mitulescu, Anca; Skalli, Wafa

    2007-08-01

    The outcome assessment in instrumented lumbosacral fusion mostly focuses on clinical criteria, complications and scores, with a high variability of imaging means, methods of fusion grading and parameters describing degenerative changes, making comparisons between studies difficult. The aim of this retrospective evaluation was to assess the value of quantified radiographic analysis of the lumbar spine in global outcome assessment and to highlight the key biomechanical factors involved. Clinical data and Beaujon-Lassale scores were collected for 49 patients who underwent lumbosacral arthrodesis after prior lumbar discectomy (mean follow-up: 5 years). Sagittal standing and lumbar flexion-extension X-ray films allowed quantification of vertebral, lumbar, pelvic and kinematic parameters of the lumbar spine, which were compared to reference values. Statistics were performed to assess the evolution of all variables. At long-term follow-up, 90% of patients presented satisfactory clinical outcomes, associated with normal sagittal alignment; vertebral parameters objectified adjacent-level degeneration in four cases (8%). Clinical outcome was correlated (r = 0.8) with fusion, which was confirmed in 80% of cases and doubtful in 16%, while pseudarthrosis seemed to occur in 4% (2 cases). In addition to clinical data (outcomes comparable to the literature), quantitative analysis accurately described lumbar spine geometry and kinematics, highlighting parameters related to adjacent-level degeneration and a significant correlation between clinical outcome and fusion. Furthermore, the criteria proposed to quantitatively evaluate fusion from lumbar dynamic radiographs seem appropriate and agree with the surgeon's qualitative grading in 87% of cases.

  17. Quantitative neuroanatomy of all Purkinje cells with light sheet microscopy and high-throughput image analysis

    Directory of Open Access Journals (Sweden)

    Ludovico Silvestri

    2015-05-01

    Full Text Available Characterizing the cytoarchitecture of the mammalian central nervous system on a brain-wide scale is becoming a compelling need in neuroscience. For example, realistic modeling of brain activity requires the definition of quantitative features of large neuronal populations in the whole brain. Quantitative anatomical maps will also be crucial to classify cytoarchitectonic abnormalities associated with neuronal pathologies in a highly reproducible and reliable manner. In this paper, we apply recent advances in optical microscopy and image analysis to characterize the spatial distribution of Purkinje cells across the whole cerebellum. Light sheet microscopy was used to image with micron-scale resolution a fixed and cleared cerebellum of an L7-GFP transgenic mouse, in which all Purkinje cells are fluorescently labeled. A fast and scalable algorithm for fully automated cell identification was applied to the image to extract the position of every fluorescent Purkinje cell. This vectorized representation of the cell population allows a thorough characterization of the complex three-dimensional distribution of the neurons, highlighting the presence of gaps inside the lamellar organization of Purkinje cells, whose density is believed to play a significant role in autism spectrum disorders. Furthermore, clustering analysis of the localized somata permits dividing the whole cerebellum into groups of Purkinje cells with high spatial correlation, suggesting new possibilities of anatomical partition. The quantitative approach presented here can be extended to study the distribution of different types of cells in many brain regions and across the whole encephalon, providing a robust base for building realistic computational models of the brain, and for unbiased morphological tissue screening in the presence of pathologies and/or drug treatments.
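    Once somata are reduced to 3D coordinates, grouping them by spatial proximity can be sketched with single-linkage clustering: any two cells closer than a cutoff distance are linked, and connected components form the groups. The paper does not specify its clustering method, so the union-find approach, the synthetic cell blobs, and the 40-unit cutoff below are all illustrative assumptions.

```python
import math
import random

random.seed(7)

def blob(cx, cy, cz, k):
    """k synthetic cell positions scattered around a centre point."""
    return [(cx + random.gauss(0, 5), cy + random.gauss(0, 5),
             cz + random.gauss(0, 5)) for _ in range(k)]

cells = blob(0, 0, 0, 30) + blob(200, 50, 0, 30) + blob(0, 300, 100, 30)

# Union-find over cells; link pairs closer than the cutoff distance.
parent = list(range(len(cells)))
def find(i):
    while parent[i] != i:
        parent[i] = parent[parent[i]]  # path halving
        i = parent[i]
    return i

cutoff = 40.0
for i in range(len(cells)):
    for j in range(i + 1, len(cells)):
        if math.dist(cells[i], cells[j]) < cutoff:
            parent[find(i)] = find(j)

n_groups = len({find(i) for i in range(len(cells))})
print(n_groups)
```

On real whole-cerebellum data the pairwise loop would be replaced by a spatial index, but the grouping logic is the same: spatially correlated Purkinje cells end up in one component.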

  18. Analysis of forest fires spatial clustering using local fractal measure

    Science.gov (United States)

    Kanevski, Mikhail; Rochat, Mikael; Timonin, Vadim

    2013-04-01

    The research deals with an application of a local fractal measure, local sandbox (mass) counting, for the characterization of patterns of spatial clustering. The main application concerns simulated data (random patterns within a validity domain in forest regions) and real data (forest fires in Ticino, Switzerland). The global patterns of spatial clustering of forest fires were extensively studied using different topological (nearest neighbours, Voronoi polygons), statistical (Ripley's k-function, Morisita diagram) and fractal/multifractal measures (box counting, sandbox counting, lacunarity) (Kanevski, 2008). Generalizations of these measures to functional ones can reveal the structure of the phenomena, e.g. burned areas. All these measures are valuable and complementary tools to study spatial clustering. Moreover, application of the validity domain concept (the complex domain where the phenomenon is studied) helps in understanding and interpreting the results. In the present paper the sandbox counting method was applied locally, i.e. each point of ignition was considered as a centre of event counting with an increasing search radius. Then, the local relationships between the radius and the number of ignition points within the given radius were examined. Finally, the results are mapped using an interpolation algorithm for visualization and analytical purposes. Both 2d (X,Y) and 3d (X,Y,Z) cases were studied and compared. The local "fractal" study gives an interesting spatially distributed picture of clustering. The real data case study was compared with a reference homogeneous pattern - complete spatial randomness. The difference between the two patterns clearly indicates the regions with important spatial clustering. An extension to the local functional measure was applied taking into account the surface of the burned area, i.e. by analysing only data with fires above some threshold of burned area. Such analysis is similar to marked point processes and
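    The local sandbox counting step described above can be sketched directly: around a chosen event, count events within growing radii and take the slope of log N(r) versus log r as the local scaling exponent. The point pattern, radii, and regression choice below are illustrative, not the authors' exact implementation.

```python
import math
import random

random.seed(11)
# Synthetic homogeneous pattern in the unit square (stand-in for ignition points).
pts = [(random.random(), random.random()) for _ in range(2000)]

def local_dimension(centre, points, radii):
    """Slope of log N(r) vs log r around one centre (local sandbox count)."""
    logs = []
    for r in radii:
        n = sum(1 for p in points if math.dist(centre, p) <= r)
        logs.append((math.log(r), math.log(n)))
    mx = sum(x for x, _ in logs) / len(logs)
    my = sum(y for _, y in logs) / len(logs)
    num = sum((x - mx) * (y - my) for x, y in logs)
    den = sum((x - mx) ** 2 for x, _ in logs)
    return num / den

radii = [0.05, 0.1, 0.15, 0.2]
# For a homogeneous 2D pattern, N(r) ~ r**2, so the local slope is near 2.
d = local_dimension((0.5, 0.5), pts, radii)
print(round(d, 1))
```

Clustered regions give slopes well below the embedding dimension, so mapping this exponent over all events yields the spatially distributed clustering picture the abstract describes; the validity-domain caveat applies because radii that extend outside the study region bias the counts.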

  19. Quantitative analysis of agricultural land use change in China

    Science.gov (United States)

    Chou, Jieming; Dong, Wenjie; Wang, Shuyu; Fu, Yuqing

    This article reviews the potential impacts of climate change on land use change in China. Crop sown area is used as an index to quantitatively analyze the temporal-spatial changes and the utilization of agricultural land. A new concept, the potential multiple cropping index, is defined to reflect the potential sowing ability. The impacting mechanism, land use status and its surplus capacity are investigated as well. The main conclusions are as follows. During 1949-2010, agricultural land was greatest in amount in the middle of China, followed by the country's eastern and western regions. The most rapid increase and decrease of agricultural land were observed in Xinjiang and North China respectively; Northwest China and South China also changed rapidly. The variation trend before 1980 differed significantly from that after 1980. Agricultural land was affected by both natural and social factors, such as regional climate and environmental changes, population growth, economic development, and implementation of policies. In this paper, the effects of temperature and urbanization on the coverage of agricultural land are evaluated, and the results show that urbanization greatly affects the amount of agricultural land in South China, Northeast China, Xinjiang and Southwest China. From 1980 to 2009, the extent of agricultural land use increased as the surplus capacity decreased. Still, a large remaining potential space is available, but the future utilization of agricultural land should be carried out with scientific planning and management for sustainable development.

  20. Quantitative Analysis of the Interdisciplinarity of Applied Mathematics.

    Science.gov (United States)

    Xie, Zheng; Duan, Xiaojun; Ouyang, Zhenzheng; Zhang, Pengyuan

    2015-01-01

    The increasing use of mathematical techniques in scientific research leads to the interdisciplinarity of applied mathematics. This viewpoint is validated quantitatively here by statistical and network analysis on the corpus PNAS 1999-2013. A network describing the interdisciplinary relationships between disciplines in a panoramic view is built based on the corpus. Specific network indicators show the hub role of applied mathematics in interdisciplinary research. The statistical analysis on the corpus content finds that algorithms, a primary topic of applied mathematics, positively correlates, increasingly co-occurs, and has an equilibrium relationship in the long-run with certain typical research paradigms and methodologies. The finding can be understood as an intrinsic cause of the interdisciplinarity of applied mathematics.
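    The hub measurement described above can be illustrated on a toy co-classification network: disciplines that co-occur on a paper are linked, and centrality indicators then mark the hub. The papers, discipline names, and use of plain degree centrality below are invented for illustration, not the PNAS corpus or the authors' exact indicators.

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical papers, each tagged with the disciplines it spans.
papers = [
    {"applied math", "biology"},
    {"applied math", "physics"},
    {"applied math", "economics"},
    {"applied math", "computer science", "biology"},
    {"biology", "chemistry"},
    {"physics", "chemistry"},
]

# Build the co-classification network: an edge per discipline pair per paper.
edges = defaultdict(int)
for p in papers:
    for a, b in combinations(sorted(p), 2):
        edges[(a, b)] += 1

# Degree centrality: how many distinct disciplines each one links to.
degree = defaultdict(set)
for a, b in edges:
    degree[a].add(b)
    degree[b].add(a)

hub = max(degree, key=lambda d: len(degree[d]))
print(hub, len(degree[hub]))
```

In this toy network "applied math" touches the most other disciplines, which is the panoramic-view sense in which a discipline acts as an interdisciplinary hub.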

  1. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    Science.gov (United States)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  2. Quantitative Phase Analysis by the Rietveld Method for Forensic Science.

    Science.gov (United States)

    Deng, Fei; Lin, Xiaodong; He, Yonghong; Li, Shu; Zi, Run; Lai, Shijun

    2015-07-01

    Quantitative phase analysis (QPA) is helpful to determine the type attribute of the object because it could present the content of the constituents. QPA by Rietveld method requires neither measurement of calibration data nor the use of an internal standard; however, the approximate crystal structure of each phase in a mixture is necessary. In this study, 8 synthetic mixtures composed of potassium nitrate and sulfur were analyzed by Rietveld QPA method. The Rietveld refinement was accomplished with a material analysis using diffraction program and evaluated by three agreement indices. Results showed that Rietveld QPA yielded precise results, with errors generally less than 2.0% absolute. In addition, a criminal case which was broken successfully with the help of Rietveld QPA method was also introduced. This method will allow forensic investigators to acquire detailed information of the material evidence, which could point out the direction for case detection and court proceedings.

  3. [Quantitative analysis of butachlor, oxadiazon and simetryn by gas chromatography].

    Science.gov (United States)

    Liu, F; Mu, W; Wang, J

    1999-03-01

    The quantitative analysis of the ingredients in 26% B-O-S (butachlor, oxadiazon and simetryn) emulsion by gas chromatography was carried out with a 5% SE-30 on Chromosorb AW DMCS, 2 m x 3 mm i.d., glass column at a column temperature of 210 degrees C and a detector temperature of 230 degrees C. The internal standard is di-n-butyl sebacate. The retention times of simetryn, the internal standard, butachlor and oxadiazon were 6.5, 8.3, 9.9 and 11.9 min respectively. This method has a recovery of 98.62%-100.77%, and the coefficients of variation for butachlor, oxadiazon and simetryn were 0.46%, 0.32% and 0.57% respectively. All coefficients of linear correlation were higher than 0.999.

  5. [Quantitative analysis of transformer oil dissolved gases using FTIR].

    Science.gov (United States)

    Zhao, An-xin; Tang, Xiao-jun; Wang, Er-zhen; Zhang, Zhong-hua; Liu, Jun-hua

    2013-09-01

    Chromatography used for on-line monitoring of transformer dissolved gases has the drawbacks of requiring a carrier gas and regular calibration, and of low safety. We therefore attempted to establish a dissolved gas analysis (DGA) system based on Fourier transform infrared (FTIR) spectroscopy. Taking into account the small amounts of the characteristic gases, the number of components, the detection limits and safety requirements, and the difficulty for the degasser of eliminating interfering gases, a quantitative analysis model was established based on sparse partial least squares, piecewise section correction and a feature variable extraction algorithm using improved TR regularization. For the characteristic gases CH4, C2H6 and CO2, the results show that FTIR meets DGA requirements with a spectral wavenumber resolution of 1 cm(-1) and an optical path of 10 cm.

  6. Geospatial analysis platform: Supporting strategic spatial analysis and planning

    CSIR Research Space (South Africa)

    Naude, A

    2008-11-01

    Full Text Available of spatially incompatible statistical area boundaries (e.g. administrative boundaries differing from river catchment management boundaries), it is argued that there is also a need to move from the prevailing “container“ approach to a much more relational...

  7. Quantitative morphometric analysis for the tectonic characterisation of northern Tunisia.

    Science.gov (United States)

    Camafort, Miquel; Pérez-Peña, José Vicente; Booth-Rea, Guillermo; Ranero, César R.; Gràcia, Eulàlia; Azañón, José Miguel; Melki, Fetheddine; Ouadday, Mohamed

    2016-04-01

    Northern Tunisia is characterized by low deformation rates and low to moderate seismicity. Although instrumental seismicity reaches maximum magnitudes of Mw 5.5, some historical earthquakes have occurred with catastrophic consequences in this region. Aiming to improve our knowledge of active tectonics in Tunisia, we carried out both a quantitative morphometric analysis and a field study in the north-western region. We applied different morphometric tools, such as river profiles, knickpoint analysis, hypsometric curves and integrals, and drainage pattern anomalies, in order to differentiate between zones with high and low recent tectonic activity. This analysis helps identify uplift and subsidence zones, which we relate to fault activity. Several active faults in a sparse distribution were identified. A selected sector was studied with a field campaign to test the results obtained with the quantitative analysis. During the fieldwork we identified geological evidence of recent activity and a considerable seismogenic potential along the El Alia-Teboursouk (ETF) and Dkhila (DF) faults. The ETF fault could be responsible for one of the most devastating historical earthquakes in northern Tunisia, which destroyed Utique in 412 A.D. Geological evidence includes fluvial terraces folded by faults, striated and cracked pebbles, clastic dikes, sand volcanoes, coseismic cracks, etc. Although not reflected in the instrumental seismicity, our results support an important seismic hazard, evidenced by the several active tectonic structures identified and the two seismogenic faults described. After obtaining the current active tectonic framework of Tunisia, we discuss our results within the western Mediterranean context, trying to contribute to the understanding of the western Mediterranean tectonic setting. With our results, we suggest that the main reason explaining the sparse and scarce seismicity of the area, in contrast with the adjacent parts of the Nubia-Eurasia boundary, is due to its extended

  8. Image Chunking: Defining Spatial Building Blocks for Scene Analysis.

    Science.gov (United States)

    1987-04-01

    Technical Report 980: Image Chunking: Defining Spatial Building Blocks for Scene Analysis. James V. Mahoney, MIT Artificial Intelligence Laboratory, Cambridge, MA. Contracts DACA76-85-C-0010 and N00014-85-K-0124. [Scanned report front matter; the remainder of the cover page is illegible.]

  9. Quantitative analysis of the influence of a new ecological paradigm: spatial autocorrelation - DOI: 10.4025/actascibiolsci.v25i1.2113

    Directory of Open Access Journals (Sweden)

    Luis Mauricio Bini

    2003-04-01

    Full Text Available The aim of this paper was to evaluate the influence of spatial autocorrelation (the absence of statistical independence among observations gathered across geographical space) in ecological studies. For this task, an evaluation of the studies that used spatial autocorrelation analysis was carried out using data furnished by the Institute for Scientific Information. There is a positive temporal trend in the number of studies that used spatial autocorrelation analysis, and significant autocorrelation was detected in most of them. Moreover, scientists of several nationalities carried out these studies in different countries, with different groups of organisms and in different types of ecosystems. It is therefore possible to consider the explicit recognition of the spatial structure of natural processes, through spatial autocorrelation analysis, a new paradigm of ecological studies.
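
The spatial autocorrelation these studies quantify is most commonly measured with Moran's I. A minimal sketch (not from the paper; the values and the binary contiguity weights are illustrative):

```python
def morans_i(values, weights):
    """Moran's I for a list of values and a dict {(i, j): w_ij} of
    spatial weights: (n / W) * sum_ij w_ij d_i d_j / sum_i d_i^2,
    where d_i are deviations from the mean."""
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    num = sum(w * dev[i] * dev[j] for (i, j), w in weights.items())
    den = sum(d * d for d in dev)
    w_sum = sum(weights.values())
    return (n / w_sum) * (num / den)

# Toy example: a smooth gradient along a chain of 5 sites, with
# symmetric binary weights between neighbouring sites.
vals = [1, 2, 3, 4, 5]
w = {}
for i in range(4):
    w[(i, i + 1)] = 1.0
    w[(i + 1, i)] = 1.0
print(morans_i(vals, w))  # ≈ 0.5: positive spatial autocorrelation
```

An alternating pattern such as [1, -1, 1, -1, 1] on the same chain drives I toward -1, the signature of negative autocorrelation.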

  10. Glioblastoma multiforme: exploratory radiogenomic analysis by using quantitative image features.

    Science.gov (United States)

    Gevaert, Olivier; Mitchell, Lex A; Achrol, Achal S; Xu, Jiajing; Echegaray, Sebastian; Steinberg, Gary K; Cheshier, Samuel H; Napel, Sandy; Zaharchuk, Greg; Plevritis, Sylvia K

    2014-10-01

    To derive quantitative image features from magnetic resonance (MR) images that characterize the radiographic phenotype of glioblastoma multiforme (GBM) lesions and to create radiogenomic maps associating these features with various molecular data. Clinical, molecular, and MR imaging data for GBMs in 55 patients were obtained from the Cancer Genome Atlas and the Cancer Imaging Archive after local ethics committee and institutional review board approval. Regions of interest (ROIs) corresponding to enhancing and necrotic portions of the tumor and to peritumoral edema were drawn, and quantitative image features were derived from these ROIs. Robust quantitative image features were defined on the basis of an intraclass correlation coefficient of 0.6 for a digital algorithmic modification and a test-retest analysis. The robust features were visualized by using hierarchical clustering and were correlated with survival by using Cox proportional hazards modeling. Next, these robust image features were correlated with manual radiologist annotations from the Visually Accessible Rembrandt Images (VASARI) feature set and with GBM molecular subgroups by using nonparametric statistical tests. A bioinformatic algorithm was used to create gene expression modules, defined as sets of coexpressed genes together with a multivariate model of cancer driver genes predictive of the module's expression pattern. Modules were correlated with robust image features by using the Spearman correlation test to create radiogenomic maps and to link robust image features with molecular pathways. Eighteen image features passed the robustness analysis and were further analyzed for the three types of ROIs, for a total of 54 image features. Three enhancement features were significantly correlated with survival, 77 significant correlations were found between robust quantitative features and the VASARI feature set, and seven image features were correlated with molecular subgroups (P < .05 for all). A radiogenomics map was

  11. Enhancing contrast and quantitation by spatial frequency domain fluorescence molecular imaging

    Science.gov (United States)

    Sun, Jessica; Hathi, Deep; Zhou, Haiying; Shokeen, Monica; Akers, Walter J.

    2016-03-01

    Optical imaging with fluorescent contrast agents is highly sensitive for molecular imaging but is limited in depth to a few centimeters below the skin. Planar fluorescence imaging with full-field, uniform illumination and scientific camera image capture provides a portable and robust configuration for real-time, sensitive fluorescence detection with scalable resolution, but is inherently surface weighted and therefore limited in depth to a few millimeters. In the NIR region (700-1000 nm), tissue absorption and autofluorescence are relatively reduced, increasing depth penetration and reducing background signal, respectively. Optical imaging resolution scales with depth, limiting microscopic resolution with multiphoton microscopy and optical coherence tomography to superficial tissue. Skin and peri-tumoral tissues are not uniform, varying in thickness and color, which complicates subsurface fluorescence measurements. Diffuse optical imaging methods have been developed that better quantify optical signals relative to faster full-field planar reflectance imaging, but they require long scan times, complex instrumentation, and reconstruction algorithms. Here we report a novel strategy for rapid measurement of subsurface fluorescence using structured light illumination to improve quantitation of deep-seated fluorescence molecular probe accumulation. This technique, in combination with highly specific, tumor-avid fluorescent molecular probes, will readily integrate into noninvasive diagnostics for superficial cancers and fluorescence-guided surgery.

  12. IMACULAT - an open access package for the quantitative analysis of chromosome localization in the nucleus.

    Directory of Open Access Journals (Sweden)

    Ishita Mehta

    Full Text Available The alteration in the location of chromosomes within the nucleus upon the action of internal or external stimuli has been implicated in altering genome function. The effect of stimuli at a whole-genome level is studied by using two-dimensional fluorescence in situ hybridization (FISH) to delineate whole chromosome territories within a cell nucleus, followed by a quantitative analysis of the spatial distribution of the chromosome. However, to the best of our knowledge, open access software capable of quantifying the spatial distribution of whole chromosomes within the cell nucleus is not available. In the current work, we present a software package that computes the localization of whole chromosomes - Image Analysis of Chromosomes for computing localization (IMACULAT). We partition the nucleus into concentric elliptical compartments of equal area, and the variance in the quantity of any chromosome across these shells is used to determine its localization in the nucleus. The images are pre-processed to remove smudges outside the cell boundary. Automation allows high-throughput analysis for deriving statistics. Proliferating normal human dermal fibroblasts were subjected to a standard two-dimensional FISH to delineate territories for all human chromosomes. Approximately 100 images from each chromosome were analyzed using IMACULAT. The analysis corroborated that these chromosome territories have a non-random, gene-density-based organization within the interphase nuclei of human fibroblasts. The ImageMagick Perl API has been used for pre-processing the images. The source code is made available at www.sanchak.com/imaculat.html.
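
The equal-area partition described above follows from elementary geometry: the area inside a concentric ellipse scaled by a factor s is πabs², so shell boundaries at s_k = √(k/n) yield n compartments of equal area. A hedged sketch of the idea (the shell count and coordinates are illustrative, not IMACULAT's actual code):

```python
import math

def shell_index(x, y, a, b, n_shells):
    """Assign a point inside an ellipse (semi-axes a, b, centered at the
    origin) to one of n_shells concentric, equal-area compartments."""
    r2 = (x / a) ** 2 + (y / b) ** 2   # normalized squared "radius", in [0, 1]
    # Shell k spans r2 in [k/n, (k+1)/n): equal area because the area
    # inside scale factor s is pi*a*b*s^2, i.e. linear in s^2.
    return min(int(r2 * n_shells), n_shells - 1)

# Outer boundary of shell k sits at scale factor sqrt((k+1)/n):
n = 5
boundaries = [math.sqrt((k + 1) / n) for k in range(n)]
print(boundaries)
```

The variance of a chromosome's signal across such shells then distinguishes peripheral from central localization.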

  13. Multiparent intercross populations in analysis of quantitative traits

    Indian Academy of Sciences (India)

    Sujay Rakshit; Arunita Rakshit; J. V. Patil

    2011-04-01

    Most traits of interest to medical, agricultural and animal scientists show continuous variation and a complex mode of inheritance. DNA-based markers are being deployed to analyse such complex traits, known as quantitative trait loci (QTL). In conventional QTL analysis, F2 and backcross populations, recombinant inbred lines, backcross inbred lines and double haploids from biparental crosses are commonly used. Introgression lines and near-isogenic lines are also being used for QTL analysis. However, such populations have major limitations: they rely predominantly on the recombination events taking place in the F1 generation and map only the allelic pairs present in the two parents. Second-generation mapping resources, like association mapping, nested association mapping and multiparent intercross populations, potentially address these limitations. The potential of multiparent intercross populations in gene mapping is discussed here. In such populations, both linkage and association analysis can be conducted without encountering the limitations of structured populations; larger genetic variation in the germplasm is accessed, and various allelic and cytoplasmic interactions are assessed. For all practical purposes, across crop species, the use of eight founders and a fixed population of 1000 individuals is most appropriate. The limitations of multiparent intercross populations are that they require more time and resources to generate, and they are likely to show extensive segregation for developmental traits, limiting their use in the analysis of complex traits. Nevertheless, multiparent intercross population resources are likely to bring a paradigm shift in QTL analysis in plant species.

  14. Phenotypic analysis of Arabidopsis mutants: quantitative analysis of root growth.

    Science.gov (United States)

    Doerner, Peter

    2008-03-01

    INTRODUCTION: The growth of plant roots is very easy to measure and is particularly straightforward in Arabidopsis thaliana, because the increase in organ size is essentially restricted to one dimension. The precise measurement of root apical growth can be used to accurately determine growth activity (the rate of growth at a given time) during development in mutants, transgenic backgrounds, or in response to experimental treatments. Root growth is measured in a number of ways, the simplest of which is to grow the seedlings in a Petri dish and record the position of the advancing root tip at appropriate time points. The increase in root length is measured with a ruler and the data are entered into Microsoft Excel for analysis. When dealing with large numbers of seedlings, however, this procedure can be tedious, as well as inaccurate. An alternative approach, described in this protocol, uses "snapshots" of the growing plants, taken using gel-documentation equipment (i.e., a video camera with a frame-grabber unit, now commonly used to capture images from ethidium-bromide-stained electrophoresis gels). The images are analyzed using publicly available software (NIH-Image), which allows the user simply to cut and paste data into Microsoft Excel.
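
Once tip positions have been transferred to a spreadsheet, the growth-rate calculation itself is a simple difference quotient over successive time points. A sketch with made-up measurements (the lengths and intervals are illustrative, not from the protocol):

```python
def growth_rates(lengths_mm, times_h):
    """Average apical growth rate (mm/h) in each interval between
    successive root-length measurements."""
    return [(l1 - l0) / (t1 - t0)
            for (l0, l1), (t0, t1) in zip(zip(lengths_mm, lengths_mm[1:]),
                                          zip(times_h, times_h[1:]))]

# Hypothetical root lengths from daily plate snapshots:
lengths = [2.0, 6.8, 12.4, 18.0]   # mm
times = [0, 24, 48, 72]            # hours
print(growth_rates(lengths, times))  # [0.2, 0.233..., 0.233...] mm/h
```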

  15. Bovine spongiform encephalopathy and spatial analysis of the feed industry.

    Science.gov (United States)

    Paul, Mathilde; Abrial, David; Jarrige, Nathalie; Rican, Stéphane; Garrido, Myriam; Calavas, Didier; Ducrot, Christian

    2007-06-01

    In France, despite the ban of meat-and-bone meal (MBM) in cattle feed, bovine spongiform encephalopathy (BSE) was detected in hundreds of cattle born after the ban. To study the role of MBM, animal fat, and dicalcium phosphate in the risk for BSE after the feed ban, we conducted a spatial analysis of the feed industry. We used data from 629 BSE cases as well as data on the use of each byproduct and the market areas of the feed factories. We mapped the risk for BSE in 951 areas supplied by the same factories and its connection with the use of the byproducts. A disease map of BSE with covariates was built with hierarchical Bayesian modeling methods, based on a Poisson distribution with spatial smoothing. Only the use of MBM was spatially linked to the risk for BSE, which highlights cross-contamination as the most probable source of infection after the feed ban.
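
Disease mapping of this kind starts from the ratio of observed to expected cases per area; the hierarchical Bayesian Poisson model then smooths these ratios spatially. A minimal sketch of the first step, with hypothetical counts (not the study's data):

```python
def smr(observed, population):
    """Standardized morbidity ratio per area: observed cases divided by
    the cases expected if risk were uniform across the population."""
    total_cases = sum(observed)
    total_pop = sum(population)
    expected = [total_cases * p / total_pop for p in population]
    return [o / e for o, e in zip(observed, expected)]

# Hypothetical case counts and cattle populations for four market areas:
cases = [10, 2, 4, 4]
cattle = [1000, 1000, 1000, 1000]
print(smr(cases, cattle))  # [2.0, 0.4, 0.8, 0.8]
```

An SMR above 1 flags an area with more cases than its herd size alone would predict, which covariates such as MBM use can then try to explain.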

  16. Spatial analysis of the Chania prefecture: Crete triangulation network quality

    Science.gov (United States)

    Achilleos, Georgios

    2016-08-01

    The network of trigonometric points of a region is the basis upon which any cartographic work (data collection, processing, output presentation) is attached to the national geodetic coordinate system, and more besides. The products of cartographic work (cartographic representations) provide the background used in spatial planning and development strategy. Besides allowing a single cartographic work to exist within a unified official state geodetic reference system, the trigonometric network also determines the quality of the result, since the trigonometric network data used have a quality of their own. In this paper, we present research on the spatial quality of the trigonometric network of the Chania Prefecture in Crete. This analysis examines the triangulation network points both with respect to their spatial position (distribution in space) and their accuracy (horizontal and vertical).

  17. GIS application on spatial landslide analysis using statistical based models

    Science.gov (United States)

    Pradhan, Biswajeet; Lee, Saro; Buchroithner, Manfred F.

    2009-09-01

    This paper presents the assessment of three spatially based probabilistic models using Geoinformation Techniques (GIT) for landslide susceptibility analysis at Penang Island, Malaysia. Landslide locations within the study area were identified by interpreting aerial photographs and satellite images, supported by field surveys. Maps of the topography, soil type, lineaments and land cover were constructed from the spatial data sets. Ten landslide-related factors were extracted from the spatial database, and the frequency ratio, fuzzy logic, and bivariate logistic regression coefficients of each factor were computed. Finally, landslide susceptibility maps were drawn for the study area using the frequency ratio, fuzzy logic and bivariate logistic regression models. For verification, the results of the analyses were compared with actual landslide locations in the study area. The verification results show that the bivariate logistic regression model provides slightly higher prediction accuracy than the frequency ratio and fuzzy logic models.
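
The frequency ratio model named above reduces, per factor class, to a ratio of two proportions: the share of landslide cells falling in the class over the class's share of all cells. A sketch with hypothetical slope-class counts (illustrative numbers, not the Penang data):

```python
def frequency_ratio(landslide_counts, class_counts):
    """Frequency ratio per factor class: the class's share of landslide
    cells divided by its share of all cells. FR > 1 flags classes
    over-represented among landslides."""
    total_ls = sum(landslide_counts)
    total = sum(class_counts)
    return [(ls / total_ls) / (c / total)
            for ls, c in zip(landslide_counts, class_counts)]

# Hypothetical slope classes (gentle, moderate, steep):
cells = [5000, 3000, 2000]    # all cells per class
ls_cells = [10, 30, 60]       # landslide cells per class
print(frequency_ratio(ls_cells, cells))  # [0.2, 1.0, 3.0]
```

Summing the per-factor ratios cell by cell then yields the susceptibility index mapped in studies of this kind.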

  18. Spatial filtering efficiency of monostatic biaxial lidar: analysis and applications.

    Science.gov (United States)

    Agishev, Ravil R; Comeron, Adolfo

    2002-12-20

    Results of lidar modeling based on spatial-angular filtering efficiency criteria are presented. Their analysis shows that the low spatial-angular filtering efficiency of traditional visible and near-infrared systems is an important cause of the low signal/background-radiation ratio (SBR) at the photodetector input. The low SBR may be responsible for considerable measurement errors and the ensuing low accuracy of the retrieval of atmospheric optical parameters. As shown, the most effective protection against sky background radiation for ground-based biaxial lidars is to modify their angular field according to a spatial-angular filtering efficiency criterion. Some effective approaches to achieving high filtering efficiency in receiving-system optimization are discussed.

  20. Socio-Spatial Intelligence: social media and spatial cognition for territorial behavioral analysis.

    Science.gov (United States)

    Luini, Lorenzo P; Cardellicchio, Davide; Felletti, Fulvia; Marucci, Francesco S

    2015-09-01

    Investigative analysts gather data from different sources, especially from social media (SM), in order to shed light on cognitive factors that may explain criminal spatial behavior. Previous research showed that tweets can be used to estimate private points of interest. The authors' aim was to demonstrate, extending the analysis to a wider statistical base, how social maps and Web applications can be used in investigative analysis and spatial cognition research. A total of 100 Twitter accounts with approximately 250 tweets each were submitted to common geographical techniques (measures such as Convex-Hull, Mean-Center, Median-Center, and Standard-Deviation-Ellipse) in order to test the hypothesis that user areas of activity are predictable. Predictions were tested against a set of specific information: clear references to areas of activity and to the user's residence. Simple algorithms and procedures proved able to predict where SM users live, giving positive results in about 4 out of 5 cases and giving indications about home location. All home positions were found within the Convex-Hull, and most of them within the Standard-Deviation-Ellipse. Furthermore, in up to 80% of cases, homes were found within a buffer zone of 1,500 m around the Median-Center, with a minimum effectiveness threshold of 12-13 tweets. SM may help in studying people's mobility, their cognition of space, where they live, and their traveling behavior. The processing of geographical data in conjunction with SM analysis may facilitate the construction of models describing specific behaviors. The use of geographical information system tools together with SM analysis represents an effective approach for acquiring spatial and territorial information related to social relationships. The results may be used in understanding social dynamics and in preventing criminal behavior.
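
The centrographic measures named above come straight from the coordinate list. A sketch of Mean-Center and Median-Center with hypothetical projected coordinates (illustrative, not the study's data), showing why the median is the more robust centrum when a few tweets are far-flung:

```python
import statistics

def mean_center(points):
    """Arithmetic mean of coordinate pairs (projected x/y, not lat/lon)."""
    xs, ys = zip(*points)
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def median_center(points):
    """Coordinate-wise median: robust to a few outlier locations."""
    xs, ys = zip(*points)
    return (statistics.median(xs), statistics.median(ys))

# Hypothetical tweet positions in metres; one tweet from a distant trip.
tweets = [(0, 0), (100, 50), (120, 60), (110, 40), (5000, 0)]
print(mean_center(tweets))    # (1066.0, 30.0) — dragged by the outlier
print(median_center(tweets))  # (110, 40) — stays in the activity cluster
```

A 1,500 m buffer around the Median-Center would cover the cluster here, while the Mean-Center lands far from any tweet.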

  1. Epistasis analysis for quantitative traits by functional regression model.

    Science.gov (United States)

    Zhang, Futao; Boerwinkle, Eric; Xiong, Momiao

    2014-06-01

    The critical barrier in interaction analysis for rare variants is that most traditional statistical methods for testing interactions were originally designed for testing the interaction between common variants and are difficult to apply to rare variants because of their prohibitive computational time and low power. The great challenges for successful detection of interactions with next-generation sequencing (NGS) data are (1) the lack of methods for interaction analysis with rare variants, (2) severe multiple testing, and (3) time-consuming computations. To meet these challenges, we shift the paradigm from interaction analysis between two loci to interaction analysis between two sets of loci or genomic regions. In other words, we take a genome region as the basic unit of interaction analysis and use high-dimensional data reduction and functional data analysis techniques to develop a novel functional regression model that collectively tests interactions between all possible pairs of single nucleotide polymorphisms (SNPs) within two genome regions. By intensive simulations, we demonstrate that the functional regression models for interaction analysis of a quantitative trait have the correct type 1 error rates and much higher power to detect interactions than current pairwise interaction analysis. The proposed method was applied to exome sequence data from the NHLBI's Exome Sequencing Project (ESP) and the CHARGE-S study. We discovered 27 pairs of genes showing significant interactions after applying the Bonferroni correction (P < 4.58 × 10⁻¹⁰) in the ESP, and 11 were replicated in the CHARGE-S study.

  2. Analysis and System Design Framework for Infrared Spatial Heterodyne Spectrometers

    Energy Technology Data Exchange (ETDEWEB)

    Cooke, B.J.; Smith, B.W.; Laubscher, B.E.; Villeneuve, P.V.; Briles, S.D.

    1999-04-05

    The authors present a preliminary analysis and design framework developed for the evaluation and optimization of infrared imaging Spatial Heterodyne Spectrometer (SHS) electro-optic systems. As with conventional interferometric spectrometers, SHS modeling requires an integrated analysis environment for rigorous evaluation of system error propagation due to the detection process, detection noise, system motion, retrieval algorithm, and calibration algorithm. The analysis tools provide for optimization of critical system parameters and components, including: (1) optical aperture, f-number, and spectral transmission; (2) SHS interferometer grating and Littrow parameters; (3) image plane requirements, as well as cold shield, optical filtering, and focal-plane dimensions, pixel dimensions, and quantum efficiency; (4) SHS spatial and temporal sampling parameters; and (5) retrieval and calibration algorithm issues.

  3. Quantitative CT analysis of small pulmonary vessels in lymphangioleiomyomatosis

    Energy Technology Data Exchange (ETDEWEB)

    Ando, Katsutoshi, E-mail: kando@juntendo.ac.jp [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Tobino, Kazunori [Department of Respiratory Medicine, Iizuka Hospital, 3-83 Yoshio-Machi, Iizuka-City, Fukuoka 820-8505 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Kurihara, Masatoshi; Kataoka, Hideyuki [Pneumothorax Center, Nissan Tamagawa Hospital, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Doi, Tokuhide [Fukuoka Clinic, 7-18-11 Umeda, Adachi-Ku, Tokyo 123-0851 (Japan); Hoshika, Yoshito [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Takahashi, Kazuhisa [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); Seyama, Kuniaki [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan)

    2012-12-15

    Background: Lymphangioleiomyomatosis (LAM) is a destructive lung disease that shares clinical, physiologic, and radiologic features with chronic obstructive pulmonary disease (COPD). This study aims to identify those features that are unique to LAM by using quantitative CT analysis. Methods: We measured total cross-sectional areas of small pulmonary vessels (CSA) less than 5 mm² and 5-10 mm² and calculated the percentages of those lung areas (%CSA), respectively, in 50 LAM and 42 COPD patients. The extent of cystic destruction (LAA%) and mean parenchymal CT value were also calculated and correlated with pulmonary function. Results: The diffusing capacity for carbon monoxide/alveolar volume (DLCO/VA %predicted) was similar for both groups (LAM, 44.4 ± 19.8% vs. COPD, 45.7 ± 16.0%, p = 0.763), but less tissue damage occurred in LAM than in COPD (LAA% 21.7 ± 16.3% vs. 29.3 ± 17.0%; p < 0.05). Pulmonary function correlated negatively with LAA% (p < 0.001) in both groups, yet the correlation with %CSA was significant only in COPD (p < 0.001). When the same analysis was conducted in two groups with equal levels of LAA% and DLCO/VA %predicted, %CSA and mean parenchymal CT value were still greater for LAM than COPD (p < 0.05). Conclusions: Quantitative CT analysis revealing a correlation between cystic destruction and CSA in COPD but not in LAM indicates that this approach successfully reflects the different mechanisms governing the two pathologic courses. Such determinations of small pulmonary vessel density may serve to differentiate LAM from COPD even in patients with severe lung destruction.

  4. The Quantitative Analysis of Chennai Automotive Industry Cluster

    Science.gov (United States)

    Bhaskaran, Ethirajan

    2016-07-01

    Chennai, also called the Detroit of India, hosts an automotive industry producing over 40% of India's vehicles and components. During 2001-2002, the Automotive Component Industries (ACI) in the Ambattur, Thirumalizai and Thirumudivakkam Industrial Estates, Chennai, faced problems in infrastructure, technology, procurement, production and marketing. The objective is to study the quantitative performance of the Chennai automotive industry cluster before (2001-2002) and after (2008-2009) the Cluster Development Approach (CDA). The methodology adopted is the collection of primary data from 100 ACI using a quantitative questionnaire, analyzed using Correlation Analysis (CA), Regression Analysis (RA), the Friedman Test (FMT) and the Kruskal-Wallis Test (KWT). The CA computed for the different sets of variables reveals a high degree of relationship between the variables studied. The RA models constructed establish a strong relationship between the dependent variable and a host of independent variables; the models proposed here reveal the approximate relationship in a closer form. The KWT shows no significant difference between the three location clusters with respect to net profit, production cost, marketing costs, procurement costs and gross output, supporting the view that each location has contributed uniformly to the development of the automobile component cluster. The FMT shows no significant difference between industrial units in respect of costs such as production, infrastructure, technology, marketing and net profit. To conclude, the automotive industries have fully utilized the physical infrastructure and centralised facilities by adopting the CDA and now export their products to North America, South America, Europe, Australia, Africa and Asia. Value chain analysis models have been implemented in all the cluster units. This CDA model can be implemented in industries of underdeveloped and developing countries for cost reduction and productivity

  5. QuASAR: quantitative allele-specific analysis of reads.

    Science.gov (United States)

    Harvey, Chris T; Moyerbrailean, Gregory A; Davis, Gordon O; Wen, Xiaoquan; Luca, Francesca; Pique-Regi, Roger

    2015-04-15

    Expression quantitative trait loci (eQTL) studies have discovered thousands of genetic variants that regulate gene expression, enabling a better understanding of the functional role of non-coding sequences. However, eQTL studies are costly, requiring large sample sizes and genome-wide genotyping of each sample. In contrast, analysis of allele-specific expression (ASE) is becoming a popular approach to detect the effect of genetic variation on gene expression, even within a single individual. This is typically achieved by counting the number of RNA-seq reads matching each allele at heterozygous sites and testing the null hypothesis of a 1:1 allelic ratio. In principle, when genotype information is not readily available, it could be inferred from the RNA-seq reads directly. However, no existing methods jointly infer genotypes and conduct ASE inference while considering uncertainty in the genotype calls. We present QuASAR (quantitative allele-specific analysis of reads), a novel statistical learning method for jointly detecting heterozygous genotypes and inferring ASE. The proposed ASE inference step takes into consideration the uncertainty in the genotype calls, while including parameters that model base-call errors in sequencing and allelic over-dispersion. We validated our method with experimental data for which high-quality genotypes are available. Results for an additional dataset with multiple replicates at different sequencing depths demonstrate that QuASAR is a powerful tool for ASE analysis when genotypes are not available. Availability: http://github.com/piquelab/QuASAR. Contact: fluca@wayne.edu or rpique@wayne.edu. Supplementary material is available at Bioinformatics online.
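
The baseline ASE test described above — counting reads per allele at a heterozygous site and testing a 1:1 ratio — is an exact binomial test; QuASAR itself layers genotype uncertainty and over-dispersion on top of this. A minimal sketch of the baseline test only (read counts are illustrative):

```python
from math import comb

def binom_test_half(ref, alt):
    """Two-sided exact binomial test of a 1:1 allelic ratio: sum the
    probabilities of all outcomes no more likely than the observed one."""
    n = ref + alt
    p_obs = comb(n, ref) * 0.5 ** n
    return sum(comb(n, k) * 0.5 ** n
               for k in range(n + 1)
               if comb(n, k) * 0.5 ** n <= p_obs + 1e-12)

# 15 reference vs 5 alternate reads at a heterozygous site:
print(binom_test_half(15, 5))  # ≈ 0.041: nominal evidence of ASE
```

At balanced counts (e.g. 10 vs 10) the p-value is 1; the skew in coverage, not its total, drives the signal.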

  6. Quantitative assessment of industrial VOC emissions in China: Historical trend, spatial distribution, uncertainties, and projection

    Science.gov (United States)

    Zheng, Chenghang; Shen, Jiali; Zhang, Yongxin; Huang, Weiwei; Zhu, Xinbo; Wu, Xuecheng; Chen, Linghong; Gao, Xiang; Cen, Kefa

    2017-02-01

    The temporal trends of industrial volatile organic compound (VOC) emissions in China were comprehensively summarized for the 2011 to 2013 period, and projections were made for 2020 to 2050. The results demonstrate that industrial VOC emissions in China increased from 15.3 Tg in 2011 to 29.4 Tg in 2013, at an annual average growth rate of 38.3%. Guangdong (3.45 Tg), Shandong (2.85 Tg), and Jiangsu (2.62 Tg) were the three largest contributors, collectively accounting for 30.4% of the national total emissions in 2013. The top three average industrial VOC emissions per square kilometer were Shanghai (247.2 ton/km²), Tianjin (62.8 ton/km²), and Beijing (38.4 ton/km²), 12-80 times the average level in China. The inventory data indicate that the use of VOC-containing products, the production of VOCs, the use of VOCs as raw materials, and their storage and transportation contributed 75.4%, 10.3%, 9.1%, and 5.2% of the total emissions, respectively. ArcGIS was used to display the remarkable spatial distribution variation by allocating the emissions into 1 km × 1 km grid cells with population as a surrogate index. Considering future economic development and population change, as well as implementation of policy and upgrades of control technologies, three scenarios (A, B, and C) were set to project industrial VOC emissions for the years 2020, 2030, and 2050, illustrating industrial VOC emissions under different scenarios and the potential for emission reductions. Finally, the results show that collaborative control policies considerably influence industrial VOC emissions.
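
The population-surrogate gridding described for ArcGIS amounts to proportional allocation: each cell receives the share of the provincial total equal to its share of the provincial population. A sketch with hypothetical cell populations (illustrative, not the inventory's data):

```python
def allocate_by_population(total_emission, grid_population):
    """Downscale a provincial emission total to grid cells in proportion
    to each cell's population share (surrogate spatial index)."""
    pop_sum = sum(grid_population)
    return [total_emission * p / pop_sum for p in grid_population]

# Hypothetical populations of four 1 km x 1 km cells in one province:
pops = [12000, 3000, 500, 4500]
print(allocate_by_population(100.0, pops))  # [60.0, 15.0, 2.5, 22.5]
```

By construction, the gridded values sum back to the provincial total, so the inventory is conserved under the downscaling.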

  7. Quantitative modeling and data analysis of SELEX experiments

    Science.gov (United States)

    Djordjevic, Marko; Sengupta, Anirvan M.

    2006-03-01

    SELEX (systematic evolution of ligands by exponential enrichment) is an experimental procedure that allows the extraction, from an initially random pool of DNA, of those oligomers with high affinity for a given DNA-binding protein. We address the question of which experimental and computational procedures are suitable for inferring parameters of transcription factor-DNA interaction from SELEX experiments. To answer this, we use a biophysical model of transcription factor-DNA interactions to quantitatively model SELEX. We show that a standard procedure is unsuitable for obtaining accurate interaction parameters. However, we theoretically show that a modified experiment in which the chemical potential is fixed through different rounds of the experiment allows robust generation of an appropriate dataset. Based on our quantitative model, we propose a novel bioinformatic method of data analysis for such a modified experiment and apply it to extract the interaction parameters for the mammalian transcription factor CTF/NFI. From a practical point of view, our method results in a significantly improved false positive/false negative trade-off, as compared to both the standard information-theory-based method and a widely used empirically formulated procedure.
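
    The fixed-chemical-potential idea can be illustrated with a toy version of the biophysical selection model: each oligomer binds with a Fermi-Dirac probability set by its binding energy relative to the chemical potential, and repeated selection rounds at fixed μ progressively enrich high-affinity species. The energies and pool frequencies below are invented for illustration, not taken from the study.

```python
from math import exp

def bind_prob(energy, mu):
    """Fermi-Dirac selection probability for binding energy `energy` (units of kT)."""
    return 1.0 / (1.0 + exp(energy - mu))

def selex_round(pool, mu):
    """One selection round at fixed chemical potential mu, then renormalization."""
    selected = [(e, f * bind_prob(e, mu)) for e, f in pool]
    total = sum(f for _, f in selected)
    return [(e, f / total) for e, f in selected]

# A rare high-affinity oligomer (E = -4 kT) against a weak background (E = 0)
pool = [(-4.0, 0.01), (0.0, 0.99)]
for _ in range(3):
    pool = selex_round(pool, mu=-2.0)
# after three rounds the high-affinity species dominates the pool
```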

  8. Quantitative analysis of multiple sclerosis: a feasibility study

    Science.gov (United States)

    Li, Lihong; Li, Xiang; Wei, Xinzhou; Sturm, Deborah; Lu, Hongbing; Liang, Zhengrong

    2006-03-01

    Multiple Sclerosis (MS) is an inflammatory and demyelinating disorder of the central nervous system with a presumed immune-mediated etiology. For treatment of MS, measurements of white matter (WM), gray matter (GM), and cerebrospinal fluid (CSF) are often used in conjunction with clinical evaluation to provide a more objective measure of MS burden. In this paper, we apply a new unifying automatic mixture-based algorithm for segmentation of brain tissues to quantitatively analyze MS. The method takes into account the following effects that commonly appear in MR imaging: 1) The MR data is modeled as a stochastic process with an inherent inhomogeneity effect of smoothly varying intensity; 2) A new partial volume (PV) model is built in establishing the maximum a posteriori (MAP) segmentation scheme; 3) Noise artifacts are minimized by an a priori Markov random field (MRF) penalty reflecting neighborhood correlation within the tissue mixture. The volumes of brain tissues (WM, GM) and CSF are extracted from the mixture-based segmentation. Experimental results of feasibility studies on quantitative analysis of MS are presented.

  9. Quantitative colorimetric-imaging analysis of nickel in iron meteorites.

    Science.gov (United States)

    Zamora, L Lahuerta; López, P Alemán; Fos, G M Antón; Algarra, R Martín; Romero, A M Mellado; Calatayud, J Martínez

    2011-02-15

    A quantitative analytical imaging approach for determining the nickel content of metallic meteorites is proposed. The approach uses a digital image of a series of standard solutions of the nickel-dimethylglyoxime coloured chelate and a meteorite sample solution subjected to the same treatment as the nickel standards for quantitation. The image is processed with suitable software to assign a colour-dependent numerical value (analytical signal) to each standard. Such a value is directly proportional to the analyte concentration, which facilitates construction of a calibration graph where the value for the unknown sample can be interpolated to calculate the nickel content of the meteorite. The results thus obtained were validated by comparison with the official, ISO-endorsed spectrophotometric method for nickel. The proposed method is fairly simple and inexpensive; in fact, it uses a commercially available digital camera as the measuring instrument and the images it provides are processed with highly user-friendly public domain software (specifically, ImageJ, developed by the National Institutes of Health and freely available for download on the Internet). In a scenario dominated by increasingly sophisticated and expensive equipment, the proposed method provides a cost-effective alternative based on simple, robust hardware that is affordable and can be readily accessed worldwide. This can be especially advantageous for countries where resources available for investment in analytical equipment are scant. The proposed method is essentially an adaptation of classical chemical analysis to current, straightforward, robust, cost-effective instrumentation. Copyright © 2010 Elsevier B.V. All rights reserved.
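
    The calibration-and-interpolation step described above is ordinary least-squares line fitting. A minimal sketch follows, with invented standard concentrations and image-derived signals (in the actual method the signal would be extracted from the digital image, e.g. with ImageJ):

```python
def fit_line(x, y):
    """Ordinary least-squares fit of y = m*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    m = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum((xi - mx) ** 2 for xi in x)
    return m, my - m * mx

# Hypothetical signals from an image of Ni-dimethylglyoxime standard solutions
conc = [0.0, 2.0, 4.0, 6.0, 8.0]        # mg/L nickel
signal = [1.5, 21.0, 41.2, 60.8, 80.9]  # colour-derived analytical signal
m, b = fit_line(conc, signal)
ni_sample = (52.3 - b) / m  # interpolate the unknown meteorite solution, ~5.1 mg/L
```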

  10. Quantitative analysis of tumor burden in mouse lung via MRI.

    Science.gov (United States)

    Tidwell, Vanessa K; Garbow, Joel R; Krupnick, Alexander S; Engelbach, John A; Nehorai, Arye

    2012-02-01

    Lung cancer is the leading cause of cancer death in the United States. Despite recent advances in screening protocols, the majority of patients still present with advanced or disseminated disease. Preclinical rodent models provide a unique opportunity to test novel therapeutic drugs for targeting lung cancer. Respiratory-gated MRI is a key tool for quantitatively measuring lung-tumor burden and monitoring the time-course progression of individual tumors in mouse models of primary and metastatic lung cancer. However, quantitative analysis of lung-tumor burden in mice by MRI presents significant challenges. Herein, a method for measuring tumor burden based upon average lung-image intensity is described and validated. The method requires accurate lung segmentation; its efficiency and throughput would be greatly aided by the ability to automatically segment the lungs. A technique for automated lung segmentation in the presence of varying tumor burden levels is presented. The method includes development of a new, two-dimensional parametric model of the mouse lungs and a multi-faceted cost function to optimally fit the model parameters to each image. Results demonstrate a strong correlation (0.93), comparable with that of fully manual expert segmentation, between the automated method's tumor-burden metric and the tumor burden measured by lung weight.
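
    The validation statistic here is a plain Pearson correlation between the automated tumor-burden metric and a gold standard (the study reports 0.93 against lung weight). A minimal sketch with hypothetical paired values:

```python
from statistics import mean

def pearson_r(x, y):
    """Pearson correlation coefficient of two paired samples."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

# Hypothetical paired measurements: automated intensity metric vs. excised lung weight (g)
metric = [0.11, 0.18, 0.25, 0.32, 0.41, 0.55]
weight = [0.15, 0.22, 0.24, 0.35, 0.44, 0.58]
r = pearson_r(metric, weight)  # strongly correlated
```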

  11. The Impact of Arithmetic Skills on Mastery of Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Bruce K. Blaylock

    2012-01-01

    Full Text Available Over the past several years math education has moved from a period where all math calculations were done by hand to an era where most calculations are done using a calculator or computer. There are certainly benefits to this approach, but when one concomitantly recognizes the declining scores on national standardized mathematics exams, it raises the question, “Could the decline of by-hand arithmetic manipulation skills carry over to understanding higher-level mathematical concepts, or is it just a spurious correlation?” Eighty-seven students were tested for their ability to do simple arithmetic and algebra by hand. Performance in three important areas of quantitative analysis (recognizing the appropriate tool to use in an analysis, creating a model to carry out the analysis, and interpreting the results of the analysis) was then regressed on these scores. The study revealed a significant relationship between the ability to accurately do arithmetic calculations and the abilities to recognize the appropriate tool and to create a model. It found no significant relationship between results interpretation and arithmetic skills.

  12. Analysis of generalized interictal discharges using quantitative EEG.

    Science.gov (United States)

    da Silva Braga, Aline Marques; Fujisao, Elaine Keiko; Betting, Luiz Eduardo

    2014-12-01

    Experimental evidence from animal models of absence seizures suggests a focal source for the initiation of generalized spike-and-wave (GSW) discharges. Furthermore, clinical studies indicate that patients diagnosed with idiopathic generalized epilepsy (IGE) exhibit focal electroencephalographic abnormalities, which involve the thalamo-cortical circuitry. This circuitry is a key network that has been implicated in the initiation of generalized discharges, and may contribute to the pathophysiology of GSW discharges. Quantitative electroencephalogram (qEEG) analysis may be able to detect abnormalities associated with the initiation of GSW discharges. The objective of this study was to determine whether interictal GSW discharges exhibit focal characteristics using qEEG analysis. In this study, 75 EEG recordings from 64 patients were analyzed. All EEG recordings analyzed contained at least one GSW discharge. EEG recordings were obtained by a 22-channel recorder with electrodes positioned according to the international 10-20 system of electrode placement. EEG activity was recorded for 20 min including photic stimulation and hyperventilation. The EEG recordings were visually inspected, and the first unequivocally confirmed generalized spike was marked for each discharge. Three methods of source imaging analysis were applied: dipole source imaging (DSI), classical LORETA analysis recursively applied (CLARA), and equivalent dipole of independent components with cluster analysis. A total of 753 GSW discharges were identified and spatiotemporally analyzed. Source evaluation analysis using all three techniques revealed that the frontal lobe was the principal source of GSW discharges (70%), followed by the parietal and occipital lobes (14%), and the basal ganglia (12%). The main anatomical sources of GSW discharges were the anterior cingulate cortex (36%) and the medial frontal gyrus (23%). Source analysis did not reveal a common focal source of GSW discharges. However

  13. Spatial eigenvector filtering for spatiotemporal crime mapping and spatial crime analysis

    NARCIS (Netherlands)

    Helbich, M; Jokar Arsanjani, J

    2015-01-01

    Spatial and spatiotemporal analyses are exceedingly relevant to determine criminogenic factors. The estimation of Poisson and negative binomial models (NBM) is complicated by spatial autocorrelation. Therefore, first, eigenvector spatial filtering (ESF) is introduced as a method for spatiotemporal m

  14. Directional spatial frequency analysis of lipid distribution in atherosclerotic plaque

    Science.gov (United States)

    Korn, Clyde; Reese, Eric; Shi, Lingyan; Alfano, Robert; Russell, Stewart

    2016-04-01

    Atherosclerosis is characterized by the growth of fibrous plaques due to the retention of cholesterol and lipids within the artery wall, which can lead to vessel occlusion and cardiac events. One way to evaluate arterial disease is to quantify the amount of lipid present in these plaques, since a higher disease burden is characterized by a higher concentration of lipid. Although therapeutic stimulation of reverse cholesterol transport to reduce cholesterol deposits in plaque has not produced significant results, this may be due to current image analysis methods which use averaging techniques to calculate the total amount of lipid in the plaque without regard to spatial distribution, thereby discarding information that may have significance in marking response to therapy. Here we use Directional Fourier Spatial Frequency (DFSF) analysis to generate a characteristic spatial frequency spectrum for atherosclerotic plaques from C57 Black 6 mice both treated and untreated with a cholesterol scavenging nanoparticle. We then use the Cauchy product of these spectra to classify the images with a support vector machine (SVM). Our results indicate that treated plaque can be distinguished from untreated plaque using this method, where no difference is seen using the spatial averaging method. This work has the potential to increase the effectiveness of current in-vivo methods of plaque detection that also use averaging methods, such as laser speckle imaging and Raman spectroscopy.
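
    The underlying idea, that the spatial distribution of lipid rather than only its total amount carries signal, can be illustrated with a tiny directional frequency analysis: average 1-D DFT magnitude spectra taken along each image axis. This is a toy stand-in for the authors' DFSF pipeline, using an invented 4×4 "image" of horizontal stripes.

```python
import cmath

def dft_mag(xs):
    """Magnitude spectrum of a 1-D discrete Fourier transform, normalized by length."""
    n = len(xs)
    return [abs(sum(x * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t, x in enumerate(xs))) / n for k in range(n)]

def directional_spectrum(img, axis):
    """Average 1-D magnitude spectrum of the image rows (axis=0) or columns (axis=1)."""
    lines = img if axis == 0 else [list(col) for col in zip(*img)]
    specs = [dft_mag(line) for line in lines]
    return [sum(col) / len(specs) for col in zip(*specs)]

# Horizontal stripes: all spatial variation is vertical, so the column spectrum
# carries energy at k = 2 while the row spectrum has none outside the DC term.
img = [[1, 1, 1, 1], [0, 0, 0, 0], [1, 1, 1, 1], [0, 0, 0, 0]]
row_spec = directional_spectrum(img, 0)
col_spec = directional_spectrum(img, 1)
```

Two images with identical average lipid content but different spatial organization yield different directional spectra, which is exactly the information an averaging analysis discards.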

  15. Spatial risk assessment for critical network infrastructure using sensitivity analysis

    Institute of Scientific and Technical Information of China (English)

    Michael Möderl; Wolfgang Rauch

    2011-01-01

    The presented spatial risk assessment method allows for managing critical network infrastructure in urban areas under abnormal and future conditions caused e.g. by terrorist attacks, infrastructure deterioration or climate change. For the spatial risk assessment, vulnerability maps for critical network infrastructure are merged with hazard maps for an interfering process. Vulnerability maps are generated using a spatial sensitivity analysis of network transport models to evaluate performance decrease under investigated threat scenarios. Thereby parameters are varied according to the specific impact of a particular threat scenario. Hazard maps are generated with a geographical information system using raster data of the same threat scenario derived from structured interviews and cluster analysis of events in the past. The application of the spatial risk assessment is exemplified by means of a case study for a water supply system, but the principal concept is applicable likewise to other critical network infrastructure. The aim of the approach is to help decision makers in choosing zones for preventive measures.

  16. Capacity analysis of spectrum sharing spatial multiplexing MIMO systems

    KAUST Repository

    Yang, Liang

    2014-12-01

    This paper considers a spectrum sharing (SS) multiple-input multiple-output (MIMO) system operating in a Rayleigh fading environment. First the capacity of a single-user SS spatial multiplexing system is investigated in two scenarios that assume different receivers. To explicitly show the capacity scaling law of SS MIMO systems, some approximate capacity expressions for the two scenarios are derived. Next, we extend our analysis to a multiple user system with zero-forcing receivers (ZF) under spatially-independent scheduling and analyze the sum-rate. Furthermore, we provide an asymptotic sum-rate analysis to investigate the effects of different parameters on the multiuser diversity gain. Our results show that the secondary system with a smaller number of transmit antennas Nt and a larger number of receive antennas Nr can achieve higher capacity at lower interference temperature Q, but at high Q the capacity follows the scaling law of the conventional MIMO systems. However, for a ZF SS spatial multiplexing system, the secondary system with small Nt and large Nr can achieve the highest capacity throughout the entire region of Q. For a ZF SS spatial multiplexing system with scheduling, the asymptotic sum-rate scales like Ntlog2(Q(KNtNp-1)/Nt), where Np denotes the number of antennas of the primary receiver and K represents the number of secondary transmitters.

  17. Quantitative microstructure analysis of polymer-modified mortars.

    Science.gov (United States)

    Jenni, A; Herwegh, M; Zurbriggen, R; Aberle, T; Holzer, L

    2003-11-01

    Digital light, fluorescence and electron microscopy in combination with wavelength-dispersive spectroscopy were used to visualize individual polymers, air voids, cement phases and filler minerals in a polymer-modified cementitious tile adhesive. In order to investigate the evolution and processes involved in formation of the mortar microstructure, quantifications of the phase distribution in the mortar were performed including phase-specific imaging and digital image analysis. The required sample preparation techniques and imaging related topics are discussed. As a form of case study, the different techniques were applied to obtain a quantitative characterization of a specific mortar mixture. The results indicate that the mortar fractionates during different stages ranging from the early fresh mortar until the final hardened mortar stage. This induces process-dependent enrichments of the phases at specific locations in the mortar. The approach presented provides important information for a comprehensive understanding of the functionality of polymer-modified mortars.

  18. Ozone Determination: A Comparison of Quantitative Analysis Methods

    Directory of Open Access Journals (Sweden)

    Rachmat Triandi Tjahjanto

    2012-10-01

    Full Text Available A comparison of ozone quantitative analysis methods using spectrophotometric and volumetric techniques has been studied. The aim of this research is to determine the better method by considering the effect of reagent concentration and volume on the measured ozone concentration. The ozone analyzed in this research was synthesized from air and then used to ozonize methyl orange and potassium iodide solutions at different concentrations and volumes. Ozonation was carried out for 20 minutes at an air flow rate of 363 mL/minute. The concentrations of the ozonized methyl orange and potassium iodide solutions were analyzed by the spectrophotometric and volumetric methods, respectively. The results of this research show that the concentration and volume of reagent have an effect on the measured ozone concentration. Based on the results of both methods, it can be concluded that the volumetric method is better than the spectrophotometric method.

  19. Quantitative genetic analysis of injury liability in infants and toddlers

    Energy Technology Data Exchange (ETDEWEB)

    Phillips, K.; Matheny, A.P. Jr. [Univ. of Louisville Medical School, KY (United States)

    1995-02-27

    A threshold model of latent liability was applied to infant and toddler twin data on total count of injuries sustained during the interval from birth to 36 months of age. A quantitative genetic analysis of estimated twin correlations in injury liability indicated strong genetic dominance effects, but no additive genetic variance was detected. Because interpretations involving overdominance have little research support, the results may be due to low order epistasis or other interaction effects. Boys had more injuries than girls, but this effect was found only for groups whose parents were prompted and questioned in detail about their children's injuries. Activity and impulsivity are two behavioral predictors of childhood injury, and the results are discussed in relation to animal research on infant and adult activity levels, and impulsivity in adult humans. Genetic epidemiological approaches to childhood injury should aid in targeting higher risk children for preventive intervention. 30 refs., 4 figs., 3 tabs.

  20. Quantitative analysis of gallstones using laser-induced breakdown spectroscopy.

    Science.gov (United States)

    Singh, Vivek K; Singh, Vinita; Rai, Awadhesh K; Thakur, Surya N; Rai, Pradeep K; Singh, Jagdish P

    2008-11-01

    The utility of laser-induced breakdown spectroscopy (LIBS) for categorizing different types of gallbladder stone has been demonstrated by analyzing their major and minor constituents. LIBS spectra of three types of gallstone have been recorded in the 200-900 nm spectral region. Calcium is found to be the major element in all types of gallbladder stone. The spectrophotometric method has been used to classify the stones. A calibration-free LIBS method has been used for the quantitative analysis of metal elements, and the results have been compared with those obtained from inductively coupled plasma atomic emission spectroscopy (ICP-AES) measurements. The single-shot LIBS spectra from different points on the cross section (in steps of 0.5 mm from one end to the other) of gallstones have also been recorded to study the variation of constituents from the center to the surface. The presence of different metal elements and their possible role in gallstone formation is discussed.

  1. Quantitative image analysis of WE43-T6 cracking behavior

    Science.gov (United States)

    Ahmad, A.; Yahya, Z.

    2013-06-01

    Environment-assisted cracking of WE43 cast magnesium (4.2 wt.% Y, 2.3 wt.% Nd, 0.7% Zr, 0.8% HRE) in the T6 peak-aged condition was induced in ambient air in notched specimens. The mechanism of fracture was studied using electron backscatter diffraction, serial sectioning and in situ observations of crack propagation. The intermetallic material (rare-earth-enriched divorced intermetallic retained at grain boundaries, predominantly at triple points) was found to play a significant role in initiating the cracks that lead to failure of this material. Quantitative measurements were required for this project. The populations of the intermetallic and clusters of intermetallic particles were analyzed using image analysis of metallographic images. This is part of the work to generate a theoretical model of the effect of notch geometry on the static fatigue strength of this material.

  2. qfasar: quantitative fatty acid signature analysis with R

    Science.gov (United States)

    Bromaghin, Jeffrey

    2017-01-01

    Knowledge of predator diets provides essential insights into their ecology, yet diet estimation is challenging and remains an active area of research. Quantitative fatty acid signature analysis (QFASA) is a popular method of estimating diet composition that continues to be investigated and extended. However, software to implement QFASA has only recently become publicly available. I summarize a new R package, qfasar, for diet estimation using QFASA methods. The package also provides functionality to evaluate and potentially improve the performance of a library of prey signature data, compute goodness-of-fit diagnostics, and support simulation-based research. Several procedures in the package have not previously been published. qfasar makes traditional and recently published QFASA diet estimation methods accessible to ecologists for the first time. Use of the package is illustrated with signature data from Chukchi Sea polar bears and potential prey species.
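
    The heart of QFASA is finding the prey mixture whose fatty-acid signature best matches the predator's. A toy two-prey version, written in Python rather than R for illustration, with invented three-fatty-acid signatures and omitting qfasar's calibration coefficients, reduces to a one-parameter search:

```python
from math import log

def kl_dist(p, q):
    """Kullback-Leibler divergence between two fatty-acid signatures."""
    return sum(pi * log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def estimate_two_prey_diet(predator, prey_a, prey_b, steps=1000):
    """Grid search for the mixing proportion alpha of prey A that makes
    alpha*A + (1-alpha)*B closest to the predator signature."""
    best = min(
        (kl_dist(predator, [a * x + (1 - a) * y for x, y in zip(prey_a, prey_b)]), a)
        for a in (i / steps for i in range(steps + 1))
    )
    return best[1]

prey_a = [0.6, 0.3, 0.1]
prey_b = [0.1, 0.2, 0.7]
predator = [0.35, 0.25, 0.40]  # an even mixture of A and B
alpha = estimate_two_prey_diet(predator, prey_a, prey_b)  # ~0.5
```

With more prey types the grid search is replaced by constrained optimization over the diet simplex, which is what the R package implements.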

  3. Large-Scale Quantitative Analysis of Painting Arts

    Science.gov (United States)

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-12-01

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paintings to build a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images: the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety in the medieval period. Interestingly, moreover, the increase of the roughness exponent as painting techniques such as chiaroscuro and sfumato advanced is consistent with historical circumstances.

  4. Quantitative analysis of forest island pattern in selected Ohio landscapes

    Energy Technology Data Exchange (ETDEWEB)

    Bowen, G.W.; Burgess, R.L.

    1981-07-01

    The purpose of this study was to quantitatively describe the various aspects of regional distribution patterns of forest islands and relate those patterns to other landscape features. Several maps showing the forest cover of various counties in Ohio were selected as representative examples of forest patterns to be quantified. Ten thousand hectare study areas (landscapes) were delineated on each map. A total of 15 landscapes representing a wide variety of forest island patterns was chosen. Data were converted into a series of continuous variables which contained information pertinent to the sizes, shape, numbers, and spacing of woodlots within a landscape. The continuous variables were used in a factor analysis to describe the variation among landscapes in terms of forest island pattern. The results showed that forest island patterns are related to topography and other environmental features correlated with topography.

  5. Quantitative analysis of causes for temporal and spatial variation characteristics and evolution of potential evapotranspiration in Gansu Province during recent 30 years

    Institute of Scientific and Technical Information of China (English)

    Li Yaojun; Wei Xia; Su Huidong

    2015-01-01

    Evapotranspiration is an important component of the water and energy balance of the earth system. Research on the variation of potential evapotranspiration (ETp) is of great significance for a better understanding of the impact of climate change on the hydrological cycle and on water resources allocation. Based on the Penman-Monteith model recommended and revised by FAO, and using routine meteorological data collected from 32 meteorological stations in and around Gansu from 1981 to 2010, this paper studies the spatial and temporal variation characteristics of potential evapotranspiration in Gansu over the recent 30 years. Moreover, it analyzes the sensitivity of potential evapotranspiration to meteorological factors and the spatial distribution of the sensitivity coefficients, quantitatively revealing the dominant factors driving the variation of ETp in Gansu. The results can serve as a reference for studying the impact of climate change on the hydrological cycle, improving agricultural irrigation efficiency, adjusting the utilization structure of water resources, and optimizing the allocation of water resources in Gansu Province.
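
    The FAO-56 Penman-Monteith model named in this abstract can be sketched as a direct formula evaluation. The sample inputs below are illustrative values in FAO-56's units, not data from the study.

```python
def penman_monteith_et0(delta, rn, g, gamma, t, u2, es, ea):
    """FAO-56 reference evapotranspiration ET0 (mm/day).
    delta: slope of the saturation vapour pressure curve (kPa/degC)
    rn, g: net radiation and soil heat flux (MJ/m2/day)
    gamma: psychrometric constant (kPa/degC)
    t:     mean daily air temperature at 2 m (degC)
    u2:    wind speed at 2 m (m/s)
    es, ea: saturation and actual vapour pressure (kPa)
    """
    num = 0.408 * delta * (rn - g) + gamma * (900.0 / (t + 273.0)) * u2 * (es - ea)
    return num / (delta + gamma * (1.0 + 0.34 * u2))

# Illustrative mid-latitude day; yields roughly 3.9 mm/day
et0 = penman_monteith_et0(delta=0.122, rn=13.3, g=0.14, gamma=0.066,
                          t=16.9, u2=2.08, es=1.997, ea=1.409)
```

The sensitivity analysis the abstract describes amounts to perturbing each input (temperature, wind speed, vapour pressure, radiation) in turn and recording the relative change in ET0.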

  6. Automated Spatial Brain Normalization and Hindbrain White Matter Reference Tissue Give Improved [(18)F]-Florbetaben PET Quantitation in Alzheimer's Model Mice.

    Science.gov (United States)

    Overhoff, Felix; Brendel, Matthias; Jaworska, Anna; Korzhova, Viktoria; Delker, Andreas; Probst, Federico; Focke, Carola; Gildehaus, Franz-Josef; Carlsen, Janette; Baumann, Karlheinz; Haass, Christian; Bartenstein, Peter; Herms, Jochen; Rominger, Axel

    2016-01-01

    Preclinical PET studies of β-amyloid (Aβ) accumulation are of growing importance, but comparisons between research sites require standardized and optimized methods for quantitation. Therefore, we aimed to evaluate systematically the (1) impact of an automated algorithm for spatial brain normalization, and (2) intensity scaling methods of different reference regions for Aβ-PET in a large dataset of transgenic mice. PS2APP mice in a 6 week longitudinal setting (N = 37) and another set of PS2APP mice at a histologically assessed narrow range of Aβ burden (N = 40) were investigated by [(18)F]-florbetaben PET. Manual spatial normalization by three readers at different training levels was performed prior to application of an automated brain spatial normalization and inter-reader agreement was assessed by Fleiss Kappa (κ). For this method the impact of templates at different pathology stages was investigated. Four different reference regions on brain uptake normalization were used to calculate frontal cortical standardized uptake value ratios (SUVRCTX∕REF), relative to raw SUVCTX. Results were compared on the basis of longitudinal stability (Cohen's d), and in reference to gold standard histopathological quantitation (Pearson's R). Application of an automated brain spatial normalization resulted in nearly perfect agreement (all κ≥0.99) between different readers, with constant or improved correlation with histology. Templates based on inappropriate pathology stage resulted in up to 2.9% systematic bias for SUVRCTX∕REF. All SUVRCTX∕REF methods performed better than SUVCTX both with regard to longitudinal stability (d≥1.21 vs. d = 0.23) and histological gold standard agreement (R≥0.66 vs. R≥0.31). Voxel-wise analysis suggested a physiologically implausible longitudinal decrease by global mean scaling. The hindbrain white matter reference (R mean = 0.75) was slightly superior to the brainstem (R mean = 0.74) and the cerebellum (R mean = 0.73). Automated
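
    The two quantities compared in this record are simple to compute once segmentation is done: an SUVR is a ratio of target to reference-tissue uptake, and longitudinal stability is scored with Cohen's d. A sketch with invented SUV pairs (hindbrain white matter as the reference region):

```python
from statistics import mean, stdev

def suvr(suv_ctx, suv_ref):
    """Standardized uptake value ratio: target (frontal cortex) over reference tissue."""
    return suv_ctx / suv_ref

def cohens_d(group1, group2):
    """Effect size used here to score longitudinal stability of a metric."""
    n1, n2 = len(group1), len(group2)
    pooled = (((n1 - 1) * stdev(group1) ** 2 + (n2 - 1) * stdev(group2) ** 2)
              / (n1 + n2 - 2)) ** 0.5
    return (mean(group2) - mean(group1)) / pooled

# Invented frontal-cortex / hindbrain-white-matter SUV pairs at two time points
baseline = [suvr(c, r) for c, r in [(1.05, 1.00), (1.14, 1.02), (1.02, 0.99)]]
followup = [suvr(c, r) for c, r in [(1.28, 1.01), (1.35, 0.98), (1.30, 1.02)]]
d = cohens_d(baseline, followup)  # large effect: the tracer signal grows over time
```

A larger d for SUVR than for raw SUV is exactly the study's argument for reference-tissue scaling: the ratio cancels scan-to-scan global intensity differences.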

  7. Automated spatial brain normalization and hindbrain white matter reference tissue give improved [18F]-florbetaben PET quantitation in Alzheimer's model mice

    Directory of Open Access Journals (Sweden)

    Felix eOverhoff

    2016-02-01

    Full Text Available Preclinical PET studies of β-amyloid (Aβ) accumulation are of growing importance, but comparisons between research sites require standardized and optimized methods for quantitation. Therefore we aimed to evaluate systematically the (1) impact of an automated algorithm for spatial brain normalization, and (2) intensity scaling methods of different reference regions for Aβ-PET in a large dataset of transgenic mice. PS2APP mice in a six week longitudinal setting (N = 37) and another set of PS2APP mice at a histologically assessed narrow range of Aβ burden (N = 40) were investigated by [18F]-florbetaben PET. Manual spatial normalization by three readers at different training levels was performed prior to application of an automated brain spatial normalization and inter-reader agreement was assessed by Fleiss Kappa (κ). For this method the impact of templates at different pathology stages was investigated. Four different reference regions on brain uptake normalization were used to calculate frontal cortical standardized uptake value ratios (SUVRCTX/REF), relative to raw SUVCTX. Results were compared on the basis of longitudinal stability (Cohen’s d), and in reference to gold standard histopathological quantitation (Pearson’s R). Application of an automated brain spatial normalization resulted in nearly perfect agreement (all κ ≥ 0.99) between different readers, with constant or improved correlation with histology. Templates based on inappropriate pathology stage resulted in up to 2.9% systematic bias for SUVRCTX/REF. All SUVRCTX/REF methods performed better than SUVCTX both with regard to longitudinal stability (d ≥ 1.21 vs. d = 0.23) and histological gold standard agreement (R ≥ 0.66 vs. R ≥ 0.31). Voxel-wise analysis suggested a physiologically implausible longitudinal decrease of global mean scaling. The hindbrain white matter reference (Rmean = 0.75) was slightly superior to the brainstem (Rmean = 0.74) and the cerebellum (Rmean = 0.73).

  8. Quantitative analysis of secretome from adipocytes regulated by insulin

    Institute of Scientific and Technical Information of China (English)

    Hu Zhou; Yuanyuan Xiao; Rongxia Li; Shangyu Hong; Sujun Li; Lianshui Wang; Rong Zeng; Kan Liao

    2009-01-01

    Adipocyte is not only a central player involved in storage and release of energy, but also in regulation of energy metabolism in other organs via secretion of peptides and proteins. During the pathogenesis of insulin resistance and type 2 diabetes, adipocytes are subjected to increased levels of insulin, which may have a major impact on the secretion of adipokines. We have undertaken cleavable isotope-coded affinity tag (cICAT) and label-free quantitation approaches to identify and quantify secretory factors that are differentially secreted by 3T3-L1 adipocytes with or without insulin treatment. Combining the cICAT and label-free results, 317 proteins were predicted or annotated as secretory proteins. Among these secretory proteins, 179 proteins and 53 proteins were significantly up-regulated and down-regulated, respectively. A total of 77 reported adipokines were quantified in our study, such as adiponectin, cathepsin D, cystatin C, resistin, and transferrin. Western blot analysis of these adipokines confirmed the quantitative results from mass spectrometry, and revealed individualized secretion patterns of these proteins with increasing insulin dose. In addition, 240 proteins were newly identified and quantified as secreted proteins from 3T3-L1 adipocytes in our study, most of which were up-regulated upon insulin treatment. Further comprehensive bioinformatics analysis revealed that the secretory proteins in the extracellular matrix-receptor interaction pathway and glycan structure degradation pathway were significantly up-regulated by insulin stimulation.

  9. Covariate selection in multivariate spatial analysis of ovine parasitic infection.

    Science.gov (United States)

    Musella, V; Catelan, D; Rinaldi, L; Lagazio, C; Cringoli, G; Biggeri, A

    2011-05-01

    Gastrointestinal (GI) strongyle and fluke infections remain one of the main constraints on health and productivity in sheep dairy production. A cross-sectional survey was conducted in 2004-2005 on ovine farms in the Campania region of southern Italy in order to evaluate the prevalence of Haemonchus contortus, Fasciola hepatica, Dicrocoelium dendriticum and Calicophoron daubneyi among other parasitic infections. In the present work, we focused on the role of the ecological characteristics of the pasture environment while accounting for the underlying long-range geographical risk pattern. Bayesian multivariate spatial statistical analysis was used. A systematic grid (10 km × 10 km) sampling approach was used. Laboratory procedures were based on the FLOTAC technique to detect and count eggs of helminths. A Geographical Information System (GIS) was constructed using environmental data layers. Data on each of these layers were then extracted for pasturing areas that had previously been digitized from aerial images of the ovine farms. Bayesian multivariate statistical analyses, including improper multivariate conditional autoregressive models, were used to select covariates on a multivariate spatially structured risk surface. Of the 121 tested farms, 109 were positive for H. contortus, 81 for D. dendriticum, 17 for C. daubneyi and 15 for F. hepatica. The statistical analysis highlighted a north-south long-range spatially structured pattern. This geographical pattern is treated here as a confounder, because the main interest was in the causal role of ecological covariates at the level of each pasturing area. A high percentage of pasture and impermeable soil were strong predictors of F. hepatica risk, and a high percentage of wood was a strong predictor of C. daubneyi. A high percentage of wood, rocks and arable soil with sparse trees explained the spatial distribution of D. dendriticum. Sparse vegetation, river, mixed soil and permeable soil explained the spatial

  10. PIQMIe: A web server for semi-quantitative proteomics data management and analysis

    NARCIS (Netherlands)

    A. Kuzniar (Arnold); R. Kanaar (Roland)

    2014-01-01

    We present the Proteomics Identifications and Quantitations Data Management and Integration Service, or PIQMIe, which aids in reliable and scalable data management, analysis and visualization of semi-quantitative mass spectrometry-based proteomics experiments. PIQMIe readily integrates pept

  11. Automatic quantitative analysis of cardiac MR perfusion images

    Science.gov (United States)

    Breeuwer, Marcel M.; Spreeuwers, Luuk J.; Quist, Marcel J.

    2001-07-01

    Magnetic Resonance Imaging (MRI) is a powerful technique for imaging cardiovascular diseases. The introduction of cardiovascular MRI into clinical practice is however hampered by the lack of efficient and accurate image analysis methods. This paper focuses on the evaluation of blood perfusion in the myocardium (the heart muscle) from MR images, using contrast-enhanced ECG-triggered MRI. We have developed an automatic quantitative analysis method, which works as follows. First, image registration is used to compensate for translation and rotation of the myocardium over time. Next, the boundaries of the myocardium are detected and for each position within the myocardium a time-intensity profile is constructed. The time interval during which the contrast agent passes for the first time through the left ventricle and the myocardium is detected and various parameters are measured from the time-intensity profiles in this interval. The measured parameters are visualized as color overlays on the original images. Analysis results are stored, so that they can later on be compared for different stress levels of the heart. The method is described in detail in this paper and preliminary validation results are presented.
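    The parameters measured from each time-intensity profile during the first pass typically include peak enhancement, time to peak, and maximum upslope. A minimal sketch under that assumption (the curve values are hypothetical, and the paper's exact parameter set may differ):

    ```python
    import numpy as np

    def perfusion_params(times, intensity):
        """Semi-quantitative parameters from one myocardial time-intensity profile."""
        t = np.asarray(times, dtype=float)
        i = np.asarray(intensity, dtype=float)
        peak = i.max()                               # peak enhancement
        time_to_peak = t[i.argmax()]                 # time of peak enhancement
        upslope = np.max(np.diff(i) / np.diff(t))    # maximum first-pass upslope
        return peak, time_to_peak, upslope

    t = np.arange(10.0)                        # hypothetical frame times (s)
    curve = [0, 1, 4, 9, 12, 11, 10, 9, 9, 8]  # hypothetical signal intensities
    peak, ttp, slope = perfusion_params(t, curve)   # 12.0, 4.0, 5.0
    ```

    Evaluating such a function per myocardial position yields the parameter maps that are then shown as color overlays.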

  12. QTL analysis for some quantitative traits in bread wheat

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Quantitative trait loci (QTL) analysis was conducted in bread wheat for 14 important traits utilizing data from four different mapping populations involving different approaches of QTL analysis. Analysis for grain protein content (GPC) suggested that the major part of genetic variation for this trait is due to environmental interactions. In contrast, pre-harvest sprouting tolerance (PHST) was controlled mainly by main-effect QTL (M-QTL) with very little genetic variation due to environmental interactions; a major QTL for PHST was detected on chromosome arm 3AL. For grain weight, one QTL each was detected on chromosome arms 1AS, 2BS and 7AS. The number of QTL for four growth-related traits taken together, as detected by different methods, ranged from 37 to 40; nine QTL that were detected by both single-locus and two-locus analyses were all M-QTL. Similarly, single-locus and two-locus QTL analyses for seven yield and yield-contributing traits in two populations allowed detection of 25 and 50 QTL, respectively, by composite interval mapping (CIM), 16 and 25 QTL by multiple-trait composite interval mapping (MCIM), and 38 and 37 QTL by two-locus analyses. These studies should prove useful in QTL cloning and wheat improvement through marker-aided selection.

  13. Quantitative polymerase chain reaction analysis by deconvolution of internal standard.

    Science.gov (United States)

    Hirakawa, Yasuko; Medh, Rheem D; Metzenberg, Stan

    2010-04-29

    Quantitative Polymerase Chain Reaction (qPCR) is a collection of methods for estimating the number of copies of a specific DNA template in a sample, but one that is not universally accepted because it can lead to highly inaccurate (albeit precise) results. The fundamental problem is that qPCR methods use mathematical models that explicitly or implicitly apply an estimate of amplification efficiency, the error of which is compounded in the analysis to unacceptable levels. We present a new method of qPCR analysis that is efficiency-independent and yields accurate and precise results in controlled experiments. The method depends on a computer-assisted deconvolution that finds the point of concordant amplification behavior between the "unknown" template and an admixed amplicon standard. We apply the method to demonstrate dexamethasone-induced changes in gene expression in lymphoblastic leukemia cell lines. This method of qPCR analysis does not use any explicit or implicit measure of efficiency, and may therefore be immune to problems inherent in other qPCR approaches. It yields an estimate of absolute initial copy number of template, and controlled tests show it generates accurate results.
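    The efficiency-independence can be illustrated with an idealized amplification model: if the unknown template and the admixed standard share the same amplicon, and hence the same (unknown) efficiency, the ratio of their signals is constant over cycles and equals the ratio of initial copy numbers. This is a sketch of that principle only, not the authors' deconvolution algorithm:

    ```python
    import numpy as np

    def amplify(copies0, efficiency, cycles):
        """Idealized PCR signal: copies grow by a factor (1 + efficiency) per cycle."""
        return copies0 * (1.0 + efficiency) ** np.arange(cycles)

    EFF = 0.87                            # unknown in practice; shared by both templates
    unknown  = amplify(500.0, EFF, 30)    # template of unknown copy number
    standard = amplify(1000.0, EFF, 30)   # admixed standard of known copy number

    ratio = unknown / standard            # constant over cycles: 0.5 at every cycle
    estimated_copies = float(ratio.mean() * 1000.0)   # 500.0, independent of EFF
    ```

    Changing `EFF` leaves `ratio` and `estimated_copies` untouched, which is why a concordance-based comparison against an internal standard sidesteps the efficiency-estimation error that plagues other qPCR models.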

  14. Quantitative polymerase chain reaction analysis by deconvolution of internal standard

    Directory of Open Access Journals (Sweden)

    Metzenberg Stan

    2010-04-01

    Full Text Available Abstract Background Quantitative Polymerase Chain Reaction (qPCR) is a collection of methods for estimating the number of copies of a specific DNA template in a sample, but one that is not universally accepted because it can lead to highly inaccurate (albeit precise) results. The fundamental problem is that qPCR methods use mathematical models that explicitly or implicitly apply an estimate of amplification efficiency, the error of which is compounded in the analysis to unacceptable levels. Results We present a new method of qPCR analysis that is efficiency-independent and yields accurate and precise results in controlled experiments. The method depends on a computer-assisted deconvolution that finds the point of concordant amplification behavior between the "unknown" template and an admixed amplicon standard. We apply the method to demonstrate dexamethasone-induced changes in gene expression in lymphoblastic leukemia cell lines. Conclusions This method of qPCR analysis does not use any explicit or implicit measure of efficiency, and may therefore be immune to problems inherent in other qPCR approaches. It yields an estimate of absolute initial copy number of template, and controlled tests show it generates accurate results.

  15. Spatial Analysis of Maritime Traffic for Maritime Security

    Science.gov (United States)

    2009-10-01

    [Briefing-slide extract; recoverable content only.] Sample sizes by vessel class (NB/NS): canoe 4/17, kayak 8/21, motorboat 15/10, sailboat 5/47. Methodology outline: data pre-cleaning (removal of points on land), conversion of trajectory attributes to variates (speed, total distance, bounding box, dedensified trajectory, distance from shore), then pattern analysis with discrimination and classification of trajectories by vessel class (kayak, canoe, sailboat, motorboat).

  16. Spatial wavelet analysis of calcium oscillations in developing neurons.

    Directory of Open Access Journals (Sweden)

    Federico Alessandro Ruffinatti

    Full Text Available Calcium signals play a major role in the control of all key stages of neuronal development, and in particular in the growth and orientation of neuritic processes. These signals are characterized by high spatial compartmentalization, a property which has strong relevance for the different roles of specific neuronal regions in information coding. In this context it is therefore important to understand the structural and functional basis of this spatial compartmentalization, and in particular whether the behavior at each compartment is merely a consequence of its specific geometry or the result of the spatial segregation of specific calcium influx/efflux mechanisms. Here we have developed a novel approach to separate geometrical from functional differences, regardless of the assumptions on the actual mechanisms involved in the generation of calcium signals. First, spatial indices are derived with a wavelet-theoretic approach which define a measure of the oscillations of cytosolic calcium concentration in specific regions of interest (ROIs) along a cell, in our case developing chick ciliary ganglion neurons. The resulting spatial profile demonstrates clearly that different ROIs along the neuron are characterized by specific patterns of calcium oscillations. Next we have investigated whether this inhomogeneity is due just to geometrical factors, namely the surface-to-volume ratio in the different subcompartments (e.g. soma vs. growth cone), or whether it depends on their specific biophysical properties. To this aim correlation functions are computed between the activity indices and the surface/volume ratio along the cell: the data thus obtained are validated by a statistical analysis on a dataset of [Formula: see text] different cells. This analysis shows that whereas in the soma calcium dynamics is highly correlated to the surface/volume ratio, correlations drop in the growth cone-neurite region, suggesting that in this latter case the key factor is the
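    A crude stand-in for a wavelet-derived oscillation index is the detail-coefficient energy of a one-level Haar transform computed per ROI; the paper's actual indices come from a full wavelet-theoretic construction. The ROI names and traces below are hypothetical:

    ```python
    import numpy as np

    def haar_detail_energy(signal):
        """One-level Haar wavelet detail energy: a crude index of local oscillation."""
        x = np.asarray(signal, dtype=float)
        n = len(x) // 2 * 2                     # truncate to an even length
        detail = (x[0:n:2] - x[1:n:2]) / np.sqrt(2.0)
        return float(np.sum(detail ** 2))

    t = np.linspace(0.0, 10.0, 512)
    # Hypothetical calcium traces: oscillation amplitude decreasing along the cell
    rois = {
        "soma":        1.0 * np.sin(8.0 * t),
        "neurite":     0.3 * np.sin(8.0 * t),
        "growth_cone": 0.1 * np.sin(8.0 * t),
    }
    index = {name: haar_detail_energy(trace) for name, trace in rois.items()}
    # index["soma"] > index["neurite"] > index["growth_cone"]
    ```

    In the study, such per-ROI indices are then correlated with the surface/volume ratio along the cell to separate geometrical from biophysical contributions.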

  17. Watershed Planning within a Quantitative Scenario Analysis Framework.

    Science.gov (United States)

    Merriam, Eric R; Petty, J Todd; Strager, Michael P

    2016-07-24

    There is a critical need for tools and methodologies capable of managing aquatic systems within heavily impacted watersheds. Current efforts often fall short as a result of an inability to quantify and predict complex cumulative effects of current and future land use scenarios at relevant spatial scales. The goal of this manuscript is to provide methods for conducting a targeted watershed assessment that enables resource managers to produce landscape-based cumulative effects models for use within a scenario analysis management framework. Sites are first selected for inclusion within the watershed assessment by identifying sites that fall along independent gradients and combinations of known stressors. Field and laboratory techniques are then used to obtain data on the physical, chemical, and biological effects of multiple land use activities. Multiple linear regression analysis is then used to produce landscape-based cumulative effects models for predicting aquatic conditions. Lastly, methods for incorporating cumulative effects models within a scenario analysis framework for guiding management and regulatory decisions (e.g., permitting and mitigation) within actively developing watersheds are discussed and demonstrated for 2 sub-watersheds within the mountaintop mining region of central Appalachia. The watershed assessment and management approach provided herein enables resource managers to facilitate economic and development activity while protecting aquatic resources and producing opportunity for net ecological benefits through targeted remediation.
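    The core of the scenario framework is an ordinary multiple linear regression fitted to landscape covariates and then re-evaluated under a proposed land-use change. A minimal sketch with hypothetical covariates and biotic-index scores (the study's actual predictors and response differ):

    ```python
    import numpy as np

    # Hypothetical training data per sub-watershed: % mined area, % residential area
    X = np.array([[5, 10], [20, 5], [40, 15], [10, 30], [60, 20]], dtype=float)
    y = np.array([80, 70, 45, 65, 30], dtype=float)   # hypothetical biotic index

    A = np.column_stack([np.ones(len(X)), X])          # design matrix with intercept
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)

    def predict(mined_pct, resid_pct):
        """Predicted aquatic condition under a given land-use configuration."""
        return coef[0] + coef[1] * mined_pct + coef[2] * resid_pct

    baseline = predict(20.0, 10.0)
    scenario = predict(35.0, 10.0)   # proposed permit adds 15% mined area
    # scenario < baseline: the model predicts degraded condition
    ```

    Comparing predictions across candidate scenarios is what lets managers weigh permitting and mitigation options before development occurs.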

  18. An approach for quantitative image quality analysis for CT

    Science.gov (United States)

    Rahimi, Amir; Cochran, Joe; Mooney, Doug; Regensburger, Joe

    2016-03-01

    An objective and standardized approach to assess image quality of Computed Tomography (CT) systems is required in a wide variety of imaging processes to identify CT systems appropriate for a given application. We present an overview of the framework we have developed to help standardize and to objectively assess CT image quality for different models of CT scanners used for security applications. Within this framework, we have developed methods to quantitatively measure metrics that should correlate with feature identification, detection accuracy and precision, and image registration capabilities of CT machines and to identify strengths and weaknesses in different CT imaging technologies in transportation security. To that end we have designed, developed and constructed phantoms that allow for systematic and repeatable measurements of roughly 88 image quality metrics, representing modulation transfer function, noise equivalent quanta, noise power spectra, slice sensitivity profiles, streak artifacts, CT number uniformity, CT number consistency, object length accuracy, CT number path length consistency, and object registration. Furthermore, we have developed a sophisticated MATLAB-based image analysis tool kit to analyze CT-generated images of phantoms and report these metrics in a format that is standardized across the considered models of CT scanners, allowing for comparative image quality analysis within a CT model or between different CT models. In addition, we have developed a modified sparse principal component analysis (SPCA) method to generate a modified set of PCA components, as compared to standard principal component analysis (PCA), with sparse loadings, in conjunction with the Hotelling T2 statistical analysis method to compare, qualify, and detect faults in the tested systems.
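    Fault detection with Hotelling's T2 amounts to measuring the Mahalanobis-type distance of a scanner's metric vector from a reference population of in-spec measurements. A sketch with two hypothetical quality metrics (the paper applies the statistic to SPCA components of its 88-metric set):

    ```python
    import numpy as np

    def hotelling_t2(sample, reference):
        """Hotelling T^2 distance of one metric vector from a reference population."""
        ref = np.asarray(reference, dtype=float)
        mu = ref.mean(axis=0)
        cov = np.cov(ref, rowvar=False)
        diff = np.asarray(sample, dtype=float) - mu
        return float(diff @ np.linalg.inv(cov) @ diff)

    rng = np.random.default_rng(42)
    # Hypothetical reference population: 50 in-spec scans, two quality metrics
    reference = rng.normal(loc=[1.0, 0.5], scale=[0.05, 0.02], size=(50, 2))

    in_spec = hotelling_t2([1.01, 0.51], reference)   # small T^2
    faulty  = hotelling_t2([1.30, 0.30], reference)   # large T^2 flags a fault
    ```

    A threshold on T2 (from the F distribution) would then separate in-spec systems from faulty ones.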

  19. GIS-BASED SPATIAL STATISTICAL ANALYSIS OF COLLEGE GRADUATES EMPLOYMENT

    Directory of Open Access Journals (Sweden)

    R. Tang

    2012-07-01

    Full Text Available It is urgently necessary to be aware of the distribution and employment status of college graduates for proper allocation of human resources and overall arrangement of strategic industry. This study provides empirical evidence regarding the use of geocoding and spatial analysis in the distribution and employment status of college graduates, based on data from the 2004-2008 Wuhan Municipal Human Resources and Social Security Bureau, China. The spatio-temporal distribution of employment units was analyzed with geocoding using ArcGIS software, and the stepwise multiple linear regression method via SPSS software was used to predict employment and to identify spatially associated enterprise and professional demand in the future. The results show that the enterprises in the Wuhan East Lake High and New Technology Development Zone increased dramatically from 2004 to 2008, and tended to be distributed southeastward. Furthermore, the models built by statistical analysis suggest that the specialty graduates major in has an important impact on the number of graduates employed and the number engaging in pillar industries. In conclusion, the combination of GIS and statistical analysis, which helps to simulate the spatial distribution of employment status, is a potential tool for human resource development research.

  20. Gis-Based Spatial Statistical Analysis of College Graduates Employment

    Science.gov (United States)

    Tang, R.

    2012-07-01

    It is urgently necessary to be aware of the distribution and employment status of college graduates for proper allocation of human resources and overall arrangement of strategic industry. This study provides empirical evidence regarding the use of geocoding and spatial analysis in the distribution and employment status of college graduates, based on data from the 2004-2008 Wuhan Municipal Human Resources and Social Security Bureau, China. The spatio-temporal distribution of employment units was analyzed with geocoding using ArcGIS software, and the stepwise multiple linear regression method via SPSS software was used to predict employment and to identify spatially associated enterprise and professional demand in the future. The results show that the enterprises in the Wuhan East Lake High and New Technology Development Zone increased dramatically from 2004 to 2008, and tended to be distributed southeastward. Furthermore, the models built by statistical analysis suggest that the specialty graduates major in has an important impact on the number of graduates employed and the number engaging in pillar industries. In conclusion, the combination of GIS and statistical analysis, which helps to simulate the spatial distribution of employment status, is a potential tool for human resource development research.
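    Stepwise regression in its simplest forward form adds, at each step, the predictor that most reduces residual error. A minimal sketch (greedy forward selection by residual sum of squares; SPSS's stepwise procedure additionally applies entry and removal F-tests, and the data here are synthetic):

    ```python
    import numpy as np

    def forward_select(X, y, k):
        """Greedy forward selection: add the column that most reduces residual SSE."""
        chosen = []
        for _ in range(k):
            best_j, best_sse = None, np.inf
            for j in range(X.shape[1]):
                if j in chosen:
                    continue
                A = np.column_stack([np.ones(len(y)), X[:, chosen + [j]]])
                coef, *_ = np.linalg.lstsq(A, y, rcond=None)
                sse = float(np.sum((y - A @ coef) ** 2))
                if sse < best_sse:
                    best_j, best_sse = j, sse
            chosen.append(best_j)
        return chosen

    rng = np.random.default_rng(1)
    X = rng.standard_normal((40, 4))                   # hypothetical predictors
    y = 3.0 * X[:, 2] + 0.1 * rng.standard_normal(40)  # outcome driven by column 2
    selected = forward_select(X, y, 1)                 # [2]
    ```

    The procedure correctly singles out the one informative predictor, mirroring how the study identifies graduate specialty as the dominant covariate.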

  1. Spatial analysis of drumlins within the Arran, Guelph, and Galt drumlin fields of southern Ontario, Canada

    Science.gov (United States)

    Maclachlan, John

    2016-04-01

    Reconstruction of former ice conditions and glacier dynamics in previously glaciated terrains requires understanding of the processes and controls on the development of subglacial landforms such as drumlins. This paper presents a quantitative analysis of the spatial distribution of drumlins identified from digital elevation model (DEM) data within three drumlin fields in southern Ontario, Canada (the Arran, Galt and Guelph drumlin fields) formed in the Late Wisconsin by the Ontario and Georgian Bay ice lobes of the Laurentide Ice Sheet. Detailed field description of a partially excavated drumlin within the Guelph drumlin field provides further insight to complement the geomorphometric analysis. Drumlins are identified and their morphological parameters documented using a computer-based process that allows direct comparison of forms within and between individual fields. Statistical analysis of the morphological characteristics and spatial distribution of drumlins within each of the three drumlin fields, using kernel density and nearest neighbour analysis, indicates that drumlins of particular types show distinct patterns of clustering that appear to be related to several different factors, including length of time under ice, bedrock topography, and ice velocity. Sediments exposed in an excavated drumlin within the Guelph drumlin field show relatively undisturbed, crudely stratified older fluvial or glaciofluvial sands draped by a thin veneer of coarse-grained deformation till. This stratigraphy is similar to that described from modern drumlins in Iceland and is consistent with models of drumlin formation by subglacial deformation processes. The methodology of drumlin analysis can be applied to the study of any drumlin field with an adequate coverage of digital spatial data. The ability to consistently identify and characterize drumlin morphology and distribution will allow more objective and systematic comparison of these landforms both within and between
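    Nearest-neighbour analysis of a point pattern is commonly summarized by the Clark-Evans ratio R, the observed mean nearest-neighbour distance divided by that expected under complete spatial randomness. A sketch with hypothetical drumlin centroids (the paper's exact statistic and edge corrections may differ):

    ```python
    import numpy as np

    def clark_evans(points, area):
        """Clark-Evans ratio R: R < 1 clustered, R ~ 1 random, R > 1 dispersed."""
        pts = np.asarray(points, dtype=float)
        d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
        np.fill_diagonal(d, np.inf)                    # ignore self-distances
        mean_nn = d.min(axis=1).mean()                 # observed mean NN distance
        expected = 0.5 / np.sqrt(len(pts) / area)      # expectation under CSR
        return mean_nn / expected

    # Hypothetical drumlin centroids (km) forming a tight cluster in a 100 km^2 field
    clustered = [(1.0, 1.0), (1.2, 1.1), (0.9, 1.3), (1.1, 0.8), (1.3, 1.2)]
    r = clark_evans(clustered, area=100.0)   # well below 1: clustered pattern
    ```

    Applied per drumlin type, such a statistic quantifies the distinct clustering patterns the study reports.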

  2. Research on Bribery Characterization of Property Developers in Land Market: Based on Quantitative Analysis

    Institute of Scientific and Technical Information of China (English)

    Hongxia ZHANG; Shukui TAN; Li XIA; Chunhong JIANG

    2015-01-01

    Taking Wuhan, Nanjing and Guangzhou as examples and using a logistic model and the Moran index, this paper made a quantitative analysis of the bribery characterization of property developers in the land market. It found that (i) developers' bribery behavior is promoted by both supply and demand, and (ii) developers' bribery behavior exhibits regional agglomeration and regional differences. It reached the following conclusions: (i) under the influence of macro and micro factors, developers' bribery behavior is a rational selection made after full consideration of the institutional environment and corporate strength, and is passive to a certain extent; (ii) developers' bribery behavior has a certain spatial correlation, and the high-high correlation characteristic is the most significant.
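    Global Moran's I, the index used here to detect spatial agglomeration, can be computed directly from a value vector and a spatial weights matrix. A sketch with hypothetical district counts and rook adjacency (values and weights are illustrative, not from the study):

    ```python
    import numpy as np

    def morans_i(values, W):
        """Global Moran's I for spatial autocorrelation given weights matrix W."""
        x = np.asarray(values, dtype=float)
        W = np.asarray(W, dtype=float)
        z = x - x.mean()
        return (len(x) / W.sum()) * (z @ W @ z) / (z @ z)

    # Hypothetical incidence counts on four districts in a rook-adjacency chain:
    # a high-high pair next to a low-low pair suggests positive autocorrelation
    values = [9, 8, 2, 1]
    W = np.array([[0, 1, 0, 0],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [0, 0, 1, 0]], dtype=float)
    I = morans_i(values, W)   # 0.4 > 0: spatial clustering
    ```

    A positive I with significant high-high local clusters is exactly the "regional agglomeration" signature the paper reports.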

  3. Finite element analysis of the hip and spine based on quantitative computed tomography.

    Science.gov (United States)

    Carpenter, R Dana

    2013-06-01

    Quantitative computed tomography (QCT) provides three-dimensional information about bone geometry and the spatial distribution of bone mineral. Images obtained with QCT can be used to create finite element models, which offer the ability to analyze bone strength and the distribution of mechanical stress and physical deformation. This approach can be used to investigate different mechanical loading scenarios (stance and fall configurations at the hip, for example) and to estimate whole bone strength and the relative mechanical contributions of the cortical and trabecular bone compartments. Finite element analyses based on QCT images of the hip and spine have been used to provide important insights into the biomechanical effects of factors such as age, sex, bone loss, pharmaceuticals, and mechanical loading at sites of high clinical importance. Thus, this analysis approach has become an important tool in the study of the etiology and treatment of osteoporosis at the hip and spine.

  4. Quantitative Analysis of Piezoelectric and Seismoelectric Anomalies in Subsurface Geophysics

    Science.gov (United States)

    Eppelbaum, Lev

    2017-04-01

    , apatite-nepheline, essentially sphalerite, and ore-quartz deposits of gold, tin, tungsten, molybdenum, zinc, crystal, and other raw materials. This method also enables differentiation of rocks such as bauxites, kimberlites, etc., from the host rocks by their electrokinetic properties. Classification of some rocks, ores, and minerals by their piezoactivity is given in Table 1. These objects (targets) transform elastic wave oscillations into electromagnetic ones. It should be taken into account that anomalous bodies may be detected not only by positive but also by negative anomalies, if a low-piezoactive body occurs in a higher-piezoactive medium. The piezoelectric method is an example of successful application of piezoelectric and seismo-electrokinetic phenomena in exploration and environmental geophysics, designed for delineation of targets differing from the host media by piezoelectric properties (Neishtadt et al., 2006; Neishtadt and Eppelbaum, 2012). This method is employed in surface, downhole, and underground modes. Recent testing of piezoelectric effects of archaeological samples composed of fired clay has shown values of 2.0-3.0 × 10^-14 C/N. However, the absence of reliable procedures for solving the direct and inverse problems of piezoelectric anomalies (PEA) drastically hampers further progress of the method. Therefore, it was suggested to adapt the tomography procedure, widely used in seismic prospecting, to PEA modeling. Diffraction of seismic waves has been computed for models of a circular cylinder, a thin inclined bed, and a thick bed (Alperovich et al., 1997). As a result, the spatio-temporal distribution of the electromagnetic field caused by the seismic wave has been found. The computations have shown that the effectiveness and reliability of PEA analysis may be critically enhanced by considering total electro- and magnetograms, as differentiated from the conventional approaches. Distribution of the electromagnetic field obtained by solving the direct

  5. Quantitative DNA methylation analysis of candidate genes in cervical cancer.

    Directory of Open Access Journals (Sweden)

    Erin M Siegel

    Full Text Available Aberrant DNA methylation has been observed in cervical cancer; however, most studies have used non-quantitative approaches to measure DNA methylation. The objective of this study was to quantify methylation within a select panel of genes previously identified as targets for epigenetic silencing in cervical cancer and to identify genes with elevated methylation that can distinguish cancer from normal cervical tissues. We identified 49 women with invasive squamous cell cancer of the cervix and 22 women with normal cytology specimens. Bisulfite-modified genomic DNA was amplified and quantitative pyrosequencing completed for 10 genes (APC, CCNA, CDH1, CDH13, WIF1, TIMP3, DAPK1, RARB, FHIT, and SLIT2). A Methylation Index was calculated as the mean percent methylation across all CpG sites analyzed per gene (~4-9 CpG sites per sequence). A binary cut-point was defined at >15% methylation. Sensitivity, specificity and area under the ROC curve (AUC) of methylation in individual genes or a panel were examined. The median methylation index was significantly higher in cases compared to controls in 8 genes, whereas there was no difference in median methylation for 2 genes. Compared to HPV and age, the combination of DNA methylation level of DAPK1, SLIT2, WIF1 and RARB with HPV and age significantly improved the AUC from 0.79 to 0.99 (95% CI: 0.97-1.00, p-value = 0.003). Pyrosequencing analysis confirmed that several genes are common targets for aberrant methylation in cervical cancer and DNA methylation level of four genes appears to increase specificity to identify cancer compared to HPV detection alone. Alterations in DNA methylation of specific genes in cervical cancers, such as DAPK1, RARB, WIF1, and SLIT2, may also occur early in cervical carcinogenesis and should be evaluated.

  6. Quantitative DNA methylation analysis of candidate genes in cervical cancer.

    Science.gov (United States)

    Siegel, Erin M; Riggs, Bridget M; Delmas, Amber L; Koch, Abby; Hakam, Ardeshir; Brown, Kevin D

    2015-01-01

    Aberrant DNA methylation has been observed in cervical cancer; however, most studies have used non-quantitative approaches to measure DNA methylation. The objective of this study was to quantify methylation within a select panel of genes previously identified as targets for epigenetic silencing in cervical cancer and to identify genes with elevated methylation that can distinguish cancer from normal cervical tissues. We identified 49 women with invasive squamous cell cancer of the cervix and 22 women with normal cytology specimens. Bisulfite-modified genomic DNA was amplified and quantitative pyrosequencing completed for 10 genes (APC, CCNA, CDH1, CDH13, WIF1, TIMP3, DAPK1, RARB, FHIT, and SLIT2). A Methylation Index was calculated as the mean percent methylation across all CpG sites analyzed per gene (~4-9 CpG sites per sequence). A binary cut-point was defined at >15% methylation. Sensitivity, specificity and area under ROC curve (AUC) of methylation in individual genes or a panel was examined. The median methylation index was significantly higher in cases compared to controls in 8 genes, whereas there was no difference in median methylation for 2 genes. Compared to HPV and age, the combination of DNA methylation level of DAPK1, SLIT2, WIF1 and RARB with HPV and age significantly improved the AUC from 0.79 to 0.99 (95% CI: 0.97-1.00, p-value = 0.003). Pyrosequencing analysis confirmed that several genes are common targets for aberrant methylation in cervical cancer and DNA methylation level of four genes appears to increase specificity to identify cancer compared to HPV detection alone. Alterations in DNA methylation of specific genes in cervical cancers, such as DAPK1, RARB, WIF1, and SLIT2, may also occur early in cervical carcinogenesis and should be evaluated.
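    The Methylation Index and binary cut-point described above reduce to a mean across CpG sites and a >15% threshold. A sketch with hypothetical CpG-site percentages (the per-site values are illustrative, not from the study):

    ```python
    import numpy as np

    def methylation_index(cpg_percents):
        """Mean percent methylation across the CpG sites assayed for one gene."""
        return float(np.mean(cpg_percents))

    def is_hypermethylated(cpg_percents, cutoff=15.0):
        """Binary call using the >15% methylation cut-point."""
        return methylation_index(cpg_percents) > cutoff

    case_dapk1    = [42.0, 38.5, 51.0, 47.2]   # hypothetical CpG values, cancer
    control_dapk1 = [3.1, 2.4, 5.0, 4.5]       # hypothetical CpG values, normal

    case_call = is_hypermethylated(case_dapk1)        # True
    control_call = is_hypermethylated(control_dapk1)  # False
    ```

    Combining such binary calls across a gene panel is what drives the reported gain in AUC over HPV detection alone.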

  7. An operational modal analysis method in frequency and spatial domain

    Science.gov (United States)

    Wang, Tong; Zhang, Lingmi; Tamura, Yukio

    2005-12-01

    A frequency and spatial domain decomposition method (FSDD) for operational modal analysis (OMA) is presented in this paper, which is an extension of the complex mode indicator function (CMIF) method for experimental modal analysis (EMA). The theoretical background of the FSDD method is clarified. Singular value decomposition is adopted to separate the signal space from the noise space. Finally, an enhanced power spectrum density (PSD) is proposed to obtain more accurate modal parameters by curve fitting in the frequency domain. Moreover, a simulation case and an application case are used to validate this method.

  8. An operational modal analysis method in frequency and spatial domain

    Institute of Scientific and Technical Information of China (English)

    Wang Tong; Zhang Lingmi; Tamura Yukio

    2005-01-01

    A frequency and spatial domain decomposition method (FSDD) for operational modal analysis (OMA) is presented in this paper, which is an extension of the complex mode indicator function (CMIF) method for experimental modal analysis (EMA). The theoretical background of the FSDD method is clarified. Singular value decomposition is adopted to separate the signal space from the noise space. Finally, an enhanced power spectrum density (PSD) is proposed to obtain more accurate modal parameters by curve fitting in the frequency domain. Moreover, a simulation case and an application case are used to validate this method.
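    The core CMIF/FSDD step, a singular value decomposition of the output spectral matrix at each frequency line, can be sketched on synthetic data: peaks of the first singular value locate the modes, and the corresponding singular vector approximates the mode shape. The sensor count, mode shape, and spectra below are synthetic, and the enhanced-PSD curve fitting of the full method is omitted:

    ```python
    import numpy as np

    # Synthetic output-only data: 3 sensors, one mode with shape phi resonant near 4 Hz
    freqs = np.linspace(0.0, 10.0, 201)
    phi = np.array([1.0, 2.0, -1.0])                  # synthetic mode shape
    amp = 1.0 / ((freqs - 4.0) ** 2 + 0.05)           # sharply peaked modal amplitude
    G = amp[:, None, None] * np.outer(phi, phi) + 0.01 * np.eye(3)  # spectral matrices

    # CMIF/FSDD step: SVD of the spectral matrix at every frequency line
    s1 = np.array([np.linalg.svd(Gf, compute_uv=False)[0] for Gf in G])
    peak_idx = int(np.argmax(s1))
    f_peak = freqs[peak_idx]                          # ~4.0 Hz
    shape_est = np.linalg.svd(G[peak_idx])[0][:, 0]   # first singular vector ~ phi
    ```

    The SVD separates the signal subspace (first singular value and vector) from the noise floor, which is why the first-singular-value spectrum serves as a mode indicator function.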

  9. Spectral theory and nonlinear analysis with applications to spatial ecology

    CERN Document Server

    Cano-Casanova, S; Mora-Corral , C

    2005-01-01

    This volume details some of the latest advances in spectral theory and nonlinear analysis through various cutting-edge theories on algebraic multiplicities, global bifurcation theory, non-linear Schrödinger equations, non-linear boundary value problems, large solutions, metasolutions, dynamical systems, and applications to spatial ecology. The main scope of the book is bringing together a series of topics that have evolved separately during the last decades around the common denominator of spectral theory and nonlinear analysis - from the most abstract developments up to the most concrete applications to population dynamics and socio-biology - in an effort to fill the existing gaps between these fields.

  10. Quantitative Phosphoproteomics Analysis of ERBB3/ERBB4 Signaling.

    Science.gov (United States)

    Wandinger, Sebastian K; Lahortiga, Idoya; Jacobs, Kris; Klammer, Martin; Jordan, Nicole; Elschenbroich, Sarah; Parade, Marc; Jacoby, Edgar; Linders, Joannes T M; Brehmer, Dirk; Cools, Jan; Daub, Henrik

    2016-01-01

    The four members of the epidermal growth factor receptor (EGFR/ERBB) family form homo- and heterodimers which mediate ligand-specific regulation of many key cellular processes in normal and cancer tissues. While signaling through the EGFR has been extensively studied on the molecular level, signal transduction through ERBB3/ERBB4 heterodimers is less well understood. Here, we generated isogenic mouse Ba/F3 cells that express full-length and functional membrane-integrated ERBB3 and ERBB4 or ERBB4 alone, to serve as a defined cellular model for biological and phosphoproteomics analysis of ERBB3/ERBB4 signaling. ERBB3 co-expression significantly enhanced Ba/F3 cell proliferation upon neuregulin-1 (NRG1) treatment. For comprehensive signaling studies we performed quantitative mass spectrometry (MS) experiments to compare the basal ERBB3/ERBB4 cell phosphoproteome to NRG1 treatment of ERBB3/ERBB4 and ERBB4 cells. We employed a workflow comprising differential isotope labeling with mTRAQ reagents followed by chromatographic peptide separation and final phosphopeptide enrichment prior to MS analysis. Overall, we identified 9686 phosphorylation sites which could be confidently localized to specific residues. Statistical analysis of three replicate experiments revealed 492 phosphorylation sites which were significantly changed in NRG1-treated ERBB3/ERBB4 cells. Bioinformatics data analysis recapitulated regulation of mitogen-activated protein kinase and Akt pathways, but also indicated signaling links to cytoskeletal functions and nuclear biology. Comparative assessment of NRG1-stimulated ERBB4 Ba/F3 cells revealed that ERBB3 did not trigger defined signaling pathways but more broadly enhanced phosphoproteome regulation in cells expressing both receptors. In conclusion, our data provide the first global picture of ERBB3/ERBB4 signaling and provide numerous potential starting points for further mechanistic studies.

  11. Quantitative Phosphoproteomics Analysis of ERBB3/ERBB4 Signaling.

    Directory of Open Access Journals (Sweden)

    Sebastian K Wandinger

    Full Text Available The four members of the epidermal growth factor receptor (EGFR/ERBB) family form homo- and heterodimers which mediate ligand-specific regulation of many key cellular processes in normal and cancer tissues. While signaling through the EGFR has been extensively studied on the molecular level, signal transduction through ERBB3/ERBB4 heterodimers is less well understood. Here, we generated isogenic mouse Ba/F3 cells that express full-length and functional membrane-integrated ERBB3 and ERBB4 or ERBB4 alone, to serve as a defined cellular model for biological and phosphoproteomics analysis of ERBB3/ERBB4 signaling. ERBB3 co-expression significantly enhanced Ba/F3 cell proliferation upon neuregulin-1 (NRG1) treatment. For comprehensive signaling studies we performed quantitative mass spectrometry (MS) experiments to compare the basal ERBB3/ERBB4 cell phosphoproteome to NRG1 treatment of ERBB3/ERBB4 and ERBB4 cells. We employed a workflow comprising differential isotope labeling with mTRAQ reagents followed by chromatographic peptide separation and final phosphopeptide enrichment prior to MS analysis. Overall, we identified 9686 phosphorylation sites which could be confidently localized to specific residues. Statistical analysis of three replicate experiments revealed 492 phosphorylation sites which were significantly changed in NRG1-treated ERBB3/ERBB4 cells. Bioinformatics data analysis recapitulated regulation of mitogen-activated protein kinase and Akt pathways, but also indicated signaling links to cytoskeletal functions and nuclear biology. Comparative assessment of NRG1-stimulated ERBB4 Ba/F3 cells revealed that ERBB3 did not trigger defined signaling pathways but more broadly enhanced phosphoproteome regulation in cells expressing both receptors. In conclusion, our data provide the first global picture of ERBB3/ERBB4 signaling and provide numerous potential starting points for further mechanistic studies.

  12. Quantitative analysis of cryptic splicing associated with TDP-43 depletion.

    Science.gov (United States)

    Humphrey, Jack; Emmett, Warren; Fratta, Pietro; Isaacs, Adrian M; Plagnol, Vincent

    2017-05-26

    Reliable exon recognition is key to the splicing of pre-mRNAs into mature mRNAs. TDP-43 is an RNA-binding protein whose nuclear loss and cytoplasmic aggregation are a hallmark pathology in amyotrophic lateral sclerosis and frontotemporal dementia (ALS/FTD). TDP-43 depletion causes the aberrant inclusion of cryptic exons into a range of transcripts, but their extent, their relevance to disease pathogenesis, and whether they are caused by other RNA-binding proteins implicated in ALS/FTD are unknown. We developed an analysis pipeline to discover and quantify cryptic exon inclusion and applied it to publicly available human and murine RNA-sequencing data. We detected widespread cryptic splicing in TDP-43 depletion datasets but almost none upon depletion of another ALS/FTD-linked protein, FUS. Sequence motif and iCLIP analysis of cryptic exons demonstrated that they are bound by TDP-43. Unlike the cryptic exons seen in hnRNP C depletion, those repressed by TDP-43 cannot be linked to transposable elements. Cryptic exons are poorly conserved, and their inclusion overwhelmingly leads to nonsense-mediated decay of the host transcript, with reduced transcript levels observed in differential expression analysis. RNA-protein interaction data on 73 different RNA-binding proteins showed that, in addition to TDP-43, seven specifically bind TDP-43-linked cryptic exons. This suggests that TDP-43 competes with other splicing factors for binding to cryptic exons and can repress cryptic exon inclusion. Our quantitative analysis pipeline confirms the presence of cryptic exons upon depletion of TDP-43 but not FUS, providing new insight into RNA-processing dysfunction as a cause or consequence in ALS/FTD.

  13. Comprehensive Quantitative Analysis of Ovarian and Breast Cancer Tumor Peptidomes

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Zhe; Wu, Chaochao; Xie, Fang; Slysz, Gordon W.; Tolic, Nikola; Monroe, Matthew E.; Petyuk, Vladislav A.; Payne, Samuel H.; Fujimoto, Grant M.; Moore, Ronald J.; Fillmore, Thomas L.; Schepmoes, Athena A.; Levine, Douglas; Townsend, Reid; Davies, Sherri; Li, Shunqiang; Ellis, Matthew; Boja, Emily; Rivers, Robert; Rodriguez, Henry; Rodland, Karin D.; Liu, Tao; Smith, Richard D.

    2015-01-02

    Aberrant degradation of proteins is associated with many pathological states, including cancers. Mass spectrometric analysis of tumor peptidomes, the intracellular and intercellular products of protein degradation, has the potential to provide biological insights on proteolytic processing in cancer. However, attempts to use the information on these smaller protein degradation products from tumors for biomarker discovery and cancer biology studies have been fairly limited to date, largely due to the lack of effective approaches for robust peptidomics identification and quantification, and the prevalence of confounding factors and biases associated with sample handling and processing. Herein, we have developed an effective and robust analytical platform for comprehensive analyses of tissue peptidomes, which is suitable for high throughput quantitative studies. The reproducibility and coverage of the platform, as well as the suitability of clinical ovarian tumor and patient-derived breast tumor xenograft samples with post-excision delay of up to 60 min before freezing for peptidomics analysis, have been demonstrated. Moreover, our data also show that the peptidomics profiles can effectively separate breast cancer subtypes, reflecting tumor-associated protease activities. Peptidomics complements results obtainable from conventional bottom-up proteomics, and provides insights not readily obtainable from such approaches.

  14. Correlation between two methods of florbetapir PET quantitative analysis.

    Science.gov (United States)

    Breault, Christopher; Piper, Jonathan; Joshi, Abhinay D; Pirozzi, Sara D; Nelson, Aaron S; Lu, Ming; Pontecorvo, Michael J; Mintun, Mark A; Devous, Michael D

    2017-01-01

    This study evaluated performance of a commercially available standardized software program for calculation of florbetapir PET standard uptake value ratios (SUVr) in comparison with an established research method. Florbetapir PET images for 183 subjects clinically diagnosed as cognitively normal (CN), mild cognitive impairment (MCI) or probable Alzheimer's disease (AD) (45 AD, 60 MCI, and 78 CN) were evaluated using two software processing algorithms. The research method uses a single florbetapir PET template generated by averaging both amyloid positive and amyloid negative registered brains together. The commercial software simultaneously optimizes the registration between the florbetapir PET images and three templates: amyloid negative, amyloid positive, and an average. Cortical average SUVr values were calculated across six predefined anatomic regions with respect to the whole cerebellum reference region. SUVr values were well correlated between the two methods (r2 = 0.98). The relationship between the methods computed from the regression analysis is: Commercial method SUVr = (0.9757*Research SUVr) + 0.0299. A previously defined cutoff SUVr of 1.1 for distinguishing amyloid positivity by the research method corresponded to 1.1 (95% CI = 1.098, 1.11) for the commercial method. This study suggests that the commercial method is comparable to the published research method of SUVr analysis for florbetapir PET images, thus facilitating the potential use of standardized quantitative approaches to PET amyloid imaging.
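
    The reported regression relating the two methods is simple enough to apply directly. A minimal sketch (coefficients and cutoff taken from the abstract; function names are our own):

```python
# Sketch of the reported regression between the two SUVr methods
# (coefficients from the abstract; function names are illustrative).
def commercial_suvr(research_suvr):
    """Map a research-method SUVr onto the commercial method's scale."""
    return 0.9757 * research_suvr + 0.0299

def is_amyloid_positive(research_suvr, cutoff=1.1):
    """Apply the research-method positivity cutoff of 1.1."""
    return research_suvr >= cutoff

# The research cutoff of 1.1 maps to ~1.103 on the commercial scale,
# consistent with the reported equivalent cutoff of 1.1 (95% CI 1.098-1.11).
print(round(commercial_suvr(1.1), 3))  # 1.103
```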

  15. Quantitative evaluation of midpalatal suture maturation via fractal analysis

    Science.gov (United States)

    Kwak, Kyoung Ho; Kim, Yong-Il; Kim, Yong-Deok

    2016-01-01

    Objective The purpose of this study was to determine whether the results of fractal analysis can be used as criteria for evaluating midpalatal suture maturation. Methods The study included 131 subjects over 18 years of age (range 18.1–53.4 years) who underwent cone-beam computed tomography. Skeletonized images of the midpalatal suture were obtained via image processing software and used to calculate fractal dimensions. Correlations between maturation stage and fractal dimensions were calculated using Spearman's correlation coefficient. Optimal fractal dimension cut-off values were determined using a receiver operating characteristic curve. Results The distribution of maturation stages of the midpalatal suture according to the cervical vertebrae maturation index was highly variable, and there was a strong negative correlation between maturation stage and fractal dimension (−0.623, p < 0.001). Fractal dimension was a statistically significant indicator of dichotomous results with regard to maturation stage (area under curve = 0.794, p < 0.001). A receiver operating characteristic analysis in which fractal dimension was used to predict the variable splitting maturation stages into A–C and D or E yielded an optimal fractal dimension cut-off value of 1.0235. Conclusions There was a strong negative correlation between fractal dimension and midpalatal suture maturation. Fractal analysis is an objective quantitative method, and therefore we suggest that it may be useful for the evaluation of midpalatal suture maturation. PMID:27668195
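
    As background, the fractal dimension of a skeletonized image is typically estimated by box counting: count the boxes occupied by foreground pixels at several box sizes and fit the slope of log N(s) against log(1/s). A minimal pure-Python sketch (our illustration, not the authors' implementation):

```python
import math

def box_count(pixels, size):
    """Count boxes of the given size containing at least one foreground pixel."""
    boxes = {(x // size, y // size) for (x, y) in pixels}
    return len(boxes)

def fractal_dimension(pixels, sizes=(1, 2, 4, 8)):
    """Least-squares slope of log N(s) against log(1/s) over the box sizes."""
    xs = [math.log(1.0 / s) for s in sizes]
    ys = [math.log(box_count(pixels, s)) for s in sizes]
    n = len(sizes)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Sanity check: a straight line of pixels has dimension 1.
line = [(i, 0) for i in range(64)]
print(round(fractal_dimension(line), 2))  # 1.0
```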

  16. Therapeutic electrical stimulation for spasticity: quantitative gait analysis.

    Science.gov (United States)

    Pease, W S

    1998-01-01

    Improvement in motor function following electrical stimulation is related to strengthening of the stimulated spastic muscle and inhibition of the antagonist. A 26-year-old man with familial spastic paraparesis presented with gait dysfunction and bilateral lower limb spastic muscle tone. Clinically, muscle strength and sensation were normal. He was considered appropriate for a trial of therapeutic electrical stimulation following failed trials of physical therapy and baclofen. No other treatment was used concurrent with the electrical stimulation. Before treatment, quantitative gait analysis revealed 63% of normal velocity and a crouched gait pattern, associated with excessive electromyographic activity in the hamstrings and gastrocnemius muscles. Based on these findings, bilateral stimulation of the quadriceps and anterior compartment musculature was performed two to three times per week for three months. Repeat gait analysis was conducted three weeks after the cessation of stimulation treatment. A 27% increase in velocity was noted associated with an increase in both cadence and right step length. Right hip and bilateral knee stance motion returned to normal (rather than "crouched"). No change in the timing of dynamic electromyographic activity was seen. These findings suggest a role for the use of electrical stimulation for rehabilitation of spasticity. The specific mechanism of this improvement remains uncertain.

  17. A method for rapid quantitative assessment of biofilms with biomolecular staining and image analysis.

    Science.gov (United States)

    Larimer, Curtis; Winder, Eric; Jeters, Robert; Prowant, Matthew; Nettleship, Ian; Addleman, Raymond Shane; Bonheyo, George T

    2016-01-01

    The accumulation of bacteria in surface-attached biofilms can be detrimental to human health, dental hygiene, and many industrial processes. Natural biofilms are soft and often transparent, and they have heterogeneous biological composition and structure over micro- and macroscales. As a result, it is challenging to quantify the spatial distribution and overall intensity of biofilms. In this work, a new method was developed to enhance the visibility and quantification of bacterial biofilms. First, broad-spectrum biomolecular staining was used to enhance the visibility of the cells, nucleic acids, and proteins that make up biofilms. Then, an image analysis algorithm was developed to objectively and quantitatively measure biofilm accumulation from digital photographs and results were compared to independent measurements of cell density. This new method was used to quantify the growth intensity of Pseudomonas putida biofilms as they grew over time. This method is simple and fast, and can quantify biofilm growth over a large area with approximately the same precision as the more laborious cell counting method. Stained and processed images facilitate assessment of spatial heterogeneity of a biofilm across a surface. This new approach to biofilm analysis could be applied in studies of natural, industrial, and environmental biofilms.
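
    The quantification step can be pictured with a much-simplified stand-in for the authors' algorithm: threshold the stained photograph and report the fraction of the surface covered. (The real pipeline also removes background and works on color images; this toy only illustrates the idea.)

```python
# Simplified sketch (not the authors' algorithm): quantify stain coverage
# in a grayscale image by thresholding and reporting the covered fraction.
def coverage_fraction(image, threshold):
    """Fraction of pixels whose stain intensity exceeds the threshold."""
    total = sum(len(row) for row in image)
    covered = sum(1 for row in image for px in row if px > threshold)
    return covered / total

# Toy 4x4 "photograph": higher values mean more stain.
toy = [
    [10, 10, 200, 210],
    [12, 15, 190, 205],
    [11, 13, 185, 208],
    [10, 14, 195, 200],
]
print(coverage_fraction(toy, 128))  # 0.5: half the surface is stained
```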

  18. Tuberculosis DALY-Gap: Spatial and Quantitative Comparison of Disease Burden Across Urban Slum and Non-slum Census Tracts.

    Science.gov (United States)

    Marlow, Mariel A; Maciel, Ethel Leonor Noia; Sales, Carolina Maia Martins; Gomes, Teresa; Snyder, Robert E; Daumas, Regina Paiva; Riley, Lee W

    2015-08-01

    To quantitatively assess disease burden due to tuberculosis between populations residing in and outside of urban informal settlements in Rio de Janeiro, Brazil, we compared disability-adjusted life years (DALYs), or "DALY-gap." Using the 2010 Brazilian census definition of informal settlements as aglomerados subnormais (AGSN), we allocated tuberculosis (TB) DALYs to AGSN vs non-AGSN census tracts based on geocoded addresses of TB cases reported to the Brazilian Information System for Notifiable Diseases in 2005 and 2010. DALYs were calculated based on the 2010 Global Burden of Disease methodology. DALY-gap was calculated as the difference between age-adjusted DALYs/100,000 population between AGSN and non-AGSN. Total TB DALY in Rio in 2010 was 16,731 (266 DALYs/100,000). DALYs were higher in AGSN census tracts (306 vs 236 DALYs/100,000), yielding a DALY-gap of 70 DALYs/100,000. Attributable DALY fraction for living in an AGSN was 25.4%. DALY-gap was highest for males 40-59 years of age (501 DALYs/100,000) and in census tracts with <60% electricity (12,327 DALYs/100,000). DALY-gap comparison revealed spatial and quantitative differences in TB burden between slum vs non-slum census tracts that were not apparent using traditional measures of incidence and mortality. This metric could be applied to compare TB burden or burden for other diseases in mega-cities with large informal settlements for more targeted resource allocation and evaluation of intervention programs.
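
    The core DALY-gap arithmetic, using the age-adjusted rates reported in the abstract:

```python
# DALY-gap as defined in the abstract: the difference in age-adjusted
# DALYs per 100,000 between slum (AGSN) and non-slum census tracts.
dalys_agsn = 306      # age-adjusted DALYs per 100,000, AGSN tracts
dalys_non_agsn = 236  # age-adjusted DALYs per 100,000, non-AGSN tracts

daly_gap = dalys_agsn - dalys_non_agsn
print(daly_gap)  # 70 DALYs/100,000, matching the reported gap
```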

  19. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    Science.gov (United States)

    Shortle, John F.; Allocco, Michael

    2005-01-01

    This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine whether engineering modeling is applicable to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

  20. Visualization analysis of multivariate spatial-temporal data of the Red Army Long March in China

    Science.gov (United States)

    Ma, Ding; Ma, Zhimin; Meng, Lumin; Li, Xia

    2009-10-01

    Recently, the visualization of spatial-temporal data for historic events has received growing attention, and providing efficient and effective approaches to meet this requirement is a task for geo-data modeling researchers. The aim of this paper is to offer a new perspective on visualizing the multivariate spatial-temporal data of the Red Army Long March, one of the most important events in modern Chinese history. This research focuses on the extraction of relevant information from a 3-dimensional trajectory, which captures object locations in geographic space at specified temporal intervals. However, existing visualization methods cannot deal with multivariate spatial-temporal data effectively, so there is an opportunity to represent and analyze this kind of data in the case study. The paper combines two visualization methods, the Space-Time-Cube for spatial-temporal data and Parallel Coordinates Plots (PCPs) for multivariate data, to develop a conceptual GIS database model that facilitates the exploration and analysis of multivariate spatial-temporal datasets by combining the 3D Space-Time-Path with 2D graphics. The designed model is supported by a geo-visualization environment, integrates diverse sets of multivariate spatial-temporal data, and builds up the dynamic processes and relationships among them. It is concluded that this form of geo-visualization can effectively manipulate a large amount of distributed data, realize highly efficient transmission of quantitative and qualitative information, and provide a new research mode in the fields of the history of the CPC and military affairs.

  1. A microcomputer-based system for quantitative petrographic analysis

    Science.gov (United States)

    Starkey, John; Samantaray, Abani Kanta

    1994-11-01

    An imaging system based on a videocamera and frame grabber is described which is capable of capturing and analyzing composite images. Individual images are captured interactively; this permits manipulation of the illumination to emphasize selected features of interest in sequentially captured images. Data from the sequential images are accumulated to form a synoptic image, which allows analysis to proceed in a manner which emulates the techniques of manual, polarized light microscopy. The effects of rotating a thin section in plane and crossed polarized light can be simulated so that mineral boundaries can be detected across which there is a lack of contrast at some orientations. The imaging system implements algorithms for digital filtering and boundary identification and incorporates facilities for image editing. Mathematical functions are provided for the interpolation of boundaries which are not detected in their entirety, in a way analogous to visual interpretation. The image data are written to 256-color PCX image files which can be manipulated by other software or transmitted electronically. The locations of the boundaries of the features of interest are available as lists of ( x, y) coordinates and as chain codes. From these the size, shape, and spatial parameters are computed. In addition, the gray-level and segmented images are used to obtain texture information. The imaging system is illustrated by application to the analysis of grain boundaries, modal composition, and grain shapes in petrographic thin sections. The analytical results are compared with results obtained by traditional petrographic analyses.
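
    Once feature boundaries are available as (x, y) coordinate lists, size parameters follow directly; for example, area via the shoelace formula. An illustrative sketch (our own, not the system's code):

```python
# Area of a feature from its boundary vertices via the shoelace formula.
def polygon_area(points):
    """Magnitude of the signed area of a closed polygon, vertices in order."""
    n = len(points)
    s = sum(points[i][0] * points[(i + 1) % n][1] -
            points[(i + 1) % n][0] * points[i][1] for i in range(n))
    return abs(s) / 2.0

# A 4x4 square grain outline has area 16.
square = [(0, 0), (4, 0), (4, 4), (0, 4)]
print(polygon_area(square))  # 16.0
```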

  2. Spatially resolved fish population analysis for designing MPAs

    DEFF Research Database (Denmark)

    Christensen, Asbjørn; Mosegaard, Henrik; Jensen, Henrik

    2009-01-01

    The sandeel population analysis model (SPAM) is presented as a simulation tool for exploring the efficiency of Marine Protected Areas (MPAs) for sandeel stocks. SPAM simulates spatially resolved sandeel population distributions, based on a high-resolution map of all fishery-established sandbank habitats for settled sandeels, combined with a life-cycle model for survival, growth, and reproduction, and a three-dimensional hydrodynamic model for describing larval transport between the network of habitats. SPAM couples stock dynamics to ecosystem and anthropogenic forcing via well-defined drivers. The SPAM framework was tested using ICES statistical rectangle 37F2 as an MPA, and the impact on sandeel populations within the MPA and neighbouring habitats was investigated. Increased larval spillover compensated for lost catches inside the MPA. The temporal and spatial scales of stock response to MPAs...

  3. ESTATE: Strategy for Exploring Labeled Spatial Datasets Using Association Analysis

    Science.gov (United States)

    Stepinski, Tomasz F.; Salazar, Josue; Ding, Wei; White, Denis

    We propose an association analysis-based strategy for the exploration of multi-attribute spatial datasets possessing a naturally arising classification. The proposed strategy, ESTATE (Exploring Spatial daTa Association patTErns), inverts such classification by interpreting the different classes found in the dataset in terms of sets of discriminative patterns of its attributes. It consists of several core steps, including discriminative data mining, similarity between transactional patterns, and visualization. An algorithm for calculating a similarity measure between patterns is the major original contribution; it facilitates summarization of discovered information and makes the entire framework practical for real-life applications. A detailed description of the ESTATE framework is followed by its application to the domain of ecology, using a dataset that fuses information on the geographical distribution of bird-species biodiversity across the contiguous United States with the distributions of 32 environmental variables across the same area.
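
    The paper's similarity measure between transactional patterns is its own contribution; as a simple baseline for comparison, patterns expressed as sets of attribute items can be scored with plain Jaccard similarity (attribute names below are invented for illustration):

```python
# Baseline pattern similarity: Jaccard index over attribute-item sets.
# This is NOT the paper's measure, only a common point of reference.
def jaccard(a, b):
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

# Two hypothetical discriminative patterns sharing two of four items.
p1 = {"precip=high", "elev=low", "forest=dense"}
p2 = {"precip=high", "elev=low", "forest=sparse"}
print(jaccard(p1, p2))  # 0.5
```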

  4. Evaluating the Quantitative Capabilities of Metagenomic Analysis Software.

    Science.gov (United States)

    Kerepesi, Csaba; Grolmusz, Vince

    2016-05-01

    DNA sequencing technologies are applied widely and frequently today to describe metagenomes, i.e., microbial communities in environmental or clinical samples, without the need to culture them. These technologies usually return short (100-300 base-pair) DNA reads, which are processed by metagenomic analysis software that assigns phylogenetic composition information to the dataset. Here we evaluate three metagenomic analysis software packages (AmphoraNet, a webserver implementation of AMPHORA2; MG-RAST; and MEGAN5) for their capability to assign quantitative phylogenetic information to the data, describing the frequency with which microorganisms of the same taxa appear in the sample. The difficulty of the task arises from the fact that longer genomes produce more reads from the same organism than shorter genomes do, so some software assigns higher frequencies to species with longer genomes than to those with shorter ones. This phenomenon is called the "genome length bias." Dozens of complex artificial metagenome benchmarks can be found in the literature, but because of their complexity it is usually difficult to judge a metagenomic software package's resistance to this bias. Therefore, we have made a simple benchmark for evaluating "taxon counting" in a metagenomic sample: we took the same number of copies of three full bacterial genomes of different lengths, broke them up randomly into short reads of average length 150 bp, and mixed the reads. Because of its simplicity, the benchmark is not supposed to serve as a mock metagenome, but if a software package fails on this simple task, it will surely fail on most real metagenomes. We applied the three packages to the benchmark; the ideal quantitative solution would assign the same proportion to the three bacterial taxa. We found that AMPHORA2/AmphoraNet gave the most accurate results, while the other two packages were less accurate.
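
    The benchmark construction is easy to reproduce in outline: equal copy numbers of genomes of different lengths are shredded into ~150 bp reads and pooled. A sketch with random stand-in sequences (not the authors' genomes or their random fragmentation):

```python
import random

# Re-creation in miniature of the "simple benchmark": shred genomes of
# different lengths into 150 bp reads and pool them. Sequences are random
# stand-ins; lengths are illustrative.
def shred(genome, read_len=150):
    """Break a genome string into consecutive reads of read_len bases."""
    return [genome[i:i + read_len] for i in range(0, len(genome), read_len)]

random.seed(0)
genomes = {
    "short": "".join(random.choice("ACGT") for _ in range(3000)),
    "medium": "".join(random.choice("ACGT") for _ in range(6000)),
    "long": "".join(random.choice("ACGT") for _ in range(12000)),
}

pool = []
for name, seq in genomes.items():
    pool.extend((name, read) for read in shred(seq))
random.shuffle(pool)

# Longer genomes contribute proportionally more reads -- the source of the
# "genome length bias" an ideal taxon counter must correct for.
counts = {name: sum(1 for n, _ in pool if n == name) for name in genomes}
print(counts)
```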

  5. Quantitative analysis of left ventricular strain using cardiac computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Buss, Sebastian J., E-mail: sebastian.buss@med.uni-heidelberg.de [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Schulz, Felix; Mereles, Derliz [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Hosch, Waldemar [Department of Diagnostic and Interventional Radiology, University of Heidelberg, 69120 Heidelberg (Germany); Galuschky, Christian; Schummers, Georg; Stapf, Daniel [TomTec Imaging Systems GmbH, Munich (Germany); Hofmann, Nina; Giannitsis, Evangelos; Hardt, Stefan E. [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Kauczor, Hans-Ulrich [Department of Diagnostic and Interventional Radiology, University of Heidelberg, 69120 Heidelberg (Germany); Katus, Hugo A.; Korosoglou, Grigorios [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany)

    2014-03-15

    Objectives: To investigate whether cardiac computed tomography (CCT) can determine left ventricular (LV) radial, circumferential and longitudinal myocardial deformation in comparison to two-dimensional echocardiography in patients with congestive heart failure. Background: Echocardiography allows for accurate assessment of strain with high temporal resolution. A reduced strain is associated with a poor prognosis in cardiomyopathies. However, strain imaging is limited in patients with poor echogenic windows, so that, in selected cases, tomographic imaging techniques may be preferable for the evaluation of myocardial deformation. Methods: Consecutive patients (n = 27) with congestive heart failure who underwent a clinically indicated ECG-gated contrast-enhanced 64-slice dual-source CCT for the evaluation of the cardiac veins prior to cardiac resynchronization therapy (CRT) were included. All patients underwent additional echocardiography. LV radial, circumferential and longitudinal strain and strain rates were analyzed in identical midventricular short axis, 4-, 2- and 3-chamber views for both modalities using the same prototype software algorithm (feature tracking). Time for analysis was assessed for both modalities. Results: Close correlations were observed for both techniques regarding global strain (r = 0.93, r = 0.87 and r = 0.84 for radial, circumferential and longitudinal strain, respectively, p < 0.001 for all). Similar trends were observed for regional radial, longitudinal and circumferential strain (r = 0.88, r = 0.84 and r = 0.94, respectively, p < 0.001 for all). The number of non-diagnostic myocardial segments was significantly higher with echocardiography than with CCT (9.6% versus 1.9%, p < 0.001). In addition, the required time for complete quantitative strain analysis was significantly shorter for CCT compared to echocardiography (877 ± 119 s per patient versus 1105 ± 258 s per patient, p < 0.001). Conclusion: Quantitative assessment of LV strain with CCT is feasible, correlates closely with echocardiographic strain, and requires less analysis time while yielding fewer non-diagnostic segments.

  6. Automated quantitative gait analysis in animal models of movement disorders

    Directory of Open Access Journals (Sweden)

    Vandeputte Caroline

    2010-08-01

    Full Text Available Background Accurate and reproducible behavioral tests in animal models are of major importance in the development and evaluation of new therapies for central nervous system disease. In this study we investigated for the first time gait parameters of rat models for Parkinson's disease (PD), Huntington's disease (HD) and stroke using the CatWalk method, a novel automated gait analysis test. Static and dynamic gait parameters were measured in all animal models, and these data were compared to readouts of established behavioral tests, such as the cylinder test in the PD and stroke rats and the rotarod test for the HD group. Results Hemiparkinsonian rats were generated by unilateral injection of the neurotoxin 6-hydroxydopamine in the striatum or in the medial forebrain bundle. For Huntington's disease, a transgenic rat model expressing a truncated huntingtin fragment with multiple CAG repeats was used. Thirdly, a stroke model was generated by a photothrombotically induced infarct in the right sensorimotor cortex. We found that multiple gait parameters were significantly altered in all three disease models compared to their respective controls. Behavioural deficits could be efficiently measured using the cylinder test in the PD and stroke animals, and in the case of the PD model, the deficits in gait essentially confirmed results obtained by the cylinder test. However, in the HD model and the stroke model the CatWalk analysis proved more sensitive than the rotarod test and also added new and more detailed information on specific gait parameters. Conclusion The automated quantitative gait analysis test may be a useful tool to study both motor impairment and recovery associated with various neurological motor disorders.

  7. Quantitative assessment of hip osteoarthritis based on image texture analysis.

    Science.gov (United States)

    Boniatis, I S; Costaridou, L I; Cavouras, D A; Panagiotopoulos, E C; Panayiotakis, G S

    2006-03-01

    A non-invasive method was developed to investigate the potential capacity of digital image texture analysis in evaluating the severity of hip osteoarthritis (OA) and in monitoring its progression. 19 textural features evaluating patterns of pixel intensity fluctuations were extracted from 64 images of radiographic hip joint spaces (HJS), corresponding to 32 patients with verified unilateral or bilateral OA. Images were enhanced employing custom-developed software for the delineation of the articular margins on digitized pelvic radiographs. The severity of OA for each patient was assessed by expert orthopaedists employing the Kellgren and Lawrence (KL) scale. Additionally, an index expressing HJS-narrowing was computed considering patients from the unilateral OA-group. A textural feature that quantified pixel distribution non-uniformity (grey level non-uniformity, GLNU) demonstrated the strongest correlation with the HJS-narrowing index among all extracted features and was utilized in further analysis. Classification rules employing the GLNU feature were introduced to characterize a hip as normal or osteoarthritic and to assign it to one of three severity categories, formed in accordance with the KL scale. Application of the proposed rules resulted in relatively high classification accuracies in characterizing a hip as normal or osteoarthritic (90.6%) and in assigning it to the correct KL scale category (88.9%). Furthermore, the strong correlation between the HJS-narrowing index and the pathological GLNU (r = -0.9, p < 0.001) was utilized to provide percentages quantifying hip OA-severity. Texture analysis may contribute to the quantitative assessment of OA-severity, the monitoring of OA-progression and the evaluation of chondroprotective therapy.
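
    GLNU in such studies is computed from run-length statistics; a simplified histogram-based analogue (our illustration, not the study's code) conveys the idea that uneven use of grey levels raises the score:

```python
# Simplified grey-level non-uniformity (GLNU) style feature: the sum of
# squared per-level pixel counts, normalised by total pixel count.
# (The study derives GLNU from run-length matrices; this histogram form
# only illustrates why a non-uniform grey-level distribution scores higher.)
from collections import Counter

def glnu(pixels):
    counts = Counter(pixels)
    n = len(pixels)
    return sum(c * c for c in counts.values()) / n

uniform = [0, 1, 2, 3] * 4        # every grey level equally used
skewed = [0] * 13 + [1, 2, 3]     # one grey level dominates
print(glnu(uniform) < glnu(skewed))  # True: non-uniform image scores higher
```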

  8. Nanotechnology patents in the automotive industry (a quantitative & qualitative analysis).

    Science.gov (United States)

    Prasad, Raghavendra; Bandyopadhyay, Tapas K

    2014-01-01

    The aim of the article is to present trends in patent filings for applications of nanotechnology in the automobile sector across the world, using keyword-based patent searching. An overview of the patents related to nanotechnology in the automobile industry is provided. The work began with a worldwide patent search to find patents on nanotechnology in the automobile industry and to classify them according to the automobile parts to which they relate and the solutions they provide. Various graphs were then produced to give insight into the trends, and the patents in the various classifications were analyzed. The trends shown in the graphs provide the quantitative analysis, whereas the qualitative analysis is presented in a separate section. The classification of patents by the solution they provide was performed by reading the claims, titles, abstracts and full texts separately. The patentability of nanotechnology inventions is discussed with a view to giving an idea of the requirements and statutory bars to the patentability of such inventions. Another objective of the work is to suggest an appropriate framework for companies regarding the use of nanotechnology in the automobile industry and a strategy for patenting the related inventions. For example, US patent US2008-019426A1 discusses an invention related to a lubricant composition. After study, this patent was classified under automobile parts, and it was deduced that it addresses the problem of friction in the engine. One classification is based on the automobile part, while the other is based on the problem being solved; hence, two classifications, namely reduction in friction and engine, were created. Similarly, after studying all the patents, a classification matrix was created.

  9. Geographic information systems, remote sensing, and spatial analysis activities in Texas, 2008-09

    Science.gov (United States)

    2009-01-01

    Geographic information system (GIS) technology has become an important tool for scientific investigation, resource management, and environmental planning. A GIS is a computer-aided system capable of collecting, storing, analyzing, and displaying spatially referenced digital data. GIS technology is useful for analyzing a wide variety of spatial data. Remote sensing involves collecting remotely sensed data, such as satellite imagery, aerial photography, or radar images, and analyzing the data to gather information or investigate trends about the environment or the Earth's surface. Spatial analysis combines remotely sensed, thematic, statistical, quantitative, and geographical data through overlay, modeling, and other analytical techniques to investigate specific research questions. It is the combination of data formats and analysis techniques that has made GIS an essential tool in scientific investigations. This fact sheet presents information about the technical capabilities and project activities of the U.S. Geological Survey (USGS) Texas Water Science Center (TWSC) GIS Workgroup during 2008 and 2009. After a summary of GIS Workgroup capabilities, brief descriptions of activities by project at the local and national levels are presented. Projects are grouped by the fiscal year (October-September 2008 or 2009) in which the project ends and include overviews, project images, and Internet links to additional project information and related publications or articles.

  10. Quantitative analysis of polymorphic mixtures of ranitidine hydrochloride by Raman spectroscopy and principal components analysis.

    Science.gov (United States)

    Pratiwi, Destari; Fawcett, J Paul; Gordon, Keith C; Rades, Thomas

    2002-11-01

    Ranitidine hydrochloride exists as two polymorphs, forms I and II, both of which are used to manufacture commercial tablets. Raman spectroscopy can be used to differentiate the two forms, but univariate methods of quantitative analysis of one polymorph as an impurity in the other lack sensitivity. We have applied principal components analysis (PCA) of Raman spectra to binary mixtures of the two polymorphs and to binary mixtures prepared by adding one polymorph to powdered tablets of the other. Based on intensity measurements of seven spectral regions, it was found that >97% of the spectral variation was accounted for by three principal components. Quantitative calibration models generated by multiple linear regression predicted a detection limit and quantitation limit for either form I or II in mixtures of the two of 0.6% and 1.8%, respectively. This study demonstrates that PCA of Raman spectroscopic data provides a sensitive method for the quantitative analysis of polymorphic impurities of drugs in commercial tablets with a quantitation limit of less than 2%.
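The PCA-plus-regression calibration described in the abstract can be sketched in outline. The pure-polymorph "spectra", mixture fractions, and noise level below are invented for illustration; only the workflow (mean-centring, SVD-based PCA, multiple linear regression on component scores) mirrors the method:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical pure-polymorph responses over seven spectral regions (assumed values)
form_i = np.array([1.0, 0.2, 0.8, 0.1, 0.5, 0.9, 0.3])
form_ii = np.array([0.3, 0.9, 0.2, 0.7, 0.6, 0.1, 0.8])

# Binary mixtures spanning 0-100 % form II, plus small measurement noise
fractions = np.linspace(0.0, 1.0, 21)
spectra = np.outer(1 - fractions, form_i) + np.outer(fractions, form_ii)
spectra += rng.normal(scale=0.01, size=spectra.shape)

# PCA via SVD of the mean-centred spectra
centred = spectra - spectra.mean(axis=0)
U, s, Vt = np.linalg.svd(centred, full_matrices=False)
explained = s**2 / np.sum(s**2)          # variance explained per component
scores = centred @ Vt[:3].T              # scores on the first three PCs

# Multiple linear regression: calibrate fraction of form II against PC scores
X = np.column_stack([np.ones(len(fractions)), scores])
coef, *_ = np.linalg.lstsq(X, fractions, rcond=None)
predicted = X @ coef
```

With a two-component mixture the first principal component captures nearly all the variance, and the regression recovers the composition closely.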

  11. Quantitative fluorescence loss in photobleaching for analysis of protein transport and aggregation

    Directory of Open Access Journals (Sweden)

    Wüstner Daniel

    2012-11-01

    Full Text Available Abstract Background Fluorescence loss in photobleaching (FLIP) is a widely used imaging technique which provides information about protein dynamics in various cellular regions. In FLIP, a small cellular region is repeatedly illuminated by an intense laser pulse, while images are taken with reduced laser power with a time lag between the bleaches. Despite its popularity, tools are lacking for quantitative analysis of FLIP experiments. Typically, the user defines regions of interest (ROIs) for further analysis, which is subjective and does not allow for comparing different cells and experimental settings. Results We present two complementary methods to detect and quantify protein transport and aggregation in living cells from FLIP image series. In the first approach, a stretched exponential (StrExp) function is fitted to fluorescence loss (FL) inside and outside the bleached region. We show by reaction–diffusion simulations that the StrExp function can describe both binding/barrier-limited and diffusion-limited FL kinetics. By pixel-wise regression of that function to FL kinetics of enhanced green fluorescent protein (eGFP), we determined in a user-unbiased manner from which cellular regions eGFP can be replenished in the bleached area. Spatial variation in the parameters calculated from the StrExp function allows for detecting diffusion barriers for eGFP in the nucleus and cytoplasm of living cells. Polyglutamine (polyQ) disease proteins like mutant huntingtin (mtHtt) can form large aggregates called inclusion bodies (IBs). The second method combines single particle tracking with multi-compartment modelling of FL kinetics in moving IBs to determine exchange rates of eGFP-tagged mtHtt protein (eGFP-mtHtt) between aggregates and the cytoplasm. This method is self-calibrating since it relates the FL inside and outside the bleached regions. It therefore makes it possible to compare release kinetics of eGFP-mtHtt between different cells and
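The stretched-exponential (StrExp) fit at the heart of the first method can be illustrated on synthetic data. With the prebleach intensity I0 known, the model I(t) = I0·exp(-(t/tau)^h) linearises, so a simple line fit in log-log coordinates recovers the stretching exponent h and time constant tau. All parameter values below are assumed, not taken from the paper:

```python
import numpy as np

# Stretched-exponential fluorescence-loss model: I(t) = I0 * exp(-(t/tau)**h)
def strexp(t, i0, tau, h):
    return i0 * np.exp(-(t / tau) ** h)

# Synthetic noise-free FL curve with assumed parameters (for illustration only)
t = np.linspace(0.5, 50.0, 100)
i_obs = strexp(t, i0=1.0, tau=10.0, h=0.7)

# With I0 known, the model linearises:
#   log(-log(I/I0)) = h*log(t) - h*log(tau)
y = np.log(-np.log(i_obs / 1.0))
slope, intercept = np.polyfit(np.log(t), y, 1)
h_fit = slope
tau_fit = np.exp(-intercept / slope)
```

On noisy data one would instead fit the StrExp function directly by nonlinear least squares (e.g. `scipy.optimize.curve_fit`), but the linearised form shows what the two parameters mean.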

  12. Quantitative chemical-structure evaluation using atom probe tomography: Short-range order analysis of Fe–Al

    Energy Technology Data Exchange (ETDEWEB)

    Marceau, R.K.W., E-mail: r.marceau@deakin.edu.au [Institute for Frontier Materials, Deakin University, Geelong, VIC 3216 (Australia); Max-Planck-Institut für Eisenforschung GmbH, Max-Planck-Straße 1, 40237 Düsseldorf (Germany); Ceguerra, A.V.; Breen, A.J. [Australian Centre for Microscopy and Microanalysis, The University of Sydney, NSW 2006 (Australia); School of Aerospace, Mechanical and Mechatronic Engineering, The University of Sydney, NSW 2006 (Australia); Raabe, D. [Max-Planck-Institut für Eisenforschung GmbH, Max-Planck-Straße 1, 40237 Düsseldorf (Germany); Ringer, S.P. [Australian Centre for Microscopy and Microanalysis, The University of Sydney, NSW 2006 (Australia); School of Aerospace, Mechanical and Mechatronic Engineering, The University of Sydney, NSW 2006 (Australia)

    2015-10-15

    Short-range-order (SRO) has been quantitatively evaluated in an Fe–18Al (at%) alloy using atom probe tomography (APT) data and by calculation of the generalised multicomponent short-range order (GM-SRO) parameters, which have been determined by shell-based analysis of the three-dimensional atomic positions. The accuracy of this method with respect to limited detector efficiency and spatial resolution is tested against simulated D0₃ ordered data. Whilst there is minimal adverse effect from limited atom probe instrument detector efficiency, the combination of this with imperfect spatial resolution has the effect of making the data appear more randomised. The value of lattice rectification of the experimental APT data prior to GM-SRO analysis is demonstrated through improved information sensitivity. - Highlights: • Short-range-order (SRO) is quantitatively evaluated using atom probe tomography data. • Chemical species-specific SRO parameters have been calculated. • The accuracy of this method is tested against simulated D0₃ ordered data. • Imperfect spatial resolution combined with finite detector efficiency causes a randomising effect. • Lattice rectification of the data prior to GM-SRO analysis is demonstrated to improve information sensitivity.
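A shell-based SRO parameter of the Warren–Cowley type illustrates the kind of quantity being computed (the GM-SRO parameters generalise this to multicomponent systems). The per-shell neighbour probabilities below are hypothetical, chosen only to show the sign convention: positive alpha means B-depleted shells around A atoms, negative means B-enriched:

```python
import numpy as np

# Warren-Cowley-style SRO parameter for a binary A-B alloy:
#   alpha_i = 1 - p_B(i) / c_B
# where p_B(i) is the probability that a site in shell i around an A atom
# is occupied by B, and c_B is the overall B concentration.
c_B = 0.18                         # Fe-18Al (at%): Al concentration
p_B = np.array([0.12, 0.26, 0.19])  # hypothetical B-neighbour probabilities, shells 1..3
alpha = 1.0 - p_B / c_B
```

Here shell 1 comes out positive (Al avoidance among nearest neighbours) and shell 2 negative (Al enrichment), the alternating pattern expected for an ordering tendency.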

  13. Quantitative risk analysis of urban flooding in lowland areas

    NARCIS (Netherlands)

    Ten Veldhuis, J.A.E.

    2010-01-01

    Urban flood risk analyses suffer from a lack of quantitative historical data on flooding incidents. Data collection takes place on an ad hoc basis and is usually restricted to severe events. The resulting data deficiency renders quantitative assessment of urban flood risks uncertain. The study repor

  14. Quantitative Analysis of the Outdoor Thermal Comfort in the Hot and Humid Summertime of Shenzhen, China

    Institute of Scientific and Technical Information of China (English)

    Lin Liu; Yaoyu Lin; Dan Wang; Jing Liu

    2017-01-01

    Outdoor thermal comfort has always been a major issue due to its role in maintaining good health and reducing energy use. Quantitative analysis of outdoor thermal comfort and discussion of its influential factors are therefore necessary to achieve climate-conscious urban design. An outdoor thermal comfort questionnaire survey and simultaneous field measurements were conducted at six different sites during the hot and humid summertime in Shenzhen. The results show that the overall weather conditions during the investigation can be characterised by high temperature and high humidity with strong solar radiation. The micro-meteorological parameters of the six test sites vary greatly due to their different regional spatial layouts. According to the thermal sensation votes, the moderate range of air temperature (Ta) is between 28 and 30 °C, while that of relative humidity (RH) mainly concentrates in 60%-70%. The main factors influencing outdoor thermal comfort are identified, with Ta having the greatest effect. The overall thermally comfortable ranges in Shenzhen are 28.14-32.83 °C of PET and 24.74-30.45 °C of SET*. Correlation analysis between the characteristic parameters of regional spatial layout and the thermal climate and thermal comfort reveals that increasing the coverage ratio of water and green space (S) helps lower Ta and increase RH. Global solar radiation (G) has a significant negative correlation with building height (H) and a positive correlation with sky view factor (SVF). Overall, reasonable configuration of the regional spatial layout contributes to providing a thermally comfortable environment.

  15. Spatial Dependence and Heterogeneity in Bayesian Factor Analysis : A Cross-National Investigation of Schwartz Values

    NARCIS (Netherlands)

    Stakhovych, Stanislav; Bijmolt, Tammo H. A.; Wedel, Michel

    2012-01-01

    In this article, we present a Bayesian spatial factor analysis model. We extend previous work on confirmatory factor analysis by including geographically distributed latent variables and accounting for heterogeneity and spatial autocorrelation. The simulation study shows excellent recovery of the

  16. A Novel Quantitative Analysis Model for Information System Survivability Based on Conflict Analysis

    Institute of Scientific and Technical Information of China (English)

    WANG Jian; WANG Huiqiang; ZHAO Guosheng

    2007-01-01

    This paper describes a novel quantitative analysis model for system survivability based on conflict analysis, which provides a direct view of the survivability situation. Based on the three-dimensional state space of the conflict, each player's efficiency matrix on its credible motion set can be obtained. The player with the strongest desire initiates the move, and the overall state-transition matrix of the information system can then be derived. In addition, the process of modelling and stability analysis of the conflict can be converted into a Markov analysis, so the results, together with the occurrence probability of each feasible situation, help the players quantitatively judge the probability of reaching their pursued situations in the conflict. Compared with existing methods, which are limited to post hoc explanation of a system's survivability situation, the proposed model is suitable for quantitatively analysing and forecasting the future development of system survivability. The experimental results show that the model can be effectively applied to quantitative survivability analysis and has good prospects for practical application.
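The Markov step of the model can be sketched: given a state-transition matrix over the feasible conflict situations, the long-run occurrence probability of each situation is the stationary distribution of the chain. The 3×3 matrix below is an assumed example, not taken from the paper:

```python
import numpy as np

# Assumed 3-state transition matrix over feasible conflict situations
# (rows sum to 1; values are illustrative only)
P = np.array([
    [0.6, 0.3, 0.1],
    [0.2, 0.5, 0.3],
    [0.1, 0.4, 0.5],
])

# Long-run occurrence probabilities: the stationary distribution pi with pi P = pi,
# i.e. the left eigenvector of P for eigenvalue 1, normalised to sum to 1
eigvals, eigvecs = np.linalg.eig(P.T)
stationary = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
stationary = stationary / stationary.sum()
```

Each entry of `stationary` is the probability mass the chain spends in that situation in the long run, which is the quantity the players would compare.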

  17. APPLICATION OF NEOTAME IN CATCHUP: QUANTITATIVE DESCRIPTIVE AND PHYSICOCHEMICAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    G. C. M. C. BANNWART

    2008-11-01

    Full Text Available

    In this study, five prototypes of catchup were developed by partially or totally replacing the sucrose in the formulation with the sweetener Neotame (NTM). These prototypes were evaluated for their physicochemical characteristics and sensory profile (Quantitative Descriptive Analysis). The main sensory differences observed among the prototypes concerned color, consistency, mouthfeel, sweet taste, and tomato taste, for which lower means were obtained as the sugar level decreased, and salty taste, which had higher means as sugar decreased. In terms of bitter and sweetener aftertastes, the prototype 100% sweetened with NTM presented the highest mean score, but with no significant difference from the sucrose-containing prototypes; for bitter taste, however, it had the highest mean score, statistically different from all the other prototypes. The physicochemical differences were mainly in consistency, solids, and color. Despite the differences observed among the prototypes as the sugar level was reduced, it was concluded that NTM is a suitable sweetener for catchup, both for reduced-calorie and no-sugar versions.

  18. Quantitative Analysis of AGV System in FMS Cell Layout

    Directory of Open Access Journals (Sweden)

    B. Ramana

    1997-01-01

    Full Text Available Material handling is a specialised activity for a modern manufacturing concern. Automated guided vehicles (AGVs) are invariably used for material handling in flexible manufacturing systems (FMSs) due to their flexibility. Quantitative analysis of an AGV system is useful for determining material flow rates, operation times, length of delivery, length of empty moves, and the number of AGVs required for a typical FMS cell layout. The efficiency of a material handling system such as an AGV can be improved by reducing the length of empty moves, which depends on the dispatching and scheduling methods. If these are not properly planned, the length of the empty move exceeds the length of delivery, increasing material handling time and, in turn, the number of AGVs required in the FMS cell. This paper presents a method for optimising the length of empty travel of AGVs in a typical FMS cell layout.

  19. Early child grammars: qualitative and quantitative analysis of morphosyntactic production.

    Science.gov (United States)

    Legendre, Géraldine

    2006-09-10

    This article reports on a series of 5 analyses of spontaneous production of verbal inflection (tense and person-number agreement) by 2-year-olds acquiring French as a native language. A formal analysis of the qualitative and quantitative results is developed using the unique resources of Optimality Theory (OT; Prince & Smolensky, 2004). It is argued that acquisition of morphosyntax proceeds via overlapping grammars (rather than through abrupt changes), which OT formalizes in terms of partial rather than total constraint rankings. Initially, economy of structure constraints take priority over faithfulness constraints that demand faithful expression of a speaker's intent, resulting in child production of tense that is comparable in level to that of child-directed speech. Using the independent Predominant Length of Utterance measure of syntactic development proposed in Vainikka, Legendre, and Todorova (1999), production of agreement is shown first to lag behind tense then to compete with tense at an intermediate stage of development. As the child's development progresses, faithfulness constraints become more dominant, and the overall production of tense and agreement becomes adult-like.

  20. Quantitative produced water analysis using mobile 1H NMR

    Science.gov (United States)

    Wagner, Lisabeth; Kalli, Chris; Fridjonsson, Einar O.; May, Eric F.; Stanwix, Paul L.; Graham, Brendan F.; Carroll, Matthew R. J.; Johns, Michael L.

    2016-10-01

    Measurement of oil contamination of produced water is required in the oil and gas industry at the parts-per-million (ppm) level prior to discharge in order to meet typical environmental legislative requirements. Here we present the use of compact, mobile 1H nuclear magnetic resonance (NMR) spectroscopy, in combination with solid phase extraction (SPE), to meet this metrology need. The NMR hardware employed featured a sufficiently homogeneous magnetic field that chemical shift differences could be used to unambiguously differentiate, and hence quantitatively detect, the required oil and solvent NMR signals. A solvent system consisting of 1% v/v chloroform in tetrachloroethylene was deployed; this provided a comparable 1H NMR signal intensity for the oil and the solvent (chloroform), and hence an internal reference 1H signal from the chloroform, making the measurement effectively self-calibrating. The measurement process was applied to water contaminated with hexane or crude oil over the range 1-30 ppm. The results were validated against known solubility limits as well as infrared analysis and gas chromatography.
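The self-calibrating aspect reduces to a ratio: the oil concentration follows from the oil-to-chloroform signal ratio, scaled by the effective concentration the reference signal represents. All numbers below are illustrative placeholders, not values from the paper:

```python
# Internal-reference quantitation sketch (hypothetical numbers throughout):
# the oil 1H signal is ratioed against the chloroform 1H signal, which has a
# known, fixed concentration, so no external calibration curve is needed.
oil_area = 5.2e3    # integrated oil 1H signal area (arbitrary units, assumed)
ref_area = 1.3e4    # integrated chloroform 1H signal area (assumed)
ref_ppm = 10.0      # oil-equivalent concentration the reference signal represents (assumed)

oil_ppm = ref_ppm * oil_area / ref_area
```

Because both areas are measured in the same spectrum, instrument drift cancels in the ratio; only the reference scaling factor must be known.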

  1. Quantitative analysis of brain magnetic resonance imaging for hepatic encephalopathy

    Science.gov (United States)

    Syh, Hon-Wei; Chu, Wei-Kom; Ong, Chin-Sing

    1992-06-01

    High intensity lesions around ventricles have recently been observed in T1-weighted brain magnetic resonance images for patients suffering hepatic encephalopathy. The exact etiology that causes magnetic resonance imaging (MRI) gray scale changes has not been totally understood. The objective of our study was to investigate, through quantitative means, (1) the amount of changes to brain white matter due to the disease process, and (2) the extent and distribution of these high intensity lesions, since it is believed that the abnormality may not be entirely limited to the white matter. Eleven patients with proven hepatic encephalopathy and three normal persons without any evidence of liver abnormality constituted our current database. Trans-axial, sagittal, and coronal brain MRI were obtained on a 1.5 Tesla scanner. All processing was carried out on a microcomputer-based image analysis system in an off-line manner. Histograms were decomposed into regular brain tissues and lesions. Gray scale ranges coded as lesion were then brought back to the original images to identify the distribution of abnormality. Our results indicated the disease process involved the pallidus, mesencephalon, and subthalamic regions.
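The histogram decomposition into regular tissue and lesion classes can be approximated with a standard automatic threshold. Otsu's method below is a stand-in, since the abstract does not specify which decomposition algorithm was used; the toy histogram is invented:

```python
import numpy as np

def otsu_threshold(hist):
    """Return the bin index that maximises between-class variance (Otsu's method)."""
    hist = hist.astype(float)
    total = hist.sum()
    bins = np.arange(len(hist))
    w0 = np.cumsum(hist) / total                     # weight of class 0 at each cut
    w1 = 1.0 - w0
    mu0 = np.cumsum(hist * bins) / np.maximum(np.cumsum(hist), 1e-12)
    mu_total = (hist * bins).sum() / total
    mu1 = (mu_total - w0 * mu0) / np.maximum(w1, 1e-12)
    between = w0 * w1 * (mu0 - mu1) ** 2             # between-class variance
    return int(np.argmax(between))

# Toy bimodal gray-level histogram: "tissue" mass in low bins, "lesion" in high bins
hist = np.zeros(10)
hist[[1, 2]] = [100, 80]
hist[[7, 8]] = [90, 100]
cut = otsu_threshold(hist)
```

Gray levels above the cut would then be coded as lesion and mapped back onto the images, mirroring the workflow in the abstract.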

  2. Quantitative analysis of plasma interleukin-6 by immunoassay on a microchip

    Science.gov (United States)

    Abe, K.; Hashimoto, Y.; Yatsushiro, S.; Yamamura, S.; Tanaka, M.; Ooie, T.; Baba, Y.; Kataoka, M.

    2012-03-01

    Sandwich enzyme-linked immunosorbent assay (ELISA) is one of the most frequently employed assays for clinical diagnosis, since it enables the investigator to identify specific protein biomarkers. However, the conventional assay using a 96-well microtitration plate is time- and sample-consuming, and therefore is not suitable for rapid diagnosis. To overcome these drawbacks, we performed a sandwich ELISA on a microchip. We employed piezoelectric inkjet printing for deposition and fixation of the first antibody on the microchannel surface (300 μm width and 100 μm depth). The model analyte was interleukin-6 (IL-6), an inflammatory cytokine. After blocking the microchannel, antigen, biotin-labeled second antibody, and avidin-labeled peroxidase were infused into the microchannel and incubated for 20 min, 10 min, and 5 min, respectively. This assay could detect 2 pg/ml and quantitatively measure over the range 0-32 pg/ml. Linear regression analysis of plasma IL-6 concentrations obtained by the microchip and conventional methods exhibited a significant relationship (R2 = 0.9964). This assay reduced the time for the antigen-antibody reaction to 1/6, and the consumption of samples and reagents to 1/50, compared with the conventional method. It enables us to determine plasma IL-6 with accuracy, high sensitivity, time savings, and low consumption of sample and reagents, and thus will be applicable to clinical diagnosis.
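The kind of linear-regression comparison reported (R2 = 0.9964) can be sketched with a calibration fit. The standard concentrations and signals below are invented for illustration; only the fit-and-R2 computation reflects the analysis described:

```python
import numpy as np

# Hypothetical IL-6 calibration standards and measured signals (illustrative only)
conc = np.array([0.0, 2.0, 4.0, 8.0, 16.0, 32.0])        # pg/ml
signal = np.array([0.02, 0.11, 0.20, 0.39, 0.78, 1.55])  # assay readout

# Ordinary least-squares line and coefficient of determination
slope, intercept = np.polyfit(conc, signal, 1)
fitted = slope * conc + intercept
ss_res = np.sum((signal - fitted) ** 2)
ss_tot = np.sum((signal - signal.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot
```

An unknown sample's concentration is then read off the line as `(reading - intercept) / slope`, and R2 close to 1 indicates the two methods (or the calibration) agree over the working range.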

  3. Quantitative image analysis of HIV-1 infection in lymphoid tissue

    Energy Technology Data Exchange (ETDEWEB)

    Haase, A.T.; Zupancic, M.; Cavert, W. [Univ. of Minnesota Medical School, Minneapolis, MN (United States)] [and others]

    1996-11-08

    Tracking human immunodeficiency virus-type 1 (HIV-1) infection at the cellular level in tissue reservoirs provides opportunities to better understand the pathogenesis of infection and to rationally design and monitor therapy. A quantitative technique was developed to determine viral burden in two important cellular compartments in lymphoid tissues. Image analysis and in situ hybridization were combined to show that in the presymptomatic stages of infection there is a large, relatively stable pool of virions on the surfaces of follicular dendritic cells and a smaller pool of productively infected cells. Despite evidence of constraints on HIV-1 replication in the infected cell population in lymphoid tissues, estimates of the numbers of these cells and the virus they could produce are consistent with the quantities of virus that have been detected in the bloodstream. The cellular sources of virus production and storage in lymphoid tissues can now be studied with this approach over the course of infection and treatment. 22 refs., 2 figs., 2 tabs.

  4. Full-Range Public Health Leadership, Part 1: Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Erik L. Carlton

    2015-04-01

    Full Text Available Background. Workforce and leadership development are central to the future of public health. However, public health has been slow to translate and apply leadership models from other professions and to incorporate local perspectives in understanding public health leadership. Purpose. This study utilized the full-range leadership model to examine public health leadership. Specifically, it sought to measure leadership styles among local health department directors and to understand the context of leadership in local health departments. Methods. Leadership styles among local health department directors (n=13) were examined using survey methodology. Quantitative analysis methods included descriptive statistics, boxplots, and Pearson bivariate correlations using SPSS v18.0. Findings. Self-reported leadership styles were highly correlated with leadership outcomes at the organizational level but were not related to county health rankings. Results suggest the preeminence of leader behaviors and of providing individual consideration to staff, as compared to idealized attributes of leaders, intellectual stimulation, or inspirational motivation. Implications. Holistic leadership assessment instruments, such as the Multifactor Leadership Questionnaire (MLQ), can be useful in assessing public health leaders' approaches and outcomes. Comprehensive, 360-degree reviews may be especially helpful. Further research is needed to examine the effectiveness of public health leadership development models, as well as the extent to which public health leadership impacts public health outcomes.

  5. Quantitative Analysis and Comparisons of EPON Protection Schemes

    Institute of Scientific and Technical Information of China (English)

    CHEN Hong; JIN Depeng; ZENG Lieguang; SU Li

    2005-01-01

    This paper presents the relationship between the intensity of network damage and network survivability, and then studies a method for quantitatively analyzing the survivability of tree networks. Based on this analysis, the survivability of Ethernet passive optical network (EPON) with three kinds of protection schemes (Trunk-fiber, Node-fiber, and Bus-fiber protection) is discussed, followed by comparisons of the survivability among the three schemes. The simulation results show that, when the coverage area is the same, the survivability of EPON with the Node-fiber protection scheme is better than that with the Trunk-fiber protection scheme, and when the number and distribution of optical network units (ONUs) are the same, the survivability with the Bus-fiber protection scheme is better than that with the Node-fiber protection scheme. Under the same constraints, the Bus-fiber protection scheme requires the least fiber when there are more than 12 ONU nodes. These results are useful not only for forecasting and evaluating the survivability of EPON access networks, but also for their topology design.

  6. Quantitative analysis of regulatory flexibility under changing environmental conditions

    Science.gov (United States)

    Edwards, Kieron D; Akman, Ozgur E; Knox, Kirsten; Lumsden, Peter J; Thomson, Adrian W; Brown, Paul E; Pokhilko, Alexandra; Kozma-Bognar, Laszlo; Nagy, Ferenc; Rand, David A; Millar, Andrew J

    2010-01-01

    The circadian clock controls 24-h rhythms in many biological processes, allowing appropriate timing of biological rhythms relative to dawn and dusk. Known clock circuits include multiple, interlocked feedback loops. Theory suggested that multiple loops contribute the flexibility for molecular rhythms to track multiple phases of the external cycle. Clear dawn- and dusk-tracking rhythms illustrate the flexibility of timing in Ipomoea nil. Molecular clock components in Arabidopsis thaliana showed complex, photoperiod-dependent regulation, which was analysed by comparison with three contrasting models. A simple, quantitative measure, Dusk Sensitivity, was introduced to compare the behaviour of clock models with varying loop complexity. Evening-expressed clock genes showed photoperiod-dependent dusk sensitivity, as predicted by the three-loop model, whereas the one- and two-loop models tracked dawn and dusk, respectively. Output genes for starch degradation achieved dusk-tracking expression through light regulation, rather than a dusk-tracking rhythm. Model analysis predicted which biochemical processes could be manipulated to extend dusk tracking. Our results reveal how an operating principle of biological regulators applies specifically to the plant circadian clock. PMID:21045818

  7. Quantitative analysis of piperine in ayurvedic formulation by UV Spectrophotometry

    Directory of Open Access Journals (Sweden)

    Gupta Vishvnath

    2011-02-01

    Full Text Available A simple and reproducible UV-spectrophotometric method for the quantitative determination of piperine in Sitopaladi churna (STPLC) was developed and validated in the present work. The parameters linearity, precision, accuracy, and standard error were studied according to the Indian Herbal Pharmacopoeia. A new, simple, rapid, sensitive, precise, and economic spectrophotometric method in the ultraviolet region was developed for the determination of piperine in market and laboratory herbal formulations of Sitopaladi churna, which were purchased from the local market and prepared in the laboratory, respectively, and evaluated as per the Indian Herbal Pharmacopoeia and WHO guidelines. The concentration of piperine in the raw material of STPLC was found to be 1.45±0.014 w/w in Piper longum fruits. Piperine shows its absorbance maximum at 342.5 nm, so the UV-spectrophotometric measurements were performed at 342.5 nm. The samples were prepared in methanol, and the method obeys Beer's law in the concentration ranges employed for evaluation. The content of piperine in the ayurvedic formulation was determined. The results of the analysis were validated statistically, and recovery studies confirmed the accuracy of the proposed method. Hence the proposed method can be used for the reliable quantification of piperine in the crude drug and its herbal formulation.
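The Beer's-law quantitation amounts to reading an unknown off a linear calibration at 342.5 nm. The absorptivity and standard concentrations below are assumed values for illustration, not the paper's data:

```python
import numpy as np

# Hypothetical piperine standards in methanol and their absorbances at 342.5 nm,
# idealised as a perfect Beer's-law response A = k * c (k assumed)
conc = np.array([2.0, 4.0, 6.0, 8.0, 10.0])   # ug/ml
absorbance = 0.052 * conc                      # illustrative absorptivity

# Calibration line, then read an unknown sample off it
slope, intercept = np.polyfit(conc, absorbance, 1)
unknown_abs = 0.26
unknown_conc = (unknown_abs - intercept) / slope
```

In practice the standards carry noise, and validation (linearity, precision, accuracy, recovery) checks how well the straight-line model holds over the working range.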

  8. A Quantitative Analysis of Photovoltaic Modules Using Halved Cells

    Directory of Open Access Journals (Sweden)

    S. Guo

    2013-01-01

    Full Text Available In a silicon wafer-based photovoltaic (PV) module, significant power is lost due to current transport through the ribbons interconnecting neighbour cells. Using halved cells in PV modules is an effective method to reduce this resistive power loss, which has already been applied by some major PV manufacturers (Mitsubishi, BP Solar) in their commercially available PV modules. As a consequence, quantitative analysis of PV modules using halved cells is needed. In this paper we investigate theoretically and experimentally the difference between modules made with halved and full-size solar cells. Theoretically, we find an improvement in fill factor of 1.8% absolute and output power of 90 mW for the halved cell minimodule. Experimentally, we find an improvement in fill factor of 1.3% absolute and output power of 60 mW for the halved cell module. We also investigate theoretically how this effect translates to large-size modules. It is found that the performance increment of halved cell PV modules is even higher for high-efficiency solar cells. The resistive loss of large-size modules with different interconnection schemes is then analysed. Finally, factors influencing the performance and cost of industrial halved cell PV modules are discussed.
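The resistive argument is simple: interconnect loss scales with the square of current, so halving the cell (and hence the current through each ribbon segment) cuts the per-segment ribbon loss by a factor of four. The current and resistance values below are illustrative, not the module parameters from the paper:

```python
# I^2 R loss in an interconnect ribbon segment (illustrative values)
i_full = 9.0      # full-cell operating current, A (assumed)
r_ribbon = 0.005  # ribbon resistance per segment, ohm (assumed)

loss_full = i_full ** 2 * r_ribbon

# Halved cells: each half-cell delivers half the current, so the
# per-segment ribbon loss drops by (1/2)^2 = 1/4
loss_half_segment = (i_full / 2) ** 2 * r_ribbon
```

The net module-level benefit is smaller than 4x, since a halved-cell layout needs more interconnect segments overall, which is why the paper compares interconnection schemes explicitly.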

  9. Quantitative analysis of 3-OH oxylipins in fermentation yeast.

    Science.gov (United States)

    Potter, Greg; Xia, Wei; Budge, Suzanne M; Speers, R Alex

    2017-02-01

    Despite the ubiquitous distribution of oxylipins in plants, animals, and microbes, and the application of numerous analytical techniques to study these molecules, 3-OH oxylipins have never been quantitatively assayed in yeasts. The formation of heptafluorobutyrate methyl ester derivatives and subsequent analysis with gas chromatography - negative chemical ionization - mass spectrometry allowed for the first determination of yeast 3-OH oxylipins. The concentration of 3-OH 10:0 (0.68-4.82 ng/mg dry cell mass) in the SMA strain of Saccharomyces pastorianus grown in laboratory-scale beverage fermentations was elevated relative to oxylipin concentrations in plant tissues and macroalgae. In fermenting yeasts, the onset of 3-OH oxylipin formation has been related to fermentation progression and flocculation initiation. When the SMA strain was grown in laboratory-scale fermentations, the maximal sugar consumption rate preceded the lowest concentration of 3-OH 10:0 by ∼4.5 h and a distinct increase in 3-OH 10:0 concentration by ∼16.5 h.

  10. Performance analysis of improved methodology for incorporation of spatial/spectral variability in synthetic hyperspectral imagery

    Science.gov (United States)

    Scanlan, Neil W.; Schott, John R.; Brown, Scott D.

    2004-01-01

    Synthetic imagery has traditionally been used to support sensor design by enabling design engineers to pre-evaluate image products during the design and development stages. Increasingly exploitation analysts are looking to synthetic imagery as a way to develop and test exploitation algorithms before image data are available from new sensors. Even when sensors are available, synthetic imagery can significantly aid in algorithm development by providing a wide range of "ground truthed" images with varying illumination, atmospheric, viewing and scene conditions. One limitation of synthetic data is that the background variability is often too bland. It does not exhibit the spatial and spectral variability present in real data. In this work, four fundamentally different texture modeling algorithms will first be implemented as necessary into the Digital Imaging and Remote Sensing Image Generation (DIRSIG) model environment. Two of the models to be tested are variants of a statistical Z-Score selection model, while the remaining two involve a texture synthesis and a spectral end-member fractional abundance map approach, respectively. A detailed comparative performance analysis of each model will then be carried out on several texturally significant regions of the resultant synthetic hyperspectral imagery. The quantitative assessment of each model will utilize a set of three peformance metrics that have been derived from spatial Gray Level Co-Occurrence Matrix (GLCM) analysis, hyperspectral Signal-to-Clutter Ratio (SCR) measures, and a new concept termed the Spectral Co-Occurrence Matrix (SCM) metric which permits the simultaneous measurement of spatial and spectral texture. Previous research efforts on the validation and performance analysis of texture characterization models have been largely qualitative in nature based on conducting visual inspections of synthetic textures in order to judge the degree of similarity to the original sample texture imagery. 
The quantitative
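The GLCM analysis underlying the first of these metrics can be sketched in a few lines. This is a minimal illustration, not the study's actual metric set: it builds a normalized co-occurrence matrix for one pixel offset and derives two common texture statistics (contrast and homogeneity) from it; the tiny image is invented for demonstration.

```python
import numpy as np

def glcm(image, levels, offset=(0, 1)):
    """Gray Level Co-Occurrence Matrix for one pixel offset, normalized to a joint probability."""
    dr, dc = offset
    rows, cols = image.shape
    P = np.zeros((levels, levels))
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dr, c + dc
            if 0 <= r2 < rows and 0 <= c2 < cols:
                P[image[r, c], image[r2, c2]] += 1
    return P / P.sum()

def contrast(P):
    # sum of P(i,j) * (i-j)^2: large when co-occurring levels differ strongly
    i, j = np.indices(P.shape)
    return float((P * (i - j) ** 2).sum())

def homogeneity(P):
    # sum of P(i,j) / (1 + |i-j|): large when co-occurring levels are similar
    i, j = np.indices(P.shape)
    return float((P / (1 + np.abs(i - j))).sum())

img = np.array([[0, 0, 1],
                [0, 0, 1],
                [0, 2, 2]])
P = glcm(img, levels=3)
print(round(contrast(P), 4), round(homogeneity(P), 4))  # 1.0 0.7222
```

Comparing such statistics between a synthetic texture and its real counterpart is one way the qualitative "visual inspection" step can be made quantitative.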

  11. Communication about vaccinations in Italian websites: a quantitative analysis.

    Science.gov (United States)

    Tafuri, Silvio; Gallone, Maria S; Gallone, Maria F; Zorico, Ivan; Aiello, Valeria; Germinario, Cinzia

    2014-01-01

    Babies' parents and people who look for information about vaccination often visit anti-vaccine movements' websites, blogs by naturopathic physicians, or natural and alternative medicine practitioners. The aim of this work is to provide a quantitative analysis of the type of information available to Italian people regarding vaccination and a quality analysis of the websites retrieved through our searches. A quality score was created to evaluate the technical level of websites. A search was performed through Yahoo, Google, and MSN using the keywords "vaccine" and "vaccination," combined with the function "OR," in order to identify the most frequently used websites. The two keywords were input in Italian, and the first 15 pages retrieved by each search engine were analyzed. 149 websites were selected through this methodology. Fifty-three percent of the websites belonged to associations, groups, or scientific companies, 32.2% (n = 48) consisted of personal blogs, and 14.8% (n = 22) belonged to National Health System offices. Among all analyzed websites, 15.4% (n = 23) came from anti-vaccine movement groups. 37.6% reported the webmaster's name, 67.8% the webmaster's e-mail, 28.6% indicated the date of the last update, and 46.6% the author's name. The quality score for government sites was higher on average than for anti-vaccine websites, although government sites do not use Web 2.0 functions such as forums. National Health System institutions that have to promote vaccination cannot avoid investing in web communication, because it cannot be managed by private efforts alone but must be the result of synergy among Public Health bodies, private and scientific associations, and social movements.

  12. Variability of apparently homogeneous soilscapes in São Paulo state, Brazil: I. spatial analysis

    Directory of Open Access Journals (Sweden)

    M. van Den Berg

    2000-06-01

    Full Text Available The spatial variability of strongly weathered soils under sugarcane and soybean/wheat rotation was quantitatively assessed on 33 fields in two regions in São Paulo State, Brazil: Araras (15 fields with sugarcane) and Assis (11 fields with sugarcane and seven fields with soybean/wheat rotation). Statistical methods used were: nested analysis of variance (for 11 fields), semivariance analysis, and analysis of variance within and between fields. Spatial levels from 50 m to several km were analyzed. Results are discussed with reference to a previously published study carried out in the surroundings of Passo Fundo (RS). Similar variability patterns were found for clay content, organic C content and cation exchange capacity. The fields studied are quite homogeneous with respect to these relatively stable soil characteristics. Spatial variability of other characteristics (resin-extractable P, pH, base and Al saturation, and also soil colour) varies with region and/or land use and management. Soil management for sugarcane seems to have induced modifications to greater depths than for soybean/wheat rotation. Surface layers of soils under soybean/wheat present relatively little variation, apparently as a result of very intensive soil management. The major part of within-field variation occurs at short distances (< 50 m) in all study areas. Hence, little extra information would be gained by increasing sampling density from, say, 1/km² to 1/50 m². For many purposes, the soils in the study regions can be mapped with the same observation density, but residual variance will not be the same in all areas. Bulk sampling may help to reveal spatial patterns between 50 and 1,000 m.
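The semivariance analysis used above can be sketched for the one-dimensional case: the empirical semivariance at lag h is half the mean squared difference between samples h apart. The clay-content transect and 50 m spacing below are invented for illustration, not the study's data.

```python
import numpy as np

def semivariance(z, lag):
    """Empirical semivariance of a 1-D transect at a given integer lag:
    gamma(h) = 0.5 * mean[(z(i) - z(i+h))^2]."""
    d = z[lag:] - z[:-lag]
    return 0.5 * np.mean(d ** 2)

# hypothetical clay-content transect (%), samples every 50 m
clay = np.array([32.0, 33.5, 31.0, 35.0, 34.0, 36.5, 33.0, 37.0])
for lag in (1, 2, 3):
    print(lag * 50, "m:", round(float(semivariance(clay, lag)), 3))
```

Plotting semivariance against lag distance (the semivariogram) is what reveals the range within which samples remain spatially correlated.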

  13. Use of artificial neural network for spatial rainfall analysis

    Indian Academy of Sciences (India)

    Tsangaratos Paraskevas; Rozos Dimitrios; Benardos Andreas

    2014-04-01

    In the present study, the precipitation data measured at 23 rain gauge stations over Achaia County, Greece, were used to estimate the spatial distribution of the mean annual precipitation values over a specific catchment area. The objective of this work was achieved by programming an Artificial Neural Network (ANN) that uses the feed-forward back-propagation algorithm as an alternative interpolating technique. A Geographic Information System (GIS) was utilized to process the data derived by the ANN and to create a continuous surface that represented the spatial mean annual precipitation distribution. The ANN introduced an optimization procedure, implemented during training, that adjusted the number of hidden neurons and the convergence of the ANN in order to select the best network architecture. The performance of the ANN was evaluated using three standard statistical evaluation criteria applied to the study area and showed good performance. The outcomes were also compared with the results obtained from a previous study in the research area that used linear regression analysis to estimate the mean annual precipitation values, with the ANN giving more accurate results. The information and knowledge gained from the present study could improve the accuracy of analyses concerning hydrological and hydrogeological models, groundwater studies, flood-related applications and climate analysis studies.
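As a rough sketch of the approach described here, a small feed-forward network trained with plain backpropagation can be written in a few lines of NumPy. The data, architecture (one hidden layer of 8 tanh units) and learning rate are illustrative assumptions, not the study's configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

# hypothetical training data: normalized (elevation, distance-to-coast) -> precipitation proxy
X = rng.uniform(0, 1, (50, 2))
y = 0.8 * X[:, :1] + 0.3 * np.sin(3 * X[:, 1:2])  # synthetic target

# one hidden layer, tanh activation, trained by plain gradient descent (backpropagation)
W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)

losses, lr = [], 0.1
for _ in range(500):
    h = np.tanh(X @ W1 + b1)            # forward pass
    pred = h @ W2 + b2
    err = pred - y                      # gradient of 0.5 * MSE w.r.t. pred
    losses.append(float(np.mean(err ** 2)))
    gW2 = h.T @ err / len(X); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)    # backprop through tanh
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

print(f"MSE: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

Once trained, the network is evaluated on a grid of covariates to produce the continuous precipitation surface that is then handed to the GIS.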

  14. Analysis on sensitivity and landscape ecological spatial structure of site resources

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    This article establishes a set of indicators and standards for landscape ecological sensitivity analysis of site resources by using the theories and approaches of landscape ecology. It uses the landscape diversity index (H), evenness (E), natural degree (N), and contrast degree (C) to study the spatial structure and landscape heterogeneity of site resources, thus providing a qualitative-quantitative evaluation method for land planning and management of small and medium scale areas. The analysis of Yantian District, Shenzhen, China showed that Wutong Mountain belonged to the high landscape ecological sensitivity area; Sanzhoutian Reservoir and Shangping Reservoir were medium landscape sensitivity and high ecological sensitivity areas; Dameisha and Xiaomeisha belonged to the medium sensitivity area, caused by the decline of natural ecological areas. Shatoujiao and Yantian Pier belonged to the low sensitivity area, but urban landscape ecological development had reshaped and influenced their landscape ecological roles to a great extent. Suggestions on planning, protection goals and development intensity of each site or district were raised.
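The diversity and evenness indices named above are commonly the Shannon formulations (H = -Σ p·ln p, E = H/ln S); assuming that reading, a minimal sketch with hypothetical patch areas:

```python
import math

def diversity_indices(areas):
    """Shannon landscape diversity H and evenness E = H / ln(S),
    where S is the number of landscape types with nonzero area."""
    total = sum(areas)
    p = [a / total for a in areas if a > 0]
    H = -sum(pi * math.log(pi) for pi in p)
    E = H / math.log(len(p)) if len(p) > 1 else 0.0
    return H, E

# hypothetical patch areas (ha) for three landscape types
H, E = diversity_indices([50, 30, 20])
print(round(H, 3), round(E, 3))  # 1.030 0.937
```

E approaches 1 when landscape types occupy equal shares, so it separates heterogeneity of composition from the sheer number of types.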

  15. Spatial Analysis Along Networks Statistical and Computational Methods

    CERN Document Server

    Okabe, Atsuyuki

    2012-01-01

    In the real world, there are numerous and various events that occur on and alongside networks, including the occurrence of traffic accidents on highways, the location of stores alongside roads, the incidence of crime on streets and the contamination along rivers. In order to carry out analyses of those events, the researcher needs to be familiar with a range of specific techniques. Spatial Analysis Along Networks provides a practical guide to the necessary statistical techniques and their computational implementation. Each chapter illustrates a specific technique, from Stochastic Point Process

  16. Different Ways of Thinking about Street Networks and Spatial Analysis

    CERN Document Server

    Jiang, Bin

    2014-01-01

    Street networks, as one of the oldest infrastructures of transport in the world, play a significant role in modernization, sustainable development, and human daily activities in both ancient and modern times. Although street networks have been well studied in a variety of engineering and scientific disciplines, including for instance transport, geography, urban planning, economics, and even physics, our understanding of street networks in terms of their structure and dynamics remains limited, especially when dealing with such real-world problems as traffic jams, pollution, and human evacuations for disaster management. One goal of this special issue is to promote different ways of thinking about understanding street networks, and of conducting spatial analysis.

  17. Factors influencing the spatial extent of mobile source air pollution impacts: a meta-analysis

    Directory of Open Access Journals (Sweden)

    Levy Jonathan I

    2007-05-01

    Full Text Available Abstract Background There has been growing interest among exposure assessors, epidemiologists, and policymakers in the concept of "hot spots", or more broadly, the "spatial extent" of impacts from traffic-related air pollutants. This review attempts to quantitatively synthesize findings about the spatial extent under various circumstances. Methods We include both the peer-reviewed literature and government reports, and focus on four significant air pollutants: carbon monoxide, benzene, nitrogen oxides, and particulate matter (including both ultrafine particle counts and fine particle mass). From the identified studies, we extracted information about significant factors that would be hypothesized to influence the spatial extent within the study, such as the study type (e.g., monitoring, air dispersion modeling, GIS-based epidemiological studies), focus on concentrations or health risks, pollutant under study, background concentration, emission rate, and meteorological factors, as well as the study's implicit or explicit definition of spatial extent. We supplement this meta-analysis with results from some illustrative atmospheric dispersion modeling. Results We found that pollutant characteristics and background concentrations best explained variability in previously published spatial extent estimates, with a modifying influence of local meteorology, once some extreme values based on health risk estimates were removed from the analysis. As hypothesized, inert pollutants with high background concentrations had the largest spatial extent (often demonstrating no significant gradient), and pollutants formed in near-source chemical reactions (e.g., nitrogen dioxide) had a larger spatial extent than pollutants depleted in near-source chemical reactions or removed through coagulation processes (e.g., nitrogen oxide and ultrafine particles). Our illustrative dispersion model illustrated the complex interplay of spatial extent definitions, emission rates
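The kind of illustrative dispersion modeling mentioned in the Methods can be sketched with a ground-level Gaussian plume. The power-law dispersion coefficients below are hypothetical placeholders for near-neutral conditions, not the review's parameterization:

```python
import math

def plume_centerline(Q, u, x, H=0.0):
    """Ground-level centerline concentration of a Gaussian plume (g/m^3),
    with ground reflection. Q: emission rate (g/s), u: wind speed (m/s),
    x: downwind distance (m), H: effective release height (m).
    Dispersion coefficients use a hypothetical power law (illustrative only)."""
    sigma_y = 0.08 * x ** 0.9
    sigma_z = 0.06 * x ** 0.9
    return (Q / (math.pi * u * sigma_y * sigma_z)) * math.exp(-H ** 2 / (2 * sigma_z ** 2))

# concentration falls off rapidly with distance from a road-level source
for x in (50, 100, 500):
    print(x, "m:", f"{plume_centerline(Q=1.0, u=3.0, x=x):.2e}")
```

Running such a model over a range of emission rates and background levels is one way to probe how far downwind the source signal remains distinguishable, i.e., the "spatial extent".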

  18. Reducing spatial uncertainty in climatic maps through geostatistical analysis

    Science.gov (United States)

    Pesquer, Lluís; Ninyerola, Miquel; Pons, Xavier

    2014-05-01

    Climatic maps from meteorological stations and geographical co-variables can be obtained through correlative models (Ninyerola et al., 2000)*. Nevertheless, the spatial uncertainty of the resulting maps could be reduced. The present work is a new stage beyond those approaches, aiming to study how to obtain better results while characterizing spatial uncertainty. The study area is Catalonia (32000 km2), a region with highly variable relief (0 to 3143 m). We have used 217 stations (321 to 1244 mm) to model the annual precipitation in two steps: 1/ multiple regression using geographical variables (elevation, distance to the coast, latitude, etc.) and 2/ refinement of the results by adding the spatial interpolation of the regression residuals with inverse distance weighting (IDW), regularized splines with tension (SPT) or ordinary kriging (OK). Spatial uncertainty analysis is based on an independent subsample (test set), randomly selected in previous works. The main contribution of this work is the analysis of this test set as well as the search for an optimal process of division (split) of the stations into two sets, one used to perform the multiple regression and residual interpolation (fit set), and another used to compute the quality (test set); an optimal division should reduce spatial uncertainty and improve the overall quality. Two methods have been evaluated against classical methods (random selection, RS, and leave-one-out cross-validation, LOOCV): selection by Euclidean 2D distance, and selection by anisotropic 2D distance combined with a suitably weighted 3D contribution from the most representative independent variable. Both methods define a minimum threshold distance between samples, obtained by variogram analysis. Main preliminary results for LOOCV, RS (average from 10 executions), the Euclidean criterion (EU), and the anisotropic criterion (with a 1.1 value, the UTMY coordinate has a bit more weight than UTMX) combined with 3D criteria (A3D) (1000 factor for elevation
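The two-step procedure (regression on geographical variables, then interpolation of the regression residuals) can be sketched as follows; the four stations, coordinates, and elevations are invented for illustration, and only the IDW variant is shown:

```python
import numpy as np

def idw(xy_obs, values, xy_new, power=2.0):
    """Inverse distance weighting; returns the observed value exactly at a station."""
    out = []
    for p in np.atleast_2d(xy_new):
        d = np.linalg.norm(xy_obs - p, axis=1)
        if d.min() < 1e-12:
            out.append(float(values[d.argmin()]))
            continue
        w = 1.0 / d ** power
        out.append(float(w @ values / w.sum()))
    return np.array(out)

# hypothetical stations: columns = easting, northing (km); covariate = elevation (m)
xy = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
elev = np.array([200.0, 800.0, 400.0, 1200.0])
precip = np.array([500.0, 900.0, 620.0, 1150.0])  # annual precipitation (mm)

# step 1: multiple regression (here just one covariate) of precipitation on elevation
A = np.column_stack([np.ones_like(elev), elev])
coef, *_ = np.linalg.lstsq(A, precip, rcond=None)
resid = precip - A @ coef

# step 2: interpolate the residuals with IDW and add them back to the regression trend
target = np.array([[5.0, 5.0]])       # map point with assumed elevation 700 m
estimate = coef[0] + coef[1] * 700.0 + idw(xy, resid, target)[0]
print(round(estimate, 1))  # → 825.3
```

Holding out a test set of stations and comparing such estimates against their observed values is exactly what the fit/test split above quantifies.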

  19. Spatially explicit multi-criteria decision analysis for managing vector-borne diseases.

    Science.gov (United States)

    Hongoh, Valerie; Hoen, Anne Gatewood; Aenishaenslin, Cécile; Waaub, Jean-Philippe; Bélanger, Denise; Michel, Pascal

    2011-12-29

    The complex epidemiology of vector-borne diseases creates significant challenges in the design and delivery of prevention and control strategies, especially in light of rapid social and environmental changes. Spatial models for predicting disease risk based on environmental factors such as climate and landscape have been developed for a number of important vector-borne diseases. The resulting risk maps have proven value for highlighting areas for targeting public health programs. However, these methods generally only offer technical information on the spatial distribution of disease risk itself, which may be incomplete for making decisions in a complex situation. In prioritizing surveillance and intervention strategies, decision-makers often also need to consider spatially explicit information on other important dimensions, such as the regional specificity of public acceptance, population vulnerability, resource availability, intervention effectiveness, and land use. There is a need for a unified strategy for supporting public health decision making that integrates available data for assessing spatially explicit disease risk, with other criteria, to implement effective prevention and control strategies. Multi-criteria decision analysis (MCDA) is a decision support tool that allows for the consideration of diverse quantitative and qualitative criteria using both data-driven and qualitative indicators for evaluating alternative strategies with transparency and stakeholder participation. Here we propose a MCDA-based approach to the development of geospatial models and spatially explicit decision support tools for the management of vector-borne diseases. We describe the conceptual framework that MCDA offers as well as technical considerations, approaches to implementation and expected outcomes. 
We conclude that MCDA is a powerful tool that offers tremendous potential for use in public health decision-making in general and vector-borne disease management in particular.
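A weighted sum over normalized criteria is the simplest MCDA aggregation; as a hedged sketch (the criteria, weights, and strategy scores below are hypothetical, not from the paper):

```python
# hypothetical criteria scores (0-1, higher = better) for three intervention strategies
criteria = ["disease risk reduction", "public acceptance", "cost efficiency", "feasibility"]
weights  = [0.40, 0.25, 0.20, 0.15]   # e.g. elicited from stakeholders; sums to 1
scores = {
    "larvicide program":    [0.8, 0.6, 0.5, 0.9],
    "habitat modification": [0.6, 0.8, 0.7, 0.5],
    "public education":     [0.4, 0.9, 0.9, 0.8],
}

# weighted-sum score per alternative, ranked best-first
ranked = sorted(
    ((sum(w * s for w, s in zip(weights, v)), k) for k, v in scores.items()),
    reverse=True,
)
for total, name in ranked:
    print(f"{name}: {total:.3f}")
```

In a spatially explicit setting the same aggregation is applied cell by cell over criterion raster layers, producing a suitability or priority map rather than a single ranking.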

  20. Spatially explicit multi-criteria decision analysis for managing vector-borne diseases

    Directory of Open Access Journals (Sweden)

    Hongoh Valerie

    2011-12-01

    Full Text Available Abstract The complex epidemiology of vector-borne diseases creates significant challenges in the design and delivery of prevention and control strategies, especially in light of rapid social and environmental changes. Spatial models for predicting disease risk based on environmental factors such as climate and landscape have been developed for a number of important vector-borne diseases. The resulting risk maps have proven value for highlighting areas for targeting public health programs. However, these methods generally only offer technical information on the spatial distribution of disease risk itself, which may be incomplete for making decisions in a complex situation. In prioritizing surveillance and intervention strategies, decision-makers often also need to consider spatially explicit information on other important dimensions, such as the regional specificity of public acceptance, population vulnerability, resource availability, intervention effectiveness, and land use. There is a need for a unified strategy for supporting public health decision making that integrates available data for assessing spatially explicit disease risk, with other criteria, to implement effective prevention and control strategies. Multi-criteria decision analysis (MCDA) is a decision support tool that allows for the consideration of diverse quantitative and qualitative criteria using both data-driven and qualitative indicators for evaluating alternative strategies with transparency and stakeholder participation. Here we propose a MCDA-based approach to the development of geospatial models and spatially explicit decision support tools for the management of vector-borne diseases. We describe the conceptual framework that MCDA offers as well as technical considerations, approaches to implementation and expected outcomes. We conclude that MCDA is a powerful tool that offers tremendous potential for use in public health decision-making in general and vector

  1. Stomata size and spatial pattern effects on leaf gas exchange - a quantitative assessment of plant evolutionary choices

    Science.gov (United States)

    Or, Dani; Assouline, Shmuel; Aminzadeh, Milad; Haghighi, Erfan; Schymanski, Stan; Lehmann, Peter

    2014-05-01

    Land plants developed a dynamically gas-permeable layer at their leaf surfaces to allow CO2 uptake for photosynthesis while controlling water vapor loss through numerous adjustable openings (stomata) in the impervious leaf epidermis. Details of stomata structure, density and function may vary greatly among different plant families and respond to local environmental conditions, yet they share basic traits in dynamically controlling gaseous exchange rates by varying stomata apertures. We implement a pore scale gas diffusion model to quantitatively interpret the functionality of different combinations of stomata size and pattern on leaf gas exchange and thermal management based on data from fossil records and contemporary data sets. Considering all available data we draw several general conclusions concerning stomata design considerations: (1) the sizes and densities of stomata in the available fossil record leaves were designed to evaporate at rates in the range 0.75 ≤ e/e0 ≤ 0.99 (relative to free water evaporation); (2) examination of evaporation curves shows that for a given stomata size, the density (jointly defining the leaf evaporating area when fully open) was chosen to enable high sensitivity in reducing evaporation rate with incremental stomatal closure; nevertheless, results show the design includes safety margins to account for different wind conditions (boundary layer thickness); (3) scaled for mean vapor flux, the size of stomata plays a minor role in the uniformity of the leaf thermal field for a given stomata density. These principles enable rational assessment of plant response to rising CO2, and provide a physical framework for considering the consequences of different stomata patterns (patchy) on leaf gas exchange (and thermal regime). In contrast with present quantitative description of traits and functionality of these dynamic covers in terms of gaseous diffusion resistance (or conductance), where stomata size, density and spatial pattern are

  2. Drought analysis in Switzerland: spatial and temporal features

    Science.gov (United States)

    Di Franca, Gaetano; Molnar, Peter; Burlando, Paolo; Bonaccorso, Brunella; Cancelliere, Antonino

    2015-04-01

    Drought as a natural hazard may have negative impacts even in regions characterized by a general abundance of water resources. The Swiss Alpine region has experienced several extreme meteorological events (heat waves, droughts) during the last fifty years that have caused human and economic losses. Though the Swiss climate is far from arid or semi-arid, natural climatic variability, exacerbated by climate change, could lead to more severe impacts from naturally occurring meteorological droughts (i.e. lack or significant reduction of precipitation) in the future. In this work, spatial and temporal features of meteorological droughts in Switzerland have been explored by the identification and probabilistic characterization of historic drought events on gridded precipitation data during the period 1961-2012. The run method has been applied to both monthly and annual precipitation time series to probabilistically characterize drought occurrences as well as to analyze their spatial variability. Spatial features have also been investigated by means of Principal Components Analysis (PCA) applied to Standardized Precipitation Index (SPI) series at 3-, 6-, and 12-month aggregation time scales, in order to detect areas with distinct precipitation patterns, accounting for seasonality throughout the year and including both wet and dry conditions. Furthermore, a probabilistic analysis of drought areal extent has been carried out by applying an SPI-based procedure to derive Severity-Area-Frequency (SAF) curves. The application of the run method reveals that Ticino and Valais are the most potentially drought-prone Swiss regions, since accumulated deficit precipitation is significantly higher (up to two times) than in the rest of the country. Inspection of SPI series reveals many events in which precipitation has shown significant anomalies from the average in the period 1961-2012 at the investigated time scales.
Anomalies in rainfall seem to exhibit high spatial correlation, showing uniform sub
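The run method described above reduces to scanning a precipitation series for contiguous below-threshold spells and accumulating their deficits. A minimal sketch with an invented monthly series and threshold:

```python
def drought_runs(precip, threshold):
    """Run method: contiguous spells below threshold ->
    list of (start index, duration, accumulated deficit)."""
    events, start, deficit = [], None, 0.0
    for i, p in enumerate(precip):
        if p < threshold:
            if start is None:
                start, deficit = i, 0.0
            deficit += threshold - p      # deficit grows while the run continues
        elif start is not None:
            events.append((start, i - start, deficit))
            start = None
    if start is not None:                 # series ends inside a run
        events.append((start, len(precip) - start, deficit))
    return events

# hypothetical monthly precipitation (mm) against an 80 mm threshold
series = [100, 70, 60, 95, 120, 50, 40, 75, 110]
print(drought_runs(series, 80))  # [(1, 2, 30.0), (5, 3, 75.0)]
```

The durations and accumulated deficits of such runs are the quantities whose empirical distributions support the probabilistic characterization of drought occurrence.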

  3. A quantitative analysis of Salmonella Typhimurium metabolism during infection

    OpenAIRE

    Steeb, Benjamin

    2012-01-01

    In this thesis, Salmonella metabolism during infection was investigated. The goal was to gain a quantitative and comprehensive understanding of Salmonella in vivo nutrient supply, utilization and growth. To achieve this goal, we used a combined experimental / in silico approach. First, we generated a reconstruction of Salmonella metabolism ([1], see 2.1). This reconstruction was then combined with in vivo data from experimental mutant phenotypes to build a comprehensive quantitative in viv...

  4. Quantitative analysis of flavanones and chalcones from willow bark.

    Science.gov (United States)

    Freischmidt, A; Untergehrer, M; Ziegler, J; Knuth, S; Okpanyi, S; Müller, J; Kelber, O; Weiser, D; Jürgenliemk, G

    2015-09-01

    Willow bark extracts are used for the treatment of fever, pain and inflammation. Recent clinical and pharmacological research revealed that not only the salicylic alcohol derivatives, but also the polyphenols significantly contribute to these effects. Quantitative analysis in the European Pharmacopoeia still focuses on the determination of the salicylic alcohol derivatives. The objective of the present study was the development of an effective quantification method for the determination of as many flavanone and chalcone glycosides as possible in Salix purpurea and other Salix species as well as commercial preparations thereof. As Salix species contain a diverse spectrum of the glycosidated flavanones naringenin, eriodictyol, and the chalcone chalconaringenin, a subsequent acidic and enzymatic hydrolysis was developed to yield naringenin and eriodictyol as aglycones, which were quantified by HPLC. The 5-O-glucosides were cleaved with 11.5% TFA before subsequent hydrolysis of the 7-O-glucosides with an almond β-glucosidase at pH 6-7. The method was validated with regard to LOD, LOQ, intraday and interday precision, accuracy, stability, recovery, time of hydrolysis, robustness and applicability to extracts. All 5-O- and 7-O-glucosides of naringenin, eriodictyol and chalconaringenin were completely hydrolysed and converted to naringenin and eriodictyol. The LOD of the HPLC method was 0.77 μM for naringenin and 0.45 μM for eriodictyol. The LOQ was 2.34 μM for naringenin and 1.35 μM for eriodictyol. The method is robust with regard to sample weight, but susceptible concerning enzyme deterioration. The developed method is applicable to the determination of flavanone and chalcone glycosides in willow bark and corresponding preparations.

  5. Quantitative Financial Analysis of Alternative Energy Efficiency Shareholder Incentive Mechanisms

    Energy Technology Data Exchange (ETDEWEB)

    Cappers, Peter; Goldman, Charles; Chait, Michele; Edgar, George; Schlegel, Jeff; Shirley, Wayne

    2008-08-03

    Rising energy prices and climate change are central issues in the debate about our nation's energy policy. Many are demanding increased energy efficiency as a way to help reduce greenhouse gas emissions and lower the total cost of electricity and energy services for consumers and businesses. Yet, as the National Action Plan on Energy Efficiency (NAPEE) pointed out, many utilities continue to shy away from seriously expanding their energy efficiency program offerings because they claim there is insufficient profit motivation, or even a financial disincentive, when compared to supply-side investments. With the recent introduction of Duke Energy's Save-a-Watt incentive mechanism and ongoing discussions about decoupling, regulators and policymakers are now faced with an expanded and diverse landscape of financial incentive mechanisms. Determining the 'right' way forward to promote deep and sustainable demand-side resource programs is challenging. Due to the renaissance that energy efficiency is currently experiencing, many want to better understand the tradeoffs in stakeholder benefits between these alternative incentive structures before aggressively embarking on a path for which course corrections can be time-consuming and costly. Using a prototypical Southwest utility and a publicly available financial model, we show how various stakeholders (e.g. shareholders, ratepayers, etc.) are affected by these different types of shareholder incentive mechanisms under varying assumptions about program portfolios. This quantitative analysis compares the financial consequences associated with a wide range of alternative incentive structures. The results will help regulators and policymakers better understand the financial implications of DSR program incentive regulation.

  6. Quantitative Analysis of the Effective Functional Structure in Yeast Glycolysis

    Science.gov (United States)

    De la Fuente, Ildefonso M.; Cortes, Jesus M.

    2012-01-01

    The understanding of the effective functionality that governs the enzymatic self-organized processes in cellular conditions is a crucial topic in the post-genomic era. In recent studies, Transfer Entropy has been proposed as a rigorous, robust and self-consistent method for the causal quantification of the functional information flow among nonlinear processes. Here, in order to quantify the functional connectivity for the glycolytic enzymes in dissipative conditions we have analyzed different catalytic patterns using the technique of Transfer Entropy. The data were obtained by means of a yeast glycolytic model formed by three delay differential equations where the enzymatic rate equations of the irreversible stages have been explicitly considered. These enzymatic activity functions were previously modeled and tested experimentally by other different groups. The results show the emergence of a new kind of dynamical functional structure, characterized by changing connectivity flows and a metabolic invariant that constrains the activity of the irreversible enzymes. In addition to the classical topological structure characterized by the specific location of enzymes, substrates, products and feedback-regulatory metabolites, an effective functional structure emerges in the modeled glycolytic system, which is dynamical and characterized by notable variations of the functional interactions. The dynamical structure also exhibits a metabolic invariant which constrains the functional attributes of the enzymes. Finally, in accordance with the classical biochemical studies, our numerical analysis reveals in a quantitative manner that the enzyme phosphofructokinase is the key-core of the metabolic system, behaving for all conditions as the main source of the effective causal flows in yeast glycolysis. PMID:22393350
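Transfer Entropy for discrete series can be estimated by plug-in counting of state transitions. A minimal sketch for binary series with history length 1; the coupled/independent test signals are synthetic, unrelated to the glycolysis model above:

```python
import math
import random
from collections import Counter

def transfer_entropy(x, y):
    """TE(X->Y) in bits for discrete series, history length 1:
    sum over (y1, y0, x0) of p(y1,y0,x0) * log2[ p(y1|y0,x0) / p(y1|y0) ]."""
    triples = list(zip(y[1:], y[:-1], x[:-1]))   # (next y, current y, current x)
    n = len(triples)
    c_yyx = Counter(triples)
    c_yx = Counter((y0, x0) for _, y0, x0 in triples)
    c_yy = Counter((y1, y0) for y1, y0, _ in triples)
    c_y = Counter(y0 for _, y0, _ in triples)
    te = 0.0
    for (y1, y0, x0), c in c_yyx.items():
        p_joint = c / n
        p_full = c / c_yx[(y0, x0)]              # p(y1 | y0, x0)
        p_hist = c_yy[(y1, y0)] / c_y[y0]        # p(y1 | y0)
        te += p_joint * math.log2(p_full / p_hist)
    return te

random.seed(1)
x = [random.randint(0, 1) for _ in range(5000)]
y_driven = [0] + x[:-1]                           # y copies x with a one-step lag
y_indep = [random.randint(0, 1) for _ in range(5000)]
print(round(transfer_entropy(x, y_driven), 2))    # ~1.0 bit (deterministic coupling)
print(round(transfer_entropy(x, y_indep), 2))     # ~0.0 (no causal flow)
```

For continuous enzymatic activities, as in the glycolytic model, the series would first be discretized (or a kernel/nearest-neighbor estimator used), but the directional logic is the same.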

  7. Descriptive quantitative analysis of hallux abductovalgus transverse plane radiographic parameters.

    Science.gov (United States)

    Meyr, Andrew J; Myers, Adam; Pontious, Jane

    2014-01-01

    Although the transverse plane radiographic parameters of the first intermetatarsal angle (IMA), hallux abductus angle (HAA), and the metatarsal-sesamoid position (MSP) form the basis of preoperative procedure selection and postoperative surgical evaluation of the hallux abductovalgus deformity, the so-called normal values of these measurements have not been well established. The objectives of the present study were to (1) evaluate the descriptive statistics of the first IMA, HAA, and MSP from a large patient population and (2) to determine an objective basis for defining "normal" versus "abnormal" measurements. Anteroposterior foot radiographs from 373 consecutive patients without a history of previous foot and ankle surgery and/or trauma were evaluated for the measurements of the first IMA, HAA, and MSP. The results revealed a mean measurement of 9.93°, 17.59°, and position 3.63 for the first IMA, HAA, and MSP, respectively. An advanced descriptive analysis demonstrated data characteristics of both parametric and nonparametric distributions. Furthermore, clear differentiations in deformity progression were appreciated when the variables were graphically depicted against each other. This could represent a quantitative basis for defining "normal" versus "abnormal" values. From the results of the present study, we have concluded that these radiographic parameters can be more conservatively reported and analyzed using nonparametric descriptive and comparative statistics within medical studies and that the combination of a first IMA, HAA, and MSP at or greater than approximately 10°, 18°, and position 4, respectively, appears to be an objective "tipping point" in terms of deformity progression and might represent an upper limit of acceptable in terms of surgical deformity correction.

  8. A qualitative and quantitative analysis of vegetable pricing in supermarket

    Science.gov (United States)

    Miranda, Suci

    2017-06-01

    The purpose of this study is to analyze, qualitatively and quantitatively, the variables affecting the determination of the sale price of vegetables that is constant over time in a supermarket. It focuses on non-organic vegetables with a fixed selling price over time, such as spinach, beet, and parsley. In the qualitative analysis, the sale price determination is influenced by the vegetable characteristics: (1) vegetable segmentation (low to high daily consumption); (2) vegetable age (how long it can last, related to freshness); both characteristics relate to inventory management and ultimately to the sale price in the supermarket. Quantitatively, the vegetables are divided into two categories: the leafy vegetable group, whose leaves are eaten as a vegetable, with product age (a) = 0 and shelf life (t) = 0, and the non-leafy vegetable group with age (a) = a+1 and shelf life (t) = t+1. A vegetable age of (a) = 0 means the items last for only one day after being ordered and must then be removed, whereas a+1 means they have a longer life of more than a day, such as beet, white radish, and string beans. The shelf life refers to how long the item is placed on a shelf in the supermarket, in line with the vegetable age. According to the cost-plus pricing method with a full costing approach, production costs, non-production costs, and markup are adjusted differently for each category. A holding cost is added to the sale price of the non-leafy vegetables, while a holding cost of 0 is assumed for the leafy vegetable category. The expected margin of each category is correlated to the vegetable characteristics.
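The cost-plus (full costing) price described above is simply total cost marked up, with a holding cost added only for the longer-lived category. A minimal sketch with invented cost figures (not the study's data):

```python
def cost_plus_price(production, non_production, markup, holding=0.0):
    """Cost-plus pricing under full costing:
    price = (production + non-production + holding costs) * (1 + markup)."""
    return (production + non_production + holding) * (1 + markup)

# hypothetical per-unit costs (currency units) and a 25% markup
leafy  = cost_plus_price(production=4000, non_production=600, markup=0.25)               # a = 0: no holding cost
rooted = cost_plus_price(production=3500, non_production=600, markup=0.25, holding=300)  # a+1: holding cost added
print(leafy, rooted)  # 5750.0 5500.0
```

The holding-cost term is what links the shelf-life categories above to the fixed sale price: a longer-lived vegetable carries its storage cost inside the same constant price.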

  9. Hydrocarbons on Phoebe, Iapetus, and Hyperion: Quantitative Analysis

    Science.gov (United States)

    Cruikshank, Dale P.; MoreauDalleOre, Cristina; Pendleton, Yvonne J.; Clark, Roger Nelson

    2012-01-01

    We present a quantitative analysis of the hydrocarbon spectral bands measured on three of Saturn's satellites: Phoebe, Iapetus, and Hyperion. These bands, measured with the Cassini Visual and Infrared Mapping Spectrometer on close flybys of these satellites, are the C-H stretching modes of aromatic hydrocarbons at approximately 3.28 micrometers (approximately 3050 per centimeter), and four blended bands of aliphatic -CH2- and -CH3 in the range approximately 3.36-3.52 micrometers (approximately 2980-2840 per centimeter). The aromatic band, probably indicating the presence of polycyclic aromatic hydrocarbons (PAH), is unusually strong in comparison to the aliphatic bands, resulting in a unique signature among Solar System bodies measured so far, and as such offers a means of comparison among the three satellites. The ratio of the C-H bands in aromatic molecules to those in aliphatic molecules in the surface materials of Phoebe is NAro:NAliph approximately 24; for Hyperion the value is approximately 12, while Iapetus shows an intermediate value. In view of the trend of the evolution (dehydrogenation by heat and radiation) of aliphatic complexes toward more compact molecules and eventually to aromatics, the relative abundance of aliphatic -CH2- and -CH3- is an indication of the lengths of the molecular chain structures, and hence the degree of modification of the original material. We derive CH2:CH3 approximately 2.2 in the spectrum of low-albedo material on Iapetus; this value is the same, within measurement errors, as the ratio in the diffuse interstellar medium. The similarity in the spectral signatures of the three satellites, plus the apparent weak trend of aromatic/aliphatic abundance from Phoebe to Hyperion, is consistent with, and effectively confirms, that the source of the hydrocarbon-bearing material is Phoebe, and that the appearance of that material on the other two satellites arises from the deposition of the inward-spiraling dust that populates the Phoebe ring.

  10. Cartographic system for spatial distribution analysis of corneal endothelial cells.

    Science.gov (United States)

    Corkidi, G; Márquez, J; García-Ruiz, M; Díaz-Cintra, S; Graue, E

    1994-07-01

    A combined cartographic and morphometric endothelium analyser has been developed by integrating the HISTO 2000 histological imaging and analysis system with a prototype human corneal endothelium analyser. The complete system allows the elaboration and analysis of cartographies of corneal endothelial tissue, and hence the in vitro study of the spatial distribution of corneal endothelial cells, according to their regional morphometric characteristics (cell size and polygonality). The global cartographic reconstruction is obtained by sequential integration of the data analysed for each microscopic field. Subsequently, the location of each microscopically analysed field is referred to its real position on the histologic preparation by means of X-Y co-ordinates; both are provided by micrometric optoelectronic sensors installed on the optical microscope stage. Some cartographies of an excised human corneal keratoconus button in vitro are also presented. These cartographic images allow a macroscopic view of endothelial cells analysed microscopically. Parametric colour images show the spatial distribution of endothelial cells, according to their specific morphometric parameters, and exhibit the variability in size and cellular shape which depend on the analysed area.

  11. Modeling and analysis of Schistosoma Argonaute protein molecular spatial conformation

    Institute of Scientific and Technical Information of China (English)

    Jianhua Zhang; Zhigang Shang; Xiaohui Zhang; Yuntao Zhang

    2011-01-01

    Objective: To analyze the amino acid sequence composition, secondary structure, the spatial conformation of its domains, and other characteristics of the Argonaute protein. Methods: Bioinformatics tools and internet servers were used. Firstly, the amino acid sequence composition features of the Argonaute protein were analyzed, and the phylogenetic tree was constructed. Secondly, the Argonaute protein's distribution of secondary structure and its physicochemical properties were predicted. Lastly, the functional expression form of the domain group was established through Phyre-based analysis of the spatial conformation of the Argonaute protein domains. Results: The Argonaute protein was found to comprise 593 amino acids; the phylogenetic tree was constructed, and the Argonaute protein's distribution of secondary structure and its physicochemical properties were obtained through analysis. In addition, the functional expression form, comprising the N-terminal PAZ domain and C-terminal Piwi domain of the Argonaute protein, was obtained with Phyre. Conclusions: The relationship between the structure and function of the Argonaute protein can be initially established with bioinformatics tools and internet servers, and this provides a theoretical basis for further clarifying the function of the Schistosoma Argonaute protein.

  12. Spatially explicit analysis of gastropod biodiversity in ancient Lake Ohrid

    Directory of Open Access Journals (Sweden)

    T. Hauffe

    2011-01-01

    Full Text Available The quality of spatial analyses of biodiversity is improved by (i) utilizing study areas with well defined physiogeographical boundaries, (ii) limiting the impact of widespread species, and (iii) using taxa with heterogeneous distributions. These conditions are typically met by ecosystems such as oceanic islands or ancient lakes and their biota. While research on ancient lakes has contributed significantly to our understanding of evolutionary processes, statistically sound studies of spatial variation of extant biodiversity have been hampered by the frequently vast size of ancient lakes, their limited accessibility, and the lack of scientific infrastructure. The European ancient Lake Ohrid provides a rare opportunity for such a reliable spatial study. The comprehensive horizontal and vertical sampling of a species-rich taxon, the Gastropoda, presented here, revealed interesting patterns of biodiversity, which, in part, have not been shown before for other ancient lakes.

    In a total of 284 samples from 224 different locations throughout the Ohrid Basin, 68 gastropod species, with 50 of them (= 73.5%) being endemic, could be reported. The spatial distribution of these species shows the following characteristics: (i) within Lake Ohrid, the most frequent species are endemic taxa with a wide depth range, (ii) widespread species (i.e. those occurring throughout the Balkans or beyond) are rare and mainly occur in the upper layer of the lake, (iii) while the total number of species decreases with water depth, the proportion of endemics increases, and (iv) the deeper layers of Lake Ohrid appear to have a higher spatial homogeneity of biodiversity. Moreover, gastropod communities of Lake Ohrid and its feeder springs are both distinct from each other and from the surrounding waters. The analysis also shows that community similarity of Lake Ohrid is mainly driven by niche processes (e.g. environmental factors), but also by neutral processes (e.g. dispersal

  13. Recovering prehistoric woodworking skills using spatial analysis techniques

    Science.gov (United States)

    Kovács, K.; Hanke, K.

    2015-08-01

    Recovering ancient woodworking skills can be achieved by the simultaneous documentation and analysis of tangible evidence such as the geometry parameters of prehistoric hand tools or the fine morphological characteristics of well preserved wooden archaeological finds. During this study, altogether 10 different hand tool forms and over 60 hand tool impressions were investigated for a better understanding of Bronze Age woodworking efficiency. Two archaeological experiments were also designed in this methodology, and unknown prehistoric adzes could be reconstructed from the results of these studies and from the spatial analysis of the Bronze Age tool marks. Finally, the trimming efficiency of these objects was also inferred, and these woodworking skills could be quantified in the case of a Bronze Age wooden construction from Austria. The proposed GIS-based tool mark segmentation and comparison can offer an objective, user-independent technique for the related intangible heritage interpretations in the future.

  14. Exploratory analysis of spatial and temporal data a systematic approach

    CERN Document Server

    Andrienko, Natalia

    2006-01-01

    Exploratory data analysis (EDA) is about detecting and describing patterns, trends, and relations in data, motivated by certain purposes of investigation. As something relevant is detected in data, new questions arise, causing specific parts to be viewed in more detail. So EDA has a significant appeal: it involves hypothesis generation rather than mere hypothesis testing. The authors describe in detail and systemize approaches, techniques, and methods for exploring spatial and temporal data in particular. They start by developing a general view of data structures and characteristics and then build on top of this a general task typology, distinguishing between elementary and synoptic tasks. This typology is then applied to the description of existing approaches and technologies, resulting not just in recommendations for choosing methods but in a set of generic procedures for data exploration. Professionals practicing analysis will profit from tested solutions - illustrated in many examples - for reuse in the c...

  15. Quantitative measurements of inequality in geographic accessibility to pediatric care in Oita Prefecture, Japan: Standardization with complete spatial randomness

    Directory of Open Access Journals (Sweden)

    Shima Masayuki

    2011-07-01

    Full Text Available Abstract Background A quantitative measurement of inequality in geographic accessibility to pediatric care, as well as that of mean distance or travel time, is very important for priority setting to ensure fair access to pediatric facilities. However, conventional techniques for measuring inequality are inappropriate in geographic settings. Since inequality measures of access distance or travel time are strongly influenced by the background geographic distribution patterns, they cannot be directly used for regional comparisons of geographic accessibility. The objective of this study is to resolve this issue by using a standardization approach. Methods Travel times to the nearest pediatric care were calculated for all children in Oita Prefecture, Japan. Relative mean differences were considered as the inequality measure for secondary medical service areas, and were standardized with an expected value estimated from a Monte Carlo simulation based on complete spatial randomness. Results The observed mean travel times in the areas considered averaged 4.50 minutes, ranging from 1.83 to 7.02 minutes. The mean of the observed inequality measure was 1.1, ranging from 0.9 to 1.3. The expected values of the inequality measure varied according to the background geographic distribution pattern of children, ranging from 0.3 to 0.7. After standardizing the observed inequality measure with the expected one, we found that the ranks of the inequality measure were reversed for the observed areas. Conclusions Using the indicator proposed in this paper, it is possible to compare the inequality in geographic accessibility among regions. Such a comparison may facilitate priority setting in health policy and planning.
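    The standardization idea can be illustrated with a toy Monte Carlo sketch: compute the relative mean difference (RMD) of travel times, estimate its expected value under complete spatial randomness (CSR), and take their ratio. This is a deliberate simplification of the study's method (children scattered uniformly in a unit square, Euclidean distance to a single facility standing in for travel time); all names are illustrative:

    ```python
    import random

    def relative_mean_difference(values):
        # One common definition: mean absolute pairwise difference
        # divided by the mean (an inequality measure in [0, 2)).
        n = len(values)
        mean = sum(values) / n
        mad = sum(abs(a - b) for a in values for b in values) / (n * n)
        return mad / mean

    def csr_expected_rmd(n_children, facility=(0.5, 0.5), n_sim=200, seed=1):
        # Expected RMD when children are scattered under complete
        # spatial randomness in a unit square; Euclidean distance to
        # one facility stands in for travel time (a simplification).
        rng = random.Random(seed)
        sims = []
        for _ in range(n_sim):
            d = [((rng.random() - facility[0]) ** 2 +
                  (rng.random() - facility[1]) ** 2) ** 0.5
                 for _ in range(n_children)]
            sims.append(relative_mean_difference(d))
        return sum(sims) / n_sim

    # Standardized inequality = observed RMD / CSR-expected RMD,
    # which is what makes regions with different background
    # distributions comparable.
    ```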

  16. Quantitative PCR analysis of salivary pathogen burden in periodontitis.

    Science.gov (United States)

    Salminen, Aino; Kopra, K A Elisa; Hyvärinen, Kati; Paju, Susanna; Mäntylä, Päivi; Buhlin, Kåre; Nieminen, Markku S; Sinisalo, Juha; Pussinen, Pirkko J

    2015-01-01

    Our aim was to investigate the value of salivary concentrations of four major periodontal pathogens and their combination in diagnostics of periodontitis. The Parogene study included 462 dentate subjects (mean age 62.9 ± 9.2 years) with coronary artery disease (CAD) diagnosis who underwent an extensive clinical and radiographic oral examination. Salivary levels of four major periodontal bacteria were measured by quantitative real-time PCR (qPCR). Median salivary concentrations of Porphyromonas gingivalis, Tannerella forsythia, and Prevotella intermedia, as well as the sum of the concentrations of the four bacteria, were higher in subjects with moderate to severe periodontitis compared to subjects with no to mild periodontitis. Median salivary Aggregatibacter actinomycetemcomitans concentrations did not differ significantly between the subjects with no to mild periodontitis and subjects with moderate to severe periodontitis. In logistic regression analysis adjusted for age, gender, diabetes, and the number of teeth and implants, high salivary concentrations of P. gingivalis, T. forsythia, and P. intermedia were significantly associated with moderate to severe periodontitis. When looking at different clinical and radiographic parameters of periodontitis, high concentrations of P. gingivalis and T. forsythia were significantly associated with the number of 4-5 mm periodontal pockets, ≥6 mm pockets, and alveolar bone loss (ABL). High level of T. forsythia was associated also with bleeding on probing (BOP). The combination of the four bacteria, i.e., the bacterial burden index, was associated with moderate to severe periodontitis with an odds ratio (OR) of 2.40 (95% CI 1.39-4.13). When A. actinomycetemcomitans was excluded from the combination of the bacteria, the OR was improved to 2.61 (95% CI 1.51-4.52). The highest OR 3.59 (95% CI 1.94-6.63) was achieved when P. intermedia was further excluded from the combination and only the levels of P. gingivalis and T
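    For reference, the odds ratios above come from logistic regression, where OR = exp(β) and the 95% CI is exp(β ± 1.96·SE). A generic conversion sketch (not the study's code; the standard error in the test is hypothetical):

    ```python
    import math

    def odds_ratio_with_ci(beta, se, z=1.96):
        # Convert a logistic-regression coefficient and its standard
        # error into an odds ratio with a (by default 95%) CI.
        return (math.exp(beta),
                math.exp(beta - z * se),
                math.exp(beta + z * se))
    ```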

  17. Universal platform for quantitative analysis of DNA transposition

    Directory of Open Access Journals (Sweden)

    Pajunen Maria I

    2010-11-01

    Full Text Available Abstract Background Completed genome projects have revealed an astonishing diversity of transposable genetic elements, implying the existence of novel element families yet to be discovered from diverse life forms. Concurrently, several better understood transposon systems have been exploited as efficient tools in molecular biology and genomics applications. Characterization of new mobile elements and improvement of the existing transposition technology platforms warrant easy-to-use assays for the quantitative analysis of DNA transposition. Results Here we developed a universal in vivo platform for the analysis of transposition frequency with class II mobile elements, i.e., DNA transposons. For each particular transposon system, cloning of the transposon ends and the cognate transposase gene, in three consecutive steps, generates a multifunctional plasmid, which drives inducible expression of the transposase gene and includes a mobilisable lacZ-containing reporter transposon. The assay scores transposition events as blue microcolonies, papillae, growing within otherwise whitish Escherichia coli colonies on indicator plates. We developed the assay using phage Mu transposition as a test model and validated the platform using various MuA transposase mutants. For further validation and to illustrate universality, we introduced IS903 transposition system components into the assay. The developed assay is adjustable to a desired level of initial transposition via the control of a plasmid-borne E. coli arabinose promoter. In practice, the transposition frequency is modulated by varying the concentration of arabinose or glucose in the growth medium. We show that variable levels of transpositional activity can be analysed, thus enabling straightforward screens for hyper- or hypoactive transposase mutants, regardless of the original wild-type activity level. Conclusions The established universal papillation assay platform should be widely applicable to a

  18. Quantitative PCR analysis of salivary pathogen burden in periodontitis

    Directory of Open Access Journals (Sweden)

    Aino eSalminen

    2015-10-01

    Full Text Available Our aim was to investigate the value of salivary concentrations of four major periodontal pathogens and their combination in diagnostics of periodontitis. The Parogene study included 462 dentate subjects (mean age 62.9 ± 9.2 years) with coronary artery disease diagnosis who underwent an extensive clinical and radiographic oral examination. Salivary levels of four major periodontal bacteria were measured by quantitative real-time PCR. Median salivary concentrations of P. gingivalis, T. forsythia, and P. intermedia, as well as the sum of the concentrations of the four bacteria, were higher in subjects with moderate to severe periodontitis compared to subjects with no to mild periodontitis. Median salivary A. actinomycetemcomitans concentrations did not differ significantly between the subjects with no to mild periodontitis and subjects with moderate to severe periodontitis. In logistic regression analysis adjusted for age, gender, diabetes, and the number of teeth and implants, high salivary concentrations of P. gingivalis, T. forsythia, and P. intermedia were significantly associated with moderate to severe periodontitis. When looking at different clinical and radiographic parameters of periodontitis, high concentrations of P. gingivalis and T. forsythia were significantly associated with the number of 4-5 mm periodontal pockets, ≥ 6 mm pockets, and alveolar bone loss (ABL). High level of T. forsythia was associated also with bleeding on probing (BOP). The combination of the four bacteria, i.e. the bacterial burden index, was associated with moderate to severe periodontitis with an odds ratio (OR) of 2.40 (95% CI 1.39–4.13). When A. actinomycetemcomitans was excluded from the combination of the bacteria, the OR was improved to 2.61 (95% CI 1.51–4.52). The highest odds ratio, 3.59 (95% CI 1.94–6.63), was achieved when P. intermedia was further excluded from the combination and only the levels of P. gingivalis and T. forsythia were used. Salivary

  19. Quantitative diagnosis of bladder cancer by morphometric analysis of HE images

    Science.gov (United States)

    Wu, Binlin; Nebylitsa, Samantha V.; Mukherjee, Sushmita; Jain, Manu

    2015-02-01

    In clinical practice, histopathological analysis of biopsied tissue is the main method for bladder cancer diagnosis and prognosis. The diagnosis is performed by a pathologist based on the morphological features in the image of a hematoxylin and eosin (HE) stained tissue sample. This manuscript proposes algorithms to perform morphometric analysis on the HE images, quantify the features in the images, and discriminate bladder cancers with different grades, i.e. high grade and low grade. The nuclei are separated from the background and other types of cells such as red blood cells (RBCs) and immune cells using manual outlining, color deconvolution and image segmentation. A mask of nuclei is generated for each image for quantitative morphometric analysis. The features of the nuclei in the mask image, including size, shape, orientation, and their spatial distributions, are measured. To quantify local clustering and alignment of nuclei, we propose a 1-nearest-neighbor (1-NN) algorithm which measures nearest neighbor distance and nearest neighbor parallelism. The global distributions of the features are measured using statistics of the proposed parameters. A linear support vector machine (SVM) algorithm is used to classify the high grade and low grade bladder cancers. The results show that using a particular group of nuclei, such as large ones, and combining multiple parameters can achieve better discrimination. This study shows the proposed approach can potentially help expedite pathological diagnosis by triaging potentially suspicious biopsies.
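    The 1-NN measurements described above can be sketched directly: for each nucleus, find its nearest neighbor and record the distance and the acute angle between the two orientations. A minimal brute-force version, assuming centroids as (x, y) pairs and orientations in degrees (names are illustrative, not the authors' code):

    ```python
    import math

    def one_nn_features(centroids, orientations):
        # For each nucleus: distance to its nearest neighbor, and the
        # acute angle between their orientations ("parallelism").
        feats = []
        for i, (p, ang) in enumerate(zip(centroids, orientations)):
            best_j, best_d = None, float("inf")
            for j, q in enumerate(centroids):
                if j == i:
                    continue
                d = math.dist(p, q)
                if d < best_d:
                    best_j, best_d = j, d
            # Orientations are axial (0-180 deg), so take the acute
            # difference between the two angles.
            dtheta = abs(ang - orientations[best_j]) % 180.0
            feats.append((best_d, min(dtheta, 180.0 - dtheta)))
        return feats
    ```

    A k-d tree would replace the O(n²) scan for large nucleus counts; the brute-force loop keeps the sketch self-contained.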

  20. Quantitative analysis of geomorphic processes using satellite image data at different scales

    Science.gov (United States)

    Williams, R. S., Jr.

    1985-01-01

    When aerial and satellite photographs and images are used in the quantitative analysis of geomorphic processes, either through direct observation of active processes or by analysis of landforms resulting from inferred active or dormant processes, a number of limitations in the use of such data must be considered. Active geomorphic processes work at different scales and rates. Therefore, the capability of imaging an active or dormant process depends primarily on the scale of the process and the spatial-resolution characteristic of the imaging system. Scale is an important factor in recording continuous and discontinuous active geomorphic processes, because what is not recorded will not be considered or even suspected in the analysis of orbital images. If the geomorphic process, or the landform change caused by the process, is less than 200 m in x-to-y dimension, then it will not be recorded. Although the scale factor is critical in the recording of discontinuous active geomorphic processes, the repeat interval of orbital-image acquisition of a planetary surface is also a consideration in order to capture a recurring short-lived geomorphic process or to record changes caused by either a continuous or a discontinuous geomorphic process.

  1. Quantitative analysis of scanning tunneling microscopy images of mixed-ligand-functionalized nanoparticles.

    Science.gov (United States)

    Biscarini, Fabio; Ong, Quy Khac; Albonetti, Cristiano; Liscio, Fabiola; Longobardi, Maria; Mali, Kunal S; Ciesielski, Artur; Reguera, Javier; Renner, Christoph; De Feyter, Steven; Samorì, Paolo; Stellacci, Francesco

    2013-11-12

    Ligand-protected gold nanoparticles exhibit large local curvatures, features rapidly varying over small scales, and chemical heterogeneity. Their imaging by scanning tunneling microscopy (STM) can, in principle, provide direct information on the architecture of their ligand shell, yet STM images require laborious analysis and are challenging to interpret. Here, we report a straightforward, robust, and rigorous method for the quantitative analysis of the multiscale features contained in STM images of samples consisting of functionalized Au nanoparticles deposited onto Au/mica. The method relies on the analysis of the topographical power spectral density (PSD) and allows us to extract the characteristic length scales of the features exhibited by nanoparticles in STM images. For the mixed-ligand-protected Au nanoparticles analyzed here, the characteristic length scale is 1.2 ± 0.1 nm, whereas for the homoligand Au NPs this scale is 0.75 ± 0.05 nm. These length scales represent spatial correlations independent of scanning parameters, and hence the features in the PSD can be ascribed to a fingerprint of the STM contrast of ligand-protected nanoparticles. PSD spectra from images recorded at different laboratories using different microscopes and operators can be overlapped across most of the frequency range, proving that the features in the STM images of nanoparticles can be compared and reproduced.
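    The core computation here is a radially averaged power spectral density of a topography image, where a peak at spatial frequency f corresponds to a characteristic length of 1/f. A minimal numpy sketch for a square image; this is a generic implementation, not the authors' analysis code:

    ```python
    import numpy as np

    def radial_psd(image, pixel_size_nm):
        # 2-D power spectral density of a square topography image,
        # averaged over rings of constant spatial frequency.
        img = image - image.mean()          # remove the DC component
        psd2d = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
        n = img.shape[0]
        freqs = np.fft.fftshift(np.fft.fftfreq(n, d=pixel_size_nm))
        fx, fy = np.meshgrid(freqs, freqs)
        fr = np.hypot(fx, fy)               # radial frequency per pixel
        bins = np.linspace(0.0, fr.max(), n // 2)
        idx = np.digitize(fr.ravel(), bins)
        radial = np.bincount(idx, weights=psd2d.ravel()) / np.maximum(
            np.bincount(idx), 1)
        return bins, radial[1:len(bins) + 1]
    ```

    For instance, stripes with a 0.4 nm period imaged at 0.1 nm per pixel produce a PSD peak near 2.5 nm⁻¹, i.e. a characteristic length of 0.4 nm.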

  2. CT acquisition technique and quantitative analysis of the lung parenchyma: variability and corrections

    Science.gov (United States)

    Zheng, Bin; Leader, J. K.; Coxson, Harvey O.; Scuirba, Frank C.; Fuhrman, Carl R.; Balkan, Arzu; Weissfeld, Joel L.; Maitz, Glenn S.; Gur, David

    2006-03-01

    The fraction of lung voxels below a pixel-value "cut-off" has been correlated with pathologic estimates of emphysema. We performed a "standard" quantitative CT (QCT) lung analysis using a -950 HU cut-off to determine the volume fraction of emphysema (below the cut-off) and a "corrected" QCT analysis after removing small groups (5 and 10 pixels) of connected pixels ("blobs") below the cut-off. CT examinations from two datasets of 15 subjects each, with a range of visible emphysema and pulmonary obstruction, were acquired at low dose and conventional dose for the same subjects and reconstructed using a high-spatial-frequency kernel at 2.5 mm section thickness. The "blob" size (i.e., connected pixels) removed was inversely related to the computed fraction of emphysema. The slopes of emphysema fraction versus blob size were 0.013, 0.009, and 0.005 for subjects with no emphysema and no pulmonary obstruction, moderate emphysema and pulmonary obstruction, and severe emphysema and severe pulmonary obstruction, respectively. The slopes of emphysema fraction versus blob size were 0.008 and 0.006 for low-dose and conventional CT examinations, respectively. The small blobs of pixels removed are most likely CT image artifacts and do not represent actual emphysema. The magnitude of the blob correction was appropriately associated with COPD severity. The blob correction appears to be applicable to QCT analysis in low-dose and conventional CT exams.
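    The "blob" correction amounts to connected-component filtering before computing the emphysema fraction. A 2-D sketch using scipy.ndimage (assumed available); array names and the 2-D simplification are illustrative, not the study's implementation:

    ```python
    import numpy as np
    from scipy import ndimage

    def emphysema_fraction(hu, lung_mask, cutoff=-950, min_blob_px=5):
        # Voxels below the HU cutoff inside the lung are candidate
        # emphysema; connected groups ("blobs") smaller than
        # min_blob_px are treated as image noise and removed.
        low = (hu < cutoff) & lung_mask
        labels, _ = ndimage.label(low)
        sizes = np.bincount(labels.ravel())          # sizes[0] = background
        keep_ids = np.flatnonzero(sizes >= min_blob_px)
        keep_ids = keep_ids[keep_ids != 0]           # drop background label
        kept = np.isin(labels, keep_ids)
        return kept.sum() / lung_mask.sum()
    ```

    Raising `min_blob_px` removes more small components and therefore lowers the computed fraction, which is the inverse relation the study reports.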

  3. A Quantitative Analysis of the Behavioral Checklist of the Movement ABC Motor Test

    Science.gov (United States)

    Ruiz, Luis Miguel; Gomez, Marta; Graupera, Jose Luis; Gutierrez, Melchor; Linaza, Jose Luis

    2007-01-01

    The fifth section of the Henderson and Sugden's Movement ABC Checklist is part of the general Checklist that accompanies The Movement ABC Battery. The authors maintain that the analysis of this section must be mainly qualitative instead of quantitative. The main objective of this study was to employ a quantitative analysis of this behavioural…

  4. [Bibliometric analysis of bacterial quantitative proteomics in English literatures].

    Science.gov (United States)

    Zhang, Xin; She, Danyang; Liu, Youning; Wang, Rui; Di, Xiuzhen; Liang, Beibei; Wang, Yue

    2014-07-01

    To analyze worldwide advances in bacterial quantitative proteomics over the past fifteen years with a bibliometric approach. Literature retrieval was conducted throughout the databases of Pubmed, Embase and Science Citation Index (SCI), using "bacterium" and "quantitative proteomics" as the key words, with a search cutoff of July 2013. We sorted and analyzed these articles with EndNote X6 by published year, first author, journal name, publishing institution, citation frequency and publication type. 932 English articles were included in our research after deleting duplicates. The first article on bacterial quantitative proteomics was reported in 1999. Publications peaked at 163 related articles in 2012. Up to July 2013, authors from more than 23 countries and regions had published articles in this field; China ranks fourth. The main publication type is original articles. The most frequently cited article is entitled "Absolute quantification of proteins by LCMSE: a virtue of parallel MS acquisition" by Silva JC, Gorenstein MV, Li GZ, et al in Mol Cell Proteomics 2006. The most productive author is Smith RD from the Biological Sciences Division, Pacific Northwest National Laboratory. The top journal publishing bacterial quantitative proteomics is Proteomics. More and more researchers are paying attention to quantitative proteomics, which will be widely used in bacteriology.

  5. Map Analysis and Spatial Statistic: Assessment of Spatial Variability of Agriculture Land Conversion at Urban Fringe Area of Yogyakarta

    Science.gov (United States)

    Susilo, Bowo

    2016-11-01

    Urban development has brought various effects, one of which is the marginalization of the agricultural sector. Agricultural land is gradually converted to other types of land use that are considered more profitable. Conversion of agricultural land cannot be avoided, but it should be controlled. Early identification of the spatial distribution and intensity of agricultural land conversion, as well as its related factors, is necessary. The objectives of the research were (1) to assess the spatial variability of agricultural land conversion, and (2) to identify factors that affect the spatial variability of agricultural land conversion. The research was conducted in the urban fringe area of Yogyakarta. Spatial variability of agricultural land conversion was analysed using an index called the Relative Conversion Index (RCI). Combined map analysis and spatial statistics were used to determine the center of agricultural land conversion. Simple regression analysis was used to determine the factors associated with the conversion of agricultural land. The results show that the intensity of agricultural land conversion in the study area varies spatially as well as temporally. The intensity of agricultural land conversion in the period 1993-2000 involves three categories: high, moderate and low. In the period 2000-2007, the intensity of agricultural land conversion involves two categories: high and low. Spatial variability of agricultural land conversion in the study area has a significant correlation with three factors: population growth, fragmentation of agricultural land and distance of agricultural land to the city

  6. Geographical classification of Epimedium based on HPLC fingerprint analysis combined with multi-ingredients quantitative analysis.

    Science.gov (United States)

    Xu, Ning; Zhou, Guofu; Li, Xiaojuan; Lu, Heng; Meng, Fanyun; Zhai, Huaqiang

    2017-05-01

    A reliable and comprehensive method for identifying the origin and assessing the quality of Epimedium has been developed. The method is based on analysis of HPLC fingerprints, combined with similarity analysis, hierarchical cluster analysis (HCA), principal component analysis (PCA) and multi-ingredient quantitative analysis. Nineteen batches of Epimedium, collected from different areas in the western regions of China, were used to establish the fingerprints and 18 peaks were selected for the analysis. Similarity analysis, HCA and PCA all classified the 19 areas into three groups. Simultaneous quantification of the five major bioactive ingredients in the Epimedium samples was also carried out to confirm the consistency of the quality tests. These methods were successfully used to identify the geographical origin of the Epimedium samples and to evaluate their quality.
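    The PCA step on a batches-by-peaks matrix can be sketched with numpy alone (the HCA step would typically use scipy.cluster.hierarchy.linkage on the same matrix). Peaks are autoscaled first, so constant (zero-variance) columns must be dropped beforehand. Names are illustrative, not from the paper:

    ```python
    import numpy as np

    def pca_scores(peak_areas, n_components=2):
        # PCA on a batches-by-peaks matrix of fingerprint peak areas
        # (rows: batches, columns: the selected chromatographic peaks).
        x = np.asarray(peak_areas, dtype=float)
        # Autoscale each peak; assumes no zero-variance columns.
        x = (x - x.mean(axis=0)) / x.std(axis=0, ddof=0)
        u, s, vt = np.linalg.svd(x, full_matrices=False)
        return u[:, :n_components] * s[:n_components]  # PC scores
    ```

    Batches from the same geographical group should then cluster together in the score plot, which is how the three groups were separated in the study.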

  7. Quantitative Analysis by Isotopic Dilution Using Mass Spectroscopy: The Determination of Caffeine by GC-MS.

    Science.gov (United States)

    Hill, Devon W.; And Others

    1988-01-01

    Describes a laboratory technique for quantitative analysis of caffeine by an isotopic dilution method for coupled gas chromatography-mass spectroscopy. Discusses caffeine analysis and experimental methodology. Lists sample caffeine concentrations found in common products. (MVL)
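    In its simplest form, isotopic dilution quantitation scales the known spike of labeled internal standard by the measured analyte-to-standard signal ratio. A deliberately simplified sketch (real GC-MS work also calibrates the response factor and corrects for isotopic overlap between the two ion channels):

    ```python
    def isotope_dilution_amount(spike_ng, area_analyte, area_labeled,
                                response_factor=1.0):
        # Amount of analyte = known spike of the isotopically labeled
        # standard, scaled by the measured ion-signal ratio.
        return spike_ng * (area_analyte / area_labeled) * response_factor
    ```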

  8. Quantitative risk analysis for landslides ‒ Examples from Bíldudalur, NW-Iceland

    Directory of Open Access Journals (Sweden)

    R. Bell

    2004-01-01

    Full Text Available Although various methods to carry out quantitative landslide risk analyses are available, applications are still rare and mostly dependent on the occurrence of disasters. In Iceland, two catastrophic snow avalanches killed 34 people in 1995. As a consequence the Ministry of the Environment issued a new regulation on hazard zoning due to snow avalanches and landslides in 2000, which aims to prevent people from living or working within the areas most at risk by 2010. The regulation requires landslide and snow avalanche risk analyses to be carried out; however, a method to calculate landslide risk adapted to Icelandic conditions is still missing. Therefore, the ultimate goal of this study is to develop such a method for landslides, focussing on debris flows and rock falls, and to test it in Bíldudalur, NW-Iceland. Risk analysis, beside risk evaluation and risk management, is part of the holistic concept of risk assessment. Within this study, risk analysis only is considered, focussing on the risks to life. To calculate landslide risk, the spatial and temporal probability of occurrence of potential damaging events, as well as the distribution of the elements at risk in space and time, considering also changing vulnerabilities, must be determined. Within this study, a new raster-based approach is developed. Thus, all existing vector data are transferred into raster data using a resolution of 1 m x 1 m. The specific attribute data are attributed to the grid cells, resulting in specific raster data layers for each input parameter. The calculation of the landslide risk follows a function of the input parameters hazard, damage potential of the elements at risk, vulnerability, probability of the spatial impact, probability of the temporal impact and probability of the seasonal occurrence. Finally, results are upscaled to a resolution of 20 m x 20 m and are presented as individual risk to life and object risk to life for each process. Within the quantitative
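    The cell-wise risk function and the 20 m upscaling can be sketched with numpy; the multiplicative combination of the six input layers is an assumption consistent with standard quantitative risk formulations, not a verbatim reproduction of the study's function:

    ```python
    import numpy as np

    def object_risk_to_life(hazard, damage_potential, vulnerability,
                            p_spatial, p_temporal, p_seasonal):
        # Cell-wise product of the six input raster layers
        # (all arrays share the same 1 m x 1 m grid).
        return (hazard * damage_potential * vulnerability
                * p_spatial * p_temporal * p_seasonal)

    def upscale(raster, factor=20):
        # Aggregate 1 m cells into factor-by-factor blocks by
        # averaging, mirroring the upscaling to 20 m x 20 m.
        h, w = raster.shape
        return raster[:h - h % factor, :w - w % factor].reshape(
            h // factor, factor, w // factor, factor).mean(axis=(1, 3))
    ```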

  9. Quantitative analysis of norfloxacin by 1H NMR and HPLC.

    Science.gov (United States)

    Frackowiak, Anita; Kokot, Zenon J

    2012-01-01

    1H NMR and previously developed HPLC methods were applied to the quantitative determination of norfloxacin in a veterinary solution formulation for pigeons. Changes in concentration can lead to significant changes in the 1H chemical shifts of non-exchangeable aromatic protons as a result of extensive self-association phenomena. This chemical shift variation of the protons was analyzed and applied to the quantitative determination of norfloxacin. The method is simple, rapid, precise and accurate, and can be used for quality control of this drug.

  10. Quantitative analysis of sensor for pressure waveform measurement

    Directory of Open Access Journals (Sweden)

    Tyan Chu-Chang

    2010-01-01

    Full Text Available Abstract Background Arterial pressure waveforms contain important diagnostic and physiological information, since their contour depends on a healthy cardiovascular system [1]. A sensor is placed at the measured artery and some contact pressure is applied to measure the pressure waveform. However, where should the sensor be located to detect a complete pressure waveform suitable for diagnosis? And how much contact pressure is needed over the pulse point? These two problems remain unresolved. Method In this study, we propose a quantitative analysis to evaluate the pressure waveform for locating the position and applying the appropriate force between the sensor and the radial artery. A two-axis mechanism and a modified sensor were designed to estimate the radial arterial width and detect the contact pressure. The template matching method was used to analyze the pressure waveform. In the X-axis scan, we found that the arterial diameter changed waveform (ADCW) and the pressure waveform changed from small to large and back to small again as the sensor was moved across the radial artery. In the Z-axis scan, we also found that the ADCW and the pressure waveform changed from small to large and back to small again as the applied contact pressure continuously increased. Results In the X-axis scan, the template correlation coefficients at the left and right boundaries of the radial arterial width were 0.987 ± 0.016 and 0.978 ± 0.028, respectively. In the Z-axis scan, when the excessive contact pressure exceeded 100 mm Hg, the template correlation fell below 0.983. In applying force, when using the maximum amplitude as the criterion level, the lower contact pressure (r = 0.988 ± 0.004) was better than the higher contact pressure (r = 0.976 ± 0.012). Conclusions Although the optimal detection position has to be close to the middle of the radial artery, the pressure waveform also has a good completeness with
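The template correlation coefficients reported above are, in essence, Pearson correlations between a reference pulse template and measured waveform segments. A minimal sketch of that comparison, using invented synthetic pulse shapes rather than the study's data:

```python
import numpy as np

def template_correlation(template, segment):
    """Pearson correlation between a reference pulse template and a
    measured waveform segment of equal length."""
    return float(np.corrcoef(template, segment)[0, 1])

# Illustrative synthetic pulses (assumed shapes, not the study's signals).
t = np.linspace(0.0, 1.0, 200)
template = np.sin(np.pi * t) ** 2                   # idealized pulse template
good = 1.5 * template + 0.2                         # same shape, scaled and offset
distorted = template + 0.5 * np.sin(6 * np.pi * t)  # shape corrupted by artifact

r_good = template_correlation(template, good)
r_bad = template_correlation(template, distorted)
print(r_good, r_bad)
```

Because the Pearson correlation is invariant to scaling and offset, a correctly captured waveform scores near 1 even if its amplitude differs, while a distorted one scores visibly lower, mirroring the thresholds (e.g. 0.983) used in the study.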

  11. Quantitative analysis of autophagy using advanced 3D fluorescence microscopy.

    Science.gov (United States)

    Changou, Chun A; Wolfson, Deanna L; Ahluwalia, Balpreet Singh; Bold, Richard J; Kung, Hsing-Jien; Chuang, Frank Y S

    2013-05-03

    Prostate cancer is the leading malignancy among men in the U.S. While surgery carries a significant risk of impotence and incontinence, traditional chemotherapeutic approaches have been largely unsuccessful. Hormone therapy is effective at an early stage, but often fails with the eventual development of hormone-refractory tumors. We have been interested in developing therapeutics targeting specific metabolic deficiencies of tumor cells. We recently showed that prostate tumor cells specifically lack an enzyme (argininosuccinate synthase, or ASS) involved in the synthesis of the amino acid arginine(1). This condition causes the tumor cells to become dependent on exogenous arginine, and they undergo metabolic stress when free arginine is depleted by arginine deiminase (ADI)(1,10). Indeed, we have shown that human prostate cancer cells CWR22Rv1 are effectively killed by ADI with caspase-independent apoptosis and aggressive autophagy (or macroautophagy)(1,2,3). Autophagy is an evolutionarily conserved process that allows cells to metabolize unwanted proteins by lysosomal breakdown during nutritional starvation(4,5). Although the essential components of this pathway are well characterized(6,7,8,9), many aspects of the molecular mechanism are still unclear - in particular, what is the role of autophagy in the death response of prostate cancer cells after ADI treatment? In order to address this question, we required an experimental method to measure the level and extent of the autophagic response in cells - and since there are no known molecular markers that can accurately track this process, we chose to develop an imaging-based approach, using quantitative 3D fluorescence microscopy(11,12). 
Using CWR22Rv1 cells specifically-labeled with fluorescent probes for autophagosomes and lysosomes, we show that 3D image stacks acquired with either widefield deconvolution microscopy (and later, with super-resolution, structured-illumination microscopy) can clearly capture the early

  12. Microchromatography of hemoglobins. VIII. A general qualitative and quantitative method in plastic drinking straws and the quantitative analysis of Hb-F.

    Science.gov (United States)

    Schroeder, W A; Pace, L A

    1978-03-01

    The microchromatographic procedure for the quantitative analysis of the hemoglobin components in a hemolysate uses columns of DEAE-cellulose in a plastic drinking straw with a glycine-KCN-NaCl developer. Not only may the method be used for the quantitative analysis of Hb-F but also for the analysis of the varied components in mixtures of hemoglobins.

  13. Brazilian road traffic fatalities: a spatial and environmental analysis.

    Directory of Open Access Journals (Sweden)

    Luciano de Andrade

    Full Text Available BACKGROUND: Road traffic injuries (RTI) are a major public health epidemic killing thousands of people daily. Low- and middle-income countries, such as Brazil, have the highest annual rates of road traffic fatalities. In order to improve road safety, this study mapped road traffic fatalities on a Brazilian highway to determine the main environmental factors affecting road traffic fatalities. METHODS AND FINDINGS: Four techniques were utilized to identify and analyze RTI hotspots. We used point-based spatial analysis with a kernel density estimator, and wavelet analysis, to identify the main hot regions. Additionally, built environment analysis and principal component analysis were conducted to verify patterns contributing to crash occurrence in the hotspots. Between 2007 and 2009, 379 crashes were notified, with 466 fatalities, on highway BR277. A higher incidence of crashes occurred on sections of highway with double lanes (ratio 2:1). The hotspot analysis demonstrated that both the eastern and western regions had higher incidences of crashes compared with the central region. Through the built environment analysis, we identified five different patterns, demonstrating that specific environmental characteristics are associated with different types of fatal crashes. Patterns 2 and 4 consist mainly of predominantly urban characteristics and have frequent fatal pedestrian crashes. Patterns 1, 3 and 5 display mainly rural characteristics and a higher prevalence of vehicular collisions. In the built environment analysis, the variables length of road in urban areas, limited lighting, double-lane roadways, and fewer auxiliary lanes were associated with a higher incidence of fatal crashes. CONCLUSIONS: By combining different techniques of analysis, we identified numerous hotspots and environmental characteristics, which governmental or regulatory agencies could use to plan strategies to reduce RTI and support life-saving policies.
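The kernel-density hotspot step can be sketched in one dimension along the highway. The crash positions below are synthetic (two invented clusters), not BR277 data; only the technique matches the abstract:

```python
import numpy as np
from scipy.stats import gaussian_kde

# Synthetic crash positions (km along a highway); two clusters stand in
# for hotspots -- illustrative values only.
rng = np.random.default_rng(0)
positions = np.concatenate([rng.normal(50, 5, 200),
                            rng.normal(200, 5, 100)])

kde = gaussian_kde(positions)           # Gaussian kernel density estimator
grid = np.linspace(0, 250, 501)
density = kde(grid)
hotspot_km = grid[np.argmax(density)]   # location of the strongest hotspot
print(hotspot_km)
```

Peaks of the estimated density mark hotspot locations; in the study these would then be cross-checked against built-environment variables for each road segment.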

  14. Spatial Hierarchical Bayesian Analysis of the Historical Extreme Streamflow

    Science.gov (United States)

    Najafi, M. R.; Moradkhani, H.

    2012-04-01

    Analysis of the climate change impact on extreme hydro-climatic events is crucial for future hydrologic/hydraulic designs and water resources decision making. The purpose of this study is to investigate the changes of the extreme value distribution parameters with respect to time to reflect the impact of climate change. We develop a statistical model using the observed streamflow data of the Columbia River Basin in the USA to estimate the changes of high flows as a function of time as well as other variables. A Generalized Pareto Distribution (GPD) is used to model flows above the 95th percentile during December through March for 31 gauge stations. In the process layer of the model, the covariates time, latitude, longitude, elevation and basin area are considered to assess the sensitivity of the model to each variable. A Markov Chain Monte Carlo (MCMC) method is used to estimate the parameters. The Spatial Hierarchical Bayesian technique models the GPD parameters spatially and borrows strength from other locations by pooling data together, while providing an explicit estimation of the uncertainties in all stages of modeling.
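The single-gauge building block of this model is a peaks-over-threshold GPD fit. A minimal sketch with synthetic flows (the real study fits all gauges jointly in a hierarchical Bayesian framework via MCMC; this non-Bayesian fit only illustrates the data layer):

```python
import numpy as np
from scipy.stats import genpareto

# Synthetic daily winter flows for one gauge (m^3/s); illustrative only.
rng = np.random.default_rng(42)
flows = rng.gamma(shape=2.0, scale=500.0, size=2000)

threshold = np.percentile(flows, 95)           # 95th-percentile threshold
exceedances = flows[flows > threshold] - threshold

# Fit a GPD to the threshold exceedances (location fixed at zero).
shape, loc, scale = genpareto.fit(exceedances, floc=0)
print(threshold, len(exceedances), shape, scale)
```

In the hierarchical model the fitted shape and scale would not be independent per gauge; they would be regressed on covariates (time, latitude, elevation, basin area) with spatially correlated errors.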

  15. FORMAL EMPLOYMENT IN AGRICULTURE: A SPATIAL ANALYSIS 1989-2009

    Directory of Open Access Journals (Sweden)

    Patricia Estanislau

    2015-08-01

    Full Text Available This article aims to analyze the spatial distribution - total and by gender - of formal employment in Brazilian agriculture, using exploratory spatial data analysis (ESDA). After collecting the data and analyzing the dispersion of formal employment for men and women, it is possible to show whether the sector's income interferes, by gender, with the location of formal employment. The data refer to the 558 geographical micro-regions of the country, and the Annual Relation of Social Information (RAIS) was used for the years 1989, 1999 and 2009. A clear predominance of the state of Sao Paulo in the number of formal contracts was identified across the three study periods. This trend is explained by the composition of the state's agricultural production, which includes products that require intensive labor, such as sugar cane, coffee and oranges. Furthermore, when formal employment is disaggregated by gender, an increase in the number of women hired to work in agriculture is observed. This keeps the state as the largest employer of formal labor supporting the expansion of plantations. However, this growth occurs at a much lower rate than the hiring of male labor, so female participation is still limited in the formal labor market.
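A standard ESDA statistic for this kind of micro-region analysis is global Moran's I, which measures spatial autocorrelation of a variable (here it could be formal employment per micro-region). The abstract does not name the specific statistic, so this is an assumed, illustrative sketch on a toy 2 x 2 grid:

```python
import numpy as np

def morans_i(values, weights):
    """Global Moran's I: values is a length-n vector, weights an n x n
    spatial-weights matrix with a zero diagonal."""
    z = values - values.mean()
    n = len(values)
    W = weights.sum()
    return float((n / W) * (z @ weights @ z) / (z @ z))

# Toy 2 x 2 grid with rook contiguity; a perfect checkerboard yields the
# strongest possible negative spatial autocorrelation here (I = -1).
values = np.array([1.0, 0.0, 0.0, 1.0])   # cells: NW, NE, SW, SE
weights = np.array([[0, 1, 1, 0],
                    [1, 0, 0, 1],
                    [1, 0, 0, 1],
                    [0, 1, 1, 0]], dtype=float)
print(morans_i(values, weights))  # -> -1.0
```

Values of I near +1 indicate clustering of similar values in neighboring regions (as with formal contracts concentrated around Sao Paulo), values near 0 spatial randomness, and negative values a dispersed, checkerboard-like pattern.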

  16. Analysis of Spatial Data Structures for Proximity Detection

    Institute of Scientific and Technical Information of China (English)

    Anupreet Walia; Jochen Teizer

    2008-01-01

    Construction is a dangerous business. According to statistics, in each of the past thirteen years more than 1000 workers died in the USA construction industry. In order to minimize the overall number of these incidents, the research presented in this paper monitors and analyzes the trajectories of construction resources, first in a simulated environment and later on the actual job site. Due to the complex nature of the construction environment, three-dimensional (3D) positioning data of workers is hardly collected. Although technology is available that allows tracking construction assets in real time, indoors and outdoors, in 3D, the continuously changing spatial and temporal arrangement of job sites requires any successfully working data processing system to operate in real time. The focus of this research paper is safety, specifically spatial data structures that offer the capability of realigning themselves and reporting the distance of the closest neighbor in real time. This paper presents results of simulations that allow the processing of real-time location data for collision detection and proximity analysis. The presented data structures and performance results of the developed algorithms demonstrate that real-time tracking and proximity detection of resources is feasible.
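A common spatial data structure for this kind of nearest-neighbor proximity query is a k-d tree. The abstract does not specify which structure the authors used, so this is an assumed sketch with invented positions and an invented alert radius:

```python
import numpy as np
from scipy.spatial import cKDTree

# Hypothetical 3D positions (metres) of tracked resources on a job site.
workers = np.array([[ 0.0,  0.0, 0.0],
                    [10.0,  2.0, 0.0],
                    [25.0, 30.0, 1.0]])
equipment = np.array([[ 2.0,  0.0, 0.0],    # 2 m from worker 0
                      [50.0, 50.0, 0.0]])

tree = cKDTree(equipment)            # rebuilt each cycle as positions update
dist, idx = tree.query(workers)      # nearest piece of equipment per worker
ALERT_RADIUS = 5.0                   # proximity threshold (assumed value)
alerts = np.flatnonzero(dist < ALERT_RADIUS)
print(dist, alerts)
```

Because job-site arrangements change continuously, the tree would be rebuilt (or incrementally updated) every sampling cycle; each `query` then reports the closest-neighbor distance needed for collision and proximity alerts.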

  17. Spatially heterogeneous drought analysis theory and future trends

    Science.gov (United States)

    Şen, Zekâi

    2014-11-01

    Most drought analysis methodologies are based on point-wise time series record assessments with very specific local conclusions, but their areal, regional and, in general, spatial properties are examined to a lesser extent. This paper provides a general description of droughts in addition to regional methodological approaches, in a rather simple but innovative manner. The study area is considered to be composed of a set of sub-areas, each with different dry (wet) period probabilities. In general, a heterogeneous region has a different probability of dry (wet) spell occurrence in each sub-area; in fact, whenever the probability at one sub-area is not equal to that of any other sub-area, the region is heterogeneous. The areal cover probability (ACP) of the region is derived from the point-wise probabilities under the assumptions that the occurrences of dry (wet) spells are mutually exclusive and independent with heterogeneous probability patterns. Additionally, extreme value probabilities of areal drought coverage are derived for heterogeneous sub-area probabilities. All of the heterogeneous drought probability expressions reduce to the homogeneous case provided that the sub-areal probabilities are equal to each other. The methodology presented in this paper paves the way for more realistic spatial probabilities of drought occurrence when the sub-areal dry spell probabilities are dependent on each other but heterogeneous.
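Under the independence assumption, the number of sub-areas simultaneously in drought follows a Poisson-binomial distribution, which reduces to a binomial when all sub-areal probabilities are equal, exactly the homogeneous special case the abstract mentions. A small sketch (the probabilities are invented, and this is not the paper's derivation, only the independent-heterogeneous counting idea):

```python
def areal_coverage_dist(probs):
    """Distribution of the number of sub-areas simultaneously in drought,
    for independent sub-areas with heterogeneous dry-spell probabilities
    (a Poisson-binomial distribution computed by dynamic programming)."""
    dist = [1.0]
    for p in probs:
        new = [0.0] * (len(dist) + 1)
        for k, d in enumerate(dist):
            new[k] += d * (1.0 - p)   # this sub-area stays wet
            new[k + 1] += d * p       # this sub-area is dry
        dist = new
    return dist

# Heterogeneous example: four sub-areas with different dry-spell probabilities.
het = areal_coverage_dist([0.1, 0.3, 0.5, 0.7])
# Homogeneous special case: reduces to the binomial distribution.
hom = areal_coverage_dist([0.3] * 4)
print(round(hom[2], 6))  # -> 0.2646, i.e. C(4,2) * 0.3^2 * 0.7^2
```

The entry `dist[k]` is the probability that exactly k sub-areas are in drought, from which areal-coverage and extreme-coverage probabilities can be read off.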

  18. Effect of spatial normalization on analysis of functional data

    Science.gov (United States)

    Gee, James C.; Alsop, David C.; Aguirre, Geoffrey K.

    1997-04-01

    Conventional analysis of functional data often involves a normalization step in which the data are spatially aligned so that a measurement can be made across or between studies. Whether to enhance the signal-to-noise ratio or to detect significant deviations in activation from normal, the method used to register the underlying anatomies clearly impacts the viability of the analysis. Nevertheless, it is common practice to infer only homogeneous transformations, in which all parts of the image volume undergo the same mapping. To detect subtle effects or to extend the analysis to anatomies that exhibit considerable morphological variation, higher-dimensional mappings that allow more accurate alignment will be crucial. We describe a Bayesian volumetric warping approach to the normalization problem, which matches local image features between MRI brain volumes, compare its performance with a standard method (SPM'96), and contrast its effect on the analysis of a set of functional MRI studies against that obtained with a 9-parameter affine registration.

  19. Application of computer intensive data analysis methods to the analysis of digital images and spatial data

    DEFF Research Database (Denmark)

    Windfeld, Kristian

    1992-01-01

    Computer-intensive methods for data analysis in a traditional setting have developed rapidly in the last decade. The application and adaptation of some of these methods to the analysis of multivariate digital images and spatial data are explored, evaluated and compared to well-established classical...... linear methods. Different strategies for selecting projections (linear combinations) of multivariate images are presented. An exploratory, iterative method for finding interesting projections, originating in data analysis, is compared to principal components. A method for introducing spatial context...

  20. Digital image analysis of inflammation markers in colorectal mucosa by using a spatial visualization method.

    Science.gov (United States)

    Kaczmarek, Elzbieta; Banasiewicz, Tomasz; Seraszek-Jaros, Agnieszka; Krokowicz, Piotr; Grochowalski, Marcin; Majewski, Przemysław; Zurawski, Jakub; Paszkowski, Jacek; Drews, Michał

    2014-03-01

    The aim of this study was to apply the spatial visualization method of digital images to quantitative analysis of the pro-inflammatory cytokines IL-1, IL-6 and TNF-α in various segments of large bowel excised because of ulcerative colitis, in relation to selected clinical symptoms. Our preliminary study included 17 patients having undergone restorative proctocolectomy. Immunohistochemistry was performed for IL-1, IL-6 and TNF-α. The area fraction and intensity fraction of the cytokines studied were determined by digital image analysis. The results were then categorized using the Allred immunohistochemistry score. The expression of IL-1, IL-6 and TNF-α was significantly higher in the rectum than in colonic segments (p<0.01), and was associated with the patients' clinical condition. The method of quantitative immunohistochemistry presented here allows searching for associations between the expression of biomarkers and clinical symptoms. Evaluation of inflammatory cytokines could be recommended in the active stage of the disease with present symptoms of bloody and mucus stools. A higher expression of IL-1, IL-6 and TNF-α in samples beyond the large intestine correlates with an intensified clinical course of the disease. In patients without bleeding and mucus symptoms present in stools, no significant correlations were found. Therefore, the assessment of cytokines during remission or the clinically silent stage might not be useful.
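The area fraction and intensity fraction reported in this study are simple ratios over thresholded pixels. A toy sketch of those two measures on an invented intensity image (the threshold and values are assumptions, not the study's calibration):

```python
import numpy as np

# Toy 4 x 4 staining-intensity image (arbitrary units); illustrative only.
image = np.array([[10,  20, 200, 220],
                  [15, 180, 210,  30],
                  [12,  14,  16, 190],
                  [11,  13,  15,  17]], dtype=float)

THRESHOLD = 100.0            # assumed positivity cutoff
positive = image > THRESHOLD

area_fraction = positive.mean()                       # positive pixels / all pixels
intensity_fraction = image[positive].sum() / image.sum()
print(area_fraction, intensity_fraction)
```

The area fraction counts how much of the tissue stains positive, while the intensity fraction weights that area by staining strength; both can then be binned into an ordinal immunohistochemistry score.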

  1. Quantitative Analysis and Design of a Rudder Roll Damping Controller

    DEFF Research Database (Denmark)

    Hearns, G.; Blanke, M.

    1998-01-01

    A rudder roll damping controller is designed using Quantitative Feedback Theory to be robust to changes in the ship's metacentric height. The analytical constraint due to the non-minimum phase behaviour of the rudder-to-roll dynamics is analysed using the Poisson integral formula, and it is shown how...

  2. Quantitative analysis of distributed control paradigms of robot swarms

    DEFF Research Database (Denmark)

    Ngo, Trung Dung

    2010-01-01

    describe the physical and simulated robots, experiment scenario, and experiment setup. Third, we present our robot controllers based on behaviour based and neural network based paradigms. Fourth, we graphically show their experiment results and quantitatively analyse the results in comparison of the two...

  3. Teaching Quantitative Reasoning for Nonscience Majors through Carbon Footprint Analysis

    Science.gov (United States)

    Boose, David L.

    2014-01-01

    Quantitative reasoning is a key intellectual skill, applicable across disciplines and best taught in the context of authentic, relevant problems. Here, I describe and assess a laboratory exercise that has students calculate their "carbon footprint" and evaluate the impacts of various behavior choices on that footprint. Students gather…

  4. Quantitative security analysis for multi-threaded programs

    NARCIS (Netherlands)

    Ngo, Tri Minh; Huisman, Marieke

    2013-01-01

    Quantitative theories of information flow give us an approach to relax the absolute confidentiality properties that are difficult to satisfy for many practical programs. The classical information-theoretic approaches for sequential programs, where the program is modeled as a communication channel wi

  5. Quantitative and Qualitative Analysis of Biomarkers in Fusarium verticillioides

    Science.gov (United States)

    In this study, a combination HPLC-DART-TOF-MS system was utilized to identify and quantitatively analyze carbohydrates in wild type and mutant strains of Fusarium verticillioides. Carbohydrate fractions were isolated from F. verticillioides cellular extracts by HPLC using a cation-exchange size-excl...

  6. Spatial Data Analysis with R-INLA with Some Extensions

    Directory of Open Access Journals (Sweden)

    Roger Bivand

    2015-02-01

    Finally, we show some examples of the application of this technique in spatial statistics. It is worth noting that our approach can be extended to a number of other fields, and not only spatial statistics.

  7. GIS Based Spatial Data Analysis for Landslide Susceptibility Mapping

    Institute of Scientific and Technical Information of China (English)

    S.Sarkar; D.P.Kanungo; A.K.Patra; Pushpendra Kumar

    2008-01-01

    A landslide susceptibility map delineates the potential zones for landslide occurrence. The paper presents a statistical approach through spatial data analysis in GIS for landslide susceptibility mapping in parts of the Sikkim Himalaya. Six important causative factors for landslide occurrence were selected and corresponding thematic data layers were prepared in GIS. Topographic maps, satellite images, field data and published maps constitute the input data for thematic layer preparation. Numerical weights for the different categories of these factors were determined based on a statistical approach, and the weighted thematic layers were integrated in a GIS environment to generate the landslide susceptibility map of the area. The landslide susceptibility map classifies the area into five different landslide susceptibility zones, i.e., very high, high, moderate, low and very low. This map was validated using the existing landslide distribution in the area.
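The weighted-layer integration step can be sketched as a raster overlay: each thematic layer is reclassified into numeric ratings, multiplied by its statistically derived weight, summed, and sliced into susceptibility zones. All ratings, weights and class breaks below are invented for illustration:

```python
import numpy as np

# Hypothetical reclassified thematic layers (ratings per grid cell).
slope     = np.array([[3, 2], [1, 3]], dtype=float)
lithology = np.array([[2, 2], [1, 1]], dtype=float)
landuse   = np.array([[1, 3], [2, 2]], dtype=float)
weights = {"slope": 0.5, "lithology": 0.3, "landuse": 0.2}  # assumed weights

# Landslide susceptibility index: weighted sum of the thematic layers.
lsi = (weights["slope"] * slope
       + weights["lithology"] * lithology
       + weights["landuse"] * landuse)

# Slice the index into five zones (class breaks assumed):
# 0 = very low, 1 = low, 2 = moderate, 3 = high, 4 = very high.
breaks = [1.0, 1.5, 2.0, 2.5]
zones = np.digitize(lsi, breaks)
print(lsi)
print(zones)
```

Validation would then overlay the known landslide inventory on `zones` and check that most mapped landslides fall in the high and very high classes.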

  8. Spatial analysis of Crimean Congo hemorrhagic fever in Iran.

    Science.gov (United States)

    Mostafavi, Ehsan; Haghdoost, AliAkbar; Khakifirouz, Sahar; Chinikar, Sadegh

    2013-12-01

    Crimean Congo hemorrhagic fever (CCHF) is a viral zoonotic disease. During 1999-2011, 871 human cases of CCHF were diagnosed in Iran. A history of serologic conversion for CCHF virus was seen in 58.7% of 2,447 sheep samples, 25.0% of 1,091 cattle samples and 24.8% of 987 goat samples from different parts of Iran. Spatial analysis showed that the main foci of this disease in humans during these years were in eastern and western Iran. Two livestock foci were detected in northeastern and northwestern Iran. On the basis of the results of this study, infection likely entered Iran from eastern and western neighboring countries.

  9. Improved Dynamic Analysis method for quantitative PIXE and SXRF element imaging of complex materials

    Energy Technology Data Exchange (ETDEWEB)

    Ryan, C.G., E-mail: chris.ryan@csiro.au; Laird, J.S.; Fisher, L.A.; Kirkham, R.; Moorhead, G.F.

    2015-11-15

    The Dynamic Analysis (DA) method in the GeoPIXE software provides a rapid tool to project quantitative element images from PIXE and SXRF imaging event data both for off-line analysis and in real-time embedded in a data acquisition system. Initially, it assumes uniform sample composition, background shape and constant model X-ray relative intensities. A number of image correction methods can be applied in GeoPIXE to correct images to account for chemical concentration gradients, differential absorption effects, and to correct images for pileup effects. A new method, applied in a second pass, uses an end-member phase decomposition obtained from the first pass, and DA matrices determined for each end-member, to re-process the event data with each pixel treated as an admixture of end-member terms. This paper describes the new method and demonstrates through examples and Monte-Carlo simulations how it better tracks spatially complex composition and background shape while still benefitting from the speed of DA.
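The second-pass step described above treats each pixel spectrum as a linear admixture of end-member responses. A minimal sketch of that per-pixel unmixing by least squares (the end-member spectra and mixing fractions are invented; GeoPIXE's actual DA-matrix projection also handles backgrounds and pileup):

```python
import numpy as np

# Assumed end-member spectra from a first-pass decomposition:
# rows are spectral channels, columns are end-member phases.
E = np.array([[1.0, 0.2],
              [0.5, 1.0],
              [0.1, 0.6]])

# A pixel spectrum formed as an admixture of the two end-members.
true_mix = np.array([0.7, 0.3])
pixel = E @ true_mix

# Recover the admixture coefficients by linear least squares.
mix, *_ = np.linalg.lstsq(E, pixel, rcond=None)
print(mix)
```

With the per-pixel admixture known, the appropriate DA matrix for each end-member can be applied in proportion, so the projected element images track spatially varying composition instead of assuming one uniform matrix.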

  10. plusTipTracker: Quantitative image analysis software for the measurement of microtubule dynamics.

    Science.gov (United States)

    Applegate, Kathryn T; Besson, Sebastien; Matov, Alexandre; Bagonis, Maria H; Jaqaman, Khuloud; Danuser, Gaudenz

    2011-11-01

    Here we introduce plusTipTracker, a Matlab-based open source software package that combines automated tracking, data analysis, and visualization tools for movies of fluorescently-labeled microtubule (MT) plus end binding proteins (+TIPs). Although +TIPs mark only phases of MT growth, the plusTipTracker software allows inference of additional MT dynamics, including phases of pause and shrinkage, by linking collinear, sequential growth tracks. The algorithm underlying the reconstruction of full MT trajectories relies on the spatially and temporally global tracking framework described in Jaqaman et al. (2008). Post-processing of track populations yields a wealth of quantitative phenotypic information about MT network architecture that can be explored using several visualization modalities and bioinformatics tools included in plusTipTracker. Graphical user interfaces enable novice Matlab users to track thousands of MTs in minutes. In this paper, we describe the algorithms used by plusTipTracker and show how the package can be used to study regional differences in the relative proportion of MT subpopulations within a single cell. The strategy of grouping +TIP growth tracks for the analysis of MT dynamics has been introduced before (Matov et al., 2010). The numerical methods and analytical functionality incorporated in plusTipTracker substantially advance this previous work in terms of flexibility and robustness. To illustrate the enhanced performance of the new software we thus compare computer-assembled +TIP-marked trajectories to manually-traced MT trajectories from the same movie used in Matov et al. (2010).

  11. Raman and infra-red microspectroscopy: towards quantitative evaluation for clinical research by ratiometric analysis.

    Science.gov (United States)

    Kumar, Srividya; Verma, Taru; Mukherjee, Ria; Ariese, Freek; Somasundaram, Kumaravel; Umapathy, Siva

    2016-04-07

    Biomolecular structure elucidation is one of the major techniques for studying the basic processes of life. These processes get modulated, hindered or altered due to various causes like diseases, which is why biomolecular analysis and imaging play an important role in diagnosis, treatment prognosis and monitoring. Vibrational spectroscopy (IR and Raman), which is a molecular bond specific technique, can assist the researcher in chemical structure interpretation. Based on the combination with microscopy, vibrational microspectroscopy is currently emerging as an important tool for biomedical research, with a spatial resolution at the cellular and sub-cellular level. These techniques offer various advantages, enabling label-free, biomolecular fingerprinting in the native state. However, the complexity involved in deciphering the required information from a spectrum hampered their entry into the clinic. Today with the advent of automated algorithms, vibrational microspectroscopy excels in the field of spectropathology. However, researchers should be aware of how quantification based on absolute band intensities may be affected by instrumental parameters, sample thickness, water content, substrate backgrounds and other possible artefacts. In this review these practical issues and their effects on the quantification of biomolecules will be discussed in detail. In many cases ratiometric analysis can help to circumvent these problems and enable the quantitative study of biological samples, including ratiometric imaging in 1D, 2D and 3D. We provide an extensive overview from the recent scientific literature on IR and Raman band ratios used for studying biological systems and for disease diagnosis and treatment prognosis.
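Ratiometric analysis replaces absolute band intensities, which drift with sample thickness and instrument settings, with ratios of integrated band areas, which largely cancel such common-mode factors. A toy sketch with an invented two-band spectrum (band positions and assignments are illustrative, not from the review):

```python
import numpy as np

def band_area(wavenumber, intensity, lo, hi):
    """Trapezoidal integral of a spectral band between lo and hi cm^-1."""
    m = (wavenumber >= lo) & (wavenumber <= hi)
    x, y = wavenumber[m], intensity[m]
    return float(np.sum((y[1:] + y[:-1]) / 2.0 * np.diff(x)))

# Synthetic spectrum with two triangular bands: one around 1450 cm^-1
# (area 50) and a twice-as-strong one around 2850 cm^-1 (area 100).
wn = np.arange(1000.0, 3001.0, 1.0)
spec = (np.maximum(0, 1 - np.abs(wn - 1450) / 50)
        + 2.0 * np.maximum(0, 1 - np.abs(wn - 2850) / 50))

ratio = band_area(wn, spec, 2800, 2900) / band_area(wn, spec, 1400, 1500)
print(round(ratio, 3))  # -> 2.0
```

If the whole spectrum were rescaled by thickness or laser-power variations, both band areas would scale together and the ratio would be unchanged, which is the point of ratiometric quantification.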

  12. Quantitative 3D Analysis of Plant Roots Growing in Soil Using Magnetic Resonance Imaging.

    Science.gov (United States)

    van Dusschoten, Dagmar; Metzner, Ralf; Kochs, Johannes; Postma, Johannes A; Pflugfelder, Daniel; Bühler, Jonas; Schurr, Ulrich; Jahnke, Siegfried

    2016-03-01

    Precise measurements of root system architecture traits are an important requirement for plant phenotyping. Most of the current methods for analyzing root growth require either artificial growing conditions (e.g. hydroponics), are severely restricted in the fraction of roots detectable (e.g. rhizotrons), or are destructive (e.g. soil coring). On the other hand, modalities such as magnetic resonance imaging (MRI) are noninvasive and allow high-quality three-dimensional imaging of roots in soil. Here, we present a plant root imaging and analysis pipeline using MRI together with an advanced image visualization and analysis software toolbox named NMRooting. Pots up to 117 mm in diameter and 800 mm in height can be measured with the 4.7 T MRI instrument used here. For 1.5 l pots (81 mm diameter, 300 mm high), a fully automated system was developed enabling measurement of up to 18 pots per day. The most important root traits that can be nondestructively monitored over time are root mass, length, diameter, tip number, and growth angles (in two-dimensional polar coordinates) and spatial distribution. Various validation measurements for these traits were performed, showing that roots down to a diameter range between 200 μm and 300 μm can be quantitatively measured. Root fresh weight correlates linearly with root mass determined by MRI. We demonstrate the capabilities of MRI and the dedicated imaging pipeline in experimental series performed on soil-grown maize (Zea mays) and barley (Hordeum vulgare) plants.

  13. The spatial and temporal analysis of forest resources and institutions

    Science.gov (United States)

    Schweik, Charles M.

    This study addresses a central puzzle facing the Human Dimensions of Global Change research community: How can we understand the influence of environmental policies on human behavior when little or no information is available on the condition of forest resources? This dissertation capitalizes on new research tools, methods and approaches to overcome the "no information about the resource" problem. Specifically, I combine (1) forest mensuration techniques, (2) Global Positioning Systems, (3) Geographic Information Systems (GIS), (4) spatial statistics, (5) remote sensing, and (6) institutional analysis to analyze forest vegetation patterns. I provide explanation of these patterns by considering the incentive structures driving human decision-making and activity and do this through two studies in very different empirical settings. Both studies apply applicable theory related to human behavior and action. Both examine the incentive structures individuals face as they undertake daily activities related to forest resources. The first study, set in East Chitwan, Nepal, identifies spatial patterns in georeferenced forest inventory data and links these to patterns predicted by optimal foraging subject to institutional constraints. The second study compares forest management in one state and one national forest in Indiana, U.S.A. In this effort, I identify spatio-temporal patterns in the forest vegetation captured by a time series of Landsat multispectral images. The combination of natural forest regrowth and property manager actions in response to incentives and constraints explain these patterns. Substantively, both studies identify change in forest resources associated with combinations of the physical, human community and institutional "landscapes" in their regions. In both cases, geographic attributes of institutions (e.g., laws, rules) are found to influence the type and location of human actions. Methodologically, the two studies provide examples of how to control

  14. Contextual neural gas for spatial clustering and analysis

    NARCIS (Netherlands)

    Hagenauer, J.; Helbich, M.

    2013-01-01

    This study aims to introduce contextual Neural Gas (CNG), a variant of the Neural Gas algorithm, which explicitly accounts for spatial dependencies within spatial data. The main idea of the CNG is to map spatially close observations to neurons, which are close with respect to their rank distance.

  15. The Determinants of VAT Introduction : A Spatial Duration Analysis

    NARCIS (Netherlands)

    Cizek, P.; Lei, J.; Ligthart, J.E.

    2012-01-01

    Abstract: Spatial survival models typically require the frailties, which capture unobserved heterogeneity, to be spatially correlated. This specification relies heavily on a pre-determined covariance structure for the errors. However, the spatial effect may not only exist in the unobserved errors

  16. Structural and Quantitative Analysis of Three C-Glycosylflavones by Variable Temperature Proton Quantitative Nuclear Magnetic Resonance

    Directory of Open Access Journals (Sweden)

    Jing Liu

    2017-01-01

    Full Text Available Quantitative nuclear magnetic resonance is a powerful tool in drug analysis because of its speed, precision, and efficiency. In the present study, we report the application of variable temperature proton quantitative nuclear magnetic resonance (VT-1H-qNMR) for the calibration of three C-glycosylflavones, orientin, isoorientin, and schaftoside, as reference substances. Since there is a conformational equilibrium due to restricted rotation around the C(sp3)-C(sp2) bond in C-glycosylflavones, the conformational behavior was investigated by VT-NMR and verified by molecular mechanics (MM) calculation. The VT-1H-qNMR method was validated for linearity, limit of quantification, precision, and stability. The results were consistent with those obtained from the mass balance approach. VT-1H-qNMR can be deployed as an effective tool in analyzing C-glycosylflavones.
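
    Single-signal 1H-qNMR calibration of this kind rests on the standard purity relation P_a = (I_a/I_cal)(N_cal/N_a)(M_a/M_cal)(w_cal/w_a)P_cal. The function below is a generic sketch of that relation; the symbol names and the numbers in the test are invented, not values from the paper.

```python
def qnmr_purity(i_a, i_cal, n_a, n_cal, m_a, m_cal, w_a, w_cal, p_cal):
    """Purity of the analyte from integrals (i), proton counts (n),
    molar masses (m), weighed masses (w) and calibrant purity (p_cal)."""
    return (i_a / i_cal) * (n_cal / n_a) * (m_a / m_cal) * (w_cal / w_a) * p_cal
```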

  17. Structural and Quantitative Analysis of Three C-Glycosylflavones by Variable Temperature Proton Quantitative Nuclear Magnetic Resonance

    Science.gov (United States)

    Liu, Yang; Dai, Zhong

    2017-01-01

    Quantitative nuclear magnetic resonance is a powerful tool in drug analysis because of its speed, precision, and efficiency. In the present study, the application of variable temperature proton quantitative nuclear magnetic resonance (VT-1H-qNMR) for the calibration of three C-glycosylflavones including orientin, isoorientin, and schaftoside as reference substances was reported. Since there was conformational equilibrium due to the restricted rotation around the C(sp3)-C(sp2) bond in C-glycosylflavones, the conformational behaviors were investigated by VT-NMR and verified by molecular mechanics (MM) calculation. The VT-1H-qNMR method was validated for linearity, limit of quantification, precision, and stability. The results were consistent with those obtained from the mass balance approach. VT-1H-qNMR can be deployed as an effective tool in analyzing C-glycosylflavones. PMID:28243484

  18. Multitemporal spatial pattern analysis of Tulum's tropical coastal landscape

    Science.gov (United States)

    Ramírez-Forero, Sandra Carolina; López-Caloca, Alejandra; Silván-Cárdenas, José Luis

    2011-11-01

    The tropical coastal landscape of Tulum in Quintana Roo, Mexico has high ecological, economic, social and cultural value, providing environmental and tourism services at global, national, regional and local levels. The landscape of the area is heterogeneous and presents random fragmentation patterns. In recent years, tourist services in the region have increased, promoting an accelerated expansion of hotel, transportation and recreation infrastructure that alters the complex landscape. It is important to understand the environmental dynamics through temporal changes in the spatial patterns and to propose better management of this ecological area to the authorities. This paper addresses a multi-temporal analysis of land cover changes from 1993 to 2000 in Tulum using Thematic Mapper data acquired by Landsat-5. Two independent methodologies were applied for the analysis of changes in the landscape and for the definition of fragmentation patterns. First, an Iteratively Reweighted Multivariate Alteration Detection (IR-MAD) algorithm was used to detect and localize land cover change/no-change areas. Second, post-classification change detection was evaluated using the Support Vector Machine (SVM) algorithm. Landscape metrics were calculated from the results of IR-MAD and SVM. The analysis of the metrics indicated, among other things, a higher fragmentation pattern along roadways.

  19. Analysis of Spatial Disparities and Driving Factors of Energy Consumption Change in China Based on Spatial Statistics

    Directory of Open Access Journals (Sweden)

    Hualin Xie

    2014-04-01

    Full Text Available Changes in the spatial pattern of energy consumption have an impact on global climate change. Based on spatial autocorrelation analysis and the autoregression model of spatial statistics, this study explores the spatial disparities and driving forces behind energy consumption changes in China. The results show that the global spatial autocorrelation of energy consumption change in China is significant during the period 1990–2010, and the trend of spatial clustering of energy consumption change is weakening. The regions with higher energy consumption change are significantly concentrated in the developed coastal areas of China, while those with lower energy consumption change are significantly concentrated in the less developed western regions. Energy consumption change in China is mainly driven by the transportation industry and non-labor-intensive industry. Rapid economic development and a higher industrialization rate are the main causes of faster changes in energy consumption in China. The results also indicate that the spatial autoregressive model can reveal more influencing factors of energy consumption changes in China than the standard linear model. Finally, this study puts forward corresponding measures and policies for dealing with the growing trend of energy consumption in China.
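
    Global spatial autocorrelation of the kind reported here is usually measured with Moran's I; a minimal NumPy version (a generic sketch, not the authors' code) is:

```python
import numpy as np

def morans_i(x, W):
    """Global Moran's I of values x given a spatial weight matrix W
    (w_ij > 0 for neighbouring areas, zero diagonal)."""
    x = np.asarray(x, float)
    z = x - x.mean()                       # deviations from the mean
    return len(x) / W.sum() * (W * np.outer(z, z)).sum() / (z ** 2).sum()
```

    On a four-area chain with binary adjacency weights, smoothly increasing values give a positive I (1/3 in this toy case), consistent with spatial clustering.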

  20. Cancer incidence in men: a cluster analysis of spatial patterns

    Directory of Open Access Journals (Sweden)

    D'Alò Daniela

    2008-11-01

    Full Text Available Abstract Background Spatial clustering of different diseases has received much less attention than single disease mapping. Besides chance or artifact, clustering of different cancers in a given area may depend on exposure to a shared risk factor or to multiple correlated factors (e.g. cigarette smoking and obesity in a deprived area). Models developed so far to investigate co-occurrence of diseases are not well-suited to analyzing many cancers simultaneously. In this paper we propose a simple two-step exploratory method for screening clusters of different cancers in a population. Methods Cancer incidence data were derived from the regional cancer registry of Umbria, Italy. A cluster analysis was performed on smoothed and non-smoothed standardized incidence ratios (SIRs) of the 13 most frequent cancers in males. The Besag, York and Mollié model (BYM) and Poisson kriging were used to produce smoothed SIRs. Results Cluster analysis on non-smoothed SIRs was poorly informative in terms of clustering of different cancers, as only larynx and oral cavity were grouped, and of characteristic patterns of cancer incidence in specific geographical areas. On the other hand, BYM and Poisson kriging gave similar results, showing that cancers of the oral cavity, larynx, esophagus, stomach and liver formed a main cluster. Lung and urinary bladder cancers clustered together but not with the cancers mentioned above. Both methods, particularly the BYM model, identified distinct geographic clusters of adjacent areas. Conclusion As in single disease mapping, non-smoothed SIRs do not provide reliable estimates of cancer risks because of small area variability. The BYM model produces smooth risk surfaces which, when entered into a cluster analysis, identify well-defined geographical clusters of adjacent areas. It probably enhances or amplifies the signal arising from exposure of more areas (statistical units) to shared risk factors that are associated with different cancers.
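
    The two-step idea (SIRs per area, then clustering of cancer-specific SIR profiles) can be sketched as follows. A crude correlation-threshold grouping stands in for the paper's cluster analysis, and the BYM/kriging smoothing is omitted; all names and numbers are illustrative.

```python
import numpy as np

def sir(observed, population, reference_rate):
    """Standardized incidence ratio per area: observed / expected."""
    return np.asarray(observed, float) / (np.asarray(population, float) * reference_rate)

def correlated_groups(profiles, thresh=0.8):
    """Group row profiles (one per cancer, columns = areas) whose pairwise
    Pearson correlation exceeds thresh, via a tiny union-find."""
    n = len(profiles)
    C = np.corrcoef(profiles)
    parent = list(range(n))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    for i in range(n):
        for j in range(i + 1, n):
            if C[i, j] > thresh:
                parent[find(i)] = find(j)
    return [find(i) for i in range(n)]
```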

  1. Selecting landing sites for lunar lander missions using spatial analysis

    Science.gov (United States)

    Djachkova, Maia; Lazarev, Evgeniy

    The Russian Federal Space Agency (Roscosmos) is planning to launch two spacecraft with lander missions to the Moon in 2015 and 2017 [1]. Here, we present an approach to landing site selection. We researched the physical features of the Moon with regard to suitability for automatic landing, using the spatial analysis techniques available in ArcGIS Desktop software. We analyzed the Russian lunar program and obtained the technical characteristics of the spacecraft and the scientific goals they should meet [1]. From these we identified the criteria of surface suitability for landing, divided into two groups: scientific criteria (the hydrogen content of the regolith [2] and day and night surface temperature [3]) and safety criteria (surface slopes and roughness, sky view factor, Earth altitude above the horizon, presence of polar permanently shadowed regions). Several investigations suggest that the south polar region of the Moon is the most promising territory where water ice can be found (finding water ice is the main goal of the Russian lunar missions [1]). According to the selected criteria and study area, we used remote sensing data from LRO (Lunar Reconnaissance Orbiter) [4] as basic data, because it is the most current and easily available. The data was processed and analyzed using the spatial analysis techniques of ArcGIS Desktop software: we created a number of maps depicting the criteria and then combined and overlaid them. As a result of the overlay process we obtained five territories where landing will be safe and the scientific goals will be met. It should be noted that our analysis is only a first-order assessment and the results cannot be used as actual landing sites for the lunar missions in 2015 and 2017, since a number of factors, which can only be analyzed at a very large scale, were not taken into account.
However, the area of research is narrowed to five territories, which can make the future
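
    The overlay step described above reduces, in its simplest Boolean form, to a logical AND across binary criterion rasters on a common grid. A minimal sketch with hypothetical criterion grids (the criterion names are invented for illustration):

```python
import numpy as np

def suitable_sites(*criteria):
    """Boolean overlay: a cell qualifies only where every criterion raster
    (same shape, True = acceptable) holds."""
    mask = np.ones_like(np.asarray(criteria[0], dtype=bool))
    for c in criteria:
        mask &= np.asarray(c, dtype=bool)
    return mask
```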

  2. Application of computer intensive data analysis methods to the analysis of digital images and spatial data

    DEFF Research Database (Denmark)

    Windfeld, Kristian

    1992-01-01

    Computer-intensive methods for data analysis in a traditional setting have developed rapidly in the last decade. The application and adaptation of some of these methods to the analysis of multivariate digital images and spatial data are explored, evaluated and compared to well established classical...... linear methods. Different strategies for selecting projections (linear combinations) of multivariate images are presented. An exploratory, iterative method for finding interesting projections, originating in data analysis, is compared to principal components. A method for introducing spatial context...... structural images for heavy minerals based on irregularly sampled geochemical data. This methodology has proven useful in producing images that reflect real geological structures, with potential application in mineral exploration. A method for removing laboratory-produced map-sheet patterns in spatial data...

  3. Quantitative analysis of immobilized metalloenzymes by atomic absorption spectroscopy.

    Science.gov (United States)

    Opwis, Klaus; Knittel, Dierk; Schollmeyer, Eckhard

    2004-12-01

    A new, sensitive assay for the quantitative determination of immobilized metal-containing enzymes has been developed using atomic absorption spectroscopy (AAS). In contrast with conventionally used indirect methods, the described quantitative AAS assay for metalloenzymes allows more exact analyses, because the carrier material with the enzyme is investigated directly. As an example, the validity and reliability of the method were examined by fixing the iron-containing enzyme catalase on cotton fabrics using different immobilization techniques. Sample preparation was carried out by dissolving the loaded fabrics in sulfuric acid before oxidising the residues with hydrogen peroxide. The iron concentrations were determined by flame atomic absorption spectrometry after calibration of the spectrometer with solutions of the free enzyme at different concentrations.
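
    Flame-AAS quantitation of this kind rests on a linear calibration curve fitted to the standards and then inverted for the unknown. A generic sketch (the concentrations and absorbances in the test are invented):

```python
import numpy as np

def fit_calibration(conc, absorbance):
    """Least-squares calibration line A = a*c + b through the standards."""
    a, b = np.polyfit(conc, absorbance, 1)
    return a, b

def conc_from_absorbance(A, a, b):
    """Invert the calibration line for an unknown sample."""
    return (A - b) / a
```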

  4. Research on Petroleum Reservoir Diagenesis and Damage Using EDS Quantitative Analysis Method With Standard Samples

    Institute of Scientific and Technical Information of China (English)

    包书景; 陈文学; 等

    2000-01-01

    In recent years, the X-ray spectrometer has developed not only in terms of enhanced resolution, but also towards dynamic analysis, computer modeling processing, standard-sample-based quantitative analysis and ultra-light element analysis. With the gradual sophistication of quantitative analysis system software, the soundness and accuracy of the established standard-sample reference documents have become the most important guarantee of the reliability of sample quantitative analysis. This work is an important technical subject in China's petroleum reservoir research. Through two years of research and experimental work, an EDS quantitative analysis method for petroleum geology and reservoir research has been established, and reference documents for five mineral (silicate, etc.) specimen standards have been compiled. By closely combining the morphological and compositional characters of the minerals and applying them to reservoir diagenetic research and the prevention of damage to oil formations, we have obtained clear geological results.

  5. Quantitative analysis of pheromone-binding protein specificity

    OpenAIRE

    Katti, S.; Lokhande, N.; D González; Cassill, A.; Renthal, R

    2012-01-01

    Many pheromones have very low water solubility, posing experimental difficulties for quantitative binding measurements. A new method is presented for determining thermodynamically valid dissociation constants for ligands binding to odorant-binding proteins (OBPs), using β-cyclodextrin as a solubilizer and transfer agent. The method is applied to LUSH, a Drosophila OBP that binds the pheromone 11-cis-vaccenyl acetate (cVA). Refolding of LUSH expressed in E. coli was assessed by measuring N-p...

  6. Public Library System in Ankara: A Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Bülent Yılmaz

    2014-12-01

    Full Text Available This study quantitatively investigates 42 public libraries in 25 central districts within the boundaries of the Ankara Metropolitan Municipality with respect to five factors, against national and international standards. The findings show that public libraries in Ankara are insufficient with respect to the number of buildings, users, staff and collection size, and also in terms of standards. Therefore, it is suggested that urgent planning is necessary for public libraries in Ankara.

  7. Quantitative Trait Locus Analysis of the Early Domestication of Sunflower

    OpenAIRE

    David M Wills; Burke, John M.

    2007-01-01

    Genetic analyses of the domestication syndrome have revealed that domestication-related traits typically have a very similar genetic architecture across most crops, being conditioned by a small number of quantitative trait loci (QTL), each with a relatively large effect on the phenotype. To date, the domestication of sunflower (Helianthus annuus L.) stands as the only counterexample to this pattern. In previous work involving a cross between wild sunflower (also H. annuus) and a highly improv...

  8. Fluorescent microscopy approaches of quantitative soil microbial analysis

    Science.gov (United States)

    Ivanov, Konstantin; Polyanskaya, Lubov

    2015-04-01

    Classical fluorescent microscopy has been used over recent decades in various microbiological studies of terrestrial ecosystems. The method provides representative results and is simple to apply, which allows its use both as a routine part of large-scale research and in small laboratories. Furthermore, depending on the research targets, many modifications of the fluorescent microscopy method have been established. Combining and comparing several approaches offers an opportunity for quantitative estimation of the microbial community in soil. The first analytical part of the study was dedicated to estimating soil bacterial density by fluorescent microscopy over the course of several 30-day experiments. The purpose of the research was to estimate changes in the soil bacterial community in different soil horizons under aerobic and anaerobic conditions with added nutrients in two experimental sets: cellulose and chitin. The nalidixic acid method for inhibiting DNA division in gram-negative bacteria was modified so that it provides quantification of this bacterial group by fluorescent microscopy. The established approach allowed estimation of 3-4 times more cells of gram-negative bacteria in soil. The functions of actinomycetes in soil polymer destruction are traditionally considered dominant in comparison to the gram-negative bacterial group. However, quantification of gram-negative bacteria in chernozem and peatland suggests that the classical notion underestimates this bacterial group. Chitin introduction had no positive effect on gram-negative bacterial population density changes in chernozem, but this nutrient did produce fast growth dynamics in the first 3 days of the experiment under both aerobic and anaerobic conditions. This confirms the chitinolytic activity of gram-negative bacteria in soil organic matter decomposition. In the next part of the research the modified method for soil gram-negative bacteria quantification was compared to fluorescent in situ

  9. The effects of selection on linkage analysis for quantitative traits.

    Science.gov (United States)

    Mackinnon, M J; Georges, M A

    1992-12-01

    The effects of within-sample selection on the outcome of analyses detecting linkage between genetic markers and quantitative traits were studied. It was found that selection by truncation for the trait of interest significantly reduces the differences between marker genotype means thus reducing the power to detect linked quantitative trait loci (QTL). The size of this reduction is a function of proportion selected, the magnitude of the QTL effect, recombination rate between the marker locus and the QTL, and the allele frequency of the QTL. Proportion selected was the most influential of these factors on bias, e.g., for an allele substitution effect of one standard deviation unit, selecting the top 80%, 50% or 20% of the population required 2, 6 or 24 times the number of progeny, respectively, to offset the loss of power caused by this selection. The effect on power was approximately linear with respect to the size of gene effect, almost invariant to recombination rate, and a complex function of QTL allele frequency. It was concluded that experimental samples from animal populations which have been subjected to even minor amounts of selection will be inefficient in yielding information on linkage between markers and loci influencing the quantitative trait under selection.
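
    The attenuation described above is easy to reproduce in a quick simulation. The effect size and selected proportion below match the abstract's example (a 1 SD substitution effect, top 20% selected); the population size, the fully linked backcross-like marker, and everything else are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
genotype = rng.integers(0, 2, n)          # two marker classes, QTL fully linked
a = 1.0                                   # substitution effect: one SD unit
trait = genotype * a + rng.normal(0.0, 1.0, n)

def marker_contrast(keep):
    """Difference between marker-class trait means within the kept animals."""
    return (trait[keep & (genotype == 1)].mean()
            - trait[keep & (genotype == 0)].mean())

everyone = np.ones(n, dtype=bool)
top20 = trait >= np.quantile(trait, 0.80)  # truncation selection, top 20%
contrast_full = marker_contrast(everyone)  # ~ the full effect, ~1 SD
contrast_sel = marker_contrast(top20)      # shrunken contrast after selection
```

    Within the selected sample both genotype-class means are pushed upward, so the between-class contrast shrinks sharply, which is exactly the loss of power the paper quantifies.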

  10. The usefulness of 3D quantitative analysis with using MRI for measuring osteonecrosis of the femoral head

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Ji Young; Lee, Sun Wha [Ewha Womans University College of Medicine, Seoul (Korea, Republic of); Park, Youn Soo [Samsung Medical Center, Sungkyunkwan University School of Medicine, Seoul (Korea, Republic of)

    2006-11-15

    We wanted to evaluate the usefulness of MRI 3D quantitative analysis for measuring osteonecrosis of the femoral head in comparison with MRI 2D quantitative analysis and quantitative analysis of the specimen. Over 3 months at our hospital, 14 femoral head specimens with osteonecrosis were obtained after total hip arthroplasty. The patients' preoperative MRIs were retrospectively reviewed for quantitative analysis of the size of the necrosis. Each necrotic fraction of the femoral head was measured by 2D quantitative analysis using mid-coronal and mid-sagittal MRIs, and by 3D quantitative analysis using serial continuous coronal MRIs and 3D reconstruction software. The necrotic fraction of the specimen was physically measured by the fluid displacement method. The necrotic fraction according to MRI 2D or 3D quantitative analysis was compared with that of the specimen by using Spearman's correlation test. On the correlative analysis, the necrotic fraction by MRI 2D quantitative analysis and quantitative analysis of the specimen showed moderate correlation (r = 0.657); on the other hand, the necrotic fraction by MRI 3D quantitative analysis and quantitative analysis of the specimen demonstrated a strong correlation (r = 0.952) (p < 0.05). MRI 3D quantitative analysis was more accurate than 2D quantitative analysis for measuring osteonecrosis of the femoral head. Therefore, it may be useful for predicting the clinical outcome and deciding the proper treatment option.

  11. Quantitation of DNA methylation by melt curve analysis

    Directory of Open Access Journals (Sweden)

    Jones Michael E

    2009-04-01

    Full Text Available Abstract Background Methylation of DNA is a common mechanism for silencing genes, and aberrant methylation is increasingly being implicated in many diseases such as cancer. There is a need for robust, inexpensive methods to quantitate methylation across a region containing a number of CpGs. We describe and validate a rapid, in-tube method to quantitate DNA methylation using the melt data obtained following amplification of bisulfite modified DNA in a real-time thermocycler. Methods We first describe a mathematical method to normalise the raw fluorescence data generated by heating the amplified bisulfite modified DNA. From this normalised data the temperatures at which melting begins and finishes can be calculated, which reflect the less and more methylated template molecules present respectively. Also the T50, the temperature at which half the amplicons are melted, which represents the summative methylation of all the CpGs in the template mixture, can be calculated. These parameters describe the methylation characteristics of the region amplified in the original sample. Results For validation we used synthesized oligonucleotides and DNA from fresh cells and formalin fixed paraffin embedded tissue, each with known methylation. Using our quantitation we could distinguish between unmethylated, partially methylated and fully methylated oligonucleotides mixed in varying ratios. There was a linear relationship between T50 and the dilution of methylated into unmethylated DNA. We could quantitate the change in methylation over time in cell lines treated with the demethylating drug 5-aza-2'-deoxycytidine, and the differences in methylation associated with complete, clonal or no loss of MGMT expression in formalin fixed paraffin embedded tissues. Conclusion We have validated a rapid, simple in-tube method to quantify methylation which is robust and reproducible, utilizes easily designed primers and does not need proprietary algorithms or software. The
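
    The T50 step can be sketched directly: normalise the post-amplification melt fluorescence to a melt fraction and interpolate the half-melted temperature. This is illustrative code following the description above, not the authors' algorithm; the synthetic sigmoid melt curve in the test is invented.

```python
import numpy as np

def t50(temps, fluorescence):
    """Temperature at which half the amplicons have melted.

    Fluorescence falls as DNA melts; normalise it so 1 = fully
    double-stranded, convert to a melted fraction, and interpolate."""
    f = np.asarray(fluorescence, float)
    norm = (f - f.min()) / (f.max() - f.min())
    melted = 1.0 - norm                     # increases with temperature
    return float(np.interp(0.5, melted, temps))
```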

  12. Spatial and temporal variation in selection of genes associated with pearl millet varietal quantitative traits in situ

    Directory of Open Access Journals (Sweden)

    Cedric Mariac

    2016-07-01

    Full Text Available Ongoing global climate changes imply new challenges for agriculture. Whether plants and crops can adapt to such rapid changes is still a widely debated question. We previously showed adaptation in the form of earlier flowering in pearl millet at the scale of a whole country over three decades. However, that analysis did not deal with year-to-year variability of selection. To understand and possibly manage plant and crop adaptation, we need more knowledge of how selection acts in situ. Is selection gradual or abrupt, and does it vary in space and over time? In the present study, we tracked the evolution of allele frequencies in two genes associated with pearl millet phenotypic variation in situ. We sampled 17 populations of cultivated pearl millet over a period of two years. We tracked changes in allele frequencies in these populations by genotyping more than seven thousand individuals. After correcting for allele frequency changes associated with genetic drift, we demonstrate that several allele frequency changes are compatible with selection. We found marked variation in allele frequencies from year to year, suggesting a selection effect that varies in space and over time. We estimated the strength of selection associated with the variations in allele frequency. Our results suggest that the polymorphism maintained at the genes we studied is partially explained by the spatial and temporal variability of selection. In response to environmental changes, traditional pearl millet varieties could adapt rapidly thanks to this available functional variability.
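
    Separating selection from drift, as done above, can be sketched with a Wright-Fisher null simulation: how often would drift plus sampling noise alone produce a change as large as the one observed? The effective population size, sample size, and generation count below are invented for illustration, not the study's estimates.

```python
import numpy as np

def drift_pvalue(p0, p1, n_sampled, ne, generations=2,
                 n_sim=100_000, seed=0):
    """Monte-Carlo probability that binomial drift (Wright-Fisher, assumed
    effective size ne) plus binomial sampling of n_sampled alleles moves
    the frequency from p0 at least as far as the observed p1."""
    rng = np.random.default_rng(seed)
    p = np.full(n_sim, float(p0))
    for _ in range(generations):
        p = rng.binomial(2 * ne, p) / (2 * ne)       # drift per generation
    obs = rng.binomial(n_sampled, p) / n_sampled      # sampling noise
    return float(np.mean(np.abs(obs - p0) >= abs(p1 - p0)))
```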

  13. Study of academic achievements using spatial analysis tools

    Science.gov (United States)

    González, C.; Velilla, C.; Sánchez-Girón, V.

    2012-04-01

    In the 2010/12 academic year the College of Agricultural Engineering of the Technical University of Madrid implemented three new degrees, all adapted to the European Space for Higher Education, namely: Graduate in Agricultural Engineering and Science, Graduate in Food Engineering and Graduate in Agro-Environmental Engineering. A total of 382 new incoming students were registered, and a survey study of their academic achievement was carried out with the aim of finding the level of dependence among the following variables: the final mark in their secondary studies, the option followed in the secondary studies (Art, Science and Technology, and Humanities and Social Sciences), the mark obtained in the university entrance examination, and in which of the two examination sittings per year the latter mark was obtained. Similarly, another group of 77 students was evaluated independently of the former group. These were students who entered the College in the previous academic year (2009/10) and decided to change their curricula to the new ones. Subsequently, using the spatial analysis tools of geographic information systems, we analyzed the possible relationship between success or failure at school and the socioeconomic profile of new students in a degree. For this purpose every student was georeferenced by assigning UTM coordinates to their postal address. Furthermore, all students' secondary schools were geographically coded considering their typology (public, private, and private subsidized) and fees. Each student was represented by an average geometric point in order to be correlated to their respective record. Following this procedure a map of the performance of each student could be drawn.
This map can be used as a reference system, as it includes variables, such as the distance from the student's home to the College, that can be used as a tool to calculate the probability of success or

  14. Spatial control of groundwater contamination, using principal component analysis

    Indian Academy of Sciences (India)

    N Subba Rao

    2014-06-01

    A study on the geochemistry of groundwater was carried out in a river basin of Andhra Pradesh to probe into the spatial controlling processes of groundwater contamination, using principal component analysis (PCA). The PCA transforms the chemical variables, pH, EC, Ca2+, Mg2+, Na+, K+, HCO3−, Cl−, SO42−, NO3− and F−, into two orthogonal principal components (PC1 and PC2), accounting for 75% of the total variance of the data matrix. PC1 has high positive loadings of EC, Na+, Cl−, SO42−, Mg2+ and Ca2+, representing a salinity controlled process of geogenic (mineral dissolution, ion exchange, and evaporation), anthropogenic (agricultural activities and domestic wastewaters), and marine (marine clay) origin. The PC2 loadings are highly positive for HCO3−, F−, pH and NO3−, attributing to the alkalinity and pollution controlled processes of geogenic and anthropogenic origins. The PC scores reflect the change of groundwater quality of geogenic origin from the upstream to the downstream area with an increase in concentration of chemical variables, which is due to anthropogenic and marine origins with varying topography, soil type, depth of water levels, and water usage. Thus, the groundwater quality shows a variation of chemical facies from Na+ > Ca2+ > Mg2+ > K+: HCO3− > Cl− > SO42− > NO3− > F− at high topography to Na+ > Mg2+ > Ca2+ > K+: Cl− > HCO3− > SO42− > NO3− > F− at low topography. With PCA, an effective tool for identifying the spatial controlling processes of groundwater contamination, a subset of explored wells is indexed for continuous monitoring to optimize the expensive effort.
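
    The PCA step itself is standard: standardise the hydrochemical variables, then take the SVD. A compact sketch (the synthetic data in the test, with two correlated variables and one independent one, is invented to mimic a dominant "salinity" component):

```python
import numpy as np

def pca(X, n_components=2):
    """PCA via SVD of the standardised data matrix.

    Returns component scores, loadings (rows = components), and the
    fraction of total variance each retained component explains."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    var = s ** 2 / (len(X) - 1)
    explained = var / var.sum()
    scores = Z @ Vt[:n_components].T
    return scores, Vt[:n_components], explained[:n_components]
```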

  15. Spatial Analysis of Stomach Cancer Incidence in Iran.

    Science.gov (United States)

    Pakzad, Reza; Khani, Yousef; Pakzad, Iraj; Momenimovahed, Zohre; Mohammadian-Hashejani, Abdollah; Salehiniya, Hamid; Towhidi, Farhad; Makhsosi, Behnam Reza

    2016-01-01

    Stomach cancer, the fourth most common cancer and the second leading cause of cancer-related death in the world, is very common in parts of Iran. Geographic variation in the incidence of stomach cancer is due to many different factors. The aim of this study was to assess the geographical and spatial distribution of stomach cancer in Iran using data from the cancer registry program in Iran for the year 2009. The reported incidences of stomach cancer for different provinces were standardized to the world population structure. ArcGIS software was used to analyse the data. Hot spots and high risk areas were determined using spatial analysis (Getis-Ord Gi). Hot and cold spots were defined as more than or less than 2 standard deviations from the national average, respectively. A significance level of 0.10 was used for statistical judgment. In 2009, a total of 6,886 cases of stomach cancer were reported, of which 4,891 were in men and 1,995 in women (standardized incidence rates of 19.2 and 10.0, respectively, per 100,000 population). The results showed that stomach cancer was concentrated mainly in the northwest of the country in both men and women. In women, northwest provinces such as Ardebil, East Azerbaijan, West Azerbaijan, Gilan, and Qazvin were identified as hot spots (P<0.01). In men, in Qazvin, Zanjan and Kurdistan, the incidences were higher than the national average and these were identified as hot spots (P<0.01). As stomach cancer is clustered in the northwest of the country, further epidemiological studies are needed to identify factors contributing to this concentration.
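
    The hot-spot statistic used here, Getis-Ord Gi*, has a closed form: a z-score comparing the locally weighted sum of values with its expectation under spatial randomness. A NumPy sketch over a toy one-dimensional chain of areas (the incidence values are invented):

```python
import numpy as np

def getis_ord_gi_star(x, W):
    """Getis-Ord Gi* z-scores for values x; W is a binary spatial weight
    matrix that includes the self-weight w_ii (the * variant)."""
    x = np.asarray(x, float)
    n = len(x)
    xbar = x.mean()
    s = np.sqrt((x ** 2).mean() - xbar ** 2)
    wsum = W.sum(axis=1)
    w2sum = (W ** 2).sum(axis=1)
    num = W @ x - xbar * wsum
    den = s * np.sqrt((n * w2sum - wsum ** 2) / (n - 1))
    return num / den
```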

  16. Quantitative XRD Analysis of Cement Clinker by the Multiphase Rietveld Method

    Institute of Scientific and Technical Information of China (English)

    HONG Han-lie; FU Zheng-yi; MIN Xin-min

    2003-01-01

    Quantitative phase analysis of Portland cement clinker samples was performed using an adaptation of the Rietveld method. The Rietveld quantitative analysis program, originally in Fortran 77 code, was substantially rewritten in Visual Basic with a Windows 9x graphical user interface, which frees it from the 640 KB conventional-memory constraint and allows it to be conveniently operated in the Windows environment. The Rietveld quantitative method provides numerous advantages over conventional XRD quantitative methods, especially for intensity anomalies and peak-superposition problems. Examples of its use are given together with results from other methods. It is concluded that, at present, the Rietveld method is the most suitable one for quantitative phase analysis of Portland cement clinker.
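
    Once the refinement has converged, Rietveld quantitative phase analysis converts the refined scale factors into weight fractions via the Hill-Howard relation, W_p = S_p(ZMV)_p / Σ_i S_i(ZMV)_i, where Z is formula units per cell, M the formula mass, and V the cell volume. A minimal sketch with invented scale factors and (ZMV) products:

```python
def rietveld_weight_fractions(scales, zmv):
    """Phase weight fractions from refined Rietveld scale factors S_p and
    the per-phase products (Z*M*V)_p, via W_p = S_p(ZMV)_p / sum_i S_i(ZMV)_i."""
    terms = [s * z for s, z in zip(scales, zmv)]
    total = sum(terms)
    return [t / total for t in terms]
```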

  17. Particle concentration measurement of virus samples using electrospray differential mobility analysis and quantitative amino acid analysis.

    Science.gov (United States)

    Cole, Kenneth D; Pease, Leonard F; Tsai, De-Hao; Singh, Tania; Lute, Scott; Brorson, Kurt A; Wang, Lili

    2009-07-24

    Virus reference materials are needed to develop and calibrate detection devices and instruments. We used electrospray differential mobility analysis (ES-DMA) and quantitative amino acid analysis (AAA) to determine the particle concentration of three small model viruses (bacteriophages MS2, PP7, and phiX174). The biological activity, purity, and aggregation of the virus samples were measured using plaque assays, denaturing gel electrophoresis, and size-exclusion chromatography. ES-DMA was developed to count the virus particles using gold nanoparticles as internal standards. ES-DMA additionally provides quantitative measurement of the size and extent of aggregation in the virus samples. Quantitative AAA was also used to determine the mass of the viral proteins in the pure virus samples. The samples were hydrolyzed and the masses of the well-recovered amino acids were used to calculate the equivalent concentration of viral particles in the samples. The concentration of the virus samples determined by ES-DMA was in good agreement with the concentration predicted by AAA for these purified samples. The advantages and limitations of ES-DMA and AAA to characterize virus reference materials are discussed.
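    The AAA step above reduces to simple unit arithmetic: the amino-acid-derived protein mass per millilitre is divided by the protein mass of a single virion to give particles per millilitre. A sketch of that conversion; the per-virion protein mass used below is a rough assumption for MS2 (about 180 coat-protein copies of ~13.7 kDa plus one maturation protein), not a value from the paper:

    ```python
    AVOGADRO = 6.022e23  # particles per mole

    def particles_per_ml(protein_mass_g_per_ml, virion_protein_mass_da):
        """Convert an AAA-derived protein mass concentration (g/mL) to a
        particle concentration (particles/mL), given the total protein
        mass per virion in daltons (g/mol)."""
        per_virion_g = virion_protein_mass_da / AVOGADRO  # Da -> grams
        return protein_mass_g_per_ml / per_virion_g

    # Illustrative: 1 ug/mL of capsid protein, ~2.5 MDa protein per virion.
    c = particles_per_ml(1e-6, 2.5e6)
    print(f"{c:.2e} particles/mL")
    ```

    The same arithmetic works for any purified virus once the capsid stoichiometry is known, which is why the method is restricted to well-characterized, monodisperse reference samples.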

  18. Signal Adaptive System for Space/Spatial-Frequency Analysis

    Directory of Open Access Journals (Sweden)

    Veselin N. Ivanović

    2009-01-01

    Full Text Available This paper outlines the development of a multiple-clock-cycle implementation (MCI) of a signal-adaptive two-dimensional (2D) system for space/spatial-frequency (S/SF) signal analysis. The design is based on a method, also proposed here, for improved S/SF representation of the analyzed 2D signals. The proposed MCI design optimizes critical design performances related to hardware complexity, making it a suitable system for real-time implementation on an integrated chip. Additionally, the design allows the implemented system to take a variable number of clock cycles (CLKs) during execution: only as many as are necessary to achieve the desired 2D Wigner distribution presentation of autoterms at the different frequency-frequency points. This ability is a major advantage of the proposed design, helping to optimize execution time and to produce an improved, cross-terms-free S/SF signal representation. The design has been verified with a field-programmable gate array (FPGA) circuit implementation capable of performing S/SF analysis of 2D signals in real time.
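    As background to the Wigner-distribution presentation the record relies on, a 1D discrete pseudo-Wigner distribution (the 2D case applies the same kernel along both spatial axes) can be sketched as follows. This is a textbook construction, not the paper's hardware algorithm; note that for an analytic tone at normalized frequency f0, the kernel x[n+m]·conj(x[n−m]) oscillates at 2·f0, so the peak lands at bin 2·f0·L:

    ```python
    import numpy as np

    def pseudo_wigner(x, win_half=16):
        """Discrete pseudo-Wigner distribution of a 1D analytic signal x.
        Returns a (len(x), 2*win_half) time-frequency array: for each time
        index n, the FFT over the lag m of the local autocorrelation kernel
        x[n+m] * conj(x[n-m]), windowed to |m| < win_half."""
        n_sig = len(x)
        L = 2 * win_half
        W = np.zeros((n_sig, L))
        for n in range(n_sig):
            kern = np.zeros(L, dtype=complex)
            for m in range(-win_half + 1, win_half):
                if 0 <= n + m < n_sig and 0 <= n - m < n_sig:
                    kern[m % L] = x[n + m] * np.conj(x[n - m])
            W[n] = np.real(np.fft.fft(kern))   # Wigner values are real
        return W

    # Analytic tone at f0 = 0.125 cycles/sample: energy concentrates at
    # bin 2 * f0 * L = 8 for interior time indices.
    x = np.exp(2j * np.pi * 0.125 * np.arange(128))
    W = pseudo_wigner(x, win_half=16)
    ```

    For multicomponent signals this raw form produces the cross-terms the paper's signal-adaptive method is designed to suppress.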

  19. Analysis of WiMAX Physical Layer Using Spatial Multiplexing

    CERN Document Server

    Sanghoi, Pavani

    2012-01-01

    Broadband Wireless Access (BWA) has emerged as a promising last-mile technology for providing high-speed internet access to users in the residential as well as the small and medium-sized enterprise sectors. IEEE 802.16e is one of the most promising and attractive candidates among the emerging technologies for broadband wireless access, and the emergence of the WiMAX protocol has attracted interest from almost all fields of wireless communications. MIMO systems built according to the IEEE 802.16-2005 (WiMAX) standard can be implemented under different fading channels to obtain the benefits of both the MIMO and WiMAX technologies. This paper presents an analysis of higher-order modulations (M-PSK and M-QAM for different values of M) with different code rates on a WiMAX-MIMO system over a Rayleigh channel, focusing on the spatial multiplexing MIMO technique. A Signal-to-Noise Ratio (SNR) versus Bit Error Rate (BER) analysis is carried out.
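    The kind of SNR-versus-BER curve described above can be reproduced with a Monte-Carlo sketch: a 2x2 spatial-multiplexing link over flat Rayleigh fading with a zero-forcing receiver and QPSK. This is a generic illustration, not the paper's simulation setup; the noise scaling (total transmit power nt, per-antenna noise N0 = nt/SNR) is one common convention and is an assumption here:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def ber_zf_qpsk(snr_db, n_sym=5000, nt=2, nr=2):
        """Monte-Carlo BER of nt x nr spatial multiplexing with a
        zero-forcing receiver, Gray-mapped QPSK, flat Rayleigh fading."""
        snr = 10 ** (snr_db / 10)
        bits = rng.integers(0, 2, (n_sym, nt, 2))
        # Gray-mapped QPSK, unit energy per symbol: bit 0 -> +, bit 1 -> -
        s = ((1 - 2 * bits[..., 0]) + 1j * (1 - 2 * bits[..., 1])) / np.sqrt(2)
        errs = 0
        for i in range(n_sym):
            H = (rng.standard_normal((nr, nt))
                 + 1j * rng.standard_normal((nr, nt))) / np.sqrt(2)
            noise = (rng.standard_normal(nr)
                     + 1j * rng.standard_normal(nr)) * np.sqrt(nt / (2 * snr))
            y = H @ s[i] + noise
            x_hat = np.linalg.pinv(H) @ y      # zero-forcing equalizer
            errs += np.sum((x_hat.real < 0) != (bits[i, :, 0] == 1))
            errs += np.sum((x_hat.imag < 0) != (bits[i, :, 1] == 1))
        return errs / (n_sym * nt * 2)         # 2 bits per QPSK symbol

    b5, b20 = ber_zf_qpsk(5), ber_zf_qpsk(20)  # BER falls as SNR rises
    ```

    Sweeping snr_db and plotting the returned BER gives the SNR-vs-BER curves; higher-order M-QAM would replace the QPSK mapper and slicer.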

  20. Spatial analysis of the tuberculosis treatment dropout, Buenos Aires, Argentina

    Directory of Open Access Journals (Sweden)

    María Belén Herrero

    2015-01-01

    Full Text Available OBJECTIVE Identify spatial distribution patterns of the proportion of nonadherence to tuberculosis treatment and its associated factors. METHODS We conducted an ecological study based on secondary and primary data from municipalities of the metropolitan area of Buenos Aires, Argentina. An exploratory analysis of the characteristics of the area and the distributions of the cases included in the sample (proportion of nonadherence) was carried out, along with a multifactor analysis by linear regression. Variables related to the characteristics of the population, residences, and families were analyzed. RESULTS Areas with a higher proportion of the population without social security benefits (p = 0.007) and of households with unsatisfied basic needs (p = 0.032) had a higher risk of nonadherence. In addition, the proportion of nonadherence was higher in areas with the highest proportion of households with no public transportation within 300 meters (p = 0.070). CONCLUSIONS We found a risk area for nonadherence to treatment characterized by a population living in poverty, with precarious jobs and difficult access to public transportation.
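    The multifactor linear-regression step of such an ecological study can be sketched in a few lines: area-level nonadherence proportions regressed on area-level covariates by ordinary least squares. All data below are synthetic, with coefficients chosen only to make the fit visible; they are not the study's estimates:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_areas = 40

    # Synthetic area-level covariates (proportions in each area).
    no_social_security = rng.uniform(0, 0.6, n_areas)
    unmet_basic_needs = rng.uniform(0, 0.5, n_areas)

    # Synthetic outcome: nonadherence proportion with invented true
    # coefficients 0.3 and 0.2 plus noise.
    nonadherence = (0.05 + 0.3 * no_social_security
                    + 0.2 * unmet_basic_needs
                    + rng.normal(0, 0.02, n_areas))

    # OLS fit: design matrix with intercept column, solved by least squares.
    X = np.column_stack([np.ones(n_areas), no_social_security,
                         unmet_basic_needs])
    beta, *_ = np.linalg.lstsq(X, nonadherence, rcond=None)
    print(np.round(beta, 3))  # [intercept, slope1, slope2]
    ```

    In a real ecological analysis one would also report p-values and confidence intervals for each slope (e.g. via statsmodels), and bear in mind that area-level associations need not hold at the individual level.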