WorldWideScience

Sample records for quantitative spatial analysis

  1. Quantitative analysis of spatial variability of geotechnical parameters

    Science.gov (United States)

    Fang, Xing

    2018-04-01

    Geotechnical parameters are the basic inputs of geotechnical engineering design, yet they have strong regional characteristics, and their spatial variability is now widely recognized and is gradually being introduced into the reliability analysis of geotechnical engineering. Based on geostatistical theory, the spatial variability of geotechnical parameters is quantitatively analyzed, and the correlation coefficients between geotechnical parameters are calculated. A residential district of the Tianjin Survey Institute was selected as the research object, covering 68 boreholes and 9 mechanically stratified layers. The parameters considered are water content, natural unit weight, void ratio, liquid limit, plasticity index, liquidity index, compressibility coefficient, compressive modulus, internal friction angle, cohesion and the SP index. The correlation coefficients between the geotechnical parameters are computed according to standard statistical principles, and from them the correlation structure of the parameters is characterized.
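
    As a hedged illustration of the correlation step described above, the sketch below computes pairwise Pearson correlation coefficients between a few of the listed parameters. All borehole values are synthetic stand-ins; in practice each column would hold the laboratory results from the 68 boreholes, and the coefficients would then reflect the physical relationships the study reports.

      import numpy as np
      import pandas as pd

      rng = np.random.default_rng(0)
      # Synthetic stand-in for 68 boreholes x selected geotechnical parameters;
      # random data will give near-zero coefficients, real data would not.
      df = pd.DataFrame({
          "water_content":    rng.normal(25, 4, 68),      # %
          "void_ratio":       rng.normal(0.75, 0.10, 68),
          "liquid_limit":     rng.normal(35, 5, 68),      # %
          "plasticity_index": rng.normal(15, 3, 68),
          "cohesion":         rng.normal(20, 6, 68),      # kPa
      })
      print(df.corr(method="pearson").round(2))   # pairwise correlation matrix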

  2. Problems in implementation of the spatial plan of the Republic of Srpska until 2015: Quantitative analysis

    Directory of Open Access Journals (Sweden)

    Bijelić Branislav

    2017-01-01

    The implementation of spatial plans is certainly the weakest phase of the spatial planning process in the Republic of Srpska. This is particularly evident in the case of the Spatial Plan of the Republic of Srpska until 2015, the highest strategic spatial planning document in the entity. More precisely, the implementation of spatial plans is defined here as the carrying out of spatial planning documents, i.e. of the planning propositions defined in the spatial plans. For the purpose of this paper, a quantitative analysis of the implementation of the planning propositions envisioned by this document was carried out. The difference between what was planned and what was implemented at the end of the planning period (an ex-post evaluation of planning decisions) is presented. A weighting factor is defined for each thematic field and planning proposition, the main criterion for determining the weighting factor being the share of the planning proposition and thematic field in the estimated total costs of the plan (a financial criterion). The paper also tackles the implementation of the Spatial Plan of Bosnia and Herzegovina for the period 1981-2000, as well as of the Spatial Plan of the Republic of Srpska 1996-2001 (Phased Plan for the period 1996-2001), the previous strategic spatial planning documents of the highest rank covering the area of the Republic of Srpska. The research results confirm the paper's primary hypothesis that the level of implementation of the Spatial Plan of the Republic of Srpska until 2015 is less than 10%.
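
    The cost-weighted aggregation described above can be made concrete with a small sketch. The thematic fields, cost shares and implementation fractions below are invented for illustration only; the paper's actual weights come from each proposition's share of the plan's estimated total costs.

      # Invented thematic fields: (share of estimated total plan costs, fraction implemented)
      propositions = {
          "transport infrastructure": (0.40, 0.12),
          "energy infrastructure":    (0.30, 0.08),
          "water management":         (0.20, 0.05),
          "tourism and services":     (0.10, 0.15),
      }
      # Cost shares act as weighting factors (the paper's financial criterion)
      overall = sum(weight * done for weight, done in propositions.values())
      print(f"weighted implementation level: {overall:.1%}")   # 9.7% with these invented numbers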

  3. A Quantitative Three-Dimensional Image Analysis Tool for Maximal Acquisition of Spatial Heterogeneity Data.

    Science.gov (United States)

    Allenby, Mark C; Misener, Ruth; Panoskaltsis, Nicki; Mantalaris, Athanasios

    2017-02-01

    Three-dimensional (3D) imaging techniques provide spatial insight into environmental and cellular interactions and are implemented in various fields, including tissue engineering, but have been restricted by limited quantification tools that misrepresent or underutilize the cellular phenomena captured. This study develops image postprocessing algorithms pairing complex Euclidean metrics with Monte Carlo simulations to quantitatively assess cell and microenvironment spatial distributions while utilizing, for the first time, the entire 3D image captured. Although current methods only analyze a central fraction of presented confocal microscopy images, the proposed algorithms can utilize 210% more cells to calculate 3D spatial distributions that can span a 23-fold longer distance. These algorithms seek to leverage the high sample cost of 3D tissue imaging techniques by extracting maximal quantitative data throughout the captured image.
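
    A minimal sketch of the Monte Carlo idea: compare an observed 3D nearest-neighbour statistic against repeated simulations of complete spatial randomness in the same imaging volume. The cell centroids, volume dimensions and metric below are synthetic stand-ins, not the authors' data or their exact Euclidean metrics.

      import numpy as np

      rng = np.random.default_rng(0)
      box = np.array([200.0, 200.0, 50.0])              # imaging volume (um), assumed
      cells = rng.uniform(0, box, size=(300, 3))        # stand-in for segmented cell centroids

      def mean_nn_distance(points):
          # Mean Euclidean distance from each point to its nearest neighbour
          d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
          np.fill_diagonal(d, np.inf)
          return d.min(axis=1).mean()

      observed = mean_nn_distance(cells)
      # Monte Carlo null model: complete spatial randomness in the same volume
      null = np.array([mean_nn_distance(rng.uniform(0, box, size=cells.shape))
                       for _ in range(100)])
      p = (np.sum(null <= observed) + 1) / (null.size + 1)
      print(f"observed mean NN distance {observed:.2f} um, p = {p:.3f}")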

  4. Quantitative spatial analysis of the mouse brain lipidome by pressurized liquid extraction surface analysis

    DEFF Research Database (Denmark)

    Almeida, Reinaldo; Berzina, Zane; Christensen, Eva Arnspang

    2015-01-01

    extracted directly from tissue sections. PLESA uses a sealed and pressurized sampling probe that enables the use of chloroform-containing extraction solvents for efficient in situ lipid microextraction with a spatial resolution of 400 μm. Quantification of lipid species is achieved by the inclusion...

  5. Prospects for higher spatial resolution quantitative X-ray analysis using transition element L-lines

    Science.gov (United States)

    Statham, P.; Holland, J.

    2014-03-01

    Lowering electron beam kV reduces electron scattering and improves the spatial resolution of X-ray analysis. However, a previous round robin analysis of steels at 5-6 kV using Lα-lines for the first-row transition elements gave poor accuracies. Our experiments on SS63 steel using Lα-lines show similar biases in Cr and Ni that cannot be corrected with changes to self-absorption coefficients or carbon coating. The inaccuracy may be caused by different probabilities for emission and anomalous self-absorption of the Lα-line between specimen and pure-element standard. Analysis using Ll (L3-M1) lines gives more accurate results for SS63, plausibly because the M1 shell is not as vulnerable to the atomic environment as the unfilled M4,5 shell. However, Ll intensities are very weak, and WDS analysis may be impractical for some applications. EDS with a large-area SDD offers orders-of-magnitude faster analysis and achieves results similar to WDS analysis with Lα-lines, but its poorer energy resolution precludes the use of Ll-lines in most situations. EDS analysis of K-lines at low overvoltage is an alternative strategy for improving spatial resolution that could give higher accuracy. The trade-off between low kV and low overvoltage is explored in terms of sensitivity for element detection for different elements.

  6. Quantitative evaluation of the risk induced by dominant geomorphological processes on different land uses, based on GIS spatial analysis models

    Science.gov (United States)

    Ştefan, Bilaşco; Sanda, Roşca; Ioan, Fodorean; Iuliu, Vescan; Sorin, Filip; Dănuţ, Petrea

    2017-12-01

    Maramureş Land is mostly characterized by agricultural and forestry land use due to its specific configuration of topography and its specific pedoclimatic conditions. Taking into consideration the trend of the last century from the perspective of land management, a decrease in the surface of agricultural lands to the advantage of built-up and grass lands, as well as an accelerated decrease in forest cover due to uncontrolled and irrational forest exploitation, has become obvious. The field analysis performed on the territory of Maramureş Land has highlighted a high frequency of two geomorphologic processes, landslides and soil erosion, which have a major negative impact on land use due to their rate of occurrence. The main aim of the present study is the GIS modeling of these two geomorphologic processes, determining their vulnerability state (the USLE model for soil erosion, and a quantitative model based on the morphometric characteristics of the territory, derived from HG 447/2003, for landslides) and their integration into a complex model of cumulated vulnerability identification. The modeling of risk exposure was performed using a quantitative approach based on models and equations of spatial analysis, developed with modeled raster data structures and primary vector data, through a matrix highlighting the correspondence between vulnerability and land use classes. The quantitative analysis of the risk was performed by taking into consideration the exposure classes as modeled databases and the land price as a primary alphanumeric database, using spatial analysis techniques for each class by means of the attribute table. The spatial results highlight the territories where the geomorphologic processes at work have a high degree of occurrence and represent a useful tool in the process of spatial planning.
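
    The vulnerability-land use correspondence matrix described above can be sketched as a raster lookup. The class values and matrix entries below are hypothetical; the study derives its vulnerability classes from the USLE and morphometric models rather than from random rasters.

      import numpy as np

      rng = np.random.default_rng(0)
      vulnerability = rng.integers(1, 6, size=(100, 100))   # classes 1-5, stand-in raster
      land_use      = rng.integers(0, 3, size=(100, 100))   # 0=forest, 1=agricultural, 2=built-up

      # Hypothetical correspondence matrix: risk class per (land use, vulnerability class)
      risk_matrix = np.array([
          [1, 1, 2, 3, 4],   # forest
          [1, 2, 3, 4, 5],   # agricultural
          [2, 3, 4, 5, 5],   # built-up
      ])
      risk = risk_matrix[land_use, vulnerability - 1]       # per-cell risk raster
      print({c: int((risk == c).sum()) for c in range(1, 6)})   # exposed cells per risk class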

  8. Mutual misunderstanding and avoidance, misrepresentations and disciplinary politics: spatial science and quantitative analysis in (United Kingdom) geographical curricula

    DEFF Research Database (Denmark)

    Johnston, Ron; Harris, Richard J; Jones, Kelvyn

    2014-01-01

    are those commonly categorised by such terms as 'spatial science' and 'quantitative analysis'. Critics of these areas often write as if the type of work undertaken in the 1960s-1970s still characterises them today, with little appreciation of contemporary activities. This article responds to such claims by presenting the current nature of work in those areas (very different from that of several decades ago) and makes the case for their inclusion in curricula so that students (most of whom will not proceed to research in the areas) can appreciate the underlying principles of quantitative analyses and their important role in the formation of an informed citizenry in data-driven, evidence-based policy societies.

  9. Development of radiochromic film for spatially quantitative dosimetric analysis of indirect ionizing radiation fields

    Science.gov (United States)

    Brady, Samuel Loren

    Two types of radiochromic film (RCF) were characterized for this work: EBT and XRQA film. Both films were investigated for: radiation interaction with film structure; light interaction with film structure for optimal film readout (densitometry) sensitivity; range of absorbed dose measurements; dependence of film dose measurement response as a function of changing radiation energy; fractionation and dose rate effects on film measurement response; film response sensitivity to ambient factors; and stability of measured film response with time. EBT film was shown to have the following properties: a near water-equivalent effective atomic number (Zeff); a dynamic dose range of 10⁻¹-10² Gy; a 3% change in optical density (OD) response for a single exposure level when exposed to radiation energies from 75 to 18,000 kV; and best digitization using transmission densitometry. XRQA film was shown to have: a Zeff of ~25; a 12-fold increase in sensitivity at lower photon energies for a dynamic dose range of 10⁻³-10⁰ Gy; a difference of 25% in OD response when comparing 120 kV to 320 kV; and best digitization using reflective densitometry. Both XRQA and EBT films were shown to have: a temporal stability (ΔOD) of ~1% for t > 24 hr post film exposure for up to ~20 days; a change in dose response of ~0.03 mGy hr⁻¹ when exposed to fluorescent room lighting at standard room temperature and humidity levels; a negligible dose rate and fractionation effect when operated within the optimal dose ranges; and a light wavelength dependence with dose for film readout. The flat bed scanner was chosen as the primary film digitizer due to its availability, cost, OD range, functionality (transmission and reflection scanning), and digitization speed. As a cost versus functionality comparison, the intrinsic and operational limitations were determined for two flat bed scanners. The EPSON V700 and 10000XL exhibited equal spatial and OD accuracy. The combined precision of both the scanner light sources and CCD

  10. Growth of solid domains in model membranes: quantitative image analysis reveals a strong correlation between domain shape and spatial position

    DEFF Research Database (Denmark)

    Bernchou, Uffe; Ipsen, John Hjort; Simonsen, Adam Cohen

    2009-01-01

    To analyze this effect, the nucleation points were used as generators in a Voronoi construction. Associated with each generator is a Voronoi polygon that contains all points closer to this generator than to any other. Through a detailed quantitative analysis of the Voronoi cells and the domains, we have
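
    A Voronoi construction of the kind described can be sketched with SciPy: each nucleation point becomes a generator, and bounded cells can then be measured. The coordinates below are synthetic stand-ins for detected nucleation sites, not the authors' data.

      import numpy as np
      from scipy.spatial import ConvexHull, Voronoi

      rng = np.random.default_rng(0)
      nucleation_points = rng.uniform(0, 100, size=(30, 2))   # stand-in nucleation sites (um)
      vor = Voronoi(nucleation_points)

      for i, region_idx in enumerate(vor.point_region):
          region = vor.regions[region_idx]
          if -1 in region or not region:
              continue                                        # skip unbounded cells at the edge
          # Voronoi cells are convex, so the hull of the cell vertices is the exact
          # polygon; in 2-D, ConvexHull.volume is the polygon area.
          area = ConvexHull(vor.vertices[region]).volume
          print(f"generator {i}: cell area {area:.1f} um^2")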

  11. A matter of ephemerality: the study of Kel Tadrart Tuareg (southwest Libya) campsites via quantitative spatial analysis

    Directory of Open Access Journals (Sweden)

    Stefano Biagetti

    2016-03-01

    We examined the settlement structure of the Kel Tadrart Tuareg, a small pastoral society from southwest Libya. Our objective was to apply spatial analysis to establish the statistical significance of specific patterns in the settlement layout. In particular, we examined whether there is a separation between domestic and livestock spaces, and whether particular residential features dedicated to guests are spatially isolated. We used both established statistical techniques and newly developed bespoke analyses to test our hypotheses, and then discuss the results in the light of possible applications to other case studies.
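
    One generic way to test spatial separation of this kind is a permutation test on the feature labels, sketched below with invented coordinates. This is only an illustration of establishing statistical significance for a layout pattern, not the authors' bespoke analyses.

      import numpy as np

      rng = np.random.default_rng(0)
      domestic  = rng.uniform(0, 50, size=(12, 2))   # invented feature coordinates (m)
      livestock = rng.uniform(0, 50, size=(8, 2))

      def mean_cross_distance(a, b):
          return np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1).mean()

      observed = mean_cross_distance(domestic, livestock)
      pooled = np.vstack([domestic, livestock])
      null = []
      for _ in range(999):
          rng.shuffle(pooled)                        # permute feature labels
          null.append(mean_cross_distance(pooled[:12], pooled[12:]))
      # One-sided test: are the two feature types farther apart than chance?
      p = (np.sum(np.array(null) >= observed) + 1) / 1000
      print(f"observed separation {observed:.1f} m, p = {p:.3f}")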

  12. Quantitative analysis chemistry

    International Nuclear Information System (INIS)

    Ko, Wansuk; Lee, Choongyoung; Jun, Kwangsik; Hwang, Taeksung

    1995-02-01

    This book is about quantitative analysis chemistry. It is divided into ten chapters, which deal with the basic concepts of matter and the meaning of analytical chemistry together with SI units, chemical equilibrium, basic preparation for quantitative analysis, an introduction to volumetric analysis, an outline of acid-base titration with experimental examples, chelate titration, oxidation-reduction titration (its introduction, titration curves, and diazotization titration), precipitation titration, electrometric titration, and quantitative analysis.

  13. Quantitative investment analysis

    CERN Document Server

    DeFusco, Richard

    2007-01-01

    In the "Second Edition" of "Quantitative Investment Analysis," financial experts Richard DeFusco, Dennis McLeavey, Jerald Pinto, and David Runkle outline the tools and techniques needed to understand and apply quantitative methods to today's investment process.

  14. An innovative expression model of human health risk based on the quantitative analysis of soil metals sources contribution in different spatial scales.

    Science.gov (United States)

    Zhang, Yimei; Li, Shuai; Wang, Fei; Chen, Zhuang; Chen, Jie; Wang, Liqun

    2018-09-01

    Toxicity of heavy metals from industrialization poses critical concern, and analysis of the sources associated with potential human health risks is of unique significance. Assessing the human health risk of pollution sources (factored health risk) concurrently for the whole region and its sub-regions can provide more instructive information to protect specific potential victims. In this research, we establish a new expression model of human health risk based on quantitative analysis of source contributions at different spatial scales. The larger-scale grids and their spatial codes are used to initially identify the level of pollution risk, the type of pollution source and the sensitive population at high risk. The smaller-scale grids and their spatial codes are used to identify the contribution of various sources of pollution to each sub-region (larger grid) and to assess the health risks posed by each source for each sub-region. The results of the case study show that, for children (a sensitive population, with school and residential areas as their major regions of activity), the major pollution sources are the abandoned lead-acid battery plant (ALP), traffic emission and agricultural activity. The new models and results of this research provide effective spatial information and a useful model for quantifying the hazards of source categories to human health at complex industrial systems in the future. Copyright © 2018 Elsevier Ltd. All rights reserved.
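
    The factored-risk idea, apportioning each sub-region's health risk across pollution sources, can be sketched as a contribution-times-hazard product. All source contributions and hazard values below are invented for illustration; the study derives contributions from source apportionment on the smaller-scale grids.

      import numpy as np

      sources = ["battery plant (ALP)", "traffic", "agriculture"]
      # Invented per-sub-region source contributions (rows sum to 1)
      contribution = np.array([
          [0.6, 0.3, 0.1],    # sub-region A (e.g. school district)
          [0.2, 0.5, 0.3],    # sub-region B
      ])
      hazard = np.array([2.4, 0.8, 0.5])   # illustrative hazard index per source
      factored_risk = contribution * hazard
      for name, row in zip(["A", "B"], factored_risk):
          print(name, dict(zip(sources, row.round(2))))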

  15. The quantitative modelling of human spatial habitability

    Science.gov (United States)

    Wise, J. A.

    1985-01-01

    A model for the quantitative assessment of human spatial habitability is presented in the space station context. The visual aspect assesses how interior spaces appear to the inhabitants. This aspect concerns criteria such as sensed spaciousness and the affective (emotional) connotations of settings' appearances. The kinesthetic aspect evaluates the available space in terms of its suitability to accommodate human movement patterns, as well as the postural and anthropometric changes due to microgravity. Finally, social logic concerns how the volume and geometry of available space either affirms or contravenes established social and organizational expectations for spatial arrangements. Here, the criteria include privacy, status, social power, and proxemics (the use of space as a medium of social communication).

  16. Quantitative Moessbauer analysis

    International Nuclear Information System (INIS)

    Collins, R.L.

    1978-01-01

    The quantitative analysis of Moessbauer data, as in the measurement of Fe³⁺/Fe²⁺ concentration, has not been possible because of the different mean square velocities ⟨x²⟩ of Moessbauer nuclei at chemically different sites. A method is now described which, based on Moessbauer data at several temperatures, permits the comparison of absorption areas at ⟨x²⟩ = 0. (Auth.)

  17. Dahl (S × R) rat congenic strain analysis confirms and defines a chromosome 17 spatial navigation quantitative trait locus to <10 Mbp.

    Science.gov (United States)

    Herrera, Victoria L; Pasion, Khristine A; Tan, Glaiza A; Ruiz-Opazo, Nelson

    2013-01-01

    A quantitative trait locus (QTL) linked with the ability to find a platform in the Morris water maze (MWM) was located on chromosome 17 (Nav-5 QTL) using an intercross between Dahl S and Dahl R rats. We developed two congenic strains, S.R17A and S.R17B, introgressing Dahl R chromosome 17 segments into the Dahl S chromosome 17 region spanning the putative Nav-5 QTL. Performance analysis of S.R17A, S.R17B and Dahl S rats in the MWM task showed a significantly decreased spatial navigation performance in S.R17B congenic rats when compared with Dahl S controls (P = 0.02). The S.R17A congenic segment did not affect MWM performance, delimiting Nav-5 to the chromosome 17 65.02-74.66 Mbp region. Additional fine mapping is necessary to identify the specific gene variant accounting for the Nav-5 effect on spatial learning and memory in Dahl rats.

  19. The quantitative modelling of human spatial habitability

    Science.gov (United States)

    Wise, James A.

    1988-01-01

    A theoretical model for evaluating human spatial habitability (HuSH) in the proposed U.S. Space Station is developed. Optimizing the fitness of the space station environment for human occupancy will help reduce environmental stress due to long-term isolation and confinement in its small habitable volume. The development of tools that operationalize the behavioral bases of spatial volume for visual, kinesthetic, and social logic considerations is suggested. This report further calls for systematic scientific investigations of how much real and how much perceived volume people need in order to function normally and with minimal stress in space-based settings. The theoretical model presented in this report can be applied to any size or shape of interior, at any scale of consideration, from the Space Station as a whole to an individual enclosure or work station. Using as a point of departure the Isovist model developed by Dr. Michael Benedikt of the University of Texas, the report suggests that spatial habitability can become as amenable to careful assessment as engineering and life support concerns.

  20. Quantitative Analysis of Renogram

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Keun Chul [Seoul National University College of Medicine, Seoul (Korea, Republic of)

    1969-03-15

    value are useful for the differentiation of various renal diseases, however, qualitative analysis of the renogram with one or two parameters is not accurate. 3) In bilateral non-functioning kidney groups, a positive correlation between anemia and nitrogen retention was observed, although the quantitative assessment of the degree of non-functioning was impossible.

  2. Quantitative analysis of receptor imaging

    International Nuclear Information System (INIS)

    Fu Zhanli; Wang Rongfu

    2004-01-01

    Model-based methods for the quantitative analysis of receptor imaging, including kinetic, graphical and equilibrium methods, are introduced in detail. Some technical problems facing quantitative analysis of receptor imaging, such as correction for in vivo metabolism of the tracer, the radioactivity contribution from blood volume within the ROI, and estimation of the nondisplaceable ligand concentration, are also reviewed briefly

  3. Towards Quantitative Spatial Models of Seabed Sediment Composition.

    Directory of Open Access Journals (Sweden)

    David Stephens

    There is a need for fit-for-purpose maps that accurately depict the types of seabed substrate and habitat and the properties of the seabed, for the benefit of research, resource management, conservation and spatial planning. The aim of this study is to determine whether it is possible to predict substrate composition across a large area of seabed using legacy grain-size data and environmental predictors. The study area includes the North Sea up to approximately 58.44°N and the United Kingdom's parts of the English Channel and the Celtic Seas. The analysis combines outputs from hydrodynamic models as well as optical remote sensing data from satellite platforms and bathymetric variables, which are mainly derived from acoustic remote sensing. We build a statistical regression model to make quantitative predictions of sediment composition (fractions of mud, sand and gravel) using the random forest algorithm. The compositional data are analysed on the additive log-ratio scale. An independent test set indicates that approximately 66% and 71% of the variability of the two log-ratio variables is explained by the predictive models. A EUNIS substrate model, derived from the predicted sediment composition, achieved an overall accuracy of 83% and a kappa coefficient of 0.60. We demonstrate that it is feasible to spatially predict the seabed sediment composition across a large area of continental shelf in a repeatable and validated way. We also highlight the potential for further improvements to the method.
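
    A compact sketch of the modelling chain described: transform (mud, sand, gravel) fractions to additive log-ratios, fit a random forest, and back-transform predictions to compositions. The predictors and compositions below are random synthetic data, for shape and flow only; the study's real predictors come from hydrodynamic models, satellite optics and bathymetry.

      import numpy as np
      from sklearn.ensemble import RandomForestRegressor

      rng = np.random.default_rng(0)
      n = 500
      X = rng.normal(size=(n, 4))                 # stand-ins for depth, wave stress, etc.
      comp = rng.dirichlet([2, 5, 1], size=n)     # synthetic (mud, sand, gravel) fractions

      # Additive log-ratio transform, gravel as the divisor component
      alr = np.log(comp[:, :2] / comp[:, 2:3])
      model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, alr)

      # Back-transform predictions to compositions that sum to one
      pred = model.predict(X[:5])
      expanded = np.hstack([np.exp(pred), np.ones((5, 1))])
      fractions = expanded / expanded.sum(axis=1, keepdims=True)
      print(fractions.round(3))                   # rows: (mud, sand, gravel)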

  4. Professional analysis in spatial planning

    Directory of Open Access Journals (Sweden)

    Andrej Černe

    2005-12-01

    Spatial analysis contributes to the accomplishment of the three basic aims of spatial planning: it is a basic element in setting spatial policies, concepts and strategies; it gives basic information to inhabitants, land owners, investors and planners; and it helps in carrying out spatial policies, strategies, plans, programmes and projects. Analyses in planning are generally devoted to: understanding current circumstances and emerging conditions within planning decisions; determining priorities among open questions and their solutions; and formulating general principles for further development.

  5. Monotowns: A Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Shastitko Andrei

    2016-06-01

    The authors propose an empirical analysis of the current situation in monotowns. The study questions the perceived seriousness of the 'monotown problem' as well as the actual challenges it presents. The authors use a cluster analysis to divide monotowns into groups for further structural comparison. The structural differences in the available databases limit the possibilities of empirical analysis; hence, alternative approaches are required. The authors consider possible reasons for the limitations identified. Special attention is paid to the monotowns that were granted the status of advanced development territories. A comparative analysis makes it possible to study their general characteristics and socioeconomic indicators. The authors apply the theory of opportunistic behaviour to describe potential problems caused by the lack of unified criteria for granting monotowns the status of advanced development territories. The article identifies the main stakeholders and the character of their interaction; it describes a conceptual model built on principal/agent interactions, and identifies the parametric space of mutually beneficial cooperation. The solution to the principal/agent problem suggested in the article contributes to the development of an alternative approach to the current situation and a rational approach to overcoming the 'monotown problem'.

  6. Quantitative Concept Analysis

    NARCIS (Netherlands)

    Pavlovic, Dusko; Domenach, Florent; Ignatov, Dmitry I.; Poelmans, Jonas

    2012-01-01

    Formal Concept Analysis (FCA) begins from a context, given as a binary relation between some objects and some attributes, and derives a lattice of concepts, where each concept is given as a set of objects and a set of attributes, such that the first set consists of all objects that satisfy all the attributes in the second set, and vice versa.
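
    The derivation operators behind this definition are easy to sketch: a formal concept is a pair of sets that are fixed points of the two maps below. The toy context is invented for illustration.

      # Toy formal context: object -> set of attributes it has
      context = {"o1": {"a", "b"}, "o2": {"a", "c"}, "o3": {"a", "b", "c"}}
      all_attributes = set().union(*context.values())

      def common_attributes(objs):
          return set.intersection(*(context[o] for o in objs)) if objs else set(all_attributes)

      def common_objects(attrs):
          return {o for o, s in context.items() if attrs <= s}

      # A concept is a fixed point of the two derivation maps:
      extent = common_objects({"a", "b"})          # objects having both a and b
      intent = common_attributes(extent)           # attributes shared by that extent
      print(sorted(extent), sorted(intent))        # ['o1', 'o3'] ['a', 'b']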

  7. Perspectives on spatial data analysis

    CERN Document Server

    Rey, Sergio

    2010-01-01

    This book takes both a retrospective and prospective view of the field of spatial analysis by combining selected reprints of classic articles by Arthur Getis with current observations by leading experts in the field. Four main aspects are highlighted, dealing with spatial analysis, pattern analysis, local statistics as well as illustrative empirical applications. Researchers and students will gain an appreciation of Getis' methodological contributions to spatial analysis and the broad impact of the methods he has helped pioneer on an impressively broad array of disciplines including spatial epidemiology, demography, economics, and ecology. The volume is a compilation of high impact original contributions, as evidenced by citations, and the latest thinking on the field by leading scholars. This makes the book ideal for advanced seminars and courses in spatial analysis as well as a key resource for researchers seeking a comprehensive overview of recent advances and future directions in the field.

  8. Quantitative Analysis of cardiac SPECT

    International Nuclear Information System (INIS)

    Nekolla, S.G.; Bengel, F.M.

    2004-01-01

    The quantitative analysis of myocardial SPECT images is a powerful tool to extract the highly specific radiotracer uptake in these studies. When compared to normal databases, the uptake values can be calibrated on an individual basis, which substantially increases the reproducibility of the analysis. Based on the development over the last three decades, starting from planar scintigraphy, this paper discusses the methods used today, incorporating the changes due to tomographic image acquisition. Finally, the limitations of these approaches, as well as the consequences of the most recent hardware developments, commercial analysis packages and a wider view of the description of the left ventricle, are discussed. (orig.)

  9. Backyard housing in Gauteng: An analysis of spatial dynamics

    African Journals Online (AJOL)

    Backyard housing in Gauteng: An analysis of spatial dynamics. Yasmin Shapurjee ... Drawing on quantitative geo-demographic data from GeoTerraImage (GTI) (2010) ... a fundamental role in absorbing demand for low-income ...

  10. Using 3D spatial correlations to improve the noise robustness of multi-component analysis of 3D multi-echo quantitative T2 relaxometry data.

    Science.gov (United States)

    Kumar, Dushyant; Hariharan, Hari; Faizy, Tobias D; Borchert, Patrick; Siemonsen, Susanne; Fiehler, Jens; Reddy, Ravinder; Sedlacik, Jan

    2018-05-12

    We present a computationally feasible and iterative multi-voxel spatially regularized algorithm for myelin water fraction (MWF) reconstruction. This method utilizes 3D spatial correlations present in anatomical/pathological tissues and the underlying B1+ (flip angle) inhomogeneity to enhance the noise robustness of the reconstruction, while intrinsically accounting for stimulated echo contributions using T2-distribution data alone. Simulated data and in vivo data acquired using 3D non-selective multi-echo spin echo (3DNS-MESE) were used to compare the reconstruction quality of the proposed approach against those of the popular algorithm (the method by Prasloski et al.) and our previously proposed 2D multi-slice spatial regularization approach. We also investigated whether inter-sequence correlations and agreements improved as a result of the proposed approach. MWF quantifications from two sequences, 3DNS-MESE vs 3DNS gradient and spin echo (3DNS-GRASE), were compared for both reconstruction approaches to assess correlations and agreements between inter-sequence MWF-value pairs. MWF values from whole-brain data of six volunteers and two multiple sclerosis patients are reported as well. In comparison with competing approaches such as Prasloski's method or our previously proposed 2D multi-slice spatial regularization method, the proposed method showed better agreement with simulated truths in regression and Bland-Altman analyses. For 3DNS-MESE data, MWF maps reconstructed using the proposed algorithm provided better depictions of white matter structures in subcortical areas adjoining gray matter, which agreed more closely with corresponding contrasts on T2-weighted images than MWF maps reconstructed with the method by Prasloski et al. We also achieved a higher level of correlation and agreement between inter-sequence (3DNS-MESE vs 3DNS-GRASE) MWF-value pairs. The proposed algorithm provides more noise
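
    For orientation, the conventional single-voxel baseline that spatially regularized methods improve upon fits a non-negative T2 distribution by NNLS and integrates the short-T2 pool to estimate the MWF, as sketched below. The echo times, T2 grid and 40 ms cutoff are assumptions for illustration, and no B1+ or stimulated-echo correction is included here.

      import numpy as np
      from scipy.optimize import nnls

      te = np.arange(10.0, 330.0, 10.0)                 # echo times (ms), assumed protocol
      t2_grid = np.logspace(np.log10(10), np.log10(2000), 60)
      A = np.exp(-te[:, None] / t2_grid[None, :])       # dictionary of exponential decays

      rng = np.random.default_rng(0)
      truth = 0.15 * np.exp(-te / 20) + 0.85 * np.exp(-te / 80)   # 15% myelin water
      signal = truth + rng.normal(0, 0.001, te.size)

      spectrum, _ = nnls(A, signal)                     # non-negative T2 distribution
      mwf = spectrum[t2_grid < 40].sum() / spectrum.sum()
      print(f"estimated MWF: {mwf:.2f}")                # 40 ms cutoff is a common convention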

  11. A quantitative method for determining spatial discriminative capacity

    Directory of Open Access Journals (Sweden)

    Dennis Robert G

    2008-03-01

    Background: The traditional two-point discrimination (TPD) test, a widely used tactile spatial acuity measure, has been criticized as imprecise because it is based on subjective criteria and involves a number of non-spatial cues. The results of a recent study showed that as two stimuli were delivered simultaneously, vibrotactile amplitude discrimination became worse when the two stimuli were positioned relatively close together and was significantly degraded when the probes were within a subject's two-point limen. The impairment of amplitude discrimination with decreasing inter-probe distance suggested that the metric of amplitude discrimination could possibly provide a means of objective and quantitative measurement of spatial discrimination capacity. Methods: A two-alternative forced-choice (2AFC) tracking procedure was used to assess a subject's ability to discriminate the amplitude difference between two stimuli positioned at near-adjacent skin sites. Two 25 Hz flutter stimuli, identical except for a constant difference in amplitude, were delivered simultaneously to the hand dorsum. The stimuli were initially spaced 30 mm apart, and the inter-stimulus distance was modified on a trial-by-trial basis based on the subject's performance in discriminating the stimulus with higher intensity. The experiment was repeated via sequential, rather than simultaneous, delivery of the same vibrotactile stimuli. Results: The performance of the amplitude discrimination task was significantly degraded when the stimuli were delivered simultaneously and were near a subject's two-point limen. In contrast, subjects were able to correctly discriminate between the amplitudes of the two stimuli when they were sequentially delivered at all inter-probe distances (including those within the two-point limen), and improved when an adapting stimulus was delivered prior to simultaneously delivered stimuli. Conclusion:
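
    The 2AFC tracking procedure can be sketched as an adaptive staircase: the inter-probe distance shrinks after correct responses and grows after errors. The simulated observer, step size and stopping rule below are invented stand-ins for a real subject and the study's exact tracking rule.

      import random

      random.seed(1)

      def trial_correct(distance_mm, jnd_mm=5.0):
          # Toy observer: discrimination gets easier as probe separation grows
          p_correct = 0.5 + 0.5 * min(distance_mm / (2 * jnd_mm), 1.0)
          return random.random() < p_correct

      distance = 30.0                        # probes start 30 mm apart, as in the study
      track = []
      for _ in range(40):
          distance += -2.0 if trial_correct(distance) else 2.0   # 1-down/1-up rule (illustrative)
          distance = max(distance, 1.0)
          track.append(distance)
      print(f"track converges near {sum(track[-10:]) / 10:.1f} mm")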

  12. Spatial analysis of weed patterns

    NARCIS (Netherlands)

    Heijting, S.

    2007-01-01

    Keywords: Spatial analysis, weed patterns, Mead’s test, space-time correlograms, 2-D correlograms, dispersal, Generalized Linear Models, heterogeneity, soil, Taylor’s power law. Weeds in agriculture occur in patches. This thesis is a contribution to the characterization of this patchiness, to its

  13. Quantitative pulsed eddy current analysis

    International Nuclear Information System (INIS)

    Morris, R.A.

    1975-01-01

    The potential of pulsed eddy current testing for furnishing more information than conventional single-frequency eddy current methods has been known for some time. However, a fundamental problem has been analyzing the pulse shape with sufficient precision to produce accurate quantitative results. Accordingly, the primary goals of this investigation were to demonstrate ways of digitizing the short pulses encountered in PEC testing, and to develop empirical analysis techniques that would predict some of the parameters (e.g., depth) of simple types of defect. This report describes a digitizing technique using a computer and either a conventional nuclear ADC or a fast transient analyzer; the computer software used to collect and analyze pulses; and some of the results obtained. (U.S.)

  14. Progress in spatial analysis methods and applications

    CERN Document Server

    Páez, Antonio; Buliung, Ron N; Dall'erba, Sandy

    2010-01-01

    This book brings together developments in spatial analysis techniques, including spatial statistics, econometrics, and spatial visualization, and applications to fields such as regional studies, transportation and land use, population and health.

  15. Diagnostic performance of semi-quantitative and quantitative stress CMR perfusion analysis: a meta-analysis.

    Science.gov (United States)

    van Dijk, R; van Assen, M; Vliegenthart, R; de Bock, G H; van der Harst, P; Oudkerk, M

    2017-11-27

    Stress cardiovascular magnetic resonance (CMR) perfusion imaging is a promising modality for the evaluation of coronary artery disease (CAD) due to its high spatial resolution and absence of radiation. Semi-quantitative and quantitative analysis of CMR perfusion are based on signal-intensity curves produced during the first pass of gadolinium contrast. Multiple semi-quantitative and quantitative parameters have been introduced. The diagnostic performance of these parameters varies extensively among studies, and standardized protocols are lacking. This study aims to determine the diagnostic accuracy of semi-quantitative and quantitative CMR perfusion parameters, compared to multiple reference standards. Pubmed, WebOfScience, and Embase were systematically searched using predefined criteria (3272 articles). A check for duplicates was performed (1967 articles). Eligibility and relevance of the articles were determined by two reviewers using pre-defined criteria. The primary data extraction was performed independently by two researchers with the use of a predefined template. Differences in extracted data were resolved by discussion between the two researchers. The quality of the included studies was assessed using the 'Quality Assessment of Diagnostic Accuracy Studies Tool' (QUADAS-2). True positives, false positives, true negatives, and false negatives were extracted or calculated from the articles. The principal summary measures used to assess diagnostic accuracy were sensitivity, specificity, and area under the receiver operating curve (AUC). Data were pooled according to analysis territory, reference standard and perfusion parameter. Twenty-two articles were eligible based on the predefined study eligibility criteria. The pooled diagnostic accuracy for segment-, territory- and patient-based analyses showed good diagnostic performance with sensitivity of 0.88, 0.82, and 0.83, specificity of 0.72, 0.83, and 0.76 and AUC of 0.90, 0.84, and 0.87, respectively. In per territory
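
    The simplest form of the pooling step is summing the 2x2 counts across studies before computing sensitivity and specificity, as sketched below with invented counts. A formal meta-analysis of this kind would instead use a bivariate random-effects model; this sketch only shows the arithmetic.

      import numpy as np

      # Invented 2x2 counts per study: (TP, FP, FN, TN)
      studies = np.array([
          [80, 20, 12, 88],
          [45, 10,  9, 61],
          [66, 15, 14, 70],
      ])
      tp, fp, fn, tn = studies.sum(axis=0)
      print(f"pooled sensitivity {tp / (tp + fn):.2f}, specificity {tn / (tn + fp):.2f}")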

  16. Stereological analysis of spatial structures

    DEFF Research Database (Denmark)

    Hansen, Linda Vadgård

    The thesis deals with stereological analysis of spatial structures. One area of focus has been to improve the precision of well-known stereological estimators by including information that is available via automatic image analysis. Furthermore, the thesis presents a stochastic model for star-shaped three-dimensional objects using the radial function. The model appears highly flexible, in the sense that it can be used to describe objects with arbitrarily irregular surfaces. Results on the distribution of well-known local stereological volume estimators are provided.

  17. Methods in quantitative image analysis.

    Science.gov (United States)

    Oberholzer, M; Ostreicher, M; Christen, H; Brühlmann, M

    1996-05-01

    The main steps of image analysis are image capturing, image storage (compression), correcting imaging defects (e.g. non-uniform illumination, electronic noise, glare effect), image enhancement, segmentation of objects in the image and image measurements. Digitisation is performed by a camera. The most modern types include a frame-grabber, converting the analog signal into digital (numerical) information. The numerical information consists of the grey values describing the brightness of every point within the image, named a pixel. The information is stored in bits, with eight bits summarised in one byte. Grey values can therefore take one of 256 (2⁸) levels, from 0 to 255. The human eye seems to be quite content with a display of about 64 different grey values (roughly 6-bit images). In a digitised image, the pixel grey values can vary within regions that are uniform in the original scene: the image is noisy. The noise is mainly manifested in the background of the image. For an optimal discrimination between different objects or features in an image, uniformity of illumination in the whole image is required. These defects can be minimised by shading correction [subtraction of a background (white) image from the original image, pixel per pixel, or division of the original image by the background image]. The brightness of an image, represented by its grey values, can be analysed for every single pixel or for a group of pixels. The most frequently used pixel-based image descriptors are optical density, integrated optical density, the histogram of the grey values, mean grey value and entropy. The distribution of the grey values existing within an image is one of the most important characteristics of the image. However, the histogram gives no information about the texture of the image. The simplest way to improve the contrast of an image is to expand the brightness scale by spreading the histogram out to the full available range. Rules for transforming the grey value
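
    Two of the steps described above, shading correction by dividing out a background image and contrast stretching by expanding the histogram to the full range, can be sketched in a few lines. The image and background below are synthetic stand-ins for a camera frame and a "white" reference image.

      import numpy as np

      rng = np.random.default_rng(0)
      img = np.clip(rng.normal(120, 20, (256, 256)), 0, 255)   # synthetic grey-value image
      background = np.full_like(img, 140.0)                    # synthetic "white" background

      # Shading correction by division, rescaled back to the 8-bit range
      flat = img / background
      flat = 255 * (flat - flat.min()) / (flat.max() - flat.min())

      # Contrast stretch: spread the histogram over the full 0-255 range
      stretched = 255 * (img - img.min()) / (img.max() - img.min())
      print(flat.astype(np.uint8).min(), stretched.astype(np.uint8).max())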

  18. Quantitative analysis of coupler tuning

    International Nuclear Information System (INIS)

    Zheng Shuxin; Cui Yupeng; Chen Huaibi; Xiao Liling

    2001-01-01

    On the basis of a coupling-cavity chain equivalent-circuit model, the author derives an equation relating the coupler frequency deviation Δf and the coupling coefficient β, instead of only giving the adjusting direction in the process of matching the coupler. According to this equation, automatic measurement and quantitative display are realized in a measuring system. This contributes to the industrialization of traveling-wave accelerators for large container inspection systems

  19. Spatial analysis statistics, visualization, and computational methods

    CERN Document Server

    Oyana, Tonny J

    2015-01-01

    An introductory text for the next generation of geospatial analysts and data scientists, Spatial Analysis: Statistics, Visualization, and Computational Methods focuses on the fundamentals of spatial analysis using traditional, contemporary, and computational methods. Outlining both non-spatial and spatial statistical concepts, the authors present practical applications of geospatial data tools, techniques, and strategies in geographic studies. They offer a problem-based learning (PBL) approach to spatial analysis-containing hands-on problem-sets that can be worked out in MS Excel or ArcGIS-as well as detailed illustrations and numerous case studies. The book enables readers to: Identify types and characterize non-spatial and spatial data Demonstrate their competence to explore, visualize, summarize, analyze, optimize, and clearly present statistical data and results Construct testable hypotheses that require inferential statistical analysis Process spatial data, extract explanatory variables, conduct statisti...

  20. Evaluation of the Possibility of Applying Spatial 3D Imaging Using X-Ray Computed Tomography Reconstruction Methods for Quantitative Analysis of Multiphase Materials

    Directory of Open Access Journals (Sweden)

    Matysik P.

    2015-12-01

    In this paper the possibility of using X-ray computed tomography (CT) in quantitative metallographic studies of homogeneous and composite materials is presented. Samples of spheroidal cast iron, a Fe-Ti powder mixture compact and an epoxy composite reinforced with glass fibers were subjected to comparative structural tests. Volume fractions of each of the phase structure components were determined by conventional methods with the use of scanning electron microscopy (SEM) and X-ray diffraction (XRD) quantitative analysis methods. These results were compared with those obtained by spatial analysis of the reconstructed CT image. Based on the comparative analysis, taking into account the selectivity of the data verification methods and the accuracy of the obtained results, the authors conclude that computed tomography is suitable for the quantitative analysis of several types of structural materials.
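
    Once a CT reconstruction has been segmented into phase labels, the volume-fraction measurement reduces to counting voxels per label, as sketched below on a synthetic labelled volume; the phase names are illustrative only.

      import numpy as np

      # Stand-in for a segmented CT volume: 0 = matrix, 1 = second phase, 2 = porosity
      volume = np.random.default_rng(0).integers(0, 3, size=(64, 64, 64))
      labels, counts = np.unique(volume, return_counts=True)
      for label, count in zip(labels, counts):
          print(f"phase {label}: {count / volume.size:.1%} by volume")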

  1. High spatial resolution quantitative MR images: an experimental study of dedicated surface coils

    International Nuclear Information System (INIS)

    Gensanne, D; Josse, G; Lagarde, J M; Vincensini, D

    2006-01-01

    Measuring spin-spin relaxation times (T2) by quantitative MR imaging represents a potentially efficient tool to evaluate the physicochemical properties of various media. However, noise in MR images is responsible for uncertainties in the determination of T2 relaxation times, which limits the accuracy of parametric tissue analysis. The required signal-to-noise ratio (SNR) depends on the T2 relaxation behaviour specific to each tissue. Thus, we have previously shown that keeping the uncertainty in T2 measurements within a limit of 10% requires SNR values greater than 100 and 300 for mono- and biexponential T2 relaxation behaviours, respectively. Noise reduction can be obtained either by increasing the voxel size (i.e., at the expense of spatial resolution) or by using high-sensitivity dedicated surface coils (which allow SNR to be increased without deteriorating spatial resolution in an excessive manner). However, surface coil sensitivity is heterogeneous: it, and hence SNR, decreases with increasing depth, the more so as the coil radius is smaller. The use of surface coils is therefore limited to the analysis of superficial structures such as the hypodermic tissue analysed here. The aim of this work was to determine the maximum limits of spatial resolution and depth compatible with reliable in vivo T2 quantitative MR images using dedicated surface coils available on various clinical MR scanners. The average thickness of adipose tissue is around 15 mm, and the results obtained have shown that reliable biexponential relaxation analysis requires a minimum achievable voxel size of 13 mm³ for a conventional volume birdcage coil and only 1.7 mm³ for the smallest available surface coil (23 mm in diameter). Further improvement in spatial resolution, allowing low-contrast details to be detected in MR images without deteriorating parametric T2 images, can be obtained by image filtering. By using the non-linear selective blurring filter described in a
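
    The biexponential relaxation analysis whose SNR requirements the study quantifies can be sketched as a nonlinear fit of two decaying components. The echo times, component fractions and noise level below are assumptions chosen only to illustrate how noise limits the separation of the two T2 pools.

      import numpy as np
      from scipy.optimize import curve_fit

      def biexp(te, a1, t2a, a2, t2b):
          return a1 * np.exp(-te / t2a) + a2 * np.exp(-te / t2b)

      te = np.linspace(10, 320, 32)                       # echo times (ms), assumed
      rng = np.random.default_rng(0)
      clean = biexp(te, 0.7, 40.0, 0.3, 180.0)
      signal = clean + rng.normal(0, 0.002, te.size)      # noise level sets the effective SNR

      popt, _ = curve_fit(biexp, te, signal, p0=(0.5, 30.0, 0.5, 150.0))
      print(np.round(popt, 1))   # raise the noise and the two T2 components blur together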

  2. Quantitative analysis of boron by neutron radiography

    International Nuclear Information System (INIS)

    Bayuelken, A.; Boeck, H.; Schachner, H.; Buchberger, T.

    1990-01-01

    The quantitative determination of boron in ores is a lengthy process when chemical analysis techniques are used. As nuclear techniques like X-ray fluorescence and activation analysis are not applicable to boron, only the neutron radiography technique, which exploits the high neutron absorption cross section of this element, can be applied for quantitative determinations. This paper describes preliminary tests and calibration experiments carried out at a 250 kW TRIGA reactor. (orig.)

  3. Spatial analysis and planning under imprecision

    CERN Document Server

    Leung, Y

    1988-01-01

    The book deals with complexity, imprecision, human valuation, and uncertainty in spatial analysis and planning, providing a systematic exposition of a new philosophical and theoretical foundation for spatial analysis and planning under imprecision. Regional concepts and regionalization, spatial preference-utility-choice structures, spatial optimization with single and multiple objectives, and dynamic spatial systems and their controls are analyzed in sequence. The analytical framework is based on fuzzy set theory, whose basic concepts are discussed first. Many numerical examples and emp

  4. Issues in Quantitative Analysis of Ultraviolet Imager (UVI) Data: Airglow

    Science.gov (United States)

    Germany, G. A.; Richards, P. G.; Spann, J. F.; Brittnacher, M. J.; Parks, G. K.

    1999-01-01

    The GGS Ultraviolet Imager (UVI) has proven to be especially valuable in correlative substorm, auroral morphology, and extended statistical studies of the auroral regions. Such studies are based on knowledge of the location and the spatial and temporal behavior of auroral emissions. More quantitative studies, based on absolute radiometric intensities from UVI images, require a more intimate knowledge of instrument behavior and data processing requirements, and are inherently more difficult than studies based on relative knowledge of the oval location. In this study, UVI airglow observations are analyzed and compared with model predictions to illustrate issues that arise in quantitative analysis of UVI images. These issues include instrument calibration, long-term changes in sensitivity, and imager flat-field response, as well as proper background correction. Airglow emissions are chosen for this study because of their relatively straightforward modeling requirements and because of their implications for thermospheric compositional studies. The analysis issues discussed here, however, are identical to those faced in quantitative auroral studies.

  5. Quantitative protein localization signatures reveal an association between spatial and functional divergences of proteins.

    Science.gov (United States)

    Loo, Lit-Hsin; Laksameethanasan, Danai; Tung, Yi-Ling

    2014-03-01

    Protein subcellular localization is a major determinant of protein function. However, this important protein feature is often described in terms of discrete and qualitative categories of subcellular compartments, and therefore it has limited applications in quantitative protein function analyses. Here, we present Protein Localization Analysis and Search Tools (PLAST), an automated analysis framework for constructing and comparing quantitative signatures of protein subcellular localization patterns based on microscopy images. PLAST produces human-interpretable protein localization maps that quantitatively describe the similarities in the localization patterns of proteins and major subcellular compartments, without requiring manual assignment or supervised learning of these compartments. Using the budding yeast Saccharomyces cerevisiae as a model system, we show that PLAST is more accurate than existing, qualitative protein localization annotations in identifying known co-localized proteins. Furthermore, we demonstrate that PLAST can reveal protein localization-function relationships that are not obvious from these annotations. First, we identified proteins that have similar localization patterns and participate in closely-related biological processes, but do not necessarily form stable complexes with each other or localize at the same organelles. Second, we found an association between spatial and functional divergences of proteins during evolution. Surprisingly, as proteins with common ancestors evolve, they tend to develop more diverged subcellular localization patterns, but still occupy similar numbers of compartments. This suggests that divergence of protein localization might be more frequently due to the development of more specific localization patterns over ancestral compartments than the occupation of new compartments. PLAST enables systematic and quantitative analyses of protein localization-function relationships, and will be useful to elucidate protein

  6. Quantitative single shot and spatially resolved plasma wakefield diagnostics

    CERN Document Server

    Kasim, Muhammad Firmansyah; Ceurvorst, Luke; Levy, Matthew C; Ratan, Naren; Sadler, James; Bingham, Robert; Burrows, Philip N; Trines, Raoul; Wing, Matthew; Norreys, Peter

    2015-01-01

    Diagnosing plasma conditions can give great advantages in optimizing plasma wakefield accelerator experiments. One possible method is that of photon acceleration. By propagating a laser probe pulse through a plasma wakefield and extracting the imposed frequency modulation, one can obtain an image of the density modulation of the wakefield. In order to diagnose the wakefield parameters at a chosen point in the plasma, the probe pulse crosses the plasma at oblique angles relative to the wakefield. In this paper, mathematical expressions relating the frequency modulation of the laser pulse and the wakefield density profile of the plasma for oblique crossing angles are derived. Multidimensional particle-in-cell simulation results presented in this paper confirm that the frequency modulation profiles and the density modulation profiles agree to within 10%. Limitations to the accuracy of the measurement are discussed in this paper. This technique opens new possibilities to quantitatively diagnose the plasma wakefie...

  7. Reactor applications of quantitative diffraction analysis

    International Nuclear Information System (INIS)

    Ferguson, I.F.

    1976-09-01

    Current work in quantitative diffraction analysis was presented under the main headings of: thermal systems, fast reactor systems, SGHWR applications and irradiation damage. Preliminary results are included on a comparison of various new instrumental methods of boron analysis as well as preliminary new results on Zircaloy corrosion, and materials transfer in liquid sodium. (author)

  8. Quantitative image analysis of synovial tissue

    NARCIS (Netherlands)

    van der Hall, Pascal O.; Kraan, Maarten C.; Tak, Paul Peter

    2007-01-01

    Quantitative image analysis is a form of imaging that includes microscopic histological quantification, video microscopy, image analysis, and image processing. Hallmarks are the generation of reliable, reproducible, and efficient measurements via strict calibration and step-by-step control of the

  9. Spatial data analysis for exploration of regional scale geothermal resources

    Science.gov (United States)

    Moghaddam, Majid Kiavarz; Noorollahi, Younes; Samadzadegan, Farhad; Sharifi, Mohammad Ali; Itoi, Ryuichi

    2013-10-01

    Defining a comprehensive conceptual model of the resources sought is one of the most important steps in geothermal potential mapping. In this study, Fry analysis as a spatial distribution method, and 5% well existence, distance distribution, weights of evidence (WofE), and evidential belief functions (EBFs) as spatial association methods, were applied comparatively to known geothermal occurrences and to publicly-available regional-scale geoscience data in Akita and Iwate provinces within the Tohoku volcanic arc, in northern Japan. Fry analysis and rose diagrams revealed similar directional patterns of geothermal wells and volcanoes, NNW-, NNE-, NE-trending faults, hot springs and fumaroles. Among the spatial association methods, WofE defined a conceptual model corresponding to real-world conditions, as confirmed with the aid of expert opinion. The results of the spatial association analyses quantitatively indicated that the known geothermal occurrences are strongly spatially associated with geological features such as volcanoes, craters and NNW-, NNE-, NE-direction faults, and with geochemical features such as hot springs, hydrothermal alteration zones and fumaroles. The geophysical evidence comprises temperature gradients over 100 °C/km and heat flow over 100 mW/m². In general, geochemical and geophysical data were better evidence layers than geological data for exploring geothermal resources. The spatial analyses of the case study area suggest that quantitative knowledge from hydrothermal geothermal resources is significantly useful for further exploration and for geothermal potential mapping in the study region. The results can also be extended to regions with broadly similar characteristics.
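
    The weights-of-evidence association measure used here has a direct definition: W+ = ln[P(B|D)/P(B|~D)] and W- = ln[P(~B|D)/P(~B|~D)], with the contrast C = W+ - W- summarizing the strength of spatial association between an evidence layer B and known occurrences D. The sketch below evaluates these on synthetic binary rasters; the evidence layer named in the comment is hypothetical.

      import numpy as np

      rng = np.random.default_rng(0)
      # Synthetic binary rasters flattened to 1-D: evidence B (e.g. "within 2 km of a
      # NNE-trending fault") and known occurrences D, with a built-in positive association
      B = rng.random(10_000) < 0.3
      D = rng.random(10_000) < (0.02 + 0.05 * B)

      w_plus  = np.log(B[D].mean() / B[~D].mean())          # W+ = ln[P(B|D) / P(B|~D)]
      w_minus = np.log((1 - B[D].mean()) / (1 - B[~D].mean()))
      contrast = w_plus - w_minus                           # C summarizes the association
      print(f"W+ {w_plus:.2f}, W- {w_minus:.2f}, contrast {contrast:.2f}")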

  10. Quantitative X-ray analysis of pigments

    International Nuclear Information System (INIS)

    Araujo, M. Marrocos de

    1987-01-01

    The 'matrix-flushing' and 'adiabatic principle' methods have been applied to the quantitative analysis, through X-ray diffraction patterns, of mixtures of pigments and extenders frequently used in the paint industry. The results obtained have shown the usefulness of these methods, although their accuracy still needs improvement. (Author) [pt

  11. Uncertainties in elemental quantitative analysis by PIXE

    International Nuclear Information System (INIS)

    Montenegro, E.C.; Baptista, G.B.; Paschoa, A.S.; Barros Leite, C.V.

    1979-01-01

    The effects of the degree of non-uniformity of the particle beam, matrix composition and matrix thickness in quantitative elemental analysis by particle-induced X-ray emission (PIXE) are discussed, and a criterion to evaluate the resulting degree of uncertainty in the mass determination by this method is established. (Auth.)

  12. Hierarchical modeling and analysis for spatial data

    CERN Document Server

    Banerjee, Sudipto; Gelfand, Alan E

    2003-01-01

    Among the many uses of hierarchical modeling, their application to the statistical analysis of spatial and spatio-temporal data from areas such as epidemiology and environmental science has proven particularly fruitful. Yet to date, the few books that address the subject have been either too narrowly focused on specific aspects of spatial analysis, or written at a level often inaccessible to those lacking a strong background in mathematical statistics. Hierarchical Modeling and Analysis for Spatial Data is the first accessible, self-contained treatment of hierarchical methods, modeling, and dat

  13. New quantitative approaches reveal the spatial preference of nuclear compartments in mammalian fibroblasts.

    Science.gov (United States)

    Weston, David J; Russell, Richard A; Batty, Elizabeth; Jensen, Kirsten; Stephens, David A; Adams, Niall M; Freemont, Paul S

    2015-03-06

    The nuclei of higher eukaryotic cells display compartmentalization and certain nuclear compartments have been shown to follow a degree of spatial organization. To date, the study of nuclear organization has often involved simple quantitative procedures that struggle with both the irregularity of the nuclear boundary and the problem of handling replicate images. Such studies typically focus on inter-object distance, rather than spatial location within the nucleus. The concern of this paper is the spatial preference of nuclear compartments, for which we have developed statistical tools to quantitatively study and explore nuclear organization. These tools combine replicate images to generate 'aggregate maps' which represent the spatial preferences of nuclear compartments. We present two examples of different compartments in mammalian fibroblasts (WI-38 and MRC-5) that demonstrate new knowledge of spatial preference within the cell nucleus. Specifically, the spatial preference of RNA polymerase II is preserved across normal and immortalized cells, whereas PML nuclear bodies exhibit a change in spatial preference from avoiding the centre in normal cells to exhibiting a preference for the centre in immortalized cells. In addition, we show that SC35 splicing speckles are excluded from the nuclear boundary and localize throughout the nucleoplasm and in the interchromatin space in non-transformed WI-38 cells. This new methodology is thus able to reveal the effect of large-scale perturbation on spatial architecture and preferences that would not be obvious from single cell imaging.

  14. Quantitative possibility analysis. Present status in ESCA

    International Nuclear Information System (INIS)

    Brion, D.

    1981-01-01

    A short review of recent developments in the quantification of X-ray photoelectron spectroscopy (ESCA) is presented. The basic equations are recalled. Each parameter involved (photoionisation, inelastic mean free paths, the 'response function' of the instruments, intensity measurement) is discussed separately in relation to the accuracy and precision of the method. Other topics are considered, such as roughness, surface contamination, matrix effects and inhomogeneous composition. Some aspects of quantitative ESCA analysis and AES analysis are compared [fr

  15. Quantitative phase analysis in industrial research

    International Nuclear Information System (INIS)

    Ahmad Monshi

    1996-01-01

    X-Ray Diffraction (XRD) is the only technique able to identify phases; all the other analytical techniques give information about the elements. Quantitative phase analysis of minerals and industrial products is logically the next step after a qualitative examination and is of great importance in industrial research. Since the application of XRD in industry, early in this century, workers have been trying to develop quantitative XRD methods. In this paper some of the important methods are briefly discussed and partly compared. These methods are Internal Standard, Known Additions, Double Dilution, External Standard, Direct Comparison, Diffraction Absorption and Ratio of Slopes.

  16. Spatial analysis of various multiplex cinema types

    Directory of Open Access Journals (Sweden)

    Young-Seo Park

    2016-03-01

    Full Text Available This study identifies the spatial characteristics and relationships of each used space according to the multiplex type. Multiplexes are classified according to their screen rooms and circulation systems, and each used space is quantitatively analyzed. The multiplex type, based on screen rooms and circulation systems, influences the relationship and characteristics of each used space in various ways. In particular, the structure of the used space of multiplexes has a significant effect on profit generation and audience convenience.

  17. High resolution or optimum resolution? Spatial analysis of the Federmesser site at Andernach, Germany

    NARCIS (Netherlands)

    Stapert, D; Street, M

    1997-01-01

    This paper discusses spatial analysis at site level. It is suggested that spatial analysis has to proceed at several levels, from global to more detailed questions, and that an optimum resolution should be established when applying any quantitative method in this field. As an example, the ring and

  18. A spatially explicit and quantitative vulnerability assessment of ecosystem service change in Europe

    NARCIS (Netherlands)

    Metzger, M.J.; Schröter, D.; Leemans, R.; Cramer, W.

    2008-01-01

    Environmental change alters ecosystem functioning and may put the provision of services to humans at risk. This paper presents a spatially explicit and quantitative assessment of the corresponding vulnerability for Europe, using a new framework designed to answer multidisciplinary policy relevant

  19. Quantitative studies with the gamma-camera: correction for spatial and energy distortion

    International Nuclear Information System (INIS)

    Soussaline, F.; Todd-Pokropek, A.E.; Raynaud, C.

    1977-01-01

    The gamma camera sensitivity distribution is an important source of error in quantitative studies. In addition, spatial distortion produces apparent variations in count density which degrade quantitative studies. The flood field image takes both effects into account and is influenced by the pile-up of the tail distribution, so it is essential to measure each of these parameters separately. They were investigated using a point source displaced by a special scanning table with two X, Y stepping motors of 10 micron precision. The spatial distributions of the sensitivity, the spatial distortion and the photopeak in the field of view were measured and compared for different camera set-ups and PM gains. For well-tuned cameras, the sensitivity is fairly constant, while the variations appearing in the flood field image are primarily due to spatial distortion, the former being more dependent than the latter on the energy window setting. This indicates why conventional flood field uniformity correction must not be applied. A correction technique to improve the results in quantitative studies has been tested, using a continuously matched energy window at every point within the field. A method for correcting spatial distortion is also proposed, whereby, after an adequately sampled measurement of this error, a transformation can be applied to calculate the true position of events. Knowledge of the magnitude of these parameters is essential in the routine use and design of detector systems
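
    The proposed distortion correction (measure the displacement field with a stepped point source, then transform detected events back to their true positions) can be sketched as an interpolation of that field. A minimal illustration, assuming scattered calibration points; the sinusoidal distortion and all coordinates are invented stand-ins for measured data.

```python
import numpy as np
from scipy.interpolate import griddata

# Measured with a stepped point source: true grid positions vs. apparent
# (detected) centroids. The distortion below is purely illustrative.
true_xy = np.array([[x, y] for x in range(0, 50, 10) for y in range(0, 50, 10)], float)
apparent_xy = true_xy + 0.5 * np.sin(true_xy / 8.0)   # stand-in distortion field

def undistort(events_xy):
    """Map apparent event coordinates back to true positions by linearly
    interpolating the measured displacement field (true - apparent)."""
    displacement = true_xy - apparent_xy
    dx = griddata(apparent_xy, displacement[:, 0], events_xy, method="linear")
    dy = griddata(apparent_xy, displacement[:, 1], events_xy, method="linear")
    return events_xy + np.column_stack([dx, dy])

events = np.array([[12.3, 24.7], [31.0, 8.2]])   # hypothetical detected events
print(undistort(events))
```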

  20. Quantitative analysis of untreated bio-samples

    International Nuclear Information System (INIS)

    Sera, K.; Futatsugawa, S.; Matsuda, K.

    1999-01-01

    A standard-free method of quantitative analysis for untreated samples has been developed. For hair samples, measurements were performed by irradiating a few hairs, as received, with a proton beam, and quantitative analysis was carried out by means of a standard-free method developed by the authors. First, the concentration of zinc was derived; the concentrations of the other elements were then obtained by regarding zinc as an internal standard. As a result, the sulphur concentrations for 40 samples agree well with the average value for a typical Japanese subject, and with each other to within 20%, confirming the validity of the present method. Accuracy was also confirmed by comparing the results with those obtained by the usual internal standard method. For the surface analysis of a bone sample, a very small incidence angle of the proton beam was used, so that both the energy loss of the projectile and the self-absorption of X-rays become negligible. As a result, consistent concentration values for many elements were obtained by the standard-free method
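
    The internal-standard step described above, fixing the zinc concentration first and then scaling the other elements by their relative yields, amounts to a single ratio. A hedged sketch; the sensitivity factors and yields below are hypothetical, and real PIXE work would fold in cross-sections, fluorescence yields, detector efficiency and absorption.

```python
def conc_via_internal_standard(y_elem, s_elem, y_zn, s_zn, c_zn):
    """Concentration of an element referenced to the zinc internal standard:
    c_x = c_Zn * (Y_x / S_x) / (Y_Zn / S_Zn), with Y measured X-ray yields
    and S relative sensitivity factors."""
    return c_zn * (y_elem / s_elem) / (y_zn / s_zn)

# Hypothetical yields (counts) and sensitivities; c_zn in ppm
print(conc_via_internal_standard(y_elem=5.2e4, s_elem=1.8,
                                 y_zn=3.1e4, s_zn=1.0, c_zn=160.0))
```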

  1. Spatial Analysis Methods of Road Traffic Collisions

    DEFF Research Database (Denmark)

    Loo, Becky P. Y.; Anderson, Tessa Kate

    Spatial Analysis Methods of Road Traffic Collisions centers on the geographical nature of road crashes, and uses spatial methods to provide a greater understanding of the patterns and processes that cause them. Written by internationally known experts in the field of transport geography, the book ... outlines the key issues in identifying hazardous road locations (HRLs), considers current approaches used for reducing and preventing road traffic collisions, and outlines a strategy for improved road safety. The book covers spatial accuracy, validation, and other statistical issues, as well as link...

  2. Spatial access priority mapping (SAPM) with fishers: a quantitative GIS method for participatory planning.

    Science.gov (United States)

    Yates, Katherine L; Schoeman, David S

    2013-01-01

    Spatial management tools, such as marine spatial planning and marine protected areas, are playing an increasingly important role in attempts to improve marine management and accommodate conflicting needs. Robust data are needed to inform decisions among different planning options, and early inclusion of stakeholder involvement is widely regarded as vital for success. One of the biggest stakeholder groups, and the most likely to be adversely impacted by spatial restrictions, is the fishing community. In order to take their priorities into account, planners need to understand spatial variation in their perceived value of the sea. Here a readily accessible, novel method for quantitatively mapping fishers' spatial access priorities is presented. Spatial access priority mapping, or SAPM, uses only basic functions of standard spreadsheet and GIS software. Unlike the use of remote-sensing data, SAPM actively engages fishers in participatory mapping, documenting rather than inferring their priorities. By so doing, SAPM also facilitates the gathering of other useful data, such as local ecological knowledge. The method was tested and validated in Northern Ireland, where over 100 fishers participated in a semi-structured questionnaire and mapping exercise. The response rate was excellent, 97%, demonstrating fishers' willingness to be involved. The resultant maps are easily accessible and instantly informative, providing a very clear visual indication of which areas are most important for the fishers. The maps also provide quantitative data, which can be used to analyse the relative impact of different management options on the fishing industry and can be incorporated into planning software, such as MARXAN, to ensure that conservation goals can be met at minimum negative impact to the industry. This research shows how spatial access priority mapping can facilitate the early engagement of fishers and the ready incorporation of their priorities into the decision-making process

  3. Quantitative analysis of thallium-201 myocardial scintigraphy

    International Nuclear Information System (INIS)

    Kanemoto, Nariaki; Hoer, G.; Johost, S.; Maul, F.-D.; Standke, R.

    1981-01-01

    The method of quantitative analysis of thallium-201 myocardial scintigraphy using a computer-assisted technique is described. The calculated indices are the washout factor, the vitality index and the redistribution factor. The washout factor is the ratio of counts at a certain time after exercise to the counts immediately after exercise. This value is necessary for evaluating redistribution to the ischemic areas in serial imaging, to correct for Tl-201 washout from the myocardium under the assumption that washout is constant across the whole myocardium. The vitality index is the ratio between the Tl-201 uptake in the region of interest and the maximum uptake. The redistribution factor is the ratio of the redistribution in the region of interest in serial imaging after exercise to that immediately after exercise. Four examples of exercise Tl-201 myocardial scintigrams, with quantitative analyses before and after percutaneous transluminal coronary angioplasty, are presented. (author)
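
    The three indices are simple ratios of counts, so they can be stated directly in code. A minimal sketch following the definitions in the abstract; the count values are invented, and applying the washout correction inside the redistribution factor is one plausible reading of the described procedure.

```python
def washout_factor(counts_delayed, counts_initial):
    """Fraction of the initial post-exercise counts remaining at the
    delayed acquisition (assumed uniform across the myocardium)."""
    return counts_delayed / counts_initial

def vitality_index(roi_uptake, max_uptake):
    """ROI uptake normalised to the maximum myocardial uptake."""
    return roi_uptake / max_uptake

def redistribution_factor(roi_delayed, roi_initial, washout):
    """Delayed-to-initial ROI ratio, corrected for global washout so that
    true redistribution into ischemic areas stands out."""
    return (roi_delayed / roi_initial) / washout

w = washout_factor(counts_delayed=8.1e5, counts_initial=1.2e6)
print(w, vitality_index(420, 600), redistribution_factor(450, 430, w))
```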

  4. Quantitative analysis by nuclear magnetic resonance spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Wainai, T; Mashimo, K [Nihon Univ., Tokyo. Coll. of Science and Engineering

    1976-04-01

    Recent papers on practical quantitative analysis by nuclear magnetic resonance spectroscopy (NMR) are reviewed: specifically, the determination of moisture in liquid N₂O₄ used as an oxidizing agent for rocket propulsion, the analysis of hydroperoxides, quantitative analysis using a shift reagent, the analysis of aromatic sulfonates, and the determination of acids and bases. Attention is paid to accuracy. The sweep velocity and RF level, among other factors, must be set optimally to eliminate errors, particularly when the computation is done by machine. A higher sweep velocity is preferable for the S/N ratio, but should be limited to about 30 Hz/s. The relative error in the measurement of peak areas is generally 1%, but for dilute samples with signal integration it becomes roughly an order of magnitude smaller. If impurities are treated carefully, the water content in N₂O₄ can be determined with an accuracy of about 0.002%. Comparison of peak heights is as accurate as comparison of areas when the uniformity of the magnetic field and T₂ are not in question. When the chemical shift moves with concentration, the substance can be determined from the position of the chemical shift. Oil and water contents in rape-seed, peanuts, and sunflower-seed are determined by measuring T₁ with 90° pulses.
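
    Quantitation against an internal standard in 1H NMR rests on one relation between integrated areas, proton counts and molar masses. A small sketch of that standard relation; the maleic-acid standard and all numbers are hypothetical, not taken from the reviewed papers.

```python
def qnmr_mass(area_x, n_h_x, mw_x, area_std, n_h_std, mw_std, mass_std,
              purity_std=1.0):
    """Mass of analyte from integrated signal areas against an internal
    standard: m_x = m_std * (A_x/A_std) * (N_std/N_x) * (M_x/M_std)."""
    return (mass_std * purity_std * (area_x / area_std)
            * (n_h_std / n_h_x) * (mw_x / mw_std))

# Hypothetical: water (2H, M=18.02) against a maleic acid standard
# (2H, M=116.07), 10 mg of standard added
print(qnmr_mass(area_x=0.84, n_h_x=2, mw_x=18.02,
                area_std=1.00, n_h_std=2, mw_std=116.07, mass_std=10.0))
```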

  5. Quantitative analysis method for ship construction quality

    Directory of Open Access Journals (Sweden)

    FU Senzong

    2017-03-01

    Full Text Available The excellent performance of a ship is assured by the accurate evaluation of its construction quality. For a long time, research into the construction quality of ships has mainly focused on qualitative analysis due to a shortage of process data, which results from limited samples, varied process types and non-standardized processes. Aiming at predicting and controlling the influence of the construction process on the construction quality of ships, this article proposes a quantitative reliability analysis procedure for the ship construction process, together with a fuzzy calculation method. Based on the process-quality factor model proposed by the Function-Oriented Quality Control (FOQC) method, we combine fuzzy mathematics with the expert grading method to deduce formulations calculating the fuzzy process reliability of the ordinal connection model, the series connection model and the mixed connection model. The quantitative analysis method is applied to the process reliability of a ship's shaft gearbox installation, which demonstrates the applicability and effectiveness of the method. The analysis results can be a useful reference for setting key quality inspection points and optimizing key processes.

  6. Quantitative phase analysis by neutron diffraction

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Chang Hee; Song, Su Ho; Lee, Jin Ho; Shim, Hae Seop [Korea Atomic Energy Research Institute, Taejon (Korea)

    1999-06-01

    This study applies quantitative phase analysis (QPA) by neutron diffraction to the round-robin samples provided by the International Union of Crystallography (IUCr). We measured neutron diffraction patterns for mixed samples with several different weight percentages and their unique characteristic features. The neutron diffraction method has been known to be superior to complementary methods such as X-ray or synchrotron diffraction, but it is still accepted as highly reliable only under limited conditions or for particular samples. Neutron diffraction is especially capable on oxides, owing to the scattering cross-section of oxygen, and with these quantitative phase analysis techniques it can become an even stronger tool for the analysis of industrial materials. Through this study, we hope not only to perform an instrument performance test on our HRPD, but also to improve our ability to analyse neutron diffraction data by comparing our QPA results with those from other advanced reactor facilities. 14 refs., 4 figs., 6 tabs. (Author)

  7. Good practices for quantitative bias analysis.

    Science.gov (United States)

    Lash, Timothy L; Fox, Matthew P; MacLehose, Richard F; Maldonado, George; McCandless, Lawrence C; Greenland, Sander

    2014-12-01

    Quantitative bias analysis serves several objectives in epidemiological research. First, it provides a quantitative estimate of the direction, magnitude and uncertainty arising from systematic errors. Second, the acts of identifying sources of systematic error, writing down models to quantify them, assigning values to the bias parameters and interpreting the results combat the human tendency towards overconfidence in research results, syntheses and critiques and the inferences that rest upon them. Finally, by suggesting aspects that dominate uncertainty in a particular research result or topic area, bias analysis can guide efficient allocation of sparse research resources. The fundamental methods of bias analyses have been known for decades, and there have been calls for more widespread use for nearly as long. There was a time when some believed that bias analyses were rarely undertaken because the methods were not widely known and because automated computing tools were not readily available to implement the methods. These shortcomings have been largely resolved. We must, therefore, contemplate other barriers to implementation. One possibility is that practitioners avoid the analyses because they lack confidence in the practice of bias analysis. The purpose of this paper is therefore to describe what we view as good practices for applying quantitative bias analysis to epidemiological data, directed towards those familiar with the methods. We focus on answering questions often posed to those of us who advocate incorporation of bias analysis methods into teaching and research. These include the following. When is bias analysis practical and productive? How does one select the biases that ought to be addressed? How does one select a method to model biases? How does one assign values to the parameters of a bias model? How does one present and interpret a bias analysis? We hope that our guide to good practices for conducting and presenting bias analyses will encourage
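
    As a concrete instance of the simple bias analyses the authors advocate, exposure misclassification in a 2x2 table can be corrected with the classical sensitivity/specificity back-calculation. A sketch with hypothetical study counts.

```python
def correct_misclassification(a_obs, n_total, se, sp):
    """Back-calculate the true number exposed from an observed count,
    given sensitivity and specificity of exposure classification:
    a_true = (a_obs - (1 - sp) * N) / (se + sp - 1)."""
    return (a_obs - (1 - sp) * n_total) / (se + sp - 1)

def corrected_odds_ratio(a, n1, b, n0, se, sp):
    a_t = correct_misclassification(a, n1, se, sp)   # exposed cases
    b_t = correct_misclassification(b, n0, se, sp)   # exposed controls
    return (a_t * (n0 - b_t)) / (b_t * (n1 - a_t))

# Hypothetical study: 215/1000 cases and 160/1000 controls classified exposed,
# with nondifferential sensitivity 0.85 and specificity 0.95
print(corrected_odds_ratio(a=215, n1=1000, b=160, n0=1000, se=0.85, sp=0.95))
```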

  8. Immune adherence: a quantitative and kinetic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sekine, T [National Cancer Center, Tokyo (Japan). Research Inst.

    1978-09-01

    Quantitative and kinetic analysis of the immune-adherence reaction (IA) between C3b fragments and IA receptors as an agglutination reaction is difficult. Analysis is possible, however, by use of radio-iodinated bovine serum albumin as antigen at low concentrations (less than 200 ng/ml) and an optimal concentration of antibody, to avoid precipitation of antigen-antibody complexes with human erythrocytes without the participation of complement. Antigen and antibody are reacted at 37 °C, complement is added, the mixture incubated and human erythrocytes added; after further incubation, ice-cold EDTA-containing buffer is added and the erythrocytes centrifuged and assayed for radioactivity. Control cells reacted with heated guinea pig serum retained less than 5% of the added radioactivity. The method facilitates measurement of IA reactivity and permits more detailed analysis of the mechanism underlying the reaction.

  9. Micro photometer's automation for quantitative spectrograph analysis

    International Nuclear Information System (INIS)

    Gutierrez E, C.Y.A.

    1996-01-01

    A microphotometer is used to increase the sharpness of dark spectral lines; by analyzing these lines, a sample's contents and their concentrations can be determined, an analysis known as quantitative spectrographic analysis. Quantitative spectrographic analysis is carried out in three steps, as follows. 1. Emulsion calibration: this consists of gauging a photographic emulsion to determine the intensity variations in terms of the incident radiation. A least-squares fit to the data obtained yields a calibration graph, from which the density of a dark spectral line can be related to the incident light intensity shown by the microphotometer. 2. Working curves: the values of known concentrations of an element are plotted against incident light intensity. Since the sample contains several elements, a working curve must be found for each of them. 3. Analytical results: the calibration curve and working curves are compared and the concentration of the element under study is determined. Automatic data acquisition, calculation and output of results are handled by a computer (PC) and a computer program. Signal-conditioning circuits deliver TTL (transistor-transistor logic) levels to make communication between the microphotometer and the computer possible. Data calculation is done using a computer program
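
    The three steps map directly onto two least-squares fits and one evaluation. A compact sketch; all densities, intensities and concentrations are illustrative placeholders for microphotometer readings.

```python
import numpy as np

# Step 1: emulsion calibration - least-squares fit of measured line density
# against log incident intensity, as read from the microphotometer.
log_intensity = np.array([0.0, 0.3, 0.6, 0.9, 1.2])
density = np.array([0.12, 0.35, 0.58, 0.80, 1.01])           # illustrative
cal_slope, cal_intercept = np.polyfit(log_intensity, density, 1)

def density_to_log_intensity(d):
    return (d - cal_intercept) / cal_slope

# Step 2: working curve for one element - log concentration vs. log intensity
# of its analytical line, built from samples of known concentration.
log_conc = np.array([0.0, 0.5, 1.0, 1.5])
log_line_intensity = np.array([0.10, 0.55, 1.02, 1.49])      # illustrative
wc_slope, wc_intercept = np.polyfit(log_line_intensity, log_conc, 1)

# Step 3: analytical result - unknown sample's line density -> concentration
unknown_density = 0.64
log_i = density_to_log_intensity(unknown_density)
print(10 ** (wc_slope * log_i + wc_intercept))
```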

  10. Quantitative standard-less XRF analysis

    International Nuclear Information System (INIS)

    Ulitzka, S.

    2002-01-01

    Full text: For most analytical tasks in the mining and associated industries, matrix-matched calibrations are used for the monitoring of ore grades and process control. In general, such calibrations are product specific (iron ore, bauxite, alumina, mineral sands, cement etc.) and apply to a relatively narrow concentration range, but give the best precision and accuracy for those materials. A wide range of CRMs is available, and for less common materials synthetic standards can be made up from 'pure' chemicals. At times, however, analysis is required of materials with varying matrices (powders, scales, fly ash, alloys, polymers, liquors, etc.) and diverse physical shapes (non-flat, metal drillings, thin layers on substrates etc.) that could also contain elements which are not part of a specific calibration. A qualitative analysis can provide information about the presence of certain elements, and the relative intensities of element peaks in a scan can give a rough idea of their concentrations. More often, however, quantitative values are required. The paper will look into the basics of quantitative standardless analysis and show results for some well-defined CRMs. Copyright (2002) Australian X-ray Analytical Association Inc

  11. Quantitative proteomic analysis of intact plastids.

    Science.gov (United States)

    Shiraya, Takeshi; Kaneko, Kentaro; Mitsui, Toshiaki

    2014-01-01

    Plastids are specialized organelles in plant cells that differentiate into various forms, including chloroplasts, chromoplasts, and amyloplasts, and fulfill important functions in maintaining the overall cell metabolism and in sensing environmental factors such as sunlight. It is therefore important to grasp the mechanisms of plastid differentiation and functional change in order to enhance the understanding of plant biology. In this chapter, a method for the extraction of intact plastids that makes analysis possible while maintaining plastid functions is detailed; in addition, a quantitative shotgun method is described for analyzing the composition of plastid proteins and changes in their content as a result of environmental impacts.

  12. Quantitative texture analysis of electrodeposited line patterns

    DEFF Research Database (Denmark)

    Pantleon, Karen; Somers, Marcel A.J.

    2005-01-01

    Free-standing line patterns of Cu and Ni were manufactured by electrochemical deposition into lithographically prepared patterns. Electrodeposition was carried out on top of a highly oriented Au layer physically vapor deposited on glass. Quantitative texture analysis carried out by means of x-ray diffraction for both the substrate layer and the electrodeposits yielded experimental evidence for epitaxy between Cu and Au. An orientation relation between film and substrate was discussed with respect to various concepts of epitaxy. While the conventional mode of epitaxy fails for the Cu

  13. Inspection, visualisation and analysis of quantitative proteomics data

    OpenAIRE

    Gatto, Laurent

    2016-01-01

    Material from the Quantitative Proteomics and Data Analysis Course, 4-5 April 2016, Queen Hotel, Chester, UK. Table D - Inspection, visualisation and analysis of quantitative proteomics data, Laurent Gatto (University of Cambridge)

  14. Real-time and quantitative isotropic spatial resolution susceptibility imaging for magnetic nanoparticles

    Science.gov (United States)

    Pi, Shiqiang; Liu, Wenzhong; Jiang, Tao

    2018-03-01

    The magnetic transparency of biological tissue makes the magnetic nanoparticle (MNP) a promising functional sensor and contrast agent. The complex susceptibility of MNPs, strongly influenced by particle concentration, the excitation magnetic field and the surrounding microenvironment, has significant implications for biomedical applications. Magnetic susceptibility imaging of high spatial resolution can therefore give more detailed information during MNP-aided diagnosis and therapy. In this study, we present a novel spatial magnetic susceptibility extraction method for MNPs under a gradient magnetic field, a low-frequency drive magnetic field, and a weak high-frequency magnetic field. Based on this method, magnetic particle susceptibility imaging (MPSI) with millimeter-level spatial resolution (<3 mm) was achieved using our homemade imaging system. The experimental results corroborate that MPSI provides real-time acquisition (1 s per frame), quantitative ability, and isotropic high resolution.

  15. Regional Convergence of Income: Spatial Analysis

    Directory of Open Access Journals (Sweden)

    Vera Ivanovna Ivanova

    2014-12-01

    Full Text Available Russia has a huge territory and strong interregional heterogeneity, so we can assume that geographical factors have a significant impact on the pace of economic growth in Russian regions. The article is therefore focused on the following issues: (1) the correlation between the comparative advantages of geographical location and differences in growth rates; (2) the impact of more developed regions on their neighbors; and (3) the correlation between the economic growth of regions and their spatial interaction. The article is devoted to the empirical analysis of regional per capita incomes from 1996 to 2012 and explores the dynamics of the spatial autocorrelation of this regional development indicator. It is shown that there is a problem in measuring the intensity of spatial dependence: the value of Moran's index varies greatly depending on the choice of the distance matrix. In addition, with the help of spatial econometrics the author tests the following hypotheses: (1) there is convergence between regions over the specified period; (2) the process of beta convergence is explained by the spatial arrangement of regions; and (3) market size has a positive impact on regional growth. The author empirically confirmed all three hypotheses.
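
    The observation that Moran's index depends strongly on the chosen distance matrix is easy to reproduce. A minimal sketch computing the global index under two weight definitions; the regions, incomes and the 30-unit cutoff are synthetic.

```python
import numpy as np

def morans_i(x, w):
    """Global Moran's I for values x under spatial weight matrix w:
    I = (n / S0) * (z' W z) / (z' z), with z the centered values."""
    n = len(x)
    z = x - x.mean()
    return (n / w.sum()) * (z @ w @ z) / (z @ z)

rng = np.random.default_rng(0)
coords = rng.uniform(0, 100, size=(30, 2))      # hypothetical region centroids
income = rng.normal(100, 15, size=30)           # hypothetical per-capita income

d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
np.fill_diagonal(d, np.inf)                     # no self-weights

w_inverse = 1.0 / d                             # inverse-distance weights
w_cutoff = (d < 30).astype(float)               # contiguity-style cutoff

# The two weight definitions generally give different index values
print(morans_i(income, w_inverse), morans_i(income, w_cutoff))
```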

  16. Winston-Lutz Test: A quantitative analysis

    International Nuclear Information System (INIS)

    Pereira, Aline Garcia; Nandi, Dorival Menegaz; Saraiva, Crystian Wilian Chagas

    2017-01-01

    Objective: To describe a method of quantitative analysis for the Winston-Lutz test. Materials and methods: The research is a qualitative exploratory study. The materials used were: portal film, Winston-Lutz test tools and Omni Pro software. Sixteen portal films were used as samples and were analyzed by five different technicians to measure the deviation between the radiation and mechanical isocenters. Results: Two combinations with offset values greater than 1 mm were identified. In addition, when the method developed here was compared with the one previously studied, the data obtained were very close, with a maximum percentage deviation of 32.5%, which demonstrates its efficacy in reducing dependence on the performer. Conclusion: The results show that the method is reproducible and practical, which constitutes one of the fundamental factors for its implementation. (author)
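
    A quantitative Winston-Lutz analysis boils down to the per-film distance between the ball-bearing (mechanical) centroid and the radiation field centroid. A hedged sketch of that reduction, with invented centroid coordinates and the 1 mm tolerance mentioned in the record.

```python
import numpy as np

# Per-film centroids (mm, in the film plane): the ball bearing marks the
# mechanical isocentre, the field centre marks the radiation isocentre.
# Coordinates below are illustrative stand-ins for digitised films.
ball_centres = np.array([[0.10, -0.05], [-0.20, 0.15], [0.05, 0.30]])
field_centres = np.array([[0.60, 0.40], [-0.90, -0.30], [0.55, 1.20]])

offsets = np.linalg.norm(field_centres - ball_centres, axis=1)
print(offsets)                                   # per-film 2D deviation in mm
print((offsets > 1.0).sum(), "film(s) exceed the 1 mm tolerance")
```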

  17. Quantitative Analysis in Nuclear Medicine Imaging

    CERN Document Server

    2006-01-01

    This book provides a review of image analysis techniques as they are applied in the field of diagnostic and therapeutic nuclear medicine. Driven in part by the remarkable increase in computing power and its ready and inexpensive availability, this is a relatively new yet rapidly expanding field. Likewise, although the use of radionuclides for diagnosis and therapy has origins dating back almost to the discovery of natural radioactivity itself, radionuclide therapy and, in particular, targeted radionuclide therapy has only recently emerged as a promising approach for therapy of cancer and, to a lesser extent, other diseases. An effort has, therefore, been made to place the reviews provided in this book in a broader context. The effort to do this is reflected by the inclusion of introductory chapters that address basic principles of nuclear medicine imaging, followed by an overview of issues that are closely related to quantitative nuclear imaging and its potential role in diagnostic and therapeutic applications. ...

  18. Investment appraisal using quantitative risk analysis.

    Science.gov (United States)

    Johansson, Henrik

    2002-07-01

    Investment appraisal concerned with investments in fire safety systems is discussed. Particular attention is directed at evaluating, in terms of the Bayesian decision theory, the risk reduction that investment in a fire safety system involves. It is shown how the monetary value of the change from a building design without any specific fire protection system to one including such a system can be estimated by use of quantitative risk analysis, the results of which are expressed in terms of a Risk-adjusted net present value. This represents the intrinsic monetary value of investing in the fire safety system. The method suggested is exemplified by a case study performed in an Avesta Sheffield factory.

  19. Quantitative Analysis of Thallium-201 Myocardial Tomograms

    International Nuclear Information System (INIS)

    Kim, Sang Eun; Nam, Gi Byung; Choi, Chang Woon

    1991-01-01

    The purpose of this study was to assess the ability of quantitative Tl-201 tomography to identify and localize coronary artery disease (CAD). The study population consisted of 41 patients (31 males, 10 females; mean age 55 ± 7 yr), including 14 with prior myocardial infarction, who underwent both exercise Tl-201 myocardial SPECT and coronary angiography for the evaluation of chest pain. From the short-axis and vertical long-axis tomograms, stress extent polar maps were generated by the Cedars-Sinai Medical Center program, and the stress defect extent (SDE) was quantified for each coronary artery territory. For the purpose of this study, the coronary circulation was divided into 6 arterial segments, and the myocardial ischemic score (MIS) was calculated from the coronary angiogram. Sensitivity for the detection of CAD (>50% coronary stenosis by angiography) by stress extent polar map was 95% in single-vessel disease, and 100% in double- and triple-vessel disease; overall sensitivity was 97%. Sensitivity and specificity for the detection of individual diseased vessels were, respectively, 87% and 90% for the left anterior descending artery (LAD), 36% and 93% for the left circumflex artery (LCX), and 71% and 70% for the right coronary artery (RCA). Concordance for the detection of individual diseased vessels between the coronary angiography and the stress polar map was fair for the LAD (kappa=0.70) and RCA (kappa=0.41) lesions, whereas it was poor for the LCX lesions (kappa=0.32). There were significant correlations between the MIS and SDE in the LAD (rs=0.56, p=0.0027) and RCA territories (rs=0.60, p=0.0094). No significant correlation was found in the LCX territory. When all vascular territories were combined, there was a significant correlation between the MIS and SDE (rs=0.42, p=0.0116). In conclusion, the quantitative analysis of Tl-201 tomograms appears to be accurate for determining the presence and location of CAD.

  20. Spatial-temporal development of the mangrove vegetation cover on a hydraulic landfill (Via Expressa Sul, Florianópolis, SC): mapping and interpretation of digital aerophotographs, and quantitative analysis

    OpenAIRE

    Anderson Tavares de Melo; Eduardo Juan Soriano-Sierra; Luiz Antônio Paulino

    2011-01-01

    The implementation of a hydraulic landfill along the southern expressway (Via Expressa Sul), in the central-south region of Santa Catarina Island, started in 1995 and was completed in 1997. The landfill provided the mangrove vegetation a new environment to colonize, which has developed rapidly during this short period of time. This study mapped the vegetation cover of this region using aerial photographs from five years (1994, 1997, 2002, 2004 and 2007), which demonstrated the spatial-tempora...

  1. Quantitative risk analysis preoperational of gas pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Manfredi, Carlos; Bispo, Gustavo G.; Esteves, Alvaro [Gie S.A., Buenos Aires (Argentina)

    2009-07-01

    The purpose of this analysis is to predict how individual risk and general public safety can be affected by the operation of a gas pipeline. Should individual or societal risks be considered intolerable by comparison with international standards, risk mitigation measures are recommended until the risk associated with the operation reaches levels compatible with best industry practice. Quantitative risk analysis calculates the probability of occurrence of an event from its frequency of occurrence, and requires a complex mathematical treatment. The present work develops a calculation methodology based on the previously mentioned publication. This methodology centres on defining the frequencies of occurrence of events according to a database representative of each case under study. It also establishes the consequences according to the particular considerations of each area and the different possible interferences with the gas pipeline under study. For each interference, a typical curve of ignition probability is developed as a function of distance from the pipe. (author)
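
    The described pipeline of frequency × ignition probability × consequence can be sketched in a few lines. All parameter values below (failure frequency, the exponential ignition curve, fatality probability) are hypothetical placeholders, not the paper's fitted curves.

```python
import math

def ignition_probability(distance_m, p0=0.30, scale_m=50.0):
    """Illustrative ignition-probability curve decaying with distance from
    the pipe; the paper fits one such curve per interference type."""
    return p0 * math.exp(-distance_m / scale_m)

def individual_risk(distance_m, failure_freq_per_km_yr=1e-4,
                    interaction_length_km=1.0, p_fatality=0.5):
    """Individual risk per year at a given distance from the pipeline:
    frequency of release x ignition probability x conditional fatality."""
    return (failure_freq_per_km_yr * interaction_length_km
            * ignition_probability(distance_m) * p_fatality)

for d in (10, 50, 100, 200):
    print(d, individual_risk(d))
```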

  2. Quantitative tradeoffs between spatial, temporal, and thermometric resolution of nonresonant Raman thermometry for dynamic experiments.

    Science.gov (United States)

    McGrane, Shawn D; Moore, David S; Goodwin, Peter M; Dattelbaum, Dana M

    2014-01-01

    The ratio of Stokes to anti-Stokes nonresonant spontaneous Raman scattering can provide an in situ thermometer that is noncontact, independent of any material-specific parameters or calibrations, can be multiplexed spatially with line imaging, and can be time resolved for dynamic measurements. However, spontaneous Raman cross sections are very small, and thermometric measurements are often limited by the amount of laser energy that can be applied without damaging the sample or changing its temperature appreciably. In this paper, we quantitatively detail the tradeoff space between the spatial, temporal, and thermometric accuracy measurable with spontaneous Raman. Theoretical estimates are pinned to experimental measurements to form realistic expectations of the resolution tradeoffs appropriate to various experiments. We consider the effects of signal to noise, collection efficiency, laser heating, pulsed laser ablation, and blackbody emission as limiting factors, and provide formulae to help choose optimal conditions, along with estimates relevant to planning experiments and concrete examples for single-shot measurements.
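
    The underlying thermometer is the Boltzmann factor in the anti-Stokes/Stokes intensity ratio. A minimal sketch inverting the standard expression R = ((nu_L + nu_v)/(nu_L - nu_v))^4 * exp(-h c nu_v / (k_B T)) for temperature; the example ratio is invented.

```python
import math

H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e10    # speed of light, cm/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def temperature_from_ratio(r_as_s, shift_cm1, laser_cm1):
    """Temperature (K) from the measured anti-Stokes/Stokes intensity ratio
    of a nonresonant spontaneous Raman band at Raman shift nu_v (cm^-1),
    excited at laser wavenumber nu_L (cm^-1)."""
    prefactor = ((laser_cm1 + shift_cm1) / (laser_cm1 - shift_cm1)) ** 4
    return H * C * shift_cm1 / (KB * math.log(prefactor / r_as_s))

# Example: 1000 cm^-1 band, 532 nm excitation (~18797 cm^-1),
# hypothetical measured ratio of 0.010 -> roughly room temperature
print(temperature_from_ratio(r_as_s=0.010, shift_cm1=1000.0, laser_cm1=18797.0))
```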

  3. Nanoscale nuclear architecture for cancer diagnosis by spatial-domain low-coherence quantitative phase microscopy

    Science.gov (United States)

    Wang, Pin; Bista, Rajan K.; Khalbuss, Walid E.; Qiu, Wei; Staton, Kevin D.; Zhang, Lin; Brentnall, Teresa A.; Brand, Randall E.; Liu, Yang

    2011-03-01

    Alterations in nuclear architecture are the hallmark diagnostic characteristic of cancer cells. In this work, we show that the nuclear architectural characteristics quantified by spatial-domain low-coherence quantitative phase microscopy (SL-QPM) are more sensitive for the identification of cancer cells than conventional cytopathology. We demonstrate the importance of nuclear architectural characteristics both in an animal model of intestinal carcinogenesis (the APC/Min mouse) and in human cytology specimens with colorectal cancer, by identifying cancer in cytologically noncancerous-appearing cells. The determination of nanoscale nuclear architecture using this simple and practical optical instrument is a significant advance towards cancer diagnosis.

  4. An unconventional method of quantitative microstructural analysis

    International Nuclear Information System (INIS)

    Rastani, M.

    1995-01-01

    The experiment described here introduces a simple methodology that could replace the time-consuming and expensive conventional methods of metallographic and quantitative analysis of the effect of thermal treatment on microstructure. The method is ideal for the microstructural evaluation of tungsten filaments and other wire samples, such as copper wire, which can be conveniently coiled. Ten such samples were heat treated by ohmic resistance at temperatures expected to span the recrystallization range. After treatment, the samples were evaluated in the elastic recovery test, and the normalized elastic recovery factor was defined in terms of the measured deflections. It has been shown experimentally that the elastic recovery factor depends on the degree of recrystallization; in other words, this factor can be used to determine the fraction of unrecrystallized material. Because the elastic recovery method examines the whole filament, rather than just one section through the filament as in the metallographic method, it measures the degree of recrystallization more accurately. The method also takes considerably less time and costs less than the conventional method

  5. Quantitative Analysis of Retrieved Glenoid Liners

    Directory of Open Access Journals (Sweden)

    Katelyn Childs

    2016-02-01

    Full Text Available Revision of orthopedic surgeries is often expensive and involves a higher risk of complications. Since most total joint replacement devices use a polyethylene bearing, which serves as a weak link, the assessment of damage to the liner due to in vivo exposure is very important; failures are often due to excessive polyethylene wear. Glenoid liners are complex and hemispherical in shape, and present challenges when assessing damage. The study of glenoid liners retrieved from revision surgery may therefore lend insight into common wear patterns and improve future product designs. The purpose of this pilot study is to further develop methods of segmenting a liner into four quadrants to quantify the damage in the liner. Different damage modes are identified and statistically analyzed, and associated human-factor mechanisms are reported. Multiple analysts were recruited to conduct the damage assessments: in this paper, four analysts evaluated nine glenoid liners retrieved from revision surgery, two with an engineering background and two without. The wear patterns were quantified using the Hood/Gunther, Wasielewski, Brandt, and Lombardi methods, and the quantitative assessments made by the observers were analyzed. A new, composite damage parameter was developed and applied to assess damage. Inter-observer reliability was assessed using a paired t-test. Data reported by the four analysts showed a high standard deviation; however, only two analysts performed the tests in a significantly similar way, and both had engineering backgrounds.
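
    The inter-observer comparison the study describes is a paired t-test over matched damage scores. A minimal sketch with two hypothetical analysts' composite scores for nine liners.

```python
import numpy as np
from scipy.stats import ttest_rel

# Hypothetical composite damage scores for nine liners from two analysts
analyst_a = np.array([3.2, 4.1, 2.8, 5.0, 3.7, 4.4, 2.9, 3.5, 4.8])
analyst_b = np.array([3.0, 4.3, 2.9, 4.6, 3.9, 4.2, 3.1, 3.3, 4.9])

# A non-significant p-value (> 0.05) indicates no systematic difference
# between the two analysts' scores, i.e. similar assessment behaviour.
t_stat, p_value = ttest_rel(analyst_a, analyst_b)
print(t_stat, p_value)
```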

  6. Quantitative Risk Analysis: Method And Process

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-03-01

    Full Text Available Recent and past studies (King III report, 2009: 73-75; Stoney, 2007; Committee of Sponsoring Organisations-COSO, 2004; Bartell, 2003; Liebenberg and Hoyt, 2003; Reason, 2000; Markowitz, 1957) lament that although the introduction of risk quantification, intended to enhance the degree of objectivity in finance for instance, developed quite in parallel with its development in the manufacturing industry, the same is not true of Higher Education Institutions (HEIs). In this regard, the objective of the paper is to demonstrate the method and process of Quantitative Risk Analysis (QRA) through the likelihood of occurrence of risk (phase I). This paper serves as the first of a two-phase study, which sampled one hundred (100) risk analysts at a university in the greater Eastern Cape Province of South Africa. The analysis of the likelihood of occurrence of risk, by logistic regression and percentages, was conducted to investigate whether or not there were significant differences between groups (analysts) in respect of QRA. The Hosmer and Lemeshow test was non-significant, with a chi-square (X² = 8.181; p = 0.300), indicating a good model fit, since the data did not deviate significantly from the model. The study concluded that to derive an overall likelihood rating indicating the probability that a potential risk may be exercised within the construct of an associated threat environment, the following governing factors must be considered: (1) threat-source motivation and capability; (2) the nature of the vulnerability; and (3) the existence and effectiveness of current controls (methods and process).
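
    The reported Hosmer-Lemeshow result can be reproduced in form by grouping fitted risks into deciles and comparing observed with expected events. A sketch with synthetic data; a non-significant p-value, as in the study, indicates adequate model fit.

```python
import numpy as np
from scipy.stats import chi2

def hosmer_lemeshow(y_true, p_pred, groups=10):
    """Hosmer-Lemeshow goodness-of-fit statistic: observed vs. expected
    events within groups (deciles) of predicted risk."""
    order = np.argsort(p_pred)
    y, p = np.asarray(y_true)[order], np.asarray(p_pred)[order]
    stat = 0.0
    for gy, gp in zip(np.array_split(y, groups), np.array_split(p, groups)):
        obs, exp, n = gy.sum(), gp.sum(), len(gy)
        stat += (obs - exp) ** 2 / (exp * (1 - exp / n))
    return stat, chi2.sf(stat, df=groups - 2)

rng = np.random.default_rng(1)
p_hat = rng.uniform(0.05, 0.95, 100)       # hypothetical fitted risks
y = rng.binomial(1, p_hat)                 # outcomes consistent with the fit
print(hosmer_lemeshow(y, p_hat))
```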

  7. Spatial heterogeneity analysis of brain activation in fMRI

    Directory of Open Access Journals (Sweden)

    Lalit Gupta

    2014-01-01

    Full Text Available In many brain diseases it can be qualitatively observed that spatial patterns in blood oxygenation level dependent (BOLD) activation maps appear more (diffusively) distributed than in healthy controls. However, measures that can quantitatively characterize this spatial distributiveness in individual subjects are lacking. In this study, we propose a number of spatial heterogeneity measures to characterize brain activation maps. The proposed methods focus on different aspects of heterogeneity, including the shape (compactness), the complexity of the distribution of activated regions (fractal dimension and co-occurrence matrix), and the gappiness between activated regions (lacunarity). To this end, functional MRI derived activation maps of a language and a motor task were obtained in language-impaired children with Rolandic epilepsy and compared to age-matched healthy controls. Group analysis of the activation maps revealed no significant differences between patients and controls for either task. However, for the language task the activation maps in patients appeared more heterogeneous than in controls. Lacunarity was the best measure to discriminate activation patterns of patients from controls (sensitivity 74%, specificity 70%) and illustrates the increased irregularity of gaps between activated regions in patients. The combination of heterogeneity measures and a support vector machine approach yielded a further increase in sensitivity and specificity, to 78% and 80%, respectively. This illustrates that activation distributions in impaired brains can be complex and more heterogeneous than in normal brains and cannot be captured fully by a single quantity. In conclusion, heterogeneity analysis has the potential to robustly characterize the increased distributiveness of brain activation in individual patients.
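
    Of the proposed measures, lacunarity is the most compact to state: the gliding-box statistic of box masses. A minimal 2D sketch, assuming a binary activation map; the two test maps are synthetic.

```python
import numpy as np

def lacunarity(binary_map, box_size):
    """Gliding-box lacunarity of a binary activation map:
    Lambda(r) = var(M)/mean(M)**2 + 1, where M is the number of activated
    voxels in each r x r box slid over the map."""
    h, w = binary_map.shape
    masses = np.array([
        binary_map[i:i + box_size, j:j + box_size].sum()
        for i in range(h - box_size + 1)
        for j in range(w - box_size + 1)
    ], dtype=float)
    return masses.var() / masses.mean() ** 2 + 1.0

# Compare one compact blob against a random scatter of equal overall density
rng = np.random.default_rng(2)
compact = np.zeros((64, 64)); compact[20:36, 20:36] = 1.0
scattered = (rng.random((64, 64)) < (16 * 16) / (64 * 64)).astype(float)
print(lacunarity(compact, 8), lacunarity(scattered, 8))
```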

  8. Spatial-temporal development of the mangrove vegetation cover on a hydraulic landfill (Via Expressa Sul, Florianópolis, SC: mapping and interpretation of digital aerophotographs, and quantitative analysis

    Directory of Open Access Journals (Sweden)

    Anderson Tavares de Melo

    2011-12-01

    Full Text Available The implementation of a hydraulic landfill along the southern expressway (Via Expressa Sul), in the central-south region of Santa Catarina Island, started in 1995 and was completed in 1997. The landfill provided the mangrove vegetation a new environment to colonize, which has developed rapidly during this short period of time. This study mapped the vegetation cover of this region using aerial photographs from five years (1994, 1997, 2002, 2004 and 2007), which demonstrated the spatial-temporal evolution of the vegetation from the year before the implementation of the landfill (1994) to its recent state (2007). The data from this study allowed changes to be quantified in the surface of three bands of vegetation: a band of trees (Laguncularia racemosa and Avicennia schaueriana), a band of the seagrass praturá (Spartina alterniflora) and a transition band (companion species of mangrove and restinga plants).

  9. Spatially Resolved Analysis of Bragg Selectivity

    Directory of Open Access Journals (Sweden)

    Tina Sabel

    2015-11-01

    Full Text Available This paper addresses the inherent control of optical shrinkage in photosensitive polymers by means of spatially resolved analysis of volume holographic phase gratings. Point-by-point scanning of the local material response to the Gaussian intensity distribution of the recording beams is accomplished. The derived information on the local grating period and grating slant is evaluated by mapping optical shrinkage in the lateral plane as well as through the depth of the layer. The influence of the recording intensity, the exposure duration and the material viscosity on the Bragg selectivity is investigated.

  10. A new methodology of spatial cross-correlation analysis.

    Science.gov (United States)

    Chen, Yanguang

    2015-01-01

    Spatial correlation modeling comprises both spatial autocorrelation and spatial cross-correlation processes. The spatial autocorrelation theory has been well-developed. It is necessary to advance the method of spatial cross-correlation analysis to supplement the autocorrelation analysis. This paper presents a set of models and analytical procedures for spatial cross-correlation analysis. By analogy with Moran's index newly expressed in a spatial quadratic form, a theoretical framework is derived for geographical cross-correlation modeling. First, two sets of spatial cross-correlation coefficients are defined, including a global spatial cross-correlation coefficient and local spatial cross-correlation coefficients. Second, a pair of scatterplots of spatial cross-correlation is proposed, and the plots can be used to visually reveal the causality behind spatial systems. Based on the global cross-correlation coefficient, Pearson's correlation coefficient can be decomposed into two parts: direct correlation (partial correlation) and indirect correlation (spatial cross-correlation). As an example, the methodology is applied to the relationships between China's urbanization and economic development to illustrate how to model spatial cross-correlation phenomena. This study is an introduction to developing the theory of spatial cross-correlation, and future geographical spatial analysis might benefit from these models and indexes.
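
    The quadratic-form view of Moran's index extends naturally to two variables. A hedged sketch: with z-scored variables and a globally normalized weight matrix (weights summing to one), Moran's index reduces to z'Wz, and one analogue of the global cross-correlation coefficient is the symmetrized bilinear form below. The exact normalization in the paper may differ, and all data here are synthetic.

```python
import numpy as np

def zscore(v):
    return (v - v.mean()) / v.std()          # population standard deviation

def normalized_weights(coords, power=1.0):
    """Inverse-distance weights, globally normalized so they sum to one."""
    d = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    w = 1.0 / d ** power
    return w / w.sum()

def moran_quadratic(x, w):
    """Moran's index in spatial quadratic form: for z-scored x and a
    globally normalized w, the usual (n/S0)/(z'z) factors cancel and
    I reduces to z' W z."""
    z = zscore(x)
    return z @ w @ z

def cross_correlation(x, y, w):
    """A global spatial cross-correlation coefficient by analogy:
    the symmetrized bilinear form z_x' W z_y."""
    zx, zy = zscore(x), zscore(y)
    return 0.5 * (zx @ w @ zy + zy @ w @ zx)

rng = np.random.default_rng(3)
coords = rng.uniform(0, 10, (25, 2))         # hypothetical city locations
urbanization = rng.normal(50, 10, 25)
income = 0.8 * urbanization + rng.normal(0, 5, 25)
w = normalized_weights(coords)
print(moran_quadratic(urbanization, w), cross_correlation(urbanization, income, w))
```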

  11. A New Methodology of Spatial Cross-Correlation Analysis

    Science.gov (United States)

    Chen, Yanguang

    2015-01-01

    Spatial correlation modeling comprises both spatial autocorrelation and spatial cross-correlation processes. The spatial autocorrelation theory has been well-developed. It is necessary to advance the method of spatial cross-correlation analysis to supplement the autocorrelation analysis. This paper presents a set of models and analytical procedures for spatial cross-correlation analysis. By analogy with Moran’s index newly expressed in a spatial quadratic form, a theoretical framework is derived for geographical cross-correlation modeling. First, two sets of spatial cross-correlation coefficients are defined, including a global spatial cross-correlation coefficient and local spatial cross-correlation coefficients. Second, a pair of scatterplots of spatial cross-correlation is proposed, and the plots can be used to visually reveal the causality behind spatial systems. Based on the global cross-correlation coefficient, Pearson’s correlation coefficient can be decomposed into two parts: direct correlation (partial correlation) and indirect correlation (spatial cross-correlation). As an example, the methodology is applied to the relationships between China’s urbanization and economic development to illustrate how to model spatial cross-correlation phenomena. This study is an introduction to developing the theory of spatial cross-correlation, and future geographical spatial analysis might benefit from these models and indexes. PMID:25993120

  12. Quantitative Auger analysis of Nb-Ge superconducting alloys

    International Nuclear Information System (INIS)

    Buitrago, R.H.

    1980-01-01

    The feasibility of using Auger electron analysis for quantitative analysis was investigated by studying Nb₃Ge thin-film Auger data with different approaches. A method based on elemental standards gave quantitative values consistent with reported Nb-Ge data. Alloy sputter yields were also calculated, and the results were consistent with those for pure elements
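
    Standards-based Auger quantification is commonly expressed through relative sensitivity factors. A minimal sketch of that standard formula; the intensities and sensitivity factors are illustrative, not the study's values.

```python
def auger_atomic_fractions(intensities, sensitivities):
    """Relative-sensitivity-factor quantification:
    C_i = (I_i / S_i) / sum_j (I_j / S_j)."""
    ratios = {el: i / sensitivities[el] for el, i in intensities.items()}
    total = sum(ratios.values())
    return {el: r / total for el, r in ratios.items()}

# Hypothetical peak-to-peak intensities and elemental sensitivity factors
print(auger_atomic_fractions({"Nb": 125.0, "Ge": 38.0},
                             {"Nb": 0.20, "Ge": 0.11}))
```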

  13. Quantitative Data Analysis--In the Graduate Curriculum

    Science.gov (United States)

    Albers, Michael J.

    2017-01-01

    A quantitative research study collects numerical data that must be analyzed to help draw the study's conclusions. Teaching quantitative data analysis is not teaching number crunching, but teaching a way of critical thinking about how to analyze the data. The goal of data analysis is to reveal the underlying patterns, trends, and relationships of a…

  14. Critical Race Quantitative Intersections: A "testimonio" Analysis

    Science.gov (United States)

    Covarrubias, Alejandro; Nava, Pedro E.; Lara, Argelia; Burciaga, Rebeca; Vélez, Verónica N.; Solorzano, Daniel G.

    2018-01-01

    The educational pipeline has become a commonly referenced depiction of educational outcomes for racialized groups across the country. While visually impactful, an overreliance on decontextualized quantitative data often leads to majoritarian interpretations. Without sociohistorical contexts, these interpretations run the risk of perpetuating…

  15. Quantitative Models and Analysis for Reactive Systems

    DEFF Research Database (Denmark)

    Thrane, Claus

    phones and websites. Acknowledging that now more than ever, systems come in contact with the physical world, we need to revise the way we construct models and verification algorithms, to take into account the behavior of systems in the presence of approximate, or quantitative information, provided...

  16. Automated approach to quantitative error analysis

    International Nuclear Information System (INIS)

    Bareiss, E.H.

    1977-04-01

    A method is described for obtaining a quantitative measure of the robustness of a given neutron transport theory code for coarse-network calculations. A code that performs this task automatically and at only nominal cost is described. This code also generates user-oriented benchmark problems which exhibit the analytic behavior at interfaces. 5 figures, 1 table

  17. Quantitative blood flow analysis with digital techniques

    International Nuclear Information System (INIS)

    Forbes, G.

    1984-01-01

    The general principles of digital techniques in quantitating absolute blood flow during arteriography are described. Results are presented for a phantom constructed to correlate digitally calculated absolute flow with direct flow measurements. The clinical use of digital techniques in cerebrovascular angiography is briefly described. (U.K.)

  18. Combination and Integration of Qualitative and Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Philipp Mayring

    2001-02-01

    Full Text Available In this paper, I am going to outline ways of combining qualitative and quantitative steps of analysis on five levels. On the technical level, programs for the computer-aided analysis of qualitative data offer various combinations. Where the data are concerned, the employment of categories (for instance by using qualitative content analysis) allows for combining qualitative and quantitative forms of data analysis. On the individual level, the creation of types and the inductive generalisation of cases allow for proceeding from individual case material to quantitative generalisations. As for research design, different models can be distinguished (preliminary study, generalisation, elaboration, triangulation) which combine qualitative and quantitative steps of analysis. Where the logic of research is concerned, it can be shown that an extended process model which combines qualitative and quantitative research can be appropriate and thus lead to an integration of the two approaches. URN: urn:nbn:de:0114-fqs010162

  19. [Rapid analysis of suppositories by quantitative 1H NMR spectroscopy].

    Science.gov (United States)

    Abramovich, R A; Kovaleva, S A; Goriainov, S V; Vorob'ev, A N; Kalabin, G A

    2012-01-01

    Rapid analysis of suppositories with ibuprofen and arbidol by quantitative 1H NMR spectroscopy was performed. Optimal conditions for the analysis were developed. The results are useful for the design of rapid methods for quality control of suppositories with different components.

  20. Geospatial analysis platform: Supporting strategic spatial analysis and planning

    CSIR Research Space (South Africa)

    Naude, A

    2008-11-01

    Full Text Available Whilst there have been rapid advances in satellite imagery and related fine resolution mapping and web-based interfaces (e.g. Google Earth), the development of capabilities for strategic spatial analysis and planning support has lagged behind...

  1. SRXRF analysis with spatial resolution of dental calculus

    International Nuclear Information System (INIS)

    Sanchez, Hector Jorge; Perez, Carlos Alberto; Grenon, Miriam

    2000-01-01

    This work presents elemental-composition studies of dental calculus by X-ray fluorescence analysis using synchrotron radiation. The intrinsic characteristics of synchrotron light allow for a semi-quantitative analysis with spatial resolution. The experiments were carried out in the high-vacuum station of the XRF beamline at the Synchrotron Light National Laboratory (Campinas, Brazil). All the measurements were performed in conventional geometry (45 deg. + 45 deg.) and micro-collimation was attained via a pair of orthogonal slits mounted in the beamline. In this way, pixels of 50 μm x 50 μm were obtained while keeping a high flux of photons on the sample. Samples of human dental calculus were measured at different positions along their growth axis, in order to determine variations in composition across the deposition pattern. Intensity ratios of minor and trace elements were obtained, and linear profiles and surface distributions were determined. In summary, we conclude that μXRF experiments with spatial resolution on dental calculus are feasible with simple collimation and adequate positioning systems, while keeping a high flux of photons. These results open interesting perspectives for the future station of the beamline, devoted to μXRF, which will reach resolutions of the order of 10 μm

  2. SRXRF analysis with spatial resolution of dental calculus

    Science.gov (United States)

    Sánchez, Héctor Jorge; Pérez, Carlos Alberto; Grenón, Miriam

    2000-09-01

    This work presents elemental-composition studies of dental calculus by X-ray fluorescence analysis using synchrotron radiation. The intrinsic characteristics of synchrotron light allow for a semi-quantitative analysis with spatial resolution. The experiments were carried out in the high-vacuum station of the XRF beamline at the Synchrotron Light National Laboratory (Campinas, Brazil). All the measurements were performed in conventional geometry (45°+45°) and the micro-collimation was attained via a pair of orthogonal slits mounted in the beamline. In this way, pixels of 50 μm×50 μm were obtained keeping a high flux of photons on the sample. Samples of human dental calculus were measured in different positions along their growing axis, in order to determine variations of the compositions in the pattern of deposit. Intensity ratios of minor elements and traces were obtained, and linear profiles and surface distributions were determined. As a general summary, we can conclude that μXRF experiments with spatial resolution on dental calculus are feasible with simple collimation and adequate positioning systems, keeping a high flux of photons. These results open interesting perspectives for the future station of the line, devoted to μXRF, which will reach resolutions of the order of 10 μm.

  3. Participative Spatial Scenario Analysis for Alpine Ecosystems

    Science.gov (United States)

    Kohler, Marina; Stotten, Rike; Steinbacher, Melanie; Leitinger, Georg; Tasser, Erich; Schirpke, Uta; Tappeiner, Ulrike; Schermer, Markus

    2017-10-01

    Land use and land cover patterns are shaped by the interplay of human and ecological processes. Thus, heterogeneous cultural landscapes have developed, delivering multiple ecosystem services. To guarantee human well-being, the development of land use types has to be evaluated. Scenario development and land use and land cover change models are well-known tools for assessing future landscape changes. However, as social and ecological systems are inextricably linked, land use-related management decisions are difficult to identify. The concept of social-ecological resilience can thereby provide a framework for understanding complex interlinkages on multiple scales and from different disciplines. In our study site (Stubai Valley, Tyrol/Austria), we applied a sequence of steps including the characterization of the social-ecological system and identification of key drivers that influence farmers' management decisions. We then developed three scenarios, i.e., "trend", "positive" and "negative" future development of farming conditions and assessed respective future land use changes. Results indicate that within the "trend" and "positive" scenarios pluri-activity (various sources of income) prevents considerable changes in land use and land cover and promotes the resilience of farming systems. By contrast, reductions in subsidies and changes in consumer behavior are the most important key drivers in the negative scenario and lead to distinct abandonment of grassland, predominantly in the sub-alpine zone of our study site. Our conceptual approach, i.e., the combination of social and ecological methods and the integration of local stakeholders' knowledge into spatial scenario analysis, resulted in highly detailed and spatially explicit results that can provide a basis for further community development recommendations.

  4. Quantitative analysis of real-time radiographic systems

    International Nuclear Information System (INIS)

    Barker, M.D.; Condon, P.E.; Barry, R.C.; Betz, R.A.; Klynn, L.M.

    1988-01-01

    A method was developed which yields quantitative information on the spatial resolution, contrast sensitivity, image noise, and focal spot size from real-time radiographic images. The method uses simple image quality indicators and computer programs which make it possible to readily obtain quantitative performance measurements of single or multiple radiographic systems. It was used for x-ray and optical images to determine which component of the system was not operating up to standard. Focal spot size was monitored by imaging a bar pattern. This paper constitutes the second progress report on the development of the camera and radiation image quality indicators.

  5. Analysis of Ingredient Lists to Quantitatively Characterize ...

    Science.gov (United States)

    The EPA’s ExpoCast program is developing high throughput (HT) approaches to generate the needed exposure estimates to compare against HT bioactivity data generated from the US inter-agency Tox21 and the US EPA ToxCast programs. Assessing such exposures for the thousands of chemicals in consumer products requires data on product composition. This is a challenge since quantitative product composition data are rarely available. We developed methods to predict the weight fractions of chemicals in consumer products from weight fraction-ordered chemical ingredient lists, and curated a library of such lists from online manufacturer and retailer sites. The probabilistic model predicts weight fraction as a function of the total number of reported ingredients, the rank of the ingredient in the list, the minimum weight fraction for which ingredients were reported, and the total weight fraction of unreported ingredients. Weight fractions predicted by the model compared very well to available quantitative weight fraction data obtained from Material Safety Data Sheets for products with 3-8 ingredients. Lists were located from the online sources for 5148 products containing 8422 unique ingredient names. A total of 1100 of these names could be located in EPA’s HT chemical database (DSSTox), and linked to 864 unique Chemical Abstract Service Registration Numbers (392 of which were in the Tox21 chemical library). Weight fractions were estimated for these 864 CASRN. Using a
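
    As a rough sketch of such a rank-based estimator (a minimal illustration, not the published ExpoCast model: the 1/rank decay, the default parameters and the function name are assumptions), the idea can be expressed in a few lines of Python:

```python
def predict_weight_fractions(n_reported, min_fraction=0.01,
                             unreported_total=0.10):
    """Illustrative weight-fraction estimates for an ingredient list
    reported in descending order of weight.

    Assumes fractions decay roughly as 1/rank (an assumption, not the
    published functional form), sum to 1 - unreported_total, and never
    fall below the reporting threshold min_fraction.
    """
    raw = [1.0 / rank for rank in range(1, n_reported + 1)]
    scale = (1.0 - unreported_total) / sum(raw)
    fractions = [max(scale * r, min_fraction) for r in raw]
    # Renormalize in case clipping at the threshold changed the total.
    total = sum(fractions)
    return [f * (1.0 - unreported_total) / total for f in fractions]

if __name__ == "__main__":
    for rank, f in enumerate(predict_weight_fractions(6), start=1):
        print(f"ingredient {rank}: {f:.3f}")
```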

  6. Residual analysis for spatial point processes

    DEFF Research Database (Denmark)

    Baddeley, A.; Turner, R.; Møller, Jesper

    We define residuals for point process models fitted to spatial point pattern data, and propose diagnostic plots based on these residuals. The techniques apply to any Gibbs point process model, which may exhibit spatial heterogeneity, interpoint interaction and dependence on spatial covariates. Ou...... or covariate effects. Q-Q plots of the residuals are effective in diagnosing interpoint interaction. Some existing ad hoc statistics of point patterns (quadrat counts, scan statistic, kernel smoothed intensity, Berman's diagnostic) are recovered as special cases....
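
    For intuition, the raw residual of a fitted model over a region B is the observed point count minus the integral of the fitted intensity over B. A minimal grid-based sketch in Python (illustrative inputs and names, not the authors' implementation):

```python
import numpy as np

def raw_residual(n_points_in_B, fitted_intensity, mask_B, cell_area):
    """Raw residual R(B) = n(B) - integral over B of the fitted
    intensity, with the integral approximated on a pixel grid."""
    expected = float(fitted_intensity[mask_B].sum()) * cell_area
    return n_points_in_B - expected

# Homogeneous Poisson example on the unit square: residual ~ 0 on average.
rng = np.random.default_rng(0)
lam = 50.0                                   # true intensity
pts = rng.uniform(0, 1, size=(rng.poisson(lam), 2))
grid = np.full((100, 100), lam)              # fitted intensity surface
mask = np.ones_like(grid, dtype=bool)        # B = the whole window
print(raw_residual(len(pts), grid, mask, cell_area=1.0 / grid.size))
```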

  7. A quantitative method for zoning of protected areas and its spatial ecological implications.

    Science.gov (United States)

    Del Carmen Sabatini, María; Verdiell, Adriana; Rodríguez Iglesias, Ricardo M; Vidal, Marta

    2007-04-01

    Zoning is a key prescriptive tool for administration and management of protected areas. However, the lack of zoning is common for most protected areas in developing countries and, as a consequence, many protected areas are not effective in achieving the goals for which they were created. In this work, we introduce a quantitative method to expeditiously zone protected areas and we evaluate its ecological implications on hypothetical zoning cases. A real-world application is reported for the Talampaya National Park, a UNESCO World Heritage Site located in Argentina. Our method is a modification of the zoning forest model developed by Bos [Bos, J., 1993. Zoning in forest management: a quadratic assignment problem solved by simulated annealing. Journal of Environmental Management 37, 127-145.]. Main innovations involve a quadratic function of distance between land units, non-reciprocal weights for adjacent land uses (mathematically represented by a non-symmetric matrix), and the possibility of imposing a connectivity constraint. Due to its intrinsic spatial dimension, the zoning problem belongs to the NP-hard class, i.e. no algorithm is known that solves it in polynomial time [Nemhauser, G., Wolsey, L., 1988. Integer and Combinatorial Optimization. John Wiley, New York.]. For that purpose, we applied a simulated annealing heuristic implemented as a FORTRAN language routine. Our innovations were effective in achieving zoning designs more compatible with biological diversity protection. The quadratic distance term facilitated the delineation of core zones for elements of significance; the connectivity constraint minimized fragmentation; non-reciprocal land use weightings contributed to better representing management decisions, and influenced mainly the edge and shape of zones. This quantitative method can assist the zoning process within protected areas by offering many zonation scheme alternatives with minimum cost, time and effort. This ability provides a new tool to
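
    A compact sketch of the simulated-annealing core of such a zoning model (a toy instance: the grid size, the three land-use classes and the non-symmetric weight matrix below are illustrative assumptions, not the calibrated Talampaya inputs):

```python
import math
import random

# Non-symmetric adjacency weights W[u][v]: cost of use v next to use u
# (0 = core, 1 = buffer, 2 = intensive use). Values are made up.
W = [[0.0, 1.0, 4.0],
     [2.0, 0.0, 1.0],
     [5.0, 2.0, 0.0]]
SIZE = 6  # SIZE x SIZE grid of land units

def neighbors(i, j):
    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        if 0 <= i + di < SIZE and 0 <= j + dj < SIZE:
            yield i + di, j + dj

def cost(z):
    # Full quadratic cost; recomputed each step to keep the sketch short.
    return sum(W[z[i][j]][z[ni][nj]]
               for i in range(SIZE) for j in range(SIZE)
               for ni, nj in neighbors(i, j))

def anneal(steps=20000, t=5.0, cooling=0.9995):
    z = [[random.randrange(3) for _ in range(SIZE)] for _ in range(SIZE)]
    c = cost(z)
    for _ in range(steps):
        i, j = random.randrange(SIZE), random.randrange(SIZE)
        old = z[i][j]
        z[i][j] = random.randrange(3)
        c_new = cost(z)
        if c_new > c and random.random() >= math.exp((c - c_new) / t):
            z[i][j] = old            # reject the uphill move
        else:
            c = c_new                # accept
        t *= cooling
    return z, c

if __name__ == "__main__":
    zoning, final_cost = anneal()
    print(final_cost)
```

    A real implementation would also enforce the connectivity constraint and update the cost incrementally instead of recomputing it at every step.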

  8. UTOOLS: microcomputer software for spatial analysis and landscape visualization.

    Science.gov (United States)

    Alan A. Ager; Robert J. McGaughey

    1997-01-01

    UTOOLS is a collection of programs designed to integrate various spatial data in a way that allows versatile spatial analysis and visualization. The programs were designed for watershed-scale assessments in which a wide array of resource data must be integrated, analyzed, and interpreted. UTOOLS software combines raster, attribute, and vector data into "spatial...

  9. Spatial Econometric data analysis: moving beyond traditional models

    NARCIS (Netherlands)

    Florax, R.J.G.M.; Vlist, van der A.J.

    2003-01-01

    This article appraises recent advances in the spatial econometric literature. It serves as the introduction to a collection of new papers on spatial econometric data analysis brought together in this special issue, dealing specifically with new extensions to spatial econometric modeling.

  10. Spatial analysis of NDVI readings with different sampling densities

    Science.gov (United States)

    Advanced remote sensing technologies provide research an innovative way of collecting spatial data for use in precision agriculture. Sensor information and spatial analysis together allow for a complete understanding of the spatial complexity of a field and its crop. The objective of the study was...

  11. A resilience-oriented approach for quantitatively assessing recurrent spatial-temporal congestion on urban roads.

    Directory of Open Access Journals (Sweden)

    Junqing Tang

    Full Text Available Traffic congestion brings not only delay and inconvenience, but other associated national concerns, such as greenhouse gases, air pollutants, road safety issues and risks. Identification, measurement, tracking, and control of urban recurrent congestion are vital for building a livable and smart community. A considerable body of work has contributed to tackling the problem. Several methods, such as time-based approaches and level of service, can be effective for characterizing congestion on urban streets. However, few studies have taken a systemic perspective on congestion quantification. Resilience, on the other hand, is an emerging concept that focuses on comprehensive systemic performance and characterizes the ability of a system to cope with disturbance and to recover its functionality. In this paper, we treated recurrent congestion as an internal disturbance and proposed a modified metric inspired by the well-established "R4" resilience-triangle framework. We constructed the metric with generic dimensions from both resilience engineering and transport science to quantify recurrent congestion based on spatial-temporal traffic patterns, and compared it with two other approaches in freeway and signal-controlled arterial cases. Results showed that the metric can effectively capture congestion patterns in the study area and provides a quantitative benchmark for comparison. They suggested not only a good comparative performance of the proposed metric but also its capability to account for the discharging process of congestion. Sensitivity tests showed that the proposed metric is robust against parameter perturbation within the Robustness Range (RR, but the number of identified congestion patterns can be influenced by the existence of ϵ. In addition, the Elasticity Threshold (ET and the spatial dimension of the cell-based platform significantly affect the congestion results on both the detected number and
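
    As a schematic of the resilience-triangle idea applied to a link-speed time series (an illustrative reading of the framework, not the paper's exact metric; the threshold and the normalization are assumptions):

```python
import numpy as np

def congestion_resilience_loss(speeds, free_flow_speed,
                               dt_minutes=5.0, threshold=0.9):
    """Area between full functionality (1.0) and observed functionality
    (speed / free-flow speed) while functionality stays degraded.
    Assumes a single contiguous congestion dip for simplicity."""
    q = np.clip(np.asarray(speeds, dtype=float) / free_flow_speed, 0.0, 1.0)
    below = q < threshold
    if not below.any():
        return 0.0
    return float(np.trapz(1.0 - q[below], dx=dt_minutes))

# Morning-peak example: speed dips from 60 km/h to 20 km/h and recovers.
speeds = [60, 58, 45, 30, 22, 20, 24, 35, 50, 59, 60]
print(congestion_resilience_loss(speeds, free_flow_speed=60.0))
```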

  12. Quantitative phosphoproteomic analysis of postmortem muscle development

    DEFF Research Database (Denmark)

    Huang, Honggang

    Meat quality development is highly dependent on postmortem (PM) metabolism and rigor mortis development in PM muscle. PM glycometabolism and rigor mortis fundamentally determine most of the important qualities of raw meat, such as ultimate pH, tenderness, color and water-holding capacity. Protein...... phosphorylation is known to play essential roles on regulating metabolism, contraction and other important activities in muscle systems. However, protein phosphorylation has rarely been systematically explored in PM muscle in relation to meat quality. In this PhD project, both gel-based and mass spectrometry (MS......)-based quantitative phosphoproteomic strategies were employed to analyze PM muscle with the aim to intensively characterize the protein phosphorylation involved in meat quality development. Firstly, gel-based phosphoproteomic studies were performed to analyze the protein phosphorylation in both sarcoplasmic proteins...

  13. Energy Dispersive Spectrometry and Quantitative Analysis Short Course. Introduction to X-ray Energy Dispersive Spectrometry and Quantitative Analysis

    Science.gov (United States)

    Carpenter, Paul; Curreri, Peter A. (Technical Monitor)

    2002-01-01

    This course will cover practical applications of the energy-dispersive spectrometer (EDS) to x-ray microanalysis. Topics covered will include detector technology, advances in pulse processing, resolution and performance monitoring, detector modeling, peak deconvolution and fitting, qualitative and quantitative analysis, compositional mapping, and standards. An emphasis will be placed on use of the EDS for quantitative analysis, with discussion of typical problems encountered in the analysis of a wide range of materials and sample geometries.

  14. Spatial Analysis Of Human Capital Structures

    Directory of Open Access Journals (Sweden)

    Gajdos Artur

    2014-12-01

    Full Text Available The main purpose of this paper is to analyse the interdependence between labour productivity and the occupational structure of human capital in a spatial cross-section. Research indicates (see Fischer 2009) the possibility to assess the impact of the quality of human capital (measured by means of the level of education) on labour productivity in a spatial cross-section.

  15. Quantitative Analysis of Complex Tropical Forest Stands: A Review ...

    African Journals Online (AJOL)

    FIRST LADY

    The importance of data analysis in quantitative assessment of natural resources .... Data collection design is an important process in complex forest statistical ... Ideally, the sample size should be equal among groups and sufficiently large.

  16. Quantitative Structure-Activity Relationship Analysis of the ...

    African Journals Online (AJOL)

    Erah

    Quantitative Structure-Activity Relationship Analysis of the Anticonvulsant ... Two types of molecular descriptors, including the 2D autocorrelation ..... It is based on the simulation of natural .... clustering anticonvulsant, antidepressant, and.

  17. Quantitative-genetic analysis of wing form and bilateral asymmetry ...

    Indian Academy of Sciences (India)

    Unknown

    lines; Procrustes analysis; wing shape; wing size. ... Models of stochastic gene expression pre- dict that intrinsic noise ... Quantitative parameters of wing size and shape asymmetries ..... the residuals of a regression on centroid size produced.

  18. Analysis association of milk fat and protein percent in quantitative ...

    African Journals Online (AJOL)

    Analysis association of milk fat and protein percent in quantitative trait locus ... African Journal of Biotechnology ... Protein and fat percent as content of milk are high-priority criteria for financial aims and selection of programs in dairy cattle.

  19. Quantitative analysis of some brands of chloroquine tablets ...

    African Journals Online (AJOL)

    Quantitative analysis of some brands of chloroquine tablets marketed in Maiduguri using spectrophotometric ... and compared with that of the standard, wavelength of maximum absorbance at 331 nm for chloroquine.

  20. Spatial and temporal epidemiological analysis in the Big Data era.

    Science.gov (United States)

    Pfeiffer, Dirk U; Stevens, Kim B

    2015-11-01

    Concurrent with global economic development in the last 50 years, the opportunities for the spread of existing diseases and the emergence of new infectious pathogens have increased substantially. The activities associated with the enormously intensified global connectivity have resulted in large amounts of data being generated, which in turn provides opportunities for generating knowledge that will allow more effective management of animal and human health risks. This so-called Big Data has, more recently, been accompanied by the Internet of Things, which highlights the increasing presence of a wide range of sensors, interconnected via the Internet. Analysis of this data needs to exploit its complexity, accommodate variation in data quality and take advantage of its spatial and temporal dimensions, where available. Apart from the development of hardware technologies and networking/communication infrastructure, it is necessary to develop appropriate data management tools that make this data accessible for analysis. These include relational databases, geographical information systems and, most recently, cloud-based data storage such as Hadoop distributed file systems. While the development of analytical methodologies has not quite caught up with the data deluge, important advances have been made in a number of areas, including spatial and temporal data analysis, where the spectrum of analytical methods ranges from visualisation and exploratory analysis to modelling. While there used to be a primary focus on statistical science in terms of methodological development for data analysis, the newly emerged discipline of data science is a reflection of the challenges presented by the need to integrate diverse data sources and exploit them using novel data- and knowledge-driven modelling methods while simultaneously recognising the value of quantitative as well as qualitative analytical approaches. Machine learning regression methods, which are more robust and can handle

  1. Quantitative data analysis in education a critical introduction using SPSS

    CERN Document Server

    Connolly, Paul

    2007-01-01

    This book provides a refreshing and user-friendly guide to quantitative data analysis in education for students and researchers. It assumes absolutely no prior knowledge of quantitative methods or statistics. Beginning with the very basics, it provides the reader with the knowledge and skills necessary to be able to undertake routine quantitative data analysis to a level expected of published research. Rather than focusing on teaching statistics through mathematical formulae, the book places an emphasis on using SPSS to gain a real feel for the data and an intuitive grasp of t

  2. Quantitative Susceptibility Mapping of Human Brain Reflects Spatial Variation in Tissue Composition

    Science.gov (United States)

    Li, Wei; Wu, Bing; Liu, Chunlei

    2011-01-01

    Image phase from gradient echo MRI provides a unique contrast that reflects brain tissue composition variations, such as iron and myelin distribution. Phase imaging is emerging as a powerful tool for the investigation of functional brain anatomy and disease diagnosis. However, the quantitative value of phase is compromised by its nonlocal and orientation dependent properties. There is an increasing need for reliable quantification of magnetic susceptibility, the intrinsic property of tissue. In this study, we developed a novel and accurate susceptibility mapping method that is also phase-wrap insensitive. The proposed susceptibility mapping method utilized two complementary equations: (1) the Fourier relationship of phase and magnetic susceptibility; and (2) the first-order partial derivative of the first equation in the spatial frequency domain. In numerical simulation, this method reconstructed the susceptibility map almost free of streaking artifact. Further, the iterative implementation of this method allowed for high quality reconstruction of susceptibility maps of human brain in vivo. The reconstructed susceptibility map provided excellent contrast of iron-rich deep nuclei and white matter bundles from surrounding tissues. Further, it also revealed anisotropic magnetic susceptibility in brain white matter. Hence, the proposed susceptibility mapping method may provide a powerful tool for the study of brain physiology and pathophysiology. Further elucidation of anisotropic magnetic susceptibility in vivo may allow us to gain more insight into the white matter microarchitectures. PMID:21224002
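
    In the standard QSM formulation (written here from the general literature, not transcribed from the paper; B0 is the main field along z, TE the echo time, γ the gyromagnetic ratio), the first of these complementary equations is the k-space dipole relation between phase and susceptibility:

```latex
\theta(\mathbf{k}) \;=\; \gamma\, B_0\, T_E
  \left( \frac{1}{3} - \frac{k_z^2}{|\mathbf{k}|^2} \right) \chi(\mathbf{k})
```

    The factor in parentheses vanishes on a cone in k-space, which is why a second, complementary equation (here, the first-order partial derivative of this relation in the spatial frequency domain) is needed to stabilize the inversion.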

  3. A two-locus model of spatially varying stabilizing or directional selection on a quantitative trait.

    Science.gov (United States)

    Geroldinger, Ludwig; Bürger, Reinhard

    2014-06-01

    The consequences of spatially varying, stabilizing or directional selection on a quantitative trait in a subdivided population are studied. A deterministic two-locus two-deme model is employed to explore the effects of migration, the degree of divergent selection, and the genetic architecture, i.e., the recombination rate and ratio of locus effects, on the maintenance of genetic variation. The possible equilibrium configurations are determined as functions of the migration rate. They depend crucially on the strength of divergent selection and the genetic architecture. The maximum migration rates are investigated below which a stable fully polymorphic equilibrium or a stable single-locus polymorphism can exist. Under stabilizing selection, but with different optima in the demes, strong recombination may facilitate the maintenance of polymorphism. However, usually, and in particular with directional selection in opposite directions, the critical migration rates are maximized by a concentrated genetic architecture, i.e., by a major locus and a tightly linked minor one. Thus, complementing previous work on the evolution of genetic architectures in subdivided populations subject to diversifying selection, it is shown that concentrated architectures may aid the maintenance of polymorphism. Conditions are obtained under which this is the case. Finally, the dependence of the phenotypic variance, linkage disequilibrium, and various measures of local adaptation and differentiation on the parameters is elaborated. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.

  4. Implementing quantitative analysis and its complement

    International Nuclear Information System (INIS)

    Rasmuson, D.M.; Nelson, W.R.; Shepherd, J.C.

    1982-01-01

    This paper presents an application of risk analysis for the evaluation of nuclear reactor facility operation. Common cause failure analysis (CCFA) techniques to identify potential problem areas are discussed. Integration of CCFA and response trees, a particular form of the path sets of a success tree, to gain significant insight into the operation of the facility is also demonstrated. An example illustrating the development of the risk analysis methodology, development of the fault trees, generation of response trees, and evaluation of the CCFA is presented to explain the technique.

  5. Quantitative multi-modal NDT data analysis

    International Nuclear Information System (INIS)

    Heideklang, René; Shokouhi, Parisa

    2014-01-01

    A single NDT technique is often not adequate to provide assessments about the integrity of test objects with the required coverage or accuracy. In such situations, it is often resorted to multi-modal testing, where complementary and overlapping information from different NDT techniques are combined for a more comprehensive evaluation. Multi-modal material and defect characterization is an interesting task which involves several diverse fields of research, including signal and image processing, statistics and data mining. The fusion of different modalities may improve quantitative nondestructive evaluation by effectively exploiting the augmented set of multi-sensor information about the material. It is the redundant information in particular, whose quantification is expected to lead to increased reliability and robustness of the inspection results. There are different systematic approaches to data fusion, each with its specific advantages and drawbacks. In our contribution, these will be discussed in the context of nondestructive materials testing. A practical study adopting a high-level scheme for the fusion of Eddy Current, GMR and Thermography measurements on a reference metallic specimen with built-in grooves will be presented. Results show that fusion is able to outperform the best single sensor regarding detection specificity, while retaining the same level of sensitivity.

  6. Quantitative infrared analysis of hydrogen fluoride

    International Nuclear Information System (INIS)

    Manuta, D.M.

    1997-04-01

    This work was performed at the Portsmouth Gaseous Diffusion Plant, where hydrogen fluoride is produced upon the hydrolysis of UF6. This poses a problem in this setting, and a method for determining the mole percent concentration was desired. HF has been considered to be a non-ideal gas for many years. D. F. Smith utilized complex equations in his HF studies in the 1950s. We have evaluated HF behavior as a function of pressure from three different perspectives. (1) Absorbance at 3877 cm⁻¹ as a function of pressure for 100% HF. (2) Absorbance at 3877 cm⁻¹ as a function of increasing partial pressure of HF, with the total pressure maintained at 300 mm HgA with nitrogen. (3) Absorbance at 3877 cm⁻¹ for constant partial pressure of HF, with the total pressure increased to greater than 800 mm HgA with nitrogen. These experiments have shown that at partial pressures up to 35 mm HgA, HF follows the ideal gas law. The absorbance at 3877 cm⁻¹ can be quantitatively analyzed via infrared methods.
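
    The observed low-pressure linearity is exactly what the Beer-Lambert law predicts for an ideal gas (standard textbook relations, not taken from the report):

```latex
A \;=\; \varepsilon \,\ell\, c, \qquad c \;=\; \frac{p}{RT}
\quad\Longrightarrow\quad
A \;=\; \frac{\varepsilon \,\ell}{RT}\, p
```

    so the absorbance at 3877 cm⁻¹ grows linearly with the HF partial pressure p for as long as ideal-gas behavior holds, here up to about 35 mm HgA.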

  7. Photoacoustic image reconstruction: a quantitative analysis

    Science.gov (United States)

    Sperl, Jonathan I.; Zell, Karin; Menzenbach, Peter; Haisch, Christoph; Ketzer, Stephan; Marquart, Markus; Koenig, Hartmut; Vogel, Mika W.

    2007-07-01

    Photoacoustic imaging is a promising new way to generate unprecedented contrast in ultrasound diagnostic imaging. It differs from other medical imaging approaches in that it provides spatially resolved information about optical absorption of targeted tissue structures. Because the data acquisition process deviates from standard clinical ultrasound, choice of the proper image reconstruction method is crucial for successful application of the technique. In the literature, multiple approaches have been advocated, and the purpose of this paper is to compare four reconstruction techniques. In particular, we focused on resolution limits, stability, reconstruction speed, and SNR. We generated experimental and simulated data and reconstructed images of the pressure distribution using four different methods: delay-and-sum (DnS), circular backprojection (CBP), generalized 2D Hough transform (HTA), and Fourier transform (FTA). All methods were able to depict the point sources properly. DnS and CBP produce blurred images containing typical superposition artifacts. The HTA provides excellent SNR and allows a good point source separation. The FTA is the fastest and shows the best FWHM. In our study, we found the FTA to show the best overall performance. It allows a very fast and theoretically exact reconstruction. Only a hardware-implemented DnS might be faster and enable real-time imaging. A commercial system may also perform several methods to fully utilize the new contrast mechanism and guarantee optimal resolution and fidelity.
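
    Of the four methods, delay-and-sum is the simplest to state: each pixel sums the sensor traces at the time of flight from that pixel. A minimal sketch (illustrative geometry and defaults, not the authors' implementation):

```python
import numpy as np

def delay_and_sum(signals, sensor_xy, grid_xy, c=1500.0, fs=40e6):
    """signals: (n_sensors, n_samples) pressure traces;
    sensor_xy: (n_sensors, 2) and grid_xy: (n_pixels, 2) positions in
    metres; c: speed of sound (m/s); fs: sampling rate (Hz)."""
    n_sensors, n_samples = signals.shape
    image = np.zeros(len(grid_xy))
    for s in range(n_sensors):
        # Time of flight from every pixel to this sensor -> sample index.
        dist = np.linalg.norm(grid_xy - sensor_xy[s], axis=1)
        idx = np.clip(np.round(dist / c * fs).astype(int), 0, n_samples - 1)
        image += signals[s, idx]
    return image / n_sensors

# Toy run: 16 sensors on a line above a 32 x 32 pixel region.
rng = np.random.default_rng(0)
sig = rng.normal(size=(16, 1024))
sensors = np.stack([np.linspace(-0.02, 0.02, 16), np.full(16, 0.03)], axis=1)
xs, ys = np.meshgrid(np.linspace(-0.01, 0.01, 32), np.linspace(-0.01, 0.01, 32))
img = delay_and_sum(sig, sensors, np.stack([xs.ravel(), ys.ravel()], axis=1))
print(img.reshape(32, 32).shape)
```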

  8. Recent developments in spatial analysis spatial statistics, behavioural modelling, and computational intelligence

    CERN Document Server

    Getis, Arthur

    1997-01-01

    In recent years, spatial analysis has become an increasingly active field, as evidenced by the establishment of educational and research programs at many universities. Its popularity is due mainly to new technologies and the development of spatial data infrastructures. This book illustrates some recent developments in spatial analysis, behavioural modelling, and computational intelligence. World-renowned spatial analysts explain and demonstrate their new and insightful models and methods. The applications are in areas of societal interest such as the spread of infectious diseases, migration behaviour, and retail and agricultural location strategies. In addition, there is emphasis on the use of new technologies for the analysis of spatial data through the application of neural network concepts.

  9. Quantitative genetic analysis of total glucosinolate, oil and protein ...

    African Journals Online (AJOL)

    Quantitative genetic analysis of total glucosinolate, oil and protein contents in Ethiopian mustard ( Brassica carinata A. Braun) ... Seeds were analyzed using HPLC (glucosinolates), NMR (oil) and NIRS (protein). Analyses of variance, Hayman's method of diallel analysis and a mixed linear model of genetic analysis were ...

  10. Applied quantitative analysis in the social sciences

    CERN Document Server

    Petscher, Yaacov; Compton, Donald L

    2013-01-01

    To say that complex data analyses are ubiquitous in the education and social sciences might be an understatement. Funding agencies and peer-review journals alike require that researchers use the most appropriate models and methods for explaining phenomena. Univariate and multivariate data structures often require the application of more rigorous methods than basic correlational or analysis of variance models. Additionally, though a vast set of resources may exist on how to run analyses, difficulties may be encountered when explicit direction is not provided as to how one should run a model.

  11. Qualitative and quantitative analysis of detonation products

    International Nuclear Information System (INIS)

    Xie Yun

    2005-01-01

    Different sampling and injection methods were used to analyze unknown detonation products in an obturator. The samples were analyzed by gas chromatography and gas chromatography/mass spectrometry. Quantitative analysis was applied to CO, NO, C2H2, C6H6 and so on, and qualitative analysis to C3H5N, C10H10, C8H8N2 and so on. The method used in the article is feasible. The results show that the detonation composition in this study has a negative oxygen balance and that there were many pollutants in the detonation products. (authors)

  13. The quantitative failure of human reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, C.T.

    1995-07-01

    This philosophical treatise argues the merits of Human Reliability Analysis (HRA) in the context of the nuclear power industry. Actually, the author attacks historic and current HRA as having failed to inform policy makers who make decisions based on the risk that humans contribute to systems performance. He argues for an HRA based on Bayesian (fact-based) inferential statistics, which advocates a systems analysis process that employs cogent heuristics when using opinion, and tempers itself with a rational debate over the weight given subjective and empirical probabilities.

  14. Quantitative Determination of Aluminum in Deodorant Brands: A Guided Inquiry Learning Experience in Quantitative Analysis Laboratory

    Science.gov (United States)

    Sedwick, Victoria; Leal, Anne; Turner, Dea; Kanu, A. Bakarr

    2018-01-01

    The monitoring of metals in commercial products is essential for protecting public health against the hazards of metal toxicity. This article presents a guided inquiry (GI) experimental lab approach in a quantitative analysis lab class that enabled students to determine the levels of aluminum in deodorant brands. The utility of a GI experimental…

  15. Fundamentals of quantitative PET data analysis

    NARCIS (Netherlands)

    Willemsen, ATM; van den Hoff, J

    2002-01-01

    Drug analysis and development with PET should fully exhaust the ability of this tomographic technique to quantify regional tracer concentrations in vivo. Data evaluation based on visual inspection or assessment of regional image contrast is not sufficient for this purpose since much of the

  16. Event History Analysis in Quantitative Genetics

    DEFF Research Database (Denmark)

    Maia, Rafael Pimentel

    Event history analysis is a class of statistical methods specially designed to analyze time-to-event characteristics, e.g. the time until death. The aim of the thesis was to present adequate multivariate versions of mixed survival models that properly represent the genetic aspects related to a given

  17. Quantitative analysis of probabilistic BPMN workflows

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2012-01-01

    We present a framework for modelling and analysis of realworld business workflows. We present a formalised core subset of the Business Process Modelling and Notation (BPMN) and then proceed to extend this language with probabilistic nondeterministic branching and general-purpose reward annotations...... of events, reward-based properties and best- and worst- case scenarios. We develop a simple example of medical workflow and demonstrate the utility of this analysis in accurate provisioning of drug stocks. Finally, we suggest a path to building upon these techniques to cover the entire BPMN language, allow...... for more complex annotations and ultimately to automatically synthesise workflows by composing predefined sub-processes, in order to achieve a configuration that is optimal for parameters of interest....

  18. Quantitative analysis of myocardial tissue with digital autofluorescence microscopy

    Directory of Open Access Journals (Sweden)

    Thomas Jensen

    2016-01-01

    Full Text Available Background: The opportunity offered by whole slide scanners of automated histological analysis implies an ever increasing importance of digital pathology. To go beyond the importance of conventional pathology, however, digital pathology may need a basic histological starting point similar to that of hematoxylin and eosin staining in conventional pathology. This study presents an automated fluorescence-based microscopy approach providing highly detailed morphological data from unstained microsections. This data may provide a basic histological starting point from which further digital analysis including staining may benefit. Methods: This study explores the inherent tissue fluorescence, also known as autofluorescence, as a means to quantitate cardiac tissue components in histological microsections. Data acquisition using a commercially available whole slide scanner and an image-based quantitation algorithm are presented. Results: It is shown that the autofluorescence intensity of unstained microsections at two different wavelengths is a suitable starting point for automated digital analysis of myocytes, fibrous tissue, lipofuscin, and the extracellular compartment. The output of the method is absolute quantitation along with accurate outlines of above-mentioned components. The digital quantitations are verified by comparison to point grid quantitations performed on the microsections after Van Gieson staining. Conclusion: The presented method is aptly described as a prestain multicomponent quantitation and outlining tool for histological sections of cardiac tissue. The main perspective is the opportunity for combination with digital analysis of stained microsections, for which the method may provide an accurate digital framework.

  19. Rings and sectors: intrasite spatial analysis of Stone Age sites

    NARCIS (Netherlands)

    Stapert, Durk

    1992-01-01

    This thesis deals with intrasite spatial analysis: the analysis of spatial patterns on site level. My main concern has been to develop a simple method for analysing Stone Age sites of a special type: those characterised by the presence of a hearth closely associated in space with an artefact

  20. Chromatic Image Analysis For Quantitative Thermal Mapping

    Science.gov (United States)

    Buck, Gregory M.

    1995-01-01

    Chromatic image analysis system (CIAS) developed for use in noncontact measurements of temperatures on aerothermodynamic models in hypersonic wind tunnels. Based on concept of temperature coupled to shift in color spectrum for optical measurement. Video camera images fluorescence emitted by phosphor-coated model at two wavelengths. Temperature map of model then computed from relative brightnesses in video images of model at those wavelengths. Eliminates need for intrusive, time-consuming, contact temperature measurements by gauges, making it possible to map temperatures on complex surfaces in timely manner and at reduced cost.

  1. Segmentation and Quantitative Analysis of Epithelial Tissues.

    Science.gov (United States)

    Aigouy, Benoit; Umetsu, Daiki; Eaton, Suzanne

    2016-01-01

    Epithelia are tissues that regulate exchanges with the environment. They are very dynamic and can acquire virtually any shape; at the cellular level, they are composed of cells tightly connected by junctions. Most often epithelia are amenable to live imaging; however, the large number of cells composing an epithelium and the absence of informatics tools dedicated to epithelial analysis largely prevented tissue scale studies. Here we present Tissue Analyzer, a free tool that can be used to segment and analyze epithelial cells and monitor tissue dynamics.

  2. Quantitative analysis of deuterium by gas chromatography

    International Nuclear Information System (INIS)

    Isomura, Shohei; Kaetsu, Hayato

    1981-01-01

    An analytical method for the determination of deuterium concentration in water and hydrogen gas by gas chromatography is described. HD and D2 in a hydrogen gas sample were separated from H2 by a column packed with Molecular Sieve 13X, using extra pure hydrogen gas as carrier. A thermal conductivity detector was used. Concentrations of deuterium were determined by comparison with standard samples. The error inherent to the present method was less than 1% on the basis of the calibration curves obtained with the standard samples. The average time required for the analysis was about 3 minutes. (author)

  3. Influence of corrosion layers on quantitative analysis

    International Nuclear Information System (INIS)

    Denker, A.; Bohne, W.; Opitz-Coutureau, J.; Rauschenberg, J.; Roehrich, J.; Strub, E.

    2005-01-01

    Art historians and restorers in charge of ancient metal objects are often reluctant to remove the corrosion layer that has evolved over time, as this would change the appearance of the artefact dramatically. Therefore, when an elemental analysis of the objects is required, this has to be done by penetrating the corrosion layer. In this work the influence of corrosion was studied on Chinese and Roman coins, where removal of oxidized material was possible. Measurements on spots with and without corrosion are presented and the results discussed.

  4. Statistical analysis of long term spatial and temporal trends of ...

    Indian Academy of Sciences (India)

    Statistical analysis of long term spatial and temporal trends of temperature ... CGCM3; HadCM3; modified Mann–Kendall test; statistical analysis; Sutlej basin. ... Water Resources Systems Division, National Institute of Hydrology, Roorkee 247 ...

  5. Structural model analysis of multiple quantitative traits.

    Directory of Open Access Journals (Sweden)

    Renhua Li

    2006-07-01

    Full Text Available We introduce a method for the analysis of multilocus, multitrait genetic data that provides an intuitive and precise characterization of genetic architecture. We show that it is possible to infer the magnitude and direction of causal relationships among multiple correlated phenotypes and illustrate the technique using body composition and bone density data from mouse intercross populations. Using these techniques we are able to distinguish genetic loci that affect adiposity from those that affect overall body size and thus reveal a shortcoming of standardized measures such as body mass index that are widely used in obesity research. The identification of causal networks sheds light on the nature of genetic heterogeneity and pleiotropy in complex genetic systems.

  6. Human eyeball model reconstruction and quantitative analysis.

    Science.gov (United States)

    Xing, Qi; Wei, Qi

    2014-01-01

    Determining the shape of the eyeball is important to diagnose eyeball diseases like myopia. In this paper, we present an automatic approach to precisely reconstruct the three-dimensional geometric shape of the eyeball from MR images. The model development pipeline involved image segmentation, registration, B-spline surface fitting and subdivision surface fitting, none of which required manual interaction. From the resulting high-resolution models, geometric characteristics of the eyeball can be accurately quantified and analyzed. In addition to the eight metrics commonly used by existing studies, we proposed two novel metrics, Gaussian Curvature Analysis and Sphere Distance Deviation, to quantify the cornea shape and the whole eyeball surface respectively. The experiment results showed that the reconstructed eyeball models accurately represent the complex morphology of the eye. The ten metrics parameterize the eyeball among different subjects, which can potentially be used for eye disease diagnosis.

  7. Quantitative Image Simulation and Analysis of Nanoparticles

    DEFF Research Database (Denmark)

    Madsen, Jacob; Hansen, Thomas Willum

    High-Resolution Transmission Electron Microscopy (HRTEM) has become a routine analysis tool for structural characterization at atomic resolution, and with the recent development of in-situ TEMs, it is now possible to study catalytic nanoparticles under reaction conditions. However, the connection between an experimental image and the underlying...... physical phenomena or structure is not always straightforward. The aim of this thesis is to use image simulation to better understand observations from HRTEM images. Surface strain is known to be important for the performance of nanoparticles. Using simulation, we estimate the precision and accuracy...... of strain measurements from TEM images, and investigate the stability of these measurements to microscope parameters. This is followed by our efforts toward simulating metal nanoparticles on a metal-oxide support using the Charge Optimized Many Body (COMB) interatomic potential. The simulated interface

  8. Biostatistical analysis of quantitative immunofluorescence microscopy images.

    Science.gov (United States)

    Giles, C; Albrecht, M A; Lam, V; Takechi, R; Mamo, J C

    2016-12-01

    Semiquantitative immunofluorescence microscopy has become a key methodology in biomedical research. Typical statistical workflows are considered in the context of avoiding pseudo-replication and marginalising experimental error. However, immunofluorescence microscopy naturally generates hierarchically structured data that can be leveraged to improve statistical power and enrich biological interpretation. Herein, we describe a robust distribution fitting procedure and compare several statistical tests, outlining their potential advantages/disadvantages in the context of biological interpretation. Further, we describe tractable procedures for power analysis that incorporates the underlying distribution, sample size and number of images captured per sample. The procedures outlined have significant potential for increasing understanding of biological processes and decreasing both ethical and financial burden through experimental optimization. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.

  9. Developments in statistical analysis in quantitative genetics

    DEFF Research Database (Denmark)

    Sorensen, Daniel

    2009-01-01

    A remarkable research impetus has taken place in statistical genetics since the last World Conference. This has been stimulated by breakthroughs in molecular genetics, automated data-recording devices and computer-intensive statistical methods. The latter were revolutionized by the bootstrap and by Markov chain Monte Carlo (McMC). In this overview a number of specific areas are chosen to illustrate the enormous flexibility that McMC has provided for fitting models and exploring features of data that were previously inaccessible. The selected areas are inferences of the trajectories over time of genetic means and variances, models for the analysis of categorical and count data, the statistical genetics of a model postulating that environmental variance is partly under genetic control, and a short discussion of models that incorporate massive genetic marker information. We provide an overview......

  10. Generalised recurrence plot analysis for spatial data

    International Nuclear Information System (INIS)

    Marwan, Norbert; Kurths, Juergen; Saparin, Peter

    2007-01-01

    Recurrence plot based methods are highly efficient and widely accepted tools for the investigation of time series or one-dimensional data. We present an extension of the recurrence plots and their quantifications in order to study recurrent structures in higher-dimensional spatial data. The capability of this extension is illustrated on prototypical 2D models. Next, the tested and proven approach is applied to assess the bone structure from CT images of human proximal tibia. We find that the spatial structures in trabecular bone become more recurrent during bone loss in osteoporosis.
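
    The simplest quantification carried over from recurrence plots is the recurrence rate: the fraction of location pairs whose values are closer than a tolerance ε. A minimal sketch for 2D fields (the pair subsampling and the ε value are illustrative choices, not the paper's bone-CT settings):

```python
import numpy as np

def spatial_recurrence_rate(field, eps, n_pairs=200000, seed=0):
    """Fraction of randomly sampled location pairs whose values lie
    within eps of each other."""
    rng = np.random.default_rng(seed)
    v = np.asarray(field, dtype=float).ravel()
    i = rng.integers(0, v.size, n_pairs)
    j = rng.integers(0, v.size, n_pairs)
    return float(np.mean(np.abs(v[i] - v[j]) < eps))

# A smooth field recurs more than white noise at the same tolerance.
x = np.linspace(0, 4 * np.pi, 128)
smooth = np.sin(x)[:, None] * np.cos(x)[None, :]
noise = np.random.default_rng(1).normal(size=(128, 128))
print(spatial_recurrence_rate(smooth, 0.1), spatial_recurrence_rate(noise, 0.1))
```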

  11. Parameter determination for quantitative PIXE analysis using genetic algorithms

    International Nuclear Information System (INIS)

    Aspiazu, J.; Belmont-Moreno, E.

    1996-01-01

    For biological and environmental samples, the PIXE technique is particularly advantageous for elemental analysis, but quantitative analysis requires complex calculations involving more than a dozen parameters. Using a genetic algorithm, the authors give here an account of the procedure to obtain the best values for the parameters necessary to fit the efficiency of an X-ray detector. The values of some variables involved in quantitative PIXE analysis were manipulated in a way similar to how genetic information is treated in a biological process. The authors ran the algorithm until it reproduced, within the confidence interval, the elemental concentrations corresponding to a reference material.
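
    The flavor of the approach can be conveyed with a tiny genetic algorithm fitting a two-parameter efficiency curve to reference points (the model form, reference data and GA settings below are hypothetical stand-ins, not the paper's detector physics):

```python
import math
import random

# Hypothetical reference measurements: (energy in keV, efficiency).
REFERENCE = [(2.0, 0.45), (5.0, 0.74), (10.0, 0.86), (20.0, 0.92)]

def fitness(params):
    # Negative squared error of a toy model eff(E) = a * exp(-b / E).
    a, b = params
    return -sum((a * math.exp(-b / e) - eff) ** 2 for e, eff in REFERENCE)

def evolve(pop_size=60, generations=200, mutation=0.1):
    pop = [[random.uniform(0, 2), random.uniform(0, 5)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            p, q = random.sample(survivors, 2)
            child = [random.choice(genes) for genes in zip(p, q)]  # crossover
            children.append([g + random.gauss(0, mutation) for g in child])
        pop = survivors + children
    return max(pop, key=fitness)

print(evolve())
```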

  12. Quantitative analysis of carbon in plutonium

    International Nuclear Information System (INIS)

    Lefevre, Chantal.

    1979-11-01

    The aim of this study is to develop a method for the determination of carbon traces (20 to 400 ppm) in plutonium. The development of a carbon-in-plutonium standard is described; its carbon content is then determined and its validity as a standard shown by analysis in two different ways. In the first method used, reaction of the metal with sulphur and determination of carbon as carbon sulphide, the following parameters were studied: influence of excess reagent, surface growth of samples in contact with sulphur, and temperature and reaction time. The results obtained are in agreement with those obtained by the conventional method of carbon determination, combustion in oxygen and measurement of carbon in the form of carbon dioxide. With this standard available, we were then able to study the different parameters involved in plutonium combustion so that the reaction can be made complete: temperature reached during combustion, role of flux, metal surface in contact with oxygen and, finally, the method of cleaning plutonium samples. [fr]

  13. Quantitative analysis of forest fire extinction efficiency

    Directory of Open Access Journals (Sweden)

    Miguel E. Castillo-Soto

    2015-08-01

    Full Text Available Aim of study: Evaluate the economic extinction efficiency of forest fires, based on the study of fire combat undertaken by aerial and terrestrial means. Area of study, materials and methods: Approximately 112,000 hectares in Chile. Records of 5,876 forest fires that occurred between 1998 and 2009 were analyzed. The area further provides a validation sector for results, by incorporating databases for the years 2010 and 2012. The criteria used for measuring extinction efficiency were economic value of forestry resources, Contraction Factor analysis and definition of the extinction costs function. Main results: It is possible to establish a relationship between burnt area, extinction costs and economic losses. The method proposed may be used and adapted to other fire situations, requiring unit costs for aerial and terrestrial operations, economic value of the property to be protected and speed attributes of fire spread in free advance. Research highlights: The method determines extinction efficiency in forest fire containment work and projects potential losses; different types of plant fuel and local conditions favoring the spread of fire considerably broaden the admissible ranges of a, φ and Ce.

  14. Research progress and hotspot analysis of spatial interpolation

    Science.gov (United States)

    Jia, Li-juan; Zheng, Xin-qi; Miao, Jin-li

    2018-02-01

    In this paper, literature related to spatial interpolation published between 1982 and 2017 and included in the Web of Science core database is used as the data source, and a visualization analysis is carried out on the co-country network, co-category network, co-citation network and keyword co-occurrence network. It is found that spatial interpolation has experienced three stages: slow development, steady development and rapid development. Eleven clusters with strong cross-effects emerge, converging mainly on spatial interpolation theory, practical applications and case studies, and research on the accuracy and efficiency of spatial interpolation. Finding the optimal spatial interpolation method is the frontier and hot spot of the research. Spatial interpolation research has formed a theoretical basis and a research framework; it is strongly interdisciplinary and is widely applied in various fields.

  15. Sequential spatial processes for image analysis

    NARCIS (Netherlands)

    M.N.M. van Lieshout (Marie-Colette); V. Capasso

    2009-01-01

    We give a brief introduction to sequential spatial processes. We discuss their definition, formulate a Markov property, and indicate why such processes are natural tools in tackling high level vision problems. We focus on the problem of tracking a variable number of moving objects

  16. Sequential spatial processes for image analysis

    NARCIS (Netherlands)

    Lieshout, van M.N.M.; Capasso, V.

    2009-01-01

    We give a brief introduction to sequential spatial processes. We discuss their definition, formulate a Markov property, and indicate why such processes are natural tools in tackling high level vision problems. We focus on the problem of tracking a variable number of moving objects through a video

  17. Second order analysis for spatial Hawkes processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Torrisi, Giovanni Luca

    We derive summary statistics for stationary Hawkes processes which can be considered as spatial versions of classical Hawkes processes. Particularly, we derive the intensity, the pair correlation function and the Bartlett spectrum. Our results for Gaussian fertility rates and the extension...... to marked Hawkes processes are discussed....

  18. [Quantitative data analysis for live imaging of bone].

    Science.gov (United States)

    Seno, Shigeto

    Bone is a hard tissue, and it has therefore been difficult to observe the interior of living bone. With the progress of microscopy and fluorescent probe technology in recent years, it has become possible to observe the various activities of the cells that form bone tissue. On the other hand, the quantitative increase in data and the diversification and complexity of the images make it difficult to perform quantitative analysis by visual inspection, and a methodology for microscope image processing and data analysis has been awaited. In this article, we introduce the research field of bioimage informatics, the boundary area between biology and information science, and then outline basic image processing techniques for the quantitative analysis of live imaging data of bone.

  19. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    Science.gov (United States)

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media and plate-count techniques on the estimation of microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpreting errors) was established. The most appropriate method of statistical analysis of such data was ANOVA, which enabled not only the effect of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, which is appropriate for traditional plate-count methods. Copyright © 2015 Elsevier B.V. All rights reserved.

  20. Quantiprot - a Python package for quantitative analysis of protein sequences.

    Science.gov (United States)

    Konopka, Bogumił M; Marciniak, Marta; Dyrka, Witold

    2017-07-17

    The field of protein sequence analysis is dominated by tools rooted in substitution matrices and alignments. A complementary approach is provided by methods of quantitative characterization. A major advantage of the approach is that quantitative properties define a multidimensional solution space, where sequences can be related to each other and differences can be meaningfully interpreted. Quantiprot is a software package in Python, which provides a simple and consistent interface to multiple methods for quantitative characterization of protein sequences. The package can be used to calculate dozens of characteristics directly from sequences or using physico-chemical properties of amino acids. Besides basic measures, Quantiprot performs quantitative analysis of recurrence and determinism in the sequence, calculates the distribution of n-grams and computes the Zipf's law coefficient. We propose three main fields of application of the Quantiprot package. First, quantitative characteristics can be used in alignment-free similarity searches, and in clustering of large and/or divergent sequence sets. Second, a feature space defined by quantitative properties can be used in comparative studies of protein families and organisms. Third, the feature space can be used for evaluating generative models, where a large number of sequences generated by the model can be compared to actually observed sequences.
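
    For instance, the Zipf's law coefficient mentioned above is the (negative) slope of the n-gram rank-frequency distribution on a log-log scale. The following plain-Python sketch illustrates the characteristic itself; it does not use the Quantiprot API:

```python
from collections import Counter
import math

def zipf_coefficient(sequence, n=2):
    """Least-squares log-log slope of the n-gram rank-frequency curve."""
    grams = Counter(sequence[i:i + n] for i in range(len(sequence) - n + 1))
    freqs = sorted(grams.values(), reverse=True)
    xs = [math.log(r) for r in range(1, len(freqs) + 1)]
    ys = [math.log(f) for f in freqs]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope

print(zipf_coefficient("MKVLAAGIVLLLSAAGTEAMKVLAAGIVLLLS"))
```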

  1. Automated high resolution full-field spatial coherence tomography for quantitative phase imaging of human red blood cells

    Science.gov (United States)

    Singla, Neeru; Dubey, Kavita; Srivastava, Vishal; Ahmad, Azeem; Mehta, D. S.

    2018-02-01

    We developed an automated high-resolution full-field spatial coherence tomography (FF-SCT) microscope for quantitative phase imaging that is based on spatial, rather than temporal, coherence gating. Red and green laser light was used to obtain quantitative phase images of unstained human red blood cells (RBCs). This study uses morphological parameters of unstained RBC phase images to distinguish between normal and infected cells. We recorded a single interferogram with the FF-SCT microscope at each of the red and green wavelengths and averaged the two phase images to further reduce noise artifacts. To distinguish anemia-infected from normal cells, different morphological features were extracted and used to train a machine-learning ensemble model that classifies RBCs with high accuracy.

  2. Quantitative methods for the analysis of electron microscope images

    DEFF Research Database (Denmark)

    Skands, Peter Ulrik Vallø

    1996-01-01

    The topic of this thesis is a general introduction to quantitative methods for the analysis of digital microscope images. The images presented have primarily been acquired from scanning electron microscopes (SEM) and interferometer microscopes (IFM). The topic is approached through several examples...... foundation of the thesis falls in the areas of: 1) Mathematical Morphology; 2) Distance transforms and applications; and 3) Fractal geometry. Image analysis opens, in general, the possibility of quantitative and statistically well-founded measurement of digital microscope images. Herein lie also the conditions...

  3. Spatial analysis of the electrical energy demand in Greece

    International Nuclear Information System (INIS)

    Tyralis, Hristos; Mamassis, Nikos; Photis, Yorgos N.

    2017-01-01

    The Electrical Energy Demand (EED) of the agricultural, commercial and industrial sectors in Greece, as well as its use for domestic activities, public and municipal authorities and street lighting, is analysed spatially using Geographical Information System and spatial statistical methods. The analysis is performed on data which span from 2008 to 2012 and have annual temporal resolution and spatial resolution down to NUTS (Nomenclature of Territorial Units for Statistics) level 3. The aim is to identify spatial patterns of the EED and its transformations, such as the ratios of the EED to socioeconomic variables, i.e. the population, the total area, the population density and the Gross Domestic Product (GDP). Based on the analysis, Greece is divided into five regions, each with a different development model: Attica and Thessaloniki, which are two heavily populated major poles; Thessaly and Central Greece, which form a connected geographical region with important agricultural and industrial sectors; the islands and some coastal areas, which are characterized by an important commercial sector; and the remaining Greek areas. The spatial patterns can provide additional information for policy decisions about electrical energy management and a better representation of regional socioeconomic conditions. - Highlights: • We visualize spatially the Electrical Energy Demand (EED) in Greece. • We apply spatial analysis methods to the EED data. • Spatial patterns of the EED are identified. • Greece is classified in five distinct groups, based on the analysis. • The results can be used for optimal planning of the electric system.
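    Several records in this collection rely on global spatial autocorrelation statistics, so a self-contained sketch of Moran's I may be useful. The four-region contiguity matrix below is a toy stand-in for weights that a real analysis would derive from, e.g., NUTS level 3 polygon adjacency.

```python
import numpy as np

def morans_i(values, weights):
    """Global Moran's I for a variable observed on n spatial units.
    `weights` is an n x n spatial weights matrix (w[i, j] > 0 when
    units i and j are neighbours; the diagonal is zero)."""
    x = np.asarray(values, float)
    w = np.asarray(weights, float)
    z = x - x.mean()
    n = len(x)
    return (n / w.sum()) * (z @ w @ z) / (z @ z)

# Toy example: four regions in a row, binary contiguity.
w = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], float)
print(morans_i([10.0, 12.0, 30.0, 33.0], w))  # positive -> clustering
```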

  4. Original methods of quantitative analysis developed for diverse samples in various research fields. Quantitative analysis at NMCC

    International Nuclear Information System (INIS)

    Sera, Koichiro

    2003-01-01

    Nishina Memorial Cyclotron Center (NMCC) has been open for nationwide common utilization of positron nuclear medicine (PET) and PIXE since April 1993. At present, nearly 40 PIXE subjects in various research fields are pursued here, and more than 50,000 samples have been analyzed to date. In order to perform quantitative analyses of diverse samples, technical developments in sample preparation, measurement and data analysis have been continuously carried out. In particular, a "standard-free method" for quantitative analysis made it possible to analyze infinitesimal samples, powdered samples and untreated biological samples, which could not be analyzed well quantitatively in the past. The "standard-free method" and a "powdered internal standard method" made the target preparation process much easier. It has been confirmed that results obtained by these methods show satisfactory accuracy and reproducibility, free of any ambiguity arising from complicated target preparation processes. (author)

  5. Quantitative scenario analysis of low and intermediate level radioactive repository

    International Nuclear Information System (INIS)

    Lee, Keon Jae; Lee, Sang Yoon; Park, Keon Baek; Song, Min Cheon; Lee, Ho Jin

    1998-03-01

    Derivation of a hypothetical radioactive waste disposal facility is conducted through sub-component characteristic analysis and conceptual modeling. The constructed scenarios are analysed quantitatively in terms of annual effective dose equivalent. The study proceeds sequentially through the elements of a performance assessment of a radioactive waste disposal facility: groundwater flow analysis, source term analysis, groundwater transport, surface water transport, dose and pathways. The routine program module chain VAM2D-PAGAN-GENII is used for the quantitative scenario analysis. Detailed data used in these modules come from experimental data for Korean territory and from default data supplied with the modules. Where data required for code execution are missing, values are estimated through reasonable engineering judgement.

  6. Spatial analysis of hemorrhagic fever with renal syndrome in China

    Directory of Open Access Journals (Sweden)

    Yang Hong

    2006-04-01

    Full Text Available Background: Hemorrhagic fever with renal syndrome (HFRS) is endemic with high incidence in many provinces of mainland China, although integrated intervention measures including rodent control, environment management and vaccination have been implemented for over ten years. In this study, we conducted a geographic information system (GIS)-based spatial analysis of the distribution of HFRS cases for the whole country, with the objective of informing priority areas for public health planning and resource allocation. Methods: Annualized average incidence at the county level was calculated using HFRS cases reported during 1994-1998 in mainland China. GIS-based spatial analyses were conducted to detect spatial autocorrelation and clusters of HFRS incidence at the county level throughout the country. Results: The spatial distribution of HFRS cases in mainland China from 1994 to 1998 was mapped at the county level in terms of crude incidence, excess hazard and spatially smoothed incidence. The spatial distribution of HFRS cases was nonrandom and clustered, with a Moran's I of 0.5044 (p = 0.001). Spatial cluster analyses suggested that 26 and 39 areas were at increased risk of HFRS. Conclusion: The application of GIS, together with spatial statistical techniques, provides a means to quantify explicit HFRS risks and to further identify environmental factors responsible for the increasing disease risks. We demonstrate a new perspective of integrating such spatial analysis tools into the epidemiologic study and risk assessment of HFRS.

  7. Platform for Quantitative Evaluation of Spatial Intratumoral Heterogeneity in Multiplexed Fluorescence Images.

    Science.gov (United States)

    Spagnolo, Daniel M; Al-Kofahi, Yousef; Zhu, Peihong; Lezon, Timothy R; Gough, Albert; Stern, Andrew M; Lee, Adrian V; Ginty, Fiona; Sarachan, Brion; Taylor, D Lansing; Chennubhotla, S Chakra

    2017-11-01

    We introduce THRIVE (Tumor Heterogeneity Research Interactive Visualization Environment), an open-source tool developed to assist cancer researchers in interactive hypothesis testing. The focus of this tool is to quantify spatial intratumoral heterogeneity (ITH), and the interactions between different cell phenotypes and noncellular constituents. Specifically, we foresee applications in phenotyping cells within tumor microenvironments, recognizing tumor boundaries, identifying degrees of immune infiltration and epithelial/stromal separation, and identification of heterotypic signaling networks underlying microdomains. The THRIVE platform provides an integrated workflow for analyzing whole-slide immunofluorescence images and tissue microarrays, including algorithms for segmentation, quantification, and heterogeneity analysis. THRIVE promotes flexible deployment, a maintainable code base using open-source libraries, and an extensible framework for customizing algorithms with ease. THRIVE was designed with highly multiplexed immunofluorescence images in mind, and, by providing a platform to efficiently analyze high-dimensional immunofluorescence signals, we hope to advance these data toward mainstream adoption in cancer research. Cancer Res; 77(21); e71-74. ©2017 American Association for Cancer Research.

  8. Crash rates analysis in China using a spatial panel model

    Directory of Open Access Journals (Sweden)

    Wonmongo Lacina Soro

    2017-10-01

    Full Text Available The consideration of spatial externalities in traffic safety analysis is of paramount importance for the success of road safety policies. Yet nearly all spatial-dependence studies of crash rates are performed within the framework of single-equation spatial cross-sectional models. The present study extends the spatial cross-sectional scheme to a spatial fixed-effects panel model estimated using the maximum likelihood method. The spatial units are the 31 administrative regions of mainland China over the period 2004–2013. The presence of neighborhood effects is evidenced through the Moran's I statistic. Consistent with previous studies, the analysis reveals that omitting spatial effects in traffic safety analysis is likely to bias the estimation results. The spatial and error lags are all positive and statistically significant, suggesting similar crash-rate patterns in neighboring regions. Some explanatory variables, such as freight traffic, the length of paved roads and the population aged 65 and above, are related to higher rates, while the opposite trend is observed for Gross Regional Product, the urban unemployment rate and passenger traffic.

  9. Quantitative Proteomic Analysis of Sulfolobus solfataricus Membrane Proteins

    NARCIS (Netherlands)

    Pham, T.K.; Sierocinski, P.; Oost, van der J.; Wright, P.C.

    2010-01-01

    A quantitative proteomic analysis of the membrane of the archaeon Sulfolobus solfataricus P2 using iTRAQ was successfully demonstrated in this technical note. The estimated number of membrane proteins of this organism is 883 (predicted based on Gravy score), corresponding to 30 % of the total

  10. Data from quantitative label free proteomics analysis of rat spleen

    Directory of Open Access Journals (Sweden)

    Khadar Dudekula

    2016-09-01

    Full Text Available The dataset presented in this work has been obtained using a label-free quantitative proteomic analysis of rat spleen. A robust method for extraction of proteins from rat spleen tissue and LC-MS-MS analysis was developed using a urea and SDS-based buffer. Different fractionation methods were compared. A total of 3484 different proteins were identified from the pool of all experiments run in this study (a total of 2460 proteins with at least two peptides). A total of 1822 proteins were identified from nine non-fractionated pulse gels, while 2288 and 2864 proteins were identified by SDS-PAGE fractionation into three and five fractions, respectively. The proteomics data are deposited in the ProteomeXchange Consortium via PRIDE (PXD003520); Progenesis and MaxQuant outputs are presented in the supporting information. The lists of proteins generated under the different fractionation regimes allow the nature of the identified proteins to be assessed and the variability in the quantitative analysis associated with the different sampling strategies to be evaluated, and they allow a proper number of replicates to be defined for future quantitative analyses. Keywords: Spleen, Rat, Protein extraction, Label-free quantitative proteomics

  11. Quantitative Analysis of Complex Tropical Forest Stands: A Review ...

    African Journals Online (AJOL)

    The importance of data analysis in quantitative assessment of natural resources remains significant in the sustainable management of complex tropical forest resources. Analyses of data from complex tropical forest stands have not been easy or clear due to improper data management. It is pivotal to practical researches ...

  12. Spatial Analysis of Accident Spots Using Weighted Severity Index ...

    African Journals Online (AJOL)

    ADOWIE PERE

    Spatial Analysis of Accident Spots Using Weighted Severity Index (WSI) and ... pedestrians avoiding the use of pedestrian bridges/aid even when they are available. ..... not minding an unforeseen obstruction, miscalculations and wrong break.

  13. SPATIAL ANALYSIS AND DECISION ASSISTANCE (SADA) TRAINING COURSE

    Science.gov (United States)

    Spatial Analysis and Decision Assistance (SADA) is a Windows freeware program that incorporates tools from environmental assessment into an effective problem-solving environment. SADA was developed by the Institute for Environmental Modeling at the University of Tennessee and inc...

  14. Quantitative electron microscope autoradiography: application of multiple linear regression analysis

    International Nuclear Information System (INIS)

    Markov, D.V.

    1986-01-01

    A new method for the analysis of high resolution EM autoradiographs is described. It identifies labelled cell organelle profiles in sections on a strictly statistical basis and provides accurate estimates for their radioactivity without the need to make any assumptions about their size, shape and spatial arrangement. (author)

  15. Asymptotic analysis of spatial discretizations in implicit Monte Carlo

    International Nuclear Information System (INIS)

    Densmore, Jeffery D.

    2009-01-01

    We perform an asymptotic analysis of spatial discretizations in Implicit Monte Carlo (IMC). We consider two asymptotic scalings: one that represents a time step that resolves the mean-free time, and one that corresponds to a fixed, optically large time step. We show that only the latter scaling results in a valid spatial discretization of the proper diffusion equation, and thus we conclude that IMC only yields accurate solutions when using optically large spatial cells if time steps are also optically large. We demonstrate the validity of our analysis with a set of numerical examples.

  16. Quantitative analysis of regional myocardial performance in coronary artery disease

    Science.gov (United States)

    Stewart, D. K.; Dodge, H. T.; Frimer, M.

    1975-01-01

    Findings are presented from a group of subjects with significant coronary artery stenosis and from a group of controls, obtained with a quantitative method for the study of regional myocardial performance based on frame-by-frame analysis of biplane left ventricular angiograms. Particular emphasis was placed upon the analysis of wall motion in terms of normalized segment dimensions and the timing and velocity of contraction. The results were compared with the method of subjective assessment used clinically.

  17. Quantitative analysis of culture using millions of digitized books.

    Science.gov (United States)

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva Presser; Veres, Adrian; Gray, Matthew K; Pickett, Joseph P; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A; Aiden, Erez Lieberman

    2011-01-14

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of 'culturomics,' focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. Culturomics extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities.

  18. Quantitative analysis of culture using millions of digitized books

    Science.gov (United States)

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva P.; Veres, Adrian; Gray, Matthew K.; Pickett, Joseph P.; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A.; Aiden, Erez Lieberman

    2011-01-01

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of ‘culturomics’, focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. ‘Culturomics’ extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities. PMID:21163965

  19. Quantitative analysis by computer controlled X-ray fluorescence spectrometer

    International Nuclear Information System (INIS)

    Balasubramanian, T.V.; Angelo, P.C.

    1981-01-01

    X-ray fluorescence spectroscopy has become a widely accepted method in the metallurgical field for the analysis of both minor and major elements. As in many other analytical techniques, the problem of matrix effects, generally known as interelement effects, must be dealt with effectively in order to make the analysis accurate. There are several methods by which the effects of the matrix on the analyte are minimised or corrected for, and mathematical correction is one of them. In this method the characteristic secondary X-ray intensities are measured from standard samples, and correction coefficients for interelement effects, if any, are evaluated by mathematical calculation. This paper describes attempts to evaluate the correction coefficients for interelement effects by multiple linear regression programmes on a computer for the quantitative analysis of stainless steel and a nickel-base cast alloy. The quantitative results obtained using this method for a standard stainless steel sample are compared with the certified values. (author)
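    A minimal sketch of the regression step described above, with entirely hypothetical calibration intensities and certified concentrations: the correction coefficients are obtained by ordinary least squares over the standards and then applied to an unknown sample.

```python
import numpy as np

# Hypothetical calibration data for one analyte: measured fluorescence
# intensities of the analyte and two interfering matrix elements
# (columns), with concentrations known from certified standards.
I = np.array([[1.00, 0.20, 0.05],
              [0.80, 0.35, 0.10],
              [1.20, 0.15, 0.20],
              [0.90, 0.40, 0.15],
              [1.10, 0.25, 0.08]])
c_known = np.array([10.1, 8.4, 12.0, 9.3, 11.0])  # wt%

# Linear interelement-correction model:
#   c = a0 + a1*I_analyte + a2*I_elem2 + a3*I_elem3
X = np.column_stack([np.ones(len(I)), I])
coef, *_ = np.linalg.lstsq(X, c_known, rcond=None)

# Predict the analyte concentration in an unknown sample.
unknown = np.array([1.0, 0.95, 0.30, 0.12])  # leading 1.0 = intercept term
print(unknown @ coef)
```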

  20. Spatial Assessment of Road Traffic Injuries in the Greater Toronto Area (GTA: Spatial Analysis Framework

    Directory of Open Access Journals (Sweden)

    Sina Tehranchi

    2017-03-01

    Full Text Available This research presents a Geographic Information Systems (GIS) and spatial analysis approach based on the global spatial autocorrelation of road traffic injuries for identifying spatial patterns. Local spatial autocorrelation was also used to identify traffic injury at the spatial level. Data for this study were acquired from the Canadian Institute for Health Information (CIHI) for 2004 and 2011. Moran's I statistics were used to examine spatial patterns of road traffic injuries in the Greater Toronto Area (GTA). The Getis-Ord Gi* statistic was then used to identify hot spots and cold spots within the study area. The results revealed that Peel and Durham have the highest collision rate for other motor vehicle with motor vehicle. A geographically weighted regression (GWR) was conducted to test the relationships between the dependent variable, the number of road traffic injury incidents, and independent variables such as the number of seniors, low education, unemployment, vulnerable groups, people smoking and drinking, urban density and average median income. This model suggested that the number of seniors and low education correlate very strongly with the number of road traffic injury incidents.
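    For illustration, a compact numpy implementation of the Getis-Ord Gi* z-score used for hot spot detection, on a toy one-dimensional contiguity structure; production work would typically use a library such as PySAL with polygon-based weights.

```python
import numpy as np

def getis_ord_gi_star(x, w):
    """z-scores of the Getis-Ord Gi* statistic. `w` is an n x n
    spatial weights matrix whose diagonal is nonzero (the 'star'
    variant counts each unit as its own neighbour)."""
    x = np.asarray(x, float)
    n = len(x)
    xbar = x.mean()
    s = np.sqrt((x ** 2).mean() - xbar ** 2)
    wsum = w.sum(axis=1)
    num = w @ x - xbar * wsum
    den = s * np.sqrt((n * (w ** 2).sum(axis=1) - wsum ** 2) / (n - 1))
    return num / den

# Toy example: five zones in a row, binary contiguity plus self.
w = np.eye(5)
for i in range(4):
    w[i, i + 1] = w[i + 1, i] = 1.0
z = getis_ord_gi_star([2.0, 3.0, 9.0, 8.0, 1.0], w)
print(z)  # large positive -> hot spot, large negative -> cold spot
```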

  1. Assessment of tuberculosis spatial hotspot areas in Antananarivo, Madagascar, by combining spatial analysis and genotyping.

    Science.gov (United States)

    Ratovonirina, Noël Harijaona; Rakotosamimanana, Niaina; Razafimahatratra, Solohery Lalaina; Raherison, Mamy Serge; Refrégier, Guislaine; Sola, Christophe; Rakotomanana, Fanjasoa; Rasolofo Razanamparany, Voahangy

    2017-08-14

    Tuberculosis (TB) remains a public health problem in Madagascar. A crucial element of TB control is the development of an easy and rapid method for orienting TB control strategies in the country. Our main objective was to develop a TB spatial hotspot identification method combining spatial analysis and TB genotyping in Antananarivo. Sputa of new pulmonary TB cases from 20 TB diagnosis and treatment centers (DTCs) in Antananarivo were collected from August 2013 to May 2014 for culture. Mycobacterium tuberculosis complex (MTBC) clinical isolates were typed by spoligotyping on a Luminex® 200 platform. All TB patients were localized according to their neighborhood of residence, and the spatial distributions of all pulmonary TB patients and of patients with genotypically clustered isolates were scanned with the Kulldorff spatial scanning method to identify significant spatial clustering. Areas exhibiting spatial clustering of patients with genotypically clustered isolates were considered hotspot TB areas for transmission. Overall, 467 new cases were included in the study, and 394 spoligotypes were obtained (84.4%). New TB cases were distributed over 133 of the 192 Fokontany (administrative neighborhoods) of Antananarivo (1 to 15 patients per Fokontany), and patients with genotypically clustered isolates were distributed over 127 of the 192 Fokontany (1 to 13 per Fokontany). A single spatial focal point of epidemics was detected when ignoring genotypic data (p = 0.039). One Fokontany of this focal point and three additional ones were detected to be spatially clustered when taking genotypes into account. This approach to identifying spatial hotspots of TB transmission in Madagascar will allow better TB control strategies by public health authorities.

  2. A spatial analysis of China's coal flow

    International Nuclear Information System (INIS)

    Mou Dunguo; Li Zhi

    2012-01-01

    The characteristics of China's energy structure and the distribution of its coal resources make coal transportation a very important component of the energy system; moreover, coal transportation acts as a bottleneck for the Chinese economy. To ensure the security of the coal supply, China has begun to build regional strategic coal reserves at some locations, but transportation is still the fundamental way to guarantee supply security. Here, we study China's coal transportation quantitatively with a linear programming method that analyses the direction and volume of China's coal flows under the prerequisite that each province's supply and demand balance is guaranteed. First, we analyse the optimal coal transportation for the status quo coal supply and demand, given the bottleneck effects that the Daqin Railway has on China's coal flow; second, we analyse the influence of future shifts of the coal supply zone, finding that China's coal flows will also change, which will pressure China to construct railways and ports; and finally, we analyse the possibility of exploiting Yangtze River capacity for coal transportation. We conclude the paper with suggestions for enhancing China's coal transportation security. - Highlights: ► We use linear programming to study China's coal transportation. ► First, analyse the optimal coal flow under the status quo condition. ► Second, analyse influences of coal supply zone shifts to Neimeng and Xinjiang. ► Third, analyse the influence of using the Yangtze River for coal transportation. ► At last, we give suggestions about infrastructure construction to guarantee China's long-run coal supply security.
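    The flow model described, provincial supply limits plus demand balances with transport costs as the objective, is a classic transportation linear program. A toy sketch with hypothetical provinces, demands and unit costs, using scipy:

```python
import numpy as np
from scipy.optimize import linprog

# Toy transportation problem: 3 supply provinces, 4 demand provinces.
supply = [60.0, 80.0, 40.0]          # Mt available per province
demand = [30.0, 50.0, 70.0, 30.0]    # Mt required per province
cost = np.array([[4, 6, 9, 3],       # unit shipping cost, supply x demand
                 [5, 4, 7, 8],
                 [6, 3, 4, 5]], float)

m, n = cost.shape
# Supply constraints: each province ships at most what it produces.
A_ub = np.zeros((m, m * n))
for i in range(m):
    A_ub[i, i * n:(i + 1) * n] = 1.0
# Demand constraints: each destination receives exactly what it needs.
A_eq = np.zeros((n, m * n))
for j in range(n):
    A_eq[j, j::n] = 1.0

res = linprog(cost.ravel(), A_ub=A_ub, b_ub=supply,
              A_eq=A_eq, b_eq=demand, bounds=(0, None), method="highs")
print(res.x.reshape(m, n))  # optimal flow matrix (Mt)
```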

  3. Data from quantitative label free proteomics analysis of rat spleen.

    Science.gov (United States)

    Dudekula, Khadar; Le Bihan, Thierry

    2016-09-01

    The dataset presented in this work has been obtained using a label-free quantitative proteomic analysis of rat spleen. A robust method for extraction of proteins from rat spleen tissue and LC-MS-MS analysis was developed using a urea and SDS-based buffer. Different fractionation methods were compared. A total of 3484 different proteins were identified from the pool of all experiments run in this study (a total of 2460 proteins with at least two peptides). A total of 1822 proteins were identified from nine non-fractionated pulse gels, while 2288 and 2864 proteins were identified by SDS-PAGE fractionation into three and five fractions, respectively. The proteomics data are deposited in the ProteomeXchange Consortium via PRIDE (PXD003520); Progenesis and MaxQuant outputs are presented in the supporting information. The lists of proteins generated under the different fractionation regimes allow the nature of the identified proteins to be assessed and the variability in the quantitative analysis associated with the different sampling strategies to be evaluated, and they allow a proper number of replicates to be defined for future quantitative analyses.

  4. Spatially explicit spectral analysis of point clouds and geospatial data

    Science.gov (United States)

    Buscombe, Daniel D.

    2015-01-01

    The increasing use of spatially explicit analyses of high-resolution spatially distributed data (imagery and point clouds) for the purposes of characterising spatial heterogeneity in geophysical phenomena necessitates the development of custom analytical and computational tools. In recent years, such analyses have become the basis of, for example, automated texture characterisation and segmentation, roughness and grain size calculation, and feature detection and classification, from a variety of data types. In this work, much use has been made of statistical descriptors of localised spatial variations in amplitude variance (roughness), however the horizontal scale (wavelength) and spacing of roughness elements is rarely considered. This is despite the fact that the ratio of characteristic vertical to horizontal scales is not constant and can yield important information about physical scaling relationships. Spectral analysis is a hitherto under-utilised but powerful means to acquire statistical information about relevant amplitude and wavelength scales, simultaneously and with computational efficiency. Further, quantifying spatially distributed data in the frequency domain lends itself to the development of stochastic models for probing the underlying mechanisms which govern the spatial distribution of geological and geophysical phenomena. The software package PySESA (Python program for Spatially Explicit Spectral Analysis) has been developed for generic analyses of spatially distributed data in both the spatial and frequency domains. Developed predominantly in Python, it accesses libraries written in Cython and C++ for efficiency. It is open source and modular, therefore readily incorporated into, and combined with, other data analysis tools and frameworks with particular utility for supporting research in the fields of geomorphology, geophysics, hydrography, photogrammetry and remote sensing. The analytical and computational structure of the toolbox is

  5. Spatially explicit spectral analysis of point clouds and geospatial data

    Science.gov (United States)

    Buscombe, Daniel

    2016-01-01

    The increasing use of spatially explicit analyses of high-resolution spatially distributed data (imagery and point clouds) for the purposes of characterising spatial heterogeneity in geophysical phenomena necessitates the development of custom analytical and computational tools. In recent years, such analyses have become the basis of, for example, automated texture characterisation and segmentation, roughness and grain size calculation, and feature detection and classification, from a variety of data types. In this work, much use has been made of statistical descriptors of localised spatial variations in amplitude variance (roughness), however the horizontal scale (wavelength) and spacing of roughness elements is rarely considered. This is despite the fact that the ratio of characteristic vertical to horizontal scales is not constant and can yield important information about physical scaling relationships. Spectral analysis is a hitherto under-utilised but powerful means to acquire statistical information about relevant amplitude and wavelength scales, simultaneously and with computational efficiency. Further, quantifying spatially distributed data in the frequency domain lends itself to the development of stochastic models for probing the underlying mechanisms which govern the spatial distribution of geological and geophysical phenomena. The software package PySESA (Python program for Spatially Explicit Spectral Analysis) has been developed for generic analyses of spatially distributed data in both the spatial and frequency domains. Developed predominantly in Python, it accesses libraries written in Cython and C++ for efficiency. It is open source and modular, therefore readily incorporated into, and combined with, other data analysis tools and frameworks with particular utility for supporting research in the fields of geomorphology, geophysics, hydrography, photogrammetry and remote sensing. The analytical and computational structure of the toolbox is described
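    As a sketch of the kind of spectral roughness analysis PySESA automates (this is not PySESA's own API), the following computes a radially averaged power spectrum of a synthetic gridded surface, relating amplitude variance to horizontal wavenumber:

```python
import numpy as np

def radial_power_spectrum(z, dx=1.0, nbins=20):
    """Radially averaged power spectrum of a detrended 2-D height
    field `z` (grid spacing `dx`), linking roughness amplitude to
    horizontal wavelength scales."""
    z = z - z.mean()
    power = np.abs(np.fft.fftshift(np.fft.fft2(z))) ** 2
    ny, nx = z.shape
    ky = np.fft.fftshift(np.fft.fftfreq(ny, d=dx))
    kx = np.fft.fftshift(np.fft.fftfreq(nx, d=dx))
    KX, KY = np.meshgrid(kx, ky)
    k = np.hypot(KX, KY).ravel()
    p = power.ravel()
    edges = np.linspace(0, k.max(), nbins + 1)
    idx = np.digitize(k, edges)
    spec = np.array([p[idx == b].mean() if np.any(idx == b) else 0.0
                     for b in range(1, nbins + 1)])
    return 0.5 * (edges[1:] + edges[:-1]), spec

# Synthetic "rough surface" standing in for gridded point-cloud heights.
rng = np.random.default_rng(0)
z = np.cumsum(np.cumsum(rng.standard_normal((128, 128)), axis=0), axis=1)
wavenumber, spec = radial_power_spectrum(z, dx=0.25)
print(wavenumber[:3], spec[:3])
```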

  6. A kernel version of spatial factor analysis

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg

    2009-01-01

    Schölkopf et al. introduce kernel PCA. Shawe-Taylor and Cristianini is an excellent reference for kernel methods in general. Bishop and Press et al. describe kernel methods among many other subjects. Nielsen and Canty use kernel PCA to detect change in univariate airborne digital camera images. The kernel...... version of PCA handles nonlinearities by implicitly transforming data into high (even infinite) dimensional feature space via the kernel function and then performing a linear analysis in that space. In this paper we shall apply kernel versions of PCA, maximum autocorrelation factor (MAF) analysis...
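    A minimal sketch of the kernel machinery these records describe: data are implicitly mapped to feature space via an RBF kernel, the kernel matrix is centred, and a linear eigen-analysis is performed there. Kernel MAF adds spatial-shift covariances on top of this, which the sketch omits.

```python
import numpy as np

def kernel_pca(X, n_components=2, gamma=0.1):
    """Kernel PCA with an RBF kernel: a linear PCA carried out
    implicitly in the kernel-induced feature space."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-gamma * sq)
    n = len(X)
    J = np.eye(n) - np.ones((n, n)) / n
    Kc = J @ K @ J                      # centre the kernel matrix
    vals, vecs = np.linalg.eigh(Kc)
    order = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[order], vecs[:, order]
    return vecs * np.sqrt(np.clip(vals, 0, None))  # projected scores

# Nonlinear toy data: two concentric rings, separable only nonlinearly.
rng = np.random.default_rng(1)
t = rng.uniform(0, 2 * np.pi, 200)
r = np.r_[np.ones(100), 3 * np.ones(100)]
X = np.c_[r * np.cos(t), r * np.sin(t)] + 0.05 * rng.standard_normal((200, 2))
scores = kernel_pca(X, n_components=2, gamma=1.0)
print(scores[:3])
```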

  7. Accurate quantitative XRD phase analysis of cement clinkers

    International Nuclear Information System (INIS)

    Kern, A.

    2002-01-01

    Full text: Knowledge of the absolute phase abundance in cement clinkers is a requirement for both research and quality control. Traditionally, quantitative analysis of cement clinkers has been carried out by theoretical normative calculation from chemical analysis using the so-called Bogue method, or by optical microscopy. Chemical analysis, mostly performed by X-ray fluorescence (XRF), therefore forms the basis of cement plant control by providing information for proportioning raw materials, adjusting kiln and burning conditions, and cement mill feed proportioning. In addition, XRF is of the highest importance with respect to the environmentally relevant control of waste recovery raw materials and alternative fuels, as well as filters, plants and sewage. However, the performance of clinkers and cements is governed by the mineralogy and not the elemental composition, and the deficiencies and inherent errors of Bogue as well as of microscopic point counting are well known. With XRD and Rietveld analysis, a full quantitative analysis of cement clinkers can be performed, providing detailed mineralogical information about the product. Until recently, several disadvantages prevented the frequent application of the Rietveld method in the cement industry. As the measurement of a full pattern is required, extended measurement times made integration of this method into existing automation environments difficult. In addition, several drawbacks of existing Rietveld software, such as complexity, low performance and severe numerical instability, were prohibitive for automated use. The latest developments in on-line instrumentation, as well as dedicated Rietveld software for quantitative phase analysis (TOPAS), now make a decisive breakthrough possible. TOPAS not only allows the analysis of extremely complex phase mixtures in the shortest time possible, but also fully automated online phase analysis for production control and quality management, free of any human interaction

  8. Application of Fourier analysis to multispectral/spatial recognition

    Science.gov (United States)

    Hornung, R. J.; Smith, J. A.

    1973-01-01

    One approach for investigating spectral response from materials is to consider spatial features of the response. This might be accomplished by considering the Fourier spectrum of the spatial response. The Fourier Transform may be used in a one-dimensional to multidimensional analysis of more than one channel of data. The two-dimensional transform represents the Fraunhofer diffraction pattern of the image in optics and has certain invariant features. Physically the diffraction pattern contains spatial features which are possibly unique to a given configuration or classification type. Different sampling strategies may be used to either enhance geometrical differences or extract additional features.

  9. Quantitative analysis of myocardial tissue with digital autofluorescence microscopy

    DEFF Research Database (Denmark)

    Jensen, Thomas; Holten-Rossing, Henrik; Svendsen, Ida M H

    2016-01-01

    BACKGROUND: The opportunity offered by whole slide scanners of automated histological analysis implies an ever increasing importance of digital pathology. To go beyond the importance of conventional pathology, however, digital pathology may need a basic histological starting point similar to that of hematoxylin and eosin staining in conventional pathology. This study presents an automated fluorescence-based microscopy approach providing highly detailed morphological data from unstained microsections. This data may provide a basic histological starting point from which further digital analysis including...... staining may benefit. METHODS: This study explores the inherent tissue fluorescence, also known as autofluorescence, as a means to quantitate cardiac tissue components in histological microsections. Data acquisition using a commercially available whole slide scanner and an image-based quantitation algorithm......

  10. Quantitative high-resolution genomic analysis of single cancer cells.

    Science.gov (United States)

    Hannemann, Juliane; Meyer-Staeckling, Sönke; Kemming, Dirk; Alpers, Iris; Joosse, Simon A; Pospisil, Heike; Kurtz, Stefan; Görndt, Jennifer; Püschel, Klaus; Riethdorf, Sabine; Pantel, Klaus; Brandt, Burkhard

    2011-01-01

    During cancer progression, specific genomic aberrations arise that can determine the scope of the disease and can be used as predictive or prognostic markers. The detection of specific gene amplifications or deletions in single blood-borne or disseminated tumour cells that may give rise to the development of metastases is of great clinical interest but technically challenging. In this study, we present a method for quantitative high-resolution genomic analysis of single cells. Cells were isolated under permanent microscopic control, followed by high-fidelity whole genome amplification and subsequent analyses by fine tiling array-CGH and qPCR. The assay was applied to single breast cancer cells to analyze the chromosomal region centred on the therapeutically relevant EGFR gene. This method allows precise quantitative analysis of copy number variations in single-cell diagnostics.

  11. Quantitative high-resolution genomic analysis of single cancer cells.

    Directory of Open Access Journals (Sweden)

    Juliane Hannemann

    Full Text Available During cancer progression, specific genomic aberrations arise that can determine the scope of the disease and can be used as predictive or prognostic markers. The detection of specific gene amplifications or deletions in single blood-borne or disseminated tumour cells that may give rise to the development of metastases is of great clinical interest but technically challenging. In this study, we present a method for quantitative high-resolution genomic analysis of single cells. Cells were isolated under permanent microscopic control, followed by high-fidelity whole genome amplification and subsequent analyses by fine tiling array-CGH and qPCR. The assay was applied to single breast cancer cells to analyze the chromosomal region centred on the therapeutically relevant EGFR gene. This method allows precise quantitative analysis of copy number variations in single-cell diagnostics.

  12. A quantitative analysis of coupled oscillations using mobile accelerometer sensors

    International Nuclear Information System (INIS)

    Castro-Palacio, Juan Carlos; Velázquez-Abad, Luisberis; Giménez, Fernando; Monsoriu, Juan A

    2013-01-01

    In this paper, smartphone acceleration sensors were used to perform a quantitative analysis of mechanical coupled oscillations. Symmetric and asymmetric normal modes were studied separately in the first two experiments. In the third, a coupled oscillation was studied as a combination of the normal modes. Results indicate that acceleration sensors of smartphones, which are very familiar to students, represent valuable measurement instruments for introductory and first-year physics courses. (paper)
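    A sketch of how such a record can be analysed: with weak coupling the measured acceleration is a superposition of the two normal modes, whose frequencies appear as two nearby spectral peaks. The sampling rate and mode frequencies below are assumptions, not the paper's values.

```python
import numpy as np
from scipy.signal import find_peaks

# Hypothetical smartphone accelerometer record of a coupled oscillation:
# a superposition of the symmetric and asymmetric normal modes, which
# shows up as beats in time and two nearby peaks in frequency.
fs = 100.0                         # assumed sensor sampling rate, Hz
t = np.arange(0.0, 30.0, 1.0 / fs)
f_sym, f_asym = 1.00, 1.20         # assumed normal-mode frequencies, Hz
a = np.cos(2 * np.pi * f_sym * t) + np.cos(2 * np.pi * f_asym * t)
a += 0.05 * np.random.default_rng(2).standard_normal(t.size)

spectrum = np.abs(np.fft.rfft(a * np.hanning(t.size)))
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
peaks, _ = find_peaks(spectrum, height=0.5 * spectrum.max())
print(freqs[peaks])                # approximately [1.0, 1.2] Hz
```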

  13. A quantitative analysis of coupled oscillations using mobile accelerometer sensors

    Science.gov (United States)

    Castro-Palacio, Juan Carlos; Velázquez-Abad, Luisberis; Giménez, Fernando; Monsoriu, Juan A.

    2013-05-01

    In this paper, smartphone acceleration sensors were used to perform a quantitative analysis of mechanical coupled oscillations. Symmetric and asymmetric normal modes were studied separately in the first two experiments. In the third, a coupled oscillation was studied as a combination of the normal modes. Results indicate that acceleration sensors of smartphones, which are very familiar to students, represent valuable measurement instruments for introductory and first-year physics courses.

  14. Quantitative analysis of the ATV data base, Stage 2

    International Nuclear Information System (INIS)

    Stenquist, C.; Kjellbert, N.A.

    1981-01-01

    A supplementary study of the Swedish ATV data base was carried out. The study was limited to an analysis of the quantitative coverage of component failures from 1979 through 1980. The results indicate that the coverage of component failures is about 75-80 per cent relative to the failure reports and work order sheets at the reactor sites, together with SKI's ''Safety Related Occurrences''. In general there has been an improvement compared to previous years. (Auth.)

  15. Quantitative analysis of culture using millions of digitized books

    OpenAIRE

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva P.; Veres, Adrian; Gray, Matthew K.; Pickett, Joseph P.; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A.; Aiden, Erez Lieberman

    2010-01-01

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of ‘culturomics’, focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pu...

  16. Quantitative Analysis of Culture Using Millions of Digitized Books

    OpenAIRE

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva Presser; Veres, Adrian; Gray, Matthew K.; Google Books Team; Pickett, Joseph; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A.; Aiden, Erez Lieberman

    2011-01-01

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of ‘culturomics,’ focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pu...

  17. Kernel parameter dependence in spatial factor analysis

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg

    2010-01-01

    kernel PCA. Shawe-Taylor and Cristianini [4] is an excellent reference for kernel methods in general. Bishop [5] and Press et al. [6] describe kernel methods among many other subjects. The kernel version of PCA handles nonlinearities by implicitly transforming data into high (even infinite) dimensional...... feature space via the kernel function and then performing a linear analysis in that space. In this paper we shall apply a kernel version of maximum autocorrelation factor (MAF) [7, 8] analysis to irregularly sampled stream sediment geochemistry data from South Greenland and illustrate the dependence...... of the kernel width. The 2,097 samples each covering on average 5 km2 are analyzed chemically for the content of 41 elements....

  18. Portable (handheld) clinical device for quantitative spectroscopy of skin, utilizing spatial frequency domain reflectance techniques

    Science.gov (United States)

    Saager, Rolf B.; Dang, An N.; Huang, Samantha S.; Kelly, Kristen M.; Durkin, Anthony J.

    2017-09-01

    Spatial Frequency Domain Spectroscopy (SFDS) is a technique for quantifying in-vivo tissue optical properties. SFDS employs structured light patterns that are projected onto tissues using a spatial light modulator, such as a digital micromirror device. In combination with appropriate models of light propagation, this technique can be used to quantify tissue optical properties (absorption, μa, and scattering, μs', coefficients) and chromophore concentrations. Here we present a handheld implementation of an SFDS device that employs line (one dimensional) imaging. This instrument can measure 1088 spatial locations that span a 3 cm line as opposed to our original benchtop SFDS system that only collects a single 1 mm diameter spot. This imager, however, retains the spectral resolution (˜1 nm) and range (450-1000 nm) of our original benchtop SFDS device. In the context of homogeneous turbid media, we demonstrate that this new system matches the spectral response of our original system to within 1% across a typical range of spatial frequencies (0-0.35 mm-1). With the new form factor, the device has tremendously improved mobility and portability, allowing for greater ease of use in a clinical setting. A smaller size also enables access to different tissue locations, which increases the flexibility of the device. The design of this portable system not only enables SFDS to be used in clinical settings but also enables visualization of properties of layered tissues such as skin.

  19. Quantitative risk analysis of a space shuttle subsystem

    International Nuclear Information System (INIS)

    Frank, M.V.

    1989-01-01

    This paper reports on an effort to investigate risk management methods other than qualitative analysis techniques: NASA has funded pilot-study quantitative risk analyses for space shuttle subsystems. The authors performed one such study of two shuttle subsystems with McDonnell Douglas Astronautics Company. The subsystems were the auxiliary power units (APU) on the orbiter and the hydraulic power units on the solid rocket booster. The technology and results of the APU study are presented in this paper. Drawing from a rich in-flight database as well as from a wealth of tests and analyses, the study quantitatively assessed the risk of APU-initiated scenarios on the shuttle during all phases of a flight mission. Damage states of interest were loss of crew/vehicle, aborted mission, and launch scrub. A quantitative risk analysis approach to deciding on important items for risk management was contrasted with the current NASA failure mode and effects analysis/critical item list approach

  20. Mathematical Analysis of Urban Spatial Networks

    CERN Document Server

    Blanchard, Philippe

    2009-01-01

    Cities can be considered to be among the largest and most complex artificial networks created by human beings. Due to the numerous and diverse human-driven activities, urban network topology and dynamics can differ quite substantially from that of natural networks and so call for an alternative method of analysis. The intent of the present monograph is to lay down the theoretical foundations for studying the topology of compact urban patterns, using methods from spectral graph theory and statistical physics. These methods are demonstrated as tools to investigate the structure of a number of real cities with widely differing properties: medieval German cities, the webs of city canals in Amsterdam and Venice, and a modern urban structure such as found in Manhattan. Last but not least, the book concludes by providing a brief overview of possible applications that will eventually lead to a useful body of knowledge for architects, urban planners and civil engineers.
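    As a small taste of the spectral-graph-theory toolkit the monograph builds on, the following computes the normalized Laplacian spectrum of a toy street grid, with networkx's grid graph standing in for a compact urban pattern:

```python
import networkx as nx
import numpy as np

# Toy "compact urban pattern": a 6 x 6 street grid. The eigenvalue
# spectrum of the normalized graph Laplacian is one of the spectral
# fingerprints such an analysis compares across cities.
G = nx.grid_2d_graph(6, 6)
L = nx.normalized_laplacian_matrix(G).toarray()
eigenvalues = np.linalg.eigvalsh(L)
print(eigenvalues[:5])   # small eigenvalues reflect large-scale structure
```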

  1. Ratio of slopes method for quantitative analysis in ceramic bodies

    International Nuclear Information System (INIS)

    Zainal Arifin Ahmad; Ahmad Fauzi Mohd Noor; Radzali Othman; Messer, P.F.

    1996-01-01

    A quantitative X-ray diffraction analysis technique developed at the University of Sheffield was adopted, rather than the previously widely used internal standard method, to determine the amounts of the phases present in a reformulated whiteware porcelain and a BaTiO3 electrochemical material. This method, although it still employs an internal standard, was found to be very easy and accurate. The required weight fraction of a phase in the mixture to be analysed is determined from the ratio of the slopes of two linear plots, designated the analysis and reference lines, each passing through the origin and fitted using the least squares method
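    A numeric sketch of the ratio-of-slopes idea with hypothetical intensity ratios: both the analysis and reference lines are fitted through the origin by least squares, and the weight fraction of the phase follows directly from the ratio of the two slopes.

```python
import numpy as np

def slope_through_origin(x, y):
    """Least-squares slope of a line constrained to pass through zero."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return (x @ y) / (x @ x)

# Reference line (hypothetical): known weight fractions of the pure phase
# vs intensity ratio against the internal standard.
w_ref = [0.10, 0.20, 0.30, 0.40]
r_ref = [0.21, 0.39, 0.61, 0.82]
# Analysis line (hypothetical): the unknown body diluted stepwise.
w_unk = [0.25, 0.50, 0.75, 1.00]   # fraction of unknown in the mixture
r_unk = [0.13, 0.24, 0.37, 0.50]

k_ref = slope_through_origin(w_ref, r_ref)
k_unk = slope_through_origin(w_unk, r_unk)
print(k_unk / k_ref)   # ~0.24, i.e. about 24 wt% of the phase
```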

  2. Coupling of oceanic carbon and nitrogen: A window to spatially resolved quantitative reconstruction of nitrate inventories

    Science.gov (United States)

    Glock, N.; Liebetrau, V.; Gorb, S.; Wallmann, K. J. G.; Erdem, Z.; Schönfeld, J.; Eisenhauer, A.

    2017-12-01

    Anthropogenic impact has led to a severe acceleration of the global nitrogen cycle. Every second nitrogen atom in the biosphere may now originate from anthropogenic sources such as chemical fertilizers and the burning of fossil fuels. A quantitative reconstruction of past reactive nitrogen inventories is invaluable for projections of future scenarios, and calibrations for such paleoproxies should be carried out while the natural signature is still visible. Here we present a first quantitative reconstruction of nitrate concentrations at intermediate water depths of the Peruvian oxygen minimum zone over the last deglaciation, using the pore density in the benthic foraminiferal species Bolivina spissa. A comparison of the nitrate reconstruction with the stable carbon isotope (δ13C) record reveals a strong coupling between the carbon and nitrogen cycles. The linear correlation between δ13C and nitrate availability remained stable over the last 22,000 years, facilitating the use of δ13C records as a quantitative nitrate proxy. The combination of the pore density record with δ13C records shows an elevated oceanic nitrate inventory during the Last Glacial Maximum compared with the Holocene. Our novel proxy approach is consistent with the results of previous δ15N-based biogeochemical modeling studies, and thus provides sound estimates of the nitrate inventory in the glacial and deglacial ocean.

  3. Multispectral colour analysis for quantitative evaluation of pseudoisochromatic color deficiency tests

    Science.gov (United States)

    Ozolinsh, Maris; Fomins, Sergejs

    2010-11-01

    Multispectral color analysis was used for spectral scanning of images from the Ishihara and Rabkin color deficiency test books, using tunable liquid-crystal (LC) filters built into the Nuance II analyzer. Multispectral analysis retains both the spatial and the spectral content of the tests. Images were taken in the range 420-720 nm with a 10 nm step. We calculated retinal neural activity charts taking into account cone sensitivity functions, and processed the charts to find the visibility of latent symbols in the color deficiency plates using a cross-correlation technique. In this way a quantitative measure is obtained for each diagnostic plate for three types of color deficiency: protanopes, deuteranopes and tritanopes. Multispectral color analysis also allows the CIE xyz color coordinates of the pseudoisochromatic plate design elements to be determined, and statistical analysis of these data to be performed to compare the color quality of available color deficiency test books.
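    The abstract does not specify the exact scoring, but a cross-correlation visibility measure can be sketched as follows: a latent-symbol template is correlated against a (here simulated) neural-activity chart, and the correlation peak is compared with the background.

```python
import numpy as np
from scipy.signal import correlate2d

def visibility(chart, template):
    """Peak-to-background cross-correlation score: high when the
    template is clearly present in the chart."""
    c = correlate2d(chart - chart.mean(), template - template.mean(),
                    mode="valid")
    return c.max() / c.std()

# Simulated activity chart with an embedded latent symbol.
rng = np.random.default_rng(3)
chart = rng.standard_normal((64, 64))
symbol = np.zeros((15, 15))
symbol[7, :] = 1.0                   # toy "dash" template
chart[20:35, 20:35] += 2.0 * symbol  # embed the symbol in the chart

print(visibility(chart, symbol))
```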

  4. Quantitative analysis of flux regulation through hierarchical regulation analysis

    NARCIS (Netherlands)

    van Eunen, Karen; Rossell, Sergio; Bouwman, Jildau; Westerhoff, Hans V.; Bakker, Barbara M.; Jameson, D; Verma, M; Westerhoff, HV

    2011-01-01

    Regulation analysis is a methodology that quantifies to what extent a change in the flux through a metabolic pathway is regulated by either gene expression or metabolism. Two extensions to regulation analysis were developed over the past years: (i) the regulation of V(max) can be dissected into the

  5. Quantitative analysis of flux regulation through hierarchical regulation analysis

    NARCIS (Netherlands)

    Eunen, K. van; Rossell, S.; Bouwman, J.; Westerhoff, H.V.; Bakker, B.M.

    2011-01-01

    Regulation analysis is a methodology that quantifies to what extent a change in the flux through a metabolic pathway is regulated by either gene expression or metabolism. Two extensions to regulation analysis were developed over the past years: (i) the regulation of Vmax can be dissected into the
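    In regulation analysis the flux change is split multiplicatively, so the hierarchical regulation coefficient is a ratio of log-changes and the metabolic coefficient is its complement. A minimal sketch with hypothetical numbers:

```python
import numpy as np

def regulation_coefficients(J0, J1, Vmax0, Vmax1):
    """Hierarchical regulation analysis: rho_h = dln(Vmax)/dln(J),
    with rho_h + rho_m = 1 splitting the flux change between gene
    expression (via Vmax) and metabolism."""
    rho_h = np.log(Vmax1 / Vmax0) / np.log(J1 / J0)
    return rho_h, 1.0 - rho_h

# Hypothetical numbers: flux doubles while Vmax rises by 50%.
rho_h, rho_m = regulation_coefficients(1.0, 2.0, 10.0, 15.0)
print(rho_h, rho_m)   # ~0.58 hierarchical, ~0.42 metabolic
```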

  6. Novel approach in quantitative analysis of shearography method

    International Nuclear Information System (INIS)

    Wan Saffiey Wan Abdullah

    2002-01-01

    The application of laser interferometry in industrial non-destructive testing and material characterization is becoming more prevalent, since this method provides non-contact full-field inspection of the test object. However, its application has long been limited to qualitative analysis; the current trend is toward quantitative analysis, which attempts to characterize the examined defect in detail, and this becomes a design feature for the range of object sizes to be examined. The growing commercial demand for quantitative analysis for NDT and material characterization is driving the quality of optical and analysis instruments. However, very little attention is currently being paid to understanding, quantifying and compensating for the numerous error sources inherent in interferometers. This paper presents a comparison of measurement analysis using the established theoretical approach and a new approach that takes into account divergent illumination and other geometrical factors. The differences between the measurement systems can be attributed to these error factors. (Author)

  7. Quantitative subsurface analysis using frequency modulated thermal wave imaging

    Science.gov (United States)

    Subhani, S. K.; Suresh, B.; Ghali, V. S.

    2018-01-01

    Quantitative depth analysis of subsurface anomalies with enhanced depth resolution is a challenging task in thermography. Frequency modulated thermal wave imaging, introduced earlier, provides a complete depth scan of the object by stimulating it with a suitable band of frequencies and then analyzing the resulting thermal response with a suitable post-processing approach to resolve subsurface details. However, the conventional Fourier-transform-based methods used for post-processing unscramble the frequencies with limited frequency resolution and therefore yield a finite depth resolution. The spectral zooming provided by the chirp z-transform offers enhanced frequency resolution, which can further improve the depth resolution and axially resolve the finest subsurface features. Quantitative depth analysis with this augmented depth resolution is proposed to provide the closest estimate to the actual depth of a subsurface anomaly. This manuscript validates the enhanced depth resolution experimentally using non-stationary thermal wave imaging and offers a first, unique solution for quantitative depth estimation in frequency modulated thermal wave imaging.
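    The spectral zooming step can be illustrated with a direct chirp z-transform evaluated on a narrow band only; SciPy also ships CZT routines, but the explicit O(N·m) form below keeps the idea visible. The signal and band edges are assumptions.

```python
import numpy as np

def zoom_spectrum(x, fs, f1, f2, m):
    """Direct O(N*m) chirp z-transform: evaluate the spectrum of x on
    m points spanning only the band [f1, f2] Hz, a much finer grid
    than the FFT provides for the same record."""
    n = np.arange(len(x))
    freqs = np.linspace(f1, f2, m)
    kernel = np.exp(-2j * np.pi * np.outer(freqs, n) / fs)
    return freqs, np.abs(kernel @ x)

# A tone at 4.037 Hz in a 10 s record: the plain FFT grid has 0.1 Hz
# spacing, while the zoomed transform pinpoints the peak frequency.
fs = 100.0
t = np.arange(0.0, 10.0, 1.0 / fs)
x = np.sin(2 * np.pi * 4.037 * t)
freqs, mag = zoom_spectrum(x, fs, 3.5, 4.5, 1001)
print(freqs[np.argmax(mag)])   # ~4.037 Hz
```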

  8. Quantitative analysis of patient-specific dosimetric IMRT verification

    International Nuclear Information System (INIS)

    Budgell, G J; Perrin, B A; Mott, J H L; Fairfoul, J; Mackay, R I

    2005-01-01

    Patient-specific dosimetric verification methods for IMRT treatments are variable, time-consuming and frequently qualitative, preventing evidence-based reduction in the amount of verification performed. This paper addresses some of these issues by applying a quantitative analysis parameter to the dosimetric verification procedure. Film measurements in different planes were acquired for a series of ten IMRT prostate patients, analysed using the quantitative parameter, and compared to determine the most suitable verification plane. Film and ion chamber verification results for 61 patients were analysed to determine long-term accuracy, reproducibility and stability of the planning and delivery system. The reproducibility of the measurement and analysis system was also studied. The results show that verification results are strongly dependent on the plane chosen, with the coronal plane particularly insensitive to delivery error. Unexpectedly, no correlation could be found between the levels of error in different verification planes. Longer term verification results showed consistent patterns which suggest that the amount of patient-specific verification can be safely reduced, provided proper caution is exercised: an evidence-based model for such reduction is proposed. It is concluded that dose/distance to agreement (e.g., 3%/3 mm) should be used as a criterion of acceptability. Quantitative parameters calculated for a given criterion of acceptability should be adopted in conjunction with displays that show where discrepancies occur. Planning and delivery systems which cannot meet the required standards of accuracy, reproducibility and stability to reduce verification will not be accepted by the radiotherapy community
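    The dose/distance-to-agreement criterion recommended above is commonly evaluated as a gamma index. A deliberately simple 1-D sketch with global normalisation and brute-force search, not the authors' software:

```python
import numpy as np

def gamma_1d(ref, meas, x, dose_tol=0.03, dist_tol=3.0):
    """Simplistic 1-D gamma index (global 3%/3 mm by default): for
    each reference point, the minimum combined dose-difference and
    distance metric over all measured points; gamma <= 1 passes."""
    ref, meas, x = map(np.asarray, (ref, meas, x))
    d_norm = dose_tol * ref.max()          # global dose normalisation
    gam = np.empty(len(ref))
    for i, (xi, di) in enumerate(zip(x, ref)):
        dd = (meas - di) / d_norm
        dx = (x - xi) / dist_tol
        gam[i] = np.sqrt(dd ** 2 + dx ** 2).min()
    return gam

x = np.linspace(0, 100, 201)                 # position, mm
ref = np.exp(-((x - 50) / 20) ** 2)          # planned profile
meas = 1.02 * np.exp(-((x - 51) / 20) ** 2)  # delivered: 2% high, 1 mm off
g = gamma_1d(ref, meas, x)
print((g <= 1).mean())                       # gamma pass rate
```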

  9. Light-Addressable Potentiometric Sensors for Quantitative Spatial Imaging of Chemical Species.

    Science.gov (United States)

    Yoshinobu, Tatsuo; Miyamoto, Ko-Ichiro; Werner, Carl Frederik; Poghossian, Arshak; Wagner, Torsten; Schöning, Michael J

    2017-06-12

    A light-addressable potentiometric sensor (LAPS) is a semiconductor-based chemical sensor, in which a measurement site on the sensing surface is defined by illumination. This light addressability can be applied to visualize the spatial distribution of pH or the concentration of a specific chemical species, with potential applications in the fields of chemistry, materials science, biology, and medicine. In this review, the features of this chemical imaging sensor technology are compared with those of other technologies. Instrumentation, principles of operation, and various measurement modes of chemical imaging sensor systems are described. The review discusses and summarizes state-of-the-art technologies, especially with regard to the spatial resolution and measurement speed; for example, a high spatial resolution in a submicron range and a readout speed in the range of several tens of thousands of pixels per second have been achieved with the LAPS. The possibility of combining this technology with microfluidic devices and other potential future developments are discussed.

  10. Multivariate analysis of quantitative traits can effectively classify rapeseed germplasm

    Directory of Open Access Journals (Sweden)

    Jankulovska Mirjana

    2014-01-01

    In this study, the use of different multivariate approaches to classify rapeseed genotypes based on quantitative traits is presented. Tree regression analysis, PCA and two-way cluster analysis were applied in order to describe and understand the extent of genetic variability in spring rapeseed genotype-by-trait data. The traits which most influenced seed and oil yield in rapeseed were successfully identified by the tree regression analysis. The principal predictor for both response variables was number of pods per plant (NP). NP and 1000-seed weight could help in the selection of high-yielding genotypes; high values for both traits together with oil content could lead to high oil-yielding genotypes. These traits may serve as indirect selection criteria and can lead to improvement of seed and oil yield in rapeseed. Quantitative traits that explained most of the variability in the studied germplasm were classified using principal component analysis. In this data set, five PCs were identified, of which the first three explained 63% of the total variance; this facilitated the choice of variables on which the genotypes' clustering could be performed. The two-way cluster analysis simultaneously clustered genotypes and quantitative traits, with the final number of clusters determined by a bootstrapping technique. This approach provided a clear overview of the variability of the analyzed genotypes; genotypes with similar performance on the traits included in this study can be easily detected on the heatmap. Genotypes grouped in clusters 1 and 8 had high values for seed and oil yield and a relatively short vegetative growth period, and those in cluster 9 combined moderate to low values for vegetative growth duration with moderate to high seed and oil yield. These genotypes should be further exploited and implemented in the rapeseed breeding program. The combined application of these multivariate methods
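
    As a loose illustration of the PCA plus two-way clustering workflow described above (not the authors' code), the following sketch standardizes a hypothetical genotype-by-trait table, inspects the cumulative explained variance, and clusters rows and columns hierarchically; the file name and cluster counts are assumptions.

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        # hypothetical genotype-by-trait table (rows: genotypes, columns: traits)
        X = np.loadtxt("rapeseed_traits.csv", delimiter=",", skiprows=1)
        Xs = StandardScaler().fit_transform(X)

        # PCA: how many components explain most of the trait variability?
        pca = PCA().fit(Xs)
        print(np.cumsum(pca.explained_variance_ratio_)[:5])

        # Two-way clustering: cluster genotypes (rows) and traits (columns) separately
        row_clusters = fcluster(linkage(Xs, method="ward"), t=9, criterion="maxclust")
        col_clusters = fcluster(linkage(Xs.T, method="ward"), t=4, criterion="maxclust")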

  11. Quantitative analysis of microtubule orientation in interdigitated leaf pavement cells.

    Science.gov (United States)

    Akita, Kae; Higaki, Takumi; Kutsuna, Natsumaro; Hasezawa, Seiichiro

    2015-01-01

    Leaf pavement cells are shaped like jigsaw puzzle pieces in most dicotyledon species. Molecular genetic studies have identified several genes required for pavement cell morphogenesis and proposed that microtubules play crucial roles in the interdigitation of pavement cells. In this study, we performed quantitative analysis of cortical microtubule orientation in leaf pavement cells of Arabidopsis thaliana. We captured confocal images of cortical microtubules in cotyledon leaf epidermis expressing GFP-tubulinβ and quantitatively evaluated the microtubule orientations relative to the pavement cell growth axis using original image processing techniques. Our results showed that microtubules maintained orientations parallel to the growth axis during pavement cell growth. In addition, we showed that immersion treatment of seed cotyledons in solutions containing tubulin polymerization and depolymerization inhibitors decreased pavement cell complexity. Treatment with oryzalin and colchicine inhibited the symmetric division of guard mother cells.

  12. Risk prediction, safety analysis and quantitative probability methods - a caveat

    International Nuclear Information System (INIS)

    Critchley, O.H.

    1976-01-01

    Views are expressed on the use of quantitative techniques for the determination of value judgements in nuclear safety assessments, hazard evaluation, and risk prediction. Caution is urged when attempts are made to quantify value judgements in the field of nuclear safety. Criteria are given for the meaningful application of reliability methods, but doubts are expressed about their application to safety analysis, risk prediction and design guidance for experimental or prototype plant. Doubts are also expressed about some concomitant methods of population dose evaluation. The complexities of new designs of nuclear power plants make the problem of safety assessment more difficult, but some possible approaches are suggested as alternatives to the quantitative techniques criticized. (U.K.)

  13. Quantitative analysis of macro-ARG using IP system

    International Nuclear Information System (INIS)

    Nakajima, Eiichi; Kawai, Kenji; Furuta, Yoshitake

    1997-01-01

    Recent progress in imaging plate (IP) systems allows us to analyze autoradiographic images quantitatively. In whole-body autoradiography, a method for clarifying the distribution of a radioisotope or labeled compound in the tissues and organs of a freeze-dried whole-body section of a small animal such as a rat or mouse, the sections are pressed against an IP for exposure; the IP is then scanned by a Bio-Imaging Analyzer (Fuji Photo Film Co., Ltd) to give a digital autoradiographic image. Quantitative data concerning the activity in different tissues can be obtained using an isotope scale as a reference source. The fading effect, the application of the IP system to receptor-binding ARG distribution studies, the analysis of radio-spots on TLC, and radioactive concentrations in liquids such as blood are also discussed. (author)

  14. What Really Happens in Quantitative Group Research? Results of a Content Analysis of Recent Quantitative Research in "JSGW"

    Science.gov (United States)

    Boyle, Lauren H.; Whittaker, Tiffany A.; Eyal, Maytal; McCarthy, Christopher J.

    2017-01-01

    The authors conducted a content analysis on quantitative studies published in "The Journal for Specialists in Group Work" ("JSGW") between 2012 and 2015. This brief report provides a general overview of the current practices of quantitative group research in counseling. The following study characteristics are reported and…

  15. Scientific aspects of urolithiasis: quantitative stone analysis and crystallization experiments

    International Nuclear Information System (INIS)

    Wandt, M.A.E.

    1986-03-01

    The theory, development and results of three quantitative analytical procedures are described, and crystallization experiments in a rotary evaporator are presented. Of the different methods of quantitative X-ray powder diffraction analysis, the 'internal standard method' and a microanalytical technique were identified as the two most useful procedures for the quantitative analysis of urinary calculi. 'Reference intensity ratios' for 6 major stone phases were determined and used in the analysis of 20 calculi by the 'internal standard method'. Inductively coupled plasma atomic emission spectroscopic (ICP-AES) methods were also investigated, developed and used in this study. Various procedures for the digestion of calculi were tested, and a mixture of HNO3 and HClO4 was eventually found to be the most successful. The major elements Ca, Mg, and P in 41 calculi were determined. For the determination of trace elements, a new microwave-assisted digestion procedure was developed and used for the digestion of 100 calculi. Fluoride concentrations in two stone collections were determined using a fluoride-ion-sensitive electrode and the HNO3/HClO4 digestion procedure used for the ICP study. A series of crystallization experiments involving a standard reference artificial urine was carried out in a rotary evaporator. The effect of pH and urine composition was studied by varying the former and by including uric acid, urea, creatinine, MgO, methylene blue, chondroitin sulphate A, and fluoride in the reference solution. Crystals formed in these experiments were subjected to qualitative and semi-quantitative X-ray powder diffraction analyses. Scanning electron microscopy of several deposits was also carried out. Deposits similar to those observed in calculi were obtained with the fast evaporator. The results presented suggest that this system provides a simple yet very useful means of studying the crystallization characteristics of urine solutions.

  16. Quantitative risk analysis as a basis for emergency planning

    Energy Technology Data Exchange (ETDEWEB)

    Yogui, Regiane Tiemi Teruya [Bureau Veritas do Brasil, Rio de Janeiro, RJ (Brazil); Macedo, Eduardo Soares de [Instituto de Pesquisas Tecnologicas (IPT), Sao Paulo, SP (Brazil)

    2009-07-01

    Several environmental accidents occurred in Brazil and worldwide during the 1970s and 1980s, strongly motivating emergency preparedness in the chemical and petrochemical industries. Environmental accidents affect both the environment and the communities neighbouring industrial facilities. The present study aims to support and orient the development of emergency planning using the data obtained from quantitative risk analyses elaborated according to Technical Standard P4.261/03 of CETESB (the Sao Paulo Environmental Agency). It was observed during the research that the data generated in these studies need complementation and deeper analysis before they can be used in emergency plans. The main issues analyzed and discussed in this study were the re-evaluation of hazard identification for the emergency plans, consequence and vulnerability analysis for response planning, risk communication, and the preparation of communities exposed to manageable risks to respond to emergencies. As a result, the study intends to improve the interpretation and use of the data deriving from quantitative risk analysis in developing emergency plans. (author)

  17. Spatially resolved quantitative mapping of thermomechanical properties and phase transition temperatures using scanning probe microscopy

    Science.gov (United States)

    Jesse, Stephen; Kalinin, Sergei V; Nikiforov, Maxim P

    2013-07-09

    An approach for the thermomechanical characterization of phase transitions in polymeric materials (polyethylene terephthalate) by band excitation acoustic force microscopy is developed. This methodology allows the independent measurement of resonance frequency, Q factor, and oscillation amplitude of a tip-surface contact area as a function of tip temperature, from which the thermal evolution of tip-surface spring constant and mechanical dissipation can be extracted. A heating protocol maintained a constant tip-surface contact area and constant contact force, thereby allowing for reproducible measurements and quantitative extraction of material properties including the temperature dependence of indentation-based elastic and loss moduli.

  18. Digital Holographic Microscopy: Quantitative Phase Imaging and Applications in Live Cell Analysis

    Science.gov (United States)

    Kemper, Björn; Langehanenberg, Patrik; Kosmeier, Sebastian; Schlichthaber, Frank; Remmersmann, Christian; von Bally, Gert; Rommel, Christina; Dierker, Christian; Schnekenburger, Jürgen

    The analysis of complex processes in living cells creates a high demand for fast and label-free methods for online monitoring. Widely used fluorescence methods require specific labeling and are often restricted to chemically fixed samples. Thus, methods that offer label-free and minimally invasive detection of live cell processes and cell state alterations are of particular interest. In combination with light microscopy, digital holography provides label-free, multi-focus quantitative phase imaging of living cells. As an overview, several methods for digital holographic microscopy (DHM) are presented. First, different experimental setups for the recording of digital holograms and the modular integration of DHM into common microscopes are described. Then the numerical processing of digitally captured holograms is explained. This includes the description of spatial and temporal phase shifting techniques, spatial filtering based reconstruction, holographic autofocusing, and the evaluation of self-interference holograms. Furthermore, the usage of partially coherent light and multi-wavelength approaches is discussed. Finally, the potential of digital holographic microscopy for quantitative cell imaging is illustrated by results from selected applications. It is shown that DHM can be used for automated tracking of migrating cells and cell thickness monitoring as well as for refractive index determination of cells and particles. Moreover, the use of DHM for label-free analysis in fluidics and micro-injection monitoring is demonstrated. The results show that DHM is a highly relevant method that allows novel insights into dynamic cell biology, with applications in cancer research and in drug and toxicity testing.

  19. Quantitation of MRI sensitivity to quasi-monodisperse microbubble contrast agents for spatially resolved manometry.

    Science.gov (United States)

    Bencsik, Martin; Al-Rwaili, Amgad; Morris, Robert; Fairhurst, David J; Mundell, Victoria; Cave, Gareth; McKendry, Jonathan; Evans, Stephen

    2013-11-01

    Direct in-vivo measurement of fluid pressure cannot be achieved with MRI without the contribution of a contrast agent. No such contrast agents are currently available commercially, whilst those demonstrated previously produced only qualitative results because of their broad size distribution. Our aim is to quantitate and then model the MR sensitivity to the presence of quasi-monodisperse microbubble populations. Lipid-stabilised microbubble populations with mean radius 1.2 ± 0.8 μm were produced by mechanical agitation. Contrast agents with bubble volume fractions of up to 4% were formed, and the contribution of the bubbles to the relaxation rate was quantitated. A periodic pressure change was also applied continuously to the same contrast agent until the MR signal changes were due only to bubble radius change and not to a change in bubble density. The MR data compared favourably with the prediction of an improved numerical simulation. An excellent MR sensitivity of 23% bar(-1) has been demonstrated. This work opens up the possibility of generating microbubble preparations tailored to specific applications with optimised MR sensitivity, in particular MRI-based in-vivo manometry.

  20. Fluorescent foci quantitation for high-throughput analysis

    Directory of Open Access Journals (Sweden)

    Elena Ledesma-Fernández

    2015-06-01

    A number of cellular proteins localize to discrete foci within cells, for example DNA repair proteins, microtubule organizing centers, P bodies or kinetochores. It is often possible to measure the fluorescence emission from tagged proteins within these foci as a surrogate for the concentration of that specific protein. We wished to develop tools that would allow quantitation of fluorescence foci intensities in high-throughput studies. As proof of principle we have examined the kinetochore, a large multi-subunit complex that is critical for the accurate segregation of chromosomes during cell division. Kinetochore perturbations lead to aneuploidy, which is a hallmark of cancer cells. Hence, understanding kinetochore homeostasis and regulation is important for a global understanding of cell division and genome integrity. The 16 budding yeast kinetochores colocalize within the nucleus to form a single focus. Here we have created a set of freely available tools to allow high-throughput quantitation of kinetochore foci fluorescence. We use this 'FociQuant' tool to compare methods of kinetochore quantitation and we show proof of principle that FociQuant can be used to identify changes in kinetochore protein levels in a mutant that affects kinetochore function. This analysis can be applied to any protein that forms discrete foci in cells.

  1. Quantitative x-ray fractographic analysis of fatigue fractures

    International Nuclear Information System (INIS)

    Saprykin, Yu.V.

    1983-01-01

    The study deals with quantitative X-ray fractographic investigation of fatigue fractures of samples with sharp notches, tested at various stresses and temperatures, with the purpose of establishing a connection between material crack-resistance parameters and the local plastic instability zones that restrain and control crack growth. In fatigue fractures of notched Kh18N9T steel samples tested at +20 and -196 deg C, a sharp ring-notch effect zone is singled out, analogous to the zone in which crack growth rate is controlled by microshifting mechanisms. The size of the notch-effect zone in the investigated steel is unambiguously related to the stress amplitude, which makes it possible to determine the stress value from quantitative fractographic analysis of notched-sample fractures. The possibility of determining one of the threshold values of cyclic fracture toughness of the material from fatigue testing and fractography of notched-sample fractures is shown. A correlation was found between the size of the h_s crack-effect zone in the notched sample, the material yield limit delta, and the cyclic fracture toughness characteristic K_s. Such a correlation widens the possibilities of quantitative diagnostics of fractures by X-ray fractography methods.

  2. The cost of electricity distribution in Italy: a quantitative analysis

    International Nuclear Information System (INIS)

    Scarpa, C.

    1998-01-01

    This paper presents a quantitative analysis of the cost of medium- and low-tension electricity distribution in Italy. An econometric analysis of the cost function is proposed on the basis of data on 147 zones of the dominant firm, ENEL. Data are available only for 1996, which restricted the study to a cross-section OLS analysis; a sketch of such an estimate follows. The econometric estimate shows the existence of significant scale economies that the current organisational structure does not exploit. On this basis it is also possible to assess to what extent exogenous cost drivers affect costs; the role of the numerous exogenous factors considered seems, however, quite limited. The area of the distribution zone and an indicator of quality are the only elements that appear significant from an economic viewpoint. [it]
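
    A cross-section OLS estimate of a log-linear cost function of the kind described can be sketched as follows; the column names, file and functional form are illustrative assumptions, not the paper's specification.

        import numpy as np
        import statsmodels.api as sm

        # Hypothetical columns: cost = distribution cost of a zone, output = energy
        # delivered, plus exogenous drivers such as area served and a quality index.
        cost, output, area, quality = np.loadtxt("enel_zones_1996.csv", delimiter=",",
                                                 skiprows=1, unpack=True)
        X = sm.add_constant(np.column_stack([np.log(output), np.log(area), quality]))
        fit = sm.OLS(np.log(cost), X).fit()
        print(fit.summary())
        # A cost elasticity with respect to output below 1 indicates economies of scale:
        print("scale economies" if fit.params[1] < 1 else "diseconomies")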

  3. Quantitative multiphase analysis of archaeological bronzes by neutron diffraction

    CERN Document Server

    Siano, S; Celli, M; Pini, R; Salimbeni, R; Zoppi, M; Kockelmann, W A; Iozzo, M; Miccio, M; Moze, O

    2002-01-01

    In this paper, we report the first investigation of the potential of neutron diffraction to characterize archaeological bronze artifacts. The feasibility of phase and structural analysis was first demonstrated on standardised specimens with a typical bronze alloy composition, realised through different hardening and annealing cycles simulating possible ancient working techniques. The resulting Bragg peak widths depended strictly on the working treatment, thus providing an important analytical element for investigating ancient fabrication techniques. The diagnostic criteria developed on the standardised specimens were then applied to study two Etruscan museum pieces. Quantitative multiphase analysis by Rietveld refinement of the diffraction patterns was successfully demonstrated. Furthermore, the analysis of patterns associated with different artifact elements also yielded evidence of some distinctive prospects for neutron diffraction diagnostics in archaeometric applications. (orig.)

  4. Quantitative surface analysis using deuteron-induced nuclear reactions

    International Nuclear Information System (INIS)

    Afarideh, Hossein

    1991-01-01

    The nuclear reaction analysis (NRA) technique consists of looking at the energies of the reaction products, which uniquely define the particular elements present in the sample, and of analysing the yield/energy distribution to reveal depth profiles. A summary of the basic features of the nuclear reaction analysis technique is given; particular emphasis is placed on quantitative light element determination using (d,p) and (d,alpha) reactions. The experimental apparatus is also described. Finally, a set of (d,p) spectra for the elements Z=3 to Z=17 using 2 MeV incident deuterons is included, together with examples of further applications of the (d,alpha) spectra. (author)

  5. Program for the quantitative and qualitative analysis of

    International Nuclear Information System (INIS)

    Tepelea, V.; Purice, E.; Dan, R.; Calcev, G.; Domnisan, M.; Galis, V.; Teodosiu, G.; Debert, C.; Mocanu, N.; Nastase, M.

    1985-01-01

    A computer code for processing data from neutron activation analysis is described. The code is capable of qualitative and quantitative analysis of regular spectra from neutron-irradiated samples, measured with a Ge(Li) detector. Multichannel analysers with 1024 channels, such as the TN 1705 or the Romanian-made MCA 79, and an ITC interface can be used. The code is implemented on FELIX M118 and FELIX M216 microcomputers. Spectrum processing is performed off-line, after storing the data on a floppy disk. The background is assumed to be a polynomial of first, second or third degree. Quantitative analysis is performed by recursive least-squares Gaussian curve fitting. The elements are identified using a polynomial relation between energy and channel, obtained by calibration with a standard sample.

  6. Quantitative analysis of light elements in aerosol samples by PIGE

    International Nuclear Information System (INIS)

    Mateus, R.; Reis, M.A.; Jesus, A.P.; Ribeiro, J.P.

    2006-01-01

    Quantitative PIGE analysis of aerosol samples collected on Nuclepore polycarbonate filters was performed by a method that avoids the use of comparative standards. Nuclear cross sections and calibration parameters established earlier in extensive work on thick and intermediate samples were employed. For those samples, the excitation functions of nuclear reactions induced by the incident protons on the light elements of the target were used as input for a code that evaluates the gamma-ray yield by integrating along the depth of the sample. In the present work we apply the same code to validate the use of an effective energy for thin-sample analysis. Results pertaining to boron, fluorine and sodium concentrations are presented. In order to establish a correlation with the sodium values, PIXE results for chlorine are also presented, supporting the reliability of this PIGE method for thin-film analysis.

  7. Some selected quantitative methods of thermal image analysis in Matlab.

    Science.gov (United States)

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images and shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods, for the skin of a human foot and of a face. The full source code of the developed application is provided as an attachment. (Figure: the main window of the program during dynamic analysis of the foot thermal image.)
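
    The paper's implementation (with full source) is in Matlab; as a rough Python analogue of the kind of automated, reproducible region measurements described, the sketch below thresholds a hypothetical thermogram and reports simple parameters. The threshold value and file name are assumptions.

        import numpy as np

        t_img = np.load("foot_thermogram.npy")   # hypothetical temperature map (deg C)

        # Segment the skin region automatically, e.g. by a fixed temperature cut-off,
        # then report reproducible per-region parameters.
        roi = t_img > 28.0                       # assumed skin/background threshold
        stats = {
            "area_px": int(roi.sum()),
            "mean":    float(t_img[roi].mean()),
            "std":     float(t_img[roi].std()),
            "range":   float(t_img[roi].max() - t_img[roi].min()),
        }
        print(stats)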

  8. Quantitative mass spectrometric analysis of glycoproteins combined with enrichment methods.

    Science.gov (United States)

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive and high-throughput analysis of the enriched glycoproteome, in terms of quantitative assays as well as qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of certain diseases, the development of efficient analysis tools for aberrant glycoproteins is very important for a deep understanding of the pathological functions of glycoproteins and for new biomarker development. This review first describes the glycosylation-targeting enrichment technologies, mainly employing solid-phase extraction methods such as hydrazide capturing, lectin-specific capturing, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with these enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recently published studies.

  9. Quantitative analysis method for niobium in lead zirconate titanate

    International Nuclear Information System (INIS)

    Hara, Hideo; Hashimoto, Toshio

    1986-01-01

    Lead zirconate titanate (PZT) is a ferroelectric ceramic with piezoelectric and pyroelectric properties, and is the most widely used piezoelectric material. It is also a main component of lead lanthanum zirconate titanate (PLZT), a typical electro-optical conversion element. Since their development, various electronic parts utilizing their piezoelectric characteristics have been put into practical use. The characteristics can be tuned by changing the composition of PZT and the kinds and amounts of additives. Among the additives, niobium acts to create metal-ion vacancies in the crystals; the formation of these vacancies eases the movement of domain walls in the crystal grains and increases resistivity. Accordingly, it is necessary to determine the niobium content accurately for research and development, quality control and process control. The quantitative analysis methods for niobium used so far each have drawbacks, and the authors therefore examined the quantitative analysis of niobium in PZT using an inductively coupled plasma emission spectrometer, an instrument that has developed remarkably in recent years. As a result, a method was established in which a specimen is dissolved with hydrochloric acid and hydrofluoric acid, unstable lead is masked with disodium ethylenediaminetetraacetate, and fluoride ions are masked with boric acid. The apparatus, reagents, the experiment and the results are reported. (Kako, I.)

  10. Correlative SEM SERS for quantitative analysis of dimer nanoparticles.

    Science.gov (United States)

    Timmermans, F J; Lenferink, A T M; van Wolferen, H A G M; Otto, C

    2016-11-14

    A Raman microscope integrated with a scanning electron microscope was used to investigate plasmonic structures by correlative SEM-SERS analysis. The integrated Raman-SEM microscope combines high-resolution electron microscopy information with SERS signal enhancement from selected nanostructures with adsorbed Raman reporter molecules. Correlative analysis was performed for dimers of two gold nanospheres, selected on the basis of SEM images from multi-aggregate samples. The effects of the orientation of the dimer with respect to the polarization state of the laser light and of the particle gap size on the Raman signal intensity were observed. Additionally, calculations were performed to simulate the electric near-field enhancement, based on the morphologies observed by electron microscopy. The experiments are thereby compared with the enhancement factor calculated from near-field simulations, which is subsequently used to quantify the SERS enhancement factor. Large differences between experimentally observed and calculated enhancement factors are regularly detected, a phenomenon caused by nanoscale differences between the real and 'simplified' simulated structures. Quantitative SERS experiments reveal the structure-induced enhancement factor, ranging from ∼200 to ∼20 000, averaged over the full nanostructure surface. The results demonstrate correlative Raman-SEM microscopy for the quantitative analysis of plasmonic particles and structures, thus enabling a new analytical method in the field of SERS and plasmonics.
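
    The surface-averaged enhancement factor quantified above is conventionally computed as the SERS intensity per probed molecule relative to a normal-Raman reference; a minimal sketch of that convention (presented as an assumption, with purely hypothetical numbers):

        # EF = (I_SERS / N_SERS) / (I_ref / N_ref), where N counts the molecules
        # probed in each configuration (monolayer on the dimer vs. bulk reference).
        def enhancement_factor(i_sers, n_sers, i_ref, n_ref):
            return (i_sers / n_sers) / (i_ref / n_ref)

        print(enhancement_factor(i_sers=5.0e4, n_sers=2.0e5, i_ref=1.0e3, n_ref=8.0e8))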

  11. Quantitative imaging analysis of posterior fossa ependymoma location in children.

    Science.gov (United States)

    Sabin, Noah D; Merchant, Thomas E; Li, Xingyu; Li, Yimei; Klimo, Paul; Boop, Frederick A; Ellison, David W; Ogg, Robert J

    2016-08-01

    Imaging descriptions of posterior fossa ependymoma in children have focused on magnetic resonance imaging (MRI) signal and local anatomic relationships, with imaging location only recently used to classify these neoplasms. We developed a quantitative method for analyzing the location of ependymoma in the posterior fossa, tested its effectiveness in distinguishing groups of tumors, and examined potential associations of distinct tumor groups with treatment and prognostic factors. Pre-operative MRI examinations of the brain for 38 children with histopathologically proven posterior fossa ependymoma were analyzed. Tumor margin contours and anatomic landmarks were manually marked and used to calculate the centroid of each tumor. Landmarks were used to calculate a transformation to align, scale, and rotate each patient's image coordinates to a common coordinate space. Hierarchical cluster analysis of the location and morphological variables was performed to detect multivariate patterns in tumor characteristics. The ependymomas were also characterized as "central" or "lateral" based on published radiological criteria. Therapeutic details and demographic, recurrence, and survival information were obtained from medical records and analyzed with the tumor location and morphology to identify prognostic tumor characteristics. Cluster analysis yielded two distinct tumor groups based on centroid location. The cluster groups were associated with differences in PFS (p = .044) and with "central" vs. "lateral" radiological designation (p = .035), and were marginally associated with multiple operative interventions (p = .064). Posterior fossa ependymoma can be objectively classified based on quantitative analysis of tumor location, and these classifications are associated with prognostic and treatment factors.

  12. Multi-spatial analysis of aeolian dune-field patterns

    Science.gov (United States)

    Ewing, Ryan C.; McDonald, George D.; Hayes, Alex G.

    2015-07-01

    Aeolian dune-fields are composed of different spatial scales of bedform patterns that respond to changes in environmental boundary conditions over a wide range of time scales. This study examines how variations in spatial scales of dune and ripple patterns found within dune fields are used in environmental reconstructions on Earth, Mars and Titan. Within a single bedform type, different spatial scales of bedforms emerge as a pattern evolves from an initial state into a well-organized pattern, such as with the transition from protodunes to dunes. Additionally, different types of bedforms, such as ripples, coarse-grained ripples and dunes, coexist at different spatial scales within a dune-field. Analysis of dune-field patterns at the intersection of different scales and types of bedforms at different stages of development provides a more comprehensive record of sediment supply and wind regime than analysis of a single scale and type of bedform. Interpretations of environmental conditions from any scale of bedform, however, are limited to environmental signals associated with the response time of that bedform. Large-scale dune-field patterns integrate signals over long-term climate cycles and reveal little about short-term variations in wind or sediment supply. Wind ripples respond instantly to changing conditions, but reveal little about longer-term variations in wind or sediment supply. Recognizing the response time scales across different spatial scales of bedforms maximizes environmental interpretations from dune-field patterns.

  13. Application of harmonic analysis in quantitative heart scintigraphy

    International Nuclear Information System (INIS)

    Fischer, P.; Knopp, R.; Breuel, H.P.

    1979-01-01

    Quantitative scintigraphy of the heart after equilibrium distribution of a radioactive tracer permits the measurement of time-activity curves in the left ventricle during a representative heart cycle with great statistical accuracy. By applying Fourier analysis, additional criteria can be obtained for evaluating the volume curve as a whole, so that the entire information contained in the volume curve is completely described by its Fourier spectrum. Resynthesis after Fourier transformation appears to be an ideal method of smoothing because of its convergence in the minimum quadratic error for the type of function concerned. (orig./MG) [de]
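
    The resynthesis-after-truncation smoothing described above can be sketched in a few lines: transform the volume curve, keep the first few harmonics, and invert. The input file and the number of harmonics retained are assumptions.

        import numpy as np

        v = np.load("lv_time_activity.npy")   # hypothetical LV counts over one cardiac cycle

        # Forward transform, keep the DC term and the first few harmonics, resynthesize.
        # Truncating the Fourier series is the smoothing step described above.
        n_harmonics = 4
        V = np.fft.rfft(v)
        V[n_harmonics + 1:] = 0.0
        v_smooth = np.fft.irfft(V, n=len(v))

        # Ejection fraction estimate from the smoothed volume curve:
        ef = (v_smooth.max() - v_smooth.min()) / v_smooth.max()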

  14. Quantitative analysis of tritium distribution in austenitic stainless steels welds

    International Nuclear Information System (INIS)

    Roustila, A.; Kuromoto, N.; Brass, A.M.; Chene, J.

    1994-01-01

    Tritium autoradiography was used to study the tritium distribution in laser and arc (TIG) weldments performed on tritiated AISI 316 samples. Quantitative values of the local tritium concentration were obtained from the microdensitometric analysis of the autoradiographs. This procedure was used to map the tritium concentration in the samples before and after laser and TIG treatments. The effect of the detritiation conditions and of welding on the tritium distribution in the material is extensively characterized. The results illustrate the interest of the technique for predicting a possible embrittlement of the material associated with a local enhancement of the tritium concentration and the presence of helium 3 generated by tritium decay. ((orig.))

  15. Quantitative x-ray fluorescent analysis using fundamental parameters

    International Nuclear Information System (INIS)

    Sparks, C.J. Jr.

    1976-01-01

    A monochromatic source of x-rays for sample excitation permits the use of pure elemental standards and relatively simple calculations to convert the measured fluorescent intensities to an absolute basis of weight per unit weight of sample. Only the mass absorption coefficients of the sample for the exciting and the fluorescent radiation need be determined. Besides the direct measurement of these absorption coefficients in the sample, other techniques are considered which require fewer sample manipulations and measurements. These fundamental parameters methods permit quantitative analysis without recourse to the time-consuming process of preparing nearly identical standards

  16. Quantitative analysis with energy dispersive X-ray fluorescence analyser

    International Nuclear Information System (INIS)

    Kataria, S.K.; Kapoor, S.S.; Lal, M.; Rao, B.V.N.

    1977-01-01

    Quantitative analysis of samples using a radioisotope-excited energy-dispersive X-ray fluorescence system is described. The complete set-up is built around a locally made Si(Li) detector X-ray spectrometer with an energy resolution of 220 eV at 5.94 keV. The photopeaks observed in the X-ray fluorescence spectra are fitted with a Gaussian function and the intensities of the characteristic X-ray lines are extracted, which in turn are used for calculating the elemental concentrations. The results for a few typical cases are presented. (author)
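
    A minimal sketch of the Gaussian photopeak fitting described above; the linear background term, fit window and starting values are illustrative assumptions, not taken from the paper.

        import numpy as np
        from scipy.optimize import curve_fit

        def peak(ch, amp, mu, sigma, b0, b1):
            """Gaussian photopeak on an assumed linear background."""
            return amp * np.exp(-0.5 * ((ch - mu) / sigma) ** 2) + b0 + b1 * ch

        counts = np.load("xrf_spectrum.npy")   # hypothetical 1024-channel spectrum
        ch = np.arange(len(counts))
        lo, hi = 300, 340                      # assumed window around one line
        p0 = [counts[lo:hi].max(), 320, 3.0, counts[lo], 0.0]
        popt, _ = curve_fit(peak, ch[lo:hi], counts[lo:hi], p0=p0)
        net_area = popt[0] * abs(popt[2]) * np.sqrt(2 * np.pi)   # line intensity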

  17. Computer compensation for NMR quantitative analysis of trace components

    International Nuclear Information System (INIS)

    Nakayama, T.; Fujiwara, Y.

    1981-01-01

    A computer program has been written that determines trace components and separates overlapping components in multicomponent NMR spectra. This program uses the Lorentzian curve as a theoretical curve of NMR spectra. The coefficients of the Lorentzian are determined by the method of least squares. Systematic errors such as baseline/phase distortion are compensated and random errors are smoothed by taking moving averages, so that these processes contribute substantially to decreasing the accumulation time of spectral data. The accuracy of quantitative analysis of trace components has been improved by two significant figures. This program was applied to determining the abundance of 13C and the degree of saponification of PVA.

  18. Stochastic filtering of quantitative data from STR DNA analysis

    DEFF Research Database (Denmark)

    Tvedebrink, Torben; Eriksen, Poul Svante; Mogensen, Helle Smidt

    The quantitative data observed from analysing STR DNA is a mixture of contributions from various sources. Apart from the true allelic peaks, the observed signal consists of at least three components resulting from the measurement technique and the PCR amplification: background noise (random noise due to the apparatus used for measurements), pull-up effects (a more systematic increase caused by overlap in the spectrum) and stutters (peaks located four basepairs before the true peak). We present filtering techniques for all three technical artifacts based on statistical analysis of data from controlled experiments conducted at the Section of Forensic Genetics, Department of Forensic Medicine, Faculty of Health Sciences, University of Copenhagen, Denmark.

  19. Analysis of Spatial Concepts, Spatial Skills and Spatial Representations in New York State Regents Earth Science Examinations

    Science.gov (United States)

    Kastens, Kim A.; Pistolesi, Linda; Passow, Michael J.

    2014-01-01

    Research has shown that spatial thinking is important in science in general, and in Earth Science in particular, and that performance on spatially demanding tasks can be fostered through instruction. Because spatial thinking is rarely taught explicitly in the U.S. education system, improving spatial thinking may be "low-hanging fruit" as…

  20. Visuo-Spatial Performance in Autism: A Meta-Analysis

    Science.gov (United States)

    Muth, Anne; Hönekopp, Johannes; Falter, Christine M.

    2014-01-01

    Visuo-spatial skills are believed to be enhanced in autism spectrum disorders (ASDs). This meta-analysis tests the current state of evidence for Figure Disembedding, Block Design, Mental Rotation and Navon tasks in ASD and neurotypicals. Block Design (d = 0.32) and Figure Disembedding (d = 0.26) showed superior performance for ASD with large…

  1. Spatial and temporal analysis of mass movement using dendrochronology

    NARCIS (Netherlands)

    Braam, R.R.; Weiss, E.E.J.; Burrough, P.A.

    1987-01-01

    Tree growth and inclination on sloping land are affected by mass movement. Suitable analysis of tree growth and tree form can therefore provide considerable information on mass movement activity. This paper reports a new, automated method for studying the temporal and spatial aspects of mass movement.

  2. Spatial Analysis of Political Capital Citation Using Remote Sensing ...

    African Journals Online (AJOL)

    Spatial Analysis of Political Capital Citation Using Remote Sensing and GIS; A Case Study of Lokoja.

  3. Spatial analysis of digital technologies and impact on socio - cultural ...

    African Journals Online (AJOL)

    The objective of this study was to determine the spatial distribution of digital technologies and to ascertain whether digital technologies have a significant impact on socio-cultural values. Moran's index and the Getis-Ord statistic were used for cluster and hotspot analysis, as sketched below. The unique locations of digital technologies ...
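
    For reference, Moran's index mentioned above can be computed directly from a spatial weights matrix; a minimal sketch (the construction of the weights matrix from the study locations is left as an assumption):

        import numpy as np

        def morans_i(values, W):
            """Global Moran's I for values observed at n locations.

            values : length-n numpy array of observations
            W      : n x n spatial weights matrix (e.g. 1 for neighbours, 0 otherwise)
            """
            z = values - values.mean()
            s0 = W.sum()
            n = len(values)
            return (n / s0) * (z @ W @ z) / (z @ z)

        # I near +1: clustering; near 0: spatial randomness; negative: dispersion.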

  4. Analysis of thrips distribution: application of spatial statistics and Kriging

    Science.gov (United States)

    John Aleong; Bruce L. Parker; Margaret Skinner; Diantha Howard

    1991-01-01

    Kriging is a statistical technique that provides predictions for spatially and temporally correlated data. Observations of thrips distribution and density in Vermont soils are made in both space and time. Traditional statistical analysis of such data assumes that the counts taken over space and time are independent, which is not necessarily true. Therefore, to analyze...
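
    A minimal kriging sketch in the spirit of the analysis described, assuming the third-party pykrige package and a hypothetical input file; the variogram model is an illustrative choice, not the authors':

        import numpy as np
        from pykrige.ok import OrdinaryKriging  # assumes the pykrige package is installed

        x, y, counts = np.loadtxt("thrips_counts.csv", delimiter=",",
                                  skiprows=1, unpack=True)   # hypothetical field samples

        ok = OrdinaryKriging(x, y, counts, variogram_model="spherical")
        gridx = np.linspace(x.min(), x.max(), 50)
        gridy = np.linspace(y.min(), y.max(), 50)
        z_pred, ss = ok.execute("grid", gridx, gridy)  # predictions and kriging variance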

  5. The Structure of Spatial Ability Items: A Faceted Analysis.

    Science.gov (United States)

    Guttman, Ruth; Shoham, Ilana

    1982-01-01

    Eight spatial tests assembled with a mapping sentence of four content facets (rule type, dimensionality, presence or absence of rotation, and test format) were administered to 800 individuals. Smallest Space Analysis of an intercorrelation matrix yielded three facets which formed distinct regions in a two-dimensional projection of a…

  6. Analysis of spatial pattern of settlements in the federal capital ...

    African Journals Online (AJOL)

    Human settlements are important, seemingly static but dynamic, features of the cultural landscape that have attracted several studies due to the important role they play in human life. This paper examined the spatial distribution of settlements in the Federal Capital Territory (FCT) of Nigeria. The analysis uses vector based ...

  7. Geographic information systems, remote sensing, and spatial analysis activities in Texas, 2002-07

    Science.gov (United States)

    Pearson, D.K.; Gary, R.H.; Wilson, Z.D.

    2007-01-01

    Geographic information system (GIS) technology has become an important tool for scientific investigation, resource management, and environmental planning. A GIS is a computer-aided system capable of collecting, storing, analyzing, and displaying spatially referenced digital data. GIS technology is particularly useful when analyzing a wide variety of spatial data such as with remote sensing and spatial analysis. Remote sensing involves collecting remotely sensed data, such as satellite imagery, aerial photography, or radar images, and analyzing the data to gather information or investigate trends about the environment or the Earth's surface. Spatial analysis combines remotely sensed, thematic, statistical, quantitative, and geographical data through overlay, modeling, and other analytical techniques to investigate specific research questions. It is the combination of data formats and analysis techniques that has made GIS an essential tool in scientific investigations. This document presents information about the technical capabilities and project activities of the U.S. Geological Survey (USGS) Texas Water Science Center (TWSC) GIS Workgroup from 2002 through 2007.

  8. Image Chunking: Defining Spatial Building Blocks for Scene Analysis.

    Science.gov (United States)

    1987-04-01

    Image Chunking: Defining Spatial Building Blocks for Scene Analysis. James V. Mahoney, MIT Artificial Intelligence Laboratory, Technical Report 980, Massachusetts Institute of Technology, Cambridge. Supported under contracts DACA76-85-C-0010 and N00014-85-K-0124.

  9. Quantitative analysis of agricultural land use change in China

    Science.gov (United States)

    Chou, Jieming; Dong, Wenjie; Wang, Shuyu; Fu, Yuqing

    This article reviews the potential impacts of climate change on land use change in China. Crop sown area is used as an index to quantitatively analyze the temporal-spatial changes in, and utilization of, agricultural land. A new concept, the potential multiple cropping index, is defined to reflect potential sowing capacity. The impact mechanisms, land use status and surplus capacity are investigated as well. The main conclusions are as follows. During 1949-2010, agricultural land was greatest in amount in the middle of China, followed by the country's eastern and western regions. The most rapid increase and decrease of agricultural land were observed in Xinjiang and North China respectively; Northwest China and South China also changed rapidly. The variation trend before 1980 differed significantly from that after 1980. Agricultural land was affected by both natural and social factors, such as regional climate and environmental changes, population growth, economic development, and the implementation of policies. In this paper, the effects of temperature and urbanization on the coverage of agricultural land are evaluated, and the results show that urbanization greatly affects the amount of agricultural land in South China, Northeast China, Xinjiang and Southwest China. From 1980 to 2009, the extent of agricultural land use increased while the surplus capacity decreased. Still, a large potential space remains available, but future utilization of agricultural land should be carried out with scientific planning and management for sustainable development.

  10. Absorption correction factor in X-ray fluorescent quantitative analysis

    International Nuclear Information System (INIS)

    Pimjun, S.

    1994-01-01

    An experiment on the absorption correction factor in X-ray fluorescence quantitative analysis was carried out. Standard samples were prepared from mixtures of Fe2O3 and tapioca flour at various concentrations of Fe2O3 ranging from 5% to 25%. Unknown samples were kaolin containing 3.5% to 50% Fe2O3; the kaolin samples were diluted with tapioca flour in order to reduce the absorption of FeKα and make them easier to prepare. Pressed samples of 0.150 g/cm2 and 2.76 cm in diameter were used in the experiment. The absorption correction factor is related to the total mass absorption coefficient (χ), which varies with sample composition. In a known sample, χ can be calculated conveniently from the formula; in an unknown sample, χ can be determined by the emission-transmission method. It was found that the relationship between the corrected FeKα intensity and the Fe2O3 content of these samples was linear, indicating that this correction factor can be used to adjust the accuracy of the X-ray intensity. The correction factor is therefore essential in the quantitative analysis of the elements in any sample by the X-ray fluorescence technique.
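
    A common form of the emission-transmission correction mentioned above (presented here as an assumption about the method, not a quotation of it): measure the transmission T through the sample, recover the attenuation exponent as -ln T, and divide the measured line intensity by the resulting mean absorption factor.

        import numpy as np

        def absorption_correction(transmission):
            """Self-absorption factor for an intermediate-thickness sample.

            With T = I/I0 measured through the sample (emission-transmission method),
            the attenuation exponent is chi*rho*t = -ln(T) and the mean absorption
            factor is F = (1 - T) / (-ln T). Dividing the measured line intensity
            by F yields the absorption-corrected intensity.
            """
            t = np.asarray(transmission, dtype=float)
            return (1.0 - t) / (-np.log(t))

        i_corrected = 1520.0 / absorption_correction(0.62)   # hypothetical FeKa counts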

  11. Developments in Dynamic Analysis for quantitative PIXE true elemental imaging

    International Nuclear Information System (INIS)

    Ryan, C.G.

    2001-01-01

    Dynamic Analysis (DA) is a method for projecting quantitative major and trace element images from PIXE event data-streams (off-line or on-line) obtained using the Nuclear Microprobe. The method separates full elemental spectral signatures to produce images that strongly reject artifacts due to overlapping elements, detector effects (such as escape peaks and tailing) and background. The images are also quantitative, stored in ppm-charge units, enabling images to be directly interrogated for the concentrations of all elements in areas of the images. Recent advances in the method include the correction for changing X-ray yields due to varying sample compositions across the image area and the construction of statistical variance images. The resulting accuracy of major element concentrations extracted directly from these images is better than 3% relative as determined from comparisons with electron microprobe point analysis. These results are complemented by error estimates derived from the variance images together with detection limits. This paper provides an update of research on these issues, introduces new software designed to make DA more accessible, and illustrates the application of the method to selected geological problems.
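
    DA itself precomputes a matrix transform from the fitted full spectral signatures so that imaging can run on-line; as a simplified off-line stand-in (not the DA algorithm), each pixel spectrum can be unmixed against the element signatures by non-negative least squares. Shapes and file names below are hypothetical.

        import numpy as np
        from scipy.optimize import nnls

        # S: detector-channels x elements matrix of fitted pure-element signatures
        # (including escape peaks and tailing); spectra: pixels x channels event sums.
        S = np.load("element_signatures.npy")     # hypothetical, shape (1024, n_elem)
        spectra = np.load("pixel_spectra.npy")    # hypothetical, shape (n_pix, 1024)

        maps = np.array([nnls(S, s)[0] for s in spectra])   # n_pix x n_elem loadings
        # Scaling each loading by the element's X-ray yield and the accumulated
        # charge would convert the maps toward ppm-charge units, as in DA.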

  12. Quantitative XPS analysis of high Tc superconductor surfaces

    International Nuclear Information System (INIS)

    Jablonski, A.; Sanada, N.; Suzuki, Y.; Fukuda, Y.; Nagoshi, M.

    1993-01-01

    The procedure of quantitative XPS analysis involving relative sensitivity factors is the most convenient to apply to high-Tc superconductor surfaces because it does not require standards. A considerable limitation of such an approach, however, is its relatively low accuracy. In the present work, a modification of the relative sensitivity factor approach is proposed that accounts for matrix and instrumental effects. The accuracy of this modification when applied to binary metal alloys is 2% or better. A quantitative XPS analysis was made for surfaces of the compounds Bi2Sr2CuO6, Bi2Sr2CaCu2O8, and YBa2Cu3Oy. The surface composition determined for the polycrystalline samples corresponds reasonably well to the bulk stoichiometry. A slight deficiency of oxygen was found for the Bi-based compounds. The surface exposed on cleavage of the Bi2Sr2CaCu2O8 single crystal was found to be enriched with bismuth, which indicates that the cleavage occurs along the BiO planes. This result is in agreement with the STM studies published in the literature.
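
    The relative-sensitivity-factor quantification underlying this procedure reduces, in its basic form, to normalized intensity-to-sensitivity ratios; a minimal sketch with hypothetical peak areas and factors (the matrix and instrumental corrections discussed above are omitted):

        import numpy as np

        def atomic_fractions(intensities, sensitivity):
            """Basic RSF quantification: x_i = (I_i/S_i) / sum_j (I_j/S_j)."""
            ratios = np.asarray(intensities) / np.asarray(sensitivity)
            return ratios / ratios.sum()

        # hypothetical peak areas and RSFs for Y 3d, Ba 3d, Cu 2p, O 1s
        print(atomic_fractions([1.0e4, 9.5e4, 6.0e4, 8.0e4], [2.2, 7.5, 4.8, 0.66]))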

  13. Quantitative analysis on electrooculography (EOG) for neurodegenerative disease

    Science.gov (United States)

    Liu, Chang-Chia; Chaovalitwongse, W. Art; Pardalos, Panos M.; Seref, Onur; Xanthopoulos, Petros; Sackellares, J. C.; Skidmore, Frank M.

    2007-11-01

    Many studies have documented abnormal horizontal and vertical eye movements in human neurodegenerative disease as well as during altered states of consciousness (including drowsiness and intoxication) in healthy adults. Eye movement measurement may play an important role in measuring the progression of neurodegenerative diseases and the state of alertness in healthy individuals. There are several techniques for measuring eye movement: the infrared detection technique (IR), video-oculography (VOG), the scleral eye coil and EOG. Among the available recording techniques, EOG is a major source for monitoring abnormal eye movement. In this real-time quantitative analysis study, methods that capture the characteristics of eye movement were proposed to accurately categorize the state of neurodegenerative subjects. The EOG recordings were taken while 5 tested subjects were watching a short (>120 s) animation clip. In response to the animated clip the participants executed a number of eye movements, including vertical smooth pursuit (SVP), horizontal smooth pursuit (HVP) and random saccades (RS). Detection of abnormalities in ocular movement may improve our diagnosis and understanding of neurodegenerative disease and altered states of consciousness. A standard real-time quantitative analysis will improve detection and provide a better understanding of the pathology of these disorders.

  14. Quantitative analysis of infantile ureteropelvic junction obstruction by diuretic renography

    International Nuclear Information System (INIS)

    Ueno, Shigeru; Suzuki, Yutaka; Murakami, Takeshi; Yokoyama, Seishichi; Hirakawa, Hitoshi; Tajima, Tomoo; Makuuchi, Hiroyasu

    2001-01-01

    Infantile hydronephrosis detected by ultrasonography poses a clinical dilemma on how to treat the condition. This article reports a retrospective study to evaluate infantile hydronephrosis due to suspected ureteropelvic junction (UPJ) obstruction by means of standardized diuretic renography and to assess its usefulness for quantitative evaluation and management of this condition. Between November 1992 and July 1999, 43 patients whose disease was detected in the fetal or infantile period were enrolled in this study. Standardized diuretic renograms were obtained with 99mTc-labeled diethylene-triaminepenta-acetate (Tc-99m-DTPA) or 99mTc-labeled mercaptoacetyl triglycine (Tc-99m-MAG3) as radiopharmaceuticals. The drainage half-time clearance (T 1/2) of the activity in each region of interest, set to encompass the entire kidney and the dilated pelvis, was used as an index for quantitative analysis of UPJ obstruction. Initial T 1/2s of 32 kidneys with suspected UPJ obstruction were significantly longer than those of 37 without obstruction. T 1/2s of kidneys which had undergone pyeloplasty decreased promptly after surgery, whereas those of units followed up without surgery decreased more sluggishly. These findings demonstrate that standardized diuretic renographic analysis with T 1/2 can reliably assess infantile hydronephrosis with UPJ obstruction and be helpful in making a decision on surgical intervention. (author)
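
    The drainage half-time used as the index above can be obtained by fitting a monoexponential washout to the post-diuretic portion of the renogram; a minimal sketch with a hypothetical input file (the cut-off quoted in the comment is a common rule of thumb, not taken from this study):

        import numpy as np
        from scipy.optimize import curve_fit

        t, counts = np.loadtxt("renogram_postdiuretic.csv", delimiter=",",
                               unpack=True)   # hypothetical ROI curve after furosemide

        def monoexp(t, a, k):
            return a * np.exp(-k * t)

        (a, k), _ = curve_fit(monoexp, t, counts, p0=[counts[0], 0.05])
        t_half = np.log(2) / k                # drainage half-time, same units as t
        # e.g. t_half > 20 min is a commonly cited (assumed) cut-off suggesting obstruction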

  15. Quantitative analysis of intermolecular interactions in orthorhombic rubrene

    Directory of Open Access Journals (Sweden)

    Venkatesha R. Hathwar

    2015-09-01

    Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility, which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first-principles theoretical calculations have assigned the semiconducting properties of rubrene to a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K, obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and the energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ...Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, non-covalent interaction (NCI) analysis and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H—H interactions. The electron density features of H—H bonding and the interaction energy of molecular dimers connected by H—H interactions clearly demonstrate the importance of these weak interactions in the stabilization of the crystal structure. The quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K, suggesting that any changes in carrier transport at these low temperatures would have a different origin. The experimental results are further supported by theoretical calculations.

  16. Quantitative analysis of infantile ureteropelvic junction obstruction by diuretic renography

    Energy Technology Data Exchange (ETDEWEB)

    Ueno, Shigeru; Suzuki, Yutaka; Murakami, Takeshi; Yokoyama, Seishichi; Hirakawa, Hitoshi; Tajima, Tomoo; Makuuchi, Hiroyasu [Tokai Univ., Isehara, Kanagawa (Japan). School of Medicine

    2001-04-01

    Infantile hydronephrosis detected by ultrasonography poses a clinical dilemma on how to treat the condition. This article reports a retrospective study to evaluate infantile hydronephrosis due to suspected ureteropelvic junction (UPJ) obstruction by means of standardized diuretic renography and to assess its usefulness for quantitative evaluation and management of this condition. Between November 1992 and July 1999, 43 patients whose disease was detected in the fetal or infantile period were enrolled in this study. Standardized diuretic renograms were obtained with 99mTc-labeled diethylene-triaminepenta-acetate (Tc-99m-DTPA) or 99mTc-labeled mercaptoacetyl triglycine (Tc-99m-MAG3) as radiopharmaceuticals. The drainage half-time clearance (T 1/2) of the activity in each region of interest, set to encompass the entire kidney and the dilated pelvis, was used as an index for quantitative analysis of UPJ obstruction. Initial T 1/2s of 32 kidneys with suspected UPJ obstruction were significantly longer than those of 37 without obstruction. T 1/2s of kidneys which had undergone pyeloplasty decreased promptly after surgery, whereas those of units followed up without surgery decreased more sluggishly. These findings demonstrate that standardized diuretic renographic analysis with T 1/2 can reliably assess infantile hydronephrosis with UPJ obstruction and be helpful in making a decision on surgical intervention. (author)

  17. Quantitative analysis of tellurium in simple substance sulfur

    International Nuclear Information System (INIS)

    Arikawa, Yoshiko

    1976-01-01

    The MIBK extraction-bismuthiol-2 absorptiometric method for the quantitative analysis of tellurium was studied, and the method and its limitations were compared with the atomic absorption method. The boiling time required to decompose excess hydrogen peroxide and to reduce tellurium from the hexavalent to the tetravalent state was examined. Decomposition was fast in alkaline solution: constant absorption was reached after 30 minutes of boiling in alkaline solution and 40 minutes in acid solution. A method for analyzing samples containing less than 5 ppm of tellurium was also studied. The experiments revealed that samples containing very small amounts of tellurium can be analyzed when concentration by extraction is carried out on sample solutions divided into one-gram portions, because it is difficult to treat several grams of sample at one time. This method is also suitable for the quantitative analysis of selenium. It showed a good addition effect and reproducibility within a relative error of 5%. Comparison between the calibration curve of the standard tellurium(IV) solution subjected to the reaction with bismuthiol-2 and the calibration curve obtained from the extraction of tellurium(IV) with MIBK indicated that the extraction is complete. Results by the bismuthiol-2 method and by the atomic absorption method agreed well for the same samples. (Iwakiri, K.)

  18. Multivariate calibration applied to the quantitative analysis of infrared spectra

    Energy Technology Data Exchange (ETDEWEB)

    Haaland, D.M.

    1991-01-01

    Multivariate calibration methods are very useful for improving the precision, accuracy, and reliability of quantitative spectral analyses. Spectroscopists can more effectively use these sophisticated statistical tools if they have a qualitative understanding of the techniques involved. A qualitative picture of the factor-analysis multivariate calibration methods of partial least squares (PLS) and principal component regression (PCR) is presented using infrared calibrations based upon spectra of phosphosilicate glass thin films on silicon wafers. Comparisons of the relative prediction abilities of four different multivariate calibration methods are given based on Monte Carlo simulations of spectral calibration and prediction data. The success of multivariate spectral calibrations is demonstrated for several quantitative infrared studies. The infrared absorption and emission spectra of thin-film dielectrics used in the manufacture of microelectronic devices demonstrate rapid, nondestructive at-line and in-situ analyses using PLS calibrations. Finally, the application of multivariate spectral calibrations to reagentless analysis of blood is presented. We have found that the concentration of glucose in whole blood taken from diabetics can be precisely monitored from the PLS calibration of either mid- or near-infrared spectra of the blood. Progress toward the non-invasive determination of glucose levels in diabetics is an ultimate goal of this research. 13 refs., 4 figs.
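
    As a rough illustration of a PLS spectral calibration of the kind described here, the following sketch regresses an analyte concentration onto synthetic spectra with scikit-learn; the data, band shape and component count are invented for the example.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
n_samples, n_channels = 40, 200
conc = rng.uniform(0.0, 1.0, n_samples)                         # analyte level
band = np.exp(-0.5 * ((np.arange(n_channels) - 80) / 10) ** 2)  # pure-component band
spectra = np.outer(conc, band) + 0.01 * rng.standard_normal((n_samples, n_channels))

# Cross-validated prediction with a 3-component PLS model
pls = PLSRegression(n_components=3)
pred = cross_val_predict(pls, spectra, conc, cv=5).ravel()
print(f"cross-validated RMSE: {np.sqrt(np.mean((pred - conc) ** 2)):.4f}")
```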

  19. Quantitative CT analysis of small pulmonary vessels in lymphangioleiomyomatosis

    International Nuclear Information System (INIS)

    Ando, Katsutoshi; Tobino, Kazunori; Kurihara, Masatoshi; Kataoka, Hideyuki; Doi, Tokuhide; Hoshika, Yoshito; Takahashi, Kazuhisa; Seyama, Kuniaki

    2012-01-01

    Background: Lymphangioleiomyomatosis (LAM) is a destructive lung disease that shares clinical, physiologic, and radiologic features with chronic obstructive pulmonary disease (COPD). This study aims to identify those features that are unique to LAM by using quantitative CT analysis. Methods: We measured total cross-sectional areas of small pulmonary vessels (CSA) less than 5 mm² and 5–10 mm² and calculated percentages of those lung areas (%CSA), respectively, in 50 LAM and 42 COPD patients. The extent of cystic destruction (LAA%) and mean parenchymal CT value were also calculated and correlated with pulmonary function. Results: The diffusing capacity for carbon monoxide/alveolar volume (DLCO/VA %predicted) was similar for both groups (LAM, 44.4 ± 19.8% vs. COPD, 45.7 ± 16.0%, p = 0.763), but less tissue damage occurred in LAM than in COPD (LAA% 21.7 ± 16.3% vs. 29.3 ± 17.0%; p < 0.05). Pulmonary function correlated negatively with LAA% (p < 0.001) in both groups, yet the correlation with %CSA was significant only in COPD (p < 0.001). When the same analysis was conducted in two groups with equal levels of LAA% and DLCO/VA %predicted, %CSA and mean parenchymal CT value were still greater for LAM than COPD (p < 0.05). Conclusions: Quantitative CT analysis revealing a correlation between cystic destruction and CSA in COPD but not LAM indicates that this approach successfully reflects different mechanisms governing the two pathologic courses. Such determinations of small pulmonary vessel density may serve to differentiate LAM from COPD even in patients with severe lung destruction.
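
    A minimal sketch of the %CSA metric described above, assuming segmentation has already produced a binary vessel mask and a lung mask for one axial slice; the mask generation, pixel size and thresholds here are placeholders.

```python
import numpy as np
from skimage import measure

def percent_csa(vessel_mask, lung_mask, pixel_area_mm2, lo=0.0, hi=5.0):
    """%CSA: summed area of vessel cross-sections with lo < area <= hi (mm^2),
    expressed as a percentage of the lung area on the slice."""
    labels = measure.label(vessel_mask)
    areas = np.array([r.area * pixel_area_mm2 for r in measure.regionprops(labels)])
    small = areas[(areas > lo) & (areas <= hi)].sum() if areas.size else 0.0
    return 100.0 * small / (lung_mask.sum() * pixel_area_mm2)

rng = np.random.default_rng(1)
vessels = rng.random((256, 256)) > 0.995   # sparse stand-in "vessel" pixels
lung = np.ones_like(vessels)               # whole slice treated as lung
print(f"%CSA (<5 mm^2): {percent_csa(vessels, lung, pixel_area_mm2=0.25):.2f}%")
```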

  20. Quantitative analysis of the influence of a new ecological paradigm: spatial autocorrelation - DOI: 10.4025/actascibiolsci.v25i1.2113

    Directory of Open Access Journals (Sweden)

    Luis Mauricio Bini

    2003-04-01

    Full Text Available The aim of this paper was to evaluate the influence of spatial autocorrelation (the absence of statistical independence among observations gathered across geographic space) in ecological studies. For this task, an evaluation of the studies that used spatial autocorrelation analysis was carried out using data furnished by the Institute for Scientific Information. There is a positive temporal trend in the number of studies that used spatial autocorrelation analysis, and significant spatial autocorrelation was detected in most of them. Moreover, scientists of several nationalities carried out these studies in different countries, with different groups of organisms and in different types of ecosystems. In this way, it is possible to consider the explicit recognition of the spatial structure of natural processes, through spatial autocorrelation analysis, a new paradigm of ecological studies.
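
    Moran's I is the statistic most often reported in spatial autocorrelation analyses of this kind; the sketch below computes it from scratch for a toy transect (the weight matrix and data are illustrative).

```python
import numpy as np

def morans_i(values, weights):
    """Moran's I for observations `values` and an n x n spatial weight
    matrix `weights` with zeros on the diagonal."""
    z = values - values.mean()
    return (len(values) / weights.sum()) * (z @ weights @ z) / (z @ z)

# Five sites on a line with rook-style adjacency and a spatial trend
w = np.eye(5, k=1) + np.eye(5, k=-1)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
print(f"Moran's I = {morans_i(x, w):.2f}")  # positive: spatial autocorrelation
```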

  1. Spatially adaptive mixture modeling for analysis of FMRI time series.

    Science.gov (United States)

    Vincent, Thomas; Risser, Laurent; Ciuciu, Philippe

    2010-04-01

    Within-subject analysis in fMRI essentially addresses two problems: the detection of brain regions eliciting evoked activity and the estimation of the underlying dynamics. In Makni et al., 2005 and Makni et al., 2008, a detection-estimation framework was proposed to tackle these problems jointly, since they are connected to one another. In the Bayesian formalism, detection is achieved by modeling activating and nonactivating voxels through independent mixture models (IMM) within each region, while hemodynamic response estimation is performed at a regional scale in a nonparametric way. Instead of IMMs, in this paper we take advantage of spatial mixture models (SMM) for their nonlinear spatial regularizing properties. The proposed method is unsupervised and spatially adaptive in the sense that the amount of spatial correlation is automatically tuned from the data, and this setting automatically varies across brain regions. In addition, the level of regularization is specific to each experimental condition, since both the signal-to-noise ratio and the activation pattern may vary across stimulus types in a given brain region. These aspects require the precise estimation of multiple partition functions of underlying Ising fields. This is addressed efficiently by first using path sampling for a small subset of fields and then a recently developed fast extrapolation technique for the large remaining set. Simulation results emphasize that detection relying on supervised SMM outperforms its IMM counterpart, and that unsupervised spatial mixture models achieve similar results without any hand-tuning of the correlation parameter. On real datasets, the gain is illustrated in a localizer fMRI experiment: brain activations appear more spatially resolved using SMM in comparison with classical general linear model (GLM)-based approaches, while a specific parcel-based HRF shape is estimated. Our approach therefore validates the treatment of unsmoothed fMRI data without a fixed GLM.

  2. Bovine spongiform encephalopathy and spatial analysis of the feed industry.

    Science.gov (United States)

    Paul, Mathilde; Abrial, David; Jarrige, Nathalie; Rican, Stéphane; Garrido, Myriam; Calavas, Didier; Ducrot, Christian

    2007-06-01

    In France, despite the ban of meat-and-bone meal (MBM) in cattle feed, bovine spongiform encephalopathy (BSE) was detected in hundreds of cattle born after the ban. To study the role of MBM, animal fat, and dicalcium phosphate in the risk for BSE after the feed ban, we conducted a spatial analysis of the feed industry. We used data from 629 BSE cases as well as data on the use of each byproduct and the market area of the feed factories. We mapped the risk for BSE in 951 areas supplied by the same factories and its connection with the use of byproducts. A disease map of BSE with covariates was built with hierarchical Bayesian modeling methods, based on a Poisson distribution with spatial smoothing. Only the use of MBM was spatially linked to risk for BSE, which highlights cross-contamination as the most probable source of infection after the feed ban.
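
    The paper fits a hierarchical Bayesian Poisson model with spatial smoothing; as a much-simplified, hedged stand-in for that idea, the sketch below applies empirical-Bayes shrinkage to area-level standardized ratios (all counts are invented, and no spatial structure is modelled).

```python
import numpy as np

def eb_smoothed_smr(observed, expected):
    """Shrink area-level SMRs (observed/expected) toward the global rate,
    with shrinkage weighted by each area's expected count. A crude,
    non-spatial stand-in for hierarchical Bayesian disease mapping."""
    smr = observed / expected
    global_rate = observed.sum() / expected.sum()
    var_between = max(np.average((smr - global_rate) ** 2, weights=expected)
                      - global_rate / expected.mean(), 0.0)  # method of moments
    shrink = var_between / (var_between + global_rate / expected)
    return global_rate + shrink * (smr - global_rate)

obs = np.array([12, 0, 7, 2, 1], dtype=float)   # cases per area (illustrative)
expect = np.array([2.5, 1.2, 3.0, 2.0, 1.8])    # expected under a common rate
print(eb_smoothed_smr(obs, expect).round(2))
```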

  3. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    Science.gov (United States)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  4. Quantitative neuroanatomy of all Purkinje cells with light sheet microscopy and high-throughput image analysis

    Directory of Open Access Journals (Sweden)

    Ludovico eSilvestri

    2015-05-01

    Full Text Available Characterizing the cytoarchitecture of the mammalian central nervous system on a brain-wide scale is becoming a compelling need in neuroscience. For example, realistic modeling of brain activity requires the definition of quantitative features of large neuronal populations in the whole brain. Quantitative anatomical maps will also be crucial to classify the cytoarchitectonic abnormalities associated with neuronal pathologies in a highly reproducible and reliable manner. In this paper, we apply recent advances in optical microscopy and image analysis to characterize the spatial distribution of Purkinje cells across the whole cerebellum. Light sheet microscopy was used to image with micron-scale resolution a fixed and cleared cerebellum of an L7-GFP transgenic mouse, in which all Purkinje cells are fluorescently labeled. A fast and scalable algorithm for fully automated cell identification was applied to the image to extract the position of all the fluorescent Purkinje cells. This vectorized representation of the cell population allows a thorough characterization of the complex three-dimensional distribution of the neurons, highlighting the presence of gaps inside the lamellar organization of Purkinje cells, whose density is believed to play a significant role in autism spectrum disorders. Furthermore, clustering analysis of the localized somata permits dividing the whole cerebellum into groups of Purkinje cells with high spatial correlation, suggesting new possibilities of anatomical partition. The quantitative approach presented here can be extended to study the distribution of different types of cells in many brain regions and across the whole encephalon, providing a robust base for building realistic computational models of the brain, and for unbiased morphological tissue screening in the presence of pathologies and/or drug treatments.
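
    The clustering step mentioned above can be sketched with a density-based algorithm on the extracted soma coordinates; DBSCAN is used here as a plausible choice, with coordinates and parameters that are purely illustrative.

```python
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(2)
# Stand-in for detected Purkinje-cell soma positions (micrometres)
somata = np.vstack([
    rng.normal((0.0, 0.0, 0.0), 30.0, size=(200, 3)),
    rng.normal((400.0, 100.0, 50.0), 30.0, size=(150, 3)),
])

labels = DBSCAN(eps=40.0, min_samples=10).fit_predict(somata)
n_clusters = len(set(labels)) - (1 if -1 in labels else 0)
print(f"{n_clusters} spatially coherent groups, {np.sum(labels == -1)} outliers")
```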

  5. Quantitative morphometric analysis for the tectonic characterisation of northern Tunisia.

    Science.gov (United States)

    Camafort, Miquel; Pérez-Peña, José Vicente; Booth-Rea, Guillermo; Ranero, César R.; Gràcia, Eulàlia; Azañón, José Miguel; Melki, Fetheddine; Ouadday, Mohamed

    2016-04-01

    Northern Tunisia is characterized by low deformation rates and low to moderate seismicity. Although instrumental seismicity reaches maximum magnitudes of Mw 5.5, some historical earthquakes have had catastrophic consequences in this region. Aiming to improve our knowledge of active tectonics in Tunisia, we carried out both a quantitative morphometric analysis and a field study in the north-western region. We applied different morphometric tools, such as river profiles, knickpoint analysis, hypsometric curves and integrals, and drainage pattern anomalies, in order to differentiate between zones with high or low recent tectonic activity. This analysis helps identify uplift and subsidence zones, which we relate to fault activity. Several active faults in a sparse distribution were identified. A selected sector was studied with a field campaign to test the results obtained with the quantitative analysis. During the fieldwork we identified geological evidence of recent activity and a considerable seismogenic potential along the El Alia-Teboursouk (ETF) and Dkhila (DF) faults. The ETF fault could be responsible for one of the most devastating historical earthquakes in northern Tunisia, which destroyed Utique in 412 A.D. Geological evidence includes fluvial terraces folded by faults, striated and cracked pebbles, clastic dikes, sand volcanoes, coseismic cracks, etc. Although not reflected in the instrumental seismicity, our results support an important seismic hazard, evidenced by the several active tectonic structures identified and the two seismogenic faults described. After obtaining the current active tectonic framework of Tunisia we discuss our results within the western Mediterranean, trying to contribute to the understanding of the western Mediterranean tectonic context. With our results, we suggest that the main reason explaining the sparse and scarce seismicity of the area in contrast with the adjacent parts of the Nubia-Eurasia boundary is due to its extended
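
    Of the morphometric indices listed, the hypsometric integral has a compact definition that can be sketched directly; the elevation-relief ratio approximation below runs on a synthetic DEM (real analyses would use catchment-clipped elevation grids).

```python
import numpy as np

def hypsometric_integral(dem):
    """Elevation-relief ratio approximation: HI = (mean - min)/(max - min).
    Higher values are often read as younger, more tectonically active relief."""
    z = dem[np.isfinite(dem)]
    return (z.mean() - z.min()) / (z.max() - z.min())

dem = np.random.default_rng(3).uniform(200.0, 800.0, size=(100, 100))  # synthetic DEM (m)
print(f"HI = {hypsometric_integral(dem):.2f}")
```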

  6. Pattern decomposition and quantitative-phase analysis in pulsed neutron transmission

    International Nuclear Information System (INIS)

    Steuwer, A.; Santisteban, J.R.; Withers, P.J.; Edwards, L.

    2004-01-01

    Neutron diffraction methods provide accurate quantitative insight into material properties, with applications ranging from fundamental physics to applied engineering research. Neutron radiography or tomography, on the other hand, are useful tools for the non-destructive spatial imaging of materials or engineering components, but are less accurate with respect to quantitative analysis. It is possible to combine the advantages of diffraction and radiography using pulsed neutron transmission in a novel way. Using a pixellated detector at a time-of-flight source it is possible to collect 2D 'images' containing a great deal of interesting information in the thermal regime. This, together with the unprecedented intensities available at spallation sources and improvements in computing power, allows for a re-assessment of the transmission methods. It opens the possibility of simultaneous imaging of diverse material properties such as strain or temperature, as well as the variation in attenuation, and can assist in the determination of phase volume fractions. Spatial and time resolution (for dynamic experiments) are limited only by the detector technology and the intensity of the source. In this example, phase information contained in the cross-section is extracted from Bragg edges using an approach similar to pattern decomposition

  7. Comparison of different surface quantitative analysis methods. Application to corium

    International Nuclear Information System (INIS)

    Guilbaud, N.; Blin, D.; Perodeaud, Ph.; Dugne, O.; Gueneau, Ch.

    2000-01-01

    In case of a severe hypothetical accident in a pressurized water reactor, the reactor assembly melts partially or completely. The material formed, called corium, flows out and spreads at the bottom of the reactor. To limit and control the consequences of such an accident, the properties of the O-U-Zr basic system must be known accurately. To achieve this goal, the corium mix was melted by electron bombardment at very high temperature (3000 K), followed by quenching of the ingot in the Isabel 1 evaporator. Metallographic analyses were then required to validate the thermodynamic databases used by the Thermo-Calc software. The study consists in defining an overall surface quantitative analysis method that is fast and reliable, in order to determine the overall corium composition. The analyzed ingot originated from a (U + Fe + Y + UO2 + ZrO2) mix with a total mass of 2253.7 grams. Several successive heatings at moderate power were performed before a very brief plateau at very high temperature, so that the ingot formed progressively and without any evaporation liable to modify its initial composition. The central zone of the ingot was then analyzed by qualitative and quantitative global surface methods, to yield the volume composition of the analyzed zone. Corium sample analysis is very complex because of the variety and number of elements present, and also because of the presence of oxygen in a matrix based on a heavy element such as uranium. Three different global quantitative surface analysis methods were used: global EDS analysis (Energy Dispersive Spectrometry) with SEM, global WDS analysis (Wavelength Dispersive Spectrometry) with EPMA, and the coupling of image analysis with EDS or WDS point spectroscopic analyses. The difficulties encountered during the study arose from sample preparation (corium is very sensitive to oxidation) and from the choice of acquisition parameters for the images and analyses. The corium sample studied consisted of two zones displaying

  8. Quantitative image analysis in sonograms of the thyroid gland

    Energy Technology Data Exchange (ETDEWEB)

    Catherine, Skouroliakou [A' Department of Radiology, University of Athens, Vas.Sophias Ave, Athens 11528 (Greece); Maria, Lyra [A' Department of Radiology, University of Athens, Vas.Sophias Ave, Athens 11528 (Greece)]. E-mail: mlyra@pindos.uoa.gr; Aristides, Antoniou [A' Department of Radiology, University of Athens, Vas.Sophias Ave, Athens 11528 (Greece); Lambros, Vlahos [A' Department of Radiology, University of Athens, Vas.Sophias Ave, Athens 11528 (Greece)

    2006-12-20

    High-resolution, real-time ultrasound is a routine examination for assessing disorders of the thyroid gland. However, current diagnostic practice is based mainly on qualitative evaluation of the resulting sonograms and therefore depends on the physician's experience. Computerized texture analysis is widely employed on sonographic images of various organs (liver, breast), and it has been proven to increase the sensitivity of diagnosis by providing a better tissue characterization. The present study attempts to characterize thyroid tissue by automatic texture analysis. The texture features calculated are based on co-occurrence matrices as proposed by Haralick. The sample consists of 40 patients. For each patient two sonographic images (one for each lobe) are recorded in DICOM format. The lobe is manually delineated in each sonogram, and the co-occurrence matrices for 52 separation vectors are calculated. The texture features extracted from each of these matrices are: contrast, correlation, energy and homogeneity. Principal component analysis is used to select the optimal set of features. The statistical analysis resulted in the extraction of 21 optimal descriptors. The optimal descriptors are all co-occurrence parameters, as the first-order statistics did not prove to be representative of the image characteristics. The larger number of components depends mainly on correlation at very short or very long separation distances. The results indicate that quantitative analysis of thyroid sonograms can provide an objective characterization of thyroid tissue.
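
    The four co-occurrence features named here (contrast, correlation, energy, homogeneity) map directly onto scikit-image; the sketch below computes them for a random stand-in region, with only a few of the 52 separation vectors used in the study.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(4)
roi = rng.integers(0, 64, size=(128, 128), dtype=np.uint8)  # stand-in for a delineated lobe

# Co-occurrence matrices for two distances and two directions
glcm = graycomatrix(roi, distances=[1, 3], angles=[0, np.pi / 2],
                    levels=64, symmetric=True, normed=True)
for prop in ("contrast", "correlation", "energy", "homogeneity"):
    print(prop, graycoprops(glcm, prop).ravel().round(3))
```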

  9. Synchrotron radiation microprobe quantitative analysis method for biomedical specimens

    International Nuclear Information System (INIS)

    Xu Qing; Shao Hanru

    1994-01-01

    Relative changes of trace elemental content in biomedical specimens are obtained easily by means of synchrotron radiation X-ray fluorescence microprobe analysis (SXRFM). However, the accurate assignment of concentration on a µg/g basis is difficult, because it is necessary to know both the trace elemental content and the specimen mass in the irradiated volume simultaneously; the specimen mass is a function of the spatial position and cannot be weighed. It is possible to measure the specimen mass indirectly by measuring the intensity of the Compton scattered peak in normal XRF analysis using an X-ray tube with a Mo anode, if the matrix consists of light elements and the specimen is a thin sample. The Compton peak is not present in the fluorescence spectrum in white-light SXRFM analysis; the continuous background in the spectrum results from Compton scattering of a linearly polarized X-ray source. Biomedical specimens for SXRFM analysis, for example biological sections and human hair, are always thin samples for high-energy X-rays, and they consist of H, C, N, O and other light elements, which implies a linear relationship between the specimen mass and the Compton scattering background in the high-energy region of the spectrum. In this way, it is possible to carry out concentration measurements in SXRFM analysis.

  10. Quantitative assessment of industrial VOC emissions in China: Historical trend, spatial distribution, uncertainties, and projection

    Science.gov (United States)

    Zheng, Chenghang; Shen, Jiali; Zhang, Yongxin; Huang, Weiwei; Zhu, Xinbo; Wu, Xuecheng; Chen, Linghong; Gao, Xiang; Cen, Kefa

    2017-02-01

    The temporal trends of industrial volatile organic compound (VOC) emissions in China were comprehensively summarized for the 2011-2013 period, and projections for 2020-2050 were set. The results demonstrate that industrial VOC emissions in China increased from 15.3 Tg in 2011 to 29.4 Tg in 2013, at an annual average growth rate of 38.3%. Guangdong (3.45 Tg), Shandong (2.85 Tg), and Jiangsu (2.62 Tg) were the three largest contributors, collectively accounting for 30.4% of the national total emissions in 2013. The top three average industrial VOC emissions per square kilometer were Shanghai (247.2 ton/km2), Tianjin (62.8 ton/km2), and Beijing (38.4 ton/km2), which were 12-80 times the average level in China. The data from the inventory indicate that the use of VOC-containing products, the production of VOCs, the use of VOCs as raw materials, and storage and transportation contributed 75.4%, 10.3%, 9.1%, and 5.2% of the total emissions, respectively. ArcGIS was used to display the remarkable spatial distribution variation by allocating the emissions into 1 km × 1 km grid cells with population as a surrogate index. Considering future economic development and population change, as well as the implementation of policies and upgrades of control technologies, three scenarios (A, B, and C) were set to project industrial VOC emissions for the years 2020, 2030, and 2050; these present the industrial VOC emissions under different scenarios and the potential for reducing emissions. Finally, the results show that coordinated control policies would considerably influence industrial VOC emissions.

  11. Global sensitivity analysis for models with spatially dependent outputs

    International Nuclear Information System (INIS)

    Iooss, B.; Marrel, A.; Jullien, M.; Laurent, B.

    2011-01-01

    The global sensitivity analysis of a complex numerical model often calls for the estimation of variance-based importance measures, named Sobol' indices. Meta-model-based techniques have been developed in order to replace the CPU time-expensive computer code with an inexpensive mathematical function, which predicts the computer code output. The common meta-model-based sensitivity analysis methods are well suited for computer codes with scalar outputs. However, in the environmental domain, as in many areas of application, the numerical model outputs are often spatial maps, which may also vary with time. In this paper, we introduce an innovative method to obtain a spatial map of Sobol' indices with a minimal number of numerical model computations. It is based upon the functional decomposition of the spatial output onto a wavelet basis and the meta-modeling of the wavelet coefficients by the Gaussian process. An analytical example is presented to clarify the various steps of our methodology. This technique is then applied to a real hydrogeological case: for each model input variable, a spatial map of Sobol' indices is thus obtained. (authors)
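
    For a scalar output, the variance-based Sobol' indices that this method spatializes can be estimated with the SALib package, as in the hedged sketch below (the Ishigami function stands in for the expensive simulator; the paper's wavelet and Gaussian-process machinery for spatial outputs is not reproduced here).

```python
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {"num_vars": 3,
           "names": ["x1", "x2", "x3"],
           "bounds": [[-np.pi, np.pi]] * 3}

X = saltelli.sample(problem, 1024)
# Ishigami test function as a stand-in for the computer code output
Y = np.sin(X[:, 0]) + 7.0 * np.sin(X[:, 1]) ** 2 \
    + 0.1 * X[:, 2] ** 4 * np.sin(X[:, 0])
Si = sobol.analyze(problem, Y)
print("first-order:", np.round(Si["S1"], 2), " total:", np.round(Si["ST"], 2))
```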

  12. Capacity analysis of spectrum sharing spatial multiplexing MIMO systems

    KAUST Repository

    Yang, Liang

    2014-12-01

    This paper considers a spectrum sharing (SS) multiple-input multiple-output (MIMO) system operating in a Rayleigh fading environment. First, the capacity of a single-user SS spatial multiplexing system is investigated in two scenarios that assume different receivers. To explicitly show the capacity scaling law of SS MIMO systems, approximate capacity expressions for the two scenarios are derived. Next, we extend our analysis to a multiple-user system with zero-forcing (ZF) receivers under spatially-independent scheduling and analyze the sum-rate. Furthermore, we provide an asymptotic sum-rate analysis to investigate the effects of different parameters on the multiuser diversity gain. Our results show that the secondary system with a smaller number of transmit antennas Nt and a larger number of receive antennas Nr can achieve higher capacity at lower interference temperature Q, but at high Q the capacity follows the scaling law of conventional MIMO systems. However, for a ZF SS spatial multiplexing system, the secondary system with small Nt and large Nr can achieve the highest capacity throughout the entire region of Q. For a ZF SS spatial multiplexing system with scheduling, the asymptotic sum-rate scales like Nt log2(Q(KNtNp-1)/Nt), where Np denotes the number of antennas of the primary receiver and K represents the number of secondary transmitters.
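
    To make the capacity discussion concrete, the sketch below Monte-Carlo-estimates the ergodic capacity of a Rayleigh MIMO link with equal power allocation; the spectrum-sharing interference-temperature constraint analysed in the paper is deliberately left out.

```python
import numpy as np

def ergodic_capacity(nt, nr, snr_db, trials=2000, seed=5):
    """Average of log2 det(I + (snr/nt) H H^H) over Rayleigh channel draws."""
    rng = np.random.default_rng(seed)
    snr = 10.0 ** (snr_db / 10.0)
    total = 0.0
    for _ in range(trials):
        h = (rng.standard_normal((nr, nt)) +
             1j * rng.standard_normal((nr, nt))) / np.sqrt(2.0)
        total += np.linalg.slogdet(np.eye(nr) + (snr / nt) * h @ h.conj().T)[1]
    return total / (trials * np.log(2.0))   # convert natural log to log2

for nt, nr in [(2, 4), (4, 4)]:
    print(f"Nt={nt}, Nr={nr}: {ergodic_capacity(nt, nr, 10.0):.2f} bit/s/Hz")
```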

  13. Violence in public transportation: an approach based on spatial analysis.

    Science.gov (United States)

    Sousa, Daiane Castro Bittencourt de; Pitombo, Cira Souza; Rocha, Samille Santos; Salgueiro, Ana Rita; Delgado, Juan Pedro Moreno

    2017-12-11

    To carry out a spatial analysis of the occurrence of acts of violence (specifically robberies) in public transportation, identifying the regions of greater incidence using geostatistics, and possible causes with the aid of a multicriteria analysis in a Geographic Information System. The unit of analysis is the traffic analysis zone of the survey named Origem-Destino, carried out in Salvador, state of Bahia, in 2013. The robberies recorded by the Department of Public Security of Bahia in 2013 were located and made compatible with the limits of the traffic analysis zones and, later, associated with the respective centroids. After determining the regions with the highest probability of robbery, we carried out a geographic analysis of possible causes in the region with the highest robbery potential, considering the factors analyzed through a multicriteria analysis in a Geographic Information System environment. The execution of the two steps of this study allowed us to identify the areas with the greatest probability of occurrence of robberies in public transportation. In addition, the three most vulnerable road sections (Estrada da Liberdade, Rua Pero Vaz, and Avenida General San Martin) were identified in these areas. In these sections, the factors that most contribute to the potential for robbery on buses are: F1 - proximity to places that facilitate escape, F3 - great movement of persons, and F2 - absence of policing, respectively. Indicator Kriging (geostatistical estimation) can be used to construct a spatial probability surface, which can be a useful tool for the implementation of public policies. The multicriteria analysis in the Geographic Information System environment allowed us to understand the spatial factors related to the phenomenon under analysis.
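
    Indicator kriging amounts to ordinary kriging of a 0/1 transform of the data, which yields an exceedance-probability surface; a hedged sketch with the PyKrige package follows (coordinates, counts and the threshold are invented).

```python
import numpy as np
from pykrige.ok import OrdinaryKriging  # assumes the pykrige package is installed

rng = np.random.default_rng(6)
x, y = rng.uniform(0, 10, 80), rng.uniform(0, 10, 80)  # zone centroids (illustrative)
robberies = rng.poisson(3 + 4 * (x > 5))               # counts per zone

indicator = (robberies >= 5).astype(float)             # 1 if threshold exceeded
ok = OrdinaryKriging(x, y, indicator, variogram_model="spherical")
prob, _ = ok.execute("grid", np.linspace(0, 10, 50), np.linspace(0, 10, 50))
surface = np.clip(np.asarray(prob), 0.0, 1.0)          # probability surface
print(surface[:3, :5].round(2))
```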

  14. Quantitative, depth-resolved determination of particle motion using multi-exposure, spatial frequency domain laser speckle imaging.

    Science.gov (United States)

    Rice, Tyler B; Kwan, Elliott; Hayakawa, Carole K; Durkin, Anthony J; Choi, Bernard; Tromberg, Bruce J

    2013-01-01

    Laser Speckle Imaging (LSI) is a simple, noninvasive technique for rapid imaging of particle motion in scattering media such as biological tissue. LSI is generally used to derive a qualitative index of relative blood flow due to unknown impact from several variables that affect speckle contrast. These variables may include optical absorption and scattering coefficients, multi-layer dynamics including static, non-ergodic regions, and systematic effects such as laser coherence length. In order to account for these effects and move toward quantitative, depth-resolved LSI, we have developed a method that combines Monte Carlo modeling, multi-exposure speckle imaging (MESI), spatial frequency domain imaging (SFDI), and careful instrument calibration. Monte Carlo models were used to generate total and layer-specific fractional momentum transfer distributions. This information was used to predict speckle contrast as a function of exposure time, spatial frequency, layer thickness, and layer dynamics. To verify with experimental data, controlled phantom experiments with characteristic tissue optical properties were performed using a structured light speckle imaging system. Three main geometries were explored: 1) diffusive dynamic layer beneath a static layer, 2) static layer beneath a diffuse dynamic layer, and 3) directed flow (tube) submerged in a dynamic scattering layer. Data fits were performed using the Monte Carlo model, which accurately reconstructed the type of particle flow (diffusive or directed) in each layer, the layer thickness, and absolute flow speeds to within 15% or better.
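
    The quantity underlying LSI and MESI is the local speckle contrast K = σ/μ; the sliding-window sketch below computes it for a synthetic fully developed speckle frame (the window size and intensity statistics are illustrative).

```python
import numpy as np
from scipy.ndimage import uniform_filter

def speckle_contrast(frame, window=7):
    """Local speckle contrast K = sigma/mean in a sliding window.
    Lower K indicates more motion (temporally blurred speckle)."""
    f = frame.astype(float)
    mean = uniform_filter(f, window)
    var = np.clip(uniform_filter(f ** 2, window) - mean ** 2, 0.0, None)
    return np.sqrt(var) / np.where(mean > 0, mean, 1.0)

# Exponential intensity statistics mimic fully developed static speckle (K ~ 1)
frame = np.random.default_rng(7).exponential(100.0, size=(256, 256))
print(f"median K: {np.median(speckle_contrast(frame)):.2f}")
```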

  15. EBprot: Statistical analysis of labeling-based quantitative proteomics data.

    Science.gov (United States)

    Koh, Hiromi W L; Swa, Hannah L F; Fermin, Damian; Ler, Siok Ghee; Gunaratne, Jayantha; Choi, Hyungwon

    2015-08-01

    Labeling-based proteomics is a powerful method for detection of differentially expressed proteins (DEPs). Current data analysis platforms typically rely on protein-level ratios, which are obtained by summarizing peptide-level ratios for each protein. In shotgun proteomics, however, some proteins are quantified with more peptides than others, and this reproducibility information is not incorporated into the differential expression (DE) analysis. Here, we propose a novel probabilistic framework EBprot that directly models the peptide-protein hierarchy and rewards the proteins with reproducible evidence of DE over multiple peptides. To evaluate its performance with known DE states, we conducted a simulation study to show that the peptide-level analysis of EBprot provides better receiver-operating characteristics and more accurate estimation of the false discovery rates than the methods based on protein-level ratios. We also demonstrate superior classification performance of peptide-level EBprot analysis in a spike-in dataset. To illustrate the wide applicability of EBprot in different experimental designs, we applied EBprot to a dataset for lung cancer subtype analysis with biological replicates and another dataset for time course phosphoproteome analysis of EGF-stimulated HeLa cells with multiplexed labeling. Through these examples, we show that the peptide-level analysis of EBprot is a robust alternative to the existing statistical methods for the DE analysis of labeling-based quantitative datasets. The software suite is freely available on the Sourceforge website http://ebprot.sourceforge.net/. All MS data have been deposited in the ProteomeXchange with identifier PXD001426 (http://proteomecentral.proteomexchange.org/dataset/PXD001426/). © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. The Quantitative Analysis of Chennai Automotive Industry Cluster

    Science.gov (United States)

    Bhaskaran, Ethirajan

    2016-07-01

    Chennai is also called the Detroit of India due to the presence of an automotive industry producing over 40% of India's vehicles and components. During 2001-2002, the Automotive Component Industries (ACI) in the Ambattur, Thirumalizai and Thirumudivakkam Industrial Estates, Chennai, faced problems in infrastructure, technology, procurement, production and marketing. The objective is to study the quantitative performance of the Chennai automotive industry cluster before (2001-2002) and after (2008-2009) the Cluster Development Approach (CDA). The methodology adopted is the collection of primary data from 100 ACI using a quantitative questionnaire and analysis using Correlation Analysis (CA), Regression Analysis (RA), the Friedman Test (FMT), and the Kruskal-Wallis Test (KWT). The CA computed for the different sets of variables reveals a high degree of relationship between the variables studied. The RA models constructed establish a strong relationship between the dependent variable and a host of independent variables; the models proposed here reveal the approximate relationship in a closer form. The KWT shows that there is no significant difference between the three location clusters with respect to net profit, production cost, marketing costs, procurement costs and gross output, which supports the view that each location has contributed uniformly to the development of the automobile component cluster. The FMT shows that there is no significant difference between industrial units in respect of costs such as production, infrastructure, technology, marketing and net profit. To conclude, the automotive industries have fully utilized the physical infrastructure and centralised facilities by adopting the CDA and are now exporting their products to North America, South America, Europe, Australia, Africa and Asia. The value chain analysis models have been implemented in all the cluster units. This Cluster Development Approach (CDA) model can be implemented in industries of under-developed and developing countries for cost reduction and productivity

  17. Functional linear models for association analysis of quantitative traits.

    Science.gov (United States)

    Fan, Ruzong; Wang, Yifan; Mills, James L; Wilson, Alexander F; Bailey-Wilson, Joan E; Xiong, Momiao

    2013-11-01

    Functional linear models are developed in this paper for testing associations between quantitative traits and genetic variants, which can be rare variants, common variants, or a combination of the two. By treating multiple genetic variants of an individual in a human population as a realization of a stochastic process, the genome of an individual in a chromosome region is a continuum of sequence data rather than discrete observations. The genome of an individual is viewed as a stochastic function that contains both linkage and linkage disequilibrium (LD) information of the genetic markers. By using techniques of functional data analysis, both fixed and mixed effect functional linear models are built to test the association between quantitative traits and genetic variants adjusting for covariates. After extensive simulation analysis, it is shown that the F-distributed tests of the proposed fixed effect functional linear models have higher power than the sequence kernel association test (SKAT) and its optimal unified test (SKAT-O) for three scenarios in most cases: (1) the causal variants are all rare, (2) the causal variants are both rare and common, and (3) the causal variants are common. The superior performance of the fixed effect functional linear models is most likely due to their optimal utilization of both genetic linkage and LD information of multiple genetic variants in a genome and similarity among different individuals, while SKAT and SKAT-O only model the similarities and pairwise LD but do not model linkage and higher-order LD information sufficiently. In addition, the proposed fixed effect models generate accurate type I error rates in simulation studies. We also show that the functional kernel score tests of the proposed mixed effect functional linear models are preferable in candidate gene analysis and small sample problems. The methods are applied to analyze three biochemical traits in data from the Trinity Students Study. © 2013 WILEY

  18. Quantitative CT analysis of small pulmonary vessels in lymphangioleiomyomatosis

    Energy Technology Data Exchange (ETDEWEB)

    Ando, Katsutoshi, E-mail: kando@juntendo.ac.jp [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Tobino, Kazunori [Department of Respiratory Medicine, Iizuka Hospital, 3-83 Yoshio-Machi, Iizuka-City, Fukuoka 820-8505 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Kurihara, Masatoshi; Kataoka, Hideyuki [Pneumothorax Center, Nissan Tamagawa Hospital, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Doi, Tokuhide [Fukuoka Clinic, 7-18-11 Umeda, Adachi-Ku, Tokyo 123-0851 (Japan); Hoshika, Yoshito [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Takahashi, Kazuhisa [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); Seyama, Kuniaki [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan)

    2012-12-15

    Background: Lymphangioleiomyomatosis (LAM) is a destructive lung disease that shares clinical, physiologic, and radiologic features with chronic obstructive pulmonary disease (COPD). This study aims to identify those features that are unique to LAM by using quantitative CT analysis. Methods: We measured total cross-sectional areas of small pulmonary vessels (CSA) less than 5 mm² and 5–10 mm² and calculated percentages of those lung areas (%CSA), respectively, in 50 LAM and 42 COPD patients. The extent of cystic destruction (LAA%) and mean parenchymal CT value were also calculated and correlated with pulmonary function. Results: The diffusing capacity for carbon monoxide/alveolar volume (DLCO/VA %predicted) was similar for both groups (LAM, 44.4 ± 19.8% vs. COPD, 45.7 ± 16.0%, p = 0.763), but less tissue damage occurred in LAM than in COPD (LAA% 21.7 ± 16.3% vs. 29.3 ± 17.0%; p < 0.05). Pulmonary function correlated negatively with LAA% (p < 0.001) in both groups, yet the correlation with %CSA was significant only in COPD (p < 0.001). When the same analysis was conducted in two groups with equal levels of LAA% and DLCO/VA %predicted, %CSA and mean parenchymal CT value were still greater for LAM than COPD (p < 0.05). Conclusions: Quantitative CT analysis revealing a correlation between cystic destruction and CSA in COPD but not LAM indicates that this approach successfully reflects different mechanisms governing the two pathologic courses. Such determinations of small pulmonary vessel density may serve to differentiate LAM from COPD even in patients with severe lung destruction.

  19. QuASAR: quantitative allele-specific analysis of reads.

    Science.gov (United States)

    Harvey, Chris T; Moyerbrailean, Gregory A; Davis, Gordon O; Wen, Xiaoquan; Luca, Francesca; Pique-Regi, Roger

    2015-04-15

    Expression quantitative trait loci (eQTL) studies have discovered thousands of genetic variants that regulate gene expression, enabling a better understanding of the functional role of non-coding sequences. However, eQTL studies are costly, requiring large sample sizes and genome-wide genotyping of each sample. In contrast, analysis of allele-specific expression (ASE) is becoming a popular approach to detect the effect of genetic variation on gene expression, even within a single individual. This is typically achieved by counting the number of RNA-seq reads matching each allele at heterozygous sites and testing the null hypothesis of a 1:1 allelic ratio. In principle, when genotype information is not readily available, it could be inferred from the RNA-seq reads directly. However, there are currently no existing methods that jointly infer genotypes and conduct ASE inference, while considering uncertainty in the genotype calls. We present QuASAR, quantitative allele-specific analysis of reads, a novel statistical learning method for jointly detecting heterozygous genotypes and inferring ASE. The proposed ASE inference step takes into consideration the uncertainty in the genotype calls, while including parameters that model base-call errors in sequencing and allelic over-dispersion. We validated our method with experimental data for which high-quality genotypes are available. Results for an additional dataset with multiple replicates at different sequencing depths demonstrate that QuASAR is a powerful tool for ASE analysis when genotypes are not available. http://github.com/piquelab/QuASAR. fluca@wayne.edu or rpique@wayne.edu Supplementary Material is available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  20. Quantitative Analysis of TDLUs using Adaptive Morphological Shape Techniques.

    Science.gov (United States)

    Rosebrock, Adrian; Caban, Jesus J; Figueroa, Jonine; Gierach, Gretchen; Linville, Laura; Hewitt, Stephen; Sherman, Mark

    2013-03-29

    Within the complex branching system of the breast, terminal duct lobular units (TDLUs) are the anatomical location where most cancers originate. With aging, TDLUs undergo physiological involution, reflected in a loss of structural components (acini) and a reduction in total number. Data suggest that women undergoing benign breast biopsies that do not show age-appropriate involution are at increased risk of developing breast cancer. To date, TDLU assessments have generally been made by qualitative visual assessment rather than by objective quantitative analysis. This paper introduces a technique to automatically estimate a set of quantitative measurements and use those variables to describe and classify TDLUs more objectively. To validate the accuracy of our system, we compared the computer-based morphological properties of 51 TDLUs in breast tissues donated for research by volunteers in the Susan G. Komen Tissue Bank with those assessed by a pathologist, demonstrating 70% agreement. Secondly, to show that our method is applicable to a wider range of datasets, we analyzed 52 TDLUs from biopsies performed for clinical indications in the National Cancer Institute's Breast Radiology Evaluation and Study of Tissues (BREAST) Stamp Project and obtained 82% correlation with visual assessment. Lastly, we demonstrate the ability to uncover novel measures when researching the structural properties of the acini by applying machine learning and clustering techniques. Through our study we found that while the number of acini per TDLU increases exponentially with TDLU diameter, the average elongation and roundness remain constant.

  1. MR imaging of Minamata disease. Qualitative and quantitative analysis

    International Nuclear Information System (INIS)

    Korogi, Yukunori; Takahashi, Mutsumasa; Sumi, Minako; Hirai, Toshinori; Okuda, Tomoko; Shinzato, Jintetsu; Okajima, Toru.

    1994-01-01

    Minamata disease (MD), a result of methylmercury poisoning, is a neurological illness caused by ingestion of contaminated seafood. We evaluated the MR findings of patients with MD qualitatively and quantitatively. Magnetic resonance imaging at 1.5 Tesla was performed in seven patients with MD and in eight control subjects. All of our patients showed typical neurological findings such as sensory disturbance, constriction of the visual fields, and ataxia. In the quantitative image analysis, the inferior and middle parts of the cerebellar vermis and the cerebellar hemispheres were significantly atrophic in comparison with the normal controls. There were no significant differences in measurements of the basis pontis, middle cerebellar peduncles, corpus callosum, or cerebral hemispheres between MD patients and the normal controls. The calcarine sulci and central sulci were significantly dilated, reflecting atrophy of the visual cortex and postcentral cortex, respectively. The lesions located in the calcarine area, cerebellum, and postcentral gyri were related to the three characteristic manifestations of this disease: constriction of the visual fields, ataxia, and sensory disturbance, respectively. MR imaging has proved to be useful in evaluating the CNS abnormalities of methylmercury poisoning. (author)

  2. Qualitative and quantitative reliability analysis of safety systems

    International Nuclear Information System (INIS)

    Karimi, R.; Rasmussen, N.; Wolf, L.

    1980-05-01

    A code has been developed for the comprehensive analysis of a fault tree. The code designated UNRAC (UNReliability Analysis Code) calculates the following characteristics of an input fault tree: (1) minimal cut sets; (2) top event unavailability as point estimate and/or in time dependent form; (3) quantitative importance of each component involved; and, (4) error bound on the top event unavailability. UNRAC can analyze fault trees, with any kind of gates (EOR, NAND, NOR, AND, OR), up to a maximum of 250 components and/or gates. The code is benchmarked against WAMCUT, MODCUT, KITT, BIT-FRANTIC, and PL-MODT. The results showed that UNRAC produces results more consistent with the KITT results than either BIT-FRANTIC or PL-MODT. Overall it is demonstrated that UNRAC is an efficient easy-to-use code and has the advantage of being able to do a complete fault tree analysis with this single code. Applications of fault tree analysis to safety studies of nuclear reactors are considered

  3. Quantitative analysis and classification of AFM images of human hair.

    Science.gov (United States)

    Gurden, S P; Monteiro, V F; Longo, E; Ferreira, M M C

    2004-07-01

    The surface topography of human hair, as defined by the outer layer of cellular sheets, termed cuticles, largely determines the cosmetic properties of the hair. The condition of the cuticles is of great cosmetic importance, but also has the potential to aid diagnosis in the medical and forensic sciences. Atomic force microscopy (AFM) has been demonstrated to offer unique advantages for analysis of the hair surface, mainly due to the high image resolution and the ease of sample preparation. This article presents an algorithm for the automatic analysis of AFM images of human hair. The cuticular structure is characterized using a series of descriptors, such as step height, tilt angle and cuticle density, allowing quantitative analysis and comparison of different images. The usefulness of this approach is demonstrated by a classification study. Thirty-eight AFM images were measured, consisting of hair from (a) untreated and bleached samples, and (b) the root and distal ends of the hair fibre. The multivariate classification technique partial least squares discriminant analysis is used to test the ability of the algorithm to characterize the images according to the properties of the hair samples. Most of the images (86%) were found to be classified correctly.

  4. Large-scale quantitative analysis of painting arts.

    Science.gov (United States)

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-12-11

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting art has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paintings to build a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images - the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety in the medieval period. Interestingly, moreover, the increase in the roughness exponent as painting techniques such as chiaroscuro and sfumato advanced is consistent with historical circumstances.

  5. Quali- and quantitative analysis of commercial coffee by NMR

    International Nuclear Information System (INIS)

    Tavares, Leila Aley; Ferreira, Antonio Gilberto

    2006-01-01

    Coffee is one of the most widely consumed beverages in the world, and the 'cafezinho' is normally prepared from a blend of roasted powder of two species, Coffea arabica and Coffea canephora. Each one exhibits differences in taste and in chemical composition, especially in the caffeine percentage. There are several procedures proposed in the literature for caffeine determination in different samples such as soft drinks, coffee, medicines, etc., but most of them need a sample workup involving at least one purification step. This work describes the quantitative analysis of caffeine using ¹H NMR and the identification of the major components in commercial coffee samples using 1D and 2D NMR techniques without any sample pre-treatment. (author)

  6. Automated quantitative cytological analysis using portable microfluidic microscopy.

    Science.gov (United States)

    Jagannadh, Veerendra Kalyan; Murthy, Rashmi Sreeramachandra; Srinivasan, Rajesh; Gorthi, Sai Siva

    2016-06-01

    In this article, a portable microfluidic microscopy based approach for automated cytological investigations is presented. Inexpensive optical and electronic components have been used to construct a simple microfluidic microscopy system. In contrast to the conventional slide-based methods, the presented method employs microfluidics to enable automated sample handling and image acquisition. The approach involves the use of simple in-suspension staining and automated image acquisition to enable quantitative cytological analysis of samples. The applicability of the presented approach to research in cellular biology is shown by performing an automated cell viability assessment on a given population of yeast cells. Further, the relevance of the presented approach to clinical diagnosis and prognosis has been demonstrated by performing detection and differential assessment of malaria infection in a given sample. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Quantitative image analysis of WE43-T6 cracking behavior

    International Nuclear Information System (INIS)

    Ahmad, A; Yahya, Z

    2013-01-01

    Environment-assisted cracking of WE43 cast magnesium (4.2 wt.% Y, 2.3 wt.% Nd, 0.7 wt.% Zr, 0.8 wt.% HRE) in the T6 peak-aged condition was induced in ambient air in notched specimens. The mechanism of fracture was studied using electron backscatter diffraction, serial sectioning and in situ observations of crack propagation. The intermetallic material (rare-earth-enriched divorced intermetallic retained at grain boundaries and predominantly at triple points) was found to play a significant role in initiating the cracks that lead to failure of this material. Quantitative measurements were required for this project. The populations of intermetallic particles and clusters of intermetallic particles were analyzed using image analysis of metallographic images. This is part of the work to generate a theoretical model of the effect of notch geometry on the static fatigue strength of this material.

  8. Quantitative analysis of lead in polysulfide-based impression material

    Directory of Open Access Journals (Sweden)

    Aparecida Silva Braga

    2007-06-01

    Full Text Available Permlastic® is a polysulfide-based impression material widely used by dentists in Brazil. It is composed of a base paste and a catalyzer containing lead dioxide. The high toxicity of lead to humans is grounds for much concern, since it can attack various systems and organs. The present study involved a quantitative analysis of the concentration of lead in the material Permlastic®. The lead was determined by plasma-induced optical emission spectrometry (Varian, model Vista). The percentages of lead found in the two analyzed lots were 38.1 and 40.8%. The lead concentrations in the material under study were high, but the product's packaging contained no information about these concentrations.

  9. Quantitative analysis of fission products by γ spectrography

    International Nuclear Information System (INIS)

    Malet, G.

    1962-01-01

    The activity of the fission products present in treated solutions of irradiated fuels is given as a function of the cooling time and of the irradiation time. The variation of the ratio (144Ce + 144Pr activity)/(137Cs activity) as a function of these same parameters is also given. From these results a method is deduced giving the 'age' of the solution analyzed. By γ-scintillation spectrography it was possible to estimate the following elements individually: 141Ce, 144Ce + 144Pr, 103Ru, 106Ru + 106Rh, 137Cs, 95Zr + 95Nb. Yield curves are given for the case of a single emitter. Of the various existing methods, that of least squares was used for the quantitative analysis of the aforementioned fission products. The accuracy attained varies from 3 to 10%. (author) [fr]
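
    The least-squares step described here can be sketched as a non-negative decomposition of a measured spectrum into reference single-emitter spectra; the Gaussian "reference spectra" below are invented stand-ins for measured response curves.

```python
import numpy as np
from scipy.optimize import nnls

channels = np.arange(512)

def peak(center, width=8.0):
    """Toy single-emitter response: one Gaussian photopeak."""
    return np.exp(-0.5 * ((channels - center) / width) ** 2)

refs = np.column_stack([peak(120), peak(250), peak(400)])   # reference spectra
true_activities = np.array([5.0, 2.0, 8.0])
measured = refs @ true_activities \
    + np.random.default_rng(8).normal(0.0, 0.05, channels.size)

activities, _ = nnls(refs, measured)     # non-negative least squares
print("estimated activities:", activities.round(2))  # close to [5, 2, 8]
```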

  10. Quantitative image analysis for investigating cell-matrix interactions

    Science.gov (United States)

    Burkel, Brian; Notbohm, Jacob

    2017-07-01

    The extracellular matrix provides both chemical and physical cues that control cellular processes such as migration, division, differentiation, and cancer progression. Cells can mechanically alter the matrix by applying forces that result in matrix displacements, which in turn may localize to form dense bands along which cells may migrate. To quantify the displacements, we use confocal microscopy and fluorescent labeling to acquire high-contrast images of the fibrous material. Using a technique for quantitative image analysis called digital volume correlation, we then compute the matrix displacements. Our experimental technology offers a means to quantify matrix mechanics and cell-matrix interactions. We are now using these experimental tools to modulate mechanical properties of the matrix to study cell contraction and migration.
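
    Displacement measurement by image correlation, of which digital volume correlation is the 3-D version, can be sketched in 2-D with phase correlation from scikit-image; this is an illustrative stand-in, not the authors' DVC implementation.

```python
import numpy as np
from scipy.ndimage import shift as nd_shift
from skimage.registration import phase_cross_correlation

rng = np.random.default_rng(9)
reference = rng.random((128, 128))                  # stand-in for a fiber image
deformed = nd_shift(reference, shift=(2.3, -1.7))   # apply a known displacement

# Subpixel registration via upsampled phase correlation
est, err, _ = phase_cross_correlation(reference, deformed, upsample_factor=20)
print("shift registering deformed onto reference:", est)  # approx. (-2.3, 1.7)
```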

  11. Visuo-Spatial Performance in Autism: A Meta-analysis

    OpenAIRE

    Muth, Anne; Honekopp, Johannes; Falter, Christine

    2014-01-01

    Visuo-spatial skills are believed to be enhanced in autism spectrum disorders (ASDs). This meta-analysis tests the current state of evidence for Figure Disembedding, Block Design, Mental Rotation and Navon tasks in ASD and neurotypicals. Block Design (d = 0.32) and Figure Disembedding (d = 0.26) showed superior performance for ASD with large heterogeneity that is unaccounted for. No clear differences were found for Mental Rotation. ASD samples showed a stronger local processing preference for...

  12. Strategic Placing of Field Hospitals Using Spatial Analysis

    OpenAIRE

    Rydén, Magnus

    2011-01-01

    Humanitarian aid organisations may benefit from improving their location analysis when placing field hospitals in countries hit by a disaster or catastrophe. The main objective of this thesis is to develop and evaluate a spatial decision support method for the strategic placing of field hospitals for two time perspectives, long term (months) and short term (weeks). Specifically, the possibility of combining existing infrastructure and satellite data is examined to derive a suitability map f...

  13. Spatial Analysis of Depots for Advanced Biomass Processing

    Energy Technology Data Exchange (ETDEWEB)

    Hilliard, Michael R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Brandt, Craig C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Webb, Erin [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Sokhansanj, Shahabaddine [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Eaton, Laurence M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Martinez Gonzalez, Maria I. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2014-09-01

    The objective of this work was to perform a spatial analysis of the total feedstock cost at the conversion reactor for biomass supplied by a conventional system and an advanced system with depots to densify biomass into pellets. From these cost estimates, the conditions (feedstock cost and availability) for which advanced processing depots make it possible to achieve cost and volume targets can be identified.

  14. Quantitative analysis of secretome from adipocytes regulated by insulin

    Institute of Scientific and Technical Information of China (English)

    Hu Zhou; Yuanyuan Xiao; Rongxia Li; Shangyu Hong; Sujun Li; Lianshui Wang; Rong Zeng; Kan Liao

    2009-01-01

    The adipocyte is not only a central player involved in the storage and release of energy, but also in the regulation of energy metabolism in other organs via the secretion of peptides and proteins. During the pathogenesis of insulin resistance and type 2 diabetes, adipocytes are subjected to increased levels of insulin, which may have a major impact on the secretion of adipokines. We have undertaken cleavable isotope-coded affinity tag (cICAT) and label-free quantitation approaches to identify and quantify secretory factors that are differentially secreted by 3T3-L1 adipocytes with or without insulin treatment. Combining the cICAT and label-free results, 317 proteins were predicted or annotated as secretory proteins. Among these secretory proteins, 179 proteins and 53 proteins were significantly up-regulated and down-regulated, respectively. A total of 77 reported adipokines were quantified in our study, such as adiponectin, cathepsin D, cystatin C, resistin, and transferrin. Western blot analysis of these adipokines confirmed the quantitative results from mass spectrometry, and revealed individualized secretion patterns of these proteins with increasing insulin dose. In addition, 240 proteins were newly identified and quantified as secreted proteins from 3T3-L1 adipocytes in our study, most of which were up-regulated upon insulin treatment. Further comprehensive bioinformatics analysis revealed that the secretory proteins in the extracellular matrix-receptor interaction pathway and the glycan structure degradation pathway were significantly up-regulated by insulin stimulation.

  15. Qualitative and quantitative analysis of women's perceptions of transvaginal surgery.

    Science.gov (United States)

    Bingener, Juliane; Sloan, Jeff A; Ghosh, Karthik; McConico, Andrea; Mariani, Andrea

    2012-04-01

    Prior surveys evaluating women's perceptions of transvaginal surgery both support and refute the acceptability of transvaginal access. Most surveys employed mainly quantitative analysis, limiting the insight into the women's perspective. In this mixed-methods study, we include qualitative and quantitative methodology to assess women's perceptions of transvaginal procedures. Women seen at the outpatient clinics of a tertiary-care center were asked to complete a survey. Demographics and preferences for appendectomy, cholecystectomy, and tubal ligation were elicited, along with open-ended questions about concerns or benefits of transvaginal access. Multivariate logistic regression models were constructed to examine the impact of age, education, parity, and prior transvaginal procedures on preferences. For the qualitative evaluation, content analysis by independent investigators identified themes, issues, and concerns raised in the comments. The completed survey tool was returned by 409 women (grouped mean age 53 years, mean number of 2 children, 82% ≥ some college education, and 56% with previous transvaginal procedure). The transvaginal approach was acceptable for tubal ligation to 59%, for appendectomy to 43%, and for cholecystectomy to 41% of the women. The most frequently mentioned factors that would make women prefer a vaginal approach were decreased invasiveness (14.4%), recovery time (13.9%), scarring (13.7%), pain (6%), and surgical entry location relative to organ removed (4.4%). The most frequently mentioned concerns about the vaginal approach were the possibility of complications/safety (14.7%), pain (9%), infection (5.6%), and recovery time (4.9%). A number of women voiced technical concerns about the vaginal approach. As in prior studies, scarring and pain were important issues to be considered, but recovery time and increased invasiveness were also in the "top five" list. The surveyed women appeared to actively participate in evaluating the technical

  16. Quantitative Motion Analysis of Tai Chi Chuan: The Upper Extremity Movement

    Directory of Open Access Journals (Sweden)

    Tsung-Jung Ho

    2018-01-01

    Full Text Available The quantitative and reproducible analysis of the standard body movement in Tai Chi Chuan (TCC) was performed in this study. We aimed to provide a reference for the upper extremities for standardizing TCC practice. Microsoft Kinect was used to record the motion during the practice of TCC. The preparation form and eight essential forms of TCC performed by an instructor and 101 practitioners were analyzed in this study. The instructor completed an entire TCC practice cycle and performed the cycle 12 times. An entire cycle of TCC was performed by the practitioners and images were recorded for statistical analysis. The performance of the instructor showed high similarity (Pearson correlation coefficient r = 0.71~0.84) to the first practice cycle. Among the 9 forms, the lay form had the highest similarity (rmean = 0.90) and the push form had the lowest similarity (rmean = 0.52). For the practitioners, the ward off form (rmean = 0.51) and the roll back form (rmean = 0.45) had the highest similarity, with moderate correlation. We used Microsoft Kinect to record the spatial coordinates of the upper extremity joints during the practice of TCC and used the data to perform quantitative and qualitative analyses of the joint positions and elbow joint angle.
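
    The similarity measure reported above is a Pearson correlation between repeated recordings of the same joint trajectory. A minimal sketch, using synthetic stand-ins for Kinect joint coordinates resampled to a common cycle length:

    ```python
    import numpy as np

    # Pearson r between two recordings of the same joint coordinate over
    # one practice cycle; averaging r over all tracked joints would give
    # an overall similarity score (rmean) of the kind quoted above.
    def pearson_similarity(traj_a, traj_b):
        return np.corrcoef(traj_a, traj_b)[0, 1]

    t = np.linspace(0.0, 1.0, 200)                  # normalized cycle time
    cycle_1 = np.sin(2 * np.pi * t)                 # stand-in trajectory
    noise = 0.1 * np.random.default_rng(1).normal(size=t.size)
    cycle_2 = np.sin(2 * np.pi * t) + noise         # a second, noisier pass
    print(round(pearson_similarity(cycle_1, cycle_2), 3))   # close to 1
    ```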

  17. Quantitative analysis of protein-ligand interactions by NMR.

    Science.gov (United States)

    Furukawa, Ayako; Konuma, Tsuyoshi; Yanaka, Saeko; Sugase, Kenji

    2016-08-01

    Protein-ligand interactions have been commonly studied through static structures of the protein-ligand complex. Recently, however, there has been increasing interest in investigating the dynamics of protein-ligand interactions both for fundamental understanding of the underlying mechanisms and for drug development. NMR is a versatile and powerful tool, especially because it provides site-specific quantitative information. NMR has widely been used to determine the dissociation constant (KD), in particular, for relatively weak interactions. The simplest NMR method is a chemical-shift titration experiment, in which the chemical-shift changes of a protein in response to ligand titration are measured. There are other quantitative NMR methods, but they mostly apply only to interactions in the fast-exchange regime. These methods derive the dissociation constant from population-averaged NMR quantities of the free and bound states of a protein or ligand. In contrast, the recent advent of new relaxation-based experiments, including R2 relaxation dispersion and ZZ-exchange, has enabled us to obtain kinetic information on protein-ligand interactions in the intermediate- and slow-exchange regimes. Based on R2 dispersion or ZZ-exchange, methods that can determine the association rate, kon, dissociation rate, koff, and KD have been developed. In these approaches, R2 dispersion or ZZ-exchange curves are measured for multiple samples with different protein and/or ligand concentration ratios, and the relaxation data are fitted to theoretical kinetic models. It is critical to choose an appropriate kinetic model, such as the two- or three-state exchange model, to derive the correct kinetic information. The R2 dispersion and ZZ-exchange methods are suitable for the analysis of protein-ligand interactions with a micromolar or sub-micromolar dissociation constant but not for very weak interactions, which are typical in very fast exchange. This contrasts with the NMR methods that are used
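
    For the simplest case mentioned above, a chemical-shift titration in fast exchange, KD can be extracted by fitting the observed shift changes to a 1:1 binding isotherm. A minimal sketch with synthetic data (the protein concentration, ligand points, and parameter values are assumptions for illustration):

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    P_TOT = 0.10  # total protein concentration in mM (assumed, held fixed)

    # Observed shift change for 1:1 binding in fast exchange:
    # ddelta_obs = ddelta_max * [PL] / P_tot, with [PL] taken from the
    # quadratic solution of the binding equilibrium.
    def shift_change(L_tot, dmax, KD):
        s = P_TOT + L_tot + KD
        return dmax * (s - np.sqrt(s**2 - 4.0 * P_TOT * L_tot)) / (2.0 * P_TOT)

    L = np.array([0.02, 0.05, 0.1, 0.2, 0.5, 1.0, 2.0])   # ligand (mM)
    obs = shift_change(L, 0.25, 0.15)                      # synthetic data
    obs += 0.002 * np.random.default_rng(2).normal(size=L.size)

    (dmax_fit, kd_fit), _ = curve_fit(shift_change, L, obs,
                                      p0=(0.2, 0.1), bounds=(0.0, np.inf))
    print(f"KD ~ {kd_fit:.3f} mM")
    ```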

  18. Automated Spatial Brain Normalization and Hindbrain White Matter Reference Tissue Give Improved [(18)F]-Florbetaben PET Quantitation in Alzheimer's Model Mice.

    Science.gov (United States)

    Overhoff, Felix; Brendel, Matthias; Jaworska, Anna; Korzhova, Viktoria; Delker, Andreas; Probst, Federico; Focke, Carola; Gildehaus, Franz-Josef; Carlsen, Janette; Baumann, Karlheinz; Haass, Christian; Bartenstein, Peter; Herms, Jochen; Rominger, Axel

    2016-01-01

    Preclinical PET studies of β-amyloid (Aβ) accumulation are of growing importance, but comparisons between research sites require standardized and optimized methods for quantitation. Therefore, we aimed to evaluate systematically the (1) impact of an automated algorithm for spatial brain normalization, and (2) intensity scaling methods of different reference regions for Aβ-PET in a large dataset of transgenic mice. PS2APP mice in a 6 week longitudinal setting (N = 37) and another set of PS2APP mice at a histologically assessed narrow range of Aβ burden (N = 40) were investigated by [(18)F]-florbetaben PET. Manual spatial normalization by three readers at different training levels was performed prior to application of an automated brain spatial normalization, and inter-reader agreement was assessed by Fleiss Kappa (κ). For this method the impact of templates at different pathology stages was investigated. Four different reference regions for brain uptake normalization were used to calculate frontal cortical standardized uptake value ratios (SUVR_CTX/REF), relative to raw SUV_CTX. Results were compared on the basis of longitudinal stability (Cohen's d), and in reference to gold standard histopathological quantitation (Pearson's R). Application of an automated brain spatial normalization resulted in nearly perfect agreement (all κ ≥ 0.99) between different readers, with constant or improved correlation with histology. Templates based on an inappropriate pathology stage resulted in up to 2.9% systematic bias for SUVR_CTX/REF. All SUVR_CTX/REF methods performed better than SUV_CTX both with regard to longitudinal stability (d ≥ 1.21 vs. d = 0.23) and histological gold standard agreement (R ≥ 0.66 vs. R ≥ 0.31). Voxel-wise analysis suggested a physiologically implausible longitudinal decrease by global mean scaling. The hindbrain white matter reference (R_mean = 0.75) was slightly superior to the brainstem (R_mean = 0.74) and the cerebellum (R_mean = 0.73). Automated
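
    The intensity-scaling step reduces to dividing mean uptake in the target volume of interest by mean uptake in a reference region. A minimal sketch of computing an SUVR from a voxel array and two boolean masks (all values and masks below are synthetic placeholders):

    ```python
    import numpy as np

    # SUVR_CTX/REF: mean uptake in the target VOI divided by mean uptake
    # in the reference VOI (e.g., hindbrain white matter).
    def suvr(image, target_mask, reference_mask):
        return image[target_mask].mean() / image[reference_mask].mean()

    rng = np.random.default_rng(3)
    pet = rng.random((64, 64, 64))                 # stand-in PET volume
    ctx = np.zeros(pet.shape, dtype=bool)
    ctx[40:50, 20:40, 20:40] = True                # "frontal cortex" VOI
    ref = np.zeros(pet.shape, dtype=bool)
    ref[10:18, 25:35, 25:35] = True                # "reference tissue" VOI
    print(round(suvr(pet, ctx, ref), 3))
    ```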

  19. Spatial Factor Analysis for Aerosol Optical Depth in Metropolises in China with Regard to Spatial Heterogeneity

    Directory of Open Access Journals (Sweden)

    Hui Shi

    2018-04-01

    Full Text Available A substantial number of studies have analyzed how driving factors impact aerosols, but they have been little concerned with the spatial heterogeneity of aerosols and of the factors that impact them. The spatial distributions of the aerosol optical depth (AOD) retrieved from Moderate Resolution Imaging Spectrometer (MODIS) data at 550-nm wavelength and 3-km resolution for three highly developed metropolitan regions in China, the Beijing-Tianjin-Hebei (BTH) region, the Yangtze River Delta (YRD), and the Pearl River Delta (PRD), during 2015 were analyzed. Different degrees of spatial heterogeneity of the AOD were found, indexed by Moran's I values of 0.940, 0.715, and 0.680 in the BTH, YRD, and PRD, respectively. Given this spatial heterogeneity, geographically weighted regression (GWR) was employed to carry out a spatial factor analysis, where terrain, climate condition, urban development, and vegetation coverage were taken as the potential driving factors. The results of the GWR imply varying relationships between the AOD and the factors. The results were generally consistent with existing studies, but they suggest the following: (1) An elevation increase would more likely lead to a strong negative impact on aerosols (with most of the coefficients ranging from −1.5~0 in the BTH, −1.5~0 in the YRD, and −1~0 in the PRD) in the places with greater elevations, where the R-squared values are always larger than 0.5. However, the variation of elevation cannot explain the variation of aerosols in places with relatively low elevations (with R-squared values approximately 0.1, ranging from 0 to 0.3, and approximately 0.1 in the BTH, YRD, and PRD), such as the urban areas in the BTH and YRD. (2) The density of built-up areas made a strong and positive impact on aerosols in the urban areas of the BTH (R-squared larger than 0.5), while the R-squared dropped to 0.1 in places far away from the urban areas. (3) The vegetation coverage led to a stronger
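
    The global Moran's I used above to index spatial heterogeneity can be computed directly from a spatial weights matrix. A minimal sketch with a tiny synthetic example (the weights and AOD values are invented for illustration):

    ```python
    import numpy as np

    # Global Moran's I for values x and spatial weights matrix w:
    # I = (n / W) * sum_ij w_ij z_i z_j / sum_i z_i^2, with z = x - mean(x).
    def morans_i(x, w):
        z = np.asarray(x, dtype=float) - np.mean(x)
        return (z.size / w.sum()) * (z @ w @ z) / (z @ z)

    # Four cells on a line with rook-contiguity weights; spatially
    # clustered values give a clearly positive I.
    w = np.array([[0, 1, 0, 0],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [0, 0, 1, 0]], dtype=float)
    aod = np.array([0.9, 0.8, 0.3, 0.2])
    print(round(morans_i(aod, w), 3))   # ~0.41
    ```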

  20. Quantitative mass-spectrometric analysis of hydrogen helium isotope mixtures

    International Nuclear Information System (INIS)

    Langer, U.

    1998-12-01

    This work deals with a mass-spectrometric method for the quantitative analysis of hydrogen-helium isotope mixtures, with special attention to fusion plasma diagnostics. The aim was to use low-resolution mass spectrometry, a standard measuring method which is well established in science and industry. This task is solved by means of vector mass spectrometry, where a mass spectrum is repeatedly measured, but with stepwise variation of the parameter settings of a quadrupole mass spectrometer. In this way, interfering mass spectra can be decomposed and, moreover, it is possible to analyze under-determined mass spectra of complex hydrogen-helium isotope mixtures. In this work experimental investigations are presented which show that there are different parameters which are suitable for the UMS method. With an optimal choice of the parameter settings, hydrogen-helium isotope mixtures can be analyzed with an accuracy of 1-3%. In practice, a low sensitivity for small helium concentrations has to be noted. To cope with this, a method for selective hydrogen pressure reduction has been developed. Experimental investigations and calculations show that small helium amounts (about 1%) in a hydrogen atmosphere can be analyzed with an accuracy of 3-10%. Finally, this work deals with the effects of the measuring and calibration errors on the resulting error in spectrum decomposition. This aspect has been investigated both in general mass-spectrometric gas analysis and in the analysis of hydrogen-helium mixtures by means of vector mass spectrometry. (author)

  1. Quantitative charge-tags for sterol and oxysterol analysis.

    Science.gov (United States)

    Crick, Peter J; William Bentley, T; Abdel-Khalik, Jonas; Matthews, Ian; Clayton, Peter T; Morris, Andrew A; Bigger, Brian W; Zerbinati, Chiara; Tritapepe, Luigi; Iuliano, Luigi; Wang, Yuqin; Griffiths, William J

    2015-02-01

    Global sterol analysis is challenging owing to the extreme diversity of sterol natural products, the tendency of cholesterol to dominate in abundance over all other sterols, and the lack of a strong chromophore or readily ionized functional group in the sterol structure. We developed a method to overcome these challenges by using different isotope-labeled versions of the Girard P reagent (GP) as quantitative charge-tags for the LC-MS analysis of sterols including oxysterols. Sterols/oxysterols in plasma were extracted in ethanol containing deuterated internal standards, separated by C18 solid-phase extraction, and derivatized with GP, with or without prior oxidation of 3β-hydroxy to 3-oxo groups. By use of different isotope-labeled GPs, it was possible to analyze in a single LC-MS analysis both sterols/oxysterols that naturally possess a 3-oxo group and those with a 3β-hydroxy group. Intra- and interassay CVs were low, and the method allows the analysis of sterols/oxysterols in a single analytical run and can be used to identify inborn errors of cholesterol synthesis and metabolism. © 2014 American Association for Clinical Chemistry.

  2. [Quantitative analysis of drug expenditures variability in dermatology units].

    Science.gov (United States)

    Moreno-Ramírez, David; Ferrándiz, Lara; Ramírez-Soto, Gabriel; Muñoyerro, M Dolores

    2013-01-01

    Variability in adjusted drug expenditures among clinical departments raises the possibility of difficult access to certain therapies at the same time that avoidable expenditures may also exist. Nevertheless, drug expenditures are not usually applied to clinical practice variability analysis. The objective was to identify and quantify variability in drug expenditures in comparable dermatology departments of the Servicio Andaluz de Salud. Comparative economic analysis of the drug expenditures, adjusted to population and health care production, in 18 dermatology departments of the Servicio Andaluz de Salud. The 2012 cost and production data (homogeneous production units, HPU) were provided by Inforcoan, the cost accounting information system of the Servicio Andaluz de Salud. The observed drug expenditure ratio ranged from 0.97 €/inh to 8.90 €/inh and from 208.45 €/HPU to 1,471.95 €/HPU. The Pearson correlation between drug expenditure and population was 0.25, and 0.35 for the correlation between expenditure and homogeneous production (p = 0.32 and p = 0.15, respectively), both Pearson coefficients confirming the lack of correlation and a relevant degree of variability in drug expenditures. The quantitative analysis of variability performed through Pearson correlation confirmed the existence of drug expenditure variability among comparable dermatology departments. Copyright © 2013 SEFH. Published by AULA MEDICA. All rights reserved.

  3. Application of neural networks to quantitative spectrometry analysis

    International Nuclear Information System (INIS)

    Pilato, V.; Tola, F.; Martinez, J.M.; Huver, M.

    1999-01-01

    Accurate quantitative analysis of complex spectra (fission and activation products) relies upon experts' knowledge. In some cases several hours, even days, of tedious calculations are needed, because current software is unable to solve deconvolution problems when several rays overlap. We have shown that such analysis can be correctly handled by a neural network, and the procedure can be automated with minimum laboratory measurements for network training, as long as all the elements of the analysed solution appear in the training set and provided that adequate scaling of the input data is performed. Once the network has been trained, analysis is carried out in a few seconds. On submission to a test between several well-known laboratories, in which unknown quantities of ⁵⁷Co, ⁵⁸Co, ⁸⁵Sr, ⁸⁸Y, ¹³¹I, ¹³⁹Ce and ¹⁴¹Ce present in a sample had to be determined, the results yielded by our network ranked it amongst the best. The method is described, including the experimental device and measurements, training set design, definition of the relevant input parameters, input data scaling and network training. The main results are presented together with a statistical model allowing prediction of network errors.

  4. Watershed Planning within a Quantitative Scenario Analysis Framework.

    Science.gov (United States)

    Merriam, Eric R; Petty, J Todd; Strager, Michael P

    2016-07-24

    There is a critical need for tools and methodologies capable of managing aquatic systems within heavily impacted watersheds. Current efforts often fall short as a result of an inability to quantify and predict complex cumulative effects of current and future land use scenarios at relevant spatial scales. The goal of this manuscript is to provide methods for conducting a targeted watershed assessment that enables resource managers to produce landscape-based cumulative effects models for use within a scenario analysis management framework. Sites are first selected for inclusion within the watershed assessment by identifying sites that fall along independent gradients and combinations of known stressors. Field and laboratory techniques are then used to obtain data on the physical, chemical, and biological effects of multiple land use activities. Multiple linear regression analysis is then used to produce landscape-based cumulative effects models for predicting aquatic conditions. Lastly, methods for incorporating cumulative effects models within a scenario analysis framework for guiding management and regulatory decisions (e.g., permitting and mitigation) within actively developing watersheds are discussed and demonstrated for 2 sub-watersheds within the mountaintop mining region of central Appalachia. The watershed assessment and management approach provided herein enables resource managers to facilitate economic and development activity while protecting aquatic resources and producing opportunity for net ecological benefits through targeted remediation.
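
    The cumulative effects modeling step amounts to a multiple linear regression of an aquatic condition metric on land-use stressor gradients, which can then be evaluated under hypothetical future scenarios. A minimal sketch with synthetic predictors (none of the variable names or coefficients come from the paper):

    ```python
    import numpy as np

    # Fit condition ~ b0 + b1*mining + b2*development by least squares,
    # then predict the condition score for a hypothetical future scenario.
    rng = np.random.default_rng(4)
    n = 40
    mining = rng.random(n)            # stressor gradient 1 (fraction mined)
    development = rng.random(n)       # stressor gradient 2 (fraction developed)
    condition = (1.0 - 0.8 * mining - 0.5 * development
                 + 0.05 * rng.normal(size=n))

    X = np.column_stack([np.ones(n), mining, development])
    coef, *_ = np.linalg.lstsq(X, condition, rcond=None)
    print(coef)                                     # ~[1.0, -0.8, -0.5]

    future_scenario = np.array([1.0, 0.6, 0.3])     # intercept, mining, dev.
    print(future_scenario @ coef)                   # predicted condition
    ```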

  5. An approach for quantitative image quality analysis for CT

    Science.gov (United States)

    Rahimi, Amir; Cochran, Joe; Mooney, Doug; Regensburger, Joe

    2016-03-01

    An objective and standardized approach to assess the image quality of Computed Tomography (CT) systems is required in a wide variety of imaging processes to identify CT systems appropriate for a given application. We present an overview of the framework we have developed to help standardize and objectively assess CT image quality for different models of CT scanners used for security applications. Within this framework, we have developed methods to quantitatively measure metrics that should correlate with feature identification, detection accuracy and precision, and image registration capabilities of CT machines, and to identify strengths and weaknesses of different CT imaging technologies in transportation security. To that end we have designed, developed and constructed phantoms that allow for systematic and repeatable measurements of roughly 88 image quality metrics, representing modulation transfer function, noise equivalent quanta, noise power spectra, slice sensitivity profiles, streak artifacts, CT number uniformity, CT number consistency, object length accuracy, CT number path length consistency, and object registration. Furthermore, we have developed a sophisticated MATLAB-based image analysis tool kit to analyze CT-generated images of phantoms and report these metrics in a format that is standardized across the considered models of CT scanners, allowing for comparative image quality analysis within a CT model or between different CT models. In addition, we have developed a modified sparse principal component analysis (SPCA) method to generate a modified set of PCA components, as compared to standard principal component analysis (PCA), with sparse loadings, in conjunction with the Hotelling T² statistical analysis method to compare, qualify, and detect faults in the tested systems.

  6. Concept of spatial channel theory applied to reactor shielding analysis

    International Nuclear Information System (INIS)

    Williams, M.L.; Engle, W.W. Jr.

    1977-01-01

    The concept of channel theory is used to locate spatial regions that are important in contributing to a shielding response. The method is analogous to the channel-theory method developed for ascertaining important energy channels in cross-section analysis. The mathematical basis for the theory is shown to be the generalized reciprocity relation, and sample problems are given to exhibit and verify properties predicted by the mathematical equations. A practical example is cited from the shielding analysis of the Fast Flux Test Facility performed at Oak Ridge National Laboratory, in which a perspective plot of channel-theory results was found useful in locating streaming paths around the reactor cavity shield

  7. Spectral theory and nonlinear analysis with applications to spatial ecology

    CERN Document Server

    Cano-Casanova, S; Mora-Corral, C

    2005-01-01

    This volume details some of the latest advances in spectral theory and nonlinear analysis through various cutting-edge theories on algebraic multiplicities, global bifurcation theory, non-linear Schrödinger equations, non-linear boundary value problems, large solutions, metasolutions, dynamical systems, and applications to spatial ecology. The main scope of the book is bringing together a series of topics that have evolved separately during the last decades around the common denominator of spectral theory and nonlinear analysis - from the most abstract developments up to the most concrete applications to population dynamics and socio-biology - in an effort to fill the existing gaps between these fields.

  8. An operational modal analysis method in frequency and spatial domain

    Science.gov (United States)

    Wang, Tong; Zhang, Lingmi; Tamura, Yukio

    2005-12-01

    A frequency and spatial domain decomposition method (FSDD) for operational modal analysis (OMA) is presented in this paper, which is an extension of the complex mode indicator function (CMIF) method for experimental modal analysis (EMA). The theoretical background of the FSDD method is clarified. Singular value decomposition is adopted to separate the signal space from the noise space. Finally, an enhanced power spectrum density (PSD) is proposed to obtain more accurate modal parameters by curve fitting in the frequency domain. Moreover, a simulation case and an application case are used to validate this method.
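
    The heart of a CMIF/FSDD-style analysis is a singular value decomposition of the output power spectral density matrix at each frequency line: peaks in the leading singular value indicate modes, and the corresponding singular vector approximates the mode shape. A minimal sketch on a synthetic PSD array (the data are placeholders, not a modal test):

    ```python
    import numpy as np

    # At each frequency line, take the SVD of the output PSD matrix; the
    # first singular value serves as a complex mode indicator function and
    # the first left singular vector approximates the operating mode shape.
    def cmif(Gxx):
        s1 = np.empty(len(Gxx))
        shapes = np.empty((len(Gxx), Gxx.shape[1]))
        for k, G in enumerate(Gxx):
            U, S, _ = np.linalg.svd(G)
            s1[k], shapes[k] = S[0], U[:, 0]
        return s1, shapes

    rng = np.random.default_rng(5)
    raw = rng.random((128, 4, 4))                  # 128 lines, 4 outputs
    Gxx = raw @ raw.transpose(0, 2, 1)             # symmetric, PSD-like
    s1, shapes = cmif(Gxx)
    print(int(np.argmax(s1)))                      # line of the dominant peak
    ```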

  9. Compact or spread? A quantitative spatial model of urban areas in Europe since 1990

    Science.gov (United States)

    Haase, Dagmar; Haase, Annegret

    2018-01-01

    Changes in urban residential density represent an important issue in terms of land consumption, the conservation of ecosystems, air quality and related human health problems, as well as the consequential challenges for urban and regional planning. It is the decline of residential densities, in particular, that has often been used as the very definition of sprawl, describing a phenomenon that has been extensively studied in the United States and in Western Europe. Whilst these studies provide valuable insights into urbanization processes, only a handful of them have reflected the uneven dynamics of simultaneous urban growth and shrinkage, using residential density changes as a key indicator to uncover the underlying dynamics. This paper introduces a contrasting analysis of recent developments in both de- and re-concentration, defined as decreasing or increasing residential densities, respectively. Using a large sample of European cities, it detects differences in density changes between successional population growth/decline. The paper shows that dedensification, found in some large cities globally, is not a universal phenomenon in growing urban areas; neither the increasing disproportion between a declining demand for and an increasing supply of residential areas nor actual concentration processes in cities were found. Thus, the paper provides a new, very detailed perspective on (de)densification in both shrinking and growing cities and how they specifically contribute to current land take in Europe. PMID:29489851

  10. Compact or spread? A quantitative spatial model of urban areas in Europe since 1990.

    Science.gov (United States)

    Wolff, Manuel; Haase, Dagmar; Haase, Annegret

    2018-01-01

    Changes in urban residential density represent an important issue in terms of land consumption, the conservation of ecosystems, air quality and related human health problems, as well as the consequential challenges for urban and regional planning. It is the decline of residential densities, in particular, that has often been used as the very definition of sprawl, describing a phenomenon that has been extensively studied in the United States and in Western Europe. Whilst these studies provide valuable insights into urbanization processes, only a handful of them have reflected the uneven dynamics of simultaneous urban growth and shrinkage, using residential density changes as a key indicator to uncover the underlying dynamics. This paper introduces a contrasting analysis of recent developments in both de- and re-concentration, defined as decreasing or increasing residential densities, respectively. Using a large sample of European cities, it detects differences in density changes between successional population growth/decline. The paper shows that dedensification, found in some large cities globally, is not a universal phenomenon in growing urban areas; neither the increasing disproportion between a declining demand for and an increasing supply of residential areas nor actual concentration processes in cities were found. Thus, the paper provides a new, very detailed perspective on (de)densification in both shrinking and growing cities and how they specifically contribute to current land take in Europe.

  11. Compact or spread? A quantitative spatial model of urban areas in Europe since 1990.

    Directory of Open Access Journals (Sweden)

    Manuel Wolff

    Full Text Available Changes in urban residential density represent an important issue in terms of land consumption, the conservation of ecosystems, air quality and related human health problems, as well as the consequential challenges for urban and regional planning. It is the decline of residential densities, in particular, that has often been used as the very definition of sprawl, describing a phenomenon that has been extensively studied in the United States and in Western Europe. Whilst these studies provide valuable insights into urbanization processes, only a handful of them have reflected the uneven dynamics of simultaneous urban growth and shrinkage, using residential density changes as a key indicator to uncover the underlying dynamics. This paper introduces a contrasting analysis of recent developments in both de- and re-concentration, defined as decreasing or increasing residential densities, respectively. Using a large sample of European cities, it detects differences in density changes between successional population growth/decline. The paper shows that dedensification, found in some large cities globally, is not a universal phenomenon in growing urban areas; neither the increasing disproportion between a declining demand for and an increasing supply of residential areas nor actual concentration processes in cities were found. Thus, the paper provides a new, very detailed perspective on (de)densification in both shrinking and growing cities and how they specifically contribute to current land take in Europe.

  12. Review of Department of Defense Education Activity (DODEA) Schools. Volume II: Quantitative Analysis of Educational Quality

    National Research Council Canada - National Science Library

    Anderson, Lowell

    2000-01-01

    This volume compiles, and presents in integrated form, IDA's quantitative analysis of educational quality provided by DoD's dependent schools. It covers the quantitative aspects of volume I in greater...

  13. PIQMIe: A web server for semi-quantitative proteomics data management and analysis

    NARCIS (Netherlands)

    A. Kuzniar (Arnold); R. Kanaar (Roland)

    2014-01-01

    We present the Proteomics Identifications and Quantitations Data Management and Integration Service, or PIQMIe, which aids in reliable and scalable data management, analysis and visualization of semi-quantitative mass spectrometry based proteomics experiments. PIQMIe readily integrates

  14. Quantitative Analysis of Piezoelectric and Seismoelectric Anomalies in Subsurface Geophysics

    Science.gov (United States)

    Eppelbaum, Lev

    2017-04-01

    …apatite-nepheline, essentially sphalerite, and ore-quartz deposits of gold, tin, tungsten, molybdenum, zinc, crystal, and other raw materials. This method also enables differentiation of rocks such as bauxites, kimberlites, etc., from the host rocks by their electrokinetic properties. A classification of some rocks, ores, and minerals by their piezoactivity is given in Table 1. These objects (targets) transform elastic wave oscillations into electromagnetic ones. It should be taken into account that anomalous bodies may be detected not only by positive but also by negative anomalies, if a low-piezoactive body occurs in a more piezoactive medium. The piezoelectric method is an example of the successful application of piezoelectric and seismo-electrokinetic phenomena in exploration and environmental geophysics, designed for delineation of targets differing from the host media by their piezoelectric properties (Neishtadt et al., 2006; Neishtadt and Eppelbaum, 2012). This method is employed in surface, downhole, and underground modes. Recent testing of piezoelectric effects of archaeological samples composed of fired clay has shown values of 2.0-3.0 × 10⁻¹⁴ C/N. However, the absence of reliable procedures for solving the direct and inverse problems of piezoelectric anomalies (PEA) drastically hampers further progress of the method. Therefore, it was suggested to adapt the tomography procedure widely used in seismic prospecting to PEA modeling. Diffraction of seismic waves has been computed for models of a circular cylinder, a thin inclined bed and a thick bed (Alperovich et al., 1997). As a result, the spatio-temporal distribution of the electromagnetic field caused by the seismic wave has been found. The computations have shown that the effectiveness and reliability of PEA analysis may be critically enhanced by considering complete electro- and magnetograms, as distinct from the conventional approaches. The distribution of the electromagnetic field obtained by solving the direct

  15. Spatial Data Analysis: Recommendations for Educational Infrastructure in Sindh

    Directory of Open Access Journals (Sweden)

    Abdul Aziz Ansari

    2017-06-01

    Full Text Available Analysing education infrastructure has become a crucial activity in imparting quality teaching and resources to students. Assessing the facilities required to improve the current state of education and to plan future schools is an important analytical component. This is best achieved through a Geographical Information System (GIS) analysis of the spatial distribution of schools. In this work, we execute GIS analytics on the rural and urban school distributions in Sindh, Pakistan. Using a reliable dataset collected by an international survey team, GIS analysis is done with respect to: 1) school locations, 2) school facilities (water, sanitation, classrooms, etc.), and 3) students' results. We carry out the analysis at district level by presenting several spatial results. Correlational analysis of highly influential factors that may impact educational performance generates recommendations for planning and development in weak areas, providing useful insights regarding effective utilization of resources and new locations at which to build future schools. Time-series analysis predicts future results, which may be verified through continued observation and data collection.

  16. Quantitative DNA methylation analysis of candidate genes in cervical cancer.

    Directory of Open Access Journals (Sweden)

    Erin M Siegel

    Full Text Available Aberrant DNA methylation has been observed in cervical cancer; however, most studies have used non-quantitative approaches to measure DNA methylation. The objective of this study was to quantify methylation within a select panel of genes previously identified as targets for epigenetic silencing in cervical cancer and to identify genes with elevated methylation that can distinguish cancer from normal cervical tissues. We identified 49 women with invasive squamous cell cancer of the cervix and 22 women with normal cytology specimens. Bisulfite-modified genomic DNA was amplified and quantitative pyrosequencing completed for 10 genes (APC, CCNA, CDH1, CDH13, WIF1, TIMP3, DAPK1, RARB, FHIT, and SLIT2). A Methylation Index was calculated as the mean percent methylation across all CpG sites analyzed per gene (~4-9 CpG sites per sequence). A binary cut-point was defined at >15% methylation. The sensitivity, specificity and area under the ROC curve (AUC) of methylation in individual genes or a panel were examined. The median methylation index was significantly higher in cases compared to controls in 8 genes, whereas there was no difference in median methylation for 2 genes. Compared to HPV and age, the combination of the DNA methylation level of DAPK1, SLIT2, WIF1 and RARB with HPV and age significantly improved the AUC from 0.79 to 0.99 (95% CI: 0.97-1.00, p-value = 0.003). Pyrosequencing analysis confirmed that several genes are common targets for aberrant methylation in cervical cancer, and the DNA methylation level of four genes appears to increase specificity to identify cancer compared to HPV detection alone. Alterations in DNA methylation of specific genes in cervical cancers, such as DAPK1, RARB, WIF1, and SLIT2, may also occur early in cervical carcinogenesis and should be evaluated.
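
    The two summary quantities described above, the per-gene Methylation Index and the binary cut-point, are straightforward to compute. A minimal sketch with invented pyrosequencing percentages and case/control labels:

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score

    # Rows are samples, columns are the CpG sites of one gene (invented
    # percent-methylation values); labels: 1 = cancer, 0 = normal.
    cpg_pct = np.array([[3.1, 4.0, 2.5, 3.8],
                        [22.0, 18.5, 25.1, 19.9],
                        [6.0, 5.5, 7.2, 6.8],
                        [17.3, 21.0, 16.8, 19.5]])
    labels = np.array([0, 1, 0, 1])

    methylation_index = cpg_pct.mean(axis=1)    # mean % across CpG sites
    hypermethylated = methylation_index > 15.0  # binary cut-point from the text
    print(methylation_index, hypermethylated)
    print(roc_auc_score(labels, methylation_index))  # discrimination of the gene
    ```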

  17. A method for the quantitative metallographic analysis of nuclear fuels (Programme QMA)

    International Nuclear Information System (INIS)

    Moreno, A.; Sari, C.

    1978-01-01

    A method is described for the quantitative analysis of features such as voids, cracks, phases, inclusions and grains distributed on random plane sections of fuel materials. An electronic image analyzer, Quantimet, attached to a MM6 Leitz microscope was used to measure size, area, perimeter and shape of features dispersed in a matrix. The apparatus is driven by a computer which calculates the size, area and perimeter distribution, form factors and orientation of the features as well as the inclusion content of the matrix expressed in weight per cent. A computer programme, QMA, executes the spatial correction of the measured two-dimensional sections and delivers the true distribution of feature sizes in a three-dimensional system

  18. Quantitative analysis of the chromatin of lymphocytes: an assay on comparative structuralism.

    Science.gov (United States)

    Meyer, F

    1980-01-01

    With 26 letters we can form all the words we use, and with a few words it is possible to form an infinite number of different meaningful sentences. In our case, the letters will be a few simple neighborhood image transformations and area measurements. The paper shows how, by iterating these transformations, it is possible to obtain a good quantitative description of the nuclear structure of Feulgen-stained lymphocytes (CLL and normal). The fact that we restricted ourselves to a small number of image transformations made it possible to construct an image analysis system (TAS) able to do these transformations very quickly. We will see, successively, how to segment the nucleus itself, the chromatin, and the interchromatinic channels, how openings and closings lead to size and spatial distribution curves, and how skeletons may be used for measuring the lengths of interchromatinic channels.
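
    The "openings lead to size distribution curves" step is a morphological granulometry: opening the binary chromatin mask with increasingly large structuring elements and recording the surviving area. A minimal sketch on a synthetic mask (not actual chromatin data):

    ```python
    import numpy as np
    from scipy import ndimage

    # Granulometry: open the binary mask with increasing structuring
    # elements; the area lost at each step measures how much structure
    # exists at that size class.
    rng = np.random.default_rng(6)
    seeds = rng.random((128, 128)) > 0.995
    mask = ndimage.binary_dilation(seeds, iterations=3)   # synthetic blobs

    areas = [int(ndimage.binary_opening(mask, structure=np.ones((r, r))).sum())
             for r in range(1, 6)]
    print(areas)   # non-increasing curve: the size distribution by opening size
    ```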

  19. Spatial Durbin model analysis macroeconomic loss due to natural disasters

    Science.gov (United States)

    Kusrini, D. E.; Mukhtasor

    2015-03-01

    The magnitude of the damage and losses caused by natural disasters is huge for Indonesia; therefore, this study aimed to analyze the macroeconomic losses due to natural disasters that occurred in 115 cities/districts across Java during 2012. Based on the results of previous studies, spatial dependence effects are suspected in this case, so the analysis is performed using an areal regression approach, namely the Spatial Durbin Model (SDM). The significant predictor variable obtained is population, and the predictor variable with a significant spatial weighting is the number of disaster occurrences, i.e., disasters in a region which have an impact on other neighboring regions. The Moran's I index value using Queen contiguity weighting also showed significant results, meaning that the incidence of disasters in one region will decrease the value of GDP in neighboring regions.

  20. Optimal display conditions for quantitative analysis of stereoscopic cerebral angiograms

    International Nuclear Information System (INIS)

    Charland, P.; Peters, T.; McGill Univ., Montreal, Quebec

    1996-01-01

    For several years the authors have been using a stereoscopic display as a tool in the planning of stereotactic neurosurgical techniques. This PC-based workstation allows the surgeon to interact with and view vascular images in three dimensions, as well as to perform quantitative analysis of the three-dimensional (3-D) space. Some of the perceptual issues relevant to the presentation of medical images on this stereoscopic display were addressed in five experiments. The authors show that a number of parameters--namely the shape, color, and depth cue, associated with a cursor--as well as the image filtering and observer position, have a role in improving the observer's perception of a 3-D image and his ability to localize points within the stereoscopically presented 3-D image. However, an analysis of the results indicates that while varying these parameters can lead to an effect on the performance of individual observers, the effects are not consistent across observers, and the mean accuracy remains relatively constant under the different experimental conditions

  1. Quantitative Machine Learning Analysis of Brain MRI Morphology throughout Aging.

    Science.gov (United States)

    Shamir, Lior; Long, Joe

    2016-01-01

    While cognition is clearly affected by aging, it is unclear whether the process of brain aging is driven solely by accumulation of environmental damage, or involves biological pathways. We applied quantitative image analysis to profile the alteration of brain tissues during aging. A dataset of 463 brain MRI images taken from a cohort of 416 subjects was analyzed using a large set of low-level numerical image content descriptors computed from the entire brain MRI images. The correlation between the numerical image content descriptors and age was computed, and the alterations of the brain tissues during aging were quantified and profiled using machine learning. The comprehensive set of global image content descriptors provides a high Pearson correlation of ~0.9822 with chronological age, indicating that the machine learning analysis of global features is sensitive to the age of the subjects. Profiling of the predicted age shows several periods of mild changes, separated by shorter periods of more rapid alteration. The periods with the most rapid changes were around the age of 55 and around the age of 65. The results show that the process of brain aging is not linear, and exhibits short periods of rapid aging separated by periods of milder change. These results are in agreement with patterns observed in cognitive decline, mental health status, and general human aging, suggesting that brain aging might not be driven solely by accumulation of environmental damage. Code and data used in the experiments are publicly available.

  2. Automatic quantitative analysis of liver functions by a computer system

    International Nuclear Information System (INIS)

    Shinpo, Takako

    1984-01-01

    In the previous paper, we confirmed the clinical usefulness of hepatic clearance (hepatic blood flow), i.e. the hepatic uptake and blood disappearance rate coefficients. These were obtained from the initial slope index of each minute during a period of five frames of a hepatogram after injecting 37 MBq of ⁹⁹ᵐTc-Sn-colloid. To analyze the information simply, rapidly and accurately, we developed an automatic quantitative analysis of liver functions. Information was obtained every quarter minute during a period of 60 frames of the sequential image. The sequential counts were measured for the heart, the whole liver, and the left and right lobes using a computer connected to a scintillation camera. We measured the effective hepatic blood flow as the disappearance rate multiplied by the percentage of hepatic uptake, computed as (liver counts)/(total counts of the field). Our method of analysis automatically records the disappearance curve and the uptake curve on the basis of the heart and the whole liver, respectively, and computes the results in BASIC. This method makes it possible to obtain an image of the initial uptake of ⁹⁹ᵐTc-Sn-colloid into the liver with a small dose. (author)

  3. Quantitative analysis of normal thallium-201 tomographic studies

    International Nuclear Information System (INIS)

    Eisner, R.L.; Gober, A.; Cerqueira, M.

    1985-01-01

    To determine the normal (nl) distribution of Tl-201 uptake post exercise (EX) and at redistribution (RD), and nl washout, Tl-201 rotational tomographic (tomo) studies were performed in 40 subjects: 16 angiographic (angio) nls and 24 nl volunteers (12 from Emory and 12 from Yale). Oblique-angle short axis slices were subjected to maximal-count circumferential profile analysis. Data were displayed as a ''bullseye'' functional map with the apex at the center and the base at the periphery. The bullseye was not uniform in all regions because of the variable effects of attenuation and resolution at different view angles. In all studies, the septum:lateral wall ratio was 1.0 in males and approximately 1.0 in females. This occurred predominantly because of anterior defects due to breast soft-tissue attenuation. EX and RD bullseyes were similar. Using a bi-exponential model for Tl kinetics, 4-hour normalized washout ranged from 49% to 54% in each group and showed minimal variation between walls throughout the bullseye. Thus, there are well defined variations in Tl-201 uptake in the nl myocardium which must be taken into consideration when analyzing patient data. Because of these defects and the lack of adequate methods for attenuation correction, quantitative analysis of Tl-201 studies must include direct comparison with gender-matched nl data sets.

  4. Quantitative analysis of the security performance in wireless LANs

    Directory of Open Access Journals (Sweden)

    Poonam Jindal

    2017-07-01

    Full Text Available A comprehensive experimental study to analyze the security performance of a WLAN based on the IEEE 802.11 b/g/n standards in various network scenarios is presented in this paper. By setting up an experimental testbed we have measured results for a layered security model in terms of throughput, response time, encryption overheads, frame loss and jitter. Through numerical results obtained from the testbed, we have presented quantitative as well as realistic findings for both security mechanisms and network performance. It establishes the fact that there is always a tradeoff between security strength and the associated network performance. It is observed that the non-roaming network always performs better than the roaming network under all network scenarios. To analyze the benefits offered by a particular security protocol, a relative security strength index model is demonstrated. Further, we have presented a statistical analysis of our experimental data. We found that different security protocols have different robustness against mobility. By choosing a robust security protocol, network performance can be improved. The presented analysis is significant and useful with reference to the assessment of the suitability of security protocols for given real-time applications.

  5. Tuberculosis DALY-Gap: Spatial and Quantitative Comparison of Disease Burden Across Urban Slum and Non-slum Census Tracts.

    Science.gov (United States)

    Marlow, Mariel A; Maciel, Ethel Leonor Noia; Sales, Carolina Maia Martins; Gomes, Teresa; Snyder, Robert E; Daumas, Regina Paiva; Riley, Lee W

    2015-08-01

    To quantitatively assess disease burden due to tuberculosis between populations residing in and outside of urban informal settlements in Rio de Janeiro, Brazil, we compared disability-adjusted life years (DALYs), or "DALY-gap." Using the 2010 Brazilian census definition of informal settlements as aglomerados subnormais (AGSN), we allocated tuberculosis (TB) DALYs to AGSN vs non-AGSN census tracts based on geocoded addresses of TB cases reported to the Brazilian Information System for Notifiable Diseases in 2005 and 2010. DALYs were calculated based on the 2010 Global Burden of Disease methodology. DALY-gap was calculated as the difference between age-adjusted DALYs/100,000 population between AGSN and non-AGSN. Total TB DALY in Rio in 2010 was 16,731 (266 DALYs/100,000). DALYs were higher in AGSN census tracts (306 vs 236 DALYs/100,000), yielding a DALY-gap of 70 DALYs/100,000. Attributable DALY fraction for living in an AGSN was 25.4%. DALY-gap was highest for males 40-59 years of age (501 DALYs/100,000) and in census tracts with <60% electricity (12,327 DALYs/100,000). DALY-gap comparison revealed spatial and quantitative differences in TB burden between slum vs non-slum census tracts that were not apparent using traditional measures of incidence and mortality. This metric could be applied to compare TB burden or burden for other diseases in mega-cities with large informal settlements for more targeted resource allocation and evaluation of intervention programs.
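
    The DALY-gap itself is a simple difference of age-adjusted rates, with DALYs combining years of life lost and years lived with disability. A minimal sketch using the two rates quoted above (the inputs to the dalys function are illustrative, not study data):

    ```python
    # DALYs = years of life lost (YLL) + years lived with disability (YLD),
    # following the prevalence-based GBD 2010 convention in broad strokes.
    def dalys(deaths, life_expectancy_at_death, prevalent_cases, disability_weight):
        yll = deaths * life_expectancy_at_death
        yld = prevalent_cases * disability_weight
        return yll + yld

    # DALY-gap: difference in age-adjusted rates between AGSN and non-AGSN
    # census tracts, per 100,000 population.
    daly_gap = 306.0 - 236.0   # rates as reported above
    print(daly_gap)            # 70.0
    ```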

  6. Quantitative atom probe analysis of nanostructure containing clusters and precipitates with multiple length scales

    International Nuclear Information System (INIS)

    Marceau, R.K.W.; Stephenson, L.T.; Hutchinson, C.R.; Ringer, S.P.

    2011-01-01

    A model Al-3Cu-(0.05 Sn) (wt%) alloy containing a bimodal distribution of relatively shear-resistant θ' precipitates and shearable GP zones is considered in this study. It has recently been shown that the addition of the GP zones to such microstructures can lead to significant increases in strength without a decrease in the uniform elongation. In this study, atom probe tomography (APT) has been used to quantitatively characterise the evolution of the GP zones and the solute distribution in the bimodal microstructure as a function of applied plastic strain. Recent nuclear magnetic resonance (NMR) analysis has clearly shown strain-induced dissolution of the GP zones, which is supported by the current APT data with additional spatial information. There is significant repartitioning of Cu from the GP zones into the solid solution during deformation. A new approach for cluster finding in APT data has been used to quantitatively characterise the evolution of the sizes and shapes of the Cu-containing features in the solid solution as a function of applied strain.

  7. Quantitative Imaging of Cholinergic Interneurons Reveals a Distinctive Spatial Organization and a Functional Gradient across the Mouse Striatum.

    Directory of Open Access Journals (Sweden)

    Miriam Matamales

    Full Text Available Information processing in the striatum requires the postsynaptic integration of glutamatergic and dopaminergic signals, which are then relayed to the output nuclei of the basal ganglia to influence behavior. Although cellularly homogeneous in appearance, the striatum contains several rare interneuron populations which tightly modulate striatal function. Of these, cholinergic interneurons (CINs) have recently been shown to play a critical role in the control of reward-related learning; however, how the striatal cholinergic network is functionally organized at the mesoscopic level and the way this organization influences striatal function remain poorly understood. Here, we systematically mapped and digitally reconstructed the entire ensemble of CINs in the mouse striatum and quantitatively assessed differences in densities, spatial arrangement and neuropil content across striatal functional territories. This approach demonstrated that the rostral portion of the striatum contained a higher concentration of CINs than the caudal striatum and that the cholinergic content in the core of the ventral striatum was significantly lower than in the rest of the regions. Additionally, statistical comparison of spatial point patterns in the striatal cholinergic ensemble revealed that only a minor portion of CINs (17%) aggregated into clusters and that they were predominantly organized in a random fashion. Furthermore, we used a fluorescence reporter to estimate the activity of over two thousand CINs in naïve mice and found that there was a decreasing gradient of CIN overall function along the dorsomedial-to-ventrolateral axis, which appeared to be independent of their propensity to aggregate within the striatum. Altogether this work suggests that the regulation of striatal function by acetylcholine across the striatum is highly heterogeneous, and that signals originating in external afferent systems may be principally determining the function of CINs in the

  8. SPATIAL ANALYSIS TO SUPPORT GEOGRAPHIC TARGETING OF GENOTYPES TO ENVIRONMENTS

    Directory of Open Access Journals (Sweden)

    Glenn Hyman

    2013-03-01

    Full Text Available Crop improvement efforts have benefited greatly from advances in available data, computing technology and methods for targeting genotypes to environments. These advances support the analysis of genotype by environment interactions to understand how well a genotype adapts to environmental conditions. This paper reviews the use of spatial analysis to support crop improvement research aimed at matching genotypes to their most appropriate environmental niches. Better data sets are now available on soils, weather and climate, elevation, vegetation, crop distribution and local conditions where genotypes are tested in experimental trial sites. The improved data are now combined with spatial analysis methods to compare environmental conditions across sites, create agro-ecological region maps and assess environment change. Climate, elevation and vegetation data sets are now widely available, supporting analyses that were much more difficult even five or ten years ago. While detailed soil data for many parts of the world remains difficult to acquire for crop improvement studies, new advances in digital soil mapping are likely to improve our capacity. Site analysis and matching and regional targeting methods have advanced in parallel to data and technology improvements. All these developments have increased our capacity to link genotype to phenotype and point to a vast potential to improve crop adaptation efforts.

  9. Spatial analysis and characteristics of pig farming in Thailand.

    Science.gov (United States)

    Thanapongtharm, Weerapong; Linard, Catherine; Chinson, Pornpiroon; Kasemsuwan, Suwicha; Visser, Marjolein; Gaughan, Andrea E; Epprech, Michael; Robinson, Timothy P; Gilbert, Marius

    2016-10-06

    In Thailand, pig production intensified significantly during the last decade, with many economic, epidemiological and environmental implications. Strategies toward more sustainable future developments are currently being investigated, and these could be informed by a detailed assessment of the main trends in the pig sector and of how different production systems are geographically distributed. This study had two main objectives. First, we aimed to describe the main trends and geographic patterns of pig production systems in Thailand in terms of pig type (native, breeding, and fattening pigs), farm scale (smallholder and large-scale farming systems) and type of farming system (farrow-to-finish, nursery, and finishing systems), based on a very detailed 2010 census. Second, we aimed to study the statistical spatial association between the distribution of these different types of pig farming and a set of spatial variables describing access to feed and markets. Over the last decades, the pig population gradually increased, with a continuously increasing number of pigs per holder, suggesting a continuing intensification of the sector. The different pig-production systems showed very contrasting geographical distributions. The spatial distribution of large-scale pig farms corresponds with that of commercial pig breeds, and spatial analysis conducted using Random Forest distribution models indicated that these were concentrated in lowland urban or peri-urban areas, close to means of transportation, facilitating supply to major markets such as provincial capitals and the Bangkok Metropolitan region. Conversely, the smallholders were distributed throughout the country, with higher densities located in highland, remote, and rural areas, where they supply local rural markets. A limitation of the study was that pig farming systems were defined from the number of animals per farm, resulting in their possible misclassification, but this should have a limited impact on the main patterns revealed.

  10. Quantitative analysis of left ventricular strain using cardiac computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Buss, Sebastian J., E-mail: sebastian.buss@med.uni-heidelberg.de [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Schulz, Felix; Mereles, Derliz [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Hosch, Waldemar [Department of Diagnostic and Interventional Radiology, University of Heidelberg, 69120 Heidelberg (Germany); Galuschky, Christian; Schummers, Georg; Stapf, Daniel [TomTec Imaging Systems GmbH, Munich (Germany); Hofmann, Nina; Giannitsis, Evangelos; Hardt, Stefan E. [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Kauczor, Hans-Ulrich [Department of Diagnostic and Interventional Radiology, University of Heidelberg, 69120 Heidelberg (Germany); Katus, Hugo A.; Korosoglou, Grigorios [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany)

    2014-03-15

    Objectives: To investigate whether cardiac computed tomography (CCT) can determine left ventricular (LV) radial, circumferential and longitudinal myocardial deformation in comparison to two-dimensional echocardiography in patients with congestive heart failure. Background: Echocardiography allows for accurate assessment of strain with high temporal resolution. A reduced strain is associated with a poor prognosis in cardiomyopathies. However, strain imaging is limited in patients with poor echogenic windows, so that, in selected cases, tomographic imaging techniques may be preferable for the evaluation of myocardial deformation. Methods: Consecutive patients (n = 27) with congestive heart failure who underwent a clinically indicated ECG-gated contrast-enhanced 64-slice dual-source CCT for the evaluation of the cardiac veins prior to cardiac resynchronization therapy (CRT) were included. All patients underwent additional echocardiography. LV radial, circumferential and longitudinal strain and strain rates were analyzed in identical midventricular short axis, 4-, 2- and 3-chamber views for both modalities using the same prototype software algorithm (feature tracking). Time for analysis was assessed for both modalities. Results: Close correlations were observed for both techniques regarding global strain (r = 0.93, r = 0.87 and r = 0.84 for radial, circumferential and longitudinal strain, respectively, p < 0.001 for all). Similar trends were observed for regional radial, longitudinal and circumferential strain (r = 0.88, r = 0.84 and r = 0.94, respectively, p < 0.001 for all). The number of non-diagnostic myocardial segments was significantly higher with echocardiography than with CCT (9.6% versus 1.9%, p < 0.001). In addition, the required time for complete quantitative strain analysis was significantly shorter for CCT compared to echocardiography (877 ± 119 s per patient versus 1105 ± 258 s per patient, p < 0.001). Conclusion: Quantitative assessment of LV strain

  11. Quantitative assessment of early diabetic retinopathy using fractal analysis.

    Science.gov (United States)

    Cheung, Ning; Donaghue, Kim C; Liew, Gerald; Rogers, Sophie L; Wang, Jie Jin; Lim, Shueh-Wen; Jenkins, Alicia J; Hsu, Wynne; Li Lee, Mong; Wong, Tien Y

    2009-01-01

    Fractal analysis can quantify the geometric complexity of the retinal vascular branching pattern and may therefore offer a new method to quantify early diabetic microvascular damage. In this study, we examined the relationship between retinal fractal dimension and retinopathy in young individuals with type 1 diabetes. We conducted a cross-sectional study of 729 patients with type 1 diabetes (aged 12-20 years) who had seven-field stereoscopic retinal photographs taken of both eyes. From these photographs, retinopathy was graded according to the modified Airlie House classification, and fractal dimension was quantified using a computer-based program following a standardized protocol. In this study, 137 patients (18.8%) had diabetic retinopathy signs; of these, 105 had mild retinopathy. Median (interquartile range) retinal fractal dimension was 1.46214 (1.45023-1.47217). After adjustment for age, sex, diabetes duration, A1C, blood pressure, and total cholesterol, increasing retinal vascular fractal dimension was significantly associated with increasing odds of retinopathy (odds ratio 3.92 [95% CI 2.02-7.61] for fourth versus first quartile of fractal dimension). In multivariate analysis, each 0.01 increase in retinal vascular fractal dimension was associated with a nearly 40% increased odds of retinopathy (1.37 [1.21-1.56]). This association remained after additional adjustment for retinal vascular caliber. Greater retinal fractal dimension, representing increased geometric complexity of the retinal vasculature, is independently associated with early diabetic retinopathy signs in type 1 diabetes. Fractal analysis of fundus photographs may allow quantitative measurement of early diabetic microvascular damage.
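
    The abstract does not disclose the fractal software used, but the standard box-counting estimate of fractal dimension can be sketched in a few lines of Python; the vessel mask and box sizes below are illustrative assumptions.

        import numpy as np

        def box_counting_dimension(mask, box_sizes=(2, 4, 8, 16, 32, 64)):
            """Estimate the fractal dimension of a binary vessel mask
            (assumes the mask has foreground pixels at every box size)."""
            counts = []
            for s in box_sizes:
                h = (mask.shape[0] // s) * s  # trim so the image tiles into s x s boxes
                w = (mask.shape[1] // s) * s
                tiles = mask[:h, :w].reshape(h // s, s, w // s, s)
                counts.append(np.count_nonzero(tiles.any(axis=(1, 3))))
            # Dimension is the slope of log(count) against log(1/box size)
            slope, _ = np.polyfit(np.log(1.0 / np.array(box_sizes)), np.log(counts), 1)
            return slope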

  12. Nanotechnology patents in the automotive industry (a quantitative & qualitative analysis).

    Science.gov (United States)

    Prasad, Raghavendra; Bandyopadhyay, Tapas K

    2014-01-01

    The aim of the article is to present trends in patent filings for applications of nanotechnology in the automobile sector across the world, using keyword-based patent searching. An overview of patents related to nanotechnology in the automobile industry is provided. The work starts from a worldwide patent search to find patents on nanotechnology in the automobile industry, and classifies the patents according to the automobile parts to which they relate and the solutions they provide. Various graphs were then produced to give insight into trends, and the patents were analyzed across classifications. The trends shown in the graphs provide the quantitative analysis, while the qualitative analysis is presented in a separate section. Patents were classified by the solution they provide after reading the claims, titles, abstracts and full texts separately. The patentability of nanotechnology inventions is discussed with a view to outlining the requirements and statutory bars to patenting such inventions. A further objective of the current work is to suggest an appropriate framework for companies regarding the use of nanotechnology in the automobile industry, and a suggested strategy for patenting related inventions. For example, US patent US2008-019426A1 discusses an invention related to a lubricant composition. This patent was studied and classified under automobile parts, and it was deduced that it addresses the problem of friction in the engine. One classification is based on the automobile part, the other on the problem being solved; hence two classifications, reduction in friction and engine, were created. Similarly, after studying all the patents, a classification matrix was created.

  13. Collocations and collocation types in ESP textbooks: Quantitative pedagogical analysis

    Directory of Open Access Journals (Sweden)

    Bogdanović Vesna Ž.

    2016-01-01

    Full Text Available The term collocation, although rather common in English grammar, is not well known or commonly used in textbooks and scientific papers written in the Serbian language. Collocating is usually defined as the natural co-occurrence of two (or more) words, typically adjacent although they can be separated in the text, while collocations are defined as words with natural semantic and/or syntactic relations joined together in a sentence. Collocations occur naturally in all English texts, including scientific texts and papers. Using two textbooks for English for Specific Purposes (ESP) intermediate courses, this paper presents the frequency of collocations and their typology. The paper investigates the relationship between lexical and grammatical collocations in ESP texts and the reasons for their presence, with an overview of the most used subtypes of lexical collocations. Furthermore, applying basic quantitative corpus analysis, the paper counts the open, restricted and bound collocations in ESP texts, drawing conclusions about their frequency and hence about modes for learning them. There is also a section on the number and usage of scientific collocations, both common scientific and narrowly professional ones. The conclusion is that the number of collocations present in the two selected textbooks calls for further analysis of these lexical connections, as well as for new modes of teaching and presenting them to students learning English.
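
    A minimal sketch of the kind of frequency count that underlies such a quantitative corpus analysis (Python; the sample sentence is invented, and a real study would lemmatize the tokens and apply association measures):

        from collections import Counter
        import re

        text = ("The analysis of variance shows that the analysis of variance "
                "is a common grammatical pattern in scientific texts.")
        tokens = re.findall(r"[a-z]+", text.lower())

        # Count adjacent word pairs; frequent pairs are collocation candidates
        bigrams = Counter(zip(tokens, tokens[1:]))
        for pair, n in bigrams.most_common(3):
            print(" ".join(pair), n)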

  14. The Measles Vaccination Narrative in Twitter: A Quantitative Analysis.

    Science.gov (United States)

    Radzikowski, Jacek; Stefanidis, Anthony; Jacobsen, Kathryn H; Croitoru, Arie; Crooks, Andrew; Delamater, Paul L

    2016-01-01

    The emergence of social media is providing an alternative avenue for information exchange and opinion formation on health-related issues. Collective discourse in such media leads to the formation of a complex narrative, conveying public views and perceptions. This paper presents a study of the Twitter narrative regarding vaccination in the aftermath of the 2015 measles outbreak, in terms of both its cyber and physical characteristics. We aimed to contribute to the analysis of the data, as well as to present a quantitative interdisciplinary approach for analyzing such open-source data in the context of health narratives. We collected 669,136 tweets referring to vaccination from February 1 to March 9, 2015. These tweets were analyzed to identify key terms, connections among such terms, retweet patterns, the structure of the narrative, and connections to geographical space. The data analysis captures the anatomy of the themes and relations that make up the discussion about vaccination on Twitter. The results highlight the higher impact of stories contributed by news organizations compared to direct tweets by health organizations in communicating health-related information. They also capture the structure of the antivaccination narrative and its terms of reference. The analysis also revealed the relationship between community engagement on Twitter and state policies regarding child vaccination. Residents of Vermont and Oregon, the two states with the highest rates of non-medical exemption from school-entry vaccines nationwide, are leading the social media discussion in terms of participation. The interdisciplinary study of health-related debates in social media across the cyber-physical nexus leads to a greater understanding of public concerns, views, and responses to health-related issues. Further coalescing such capabilities shows promise towards advancing health communication, thus supporting the design of more effective strategies that take into account the complex
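
    One of the building blocks named above, the retweet pattern, is naturally represented as a directed graph. A toy sketch with the networkx library (usernames invented; real data would come from the collected tweets):

        import networkx as nx

        # Toy retweet network: an edge u -> v means user u retweeted user v
        retweets = [("user_a", "news_org"), ("user_b", "news_org"),
                    ("user_c", "health_org"), ("user_d", "news_org")]
        G = nx.DiGraph()
        G.add_edges_from(retweets)

        # In-degree identifies the most amplified accounts in the narrative
        print(sorted(G.in_degree(), key=lambda kv: kv[1], reverse=True)[:2])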

  15. Geographic information systems, remote sensing, and spatial analysis activities in Texas, 2008-09

    Science.gov (United States)

    ,

    2009-01-01

    Geographic information system (GIS) technology has become an important tool for scientific investigation, resource management, and environmental planning. A GIS is a computer-aided system capable of collecting, storing, analyzing, and displaying spatially referenced digital data. GIS technology is useful for analyzing a wide variety of spatial data. Remote sensing involves collecting remotely sensed data, such as satellite imagery, aerial photography, or radar images, and analyzing the data to gather information or investigate trends about the environment or the Earth's surface. Spatial analysis combines remotely sensed, thematic, statistical, quantitative, and geographical data through overlay, modeling, and other analytical techniques to investigate specific research questions. It is the combination of data formats and analysis techniques that has made GIS an essential tool in scientific investigations. This fact sheet presents information about the technical capabilities and project activities of the U.S. Geological Survey (USGS) Texas Water Science Center (TWSC) GIS Workgroup during 2008 and 2009. After a summary of GIS Workgroup capabilities, brief descriptions of activities by project at the local and national levels are presented. Projects are grouped by the fiscal year (October-September 2008 or 2009) the project ends and include overviews, project images, and Internet links to additional project information and related publications or articles.

  16. Left atrial appendage segmentation and quantitative assisted diagnosis of atrial fibrillation based on fusion of temporal-spatial information.

    Science.gov (United States)

    Jin, Cheng; Feng, Jianjiang; Wang, Lei; Yu, Heng; Liu, Jiang; Lu, Jiwen; Zhou, Jie

    2018-05-01

    In this paper, we present an approach for left atrial appendage (LAA) multi-phase fast segmentation and quantitative assisted diagnosis of atrial fibrillation (AF) based on 4D-CT data. We take full advantage of the temporal dimension information to segment the living, flailed LAA based on a parametric max-flow method and a graph-cut approach to build a 3-D model of each phase. To assist the diagnosis of AF, we calculate the volumes of the 3-D models and then generate a "volume-phase" curve to derive the important dynamic metrics: ejection fraction, filling flux, and emptying flux of the LAA's blood by volume. This approach demonstrates more precise results than conventional approaches that calculate metrics by area, and allows for quick analysis of LAA volume pattern changes over a cardiac cycle. It may also provide insight into individual differences in lesions of the LAA. Furthermore, we apply support vector machines (SVMs) to achieve a quantitative auto-diagnosis of AF by exploiting seven features from volume change ratios of the LAA, and perform multivariate logistic regression analysis for the risk of LAA thrombosis. The 100 cases utilized in this research were taken from the Philips 256-iCT. The experimental results demonstrate that our approach constructs the 3-D LAA geometries robustly compared to manual annotations, and reasonably infers that the LAA undergoes filling, emptying and re-filling, re-emptying in a cardiac cycle. This research offers potential for exploring various physiological functions of the LAA and quantitatively estimating the risk of stroke in patients with AF. Copyright © 2018 Elsevier Ltd. All rights reserved.
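
    The dynamic metrics named above follow directly from a sampled volume-phase curve. A minimal sketch with hypothetical LAA volumes over one cardiac cycle (Python; the ejection-fraction definition shown is one common convention):

        import numpy as np

        # Hypothetical LAA volumes (ml) at the 10 phases of one cardiac cycle
        volumes = np.array([8.9, 9.4, 10.1, 9.0, 7.2, 6.1, 6.8, 7.9, 8.6, 9.1])

        v_max, v_min = volumes.max(), volumes.min()
        ejection_fraction = (v_max - v_min) / v_max          # fraction of blood expelled
        filling_flux = np.diff(volumes).clip(min=0).sum()    # total volume gained over the cycle
        emptying_flux = -np.diff(volumes).clip(max=0).sum()  # total volume lost over the cycle

        print(f"EF = {ejection_fraction:.2%}, filling = {filling_flux:.1f} ml, "
              f"emptying = {emptying_flux:.1f} ml")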

  17. Preclinical evaluation of spatial frequency domain-enabled wide-field quantitative imaging for enhanced glioma resection

    Science.gov (United States)

    Sibai, Mira; Fisher, Carl; Veilleux, Israel; Elliott, Jonathan T.; Leblond, Frederic; Roberts, David W.; Wilson, Brian C.

    2017-07-01

    5-Aminolevulinic acid-induced protoporphyrin IX (PpIX) fluorescence-guided resection (FGR) enables maximum safe resection of glioma by providing real-time tumor contrast. However, the subjective visual assessment and the variable intrinsic optical attenuation of tissue limit this technique to reliably delineating only high-grade tumors that display strong fluorescence. We have previously shown, using a fiber-optic probe, that quantitative assessment based on noninvasive point spectroscopic measurements of the absolute PpIX concentration in tissue further improves the accuracy of FGR, extending it to surgically curable low-grade glioma. More recently, we have shown that implementing spatial frequency domain imaging with a fluorescent-light transport model enables recovery of two-dimensional images of [PpIX], alleviating the need for time-consuming point sampling of the brain surface. We present first results of this technique modified for in vivo imaging on an RG2 rat brain tumor model. Despite moderate errors of 14% and 19% in retrieving the absorption and reduced scattering coefficients in the subdiffusive regime, the recovered [PpIX] maps agree within 10% of the point [PpIX] values measured by the fiber-optic probe, validating the method's potential as an extension of, or an alternative to, point sampling during glioma resection.

  18. An analysis of spatial representativeness of air temperature monitoring stations

    Science.gov (United States)

    Liu, Suhua; Su, Hongbo; Tian, Jing; Wang, Weizhen

    2018-05-01

    Surface air temperature is an essential variable for monitoring the atmosphere, and it is generally acquired at meteorological stations that can provide information about only a small area within an r-meter radius (the r-neighborhood) of the station; r is called the representable radius. In studies on a local scale, ground-based observations of surface air temperature obtained from scattered stations are usually interpolated using a variety of methods without ascertaining their effectiveness. Thus, it is necessary to evaluate the spatial representativeness of ground-based observations of surface air temperature before conducting studies on a local scale. The present study used remote sensing data to estimate the spatial distribution of surface air temperature using the advection-energy balance for air temperature (ADEBAT) model. Two target stations in the study area were selected for an analysis of spatial representativeness. The results showed that one station (AWS 7) had a representable radius of about 400 m with a possible error of less than 1 K, while the other station (AWS 16) had a representable radius of about 250 m. The representable radius was large when the heterogeneity of land cover around the station was small.
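
    One plausible way to operationalize the representable radius, not necessarily the ADEBAT procedure itself, is to grow a circle around the station until the mean absolute difference between the remotely estimated temperature field and the station value exceeds the error threshold (here 1 K); a sketch in Python:

        import numpy as np

        def representable_radius(temp_map, station_rc, cell_size=50.0, max_err=1.0):
            """Largest radius around a station where the mean absolute difference
            between the gridded air-temperature field and the station value stays
            below max_err (K). temp_map is a 2-D array, station_rc its (row, col)."""
            rows, cols = np.indices(temp_map.shape)
            dist = np.hypot(rows - station_rc[0], cols - station_rc[1]) * cell_size
            t0 = temp_map[station_rc]
            radius = 0.0
            for r in np.arange(cell_size, dist.max(), cell_size):
                disc = np.abs(temp_map[dist <= r] - t0)
                if disc.mean() >= max_err:
                    break
                radius = r
            return radius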

  19. Quantitative risk analysis of urban flooding in lowland areas

    NARCIS (Netherlands)

    Ten Veldhuis, J.A.E.

    2010-01-01

    Urban flood risk analyses suffer from a lack of quantitative historical data on flooding incidents. Data collection takes place on an ad hoc basis and is usually restricted to severe events. The resulting data deficiency renders quantitative assessment of urban flood risks uncertain. The study

  20. DEVELOPMENT OF TECHNIQUES FOR QUANTITATIVE ANALYSIS OF LIME FLOWERS

    Directory of Open Access Journals (Sweden)

    Demyanenko DV

    2016-03-01

    Full Text Available Introduction. The article is devoted to the development of techniques for quantitative analysis of lime flowers in order to amend the existing pharmacopoeial monographs for this herbal drug. Lime inflorescences contain lipophilic biologically active substances (BAS) causing notable antimicrobial and anti-inflammatory effects, and also more polar phenolic compounds with antiulcer activity. Considering this, it is necessary to regulate all these groups of BAS quantitatively. Materials and methods. For this study, six batches of lime flowers harvested in 2008-2009 in the Kharkiv, Rivno and Zhitomir regions were used as the crude herbal drug. Loss on drying was determined by routine pharmacopoeial procedures. Total content of lipophilic substances was determined gravimetrically after Soxhlet extraction of samples of 1, 5, 7 and 10 g in weight with methylene chloride, considering that in its extracting ability this solvent is close to the liquefied difluorochloromethane (freon R22) used by us for obtaining lipophilic complexes. The duration of complete analytical extraction was determined by extracting six 10 g assays of lime flowers for 1, 2, 3, 4, 5 and 6 hours, after which the quantity of lipophilic extractives was determined gravimetrically. The quantity of essential oil in lime flowers was evaluated by the procedure of EP7, 2.8.12, with a herbal drug sample of 200 g, a distillation rate of 2.5-3.5 ml/min, a distillation liquid (water) volume of 500 ml, and 0.50 ml of xylene in the graduated tube. Total flavonoid content, recalculated to quercetin, was determined after hydrolysis with acidified acetone, withdrawal of the flavonoid aglycones with ethyl acetate, and subsequent spectrophotometry of their complexes with aluminium chloride. All quantitative determinations were replicated five times for each assay. All chemicals and reagents were of analytical grade. Results and discussion. It was found that adequate accuracy of the analysis of lipophilic

  1. Deriving Quantitative Crystallographic Information from the Wavelength-Resolved Neutron Transmission Analysis Performed in Imaging Mode

    Directory of Open Access Journals (Sweden)

    Hirotaka Sato

    2017-12-01

    Full Text Available The current status of Bragg-edge/dip neutron transmission analysis/imaging methods is presented. The method can visualize real-space distributions of bulk crystallographic information in a crystalline material over a large area (~10 cm) with high spatial resolution (~100 μm). Furthermore, by using suitable spectrum analysis methods for wavelength-dependent neutron transmission data, quantitative visualization of the crystallographic information can be achieved. For example, crystallographic texture imaging, crystallite size imaging and crystalline phase imaging with texture/extinction corrections are carried out by the Rietveld-type (wide wavelength bandwidth) profile fitting analysis code RITS (Rietveld Imaging of Transmission Spectra). The single Bragg-edge analysis mode of RITS enables evaluation of the crystal lattice plane spacing (d-spacing) relating to macro-strain and of the FWHM (full width at half maximum) of the d-spacing distribution relating to micro-strain. Macro-strain tomography is performed by a new conceptual CT (computed tomography) image reconstruction algorithm, the tensor CT method. Crystalline grains and their orientations are visualized by a fast determination method of grain orientation for Bragg-dip neutron transmission spectra. In this paper, these imaging examples, together with the spectrum analysis methods and their reliability as evaluated by optical/electron microscopy and X-ray/neutron diffraction, are presented. In addition, the status at compact accelerator-driven pulsed neutron sources is also presented.
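
    For a given family of lattice planes, a Bragg edge appears at the wavelength λ = 2d, so the d-spacing and the macro-strain follow directly from the measured edge position. A minimal numeric sketch (Python, hypothetical values):

        # Macro-strain from a measured Bragg-edge position (hypothetical numbers).
        # At a Bragg edge, lambda_edge = 2 * d for the corresponding lattice planes.
        lambda_edge = 4.0712   # measured edge wavelength, angstroms
        d0 = 2.0338            # stress-free d-spacing of the same planes, angstroms

        d = lambda_edge / 2.0
        strain = (d - d0) / d0
        print(f"d = {d:.4f} A, macro-strain = {strain:.2e}")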

  2. Model Interpretation of Topological Spatial Analysis for the Visually Impaired (Blind) Implemented in Google Maps

    Directory of Open Access Journals (Sweden)

    Marcelo Franco Porto

    2013-06-01

    Full Text Available Technological innovations promote the availability of geographic information on the Internet through Web GIS such as Google Earth and Google Maps. These systems contribute to the teaching and diffusion of geographical knowledge, encouraging recognition of the space we live in and leading to the creation of a spatial identity. In these products available on the Web, the interpretation and analysis of spatial information give priority to one of the human senses: vision. Because this information is transmitted visually (images and vectors), a portion of the population is excluded from part of this knowledge, since categories of analysis of geographic data such as borders, territory and space can only be understood by people who can see. This paper deals with the development of a model for interpreting topological spatial analysis based on the synthesis of voice and sounds that can be used by the visually impaired (blind). The implementation of a prototype in Google Maps and the usability tests performed are also examined. For this work it was necessary to define the model of topological spatial analysis, focusing on computational implementation, which allows users to interpret the spatial relationships of regions (countries, states and municipalities), recognizing their limits, neighborhoods and extension. With this goal in mind, several interface and usability guidelines were drawn up for use by the visually impaired (blind). We conducted a detailed study of the Google Maps API (Application Programming Interface), which was the environment selected for prototype development, and studied the information available to users of that system. The prototype implementing the proposed model was developed in the C# language in the .NET environment, based on the synthesis of voice and sounds. To measure the efficiency and effectiveness of the prototype, usability
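
    The core topological relation such a prototype conveys, whether two regions are neighbors, corresponds to the standard "touches" predicate of computational geometry. A toy sketch using the shapely library (the polygons are invented; the paper's own implementation is in C#):

        from shapely.geometry import Polygon

        # Toy regions: the topological "touches" predicate identifies neighbors,
        # which a prototype could then announce through voice synthesis
        region_a = Polygon([(0, 0), (2, 0), (2, 2), (0, 2)])
        region_b = Polygon([(2, 0), (4, 0), (4, 2), (2, 2)])   # shares an edge with A
        region_c = Polygon([(5, 5), (6, 5), (6, 6), (5, 6)])   # disjoint

        print(region_a.touches(region_b))  # True  -> "neighboring regions"
        print(region_a.touches(region_c))  # False -> "not adjacent"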

  3. A Quantitative Analysis of Photovoltaic Modules Using Halved Cells

    Directory of Open Access Journals (Sweden)

    S. Guo

    2013-01-01

    Full Text Available In a silicon wafer-based photovoltaic (PV) module, significant power is lost due to current transport through the ribbons interconnecting neighbouring cells. Using halved cells in PV modules is an effective method to reduce this resistive power loss, and it has already been applied by some major PV manufacturers (Mitsubishi, BP Solar) in their commercially available PV modules. Consequently, a quantitative analysis of PV modules using halved cells is needed. In this paper we investigate theoretically and experimentally the difference between modules made with halved and full-size solar cells. Theoretically, we find an improvement in fill factor of 1.8% absolute and in output power of 90 mW for the halved-cell minimodule. Experimentally, we find an improvement in fill factor of 1.3% absolute and in output power of 60 mW for the halved-cell module. We also investigate theoretically how this effect carries over to large-size modules, finding that the performance increment of halved-cell PV modules is even higher for high-efficiency solar cells. The resistive loss of large-size modules with different interconnection schemes is then analysed. Finally, factors influencing the performance and cost of industrial halved-cell PV modules are discussed.
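
    The underlying arithmetic is simple: halving a cell halves the current each ribbon carries, so the I²R loss per interconnect falls fourfold, and even with twice as many interconnects the net resistive loss is roughly halved. A sketch with hypothetical values (Python):

        # Resistive interconnect loss: full-size vs halved cells (hypothetical values)
        I_full = 9.0      # current generated by one full-size cell, A
        R_ribbon = 0.005  # effective resistance of one cell-to-cell ribbon, ohm
        n_cells = 60      # cells in the module

        # Full-size cells: n interconnects, each carrying the full current
        loss_full = n_cells * I_full**2 * R_ribbon

        # Halved cells: each carries half the current, but there are twice as many
        loss_half = (2 * n_cells) * (I_full / 2)**2 * R_ribbon

        print(f"full: {loss_full:.1f} W, halved: {loss_half:.1f} W")  # roughly 50% less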

  4. Field nonuniformity correction for quantitative analysis of digitized mammograms

    International Nuclear Information System (INIS)

    Pawluczyk, Olga; Yaffe, Martin J.

    2001-01-01

    Several factors, including the heel effect, variation in distance from the x-ray source to points in the image and path obliquity contribute to the signal nonuniformity of mammograms. To best use digitized mammograms for quantitative image analysis, these field non-uniformities must be corrected. An empirically based correction method, which uses a bowl-shaped calibration phantom, has been developed. Due to the annular spherical shape of the phantom, its attenuation is constant over the entire image. Remaining nonuniformities are due only to the heel and inverse square effects as well as the variable path through the beam filter, compression plate and image receptor. In logarithmic space, a normalized image of the phantom can be added to mammograms to correct for these effects. Then, an analytical correction for path obliquity in the breast can be applied to the images. It was found that the correction causes the errors associated with field nonuniformity to be reduced from 14% to 2% for a 4 cm block of material corresponding to a combination of 50% fibroglandular and 50% fatty breast tissue. A repeatability study has been conducted to show that in regions as far as 20 cm away from the chest wall, variations due to imaging conditions and phantom alignment contribute to <2% of overall corrected signal
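
    A minimal sketch of such a log-space flat-field correction (Python/NumPy); the sign convention depends on whether the data are intensities or optical densities, and the normalization shown is an assumption rather than the authors' exact procedure:

        import numpy as np

        def correct_field(mammogram, phantom, eps=1e-6):
            """Log-space flat-field correction with a constant-attenuation
            bowl phantom: remove the heel-effect, inverse-square and filter-path
            nonuniformity shared by both images."""
            log_mammo = np.log(mammogram + eps)
            log_flat = np.log(phantom + eps)
            log_flat -= log_flat.mean()  # keep only the nonuniformity pattern
            # Subtracting here is equivalent to adding the negated phantom image
            return np.exp(log_mammo - log_flat)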

  5. Full-Range Public Health Leadership, Part 1: Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Erik L. Carlton

    2015-04-01

    Full Text Available Background. Workforce and leadership development are central to the future of public health. However, public health has been slow to translate and apply leadership models from other professions and to incorporate local perspectives in understanding public health leadership. Purpose. This study utilized the full-range leadership model to examine public health leadership. Specifically, it sought to measure leadership styles among local health department directors and to understand the context of leadership in local health departments. Methods. Leadership styles among local health department directors (n=13) were examined using survey methodology. Quantitative analysis methods included descriptive statistics, boxplots, and Pearson bivariate correlations using SPSS v18.0. Findings. Self-reported leadership styles were highly correlated with leadership outcomes at the organizational level. However, they were not related to county health rankings. Results suggest the preeminence of leader behaviors and of providing individual consideration to staff, as compared with idealized attributes of leaders, intellectual stimulation, or inspirational motivation. Implications. Holistic leadership assessment instruments, such as the Multifactor Leadership Questionnaire (MLQ), can be useful in assessing public health leaders' approaches and outcomes. Comprehensive, 360-degree reviews may be especially helpful. Further research is needed to examine the effectiveness of public health leadership development models, as well as the extent to which public health leadership impacts public health outcomes.

  6. Quantitative analysis of dynamic association in live biological fluorescent samples.

    Directory of Open Access Journals (Sweden)

    Pekka Ruusuvuori

    Full Text Available Determining vesicle localization and association in live microscopy may be challenging due to non-simultaneous imaging of rapidly moving objects with two excitation channels. Besides errors due to the movement of objects, imaging may also introduce shifts between the image channels, and traditional colocalization methods cannot handle such situations. Our approach to quantifying the association between tagged proteins is to use an object-based method where an exact match of object locations is not assumed. Point-pattern matching provides a measure of correspondence between two point sets under various changes between the sets. Thus, it can be used for robust quantitative analysis of vesicle association between image channels. Results for a large set of synthetic images show that the novel association method based on point-pattern matching robustly detects the association of closely located vesicles in live-cell microscopy where traditional colocalization methods fail to produce results. In addition, the method outperforms the Iterated Closest Point registration method used for comparison. Results for fixed and live experimental data show the association method to perform comparably to traditional methods in colocalization studies of fixed cells and to perform favorably in association studies of live cells.
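
    As a simplified stand-in for the point-pattern matching used in the paper, an object-based association measure can be sketched as a nearest-neighbour query with a distance tolerance (Python/SciPy); a full point-pattern matching would additionally estimate the shift between channels:

        import numpy as np
        from scipy.spatial import cKDTree

        def association_fraction(points_a, points_b, tol=3.0):
            """Fraction of channel-A objects with a channel-B object within tol
            pixels. Object-based: exact coordinate match is not required."""
            tree = cKDTree(points_b)
            dists, _ = tree.query(points_a, k=1)
            return np.mean(dists <= tol)

        # Invented vesicle centroids for two channels (pixel coordinates)
        a = np.array([[10.0, 12.0], [40.0, 7.5], [22.0, 30.0]])
        b = np.array([[11.0, 12.5], [60.0, 60.0]])
        print(association_fraction(a, b))  # 1 of 3 objects associated -> 0.33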

  7. Qualitative and quantitative analysis of plutonium in solid waste drums

    International Nuclear Information System (INIS)

    Anno, Jacques; Escarieux, Emile

    1977-01-01

    An assessment is presented of the results of a study carried out to develop the qualitative and quantitative analysis, by γ spectrometry, of plutonium in solid waste drums. After recalling the standards and their effect on the quantities of plutonium to be measured (applied to industrial Pu: 20% 240Pu), the equipment used is described. The measurement station is provided with a mechanical system consisting of a rail and a pulley block to bring in the drums, and a pit and a hydraulic jack with a rotating platform. The detection instrumentation consists of a high-volume coaxial Ge(Li) detector with a γ-ray resolution of 2 keV, the associated electronics, and data processing by a 'Plurimat 20' minicomputer. The principles of the identification and measurements are specified and supported by experimental results, as follows: determination of the quality of the Pu by measuring the ratio between the γ-ray intensities of the 129 keV line of 239Pu and the 148 keV line of 241Pu; measurement of the 239Pu mass from the γ-ray counting rate at 375 keV using calibration curves obtained with plutonium samples ranging from 32 mg to 80 g; and correction of the results for the source position in the drum and for the plastic-material filling of the drum. The experimental results obtained on over 40 solid waste drums are presented along with the error estimates. [fr

  8. A temperature-controlled photoelectrochemical cell for quantitative product analysis

    Science.gov (United States)

    Corson, Elizabeth R.; Creel, Erin B.; Kim, Youngsang; Urban, Jeffrey J.; Kostecki, Robert; McCloskey, Bryan D.

    2018-05-01

    In this study, we describe the design and operation of a temperature-controlled photoelectrochemical cell for analysis of gaseous and liquid products formed at an illuminated working electrode. This cell is specifically designed to quantitatively analyze photoelectrochemical processes that yield multiple gas and liquid products at low current densities and exhibit limiting reactant concentrations that prevent these processes from being studied in traditional single chamber electrolytic cells. The geometry of the cell presented in this paper enables front-illumination of the photoelectrode and maximizes the electrode surface area to electrolyte volume ratio to increase liquid product concentration and hence enhances ex situ spectroscopic sensitivity toward them. Gas is bubbled through the electrolyte in the working electrode chamber during operation to maintain a saturated reactant concentration and to continuously mix the electrolyte. Gaseous products are detected by an in-line gas chromatograph, and liquid products are analyzed ex situ by nuclear magnetic resonance. Cell performance was validated by examining carbon dioxide reduction on a silver foil electrode, showing comparable results both to those reported in the literature and identical experiments performed in a standard parallel-electrode electrochemical cell. To demonstrate a photoelectrochemical application of the cell, CO2 reduction experiments were carried out on a plasmonic nanostructured silver photocathode and showed different product distributions under dark and illuminated conditions.

  9. FFT transformed quantitative EEG analysis of short term memory load.

    Science.gov (United States)

    Singh, Yogesh; Singh, Jayvardhan; Sharma, Ratna; Talwar, Anjana

    2015-07-01

    The EEG is considered a building block of functional signaling in the brain, and the role of EEG oscillations in human information processing has been intensively investigated. The aim was to study the quantitative EEG correlates of short term memory load as assessed through the Sternberg memory test. The study was conducted on 34 healthy male student volunteers. The intervention consisted of the Sternberg memory test, run as a software version of the Sternberg memory scanning paradigm on a computer. Electroencephalography (EEG) was recorded from 19 scalp locations according to the 10-20 international system of electrode placement, and the EEG signals were analyzed offline. To overcome the problems of a fixed band system, an individual alpha frequency (IAF) based band selection method was adopted. The outcome measures were FFT-transformed absolute powers in the six bands at the 19 electrode positions. The Sternberg memory test served as a model of short term memory load. During the memory task, the EEG changes appeared as decreased absolute power in the upper alpha band at nearly all electrode positions, and increased power in the theta band over the fronto-temporal region and in the lower-1 alpha band over the fronto-central region. Lower-2 alpha, beta and gamma band power remained unchanged. Short term memory load thus has distinct electroencephalographic correlates resembling a mentally stressed state. This is evident from the decreased power in the upper alpha band (corresponding to the alpha band of the traditional EEG system), which is the representative band of a relaxed mental state. The fronto-temporal theta power changes may reflect the encoding and execution of the memory task.
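
    A minimal sketch of FFT-based absolute band power with bands anchored to the individual alpha frequency (Python; the band limits and the stand-in signal are assumptions, not the study's exact definitions):

        import numpy as np

        def band_power(eeg, fs, f_lo, f_hi):
            """Absolute power of one EEG channel in [f_lo, f_hi) Hz via FFT."""
            spectrum = np.abs(np.fft.rfft(eeg)) ** 2 / len(eeg)
            freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
            return spectrum[(freqs >= f_lo) & (freqs < f_hi)].sum()

        # IAF-anchored bands, e.g. upper alpha = [IAF, IAF + 2] Hz (assumed limits)
        fs, iaf = 256, 10.0
        eeg = np.random.randn(fs * 4)                   # 4 s of stand-in signal
        upper_alpha = band_power(eeg, fs, iaf, iaf + 2)
        theta = band_power(eeg, fs, iaf - 6, iaf - 4)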

  10. Quantitative risk analysis of the pipeline GASDUC III - solutions

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Edmilson P.; Bettoni, Izabel Cristina [PETROBRAS, Rio de Janeiro, RJ (Brazil)

    2009-07-01

    In this work the quantitative risk analysis for the external public of the pipeline Cabiunas - REDUC (GASDUC III), 180 km long and linking the municipalities of Macae and Duque de Caxias - RJ, was performed by the companies PETROBRAS and ITSEMAP do Brasil. The pipeline has a large diameter (38 inches) and a high operating pressure (100 kgf/cm²), transporting natural gas through several densely populated areas. Initially, the individual risk contours were calculated without considering mitigating measures, resulting in an individual risk contour with frequencies of 1×10⁻⁶ per year involving sensitive occupations, and therefore considered unacceptable when compared with the INEA criterion. The societal risk was calculated for eight densely populated areas; their respective FN-curves were situated below the advised limit established by INEA, except for two areas that required additional mitigating measures to reduce the societal risk. Regarding societal risk, the FN-curve should lie below the advised limit presented in the Technical Instruction of INEA. The individual and societal risks were reassessed incorporating some mitigating measures, the results fell below the advised limits established by INEA, and PETROBRAS has obtained the license for installation of the pipeline. (author)

  11. Quantitative immunoelectrophoretic analysis of extract from cow hair and dander

    Energy Technology Data Exchange (ETDEWEB)

    Prahl, P; Weeke, B; Loewenstein, H [Rigshospitalet, Copenhagen (Denmark)

    1978-01-01

    Quantitative immunoelectrophoresis used for the analysis of a dialysed, centrifuged and freeze-dried extract from cow hair and dander revealed 17 antigens. Five of these were identified as serum proteins. Partial identity to antigens of serum and extract from hair and dander of goat, sheep, swine, horse, dog, cat, and guinea pig, and to antigens of house dust was demonstrated. Sera from 36 patients with manifest allergy to cow hair and dander selected on the basis of case history, RAST, skin and provocation test, were examined in crossed radioimmunoelectrophoresis (CRIE); sera from five persons with high serum IgE, but without allergy to cow hair and dander, and sera from five normal individuals were controls. 31/36 of the sera contained IgE with specific affinity for two of the antigens of the extract. Further, two major and six minor allergens were identified. The control sera showed no specific IgE binding. A significant positive correlation was found between RAST and CRIE for the first group of patients. The approximate molecular weights of the four major allergens obtained by means of gel chromatography were: 2.4 × 10⁴, 2 × 10⁴, 2 × 10⁵ dalton, respectively. Using Con-A and Con-A Sepharose in crossed immunoaffinoelectrophoresis, eight of the antigens were revealed to contain groups with affinity for Con-A.

  12. Quantitative produced water analysis using mobile 1H NMR

    International Nuclear Information System (INIS)

    Wagner, Lisabeth; Fridjonsson, Einar O; May, Eric F; Stanwix, Paul L; Graham, Brendan F; Carroll, Matthew R J; Johns, Michael L; Kalli, Chris

    2016-01-01

    Measurement of the oil contamination of produced water is required in the oil and gas industry down to the parts-per-million (ppm) level prior to discharge in order to meet typical environmental legislative requirements. Here we present the use of compact, mobile ¹H nuclear magnetic resonance (NMR) spectroscopy, in combination with solid phase extraction (SPE), to meet this metrology need. The NMR hardware employed featured a sufficiently homogeneous magnetic field that chemical shift differences could be used to unambiguously differentiate, and hence quantitatively detect, the required oil and solvent NMR signals. A solvent system consisting of 1% v/v chloroform in tetrachloroethylene was deployed; this provided a comparable ¹H NMR signal intensity for the oil and the solvent (chloroform), and hence an internal reference ¹H signal from the chloroform, making the measurement effectively self-calibrating. The measurement process was applied to water contaminated with hexane or crude oil over the range 1-30 ppm. The results were validated against known solubility limits as well as infrared analysis and gas chromatography. (paper)
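
    The self-calibration reduces to internal-standard arithmetic: the oil concentration follows from the ratio of the integrated oil and chloroform signals scaled by the known reference concentration. A sketch with invented numbers (Python; the scaling factor k stands in for proton-count and response corrections the real method would apply):

        # Internal-standard quantification (all numbers hypothetical)
        area_oil = 0.42      # integrated 1H signal of the oil band
        area_ref = 1.00      # integrated 1H signal of the chloroform reference
        ref_ppm = 150.0      # chloroform reference concentration, ppm-equivalent
        k = 1.0              # assumed proton-count/response scaling factor

        oil_ppm = k * (area_oil / area_ref) * ref_ppm
        print(f"estimated oil content: {oil_ppm:.0f} ppm")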

  13. Photographers’ Nomenclature Units: A Structural and Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Margarita A. Mihailova

    2017-11-01

    Full Text Available Addressing the needs of cross- and intercultural communication as well as the methodology of contrastive research, the paper presents the results of a complex analysis conducted to describe the semantic and pragmatic parameters of nomenclature units denoting photography equipment in the modern Russian informal discourse of professional photographers. The research is exemplified by 34 original nomenclature units and their 34 Russian equivalents used in 6871 comments posted on the "Клуб.Foto.ru" web site in 2015. The structural and quantitative analyses of photographers' nomenclature demonstrate the users' morphological and graphic preferences and indirectly reflect their social and professional values. The corpus-based approach developed by Kast-Aigner (2009: 141) was applied in the study with the aim of identifying the nomenclature units denoting photography equipment and of validating and elaborating the data of the existing corpus. The research also throws light on the problems of professional language development and derivational processes. The perspective of the study lies in researching the broader context of professional nomenclature.

  14. Quantitative immunoelectrophoretic analysis of extract from cow hair and dander

    International Nuclear Information System (INIS)

    Prahl, P.; Weeke, B.; Loewenstein, H.

    1978-01-01

    Quantitative immunoelectrophoresis used for the analysis of a dialysed, centrifuged and freeze-dried extract from cow hair and dander revealed 17 antigens. Five of these were identified as serum proteins. Partial identity to antigens of serum and extract from hair and dander of goat, sheep, swine, horse, dog, cat, and guinea pig, and to antigens of house dust was demonstrated. Sera from 36 patients with manifest allergy to cow hair and dander selected on the basis of case history, RAST, skin and provocation test, were examined in crossed radioimmunoelectrophoresis (CRIE); sera from five persons with high serum IgE, but without allergy to cow hair and dander, and sera from five normal individuals were controls. 31/36 of the sera contained IgE with specific affinity for two of the antigens of the extract. Further, two major and six minor allergens were identified. The control sera showed no specific IgE binding. A significant positive correlation was found between RAST and CRIE for the first group of patients. The approximate molecular weights of the four major allergens obtained by means of gel chromatography were: 2.4 × 10⁴, 2 × 10⁴, 2 × 10⁵ dalton, respectively. Using Con-A and Con-A Sepharose in crossed immunoaffinoelectrophoresis, eight of the antigens were revealed to contain groups with affinity for Con-A. (author)

  15. Social media in epilepsy: A quantitative and qualitative analysis.

    Science.gov (United States)

    Meng, Ying; Elkaim, Lior; Wang, Justin; Liu, Jessica; Alotaibi, Naif M; Ibrahim, George M; Fallah, Aria; Weil, Alexander G; Valiante, Taufik A; Lozano, Andres M; Rutka, James T

    2017-06-01

    While the social burden of epilepsy has been extensively studied, an evaluation of social media related to epilepsy may provide novel insight into disease perception, patient needs and access to treatments. The objective of this study is to assess patterns in social media and online communication usage related to epilepsy and its associated topics. We searched two major social media platforms (Facebook and Twitter) for public accounts dedicated to epilepsy. Results were analyzed using qualitative and quantitative methodologies. The former involved thematic and word count analysis for online posts and tweets on these platforms, while the latter employed descriptive statistics and non-parametric tests. Facebook had a higher number of pages (840 accounts) and users (3 million) compared to Twitter (137 accounts and 274,663 users). Foundation and support groups comprised most of the accounts and users on both Facebook and Twitter. The number of accounts increased by 100% from 2012 to 2016. Among the 403 posts and tweets analyzed, "providing information" on medications or correcting common misconceptions in epilepsy was the most common theme (48%). Surgical interventions for epilepsy were only mentioned in 1% of all posts and tweets. The current study provides a comprehensive reference on the usage of social media in epilepsy. The number of online users interested in epilepsy is likely the highest among all neurological conditions. Surgery, as a method of treating refractory epilepsy, however, could be underrepresented on social media. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. Spatial Analysis Along Networks Statistical and Computational Methods

    CERN Document Server

    Okabe, Atsuyuki

    2012-01-01

    In the real world, there are numerous and various events that occur on and alongside networks, including the occurrence of traffic accidents on highways, the location of stores alongside roads, the incidence of crime on streets and the contamination along rivers. In order to carry out analyses of those events, the researcher needs to be familiar with a range of specific techniques. Spatial Analysis Along Networks provides a practical guide to the necessary statistical techniques and their computational implementation. Each chapter illustrates a specific technique, from Stochastic Point Process

  17. Quantitative fluorescence loss in photobleaching for analysis of protein transport and aggregation

    Directory of Open Access Journals (Sweden)

    Wüstner Daniel

    2012-11-01

    Full Text Available Abstract Background Fluorescence loss in photobleaching (FLIP) is a widely used imaging technique which provides information about protein dynamics in various cellular regions. In FLIP, a small cellular region is repeatedly illuminated by an intense laser pulse, while images are taken with reduced laser power with a time lag between the bleaches. Despite its popularity, tools are lacking for quantitative analysis of FLIP experiments. Typically, the user defines regions of interest (ROIs) for further analysis, which is subjective and does not allow comparison of different cells and experimental settings. Results We present two complementary methods to detect and quantify protein transport and aggregation in living cells from FLIP image series. In the first approach, a stretched exponential (StrExp) function is fitted to the fluorescence loss (FL) inside and outside the bleached region. We show by reaction-diffusion simulations that the StrExp function can describe both binding/barrier-limited and diffusion-limited FL kinetics. By pixel-wise regression of that function to the FL kinetics of enhanced green fluorescent protein (eGFP), we determined in a user-unbiased manner from which cellular regions eGFP can be replenished in the bleached area. Spatial variation in the parameters calculated from the StrExp function allows for detecting diffusion barriers for eGFP in the nucleus and cytoplasm of living cells. Polyglutamine (polyQ) disease proteins like mutant huntingtin (mtHtt) can form large aggregates called inclusion bodies (IBs). The second method combines single particle tracking with multi-compartment modelling of FL kinetics in moving IBs to determine exchange rates of eGFP-tagged mtHtt protein (eGFP-mtHtt) between aggregates and the cytoplasm. This method is self-calibrating since it relates the FL inside and outside the bleached regions. It therefore makes it possible to compare release kinetics of eGFP-mtHtt between different cells and
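
    The StrExp function itself is compact enough to sketch, together with a least-squares fit to a synthetic fluorescence-loss curve (Python/SciPy; the parameter values are illustrative):

        import numpy as np
        from scipy.optimize import curve_fit

        def str_exp(t, i0, tau, beta):
            """Stretched exponential: I(t) = I0 * exp(-(t / tau)**beta)."""
            return i0 * np.exp(-(t / tau) ** beta)

        # Synthetic fluorescence-loss curve from one pixel
        np.random.seed(0)
        t = np.linspace(0, 100, 50)
        signal = str_exp(t, 1.0, 30.0, 0.7) + 0.02 * np.random.randn(t.size)

        (i0, tau, beta), _ = curve_fit(str_exp, t, signal, p0=(1.0, 20.0, 1.0))
        print(f"I0={i0:.2f}, tau={tau:.1f}, beta={beta:.2f}")  # beta < 1: anomalous kinetics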

  18. Factors influencing the spatial extent of mobile source air pollution impacts: a meta-analysis

    Directory of Open Access Journals (Sweden)

    Levy Jonathan I

    2007-05-01

    Full Text Available Abstract Background There has been growing interest among exposure assessors, epidemiologists, and policymakers in the concept of "hot spots", or more broadly, the "spatial extent" of impacts from traffic-related air pollutants. This review attempts to quantitatively synthesize findings about the spatial extent under various circumstances. Methods We include both the peer-reviewed literature and government reports, and focus on four significant air pollutants: carbon monoxide, benzene, nitrogen oxides, and particulate matter (including both ultrafine particle counts and fine particle mass). From the identified studies, we extracted information about significant factors that would be hypothesized to influence the spatial extent within the study, such as the study type (e.g., monitoring, air dispersion modeling, GIS-based epidemiological studies), focus on concentrations or health risks, pollutant under study, background concentration, emission rate, and meteorological factors, as well as the study's implicit or explicit definition of spatial extent. We supplement this meta-analysis with results from some illustrative atmospheric dispersion modeling. Results We found that pollutant characteristics and background concentrations best explained the variability in previously published spatial extent estimates, with a modifying influence of local meteorology, once some extreme values based on health risk estimates were removed from the analysis. As hypothesized, inert pollutants with high background concentrations had the largest spatial extent (often demonstrating no significant gradient), and pollutants formed in near-source chemical reactions (e.g., nitrogen dioxide) had a larger spatial extent than pollutants depleted in near-source chemical reactions or removed through coagulation processes (e.g., nitrogen oxide and ultrafine particles). Our illustrative dispersion modeling demonstrated the complex interplay of spatial extent definitions, emission rates

  19. B1 -sensitivity analysis of quantitative magnetization transfer imaging.

    Science.gov (United States)

    Boudreau, Mathieu; Stikov, Nikola; Pike, G Bruce

    2018-01-01

    To evaluate the sensitivity of quantitative magnetization transfer (qMT) fitted parameters to B1 inaccuracies, focusing on the difference between two categories of T1 mapping techniques: B1-independent and B1-dependent. The B1-sensitivity of qMT was investigated and compared using two T1 measurement methods: inversion recovery (IR; B1-independent) and variable flip angle (VFA; B1-dependent). The study was separated into four stages: 1) numerical simulations, 2) sensitivity analysis of the Z-spectra, 3) healthy subjects at 3T, and 4) comparison using three different B1 imaging techniques. For typical B1 variations in the brain at 3T (±30%), the simulations resulted in errors of the pool-size ratio (F) ranging from -3% to 7% for VFA, and -40% to >100% for IR, agreeing with the Z-spectra sensitivity analysis. In healthy subjects, pooled whole-brain Pearson correlation coefficients for F (comparing measured double-angle and nominal flip-angle B1 maps) were ρ = 0.97/0.81 for VFA/IR. This work describes the B1-sensitivity characteristics of qMT, demonstrating that they depend substantially on the B1-dependency of the T1 mapping method. In particular, the pool-size ratio is more robust against B1 inaccuracies if VFA T1 mapping is used, so much so that B1 mapping could be omitted without substantially biasing F. Magn Reson Med 79:276-285, 2018. © 2017 International Society for Magnetic Resonance in Medicine.

  20. Quantitative Gait Analysis in Patients with Huntington’s Disease

    Directory of Open Access Journals (Sweden)

    Seon Jong Pyo

    2017-09-01

    Full Text Available Objective Gait disturbance is the main factor contributing to a negative impact on quality of life in patients with Huntington's disease (HD). Understanding gait features in patients with HD is essential for planning a successful gait strategy. The aim of this study was to investigate temporospatial gait parameters in patients with HD compared with healthy controls. Methods We investigated 7 patients with HD. Diagnosis was confirmed by genetic analysis, and patients were evaluated with the Unified Huntington's Disease Rating Scale (UHDRS). Gait features were assessed with a gait analyzer, and the results for patients with HD were compared with those of 7 age- and sex-matched normal controls. Results Step length and stride length were decreased and the base of support was increased in the HD group compared to the control group. In addition, the coefficients of variability for step and stride length were increased in the HD group. The HD group showed slower walking velocity, an increased stance phase and decreased swing phase in the gait cycle, and a decreased proportion of single support time compared to the control group. Cadence did not differ significantly between groups. Among the UHDRS subscores, total motor score and total behavior score were positively correlated with step length, and total behavior score was positively correlated with walking velocity in patients with HD. Conclusion Increased variability in step and stride length, slower walking velocity, increased stance phase, and decreased swing phase and single support time with preserved cadence suggest that HD gait patterns are slow, ataxic and ineffective. This study suggests that quantitative gait analysis is needed to assess gait problems in HD.
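
    The temporospatial parameters and variability coefficients discussed above are straightforward to compute from per-stride measurements; a sketch with invented values (Python):

        import numpy as np

        # Hypothetical per-stride measurements for one walk (lengths in cm, times in s)
        stride_lengths = np.array([98.0, 95.5, 101.2, 93.8, 99.0, 96.4])
        stride_times = np.array([1.10, 1.15, 1.08, 1.18, 1.12, 1.14])

        velocity = stride_lengths.sum() / stride_times.sum()        # cm/s
        cadence = 2 * 60.0 / stride_times.mean()                    # steps/min (2 steps/stride)
        cv_stride = 100 * stride_lengths.std(ddof=1) / stride_lengths.mean()  # variability, %

        print(f"velocity {velocity:.1f} cm/s, cadence {cadence:.0f} steps/min, "
              f"CV {cv_stride:.1f}%")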

  1. Variability of apparently homogeneous soilscapes in São Paulo state, Brazil: I. spatial analysis

    Directory of Open Access Journals (Sweden)

    M. van Den Berg

    2000-06-01

    Full Text Available The spatial variability of strongly weathered soils under sugarcane and soybean/wheat rotation was quantitatively assessed on 33 fields in two regions in São Paulo State, Brazil: Araras (15 fields with sugarcane) and Assis (11 fields with sugarcane and seven fields with soybean/wheat rotation). Statistical methods used were: nested analysis of variance (for 11 fields), semivariance analysis, and analysis of variance within and between fields. Spatial levels from 50 m to several km were analyzed. Results are discussed with reference to a previously published study carried out in the surroundings of Passo Fundo (RS). Similar variability patterns were found for clay content, organic C content and cation exchange capacity. The fields studied are quite homogeneous with respect to these relatively stable soil characteristics. Spatial variability of other characteristics (resin-extractable P, pH, base and Al saturation, and also soil colour) varies with region and/or land use management. Soil management for sugarcane seems to have induced modifications to greater depths than for soybean/wheat rotation. Surface layers of soils under soybean/wheat present relatively little variation, apparently as a result of very intensive soil management. The major part of within-field variation occurs at short distances (< 50 m) in all study areas. Hence, little extra information would be gained by increasing sampling density from, say, 1/km² to 1/50 m². For many purposes, the soils in the study regions can be mapped with the same observation density, but residual variance will not be the same in all areas. Bulk sampling may help to reveal spatial patterns between 50 and 1,000 m.
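
    A minimal sketch of the empirical semivariogram underlying such a semivariance analysis (Python/NumPy; the sample coordinates and clay contents are simulated stand-ins):

        import numpy as np

        def empirical_semivariogram(coords, values, lags, tol):
            """Classical semivariance: gamma(h) = mean of 0.5*(z_i - z_j)^2 over
            point pairs whose separation falls within each lag bin h +/- tol."""
            d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
            sq = 0.5 * (values[:, None] - values[None, :]) ** 2
            iu = np.triu_indices(len(values), k=1)   # count each pair once
            d, sq = d[iu], sq[iu]
            return [sq[(d > h - tol) & (d <= h + tol)].mean() for h in lags]

        # Hypothetical clay-content samples (coords in m, values in %)
        rng = np.random.default_rng(0)
        coords = rng.uniform(0, 1000, size=(80, 2))
        values = rng.normal(35, 4, size=80)
        gamma = empirical_semivariogram(coords, values, lags=[50, 100, 200, 400], tol=25)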

  2. Investigation of Spatial Data with Open Source Social Network Analysis and Geographic Information Systems Applications

    Science.gov (United States)

    Sabah, L.; Şimşek, M.

    2017-11-01

    Social networks are the real social experience of individuals in the online environment, where people use symbolic gestures and mimics and share thoughts and content. Social network analysis is the visualization of complex and large quantities of data so that an overall picture emerges; it comprises the understanding, development, and quantitative and qualitative analysis of the relations in social networks by means of graph theory. Social networks are expressed in the form of nodes and edges: nodes are people or organizations, and edges are relationships between nodes. Relations can be directed or undirected, weighted or unweighted. The purpose of this study is to examine what social networks contribute to the evaluation of person data with spatial coordinates. To this end, the study examines the cluster size and the geographical extent of the area influenced by individuals' placements, based on the frequently used place (check-in) feature of social networks.
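
    A minimal sketch of a spatial social network as nodes with coordinates and weighted edges, using the networkx library (names, coordinates and weights are invented):

        import networkx as nx

        # Minimal spatial social network: nodes are people with (lat, lon)
        # attributes, edges are relationships weighted by interaction frequency
        G = nx.Graph()
        G.add_node("ana", pos=(40.78, 29.92))
        G.add_node("bora", pos=(40.76, 29.94))
        G.add_node("cem", pos=(41.01, 28.98))
        G.add_edge("ana", "bora", weight=5)
        G.add_edge("bora", "cem", weight=1)

        degree_centrality = nx.degree_centrality(G)   # who is most connected
        positions = nx.get_node_attributes(G, "pos")  # coordinates for mapping in a GIS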

  3. A Spatial Analysis of Tourism Activity in Romania

    Directory of Open Access Journals (Sweden)

    Daniela Luminita Constantin

    2018-02-01

    Full Text Available Location is a key concept in tourism sector analysis, given the dependence of this activity on the natural, built, cultural and social characteristics of a certain territory. As a result, tourist zoning is an important instrument for delimiting tourist areas in accordance with multiple criteria, so as to lay the foundations for finding the most suitable ways of making the best use of the resources in this field. The modern approaches proposed in this paper use a series of analytical tools that combine GIS and spatial agglomeration analysis techniques. They can also be employed to examine and explain the differences between tourist zones (and sub-zones) in terms of economic and social results, and thus to suggest realistic ways to improve the efficiency and effectiveness of tourist activities in various geographical areas. In this context, the paper proposes an interdisciplinary perspective (spatial statistics and Geographical Information Systems) for analysing tourism activity in Romania, mainly aiming to identify the agglomerations of companies acting in this industry and to assess their performance and contribution to the economic development of the corresponding regions. It also intends to contribute to a better understanding of the way in which tourism-related business activities develop, in order to enhance appropriate support networks. Territorial and spatial statistics, as well as GIS-based analyses, are applied, using data about all companies acting in the tourism industry in Romania provided by the National Authority for Tourism, as well as data from the Environmental Systems Research Institute (ESRI).

  4. [Spatial analysis of childhood obesity and overweight in Peru, 2014].

    Science.gov (United States)

    Hernández-Vásquez, Akram; Bendezú-Quispe, Guido; Díaz-Seijas, Deysi; Santero, Marilina; Minckas, Nicole; Azañedo, Diego; Antiporta, Daniel A

    2016-01-01

    To estimate regional prevalence and identify spatial patterns in the degree of overweight and obesity, by district, in children under five years of age in Peru during 2014. Analysis of the information reported by the Nutritional Status Information System (SIEN) on the number of cases of overweight and obesity in children under five years recorded during 2014. Regional prevalences of overweight and obesity, and their respective 95% confidence intervals, were calculated. Moran's index was used to determine patterns of clustering of districts with high prevalence of overweight and/or obesity. Data from 1834 districts and 2,318,980 children under five years were analyzed. 158,738 cases (6.84%; 95% CI: 6.81 to 6.87) were overweight, while 56,125 (2.42%; 95% CI: 2.40 to 2.44) were obese. The highest prevalences of overweight were identified in the regions of Tacna (13.9%), Moquegua (11.8%), Callao (10.4%), Lima (10.2%) and Ica (9.3%), and the same regions ranked highest for obesity with 5.3%, 4.3%, 4.0%, 4.0% and 3.8%, respectively. The spatial analysis found clustering of high-prevalence districts in 10% of all districts for both overweight and obesity, identifying 199 districts for overweight (126 urban and 73 rural) and 184 for obesity (136 urban and 48 rural). The highest prevalences of overweight and obesity were identified in the Peruvian coast regions. Moreover, these regions predominantly exhibited spatial clustering of districts with high prevalence of overweight and obesity.
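
    The clustering statistic named above, Moran's index, can be computed directly from a prevalence vector and a spatial weights matrix. Below is a minimal numpy sketch with a hypothetical four-district toy example (the study itself used adjacency among 1834 districts).

    ```python
    import numpy as np

    def morans_i(x, w):
        """Global Moran's I for values x under spatial weights matrix w."""
        x = np.asarray(x, dtype=float)
        z = x - x.mean()                       # deviations from the mean
        num = (w * np.outer(z, z)).sum()       # sum_ij w_ij * z_i * z_j
        return len(x) / w.sum() * num / (z ** 2).sum()

    # Hypothetical toy example: 4 districts in a chain, overweight prevalence (%).
    prev = [13.9, 11.8, 4.0, 3.8]
    W = np.array([[0, 1, 0, 0],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [0, 0, 1, 0]], dtype=float)
    print(morans_i(prev, W))   # values > 0 suggest clustering of similar prevalences
    ```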

  5. Quantitative image analysis of cellular heterogeneity in breast tumors complements genomic profiling.

    Science.gov (United States)

    Yuan, Yinyin; Failmezger, Henrik; Rueda, Oscar M; Ali, H Raza; Gräf, Stefan; Chin, Suet-Feung; Schwarz, Roland F; Curtis, Christina; Dunning, Mark J; Bardwell, Helen; Johnson, Nicola; Doyle, Sarah; Turashvili, Gulisa; Provenzano, Elena; Aparicio, Sam; Caldas, Carlos; Markowetz, Florian

    2012-10-24

    Solid tumors are heterogeneous tissues composed of a mixture of cancer and normal cells, which complicates the interpretation of their molecular profiles. Furthermore, tissue architecture is generally not reflected in molecular assays, rendering this rich information underused. To address these challenges, we developed a computational approach based on standard hematoxylin and eosin-stained tissue sections and demonstrated its power in a discovery and validation cohort of 323 and 241 breast tumors, respectively. To deconvolute cellular heterogeneity and detect subtle genomic aberrations, we introduced an algorithm based on tumor cellularity to increase the comparability of copy number profiles between samples. We next devised a predictor for survival in estrogen receptor-negative breast cancer that integrated both image-based and gene expression analyses and significantly outperformed classifiers that use single data types, such as microarray expression signatures. Image processing also allowed us to describe and validate an independent prognostic factor based on quantitative analysis of spatial patterns between stromal cells, which are not detectable by molecular assays. Our quantitative, image-based method could benefit any large-scale cancer study by refining and complementing molecular assays of tumor samples.

  6. Spatially explicit multi-criteria decision analysis for managing vector-borne diseases

    Science.gov (United States)

    2011-01-01

    The complex epidemiology of vector-borne diseases creates significant challenges in the design and delivery of prevention and control strategies, especially in light of rapid social and environmental changes. Spatial models for predicting disease risk based on environmental factors such as climate and landscape have been developed for a number of important vector-borne diseases. The resulting risk maps have proven value for highlighting areas for targeting public health programs. However, these methods generally only offer technical information on the spatial distribution of disease risk itself, which may be incomplete for making decisions in a complex situation. In prioritizing surveillance and intervention strategies, decision-makers often also need to consider spatially explicit information on other important dimensions, such as the regional specificity of public acceptance, population vulnerability, resource availability, intervention effectiveness, and land use. There is a need for a unified strategy for supporting public health decision making that integrates available data for assessing spatially explicit disease risk, with other criteria, to implement effective prevention and control strategies. Multi-criteria decision analysis (MCDA) is a decision support tool that allows for the consideration of diverse quantitative and qualitative criteria using both data-driven and qualitative indicators for evaluating alternative strategies with transparency and stakeholder participation. Here we propose a MCDA-based approach to the development of geospatial models and spatially explicit decision support tools for the management of vector-borne diseases. We describe the conceptual framework that MCDA offers as well as technical considerations, approaches to implementation and expected outcomes. We conclude that MCDA is a powerful tool that offers tremendous potential for use in public health decision-making in general and vector-borne disease management in particular.

  7. Spatially explicit multi-criteria decision analysis for managing vector-borne diseases

    Directory of Open Access Journals (Sweden)

    Hongoh Valerie

    2011-12-01

    Full Text Available Abstract The complex epidemiology of vector-borne diseases creates significant challenges in the design and delivery of prevention and control strategies, especially in light of rapid social and environmental changes. Spatial models for predicting disease risk based on environmental factors such as climate and landscape have been developed for a number of important vector-borne diseases. The resulting risk maps have proven value for highlighting areas for targeting public health programs. However, these methods generally only offer technical information on the spatial distribution of disease risk itself, which may be incomplete for making decisions in a complex situation. In prioritizing surveillance and intervention strategies, decision-makers often also need to consider spatially explicit information on other important dimensions, such as the regional specificity of public acceptance, population vulnerability, resource availability, intervention effectiveness, and land use. There is a need for a unified strategy for supporting public health decision making that integrates available data for assessing spatially explicit disease risk, with other criteria, to implement effective prevention and control strategies. Multi-criteria decision analysis (MCDA) is a decision support tool that allows for the consideration of diverse quantitative and qualitative criteria using both data-driven and qualitative indicators for evaluating alternative strategies with transparency and stakeholder participation. Here we propose a MCDA-based approach to the development of geospatial models and spatially explicit decision support tools for the management of vector-borne diseases. We describe the conceptual framework that MCDA offers as well as technical considerations, approaches to implementation and expected outcomes. We conclude that MCDA is a powerful tool that offers tremendous potential for use in public health decision-making in general and vector-borne disease management in particular.
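
    A common quantitative core of MCDA is a weighted aggregation of normalized criterion scores for each alternative strategy. The sketch below uses hypothetical criteria, scores, and weights to illustrate the weighted-sum variant; real applications would elicit weights from stakeholders and may use other aggregation rules.

    ```python
    import numpy as np

    # Rows: intervention strategies; columns: criteria (disease risk reduction,
    # public acceptance, cost-effectiveness). All values are hypothetical and
    # assumed to be normalized to [0, 1].
    scores = np.array([[0.8, 0.4, 0.6],    # strategy A
                       [0.5, 0.9, 0.7],    # strategy B
                       [0.6, 0.6, 0.9]])   # strategy C
    weights = np.array([0.5, 0.2, 0.3])    # stakeholder-elicited, summing to 1

    composite = scores @ weights           # weighted-sum score per strategy
    for name, s in zip("ABC", composite):
        print(f"strategy {name}: {s:.3f}")
    print("preferred:", "ABC"[int(np.argmax(composite))])
    ```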

  8. Quantitative Analysis of the Effect of Iterative Reconstruction Using a Phantom: Determining the Appropriate Blending Percentage

    Science.gov (United States)

    Kim, Hyun Gi; Lee, Young Han; Choi, Jin-Young; Park, Mi-Suk; Kim, Myeong-Jin; Kim, Ki Whang

    2015-01-01

    Purpose To investigate the optimal blending percentage of adaptive statistical iterative reconstruction (ASIR) in a reduced radiation dose while preserving a degree of image quality and texture that is similar to that of standard-dose computed tomography (CT). Materials and Methods The CT performance phantom was scanned with standard and dose reduction protocols, including reduced mAs or kVp. Image quality parameters including noise, spatial resolution, and low-contrast resolution, as well as image texture, were quantitatively evaluated after applying various blending percentages of ASIR. The optimal blending percentage of ASIR that preserved image quality and texture compared to standard-dose CT was investigated for each radiation dose reduction protocol. Results As the percentage of ASIR increased, noise and spatial resolution decreased, whereas low-contrast resolution increased. In the texture analysis, an increasing percentage of ASIR resulted in increases in angular second moment, inverse difference moment, and correlation, and in decreases in contrast and entropy. The 20% and 40% dose reduction protocols with 20% and 40% ASIR blending, respectively, resulted in an optimal quality of images with preservation of the image texture. Conclusion Blending 40% ASIR into the protocol with a 40% reduction in tube current-time product can maximize radiation dose reduction and preserve adequate image quality and texture. PMID:25510772
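
    The texture parameters listed (angular second moment, inverse difference moment, correlation, contrast, entropy) are classic gray-level co-occurrence matrix (GLCM) features. Below is a minimal sketch, assuming scikit-image (whose function names have varied across versions) and a synthetic image standing in for the phantom CT data.

    ```python
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops  # 'greycomatrix' in older releases

    rng = np.random.default_rng(0)
    img = rng.integers(0, 64, size=(128, 128), dtype=np.uint8)  # stand-in for a CT slice

    glcm = graycomatrix(img, distances=[1], angles=[0], levels=64,
                        symmetric=True, normed=True)
    asm      = graycoprops(glcm, "ASM")[0, 0]          # angular second moment
    idm      = graycoprops(glcm, "homogeneity")[0, 0]  # inverse difference moment
    corr     = graycoprops(glcm, "correlation")[0, 0]
    contrast = graycoprops(glcm, "contrast")[0, 0]
    p = glcm[:, :, 0, 0]
    entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))    # not provided by graycoprops
    print(asm, idm, corr, contrast, entropy)
    ```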

  9. The spatial kinetic analysis of accelerator-driven subcritical reactor

    International Nuclear Information System (INIS)

    Takahashi, H.; An, Y.; Chen, X.

    1998-02-01

    The operation of an accelerator-driven reactor under subcritical conditions allows a more flexible choice of reactor materials and design parameters. A deep subcriticality is sometimes chosen on the basis of point-kinetics analysis. When a large reactor is operated in a deeply subcritical condition using a localized spallation source, the power distribution has strong spatial dependence, and point kinetics does not provide a proper analysis for reactor safety. In order to analyze the spatially and energy-dependent kinetic behavior of the subcritical reactor, the authors developed a computation code composed of two parts: the first creates the group cross sections, and the second solves the multi-group kinetic diffusion equations. The reactor parameters, such as the cross sections for fission, scattering, and energy transfer among the several energy groups and regions, are calculated using a code modified from the Monte Carlo codes MCNPA and LAHET instead of the usual analytical method of the ANISN and TWOTRAN codes. Thus the complicated geometry of the accelerator-driven reactor core can be taken into account precisely. The authors analyzed the subcritical minor actinide transmutor studied by the Japan Atomic Energy Research Institute (JAERI) using the code.

  10. Spatial and temporal analysis of postural control in dyslexic children.

    Science.gov (United States)

    Gouleme, Nathalie; Gerard, Christophe Loic; Bui-Quoc, Emmanuel; Bucci, Maria Pia

    2015-07-01

    The aim of this study is to examine postural control of dyslexic children using both spatial and temporal analysis. Thirty dyslexic (mean age 9.7 ± 0.3 years) and thirty non-dyslexic age-matched children participated in the study. Postural stability was evaluated using Multitest Equilibre from Framiral®. Posture was recorded in the following conditions: eyes open fixating a target (EO) and eyes closed (EC) on stable (-S-) and unstable (-U-) platforms. The findings of this study showed poor postural stability in dyslexic children with respect to the non-dyslexic group, as demonstrated by both spatial and temporal analysis. In both groups of children, postural control depends on the condition and improves when the eyes are open on a stable platform. Dyslexic children have spectral power indices that are higher than in non-dyslexic children, and they showed a shorter cancelling time. Poor postural control in dyslexic children could be due to a deficit in using sensory information, most likely caused by impairment in cerebellar activity. Impaired brain activation patterns, namely in the use of sensory input and in cerebellar activity, may explain the deficit in postural control in dyslexic children. Copyright © 2014 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.

  11. Modern Spatial Rainfall Rate is well Correlated with Coretop δ2Hdinosterol in the South Pacific Convergence Zone: A Tool for Quantitative Reconstructions of Rainfall Rate

    Science.gov (United States)

    Sear, D. A.; Maloney, A. E.; Nelson, D. B.; Sachs, J. P.; Hassall, J. D.; Langdon, P. G.; Prebble, M.; Richey, J. N.; Schabetsberger, R.; Sichrowsky, U.; Hope, G.

    2015-12-01

    The South Pacific Convergence Zone (SPCZ) is the Southern Hemisphere's most prominent precipitation feature, extending southeastward 3000 km from Papua New Guinea to French Polynesia. Determining how the SPCZ responded to climate variations before the instrumental record requires the use of indirect indicators of rainfall. The link between the hydrogen isotopic composition of fluxes of water through the hydrologic cycle, lake water, and molecular fossil 2H/1H ratios makes hydrogen isotopes a promising tool for improving our understanding of this important climate feature. An analysis of coretop sediment from freshwater lakes in the SPCZ region indicates that there is a strong spatial relationship between δ2Hdinosterol and mean annual precipitation rate. The objectives of this research are to use 2H/1H ratios of the biomarker dinosterol to develop an empirical relationship between δ2Hdinosterol and modern environmental rainfall rates so that we may quantitatively reconstruct several aspects of the SPCZ's hydrological system during the late Holocene. The analysis includes lake sediment coretops from the Solomon Islands, Wallis Island, Vanuatu, Tahiti, Samoa, New Caledonia, and the Cook Islands. These islands span a range of average modern precipitation rates from 3 to 7 mm/day, and the coretop sediment δ2Hdinosterol values range from -240‰ to -320‰. Applying this regional coretop calibration to dated sediment cores reveals that the mean annual position and/or intensity of the SPCZ has not been static during the past 2000 years.

  12. Modern Spatial Rainfall Rate is well Correlated with Coretop δ2Hdinosterol in the South Pacific Convergence Zone: a Tool for Quantitative Reconstructions

    Science.gov (United States)

    Maloney, A. E.; Nelson, D. B.; Sachs, J. P.; Hassall, J. D.; Sear, D. A.; Langdon, P. G.; Prebble, M.; Richey, J. N.; Schabetsberger, R.; Sichrowsky, U.; Hope, G.

    2016-02-01

    The South Pacific Convergence Zone (SPCZ) is the Southern Hemisphere's most prominent precipitation feature, extending southeastward 3000 km from Papua New Guinea to French Polynesia. Determining how the SPCZ responded to climate variations before the instrumental record requires the use of indirect indicators of rainfall. The link between the hydrogen isotopic composition of water fluxes through the hydrologic cycle, lake water, and molecular fossil 2H/1H ratios makes hydrogen isotopes a promising tool for improving our understanding of this important climate feature. An analysis of coretop sediment from freshwater lakes in the SPCZ region indicates that there is a strong spatial relationship between δ2Hdinosterol and mean annual precipitation rate. The objectives of this research are to use 2H/1H ratios of the biomarker dinosterol to develop an empirical relationship between δ2Hdinosterol and modern environmental rainfall rates so that we may quantitatively reconstruct several aspects of the SPCZ's hydrological system during the late Holocene. The analysis includes lake sediment coretops from the Solomon Islands, Wallis Island, Vanuatu, Tahiti, Samoa, New Caledonia, and the Cook Islands. These islands span a range of average modern precipitation rates from 3 to 7 mm/day, and the coretop sediment δ2Hdinosterol values range from -240‰ to -320‰. Applying this regional coretop calibration to dated sediment cores reveals that the mean annual position and/or intensity of the SPCZ has not been static during the past 2000 years.
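
    The regional coretop calibration described in these two records amounts to an empirical regression between δ2Hdinosterol and mean precipitation rate, which is then inverted for downcore samples. Below is a minimal sketch with hypothetical coretop values consistent with the stated ranges (3-7 mm/day, -320 to -240 permil).

    ```python
    import numpy as np

    # Hypothetical coretop calibration data (rain in mm/day, d2H in permil).
    rain = np.array([3.0, 4.0, 4.5, 5.5, 6.0, 7.0])
    d2h  = np.array([-315.0, -295.0, -285.0, -270.0, -260.0, -245.0])

    # Least-squares calibration: d2H = slope * rain + intercept.
    slope, intercept = np.polyfit(rain, d2h, 1)

    def rain_from_d2h(d2h_downcore):
        """Invert the calibration: reconstruct rainfall rate from a downcore d2H value."""
        return (d2h_downcore - intercept) / slope

    print(f"slope = {slope:.1f} permil per mm/day")
    print(f"reconstructed rain at -280 permil: {rain_from_d2h(-280.0):.1f} mm/day")
    ```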

  13. Quantitative analysis of LISA pathfinder test-mass noise

    International Nuclear Information System (INIS)

    Ferraioli, Luigi; Congedo, Giuseppe; Hueller, Mauro; Vitale, Stefano; Hewitson, Martin; Nofrarias, Miquel; Armano, Michele

    2011-01-01

    LISA Pathfinder (LPF) is a mission aiming to test the critical technology for the forthcoming space-based gravitational-wave detectors. The main scientific objective of the LPF mission is to demonstrate test masses free falling with residual accelerations below 3×10⁻¹⁴ m s⁻²/√Hz at 1 mHz. Reaching such an ambitious target will require a significant amount of system optimization and characterization, which will in turn require accurate and quantitative noise analysis procedures. In this paper, we discuss two main problems associated with the analysis of the data from LPF: i) excess noise detection and ii) noise parameter identification. The mission is focused on the low-frequency region ([0.1, 10] mHz) of the available signal spectrum. In such a region, the signal is dominated by the force noise acting on test masses. At the same time, the mission duration is limited to 90 days and typical data segments will be 24 hours in length. Considering those constraints, noise analysis is expected to deal with a limited amount of non-Gaussian data, since the spectrum statistics will be far from Gaussian and the lowest available frequency is limited by the data length. In this paper, we analyze the details of the expected statistics for spectral data and develop two suitable excess noise estimators. One is based on the statistical properties of the integrated spectrum, the other is based on the Kolmogorov-Smirnov test. The sensitivity of the estimators is discussed theoretically for independent data, then the algorithms are tested on LPF synthetic data. The test on realistic LPF data allows the effect of spectral data correlations on the efficiency of the different noise excess estimators to be highlighted. It also reveals the versatility of the Kolmogorov-Smirnov approach, which can be adapted to provide reasonable results on correlated data from a modified version of the standard equations for the inversion of the test statistic. Closely related to excess noise
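
    One of the two excess-noise estimators mentioned compares the empirical distribution of spectral estimates against a reference via the Kolmogorov-Smirnov test. Below is a minimal sketch, assuming scipy and synthetic spectral samples in place of LPF data.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)

    # Synthetic stand-ins for power spectral estimates in one frequency bin:
    # periodogram bins of Gaussian noise are ~ chi-squared with 2 degrees of freedom.
    reference = rng.chisquare(df=2, size=500)          # nominal noise level
    measured  = 1.2 * rng.chisquare(df=2, size=500)    # 20% excess noise (scale inflated)

    # Two-sample KS test: a small p-value flags a distribution change, i.e. excess noise.
    stat, pvalue = stats.ks_2samp(reference, measured)
    print(f"KS statistic = {stat:.3f}, p-value = {pvalue:.3g}")
    ```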

  14. Qualitative and Quantitative Analysis for US Army Recruiting Input Allocation

    National Research Council Canada - National Science Library

    Brence, John

    2004-01-01

    .... An objective study of the quantitative and qualitative aspects of recruiting is necessary to meet the future needs of the Army, in light of strong possibilities of recruiting resource reduction...

  15. Quantitative analysis of psychological personality for NPP operators

    International Nuclear Information System (INIS)

    Gao Jia; Huang Xiangrui

    1998-01-01

    The author introduces the relevant quantitative psychological personality research carried out by the 'Prognoz' Laboratory and in Taiwan, presents the primary results of research on the psychological personality assessment of Chinese nuclear power plant (NPP) operators, which is based on the MMPI survey, and outlines the main contents of quantitative psychological personality research in Chinese NPPs. The need to carry out psychological selection and training in the nuclear industry is emphasized.

  16. Hydrocarbons on Phoebe, Iapetus, and Hyperion: Quantitative Analysis

    Science.gov (United States)

    Cruikshank, Dale P.; MoreauDalleOre, Cristina; Pendleton, Yvonne J.; Clark, Roger Nelson

    2012-01-01

    We present a quantitative analysis of the hydrocarbon spectral bands measured on three of Saturn's satellites, Phoebe, Iapetus, and Hyperion. These bands, measured with the Cassini Visible-Infrared Mapping Spectrometer on close fly-bys of these satellites, are the C-H stretching modes of aromatic hydrocarbons at approximately 3.28 micrometers (approximately 3050 per centimeter), and four blended bands of aliphatic -CH2- and -CH3 in the range approximately 3.36-3.52 micrometers (approximately 2980-2840 per centimeter). The aromatic band, probably indicating the presence of polycyclic aromatic hydrocarbons (PAH), is unusually strong in comparison to the aliphatic bands, resulting in a unique signature among Solar System bodies measured so far, and as such offers a means of comparison among the three satellites. The ratio of the C-H bands in aromatic molecules to those in aliphatic molecules in the surface materials of Phoebe is NAro:NAliph approximately 24; for Hyperion the value is approximately 12, while Iapetus shows an intermediate value. In view of the trend of the evolution (dehydrogenation by heat and radiation) of aliphatic complexes toward more compact molecules and eventually to aromatics, the relative abundance of aliphatic -CH2- and -CH3- is an indication of the lengths of the molecular chain structures, and hence the degree of modification of the original material. We derive CH2:CH3 approximately 2.2 in the spectrum of low-albedo material on Iapetus; this value is the same, within measurement errors, as the ratio in the diffuse interstellar medium. The similarity in the spectral signatures of the three satellites, plus the apparent weak trend of aromatic/aliphatic abundance from Phoebe to Hyperion, is consistent with, and effectively confirms, that the source of the hydrocarbon-bearing material is Phoebe, and that the appearance of that material on the other two satellites arises from the deposition of the inward-spiraling dust that populates the Phoebe ring.

  17. A qualitative and quantitative analysis of vegetable pricing in supermarket

    Science.gov (United States)

    Miranda, Suci

    2017-06-01

    The purpose of this study is to analyze, qualitatively and quantitatively, the variables affecting the determination of vegetable sale prices that are constant over time in a supermarket. It focuses on non-organic vegetables with a fixed selling price over time, such as spinach, beet, and parsley. In the qualitative analysis, sale price determination is influenced by the vegetable characteristics: (1) vegetable segmentation (low to high daily consumption); and (2) vegetable age (how long the product lasts, related to freshness); both characteristics relate to inventory management and ultimately to the sale price in the supermarket. Quantitatively, the vegetables are divided into two categories: the leafy vegetable group, whose leaves are eaten as a vegetable, with product age a = 0 and shelf life t = 0, and the non-leafy vegetable group with age a = a + 1 and shelf life t = t + 1. A vegetable age of a = 0 means the product lasts only one day after being ordered and must then be discarded, whereas a + 1 means a longer life of more than a day, as for beet, white radish, and string beans. The shelf life refers to how long the product is kept on a supermarket shelf, in line with the vegetable age. Following the cost-plus pricing method with a full-cost approach, production costs, non-production costs, and markup are adjusted differently for each category. A holding cost is added to the sale price of the non-leafy vegetables, while a holding cost of 0 is assumed for the leafy vegetable category. The expected margin of each category is correlated with the vegetable characteristics.
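
    The cost-plus rule described above (production costs plus non-production costs plus markup, with a holding cost term only for the non-leafy category) can be written as a small pricing function. Below is a minimal sketch with hypothetical numbers.

    ```python
    def cost_plus_price(production, non_production, markup_rate,
                        holding_cost_per_day=0.0, shelf_days=0):
        """Cost-plus price; the holding cost term applies only when shelf_days > 0."""
        base = production + non_production + holding_cost_per_day * shelf_days
        return base * (1.0 + markup_rate)

    # Leafy vegetable (a = 0, t = 0): no holding cost.
    print(cost_plus_price(1000, 200, 0.25))
    # Non-leafy vegetable (a = a + 1, t = t + 1): holding cost over a 3-day shelf life.
    print(cost_plus_price(1000, 200, 0.25, holding_cost_per_day=50, shelf_days=3))
    ```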

  18. The relationship between language and spatial ability an analysis of spatial language for reconstructing the solving of spatial tasks

    CERN Document Server

    Mizzi, Angel

    2017-01-01

    This work investigates how different fifth-grade students solve spatial-verbal tasks and the role of language in this process. Based on a synthesis of theoretical foundations and methodological issues supporting the relationship between spatial ability and language, this study examines and classifies the strategies used by students, as well as the obstacles they encounter, when solving spatial tasks with the reconstruction method. Contents: Theoretical Framework; Design and Implementation; Results and Discussion from the Inductive Data Analyses. Target Groups: Scholars and students of mathematics education; teachers of mathematics in primary and secondary schools. About the Author: Angel Mizzi works as a research assistant and lecturer at the University of Duisburg-Essen, where he has successfully completed his PhD studies in mathematics education.

  19. Universal platform for quantitative analysis of DNA transposition

    Directory of Open Access Journals (Sweden)

    Pajunen Maria I

    2010-11-01

    Full Text Available Abstract Background Completed genome projects have revealed an astonishing diversity of transposable genetic elements, implying the existence of novel element families yet to be discovered from diverse life forms. Concurrently, several better understood transposon systems have been exploited as efficient tools in molecular biology and genomics applications. Characterization of new mobile elements and improvement of the existing transposition technology platforms warrant easy-to-use assays for the quantitative analysis of DNA transposition. Results Here we developed a universal in vivo platform for the analysis of transposition frequency with class II mobile elements, i.e., DNA transposons. For each particular transposon system, cloning of the transposon ends and the cognate transposase gene, in three consecutive steps, generates a multifunctional plasmid, which drives inducible expression of the transposase gene and includes a mobilisable lacZ-containing reporter transposon. The assay scores transposition events as blue microcolonies, papillae, growing within otherwise whitish Escherichia coli colonies on indicator plates. We developed the assay using phage Mu transposition as a test model and validated the platform using various MuA transposase mutants. For further validation and to illustrate universality, we introduced IS903 transposition system components into the assay. The developed assay is adjustable to a desired level of initial transposition via the control of a plasmid-borne E. coli arabinose promoter. In practice, the transposition frequency is modulated by varying the concentration of arabinose or glucose in the growth medium. We show that variable levels of transpositional activity can be analysed, thus enabling straightforward screens for hyper- or hypoactive transposase mutants, regardless of the original wild-type activity level. Conclusions The established universal papillation assay platform should be widely applicable to a

  20. Quantitative PCR analysis of salivary pathogen burden in periodontitis

    Science.gov (United States)

    Salminen, Aino; Kopra, K. A. Elisa; Hyvärinen, Kati; Paju, Susanna; Mäntylä, Päivi; Buhlin, Kåre; Nieminen, Markku S.; Sinisalo, Juha; Pussinen, Pirkko J.

    2015-01-01

    Our aim was to investigate the value of salivary concentrations of four major periodontal pathogens and their combination in diagnostics of periodontitis. The Parogene study included 462 dentate subjects (mean age 62.9 ± 9.2 years) with coronary artery disease (CAD) diagnosis who underwent an extensive clinical and radiographic oral examination. Salivary levels of four major periodontal bacteria were measured by quantitative real-time PCR (qPCR). Median salivary concentrations of Porphyromonas gingivalis, Tannerella forsythia, and Prevotella intermedia, as well as the sum of the concentrations of the four bacteria, were higher in subjects with moderate to severe periodontitis compared to subjects with no to mild periodontitis. Median salivary Aggregatibacter actinomycetemcomitans concentrations did not differ significantly between the subjects with no to mild periodontitis and subjects with moderate to severe periodontitis. In logistic regression analysis adjusted for age, gender, diabetes, and the number of teeth and implants, high salivary concentrations of P. gingivalis, T. forsythia, and P. intermedia were significantly associated with moderate to severe periodontitis. When looking at different clinical and radiographic parameters of periodontitis, high concentrations of P. gingivalis and T. forsythia were significantly associated with the number of 4–5 mm periodontal pockets, ≥6 mm pockets, and alveolar bone loss (ABL). High level of T. forsythia was associated also with bleeding on probing (BOP). The combination of the four bacteria, i.e., the bacterial burden index, was associated with moderate to severe periodontitis with an odds ratio (OR) of 2.40 (95% CI 1.39–4.13). When A. actinomycetemcomitans was excluded from the combination of the bacteria, the OR was improved to 2.61 (95% CI 1.51–4.52). The highest OR 3.59 (95% CI 1.94–6.63) was achieved when P. intermedia was further excluded from the combination and only the levels of P. gingivalis and
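
    The odds ratios above come from logistic regression adjusted for covariates; the OR for a binary high-burden indicator is the exponential of its fitted coefficient. Below is a minimal sketch, assuming statsmodels and simulated data in place of the Parogene cohort.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    n = 400
    high_burden = rng.integers(0, 2, n)        # 1 = high salivary bacterial burden
    age = rng.normal(63, 9, n)                 # covariate, as in the adjusted model
    logit = -1.0 + 0.9 * high_burden + 0.02 * (age - 63)
    periodontitis = rng.random(n) < 1 / (1 + np.exp(-logit))

    X = sm.add_constant(np.column_stack([high_burden, age]))
    fit = sm.Logit(periodontitis.astype(float), X).fit(disp=False)
    or_burden = np.exp(fit.params[1])          # odds ratio for high burden
    ci = np.exp(fit.conf_int()[1])             # 95% CI on that odds ratio
    print(f"OR = {or_burden:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
    ```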

  1. Defensible Spaces in Philadelphia: Exploring Neighborhood Boundaries Through Spatial Analysis

    Directory of Open Access Journals (Sweden)

    Rory Kramer

    2017-02-01

    Full Text Available Few spatial scales are as important to individual outcomes as the neighborhood. However, it is nearly impossible to define neighborhoods in a generalizable way. This article proposes that by shifting the focus to measuring neighborhood boundaries rather than neighborhoods, scholars can avoid the problem of the indefinable neighborhood and better approach questions of what predicts racial segregation across areas. By quantifying an externality space theory of neighborhood boundaries, this article introduces a novel form of spatial analysis to test where potential physical markers of neighborhood boundaries (major roads, rivers, railroads, and the like) are associated with persistent racial boundaries between 1990 and 2010. Using Philadelphia as a case study, the paper identifies neighborhoods with persistent racial boundaries. It theorizes that local histories of white reactions to black in-migration explain which boundaries persistently resisted racial turnover, unlike the majority of Philadelphia’s neighborhoods, and that those racial boundaries shape the location, progress, and reaction to new residential development in those neighborhoods.

  2. Quantitative analysis of geomorphic processes using satellite image data at different scales

    Science.gov (United States)

    Williams, R. S., Jr.

    1985-01-01

    When aerial and satellite photographs and images are used in the quantitative analysis of geomorphic processes, either through direct observation of active processes or by analysis of landforms resulting from inferred active or dormant processes, a number of limitations in the use of such data must be considered. Active geomorphic processes work at different scales and rates. Therefore, the capability of imaging an active or dormant process depends primarily on the scale of the process and the spatial-resolution characteristic of the imaging system. Scale is an important factor in recording continuous and discontinuous active geomorphic processes, because what is not recorded will not be considered or even suspected in the analysis of orbital images. If the geomorphic process, or the landform change caused by the process, is less than 200 m in x and y dimensions, then it will not be recorded. Although the scale factor is critical in the recording of discontinuous active geomorphic processes, the repeat interval of orbital-image acquisition of a planetary surface is also a consideration, in order to capture a recurring short-lived geomorphic process or to record changes caused by either a continuous or a discontinuous geomorphic process.

  3. Quantitative diagnosis of bladder cancer by morphometric analysis of HE images

    Science.gov (United States)

    Wu, Binlin; Nebylitsa, Samantha V.; Mukherjee, Sushmita; Jain, Manu

    2015-02-01

    In clinical practice, histopathological analysis of biopsied tissue is the main method for bladder cancer diagnosis and prognosis. The diagnosis is performed by a pathologist based on the morphological features in the image of a hematoxylin and eosin (HE) stained tissue sample. This manuscript proposes algorithms to perform morphometric analysis on the HE images, quantify the features in the images, and discriminate bladder cancers with different grades, i.e. high grade and low grade. The nuclei are separated from the background and other types of cells such as red blood cells (RBCs) and immune cells using manual outlining, color deconvolution and image segmentation. A mask of nuclei is generated for each image for quantitative morphometric analysis. The features of the nuclei in the mask image including size, shape, orientation, and their spatial distributions are measured. To quantify local clustering and alignment of nuclei, we propose a 1-nearest-neighbor (1-NN) algorithm which measures nearest neighbor distance and nearest neighbor parallelism. The global distributions of the features are measured using statistics of the proposed parameters. A linear support vector machine (SVM) algorithm is used to classify the high grade and low grade bladder cancers. The results show using a particular group of nuclei such as large ones, and combining multiple parameters can achieve better discrimination. This study shows the proposed approach can potentially help expedite pathological diagnosis by triaging potentially suspicious biopsies.
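
    The proposed 1-NN statistics, nearest-neighbor distance and nearest-neighbor parallelism, can be computed from nucleus centroids and orientations with a KD-tree. Below is a minimal sketch, assuming scipy and random points standing in for segmented nuclei.

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    rng = np.random.default_rng(3)
    centroids = rng.uniform(0, 512, size=(200, 2))   # nucleus centers (pixels)
    angles = rng.uniform(0, np.pi, size=200)         # nucleus orientations (radians)

    tree = cKDTree(centroids)
    dist, idx = tree.query(centroids, k=2)           # k=2: each point plus its nearest neighbor
    nn_dist = dist[:, 1]                             # 1-NN distance per nucleus
    nn_idx = idx[:, 1]

    # Parallelism: angular difference to the nearest neighbor (orientation is mod pi).
    dtheta = np.abs(angles - angles[nn_idx])
    parallelism = np.minimum(dtheta, np.pi - dtheta)

    print(f"median 1-NN distance: {np.median(nn_dist):.1f} px")
    print(f"median 1-NN angular difference: {np.degrees(np.median(parallelism)):.1f} deg")
    ```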

  4. Quantitative analysis of oxygen depth distribution by means of deuteron reaction

    International Nuclear Information System (INIS)

    Dyumin, A.N.; Eremin, V.K.; Konnikov, S.G.

    1993-01-01

    The possibilities of using the deuteron reaction for the quantitative determination of depth profiles of the oxygen distribution in HTSC structures, in layers up to 10⁴ Å, are investigated and realized experimentally. It is concluded that a spatial resolution of 150 Å is achieved when profiling the oxygen content in the near-surface layers.

  5. Quantitative analysis of results for quality assurance in radiotherapy

    International Nuclear Information System (INIS)

    Passaro, Bruno Martins

    2011-01-01

    The linear accelerators represent the most important, practical and versatile source of ionizing radiation in radiotherapy. Their functional characteristics influence the geometric and dosimetric accuracy of the therapeutic doses applied to patients. The performance of this equipment may vary due to electronic defects, component failures or mechanical breakdowns, or due to the deterioration and aging of components. Maintaining the quality of care depends on the stability of the accelerators and on the quality control used by institutions to monitor deviations in beam parameters. The aim of this study is to assess and analyze the stability of the calibration factor of linear accelerators, as well as the other dosimetric parameters normally included in a quality control program in radiotherapy. The average calibration factors over a period of approximately four years were (0.998 ± 0.012) for the Clinac 600C and (0.996 ± 0.014) for the Clinac 6EX. For the Clinac 2100CD, the factors were (1.008 ± 0.009) at 6 MV and (1.006 ± 0.010) at 15 MV over a similar four-year period. Statistical analysis of the three linear accelerators showed that the coefficients of variation of the calibration factors were below 2%, demonstrating consistency in the data. By fitting a normal distribution to the calibration factors, we found that for the Clinac 600C and Clinac 2100CD the values are expected to lie within the acceptable limits of TG-142 in more than 90% of cases, while for the Clinac 6EX the expected figure is around 85%, since this accelerator underwent several exchanges of components. The values of TPR20,10 for the three accelerators are practically constant and within the acceptable limits of TG-142. It can be concluded that a detailed quantitative study of the accelerator calibration factor and TPR20,10 data is extremely useful in a quality assurance program. (author)
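
    The two summary statistics used here, the coefficient of variation and the expected fraction of calibration factors inside tolerance, follow directly from a normal model of the measurements. Below is a minimal sketch with the reported Clinac 600C values and an assumed ±2% acceptable range (the abstract does not restate the exact TG-142 limits).

    ```python
    import numpy as np
    from scipy.stats import norm

    mean, sd = 0.998, 0.012          # reported calibration factor, Clinac 600C
    low, high = 0.98, 1.02           # assumed +/-2% acceptable limits

    cv = sd / mean * 100             # coefficient of variation, in percent
    p_within = norm.cdf(high, mean, sd) - norm.cdf(low, mean, sd)

    print(f"CV = {cv:.2f}%")
    print(f"expected fraction within limits = {p_within:.1%}")
    ```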

  6. Integrating the statistical analysis of spatial data in ecology

    Science.gov (United States)

    A. M. Liebhold; J. Gurevitch

    2002-01-01

    In many areas of ecology there is an increasing emphasis on spatial relationships. Often ecologists are interested in new ways of analyzing data with the objective of quantifying spatial patterns, and in designing surveys and experiments in light of the recognition that there may be underlying spatial pattern in biotic responses. In doing so, ecologists have adopted a...

  7. Spatial prediction of landslide hazard using discriminant analysis and GIS

    Science.gov (United States)

    Peter V. Gorsevski; Paul Gessler; Randy B. Foltz

    2000-01-01

    Environmental attributes relevant for the spatial prediction of landslides triggered by rain and snowmelt events were derived from a digital elevation model (DEM). Those data, in conjunction with statistics and a geographic information system (GIS), provided a detailed basis for the spatial prediction of landslide hazard. The spatial prediction of landslide hazard in this paper is...

  8. The Determinants of VAT Introduction : A Spatial Duration Analysis

    NARCIS (Netherlands)

    Cizek, P.; Lei, J.; Ligthart, J.E.

    2012-01-01

    Abstract: Spatial survival models typically constrain frailties, which characterize unobserved heterogeneity, to be spatially correlated. This specification relies heavily on a predetermined covariance structure of the errors. However, the spatial effect may not only exist in the unobserved

  9. A Quantitative Analysis of the Behavioral Checklist of the Movement ABC Motor Test

    Science.gov (United States)

    Ruiz, Luis Miguel; Gomez, Marta; Graupera, Jose Luis; Gutierrez, Melchor; Linaza, Jose Luis

    2007-01-01

    The fifth section of the Henderson and Sugden's Movement ABC Checklist is part of the general Checklist that accompanies The Movement ABC Battery. The authors maintain that the analysis of this section must be mainly qualitative instead of quantitative. The main objective of this study was to employ a quantitative analysis of this behavioural…

  10. Quantitative risk analysis for landslides ‒ Examples from Bíldudalur, NW-Iceland

    Directory of Open Access Journals (Sweden)

    R. Bell

    2004-01-01

    Full Text Available Although various methods to carry out quantitative landslide risk analyses are available, applications are still rare and mostly dependent on the occurrence of disasters. In Iceland, two catastrophic snow avalanches killed 34 people in 1995. As a consequence, the Ministry of the Environment issued a new regulation on hazard zoning due to snow avalanches and landslides in 2000, which aims to prevent people from living or working within the areas most at risk until 2010. The regulation requires landslide and snow avalanche risk analyses to be carried out; however, a method to calculate landslide risk adapted to Icelandic conditions is still missing. Therefore, the ultimate goal of this study is to develop such a method for landslides, focussing on debris flows and rock falls, and to test it in Bíldudalur, NW-Iceland. Risk analysis, besides risk evaluation and risk management, is part of the holistic concept of risk assessment. Within this study, risk analysis only is considered, focussing on the risks to life. To calculate landslide risk, the spatial and temporal probability of occurrence of potential damaging events, as well as the distribution of the elements at risk in space and time, considering also changing vulnerabilities, must be determined. Within this study, a new raster-based approach is developed. Thus, all existing vector data are transferred into raster data using a resolution of 1 m x 1 m. The specific attribute data are attributed to the grid cells, resulting in specific raster data layers for each input parameter. The calculation of the landslide risk follows a function of the input parameters hazard, damage potential of the elements at risk, vulnerability, probability of the spatial impact, probability of the temporal impact and probability of the seasonal occurrence. Finally, results are upscaled to a resolution of 20 m x 20 m and are presented as individual risk to life and object risk to life for each process. Within the quantitative
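
    The stated risk function is a cell-wise combination of co-registered raster layers. Below is a minimal numpy sketch with tiny hypothetical 1 m rasters, multiplied cell by cell and then aggregated to a coarser grid as in the study.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    shape = (40, 40)                              # hypothetical 1 m x 1 m grid

    hazard     = rng.uniform(0.0, 1e-2, shape)    # annual event probability per cell
    p_spatial  = rng.uniform(0.1, 1.0, shape)     # probability of spatial impact
    p_temporal = rng.uniform(0.1, 1.0, shape)     # probability of temporal impact
    p_seasonal = rng.uniform(0.5, 1.0, shape)     # probability of seasonal occurrence
    vulnerab   = rng.uniform(0.0, 1.0, shape)     # vulnerability of elements at risk

    # Individual risk to life: cell-wise product of the input layers.
    risk = hazard * p_spatial * p_temporal * p_seasonal * vulnerab

    # Upscale 1 m cells to a 20 m x 20 m grid by block averaging.
    risk20 = risk.reshape(2, 20, 2, 20).mean(axis=(1, 3))
    print(risk20.shape, risk20.max())
    ```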

  11. Spatially resolved element analysis of historical violin varnishes by use of µPIXE.

    Science.gov (United States)

    von Bohlen, Alex; Röhrs, Stefan; Salomon, Joseph

    2007-02-01

    External µPIXE has been used for the characterisation of small samples of varnish from historical violins, and of pieces of varnished wood from historical and modern stringed instruments. To obtain spatially resolved information about the distribution of elements across the varnish layers, single-spot analysis, line-scans, and area-mapping were performed. A local resolution of approximately 20 µm was obtained from the 3 MeV, 1 nA proton micro-probe. Results from the simultaneous multi-element determination of Na, Mg, Al, Si, P, S, K, Ca, Ti, V, Cr, Mn, Fe, Co, Ni, Cu, Zn, As, Br, Rb, Sr, Ag, Cd, Sn, Ba, and Pb in historical varnishes are presented. A semi-quantitative evaluation of line-scans recorded on diverse historical varnishes is reported. The applied method is discussed in detail, and the results obtained are critically reviewed and compared with those in the literature.

  12. Computer aided approach to qualitative and quantitative common cause failure analysis for complex systems

    International Nuclear Information System (INIS)

    Cate, C.L.; Wagner, D.P.; Fussell, J.B.

    1977-01-01

    Common cause failure analysis, also called common mode failure analysis, is an integral part of a complete system reliability analysis. Existing methods of computer aided common cause failure analysis are extended by allowing analysis of the complex systems often encountered in practice. The methods aid in identifying potential common cause failures and also address quantitative common cause failure analysis.

  13. Quantitative analysis of soluble elements in environmental waters by PIXE

    International Nuclear Information System (INIS)

    Niizeki, T.; Kawasaki, K.; Adachi, M.; Tsuji, M.; Hattori, T.

    1999-01-01

    We have started PIXE research for environmental science at the Van de Graaff accelerator facility of the Tokyo Institute of Technology. Quantitative measurements of soluble fractions in river waters have been carried out using the preconcentration method developed at Tohoku University. We show that this PIXE target preparation can also be applied to waste water samples. (author)

  14. Identification of Case Content with Quantitative Network Analysis

    DEFF Research Database (Denmark)

    Christensen, Martin Lolle; Olsen, Henrik Palmer; Tarissan, Fabian

    2016-01-01

    the relevant articles. In order to enhance information retrieval about case content, without relying on manual labor and subjective judgment, we propose in this paper a quantitative method that gives a better indication of case content in terms of which articles a given case is more closely associated with...

  15. Quantitative analysis of prediction models for hot cracking in ...

    Indian Academy of Sciences (India)

    A RodrМguez-Prieto

    2017-11-16

    enhancing safety margins and adding greater precision to quantitative accident prediction [45]. One deterministic methodology is the stringency level (SL) approach, which is recognized as a valuable decision tool in the selection of standardized materials specifications to prevent potential failures [3].

  16. Analysis association of milk fat and protein percent in quantitative ...

    African Journals Online (AJOL)

    SAM

    2014-05-14

    quantitative trait locus (QTLs) on chromosomes 1, 6, 7 and 20 in ... Protein and fat percent as content of milk are high-priority criteria for financial aims and selection of programs ...

  17. Teaching Quantitative Reasoning for Nonscience Majors through Carbon Footprint Analysis

    Science.gov (United States)

    Boose, David L.

    2014-01-01

    Quantitative reasoning is a key intellectual skill, applicable across disciplines and best taught in the context of authentic, relevant problems. Here, I describe and assess a laboratory exercise that has students calculate their "carbon footprint" and evaluate the impacts of various behavior choices on that footprint. Students gather…

  18. Multitemporal spatial pattern analysis of Tulum's tropical coastal landscape

    Science.gov (United States)

    Ramírez-Forero, Sandra Carolina; López-Caloca, Alejandra; Silván-Cárdenas, José Luis

    2011-11-01

    The tropical coastal landscape of Tulum in Quintana Roo, Mexico has high ecological, economic, social and cultural value; it provides environmental and tourism services at global, national, regional and local levels. The landscape of the area is heterogeneous and presents random fragmentation patterns. In recent years, tourist services in the region have increased, promoting an accelerated expansion of hotel, transportation and recreation infrastructure that alters this complex landscape. It is important to understand the environmental dynamics through temporal changes in spatial patterns and to propose better management of this ecological area to the authorities. This paper addresses a multi-temporal analysis of land cover changes from 1993 to 2000 in Tulum using Thematic Mapper data acquired by Landsat-5. Two independent methodologies were applied for the analysis of changes in the landscape and for the definition of fragmentation patterns. First, an Iteratively Reweighted Multivariate Alteration Detection (IR-MAD) algorithm was used to detect and localize land cover change/no-change areas. Second, post-classification change detection was evaluated using the Support Vector Machine (SVM) algorithm. Landscape metrics were calculated from the results of IR-MAD and SVM. The analysis of the metrics indicated, among other things, a higher fragmentation pattern along roadways.

  19. Quantitative analysis of drug distribution by ambient mass spectrometry imaging method with signal extinction normalization strategy and inkjet-printing technology.

    Science.gov (United States)

    Luo, Zhigang; He, Jingjing; He, Jiuming; Huang, Lan; Song, Xiaowei; Li, Xin; Abliz, Zeper

    2018-03-01

    Quantitative mass spectrometry imaging (MSI) is a robust approach that provides both quantitative and spatial information for drug candidate research. However, because of complicated signal suppression and interference, acquiring accurate quantitative information from MSI data remains a challenge, especially for whole-body tissue samples. Ambient MSI techniques using spray-based ionization appear to be ideal for pharmaceutical quantitative MSI analysis. However, the task is more challenging, as it involves almost no sample preparation and is more susceptible to ion suppression/enhancement. Herein, based on our developed air flow-assisted desorption electrospray ionization (AFADESI)-MSI technology, an ambient quantitative MSI method was introduced by integrating inkjet-printing technology with normalization of the signal extinction coefficient (SEC) using the target compound itself. The method utilized a single calibration curve to quantify multiple tissue types. Basic blue 7 and an antitumor drug candidate (S-(+)-deoxytylophorinidine, CAT) were chosen to initially validate the feasibility and reliability of the quantitative MSI method. Rat tissue sections (heart, kidney, and brain) administered with CAT were then analyzed. The quantitative MSI analysis results were cross-validated by LC-MS/MS analysis data of the same tissues. The consistency suggests that the approach is able to rapidly obtain quantitative MSI data without introducing interference into the in-situ environment of the tissue sample, and has the potential to provide a high-throughput, economical and reliable approach for drug discovery and development. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Microchromatography of hemoglobins. VIII. A general qualitative and quantitative method in plastic drinking straws and the quantitative analysis of Hb-F.

    Science.gov (United States)

    Schroeder, W A; Pace, L A

    1978-03-01

    The microchromatographic procedure for the quantitative analysis of the hemoglobin components in a hemolysate uses columns of DEAE-cellulose in a plastic drinking straw with a glycine-KCN-NaCl developer. Not only may the method be used for the quantitative analysis of Hb-F but also for the analysis of the varied components in mixtures of hemoglobins.

  1. Improved Dynamic Analysis method for quantitative PIXE and SXRF element imaging of complex materials

    International Nuclear Information System (INIS)

    Ryan, C.G.; Laird, J.S.; Fisher, L.A.; Kirkham, R.; Moorhead, G.F.

    2015-01-01

    The Dynamic Analysis (DA) method in the GeoPIXE software provides a rapid tool to project quantitative element images from PIXE and SXRF imaging event data both for off-line analysis and in real-time embedded in a data acquisition system. Initially, it assumes uniform sample composition, background shape and constant model X-ray relative intensities. A number of image correction methods can be applied in GeoPIXE to correct images to account for chemical concentration gradients, differential absorption effects, and to correct images for pileup effects. A new method, applied in a second pass, uses an end-member phase decomposition obtained from the first pass, and DA matrices determined for each end-member, to re-process the event data with each pixel treated as an admixture of end-member terms. This paper describes the new method and demonstrates through examples and Monte-Carlo simulations how it better tracks spatially complex composition and background shape while still benefitting from the speed of DA.

  2. Improved Dynamic Analysis method for quantitative PIXE and SXRF element imaging of complex materials

    Energy Technology Data Exchange (ETDEWEB)

    Ryan, C.G., E-mail: chris.ryan@csiro.au; Laird, J.S.; Fisher, L.A.; Kirkham, R.; Moorhead, G.F.

    2015-11-15

    The Dynamic Analysis (DA) method in the GeoPIXE software provides a rapid tool to project quantitative element images from PIXE and SXRF imaging event data both for off-line analysis and in real-time embedded in a data acquisition system. Initially, it assumes uniform sample composition, background shape and constant model X-ray relative intensities. A number of image correction methods can be applied in GeoPIXE to correct images to account for chemical concentration gradients, differential absorption effects, and to correct images for pileup effects. A new method, applied in a second pass, uses an end-member phase decomposition obtained from the first pass, and DA matrices determined for each end-member, to re-process the event data with each pixel treated as an admixture of end-member terms. This paper describes the new method and demonstrates through examples and Monte-Carlo simulations how it better tracks spatially complex composition and background shape while still benefitting from the speed of DA.
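
    The second-pass step, treating each pixel as an admixture of end-member terms, is at heart a per-pixel linear unmixing problem. Below is a minimal sketch, assuming scipy's non-negative least squares and synthetic end-member spectra; it illustrates the admixture idea, not GeoPIXE's actual solver.

    ```python
    import numpy as np
    from scipy.optimize import nnls

    rng = np.random.default_rng(5)

    # Columns: characteristic spectra of 3 end-member phases (hypothetical, 64 channels).
    E = np.abs(rng.normal(size=(64, 3)))

    # Synthetic pixel spectrum: a 60/30/10 mixture of the end-members plus noise.
    true_frac = np.array([0.6, 0.3, 0.1])
    pixel = E @ true_frac + 0.01 * rng.normal(size=64)

    # Non-negative least squares recovers the admixture proportions for this pixel.
    frac, residual = nnls(E, pixel)
    print("recovered fractions:", np.round(frac / frac.sum(), 3))
    ```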

  3. Quantitative chromatography in the analysis of labelled compounds 1. Quantitative paper chromotography of amino acids by A spot comparison technique

    International Nuclear Information System (INIS)

    Barakat, M.F.; Farag, A.N.; El-Gharbawy, A.A.

    1974-01-01

    For the determination of the specific activity of labelled compounds separated by paper sheet chromatography, it was found essential to perfect the quantitative aspect of the paper chromatographic technique. To date, paper chromatography has been used mainly as a separation tool, and its use for quantification of the separated materials is far less studied. In the present work, the quantitative analysis of amino acids by paper sheet chromatography has been carried out by methods that depend on the use of relative spot area values to correct the experimental data obtained. The results obtained were good and reproducible. The main advantage of the proposed technique is its extreme simplicity; no complicated equipment or procedures are necessary.

  4. Study of academic achievements using spatial analysis tools

    Science.gov (United States)

    González, C.; Velilla, C.; Sánchez-Girón, V.

    2012-04-01

    In the 2010/12 academic year, the College of Agricultural Engineering of the Technical University of Madrid implemented three new degrees, all of them adapted to the European Space for Higher Education. These degrees are namely: Graduate in Agricultural Engineering and Science, Graduate in Food Engineering and Graduate in Agro-Environmental Engineering. A total of 382 new incoming students were registered, and a survey study on their academic achievement was carried out with these students with the aim of finding the level of dependence among the following variables: the final mark in their secondary studies, the option followed in the secondary studies (Art, Science and Technology, or Humanities and Social Sciences), the mark obtained in the university entrance examination, and in which of the two sittings per year of this examination that mark was obtained. Similarly, another group of 77 students was evaluated independently of the former group. These students had entered the College in the previous academic year (2009/10) and decided to change their curricula to the new ones. Subsequently, using the spatial analysis tools of geographic information systems, we analyzed the possible relationship between success or failure at school and the socioeconomic profile of the new students in a degree. For this purpose, every student was georeferenced by assigning UTM coordinates to their postal address. Furthermore, all students' secondary schools were geographically coded considering their typology (public, private, and private subsidized) and fees. Each student was represented by their average geometric point in order to be correlated with their respective record. Following this procedure, a map of the performance of each student could be drawn. This map can be used as a reference system, as it includes variables such as the distance from the student's home to the College, which can be used as a tool to calculate the probability of success or

  5. Tucker Tensor analysis of Matern functions in spatial statistics

    KAUST Repository

    Litvinenko, Alexander

    2018-03-09

    In this work, we describe advanced numerical tools for working with multivariate functions and for the analysis of large data sets. These tools will drastically reduce the required computing time and the storage cost, and, therefore, will allow us to consider much larger data sets or finer meshes. Covariance matrices are crucial in spatio-temporal statistical tasks, but are often very expensive to compute and store, especially in 3D. Therefore, we approximate covariance functions by cheap surrogates in a low-rank tensor format. We apply the Tucker and canonical tensor decompositions to a family of Matern- and Slater-type functions with varying parameters and demonstrate numerically that their approximations exhibit exponentially fast convergence. We prove the exponential convergence of the Tucker and canonical approximations in the tensor rank parameters. Several statistical operations are performed in this low-rank tensor format, including evaluating the conditional covariance matrix, spatially averaged estimation variance, computing a quadratic form, determinant, trace, log-likelihood, inverse, and Cholesky decomposition of a large covariance matrix. Low-rank tensor approximations reduce the computing and storage costs substantially. For example, the storage cost is reduced from an exponential O(n^d) to a linear scaling O(drn), where d is the spatial dimension, n is the number of mesh points in one direction, and r is the tensor rank. Prerequisites for applicability of the proposed techniques are the assumptions that the data, locations, and measurements lie on a tensor (axes-parallel) grid and that the covariance function depends on a distance, ||x-y||.
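    The exponentially fast convergence in the rank parameter can be checked directly on a small example. The sketch below rank-truncates a Slater-type kernel exp(-|x-y|/l) on a 1D grid with an SVD, a one-dimensional stand-in for the Tucker/canonical compression described above (grid size and correlation length are illustrative):

```python
import numpy as np

n, ell = 400, 0.2                                    # mesh points, correlation length
x = np.linspace(0.0, 1.0, n)
C = np.exp(-np.abs(x[:, None] - x[None, :]) / ell)   # Slater-type covariance matrix

U, s, Vt = np.linalg.svd(C)
for r in (5, 10, 20):
    Cr = (U[:, :r] * s[:r]) @ Vt[:r]                 # best rank-r approximation
    err = np.linalg.norm(C - Cr) / np.linalg.norm(C)
    print(f"rank {r:2d}: rel. error {err:.1e}, storage {2 * n * r} vs {n * n} entries")
```

    The error falls by orders of magnitude for each modest increase in rank, while storage drops from n^2 to roughly 2nr entries.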

  6. Quantitative analysis of Moessbauer backscatter spectra from multilayer films

    International Nuclear Information System (INIS)

    Bainbridge, J.

    1975-01-01

    The quantitative interpretation of Moessbauer backscatter spectra with particular reference to internal conversion electrons has been treated assuming that electron attenuation in a surface film can be satisfactorily described by a simple exponential law. The theory of Krakowski and Miller has been extended to include multi-layer samples, and a relation between the Moessbauer spectrum area and an individual layer thickness derived. As an example, numerical results are obtained for a duplex oxide film grown on pure iron. (Auth.)

  7. Geometrical conditions at the quantitative neutronographic texture analysis

    International Nuclear Information System (INIS)

    Tobisch, J.; Kleinstueck, K.

    1975-10-01

    The beam geometry for measuring quantitative pole figures by a neutronographic texture diffractometer is explained for transmission and reflection arrangement of spherical samples and sheets. For given dimensions of counter aperture the maximum possible cross sections of the incident beam are calculated as a function of sample dimensions and the Bragg angle theta. Methods for the calculation of absorption factors and volume correction are given. Under special conditions advantages result in the transmission case for sample motion into the direction +α. (author)

  8. Quantitative analysis of strategic and tactical purchasing decisions

    OpenAIRE

    Heijboer, G.J.

    2003-01-01

    Purchasing management is a relatively new scientific research field, partly due to the fact that purchasing has only recently been recognized as a factor of strategic importance to an organization. In this thesis, the author focuses on a selection of strategic and tactical purchasing decision problems. New quantitative models are developed for these decision problems using a range of mathematical techniques, thereby contributing to the further development of purchasing theory and its application...

  9. Quantitative analysis of carbon radiation in edge plasmas of LHD

    International Nuclear Information System (INIS)

    Dong, C.F.; Morita, S.; Oishi, T.; Goto, M.; Murakami, I.; Wang, E.R.; Huang, X.L.

    2013-01-01

    It is of interest to compare the carbon radiation loss between LHD and tokamaks. Since the radiation from C³⁺ is much smaller than that from C⁵⁺, it is also interesting to examine the difference in detached plasmas. In addition, it is important to study quantitatively the radiation from each ionization stage of carbon, which is the dominant impurity in most tokamaks and in LHD. (J.P.N.)

  10. Pulmonary nodule characterization, including computer analysis and quantitative features.

    Science.gov (United States)

    Bartholmai, Brian J; Koo, Chi Wan; Johnson, Geoffrey B; White, Darin B; Raghunath, Sushravya M; Rajagopalan, Srinivasan; Moynagh, Michael R; Lindell, Rebecca M; Hartman, Thomas E

    2015-03-01

    Pulmonary nodules are commonly detected in computed tomography (CT) chest screening of a high-risk population. The specific visual or quantitative features on CT or other modalities can be used to characterize the likelihood that a nodule is benign or malignant. Visual features on CT such as size, attenuation, location, morphology, edge characteristics, and other distinctive "signs" can be highly suggestive of a specific diagnosis and, in general, be used to determine the probability that a specific nodule is benign or malignant. Change in size, attenuation, and morphology on serial follow-up CT, or features on other modalities such as nuclear medicine studies or MRI, can also contribute to the characterization of lung nodules. Imaging analytics can objectively and reproducibly quantify nodule features on CT, nuclear medicine, and magnetic resonance imaging. Some quantitative techniques show great promise in helping to differentiate benign from malignant lesions or to stratify the risk of aggressive versus indolent neoplasm. In this article, we (1) summarize the visual characteristics, descriptors, and signs that may be helpful in management of nodules identified on screening CT, (2) discuss current quantitative and multimodality techniques that aid in the differentiation of nodules, and (3) highlight the power, pitfalls, and limitations of these various techniques.

  11. Quantitative Analysis of Electron Beam Damage in Organic Thin Films

    OpenAIRE

    Leijten, Zino J. W. A.; Keizer, Arthur D. A.; de With, Gijsbertus; Friedrich, Heiner

    2017-01-01

    In transmission electron microscopy (TEM) the interaction of an electron beam with polymers such as P3HT:PCBM photovoltaic nanocomposites results in electron beam damage, which is the most important factor limiting acquisition of structural or chemical data at high spatial resolution. Beam effects can vary depending on parameters such as electron dose rate, temperature during imaging, and the presence of water and oxygen in the sample. Furthermore, beam damage will occur at different length scales...

  12. Persistent solar signatures in cloud cover: spatial and temporal analysis

    International Nuclear Information System (INIS)

    Voiculescu, M; Usoskin, I

    2012-01-01

    A consensus regarding the impact of solar variability on cloud cover is far from being reached. Moreover, the impact of cloud cover on climate is among the least understood of all climate components. This motivated us to analyze the persistence of solar signals in cloud cover for the time interval 1984–2009, covering two full solar cycles. A spatial and temporal investigation of the response of low, middle and high cloud data to cosmic ray induced ionization (CRII) and UV irradiance (UVI) is performed in terms of coherence analysis of the two signals. For some key geographical regions the response of clouds to UVI and CRII is persistent over the entire time interval, indicating a real link. In other regions, however, the relation is not consistent, being intermittent or out of phase, suggesting that some correlations are spurious. The constant in-phase or anti-phase relationship between clouds and solar proxies over some regions, especially for low clouds with UVI and CRII, middle clouds with UVI and high clouds with CRII, definitely requires more study. Our results show that solar signatures in cloud cover persist in some key climate-defining regions for the entire time period and support the idea that, if existing, solar effects are not visible at the global level and any analysis of solar effects on cloud cover (and, consequently, on climate) should be done at the regional level. (letter)

  13. Signal Adaptive System for Space/Spatial-Frequency Analysis

    Directory of Open Access Journals (Sweden)

    Veselin N. Ivanović

    2009-01-01

    Full Text Available This paper outlines the development of a multiple-clock-cycle implementation (MCI) of a signal adaptive two-dimensional (2D) system for space/spatial-frequency (S/SF) signal analysis. The design is based on a method for improved S/SF representation of the analyzed 2D signals, also proposed here. The proposed MCI design optimizes critical design performances related to hardware complexity, making it a suitable system for real-time implementation on an integrated chip. Additionally, the design allows the implemented system to take only the necessary number of clock cycles (CLKs) during execution, i.e., those required for the desired 2D Wigner-distribution presentation of auto-terms at different frequency-frequency points. This ability represents a major advantage of the proposed design, which helps to optimize the execution time and produce an improved, cross-terms-free S/SF signal representation. The design has been verified by a field-programmable gate array (FPGA) circuit design, capable of performing S/SF analysis of 2D signals in real time.

  14. A Spatial and Temporal analysis of Labour Market Characteristics

    Directory of Open Access Journals (Sweden)

    Pośpiech Ewa

    2016-12-01

    Full Text Available The use of spatial methods is becoming increasingly common in social and economic research, as it emphasizes the relevance of spatiality to the understanding of socio-economic facts. Once embraced, the spatial factor can substantially help explain variations in the properties being examined, thus improving the quality of their description and supporting the development of econometric models. This paper explores some of the characteristics of Poland's labour market, making an inquiry into their spatial dependencies. The study looks at the country's labour market from a local perspective, examining its properties for spatial autocorrelation (both global and local). Linear econometric models are subsequently built for such variables as the number of persons in employment and the numbers of women and men in employment. The models are further investigated to assess the applicability of spatial modelling in their development.
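    Global spatial autocorrelation of a labour-market variable, as examined in the study, is commonly measured with Moran's I. A minimal self-contained sketch (the five-region values and binary contiguity weights are toy assumptions):

```python
import numpy as np

def morans_I(x, W):
    """Global Moran's I of values x under a spatial weight matrix W."""
    z = x - x.mean()
    return len(x) / W.sum() * (z @ W @ z) / (z @ z)

# Five regions on a line; employment counts; binary contiguity weights.
x = np.array([10.0, 12.0, 11.0, 30.0, 29.0])
W = np.array([[0, 1, 0, 0, 0],
              [1, 0, 1, 0, 0],
              [0, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)
print(morans_I(x, W))  # values above the expectation -1/(n-1) suggest clustering
```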

  15. Spatial Distribution Analysis of Scrub Typhus in Korea

    OpenAIRE

    Jin, Hong Sung; Chu, Chaeshin; Han, Dong Yeob

    2013-01-01

    Objective: This study analyzes the spatial distribution of scrub typhus in Korea. Methods: A spatial distribution of Orientia tsutsugamushi occurrence using a geographic information system (GIS) is presented, and analyzed by means of spatial clustering and correlations. Results: The provinces of Gangwon-do and Gyeongsangbuk-do show a low incidence throughout the year. Some districts have almost identical environmental conditions of scrub typhus incidence. The land use change of districts does...

  16. Quantitative phase analysis of a highly textured industrial sample using a Rietveld profile analysis

    International Nuclear Information System (INIS)

    Shin, Eunjoo; Huh, Moo-Young; Seong, Baek-Seok; Lee, Chang-Hee

    2001-01-01

    For the quantitative phase analysis of highly textured two-phase materials, samples with known weight fractions of zirconium and aluminum were prepared. Strong texture components prevailed in both the zirconium and aluminum sheets. The diffraction patterns of the samples were measured by neutron diffraction and refined by the Rietveld method. The preferred orientation correction of the diffraction patterns was carried out by means of pole figures recalculated from the ODF. The present Rietveld analysis of various samples with different weight fractions showed that the absolute error of the calculated weight fractions was less than 7.1%. (author)

  17. Analytical applications of a recycled flow nuclear magnetic resonance system: quantitative analysis of slowly relaxing nuclei

    International Nuclear Information System (INIS)

    Laude, D.A. Jr.; Lee, R.W.K.; Wilkins, C.L.

    1985-01-01

    The utility of a recycled flow system for the efficient quantitative analysis of NMR spectra is demonstrated. Requisite conditions are first established for the quantitative flow experiment and then applied to a variety of compounds. An application of the technique to determination of the average polymer chain length for a silicone polymer by quantitative flow ²⁹Si NMR is also presented. 10 references, 4 figures, 3 tables

  18. The Quantitative Analysis of a team game performance made by men basketball teams at OG 2008

    OpenAIRE

    Kocián, Michal

    2009-01-01

    Title: The quantitative analysis of team game performance by men's basketball teams at the 2008 Olympic Games. Aims: To find the reasons for the successes and failures of teams in the Olympic Games play-off using quantitative (numerical) observation of selected game statistics. Method: The thesis was based on quantitative (numerical) observation of video recordings using KVANTÝM. Results: The selected statistics described the most essential events behind a team's win or loss. Keywords: basketball, team...

  19. Spatial and temporal variation in selection of genes associated with pearl millet varietal quantitative traits in situ

    Directory of Open Access Journals (Sweden)

    Cedric Mariac

    2016-07-01

    Full Text Available Ongoing global climate changes imply new challenges for agriculture. Whether plants and crops can adapt to such rapid changes is still a widely debated question. We previously showed adaptation in the form of earlier flowering in pearl millet at the scale of a whole country over three decades. However, this analysis did not deal with the variability of year-to-year selection. To understand and possibly manage plant and crop adaptation, we need more knowledge of how selection acts in situ. Is selection gradual or abrupt, and does it vary in space and over time? In the present study, we tracked the evolution of allele frequency in two genes associated with pearl millet phenotypic variation in situ. We sampled 17 populations of cultivated pearl millet over a period of two years. We tracked changes in allele frequencies in these populations by genotyping more than seven thousand individuals. We demonstrate that several allele frequency changes are compatible with selection, after correcting for allele frequency changes associated with genetic drift. We found marked variation in allele frequencies from year to year, suggesting a selection effect that varies in space and over time. We estimated the strength of selection associated with the variations in allele frequency. Our results suggest that the polymorphism maintained at the genes we studied is partially explained by the spatial and temporal variability of selection. In response to environmental changes, traditional pearl millet varieties could rapidly adapt thanks to this available functional variability.
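    One standard way to ask whether an observed allele-frequency change exceeds what drift alone can produce is to simulate a Wright-Fisher null distribution. The sketch below is illustrative only: population size, sample size, generations and frequencies are invented, and the paper's actual drift correction may differ.

```python
import numpy as np

def drift_null(p0, n_eff, n_sampled, generations, n_sim=100_000, seed=1):
    """Null distribution of allele-frequency change under pure drift plus sampling noise."""
    rng = np.random.default_rng(seed)
    p = np.full(n_sim, p0)
    for _ in range(generations):
        p = rng.binomial(2 * n_eff, p) / (2 * n_eff)          # Wright-Fisher drift
    obs = rng.binomial(2 * n_sampled, p) / (2 * n_sampled)    # genotyping sample
    return obs - p0

null = drift_null(p0=0.30, n_eff=500, n_sampled=200, generations=2)
observed_change = 0.08
p_value = np.mean(np.abs(null) >= abs(observed_change))
print(p_value)  # small value: change unlikely under drift alone, consistent with selection
```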

  20. A Strategy for Identifying Quantitative Trait Genes Using Gene Expression Analysis and Causal Analysis

    Directory of Open Access Journals (Sweden)

    Akira Ishikawa

    2017-11-01

    Full Text Available Large numbers of quantitative trait loci (QTL affecting complex diseases and other quantitative traits have been reported in humans and model animals. However, the genetic architecture of these traits remains elusive due to the difficulty in identifying causal quantitative trait genes (QTGs for common QTL with relatively small phenotypic effects. A traditional strategy based on techniques such as positional cloning does not always enable identification of a single candidate gene for a QTL of interest because it is difficult to narrow down a target genomic interval of the QTL to a very small interval harboring only one gene. A combination of gene expression analysis and statistical causal analysis can greatly reduce the number of candidate genes. This integrated approach provides causal evidence that one of the candidate genes is a putative QTG for the QTL. Using this approach, I have recently succeeded in identifying a single putative QTG for resistance to obesity in mice. Here, I outline the integration approach and discuss its usefulness using my studies as an example.

  2. A book review of Spatial data analysis in ecology and agriculture using R

    Science.gov (United States)

    Spatial Data Analysis in Ecology and Agriculture Using R is a valuable resource to assist agricultural and ecological researchers with spatial data analyses using the R statistical software (www.r-project.org). Special emphasis is on spatial data sets; however, the text also provides ample guidance ...

  3. Using semi-variogram analysis for providing spatially distributed information on soil surface condition for land surface modeling

    Science.gov (United States)

    Croft, Holly; Anderson, Karen; Kuhn, Nikolaus J.

    2010-05-01

    The ability to quantitatively and spatially assess soil surface roughness is important in geomorphology and land degradation studies. Soils can experience rapid structural degradation in response to land cover changes, resulting in increased susceptibility to erosion and a loss of Soil Organic Matter (SOM). Changes in soil surface condition can also alter sediment detachment, transport and deposition processes, infiltration rates and surface runoff characteristics. Deriving spatially distributed quantitative information on soil surface condition for inclusion in hydrological and soil erosion models is therefore paramount. However, due to the time and resources involved in using traditional field sampling techniques, there is a lack of spatially distributed information on soil surface condition. Laser techniques can provide data for a rapid three dimensional representation of the soil surface at a fine spatial resolution. This provides the ability to capture changes at the soil surface associated with aggregate breakdown, flow routing, erosion and sediment re-distribution. Semi-variogram analysis of the laser data can be used to represent spatial dependence within the dataset; providing information about the spatial character of soil surface structure. This experiment details the ability of semi-variogram analysis to spatially describe changes in soil surface condition. Soil for three soil types (silt, silt loam and silty clay) was sieved to produce aggregates between 1 mm and 16 mm in size and placed evenly in sample trays (25 x 20 x 2 cm). Soil samples for each soil type were exposed to five different durations of artificial rainfall, to produce progressively structurally degraded soil states. A calibrated laser profiling instrument was used to measure surface roughness over a central 10 x 10 cm plot of each soil state, at 2 mm sample spacing. The laser data were analysed within a geostatistical framework, where semi-variogram analysis quantitatively represented
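    The core quantity here, the empirical semi-variogram gamma(h) = 0.5 E[(z(x) - z(x+h))^2], is straightforward to compute. A minimal 1-D sketch on a synthetic roughness transect (the study itself used 10 x 10 cm plots at 2 mm spacing):

```python
import numpy as np

def semivariogram_1d(z, dx, max_lag):
    """Empirical semi-variogram of a 1-D height transect z sampled every dx."""
    lags, gamma = [], []
    for k in range(1, int(max_lag / dx) + 1):
        d = z[k:] - z[:-k]
        lags.append(k * dx)
        gamma.append(0.5 * np.mean(d ** 2))
    return np.array(lags), np.array(gamma)

rng = np.random.default_rng(3)
z = np.cumsum(rng.normal(0.0, 0.1, 50))        # correlated synthetic heights
lags, gamma = semivariogram_1d(z, dx=2.0, max_lag=20.0)
print(np.round(gamma, 3))                      # rising gamma(h) indicates spatial dependence
```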

  4. Quantitative analysis of heavy water by NMR spectroscopy

    International Nuclear Information System (INIS)

    Gomez Gil, V.

    1975-01-01

    Nuclear Magnetic Resonance has been applied to a wide variety of quantitative problems. A typical example has been the determination of isotopic composition. In this paper two different analytical methods for the determination of water in deuterium oxide are described. The first employs acetonitrile as an internal standard compound; in the second, a calibration curve of signal integral versus amount of D2O is constructed. Both methods give results comparable to those of mass spectrometry or IR spectroscopy. (Author) 5 refs

  5. Quantitative analysis of a wind energy conversion model

    International Nuclear Information System (INIS)

    Zucker, Florian; Gräbner, Anna; Strunz, Andreas; Meyn, Jan-Peter

    2015-01-01

    A rotor of 12 cm diameter is attached to a precision electric motor, used as a generator, to make a model wind turbine. Output power of the generator is measured in a wind tunnel with up to 15 m s⁻¹ air velocity. The maximum power is 3.4 W, and the power conversion factor from kinetic to electric energy is c_p = 0.15. The v³ power law is confirmed. The model illustrates several technically important features of industrial wind turbines quantitatively. (paper)
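    The quoted conversion factor can be reproduced from the reported numbers with the standard wind-power relation P = 0.5 * rho * A * v^3; the air density below is an assumed value:

```python
import math

rho = 1.2    # assumed air density, kg m^-3
d = 0.12     # rotor diameter, m
v = 15.0     # air velocity, m s^-1
P_el = 3.4   # measured electric power, W

A = math.pi * (d / 2) ** 2        # swept rotor area
P_kin = 0.5 * rho * A * v ** 3    # kinetic power of the air stream (~23 W)
print(P_el / P_kin)               # ~0.15, matching the reported c_p
```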

  6. Risk management and analysis: risk assessment (qualitative and quantitative)

    OpenAIRE

    Valentin Mazareanu

    2007-01-01

    Risk is commonly defined as the possibility of suffering a loss. Starting from this, risk management is defined as a business process whose purpose is to ensure that the organization is protected against risks and their effects. In order to prioritize the identified risks, to develop a response plan and after that to monitor them, we need to assess them. But at this point a question is born: should I choose a qualitative approach or a quantitative one? This paper will make a short overview over the risk eva...

  7. Correlation of quantitative histopathological morphology and quantitative radiological analysis during aseptic loosening of hip endoprostheses.

    Science.gov (United States)

    Bertz, S; Kriegsmann, J; Eckardt, A; Delank, K-S; Drees, P; Hansen, T; Otto, M

    2006-01-01

    Aseptic hip prosthesis loosening is the most important long-term complication in total hip arthroplasty. Polyethylene (PE) wear is the dominant etiologic factor in aseptic loosening, which together with other factors induces mechanisms resulting in bone loss, and finally in implant loosening. The single-shot radiograph analysis (EBRA, abbreviation for the German term "Einzel-Bild-Röntgenanalyse") is a computerized method for early radiological prediction of aseptic loosening. In this study, EBRA parameters were correlated with histomorphological parameters of the periprosthetic membrane. Periprosthetic membranes obtained from 19 patients during revision surgery of loosened ABG I-type total hip prostheses were analyzed histologically and morphometrically. The pre-existing EBRA parameters, the thickness of the PE debris layer and the dimension of inclination and anteversion, were compared with the density of macrophages and giant cells. Additionally, the semiquantitatively determined density of lymphocytes, plasma cells, giant cells and the size of the necrotic areas were correlated with the EBRA results. All periprosthetic membranes were classified as debris-induced type membranes. We found a positive correlation between the number of giant cells and the thickness of the PE debris layer. There was no significant correlation between the number of macrophages or all semiquantitative parameters and EBRA parameters. The number of giant cells decreased with implant duration. The morphometrically measured number of foreign body giant cells more closely reflects the results of the EBRA. The semiquantitative estimation of giant cell density could not substitute for the morphometrical analysis. The density of macrophages, lymphocytes, plasma cells and the size of necrotic areas did not correlate with the EBRA parameters, indicating that there is no correlation with aseptic loosening.

  8. a Temporal and Spatial Analysis of Urban Heat Island in Basin City Utilizing Remote Sensing Techniques

    Science.gov (United States)

    Chang, Hsiao-Tung

    2016-06-01

    Urban Heat Island (UHI) has become a key factor in the deterioration of the urban ecological environment. Spatial-temporal analysis of a basin city's UHI prototype, together with quantitative evaluation of the effect of rapid urbanization, will provide a theoretical foundation for relieving the UHI effect. Based on Landsat 8, ETM+ and TM images of the Taipei basin from 1990 to 2015, this article retrieves the land surface temperature (LST) at the summer solstice of each year and then analyses the spatial-temporal pattern and evolution of the UHI in the Taipei basin over these decades. The results showed that, as the built-up district expanded, the UHI area constantly expanded from the city centre to the suburbs. In addition to higher temperatures in the city centre, the UHI prototype of the Taipei basin also showed relatively high temperatures gathered along the boundaries at the foot of the surrounding mountains, which is called a "sinking heat island". From 1990 to 2000, the areas of higher UHI intensity differed clearly with changes in land use type brought about by public infrastructure works. In the following 15 years, up to 2015, the building density of the urban area increased gradually, and the UHI extent tended to rise with urban land use density; hot spots of UHI in the Taipei basin show the same characteristics. The results suggest that anthropogenic heat release probably plays a significant role in the UHI effect, and must be considered in urban planning adaptation strategies.

  10. Analysis of Spatial Voting Patterns: An Approach in Political Socialization

    Science.gov (United States)

    Klimasewski, Ted

    1973-01-01

    Passage of the 26th Amendment gave young adults the right to vote. This study attempts to further student understanding of the electoral process by presenting a method for analyzing spatial voting patterns. The spatial emphasis adds another dimension to the temporal and behavioral-structural approaches in studying the American electoral system.…

  11. Fractal Dimension analysis for seismicity spatial and temporal ...

    Indian Academy of Sciences (India)

    The research can further promote the application of fractal theory in the study ... spatial-temporal propagation characteristics of seismic activities, fractal theory is not ... provide a theoretical basis for the prevention and control of earthquakes. 2. ... random self-similar structure of the earthquake in the time series and the spatial.

  12. using gis for spatial exploratory analysis of borehole data

    African Journals Online (AJOL)

    Thus, understanding the spatial structure of aquifer characteristics could be used as a resourceful tool and as a .... x is position in one dimensional space. N(h) is Pairwise ... best fitted theoretical models, the spatial structure of each variable ...

  13. A Principal Components Analysis of Dynamic Spatial Memory Biases

    Science.gov (United States)

    Motes, Michael A.; Hubbard, Timothy L.; Courtney, Jon R.; Rypma, Bart

    2008-01-01

    Research has shown that spatial memory for moving targets is often biased in the direction of implied momentum and implied gravity, suggesting that representations of the subjective experiences of these physical principles contribute to such biases. The present study examined the association between these spatial memory biases. Observers viewed…

  14. MCM-2 and Ki-67 as proliferation markers in renal cell carcinoma: A quantitative and semi-quantitative analysis.

    Science.gov (United States)

    Mehdi, Muhammad Zain; Nagi, Abdul Hanan; Naseem, Nadia

    2016-01-01

    Fuhrman nuclear grade is the most important histological parameter for predicting prognosis in a patient with renal cell carcinoma (RCC). However, it suffers from inter-observer and intra-observer variation, giving rise to the need for a parameter that not only correlates with nuclear grade but is also objective and reproducible. Proliferation is a measure of the aggressiveness of a tumour, and it is strongly correlated with Fuhrman nuclear grade, clinical survival and recurrence in RCC. Ki-67 is conventionally used to assess proliferation. Mini-chromosome maintenance 2 (MCM-2) is a lesser-known marker of proliferation that identifies a greater proliferating fraction. This study was designed to assess the prognostic significance of MCM-2 by comparing it with Fuhrman nuclear grade and Ki-67. n = 50 cases of various ages, stages, histological subtypes and grades of RCC were selected for this study. Immunohistochemical staining using Ki-67 (MIB-1, mouse monoclonal antibody, Dako) and MCM-2 (mouse monoclonal antibody, Thermo) was performed on the paraffin-embedded blocks in the Department of Morbid Anatomy and Histopathology, University of Health Sciences, Lahore. Labeling indices (LI) were determined by two pathologists independently using quantitative and semi-quantitative analysis. Statistical analysis was carried out using SPSS 20.0. The Kruskal-Wallis test was used to determine the correlation of the proliferation markers with grade, and Pearson's correlation was used to determine the correlation between the two proliferation markers. The labeling index of MCM-2 (median = 24.29%) was found to be much higher than that of Ki-67 (median = 13.05%). Both markers were significantly related to grade (p = 0.00; Kruskal-Wallis test). The LI of MCM-2 was found to correlate significantly with the LI of Ki-67 (r = 0.0934; p = 0.01, Pearson's correlation). Results of the semi-quantitative analysis correlated well with the quantitative analysis. Both Ki-67 and MCM-2 are markers of proliferation which are closely linked to grade. Therefore, they
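    The reported statistics map directly onto standard SciPy routines; a toy sketch with invented labeling indices (not the study's data):

```python
import numpy as np
from scipy.stats import kruskal, pearsonr

# Hypothetical labeling indices (% positive nuclei) by Fuhrman grade.
li_mcm2 = {1: [12.0, 15.5, 18.1], 2: [22.4, 25.0, 27.9], 3: [35.2, 40.1, 38.6]}
li_ki67 = {1: [5.1, 7.2, 8.0], 2: [11.3, 13.5, 14.8], 3: [20.6, 24.0, 22.1]}

h, p = kruskal(*li_mcm2.values())      # does the MCM-2 LI differ across grades?
print(f"Kruskal-Wallis: H = {h:.2f}, p = {p:.3f}")

r, p = pearsonr(np.concatenate(list(li_mcm2.values())),
                np.concatenate(list(li_ki67.values())))
print(f"Pearson: r = {r:.2f}, p = {p:.3f}")  # agreement between the two markers
```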

  15. The usefulness of 3D quantitative analysis with using MRI for measuring osteonecrosis of the femoral head

    International Nuclear Information System (INIS)

    Hwang, Ji Young; Lee, Sun Wha; Park, Youn Soo

    2006-01-01

    We wanted to evaluate the usefulness of MRI 3D quantitative analysis for measuring osteonecrosis of the femoral head in comparison with MRI 2D quantitative analysis and quantitative analysis of the specimen. Over 3 months at our hospital, 14 femoral head specimens with osteonecrosis were obtained after total hip arthroplasty. The patients' preoperative MRIs were retrospectively reviewed for quantitative analysis of the size of the necrosis. Each necrotic fraction of the femoral head was measured by 2D quantitative analysis using mid-coronal and mid-sagittal MRIs, and by 3D quantitative analysis using serial continuous coronal MRIs and 3D reconstruction software. The necrotic fraction of the specimen was physically measured by the fluid displacement method. The necrotic fraction according to MRI 2D or 3D quantitative analysis was compared with that of the specimen by using Spearman's correlation test. On the correlative analysis, the necrotic fraction by MRI 2D quantitative analysis and quantitative analysis of the specimen showed moderate correlation (r = 0.657); on the other hand, the necrotic fraction by MRI 3D quantitative analysis and quantitative analysis of the specimen demonstrated a strong correlation (r = 0.952) (p < 0.05). MRI 3D quantitative analysis was more accurate than 2D quantitative analysis for measuring osteonecrosis of the femoral head. Therefore, it may be useful for predicting the clinical outcome and deciding the proper treatment option

  16. Micro-computer system for quantitative image analysis of damage microstructure

    International Nuclear Information System (INIS)

    Kohyama, A.; Kohno, Y.; Satoh, K.; Igata, N.

    1984-01-01

    Quantitative image analysis of radiation-induced damage microstructure is very important in evaluating material behavior in a radiation environment, yet few improvements have been seen in the quantitative analysis of damage microstructure over recent decades. The objective of this work is to develop a new system for quantitative image analysis of damage microstructure which could improve the accuracy and efficiency of data sampling and processing and could make it possible to obtain new information about the mutual relations among dislocations, precipitates, cavities, grain boundaries, etc. In this system, data sampling is done with an X-Y digitizer. The cavity microstructure in dual-ion irradiated 316 SS is analyzed and the effectiveness of this system is discussed. (orig.)

  17. Quantitative analysis of the secretion of the MCP family of chemokines by muscle cells

    DEFF Research Database (Denmark)

    Henningsen, Jeanette; Pedersen, Bente Klarlund; Kratchmarova, Irina

    2011-01-01

    Stable Isotope Labelling by Amino acids in Cell culture (SILAC)-based quantitative analysis resulted in the identification and generation of quantitative profiles of 59 growth factors and cytokines, including 9 classical chemokines. The members of the CC chemokine family of proteins such as monocyte chemotactic proteins 1, 2...

  18. Quantitative analysis of target components by comprehensive two-dimensional gas chromatography

    NARCIS (Netherlands)

    Mispelaar, V.G. van; Tas, A.C.; Smilde, A.K.; Schoenmakers, P.J.; Asten, A.C. van

    2003-01-01

    Quantitative analysis using comprehensive two-dimensional (2D) gas chromatography (GC) is still rarely reported. This is largely due to a lack of suitable software. The objective of the present study is to generate quantitative results from a large GC x GC data set, consisting of 32 chromatograms.

  19. Quantitative analysis of γ-oryzanol content in cold pressed rice bran oil by TLC-image analysis method

    OpenAIRE

    Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana

    2014-01-01

    Objective: To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. Methods: TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Results: Both assays provided good linearity, accuracy, reproducibility and selectivity for dete...

  20. Chemical Fingerprint Analysis and Quantitative Analysis of Rosa rugosa by UPLC-DAD

    Directory of Open Access Journals (Sweden)

    Sanawar Mansur

    2016-12-01

    Full Text Available A method based on ultra performance liquid chromatography with a diode array detector (UPLC-DAD) was developed for quantitative analysis of five active compounds and chemical fingerprint analysis of Rosa rugosa. Ten batches of R. rugosa collected from different plantations in the Xinjiang region of China were used to establish the fingerprint. The feasibility and advantages of the UPLC fingerprint were verified for similarity evaluation by systematically comparing chromatograms with the professional analytical software recommended by the State Food and Drug Administration (SFDA) of China. In the quantitative analysis, the five compounds showed good regression (R² = 0.9995) within the test ranges, and the recovery of the method was in the range of 94.2%–103.8%. The similarities of the liquid chromatography fingerprints of the 10 batches of R. rugosa were more than 0.981. The developed UPLC fingerprint method is simple, reliable, and validated for the quality control and identification of R. rugosa. Additionally, simultaneous quantification of five major bioactive ingredients in the R. rugosa samples was conducted to interpret the consistency of the quality test. The results indicated that the UPLC fingerprint, as a characteristic distinguishing method combining similarity evaluation and quantification analysis, can be successfully used to assess the quality and to identify the authenticity of R. rugosa.
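    The "similarity" reported for fingerprint comparison is typically a cosine (congruence) coefficient between chromatograms sampled on a common time axis; a minimal sketch with synthetic profiles:

```python
import numpy as np

def fingerprint_similarity(a, b):
    """Cosine similarity between two chromatographic fingerprints."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(7)
reference = np.abs(rng.normal(size=600))        # stand-in mean reference chromatogram
batch = reference + rng.normal(0.0, 0.05, 600)  # a batch with small deviations
print(round(fingerprint_similarity(reference, batch), 3))  # close to 1 for consistent batches
```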

  1. Quantitative analysis of the epithelial lining architecture in radicular cysts and odontogenic keratocysts

    Directory of Open Access Journals (Sweden)

    Landini Gabriel

    2006-02-01

    Full Text Available Background: This paper describes a quantitative analysis of the cyst lining architecture in radicular cysts (of inflammatory aetiology) and odontogenic keratocysts (thought to be developmental or neoplastic), including its 2 counterparts: solitary and associated with the Basal Cell Naevus Syndrome (BCNS). Methods: Epithelial linings from 150 images (from 9 radicular cysts, 13 solitary keratocysts and 8 BCNS keratocysts) were segmented into theoretical cells using a semi-automated partition based on the intensity of the haematoxylin stain, which defined exclusive areas relative to each detected nucleus. Various morphometrical parameters were extracted from these "cells" and epithelial layer membership was computed using a systematic clustering routine. Results: Statistically significant differences were observed across the 3 cyst types both at the morphological and architectural levels of the lining. Case-wise discrimination between radicular cysts and keratocysts was highly accurate (with an error of just 3.3%). However, the odontogenic keratocyst subtypes could not be reliably separated into the original classes, achieving discrimination rates only slightly above random allocation (60%). Conclusion: The methodology presented is able to provide new measures of epithelial architecture and may help to characterise and compare tissue spatial organisation, as well as provide useful procedures for automating certain aspects of histopathological diagnosis.

  2. Tools for the quantitative analysis of sedimentation boundaries detected by fluorescence optical analytical ultracentrifugation.

    Directory of Open Access Journals (Sweden)

    Huaying Zhao

    Full Text Available Fluorescence optical detection in sedimentation velocity analytical ultracentrifugation allows the study of macromolecules at nanomolar concentrations and below. This has significant promise, for example, for the study of systems of high-affinity protein interactions. Here we describe adaptations of the direct boundary modeling analysis approach implemented in the software SEDFIT that were developed to accommodate unique characteristics of the confocal fluorescence detection system. These include spatial gradients of signal intensity due to scanner movements out of the plane of rotation, temporal intensity drifts due to instability of the laser and fluorophores, and masking of the finite excitation and detection cone by the sample holder. In an extensive series of experiments with enhanced green fluorescent protein ranging from low nanomolar to low micromolar concentrations, we show that the experimental data provide sufficient information to determine the parameters required for first-order approximation of the impact of these effects on the recorded data. Systematic deviations of fluorescence optical sedimentation velocity data analyzed using conventional sedimentation models developed for absorbance and interference optics are largely removed after these adaptations, resulting in excellent fits that highlight the high precision of fluorescence sedimentation velocity data, thus allowing a more detailed quantitative interpretation of the signal boundaries that is otherwise not possible for this system.

  3. Transportation and quantitative analysis of socio-economic development of relations

    Science.gov (United States)

    Chen, Yun

    2017-12-01

    Transportation has a close relationship with socio-economic development. This article selects indicators that measure the development of transportation and of the socio-economy and applies correlation analysis, regression analysis, transportation intensity analysis and transport elasticity analysis to quantify the relationship between them, so as to provide practical guidance for future national development planning.
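    Of the methods named, transport elasticity is the simplest to state: the relative growth of transport volume divided by the relative growth of economic output over the same period. A worked toy example:

```python
# Illustrative numbers only.
freight_t0, freight_t1 = 100.0, 112.0   # freight turnover at two dates
gdp_t0, gdp_t1 = 500.0, 540.0           # economic output at the same dates

elasticity = ((freight_t1 - freight_t0) / freight_t0) / ((gdp_t1 - gdp_t0) / gdp_t0)
print(elasticity)  # 1.5 here: transport grows faster than the economy
```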

  4. Toward best practices in data processing and analysis for intact biotherapeutics by MS in quantitative bioanalysis.

    Science.gov (United States)

    Kellie, John F; Kehler, Jonathan R; Karlinsey, Molly Z; Summerfield, Scott G

    2017-12-01

    Typically, quantitation of biotherapeutics from biological matrices by LC-MS is based on a surrogate peptide approach to determine molecule concentration. Recent efforts have focused on quantitation of the intact protein molecules or larger mass subunits of monoclonal antibodies. To date, there has been limited guidance for large or intact protein mass quantitation in quantitative bioanalysis. Intact- and subunit-level analyses of biotherapeutics from biological matrices are performed in the 12-25 kDa mass range, with quantitation data presented. Linearity, bias and other metrics are presented along with recommendations made on the viability of existing quantitation approaches. This communication is intended to start a discussion around intact protein data analysis and processing, recognizing that other published contributions will be required.

  5. Quantitative phosphoproteomic analysis of porcine muscle within 24 h postmortem

    DEFF Research Database (Denmark)

    Huang, Honggang; Larsen, Martin Røssel; Palmisano, Giuseppe

    2014-01-01

    in meat quality development, a quantitative mass spectrometry-based phosphoproteomic study was performed to analyze the porcine muscle within 24h PM using dimethyl labeling combined with the TiSH phosphopeptide enrichment strategy. In total 305 unique proteins were identified, including 160...... phosphorylation levels in muscle within 24 h PM. The high phosphorylation level of heat shock proteins (HSPs) in early PM may be an adaptive response to slaughter stress and protect muscle cell from apoptosis, as observed in the serine 84 of HSP27. This work indicated that PM muscle proteins underwent significant...... and rigor mortis development in PM muscle. BIOLOGICAL SIGNIFICANCE: The manuscript describes the characterization of postmortem (PM) porcine muscle within 24 h postmortem from the perspective of protein phosphorylation using advanced phosphoproteomic techniques. In the study, the authors employed...

  6. Quantitative analysis of distributed control paradigms of robot swarms

    DEFF Research Database (Denmark)

    Ngo, Trung Dung

    2010-01-01

    Given a task of designing controllers for mobile robots in swarms, one might wonder which distributed control paradigms should be selected. Until now, paradigms of robot controllers have been within either behaviour-based control or neural-network-based control, which have been recognized as the two mainstreams of controller design for mobile robots. However, in swarm robotics, it is not clear how to determine control paradigms. In this paper we study the two control paradigms with various experiments of swarm aggregation. First, we introduce the two control paradigms for mobile robots. Second, we describe the physical and simulated robots, the experiment scenario, and the experiment setup. Third, we present our robot controllers based on behaviour-based and neural-network-based paradigms. Fourth, we graphically show their experiment results and quantitatively analyse the results in comparison of the two...

  7. Dynamic Quantitative Trait Locus Analysis of Plant Phenomic Data.

    Science.gov (United States)

    Li, Zitong; Sillanpää, Mikko J

    2015-12-01

    Advanced platforms have recently become available for automatic and systematic quantification of plant growth and development. These new techniques can efficiently produce multiple measurements of phenotypes over time, and introduce time as an extra dimension to quantitative trait locus (QTL) studies. Functional mapping utilizes a class of statistical models for identifying QTLs associated with the growth characteristics of interest. A major benefit of functional mapping is that it integrates information over multiple timepoints, and therefore could increase the statistical power for QTL detection. We review the current development of computationally efficient functional mapping methods which provide invaluable tools for analyzing large-scale timecourse data that are readily available in our post-genome era. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. Quantitative Analysis of Matrine in Liquid Crystalline Nanoparticles by HPLC

    Directory of Open Access Journals (Sweden)

    Xinsheng Peng

    2014-01-01

    Full Text Available A reversed-phase high-performance liquid chromatographic method has been developed to quantitatively determine matrine in liquid crystalline nanoparticles. The chromatographic method is carried out using an isocratic system. The mobile phase was composed of methanol–PBS (pH 6.8)–triethylamine (50:50:0.1%) at a flow rate of 1 mL/min, with an SPD-20A UV/Vis detector and a detection wavelength of 220 nm. The linearity of matrine is in the range of 1.6 to 200.0 μg/mL. The regression equation is y = 10706x - 2959 (R² = 1.0). The average recovery is 101.7% (RSD = 2.22%, n = 9). This method provides a simple and accurate strategy to determine matrine in liquid crystalline nanoparticles.
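    Back-calculating a concentration from the reported regression line is a one-liner; the sketch below also re-fits the line from calibration points (the peak areas are generated from the published equation, and the unknown area is invented):

```python
import numpy as np

conc = np.array([1.6, 10.0, 50.0, 100.0, 200.0])   # calibration standards, ug/mL
area = 10706.0 * conc - 2959.0                     # peak areas on the reported line

slope, intercept = np.polyfit(conc, area, 1)       # least-squares calibration fit
unknown_area = 500_000.0
print((unknown_area - intercept) / slope)          # ~47 ug/mL, inside the linear range
```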

  9. Quantitative Trait Loci Analysis of Allelopathy in Rice

    DEFF Research Database (Denmark)

    Jensen, L B; Courtois, B; Olofsdotter, M

    2008-01-01

    The allelopathic potential of rice (Oryza sativa L.) against Echinochloa crus-galli (L.) Beauv. was investigated under both laboratory and greenhouse conditions. A population of 150 recombinant inbred lines (RILs) was derived through single-seed descent from a cross between the indica cultivar AC...... the population phenotype was normally distributed. Two quantitative trait loci (QTLs) were located on chromosomes 4 and 7, explaining 20% of the phenotypic variation. A second relay seeding experiment was set up, this time including charcoal in the perlite. This screening showed that the allelopathic rice...... varieties did not have any effect on the weed species when grown with charcoal, the charcoal reversing the effect of any potential allelochemicals exuded from the rice roots. The second phenotypic experiment was conducted under greenhouse conditions in pots. Thirteen QTLs were detected for four different...

  10. Quantitative analysis on electric dipole energy in Rashba band splitting.

    Science.gov (United States)

    Hong, Jisook; Rhim, Jun-Won; Kim, Changyoung; Ryong Park, Seung; Hoon Shim, Ji

    2015-09-01

    We report on a quantitative comparison between the electric dipole energy and the Rashba band splitting in model systems of Bi and Sb triangular monolayers under a perpendicular electric field. We used both first-principles and tight-binding calculations on p-orbitals with spin-orbit coupling. The first-principles calculation shows Rashba band splitting in both systems. It also shows asymmetric charge distributions in the Rashba split bands, which are induced by the orbital angular momentum. We calculated the electric dipole energies from the coupling of the asymmetric charge distribution with the external electric field and compared them to the Rashba splitting. Remarkably, the total split energy is found to come mostly from the difference in the electric dipole energy for both the Bi and Sb systems. A perturbative approach in the long-wavelength limit starting from the tight-binding calculation also supports the conclusion that the Rashba band splitting originates mostly from the electric dipole energy difference in the strong atomic spin-orbit coupling regime.
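    For reference, the textbook Rashba term splits a free-electron band by 2*alpha*|k|, which a two-by-two diagonalisation confirms numerically (model units; alpha, mass and k are illustrative):

```python
import numpy as np

alpha, m = 0.5, 1.0
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)

kx, ky = 0.3, 0.1
k = np.hypot(kx, ky)
H = (k**2 / (2 * m)) * np.eye(2) + alpha * (sx * ky - sy * kx)  # kinetic + Rashba term
E = np.linalg.eigvalsh(H)
print(E[1] - E[0], 2 * alpha * k)  # the splitting equals 2*alpha*|k|
```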

  11. Quantitative Analysis of Rat Dorsal Root Ganglion Neurons Cultured on Microelectrode Arrays Based on Fluorescence Microscopy Image Processing.

    Science.gov (United States)

    Mari, João Fernando; Saito, José Hiroki; Neves, Amanda Ferreira; Lotufo, Celina Monteiro da Cruz; Destro-Filho, João-Batista; Nicoletti, Maria do Carmo

    2015-12-01

    Microelectrode Arrays (MEA) are devices for long-term electrophysiological recording of extracellular spontaneous or evoked activities of in vitro neuron cultures. This work proposes and develops a framework for quantitative and morphological analysis of neuron cultures on MEAs, by processing their corresponding images, acquired by fluorescence microscopy. The neurons are segmented from the fluorescence channel images using a combination of segmentation by thresholding, watershed transform, and object classification. The positioning of microelectrodes is obtained from the transmitted light channel images using the circular Hough transform. The proposed method was applied to images of dissociated cultures of rat dorsal root ganglion (DRG) neuronal cells. The morphological and topological quantitative analysis carried out produced information regarding the state of the culture, such as population count, neuron-to-neuron and neuron-to-microelectrode distances, soma morphologies, neuron sizes, and neuron and microelectrode spatial distributions. Most analyses of microscopy images taken from neuronal cultures on MEAs consider only simple qualitative aspects. The proposed framework also aims to standardize the image processing and to compute useful quantitative measures for integrated image-signal studies and further computational simulations. As the results show, the implemented microelectrode identification method is robust, and so are the implemented neuron segmentation and classification ones (with a correct segmentation rate up to 84%). The quantitative information retrieved by the method is highly relevant to assist the integrated signal-image study of recorded electrophysiological signals as well as the physical aspects of the neuron culture on the MEA. Although the experiments deal with DRG cell images, cortical and hippocampal cell images could also be processed with small adjustments in the image processing parameter estimation.
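    A compact sketch of the segmentation stage (thresholding plus watershed) using scikit-image, with synthetic blobs standing in for a fluorescence channel; the object-classification step and the Hough-based microelectrode detection are omitted:

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.filters import gaussian, threshold_otsu
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

def segment_somata(fluo):
    """Threshold + watershed segmentation of soma-like blobs in a fluorescence image."""
    smooth = gaussian(fluo, sigma=2)
    mask = smooth > threshold_otsu(smooth)
    dist = ndi.distance_transform_edt(mask)
    peaks = peak_local_max(dist, min_distance=5, labels=mask.astype(int))
    markers = np.zeros(dist.shape, dtype=int)
    markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
    return watershed(-dist, markers, mask=mask)   # one integer label per object

rng = np.random.default_rng(0)
img = rng.random((128, 128))
img[30:40, 30:40] += 3.0      # two synthetic bright "somata"
img[80:90, 90:100] += 3.0
print(segment_somata(img).max())  # number of segmented objects
```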

  12. Integration of Qualitative and Quantitative Methods: Building and Interpreting Clusters from Grounded Theory and Discourse Analysis

    Directory of Open Access Journals (Sweden)

    Aldo Merlino

    2007-01-01

    Full Text Available Qualitative methods present a wide spectrum of application possibilities as well as opportunities for combining qualitative and quantitative methods. In the social sciences, fruitful theoretical discussions and a great deal of empirical research have taken place. This article introduces an empirical investigation which demonstrates the logic of combining methodologies as well as the collection and interpretation, both sequential and simultaneous, of qualitative and quantitative data. Specifically, the investigation process is described, beginning with a grounded theory methodology and its combination with the techniques of structural semiotics discourse analysis to generate, in a first phase, an instrument for quantitative measurement and to understand, in a second phase, clusters obtained by quantitative analysis. This work illustrates how qualitative methods allow for the comprehension of the discursive and behavioral elements under study, and how they function as support for making sense of and giving meaning to quantitative data. URN: urn:nbn:de:0114-fqs0701219

  13. Application of non-quantitative modelling in the analysis of a network warfare environment

    CSIR Research Space (South Africa)

    Veerasamy, N

    2008-07-01

    Full Text Available based on the use of secular associations, chronological origins, linked concepts, categorizations and context specifications. This paper proposes the use of non-quantitative methods through a morphological analysis to better explore and define...

  14. Operation Iraqi Freedom 04 - 06: Opportunities to Apply Quantitative Methods to Intelligence Analysis

    National Research Council Canada - National Science Library

    Hansen, Eric C

    2005-01-01

    The purpose of this presentation is to illustrate the need for a quantitative analytical capability within organizations and staffs that provide intelligence analysis to Army, Joint, and Coalition Force headquarters...

  15. Quantitative analysis by microchip capillary electrophoresis – current limitations and problem-solving strategies

    NARCIS (Netherlands)

    Revermann, T.; Götz, S.; Künnemeyer, Jens; Karst, U.

    2008-01-01

    Obstacles and possible solutions for the application of microchip capillary electrophoresis in quantitative analysis are described and critically discussed. Differences between the phenomena occurring during conventional capillary electrophoresis and microchip-based capillary electrophoresis are

  16. Quantitative method of X-ray diffraction phase analysis of building materials

    International Nuclear Information System (INIS)

    Czuba, J.; Dziedzic, A.

    1978-01-01

    A quantitative method of X-ray diffraction phase analysis of building materials, using an internal standard, is presented. The errors committed in determining the content of particular phases are also given. (author)

  17. Spin echo SPI methods for quantitative analysis of fluids in porous media.

    Science.gov (United States)

    Li, Linqing; Han, Hui; Balcom, Bruce J

    2009-06-01

    Fluid density imaging is highly desirable in a wide variety of porous media measurements. The SPRITE class of MRI methods has proven to be robust and general in its ability to generate density images in porous media; however, the short encoding times required, with correspondingly high magnetic field gradient strengths and filter widths, and low flip angle RF pulses, yield sub-optimal S/N images, especially at low static field strength. This paper explores two implementations of pure phase encode spin echo 1D imaging, with application to a proposed new petroleum reservoir core analysis measurement. In the first implementation of the pulse sequence, we modify the spin echo single point imaging (SE-SPI) technique to acquire the k-space origin data point, with a near zero evolution time, from the free induction decay (FID) following a 90° excitation pulse. Subsequent k-space data points are acquired by separately phase encoding individual echoes in a multi-echo acquisition. T(2) attenuation of the echo train yields an image convolution which causes blurring. The T(2) blur effect is moderate for porous media with T(2) lifetime distributions longer than 5 ms. As a robust, high S/N, and fast 1D imaging method, this method will be highly complementary to SPRITE techniques for the quantitative analysis of fluid content in porous media. In the second implementation of the SE-SPI pulse sequence, modification of the basic measurement permits fast determination of spatially resolved T(2) distributions in porous media through separately phase encoding each echo in a multi-echo CPMG pulse train. An individual T(2)-weighted image may be acquired from each echo. The echo time (TE) of each T(2)-weighted image may be reduced to 500 μs or less. These profiles can be fit to extract a T(2) distribution from each pixel employing a variety of standard inverse Laplace transform methods. Fluid content 1D images are produced as an essential by-product of determining the
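    The final step, recovering a T(2) distribution from the separately encoded echoes, is commonly posed as a regularised inverse Laplace transform. A minimal sketch using non-negative least squares on a synthetic bi-exponential decay (timings, amplitudes and the regularisation weight are invented):

```python
import numpy as np
from scipy.optimize import nnls

te = np.arange(1, 65) * 0.5e-3                        # 64 echoes, 0.5 ms spacing
rng = np.random.default_rng(2)
signal = (0.6 * np.exp(-te / 5e-3) + 0.4 * np.exp(-te / 40e-3)
          + rng.normal(0.0, 0.005, te.size))          # two-component decay + noise

t2_grid = np.logspace(-3.3, -0.3, 60)                 # T2 grid, ~0.5 ms .. 500 ms
K = np.exp(-te[:, None] / t2_grid[None, :])           # discretised Laplace kernel
lam = 0.1
A = np.vstack([K, lam * np.eye(t2_grid.size)])        # Tikhonov rows damp noise spikes
b = np.concatenate([signal, np.zeros(t2_grid.size)])
f, _ = nnls(A, b)                                     # non-negative T2 amplitude spectrum
print(t2_grid[f.argmax()])                            # dominant T2 component
```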

  18. A spatial epidemiological analysis of self-rated mental health in the slums of Dhaka

    Directory of Open Access Journals (Sweden)

    Müller Daniel

    2011-05-01

Full Text Available Abstract Background The deprived physical environments present in slums are well known to have adverse health effects on their residents. However, little is known about the health effects of the social environments in slums. Moreover, quantitative spatial analyses of the mental health status of slum residents at the neighbourhood level are still rare. The aim of this paper is to study self-rated mental health data in several slums of Dhaka, Bangladesh, by accounting for neighbourhood social and physical associations using spatial statistics. We hypothesised that mental health would show a significant spatial pattern in different population groups, and that the spatial patterns would relate to spatially correlated health-determining factors (HDF). Methods We applied a spatial epidemiological approach, including non-spatial ANOVA/ANCOVA, as well as global and local univariate and bivariate Moran's I statistics. The WHO-5 Well-being Index was used as a measure of self-rated mental health. Results We found that poor mental health (WHO-5 scores …). Conclusions Spatial patterns of mental health were detected and could be partly explained by spatially correlated HDF. We thereby showed that the socio-physical neighbourhood was significantly associated with health status, i.e., mental health at one location was spatially dependent on the mental health and HDF prevalent at neighbouring locations. Furthermore, the spatial patterns point to severe health disparities both within and between the slums. In addition to examining health outcomes, the methodology used here is also applicable to residuals of regression models, for example to help avoid violating the assumption of data independence that underlies many statistical approaches. We assume that similar spatial structures can be found in other studies focussing on neighbourhood effects on health, and therefore argue for a more widespread incorporation of spatial statistics in epidemiological studies.
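For readers unfamiliar with the statistic, the global Moran's I used in this study can be computed in a few lines. The sketch below uses synthetic scores and an inverse-distance weights matrix; the data and the weighting choice are illustrative, not the paper's:

```python
import numpy as np

def morans_i(x, W):
    """Global Moran's I for values x under a spatial weights matrix W."""
    z = x - x.mean()
    return (len(x) / W.sum()) * (z @ W @ z) / (z @ z)

# Synthetic example: 50 survey locations whose scores trend from west to
# east, so positive spatial autocorrelation is expected.
rng = np.random.default_rng(1)
coords = rng.uniform(0, 1, (50, 2))
scores = 10 * coords[:, 0] + rng.normal(0, 1, 50)

# Inverse-distance weights with a zero diagonal.
d = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
W = np.where(d > 0, 1.0 / d, 0.0)

print(f"Moran's I = {morans_i(scores, W):.3f}")   # > 0 indicates clustering
```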

  19. Capacity analysis of spectrum sharing spatial multiplexing MIMO systems

    KAUST Repository

    Yang, Liang; Qaraqe, Khalid A.; Serpedin, Erchin; Alouini, Mohamed-Slim

    2014-01-01

    This paper considers a spectrum sharing (SS) multiple-input multiple-output (MIMO) system operating in a Rayleigh fading environment. First the capacity of a single-user SS spatial multiplexing system is investigated in two scenarios that assume

  20. A preliminary survey and analysis of the spatial distribution of ...

    African Journals Online (AJOL)

    The spatial distribution of aquatic macroinvertebrates in the Okavango River ... of taxa was recorded in marginal vegetation in the channels and lagoons, ... highlights the importance of maintaining a mosaic of aquatic habitats in the Delta.

  1. A preliminary survey and analysis of the spatial distribution of ...

    African Journals Online (AJOL)

    The spatial distribution of aquatic macroinvertebrates in the Okavango River Delta, ... seasonally-flooded pools and temporary rain-filled pools in MGR and CI. ... biodiversity of the Okavango Delta, thereby contributing to its conservation.

  2. Spatial analysis based on variance of moving window averages

    OpenAIRE

    Wu, B M; Subbarao, K V; Ferrandino, F J; Hao, J J

    2006-01-01

    A new method for analysing spatial patterns was designed based on the variance of moving window averages (VMWA), which can be directly calculated in geographical information systems or a spreadsheet program (e.g. MS Excel). Different types of artificial data were generated to test the method. Regardless of data types, the VMWA method correctly determined the mean cluster sizes. This method was also employed to assess spatial patterns in historical plant disease survey data encompassing both a...
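A minimal sketch of the VMWA idea, under the assumption that it amounts to computing the variance of moving-window means over a range of window sizes; the clustered data are synthetic and this is not the authors' exact estimator:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def vmwa(counts, window_sizes):
    """Variance of moving-window averages for a 2-D lattice of counts."""
    return {w: uniform_filter(counts.astype(float), size=w).var()
            for w in window_sizes}

# Synthetic disease map: six 8x8 clusters of affected plants on a 64x64 grid.
rng = np.random.default_rng(2)
field = np.zeros((64, 64))
for _ in range(6):
    r, c = rng.integers(0, 56, 2)
    field[r:r+8, c:c+8] = 1

for w, v in vmwa(field, [2, 4, 8, 16, 32]).items():
    print(f"window {w:2d}: variance of window means = {v:.4f}")
# The variance of window means collapses once the window grows past the
# cluster size, which is what points to the ~8-cell clusters here.
```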

  3. Quantitative phosphoproteomic analysis of porcine muscle within 24 h postmortem.

    Science.gov (United States)

    Huang, Honggang; Larsen, Martin R; Palmisano, Giuseppe; Dai, Jie; Lametsch, René

    2014-06-25

    Protein phosphorylation can regulate most of the important processes in muscle, such as metabolism and contraction. The postmortem (PM) metabolism and rigor mortis have essential effects on meat quality. In order to identify and characterize the protein phosphorylation events involved in meat quality development, a quantitative mass spectrometry-based phosphoproteomic study was performed to analyze the porcine muscle within 24h PM using dimethyl labeling combined with the TiSH phosphopeptide enrichment strategy. In total 305 unique proteins were identified, including 160 phosphoproteins with 784 phosphorylation sites. Among these, 184 phosphorylation sites on 93 proteins had their phosphorylation levels significantly changed. The proteins involved in glucose metabolism and muscle contraction were the two largest clusters of phosphoproteins with significantly changed phosphorylation levels in muscle within 24 h PM. The high phosphorylation level of heat shock proteins (HSPs) in early PM may be an adaptive response to slaughter stress and protect muscle cell from apoptosis, as observed in the serine 84 of HSP27. This work indicated that PM muscle proteins underwent significant changes at the phosphorylation level but were relatively stable at the total protein level, suggesting that protein phosphorylation may have important roles in meat quality development through the regulation of proteins involved in glucose metabolism and muscle contraction, thereby affecting glycolysis and rigor mortis development in PM muscle. The manuscript describes the characterization of postmortem (PM) porcine muscle within 24 h postmortem from the perspective of protein phosphorylation using advanced phosphoproteomic techniques. In the study, the authors employed the dimethyl labeling combined with the TiSH phosphopeptide enrichment and LC-MS/MS strategy. This was the first high-throughput quantitative phosphoproteomic study in PM muscle of farm animals. In the work, both the proteome

  4. Spatial Analysis of Case-Mix and Dialysis Modality Associations.

    Science.gov (United States)

    Phirtskhalaishvili, Tamar; Bayer, Florian; Edet, Stephane; Bongiovanni, Isabelle; Hogan, Julien; Couchoud, Cécile

    2016-01-01

♦ Health-care systems must attempt to provide appropriate, high-quality, and economically sustainable care that meets the needs and choices of patients with end-stage renal disease (ESRD). France offers 9 different modalities of dialysis, each characterized by dialysis technique, the extent of professional assistance, and the treatment site. The aim of this study was 1) to describe the various dialysis modalities in France and the patient characteristics associated with each of them, and 2) to analyze their regional patterns to identify possible unexpected associations between case-mixes and dialysis modalities. ♦ The clinical characteristics of the 37,421 adult patients treated by dialysis were described according to their treatment modality. Agglomerative hierarchical cluster analysis was used to aggregate the regions into clusters according to their use of these modalities and the characteristics of their patients. ♦ The gradient of patient characteristics was similar from home hemodialysis (HD) to in-center HD and from non-assisted automated peritoneal dialysis (APD) to assisted continuous ambulatory peritoneal dialysis (CAPD). Analyzing their spatial distribution, we found differences in the patient case-mix on dialysis across regions but also differences in the health-care provided for them. The classification of the regions into 6 different clusters allowed us to detect some unexpected associations between case-mixes and treatment modalities. ♦ The 9 modalities of treatment available make it theoretically possible to adapt treatment to patients' clinical characteristics and abilities. However, although we found an overall appropriate association of dialysis modalities to the case-mix, major inter-region heterogeneity and the low rate of peritoneal dialysis (PD) and home HD suggest that factors besides patients' clinical conditions impact the choice of dialysis modality. The French organization should now be evaluated in terms of patients' quality of
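As a sketch of the clustering step, the snippet below applies Ward's agglomerative clustering to hypothetical region-by-modality usage profiles and cuts the tree at six clusters, mirroring the six clusters reported; all data are fabricated for illustration:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical data: rows are regions, columns are the shares of patients
# on each of the 9 dialysis modalities (each row sums to 1).
rng = np.random.default_rng(3)
usage = rng.dirichlet(np.ones(9), size=22)          # 22 regions, 9 modalities

# Ward agglomerative clustering on the modality-mix profiles,
# cut into 6 clusters as in the study.
Z = linkage(usage, method="ward")
clusters = fcluster(Z, t=6, criterion="maxclust")

for k in range(1, 7):
    print(f"cluster {k}: regions {np.flatnonzero(clusters == k).tolist()}")
```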

  5. [Device for quantitative analysis of perception and pain sensation].

    Science.gov (United States)

    Arita, Hideko; Kato, Jitsu; Ogawa, Setsuro; Hanaoka, Kazuo

    2014-07-01

The article describes an analysing device that measures the perception and intensity of pain quantitatively. While it is not necessarily true that the psychological aspect is totally irrelevant to pain measurement, this device is remarkable in that it can measure the intensity of pain felt by the patient more objectively by using electric stimuli. A distinctive feature of the device is that it uses a non-painful sensation as the reference for measuring the intensity of pain. The device is compact, lightweight, and portable. Unlike the VAS, which requires only a scale, the device requires a person to carry out the measurement. Nevertheless, as National Health Insurance (NHI) coverage has been approved, introduction of the device may be facilitated in terms of the budget for purchase and labor. The device is useful for better understanding not only the intensity of pain but also the pathological conditions, resulting in more appropriate treatment, by (1) comparing the degree of pain or VAS values obtained in a multicenter study with those of a patient; (2) using both the degree of pain and VAS; and (3) taking multiple measurements of the degree of pain and VAS in one case.

  6. Quantitative analysis of impact measurements using dynamic load cells

    Directory of Open Access Journals (Sweden)

    Brent J. Maranzano

    2016-03-01

Full Text Available A mathematical model is used to estimate material properties from a short-duration transient impact force measured by dropping spheres onto rectangular coupons fixed to a dynamic load cell. The contact stress between the dynamic load cell surface and the projectile is modeled using Hertzian contact mechanics. Due to the short impact time relative to the load cell dynamics, an additional Kelvin–Voigt element is included in the model to account for the finite response time of the piezoelectric crystal. Calculations with and without the Kelvin–Voigt element are compared to experimental data collected from combinations of polymeric spheres and polymeric and metallic surfaces. The results illustrate that the inclusion of the Kelvin–Voigt element qualitatively captures the post-impact resonance and non-linear behavior of the load cell signal and quantitatively improves the estimation of the Young's elastic modulus and Poisson's ratio. Mathematically, the additional KV element couples one additional differential equation to the Hertzian spring-dashpot equation. The model can be numerically integrated in seconds using standard numerical techniques, allowing its use as a rapid technique for the estimation of material properties. Keywords: Young's modulus, Poisson's ratio, Dynamic load cell
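A rough numerical sketch of the modeling idea: integrate the Hertzian contact equation for a dropped sphere and pass the true force through a first-order sensor response that stands in for the Kelvin–Voigt element. All parameters are invented, and the sensor model is a simplification of the authors' coupled formulation:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Invented parameters: a 10 g polymer sphere hitting the cell at 1 m/s,
# Hertzian stiffness k_h, and a first-order sensor time constant tau that
# stands in for the finite response of the piezoelectric crystal.
m, k_h, tau = 0.010, 5e8, 2e-5        # kg, N/m^1.5, s

def rhs(t, y):
    delta, v, f_meas = y                      # indentation, velocity, sensor output
    f_true = k_h * max(delta, 0.0)**1.5       # Hertzian contact force
    return [v, -f_true / m, (f_true - f_meas) / tau]

sol = solve_ivp(rhs, (0, 5e-4), [0.0, 1.0, 0.0], max_step=1e-6)
print(f"peak contact force  ~ {k_h * sol.y[0].max()**1.5:.0f} N")
print(f"peak sensor reading ~ {sol.y[2].max():.0f} N (lagged and smoothed)")
```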

  7. Quantitative risk analysis offshore-Human and organizational factors

    International Nuclear Information System (INIS)

    Espen Skogdalen, Jon; Vinnem, Jan Erik

    2011-01-01

Quantitative Risk Analyses (QRAs) are one of the main tools for risk management within the Norwegian and UK oil and gas industry. Much criticism has been directed at the limitations of the QRA models and at the fact that QRAs do not include human and organizational factors (HOF). Norwegian and UK offshore legislation and guidelines require that HOF be included in QRAs. A study of 15 QRAs shows that these factors are included only to some extent, and that there are large differences between the QRAs. The QRAs are categorized into four levels according to the findings. Level 1 QRAs do not describe or comment on HOF at all. Relevant research projects have been conducted to fulfill the requirements of Level 3 analyses. At this level, there is a systematic collection of data related to HOF, the methods are systematic and documented, and the QRAs are adjusted accordingly. None of the QRAs fulfill the Level 4 requirements. Level 4 QRAs include HOF in the model, describe the factors, and explain how the results should be followed up in the overall risk management. Safety audits by regulatory authorities are probably necessary to point out the direction for QRA and to speed up the development.

  8. Quantitative analysis of cholesteatoma using high resolution computed tomography

    International Nuclear Information System (INIS)

    Kikuchi, Shigeru; Yamasoba, Tatsuya; Iinuma, Toshitaka.

    1992-01-01

Seventy-three cases of adult cholesteatoma, including 52 cases of pars flaccida type cholesteatoma and 21 of pars tensa type cholesteatoma, were examined using high resolution computed tomography, in both axial (lateral semicircular canal plane) and coronal sections (cochlear, vestibular and antral planes). These cases were classified into two subtypes according to the presence of extension of cholesteatoma into the antrum. Sixty cases of chronic otitis media with central perforation (COM) were also examined as controls. Various locations of the middle ear cavity were measured in terms of size and compared among pars flaccida type cholesteatoma, pars tensa type cholesteatoma and COM. The width of the attic was significantly larger in both pars flaccida type and pars tensa type cholesteatoma than in COM. With pars flaccida type cholesteatoma there was a significantly larger distance between the malleus and the lateral wall of the attic than with COM. In contrast, the distance between the malleus and the medial wall of the attic was significantly larger with pars tensa type cholesteatoma than with COM. With cholesteatoma extending into the antrum, regardless of the type of cholesteatoma, the following dimensions were significantly larger than with COM: the width and height of the aditus ad antrum, and the width, height and anterior-posterior diameter of the antrum. However, these dimensions were not significantly different between cholesteatoma without extension into the antrum and COM. The hitherto qualitative impressions of bone destruction in cholesteatoma were thus quantitatively verified in detail using high resolution computed tomography. (author)

  9. Quantitative XRD analysis of {110} twin density in biotic aragonites.

    Science.gov (United States)

    Suzuki, Michio; Kim, Hyejin; Mukai, Hiroki; Nagasawa, Hiromichi; Kogure, Toshihiro

    2012-12-01

{110} Twin densities in biotic aragonite have been estimated quantitatively from the peak widths of specific reflections in powder X-ray diffraction (XRD) patterns, together with direct confirmation of the twins using transmission electron microscopy (TEM). The influence of the twin density on the peak widths in the XRD pattern was simulated using the DIFFaX program, treating the (110) twin as an interstratification of two types of aragonite unit layers with a mirrored relationship. The simulation suggested that the twin density can be estimated from the difference in peak widths between the 111 and 021 reflections, or between the 221 and 211 reflections. Biotic aragonite in the crossed-lamellar microstructure (three species) and nacreous microstructure (four species) of molluscan shells, fish otoliths (two species), and a coral were investigated. The XRD analyses indicated that aragonite crystals in the crossed-lamellar microstructure of the three species contain a high density of twins, which is consistent with the TEM examination. On the other hand, aragonite in the nacre of the four species showed almost no difference in peak widths between the paired reflections, indicating low twin densities. The results for the fish otoliths varied between the species. Such variation of the twin density in biotic aragonites may reflect different schemes of crystal growth in biomineralization. Copyright © 2012 Elsevier Inc. All rights reserved.

  10. Quantitative Analysis and Efficient Surface Modification of Silica Nanoparticles

    Directory of Open Access Journals (Sweden)

    Hak-Sung Jung

    2012-01-01

Full Text Available Amino-functional trialkoxysilanes such as aminopropyltrimethoxysilane (APTMS) and (3-trimethoxysilylpropyl)diethylenetriamine (DETAS) were employed as surface modification molecules for generating a monolayer modification on the surface of silica (SiO2) nanoparticles. We were able to quantitatively analyze the number of amine functional groups on the modified SiO2 nanoparticles by an acid-base back titration method, and to determine the effective number of amine functional groups for the successive chemical reaction by absorption measurements after treatment with fluorescent rhodamine B isothiocyanate (RITC) molecules. The numbers of amine sites measured by back titration were 2.7 and 7.7 ea/nm2 for SiO2-APTMS and SiO2-DETAS, respectively, while the numbers of effective amine sites measured by absorption calibration were about one fifth of the total amine sites, namely 0.44 and 1.3 ea/nm2 for SiO2-APTMS(RITC) and SiO2-DETAS(RITC), respectively. Furthermore, it was confirmed that the reactivity of amino groups on the surface-modified silica nanoparticles could be maintained in ethanol for more than 1.5 months without any significant differences in reactivity.

  11. Quantitative analysis of TALE-DNA interactions suggests polarity effects.

    Science.gov (United States)

    Meckler, Joshua F; Bhakta, Mital S; Kim, Moon-Soo; Ovadia, Robert; Habrian, Chris H; Zykovich, Artem; Yu, Abigail; Lockwood, Sarah H; Morbitzer, Robert; Elsäesser, Janett; Lahaye, Thomas; Segal, David J; Baldwin, Enoch P

    2013-04-01

Transcription activator-like effectors (TALEs) have revolutionized the field of genome engineering. We present here a systematic assessment of TALE DNA recognition, using quantitative electrophoretic mobility shift assays and reporter gene activation assays. Within TALE proteins, tandem 34-amino acid repeats recognize one base pair each and direct sequence-specific DNA binding through repeat variable di-residues (RVDs). We found that RVD choice can affect affinity by four orders of magnitude, with the relative RVD contribution in the order NG > HD ≈ NN > NI > NK. The NN repeat preferred the base G over A, whereas the NK repeat bound G with 10^3-fold lower affinity. We compared AvrBs3, a naturally occurring TALE that recognizes its target using some atypical RVD-base combinations, with a designed TALE that precisely matches 'standard' RVDs with the target bases. This comparison revealed unexpected differences in sensitivity to substitutions of the invariant 5'-T. Another surprising observation was that base mismatches at the 5' end of the target site had more disruptive effects on affinity than those at the 3' end, particularly in designed TALEs. These results provide evidence that TALE-DNA recognition exhibits a hitherto undescribed polarity effect, in which the N-terminal repeats contribute more to affinity than C-terminal ones.

  12. Quantitative analysis of the thermal damping of coherent axion oscillations

    International Nuclear Information System (INIS)

    Turner, M.S.

    1985-01-01

Unruh and Wald have recently discussed a new mechanism for damping coherent axion oscillations, ''thermal damping,'' which occurs due to the temperature dependence of the axion mass and neutrino viscosity. We investigate the effect quantitatively and find that the present energy density in axions can be written as $\rho_a = \rho_{a0}/(1+J_{UW})$, where $\rho_{a0}$ is what the axion energy density would be in the absence of the thermal-damping effect and $J_{UW}$ is an integral whose integrand depends upon $(dm_a/dT)^2$. As a function of $f$ ($\equiv$ Peccei-Quinn symmetry-breaking scale), $J_{UW}$ achieves its maximum value for $f_{PQ} \approx 3 \times 10^{12}$ GeV; unless the axion mass turn-on is very sudden, $|(T/m_a)(dm_a/dT)| \gg 1$, $J_{UW}$ is $\ll 1$, implying that this damping mechanism is not significant

  13. Quantitative analysis of minerals by X-ray diffraction

    International Nuclear Information System (INIS)

    Pietroluongo, L.R.V.; Veiga, M.M. da

    1982-01-01

Considerations about the X-ray diffraction technique for quantitative analyses are made, and some experiments carried out at CETEM - Centro de Tecnologia Mineral (Rio de Janeiro, Brazil) with synthetic samples and real samples of diatomites (from the northeastern region of Brazil) are described. Quartz quantification has long been a problem for analytical chemists and is of great importance to the industries which use this raw material. Comments are made on the main factors influencing the intensity of diffracted X-rays, such as the crystallinity of the mineral phase, the granulometry, the preferential orientation, sample preparation and pressing, the chemical composition of standards, and the experimental analytical conditions. Several analytical methods are described: direct measurement of the height or area of a peak resulting from a particular reflection and comparison with a pre-calibrated curve; sequential addition of the mineral of interest to the sample and extrapolation of the results to zero addition; and the methods of external and internal standards. (C.L.B.) [pt

  14. Quantitative analysis of complexes in electron irradiated CZ silicon

    International Nuclear Information System (INIS)

    Inoue, N.; Ohyama, H.; Goto, Y.; Sugiyama, T.

    2007-01-01

Complexes in helium- or electron-irradiated silicon are quantitatively analyzed by highly sensitive and accurate infrared (IR) absorption spectroscopy. The carbon concentration (1×10^15–1×10^17 cm^-3) and the helium dose (5×10^12–5×10^13 cm^-2) or electron dose (1×10^15–1×10^17 cm^-2) are varied by two orders of magnitude, in a relatively low regime compared to previous works. It is demonstrated that the carbon-related complex in commercial-grade, low-carbon-concentration silicon with a low electron dose can be detected clearly, and the concentration of these complexes is estimated. It is clarified that the complex configuration and thermal behavior in low-carbon, low-dose samples are simple and almost confined within the individual complex family, compared to those in high-concentration, high-dose samples. The well-established complex behavior in the electron-irradiated samples is compared to that in the He-irradiated samples, obtained by deep level transient spectroscopy (DLTS) or cathodoluminescence (CL), which has a close relation to Si power device performance

  15. Quantitative analysis of task selection for brain-computer interfaces

    Science.gov (United States)

    Llera, Alberto; Gómez, Vicenç; Kappen, Hilbert J.

    2014-10-01

    Objective. To assess quantitatively the impact of task selection in the performance of brain-computer interfaces (BCI). Approach. We consider the task-pairs derived from multi-class BCI imagery movement tasks in three different datasets. We analyze for the first time the benefits of task selection on a large-scale basis (109 users) and evaluate the possibility of transferring task-pair information across days for a given subject. Main results. Selecting the subject-dependent optimal task-pair among three different imagery movement tasks results in approximately 20% potential increase in the number of users that can be expected to control a binary BCI. The improvement is observed with respect to the best task-pair fixed across subjects. The best task-pair selected for each subject individually during a first day of recordings is generally a good task-pair in subsequent days. In general, task learning from the user side has a positive influence in the generalization of the optimal task-pair, but special attention should be given to inexperienced subjects. Significance. These results add significant evidence to existing literature that advocates task selection as a necessary step towards usable BCIs. This contribution motivates further research focused on deriving adaptive methods for task selection on larger sets of mental tasks in practical online scenarios.

  16. Quantitative Performance Analysis of the SPEC OMPM2001 Benchmarks

    Directory of Open Access Journals (Sweden)

    Vishal Aslot

    2003-01-01

Full Text Available The state of modern computer systems has evolved to allow easy access to multiprocessor systems by supporting multiple processors on a single physical package. As the multiprocessor hardware evolves, new ways of programming it are also developed. Some inventions may merely be adopting and standardizing older paradigms. One such evolving standard for programming shared-memory parallel computers is the OpenMP API. The Standard Performance Evaluation Corporation (SPEC) has created a suite of parallel programs called SPEC OMP to compare and evaluate modern shared-memory multiprocessor systems using the OpenMP standard. We have studied these benchmarks in detail to understand their performance on a modern architecture. In this paper, we present detailed measurements of the benchmarks. We organize, summarize, and display our measurements using a Quantitative Model, and present a detailed discussion and derivation of the model. We also discuss the important loops in the SPEC OMPM2001 benchmarks and the reasons for less-than-ideal speedup on our platform.

  17. A CGE analysis for quantitative evaluation of electricity market changes

    International Nuclear Information System (INIS)

    Hwang, Won-Sik; Lee, Jeong-Dong

    2015-01-01

Risk and uncertainty entailed by electricity industry privatization impose a heavy burden on political decision-making. In this sense, ex ante analyses are important in order to investigate the economic effects of privatization or liberalization in the electricity industry. To carry out such quantitative analyses, a novel approach is developed that incorporates a top-down and a bottom-up model, taking economic effects and technological constraints into account simultaneously. This study also examines various counterfactual scenarios after the Korean electricity industry reform through the integrated framework. Simulation results imply that the authorities should prepare an improved regulatory system and policy measures, such as forward contracts, for industry reform, in order to promote competition in the distribution sector as well as the generation sector. -- Highlights: •A novel approach is proposed for incorporating a top-down and bottom-up model. •This study examines various counterfactual scenarios after Korean electricity industry reform. •An improved regulatory system and policy measures are required before the reform

  18. An iterative approach to case study analysis: insights from qualitative analysis of quantitative inconsistencies

    Directory of Open Access Journals (Sweden)

    Allain J Barnett

    2016-09-01

Full Text Available Large-N comparative studies have helped common pool resource scholars gain general insights into the factors that influence collective action and governance outcomes. However, these studies are often limited by missing data, and suffer from the methodological limitation that important information is lost when we reduce textual information to quantitative data. This study was motivated by nine case studies that appeared to be inconsistent with the expectation that the presence of Ostrom's Design Principles increases the likelihood of successful common pool resource governance. These cases highlight the limitations of coding and analysing large-N case studies. We examine two issues: (1) the challenge of missing data and (2) potential approaches that rely on context (which is often lost in the coding process) to address inconsistencies between empirical observations and theoretical predictions. For the latter, we conduct a post-hoc qualitative analysis of a large-N comparative study to explore two types of inconsistencies: (1) cases where evidence for nearly all design principles was found, but the available evidence led to the assessment that the CPR system was unsuccessful, and (2) cases where the CPR system was deemed successful despite finding limited or no evidence for design principles. We describe the challenges inherent in coding complex and dynamically changing common pool resource systems for the presence or absence of design principles, and in determining "success", in large-N comparative analysis. Finally, we illustrate how, in some cases, our qualitative analysis revealed that the identity of the absent design principles explained the inconsistencies, hence de facto reconciling such apparent inconsistencies with theoretical predictions. This analysis demonstrates the value of combining quantitative and qualitative analysis, and of using mixed-methods approaches iteratively to build comprehensive methodological and theoretical approaches to understanding

  19. [Quantitative spectrum analysis of characteristic gases of spontaneous combustion coal].

    Science.gov (United States)

    Liang, Yun-Tao; Tang, Xiao-Jun; Luo, Hai-Zhu; Sun, Yong

    2011-09-01

Aimed at the characteristics of spontaneous combustion gases, such as the variety of gases involved, the low limits of detection, and the critical safety requirements, Fourier transform infrared (FTIR) spectral analysis is presented for analyzing the characteristic gases of spontaneous combustion. In this paper, the analysis method is first introduced by combining the characteristics of the absorption spectra of the analytes with the analysis requirements. Parameter setting, sample preparation, feature variable extraction and analysis model building are taken into consideration, and the methods of sample preparation, feature extraction and analysis modelling are described in detail. Eleven kinds of gases were then tested with a Tensor 27 spectrometer: CH4, C2H6, C3H8, iC4H10, nC4H10, C2H4, C3H6, C2H2, SF6, CO and CO2. The optical path length was 10 cm, and the spectral resolution was set to 1 cm^-1. The testing results show that the detection limit of all analytes is less than 2×10^-6. All the detection limits meet the measurement requirements for spontaneous combustion gases, which means that FTIR may be an ideal instrument, and the analysis method used in this paper competent, for online measurement of spontaneous combustion gases.
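One standard way to quantify a gas mixture from an FTIR spectrum is classical least squares against reference spectra (Beer-Lambert, A = Kc). Whether the authors use exactly this scheme is not stated, so the sketch below is generic, with random stand-in reference spectra:

```python
import numpy as np

# Classical least squares: mixture absorbance A = K c (Beer-Lambert), where
# the columns of K are reference spectra (absorptivity x path length) of the
# individual gases. Random spectra stand in for a real reference library.
rng = np.random.default_rng(4)
K = np.abs(rng.normal(0, 1, (400, 3)))       # 400 wavenumbers, 3 gases

c_true = np.array([5e-6, 2e-6, 8e-6])        # mole fractions
A = K @ c_true + rng.normal(0, 1e-8, 400)    # measured mixture spectrum

c_hat, *_ = np.linalg.lstsq(K, A, rcond=None)
print("estimated concentrations:", c_hat)
```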

  20. Quantitative Brightness Analysis of Fluorescence Intensity Fluctuations in E. Coli.

    Directory of Open Access Journals (Sweden)

    Kwang-Ho Hur

    Full Text Available The brightness measured by fluorescence fluctuation spectroscopy specifies the average stoichiometry of a labeled protein in a sample. Here we extended brightness analysis, which has been mainly applied in eukaryotic cells, to prokaryotic cells with E. coli serving as a model system. The small size of the E. coli cell introduces unique challenges for applying brightness analysis that are addressed in this work. Photobleaching leads to a depletion of fluorophores and a reduction of the brightness of protein complexes. In addition, the E. coli cell and the point spread function of the instrument only partially overlap, which influences intensity fluctuations. To address these challenges we developed MSQ analysis, which is based on the mean Q-value of segmented photon count data, and combined it with the analysis of axial scans through the E. coli cell. The MSQ method recovers brightness, concentration, and diffusion time of soluble proteins in E. coli. We applied MSQ to measure the brightness of EGFP in E. coli and compared it to solution measurements. We further used MSQ analysis to determine the oligomeric state of nuclear transport factor 2 labeled with EGFP expressed in E. coli cells. The results obtained demonstrate the feasibility of quantifying the stoichiometry of proteins by brightness analysis in a prokaryotic cell.

  1. Comparative analysis of time efficiency and spatial resolution between different EIT reconstruction algorithms

    International Nuclear Information System (INIS)

    Kacarska, Marija; Loskovska, Suzana

    2002-01-01

In this paper, a comparative analysis of different EIT reconstruction algorithms is presented. The analysis considers the spatial and temporal resolution of the images obtained by several different algorithms, and also discusses the dependence of spatial resolution on the data acquisition method. The obtained results show that conventional applied-current EIT is more powerful than induced-current EIT. (Author)

  2. Quantitative assessment of in-solution digestion efficiency identifies optimal protocols for unbiased protein analysis

    DEFF Research Database (Denmark)

    Leon, Ileana R; Schwämmle, Veit; Jensen, Ole N

    2013-01-01

    a combination of qualitative and quantitative LC-MS/MS methods and statistical data analysis. In contrast to previous studies we employed both standard qualitative as well as data-independent quantitative workflows to systematically assess trypsin digestion efficiency and bias using mitochondrial protein...... conditions (buffer, RapiGest, deoxycholate, urea), and two methods for removal of detergents prior to analysis of peptides (acid precipitation or phase separation with ethyl acetate). Our data-independent quantitative LC-MS/MS workflow quantified over 3700 distinct peptides with 96% completeness between all...... protocols and replicates, with an average 40% protein sequence coverage and an average of 11 peptides identified per protein. Systematic quantitative and statistical analysis of physicochemical parameters demonstrated that deoxycholate-assisted in-solution digestion combined with phase transfer allows...

  3. Stable Isotope Quantitative N-Glycan Analysis by Liquid Separation Techniques and Mass Spectrometry.

    Science.gov (United States)

    Mittermayr, Stefan; Albrecht, Simone; Váradi, Csaba; Millán-Martín, Silvia; Bones, Jonathan

    2017-01-01

    Liquid phase separation analysis and subsequent quantitation remains a challenging task for protein-derived oligosaccharides due to their inherent structural complexity and diversity. Incomplete resolution or co-detection of multiple glycan species complicates peak area-based quantitation and associated statistical analysis when optical detection methods are used. The approach outlined herein describes the utilization of stable isotope variants of commonly used fluorescent tags that allow for mass-based glycan identification and relative quantitation following separation by liquid chromatography (LC) or capillary electrophoresis (CE). Comparability assessment of glycoprotein-derived oligosaccharides is performed by derivatization with commercially available isotope variants of 2-aminobenzoic acid or aniline and analysis by LC- and CE-mass spectrometry. Quantitative information is attained from the extracted ion chromatogram/electropherogram ratios generated from the light and heavy isotope clusters.

  4. Quantitative analysis of some volatile components in Mimusops elengi L.

    Directory of Open Access Journals (Sweden)

    Chantana Aromdee

    2009-08-01

Full Text Available Dried pikul flower (Mimusops elengi L., Sapotaceae) is used in many recipes of Thai traditional medicine, e.g. as a cardiotonic and stomachic. In this study, fresh and dried pikul flowers were investigated. The odour of the pikul flower, even when dried, is very strong and characteristic. The constituents of the volatile oils in fresh and dried pikul flowers extracted with ether were analysed by gas chromatography-mass spectrometry. 2-Phenylethanol, 4-hydroxybenzenemethanol and cinnamyl alcohol were mainly found in the fresh flower (10.49, 8.69 and 6.17%, respectively), whereas those mainly found in the dried flowers were a long-chain carboxylic acid ester and (Z)-9-octadecenoic acid (5.37 and 4.71% of the ether extract, respectively). An analytical method simultaneously determining benzyl alcohol, 2-phenylethanol and methyl paraben was developed using GC-FID. The percent recoveries were 91.66, 104.59 and 105.28%, respectively. The intraday variations (% RSD) were 7.22, 6.67 and 1.86%, and the interday variations were 3.12, 2.52 and 3.55%, respectively. The detection limits were 0.005, 0.014 and 0.001 ppm, and the quantitation limits were 0.015, 0.048 and 0.003 ppm, respectively. The benzyl alcohol, 2-phenylethanol and methyl paraben contents of the dried flowers (9 samples from various drug stores in Thailand and one sample from China) were 6.40-13.46, 17.57-196.57 and 27.35-355.53 ppm, respectively.

  5. Quantitative genetic analysis of anxiety trait in bipolar disorder.

    Science.gov (United States)

    Contreras, J; Hare, E; Chavarría, G; Raventós, H

    2018-01-01

Bipolar disorder type I (BPI) affects approximately 1% of the world population. Although genetic influences on bipolar disorder are well established, identification of genes that predispose to the illness has been difficult, as most genetic studies are based on categorical diagnosis. One strategy to overcome this obstacle is the use of quantitative endophenotypes, as has been done for other medical disorders. We studied 619 individuals: 568 participants from 61 extended families and 51 unrelated healthy controls. The sample was 55% female and had a mean age of 43.25 (SD 13.90; range 18-78). The heritability and genetic correlation of the trait scale from the State-Trait Anxiety Inventory (STAI) were computed using the general linear model (SOLAR software package). We observed that the anxiety trait meets the following criteria for an endophenotype of BPI: (1) association with BPI (individuals with BPI showed the highest trait score; F = 15.20 [5,24], p = 0.009); (2) state-independence, confirmed after conducting a test-retest in 321 subjects; (3) co-segregation within families; (4) heritability of 0.70 (SE 0.060), p = 2.33 × 10^-14; and (5) genetic correlation with BPI of 0.20 (SE 0.17, p = 3.12 × 10^-5). Confounding factors such as comorbid disorders and pharmacological treatment could affect the clinical relationship between BPI and the anxiety trait. Further research is needed to evaluate whether anxiety traits are specifically related to BPI in comparison with other traits such as anger, attention or response inhibition deficits, pathological impulsivity or low self-directedness. The anxiety trait is a heritable phenotype that follows a normal distribution when measured not only in subjects with BPI but also in unrelated healthy controls. It could be used as an endophenotype in BPI for the identification of genomic regions with susceptibility genes for this disorder. Published by Elsevier B.V.

  6. Colorectal carcinoma: Ex vivo evaluation using 3-T high-spatial-resolution quantitative T2 mapping and its correlation with histopathologic findings.

    Science.gov (United States)

    Yamada, Ichiro; Yoshino, Norio; Hikishima, Keigo; Miyasaka, Naoyuki; Yamauchi, Shinichi; Uetake, Hiroyuki; Yasuno, Masamichi; Saida, Yukihisa; Tateishi, Ukihide; Kobayashi, Daisuke; Eishi, Yoshinobu

    2017-05-01

In this study, we aimed to evaluate the feasibility of determining the mural invasion depths of colorectal carcinomas using high-spatial-resolution (HSR) quantitative T2 mapping on a 3-T magnetic resonance (MR) scanner. Twenty colorectal specimens containing adenocarcinomas were imaged on a 3-T MR system equipped with a 4-channel phased-array surface coil. HSR quantitative T2 maps were acquired using a spin-echo sequence with a repetition time/echo time of 7650/22.6-361.6 ms (16 echoes), 87 × 43.5-mm field of view, 2-mm section thickness, 448 × 224 matrix, and one signal average. HSR fast spin-echo T2-weighted images were also acquired. Differences between the T2 values (ms) of the tumor tissue, the colorectal wall layers, and fibrosis were measured, and the MR images and histopathologic findings were compared. In all specimens (20/20, 100%), the HSR quantitative T2 maps clearly depicted an 8-layer normal colorectal wall in which the T2 values of each layer differed from those of the adjacent layer(s) (P …); the T2 maps and histopathologic data yielded the same findings regarding the tumor invasion depth. Our results indicate that 3-T HSR quantitative T2 mapping is useful for distinguishing colorectal wall layers and differentiating tumor and fibrotic tissues. Accordingly, this technique could be used to determine mural invasion by colorectal carcinomas with a high level of accuracy. Copyright © 2017 Elsevier Inc. All rights reserved.
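Quantitative T2 mapping of this kind fits a mono-exponential decay to the multi-echo signal at each pixel. A minimal log-linear version using the echo times quoted above; the synthetic voxel and noise level are assumptions:

```python
import numpy as np

# Per-pixel mono-exponential fit: S(TE) = S0 * exp(-TE/T2), so
# ln S = ln S0 - TE/T2 is linear in the echo time TE.
te = np.linspace(22.6, 361.6, 16)             # the 16 echo times (ms)

def fit_t2(signal):
    slope, intercept = np.polyfit(te, np.log(signal), 1)
    return -1.0 / slope, np.exp(intercept)    # T2 (ms), S0

# Synthetic voxel from a layer with T2 = 70 ms plus a little noise.
rng = np.random.default_rng(5)
s = 1000 * np.exp(-te / 70.0) + rng.normal(0, 2, te.size)
t2, s0 = fit_t2(s)
print(f"fitted T2 = {t2:.1f} ms, S0 = {s0:.0f}")
```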

  7. Quantitative analysis of multiple high-resolution mass spectrometry images using chemometric methods: quantitation of chlordecone in mouse liver.

    Science.gov (United States)

    Mohammadi, Saeedeh; Parastar, Hadi

    2018-05-15

In this work, a chemometrics-based strategy is developed for quantitative mass spectrometry imaging (MSI). In this regard, quantification of chlordecone, a carcinogenic organochlorinated pesticide (C10Cl10O), in mouse liver using the matrix-assisted laser desorption ionization MSI (MALDI-MSI) method is used as a case study. The MSI datasets corresponded to 1, 5 and 10 days of mouse exposure to standard chlordecone in the range of 0 to 450 μg g^-1. A binning approach in the m/z direction is used to group high-resolution m/z values and to reduce the large data size. To assess the effect of bin size on the quality of the results, three different bin sizes of 0.25, 0.5 and 1.0 were chosen. Afterwards, three-way MSI data arrays (two spatial dimensions and one m/z dimension) for seven standards and four unknown samples were column-wise augmented, with m/z values as the common mode. These datasets were then analyzed using multivariate curve resolution-alternating least squares (MCR-ALS) with proper constraints. The resolved mass spectra were used for identification of chlordecone in the presence of a complex background and interference. Additionally, the augmented spatial profiles were post-processed and 2D images for each component were obtained in the calibration and unknown samples. The sum of these profiles was used to set the calibration curve and to obtain the analytical figures of merit (AFOMs). Inspection of the results showed that the lower bin size (i.e., 0.25) provides more accurate results. Finally, the results obtained by MCR for the three datasets were compared with those of gas chromatography-mass spectrometry (GC-MS) and MALDI-MSI. The results showed that the MCR-assisted method gives a higher amount of chlordecone than MALDI-MSI and a lower amount than GC-MS. It is concluded that a combination of chemometric methods with MSI can be considered as an alternative way for MSI quantification.
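The core MCR-ALS iteration alternates least-squares updates of the concentration and spectral matrices under non-negativity. The bare-bones sketch below imposes non-negativity by clipping rather than the proper NNLS step, and runs on toy data, so it illustrates the structure rather than a production implementation:

```python
import numpy as np

def mcr_als(D, n_components, n_iter=200, seed=0):
    """Bare-bones MCR-ALS: D (pixels x m/z bins) ~ C @ S.T with C, S >= 0.
    Non-negativity is imposed by clipping; real implementations use NNLS."""
    rng = np.random.default_rng(seed)
    S = np.abs(rng.normal(size=(D.shape[1], n_components)))
    for _ in range(n_iter):
        C = np.clip(D @ S @ np.linalg.pinv(S.T @ S), 0, None)    # concentrations
        S = np.clip(D.T @ C @ np.linalg.pinv(C.T @ C), 0, None)  # spectra
    return C, S

# Toy column-wise augmented MSI matrix: 200 pixels x 50 m/z bins, 2 components.
rng = np.random.default_rng(6)
D = (np.abs(rng.normal(size=(200, 2))) @ np.abs(rng.normal(size=(2, 50)))
     + 0.01 * rng.normal(size=(200, 50)))

C, S = mcr_als(D, 2)
print("relative residual:", np.linalg.norm(D - C @ S.T) / np.linalg.norm(D))
```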

  8. Construction of the quantitative analysis environment using Monte Carlo simulation

    International Nuclear Information System (INIS)

    Shirakawa, Seiji; Ushiroda, Tomoya; Hashimoto, Hiroshi; Tadokoro, Masanori; Uno, Masaki; Tsujimoto, Masakazu; Ishiguro, Masanobu; Toyama, Hiroshi

    2013-01-01

Axial images of a thoracic phantom were acquired to construct the source and density maps for Monte Carlo (MC) simulation. The phantom was a Heart/Liver Type HL (Kyoto Kagaku Co., Ltd.); the single photon emission CT (SPECT)/CT machine was a Symbia T6 (Siemens) with an LMEGP (low-medium energy general purpose) collimator. The maps were constructed from CT images with in-house software written in Visual Studio C# (Microsoft). The SIMIND code (simulation of imaging nuclear detectors) was used for the MC simulation, the Prominence processor (Nihon Medi-Physics) for filter processing and image reconstruction, and a DELL Precision T7400 as the environment for all image processing. For the actual experiment, the myocardial portion of the phantom was filled with 15 MBq of 99mTc, assuming 2% uptake at a dose of 740 MBq, and the SPECT image was acquired and reconstructed with a Butterworth filter and the filtered back-projection method. CT images were similarly obtained in 0.3-mm-thick slices, saved in a single file formatted with Digital Imaging and Communications in Medicine (DICOM), and then processed for application to SIMIND for mapping the source and density. Physical and measurement factors, such as attenuation, scattering, spatial resolution deterioration and statistical fluctuation, were examined against ideal images by sequentially excluding and simulating those factors. The gamma energy spectrum, SPECT projections and reconstructed images given by the simulation were found to agree well with the actual data, and the precision of the MC simulation was confirmed. Physical and measurement factors were found to be evaluable individually, suggesting the usefulness of the simulation for assessing the precision of their correction. (T.T.)

  9. Quantitative morphotectonic analysis of the South-Eastern Carpathians

    Science.gov (United States)

    Ionuţ Cristea, Alexandru

    2015-04-01

The South-Eastern Carpathians (Vrancea region) have received increasing scientific attention during the past years, mostly resulting in a detailed reconstruction of their exhumation history. Moreover, structural and thermochronological data suggest that the frontal part of the SE Carpathians preserves the youngest topography in the Romanian Carpathians, resulting from a deformational process occurring during the late Pliocene - Early Pleistocene. This significant tectonic activity continues to the present time, as confirmed by geodetic measurements and by the frequency of crustal earthquakes. The specific effects of the Quaternary deformations on the regional fluvial system have so far been associated with increased incision and the formation of degradational (strath) terraces, downstream tilting of terraces, the establishment of local drainage divides and young longitudinal river profiles. Our study further investigates the possible influence of the recent tectonic activity on the characteristics of the drainage basins in the area and the distribution of over-steepened stream reaches, using spatial autocorrelation techniques (Getis-Ord Gi* statistics and Anselin's Local Moran's I). For the former, hypsometric integrals (Hi) and the transverse topographic symmetry factor were analyzed; for the latter, we used locally computed normalized channel steepness indices (ksn). Due to the highly variable lithology in the region (typical of flysch areas), additional correlations of the determined values with the geological units and rock types were made in order to assess their effects. The results show that the geographic clustering of the high Hi and ksn values is more significant than the lithological one and, although rock strength has local influences, it is not sufficient to explain the regional distribution of the values, generally between 26.5° and 26.66° E (p
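The hypsometric integral used here is commonly approximated by the elevation-relief ratio. A one-function sketch on a synthetic DEM patch; the interpretation thresholds in the comment are rules of thumb, not from the paper:

```python
import numpy as np

def hypsometric_integral(elev):
    """Elevation-relief ratio approximation: Hi = (mean - min) / (max - min)."""
    e = np.asarray(elev, dtype=float)
    return (e.mean() - e.min()) / (e.max() - e.min())

# Synthetic elevation patch standing in for one drainage basin's DEM window.
rng = np.random.default_rng(7)
dem = 600 + 400 * rng.random((50, 50))
print(f"Hi = {hypsometric_integral(dem):.2f}")
# As a rule of thumb, high Hi flags young, weakly eroded topography and
# low Hi a mature landscape; ~0.5 is expected for this uniform example.
```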

  10. Quantitative Analysis of VIIRS DNB Nightlight Point Source for Light Power Estimation and Stability Monitoring

    Directory of Open Access Journals (Sweden)

    Changyong Cao

    2014-12-01

Full Text Available The high sensitivity and advanced onboard calibration of the Visible Infrared Imaging Radiometer Suite (VIIRS) Day/Night Band (DNB) enable accurate measurements of low-light radiances, which lead to enhanced quantitative applications at night. The finer spatial resolution of the DNB also allows users to examine socio-economic activities at urban scales. Given the growing interest in the use of DNB data, there is a pressing need for a better understanding of the calibration stability and absolute accuracy of the DNB at low radiances. The low-light calibration accuracy was previously estimated at a moderate 15% using extended sources, while the long-term stability has yet to be characterized. There are also several science-related questions to be answered, for example: how the Earth's atmosphere and surface variability contribute to the stability of the DNB-measured radiances; how to separate them from instrument calibration stability; whether or not SI (International System of Units) traceable active light sources can be designed and installed at selected sites to monitor the calibration stability, radiometric and geolocation accuracy, and point spread functions of the DNB; and, furthermore, whether or not such active light sources can be used for detecting environmental changes, such as aerosols. This paper explores the quantitative analysis of nightlight point sources, such as those from fishing vessels, bridges, and cities, using fundamental radiometry and radiative transfer, which would be useful for a number of applications including search and rescue in severe weather events, as well as calibration/validation of the DNB. Time series of the bridge light data are used to assess the stability of the light measurements and the calibration of the VIIRS DNB. It was found that the light radiant power computed from the VIIRS DNB data matched relatively well with independent assessments based on the in situ light installations, although estimates have to be

  11. ImatraNMR: Novel software for batch integration and analysis of quantitative NMR spectra

    Science.gov (United States)

    Mäkelä, A. V.; Heikkilä, O.; Kilpeläinen, I.; Heikkinen, S.

    2011-08-01

Quantitative NMR spectroscopy is a useful and important tool for the analysis of various mixtures. Recently, in addition to the traditional quantitative 1D 1H and 13C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, there are many processing packages available from spectrometer manufacturers and third-party developers, and most of them are capable of analyzing and integrating quantitative spectra. However, they are mainly aimed at processing single or few spectra, and are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present a novel software package, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to the capability of analyzing large numbers of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs such as Matlab. The software is written in Java, and thus it should run on any platform capable of providing Java Runtime Environment version 1.6 or newer; however, it has currently only been tested with Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use, and is provided with source code upon request.
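The batch mode described amounts to integrating fixed chemical-shift regions across many spectra and tabulating the areas. A small Python sketch of that workflow; the ppm regions, the number of spectra, and the synthetic data are assumptions, and ImatraNMR itself is a Java program:

```python
import numpy as np

# Fixed integration regions in ppm (names and limits are invented).
regions = {"anomeric": (5.4, 5.1), "methyl": (1.3, 0.8)}

def integrate(ppm, intensity, region):
    lo, hi = sorted(region)
    mask = (ppm >= lo) & (ppm <= hi)
    order = np.argsort(ppm[mask])              # trapz needs ascending x
    return np.trapz(intensity[mask][order], ppm[mask][order])

rng = np.random.default_rng(8)
ppm = np.linspace(10, 0, 2**14)                # 10-to-0 ppm axis
for i in range(3):                             # stands in for hundreds of spectra
    spec = np.abs(rng.normal(0, 1e-3, ppm.size))
    spec += 5 * np.exp(-0.5 * ((ppm - 5.25) / 0.02)**2)   # fake anomeric peak
    row = {name: round(integrate(ppm, spec, r), 3) for name, r in regions.items()}
    print(f"spectrum_{i}", row)                # tabulated, CSV-ready output
```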

  12. Quantitative spectroscopy for the analysis of GOME data

    Science.gov (United States)

    Chance, K.

    1997-01-01

Accurate analysis of Global Ozone Monitoring Experiment (GOME) data to obtain atmospheric constituents requires reliable, traceable spectroscopic parameters for atmospheric absorption and scattering. Results are summarized for research that includes: the re-determination of Rayleigh scattering cross sections and phase functions for the 200 nm to 1000 nm range; the analysis of solar spectra to obtain a high-resolution reference spectrum with excellent absolute vacuum wavelength calibration; Ring effect cross sections and phase functions determined directly from accurate molecular parameters of N2 and O2; O2 A-band line intensities and pressure broadening coefficients; and the analysis of absolute accuracies for ultraviolet and visible absorption cross sections of O3 and other trace species measurable by GOME.

  13. Quantitative risk assessment using the capacity-demand analysis

    International Nuclear Information System (INIS)

    Morgenroth, M.; Donnelly, C.R.; Westermann, G.D.; Huang, J.H.S.; Lam, T.M.

    1999-01-01

The hydroelectric industry's recognition of the importance of avoiding unexpected failures, or forced outages, led to the development of probabilistic, or risk-based, methods in order to attempt to quantify exposures. Traditionally, such analysis has been carried out by qualitative assessment, relying on experience and sound engineering judgment to determine the optimum time to maintain, repair or replace a part or system. Depending on the nature of the problem, however, and the level of experience of those involved in the decision-making process, it is difficult to find a balance between acting proactively and accepting some amount of risk. The development of a practical means of establishing the probability of failure of any part or system, based on the determination of the statistical distributions of engineering properties such as acting stresses, is discussed. The capacity-demand analysis methodology, coupled with probabilistic, risk-based analysis, permits all the factors associated with a decision to rehabilitate or replace a part, including the risks associated with the timing of the decision, to be assessed in a transparent and defendable manner. The methodology does not eliminate judgment altogether, but moves it from the level of estimating the risk of failure to the lower level of estimating variability in material properties, uncertainty in loading, and the uncertainties inherent in any engineering analysis. The method was successfully used in 1998 to carry out a comprehensive economic risk analysis for the entire water conveyance system of a 90-year-old hydropower station. The analysis included a number of diverse parts, ranging from rock slopes to aging steel and concrete conduits, and the method allowed a rational assessment of the risks associated with each of these varied parts, permitting the essential remedial works to be prioritized. 14 refs., 4 figs
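The capacity-demand idea reduces to estimating Pf = P(capacity < demand) from the statistical distributions of resistance and load. A Monte Carlo sketch with invented lognormal/normal distributions, not the study's data:

```python
import numpy as np

# Monte Carlo estimate of Pf = P(capacity < demand). The lognormal capacity
# and normal demand below are invented for illustration, not from the study.
rng = np.random.default_rng(9)
n = 1_000_000
capacity = rng.lognormal(mean=np.log(300), sigma=0.15, size=n)   # e.g. MPa
demand = rng.normal(loc=180, scale=35, size=n)                   # e.g. MPa

pf = np.mean(capacity < demand)
print(f"estimated probability of failure: {pf:.2e}")
```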

  14. Neighborhood social capital and crime victimization: comparison of spatial regression analysis and hierarchical regression analysis.

    Science.gov (United States)

    Takagi, Daisuke; Ikeda, Ken'ichi; Kawachi, Ichiro

    2012-11-01

Crime is an important determinant of public health outcomes, including quality of life, mental well-being, and health behavior. A body of research has documented the association between community social capital and crime victimization, examined at multiple levels of spatial aggregation ranging from entire countries to states, metropolitan areas, counties, and neighborhoods. In multilevel analysis, the spatial boundaries at level 2 are most often drawn from administrative boundaries (e.g., Census tracts in the U.S.). One problem with adopting administrative definitions of neighborhoods is that it ignores spatial spillover. We conducted a study of social capital and crime victimization in one ward of Tokyo city, using a spatial Durbin model with an inverse-distance weighting matrix that assigned each respondent a unique level of "exposure" to social capital based on all other residents' perceptions. The study is based on a postal questionnaire sent to residents of Arakawa Ward, Tokyo, aged 20-69 years. The response rate was 43.7%. We examined the contextual influence of generalized trust, perceptions of reciprocity, two types of social network variables, and two principal components of social capital (constructed from the above four variables). Our outcome measure was self-reported crime victimization in the last five years. In the spatial Durbin model, we found that neighborhood generalized trust, reciprocity, supportive networks and the two principal components of social capital were each inversely associated with crime victimization. By contrast, a multilevel regression performed with the same data (using administrative neighborhood boundaries) found generally null associations between neighborhood social capital and crime. Spatial regression methods may be more appropriate for investigating the contextual influence of social capital in homogeneous cultural settings such as Japan. Copyright
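The "exposure" construction described above can be sketched as a row-standardized inverse-distance weights matrix applied to each covariate. The snippet below fits only the simpler SLX (spatially lagged X) form by ordinary least squares on synthetic data; a full spatial Durbin model also lags the outcome and requires ML or GMM estimation:

```python
import numpy as np

rng = np.random.default_rng(10)
coords = rng.uniform(0, 5, (200, 2))       # respondent locations (km, synthetic)
trust = rng.normal(0, 1, 200)              # own generalized-trust score

# Row-standardised inverse-distance weights: each row gives one respondent's
# unique "exposure" weights over all other residents.
d = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
W = np.where(d > 0, 1.0 / d, 0.0)
W /= W.sum(axis=1, keepdims=True)

neighbour_trust = W @ trust                # spatially lagged covariate (W X)
X = np.column_stack([np.ones(200), trust, neighbour_trust])
y = rng.binomial(1, 0.3, 200)              # fake victimization indicator

# SLX linear-probability fit; a full Durbin model adds rho * W y and
# needs maximum-likelihood or GMM estimation instead of plain OLS.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print("coefficients [const, own trust, neighbour trust]:", beta.round(3))
```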

  15. Quantitative Safety and Security Analysis from a Communication Perspective

    DEFF Research Database (Denmark)

    Malinowsky, Boris; Schwefel, Hans-Peter; Jung, Oliver

    2014-01-01

This paper introduces and exemplifies a trade-off analysis of safety and security properties in distributed systems. The aim is to support analysis for real-time communication and authentication building blocks in a wireless communication scenario. By embedding an authentication scheme into a real...... at handover situations in an IEEE 802.11 wireless setup between mobile nodes and access points. The trade-offs involve application-layer data goodput, the probability of completed handovers, and the effect on usable protocol slots, to quantify the impact of security from a lower-layer communication perspective...

  16. Quantitative Immunofluorescence Analysis of Nucleolus-Associated Chromatin.

    Science.gov (United States)

    Dillinger, Stefan; Németh, Attila

    2016-01-01

    The nuclear distribution of eu- and heterochromatin is nonrandom, heterogeneous, and dynamic, which is mirrored by specific spatiotemporal arrangements of histone posttranslational modifications (PTMs). Here we describe a semiautomated method for the analysis of histone PTM localization patterns within the mammalian nucleus, using confocal laser scanning microscope images of fixed, immunofluorescence-stained cells as the data source. The ImageJ-based process includes segmentation of the nucleus, followed by measurements of total fluorescence intensity, the heterogeneity of the staining, and the frequency of the brightest pixels in the region of interest (ROI). In the presented image analysis pipeline, the perinucleolar chromatin is selected as the primary ROI, and the nuclear periphery as the secondary ROI.
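
    As a rough illustration of the kind of measurements described, the following Python sketch reproduces the pipeline's three summary statistics with scikit-image rather than ImageJ: Otsu thresholding stands in for the nucleus segmentation step, and the object-size and brightest-pixel cutoffs are placeholder assumptions.

```python
import numpy as np
from skimage import filters, morphology

def nuclear_ptm_stats(dapi, ptm, brightest_frac=0.05):
    """Segment the nucleus from a DNA-counterstain channel and summarize a
    histone-PTM channel inside it: total intensity, staining heterogeneity
    (coefficient of variation), and the fraction of nuclear pixels that
    rank among the image's brightest."""
    mask = dapi > filters.threshold_otsu(dapi)                  # segmentation
    mask = morphology.remove_small_objects(mask, min_size=500)  # placeholder size

    roi = ptm[mask]
    total_intensity = float(roi.sum())
    heterogeneity = float(roi.std() / roi.mean())

    cutoff = np.quantile(ptm, 1.0 - brightest_frac)  # brightest 5% by default
    brightest_in_roi = float(np.mean(roi >= cutoff))
    return total_intensity, heterogeneity, brightest_in_roi
```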

  17. FROM UNEMPLOYMENT TO WORK: AN ECONOMETRIC ANALYSIS WITH SPATIAL CONSTRAINTS

    Directory of Open Access Journals (Sweden)

    Oana Calavrezo

    2009-03-01

    The aim of our research is to analyze how urban organization affects unemployment-to-work transitions by considering several spatial indicators. This allows us to capture two separate effects: "spatial mismatch" and "neighbourhood effects". To study the unemployment-to-work transitions, we implement survival models. They are applied to a sample obtained by merging three French databases: the "Trajectoires des demandeurs d'emplois" survey, the 1999 French census and, finally, a database containing town inventory information. More precisely, in this paper we analyze the duration of the first observed employment episode by using spatial indicators and by controlling for three potential biases (endogeneity bias, selection bias and attrition bias).
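
    Duration analyses of this kind are commonly estimated with a Cox proportional-hazards model. The sketch below shows such a fit with the Python lifelines package on a toy dataset; the spatial covariates are hypothetical stand-ins for the indicators described, and the paper's bias corrections are not reproduced.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Toy unemployment spells: duration in months, exit flag (1 = found a job,
# 0 = censored), and two hypothetical spatial indicators.
df = pd.DataFrame({
    "duration_months":  [3, 14, 7, 24, 5, 11, 9, 18],
    "found_job":        [1, 0, 1, 0, 1, 1, 1, 0],
    "job_access_index": [0.8, 0.2, 0.6, 0.1, 0.9, 0.5, 0.7, 0.3],   # "spatial mismatch"
    "local_unemp_rate": [0.07, 0.15, 0.12, 0.18, 0.06, 0.08, 0.11, 0.14],  # "neighbourhood"
})

cph = CoxPHFitter()
cph.fit(df, duration_col="duration_months", event_col="found_job")
cph.print_summary()  # hazard ratios for the spatial indicators
```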

  18. A digital elevation analysis: Spatially distributed flow apportioning algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sang-Hyun; Kim, Kyung-Hyun [Pusan National University, Pusan(Korea); Jung, Sun-Hee [Korea Environment Institute, (Korea)

    2001-06-30

    A flow determination algorithm is proposed for the distributed hydrologic model. The advantages of single and multiple flow direction schemes are selectively combined to address the drawbacks of existing algorithms. A spatially varied flow apportioning factor is introduced in order to accommodate the accumulated area from upslope cells. The channel initiation threshold area (CIT) concept is expanded and integrated into the spatially distributed flow apportioning algorithm in order to delineate a realistic channel network. Application to a field example suggests that the linearly distributed flow apportioning scheme provides several advantages over existing approaches, such as relaxation of over-dissipation problems near channel cells, connectivity of river cells, continuity of saturated areas, and freedom from the parameter optimization required by existing algorithms. The effects of grid size are explored spatially as well as statistically. (author). 28 refs., 7 figs.
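
    The core of any multiple-flow-direction scheme is the apportioning step: a cell's accumulated upslope area is split among its strictly lower neighbours in proportion to slope. The Python sketch below implements only that generic baseline step; it omits the paper's spatially varied apportioning factor and the CIT-based channel delineation.

```python
import numpy as np

def mfd_flow_accumulation(dem, cellsize=1.0):
    """Accumulate upslope area with a generic multiple-flow-direction rule:
    each cell, visited from highest to lowest, passes its accumulated area
    to all strictly lower neighbours in proportion to slope."""
    rows, cols = dem.shape
    acc = np.ones(dem.shape, dtype=float)  # each cell contributes one cell area
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
               (0, 1), (1, -1), (1, 0), (1, 1)]
    for idx in np.argsort(dem, axis=None)[::-1]:  # highest elevation first
        r, c = divmod(int(idx), cols)
        slopes, targets = [], []
        for dr, dc in offsets:
            rr, cc = r + dr, c + dc
            if 0 <= rr < rows and 0 <= cc < cols and dem[rr, cc] < dem[r, c]:
                slopes.append((dem[r, c] - dem[rr, cc]) / (cellsize * np.hypot(dr, dc)))
                targets.append((rr, cc))
        if not slopes:              # pit or flat cell: nothing drains out
            continue
        total = sum(slopes)
        for s, (rr, cc) in zip(slopes, targets):
            acc[rr, cc] += acc[r, c] * s / total
    return acc
```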

  19. Quantitative analysis of natural resource management options at different scales

    NARCIS (Netherlands)

    Keulen, van H.

    2007-01-01

    Natural capital (land, water, air) consists of many resources, each with its own quality, dynamics and renewability, but with strong interactions. The increasing competition for natural resources, especially land and water, calls for a basic redirection in the analysis of land use. In this

  20. A quantitative method for Failure Mode and Effects Analysis

    NARCIS (Netherlands)

    Braaksma, Anne Johannes Jan; Meesters, A.J.; Klingenberg, W.; Hicks, C.

    2012-01-01

    Failure Mode and Effects Analysis (FMEA) is commonly used for designing maintenance routines by analysing potential failures, predicting their effect and facilitating preventive action. It is used to make decisions on operational and capital expenditure. The literature has reported that despite its
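
    The abstract is truncated here, so the authors' quantitative method cannot be reproduced; as context, the sketch below shows only the conventional risk priority number (RPN) scoring that quantitative FMEA methods set out to improve upon: severity × occurrence × detection on 1-10 ordinal scales.

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    name: str
    severity: int    # 1-10: impact of the failure effect
    occurrence: int  # 1-10: likelihood of the failure cause
    detection: int   # 1-10: 10 = hardest to detect before failure

    @property
    def rpn(self) -> int:
        """Classic risk priority number used to rank preventive actions."""
        return self.severity * self.occurrence * self.detection

# Hypothetical failure modes, ranked for maintenance planning
modes = [
    FailureMode("bearing wear", severity=7, occurrence=5, detection=4),
    FailureMode("seal leakage", severity=4, occurrence=6, detection=3),
]
for mode in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"{mode.name}: RPN = {mode.rpn}")
```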