WorldWideScience

Sample records for regional quantitative analysis

  1. Quantitative analysis of regional myocardial performance in coronary artery disease

    Science.gov (United States)

    Stewart, D. K.; Dodge, H. T.; Frimer, M.

    1975-01-01

    Findings are presented from a group of subjects with significant coronary artery stenosis and from a group of controls, obtained with a quantitative method for the study of regional myocardial performance based on frame-by-frame analysis of biplane left ventricular angiograms. Particular emphasis was placed on the analysis of wall motion in terms of normalized segment dimensions and the timing and velocity of contraction. The results were compared with the method of subjective assessment used clinically.

  2. Quantitative evaluation of regional blood flow in pulmonary sarcoidosis with Bull's eye analysis

    International Nuclear Information System (INIS)

    Akaki, Shiro

    1991-01-01

    Lung perfusion scintigraphy was performed in 23 patients with pulmonary sarcoidosis and in 11 normal volunteers. Bull's eye analysis was used to analyze regional pulmonary blood flow quantitatively. First, whole-lung perfusion images were divided into three regions by three concentric circles. Then radial axes were projected from the center to define 36 sectors of 10 deg each. The counts for each sector were calculated and a Bull's eye image was displayed. The counts were compared with the lower limit of normal (mean - 2SD), and as indices of reduction in perfusion, an extent score (ES) and a severity score (SS) were calculated. ES and SS showed a significant reduction in perfusion in 16 patients (70%) with sarcoidosis. In stage II sarcoidosis, both ES and SS were significantly higher than in stage I sarcoidosis and were related to 67Ga scintigraphy findings. In comparison with clinical data, ES had a positive correlation with serum angiotensin-converting enzyme activity and with the CD4+/CD8+ ratio (p<0.05). Bull's eye analysis was considered useful for the quantitative evaluation of regional pulmonary blood flow in pulmonary sarcoidosis, and it was suggested that the reduction in perfusion might result mainly from alveolitis and angiitis. Ventilation abnormality, which may precede the reduction in perfusion, may also be an important contributing factor. (author)
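
    The extent score (ES) and severity score (SS) described above are computed from the sector counts of the Bull's eye map against the normal lower limit (mean - 2SD). A minimal sketch follows; the sector layout matches the abstract, but the exact normalisation behind ES and SS is not stated there, so the formulas used here are illustrative assumptions.

```python
import numpy as np

def bulls_eye_scores(sector_counts, normal_mean, normal_sd):
    """Extent score (ES) and severity score (SS) for a Bull's eye map.

    sector_counts          : (n_rings, n_sectors) counts for one patient
    normal_mean, normal_sd : same-shaped arrays from the normal volunteers

    A sector is called hypoperfused when its count falls below the lower
    limit of normal (mean - 2*SD), as in the abstract.  The score
    definitions themselves are assumptions made for illustration:
      ES = fraction of sectors below the lower limit of normal
      SS = summed deficit (in SD units) over those sectors
    """
    lower_limit = normal_mean - 2.0 * normal_sd
    abnormal = sector_counts < lower_limit
    extent_score = abnormal.mean()                   # fraction of the map
    deficit_sd = (lower_limit - sector_counts) / normal_sd
    severity_score = deficit_sd[abnormal].sum()      # cumulative severity
    return extent_score, severity_score

# Toy example: 3 concentric rings x 36 sectors of 10 degrees each
rng = np.random.default_rng(0)
normal_mean = rng.uniform(80, 120, size=(3, 36))
normal_sd = np.full((3, 36), 8.0)
patient = normal_mean - rng.uniform(0, 30, size=(3, 36))
print(bulls_eye_scores(patient, normal_mean, normal_sd))
```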

  3. Anthropogenic activities and coastal environmental quality: a regional quantitative analysis in southeast China with management implications.

    Science.gov (United States)

    Chen, Kai; Liu, Yan; Huang, Dongren; Ke, Hongwei; Chen, Huorong; Zhang, Songbin; Yang, Shengyun; Cai, Minggang

    2018-02-01

    Regional analysis of environmental issues has always been a hot topic in the field of sustainable development. Because different levels of economic growth, urbanization, resource endowments, etc. in different regions generate apparently different ecological responses, a better description and comparison across regions provides more valuable implications for ecological improvement and policymaking. In this study, seven typical bays in southeast China, a rapidly developing area, were selected to quantitatively analyze the relationship between socioeconomic development and coastal environmental quality. Based on water quality data from 2007 to 2015, multivariate statistical methods were applied to analyze potential environmental risks and to classify the seven bays according to their environmental quality status. The possible variation trends of the environmental indices were predicted from the cross-regional panel data using the Environmental Kuznets Curve. The results showed significant regional differences among the seven bays; Quanzhou, Xiamen, and Luoyuan Bays in particular suffered more severe anthropogenic disturbance than the other bays, despite their different development patterns. Socioeconomic development level was significantly associated with several water quality indices (pH, DIN, PO4-P); the association was roughly positive: areas with higher GDP per capita had worse values for some water quality indices. In addition, the decreasing trend of pH values and the increasing trend of nutrient concentrations in the seven bays will continue in the foreseeable future. In view of these trends, a limiting-nutrient strategy should be implemented to mitigate the deterioration of the coastal environment.

  4. Imputation-based analysis of association studies: candidate regions and quantitative traits.

    Directory of Open Access Journals (Sweden)

    Bertrand Servin

    2007-07-01

    Full Text Available We introduce a new framework for the analysis of association studies, designed to allow untyped variants to be more effectively and directly tested for association with a phenotype. The idea is to combine knowledge of patterns of correlation among SNPs (e.g., from the International HapMap project, or from resequencing data in a candidate region of interest) with genotype data at tag SNPs collected on a phenotyped study sample, to estimate ("impute") unmeasured genotypes, and then assess association between the phenotype and these estimated genotypes. Compared with standard single-SNP tests, this approach results in increased power to detect association, even in cases in which the causal variant is typed, with the greatest gain occurring when multiple causal variants are present. It also provides more interpretable explanations for observed associations, including assessing, for each SNP, the strength of the evidence that it (rather than another correlated SNP) is causal. Although we focus on association studies with a quantitative phenotype and a relatively restricted region (e.g., a candidate gene), the framework is applicable and computationally practical for whole-genome association studies. The methods described here are implemented in a software package, Bim-Bam, available from the Stephens Lab website http://stephenslab.uchicago.edu/software.html.
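
    A toy sketch of the general idea behind imputation-based testing is shown below: predict the untyped SNP from tag SNPs in a reference panel, then regress the phenotype on the imputed dosage. This is an ordinary least-squares illustration only, not the Bayesian regression implemented in Bim-Bam, and all data and variable names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical reference panel (HapMap-like): tag SNPs plus the untyped SNP
n_ref, n_tags = 120, 5
ref_tags = rng.integers(0, 3, size=(n_ref, n_tags)).astype(float)  # genotypes 0/1/2
ref_untyped = (0.8 * ref_tags[:, 0] + rng.normal(0, 0.4, n_ref)).clip(0, 2)

# Learn a linear predictor for the untyped SNP from the tag SNPs
X_ref = np.column_stack([np.ones(n_ref), ref_tags])
beta_imp, *_ = np.linalg.lstsq(X_ref, ref_untyped, rcond=None)

# Study sample: tag genotypes and phenotype measured, untyped SNP imputed
n_study = 300
study_tags = rng.integers(0, 3, size=(n_study, n_tags)).astype(float)
dosage = np.column_stack([np.ones(n_study), study_tags]) @ beta_imp  # imputed dosage
phenotype = 0.5 * dosage + rng.normal(0, 1.0, n_study)

# Single-SNP association test: regress the phenotype on the imputed dosage
X = np.column_stack([np.ones(n_study), dosage])
beta, *_ = np.linalg.lstsq(X, phenotype, rcond=None)
resid = phenotype - X @ beta
se = np.sqrt(np.sum(resid**2) / (n_study - 2) * np.linalg.inv(X.T @ X)[1, 1])
print(f"effect = {beta[1]:.3f}, t = {beta[1] / se:.2f}")
```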

  5. Regional quantitative analysis of cortical surface maps of FDG PET images

    CERN Document Server

    Protas, H D; Hayashi, K M; Chin Lung, Yu; Bergsneider, M; Sung Cheng, Huang

    2006-01-01

    Cortical surface maps are advantageous for visualizing the 3D profile of cortical gray matter development and atrophy, and for integrating structural and functional images. In addition, cortical surface maps for PET data, when analyzed in conjunction with structural MRI data allow us to investigate, and correct for, partial volume effects. Here we compared quantitative regional PET values based on a 3D cortical surface modeling approach with values obtained directly from the 3D FDG PET images in various atlas-defined regions of interest (ROIs; temporal, parietal, frontal, and occipital lobes). FDG PET and 3D MR (SPGR) images were obtained and aligned to ICBM space for 15 normal subjects. Each image was further elastically warped in 2D parameter space of the cortical surface, to align major cortical sulci. For each point within a 15 mm distance of the cortex, the value of the PET intensity was averaged to give a cortical surface map of FDG uptake. The average PET values on the cortical surface map were calcula...

  6. Quantitative analysis of raw materials mining of Sverdlovsk region in Russia

    Science.gov (United States)

    Tarasyev, Alexander M.; Vasilev, Julian; Turygina, Victoria F.

    2016-06-01

    The purpose of this article is to show the application of some quantitative methods to the analysis of a dataset on raw materials. The main approaches used are correlation analysis and forecasting with trend lines. It is shown that the future mining of particular ores can be predicted on the basis of mathematical modeling, and that there is a strong correlation between the mining of some specific raw materials. Some of the revealed correlations have meaningful explanations, while for others one should look for more sophisticated interpretations. The applied approach can be used for forecasting raw materials exploitation in various regions of Russia and in other countries.
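
    The two techniques named in the abstract, pairwise correlation between raw-material series and trend-line extrapolation, can be sketched in a few lines. The yearly figures below are invented placeholders, not Sverdlovsk data.

```python
import numpy as np

# Hypothetical annual extraction volumes (thousand tonnes) for two ores
years = np.arange(2000, 2016)
ore_a = 50 + 2.1 * (years - 2000) + np.random.default_rng(2).normal(0, 3, years.size)
ore_b = 30 + 1.4 * (years - 2000) + np.random.default_rng(3).normal(0, 2, years.size)

# Correlation between the mining of the two raw materials
r = np.corrcoef(ore_a, ore_b)[0, 1]

# Linear trend line fitted to ore_a and used to forecast five years ahead
slope, intercept = np.polyfit(years, ore_a, deg=1)
forecast_years = np.arange(2016, 2021)
forecast = slope * forecast_years + intercept

print(f"correlation r = {r:.2f}")
print(dict(zip(forecast_years.tolist(), np.round(forecast, 1).tolist())))
```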

  7. Quantitative Analysis of Regional Cerebral Blood Flow using 99mTc-HMPAO SPECT in Parkinson's Disease

    International Nuclear Information System (INIS)

    Lee, Myung Chul; Bae, Sang Kyun; Chung, June Key; Koh, Chang Soon; Roh, Jae Kyu; Myung, Ho Jin; Lee, Myung Hae

    1992-01-01

    Regional cerebral blood flow was measured in 10 patients with Parkinson's disease and 12 normal persons using 99mTc-HMPAO SPECT. Reconstructed images were interpreted qualitatively and were compared with the findings of CT. For the quantitative analysis, six pairs of regions of interest matched with the perfusion territories of the large cerebral arteries and the cerebellar hemispheres were determined. From the count values, indices showing the degree of asymmetry between the right and left cerebral or cerebellar hemispheres, the cerebral asymmetry index (ASI) and the percent index of cerebellar asymmetry (PIA), and an index showing the change in each region, the region-to-cerebellum ratio (RCR), were obtained. ASI was 0.082 ± 0.033 in normal persons and 0.108 ± 0.062 in patients, and PIA was -0.4 ± 0.7% and -0.7 ± 1.0%, respectively, showing no statistically significant difference between normal persons and patients. Among the 10 RCRs, those of both basal ganglia regions and both anterior cerebral artery regions were significantly reduced. We conclude that the most significant reduction of regional cerebral blood flow in patients with Parkinson's disease was observed in the regions of the basal ganglia and the anterior cerebral artery, and that the degree of change in hemispheric blood flow was similar in both hemispheres.

  8. Quantitative Analysis of Regional Cerebral Blood Flow using 99mTc-HMPAO SPECT in Parkinson's Disease

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Myung Chul; Bae, Sang Kyun; Chung, June Key; Koh, Chang Soon; Roh, Jae Kyu; Myung, Ho Jin [Seoul National University College of Medicine, Seoul (Korea, Republic of); Lee, Myung Hae [Asan Medical Center, Seoul (Korea, Republic of)

    1992-07-15

    Regional cerebral blood flow was measured in 10 patients with Parkinson's disease and 12 normal persons using 99mTc-HMPAO SPECT. Reconstructed images were interpreted qualitatively and were compared with the findings of CT. For the quantitative analysis, six pairs of regions of interest matched with the perfusion territories of the large cerebral arteries and the cerebellar hemispheres were determined. From the count values, indices showing the degree of asymmetry between the right and left cerebral or cerebellar hemispheres, the cerebral asymmetry index (ASI) and the percent index of cerebellar asymmetry (PIA), and an index showing the change in each region, the region-to-cerebellum ratio (RCR), were obtained. ASI was 0.082 ± 0.033 in normal persons and 0.108 ± 0.062 in patients, and PIA was -0.4 ± 0.7% and -0.7 ± 1.0%, respectively, showing no statistically significant difference between normal persons and patients. Among the 10 RCRs, those of both basal ganglia regions and both anterior cerebral artery regions were significantly reduced. We conclude that the most significant reduction of regional cerebral blood flow in patients with Parkinson's disease was observed in the regions of the basal ganglia and the anterior cerebral artery, and that the degree of change in hemispheric blood flow was similar in both hemispheres.
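
    The indices used in records 7 and 8 are simple ratios of ROI counts. The abstracts name them without giving formulas, so the definitions below (a standard left-right asymmetry index and a region-to-cerebellum count ratio) are assumptions for illustration only.

```python
def asymmetry_index(left_counts, right_counts):
    """One common left-right asymmetry definition: |L - R| / mean(L, R).
    The exact formula used in the study is not stated in the abstract."""
    return abs(left_counts - right_counts) / ((left_counts + right_counts) / 2.0)

def region_to_cerebellum_ratio(region_counts, cerebellum_counts):
    """RCR as described: counts in a cortical or basal-ganglia ROI divided by
    the cerebellar counts."""
    return region_counts / cerebellum_counts

# Hypothetical ROI counts (arbitrary units)
left_hemisphere, right_hemisphere = 1520.0, 1410.0
basal_ganglia, cerebellum = 980.0, 1300.0

print(f"ASI = {asymmetry_index(left_hemisphere, right_hemisphere):.3f}")
print(f"RCR = {region_to_cerebellum_ratio(basal_ganglia, cerebellum):.3f}")
```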

  9. Quantitative analysis chemistry

    International Nuclear Information System (INIS)

    Ko, Wansuk; Lee, Choongyoung; Jun, Kwangsik; Hwang, Taeksung

    1995-02-01

    This book is about quantitative analytical chemistry. It is divided into ten chapters, which cover the basic concepts of the field together with the meaning of analytical chemistry and SI units, chemical equilibrium, basic preparation for quantitative analysis, an introduction to volumetric analysis, acid-base titration with an outline and example experiments, chelate titration, oxidation-reduction titration with an introduction, titration curves and diazotization titration, precipitation titration, electrometric titration, and quantitative analysis.

  10. Parameters of Regional Cooperative Behavior in the German Biotech Industry – A Quantitative Social Network Analysis

    DEFF Research Database (Denmark)

    Mitze, Timo; Strotebeck, Falk

    We analyse the determinants of network formation in Germany’s biotechnology industry using social network analysis combined with a regression approach for count data. The outcome variable of interest is the degree centrality of German regions, which is specified as a function of the region’s innovative and economic performance as well as biotech-related policy variables. The inclusion of the latter allows us to shed new light on the question of to what extent R&D-based cluster policies are able to influence the formation of the German biotech network. Our results show that policy indicators such as the volume of public funding for collaborative R&D activity are positively correlated with a region’s overall and interregional degree centrality. However, besides this direct funding effect, we do not observe any further (non-pecuniary) advantages such as prestige or image effects. Regarding the role played...

  11. Quantitative investment analysis

    CERN Document Server

    DeFusco, Richard

    2007-01-01

    In the "Second Edition" of "Quantitative Investment Analysis," financial experts Richard DeFusco, Dennis McLeavey, Jerald Pinto, and David Runkle outline the tools and techniques needed to understand and apply quantitative methods to today's investment process.

  12. Quantitative analysis and prediction of regional lymph node status in rectal cancer based on computed tomography imaging

    Energy Technology Data Exchange (ETDEWEB)

    Cui, Chunyan; Liu, Lizhi; Li, Li [Sun Yat-sen University, State Key Laboratory of Oncology in Southern China, Imaging Diagnosis and Interventional Center, Cancer Center, Guangzhou, Guangdong (China); Cai, Hongmin; Tian, Haiying [Sun Yat-Sen University, Department of Automation, School of Science Information and Technology, Guangzhou (China); Li, Liren [Sun Yat-sen University, State Key Laboratory of Oncology in Southern China, Department of Abdominal (colon and rectal) Surgery, Cancer Center, Guangzhou (China)

    2011-11-15

    To quantitatively evaluate regional lymph nodes in rectal cancer patients by using an automated, computer-aided approach, and to assess the accuracy of this approach in differentiating benign and malignant lymph nodes. A total of 228 patients with newly diagnosed, biopsy-confirmed rectal cancer underwent enhanced computed tomography (CT). Patients were assigned to the benign-node or malignant-node group according to histopathological analysis of node samples. All CT-detected lymph nodes were segmented using an edge detection method, and seven quantitative parameters of each node were measured. To increase prediction accuracy, a hierarchical model combining the merits of the support and relevance vector machines was proposed to achieve higher performance. Of the 220 lymph nodes evaluated, 125 were positive and 95 were negative for metastases. The fractal dimension obtained by the Minkowski box-counting approach was higher in malignant nodes than in benign nodes, and there was a significant difference in heterogeneity between metastatic and non-metastatic lymph nodes. The overall accuracy of the proposed model was as high as 88% using morphological characterisation of the lymph nodes. Computer-aided quantitative analysis can improve the prediction of node status in rectal cancer. (orig.)
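
    The fractal dimension by Minkowski box counting mentioned in this record can be estimated for a segmented (binary) lymph-node mask as sketched below; this is a generic box-counting routine, not the authors' implementation.

```python
import numpy as np

def box_counting_dimension(mask, sizes=(2, 4, 8, 16, 32)):
    """Estimate the Minkowski (box-counting) dimension of a binary 2-D mask.

    For each box size s, the image is tiled with s x s boxes and the number
    of boxes containing at least one foreground pixel is counted; the slope
    of log N(s) versus log(1/s) estimates the fractal dimension.
    """
    mask = np.asarray(mask, dtype=bool)
    counts = []
    for s in sizes:
        # pad so that the image splits evenly into s x s boxes
        pad_r = (-mask.shape[0]) % s
        pad_c = (-mask.shape[1]) % s
        m = np.pad(mask, ((0, pad_r), (0, pad_c)))
        boxes = m.reshape(m.shape[0] // s, s, m.shape[1] // s, s)
        counts.append(boxes.any(axis=(1, 3)).sum())
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

# Toy example: a filled circle (a smooth region should come out close to 2)
yy, xx = np.mgrid[0:128, 0:128]
circle = (xx - 64) ** 2 + (yy - 64) ** 2 < 40 ** 2
print(f"estimated dimension = {box_counting_dimension(circle):.2f}")
```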

  13. Borneo : a quantitative analysis of botanical richness, endemicity and floristic regions based on herbarium records

    NARCIS (Netherlands)

    Raes, Niels

    2009-01-01

    Based on the digitized herbarium records housed at the National Herbarium of the Netherlands I developed high spatial resolution patterns of Borneo's botanical richness, endemicity, and the floristic regions. The patterns are derived from species distribution models which predict a species

  14. Borneo: a quantitative analysis of botanical richness, endemicity and floristic regions based on herbarium records

    OpenAIRE

    Raes, Niels

    2009-01-01

    Based on the digitized herbarium records housed at the National Herbarium of the Netherlands I developed high spatial resolution patterns of Borneo's botanical richness, endemicity, and the floristic regions. The patterns are derived from species distribution models which predict a species occurrence based on the identified relationships between species recorded presences and the ecological circumstances at those localities. A new statistical method was developed to test the species distribut...

  15. Quantitative risk analysis using vulnerability indicators to assess food insecurity in the Niayes agricultural region of West Senegal

    Directory of Open Access Journals (Sweden)

    Mateugue Diack

    2017-11-01

    Full Text Available There is an increasing need to develop indicators of vulnerability and adaptive capacity to determine the robustness of response strategies over time and to better understand the underlying processes. This study aimed to determine levels of risk of food insecurity using defined vulnerability indicators. For the purpose of this study, factors influencing food insecurity and different vulnerability indicators were examined using quantitative and qualitative research methods. Observations made on the physical environment (using tools for spatial analysis) and socio-economic surveys conducted with local populations made it possible to quantify vulnerability indicators in the Niayes agricultural region. Application of the Classification and Regression Tree (CART) model enabled us to quantify the level of vulnerability of the zone. The results show that the decrease in agricultural surface area is the most discriminant indicator in this study. The rate of reduction of agricultural areas increased especially between 2009 and 2014, with a loss of 65% of these areas. Therefore, a decision-making system centred on the need to reinforce the resilience of local populations, by preserving the agricultural vocation of the Niayes region and even of the Sahelian regions, requires support and extension services for farmers in order to promote sustainable agricultural practices.
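
    The Classification and Regression Tree (CART) step can be illustrated with scikit-learn. The indicator names and the data below are hypothetical stand-ins for the vulnerability indicators collected in the Niayes surveys.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(4)

# Hypothetical household-level vulnerability indicators
n = 200
X = np.column_stack([
    rng.uniform(0, 100, n),   # % loss of agricultural surface area
    rng.uniform(0, 12, n),    # months of food self-sufficiency
    rng.uniform(0, 1, n),     # access-to-irrigation index
])
# Hypothetical label: food-insecure if land loss is high and self-sufficiency is short
y = ((X[:, 0] > 60) & (X[:, 1] < 5)).astype(int)

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["land_loss_pct",
                                       "self_sufficiency_months",
                                       "irrigation_index"]))
```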

  16. A quantitative analysis of diabetic retinopathy screening in a regional treatment centre.

    LENUS (Irish Health Repository)

    James, M

    2014-11-01

    The aim of the study was to assess the current diabetic retinopathy screening infrastructure and the implications for workload at a designated treatment centre following roll-out of a national screening programme. A combination of chart analysis and patient questionnaire was undertaken over a 4-week period in 2011 at Cork University Hospital (CUH). Data were collected on 97 patients and categorized as demographic, medical, and screening-related. The majority of patients (80; 82.5%) had either no retinopathy or background retinopathy only. One (1.0%) patient was deemed ungradable due to dense cataract, while 6 (6.2%) patients had non-diabetic ocular pathology requiring follow-up. Only 11% were screened through retinal photography. In all, 74 (76.3%) patients were deemed suitable for community rather than hospital screening. Digital retinal photography is an underused screening resource. Significant numbers of patients could be discharged from hospital-based to community screening to offset the increased workload expected from the national screening programme.

  17. Relationships between regional economic sectors and water use in a water-scarce area in China: A quantitative analysis

    Science.gov (United States)

    Wang, Weiping; Gao, Lei; Liu, Pin; Hailu, Atakelty

    2014-07-01

    Northern China has been facing severe water scarcity as a result of vigorous economic growth, population expansion and changing lifestyles. A typical case is Shandong province, whose water resources per capita are approximately only a sixth of the national average and a twentieth of the global average. It is useful to assess the implications of the province’s growth and trade patterns for water use and water conservation strategies. This study quantitatively analyses relationships between regional economic sectors and water use in Shandong using an input-output model for virtual water resources. The changes in key indicators for 1997-2007 are tracked and the effects of water-saving policies on these changes are examined. The results highlight the benefits of applying a virtual water trade analysis to a water-scarce region where water resources exhibit highly heterogeneous temporal and geographical distributions. The net export of virtual water in Shandong was initially large, but this declined over the years and the province has recently become a net importer. Between 1997 and 2002, water use in most sectors increased due to rapid urbanisation and industrialisation. Since then, water use in all Shandong economic sectors has exhibited a downward trend despite continued increases in goods and services net exports, a trend which can be attributed to the vigorous implementation of water-saving policies and measures, especially water use quotas. Economic sectors consume water directly and indirectly, and understanding the pattern of virtual water trade implied by sectoral relationships is important for managing water scarcity problems. This study fills the knowledge gap in the existing literature created by the lack of case studies that dynamically assess virtual water trade and analyse the effects of water-saving policies and measures. The study draws policy recommendations that are relevant for future water planning in Shandong and other regions in northern China.
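
    The accounting behind a virtual-water input-output analysis rests on the Leontief inverse: total (direct plus indirect) water intensities are w(I - A)^-1, and the virtual water embodied in exports is that intensity applied to the export vector. The three-sector numbers below are invented for illustration and are not Shandong data.

```python
import numpy as np

# Hypothetical 3-sector economy: agriculture, industry, services
A = np.array([[0.20, 0.10, 0.02],   # technical coefficients (inputs per unit output)
              [0.15, 0.30, 0.10],
              [0.05, 0.10, 0.15]])
direct_water = np.array([800.0, 120.0, 20.0])  # m3 of water per unit of sectoral output
exports = np.array([5.0, 40.0, 10.0])          # final exports by sector (output units)

# Leontief inverse: total output required per unit of final demand
L = np.linalg.inv(np.eye(3) - A)

# Total (direct + indirect) water intensity per unit of final demand
total_intensity = direct_water @ L

# Virtual water embodied in exports, by sector and in total
virtual_water_exports = total_intensity * exports
print(np.round(total_intensity, 1), round(virtual_water_exports.sum(), 1))
```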

  18. Improved noninvasive assessment of coronary artery disease by quantitative analysis of regional stress myocardial distribution and washout of thallium-201

    International Nuclear Information System (INIS)

    Maddahi, J.; Garcia, E.V.; Berman, D.S.; Waxman, A.; Swan, H.J.C.; Forrester, J.

    1981-01-01

    Visual interpretation of stress-redistribution thallium-201 (201Tl) scintigrams is subject to observer variability and is suboptimal for evaluating the extent of coronary artery disease (CAD). An objective, computerized technique has been developed that quantitatively expresses the relative space-time myocardial distribution of 201Tl. Multiple-view, maximum-count circumferential profiles for stress myocardial distribution of 201Tl and segmental percent washout were analyzed in a pilot group of 31 normal subjects and 20 patients with CAD to develop quantitative criteria for abnormality. Subsequently, quantitative analysis was applied prospectively to a group of 22 normal subjects and 45 CAD patients and compared with visual interpretation of scintigrams for detection and evaluation of CAD. The sensitivity and specificity of the quantitative technique (93% and 91%, respectively) were not significantly different from those of the visual method (91% and 86%). Quantitative analysis significantly improved the sensitivity of 201Tl imaging over the visual method in the left anterior descending artery (from 56% to 80%), left circumflex artery (from 34% to 63%) and right coronary artery (from 65% to 94%) without significant loss of specificity. Using quantitative analysis, sensitivity for detection of diseased vessels did not diminish as the number of vessels involved increased, as it did with visual interpretation. In patients with one-vessel disease, 86% of the lesions were detected by both techniques; however, in patients with three-vessel disease, quantitative analysis detected 83% of the lesions, while the sensitivity was only 53% for the visual method. Seventy percent of the coronary arteries with moderate

  19. Quantitative Moessbauer analysis

    International Nuclear Information System (INIS)

    Collins, R.L.

    1978-01-01

    The quantitative analysis of Moessbauer data, as in the measurement of Fe3+/Fe2+ concentration, has not been possible because of the different mean square velocities (x2) of Moessbauer nuclei at chemically different sites. A method is now described which, based on Moessbauer data at several temperatures, permits the comparison of absorption areas at (x2) = 0. (Auth.)

  20. Quantitative analysis of DNA methylation at all human imprinted regions reveals preservation of epigenetic stability in adult somatic tissue

    Directory of Open Access Journals (Sweden)

    Woodfine Kathryn

    2011-01-01

    Full Text Available Abstract. Background: Genes subject to genomic imprinting are mono-allelically expressed in a parent-of-origin dependent manner. Each imprinted locus has at least one differentially methylated region (DMR) which has allele-specific DNA methylation and contributes to imprinted gene expression. Once DMRs are established, they are potentially able to withstand the normal genome reprogramming events that occur during cell differentiation, and germ-line DMRs are stably maintained throughout development. These DMRs, in addition to being either maternally or paternally methylated, differ in whether methylation was acquired in the germ line or post-fertilization, and are present in a variety of genomic locations with different cytosine-phosphate-guanine (CpG) densities and CTCF binding capacities. We therefore examined the stability of maintenance of DNA methylation imprints and determined the normal baseline DNA methylation levels in several adult tissues for all imprinted genes. To do this, we first developed and validated 50 highly specific, quantitative DNA methylation pyrosequencing assays for the known DMRs associated with human imprinted genes. Results: Remarkable stability of the DNA methylation imprint was observed in all germ-line DMRs and paternally methylated somatic DMRs (which maintained average methylation levels of between 35% and 65% in all somatic tissues), independent of gene expression. Maternally methylated somatic DMRs were found to have more variation, with tissue-specific methylation patterns. Most DMRs, however, showed some intra-individual variability in DNA methylation levels in peripheral blood, suggesting that more than one DMR needs to be examined in order to get an overall impression of the epigenetic stability in a tissue. The plasticity of DNA methylation at imprinted genes was examined in a panel of normal and cancer cell lines. All cell lines showed changes in DNA methylation, especially at the paternal germ

  1. Noninvasive identification of left main and triple vessel coronary artery disease: improved accuracy using quantitative analysis of regional myocardial stress distribution and washout of thallium-201

    International Nuclear Information System (INIS)

    Maddahi, J.; Abdulla, A.; Garcia, E.V.; Swan, H.J.; Berman, D.S.

    1986-01-01

    The capabilities of visual and quantitative analysis of stress redistribution thallium-201 scintigrams, exercise electrocardiography and exercise blood pressure response were compared for correct identification of extensive coronary disease, defined as left main or triple vessel coronary artery disease, or both (50% or more luminal diameter coronary narrowing), in 105 consecutive patients with suspected coronary artery disease. Extensive disease was present in 56 patients and the remaining 49 had either less extensive coronary artery disease (n = 34) or normal coronary arteriograms (n = 15). Although exercise blood pressure response, exercise electrocardiography and visual thallium-201 analysis were highly specific (98, 88 and 96%, respectively), they were insensitive for identification of patients with extensive disease (14, 45 and 16%, respectively). Quantitative thallium-201 analysis significantly improved the sensitivity of visual thallium-201 analysis for identification of patients with extensive disease (from 16 to 63%, p less than 0.001) without a significant loss of specificity (96 versus 86%, p = NS). Eighteen (64%) of the 28 patients who were misclassified by visual analysis as having less extensive disease were correctly classified as having extensive disease by virtue of quantitative analysis of regional myocardial thallium-201 washout. When the results of quantitative thallium-201 analysis were combined with those of blood pressure and electrocardiographic response to exercise, the sensitivity and specificity for identification of patients with extensive disease was 86 and 76%, respectively, and the highest overall accuracy (0.82) was obtained

  2. Quantitative computed tomography of lung parenchyma in patients with emphysema: analysis of higher-density lung regions

    Science.gov (United States)

    Lederman, Dror; Leader, Joseph K.; Zheng, Bin; Sciurba, Frank C.; Tan, Jun; Gur, David

    2011-03-01

    Quantitative computed tomography (CT) has been widely used to detect and evaluate the presence (or absence) of emphysema by applying density masks at specific thresholds, e.g., -910 or -950 Hounsfield units (HU). However, it has also been observed that subjects with similar density-mask based emphysema scores can have varying lung function, possibly indicating differences in disease severity. To assess this possible discrepancy, we investigated whether the density distribution of "viable" lung parenchyma regions with pixel values > -910 HU correlates with lung function. A dataset of 38 subjects, who underwent both pulmonary function testing and CT examinations in a COPD SCCOR study, was assembled. After the lung regions depicted on CT images were automatically segmented by a computerized scheme, we systematically divided the lung parenchyma into different density groups (bins) and computed a number of statistical features (i.e., mean, standard deviation (STD), and skewness of the pixel value distributions) in these density bins. We then analyzed the correlations between each feature and lung function. Correlations were observed between diffusing lung capacity (DLCO) and the STD of pixel values in the density bin just above -910 HU, and more generally between the distribution features of "viable" lung parenchyma and lung function, which indicates that, similar to the conventional density-mask method, the pixel value distribution features in "viable" lung parenchyma areas may also provide clinically useful information to improve assessments of lung disease severity as measured by lung function tests.
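
    The density-bin features described here (mean, standard deviation and skewness of pixel values within HU ranges above -910 HU) are straightforward to compute from a segmented lung volume. The bin edges in this sketch are assumptions chosen for illustration.

```python
import numpy as np
from scipy.stats import skew

def density_bin_features(lung_hu, bin_edges=(-910, -850, -750, -650, -500, 0)):
    """Summary statistics of 'viable' lung parenchyma (voxels > -910 HU).

    lung_hu : 1-D array of HU values inside the segmented lung.
    Returns, per density bin, the mean, standard deviation and skewness
    of the voxel values falling in that bin.
    """
    features = {}
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        vals = lung_hu[(lung_hu > lo) & (lung_hu <= hi)]
        if vals.size:
            features[(lo, hi)] = (vals.mean(), vals.std(), skew(vals))
    return features

# Toy example with synthetic HU values
rng = np.random.default_rng(5)
lung_hu = rng.normal(-820, 90, size=50_000)
for (lo, hi), (m, sd, sk) in density_bin_features(lung_hu).items():
    print(f"({lo}, {hi}] HU: mean={m:.1f}, std={sd:.1f}, skew={sk:.2f}")
```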

  3. Brain regions associated with cognitive impairment in patients with Parkinson disease: quantitative analysis of cerebral blood flow using 123I iodoamphetamine SPECT.

    Science.gov (United States)

    Hattori, Naoya; Yabe, Ichiro; Hirata, Kenji; Shiga, Tohru; Sakushima, Ken; Tsuji-Akimoto, Sachiko; Sasaki, Hidenao; Tamaki, Nagara

    2013-05-01

    Cognitive impairment is a representative neuropsychiatric presentation that accompanies Parkinson disease (PD). The purpose of this study was to localize the cerebral regions associated with cognitive impairment in patients with PD using quantitative SPECT. Thirty-two patients with PD (mean [SD] age, 75 [8] years; 25 women; Hoehn-Yahr scores from 2 to 5) underwent quantitative brain SPECT using 123I iodoamphetamine. Parametric images of regional cerebral blood flow (rCBF) were spatially normalized to the standard brain atlas. First, voxel-by-voxel comparison between patients with PD with versus without cognitive impairment was performed to visualize the overall trend of regional differences. Next, individual quantitative rCBF values were extracted in representative cortical regions using a standard region-of-interest template to compare the quantitative rCBF values. In the voxel-by-voxel analyses, patients with cognitive impairment showed trends of lower rCBF in the left frontal and temporal cortices as well as in the bilateral medial frontal and anterior cingulate cortices. Region-of-interest-based analysis demonstrated significantly lower rCBF in the bilateral anterior cingulate cortices (right, 25.8 [5.5] vs 28.9 [5.7] mL per 100 g/min; left, 25.8 [5.8] vs 29.1 [5.7] mL per 100 g/min), as well as lower rCBF in the left frontal and temporal cortices and the bilateral medial frontal cortices. The results suggested dysexecutive function as an underlying mechanism of cognitive impairment in patients with PD.

  4. γ-diketone central neuropathy: quantitative morphometric analysis of axons in rat spinal cord white matter regions and nerve roots

    International Nuclear Information System (INIS)

    LoPachin, Richard M.; Jortner, Bernard S.; Reid, Maria L.; Das, Soma

    2003-01-01

    A quantitative analytical method was used to measure myelinated axon morphometric parameters (e.g., axon area, ratio of axon area/fiber area, and index of circularity) in rat nervous tissue during intoxication with 2,5-hexanedione (HD). Parameters were assessed in nerve roots (dorsal and ventral) and in ascending (gracile fasciculus and spinocerebellar tract) and descending (corticospinal and rubrospinal tracts) spinal cord white matter tracts (L4-L5) of rats intoxicated with HD at two different daily dose-rates (175 or 400 mg HD/kg/day, gavage). For each dose-rate, tissue was sampled at four neurological endpoints: unaffected, slight, moderate, and severe toxicity, as determined by gait analysis and measurements of grip strength. Results indicate that, regardless of the HD dose-rate, axon atrophy (reduced axon area) was a widespread, abundant effect that developed in concert with neurological deficits. The atrophy response occurred contemporaneously in both ascending and descending spinal tracts, which suggests that loss of caliber developed simultaneously along the proximodistal axon axis. In contrast, swollen axons were a numerically small component and were present in nerve roots and spinal tracts only during subchronic intoxication at the lower HD dose-rate (i.e., 175 mg/kg/day). Intoxication at the higher dose-rate (400 mg/kg/day) produced neurological deficits in the absence of axonal swellings. These observations in conjunction with our previous studies of HD-induced peripheral neuropathy (Toxicol. Appl. Pharmacol. 135 (1995) 58; and Toxicol. Appl. Pharmacol. 165 (2000) 127) indicate that axon atrophy, and not axonal swelling, is a primary neuropathic phenomenon

  5. Selection of the regions of interest (SRI) in the SPECT semi-quantitative analysis of central dopaminergic receptors

    International Nuclear Information System (INIS)

    Baulieu, J.L.; Prunier-Levilion, C.; Tranquart, F.; Ribeiro, M.J.; Chartier, J.R.; Guilloteau, D.; Autret, A.; Besnard, J.C.; Bekhechi, D.; Chossat, F.

    1997-01-01

    The aim of this work was to compare different types of SRIs used in the SPECT semi-quantitative analysis of central dopaminergic receptors. SPECT with 123I-iodolisuride (Cis bio international) was carried out in the same center with a Helix-Elscint double-head camera with fan-beam collimators, one hour after injection of 123I-iodolisuride (190 ± 31 MBq). In 8 patients with Parkinson's disease (group 1) and 9 patients presenting an extrapyramidal syndrome with striatal involvement (group 2), two approaches to SRI tracing were undertaken: 1. geometrical, standard SRIs (circles, ellipses, rectangles); 2. anatomical, individual SRIs based on CT and perfusion scintigraphy. The SRIs were placed on the entire striatum, the head of the caudate nucleus, the putamen, the thalamus, the frontal and occipital cortex, and the cerebellum. In total, 31 ratios of striatal activity to the activity of a reference zone were calculated for each patient. The discriminative value of the ratios was evaluated by the p value of the comparison between groups 1 and 2. A correlation was sought between the ratios taken two by two. The most discriminative ratios were caudate/occipital, caudate/frontal and striatum/occipital, based on geometrical standard SRIs (p < 0.001, p = 0.002 and p = 0.003, respectively). A close correlation was found between the ratios with occipital and cerebellar references (r2 = 0.71), but not between the ratios with frontal and occipital references, or with frontal and cerebellar references. Under the conditions employed, geometrical tracing of the SRIs is preferable to anatomical tracing. The occipital cortex is the best reference, whereas frontal activity cannot be retained as a reference. The caudate/occipital ratios allow very good discrimination between Parkinson's disease and the other extrapyramidal syndromes investigated by 123I-iodolisuride SPECT.

  6. Optimization of Region of Interest Drawing for Quantitative Analysis: Differentiation Between Benign and Malignant Breast Lesions on Contrast-Enhanced Sonography.

    Science.gov (United States)

    Nakata, Norio; Ohta, Tomoyuki; Nishioka, Makiko; Takeyama, Hiroshi; Toriumi, Yasuo; Kato, Kumiko; Nogi, Hiroko; Kamio, Makiko; Fukuda, Kunihiko

    2015-11-01

    This study was performed to evaluate the diagnostic utility of quantitative analysis of benign and malignant breast lesions using contrast-enhanced sonography. Contrast-enhanced sonography using the perflubutane-based contrast agent Sonazoid (Daiichi Sankyo, Tokyo, Japan) was performed in 94 pathologically proven palpable breast mass lesions that could be depicted with B-mode sonography. Quantitative analyses using the time-intensity curve on contrast-enhanced sonography were performed for 5 region of interest (ROI) types (a manually traced ROI and circular ROIs of 5, 10, 15, and 20 mm in diameter). The peak signal intensity, initial slope, time to peak, positive enhancement integral, and wash-out ratio were investigated for each ROI. There were significant differences between benign and malignant lesions in the time to peak, suggesting that quantitative time-intensity curve analysis on contrast-enhanced sonography can help differentiate benign and malignant breast lesions. © 2015 by the American Institute of Ultrasound in Medicine.
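
    Time-intensity-curve parameters of the kind listed in this record (peak intensity, time to peak, initial slope, positive enhancement integral, wash-out ratio) can be computed from a sampled curve as below; the exact vendor definitions are not given in the abstract, so these formulas are illustrative conventions.

```python
import numpy as np

def tic_parameters(t, intensity, washout_delay=30.0):
    """Illustrative time-intensity-curve (TIC) parameters for one ROI.

    t         : time points (s)
    intensity : contrast signal intensity in the ROI
    The definitions are common conventions, not necessarily those of the study:
      peak    -> maximum intensity;  time_to_peak -> time of that maximum
      slope   -> mean rate of rise from the first sample to the peak
      pei     -> trapezoidal area under the curve up to the peak
      washout -> relative drop from the peak to `washout_delay` seconds later
    """
    t = np.asarray(t, dtype=float)
    intensity = np.asarray(intensity, dtype=float)
    i_peak = int(np.argmax(intensity))
    peak, ttp = intensity[i_peak], t[i_peak]
    slope = (peak - intensity[0]) / (ttp - t[0]) if ttp > t[0] else 0.0
    seg_t, seg_i = t[: i_peak + 1], intensity[: i_peak + 1]
    pei = float(np.sum(np.diff(seg_t) * (seg_i[:-1] + seg_i[1:]) / 2.0))
    i_late = min(int(np.searchsorted(t, ttp + washout_delay)), t.size - 1)
    washout = (peak - intensity[i_late]) / peak if peak > 0 else 0.0
    return dict(peak=peak, time_to_peak=ttp, initial_slope=slope,
                positive_enhancement_integral=pei, washout_ratio=washout)

# Toy curve resembling a bolus wash-in / wash-out
t = np.linspace(0.0, 120.0, 241)
curve = 100.0 * (t / 20.0) ** 2 * np.exp(-t / 20.0)
print(tic_parameters(t, curve))
```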

  7. Quantitative Analysis of Renogram

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Keun Chul [Seoul National University College of Medicine, Seoul (Korea, Republic of)

    1969-03-15

    value are useful for the differentiation of various renal diseases, however, qualitative analysis of the renogram with one or two parameters is not accurate. 3) In bilateral non-functioning kidney groups, a positive correlation between anemia and nitrogen retention was observed, although the quantitative assessment of the degree of non-functioning was impossible.

  8. Quantitative Analysis of Renogram

    International Nuclear Information System (INIS)

    Choi, Keun Chul

    1969-01-01

    are useful for the differentiation of various renal diseases, however, qualitative analysis of the renogram with one or two parameters is not accurate. 3) In bilateral non-functioning kidney groups, a positive correlation between anemia and nitrogen retention was observed, although the quantitative assessment of the degree of non-functioning was impossible.

  9. Quantitative analysis of receptor imaging

    International Nuclear Information System (INIS)

    Fu Zhanli; Wang Rongfu

    2004-01-01

    Model-based methods for the quantitative analysis of receptor imaging, including kinetic, graphical and equilibrium methods, are introduced in detail. Some technical problems facing the quantitative analysis of receptor imaging, such as correction for in vivo metabolism of the tracer and for the radioactivity contribution from blood volume within the ROI, and estimation of the nondisplaceable ligand concentration, are also reviewed briefly.

  10. [Cloning and sequence analysis of the DHBV genome of the brown ducks in Guilin region and establishment of the quantitative method for detecting DHBV].

    Science.gov (United States)

    Su, He-Ling; Huang, Ri-Dong; He, Song-Qing; Xu, Qing; Zhu, Hua; Mo, Zhi-Jing; Liu, Qing-Bo; Liu, Yong-Ming

    2013-03-01

    Brown ducks carrying DHBV are widely used as a hepatitis B animal model in research on the activity and toxicity of anti-HBV drugs. Studies have shown that the proportion of DHBV carriers among the brown ducks in the Guilin region is relatively high. Nevertheless, the characteristics of the DHBV genome of the Guilin brown duck remain unknown. Here we report the cloning of the genome of Guilin brown duck DHBV and the sequence analysis of the genome. The full length of the DHBV genome of the Guilin brown duck was 3,027 bp. Analysis using ORF finder found that, in addition to the S-ORF, P-ORF and C-ORF, the genome contains an ORF for an unknown peptide. Vector NTI 8.0 analysis revealed that the unknown peptide contains a motif that binds to HLA*0201. Alignment with DHBV sequences from different countries and regions indicated no obvious differences in regional distribution among the sequences. A fluorescence quantitative PCR assay for detecting DHBV was established based on the constructed recombinant plasmid pGEM-DHBV-S. This study lays the groundwork for using the Guilin brown duck as a hepatitis B animal model.

  11. Employment growth in rural regions of the EU; A quantitative analysis for the period 1980-1995

    NARCIS (Netherlands)

    Esposti, R.; Godeschalk, F.E.; Kuhmonen, T.; Post, J.H.; Sotte, F.; Terluin, I.J.

    1999-01-01

    In this report it is examined whether rural regions in the EU with a relatively high (low) employment growth in the 1980s and early 1990s have some common socio-economic characteristics, which can contribute to the explanation of their employment performance. We have grouped the socio-economic

  12. Spatial–Temporal Modeling for Regional Economic Development: A Quantitative Analysis with Panel Data from Western China

    Directory of Open Access Journals (Sweden)

    Jingxiao Zhang

    2017-10-01

    Full Text Available The objective of this research is to analyze regional economic differences and explore the influencing factors, which would eventually provide an effective foundation for narrowing regional economic differences. In this paper, a new regional economic difference model is established that considers the interactions between the spatial weight and human capital and foreign direct investment (FDI). With panel data from twelve western provinces in China, the empirical research is conducted by adopting a feasible generalized least squares (FGLS) fixed effects model. The preliminary results show that: (1) the spatial spillover effect of human capital and FDI is significant for the formation of regional economic differences; and (2) total capital formation, government expenditure, FDI, human capital and patent application authorization are positively correlated with GDP growth per capita, while the number of medical institutions is negatively correlated with GDP growth per capita. In addition, a robustness test is carried out for validation by using the filter variable method, a spatial lag model and a spatial error model. The robustness test results show that the results of the FGLS fixed effects model are validated by the filter variable method. The other two robustness test results show that: (1) total capital formation and fixed asset investment are significant at the 99.9% level, which indicates that they play a key role in the formation of economic development differences; and (2) the signs of the coefficients of the other variables are consistent with the FGLS fixed effects model but differ slightly in significance, which enhances the validity of the proposed regional economic difference model.

  13. Quantitative structure analysis of genetic diversity among spring bread wheats (Triticum aestivum L.) from different geographical regions.

    Science.gov (United States)

    Hai, Lin; Wagner, Carola; Friedt, Wolfgang

    2007-07-01

    Genetic diversity in spring bread wheat (T. aestivum L.) was studied in a total of 69 accessions. For this purpose, 52 microsatellite (SSR) markers were used and a total of 406 alleles were detected, of which 182 (44.8%) occurred at low frequency. The average gene diversity of the bread wheats was He = 0.65. A comparatively higher diversity was observed among wheat varieties from Southern European countries (Austria/Switzerland, Portugal/Spain) compared with those from other regions.

  14. Research awareness, attitudes and barriers among clinical staff in a regional cancer centre. Part 1: a quantitative analysis.

    Science.gov (United States)

    Caldwell, B; Coltart, K; Hutchison, C; McJury, M; Morrison, A; Paterson, C; Thomson, M

    2017-09-01

    Research is of key importance in delivering high-quality patient care through evidence-based practice. Attitude towards research and barriers to research can have an impact on research activity. A survey was conducted to establish the levels of research awareness and attitudes among clinical staff groups in this regional cancer centre and identify any barriers to participation in research. The survey consisted of 26 questions and was distributed electronically and completed online. The response rate was 22.3% (n = 123). All participants felt that clinical research will help the regional cancer centre develop and progress treatments in the future. A positive attitude towards research was evident and consistent across professional groups. The main identified barriers to research included lacking the required knowledge, skills and training, lacking support from managers, and lack of opportunity or time to be involved in research, in particular for allied health professionals. However, there appears to be the foundation of a healthy research culture for nurses supported by management. The results of the survey support the implementation of an action plan based on the recommendations of this journal article. © 2016 John Wiley & Sons Ltd.

  15. Monotowns: A Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Shastitko Andrei

    2016-06-01

    Full Text Available The authors propose an empirical analysis of the current situation in monotowns. The study questions the perceived seriousness of the ‘monotown problem’ as well as the actual challenges it presents. The authors use a cluster analysis to divide monotowns into groups for further structural comparison. The structural differences in the available databases limit the possibilities of empirical analysis. Hence, alternative approaches are required. The authors consider possible reasons for the limitations identified. Special attention is paid to the monotowns that were granted the status of advanced development territories. A comparative analysis makes it possible to study their general characteristics and socioeconomic indicators. The authors apply the theory of opportunistic behaviour to describe potential problems caused by the lack of unified criteria for granting monotowns the status of advanced development territories. The article identifies the main stakeholders and the character of their interaction; it describes a conceptual model built on the principal/agent interactions, and identifies the parametric space of mutually beneficial cooperation. The solution to the principal/agent problem suggested in the article contributes to the development of an alternative approach to the current situation and a rational approach to overcoming the ‘monotown problem’.

  16. Quantitative Concept Analysis

    NARCIS (Netherlands)

    Pavlovic, Dusko; Domenach, Florent; Ignatov, Dmitry I.; Poelmans, Jonas

    2012-01-01

    Formal Concept Analysis (FCA) begins from a context, given as a binary relation between some objects and some attributes, and derives a lattice of concepts, where each concept is given as a set of objects and a set of attributes, such that the first set consists of all objects that satisfy all

  17. Evaluation of antibiotic usage in swine reproduction farms in Umbria region based on the quantitative analysis of antimicrobial consumption

    Directory of Open Access Journals (Sweden)

    Fausto Scoppetta

    2017-09-01

    Full Text Available Antibiotic use in food-producing animals has a considerable impact on public health, especially with respect to the development and spread of antibiotic resistance. Pigs are one of the main species in which antibiotics are frequently used for different purposes. Surveillance of antibiotic consumption and dose appropriateness, through novel approaches based on defined daily doses, is strongly needed to assess a farm's antibiotic risk, in terms of the spread of antibiotic resistance and the possible presence of residues in meat. In this study, antibiotic consumption was monitored in 14 swine reproduction farms, together with managerial, structural, and health aspects. Most of the monitored farms (65%) were classified as at medium antibiotic risk, 21% at high antibiotic risk, and 14% at low antibiotic risk. Critical aspects of antibiotic administration concerned treatments for suckling and weaner piglets, oral antibiotic administration, treatment and diagnosis of gastroenteric infections, and use of antimicrobials critically important for human medicine, especially colistin. These could be considered the critical aspects of antibiotic use in farrow-to-wean/finish swine farms in the Umbria region and must be controlled to minimize risks. Even though only a small number of farms in the Umbria region are at high antibiotic risk, the risk of antibiotic resistance should be minimized, and management and biosecurity of the farms should be improved by extending the use of antimicrobial susceptibility tests and optimizing the diagnostic methods for infectious diseases. Furthermore, farmers' and veterinarians' knowledge of antibiotic resistance should be improved and the prudent use of antibiotics encouraged to prevent the development and spread of resistant microorganisms.

  18. Optimization of sample absorbance for quantitative analysis in the presence of pathlength error in the IR and NIR regions

    International Nuclear Information System (INIS)

    Hirschfeld, T.; Honigs, D.; Hieftje, G.

    1985-01-01

    Optical absorbance levels for quantitative analysis in the presence of photometric error have been described in the past. In newer instrumentation, such as FT-IR and NIRA spectrometers, the photometric error is no longer limiting. In these instruments, pathlength error due to cell or sampling irreproducibility is often a major concern. One can derive the optimal absorbance by taking both pathlength and photometric errors into account. This paper analyzes the cases of pathlength error >> photometric error (trivial) and various cases in which the pathlength errors and the photometric error are of the same order: adjustable concentration (trivial until dilution errors are considered), constant relative pathlength error (trivial), and constant absolute pathlength error. The latter, in particular, is analyzed in detail to give the behavior of the error, the behavior of the optimal absorbance in its presence, and the total error levels attainable.
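
    For context, the classical photometric-error-only optimum, which the abstract extends to include pathlength error, follows directly from Beer's law; a minimal derivation for a constant absolute transmittance error ΔT:

```latex
% Beer's law: A = \varepsilon b c = -\log_{10} T, so c = -\log_{10} T / (\varepsilon b).
% For a constant absolute transmittance error \Delta T, the relative
% concentration error is
\frac{\Delta c}{c} \;=\; \frac{\log_{10} e}{T\,\log_{10} T}\,\Delta T
\;=\; \frac{\Delta T}{T \ln T}.
% |\Delta c / c| is minimised where d(T \ln T)/dT = \ln T + 1 = 0, i.e.
T_{\mathrm{opt}} = e^{-1} \approx 0.368,
\qquad
A_{\mathrm{opt}} = \log_{10} e \approx 0.434 .
```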

  19. Energy potential of region and its quantitative assessment

    Directory of Open Access Journals (Sweden)

    Tatyana Aleksandrovna Kovalenko

    2013-09-01

    Full Text Available The purpose of the article is to develop the concept of the energy potential of the region (EPR), to analyse the existing structure of relationships between the EPR elements in Ukraine, and to improve the quantitative assessment of the energy potential of a region (country). The subject matter of the research is methods for assessing the existing condition of the energy potential of a territory. As a result of analysing definitions of the concept of the energy potential of the region, the concept was further developed to include the consumer potential of energy resources and management capacity. The structure of relationships between elements of the energy potential is developed for the regions of Ukraine. A new economic indicator, the realized energy potential, is proposed for EPR assessment. By means of this indicator, an assessment of the energy potential is performed for different countries of the world and for a number of regions of Ukraine.

  20. Quantitation of regional cerebral blood flow corrected for partial volume effect using O-15 water and PET: I. Theory, error analysis, and stereologic comparison

    DEFF Research Database (Denmark)

    Iida, H; Law, I; Pakkenberg, B

    2000-01-01

    Limited spatial resolution of positron emission tomography (PET) can cause significant underestimation of the observed regional radioactivity concentration (the so-called partial volume effect, or PVE), resulting in systematic errors in estimating quantitative physiologic parameters. The authors have formulated four mathematical models that describe the dynamic behavior of a freely diffusible tracer (H2(15)O) in a region of interest (ROI), incorporating estimates of regional tissue flow that are independent of PVE. The current study was intended to evaluate the feasibility of these models and to establish a methodology to accurately quantify regional cerebral blood flow (CBF) corrected for PVE in cortical gray matter regions. Five monkeys were studied with PET after IV H2(15)O two times (n = 3) or three times (n = 2) in a row. Two ROIs were drawn on structural magnetic resonance imaging (MRI) scans and projected

  1. Quantitative Analysis of cardiac SPECT

    International Nuclear Information System (INIS)

    Nekolla, S.G.; Bengel, F.M.

    2004-01-01

    The quantitative analysis of myocardial SPECT images is a powerful tool to extract the highly specific radiotracer uptake in these studies. If compared with normal databases, the uptake values can be calibrated on an individual basis. Doing so increases the reproducibility of the analysis substantially. Based on developments over the last three decades, starting from planar scintigraphy, this paper discusses the methods used today, incorporating the changes due to tomographic image acquisition. Finally, the limitations of these approaches as well as the consequences of the most recent hardware developments, commercial analysis packages and a wider view of the description of the left ventricle are discussed. (orig.)

  2. Quantitative analysis of thallium-201 myocardial scintigraphy

    International Nuclear Information System (INIS)

    Kanemoto, Nariaki; Hoer, G.; Johost, S.; Maul, F.-D.; Standke, R.

    1981-01-01

    A method for the quantitative analysis of thallium-201 myocardial scintigraphy using a computer-assisted technique is described. The calculated indices are the washout factor, vitality index and redistribution factor. The washout factor is the ratio of the counts at a certain period of time after exercise to the counts immediately after exercise. This value is necessary for evaluating redistribution to ischemic areas in serial images, to correct for the Tl-201 washout from the myocardium under the assumption that the washout is constant in the whole myocardium. The vitality index is the ratio between the Tl-201 uptake in the region of interest and the maximum uptake. The redistribution factor is the ratio of the redistribution in the region of interest in serial images after exercise to that immediately after exercise. Four examples of exercise Tl-201 myocardial scintigrams and the quantitative analyses before and after percutaneous transluminal coronary angioplasty are presented. (author)
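
    The three indices defined in this abstract are simple count ratios; a short sketch, with input values chosen here purely for illustration:

```python
def washout_factor(counts_delayed, counts_immediate):
    """Ratio of myocardial counts at a later time point to the counts
    immediately after exercise (used to correct for Tl-201 washout)."""
    return counts_delayed / counts_immediate

def vitality_index(roi_counts, max_counts):
    """Tl-201 uptake in a region of interest relative to the maximum uptake."""
    return roi_counts / max_counts

def redistribution_factor(roi_counts_delayed, roi_counts_immediate):
    """Redistribution in a region of interest on serial images relative to the
    image acquired immediately after exercise."""
    return roi_counts_delayed / roi_counts_immediate

# Hypothetical counts for one ROI (arbitrary units)
print(washout_factor(640.0, 800.0))         # global washout over the delay interval
print(vitality_index(540.0, 900.0))         # ROI uptake vs. the hottest segment
print(redistribution_factor(520.0, 560.0))  # change in the ischemic ROI
```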

  3. Analysis and interpretation of specific ethanol metabolites, ethyl sulfate, and ethyl glucuronide in sewage effluent for the quantitative measurement of regional alcohol consumption.

    Science.gov (United States)

    Reid, Malcolm J; Langford, Katherine H; Mørland, Jørg; Thomas, Kevin V

    2011-09-01

    The quantitative measurement of urinary metabolites in sewage streams and the subsequent estimation of consumption rates of the parent compounds have previously been demonstrated for pharmaceuticals and narcotics. Ethyl sulfate and ethyl glucuronide are excreted in urine following the ingestion of alcohol and are useful biomarkers for the identification of acute alcohol consumption. This study reports a novel ion-exchange-mediated chromatographic method for the quantitative measurement of ethyl sulfate and ethyl glucuronide in sewage effluent, and presents a novel calculation method for relating the resulting sewage concentrations to rates of alcohol consumption in the region. A total of 100 sewage samples covering a 25-day period were collected from a treatment plant servicing approximately 500,000 people and analyzed for levels of ethyl sulfate and ethyl glucuronide. The resulting data were then used to estimate combined alcohol consumption rates for the region, and the results were compared with alcohol-related sales statistics for the same region. Ethyl glucuronide was found to be unstable in sewage effluent. Ethyl sulfate was stable and measurable in all samples at concentrations ranging from 16 to 246 nM. The highest concentrations of the alcohol biomarker were observed during weekend periods. Sixty-one percent of the total mass of ethyl sulfate in sewage effluent corresponds to alcohol consumption on Friday and Saturday. Sales statistics for alcohol show that consumption in the region is approximately 6,750 kg/d. The quantity of ethyl sulfate passing through the sewage system is consistent with consumption of 4,900 to 7,800 kg/d. Sewage epidemiology assessments of ethyl sulfate can provide accurate estimates of community alcohol consumption, and detailed examination of the kinetics of this biomarker in sewage streams can also identify time-dependent trends in alcohol consumption patterns. © 2011 by the Research Society on Alcoholism.
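
    A common wastewater-epidemiology back-calculation scales the daily ethyl sulfate (EtS) load by the ethanol/EtS molar-mass ratio and divides by the fraction of an ethanol dose excreted as EtS. The sketch below illustrates that generic approach, not necessarily the calculation method introduced in this paper; the flow and excretion-fraction values are assumptions.

```python
# Molar masses (g/mol)
MW_ETHANOL = 46.07
MW_ETHYL_SULFATE = 126.13

def alcohol_consumed_kg_per_day(ets_nM, flow_m3_per_day, excreted_fraction=0.00012):
    """Back-calculate community ethanol consumption from ethyl sulfate (EtS).

    ets_nM            : EtS concentration in sewage (nmol/L)
    flow_m3_per_day   : sewage flow reaching the plant (m3/day)
    excreted_fraction : assumed mass fraction of an ethanol dose excreted as EtS
                        (illustrative value; literature estimates vary)
    """
    ets_g_per_L = ets_nM * 1e-9 * MW_ETHYL_SULFATE      # nmol/L -> g/L
    litres_per_day = flow_m3_per_day * 1000.0            # m3 -> L
    ets_kg_per_day = ets_g_per_L * litres_per_day / 1000.0   # g -> kg
    ethanol_excreted = ets_kg_per_day * MW_ETHANOL / MW_ETHYL_SULFATE
    return ethanol_excreted / excreted_fraction

# Hypothetical day: 150 nM EtS in 200,000 m3/day of sewage
print(round(alcohol_consumed_kg_per_day(150, 200_000)), "kg ethanol/day")
```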

  4. Quantitative pulsed eddy current analysis

    International Nuclear Information System (INIS)

    Morris, R.A.

    1975-01-01

    The potential of pulsed eddy current (PEC) testing for furnishing more information than conventional single-frequency eddy current methods has been known for some time. However, a fundamental problem has been analyzing the pulse shape with sufficient precision to produce accurate quantitative results. Accordingly, the primary goals of this investigation were to demonstrate ways of digitizing the short pulses encountered in PEC testing and to develop empirical analysis techniques that would predict some of the parameters (e.g., depth) of simple types of defects. This report describes a digitizing technique using a computer and either a conventional nuclear ADC or a fast transient analyzer; the computer software used to collect and analyze pulses; and some of the results obtained. (U.S.)

  5. Quantitative circumferential strain analysis using adenosine triphosphate-stress/rest 3-T tagged magnetic resonance to evaluate regional contractile dysfunction in ischemic heart disease

    Energy Technology Data Exchange (ETDEWEB)

    Nakamura, Masashi, E-mail: m.nakamura1230@gmail.com [Department of Radiology, Ehime University Graduate School of Medicine, Shitsukawa, Toon-city, Ehime 791-0295 (Japan); Kido, Tomoyuki [Department of Radiology, Saiseikai Matsuyama Hospital, Ehime 791-0295 (Japan); Kido, Teruhito; Tanabe, Yuki; Matsuda, Takuya; Nishiyama, Yoshiko; Miyagawa, Masao; Mochizuki, Teruhito [Department of Radiology, Ehime University Graduate School of Medicine, Shitsukawa, Toon-city, Ehime 791-0295 (Japan)

    2015-08-15

    Highlights: • Infarcted segments could be differentiated from non-ischemic and ischemic segments with high sensitivity and specificity under resting conditions. • The time-to-peak circumferential strain values in infarcted segments were significantly more delayed than those in non-ischemic and ischemic segments. • Both circumferential strain and circumferential systolic strain rate values under ATP-stress conditions were significantly lower in ischemic segments than in non-ischemic segments. • The stress-rest difference in circumferential strain had a higher diagnostic capability for ischemia than rest or ATP-stress circumferential strain values alone. • A circumferential strain analysis using tagged MR can quantitatively assess contractile dysfunction in ischemic and infarcted myocardium. - Abstract: Purpose: We evaluated whether a quantitative circumferential strain (CS) analysis using adenosine triphosphate (ATP)-stress/rest 3-T tagged magnetic resonance (MR) imaging can depict myocardial ischemia as contractile dysfunction during stress in patients with suspected coronary artery disease (CAD). We evaluated whether it can differentiate between non-ischemia, myocardial ischemia, and infarction. We assessed its diagnostic performance in comparison with ATP-stress myocardial perfusion MR and late gadolinium enhancement (LGE)-MR imaging. Methods: In 38 patients suspected of having CAD, myocardial segments were categorized as non-ischemic (n = 485), ischemic (n = 74), or infarcted (n = 49) from the results of perfusion MR and LGE-MR. The peak negative CS value, peak circumferential systolic strain rate (CSR), and time-to-peak CS were measured in 16 segments. Results: A cutoff value of −12.0% for CS at rest allowed differentiation between infarcted and other segments with a sensitivity of 79%, specificity of 76%, accuracy of 76%, and an area under the curve (AUC) of 0.81. Additionally, a cutoff value of 477.3 ms for time-to-peak CS at rest

  6. Quantitative conformational analysis of the core region of N-glycans using residual dipolar couplings, aqueous molecular dynamics, and steric alignment

    International Nuclear Information System (INIS)

    Almond, Andrew; Duus, Jens O.

    2001-01-01

    A method is described for quantitatively investigating the dynamic conformation of small oligosaccharides containing an α(1 → 6) linkage. It was applied to the oligosaccharide Man-α(1 → 3) {Man-α (1 → 6)}Man-α-O-Me, which is a core region frequently observed in N-linked glycans. The approach tests an aqueous molecular dynamics simulation, capable of predicting microscopic dynamics, against experimental residual dipolar couplings, by assuming that alignment is caused purely by steric hindrance. The experimental constraints were heteronuclear and homonuclear residual dipolar couplings, and in particular those within the α(1 → 6) linkage itself. Powerful spin-state-selective pulse sequences and editing schemes were used to obtain the most relevant couplings for testing the model. Molecular dynamics simulations in water over a period of 50 ns were not able to predict the correct rotamer population at the α(1 → 6) linkage to agree with the experimental data. However, this sampling problem could be corrected using a simple maximum likelihood optimisation, indicating that the simulation was modelling local dynamics correctly. The maximum likelihood prediction of the residual dipolar couplings was found to be an almost equal population of the gg and gt rotamer conformations at the α(1 → 6) linkage, and the tg conformation was predicted to be unstable and unpopulated in aqueous solution. In this case all twelve measured residual dipolar couplings could be satisfied. This conformer population could also be used to make predictions of scalar couplings with the use of a previously derived empirical equation, and is qualitatively in agreement with previous predictions based on NMR, X-ray crystallography and optical data

  7. Quantitative analysis of coupler tuning

    International Nuclear Information System (INIS)

    Zheng Shuxin; Cui Yupeng; Chen Huaibi; Xiao Liling

    2001-01-01

    On the basis of a coupling-cavity chain equivalent-circuit model, the authors deduce the equation relating coupler frequency deviation Δf and coupling coefficient β, instead of only giving the adjusting direction in the process of matching the coupler. According to this equation, automatic measurement and quantitative display are realized on a measuring system. This contributes to the industrialization of traveling-wave accelerators for large container inspection systems.

  8. Regional reliability of quantitative signal targeting with alternating radiofrequency (STAR) labeling of arterial regions (QUASAR).

    Science.gov (United States)

    Tatewaki, Yasuko; Higano, Shuichi; Taki, Yasuyuki; Thyreau, Benjamin; Murata, Takaki; Mugikura, Shunji; Ito, Daisuke; Takase, Kei; Takahashi, Shoki

    2014-01-01

    Quantitative signal targeting with alternating radiofrequency labeling of arterial regions (QUASAR) is a recent spin labeling technique that could improve the reliability of brain perfusion measurements. Although it is considered reliable for measuring gray matter as a whole, it has never been evaluated regionally. Here we assessed this regional reliability. Using a 3-Tesla Philips Achieva whole-body system, we scanned 10 healthy volunteers four times, in two sessions 2 weeks apart, to obtain QUASAR images. We computed perfusion images and ran a voxel-based analysis within all brain structures. We also calculated mean regional cerebral blood flow (rCBF) within regions of interest configured for each arterial territory distribution. The mean CBF over the whole gray matter was 37.74, with an intraclass correlation coefficient (ICC) of 0.70. In white matter, it was 13.94, with an ICC of 0.30. Voxel-wise ICC and coefficient-of-variation maps showed relatively lower reliability in watershed areas and white matter, especially in deeper white matter. The absolute mean rCBF values were consistent with those reported from PET, as was the relatively low variability in different feeding arteries. Thus, QUASAR reliability for regional perfusion is high within gray matter, but uncertain within white matter. © 2014 The Authors. Journal of Neuroimaging published by the American Society of Neuroimaging.
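
    Test-retest reliability of the kind reported here is commonly summarised with an intraclass correlation coefficient. The sketch below computes a one-way random-effects ICC(1,1) from a subjects-by-sessions matrix; the simulated data and the specific ICC variant are assumptions, since the abstract does not state which form was used.

```python
import numpy as np

def icc_oneway(data):
    """One-way random-effects ICC(1,1) for a (subjects x sessions) array."""
    data = np.asarray(data, dtype=float)
    n, k = data.shape
    grand = data.mean()
    ms_between = k * ((data.mean(axis=1) - grand) ** 2).sum() / (n - 1)
    ms_within = ((data - data.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Hypothetical gray-matter CBF values for 10 subjects scanned 4 times each
rng = np.random.default_rng(1)
subject_effect = rng.normal(38.0, 6.0, size=(10, 1))   # between-subject spread
measurements = subject_effect + rng.normal(0.0, 3.0, size=(10, 4))  # scan noise
print(round(icc_oneway(measurements), 2))
```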

  9. Quantitative map interpretation in regional planning surveys. | J.A. ...

    African Journals Online (AJOL)

    A procedure for the quantitative interpretation of maps compiled for regional planning purposes in the Upper Orange catchment basin is presented. The analyses provided useful figures concerning the distribution of dominant vegetation components and their association with relevant habitat factors. Keywords: ...

  10. Methods in quantitative image analysis.

    Science.gov (United States)

    Oberholzer, M; Ostreicher, M; Christen, H; Brühlmann, M

    1996-05-01

    The main steps of image analysis are image capturing, image storage (compression), correcting imaging defects (e.g. non-uniform illumination, electronic noise, glare effect), image enhancement, segmentation of objects in the image and image measurements. Digitisation is performed by a camera. The most modern types include a frame-grabber, converting the analog signal into digital (numerical) information. The numerical information consists of the grey values describing the brightness of every point within the image, named a pixel. The information is stored in bits. Eight bits are summarised in one byte. Therefore, grey values can take one of 256 (2^8) values, i.e. between 0 and 255. The human eye seems to be quite content with a display of 6-bit images (corresponding to 64 different grey values). In a digitised image, the pixel grey values can vary within regions that are uniform in the original scene: the image is noisy. The noise is mainly manifested in the background of the image. For an optimal discrimination between different objects or features in an image, uniformity of illumination in the whole image is required. These defects can be minimised by shading correction [subtraction of a background (white) image from the original image, pixel by pixel, or division of the original image by the background image]. The brightness of an image represented by its grey values can be analysed for every single pixel or for a group of pixels. The most frequently used pixel-based image descriptors are optical density, integrated optical density, the histogram of the grey values, mean grey value and entropy. The distribution of the grey values existing within an image is one of the most important characteristics of the image. However, the histogram gives no information about the texture of the image. The simplest way to improve the contrast of an image is to expand the brightness scale by spreading the histogram out to the full available range. Rules for transforming the grey value
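
    The shading correction and contrast stretching described above translate directly into code. The sketch below applies both to a synthetic image; the array sizes, the illumination model and the function names are invented for illustration.

```python
import numpy as np

def shading_correct(image, background, mode="divide"):
    """Correct non-uniform illumination with a background (white) image."""
    image = image.astype(float)
    background = background.astype(float)
    if mode == "subtract":
        return image - background                       # pixel-by-pixel subtraction
    return image / np.maximum(background, 1e-6)         # pixel-by-pixel division

def stretch_contrast(image, out_max=255.0):
    """Spread the histogram over the full available grey-value range."""
    lo, hi = image.min(), image.max()
    return (image - lo) / (hi - lo) * out_max

# Synthetic example: a uniform object imaged under an illumination gradient
gradient = np.linspace(0.5, 1.0, 64)[None, :] * np.ones((64, 64))
scene = 100.0 * gradient                  # acquired image
background = 200.0 * gradient             # white reference image
flat = shading_correct(scene, background)
print(np.allclose(flat, 0.5))             # gradient removed -> True
stretched = stretch_contrast(scene)
print(stretched.min(), stretched.max())   # 0.0 255.0
```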

  11. Quantitative analysis of boron by neutron radiography

    International Nuclear Information System (INIS)

    Bayuelken, A.; Boeck, H.; Schachner, H.; Buchberger, T.

    1990-01-01

    The quantitative determination of boron in ores is a long process with chemical analysis techniques. As nuclear techniques like X-ray fluorescence and activation analysis are not applicable for boron, only the neutron radiography technique, using the high neutron absorption cross section of this element, can be applied for quantitative determinations. This paper describes preliminary tests and calibration experiments carried out at a 250 kW TRIGA reactor. (orig.) [de

  12. Nuclear power regional analysis

    International Nuclear Information System (INIS)

    Parera, María Delia

    2011-01-01

    In this study, a regional analysis of the Argentine electricity market was carried out considering the effects of regional cooperation and of national and international interconnections; additionally, the possibilities of inserting new nuclear power plants in different regions were evaluated, indicating the most suitable areas for these facilities in order to increase the penetration of nuclear energy in the national energy matrix. The interconnection of the electricity and natural gas markets, due to the linkage between both energy forms, was also studied. For this purpose, the MESSAGE program (Model for Energy Supply Strategy Alternatives and their General Environmental Impacts), promoted by the International Atomic Energy Agency (IAEA), was used. This model performs a country-level economic optimization, resulting in the minimum cost for the modelled system. The regionalization used is that of the Wholesale Electricity Market Management Company (CAMMESA, by its Spanish acronym), which divides the country into eight regions. The characteristics and needs of each region, their respective demands and supplies of electricity and natural gas, as well as existing and planned interconnections, consisting of power lines and pipelines, were taken into account. According to the results obtained through the model, nuclear is a competitive option. (author) [es

  13. Regional Shelter Analysis Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Dillon, Michael B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dennison, Deborah [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kane, Jave [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Walker, Hoyt [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Miller, Paul [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-08-01

    The fallout from a nuclear explosion has the potential to injure or kill 100,000 or more people through exposure to external gamma (fallout) radiation. Existing buildings can reduce radiation exposure by placing material between fallout particles and exposed people. Lawrence Livermore National Laboratory was tasked with developing an operationally feasible methodology that could improve fallout casualty estimates. The methodology, called a Regional Shelter Analysis, combines the fallout protection that existing buildings provide civilian populations with the distribution of people in various locations. The Regional Shelter Analysis method allows the consideration of (a) multiple building types and locations within buildings, (b) country specific estimates, (c) population posture (e.g., unwarned vs. minimally warned), and (d) the time of day (e.g., night vs. day). The protection estimates can be combined with fallout predictions (or measurements) to (a) provide a more accurate assessment of exposure and injury and (b) evaluate the effectiveness of various casualty mitigation strategies. This report describes the Regional Shelter Analysis methodology, highlights key operational aspects (including demonstrating that the methodology is compatible with current tools), illustrates how to implement the methodology, and provides suggestions for future work.
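
    The core combination step, weighting the protection that each building type provides by the fraction of the population located there, can be illustrated with a minimal sketch. The building categories, protection factors and occupancy fractions below are invented for illustration and are not values from the report.

```python
# Population-weighted fallout protection, in the spirit of a Regional
# Shelter Analysis. Protection factors (PF) and occupancy fractions are
# purely illustrative assumptions.

buildings = {
    # building/location: (protection factor, fraction of population there)
    "wood-frame house, above ground": (3.0, 0.45),
    "masonry apartment, upper floor": (10.0, 0.25),
    "large building core / basement": (40.0, 0.20),
    "outdoors":                        (1.0, 0.10),
}

def mean_dose_fraction(distribution):
    """Average fraction of the unprotected outdoor dose received by the population."""
    assert abs(sum(frac for _, frac in distribution.values()) - 1.0) < 1e-9
    return sum(frac / pf for pf, frac in distribution.values())

day = mean_dose_fraction(buildings)
print(f"Population receives {day:.2%} of the unprotected outdoor dose")
```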

  14. Reactor applications of quantitative diffraction analysis

    International Nuclear Information System (INIS)

    Feguson, I.F.

    1976-09-01

    Current work in quantitative diffraction analysis was presented under the main headings of: thermal systems, fast reactor systems, SGHWR applications and irradiation damage. Preliminary results are included on a comparison of various new instrumental methods of boron analysis as well as preliminary new results on Zircaloy corrosion, and materials transfer in liquid sodium. (author)

  15. Quantitative image analysis of synovial tissue

    NARCIS (Netherlands)

    van der Hall, Pascal O.; Kraan, Maarten C.; Tak, Paul Peter

    2007-01-01

    Quantitative image analysis is a form of imaging that includes microscopic histological quantification, video microscopy, image analysis, and image processing. Hallmarks are the generation of reliable, reproducible, and efficient measurements via strict calibration and step-by-step control of the

  16. Analysis of changes in sagging type ST-T segment induced by exercise. Assessment of regional myocardial blood flow using quantitative 13NH3 positron emission tomography

    International Nuclear Information System (INIS)

    Watanabe, Takuya; Akutsu, Yasushi; Okazaki, Osamu

    1995-01-01

    Regional myocardial blood flow (RMBF) associated with exercise-induced ST depression was assessed using 13NH3 positron emission tomography (PET) to determine the significance of horizontal and sagging type ST segments. The subjects were 25 patients with angina pectoris, 25 patients with myocardial infarction, and 5 healthy male volunteers. Eleven regions of interest (ROIs) were prepared to calculate RMBF. ST segments were unchanged in 27 patients (Group A) and were depressed in 23 patients (Group B). An increase in RMBF of 10% or more was observed significantly more often in Group A (74.1%) than in Group B (34.8%). In Group B, ST depression was divided into horizontal type (8 patients) and sagging type (15 patients). According to the type of ST depression, RMBF was increased by 10% or more in 50% (4/8) for the horizontal type and in 26.7% (4/15) for the sagging type. These findings suggested that an unfavorable increase in RMBF in stenosis-related coronary vessels may contribute to the development of ST depression induced by exercise. A constant increase in RMBF in all ROIs, including those with an unfavorable RMBF increase, may be involved in the occurrence of horizontal type ST depression; sagging type ST depression may, however, occur through an increased difference in blood flow between unfavorable and favorable RMBF. (N.K.)

  17. The Role of Hemispheral Asymmetry and Regional Activity of Quantitative EEG in Children with Stuttering

    Science.gov (United States)

    Ozge, Aynur; Toros, Fevziye; Comelekoglu, Ulku

    2004-01-01

    We investigated the role of delayed cerebral maturation, hemisphere asymmetry and regional differences in children with stuttering and healthy controls during resting state and hyperventilation, using conventional EEG techniques and quantitative EEG (QEEG) analysis. This cross-sectional case control study included 26 children with stuttering and…

  18. Quantitative and regional evaluation methods for lung scintigraphs

    International Nuclear Information System (INIS)

    Fichter, J.

    1982-01-01

    New criteria are presented for the evaluation of perfusion lung scintigraphs, both with regard to quantitative evaluation and with regard to the choice of regions. In addition to the usual method of sectioning each lung into upper, middle and lower levels and determining the per cent share of each in the total activity, the following values were established: the median of the activity distribution, and the differences in the per cent counting rate as well as in the median between the corresponding regions of the right and left lung. The individual regions should describe the functional structures (lobe and segment structure). A corresponding computer program projects lobe and segment regions in a simplified form onto the scintigraph, taking individual lung stretching into account. With the help of a clinical study of 60 patients and 18 control persons with 99mTc-MAA and 133Xe-gas lung scintigraphs, the following results could be determined: depending on the combination of the 32 parameters available for evaluation and the choice of regions, between 4 and 20 of the 60 patients were falsely classified as negative and 1 to 2 of the 18 controls were falsely classified as positive. The accuracy of the Tc scintigraph proved to be better. Altogether, using the best possible parameter combinations, comparable results were attained. (TRV) [de

  19. Quantitative X-ray analysis of pigments

    International Nuclear Information System (INIS)

    Araujo, M. Marrocos de

    1987-01-01

    The 'matrix-flushing' and 'adiabatic principle' methods have been applied to the quantitative analysis, through X-ray diffraction patterns, of mixtures of pigments and extenders frequently used in the paint industry. The results obtained have shown the usefulness of these methods, but their accuracy still needs to be improved. (Author) [pt

  20. Uncertainties in elemental quantitative analysis by PIXE

    International Nuclear Information System (INIS)

    Montenegro, E.C.; Baptista, G.B.; Paschoa, A.S.; Barros Leite, C.V.

    1979-01-01

    The effects of the degree of non-uniformity of the particle beam, matrix composition and matrix thickness in a quantitative elemental analysis by particle induced X-ray emission (PIXE) are discussed and a criterion to evaluate the resulting degree of uncertainty in the mass determination by this method is established. (Auth.)

  1. A contribution to genetic etiology of complex regional pain syndrome type I (algodystropy syndrome) based on quantitative analysis of digitopalmar dermatoglyphics in sixty men.

    Science.gov (United States)

    Cvjeticanin, Miljenko; Jajić, Zrinka; Jajić, Ivo

    2005-01-01

    The patterns of the ridges of the skin of the fingers and palms were determined in sixty men with complex regional pain syndrome (type I), with a view to disease prevention. The study included 25 dermatoglyphic traits: the number of epidermal ridges on all ten fingers; their sums for five and ten fingers; four traits on both palms, i.e. between the a-b, b-c and c-d triradii; the atd angles; and their bilateral sums. The data obtained were compared with those recorded in a control group of 200 pairs of imprints of phenotypically healthy male adults from the Zagreb area. Statistically significant differences from control values were found in 12 dermatoglyphic variables, including an increased sum of ridges on nine fingers (except for the left second finger pad) and the total sums for five and ten fingers. These findings suggested that the polygenic system responsible for the development of dermatoglyphics is identical with some polygenic loci for the onset of algodystrophy syndrome, which might prove useful in disease prevention (e.g., taking fingerprints following a trauma and before rehabilitation) and facilitate identification of risk groups, and thus the treatment of this long-term and yet obscure syndrome.

  2. Quantitative dose-volume response analysis of changes in parotid gland function after radiotherapy in the head-and-neck region

    International Nuclear Information System (INIS)

    Roesink, Judith M.; Moerland, Marinus A.; Battermann, Jan J.; Hordijk, Gerrit Jan; Terhaard, Chris H.J.

    2001-01-01

    Purpose: To study the radiation tolerance of the parotid glands as a function of dose and volume irradiated. Methods and Materials: One hundred eight patients treated with primary or postoperative radiotherapy for various malignancies in the head-and-neck region were prospectively evaluated. Stimulated parotid flow rate was measured before radiotherapy and 6 weeks, 6 months, and 1 year after radiotherapy. Parotid gland dose-volume histograms were derived from CT-based treatment planning. The normal tissue complication probability model proposed by Lyman was fit to the data. A complication was defined as a stimulated parotid flow rate reduced to below a fixed fraction of the pre-radiotherapy flow rate. The TD50 (the dose to the whole organ leading to a complication probability of 50%) was found to be 31, 35, and 39 Gy at 6 weeks, 6 months, and 1 year postradiotherapy, respectively. The volume dependency parameter n was around 1, which means that the mean parotid dose correlates best with the observed complications. There was no steep dose-response curve (m=0.45 at 1 year postradiotherapy). Conclusions: This study on dose/volume/parotid gland function relationships revealed a linear correlation between postradiotherapy flow ratio and parotid gland dose and a strong volume dependency. No threshold dose was found. Recovery of parotid gland function was shown at 6 months and 1 year after radiotherapy. In radiation planning, attempts should be made to achieve a mean parotid gland dose at least below 39 Gy (leading to a complication probability of 50%)
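
    The Lyman model referred to above maps a dose-volume histogram, reduced to an effective dose, onto a complication probability through a probit function with parameters TD50, m and n. A minimal sketch follows, using the reported 1-year values (TD50 ≈ 39 Gy, m = 0.45, n ≈ 1 so that the mean dose is the effective dose); the generalised-EUD reduction of the DVH and the example dose distribution are assumptions for illustration, not part of the cited study.

```python
import math

def gEUD(doses, volumes, n):
    """Generalised EUD of a DVH; with n = 1 this reduces to the mean dose."""
    total = sum(volumes)
    return sum(v / total * d ** (1.0 / n) for d, v in zip(doses, volumes)) ** n

def lyman_ntcp(eud, td50, m):
    """Lyman NTCP: standard normal CDF of t = (EUD - TD50) / (m * TD50)."""
    t = (eud - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# Illustrative parotid DVH (dose bins in Gy, relative volumes)
doses   = [10, 20, 30, 40, 50]
volumes = [0.10, 0.20, 0.30, 0.25, 0.15]

eud = gEUD(doses, volumes, n=1.0)                    # n ~ 1 -> mean parotid dose
print(round(eud, 1), "Gy")                            # 31.5 Gy
print(round(lyman_ntcp(eud, td50=39.0, m=0.45), 2))   # complication probability
```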

  3. Quantitative possibility analysis. Present status in ESCA

    International Nuclear Information System (INIS)

    Brion, D.

    1981-01-01

    A short review of recent developments in the quantification of X-ray photoelectron spectroscopy (ESCA) is presented. The basic equations are recalled. Each parameter involved (photoionisation, inelastic mean free paths, the 'response function' of the instruments, intensity measurement) is discussed separately in relation to the accuracy and precision of the method. Other topics are considered, such as roughness, surface contamination, matrix effects and inhomogeneous composition. Some aspects of quantitative ESCA analysis and AES analysis are compared [fr

  4. Quantitative phase analysis in industrial research

    International Nuclear Information System (INIS)

    Ahmad Monshi

    1996-01-01

    X-ray diffraction (XRD) is the only technique able to identify phases; all the other analytical techniques give information about the elements. Quantitative phase analysis of minerals and industrial products is logically the next step after a qualitative examination and is of great importance in industrial research. Since the application of XRD in industry, early in this century, workers have been trying to develop quantitative XRD methods. In this paper some of the important methods are briefly discussed and partly compared. These methods are Internal Standard, Known Additions, Double Dilution, External Standard, Direct Comparison, Diffraction Absorption and Ratio of Slopes.
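
    As a concrete illustration of one of the methods listed, the internal standard method, the sketch below derives a calibration constant from a known mixture and then uses it to recover the weight fraction of an analyte phase in a sample spiked with the standard. The intensities, spike level and calibration mixture are invented for illustration only.

```python
def calibration_constant(intensity_ratio, weight_ratio):
    """k from a calibration mixture: I_analyte/I_standard = k * W_analyte/W_standard."""
    return intensity_ratio / weight_ratio

def analyte_fraction(i_analyte, i_standard, w_standard_added, k):
    """Weight fraction of the analyte in the original (unspiked) sample."""
    w_ratio = (i_analyte / i_standard) / k           # W_analyte / W_standard in the spiked mix
    w_analyte_in_mix = w_ratio * w_standard_added    # analyte fraction of the spiked mix
    return w_analyte_in_mix / (1.0 - w_standard_added)  # refer back to the original sample

# Calibration: a 50/50 analyte/standard mixture gave an intensity ratio of 1.8
k = calibration_constant(1.8, 1.0)

# Unknown sample spiked with 20 wt% internal standard
print(round(analyte_fraction(i_analyte=900, i_standard=1250,
                             w_standard_added=0.20, k=k), 3))   # 0.10 -> 10 wt%
```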

  5. MOVES regional level sensitivity analysis

    Science.gov (United States)

    2012-01-01

    The MOVES Regional Level Sensitivity Analysis was conducted to increase understanding of the operations of the MOVES Model in regional emissions analysis and to highlight the following: : the relative sensitivity of selected MOVES Model input paramet...

  6. Issues in Quantitative Analysis of Ultraviolet Imager (UV) Data: Airglow

    Science.gov (United States)

    Germany, G. A.; Richards, P. G.; Spann, J. F.; Brittnacher, M. J.; Parks, G. K.

    1999-01-01

    The GGS Ultraviolet Imager (UVI) has proven to be especially valuable in correlative substorm, auroral morphology, and extended statistical studies of the auroral regions. Such studies are based on knowledge of the location, spatial, and temporal behavior of auroral emissions. More quantitative studies, based on absolute radiometric intensities from UVI images, require a more intimate knowledge of the instrument behavior and data processing requirements and are inherently more difficult than studies based on relative knowledge of the oval location. In this study, UVI airglow observations are analyzed and compared with model predictions to illustrate issues that arise in quantitative analysis of UVI images. These issues include instrument calibration, long term changes in sensitivity, and imager flat field response as well as proper background correction. Airglow emissions are chosen for this study because of their relatively straightforward modeling requirements and because of their implications for thermospheric compositional studies. The analysis issues discussed here, however, are identical to those faced in quantitative auroral studies.

  7. Fundamentals of quantitative PET data analysis

    NARCIS (Netherlands)

    Willemsen, ATM; van den Hoff, J

    2002-01-01

    Drug analysis and development with PET should fully exploit the ability of this tomographic technique to quantify regional tracer concentrations in vivo. Data evaluation based on visual inspection or assessment of regional image contrast is not sufficient for this purpose since much of the

  8. Regional energy facility siting analysis

    International Nuclear Information System (INIS)

    Eberhart, R.C.; Eagles, T.W.

    1976-01-01

    Results of the energy facility siting analysis portion of a regional pilot study performed for the anticipated National Energy Siting and Facility Report are presented. The question of cell analysis versus site-specific analysis is explored, including an evaluation of the difference in depth between the two approaches. A discussion of the possible accomplishments of regional analysis is presented. It is concluded that regional siting analysis could be of use in a national siting study, if its inherent limits are recognized.

  9. Regional ejection fraction: a quantitative radionuclide index of regional left ventricular performance

    International Nuclear Information System (INIS)

    Maddox, D.E.; Wynne, J.; Uren, R.; Parker, J.A.; Idoine, J.; Siegel, L.C.; Neill, J.M.; Cohn, P.F.; Holman, B.L.

    1979-01-01

    Left ventricular regional ejection fractions were derived from background-corrected time-activity curves in 43 patients assessed by both gated equilibrium radionuclide angiocardiography and left ventricular contrast angiography. From a single, modified left anterior oblique projection, the regional change in background-corrected counts was determined in each of three anatomic regions. The normal range for regional radionuclide ejection fraction was determined in 10 patients with normal contrast ventriculograms and without obstructive coronary artery disease at coronary arteriography. Regional ejection fraction was compared with percent segmental axis shortening and extent of akinetic segments in corresponding regions of the contrast ventriculogram. Radionuclide and roentgenographic methods were in agreement as to the presence or absence of abnormal wall motion in 83 of 99 left ventricular regions (84%) in 33 patients evaluated prospectively. Comparison of regional ejection fraction demonstrated significant differences between regions with roentgenographically determined normokinesis, hypokinesis, and akinesis. We conclude that the left ventricular regional ejection fraction provides a reliable quantitative assessment of regional left ventricular performance
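
    The regional ejection fraction itself reduces to a simple counts-based calculation on the background-corrected time-activity curve: the relative drop in counts from end-diastole to end-systole within each region. A minimal sketch follows; the region names and count values are invented for illustration.

```python
def regional_ejection_fraction(ed_counts, es_counts, background):
    """Regional EF from background-corrected end-diastolic/end-systolic counts."""
    ed = ed_counts - background
    es = es_counts - background
    return (ed - es) / ed

# Hypothetical counts for three regions of a single modified LAO projection
regions = {
    "anterolateral": (5200, 3100, 800),
    "apical":        (4800, 3900, 800),   # hypokinetic example
    "inferoseptal":  (5100, 2900, 800),
}
for name, (ed, es, bg) in regions.items():
    print(name, round(regional_ejection_fraction(ed, es, bg), 2))
```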

  10. Quantitative analysis of untreated bio-samples

    International Nuclear Information System (INIS)

    Sera, K.; Futatsugawa, S.; Matsuda, K.

    1999-01-01

    A standard-free method of quantitative analysis for untreated samples has been developed. For hair samples, measurements were performed by irradiating a few hairs, as they are, with a proton beam, and quantitative analysis was carried out by means of a standard-free method developed by the authors. First, quantitative values of the concentration of zinc were derived; then the concentrations of other elements were obtained by regarding zinc as an internal standard. As a result, the values of the concentration of sulphur for 40 samples agree well with the average value for a typical Japanese subject, and also with each other within 20%, and the validity of the present method could be confirmed. Accuracy was confirmed by comparing the results with those obtained by the usual internal standard method, too. For the purpose of a surface analysis of a bone sample, a very small incidence angle of the proton beam was used, so that both the energy loss of the projectile and the self-absorption of X-rays become negligible. As a result, consistent values of concentration for many elements were obtained by the standard-free method

  11. Regions and media from quantitative and qualitative perspectives: the case of Czech Republic

    Directory of Open Access Journals (Sweden)

    Jan Sucháček

    2013-01-01

    Full Text Available Media become increasingly important in co-creating the image of spatial units at various scales. The situation is even more intriguing in transition/post-transition countries, which were exposed to modernization trends in rather short, almost compressed periods. The article aims at showing how media shape the image of NUTS III regions in the Czech Republic. Comparisons show that TV coverage represents the media agenda in a satisfactory manner: on the one hand it has one of the highest impacts on the public, and on the other it is representative enough. That is why TV coverage at the national level, with contributions related to individual NUTS III regions in the Czech Republic, was chosen as the point of departure. Thus, the objective of the paper is to analyze and interpret TV news related to NUTS III regions in the Czech Republic. This will be accomplished from both quantitative and qualitative perspectives. The quantitative analysis focuses on the number of contributions in relation to the size of the region in question. In addition, the self-governing regions in the Czech Republic will also be evaluated from a qualitative perspective, in which the composition of TV news is accentuated. Although it is only seldom stated, media analysis is of utmost importance in relation to regional development. In order to quantify and evaluate the aforementioned dependencies, the methods of regression and correlation analysis will be utilized. Moreover, correspondence analysis and analysis of contingency tables will be used in the qualitative part of our research.

  12. Quantitative analysis by nuclear magnetic resonance spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Wainai, T; Mashimo, K [Nihon Univ., Tokyo. Coll. of Science and Engineering

    1976-04-01

    Recent papers on practical quantitative analysis by nuclear magnetic resonance spectroscopy (NMR) are reviewed. Specifically, the determination of moisture in liquid N2O4 as an oxidizing agent for rocket propulsion, the analysis of hydroperoxides, quantitative analysis using a shift reagent, the analysis of aromatic sulfonates, and the determination of acids and bases are reviewed. Attention is paid to accuracy. The sweeping velocity and RF level, in addition to the other factors, must be at their optimal conditions to eliminate errors, particularly when computation is made with a machine. A higher sweeping velocity is preferable in view of the S/N ratio, but it may be limited to 30 Hz/s. The relative error in the measurement of area is generally 1%, but when dilute samples are measured and the signals integrated, the error becomes smaller by about one order of magnitude. If impurities are treated carefully, the water content in N2O4 can be determined with an accuracy of about 0.002%. The comparison method between peak heights is as accurate as that between areas, when the uniformity of the magnetic field and T2 are not questionable. In cases where the chemical shift moves with the content, the substance can be determined from the position of the chemical shift. Oil and water contents in rape-seed, peanuts, and sunflower-seed are determined by measuring T1 with 90 deg pulses.

  13. Quantitative analysis method for ship construction quality

    Directory of Open Access Journals (Sweden)

    FU Senzong

    2017-03-01

    Full Text Available The excellent performance of a ship is assured by the accurate evaluation of its construction quality. For a long time, research into the construction quality of ships has mainly focused on qualitative analysis due to a shortage of process data, which results from limited samples, varied process types and non-standardized processes. Aiming at predicting and controlling the influence of the construction process on the construction quality of ships, this article proposes a quantitative reliability analysis flow path for the ship construction process and a fuzzy calculation method. Based on the process-quality factor model proposed by the Function-Oriented Quality Control (FOQC) method, we combine fuzzy mathematics with the expert grading method to deduce formulations calculating the fuzzy process reliability of the ordinal connection model, series connection model and mixed connection model. The quantitative analysis method is applied in analyzing the process reliability of a ship's shaft gear box installation, which proves the applicability and effectiveness of the method. The analysis results can be a useful reference for setting key quality inspection points and optimizing key processes.

  14. Quantitative phase analysis by neutron diffraction

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Chang Hee; Song, Su Ho; Lee, Jin Ho; Shim, Hae Seop [Korea Atomic Energy Research Institute, Taejon (Korea)

    1999-06-01

    This study applies quantitative phase analysis (QPA) by neutron diffraction to the round robin samples provided by the International Union of Crystallography (IUCr). We measured neutron diffraction patterns for mixed samples which have several different weight percentages and their unique characteristic features. The neutron diffraction method has been known to be superior to its complementary methods such as X-ray or synchrotron diffraction, but it is still accepted as highly reliable only under limited conditions or for limited samples. Neutron diffraction has strong capability especially for oxides, owing to the scattering cross-section of oxygen, and it can become an even stronger tool for the analysis of industrial materials with these quantitative phase analysis techniques. Through this study, we hope not only to perform one of the instrument performance tests on our HRPD but also to improve our ability in the analysis of neutron diffraction data by comparing our QPA results with those from other advanced reactor facilities. 14 refs., 4 figs., 6 tabs. (Author)

  15. Good practices for quantitative bias analysis.

    Science.gov (United States)

    Lash, Timothy L; Fox, Matthew P; MacLehose, Richard F; Maldonado, George; McCandless, Lawrence C; Greenland, Sander

    2014-12-01

    Quantitative bias analysis serves several objectives in epidemiological research. First, it provides a quantitative estimate of the direction, magnitude and uncertainty arising from systematic errors. Second, the acts of identifying sources of systematic error, writing down models to quantify them, assigning values to the bias parameters and interpreting the results combat the human tendency towards overconfidence in research results, syntheses and critiques and the inferences that rest upon them. Finally, by suggesting aspects that dominate uncertainty in a particular research result or topic area, bias analysis can guide efficient allocation of sparse research resources. The fundamental methods of bias analyses have been known for decades, and there have been calls for more widespread use for nearly as long. There was a time when some believed that bias analyses were rarely undertaken because the methods were not widely known and because automated computing tools were not readily available to implement the methods. These shortcomings have been largely resolved. We must, therefore, contemplate other barriers to implementation. One possibility is that practitioners avoid the analyses because they lack confidence in the practice of bias analysis. The purpose of this paper is therefore to describe what we view as good practices for applying quantitative bias analysis to epidemiological data, directed towards those familiar with the methods. We focus on answering questions often posed to those of us who advocate incorporation of bias analysis methods into teaching and research. These include the following. When is bias analysis practical and productive? How does one select the biases that ought to be addressed? How does one select a method to model biases? How does one assign values to the parameters of a bias model? How does one present and interpret a bias analysis? We hope that our guide to good practices for conducting and presenting bias analyses will encourage
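
    As one small, concrete example of the kind of bias model the authors discuss, the sketch below back-corrects an observed 2x2 table for nondifferential exposure misclassification using assumed sensitivity and specificity of exposure classification, and compares the conventional and bias-adjusted odds ratios. The counts and bias parameters are invented for illustration, and this is only one of many possible bias analyses.

```python
def correct_misclassification(a, b, c, d, se, sp):
    """Back-correct observed counts (cases: a exposed, b unexposed;
    controls: c exposed, d unexposed) for exposure misclassification
    with assumed sensitivity se and specificity sp."""
    def corrected_exposed(exposed_obs, total):
        # observed exposed = true_exposed*se + (total - true_exposed)*(1 - sp)
        return (exposed_obs - total * (1 - sp)) / (se - (1 - sp))
    a_c = corrected_exposed(a, a + b)
    c_c = corrected_exposed(c, c + d)
    return a_c, (a + b) - a_c, c_c, (c + d) - c_c

def odds_ratio(a, b, c, d):
    return (a * d) / (b * c)

# Observed table (cases: 200 exposed / 800 unexposed; controls: 100 / 900)
obs = (200, 800, 100, 900)
print(round(odds_ratio(*obs), 2))                        # conventional OR: 2.25
corr = correct_misclassification(*obs, se=0.85, sp=0.95)
print(round(odds_ratio(*corr), 2))                       # bias-adjusted OR: 3.46
```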

  16. Immune adherence: a quantitative and kinetic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sekine, T [National Cancer Center, Tokyo (Japan). Research Inst.

    1978-09-01

    Quantitative and kinetic analysis of the immune-adherence reaction (IA) between C3b fragments and IA receptors as an agglutination reaction is difficult. Analysis is possible, however, by use of radio-iodinated bovine serum albumin as antigen at low concentrations (less than 200 ng/ml) and optimal concentration of antibody to avoid precipitation of antigen-antibody complexes with human erythrocytes without participation of complement. Antigen and antibody are reacted at 37/sup 0/C, complement is added, the mixture incubated and human erythrocytes added; after further incubation, ice-cold EDTA containing buffer is added and the erythrocytes centrifuged and assayed for radioactivity. Control cells reacted with heated guinea pig serum retained less than 5% of the added radioactivity. The method facilitates measurement of IA reactivity and permits more detailed analysis of the mechanism underlying the reaction.

  17. Micro photometer's automation for quantitative spectrograph analysis

    International Nuclear Information System (INIS)

    Gutierrez E, C.Y.A.

    1996-01-01

    A microphotometer is used to increase the sharpness of dark spectral lines. By analyzing these lines, the content of a sample and its concentration can be determined; this analysis is known as quantitative spectrographic analysis. Quantitative spectrographic analysis is carried out in three steps, as follows. 1. Emulsion calibration. This consists of gauging a photographic emulsion to determine the intensity variations in terms of the incident radiation. For the emulsion calibration procedure, a least-squares fit to the data obtained is applied to obtain a calibration graph. It is thus possible to determine the density of a dark spectral line as a function of the incident light intensity shown by the microphotometer. 2. Working curves. The values of known concentrations of an element are plotted against incident light intensity. Since the sample contains several elements, it is necessary to find a working curve for each one of them. 3. Analytical results. The calibration curve and working curves are compared and the concentration of the studied element is determined. Automatic data acquisition, calculation and reporting of results are done by means of a computer (PC) and a computer program. The signal-conditioning circuits deliver TTL (transistor-transistor logic) levels to make communication between the microphotometer and the computer possible. Data calculation is done using a computer program
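
    The three steps described above translate into a short calculation: a least-squares emulsion calibration, a working curve built from standards, and interpolation of an unknown. The sketch below follows that flow; all numerical values, and the assumption of straight-line fits on logarithmic axes, are illustrative only.

```python
import numpy as np

# Step 1: emulsion calibration - least-squares fit of line density vs. log exposure
log_exposure = np.array([0.0, 0.3, 0.6, 0.9, 1.2])
density      = np.array([0.10, 0.42, 0.75, 1.05, 1.38])
slope, intercept = np.polyfit(log_exposure, density, 1)

def density_to_log_intensity(d):
    """Invert the calibration line to recover the relative log intensity."""
    return (d - intercept) / slope

# Step 2: working curve - log intensity vs. log concentration for standards
standards_conc    = np.array([0.01, 0.03, 0.10, 0.30])    # e.g. wt% of the element
standards_density = np.array([0.30, 0.55, 0.82, 1.10])
x = density_to_log_intensity(standards_density)
w_slope, w_intercept = np.polyfit(x, np.log10(standards_conc), 1)

# Step 3: analytical result - unknown line density -> concentration
unknown_density = 0.70
conc = 10 ** (w_slope * density_to_log_intensity(unknown_density) + w_intercept)
print(f"estimated concentration: {conc:.3f}")
```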

  18. Quantitative standard-less XRF analysis

    International Nuclear Information System (INIS)

    Ulitzka, S.

    2002-01-01

    Full text: For most analytical tasks in the mining and associated industries, matrix-matched calibrations are used for the monitoring of ore grades and for process control. In general, such calibrations are product specific (iron ore, bauxite, alumina, mineral sands, cement etc.) and apply to a relatively narrow concentration range, but they give the best precision and accuracy for those materials. A wide range of CRMs is available, and for less common materials synthetic standards can be made up from 'pure' chemicals. At times, analysis is required of materials with varying matrices (powders, scales, fly ash, alloys, polymers, liquors, etc.) and diverse physical shapes (non-flat, metal drillings, thin layers on substrates etc.), which could also contain elements that are not part of a specific calibration. A qualitative analysis can provide information about the presence of certain elements, and the relative intensities of element peaks in a scan can give a rough idea about their concentrations. More often, however, quantitative values are required. The paper will look into the basics of quantitative standardless analysis and show results for some well-defined CRMs. Copyright (2002) Australian X-ray Analytical Association Inc

  19. Quantitative proteomic analysis of intact plastids.

    Science.gov (United States)

    Shiraya, Takeshi; Kaneko, Kentaro; Mitsui, Toshiaki

    2014-01-01

    Plastids are specialized cell organelles in plant cells that are differentiated into various forms including chloroplasts, chromoplasts, and amyloplasts, and fulfill important functions in maintaining the overall cell metabolism and sensing environmental factors such as sunlight. It is therefore important to grasp the mechanisms of differentiation and functional changes of plastids in order to enhance the understanding of vegetality. In this chapter, details of a method for the extraction of intact plastids that makes analysis possible while maintaining the plastid functions are provided; in addition, a quantitative shotgun method for analyzing the composition and changes in the content of proteins in plastids as a result of environmental impacts is described.

  20. Quantitative texture analysis of electrodeposited line patterns

    DEFF Research Database (Denmark)

    Pantleon, Karen; Somers, Marcel A.J.

    2005-01-01

    Free-standing line patterns of Cu and Ni were manufactured by electrochemical deposition into lithographically prepared patterns. Electrodeposition was carried out on top of a highly oriented Au-layer physically vapor deposited on glass. Quantitative texture analysis carried out by means of x-ray diffraction for both the substrate layer and the electrodeposits yielded experimental evidence for epitaxy between Cu and Au. An orientation relation between film and substrate was discussed with respect to various concepts of epitaxy. While the conventional mode of epitaxy fails for the Cu...

  1. Inspection, visualisation and analysis of quantitative proteomics data

    OpenAIRE

    Gatto, Laurent

    2016-01-01

    Material Quantitative Proteomics and Data Analysis Course. 4 - 5 April 2016, Queen Hotel, Chester, UK Table D - Inspection, visualisation and analysis of quantitative proteomics data, Laurent Gatto (University of Cambridge)

  2. Quantitative high-resolution genomic analysis of single cancer cells.

    Science.gov (United States)

    Hannemann, Juliane; Meyer-Staeckling, Sönke; Kemming, Dirk; Alpers, Iris; Joosse, Simon A; Pospisil, Heike; Kurtz, Stefan; Görndt, Jennifer; Püschel, Klaus; Riethdorf, Sabine; Pantel, Klaus; Brandt, Burkhard

    2011-01-01

    During cancer progression, specific genomic aberrations arise that can determine the scope of the disease and can be used as predictive or prognostic markers. The detection of specific gene amplifications or deletions in single blood-borne or disseminated tumour cells that may give rise to the development of metastases is of great clinical interest but technically challenging. In this study, we present a method for quantitative high-resolution genomic analysis of single cells. Cells were isolated under permanent microscopic control followed by high-fidelity whole genome amplification and subsequent analyses by fine tiling array-CGH and qPCR. The assay was applied to single breast cancer cells to analyze the chromosomal region centred on the therapeutically relevant EGFR gene. This method allows precise quantitative analysis of copy number variations in single cell diagnostics.

  3. Quantitative high-resolution genomic analysis of single cancer cells.

    Directory of Open Access Journals (Sweden)

    Juliane Hannemann

    Full Text Available During cancer progression, specific genomic aberrations arise that can determine the scope of the disease and can be used as predictive or prognostic markers. The detection of specific gene amplifications or deletions in single blood-borne or disseminated tumour cells that may give rise to the development of metastases is of great clinical interest but technically challenging. In this study, we present a method for quantitative high-resolution genomic analysis of single cells. Cells were isolated under permanent microscopic control followed by high-fidelity whole genome amplification and subsequent analyses by fine tiling array-CGH and qPCR. The assay was applied to single breast cancer cells to analyze the chromosomal region centred on the therapeutically relevant EGFR gene. This method allows precise quantitative analysis of copy number variations in single cell diagnostics.

  4. Diagnostic performance of semi-quantitative and quantitative stress CMR perfusion analysis: a meta-analysis.

    Science.gov (United States)

    van Dijk, R; van Assen, M; Vliegenthart, R; de Bock, G H; van der Harst, P; Oudkerk, M

    2017-11-27

    Stress cardiovascular magnetic resonance (CMR) perfusion imaging is a promising modality for the evaluation of coronary artery disease (CAD) due to high spatial resolution and absence of radiation. Semi-quantitative and quantitative analysis of CMR perfusion are based on signal-intensity curves produced during the first pass of gadolinium contrast. Multiple semi-quantitative and quantitative parameters have been introduced. Diagnostic performance of these parameters varies extensively among studies and standardized protocols are lacking. This study aims to determine the diagnostic accuracy of semi-quantitative and quantitative CMR perfusion parameters, compared to multiple reference standards. Pubmed, WebOfScience, and Embase were systematically searched using predefined criteria (3272 articles). A check for duplicates was performed (1967 articles). Eligibility and relevance of the articles was determined by two reviewers using pre-defined criteria. The primary data extraction was performed independently by two researchers with the use of a predefined template. Differences in extracted data were resolved by discussion between the two researchers. The quality of the included studies was assessed using the 'Quality Assessment of Diagnostic Accuracy Studies Tool' (QUADAS-2). True positives, false positives, true negatives, and false negatives were extracted or calculated from the articles. The principal summary measures used to assess diagnostic accuracy were sensitivity, specificity, and area under the receiver operating characteristic curve (AUC). Data were pooled according to analysis territory, reference standard and perfusion parameter. Twenty-two articles were eligible based on the predefined study eligibility criteria. The pooled diagnostic accuracy for segment-, territory- and patient-based analyses showed good diagnostic performance with sensitivity of 0.88, 0.82, and 0.83, specificity of 0.72, 0.83, and 0.76 and AUC of 0.90, 0.84, and 0.87, respectively. In per territory
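
    The summary measures named above are simple functions of the extracted 2x2 counts. The sketch below computes per-study sensitivity and specificity and a naive pooled estimate obtained by summing counts; the per-study counts are invented, and real meta-analyses of diagnostic accuracy would normally use a bivariate or hierarchical model rather than this simple pooling.

```python
def sens_spec(tp, fp, tn, fn):
    """Sensitivity and specificity from a 2x2 diagnostic table."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical per-study counts: (tp, fp, tn, fn)
studies = [(45, 10, 80, 8), (60, 15, 120, 12), (30, 5, 70, 9)]

for i, s in enumerate(studies, 1):
    se, sp = sens_spec(*s)
    print(f"study {i}: sensitivity={se:.2f}, specificity={sp:.2f}")

# Naive pooling by summing counts across studies
pooled = tuple(sum(col) for col in zip(*studies))
se, sp = sens_spec(*pooled)
print(f"pooled: sensitivity={se:.2f}, specificity={sp:.2f}")
```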

  5. Winston-Lutz Test: A quantitative analysis

    International Nuclear Information System (INIS)

    Pereira, Aline Garcia; Nandi, Dorival Menegaz; Saraiva, Crystian Wilian Chagas

    2017-01-01

    Objective: Describe a method of quantitative analysis for the Winston-Lutz test. Materials and methods: The research is a qualitative exploratory study. The materials used were portal films, Winston-Lutz test tools and Omni Pro software. Sixteen portal films were used as samples and were analyzed by five different technicians to measure the deviation between the radiation and mechanical isocenters. Results: Two combinations with offset values greater than 1 mm were identified. In addition, when the method developed here was compared with the previously studied one, the data obtained were very close, with a maximum percentage deviation of 32.5%, which demonstrates its efficacy in reducing dependence on the performer. Conclusion: The results show that the method is reproducible and practical, which constitutes one of the fundamental factors for its implementation. (author)

  6. Quantitative Analysis in Nuclear Medicine Imaging

    CERN Document Server

    2006-01-01

    This book provides a review of image analysis techniques as they are applied in the field of diagnostic and therapeutic nuclear medicine. Driven in part by the remarkable increase in computing power and its ready and inexpensive availability, this is a relatively new yet rapidly expanding field. Likewise, although the use of radionuclides for diagnosis and therapy has origins dating back almost to the discovery of natural radioactivity itself, radionuclide therapy and, in particular, targeted radionuclide therapy has only recently emerged as a promising approach for therapy of cancer and, to a lesser extent, other diseases. An effort has, therefore, been made to place the reviews provided in this book in a broader context. The effort to do this is reflected by the inclusion of introductory chapters that address basic principles of nuclear medicine imaging, followed by an overview of issues that are closely related to quantitative nuclear imaging and its potential role in diagnostic and therapeutic applications. ...

  7. Investment appraisal using quantitative risk analysis.

    Science.gov (United States)

    Johansson, Henrik

    2002-07-01

    Investment appraisal concerned with investments in fire safety systems is discussed. Particular attention is directed at evaluating, in terms of Bayesian decision theory, the risk reduction that investment in a fire safety system involves. It is shown how the monetary value of the change from a building design without any specific fire protection system to one including such a system can be estimated by use of quantitative risk analysis, the results of which are expressed in terms of a risk-adjusted net present value. This represents the intrinsic monetary value of investing in the fire safety system. The method suggested is exemplified by a case study performed in an Avesta Sheffield factory.
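
    A risk-adjusted net present value of the kind described can be sketched as the discounted expected reduction in fire losses minus the investment cost. The scenario frequencies, losses, discount rate, horizon and investment cost below are invented for illustration and do not come from the case study.

```python
def expected_annual_loss(scenarios):
    """scenarios: list of (annual frequency, monetary loss if the event occurs)."""
    return sum(freq * loss for freq, loss in scenarios)

def npv_of_savings(annual_saving, rate, years):
    """Present value of a constant annual saving over a planning horizon."""
    return sum(annual_saving / (1 + rate) ** t for t in range(1, years + 1))

# Illustrative fire scenarios: (frequency per year, monetary loss)
without_system = [(0.02, 500_000), (0.002, 5_000_000)]
with_system    = [(0.02, 100_000), (0.0005, 5_000_000)]

saving = expected_annual_loss(without_system) - expected_annual_loss(with_system)
risk_adjusted_npv = npv_of_savings(saving, rate=0.05, years=20) - 150_000  # investment cost
print(round(saving), round(risk_adjusted_npv))
```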

  8. Quantitative Analysis of Thallium-201 Myocardial Tomograms

    International Nuclear Information System (INIS)

    Kim, Sang Eun; Nam, Gi Byung; Choi, Chang Woon

    1991-01-01

    The purpose of this study was to assess the ability of quantitative Tl-201 tomography to identify and localize coronary artery disease (CAD). The study population consisted of 41 patients (31 males, 10 females; mean age 55 ± 7 yr), including 14 with prior myocardial infarction, who underwent both exercise Tl-201 myocardial SPECT and coronary angiography for the evaluation of chest pain. From the short axis and vertical long axis tomograms, stress extent polar maps were generated by the Cedars-Sinai Medical Center program, and the stress defect extent (SDE) was quantified for each coronary artery territory. For the purpose of this study, the coronary circulation was divided into 6 arterial segments, and the myocardial ischemic score (MIS) was calculated from the coronary angiogram. Sensitivity for the detection of CAD (>50% coronary stenosis by angiography) by stress extent polar map was 95% in single vessel disease, and 100% in double and triple vessel disease. Overall sensitivity was 97%. Sensitivity and specificity for the detection of individual diseased vessels were, respectively, 87% and 90% for the left anterior descending artery (LAD), 36% and 93% for the left circumflex artery (LCX), and 71% and 70% for the right coronary artery (RCA). Concordance for the detection of individual diseased vessels between coronary angiography and the stress polar map was fair for the LAD (kappa=0.70) and RCA (kappa=0.41) lesions, whereas it was poor for the LCX lesions (kappa=0.32). There were significant correlations between the MIS and SDE in the LAD territory (rs=0.56, p=0.0027) and the RCA territory (rs=0.60, p=0.0094). No significant correlation was found in the LCX territory. When all vascular territories were combined, there was a significant correlation between the MIS and SDE (rs=0.42, p=0.0116). In conclusion, quantitative analysis of Tl-201 tomograms appears to be accurate for determining the presence and location of CAD.

  9. Quantitative risk analysis preoperational of gas pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Manfredi, Carlos; Bispo, Gustavo G.; Esteves, Alvaro [Gie S.A., Buenos Aires (Argentina)

    2009-07-01

    The purpose of this analysis is to predict how individual risk and the general safety of the public can be affected by the operation of a gas pipeline. If the individual or societal risks are considered intolerable in comparison with international standards, mitigation measures for the risk associated with the operation are recommended, down to levels that can be considered compatible with best practice in the industry. The quantitative risk analysis calculates the probability of occurrence of an event based on its frequency of occurrence, and it requires a complex mathematical treatment. The present work aims to develop a calculation methodology based on the previously mentioned publication. This calculation methodology is centered on defining the frequencies of occurrence of events according to a database representative of each case under study. In addition, it establishes the consequences according to the particular considerations of each area and the different possible interferences with the gas pipeline under study. For each of the interferences, a typical curve of ignition probability is developed as a function of the distance to the pipe. (author)
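
    The individual-risk calculation described, an event frequency combined with a distance-dependent ignition/consequence probability, can be sketched as follows. The failure rate, interaction length, decay constant of the ignition-probability curve and the implicit lethality assumption are all invented for illustration and are not values from the cited analysis.

```python
import math

# Illustrative parameters - not values from the cited analysis
FAILURE_RATE = 2e-4          # ruptures per km-year (assumed)
INTERACTION_LENGTH_KM = 0.5  # pipe length that can affect a given location (assumed)

def p_ignition(distance_m, scale=50.0):
    """Assumed ignition/fatality probability decaying with distance from the pipe."""
    return math.exp(-distance_m / scale)

def individual_risk(distance_m):
    """Probability per year of fatality for a person at a given distance."""
    return FAILURE_RATE * INTERACTION_LENGTH_KM * p_ignition(distance_m)

for d in (10, 50, 100, 200):
    print(f"{d:4d} m: {individual_risk(d):.2e} per year")
```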

  10. Regional climate change mitigation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Rowlands, Ian H [UNEP Collaborating Centre on Energy and Environment, and Univ. of Waterloo (Canada)

    1998-10-01

    The purpose of this paper is to explore some of the key methodological issues that arise from an analysis of regional climate change mitigation options. The rationale for any analysis of regional mitigation activities, emphasising both its theoretical attractiveness and the existing political encouragement, and the methodology that has been developed are reviewed. The differences arising from the fact that mitigation analyses have been taken from the national level - where the majority of the work has been completed to date - to the international level - that is, the 'regional' level - will be especially highlighted. (EG)

  11. Regional climate change mitigation analysis

    International Nuclear Information System (INIS)

    Rowlands, Ian H.

    1998-01-01

    The purpose of this paper is to explore some of the key methodological issues that arise from an analysis of regional climate change mitigation options. The rationale for any analysis of regional mitigation activities, emphasising both its theoretical attractiveness and the existing political encouragement, and the methodology that has been developed are reviewed. The differences arising from the fact that mitigation analyses have been taken from the national level - where the majority of the work has been completed to date - to the international level - that is, the 'regional' level - will be especially highlighted. (EG)

  12. An unconventional method of quantitative microstructural analysis

    International Nuclear Information System (INIS)

    Rastani, M.

    1995-01-01

    The experiment described here introduces a simple methodology which could be used to replace the time-consuming and expensive conventional methods of metallographic and quantitative analysis of the effect of thermal treatment on microstructure. The method is ideal for the microstructural evaluation of tungsten filaments and other wire samples, such as copper wire, which can be conveniently coiled. Ten such samples were heat treated by ohmic resistance at temperatures which were expected to span the recrystallization range. After treatment, the samples were evaluated in the elastic recovery test. The normalized elastic recovery factor was defined in terms of these deflections. It has been shown experimentally that the elastic recovery factor depends on the degree of recrystallization; in other words, this factor can be used to determine the fraction of unrecrystallized material. Because the elastic recovery method examines the whole filament rather than just one section through the filament, as in the metallographic method, it measures the degree of recrystallization more accurately. The method also takes considerably less time and costs less than the conventional method

  13. Quantitative Analysis of Retrieved Glenoid Liners

    Directory of Open Access Journals (Sweden)

    Katelyn Childs

    2016-02-01

    Full Text Available Revision of orthopedic surgeries is often expensive and involves higher risk from complications. Since most total joint replacement devices use a polyethylene bearing, which serves as a weak link, the assessment of damage to the liner due to in vivo exposure is very important. The failures often are due to excessive polyethylene wear. The glenoid liners are complex and hemispherical in shape and present challenges while assessing the damage. Therefore, the study on the analysis of glenoid liners retrieved from revision surgery may lend insight into common wear patterns and improve future product designs. The purpose of this pilot study is to further develop the methods of segmenting a liner into four quadrants to quantify the damage in the liner. Different damage modes are identified and statistically analyzed. Multiple analysts were recruited to conduct the damage assessments. In this paper, four analysts evaluated nine glenoid liners, retrieved from revision surgery, two of whom had an engineering background and two of whom had a non-engineering background. Associated human factor mechanisms are reported in this paper. The wear patterns were quantified using the Hood/Gunther, Wasielewski, Brandt, and Lombardi methods. The quantitative assessments made by several observers were analyzed. A new, composite damage parameter was developed and applied to assess damage. Inter-observer reliability was assessed using a paired t-test. Data reported by four analysts showed a high standard deviation; however, only two analysts performed the tests in a significantly similar way and they had engineering backgrounds.

  14. Quantitative Risk Analysis: Method And Process

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-03-01

    Full Text Available Recent and past studies (King III report, 2009: 73-75; Stoney, 2007; Committee of Sponsoring Organisations-COSO, 2004; Bartell, 2003; Liebenberg and Hoyt, 2003; Reason, 2000; Markowitz, 1957) lament that although the introduction of quantifying risk to enhance the degree of objectivity in finance, for instance, developed largely in parallel with its development in the manufacturing industry, the same has not happened in Higher Education Institutions (HEIs). In this regard, the objective of the paper was to demonstrate the methods and process of Quantitative Risk Analysis (QRA) through the likelihood of occurrence of risk (phase I). This paper serves as the first of a two-phase study, which sampled one hundred (100) risk analysts in a university in the greater Eastern Cape Province of South Africa. The analysis of the likelihood of occurrence of risk by logistic regression and percentages was conducted to investigate whether or not there was a significant difference between groups (analysts) in respect of QRA. The Hosmer and Lemeshow test was non-significant with a chi-square (X2 = 8.181; p = 0.300), which indicated a good model fit, since the data did not deviate significantly from the model. The study concluded that to derive an overall likelihood rating indicating the probability that a potential risk may be exercised within the construct of an associated threat environment, the following governing factors must be considered: (1) threat source motivation and capability, (2) nature of the vulnerability, and (3) existence and effectiveness of current controls (methods and process).
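
    A sketch (not the study's code) of the phase-I analysis described above: a logistic regression for the likelihood of risk occurrence followed by a Hosmer-Lemeshow goodness-of-fit test; the predictors and responses are simulated placeholders:

```python
# Logistic regression plus a decile-based Hosmer-Lemeshow goodness-of-fit test.
import numpy as np
import statsmodels.api as sm
from scipy.stats import chi2

rng = np.random.default_rng(1)
n = 100                                    # e.g. the 100 sampled risk analysts
x = rng.normal(size=(n, 2))                # hypothetical predictors (group, experience)
p_true = 1 / (1 + np.exp(-(0.5 + x @ [1.0, -0.8])))
y = (rng.random(n) < p_true).astype(float)

model = sm.Logit(y, sm.add_constant(x)).fit(disp=0)
p_hat = model.predict(sm.add_constant(x))

def hosmer_lemeshow(y, p_hat, g=10):
    """Chi-square over g groups of predicted probability; df = g - 2."""
    order = np.argsort(p_hat)
    chi_sq = 0.0
    for idx in np.array_split(order, g):
        obs, exp, nk = y[idx].sum(), p_hat[idx].sum(), len(idx)
        chi_sq += (obs - exp) ** 2 / (exp * (1 - exp / nk) + 1e-12)
    return chi_sq, chi2.sf(chi_sq, g - 2)

print(hosmer_lemeshow(y, p_hat))   # a non-significant p-value suggests a good fit
```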

  15. Quantitation of global and regional left ventricular function by MRI

    NARCIS (Netherlands)

    van der Geest, RJ; Reiber, JHC; Reiber, JHC; VanDerWall, EE

    1998-01-01

    Magnetic resonance imaging (MRI) provides several imaging strategies for assessing left ventricular function. As a three-dimensional imaging technique, all measurements can be performed without relying on geometrical assumptions. Global and regional function parameters can be derived from

  16. Quantitative Earthquake Prediction on Global and Regional Scales

    International Nuclear Information System (INIS)

    Kossobokov, Vladimir G.

    2006-01-01

    The Earth is a hierarchy of volumes of different sizes. Driven by planetary convection, these volumes are involved in joint and relative movement. The movement is controlled by a wide variety of processes on and around the fractal mesh of boundary zones, and it produces earthquakes. This hierarchy of movable volumes composes a large non-linear dynamical system. Prediction of such a system, in the sense of extrapolation of a trajectory into the future, is futile. However, upon coarse-graining, integral empirical regularities emerge, opening possibilities of prediction in the sense of the commonly accepted consensus definition worked out in 1976 by the US National Research Council. The implications of understanding the hierarchical nature of the lithosphere and its dynamics, based on systematic monitoring and evidence of its unified space-energy similarity at different scales, help avoid basic errors in earthquake prediction claims. They suggest rules and recipes for adequate earthquake prediction classification, comparison and optimization. The approach has already led to the design of a reproducible intermediate-term middle-range earthquake prediction technique. Its real-time testing, aimed at prediction of the largest earthquakes worldwide, has proved beyond any reasonable doubt the effectiveness of practical earthquake forecasting. In the first approximation, the accuracy is about 1-5 years and 5-10 times the anticipated source dimension. Further analysis allows reducing the spatial uncertainty down to 1-3 source dimensions, although at the cost of additional failures-to-predict. Despite the limited accuracy, considerable damage could be prevented by timely, knowledgeable use of the existing predictions and earthquake prediction strategies. The December 26, 2004 Indian Ocean Disaster seems to be the first indication that the methodology, designed for prediction of M8.0+ earthquakes, can be rescaled for prediction of both smaller magnitude earthquakes (e.g., down to M5.5+ in Italy) and

  17. Quantitative Earthquake Prediction on Global and Regional Scales

    Science.gov (United States)

    Kossobokov, Vladimir G.

    2006-03-01

    The Earth is a hierarchy of volumes of different sizes. Driven by planetary convection, these volumes are involved in joint and relative movement. The movement is controlled by a wide variety of processes on and around the fractal mesh of boundary zones, and it produces earthquakes. This hierarchy of movable volumes composes a large non-linear dynamical system. Prediction of such a system, in the sense of extrapolation of a trajectory into the future, is futile. However, upon coarse-graining, integral empirical regularities emerge, opening possibilities of prediction in the sense of the commonly accepted consensus definition worked out in 1976 by the US National Research Council. The implications of understanding the hierarchical nature of the lithosphere and its dynamics, based on systematic monitoring and evidence of its unified space-energy similarity at different scales, help avoid basic errors in earthquake prediction claims. They suggest rules and recipes for adequate earthquake prediction classification, comparison and optimization. The approach has already led to the design of a reproducible intermediate-term middle-range earthquake prediction technique. Its real-time testing, aimed at prediction of the largest earthquakes worldwide, has proved beyond any reasonable doubt the effectiveness of practical earthquake forecasting. In the first approximation, the accuracy is about 1-5 years and 5-10 times the anticipated source dimension. Further analysis allows reducing the spatial uncertainty down to 1-3 source dimensions, although at the cost of additional failures-to-predict. Despite the limited accuracy, considerable damage could be prevented by timely, knowledgeable use of the existing predictions and earthquake prediction strategies. The December 26, 2004 Indian Ocean Disaster seems to be the first indication that the methodology, designed for prediction of M8.0+ earthquakes, can be rescaled for prediction of both smaller magnitude earthquakes (e.g., down to M5.5+ in Italy) and

  18. Euler deconvolution and spectral analysis of regional aeromagnetic ...

    African Journals Online (AJOL)

    Existing regional aeromagnetic data from the south-central Zimbabwe craton has been analysed using 3D Euler deconvolution and spectral analysis to obtain quantitative information on the geological units and structures for depth constraints on the geotectonic interpretation of the region. The Euler solution maps confirm ...
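
    As an illustration of the technique named above, a single-window 3D Euler deconvolution can be solved by least squares from Euler's homogeneity equation, (x-x0)Tx + (y-y0)Ty + (z-z0)Tz = N(B-T); the field values and gradients below are synthetic placeholders, not the aeromagnetic data:

```python
# Least-squares Euler deconvolution for one data window, given a structural index N.
import numpy as np

def euler_window_solution(x, y, z, T, Tx, Ty, Tz, N):
    """Solve for source location (x0, y0, z0) and background field B in one window."""
    A = np.column_stack([Tx, Ty, Tz, N * np.ones_like(T)])
    b = x * Tx + y * Ty + z * Tz + N * T
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    x0, y0, z0, B = sol
    return x0, y0, z0, B

# Synthetic example: observation points on a flight surface at z = 0 (placeholder values)
rng = np.random.default_rng(0)
npts = 25
x, y = rng.uniform(0, 1000, npts), rng.uniform(0, 1000, npts)
z = np.zeros(npts)
T = rng.normal(100, 5, npts)                                  # anomaly values (nT)
Tx, Ty, Tz = (rng.normal(0, 0.1, npts) for _ in range(3))     # gradients (nT/m)
print(euler_window_solution(x, y, z, T, Tx, Ty, Tz, N=1.0))
```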

  19. Quantitative expression profile of distinct functional regions in the adult mouse brain.

    Directory of Open Access Journals (Sweden)

    Takeya Kasukawa

    Full Text Available The adult mammalian brain is composed of distinct regions with specialized roles, including regulation of circadian clocks, feeding, sleep/wake, and seasonal rhythms. To find quantitative differences in expression among such various brain regions, we conducted the BrainStars (B*) project, in which we profiled the genome-wide expression of ∼50 small brain regions, including sensory centers and centers for motion, time, memory, fear, and feeding. To avoid confounds from temporal differences in gene expression, we sampled each region every 4 hours for 24 hours and pooled the samples for DNA-microarray assays, thereby focusing on spatial differences in gene expression. We used informatics to identify candidate genes with expression changes showing high or low expression in specific regions. We also identified candidate genes with stable expression across brain regions that can be used as new internal control genes, and ligand-receptor interactions of neurohormones and neurotransmitters. Through these analyses, we found 8,159 multi-state genes, 2,212 regional marker gene candidates for 44 small brain regions, 915 internal control gene candidates, and 23,864 inferred ligand-receptor interactions. We also found that these sets include well-known genes as well as novel candidate genes that might be related to specific functions in brain regions. We used our findings to develop an integrated database (http://brainstars.org/) for exploring genome-wide expression in the adult mouse brain, and have made this database openly accessible. These new resources will help accelerate the functional analysis of the mammalian brain and the elucidation of its regulatory network systems.

  20. Quantitative morphometric analysis for the tectonic characterisation of northern Tunisia.

    Science.gov (United States)

    Camafort, Miquel; Pérez-Peña, José Vicente; Booth-Rea, Guillermo; Ranero, César R.; Gràcia, Eulàlia; Azañón, José Miguel; Melki, Fetheddine; Ouadday, Mohamed

    2016-04-01

    Northern Tunisia is characterized by low deformation rates and low to moderate seismicity. Although instrumental seismicity reaches maximum magnitudes of Mw 5.5, some historical earthquakes have occurred with catastrophic consequences in this region. Aiming to improve our knowledge of active tectonics in Tunisia, we carried out both a quantitative morphometric analysis and a field study in the north-western region. We applied different morphometric tools, such as river profiles, knickpoint analysis, hypsometric curves and integrals, and drainage pattern anomalies, in order to differentiate between zones with high or low recent tectonic activity. This analysis helps identify uplift and subsidence zones, which we relate to fault activity. Several active faults in a sparse distribution were identified. A selected sector was studied with a field campaign to test the results obtained with the quantitative analysis. During the fieldwork we identified geological evidence of recent activity and a considerable seismogenic potential along the El Alia-Teboursouk (ETF) and Dkhila (DF) faults. The ETF fault could be responsible for one of the most devastating historical earthquakes in northern Tunisia, which destroyed Utique in 412 A.D. The geological evidence includes fluvial terraces folded by faults, striated and cracked pebbles, clastic dikes, sand volcanoes, coseismic cracks, etc. Although not reflected in the instrumental seismicity, our results support an important seismic hazard, evidenced by the several active tectonic structures identified and the two seismogenic faults described. After obtaining the current active tectonic framework of Tunisia, we discuss our results within the western Mediterranean context, trying to contribute to the understanding of western Mediterranean tectonics. With our results, we suggest that the main reason for the sparse and scarce seismicity of the area, in contrast with the adjacent parts of the Nubia-Eurasia boundary, is its extended
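
    One of the morphometric indices mentioned above, the hypsometric integral, can be approximated directly from the elevation values of a drainage basin; the DEM values below are random placeholders:

```python
# Hypsometric integral of a basin from its elevation distribution.
import numpy as np

def hypsometric_integral(elevations):
    """HI = (mean - min) / (max - min); values near 1 suggest youthful, uplifting
    relief, values near 0 a strongly eroded landscape."""
    e = np.asarray(elevations, dtype=float)
    return (e.mean() - e.min()) / (e.max() - e.min())

dem_values = np.random.default_rng(2).uniform(150, 900, size=5000)  # hypothetical basin
print(f"HI = {hypsometric_integral(dem_values):.2f}")
```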

  1. Quantitative Auger analysis of Nb-Ge superconducting alloys

    International Nuclear Information System (INIS)

    Buitrago, R.H.

    1980-01-01

    The feasibility of using Auger electron analysis for quantitative analysis was investigated by studying Nb3Ge thin-film Auger data with different approaches. A method based on elemental standards gave quantitative values consistent with reported Nb-Ge data. Alloy sputter yields were also calculated, and the results were consistent with those for pure elements
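
    A sketch of the elemental-standards approach referred to above: atomic concentrations from Auger peak intensities divided by relative sensitivity factors; the intensities and factors below are assumed, not the paper's data:

```python
# Relative-sensitivity-factor quantification: C_x = (I_x / S_x) / sum_i (I_i / S_i).

def auger_concentrations(intensities, sensitivities):
    """Atomic fractions from peak intensities and relative sensitivity factors."""
    ratios = {el: I / sensitivities[el] for el, I in intensities.items()}
    total = sum(ratios.values())
    return {el: r / total for el, r in ratios.items()}

intensities = {"Nb": 1.00, "Ge": 0.28}        # measured peak-to-peak amplitudes (a.u.)
sensitivities = {"Nb": 0.75, "Ge": 0.60}      # assumed relative sensitivity factors
print(auger_concentrations(intensities, sensitivities))  # ~3:1, i.e. roughly Nb3Ge
```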

  2. Quantitative Data Analysis--In the Graduate Curriculum

    Science.gov (United States)

    Albers, Michael J.

    2017-01-01

    A quantitative research study collects numerical data that must be analyzed to help draw the study's conclusions. Teaching quantitative data analysis is not teaching number crunching, but teaching a way of critical thinking for how to analyze the data. The goal of data analysis is to reveal the underlying patterns, trends, and relationships of a…

  3. Critical Race Quantitative Intersections: A "testimonio" Analysis

    Science.gov (United States)

    Covarrubias, Alejandro; Nava, Pedro E.; Lara, Argelia; Burciaga, Rebeca; Vélez, Verónica N.; Solorzano, Daniel G.

    2018-01-01

    The educational pipeline has become a commonly referenced depiction of educational outcomes for racialized groups across the country. While visually impactful, an overreliance on decontextualized quantitative data often leads to majoritarian interpretations. Without sociohistorical contexts, these interpretations run the risk of perpetuating…

  4. Quantitative Models and Analysis for Reactive Systems

    DEFF Research Database (Denmark)

    Thrane, Claus

    phones and websites. Acknowledging that now more than ever, systems come in contact with the physical world, we need to revise the way we construct models and verification algorithms, to take into account the behavior of systems in the presence of approximate, or quantitative information, provided...

  5. Automated approach to quantitative error analysis

    International Nuclear Information System (INIS)

    Bareiss, E.H.

    1977-04-01

    A method is described how a quantitative measure for the robustness of a given neutron transport theory code for coarse network calculations can be obtained. A code that performs this task automatically and at only nominal cost is described. This code also generates user-oriented benchmark problems which exhibit the analytic behavior at interfaces. 5 figures, 1 table

  6. Quantitative blood flow analysis with digital techniques

    International Nuclear Information System (INIS)

    Forbes, G.

    1984-01-01

    The general principles of digital techniques in quantitating absolute blood flow during arteriography are described. Results are presented for a phantom constructed to correlate digitally calculated absolute flow with direct flow measurements. The clinical use of digital techniques in cerebrovascular angiography is briefly described. (U.K.)

  7. Combination and Integration of Qualitative and Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Philipp Mayring

    2001-02-01

    Full Text Available In this paper, I am going to outline ways of combining qualitative and quantitative steps of analysis on five levels. On the technical level, programs for the computer-aided analysis of qualitative data offer various combinations. Where the data are concerned, the employment of categories (for instance by using qualitative content analysis) allows for combining qualitative and quantitative forms of data analysis. On the individual level, the creation of types and the inductive generalisation of cases allow for proceeding from individual case material to quantitative generalisations. As for research design, different models can be distinguished (preliminary study, generalisation, elaboration, triangulation) which combine qualitative and quantitative steps of analysis. Where the logic of research is concerned, it can be shown that an extended process model which combines qualitative and quantitative research can be appropriate and thus lead to an integration of the two approaches. URN: urn:nbn:de:0114-fqs010162

  8. [Rapid analysis of suppositories by quantitative 1H NMR spectroscopy].

    Science.gov (United States)

    Abramovich, R A; Kovaleva, S A; Goriainov, S V; Vorob'ev, A N; Kalabin, G A

    2012-01-01

    Rapid analysis of suppositories with ibuprofen and arbidol by quantitative 1H NMR spectroscopy was performed. Optimal conditions for the analysis were developed. The results are useful for design of rapid methods for quality control of suppositories with different components

  9. Reduced deep regional cerebral venous oxygen saturation in hemodialysis patients using quantitative susceptibility mapping.

    Science.gov (United States)

    Chai, Chao; Liu, Saifeng; Fan, Linlin; Liu, Lei; Li, Jinping; Zuo, Chao; Qian, Tianyi; Haacke, E Mark; Shen, Wen; Xia, Shuang

    2018-02-01

    Cerebral venous oxygen saturation (SvO2) is an important indicator of brain function. There has been debate about lower cerebral oxygen metabolism in hemodialysis patients, and there were no reports about changes of deep regional cerebral SvO2 in hemodialysis patients. In this study, we aimed to explore the deep regional cerebral SvO2 from the straight sinus using quantitative susceptibility mapping (QSM) and its correlation with clinical risk factors and neuropsychiatric testing. 52 hemodialysis patients and 54 age- and gender-matched healthy controls were enrolled. QSM reconstructed from the original phase data of 3.0 T susceptibility-weighted imaging was used to measure the susceptibility of the straight sinus. The susceptibility was used to calculate the deep regional cerebral SvO2, which was compared with that of healthy individuals. Correlation analysis was performed to investigate the correlation between deep regional cerebral SvO2, clinical risk factors and neuropsychiatric testing. The deep regional cerebral SvO2 of hemodialysis patients (72.5 ± 3.7%) was significantly lower than that of healthy controls (76.0 ± 2.1%). Several clinical factors correlated with the deep regional cerebral SvO2 in patients. The Mini-Mental State Examination (MMSE) scores of hemodialysis patients were significantly lower than those of healthy controls, but deep regional cerebral SvO2 did not correlate with MMSE scores (P = 0.630). In summary, decreased deep regional cerebral SvO2 occurred in hemodialysis patients, and dialysis duration, parathyroid hormone, hematocrit, hemoglobin and red blood cell count may be clinical risk factors.
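
    A hedged sketch of how SvO2 can be derived from a straight-sinus QSM measurement; the model and the constant below are common literature assumptions (delta_chi = dchi_do x Hct x (1 - SvO2), with arterial blood taken as fully oxygenated), not necessarily those used in this study:

```python
# SvO2 from a venous susceptibility measurement under a standard two-pool blood model.

def svo2_from_susceptibility(delta_chi_ppm, hct, dchi_do_ppm=3.39):
    """delta_chi_ppm: measured vein-to-reference susceptibility difference (ppm, SI);
    hct: hematocrit fraction; dchi_do_ppm: assumed susceptibility difference between
    fully deoxygenated and fully oxygenated blood (literature value)."""
    return 1.0 - delta_chi_ppm / (dchi_do_ppm * hct)

# Hypothetical numbers: 0.35 ppm measured in the straight sinus, Hct = 0.40
print(f"SvO2 = {svo2_from_susceptibility(0.35, 0.40):.1%}")
```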

  10. Quantitative analysis of infantile ureteropelvic junction obstruction by diuretic renography

    International Nuclear Information System (INIS)

    Ueno, Shigeru; Suzuki, Yutaka; Murakami, Takeshi; Yokoyama, Seishichi; Hirakawa, Hitoshi; Tajima, Tomoo; Makuuchi, Hiroyasu

    2001-01-01

    Infantile hydronephrosis detected by ultrasonography poses a clinical dilemma on how to treat the condition. This article reports a retrospective study to evaluate infantile hydronephrosis due to suspected ureteropelvic junction (UPJ) obstruction by means of standardized diuretic renography and to speculate its usefulness for quantitative assessment and management of this condition. Between November 1992 and July 1999, 43 patients who had the disease detected in their fetal or infantile period were submitted to this study. Standardized diuretic renograms were obtained with 99m Tc-labeled diethylene-triaminepenta-acetate (Tc-99m-DTPA) or 99m Tc-labeled mercaptoacetyl triglycine (Tc-99m-MAG3) as radiopharmaceuticals. Drainage half-time clearance (T 1/2) of the activity at each region of interest set to encompass the entire kidney and the dilated pelvis was used as an index of quantitative analysis of UPJ obstruction. Initial T 1/2s of 32 kidneys with suspected UPJ obstruction were significantly longer than those of 37 without obstruction. T 1/2s of kidneys which had undergone pyeloplasty decreased promptly after surgery whereas those of units followed up without surgery decreased more sluggishly. These findings demonstrate that a standardized diuretic renographic analysis with T 1/2 can reliably assess infantile hydronephrosis with UPJ obstruction and be helpful in making a decision on surgical intervention. (author)

  11. Quantitative analysis of infantile ureteropelvic junction obstruction by diuretic renography

    Energy Technology Data Exchange (ETDEWEB)

    Ueno, Shigeru; Suzuki, Yutaka; Murakami, Takeshi; Yokoyama, Seishichi; Hirakawa, Hitoshi; Tajima, Tomoo; Makuuchi, Hiroyasu [Tokai Univ., Isehara, Kanagawa (Japan). School of Medicine

    2001-04-01

    Infantile hydronephrosis detected by ultrasonography poses a clinical dilemma on how to treat the condition. This article reports a retrospective study to evaluate infantile hydronephrosis due to suspected ureteropelvic junction (UPJ) obstruction by means of standardized diuretic renography and to speculate its usefulness for quantitative assessment and management of this condition. Between November 1992 and July 1999, 43 patients who had the disease detected in their fetal or infantile period were submitted to this study. Standardized diuretic renograms were obtained with 99mTc-labeled diethylene-triaminepenta-acetate (Tc-99m-DTPA) or 99mTc-labeled mercaptoacetyl triglycine (Tc-99m-MAG3) as radiopharmaceuticals. Drainage half-time clearance (T 1/2) of the activity at each region of interest set to encompass the entire kidney and the dilated pelvis was used as an index of quantitative analysis of UPJ obstruction. Initial T 1/2s of 32 kidneys with suspected UPJ obstruction were significantly longer than those of 37 without obstruction. T 1/2s of kidneys which had undergone pyeloplasty decreased promptly after surgery whereas those of units followed up without surgery decreased more sluggishly. These findings demonstrate that a standardized diuretic renographic analysis with T 1/2 can reliably assess infantile hydronephrosis with UPJ obstruction and be helpful in making a decision on surgical intervention. (author)
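
    An illustrative sketch (not the authors' software) of the drainage half-time index used in these records: fit a mono-exponential to the post-diuretic portion of a renogram ROI curve and convert the decay constant to a half-time; the curve below is synthetic:

```python
# Drainage half-time T1/2 from a renogram ROI curve: counts ~ A * exp(-k t), T1/2 = ln2/k.
import numpy as np

def drainage_half_time(t_min, counts):
    """Linear regression on log counts gives the decay constant k."""
    slope, _ = np.polyfit(t_min, np.log(counts), 1)
    return np.log(2) / -slope

# Hypothetical post-furosemide curve sampled every minute for 20 minutes
t = np.arange(20.0)
curve = 1000.0 * np.exp(-t / 12.0)                       # well-draining kidney
print(f"T1/2 = {drainage_half_time(t, curve):.1f} min")  # long T1/2 suggests obstruction
```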

  12. Comparative study of standard space and real space analysis of quantitative MR brain data.

    Science.gov (United States)

    Aribisala, Benjamin S; He, Jiabao; Blamire, Andrew M

    2011-06-01

    To compare the robustness of region of interest (ROI) analysis of magnetic resonance imaging (MRI) brain data in real space with analysis in standard space and to test the hypothesis that standard space image analysis introduces more partial volume effect errors compared to analysis of the same dataset in real space. Twenty healthy adults with no history or evidence of neurological diseases were recruited; high-resolution T(1)-weighted, quantitative T(1), and B(0) field-map measurements were collected. Algorithms were implemented to perform analysis in real and standard space and used to apply a simple standard ROI template to quantitative T(1) datasets. Regional relaxation values and histograms for both gray and white matter tissues classes were then extracted and compared. Regional mean T(1) values for both gray and white matter were significantly lower using real space compared to standard space analysis. Additionally, regional T(1) histograms were more compact in real space, with smaller right-sided tails indicating lower partial volume errors compared to standard space analysis. Standard space analysis of quantitative MRI brain data introduces more partial volume effect errors biasing the analysis of quantitative data compared to analysis of the same dataset in real space. Copyright © 2011 Wiley-Liss, Inc.

  13. Quantitative analysis of spatial variability of geotechnical parameters

    Science.gov (United States)

    Fang, Xing

    2018-04-01

    Geotechnical parameters are the basic parameters of geotechnical engineering design, and they have strong regional characteristics. At the same time, the spatial variability of geotechnical parameters has been recognized and is gradually being introduced into the reliability analysis of geotechnical engineering. Based on the statistical theory of geostatistical spatial information, the spatial variability of geotechnical parameters is analyzed quantitatively, and the parameters and the correlation coefficients between them are evaluated. A residential district surveyed by the Tianjin Survey Institute was selected as the research object. There are 68 boreholes in this area and 9 layers of mechanical stratification. The parameters are water content, natural gravity, void ratio, liquid limit, plasticity index, liquidity index, compressibility coefficient, compressive modulus, internal friction angle, cohesion and SP index. According to the principle of statistical correlation, the correlation coefficients of the geotechnical parameters are calculated, and from these correlation coefficients the relationships among the geotechnical parameters are obtained.
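
    A sketch of the kind of statistical characterisation described above - coefficient of variation per parameter and the pairwise correlation matrix across boreholes - using randomly generated placeholder values rather than the Tianjin survey data:

```python
# Variability and correlation statistics for borehole-level geotechnical parameters.
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
n_boreholes = 68
data = pd.DataFrame({
    "water_content":  rng.normal(24.0, 3.0, n_boreholes),   # %
    "void_ratio":     rng.normal(0.70, 0.08, n_boreholes),
    "liquid_limit":   rng.normal(32.0, 4.0, n_boreholes),   # %
    "cohesion":       rng.normal(18.0, 5.0, n_boreholes),   # kPa
    "friction_angle": rng.normal(22.0, 3.0, n_boreholes),   # degrees
})

cov = data.std() / data.mean()     # coefficient of variation, one per parameter
corr = data.corr()                 # pairwise correlation coefficients
print(cov.round(2))
print(corr.round(2))
```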

  14. THE ENVIRONMENT OF REGIONAL DEVELOPMENT FINANCIAL ANALYSIS

    OpenAIRE

    Bechis Liviu; MOSCVICIOV Andrei

    2012-01-01

    The paper presents the difference between the two concepts regionalism and regionalization. It also presents the three types of regionalism analysis depending on the dimension and the nature of the relations: regionalism at national level, transnational regionalism and international regionalism analysis.

  15. Analysis of Ingredient Lists to Quantitatively Characterize ...

    Science.gov (United States)

    The EPA’s ExpoCast program is developing high throughput (HT) approaches to generate the needed exposure estimates to compare against HT bioactivity data generated from the US inter-agency Tox21 and the US EPA ToxCast programs. Assessing such exposures for the thousands of chemicals in consumer products requires data on product composition. This is a challenge since quantitative product composition data are rarely available. We developed methods to predict the weight fractions of chemicals in consumer products from weight fraction-ordered chemical ingredient lists, and curated a library of such lists from online manufacturer and retailer sites. The probabilistic model predicts weight fraction as a function of the total number of reported ingredients, the rank of the ingredient in the list, the minimum weight fraction for which ingredients were reported, and the total weight fraction of unreported ingredients. Weight fractions predicted by the model compared very well to available quantitative weight fraction data obtained from Material Safety Data Sheets for products with 3-8 ingredients. Lists were located from the online sources for 5148 products containing 8422 unique ingredient names. A total of 1100 of these names could be located in EPA’s HT chemical database (DSSTox), and linked to 864 unique Chemical Abstract Service Registration Numbers (392 of which were in the Tox21 chemical library). Weight fractions were estimated for these 864 CASRN. Using a

  16. Quantitative evaluation of regional cerebral blood flow by visual stimulation in 99mTc-HMPAO brain SPECT

    Energy Technology Data Exchange (ETDEWEB)

    Juh, Ra Hyeong; Suh, Tae Suk; Kwark, Chul Eun; Choe, Bo Young; Lee, Hyoung Koo; Chung, Yong An; Kim, Sung Hoon; Chung, Soo Kyo [College of Medicine, The Catholic Univ. of Seoul, Seoul (Korea, Republic of)

    2002-06-01

    The purpose of this study is to investigate the effects of visual activation and to quantitatively analyze regional cerebral blood flow. Visual activation is known to increase regional cerebral blood flow in the visual cortex of the occipital lobe. We evaluated the change in the distribution of 99mTc-HMPAO (hexamethyl propylene amine oxime) as a reflection of regional cerebral blood flow. Six volunteers (mean age 26.75 years; n=6; 3 men, 3 women) were injected with 925 MBq and underwent MRI and 99mTc-HMPAO SPECT during a resting state with closed eyes and during visual stimulation with an 8 Hz LED. For quantitative analysis, we delineated regions of interest and calculated the mean count per voxel in each of the fifteen slices. The ROI-to-whole-brain ratio and a regional index were calculated by pixel-to-pixel subtraction of the visual non-activation image from the visual activation image, and a brain map was constructed using statistical parametric mapping (SPM99). The mean regional cerebral blood flow was increased by visual stimulation. The increase in mean regional cerebral blood flow in the activated region of the primary visual cortex of the occipital lobe was 32.50±5.67%. The significant activation sites from the statistical parametric map were used to construct a rendering image and to fuse the SPECT and MRI images. Visual activation produced a significant, quantitatively verified increase in the visual cortex. The activated region was confirmed in Talairach coordinates as the primary visual cortex (BA17) and visual association areas (BA18, 19) of Brodmann.

  17. Quantitative evaluation of regional cerebral blood flow by visual stimulation in 99mTc-HMPAO brain SPECT

    International Nuclear Information System (INIS)

    Juh, R. H.; Suh, T. S.; Chung, Y. A.

    2002-01-01

    The purpose of this study is to investigate the effects of visual activation and to quantitatively analyze regional cerebral blood flow. Visual activation is known to increase regional cerebral blood flow in the visual cortex of the occipital lobe. We evaluated the change in the distribution of 99mTc-HMPAO (hexamethyl propylene amine oxime) as a reflection of regional cerebral blood flow. Six volunteers (mean age 26.75 years; n=6; 3 men, 3 women) were injected with 925 MBq and underwent MRI and 99mTc-HMPAO SPECT during a resting state with closed eyes and during visual stimulation with an 8 Hz LED. For quantitative analysis, we delineated regions of interest and calculated the mean count per voxel in each of the fifteen slices. The ROI-to-whole-brain ratio and a regional index were calculated by pixel-to-pixel subtraction of the visual non-activation image from the visual activation image, and a brain map was constructed using statistical parametric mapping (SPM99). The mean regional cerebral blood flow was increased by visual stimulation. The increase in mean regional cerebral blood flow in the activated region of the primary visual cortex of the occipital lobe was 32.50±5.67%. The significant activation sites from the statistical parametric map were used to construct a rendering image and to fuse the SPECT and MRI images. Visual activation produced a significant, quantitatively verified increase in the visual cortex. The activated region was confirmed in Talairach coordinates as the primary visual cortex (BA17) and visual association areas (BA18, 19) of Brodmann.

  18. Quantitative evaluation of regional cerebral blood flow by visual stimulation in 99mTc-HMPAO brain SPECT

    International Nuclear Information System (INIS)

    Juh, Ra Hyeong; Suh, Tae Suk; Kwark, Chul Eun; Choe, Bo Young; Lee, Hyoung Koo; Chung, Yong An; Kim, Sung Hoon; Chung, Soo Kyo

    2002-01-01

    The purpose of this study is to investigate the effects of visual activation and to quantitatively analyze regional cerebral blood flow. Visual activation is known to increase regional cerebral blood flow in the visual cortex of the occipital lobe. We evaluated the change in the distribution of 99mTc-HMPAO (hexamethyl propylene amine oxime) as a reflection of regional cerebral blood flow. Six volunteers (mean age 26.75 years; n=6; 3 men, 3 women) were injected with 925 MBq and underwent MRI and 99mTc-HMPAO SPECT during a resting state with closed eyes and during visual stimulation with an 8 Hz LED. For quantitative analysis, we delineated regions of interest and calculated the mean count per voxel in each of the fifteen slices. The ROI-to-whole-brain ratio and a regional index were calculated by pixel-to-pixel subtraction of the visual non-activation image from the visual activation image, and a brain map was constructed using statistical parametric mapping (SPM99). The mean regional cerebral blood flow was increased by visual stimulation. The increase in mean regional cerebral blood flow in the activated region of the primary visual cortex of the occipital lobe was 32.50±5.67%. The significant activation sites from the statistical parametric map were used to construct a rendering image and to fuse the SPECT and MRI images. Visual activation produced a significant, quantitatively verified increase in the visual cortex. The activated region was confirmed in Talairach coordinates as the primary visual cortex (BA17) and visual association areas (BA18, 19) of Brodmann.

  19. Quantitative evaluation of regional cerebral blood flow by visual stimulation in 99mTc-HMPAO brain SPECT

    Energy Technology Data Exchange (ETDEWEB)

    Juh, R. H.; Suh, T. S.; Chung, Y. A. [The Catholic Univ., of Korea, Seoul (Korea, Republic of)

    2002-07-01

    The purpose of this study is to investigate the effects of visual activation and to quantitatively analyze regional cerebral blood flow. Visual activation is known to increase regional cerebral blood flow in the visual cortex of the occipital lobe. We evaluated the change in the distribution of 99mTc-HMPAO (hexamethyl propylene amine oxime) as a reflection of regional cerebral blood flow. Six volunteers (mean age 26.75 years; n=6; 3 men, 3 women) were injected with 925 MBq and underwent MRI and 99mTc-HMPAO SPECT during a resting state with closed eyes and during visual stimulation with an 8 Hz LED. For quantitative analysis, we delineated regions of interest and calculated the mean count per voxel in each of the fifteen slices. The ROI-to-whole-brain ratio and a regional index were calculated by pixel-to-pixel subtraction of the visual non-activation image from the visual activation image, and a brain map was constructed using statistical parametric mapping (SPM99). The mean regional cerebral blood flow was increased by visual stimulation. The increase in mean regional cerebral blood flow in the activated region of the primary visual cortex of the occipital lobe was 32.50±5.67%. The significant activation sites from the statistical parametric map were used to construct a rendering image and to fuse the SPECT and MRI images. Visual activation produced a significant, quantitatively verified increase in the visual cortex. The activated region was confirmed in Talairach coordinates as the primary visual cortex (BA17) and visual association areas (BA18, 19) of Brodmann.
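
    A minimal sketch of the ROI-based quantification described in the records above: pixel-to-pixel subtraction of the rest image from the activation image and the percentage increase in mean counts within an occipital ROI; the images are synthetic placeholders:

```python
# ROI-based percentage increase and ROI/whole-brain ratio from two SPECT-like images.
import numpy as np

rng = np.random.default_rng(4)
rest = rng.normal(100.0, 5.0, size=(64, 64))
activation = rest.copy()
roi = np.zeros((64, 64), dtype=bool)
roi[40:55, 20:44] = True                      # hypothetical primary visual cortex ROI
activation[roi] *= 1.32                       # simulate ~32% flow increase

difference_map = activation - rest            # pixel-to-pixel subtraction image
roi_increase = 100.0 * (activation[roi].mean() - rest[roi].mean()) / rest[roi].mean()
roi_to_whole_brain = activation[roi].mean() / activation.mean()
print(f"ROI increase = {roi_increase:.1f}%, ROI/whole-brain ratio = {roi_to_whole_brain:.2f}")
```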

  20. Functional linear models for association analysis of quantitative traits.

    Science.gov (United States)

    Fan, Ruzong; Wang, Yifan; Mills, James L; Wilson, Alexander F; Bailey-Wilson, Joan E; Xiong, Momiao

    2013-11-01

    Functional linear models are developed in this paper for testing associations between quantitative traits and genetic variants, which can be rare variants or common variants or the combination of the two. By treating multiple genetic variants of an individual in a human population as a realization of a stochastic process, the genome of an individual in a chromosome region is a continuum of sequence data rather than discrete observations. The genome of an individual is viewed as a stochastic function that contains both linkage and linkage disequilibrium (LD) information of the genetic markers. By using techniques of functional data analysis, both fixed and mixed effect functional linear models are built to test the association between quantitative traits and genetic variants adjusting for covariates. After extensive simulation analysis, it is shown that the F-distributed tests of the proposed fixed effect functional linear models have higher power than that of sequence kernel association test (SKAT) and its optimal unified test (SKAT-O) for three scenarios in most cases: (1) the causal variants are all rare, (2) the causal variants are both rare and common, and (3) the causal variants are common. The superior performance of the fixed effect functional linear models is most likely due to its optimal utilization of both genetic linkage and LD information of multiple genetic variants in a genome and similarity among different individuals, while SKAT and SKAT-O only model the similarities and pairwise LD but do not model linkage and higher order LD information sufficiently. In addition, the proposed fixed effect models generate accurate type I error rates in simulation studies. We also show that the functional kernel score tests of the proposed mixed effect functional linear models are preferable in candidate gene analysis and small sample problems. The methods are applied to analyze three biochemical traits in data from the Trinity Students Study. © 2013 WILEY

  1. Quantitative phosphoproteomic analysis of postmortem muscle development

    DEFF Research Database (Denmark)

    Huang, Honggang

    Meat quality development is highly dependent on postmortem (PM) metabolism and rigor mortis development in PM muscle. PM glycometabolism and rigor mortis fundamentally determine most of the important qualities of raw meat, such as ultimate pH, tenderness, color and water-holding capacity. Protein phosphorylation is known to play essential roles in regulating metabolism, contraction and other important activities in muscle systems. However, protein phosphorylation has rarely been systematically explored in PM muscle in relation to meat quality. In this PhD project, both gel-based and mass spectrometry (MS)-based quantitative phosphoproteomic strategies were employed to analyze PM muscle with the aim of intensively characterizing the protein phosphorylation involved in meat quality development. Firstly, gel-based phosphoproteomic studies were performed to analyze the protein phosphorylation in both sarcoplasmic proteins...

  2. Energy Dispersive Spectrometry and Quantitative Analysis Short Course. Introduction to X-ray Energy Dispersive Spectrometry and Quantitative Analysis

    Science.gov (United States)

    Carpenter, Paul; Curreri, Peter A. (Technical Monitor)

    2002-01-01

    This course will cover practical applications of the energy-dispersive spectrometer (EDS) to x-ray microanalysis. Topics covered will include detector technology, advances in pulse processing, resolution and performance monitoring, detector modeling, peak deconvolution and fitting, qualitative and quantitative analysis, compositional mapping, and standards. An emphasis will be placed on use of the EDS for quantitative analysis, with discussion of typical problems encountered in the analysis of a wide range of materials and sample geometries.

  3. Quantitative gas analysis with FT-IR

    DEFF Research Database (Denmark)

    Bak, J.; Larsen, A.

    1995-01-01

    Calibration spectra of CO in the 2.38-5100 ppm concentration range (22 spectra) have been measured with a spectral resolution of 4 cm(-1), in the mid-IR (2186-2001 cm(-1)) region, with a Fourier transform infrared (FT-IR) instrument. The multivariate calibration method partial least-squares (PLS1...
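
    A hedged sketch of a PLS1 calibration of the kind described above, predicting CO concentration from mid-IR absorbance spectra; the spectra are synthetic stand-ins for the 22 measured calibration spectra:

```python
# PLS regression of concentration on absorbance spectra (scikit-learn).
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(5)
n_spectra, n_wavenumbers = 22, 186            # e.g. 2186-2001 cm-1 at ~1 cm-1 spacing
conc = np.logspace(np.log10(2.38), np.log10(5100), n_spectra)          # ppm
band = np.exp(-0.5 * ((np.arange(n_wavenumbers) - 90) / 15.0) ** 2)    # CO band shape
X = np.outer(conc, band) + rng.normal(0, 0.5, (n_spectra, n_wavenumbers))

pls = PLSRegression(n_components=3)
pls.fit(X, conc)
pred = pls.predict(X).ravel()
print(f"calibration RMSE = {np.sqrt(np.mean((pred - conc) ** 2)):.1f} ppm")
```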

  4. Quantitative Analysis of Complex Tropical Forest Stands: A Review ...

    African Journals Online (AJOL)


    The importance of data analysis in quantitative assessment of natural resources .... Data collection design is an important process in complex forest statistical ... Ideally, the sample size should be equal among groups and sufficiently large.

  5. Quantitative Structure-Activity Relationship Analysis of the ...

    African Journals Online (AJOL)

    Erah

    Quantitative Structure-Activity Relationship Analysis of the Anticonvulsant ... Two types of molecular descriptors, including the 2D autocorrelation ..... It is based on the simulation of natural .... clustering anticonvulsant, antidepressant, and.

  6. Quantitative-genetic analysis of wing form and bilateral asymmetry ...

    Indian Academy of Sciences (India)

    Unknown

    lines; Procrustes analysis; wing shape; wing size. ... Models of stochastic gene expression pre- dict that intrinsic noise ... Quantitative parameters of wing size and shape asymmetries ..... the residuals of a regression on centroid size produced.

  7. Analysis association of milk fat and protein percent in quantitative ...

    African Journals Online (AJOL)

    Analysis association of milk fat and protein percent in quantitative trait locus ... African Journal of Biotechnology ... Protein and fat percent as content of milk are high-priority criteria for financial aims and selection of programs in dairy cattle.

  8. Quantitative analysis of some brands of chloroquine tablets ...

    African Journals Online (AJOL)

    Quantitative analysis of some brands of chloroquine tablets marketed in Maiduguri using spectrophotometric ... and compared with that of the standard, wavelength of maximum absorbance at 331nm for chloroquine. ... HOW TO USE AJOL.

  9. A method to quantitate regional wall motion in left ventriculography using Hildreth algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Terashima, Mikio [Hyogo Red Cross Blood Center (Japan); Naito, Hiroaki; Sato, Yoshinobu; Tamura, Shinichi; Kurosawa, Tsutomu

    1998-06-01

    Quantitative measurement of ventricular wall motion is indispensable for objective evaluation of cardiac function associated with coronary artery disease. We have modified Hildreth's algorithm to estimate excursions of the ventricular wall on left ventricular images yielded by various imaging techniques. Tagged cine-MRI was carried out on 7 healthy volunteers. The original Hildreth method, the modified Hildreth method and the centerline method were applied to the outlines of the images obtained, to estimate excursion of the left ventricular wall and regional shortening and to evaluate the accuracy of these methods when measuring these parameters, compared to the values of these parameters measured directly using the attached tags. The accuracy of the original Hildreth method was comparable to that of the centerline method, while the modified Hildreth method was significantly more accurate than the centerline method (P<0.05). Regional shortening as estimated using the modified Hildreth method differed less from the actually measured regional shortening than did the shortening estimated using the centerline method (P<0.05). The modified Hildreth method allowed reasonable estimation of left ventricular wall excursion in all cases where it was applied. These results indicate that when applied to left ventriculograms for ventricular wall motion analysis, the modified Hildreth method is more useful than the original Hildreth method. (author)

  10. Analysis of methods for quantitative renography

    International Nuclear Information System (INIS)

    Archambaud, F.; Maksud, P.; Prigent, A.; Perrin-Fayolle, O.

    1995-01-01

    This article reviews the main methods using renography to estimate renal perfusion indices and to quantify differential and global renal function. The review addresses the pathophysiological significance of estimated parameters according to the underlying models and the choice of the radiopharmaceutical. The dependence of these parameters on the region of interest characteristics and on the methods of background and attenuation corrections are surveyed. Some current recommendations are proposed. (authors). 66 refs., 8 figs

  11. Quantitative data analysis in education a critical introduction using SPSS

    CERN Document Server

    Connolly, Paul

    2007-01-01

    This book provides a refreshing and user-friendly guide to quantitative data analysis in education for students and researchers. It assumes absolutely no prior knowledge of quantitative methods or statistics. Beginning with the very basics, it provides the reader with the knowledge and skills necessary to be able to undertake routine quantitative data analysis to a level expected of published research. Rather than focusing on teaching statistics through mathematical formulae, the book places an emphasis on using SPSS to gain a real feel for the data and an intuitive grasp of t

  12. Strategic municipal solid waste management: A quantitative model for Italian regions

    International Nuclear Information System (INIS)

    Cucchiella, Federica; D’Adamo, Idiano; Gastaldi, Massimo

    2014-01-01

    Highlights: • Definition of a new waste management plan based on incineration. • Profitability of waste facilities assessed with economic and financial indicators. • The amount of waste generated is not treated as annually constant and is modelled at regional detail. • A sensitivity analysis is used to test some of the initial assumptions. • Regional strategies are proposed to optimize the benefits of correct waste management. - Abstract: The current economic crisis brought to light the structural deficiencies of the European economy. This paper aims to improve the performance of policies on sustainable municipal solid waste management strategies. Specifically, attention is focused on Italy, which reports a high rate of landfilling. The Waste to Energy plant is an attractive technological option for municipal solid waste, but it is the subject of intense debate. Incinerators require effective and efficient controls to avoid emissions of harmful pollutants into air, land and water, which may affect human health and the environment. To address the waste management situation, this study uses multi-objective mathematical programming. A new plan is presented to evaluate and quantify the effects of initiatives for diverting current waste from landfill. In an attempt to better simulate realistic waste management scenarios, the amount of waste generated is not annually constant and changes are accounted for in the waste diversion rates. Moreover, due to the geographical characteristics of Italy, the realization of new facilities is replicated at regional detail. In this paper, economic and financial indicators are used to define the profitability of waste facilities, and a sensitivity analysis is used to test some of the initial assumptions. Once the efficient Waste to Energy plants are identified, regional waste management strategies are proposed to optimize the financial and environmental benefits of the sector. The proposed waste management framework provides a concrete scheme
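
    An illustrative single-objective reduction of the kind of mathematical programming described above: allocating a region's annual waste among landfill, waste-to-energy and recycling subject to capacity limits; all costs and capacities are hypothetical:

```python
# Small linear program: minimise net treatment cost while treating all generated waste.
import numpy as np
from scipy.optimize import linprog

waste_total = 500_000.0                                  # t/yr generated in the region
# decision variables: tonnes sent to [landfill, waste_to_energy, recycling]
net_cost = np.array([80.0, 40.0, 55.0])                  # EUR/t after energy/material revenue
capacity = np.array([np.inf, 300_000.0, 150_000.0])      # t/yr treatment capacities

res = linprog(
    c=net_cost,
    A_eq=[[1.0, 1.0, 1.0]], b_eq=[waste_total],          # all waste must be treated
    bounds=list(zip([0.0, 0.0, 0.0], capacity)),
    method="highs",
)
print(dict(zip(["landfill", "WtE", "recycling"], res.x.round(0))), res.fun)
```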

  13. Implementing quantitative analysis and its complement

    International Nuclear Information System (INIS)

    Rasmuson, D.M.; Nelson, W.R.; Shepherd, J.C.

    1982-01-01

    This paper presents an application of risk analysis for the evaluation of nuclear reactor facility operation. Common cause failure analysis (CCFA) techniques to identify potential problem areas are discussed. Integration of CCFA and response trees, a particular form of the path sets of a success tree, to gain significant insight into the operation of the facility is also demonstrated. An example illustrating the development of the risk analysis methodology, development of the fault trees, generation of response trees, and evaluation of the CCFA is presented to explain the technique

  14. Quantitative multi-modal NDT data analysis

    International Nuclear Information System (INIS)

    Heideklang, René; Shokouhi, Parisa

    2014-01-01

    A single NDT technique is often not adequate to provide assessments about the integrity of test objects with the required coverage or accuracy. In such situations, it is often resorted to multi-modal testing, where complementary and overlapping information from different NDT techniques are combined for a more comprehensive evaluation. Multi-modal material and defect characterization is an interesting task which involves several diverse fields of research, including signal and image processing, statistics and data mining. The fusion of different modalities may improve quantitative nondestructive evaluation by effectively exploiting the augmented set of multi-sensor information about the material. It is the redundant information in particular, whose quantification is expected to lead to increased reliability and robustness of the inspection results. There are different systematic approaches to data fusion, each with its specific advantages and drawbacks. In our contribution, these will be discussed in the context of nondestructive materials testing. A practical study adopting a high-level scheme for the fusion of Eddy Current, GMR and Thermography measurements on a reference metallic specimen with built-in grooves will be presented. Results show that fusion is able to outperform the best single sensor regarding detection specificity, while retaining the same level of sensitivity

  15. Quantitative infrared analysis of hydrogen fluoride

    International Nuclear Information System (INIS)

    Manuta, D.M.

    1997-04-01

    This work was performed at the Portsmouth Gaseous Diffusion Plant, where hydrogen fluoride is produced upon the hydrolysis of UF6. This poses a problem in this setting, and a method for determining the mole percent concentration was desired. HF has been considered to be a non-ideal gas for many years. D. F. Smith utilized complex equations in his HF studies in the 1950s. We have evaluated HF behavior as a function of pressure from three different perspectives. (1) Absorbance at 3877 cm-1 as a function of pressure for 100% HF. (2) Absorbance at 3877 cm-1 as a function of increasing partial pressure of HF, with total pressure = 300 mm HgA maintained with nitrogen. (3) Absorbance at 3877 cm-1 for constant partial pressure of HF, with total pressure increased to greater than 800 mm HgA with nitrogen. These experiments have shown that at partial pressures up to 35 mm HgA, HF follows the ideal gas law. The absorbance at 3877 cm-1 can be quantitatively analyzed via infrared methods
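
    A sketch of the linearity check described above: absorbance at 3877 cm-1 versus HF partial pressure should be linear (Beer-Lambert behaviour) in the range where HF acts as an ideal gas; the data points are hypothetical:

```python
# Linear fit of absorbance against partial pressure as an ideality / Beer-Lambert check.
import numpy as np

pressure_mmHg = np.array([5.0, 10.0, 15.0, 20.0, 25.0, 30.0, 35.0])
absorbance = 0.012 * pressure_mmHg + np.random.default_rng(6).normal(0, 0.003, 7)

slope, intercept = np.polyfit(pressure_mmHg, absorbance, 1)
residuals = absorbance - (slope * pressure_mmHg + intercept)
r_squared = 1 - residuals.var() / absorbance.var()
print(f"slope = {slope:.4f} /mmHg, R^2 = {r_squared:.3f}")  # R^2 near 1 -> ideal-gas regime
```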

  16. FFT transformed quantitative EEG analysis of short term memory load.

    Science.gov (United States)

    Singh, Yogesh; Singh, Jayvardhan; Sharma, Ratna; Talwar, Anjana

    2015-07-01

    The EEG is considered a building block of functional signaling in the brain, and the role of EEG oscillations in human information processing has been intensively investigated. The aim was to study the quantitative EEG correlates of short term memory load as assessed through the Sternberg memory test. The study was conducted on 34 healthy male student volunteers. The intervention consisted of the Sternberg memory test, which runs on a version of the Sternberg memory scanning paradigm software on a computer. Electroencephalography (EEG) was recorded from 19 scalp locations according to the 10-20 international system of electrode placement, and the EEG signals were analyzed offline. To overcome the problems of a fixed band system, an individual alpha frequency (IAF) based frequency band selection method was adopted. The outcome measures were FFT-transformed absolute powers in the six bands at the 19 electrode positions. The Sternberg memory test served as the model of short term memory load. During the memory task, the EEG showed decreased absolute power in the Upper alpha band at nearly all electrode positions, increased power in the Theta band in the fronto-temporal region, and increased power in the Lower 1 alpha band in the fronto-central region. Lower 2 alpha, Beta and Gamma band power remained unchanged. Short term memory load thus has distinct electroencephalographic correlates resembling the mentally stressed state. This is evident from the decreased power in the Upper alpha band (corresponding to the Alpha band of the traditional EEG system), which is the representative band of a relaxed mental state. Fronto-temporal Theta power changes may reflect the encoding and execution of the memory task.
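
    A minimal sketch of the IAF-based band power computation described above: estimate the individual alpha frequency from the spectral peak, derive bands relative to it, and integrate FFT power within each band; the band boundaries follow a common convention and the signal is synthetic, not study data:

```python
# Individual-alpha-frequency (IAF) anchored band powers from an FFT power spectrum.
import numpy as np

fs = 256.0
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(7)
eeg = np.sin(2 * np.pi * 10.2 * t) + 0.5 * rng.normal(size=t.size)   # 10.2 Hz alpha + noise

freqs = np.fft.rfftfreq(t.size, 1 / fs)
power = np.abs(np.fft.rfft(eeg)) ** 2

alpha_mask = (freqs >= 7) & (freqs <= 13)
iaf = freqs[alpha_mask][np.argmax(power[alpha_mask])]                # individual alpha freq

bands = {                                                            # IAF-anchored bands
    "theta":   (iaf - 6, iaf - 4),
    "lower_1": (iaf - 4, iaf - 2),
    "lower_2": (iaf - 2, iaf),
    "upper":   (iaf, iaf + 2),
}
band_power = {name: power[(freqs >= lo) & (freqs < hi)].sum()
              for name, (lo, hi) in bands.items()}
print(f"IAF = {iaf:.1f} Hz", {k: round(v) for k, v in band_power.items()})
```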

  17. Quantitative assessment of changes in landslide risk using a regional scale run-out model

    Science.gov (United States)

    Hussin, Haydar; Chen, Lixia; Ciurean, Roxana; van Westen, Cees; Reichenbach, Paola; Sterlacchini, Simone

    2015-04-01

    The risk from landslide hazard continuously changes in time and space and is rarely a static or constant phenomenon in an affected area. However, one of the main challenges of quantitatively assessing changes in landslide risk is the availability of multi-temporal data for the different components of risk. Furthermore, a truly "quantitative" landslide risk analysis requires the modeling of the landslide intensity (e.g. flow depth, velocities or impact pressures) affecting the elements at risk. Such a quantitative approach is often lacking in medium to regional scale studies in the scientific literature, or is left out altogether. In this research we modelled the temporal and spatial changes of debris flow risk in a narrow alpine valley in the North Eastern Italian Alps. The debris flow inventory from 1996 to 2011 and multi-temporal digital elevation models (DEMs) were used to assess the susceptibility of debris flow triggering areas and to simulate debris flow run-out using the Flow-R regional scale model. In order to determine debris flow intensities, we used a linear relationship that was found between back-calibrated, physically based Flo-2D simulations (local scale models of five debris flows from 2003) and the probability values of the Flow-R software. This gave us the possibility to assign flow depth to a total of 10 separate classes on a regional scale. Debris flow vulnerability curves from the literature, and one curve derived specifically for our case study area, were used to determine the damage for the different material and building types associated with the elements at risk. The building values were obtained from the Italian Revenue Agency (Agenzia delle Entrate) and were classified per cadastral zone according to the Real Estate Observatory data (Osservatorio del Mercato Immobiliare, Agenzia Entrate - OMI). The minimum and maximum market value for each building was obtained by multiplying the corresponding land-use value (€/m2) with the building area and number of floors
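
    A hedged sketch of the building-level risk calculation outlined above: annual risk = P(debris flow) x P(run-out reaching the building) x vulnerability(flow depth) x exposed value; the vulnerability curve and all numbers are illustrative assumptions:

```python
# Expected annual loss for a single exposed building under assumed hazard parameters.
def vulnerability(flow_depth_m):
    """Simple saturating vulnerability curve (0..1) as a function of flow depth."""
    return min(1.0, 0.25 * flow_depth_m ** 1.5)

def annual_risk(p_event_per_yr, p_reach, flow_depth_m, building_value_eur):
    return p_event_per_yr * p_reach * vulnerability(flow_depth_m) * building_value_eur

# Example: 1/50 yr triggering probability, 0.3 run-out probability, 1.2 m flow depth
print(f"expected loss = {annual_risk(1 / 50, 0.3, 1.2, 350_000):.0f} EUR/yr")
```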

  18. Quantitative genetic analysis of total glucosinolate, oil and protein ...

    African Journals Online (AJOL)

    Quantitative genetic analysis of total glucosinolate, oil and protein contents in Ethiopian mustard (Brassica carinata A. Braun) ... Seeds were analyzed using HPLC (glucosinolates), NMR (oil) and NIRS (protein). Analyses of variance, Hayman's method of diallel analysis and a mixed linear model of genetic analysis were ...

  19. Applied quantitative analysis in the social sciences

    CERN Document Server

    Petscher, Yaacov; Compton, Donald L

    2013-01-01

    To say that complex data analyses are ubiquitous in the education and social sciences might be an understatement. Funding agencies and peer-review journals alike require that researchers use the most appropriate models and methods for explaining phenomena. Univariate and multivariate data structures often require the application of more rigorous methods than basic correlational or analysis of variance models. Additionally, though a vast set of resources may exist on how to run analysis, difficulties may be encountered when explicit direction is not provided as to how one should run a model

  20. Qualitative and quantitative analysis of detonation products

    International Nuclear Information System (INIS)

    Xie Yun

    2005-01-01

    Different sampling and injection methods were used to analyze unknown detonation products in an obturator. The samples were analyzed by gas chromatography and gas chromatography/mass spectrometry. Quantitative analysis was applied to CO, NO, C2H2, C6H6 and so on, and qualitative analysis to C3H5N, C10H10, C8H8N2 and so on. The method used in the article is feasible. The results show that the explosive composition studied has a negative oxygen balance and that there were many pollutants in the detonation products. (authors)

  1. Quantitative analysis of probabilistic BPMN workflows

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2012-01-01

    We present a framework for modelling and analysis of real-world business workflows. We present a formalised core subset of the Business Process Modelling and Notation (BPMN) and then proceed to extend this language with probabilistic nondeterministic branching and general-purpose reward annotations...... of events, reward-based properties and best- and worst- case scenarios. We develop a simple example of a medical workflow and demonstrate the utility of this analysis in accurate provisioning of drug stocks. Finally, we suggest a path to building upon these techniques to cover the entire BPMN language, allow...

  2. The quantitative failure of human reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, C.T.

    1995-07-01

    This philosophical treatise examines the merits of Human Reliability Analysis (HRA) in the context of the nuclear power industry. In fact, the author attacks historic and current HRA as having failed to inform policy makers who make risk-based decisions about the contribution of humans to system performance. He argues for an HRA based on Bayesian (fact-based) inferential statistics, which advocates a systems analysis process that employs cogent heuristics when using opinion and tempers itself with a rational debate over the weight given to subjective and empirical probabilities.

  3. Quantitative Determination of Aluminum in Deodorant Brands: A Guided Inquiry Learning Experience in Quantitative Analysis Laboratory

    Science.gov (United States)

    Sedwick, Victoria; Leal, Anne; Turner, Dea; Kanu, A. Bakarr

    2018-01-01

    The monitoring of metals in commercial products is essential for protecting public health against the hazards of metal toxicity. This article presents a guided inquiry (GI) experimental lab approach in a quantitative analysis lab class that enabled students to determine the levels of aluminum in deodorant brands. The utility of a GI experimental…

  4. Event History Analysis in Quantitative Genetics

    DEFF Research Database (Denmark)

    Maia, Rafael Pimentel

    Event history analysis is a class of statistical methods specially designed to analyze time-to-event characteristics, e.g. the time until death. The aim of the thesis was to present adequate multivariate versions of mixed survival models that properly represent the genetic aspects related to a given...

  5. Quantitative analysis of probabilistic BPMN workflows

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2012-01-01

    We present a framework for modelling and analysis of real-world business workflows. We present a formalised core subset of the Business Process Modelling and Notation (BPMN) and then proceed to extend this language with probabilistic nondeterministic branching and general-purpose reward annotations...... of events, reward-based properties and best- and worst- case scenarios. We develop a simple example of a medical workflow and demonstrate the utility of this analysis in accurate provisioning of drug stocks. Finally, we suggest a path to building upon these techniques to cover the entire BPMN language, allow...... for more complex annotations and ultimately to automatically synthesise workflows by composing predefined sub-processes, in order to achieve a configuration that is optimal for parameters of interest....

  6. Quantitative analysis of normal thallium-201 tomographic studies

    International Nuclear Information System (INIS)

    Eisner, R.L.; Gober, A.; Cerqueira, M.

    1985-01-01

    To determine the normal (nl) distribution of Tl-201 uptake post exercise (EX) and at redistribution (RD) and nl washout, Tl-201 rotational tomographic (tomo) studies were performed in 40 subjects: 16 angiographic (angio) nls and 24 nl volunteers (12 from Emory and 12 from Yale). Oblique angle short axis slices were subjected to maximal count circumferential profile analysis. Data were displayed as a "bullseye" functional map with the apex at the center and the base at the periphery. The bullseye was not uniform in all regions because of the variable effects of attenuation and resolution at different view angles. In all studies, the septum:lateral wall ratio was 1.0 in males and approximately equal to 1.0 in females. This occurred predominantly because of anterior defects due to breast soft tissue attenuation. EX and RD bullseyes were similar. Using a bi-exponential model for Tl kinetics, 4-hour normalized washout ranged from 49% to 54% in each group and showed minimal variation between walls throughout the bullseye. Thus, there are well-defined variations in Tl-201 uptake in the nl myocardium which must be taken into consideration when analyzing patient data. Because of these defects and the lack of adequate methods for attenuation correction, quantitative analysis of Tl-201 studies must include direct comparison with gender-matched nl data sets

  7. Quantitative analysis of myocardial tissue with digital autofluorescence microscopy

    Directory of Open Access Journals (Sweden)

    Thomas Jensen

    2016-01-01

    Background: The opportunity offered by whole slide scanners of automated histological analysis implies an ever-increasing importance of digital pathology. To go beyond the importance of conventional pathology, however, digital pathology may need a basic histological starting point similar to that of hematoxylin and eosin staining in conventional pathology. This study presents an automated fluorescence-based microscopy approach providing highly detailed morphological data from unstained microsections. This data may provide a basic histological starting point from which further digital analysis including staining may benefit. Methods: This study explores the inherent tissue fluorescence, also known as autofluorescence, as a means of quantitating cardiac tissue components in histological microsections. Data acquisition using a commercially available whole slide scanner and an image-based quantitation algorithm are presented. Results: It is shown that the autofluorescence intensity of unstained microsections at two different wavelengths is a suitable starting point for automated digital analysis of myocytes, fibrous tissue, lipofuscin, and the extracellular compartment. The output of the method is absolute quantitation along with accurate outlines of above-mentioned components. The digital quantitations are verified by comparison to point grid quantitations performed on the microsections after Van Gieson staining. Conclusion: The presented method is amply described as a prestain multicomponent quantitation and outlining tool for histological sections of cardiac tissue. The main perspective is the opportunity for combination with digital analysis of stained microsections, for which the method may provide an accurate digital framework.

  8. Chromatic Image Analysis For Quantitative Thermal Mapping

    Science.gov (United States)

    Buck, Gregory M.

    1995-01-01

    Chromatic image analysis system (CIAS) developed for use in noncontact measurements of temperatures on aerothermodynamic models in hypersonic wind tunnels. Based on concept of temperature coupled to shift in color spectrum for optical measurement. Video camera images fluorescence emitted by phosphor-coated model at two wavelengths. Temperature map of model then computed from relative brightnesses in video images of model at those wavelengths. Eliminates need for intrusive, time-consuming, contact temperature measurements by gauges, making it possible to map temperatures on complex surfaces in timely manner and at reduced cost.

  9. Segmentation and Quantitative Analysis of Epithelial Tissues.

    Science.gov (United States)

    Aigouy, Benoit; Umetsu, Daiki; Eaton, Suzanne

    2016-01-01

    Epithelia are tissues that regulate exchanges with the environment. They are very dynamic and can acquire virtually any shape; at the cellular level, they are composed of cells tightly connected by junctions. Most often epithelia are amenable to live imaging; however, the large number of cells composing an epithelium and the absence of informatics tools dedicated to epithelial analysis largely prevented tissue scale studies. Here we present Tissue Analyzer, a free tool that can be used to segment and analyze epithelial cells and monitor tissue dynamics.

  10. Quantitative analysis of deuterium by gas chromatography

    International Nuclear Information System (INIS)

    Isomura, Shohei; Kaetsu, Hayato

    1981-01-01

    An analytical method for the determination of deuterium concentration in water and hydrogen gas by gas chromatography is described. HD and D2 in a hydrogen gas sample were separated from H2 by a column packed with Molecular Sieve 13X, using extra pure hydrogen gas as carrier. A thermal conductivity detector was used. Concentrations of deuterium were determined by comparison with standard samples. The error inherent to the present method was less than 1% on the basis of the calibration curves obtained with the standard samples. The average time required for the analysis was about 3 minutes. (author)

  11. Influence of corrosion layers on quantitative analysis

    International Nuclear Information System (INIS)

    Denker, A.; Bohne, W.; Opitz-Coutureau, J.; Rauschenberg, J.; Roehrich, J.; Strub, E.

    2005-01-01

    Art historians and restorers in charge of ancient metal objects are often reluctant to remove the corrosion layer evolved over time, as this would change the appearance of the artefact dramatically. Therefore, when an elemental analysis of the objects is required, this has to be done by penetrating the corrosion layer. In this work the influence of corrosion was studied on Chinese and Roman coins, where removal of oxidized material was possible. Measurements on spots with and without corrosion are presented and the results discussed

  12. Structural model analysis of multiple quantitative traits.

    Directory of Open Access Journals (Sweden)

    Renhua Li

    2006-07-01

    We introduce a method for the analysis of multilocus, multitrait genetic data that provides an intuitive and precise characterization of genetic architecture. We show that it is possible to infer the magnitude and direction of causal relationships among multiple correlated phenotypes and illustrate the technique using body composition and bone density data from mouse intercross populations. Using these techniques we are able to distinguish genetic loci that affect adiposity from those that affect overall body size and thus reveal a shortcoming of standardized measures such as body mass index that are widely used in obesity research. The identification of causal networks sheds light on the nature of genetic heterogeneity and pleiotropy in complex genetic systems.

  13. Human eyeball model reconstruction and quantitative analysis.

    Science.gov (United States)

    Xing, Qi; Wei, Qi

    2014-01-01

    Determining the shape of the eyeball is important for diagnosing eyeball diseases such as myopia. In this paper, we present an automatic approach to precisely reconstruct the three-dimensional geometric shape of the eyeball from MR images. The model development pipeline involved image segmentation, registration, B-spline surface fitting and subdivision surface fitting, none of which required manual interaction. From the resulting high-resolution models, geometric characteristics of the eyeball can be accurately quantified and analyzed. In addition to the eight metrics commonly used by existing studies, we proposed two novel metrics, Gaussian Curvature Analysis and Sphere Distance Deviation, to quantify the cornea shape and the whole eyeball surface respectively. The experiment results showed that the reconstructed eyeball models accurately represent the complex morphology of the eye. The ten metrics parameterize the eyeball among different subjects, which can potentially be used for eye disease diagnosis.

  14. Quantitative Image Simulation and Analysis of Nanoparticles

    DEFF Research Database (Denmark)

    Madsen, Jacob; Hansen, Thomas Willum

    High-resolution transmission electron microscopy (HRTEM) has become a routine analysis tool for structural characterization at atomic resolution, and with the recent development of in-situ TEMs, it is now possible to study catalytic nanoparticles under reaction conditions. However, the connection between an experimental image and the underlying...... physical phenomena or structure is not always straightforward. The aim of this thesis is to use image simulation to better understand observations from HRTEM images. Surface strain is known to be important for the performance of nanoparticles. Using simulation, we estimate the precision and accuracy...... of strain measurements from TEM images, and investigate the stability of these measurements to microscope parameters. This is followed by our efforts toward simulating metal nanoparticles on a metal-oxide support using the Charge Optimized Many Body (COMB) interatomic potential. The simulated interface

  15. Biostatistical analysis of quantitative immunofluorescence microscopy images.

    Science.gov (United States)

    Giles, C; Albrecht, M A; Lam, V; Takechi, R; Mamo, J C

    2016-12-01

    Semiquantitative immunofluorescence microscopy has become a key methodology in biomedical research. Typical statistical workflows are considered in the context of avoiding pseudo-replication and marginalising experimental error. However, immunofluorescence microscopy naturally generates hierarchically structured data that can be leveraged to improve statistical power and enrich biological interpretation. Herein, we describe a robust distribution fitting procedure and compare several statistical tests, outlining their potential advantages/disadvantages in the context of biological interpretation. Further, we describe tractable procedures for power analysis that incorporates the underlying distribution, sample size and number of images captured per sample. The procedures outlined have significant potential for increasing understanding of biological processes and decreasing both ethical and financial burden through experimental optimization. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.

  16. Developments in statistical analysis in quantitative genetics

    DEFF Research Database (Denmark)

    Sorensen, Daniel

    2009-01-01

    A remarkable research impetus has taken place in statistical genetics since the last World Conference. This has been stimulated by breakthroughs in molecular genetics, automated data-recording devices and computer-intensive statistical methods. The latter were revolutionized by the bootstrap and by Markov chain Monte Carlo (McMC). In this overview a number of specific areas are chosen to illustrate the enormous flexibility that McMC has provided for fitting models and exploring features of data that were previously inaccessible. The selected areas are inferences of the trajectories over time of genetic means and variances, models for the analysis of categorical and count data, the statistical genetics of a model postulating that environmental variance is partly under genetic control, and a short discussion of models that incorporate massive genetic marker information. We provide an overview...

  17. Parameter determination for quantitative PIXE analysis using genetic algorithms

    International Nuclear Information System (INIS)

    Aspiazu, J.; Belmont-Moreno, E.

    1996-01-01

    For biological and environmental samples, the PIXE technique is particularly advantageous for elemental analysis, but quantitative analysis implies accomplishing complex calculations that require the knowledge of more than a dozen parameters. Using a genetic algorithm, the authors give here an account of the procedure to obtain the best values for the parameters necessary to fit the efficiency of an X-ray detector. The values of some variables involved in quantitative PIXE analysis were manipulated in a way similar to how genetic information is treated in a biological process. The authors ran the algorithm until it reproduced, within the confidence interval, the elemental concentrations corresponding to a reference material
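
    A compact sketch of the genetic-algorithm idea described above, fitting the parameters of a toy efficiency curve to reference data. The model, parameter bounds and GA settings are illustrative assumptions; the real PIXE efficiency model involves many more parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "efficiency" model with two unknown parameters (a, b); the actual detector
# efficiency model is more complex, this only illustrates the fitting strategy.
def efficiency(energy, a, b):
    return a * np.exp(-b / energy)

# Hypothetical reference data that the fitted parameters should reproduce.
energies = np.linspace(2.0, 20.0, 30)
target = efficiency(energies, 0.8, 3.5) + rng.normal(0.0, 0.005, energies.size)

def fitness(params):
    a, b = params
    return -np.mean((efficiency(energies, a, b) - target) ** 2)  # higher is better

# Minimal genetic algorithm: keep the fitter half, blend crossover, Gaussian mutation.
pop = rng.uniform([0.1, 0.5], [2.0, 10.0], size=(60, 2))
for generation in range(200):
    scores = np.array([fitness(p) for p in pop])
    parents = pop[np.argsort(scores)][-30:]              # fitter half survives
    children = []
    for _ in range(30):
        i, j = rng.integers(0, 30, 2)
        w = rng.random()
        child = w * parents[i] + (1.0 - w) * parents[j]  # blend crossover
        child += rng.normal(0.0, 0.05, 2)                # Gaussian mutation
        children.append(child)
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(p) for p in pop])]
print("fitted (a, b):", best)  # should approach the reference values (0.8, 3.5)
```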

  18. Quantitative analysis of carbon in plutonium

    International Nuclear Information System (INIS)

    Lefevre, Chantal.

    1979-11-01

    The aim of this study is to develop a method for the determination of carbon traces (20 to 400 ppm) in plutonium. The development of a carbon in plutonium standard is described, then the content of this substance is determined and its validity as a standard shown by analysis in two different ways. In the first method used, reaction of the metal with sulphur and determination of carbon as carbon sulphide, the following parameters were studied: influence of excess reagent, surface growth of samples in contact with sulphur, temperature and reaction time. The results obtained are in agreement with those obtained by the conventional method of carbon determination, combustion in oxygen and measurement of carbon in the form of carbon dioxide. Owing to the presence of this standard we were then able to study the different parameters involved in plutonium combustion so that the reaction can be made complete: temperature reached during combustion, role of flux, metal surface in contact with oxygen and finally method of cleaning plutonium samples [fr

  19. Quantitative analysis of forest fire extinction efficiency

    Directory of Open Access Journals (Sweden)

    Miguel E. Castillo-Soto

    2015-08-01

    Aim of study: Evaluate the economic extinction efficiency of forest fires, based on the study of fire combat undertaken by aerial and terrestrial means. Area of study, materials and methods: Approximately 112,000 hectares in Chile. Records of 5,876 forest fires that occurred between 1998 and 2009 were analyzed. The area further provides a validation sector for results, by incorporating databases for the years 2010 and 2012. The criteria used for measuring extinction efficiency were economic value of forestry resources, Contraction Factor analysis and definition of the extinction costs function. Main results: It is possible to establish a relationship between burnt area, extinction costs and economic losses. The method proposed may be used and adapted to other fire situations, requiring unit costs for aerial and terrestrial operations, economic value of the property to be protected and speed attributes of fire spread in free advance. Research highlights: The determination of extinction efficiency in containment works of forest fires and potential projection of losses, different types of plant fuel and local conditions favoring the spread of fire broaden the admissible ranges of a, φ and Ce considerably.

  20. New possibilities for quantitative measurements of regional cerebral blood flow with gold-195m

    International Nuclear Information System (INIS)

    Lindner, P.; Nickel, O.

    1985-01-01

    A previously reported theory for quantitative cerebral blood flow measurement for nondiffusible radiotracers has been applied to patients after stroke and to volunteers undergoing a mental stimulation exercise. The energy spectrum of gold-195m shows two strong photon peaks, one at an energy level of 68 keV and a second at an energy-level of 262 keV. The low energy peak is suitable for perfusion studies in lateral views of the hemispheres; no look-through effect is seen. The high energy level is good for studies in posterior-anterior positions. Parametric images for quantitative regional cerebral blood flow can be generated. The area of occluded vessels in the case of stroke can be detected. Quantitative activation patterns of cerebral blood flow during mental stimulation can be generated. The results prove that, not only with freely diffusible indicators like xenon but also with nondiffusible indicators, it is possible to measure quantitatively cerebral blood flow patterns

  1. Regional and Historical Minerageny of the Gemstone Complexes in Russia (Quantitative Aspects

    Directory of Open Access Journals (Sweden)

    Polyanin V.S.

    2015-06-01

    The paper presents an approximate quantitative estimation of the mineragenic potentials of gems in the paleogeodynamic systems, complexes, and geological formations of Russia, which differ in age and regional distribution. General trends of the changes in the scale and intensity of the formation processes of the same gem deposits over geologic time were identified.

  2. Mapping Quantitatively Regional Drug Absorption in Canines with IntelliCap System

    NARCIS (Netherlands)

    Becker, D.; Schütz, H.; Beyerbach, A.; Zou, H.; Shimizu, J.; Iordanov, V.P.

    2011-01-01

    Quantitative regional absorption of a drug under development was studied using the novel IntelliCap system. IntelliCap is an orally swallowable, programmable drug delivery capsule capable of real-time monitoring of physiological conditions (pH, temperature), consequently allowing localization

  3. [Quantitative data analysis for live imaging of bone].

    Science.gov (United States)

    Seno, Shigeto

    Bone tissue is a hard tissue, and it has long been difficult to observe its interior in the living state. With recent progress in microscopy and fluorescent probe technology, it has become possible to observe the various activities of the many cell types that make up the bone 'society'. On the other hand, the quantitative increase in data and the diversification and complexity of the images make it difficult to perform quantitative analysis by visual inspection, and the development of methodologies for processing microscopic images and analyzing the data has been awaited. In this article, we introduce the research field of bioimage informatics, which lies at the boundary between biology and information science, and then outline basic image processing techniques for the quantitative analysis of live imaging data of bone.

  4. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    Science.gov (United States)

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media and plate-count techniques on the estimation of microbial counts, while a highly significant effect of other factors (type of microorganism, pharmaceutical product and individual reading and interpreting errors) was established. The most appropriate method of statistical analysis of such data was ANOVA, which enabled not only the effect of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, which is appropriate for traditional plate count methods. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. Quantiprot - a Python package for quantitative analysis of protein sequences.

    Science.gov (United States)

    Konopka, Bogumił M; Marciniak, Marta; Dyrka, Witold

    2017-07-17

    The field of protein sequence analysis is dominated by tools rooted in substitution matrices and alignments. A complementary approach is provided by methods of quantitative characterization. A major advantage of the approach is that quantitative properties define a multidimensional solution space, where sequences can be related to each other and differences can be meaningfully interpreted. Quantiprot is a software package in Python which provides a simple and consistent interface to multiple methods for quantitative characterization of protein sequences. The package can be used to calculate dozens of characteristics directly from sequences or using physico-chemical properties of amino acids. Besides basic measures, Quantiprot performs quantitative analysis of recurrence and determinism in the sequence, calculates the distribution of n-grams and computes the Zipf's law coefficient. We propose three main fields of application of the Quantiprot package. First, quantitative characteristics can be used in alignment-free similarity searches and in clustering of large and/or divergent sequence sets. Second, a feature space defined by quantitative properties can be used in comparative studies of protein families and organisms. Third, the feature space can be used for evaluating generative models, where a large number of sequences generated by the model can be compared to actually observed sequences.
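
    As an illustration of the kind of quantitative characterization mentioned above (this is not Quantiprot's actual interface, which the record does not show), a short sketch that counts amino-acid n-grams and estimates a Zipf's-law slope from their rank-frequency distribution:

```python
from collections import Counter

import numpy as np

def ngram_counts(seq, n=2):
    """Count overlapping n-grams in a protein sequence."""
    return Counter(seq[i:i + n] for i in range(len(seq) - n + 1))

def zipf_slope(counts):
    """Slope of log(frequency) vs. log(rank); its magnitude is the Zipf coefficient."""
    freqs = np.sort(np.array(list(counts.values()), dtype=float))[::-1]
    ranks = np.arange(1, freqs.size + 1)
    slope, _ = np.polyfit(np.log(ranks), np.log(freqs), 1)
    return slope

# Hypothetical sequence; any amino-acid string works.
seq = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQAPILSRVGDGTQDNLSGAEKAVQVKVKALPDAQFEVVHSLAKWKR"
print(zipf_slope(ngram_counts(seq, n=2)))
```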

  6. DEVELOPMENT OF TECHNIQUES FOR QUANTITATIVE ANALYSIS OF LIME FLOWERS

    Directory of Open Access Journals (Sweden)

    Demyanenko DV

    2016-03-01

    Introduction. The article is devoted to the development of techniques for quantitative analysis of lime flower in order to make amendments to the existing pharmacopoeial monographs for this herbal drug. Lime inflorescences contain lipophilic biologically active substances (BAS) causing notable antimicrobial and anti-inflammatory effects and also more polar phenolic compounds with antiulcer activity. Considering this, it is necessary to regulate all these groups of BAS quantitatively. Materials and methods. For this study, six batches of lime flowers harvested in 2008-2009 in the Kharkiv, Rivno and Zhitomir regions were used as the crude herbal drug. Loss on drying was determined by routine pharmacopoeial procedures. The total content of lipophilic substances was determined gravimetrically after Soxhlet extraction of samples of 1, 5, 7 and 10 g in weight with methylene chloride, considering that in its extracting ability this solvent is close to liquefied difluorochloromethane (freon R22), which we used for obtaining lipophilic complexes. The duration of complete analytical extraction was determined by infusing six 10 g portions of lime flowers for 1, 2, 3, 4, 5 and 6 hours, after which the quantity of lipophilic extractives was determined gravimetrically. The quantity of essential oil in lime flowers was evaluated according to the procedure of EP 7, 2.8.12. The weight of the herbal drug sample was 200 g, the distillation rate 2.5-3.5 ml/min, the volume of distillation liquid (water) 500 ml, and the volume of xylene in the graduated tube 0.50 ml. The total flavonoid content, recalculated to quercetin, was determined after hydrolysis with acidified acetone, withdrawal of the flavonoid aglycones with ethyl acetate and subsequent spectrophotometry of their complexes with aluminium chloride. All quantitative determinations were replicated five times for each assay. All chemicals and reagents were of analytical grade. Results and discussion. It was found that adequate accuracy of the analysis of lipophilic

  7. Field nonuniformity correction for quantitative analysis of digitized mammograms

    International Nuclear Information System (INIS)

    Pawluczyk, Olga; Yaffe, Martin J.

    2001-01-01

    Several factors, including the heel effect, variation in distance from the x-ray source to points in the image and path obliquity contribute to the signal nonuniformity of mammograms. To best use digitized mammograms for quantitative image analysis, these field non-uniformities must be corrected. An empirically based correction method, which uses a bowl-shaped calibration phantom, has been developed. Due to the annular spherical shape of the phantom, its attenuation is constant over the entire image. Remaining nonuniformities are due only to the heel and inverse square effects as well as the variable path through the beam filter, compression plate and image receptor. In logarithmic space, a normalized image of the phantom can be added to mammograms to correct for these effects. Then, an analytical correction for path obliquity in the breast can be applied to the images. It was found that the correction causes the errors associated with field nonuniformity to be reduced from 14% to 2% for a 4 cm block of material corresponding to a combination of 50% fibroglandular and 50% fatty breast tissue. A repeatability study has been conducted to show that in regions as far as 20 cm away from the chest wall, variations due to imaging conditions and phantom alignment contribute to <2% of overall corrected signal
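
    A minimal sketch of the log-space correction described above, assuming the mammogram and the phantom image are available as arrays on a linear intensity scale; the function and variable names are illustrative, not from the paper.

```python
import numpy as np

def correct_field_nonuniformity(mammogram, phantom):
    """Log-space flat-field correction using a constant-attenuation phantom image.

    Because the phantom attenuates uniformly, structure in its image reflects only
    the heel effect, inverse-square falloff and beam-path variations, i.e. the same
    multiplicative factors that distort the mammogram.
    """
    log_mammo = np.log(mammogram.astype(float))
    log_phantom = np.log(phantom.astype(float))
    # Normalize the phantom so the correction is zero-mean in log space, then add
    # it to cancel the shared multiplicative nonuniformity.
    correction = log_phantom.mean() - log_phantom
    return np.exp(log_mammo + correction)
```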

  8. A quantitative analysis of the causes of the global climate change research distribution

    DEFF Research Database (Denmark)

    Pasgaard, Maya; Strange, Niels

    2013-01-01

    investigates whether the need for knowledge on climate change in the most vulnerable regions of the world is met by the supply of knowledge measured by scientific research publications from the last decade. A quantitative analysis of more than 15,000 scientific publications from 197 countries investigates...... the poorer, fragile and more vulnerable regions of the world. A quantitative keyword analysis of all publications shows that different knowledge domains and research themes dominate across regions, reflecting the divergent global concerns in relation to climate change. In general, research on climate change...... the distribution of climate change research and the potential causes of this distribution. More than 13 explanatory variables representing vulnerability, geographical, demographical, economical and institutional indicators are included in the analysis. The results show that the supply of climate change knowledge

  9. Quantitative methods for the analysis of electron microscope images

    DEFF Research Database (Denmark)

    Skands, Peter Ulrik Vallø

    1996-01-01

    The topic of this thesis is a general introduction to quantitative methods for the analysis of digital microscope images. The images presented have primarily been acquired from Scanning Electron Microscopes (SEM) and interferometer microscopes (IFM). The topic is approached through several examples...... foundation of the thesis fall in the areas of: 1) Mathematical Morphology; 2) Distance transforms and applications; and 3) Fractal geometry. Image analysis opens in general the possibility of a quantitative and statistically well-founded measurement of digital microscope images. Herein lies also the conditions

  10. Transcription-factor occupancy at HOT regions quantitatively predicts RNA polymerase recruitment in five human cell lines.

    KAUST Repository

    Foley, Joseph W

    2013-10-20

    BACKGROUND: High-occupancy target (HOT) regions are compact genome loci occupied by many different transcription factors (TFs). HOT regions were initially defined in invertebrate model organisms, and we here show that they are a ubiquitous feature of the human gene-regulation landscape. RESULTS: We identified HOT regions by a comprehensive analysis of ChIP-seq data from 96 DNA-associated proteins in 5 human cell lines. Most HOT regions co-localize with RNA polymerase II binding sites, but many are not near the promoters of annotated genes. At HOT promoters, TF occupancy is strongly predictive of transcription preinitiation complex recruitment and moderately predictive of initiating Pol II recruitment, but only weakly predictive of elongating Pol II and RNA transcript abundance. TF occupancy varies quantitatively within human HOT regions; we used this variation to discover novel associations between TFs. The sequence motif associated with any given TF's direct DNA binding is somewhat predictive of its empirical occupancy, but a great deal of occupancy occurs at sites without the TF's motif, implying indirect recruitment by another TF whose motif is present. CONCLUSIONS: Mammalian HOT regions are regulatory hubs that integrate the signals from diverse regulatory pathways to quantitatively tune the promoter for RNA polymerase II recruitment.

  11. Transcription-factor occupancy at HOT regions quantitatively predicts RNA polymerase recruitment in five human cell lines.

    KAUST Repository

    Foley, Joseph W; Sidow, Arend

    2013-01-01

    BACKGROUND: High-occupancy target (HOT) regions are compact genome loci occupied by many different transcription factors (TFs). HOT regions were initially defined in invertebrate model organisms, and we here show that they are a ubiquitous feature of the human gene-regulation landscape. RESULTS: We identified HOT regions by a comprehensive analysis of ChIP-seq data from 96 DNA-associated proteins in 5 human cell lines. Most HOT regions co-localize with RNA polymerase II binding sites, but many are not near the promoters of annotated genes. At HOT promoters, TF occupancy is strongly predictive of transcription preinitiation complex recruitment and moderately predictive of initiating Pol II recruitment, but only weakly predictive of elongating Pol II and RNA transcript abundance. TF occupancy varies quantitatively within human HOT regions; we used this variation to discover novel associations between TFs. The sequence motif associated with any given TF's direct DNA binding is somewhat predictive of its empirical occupancy, but a great deal of occupancy occurs at sites without the TF's motif, implying indirect recruitment by another TF whose motif is present. CONCLUSIONS: Mammalian HOT regions are regulatory hubs that integrate the signals from diverse regulatory pathways to quantitatively tune the promoter for RNA polymerase II recruitment.

  12. Quantitative Development and Distribution of Zooplankton in Medium Lakes of the Kostanay Region (North Kazakhstan Region)

    Science.gov (United States)

    Aubakirova, Gulzhan A.; Syzdykov, Kuanysh N.; Kurzhykayev, Zhumagazy; Uskenov, Rashit B.; Narbayev, Serik; Begenova, Ainagul B.; Zhumakayeva, Aikumys N.; Sabdinova, Dinara K.; Akhmedinov, Serikbay N.

    2016-01-01

    The assessment of water resources plays an important environmental and economic role, since it allows developing an effective program of regional development with regard to the environmental load. The hydro-chemical regime of lakes includes water temperature, content of biogenic elements, total mineralization, oxygen regime, and other parameters…

  13. Quantitative analysis of left ventricular strain using cardiac computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Buss, Sebastian J., E-mail: sebastian.buss@med.uni-heidelberg.de [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Schulz, Felix; Mereles, Derliz [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Hosch, Waldemar [Department of Diagnostic and Interventional Radiology, University of Heidelberg, 69120 Heidelberg (Germany); Galuschky, Christian; Schummers, Georg; Stapf, Daniel [TomTec Imaging Systems GmbH, Munich (Germany); Hofmann, Nina; Giannitsis, Evangelos; Hardt, Stefan E. [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Kauczor, Hans-Ulrich [Department of Diagnostic and Interventional Radiology, University of Heidelberg, 69120 Heidelberg (Germany); Katus, Hugo A.; Korosoglou, Grigorios [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany)

    2014-03-15

    Objectives: To investigate whether cardiac computed tomography (CCT) can determine left ventricular (LV) radial, circumferential and longitudinal myocardial deformation in comparison to two-dimensional echocardiography in patients with congestive heart failure. Background: Echocardiography allows for accurate assessment of strain with high temporal resolution. A reduced strain is associated with a poor prognosis in cardiomyopathies. However, strain imaging is limited in patients with poor echogenic windows, so that, in selected cases, tomographic imaging techniques may be preferable for the evaluation of myocardial deformation. Methods: Consecutive patients (n = 27) with congestive heart failure who underwent a clinically indicated ECG-gated contrast-enhanced 64-slice dual-source CCT for the evaluation of the cardiac veins prior to cardiac resynchronization therapy (CRT) were included. All patients underwent additional echocardiography. LV radial, circumferential and longitudinal strain and strain rates were analyzed in identical midventricular short axis, 4-, 2- and 3-chamber views for both modalities using the same prototype software algorithm (feature tracking). Time for analysis was assessed for both modalities. Results: Close correlations were observed for both techniques regarding global strain (r = 0.93, r = 0.87 and r = 0.84 for radial, circumferential and longitudinal strain, respectively, p < 0.001 for all). Similar trends were observed for regional radial, longitudinal and circumferential strain (r = 0.88, r = 0.84 and r = 0.94, respectively, p < 0.001 for all). The number of non-diagnostic myocardial segments was significantly higher with echocardiography than with CCT (9.6% versus 1.9%, p < 0.001). In addition, the required time for complete quantitative strain analysis was significantly shorter for CCT compared to echocardiography (877 ± 119 s per patient versus 1105 ± 258 s per patient, p < 0.001). Conclusion: Quantitative assessment of LV strain

  14. Original methods of quantitative analysis developed for diverse samples in various research fields. Quantitative analysis at NMCC

    International Nuclear Information System (INIS)

    Sera, Koichiro

    2003-01-01

    The Nishina Memorial Cyclotron Center (NMCC) has been open for nationwide common use of positron nuclear medicine (PET) and PIXE since April 1993. At present, nearly 40 PIXE subjects in various research fields are pursued here, and more than 50,000 samples have been analyzed. In order to perform quantitative analyses of diverse samples, technical developments in sample preparation, measurement and data analysis have been carried out continuously. In particular, a "standard-free method for quantitative analysis" made it possible to analyze infinitesimal samples, powdered samples and untreated biological samples, which could not be analyzed quantitatively well in the past. The "standard-free method" and a "powdered internal standard method" made the target preparation process much easier. It has been confirmed that the results obtained by these methods show satisfactory accuracy and reproducibility, free of any ambiguity arising from complicated target preparation processes. (author)

  15. Quantitative scenario analysis of a low- and intermediate-level radioactive waste repository

    International Nuclear Information System (INIS)

    Lee, Keon Jae; Lee, Sang Yoon; Park, Keon Baek; Song, Min Cheon; Lee, Ho Jin

    1998-03-01

    Derivation of a hypothetical radioactive waste disposal facility is conducted through sub-component characteristic analysis and conceptual modeling, and a quantitative analysis of the constructed scenario is performed in terms of annual effective dose equivalent. The study proceeds sequentially through the performance assessment steps for a radioactive waste disposal facility: groundwater flow analysis, source term analysis, groundwater transport, surface water transport, and dose and pathways. The routine program module chain VAM2D-PAGAN-GENII is used for the quantitative scenario analysis. The detailed data used in these modules come from experimental data for Korean territory and from default data provided within the modules. Where data needed for code execution are missing, values are estimated through reasonable engineering judgment

  16. Method of quantitative x-ray diffractometric analysis of Ta-Ta2C system

    International Nuclear Information System (INIS)

    Gavrish, A.A.; Glazunov, M.P.; Korolev, Yu.M.; Spitsyn, V.I.; Fedoseev, G.K.

    1976-01-01

    The system Ta-Ta2C has been considered because of specific features of the diffraction patterns of its components, namely the overlapping of the most intense reflections of both phases. The method of a standard binary system has been used for the quantitative analysis. Because the intense reflections d(101) = 2.36 Å (Ta2C) and d(110) = 2.33 Å (Ta) overlap, other strong reflections have been used for the quantitative determination of Ta2C and Ta: d(103) = 1.404 Å for tantalum subcarbide and d(211) = 1.35 Å for tantalum. In addition, the Ta and Ta2C phases have been determined quantitatively using another pair of reflections: d(102) = 1.82 Å for Ta2C and d(200) = 1.65 Å for tantalum. The agreement between the results of the quantitative phase analysis is good. To increase the reliability and accuracy of the quantitative determination of Ta and Ta2C, it is expedient to carry out the analysis using the two above-mentioned pairs of reflections located in different regions of the diffraction spectrum. Thus, a procedure for the quantitative analysis of Ta and Ta2C in different ratios has been developed, taking into account the specific features of the diffraction patterns of these components as well as the tendency of Ta2C to develop texture during preparation
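
    A small sketch of the binary-mixture relation underlying this kind of standard-based quantification: for a two-phase mixture, I_Ta2C/I_Ta = k · w_Ta2C/w_Ta with w_Ta = 1 - w_Ta2C, so the weight fraction follows from the measured intensity ratio. The calibration constant and intensities below are illustrative placeholders, not values from the paper.

```python
def weight_fraction_ta2c(i_ta2c, i_ta, k):
    """Weight fraction of Ta2C in a Ta/Ta2C binary mixture from two peak intensities.

    k is the calibration constant obtained from reference mixtures of known
    composition for the chosen pair of reflections (e.g. d(103) vs. d(211)).
    """
    ratio = i_ta2c / i_ta  # I_Ta2C / I_Ta = k * w_Ta2C / (1 - w_Ta2C)
    return ratio / (k + ratio)

# Hypothetical numbers: intensity ratio 0.8 with k = 1.5 gives ~0.35 Ta2C by weight.
print(weight_fraction_ta2c(0.8, 1.0, 1.5))
```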

  17. Quantitative analysis of LISA pathfinder test-mass noise

    International Nuclear Information System (INIS)

    Ferraioli, Luigi; Congedo, Giuseppe; Hueller, Mauro; Vitale, Stefano; Hewitson, Martin; Nofrarias, Miquel; Armano, Michele

    2011-01-01

    LISA Pathfinder (LPF) is a mission aiming to test the critical technology for the forthcoming space-based gravitational-wave detectors. The main scientific objective of the LPF mission is to demonstrate test masses free falling with residual accelerations below 3×10⁻¹⁴ m s⁻²/√(Hz) at 1 mHz. Reaching such an ambitious target will require a significant amount of system optimization and characterization, which will in turn require accurate and quantitative noise analysis procedures. In this paper, we discuss two main problems associated with the analysis of the data from LPF: i) excess noise detection and ii) noise parameter identification. The mission is focused on the low-frequency region ([0.1, 10] mHz) of the available signal spectrum. In such a region, the signal is dominated by the force noise acting on test masses. At the same time, the mission duration is limited to 90 days and typical data segments will be 24 hours in length. Considering those constraints, noise analysis is expected to deal with a limited amount of non-Gaussian data, since the spectrum statistics will be far from Gaussian and the lowest available frequency is limited by the data length. In this paper, we analyze the details of the expected statistics for spectral data and develop two suitable excess noise estimators. One is based on the statistical properties of the integrated spectrum, the other is based on the Kolmogorov-Smirnov test. The sensitivity of the estimators is discussed theoretically for independent data, then the algorithms are tested on LPF synthetic data. The test on realistic LPF data allows the effect of spectral data correlations on the efficiency of the different noise excess estimators to be highlighted. It also reveals the versatility of the Kolmogorov-Smirnov approach, which can be adapted to provide reasonable results on correlated data from a modified version of the standard equations for the inversion of the test statistic. Closely related to excess noise
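
    A minimal sketch of the Kolmogorov-Smirnov idea mentioned above, assuming independent spectral bins: for a Gaussian process, the periodogram bins normalized by the model spectrum are approximately exponentially distributed, so a one-sample KS test against an exponential law flags excess noise. The significance level and synthetic data are illustrative, not LPF values.

```python
import numpy as np
from scipy import stats

def excess_noise_ks(periodogram, model_psd, alpha=0.05):
    """KS test of periodogram bins normalized by a model PSD.

    Under the null hypothesis (data consistent with the model, independent bins),
    the normalized bins follow a unit-mean exponential distribution.
    """
    normalized = np.asarray(periodogram) / np.asarray(model_psd)
    statistic, p_value = stats.kstest(normalized, "expon", args=(0.0, 1.0))
    return statistic, p_value, p_value < alpha  # True -> deviation from the model

# Synthetic example: white noise whose true PSD matches the model exactly.
rng = np.random.default_rng(1)
pxx = rng.exponential(scale=1.0, size=512)  # stand-in for periodogram bins
print(excess_noise_ks(pxx, np.ones(512)))
```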

  18. Global scaling for semi-quantitative analysis in FP-CIT SPECT.

    Science.gov (United States)

    Kupitz, D; Apostolova, I; Lange, C; Ulrich, G; Amthauer, H; Brenner, W; Buchert, R

    2014-01-01

    Semi-quantitative characterization of dopamine transporter availability from single photon emission computed tomography (SPECT) with 123I-ioflupane (FP-CIT) is based on uptake ratios relative to a reference region. The aim of this study was to evaluate the whole brain as reference region for semi-quantitative analysis of FP-CIT SPECT. The rationale was that this might reduce statistical noise associated with the estimation of non-displaceable FP-CIT uptake. 150 FP-CIT SPECTs were categorized as neurodegenerative or non-neurodegenerative by an expert. Semi-quantitative analysis of specific binding ratios (SBR) was performed with a custom-made tool based on the Statistical Parametric Mapping software package using predefined regions of interest (ROIs) in the anatomical space of the Montreal Neurological Institute. The following reference regions were compared: predefined ROIs for frontal and occipital lobe and whole brain (without striata, thalamus and brainstem). Tracer uptake in the reference region was characterized by the mean, median or 75th percentile of its voxel intensities. The area under the receiver operating characteristic curve (AUC) was used as the performance measure. The highest AUC of 0.973 was achieved by the SBR of the putamen with the 75th percentile in the whole brain as reference. The lowest AUC for the putamen SBR of 0.937 was obtained with the mean in the frontal lobe as reference. We recommend the 75th percentile in the whole brain as reference for semi-quantitative analysis in FP-CIT SPECT. This combination provided the best agreement of the semi-quantitative analysis with visual evaluation of the SPECT images by an expert and, therefore, is appropriate to support less experienced physicians.
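
    A small sketch of the specific-binding-ratio computation with the recommended reference, assuming voxel arrays for the putamen and for a whole-brain region already excluding striata, thalamus and brainstem; the array contents below are synthetic placeholders.

```python
import numpy as np

def specific_binding_ratio(target_voxels, reference_voxels, percentile=75):
    """SBR = (target uptake - reference uptake) / reference uptake.

    The reference uptake is the given percentile (75th recommended above) of the
    voxel intensities in the whole-brain reference region.
    """
    reference = np.percentile(reference_voxels, percentile)
    return (np.mean(target_voxels) - reference) / reference

rng = np.random.default_rng(0)
whole_brain = rng.normal(100.0, 10.0, 50_000)  # non-displaceable uptake only
putamen = rng.normal(250.0, 20.0, 800)         # specific + non-displaceable uptake
print(round(specific_binding_ratio(putamen, whole_brain, 75), 2))
```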

  19. Quantitative Proteomic Analysis of Sulfolobus solfataricus Membrane Proteins

    NARCIS (Netherlands)

    Pham, T.K.; Sierocinski, P.; Oost, van der J.; Wright, P.C.

    2010-01-01

    A quantitative proteomic analysis of the membrane of the archaeon Sulfolobus solfataricus P2 using iTRAQ was successfully demonstrated in this technical note. The estimated number of membrane proteins of this organism is 883 (predicted based on Gravy score), corresponding to 30 % of the total

  20. Data from quantitative label free proteomics analysis of rat spleen

    Directory of Open Access Journals (Sweden)

    Khadar Dudekula

    2016-09-01

    The dataset presented in this work has been obtained using a label-free quantitative proteomic analysis of rat spleen. A robust method for extraction of proteins from rat spleen tissue and LC-MS/MS analysis was developed using a urea and SDS-based buffer. Different fractionation methods were compared. A total of 3484 different proteins were identified from the pool of all experiments run in this study (a total of 2460 proteins with at least two peptides). A total of 1822 proteins were identified from nine non-fractionated pulse gels, while 2288 and 2864 proteins were identified after SDS-PAGE fractionation into three and five fractions, respectively. The proteomics data are deposited in the ProteomeXchange Consortium via PRIDE (PXD003520); Progenesis and MaxQuant outputs are presented in the supporting information. The lists of proteins generated under the different fractionation regimes allow the nature of the identified proteins to be assessed, the variability in the quantitative analysis associated with the different sampling strategies to be evaluated, and a proper number of replicates for future quantitative analyses to be defined. Keywords: Spleen, Rat, Protein extraction, Label-free quantitative proteomics

  1. Quantitative Analysis of Complex Tropical Forest Stands: A Review ...

    African Journals Online (AJOL)

    The importance of data analysis in quantitative assessment of natural resources remains significant in the sustainable management of complex tropical forest resources. Analyses of data from complex tropical forest stands have not been easy or clear due to improper data management. It is pivotal to practical researches ...

  2. Quantitative analysis of culture using millions of digitized books.

    Science.gov (United States)

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva Presser; Veres, Adrian; Gray, Matthew K; Pickett, Joseph P; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A; Aiden, Erez Lieberman

    2011-01-14

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of 'culturomics,' focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. Culturomics extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities.

  3. Quantitative analysis of culture using millions of digitized books

    Science.gov (United States)

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva P.; Veres, Adrian; Gray, Matthew K.; Pickett, Joseph P.; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A.; Aiden, Erez Lieberman

    2011-01-01

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of ‘culturomics’, focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. ‘Culturomics’ extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities. PMID:21163965

  4. Quantitative regional validation of the visual rating scale for posterior cortical atrophy

    International Nuclear Information System (INIS)

    Moeller, Christiane; Benedictus, Marije R.; Koedam, Esther L.G.M.; Scheltens, Philip; Flier, Wiesje M. van der; Versteeg, Adriaan; Wattjes, Mike P.; Barkhof, Frederik; Vrenken, Hugo

    2014-01-01

    Validate the four-point visual rating scale for posterior cortical atrophy (PCA) on magnetic resonance images (MRI) through quantitative grey matter (GM) volumetry and voxel-based morphometry (VBM) to justify its use in clinical practice. Two hundred twenty-nine patients with probable Alzheimer's disease and 128 with subjective memory complaints underwent 3T MRI. PCA was rated according to the visual rating scale. GM volumes of six posterior structures and the total posterior region were extracted using IBASPM and compared among PCA groups. To determine which anatomical regions contributed most to the visual scores, we used binary logistic regression. VBM compared local GM density among groups. Patients were categorised according to their PCA scores: PCA-0 (n = 122), PCA-1 (n = 143), PCA-2 (n = 79), and PCA-3 (n = 13). All structures except the posterior cingulate differed significantly among groups. The inferior parietal gyrus volume discriminated the most between rating scale levels. VBM showed that PCA-1 had a lower GM volume than PCA-0 in the parietal region and other brain regions, whereas between PCA-1 and PCA-2/3 GM atrophy was mostly restricted to posterior regions. The visual PCA rating scale is quantitatively validated and reliably reflects GM atrophy in parietal regions, making it a valuable tool for the daily radiological assessment of dementia. (orig.)

  5. Quantitative regional validation of the visual rating scale for posterior cortical atrophy

    Energy Technology Data Exchange (ETDEWEB)

    Moeller, Christiane; Benedictus, Marije R.; Koedam, Esther L.G.M.; Scheltens, Philip [VU University Medical Center, Alzheimer Center and Department of Neurology, Neuroscience Campus Amsterdam, P.O. Box 7057, Amsterdam (Netherlands); Flier, Wiesje M. van der [VU University Medical Center, Alzheimer Center and Department of Neurology, Neuroscience Campus Amsterdam, P.O. Box 7057, Amsterdam (Netherlands); VU University Medical Center, Department of Epidemiology and Biostatistics, Neuroscience Campus Amsterdam, P.O. Box 7057, Amsterdam (Netherlands); Versteeg, Adriaan; Wattjes, Mike P.; Barkhof, Frederik [VU University Medical Center, Department of Radiology and Nuclear Medicine, Neuroscience Campus Amsterdam, P.O. Box 7057, Amsterdam (Netherlands); Vrenken, Hugo [VU University Medical Center, Department of Radiology and Nuclear Medicine, Neuroscience Campus Amsterdam, P.O. Box 7057, Amsterdam (Netherlands); VU University Medical Center, Department of Physics and Medical Technology, Neuroscience Campus Amsterdam, P.O. Box 7057, Amsterdam (Netherlands)

    2014-02-15

    Validate the four-point visual rating scale for posterior cortical atrophy (PCA) on magnetic resonance images (MRI) through quantitative grey matter (GM) volumetry and voxel-based morphometry (VBM) to justify its use in clinical practice. Two hundred twenty-nine patients with probable Alzheimer's disease and 128 with subjective memory complaints underwent 3T MRI. PCA was rated according to the visual rating scale. GM volumes of six posterior structures and the total posterior region were extracted using IBASPM and compared among PCA groups. To determine which anatomical regions contributed most to the visual scores, we used binary logistic regression. VBM compared local GM density among groups. Patients were categorised according to their PCA scores: PCA-0 (n = 122), PCA-1 (n = 143), PCA-2 (n = 79), and PCA-3 (n = 13). All structures except the posterior cingulate differed significantly among groups. The inferior parietal gyrus volume discriminated the most between rating scale levels. VBM showed that PCA-1 had a lower GM volume than PCA-0 in the parietal region and other brain regions, whereas between PCA-1 and PCA-2/3 GM atrophy was mostly restricted to posterior regions. The visual PCA rating scale is quantitatively validated and reliably reflects GM atrophy in parietal regions, making it a valuable tool for the daily radiological assessment of dementia. (orig.)

  6. Chemical Fingerprint Analysis and Quantitative Analysis of Rosa rugosa by UPLC-DAD

    Directory of Open Access Journals (Sweden)

    Sanawar Mansur

    2016-12-01

    A method based on ultra performance liquid chromatography with a diode array detector (UPLC-DAD) was developed for quantitative analysis of five active compounds and chemical fingerprint analysis of Rosa rugosa. Ten batches of R. rugosa collected from different plantations in the Xinjiang region of China were used to establish the fingerprint. The feasibility and advantages of the UPLC fingerprint used were verified for its similarity evaluation by systematically comparing chromatograms with professional analytical software recommended by the State Food and Drug Administration (SFDA) of China. In quantitative analysis, the five compounds showed good regression (R2 = 0.9995) within the test ranges, and the recovery of the method was in the range of 94.2%–103.8%. The similarities of the liquid chromatography fingerprints of the 10 batches of R. rugosa were more than 0.981. The developed UPLC fingerprint method is simple, reliable, and validated for the quality control and identification of R. rugosa. Additionally, simultaneous quantification of five major bioactive ingredients in the R. rugosa samples was conducted to interpret the consistency of the quality test. The results indicated that the UPLC fingerprint, as a characteristic distinguishing method combining similarity evaluation and quantification analysis, can be successfully used to assess the quality and to identify the authenticity of R. rugosa.

  7. A simple approach to quantitative analysis using three-dimensional spectra based on selected Zernike moments.

    Science.gov (United States)

    Zhai, Hong Lin; Zhai, Yue Yuan; Li, Pei Zhen; Tian, Yue Li

    2013-01-21

    A very simple approach to quantitative analysis is proposed based on digital image processing of three-dimensional (3D) spectra obtained by high-performance liquid chromatography coupled with a diode array detector (HPLC-DAD). As region-based shape features of a grayscale image, Zernike moments, with their inherent invariance properties, were employed to establish the linear quantitative models. This approach was applied to the quantitative analysis of three compounds in mixed samples using 3D HPLC-DAD spectra, and three linear models were obtained, respectively. The correlation coefficients (R2) for the training and test sets were greater than 0.999, and the statistical parameters and strict validation supported the reliability of the established models. The analytical results suggest that the Zernike moments selected by stepwise regression can be used in the quantitative analysis of target compounds. Our study provides a new idea for quantitative analysis using 3D spectra, which can be extended to the analysis of other 3D spectra obtained by different methods or instruments.
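
    As a hedged illustration of the idea, a 3D HPLC-DAD spectrum can be treated as a grayscale image and reduced to Zernike moment features before fitting a linear model. The sketch below assumes the third-party mahotas library for the moment calculation and uses ordinary least squares in place of the stepwise selection described in the abstract; all data names are placeholders.

```python
import numpy as np
import mahotas  # assumed third-party library providing zernike_moments

def zernike_features(spectrum_2d, degree=8):
    """Treat a 3D HPLC-DAD spectrum (retention time x wavelength absorbance
    matrix) as a grayscale image and extract Zernike moment magnitudes."""
    img = np.asarray(spectrum_2d, dtype=float)
    img = (img - img.min()) / (np.ptp(img) + 1e-12)   # normalise to [0, 1]
    radius = min(img.shape) // 2
    return mahotas.features.zernike_moments(img, radius, degree=degree)

# Hypothetical calibration step (placeholder names):
# X = np.vstack([zernike_features(s) for s in calibration_spectra])
# design = np.column_stack([X, np.ones(len(X))])
# coeffs, *_ = np.linalg.lstsq(design, known_concentrations, rcond=None)
```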

  8. New possibilities for quantitative measurements of regional cerebral blood flow with Au-195 m

    International Nuclear Information System (INIS)

    Lindner, P.; Nickel, O.

    1984-01-01

    A previously reported theory for quantitative cerebral blood flow measurement with nondiffusible radiotracers has been applied to patients after stroke and to volunteers undergoing a mental stimulation exercise. Quantitative measurements of cerebral blood flow patterns, not only in p.a. but also in lateral views of the brain, are possible using the recently developed generator for the short-lived (30 s) isotope Au-195m. The energy spectrum of the generator eluate shows two strong photon peaks, one at an energy of 68 keV and a second at 262 keV. The low-energy peak is suitable for perfusion studies in lateral views of the hemispheres; no "look-through" effect is seen. The high-energy peak is suitable for studies in the p.a. position. The studies last less than 1 minute and can be repeated after 3 minutes. Parametric images of quantitative regional cerebral blood flow can be generated. The area of occluded vessels in the case of stroke can be detected. Quantitative activation patterns of cerebral blood flow during mental stimulation can be generated. The results prove that cerebral blood flow patterns can be measured quantitatively not only with freely diffusible indicators such as xenon but also with nondiffusible indicators. (orig.)

  9. In vivo regional quantitation of intrathoracic 99mTc using SPECT: concise communication

    International Nuclear Information System (INIS)

    Osborne, D.; Jaszczak, R.; Coleman, R.E.; Greer, K.; Lischko, M.

    1982-01-01

    A whole-body single-photon emission computed tomographic system (SPECT) was used to quantitate the activities of a series of 99mTc point sources in the dog's thorax and to evaluate attenuation of a uniform esophageal line source containing a known concentration of 99mTc. A first-order attenuation correction and an empirically derived attenuation coefficient of 0.09 cm⁻¹ were used in the SPECT analyses of the intrathoracic point sources. The relationship between SPECT measurements of multiple point-source activities and the same sources measured in air was linear over a range of 100 to 1000 µCi (slope 1.08; coefficient of determination R2 = 0.97). These data are sufficiently accurate to allow an estimate of the regional activity of radiopharmaceutical in the dog's thorax and justify their use in experimental quantitation of regional pulmonary perfusion.
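
    A minimal sketch of the arithmetic involved, assuming a first-order multiplicative attenuation correction of the form exp(mu*d) and a straight-line check of SPECT against in-air activities (the numbers below are illustrative, not the study's measurements):

```python
import numpy as np

MU = 0.09  # empirically derived first-order attenuation coefficient, cm^-1

def attenuation_corrected_counts(measured_counts, depth_cm, mu=MU):
    """First-order multiplicative correction: scale counts by exp(mu * depth)."""
    return measured_counts * np.exp(mu * np.asarray(depth_cm, dtype=float))

# Illustrative linearity check of SPECT estimates against in-air activities
in_air_uCi = np.array([100.0, 250.0, 500.0, 750.0, 1000.0])
spect_uCi = np.array([110.0, 265.0, 540.0, 820.0, 1075.0])   # made-up values
slope, intercept = np.polyfit(in_air_uCi, spect_uCi, 1)
r2 = np.corrcoef(in_air_uCi, spect_uCi)[0, 1] ** 2
print(f"slope = {slope:.2f}, R^2 = {r2:.2f}")   # the study reports slope 1.08, R^2 0.97
```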

  10. Quantitative analysis by computer controlled X-ray fluorescence spectrometer

    International Nuclear Information System (INIS)

    Balasubramanian, T.V.; Angelo, P.C.

    1981-01-01

    X-ray fluorescence spectroscopy has become a widely accepted method in the metallurgical field for the analysis of both minor and major elements. As in many other analytical techniques, the problem of matrix effects, generally known as interelement effects, has to be dealt with effectively in order to make the analysis accurate. There are several methods by which matrix effects on the analyte are minimised or corrected for, and mathematical correction is one of them. In this method the characteristic secondary X-ray intensities are measured from standard samples, and correction coefficients for interelement effects, if any, are evaluated by mathematical calculation. This paper describes attempts to evaluate the correction coefficients for interelement effects by multiple linear regression programmes, using a computer, for the quantitative analysis of stainless steel and a nickel-base cast alloy. The quantitative results obtained using this method for a standard stainless steel sample are compared with the certified values. (author)
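
    One way to picture the regression step is a Lachance-Traill-type empirical correction model fitted by least squares; this is a hedged sketch under that assumption, not the authors' exact formulation, and the matrices are placeholders for measured relative intensities and certified concentrations of the standards:

```python
import numpy as np

def fit_alpha_coefficients(R, C, analyte=0):
    """Fit empirical interelement correction coefficients for one analyte from
    standards, assuming a Lachance-Traill-type model:
        C_analyte / R_analyte - 1 = sum_j alpha_j * C_j   (j != analyte)
    R : (n_standards, n_elements) measured relative intensities
    C : (n_standards, n_elements) certified concentrations (weight fractions)"""
    R = np.asarray(R, dtype=float)
    C = np.asarray(C, dtype=float)
    y = C[:, analyte] / R[:, analyte] - 1.0
    X = np.delete(C, analyte, axis=1)   # concentrations of the interfering elements
    alphas, *_ = np.linalg.lstsq(X, y, rcond=None)
    return alphas
```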

  11. Data from quantitative label free proteomics analysis of rat spleen.

    Science.gov (United States)

    Dudekula, Khadar; Le Bihan, Thierry

    2016-09-01

    The dataset presented in this work was obtained by a label-free quantitative proteomic analysis of rat spleen. A robust method for the extraction of proteins from rat spleen tissue and LC-MS/MS analysis was developed using a urea- and SDS-based buffer. Different fractionation methods were compared. A total of 3484 different proteins were identified from the pool of all experiments run in this study (2460 proteins with at least two peptides). A total of 1822 proteins were identified from nine non-fractionated pulse gels, while 2288 and 2864 proteins were identified by SDS-PAGE fractionation into three and five fractions, respectively. The proteomics data are deposited in the ProteomeXchange Consortium via PRIDE (PXD003520), and the Progenesis and MaxQuant outputs are presented in the supporting information. The lists of proteins generated under the different fractionation regimes allow the nature of the identified proteins and the variability in quantitative analysis associated with each sampling strategy to be assessed, and allow a proper number of replicates to be defined for future quantitative analyses.

  12. Video digitizer (real time-frame grabber) with region of interest suitable for quantitative data analysis used on the infrared and H alpha cameras installed on the DIII-D experiment

    International Nuclear Information System (INIS)

    Ferguson, S.W.; Kevan, D.K.; Hill, D.N.; Allen, S.L.

    1987-01-01

    This paper describes a CAMAC based video digitizer with region of interest (ROI) capability that was designed for use with the infrared and H alpha cameras installed by Lawrence Livermore Laboratory on the DIII-D experiment at G.A. Technologies in San Diego, California. The video digitizer uses a custom built CAMAC video synchronizer module to clock data into a CAMAC transient recorder on a line-by-line basis starting at the beginning of a field. The number of fields that are recorded is limited only by the available transient recorder memory. In order to conserve memory, the CAMAC video synchronizer module provides for the alternative selection of a specific region of interest in each successive field to be recorded. Memory conservation can be optimized by specifying lines in the field, start time, stop time, and the number of data samples per line. This video frame grabber has proved versatile for capturing video in such diverse applications as recording video fields from a video tape recorder played in slow motion or recording video fields in real time during a DIII-D shot. In other cases, one or more lines of video are recorded per frame to give a cross sectional slice of the plasma. Since all the data in the digitizer memory is synchronized to video fields and lines, the data can be read directly into the control computer in the proper matrix format to facilitate rapid processing, display, and permanent storage

  13. Accurate quantitative XRD phase analysis of cement clinkers

    International Nuclear Information System (INIS)

    Kern, A.

    2002-01-01

    Full text: Knowledge of the absolute phase abundance in cement clinkers is a requirement for both research and quality control. Traditionally, quantitative analysis of cement clinkers has been carried out by theoretical normative calculation from chemical analysis using the so-called Bogue method, or by optical microscopy. Chemical analysis, mostly performed by X-ray fluorescence (XRF), therefore forms the basis of cement plant control by providing information for proportioning raw materials, adjusting kiln and burning conditions, as well as cement mill feed proportioning. In addition, XRF is of the highest importance with respect to the environmentally relevant control of waste recovery raw materials and alternative fuels, as well as filters, plants and sewage. However, the performance of clinkers and cements is governed by the mineralogy and not the elemental composition, and the deficiencies and inherent errors of the Bogue method as well as of microscopic point counting are well known. With XRD and Rietveld analysis, a full quantitative analysis of cement clinkers can be performed, providing detailed mineralogical information about the product. Until recently, several disadvantages prevented the frequent application of the Rietveld method in the cement industry. As the measurement of a full pattern is required, extended measurement times made integration of this method into existing automation environments difficult. In addition, several drawbacks of existing Rietveld software, such as complexity, low performance and severe numerical instability, were prohibitive for automated use. The latest developments in on-line instrumentation, as well as dedicated Rietveld software for quantitative phase analysis (TOPAS), now make a decisive breakthrough possible. TOPAS not only allows the analysis of extremely complex phase mixtures in the shortest time possible, but also fully automated online phase analysis for production control and quality management, free of any human interaction.

  14. Quantitative analysis of myocardial tissue with digital autofluorescence microscopy

    DEFF Research Database (Denmark)

    Jensen, Thomas; Holten-Rossing, Henrik; Svendsen, Ida M H

    2016-01-01

    BACKGROUND: The opportunity offered by whole slide scanners of automated histological analysis implies an ever increasing importance of digital pathology. To go beyond the importance of conventional pathology, however, digital pathology may need a basic histological starting point similar to that of hematoxylin and eosin staining in conventional pathology. This study presents an automated fluorescence-based microscopy approach providing highly detailed morphological data from unstained microsections. These data may provide a basic histological starting point from which further digital analysis, including staining, may benefit. METHODS: This study explores the inherent tissue fluorescence, also known as autofluorescence, as a means to quantitate cardiac tissue components in histological microsections. Data acquisition using a commercially available whole slide scanner and an image-based quantitation algorithm ...

  15. A quantitative analysis of coupled oscillations using mobile accelerometer sensors

    International Nuclear Information System (INIS)

    Castro-Palacio, Juan Carlos; Velázquez-Abad, Luisberis; Giménez, Fernando; Monsoriu, Juan A

    2013-01-01

    In this paper, smartphone acceleration sensors were used to perform a quantitative analysis of mechanical coupled oscillations. Symmetric and asymmetric normal modes were studied separately in the first two experiments. In the third, a coupled oscillation was studied as a combination of the normal modes. Results indicate that acceleration sensors of smartphones, which are very familiar to students, represent valuable measurement instruments for introductory and first-year physics courses. (paper)

  16. A quantitative analysis of coupled oscillations using mobile accelerometer sensors

    Science.gov (United States)

    Castro-Palacio, Juan Carlos; Velázquez-Abad, Luisberis; Giménez, Fernando; Monsoriu, Juan A.

    2013-05-01

    In this paper, smartphone acceleration sensors were used to perform a quantitative analysis of mechanical coupled oscillations. Symmetric and asymmetric normal modes were studied separately in the first two experiments. In the third, a coupled oscillation was studied as a combination of the normal modes. Results indicate that acceleration sensors of smartphones, which are very familiar to students, represent valuable measurement instruments for introductory and first-year physics courses.

  17. Quantitative analysis of the ATV data base, Stage 2

    International Nuclear Information System (INIS)

    Stenquist, C.; Kjellbert, N.A.

    1981-01-01

    A supplementary study of the Swedish ATV data base was carried out. The study was limited to an analysis of the quantitative coverage of component failures from 1979 through 1980. The results indicate that the coverage of component failures is about 75-80 per cent relative to the failure reports and work order sheets at the reactor sites together with SKI's "Safety Related Occurrences". In general there has been an improvement compared to previous years. (Auth.)

  18. Quantitative analysis of culture using millions of digitized books

    OpenAIRE

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva P.; Veres, Adrian; Gray, Matthew K.; Pickett, Joseph P.; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A.; Aiden, Erez Lieberman

    2010-01-01

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of ‘culturomics’, focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pu...

  19. Quantitative Analysis of Culture Using Millions of Digitized Books

    OpenAIRE

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva Presser; Veres, Adrian; Gray, Matthew K.; Google Books Team; Pickett, Joseph; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A.; Aiden, Erez Lieberman

    2011-01-01

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of ‘culturomics,’ focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pu...

  20. ANALYSIS AND PLANNING OF REGIONAL DEVELOPMENT - CONTEXTUAL VARIABLES TO DEVELOP A MODEL FOR MONITORING FINANCIAL INDICATORS AT REGIONAL LEVEL.

    Directory of Open Access Journals (Sweden)

    CRISTINA GRADEA

    2013-12-01

    Application of quantitative techniques in regional analysis can provide an understanding both of the change over time in regional economic performance and of the interdependencies between economic sectors, including the use of projections to test the potential future development of the region. Qualitative techniques also allow the reasons for the regional development patterns occurring in a region to be explained, and improve analysts' ability to reflect on the results and on the economic opportunities of the future, based on the collective experience, wisdom and judgment of the actors in the regional economy.

  1. Quantitative risk analysis of a space shuttle subsystem

    International Nuclear Information System (INIS)

    Frank, M.V.

    1989-01-01

    This paper reports that, in an attempt to investigate methods for risk management other than qualitative analysis techniques, NASA has funded pilot-study quantitative risk analyses for space shuttle subsystems. The authors performed one such study of two shuttle subsystems with McDonnell Douglas Astronautics Company. The subsystems were the auxiliary power units (APU) on the orbiter and the hydraulic power units on the solid rocket booster. The technology and results of the APU study are presented in this paper. Drawing from a rich in-flight database as well as from a wealth of tests and analyses, the study quantitatively assessed the risk of APU-initiated scenarios on the shuttle during all phases of a flight mission. The damage states of interest were loss of crew/vehicle, aborted mission, and launch scrub. A quantitative risk analysis approach to deciding on important items for risk management was contrasted with the current NASA failure mode and effects analysis/critical item list approach.

  2. Analysis of regional climate strategies in the Barents region

    Energy Technology Data Exchange (ETDEWEB)

    Himanen, S.; Inkeroeinen, J.; Latola, K.; Vaisanen, T.; Alasaarela, E.

    2012-11-15

    Climate change is a global phenomenon with especially harsh effects on the Arctic and northern regions. The Arctic's average temperature has risen at almost twice the rate as elsewhere in the past few decades. Since 1966, the Arctic land area covered by snow in early summer has shrunk by almost a fifth. The Barents Region consists of the northern parts of Norway, Sweden, Finland and Russia (i.e. the European part of Russia). Climate change will cause serious impacts in the Barents Region because of its higher density of population living under harsh climatic conditions, thus setting it apart from other Arctic areas. In many cases, economic activities, like tourism, rely on certain weather conditions. For this reason, climate change and adaptation to it is of special urgency for the region. Regional climate change strategies are important tools for addressing mitigation and adaptation to climate change as they can be used to consolidate the efforts of different stakeholders of the public and private sectors. Regional strategies can be important factors in achieving the national and international goals. The study evaluated how the national climate change goals were implemented in the regional and local strategies and programmes in northern Finland. The specific goal was to describe the processes by which the regional strategies were prepared and implemented, and how the work was expanded to include the whole of northern Finland. Finally, the Finnish preparatory processes were compared to case examples of processes for preparing climate change strategies elsewhere in the Barents Region. This analysis provides examples of good practices in preparing a climate change strategy and implementing it. (orig.)

  3. Ratio of slopes method for quantitative analysis in ceramic bodies

    International Nuclear Information System (INIS)

    Zainal Arifin Ahmad; Ahmad Fauzi Mohd Noor; Radzali Othman; Messer, P.F.

    1996-01-01

    A quantitative x-ray diffraction analysis technique developed at the University of Sheffield was adopted, rather than the previously widely used internal standard method, to determine the amounts of the phases present in a reformulated whiteware porcelain and a BaTiO3 electrochemical material. This method, although it still employs an internal standard, was found to be very easy and accurate. The required weight fraction of a phase in the mixture to be analysed is determined from the ratio of the slopes of two linear plots, designated the analysis and reference lines, passing through the origin and fitted using the least squares method.
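
    The core arithmetic of a ratio-of-slopes determination can be sketched as two least-squares lines constrained through the origin, with the weight fraction taken from the ratio of their slopes. The data pairs below are illustrative placeholders; the exact definition of the analysis and reference lines should be taken from the Sheffield procedure cited in the abstract.

```python
import numpy as np

def slope_through_origin(x, y):
    """Least-squares slope of y = m * x constrained to pass through the origin."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    return float(np.sum(x * y) / np.sum(x * x))

# Hypothetical reduced data for the analysis and reference lines
m_analysis = slope_through_origin([0.1, 0.2, 0.3, 0.4], [0.055, 0.11, 0.17, 0.22])
m_reference = slope_through_origin([0.1, 0.2, 0.3, 0.4], [0.11, 0.22, 0.33, 0.44])
weight_fraction = m_analysis / m_reference
print(f"estimated weight fraction = {weight_fraction:.2f}")
```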

  4. QUANTITATIVE ANALYSIS OF FLUX REGULATION THROUGH HIERARCHICAL REGULATION ANALYSIS

    NARCIS (Netherlands)

    van Eunen, Karen; Rossell, Sergio; Bouwman, Jildau; Westerhoff, Hans V.; Bakker, Barbara M.; Jameson, D; Verma, M; Westerhoff, HV

    2011-01-01

    Regulation analysis is a methodology that quantifies to what extent a change in the flux through a metabolic pathway is regulated by either gene expression or metabolism. Two extensions to regulation analysis were developed over the past years: (i) the regulation of V(max) can be dissected into the

  5. Quantitative analysis of flux regulation through hierarchical regulation analysis

    NARCIS (Netherlands)

    Eunen, K. van; Rossell, S.; Bouwman, J.; Westerhoff, H.V.; Bakker, B.M.

    2011-01-01

    Regulation analysis is a methodology that quantifies to what extent a change in the flux through a metabolic pathway is regulated by either gene expression or metabolism. Two extensions to regulation analysis were developed over the past years: (i) the regulation of Vmax can be dissected into the

  6. Novel approach in quantitative analysis of shearography method

    International Nuclear Information System (INIS)

    Wan Saffiey Wan Abdullah

    2002-01-01

    The application of laser interferometry in industrial non-destructive testing and material characterization is becoming more prevalent, since this method provides non-contact, full-field inspection of the test object. However, its application has so far been limited to qualitative analysis; the current trend is to develop the method further by introducing quantitative analysis, which attempts to characterise the examined defect in detail. This is a design requirement for the range of object sizes to be examined. The growing commercial demand for quantitative analysis in NDT and material characterization is driving the quality of optical and analysis instrumentation. However, very little attention is currently being paid to understanding, quantifying and compensating for the numerous error sources inherent in interferometers. This paper presents a comparison of measurement analysis using the established theoretical approach and a new approach that takes into account divergent illumination and other geometrical factors. The differences between the measurement systems can be attributed to these error factors. (Author)

  7. Optimization of Bayesian Emission tomographic reconstruction for region of interest quantitation

    International Nuclear Information System (INIS)

    Qi, Jinyi

    2003-01-01

    Region of interest (ROI) quantitation is an important task in emission tomography (e.g., positron emission tomography and single photon emission computed tomography). It is essential for exploring clinical factors such as tumor activity, growth rate, and the efficacy of therapeutic interventions. Bayesian methods based on the maximum a posteriori principle (also called penalized maximum likelihood methods) have been developed for emission image reconstruction to deal with the low signal-to-noise ratio of the emission data. Similar to the filter cut-off frequency in the filtered backprojection method, the smoothing parameter of the image prior in Bayesian reconstruction controls the resolution and noise trade-off and hence affects ROI quantitation. In this paper we present an approach for choosing the optimum smoothing parameter in Bayesian reconstruction for ROI quantitation. Bayesian reconstructions are difficult to analyze because the resolution and noise properties are nonlinear and object-dependent. Building on recent progress in deriving approximate expressions for the local impulse response function and the covariance matrix, we derived simplified theoretical expressions for the bias, the variance, and the ensemble mean squared error (EMSE) of the ROI quantitation. One problem in evaluating ROI quantitation is that the truth is often required for calculating the bias. This is overcome by using the ensemble distribution of the activity inside the ROI and computing the average EMSE. The resulting expressions allow fast evaluation of the image quality for different smoothing parameters. The optimum smoothing parameter of the image prior can then be selected to minimize the EMSE.
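
    The parameter-selection step can be pictured as a one-dimensional search over the smoothing parameter: compute EMSE(beta) = bias(beta)^2 + variance(beta) from the approximate closed-form expressions and take the minimiser. A hedged sketch with illustrative bias/variance curves (not the paper's expressions):

```python
import numpy as np

def select_smoothing_parameter(betas, bias, variance):
    """Return the prior smoothing parameter minimising the ensemble mean
    squared error EMSE(beta) = bias(beta)**2 + variance(beta) of the ROI
    estimate, with bias and variance precomputed per candidate beta."""
    betas = np.asarray(betas, dtype=float)
    emse = np.asarray(bias, dtype=float) ** 2 + np.asarray(variance, dtype=float)
    return betas[int(np.argmin(emse))], emse

# Illustrative trade-off: bias grows and variance shrinks as smoothing increases
betas = np.logspace(-3, 1, 50)
best_beta, emse = select_smoothing_parameter(betas, bias=0.5 * betas, variance=0.02 / betas)
print(f"optimum beta ~ {best_beta:.3g}")
```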

  8. Quantitative subsurface analysis using frequency modulated thermal wave imaging

    Science.gov (United States)

    Subhani, S. K.; Suresh, B.; Ghali, V. S.

    2018-01-01

    Estimating the depth of a subsurface anomaly with enhanced depth resolution is a challenging task in quantitative thermography. Frequency modulated thermal wave imaging, introduced earlier, provides complete depth scanning of the object by stimulating it with a suitable band of frequencies and then analyzing the resulting thermal response with a suitable post-processing approach to resolve subsurface details. However, the conventional Fourier-transform-based methods used for post-processing unscramble the frequencies with a limited frequency resolution and therefore yield a finite depth resolution. The spectral zooming provided by the chirp z-transform gives an enhanced frequency resolution, which can further improve the depth resolution so that the finest subsurface features can be explored axially. Quantitative depth analysis with this augmented depth resolution is proposed to provide the closest possible estimate of the actual depth of a subsurface anomaly. This manuscript experimentally validates the enhanced depth resolution using non-stationary thermal wave imaging and offers a first solution for quantitative depth estimation in frequency modulated thermal wave imaging.
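
    The spectral zooming idea can be sketched with SciPy's chirp z-transform, which evaluates the spectrum on a dense grid restricted to a chosen band. This assumes SciPy >= 1.8 (where scipy.signal.czt is available); the signal below is a synthetic stand-in for a thermal response, not the authors' data.

```python
import numpy as np
from scipy.signal import czt   # available in SciPy >= 1.8

def zoom_spectrum(x, fs, f1, f2, m=512):
    """Evaluate the spectrum of x only inside [f1, f2] Hz on m points,
    a much finer grid inside the band than a plain FFT of the same record."""
    w = np.exp(-2j * np.pi * (f2 - f1) / (m * fs))   # spacing between evaluated bins
    a = np.exp(2j * np.pi * f1 / fs)                 # starting point on the unit circle
    freqs = f1 + (f2 - f1) * np.arange(m) / m
    return freqs, czt(x, m=m, w=w, a=a)

fs = 100.0
t = np.arange(0, 60, 1 / fs)
x = np.sin(2 * np.pi * 1.00 * t) + np.sin(2 * np.pi * 1.05 * t)   # two close spectral lines
freqs, X = zoom_spectrum(x, fs, f1=0.8, f2=1.3)
print(f"zoomed bin spacing: {freqs[1] - freqs[0]:.4f} Hz")   # far finer than fs / len(x)
```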

  9. Quantitative analysis of patient-specific dosimetric IMRT verification

    International Nuclear Information System (INIS)

    Budgell, G J; Perrin, B A; Mott, J H L; Fairfoul, J; Mackay, R I

    2005-01-01

    Patient-specific dosimetric verification methods for IMRT treatments are variable, time-consuming and frequently qualitative, preventing evidence-based reduction in the amount of verification performed. This paper addresses some of these issues by applying a quantitative analysis parameter to the dosimetric verification procedure. Film measurements in different planes were acquired for a series of ten IMRT prostate patients, analysed using the quantitative parameter, and compared to determine the most suitable verification plane. Film and ion chamber verification results for 61 patients were analysed to determine long-term accuracy, reproducibility and stability of the planning and delivery system. The reproducibility of the measurement and analysis system was also studied. The results show that verification results are strongly dependent on the plane chosen, with the coronal plane particularly insensitive to delivery error. Unexpectedly, no correlation could be found between the levels of error in different verification planes. Longer term verification results showed consistent patterns which suggest that the amount of patient-specific verification can be safely reduced, provided proper caution is exercised: an evidence-based model for such reduction is proposed. It is concluded that dose/distance to agreement (e.g., 3%/3 mm) should be used as a criterion of acceptability. Quantitative parameters calculated for a given criterion of acceptability should be adopted in conjunction with displays that show where discrepancies occur. Planning and delivery systems which cannot meet the required standards of accuracy, reproducibility and stability to reduce verification will not be accepted by the radiotherapy community
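
    The dose/distance-to-agreement criterion mentioned above is commonly evaluated with a gamma-index-style calculation. The brute-force 1D sketch below is a generic illustration (not the authors' software), assuming both dose profiles are sampled on the same spatial grid and using a global 3%/3 mm tolerance:

```python
import numpy as np

def gamma_index_1d(x_mm, dose_ref, dose_eval, dose_tol=0.03, dist_tol_mm=3.0):
    """Brute-force 1D gamma analysis: for every reference point, search all
    evaluated points for the minimum combined dose/distance discrepancy.
    The dose tolerance is taken as a fraction of the reference maximum."""
    x_mm = np.asarray(x_mm, dtype=float)
    dose_ref = np.asarray(dose_ref, dtype=float)
    dose_eval = np.asarray(dose_eval, dtype=float)
    dd = dose_tol * dose_ref.max()
    gamma = np.empty_like(dose_ref)
    for i, (xi, di) in enumerate(zip(x_mm, dose_ref)):
        term = ((x_mm - xi) / dist_tol_mm) ** 2 + ((dose_eval - di) / dd) ** 2
        gamma[i] = np.sqrt(term.min())
    return gamma

x = np.linspace(-50, 50, 201)
planned = np.exp(-(x / 30) ** 2)
measured = 1.01 * np.exp(-((x - 1.0) / 30) ** 2)   # 1 mm shift and 1% scaling
g = gamma_index_1d(x, planned, measured)
print(f"gamma pass rate (gamma <= 1): {np.mean(g <= 1):.1%}")
```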

  10. Multivariate analysis of quantitative traits can effectively classify rapeseed germplasm

    Directory of Open Access Journals (Sweden)

    Jankulovska Mirjana

    2014-01-01

    In this study, the use of different multivariate approaches to classify rapeseed genotypes based on quantitative traits is presented. Tree regression analysis, PCA analysis and two-way cluster analysis were applied in order to describe and understand the extent of genetic variability in spring rapeseed genotype-by-trait data. The traits that most strongly influenced seed and oil yield in rapeseed were successfully identified by the tree regression analysis. The principal predictor for both response variables was the number of pods per plant (NP). NP and 1000-seed weight could help in the selection of high-yielding genotypes. High values for both traits together with high oil content could lead to high oil-yielding genotypes. These traits may serve as indirect selection criteria and can lead to improvement of seed and oil yield in rapeseed. Quantitative traits that explained most of the variability in the studied germplasm were classified using principal component analysis. In this data set, five PCs were identified, of which the first three explained 63% of the total variance. This helped in facilitating the choice of variables on which the clustering of genotypes could be based. The two-way cluster analysis simultaneously clustered genotypes and quantitative traits. The final number of clusters was determined using a bootstrapping technique. This approach provided a clear overview of the variability of the analyzed genotypes. Genotypes that perform similarly with respect to the traits included in this study can be easily detected on the heatmap. Genotypes grouped in clusters 1 and 8 had high values for seed and oil yield and a relatively short vegetative growth period, and those in cluster 9 combined moderate to low values for vegetative growth duration with moderate to high seed and oil yield. These genotypes should be further exploited and implemented in the rapeseed breeding program. The combined application of these multivariate methods ...
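
    A compact sketch of a similar workflow on a hypothetical genotype-by-trait matrix, combining principal component analysis with Ward hierarchical clustering (scikit-learn and SciPy assumed; the data are random placeholders, not the study's germplasm):

```python
import numpy as np
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical genotype-by-trait matrix (rows: genotypes, columns: traits)
rng = np.random.default_rng(1)
traits = rng.normal(size=(40, 10))
traits = (traits - traits.mean(axis=0)) / traits.std(axis=0)   # standardise traits

# Principal components summarising trait variability
pca = PCA(n_components=3)
scores = pca.fit_transform(traits)
print(pca.explained_variance_ratio_.round(2))

# Hierarchical (Ward) clustering of genotypes on the standardised traits
Z = linkage(traits, method="ward")
genotype_clusters = fcluster(Z, t=9, criterion="maxclust")
print(np.bincount(genotype_clusters)[1:])   # cluster sizes
```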

  11. Quantitative analysis of microtubule orientation in interdigitated leaf pavement cells.

    Science.gov (United States)

    Akita, Kae; Higaki, Takumi; Kutsuna, Natsumaro; Hasezawa, Seiichiro

    2015-01-01

    Leaf pavement cells are shaped like jigsaw puzzle pieces in most dicotyledon species. Molecular genetic studies have identified several genes required for pavement cell morphogenesis and have proposed that microtubules play crucial roles in the interdigitation of pavement cells. In this study, we performed a quantitative analysis of cortical microtubule orientation in leaf pavement cells of Arabidopsis thaliana. We captured confocal images of cortical microtubules in cotyledon leaf epidermis expressing GFP-tubulinβ and quantitatively evaluated the microtubule orientations relative to the pavement cell growth axis using original image processing techniques. Our results showed that microtubules kept orientations parallel to the growth axis during pavement cell growth. In addition, we showed that immersion treatment of seed cotyledons in solutions containing tubulin polymerization and depolymerization inhibitors decreased pavement cell complexity. Treatment with oryzalin and colchicine inhibited the symmetric division of guard mother cells.

  12. Risk prediction, safety analysis and quantitative probability methods - a caveat

    International Nuclear Information System (INIS)

    Critchley, O.H.

    1976-01-01

    Views are expressed on the use of quantitative techniques for the determination of value judgements in nuclear safety assessments, hazard evaluation, and risk prediction. Caution is urged when attempts are made to quantify value judgements in the field of nuclear safety. Criteria are given for the meaningful application of reliability methods, but doubts are expressed about their application to safety analysis, risk prediction and design guidance for experimental or prototype plant. Doubts are also expressed about some concomitant methods of population dose evaluation. The complexities of new designs of nuclear power plants make the problem of safety assessment more difficult, but some possible approaches are suggested as alternatives to the quantitative techniques criticized. (U.K.)

  13. Quantitative analysis of macro-ARG using IP system

    International Nuclear Information System (INIS)

    Nakajima, Eiichi; Kawai, Kenji; Furuta, Yoshitake

    1997-01-01

    Recent progress in imaging plate (IP) systems allows us to analyze autoradiographic images quantitatively. In whole-body autoradiography, a method for clarifying the distribution of radioisotopes or labeled compounds in the tissues and organs of a freeze-dried whole-body section of small animals such as rats and mice, the sections are pressed against an IP for exposure; the IP is then scanned by a Bio-Imaging Analyzer (Fuji Photo Film Co., Ltd) and a digital autoradiographic image is obtained. Quantitative data concerning the activity in different tissues can be obtained using an isotope scale as a reference source. The fading effect, the application of the IP system to the distribution of receptor-binding ARG, the analysis of radio-spots on TLC, and radioactivity concentrations in liquids such as blood are also discussed. (author)

  14. What Really Happens in Quantitative Group Research? Results of a Content Analysis of Recent Quantitative Research in "JSGW"

    Science.gov (United States)

    Boyle, Lauren H.; Whittaker, Tiffany A.; Eyal, Maytal; McCarthy, Christopher J.

    2017-01-01

    The authors conducted a content analysis on quantitative studies published in "The Journal for Specialists in Group Work" ("JSGW") between 2012 and 2015. This brief report provides a general overview of the current practices of quantitative group research in counseling. The following study characteristics are reported and…

  15. Regional quantitative noninvasive assessment of cerebral perfusion and function with N-Isopropyl-[123I]p-iodoamphetamine

    International Nuclear Information System (INIS)

    von Schulthess, G.K.; Ketz, E.; Schubiger, P.A.; Bekier, A.

    1985-01-01

    Although several reports on the clinical usefulness of N-isopropyl-[123I]p-iodoamphetamine (IMP) in the diagnosis of cerebral disease have appeared in the literature, quantitative, noninvasive measurement of regional cerebral blood flow with this method poses difficulties because cerebral IMP uptake depends not only on cerebral perfusion but also on cerebral function. Rather than trying to develop a method to measure cerebral perfusion with IMP, the authors have chosen to test a method to quantitatively evaluate planar and emission computed tomographic (ECT) studies by comparing the data obtained in patients with established pathology with the data obtained in a group of normal individuals. Using this method, absolute cerebral IMP uptake (counts/pixel/mCi/min) and planar anterior right-left ratios were obtained. Also measured were right-left ratios obtained from 12 paired regions in three ECT slices. The evaluation of a patient's cerebral IMP uptake asymmetries relative to the normal standard values is a useful adjunct to qualitative image analysis in assessing the presence and severity of disease, as qualitative analysis is prone to false-positive and false-negative results. Cerebral IMP uptake as measured in counts/pixel/mCi/min is abnormal only in severe cerebral disease and is therefore generally a less helpful parameter.

  16. Scientific aspects of urolithiasis: quantitative stone analysis and crystallization experiments

    International Nuclear Information System (INIS)

    Wandt, M.A.E.

    1986-03-01

    The theory, development and results of three quantitative analytical procedures are described, and crystallization experiments in a rotary evaporator are presented. Of the different methods of quantitative X-ray powder diffraction analysis, the 'internal standard method' and a microanalytical technique were identified as the two most useful procedures for the quantitative analysis of urinary calculi. 'Reference intensity ratios' for 6 major stone phases were determined and were used in the analysis of 20 calculi by the 'internal standard method'. Inductively coupled plasma atomic emission spectroscopic (ICP-AES) methods were also investigated, developed and used in this study. Various procedures for the digestion of calculi were tested, and a mixture of HNO3 and HClO4 was eventually found to be the most successful. The major elements Ca, Mg, and P in 41 calculi were determined. For the determination of trace elements, a new microwave-assisted digestion procedure was developed and used for the digestion of 100 calculi. Fluoride concentrations in two stone collections were determined using a fluoride-ion-sensitive electrode and the HNO3/HClO4 digestion procedure used for the ICP study. A series of crystallization experiments involving a standard reference artificial urine was carried out in a rotary evaporator. The effect of pH and urine composition was studied by varying the former and by including uric acid, urea, creatinine, MgO, methylene blue, chondroitin sulphate A, and fluoride in the reference solution. Crystals formed in these experiments were subjected to qualitative and semi-quantitative X-ray powder diffraction analyses. Scanning electron microscopy of several deposits was also carried out. Deposits similar to those observed in calculi were obtained with the fast evaporator. The results presented suggest that this system provides a simple, yet very useful means for studying the crystallization characteristics of urine solutions.

  17. Quantitative risk analysis as a basis for emergency planning

    Energy Technology Data Exchange (ETDEWEB)

    Yogui, Regiane Tiemi Teruya [Bureau Veritas do Brasil, Rio de Janeiro, RJ (Brazil); Macedo, Eduardo Soares de [Instituto de Pesquisas Tecnologicas (IPT), Sao Paulo, SP (Brazil)

    2009-07-01

    Several environmental accidents happened in Brazil and around the world during the 1970s and 1980s. This strongly motivated emergency preparedness in the chemical and petrochemical industries. Environmental accidents affect the environment and the communities neighbouring the industrial facilities. The present study aims to support and guide the development of emergency planning based on the data obtained from Quantitative Risk Analysis, elaborated according to Technical Standard P4.261/03 from CETESB (Sao Paulo Environmental Agency). It was observed during the research that the data generated in these studies need to be complemented and analysed in greater depth so that they can be used in emergency plans. The main issues analysed and discussed in this study were the re-evaluation of hazard identification for emergency plans, consequence and vulnerability analysis for response planning, risk communication, and the preparation of communities exposed to manageable risks to respond to emergencies. As a result, the study intends to improve the interpretation and use of the data deriving from Quantitative Risk Analysis to develop emergency plans. (author)

  18. Quantitative Holocene climatic reconstructions for the lower Yangtze region of China

    Science.gov (United States)

    Li, Jianyong; Dodson, John; Yan, Hong; Wang, Weiming; Innes, James B.; Zong, Yongqiang; Zhang, Xiaojian; Xu, Qinghai; Ni, Jian; Lu, Fengyan

    2018-02-01

    Quantitative proxy-based and high-resolution palaeoclimatic datasets are scarce for the lower reaches of the Yangtze River (LYR) basin. This region lies in a climatologically sensitive transitional vegetation zone and, as a birthplace of prehistoric civilization in China, it is important for understanding how palaeoclimatic dynamics affected cultural development there. We present a pollen-based and regionally-averaged Holocene climatic twin-dataset for mean total annual precipitation (PANN) and mean annual temperature (TANN) covering the last 10,000 years for the LYR region. This is based on the technique of weighted averaging-partial least squares regression to establish robust calibration models for obtaining reliable climatic inferences. The pollen-based reconstructions generally show an early Holocene climatic optimum with both abundant monsoonal rainfall and warm thermal conditions, and a declining pattern of both PANN and TANN values in the middle to late Holocene. The main driving forces behind the Holocene climatic changes in the LYR area are likely summer solar insolation associated with tropical or subtropical macro-scale climatic circulations such as the Intertropical Convergence Zone (ITCZ), Western Pacific Subtropical High (WPSH), and El Niño/Southern Oscillation (ENSO). Regional multi-proxy comparisons indicate that the Holocene variations in precipitation and temperature for the LYR region display an in-phase relationship with other related proxy records from southern monsoonal China and the Indian monsoon-influenced regions, but are inconsistent with the Holocene moisture or temperature records from northern monsoonal China and the westerly-dominated region in northwestern China. Overall, our comprehensive palaeoclimatic dataset and models may be significant tools for understanding the Holocene Asian monsoonal evolution and for anticipating its future dynamics in eastern Asia.

  19. Fluorescent foci quantitation for high-throughput analysis

    Directory of Open Access Journals (Sweden)

    Elena Ledesma-Fernández

    2015-06-01

    A number of cellular proteins localize to discrete foci within cells, for example DNA repair proteins, microtubule organizing centers, P bodies or kinetochores. It is often possible to measure the fluorescence emission from tagged proteins within these foci as a surrogate for the concentration of that specific protein. We wished to develop tools that would allow quantitation of fluorescence foci intensities in high-throughput studies. As proof of principle we have examined the kinetochore, a large multi-subunit complex that is critical for the accurate segregation of chromosomes during cell division. Kinetochore perturbations lead to aneuploidy, which is a hallmark of cancer cells. Hence, understanding kinetochore homeostasis and regulation is important for a global understanding of cell division and genome integrity. The 16 budding yeast kinetochores colocalize within the nucleus to form a single focus. Here we have created a set of freely available tools to allow high-throughput quantitation of kinetochore foci fluorescence. We use this 'FociQuant' tool to compare methods of kinetochore quantitation and we show proof of principle that FociQuant can be used to identify changes in kinetochore protein levels in a mutant that affects kinetochore function. This analysis can be applied to any protein that forms discrete foci in cells.

  20. Quantitative x-ray fractographic analysis of fatigue fractures

    International Nuclear Information System (INIS)

    Saprykin, Yu.V.

    1983-01-01

    The study deals with a quantitative X-ray fractographic investigation of fatigue fractures of sharply notched samples tested at various stresses and temperatures, with the purpose of establishing a connection between material crack-resistance parameters and the local plastic instability zones that restrain and control crack growth. On fatigue fractures of notched Kh18N9T steel samples tested at +20 and -196 deg C, a zone of sharp ring-notch effect is singled out, analogous to the zone in which the crack growth rate is controlled by microshifting mechanisms. The size of the notch-effect zone in the investigated steel is unambiguously related to the stress amplitude. This makes it possible to determine the stress value from the results of quantitative fractographic analysis of notched sample fractures. The possibility of determining one of the threshold values of cyclic fracture toughness of the material from the results of fatigue testing and fractography of notched sample fractures is shown. A correlation has been found between the size of the h(s) crack-effect zone in the notched sample, the material yield limit δ, and the cyclic fracture toughness characteristic K(s). Such a correlation widens the possibilities of quantitative diagnostics of fractures by the methods of X-ray fractography.

  1. The cost of electricity distribution in Italy: a quantitative analysis

    International Nuclear Information System (INIS)

    Scarpa, C.

    1998-01-01

    This paper presents a quantitative analysis of the cost of medium- and low-voltage electricity distribution in Italy. An econometric analysis of the cost function is proposed, on the basis of data on 147 zones of the dominant firm, ENEL. Data are available only for 1996, which forced us to carry out only a cross-section OLS analysis. The econometric estimate shows the existence of significant scale economies that the current organisational structure does not exploit. On this basis it is also possible to check to what extent exogenous cost drivers affect costs. The role of the numerous exogenous factors considered seems, however, quite limited. The area of the distribution zone and an indicator of quality are the only elements that appear significant from an economic viewpoint.
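
    A hedged sketch of a cross-section OLS estimation of a log-log cost function, in the spirit of the analysis described above; the variable names and simulated data are illustrative, not ENEL's zone data (statsmodels assumed). An output elasticity below one is the usual indication of scale economies.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated cross-section of distribution zones; variable names are illustrative
rng = np.random.default_rng(2)
n = 147
output = rng.lognormal(mean=5.0, sigma=0.6, size=n)       # energy delivered
area = rng.lognormal(mean=3.0, sigma=0.5, size=n)         # zone area
quality = rng.normal(size=n)                               # quality indicator
cost = np.exp(1.0 + 0.8 * np.log(output) + 0.1 * np.log(area) + 0.05 * quality
              + rng.normal(scale=0.1, size=n))
df = pd.DataFrame(dict(cost=cost, output=output, area=area, quality=quality))

# Log-log cost function estimated by OLS on the cross-section
fit = smf.ols("np.log(cost) ~ np.log(output) + np.log(area) + quality", data=df).fit()
print(fit.params["np.log(output)"])   # output elasticity < 1 suggests scale economies
```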

  2. Quantitative multiphase analysis of archaeological bronzes by neutron diffraction

    CERN Document Server

    Siano, S; Celli, M; Pini, R; Salimbeni, R; Zoppi, M; Kockelmann, W A; Iozzo, M; Miccio, M; Moze, O

    2002-01-01

    In this paper, we report the first investigation of the potential of neutron diffraction to characterize archaeological bronze artifacts. The feasibility of phase and structural analysis was first demonstrated on standardised specimens with a typical bronze alloy composition. These were produced through different hardening and annealing cycles, simulating possible ancient working techniques. The resulting Bragg peak widths were strictly dependent on the working treatment, thus providing an important analytical element with which to investigate ancient making techniques. The diagnostic criteria developed on the standardised specimens were then applied to study two Etruscan museum pieces. Quantitative multiphase analysis by Rietveld refinement of the diffraction patterns was successfully demonstrated. Furthermore, the analysis of patterns associated with different elements of the artifacts also yielded evidence of some promising prospects for neutron diffraction diagnostics in archaeometric applications. (orig.)

  3. Quantitative surface analysis using deuteron-induced nuclear reactions

    International Nuclear Information System (INIS)

    Afarideh, Hossein

    1991-01-01

    The nuclear reaction analysis (NRA) technique consists of looking at the energies of the reaction products, which uniquely define the particular elements present in the sample, and of analysing the yield/energy distribution to reveal depth profiles. A summary of the basic features of the nuclear reaction analysis technique is given; particular emphasis is placed on quantitative light element determination using (d,p) and (d,alpha) reactions. The experimental apparatus is also described. Finally, a set of (d,p) spectra for the elements Z=3 to Z=17, obtained using 2 MeV incident deuterons, is included, together with examples of further applications of the (d,alpha) spectra. (author)

  4. Program for the quantitative and qualitative analysis of

    International Nuclear Information System (INIS)

    Tepelea, V.; Purice, E.; Dan, R.; Calcev, G.; Domnisan, M.; Galis, V.; Teodosiu, G.; Debert, C.; Mocanu, N.; Nastase, M.

    1985-01-01

    A computer code for processing data from neutron activation analysis is described. The code is capable of qualitative and quantitative analysis of regular spectra from neutron-irradiated samples measured by a Ge(Li) detector. Multichannel analysers with 1024 channels, such as the TN 1705 or the Romanian-made MCA 79, and an ITC interface can be used. The code is implemented on FELIX M118 and FELIX M216 microcomputers. Spectrum processing is performed off line, after storing the data on a floppy disk. The background is assumed to be a polynomial of first, second or third degree. Qualitative analysis is performed by recursive least-squares Gaussian curve fitting. The elements are identified using a polynomial relation between energy and channel, obtained by calibration with a standard sample.
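
    The energy-channel calibration step can be sketched as a simple polynomial fit; the channel/energy pairs below are illustrative placeholders, not the code's calibration data.

```python
import numpy as np

# Calibration pairs from a standard sample: channel number -> known energy (keV)
channels = np.array([100.0, 400.0, 550.0, 700.0, 930.0])    # illustrative values
energies = np.array([121.8, 497.0, 661.7, 834.8, 1115.5])   # illustrative values

# First-degree polynomial relation between energy and channel
coeffs = np.polyfit(channels, energies, deg=1)
energy_of = np.poly1d(coeffs)
print(f"channel 750 -> {energy_of(750):.1f} keV")
```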

  5. Quantitative analysis of light elements in aerosol samples by PIGE

    International Nuclear Information System (INIS)

    Mateus, R.; Reis, M.A.; Jesus, A.P.; Ribeiro, J.P.

    2006-01-01

    Quantitative PIGE analysis of aerosol samples collected on Nuclepore polycarbonate filters was performed by a method that avoids the use of comparative standards. Nuclear cross sections and calibration parameters established previously in an extensive work on thick and intermediate samples were employed. For those samples, the excitation functions of nuclear reactions induced by the incident protons on the target's light elements were used as input for a code that evaluates the gamma-ray yield by integrating along the depth of the sample. In the present work we apply the same code to validate the use of an effective energy for thin-sample analysis. Results pertaining to boron, fluorine and sodium concentrations are presented. In order to establish a correlation with the sodium values, PIXE results related to chlorine are also presented, giving support to the reliability of this PIGE method for thin-film analysis.

  6. Some selected quantitative methods of thermal image analysis in Matlab.

    Science.gov (United States)

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images, and shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods, for the area of the skin of a human foot and of a face. The full source code of the developed application is also provided as an attachment.

  7. Quantitative mass spectrometric analysis of glycoproteins combined with enrichment methods.

    Science.gov (United States)

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive and high-throughput analysis of the enriched glycoproteome, both for quantitative assays and for qualitative profiling of glycoproteins. Because it has been widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of certain diseases, the development of efficient analysis tools for aberrant glycoproteins is very important for a deep understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes glycosylation-targeted enrichment technologies, mainly employing solid-phase extraction methods such as hydrazide capture, lectin-specific capture, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with these enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with reference to recently published studies.

  8. Quantitative analysis method for niobium in lead zirconate titanate

    International Nuclear Information System (INIS)

    Hara, Hideo; Hashimoto, Toshio

    1986-01-01

    Lead zirconate titanate (PZT) is a ferroelectric ceramic with piezoelectric and pyroelectric properties, and is the most widely used piezoelectric material. It is also a main component of lead lanthanum zirconate titanate (PLZT), which is a typical electro-optical conversion element. Since these materials were developed, various electronic parts utilizing their piezoelectric characteristics have been put into practical use. The characteristics can be adjusted by changing the composition of PZT and the kinds and amounts of additives. Among the additives, niobium acts to create metal-ion vacancies in the crystals; the formation of these vacancies eases the movement of domain walls within the crystal grains and increases the resistivity. Accordingly, it is necessary to determine the niobium content accurately for research and development, quality control and process control. The quantitative analysis methods for niobium used so far each have their drawbacks, so the authors examined the quantitative analysis of niobium in PZT using an inductively coupled plasma emission spectrometer, which has developed remarkably in recent years. As a result, a method was established in which a specimen is dissolved with hydrochloric acid and hydrofluoric acid, unstable lead is masked with disodium ethylenediaminetetraacetate (EDTA), and fluoride ions are masked with boric acid. The apparatus, reagents, experiment and results are reported. (Kako, I.)

  9. Correlative SEM SERS for quantitative analysis of dimer nanoparticles.

    Science.gov (United States)

    Timmermans, F J; Lenferink, A T M; van Wolferen, H A G M; Otto, C

    2016-11-14

    A Raman microscope integrated with a scanning electron microscope was used to investigate plasmonic structures by correlative SEM-SERS analysis. The integrated Raman-SEM microscope combines high-resolution electron microscopy information with SERS signal enhancement from selected nanostructures carrying adsorbed Raman reporter molecules. Correlative analysis was performed for dimers of two gold nanospheres, selected on the basis of SEM images of multi-aggregate samples. The effects of the orientation of the dimer with respect to the polarization state of the laser light and of the particle gap size on the Raman signal intensity were observed. Additionally, calculations were performed to simulate the electric near-field enhancement. These simulations were based on the morphologies observed by electron microscopy. In this way the experiments were compared with the enhancement factor calculated from the near-field simulations and were subsequently used to quantify the SERS enhancement factor. Large differences between experimentally observed and calculated enhancement factors are regularly detected, a phenomenon caused by nanoscale differences between the real and 'simplified' simulated structures. Quantitative SERS experiments reveal the structure-induced enhancement factor, ranging from ∼200 to ∼20 000, averaged over the full nanostructure surface. The results demonstrate correlative Raman-SEM microscopy for the quantitative analysis of plasmonic particles and structures, thus enabling a new analytical method in the field of SERS and plasmonics.

  10. Quantitative imaging analysis of posterior fossa ependymoma location in children.

    Science.gov (United States)

    Sabin, Noah D; Merchant, Thomas E; Li, Xingyu; Li, Yimei; Klimo, Paul; Boop, Frederick A; Ellison, David W; Ogg, Robert J

    2016-08-01

    Imaging descriptions of posterior fossa ependymoma in children have focused on magnetic resonance imaging (MRI) signal and local anatomic relationships, with imaging location only recently used to classify these neoplasms. We developed a quantitative method for analyzing the location of ependymoma in the posterior fossa, tested its effectiveness in distinguishing groups of tumors, and examined potential associations of distinct tumor groups with treatment and prognostic factors. Pre-operative MRI examinations of the brain for 38 children with histopathologically proven posterior fossa ependymoma were analyzed. Tumor margin contours and anatomic landmarks were manually marked and used to calculate the centroid of each tumor. Landmarks were used to calculate a transformation to align, scale, and rotate each patient's image coordinates to a common coordinate space. Hierarchical cluster analysis of the location and morphological variables was performed to detect multivariate patterns in tumor characteristics. The ependymomas were also characterized as "central" or "lateral" based on published radiological criteria. Therapeutic details and demographic, recurrence, and survival information were obtained from medical records and analyzed together with the tumor location and morphology to identify prognostic tumor characteristics. Cluster analysis yielded two distinct tumor groups based on centroid location. The cluster groups were associated with differences in PFS (p = .044), with the "central" vs. "lateral" radiological designation (p = .035), and marginally associated with multiple operative interventions (p = .064). Posterior fossa ependymoma can be objectively classified based on quantitative analysis of tumor location, and these classifications are associated with prognostic and treatment factors.

  11. Application of harmonic analysis in quantitative heart scintigraphy

    International Nuclear Information System (INIS)

    Fischer, P.; Knopp, R.; Breuel, H.P.

    1979-01-01

    Quantitative scintigraphy of the heart after equilibrium distribution of a radioactive tracer permits the measurement of time-activity curves in the left ventricle during a representative heart cycle with great statistical accuracy. By applying Fourier analysis, additional criteria can be obtained for evaluating the volume curve as a whole. The entire information contained in the volume curve is thus completely described by its Fourier spectrum. Resynthesis after Fourier transformation appears to be an ideal method of smoothing because it converges to the minimum quadratic error for the type of function concerned. (orig./MG) [de
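
    A minimal sketch of Fourier resynthesis smoothing of a left-ventricular time-activity (volume) curve is shown below; the 32-frame toy curve and the choice of four harmonics are assumptions for illustration, not values from the paper.

```python
import numpy as np

def fourier_resynthesis(curve, n_harmonics=4):
    """Smooth a cyclic time-activity curve by keeping only the DC term and the
    first n_harmonics Fourier components, then resynthesizing the curve."""
    coeffs = np.fft.rfft(curve)
    coeffs[n_harmonics + 1:] = 0.0          # discard higher harmonics
    return np.fft.irfft(coeffs, n=len(curve))

# Toy volume curve: one cardiac cycle sampled at 32 frames, with counting noise.
t = np.linspace(0.0, 1.0, 32, endpoint=False)
volume = 100 - 40 * np.sin(np.pi * t) ** 2 + np.random.normal(0, 3, t.size)
smoothed = fourier_resynthesis(volume, n_harmonics=4)
```

    Truncating the spectrum in this way gives the least-squares optimal approximation among curves built from the retained harmonics, which is the sense in which the resynthesis converges in the minimum quadratic error.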

  12. Quantitative analysis of tritium distribution in austenitic stainless steels welds

    International Nuclear Information System (INIS)

    Roustila, A.; Kuromoto, N.; Brass, A.M.; Chene, J.

    1994-01-01

    Tritium autoradiography was used to study the tritium distribution in laser and arc (TIG) weldments performed on tritiated AISI 316 samples. Quantitative values of the local tritium concentration were obtained from the microdensitometric analysis of the autoradiographs. This procedure was used to map the tritium concentration in the samples before and after laser and TIG treatments. The effect of the detritiation conditions and of welding on the tritium distribution in the material is extensively characterized. The results illustrate the interest of the technique for predicting a possible embrittlement of the material associated with a local enhancement of the tritium concentration and the presence of helium 3 generated by tritium decay. ((orig.))

  13. Quantitative x-ray fluorescent analysis using fundamental parameters

    International Nuclear Information System (INIS)

    Sparks, C.J. Jr.

    1976-01-01

    A monochromatic source of x-rays for sample excitation permits the use of pure elemental standards and relatively simple calculations to convert the measured fluorescent intensities to an absolute basis of weight per unit weight of sample. Only the mass absorption coefficients of the sample for the exciting and the fluorescent radiation need be determined. Besides the direct measurement of these absorption coefficients in the sample, other techniques are considered which require fewer sample manipulations and measurements. These fundamental parameters methods permit quantitative analysis without recourse to the time-consuming process of preparing nearly identical standards
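
    For a thick specimen excited by monochromatic radiation, the fundamental-parameter relation reduces to the intensity ratio against a pure-element standard scaled by the total mass attenuation terms of sample and standard. The sketch below is a simplified illustration under assumed 45° incidence and take-off angles and made-up coefficients; enhancement (secondary fluorescence) effects are ignored.

```python
import numpy as np

def concentration_fp(i_sample, i_pure,
                     mu_sample_exc, mu_sample_fluo,
                     mu_pure_exc, mu_pure_fluo,
                     psi_in_deg=45.0, psi_out_deg=45.0):
    """Weight fraction of an element from the ratio of its fluorescent intensity
    in the sample to that of a pure-element standard (thick sample, monochromatic
    excitation). mu_* are mass absorption coefficients (cm^2/g) at the exciting
    (exc) and fluorescent (fluo) energies."""
    s1 = np.sin(np.radians(psi_in_deg))     # incidence geometry
    s2 = np.sin(np.radians(psi_out_deg))    # take-off geometry
    chi_sample = mu_sample_exc / s1 + mu_sample_fluo / s2
    chi_pure = mu_pure_exc / s1 + mu_pure_fluo / s2
    return (i_sample / i_pure) * (chi_sample / chi_pure)

# Toy numbers only; real coefficients come from tabulations or direct measurement.
c_fe = concentration_fp(1.2e4, 9.5e4, 55.0, 70.0, 300.0, 72.0)
print(c_fe)
```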

  14. Quantitative analysis with energy dispersive X-ray fluorescence analyser

    International Nuclear Information System (INIS)

    Kataria, S.K.; Kapoor, S.S.; Lal, M.; Rao, B.V.N.

    1977-01-01

    Quantitative analysis of samples using a radioisotope-excited energy dispersive x-ray fluorescence system is described. The complete set-up is built around a locally made Si(Li) detector x-ray spectrometer with an energy resolution of 220 eV at 5.94 keV. The photopeaks observed in the x-ray fluorescence spectra are fitted with a Gaussian function and the intensities of the characteristic x-ray lines are extracted, which in turn are used for calculating the elemental concentrations. The results for a few typical cases are presented. (author)
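
    Extracting a characteristic-line intensity by fitting a Gaussian photopeak on a linear background might look like the following sketch; scipy's curve_fit, the synthetic 5.9 keV peak, and a detector sigma of about 0.09 keV (consistent with 220 eV FWHM) are assumptions made for the example.

```python
import numpy as np
from scipy.optimize import curve_fit

def gauss_with_background(x, area, centroid, sigma, b0, b1):
    """Gaussian photopeak on a linear background; 'area' is the net peak counts."""
    peak = area / (sigma * np.sqrt(2 * np.pi)) * np.exp(-(x - centroid) ** 2 / (2 * sigma ** 2))
    return peak + b0 + b1 * x

# Toy spectrum segment around a 5.9 keV line (channel energies in keV).
x = np.linspace(5.5, 6.3, 80)
y = gauss_with_background(x, 5000, 5.94, 0.093, 40, -2) + np.random.normal(0, 8, x.size)

p0 = [y.sum() * (x[1] - x[0]), 5.9, 0.1, y.min(), 0.0]   # rough initial guesses
popt, pcov = curve_fit(gauss_with_background, x, y, p0=p0)
net_counts = popt[0]   # characteristic-line intensity used for quantification
```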

  15. Computer compensation for NMR quantitative analysis of trace components

    International Nuclear Information System (INIS)

    Nakayama, T.; Fujiwara, Y.

    1981-01-01

    A computer program has been written that determines trace components and separates overlapping components in multicomponent NMR spectra. This program uses the Lorentzian curve as a theoretical curve of NMR spectra. The coefficients of the Lorentzian are determined by the method of least squares. Systematic errors such as baseline/phase distortion are compensated and random errors are smoothed by taking moving averages, so that these processes contribute substantially to decreasing the accumulation time of spectral data. The accuracy of quantitative analysis of trace components has been improved by two significant figures. This program was applied to determining the abundance of 13C and the saponification degree of PVA
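
    A minimal sketch of the same idea, fitting overlapping Lorentzian lines by least squares after moving-average smoothing, is given below; the two-line toy spectrum, the five-point window and the scipy-based fit are illustrative assumptions rather than the program described in the record.

```python
import numpy as np
from scipy.optimize import curve_fit

def lorentzian(x, amplitude, center, hwhm):
    """Single Lorentzian line; hwhm is the half width at half maximum."""
    return amplitude * hwhm ** 2 / ((x - center) ** 2 + hwhm ** 2)

def two_lines(x, a1, c1, w1, a2, c2, w2):
    """Two overlapping Lorentzian components (major line + trace line)."""
    return lorentzian(x, a1, c1, w1) + lorentzian(x, a2, c2, w2)

# Toy spectrum: a strong line with a weak, partially overlapping neighbour.
x = np.linspace(-5, 5, 400)
y = two_lines(x, 10.0, 0.0, 0.3, 0.2, 0.9, 0.3) + np.random.normal(0, 0.02, x.size)

y_smooth = np.convolve(y, np.ones(5) / 5, mode="same")     # moving average
popt, _ = curve_fit(two_lines, x, y_smooth, p0=[8, 0, 0.5, 0.5, 1, 0.5])
trace_area = np.pi * popt[3] * popt[5]   # integrated area of the weak component
```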

  16. Stochastic filtering of quantitative data from STR DNA analysis

    DEFF Research Database (Denmark)

    Tvedebrink, Torben; Eriksen, Poul Svante; Mogensen, Helle Smidt

    The quantitative data observed from analysing STR DNA is a mixture of contributions from various sources. Apart from the true allelic peaks, the observed signal consists of at least three components resulting from the measurement technique and the PCR amplification: background noise (random noise due to the apparatus used for measurements), pull-up effects (a more systematic increase caused by overlap in the spectrum), and stutters (peaks located four basepairs before the true peak). We present filtering techniques for all three technical artifacts based on statistical analysis of data from controlled experiments conducted at The Section of Forensic Genetics, Department of Forensic Medicine, Faculty of Health Sciences, University of Copenhagen, Denmark.

  17. Quantitative sacroiliac scintigraphy. The effect of method of selection of region of interest

    International Nuclear Information System (INIS)

    Davis, M.C.; Turner, D.A.; Charters, J.R.; Golden, H.E.; Ali, A.; Fordham, E.W.

    1984-01-01

    Various authors have advocated quantitative methods of evaluating bone scintigrams to detect sacroiliitis, while others have not found them useful. Many explanations for this disagreement have been offered, including differences in the method of case selection, ethnicity, gender, and previous drug therapy. It would appear that one of the most important impediments to consistent results is the variability of selecting sacroiliac joint and reference regions of interest (ROIs). The effect of ROI selection would seem particularly important because of the normal variability of radioactivity within the reference regions that have been used (sacrum, spine, iliac wing) and the inhomogeneity of activity in the SI joints. We have investigated the effect of ROI selection, using five different methods representative of, though not necessarily identical to, those found in the literature. Each method produced unique mean indices that were different for patients with ankylosing spondylitis (AS) and controls. The method of Ayres (19) proved superior (largest mean difference, smallest variance), but none worked well as a diagnostic tool because of substantial overlap of the distributions of indices of patient and control groups. We conclude that ROI selection is important in determining results, and quantitative scintigraphic methods in general are not effective tools for diagnosing AS. Among the possible factors limiting success, difficulty in selecting a stable reference area seems of particular importance

  18. Quantitative analysis of the epitaxial recrystallization effect induced by swift heavy ions in silicon carbide

    International Nuclear Information System (INIS)

    Benyagoub, A.

    2015-01-01

    This paper discusses recent results on the recrystallization effect induced by swift heavy ions (SHI) in pre-damaged silicon carbide. The recrystallization kinetics was followed by using increasing SHI fluences and by starting from different levels of initial damage within the SiC samples. The quantitative analysis of the data shows that the recrystallization rate depends drastically on the local amount of crystalline material: it is nil in fully amorphous regions and becomes more significant with increasing amount of crystalline material. For instance, in samples initially nearly half-disordered, the recrystallization rate per incident ion is found to be 3 orders of magnitude higher than what is observed with the well-known IBIEC process using low energy ions. This high rate can therefore not be accounted for by the existing IBIEC models. Moreover, decreasing the electronic energy loss leads to a drastic reduction of the recrystallization rate. A comprehensive quantitative analysis of all the experimental results shows that the SHI induced high recrystallization rate can only be explained by a mechanism based on the melting of the amorphous zones through a thermal spike process followed by an epitaxial recrystallization initiated from the neighboring crystalline regions if the size of the latter exceeds a certain critical value. This quantitative analysis also reveals that recent molecular dynamics calculations supposed to reproduce this phenomenon are wrong since they overestimated the recrystallization rate by a factor of ∼40.

  19. Absorption correction factor in X-ray fluorescent quantitative analysis

    International Nuclear Information System (INIS)

    Pimjun, S.

    1994-01-01

    An experiment on the absorption correction factor in X-ray fluorescence quantitative analysis was carried out. Standard samples were prepared from a mixture of Fe2O3 and tapioca flour at various concentrations of Fe2O3 ranging from 5% to 25%. Unknown samples were kaolin containing 3.5% to 50% of Fe2O3. Kaolin samples were diluted with tapioca flour in order to reduce the absorption of FeKα and make them easier to prepare. Pressed samples with 0.150 /cm2 and 2.76 cm in diameter were used in the experiment. The absorption correction factor is related to the total mass absorption coefficient (χ), which varies with sample composition. In a known sample, χ can be calculated conveniently from the formula. However, in an unknown sample, χ can be determined by the emission-transmission method. It was found that the relationship between the corrected FeKα intensity and the Fe2O3 content in these samples was linear. This result indicates that the correction factor can be used to improve the accuracy of the measured X-ray intensity. Therefore, this correction factor is essential in the quantitative analysis of the elements present in any sample by the X-ray fluorescence technique

  20. Developments in Dynamic Analysis for quantitative PIXE true elemental imaging

    International Nuclear Information System (INIS)

    Ryan, C.G.

    2001-01-01

    Dynamic Analysis (DA) is a method for projecting quantitative major and trace element images from PIXE event data-streams (off-line or on-line) obtained using the Nuclear Microprobe. The method separates full elemental spectral signatures to produce images that strongly reject artifacts due to overlapping elements, detector effects (such as escape peaks and tailing) and background. The images are also quantitative, stored in ppm-charge units, enabling images to be directly interrogated for the concentrations of all elements in areas of the images. Recent advances in the method include the correction for changing X-ray yields due to varying sample compositions across the image area and the construction of statistical variance images. The resulting accuracy of major element concentrations extracted directly from these images is better than 3% relative as determined from comparisons with electron microprobe point analysis. These results are complemented by error estimates derived from the variance images together with detection limits. This paper provides an update of research on these issues, introduces new software designed to make DA more accessible, and illustrates the application of the method to selected geological problems.

  1. Quantitative XPS analysis of high Tc superconductor surfaces

    International Nuclear Information System (INIS)

    Jablonski, A.; Sanada, N.; Suzuki, Y.; Fukuda, Y.; Nagoshi, M.

    1993-01-01

    The procedure of quantitative XPS analysis involving relative sensitivity factors is most convenient to apply to high Tc superconductor surfaces because this procedure does not require standards. However, a considerable limitation of such an approach is its relatively low accuracy. In the present work, a proposition is made to use for this purpose a modification of the relative sensitivity factor approach accounting for the matrix and the instrumental effects. The accuracy of this modification when applied to binary metal alloys is 2% or better. A quantitative XPS analysis was made for surfaces of the compounds Bi2Sr2CuO6, Bi2Sr2CaCu2O8, and YBa2Cu3Oy. The surface composition determined for the polycrystalline samples corresponds reasonably well to the bulk stoichiometry. A slight deficiency of oxygen was found for the Bi-based compounds. The surface exposed on cleavage of the Bi2Sr2CaCu2O8 single crystal was found to be enriched with bismuth, which indicates that the cleavage occurs along the BiO planes. This result is in agreement with the STM studies published in the literature
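
    The standardless relative-sensitivity-factor step itself is a one-line normalization, x_i = (I_i/S_i) / sum_j (I_j/S_j); the matrix and instrumental corrections discussed in the record would modify the effective S_i. The peak areas and sensitivity factors below are illustrative placeholders only.

```python
def atomic_fractions(intensities, sensitivity_factors):
    """Atomic fractions from XPS peak areas using relative sensitivity factors:
    x_i = (I_i / S_i) / sum_j (I_j / S_j)."""
    normalized = {el: i / sensitivity_factors[el] for el, i in intensities.items()}
    total = sum(normalized.values())
    return {el: v / total for el, v in normalized.items()}

# Toy peak areas and relative sensitivity factors (values are illustrative only).
areas = {"Bi4f": 9.0e4, "Sr3d": 2.4e4, "Cu2p": 3.1e4, "O1s": 4.0e4}
rsf = {"Bi4f": 9.1, "Sr3d": 1.9, "Cu2p": 5.3, "O1s": 0.7}
print(atomic_fractions(areas, rsf))
```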

  2. Quantitative analysis on electrooculography (EOG) for neurodegenerative disease

    Science.gov (United States)

    Liu, Chang-Chia; Chaovalitwongse, W. Art; Pardalos, Panos M.; Seref, Onur; Xanthopoulos, Petros; Sackellares, J. C.; Skidmore, Frank M.

    2007-11-01

    Many studies have documented abnormal horizontal and vertical eye movements in human neurodegenerative disease as well as during altered states of consciousness (including drowsiness and intoxication) in healthy adults. Eye movement measurement may play an important role in measuring the progression of neurodegenerative diseases and the state of alertness in healthy individuals. There are several techniques for measuring eye movement: the infrared detection technique (IR), video-oculography (VOG), the scleral eye coil, and EOG. Among the available recording techniques, EOG is a major source for monitoring abnormal eye movement. In this real-time quantitative analysis study, methods that can capture the characteristics of eye movement were proposed to accurately categorize the state of neurodegenerative subjects. The EOG recordings were taken while 5 tested subjects were watching a short (>120 s) animation clip. In response to the animated clip the participants executed a number of eye movements, including vertical smooth pursuit (SVP), horizontal smooth pursuit (HVP) and random saccades (RS). Detection of abnormalities in ocular movement may improve our diagnosis and understanding of neurodegenerative disease and altered states of consciousness. A standard real-time quantitative analysis will improve detection and provide a better understanding of pathology in these disorders.

  3. Quantitative analysis of intermolecular interactions in orthorhombic rubrene

    Directory of Open Access Journals (Sweden)

    Venkatesha R. Hathwar

    2015-09-01

    Full Text Available Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ...Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI) analysis and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H—H interactions. The electron density features of H—H bonding, and the interaction energy of molecular dimers connected by H—H interaction clearly demonstrate the importance of these weak interactions in the stabilization of the crystal structure. The quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations.

  4. Quantitative analysis of tellurium in simple substance sulfur

    International Nuclear Information System (INIS)

    Arikawa, Yoshiko

    1976-01-01

    The MIBK extraction-bismuthiol-2 absorptiometric method for the quantitative analysis of tellurium was studied. The method and its limitations were compared with the atomic absorption method. The period of time required to boil the solution in order to decompose excess hydrogen peroxide and to reduce tellurium from valence 6 to valence 4 was examined. As a result of the experiment, the decomposition was fast in alkaline solution. It takes 30 minutes with alkaline solution and 40 minutes with acid solution to reach constant absorption. A method of analyzing samples containing less than 5 ppm of tellurium was studied. The experiment revealed that a sample containing a very small amount of tellurium can be analyzed when concentration by extraction is carried out on sample solutions divided into one-gram portions, because it is difficult to treat several grams of the sample at one time. This method is also suitable for the quantitative analysis of selenium. The method showed a good addition effect and reproducibility within a relative error of 5%. The comparison between the calibration curve of the standard solution of tellurium(IV) subjected to the reaction with bismuthiol-2 and the calibration curve obtained from the extraction of tellurium(IV) with MIBK indicated that the extraction is complete. The results obtained by the bismuthiol-2 method and by the atomic absorption method coincided quite well for the same sample. (Iwakiri, K.)

  5. Multivariate calibration applied to the quantitative analysis of infrared spectra

    Energy Technology Data Exchange (ETDEWEB)

    Haaland, D.M.

    1991-01-01

    Multivariate calibration methods are very useful for improving the precision, accuracy, and reliability of quantitative spectral analyses. Spectroscopists can more effectively use these sophisticated statistical tools if they have a qualitative understanding of the techniques involved. A qualitative picture of the factor analysis multivariate calibration methods of partial least squares (PLS) and principal component regression (PCR) is presented using infrared calibrations based upon spectra of phosphosilicate glass thin films on silicon wafers. Comparisons of the relative prediction abilities of four different multivariate calibration methods are given based on Monte Carlo simulations of spectral calibration and prediction data. The success of multivariate spectral calibrations is demonstrated for several quantitative infrared studies. The infrared absorption and emission spectra of thin-film dielectrics used in the manufacture of microelectronic devices demonstrate rapid, nondestructive at-line and in-situ analyses using PLS calibrations. Finally, the application of multivariate spectral calibrations to reagentless analysis of blood is presented. We have found that the determination of glucose in whole blood taken from diabetics can be precisely monitored from the PLS calibration of either mid- or near-infrared spectra of the blood. Progress toward the non-invasive determination of glucose levels in diabetics is an ultimate goal of this research. 13 refs., 4 figs.
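
    As a hedged illustration of a PLS calibration of spectra against a reference concentration (not the phosphosilicate-glass or blood data of the study), the following sketch uses scikit-learn with a synthetic single-component data set and cross-validation to estimate the prediction error.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# Synthetic stand-in for a calibration set: 30 spectra of 200 points each,
# whose absorbances depend linearly on one analyte concentration plus noise.
rng = np.random.default_rng(0)
concentration = rng.uniform(0.0, 10.0, size=30)
pure_component = np.exp(-0.5 * ((np.arange(200) - 80) / 15.0) ** 2)
spectra = np.outer(concentration, pure_component) + rng.normal(0, 0.02, (30, 200))

# Cross-validated PLS calibration and its root-mean-square error of prediction.
pls = PLSRegression(n_components=3)
predicted = cross_val_predict(pls, spectra, concentration, cv=5).ravel()
rmsecv = np.sqrt(np.mean((predicted - concentration) ** 2))
print(f"RMSECV = {rmsecv:.3f}")
```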

  6. Quantitative CT analysis of small pulmonary vessels in lymphangioleiomyomatosis

    International Nuclear Information System (INIS)

    Ando, Katsutoshi; Tobino, Kazunori; Kurihara, Masatoshi; Kataoka, Hideyuki; Doi, Tokuhide; Hoshika, Yoshito; Takahashi, Kazuhisa; Seyama, Kuniaki

    2012-01-01

    Backgrounds: Lymphangioleiomyomatosis (LAM) is a destructive lung disease that shares clinical, physiologic, and radiologic features with chronic obstructive pulmonary disease (COPD). This study aims to identify those features that are unique to LAM by using quantitative CT analysis. Methods: We measured total cross-sectional areas of small pulmonary vessels (CSA) less than 5 mm2 and 5–10 mm2 and calculated percentages of those lung areas (%CSA), respectively, in 50 LAM and 42 COPD patients. The extent of cystic destruction (LAA%) and mean parenchymal CT value were also calculated and correlated with pulmonary function. Results: The diffusing capacity for carbon monoxide/alveolar volume (DLCO/VA %predicted) was similar for both groups (LAM, 44.4 ± 19.8% vs. COPD, 45.7 ± 16.0%, p = 0.763), but less tissue damage occurred in LAM than COPD (LAA% 21.7 ± 16.3% vs. 29.3 ± 17.0%; p < 0.05). Pulmonary function correlated negatively with LAA% (p < 0.001) in both groups, yet the correlation with %CSA was significant only in COPD (p < 0.001). When the same analysis was conducted in two groups with equal levels of LAA% and DLCO/VA %predicted, %CSA and mean parenchymal CT value were still greater for LAM than COPD (p < 0.05). Conclusions: Quantitative CT analysis revealing a correlation between cystic destruction and CSA in COPD but not LAM indicates that this approach successfully reflects different mechanisms governing the two pathologic courses. Such determinations of small pulmonary vessel density may serve to differentiate LAM from COPD even in patients with severe lung destruction.

  7. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    Science.gov (United States)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  8. Quantitative computed tomography determined regional lung mechanics in normal nonsmokers, normal smokers and metastatic sarcoma subjects.

    Directory of Open Access Journals (Sweden)

    Jiwoong Choi

    Full Text Available Extra-thoracic tumors send out pilot cells that attach to the pulmonary endothelium. We hypothesized that this could alter regional lung mechanics (tissue stiffening or accumulation of fluid and inflammatory cells) through interactions with host cells. We explored this with serial inspiratory computed tomography (CT) and image matching to assess regional changes in lung expansion. We retrospectively assessed 44 pairs of two serial CT scans on 21 sarcoma patients: 12 without lung metastases and 9 with lung metastases. For each subject, two or more serial inspiratory clinically-derived CT scans were retrospectively collected. Two research-derived control groups were included: 7 normal nonsmokers and 12 asymptomatic smokers with two inspiratory scans taken the same day or one year apart, respectively. We performed image registration for local-to-local matching of scans to baseline, and derived local expansion and density changes at an acinar scale. The Welch two-sample t test was used for comparison between groups. Statistical significance was determined with a p value < 0.05. Lung regions of metastatic sarcoma patients (but not the normal control group) demonstrated an increased proportion of normalized lung expansion between the first and second CT. These hyper-expanded regions were associated with, but not limited to, visible metastatic lung lesions. Compared with the normal control group, the percent of increased normalized hyper-expanded lung in sarcoma subjects was significantly increased (p < 0.05). There was also evidence of increased lung "tissue" volume (non-air components) in the hyper-expanded regions of the cancer subjects relative to non-hyper-expanded regions. "Tissue" volume increase was present in the hyper-expanded regions of metastatic and non-metastatic sarcoma subjects. This putatively could represent regional inflammation related to the presence of tumor pilot cell-host related interactions. This new quantitative CT (QCT) method for linking

  9. Comparison of different surface quantitative analysis methods. Application to corium

    International Nuclear Information System (INIS)

    Guilbaud, N.; Blin, D.; Perodeaud, Ph.; Dugne, O.; Gueneau, Ch.

    2000-01-01

    In the case of a hypothetical severe accident in a pressurized water reactor, the reactor assembly melts partially or completely. The material formed, called corium, flows out and spreads at the bottom of the reactor. To limit and control the consequences of such an accident, the characteristics of the basic O-U-Zr system must be known accurately. To achieve this goal, the corium mix was melted by electron bombardment at very high temperature (3000 K), followed by quenching of the ingot in the Isabel 1 evaporator. Metallographic analyses were then required to validate the thermodynamic databases set by the Thermo-Calc software. The study consists of defining an overall surface quantitative analysis method that is fast and reliable, in order to determine the overall corium composition. The analyzed ingot originated from a (U + Fe + Y + UO2 + ZrO2) mix, with a total mass of 2253.7 grams. Several successive heatings at average power were performed before a very brief plateau at very high temperature, so that the ingot was formed progressively and without any evaporation liable to modify its initial composition. The central zone of the ingot was then analyzed by qualitative and quantitative global surface methods, to yield the volume composition of the analyzed zone. Corium sample analysis is very complex because of the variety and number of elements present, and also because of the presence of oxygen in a matrix of heavy elements such as uranium. Three different global quantitative surface analysis methods were used: global EDS analysis (Energy Dispersive Spectrometry) with SEM, global WDS analysis (Wavelength Dispersive Spectrometry) with EPMA, and the coupling of image analysis with EDS or WDS point spectroscopic analyses. The difficulties encountered during the study arose from sample preparation (corium is very sensitive to oxidation), and the choice of acquisition parameters of the images and analyses. The corium sample studied consisted of two zones displaying

  10. Laser-induced breakdown spectroscopy for in situ qualitative and quantitative analysis of mineral ores

    International Nuclear Information System (INIS)

    Pořízka, P.; Demidov, A.; Kaiser, J.; Keivanian, J.; Gornushkin, I.; Panne, U.; Riedel, J.

    2014-01-01

    In this work, the potential of laser-induced breakdown spectroscopy (LIBS) for discrimination and analysis of geological materials was examined. The research was focused on classification of mineral ores using their LIBS spectra prior to quantitative determination of copper. Quantitative analysis is not a trivial task in LIBS measurement because intensities of emission lines in laser-induced plasmas (LIP) are strongly affected by the sample matrix (matrix effect). To circumvent this effect, typically matrix-matched standards are used to obtain matrix-dependent calibration curves. If the sample set consists of a mixture of different matrices, even in this approach, the corresponding matrix has to be known prior to the downstream data analysis. For this categorization, the multielemental character of LIBS spectra can be of help. In this contribution, a principal component analysis (PCA) was employed on the measured data set to discriminate individual rocks as individual matrices against each other according to their overall elemental composition. Twenty-seven igneous rock samples were analyzed in the form of fine dust, classified and subsequently quantitatively analyzed. Two different LIBS setups in two laboratories were used to prove the reproducibility of classification and quantification. A superposition of partial calibration plots constructed from the individual clustered data displayed a large improvement in precision and accuracy compared to the calibration plot constructed from all ore samples. The classification of mineral samples with complex matrices can thus be recommended prior to LIBS system calibration and quantitative analysis. - Highlights: • Twenty seven igneous rocks were measured on different LIBS systems. • Principal component analysis (PCA) was employed for classification. • The necessity of the classification of the rock (ore) samples prior to the quantification analysis is stressed. • Classification based on the whole LIP spectra and

  11. Laser-induced breakdown spectroscopy for in situ qualitative and quantitative analysis of mineral ores

    Energy Technology Data Exchange (ETDEWEB)

    Pořízka, P. [BAM, Federal Institute for Materials Research and Testing, Richard Willstätter-Straße 11, D-12489 Berlin (Germany); Institute of Physical Engineering, Faculty of Mechanical Engineering, Brno University of Technology, Technická 2896/2, 61669 Brno (Czech Republic); Demidov, A. [BAM, Federal Institute for Materials Research and Testing, Richard Willstätter-Straße 11, D-12489 Berlin (Germany); Kaiser, J. [Institute of Physical Engineering, Faculty of Mechanical Engineering, Brno University of Technology, Technická 2896/2, 61669 Brno (Czech Republic); Keivanian, J. [Institute for Mining, Technical University Clausthal, Erzstraße 18, 38678 Clausthal-Zellerfeld (Germany); Gornushkin, I. [BAM, Federal Institute for Materials Research and Testing, Richard Willstätter-Straße 11, D-12489 Berlin (Germany); Panne, U. [BAM, Federal Institute for Materials Research and Testing, Richard Willstätter-Straße 11, D-12489 Berlin (Germany); Chemistry Department, Humboldt Univerisät zu Berlin, Brook-Taylor-Straße 2, D-12489 Berlin (Germany); Riedel, J., E-mail: jens.riedel@bam.de [BAM, Federal Institute for Materials Research and Testing, Richard Willstätter-Straße 11, D-12489 Berlin (Germany)

    2014-11-01

    In this work, the potential of laser-induced breakdown spectroscopy (LIBS) for discrimination and analysis of geological materials was examined. The research was focused on classification of mineral ores using their LIBS spectra prior to quantitative determination of copper. Quantitative analysis is not a trivial task in LIBS measurement because intensities of emission lines in laser-induced plasmas (LIP) are strongly affected by the sample matrix (matrix effect). To circumvent this effect, typically matrix-matched standards are used to obtain matrix-dependent calibration curves. If the sample set consists of a mixture of different matrices, even in this approach, the corresponding matrix has to be known prior to the downstream data analysis. For this categorization, the multielemental character of LIBS spectra can be of help. In this contribution, a principal component analysis (PCA) was employed on the measured data set to discriminate individual rocks as individual matrices against each other according to their overall elemental composition. Twenty-seven igneous rock samples were analyzed in the form of fine dust, classified and subsequently quantitatively analyzed. Two different LIBS setups in two laboratories were used to prove the reproducibility of classification and quantification. A superposition of partial calibration plots constructed from the individual clustered data displayed a large improvement in precision and accuracy compared to the calibration plot constructed from all ore samples. The classification of mineral samples with complex matrices can thus be recommended prior to LIBS system calibration and quantitative analysis. - Highlights: • Twenty seven igneous rocks were measured on different LIBS systems. • Principal component analysis (PCA) was employed for classification. • The necessity of the classification of the rock (ore) samples prior to the quantification analysis is stressed. • Classification based on the whole LIP spectra and
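
    The workflow described in these two records, classifying the spectra by matrix first and then calibrating within each class, can be illustrated with the following sketch; the synthetic spectra, the use of k-means on PCA scores, and the simple linear calibration per cluster are assumptions for the example, not the authors' processing chain.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

# Hypothetical inputs: rows of normalized LIBS spectra, one extracted Cu line
# intensity per spectrum, and reference Cu contents from conventional assays.
rng = np.random.default_rng(1)
spectra = rng.random((27, 500))                         # stand-in for 27 ore spectra
cu_line = rng.uniform(0.1, 1.0, 27)                     # stand-in Cu line intensities
cu_ref = 50 + 400 * cu_line + rng.normal(0, 10, 27)     # stand-in reference assays (ppm)

# Step 1: classify the matrices from the full spectra (PCA scores + k-means).
scores = PCA(n_components=3).fit_transform(spectra)
matrix_class = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)

# Step 2: build one partial calibration line per matrix class.
calibrations = {}
for k in np.unique(matrix_class):
    mask = matrix_class == k
    calibrations[k] = LinearRegression().fit(cu_line[mask].reshape(-1, 1), cu_ref[mask])
```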

  12. Quantitative image analysis in sonograms of the thyroid gland

    Energy Technology Data Exchange (ETDEWEB)

    Catherine, Skouroliakou [A' Department of Radiology, University of Athens, Vas.Sophias Ave, Athens 11528 (Greece); Maria, Lyra [A' Department of Radiology, University of Athens, Vas.Sophias Ave, Athens 11528 (Greece)]. E-mail: mlyra@pindos.uoa.gr; Aristides, Antoniou [A' Department of Radiology, University of Athens, Vas.Sophias Ave, Athens 11528 (Greece); Lambros, Vlahos [A' Department of Radiology, University of Athens, Vas.Sophias Ave, Athens 11528 (Greece)

    2006-12-20

    High-resolution, real-time ultrasound is a routine examination for assessing disorders of the thyroid gland. However, the current diagnostic practice is based mainly on qualitative evaluation of the resulting sonograms, and therefore depends on the physician's experience. Computerized texture analysis is widely employed in sonographic images of various organs (liver, breast), and it has been proven to increase the sensitivity of diagnosis by providing a better tissue characterization. The present study attempts to characterize thyroid tissue by automatic texture analysis. The texture features that are calculated are based on co-occurrence matrices as proposed by Haralick. The sample consists of 40 patients. For each patient two sonographic images (one for each lobe) are recorded in DICOM format. The lobe is manually delineated in each sonogram, and the co-occurrence matrices for 52 separation vectors are calculated. The texture features extracted from each one of these matrices are: contrast, correlation, energy and homogeneity. Principal component analysis is used to select the optimal set of features. The statistical analysis resulted in the extraction of 21 optimal descriptors. The optimal descriptors are all co-occurrence parameters, as the first-order statistics did not prove to be representative of the image characteristics. The larger number of components depends mainly on correlation at very short or very long separation distances. The results indicate that quantitative analysis of thyroid sonograms can provide an objective characterization of thyroid tissue.
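
    Haralick-style co-occurrence features of a delineated lobe can be computed, for example, with scikit-image; the random region of interest and the particular separation vectors below are placeholders (the function is named graycomatrix in current releases, greycomatrix in older ones).

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

# Hypothetical 8-bit region of interest cropped from a thyroid sonogram.
roi = (np.random.rand(64, 64) * 255).astype(np.uint8)

# Co-occurrence matrices for a few separation vectors (distance, angle),
# then Haralick-style descriptors averaged over all matrices.
glcm = graycomatrix(roi,
                    distances=[1, 2, 4],
                    angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                    levels=256, symmetric=True, normed=True)
features = {prop: graycoprops(glcm, prop).mean()
            for prop in ("contrast", "correlation", "energy", "homogeneity")}
print(features)
```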

  13. Comparative Analysis of Innovative Systems in the Russian Regions

    Directory of Open Access Journals (Sweden)

    Nadezhda Nikolaevna Mikheeva

    2014-12-01

    Full Text Available This article was prepared with the financial support of the Program of fundamental studies of the Presidium of the Russian Academy of Sciences No. 31, «The Role of Space in the Modernization of Russia: Natural and Socio-Economic Potential» (project 7.2 «Tools of Regional Policy and the Effectiveness of Their Use»). Despite the abundance of literature on innovation in the regions, there is no common frame of reference about the patterns and mechanisms of formation of regional innovation systems (RIS), so it is next to impossible to differentiate approaches to the stimulation of innovation in the regions taking into account their specific characters. Therefore, the author attempts to formalize the definition of RIS and to provide not only a qualitative but also a quantitative evaluation of different types of regional innovation systems. This paper tries to find a set of models of RIS development in Russian regions with specific characters. These regional peculiarities play a key role in the process of selecting methods for further RIS development, including methods of state support of innovation which are adequate to the characteristics of the regional innovation system. The author obtained the following results: (1) presentation of various approaches to defining regional innovation systems; (2) proposition of a RIS structure that includes 5 blocks (creation of innovations; production and realization of innovative goods and services in the region; innovative infrastructure of the region; demand for innovations; and innovation policy); and (3) development of a system of statistical indicators that characterize RIS. On the basis of a formal and substantive analysis of these indicators, the researcher defined 6 models of regional innovation systems prevailing in the Russian circumstances.

  14. Quantitative analysis of agricultural land use change in China

    Science.gov (United States)

    Chou, Jieming; Dong, Wenjie; Wang, Shuyu; Fu, Yuqing

    This article reviews the potential impacts of climate change on land use change in China. Crop sown area is used as an index to quantitatively analyze the temporal-spatial changes and the utilization of agricultural land. A new concept, the potential multiple cropping index, is defined to reflect the potential sowing ability. The impacting mechanism, land use status and its surplus capacity are investigated as well. The main conclusions are as follows. During 1949-2010, agricultural land was greatest in amount in the middle of China, followed by that in the country's eastern and western regions. The most rapid increase and decrease of agricultural land were observed in Xinjiang and North China respectively; Northwest China and South China also changed rapidly. The variation trend before 1980 differed significantly from that after 1980. Agricultural land was affected by both natural and social factors, such as regional climate and environmental changes, population growth, economic development, and the implementation of policies. In this paper, the effects of temperature and urbanization on the coverage of agricultural land are evaluated, and the results show that urbanization can greatly affect the amount of agricultural land in South China, Northeast China, Xinjiang and Southwest China. From 1980 to 2009, the extent of agricultural land use increased as the surplus capacity decreased. Still, a large remaining potential space is available, but the future utilization of agricultural land should be carried out with scientific planning and management for sustainable development.

  15. EBprot: Statistical analysis of labeling-based quantitative proteomics data.

    Science.gov (United States)

    Koh, Hiromi W L; Swa, Hannah L F; Fermin, Damian; Ler, Siok Ghee; Gunaratne, Jayantha; Choi, Hyungwon

    2015-08-01

    Labeling-based proteomics is a powerful method for detection of differentially expressed proteins (DEPs). The current data analysis platform typically relies on protein-level ratios, which are obtained by summarizing peptide-level ratios for each protein. In shotgun proteomics, however, some proteins are quantified with more peptides than others, and this reproducibility information is not incorporated into the differential expression (DE) analysis. Here, we propose a novel probabilistic framework EBprot that directly models the peptide-protein hierarchy and rewards the proteins with reproducible evidence of DE over multiple peptides. To evaluate its performance with known DE states, we conducted a simulation study to show that the peptide-level analysis of EBprot provides better receiver-operating characteristics and more accurate estimation of the false discovery rates than the methods based on protein-level ratios. We also demonstrate superior classification performance of peptide-level EBprot analysis in a spike-in dataset. To illustrate the wide applicability of EBprot in different experimental designs, we applied EBprot to a dataset for lung cancer subtype analysis with biological replicates and another dataset for time course phosphoproteome analysis of EGF-stimulated HeLa cells with multiplexed labeling. Through these examples, we show that the peptide-level analysis of EBprot is a robust alternative to the existing statistical methods for the DE analysis of labeling-based quantitative datasets. The software suite is freely available on the Sourceforge website http://ebprot.sourceforge.net/. All MS data have been deposited in the ProteomeXchange with identifier PXD001426 (http://proteomecentral.proteomexchange.org/dataset/PXD001426/). © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. The Quantitative Analysis of Chennai Automotive Industry Cluster

    Science.gov (United States)

    Bhaskaran, Ethirajan

    2016-07-01

    Chennai is also called the Detroit of India due to the presence of an automotive industry producing over 40% of India's vehicles and components. During 2001-2002, the Automotive Component Industries (ACI) in the Ambattur, Thirumalizai and Thirumudivakkam Industrial Estates, Chennai, faced problems with infrastructure, technology, procurement, production and marketing. The objective is to study the quantitative performance of the Chennai Automotive Industry Cluster before (2001-2002) and after the CDA (2008-2009). The methodology adopted is the collection of primary data from 100 ACI using a quantitative questionnaire and analysis using Correlation Analysis (CA), Regression Analysis (RA), the Friedman Test (FMT), and the Kruskal-Wallis Test (KWT). The CA computed for the different sets of variables reveals that there is a high degree of relationship between the variables studied. The RA models constructed establish the strong relationship between the dependent variable and a host of independent variables. The models proposed here reveal the approximate relationship in a closer form. The KWT proves that there is no significant difference between the three location clusters with respect to: Net Profit, Production Cost, Marketing Costs, Procurement Costs and Gross Output. This supports the view that each location has contributed uniformly to the development of the automobile component cluster. The FMT proves that there is no significant difference between industrial units in respect of costs like Production, Infrastructure, Technology, Marketing and Net Profit. To conclude, the Automotive Industries have fully utilized the Physical Infrastructure and Centralised Facilities by adopting the CDA and are now exporting their products to North America, South America, Europe, Australia, Africa and Asia. The value chain analysis models have been implemented in all the cluster units. This Cluster Development Approach (CDA) model can be implemented in industries of under developed and developing countries for cost reduction and productivity

  17. Quantitative CT analysis of small pulmonary vessels in lymphangioleiomyomatosis

    Energy Technology Data Exchange (ETDEWEB)

    Ando, Katsutoshi, E-mail: kando@juntendo.ac.jp [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Tobino, Kazunori [Department of Respiratory Medicine, Iizuka Hospital, 3-83 Yoshio-Machi, Iizuka-City, Fukuoka 820-8505 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Kurihara, Masatoshi; Kataoka, Hideyuki [Pneumothorax Center, Nissan Tamagawa Hospital, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Doi, Tokuhide [Fukuoka Clinic, 7-18-11 Umeda, Adachi-Ku, Tokyo 123-0851 (Japan); Hoshika, Yoshito [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Takahashi, Kazuhisa [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); Seyama, Kuniaki [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan)

    2012-12-15

    Backgrounds: Lymphangioleiomyomatosis (LAM) is a destructive lung disease that shares clinical, physiologic, and radiologic features with chronic obstructive pulmonary disease (COPD). This study aims to identify those features that are unique to LAM by using quantitative CT analysis. Methods: We measured total cross-sectional areas of small pulmonary vessels (CSA) less than 5 mm2 and 5–10 mm2 and calculated percentages of those lung areas (%CSA), respectively, in 50 LAM and 42 COPD patients. The extent of cystic destruction (LAA%) and mean parenchymal CT value were also calculated and correlated with pulmonary function. Results: The diffusing capacity for carbon monoxide/alveolar volume (DLCO/VA %predicted) was similar for both groups (LAM, 44.4 ± 19.8% vs. COPD, 45.7 ± 16.0%, p = 0.763), but less tissue damage occurred in LAM than COPD (LAA% 21.7 ± 16.3% vs. 29.3 ± 17.0%; p < 0.05). Pulmonary function correlated negatively with LAA% (p < 0.001) in both groups, yet the correlation with %CSA was significant only in COPD (p < 0.001). When the same analysis was conducted in two groups with equal levels of LAA% and DLCO/VA %predicted, %CSA and mean parenchymal CT value were still greater for LAM than COPD (p < 0.05). Conclusions: Quantitative CT analysis revealing a correlation between cystic destruction and CSA in COPD but not LAM indicates that this approach successfully reflects different mechanisms governing the two pathologic courses. Such determinations of small pulmonary vessel density may serve to differentiate LAM from COPD even in patients with severe lung destruction.

  18. QuASAR: quantitative allele-specific analysis of reads.

    Science.gov (United States)

    Harvey, Chris T; Moyerbrailean, Gregory A; Davis, Gordon O; Wen, Xiaoquan; Luca, Francesca; Pique-Regi, Roger

    2015-04-15

    Expression quantitative trait loci (eQTL) studies have discovered thousands of genetic variants that regulate gene expression, enabling a better understanding of the functional role of non-coding sequences. However, eQTL studies are costly, requiring large sample sizes and genome-wide genotyping of each sample. In contrast, analysis of allele-specific expression (ASE) is becoming a popular approach to detect the effect of genetic variation on gene expression, even within a single individual. This is typically achieved by counting the number of RNA-seq reads matching each allele at heterozygous sites and testing the null hypothesis of a 1:1 allelic ratio. In principle, when genotype information is not readily available, it could be inferred from the RNA-seq reads directly. However, there are currently no existing methods that jointly infer genotypes and conduct ASE inference, while considering uncertainty in the genotype calls. We present QuASAR, quantitative allele-specific analysis of reads, a novel statistical learning method for jointly detecting heterozygous genotypes and inferring ASE. The proposed ASE inference step takes into consideration the uncertainty in the genotype calls, while including parameters that model base-call errors in sequencing and allelic over-dispersion. We validated our method with experimental data for which high-quality genotypes are available. Results for an additional dataset with multiple replicates at different sequencing depths demonstrate that QuASAR is a powerful tool for ASE analysis when genotypes are not available. http://github.com/piquelab/QuASAR. fluca@wayne.edu or rpique@wayne.edu Supplementary Material is available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
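
    The baseline ASE test that QuASAR builds on, a test of the null 1:1 allelic ratio at a heterozygous site, can be written in a few lines; note this simple binomial version ignores the genotype uncertainty, base-call errors and over-dispersion that QuASAR models explicitly, and the read counts are invented for illustration (SciPy 1.7 or later provides binomtest).

```python
from scipy.stats import binomtest

# Hypothetical read counts at one heterozygous site: reference vs. alternate allele.
ref_reads, alt_reads = 78, 42

# Test the null hypothesis of a balanced 1:1 allelic ratio.
result = binomtest(ref_reads, n=ref_reads + alt_reads, p=0.5)
print(result.pvalue)   # small p-values suggest allele-specific expression
```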

  19. Quantitative Analysis of TDLUs using Adaptive Morphological Shape Techniques.

    Science.gov (United States)

    Rosebrock, Adrian; Caban, Jesus J; Figueroa, Jonine; Gierach, Gretchen; Linville, Laura; Hewitt, Stephen; Sherman, Mark

    2013-03-29

    Within the complex branching system of the breast, terminal duct lobular units (TDLUs) are the anatomical location where most cancer originates. With aging, TDLUs undergo physiological involution, reflected in a loss of structural components (acini) and a reduction in total number. Data suggest that women undergoing benign breast biopsies that do not show age-appropriate involution are at increased risk of developing breast cancer. To date, TDLU assessments have generally been made by qualitative visual assessment, rather than by objective quantitative analysis. This paper introduces a technique to automatically estimate a set of quantitative measurements and use those variables to more objectively describe and classify TDLUs. To validate the accuracy of our system, we compared the computer-based morphological properties of 51 TDLUs in breast tissues donated for research by volunteers in the Susan G. Komen Tissue Bank to those assessed by a pathologist, demonstrating 70% agreement. Secondly, in order to show that our method is applicable to a wider range of datasets, we analyzed 52 TDLUs from biopsies performed for clinical indications in the National Cancer Institute's Breast Radiology Evaluation and Study of Tissues (BREAST) Stamp Project and obtained 82% correlation with visual assessment. Lastly, we demonstrate the ability to uncover novel measures when researching the structural properties of the acini by applying machine learning and clustering techniques. Through our study we found that while the number of acini per TDLU increases exponentially with the TDLU diameter, the average elongation and roundness remain constant.

  20. MR imaging of Minamata disease. Qualitative and quantitative analysis

    International Nuclear Information System (INIS)

    Korogi, Yukunori; Takahashi, Mutsumasa; Sumi, Minako; Hirai, Toshinori; Okuda, Tomoko; Shinzato, Jintetsu; Okajima, Toru.

    1994-01-01

    Minamata disease (MD), a result of methylmercury poisoning, is a neurological illness caused by ingestion of contaminated seafood. We evaluated MR findings of patients with MD qualitatively and quantitatively. Magnetic resonance imaging at 1.5 Tesla was performed in seven patients with MD and in eight control subjects. All of our patients showed typical neurological findings like sensory disturbance, constriction of the visual fields, and ataxia. In the quantitative image analysis, inferior and middle parts of the cerebellar vermis and cerebellar hemispheres were significantly atrophic in comparison with the normal controls. There were no significant differences in measurements of the basis pontis, middle cerebellar peduncles, corpus callosum, or cerebral hemispheres between MD and the normal controls. The calcarine sulci and central sulci were significantly dilated, reflecting atrophy of the visual cortex and postcentral cortex, respectively. The lesions located in the calcarine area, cerebellum, and postcentral gyri were related to three characteristic manifestations of this disease, constriction of the visual fields, ataxia, and sensory disturbance, respectively. MR imaging has proved to be useful in evaluating the CNS abnormalities of methylmercury poisoning. (author)

  1. Spatial data analysis for exploration of regional scale geothermal resources

    Science.gov (United States)

    Moghaddam, Majid Kiavarz; Noorollahi, Younes; Samadzadegan, Farhad; Sharifi, Mohammad Ali; Itoi, Ryuichi

    2013-10-01

    Defining a comprehensive conceptual model of the resources sought is one of the most important steps in geothermal potential mapping. In this study, Fry analysis as a spatial distribution method and 5% well existence, distance distribution, weights of evidence (WofE), and evidential belief function (EBFs) methods as spatial association methods were applied comparatively to known geothermal occurrences, and to publicly-available regional-scale geoscience data in Akita and Iwate provinces within the Tohoku volcanic arc, in northern Japan. Fry analysis and rose diagrams revealed similar directional patterns of geothermal wells and volcanoes, NNW-, NNE-, NE-trending faults, hotsprings and fumaroles. Among the spatial association methods, WofE defined a conceptual model correspondent with the real world situations, approved with the aid of expert opinion. The results of the spatial association analyses quantitatively indicated that the known geothermal occurrences are strongly spatially-associated with geological features such as volcanoes, craters, NNW-, NNE-, NE-direction faults and geochemical features such as hotsprings, hydrothermal alteration zones and fumaroles. Geophysical data contains temperature gradients over 100 °C/km and heat flow over 100 mW/m2. In general, geochemical and geophysical data were better evidence layers than geological data for exploring geothermal resources. The spatial analyses of the case study area suggested that quantitative knowledge from hydrothermal geothermal resources was significantly useful for further exploration and for geothermal potential mapping in the case study region. The results can also be extended to the regions with nearly similar characteristics.
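
    The weights-of-evidence association measure used above reduces, for a binary evidence layer, to log-ratios of conditional probabilities estimated from cell counts. The sketch below shows the calculation with invented counts; it is not the study's data.

```python
import numpy as np

def weights_of_evidence(n_total, n_evidence, n_deposit, n_overlap):
    """Positive/negative weights and contrast for one binary evidence layer.
    n_total: unit cells in the study area; n_evidence: cells where the evidence
    (e.g. proximity to NNW-trending faults) is present; n_deposit: cells with
    known geothermal occurrences; n_overlap: occurrences lying on the evidence."""
    p_b_d = n_overlap / n_deposit                              # P(B | deposit)
    p_b_nd = (n_evidence - n_overlap) / (n_total - n_deposit)  # P(B | no deposit)
    w_plus = np.log(p_b_d / p_b_nd)
    w_minus = np.log((1 - p_b_d) / (1 - p_b_nd))
    return w_plus, w_minus, w_plus - w_minus                   # contrast C = W+ - W-

# Toy counts only, for illustration of the calculation.
print(weights_of_evidence(n_total=10000, n_evidence=1200, n_deposit=40, n_overlap=18))
```

    A strongly positive contrast indicates that the evidence layer is spatially associated with the known occurrences, which is how the geological, geochemical and geophysical layers are ranked in the record above.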

  2. Quantitative methodology to extract regional magnetotelluric impedances and determine the dimension of the conductivity structure

    Energy Technology Data Exchange (ETDEWEB)

    Groom, R [PetRos EiKon Incorporated, Ontario (Canada); Kurtz, R; Jones, A; Boerner, D [Geological Survey of Canada, Ontario (Canada)

    1996-05-01

    This paper describes a systematic method for determining the appropriate dimensionality of magnetotelluric (MT) data from a site, and illustrates the application of this method to analyze both synthetic data and real data. Additionally, it describes the extraction of regional impedance responses from multiple sites. This method was examined extensively with synthetic data, and proven to be successful. It was demonstrated for two neighboring sites that the analysis methodology can be extremely useful in unraveling the bulk regional response when hidden by strong three-dimensional effects. Although there may still be some uncertainties remaining in the true levels for the regional responses for stations LIT000 and LITW02, the analysis has provided models which not only fit the data but are consistent for neighboring sites. It was suggested from these data that the stations are seeing significantly different structures. 12 refs.

  3. Optimal depth-based regional frequency analysis

    Directory of Open Access Journals (Sweden)

    H. Wazneh

    2013-06-01

    Full Text Available Classical methods of regional frequency analysis (RFA) of hydrological variables face two drawbacks: (1) the restriction to a particular region, which can lead to a loss of some information, and (2) the definition of a region that generates a border effect. To reduce the impact of these drawbacks on regional modeling performance, an iterative method was proposed recently, based on the statistical notion of the depth function and a weight function φ. This depth-based RFA (DBRFA) approach was shown to be superior to traditional approaches in terms of flexibility, generality and performance. The main difficulty of the DBRFA approach is the optimal choice of the weight function φ (e.g., φ minimizing estimation errors). In order to avoid a subjective choice and naïve selection procedures of φ, the aim of the present paper is to propose an algorithm-based procedure to optimize the DBRFA and automate the choice of φ according to objective performance criteria. This procedure is applied to estimate flood quantiles in three different regions in North America. One of the findings from the application is that the optimal weight function depends on the considered region and can also quantify the region's homogeneity. By comparing the DBRFA to the canonical correlation analysis (CCA) method, results show that the DBRFA approach leads to better performances both in terms of relative bias and mean square error.

  4. Optimal depth-based regional frequency analysis

    Science.gov (United States)

    Wazneh, H.; Chebana, F.; Ouarda, T. B. M. J.

    2013-06-01

    Classical methods of regional frequency analysis (RFA) of hydrological variables face two drawbacks: (1) the restriction to a particular region which can lead to a loss of some information and (2) the definition of a region that generates a border effect. To reduce the impact of these drawbacks on regional modeling performance, an iterative method was proposed recently, based on the statistical notion of the depth function and a weight function φ. This depth-based RFA (DBRFA) approach was shown to be superior to traditional approaches in terms of flexibility, generality and performance. The main difficulty of the DBRFA approach is the optimal choice of the weight function φ (e.g., φ minimizing estimation errors). In order to avoid a subjective choice and naïve selection procedures of φ, the aim of the present paper is to propose an algorithm-based procedure to optimize the DBRFA and automate the choice of φ according to objective performance criteria. This procedure is applied to estimate flood quantiles in three different regions in North America. One of the findings from the application is that the optimal weight function depends on the considered region and can also quantify the region's homogeneity. By comparing the DBRFA to the canonical correlation analysis (CCA) method, results show that the DBRFA approach leads to better performances both in terms of relative bias and mean square error.
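
    As a loose, simplified illustration of the depth-plus-weight idea behind the DBRFA (not the exact construction of the paper), the sketch below gives each gauged site a Mahalanobis-type depth relative to a target site in attribute space and converts depths to weights with an arbitrary weight function φ; all attribute values are synthetic placeholders.

        import numpy as np

        rng = np.random.default_rng(3)
        sites = rng.normal(size=(50, 2))   # e.g. log(basin area), mean annual precipitation
        target = np.array([0.2, -0.1])     # attributes of the target site

        cov_inv = np.linalg.inv(np.cov(sites, rowvar=False))
        diff = sites - target
        d2 = np.einsum("ij,jk,ik->i", diff, cov_inv, diff)   # squared Mahalanobis distances
        depth = 1.0 / (1.0 + d2)           # larger depth = site more "central" to the target

        phi = lambda d: d ** 2             # one arbitrary choice of weight function
        weights = phi(depth) / phi(depth).sum()
        print(weights.max(), weights.min())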

  5. Qualitative and quantitative reliability analysis of safety systems

    International Nuclear Information System (INIS)

    Karimi, R.; Rasmussen, N.; Wolf, L.

    1980-05-01

    A code has been developed for the comprehensive analysis of a fault tree. The code, designated UNRAC (UNReliability Analysis Code), calculates the following characteristics of an input fault tree: (1) minimal cut sets; (2) top event unavailability as a point estimate and/or in time-dependent form; (3) quantitative importance of each component involved; and (4) error bound on the top event unavailability. UNRAC can analyze fault trees with any kind of gates (EOR, NAND, NOR, AND, OR), up to a maximum of 250 components and/or gates. The code was benchmarked against WAMCUT, MODCUT, KITT, BIT-FRANTIC, and PL-MODT. The results showed that UNRAC produces results more consistent with the KITT results than either BIT-FRANTIC or PL-MODT. Overall, it is demonstrated that UNRAC is an efficient, easy-to-use code and has the advantage of being able to do a complete fault tree analysis with this single code. Applications of fault tree analysis to safety studies of nuclear reactors are considered.
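
    As a brief illustration of how a top-event unavailability can be evaluated from minimal cut sets (one of the quantities such a code reports), the sketch below applies the min-cut upper bound to a hypothetical fault tree with independent components; the cut sets and unavailabilities are placeholders, not from the benchmark cases.

        from functools import reduce

        component_q = {"A": 1e-3, "B": 5e-4, "C": 2e-3, "D": 1e-3}   # component unavailabilities
        minimal_cut_sets = [("A", "B"), ("C",), ("B", "D")]

        def cut_set_q(cut_set):
            # unavailability of one cut set = product of its component unavailabilities
            return reduce(lambda p, c: p * component_q[c], cut_set, 1.0)

        # min-cut upper bound: Q_top <= 1 - prod(1 - Q_cs)
        q_top = 1.0
        for cs in minimal_cut_sets:
            q_top *= (1.0 - cut_set_q(cs))
        q_top = 1.0 - q_top

        print(f"top event unavailability (upper bound): {q_top:.3e}")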

  6. Quantitative analysis and classification of AFM images of human hair.

    Science.gov (United States)

    Gurden, S P; Monteiro, V F; Longo, E; Ferreira, M M C

    2004-07-01

    The surface topography of human hair, as defined by the outer layer of cellular sheets, termed cuticles, largely determines the cosmetic properties of the hair. The condition of the cuticles is of great cosmetic importance, but also has the potential to aid diagnosis in the medical and forensic sciences. Atomic force microscopy (AFM) has been demonstrated to offer unique advantages for analysis of the hair surface, mainly due to the high image resolution and the ease of sample preparation. This article presents an algorithm for the automatic analysis of AFM images of human hair. The cuticular structure is characterized using a series of descriptors, such as step height, tilt angle and cuticle density, allowing quantitative analysis and comparison of different images. The usefulness of this approach is demonstrated by a classification study. Thirty-eight AFM images were measured, comprising (a) untreated and bleached hair samples, and (b) samples from the root and distal ends of the hair fibre. The multivariate classification technique partial least squares discriminant analysis (PLS-DA) is used to test the ability of the algorithm to characterize the images according to the properties of the hair samples. Most of the images (86%) were found to be classified correctly.
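
    A minimal sketch of the PLS-DA classification step, assuming a feature matrix of per-image cuticle descriptors (step height, tilt angle, cuticle density, ...) and binary class labels; the data below are synthetic and only indicate the workflow, not the study's results.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(0)
        X = rng.normal(size=(38, 3))                   # synthetic cuticle descriptors
        y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # synthetic class labels (0/1)

        pls = PLSRegression(n_components=2)
        pls.fit(X, y)                                   # regress class membership on descriptors
        y_pred = (pls.predict(X).ravel() > 0.5).astype(int)
        print("training accuracy:", (y_pred == y).mean())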

  7. Background field removal using a region adaptive kernel for quantitative susceptibility mapping of human brain

    Science.gov (United States)

    Fang, Jinsheng; Bao, Lijun; Li, Xu; van Zijl, Peter C. M.; Chen, Zhong

    2017-08-01

    Background field removal is an important MR phase preprocessing step for quantitative susceptibility mapping (QSM). It separates the local field induced by tissue magnetic susceptibility sources from the background field generated by sources outside a region of interest, e.g. the brain, such as the air-tissue interface. In the vicinity of the air-tissue boundary, e.g. the skull and paranasal sinuses, where large susceptibility variations exist, present background field removal methods are usually insufficient and these regions often need to be excluded by brain mask erosion at the expense of losing information on the local field, and thus susceptibility measures, in these regions. In this paper, we propose an extension to the variable-kernel sophisticated harmonic artifact reduction for phase data (V-SHARP) background field removal method using a region adaptive kernel (R-SHARP), in which a scalable spherical Gaussian kernel (SGK) is employed with its kernel radius and weights adjustable according to an energy "functional" reflecting the magnitude of field variation. Such an energy functional is defined in terms of a contour and two fitting functions incorporating regularization terms, from which a curve evolution model in a level set formulation is derived for energy minimization. We utilize it to detect regions with a large field gradient caused by strong susceptibility variation. In such regions, the SGK will have a small radius and high weight at the sphere center in a manner adaptive to the voxel energy of the field perturbation. Using the proposed method, the background field generated from external sources can be effectively removed to get a more accurate estimation of the local field and thus of the QSM dipole inversion to map local tissue susceptibility sources. Numerical simulation, phantom and in vivo human brain data demonstrate improved performance of R-SHARP compared to the V-SHARP and RESHARP (regularization enabled SHARP) methods, even when the whole paranasal sinus regions are included in the brain mask.

  8. Background field removal using a region adaptive kernel for quantitative susceptibility mapping of human brain.

    Science.gov (United States)

    Fang, Jinsheng; Bao, Lijun; Li, Xu; van Zijl, Peter C M; Chen, Zhong

    2017-08-01

    Background field removal is an important MR phase preprocessing step for quantitative susceptibility mapping (QSM). It separates the local field induced by tissue magnetic susceptibility sources from the background field generated by sources outside a region of interest, e.g. the brain, such as the air-tissue interface. In the vicinity of the air-tissue boundary, e.g. the skull and paranasal sinuses, where large susceptibility variations exist, present background field removal methods are usually insufficient and these regions often need to be excluded by brain mask erosion at the expense of losing information on the local field, and thus susceptibility measures, in these regions. In this paper, we propose an extension to the variable-kernel sophisticated harmonic artifact reduction for phase data (V-SHARP) background field removal method using a region adaptive kernel (R-SHARP), in which a scalable spherical Gaussian kernel (SGK) is employed with its kernel radius and weights adjustable according to an energy "functional" reflecting the magnitude of field variation. Such an energy functional is defined in terms of a contour and two fitting functions incorporating regularization terms, from which a curve evolution model in a level set formulation is derived for energy minimization. We utilize it to detect regions with a large field gradient caused by strong susceptibility variation. In such regions, the SGK will have a small radius and high weight at the sphere center in a manner adaptive to the voxel energy of the field perturbation. Using the proposed method, the background field generated from external sources can be effectively removed to get a more accurate estimation of the local field and thus of the QSM dipole inversion to map local tissue susceptibility sources. Numerical simulation, phantom and in vivo human brain data demonstrate improved performance of R-SHARP compared to the V-SHARP and RESHARP (regularization enabled SHARP) methods, even when the whole paranasal sinus regions are included in the brain mask.
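
    The harmonic-filtering idea shared by the SHARP family of methods can be sketched as follows: the background field is harmonic inside the brain, so it is largely suppressed by subtracting a spherical (here Gaussian) mean of the total field. This is only a conceptual sketch; the deconvolution step, the brain mask handling and the adaptive radius/weight selection of R-SHARP are omitted, and the field map is a random placeholder.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def sharp_like_residual(total_field, sigma_vox):
            # subtract a smoothed (spherical-mean-like) version of the field;
            # the harmonic background component is largely removed
            return total_field - gaussian_filter(total_field, sigma=sigma_vox)

        field = np.random.default_rng(4).random((64, 64, 64))   # placeholder field map
        local_estimate = sharp_like_residual(field, sigma_vox=4.0)
        print(local_estimate.shape)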

  9. Regional Convergence of Income: Spatial Analysis

    Directory of Open Access Journals (Sweden)

    Vera Ivanovna Ivanova

    2014-12-01

    Russia has a huge territory and strong interregional heterogeneity, so we can assume that geographical factors have a significant impact on the pace of economic growth in Russian regions. The article is therefore focused on the following issues: (1) the correlation between comparative advantages of geographical location and differences in growth rates; (2) the impact of more developed regions on their neighbors; and (3) the correlation between the economic growth of regions and their spatial interaction. The article is devoted to the empirical analysis of regional per capita incomes from 1996 to 2012 and explores the dynamics of the spatial autocorrelation of the regional development indicator. It is shown that there is a problem of measuring the intensity of spatial dependence: the value of Moran's index varies greatly depending on the choice of the distance matrix. In addition, with the help of spatial econometrics the author tests the following hypotheses: (1) there is convergence between regions for the specified period; (2) the process of beta convergence is explained by the spatial arrangement of regions; and (3) there is a positive impact of market size on regional growth. The author empirically confirmed all three hypotheses.
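
    The spatial autocorrelation statistic referred to above (Moran's I) can be computed with a short sketch; the income vector and the contiguity weight matrix below are synthetic placeholders, and in practice W would be row-standardized and built from regional adjacency or distances.

        import numpy as np

        def morans_i(x, W):
            z = np.asarray(x, dtype=float) - np.mean(x)
            n = len(z)
            return n * np.sum(W * np.outer(z, z)) / (W.sum() * np.sum(z ** 2))

        incomes = np.array([1.0, 1.2, 0.8, 2.5, 2.7, 2.4])      # regional per capita incomes
        W = np.array([[0, 1, 1, 0, 0, 0],
                      [1, 0, 1, 0, 0, 0],
                      [1, 1, 0, 0, 0, 0],
                      [0, 0, 0, 0, 1, 1],
                      [0, 0, 0, 1, 0, 1],
                      [0, 0, 0, 1, 1, 0]], dtype=float)          # binary contiguity weights
        print("Moran's I:", round(morans_i(incomes, W), 3))      # close to +1: strong clustering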

  10. Quantitative analysis of allantoin in Iranian corn silk

    OpenAIRE

    E. Khanpour*; M. Modarresi

    2017-01-01

    Background and objectives: Zea mays is cultivated in different parts of Iran and corn silk is used in traditional medicine. Allantoin is one of the major compounds in corn silk. The purpose of this research was the quantitative analysis of allantoin in corn silks belonging to several regions of Iran. Methods: The samples of corn silk were prepared from three provinces of Iran (Kermanshah, Fars and Razavi Khorasan). The dried plant materials were infused in boiling distilled water with a temper...

  11. Large-scale quantitative analysis of painting arts.

    Science.gov (United States)

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-12-11

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paintings to make a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images - the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety in the medieval period. Interestingly, moreover, the increase in the roughness exponent as painting techniques such as chiaroscuro and sfumato advanced is consistent with historical circumstances.

  12. Quali- and quantitative analysis of commercial coffee by NMR

    International Nuclear Information System (INIS)

    Tavares, Leila Aley; Ferreira, Antonio Gilberto

    2006-01-01

    Coffee is one of the beverages most widely consumed in the world and the 'cafezinho' is normally prepared from a blend of roasted powder of two species, Coffea arabica and Coffea canephora. Each one exhibits differences in taste and in chemical composition, especially in the caffeine percentage. There are several procedures proposed in the literature for caffeine determination in different samples like soft drinks, coffee, medicines, etc., but most of them need a sample workup which involves at least one step of purification. This work describes the quantitative analysis of caffeine using 1H NMR and the identification of the major components in commercial coffee samples using 1D and 2D NMR techniques without any sample pre-treatment. (author)
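
    Quantitative 1H NMR of this kind typically relates the analyte concentration to the integral ratio against a signal of known concentration, scaled by the number of protons behind each signal. The sketch below uses hypothetical integrals and an assumed internal reference; the referencing scheme actually used in the paper is not specified here.

        # hypothetical integrals and proton counts
        I_caf, N_caf = 1.85, 3     # caffeine N-CH3 signal: integral, protons
        I_ref, N_ref = 1.00, 9     # reference signal: integral, protons
        c_ref = 2.0                # reference concentration, mmol/L

        c_caf = (I_caf / I_ref) * (N_ref / N_caf) * c_ref
        print(f"caffeine concentration ≈ {c_caf:.1f} mmol/L")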

  13. Automated quantitative cytological analysis using portable microfluidic microscopy.

    Science.gov (United States)

    Jagannadh, Veerendra Kalyan; Murthy, Rashmi Sreeramachandra; Srinivasan, Rajesh; Gorthi, Sai Siva

    2016-06-01

    In this article, a portable microfluidic microscopy based approach for automated cytological investigations is presented. Inexpensive optical and electronic components have been used to construct a simple microfluidic microscopy system. In contrast to the conventional slide-based methods, the presented method employs microfluidics to enable automated sample handling and image acquisition. The approach involves the use of simple in-suspension staining and automated image acquisition to enable quantitative cytological analysis of samples. The applicability of the presented approach to research in cellular biology is shown by performing an automated cell viability assessment on a given population of yeast cells. Further, the relevance of the presented approach to clinical diagnosis and prognosis has been demonstrated by performing detection and differential assessment of malaria infection in a given sample. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Quantitative image analysis of WE43-T6 cracking behavior

    International Nuclear Information System (INIS)

    Ahmad, A; Yahya, Z

    2013-01-01

    Environment-assisted cracking of WE43 cast magnesium (4.2 wt.% Yt, 2.3 wt.% Nd, 0.7% Zr, 0.8% HRE) in the T6 peak-aged condition was induced in ambient air in notched specimens. The mechanism of fracture was studied using electron backscatter diffraction, serial sectioning and in situ observations of crack propagation. The intermetallic material (rare earth-enriched divorced intermetallic retained at grain boundaries and predominantly at triple points) was found to play a significant role in initiating the cracks which lead to failure of this material. Quantitative measurements were required for this project. The populations of the intermetallic particles and clusters of intermetallic particles were analyzed using image analysis of metallographic images. This is part of the work to generate a theoretical model of the effect of notch geometry on the static fatigue strength of this material.

  15. Quantitative analysis of lead in polysulfide-based impression material

    Directory of Open Access Journals (Sweden)

    Aparecida Silva Braga

    2007-06-01

    Permlastic® is a polysulfide-based impression material widely used by dentists in Brazil. It is composed of a base paste and a catalyzer containing lead dioxide. The high toxicity of lead to humans is grounds for much concern, since it can attack various systems and organs. The present study involved a quantitative analysis of the concentration of lead in the material Permlastic®. The lead was determined by plasma-induced optical emission spectrometry (Varian, model Vista). The percentages of lead found in the two analyzed lots were 38.1 and 40.8%. The lead concentrations in the material under study were high, but the product's packaging contained no information about these concentrations.

  16. Quantitative analysis of fission products by γ spectrography

    International Nuclear Information System (INIS)

    Malet, G.

    1962-01-01

    The activity of the fission products present in treated solutions of irradiated fuels is given as a function of the time of cooling and of the irradiation time. The variation of the ratio ( 144 Ce + 144 Pr activity)/ 137 Cs activity) as a function of these same parameters is also given. From these results a method is deduced giving the 'age' of the solution analyzed. By γ-scintillation spectrography it was possible to estimate the following elements individually: 141 Ce, 144 Ce + 144 Pr, 103 Ru, 106 Ru + 106 Rh, 137 Cs, 95 Zr + 95 Nb. Yield curves are given for the case of a single emitter. Of the various existing methods, that of the least squares was used for the quantitative analysis of the afore-mentioned fission products. The accuracy attained varies from 3 to 10%. (author) [fr
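
    The least-squares step mentioned above can be sketched as an unmixing problem: the measured gamma spectrum is modeled as a linear combination of known single-nuclide response spectra, and the nuclide activities are the least-squares coefficients. The response matrix and spectrum below are synthetic placeholders.

        import numpy as np

        rng = np.random.default_rng(1)
        A = np.abs(rng.normal(size=(128, 4)))       # columns = reference spectra of 4 nuclides (placeholders)
        true_activity = np.array([3.0, 1.5, 0.7, 2.2])
        y = A @ true_activity + rng.normal(scale=0.05, size=128)   # measured spectrum with noise

        activities, *_ = np.linalg.lstsq(A, y, rcond=None)         # fitted activity of each nuclide
        print(np.round(activities, 2))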

  17. Quantitative image analysis for investigating cell-matrix interactions

    Science.gov (United States)

    Burkel, Brian; Notbohm, Jacob

    2017-07-01

    The extracellular matrix provides both chemical and physical cues that control cellular processes such as migration, division, differentiation, and cancer progression. Cells can mechanically alter the matrix by applying forces that result in matrix displacements, which in turn may localize to form dense bands along which cells may migrate. To quantify the displacements, we use confocal microscopy and fluorescent labeling to acquire high-contrast images of the fibrous material. Using a technique for quantitative image analysis called digital volume correlation, we then compute the matrix displacements. Our experimental technology offers a means to quantify matrix mechanics and cell-matrix interactions. We are now using these experimental tools to modulate mechanical properties of the matrix to study cell contraction and migration.
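
    As a minimal sketch of the digital volume correlation idea (estimating matrix displacements by correlating subvolumes of a reference and a deformed image stack), the code below recovers a known integer shift from the peak of a 3D cross-correlation; real DVC adds subvoxel interpolation, windowing and regularization, and the volumes here are synthetic.

        import numpy as np
        from scipy.signal import fftconvolve

        def subvolume_shift(ref, deformed):
            ref_z = ref - ref.mean()
            def_z = deformed - deformed.mean()
            corr = fftconvolve(def_z, ref_z[::-1, ::-1, ::-1], mode="full")  # cross-correlation
            peak = np.unravel_index(np.argmax(corr), corr.shape)
            return tuple(int(p) - (s - 1) for p, s in zip(peak, ref.shape))  # lag of the peak

        rng = np.random.default_rng(2)
        reference = rng.random((32, 32, 32))
        deformed = np.roll(reference, shift=(2, -1, 3), axis=(0, 1, 2))      # known displacement
        print(subvolume_shift(reference, deformed))                          # expected ~ (2, -1, 3)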

  18. Quantitative Immunofluorescence Analysis of Nucleolus-Associated Chromatin.

    Science.gov (United States)

    Dillinger, Stefan; Németh, Attila

    2016-01-01

    The nuclear distribution of eu- and heterochromatin is nonrandom, heterogeneous, and dynamic, which is mirrored by specific spatiotemporal arrangements of histone posttranslational modifications (PTMs). Here we describe a semiautomated method for the analysis of histone PTM localization patterns within the mammalian nucleus using confocal laser scanning microscope images of fixed, immunofluorescence stained cells as data source. The ImageJ-based process includes the segmentation of the nucleus, furthermore measurements of total fluorescence intensities, the heterogeneity of the staining, and the frequency of the brightest pixels in the region of interest (ROI). In the presented image analysis pipeline, the perinucleolar chromatin is selected as primary ROI, and the nuclear periphery as secondary ROI.

  19. Quantitative analysis of secretome from adipocytes regulated by insulin

    Institute of Scientific and Technical Information of China (English)

    Hu Zhou; Yuanyuan Xiao; Rongxia Li; Shangyu Hong; Sujun Li; Lianshui Wang; Rong Zeng; Kan Liao

    2009-01-01

    Adipocyte is not only a central player involved in storage and release of energy, but also in regulation of energy metabolism in other organs via secretion of peptides and proteins. During the pathogenesis of insulin resistance and type 2 diabetes, adipocytes are subjected to increased levels of insulin, which may have a major impact on the secretion of adipokines. We have undertaken cleavable isotope-coded affinity tag (cICAT) and label-free quantitation approaches to identify and quantify secretory factors that are differentially secreted by 3T3-L1 adipocytes with or without insulin treatment. Combining the cICAT and label-free results, 317 proteins were predicted or annotated as secretory proteins. Among these secretory proteins, 179 proteins and 53 proteins were significantly up-regulated and down-regulated, respectively. A total of 77 reported adipokines were quantified in our study, such as adiponectin, cathepsin D, cystatin C, resistin, and transferrin. Western blot analysis of these adipokines confirmed the quantitative results from mass spectrometry, and revealed individualized secretion patterns of these proteins with increasing insulin dose. In addition, 240 proteins were newly identified and quantified as secreted proteins from 3T3-L1 adipocytes in our study, most of which were up-regulated upon insulin treatment. Further comprehensive bioinformatics analysis revealed that the secretory proteins in the extracellular matrix-receptor interaction pathway and the glycan structure degradation pathway were significantly up-regulated by insulin stimulation.

  20. Qualitative and quantitative analysis of women's perceptions of transvaginal surgery.

    Science.gov (United States)

    Bingener, Juliane; Sloan, Jeff A; Ghosh, Karthik; McConico, Andrea; Mariani, Andrea

    2012-04-01

    Prior surveys evaluating women's perceptions of transvaginal surgery both support and refute the acceptability of transvaginal access. Most surveys employed mainly quantitative analysis, limiting the insight into the women's perspective. In this mixed-methods study, we include qualitative and quantitative methodology to assess women's perceptions of transvaginal procedures. Women seen at the outpatient clinics of a tertiary-care center were asked to complete a survey. Demographics and preferences for appendectomy, cholecystectomy, and tubal ligation were elicited, along with open-ended questions about concerns or benefits of transvaginal access. Multivariate logistic regression models were constructed to examine the impact of age, education, parity, and prior transvaginal procedures on preferences. For the qualitative evaluation, content analysis by independent investigators identified themes, issues, and concerns raised in the comments. The completed survey tool was returned by 409 women (grouped mean age 53 years, mean number of 2 children, 82% ≥ some college education, and 56% with previous transvaginal procedure). The transvaginal approach was acceptable for tubal ligation to 59%, for appendectomy to 43%, and for cholecystectomy to 41% of the women. The most frequently mentioned factors that would make women prefer a vaginal approach were decreased invasiveness (14.4%), recovery time (13.9%), scarring (13.7%), pain (6%), and surgical entry location relative to organ removed (4.4%). The most frequently mentioned concerns about the vaginal approach were the possibility of complications/safety (14.7%), pain (9%), infection (5.6%), and recovery time (4.9%). A number of women voiced technical concerns about the vaginal approach. As in prior studies, scarring and pain were important issues to be considered, but recovery time and increased invasiveness were also in the "top five" list. The surveyed women appeared to actively participate in evaluating the technical

  1. Quantitative analysis of contrast-enhanced ultrasonography of the bowel wall can predict disease activity in inflammatory bowel disease

    Energy Technology Data Exchange (ETDEWEB)

    Romanini, Laura, E-mail: laura.romanini@libero.it [Department of Radiology, Spedali Civili di Brescia, P.le Spedali Civili, 1, 25123 Brescia (Italy); Passamonti, Matteo, E-mail: matteopassamonti@gmail.com [Department of Radiology-AO Provincia di Lodi, Via Fissiraga, 15, 26900 Lodi (Italy); Navarria, Mario, E-mail: navarria.mario@tiscali.it [Department of Radiology-ASL Vallecamonica-Sebino, Via Manzoni 142, 25040 Esine, BS (Italy); Lanzarotto, Francesco, E-mail: francesco.lanzarotto@spedalicivili.brescia.it [Department of Gastroenterology, Spedali Civili di Brescia, P.le Spedali Civili, 1, 25123 Brescia (Italy); Villanacci, Vincenzo, E-mail: villanac@alice.it [Department of Pathology, Spedali Civili di Brescia, P.le Spedali Civili, 1, 25123 Brescia (Italy); Grazioli, Luigi, E-mail: radiologia1@spedalicivili.brescia.it [Department of Radiology, Spedali Civili di Brescia, P.le Spedali Civili, 1, 25123 Brescia (Italy); Calliada, Fabrizio, E-mail: fabrizio.calliada@gmail.com [Department of Radiology, University of Pavia, Viale Camillo Golgi 19, 27100 Pavia (Italy); Maroldi, Roberto, E-mail: rmaroldi@gmail.com [Department of Radiology, University of Brescia, P.le Spedali Civili, 1, 25123 Brescia (Italy)

    2014-08-15

    Purpose: To evaluate the accuracy of quantitative analysis of bowel wall enhancement in inflammatory bowel disease (IBD) with contrast enhanced ultrasound (CEUS) by comparing the results with vascular density in a biopsy sample from the same area of the intestinal tract, and to determine the usefulness of this analysis for the prediction of disease activity. Materials and methods: This prospective study was approved by our institute's ethics committee and all patients gave written informed consent. We enrolled 33 consecutive adult patients undergoing colonoscopy and biopsy for IBD. All patients underwent CEUS and the results were quantitatively analyzed. Vessel count per high-power field on biopsy specimens was compared with colonoscopy, baseline ultrasonography, and CEUS findings, and with analysis of peak intensity, time to peak, regional blood volume, mean transit time, and regional blood flow. Results in patients with high and low vascular density were compared using Fisher's test, t-test, Pearson's correlation test, and receiver operating characteristic curve (ROC) analysis. Cutoff values were determined using ROC analysis, and sensitivity and specificity were calculated. Results: High vascular density (>265 vessels per field) on histological examination was significantly correlated with active disease on colonoscopy, baseline ultrasonography, and CEUS (p < .0001). Quantitative analysis showed a higher enhancement peak, a shorter time to peak enhancement, and a higher regional blood flow and regional blood volume in patients with high vascular density than in those with low vascular density. Cutoff values to distinguish between active and inactive disease were identified for peak enhancement (>40.5%) and regional blood flow (>54.8 ml/min). Conclusion: Quantitative analysis of CEUS data correlates with disease activity as determined by vascular density. Quantitative parameters of CEUS can be used to predict active disease with high sensitivity and specificity.

  2. Quantitative analysis of contrast-enhanced ultrasonography of the bowel wall can predict disease activity in inflammatory bowel disease

    International Nuclear Information System (INIS)

    Romanini, Laura; Passamonti, Matteo; Navarria, Mario; Lanzarotto, Francesco; Villanacci, Vincenzo; Grazioli, Luigi; Calliada, Fabrizio; Maroldi, Roberto

    2014-01-01

    Purpose: To evaluate the accuracy of quantitative analysis of bowel wall enhancement in inflammatory bowel disease (IBD) with contrast enhanced ultrasound (CEUS) by comparing the results with vascular density in a biopsy sample from the same area of the intestinal tract, and to determine the usefulness of this analysis for the prediction of disease activity. Materials and methods: This prospective study was approved by our institute's ethics committee and all patients gave written informed consent. We enrolled 33 consecutive adult patients undergoing colonoscopy and biopsy for IBD. All patients underwent CEUS and the results were quantitatively analyzed. Vessel count per high-power field on biopsy specimens was compared with colonoscopy, baseline ultrasonography, and CEUS findings, and with analysis of peak intensity, time to peak, regional blood volume, mean transit time, and regional blood flow. Results in patients with high and low vascular density were compared using Fisher's test, t-test, Pearson's correlation test, and receiver operating characteristic curve (ROC) analysis. Cutoff values were determined using ROC analysis, and sensitivity and specificity were calculated. Results: High vascular density (>265 vessels per field) on histological examination was significantly correlated with active disease on colonoscopy, baseline ultrasonography, and CEUS (p < .0001). Quantitative analysis showed a higher enhancement peak, a shorter time to peak enhancement, and a higher regional blood flow and regional blood volume in patients with high vascular density than in those with low vascular density. Cutoff values to distinguish between active and inactive disease were identified for peak enhancement (>40.5%) and regional blood flow (>54.8 ml/min). Conclusion: Quantitative analysis of CEUS data correlates with disease activity as determined by vascular density. Quantitative parameters of CEUS can be used to predict active disease with high sensitivity and specificity.
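
    The ROC-based cutoff selection described above can be sketched briefly: compute the ROC curve for a quantitative CEUS parameter against disease activity and pick the threshold maximizing Youden's J (sensitivity + specificity - 1). The values below are synthetic placeholders, not study data.

        import numpy as np
        from sklearn.metrics import roc_curve, roc_auc_score

        peak_enhancement = np.array([22, 31, 35, 39, 42, 47, 51, 58, 60, 66], dtype=float)
        active_disease = np.array([0, 0, 0, 0, 1, 0, 1, 1, 1, 1])

        fpr, tpr, thresholds = roc_curve(active_disease, peak_enhancement)
        best = np.argmax(tpr - fpr)                       # Youden's J statistic
        print("AUC:", roc_auc_score(active_disease, peak_enhancement))
        print("cutoff:", thresholds[best],
              "sensitivity:", tpr[best], "specificity:", 1 - fpr[best])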

  3. Quantitative analysis of protein-ligand interactions by NMR.

    Science.gov (United States)

    Furukawa, Ayako; Konuma, Tsuyoshi; Yanaka, Saeko; Sugase, Kenji

    2016-08-01

    Protein-ligand interactions have been commonly studied through static structures of the protein-ligand complex. Recently, however, there has been increasing interest in investigating the dynamics of protein-ligand interactions both for fundamental understanding of the underlying mechanisms and for drug development. NMR is a versatile and powerful tool, especially because it provides site-specific quantitative information. NMR has widely been used to determine the dissociation constant (KD), in particular, for relatively weak interactions. The simplest NMR method is a chemical-shift titration experiment, in which the chemical-shift changes of a protein in response to ligand titration are measured. There are other quantitative NMR methods, but they mostly apply only to interactions in the fast-exchange regime. These methods derive the dissociation constant from population-averaged NMR quantities of the free and bound states of a protein or ligand. In contrast, the recent advent of new relaxation-based experiments, including R2 relaxation dispersion and ZZ-exchange, has enabled us to obtain kinetic information on protein-ligand interactions in the intermediate- and slow-exchange regimes. Based on R2 dispersion or ZZ-exchange, methods that can determine the association rate, kon, dissociation rate, koff, and KD have been developed. In these approaches, R2 dispersion or ZZ-exchange curves are measured for multiple samples with different protein and/or ligand concentration ratios, and the relaxation data are fitted to theoretical kinetic models. It is critical to choose an appropriate kinetic model, such as the two- or three-state exchange model, to derive the correct kinetic information. The R2 dispersion and ZZ-exchange methods are suitable for the analysis of protein-ligand interactions with a micromolar or sub-micromolar dissociation constant but not for very weak interactions, which are typical in very fast exchange. This contrasts with the NMR methods that are used
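
    For the fast-exchange chemical-shift titration mentioned above, the dissociation constant can be sketched as a fit of the observed shift change to a single-site binding isotherm (here assuming ligand in large excess, so the free ligand concentration approximates the total); the titration points are synthetic placeholders.

        import numpy as np
        from scipy.optimize import curve_fit

        def isotherm(L, d_max, Kd):
            return d_max * L / (Kd + L)   # observed shift change vs. ligand concentration

        ligand = np.array([0.0, 0.05, 0.1, 0.2, 0.4, 0.8, 1.6])              # mM
        dshift = np.array([0.0, 0.021, 0.037, 0.058, 0.079, 0.094, 0.103])   # ppm

        popt, _ = curve_fit(isotherm, ligand, dshift, p0=(0.12, 0.2))
        print(f"fitted KD ≈ {popt[1]:.2f} mM, max shift change ≈ {popt[0]:.3f} ppm")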

  4. Quantitative mass-spectrometric analysis of hydrogen helium isotope mixtures

    International Nuclear Information System (INIS)

    Langer, U.

    1998-12-01

    This work deals with the mass-spectrometric method for the quantitative analysis of hydrogen-helium isotope mixtures, with special attention to fusion plasma diagnostics. The aim was to use low-resolution mass spectrometry, a standard measuring method which is well established in science and industry. This task is solved by means of vector mass spectrometry, where a mass spectrum is repeatedly measured, but with stepwise variation of the parameter settings of a quadrupole mass spectrometer. In this way, interfering mass spectra can be decomposed and, moreover, it is possible to analyze underdetermined mass spectra of complex hydrogen-helium isotope mixtures. In this work experimental investigations are presented which show that there are different parameters which are suitable for the UMS method. With an optimal choice of the parameter settings, hydrogen-helium isotope mixtures can be analyzed with an accuracy of 1-3 %. In practice, a low sensitivity for small helium concentrations has to be noted. To cope with this, a method for selective hydrogen pressure reduction has been developed. Experimental investigations and calculations show that small helium amounts (about 1 %) in a hydrogen atmosphere can be analyzed with an accuracy of 3-10 %. Finally, this work deals with the effects of measurement and calibration errors on the resulting error in the spectrum decomposition. This aspect has been investigated both in general mass-spectrometric gas analysis and in the analysis of hydrogen-helium mixtures by means of vector mass spectrometry. (author)

  5. Quantitative charge-tags for sterol and oxysterol analysis.

    Science.gov (United States)

    Crick, Peter J; William Bentley, T; Abdel-Khalik, Jonas; Matthews, Ian; Clayton, Peter T; Morris, Andrew A; Bigger, Brian W; Zerbinati, Chiara; Tritapepe, Luigi; Iuliano, Luigi; Wang, Yuqin; Griffiths, William J

    2015-02-01

    Global sterol analysis is challenging owing to the extreme diversity of sterol natural products, the tendency of cholesterol to dominate in abundance over all other sterols, and the structural lack of a strong chromophore or readily ionized functional group. We developed a method to overcome these challenges by using different isotope-labeled versions of the Girard P reagent (GP) as quantitative charge-tags for the LC-MS analysis of sterols including oxysterols. Sterols/oxysterols in plasma were extracted in ethanol containing deuterated internal standards, separated by C18 solid-phase extraction, and derivatized with GP, with or without prior oxidation of 3β-hydroxy to 3-oxo groups. By use of different isotope-labeled GPs, it was possible to analyze in a single LC-MS analysis both sterols/oxysterols that naturally possess a 3-oxo group and those with a 3β-hydroxy group. Intra- and interassay CVs were determined; the method allows sterols/oxysterols to be profiled in a single analytical run and can be used to identify inborn errors of cholesterol synthesis and metabolism. © 2014 American Association for Clinical Chemistry.

  6. [Quantitative analysis of drug expenditures variability in dermatology units].

    Science.gov (United States)

    Moreno-Ramírez, David; Ferrándiz, Lara; Ramírez-Soto, Gabriel; Muñoyerro, M Dolores

    2013-01-01

    Variability in adjusted drug expenditures among clinical departments raises the possibility of difficult access to certain therapies at the same time that avoidable expenditures may exist. Nevertheless, drug expenditures are not usually applied to clinical practice variability analysis. The aim was to identify and quantify variability in drug expenditures in comparable dermatology departments of the Servicio Andaluz de Salud. A comparative economic analysis was performed regarding the drug expenditures adjusted to population and health care production in 18 dermatology departments of the Servicio Andaluz de Salud. The 2012 cost and production data (homogeneous production units, HPU) were provided by Inforcoan, the cost accounting information system of the Servicio Andaluz de Salud. The observed drug expenditure ratio ranged from 0.97 €/inh to 8.90 €/inh and from 208.45 €/HPU to 1,471.95 €/HPU. The Pearson correlation between drug expenditure and population was 0.25, and 0.35 for the correlation between expenditure and homogeneous production (p=0.32 and p=0.15, respectively), both Pearson coefficients confirming the lack of correlation and a relevant degree of variability in drug expenditures. The quantitative analysis of variability performed through Pearson correlation has confirmed the existence of drug expenditure variability among comparable dermatology departments. Copyright © 2013 SEFH. Published by AULA MEDICA. All rights reserved.

  7. Application of neural networks to quantitative spectrometry analysis

    International Nuclear Information System (INIS)

    Pilato, V.; Tola, F.; Martinez, J.M.; Huver, M.

    1999-01-01

    Accurate quantitative analysis of complex spectra (fission and activation products) relies upon experts' knowledge. In some cases several hours, even days, of tedious calculations are needed. This is because current software is unable to solve deconvolution problems when several rays overlap. We have shown that such analysis can be correctly handled by a neural network, and the procedure can be automated with a minimum of laboratory measurements for network training, as long as all the elements of the analysed solution figure in the training set and provided that adequate scaling of input data is performed. Once the network has been trained, analysis is carried out in a few seconds. In a test between several well-known laboratories, in which unknown quantities of 57Co, 58Co, 85Sr, 88Y, 131I, 139Ce and 141Ce present in a sample had to be determined, the results yielded by our network placed it amongst the best. The method is described, including the experimental device and measurements, training set design, definition of the relevant input parameters, input data scaling and network training. The main results are presented together with a statistical model allowing prediction of the network error.

  8. Assessment of homogeneity of regions for regional flood frequency analysis

    Science.gov (United States)

    Lee, Jeong Eun; Kim, Nam Won

    2016-04-01

    This paper analyzed the effect of rainfall on hydrological similarity, which is an important step for regional flood frequency analysis (RFFA). For the RFFA, the storage function method (SFM) with a spatial extension technique was applied to the 22 sub-catchments partitioned from the Chungju dam watershed in the Republic of Korea. We used the SFM to generate the annual maximum floods for the 22 sub-catchments using annual maximum storm events (1986-2010) as input data. Then the quantiles of rainfall and flood were estimated using the annual maximum series for the 22 sub-catchments. Finally, spatial variations in terms of the two quantiles were analyzed. As a result, there was a significant correlation between the spatial variations of the two quantiles. This result demonstrates that the spatial variation of rainfall is an important factor in explaining the homogeneity of regions when applying RFFA. Acknowledgements: This research was supported by a grant (11-TI-C06) from the Advanced Water Management Research Program funded by the Ministry of Land, Infrastructure and Transport of the Korean government.

  9. Quantitative 3D analysis of bone in hip osteoarthritis using clinical computed tomography.

    Science.gov (United States)

    Turmezei, Tom D; Treece, Graham M; Gee, Andrew H; Fotiadou, Anastasia F; Poole, Kenneth E S

    2016-07-01

    To assess the relationship between proximal femoral cortical bone thickness and radiological hip osteoarthritis using quantitative 3D analysis of clinical computed tomography (CT) data. Image analysis was performed on clinical CT imaging data from 203 female volunteers with a technique called cortical bone mapping (CBM). Colour thickness maps were created for each proximal femur. Statistical parametric mapping was performed to identify statistically significant differences in cortical bone thickness that corresponded with the severity of radiological hip osteoarthritis. Kellgren and Lawrence (K&L) grade, minimum joint space width (JSW) and a novel CT-based osteophyte score were also blindly assessed from the CT data. For each increase in K&L grade, cortical thickness increased by up to 25 % in distinct areas of the superolateral femoral head-neck junction and superior subchondral bone plate. For increasing severity of CT osteophytes, the increase in cortical thickness was more circumferential, involving a wider portion of the head-neck junction, with up to a 7 % increase in cortical thickness per increment in score. Results were not significant for minimum JSW. These findings indicate that quantitative 3D analysis of the proximal femur can identify changes in cortical bone thickness relevant to structural hip osteoarthritis. • CT is being increasingly used to assess bony involvement in osteoarthritis • CBM provides accurate and reliable quantitative analysis of cortical bone thickness • Cortical bone is thicker at the superior femoral head-neck with worse osteoarthritis • Regions of increased thickness co-locate with impingement and osteophyte formation • Quantitative 3D bone analysis could enable clinical disease prediction and therapy development.

  10. Groundwater availability in the United States: the value of quantitative regional assessments

    Science.gov (United States)

    Dennehy, Kevin F.; Reilly, Thomas E.; Cunningham, William L.

    2015-01-01

    The sustainability of water resources is under continued threat from the challenges associated with a growing population, competing demands, and a changing climate. Freshwater scarcity has become a fact in many areas. Many of the United States' surface-water supplies are fully apportioned for use; thus, in some areas the only potential alternative freshwater source that can provide the needed quantities is groundwater. Although frequently overlooked, groundwater serves as the principal reserve of freshwater in the US and represents much of the potential supply during periods of drought. Some nations have requirements to monitor and characterize the availability of groundwater, such as the European Union's Water Framework Directive (EPCEU 2000). In the US there is no such national requirement. Quantitative regional groundwater availability assessments, however, are essential to document the status and trends of groundwater availability for the US and make informed water-resource decisions possible now and in the future. Barthel (2014) highlighted that the value of regional groundwater assessments goes well beyond just quantifying the resource so that it can be better managed. The tools and techniques required to evaluate these unique regional systems advance the science of hydrogeology and provide enhanced methods that can benefit local-scale groundwater investigations. In addition, a significant yet under-utilized benefit is the digital spatial and temporal data sets routinely generated as part of these studies. Even though there is no legal or regulatory requirement for regional groundwater assessments in the US, there is a logical basis for their implementation. The purpose of this essay is to articulate the rationale for and reaffirm the value of regional groundwater assessments, primarily in the US; however, the arguments hold for all nations. The importance of the data sets and of the methods and model development that occur as part of these assessments is stressed.

  11. An approach for quantitative image quality analysis for CT

    Science.gov (United States)

    Rahimi, Amir; Cochran, Joe; Mooney, Doug; Regensburger, Joe

    2016-03-01

    An objective and standardized approach to assess the image quality of Computed Tomography (CT) systems is required in a wide variety of imaging processes to identify CT systems appropriate for a given application. We present an overview of the framework we have developed to help standardize and objectively assess CT image quality for different models of CT scanners used for security applications. Within this framework, we have developed methods to quantitatively measure metrics that should correlate with feature identification, detection accuracy and precision, and image registration capabilities of CT machines, and to identify strengths and weaknesses in different CT imaging technologies in transportation security. To that end we have designed, developed and constructed phantoms that allow for systematic and repeatable measurements of roughly 88 image quality metrics, representing modulation transfer function, noise equivalent quanta, noise power spectra, slice sensitivity profiles, streak artifacts, CT number uniformity, CT number consistency, object length accuracy, CT number path length consistency, and object registration. Furthermore, we have developed a sophisticated MATLAB-based image analysis toolkit to analyze CT-generated images of phantoms and report these metrics in a format that is standardized across the considered models of CT scanners, allowing for comparative image quality analysis within a CT model or between different CT models. In addition, we have developed a modified sparse principal component analysis (SPCA) method that generates principal components with sparse loadings, as compared to standard principal component analysis (PCA), used in conjunction with Hotelling's T² statistical analysis to compare, qualify, and detect faults in the tested systems.

  12. Review of Department of Defense Education Activity (DODEA) Schools. Volume II: Quantitative Analysis of Educational Quality

    National Research Council Canada - National Science Library

    Anderson, Lowell

    2000-01-01

    This volume compiles, and presents in integrated form, IDA's quantitative analysis of educational quality provided by DoD's dependent schools. It covers the quantitative aspects of volume I in greater...

  13. PIQMIe: A web server for semi-quantitative proteomics data management and analysis

    NARCIS (Netherlands)

    A. Kuzniar (Arnold); R. Kanaar (Roland)

    2014-01-01

    textabstractWe present the Proteomics Identifications and Quantitations Data Management and Integration Service or PIQMIe that aids in reliable and scalable data management, analysis and visualization of semi-quantitative mass spectrometry based proteomics experiments. PIQMIe readily integrates

  14. Quantitative analysis of minerals by X-ray diffraction

    International Nuclear Information System (INIS)

    Pietroluongo, L.R.V.; Veiga, M.M. da

    1982-01-01

    Considerations about the X-ray diffraction technique for quantitative analyses are made; some experiments carried out at CETEM - Centro de Tecnologia Mineral (Rio de Janeiro, Brazil) with synthetic samples and real samples of diatomites (from the northeastern region of Brazil) are described. Quartz quantification has been a problem for analytical chemists and is of great importance to the industries which use this raw material. Comments are made about the main factors influencing the intensity of diffracted X-rays, such as the crystallinity of the mineral phase, the granulometry, the preferential orientation, sample preparation and pressing, the chemical composition of the standards, and the experimental analytical conditions. Several analytical methods used are described: direct measurement of the height or area of a peak resulting from a particular reflection and comparison with a pre-calibrated curve; the method of sequential addition of the mineral of interest to the sample and extrapolation of the results to zero addition; and methods of external and internal standards. (C.L.B.) [pt

  15. Quantitative DNA methylation analysis of candidate genes in cervical cancer.

    Directory of Open Access Journals (Sweden)

    Erin M Siegel

    Aberrant DNA methylation has been observed in cervical cancer; however, most studies have used non-quantitative approaches to measure DNA methylation. The objective of this study was to quantify methylation within a select panel of genes previously identified as targets for epigenetic silencing in cervical cancer and to identify genes with elevated methylation that can distinguish cancer from normal cervical tissues. We identified 49 women with invasive squamous cell cancer of the cervix and 22 women with normal cytology specimens. Bisulfite-modified genomic DNA was amplified and quantitative pyrosequencing completed for 10 genes (APC, CCNA, CDH1, CDH13, WIF1, TIMP3, DAPK1, RARB, FHIT, and SLIT2). A Methylation Index was calculated as the mean percent methylation across all CpG sites analyzed per gene (~4-9 CpG sites per sequence). A binary cut-point was defined at >15% methylation. Sensitivity, specificity and area under the ROC curve (AUC) of methylation in individual genes or a panel were examined. The median methylation index was significantly higher in cases compared to controls in 8 genes, whereas there was no difference in median methylation for 2 genes. Compared to HPV and age, the combination of the DNA methylation levels of DAPK1, SLIT2, WIF1 and RARB with HPV and age significantly improved the AUC from 0.79 to 0.99 (95% CI: 0.97-1.00, p-value = 0.003). Pyrosequencing analysis confirmed that several genes are common targets for aberrant methylation in cervical cancer, and the DNA methylation level of four genes appears to increase specificity to identify cancer compared to HPV detection alone. Alterations in DNA methylation of specific genes in cervical cancers, such as DAPK1, RARB, WIF1, and SLIT2, may also occur early in cervical carcinogenesis and should be evaluated.
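
    The per-gene Methylation Index used above (mean percent methylation across the assayed CpG sites, dichotomized at the 15% cut-point) can be written in a few lines; the CpG values below are hypothetical.

        import numpy as np

        cpg_percent_methylation = {"DAPK1": [22.0, 31.5, 18.2, 27.9],
                                   "CDH1": [4.1, 6.3, 5.0, 3.8]}

        for gene, values in cpg_percent_methylation.items():
            index = float(np.mean(values))                 # Methylation Index for the gene
            status = "methylated" if index > 15 else "unmethylated"
            print(gene, round(index, 1), status)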

  16. Optimal display conditions for quantitative analysis of stereoscopic cerebral angiograms

    International Nuclear Information System (INIS)

    Charland, P.; Peters, T.; McGill Univ., Montreal, Quebec

    1996-01-01

    For several years the authors have been using a stereoscopic display as a tool in the planning of stereotactic neurosurgical techniques. This PC-based workstation allows the surgeon to interact with and view vascular images in three dimensions, as well as to perform quantitative analysis of the three-dimensional (3-D) space. Some of the perceptual issues relevant to the presentation of medical images on this stereoscopic display were addressed in five experiments. The authors show that a number of parameters--namely the shape, color, and depth cue, associated with a cursor--as well as the image filtering and observer position, have a role in improving the observer's perception of a 3-D image and his ability to localize points within the stereoscopically presented 3-D image. However, an analysis of the results indicates that while varying these parameters can lead to an effect on the performance of individual observers, the effects are not consistent across observers, and the mean accuracy remains relatively constant under the different experimental conditions

  17. Quantitative Machine Learning Analysis of Brain MRI Morphology throughout Aging.

    Science.gov (United States)

    Shamir, Lior; Long, Joe

    2016-01-01

    While cognition is clearly affected by aging, it is unclear whether the process of brain aging is driven solely by accumulation of environmental damage, or involves biological pathways. We applied quantitative image analysis to profile the alteration of brain tissues during aging. A dataset of 463 brain MRI images taken from a cohort of 416 subjects was analyzed using a large set of low-level numerical image content descriptors computed from the entire brain MRI images. The correlation between the numerical image content descriptors and the age was computed, and the alterations of the brain tissues during aging were quantified and profiled using machine learning. The comprehensive set of global image content descriptors provides a high Pearson correlation of ~0.9822 with the chronological age, indicating that the machine learning analysis of global features is sensitive to the age of the subjects. Profiling of the predicted age shows several periods of mild changes, separated by shorter periods of more rapid alterations. The periods with the most rapid changes were around the age of 55, and around the age of 65. The results show that the process of brain aging is not linear, and exhibits short periods of rapid aging separated by periods of milder change. These results are in agreement with patterns observed in cognitive decline, mental health status, and general human aging, suggesting that brain aging might not be driven solely by accumulation of environmental damage. Code and data used in the experiments are publicly available.

  18. Automatic quantitative analysis of liver functions by a computer system

    International Nuclear Information System (INIS)

    Shinpo, Takako

    1984-01-01

    In the previous paper, we confirmed the clinical usefulness of hepatic clearance (hepatic blood flow), which comprises the hepatic uptake and blood disappearance rate coefficients. These were obtained from the initial slope index of each minute during a period of five frames of a hepatogram after injecting 37 MBq of sup(99m)Tc-Sn-colloid. To analyze the information simply, rapidly and accurately, we developed an automatic quantitative analysis of liver functions. Information was obtained every quarter minute during a period of 60 frames of the sequential image. The sequential counts were measured for the heart, the whole liver, and the left and right lobes using a computer connected to a scintillation camera. We measured the effective hepatic blood flow as the disappearance rate multiplied by the percentage of hepatic uptake, where the hepatic uptake fraction is (liver counts)/(total counts of the field). Our method of analysis automatically recorded the graphs of the disappearance curve and uptake curve on the basis of the heart and the whole liver, respectively, and computed them using the BASIC language. This method makes it possible to obtain an image of the initial uptake of sup(99m)Tc-Sn-colloid into the liver with a small dose. (author)

  19. Quantitative analysis of the security performance in wireless LANs

    Directory of Open Access Journals (Sweden)

    Poonam Jindal

    2017-07-01

    A comprehensive experimental study to analyze the security performance of a WLAN based on IEEE 802.11 b/g/n standards in various network scenarios is presented in this paper. By setting up an experimental testbed we have measured results for a layered security model in terms of throughput, response time, encryption overheads, frame loss and jitter. Through numerical results obtained from the testbed, we have presented quantitative as well as realistic findings for both security mechanisms and network performance. It establishes the fact that there is always a tradeoff between the security strength and the associated network performance. It is observed that the non-roaming network always performs better than the roaming network under all network scenarios. To analyze the benefits offered by a particular security protocol, a relative security strength index model is demonstrated. Further, we have presented a statistical analysis of our experimental data. We found that different security protocols have different robustness against mobility. By choosing a robust security protocol, network performance can be improved. The presented analysis is significant and useful with reference to the assessment of the suitability of security protocols for given real-time applications.

  20. Survival Prediction in Pancreatic Ductal Adenocarcinoma by Quantitative Computed Tomography Image Analysis.

    Science.gov (United States)

    Attiyeh, Marc A; Chakraborty, Jayasree; Doussot, Alexandre; Langdon-Embry, Liana; Mainarich, Shiana; Gönen, Mithat; Balachandran, Vinod P; D'Angelica, Michael I; DeMatteo, Ronald P; Jarnagin, William R; Kingham, T Peter; Allen, Peter J; Simpson, Amber L; Do, Richard K

    2018-04-01

    Pancreatic cancer is a highly lethal cancer with no established a priori markers of survival. Existing nomograms rely mainly on post-resection data and are of limited utility in directing surgical management. This study investigated the use of quantitative computed tomography (CT) features to preoperatively assess survival for pancreatic ductal adenocarcinoma (PDAC) patients. A prospectively maintained database identified consecutive chemotherapy-naive patients with CT angiography and resected PDAC between 2009 and 2012. Variation in CT enhancement patterns was extracted from the tumor region using texture analysis, a quantitative image analysis tool previously described in the literature. Two continuous survival models were constructed, with 70% of the data (training set) using Cox regression, first based only on preoperative serum cancer antigen (CA) 19-9 levels and image features (model A), and then on CA19-9, image features, and the Brennan score (composite pathology score; model B). The remaining 30% of the data (test set) were reserved for independent validation. A total of 161 patients were included in the analysis. Training and test sets contained 113 and 48 patients, respectively. Quantitative image features combined with CA19-9 achieved a c-index of 0.69 [integrated Brier score (IBS) 0.224] on the test data, while combining CA19-9, imaging, and the Brennan score achieved a c-index of 0.74 (IBS 0.200) on the test data. We present two continuous survival prediction models for resected PDAC patients. Quantitative analysis of CT texture features is associated with overall survival. Further work includes applying the model to an external dataset to increase the sample size for training and to determine its applicability.
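
    A minimal sketch of the kind of continuous survival model described above: a Cox proportional-hazards fit of overall survival on CA19-9 plus one CT texture feature, reporting the concordance index. The cohort below is a small synthetic placeholder (with a ridge penalty for stability), not the study data.

        import pandas as pd
        from lifelines import CoxPHFitter

        df = pd.DataFrame({
            "months": [10, 24, 8, 30, 15, 40, 6, 22, 18, 35],
            "event": [1, 1, 1, 0, 1, 0, 1, 0, 1, 0],          # 1 = death observed
            "ca19_9": [120, 35, 400, 20, 90, 15, 600, 50, 200, 25],
            "texture1": [0.8, 0.9, 1.2, 0.2, 0.3, 0.1, 1.5, 0.7, 0.5, 0.4],
        })

        cph = CoxPHFitter(penalizer=0.1)
        cph.fit(df, duration_col="months", event_col="event")
        print(cph.concordance_index_)    # analogous to the c-indices reported above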

  1. Quantitative analysis of exercise 201Tl myocardial emission CT in patients with coronary artery disease

    International Nuclear Information System (INIS)

    Okada, Mitsuhiro; Kawai, Naoki; Yamamoto, Shuhei

    1984-01-01

    The clinical usefulness of quantitative analysis of exercise thallium-201 myocardial emission computed tomography (ECT) was evaluated in coronary artery disease (CAD). The subjects consisted of 20 CAD patients and five normal controls. All CAD patients underwent coronary angiography. Tomographic thallium-201 myocardial imaging was performed with a rotating gamma camera, and long-axial and short-axial myocardial images of the left ventricle were reconstructed. The tomographic images were interpreted quantitatively using circumferential profile analysis. Based on features of regional myocardial thallium-201 kinetics, two types of abnormalities were studied: (1) diminished initial distribution (stress defect) and (2) slow washout of thallium-201, identified when a patient's initial thallium-201 uptake profile or 3-hour washout rate profile, respectively, fell below the normal limits. Two diagnostic criteria, the stress defect alone and the combination of the stress defect and slow washout, were used to detect significant coronary artery lesions (≥75% luminal narrowing). The ischemic volumes were also evaluated by quantitative analysis using thallium-201 ECT. The diagnostic accuracy of the stress defect criterion was 95% for left anterior descending, 90% for right, and 70% for left circumflex coronary artery lesions. The combined criteria of the stress defect and slow washout increased detection sensitivity with a moderate loss of specificity for identifying individual coronary artery lesions. A relatively high diagnostic accuracy was obtained using the stress defect criterion for multiple vessel disease (75%). Ischemic myocardial volume was significantly larger in triple vessel than in single vessel disease (p < 0.05) using the combined criteria. It was concluded that quantitative analysis of exercise thallium-201 myocardial ECT images proves useful for evaluating coronary artery lesions. (author)
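    A minimal sketch of the circumferential profile idea follows: sample the maximum counts in angular sectors around the left-ventricular centre on the stress and 3-hour images, then flag sectors whose normalised uptake or washout fraction falls below an assumed lower normal limit. The images, centre coordinates and limits are placeholders, not the clinical normal database used in the study.

```python
# Simplified sketch of circumferential profile analysis for Tl-201 tomograms.
import numpy as np

def sector_max_counts(slice_img, centre, n_sectors=36):
    """Maximum count in each angular sector around the LV centre."""
    ny, nx = slice_img.shape
    y, x = np.mgrid[0:ny, 0:nx]
    angles = np.arctan2(y - centre[0], x - centre[1]) % (2 * np.pi)
    sector = np.minimum((angles / (2 * np.pi) * n_sectors).astype(int), n_sectors - 1)
    return np.array([slice_img[sector == s].max() for s in range(n_sectors)])

rng = np.random.default_rng(0)
stress_img = rng.random((64, 64)) + 0.5      # placeholder stress slice
delay_img = 0.6 * stress_img                 # placeholder 3-hour redistribution slice

stress_counts = sector_max_counts(stress_img, centre=(32, 32))
delay_counts = sector_max_counts(delay_img, centre=(32, 32))

uptake = stress_counts / stress_counts.max()              # normalised uptake profile
washout = (stress_counts - delay_counts) / stress_counts  # 3-hour washout fraction

normal_uptake_limit = np.full(36, 0.65)      # assumed lower normal limits
normal_washout_limit = np.full(36, 0.30)     # (placeholders, not a normal database)

stress_defect = uptake < normal_uptake_limit
slow_washout = washout < normal_washout_limit
combined_abnormal = stress_defect | slow_washout          # combined criterion
print("sectors flagged as abnormal:", int(combined_abnormal.sum()))
```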

  2. Quantitative analysis of night skyglow amplification under cloudy conditions

    Science.gov (United States)

    Kocifaj, Miroslav; Solano Lamphar, Héctor Antonio

    2014-10-01

    The radiance produced by artificial light is a major source of nighttime over-illumination. It can, however, be treated experimentally using ground-based and satellite data. These two types of data complement each other and together have a high information content. For instance, the satellite data enable upward light emissions to be normalized, and this in turn allows skyglow levels at the ground to be modelled under cloudy or overcast conditions. Excessive night lighting imposes an unacceptable burden on nature, humans and professional astronomy. For this reason, there is a pressing need to determine the total amount of downwelling diffuse radiation. Undoubtedly, cloudy periods can cause a significant increase in skyglow as a result of amplification owing to diffuse reflection from clouds. While it is recognized that the amplification factor (AF) varies with cloud cover, the effects of different types of clouds, of atmospheric turbidity and of the geometrical relationships between the positions of an individual observer, the cloud layer, and the light source are in general poorly known. In this paper the AF is quantitatively analysed considering different aerosol optical depths (AODs), urban layout sizes and cloud types with specific albedos and altitudes. The computational results show that the AF peaks near the edges of a city rather than at its centre. In addition, the AF appears to be a decreasing function of AOD, which is particularly important when modelling the skyglow in regions with apparent temporal or seasonal variability of atmospheric turbidity. The findings in this paper will be useful to those designing engineering applications or modelling light pollution, as well as to astronomers and environmental scientists who aim to predict the amplification of skyglow caused by clouds. In addition, the semi-analytical formulae can be used to estimate the AF levels, especially in densely populated metropolitan regions for which detailed computations may be CPU

  3. Regional sustainability in Northern Australia. A quantitative assessment of social, economic and environmental impacts

    International Nuclear Information System (INIS)

    Wood, Richard; Garnett, Stephen

    2010-01-01

    This paper seeks to provide a picture of sustainability of the Northern Territory by analysing a number of sustainability indicators across indigenous status and remoteness class. The paper seeks to extend current socio-economic statistics and analysis by including environmental considerations in a 'triple bottom line' or 'sustainability assessment' approach. Further, a life-cycle approach is employed for a number of indicators so that both direct and indirect impacts are considered where applicable. Whereas urban populations are generally doing better against most quantitative economic and social indicators, environmental indicators show the opposite, reflecting the increasing market-based environmental impacts of urban populations. As we seek to value these environmental impacts appropriately, it would be beneficial to start incorporating these results in policy and planning. (author)

  4. Regional sustainability in Northern Australia. A quantitative assessment of social, economic and environmental impacts

    Energy Technology Data Exchange (ETDEWEB)

    Wood, Richard [School for Environmental Research, Charles Darwin University, NT 0909 (Australia); Industrial Ecology Program, NTNU, Trondheim (Norway); Integrated Sustainability Analysis, University of Sydney, NSW 2006 (Australia); Garnett, Stephen [School for Environmental Research, Charles Darwin University, NT 0909 (Australia)

    2010-07-15

    This paper seeks to provide a picture of sustainability of the Northern Territory by analysing a number of sustainability indicators across indigenous status and remoteness class. The paper seeks to extend current socio-economic statistics and analysis by including environmental considerations in a 'triple bottom line' or 'sustainability assessment' approach. Further, a life-cycle approach is employed for a number of indicators so that both direct and indirect impacts are considered where applicable. Whereas urban populations are generally doing better against most quantitative economic and social indicators, environmental indicators show the opposite, reflecting the increasing market-based environmental impacts of urban populations. As we seek to value these environmental impacts appropriately, it would be beneficial to start incorporating these results in policy and planning. (author)

  5. Quantitative analysis of trivalent uranium and lanthanides in a molten chloride by absorption spectrophotometry

    International Nuclear Information System (INIS)

    Toshiyuki Fujii; Akihiro Uehara; Hajimu Yamana

    2013-01-01

    As an analytical application for pyrochemical reprocessing using molten salts, quantitative analysis of uranium and lanthanides by UV/Vis/NIR absorption spectrophotometry was performed. Electronic absorption spectra of LiCl-KCl eutectic at 773 K including trivalent uranium and eight rare earth elements (Y, La, Ce, Pr, Nd, Sm, Eu, and Gd as fission product elements) were measured in the wavenumber region of 4,500-33,000 cm -1 . The composition of the solutes was simulated for a reductive extraction condition in a pyroreprocessing process for spent nuclear fuels, that is, about 2 wt% U and 0.1-2 wt% rare earth elements. Since U(III) possesses strong absorption bands due to f-d transitions, an optical quartz cell with short light path length of 1 mm was adopted in the analysis. The quantitative analysis of trivalent U, Nd, Pr, and Sm was possible with their f-f transition intensities in the NIR region. The analytical results agree with the prepared concentrations within 2σ experimental uncertainties. (author)

  6. Quantitative fluorescence loss in photobleaching for analysis of protein transport and aggregation

    Directory of Open Access Journals (Sweden)

    Wüstner Daniel

    2012-11-01

    Full Text Available Abstract Background Fluorescence loss in photobleaching (FLIP) is a widely used imaging technique, which provides information about protein dynamics in various cellular regions. In FLIP, a small cellular region is repeatedly illuminated by an intense laser pulse, while images are taken with reduced laser power with a time lag between the bleaches. Despite its popularity, tools are lacking for quantitative analysis of FLIP experiments. Typically, the user defines regions of interest (ROIs) for further analysis, which is subjective and does not allow for comparing different cells and experimental settings. Results We present two complementary methods to detect and quantify protein transport and aggregation in living cells from FLIP image series. In the first approach, a stretched exponential (StrExp) function is fitted to fluorescence loss (FL) inside and outside the bleached region. We show by reaction–diffusion simulations that the StrExp function can describe both binding/barrier-limited and diffusion-limited FL kinetics. By pixel-wise regression of that function to FL kinetics of enhanced green fluorescent protein (eGFP), we determined in a user-unbiased manner from which cellular regions eGFP can be replenished in the bleached area. Spatial variation in the parameters calculated from the StrExp function allows for detecting diffusion barriers for eGFP in the nucleus and cytoplasm of living cells. Polyglutamine (polyQ) disease proteins like mutant huntingtin (mtHtt) can form large aggregates called inclusion bodies (IBs). The second method combines single particle tracking with multi-compartment modelling of FL kinetics in moving IBs to determine exchange rates of eGFP-tagged mtHtt protein (eGFP-mtHtt) between aggregates and the cytoplasm. This method is self-calibrating since it relates the FL inside and outside the bleached regions. It makes it therefore possible to compare release kinetics of eGFP-mtHtt between different cells and
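    The stretched-exponential fit at the core of the first method can be sketched as follows, assuming the form I(t) = I0·exp(-(t/τ)^β) and SciPy's curve_fit; the time base and noisy trace are synthetic stand-ins for a single pixel's fluorescence-loss curve.

```python
# Sketch: fit a stretched-exponential (StrExp) decay I(t) = I0 * exp(-(t/tau)**beta)
# to a fluorescence-loss curve, as could be done pixel-wise in a FLIP series.
import numpy as np
from scipy.optimize import curve_fit

def strexp(t, i0, tau, beta):
    return i0 * np.exp(-((t / tau) ** beta))

t = np.linspace(0.1, 120, 80)                      # seconds between bleach pulses (assumed)
rng = np.random.default_rng(0)
trace = strexp(t, 1.0, 35.0, 0.7) + rng.normal(0, 0.02, t.size)  # synthetic pixel trace

popt, pcov = curve_fit(strexp, t, trace, p0=(1.0, 30.0, 1.0),
                       bounds=([0, 1e-3, 0.1], [np.inf, np.inf, 2.0]))
i0, tau, beta = popt
print(f"I0={i0:.2f}, tau={tau:.1f} s, beta={beta:.2f}")
# beta < 1 indicates a broad distribution of loss rates, consistent with
# binding/barrier-limited rather than purely diffusion-limited fluorescence loss.
```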

  7. Synchrotron radiation microprobe quantitative analysis method for biomedical specimens

    International Nuclear Information System (INIS)

    Xu Qing; Shao Hanru

    1994-01-01

    Relative changes of trace elemental content in biomedical specimens are obtained easily by means of synchrotron radiation X-ray fluorescence microprobe analysis (SXRFM). However, the accurate assignment of concentration on a g/g basis is difficult, because it is necessary to know both the trace elemental content and the specimen mass in the irradiated volume simultaneously; the specimen mass is a function of the spatial position and cannot be weighed. In conventional XRF analysis with a Mo-anode X-ray tube, the specimen mass can be measured indirectly from the intensity of the Compton-scattered peak, provided the matrix consists of light elements and the specimen is a thin sample. In white-light SXRFM analysis the Compton peak is not present in the fluorescence spectrum; instead, the continuous background in the spectrum results from Compton scattering of the linearly polarized X-ray source. Biomedical specimens for SXRFM analysis, for example biological sections and human hair, are always thin samples for high-energy X-rays, and they consist of light elements such as H, C, N and O, which implies a linear relationship between the specimen mass and the Compton-scattering background in the high-energy region of the spectrum. In this way, it is possible to carry out concentration measurements in SXRFM analysis.

  8. Quantitative assessment of early diabetic retinopathy using fractal analysis.

    Science.gov (United States)

    Cheung, Ning; Donaghue, Kim C; Liew, Gerald; Rogers, Sophie L; Wang, Jie Jin; Lim, Shueh-Wen; Jenkins, Alicia J; Hsu, Wynne; Li Lee, Mong; Wong, Tien Y

    2009-01-01

    Fractal analysis can quantify the geometric complexity of the retinal vascular branching pattern and may therefore offer a new method to quantify early diabetic microvascular damage. In this study, we examined the relationship between retinal fractal dimension and retinopathy in young individuals with type 1 diabetes. We conducted a cross-sectional study of 729 patients with type 1 diabetes (aged 12-20 years) who had seven-field stereoscopic retinal photographs taken of both eyes. From these photographs, retinopathy was graded according to the modified Airlie House classification, and fractal dimension was quantified using a computer-based program following a standardized protocol. In this study, 137 patients (18.8%) had diabetic retinopathy signs; of these, 105 had mild retinopathy. Median (interquartile range) retinal fractal dimension was 1.46214 (1.45023-1.47217). After adjustment for age, sex, diabetes duration, A1C, blood pressure, and total cholesterol, increasing retinal vascular fractal dimension was significantly associated with increasing odds of retinopathy (odds ratio 3.92 [95% CI 2.02-7.61] for fourth versus first quartile of fractal dimension). In multivariate analysis, each 0.01 increase in retinal vascular fractal dimension was associated with a nearly 40% increased odds of retinopathy (1.37 [1.21-1.56]). This association remained after additional adjustment for retinal vascular caliber. Greater retinal fractal dimension, representing increased geometric complexity of the retinal vasculature, is independently associated with early diabetic retinopathy signs in type 1 diabetes. Fractal analysis of fundus photographs may allow quantitative measurement of early diabetic microvascular damage.
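    The fractal dimension referred to above is typically estimated by box counting on a segmented (binary) vessel image. The sketch below shows that generic algorithm, not the specific program used in the study; the test image is a synthetic placeholder.

```python
# Sketch: box-counting estimate of the fractal dimension of a binary
# (segmented) retinal-vessel image.
import numpy as np

def box_count_dimension(binary_img, box_sizes=(2, 4, 8, 16, 32, 64)):
    counts = []
    for s in box_sizes:
        # Trim so the image tiles exactly into s x s boxes, then count
        # boxes that contain at least one foreground pixel.
        h, w = (binary_img.shape[0] // s) * s, (binary_img.shape[1] // s) * s
        img = binary_img[:h, :w]
        blocks = img.reshape(h // s, s, w // s, s).any(axis=(1, 3))
        counts.append(blocks.sum())
    # Slope of log N(s) versus log(1/s) estimates the fractal dimension.
    slope, _ = np.polyfit(np.log(1.0 / np.array(box_sizes)), np.log(counts), 1)
    return slope

rng = np.random.default_rng(0)
vessels = rng.random((512, 512)) > 0.97   # placeholder "vessel" pixels, not a fundus image
print(f"estimated fractal dimension: {box_count_dimension(vessels):.3f}")
```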

  9. Nanotechnology patents in the automotive industry (a quantitative & qualitative analysis).

    Science.gov (United States)

    Prasad, Raghavendra; Bandyopadhyay, Tapas K

    2014-01-01

    The aim of this article is to present the trend in patent filings for applications of nanotechnology in the automobile sector across the world, using a keyword-based patent search. An overview of the patents related to nanotechnology in the automobile industry is provided. The work started with a worldwide patent search to find patents on nanotechnology in the automobile industry and to classify them according to the automobile parts to which they relate and the solutions they provide. Various graphs were then produced to give insight into the trends, and the patents were analysed across the various classifications. The trends shown in the graphs provide the quantitative analysis, whereas the qualitative analysis is presented in a separate section. The classification of patents based on the solution they provide was performed by reading the claims, titles, abstracts and full texts separately. The patentability of nanotechnology inventions is discussed with a view to giving an idea of the requirements and statutory bars to the patentability of nanotechnology inventions. A further objective of the work is to suggest an appropriate framework for companies regarding the use of nanotechnology in the automobile industry and a suggested strategy for patenting the related inventions. For example, US patent US2008-019426A1 discusses an invention related to a lubricant composition. This patent was studied and classified as falling under the automobile-parts classification. After studying this patent, it was deduced that the problem of friction in the engine is solved by this patent. One classification is based on the automobile part, while the other is based on the problem being solved; hence two classifications, namely reduction in friction and engine, were created. Similarly, after studying all the patents, a similar matrix was created.

  10. Collocations and collocation types in ESP textbooks: Quantitative pedagogical analysis

    Directory of Open Access Journals (Sweden)

    Bogdanović Vesna Ž.

    2016-01-01

    Full Text Available The term collocation, even though it is rather common in English grammar, is not a well-known or commonly used term in textbooks and scientific papers written in the Serbian language. Collocating is usually defined as the natural co-occurrence of two (or more) words, which usually appear next to each other even though they can be separated in the text, while collocations are defined as words with natural semantic and/or syntactic relations joined together in a sentence. Collocations occur naturally in all English texts, including scientific texts and papers. Using two textbooks of English for Specific Purposes (ESP) for intermediate students' courses, this paper presents the frequency of collocations and their typology. The paper investigates the relationship between the lexical and grammatical collocations found in the ESP texts and the reasons for their presence. An overview of the most frequently used subtypes of lexical collocations is given as well. Furthermore, applying basic quantitative corpus analysis, the paper presents the numbers of open, restricted and bound collocations in the ESP texts, drawing conclusions about their frequency and hence about the ways in which they can be learned. There is also a section on the number and usage of scientific collocations, both common scientific and narrowly professional ones. The conclusion is that the number of collocations present in the two selected textbooks calls for further analysis of these lexical connections, as well as for new ways of teaching and presenting them to students of English.

  11. The Measles Vaccination Narrative in Twitter: A Quantitative Analysis.

    Science.gov (United States)

    Radzikowski, Jacek; Stefanidis, Anthony; Jacobsen, Kathryn H; Croitoru, Arie; Crooks, Andrew; Delamater, Paul L

    2016-01-01

    The emergence of social media is providing an alternative avenue for information exchange and opinion formation on health-related issues. Collective discourse in such media leads to the formation of a complex narrative, conveying public views and perceptions. This paper presents a study of Twitter narrative regarding vaccination in the aftermath of the 2015 measles outbreak, both in terms of its cyber and physical characteristics. We aimed to contribute to the analysis of the data, as well as presenting a quantitative interdisciplinary approach to analyze such open-source data in the context of health narratives. We collected 669,136 tweets referring to vaccination from February 1 to March 9, 2015. These tweets were analyzed to identify key terms, connections among such terms, retweet patterns, the structure of the narrative, and connections to the geographical space. The data analysis captures the anatomy of the themes and relations that make up the discussion about vaccination in Twitter. The results highlight the higher impact of stories contributed by news organizations compared to direct tweets by health organizations in communicating health-related information. They also capture the structure of the antivaccination narrative and its terms of reference. Analysis also revealed the relationship between community engagement in Twitter and state policies regarding child vaccination. Residents of Vermont and Oregon, the two states with the highest rates of non-medical exemption from school-entry vaccines nationwide, are leading the social media discussion in terms of participation. The interdisciplinary study of health-related debates in social media across the cyber-physical debate nexus leads to a greater understanding of public concerns, views, and responses to health-related issues. Further coalescing such capabilities shows promise towards advancing health communication, thus supporting the design of more effective strategies that take into account the complex

  12. Quantitative MRI analysis of the brain after twenty-two years of neuromyelitis optica indicates focal tissue damage

    DEFF Research Database (Denmark)

    Aradi, Mihaly; Koszegi, Edit; Orsi, Gergely

    2013-01-01

    BACKGROUND: The long-term effect of neuromyelitis optica (NMO) on the brain is not well established. METHODS: After 22 years of NMO, a patient's brain was examined by quantitative T1- and T2-weighted mono- and biexponential diffusion and proton spectroscopy. It was compared to 3 cases with short-term NMO. (…) In such abnormal NAWM regions, biexponential diffusion analysis and quantitative spectroscopy indicated extracellular edema and axonal loss, respectively. Repeated analysis 6 months later identified the same alterations. Such patchy alterations were not detectable in the NAWM of the 3 cases with short-term NMO. (…)

  13. Quantitative risk analysis of urban flooding in lowland areas

    NARCIS (Netherlands)

    Ten Veldhuis, J.A.E.

    2010-01-01

    Urban flood risk analyses suffer from a lack of quantitative historical data on flooding incidents. Data collection takes place on an ad hoc basis and is usually restricted to severe events. The resulting data deficiency renders quantitative assessment of urban flood risks uncertain. The study

  14. Quantitative Analysis of Piezoelectric and Seismoelectric Anomalies in Subsurface Geophysics

    Science.gov (United States)

    Eppelbaum, Lev

    2017-04-01

    problem was the basis for an inverse problem, i.e. revealing the depth of a body, its location in space, and determining its physical properties. At the same time, this method has not received wide practical application, given the complexity of real geological media. Careful analysis of piezo- and seismoelectric anomalies shows that advanced methodologies developed in magnetic prospecting for complex physical-geological conditions can be applied to the quantitative analysis of these effects (Eppelbaum et al., 2000, 2001, 2010; Eppelbaum, 2010, 2011, 2015). Employment of these methodologies (improved modifications of the tangent, characteristic-point and areal methods) for obtaining quantitative characteristics of ore bodies, environmental features and archaeological targets (models of a horizontal circular cylinder, sphere, thin bed, thick bed and thin horizontal plate were utilized) has demonstrated their effectiveness. Case study at the archaeological site Tel Kara Hadid: field piezoelectric observations were conducted at the ancient archaeological site Tel Kara Hadid, with gold-quartz mineralization, in southern Israel within the Precambrian terrain at the northern extension of the Arabian-Nubian Shield (Neishtadt et al., 2006). The archaeological site is located eight kilometers north of the town of Eilat, in an area of strong industrial noise. Ancient alluvial river terraces (extremely heterogeneous at a local scale, varying from boulders to silt) cover the quartz veins and complicate their identification. Piezoelectric measurements conducted over a quartz vein covered by surface sediments (approximately 0.4 m thick) produced a sharp (500 μV) piezoelectric anomaly. Values recorded over the host rocks (clays and shales of basic composition) were close to zero. The observed piezoelectric anomaly was successfully interpreted by the use of methodologies developed in magnetic prospecting. For effective integration of piezo- and

  15. Feasibility of high-resolution quantitative perfusion analysis in patients with heart failure.

    Science.gov (United States)

    Sammut, Eva; Zarinabad, Niloufar; Wesolowski, Roman; Morton, Geraint; Chen, Zhong; Sohal, Manav; Carr-White, Gerry; Razavi, Reza; Chiribiri, Amedeo

    2015-02-12

    Cardiac magnetic resonance (CMR) is playing an expanding role in the assessment of patients with heart failure (HF). The assessment of myocardial perfusion status in HF can be challenging due to left ventricular (LV) remodelling and wall thinning, coexistent scar and respiratory artefacts. The aim of this study was to assess the feasibility of quantitative CMR myocardial perfusion analysis in patients with HF. A group of 58 patients with heart failure (HF; left ventricular ejection fraction, LVEF ≤ 50%) and 33 patients with normal LVEF (LVEF >50%), referred for suspected coronary artery disease, were studied. All subjects underwent quantitative first-pass stress perfusion imaging using adenosine according to standard acquisition protocols. The feasibility of quantitative perfusion analysis was then assessed using high-resolution, 3 T kt perfusion and voxel-wise Fermi deconvolution. 30/58 (52%) subjects in the HF group had underlying ischaemic aetiology. Perfusion abnormalities were seen amongst patients with ischaemic HF and patients with normal LV function. No regional perfusion defect was observed in the non-ischaemic HF group. Good agreement was found between visual and quantitative analysis across all groups. Absolute stress perfusion rate, myocardial perfusion reserve (MPR) and endocardial-epicardial MPR ratio identified areas with abnormal perfusion in the ischaemic HF group (p = 0.02; p = 0.04; p = 0.02, respectively). In the Normal LV group, MPR and endocardial-epicardial MPR ratio were able to distinguish between normal and abnormal segments (p = 0.04; p = 0.02 respectively). No significant differences of absolute stress perfusion rate or MPR were observed comparing visually normal segments amongst groups. Our results demonstrate the feasibility of high-resolution voxel-wise perfusion assessment in patients with HF.
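    Fermi deconvolution, the quantification step named above, can be sketched as fitting a tissue curve modelled as the arterial input function convolved with a Fermi impulse response. The AIF, tissue curve, units and parameter values below are synthetic placeholders, not the study's acquisition.

```python
# Sketch of Fermi-constrained deconvolution for first-pass perfusion:
# the tissue curve is modelled as AIF convolved with a Fermi impulse response
# h(t) = F / (1 + exp((t - t0)/k)); the fitted amplitude F is the flow-related term.
import numpy as np
from scipy.optimize import curve_fit

dt = 1.0                                   # s per dynamic frame (assumed)
t = np.arange(0, 60, dt)
aif = np.exp(-((t - 10) ** 2) / 20.0)      # placeholder arterial input function

def fermi_model(t, flow, t0, k):
    h = flow / (1.0 + np.exp((t - t0) / k))            # Fermi impulse response
    return np.convolve(aif, h)[: t.size] * dt          # tissue curve = AIF (*) h

true_tissue = fermi_model(t, 0.04, 6.0, 2.0)
rng = np.random.default_rng(0)
tissue = true_tissue + rng.normal(0, 0.002, t.size)    # noisy voxel curve (synthetic)

popt, _ = curve_fit(fermi_model, t, tissue, p0=(0.02, 5.0, 1.0),
                    bounds=(0, [1.0, 30.0, 10.0]))
print(f"fitted flow-related amplitude: {popt[0]:.3f} (arbitrary units)")
```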

  16. Quantitative analysis of elastography images in the detection of breast cancer

    International Nuclear Information System (INIS)

    Landoni, V.; Francione, V.; Marzi, S.; Pasciuti, K.; Ferrante, F.; Saracca, E.; Pedrini, M.; Strigari, L.; Crecco, M.; Di Nallo, A.

    2012-01-01

    Purpose: The aim of this study was to develop a quantitative method for breast cancer diagnosis based on elastosonography images, in order to reduce unnecessary biopsies whenever possible. The proposed method was validated by correlating the results of quantitative analysis with the diagnosis assessed by histopathologic exam. Material and methods: 109 images of breast lesions (50 benign and 59 malignant) were acquired with the traditional B-mode technique and with the elastographic modality. Images in Digital Imaging and Communications in Medicine (DICOM) format were exported into software, written in Visual Basic, developed especially for this study. The lesion was contoured and the mean grey value and softness inside the region of interest (ROI) were calculated. The correlations between variables were investigated and receiver operating characteristic (ROC) curve analysis was performed to assess the diagnostic accuracy of the proposed method. Pathologic results were used as the standard reference. Results: Both the mean grey value and the softness inside the ROI were significantly different at the t-test between the two populations of lesions (i.e., benign versus malignant): p < 0.0001. The area under the curve (AUC) was 0.924 (0.834–0.973) and 0.917 (0.826–0.970) for the mean grey value and for the softness, respectively. Conclusions: Quantitative elastosonography is a promising ultrasound technique in the detection of breast cancer, but large prospective trials are necessary to determine whether quantitative analysis of images can help to overcome some pitfalls of the method.
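    The statistical evaluation described above amounts to a two-sample t-test on the ROI feature followed by ROC/AUC analysis against the pathology reference. A minimal sketch with synthetic feature values, assuming SciPy and scikit-learn, is given below.

```python
# Sketch: compare an ROI feature (e.g. mean grey value inside the lesion)
# between benign and malignant groups with a t-test, then assess diagnostic
# accuracy by ROC/AUC against the histopathologic reference.
import numpy as np
from scipy import stats
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
benign = rng.normal(120, 15, 50)       # mean grey value, 50 benign lesions (synthetic)
malignant = rng.normal(90, 15, 59)     # mean grey value, 59 malignant lesions (synthetic)

t_stat, p_value = stats.ttest_ind(benign, malignant)
print(f"t = {t_stat:.2f}, p = {p_value:.2e}")

labels = np.r_[np.zeros(benign.size), np.ones(malignant.size)]
scores = np.r_[benign, malignant]
# Lower grey values indicate malignancy in this toy example, so negate the
# feature so that larger scores correspond to the positive (malignant) class.
auc = roc_auc_score(labels, -scores)
print(f"AUC = {auc:.3f}")
```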

  17. Quantitative genetic analysis of anxiety trait in bipolar disorder.

    Science.gov (United States)

    Contreras, J; Hare, E; Chavarría, G; Raventós, H

    2018-01-01

    Bipolar disorder type I (BPI) affects approximately 1% of the world population. Although genetic influences on bipolar disorder are well established, identification of genes that predispose to the illness has been difficult. Most genetic studies are based on categorical diagnosis. One strategy to overcome this obstacle is the use of quantitative endophenotypes, as has been done for other medical disorders. We studied 619 individuals, 568 participants from 61 extended families and 51 unrelated healthy controls. The sample was 55% female and had a mean age of 43.25 (SD 13.90; range 18-78). Heritability and genetic correlation of the trait scale from the Anxiety State and Trait Inventory (STAI) were computed using the general linear model (SOLAR software package). We observed that anxiety trait meets the following criteria for an endophenotype of BPI: 1) association with BPI (individuals with BPI showed the highest trait score; F = 15.20 [5,24], p = 0.009); 2) state-independence, confirmed by a test-retest in 321 subjects; 3) co-segregation within families; 4) heritability of 0.70 (SE 0.060, p = 2.33 × 10⁻¹⁴); and 5) genetic correlation with BPI of 0.20 (SE = 0.17, p = 3.12 × 10⁻⁵). Confounding factors such as comorbid disorders and pharmacological treatment could affect the clinical relationship between BPI and anxiety trait. Further research is needed to evaluate whether anxiety traits are specifically related to BPI in comparison with other traits such as anger, attention or response-inhibition deficits, pathological impulsivity or low self-directedness. Anxiety trait is a heritable phenotype that follows a normal distribution when measured not only in subjects with BPI but also in unrelated healthy controls. It could be used as an endophenotype in BPI for the identification of genomic regions containing susceptibility genes for this disorder. Published by Elsevier B.V.

  18. A Quantitative Analysis of Photovoltaic Modules Using Halved Cells

    Directory of Open Access Journals (Sweden)

    S. Guo

    2013-01-01

    Full Text Available In a silicon wafer-based photovoltaic (PV) module, significant power is lost to current transport through the ribbons interconnecting neighbouring cells. Using halved cells in PV modules is an effective method to reduce this resistive power loss, and it has already been applied by some major PV manufacturers (Mitsubishi, BP Solar) in their commercially available PV modules. As a consequence, quantitative analysis of PV modules using halved cells is needed. In this paper we investigate theoretically and experimentally the difference between modules made with halved and full-size solar cells. Theoretically, we find an improvement in fill factor of 1.8% absolute and in output power of 90 mW for the halved-cell minimodule. Experimentally, we find an improvement in fill factor of 1.3% absolute and in output power of 60 mW for the halved-cell module. We also investigate theoretically how this effect carries over to large-size modules. It is found that the performance increment of halved-cell PV modules is even higher for high-efficiency solar cells. The resistive loss of large-size modules with different interconnection schemes is then analysed. Finally, factors influencing the performance and cost of industrial halved-cell PV modules are discussed.
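    A first-order way to see why halved cells cut ribbon loss is the I²R argument sketched below: halving the cell halves the string current, quartering the loss per interconnection, while the number of interconnections roughly doubles. The current, resistance and cell-count values are assumed, typical-order numbers, not figures from the paper.

```python
# First-order illustration (not the paper's full model) of the halved-cell
# resistive-loss argument: each interconnection dissipates I^2 * R.
I_full = 9.0          # A, operating current of a full-size cell (assumed)
R_ribbon = 0.003      # ohm, effective resistance per cell interconnection (assumed)
n_full = 60           # interconnections in a conventional 60-cell layout (assumed)

loss_full = n_full * I_full**2 * R_ribbon
loss_half = (2 * n_full) * (I_full / 2) ** 2 * R_ribbon   # half the current, twice the count

print(f"full-cell ribbon loss : {loss_full:.1f} W")
print(f"half-cell ribbon loss : {loss_half:.1f} W "
      f"({100 * (1 - loss_half / loss_full):.0f}% lower)")
```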

  19. Full-Range Public Health Leadership, Part 1: Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Erik L. Carlton

    2015-04-01

    Full Text Available Background. Workforce and leadership development are central to the future of public health. However, public health has been slow to translate and apply leadership models from other professions and to incorporate local perspectives in understanding public health leadership. Purpose. This study utilized the full-range leadership model in order to examine public health leadership. Specifically, it sought to measure leadership styles among local health department directors and to understand the context of leadership in local health departments. Methods. Leadership styles among local health department directors (n=13) were examined using survey methodology. Quantitative analysis methods included descriptive statistics, boxplots, and Pearson bivariate correlations using SPSS v18.0. Findings. Self-reported leadership styles were highly correlated to leadership outcomes at the organizational level. However, they were not related to county health rankings. Results suggest the preeminence of leader behaviors and of providing individual consideration to staff, as compared to idealized attributes of leaders, intellectual stimulation, or inspirational motivation. Implications. Holistic leadership assessment instruments, such as the Multifactor Leadership Questionnaire (MLQ), can be useful in assessing public health leaders' approaches and outcomes. Comprehensive, 360-degree reviews may be especially helpful. Further research is needed to examine the effectiveness of public health leadership development models, as well as the extent to which public health leadership impacts public health outcomes.

  20. Quantitative analysis of dynamic association in live biological fluorescent samples.

    Directory of Open Access Journals (Sweden)

    Pekka Ruusuvuori

    Full Text Available Determining vesicle localization and association in live microscopy may be challenging due to non-simultaneous imaging of rapidly moving objects with two excitation channels. Besides errors due to movement of objects, imaging may also introduce shifting between the image channels, and traditional colocalization methods cannot handle such situations. Our approach to quantifying the association between tagged proteins is to use an object-based method where the exact match of object locations is not assumed. Point-pattern matching provides a measure of correspondence between two point-sets under various changes between the sets. Thus, it can be used for robust quantitative analysis of vesicle association between image channels. Results for a large set of synthetic images show that the novel association method based on point-pattern matching demonstrates a robust capability to detect association of closely located vesicles in live-cell microscopy where traditional colocalization methods fail to produce results. In addition, the method outperforms the Iterated Closest Point registration method used for comparison. Results for fixed and live experimental data show that the association method performs comparably to traditional methods in colocalization studies for fixed cells and performs favorably in association studies for live cells.
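    In the spirit of the object-based approach described above, the sketch below counts channel-A spots that have a channel-B partner within a distance tolerance using a KD-tree, so that a small inter-channel shift does not break the match. This is a simplified stand-in for full point-pattern matching; the coordinates, shift and tolerance are synthetic.

```python
# Sketch of an object-based association measure between two channels:
# count spots with a partner within a distance tolerance.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
channel_a = rng.uniform(0, 100, size=(40, 2))                     # vesicle centroids, channel A
shift = np.array([1.2, -0.8])                                     # small acquisition shift
channel_b = channel_a[:30] + shift + rng.normal(0, 0.3, (30, 2))  # 30 true partners
channel_b = np.vstack([channel_b, rng.uniform(0, 100, (10, 2))])  # plus unmatched spots

tolerance = 3.0   # pixels: maximum distance still counted as "associated" (assumed)
tree = cKDTree(channel_b)
dist, _ = tree.query(channel_a, k=1)
associated_fraction = np.mean(dist <= tolerance)
print(f"fraction of channel-A spots with a channel-B partner: {associated_fraction:.2f}")
```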

  1. Qualitative and quantitative analysis of plutonium in solid waste drums

    International Nuclear Information System (INIS)

    Anno, Jacques; Escarieux, Emile

    1977-01-01

    An assessment is presented of the results of a study carried out to develop the qualitative and quantitative analysis, by γ spectrometry, of plutonium in solid waste drums. After recalling the standards and their implications for the quantities of plutonium to be measured (application to industrial plutonium: 20% 240Pu), the equipment used is described. The measurement station is provided with a mechanical system consisting of a rail and a pulley block to bring in the drums, and a pit and a hydraulic jack with a rotating platform. The detection instrumentation consists of a high-volume coaxial Ge(Li) detector with a γ-ray resolution of 2 keV, the associated electronics, and data processing by a 'Plurimat 20' minicomputer. The principles of the identification and measurements are specified and supported by experimental results. They are the following: determination of the quality of the plutonium by measuring the ratio between the γ-ray intensities of the 239Pu 129 keV and 241Pu 148 keV lines; measurement of the 239Pu mass by estimating the γ-ray counting rate of the 375 keV line from calibration curves obtained with plutonium samples ranging from 32 mg to 80 g; and correction of the results for the source position within the drum and for the plastic-material filling of the drum. The experimental results obtained on 40 solid waste drums are presented along with the error estimates.

  2. A temperature-controlled photoelectrochemical cell for quantitative product analysis

    Science.gov (United States)

    Corson, Elizabeth R.; Creel, Erin B.; Kim, Youngsang; Urban, Jeffrey J.; Kostecki, Robert; McCloskey, Bryan D.

    2018-05-01

    In this study, we describe the design and operation of a temperature-controlled photoelectrochemical cell for analysis of gaseous and liquid products formed at an illuminated working electrode. This cell is specifically designed to quantitatively analyze photoelectrochemical processes that yield multiple gas and liquid products at low current densities and exhibit limiting reactant concentrations that prevent these processes from being studied in traditional single chamber electrolytic cells. The geometry of the cell presented in this paper enables front-illumination of the photoelectrode and maximizes the electrode surface area to electrolyte volume ratio to increase liquid product concentration and hence enhances ex situ spectroscopic sensitivity toward them. Gas is bubbled through the electrolyte in the working electrode chamber during operation to maintain a saturated reactant concentration and to continuously mix the electrolyte. Gaseous products are detected by an in-line gas chromatograph, and liquid products are analyzed ex situ by nuclear magnetic resonance. Cell performance was validated by examining carbon dioxide reduction on a silver foil electrode, showing comparable results both to those reported in the literature and identical experiments performed in a standard parallel-electrode electrochemical cell. To demonstrate a photoelectrochemical application of the cell, CO2 reduction experiments were carried out on a plasmonic nanostructured silver photocathode and showed different product distributions under dark and illuminated conditions.

  3. Quantitative risk analysis of the pipeline GASDUC III - solutions

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Edmilson P.; Bettoni, Izabel Cristina [PETROBRAS, Rio de Janeiro, RJ (Brazil)

    2009-07-01

    In this work the quantitative risk analysis for the external public of the Cabiunas - REDUC pipeline (GASDUC III), 180 km long and linking the municipalities of Macae and Duque de Caxias - RJ, was performed by PETROBRAS and ITSEMAP do Brasil. The pipeline has a large diameter (38 inches) and a high operating pressure (100 kgf/cm²), and carries natural gas through several densely populated areas. Initially, the individual risk contours were calculated without considering mitigating measures, resulting in an individual risk contour with frequencies of 1×10⁻⁶ per year involving sensitive occupations, and therefore considered unacceptable when compared with the INEA criterion. The societal risk was calculated for eight densely populated areas; their respective FN curves were below the advised limit established by INEA, except for two areas that required the proposal of additional mitigating measures to reduce the societal risk. Regarding societal risk, the FN curve should lie below the advised limit presented in the Technical Instruction of INEA. The individual and societal risks were reassessed incorporating some mitigating measures, the results fell below the advised limits established by INEA, and PETROBRAS obtained the license for installation of the pipeline. (author)

  4. Quantitative immunoelectrophoretic analysis of extract from cow hair and dander

    Energy Technology Data Exchange (ETDEWEB)

    Prahl, P; Weeke, B; Loewenstein, H [Rigshospitalet, Copenhagen (Denmark)

    1978-01-01

    Quantitative immunoelectrophoresis used for the analysis of a dialysed, centrifuged and freeze-dried extract from cow hair and dander revealed 17 antigens. Five of these were identified as serum proteins. Partial identity to antigens of serum and extract from hair and dander of goat, sheep, swine, horse, dog, cat, and guinea pig, and to antigens of house dust was demonstrated. Sera from 36 patients with manifest allergy to cow hair and dander selected on the basis of case history, RAST, skin and provocation test, were examined in crossed radioimmunoelectrophoresis (CRIE); sera from five persons with high serum IgE, but without allergy to cow hair and dander, and sera from five normal individuals were controls. 31/36 of the sera contained IgE with specific affinity for two of the antigens of the extract. Further, two major and six minor allergens were identified. The control sera showed no specific IgE binding. A significant positive correlation was found between RAST and CRIE for the first group of patients. The approximate molecular weights of the four major allergens obtained by means of gel chromatography were: 2.4 × 10⁴, 2 × 10⁴, 2 × 10⁵ dalton, respectively. Using Con-A and Con-A Sepharose in crossed immunoaffinoelectrophoresis, eight of the antigens were revealed to contain groups with affinity for Con-A.

  5. Quantitative produced water analysis using mobile 1H NMR

    International Nuclear Information System (INIS)

    Wagner, Lisabeth; Fridjonsson, Einar O; May, Eric F; Stanwix, Paul L; Graham, Brendan F; Carroll, Matthew R J; Johns, Michael L; Kalli, Chris

    2016-01-01

    Measurement of oil contamination of produced water is required in the oil and gas industry down to the parts-per-million (ppm) level prior to discharge in order to meet typical environmental legislative requirements. Here we present the use of compact, mobile ¹H nuclear magnetic resonance (NMR) spectroscopy, in combination with solid phase extraction (SPE), to meet this metrology need. The NMR hardware employed featured a sufficiently homogeneous magnetic field, such that chemical shift differences could be used to unambiguously differentiate, and hence quantitatively detect, the required oil and solvent NMR signals. A solvent system consisting of 1% v/v chloroform in tetrachloroethylene was deployed; this provided a comparable ¹H NMR signal intensity for the oil and the solvent (chloroform), and hence an internal reference ¹H signal from the chloroform, making the measurement effectively self-calibrating. The measurement process was applied to water contaminated with hexane or crude oil over the range 1–30 ppm. The results were validated against known solubility limits as well as infrared analysis and gas chromatography. (paper)
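    The self-calibration step can be illustrated by referencing the oil ¹H integral to the chloroform internal-standard integral on a per-proton basis and converting to a concentration in the original water sample. All volumes, integrals and the CH2-dominated oil assumption in the sketch below are illustrative values, not the paper's calibration.

```python
# Sketch of internal-reference quantification: oil 1H integral referenced to the
# chloroform (1% v/v internal standard) integral, then converted to ppm oil in water.
A_oil, A_ref = 0.85, 1.00        # integrated 1H signal areas (arbitrary, assumed)
v_solvent_mL = 10.0              # solvent volume used for the SPE elution (assumed)
v_water_L = 1.0                  # produced-water volume passed through the SPE (assumed)

# Moles of reference protons: 1% v/v CHCl3 (1 proton per molecule),
# density 1.49 g/mL, molar mass 119.4 g/mol.
v_chcl3_mL = 0.01 * v_solvent_mL
mol_H_ref = v_chcl3_mL * 1.49 / 119.4

# Per-proton ratio gives moles of oil protons; assume a CH2-dominated oil,
# i.e. ~7 g of oil per mole of 1H (14 g/mol per CH2 unit, 2 protons).
mol_H_oil = mol_H_ref * (A_oil / A_ref)
mass_oil_mg = mol_H_oil * 7.0 * 1000.0

ppm_oil_in_water = mass_oil_mg / v_water_L     # mg/L, approximately ppm for water
print(f"oil in water: {ppm_oil_in_water:.1f} ppm")
```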

  6. Photographers’ Nomenclature Units: A Structural and Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Margarita A. Mihailova

    2017-11-01

    Full Text Available Addressing the needs of cross- and intercultural communication as well as the methodology of contrastive research, the paper presents the results of a complex analysis conducted to describe the semantic and pragmatic parameters of nomenclature units denoting photography equipment in the modern Russian informal discourse of professional photographers. The research is exemplified by 34 original nomenclature units and their 34 Russian equivalents used in 6871 comments posted on the “Клуб.Foto.ru” website in 2015. The structural and quantitative analyses of photographers’ nomenclature demonstrate the users’ morphological and graphic preferences and indirectly reflect their social and professional values. The corpus-based approach developed by Kast-Aigner (2009: 141) was applied in the study with the aim of identifying the nomenclature units denoting photography equipment and of validating and elaborating the data of the existing corpus. The research also throws light on the problems of professional language development and derivational processes. The perspective of the study lies in researching the broader context of professional nomenclature.

  7. Quantitative immunoelectrophoretic analysis of extract from cow hair and dander

    International Nuclear Information System (INIS)

    Prahl, P.; Weeke, B.; Loewenstein, H.

    1978-01-01

    Quantitative immunoelectrophoresis used for the analysis of a dialysed, centrifuged and freeze-dried extract from cow hair and dander revealed 17 antigens. Five of these were identified as serum proteins. Partial identity to antigens of serum and extract from hair and dander of goat, sheep, swine, horse, dog, cat, and guinea pig, and to antigens of house dust was demonstrated. Sera from 36 patients with manifest allergy to cow hair and dander selected on the basis of case history, RAST, skin and provocation test, were examined in crossed radioimmunoelectrophoresis (CRIE); sera from five persons with high serum IgE, but without allergy to cow hair and dander, and sera from five normal individuals were controls. 31/36 of the sera contained IgE with specific affinity for two of the antigens of the extract. Further, two major and six minor allergens were identified. The control sera showed no specific IgE binding. A significant positive correlation was found between RAST and CRIE for the first group of patients. The approximate molecular weights of the four major allergens obtained by means of gel chromatography were: 2.4 × 10⁴, 2 × 10⁴, 2 × 10⁵ dalton, respectively. Using Con-A and Con-A Sepharose in crossed immunoaffinoelectrophoresis, eight of the antigens were revealed to contain groups with affinity for Con-A. (author)

  8. Social media in epilepsy: A quantitative and qualitative analysis.

    Science.gov (United States)

    Meng, Ying; Elkaim, Lior; Wang, Justin; Liu, Jessica; Alotaibi, Naif M; Ibrahim, George M; Fallah, Aria; Weil, Alexander G; Valiante, Taufik A; Lozano, Andres M; Rutka, James T

    2017-06-01

    While the social burden of epilepsy has been extensively studied, an evaluation of social media related to epilepsy may provide novel insight into disease perception, patient needs and access to treatments. The objective of this study is to assess patterns in social media and online communication usage related to epilepsy and its associated topics. We searched two major social media platforms (Facebook and Twitter) for public accounts dedicated to epilepsy. Results were analyzed using qualitative and quantitative methodologies. The former involved thematic and word count analysis for online posts and tweets on these platforms, while the latter employed descriptive statistics and non-parametric tests. Facebook had a higher number of pages (840 accounts) and users (3 million) compared to Twitter (137 accounts and 274,663 users). Foundation and support groups comprised most of the accounts and users on both Facebook and Twitter. The number of accounts increased by 100% from 2012 to 2016. Among the 403 posts and tweets analyzed, "providing information" on medications or correcting common misconceptions in epilepsy was the most common theme (48%). Surgical interventions for epilepsy were only mentioned in 1% of all posts and tweets. The current study provides a comprehensive reference on the usage of social media in epilepsy. The number of online users interested in epilepsy is likely the highest among all neurological conditions. Surgery, as a method of treating refractory epilepsy, however, could be underrepresented on social media. Copyright © 2017 Elsevier Inc. All rights reserved.

  9. Regional analysis of the nuclear-electricity

    International Nuclear Information System (INIS)

    Parera, M. D.

    2011-11-01

    In this study a regional analysis of the Argentinean electricity market was carried out, considering the effects of regional cooperation and of internal and international interconnections; the possibilities of inserting new nuclear power stations were evaluated for different regions of the country, indicating the most appropriate areas for these facilities in order to increase the penetration of nuclear energy in the national energy matrix. The interconnection of the electricity and natural gas markets was also studied, owing to the existing link between the two energy forms. For this purpose the MESSAGE program (Model for Energy Supply Strategy Alternatives and their General Environmental Impacts), promoted by the International Atomic Energy Agency, was used. This model carries out an economic optimization at the country level, obtaining the minimum cost for the modelled system. The division into regions defined by the Compania Administradora del Mercado Mayorista Electrico (CAMMESA), which divides the country into eight regions, was used. The characteristics and needs of each region were considered, including their respective demands and supplies of electric power and natural gas, as well as their existing and projected interconnections, composed of power lines and gas pipelines. According to the results obtained with the model, nuclear electricity is a competitive option. (Author)

  10. Geographical data structures supporting regional analysis

    International Nuclear Information System (INIS)

    Edwards, R.G.; Durfee, R.C.

    1978-01-01

    In recent years the computer has become a valuable aid in solving regional environmental problems. Over a hundred different geographic information systems have been developed to digitize, store, analyze, and display spatially distributed data. One important aspect of these systems is the data structure (e.g. grids, polygons, segments) used to model the environment being studied. This paper presents eight common geographic data structures and their use in studies of coal resources, power plant siting, population distributions, LANDSAT imagery analysis, and landuse analysis

  11. B1 -sensitivity analysis of quantitative magnetization transfer imaging.

    Science.gov (United States)

    Boudreau, Mathieu; Stikov, Nikola; Pike, G Bruce

    2018-01-01

    To evaluate the sensitivity of quantitative magnetization transfer (qMT) fitted parameters to B1 inaccuracies, focusing on the difference between two categories of T1 mapping techniques: B1-independent and B1-dependent. The B1-sensitivity of qMT was investigated and compared using two T1 measurement methods: inversion recovery (IR; B1-independent) and variable flip angle (VFA; B1-dependent). The study was separated into four stages: 1) numerical simulations, 2) sensitivity analysis of the Z-spectra, 3) healthy subjects at 3T, and 4) comparison using three different B1 imaging techniques. For typical B1 variations in the brain at 3T (±30%), the simulations resulted in errors of the pool-size ratio (F) ranging from -3% to 7% for VFA, and -40% to >100% for IR, agreeing with the Z-spectra sensitivity analysis. In healthy subjects, pooled whole-brain Pearson correlation coefficients for F (comparing measured double-angle and nominal flip-angle B1 maps) were ρ = 0.97/0.81 for VFA/IR. This work describes the B1-sensitivity characteristics of qMT, demonstrating that they vary substantially with the B1-dependency of the T1 mapping method. In particular, the pool-size ratio is more robust against B1 inaccuracies if VFA T1 mapping is used, so much so that B1 mapping could be omitted without substantially biasing F. Magn Reson Med 79:276-285, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
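    The B1-dependence of VFA T1 mapping referred to above can be simulated directly: generate SPGR signals with the true flip angles (nominal × B1) and refit T1 assuming the nominal angles. The sketch below does only this step, with assumed sequence parameters; it does not reproduce the qMT fit in which, as the abstract reports, this dependence partially cancels the B1 error.

```python
# Sketch: how a B1 scaling error propagates into VFA (DESPOT1-style) T1 estimates.
import numpy as np

TR, T1_true, M0 = 0.015, 1.0, 1.0          # s, s, a.u. (assumed typical 3T values)
nominal = np.deg2rad([3.0, 20.0])          # nominal VFA flip angles (assumed)

def spgr(alpha, T1):
    E1 = np.exp(-TR / T1)
    return M0 * np.sin(alpha) * (1 - E1) / (1 - np.cos(alpha) * E1)

def fit_t1(signal, alpha):
    # Linearised fit: S/sin(a) = E1 * S/tan(a) + M0*(1 - E1)
    y, x = signal / np.sin(alpha), signal / np.tan(alpha)
    E1 = np.polyfit(x, y, 1)[0]
    return -TR / np.log(E1)

for b1 in (0.7, 1.0, 1.3):                 # +/-30% B1 variation, as in the abstract
    s = spgr(nominal * b1, T1_true)        # what is actually measured
    t1_apparent = fit_t1(s, nominal)       # fit assuming the nominal angles
    print(f"B1 = {b1:.1f}: apparent T1 = {t1_apparent:.2f} s")
```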

  12. Quantitative Gait Analysis in Patients with Huntington’s Disease

    Directory of Open Access Journals (Sweden)

    Seon Jong Pyo

    2017-09-01

    Full Text Available Objective Gait disturbance is the main factor contributing to a negative impact on quality of life in patients with Huntington’s disease (HD. Understanding gait features in patients with HD is essential for planning a successful gait strategy. The aim of this study was to investigate temporospatial gait parameters in patients with HD compared with healthy controls. Methods We investigated 7 patients with HD. Diagnosis was confirmed by genetic analysis, and patients were evaluated with the Unified Huntington’s Disease Rating Scale (UHDRS. Gait features were assessed with a gait analyzer. We compared the results of patients with HD to those of 7 age- and sex-matched normal controls. Results Step length and stride length were decreased and base of support was increased in the HD group compared to the control group. In addition, coefficients of variability for step and stride length were increased in the HD group. The HD group showed slower walking velocity, an increased stance/swing phase in the gait cycle and a decreased proportion of single support time compared to the control group. Cadence did not differ significantly between groups. Among the UHDRS subscores, total motor score and total behavior score were positively correlated with step length, and total behavior score was positively correlated with walking velocity in patients with HD. Conclusion Increased variability in step and stride length, slower walking velocity, increased stance phase, and decreased swing phase and single support time with preserved cadence suggest that HD gait patterns are slow, ataxic and ineffective. This study suggests that quantitative gait analysis is needed to assess gait problems in HD.
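    The temporospatial summaries mentioned above (mean step length, its coefficient of variation, walking velocity, cadence) reduce to simple statistics over the detected steps, as in the sketch below; the step data are synthetic placeholders.

```python
# Sketch: basic temporospatial gait summaries from a series of detected steps.
import numpy as np

rng = np.random.default_rng(0)
step_length_cm = rng.normal(55.0, 6.0, size=30)    # per-step lengths (synthetic)
step_time_s = rng.normal(0.55, 0.05, size=30)      # per-step durations (synthetic)

cv_step_length = 100.0 * step_length_cm.std(ddof=1) / step_length_cm.mean()
velocity_cm_s = step_length_cm.sum() / step_time_s.sum()
cadence_steps_min = 60.0 * len(step_time_s) / step_time_s.sum()

print(f"mean step length : {step_length_cm.mean():.1f} cm")
print(f"step length CV   : {cv_step_length:.1f} %")
print(f"walking velocity : {velocity_cm_s:.1f} cm/s")
print(f"cadence          : {cadence_steps_min:.0f} steps/min")
```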

  13. Myelination progression in language-correlated regions in brain of normal children determined by quantitative MRI assessment.

    Science.gov (United States)

    Su, Peijen; Kuan, Chen-Chieh; Kaga, Kimitaka; Sano, Masaki; Mima, Kazuo

    2008-12-01

    To investigate the myelination progression course in language-correlated regions of children with normal brain development by quantitative magnetic resonance imaging (MRI) analysis compared with histological studies. The subjects were 241 neurologically intact neonates, infants and young children (128 boys and 113 girls) who underwent MRI between 2001 and 2007 at the University of Tokyo Hospital, ranging in age from 0 to 429 weeks corrected by postnatal age. To compare their data with adult values, 25 adolescents and adults (14 men and 11 women, aged from 14 to 83 years) were examined as controls. Axial T2-weighted images were obtained using spin-echo sequences at 1.5 T. Subjects with a history of prematurity, birth asphyxia, low Apgar score, seizures, active systemic disease, congenital anomaly, delayed development, infarcts, hemorrhages, brain lesions, or central nervous system malformation were excluded from the analysis. Seven regions of interest in language-correlated areas, namely Broca's area, Wernicke's area, the arcuate fasciculus, and the angular gyrus, as well as their right hemisphere homologous regions, and the auditory cortex, the motor cortex, and the visual cortex were examined. Signal intensity obtained by a region-of-interest methodology progresses from hyper- to hypointensity during myelination. We chose the inferior cerebellar peduncle as the internal standard of maturation. Myelination in all these seven language-correlated regions examined in this study shared the same curve pattern: no myelination was observed at birth, it reached maturation at about 1.5 years of age, and it continued to progress slowly thereafter into adult life. On the basis of scatter plot results, we put these areas into three groups: Group A, which included the motor cortex, the auditory cortex, and the visual cortex, myelinated faster than Group B, which included Broca's area, Wernicke's area, and the angular gyrus before 1.5 years old; Group C, consisting of the

  14. Characterising Ageing in the Human Brainstem Using Quantitative Multimodal MRI Analysis

    Directory of Open Access Journals (Sweden)

    Christian eLambert

    2013-08-01

    Full Text Available Ageing is ubiquitous to the human condition. The MRI correlates of healthy ageing have been extensively investigated using a range of modalities, including volumetric MRI, quantitative MRI and DTI. Despite this, the reported brainstem-related changes remain sparse. This is, in part, due to the technical and methodological limitations in quantitatively assessing and statistically analysing this region. By utilising a new method of brainstem segmentation, a large cohort of 100 healthy adults were assessed in this study for the effects of ageing within the human brainstem in vivo. Using quantitative MRI (qMRI), tensor-based morphometry (TBM) and voxel-based quantification (VBQ), the volumetric and quantitative changes across healthy adults between 19-75 years were characterised. In addition to the increased R2* in substantia nigra corresponding to increasing iron deposition with age, several novel findings were reported in the current study. These include selective volumetric loss of the brachium conjunctivum, with a corresponding decrease in magnetisation transfer (MT) and increase in proton density (PD), accounting for the previously described midbrain shrinkage. Additionally, we found increases in R1 and PD in several pontine and medullary structures. We consider these changes in the context of well-characterised, functional age-related changes, and propose potential biophysical mechanisms. This study provides detailed quantitative analysis of the internal architecture of the brainstem and provides a baseline for further studies of neurodegenerative diseases that are characterised by early, pre-clinical involvement of the brainstem, such as Parkinson’s and Alzheimer’s diseases.

  15. Quantitative analysis of diffusion tensor imaging (DTI) using statistical parametric mapping (SPM) for brain disorders

    Science.gov (United States)

    Lee, Jae-Seung; Im, In-Chul; Kang, Su-Man; Goo, Eun-Hoe; Kwak, Byung-Joon

    2013-07-01

    This study aimed to quantitatively analyze data from diffusion tensor imaging (DTI) using statistical parametric mapping (SPM) in patients with brain disorders and to assess its potential utility for analyzing brain function. DTI was obtained by performing 3.0-T magnetic resonance imaging for patients with Alzheimer's disease (AD) and vascular dementia (VD), and the data were analyzed using Matlab-based SPM software. The two-sample t-test was used for error analysis of the location of the activated pixels. We compared regions of white matter where the fractional anisotropy (FA) values were low and the apparent diffusion coefficients (ADCs) were increased. In the AD group, the FA values were low in the right superior temporal gyrus, right inferior temporal gyrus, right sub-lobar insula, and right occipital lingual gyrus, whereas the ADCs were significantly increased in the right inferior frontal gyrus and right middle frontal gyrus. In the VD group, the FA values were low in the right superior temporal gyrus, right inferior temporal gyrus, right limbic cingulate gyrus, and right sub-lobar caudate tail, whereas the ADCs were significantly increased in the left lateral globus pallidus and left medial globus pallidus. In conclusion, by using DTI and SPM analysis, we were able not only to determine the structural state of the regions affected by brain disorders but also to quantitatively analyze and assess brain function.
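
    The analysis above compares FA and ADC maps between groups with SPM's two-sample t-test. The fragment below is a minimal NumPy/SciPy sketch of the underlying voxelwise comparison; the array shapes, synthetic data, and uncorrected threshold are simplifying assumptions and do not reproduce SPM's preprocessing or multiple-comparison handling.

        # Illustrative voxelwise two-sample t-test on FA maps (patients vs. controls).
        # Shapes, data, and the uncorrected threshold are assumptions for demonstration.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        fa_patients = rng.normal(0.40, 0.05, size=(20, 8, 8, 8))   # 20 subjects, tiny 8x8x8 grid
        fa_controls = rng.normal(0.45, 0.05, size=(22, 8, 8, 8))

        t_map, p_map = stats.ttest_ind(fa_patients, fa_controls, axis=0)

        # Voxels where patients' FA is significantly lower (uncorrected p < 0.001)
        low_fa_mask = (p_map < 0.001) & (t_map < 0)
        print("voxels flagged:", int(low_fa_mask.sum()))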

  16. Genetic basis of qualitative and quantitative resistance to powdery mildew in wheat: from consensus regions to candidate genes.

    Science.gov (United States)

    Marone, Daniela; Russo, Maria A; Laidò, Giovanni; De Vita, Pasquale; Papa, Roberto; Blanco, Antonio; Gadaleta, Agata; Rubiales, Diego; Mastrangelo, Anna M

    2013-08-19

    Powdery mildew (Blumeria graminis f. sp. tritici) is one of the most damaging diseases of wheat. The objective of this study was to identify the wheat genomic regions that are involved in the control of powdery mildew resistance through a quantitative trait loci (QTL) meta-analysis approach. This meta-analysis allows the use of collected QTL data from different published studies to obtain consensus QTL across different genetic backgrounds, thus providing a better definition of the regions responsible for the trait, and the possibility to obtain molecular markers that will be suitable for marker-assisted selection. Five QTL for resistance to powdery mildew were identified under field conditions in the durum-wheat segregating population Creso × Pedroso. An integrated map was developed for the projection of resistance genes/alleles and the QTL from the present study and the literature, and to investigate their distribution in the wheat genome. Molecular markers that correspond to candidate genes for plant responses to pathogens were also projected onto the map, particularly considering NBS-LRR and receptor-like protein kinases. More than 80 independent QTL and 51 resistance genes from 62 different mapping populations were projected onto the consensus map using the Biomercator statistical software. Twenty-four MQTL, each comprising 2 to 6 initial QTL with widely varying confidence intervals, were found on 15 chromosomes. The co-location of the resistance QTL and genes was investigated. Moreover, from analysis of the sequences of DArT markers, 28 DArT clones mapped on wheat chromosomes have been shown to be associated with the NBS-LRR genes and positioned in the same regions as the MQTL for powdery mildew resistance. The results from the present study provide a detailed analysis of the genetic basis of resistance to powdery mildew in wheat. The study of the Creso × Pedroso durum-wheat population has revealed some QTL that had not been previously identified. Furthermore

  17. Watershed Planning within a Quantitative Scenario Analysis Framework.

    Science.gov (United States)

    Merriam, Eric R; Petty, J Todd; Strager, Michael P

    2016-07-24

    There is a critical need for tools and methodologies capable of managing aquatic systems within heavily impacted watersheds. Current efforts often fall short as a result of an inability to quantify and predict complex cumulative effects of current and future land use scenarios at relevant spatial scales. The goal of this manuscript is to provide methods for conducting a targeted watershed assessment that enables resource managers to produce landscape-based cumulative effects models for use within a scenario analysis management framework. Sites are first selected for inclusion within the watershed assessment by identifying sites that fall along independent gradients and combinations of known stressors. Field and laboratory techniques are then used to obtain data on the physical, chemical, and biological effects of multiple land use activities. Multiple linear regression analysis is then used to produce landscape-based cumulative effects models for predicting aquatic conditions. Lastly, methods for incorporating cumulative effects models within a scenario analysis framework for guiding management and regulatory decisions (e.g., permitting and mitigation) within actively developing watersheds are discussed and demonstrated for 2 sub-watersheds within the mountaintop mining region of central Appalachia. The watershed assessment and management approach provided herein enables resource managers to facilitate economic and development activity while protecting aquatic resources and producing opportunity for net ecological benefits through targeted remediation.
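
    The assessment above builds landscape-based cumulative effects models with multiple linear regression and then reuses them for scenario predictions. Below is a minimal sketch of that workflow; the predictor names (mining and residential land-use percentages), the response (conductivity), and all numbers are hypothetical.

        # Illustrative sketch of a landscape-based cumulative-effects model via multiple
        # linear regression; predictor names and data are hypothetical, not the study's.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        n_sites = 40
        mining_pct = rng.uniform(0, 60, n_sites)        # % of catchment mined
        residential_pct = rng.uniform(0, 30, n_sites)   # % residential land use
        conductivity = 200 + 8 * mining_pct + 4 * residential_pct + rng.normal(0, 40, n_sites)

        X = sm.add_constant(np.column_stack([mining_pct, residential_pct]))
        model = sm.OLS(conductivity, X).fit()
        print(model.params)        # fitted coefficients
        print(model.rsquared)      # variance explained

        # Scenario analysis: predict conductivity under hypothetical future land-use mixes
        future = sm.add_constant(np.array([[45.0, 20.0], [10.0, 5.0]]), has_constant='add')
        print(model.predict(future))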

  18. Use of local noise power spectrum and wavelet analysis in quantitative image quality assurance for EPIDs

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Soyoung [Department of Radiation Oncology, University Hospitals Case and Medical Center, Cleveland, Ohio 44106 (United States); Yan, Guanghua; Bassett, Philip; Samant, Sanjiv, E-mail: samant@ufl.edu [Department of Radiation Oncology, University of Florida College of Medicine, Gainesville, Florida 32608 (United States); Gopal, Arun [Department of Radiation Oncology, University of Maryland School of Medicine, Baltimore, Maryland 21201 (United States)

    2016-09-15

    Purpose: To investigate the use of the local noise power spectrum (NPS) to characterize image noise and wavelet analysis to isolate defective pixels and inter-subpanel flat-fielding artifacts for quantitative quality assurance (QA) of electronic portal imaging devices (EPIDs). Methods: A total of 93 image sets including custom-made bar-pattern images and open exposure images were collected from four iViewGT a-Si EPID systems over three years. Global quantitative metrics such as the modulation transfer function (MTF), NPS, and detective quantum efficiency (DQE) were computed for each image set. Local NPS was also calculated for individual subpanels by sampling regions of interest within each subpanel of the EPID. The 1D NPS, obtained by radially averaging the 2D NPS, was fitted to a power-law function. The r-square value of the linear regression analysis was used as a singular metric to characterize the noise properties of individual subpanels of the EPID. The sensitivity of the local NPS was first compared with the global quantitative metrics using historical image sets. It was then compared with two commonly used commercial QA systems with images collected after applying two different EPID calibration methods (single-level gain and multilevel gain). To detect isolated defective pixels and inter-subpanel flat-fielding artifacts, the Haar wavelet transform was applied to the images. Results: Global quantitative metrics including MTF, NPS, and DQE showed little change over the period of data collection. On the contrary, a strong correlation between the local NPS (r-square values) and the variation of the EPID noise condition was observed. The local NPS analysis indicated image quality improvement with the r-square values increased from 0.80 ± 0.03 (before calibration) to 0.85 ± 0.03 (after single-level gain calibration) and to 0.96 ± 0.03 (after multilevel gain calibration), while the commercial QA systems failed to distinguish the image quality improvement between the two
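
    A minimal sketch of the local NPS metric described above: compute the 2D NPS of a flat-field ROI, radially average it to 1D, fit a power law in log-log space, and report the r-square of that fit. The normalization, ROI sampling, and synthetic correlated noise are simplifying assumptions, not the paper's exact procedure.

        # Illustrative sketch: 1D noise power spectrum of a flat-field ROI and power-law fit.
        # Normalization, ROI sampling and units are simplified assumptions, not the QA method.
        import numpy as np
        from scipy.ndimage import gaussian_filter

        rng = np.random.default_rng(2)
        raw = rng.normal(0.0, 5.0, size=(128, 128))
        roi = 1000.0 + gaussian_filter(raw, sigma=1.5)           # hypothetical sub-panel ROI

        nps2d = np.abs(np.fft.fftshift(np.fft.fft2(roi - roi.mean())))**2 / roi.size

        y, x = np.indices(nps2d.shape)
        cy, cx = np.array(nps2d.shape) // 2
        radius = np.hypot(y - cy, x - cx).astype(int)
        nps1d = np.bincount(radius.ravel(), weights=nps2d.ravel()) / np.bincount(radius.ravel())

        freq = np.arange(1, 60)                                  # drop DC, keep low/mid frequencies
        slope, intercept = np.polyfit(np.log(freq), np.log(nps1d[freq]), 1)
        fit = slope * np.log(freq) + intercept
        ss_res = np.sum((np.log(nps1d[freq]) - fit) ** 2)
        ss_tot = np.sum((np.log(nps1d[freq]) - np.log(nps1d[freq]).mean()) ** 2)
        print("power-law exponent:", slope, " r-square:", 1 - ss_res / ss_tot)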

  19. Serial thallium-201 imaging after dipyridamole for coronary disease detection: quantitative analysis using myocardial clearance

    International Nuclear Information System (INIS)

    Okada, R.D.; Dai, Y.H.; Boucher, C.A.; Pohost, G.M.

    1984-01-01

    After dipyridamole, canine studies have demonstrated a slower rate of myocardial thallium-201 clearance from zones distal to a coronary artery stenosis compared to normal zones. To determine if criteria based on canine myocardial thallium-201 clearance rates could be applied clinically, 40 patients with and 26 patients without coronary artery disease (CAD) had serial thallium-201 images obtained for 2 to 5 hours after dipyridamole. Regions of interest were manually placed over six left ventricular segments in two projections for each of three imaging times. The myocardial thallium-201 clearance rate was calculated for each of the six segments and, using the clearance rate criterion found in canine studies, was considered abnormal if less than 6.5%/hr. Using this criterion alone, 22 of 26 patients (85%) without CAD had normal and 30 of 40 patients (75%) with CAD had abnormal myocardial thallium-201 clearance rates. A quantitative analysis of regional inhomogeneity in tracer distribution (normal was greater than or equal to 25% difference between segments) was negative in 24 of 26 patients (92%) without CAD and positive in 20 of 40 patients (50%) with CAD. When both clearance rate and regional inhomogeneity were considered, 21 of 26 patients (81%) without CAD had negative and 36 of 40 patients (90%) with CAD had positive results. Thus, post-dipyridamole myocardial clearance rate criteria derived from canine studies can be applied to clinical thallium imaging. Quantitative analysis of serial thallium-201 images after dipyridamole is optimized by using myocardial thallium-201 clearance rates. Such an approach is independent of regional inhomogeneities in tracer distribution
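
    A minimal sketch of the clearance-rate criterion described above (a segmental thallium-201 clearance below 6.5%/hr is read as abnormal); the segment counts and imaging interval are hypothetical.

        # Illustrative sketch of the myocardial Tl-201 clearance-rate criterion (< 6.5 %/hr
        # considered abnormal). Segment counts and imaging times are hypothetical values.
        initial_counts = {"anterior": 1200, "apical": 1150, "inferior": 1180,
                          "septal": 1100, "lateral": 1210, "posterolateral": 990}
        delayed_counts = {"anterior": 930, "apical": 900, "inferior": 1120,
                          "septal": 860, "lateral": 940, "posterolateral": 955}
        hours_between_images = 3.0
        ABNORMAL_THRESHOLD = 6.5      # percent clearance per hour

        for segment, c0 in initial_counts.items():
            clearance_per_hr = 100.0 * (c0 - delayed_counts[segment]) / c0 / hours_between_images
            status = "abnormal" if clearance_per_hr < ABNORMAL_THRESHOLD else "normal"
            print(f"{segment:15s} {clearance_per_hr:5.1f} %/hr  {status}")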

  20. Qualitative and Quantitative Analysis for US Army Recruiting Input Allocation

    National Research Council Canada - National Science Library

    Brence, John

    2004-01-01

    .... An objective study of the quantitative and qualitative aspects of recruiting is necessary to meet the future needs of the Army, in light of strong possibilities of recruiting resource reduction...

  1. Quantitative analysis of psychological personality for NPP operators

    International Nuclear Information System (INIS)

    Gao Jia; Huang Xiangrui

    1998-01-01

    The author introduces the quantitative personality research carried out by the 'Prognoz' Laboratory and in Taiwan, presents the primary results of the psychological personality assessment of Chinese Nuclear Power Plant (NPP) operators, which is based on the MMPI survey, outlines the main contents of quantitative personality research in Chinese NPPs, and emphasizes the need to carry out psychological selection and training in the nuclear industry

  2. Combining SPECT and Quantitative EEG Analysis for the Automated Differential Diagnosis of Disorders with Amnestic Symptoms

    Directory of Open Access Journals (Sweden)

    Yvonne Höller

    2017-09-01

    Full Text Available Single photon emission computed tomography (SPECT) and electroencephalography (EEG) have become established tools in routine diagnostics of dementia. We aimed to increase the diagnostic power by combining quantitative markers from SPECT and EEG for differential diagnosis of disorders with amnestic symptoms. We hypothesize that the combination of SPECT with measures of interaction (connectivity) in the EEG yields higher diagnostic accuracy than the single modalities. We examined 39 patients with Alzheimer's dementia (AD), 69 patients with depressive cognitive impairment (DCI), 71 patients with amnestic mild cognitive impairment (aMCI), and 41 patients with amnestic subjective cognitive complaints (aSCC). We calculated 14 measures of interaction from a standard clinical EEG recording and derived graph-theoretic network measures. From regional brain perfusion measured by 99mTc-hexamethyl-propylene-aminoxime (HMPAO) SPECT in 46 regions, we calculated relative cerebral perfusion in these patients. Patient groups were classified pairwise with a linear support vector machine. Classification was conducted separately for each biomarker, and then again for each EEG biomarker combined with SPECT. Combination of SPECT with EEG biomarkers outperformed single use of SPECT or EEG when classifying aSCC vs. AD (90%), aMCI vs. AD (70%), and AD vs. DCI (100%), while a selection of EEG measures performed best when classifying aSCC vs. aMCI (82%) and aMCI vs. DCI (90%). Only the contrast between aSCC and DCI did not result in above-chance classification accuracy (60%). In general, accuracies were higher when measures of interaction (i.e., connectivity measures) were applied directly than when graph-theoretical measures were derived. We suggest that quantitative analysis of EEG and machine-learning techniques can support differentiating AD, aMCI, aSCC, and DCI, especially when being combined with imaging methods such as SPECT. Quantitative analysis of EEG connectivity could become
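
    The study above classifies patient groups pairwise with a linear support vector machine on combined SPECT and EEG features. A minimal sketch of one such pairwise classification follows; the synthetic feature matrix, group sizes, and cross-validation scheme are assumptions for illustration only.

        # Illustrative sketch of pairwise classification (e.g., aSCC vs. AD) with a linear SVM
        # on combined EEG-connectivity and SPECT-perfusion features; the data are synthetic.
        import numpy as np
        from sklearn.svm import SVC
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(3)
        n_per_group = 40
        eeg_feats = 14          # e.g., one summary value per connectivity measure
        spect_regions = 46      # relative perfusion in 46 regions

        X_a = np.hstack([rng.normal(0.0, 1, (n_per_group, eeg_feats)),
                         rng.normal(1.0, 1, (n_per_group, spect_regions))])
        X_b = np.hstack([rng.normal(0.3, 1, (n_per_group, eeg_feats)),
                         rng.normal(0.7, 1, (n_per_group, spect_regions))])
        X = np.vstack([X_a, X_b])
        y = np.array([0] * n_per_group + [1] * n_per_group)

        clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
        scores = cross_val_score(clf, X, y, cv=5)
        print("cross-validated accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))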

  3. Hydrocarbons on Phoebe, Iapetus, and Hyperion: Quantitative Analysis

    Science.gov (United States)

    Cruikshank, Dale P.; MoreauDalleOre, Cristina; Pendleton, Yvonne J.; Clark, Roger Nelson

    2012-01-01

    We present a quantitative analysis of the hydrocarbon spectral bands measured on three of Saturn's satellites: Phoebe, Iapetus, and Hyperion. These bands, measured with the Cassini Visible-Infrared Mapping Spectrometer on close flybys of these satellites, are the C-H stretching modes of aromatic hydrocarbons at approximately 3.28 micrometers (approximately 3050 per centimeter), and there are four blended bands of aliphatic -CH2- and -CH3 in the range approximately 3.36-3.52 micrometers (approximately 2980-2840 per centimeter). The aromatic band, probably indicating the presence of polycyclic aromatic hydrocarbons (PAH), is unusually strong in comparison to the aliphatic bands, resulting in a unique signature among Solar System bodies measured so far, and as such offers a means of comparison among the three satellites. The ratio of the C-H bands in aromatic molecules to those in aliphatic molecules in the surface materials of Phoebe is NAro:NAliph approximately 24; for Hyperion the value is approximately 12, while Iapetus shows an intermediate value. In view of the trend of the evolution (dehydrogenation by heat and radiation) of aliphatic complexes toward more compact molecules and eventually to aromatics, the relative abundance of aliphatic -CH2- and -CH3- is an indication of the lengths of the molecular chain structures, hence the degree of modification of the original material. We derive CH2:CH3 approximately 2.2 in the spectrum of low-albedo material on Iapetus; this value is the same within measurement errors as the ratio in the diffuse interstellar medium. The similarity in the spectral signatures of the three satellites, plus the apparent weak trend of aromatic/aliphatic abundance from Phoebe to Hyperion, is consistent with, and effectively confirms, that the source of the hydrocarbon-bearing material is Phoebe, and that the appearance of that material on the other two satellites arises from the deposition of the inward-spiraling dust that populates the Phoebe ring.

  4. A qualitative and quantitative analysis of vegetable pricing in supermarket

    Science.gov (United States)

    Miranda, Suci

    2017-06-01

    The purpose of this study is to analyze qualitatively and quantitatively the variables affecting the determination of the sale price of vegetables that is constant over time in a supermarket. It focuses on non-organic vegetables with a fixed selling price over time, such as spinach, beet, and parsley. In the qualitative analysis, the sale price determination is influenced by the vegetable characteristics: (1) vegetable segmentation (low to high daily consumption); (2) vegetable age (how long it can last, related to freshness); both characteristics relate to inventory management and ultimately to the sale price in the supermarket. Quantitatively, the vegetables are divided into two categories: the leafy vegetable group, whose leaves are eaten as a vegetable, with product age (a) = 0 and shelf life (t) = 0, and the non-leafy vegetable group with age (a) = a+1 and shelf life (t) = t+1. A vegetable age of (a) = 0 means the items last only one day once ordered and must then be discarded, whereas a+1 means they have a longer life of more than a day, such as beet, white radish, and string beans. The shelf life refers to how long an item is kept on a shelf in the supermarket, in line with the vegetable age. According to the cost-plus pricing method using the full costing approach, production costs, non-production costs, and markup are adjusted differently for each category. A holding cost is added to the sale price of the non-leafy vegetables, while a zero holding cost is assumed for the leafy vegetable category. The expected margin of each category is correlated with the vegetable characteristics.
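
    A minimal arithmetic sketch of the cost-plus (full costing) pricing rule described above, with a holding cost added only for the non-leafy category; every figure is invented.

        # Illustrative cost-plus (full costing) sketch: a holding cost is added only for the
        # non-leafy category, mirroring the description above. All numbers are hypothetical.
        def sale_price(production_cost, nonproduction_cost, markup_rate, holding_cost=0.0):
            full_cost = production_cost + nonproduction_cost + holding_cost
            return full_cost * (1.0 + markup_rate)

        # Leafy vegetable (a = 0, t = 0): no holding cost, sold the day it arrives
        spinach_price = sale_price(production_cost=2.00, nonproduction_cost=0.50,
                                   markup_rate=0.30)

        # Non-leafy vegetable (a = a+1, t = t+1): holding cost for shelf days is included
        beet_price = sale_price(production_cost=2.00, nonproduction_cost=0.50,
                                markup_rate=0.30, holding_cost=0.20)

        print(f"spinach: {spinach_price:.2f}  beet: {beet_price:.2f}")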

  5. The research of the quantitative prediction of the deposits concentrated regions of the large and super-large sized mineral deposits in China

    International Nuclear Information System (INIS)

    Zhao Zhenyu; Wang Shicheng

    2003-01-01

    Based on the general theory and methods of synthetic-information mineral resources prognosis, locational and quantitative predictions of large and super-large solid mineral deposits at the 1:5,000,000 scale are developed for China. The deposit-concentrated regions are the model units, and the anomaly-concentrated regions are the prediction units. The mineral prognosis of synthetic information is developed on a GIS platform. The technical route and working method for locating large and super-large mineral resources, and the basic principles for compiling the attribute tables of the predictor variables and the response variables, are described. In the prediction of resource quantity, the locational and quantitative predictions are processed separately by quantification theory III and the corresponding characteristic analysis, and the two methods are compared. This is important and helpful for resource prediction in the ten western provinces of China. (authors)

  6. Clinical significance of quantitative analysis of facial nerve enhancement on MRI in Bell's palsy.

    Science.gov (United States)

    Song, Mee Hyun; Kim, Jinna; Jeon, Ju Hyun; Cho, Chang Il; Yoo, Eun Hye; Lee, Won-Sang; Lee, Ho-Ki

    2008-11-01

    Quantitative analysis of the facial nerve on the lesion side as well as the normal side, which allowed for more accurate measurement of facial nerve enhancement in patients with facial palsy, showed statistically significant correlation with the initial severity of facial nerve inflammation, although little prognostic significance was shown. This study investigated the clinical significance of quantitative measurement of facial nerve enhancement in patients with Bell's palsy by analyzing the enhancement pattern and correlating MRI findings with initial severity of facial palsy and clinical outcome. Facial nerve enhancement was measured quantitatively by using the region of interest on pre- and postcontrast T1-weighted images in 44 patients diagnosed with Bell's palsy. The signal intensity increase on the lesion side was first compared with that of the contralateral side and then correlated with the initial degree of facial palsy and prognosis. The lesion side showed significantly higher signal intensity increase compared with the normal side in all of the segments except for the mastoid segment. Signal intensity increase at the internal auditory canal and labyrinthine segments showed correlation with the initial degree of facial palsy but no significant difference was found between different prognostic groups.

  7. Universal platform for quantitative analysis of DNA transposition

    Directory of Open Access Journals (Sweden)

    Pajunen Maria I

    2010-11-01

    Full Text Available Abstract Background Completed genome projects have revealed an astonishing diversity of transposable genetic elements, implying the existence of novel element families yet to be discovered from diverse life forms. Concurrently, several better understood transposon systems have been exploited as efficient tools in molecular biology and genomics applications. Characterization of new mobile elements and improvement of the existing transposition technology platforms warrant easy-to-use assays for the quantitative analysis of DNA transposition. Results Here we developed a universal in vivo platform for the analysis of transposition frequency with class II mobile elements, i.e., DNA transposons. For each particular transposon system, cloning of the transposon ends and the cognate transposase gene, in three consecutive steps, generates a multifunctional plasmid, which drives inducible expression of the transposase gene and includes a mobilisable lacZ-containing reporter transposon. The assay scores transposition events as blue microcolonies, papillae, growing within otherwise whitish Escherichia coli colonies on indicator plates. We developed the assay using phage Mu transposition as a test model and validated the platform using various MuA transposase mutants. For further validation and to illustrate universality, we introduced IS903 transposition system components into the assay. The developed assay is adjustable to a desired level of initial transposition via the control of a plasmid-borne E. coli arabinose promoter. In practice, the transposition frequency is modulated by varying the concentration of arabinose or glucose in the growth medium. We show that variable levels of transpositional activity can be analysed, thus enabling straightforward screens for hyper- or hypoactive transposase mutants, regardless of the original wild-type activity level. Conclusions The established universal papillation assay platform should be widely applicable to a

  8. Quantitative PCR analysis of salivary pathogen burden in periodontitis

    Science.gov (United States)

    Salminen, Aino; Kopra, K. A. Elisa; Hyvärinen, Kati; Paju, Susanna; Mäntylä, Päivi; Buhlin, Kåre; Nieminen, Markku S.; Sinisalo, Juha; Pussinen, Pirkko J.

    2015-01-01

    Our aim was to investigate the value of salivary concentrations of four major periodontal pathogens and their combination in diagnostics of periodontitis. The Parogene study included 462 dentate subjects (mean age 62.9 ± 9.2 years) with coronary artery disease (CAD) diagnosis who underwent an extensive clinical and radiographic oral examination. Salivary levels of four major periodontal bacteria were measured by quantitative real-time PCR (qPCR). Median salivary concentrations of Porphyromonas gingivalis, Tannerella forsythia, and Prevotella intermedia, as well as the sum of the concentrations of the four bacteria, were higher in subjects with moderate to severe periodontitis compared to subjects with no to mild periodontitis. Median salivary Aggregatibacter actinomycetemcomitans concentrations did not differ significantly between the subjects with no to mild periodontitis and subjects with moderate to severe periodontitis. In logistic regression analysis adjusted for age, gender, diabetes, and the number of teeth and implants, high salivary concentrations of P. gingivalis, T. forsythia, and P. intermedia were significantly associated with moderate to severe periodontitis. When looking at different clinical and radiographic parameters of periodontitis, high concentrations of P. gingivalis and T. forsythia were significantly associated with the number of 4–5 mm periodontal pockets, ≥6 mm pockets, and alveolar bone loss (ABL). High level of T. forsythia was associated also with bleeding on probing (BOP). The combination of the four bacteria, i.e., the bacterial burden index, was associated with moderate to severe periodontitis with an odds ratio (OR) of 2.40 (95% CI 1.39–4.13). When A. actinomycetemcomitans was excluded from the combination of the bacteria, the OR was improved to 2.61 (95% CI 1.51–4.52). The highest OR 3.59 (95% CI 1.94–6.63) was achieved when P. intermedia was further excluded from the combination and only the levels of P. gingivalis and
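
    The association estimates above come from logistic regression adjusted for age, gender, diabetes, and the number of teeth and implants, reported as odds ratios with 95% confidence intervals. A minimal sketch of that kind of adjusted odds-ratio calculation follows; the synthetic data and the reduced covariate set are assumptions, not the Parogene data.

        # Illustrative sketch of an adjusted logistic-regression odds ratio, in the spirit of
        # the analysis above; the synthetic data and covariates are not the study's data.
        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        rng = np.random.default_rng(4)
        n = 400
        df = pd.DataFrame({
            "high_burden": rng.integers(0, 2, n),        # high salivary bacterial burden (0/1)
            "age": rng.normal(63, 9, n),
            "male": rng.integers(0, 2, n),
            "n_teeth": rng.integers(5, 32, n),
        })
        logit_p = -2 + 0.9 * df["high_burden"] + 0.02 * (df["age"] - 63)
        df["periodontitis"] = rng.random(n) < 1 / (1 + np.exp(-logit_p))

        X = sm.add_constant(df[["high_burden", "age", "male", "n_teeth"]])
        fit = sm.Logit(df["periodontitis"].astype(int), X).fit(disp=False)
        or_ci = np.exp(fit.conf_int().loc["high_burden"])
        print("adjusted OR:", np.exp(fit.params["high_burden"]), "95% CI:", or_ci.values)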

  9. Development of a method for a quantitative evaluation of region oriented policy

    NARCIS (Netherlands)

    Soest, van F.; Stein, A.; Dekkers, A.L.M.; Duijvenboden, van W.

    2001-01-01

    Modern environmental policy implementation in many developed countries is increasingly regionally oriented. Regional governments have undertaken measures designed for the specific needs of the region but, so far, the resulting change in environmental quality has hardly been monitored. This study

  10. Regional homogeneity of electoral space: comparative analysis (on the material of 100 national cases

    Directory of Open Access Journals (Sweden)

    A. O. Avksentiev

    2015-12-01

    Full Text Available In this article, the author examines the dependence of electoral behavior on territorial belonging. The categories of «regional homogeneity» and «electoral space» are conceptualized. It is argued that such regional homogeneity is a characteristic of electoral space and can be quantified. The quantitative measurement of a state's regional homogeneity is directly connected with the risk of separatism, civil conflict, or legitimacy crises in deviant territories. A formula for the quantitative evaluation of regional homogeneity is proposed, based on a standard statistical instrument, the coefficient of variation. Possible directions of study using this index, both for individual political subjects and for the whole political space (state, region, electoral district), are defined. Appropriate indices are calculated for the Ukrainian electoral space (returns of the 1991-2015 elections) and for 100 other national cases. The dynamics of Ukraine's regional homogeneity are analyzed on the basis of the 1991-2015 electoral statistics.
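
    The homogeneity index above is built on the coefficient of variation. A minimal sketch of that calculation on invented regional vote shares:

        # Illustrative sketch: coefficient of variation of a party's regional vote shares as a
        # simple (in)homogeneity indicator. The regional shares below are invented numbers.
        import numpy as np

        # Hypothetical vote share (%) of one party across the regions of a country
        regional_shares = np.array([42.0, 38.5, 47.2, 12.3, 55.1, 40.8, 9.7, 44.0])

        cv = regional_shares.std(ddof=1) / regional_shares.mean()
        print(f"coefficient of variation: {cv:.2f}  (larger = less regionally homogeneous)")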

  11. AUTOMATED ANALYSIS OF QUANTITATIVE IMAGE DATA USING ISOMORPHIC FUNCTIONAL MIXED MODELS, WITH APPLICATION TO PROTEOMICS DATA.

    Science.gov (United States)

    Morris, Jeffrey S; Baladandayuthapani, Veerabhadran; Herrick, Richard C; Sanna, Pietro; Gutstein, Howard

    2011-01-01

    Image data are increasingly encountered and are of growing importance in many areas of science. Much of these data are quantitative image data, which are characterized by intensities that represent some measurement of interest in the scanned images. The data typically consist of multiple images on the same domain and the goal of the research is to combine the quantitative information across images to make inference about populations or interventions. In this paper, we present a unified analysis framework for the analysis of quantitative image data using a Bayesian functional mixed model approach. This framework is flexible enough to handle complex, irregular images with many local features, and can model the simultaneous effects of multiple factors on the image intensities and account for the correlation between images induced by the design. We introduce a general isomorphic modeling approach to fitting the functional mixed model, of which the wavelet-based functional mixed model is one special case. With suitable modeling choices, this approach leads to efficient calculations and can result in flexible modeling and adaptive smoothing of the salient features in the data. The proposed method has the following advantages: it can be run automatically, it produces inferential plots indicating which regions of the image are associated with each factor, it simultaneously considers the practical and statistical significance of findings, and it controls the false discovery rate. Although the method we present is general and can be applied to quantitative image data from any application, in this paper we focus on image-based proteomic data. We apply our method to an animal study investigating the effects of opiate addiction on the brain proteome. Our image-based functional mixed model approach finds results that are missed with conventional spot-based analysis approaches. In particular, we find that the significant regions of the image identified by the proposed method

  12. Quantitative analysis of ultrasound B-mode images of carotid atherosclerotic plaque: correlation with visual classification and histological examination

    DEFF Research Database (Denmark)

    Wilhjelm, Jens E.; Grønholdt, Marie-Louise; Wiebe, Brit

    1998-01-01

    This paper presents a quantitative comparison of three types of information available for 52 patients scheduled for carotid endarterectomy: subjective classification of the ultrasound images obtained during scanning before operation, first- and second-order statistical features extracted from regions of the plaque in still ultrasound images from three orthogonal scan planes, and finally a histological analysis of the surgically removed plaque. The quantitative comparison was made with the linear model and with separation of the available data into training and test sets. The comparison...

  13. Quantitative analysis of real-time radiographic systems

    International Nuclear Information System (INIS)

    Barker, M.D.; Condon, P.E.; Barry, R.C.; Betz, R.A.; Klynn, L.M.

    1988-01-01

    A method was developed which yields quantitative information on the spatial resolution, contrast sensitivity, image noise, and focal spot size from real time radiographic images. The method uses simple image quality indicators and computer programs which make it possible to readily obtain quantitative performance measurements of single or multiple radiographic systems. It was used for x-ray and optical images to determine which component of the system was not operating up to standard. Focal spot size was monitored by imaging a bar pattern. This paper constitutes the second progress report on the development of the camera and radiation image quality indicators

  14. Diagnostic accuracy of quantitative 99mTc-MIBI scintimammography according to ROC curve analysis

    International Nuclear Information System (INIS)

    Kim, J. H.; Lee, H. K.; Seo, J. W.; Cho, N. S.; Cha, K. H.; Lee, T. H.

    1998-01-01

    99mTc-sestamibi scintimammography (SMM) has been shown to be a useful diagnostic test in the detection of breast cancer, and receiver operating characteristic (ROC) curve analysis provides detailed information on a diagnostic test. The purpose of this study was to evaluate the feasibility and efficacy of quantitative indices of SMM in the detection of malignant breast lesions according to ROC analysis. Prone anterior, lateral planar and supine SPECT imaging was performed on 75 female patients (mean age = 43.4 yr) with breast masses (size ≥ 0.8 cm) after intravenous injection of 20-30 mCi 99mTc-sestamibi. 45 malignant (invasive ductal ca (36), inv lobular ca (5), inv ductal + lobular ca (1), inv tubular ca (3)) and 30 benign (fibroadenoma (13), fib cyst (12), fat necrosis (3), papilloma (1), paraffinoma (1)) lesions were histologically proven. Data were analyzed by creating three regions of interest (ROIs) over designated areas: lesion, normal breast, and right chest wall. Lesion-to-normal (L/N) and lesion-to-chest wall (L/CW) ratios were calculated for each patient on both planar and SPECT images. The area under the ROC curve (AUC) was calculated and compared among the four semiquantitative indices and an average scintimammographic index (SMM(mean)) obtained as their arithmetic mean. ROC curve analysis revealed that the planar L/N, SPECT L/N, and SPECT L/CW ratios provide comparably better diagnostic accuracy for the detection of breast cancer than the planar L/CW ratio (p<0.05). For quantitative SMM of the 75 lesions, the malignancy rate was 60%, and sensitivity, specificity, positive predictive value, negative predictive value, and accuracy were 0.78, 0.77, 0.84, 0.72, and 0.77, respectively. Quantitative SMM is a useful objective method for differentiating malignant from benign breast lesions.
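
    A minimal sketch of the ROC analysis of a lesion-to-normal uptake ratio described above (AUC plus an operating point); the ratios and labels are synthetic, not the 75 study patients.

        # Illustrative sketch of ROC analysis for a lesion-to-normal uptake ratio; the ratios
        # and labels are synthetic values for demonstration only.
        import numpy as np
        from sklearn.metrics import roc_auc_score, roc_curve

        rng = np.random.default_rng(5)
        ln_ratio_benign = rng.normal(1.2, 0.2, 30)       # hypothetical planar L/N ratios
        ln_ratio_malignant = rng.normal(1.8, 0.4, 45)

        scores = np.concatenate([ln_ratio_benign, ln_ratio_malignant])
        labels = np.concatenate([np.zeros(30), np.ones(45)])

        auc = roc_auc_score(labels, scores)
        fpr, tpr, thresholds = roc_curve(labels, scores)
        best = np.argmax(tpr - fpr)                      # Youden index for an operating point
        print(f"AUC = {auc:.2f}; cutoff {thresholds[best]:.2f} -> "
              f"sensitivity {tpr[best]:.2f}, specificity {1 - fpr[best]:.2f}")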

  15. Diagnostic accuracy of quantitative 99mTc-MIBI scintimammography according to ROC curve analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, J. H.; Lee, H. K.; Seo, J. W.; Cho, N. S.; Cha, K. H.; Lee, T. H. [Gachon Medical College, Gil Medical Center, Inchon (Korea, Republic of)

    1998-07-01

    99mTc-sestamibi scintimammography (SMM) has been shown to be a useful diagnostic test in the detection of breast cancer, and receiver operating characteristic (ROC) curve analysis provides detailed information on a diagnostic test. The purpose of this study was to evaluate the feasibility and efficacy of quantitative indices of SMM in the detection of malignant breast lesions according to ROC analysis. Prone anterior, lateral planar and supine SPECT imaging was performed on 75 female patients (mean age = 43.4 yr) with breast masses (size ≥ 0.8 cm) after intravenous injection of 20-30 mCi 99mTc-sestamibi. 45 malignant (invasive ductal ca (36), inv lobular ca (5), inv ductal + lobular ca (1), inv tubular ca (3)) and 30 benign (fibroadenoma (13), fib cyst (12), fat necrosis (3), papilloma (1), paraffinoma (1)) lesions were histologically proven. Data were analyzed by creating three regions of interest (ROIs) over designated areas: lesion, normal breast, and right chest wall. Lesion-to-normal (L/N) and lesion-to-chest wall (L/CW) ratios were calculated for each patient on both planar and SPECT images. The area under the ROC curve (AUC) was calculated and compared among the four semiquantitative indices and an average scintimammographic index (SMM(mean)) obtained as their arithmetic mean. ROC curve analysis revealed that the planar L/N, SPECT L/N, and SPECT L/CW ratios provide comparably better diagnostic accuracy for the detection of breast cancer than the planar L/CW ratio (p<0.05). For quantitative SMM of the 75 lesions, the malignancy rate was 60%, and sensitivity, specificity, positive predictive value, negative predictive value, and accuracy were 0.78, 0.77, 0.84, 0.72, and 0.77, respectively. Quantitative SMM is a useful objective method for differentiating malignant from benign breast lesions.

  16. Quantitative analysis of results for quality assurance in radiotherapy

    International Nuclear Information System (INIS)

    Passaro, Bruno Martins

    2011-01-01

    The linear accelerators represent the most important, practical and versatile source of ionizing radiation in radiotherapy. These functional characteristics influence the geometric and dosimetric accuracy of the therapeutic doses applied to patients. The performance of this equipment may vary due to electronic defects, component failures or mechanical breakdowns, or due to the deterioration and aging of components. Maintaining the quality of care depends on the stability of the accelerators and on the quality control used by institutions to monitor deviations in beam parameters. The aim of this study is to assess and analyze the stability of the calibration factor of linear accelerators, as well as the other dosimetric parameters normally included in a radiotherapy quality control program. The average calibration factors of the Clinac 600C and Clinac 6EX accelerators over a period of approximately four years were (0.998 ± 0.012) and (0.996 ± 0.014), respectively. For the Clinac 2100CD at 6 MV and 15 MV they were (1.008 ± 0.009) and (1.006 ± 0.010), respectively, over a period of approximately four years. Statistical analysis of the three linear accelerators showed that the coefficient of variation of the calibration factors was below 2%, which demonstrates consistency in the data. From the normal distribution of the calibration factors, we found that for the Clinac 600C and Clinac 2100CD the expected probability is that in more than 90% of cases the values lie within the acceptable limits of TG-142, while for the Clinac 6EX it is around 85%, since this accelerator underwent several component exchanges. The values of TPR20,10 for the three accelerators are practically constant and within the acceptable limits of TG-142. It can be concluded that a detailed quantitative study of the accelerator calibration factor and TPR20,10 data is extremely useful in a quality assurance program. (author)

  17. A quantitative spatiotemporal analysis of microglia morphology during ischemic stroke and reperfusion

    Directory of Open Access Journals (Sweden)

    Morrison Helena W

    2013-01-01

    Full Text Available Abstract Background Microglia cells continuously survey the healthy brain in a ramified morphology and, in response to injury, undergo progressive morphological and functional changes that encompass microglia activation. Although ideally positioned for immediate response to ischemic stroke (IS) and reperfusion, their progressive morphological transformation into activated cells has not been quantified. In addition, it is not well understood whether diverse microglia morphologies correlate with diverse microglia functions. As such, the dichotomous nature of these cells continues to confound our understanding of microglia-mediated injury after IS and reperfusion. The purpose of this study was to quantitatively characterize the spatiotemporal pattern of microglia morphology during the evolution of cerebral injury after IS and reperfusion. Methods Male C57Bl/6 mice were subjected to focal cerebral ischemia and periods of reperfusion (0, 8 and 24 h). The microglia process length/cell and number of endpoints/cell were quantified from immunofluorescent confocal images of brain regions using a skeleton analysis method developed for this study. Live cell morphology and process activity were measured from movies acquired in acute brain slices from GFP-CX3CR1 transgenic mice after IS and 24-h reperfusion. Regional CD11b and iNOS expression was measured from confocal images and Western blot, respectively, to assess microglia proinflammatory function. Results Quantitative analysis reveals a significant spatiotemporal relationship between microglia morphology and evolving cerebral injury in the ipsilateral hemisphere after IS and reperfusion. Microglia were both hyper- and de-ramified in striatal and cortical brain regions (respectively) after 60 min of focal cerebral ischemia. However, a de-ramified morphology was prominent when ischemia was coupled to reperfusion. Live microglia were de-ramified, and, in addition, process activity was severely blunted proximal to
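
    The skeleton analysis above reduces each microglia image to process length per cell and endpoints per cell. Below is a minimal sketch of those two metrics on a toy binary image; the image, pixel size, cell count, and the use of scikit-image skeletonization are illustrative assumptions rather than the authors' pipeline.

        # Illustrative sketch of skeleton-based metrics (process length/cell, endpoints/cell)
        # on a binary microglia image; the toy image and pixel size are assumptions.
        import numpy as np
        from scipy import ndimage
        from skimage.morphology import skeletonize

        # Toy binary image standing in for a thresholded microglia field
        img = np.zeros((64, 64), dtype=bool)
        img[32, 10:54] = True          # one long "process"
        img[10:54, 32] = True          # a second process crossing it
        img = ndimage.binary_dilation(img, iterations=2)

        skeleton = skeletonize(img)

        # Endpoints: skeleton pixels with exactly one skeleton neighbour
        neighbour_count = ndimage.convolve(skeleton.astype(int), np.ones((3, 3), int),
                                           mode="constant") - skeleton.astype(int)
        endpoints = skeleton & (neighbour_count == 1)

        n_cells = 1                    # would come from a soma count in real data
        pixel_size_um = 0.5
        print("process length/cell (um):", skeleton.sum() * pixel_size_um / n_cells)
        print("endpoints/cell:", int(endpoints.sum()) / n_cells)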

  18. A Quantitative Analysis of the Behavioral Checklist of the Movement ABC Motor Test

    Science.gov (United States)

    Ruiz, Luis Miguel; Gomez, Marta; Graupera, Jose Luis; Gutierrez, Melchor; Linaza, Jose Luis

    2007-01-01

    The fifth section of the Henderson and Sugden's Movement ABC Checklist is part of the general Checklist that accompanies The Movement ABC Battery. The authors maintain that the analysis of this section must be mainly qualitative instead of quantitative. The main objective of this study was to employ a quantitative analysis of this behavioural…

  19. Global tractography with embedded anatomical priors for quantitative connectivity analysis

    Directory of Open Access Journals (Sweden)

    Alia Lemkaddem

    2014-11-01

    Full Text Available The main assumption of fiber-tracking algorithms is that fiber trajectories are represented by paths of highest diffusion, which is usually accomplished by following the principal diffusion directions estimated in every voxel from the measured diffusion MRI data. The state-of-the-art approaches, known as global tractography, reconstruct all the fiber tracts of the whole brain simultaneously by solving a global energy minimization problem. The tractograms obtained with these algorithms outperform any previous technique but, unfortunately, the price to pay is an increased computational cost which is not suitable in many practical settings, both in terms of time and memory requirements. Furthermore, existing global tractography algorithms suffer from an important shortcoming that is crucial in the context of brain connectivity analyses. As no anatomical priors are used during the reconstruction process, the recovered fiber tracts are not guaranteed to connect cortical regions and, as a matter of fact, most of them stop prematurely in the white matter. This not only unnecessarily slows down the estimation procedure and potentially biases any subsequent analysis but also, most importantly, prevents the de facto quantification of brain connectivity. In this work, we propose a novel approach for global tractography that is specifically designed for connectivity analysis applications by explicitly enforcing anatomical priors of the tracts in the optimization and considering the effective contribution of each of them, i.e. volume, to the acquired diffusion MRI image. We evaluated our approach on both a realistic diffusion MRI phantom and in-vivo data, and also compared its performance to existing tractography algorithms.

  20. Computer aided approach to qualitative and quantitative common cause failure analysis for complex systems

    International Nuclear Information System (INIS)

    Cate, C.L.; Wagner, D.P.; Fussell, J.B.

    1977-01-01

    Common cause failure analysis, also called common mode failure analysis, is an integral part of a complete system reliability analysis. Existing methods of computer aided common cause failure analysis are extended by allowing analysis of the complex systems often encountered in practice. The methods aid in identifying potential common cause failures and also address quantitative common cause failure analysis

  1. Quantitative analysis of soluble elements in environmental waters by PIXE

    International Nuclear Information System (INIS)

    Niizeki, T.; Kawasaki, K.; Adachi, M.; Tsuji, M.; Hattori, T.

    1999-01-01

    We have started PIXE research for environmental science at the Van de Graaff accelerator facility of the Tokyo Institute of Technology. Quantitative measurements of soluble fractions in river waters have been carried out using the preconcentration method developed at Tohoku University. We show that this PIXE target preparation can also be applied to waste water samples. (author)

  2. Identification of Case Content with Quantitative Network Analysis

    DEFF Research Database (Denmark)

    Christensen, Martin Lolle; Olsen, Henrik Palmer; Tarissan, Fabian

    2016-01-01

    the relevant articles. In order to enhance information retrieval about case content, without relying on manual labor and subjective judgment, we propose in this paper a quantitative method that gives a better indication of case content in terms of which articles a given case is more closely associated with...

  3. Quantitative analysis of prediction models for hot cracking in ...

    Indian Academy of Sciences (India)

    A Rodríguez-Prieto

    2017-11-16

    ... enhancing safety margins and adding greater precision to quantitative accident prediction [45]. One deterministic methodology is the stringency level (SL) approach, which is recognized as a valuable decision tool in the selection of standardized materials specifications to prevent potential failures [3].

  4. Analysis association of milk fat and protein percent in quantitative ...

    African Journals Online (AJOL)

    SAM

    2014-05-14

    ... quantitative trait locus (QTLs) on chromosomes 1, 6, 7 and 20 in ... Protein and fat percent as content of milk are high-priority criteria for financial aims and selection of programs ...

  5. Teaching Quantitative Reasoning for Nonscience Majors through Carbon Footprint Analysis

    Science.gov (United States)

    Boose, David L.

    2014-01-01

    Quantitative reasoning is a key intellectual skill, applicable across disciplines and best taught in the context of authentic, relevant problems. Here, I describe and assess a laboratory exercise that has students calculate their "carbon footprint" and evaluate the impacts of various behavior choices on that footprint. Students gather…

  6. Common biology of craving across legal and illegal drugs - a quantitative meta-analysis of cue-reactivity brain response.

    Science.gov (United States)

    Kühn, Simone; Gallinat, Jürgen

    2011-04-01

    The present quantitative meta-analysis set out to test whether cue-reactivity responses in humans differ across drugs of abuse and whether these responses constitute the biological basis of drug craving as a core psychopathology of addiction. By means of activation likelihood estimation, we investigated the concurrence of brain regions activated by cue-induced craving paradigms across studies on nicotine, alcohol and cocaine addicts. Furthermore, we analysed the concurrence of brain regions positively correlated with self-reported craving in nicotine and alcohol studies. We found direct overlap between nicotine, alcohol and cocaine cue reactivity in the ventral striatum. In addition, regions of close proximity were observed in the anterior cingulate cortex (ACC; nicotine and cocaine) and amygdala (alcohol, nicotine and cocaine). Brain regions of concurrence in drug cue-reactivity paradigms that overlapped with brain regions of concurrence in self-reported craving correlations were found in the ACC, ventral striatum and right pallidum (for alcohol). This first quantitative meta-analysis on drug cue reactivity identifies brain regions underlying nicotine, alcohol and cocaine dependency, i.e. the ventral striatum. The ACC, right pallidum and ventral striatum were related to drug cue reactivity as well as self-reported craving, suggesting that this set of brain regions constitutes the core circuit of drug craving in nicotine and alcohol addiction. © 2011 The Authors. European Journal of Neuroscience © 2011 Federation of European Neuroscience Societies and Blackwell Publishing Ltd.

  7. Differentiating invasive and pre-invasive lung cancer by quantitative analysis of histopathologic images

    Science.gov (United States)

    Zhou, Chuan; Sun, Hongliu; Chan, Heang-Ping; Chughtai, Aamer; Wei, Jun; Hadjiiski, Lubomir; Kazerooni, Ella

    2018-02-01

    We are developing an automated radiopathomics method for the diagnosis of lung nodule subtypes. In this study, we investigated the feasibility of using quantitative methods to analyze the tumor nuclei and cytoplasm in pathologic whole-slide images for the classification of pathologic subtypes of invasive nodules and pre-invasive nodules. We developed a multiscale blob detection method with watershed transform (MBD-WT) to segment the tumor cells. Pathomic features were extracted to characterize the size, morphology, sharpness, and gray level variation in each segmented nucleus and the heterogeneity patterns of tumor nuclei and cytoplasm. With permission of the National Lung Screening Trial (NLST) project, a data set containing 90 digital haematoxylin and eosin (HE) whole-slide images from 48 cases was used in this study. The 48 cases contain 77 regions of invasive subtypes and 43 regions of pre-invasive subtypes outlined by a pathologist on the HE images using the pathological tumor region description provided by NLST as reference. A logistic regression model (LRM) was built using leave-one-case-out resampling and receiver operating characteristic (ROC) analysis for classification of invasive and pre-invasive subtypes. With 11 selected features, the LRM achieved a test area under the ROC curve (AUC) value of 0.91 ± 0.03. The results demonstrated that the pathologic invasiveness of lung adenocarcinomas could be categorized with high accuracy using pathomics analysis.
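
    A minimal sketch of the evaluation scheme described above: a logistic regression model scored with leave-one-case-out resampling (regions grouped by case) and ROC analysis. The feature matrix, labels, and case assignments are synthetic stand-ins.

        # Illustrative sketch of a logistic regression model evaluated with leave-one-case-out
        # resampling and ROC analysis; the features and case grouping are synthetic.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import LeaveOneGroupOut, cross_val_predict
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(6)
        n_regions, n_features, n_cases = 120, 11, 48
        X = rng.normal(size=(n_regions, n_features))           # pathomic features per region
        y = rng.integers(0, 2, n_regions)                      # 1 = invasive, 0 = pre-invasive
        X[y == 1, 0] += 1.0                                    # make one feature informative
        cases = rng.integers(0, n_cases, n_regions)            # case ID for each region

        probs = cross_val_predict(LogisticRegression(max_iter=1000), X, y,
                                  cv=LeaveOneGroupOut(), groups=cases, method="predict_proba")
        print("leave-one-case-out AUC:", roc_auc_score(y, probs[:, 1]))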

  8. Quantitation of regional cerebral blood flow corrected for partial volume effect using O-15 water and PET

    DEFF Research Database (Denmark)

    Iida, H.; Law, I.; Pakkenberg, B.

    2000-01-01

    Limited spatial resolution of positron emission tomography (PET) can cause significant underestimation in the observed regional radioactivity concentration (so-called partial volume effect or PVE) resulting in systematic errors in estimating quantitative physiologic parameters. The authors have formulated four mathematical models that describe the dynamic behavior of a freely diffusible tracer (H2(15)O) in a region of interest (ROI) incorporating estimates of regional tissue flow that are independent of PVE. The current study was intended to evaluate the feasibility of these models and to establish a methodology to accurately quantify regional cerebral blood flow (CBF) corrected for PVE in cortical gray matter regions. Five monkeys were studied with PET after IV H2(15)O two times (n = 3) or three times (n = 2) in a row. Two ROIs were drawn on structural magnetic resonance imaging (MRI) scans and projected...

  9. Microchromatography of hemoglobins. VIII. A general qualitative and quantitative method in plastic drinking straws and the quantitative analysis of Hb-F.

    Science.gov (United States)

    Schroeder, W A; Pace, L A

    1978-03-01

    The microchromatographic procedure for the quantitative analysis of the hemoglobin components in a hemolysate uses columns of DEAE-cellulose in a plastic drinking straw with a glycine-KCN-NaCl developer. Not only may the method be used for the quantitative analysis of Hb-F but also for the analysis of the varied components in mixtures of hemoglobins.

  10. ANALYSIS OF CRISIS LEVEL IN REGIONS OF UKRAINE

    Directory of Open Access Journals (Sweden)

    Irina Abramova

    2017-12-01

    corresponds to the limits of the indicators from 0.75 to 0.5. The timely implementation of liquidation measures to neutralize the effects of the existing crises and prevent new ones will lead to the transition of the region into a zone of deep crisis. The zone of deep crisis is characterized by a partial destruction of the socio-economic system of the region. Overcoming such a situation requires the use of systemic measures of anti-crisis state and regional management with the assistance of foreign aid. The quantitative indicator of this zone, its threshold, is a numerical measure limited to a 75 percent deviation from the threshold level of the non-crisis zone, which corresponds to the limits of the indicators from 0.5 to 0.25. The zone of bankruptcy involves the complete destruction of the region as a social and economic system. The reasons for such a situation are force majeure circumstances (wars, natural disasters, man-made disasters, etc.). Such a state of the region is characterized by the cessation of the work of enterprises and organizations, the economic and social devastation of the region, and the intensification of migration processes. The way out of this situation is targeted state crisis management. The numerical indicator of this zone, its threshold, is characterized by more than a 75 percent deviation from the threshold level of the non-crisis zone, which corresponds to the limits of the indicators from 0.25 to 0.0. Results of the survey showed that, as of 2015, 14 of 27 regions exhibited a moderate level of crisis according to economic parameters, with a high risk of transition into a deep crisis. Practical implications. The analysis of the crisis in the socio-economic development of Ukraine's regions made it possible to detect its depth according to social and economic parameters and to determine the weakest areas that need the most support and attention in anti-crisis regional management. Value

  11. Pleiotropy analysis of quantitative traits at gene level by multivariate functional linear models.

    Science.gov (United States)

    Wang, Yifan; Liu, Aiyi; Mills, James L; Boehnke, Michael; Wilson, Alexander F; Bailey-Wilson, Joan E; Xiong, Momiao; Wu, Colin O; Fan, Ruzong

    2015-05-01

    In genetics, pleiotropy describes the genetic effect of a single gene on multiple phenotypic traits. A common approach is to analyze the phenotypic traits separately using univariate analyses and combine the test results through multiple comparisons. This approach may lead to low power. Multivariate functional linear models are developed to connect genetic variant data to multiple quantitative traits adjusting for covariates for a unified analysis. Three types of approximate F-distribution tests based on Pillai-Bartlett trace, Hotelling-Lawley trace, and Wilks's Lambda are introduced to test for association between multiple quantitative traits and multiple genetic variants in one genetic region. The approximate F-distribution tests provide much more significant results than those of F-tests of univariate analysis and optimal sequence kernel association test (SKAT-O). Extensive simulations were performed to evaluate the false positive rates and power performance of the proposed models and tests. We show that the approximate F-distribution tests control the type I error rates very well. Overall, simultaneous analysis of multiple traits can increase power performance compared to an individual test of each trait. The proposed methods were applied to analyze (1) four lipid traits in eight European cohorts, and (2) three biochemical traits in the Trinity Students Study. The approximate F-distribution tests provide much more significant results than those of F-tests of univariate analysis and SKAT-O for the three biochemical traits. The approximate F-distribution tests of the proposed functional linear models are more sensitive than those of the traditional multivariate linear models that in turn are more sensitive than SKAT-O in the univariate case. The analysis of the four lipid traits and the three biochemical traits detects more association than SKAT-O in the univariate case. © 2015 WILEY PERIODICALS, INC.
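
    The tests above are approximate F-tests based on the Pillai-Bartlett, Hotelling-Lawley, and Wilks statistics within functional linear models. As a rough stand-in, the sketch below runs an ordinary MANOVA of several synthetic traits against a single variant, which reports those same multivariate statistics; it is not the functional linear model of the paper, and all data are invented.

        # Illustrative sketch: joint test of several quantitative traits against one variant
        # using standard MANOVA (Pillai, Wilks, Hotelling-Lawley). Synthetic data only.
        import numpy as np
        import pandas as pd
        from statsmodels.multivariate.manova import MANOVA

        rng = np.random.default_rng(7)
        n = 300
        genotype = rng.integers(0, 3, n)                 # 0/1/2 minor-allele count at one SNP
        df = pd.DataFrame({
            "genotype": genotype,
            "hdl": 50 + 1.5 * genotype + rng.normal(0, 8, n),
            "ldl": 120 - 2.0 * genotype + rng.normal(0, 20, n),
            "trig": 140 + rng.normal(0, 30, n),
        })

        manova = MANOVA.from_formula("hdl + ldl + trig ~ genotype", data=df)
        print(manova.mv_test())                          # Pillai, Wilks, Hotelling-Lawley, Roy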

  12. Quantitative neuroanatomy of all Purkinje cells with light sheet microscopy and high-throughput image analysis

    Directory of Open Access Journals (Sweden)

    Ludovico Silvestri

    2015-05-01

    Full Text Available Characterizing the cytoarchitecture of the mammalian central nervous system on a brain-wide scale is becoming a compelling need in neuroscience. For example, realistic modeling of brain activity requires the definition of quantitative features of large neuronal populations in the whole brain. Quantitative anatomical maps will also be crucial to classify the cytoarchitectonic abnormalities associated with neuronal pathologies in a highly reproducible and reliable manner. In this paper, we apply recent advances in optical microscopy and image analysis to characterize the spatial distribution of Purkinje cells across the whole cerebellum. Light sheet microscopy was used to image with micron-scale resolution a fixed and cleared cerebellum of an L7-GFP transgenic mouse, in which all Purkinje cells are fluorescently labeled. A fast and scalable algorithm for fully automated cell identification was applied to the image to extract the position of all the fluorescent Purkinje cells. This vectorized representation of the cell population allows a thorough characterization of the complex three-dimensional distribution of the neurons, highlighting the presence of gaps inside the lamellar organization of Purkinje cells, whose density is believed to play a significant role in autism spectrum disorders. Furthermore, clustering analysis of the localized somata permits dividing the whole cerebellum into groups of Purkinje cells with high spatial correlation, suggesting new possibilities of anatomical partition. The quantitative approach presented here can be extended to study the distribution of different types of cells in many brain regions and across the whole encephalon, providing a robust base for building realistic computational models of the brain, and for unbiased morphological tissue screening in the presence of pathologies and/or drug treatments.
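
    The clustering step above groups localized somata by spatial correlation. A minimal sketch of clustering detected cell coordinates follows; DBSCAN, the synthetic point cloud, and the parameter choices are assumptions and not necessarily the method used in the paper.

        # Illustrative sketch: spatial clustering of detected cell coordinates with DBSCAN;
        # the point cloud and parameters are invented for demonstration.
        import numpy as np
        from sklearn.cluster import DBSCAN

        rng = np.random.default_rng(8)
        # Two hypothetical "sheets" of cell somata plus sparse background points (x, y, z in um)
        sheet1 = rng.normal([200, 200, 50], [60, 60, 5], size=(400, 3))
        sheet2 = rng.normal([600, 300, 120], [60, 60, 5], size=(400, 3))
        noise = rng.uniform(0, 800, size=(40, 3))
        coords = np.vstack([sheet1, sheet2, noise])

        labels = DBSCAN(eps=25.0, min_samples=10).fit_predict(coords)
        n_clusters = len(set(labels)) - (1 if -1 in labels else 0)
        print("clusters found:", n_clusters, " unclustered points:", int((labels == -1).sum()))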

  13. Quantitative chromatography in the analysis of labelled compounds 1. Quantitative paper chromatography of amino acids by a spot comparison technique

    International Nuclear Information System (INIS)

    Barakat, M.F.; Farag, A.N.; El-Gharbawy, A.A.

    1974-01-01

    For the determination of the specific activity of labelled compounds separated by paper sheet chromatography, it was found essential to perfect the quantitative aspect of the paper chromatographic technique. So far, paper chromatography has been used mainly as a separation tool, and its use for quantification of the separated materials has been far less studied. In the present work, the quantitative analysis of amino acids by paper sheet chromatography has been carried out by methods that use the relative spot area values to correct the experimental data obtained. The results obtained were good and reproducible. The main advantage of the proposed technique is its extreme simplicity. No complicated equipment or procedures are necessary

  14. Analysis of trade condition in Ras region

    Directory of Open Access Journals (Sweden)

    Andelić Slavica

    2017-01-01

    Full Text Available Modern academic literature in the field of trade, in the macro- and meso-economic context, attempts to shed light on the data that define exchange flows in the intra-national and international environment. This study is based on a database built from state registers; by sizing and analysing these data we gain a deeper insight into the condition of the market channels of the Ras region and its relationship with the environment. The aim of this work is a meticulous interpretation of trade patterns as a result of macro and meso trade policy, which could serve as an incentive for local and governmental structures in developing the commercial potential of the southern part of our country.

  15. Seismic fault analysis of Chicoutimi region

    International Nuclear Information System (INIS)

    Woussen, G.; Ngandee, S.

    1996-01-01

    On November 25, 1988, an earthquake measuring 6.5 on the Richter Scale occurred at a depth of 29 km in Precambrian bedrock in the Saguenay Region (Quebec). Given that the seismic event was located near a major zone of normal faults, it is important to determine whether the earthquake could be associated with this large structure or with faults associated with this structure. This is discussed through a compilation and interpretation of structural discontinuities on key outcrops in the vicinity of the epicenter. The report is divided into four parts. The first part gives a brief overview of the geology in order to provide a geologic context for the structural measurements. The second comprises an analysis of fractures in each of the three lithotectonic units defined in the first part. The third part discusses the data and the fourth provides a conclusion. 30 refs., 53 figs

  16. Quantitative analysis of Moessbauer backscatter spectra from multilayer films

    International Nuclear Information System (INIS)

    Bainbridge, J.

    1975-01-01

    The quantitative interpretation of Moessbauer backscatter spectra with particular reference to internal conversion electrons has been treated assuming that electron attenuation in a surface film can be satisfactorily described by a simple exponential law. The theory of Krakowski and Miller has been extended to include multi-layer samples, and a relation between the Moessbauer spectrum area and an individual layer thickness derived. As an example, numerical results are obtained for a duplex oxide film grown on pure iron. (Auth.)

  17. Geometrical conditions at the quantitative neutronographic texture analysis

    International Nuclear Information System (INIS)

    Tobisch, J.; Kleinstueck, K.

    1975-10-01

    The beam geometry for measuring quantitative pole figures by a neutronographic texture diffractometer is explained for transmission and reflection arrangements of spherical samples and sheets. For given dimensions of the counter aperture, the maximum possible cross sections of the incident beam are calculated as a function of sample dimensions and the Bragg angle theta. Methods for the calculation of absorption factors and volume correction are given. Under special conditions, advantages result in the transmission case for sample motion in the +α direction. (author)

  18. Quantitative analysis of strategic and tactical purchasing decisions

    OpenAIRE

    Heijboer, G.J.

    2003-01-01

    Purchasing management is a relatively new scientific research field, partly due to the fact that purchasing has only recently been recognized as a factor of strategic importance to an organization. In this thesis, the author focuses on a selection of strategic and tactical purchasing decision problems. New quantitative models are developed for these decision problems using a range of mathematical techniques, thereby contributing to the further development of purchasing theory and its appliati...

  19. Quantitative analysis of carbon radiation in edge plasmas of LHD

    International Nuclear Information System (INIS)

    Dong, C.F.; Morita, S.; Oishi, T.; Goto, M.; Murakami, I.; Wang, E.R.; Huang, X.L.

    2013-01-01

    It is of interest to compare the carbon radiation loss between LHD and tokamaks. Since the radiation from C3+ is much smaller than that from C5+, it is also interesting to examine the difference in the detached plasma. In addition, it is important to study quantitatively the radiation from each ionization stage of carbon which is uniquely the dominant impurity in most tokamaks and LHD. (J.P.N.)

  20. Quantitative analysis of allantoin in Iranian corn silk

    Directory of Open Access Journals (Sweden)

    E. Khanpour*

    2017-11-01

    Full Text Available Background and objectives: Zea mays is cultivated in different parts of Iran and corn silk is used in traditional medicine. Allantoin is one of the major compounds in corn silk. The purpose of this research was the quantitative analysis of allantoin in corn silks belonging to several regions of Iran. Methods: The samples of corn silk were collected from three provinces of Iran (Kermanshah, Fars and Razavi Khorasan). The dried plant materials were infused in boiling distilled water at a temperature of 90-95 °C on a magnetic stirrer for 30 min. The levels of allantoin in the aqueous extracts were determined by HPLC. Quantification was achieved using a C18 column (250×4.6 mm, 5 µm) under isocratic conditions with phosphate buffer solution (pH 3.0) as the mobile phase at a flow rate of 0.5 mL/min. Column effluent was monitored at 210 nm. The calibration curve of the allantoin standard was plotted with concentrations from 6.25 to 100 µg/mL. Results: The calibration curve of the standard was linear over the concentration range used (R2=0.9999). The results showed that the amount of allantoin in the samples was between 205 and 374 mg/100 g of dry plant material. The corn silk samples of Razavi Khorasan and Fars provinces showed the lowest and highest amounts of allantoin, respectively. Conclusion: The levels of allantoin obtained in this study were higher than the values reported in other studies; therefore, the researchers of this project are investigating the wound healing effect of corn silk.
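
    A hedged sketch of the external-standard quantitation step described above; only the 6.25-100 µg/mL calibration range is taken from the abstract, while the peak areas and sample-preparation figures are invented for illustration:

```python
# Sketch of external-standard HPLC quantitation: fit a linear calibration of
# peak area vs. allantoin concentration, then back-calculate a sample.
# Peak areas and preparation values are hypothetical.
import numpy as np

conc = np.array([6.25, 12.5, 25.0, 50.0, 100.0])            # standards, ug/mL
area = np.array([1.02e4, 2.05e4, 4.11e4, 8.20e4, 1.64e5])   # hypothetical peak areas

slope, intercept = np.polyfit(conc, area, 1)
r2 = np.corrcoef(conc, area)[0, 1] ** 2
print(f"area = {slope:.1f} * conc + {intercept:.1f}, R^2 = {r2:.4f}")

sample_area = 6.0e4
sample_conc = (sample_area - intercept) / slope              # ug/mL in the injected extract
# Convert to mg per 100 g dry plant material, assuming 1 g infused into 100 mL:
extract_volume_mL, plant_mass_g = 100.0, 1.0                 # assumed preparation
mg_per_100g = sample_conc * extract_volume_mL / 1000.0 / plant_mass_g * 100.0
print(f"{mg_per_100g:.0f} mg allantoin / 100 g dry material")
```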

  1. Pulmonary nodule characterization, including computer analysis and quantitative features.

    Science.gov (United States)

    Bartholmai, Brian J; Koo, Chi Wan; Johnson, Geoffrey B; White, Darin B; Raghunath, Sushravya M; Rajagopalan, Srinivasan; Moynagh, Michael R; Lindell, Rebecca M; Hartman, Thomas E

    2015-03-01

    Pulmonary nodules are commonly detected in computed tomography (CT) chest screening of a high-risk population. The specific visual or quantitative features on CT or other modalities can be used to characterize the likelihood that a nodule is benign or malignant. Visual features on CT such as size, attenuation, location, morphology, edge characteristics, and other distinctive "signs" can be highly suggestive of a specific diagnosis and, in general, be used to determine the probability that a specific nodule is benign or malignant. Change in size, attenuation, and morphology on serial follow-up CT, or features on other modalities such as nuclear medicine studies or MRI, can also contribute to the characterization of lung nodules. Imaging analytics can objectively and reproducibly quantify nodule features on CT, nuclear medicine, and magnetic resonance imaging. Some quantitative techniques show great promise in helping to differentiate benign from malignant lesions or to stratify the risk of aggressive versus indolent neoplasm. In this article, we (1) summarize the visual characteristics, descriptors, and signs that may be helpful in management of nodules identified on screening CT, (2) discuss current quantitative and multimodality techniques that aid in the differentiation of nodules, and (3) highlight the power, pitfalls, and limitations of these various techniques.

  2. Correlation between Parameters of Calcaneal Quantitative Ultrasound and Hip Structural Analysis in Osteoporotic Fracture Patients.

    Directory of Open Access Journals (Sweden)

    Licheng Zhang

    Full Text Available Calcaneal quantitative ultrasound (QUS), which is used in the evaluation of osteoporosis, is believed to be intimately associated with the characteristics of the proximal femur. However, the specific associations of calcaneal QUS with characteristics of the hip sub-regions remain unclear. A cross-sectional assessment of 53 osteoporotic patients was performed for the skeletal status of the heel and hip. We prospectively enrolled 53 female osteoporotic patients with femoral fractures. Calcaneal QUS, dual energy X-ray absorptiometry (DXA), and hip structural analysis (HSA) were performed for each patient. Femoral heads were obtained during the surgery, and principal compressive trabeculae (PCT) were extracted by a three-dimensional printing technique-assisted method. Pearson's correlations between the QUS measurement and DXA, HSA-derived parameters and Young's modulus were calculated in order to evaluate the specific association of QUS with the parameters for the hip sub-regions, including the femoral neck, trochanteric and Ward's areas, and the femoral shaft, respectively. Significant correlations were found between estimated BMD (Est.BMD) and the BMD of different sub-regions of the proximal femur. However, the correlation coefficient of the trochanteric area (r = 0.356, p = 0.009) was higher than that of the neck area (r = 0.297, p = 0.031) and total proximal femur (r = 0.291, p = 0.034). Furthermore, the quantitative ultrasound index (QUI) was significantly correlated with the HSA-derived parameters of the trochanteric area (r value: 0.315-0.356, all p<0.05) as well as with the Young's modulus of PCT from the femoral head (r = 0.589, p<0.001). The calcaneal bone had an intimate association with the trochanteric cancellous bone. To a certain extent, the parameters of the calcaneal QUS can reflect the characteristics of the trochanteric area of the proximal hip, although not specifically reflective of those of the femoral neck or shaft.

  3. Trial of quantitative analysis of cardiac function by 3D reconstruction of multislice cine MR images

    International Nuclear Information System (INIS)

    Yamamoto, Hideki; Sei, Tetsurou; Nakagawa, Tomio; Hiraki, Yoshio.

    1994-01-01

    Non-invasive techniques for measuring the dynamic behavior of the left ventricle (LV) can be an invaluable tool in the diagnosis of heart disease. In this paper we present methods for quantitative analysis of cardiac function using a compact magnetic resonance image processing system. A 256 x 256 magnetic resonance transaxial image of the left ventricle in a normal case is obtained. After gray level thresholding and region segmentation, the boundary of the left ventricular chamber is extracted. Then, the boundaries of the left ventricular chamber are displayed three-dimensionally by using the Z-buffer algorithm. Thus, LV volume and ejection fraction are calculated. Here, the value of LV ejection fraction is 60%. These results agree reasonably well with the corresponding data obtained by echocardiography. (author)
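
    A minimal sketch of the volume and ejection fraction arithmetic implied above (slice areas and thickness are invented; the real study derived volumes from the 3D-reconstructed chamber boundary):

```python
# Sketch: left-ventricular volume by slice summation and ejection fraction.
# Segmented chamber areas per slice are hypothetical; slice thickness is assumed.
import numpy as np

slice_thickness_cm = 1.0
# Hypothetical segmented LV cavity areas (cm^2) per short-axis slice
areas_end_diastole = np.array([2.0, 6.5, 9.0, 9.5, 8.0, 5.0, 1.5])
areas_end_systole  = np.array([0.8, 2.6, 3.6, 4.0, 3.2, 1.8, 0.5])

edv = areas_end_diastole.sum() * slice_thickness_cm   # end-diastolic volume, mL
esv = areas_end_systole.sum() * slice_thickness_cm    # end-systolic volume, mL
ef = (edv - esv) / edv * 100.0                        # ejection fraction, %
print(f"EDV = {edv:.1f} mL, ESV = {esv:.1f} mL, EF = {ef:.0f}%")
```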

  4. Criteria for definition of regional functional improvement on quantitative post-stress gated myocardial SPET after bypass surgery in patients with ischaemic cardiomyopathy

    International Nuclear Information System (INIS)

    Lee, Dong Soo; Cheon, Gi Jeong; Paeng, Jin Chul; Chung, June-Key; Lee, Myung Chul; Kim, Ki Bong

    2002-01-01

    Myocardial viability can be defined as functional improvement of dysfunctional myocardium after revascularization. The purpose of this study was to define the optimal criteria for definition of regional functional improvement after coronary artery bypass graft (CABG) surgery on quantitative gated single-photon emission tomography (SPET). Thirty-two patients (26 men, 6 women; age 56±13 years) with coronary artery disease (three-vessel disease, 17; two-vessel disease, 15; previous history of myocardial infarction, 9) and severe left ventricular dysfunction (LVEF≤35%) underwent CABG. Rest thallium-201/dipyridamole stress technetium-99m methoxyisobutylisonitrile gated myocardial SPET was performed before and 3 months after CABG. Global LV functional improvement was defined as either an improvement in LVEF of 10% (n=15) or an improvement in LVEF of 5% combined with a decrease in end-systolic volume of 10 ml (n=2) after CABG on quantitative gated SPET. Postoperative regional wall thickening improvement (ΔRWT), regional wall motion improvement (ΔRWM) and regional resting (ΔRP) and stress perfusion improvement (ΔRstrP) were used to determine global functional improvement by ROC curve analysis, and the optimal criteria for definition of viable regional dysfunctional myocardium were defined on the ROC curves. Correlations were verified by determining the number of improved myocardial regions and LVEF improvement. LVEF was improved from 25%±6% to 34%±11% after CABG. A total of 229 segments were dysfunctional (wall motion ≤2 mm, thickening ≤20%) before CABG. On ROC curve analysis using global functional improvement as an indicator of viability, the areas under the ROC curves (AUCs) of ΔRWT and ΔRWM were 0.717 and 0.620, respectively. The AUC of ΔRWT was significantly larger than that of ΔRWM (P=0.009) and the optimal cut-off value of ΔRWT was 15%. The AUCs of ΔRP and ΔRstrP were not significant. The correlation coefficients between summed ΔRWT and
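
    A hedged sketch of the ROC step described above, using simulated data rather than the study's measurements; the optimal cut-off is chosen here by Youden's J, which is one common convention and not necessarily the authors' criterion:

```python
# Sketch: ROC analysis of a regional improvement measure (e.g. delta wall thickening)
# against a binary "global improvement" label, with a cut-off from Youden's J.
# All values are simulated.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(2)
improved = rng.integers(0, 2, 200)                 # global functional improvement (0/1)
delta_rwt = rng.normal(10 + 8 * improved, 10)      # regional wall thickening change, %

auc = roc_auc_score(improved, delta_rwt)
fpr, tpr, thresholds = roc_curve(improved, delta_rwt)
best = np.argmax(tpr - fpr)                        # Youden's J = sensitivity + specificity - 1
print(f"AUC = {auc:.3f}, optimal cut-off ~ {thresholds[best]:.1f}%")
```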

  5. Direct comparison of low- and mid-frequency Raman spectroscopy for quantitative solid-state pharmaceutical analysis.

    Science.gov (United States)

    Lipiäinen, Tiina; Fraser-Miller, Sara J; Gordon, Keith C; Strachan, Clare J

    2018-02-05

    This study considers the potential of low-frequency (terahertz) Raman spectroscopy in the quantitative analysis of ternary mixtures of solid-state forms. Direct comparison between low-frequency and mid-frequency spectral regions for quantitative analysis of crystal form mixtures, without confounding sampling and instrumental variations, is reported for the first time. Piroxicam was used as a model drug, and the low-frequency spectra of piroxicam forms β, α2 and monohydrate are presented for the first time. These forms show clear spectral differences in both the low- and mid-frequency regions. Both spectral regions provided quantitative models suitable for predicting the mixture compositions using partial least squares regression (PLSR), but the low-frequency data gave better models, based on lower errors of prediction (2.7, 3.1 and 3.2% root-mean-square errors of prediction [RMSEP] values for the β, α2 and monohydrate forms, respectively) than the mid-frequency data (6.3, 5.4 and 4.8%, for the β, α2 and monohydrate forms, respectively). The better performance of low-frequency Raman analysis was attributed to larger spectral differences between the solid-state forms, combined with a higher signal-to-noise ratio. Copyright © 2017 Elsevier B.V. All rights reserved.
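
    A minimal sketch of the PLSR quantitation and RMSEP calculation described above, with simulated spectra and compositions standing in for the measured Raman data:

```python
# Sketch: PLS regression of ternary mixture composition on (simulated) spectra,
# with root-mean-square error of prediction (RMSEP) on a held-out set.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n_samples, n_wavenumbers = 60, 400

# Random "pure component" spectra and random compositions summing to 1
pure = rng.random((3, n_wavenumbers))
comp = rng.dirichlet(np.ones(3), size=n_samples)        # three solid-state form fractions
spectra = comp @ pure + rng.normal(0, 0.01, (n_samples, n_wavenumbers))

X_tr, X_te, y_tr, y_te = train_test_split(spectra, comp, test_size=0.25, random_state=0)
pls = PLSRegression(n_components=3).fit(X_tr, y_tr)
rmsep = np.sqrt(((pls.predict(X_te) - y_te) ** 2).mean(axis=0)) * 100
print("RMSEP per form (%):", np.round(rmsep, 2))
```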

  6. Quantitative phase analysis of a highly textured industrial sample using a Rietveld profile analysis

    International Nuclear Information System (INIS)

    Shin, Eunjoo; Huh, Moo-Young; Seong, Baek-Seok; Lee, Chang-Hee

    2001-01-01

    For the quantitative phase analysis of highly textured two-phase materials, samples with known weight fractions of zirconium and aluminum were prepared. Strong texture components prevailed in both the zirconium and aluminum sheets. The diffraction patterns of the samples were measured by neutron diffraction and refined by the Rietveld method. The preferred orientation correction of the diffraction patterns was carried out by means of pole figures recalculated from the ODF. The present Rietveld analysis of various samples with different weight fractions showed that the absolute error of the calculated weight fractions was less than 7.1%. (author)

  7. Analytical applications of a recycled flow nuclear magnetic resonance system: quantitative analysis of slowly relaxing nuclei

    International Nuclear Information System (INIS)

    Laude, D.A. Jr.; Lee, R.W.K.; Wilkins, C.L.

    1985-01-01

    The utility of a recycled flow system for the efficient quantitative analysis of NMR spectra is demonstrated. Requisite conditions are first established for the quantitative flow experiment and then applied to a variety of compounds. An application of the technique to determination of the average polymer chain length for a silicone polymer by quantitative flow 29 Si NMR is also presented. 10 references, 4 figures, 3 tables

  8. The Quantitative Analysis of team game performance by men's basketball teams at OG 2008

    OpenAIRE

    Kocián, Michal

    2009-01-01

    Title: The quantitative analysis of team game performance by men's basketball teams at the 2008 Olympic Games Aims: Find the reasons for successes and failures of teams in the Olympic Games play-offs using quantitative (numerical) observation of selected game statistics. Method: The thesis was based on quantitative (numerical) observation of video recordings using KVANTÝM. Results: The selected statistics obtained described the most essential events leading to team wins or losses. Keywords: basketball, team...

  9. A quantitative study of regional cerebral blood flow in childhood using 123I-IMP-SPECT. With emphasis on age-related changes

    Energy Technology Data Exchange (ETDEWEB)

    Kobayashi, Ayame; Kishi, Kazuko; Sejima, Hitoshi; Haneda, Noriyuki; Uchida, Nobue; Sugimura, Kazuro; Ito, Masatoshi; Shiraishi, Hideyuki [Shimane Medical Univ., Izumo (Japan)

    1996-11-01

    Single photon emission computed tomography (SPECT), using N-isopropyl-p-[123I]iodoamphetamine (123I-IMP), was used for quantitative analysis of regional cerebral blood flow (rCBF) in 26 individuals between 0 and 19 years of age. The rCBF showed age-related changes; it was low in early infancy, increased in late infancy through early childhood, and decreased and remained constant after puberty. The rCBF through the cerebral cortex varied more greatly than through the thalamus and cerebellum, and seemed to depend more closely on age. In a case aged 4 months, rCBF was very low in the frontal region and very high in the occipital region. In older cases, rCBF in the cerebral cortex was higher than in the thalamus. In childhood, rCBF was very inconsistent and showed great inter-individual variance. (author)

  10. A quantitative study of regional cerebral blood flow in childhood using 123I-IMP-SPECT. With emphasis on age-related changes

    International Nuclear Information System (INIS)

    Kobayashi, Ayame; Kishi, Kazuko; Sejima, Hitoshi; Haneda, Noriyuki; Uchida, Nobue; Sugimura, Kazuro; Ito, Masatoshi; Shiraishi, Hideyuki

    1996-01-01

    Single photon emission computed tomography (SPECT), using N-isopropyl-p-[123I]iodoamphetamine (123I-IMP), was used for quantitative analysis of regional cerebral blood flow (rCBF) in 26 individuals between 0 and 19 years of age. The rCBF showed age-related changes; it was low in early infancy, increased in late infancy through early childhood, and decreased and remained constant after puberty. The rCBF through the cerebral cortex varied more greatly than through the thalamus and cerebellum, and seemed to depend more closely on age. In a case aged 4 months, rCBF was very low in the frontal region and very high in the occipital region. In older cases, rCBF in the cerebral cortex was higher than in the thalamus. In childhood, rCBF was very inconsistent and showed great inter-individual variance. (author)

  11. QUANTITATIVE STUDY OF GASTRIC EPITHELIAL LESIONS BY NUCLEOLAR ORGANIZER REGION STAINING

    Directory of Open Access Journals (Sweden)

    M.R. Arab

    2004-11-01

    Full Text Available Nucleolar organizer regions (NORs) are defined as nucleolar components containing a set of argyrophilic proteins which are selectively stained by colloidal silver nitrate staining. Although studies have shown that the number of NOR dots or particles is directly related to the rapidity of cell proliferation in cancer cells, the prognostic or diagnostic value of NORs remains controversial. The aim of the present study was to assess the proliferative activity of NORs in different gastric epithelial lesions. For this purpose, 60 gastric biopsy and surgical specimens from the pathology files of Khatamalanbia and Imam Hospitals were chosen. For each patient, 3-5 paraffin sections were prepared and stained with a one-step colloidal silver nitrate solution. In each section, intranuclear dots in 100 cell nuclei were counted by two of the authors in randomly selected fields and the data were analyzed by ANOVA. Statistical analysis showed a significant difference in NOR number between gastritis, different grades of dysplasia and carcinoma. The shape and number of NORs showed greater variability in carcinoma compared to other lesions. It seems that NOR counts could reflect the proliferative activity of cells.
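
    A minimal sketch of the ANOVA comparison described above; the NOR counts are simulated, not the study's data:

```python
# Sketch: one-way ANOVA comparing mean NOR dots per nucleus across lesion groups.
# The counts are simulated Poisson data (100 nuclei per group).
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(4)
gastritis = rng.poisson(2.0, 100)    # hypothetical NOR dots per nucleus
dysplasia = rng.poisson(3.0, 100)
carcinoma = rng.poisson(5.0, 100)

f_stat, p_value = f_oneway(gastritis, dysplasia, carcinoma)
print(f"F = {f_stat:.1f}, p = {p_value:.2e}")
```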

  12. Quantitation, regional vulnerability, and kinetic modeling of brain glucose metabolism in mild Alzheimer's disease

    International Nuclear Information System (INIS)

    Mosconi, Lisa; Rusinek, Henry; De Santi, Susan; Li, Yi; Tsui, Wai H.; De Leon, Mony J.; Wang, Gene-Jack; Fowler, Joanna; Pupi, Alberto

    2007-01-01

    To examine CMRglc measures and corresponding glucose transport (K1 and k2) and phosphorylation (k3) rates in the medial temporal lobe (MTL, comprising the hippocampus and amygdala) and posterior cingulate cortex (PCC) in mild Alzheimer's disease (AD). Dynamic FDG PET with arterial blood sampling was performed in seven mild AD patients (age 68 ± 8 years, four females, median MMSE 23) and six normal (NL) elderly (age 69 ± 9 years, three females, median MMSE 30). Absolute CMRglc (μmol/100 g/min) was calculated from MRI-defined regions of interest using multiparametric analysis with individually fitted kinetic rate constants, Gjedde-Patlak plot, and Sokoloff's autoradiographic method with population-based rate constants. Relative ROI/pons CMRglc (unitless) was also examined. With all methods, AD patients showed significant CMRglc reductions in the hippocampus and PCC, and a trend towards reduced parietotemporal CMRglc, as compared with NL. Significant k3 reductions were found in the hippocampus, PCC and amygdala. K1 reductions were restricted to the hippocampus. Relative CMRglc had the largest effect sizes in separating AD from NL. However, the magnitude of CMRglc reductions was 1.2- to 1.9-fold greater with absolute than with relative measures. CMRglc reductions are most prominent in the MTL and PCC in mild AD, as detected with both absolute and relative CMRglc measures. Results are discussed in terms of clinical and pharmaceutical applicability. (orig.)
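
    A hedged sketch of the Gjedde-Patlak graphical step mentioned above; the plasma and tissue curves are simulated, and the plasma glucose level and lumped constant are placeholder values:

```python
# Sketch of a Gjedde-Patlak graphical analysis: the late-time slope of
# (tissue/plasma) vs. (integral of plasma / plasma) estimates the net influx
# constant Ki, from which CMRglc is scaled. All inputs here are synthetic.
import numpy as np

t = np.linspace(1, 60, 60)                               # minutes (1-min steps)
cp = 100.0 * np.exp(-0.1 * t) + 5.0                      # simulated plasma activity
ct = 0.02 * np.cumsum(cp) * (t[1] - t[0]) + 0.3 * cp     # simulated tissue activity (Ki ~ 0.02)

x = np.cumsum(cp) * (t[1] - t[0]) / cp                   # normalized plasma integral
y = ct / cp
late = t > 20                                            # use the linear late-time portion
ki, intercept = np.polyfit(x[late], y[late], 1)          # slope = net influx constant (1/min)

plasma_glucose = 5.0                                     # umol/mL, assumed
lumped_constant = 0.65                                   # placeholder value
cmrglc = ki * plasma_glucose / lumped_constant * 100.0   # illustrative scaling per 100 g tissue
print(f"Ki = {ki:.4f} 1/min, CMRglc ~ {cmrglc:.1f} (illustrative units)")
```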

  13. A Strategy for Identifying Quantitative Trait Genes Using Gene Expression Analysis and Causal Analysis

    Directory of Open Access Journals (Sweden)

    Akira Ishikawa

    2017-11-01

    Full Text Available Large numbers of quantitative trait loci (QTL affecting complex diseases and other quantitative traits have been reported in humans and model animals. However, the genetic architecture of these traits remains elusive due to the difficulty in identifying causal quantitative trait genes (QTGs for common QTL with relatively small phenotypic effects. A traditional strategy based on techniques such as positional cloning does not always enable identification of a single candidate gene for a QTL of interest because it is difficult to narrow down a target genomic interval of the QTL to a very small interval harboring only one gene. A combination of gene expression analysis and statistical causal analysis can greatly reduce the number of candidate genes. This integrated approach provides causal evidence that one of the candidate genes is a putative QTG for the QTL. Using this approach, I have recently succeeded in identifying a single putative QTG for resistance to obesity in mice. Here, I outline the integration approach and discuss its usefulness using my studies as an example.

  14. A Strategy for Identifying Quantitative Trait Genes Using Gene Expression Analysis and Causal Analysis.

    Science.gov (United States)

    Ishikawa, Akira

    2017-11-27

    Large numbers of quantitative trait loci (QTL) affecting complex diseases and other quantitative traits have been reported in humans and model animals. However, the genetic architecture of these traits remains elusive due to the difficulty in identifying causal quantitative trait genes (QTGs) for common QTL with relatively small phenotypic effects. A traditional strategy based on techniques such as positional cloning does not always enable identification of a single candidate gene for a QTL of interest because it is difficult to narrow down a target genomic interval of the QTL to a very small interval harboring only one gene. A combination of gene expression analysis and statistical causal analysis can greatly reduce the number of candidate genes. This integrated approach provides causal evidence that one of the candidate genes is a putative QTG for the QTL. Using this approach, I have recently succeeded in identifying a single putative QTG for resistance to obesity in mice. Here, I outline the integration approach and discuss its usefulness using my studies as an example.

  15. Clustering applications in financial and economic analysis of the crop production in the Russian regions

    Directory of Open Access Journals (Sweden)

    Gromov Vladislav Vladimirovich

    2013-08-01

    Full Text Available We used complex mathematical modeling, multivariate statistical analysis and fuzzy sets to analyze the financial and economic state of crop production in the Russian regions. We developed a system of indicators describing the state of the agricultural sector in each region, based on the results of correlation, factor and cluster analysis and on statistics of the Federal State Statistics Service. We performed cluster analysis to divide the regions of Russia into five groups according to the selected factors. Qualitative and quantitative characteristics of each cluster were obtained.
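
    A minimal sketch of the clustering step described above (random placeholder indicators instead of the derived financial and economic factors):

```python
# Sketch: standardize regional indicators and partition regions into five clusters.
# The indicator matrix is random; real inputs would be the derived regional factors.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(5)
n_regions, n_indicators = 80, 6
X = rng.normal(size=(n_regions, n_indicators))     # placeholder indicator values per region

X_std = StandardScaler().fit_transform(X)
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X_std)
for k in range(5):
    print(f"cluster {k}: {np.sum(labels == k)} regions")
```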

  16. Quantitative analysis of heavy water by NMR spectroscopy

    International Nuclear Information System (INIS)

    Gomez Gil, V.

    1975-01-01

    Nuclear Magnetic Resonance has been applied to a wide variety of quantitative problems. A typical example has been the determination of isotopic composition. In this paper, two different analytical methods for the determination of water in deuterium oxide are described. The first one employs acetonitrile as an internal standard compound, and in the second one a calibration curve of signal integral versus amount of D2O is constructed. Both methods give results comparable to those of mass spectrometry or IR spectroscopy. (Author) 5 refs

  17. Quantitative analysis of a wind energy conversion model

    International Nuclear Information System (INIS)

    Zucker, Florian; Gräbner, Anna; Strunz, Andreas; Meyn, Jan-Peter

    2015-01-01

    A rotor of 12 cm diameter is attached to a precision electric motor, used as a generator, to make a model wind turbine. Output power of the generator is measured in a wind tunnel with air velocity up to 15 m/s. The maximum power is 3.4 W; the power conversion factor from kinetic to electric energy is c_p = 0.15. The v³ power law is confirmed. The model illustrates several technically important features of industrial wind turbines quantitatively. (paper)
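
    The quoted conversion factor follows from comparing the electrical output with the kinetic power flowing through the rotor disc; below is a worked check using the abstract's numbers and an assumed air density of about 1.2 kg/m³:

```python
# Worked check of the quoted power conversion factor, assuming air density ~1.2 kg/m^3.
import math

rho = 1.2                      # kg/m^3, assumed air density
diameter = 0.12                # m (rotor of 12 cm diameter)
v = 15.0                       # m/s, maximum air velocity in the tunnel
p_electric = 3.4               # W, maximum measured output

area = math.pi * (diameter / 2) ** 2
p_wind = 0.5 * rho * area * v ** 3            # kinetic power through the rotor disc
cp = p_electric / p_wind
print(f"P_wind = {p_wind:.1f} W, c_p = {cp:.2f}")   # ~0.15, matching the abstract
```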

  18. Risk management and analysis: risk assessment (qualitative and quantitative)

    OpenAIRE

    Valentin Mazareanu

    2007-01-01

    Risk is usually defined as the possibility of suffering a loss. Starting from this, risk management is defined as a business process whose purpose is to ensure that the organization is protected against risks and their effects. In order to prioritize the identified risks, to develop a response plan and then to monitor them, we need to assess them. But at this point a question arises: should we choose a qualitative approach or a quantitative one? This paper gives a short overview of the risk eva...

  19. Correlation of quantitative histopathological morphology and quantitative radiological analysis during aseptic loosening of hip endoprostheses.

    Science.gov (United States)

    Bertz, S; Kriegsmann, J; Eckardt, A; Delank, K-S; Drees, P; Hansen, T; Otto, M

    2006-01-01

    Aseptic hip prosthesis loosening is the most important long-term complication in total hip arthroplasty. Polyethylene (PE) wear is the dominant etiologic factor in aseptic loosening, which together with other factors induces mechanisms resulting in bone loss, and finally in implant loosening. The single-shot radiograph analysis (EBRA, abbreviation for the German term "Einzel-Bild-Röntgenanalyse") is a computerized method for early radiological prediction of aseptic loosening. In this study, EBRA parameters were correlated with histomorphological parameters of the periprosthetic membrane. Periprosthetic membranes obtained from 19 patients during revision surgery of loosened ABG I-type total hip prostheses were analyzed histologically and morphometrically. The pre-existing EBRA parameters, the thickness of the PE debris layer and the dimension of inclination and anteversion, were compared with the density of macrophages and giant cells. Additionally, the semiquantitatively determined density of lymphocytes, plasma cells, giant cells and the size of the necrotic areas were correlated with the EBRA results. All periprosthetic membranes were classified as debris-induced type membranes. We found a positive correlation between the number of giant cells and the thickness of the PE debris layer. There was no significant correlation between the number of macrophages or all semiquantitative parameters and EBRA parameters. The number of giant cells decreased with implant duration. The morphometrically measured number of foreign body giant cells more closely reflects the results of the EBRA. The semiquantitative estimation of giant cell density could not substitute for the morphometrical analysis. The density of macrophages, lymphocytes, plasma cells and the size of necrotic areas did not correlate with the EBRA parameters, indicating that there is no correlation with aseptic loosening.

  20. MCM-2 and Ki-67 as proliferation markers in renal cell carcinoma: A quantitative and semi-quantitative analysis.

    Science.gov (United States)

    Mehdi, Muhammad Zain; Nagi, Abdul Hanan; Naseem, Nadia

    2016-01-01

    Fuhrman nuclear grade is the most important histological parameter for predicting prognosis in a patient with renal cell carcinoma (RCC). However, it suffers from inter-observer and intra-observer variation, giving rise to the need for a parameter that not only correlates with nuclear grade but is also objective and reproducible. Proliferation is a measure of the aggressiveness of a tumour and is strongly correlated with Fuhrman nuclear grade, clinical survival and recurrence in RCC. Ki-67 is conventionally used to assess proliferation. Mini-chromosome maintenance 2 (MCM-2) is a lesser known marker of proliferation and identifies a greater proliferation fraction. This study was designed to assess the prognostic significance of MCM-2 by comparing it with Fuhrman nuclear grade and Ki-67. n=50 cases of various ages, stages, histological subtypes and grades of RCC were selected for this study. Immunohistochemical staining using Ki-67 (MIB-1, mouse monoclonal antibody, Dako) and MCM-2 (mouse monoclonal antibody, Thermo) was performed on the paraffin-embedded blocks in the Department of Morbid Anatomy and Histopathology, University of Health Sciences, Lahore. Labeling indices (LI) were determined by two pathologists independently using quantitative and semi-quantitative analysis. Statistical analysis was carried out using SPSS 20.0. The Kruskal-Wallis test was used to determine the correlation of the proliferation markers with grade, and Pearson's correlation was used to determine the correlation between the two proliferation markers. The labeling index of MCM-2 (median=24.29%) was found to be much higher than that of Ki-67 (median=13.05%). Both markers were significantly related to grade (p=0.00; Kruskal-Wallis test). The LI of MCM-2 was found to correlate significantly with the LI of Ki-67 (r=0.0934; p=0.01, Pearson's correlation). Results of the semi-quantitative analysis correlated well with the quantitative analysis. Both Ki-67 and MCM-2 are markers of proliferation which are closely linked to grade. Therefore, they

  1. Quantitative EEG analysis in minimally conscious state patients during postural changes.

    Science.gov (United States)

    Greco, A; Carboncini, M C; Virgillito, A; Lanata, A; Valenza, G; Scilingo, E P

    2013-01-01

    Mobilization and postural changes of patients with cognitive impairment are standard clinical practices useful for both the psychic and physical rehabilitation process. During this process, several physiological signals, such as the electroencephalogram (EEG), electrocardiogram (ECG), photoplethysmography (PPG), respiration activity (RESP) and electrodermal activity (EDA), are monitored and processed. In this paper we investigated how quantitative EEG (qEEG) changes with postural modifications in minimally conscious state patients. This study is quite novel and no similar experimental data can be found in the current literature; therefore, although the results are very encouraging, a quantitative analysis of the cortical areas activated during such postural changes still needs to be investigated in depth. More specifically, this paper shows EEG power spectra and brain symmetry index modifications during a verticalization procedure, from 0 to 60 degrees, of three patients in a Minimally Conscious State (MCS) with focal regions of impairment. Experimental results show a significant increase of the power in the β band (12-30 Hz), commonly associated with human alertness, thus suggesting that mobilization and postural changes can have beneficial effects in MCS patients.
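
    A minimal sketch of the band-power computation implied above (synthetic signals, an assumed sampling rate, and a simple pairwise symmetry ratio rather than the exact brain symmetry index used in the paper):

```python
# Sketch: beta-band (12-30 Hz) power from an EEG channel via Welch's method,
# plus a simple left/right symmetry ratio. The signals are synthetic.
import numpy as np
from scipy.signal import welch

fs = 256                                   # Hz, assumed sampling rate
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(6)
left = np.sin(2 * np.pi * 20 * t) + rng.normal(0, 1, t.size)          # synthetic left channel
right = 0.5 * np.sin(2 * np.pi * 20 * t) + rng.normal(0, 1, t.size)   # synthetic right channel

def band_power(sig, lo=12.0, hi=30.0):
    f, pxx = welch(sig, fs=fs, nperseg=fs * 2)
    band = (f >= lo) & (f <= hi)
    return np.trapz(pxx[band], f[band])    # integrate the power spectral density

p_left, p_right = band_power(left), band_power(right)
symmetry = (p_left - p_right) / (p_left + p_right)   # simple symmetry index, not the exact BSI
print(f"beta power L/R = {p_left:.3f}/{p_right:.3f}, symmetry = {symmetry:.2f}")
```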

  2. The usefulness of 3D quantitative analysis with using MRI for measuring osteonecrosis of the femoral head

    International Nuclear Information System (INIS)

    Hwang, Ji Young; Lee, Sun Wha; Park, Youn Soo

    2006-01-01

    We wanted to evaluate the usefulness of MRI 3D quantitative analysis for measuring osteonecrosis of the femoral head in comparison with MRI 2D quantitative analysis and quantitative analysis of the specimen. Over 3 months at our hospital, 14 femoral head specimens with osteonecrosis were obtained after total hip arthroplasty. The patients' preoperative MRIs were retrospectively reviewed for quantitative analysis of the size of the necrosis. The necrotic fraction of each femoral head was measured by 2D quantitative analysis using mid-coronal and mid-sagittal MRIs, and by 3D quantitative analysis using serial continuous coronal MRIs and 3D reconstruction software. The necrotic fraction of the specimen was physically measured by the fluid displacement method. The necrotic fraction according to MRI 2D or 3D quantitative analysis was compared with that of the specimen using Spearman's correlation test. On the correlative analysis, the necrotic fraction by MRI 2D quantitative analysis and quantitative analysis of the specimen showed moderate correlation (r = 0.657); on the other hand, the necrotic fraction by MRI 3D quantitative analysis and quantitative analysis of the specimen demonstrated a strong correlation (r = 0.952) (p < 0.05). MRI 3D quantitative analysis was more accurate than 2D quantitative analysis for measuring osteonecrosis of the femoral head. Therefore, it may be useful for predicting the clinical outcome and deciding the proper treatment option

  3. Micro-computer system for quantitative image analysis of damage microstructure

    International Nuclear Information System (INIS)

    Kohyama, A.; Kohno, Y.; Satoh, K.; Igata, N.

    1984-01-01

    Quantitative image analysis of radiation-induced damage microstructure is very important in evaluating material behavior in a radiation environment. However, few improvements have been made in the quantitative analysis of damage microstructure over recent decades. The objective of this work is to develop a new system for quantitative image analysis of damage microstructure which could improve the accuracy and efficiency of data sampling and processing and could make it possible to obtain new information about the mutual relations among dislocations, precipitates, cavities, grain boundaries, etc. In this system, data sampling is done with an X-Y digitizer. The cavity microstructure in dual-ion irradiated 316 SS is analyzed and the effectiveness of this system is discussed. (orig.)

  4. Quantitative analysis of the secretion of the MCP family of chemokines by muscle cells

    DEFF Research Database (Denmark)

    Henningsen, Jeanette; Pedersen, Bente Klarlund; Kratchmarova, Irina

    2011-01-01

    Use of the Stable Isotope Labeling by Amino acids in Cell culture (SILAC) method for quantitative analysis resulted in the identification and generation of quantitative profiles of 59 growth factors and cytokines, including 9 classical chemokines. The members of the CC chemokine family of proteins such as monocyte chemotactic proteins 1, 2...

  5. Quantitative analysis of target components by comprehensive two-dimensional gas chromatography

    NARCIS (Netherlands)

    Mispelaar, V.G. van; Tas, A.C.; Smilde, A.K.; Schoenmakers, P.J.; Asten, A.C. van

    2003-01-01

    Quantitative analysis using comprehensive two-dimensional (2D) gas chromatography (GC) is still rarely reported. This is largely due to a lack of suitable software. The objective of the present study is to generate quantitative results from a large GC x GC data set, consisting of 32 chromatograms.

  6. Quantitative analysis of γ-oryzanol content in cold pressed rice bran oil by TLC-image analysis method

    OpenAIRE

    Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana

    2014-01-01

    Objective: To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. Methods: TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Results: Both assays provided good linearity, accuracy, reproducibility and selectivity for dete...

  7. Quantitative analysis of the relations between transportation and socio-economic development

    Science.gov (United States)

    Chen, Yun

    2017-12-01

    Transportation has a close relationship with socio-economic development. This article selects indicators that can measure the development of transportation and of the socio-economy, and uses correlation analysis, regression analysis, transportation intensity analysis and transport elasticity analysis to analyze the relationship between them quantitatively, so that it can provide practical guidance for future national development planning.

  8. Toward best practices in data processing and analysis for intact biotherapeutics by MS in quantitative bioanalysis.

    Science.gov (United States)

    Kellie, John F; Kehler, Jonathan R; Karlinsey, Molly Z; Summerfield, Scott G

    2017-12-01

    Typically, quantitation of biotherapeutics from biological matrices by LC-MS is based on a surrogate peptide approach to determine molecule concentration. Recent efforts have focused on quantitation of the intact protein molecules or larger mass subunits of monoclonal antibodies. To date, there has been limited guidance for large or intact protein mass quantitation for quantitative bioanalysis. Intact- and subunit-level analyses of biotherapeutics from biological matrices are performed at 12-25 kDa mass range with quantitation data presented. Linearity, bias and other metrics are presented along with recommendations made on the viability of existing quantitation approaches. This communication is intended to start a discussion around intact protein data analysis and processing, recognizing that other published contributions will be required.

  9. A quantitative assessment of groundwater resources in the Middle East and North Africa region

    Science.gov (United States)

    Lezzaik, Khalil; Milewski, Adam

    2018-02-01

    The Middle East and North Africa (MENA) region is the world's most water-stressed region, with its countries constituting 12 of the 15 most water-stressed countries globally. Because of data paucity, comprehensive regional-scale assessments of groundwater resources in the MENA region have been lacking. The presented study addresses this issue by using a distributed ArcGIS model, parametrized with gridded data sets, to estimate groundwater storage reserves in the region based on generated aquifer saturated thickness and effective porosity estimates. Furthermore, monthly gravimetric datasets (GRACE) and land surface parameters (GLDAS) were used to quantify changes in groundwater storage between 2003 and 2014. Total groundwater reserves in the region were estimated at 1.28 × 10⁶ cubic kilometers (km³) with an uncertainty range between 816,000 and 1.93 × 10⁶ km³. Most of the reserves are located within large sedimentary basins in North Africa and the Arabian Peninsula, with Algeria, Libya, Egypt, and Saudi Arabia accounting for approximately 75% of the region's total freshwater reserves. Alternatively, small groundwater reserves were found in fractured Precambrian basement exposures. As for groundwater changes between 2003 and 2014, all MENA countries except for Morocco exhibited declines in groundwater storage. However, given the region's large groundwater reserves, groundwater changes between 2003 and 2014 are minimal and represent no immediate short-term threat to the MENA region, with some exceptions. Notwithstanding this, the study recommends the development of sustainable and efficient groundwater management policies to optimally utilize the region's groundwater resources, especially in the face of climate change, demographic expansion, and socio-economic development.
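
    A hedged sketch of the storage arithmetic behind such estimates (groundwater volume as area × saturated thickness × effective porosity); the aquifer values below are invented placeholders, not the study's gridded data:

```python
# Sketch of the storage estimate: volume = area x saturated thickness x effective porosity.
# The aquifer values are hypothetical placeholders.
aquifers = [
    # (name, area km^2, saturated thickness m, effective porosity)
    ("basin_A", 500_000, 300.0, 0.10),
    ("basin_B", 120_000, 150.0, 0.05),
]

total_km3 = 0.0
for name, area_km2, thickness_m, porosity in aquifers:
    volume_km3 = area_km2 * (thickness_m / 1000.0) * porosity   # km^2 * km * (-) = km^3
    total_km3 += volume_km3
    print(f"{name}: {volume_km3:,.0f} km^3")
print(f"total groundwater storage ~ {total_km3:,.0f} km^3")
```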

  10. Quantitative phosphoproteomic analysis of porcine muscle within 24 h postmortem

    DEFF Research Database (Denmark)

    Huang, Honggang; Larsen, Martin Røssel; Palmisano, Giuseppe

    2014-01-01

    in meat quality development, a quantitative mass spectrometry-based phosphoproteomic study was performed to analyze the porcine muscle within 24h PM using dimethyl labeling combined with the TiSH phosphopeptide enrichment strategy. In total 305 unique proteins were identified, including 160...... phosphorylation levels in muscle within 24 h PM. The high phosphorylation level of heat shock proteins (HSPs) in early PM may be an adaptive response to slaughter stress and protect muscle cell from apoptosis, as observed in the serine 84 of HSP27. This work indicated that PM muscle proteins underwent significant...... and rigor mortis development in PM muscle. BIOLOGICAL SIGNIFICANCE: The manuscript describes the characterization of postmortem (PM) porcine muscle within 24 h postmortem from the perspective of protein phosphorylation using advanced phosphoproteomic techniques. In the study, the authors employed...

  11. Quantitative analysis of distributed control paradigms of robot swarms

    DEFF Research Database (Denmark)

    Ngo, Trung Dung

    2010-01-01

    describe the physical and simulated robots, experiment scenario, and experiment setup. Third, we present our robot controllers based on behaviour based and neural network based paradigms. Fourth, we graphically show their experiment results and quantitatively analyse the results in comparison of the two......Given a task of designing controller for mobile robots in swarms, one might wonder which distributed control paradigms should be selected. Until now, paradigms of robot controllers have been within either behaviour based control or neural network based control, which have been recognized as two...... mainstreams of controller design for mobile robots. However, in swarm robotics, it is not clear how to determine control paradigms. In this paper we study the two control paradigms with various experiments of swarm aggregation. First, we introduce the two control paradigms for mobile robots. Second, we...

  12. Dynamic Quantitative Trait Locus Analysis of Plant Phenomic Data.

    Science.gov (United States)

    Li, Zitong; Sillanpää, Mikko J

    2015-12-01

    Advanced platforms have recently become available for automatic and systematic quantification of plant growth and development. These new techniques can efficiently produce multiple measurements of phenotypes over time, and introduce time as an extra dimension to quantitative trait locus (QTL) studies. Functional mapping utilizes a class of statistical models for identifying QTLs associated with the growth characteristics of interest. A major benefit of functional mapping is that it integrates information over multiple timepoints, and therefore could increase the statistical power for QTL detection. We review the current development of computationally efficient functional mapping methods which provide invaluable tools for analyzing large-scale timecourse data that are readily available in our post-genome era. Copyright © 2015 Elsevier Ltd. All rights reserved.

  13. Quantitative Analysis of Matrine in Liquid Crystalline Nanoparticles by HPLC

    Directory of Open Access Journals (Sweden)

    Xinsheng Peng

    2014-01-01

    Full Text Available A reversed-phase high-performance liquid chromatographic method has been developed to quantitatively determine matrine in liquid crystal nanoparticles. The chromatographic method is carried out using an isocratic system. The mobile phase was composed of methanol-PBS (pH 6.8)-triethylamine (50:50:0.1%) at a flow rate of 1 mL/min with an SPD-20A UV/vis detector, and the detection wavelength was 220 nm. The linearity of matrine is in the range of 1.6 to 200.0 μg/mL. The regression equation is y=10706x-2959 (R2=1.0). The average recovery is 101.7%; RSD=2.22% (n=9). This method provides a simple and accurate strategy to determine matrine in liquid crystalline nanoparticles.

  14. Quantitative Trait Loci Analysis of Allelopathy in Rice

    DEFF Research Database (Denmark)

    Jensen, L B; Courtois, B; Olofsdotter, M

    2008-01-01

    The allelopathic potential of rice (Oryza sativa L.) against Echinochloa crus-galli (L.) Beauv. was investigated under both laboratory and greenhouse conditions. A population of 150 recombinant inbred lines (RILs) was derived through single-seed descent from a cross between the indica cultivar AC...... the population phenotype was normally distributed. Two quantitative trait loci (QTLs) were located on chromosomes 4 and 7, explaining 20% of the phenotypic variation. A second relay seeding experiment was set up, this time including charcoal in the perlite. This screening showed that the allelopathic rice...... varieties did not have any effect on the weed species when grown with charcoal, the charcoal reversing the effect of any potential allelochemicals exuded from the rice roots. The second phenotypic experiment was conducted under greenhouse conditions in pots. Thirteen QTLs were detected for four different...

  15. Quantitative analysis on electric dipole energy in Rashba band splitting.

    Science.gov (United States)

    Hong, Jisook; Rhim, Jun-Won; Kim, Changyoung; Ryong Park, Seung; Hoon Shim, Ji

    2015-09-01

    We report on quantitative comparison between the electric dipole energy and the Rashba band splitting in model systems of Bi and Sb triangular monolayers under a perpendicular electric field. We used both first-principles and tight binding calculations on p-orbitals with spin-orbit coupling. First-principles calculation shows Rashba band splitting in both systems. It also shows asymmetric charge distributions in the Rashba split bands which are induced by the orbital angular momentum. We calculated the electric dipole energies from coupling of the asymmetric charge distribution and external electric field, and compared it to the Rashba splitting. Remarkably, the total split energy is found to come mostly from the difference in the electric dipole energy for both Bi and Sb systems. A perturbative approach for long wave length limit starting from tight binding calculation also supports that the Rashba band splitting originates mostly from the electric dipole energy difference in the strong atomic spin-orbit coupling regime.

  16. Combinational Reasoning of Quantitative Fuzzy Topological Relations for Simple Fuzzy Regions

    Science.gov (United States)

    Liu, Bo; Li, Dajun; Xia, Yuanping; Ruan, Jian; Xu, Lili; Wu, Huanyi

    2015-01-01

    In recent years, formalization and reasoning of topological relations have become a hot topic as a means to generate knowledge about the relations between spatial objects at the conceptual and geometrical levels. These mechanisms have been widely used in spatial data query, spatial data mining, evaluation of equivalence and similarity in a spatial scene, as well as for consistency assessment of the topological relations of multi-resolution spatial databases. The concept of computational fuzzy topological space is applied to simple fuzzy regions to efficiently and more accurately solve fuzzy topological relations. Thus, extending the existing research and improving upon the previous work, this paper presents a new method to describe fuzzy topological relations between simple spatial regions in Geographic Information Sciences (GIS) and Artificial Intelligence (AI). Firstly, we propose a new definition for simple fuzzy line segments and simple fuzzy regions based on the computational fuzzy topology. Then, based on the new definitions, we propose a new combinational reasoning method to compute the topological relations between simple fuzzy regions. Moreover, this study has discovered that there are (1) 23 different topological relations between a simple crisp region and a simple fuzzy region and (2) 152 different topological relations between two simple fuzzy regions. Finally, we discuss some examples to demonstrate the validity of the new method; through comparisons with existing fuzzy models, we show that the proposed method can compute more relations than the existing models, as it is more expressive. PMID:25775452

  17. Quantitative analysis of alveolar bone change following implant placement using intraoral radiographic subtraction

    International Nuclear Information System (INIS)

    Kimura, Hiroyuki; Kanda, Shigenobu; Tanaka, Takemasa

    2002-01-01

    The purpose of this study was to develop a procedure for quantitative analysis of alveolar bone using intraoral radiographs after placement of dental implants and to consider the validity of the method. We evaluated ten patients (2 males and 8 females, average age 48.4 years) who underwent dental implant surgery in the mandibular molar region between October 1999 and September 2000 at Kimura Dental Clinic (Kumamoto, Japan). We evaluated the intraoral radiographs taken pre- and post-operatively and at follow-up examination. To detect alveolar bone change on the radiographs, we adopted the digital subtraction method. Although the radiographs were taken with an ordinary technique using a cone indicator, we did not apply a standardized technique with fixing material customized for each patient. Therefore, we used geometric correction and density compensation before subtraction. We assessed the basic statistical values (mean, variance, kurtosis and skewness) of the region of interest (ROI) of the subtracted images. Also, we noted PPD (probing pocket depth) and BOP (bleeding on probing) at each site as indicators of clinical findings, and all implanted sites were classified according to PPD or BOP, i.e. a PPD increased group "PPD (+)" and a PPD stable group "PPD (-)", and likewise a BOP positive group "BOP (+)" and a negative group "BOP (-)". We compared the statistical values of the ROI between these groups. The mean and variance values of PPD (+) were higher than those of PPD (-), and there was a significant difference in the mean value (p=0.031). Similarly, the mean and variance values of BOP (+) were statistically higher than those of BOP (-) (p=0.041 and p=0.0087, respectively). Concerning kurtosis and skewness, there was no difference between PPD (+) and PPD (-), or between BOP (+) and BOP (-). Using our method, the radiographs taken for follow-up examination could be assessed quantitatively. It is suggested that geometric
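
    A minimal sketch of the ROI statistics step described above, using a random placeholder image instead of a registered, subtracted radiograph:

```python
# Sketch: first-order statistics of a region of interest in a subtracted radiograph.
# The "subtracted image" is random noise here; a real ROI would come from the registered images.
import numpy as np
from scipy.stats import kurtosis, skew

rng = np.random.default_rng(7)
subtracted = rng.normal(0, 5, size=(512, 512))       # placeholder subtraction image
roi = subtracted[200:260, 300:360].ravel()           # hypothetical ROI around the implant

print(f"mean     = {roi.mean():.2f}")
print(f"variance = {roi.var():.2f}")
print(f"kurtosis = {kurtosis(roi):.2f}")
print(f"skewness = {skew(roi):.2f}")
```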

  18. Integration of Qualitative and Quantitative Methods: Building and Interpreting Clusters from Grounded Theory and Discourse Analysis

    Directory of Open Access Journals (Sweden)

    Aldo Merlino

    2007-01-01

    Full Text Available Qualitative methods present a wide spectrum of application possibilities as well as opportunities for combining qualitative and quantitative methods. In the social sciences, fruitful theoretical discussions and a great deal of empirical research have taken place. This article introduces an empirical investigation which demonstrates the logic of combining methodologies as well as the collection and interpretation, both sequential and simultaneous, of qualitative and quantitative data. Specifically, the investigation process is described, beginning with a grounded theory methodology and its combination with the techniques of structural semiotics discourse analysis to generate, in a first phase, an instrument for quantitative measurement and to understand, in a second phase, clusters obtained by quantitative analysis. This work illustrates how qualitative methods allow for the comprehension of the discursive and behavioral elements under study, and how they function as support in making sense of and giving meaning to quantitative data. URN: urn:nbn:de:0114-fqs0701219

  19. Application of non-quantitative modelling in the analysis of a network warfare environment

    CSIR Research Space (South Africa)

    Veerasamy, N

    2008-07-01

    Full Text Available based on the use of secular associations, chronological origins, linked concepts, categorizations and context specifications. This paper proposes the use of non-quantitative methods through a morphological analysis to better explore and define...

  20. Operation Iraqi Freedom 04 - 06: Opportunities to Apply Quantitative Methods to Intelligence Analysis

    National Research Council Canada - National Science Library

    Hansen, Eric C

    2005-01-01

    The purpose of this presentation is to illustrate the need for a quantitative analytical capability within organizations and staffs that provide intelligence analysis to Army, Joint, and Coalition Force headquarters...

  1. Quantitative analysis by microchip capillary electrophoresis – current limitations and problem-solving strategies

    NARCIS (Netherlands)

    Revermann, T.; Götz, S.; Künnemeyer, Jens; Karst, U.

    2008-01-01

    Obstacles and possible solutions for the application of microchip capillary electrophoresis in quantitative analysis are described and critically discussed. Differences between the phenomena occurring during conventional capillary electrophoresis and microchip-based capillary electrophoresis are

  2. Quantitative method of X-ray diffraction phase analysis of building materials

    International Nuclear Information System (INIS)

    Czuba, J.; Dziedzic, A.

    1978-01-01

    Quantitative method of X-ray diffraction phase analysis of building materials, with use of internal standard, has been presented. The errors committed by determining the content of particular phases have been also given. (author)

  3. Regional analysis and environmental impact assessment

    International Nuclear Information System (INIS)

    Parzyck, D.C.; Brocksen, R.W.; Emanuel, W.R.

    1976-01-01

    This paper presents a number of techniques that can be used to assess environmental impacts on a regional scale. Regional methodologies have been developed which examine impacts upon aquatic and terrestrial biota in regions through consideration of changes in land use, land cover, air quality, water resource use, and water quality. Techniques used to assess long-range atmospheric transport, water resources, effects on sensitive forest and animal species, and impacts on man are presented in this paper, along with an optimization approach which serves to integrate the analytical techniques in an overall assessment framework. A brief review of the research approach and certain modeling techniques used within one regional studies program is provided. While it is not an all-inclusive report on regional analyses, it does present an illustration of the types of analyses that can be performed on a regional scale.

  4. A quantitative magnetic resonance histology atlas of postnatal rat brain development with regional estimates of growth and variability.

    Science.gov (United States)

    Calabrese, Evan; Badea, Alexandra; Watson, Charles; Johnson, G Allan

    2013-05-01

    There has been growing interest in the role of postnatal brain development in the etiology of several neurologic diseases. The rat has long been recognized as a powerful model system for studying neuropathology and the safety of pharmacologic treatments. However, the complex spatiotemporal changes that occur during rat neurodevelopment remain to be elucidated. This work establishes the first magnetic resonance histology (MRH) atlas of the developing rat brain, with an emphasis on quantitation. The atlas comprises five specimens at each of nine time points, imaged with eight distinct MR contrasts and segmented into 26 developmentally defined brain regions. The atlas was used to establish a timeline of morphometric changes and variability throughout neurodevelopment and represents a quantitative database of rat neurodevelopment for characterizing rat models of human neurologic disease. Published by Elsevier Inc.

  5. Quantitative phosphoproteomic analysis of porcine muscle within 24 h postmortem.

    Science.gov (United States)

    Huang, Honggang; Larsen, Martin R; Palmisano, Giuseppe; Dai, Jie; Lametsch, René

    2014-06-25

    Protein phosphorylation can regulate most of the important processes in muscle, such as metabolism and contraction. The postmortem (PM) metabolism and rigor mortis have essential effects on meat quality. In order to identify and characterize the protein phosphorylation events involved in meat quality development, a quantitative mass spectrometry-based phosphoproteomic study was performed to analyze the porcine muscle within 24h PM using dimethyl labeling combined with the TiSH phosphopeptide enrichment strategy. In total 305 unique proteins were identified, including 160 phosphoproteins with 784 phosphorylation sites. Among these, 184 phosphorylation sites on 93 proteins had their phosphorylation levels significantly changed. The proteins involved in glucose metabolism and muscle contraction were the two largest clusters of phosphoproteins with significantly changed phosphorylation levels in muscle within 24 h PM. The high phosphorylation level of heat shock proteins (HSPs) in early PM may be an adaptive response to slaughter stress and protect muscle cell from apoptosis, as observed in the serine 84 of HSP27. This work indicated that PM muscle proteins underwent significant changes at the phosphorylation level but were relatively stable at the total protein level, suggesting that protein phosphorylation may have important roles in meat quality development through the regulation of proteins involved in glucose metabolism and muscle contraction, thereby affecting glycolysis and rigor mortis development in PM muscle. The manuscript describes the characterization of postmortem (PM) porcine muscle within 24 h postmortem from the perspective of protein phosphorylation using advanced phosphoproteomic techniques. In the study, the authors employed the dimethyl labeling combined with the TiSH phosphopeptide enrichment and LC-MS/MS strategy. This was the first high-throughput quantitative phosphoproteomic study in PM muscle of farm animals. In the work, both the proteome

  6. Quantitative morphotectonic analysis of the South-Eastern Carpathians

    Science.gov (United States)

    Ionuţ Cristea, Alexandru

    2015-04-01

    The South-Eastern Carpathians (Vrancea Region) have received increasing scientific attention during the past years, mostly resulting in a detailed reconstruction of their exhumation history. Moreover, structural and thermochronological data suggest that the frontal part of the SE Carpathians conserves the youngest topography in the Romanian Carpathians, resulting from a deformational process occurring during the late Pliocene - Early Pleistocene. This significant tectonic activity continues to the present time, as confirmed by geodetic measurements and by the frequency of crustal earthquakes. The specific effects of the Quaternary deformations on the regional fluvial system have so far been associated with increased incision and the formation of degradational (strath) terraces, downstream tilting of terraces, the establishment of local drainage divides and young longitudinal river profiles. Our study further investigates the possible influence of the recent tectonic activity on the characteristics of the drainage basins in the area and the distribution of the over-steepened stream reaches using spatial autocorrelation techniques (Getis-Ord Gi* statistics and Anselin's Local Moran's I). For the former, hypsometric integrals (Hi) and the transverse topographic symmetry factor were analyzed. For the latter, we used the locally computed normalized channel steepness index (ksn). Due to the highly variable lithology in the region (specific to the Flysch areas), additional correlations of the determined values with the geological units and rock types have been made in order to assess the effects. The results show that the geographic clustering of the high Hi and ksn values is more significant than the lithological one, and, although rock strength has local influences, this is not sufficient to explain the regional distribution of the values, generally between 26.5° and 26.66° E (p
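
    As a rough illustration of one of the indices used above, the hypsometric integral of a drainage basin can be approximated by the elevation-relief ratio (mean minus minimum elevation, divided by total relief); the sketch below assumes basin elevations have already been extracted from a DEM into an array, and the values shown are synthetic.

```python
import numpy as np

def hypsometric_integral(elevations):
    """Elevation-relief ratio approximation of the hypsometric integral Hi."""
    z = np.asarray(elevations, dtype=float)
    z = z[np.isfinite(z)]                      # drop DEM no-data cells
    return (z.mean() - z.min()) / (z.max() - z.min())

# Synthetic basin elevations (metres); real values would come from a DEM
# clipped to each drainage basin.
rng = np.random.default_rng(1)
basin = rng.uniform(400, 1600, size=5000)
print(f"Hi = {hypsometric_integral(basin):.3f}")
```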

  7. Enhancing local action planning through quantitative flood risk analysis: a case study in Spain

    Science.gov (United States)

    Castillo-Rodríguez, Jesica Tamara; Escuder-Bueno, Ignacio; Perales-Momparler, Sara; Ramón Porta-Sancho, Juan

    2016-07-01

    This article presents a method to incorporate and promote quantitative risk analysis to support local action planning against flooding. The proposed approach aims to provide a framework for local flood risk analysis, combining hazard mapping with vulnerability data to quantify risk in terms of expected annual affected population, potential injuries, number of fatalities, and economic damages. Flood risk is estimated by combining GIS data of loads, system response, and consequences and using event tree modelling for risk calculation. The study area is the city of Oliva, located on the eastern coast of Spain. Results from risk modelling have been used to inform local action planning and to assess the benefits of structural and non-structural risk reduction measures. Results show the potential impact on risk reduction of flood defences and improved warning communication schemes through local action planning: societal flood risk (in terms of annual expected affected population) would be reduced by up to 51 % by combining both structural and non-structural measures. In addition, the effect of seasonal population variability is analysed (annual expected affected population ranges from 82 to 107 %, compared with the current situation, depending on occupancy rates in hotels and campsites). Results highlight the need for robust and standardized methods for urban flood risk analysis to ensure replicability at regional and national scales.
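
    A minimal sketch of how an expected annual consequence (here, affected population) can be obtained by integrating consequences over flood event probabilities, in the spirit of the risk quantification described above; the event probabilities and consequence figures below are invented for illustration and are not data from Oliva.

```python
import numpy as np

# Annual exceedance probabilities of a set of flood scenarios (e.g. return
# periods of 10, 25, 50, 100 and 500 years) and the population affected in
# each scenario. Both columns are illustrative placeholders.
aep      = np.array([0.10, 0.04, 0.02, 0.01, 0.002])
affected = np.array([150, 900, 2500, 6000, 15000])

# Expected annual affected population: area under the consequence vs.
# exceedance-probability curve (trapezoidal rule).
order = np.argsort(aep)
eaap = np.trapz(affected[order], aep[order])
print(f"Expected annual affected population ≈ {eaap:.0f} persons/year")
```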

  8. Siberian Regional Identity in the Context of Historical Consciousness (Content Analysis of Tomsk Regional Media

    Directory of Open Access Journals (Sweden)

    A V Bocharov

    2011-12-01

    Full Text Available The article presents a model to study the Siberian regional identity in the context of historical consciousness, as well as the results of its practical application in the content analysis of the publications by the Tomsk regional media. On the basis of the content analysis procedures the author demonstrates how, through historical memory, the regional identity is formed and manifested in the regional media in various spheres of society.

  9. [Device for quantitative analysis of perception and pain sensation].

    Science.gov (United States)

    Arita, Hideko; Kato, Jitsu; Ogawa, Setsuro; Hanaoka, Kazuo

    2014-07-01

    The article describes an analysing device that measures the perception and intensity of pain quantitatively. While it is not necessarily true that the psychological aspect is totally irrelevant to pain measurement, this device is remarkable in that it is capable of measuring the intensity of pain felt by the patient more objectively by using electric stimuli. The feature of this device is that it uses a non-painful heteresthesia for measuring the intensity of pain. The device is compact, lightweight, and portable. Unlike the VAS, which requires only a scale, the device requires a person to carry out the measurement. Nevertheless, as National Health Insurance (NHI) coverage has been approved, introduction of the device may be facilitated in terms of budget for purchase and labor. The device is useful for better understanding not only the intensity of pain but also the pathological conditions, resulting in more appropriate treatment, by (1) comparing the degree of pain or VAS values obtained in a multicenter study with those of a patient; (2) using both the degree of pain and VAS; and (3) taking multiple measurements of the degree of pain and VAS in one case.

  10. Quantitative analysis of impact measurements using dynamic load cells

    Directory of Open Access Journals (Sweden)

    Brent J. Maranzano

    2016-03-01

    Full Text Available A mathematical model is used to estimate material properties from a short-duration transient impact force measured by dropping spheres onto rectangular coupons fixed to a dynamic load cell. The contact stress between the dynamic load cell surface and the projectile is modeled using Hertzian contact mechanics. Due to the short impact time relative to the load cell dynamics, an additional Kelvin–Voigt element is included in the model to account for the finite response time of the piezoelectric crystal. Calculations with and without the Kelvin–Voigt element are compared to experimental data collected from combinations of polymeric spheres and polymeric and metallic surfaces. The results illustrate that the inclusion of the Kelvin–Voigt element qualitatively captures the post-impact resonance and non-linear behavior of the load cell signal and quantitatively improves the estimation of Young's elastic modulus and Poisson's ratio. Mathematically, the additional KV element couples one additional differential equation to the Hertzian spring-dashpot equation. The model can be numerically integrated in seconds using standard numerical techniques, allowing for its use as a rapid technique for the estimation of material properties. Keywords: Young's modulus, Poisson's ratio, Dynamic load cell
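
    A simplified sketch of the Hertzian spring-dashpot impact model mentioned above (without the additional Kelvin–Voigt element for the load cell), integrated numerically; all material and geometric parameters below are placeholders, not values used in the study.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Placeholder parameters for a small polymer sphere striking a flat coupon.
m = 2.0e-3            # sphere mass, kg
R = 5.0e-3            # sphere radius, m
E_star = 2.0e9        # effective contact modulus, Pa
c = 5.0               # linear damping coefficient, N*s/m
k_hertz = (4.0 / 3.0) * E_star * np.sqrt(R)   # Hertzian stiffness, F = k*d^1.5
v0 = 1.0              # impact velocity, m/s

def rhs(t, y):
    # y[0] = indentation depth d (m), y[1] = its rate (m/s)
    d, v = y
    force = k_hertz * max(d, 0.0) ** 1.5 + c * v * (d > 0.0)
    return [v, -force / m]

sol = solve_ivp(rhs, (0.0, 2e-4), [0.0, v0], max_step=1e-7)
contact_force = k_hertz * np.clip(sol.y[0], 0.0, None) ** 1.5
print(f"Peak elastic contact force ≈ {contact_force.max():.1f} N")
```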

  11. Quantitative risk analysis offshore-Human and organizational factors

    International Nuclear Information System (INIS)

    Espen Skogdalen, Jon; Vinnem, Jan Erik

    2011-01-01

    Quantitative Risk Analyses (QRAs) are one of the main tools for risk management within the Norwegian and UK oil and gas industry. Much criticism has been given to the limitations related to the QRA-models and that the QRAs do not include human and organizational factors (HOF-factors). Norway and UK offshore legislation and guidelines require that the HOF-factors are included in the QRAs. A study of 15 QRAs shows that the factors are to some extent included, and there are large differences between the QRAs. The QRAs are categorized into four levels according to the findings. Level 1 QRAs do not describe or comment on the HOF-factors at all. Relevant research projects have been conducted to fulfill the requirements of Level 3 analyses. At this level, there is a systematic collection of data related to HOF. The methods are systematic and documented, and the QRAs are adjusted. None of the QRAs fulfill the Level 4 requirements. Level 4 QRAs include the model and describe the HOF-factors as well as explain how the results should be followed up in the overall risk management. Safety audits by regulatory authorities are probably necessary to point out the direction for QRA and speed up the development.

  12. Quantitative analysis of cholesteatoma using high resolution computed tomography

    International Nuclear Information System (INIS)

    Kikuchi, Shigeru; Yamasoba, Tatsuya; Iinuma, Toshitaka.

    1992-01-01

    Seventy-three cases of adult cholesteatoma, including 52 cases of pars flaccida type cholesteatoma and 21 of pars tensa type cholesteatoma, were examined using high resolution computed tomography, in both axial (lateral semicircular canal plane) and coronal sections (cochlear, vestibular and antral plane). These cases were classified into two subtypes according to the presence of extension of cholesteatoma into the antrum. Sixty cases of chronic otitis media with central perforation (COM) were also examined as controls. Various locations of the middle ear cavity were measured in terms of size in comparison with pars flaccida type cholesteatoma, pars tensa type cholesteatoma and COM. The width of the attic was significantly larger in both pars flaccida type and pars tensa type cholesteatoma than in COM. With pars flaccida type cholesteatoma there was a significantly larger distance between the malleus and lateral wall of the attic than with COM. In contrast, the distance between the malleus and medial wall of the attic was significantly larger with pars tensa type cholesteatoma than with COM. With cholesteatoma extending into the antrum, regardless of the type of cholesteatoma, there were significantly larger distances than with COM at the following sites: the width and height of the aditus ad antrum, and the width, height and anterior-posterior diameter of the antrum. However, these distances were not significantly different between cholesteatoma without extension into the antrum and COM. The hitherto demonstrated qualitative impressions of bone destruction in cholesteatoma were quantitatively verified in detail using high resolution computed tomography. (author)

  13. Quantitative XRD analysis of {110} twin density in biotic aragonites.

    Science.gov (United States)

    Suzuki, Michio; Kim, Hyejin; Mukai, Hiroki; Nagasawa, Hiromichi; Kogure, Toshihiro

    2012-12-01

    {110} Twin densities in biotic aragonite have been estimated quantitatively from the peak widths of specific reflections in powder X-ray diffraction (XRD) patterns, as well as direct confirmation of the twins using transmission electron microscopy (TEM). Influence of the twin density on the peak widths in the XRD pattern was simulated using DIFFaX program, regarding (110) twin as interstratification of two types of aragonite unit layers with mirrored relationship. The simulation suggested that the twin density can be estimated from the difference of the peak widths between 111 and 021, or between 221 and 211 reflections. Biotic aragonite in the crossed-lamellar microstructure (three species) and nacreous microstructure (four species) of molluscan shells, fish otoliths (two species), and a coral were investigated. The XRD analyses indicated that aragonite crystals in the crossed-lamellar microstructure of the three species contain high density of the twins, which is consistent with the TEM examination. On the other hand, aragonite in the nacre of the four species showed almost no difference of the peak widths between the paired reflections, indicating low twin densities. The results for the fish otoliths were varied between the species. Such variation of the twin density in biotic aragonites may reflect different schemes of crystal growth in biomineralization. Copyright © 2012 Elsevier Inc. All rights reserved.

  14. Quantitative Analysis and Efficient Surface Modification of Silica Nanoparticles

    Directory of Open Access Journals (Sweden)

    Hak-Sung Jung

    2012-01-01

    Full Text Available Aminofunctional trialkoxysilanes such as aminopropyltrimethoxysilane (APTMS) and (3-trimethoxysilylpropyl)diethylenetriamine (DETAS) were employed as surface modification molecules for generating monolayer modification on the surface of silica (SiO2) nanoparticles. We were able to quantitatively analyze the number of amine functional groups on the modified SiO2 nanoparticles by an acid-base back titration method and determine the effective number of amine functional groups for the successive chemical reaction by absorption measurements after treating with fluorescent rhodamine B isothiocyanate (RITC) molecules. The numbers of amine sites measured by back titration were 2.7 and 7.7 ea/nm2 for SiO2-APTMS and SiO2-DETAS, respectively, while the numbers of effective amine sites measured by absorption calibration were about one fifth of the total amine sites, namely, 0.44 and 1.3 ea/nm2 for SiO2-APTMS(RITC) and SiO2-DETAS(RITC), respectively. Furthermore, it was confirmed that the reactivity of amino groups on the surface-modified silica nanoparticles could be maintained in ethanol for more than 1.5 months without showing any significant differences in the reactivity.
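
    A small sketch of the surface-density arithmetic implied above: converting a titrated amount of amine groups into sites per nm² of particle surface, assuming monodisperse spherical particles; the particle size, sample mass and titration result below are invented placeholders.

```python
import math

# Placeholder inputs (not values from the paper).
particle_diameter_nm = 50.0        # SiO2 particle diameter
silica_density = 2.2e-21           # g per nm^3 (≈ 2.2 g/cm^3)
sample_mass_g = 0.100              # mass of modified nanoparticles titrated
amine_mol = 2.5e-5                 # mol of amine found by back titration

AVOGADRO = 6.022e23

r = particle_diameter_nm / 2.0
area_per_particle = 4.0 * math.pi * r ** 2            # nm^2
volume_per_particle = (4.0 / 3.0) * math.pi * r ** 3  # nm^3
mass_per_particle = silica_density * volume_per_particle  # g

n_particles = sample_mass_g / mass_per_particle
total_area_nm2 = n_particles * area_per_particle
amine_sites = amine_mol * AVOGADRO

print(f"Amine density ≈ {amine_sites / total_area_nm2:.2f} sites/nm^2")
```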

  15. Quantitative analysis of TALE-DNA interactions suggests polarity effects.

    Science.gov (United States)

    Meckler, Joshua F; Bhakta, Mital S; Kim, Moon-Soo; Ovadia, Robert; Habrian, Chris H; Zykovich, Artem; Yu, Abigail; Lockwood, Sarah H; Morbitzer, Robert; Elsäesser, Janett; Lahaye, Thomas; Segal, David J; Baldwin, Enoch P

    2013-04-01

    Transcription activator-like effectors (TALEs) have revolutionized the field of genome engineering. We present here a systematic assessment of TALE DNA recognition, using quantitative electrophoretic mobility shift assays and reporter gene activation assays. Within TALE proteins, tandem 34-amino acid repeats recognize one base pair each and direct sequence-specific DNA binding through repeat variable di-residues (RVDs). We found that RVD choice can affect affinity by four orders of magnitude, with the relative RVD contribution in the order NG > HD ≈ NN > NI > NK. The NN repeat preferred the base G over A, whereas the NK repeat bound G with 10(3)-fold lower affinity. We compared AvrBs3, a naturally occurring TALE that recognizes its target using some atypical RVD-base combinations, with a designed TALE that precisely matches 'standard' RVDs with the target bases. This comparison revealed unexpected differences in sensitivity to substitutions of the invariant 5'-T. Another surprising observation was that base mismatches at the 5' end of the target site had more disruptive effects on affinity than those at the 3' end, particularly in designed TALEs. These results provide evidence that TALE-DNA recognition exhibits a hitherto un-described polarity effect, in which the N-terminal repeats contribute more to affinity than C-terminal ones.

  16. Quantitative analysis of the thermal damping of coherent axion oscillations

    International Nuclear Information System (INIS)

    Turner, M.S.

    1985-01-01

    Unruh and Wald have recently discussed a new mechanism for damping coherent axion oscillations, ''thermal damping,'' which occurs due to the temperature dependence of the axion mass and neutrino viscosity. We investigate the effect quantitatively and find that the present energy density in axions can be written as ρ_a = ρ_a0/(1 + J_UW), where ρ_a0 is what the axion energy density would be in the absence of the thermal-damping effect and J_UW is an integral whose integrand depends upon (dm_a/dT)^2. As a function of f (≡ the Peccei-Quinn symmetry-breaking scale), J_UW achieves its maximum value for f_PQ ≈ 3 x 10^12 GeV; unless the axion mass turn-on is very sudden, |(T/m_a)(dm_a/dT)| >> 1, J_UW is << 1, implying that this damping mechanism is not significant.

  17. Quantitative analysis of complexes in electron irradiated CZ silicon

    International Nuclear Information System (INIS)

    Inoue, N.; Ohyama, H.; Goto, Y.; Sugiyama, T.

    2007-01-01

    Complexes in helium or electron irradiated silicon are quantitatively analyzed by highly sensitive and accurate infrared (IR) absorption spectroscopy. Carbon concentration (1x10^15-1x10^17 cm^-3) and helium dose (5x10^12-5x10^13 cm^-2) or electron dose (1x10^15-1x10^17 cm^-2) are changed by two orders of magnitude in a relatively low regime compared to previous works. It is demonstrated that the carbon-related complex in commercial-grade, low carbon concentration silicon with a low electron dose can be detected clearly. The concentration of these complexes is estimated. It is clarified that the complex configuration and thermal behavior in low carbon and low dose samples are simple and almost confined within the individual complex family compared to those in high concentration and high dose samples. The well-established complex behavior in electron-irradiated samples is compared to that in He-irradiated samples, obtained by deep level transient spectroscopy (DLTS) or cathodoluminescence (CL), which had a close relation to the Si power device performance.

  18. Quantitative analysis of task selection for brain-computer interfaces

    Science.gov (United States)

    Llera, Alberto; Gómez, Vicenç; Kappen, Hilbert J.

    2014-10-01

    Objective. To assess quantitatively the impact of task selection in the performance of brain-computer interfaces (BCI). Approach. We consider the task-pairs derived from multi-class BCI imagery movement tasks in three different datasets. We analyze for the first time the benefits of task selection on a large-scale basis (109 users) and evaluate the possibility of transferring task-pair information across days for a given subject. Main results. Selecting the subject-dependent optimal task-pair among three different imagery movement tasks results in approximately 20% potential increase in the number of users that can be expected to control a binary BCI. The improvement is observed with respect to the best task-pair fixed across subjects. The best task-pair selected for each subject individually during a first day of recordings is generally a good task-pair in subsequent days. In general, task learning from the user side has a positive influence in the generalization of the optimal task-pair, but special attention should be given to inexperienced subjects. Significance. These results add significant evidence to existing literature that advocates task selection as a necessary step towards usable BCIs. This contribution motivates further research focused on deriving adaptive methods for task selection on larger sets of mental tasks in practical online scenarios.

  19. Quantitative Performance Analysis of the SPEC OMPM2001 Benchmarks

    Directory of Open Access Journals (Sweden)

    Vishal Aslot

    2003-01-01

    Full Text Available The state of modern computer systems has evolved to allow easy access to multiprocessor systems by supporting multiple processors on a single physical package. As the multiprocessor hardware evolves, new ways of programming it are also developed. Some inventions may merely be adopting and standardizing the older paradigms. One such evolving standard for programming shared-memory parallel computers is the OpenMP API. The Standard Performance Evaluation Corporation (SPEC) has created a suite of parallel programs called SPEC OMP to compare and evaluate modern shared-memory multiprocessor systems using the OpenMP standard. We have studied these benchmarks in detail to understand their performance on a modern architecture. In this paper, we present detailed measurements of the benchmarks. We organize, summarize, and display our measurements using a Quantitative Model. We present a detailed discussion and derivation of the model. Also, we discuss the important loops in the SPEC OMPM2001 benchmarks and the reasons for less than ideal speedup on our platform.

  20. A CGE analysis for quantitative evaluation of electricity market changes

    International Nuclear Information System (INIS)

    Hwang, Won-Sik; Lee, Jeong-Dong

    2015-01-01

    Risk and uncertainty entailed by electricity industry privatization impose a heavy burden on the political determination. In this sense, ex ante analyses are important in order to investigate the economic effects of privatization or liberalization in the electricity industry. For the purpose of fulfilling these quantitative analyses, a novel approach is developed, incorporating a top-down and bottom-up model that takes into account economic effects and technological constraints simultaneously. This study also examines various counterfactual scenarios after Korean electricity industry reform through the integrated framework. Simulation results imply that authorities should prepare an improved regulatory system and policy measures such as forward contracts for industry reform, in order to promote competition in the distribution sector as well as the generation sector. -- Highlights: •A novel approach is proposed for incorporating a top-down and bottom-up model. •This study examines various counterfactual scenarios after Korean electricity industry reform. •An improved regulatory system and policy measures are required before the reform

  1. An iterative approach to case study analysis: insights from qualitative analysis of quantitative inconsistencies

    Directory of Open Access Journals (Sweden)

    Allain J Barnett

    2016-09-01

    Full Text Available Large-N comparative studies have helped common pool resource scholars gain general insights into the factors that influence collective action and governance outcomes. However, these studies are often limited by missing data, and suffer from the methodological limitation that important information is lost when we reduce textual information to quantitative data. This study was motivated by nine case studies that appeared to be inconsistent with the expectation that the presence of Ostrom’s Design Principles increases the likelihood of successful common pool resource governance. These cases highlight the limitations of coding and analysing Large-N case studies. We examine two issues: 1 the challenge of missing data and 2 potential approaches that rely on context (which is often lost in the coding process to address inconsistencies between empirical observations theoretical predictions.  For the latter, we conduct a post-hoc qualitative analysis of a large-N comparative study to explore 2 types of inconsistencies: 1 cases where evidence for nearly all design principles was found, but available evidence led to the assessment that the CPR system was unsuccessful and 2 cases where the CPR system was deemed successful despite finding limited or no evidence for design principles.  We describe inherent challenges to large-N comparative analysis to coding complex and dynamically changing common pool resource systems for the presence or absence of design principles and the determination of “success”.  Finally, we illustrate how, in some cases, our qualitative analysis revealed that the identity of absent design principles explained inconsistencies hence de-facto reconciling such apparent inconsistencies with theoretical predictions.  This analysis demonstrates the value of combining quantitative and qualitative analysis, and using mixed-methods approaches iteratively to build comprehensive methodological and theoretical approaches to understanding

  2. Evaluation of Quantitative and Qualitative Traits of Some Apricot Cultivars Grown in Zanjan Region

    Directory of Open Access Journals (Sweden)

    sanaz molaie

    2017-02-01

    Full Text Available Introduction: Apricot (Prunus armeniaca L.) has a special position in Iran's fruit culture industry. In terms of cultivation and production, Iran is one of the major countries in the world, but in terms of export Iran is ranked 23rd. For this reason, research on the needs of the fruit culture industry and access to new cultivars through breeding projects is required. Obviously, attention to the quality and quantity of products and the use of well-characterized local germplasm play a significant role in such projects. Apricot, with 2n=16, shows extensive diversity due to sexual propagation and cultivation in different areas. The Central Asian and Caucasian groups of apricot, which include Iranian and Turkish cultivars, have the greatest phenotypic variation, while the European group, including cultivars grown in North America, Australia and South Africa, has the lowest diversity. Climate adaptation, improved fruit quality, self-compatibility and resistance to diseases are the most important goals of apricot breeding. Of course, the quality of fruits depends on the sugar and acid balance and special aroma. One of the important targets of apricot breeding is to introduce and develop cultivars that can be cultivated in extensive areas. The aim of the present study was the preliminary evaluation of morphological and pomological traits of some cultivars and genotypes of apricot grown in Zanjan province, in order to identify cultivars that produce high-quality fruit and to support the selection of ideal cultivars for this region in the future. Materials and methods: This research was carried out on four cultivars (Badami, Daneshkadeh, Shekarpareh, Shahroodi) and two genotypes (C and D) and was conducted in a completely randomized design with three replications. Evaluation of tree, branch, leaf, flower and some fruit traits was performed based on the existing descriptor. To determine some important pomological traits, fruits were harvested at commercial harvest time

  3. [Quantitative spectrum analysis of characteristic gases of spontaneous combustion coal].

    Science.gov (United States)

    Liang, Yun-Tao; Tang, Xiao-Jun; Luo, Hai-Zhu; Sun, Yong

    2011-09-01

    Considering the characteristics of spontaneous combustion gases, such as the variety of gases involved, the low limits of detection required, and the critical safety requirements, Fourier transform infrared (FTIR) spectral analysis is presented for analyzing the characteristic gases of spontaneous combustion. In this paper, the analysis method is first introduced by combining the characteristics of the absorption spectra of the analytes with the analysis requirements. The parameter setting method, sample preparation, feature variable abstraction and analysis model building are taken into consideration. The methods of sample preparation, feature abstraction and analysis modelling are introduced in detail. Then, eleven kinds of gases were tested with a Tensor 27 spectrometer: CH4, C2H6, C3H8, iC4H10, nC4H10, C2H4, C3H6, C2H2, SF6, CO and CO2. The optical path length was 10 cm and the spectral resolution was set to 1 cm^-1. The testing results show that the detection limit of all analytes is less than 2 x 10^-6. All the detection limits meet the measurement requirements for spontaneous combustion gases, which means that FTIR may be an ideal instrument and the analysis method used in this paper is suitable for online measurement of spontaneous combustion gases.

  4. Quantitative Brightness Analysis of Fluorescence Intensity Fluctuations in E. Coli.

    Directory of Open Access Journals (Sweden)

    Kwang-Ho Hur

    Full Text Available The brightness measured by fluorescence fluctuation spectroscopy specifies the average stoichiometry of a labeled protein in a sample. Here we extended brightness analysis, which has been mainly applied in eukaryotic cells, to prokaryotic cells with E. coli serving as a model system. The small size of the E. coli cell introduces unique challenges for applying brightness analysis that are addressed in this work. Photobleaching leads to a depletion of fluorophores and a reduction of the brightness of protein complexes. In addition, the E. coli cell and the point spread function of the instrument only partially overlap, which influences intensity fluctuations. To address these challenges we developed MSQ analysis, which is based on the mean Q-value of segmented photon count data, and combined it with the analysis of axial scans through the E. coli cell. The MSQ method recovers brightness, concentration, and diffusion time of soluble proteins in E. coli. We applied MSQ to measure the brightness of EGFP in E. coli and compared it to solution measurements. We further used MSQ analysis to determine the oligomeric state of nuclear transport factor 2 labeled with EGFP expressed in E. coli cells. The results obtained demonstrate the feasibility of quantifying the stoichiometry of proteins by brightness analysis in a prokaryotic cell.
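
    As a rough illustration of the segmented Q-value idea described above, the sketch below computes Mandel's Q-parameter (the variance-to-mean excess of photon counts) on segments of a count trace and averages the segment values; the trace is simulated, and the normalization needed to convert Q into molecular brightness is deliberately omitted, so this is only an assumption-laden sketch, not the MSQ implementation from the paper.

```python
import numpy as np

def mean_segmented_q(counts, segment_length):
    """Average Mandel Q over non-overlapping segments of a photon-count trace.

    Q = var(k)/mean(k) - 1 is zero for pure shot noise and grows with the
    brightness of the diffusing fluorescent species.
    """
    counts = np.asarray(counts, dtype=float)
    n_segments = counts.size // segment_length
    segments = counts[: n_segments * segment_length].reshape(n_segments, segment_length)
    q_values = segments.var(axis=1, ddof=1) / segments.mean(axis=1) - 1.0
    return q_values.mean()

# Simulated photon counts: Poisson shot noise around a slowly fluctuating rate.
rng = np.random.default_rng(2)
rate = 5.0 + 1.5 * np.sin(np.linspace(0, 40 * np.pi, 200_000))
counts = rng.poisson(rate)
print(f"mean segmented Q ≈ {mean_segmented_q(counts, 2_000):.3f}")
```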

  5. Calibration Phantom for Quantitative Tomography Analysis of Biodistribution of Magnetic Nanoparticles

    Science.gov (United States)

    Rahn, Helen; Kettering, Melanie; Richter, Heike; Hilger, Ingrid; Trahms, Lutz; Odenbach, Stefan

    2010-12-01

    Ferrofluids are being investigated for cancer treatments such as magnetic drug targeting (MDT) and magnetic heating treatments with the aim of treating the cancer locally, since magnetic nanoparticles with attached drugs are concentrated within the target region. Thus, the side effects are considerably reduced. One of the crucial factors for the success of these therapies is the magnetic nanoparticle distribution. Microcomputed X-ray tomography (XμCT) has been introduced as an adequate technique for non-destructive three-dimensional analysis of biological samples enriched with magnetic nanoparticles. The biological tissue specimens, in this case tumor-bearing mice after intra-tumoral magnetic nanoparticle injection, have been analyzed by means of XμCT. Complementary measurements have been performed by magnetorelaxometry (MRX). This technique enables a sensitive quantification of magnetic nanoparticles down to a few nanograms. For multi-phase samples, such as biological tissue enriched with magnetic nanoparticles, the polychromaticity and beam-hardening artifacts occurring in XμCT with conventional X-ray tubes cause severe problems for quantitative density determination. This problem requires an appropriate calibration of the polychromatic tomography equipment, enabling a semi-quantitative analysis of the data. For this purpose, a phantom system has been implemented. These phantoms consist of a tissue substitute containing different amounts of magnetic nanoparticles. Since the attenuation of the beam also depends on the thickness, i.e. the path length of the beam through the object, the reference sample has been designed with a cone shape. Thus, with one phantom, information about both the magnetic nanoparticle concentration and the attenuation as a function of path length can be determined. Two phantom systems will be presented, one based on agarose gel and one based on soap.

  6. Segmental Quantitative MR Imaging analysis of diurnal variation of water content in the lumbar intervertebral discs

    International Nuclear Information System (INIS)

    Zhu, Ting Ting; Ai, Tao; Zhang, Wei; Li, Tao; Li, Xiao Ming

    2015-01-01

    To investigate the changes in water content in the lumbar intervertebral discs by quantitative T2 MR imaging in the morning after bed rest and evening after a diurnal load. Twenty healthy volunteers were separately examined in the morning after bed rest and in the evening after finishing daily work. T2-mapping images were obtained and analyzed. An equally-sized rectangular region of interest (ROI) was manually placed in both, the anterior and the posterior annulus fibrosus (AF), in the outermost 20% of the disc. Three ROIs were placed in the space defined as the nucleus pulposus (NP). Repeated-measures analysis of variance and paired 2-tailed t tests were used for statistical analysis, with p < 0.05 as significantly different. T2 values significantly decreased from morning to evening, in the NP (anterior NP = -13.9 ms; central NP = -17.0 ms; posterior NP = -13.3 ms; all p < 0.001). Meanwhile T2 values significantly increased in the anterior AF (+2.9 ms; p = 0.025) and the posterior AF (+5.9 ms; p < 0.001). T2 values in the posterior AF showed the largest degree of variation among the 5 ROIs, but there was no statistical significance (p = 0.414). Discs with initially low T2 values in the center NP showed a smaller degree of variation in the anterior NP and in the central NP, than in discs with initially high T2 values in the center NP (10.0% vs. 16.1%, p = 0.037; 6.4% vs. 16.1%, p = 0.006, respectively). Segmental quantitative T2 MRI provides valuable insights into physiological aspects of normal discs.
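
    The statistics reported above rest on paired comparisons of morning and evening T2 values per subject; a minimal sketch of such a paired test with scipy is shown below, using made-up T2 values in milliseconds rather than the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical central-NP T2 values (ms) for a few subjects, morning vs. evening.
t2_morning = np.array([115.0, 122.0, 108.0, 131.0, 119.0, 125.0])
t2_evening = np.array([ 98.0, 104.0,  95.0, 112.0, 101.0, 109.0])

t_stat, p_value = stats.ttest_rel(t2_morning, t2_evening)
mean_change = (t2_evening - t2_morning).mean()
print(f"mean diurnal change = {mean_change:.1f} ms, paired t-test p = {p_value:.4f}")
```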

  7. Quantitative description of the regional mechanics of the left atria by electroanatomical mapping

    International Nuclear Information System (INIS)

    Kuklik, Pawel; Molaee, Payman; Brooks, Anthony G; John, Bobby; Worthley, Stephen G; Sanders, Prashanthan

    2010-01-01

    The left atrium is a complex chamber, which plays an integral role in the maintenance of physiologic hemodynamic and electrical stability of the heart and is involved in many disease states, most commonly atrial fibrillation. Preserving the regions of the left atrium that contribute the most to atrial mechanical function during curative strategies for atrial fibrillation is important. We present here a new application of the CARTO electroanatomical mapping system in the assessment of left atrial mechanical function. Electroanatomical data were collected in the course of the electrophysiological procedure in 11 control patients and 12 patients with paroxysmal atrial fibrillation. The three-dimensional geometry of the left atrium was reconstructed at 10 ms intervals and segmented into distinct regions. For each segment, a regional ejection fraction was calculated. We found that the anterior, septal and lateral segments have significantly greater regional ejection fractions than the atrial roof, inferior and posterior segments. Therefore, we hypothesize that in order to minimize the impact on atrial mechanical function, an important determinant of thromboembolic risk, damage should be minimized to these atrial regions.
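
    A minimal sketch of the regional ejection fraction computed above: for each atrial segment, the fractional volume change between its maximal and minimal volume over the reconstructed 10 ms frames; the segment volume series below is synthetic.

```python
import numpy as np

def regional_ejection_fraction(segment_volumes):
    """Fractional emptying of one atrial segment over the cardiac cycle.

    segment_volumes : volumes of the segment (e.g. in ml) reconstructed at
    successive 10 ms intervals, as described above.
    """
    v = np.asarray(segment_volumes, dtype=float)
    return (v.max() - v.min()) / v.max()

# Synthetic volume curve for one segment over one cycle (ml).
t = np.linspace(0, 1, 80)
volumes = 6.0 + 1.2 * np.cos(2 * np.pi * t)
print(f"regional EF ≈ {regional_ejection_fraction(volumes):.2%}")
```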

  8. Quantitative analysis of bowel gas by plain abdominal radiograph combined with computer image processing

    International Nuclear Information System (INIS)

    Gao Yan; Peng Kewen; Zhang Houde; Shen Bixian; Xiao Hanxin; Cai Juan

    2003-01-01

    Objective: To establish a method for quantitative analysis of bowel gas by plain abdominal radiograph and computer graphics. Methods: Plain abdominal radiographs in the supine position from 25 patients with irritable bowel syndrome (IBS) and 20 healthy controls were studied. A gastroenterologist and a radiologist independently conducted the following procedure on each radiograph. After the outline of bowel gas was traced with a pen, the radiograph was digitized by a digital camera and transmitted to the computer with Histogram software. The total gas area was determined as the pixel value on images. The ratio of the bowel gas quantity to the pixel value in the region surrounded by a horizontal line tangential to the superior pubic symphysis margin, a horizontal line tangential to the tenth dorsal vertebra inferior margin, and the lateral lines tangential to the right and left anterosuperior iliac crests was defined as the gas volume score (GVS). To examine the sequential reproducibility, a second plain abdominal radiograph was performed in 5 normal controls 1 week later, and the GVS were compared. Results: Bowel gas was easily identified on the plain abdominal radiograph. Both the large and small intestine were located in the selected region. Both observers could finish one radiographic measurement in less than 10 minutes. The correlation coefficient between the two observers was 0.986. There was no statistical difference in GVS between the two sequential radiographs in the 5 healthy controls. Conclusion: Quantification of bowel gas based on plain abdominal radiography and computer image processing is simple, rapid, and reliable
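
    A small sketch of the gas volume score (GVS) defined above: the ratio of bowel-gas pixels to the pixels of the bounded abdominal region. Both masks are assumed to be already available as boolean images, which is an assumption made purely for illustration; the toy masks below are random.

```python
import numpy as np

def gas_volume_score(gas_mask, region_mask):
    """GVS = bowel-gas pixel count / pixel count of the bounded abdominal region.

    gas_mask    : boolean image, True where bowel gas was outlined.
    region_mask : boolean image, True inside the region bounded by the pubic
                  symphysis, the tenth dorsal vertebra and the iliac crests.
    """
    gas_pixels = np.count_nonzero(gas_mask & region_mask)
    region_pixels = np.count_nonzero(region_mask)
    return gas_pixels / region_pixels

# Toy example with random masks; real masks would come from the traced films.
rng = np.random.default_rng(3)
region = np.ones((512, 512), dtype=bool)
gas = rng.random((512, 512)) < 0.07
print(f"GVS ≈ {gas_volume_score(gas, region):.3f}")
```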

  9. Quantitative assessment of in-solution digestion efficiency identifies optimal protocols for unbiased protein analysis

    DEFF Research Database (Denmark)

    Leon, Ileana R; Schwämmle, Veit; Jensen, Ole N

    2013-01-01

    a combination of qualitative and quantitative LC-MS/MS methods and statistical data analysis. In contrast to previous studies we employed both standard qualitative as well as data-independent quantitative workflows to systematically assess trypsin digestion efficiency and bias using mitochondrial protein...... conditions (buffer, RapiGest, deoxycholate, urea), and two methods for removal of detergents prior to analysis of peptides (acid precipitation or phase separation with ethyl acetate). Our data-independent quantitative LC-MS/MS workflow quantified over 3700 distinct peptides with 96% completeness between all...... protocols and replicates, with an average 40% protein sequence coverage and an average of 11 peptides identified per protein. Systematic quantitative and statistical analysis of physicochemical parameters demonstrated that deoxycholate-assisted in-solution digestion combined with phase transfer allows...

  10. Stable Isotope Quantitative N-Glycan Analysis by Liquid Separation Techniques and Mass Spectrometry.

    Science.gov (United States)

    Mittermayr, Stefan; Albrecht, Simone; Váradi, Csaba; Millán-Martín, Silvia; Bones, Jonathan

    2017-01-01

    Liquid phase separation analysis and subsequent quantitation remains a challenging task for protein-derived oligosaccharides due to their inherent structural complexity and diversity. Incomplete resolution or co-detection of multiple glycan species complicates peak area-based quantitation and associated statistical analysis when optical detection methods are used. The approach outlined herein describes the utilization of stable isotope variants of commonly used fluorescent tags that allow for mass-based glycan identification and relative quantitation following separation by liquid chromatography (LC) or capillary electrophoresis (CE). Comparability assessment of glycoprotein-derived oligosaccharides is performed by derivatization with commercially available isotope variants of 2-aminobenzoic acid or aniline and analysis by LC- and CE-mass spectrometry. Quantitative information is attained from the extracted ion chromatogram/electropherogram ratios generated from the light and heavy isotope clusters.

  11. Quantitative analysis of some volatile components in Mimusops elengi L.

    Directory of Open Access Journals (Sweden)

    Chantana Aromdee

    2009-08-01

    Full Text Available Dried pikul flower (Mimusops elengi L., Sapotaceae) is used in many recipes of Thai traditional medicine, i.e. as a cardiotonic and stomachic. In this study, fresh and dried pikul flowers were investigated. The odour of the pikul flower, even when it is dried, is very strong and characteristic. The constituents of the volatile oils in fresh and dried pikul flowers extracted with ether were analysed by gas chromatography-mass spectrometry. 2-Phenylethanol, 4-hydroxybenzenemethanol and cinnamyl alcohol were mainly found in fresh flowers, at 10.49, 8.69 and 6.17%, respectively, whereas those mainly found in dried flowers were a long-chain carboxylic acid ester and (Z)-9-octadecenoic acid, at 5.37 and 4.71% of the ether extract, respectively. An analytical method for simultaneously determining benzyl alcohol, 2-phenylethanol and methyl paraben was developed using GC-FID. The percent recoveries were 91.66, 104.59 and 105.28%, respectively. The intraday variations (% RSD) were 7.22, 6.67 and 1.86%, and the interday variations were 3.12, 2.52 and 3.55%, respectively. Detection limits were 0.005, 0.014 and 0.001 ppm, and quantitation limits were 0.015, 0.048 and 0.003 ppm, respectively. The benzyl alcohol, 2-phenylethanol and methyl paraben contents of dried flowers (9 samples from various drug stores in Thailand and one sample from China) were 6.40-13.46, 17.57-196.57 and 27.35-355.53 ppm, respectively.

  12. ImatraNMR: Novel software for batch integration and analysis of quantitative NMR spectra

    Science.gov (United States)

    Mäkelä, A. V.; Heikkilä, O.; Kilpeläinen, I.; Heikkinen, S.

    2011-08-01

    Quantitative NMR spectroscopy is a useful and important tool for the analysis of various mixtures. Recently, in addition to traditional quantitative 1D 1H and 13C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain actual usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, there are many processing packages available from spectrometer manufacturers and third-party developers, and most of them are capable of analysis and integration of quantitative spectra. However, they are mainly aimed at processing a single spectrum or a few spectra, and are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present a novel software package, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to the capability of analyzing large numbers of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs, such as Matlab. The software is written in Java, and thus it should run on any platform capable of providing Java Runtime Environment version 1.6 or newer; however, currently it has only been tested with Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use, and is provided with source code upon request.
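
    The sketch below illustrates, in a few lines, the kind of batch integration such software performs: integrating fixed ppm regions of many 1-D spectra and writing the results to CSV. It is not the actual ImatraNMR code; the ppm axis, region names and spectra are simulated placeholders.

```python
import csv
import numpy as np

# Integration regions (ppm ranges) applied identically to every spectrum.
regions = {"signal_A": (1.10, 1.30), "signal_B": (3.55, 3.75)}

# Simulated batch: a shared ppm axis and a set of 1-D spectra.
ppm = np.linspace(0.0, 10.0, 8192)
rng = np.random.default_rng(4)
spectra = {f"sample_{i}": (1 + i) * np.exp(-((ppm - 1.2) ** 2) / 2e-3)
           + np.exp(-((ppm - 3.65) ** 2) / 2e-3) + 0.01 * rng.random(ppm.size)
           for i in range(5)}

with open("integrals.csv", "w", newline="") as fh:
    writer = csv.writer(fh)
    writer.writerow(["spectrum"] + list(regions))
    for name, intensity in spectra.items():
        row = [name]
        for low, high in regions.values():
            mask = (ppm >= low) & (ppm <= high)
            row.append(np.trapz(intensity[mask], ppm[mask]))
        writer.writerow(row)
```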

  13. Quantitative spectroscopy for the analysis of GOME data

    Science.gov (United States)

    Chance, K.

    1997-01-01

    Accurate analysis of the global ozone monitoring experiment (GOME) data to obtain atmospheric constituents requires reliable, traceable spectroscopic parameters for atmospheric absorption and scattering. Results are summarized for research that includes: the re-determination of Rayleigh scattering cross sections and phase functions for the 200 nm to 1000 nm range; the analysis of solar spectra to obtain a high-resolution reference spectrum with excellent absolute vacuum wavelength calibration; Ring effect cross sections and phase functions determined directly from accurate molecular parameters of N2 and O2; O2 A band line intensities and pressure broadening coefficients; and the analysis of absolute accuracies for ultraviolet and visible absorption cross sections of O3 and other trace species measurable by GOME.

  14. Quantitative risk assessment using the capacity-demand analysis

    International Nuclear Information System (INIS)

    Morgenroth, M.; Donnelly, C.R.; Westermann, G.D.; Huang, J.H.S.; Lam, T.M.

    1999-01-01

    The hydroelectric industry's recognition of the importance of avoiding unexpected failure, or forced outages, led to the development of probabilistic, or risk-based, methods in order to attempt to quantify exposures. Traditionally, such analysis has been carried out by qualitative assessments, relying on experience and sound engineering judgment to determine the optimum time to maintain, repair or replace a part or system. Depending on the nature of the problem, however, and the level of experience of those included in the decision-making process, it is difficult to find a balance between acting proactively and accepting some amount of risk. The development of a practical means for establishing the probability of failure of any part or system, based on the determination of the statistical distribution of engineering properties such as acting stresses, is discussed. The capacity-demand analysis methodology, coupled with probabilistic, risk-based analysis, permits all the factors associated with a decision to rehabilitate or replace a part, including the risks associated with the timing of the decision, to be assessed in a transparent and defendable manner. The methodology does not eliminate judgment altogether, but does move it from the level of estimating the risk of failure to the lower level of estimating variability in material properties, uncertainty in loading, and the uncertainties inherent in any engineering analysis. The method was successfully used in 1998 to carry out a comprehensive economic risk analysis for the entire water conveyance system of a 90-year-old hydropower station. The analysis included a number of diverse parts ranging from rock slopes to aging steel and concrete conduits, and the method allowed a rational assessment of the risks associated with each of these varied parts to be determined, permitting the essential remedial works to be prioritized. 14 refs., 4 figs
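
    As a rough sketch of the capacity-demand idea described above, the probability of failure can be estimated as the probability that demand (acting load or stress) exceeds capacity (resistance), given assumed statistical distributions for each; the lognormal parameters below are purely illustrative, not values from the hydropower study.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 1_000_000

# Illustrative distributions: capacity (resistance) and demand (acting stress),
# both in the same units (e.g. MPa). Parameters are placeholders.
capacity = rng.lognormal(mean=np.log(250.0), sigma=0.10, size=n)
demand   = rng.lognormal(mean=np.log(150.0), sigma=0.25, size=n)

# Monte Carlo estimate of P(demand > capacity).
prob_failure = np.mean(demand > capacity)
print(f"Probability of failure ≈ {prob_failure:.2e}")
```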

  15. Quantitative Safety and Security Analysis from a Communication Perspective

    DEFF Research Database (Denmark)

    Malinowsky, Boris; Schwefel, Hans-Peter; Jung, Oliver

    2014-01-01

    This paper introduces and exemplifies a trade-off analysis of safety and security properties in distributed systems. The aim is to support analysis for real-time communication and authentication building blocks in a wireless communication scenario. By embedding an authentication scheme into a real... at handover situations in an IEEE 802.11 wireless setup between mobile nodes and access points. The trade-offs involve application-layer data goodput, probability of completed handovers, and effect on usable protocol slots, to quantify the impact of security from a lower-layer communication perspective...

  16. Quantitative remote sensing for monitoring forest canopy structural variables in the Three Gorges region of China

    NARCIS (Netherlands)

    Zeng, Y.

    2008-01-01

    Bridging various scales ranging from local to regional and global, remote sensing has facilitated extraordinary advances in modeling and mapping ecosystems and their functioning. Since forests are one of the most important natural resources on the terrestrial Earth surface, accurate and up-to-date

  17. Use of ultrafast computed tomography to quantitate regional myocardial perfusion: a preliminary report

    International Nuclear Information System (INIS)

    Rumberger, J.A.; Feiring, A.J.; Lipton, M.J.; Higgins, C.B.; Ell, S.R.; Marcus, M.L.

    1987-01-01

    The purpose of this study was to assess the potential for rapid acquisition computed axial tomography (Imatron C-100) to quantify regional myocardial perfusion. Myocardial and left ventricular cavity contrast clearance curves were constructed after injecting nonionic contrast (1 ml/kg over 2 to 3 seconds) into the inferior vena cava of six anesthetized, closed chest dogs (n = 14). Independent myocardial perfusion measurements were obtained by coincident injection of radiolabeled microspheres into the left atrium during control, intermediate and maximal myocardial vasodilation with adenosine (0.5 to 1.0 mg/kg per min, intravenously, respectively). At each flow state, 40 serial short-axis scans of the left ventricle were taken near end-diastole at the midpapillary muscle level. Contrast clearance curves were generated and analyzed from the left ventricular cavity and posterior papillary muscle regions after excluding contrast recirculation and minimizing partial volume effects. The area under the curve (gamma variate function) was determined for a region of interest placed within the left ventricular cavity. Characteristics of contrast clearance data from the posterior papillary muscle region that were evaluated included the peak myocardial opacification, area under the contrast clearance curve and a contrast clearance time defined by the full width/half maximal extent of the clearance curve. Myocardial perfusion (microspheres) ranged from 35 to 450 ml/100 g per min (mean 167 +/- 125)
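
    A minimal sketch of fitting a gamma-variate function to a contrast clearance curve and recovering the area under the curve, as used above; the time-density samples are synthetic, and the parameterization shown is one common form of the gamma variate, not necessarily the exact one used in the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def gamma_variate(t, A, t0, alpha, beta):
    """C(t) = A * (t - t0)^alpha * exp(-(t - t0)/beta) for t > t0, else 0."""
    dt = np.clip(t - t0, 0.0, None)
    return A * dt ** alpha * np.exp(-dt / beta)

# Synthetic time-density curve (seconds, CT numbers above baseline).
t = np.arange(0.0, 40.0, 0.5)
true_curve = gamma_variate(t, 8.0, 4.0, 2.0, 3.0)
rng = np.random.default_rng(6)
measured = true_curve + rng.normal(0.0, 2.0, t.size)

popt, _ = curve_fit(gamma_variate, t, measured,
                    p0=[5.0, 3.0, 1.5, 2.5], bounds=(0, np.inf))
area = np.trapz(gamma_variate(t, *popt), t)
print(f"fitted parameters: {np.round(popt, 2)}, area under curve ≈ {area:.1f}")
```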

  18. MOVES2010a regional level sensitivity analysis

    Science.gov (United States)

    2012-12-10

    This document discusses the sensitivity of emission rates to various input parameters using the US Environmental Protection Agency's (EPA's) MOVES2010a model at the regional level. Pollutants included in the study are carbon monoxide (CO),...

  19. A methodology for the data energy regional consumption consistency analysis

    International Nuclear Information System (INIS)

    Canavarros, Otacilio Borges; Silva, Ennio Peres da

    1999-01-01

    The article introduces a methodology for the consistency analysis of regional energy consumption data. The work was based on recent studies by several cited authors and addressed Brazilian energy matrices and regional energy balances. The results are compared and analyzed.

  20. Quantitative analysis of natural resource management options at different scales

    NARCIS (Netherlands)

    Keulen, van H.

    2007-01-01

    Natural capital (land, water, air) consists of many resources, each with its own quality, dynamics and renewability, but with strong interactions. The increasing competition for the natural resources, especially land and water, calls for a basic redirection in the analysis of land use. In this

  1. A quantitative method for Failure Mode and Effects Analysis

    NARCIS (Netherlands)

    Braaksma, Anne Johannes Jan; Meesters, A.J.; Klingenberg, W.; Hicks, C.

    2012-01-01

    Failure Mode and Effects Analysis (FMEA) is commonly used for designing maintenance routines by analysing potential failures, predicting their effect and facilitating preventive action. It is used to make decisions on operational and capital expenditure. The literature has reported that despite its

  2. [Evaluation of dental plaque by quantitative digital image analysis system].

    Science.gov (United States)

    Huang, Z; Luan, Q X

    2016-04-18

    To analyze plaque staining images using image analysis software, to verify the maneuverability, practicability and repeatability of this technique, and to evaluate the influence of different plaque stains. In the study, 30 volunteers were enrolled from the new dental students of Peking University Health Science Center in accordance with the inclusion criteria. The digital images of the anterior teeth were acquired after plaque staining according to a standardized imaging protocol. The image analysis was performed using Image Pro Plus 7.0, and the Quigley-Hein plaque indexes of the anterior teeth were evaluated. The plaque stain area percentage and the corresponding dental plaque index were highly correlated, with a Spearman correlation coefficient of 0.776; the chart showed only a few spots outside the 95% consistency boundaries. The image analysis results for the different plaque stains showed that the difference in the tooth area measurements was not significant, while the difference in the plaque area measurements was significant (P<0.01). This method is easy to operate and control, highly correlated with the calculated percentage of plaque area and the traditional plaque index, and has good reproducibility. The different plaque staining methods had little effect on image segmentation results. A plaque stain that is sensitive for image analysis is suggested.

  3. Quantitative analysis of prediction models for hot cracking in ...

    Indian Academy of Sciences (India)

    A Rodríguez-Prieto

    2017-11-16

    In this paper, a systematic analysis of different methods of δ-ferrite estimation is carried out. Keywords: δ-ferrite; hot cracking; prediction methods; stringency levels; decision tool.

  4. Quantitative electron microscope autoradiography: application of multiple linear regression analysis

    International Nuclear Information System (INIS)

    Markov, D.V.

    1986-01-01

    A new method for the analysis of high resolution EM autoradiographs is described. It identifies labelled cell organelle profiles in sections on a strictly statistical basis and provides accurate estimates for their radioactivity without the need to make any assumptions about their size, shape and spatial arrangement. (author)

  5. Towards automatic quantitative analysis of cardiac MR perfusion images

    NARCIS (Netherlands)

    Breeuwer, M.; Quist, M.; Spreeuwers, Lieuwe Jan; Paetsch, I.; Al-Saadi, N.; Nagel, E.

    2001-01-01

    Magnetic Resonance Imaging (MRI) is a powerful technique for imaging cardiovascular diseases. The introduction of cardiovascular MRI into clinical practice is however hampered by the lack of efficient and reliable automatic image analysis methods. This paper focuses on the automatic evaluation of

  6. From POOSL to UPPAAL : transformation and quantitative analysis

    NARCIS (Netherlands)

    Xing, J.; Theelen, B.D.; Langerak, R.; Pol, van de J.C.; Tretmans, J.; Voeten, J.P.M.

    2010-01-01

    POOSL (Parallel Object-Oriented Specification Language) is a powerful general purpose system-level modeling language. In research on design space exploration of motion control systems, POOSL has been used to construct models for performance analysis. The considered motion control algorithms are

  7. Mass spectrometry for real-time quantitative breath analysis

    Czech Academy of Sciences Publication Activity Database

    Smith, D.; Španěl, Patrik; Herbig, J.; Beauchamp, J.

    2014-01-01

    Vol. 8, No. 2 (2014), 027101. ISSN 1752-7155. Institutional support: RVO:61388955. Keywords: breath analysis; proton transfer reaction mass spectrometry; selected ion flow tube mass spectrometry. Subject RIV: CF - Physical; Theoretical Chemistry. Impact factor: 4.631, year: 2014

  8. Resilience to climate change in a cross-scale tourism governance context: a combined quantitative-qualitative network analysis

    Directory of Open Access Journals (Sweden)

    Tobias Luthe

    2016-03-01

    Social systems in mountain regions are exposed to a number of disturbances, such as climate change. Calls for conceptual and practical approaches on how to address climate change have been taken up in the literature. The resilience concept as a comprehensive theory-driven approach to address climate change has only recently increased in importance. Limited research has been undertaken concerning tourism and resilience from a network governance point of view. We analyze tourism supply chain networks with regard to resilience to climate change at the municipal governance scale of three Alpine villages. We compare these with a planned destination management organization (DMO) as a governance entity of the same three municipalities on the regional scale. Network measures are analyzed via a quantitative social network analysis (SNA) focusing on resilience from a tourism governance point of view. Results indicate higher resilience of the regional DMO because of a more flexible and diverse governance structure, more centralized steering of fast collective action, and improved innovative capacity owing to higher modularity and better core-periphery integration. Interpretations of the quantitative results have been qualitatively validated by interviews and a workshop. We conclude that adaptation of tourism-dependent municipalities to gradual climate change should be dealt with at a regional governance scale and adaptation to sudden changes at a municipal scale. Overall, DMO building at a regional scale may enhance the resilience of tourism destinations, if the municipalities are well integrated.
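
    The record relies on quantitative network measures such as modularity and centralized steering. The sketch below shows, on a stand-in graph and under stated assumptions, how two such measures could be computed with the networkx library; the study's destination networks and exact measure definitions are not reproduced here.

    ```python
    # Hedged sketch of two SNA resilience-related measures (modularity, degree
    # centralization) on a placeholder network; illustrative only.
    import networkx as nx
    from networkx.algorithms.community import greedy_modularity_communities, modularity

    # Toy tourism supply-chain network: nodes are businesses, edges are cooperation ties.
    G = nx.karate_club_graph()  # stand-in network for illustration only

    communities = greedy_modularity_communities(G)
    Q = modularity(G, communities)

    # Freeman degree centralization: how strongly ties concentrate on a single actor.
    deg = dict(G.degree())
    n = G.number_of_nodes()
    max_deg = max(deg.values())
    centralization = sum(max_deg - d for d in deg.values()) / ((n - 1) * (n - 2))

    print(f"modularity Q = {Q:.2f}, degree centralization = {centralization:.2f}")
    ```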

  9. Quantitative analysis of intraoperative communication in open and laparoscopic surgery.

    Science.gov (United States)

    Sevdalis, Nick; Wong, Helen W L; Arora, Sonal; Nagpal, Kamal; Healey, Andrew; Hanna, George B; Vincent, Charles A

    2012-10-01

    Communication is important for patient safety in the operating room (OR). Several studies have assessed OR communications qualitatively or have focused on communication in crisis situations. This study used prospective, quantitative observation based on well-established communication theory to assess similarities and differences in communication patterns between open and laparoscopic surgery. Based on communication theory, a standardized proforma was developed for assessment in the OR via real-time observation of communication types, their purpose, their content, and their initiators/recipients. Data were collected prospectively in real time in the OR for 20 open and 20 laparoscopic inguinal hernia repairs. Assessors were trained and calibrated, and their reliability was established statistically. During 1,884 min of operative time, 4,227 communications were observed and analyzed (2,043 laparoscopic vs 2,184 open communications). The mean operative duration (laparoscopic, 48 min vs open, 47 min), mean communication frequency (laparoscopic, 102 communications/procedure vs open, 109 communications/procedure), and mean communication rate (laparoscopic, 2.13 communications/min vs open, 2.23 communications/min) did not differ significantly across laparoscopic and open procedures. Communications were most likely to be initiated by surgeons (80-81%), to be received by either other surgeons (46-50%) or OR nurses (38-40%), to be associated with equipment/procedural issues (39-47%), and to provide direction for the OR team (38-46%) in open and laparoscopic cases. Moreover, communications in laparoscopic cases were significantly more equipment-related (laparoscopic, 47% vs open, 39%) and aimed significantly more at providing direction (laparoscopic, 46% vs open, 38%) and at consulting (laparoscopic, 17% vs open, 12%) than at sharing information (laparoscopic, 17% vs open, 31%) (P communications were found in both laparoscopic and open cases during a relatively low

  10. Comprehensive comparison of preselected regions for a high level radioactive waste repository: a subjective quantitative evaluation method

    International Nuclear Information System (INIS)

    Wang Ju; Zong Zihua; Jin Yuanxin; Zhu Pengfei; Su Rui; Chen Weiming

    2012-01-01

    Based on the comprehensive features of the 6 preselected regions (Northwest China, Southwest China, East China, South China, Inner Mongolia, and Xinjiang) for China's high level radioactive waste repository, this paper uses a subjective quantitative method to evaluate the weight of each site selection criterion and provides the scores of each region. The results show that future natural changes and the hydrogeological conditions are considered the most important natural siting criteria, while social impact and human activities are the most important social siting criteria. According to the scores, the priority order of the regions is Northwest China, Xinjiang, Inner Mongolia, South China, East China, Southwest China. On the whole, the scores of the regions in western China (Northwest China, Xinjiang and Inner Mongolia) are higher than those in eastern China (South China, East China, Southwest China), which shows that the participating experts consider the disposal of high level waste in western China to be more favorable than in eastern China. (authors)

  11. Quantitative data analysis with SPSS release 8 for Windows a guide for social scientists

    CERN Document Server

    Bryman, Alan

    2002-01-01

    The latest edition of this best-selling introduction to Quantitative Data Analysis through the use of a computer package has been completely updated to accommodate the needs of users of SPSS Release 8 for Windows. Like its predecessor, it provides a non-technical approach to quantitative data analysis and a user-friendly introduction to the widely used SPSS for Windows. It assumes no previous familiarity with either statistics or computing but takes the reader step-by-step through the techniques, reinforced by exercises for further practice. Techniques explained in Quantitative Data Analysis with SPSS Release 8 for Windows include: * correlation * simple and multiple regression * multivariate analysis of variance and covariance * factor analysis. The book also covers issues such as sampling, statistical significance, conceptualization and measurement, and the selection of appropriate tests. For further information or to download the book's datasets, please visit the website: http://www.routledge.com/textbooks/...

  12. Integrated quantitative fractal polarimetric analysis of monolayer lung cancer cells

    Science.gov (United States)

    Shrestha, Suman; Zhang, Lin; Quang, Tri; Farrahi, Tannaz; Narayan, Chaya; Deshpande, Aditi; Na, Ying; Blinzler, Adam; Ma, Junyu; Liu, Bo; Giakos, George C.

    2014-05-01

    Digital diagnostic pathology has become one of the most valuable and convenient technological advancements in recent years. It allows us to acquire, store and analyze pathological information from the images of histological and immunohistochemical glass slides, which are scanned to create digital slides. In this study, efficient fractal, wavelet-based polarimetric techniques for histological analysis of monolayer lung cancer cells are introduced and different monolayer cancer lines are studied. The outcome of this study indicates that applying fractal, wavelet polarimetric principles to the analysis of squamous carcinoma and adenocarcinoma cell lines may prove extremely useful in discriminating between healthy and lung cancer cells as well as differentiating among different lung cancer cells.

  13. Quantitative Safety and Security Analysis from a Communication Perspective

    Directory of Open Access Journals (Sweden)

    Boris Malinowsky

    2015-12-01

    This paper introduces and exemplifies a trade-off analysis of safety and security properties in distributed systems. The aim is to support analysis for real-time communication and authentication building blocks in a wireless communication scenario. By embedding an authentication scheme into a real-time communication protocol for safety-critical scenarios, we can rely on the protocol's individual safety and security properties. The resulting communication protocol satisfies selected safety and security properties for deployment in safety-critical use-case scenarios with security requirements. We look at handover situations in an IEEE 802.11 wireless setup between mobile nodes and access points. The trade-offs involve application-layer data goodput, probability of completed handovers, and effect on usable protocol slots, to quantify the impact of security from a lower-layer communication perspective on the communication protocols. The results are obtained using the network simulator ns-3.

  14. African Primary Care Research: Quantitative analysis and presentation of results

    Science.gov (United States)

    Ogunbanjo, Gboyega A.

    2014-01-01

    Abstract This article is part of a series on Primary Care Research Methods. The article describes types of continuous and categorical data, how to capture data in a spreadsheet, how to use descriptive and inferential statistics and, finally, gives advice on how to present the results in text, figures and tables. The article intends to help Master's level students with writing the data analysis section of their research proposal and presenting their results in their final research report. PMID:26245435
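
    To make the workflow described here concrete, the following hedged sketch mirrors the sequence the article teaches: summarize a continuous variable by a categorical group, then run an inferential test. The data frame, variable names and values are invented for illustration and are not from the article.

    ```python
    # Illustrative descriptive-plus-inferential sketch with invented data.
    import pandas as pd
    from scipy import stats

    df = pd.DataFrame({
        "group": ["control"] * 5 + ["intervention"] * 5,                 # categorical variable
        "sbp":   [138, 142, 135, 150, 146, 128, 131, 125, 134, 129],     # continuous variable
    })

    # Descriptive statistics per group (count, mean, SD, quartiles)
    print(df.groupby("group")["sbp"].describe())

    # Inferential statistics: independent-samples t-test between the two groups
    control = df.loc[df.group == "control", "sbp"]
    intervention = df.loc[df.group == "intervention", "sbp"]
    t, p = stats.ttest_ind(control, intervention)
    print(f"t = {t:.2f}, p = {p:.3f}")
    ```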

  15. Quantitative analysis of the renal aging in rats. Stereological study

    OpenAIRE

    Melchioretto, Eduardo Felippe; Zeni, Marcelo; Veronez, Djanira Aparecida da Luz; Martins Filho, Eduardo Lopes; Fraga, Rogério de

    2016-01-01

    ABSTRACT PURPOSE: To evaluate the renal function and the renal histological alterations through stereology and morphometrics in rats submitted to the natural process of aging. METHODS: Seventy-two Wistar rats, divided into six groups. Each group was sacrificed at a different age: 3, 6, 9, 12, 18 and 24 months. Right nephrectomy was performed, followed by stereological and morphometric analysis of the renal tissue (renal volume and weight, volume density (Vv[glom]) and numerical density (Nv[glo...

  16. A Quantitative Accident Sequence Analysis for a VHTR

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jintae; Lee, Joeun; Jae, Moosung [Hanyang University, Seoul (Korea, Republic of)

    2016-05-15

    In Korea, the basic design features of the VHTR are currently discussed in various design concepts. Probabilistic risk assessment (PRA) offers a logical and structured method to assess the risks of a large and complex engineered system, such as a nuclear power plant. It will be introduced at an early stage in the design, and will be upgraded at various design and licensing stages as the design matures and the design details are defined. Risk insights developed from the PRA are viewed as essential to developing a design that is optimized in meeting safety objectives and to interpreting the applicability of the existing demands to the safety design approach of the VHTR. In this study, initiating events which may occur in VHTRs were selected through the MLD method. The initiating events were then grouped into four categories for the accident sequence analysis. Initiating event frequencies and safety system failure rates were calculated using reliability data obtained from the available sources and fault tree analysis. After quantification, an uncertainty analysis was conducted. The SR and LR frequencies are calculated to be 7.52E-10/RY and 7.91E-16/RY, respectively, which are considerably lower than the core damage frequencies of LWRs.
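
    As a minimal illustration of how one event-tree sequence is quantified from an initiating-event frequency and branch failure probabilities, the sketch below multiplies placeholder values; it is not the study's VHTR model or its reliability data.

    ```python
    # Minimal accident-sequence quantification sketch (placeholder numbers only).
    from math import prod

    def sequence_frequency(ie_freq_per_ry: float, branch_failure_probs: list) -> float:
        """Frequency (per reactor-year) of one event-tree sequence:
        initiating-event frequency times the failure probabilities along the sequence."""
        return ie_freq_per_ry * prod(branch_failure_probs)

    # Hypothetical sequence: one initiator with two failed mitigating systems.
    f = sequence_frequency(1.0e-3, [1.0e-4, 5.0e-3])
    print(f"sequence frequency ~ {f:.2e} /RY")
    ```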

  17. Use of optimized 1D TOCSY NMR for improved quantitation and metabolomic analysis of biofluids

    International Nuclear Information System (INIS)

    Sandusky, Peter; Appiah-Amponsah, Emmanuel; Raftery, Daniel

    2011-01-01

    One dimensional selective TOCSY experiments have been shown to be advantageous in providing improved data inputs for principal component analysis (PCA) (Sandusky and Raftery 2005a, b). Better subpopulation cluster resolution in the observed scores plots results from the ability to isolate metabolite signals of interest via the TOCSY based filtering approach. This report reexamines the quantitative aspects of this approach, first by optimizing the 1D TOCSY experiment as it relates to the measurement of biofluid constituent concentrations, and second by comparing the integration of 1D TOCSY read peaks to the bucket integration of 1D proton NMR spectra in terms of precision and accuracy. This comparison indicates that, because of the extensive peak overlap that occurs in the 1D proton NMR spectra of biofluid samples, bucket integrals are often far less accurate as measures of individual constituent concentrations than 1D TOCSY read peaks. Even spectral fitting approaches have proven difficult in the analysis of significantly overlapped spectral regions. Measurements of endogenous taurine made over a sample population of human urine demonstrate that, due to background signals from other constituents, bucket integrals of 1D proton spectra routinely overestimate the taurine concentrations and distort its variation over the sample population. As a result, PCA calculations performed using data matrices incorporating 1D TOCSY determined taurine concentrations produce better scores plot subpopulation cluster resolution.
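
    The PCA step referred to above can be illustrated with a small, invented concentration matrix; the sketch below autoscales the matrix and extracts two principal-component scores. It is a generic stand-in for the authors' scores-plot calculation, not their actual data or processing.

    ```python
    # Hedged PCA sketch: rows = samples, columns = metabolite concentrations
    # (e.g., 1D TOCSY read-peak integrals). All values are invented.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    X = np.array([
        [1.2, 3.4, 5.1],
        [1.1, 3.6, 5.0],
        [2.5, 2.1, 4.2],
        [2.7, 2.0, 4.4],
        [1.3, 3.5, 5.2],
    ])

    scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
    print(scores)  # scores-plot coordinates used to look for subpopulation clusters
    ```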

  18. Quantitative Trait Locus Analysis of Mating Behavior and Male Sex Pheromones in Nasonia Wasps

    Directory of Open Access Journals (Sweden)

    Wenwen Diao

    2016-06-01

    A major focus in speciation genetics is to identify the chromosomal regions and genes that reduce hybridization and gene flow. We investigated the genetic architecture of mating behavior in the parasitoid wasp species pair Nasonia giraulti and Nasonia oneida, which exhibit strong prezygotic isolation. Behavioral analysis showed that N. oneida females had consistently higher latency times and broke off the mating sequence more often in the mounting stage when confronted with N. giraulti males compared with males of their own species. N. oneida males produce a lower quantity of the long-range male sex pheromone (4R,5S)-5-hydroxy-4-decanolide (RS-HDL). Crosses between the two species yielded hybrid males with various pheromone quantities, and these males were used in mating trials with females of either species to measure female mate discrimination rates. A quantitative trait locus (QTL) analysis involving 475 recombinant hybrid males (F2), 2148 reciprocally backcrossed females (F3), and a linkage map of 52 equally spaced neutral single nucleotide polymorphism (SNP) markers plus SNPs in 40 candidate mating behavior genes revealed four QTL for male pheromone amount, depending on the partner species. Our results demonstrate that the RS-HDL pheromone plays a role in the mating system of N. giraulti and N. oneida, but also that additional communication cues are involved in mate choice. No QTL were found for female mate discrimination, which points at a polygenic architecture of female choice with strong environmental influences.

  19. Use of optimized 1D TOCSY NMR for improved quantitation and metabolomic analysis of biofluids

    Energy Technology Data Exchange (ETDEWEB)

    Sandusky, Peter [Eckerd College, Department of Chemistry (United States); Appiah-Amponsah, Emmanuel; Raftery, Daniel, E-mail: raftery@purdue.edu [Purdue University, Department of Chemistry (United States)

    2011-04-15

    One dimensional selective TOCSY experiments have been shown to be advantageous in providing improved data inputs for principal component analysis (PCA) (Sandusky and Raftery 2005a, b). Better subpopulation cluster resolution in the observed scores plots results from the ability to isolate metabolite signals of interest via the TOCSY based filtering approach. This report reexamines the quantitative aspects of this approach, first by optimizing the 1D TOCSY experiment as it relates to the measurement of biofluid constituent concentrations, and second by comparing the integration of 1D TOCSY read peaks to the bucket integration of 1D proton NMR spectra in terms of precision and accuracy. This comparison indicates that, because of the extensive peak overlap that occurs in the 1D proton NMR spectra of biofluid samples, bucket integrals are often far less accurate as measures of individual constituent concentrations than 1D TOCSY read peaks. Even spectral fitting approaches have proven difficult in the analysis of significantly overlapped spectral regions. Measurements of endogenous taurine made over a sample population of human urine demonstrate that, due to background signals from other constituents, bucket integrals of 1D proton spectra routinely overestimate the taurine concentrations and distort its variation over the sample population. As a result, PCA calculations performed using data matrices incorporating 1D TOCSY determined taurine concentrations produce better scores plot subpopulation cluster resolution.

  20. Regional heterogeneity and gene flow maintain variance in a quantitative trait within populations of lodgepole pine

    Science.gov (United States)

    Yeaman, Sam; Jarvis, Andy

    2006-01-01

    Genetic variation is of fundamental importance to biological evolution, yet we still know very little about how it is maintained in nature. Because many species inhabit heterogeneous environments and have pronounced local adaptations, gene flow between differently adapted populations may be a persistent source of genetic variation within populations. If this migration–selection balance is biologically important then there should be strong correlations between genetic variance within populations and the amount of heterogeneity in the environment surrounding them. Here, we use data from a long-term study of 142 populations of lodgepole pine (Pinus contorta) to compare levels of genetic variation in growth response with measures of climatic heterogeneity in the surrounding region. We find that regional heterogeneity explains at least 20% of the variation in genetic variance, suggesting that gene flow and heterogeneous selection may play an important role in maintaining the high levels of genetic variation found within natural populations. PMID:16769628

  1. Quantitative status of resources for radiation therapy in Asia and Pacific region

    International Nuclear Information System (INIS)

    Tatsuzaki, Hideo; Levin, Cecil Victor

    2001-01-01

    Purpose: Resources for radiation therapy in Asian and Pacific countries were analyzed to obtain a better understanding of the status of radiation oncological practice in the region. Methods and Materials: The data were obtained mainly through surveys on the availability of major equipment and personnel which were conducted through an International Atomic Energy Agency regional project. The study included 17 countries in South Asia, South East Asia, East Asia and Australasia. Data were related to national populations and to economic and general health care indices. Results: Large differences in equipment and personnel among countries were demonstrated. The availability of both teletherapy and brachytherapy was related to the economic status of the countries. The shortage of teletherapy machines was evident in more countries than that of brachytherapy. Many departments were found to treat patients without simulators or treatment planning systems. The number of radiation oncologists standardized by the cancer incidence of a country did not correlate well with economic status. Conclusions: There were significant deficiencies in the availability of all components of radiation therapy in the analyzed countries. The deficiencies were linked predominantly to the economic status of the country. Cognisance should be taken of the specific shortfalls in each country to ensure that expansion or any assistance offered appropriately matches its needs and can be fully utilized. The information presented in this paper on the resources currently available for radiation oncological practice in the region provides a valuable basis for planning of development aid programs on radiation therapy.

  2. Quantitative terahertz time-domain spectroscopy and analysis in chemistry and biology

    DEFF Research Database (Denmark)

    Jepsen, Peter Uhd

    2005-01-01

    I will describe how Terahertz Time-Domain Spectroscopy (THz-TDS) can be used for quantitative, broadband spectroscopy in the far-infrared spectral region. Thz-TDS is sensitive to long-range, non-covalent interactions in the condensed phase, for instance intermolecular hydrogen bonding in molecula...

  3. Quantitative flow analysis of swimming dynamics with coherent Lagrangian vortices.

    Science.gov (United States)

    Huhn, F; van Rees, W M; Gazzola, M; Rossinelli, D; Haller, G; Koumoutsakos, P

    2015-08-01

    Undulatory swimmers flex their bodies to displace water, and in turn, the flow feeds back into the dynamics of the swimmer. At moderate Reynolds number, the resulting flow structures are characterized by unsteady separation and alternating vortices in the wake. We use the flow field from simulations of a two-dimensional, incompressible viscous flow of an undulatory, self-propelled swimmer and detect the coherent Lagrangian vortices in the wake to dissect the driving momentum transfer mechanisms. The detected material vortex boundary encloses a Lagrangian control volume that serves to track back the vortex fluid and record its circulation and momentum history. We consider two swimming modes: the C-start escape and steady anguilliform swimming. The backward advection of the coherent Lagrangian vortices elucidates the geometry of the vorticity field and allows for monitoring the gain and decay of circulation and momentum transfer in the flow field. For steady swimming, momentum oscillations of the fish can largely be attributed to the momentum exchange with the vortex fluid. For the C-start, an additionally defined jet fluid region turns out to balance the high momentum change of the fish during the rapid start.

  4. Quantitative analysis of light elements in thick samples by PIGE

    International Nuclear Information System (INIS)

    Mateus, R.; Jesus, A.P.; Ribeiro, J.P.

    2004-01-01

    PIGE analysis of thick and intermediate samples is usually performed with the help of standards, but this method only gives good results when the standard is very similar to the sample to be analysed. In this work, we present an alternative method for PIGE analysis of light elements in thick samples. This method is based on a code that integrates the nuclear reaction excitation function along the depth of the sample. For the integration procedure the sample is divided into sublayers, defined by the energy steps that were used to measure the excitation function accurately; this function is used as input. Within each sublayer the stopping power cross-sections may be assumed constant. With these two conditions, calculating the contribution of each sublayer to the total yield becomes an easy task. This work presents results for the analysis of lithium, boron, fluorine and sodium in thick samples. For this purpose, excitation functions of the reactions 7Li(p,p'γ)7Li, 19F(p,p'γ)19F, 10B(p,αγ)7Be and 23Na(p,p'γ)23Na were employed. Calculated γ-ray yields were compared, at several proton energy values, with experimental yields for thick samples made of inorganic compounds of the referred elements. The agreement is better than 7.5%. Taking into consideration the experimental uncertainty of the measured yields and the errors related to the stopping power values used, this agreement shows that effects such as beam energy straggling, ignored in the calculation, seem to play a minor role.
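
    A minimal numerical sketch of the sublayer integration described above follows; the excitation function and stopping power are placeholder functions in arbitrary units, not the measured data used in the work.

    ```python
    # Illustrative thick-target yield integration: the gamma yield is accumulated
    # sublayer by sublayer, with sigma(E) and the stopping power S(E) taken as
    # constant within each energy step. Placeholder physics, not measured data.
    import numpy as np

    def thick_target_yield(e_beam, sigma, stopping_power, n_steps=200):
        """Relative yield ~ integral of sigma(E)/S(E) dE from 0 to the beam energy."""
        edges = np.linspace(0.0, e_beam, n_steps + 1)
        mids = 0.5 * (edges[:-1] + edges[1:])   # mean proton energy in each sublayer
        de = np.diff(edges)                      # energy lost in each sublayer
        return float(np.sum(sigma(mids) / stopping_power(mids) * de))

    # Placeholder excitation function and stopping power (arbitrary units)
    sigma = lambda e: np.where(e > 0.4, (e - 0.4) ** 2, 0.0)   # threshold-like cross section
    s_power = lambda e: 1.0 + 0.5 / np.maximum(e, 0.05)        # decreases with energy

    print(f"relative yield at 3 MeV protons: {thick_target_yield(3.0, sigma, s_power):.3f}")
    ```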

  5. The quantitative analysis of 163Ho source by PIXE

    International Nuclear Information System (INIS)

    Sera, K.; Ishii, K.; Fujioka, M.; Izawa, G.; Omori, T.

    1984-01-01

    We have been studying the electron capture in 163Ho as a method for determining the mass of the electron neutrino. The 163Ho sources were produced with the 164Dy(p,2n) reaction by means of internal irradiation. We applied the PIXE method to determine the total number of 163Ho atoms in the source. Proton beams of 3 MeV and an "external standard" method were employed for nondestructive analysis of the 163Ho source, together with an additional "internal standard" method. (author)

  6. Interleukin-2 signaling pathway analysis by quantitative phosphoproteomics

    DEFF Research Database (Denmark)

    Osinalde, Nerea; Moss, Helle; Arrizabalaga, Onetsine

    2011-01-01

    among which 79 were found with increased abundance in the tyrosine-phosphorylated complexes, including several previously unreported IL-2 downstream effectors. Combinatorial site-specific phosphoproteomic analysis resulted in the identification of 99 phosphorylated sites mapping to the identified proteins... with increased abundance in the tyrosine-phosphorylated complexes, of which 34 were not previously described. In addition, chemical inhibition of the identified IL-2-mediated JAK, PI3K and MAPK signaling pathways resulted in distinct alterations in IL-2-dependent proliferation...

  7. Quantitation of Surface Coating on Nanoparticles Using Thermogravimetric Analysis.

    Science.gov (United States)

    Dongargaonkar, Alpana A; Clogston, Jeffrey D

    2018-01-01

    Nanoparticles are critical components in nanomedicine and nanotherapeutic applications. Some nanoparticles, such as metallic nanoparticles, have a surface coating or surface modification to aid their dispersion and stability. This surface coating may affect the behavior of nanoparticles in a biological environment, so it is important to measure it. Thermogravimetric analysis (TGA) can be used to determine the amount of coating on the surface of the nanoparticle. TGA experiments run under an inert atmosphere can also be used to determine the residual metal content present in the sample. In this chapter, the TGA technique and experimental method are described.
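
    As a simple illustration of how a TGA curve is turned into a coating weight fraction, the sketch below attributes the mass loss between the initial mass and the residual plateau to the surface coating; the temperatures and masses are invented, not from the chapter.

    ```python
    # Minimal TGA coating quantitation sketch with invented data.
    import numpy as np

    temperature_c = np.array([25, 150, 300, 450, 600, 800])       # °C
    mass_mg       = np.array([10.0, 9.9, 9.2, 8.6, 8.5, 8.5])     # sample mass during the ramp

    initial_mass  = mass_mg[0]
    residual_mass = mass_mg[-1]        # e.g. metallic core left under inert atmosphere
    coating_wt_pct = 100.0 * (initial_mass - residual_mass) / initial_mass
    print(f"surface coating ~ {coating_wt_pct:.1f} wt%")
    ```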

  8. Quantitative analysis of overlapping XPS peaks by spectrum reconstruction

    DEFF Research Database (Denmark)

    Graat, Peter C.J.; Somers, Marcel A. J.

    1998-01-01

    parameters. The values obtained for the oxide film thickness were compared with thickness values determined from the intensity of the corresponding O 1s spectra and with thickness values resulting from ellipsometric analysis. The sensitivity of the reconstruction procedure with regard to film thickness... contributions in the spectra owing to inelastic scattering of signal electrons were calculated from the depth distributions of these constituents and their reference spectra. In the reconstruction procedure the film thickness and the concentrations of Fe2+ and Fe3+ in the oxide film were used as fit...

  9. Experimental design and quantitative analysis of microbial community multiomics.

    Science.gov (United States)

    Mallick, Himel; Ma, Siyuan; Franzosa, Eric A; Vatanen, Tommi; Morgan, Xochitl C; Huttenhower, Curtis

    2017-11-30

    Studies of the microbiome have become increasingly sophisticated, and multiple sequence-based, molecular methods as well as culture-based methods exist for population-scale microbiome profiles. To link the resulting host and microbial data types to human health, several experimental design considerations, data analysis challenges, and statistical epidemiological approaches must be addressed. Here, we survey current best practices for experimental design in microbiome molecular epidemiology, including technologies for generating, analyzing, and integrating microbiome multiomics data. We highlight studies that have identified molecular bioactives that influence human health, and we suggest steps for scaling translational microbiome research to high-throughput target discovery across large populations.

  10. Quantitative risk analysis in two pipelines operated by TRANSPETRO

    Energy Technology Data Exchange (ETDEWEB)

    Garcia, Claudio B. [PETROBRAS Transporte S/A (TRANSPETRO), Rio de Janeiro, RJ (Brazil); Pinho, Edson [Universidade Federal Rural do Rio de Janeiro (UFRRJ), Seropedica, RJ (Brazil); Bittencourt, Euclides [Centro Universitario FIB, Salvador , BA (Brazil)

    2009-07-01

    Transportation risk analysis techniques were used to study two pipelines operated by TRANSPETRO. Pipeline A, used for the simultaneous transportation of diesel, gasoline and LPG, comprises three sections, all of them crossing rural areas. Pipeline B is used for oil transportation, and one of its ends is located in a densely populated area. Both pipelines had their risk studied using the PHAST RISK® software, and the individual risk measures, the only measures considered for licensing purposes in this type of study, presented levels far below the maximum tolerable levels considered. (author)

  11. Quantitative analysis of the renal aging in rats. Stereological study.

    Science.gov (United States)

    Melchioretto, Eduardo Felippe; Zeni, Marcelo; Veronez, Djanira Aparecida da Luz; Martins, Eduardo Lopes; Fraga, Rogério de

    2016-05-01

    To evaluate the renal function and the renal histological alterations through stereology and morphometrics in rats submitted to the natural process of aging. Seventy-two Wistar rats were divided into six groups. Each group was sacrificed at a different age: 3, 6, 9, 12, 18 and 24 months. Right nephrectomy was performed, followed by stereological and morphometric analysis of the renal tissue (renal volume and weight, volume density (Vv[glom]) and numerical density (Nv[glom]) of the renal glomeruli, and mean glomerular volume (Vol[glom])); renal function was also evaluated by measuring serum creatinine and urea. There was a significant decrease of renal function in the oldest rats. The renal volume presented a gradual increase during the development of the rats, with the largest values registered in the group of animals at 12 months of age and a significant progressive decrease in older animals. Vv[glom] presented a statistically significant gradual reduction between the groups, and Nv[glom] also decreased significantly. Renal function proved to be inferior in senile rats when compared to young rats. The morphometric and stereological analysis evidenced renal atrophy and a gradual reduction of the volume density and numerical density of the renal glomeruli associated with the aging process.
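
    The stereological quantities named in this record (Vv[glom], Nv[glom], Vol[glom]) follow from standard point-counting and disector estimators; the hedged sketch below shows those generic estimators with invented counting data, not the study's actual sampling design.

    ```python
    # Hedged sketch of standard stereological estimators; all counts are invented.
    def volume_density(points_on_glomeruli: int, points_on_kidney: int) -> float:
        """Vv[glom]: fraction of renal tissue volume occupied by glomeruli (point counting)."""
        return points_on_glomeruli / points_on_kidney

    def numerical_density(q_minus: int, frame_area_mm2: float, disector_height_mm: float) -> float:
        """Nv[glom]: glomeruli per mm^3, physical-disector estimate (sum Q- / (A * h))."""
        return q_minus / (frame_area_mm2 * disector_height_mm)

    vv = volume_density(points_on_glomeruli=62, points_on_kidney=900)
    nv = numerical_density(q_minus=18, frame_area_mm2=2.5, disector_height_mm=0.01)
    mean_glom_volume_mm3 = vv / nv     # mean glomerular volume follows as Vv / Nv
    print(f"Vv = {vv:.3f}, Nv = {nv:.0f} /mm^3, mean volume = {mean_glom_volume_mm3*1e6:.0f} x10^-6 mm^3")
    ```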

  12. Balancing the Quantitative and Qualitative Aspects of Social Network Analysis to Study Complex Social Systems

    OpenAIRE

    Schipper, Danny; Spekkink, Wouter

    2015-01-01

    Social Network Analysis (SNA) can be used to investigate complex social systems. SNA is typically applied as a quantitative method, which has important limitations. First, quantitative methods are capable of capturing the form of relationships (e.g. strength and frequency), but they are less suitable for capturing the content of relationships (e.g. interests and motivations). Second, while complex social systems are highly dynamic, the representations that SNA creates of such systems are ofte...

  13. Quantitative measurement of regional cerebral blood flow using {sup 99m}Tc-HM-PAO SPECT

    Energy Technology Data Exchange (ETDEWEB)

    Tanaka, Hisashi; Nakamura, Yusaku; Yagi, Yuji; Miura, Kosuke; Takahashi, Mitsuo [Kinki Univ., Osaka-Sayama, Osaka (Japan)

    1994-10-01

    This study examined a simple method for measuring the regional cerebral blood flow (rCBF) using 99mTc-HM-PAO SPECT. The mean CBF (mCBF) was determined by the Patlak plot method and rCBF was calculated with Lassen's correction algorithm, as reported by Matsuda et al. The cerebral hemisphere was employed as the reference region for Lassen's correction. The reference RI count rate was calculated from the left cerebral hemisphere at the basal ganglia level and the correction factor α was fixed at 2.0. As a result, rCBF could be measured more easily than by Matsuda's method. The contributions of age, laterality and gender to the CBF of normal subjects were studied. The mCBF value of 26 normal subjects was 53.8±6.4 ml/100 g/min and showed a significant correlation with advancing age (R=0.644, p=0.0004, n=26). The mean values for rCBF of the cerebellum, frontal area, temporal area, occipital area and parietal area were 77.3±6.6 ml/100 g/min, 70.2±9.1 ml/100 g/min, 72.3±7.5 ml/100 g/min, 71.8±6.2 ml/100 g/min and 73.8±8.6 ml/100 g/min, respectively. There were no gender or laterality differences in the mCBF or respective rCBF values. Each of the above listed regions, except for the occipital area, demonstrated a significant correlation with advancing age. The most remarkable decrease in rCBF with age was noted in the frontal area (R=0.757, p=0.001, n=26). (author).

  14. Quantitative measurement of regional cerebral blood flow using 99mTc-HM-PAO SPECT

    International Nuclear Information System (INIS)

    Tanaka, Hisashi; Nakamura, Yusaku; Yagi, Yuji; Miura, Kosuke; Takahashi, Mitsuo

    1994-01-01

    This study examined a simple method for measuring the regional cerebral blood flow (rCBF) using 99mTc-HM-PAO SPECT. The mean CBF (mCBF) was determined by the Patlak plot method and rCBF was calculated with Lassen's correction algorithm, as reported by Matsuda et al. The cerebral hemisphere was employed as the reference region for Lassen's correction. The reference RI count rate was calculated from the left cerebral hemisphere at the basal ganglia level and the correction factor α was fixed at 2.0. As a result, rCBF could be measured more easily than by Matsuda's method. The contributions of age, laterality and gender to the CBF of normal subjects were studied. The mCBF value of 26 normal subjects was 53.8±6.4 ml/100 g/min and showed a significant correlation with advancing age (R=0.644, p=0.0004, n=26). The mean values for rCBF of the cerebellum, frontal area, temporal area, occipital area and parietal area were 77.3±6.6 ml/100 g/min, 70.2±9.1 ml/100 g/min, 72.3±7.5 ml/100 g/min, 71.8±6.2 ml/100 g/min and 73.8±8.6 ml/100 g/min, respectively. There were no gender or laterality differences in the mCBF or respective rCBF values. Each of the above listed regions, except for the occipital area, demonstrated a significant correlation with advancing age. The most remarkable decrease in rCBF with age was noted in the frontal area (R=0.757, p=0.001, n=26). (author)
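
    The Lassen correction referred to in these two records is commonly written as F/F_ref = α·X / (1 + α − X), where X is the ROI-to-reference count ratio; the sketch below applies that commonly cited form with α = 2.0 as stated in the records, using invented counts and a placeholder reference CBF. It is an illustration of the linearization step, not the authors' implementation.

    ```python
    # Hedged sketch of Lassen's linearization for HM-PAO rCBF; values are invented.
    def lassen_corrected_rcbf(counts_roi: float, counts_ref: float,
                              cbf_ref: float, alpha: float = 2.0) -> float:
        """Regional CBF (ml/100 g/min) from HM-PAO counts via Lassen's correction."""
        x = counts_roi / counts_ref                   # uptake ratio vs. reference region
        return cbf_ref * alpha * x / (1.0 + alpha - x)

    # Example: reference CBF of 54 ml/100 g/min (Patlak plot), ROI counts 15% above reference
    print(f"rCBF ~ {lassen_corrected_rcbf(1.15, 1.00, 54.0):.1f} ml/100 g/min")
    ```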

  15. Region and cell-type resolved quantitative proteomic map of the human heart

    DEFF Research Database (Denmark)

    Doll, Sophia; Dreßen, Martina; Geyer, Philipp E

    2017-01-01

    The heart is a central human organ and its diseases are the leading cause of death worldwide, but an in-depth knowledge of the identity and quantity of its constituent proteins is still lacking. Here, we determine the healthy human heart proteome by measuring 16 anatomical regions and three major...... cardiac cell types by high-resolution mass spectrometry-based proteomics. From low microgram sample amounts, we quantify over 10,700 proteins in this high dynamic range tissue. We combine copy numbers per cell with protein organellar assignments to build a model of the heart proteome at the subcellular...

  16. Quantitative Myocardial Perfusion Imaging Versus Visual Analysis in Diagnosing Myocardial Ischemia: A CE-MARC Substudy.

    Science.gov (United States)

    Biglands, John D; Ibraheem, Montasir; Magee, Derek R; Radjenovic, Aleksandra; Plein, Sven; Greenwood, John P

    2018-05-01

    This study sought to compare the diagnostic accuracy of visual and quantitative analyses of myocardial perfusion cardiovascular magnetic resonance against a reference standard of quantitative coronary angiography. Visual analysis of perfusion cardiovascular magnetic resonance studies for assessing myocardial perfusion has been shown to have high diagnostic accuracy for coronary artery disease. However, only a few small studies have assessed the diagnostic accuracy of quantitative myocardial perfusion. This retrospective study included 128 patients randomly selected from the CE-MARC (Clinical Evaluation of Magnetic Resonance Imaging in Coronary Heart Disease) study population such that the distribution of risk factors and disease status was proportionate to the full population. Visual analysis results of cardiovascular magnetic resonance perfusion images, by consensus of 2 expert readers, were taken from the original study reports. Quantitative myocardial blood flow estimates were obtained using Fermi-constrained deconvolution. The reference standard for myocardial ischemia was a quantitative coronary x-ray angiogram stenosis severity of ≥70% diameter in any coronary artery of >2 mm diameter, or ≥50% in the left main stem. Diagnostic performance was calculated using receiver-operating characteristic curve analysis. The area under the curve for visual analysis was 0.88 (95% confidence interval: 0.81 to 0.95) with a sensitivity of 81.0% (95% confidence interval: 69.1% to 92.8%) and specificity of 86.0% (95% confidence interval: 78.7% to 93.4%). For quantitative stress myocardial blood flow the area under the curve was 0.89 (95% confidence interval: 0.83 to 0.96) with a sensitivity of 87.5% (95% confidence interval: 77.3% to 97.7%) and specificity of 84.5% (95% confidence interval: 76.8% to 92.3%). There was no statistically significant difference between the diagnostic performance of quantitative and visual analyses (p = 0.72). Incorporating rest myocardial
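
    A hedged sketch of the receiver-operating-characteristic step used to compare the two readouts follows; the per-patient stress blood-flow values and reference labels are invented and only illustrate how an area under the curve is obtained, not the substudy's results.

    ```python
    # Illustrative ROC/AUC computation with invented per-patient data.
    import numpy as np
    from sklearn.metrics import roc_auc_score

    has_significant_stenosis = np.array([1, 1, 0, 0, 1, 0, 1, 0, 0, 1])   # angiographic reference
    stress_mbf_ml_g_min      = np.array([0.9, 1.1, 2.4, 2.1, 1.0, 2.6, 1.3, 2.0, 2.2, 0.8])

    # Lower stress myocardial blood flow indicates ischemia, so negate it to score "disease".
    auc = roc_auc_score(has_significant_stenosis, -stress_mbf_ml_g_min)
    print(f"AUC for quantitative stress MBF ~ {auc:.2f}")
    ```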

  17. Regional trade market analysis: resort marketing approaches

    Science.gov (United States)

    David C. Bojanic; Rodney B. Warnick

    1995-01-01

    This paper examines the value of geographic segmentation for a regional ski resort in New England. Customers from different user groups were surveyed along with a list of inquiries and a purchased list, and grouped according to their area of origin. An ANOVA was performed to determine if there were differences in attitudes and trip behaviors between the segments. It...

  18. Southeast Regional Clean Energy Policy Analysis (Revised)

    Energy Technology Data Exchange (ETDEWEB)

    McLaren, J.

    2011-04-01

    More than half of the electricity produced in the southeastern states is fuelled by coal. Although the region produces some coal, most of the states depend heavily on coal imports. Many of the region's aging coal power facilities are planned for retirement within the next 20 years. However, estimates indicate that a 20% increase in capacity is needed over that time to meet the rapidly growing demand. The most common incentives for energy efficiency in the Southeast are loans and rebates; however, total public spending on energy efficiency is limited. The most common state-level policies to support renewable energy development are personal and corporate tax incentives and loans. The region produced 1.8% of the electricity from renewable resources other than conventional hydroelectricity in 2009, half of the national average. There is significant potential for development of a biomass market in the region, as well as use of local wind, solar, methane-to-energy, small hydro, and combined heat and power resources. Options are offered for expanding and strengthening state-level policies such as decoupling, integrated resource planning, building codes, net metering, and interconnection standards to support further clean energy development. Benefits would include energy security, job creation, insurance against price fluctuations, increased value of marginal lands, and local and global environmental paybacks.

  19. Southeast Regional Clean Energy Policy Analysis

    Energy Technology Data Exchange (ETDEWEB)

    McLaren, Joyce [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2011-04-01

    More than half of the electricity produced in the southeastern states is fuelled by coal. Although the region produces some coal, most of the states depend heavily on coal imports. Many of the region's aging coal power facilities are planned for retirement within the next 20 years. However, estimates indicate that a 20% increase in capacity is needed over that time to meet the rapidly growing demand. The most common incentives for energy efficiency in the Southeast are loans and rebates; however, total public spending on energy efficiency is limited. The most common state-level policies to support renewable energy development are personal and corporate tax incentives and loans. The region produced 1.8% of the electricity from renewable resources other than conventional hydroelectricity in 2009, half of the national average. There is significant potential for development of a biomass market in the region, as well as use of local wind, solar, methane-to-energy, small hydro, and combined heat and power resources. Options are offered for expanding and strengthening state-level policies such as decoupling, integrated resource planning, building codes, net metering, and interconnection standards to support further clean energy development. Benefits would include energy security, job creation, insurance against price fluctuations, increased value of marginal lands, and local and global environmental paybacks.

  20. Quantitative radiographic analysis of fiber reinforced polymer composites.

    Science.gov (United States)

    Baidya, K P; Ramakrishna, S; Rahman, M; Ritchie, A

    2001-01-01

    X-ray radiographic examination of the bone fracture healing process is a widely used method in the treatment and management of patients. Medical devices made of metallic alloys reportedly produce considerable artifacts that make the interpretation of radiographs difficult. Fiber reinforced polymer composite materials have been proposed to replace metallic alloys in certain medical devices because of their radiolucency, light weight, and tailorable mechanical properties. The primary objective of this paper is to provide a comparative radiographic analysis of different fiber reinforced polymer composites that are considered suitable for biomedical applications. The composite materials investigated consist of glass, aramid (Kevlar-29), and carbon reinforcement fibers, and epoxy and polyether-ether-ketone (PEEK) matrices. The total mass attenuation coefficient of each material was measured using clinical X-rays (50 keV). The carbon fiber reinforced composites were found to be more radiolucent than the glass and Kevlar fiber reinforced composites.
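
    The total mass attenuation coefficient mentioned here can be illustrated with a Beer-Lambert transmission calculation; the intensities, density and thickness in the sketch are invented, not the paper's measurements.

    ```python
    # Illustrative mass attenuation coefficient from a transmission measurement.
    import math

    def mass_attenuation_coefficient(i0: float, i: float,
                                     density_g_cm3: float, thickness_cm: float) -> float:
        """mu/rho in cm^2/g from incident (i0) and transmitted (i) X-ray intensities,
        via I = I0 * exp(-(mu/rho) * rho * t)."""
        return math.log(i0 / i) / (density_g_cm3 * thickness_cm)

    mu_rho = mass_attenuation_coefficient(i0=1000.0, i=820.0, density_g_cm3=1.55, thickness_cm=0.40)
    print(f"mu/rho ~ {mu_rho:.3f} cm^2/g at 50 keV")
    ```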