WorldWideScience

Sample records for tissue probability map

  1. The average baboon brain: MRI templates and tissue probability maps from 89 individuals.

    Science.gov (United States)

    Love, Scott A; Marie, Damien; Roth, Muriel; Lacoste, Romain; Nazarian, Bruno; Bertello, Alice; Coulon, Olivier; Anton, Jean-Luc; Meguerditchian, Adrien

    2016-05-15

The baboon (Papio) brain is a remarkable model for investigating the primate brain. The current work aimed at creating a population-average baboon (Papio anubis) brain template and its left/right hemisphere symmetric version from a large sample of T1-weighted magnetic resonance images collected from 89 individuals. Averaging the prior probability maps output during the segmentation of each individual also produced the first baboon brain tissue probability maps for gray matter, white matter and cerebrospinal fluid. The templates and the tissue probability maps were created using state-of-the-art, freely available software tools and are being made freely and publicly available: http://www.nitrc.org/projects/haiko89/ or http://lpc.univ-amu.fr/spip.php?article589. It is hoped that these images will aid neuroimaging research of the baboon by, for example, providing a modern, high quality normalization target and accompanying standardized coordinate system as well as probabilistic priors that can be used during tissue segmentation. Copyright © 2016 Elsevier Inc. All rights reserved.

  2. Probability mapping of scarred myocardium using texture and intensity features in CMR images

    Science.gov (United States)

    2013-01-01

Background The myocardium exhibits a heterogeneous nature due to scarring after Myocardial Infarction (MI). In Cardiac Magnetic Resonance (CMR) imaging, Late Gadolinium (LG) contrast agent enhances the intensity of the scarred area in the myocardium. Methods In this paper, we propose a probability mapping technique using texture and intensity features to describe the heterogeneous nature of the scarred myocardium in CMR images after MI. Scarred tissue and non-scarred tissue are represented with high and low probabilities, respectively. Intermediate values possibly indicate areas where the scarred and healthy tissues are interwoven. The probability map of scarred myocardium is calculated by using a probability function based on Bayes' rule. Any set of features can be used in the probability function. Results In the present study, we demonstrate the use of two different types of features. One is based on the mean intensity of a pixel and the other on the underlying texture information of the scarred and non-scarred myocardium. Examples of probability maps computed using the mean pixel intensity and the underlying texture information are presented. We hypothesize that the probability mapping of the myocardium offers an alternate visualization, possibly revealing details of physiological significance that are difficult to detect visually in the original CMR image. Conclusion The probability mapping obtained from the two features provides a way to define different cardiac segments and to identify areas of diagnostic importance in the myocardium (such as core and border areas of the scarred myocardium). PMID:24053280
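The Bayes-rule mapping described above can be sketched for a single intensity feature. The Gaussian class-conditional densities and all parameter values below are illustrative assumptions, not taken from the paper; with two features, the likelihoods would simply become joint densities over the feature vector.

```python
import numpy as np

def gaussian_pdf(x, mu, sigma):
    """Class-conditional likelihood of an intensity value."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

def scar_probability_map(image, mu_scar, sigma_scar, mu_normal, sigma_normal,
                         prior_scar=0.5):
    """Per-pixel posterior P(scar | intensity) via Bayes' rule."""
    num = gaussian_pdf(image, mu_scar, sigma_scar) * prior_scar
    den = num + gaussian_pdf(image, mu_normal, sigma_normal) * (1.0 - prior_scar)
    return num / den

# Toy LGE-like image: a bright scarred region on darker myocardium.
img = np.full((4, 4), 80.0)
img[1:3, 1:3] = 200.0
pmap = scar_probability_map(img, mu_scar=200.0, sigma_scar=30.0,
                            mu_normal=80.0, sigma_normal=30.0)
```

Pixels whose intensity sits between the two class means receive intermediate probabilities, matching the paper's reading of interwoven scarred and healthy tissue.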

  3. Probability mapping of contaminants

    International Nuclear Information System (INIS)

    Rautman, C.A.; Kaplan, P.G.; McGraw, M.A.; Istok, J.D.; Sigda, J.M.

    1994-01-01

Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely, statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds).
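The post-processing step described above, which turns a stack of equally likely simulations into an exceedance-probability map, amounts to a per-parcel frequency count. A minimal sketch, with synthetic lognormal fields standing in for conditioned geostatistical simulations and a purely hypothetical threshold:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stack of equally likely realizations; synthetic lognormal fields stand in
# for conditioned geostatistical simulations of parcel-scale contamination.
n_sims, ny, nx = 500, 20, 20
sims = rng.lognormal(mean=3.0, sigma=0.5, size=(n_sims, ny, nx))

threshold = 30.0  # hypothetical clean-up threshold (e.g. ppm uranium)
prob_exceed = (sims > threshold).mean(axis=0)  # per-parcel exceedance probability
```

In a real application each realization would also honor the measured sample values at their locations; only the counting step is shown here.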

  4. Probability mapping of contaminants

    Energy Technology Data Exchange (ETDEWEB)

    Rautman, C.A.; Kaplan, P.G. [Sandia National Labs., Albuquerque, NM (United States); McGraw, M.A. [Univ. of California, Berkeley, CA (United States); Istok, J.D. [Oregon State Univ., Corvallis, OR (United States); Sigda, J.M. [New Mexico Inst. of Mining and Technology, Socorro, NM (United States)

    1994-04-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds).

  5. Cerebral gray matter volume losses in essential tremor: A case-control study using high resolution tissue probability maps.

    Science.gov (United States)

    Cameron, Eric; Dyke, Jonathan P; Hernandez, Nora; Louis, Elan D; Dydak, Ulrike

    2018-03-10

Essential tremor (ET) is increasingly recognized as a multi-dimensional disorder with both motor and non-motor features. For this reason, imaging studies are more broadly examining regions outside the cerebellar motor loop. Reliable detection of cerebral gray matter (GM) atrophy requires optimized processing, adapted to high-resolution magnetic resonance imaging (MRI). We investigated cerebral GM volume loss in ET cases using automated segmentation of MRI T1-weighted images. MRI was acquired on 47 ET cases and 36 controls. Automated segmentation and voxel-wise comparisons of volume were performed using Statistical Parametric Mapping (SPM) software. To improve upon standard protocols, the high-resolution International Consortium for Brain Mapping (ICBM) 2009a atlas and tissue probability maps were used to process each subject image. Group comparisons were performed: all ET vs. controls, ET with head tremor (ETH) vs. controls, and severe ET vs. controls. An analysis of variance (ANOVA) was performed between ET with and without head tremor and controls. Age, sex, and Montreal Cognitive Assessment (MoCA) score were regressed out from each comparison. We were able to consistently identify regions of cerebral GM volume loss in ET and in ET subgroups in the posterior insula, superior temporal gyri, cingulate cortex, inferior frontal gyri and other occipital and parietal regions. There were no significant increases in GM volume in ET in any comparisons with controls. This study, which uses improved methodologies, provides evidence that GM volume loss in ET is present beyond the cerebellum, and in fact, is widespread throughout the cerebrum as well. Copyright © 2018 Elsevier Ltd. All rights reserved.

  6. Linear-fitting-based similarity coefficient map for tissue dissimilarity analysis in T2*-w magnetic resonance imaging

    International Nuclear Information System (INIS)

    Yu Shao-De; Wu Shi-Bin; Xie Yao-Qin; Wang Hao-Yu; Wei Xin-Hua; Chen Xin; Pan Wan-Long; Hu Jiani

    2015-01-01

Similarity coefficient mapping (SCM) aims to improve the morphological evaluation of T2*-weighted magnetic resonance imaging (T2*-w MRI). However, how to interpret the generated SCM map is still an open question. Moreover, can tissue dissimilarity information be extracted based on the theory behind SCM? The primary purpose of this paper is to address these two questions. First, the theory of SCM was interpreted from the perspective of linear fitting. Then, a term was embedded to carry tissue dissimilarity information. Finally, the method was validated with sixteen human brain image series from multi-echo T2*-w MRI. Generated maps were assessed in terms of signal-to-noise ratio (SNR) and perceived visual quality, and then interpreted through intra- and inter-tissue intensity. Experimental results show that both the perceptibility of anatomical structures and tissue contrast are improved. More importantly, tissue similarity or dissimilarity can be quantified and cross-validated from pixel intensity analysis. This method benefits image enhancement, tissue classification, malformation detection and morphological evaluation. (paper)
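The core SCM idea of comparing each pixel's multi-echo decay curve with a reference tissue curve can be sketched as a correlation map. The decay model, echo times, and the use of plain correlation as the similarity coefficient below are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def similarity_map(series, ref):
    """Correlate each pixel's multi-echo decay curve with a reference curve.
    series: (n_echoes, ny, nx); ref: (n_echoes,). Returns an (ny, nx) map."""
    s = series.reshape(series.shape[0], -1)
    s0 = s - s.mean(axis=0)
    r0 = ref - ref.mean()
    corr = (s0 * r0[:, None]).sum(axis=0) / (
        np.linalg.norm(s0, axis=0) * np.linalg.norm(r0) + 1e-12)
    return corr.reshape(series.shape[1:])

echoes = np.arange(1.0, 9.0)       # eight echo times (arbitrary units)
ref = np.exp(-echoes / 20.0)       # reference tissue decay curve
fast = np.exp(-echoes / 5.0)       # a second tissue with faster decay
series = np.stack([ref, fast], axis=1).reshape(8, 1, 2)
scm = similarity_map(series, ref)  # a pixel identical to the reference scores 1
```

The dissimilarity term the paper embeds would go beyond this correlation, distinguishing curves that are merely correlated from curves that are actually the same decay.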

  7. Compositional cokriging for mapping the probability risk of groundwater contamination by nitrates.

    Science.gov (United States)

    Pardo-Igúzquiza, Eulogio; Chica-Olmo, Mario; Luque-Espinar, Juan A; Rodríguez-Galiano, Víctor

    2015-11-01

Contamination by nitrates is an important cause of groundwater pollution and represents a potential risk to human health. Management decisions must be made using probability maps that assess the potential of the nitrate concentration exceeding regulatory thresholds. However, these maps are obtained with only a small number of sparse monitoring locations where the nitrate concentrations have been measured. It is therefore of great interest to have an efficient methodology for obtaining those probability maps. In this paper, we make use of the fact that the discrete probability density function is a compositional variable. The spatial discrete probability density function is estimated by compositional cokriging. There are several advantages in using this approach: (i) problems of classical indicator cokriging, like estimates outside the interval (0,1) and order relations, are avoided; (ii) secondary variables (e.g. aquifer parameters) can be included in the estimation of the probability maps; (iii) uncertainty maps of the probability maps can be obtained; (iv) finally, there are modelling advantages, because the variograms and cross-variograms are those of real variables, which do not have the restrictions of indicator variograms and indicator cross-variograms. The methodology was applied to the Vega de Granada aquifer in Southern Spain and the advantages of the compositional cokriging approach were demonstrated. Copyright © 2015 Elsevier B.V. All rights reserved.
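Advantage (i) above, estimates guaranteed to stay in (0,1) and sum to one, comes from working in log-ratio coordinates. A minimal sketch of the additive log-ratio transform and its inverse; the class definitions and probability values are illustrative, and the cokriging itself would operate on the transformed coordinates:

```python
import numpy as np

def alr(p):
    """Additive log-ratio transform of a composition (last part is the reference)."""
    return np.log(p[:-1] / p[-1])

def alr_inv(z):
    """Inverse transform; the result is always a valid composition in (0, 1)."""
    e = np.append(np.exp(z), 1.0)
    return e / e.sum()

# Discrete probabilities that the nitrate concentration at a monitoring site
# falls below / near / above a regulatory threshold (illustrative values).
p = np.array([0.6, 0.3, 0.1])
z = alr(p)           # unconstrained coordinates; cokriging would act on these
p_back = alr_inv(z)  # back-transformation restores a valid probability vector
```

Interpolating in the unconstrained space and back-transforming is what avoids the out-of-range estimates and order-relation violations of indicator cokriging.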

  8. Linking retinotopic fMRI mapping and anatomical probability maps of human occipital areas V1 and V2.

    Science.gov (United States)

    Wohlschläger, A M; Specht, K; Lie, C; Mohlberg, H; Wohlschläger, A; Bente, K; Pietrzyk, U; Stöcker, T; Zilles, K; Amunts, K; Fink, G R

    2005-05-15

    Using functional MRI, we characterized field sign maps of the occipital cortex and created three-dimensional maps of these areas. By averaging the individual maps into group maps, probability maps of functionally defined V1 or V2 were determined and compared to anatomical probability maps of Brodmann areas BA17 and BA18 derived from cytoarchitectonic analysis (Amunts, K., Malikovic, A., Mohlberg, H., Schormann, T., Zilles, K., 2000. Brodmann's areas 17 and 18 brought into stereotaxic space-where and how variable? NeuroImage 11, 66-84). Comparison of areas BA17/V1 and BA18/V2 revealed good agreement of the anatomical and functional probability maps. Taking into account that our functional stimulation (due to constraints of the visual angle of stimulation achievable in the MR scanner) only identified parts of V1 and V2, for statistical evaluation of the spatial correlation of V1 and BA17, or V2 and BA18, respectively, the a priori measure kappa was calculated testing the hypothesis that a region can only be part of functionally defined V1 or V2 if it is also in anatomically defined BA17 or BA18, respectively. kappa = 1 means the hypothesis is fully true, kappa = 0 means functionally and anatomically defined visual areas are independent. When applying this measure to the probability maps, kappa was equal to 0.84 for both V1/BA17 and V2/BA18. The data thus show a good correspondence of functionally and anatomically derived segregations of early visual processing areas and serve as a basis for employing anatomical probability maps of V1 and V2 in group analyses to characterize functional activations of early visual processing areas.
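The a priori measure kappa tests whether functionally defined voxels lie inside the anatomically defined area. A simple chance-corrected containment statistic in the same spirit can be sketched as follows; this is an illustrative stand-in, not necessarily the exact formula used in the paper:

```python
import numpy as np

def containment_kappa(func_mask, anat_mask):
    """Chance-corrected containment: 1 when every functionally defined voxel
    lies inside the anatomical area, 0 when containment is at chance level."""
    p_obs = (func_mask & anat_mask).sum() / func_mask.sum()
    p_chance = anat_mask.mean()  # expected containment if the maps were independent
    return (p_obs - p_chance) / (1.0 - p_chance)

anat = np.zeros((10, 10), dtype=bool)
anat[2:7, 2:7] = True   # anatomically defined area (e.g. BA17)
func = np.zeros((10, 10), dtype=bool)
func[3:6, 3:6] = True   # functionally defined area (e.g. V1), fully contained
kappa = containment_kappa(func, anat)
```

A value near the paper's 0.84 would indicate strong, though not perfect, containment of the functional area within its anatomical counterpart.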

  9. Can Probability Maps of Swept-Source Optical Coherence Tomography Predict Visual Field Changes in Preperimetric Glaucoma?

    Science.gov (United States)

    Lee, Won June; Kim, Young Kook; Jeoung, Jin Wook; Park, Ki Ho

    2017-12-01

To determine the usefulness of swept-source optical coherence tomography (SS-OCT) probability maps in detecting locations with significant reduction in visual field (VF) sensitivity, or in predicting future VF changes, in patients with classically defined preperimetric glaucoma (PPG). In this longitudinal study, 43 eyes of 43 PPG patients, followed up every 6 months for at least 2 years, were analyzed. The patients underwent wide-field SS-OCT scanning and standard automated perimetry (SAP) at the time of enrollment. With this wide-scan protocol, probability maps originating from the corresponding thickness map and overlapping with SAP VF test points could be generated. We evaluated the vulnerable VF points with SS-OCT probability maps, as well as the prevalence of locations with significant VF reduction or subsequent VF changes observed in the corresponding damaged areas of the probability maps. The vulnerable VF points were found in superior and inferior arcuate patterns near central fixation. In 19 of 43 PPG eyes (44.2%), significant reduction in baseline VF was detected within the areas of structural change on the SS-OCT probability maps. In 16 of 43 PPG eyes (37.2%), subsequent VF changes within the areas of SS-OCT probability map change were observed over the course of the follow-up. Structural changes on SS-OCT probability maps could detect or predict VF changes using SAP in a considerable number of PPG eyes. Careful comparison of probability maps with SAP results could be useful in diagnosing and monitoring PPG patients in the clinical setting.

  10. Statistical validation of normal tissue complication probability models

    NARCIS (Netherlands)

    Xu, Cheng-Jian; van der Schaaf, Arjen; van t Veld, Aart; Langendijk, Johannes A.; Schilstra, Cornelis

    2012-01-01

PURPOSE: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. METHODS AND MATERIALS: A penalized regression method, LASSO (least absolute shrinkage and selection operator), ...

  11. Probability Maps for the Visualization of Assimilation Ensemble Flow Data

    KAUST Repository

    Hollt, Thomas

    2015-05-25

Ocean forecasts nowadays are created by running ensemble simulations in combination with data assimilation techniques. Most of these techniques resample the ensemble members after each assimilation cycle. This means that in a time series, after resampling, every member can follow up on any of the members before resampling. Tracking behavior over time, such as all possible paths of a particle in an ensemble vector field, becomes very difficult, as the number of combinations rises exponentially with the number of assimilation cycles. In general, a single possible path is not of interest; what matters are the probabilities that any point in space might be reached by a particle at some point in time. In this work we present an approach using probability-weighted piecewise particle trajectories to allow such a mapping interactively, instead of tracing quadrillions of individual particles. We achieve interactive rates by binning the domain and splitting up the tracing process into the individual assimilation cycles, so that particles that fall into the same bin after a cycle can be treated as a single particle with a larger probability as input for the next time step. As a result we lose the possibility to track individual particles, but can create probability maps for any desired seed at interactive rates.
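The binning trick that keeps the tracing cost linear in the number of cycles can be sketched in one dimension. The transition matrix below is a hypothetical stand-in for the per-cycle particle advection; only the bookkeeping of merged probability mass is the point:

```python
import numpy as np

def advance_cycle(prob, transition):
    """One assimilation cycle: redistribute the per-bin probability mass.
    transition[i, j] = probability that mass in bin i moves to bin j."""
    return prob @ transition

n_bins = 5
# Hypothetical per-cycle transition (rows sum to 1): drift one bin to the right.
T = np.zeros((n_bins, n_bins))
for i in range(n_bins):
    T[i, min(i + 1, n_bins - 1)] += 0.8
    T[i, i] += 0.2

prob = np.zeros(n_bins)
prob[0] = 1.0                      # seed a particle in the first bin
for _ in range(3):                 # three assimilation cycles
    prob = advance_cycle(prob, T)  # O(n_bins**2) per cycle, not 2**cycles paths
```

Merging all particles that land in the same bin into one weighted particle is exactly what replaces the exponential path enumeration with this matrix-vector product.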

  12. Spatial vent opening probability map of El Hierro Island (Canary Islands, Spain)

    Science.gov (United States)

    Becerril, Laura; Cappello, Annalisa; Galindo, Inés; Neri, Marco; Del Negro, Ciro

    2013-04-01

The assessment of the probable spatial distribution of new eruptions is useful to manage and reduce volcanic risk. It can be achieved in different ways, but it becomes especially hard when dealing with volcanic areas that are less studied, poorly monitored and characterized by infrequent activity, as is the case for El Hierro. Even though it is the youngest of the Canary Islands, before the 2011 eruption in the "Las Calmas Sea", El Hierro had been the least studied volcanic island of the Canaries, with attention historically devoted to La Palma, Tenerife and Lanzarote. We propose a probabilistic method to build the susceptibility map of El Hierro, i.e. the spatial distribution of vent opening for future eruptions, based on the mathematical analysis of volcano-structural data collected mostly on the island and, secondly, on the submerged part of the volcano, up to a distance of ~10-20 km from the coast. The volcano-structural data were collected through new fieldwork measurements, bathymetric information, and analysis of geological maps, orthophotos and aerial photographs. They were divided into different datasets and converted into separate and weighted probability density functions, which were then included in a non-homogeneous Poisson process to produce the volcanic susceptibility map. The probability of future eruptive events on El Hierro is mainly concentrated on the rift zones, extending also beyond the shoreline. The highest probabilities of hosting new eruptions are located on the distal parts of the South and West rifts, with the maximum reached in the south-western area of the West rift. High probabilities are also observed in the Northeast and South rifts, and in the submarine parts of the rifts. This map represents the first effort to deal with volcanic hazard at El Hierro and can be a support tool for decision makers in land planning, emergency plans and civil defence actions.

  13. Tissue-based map of the human proteome

    DEFF Research Database (Denmark)

    Uhlén, Mathias; Fagerberg, Linn; Hallström, Björn M.

    2015-01-01

    Resolving the molecular details of proteome variation in the different tissues and organs of the human body will greatly increase our knowledge of human biology and disease. Here, we present a map of the human tissue proteome based on an integrated omics approach that involves quantitative transc...

  14. Direct probability mapping of contaminants

    International Nuclear Information System (INIS)

    Rautman, C.A.

    1993-01-01

Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. Geostatistical simulation provides powerful tools for investigating contaminant levels, and in particular, for identifying and using the spatial interrelationships among a set of isolated sample values. This additional information can be used to assess the likelihood of encountering contamination at unsampled locations and to evaluate the risk associated with decisions to remediate or not to remediate specific regions within a site. Past operation of the DOE Feed Materials Production Center has contaminated a site near Fernald, Ohio, with natural uranium. Soil geochemical data have been collected as part of the Uranium-in-Soils Integrated Demonstration Project. These data have been used to construct a number of stochastic images of potential contamination for parcels approximately the size of a selective remediation unit. Each such image accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely, statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination. Evaluation of the geostatistical simulations can yield maps representing the expected magnitude of the contamination for various regions and other information that may be important in determining a suitable remediation process or in sizing equipment to accomplish the restoration.

  15. Chaos optimization algorithms based on chaotic maps with different probability distribution and search speed for global optimization

    Science.gov (United States)

    Yang, Dixiong; Liu, Zhenjun; Zhou, Jilei

    2014-04-01

Chaos optimization algorithms (COAs) usually utilize a chaotic map, such as the logistic map, to generate pseudo-random numbers mapped to the design variables for global optimization. Many existing studies have indicated that COAs can more easily escape from local minima than classical stochastic optimization algorithms. This paper reveals the inherent mechanism of the high efficiency and superior performance of COAs, from the new perspective of both the probability distribution property and the search speed of chaotic sequences generated by different chaotic maps. The statistical property and search speed of chaotic sequences are represented by the probability density function (PDF) and the Lyapunov exponent, respectively. Meanwhile, the computational performances of hybrid chaos-BFGS algorithms based on eight one-dimensional chaotic maps with different PDFs and Lyapunov exponents are compared, in which BFGS is a quasi-Newton method for local optimization. Moreover, several multimodal benchmark examples illustrate that the probability distribution property and search speed of chaotic sequences from different chaotic maps significantly affect the global searching capability and optimization efficiency of a COA. To achieve high efficiency, it is recommended to adopt an appropriate chaotic map generating the desired chaotic sequences with a uniform or nearly uniform probability distribution and a large Lyapunov exponent.
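The two quantities the paper compares, the distribution of the iterates and the Lyapunov exponent, are easy to estimate numerically for the logistic map at r = 4, where the exponent is known analytically to be ln 2:

```python
import numpy as np

def logistic_sequence(x0, r, n):
    """Iterate the logistic map x -> r * x * (1 - x)."""
    xs = np.empty(n)
    x = x0
    for i in range(n):
        x = r * x * (1.0 - x)
        xs[i] = x
    return xs

def lyapunov_estimate(xs, r):
    """Lyapunov exponent as the trajectory average of ln|f'(x)| = ln|r(1 - 2x)|."""
    return float(np.mean(np.log(np.abs(r * (1.0 - 2.0 * xs)))))

xs = logistic_sequence(0.3, 4.0, 100_000)
lam = lyapunov_estimate(xs, 4.0)  # theory: ln 2 ≈ 0.693 for r = 4
```

A histogram of `xs` would show the non-uniform arcsine-shaped density of the logistic map at r = 4, which is why the paper recommends maps with flatter densities for generating design variables.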

  16. Cancerous tissue mapping from random lasing emission spectra

    International Nuclear Information System (INIS)

    Polson, R C; Vardeny, Z V

    2010-01-01

    Random lasing emission spectra have been collected from both healthy and cancerous tissues. The two types of tissue with optical gain have different light scattering properties as obtained from an average power Fourier transform of their random lasing emission spectra. The difference in the power Fourier transform leads to a contrast between cancerous and benign tissues, which is utilized for tissue mapping of healthy and cancerous regions of patients

  17. Normal tissue complication probability for salivary glands

    International Nuclear Information System (INIS)

    Rana, B.S.

    2008-01-01

The purpose of radiotherapy is to strike a favorable balance between morbidity (due to the side effects of radiation) and cure of the malignancy. To achieve this, one needs to know the relation between NTCP (normal tissue complication probability) and the various treatment variables of a schedule, viz. daily dose, duration of treatment, total dose and fractionation, along with tissue conditions. Prospective studies require that a large number of patients be treated with varied schedule parameters and that a statistically acceptable number of patients develop complications, so that a true relation between NTCP and a particular variable can be established. In this study, salivary gland complications have been considered. The cases treated on a 60Co teletherapy machine during the period 1994 to 2002 were analyzed, and the clinicians' judgement in ascertaining the end points was the only means of observation. The end points were early and late xerostomia, which were considered for NTCP evaluation over a period of 5 years.

  18. Creating and validating cis-regulatory maps of tissue-specific gene expression regulation

    Science.gov (United States)

    O'Connor, Timothy R.; Bailey, Timothy L.

    2014-01-01

    Predicting which genomic regions control the transcription of a given gene is a challenge. We present a novel computational approach for creating and validating maps that associate genomic regions (cis-regulatory modules–CRMs) with genes. The method infers regulatory relationships that explain gene expression observed in a test tissue using widely available genomic data for ‘other’ tissues. To predict the regulatory targets of a CRM, we use cross-tissue correlation between histone modifications present at the CRM and expression at genes within 1 Mbp of it. To validate cis-regulatory maps, we show that they yield more accurate models of gene expression than carefully constructed control maps. These gene expression models predict observed gene expression from transcription factor binding in the CRMs linked to that gene. We show that our maps are able to identify long-range regulatory interactions and improve substantially over maps linking genes and CRMs based on either the control maps or a ‘nearest neighbor’ heuristic. Our results also show that it is essential to include CRMs predicted in multiple tissues during map-building, that H3K27ac is the most informative histone modification, and that CAGE is the most informative measure of gene expression for creating cis-regulatory maps. PMID:25200088
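The cross-tissue correlation step can be sketched as follows. The data are synthetic, the gene names are placeholders, and the linking rule (pick the best-correlated candidate within range) is a simplification of the paper's method:

```python
import numpy as np

rng = np.random.default_rng(2)
n_tissues = 12

# Synthetic cross-tissue data: H3K27ac signal at one CRM and expression of
# three hypothetical candidate genes lying within 1 Mbp of it.
crm_signal = rng.random(n_tissues)
genes = {
    "geneA": 2.0 * crm_signal + rng.normal(0.0, 0.05, n_tissues),  # true target
    "geneB": rng.random(n_tissues),
    "geneC": rng.random(n_tissues),
}

def best_target(crm, candidates):
    """Link the CRM to the candidate with the highest cross-tissue correlation
    between the CRM's histone-modification signal and gene expression."""
    corr = {g: np.corrcoef(crm, expr)[0, 1] for g, expr in candidates.items()}
    return max(corr, key=corr.get)

target = best_target(crm_signal, genes)
```

Because the correlation is computed across tissues rather than across genomic distance, the rule can link a CRM to a distant gene, which is how the map captures long-range regulatory interactions.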

  19. Calculation of normal tissue complication probability and dose-volume histogram reduction schemes for tissues with a critical element architecture

    International Nuclear Information System (INIS)

    Niemierko, Andrzej; Goitein, Michael

    1991-01-01

The authors investigate a model of normal tissue complication probability for tissues that may be represented by a critical element architecture. They derive formulas for complication probability that apply both to a partial volume irradiation and to an arbitrary inhomogeneous dose distribution. The dose-volume isoeffect relationship, which is a consequence of a critical element architecture, is discussed and compared to the empirical power law relationship. A dose-volume histogram reduction scheme for a 'pure' critical element model is derived. In addition, a point-based algorithm which does not require precomputation of a dose-volume histogram is derived. The existing published dose-volume histogram reduction algorithms are analyzed. The authors show that the existing algorithms, developed empirically without an explicit biophysical model, have a close relationship to the critical element model at low levels of complication probability. However, it is also shown that they have aspects which are not compatible with a critical element model, and the authors propose a modification to one of them to circumvent its restriction to low complication probabilities. (author). 26 refs.; 7 figs
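For a pure critical element architecture the organ fails if any independent subunit fails, giving NTCP = 1 − Π_i (1 − p(D_i))^(v_i) over the dose-volume histogram bins. A sketch with a hypothetical per-element dose-response (the logistic form and its parameters are illustrative assumptions):

```python
import numpy as np

def element_response(d, d50=50.0, k=4.0):
    """Hypothetical per-element complication probability, logistic in dose."""
    d = np.asarray(d, dtype=float)
    return 1.0 / (1.0 + (d50 / np.maximum(d, 1e-9)) ** k)

def ntcp_critical_element(doses, volumes):
    """Critical element model: the organ fails if ANY subunit fails.
    doses/volumes are DVH bins (Gy, fractional volume);
    NTCP = 1 - prod_i (1 - p(D_i)) ** v_i."""
    p = element_response(doses)
    return 1.0 - np.prod((1.0 - p) ** np.asarray(volumes, dtype=float))

# The dose-volume effect: the same dose to half the organ gives a lower NTCP.
full = ntcp_critical_element([50.0], [1.0])   # whole organ at d50
half = ntcp_critical_element([50.0], [0.5])   # half the organ at d50
```

The partial-volume case falls directly out of the product form, which is the dose-volume isoeffect relationship the abstract contrasts with the empirical power law.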

  20. Probability of cavitation for single ultrasound pulses applied to tissues and tissue-mimicking materials.

    Science.gov (United States)

    Maxwell, Adam D; Cain, Charles A; Hall, Timothy L; Fowlkes, J Brian; Xu, Zhen

    2013-03-01

In this study, the negative pressure values at which inertial cavitation consistently occurs in response to a single, two-cycle, focused ultrasound pulse were measured in several media relevant to cavitation-based ultrasound therapy. The pulse was focused into a chamber containing one of the media, which included liquids, tissue-mimicking materials, and ex vivo canine tissue. Focal waveforms were measured by two separate techniques using a fiber-optic hydrophone. Inertial cavitation was identified by high-speed photography in optically transparent media and an acoustic passive cavitation detector. The probability of cavitation (P(cav)) for a single pulse as a function of peak negative pressure (p(-)) followed a sigmoid curve, with the probability approaching one when the pressure amplitude was sufficient. The statistical threshold (defined as P(cav) = 0.5) was between p(-) = 26 and 30 MPa in all samples with high water content but varied between p(-) = 13.7 and >36 MPa in other media. A model for radial cavitation bubble dynamics was employed to evaluate the behavior of cavitation nuclei at these pressure levels. A single bubble nucleus with an inertial cavitation threshold of p(-) = 28.2 MPa was estimated to have a 2.5 nm radius in distilled water. These data may be valuable for cavitation-based ultrasound therapy to predict the likelihood of cavitation at various pressure levels and dimensions of cavitation-induced lesions in tissue. Copyright © 2013 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.
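The sigmoid dependence of P(cav) on peak negative pressure, with the statistical threshold defined as the pressure where P(cav) = 0.5, can be sketched as follows; the logistic form and the slope parameter are illustrative assumptions, with the midpoint set near the reported water-content threshold:

```python
import numpy as np

def p_cav(p_neg, p_thresh=28.0, width=1.5):
    """Hypothetical sigmoid for single-pulse cavitation probability versus
    peak negative pressure (MPa); p_thresh is where P(cav) = 0.5."""
    return 1.0 / (1.0 + np.exp(-(np.asarray(p_neg, dtype=float) - p_thresh) / width))

# Below the threshold cavitation is rare; well above it, nearly certain.
p_lo, p_mid, p_hi = p_cav(20.0), p_cav(28.0), p_cav(36.0)
```

Fitting such a curve to measured cavitation fractions is one way to extract the statistical threshold reported per medium.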

  1. Expression cartography of human tissues using self organizing maps

    Directory of Open Access Journals (Sweden)

    Löffler Markus

    2011-07-01

Abstract Background Parallel high-throughput microarray and sequencing experiments produce vast quantities of multidimensional data which must be arranged and analyzed in a concerted way. One approach to addressing this challenge is the machine learning technique known as self organizing maps (SOMs). SOMs enable a parallel sample- and gene-centered view of genomic data combined with strong visualization and second-level analysis capabilities. The paper aims to bridge the gap between the power of SOM machine learning to reduce the dimensionality of high-dimensional data on the one hand, and practical applications, with special emphasis on gene expression analysis, on the other. Results The method was applied to generate a SOM characterizing the whole genome expression profiles of 67 healthy human tissues selected from ten tissue categories (adipose, endocrine, homeostasis, digestion, exocrine, epithelium, sexual reproduction, muscle, immune system and nervous tissues). SOM mapping reduces the dimension of expression data from tens of thousands of genes to a few thousand metagenes, each representing a minicluster of co-regulated single genes. Tissue-specific and common properties shared between groups of tissues emerge as a handful of localized spots in the tissue maps collecting groups of co-regulated and co-expressed metagenes. The functional context of the spots was discovered using overrepresentation analysis with respect to pre-defined gene sets of known functional impact. We found that tissue-related spots typically contain enriched populations of genes related to specific molecular processes in the respective tissue. Analysis techniques normally used at the gene level, such as two-way hierarchical clustering, are better represented and provide better signal-to-noise ratios if applied to the metagenes. Metagene-based clustering analyses aggregate the tissues broadly into three clusters containing nervous, immune system and the remaining tissues.

  2. A simple method to calculate the influence of dose inhomogeneity and fractionation in normal tissue complication probability evaluation

    International Nuclear Information System (INIS)

    Begnozzi, L.; Gentile, F.P.; Di Nallo, A.M.; Chiatti, L.; Zicari, C.; Consorti, R.; Benassi, M.

    1994-01-01

    Since volumetric dose distributions are available with 3-dimensional radiotherapy treatment planning, they can be used in the statistical evaluation of response to radiation. This report presents a method to calculate the influence of dose inhomogeneity and fractionation in normal tissue complication probability evaluation. The mathematical expression for the calculation of normal tissue complication probability has been derived by combining the Lyman model with the histogram reduction method of Kutcher et al. and using the normalized total dose (NTD) instead of the total dose. The fitting of published tolerance data, in the case of homogeneous or partial brain irradiation, has been considered. For the same total or partial volume homogeneous irradiation of the brain, curves of normal tissue complication probability have been calculated with fraction sizes of 1.5 Gy and of 3 Gy instead of 2 Gy, to show the influence of fraction size. The influence of dose distribution inhomogeneity and of the α/β value has also been simulated: considering α/β=1.6 Gy or α/β=4.1 Gy for kidney clinical nephritis, the calculated curves of normal tissue complication probability are shown. Combining NTD calculations and histogram reduction techniques, normal tissue complication probability can be estimated taking into account the most relevant contributing factors, including the volume effect. (orig.)
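
The three ingredients combined in this abstract (NTD rescaling, Kutcher-style histogram reduction, and the Lyman sigmoid) can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation; the parameter values and the dose-volume histogram are hypothetical placeholders.

```python
import math

def ntd(total_dose, dose_per_fraction, alpha_beta):
    """Normalized total dose (EQD2-style): rescales a fractionation
    scheme to the biologically equivalent dose given in 2 Gy fractions."""
    return total_dose * (alpha_beta + dose_per_fraction) / (alpha_beta + 2.0)

def effective_volume(dvh, n):
    """Kutcher-Burman histogram reduction: collapse a DVH, given as
    (fractional_volume, dose) bins, into the effective fraction of the
    organ irradiated uniformly at the maximum dose."""
    d_max = max(dose for _, dose in dvh)
    return sum(v * (dose / d_max) ** (1.0 / n) for v, dose in dvh)

def lyman_ntcp(dose, volume, td50_whole, m, n):
    """Lyman model: NTCP = Phi((D - TD50(v)) / (m * TD50(v))),
    with the volume effect TD50(v) = TD50(1) * v**(-n)."""
    td50 = td50_whole * volume ** (-n)
    t = (dose - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# Illustrative whole-organ example: 54 Gy in 1.8 Gy fractions, alpha/beta = 2 Gy.
d_eq = ntd(54.0, 1.8, 2.0)
p = lyman_ntcp(d_eq, 1.0, td50_whole=60.0, m=0.15, n=0.25)
```

For an inhomogeneous dose distribution, one would first convert each DVH bin dose to its NTD, reduce the histogram with `effective_volume`, and evaluate `lyman_ntcp` at the maximum NTD with that effective volume.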

  3. MR-based automatic delineation of volumes of interest in human brain PET images using probability maps

    DEFF Research Database (Denmark)

    Svarer, Claus; Madsen, Karina; Hasselbalch, Steen G.

    2005-01-01

    The purpose of this study was to develop and validate an observer-independent approach for automatic generation of volume-of-interest (VOI) brain templates to be used in emission tomography studies of the brain. The method utilizes a VOI probability map created on the basis of a database of several...... delineation of the VOI set. The approach was also shown to work equally well in individuals with pronounced cerebral atrophy. Probability-map-based automatic delineation of VOIs is a fast, objective, reproducible, and safe way to assess regional brain values from PET or SPECT scans. In addition, the method...

  4. Statistical validation of normal tissue complication probability models.

    Science.gov (United States)

    Xu, Cheng-Jian; van der Schaaf, Arjen; Van't Veld, Aart A; Langendijk, Johannes A; Schilstra, Cornelis

    2012-09-01

    To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use. Copyright © 2012 Elsevier Inc. All rights reserved.
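
The validation recipe described here (double cross-validation around a penalized model, plus permutation testing) can be sketched with scikit-learn on synthetic data. An L1-penalized logistic regression stands in for the LASSO-type NTCP model; the dataset and parameter grid are illustrative, not the study's data.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import (GridSearchCV, StratifiedKFold,
                                     cross_val_score, permutation_test_score)

# Synthetic stand-in for dosimetric/clinical predictors and a binary toxicity endpoint.
X, y = make_classification(n_samples=120, n_features=10, n_informative=3,
                           random_state=0)

# Inner loop: choose the penalty strength of an L1 (LASSO-type) model.
inner = GridSearchCV(
    LogisticRegression(penalty="l1", solver="liblinear"),
    {"C": [0.01, 0.1, 1.0, 10.0]},
    cv=StratifiedKFold(5, shuffle=True, random_state=0),
    scoring="roc_auc",
)

# Outer loop of the double (nested) cross-validation: the spread of AUC
# across outer folds exposes the uncertainty and instability of the models.
outer_auc = cross_val_score(inner, X, y, scoring="roc_auc",
                            cv=StratifiedKFold(5, shuffle=True, random_state=1))

# Permutation test: refit on label-shuffled data to obtain the chance
# distribution of the score and a p-value for the observed performance.
auc, perm_aucs, p_value = permutation_test_score(
    inner, X, y, scoring="roc_auc", n_permutations=20, random_state=2)
```

A model whose outer-fold AUC is not significantly better than the permutation distribution should not be trusted clinically, which is the paper's central recommendation.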

  5. Statistical Validation of Normal Tissue Complication Probability Models

    Energy Technology Data Exchange (ETDEWEB)

    Xu Chengjian, E-mail: c.j.xu@umcg.nl [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schaaf, Arjen van der; Veld, Aart A. van' t; Langendijk, Johannes A. [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schilstra, Cornelis [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Radiotherapy Institute Friesland, Leeuwarden (Netherlands)

    2012-09-01

    Purpose: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. Methods and Materials: A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Results: Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Conclusion: Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use.

  6. Mapping fire probability and severity in a Mediterranean area using different weather and fuel moisture scenarios

    Science.gov (United States)

    Arca, B.; Salis, M.; Bacciu, V.; Duce, P.; Pellizzaro, G.; Ventura, A.; Spano, D.

    2009-04-01

    Although in many countries lightning is the main cause of ignition, in the Mediterranean Basin forest fires are predominantly ignited by arson or by human negligence. The fire season peaks coincide with extreme weather conditions (mainly strong winds, high temperatures and low atmospheric water vapour content) and high tourist presence. Many studies have reported that in the Mediterranean Basin the projected impacts of climate change will cause greater weather variability and extreme weather conditions, with drier and hotter summers and heat waves. At the long-term scale, climate change could affect the fuel load and the dead/live fuel ratio, and therefore could change the flammability of the vegetation. At the short-term scale, an increase in extreme weather events could directly affect fuel water status, and it could increase the occurrence of large fires. In this context, detecting the areas characterized by both a high probability of large fire occurrence and high fire severity could represent an important component of fire management planning. In this work we compared several fire probability and severity maps (fire occurrence, rate of spread, fireline intensity, flame length) obtained for a study area located in North Sardinia, Italy, using the FlamMap simulator (USDA Forest Service, Missoula). FlamMap computes the potential fire behaviour characteristics over a defined landscape for given weather, wind and fuel moisture data. Different weather and fuel moisture scenarios were tested to predict the potential impact of climate change on fire parameters. The study area, characterized by a mosaic of urban areas, protected areas, and other areas subject to anthropogenic disturbances, is mainly composed of fire-prone Mediterranean maquis. The input themes needed to run FlamMap were provided as 10-m resolution grids; the wind data, obtained using a computational fluid-dynamic model, were provided as a gridded file at 50-m resolution. The analysis revealed high fire probability and severity in

  7. Identification of land degradation evidences in an organic farm using probability maps (Croatia)

    Science.gov (United States)

    Pereira, Paulo; Bogunovic, Igor; Estebaranz, Ferran

    2017-04-01

    Land degradation is a biophysical process with important impacts on society, economy and policy. Areas affected by land degradation do not provide services of the quality and capacity needed to fulfil the needs of the communities that depend on them (Amaya-Romero et al., 2015; Beyene, 2015; Lanckriet et al., 2015). Agricultural activities are one of the main causes of land degradation (Kraaijvanger and Veldkamp, 2015), especially when they decrease soil organic matter (SOM), a crucial element for soil fertility. In temperate areas, the critical level of SOM concentration in agricultural soils is 3.4%; below this level there is a potential decrease of soil quality (Loveland and Webb, 2003). However, no previous work was carried out in other environments, such as the Mediterranean. It is important to identify and map the spatial distribution of potentially degraded land in order to target the areas that need restoration (Brevik et al., 2016; Pereira et al., 2017). The aim of this work is to assess the spatial distribution of areas with evidence of land degradation (SOM below 3.4%) using probability maps in an organic farm located in Croatia. In order to find the best method, we compared several probability methods: Ordinary Kriging (OK), Simple Kriging (SK), Universal Kriging (UK), Indicator Kriging (IK), Probability Kriging (PK) and Disjunctive Kriging (DK). The study area is located on the Istria peninsula (45°3' N; 14°2' E), with a total area of 182 ha. One hundred eighty-two soil samples (0-30 cm) were collected during July of 2015 and SOM was assessed using the wet combustion procedure. The assessment of the best probability method was carried out using the leave-one-out cross-validation method. The probability method with the lowest Root Mean Squared Error (RMSE) was considered the most accurate. The results showed that the best method to predict the probability of potential land degradation was SK with an RMSE of 0.635, followed by DK (RMSE=0.636), UK (RMSE=0.660), OK (RMSE
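
The workflow above (interpolate point SOM samples, rank methods by leave-one-out RMSE, then map the probability of falling below a threshold) can be sketched with scikit-learn, using Gaussian process regression as a stand-in for simple kriging (the two are closely related). All data below are synthetic placeholders, not the farm's samples.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.model_selection import LeaveOneOut

rng = np.random.default_rng(0)
coords = rng.uniform(0.0, 1000.0, size=(60, 2))               # sampling locations (m)
som = 3.0 + 0.002 * coords[:, 0] + rng.normal(0.0, 0.3, 60)   # synthetic SOM (%)

def loo_rmse(model, X, y):
    """Leave-one-out cross-validated RMSE, the criterion used to rank methods."""
    errors = []
    for train, test in LeaveOneOut().split(X):
        model.fit(X[train], y[train])
        errors.append(y[test][0] - model.predict(X[test])[0])
    return float(np.sqrt(np.mean(np.square(errors))))

gp = GaussianProcessRegressor(kernel=RBF(length_scale=200.0) + WhiteKernel(0.1),
                              normalize_y=True)
rmse = loo_rmse(gp, coords, som)

# Probability map: chance that SOM falls below the 3.4% threshold,
# from the Gaussian predictive distribution at each location.
mean, std = gp.fit(coords, som).predict(coords, return_std=True)
p_degraded = norm.cdf((3.4 - mean) / np.maximum(std, 1e-9))
```

Running `loo_rmse` for each candidate interpolator and keeping the lowest value mirrors the model-selection step reported in the abstract.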

  8. Ensemble based system for whole-slide prostate cancer probability mapping using color texture features.

    LENUS (Irish Health Repository)

    DiFranco, Matthew D

    2011-01-01

    We present a tile-based approach for producing clinically relevant probability maps of prostatic carcinoma in histological sections from radical prostatectomy. Our methodology incorporates ensemble learning for feature selection and classification on expert-annotated images. Random forest feature selection performed over varying training sets provides a subset of generalized CIEL*a*b* co-occurrence texture features, while sample selection strategies with minimal constraints reduce training data requirements to achieve reliable results. Ensembles of classifiers are built using expert-annotated tiles from training images, and scores for the probability of cancer presence are calculated from the responses of each classifier in the ensemble. Spatial filtering of tile-based texture features prior to classification results in increased heat-map coherence as well as AUC values of 95% using ensembles of either random forests or support vector machines. Our approach is designed for adaptation to different imaging modalities, image features, and histological decision domains.

  9. Quantitative Susceptibility Mapping of Human Brain Reflects Spatial Variation in Tissue Composition

    Science.gov (United States)

    Li, Wei; Wu, Bing; Liu, Chunlei

    2011-01-01

    Image phase from gradient echo MRI provides a unique contrast that reflects brain tissue composition variations, such as iron and myelin distribution. Phase imaging is emerging as a powerful tool for the investigation of functional brain anatomy and disease diagnosis. However, the quantitative value of phase is compromised by its nonlocal and orientation dependent properties. There is an increasing need for reliable quantification of magnetic susceptibility, the intrinsic property of tissue. In this study, we developed a novel and accurate susceptibility mapping method that is also phase-wrap insensitive. The proposed susceptibility mapping method utilized two complementary equations: (1) the Fourier relationship of phase and magnetic susceptibility; and (2) the first-order partial derivative of the first equation in the spatial frequency domain. In numerical simulation, this method reconstructed the susceptibility map almost free of streaking artifact. Further, the iterative implementation of this method allowed for high quality reconstruction of susceptibility maps of human brain in vivo. The reconstructed susceptibility map provided excellent contrast of iron-rich deep nuclei and white matter bundles from surrounding tissues. Further, it also revealed anisotropic magnetic susceptibility in brain white matter. Hence, the proposed susceptibility mapping method may provide a powerful tool for the study of brain physiology and pathophysiology. Further elucidation of anisotropic magnetic susceptibility in vivo may allow us to gain more insight into the white matter microarchitectures. PMID:21224002
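
The two complementary k-space relations referred to in the abstract can be written out explicitly. Sign conventions for the phase vary between implementations; a common form, with the main field B0 along z, TE the echo time and γ the gyromagnetic ratio, is:

```latex
% Fourier-domain phase--susceptibility relation (unit dipole kernel D):
\frac{\varphi(\mathbf{k})}{\gamma B_0\,\mathrm{TE}}
   = D(\mathbf{k})\,\chi(\mathbf{k}),
\qquad
D(\mathbf{k}) = \frac{1}{3} - \frac{k_z^{2}}{|\mathbf{k}|^{2}},
% and its first-order partial derivatives in the spatial-frequency domain:
\frac{\partial}{\partial k_i}\bigl[D(\mathbf{k})\,\chi(\mathbf{k})\bigr]
   = \frac{1}{\gamma B_0\,\mathrm{TE}}\,
     \frac{\partial \varphi(\mathbf{k})}{\partial k_i},
\qquad i \in \{x, y, z\}.
```

The kernel D vanishes on the cone k_z² = |k|²/3, so direct division by D is ill-conditioned there; the derivative equation supplies the independent constraints near the cone that suppress the streaking artifacts mentioned above.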

  10. Cytoarchitecture, probability maps and functions of the human frontal pole.

    Science.gov (United States)

    Bludau, S; Eickhoff, S B; Mohlberg, H; Caspers, S; Laird, A R; Fox, P T; Schleicher, A; Zilles, K; Amunts, K

    2014-06-01

    The frontal pole has expanded more than any other part of the human brain compared to our ancestors. It plays an important role in specifically human behavior and cognitive abilities, e.g. action selection (Kovach et al., 2012). Evidence of divergent functions of its medial and lateral parts has been provided, both in the healthy brain and in psychiatric disorders. The anatomical correlates of such functional segregation, however, are still unknown due to a lack of stereotaxic, microstructural maps obtained in a representative sample of brains. Here we show that the human frontopolar cortex consists of two cytoarchitectonically and functionally distinct areas: lateral frontopolar area 1 (Fp1) and medial frontopolar area 2 (Fp2). Based on observer-independent mapping in serial, cell-body-stained sections of 10 brains, three-dimensional probabilistic maps of areas Fp1 and Fp2 were created. They show, for each position of the reference space, the probability with which each area was found in a particular voxel. Applying these maps as seed regions for a meta-analysis revealed that Fp1 and Fp2 differentially contribute to functional networks: Fp1 was involved in cognition, working memory and perception, whereas Fp2 was part of brain networks underlying affective processing and social cognition. The present study thus disclosed cortical correlates of a functional segregation of the human frontopolar cortex. The probabilistic maps provide a sound anatomical basis for interpreting neuroimaging data in the living human brain, and open new perspectives for analyzing structure-function relationships in the prefrontal cortex. The new data will also serve as a starting point for further comparative studies between human and non-human primate brains. This allows finding similarities and differences in the organizational principles of the frontal lobe during evolution as a neurobiological basis for our behavior and cognitive abilities. Copyright © 2013 Elsevier Inc. All

  11. Mapping absolute tissue endogenous fluorophore concentrations with chemometric wide-field fluorescence microscopy

    Science.gov (United States)

    Xu, Zhang; Reilley, Michael; Li, Run; Xu, Min

    2017-06-01

    We report chemometric wide-field fluorescence microscopy for imaging the spatial distribution and concentration of endogenous fluorophores in thin tissue sections. Nonnegative factorization aided by spatial diversity is used to learn both the spectral signature and the spatial distribution of endogenous fluorophores from microscopic fluorescence color images obtained under broadband excitation and detection. The absolute concentration map of individual fluorophores is derived by comparing the fluorescence from "pure" fluorophores under the identical imaging condition following the identification of the fluorescence species by its spectral signature. This method is then demonstrated by characterizing the concentration map of endogenous fluorophores (including tryptophan, elastin, nicotinamide adenine dinucleotide, and flavin adenine dinucleotide) for lung tissue specimens. The absolute concentrations of these fluorophores are all found to decrease significantly from normal, perilesional, to cancerous (squamous cell carcinoma) tissue. Discriminating tissue types using the absolute fluorophore concentration is found to be significantly more accurate than that achievable with the relative fluorescence strength. Quantification of fluorophores in terms of the absolute concentration map is also advantageous in eliminating the uncertainties due to system responses or measurement details, yielding more biologically relevant data, and simplifying the assessment of competing imaging approaches.
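
The nonnegative factorization step can be illustrated with scikit-learn's NMF on synthetic data; the two Gaussian-shaped "fluorophore" spectra below are hypothetical placeholders, not the endogenous fluorophore signatures from the study.

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(1)
channels = np.arange(30)

# Hypothetical emission spectra of two fluorophores (rows of H).
spectra = np.stack([np.exp(-((channels - 8) ** 2) / 20.0),
                    np.exp(-((channels - 20) ** 2) / 30.0)])

# Synthetic image of 100 pixels: nonnegative mixtures plus a little noise.
abundance = rng.uniform(0.0, 1.0, size=(100, 2))
pixels = abundance @ spectra + rng.uniform(0.0, 0.01, size=(100, 30))

# Factor pixels ~= W @ H: W holds the per-pixel abundance (relative
# concentration) maps, H the learned spectral signatures.
model = NMF(n_components=2, init="nndsvda", max_iter=500, random_state=0)
W = model.fit_transform(pixels)
H = model.components_
rel_err = np.linalg.norm(pixels - W @ H) / np.linalg.norm(pixels)
```

As the abstract notes, turning the relative abundances in W into absolute concentrations then requires the fluorescence of "pure" fluorophores measured under identical imaging conditions.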

  12. Cardiac tissue slices: preparation, handling, and successful optical mapping.

    Science.gov (United States)

    Wang, Ken; Lee, Peter; Mirams, Gary R; Sarathchandra, Padmini; Borg, Thomas K; Gavaghan, David J; Kohl, Peter; Bollensdorff, Christian

    2015-05-01

    Cardiac tissue slices are becoming increasingly popular as a model system for cardiac electrophysiology and pharmacology research and development. Here, we describe in detail the preparation, handling, and optical mapping of transmembrane potential and intracellular free calcium concentration transients (CaT) in ventricular tissue slices from guinea pigs and rabbits. Slices cut in the epicardium-tangential plane contained well-aligned in-slice myocardial cell strands ("fibers") in subepicardial and midmyocardial sections. Cut with a high-precision slow-advancing microtome at a thickness of 350 to 400 μm, tissue slices preserved essential action potential (AP) properties of the precutting Langendorff-perfused heart. We identified the need for a postcutting recovery period of 36 min (guinea pig) and 63 min (rabbit) to reach 97.5% of final steady-state values for AP duration (APD) (identified by exponential fitting). There was no significant difference between the postcutting recovery dynamics in slices obtained using 2,3-butanedione 2-monoxime or blebbistatin as electromechanical uncouplers during the cutting process. A rapid increase in APD, seen after cutting, was caused by exposure to ice-cold solution during the slicing procedure, not by tissue injury, differences in uncouplers, or pH-buffers (bicarbonate; HEPES). To characterize intrinsic patterns of CaT, AP, and conduction, a combination of multipoint and field stimulation should be used to avoid misinterpretation based on source-sink effects. In summary, we describe in detail the preparation, mapping, and data analysis approaches for reproducible cardiac tissue slice-based investigations into AP and CaT dynamics. Copyright © 2015 the American Physiological Society.

  13. Decomposing the Hounsfield unit: probabilistic segmentation of brain tissue in computed tomography.

    Science.gov (United States)

    Kemmling, A; Wersching, H; Berger, K; Knecht, S; Groden, C; Nölte, I

    2012-03-01

    The aim of this study was to present and evaluate a standardized technique for brain segmentation of cranial computed tomography (CT) using probabilistic partial volume tissue maps based on a database of high resolution T1 magnetic resonance images (MRI). Probabilistic tissue maps of white matter (WM), gray matter (GM) and cerebrospinal fluid (CSF) were derived from 600 normal brain MRIs (3.0 Tesla, T1-3D-turbo-field-echo) of 2 large community-based population studies (BiDirect and SEARCH Health studies). After partial tissue segmentation (FAST 4.0), MR images were linearly registered to MNI-152 standard space (FLIRT 5.5) with non-linear refinement (FNIRT 1.0) to obtain non-binary probabilistic volume images for each tissue class which were subsequently used for CT segmentation. From 150 normal cerebral CT scans a customized reference image in standard space was constructed with iterative non-linear registration to MNI-152 space. The inverse warp of tissue-specific probability maps to CT space (MNI-152 to individual CT) was used to decompose a CT image into tissue specific components (GM, WM, CSF). Potential benefits and utility of this novel approach with regard to unsupervised quantification of CT images and possible visual enhancement are addressed. Illustrative examples of tissue segmentation in different pathological cases including perfusion CT are presented. Automated tissue segmentation of cranial CT images using highly refined tissue probability maps derived from high resolution MR images is feasible. Potential applications include automated quantification of WM in leukoaraiosis, CSF in hydrocephalic patients, GM in neurodegeneration and ischemia and perfusion maps with separate assessment of GM and WM.
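
The final decomposition step described above (splitting a CT image into tissue-specific components using warped probability maps) reduces to a per-voxel multiplication. A toy numpy sketch with random stand-in data, not actual CT or MR-derived maps:

```python
import numpy as np

rng = np.random.default_rng(2)
ct = rng.uniform(0.0, 60.0, size=(4, 4))   # toy CT slice, Hounsfield units

# Stand-ins for the warped GM/WM/CSF probability maps: per voxel the
# three tissue probabilities are normalized to sum to one.
raw = rng.uniform(size=(3, 4, 4))
p = raw / raw.sum(axis=0)

# Decompose the Hounsfield units into tissue-specific components; by
# construction the components add back up to the original image.
gm, wm, csf = (ct * p[i] for i in range(3))

# Unsupervised quantification: expected voxel count per tissue class.
tissue_volumes = p.sum(axis=(1, 2))
```

Scaling `tissue_volumes` by the voxel volume would give the automated WM, GM and CSF volume estimates the abstract proposes for leukoaraiosis, hydrocephalus and atrophy quantification.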

  14. Raman spectroscopic biochemical mapping of tissues

    Science.gov (United States)

    Stone, Nicholas; Hart Prieto, Maria C.; Kendall, Catherine A.; Shetty, Geeta; Barr, Hugh

    2006-02-01

    Advances in technologies have brought us closer to routine spectroscopic diagnosis of early malignant disease. However, there is still a poor understanding of the carcinogenesis process. For example it is not known whether many cancers follow a logical sequence from dysplasia, to carcinoma in situ, to invasion. Biochemical tissue changes, triggered by genetic mutations, precede morphological and structural changes. These can be probed using Raman or FTIR microspectroscopy and the spectra analysed for biochemical constituents. Local microscopic distribution of various constituents can then be visualised. Raman mapping has been performed on a number of tissues including oesophagus, breast, bladder and prostate. The biochemical constituents have been calculated at each point using basis spectra and least squares analysis. The residual of the least squares fit indicates any unfit spectral components. The biochemical distribution will be compared with the defined histopathological boundaries. The distribution of nucleic acids, glycogen, actin, collagen I, III, IV, lipids and others appear to follow expected patterns.
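
The constituent calculation described above (basis spectra fitted by least squares, with the residual flagging unfit spectral components) is commonly implemented with nonnegative least squares, since concentrations cannot be negative. A sketch with randomly generated stand-in spectra; the constituent names and dimensions are illustrative:

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(3)

# Hypothetical basis spectra (columns) for three constituents, e.g.
# collagen, lipid and nucleic acid, sampled at 50 Raman shifts.
basis = rng.uniform(size=(50, 3))

# Simulated measured spectrum: a known nonnegative mixture plus noise.
true_conc = np.array([0.5, 0.2, 1.0])
measured = basis @ true_conc + rng.normal(0.0, 0.01, size=50)

# Nonnegative least squares fit; the residual norm indicates spectral
# components missing from the basis set.
conc, residual = nnls(basis, measured)
```

Repeating the fit at every map point yields per-constituent concentration images whose boundaries can be compared with histopathology, as described in the abstract.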

  15. Hypothyroidism after primary radiotherapy for head and neck squamous cell carcinoma: Normal tissue complication probability modeling with latent time correction

    DEFF Research Database (Denmark)

    Rønjom, Marianne Feen; Brink, Carsten; Bentzen, Søren

    2013-01-01

    To develop a normal tissue complication probability (NTCP) model of radiation-induced biochemical hypothyroidism (HT) after primary radiotherapy for head and neck squamous cell carcinoma (HNSCC) with adjustment for latency and clinical risk factors.

  16. Collapse susceptibility mapping in karstified gypsum terrain (Sivas basin - Turkey) by conditional probability, logistic regression, artificial neural network models

    Science.gov (United States)

    Yilmaz, Isik; Keskin, Inan; Marschalko, Marian; Bednarik, Martin

    2010-05-01

    This study compares GIS-based collapse susceptibility mapping methods, namely conditional probability (CP), logistic regression (LR) and artificial neural networks (ANN), applied to gypsum rock masses in the Sivas basin (Turkey). A Digital Elevation Model (DEM) was first constructed using GIS software. Collapse-related factors, directly or indirectly related to the causes of collapse occurrence, such as distance from faults, slope angle and aspect, topographical elevation, distance from drainage, topographic wetness index (TWI), stream power index (SPI), Normalized Difference Vegetation Index (NDVI) as a measure of vegetation cover, and distance from roads and settlements, were used in the collapse susceptibility analyses. In the last stage of the analyses, collapse susceptibility maps were produced from the CP, LR and ANN models, and they were then compared by means of their validation. Area Under Curve (AUC) values obtained from all three methodologies showed that the map obtained from the ANN model is more accurate than those from the other models, and the results also showed that artificial neural networks are a useful tool in the preparation of collapse susceptibility maps and highly compatible with GIS operating features. Key words: Collapse; doline; susceptibility map; gypsum; GIS; conditional probability; logistic regression; artificial neural networks.

  17. Mapping probabilities of extreme continental water storage changes from space gravimetry

    Science.gov (United States)

    Kusche, J.; Eicker, A.; Forootan, E.; Springer, A.; Longuevergne, L.

    2016-12-01

    Using data from the Gravity Recovery and Climate Experiment (GRACE) mission, we derive statistically robust 'hotspot' regions of high probability of peak anomalous (i.e., relative to the seasonal cycle) water storage (up to 0.7 m one-in-five-year return level) and flux (up to 0.14 m/month). Analysis of, and comparison with, up to 32 years of ERA-Interim reanalysis fields reveals generally good agreement of these hotspot regions with GRACE results, with most exceptions located in the Tropics. However, a simulation experiment reveals that the differences observed by GRACE are statistically significant, and further error analysis suggests that by around the year 2020 it will be possible to detect temporal changes in the frequency of extreme total fluxes (i.e. the combined effects of mainly precipitation and floods) for at least 10-20% of the continental area, assuming a continuation of GRACE by its follow-on mission GRACE-FO. J. Kusche et al. (2016): Mapping probabilities of extreme continental water storage changes from space gravimetry, Geophysical Research Letters, accepted online, doi:10.1002/2016GL069538

  18. Impact of statistical learning methods on the predictive power of multivariate normal tissue complication probability models

    NARCIS (Netherlands)

    Xu, Cheng-Jian; van der Schaaf, Arjen; Schilstra, Cornelis; Langendijk, Johannes A.; van t Veld, Aart A.

    2012-01-01

    PURPOSE: To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. METHODS AND MATERIALS: In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator

  19. Radiation risk of tissue late effects, a net consequence of probabilities of various cellular responses

    International Nuclear Information System (INIS)

    Feinendegen, L.E.

    1991-01-01

    Late effects from exposure to low doses of ionizing radiation are hardly, or not at all, observed in man, mainly due to the low values of risk coefficients, which preclude statistical analyses of data from populations exposed to doses of less than 0.2 Gy. In order to arrive at an assessment of the potential risk from radiation exposure in the low-dose range, the microdosimetry approach is essential. In the low-dose range, ionizing radiation generates particle tracks, mainly electrons, which are distributed rather heterogeneously within the exposed tissue. Taking the individual cell as the elemental unit of life, observations and calculations of cellular responses to being hit by energy deposition events of the low-LET type are analysed. It emerges that besides the probability of a hit cell sustaining a detrimental effect with the consequence of malignant transformation, there are probabilities of various adaptive responses that equip the hit cell with a benefit. On the one hand, an improvement of cellular radical detoxification was observed in mouse bone marrow cells; another adaptive response, pertaining to improved DNA repair, was reported for human lymphocytes. The improved radical detoxification in mouse bone marrow cells lasts for a period of 5-10 hours, and the improved DNA repair in human lymphocytes was seen for some 60 hours following acute irradiation. It is speculated that improved radical detoxification and improved DNA repair may reduce the probability of spontaneous carcinogenesis. Thus it is proposed to weigh the probability of detriment for a hit cell within a multicellular system against the probability of benefit through adaptive responses in other hit cells in the same system per radiation exposure. In doing this, the net effect of low doses of low-LET radiation in tissue, with individual cells being hit by energy deposition events, could be zero or even beneficial. (orig./MG)

  20. MAP3K8 (TPL2/COT) affects obesity-induced adipose tissue inflammation without systemic effects in humans and in mice.

    Directory of Open Access Journals (Sweden)

    Dov B Ballak

    Full Text Available Chronic low-grade inflammation in adipose tissue often accompanies obesity, leading to insulin resistance and increasing the risk for metabolic diseases. MAP3K8 (TPL2/COT) is an important signal transducer and activator of pro-inflammatory pathways that has been linked to obesity-induced adipose tissue inflammation. We used human adipose tissue biopsies to study the relationship of MAP3K8 expression with markers of obesity and expression of the pro-inflammatory cytokines IL-1β, IL-6 and IL-8. Moreover, we evaluated obesity-induced adipose tissue inflammation and insulin resistance in mice lacking MAP3K8 and WT mice on a high-fat diet (HFD) for 16 weeks. Individuals with a BMI >30 displayed higher mRNA expression of MAP3K8 in adipose tissue compared to individuals with a normal BMI. Additionally, high mRNA expression levels of IL-1β, IL-6 and IL-8, but not TNF-α, in human adipose tissue were associated with higher expression of MAP3K8. Moreover, high plasma SAA and CRP did not associate with increased MAP3K8 expression in adipose tissue. Similarly, no association was found for MAP3K8 expression with plasma insulin or glucose levels. Mice lacking MAP3K8 had similar bodyweight gain to WT mice, yet displayed lower mRNA expression levels of IL-1β, IL-6 and CXCL1 in adipose tissue in response to the HFD as compared to WT animals. However, MAP3K8-deficient mice were not protected against HFD-induced adipose tissue macrophage infiltration or the development of insulin resistance. Together, the data in both human and mouse show that MAP3K8 is involved in local adipose tissue inflammation, specifically for IL-1β and its responsive cytokines IL-6 and IL-8, but does not seem to have systemic effects on insulin resistance.

  1. METABOLIC MAPPING BY ENZYME HISTOCHEMISTRY IN LIVING ANIMALS, TISSUES AND CELLS

    NARCIS (Netherlands)

    van Noorden, C. J. F.

    2009-01-01

    Imaging of reporter molecules such as fluorescent proteins in intact animals, tissues and cells has become an indispensable tool in cell biology. Imaging the activity of enzymes, which is called metabolic mapping, provides information on subcellular localisation in combination with the function of the enzymes

  2. The use of normal tissue complication probability to predict radiation hepatitis

    International Nuclear Information System (INIS)

    Keum, Ki Chang; Seong, Jin Sil; Suh, Chang Ok; Lee, Sang Wook; Chung, Eun Ji; Shin, Hyun Soo; Kim, Gwi Eon

    2000-01-01

    Although it has been known that the tolerance of the liver to external beam irradiation depends on the irradiated volume and dose, few data exist which quantify this dependence. Recently, however, with the development of three-dimensional (3-D) treatment planning, the tools to quantify the relationships between dose, volume, and normal tissue complications have become available. The objective of this study is to investigate the relationship between normal tissue complication probability (NTCP) and the risk of radiation hepatitis for patients who received partial liver irradiation at varying doses. From March 1992 to December 1994, 10 patients with hepatoma and 10 patients with bile duct cancer were included in this study. Eighteen patients had normal hepatic function, but 2 patients (prothrombin time 73%, 68%) had mild liver cirrhosis before irradiation. Radiation therapy was delivered with a 10 MV linear accelerator at 180-200 cGy per fraction per day. The total dose ranged from 3,960 cGy to 6,000 cGy (median dose 5,040 cGy). The normal tissue complication probability was calculated by using Lyman's model. Radiation hepatitis was defined as the development of anicteric elevation of alkaline phosphatase of at least twofold and non-malignant ascites in the absence of documented progressive disease. The calculated NTCP ranged from 0.001 to 0.840 (median 0.05). Three of the 20 patients developed radiation hepatitis. The NTCP values of the patients with radiation hepatitis were 0.390, 0.528 and 0.844 (median: 0.58±0.23), while those of the patients without radiation hepatitis ranged from 0.001 to 0.308 (median: 0.09±0.09). When the NTCP was calculated using a volume factor of 0.32, radiation hepatitis was observed only in patients with an NTCP value of more than 0.39. By contrast, the clinical occurrence of radiation hepatitis was not well correlated with the NTCP value calculated when a volume factor of 0.69 was applied. On the basis of these observations, a volume factor of 0.32 was more

  3. Fully automatized renal parenchyma volumetry using a support vector machine based recognition system for subject-specific probability map generation in native MR volume data

    Science.gov (United States)

    Gloger, Oliver; Tönnies, Klaus; Mensel, Birger; Völzke, Henry

    2015-11-01

    In epidemiological studies as well as in clinical practice, the amount of medical image data produced has increased strongly in the last decade. In this context, organ segmentation in MR volume data has gained increasing attention for medical applications. Especially in large-scale population-based studies, organ volumetry is highly relevant and requires exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automated methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework, a two-step probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are subsequently refined using several extended segmentation strategies. We present a three-class support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high-quality subject-specific parenchyma probability maps. Several refinement strategies, including a final shape-based 3D level set segmentation technique, are used in subsequent processing modules to segment the renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from the parenchymal volume, which is important for analyzing renal function. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches.
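
    The Fourier descriptors used as shape features above can be illustrated with a generic construction; the paper's exact normalization is not specified here, so the invariances below (translation, scale, rotation/start point) are one common choice.

```python
import numpy as np

def fourier_descriptors(contour, k=8):
    """Shape features from a closed 2-D contour (N x 2 array of x, y).

    The boundary is treated as a complex signal z = x + iy; its low-order
    Fourier coefficients capture coarse shape. Dropping the DC term gives
    translation invariance, dividing by the first retained magnitude gives
    scale invariance, and taking magnitudes discards rotation/start-point
    phase.
    """
    z = contour[:, 0] + 1j * contour[:, 1]
    c = np.fft.fft(z)
    c = c[1:k + 1]                   # discard DC term (translation)
    return np.abs(c) / np.abs(c[0])  # scale + rotation invariant

# A circle sampled at 64 points: energy concentrates in the first coefficient.
t = np.linspace(0.0, 2.0 * np.pi, 64, endpoint=False)
circle = np.stack([np.cos(t), np.sin(t)], axis=1)
fd = fourier_descriptors(circle)
```

    Descriptor vectors like `fd` can then be fed to a classifier (here, the SVM) to recognize characteristic parenchyma shapes.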

  4. Fully automatized renal parenchyma volumetry using a support vector machine based recognition system for subject-specific probability map generation in native MR volume data

    International Nuclear Information System (INIS)

    Gloger, Oliver; Völzke, Henry; Tönnies, Klaus; Mensel, Birger

    2015-01-01

    In epidemiological studies as well as in clinical practice, the amount of medical image data produced has increased strongly in the last decade. In this context, organ segmentation in MR volume data has gained increasing attention for medical applications. Especially in large-scale population-based studies, organ volumetry is highly relevant and requires exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automated methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework, a two-step probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are subsequently refined using several extended segmentation strategies. We present a three-class support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high-quality subject-specific parenchyma probability maps. Several refinement strategies, including a final shape-based 3D level set segmentation technique, are used in subsequent processing modules to segment the renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from the parenchymal volume, which is important for analyzing renal function. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches. (paper)

  5. Evaluation of carrier collection probability in bifacial interdigitated-back-contact crystalline silicon solar cells by the internal quantum efficiency mapping method

    Science.gov (United States)

    Tachibana, Tomihisa; Tanahashi, Katsuto; Mochizuki, Toshimitsu; Shirasawa, Katsuhiko; Takato, Hidetaka

    2018-04-01

    Bifacial interdigitated-back-contact (IBC) silicon solar cells with a high bifaciality of 0.91 were fabricated. Screen printing and firing technology were used to reduce the production cost. For the first time, the relationship between the rear-side structure and the carrier collection probability was evaluated using internal quantum efficiency (IQE) mapping. The measurement results showed that the screen-printed electrode and back surface field (BSF) areas led to low IQE. The low carrier collection probability in the BSF area can be explained by electrical shading effects. These results make it clear that IQE mapping is useful for evaluating IBC cells.
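
    The IQE maps discussed above are derived point by point from the measured external quantum efficiency and front-surface reflectance. A minimal sketch of the standard relation, assuming transmission losses are negligible:

```python
def internal_quantum_efficiency(eqe, reflectance):
    """IQE from measured EQE and front-surface reflectance R:
    IQE = EQE / (1 - R), i.e. the fraction of *absorbed* photons that
    yields collected carriers. Transmission losses are neglected."""
    if not 0.0 <= reflectance < 1.0:
        raise ValueError("reflectance must lie in [0, 1)")
    return eqe / (1.0 - reflectance)

# A pixel measuring EQE = 0.45 with 10% reflectance has IQE = 0.5;
# mapping this pixel-by-pixel over the rear side reveals low-collection
# regions such as the BSF area described in the abstract.
iqe = internal_quantum_efficiency(0.45, 0.1)
```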

  6. Options and pitfalls of normal tissues complication probability models

    International Nuclear Information System (INIS)

    Dorr, Wolfgang

    2011-01-01

    Full text: Technological improvements in the physical administration of radiotherapy have led to increasing conformation of the treatment volume (TV) with the planning target volume (PTV) and of the irradiated volume (IV) with the TV. In this process of improving the physical quality of radiotherapy, the total volumes of organs at risk exposed to significant doses have decreased significantly, resulting in increased inhomogeneities in the dose distributions within these organs. This has resulted in a need to identify and quantify volume effects in different normal tissues. Today, irradiated volume must be considered a 6th 'R' of radiotherapy, in addition to the 5 'Rs' defined by Withers and Steel in the mid-to-late 1980s. The current status of knowledge of these volume effects has recently been summarized for many organs and tissues by the QUANTEC (Quantitative Analysis of Normal Tissue Effects in the Clinic) initiative [Int. J. Radiat. Oncol. Biol. Phys. 76 (3) Suppl., 2010]. However, the concept of using dose-volume histogram parameters as a basis for dose constraints, even without applying any models for normal tissue complication probability (NTCP), is based on assumptions that are not met in routine clinical treatment planning. First, and most important, dose-volume histogram (DVH) parameters are usually derived from a single 'snapshot' CT scan, without considering physiological (urinary bladder, intestine) or radiation-induced (edema, patient weight loss) changes during radiotherapy. Also, individual variations, or different institutional strategies for delineating organs at risk, are rarely considered. Moreover, the reduction of the 3-dimensional dose distribution into a 2-dimensional DVH parameter implies that the localization of the dose within an organ is irrelevant; there are ample examples that this assumption is not justified. Routinely used dose constraints also do not take into account that the residual function of an organ may be

  7. Towards optical spectroscopic anatomical mapping (OSAM) for lesion validation in cardiac tissue (Conference Presentation)

    Science.gov (United States)

    Singh-Moon, Rajinder P.; Zaryab, Mohammad; Hendon, Christine P.

    2017-02-01

    Electroanatomical mapping (EAM) is an invaluable tool for guiding cardiac radiofrequency ablation (RFA) therapy. The principal roles of EAM are the identification of candidate ablation sites by detecting regions of abnormal electrogram activity, and lesion validation subsequent to RF energy delivery. However, incomplete lesions may present interim electrical inactivity similar to effective treatment in the acute setting, despite efforts to reveal them with pacing or drugs such as adenosine. Studies report that the misidentification and recovery of such lesions is a leading cause of arrhythmia recurrence and repeat procedures. In previous work, we demonstrated spectroscopic characterization of cardiac tissues using a fiber-optic-integrated RF ablation catheter. In this work, we introduce OSAM (optical spectroscopic anatomical mapping), the application of this spectroscopic technique to obtain 2-dimensional biodistribution maps. We demonstrate its diagnostic potential as an auxiliary method for lesion validation in treated swine preparations. Endocardial lesion sets were created on fresh swine cardiac samples using a commercial RFA system. An optically integrated catheter console fabricated in-house was used to measure tissue optical spectra between 600 and 1000 nm. Three-dimensional spatio-spectral datasets were generated by raster scanning of the optical catheter across the treated sample surface in the presence of whole blood. Tissue optical parameters were recovered at each spatial position using an inverse Monte Carlo method. OSAM biodistribution maps showed strong correspondence with gross examination of tetrazolium chloride-stained tissue specimens. Specifically, we demonstrate the ability of OSAM to readily distinguish between shallow and deeper lesions, a limitation faced by current EAM techniques. These results showcase OSAM's potential for lesion validation strategies in the treatment of cardiac arrhythmias.

  8. Spatial cluster analysis of nanoscopically mapped serotonin receptors for classification of fixed brain tissue

    Science.gov (United States)

    Sams, Michael; Silye, Rene; Göhring, Janett; Muresan, Leila; Schilcher, Kurt; Jacak, Jaroslaw

    2014-01-01

    We present a spatial cluster analysis method that uses nanoscopic dSTORM images to determine changes in protein cluster distributions within brain tissue. Such methods are suitable for investigating human brain tissue and will help to achieve a deeper understanding of brain disease, along with aiding drug development. Human brain tissue samples are usually treated postmortem via standard fixation protocols, which are established in clinical laboratories. Therefore, our localization microscopy-based method was adapted to characterize protein density and protein cluster localization in samples fixed using different protocols, followed by common fluorescent immunohistochemistry techniques. Localization microscopy allows nanoscopic mapping of serotonin 5-HT1A receptor groups within a two-dimensional image of a brain tissue slice. These nanoscopically mapped proteins can be confined to clusters by applying the proposed statistical spatial analysis. Selected features of such clusters were subsequently used to characterize and classify the tissue. Samples were obtained from different types of patients, fixed with different preparation methods, and finally stored in a human tissue bank. To verify the proposed method, samples of a cryopreserved healthy brain were compared with epitope-retrieved and paraffin-fixed tissues. Furthermore, samples of healthy brain tissue were compared with data obtained from patients suffering from mental illnesses (e.g., major depressive disorder). Our work demonstrates the applicability of localization microscopy and image analysis methods for the comparison and classification of human brain tissue at the nanoscopic level. Furthermore, the presented workflow marks a unique technological advance in the characterization of protein distributions in brain tissue sections.
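
    A minimal stand-in for the clustering step: single-linkage (friends-of-friends) grouping of 2-D localization coordinates, where points closer than a distance threshold share a label. The paper's actual cluster features (density, size, etc.) would be computed on top of labels like these; the threshold is a free parameter, and this brute-force version is only practical for small point sets.

```python
import numpy as np

def link_clusters(points, radius):
    """Label 2-D localizations: points closer than `radius` end up in the
    same cluster (union-find over all close pairs)."""
    n = len(points)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    for i, j in zip(*np.nonzero(np.triu(d < radius, k=1))):
        parent[find(i)] = find(j)

    labels = np.array([find(i) for i in range(n)])
    _, labels = np.unique(labels, return_inverse=True)  # relabel 0..k-1
    return labels

# Two tight groups of localizations separated by a large gap:
pts = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1], [5.0, 5.0], [5.1, 5.0]])
labels = link_clusters(pts, radius=0.5)
```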

  9. Modelling the probability of building fires

    Directory of Open Access Journals (Sweden)

    Vojtěch Barták

    2014-12-01

    Full Text Available Systematic spatial risk analysis plays a crucial role in preventing emergencies. In the Czech Republic, risk mapping is currently based on the risk accumulation principle, area vulnerability, and preparedness levels of Integrated Rescue System components. Expert estimates are used to determine risk levels for individual hazard types, while statistical modelling based on data from actual incidents and their possible causes is not used. Our model study, conducted in cooperation with the Fire Rescue Service of the Czech Republic within the Liberec and Hradec Králové regions, presents an analytical procedure leading to the creation of building fire probability maps based on recent incidents in the studied areas and on building parameters. In order to estimate the probability of building fires, a prediction model based on logistic regression was used. The probability of fire calculated by means of model parameters and attributes of specific buildings can subsequently be visualized in probability maps.
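
    The per-building probability described above follows directly from the logistic model. In the study the coefficients would be fitted to past incident data; the feature set and numbers below are invented purely for illustration.

```python
import math

def fire_probability(features, coef, intercept):
    """Probability of a building fire from a fitted logistic regression:
    p = 1 / (1 + exp(-(b0 + b . x)))."""
    z = intercept + sum(b * x for b, x in zip(coef, features))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical model: features = [log(floor area in m^2), building age / 100]
coef, intercept = [0.8, 1.2], -6.0
p_small_new = fire_probability([math.log(120.0), 0.1], coef, intercept)
p_large_old = fire_probability([math.log(2500.0), 0.9], coef, intercept)
```

    Evaluating this for every building in a region and coloring each footprint by its probability yields the kind of map the abstract describes.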

  10. Assessing future vent opening locations at the Somma-Vesuvio volcanic complex: 2. Probability maps of the caldera for a future Plinian/sub-Plinian event with uncertainty quantification

    Science.gov (United States)

    Tadini, A.; Bevilacqua, A.; Neri, A.; Cioni, R.; Aspinall, W. P.; Bisson, M.; Isaia, R.; Mazzarini, F.; Valentine, G. A.; Vitale, S.; Baxter, P. J.; Bertagnini, A.; Cerminara, M.; de Michieli Vitturi, M.; Di Roberto, A.; Engwell, S.; Esposti Ongaro, T.; Flandoli, F.; Pistolesi, M.

    2017-06-01

    In this study, we combine reconstructions of volcanological data sets and inputs from a structured expert judgment to produce a first long-term probability map for vent opening location for the next Plinian or sub-Plinian eruption of Somma-Vesuvio. In the past, the volcano has exhibited significant spatial variability in vent location; this can exert a significant control on where hazards materialize (particularly of pyroclastic density currents). The new vent opening probability mapping has been performed through (i) development of spatial probability density maps with Gaussian kernel functions for different data sets and (ii) weighted linear combination of these spatial density maps. The epistemic uncertainties affecting these data sets were quantified explicitly with expert judgments and implemented following a doubly stochastic approach. Various elicitation pooling metrics and subgroupings of experts and target questions were tested to evaluate the robustness of outcomes. Our findings indicate that (a) Somma-Vesuvio vent opening probabilities are distributed inside the whole caldera, with a peak corresponding to the area of the present crater, but with more than 50% probability that the next vent could open elsewhere within the caldera; (b) there is a mean probability of about 30% that the next vent will open west of the present edifice; (c) there is a mean probability of about 9.5% that the next medium-large eruption will enlarge the present Somma-Vesuvio caldera, and (d) there is a nonnegligible probability (mean value of 6-10%) that the next Plinian or sub-Plinian eruption will have its initial vent opening outside the present Somma-Vesuvio caldera.
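
    Steps (i) and (ii) above can be sketched directly: a Gaussian-kernel density map per data set, then a weighted linear combination. The grid, bandwidths, and weights below are arbitrary; in the study the weights come from the structured expert elicitation.

```python
import numpy as np

def kernel_density_map(points, grid_x, grid_y, bandwidth):
    """Step (i): spatial probability density map from past vent locations
    using isotropic Gaussian kernels, normalized to sum to 1 on the grid."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    dens = np.zeros_like(gx, dtype=float)
    for x, y in points:
        dens += np.exp(-((gx - x) ** 2 + (gy - y) ** 2) / (2.0 * bandwidth ** 2))
    return dens / dens.sum()

def combine_maps(maps, weights):
    """Step (ii): weighted linear combination of per-dataset density maps."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return sum(wi * m for wi, m in zip(w, maps))

grid = np.linspace(0.0, 10.0, 50)
m1 = kernel_density_map([(3.0, 3.0), (3.5, 3.2)], grid, grid, bandwidth=1.0)
m2 = kernel_density_map([(7.0, 6.0)], grid, grid, bandwidth=1.5)
vent_map = combine_maps([m1, m2], weights=[0.7, 0.3])
```

    Because each input map is normalized and the weights sum to one, the combined map remains a probability map; the doubly stochastic treatment in the paper additionally samples the weights and bandwidths to propagate epistemic uncertainty.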

  11. High-spatial-resolution mapping of the oxygen concentration in cortical tissue (Conference Presentation)

    Science.gov (United States)

    Jaswal, Rajeshwer S.; Yaseen, Mohammad A.; Fu, Buyin; Boas, David A.; Sakadžic, Sava

    2016-03-01

    Due to a lack of imaging tools for high-resolution imaging of cortical tissue oxygenation, detailed maps of the oxygen partial pressure (PO2) around arterioles, venules, and capillaries remain largely unknown. Therefore, we have limited knowledge about the mechanisms that secure sufficient oxygen delivery in microvascular domains during brain activation and provide some metabolic reserve capacity in diseases that affect either microvascular networks or the regulation of cerebral blood flow (CBF). To address this challenge, we applied two-photon PO2 microscopy to map PO2 at different depths in the mouse cortex. Measurements were performed through a cranial window in anesthetized healthy mice as well as in mouse models of microvascular dysfunction. In addition, the microvascular morphology was recorded by two-photon microscopy at the end of each experiment and subsequently segmented. Co-registration of the PO2 measurements with the exact microvascular morphology enabled quantification of the dependence of tissue PO2 on distance from the arterioles, capillaries, and venules at various depths. Our measurements reveal significant spatial heterogeneity of the cortical tissue PO2 distribution, dominated by high oxygenation in periarteriolar spaces. In cases of impaired oxygen delivery due to microvascular dysfunction, a significant reduction in tissue oxygenation away from the arterioles was observed. These tissue domains may be the initial sites of cortical injury that can further exacerbate the progression of the disease.
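
    The co-registration step that quantifies PO2 versus distance from vessels can be sketched with a brute-force distance computation on a synthetic field. This is only a stand-in for the actual pipeline (which works on segmented 3-D vasculature); units here are pixels.

```python
import numpy as np

def po2_by_distance(po2, vessel_mask, bin_edges):
    """Average tissue PO2 as a function of distance (in pixels) to the
    nearest vessel pixel."""
    vy, vx = np.nonzero(vessel_mask)
    ty, tx = np.nonzero(~vessel_mask)
    dist = np.sqrt((ty[:, None] - vy[None, :]) ** 2 +
                   (tx[:, None] - vx[None, :]) ** 2).min(axis=1)
    vals = po2[ty, tx]
    means = []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        sel = (dist >= lo) & (dist < hi)
        means.append(vals[sel].mean() if sel.any() else np.nan)
    return np.array(means)

# Synthetic field: a vessel along the left edge, PO2 falling with distance.
xx = np.arange(12)[None, :] * np.ones((12, 1))
po2 = 100.0 - 5.0 * xx
vessel = np.zeros((12, 12), dtype=bool)
vessel[:, 0] = True
profile = po2_by_distance(po2, vessel, bin_edges=[0, 4, 8, 12])
```

    On real data, a falling profile like this one, and how steeply it falls away from arterioles, is exactly the signature of impaired delivery the abstract describes.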

  12. Improving normal tissue complication probability models: the need to adopt a "data-pooling" culture.

    Science.gov (United States)

    Deasy, Joseph O; Bentzen, Søren M; Jackson, Andrew; Ten Haken, Randall K; Yorke, Ellen D; Constine, Louis S; Sharma, Ashish; Marks, Lawrence B

    2010-03-01

    Clinical studies of the dependence of normal tissue response on dose-volume factors are often confusingly inconsistent, as the QUANTEC reviews demonstrate. A key opportunity to accelerate progress is to begin storing high-quality datasets in repositories. Using available technology, multiple repositories could be conveniently queried, without divulging protected health information, to identify relevant sources of data for further analysis. After obtaining institutional approvals, data could then be pooled, greatly enhancing the capability to construct predictive models that are more widely applicable and better powered to accurately identify key predictive factors (whether dosimetric, image-based, clinical, socioeconomic, or biological). Data pooling has already been carried out effectively in a few normal tissue complication probability studies and should become a common strategy. Copyright 2010 Elsevier Inc. All rights reserved.

  13. Inferring the most probable maps of underground utilities using Bayesian mapping model

    Science.gov (United States)

    Bilal, Muhammad; Khan, Wasiq; Muggleton, Jennifer; Rustighi, Emiliano; Jenks, Hugo; Pennock, Steve R.; Atkins, Phil R.; Cohn, Anthony

    2018-03-01

    Mapping the Underworld (MTU), a major initiative in the UK, is focused on addressing the social, environmental and economic consequences arising from the inability to locate buried underground utilities (such as pipes and cables) by developing a multi-sensor mobile device. The aim of the MTU device is to locate different types of buried assets in real time using automated data processing techniques and statutory records. The statutory records, even though typically inaccurate and incomplete, provide useful prior information on what is buried under the ground and where. However, the integration of information from multiple sensors (raw data) with these qualitative maps, and its visualization, is challenging and requires the implementation of robust machine learning/data fusion approaches. In this paper, an approach for the automated creation of revised maps was developed as a Bayesian mapping model, integrating the knowledge extracted from the sensors' raw data with the available statutory records. The combination of statutory records with the hypotheses from the sensors was used for an initial estimation of what might be found underground and roughly where. The maps were (re)constructed using automated image segmentation techniques for hypothesis extraction and Bayesian classification techniques for segment-manhole connections. The model, consisting of an image segmentation algorithm and various Bayesian classification techniques (segment recognition and the expectation maximization (EM) algorithm), provided robust performance on various simulated as well as real sites in terms of predicting linear/non-linear segments and constructing refined 2D/3D maps.
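
    The fusion of a statutory-record prior with sensor evidence reduces, for a single candidate location, to a Bayes update. The sketch below is a one-hypothesis simplification (the full model works over segments and connections), and the sensor hit rates are invented for illustration, not MTU values.

```python
def pipe_posterior(prior, p_hit_pipe, p_hit_nopipe, detections):
    """Posterior probability that a buried segment exists at a candidate
    location, fusing a statutory-record prior with independent sensor
    detections via Bayes' rule."""
    num = prior          # joint weight of "pipe present"
    den = 1.0 - prior    # joint weight of "no pipe"
    for hit in detections:
        num *= p_hit_pipe if hit else (1.0 - p_hit_pipe)
        den *= p_hit_nopipe if hit else (1.0 - p_hit_nopipe)
    return num / (num + den)

# A vague statutory record (prior 0.3) plus two sensor hits:
p = pipe_posterior(0.3, p_hit_pipe=0.8, p_hit_nopipe=0.1,
                   detections=[True, True])
```

    Agreement between record and sensors sharpens the posterior toward 1; disagreement pulls it down, which is how inaccurate records get revised rather than trusted outright.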

  14. Tissue Cancellation in Dual Energy Mammography Using a Calibration Phantom Customized for Direct Mapping.

    Science.gov (United States)

    Han, Seokmin; Kang, Dong-Goo

    2014-01-01

    An easily implementable tissue cancellation method for dual energy mammography is proposed to reduce anatomical noise and enhance lesion visibility. For dual energy calibration, the images of the imaged object are directly mapped onto the images of a customized calibration phantom. Each pixel pair of the low- and high-energy images of the imaged object was compared to the pixel pairs of the low- and high-energy images of the calibration phantom. The correspondence was measured by the absolute difference between the pixel values of the imaged object and those of the calibration phantom, and the closest pixel pair of the calibration phantom images was selected. After the calibration using direct mapping, regions with a lesion yielded a different thickness from that of the background tissue. Taking advantage of this thickness difference, the visibility of cancerous lesions was enhanced, with an increased contrast-to-noise ratio depending on lesion size and breast thickness. However, some tissues near the edge of the imaged object still remained after tissue cancellation. These residuals seem to arise from the heel effect, scattering, the non-parallel X-ray beam geometry, and the Poisson distribution of photons. To improve the method's performance further, scattering and the heel effect should be compensated.
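
    The direct-mapping lookup described above amounts to a nearest-neighbor search in (low, high) intensity space. A vectorized sketch, with a toy calibration table whose known property values are hypothetical:

```python
import numpy as np

def direct_map(low, high, cal_low, cal_high, cal_value):
    """For each (low, high) pixel pair of the imaged object, find the
    calibration-phantom pixel pair with the smallest absolute difference
    and return that entry's known property (e.g. equivalent tissue
    thickness). All inputs are flattened 1-D arrays."""
    cost = (np.abs(low[:, None] - cal_low[None, :]) +
            np.abs(high[:, None] - cal_high[None, :]))
    return cal_value[cost.argmin(axis=1)]

# Toy calibration table with three known entries:
cal_low = np.array([10.0, 20.0, 30.0])
cal_high = np.array([5.0, 15.0, 25.0])
thickness = np.array([1.0, 2.0, 3.0])
mapped = direct_map(np.array([21.0, 29.0]), np.array([14.0, 26.0]),
                    cal_low, cal_high, thickness)
```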

  15. A probable risk factor of female breast cancer: study on benign and malignant breast tissue samples.

    Science.gov (United States)

    Rehman, Sohaila; Husnain, Syed M

    2014-01-01

    The study reports enhanced Fe, Cu, and Zn contents in breast tissues, a probable risk factor of breast cancer in females. Forty-one formalin-fixed breast tissues were analyzed using atomic absorption spectrophotometry. Twenty malignant, six adjacent-to-malignant, and 15 benign tissue samples were investigated. The malignant tissue samples were of grade II and of the invasive ductal carcinoma type. The quantitative comparison between the elemental levels measured in the two types of specimen (benign and malignant tissues removed after surgery) suggests a significant elevation of these metals (Fe, Cu, and Zn) in the malignant tissue. The specimens were collected just after mastectomy from women aged 19 to 59 years at hospitals in Islamabad and Rawalpindi, Pakistan. Most of the patients belonged to urban areas of Pakistan. The findings of the study show that these elements have a promising role in the initiation and development of carcinoma, as a consistent pattern of elevation for Fe, Cu, and Zn was observed. The results showed the excessive accumulation of Fe (229 ± 121 mg/L) in malignant breast tissue samples of patients (p factor of breast cancer. In order to validate our method of analysis, the certified reference material lyophilized muscle tissue (IAEA) MA-M-2/TM was analyzed for the metals studied. The determined concentrations were in good agreement with the certified levels. An asymmetric concentration distribution for Fe, Cu, and Zn was observed in both malignant and benign tissue samples.

  16. Deep-tissue temperature mapping by multi-illumination photoacoustic tomography aided by a diffusion optical model: a numerical study

    Science.gov (United States)

    Zhou, Yuan; Tang, Eric; Luo, Jianwen; Yao, Junjie

    2018-01-01

    Temperature mapping during thermotherapy can help precisely control the heating process, both temporally and spatially, to efficiently kill the tumor cells and prevent the healthy tissues from heating damage. Photoacoustic tomography (PAT) has been used for noninvasive temperature mapping with high sensitivity, based on the linear correlation between the tissue's Grüneisen parameter and temperature. However, limited by the tissue's unknown optical properties, and thus the unknown optical fluence at depths beyond the optical diffusion limit, the reported PAT thermometry usually takes ratiometric measurements at different temperatures and thus cannot provide absolute measurements. Moreover, ratiometric measurement over time at different temperatures has to assume that the tissue's optical properties do not change with temperature, which is usually not valid due to temperature-induced hemodynamic changes. We propose an optical-diffusion-model-enhanced PAT temperature mapping method that can obtain the absolute temperature distribution in deep tissue, without the need for multiple measurements at different temperatures. Based on the initial acoustic pressure reconstructed from multi-illumination photoacoustic signals, both the local optical fluence and the optical parameters, including the absorption and scattering coefficients, are first estimated by the optical-diffusion model; then the temperature distribution is obtained from the reconstructed Grüneisen parameters. We have developed a mathematical model for multi-illumination PAT of absolute temperatures, and our two-dimensional numerical simulations have shown the feasibility of this new method. The proposed absolute temperature mapping method may set the technical foundation for better temperature control in deep tissue in thermotherapy.
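
    Once the diffusion model has supplied the fluence and absorption, the last step is algebra: recover the Grüneisen parameter from the reconstructed pressure and invert the linear Grüneisen-temperature relation. The coefficients a and b below are illustrative; real values must be calibrated for the tissue of interest.

```python
def grueneisen_from_pressure(p0, mu_a, fluence):
    """Invert p0 = Gamma * mu_a * F once the diffusion model has supplied
    the local fluence F and absorption coefficient mu_a."""
    return p0 / (mu_a * fluence)

def temperature_from_grueneisen(gamma, a=0.15, b=0.005):
    """Invert the assumed linear relation Gamma = a + b*T (T in deg C).
    a and b are placeholder calibration coefficients."""
    return (gamma - a) / b

# Round trip: a voxel at 37 deg C with mu_a = 0.1 and fluence F = 2.0
gamma_true = 0.15 + 0.005 * 37.0
p0 = gamma_true * 0.1 * 2.0
t = temperature_from_grueneisen(grueneisen_from_pressure(p0, 0.1, 2.0))
```

    This is precisely why the unknown fluence blocks absolute thermometry: without F, only the product Gamma * F is observable, and only ratios of it cancel the unknowns.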

  17. On the accuracy and reproducibility of a novel probabilistic atlas-based generation for calculation of head attenuation maps on integrated PET/MR scanners.

    Science.gov (United States)

    Chen, Kevin T; Izquierdo-Garcia, David; Poynton, Clare B; Chonde, Daniel B; Catana, Ciprian

    2017-03-01

    To propose an MR-based method for generating continuous-valued head attenuation maps and to assess its accuracy and reproducibility. Demonstrating that novel MR-based photon attenuation correction methods are both accurate and reproducible is essential prior to using them routinely in research and clinical studies on integrated PET/MR scanners. Continuous-valued linear attenuation coefficient maps ("μ-maps") were generated by combining atlases that provided the prior probability of voxel positions belonging to a certain tissue class (air, soft tissue, or bone) and an MR intensity-based likelihood classifier to produce posterior probability maps of tissue classes. These probabilities were used as weights to generate the μ-maps. The accuracy of this probabilistic atlas-based continuous-valued μ-map ("PAC-map") generation method was assessed by calculating the voxel-wise absolute relative change (RC) between the MR-based and scaled CT-based attenuation-corrected PET images. To assess reproducibility, we performed pair-wise comparisons of the RC values obtained from the PET images reconstructed using the μ-maps generated from the data acquired at three time points. The proposed method produced continuous-valued μ-maps that qualitatively reflected the variable anatomy in patients with brain tumor and agreed well with the scaled CT-based μ-maps. The absolute RC comparing the resulting PET volumes was 1.76 ± 2.33 %, quantitatively demonstrating that the method is accurate. Additionally, we also showed that the method is highly reproducible, the mean RC value for the PET images reconstructed using the μ-maps obtained at the three visits being 0.65 ± 0.95 %. Accurate and highly reproducible continuous-valued head μ-maps can be generated from MR data using a probabilistic atlas-based approach.
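
    The prior-times-likelihood weighting at the heart of the PAC-map construction can be sketched per voxel. The attenuation coefficients below are approximate 511-keV values chosen for illustration, and the toy prior/likelihood arrays are invented.

```python
import numpy as np

def continuous_mu_map(prior, likelihood, mu_class):
    """Continuous-valued mu-map from tissue-class probabilities.

    prior, likelihood: arrays of shape (n_voxels, n_classes) giving the
    atlas prior and the MR-intensity likelihood for each class
    (air, soft tissue, bone). Posterior probabilities (Bayes) serve as
    weights for the per-class attenuation coefficients mu_class."""
    post = prior * likelihood
    post = post / post.sum(axis=1, keepdims=True)  # normalize per voxel
    return post @ np.asarray(mu_class, dtype=float)

# Approximate 511-keV linear attenuation coefficients (cm^-1) for
# air / soft tissue / bone -- illustrative values only:
mu_class = [0.0, 0.096, 0.17]
prior = np.array([[0.1, 0.8, 0.1],     # voxel likely soft tissue a priori
                  [0.05, 0.35, 0.6]])  # voxel likely bone a priori
likelihood = np.array([[0.2, 0.7, 0.1],
                       [0.1, 0.2, 0.7]])
mu = continuous_mu_map(prior, likelihood, mu_class)
```

    Because the weights are continuous, each voxel's mu can fall anywhere between the class extremes, which is what distinguishes this approach from hard segmentation into discrete tissue classes.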

  18. On the accuracy and reproducibility of a novel probabilistic atlas-based generation for calculation of head attenuation maps on integrated PET/MR scanners

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Kevin T. [Massachusetts General Hospital and Harvard Medical School, Athinoula A. Martinos Center for Biomedical Imaging, Department of Radiology, Charlestown, MA (United States); Massachusetts Institute of Technology, Division of Health Sciences and Technology, Cambridge, MA (United States); Izquierdo-Garcia, David; Catana, Ciprian [Massachusetts General Hospital and Harvard Medical School, Athinoula A. Martinos Center for Biomedical Imaging, Department of Radiology, Charlestown, MA (United States); Poynton, Clare B. [Massachusetts General Hospital and Harvard Medical School, Athinoula A. Martinos Center for Biomedical Imaging, Department of Radiology, Charlestown, MA (United States); Massachusetts General Hospital, Department of Psychiatry, Boston, MA (United States); University of California, San Francisco, Department of Radiology and Biomedical Imaging, San Francisco, CA (United States); Chonde, Daniel B. [Massachusetts General Hospital and Harvard Medical School, Athinoula A. Martinos Center for Biomedical Imaging, Department of Radiology, Charlestown, MA (United States); Harvard University, Program in Biophysics, Cambridge, MA (United States)

    2017-03-15

    To propose an MR-based method for generating continuous-valued head attenuation maps and to assess its accuracy and reproducibility. Demonstrating that novel MR-based photon attenuation correction methods are both accurate and reproducible is essential prior to using them routinely in research and clinical studies on integrated PET/MR scanners. Continuous-valued linear attenuation coefficient maps ("μ-maps") were generated by combining atlases that provided the prior probability of voxel positions belonging to a certain tissue class (air, soft tissue, or bone) and an MR intensity-based likelihood classifier to produce posterior probability maps of tissue classes. These probabilities were used as weights to generate the μ-maps. The accuracy of this probabilistic atlas-based continuous-valued μ-map ("PAC-map") generation method was assessed by calculating the voxel-wise absolute relative change (RC) between the MR-based and scaled CT-based attenuation-corrected PET images. To assess reproducibility, we performed pair-wise comparisons of the RC values obtained from the PET images reconstructed using the μ-maps generated from the data acquired at three time points. The proposed method produced continuous-valued μ-maps that qualitatively reflected the variable anatomy in patients with brain tumor and agreed well with the scaled CT-based μ-maps. The absolute RC comparing the resulting PET volumes was 1.76 ± 2.33 %, quantitatively demonstrating that the method is accurate. Additionally, we also showed that the method is highly reproducible, the mean RC value for the PET images reconstructed using the μ-maps obtained at the three visits being 0.65 ± 0.95 %. Accurate and highly reproducible continuous-valued head μ-maps can be generated from MR data using a probabilistic atlas-based approach. (orig.)

  19. Distribution and probable physiological role of esterases in reproductive, digestive, and fat-body tissues of the adult cotton boll weevil, Anthonomus grandis Boh.

    Science.gov (United States)

    Jones, B R; Bancroft, H R

    1986-06-01

    Polyacrylamide gel electrophoresis was used to examine gut, Malpighian tubule, fat-body, testis, and ovariole tissues of the adult cotton boll weevil, Anthonomus grandis Boh. Esterases whose inheritance was reported previously by Terranova using whole-body homogenates were detected in the dissected tissues, and the probable physiological function of each allozyme is suggested. EST-1 occurs most frequently in ovarioles and female fat bodies. EST-2 is most often found in fat bodies and may be important in lipid turnover; no sex difference was observed. EST-3S is found in fat bodies and reproductive tissue, while EST-3F is always located in gut tissues, indicating that EST-3 is not controlled by a single autosomal locus with two codominant alleles as previously reported. EST-4, the most abundant esterase, can be detected in gut tissue at any age and is probably involved in digestion. EST-5 comprises four allozymes, which appear most frequently in testes and may be important during reproduction.

  20. A least squares approach to estimating the probability distribution of unobserved data in multiphoton microscopy

    Science.gov (United States)

    Salama, Paul

    2008-02-01

    Multi-photon microscopy has provided biologists with unprecedented opportunities for high-resolution imaging deep into tissues. Unfortunately, deep-tissue multi-photon microscopy images are in general noisy since they are acquired at low photon counts. To aid in the analysis and segmentation of such images it is sometimes necessary to first enhance the acquired images. One way to enhance an image is to find the maximum a posteriori (MAP) estimate of each pixel comprising the image, which is achieved by finding a constrained least squares estimate of the unknown distribution. In arriving at the distribution it is assumed that the noise is Poisson distributed, that the true but unknown pixel values assume a probability mass function over a finite set of non-negative values, and that, since the observed data also assume finite values because of low photon counts, the sum of the probabilities of the observed pixel values (obtained from the histogram of the acquired pixel values) is less than one. Experimental results demonstrate that it is possible to closely estimate the unknown probability mass function under these assumptions.
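Once a prior mass function over the finite set of true values has been estimated, the per-pixel MAP step the abstract refers to reduces to maximising Poisson log-likelihood plus log-prior over the candidate values. A minimal sketch (the candidate values and prior here are hypothetical, and the least-squares prior estimation itself is not shown):

```python
import math

def map_estimate(obs, values, prior):
    """Per-pixel MAP denoising under Poisson noise: for an observed count,
    pick the candidate true value v maximising log P(obs | v) + log prior(v).

    values, prior: finite set of non-negative candidate intensities and a
    probability mass function over them (assumed given, e.g. from a prior
    least-squares estimation step).
    """
    best, best_score = values[0], -math.inf
    for v, p in zip(values, prior):
        if p == 0:
            continue
        if v > 0:
            # Poisson log-likelihood up to the constant -log(obs!),
            # which does not depend on v.
            loglik = obs * math.log(v) - v
        else:
            loglik = 0.0 if obs == 0 else -math.inf
        score = loglik + math.log(p)
        if score > best_score:
            best, best_score = v, score
    return best
```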

  1. An automated method for mapping human tissue permittivities by MRI in hyperthermia treatment planning

    Energy Technology Data Exchange (ETDEWEB)

    Farace, Paolo; Antolini, Renzo [CMBM-ITC, Centro Materiali e Biofisica Medica, 38050 Povo-Trento (Italy); Dipartimento di Fisica and INFM, Universita di Trento, 38050 Povo-Trento (Italy); Pontalti, Rolando; Cristoforetti, Luca [CMBM-ITC, Centro Materiali e Biofisica Medica, 38050 Povo-Trento (Italy); Scarpa, Marina [Dipartimento di Fisica and INFM, Universita di Trento, 38050 Povo-Trento (Italy)

    1997-11-01

    This paper presents an automatic method to obtain tissue complex permittivity values to be used as input data in the computer modelling for hyperthermia treatment planning. Magnetic resonance (MR) images were acquired and the tissue water content was calculated from the signal intensity of the image pixels. The tissue water content was converted into complex permittivity values by monotonic functions based on mixture theory. To obtain a water content map by MR imaging a gradient-echo pulse sequence was used and an experimental procedure was set up to correct for relaxation and radiofrequency field inhomogeneity effects on signal intensity. Two approaches were followed to assign the permittivity values to fat-rich tissues: (i) fat-rich tissue localization by a segmentation procedure followed by assignment of tabulated permittivity values; (ii) water content evaluation by chemical shift imaging followed by permittivity calculation. Tests were performed on phantoms of known water content to establish the reliability of the proposed method. MRI data were acquired and processed pixel-by-pixel according to the outlined procedure. The signal intensity in the phantom images correlated well with water content. Experiments were performed on volunteers' healthy tissue. In particular two anatomical structures were chosen to calculate permittivity maps: the head and the thigh. The water content and electric permittivity values were obtained from the MRI data and compared to others in the literature. A good agreement was found for muscle, cerebrospinal fluid (CSF) and white and grey matter. The advantages of the reported method are discussed in the light of possible application in hyperthermia treatment planning. (author)
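The conversion from water content to complex permittivity described above uses monotonic functions based on mixture theory. A minimal stand-in sketch using a simple linear two-phase mixture rule (not the paper's actual functions; the component permittivities are assumed, illustrative constants for a hyperthermia-range frequency):

```python
def permittivity_from_water(w):
    """Map tissue water content w (0..1) to a complex relative permittivity
    with a linear two-phase mixture rule: a stand-in for the paper's
    mixture-theory functions. eps_water and eps_dry are assumed,
    illustrative values (water vs. dry biological matter, ~434 MHz).
    """
    eps_water = 78.0 - 12.0j   # assumed water permittivity (real - j*loss)
    eps_dry = 3.0 - 0.1j       # assumed dry-matter permittivity
    return w * eps_water + (1.0 - w) * eps_dry
```

Like the paper's functions, this mapping is monotonic in the water content, so a water-content map converts pixel-by-pixel into a permittivity map.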

  2. An automated method for mapping human tissue permittivities by MRI in hyperthermia treatment planning

    International Nuclear Information System (INIS)

    Farace, Paolo; Antolini, Renzo; Pontalti, Rolando; Cristoforetti, Luca; Scarpa, Marina

    1997-01-01

    This paper presents an automatic method to obtain tissue complex permittivity values to be used as input data in the computer modelling for hyperthermia treatment planning. Magnetic resonance (MR) images were acquired and the tissue water content was calculated from the signal intensity of the image pixels. The tissue water content was converted into complex permittivity values by monotonic functions based on mixture theory. To obtain a water content map by MR imaging a gradient-echo pulse sequence was used and an experimental procedure was set up to correct for relaxation and radiofrequency field inhomogeneity effects on signal intensity. Two approaches were followed to assign the permittivity values to fat-rich tissues: (i) fat-rich tissue localization by a segmentation procedure followed by assignment of tabulated permittivity values; (ii) water content evaluation by chemical shift imaging followed by permittivity calculation. Tests were performed on phantoms of known water content to establish the reliability of the proposed method. MRI data were acquired and processed pixel-by-pixel according to the outlined procedure. The signal intensity in the phantom images correlated well with water content. Experiments were performed on volunteers' healthy tissue. In particular two anatomical structures were chosen to calculate permittivity maps: the head and the thigh. The water content and electric permittivity values were obtained from the MRI data and compared to others in the literature. A good agreement was found for muscle, cerebrospinal fluid (CSF) and white and grey matter. The advantages of the reported method are discussed in the light of possible application in hyperthermia treatment planning. (author)

  3. An automated method for mapping human tissue permittivities by MRI in hyperthermia treatment planning.

    Science.gov (United States)

    Farace, P; Pontalti, R; Cristoforetti, L; Antolini, R; Scarpa, M

    1997-11-01

    This paper presents an automatic method to obtain tissue complex permittivity values to be used as input data in the computer modelling for hyperthermia treatment planning. Magnetic resonance (MR) images were acquired and the tissue water content was calculated from the signal intensity of the image pixels. The tissue water content was converted into complex permittivity values by monotonic functions based on mixture theory. To obtain a water content map by MR imaging a gradient-echo pulse sequence was used and an experimental procedure was set up to correct for relaxation and radiofrequency field inhomogeneity effects on signal intensity. Two approaches were followed to assign the permittivity values to fat-rich tissues: (i) fat-rich tissue localization by a segmentation procedure followed by assignment of tabulated permittivity values; (ii) water content evaluation by chemical shift imaging followed by permittivity calculation. Tests were performed on phantoms of known water content to establish the reliability of the proposed method. MRI data were acquired and processed pixel-by-pixel according to the outlined procedure. The signal intensity in the phantom images correlated well with water content. Experiments were performed on volunteers' healthy tissue. In particular two anatomical structures were chosen to calculate permittivity maps: the head and the thigh. The water content and electric permittivity values were obtained from the MRI data and compared to others in the literature. A good agreement was found for muscle, cerebrospinal fluid (CSF) and white and grey matter. The advantages of the reported method are discussed in the light of possible application in hyperthermia treatment planning.

  5. An imaging colorimeter for noncontact tissue color mapping.

    Science.gov (United States)

    Balas, C

    1997-06-01

    There has been a considerable effort in several medical fields for objective color analysis and characterization of biological tissues. Conventional colorimeters have proved inadequate for this purpose, since they do not provide spatial color information and because the measuring procedure randomly affects the color of the tissue. In this paper an imaging colorimeter is presented, where the nonimaging optical photodetector of colorimeters is replaced with the charge-coupled device (CCD) sensor of a color video camera, enabling independent capture of the color information for any spatial point within its field-of-view. Combining imaging and colorimetry methods, the acquired image is calibrated and corrected, under several ambient light conditions, providing noncontact reproducible color measurements and mapping, free of the errors and the limitations present in conventional colorimeters. This system was used for monitoring of blood supply changes in psoriatic plaques that have undergone psoralen and ultraviolet-A (PUVA) therapy, where reproducible and reliable measurements were demonstrated. These features highlight the potential of imaging colorimeters as clinical and research tools for the standardization of clinical diagnosis and for the objective evaluation of treatment effectiveness.
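The calibration step of such a system (mapping camera RGB values of known patches to reference tristimulus values) is commonly posed as a least-squares matrix fit. A generic sketch of that standard approach, not the cited device's actual procedure (which also corrects for ambient light):

```python
import numpy as np

def fit_color_correction(rgb, xyz):
    """Least-squares 3x3 colour-correction matrix M with xyz ≈ rgb @ M,
    estimated from the measured camera RGB values of calibration patches
    (rows of rgb) and their known reference tristimulus values (rows of
    xyz). A generic colorimetric calibration step.
    """
    M, *_ = np.linalg.lstsq(rgb, xyz, rcond=None)
    return M
```

With the matrix in hand, every pixel of an acquired image can be mapped independently to calibrated color coordinates, which is what makes spatial color mapping possible.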

  6. Towards a Categorical Account of Conditional Probability

    Directory of Open Access Journals (Sweden)

    Robert Furber

    2015-11-01

    Full Text Available This paper presents a categorical account of conditional probability, covering both the classical and the quantum case. Classical conditional probabilities are expressed as a certain "triangle-fill-in" condition, connecting marginal and joint probabilities, in the Kleisli category of the distribution monad. The conditional probabilities are induced by a map together with a predicate (the condition). The latter is a predicate in the logic of effect modules on this Kleisli category. This same approach can be transferred to the category of C*-algebras (with positive unital maps), whose predicate logic is also expressed in terms of effect modules. Conditional probabilities can again be expressed via a triangle-fill-in property. In the literature, there are several proposals for what quantum conditional probability should be, and also there are extra difficulties not present in the classical case. At this stage, we only describe quantum systems with classical parametrization.

  7. Upgrading Probability via Fractions of Events

    Directory of Open Access Journals (Sweden)

    Frič Roman

    2016-08-01

    Full Text Available The influence of “Grundbegriffe” by A. N. Kolmogorov (published in 1933) on education in the area of probability and its impact on research in stochastics cannot be overestimated. We would like to point out three aspects of the classical probability theory “calling for” an upgrade: (i) classical random events are black-and-white (Boolean); (ii) classical random variables do not model quantum phenomena; (iii) basic maps (probability measures and observables, dual maps to random variables) have very different “mathematical nature”. Accordingly, we propose an upgraded probability theory based on Łukasiewicz operations (multivalued logic) on events, elementary category theory, and covering the classical probability theory as a special case. The upgrade can be compared to replacing calculations with integers by calculations with rational (and real) numbers. Namely, to avoid the three objections, we embed the classical (Boolean) random events (represented by the {0, 1}-valued indicator functions of sets) into upgraded random events (represented by measurable [0, 1]-valued functions), the minimal domain of probability containing “fractions” of classical random events, and we upgrade the notions of probability measure and random variable.
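The Łukasiewicz operations underlying the proposed upgrade are truncated addition and subtraction on [0, 1]-valued events. A minimal sketch, showing that on {0, 1}-valued (Boolean) events they restrict to the classical OR/AND/NOT:

```python
def luk_or(a, b):
    """Łukasiewicz (truncated) sum of two [0, 1]-valued events."""
    return min(1.0, a + b)

def luk_and(a, b):
    """Łukasiewicz product: max(0, a + b - 1)."""
    return max(0.0, a + b - 1.0)

def luk_not(a):
    """Łukasiewicz negation."""
    return 1.0 - a
```

On indicator values 0 and 1 these reproduce Boolean disjunction, conjunction and negation, while on proper “fractions” of events they produce intermediate truth values, which is exactly the embedding the abstract describes.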

  8. Magnetic resonance tissue phase mapping demonstrates altered left ventricular diastolic function in children with chronic kidney disease

    International Nuclear Information System (INIS)

    Gimpel, Charlotte; Pohl, Martin; Jung, Bernd A.; Jung, Sabine; Brado, Johannes; Odening, Katja E.; Schwendinger, Daniel; Burkhardt, Barbara; Geiger, Julia; Arnold, Raoul

    2017-01-01

    Echocardiographic examinations have revealed functional cardiac abnormalities in children with chronic kidney disease. To assess the feasibility of MRI tissue phase mapping in children and to assess regional left ventricular wall movements in children with chronic kidney disease. Twenty pediatric patients with chronic kidney disease (before or after renal transplantation) and 12 healthy controls underwent tissue phase mapping (TPM) to quantify regional left ventricular function through myocardial long (Vz) and short-axis (Vr) velocities at all 3 levels of the left ventricle. Patients and controls (age: 8 years - 20 years) were matched for age, height, weight, gender and heart rate. Patients had higher systolic blood pressure. No patient had left ventricular hypertrophy on MRI or diastolic dysfunction on echocardiography. Fifteen patients underwent tissue Doppler echocardiography, with normal z-scores for mitral early diastolic (V_E), late diastolic (V_A) and peak systolic (V_S) velocities. Throughout all left ventricular levels, peak diastolic Vz and Vr (cm/s) were reduced in patients: Vz_base -10.6 ± 1.9 vs. -13.4 ± 2.0 (P < 0.0003), Vz_mid -7.8 ± 1.6 vs. -11 ± 1.5 (P < 0.0001), Vz_apex -3.8 ± 1.6 vs. -5.3 ± 1.6 (P = 0.01), Vr_base -4.2 ± 0.8 vs. -4.9 ± 0.7 (P = 0.01), Vr_mid -4.7 ± 0.7 vs. -5.4 ± 0.7 (P = 0.01), Vr_apex -4.7 ± 1.4 vs. -5.6 ± 1.1 (P = 0.05). Tissue phase mapping is feasible in children and adolescents. Children with chronic kidney disease show significantly reduced peak diastolic long- and short-axis left ventricular wall velocities, reflecting impaired early diastolic filling. Thus, tissue phase mapping detects chronic kidney disease-related functional myocardial changes before overt left ventricular hypertrophy or echocardiographic diastolic dysfunction occurs. (orig.)

  9. Using widely spaced observations of land use, forest attributes, and intrusions to map resource potential and human impact probability

    Science.gov (United States)

    Victor A. Rudis

    2000-01-01

    Scant information exists about the spatial extent of human impact on forest resource supplies, i.e., depreciative and nonforest uses. I used observations of ground-sampled land use and intrusions on forest land to map the probability of resource use and human impact for broad areas. Data came from a seven State survey region (Alabama, Arkansas, Louisiana, Mississippi,...

  11. Dual-energy digital mammography: Calibration and inverse-mapping techniques to estimate calcification thickness and glandular-tissue ratio

    International Nuclear Information System (INIS)

    Kappadath, S. Cheenu; Shaw, Chris C.

    2003-01-01

    Breast cancer may manifest as microcalcifications in x-ray mammography. Small microcalcifications, essential to the early detection of breast cancer, are often obscured by overlapping tissue structures. Dual-energy imaging, where separate low- and high-energy images are acquired and synthesized to cancel the tissue structures, may improve the ability to detect and visualize microcalcifications. Transmission measurements at two different kVp values were made on breast-tissue-equivalent materials under narrow-beam geometry using an indirect flat-panel mammographic imager. The imaging scenario consisted of variable aluminum thickness (to simulate calcifications) and variable glandular ratio (defined as the ratio of the glandular-tissue thickness to the total tissue thickness) for a fixed total tissue thickness--the clinical situation of microcalcification imaging with varying tissue composition under breast compression. The coefficients of the inverse-mapping functions used to determine material composition from dual-energy measurements were calculated by a least-squares analysis. The linear function poorly modeled both the aluminum thickness and the glandular ratio. The inverse-mapping functions were found to vary as analytic functions of second (conic) or third (cubic) order. By comparing the model predictions with the calibration values, the root-mean-square residuals for both the cubic and the conic functions were ∼50 μm for the aluminum thickness and ∼0.05 for the glandular ratio.

  12. Digital integration of geological and aeroradiometric data for probability mapping of uranium occurrences in parts of south-eastern Rajasthan, India

    International Nuclear Information System (INIS)

    Chawla, A.S.; Katti, V.J.; Kak, S.N.; Das, S.K.

    1993-01-01

    Integration and evaluation of geological, radio geochemical, and magnetic information of the Umra-Udaisagar and Sarara inlier area, Udaipur district, Rajasthan was attempted. Seventeen lithostructural variables interpreted from colour infrared (CIR) photo geological analogue maps, together with radio geochemical and magnetic variables thematically evaluated from airborne gamma-ray spectrometric (AGRS) and aeromagnetic (AM) digital data, were co-registered using a sequential grid matrix of 500 m x 500 m. The variables were quantified using theme-specific equations and digitized in simple Boolean representation format, depending on the presence or absence of a variable, its positive or negative interest, and/or greater than or less than a theme-specific value in each cell. The database so generated was subjected to a weighted-modelling software programme wherein weights for each variable are computed based on the conditional probability method, and favorability index maps are generated by discriminant objective analysis. Areas with high probability for uranium mineralisation were delineated by computing the composite coincidence of cells with high favorability index values, considering each variable as a control variable. Critical analysis of the weights computed for each variable in different sets predicts the importance of the control of that variable on uranium mineralisation. This attempt has resulted in delineating several new high-probability zones of uranium enrichment and indicated a regional structural control in the area. (author). 11 refs., 4 figs
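A standard conditional-probability weighting of a binary evidence layer, in the spirit of the weighted modelling described above, is the weights-of-evidence scheme; the paper's exact formulation may differ from this sketch:

```python
import math

def weights_of_evidence(n_bd, n_d, n_b, n_total):
    """Positive and negative weights for a binary evidence layer B with
    respect to known occurrences D, from grid-cell counts (standard
    weights-of-evidence; shown for illustration).

    n_bd: cells with both B and an occurrence; n_d: occurrence cells;
    n_b: cells with B; n_total: all cells.
    """
    p_b_d = n_bd / n_d                        # P(B | D)
    p_b_nd = (n_b - n_bd) / (n_total - n_d)   # P(B | not D)
    w_plus = math.log(p_b_d / p_b_nd)         # weight where B is present
    w_minus = math.log((1.0 - p_b_d) / (1.0 - p_b_nd))  # where B is absent
    return w_plus, w_minus
```

Summing the appropriate weight of each layer per cell yields a log-odds favorability index, which can then be thresholded to delineate high-probability zones.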

  13. User’s guide for MapMark4—An R package for the probability calculations in three-part mineral resource assessments

    Science.gov (United States)

    Ellefsen, Karl J.

    2017-06-27

    MapMark4 is a software package that implements the probability calculations in three-part mineral resource assessments. Functions within the software package are written in the R statistical programming language. These functions, their documentation, and a copy of this user’s guide are bundled together in R’s unit of shareable code, which is called a “package.” This user’s guide includes step-by-step instructions showing how the functions are used to carry out the probability calculations. The calculations are demonstrated using test data, which are included in the package.

  14. NIH Scientists Map Genetic Changes That Drive Tumors in a Common Pediatric Soft-Tissue Cancer

    Science.gov (United States)


  15. Magnetic resonance tissue phase mapping demonstrates altered left ventricular diastolic function in children with chronic kidney disease

    Energy Technology Data Exchange (ETDEWEB)

    Gimpel, Charlotte; Pohl, Martin [Medical Center - University of Freiburg, Department of General Pediatrics, Adolescent Medicine and Neonatology, Center for Pediatrics, Freiburg (Germany); Jung, Bernd A. [Inselspital Bern, Institute of Diagnostic, Interventional and Pediatric Radiology, Bern (Switzerland); Jung, Sabine [Medical Center - University of Freiburg, Department of Nuclear Medicine, Freiburg (Germany); Brado, Johannes; Odening, Katja E. [University Heart Center Freiburg, Department of Cardiology and Angiology I, Freiburg (Germany); Schwendinger, Daniel [University Children's Hospital Zurich, Zurich (Switzerland); Burkhardt, Barbara [University Children's Hospital Zurich, Pediatric Heart Center, Zurich (Switzerland); Geiger, Julia [University Children's Hospital Zurich, Department of Radiology, Zurich (Switzerland); Northwestern University, Department of Radiology, Chicago, IL (United States); Arnold, Raoul [University Hospital Heidelberg, Department of Pediatric and Congenital Cardiology, Heidelberg (Germany)

    2017-02-15

    Echocardiographic examinations have revealed functional cardiac abnormalities in children with chronic kidney disease. To assess the feasibility of MRI tissue phase mapping in children and to assess regional left ventricular wall movements in children with chronic kidney disease. Twenty pediatric patients with chronic kidney disease (before or after renal transplantation) and 12 healthy controls underwent tissue phase mapping (TPM) to quantify regional left ventricular function through myocardial long (Vz) and short-axis (Vr) velocities at all 3 levels of the left ventricle. Patients and controls (age: 8 years - 20 years) were matched for age, height, weight, gender and heart rate. Patients had higher systolic blood pressure. No patient had left ventricular hypertrophy on MRI or diastolic dysfunction on echocardiography. Fifteen patients underwent tissue Doppler echocardiography, with normal z-scores for mitral early diastolic (V_E), late diastolic (V_A) and peak systolic (V_S) velocities. Throughout all left ventricular levels, peak diastolic Vz and Vr (cm/s) were reduced in patients: Vz_base -10.6 ± 1.9 vs. -13.4 ± 2.0 (P < 0.0003), Vz_mid -7.8 ± 1.6 vs. -11 ± 1.5 (P < 0.0001), Vz_apex -3.8 ± 1.6 vs. -5.3 ± 1.6 (P = 0.01), Vr_base -4.2 ± 0.8 vs. -4.9 ± 0.7 (P = 0.01), Vr_mid -4.7 ± 0.7 vs. -5.4 ± 0.7 (P = 0.01), Vr_apex -4.7 ± 1.4 vs. -5.6 ± 1.1 (P = 0.05). Tissue phase mapping is feasible in children and adolescents. Children with chronic kidney disease show significantly reduced peak diastolic long- and short-axis left ventricular wall velocities, reflecting impaired early diastolic filling. Thus, tissue phase mapping detects chronic kidney disease-related functional myocardial changes before overt left ventricular hypertrophy or echocardiographic diastolic dysfunction occurs. (orig.)

  16. A new plan-scoring method using normal tissue complication probability for personalized treatment plan decisions in prostate cancer

    Science.gov (United States)

    Kim, Kwang Hyeon; Lee, Suk; Shim, Jang Bo; Yang, Dae Sik; Yoon, Won Sup; Park, Young Je; Kim, Chul Yong; Cao, Yuan Jie; Chang, Kyung Hwan

    2018-01-01

    The aim of this study was to derive a new plan-scoring index using normal tissue complication probabilities to verify different plans in the selection of personalized treatment. Plans for 12 patients treated with tomotherapy were used to compare scoring for ranking. Dosimetric and biological indexes were analyzed for the plans for a clearly distinguishable group (n = 7) and a similar group (n = 12), using treatment plan verification software that we developed. The quality factor (QF) of our support software for treatment decisions was consistent with the final treatment plan for the clearly distinguishable group (average QF = 1.202, 100% match rate, n = 7) and the similar group (average QF = 1.058, 33% match rate, n = 12). Therefore, we propose a normal tissue complication probability (NTCP)-based plan-scoring index for verification of different plans for personalized treatment-plan selection. Scoring using the new QF showed a 100% match rate (average NTCP QF = 1.0420). The NTCP-based new QF scoring method was adequate for obtaining biological verification quality and organ risk saving using the treatment-planning decision-support software we developed for prostate cancer.
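NTCP values such as those feeding the proposed scoring index are often computed with the Lyman-Kutcher-Burman model; a sketch of that common model (the study does not specify which NTCP model it uses):

```python
import math

def lkb_ntcp(eud, td50, m):
    """Lyman-Kutcher-Burman NTCP: Phi((EUD - TD50) / (m * TD50)), where
    Phi is the standard normal CDF, EUD the (generalized) equivalent
    uniform dose, TD50 the dose giving 50% complication probability, and
    m the slope parameter. Parameter values are organ-specific.
    """
    t = (eud - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))
```

By construction the model returns 0.5 at EUD = TD50 and rises monotonically with dose, so per-organ NTCP values can be combined into a single plan-ranking score.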

  17. Mapping and characterization of iron compounds in Alzheimer's tissue

    International Nuclear Information System (INIS)

    Collingwood, Joanna; Dobson, Jon

    2006-01-01

    Understanding the management of iron in the brain is of great importance in the study of neurodegeneration, where regional iron overload is frequently evident. A variety of approaches have been employed, from quantifying iron in various anatomical structures, to identifying genetic risk factors related to iron metabolism, and exploring chelation approaches to tackle iron overload in neurodegenerative disease. However, the ease with which iron can change valence state ensures that it is present in vivo in a wide variety of forms, both soluble and insoluble. Here, we review recent developments in approaches to locate and identify iron compounds in neurodegenerative tissue. In addition to complementary techniques that allow us to quantify and identify iron compounds using magnetometry, extraction, and electron microscopy, we are utilizing a powerful combined mapping/characterization approach with synchrotron X-rays. This has enabled the location and characterization of iron accumulations containing magnetite and ferritin in human Alzheimer's disease (AD) brain tissue sections in situ at micron-resolution. It is hoped that such approaches will contribute to our understanding of the role of unusual iron accumulations in disease pathogenesis, and optimise the potential to use brain iron as a clinical biomarker for early detection and diagnosis.

  18. Probability distribution of dose rates in the body tissue as a function of the rhythm of Sr-90 administration and the age of animals

    International Nuclear Information System (INIS)

    Rasin, I.M.; Sarapul'tsev, I.A.

    1975-01-01

    The probability distribution of tissue radiation doses in the skeleton was studied in experiments on swine and dogs. When Sr-90 was introduced into the organism from the day of birth until 90 days of age, the dose-rate probability distribution is characterized by one aggregate or, for adult animals, by two independent aggregates. Each of these aggregates corresponds to the normal distribution law.

  19. Seismic Ground Motion Hazards with 10 Percent Probability

    Data.gov (United States)

    Department of Homeland Security — This map layer shows seismic hazard in the United States. The data represent a model showing the probability that ground motion will reach a certain level. This map...

  20. Seismic Ground Motion Hazards with 2 Percent Probability

    Data.gov (United States)

    Department of Homeland Security — This map layer shows seismic hazard in the United States. The data represent a model showing the probability that ground motion will reach a certain level. This map...

  1. In-situ Characterization and Mapping of Iron Compounds in Alzheimer's Tissue

    International Nuclear Information System (INIS)

    Collingwood, J.F.; Mikhaylova, A.; Davidson, M.; Batich, C.; Streit, W.J.; Terry, J.; Dobson, J.

    2005-01-01

    There is a well-established link between iron overload in the brain and pathology associated with neurodegeneration in a variety of disorders such as Alzheimer's (AD), Parkinson's (PD) and Huntington's (HD) diseases. This association was first discovered in AD by Goodman in 1953, where, in addition to abnormally high concentrations of iron in autopsy brain tissue, iron has also been shown to accumulate at sites of brain pathology such as senile plaques. However, since this discovery, progress in understanding the origin, role and nature of iron compounds associated with neurodegeneration has been slow. Here we report, for the first time, the location and characterization of iron compounds in human AD brain tissue sections. Iron fluorescence was mapped over a frontal-lobe tissue section from an Alzheimer's patient, and anomalous iron concentrations were identified using synchrotron X-ray absorption techniques at 5 μm spatial resolution. Concentrations of ferritin and magnetite, a magnetic iron oxide potentially indicating disrupted brain-iron metabolism, were evident. These results demonstrate a practical means of correlating iron compounds and disease pathology in-situ and have clear implications for disease pathogenesis and potential therapies.

  2. Effects of tissue susceptibility on brain temperature mapping.

    Science.gov (United States)

    Maudsley, Andrew A; Goryawala, Mohammed Z; Sheriff, Sulaiman

    2017-02-01

    A method for mapping of temperature over a large volume of the brain using volumetric proton MR spectroscopic imaging has been implemented and applied to 150 normal subjects. Magnetic susceptibility-induced frequency shifts in gray- and white-matter regions were measured and included as a correction in the temperature mapping calculation. Additional sources of magnetic susceptibility variations of the individual metabolite resonance frequencies were also observed that reflect the cellular-level organization of the brain metabolites, with the most notable differences being attributed to changes of the N-Acetylaspartate resonance frequency that reflect the intra-axonal distribution and orientation of the white-matter tracts with respect to the applied magnetic field. These metabolite-specific susceptibility effects are also shown to change with age. Results indicate no change of apparent brain temperature with age from 18 to 84 years old, with a trend for increased brain temperature throughout the cerebrum in females relative to males on the order of 0.1°C; slightly increased temperatures in the left hemisphere relative to the right; and a lower temperature of 0.3°C in the cerebellum relative to that of cerebral white-matter. This study presents a novel acquisition method for noninvasive measurement of brain temperature that is of potential value for diagnostic purposes and treatment monitoring, while also demonstrating limitations of the measurement due to the confounding effects of tissue susceptibility variations. Copyright © 2016 Elsevier Inc. All rights reserved.
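MRSI-based brain thermometry generally converts the water-NAA chemical-shift difference to temperature with a linear calibration, since the water resonance shifts by roughly -0.01 ppm per °C while NAA is nearly temperature-independent. A sketch with assumed, illustrative calibration constants (not this study's calibration, which additionally corrects for susceptibility shifts):

```python
def brain_temperature(delta_ppm, t_ref=37.0, delta_ref=2.665, slope=-0.0101):
    """Estimate temperature (deg C) from the water-NAA chemical-shift
    difference delta_ppm. t_ref, delta_ref and slope are assumed,
    illustrative calibration constants: the shift difference at the
    reference temperature and its temperature coefficient (ppm/deg C).
    """
    return t_ref + (delta_ppm - delta_ref) / slope
```

Because the slope is negative, a smaller water-NAA separation maps to a higher apparent temperature; any uncorrected susceptibility shift therefore biases the estimate, which is the limitation the abstract emphasizes.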

  3. Cardiovascular magnetic resonance frontiers: Tissue characterisation with mapping

    Directory of Open Access Journals (Sweden)

    Rebecca Schofield

    2016-11-01

    Full Text Available The clinical use of cardiovascular magnetic resonance (CMR) imaging has expanded rapidly over the last decade. Its role in cardiac morphological and functional assessment is established, with perfusion and late gadolinium enhancement (LGE) imaging for scar increasingly used in day-to-day clinical decision making. LGE allows a virtual histological assessment of the myocardium, with the pattern of scar suggesting disease aetiology and the extent predicting risk. However, even combined, these techniques do not interrogate the full range of pathological processes occurring in the myocardium. Mapping is a new frontier in which the intrinsic magnetic properties of heart muscle are measured to probe further. T1, T2 and T2* mapping measures the three fundamental tissue relaxation time constants before contrast, and the extracellular volume (ECV) after contrast. These are displayed in colour, often providing an immediate appreciation of pathology. The parameters are differently sensitive to pathologies: iron (cardiac siderosis, intramyocardial haemorrhage) makes T1, T2 and T2* fall; T1 also falls with fat infiltration (Fabry disease); T2 increases with oedema (acute infarction, takotsubo cardiomyopathy, myocarditis, rheumatological disease); and native T1 increases with fibrosis, oedema and amyloid. Some of these changes are large (e.g. iron, oedema, amyloid), others more modest (diffuse fibrosis). They can be used to detect early disease, distinguish aetiology and, in some circumstances, guide therapy. In this review, we discuss these processes, illustrating clinical application and future advances.

  4. Normal tissue complication probability (NTCP), the clinician's perspective

    International Nuclear Information System (INIS)

    Yeoh, E.K.

    2011-01-01

    Full text: 3D radiation treatment planning has enabled dose distributions to be related to the volume of normal tissues irradiated. The dose-volume histograms (DVHs) thus derived have been used to set NTCP dose constraints to facilitate optimization of treatment planning. However, it is not widely appreciated that a number of important variables other than DVHs determine NTCP in the individual patient. These variables will be discussed under the headings of patient-related, treatment-related and tumour-related factors. Patient-related factors include age, co-morbidities such as connective tissue disease and diabetes mellitus, previous tissue/organ damage, tissue architectural organization (parallel or serial), regional tissue/organ and individual tissue/organ radiosensitivities, as well as the development of severe acute toxicity. Treatment-related variables which need to be considered include dose per fraction (if not the conventional 1.80-2.00 Gy/fraction, particularly for IMRT), number of fractions and total dose, dose rate (particularly if combined with brachytherapy) and concurrent chemotherapy or other biological dose modifiers. Tumour-related factors which impact on NTCP include infiltration of the normal tissue/organ, usually at presentation, leading to compromised function, but also with recurrent disease after radiation therapy, as well as variable tumour radiosensitivities between and within tumour types. Whilst evaluation of DVH data is a useful guide in the choice of treatment plan, the current state of knowledge requires the clinician to make an educated judgement based on a consideration of the other factors.

  5. Mapping of Mechanical Strains and Stresses around Quiescent Engineered Three-Dimensional Epithelial Tissues

    Science.gov (United States)

    Gjorevski, Nikolce; Nelson, Celeste M.

    2012-01-01

    Understanding how physical signals guide biological processes requires qualitative and quantitative knowledge of the mechanical forces generated and sensed by cells in a physiologically realistic three-dimensional (3D) context. Here, we used computational modeling and engineered epithelial tissues of precise geometry to define the experimental parameters that are required to measure directly the mechanical stress profile of 3D tissues embedded within native type I collagen. We found that to calculate the stresses accurately in these settings, we had to account for mechanical heterogeneities within the matrix, which we visualized and quantified using confocal reflectance and atomic force microscopy. Using this technique, we were able to obtain traction forces at the epithelium-matrix interface, and to resolve and quantify patterns of mechanical stress throughout the surrounding matrix. We discovered that whereas single cells generate tension by contracting and pulling on the matrix, the contraction of multicellular tissues can also push against the matrix, causing emergent compression. Furthermore, tissue geometry defines the spatial distribution of mechanical stress across the epithelium, which communicates mechanically over distances spanning hundreds of micrometers. Spatially resolved mechanical maps can provide insight into the types and magnitudes of physical parameters that are sensed and interpreted by multicellular tissues during normal and pathological processes. PMID:22828342

  6. A new formula for normal tissue complication probability (NTCP) as a function of equivalent uniform dose (EUD).

    Science.gov (United States)

    Luxton, Gary; Keall, Paul J; King, Christopher R

    2008-01-07

    To facilitate the use of biological outcome modeling for treatment planning, an exponential function is introduced as a simpler equivalent to the Lyman formula for calculating normal tissue complication probability (NTCP). The single parameter of the exponential function is chosen to reproduce the Lyman calculation to within approximately 0.3%, and thus enable easy conversion of data contained in empirical fits of Lyman parameters for organs at risk (OARs). Organ parameters for the new formula are given in terms of Lyman model m and TD(50), and conversely m and TD(50) are expressed in terms of the parameters of the new equation. The role of the Lyman volume-effect parameter n is unchanged from its role in the Lyman model. For a non-homogeneously irradiated OAR, an equation relates d(ref), n, v(eff) and the Niemierko equivalent uniform dose (EUD), where d(ref) and v(eff) are the reference dose and effective fractional volume of the Kutcher-Burman reduction algorithm (i.e. the LKB model). It follows in the LKB model that uniform EUD irradiation of an OAR results in the same NTCP as the original non-homogeneous distribution. The NTCP equation is therefore represented as a function of EUD. The inverse equation expresses EUD as a function of NTCP and is used to generate a table of EUD versus normal tissue complication probability for the Emami-Burman parameter fits as well as for OAR parameter sets from more recent data.
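As a sketch of the LKB quantities this abstract refers to, the following computes Niemierko's generalized EUD from a DVH and the standard Lyman (probit) NTCP that the paper's exponential function approximates. The single-parameter exponential surrogate itself is not reproduced here, and the organ parameters below are placeholders, not the Emami-Burman fits.

```python
import math

def eud(dose_volume_pairs, n):
    """Niemierko generalized EUD for a DVH given as (dose_Gy, fractional
    volume) pairs whose volumes sum to 1: EUD = (sum_i v_i * d_i**(1/n))**n."""
    return sum(v * d ** (1.0 / n) for d, v in dose_volume_pairs) ** n

def lyman_ntcp(eud_gy, td50, m):
    """Lyman probit NTCP at a uniform (or EUD-reduced) dose:
    NTCP = Phi((EUD - TD50) / (m * TD50))."""
    t = (eud_gy - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# Placeholder organ parameters and DVH (illustrative only):
TD50, M, N = 50.0, 0.16, 0.25
dvh = [(60.0, 0.3), (40.0, 0.4), (10.0, 0.3)]
ntcp = lyman_ntcp(eud(dvh, N), TD50, M)
```

Uniform irradiation of the whole organ at the EUD reproduces the NTCP of the original non-homogeneous distribution, which is exactly the equivalence the abstract describes.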

  7. A new formula for normal tissue complication probability (NTCP) as a function of equivalent uniform dose (EUD)

    International Nuclear Information System (INIS)

    Luxton, Gary; Keall, Paul J; King, Christopher R

    2008-01-01

    To facilitate the use of biological outcome modeling for treatment planning, an exponential function is introduced as a simpler equivalent to the Lyman formula for calculating normal tissue complication probability (NTCP). The single parameter of the exponential function is chosen to reproduce the Lyman calculation to within ∼0.3%, and thus enable easy conversion of data contained in empirical fits of Lyman parameters for organs at risk (OARs). Organ parameters for the new formula are given in terms of Lyman model m and TD_50, and conversely m and TD_50 are expressed in terms of the parameters of the new equation. The role of the Lyman volume-effect parameter n is unchanged from its role in the Lyman model. For a non-homogeneously irradiated OAR, an equation relates d_ref, n, v_eff and the Niemierko equivalent uniform dose (EUD), where d_ref and v_eff are the reference dose and effective fractional volume of the Kutcher-Burman reduction algorithm (i.e. the LKB model). It follows in the LKB model that uniform EUD irradiation of an OAR results in the same NTCP as the original non-homogeneous distribution. The NTCP equation is therefore represented as a function of EUD. The inverse equation expresses EUD as a function of NTCP and is used to generate a table of EUD versus normal tissue complication probability for the Emami-Burman parameter fits as well as for OAR parameter sets from more recent data.

  8. Recursive recovery of Markov transition probabilities from boundary value data

    Energy Technology Data Exchange (ETDEWEB)

    Patch, Sarah Kathyrn [Univ. of California, Berkeley, CA (United States)

    1994-04-01

    In an effort to mathematically describe the anisotropic diffusion of infrared radiation in biological tissue Gruenbaum posed an anisotropic diffusion boundary value problem in 1989. In order to accommodate anisotropy, he discretized the temporal as well as the spatial domain. The probabilistic interpretation of the diffusion equation is retained; radiation is assumed to travel according to a random walk (of sorts). In this random walk the probabilities with which photons change direction depend upon their previous as well as present location. The forward problem gives boundary value data as a function of the Markov transition probabilities. The inverse problem requires finding the transition probabilities from boundary value data. Problems in the plane are studied carefully in this thesis. Consistency conditions amongst the data are derived. These conditions have two effects: they prohibit inversion of the forward map but permit smoothing of noisy data. Next, a recursive algorithm which yields a family of solutions to the inverse problem is detailed. This algorithm takes advantage of all independent data and generates a system of highly nonlinear algebraic equations. Pluecker-Grassmann relations are instrumental in simplifying the equations. The algorithm is used to solve the 4 x 4 problem. Finally, the smallest nontrivial problem in three dimensions, the 2 x 2 x 2 problem, is solved.

  9. Taking potential probability function maps to the local scale and matching them with land use maps

    Science.gov (United States)

    Garg, Saryu; Sinha, Vinayak; Sinha, Baerbel

    2013-04-01

    Source-Receptor models have been developed using different methods. Residence-time weighted concentration back-trajectory analysis and the Potential Source Contribution Function (PSCF) are the two most popular techniques for identification of potential sources of a substance in a defined geographical area. Both techniques use back trajectories calculated using global models and assign values of probability/concentration to various locations in an area. These values represent the probability of threshold exceedances / the average concentration measured at the receptor in air masses with a certain residence time over a source area. Both techniques, however, have only been applied to regional and long-range transport phenomena due to inherent limitations in both the spatial accuracy and the temporal resolution of the back-trajectory calculations. Employing the above-mentioned concepts of residence-time weighted concentration back-trajectory analysis and PSCF, we developed a source-receptor model capable of identifying local and regional sources of air pollutants like Particulate Matter (PM), NOx, SO2 and VOCs. We use 1 to 30 minute averages of concentration values and wind direction and speed from a single receptor site or from multiple receptor sites to trace the air mass back in time. The model code assumes all the atmospheric transport to be Lagrangian and linearly extrapolates air masses reaching the receptor location backwards in time for a fixed number of steps. We restrict the model run to the lifetime of the chemical species under consideration. For long lived species the model run is limited to 180 trees/gridsquare); moderate concentrations for agricultural lands with low tree density (1.5-2.5 ppbv for 250 μg/m3 for traffic hotspots in Chandigarh City are observed. Based on the validation against the land use maps, the model appears to do an excellent job in source apportionment and identifying emission hotspots. Acknowledgement: We thank the IISER
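The cell-counting step behind PSCF can be sketched as follows, assuming the back-trajectory points have already been produced (here, by the linear extrapolation the abstract describes). The grid size, threshold, and trajectory format are illustrative, not the authors' implementation.

```python
from collections import defaultdict

def pscf(trajectories, receptor_conc, threshold, cell_deg=1.0):
    """Potential Source Contribution Function on a lat/lon grid:
    PSCF(i, j) = m_ij / n_ij, where n_ij counts all back-trajectory
    points falling in cell (i, j) and m_ij counts points from
    trajectories whose receptor concentration exceeded the threshold."""
    n_count = defaultdict(int)
    m_count = defaultdict(int)
    for points, conc in zip(trajectories, receptor_conc):
        exceeded = conc > threshold
        for lat, lon in points:
            key = (int(lat // cell_deg), int(lon // cell_deg))
            n_count[key] += 1
            if exceeded:
                m_count[key] += 1
    return {k: m_count[k] / n_count[k] for k in n_count}
```

Cells that are frequently crossed by air masses arriving with high receptor concentrations get PSCF values near 1, flagging them as likely source areas.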

  10. Normal Tissue Complication Probability Modeling of Acute Hematologic Toxicity in Cervical Cancer Patients Treated With Chemoradiotherapy

    International Nuclear Information System (INIS)

    Rose, Brent S.; Aydogan, Bulent; Liang, Yun; Yeginer, Mete; Hasselle, Michael D.; Dandekar, Virag; Bafana, Rounak; Yashar, Catheryn M.; Mundt, Arno J.; Roeske, John C.; Mell, Loren K.

    2011-01-01

    Purpose: To test the hypothesis that increased pelvic bone marrow (BM) irradiation is associated with increased hematologic toxicity (HT) in cervical cancer patients undergoing chemoradiotherapy and to develop a normal tissue complication probability (NTCP) model for HT. Methods and Materials: We tested associations between hematologic nadirs during chemoradiotherapy and the volume of BM receiving ≥10 and ≥20 Gy (V10 and V20) using a previously developed linear regression model. The validation cohort consisted of 44 cervical cancer patients treated with concurrent cisplatin and pelvic radiotherapy. Subsequently, these data were pooled with data from 37 identically treated patients from a previous study, forming a cohort of 81 patients for normal tissue complication probability analysis. Generalized linear modeling was used to test associations between hematologic nadirs and dosimetric parameters, adjusting for body mass index. Receiver operating characteristic curves were used to derive optimal dosimetric planning constraints. Results: In the validation cohort, significant negative correlations were observed between white blood cell count nadir and V10 (regression coefficient β = -0.060, p = 0.009) and V20 (β = -0.044, p = 0.010). In the combined cohort, the (adjusted) β estimates for log(white blood cell) vs. V10 and V20 were -0.022 (p = 0.025) and -0.021 (p = 0.002), respectively. Patients with V10 ≥ 95% were more likely to experience Grade ≥3 leukopenia (68.8% vs. 24.6%), as were patients with V20 > 76% (57.7% vs. 21.8%, p = 0.001). Conclusions: These findings support the hypothesis that HT increases with increasing pelvic BM volume irradiated. Efforts to maintain V10 < 95% and V20 < 76% may reduce HT.
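The dosimetric parameters V10 and V20 used above are simple volume fractions of the bone-marrow dose distribution. A minimal sketch, where the voxel dose list is hypothetical and equal-volume voxels are assumed:

```python
def v_dose(voxel_doses_gy, threshold_gy):
    """Percentage of the organ volume (equal-volume voxels assumed)
    receiving at least threshold_gy; V10 = v_dose(doses, 10.0), etc."""
    hit = sum(1 for d in voxel_doses_gy if d >= threshold_gy)
    return 100.0 * hit / len(voxel_doses_gy)

# Hypothetical bone-marrow voxel doses (Gy):
bm = [5.0, 15.0, 25.0, 30.0]
v10, v20 = v_dose(bm, 10.0), v_dose(bm, 20.0)
```

In practice these values come from the planning system's DVH, but the definition is exactly this thresholded volume fraction.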

  11. Oil spill contamination probability in the southeastern Levantine basin.

    Science.gov (United States)

    Goldman, Ron; Biton, Eli; Brokovich, Eran; Kark, Salit; Levin, Noam

    2015-02-15

    Recent gas discoveries in the eastern Mediterranean Sea led to multiple operations with substantial economic interest, and with them there is a risk of oil spills and their potential environmental impacts. To examine the potential spatial distribution of this threat, we created seasonal maps of the probability of oil spill pollution reaching an area in the Israeli coastal and exclusive economic zones, given knowledge of its initial sources. We performed simulations of virtual oil spills using realistic atmospheric and oceanic conditions. The resulting maps show dominance of the alongshore northerly current, which causes the high probability areas to be stretched parallel to the coast, increasing contamination probability downstream of source points. The seasonal westerly wind forcing determines how wide the high probability areas are, and may also restrict these to a small coastal region near source points. Seasonal variability in probability distribution, oil state, and pollution time is also discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.
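Probability maps of this kind can be assembled by counting, per grid cell, the fraction of simulated spills whose particle track reaches the cell. A minimal sketch, assuming the tracks are already available from the oceanographic/atmospheric simulations; the grid resolution and coordinates are illustrative:

```python
from collections import defaultdict

def contamination_probability(spill_tracks, cell_deg=0.25):
    """For each grid cell, the fraction of simulated oil spills whose
    particle track enters that cell at least once."""
    hits = defaultdict(int)
    for track in spill_tracks:
        # count each cell at most once per spill
        visited = {(int(lat // cell_deg), int(lon // cell_deg))
                   for lat, lon in track}
        for c in visited:
            hits[c] += 1
    return {c: hits[c] / len(spill_tracks) for c in hits}
```

Running this separately on winter and summer ensembles would yield the kind of seasonal maps the abstract describes, with alongshore currents stretching high-probability cells parallel to the coast.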

  12. Surface density mapping of natural tissue by a scanning haptic microscope (SHM).

    Science.gov (United States)

    Moriwaki, Takeshi; Oie, Tomonori; Takamizawa, Keiichi; Murayama, Yoshinobu; Fukuda, Toru; Omata, Sadao; Nakayama, Yasuhide

    2013-02-01

    To expand the performance capacity of the scanning haptic microscope (SHM) beyond surface mapping microscopy of elastic modulus or topography, surface density mapping of a natural tissue was performed by applying a measurement theory of SHM, in which a frequency change occurs upon contact of the sample surface with the SHM sensor - a microtactile sensor (MTS) that vibrates at a pre-determined constant oscillation frequency. This change was mainly stiffness-dependent at a low oscillation frequency and density-dependent at a high oscillation frequency. Two paragon examples with extremely different densities but similar macroscopic elastic moduli in the range of natural soft tissues were selected: one was agar hydrogels and the other silicone organogels, with extremely low (less than 25 mg/cm(3)) and high densities (ca. 1300 mg/cm(3)), respectively. Measurements were performed in saline solution near the second-order resonance frequency, which led to the elastic modulus, and near the third-order resonance frequency. There was little difference in the frequency changes between the two resonance frequencies in agar gels. In contrast, in silicone gels, a large frequency change by MTS contact was observed near the third-order resonance frequency, indicating that the frequency change near the third-order resonance frequency reflected changes in both density and elastic modulus. Therefore, a density image of the canine aortic wall was subsequently obtained by subtracting the image observed near the second-order resonance frequency from that near the third-order resonance frequency. The elastin-rich region had a higher density than the collagen-rich region.

  13. Normal tissue complication probability modeling of radiation-induced hypothyroidism after head-and-neck radiation therapy.

    Science.gov (United States)

    Bakhshandeh, Mohsen; Hashemi, Bijan; Mahdavi, Seied Rabi Mehdi; Nikoofar, Alireza; Vasheghani, Maryam; Kazemnejad, Anoshirvan

    2013-02-01

    To determine the dose-response relationship of the thyroid for radiation-induced hypothyroidism in head-and-neck radiation therapy, according to 6 normal tissue complication probability models, and to find the best-fit parameters of the models. Sixty-five patients treated with primary or postoperative radiation therapy for various cancers in the head-and-neck region were prospectively evaluated. Patient serum samples (tri-iodothyronine, thyroxine, thyroid-stimulating hormone [TSH], free tri-iodothyronine, and free thyroxine) were measured before and at regular time intervals until 1 year after the completion of radiation therapy. Dose-volume histograms (DVHs) of the patients' thyroid gland were derived from their computed tomography (CT)-based treatment planning data. Hypothyroidism was defined as increased TSH (subclinical hypothyroidism) or increased TSH in combination with decreased free thyroxine and thyroxine (clinical hypothyroidism). Thyroid DVHs were converted to 2 Gy/fraction equivalent doses using the linear-quadratic formula with α/β = 3 Gy. The evaluated models included the following: Lyman with the DVH reduced to the equivalent uniform dose (EUD), known as LEUD; Logit-EUD; mean dose; relative seriality; individual critical volume; and population critical volume models. The parameters of the models were obtained by fitting the patients' data using a maximum likelihood analysis method. The goodness of fit of the models was determined by the 2-sample Kolmogorov-Smirnov test. Ranking of the models was made according to Akaike's information criterion. Twenty-nine patients (44.6%) experienced hypothyroidism. None of the models was rejected according to the evaluation of the goodness of fit. The mean dose model was ranked as the best model on the basis of its Akaike's information criterion value. The D(50) estimated from the models was approximately 44 Gy. The implemented normal tissue complication probability models showed a parallel architecture for the thyroid.
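The maximum-likelihood fitting described for the best-ranked mean dose model can be sketched with a probit dose-response and a simple grid search. The grids, doses, and binary outcomes below are synthetic stand-ins, not the study's data, and a real fit would use a proper optimizer rather than a coarse grid.

```python
import math

def probit_ntcp(dose_gy, td50, m):
    """Probit (Lyman-type) dose-response for a mean-dose NTCP model."""
    t = (dose_gy - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

def fit_mean_dose_model(mean_doses, events, td50_grid, m_grid):
    """Grid-search maximum likelihood for (TD50, m): maximize the
    Bernoulli log-likelihood of the observed hypothyroidism events
    given each patient's mean thyroid dose."""
    best_params, best_ll = None, -math.inf
    for td50 in td50_grid:
        for m in m_grid:
            ll = 0.0
            for d, event in zip(mean_doses, events):
                p = min(max(probit_ntcp(d, td50, m), 1e-9), 1.0 - 1e-9)
                ll += math.log(p) if event else math.log(1.0 - p)
            if ll > best_ll:
                best_params, best_ll = (td50, m), ll
    return best_params, best_ll

# Synthetic example: complications appear above ~45 Gy mean dose.
params, ll = fit_mean_dose_model(
    [20.0, 30.0, 40.0, 50.0, 60.0, 70.0], [0, 0, 0, 1, 1, 1],
    td50_grid=[40.0, 44.0, 48.0], m_grid=[0.1, 0.2])
```

The same likelihood machinery, evaluated per model, is what feeds the Akaike information criterion ranking the abstract mentions.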

  14. Normal Tissue Complication Probability Modeling of Radiation-Induced Hypothyroidism After Head-and-Neck Radiation Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Bakhshandeh, Mohsen [Department of Medical Physics, Faculty of Medical Sciences, Tarbiat Modares University, Tehran (Iran, Islamic Republic of); Hashemi, Bijan, E-mail: bhashemi@modares.ac.ir [Department of Medical Physics, Faculty of Medical Sciences, Tarbiat Modares University, Tehran (Iran, Islamic Republic of); Mahdavi, Seied Rabi Mehdi [Department of Medical Physics, Faculty of Medical Sciences, Tehran University of Medical Sciences, Tehran (Iran, Islamic Republic of); Nikoofar, Alireza; Vasheghani, Maryam [Department of Radiation Oncology, Hafte-Tir Hospital, Tehran University of Medical Sciences, Tehran (Iran, Islamic Republic of); Kazemnejad, Anoshirvan [Department of Biostatistics, Faculty of Medical Sciences, Tarbiat Modares University, Tehran (Iran, Islamic Republic of)

    2013-02-01

    Purpose: To determine the dose-response relationship of the thyroid for radiation-induced hypothyroidism in head-and-neck radiation therapy, according to 6 normal tissue complication probability models, and to find the best-fit parameters of the models. Methods and Materials: Sixty-five patients treated with primary or postoperative radiation therapy for various cancers in the head-and-neck region were prospectively evaluated. Patient serum samples (tri-iodothyronine, thyroxine, thyroid-stimulating hormone [TSH], free tri-iodothyronine, and free thyroxine) were measured before and at regular time intervals until 1 year after the completion of radiation therapy. Dose-volume histograms (DVHs) of the patients' thyroid gland were derived from their computed tomography (CT)-based treatment planning data. Hypothyroidism was defined as increased TSH (subclinical hypothyroidism) or increased TSH in combination with decreased free thyroxine and thyroxine (clinical hypothyroidism). Thyroid DVHs were converted to 2 Gy/fraction equivalent doses using the linear-quadratic formula with α/β = 3 Gy. The evaluated models included the following: Lyman with the DVH reduced to the equivalent uniform dose (EUD), known as LEUD; Logit-EUD; mean dose; relative seriality; individual critical volume; and population critical volume models. The parameters of the models were obtained by fitting the patients' data using a maximum likelihood analysis method. The goodness of fit of the models was determined by the 2-sample Kolmogorov-Smirnov test. Ranking of the models was made according to Akaike's information criterion. Results: Twenty-nine patients (44.6%) experienced hypothyroidism. None of the models was rejected according to the evaluation of the goodness of fit. The mean dose model was ranked as the best model on the basis of its Akaike's information criterion value. The D(50) estimated from the models was approximately 44 Gy. Conclusions: The implemented normal tissue complication probability models showed a parallel architecture for the thyroid.

  15. In-Field, In Situ, and In Vivo 3-Dimensional Elemental Mapping for Plant Tissue and Soil Analysis Using Laser-Induced Breakdown Spectroscopy

    Directory of Open Access Journals (Sweden)

    Chunjiang Zhao

    2016-10-01

    Full Text Available Sensing and mapping element distributions in plant tissues and its growth environment has great significance for understanding the uptake, transport, and accumulation of nutrients and harmful elements in plants, as well as for understanding interactions between plants and the environment. In this study, we developed a 3-dimensional elemental mapping system based on laser-induced breakdown spectroscopy that can be deployed in-field to directly measure the distribution of multiple elements in living plants as well as in the soil. Mapping is performed by a fast scanning laser, which ablates a micro volume of a sample to form a plasma. The presence and concentration of specific elements are calculated using the atomic, ionic, and molecular spectral characteristics of the plasma emission spectra. Furthermore, we mapped the pesticide residues in maize leaves after spraying to demonstrate the capacity of this method for trace elemental mapping. We also used the system to quantitatively detect the element concentrations in soil, which can be used to further understand the element transport between plants and soil. We demonstrate that this method has great potential for elemental mapping in plant tissues and soil with the advantages of 3-dimensional and multi-elemental mapping, in situ and in vivo measurement, flexible use, and low cost.

  16. Osteosarcoma: correlation of T1 map and histology map

    International Nuclear Information System (INIS)

    Suh, Jin Suck; Yun, Mi Jin; Jeong, Eun Kee; Shin, Kyoo Ho; Yang, Woo Ick

    1999-01-01

    To determine whether T1 mapping shows regional differences between viable and necrotic regions of osteosarcomas after anticancer chemotherapy and to assess whether this mapping is able to express the characteristics of various intramural tissue components. Eleven of 20 osteosarcomas were included in this study, while the remaining nine were excluded because the tumor site was inappropriate for comparison of T1 map and tumor macrosection. All patients underwent MR imaging for the purpose of T1 mapping, followed by pre-operative chemotherapy and subsequent limb-salvage surgery. Spin echo pulse sequencing was used with varying TR (100, 200, 400, 800, 1600, and 2400 msec) and a constant TE of 20 msec. Using a C-language software program, T1 relaxation time was calculated on a pixel-by-pixel basis and then a T1 map was generated by using a post-processing program, NIH Image. We attempted correlation of the T1 map and histologic findings, particularly in regions of interest (ROI) if certain areas were different from other regions on either the T1 or histologic map. Value was expressed as an average of the ratio of T1 of ROI and T1 of fat tissue, and this was used as an internal reference for normalization of the measurement. Tumor necrosis was 100% (Grade IV) in six specimens, and over 90% (Grade III) in five. Viable tumor cells were found mostly in regions with chondroid matrix and seldom in regions with osteoid matrix. Regardless of cell viability, values ranged from 0.9 to 9.87 (mean, 4.02) in tumor necrotic area with osteoid matrices, and from 3.04 to 3.9 (mean, 3.55) in areas with chondroid matrices. Other regions with fibrous tissue proliferation, hemorrhage, and fatty necrosis showed values of 2.92-9.83 (mean, 7.20), 2.65-5.96 (mean, 3.59), and 1.43-3.11 (mean, 2.68), respectively. The values of various tissues overlapped. No statistically significant difference was found between regions in which tumors were viable and those with tumor necrosis. Although we hypothesized
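The pixel-by-pixel T1 calculation from the six TR values can be sketched as follows, assuming the standard saturation-recovery spin-echo signal model S(TR) = S0·(1 − exp(−TR/T1)). The grid-search fit below is an illustrative stand-in for the authors' C program, not a reconstruction of it.

```python
import math

TRS_MS = [100, 200, 400, 800, 1600, 2400]  # repetition times used in the study

def fit_t1(signals, trs=TRS_MS, t1_grid=range(50, 3001, 10)):
    """Per-pixel fit of S(TR) = S0 * (1 - exp(-TR / T1)).
    For each candidate T1 the best S0 follows in closed form from
    linear least squares; return the pair minimizing the residual."""
    best_t1, best_s0, best_sse = None, None, math.inf
    for t1 in t1_grid:
        f = [1.0 - math.exp(-tr / t1) for tr in trs]
        s0 = sum(s * fi for s, fi in zip(signals, f)) / sum(fi * fi for fi in f)
        sse = sum((s - s0 * fi) ** 2 for s, fi in zip(signals, f))
        if sse < best_sse:
            best_t1, best_s0, best_sse = t1, s0, sse
    return best_t1, best_s0
```

Applying this fit to every pixel of the co-registered TR series yields the T1 map; normalizing each ROI's T1 by that of fat, as in the study, then removes scanner-dependent scaling.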

  17. Classical probabilities for Majorana and Weyl spinors

    International Nuclear Information System (INIS)

    Wetterich, C.

    2011-01-01

    Highlights: → Map of classical statistical Ising model to fermionic quantum field theory. → Lattice-regularized real Grassmann functional integral for single Weyl spinor. → Emerging complex structure characteristic for quantum physics. → A classical statistical ensemble describes a quantum theory. - Abstract: We construct a map between the quantum field theory of free Weyl or Majorana fermions and the probability distribution of a classical statistical ensemble for Ising spins or discrete bits. More precisely, a Grassmann functional integral based on a real Grassmann algebra specifies the time evolution of the real wave function q_τ(t) for the Ising states τ. The time-dependent probability distribution of a generalized Ising model obtains as p_τ(t) = q_τ²(t). The functional integral employs a lattice regularization for single Weyl or Majorana spinors. We further introduce the complex structure characteristic for quantum mechanics. Probability distributions of the Ising model which correspond to one or many propagating fermions are discussed explicitly. Expectation values of observables can be computed equivalently in the classical statistical Ising model or in the quantum field theory for fermions.

  18. Parametric methods for characterizing myocardial tissue by magnetic resonance imaging (part 2): T2 mapping.

    Science.gov (United States)

    Perea Palazón, R J; Solé Arqués, M; Prat González, S; de Caralt Robira, T M; Cibeira López, M T; Ortiz Pérez, J T

    2015-01-01

    Cardiac magnetic resonance imaging is considered the reference technique for characterizing myocardial tissue; for example, T2-weighted sequences make it possible to evaluate areas of edema or myocardial inflammation. However, traditional sequences have many limitations and provide only qualitative information. Moreover, traditional sequences depend on the reference to remote myocardium or skeletal muscle, which limits their ability to detect and quantify diffuse myocardial damage. Recently developed magnetic resonance myocardial mapping techniques enable quantitative assessment of parameters indicative of edema. These techniques have proven better than traditional sequences both in acute cardiomyopathy and in acute ischemic heart disease. This article synthesizes current developments in T2 mapping as well as their clinical applications and limitations. Copyright © 2014 SERAM. Published by Elsevier España, S.L.U. All rights reserved.
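Quantitative T2 mapping of the kind reviewed here fits a mono-exponential decay S(TE) = S0·exp(−TE/T2) per voxel across several echo times. A minimal log-linear least-squares sketch; the echo times and signals are illustrative, and clinical implementations typically use more robust nonlinear fits:

```python
import math

def fit_t2(tes_ms, signals):
    """Mono-exponential T2 fit via log-linear least squares:
    ln S(TE) = ln S0 - TE / T2, hence T2 = -1 / slope."""
    logs = [math.log(s) for s in signals]
    n = len(tes_ms)
    mx = sum(tes_ms) / n
    my = sum(logs) / n
    sxx = sum((x - mx) ** 2 for x in tes_ms)
    sxy = sum((x - mx) * (y - my) for x, y in zip(tes_ms, logs))
    slope = sxy / sxx
    return -1.0 / slope, math.exp(my - slope * mx)
```

Because the fit yields an absolute T2 per voxel, it avoids the reference to remote myocardium or skeletal muscle that limits conventional T2-weighted sequences in diffuse disease.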

  19. Does the fluence map editing in electronic tissue compensator improve dose homogeneity in bilateral field plan of head and neck patients?

    Directory of Open Access Journals (Sweden)

    Kinhikar Rajesh

    2008-01-01

    Full Text Available The purpose of this study was to evaluate the effect of fluence map editing in electronic tissue compensator (ETC) on the dose homogeneity for head and neck cancer patients. Treatment planning using 6-MV X-rays and a bilateral field arrangement employing ETC was carried out on the computed tomography (CT) datasets of 20 patients with head and neck cancer. All the patients were planned in the Varian Eclipse three-dimensional treatment planning system (3DTPS) with dynamic multileaf collimator (DMLC). The treatment plans with and without fluence editing were compared, and the effect of pre-editing and post-editing the fluence maps in the treatment field was evaluated. The skin dose was measured with thermoluminescent dosimeters (TLDs) and was compared with the skin dose estimated by the TPS. The mean percentage volume of the tissue receiving at least 107% of the prescription dose was 5.4 (range 1.5-10; SD 2.4). Post-editing of the fluence map showed that the mean percentage volume of the tissue receiving at least 107% of the prescription dose was 0.47 (range 0.1-0.9; SD 0.3). The mean skin dose measured with TLD was found to be 74% (range 71-80%) of the prescribed dose, while the TPS showed the mean skin dose as 85% (range 80-90%). The TPS thus overestimated the skin dose by 11%. Fluence map editing proved to be a potential tool for improving dose homogeneity in head and neck cancer patients planned with ETC, reducing the hot spots in the treatment region as well. Treatment with ETC is feasible with DMLC and does not take any additional time for setup or delivery. The method used to edit the fluence maps is simple and time efficient. Manual control over a plan is essential to create the best treatment plan possible.

  20. Converting dose distributions into tumour control probability

    International Nuclear Information System (INIS)

    Nahum, A.E.

    1996-01-01

    The endpoints in radiotherapy that are truly of relevance are not dose distributions but the probability of local control, sometimes known as the Tumour Control Probability (TCP) and the Probability of Normal Tissue Complications (NTCP). A model for the estimation of TCP based on simple radiobiological considerations is described. It is shown that incorporation of inter-patient heterogeneity into the radiosensitivity parameter a through s_a can result in a clinically realistic slope for the dose-response curve. The model is applied to inhomogeneous target dose distributions in order to demonstrate the relationship between dose uniformity and s_a. The consequences of varying clonogenic density are also explored. Finally the model is applied to the target-volume DVHs for patients in a clinical trial of conformal pelvic radiotherapy; the effect of dose inhomogeneities on distributions of TCP are shown as well as the potential benefits of customizing the target dose according to normal-tissue DVHs. (author). 37 refs, 9 figs
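The two ingredients of such a TCP model, a Poisson-statistics TCP for a single patient and averaging over an inter-patient spread s_a in the radiosensitivity a, can be sketched as below. The clonogen numbers, doses, and spread are illustrative, and the quadrature is a simple stand-in for the population averaging in the model.

```python
import math

def poisson_tcp(alpha, dose_bins):
    """Single-patient Poisson TCP: product over dose bins of
    exp(-N_i * exp(-alpha * D_i)), with dose_bins given as
    (clonogen_count, total_dose_Gy) pairs."""
    return math.exp(-sum(n * math.exp(-alpha * d) for n, d in dose_bins))

def population_tcp(alpha_mean, sigma_a, dose_bins, steps=2001):
    """Average the single-patient TCP over a normal spread of the
    radiosensitivity alpha (midpoint quadrature over +/- 4 s.d.),
    modeling inter-patient heterogeneity."""
    lo = alpha_mean - 4.0 * sigma_a
    width = 8.0 * sigma_a / steps
    norm = 1.0 / (sigma_a * math.sqrt(2.0 * math.pi))
    total = 0.0
    for i in range(steps):
        a = lo + (i + 0.5) * width
        weight = norm * math.exp(-0.5 * ((a - alpha_mean) / sigma_a) ** 2)
        total += weight * poisson_tcp(a, dose_bins) * width
    return total
```

The heterogeneity term is what flattens the otherwise unrealistically steep single-patient dose-response curve into the clinically observed slope.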

  2. MR-based automatic delineation of volumes of interest in human brain PET images using probability maps

    DEFF Research Database (Denmark)

    Svarer, Claus; Madsen, Karina; Hasselbalch, Steen G.

    2005-01-01

    subjects' MR-images, where VOI sets have been defined manually. High-resolution structural MR-images and 5-HT(2A) receptor binding PET-images (in terms of (18)F-altanserin binding) from 10 healthy volunteers and 10 patients with mild cognitive impairment were included for the analysis. A template including...... 35 VOIs was manually delineated on the subjects' MR images. Through a warping algorithm template VOI sets defined from each individual were transferred to the other subjects MR-images and the voxel overlap was compared to the VOI set specifically drawn for that particular individual. Comparisons were...... delineation of the VOI set. The approach was also shown to work equally well in individuals with pronounced cerebral atrophy. Probability-map-based automatic delineation of VOIs is a fast, objective, reproducible, and safe way to assess regional brain values from PET or SPECT scans. In addition, the method...

  3. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  4. Three-dimensional multislice spiral computed tomographic angiography: a potentially useful tool for safer free tissue transfer to complicated regions

    DEFF Research Database (Denmark)

    Demirtas, Yener; Cifci, Mehmet; Kelahmetoglu, Osman

    2009-01-01

    Three-dimensional multislice spiral computed tomographic angiography (3D-MSCTA) is a minimally invasive method of vascular mapping. The aim of this study was to evaluate the clinical usefulness of this imaging technique in delineating the recipient vessels for safer free tissue transfer to complicated regions...... be kept in mind, especially in the patients with peripheral vascular disease. 3D-MSCTA has the potential to replace digital subtraction angiography for planning of microvascular reconstructions, and newer devices with higher resolutions will probably increase the reliability of this technique. (c) 2009...

  5. Vaccination with map specific peptides reduces map burden in tissues of infected goats

    DEFF Research Database (Denmark)

    Melvang, Heidi Mikkelsen; Hassan, Sufia Butt; Thakur, Aneesh

    As an alternative to protein-based vaccines, we investigated the effect of post-exposure vaccination with Map specific peptides in a goat model aiming at developing a Map vaccine that will neither interfere with diagnosis of paratuberculosis nor bovine tuberculosis. Peptides were initially select...... in the unvaccinated control group seroconverted in ID Screen® ELISA at last sampling prior to euthanasia. These results indicate that a subunit vaccine against Map can induce a protective immune response against paratuberculosis in goats....

  6. Local Directional Probability Optimization for Quantification of Blurred Gray/White Matter Junction in Magnetic Resonance Image

    Directory of Open Access Journals (Sweden)

    Xiaoxia Qu

    2017-09-01

    Full Text Available The blurred gray/white matter junction is an important feature of focal cortical dysplasia (FCD) lesions. FCD is the main cause of epilepsy and can be detected through magnetic resonance (MR) imaging. Several earlier studies have focused on computing the gradient magnitude of the MR image and used the resulting map to model the blurred gray/white matter junction. However, gradient magnitude cannot quantify the blurred gray/white matter junction. Therefore, we proposed a novel algorithm called local directional probability optimization (LDPO) for detecting and quantifying the width of the gray/white matter boundary (GWB) within the lesional areas. The proposed LDPO method mainly consists of the following three stages: (1) introduction of a hidden Markov random field-expectation-maximization algorithm to compute the probability images of brain tissues in order to obtain the GWB region; (2) generation of local directions from gray matter (GM) to white matter (WM) passing through the GWB, considering the GWB to be an electric potential field; (3) determination of the optimal local directions for any given voxel of the GWB, based on iterative searching of the neighborhood. This was then used to measure the width of the GWB. The proposed LDPO method was tested on real MR images of patients with FCD lesions. The results indicated that the LDPO method could quantify the GWB width. On the GWB width map, the width of the blurred GWB in the lesional region was observed to be greater than that in the non-lesional regions. The proposed GWB width map produced higher F-scores in terms of detecting the blurred GWB within the FCD lesional region as compared to those of FCD feature maps, indicating a better trade-off between precision and recall.

  7. Joint probability discrimination between stationary tissue and blood velocity signals

    DEFF Research Database (Denmark)

    Schlaikjer, Malene; Jensen, Jørgen Arendt

    2001-01-01

    before and after echo-canceling, and (b) the amplitude variations between samples in consecutive RF-signals before and after echo-canceling. The statistical discriminator was obtained by computing the probability density functions (PDFs) for each feature through histogram analysis of data....... This study presents a new statistical discriminator. Investigation of the RF-signals reveals that features can be derived that distinguish the segments of the signal which do and do not carry information on the blood flow. In this study 4 features have been determined: (a) the energy content in the segments....... The discrimination is performed by determining the joint probability of the features for the segment under investigation and choosing the segment type that is most likely. The method was tested on simulated data resembling RF-signals from the carotid artery....
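The histogram-PDF, joint-probability decision described above can be sketched as follows. This is a simplified stand-in for the authors' discriminator: the per-class PDFs are estimated by histograms, features are treated as independent so the joint is a product of per-feature likelihoods, and the demonstration data are synthetic rather than RF-signals.

```python
import numpy as np

def fit_histogram_pdfs(features, labels, bins=32):
    """Per-class, per-feature PDFs estimated by histogram analysis.
    features: (n_segments, n_features); labels: 0 = stationary tissue, 1 = blood."""
    pdfs = {}
    for cls in (0, 1):
        cls_rows = features[labels == cls]
        pdfs[cls] = [np.histogram(cls_rows[:, j], bins=bins, density=True)
                     for j in range(features.shape[1])]
    return pdfs

def classify_segment(segment, pdfs, eps=1e-12):
    """Choose the segment type with the largest joint probability of its
    features (log-sum of the per-feature histogram PDFs)."""
    scores = {}
    for cls, feat_pdfs in pdfs.items():
        logp = 0.0
        for x, (hist, edges) in zip(segment, feat_pdfs):
            idx = np.clip(np.searchsorted(edges, x) - 1, 0, len(hist) - 1)
            logp += np.log(hist[idx] + eps)  # eps guards empty bins
        scores[cls] = logp
    return max(scores, key=scores.get)
```

Classification then amounts to evaluating each new segment's feature vector under both sets of PDFs and keeping the more probable segment type.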

  8. Suitable reference tissues for quantitative susceptibility mapping of the brain.

    Science.gov (United States)

    Straub, Sina; Schneider, Till M; Emmerich, Julian; Freitag, Martin T; Ziener, Christian H; Schlemmer, Heinz-Peter; Ladd, Mark E; Laun, Frederik B

    2017-07-01

    Since quantitative susceptibility mapping (QSM) quantifies magnetic susceptibility relative to a reference value, a suitable reference tissue has to be available to compare different subjects and stages of disease. To find such a suitable reference tissue for QSM of the brain, melanoma patients with and without brain metastases were measured. Twelve reference regions were chosen and assessed for stability of susceptibility values with respect to multiple intra-individual and inter-individual measurements, age, and stage of disease. Cerebrospinal fluid (CSF), the internal capsule and one region in the splenium of the corpus callosum are the regions with the smallest standard deviations of the mean susceptibility value. The mean susceptibility is 0.010 ± 0.014 ppm for CSF in the atrium of the lateral ventricles (csf_post), -0.060 ± 0.019 ppm for the posterior limb of the internal capsule (ci2), and -0.008 ± 0.019 ppm for the splenium of the corpus callosum. csf_post and ci2 show nearly no dependence on age or stage of disease, whereas some other regions, e.g., the red nucleus, show moderate dependence on age or disease. The internal capsule and CSF appear to be the most suitable reference regions for QSM of the brain in the melanoma patients studied. Both showed virtually no dependence on age or disease and small variations among patients. Magn Reson Med 78:204-214, 2017. © 2016 International Society for Magnetic Resonance in Medicine.

  9. New exponential, logarithm and q-probability in the non-extensive statistical physics

    OpenAIRE

    Chung, Won Sang

    2013-01-01

    In this paper, a new exponential and logarithm related to non-extensive statistical physics are proposed by using the q-sum and q-product, which satisfy distributivity. We then discuss the q-mapping from an ordinary probability to a q-probability. The q-entropy defined by the idea of q-probability is shown to be q-additive.

  10. A comparison between probability and information measures of uncertainty in a simulated soil map and the economic value of imperfect soil information.

    Science.gov (United States)

    Lark, R. Murray

    2014-05-01

    Conventionally, the uncertainty of a soil map has been expressed in terms of the mean purity of its map units: the probability that the soil profile class examined at a site corresponds to the eponymous class of the simple map unit delineated there (Burrough et al., 1971). This measure of uncertainty has an intuitive meaning and is used for quality control in soil survey contracts (Western, 1978). However, it may be of limited value to the manager or policy maker who wants to decide whether the map provides a basis for decision making, and whether the cost of producing a better map would be justified. In this study I extend a published analysis of the economic implications of uncertainty in a soil map (Giasson et al., 2000). A decision analysis was developed to assess the economic value of imperfect soil map information for agricultural land use planning. Random error matrices for the soil map units were then generated, subject to constraints which ensure consistency with fixed frequencies of the different soil classes. For each error matrix the mean map unit purity was computed, and the value of the implied imperfect soil information was computed by the decision analysis. An alternative measure of the uncertainty in a soil map was considered: the mean soil map information, which is the difference between the information content of a soil observation at a random location in the region and the information content of a soil observation given that the map unit is known. I examined the relationship between the value of imperfect soil information and the purity and information measures of map uncertainty. In both cases there was considerable variation in the economic value of possible maps with fixed values of the uncertainty measure. However, the correlation was somewhat stronger with the information measure, and there was a clear upper bound on the value of an imperfect soil map when the mean information takes some
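Both uncertainty measures can be computed from a joint unit-class probability table. A small sketch (the decision-analysis step is omitted, and the table used in the test is illustrative, not Lark's survey data): mean purity is the diagonal mass of the table, and mean information is the mutual information H(class) - H(class | map unit).

```python
import numpy as np

def entropy_bits(p):
    """Shannon entropy in bits, ignoring zero-probability entries."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def purity_and_information(joint):
    """joint[u, c] = P(site falls in map unit u AND has true soil class c);
    square table, with unit u eponymous with class u."""
    joint = joint / joint.sum()
    unit_p = joint.sum(axis=1)           # P(map unit)
    class_p = joint.sum(axis=0)          # P(soil class)
    purity = float(np.trace(joint))      # mean map-unit purity
    # mean information a map unit provides: H(class) - H(class | unit)
    h_cond = sum(unit_p[u] * entropy_bits(joint[u] / unit_p[u])
                 for u in range(len(unit_p)) if unit_p[u] > 0)
    return purity, entropy_bits(class_p) - h_cond
```

A perfect map attains purity 1 and information equal to H(class); a map independent of the true soil class carries zero information even though its purity equals the frequency of the dominant classes.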

  11. Post-exposure vaccination with multi-stage vaccine significantly reduce map level in tissues without interference in diagnostics

    DEFF Research Database (Denmark)

    Thakur, Aneesh; Aagaard, Claus; Melvang, Heidi Mikkelsen

    A new (Fet11) vaccine against paratuberculosis based on recombinant antigens from acute and latent stages of Map infection was developed to be used without interference with diagnostic tests for bovine TB and Johne’s disease. Calves were orally inoculated with 2x10E10 live Map in their third week...... of life and randomly assigned to four groups of seven calves each. One group was left unvaccinated, while other calves were post-exposure vaccinated with either a whole-cell vaccine at 16 weeks, or Fet11 vaccine at 3 and 7, or 16 and 20 weeks of age, respectively. Antibody responses were measured by ID...... Screen® ELISA and individual vaccine protein ELISAs along with FACS and IFN-γ responses to PPDj and to individual vaccine proteins. At termination 8 or 12 months of age, Map burden in a number of gut tissues was determined by quantitative IS900 PCR and histopathology. Fet11 vaccination of calves at 16...

  12. Parametric techniques for characterizing myocardial tissue by magnetic resonance imaging (part 1): T1 mapping.

    Science.gov (United States)

    Perea Palazón, R J; Ortiz Pérez, J T; Prat González, S; de Caralt Robira, T M; Cibeira López, M T; Solé Arqués, M

    2016-01-01

    The development of myocardial fibrosis is a common process in the appearance of ventricular dysfunction in many heart diseases. Magnetic resonance imaging makes it possible to accurately evaluate the structure and function of the heart, and its role in the macroscopic characterization of myocardial fibrosis by late enhancement techniques has been widely validated clinically. Recent studies have demonstrated that T1-mapping techniques can quantify diffuse myocardial fibrosis and the expansion of the myocardial extracellular space in absolute terms. However, further studies are necessary to validate the usefulness of this technique in the early detection of tissue remodeling at a time when implementing early treatment would improve a patient's prognosis. This article reviews the state of the art for T1 mapping of the myocardium, its clinical applications, and its limitations. Copyright © 2016 SERAM. Published by Elsevier España, S.L.U. All rights reserved.

  13. Maximization of regional probabilities using Optimal Surface Graphs

    DEFF Research Database (Denmark)

    Arias Lorza, Andres M.; Van Engelen, Arna; Petersen, Jens

    2018-01-01

    Purpose: We present a segmentation method that maximizes regional probabilities enclosed by coupled surfaces using an Optimal Surface Graph (OSG) cut approach. This OSG cut determines the globally optimal solution given a graph constructed around an initial surface. While most methods for vessel...... wall segmentation only use edge information, we show that maximizing regional probabilities using an OSG improves the segmentation results. We applied this to automatically segment the vessel wall of the carotid artery in magnetic resonance images. Methods: First, voxel-wise regional probability maps...... were obtained using a Support Vector Machine classifier trained on local image features. Then, the OSG segments the regions which maximizes the regional probabilities considering smoothness and topological constraints. Results: The method was evaluated on 49 carotid arteries from 30 subjects...

  14. Raman molecular imaging of brain frozen tissue sections.

    Science.gov (United States)

    Kast, Rachel E; Auner, Gregory W; Rosenblum, Mark L; Mikkelsen, Tom; Yurgelevic, Sally M; Raghunathan, Aditya; Poisson, Laila M; Kalkanis, Steven N

    2014-10-01

    Raman spectroscopy provides a molecular signature of the region being studied. It is ideal for neurosurgical applications because it is non-destructive, label-free, not impacted by water concentration, and can map an entire region of tissue. The objective of this paper is to demonstrate the meaningful spatial molecular information provided by Raman spectroscopy for identification of regions of normal brain, necrosis, diffusely infiltrating glioma and solid glioblastoma (GBM). Five frozen section tissues (1 normal, 1 necrotic, 1 GBM, and 2 infiltrating glioma) were mapped in their entirety using a 300-µm-square step size. Smaller regions of interest were also mapped using a 25-µm step size. The relative concentrations of relevant biomolecules were mapped across all tissues and compared with adjacent hematoxylin and eosin-stained sections, allowing identification of normal, GBM, and necrotic regions. Raman peaks and peak ratios mapped included 1003, 1313, 1431, 1585, and 1659 cm(-1). Tissue maps identified boundaries of grey and white matter, necrosis, GBM, and infiltrating tumor. Complementary information, including relative concentration of lipids, protein, nucleic acid, and hemoglobin, was presented in a manner which can be easily adapted for in vivo tissue mapping. Raman spectroscopy can successfully provide label-free imaging of tissue characteristics with high accuracy. It can be translated to a surgical or laboratory tool for rapid, non-destructive imaging of tumor margins.

  15. Investigation of normal tissue complication probabilities in prostate and partial breast irradiation radiotherapy techniques

    International Nuclear Information System (INIS)

    Bezak, E.; Takam, R.; Bensaleh, S.; Yeoh, E.; Marcu, L.

    2011-01-01

    Full text: Normal tissue complication probabilities (NTCPs) of the rectum, bladder and urethra following various radiation techniques for prostate cancer were evaluated using the relative-seriality and Lyman models. NTCPs of the lungs, heart and skin, and their dependence on source position and balloon deformation, were also investigated for HDR MammoSite brachytherapy. The prostate treatment techniques included external three-dimensional conformal radiotherapy, Low-Dose-Rate brachytherapy (I-125) and High-Dose-Rate brachytherapy (Ir-192). Dose-Volume-Histograms of critical structures for prostate and breast radiotherapy, retrieved from the corresponding treatment planning systems, were converted to Biological Effective Dose (BEffD)-based and Equivalent Dose (Deq)-based DVHs to account for differences in radiation delivery and fractionation schedule. Literature-based model parameters were used to calculate NTCPs. For hypofractionated 3D-CRT (2.75 Gy/fraction, total dose 55 Gy), NTCPs of the rectum, bladder and urethra were less than those for standard fractionated 4-field 3D-CRT (2 Gy/fraction, 64 Gy) and dose-escalated 4- and 5-field 3D-CRT (74 Gy). Rectal and bladder NTCPs (5.2% and 6.6%) following the dose-escalated 4-field 3D-CRT (74 Gy) were the highest among the analysed techniques. The average NTCPs for the rectum and urethra were 0.6% and 24.7% for LDR-BT and 0.5% and 11.2% for HDR-BT. For MammoSite, NTCP was estimated to be 0.1%, 0.1%, 1.2% and 3.5% for skin desquamation, erythema, telangiectasia and fibrosis, respectively (with the source positioned at the balloon centre). A 4 mm MammoSite balloon deformation leads to overdosing of PTV regions by ∼40%, resulting in excessive skin dose and increased NTCP. Conclusions: Prostate brachytherapy resulted in lower NTCPs compared to the external beam techniques. MammoSite brachytherapy resulted in no heart/lung complications regardless of balloon deformation; however, a 4 mm deformation caused a 0.6% increase in tissue fibrosis NTCP.

  16. Occurrence and Probability Maps of Lutzomyia longipalpis and Lutzomyia cruzi (Diptera: Psychodidae: Phlebotominae) in Brazil.

    Science.gov (United States)

    Andrade-Filho, J D; Scholte, R G C; Amaral, A L G; Shimabukuro, P H F; Carvalho, O S; Caldeira, R L

    2017-09-01

    Leishmaniases are serious diseases caused by trypanosomatid protozoans of the genus Leishmania, transmitted by the bite of phlebotomine sand flies. We analyzed records pertaining to Lutzomyia longipalpis (Lutz and Neiva, 1912) and Lutzomyia cruzi (Mangabeira, 1938) in Brazil from the following sources: the collection of phlebotomine sand flies of the Centro de Pesquisas René Rachou/Fiocruz (FIOCRUZ-COLFLEB), the "SpeciesLink" (CRIA) database, systematic surveys of scientific articles and gray literature (dissertations, theses, and communications), and disease data obtained from the Information System for Notifiable Diseases/Ministry of Health (SINAN/MS). Environmental data and ecological niche modeling (ESMS) using the MaxEnt algorithm produced maps of occurrence probability for both Lu. longipalpis and Lu. cruzi. Lutzomyia longipalpis was found in 229 Brazilian municipalities and Lu. cruzi in 27. The species were sympatric in 16 municipalities of the Central-West region of Brazil. Our results show that Lu. longipalpis is widely distributed and associated with the high number of cases of visceral leishmaniasis reported in Brazil. © The Authors 2017. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  17. Oxygen Mapping within Healthy and Acutely Infarcted Brain Tissue in Humans Using the NMR Relaxation of Lipids: A Proof-Of-Concept Translational Study.

    Science.gov (United States)

    Colliez, Florence; Safronova, Marta M; Magat, Julie; Joudiou, Nicolas; Peeters, André P; Jordan, Bénédicte F; Gallez, Bernard; Duprez, Thierry

    2015-01-01

    The clinical applicability of brain oxygenation mapping using the MOBILE (Mapping of Oxygen By Imaging Lipids relaxation Enhancement) magnetic resonance (MR) technique was assessed in the clinical setting of normal brain and of acute cerebral ischemia as a founding proof-of-concept translational study. Changes in the oxygenation level within healthy brain tissue can be detected by analyzing the spin-lattice proton relaxation ('Global T1', combining water and lipid protons) because of the paramagnetic properties of molecular oxygen. It was hypothesized that selective measurement of the relaxation of the lipid protons ('Lipids T1') would result in enhanced sensitivity of pO2 mapping because of the higher solubility of oxygen in lipids than in water, and this was demonstrated in pre-clinical models using the MOBILE technique. In the present study, 12 healthy volunteers and eight patients with acute (48-72 hours) brain infarction were examined with the same clinical 3T MR system. Both Lipids R1 (R1 = 1/T1) and Global R1 were significantly different in the infarcted area and the contralateral unaffected brain tissue, with a higher statistical significance for Lipids R1 (median difference: 0.408 s-1). The R1 values in the contralateral brain tissue of stroke patients were not significantly different from the R1 values calculated in the brain tissue of healthy volunteers. The main limitations of the present prototypic version of the MOBILE sequence are the long acquisition time (4 min), hampering robustness of data in uncooperative patients, and a 2 mm slice thickness precluding accurate measurements in small infarcts because of partial volume averaging effects.

  18. Impact of statistical learning methods on the predictive power of multivariate normal tissue complication probability models.

    Science.gov (United States)

    Xu, Cheng-Jian; van der Schaaf, Arjen; Schilstra, Cornelis; Langendijk, Johannes A; van't Veld, Aart A

    2012-03-15

    To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator (LASSO), and Bayesian model averaging (BMA), were used to build NTCP models of xerostomia following radiotherapy treatment for head and neck cancer. Performance of each learning method was evaluated by a repeated cross-validation scheme in order to obtain a fair comparison among methods. It was found that the LASSO and BMA methods produced models with significantly better predictive power than that of the stepwise selection method. Furthermore, the LASSO method yields an easily interpretable model as the stepwise method does, in contrast to the less intuitive BMA method. The commonly used stepwise selection method, which is simple to execute, may be insufficient for NTCP modeling. The LASSO method is recommended. Copyright © 2012 Elsevier Inc. All rights reserved.
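The LASSO variant of such NTCP modeling can be sketched on synthetic data. This uses scikit-learn's L1-penalized logistic regression as an assumed stand-in for the authors' implementation; the ten candidate dosimetric covariates and the complication labels below are simulated, not clinical data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n = 300
X = rng.normal(size=(n, 10))              # 10 standardized candidate predictors
logit = 1.5 * X[:, 0] + 0.8 * X[:, 1]     # only two actually drive complication risk
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)  # xerostomia yes/no

# L1 penalty shrinks uninformative coefficients to exactly zero,
# giving the sparse, interpretable model the abstract describes
lasso_ntcp = LogisticRegression(penalty="l1", solver="liblinear", C=0.05)
auc = cross_val_score(lasso_ntcp, X, y, cv=5, scoring="roc_auc").mean()
lasso_ntcp.fit(X, y)
selected = np.flatnonzero(lasso_ntcp.coef_[0])  # predictors kept by the model
```

Cross-validated AUC gives the fair predictive-power comparison the abstract calls for, while `selected` exposes the sparse predictor set; the penalty strength `C` here is an arbitrary illustrative choice and would be tuned in practice.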

  1. The maximum entropy method of moments and Bayesian probability theory

    Science.gov (United States)

    Bretthorst, G. Larry

    2013-08-01

    The problem of density estimation occurs in many disciplines. For example, in MRI it is often necessary to classify the types of tissues in an image. To perform this classification one must first identify the characteristics of the tissues to be classified. These characteristics might be the intensity of a T1-weighted image, and in MRI many other types of characteristic weightings (classifiers) may be generated. In a given tissue type there is no single intensity that characterizes the tissue; rather, there is a distribution of intensities. Often this distribution can be characterized by a Gaussian, but just as often it is much more complicated. Either way, estimating the distribution of intensities is an inference problem. In the case of a Gaussian distribution, one must estimate the mean and standard deviation. However, in the non-Gaussian case the shape of the density function itself must be inferred. Three common techniques for estimating density functions are binned histograms [1, 2], kernel density estimation [3, 4], and the maximum entropy method of moments [5, 6]. In the introduction, the maximum entropy method of moments will be reviewed. Some of its problems and conditions under which it fails will be discussed. Then in later sections, the functional form of the maximum entropy method of moments probability distribution will be incorporated into Bayesian probability theory. It will be shown that Bayesian probability theory solves all of the problems with the maximum entropy method of moments. One gets posterior probabilities for the Lagrange multipliers, and, finally, one can put error bars on the resulting estimated density function.

  2. Normal tissue complication probabilities: dependence on choice of biological model and dose-volume histogram reduction scheme

    International Nuclear Information System (INIS)

    Moiseenko, Vitali; Battista, Jerry; Van Dyk, Jake

    2000-01-01

    Purpose: To evaluate the impact of dose-volume histogram (DVH) reduction schemes and models of normal tissue complication probability (NTCP) on ranking of radiation treatment plans. Methods and Materials: Data for liver complications in humans and for spinal cord in rats were used to derive input parameters of four different NTCP models. DVH reduction was performed using two schemes: 'effective volume' and 'preferred Lyman'. DVHs for competing treatment plans were derived from a sample DVH by varying dose uniformity in a high dose region so that the obtained cumulative DVHs intersected. Treatment plans were ranked according to the calculated NTCP values. Results: Whenever the preferred Lyman scheme was used to reduce the DVH, competing plans were indistinguishable as long as the mean dose was constant. The effective volume DVH reduction scheme did allow us to distinguish between these competing treatment plans. However, plan ranking depended on the radiobiological model used and its input parameters. Conclusions: Dose escalation will be a significant part of radiation treatment planning using new technologies, such as 3-D conformal radiotherapy and tomotherapy. Such dose escalation will depend on how the dose distributions in organs at risk are interpreted in terms of expected complication probabilities. The present study indicates considerable variability in predicted NTCP values because of the methods used for DVH reduction and radiobiological models and their input parameters. Animal studies and collection of standardized clinical data are needed to ascertain the effects of non-uniform dose distributions and to test the validity of the models currently in use.
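The 'effective volume' scheme feeds a single volume parameter into the Lyman probit curve; a compact sketch of the Kutcher-Burman reduction plus Lyman NTCP follows. The parameter values in the example are illustrative liver-like numbers, not the fitted parameters of this study.

```python
import math

def effective_volume(dvh, n):
    """Kutcher-Burman reduction: the fraction of the organ which, irradiated
    uniformly at the maximum dose, is equivalent to the inhomogeneous DVH.
    dvh: [(dose_Gy, fractional_volume), ...]; n: volume-effect parameter."""
    d_max = max(d for d, _ in dvh)
    v_eff = sum(v * (d / d_max) ** (1.0 / n) for d, v in dvh)
    return v_eff, d_max

def lyman_ntcp(dose, v_eff, td50_whole, m, n):
    """Lyman probit NTCP; the tolerance dose scales with partial volume
    as TD50(v) = TD50(1) * v**(-n)."""
    td50 = td50_whole * v_eff ** (-n)
    t = (dose - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# uniform whole-organ irradiation at TD50 gives, by construction, NTCP = 0.5
v_eff, d_max = effective_volume([(40.0, 1.0)], n=0.32)
ntcp_whole = lyman_ntcp(d_max, v_eff, td50_whole=40.0, m=0.15, n=0.32)

# sparing half the organ shrinks v_eff, raising TD50(v) and lowering NTCP
v_half, _ = effective_volume([(40.0, 0.5), (5.0, 0.5)], n=0.32)
ntcp_half = lyman_ntcp(40.0, v_half, td50_whole=40.0, m=0.15, n=0.32)
```

Because the reduction collapses the whole DVH into (v_eff, d_max) before the probit step, different choices of n and of reduction scheme can re-order plans whose DVHs intersect, which is the sensitivity the abstract reports.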

  3. Ergodicity of polygonal slap maps

    International Nuclear Information System (INIS)

    Del Magno, Gianluigi; Pedro Gaivão, José; Lopes Dias, João; Duarte, Pedro

    2014-01-01

    Polygonal slap maps are piecewise affine expanding maps of the interval obtained by projecting the sides of a polygon along their normals onto the perimeter of the polygon. These maps arise in the study of polygonal billiards with non-specular reflection laws. We study the absolutely continuous invariant probabilities (acips) of the slap maps for several polygons, including regular polygons and triangles. We also present a general method for constructing polygons with slap maps with more than one ergodic acip. (paper)

  4. Towards a global land subsidence map

    NARCIS (Netherlands)

    Erkens, G.; Sutanudjaja, E. H.

    2015-01-01

    Land subsidence is a global problem, but a global land subsidence map is not available yet. Such map is crucial to raise global awareness of land subsidence, as land subsidence causes extensive damage (probably in the order of billions of dollars annually). With the global land subsidence map

  5. MO-F-CAMPUS-J-04: Tissue Segmentation-Based MR Electron Density Mapping Method for MR-Only Radiation Treatment Planning of Brain

    Energy Technology Data Exchange (ETDEWEB)

    Yu, H [Sunnybrook Health Sciences Centre, Toronto, Ontario (Canada); Lee, Y [Sunnybrook Odette Cancer Centre, Toronto, Ontario (Canada); Ruschin, M [Odette Cancer Centre, Toronto, ON (Canada); Karam, I [Sunnybrook Odette Cancer Center, Toronto, Ontario (Canada); Sahgal, A [University of Toronto, Toronto, ON (Canada)

    2015-06-15

    Purpose: Automatically derive the electron density of tissues using MR images and generate a pseudo-CT for MR-only treatment planning of brain tumours. Methods: 20 stereotactic radiosurgery (SRS) patients’ T1-weighted MR images and CT images were retrospectively acquired. First, a semi-automated tissue segmentation algorithm was developed to differentiate tissues with similar MR intensities and large differences in electron densities. The method started with approximately 12 slices of manually contoured spatial regions containing sinuses and airways; then air, bone, brain, cerebrospinal fluid (CSF) and eyes were automatically segmented using edge detection and anatomical information including location, shape, tissue uniformity and relative intensity distribution. Next, the soft tissues (muscle and fat) were segmented based on their relative intensity histograms. Finally, the intensities of voxels in each segmented tissue were mapped into their electron density ranges to generate a pseudo-CT by linearly fitting their relative intensity histograms. Co-registered CT was used as the ground truth. The bone segmentations of the pseudo-CT were compared with those of co-registered CT obtained by using a 300 HU threshold. The average distances between voxels on the external edges of the skull of pseudo-CT and CT in the three axial, coronal and sagittal slices with the largest skull width were calculated. The mean absolute electron density (in Hounsfield units) difference of voxels in each segmented tissue was calculated. Results: The average distance between voxels on the external skull from pseudo-CT and CT was 0.6±1.1 mm (mean±1SD). The mean absolute electron density differences for bone, brain, CSF, muscle and fat were 78±114, 21±8, 14±29, 57±37, and 31±63 HU, respectively. Conclusion: The semi-automated MR electron density mapping technique was developed using T1-weighted MR images. The generated pseudo-CT is comparable to that of CT in terms of anatomical position of
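
    The final intensity-to-electron-density step can be illustrated with a minimal per-tissue linear mapping. This is only a sketch of the idea (the abstract describes fitting relative intensity histograms; the intensity values and the bone HU range below are hypothetical, and some tissues may need an inverted mapping in T1-weighted images):

```python
import numpy as np

def intensities_to_hu(intensity, hu_range):
    """Linearly map the observed intensity range of one segmented tissue
    class onto an assumed Hounsfield-unit range for that tissue."""
    lo, hi = float(intensity.min()), float(intensity.max())
    hu_lo, hu_hi = hu_range
    if hi == lo:
        return np.full(intensity.shape, 0.5 * (hu_lo + hu_hi))
    return hu_lo + (intensity - lo) * (hu_hi - hu_lo) / (hi - lo)

# hypothetical T1 intensities for voxels already segmented as bone
bone = np.array([310.0, 400.0, 520.0, 650.0])
pseudo_hu = intensities_to_hu(bone, hu_range=(300.0, 1500.0))
```

    Applying such a mapping class by class, after segmentation, yields a pseudo-CT volume in Hounsfield units.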

  6. Normal tissue complication probabilities correlated with late effects in the rectum after prostate conformal radiotherapy

    International Nuclear Information System (INIS)

    Dale, Einar; Olsen, Dag R.; Fossa, Sophie D.

    1999-01-01

    Purpose: Radiation therapy of deep-seated tumours will always result in normal tissue doses to some extent. The aim of this study was to calculate different risk estimates of late effects in the rectum for a group of prostate cancer patients treated with conformal radiation therapy (CRT) and correlate these estimates with the occurrences of late effects. Since the rectum is a hollow organ, several ways of generating dose-volume distributions over the organ are possible, and we wanted to investigate two of them. Methods and Materials: A mathematical model, known as the Lyman-Kutcher model, conventionally used to estimate normal tissue complication probabilities (NTCP) associated with radiation therapy, was applied to a material of 52 prostate cancer patients. The patients were treated with a four-field box technique, with the rectum as organ at risk. Dose-volume histograms (DVH) were generated for the whole rectum (including the cavity) and for the rectum wall. One to two years after the treatment, the patients completed a questionnaire concerning bowel (rectum) related morbidity quantifying the extent of late effects. Results: A correlation analysis using Spearman's rank correlation coefficient, for NTCP values calculated from the DVHs and the patients' scores, gave correlation coefficients that were not statistically significant. The maximum dose, D max , of the whole rectum correlated better with observed late toxicity than D max derived from histograms of the rectum wall. Correlation coefficients from 'high-dose' measures were larger than those calculated from the NTCP values. Accordingly, as the volume parameter of the Lyman-Kutcher model was reduced, raising the impact of small high-dose volumes on the NTCP values, the correlation between observed effects and NTCP values became significant at the p < 0.01 level. Conclusions: 1) High-dose levels corresponding to small volume fractions of the cumulative dose-volume histograms were best correlated with the occurrences of late

  7. Analysis of a human brain transcriptome map

    Directory of Open Access Journals (Sweden)

    Greene Jonathan R

    2002-04-01

    Background: Genome-wide transcriptome maps can provide tools to identify candidate genes that are over-expressed or silenced in certain disease tissue and increase our understanding of the structure and organization of the genome. Expressed Sequence Tags (ESTs) from the public dbEST and proprietary Incyte LifeSeq databases were used to derive a transcript map in conjunction with the working draft assembly of the human genome sequence. Results: Examination of ESTs derived from brain tissues (excluding brain tumor tissues) suggests that these genes are distributed on chromosomes in a non-random fashion. Some regions of the genome are dense with brain-enriched genes while some regions lack brain-enriched genes, suggesting a significant correlation between the distribution of genes along the chromosome and tissue type. ESTs from brain tumor tissues have also been mapped to the human genome working draft. We reveal that some regions enriched in brain genes show a significant decrease in gene expression in brain tumors and, conversely, that some regions lacking in brain genes show an increased level of gene expression in brain tumors. Conclusions: This report demonstrates a novel approach for tissue-specific transcriptome mapping using EST-based quantitative assessment.

  8. Vaccination with peptides of Mycobacterium avium subsp. paratuberculosis (MAP) reduces MAP burden of infected goats

    DEFF Research Database (Denmark)

    Melvang, Heidi Mikkelsen; Hassan, Sufia Butt; Thakur, Aneesh

    Mycobacterium avium subsp. paratuberculosis (Map) is the cause of paratuberculosis, a chronic enteritis of ruminants that is widespread worldwide. We investigated the effect of post-exposure vaccination with Map specific peptides in a goat model aiming at developing a Map vaccine that will neither...... unique to Map from selected proteins (n =68). For vaccination, 23 MAP peptides (20 µg each) were selected and formulated with Montanide ISA 61 VG adjuvant. At age three weeks 10 goats were orally inoculated with 4x10E9 live Map and assigned to two groups of 5 goats each: 5 vaccinated (V) at 14 and 18...... weeks post inoculation (PI) and 5 unvaccinated (C). At termination 32 weeks PI, Map burdens in 15 intestinal tissues and lymph nodes were determined by IS900 qPCR. Of the 75 tissue samples from the 5 C goats only 5 samples were IS900 qPCR negative. In contrast, only 9 samples in total from 5 V goats...

  9. MAP3K8 (TPL2/COT) Affects Obesity-Induced Adipose Tissue Inflammation without Systemic Effects in Humans and in Mice

    NARCIS (Netherlands)

    Ballak, D.B.; Essen, P. van; Diepen, J.A. van; Jansen, H.J.; Hijmans, A.G.; Matsuguchi, T.; Sparrer, H.; Tack, C.J.J.; Netea, M.G.; Joosten, L.A.B.; Stienstra, R.

    2014-01-01

    Chronic low-grade inflammation in adipose tissue often accompanies obesity, leading to insulin resistance and increasing the risk for metabolic diseases. MAP3K8 (TPL2/COT) is an important signal transductor and activator of pro-inflammatory pathways that has been linked to obesity-induced adipose

  10. In situ biological dose mapping estimates the radiation burden delivered to 'spared' tissue between synchrotron X-ray microbeam radiotherapy tracks.

    Directory of Open Access Journals (Sweden)

    Kai Rothkamm

    Microbeam radiation therapy (MRT) using high doses of synchrotron X-rays can destroy tumours in animal models whilst causing little damage to normal tissues. Determining the spatial distribution of radiation doses delivered during MRT at a microscopic scale is a major challenge. Film and semiconductor dosimetry as well as Monte Carlo methods struggle to provide accurate estimates of dose profiles and peak-to-valley dose ratios at the position of the targeted and traversed tissues whose biological responses determine treatment outcome. The purpose of this study was to utilise γ-H2AX immunostaining as a biodosimetric tool that enables in situ biological dose mapping within an irradiated tissue to provide direct biological evidence for the scale of the radiation burden to 'spared' tissue regions between MRT tracks. γ-H2AX analysis allowed microbeams to be traced and DNA damage foci to be quantified in valleys between beams following MRT treatment of fibroblast cultures and murine skin, where foci yields per unit dose were approximately five-fold lower than in fibroblast cultures. Foci levels in cells located in valleys were compared with calibration curves using known broadbeam synchrotron X-ray doses to generate spatial dose profiles and calculate peak-to-valley dose ratios of 30-40 for cell cultures and approximately 60 for murine skin, consistent with the range obtained with conventional dosimetry methods. This biological dose mapping approach could find several applications, both in optimising MRT or other radiotherapeutic treatments and in estimating localised doses following accidental radiation exposure using skin punch biopsies.
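
    The biodosimetric conversion described here amounts to inverting a foci-per-dose calibration for peak and valley regions and forming their ratio. A toy version (the calibration slope, background level and foci counts are all invented; real calibration curves are generally non-linear at high dose):

```python
def dose_from_foci(foci_per_cell, foci_per_gy, background=0.5):
    """Invert a linear gamma-H2AX calibration: dose = (foci - background) / slope."""
    return max(foci_per_cell - background, 0.0) / foci_per_gy

# invented calibration: 15 foci per cell per Gy, 0.5 background foci per cell
peak_dose = dose_from_foci(60.5, 15.0)    # cells in the microbeam path
valley_dose = dose_from_foci(2.0, 15.0)   # cells between beams
pvdr = peak_dose / valley_dose            # peak-to-valley dose ratio
```

    With these invented numbers the PVDR comes out at 40, coincidentally within the 30-40 range the study reports for cell cultures.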

  11. Quantitative Susceptibility Mapping: Contrast Mechanisms and Clinical Applications

    Science.gov (United States)

    Liu, Chunlei; Wei, Hongjiang; Gong, Nan-Jie; Cronin, Matthew; Dibb, Russel; Decker, Kyle

    2016-01-01

    Quantitative susceptibility mapping (QSM) is a recently developed MRI technique for quantifying the spatial distribution of magnetic susceptibility within biological tissues. It first uses the frequency shift in the MRI signal to map the magnetic field profile within the tissue. The resulting field map is then used to determine the spatial distribution of the underlying magnetic susceptibility by solving an inverse problem. The solution is achieved by deconvolving the field map with a dipole field, under the assumption that the magnetic field is a result of the superposition of the dipole fields generated by all voxels and that each voxel has its unique magnetic susceptibility. QSM provides improved contrast to noise ratio for certain tissues and structures compared to its magnitude counterpart. More importantly, magnetic susceptibility is a direct reflection of the molecular composition and cellular architecture of the tissue. Consequently, by quantifying magnetic susceptibility, QSM is becoming a quantitative imaging approach for characterizing normal and pathological tissue properties. This article reviews the mechanism generating susceptibility contrast within tissues and some associated applications. PMID:26844301
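
    The forward model underlying QSM can be sketched directly: the field perturbation is the susceptibility map convolved with a unit dipole, which in k-space reduces to pointwise multiplication by D(k) = 1/3 - kz^2/|k|^2 (B0 along z). A minimal sketch of that forward step only, with unit conventions and scaling factors omitted (inverting it, the actual QSM problem, requires regularisation because D vanishes on a cone):

```python
import numpy as np

def dipole_kernel(shape, voxel_size=(1.0, 1.0, 1.0)):
    """Unit dipole response in k-space, D(k) = 1/3 - kz**2 / |k|**2,
    for B0 along z; the undefined k = 0 term is conventionally set to zero."""
    kx = np.fft.fftfreq(shape[0], d=voxel_size[0])
    ky = np.fft.fftfreq(shape[1], d=voxel_size[1])
    kz = np.fft.fftfreq(shape[2], d=voxel_size[2])
    KX, KY, KZ = np.meshgrid(kx, ky, kz, indexing="ij")
    k2 = KX**2 + KY**2 + KZ**2
    D = np.zeros(shape)
    nonzero = k2 > 0
    D[nonzero] = 1.0 / 3.0 - KZ[nonzero] ** 2 / k2[nonzero]
    return D

chi = np.zeros((16, 16, 16))
chi[8, 8, 8] = 1.0   # point susceptibility source
field = np.real(np.fft.ifftn(dipole_kernel(chi.shape) * np.fft.fftn(chi)))
```

    The resulting field shows the familiar dipole pattern around the point source, positive along the B0 axis and negative in the transverse plane.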

  12. MAPS of Cancer

    Science.gov (United States)

    Gray, Lincoln

    1998-01-01

    Our goal was to produce an interactive visualization from a mathematical model that successfully predicts metastases from head and neck cancer. We met this goal early in the project. The visualization is available for the public to view. Our work appears to fill a need for more information about this deadly disease. The idea of this project was to make an easily interpretable visualization based on what we call "functional maps" of disease. A functional map is a graphic summary of medical data, where distances between parts of the body are determined by the probability of disease, not by anatomical distances. Functional maps often bear little resemblance to anatomical maps, but they can be used to predict the spread of disease. The idea of modeling the spread of disease in an abstract multidimensional space is difficult for many people. Our goal was to make the important predictions easy to see. NASA must face this problem frequently: how to help laypersons and professionals see important trends in abstract, complex data. We took advantage of concepts perfected in NASA's graphics libraries. As an analogy, consider a functional map of early America. Suppose we choose travel times, rather than miles, as our measures of inter-city distances. For Abraham Lincoln, travel times would have been the more meaningful measure of separation between cities. In such a map New Orleans would be close to Memphis because of the Mississippi River. St. Louis would be close to Portland because of the Oregon Trail. Oklahoma City would be far from Little Rock because of the Cheyenne. Such a map would look puzzling to those of us who have always seen physical maps, but the functional map would be more useful in predicting the probabilities of inter-site transit. Continuing the analogy, we could predict the spread of social diseases such as gambling along the rivers and cattle rustling along the trails. We could simply print the functional map of America, but it would be more interesting

  13. Prediction of radiation-induced liver disease by Lyman normal-tissue complication probability model in three-dimensional conformal radiation therapy for primary liver carcinoma

    International Nuclear Information System (INIS)

    Xu ZhiYong; Liang Shixiong; Zhu Ji; Zhu Xiaodong; Zhao Jiandong; Lu Haijie; Yang Yunli; Chen Long; Wang Anyu; Fu Xiaolong; Jiang Guoliang

    2006-01-01

    Purpose: To describe the probability of RILD by application of the Lyman-Kutcher-Burman normal tissue complication probability (NTCP) model for primary liver carcinoma (PLC) treated with hypofractionated three-dimensional conformal radiotherapy (3D-CRT). Methods and Materials: A total of 109 PLC patients treated by 3D-CRT were followed for RILD. Of these patients, 93 were in liver cirrhosis of Child-Pugh Grade A, and 16 were in Child-Pugh Grade B. The Michigan NTCP model was used to predict the probability of RILD, and then the modified Lyman NTCP model was generated for Child-Pugh A and Child-Pugh B patients by maximum-likelihood analysis. Results: Of all patients, 17 developed RILD, of whom 8 were of Child-Pugh Grade A and 9 were of Child-Pugh Grade B. The Michigan model underestimated the probability of RILD for PLC patients. The modified n, m, TD50(1) were 1.1, 0.28, and 40.5 Gy and 0.7, 0.43, and 23 Gy for patients with Child-Pugh A and B, respectively, which yielded better estimations of RILD probability. The hepatic tolerable doses (TD5) would be a mean dose to normal liver (MDTNL) of 21 Gy and 6 Gy, respectively, for Child-Pugh A and B patients. Conclusions: The Michigan model was probably not fit to predict RILD in PLC patients. A modified Lyman NTCP model for RILD was recommended.

  14. Quantification of the volumetric benefit of image-guided radiotherapy (I.G.R.T.) in prostate cancer: Margins and presence probability map

    International Nuclear Information System (INIS)

    Cazoulat, G.; Crevoisier, R. de; Simon, A.; Louvel, G.; Manens, J.P.; Haigron, P.; Crevoisier, R. de; Louvel, G.; Manens, J.P.; Lafond, C.

    2009-01-01

    Purpose: To quantify prostate and seminal vesicle (S.V.) anatomic variations in order to choose appropriate margins that account for intrapelvic anatomic variations, and to quantify the volumetric benefit of image-guided radiotherapy (I.G.R.T.). Patients and methods: Twenty patients, receiving a total dose of 70 Gy to the prostate, had a planning CT scan and eight weekly CT scans during treatment. The prostate and S.V. were manually contoured. Each weekly CT scan was registered to the planning CT scan according to three modalities: radiopaque skin marks, pelvic bone or prostate. For each patient, prostate and S.V. displacements were quantified. 3-dimensional maps of prostate and S.V. presence probability were established. Volumes including minimal presence probabilities were compared between the three modalities of registration. Results: For the prostate intrapelvic displacements, systematic and random variations and maximal displacements for the entire population were: 5 mm, 2.7 mm and 16.5 mm in the anteroposterior axis; 2.7 mm, 2.4 mm and 11.4 mm in the supero-inferior axis; and 0.5 mm, 0.8 mm and 3.3 mm laterally. Margins according to the van Herk recipe (to cover the prostate for 90% of the patients with the 95% isodose) were: 8 mm, 8.3 mm and 1.9 mm, respectively. The 100% prostate presence probability volumes correspond to 37%, 50% and 61% according to the registration modality. For the S.V., these volumes correspond to 8%, 14% and 18% of the S.V. volume. Conclusions: Without I.G.R.T., 5 mm prostate posterior margins are insufficient and should be at least 8 mm to account for intrapelvic anatomic variations. Prostate registration almost doubles the 100% presence probability volume compared to skin registration. Deformation of the S.V. will require either dramatically increased margins (simple) or replanning (not realistic). (authors)
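
    The van Herk recipe cited here is, in its published form, M = 2.5Σ + 0.7σ, where Σ and σ are the standard deviations of the systematic and random errors. A minimal sketch applied to the lateral values quoted in the abstract:

```python
def van_herk_margin(sigma_systematic, sigma_random):
    """Published van Herk CTV-to-PTV margin recipe, M = 2.5*Sigma + 0.7*sigma,
    covering the CTV with the 95% isodose for 90% of patients."""
    return 2.5 * sigma_systematic + 0.7 * sigma_random

lateral = van_herk_margin(0.5, 0.8)   # lateral Sigma and sigma from the abstract (mm)
```

    This gives about 1.8 mm, close to the 1.9 mm the abstract reports for the lateral axis; small differences presumably come from rounding of the quoted inputs.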

  15. Multivariate quantile mapping bias correction: an N-dimensional probability density function transform for climate model simulations of multiple variables

    Science.gov (United States)

    Cannon, Alex J.

    2018-01-01

    Most bias correction algorithms used in climatology, for example quantile mapping, are applied to univariate time series. They neglect the dependence between different variables. Those that are multivariate often correct only limited measures of joint dependence, such as Pearson or Spearman rank correlation. Here, an image processing technique designed to transfer colour information from one image to another—the N-dimensional probability density function transform—is adapted for use as a multivariate bias correction algorithm (MBCn) for climate model projections/predictions of multiple climate variables. MBCn is a multivariate generalization of quantile mapping that transfers all aspects of an observed continuous multivariate distribution to the corresponding multivariate distribution of variables from a climate model. When applied to climate model projections, changes in quantiles of each variable between the historical and projection period are also preserved. The MBCn algorithm is demonstrated on three case studies. First, the method is applied to an image processing example with characteristics that mimic a climate projection problem. Second, MBCn is used to correct a suite of 3-hourly surface meteorological variables from the Canadian Centre for Climate Modelling and Analysis Regional Climate Model (CanRCM4) across a North American domain. Components of the Canadian Forest Fire Weather Index (FWI) System, a complicated set of multivariate indices that characterizes the risk of wildfire, are then calculated and verified against observed values. Third, MBCn is used to correct biases in the spatial dependence structure of CanRCM4 precipitation fields. Results are compared against a univariate quantile mapping algorithm, which neglects the dependence between variables, and two multivariate bias correction algorithms, each of which corrects a different form of inter-variable correlation structure. MBCn outperforms these alternatives, often by a large margin
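
    The univariate quantile mapping that MBCn generalizes can be sketched as empirical CDF matching: each model value is replaced by the observed value at the same quantile of the historical distributions. The Gaussian data below are purely illustrative:

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_values):
    """Empirical univariate quantile mapping: look up each model value's
    quantile in the model's historical CDF, then return the observed value
    at that same quantile."""
    sorted_model = np.sort(model_hist)
    q = np.searchsorted(sorted_model, model_values, side="right") / len(sorted_model)
    return np.quantile(obs_hist, np.clip(q, 0.0, 1.0))

rng = np.random.default_rng(1)
model_hist = rng.normal(0.0, 1.0, 1000)   # biased model climate (illustrative)
obs_hist = rng.normal(2.0, 0.5, 1000)     # observed climate (illustrative)
corrected = quantile_map(model_hist, obs_hist, model_hist)
```

    Applied variable by variable this corrects each marginal distribution but, as the abstract notes, leaves inter-variable dependence untouched; MBCn addresses that by repeatedly applying such corrections to randomly rotated versions of the multivariate data.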

  16. Risk-targeted maps for Romania

    Science.gov (United States)

    Vacareanu, Radu; Pavel, Florin; Craciun, Ionut; Coliba, Veronica; Arion, Cristian; Aldea, Alexandru; Neagu, Cristian

    2018-03-01

    Romania has one of the highest seismic hazard levels in Europe. The seismic hazard is due to a combination of local crustal seismic sources, situated mainly in the western part of the country, and the Vrancea intermediate-depth seismic source, which can be found at the bend of the Carpathian Mountains. Recent seismic hazard studies have shown that there are consistent differences between the slopes of the seismic hazard curves for sites situated in the fore-arc and back-arc of the Carpathian Mountains. Consequently, in this study we extend this finding to the evaluation of the probability of collapse of buildings and finally to the development of uniform risk-targeted maps. The main advantage of the uniform-risk approach is that the target probability of collapse will be uniform throughout the country. Finally, the results obtained are discussed in the light of a recent study with the same focus performed at the European level using the hazard data from the SHARE project. The analyses performed in this study have pointed to a dominant influence of the quantile of peak ground acceleration used for anchoring the fragility function. This parameter basically alters the shape of the risk-targeted maps, shifting the areas which have higher collapse probabilities from eastern Romania to western Romania as its exceedance probability increases. Consequently, a uniform procedure for deriving risk-targeted maps appears more than necessary.
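
    The risk-targeting computation referred to here combines a site's hazard curve with a building collapse fragility via the standard risk integral. A minimal discretized sketch (the hazard curve points, fragility median and dispersion below are invented for illustration):

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the complementary error function."""
    return 0.5 * math.erfc(-x / math.sqrt(2.0))

def annual_collapse_rate(accels, exceed_rates, median, beta):
    """Discretized risk integral: sum, over acceleration bins, of the lognormal
    collapse fragility at the bin midpoint times the hazard occupancy
    -d(lambda) across the bin."""
    rate = 0.0
    for i in range(len(accels) - 1):
        a_mid = 0.5 * (accels[i] + accels[i + 1])
        fragility = norm_cdf(math.log(a_mid / median) / beta)
        rate += fragility * (exceed_rates[i] - exceed_rates[i + 1])
    return rate

# invented hazard curve: PGA (g) vs annual exceedance rate
accels = [0.1, 0.2, 0.4, 0.8]
rates = [1e-2, 2e-3, 2e-4, 1e-5]
collapse_rate = annual_collapse_rate(accels, rates, median=0.6, beta=0.5)
```

    Risk-targeted maps invert this: the design acceleration anchoring the fragility is adjusted until the computed collapse probability matches a uniform national target, which is why the anchoring quantile matters so much in the abstract's findings.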

  17. Statistical methods in physical mapping

    International Nuclear Information System (INIS)

    Nelson, D.O.

    1995-05-01

    One of the great success stories of modern molecular genetics has been the ability of biologists to isolate and characterize the genes responsible for serious inherited diseases like fragile X syndrome, cystic fibrosis and myotonic muscular dystrophy. This dissertation concentrates on constructing high-resolution physical maps. It demonstrates how probabilistic modeling and statistical analysis can aid molecular geneticists in the tasks of planning, execution, and evaluation of physical maps of chromosomes and large chromosomal regions. The dissertation is divided into six chapters. Chapter 1 provides an introduction to the field of physical mapping, describing the role of physical mapping in gene isolation and in past efforts at mapping chromosomal regions. The next two chapters review and extend known results on predicting progress in large mapping projects. Such predictions help project planners decide between various approaches and tactics for mapping large regions of the human genome. Chapter 2 shows how probability models have been used in the past to predict progress in mapping projects. Chapter 3 presents new results, based on stationary point process theory, for progress measures for mapping projects based on directed mapping strategies. Chapter 4 describes in detail the construction of an initial high-resolution physical map for human chromosome 19. This chapter introduces the probability and statistical models involved in map construction in the context of a large, ongoing physical mapping project. Chapter 5 concentrates on one such model, the trinomial model. This chapter contains new results on the large-sample behavior of this model, including distributional results, asymptotic moments, and detection error rates. In addition, it contains an optimality result concerning experimental procedures based on the trinomial model. The last chapter explores unsolved problems and describes future work.

  19. The probability representation as a new formulation of quantum mechanics

    International Nuclear Information System (INIS)

    Man'ko, Margarita A; Man'ko, Vladimir I

    2012-01-01

    We present a new formulation of conventional quantum mechanics, in which the notion of a quantum state is identified via a fair probability distribution of the position measured in a reference frame of the phase space with rotated axes. In this formulation, the quantum evolution equation as well as the equation for finding energy levels are expressed as linear equations for the probability distributions that determine the quantum states. We also give the integral transforms relating the probability distribution (called the tomographic-probability distribution or the state tomogram) to the density matrix and the Wigner function and discuss their connection with the Radon transform. Qudit states are considered and the invertible map of the state density operators onto the probability vectors is discussed. The tomographic entropies and entropic uncertainty relations are reviewed. We demonstrate the uncertainty relations for the position and momentum and the entropic uncertainty relations in the tomographic-probability representation, which is suitable for an experimental check of the uncertainty relations.

  20. Molecular cloning, genomic organization, chromosome mapping, tissues expression pattern and identification of a novel splicing variant of porcine CIDEb gene

    International Nuclear Information System (INIS)

    Li, YanHua; Li, AiHua; Yang, Z.Q.

    2016-01-01

    Cell death-inducing DNA fragmentation factor-α-like effector b (CIDEb) is a member of the CIDE family of apoptosis-inducing factors. CIDEa and CIDEc have been reported to be lipid droplet (LD)-associated proteins that promote atypical LD fusion in adipocytes and are responsible for liver steatosis under fasting and obese conditions, whereas CIDEb promotes lipid storage under normal diet conditions [1] and promotes the formation of triacylglyceride-enriched VLDL particles in hepatocytes [2]. Here, we report the gene cloning, chromosome mapping, tissue distribution, genetic expression analysis, and identification of a novel splicing variant of the porcine CIDEb gene. Sequence analysis shows that the open reading frame of the normal porcine CIDEb isoform covers 660 bp and encodes a 219-amino acid polypeptide, whereas its alternative splicing variant encodes a 142-amino acid polypeptide truncated at the fourth exon and comprising the CIDE-N domain and part of the CIDE-C domain. The deduced amino acid sequence of normal porcine CIDEb shows an 85.8% similarity to the human protein and 80.0% to the mouse protein. The CIDEb genomic sequence spans approximately 6 kb and comprises five exons and four introns. Radiation hybrid mapping demonstrated that porcine CIDEb is located at chromosome 7q21, at a distance of 57 cR from the most significantly linked marker, S0334, in a region syntenic with the corresponding region of the human genome. Tissue expression analysis indicated that normal CIDEb mRNA is ubiquitously expressed in many porcine tissues. It was highly expressed in white adipose tissue and was observed at relatively high levels in the liver, lung, small intestine, lymphatic tissue and brain. The normal version of CIDEb was the predominant form in all tested tissues, whereas the splicing variant was expressed at low levels in all examined tissues except the lymphatic tissue. Furthermore, genetic expression analysis indicated that CIDEb mRNA levels were

  2. Understanding map projections: Chapter 15

    Science.gov (United States)

    Usery, E. Lynn; Kent, Alexander J.; Vujakovic, Peter

    2018-01-01

    It has probably never been more important in the history of cartography than now that people understand how maps work. With increasing globalization, for example, world maps provide a key format for the transmission of information, but are often poorly used. Examples of poor understanding and use of projections and the resultant maps are many; for instance, the use of rectangular world maps in the United Kingdom press to show Chinese and Korean missile ranges as circles, something which can only be achieved on equidistant projections and then only from one launch point (Vujakovic, 2014).

  3. The role of house surveys in geological radon potential mapping

    International Nuclear Information System (INIS)

    Ball, K.

    1997-01-01

    Because radon levels vary widely between apparently identical buildings on the same geological unit, no map can predict the radon level in an individual building. Maps can, however, give the probability that a building in a particular locality is above a threshold of radon concentration such as a reference or action level. The probability may be calculated for a particular building type or for a mixture of building types; in the latter case the probability is in effect an estimate of the proportion of buildings above the threshold level. Alternatively, maps can provide estimates of the mean radon levels in buildings by area. Maps showing the geographical variation in the probability that new or existing buildings will exceed a radon reference level are used to prevent excessive exposures to radon. The information may be used in various ways, such as to target information campaigns encouraging measurement of radon levels in homes or to modify regulations for new buildings. The data used to provide the estimates of the proportion of buildings above a threshold may be radon measurement results from a sample of buildings, or indirect indicators such as ground radium concentrations, emanation coefficients and permeability measurements. Consistency in radon measurement protocols and detailed positional information are prerequisites for mapping radon-prone areas based upon house data. Grouping building radon measurements by geological formation and superficial cover can produce radon potential maps which are more spatially accurate than grid-square maps and more accurate in estimating the numbers of homes affected than mapping based only on measured geological and pedological properties.
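    A common way to turn house-survey measurements into the per-area proportions described above is to assume that indoor radon concentrations within a locality are approximately lognormally distributed. A minimal sketch under that assumption (the geometric mean, geometric standard deviation and action level below are illustrative, not values from this record):

    ```python
    import math

    def fraction_above_action_level(gm, gsd, action_level):
        """Estimated proportion of homes above an action level, assuming
        indoor radon is lognormal with geometric mean `gm` and geometric
        standard deviation `gsd` (both in the same units as the level)."""
        # z-score of the action level on the log scale
        z = (math.log(action_level) - math.log(gm)) / math.log(gsd)
        # Standard normal survival function via the complementary error function
        return 0.5 * math.erfc(z / math.sqrt(2.0))

    # Example: GM = 50 Bq/m^3, GSD = 2.5, action level = 200 Bq/m^3
    p = fraction_above_action_level(50.0, 2.5, 200.0)
    ```

    Mapping such proportions per geological unit is what produces the class boundaries of a radon potential map.
    
    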

  4. Cosmic string induced CMB maps

    International Nuclear Information System (INIS)

    Landriau, M.; Shellard, E. P. S.

    2011-01-01

    We compute maps of CMB temperature fluctuations seeded by cosmic strings using high resolution simulations of cosmic strings in a Friedmann-Robertson-Walker universe. We create full-sky, 18 deg. and 3 deg. CMB maps, including the relevant string contribution at each resolution from before recombination to today. We extract the angular power spectrum from these maps, demonstrating the importance of recombination effects. We briefly discuss the probability density function of the pixel temperatures, their skewness, and kurtosis.
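    The pixel-temperature statistics mentioned at the end of the abstract are plain sample moments of the map. A minimal sketch, with a synthetic Gaussian field standing in for a string-induced map:

    ```python
    import numpy as np

    def skewness_kurtosis(pixels):
        """Sample skewness and excess kurtosis of map pixel values,
        the non-Gaussianity statistics quoted in the record."""
        x = np.asarray(pixels, dtype=float)
        d = x - x.mean()
        m2 = (d ** 2).mean()              # second central moment
        skew = (d ** 3).mean() / m2 ** 1.5
        kurt = (d ** 4).mean() / m2 ** 2 - 3.0  # excess kurtosis
        return skew, kurt

    rng = np.random.default_rng(1)
    # Synthetic stand-in map: a pure Gaussian field has skew ~ 0, kurt ~ 0
    skew, kurt = skewness_kurtosis(rng.normal(size=100_000))
    ```

    For string maps, nonzero skewness and kurtosis of the pixel distribution are the signature of the non-Gaussian line-like discontinuities.
    
    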

  5. The effect of 6 and 15 MV on intensity-modulated radiation therapy prostate cancer treatment: plan evaluation, tumour control probability and normal tissue complication probability analysis, and the theoretical risk of secondary induced malignancies

    Science.gov (United States)

    Hussein, M; Aldridge, S; Guerrero Urbano, T; Nisbet, A

    2012-01-01

    Objective The aim of this study was to investigate the effect of 6- and 15-MV photon energies on intensity-modulated radiation therapy (IMRT) prostate cancer treatment plan outcome and to compare the theoretical risks of secondary induced malignancies. Methods Separate prostate cancer IMRT plans were prepared for 6- and 15-MV beams. Organ-equivalent doses were obtained through thermoluminescent dosemeter measurements in an anthropomorphic Alderson radiation therapy human phantom. The neutron dose contribution at 15 MV was measured using poly-allyl-diglycol-carbonate neutron track etch detectors. Risk coefficients from the International Commission on Radiological Protection Report 103 were used to compare the risk of fatal secondary induced malignancies in out-of-field organs and tissues for 6 and 15 MV. For the bladder and the rectum, a comparative evaluation of the risk using three separate models was carried out. Dose–volume parameters for the rectum, bladder and prostate planning target volume were evaluated, as well as normal tissue complication probability (NTCP) and tumour control probability calculations. Results There is a small increased theoretical risk of developing a fatal cancer from 6 MV compared with 15 MV, taking into account all the organs. Dose–volume parameters for the rectum and bladder show that 15 MV results in better volume sparing in the regions below 70 Gy, but the volume exposed increases slightly beyond this in comparison with 6 MV, resulting in a higher NTCP for the rectum of 3.6% vs 3.0% (p=0.166). Conclusion The choice to treat using IMRT at 15 MV should not be excluded, but should be based on risk vs benefit while considering the age and life expectancy of the patient together with the relative risk of radiation-induced cancer and NTCPs. PMID:22010028
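    NTCP figures such as the 3.6% vs 3.0% quoted above are typically produced by a dose–response model; the Lyman-Kutcher-Burman (LKB) model is one of the standard choices (the record compares three models but does not name them, so the parameter values below are illustrative, not those of the study):

    ```python
    import math

    def gEUD(dose_bins, vol_fracs, n):
        """Generalized equivalent uniform dose from a differential DVH
        (doses in Gy, volume fractions summing to 1)."""
        return sum(v * d ** (1.0 / n) for d, v in zip(dose_bins, vol_fracs)) ** n

    def lkb_ntcp(dose_bins, vol_fracs, td50, m, n):
        """LKB model: NTCP = Phi((gEUD - TD50) / (m * TD50))."""
        t = (gEUD(dose_bins, vol_fracs, n) - td50) / (m * td50)
        return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

    # Illustrative rectum-like parameters: TD50 = 76.9 Gy, m = 0.13, n = 0.09
    ntcp = lkb_ntcp([40.0, 60.0, 75.0], [0.5, 0.3, 0.2],
                    td50=76.9, m=0.13, n=0.09)
    ```

    A uniform dose equal to TD50 yields NTCP = 0.5 by construction, which is a convenient sanity check on any implementation.
    
    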

  6. Earthquake Probability Assessment for the Active Faults in Central Taiwan: A Case Study

    Directory of Open Access Journals (Sweden)

    Yi-Rui Lee

    2016-06-01

    Full Text Available Frequent high seismic activity occurs in Taiwan due to fast plate motions. According to the historical records, the most destructive earthquakes in Taiwan were caused mainly by inland active faults. The Central Geological Survey (CGS) of Taiwan has published active fault maps of Taiwan since 1998; 33 active faults are noted in the 2012 active fault map. After the Chi-Chi earthquake, CGS launched a series of projects to investigate each active fault in Taiwan in detail. This article collates these data to develop active fault parameters and draws on experience from Japan and the United States to establish a methodology for earthquake probability assessment via active faults. We consider the active faults in Central Taiwan as an example to present the earthquake probability assessment process and results. An appropriate probability model was used to estimate the conditional probability of M ≥ 6.5 and M ≥ 7.0 earthquakes. Our results show that the highest probability of an M ≥ 6.5 earthquake occurring in 30, 50, and 100 years in Central Taiwan is associated with the Tachia-Changhua fault system, whereas the lowest is associated with the Chelungpu fault. The goal of our research is to calculate the earthquake probability of all 33 active faults in Taiwan. The active fault parameters are important information that can be applied in subsequent seismic hazard analysis and seismic simulation.
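    The simplest of the "probability models" used in such assessments is the time-independent Poisson model, in which the probability of at least one characteristic earthquake in a time window depends only on the mean recurrence interval. A minimal sketch (the recurrence interval below is illustrative, not a CGS estimate):

    ```python
    import math

    def poisson_probability(recurrence_interval_yr, window_yr):
        """P(at least one event in `window_yr` years) under a Poisson
        model with mean recurrence interval `recurrence_interval_yr`."""
        rate = 1.0 / recurrence_interval_yr   # events per year
        return 1.0 - math.exp(-rate * window_yr)

    # Illustrative fault with a 340-year mean recurrence interval
    probs = {w: poisson_probability(340.0, w) for w in (30, 50, 100)}
    ```

    Time-dependent models (e.g. Brownian passage time) replace the exponential waiting-time distribution but keep the same conditional-probability framing.
    
    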

  7. Retrospective validation of a lava-flow hazard map for Mount Etna volcano

    Directory of Open Access Journals (Sweden)

    Ciro Del Negro

    2011-12-01

    Full Text Available This report presents a retrospective methodology to validate a long-term hazard map related to lava-flow invasion at Mount Etna, the most active volcano in Europe. A lava-flow hazard map provides the probability that a specific point will be affected by potential destructive volcanic processes over the time period considered. We constructed this lava-flow hazard map for Mount Etna volcano through the identification of the emission regions with the highest probabilities of eruptive vents and through characterization of the event types for the numerical simulations and the computation of the eruptive probabilities. Numerical simulations of lava-flow paths were carried out using the MAGFLOW cellular automata model. To validate the methodology developed, a hazard map was built by considering only the eruptions that occurred at Mount Etna before 1981. On the basis of the probability of coverage by lava flows, the map was divided into ten classes, and two fitting scores were calculated to measure the overlap between the hazard classes and the actual shapes of the lava flows that occurred after 1981.

  8. Mapping of the spatial distribution of silver nanoparticles in root tissues of Vicia faba by laser-induced breakdown spectroscopy (LIBS).

    Science.gov (United States)

    Krajcarová, L; Novotný, K; Kummerová, M; Dubová, J; Gloser, V; Kaiser, J

    2017-10-01

    The manuscript presents a procedure for optimal sample preparation and the mapping of the spatial distribution of metal ions and nanoparticles in plant roots using laser-induced breakdown spectroscopy (LIBS) in a double-pulse configuration (DP LIBS) in orthogonal reheating mode. Two Nd:YAG lasers were used; the first one was an ablation laser (UP-266 MACRO, New Wave, USA) with a wavelength of 266 nm, and the second one (Brilliant, Quantel, France), with a fundamental wavelength of 1064 nm, was used to reheat the microplasma. Seedlings of Vicia faba were cultivated for 7 days in CuSO4 or AgNO3 solutions with a concentration of 10 µmol/l or in a solution of silver nanoparticles (AgNPs) with a concentration of 10 µmol/l of total Ag, and in distilled water as a control. The total contents of the examined metals in the roots after sample mineralization, as well as changes in the concentrations of the metals in the cultivation solutions, were monitored by ICP-OES. Root samples embedded in the Tissue-Tek medium and cut into 40 µm thick cross sections using the Cryo-Cut Microtome proved to be best suited for an accurate LIBS analysis with a 50 µm spatial resolution. 2D raster maps of elemental distribution were created for the emission lines of Cu(I) at 324.754 nm and Ag(I) at 328.068 nm. The limits of detection of DP LIBS for the root cross sections were estimated to be 4 pg for Cu, 18 pg for Ag, and 3 pg for AgNPs. The results of Ag spatial distribution mapping indicated that unlike Ag+ ions, AgNPs do not penetrate into the inner tissues of Vicia faba roots but stay in their outermost layers. The content of Ag in roots cultivated in the AgNP solution was one order of magnitude lower compared to roots cultivated in the metal ion solutions. The significantly smaller concentration of Ag in root tissues cultivated in the AgNP solution also supports the conclusion that the absorption and uptake of AgNPs by roots of Vicia faba is very slow. LIBS mapping of root sections

  9. Validity of T2 mapping in characterization of the regeneration tissue by bone marrow derived cell transplantation in osteochondral lesions of the ankle

    Energy Technology Data Exchange (ETDEWEB)

    Battaglia, M., E-mail: milva.battaglia@ior.it [Service of Ecography and Radiology, Rizzoli Orthopaedic Institute, via Pupilli n. 1, 40136 Bologna (Italy); Rimondi, E. [Service of Ecography and Radiology, Rizzoli Orthopaedic Institute, via Pupilli n. 1, 40136 Bologna (Italy); Monti, C. [Service of CT and MRI, Casa di Cura Madre Fortunata Toniolo, Bologna (Italy); Guaraldi, F. [Department of Pathology, The Johns Hopkins University, School of Medicine, Baltimore, MD (United States); Sant' Andrea, A. [Service of CT and MRI, Casa di Cura Madre Fortunata Toniolo, Bologna (Italy); Buda, R.; Cavallo, M.; Giannini, S.; Vannini, F. [Clinical Orthopaedic and Traumatology Unit II, Rizzoli Orthopaedic Institute, Bologna (Italy)

    2011-11-15

    Objective: Bone marrow derived cell transplantation (BMDCT) has recently been suggested as a possible surgical technique to repair osteochondral lesions. To date, no qualitative MRI studies have evaluated its efficacy. The aim of our study was to investigate the validity of the MRI T2-mapping sequence in characterizing the reparative tissue obtained and its ability to correlate with clinical results. Methods and materials: 20 patients with an osteochondral lesion of the talus underwent BMDCT and were evaluated at 2 years follow-up using the MRI T2-mapping sequence. 20 healthy volunteers were recruited as controls. MRI images were acquired using a protocol suggested by the International Cartilage Repair Society, the MOCART scoring system and T2 mapping. Results were then correlated with the AOFAS clinical score. Results: The AOFAS score increased from 66.8 ± 14.5 pre-operatively to 91.2 ± 8.3 (p < 0.0005) at 2 years follow-up. A T2-relaxation time value of 35-45 ms was derived from the evaluation of healthy ankles, assumed as the normal hyaline cartilage value and used as a control. Regenerated tissue with a T2-relaxation time value comparable to hyaline cartilage was found in all the cases treated, covering a mean of 78% of the repaired lesion area. A high clinical score was related directly to an isointense signal in DPFSE fat sat (p = 0.05) and to the percentage of regenerated hyaline cartilage (p = 0.05), and inversely to the percentage of regenerated fibrocartilage. Lesion depth was inversely related to the integrity of the repaired tissue's surface (tau = -0.523, p = 0.007) and to the percentage of regenerated hyaline cartilage (rho = -0.546, p = 0.013). Conclusions: Because of its ability to assess cartilage quality and its correlation with the clinical score, the MRI T2-mapping sequence integrated with the MOCART score represents a valid, non-invasive technique for qualitative cartilage assessment after regenerative surgical procedures.

  10. Tissue reduction of map numbers after post-exposure vaccination with single latency antigen is improved by combination with acute-stage antigens in goats

    DEFF Research Database (Denmark)

    Thakur, Aneesh; Aagaard, C.; Melvang, Heidi Mikkelsen

    compared to unvaccinated control goats. FET11 and FET13 vaccination, however, provided significant protection, with absent or very low Map numbers in tissues. No goats seroconverted in the ID Screen® ELISA, except for a single goat in the unvaccinated control group at the last sampling prior to euthanasia. PPDj...

  11. Southern pine beetle infestation probability mapping using weights of evidence analysis

    Science.gov (United States)

    Jason B. Grogan; David L. Kulhavy; James C. Kroll

    2010-01-01

    Weights of Evidence (WofE) spatial analysis was used to predict probability of southern pine beetle (Dendroctonus frontalis) (SPB) infestation in Angelina, Nacogdoches, San Augustine and Shelby Co., TX. Thematic data derived from Landsat imagery (1974–2002 Landsat 1–7) were used. Data layers included: forest covertype, forest age, forest patch size...
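    The core of a Weights of Evidence analysis is a pair of log-ratios of conditional probabilities computed per evidence layer. A minimal sketch with made-up pixel counts (not data from the study):

    ```python
    import math

    def weights_of_evidence(n_bd, n_b, n_d, n_total):
        """Positive weight W+, negative weight W- and contrast C for one
        binary evidence layer B (e.g. a forest covertype class) against
        event pixels D (e.g. SPB-infested cells).
        n_bd: pixels with both B and D; n_b: pixels with B;
        n_d: pixels with D; n_total: all study-area pixels."""
        p_b_given_d = n_bd / n_d                          # P(B | D)
        p_b_given_nd = (n_b - n_bd) / (n_total - n_d)     # P(B | not D)
        w_plus = math.log(p_b_given_d / p_b_given_nd)
        w_minus = math.log((1 - p_b_given_d) / (1 - p_b_given_nd))
        return w_plus, w_minus, w_plus - w_minus

    # Illustrative counts: 150 infested pixels, 80 of them on this covertype
    w_plus, w_minus, contrast = weights_of_evidence(
        n_bd=80, n_b=2000, n_d=150, n_total=100_000)
    ```

    Summing the appropriate weights over all layers gives posterior log-odds per cell, which are then mapped as infestation probability.
    
    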

  12. Application of a weighted spatial probability model in GIS to analyse landslides in Penang Island, Malaysia

    Directory of Open Access Journals (Sweden)

    Samy Ismail Elmahdy

    2016-01-01

    Full Text Available In the current study, Penang Island, one of several mountainous areas in Malaysia that is often subjected to landslide hazard, was chosen for further investigation. A multi-criteria evaluation and a weighted spatial probability model, implemented in a GIS model builder, were applied to map and analyse landslides in Penang Island. A set of automated algorithms was used to construct new essential geological and morphometric thematic maps from remote sensing data. The maps were ranked using the weighted spatial probability model based on their contribution to the landslide hazard. The results showed that sites at an elevation of 100–300 m, with steep slopes of 10°–37° and a slope direction (aspect) in the E and SE directions, were areas of very high and high probability of landslide occurrence; the total areas were 21.393 km2 (11.84%) and 58.690 km2 (32.48%), respectively. The obtained map was verified by comparing variogram models of the mapped and the observed landslide locations, and showed a strong correlation with the locations of known landslides, indicating that the proposed method can successfully predict landslide hazard. The method is time- and cost-effective and can be used as a reference by geological and geotechnical engineers.
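    The weighted overlay at the heart of such a model reduces to a few raster operations. A minimal sketch with toy rank rasters and illustrative weights (not the study's factors or weights):

    ```python
    import numpy as np

    # Toy 3x3 rank rasters (1 = low, 5 = high susceptibility) for three factors
    slope     = np.array([[5, 4, 2], [3, 3, 1], [2, 1, 1]])
    aspect    = np.array([[4, 4, 3], [3, 2, 2], [2, 1, 1]])
    elevation = np.array([[5, 3, 2], [4, 3, 2], [1, 1, 1]])

    # Illustrative factor weights (sum to 1), reflecting each factor's
    # assumed contribution to landslide hazard
    weights = {"slope": 0.5, "aspect": 0.2, "elevation": 0.3}

    # Weighted linear combination -> susceptibility index in [1, 5]
    index = (weights["slope"] * slope
             + weights["aspect"] * aspect
             + weights["elevation"] * elevation)

    # Reclassify: 0 = low (< 2.5), 1 = moderate, 2 = high (>= 4.0)
    hazard_class = np.digitize(index, bins=[2.5, 4.0])
    ```

    In the study the weights are derived from the spatial association of each factor map with mapped landslides rather than assigned by hand.
    
    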

  13. METHOD OF FOREST FIRES PROBABILITY ASSESSMENT WITH POISSON LAW

    Directory of Open Access Journals (Sweden)

    A. S. Plotnikova

    2016-01-01

    Full Text Available The article describes a method for estimating forest fire burn probability on the basis of the Poisson distribution. The λ parameter is assumed to be the mean daily number of fires detected for each Forest Fire Danger Index class within a specific period of time; λ was therefore calculated separately for the spring, summer and autumn seasons. Multi-annual daily Forest Fire Danger Index values together with an EO-derived hot-spot map were the input data for the statistical analysis. The major result of the study is the generation of a database on forest fire burn probability. Results were validated against EO daily data on forest fires detected over Irkutsk oblast in 2013. The daily weighted average probability was shown to be linked with the daily number of detected forest fires. Meanwhile, a number of fires were found to have developed when the estimated probability was low. A possible explanation of this phenomenon is provided.
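    Given the per-class mean daily fire count λ, the Poisson law gives the daily burn probabilities directly. A minimal sketch (the λ value is illustrative):

    ```python
    import math

    def fire_probability(lam, k=None):
        """Poisson fire probabilities for one day and one danger-index
        class: P(exactly k fires) if k is given, else P(at least one fire).
        `lam` is the mean daily number of detected fires for the class."""
        if k is None:
            return 1.0 - math.exp(-lam)                      # P(N >= 1)
        return lam ** k * math.exp(-lam) / math.factorial(k)  # P(N = k)

    p_any = fire_probability(0.2)       # P(at least one fire that day)
    p_two = fire_probability(0.2, k=2)  # P(exactly two fires that day)
    ```

    Tabulating P(N ≥ 1) per danger-index class and season is essentially what the described database stores.
    
    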

  14. Saliency Detection via Absorbing Markov Chain With Learnt Transition Probability.

    Science.gov (United States)

    Lihe Zhang; Jianwu Ai; Bowen Jiang; Huchuan Lu; Xiukui Li

    2018-02-01

    In this paper, we propose a bottom-up saliency model based on an absorbing Markov chain (AMC). First, a sparsely connected graph is constructed to capture the local context information of each node. All image boundary nodes and the remaining nodes are treated as the absorbing nodes and transient nodes of the absorbing Markov chain, respectively. Then, the expected number of visits from each transient node to all other transient nodes before absorption is used to represent the saliency value of that node. This absorbed time depends on the weights along the path and their spatial coordinates, which are completely encoded in the transition probability matrix. Considering the importance of this matrix, we adopt different hierarchies of deep features extracted from fully convolutional networks and learn a transition probability matrix, called the learnt transition probability matrix. Although this significantly improves performance, salient objects are still not uniformly highlighted. To solve this problem, an angular embedding technique is investigated to refine the saliency results. Based on pairwise local orderings, which are produced by the saliency maps of the AMC and boundary maps, we rearrange the global orderings (saliency values) of all nodes. Extensive experiments demonstrate that the proposed algorithm outperforms the state-of-the-art methods on six publicly available benchmark data sets.
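    The absorbed-time computation the model relies on is standard absorbing-chain algebra: with the transient-to-transient block Q of the transition matrix, the fundamental matrix N = (I − Q)⁻¹ gives expected visit counts, and its row sums give the expected time to absorption. A minimal sketch with a made-up substochastic Q (in the paper Q is learnt from deep-feature affinities):

    ```python
    import numpy as np

    # Toy transient-to-transient block Q (rows sum to < 1; the missing
    # mass is the probability of jumping to an absorbing boundary node)
    Q = np.array([[0.0, 0.5, 0.2],
                  [0.4, 0.0, 0.3],
                  [0.1, 0.3, 0.0]])

    # Fundamental matrix: N[i, j] = expected visits to transient node j
    # starting from transient node i before absorption
    N = np.linalg.inv(np.eye(3) - Q)

    # Expected absorbed time from each transient node = row sums of N;
    # in the AMC saliency model, larger absorbed time means more salient
    absorbed_time = N.sum(axis=1)
    ```

    Since boundary nodes are the absorbers, interior nodes that are weakly connected to the boundary take long to be absorbed and are scored as salient.
    
    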

  15. Quantitative susceptibility mapping (QSM): Decoding MRI data for a tissue magnetic biomarker

    Science.gov (United States)

    Wang, Yi; Liu, Tian

    2015-01-01

    In MRI, the main magnetic field polarizes the electron cloud of a molecule, generating a chemical shift for observer protons within the molecule and a magnetic susceptibility inhomogeneity field for observer protons outside the molecule. The number of water protons surrounding a molecule for detecting its magnetic susceptibility is vastly greater than the number of protons within the molecule for detecting its chemical shift. However, the study of tissue magnetic susceptibility has been hindered by the poor molecular specificity of hitherto used methods based on MRI signal phase and T2* contrast, which depend convolutedly on surrounding susceptibility sources. Deconvolution of the MRI signal phase can determine tissue susceptibility but is challenged by the lack of MRI signal in the background and by the zeroes in the dipole kernel. Recently, physically meaningful regularizations, including the Bayesian approach, have been developed to enable accurate quantitative susceptibility mapping (QSM) for studying iron distribution, metabolic oxygen consumption, blood degradation, calcification, demyelination, and other pathophysiological susceptibility changes, as well as contrast agent biodistribution in MRI. This paper attempts to summarize the basic physical concepts and essential algorithmic steps in QSM, to describe clinical and technical issues under active development, and to provide references, codes, and testing data for readers interested in QSM. Magn Reson Med 73:82–101, 2015. © 2014 The Authors. Magnetic Resonance in Medicine published by Wiley Periodicals, Inc. on behalf of the International Society for Magnetic Resonance in Medicine. This is an open access article under the terms of the Creative Commons Attribution License, which permits use, distribution, and reproduction in any medium, provided the original work is properly cited. PMID:25044035
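    The "zeroes in the dipole kernel" that make the field-to-susceptibility inversion ill-posed are easy to see numerically: in k-space the unit dipole kernel is D = 1/3 − kz²/|k|², which vanishes on the magic-angle cone. A minimal FFT-grid sketch (no regularization, just the kernel itself):

    ```python
    import numpy as np

    def dipole_kernel(shape, voxel_size=(1.0, 1.0, 1.0)):
        """k-space unit dipole kernel D = 1/3 - kz^2 / |k|^2, with B0
        along z. Its zeros on the cone kz^2/|k|^2 = 1/3 are what make
        naive deconvolution for QSM ill-posed."""
        kx, ky, kz = np.meshgrid(
            *(np.fft.fftfreq(n, d=d) for n, d in zip(shape, voxel_size)),
            indexing="ij")
        k2 = kx ** 2 + ky ** 2 + kz ** 2
        with np.errstate(divide="ignore", invalid="ignore"):
            D = 1.0 / 3.0 - kz ** 2 / k2
        D[k2 == 0] = 0.0  # DC component is undefined; set to zero by convention
        return D

    D = dipole_kernel((32, 32, 32))
    ```

    Bayesian and other regularized QSM methods replace the naive division by D with an inversion that is stable near these zeros.
    
    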

  16. Method for Automatic Selection of Parameters in Normal Tissue Complication Probability Modeling.

    Science.gov (United States)

    Christophides, Damianos; Appelt, Ane L; Gusnanto, Arief; Lilley, John; Sebag-Montefiore, David

    2018-07-01

    To present a fully automatic method to generate multiparameter normal tissue complication probability (NTCP) models and compare its results with those of a published model, using the same patient cohort. Data were analyzed from 345 rectal cancer patients treated with external radiation therapy to predict the risk of patients developing grade 1 or ≥2 cystitis. In total, 23 clinical factors were included in the analysis as candidate predictors of cystitis. Principal component analysis was used to decompose the bladder dose-volume histogram into 8 principal components, explaining more than 95% of the variance. The data set of clinical factors and principal components was divided into training (70%) and test (30%) data sets, with the training data set used by the algorithm to compute an NTCP model. The first step of the algorithm was to obtain a bootstrap sample, followed by multicollinearity reduction using the variance inflation factor and genetic algorithm optimization to determine an ordinal logistic regression model that minimizes the Bayesian information criterion. The process was repeated 100 times, and the model with the minimum Bayesian information criterion was recorded on each iteration. The most frequent model was selected as the final "automatically generated model" (AGM). The published model and AGM were fitted on the training data sets, and the risk of cystitis was calculated. The 2 models had no significant differences in predictive performance, both for the training and test data sets (P value > .05) and found similar clinical and dosimetric factors as predictors. Both models exhibited good explanatory performance on the training data set (P values > .44), which was reduced on the test data sets (P values < .05). The predictive value of the AGM is equivalent to that of the expert-derived published model. It demonstrates potential in saving time, tackling problems with a large number of parameters, and standardizing variable selection in NTCP
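    The principal-component step described above, compressing the dose-volume histogram into a few components that explain at least 95% of the variance, can be sketched with a plain SVD; the synthetic matrix below merely stands in for the bladder DVHs:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # Synthetic stand-in for bladder DVH curves: 50 patients x 60 dose bins
    dvh = rng.random((50, 60)).cumsum(axis=1)

    # PCA via SVD on the mean-centred data
    X = dvh - dvh.mean(axis=0)
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    explained = (s ** 2) / (s ** 2).sum()   # variance fraction per component

    # Smallest number of components explaining >= 95% of the variance
    n_comp = int(np.searchsorted(np.cumsum(explained), 0.95) + 1)

    # Principal-component scores per patient: these, with the clinical
    # factors, would feed the downstream regression / model selection
    scores = X @ Vt[:n_comp].T
    ```

    In the paper 8 components sufficed for the real DVHs; the bootstrap, variance-inflation-factor screening and genetic-algorithm BIC search then operate on these scores plus the 23 clinical factors.
    
    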

  17. DNA methylation map in circulating leukocytes mirrors subcutaneous adipose tissue methylation pattern: a genome-wide analysis from non-obese and obese patients

    Science.gov (United States)

    Crujeiras, A. B.; Diaz-Lagares, A.; Sandoval, J.; Milagro, F. I.; Navas-Carretero, S.; Carreira, M. C.; Gomez, A.; Hervas, D.; Monteiro, M. P.; Casanueva, F. F.; Esteller, M.; Martinez, J. A.

    2017-01-01

    The characterization of the epigenetic changes within the obesity-related adipose tissue will provide new insights to understand this metabolic disorder, but adipose tissue is not easy to sample in population-based studies. We aimed to evaluate the capacity of circulating leukocytes to reflect the adipose tissue-specific DNA methylation status of obesity susceptibility. DNA samples isolated from subcutaneous adipose tissue and circulating leukocytes were hybridized in the Infinium HumanMethylation 450 BeadChip. Data were compared between samples from obese (n = 45) and non-obese (n = 8–10) patients by Wilcoxon-rank test, unadjusted for cell type distributions. A global hypomethylation of the differentially methylated CpG sites (DMCpGs) was observed in the obese subcutaneous adipose tissue and leukocytes. The overlap analysis yielded a number of genes mapped by the common DMCpGs that were identified to reflect the obesity state in the leukocytes. Specifically, the methylation levels of FGFRL1, NCAPH2, PNKD and SMAD3 exhibited excellent and statistically significant efficiencies in the discrimination of obesity from non-obesity status (AUC > 0.80; p obesity-related adipose tissue pathogenesis through peripheral blood analysis, an easily accessible and minimally invasive biological material instead of adipose tissue. PMID:28211912

  18. Integrating spatial, temporal, and size probabilities for the annual landslide hazard maps in the Shihmen watershed, Taiwan

    Directory of Open Access Journals (Sweden)

    C. Y. Wu

    2013-09-01

    Full Text Available Landslide spatial, temporal, and size probabilities were combined to perform a landslide hazard assessment in this study. Eleven intrinsic geomorphological factors and two extrinsic rainfall factors were evaluated as landslide susceptibility factors using success rate curves, landslide ratio plots, frequency distributions of landslide and non-landslide groups, and probability–probability plots. Data on landslides caused by Typhoon Aere in the Shihmen watershed were selected to train the susceptibility model. The landslide area probability, based on the power-law relationship between landslide area and its noncumulative number, was analyzed using the Pearson type 5 probability density function. The exceedance probabilities of rainfall with various recurrence intervals, including 2, 5, 10, 20, 50, 100 and 200 yr, were used to determine the temporal probabilities of the events. The study was conducted in the Shihmen watershed, which has an area of 760 km2 and is one of the main water sources for northern Taiwan. The validation result for Typhoon Krosa demonstrated that this landslide hazard model can be used to predict landslide probabilities. The results suggest that integrating spatial, area, and exceedance probabilities to estimate the annual probability of each slope unit is feasible. The advantage of this annual landslide probability model lies in its ability to estimate the annual landslide risk, instead of a scenario-based risk.
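    The integration of the three probabilities can be sketched for a single slope unit as follows. All numbers are illustrative, and the incremental-exceedance weighting over rainfall scenarios is one simple way (not necessarily the authors' exact scheme) to avoid double-counting events:

    ```python
    # Rainfall scenarios by recurrence interval (years); the annual
    # exceedance probability of the T-year rainfall is 1/T
    recurrence_intervals = [2, 5, 10, 20, 50, 100, 200]

    # Illustrative spatial (susceptibility) probability of this slope
    # unit failing under each rainfall scenario
    spatial_prob = {2: 0.05, 5: 0.15, 10: 0.25, 20: 0.35,
                    50: 0.45, 100: 0.55, 200: 0.65}

    # Illustrative size probability: P(landslide area exceeds the size
    # of interest), from the power-law / Pearson type 5 fit
    size_prob = 0.30

    # Weight each scenario by the increment of annual exceedance
    # probability between successive recurrence intervals
    aep = [1.0 / t for t in recurrence_intervals] + [0.0]
    annual = 0.0
    for i, t in enumerate(recurrence_intervals):
        annual += (aep[i] - aep[i + 1]) * spatial_prob[t] * size_prob
    ```

    Repeating this per slope unit yields the annual landslide probability map, the quantity the record contrasts with scenario-based hazard.
    
    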

  19. Experimental investigations into sample preparation of Alzheimer tissue specimens for nuclear microprobe analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pinheiro, T [CEC-JRC, Central Bureau for Nuclear Measurements, Geel (Belgium); Tapper, U A.S. [Dept. of Nuclear Physics, Lund Inst. of Science and Tech. (Sweden); Sturesson, K; Brun, A [Div. of Neuropathology, Dept. of Pathology, Lund University Hospital (Sweden)

    1991-03-01

    Nuclear microprobe analysis was applied to the study of elemental distribution in brain sections of patients with a diagnosis of Alzheimer's disease. Stained and nonstained cryosections were studied. The work carried out shows that serious elemental losses follow the sample staining procedure. Major losses occurred during a simple rinse of the tissue section, probably erasing most of the in-vivo gradients, which shows that generally very little information can be gained from stained sections. However, in many cases stained sections are compulsory because of the requirement to recognize the area which is to be studied. All the elemental maps obtained for the neurofibrillary deposits indicate a localized concentration of Si, and probably also Al, associated with the senile plaque core. Neither of these elements was found in the staining solutions used. The validity of the results is discussed, as well as the possible role of Al and/or Si in the development of Alzheimer's disease. (orig.).

  20. Accounting for access costs in validation of soil maps

    NARCIS (Netherlands)

    Yang, Lin; Brus, Dick J.; Zhu, A.X.; Li, Xinming; Shi, Jingjing

    2018-01-01

    The quality of soil maps can best be estimated by collecting additional data at locations selected by probability sampling. These data can be used in design-based estimation of map quality measures such as the population mean of the squared prediction errors (MSE) for continuous soil maps and

  1. Regularizing mappings of Lévy measures

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler; Thorbjørnsen, Steen

    2006-01-01

    the class of selfdecomposable laws onto the so-called Thorin class. Further, partly motivated by our previous studies of infinite divisibility in free probability, we introduce a one-parameter family of one-to-one mappings, which interpolates smoothly between (α = 0) and the identity mapping (α = 1). We prove that each of the mappings shares many of the properties of . In particular, they are representable in terms of stochastic integrals with respect to associated Lévy processes.

  2. Re-evaluation of a novel approach for quantitative myocardial oedema detection by analysing tissue inhomogeneity in acute myocarditis using T2-mapping

    Energy Technology Data Exchange (ETDEWEB)

    Baessler, Bettina; Treutlein, Melanie; Maintz, David; Bunck, Alexander C. [University Hospital of Cologne, Department of Radiology, Cologne (Germany); Schaarschmidt, Frank [Leibniz Universitaet Hannover, Institute of Biostatistics, Faculty of Natural Sciences, Hannover (Germany); Stehning, Christian [Philips Research, Hamburg (Germany); Schnackenburg, Bernhard [Philips, Healthcare Germany, Hamburg (Germany); Michels, Guido [University Hospital of Cologne, Department III of Internal Medicine, Heart Centre, Cologne (Germany)

    2017-12-15

    To re-evaluate a recently suggested approach of quantifying myocardial oedema and increased tissue inhomogeneity in myocarditis by T2-mapping. Cardiac magnetic resonance data of 99 patients with myocarditis were retrospectively analysed. Thirty healthy volunteers served as controls. T2-mapping data were acquired at 1.5 T using a gradient-spin-echo T2-mapping sequence. T2-maps were segmented according to the 16-segment AHA model. Segmental T2-values, segmental pixel standard deviation (SD) and the derived parameters maxT2, maxSD and madSD were analysed and compared to the established Lake Louise criteria (LLC). A re-estimation of logistic regression models revealed that all models containing an SD parameter were superior to any model containing global myocardial T2. Using a combined cut-off of 1.8 ms for madSD + 68 ms for maxT2 resulted in a diagnostic sensitivity of 75% and specificity of 80% and showed a similar diagnostic performance compared to LLC in receiver-operating-characteristic analyses. Combining madSD, maxT2 and late gadolinium enhancement (LGE) in a model resulted in a superior diagnostic performance compared to LLC (sensitivity 93%, specificity 83%). The results show that the novel T2-mapping-derived parameters exhibit an additional diagnostic value over LGE, with the inherent potential to overcome the current limitations of T2-mapping. (orig.)
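    Applying the combined cut-off reported above can be sketched as follows. Two caveats: the record does not spell out the definition of madSD, so it is assumed here to be the median absolute deviation of the 16 segmental pixel-SDs, and the two cut-offs are assumed to be combined with a logical AND:

    ```python
    import statistics

    def classify_myocarditis(segment_t2_ms, segment_sd_ms,
                             max_t2_cutoff=68.0, mad_sd_cutoff=1.8):
        """Combined cut-off from the record: maxT2 > 68 ms together with
        madSD > 1.8 ms. madSD is taken as the median absolute deviation
        of the 16 segmental pixel-SD values (an assumption)."""
        max_t2 = max(segment_t2_ms)
        med = statistics.median(segment_sd_ms)
        mad_sd = statistics.median(abs(sd - med) for sd in segment_sd_ms)
        return max_t2 > max_t2_cutoff and mad_sd > mad_sd_cutoff
    ```

    Per-segment SD captures tissue inhomogeneity within a segment, which is why the SD-derived parameters outperform a single global T2 value in the reported models.
    
    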

  3. Re-evaluation of a novel approach for quantitative myocardial oedema detection by analysing tissue inhomogeneity in acute myocarditis using T2-mapping

    International Nuclear Information System (INIS)

    Baessler, Bettina; Treutlein, Melanie; Maintz, David; Bunck, Alexander C.; Schaarschmidt, Frank; Stehning, Christian; Schnackenburg, Bernhard; Michels, Guido

    2017-01-01

    To re-evaluate a recently suggested approach of quantifying myocardial oedema and increased tissue inhomogeneity in myocarditis by T2-mapping. Cardiac magnetic resonance data of 99 patients with myocarditis were retrospectively analysed. Thirty healthy volunteers served as controls. T2-mapping data were acquired at 1.5 T using a gradient-spin-echo T2-mapping sequence. T2-maps were segmented according to the 16-segment AHA model. Segmental T2-values, segmental pixel standard deviation (SD) and the derived parameters maxT2, maxSD and madSD were analysed and compared to the established Lake Louise criteria (LLC). A re-estimation of logistic regression models revealed that all models containing an SD parameter were superior to any model containing global myocardial T2. Using a combined cut-off of 1.8 ms for madSD + 68 ms for maxT2 resulted in a diagnostic sensitivity of 75% and specificity of 80% and showed a similar diagnostic performance compared to LLC in receiver-operating-characteristic analyses. Combining madSD, maxT2 and late gadolinium enhancement (LGE) in a model resulted in a superior diagnostic performance compared to LLC (sensitivity 93%, specificity 83%). The results show that the novel T2-mapping-derived parameters exhibit an additional diagnostic value over LGE, with the inherent potential to overcome the current limitations of T2-mapping. (orig.)

  4. Monte Carlo methods to calculate impact probabilities

    Science.gov (United States)

    Rickman, H.; Wiśniowski, T.; Wajer, P.; Gabryszewski, R.; Valsecchi, G. B.

    2014-09-01

    infinity, while the Hill sphere method results in a severely underestimated probability. We provide a discussion of the reasons for these differences, and we finally present the results of the MOID method in the form of probability maps for the Earth and Mars on their current orbits. These maps show a relatively flat probability distribution, except for the occurrence of two ridges found at small inclinations and for coinciding projectile/target perihelion distances. Conclusions: Our results verify the standard formulae in the general case, away from the singularities. In fact, severe shortcomings are limited to the immediate vicinity of those extreme orbits. On the other hand, the new Monte Carlo methods can be used without excessive consumption of computer time, and the MOID method avoids the problems associated with the other methods. Appendices are available in electronic form at http://www.aanda.org
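    The general idea behind such Monte Carlo probability estimates (sampling random encounter geometries and counting hits) can be sketched in a toy form. The geometry below (uniform sampling over a unit impact plane, circular capture cross-section) is an illustrative assumption, not the paper's MOID method:

```python
import math
import random

def estimate_impact_probability(n_trials: int, capture_radius: float, seed: int = 42) -> float:
    """Toy Monte Carlo: fraction of encounter points, sampled uniformly over a
    [-1, 1] x [-1, 1] impact plane, falling within the capture radius of the
    target. Analytic value for this geometry: pi * r**2 / 4."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_trials):
        x, y = rng.uniform(-1, 1), rng.uniform(-1, 1)
        if math.hypot(x, y) < capture_radius:
            hits += 1
    return hits / n_trials

p = estimate_impact_probability(100_000, 0.1)
print(f"estimated impact probability: {p:.4f}")  # analytic value ~0.0079
```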

  5. Generalized Probability-Probability Plots

    NARCIS (Netherlands)

    Mushkudiani, N.A.; Einmahl, J.H.J.

    2004-01-01

    We introduce generalized Probability-Probability (P-P) plots in order to study the one-sample goodness-of-fit problem and the two-sample problem, for real valued data. These plots, that are constructed by indexing with the class of closed intervals, globally preserve the properties of classical P-P plots.

  6. Quantum Probabilities as Behavioral Probabilities

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2017-03-01

    Full Text Available We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.

  7. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. {\\em expected values}, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...

  8. Determination of glutamate dehydrogenase activity and its kinetics in mouse tissues using metabolic mapping (quantitative enzyme histochemistry).

    Science.gov (United States)

    Botman, Dennis; Tigchelaar, Wikky; Van Noorden, Cornelis J F

    2014-11-01

    Glutamate dehydrogenase (GDH) catalyses the reversible conversion of glutamate into α-ketoglutarate with the concomitant reduction of NAD(P)(+) to NAD(P)H or vice versa. GDH activity is subject to complex allosteric regulation including substrate inhibition. To determine GDH kinetics in situ, we assessed the effects of various glutamate concentrations in combination with either the coenzyme NAD(+) or NADP(+) on GDH activity in mouse liver cryostat sections using metabolic mapping. NAD(+)-dependent GDH V(max) was 2.5-fold higher than NADP(+)-dependent V(max), whereas the K(m) was similar, 1.92 mM versus 1.66 mM, when NAD(+) or NADP(+) was used, respectively. With either coenzyme, V(max) was determined at 10 mM glutamate and substrate inhibition was observed at higher glutamate concentrations with a K(i) of 12.2 and 3.95 for NAD(+) and NADP(+) used as coenzyme, respectively. NAD(+)- and NADP(+)-dependent GDH activities were examined in various mouse tissues. GDH activity was highest in liver and much lower in other tissues. In all tissues, the highest activity was found when NAD(+) was used as a coenzyme. In conclusion, GDH activity in mice is highest in the liver with NAD(+) as a coenzyme and highest GDH activity was determined at a glutamate concentration of 10 mM. © The Author(s) 2014.
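    The kinetics described here (Michaelis-Menten behaviour with substrate inhibition above ~10 mM glutamate) can be illustrated numerically. The uncompetitive substrate-inhibition rate law below is the standard textbook form, assumed here since the abstract does not state the exact equation used; Vmax is normalized to 1:

```python
def gdh_rate(s_mM: float, vmax: float, km_mM: float, ki_mM: float) -> float:
    """Michaelis-Menten rate with uncompetitive substrate inhibition:
    v = Vmax * S / (Km + S * (1 + S / Ki))."""
    return vmax * s_mM / (km_mM + s_mM * (1.0 + s_mM / ki_mM))

# Parameters from the abstract for NAD+ as coenzyme: Km = 1.92 mM, Ki = 12.2.
v10 = gdh_rate(10.0, 1.0, 1.92, 12.2)   # at the reported optimum, 10 mM
v40 = gdh_rate(40.0, 1.0, 1.92, 12.2)   # well into the inhibited regime
print(v10 > v40)  # activity declines above ~10 mM glutamate -> True
```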

  9. On the Inclusion of Short-distance Bystander Effects into a Logistic Tumor Control Probability Model.

    Science.gov (United States)

    Tempel, David G; Brodin, N Patrik; Tomé, Wolfgang A

    2018-01-01

    Currently, interactions between voxels are neglected in the tumor control probability (TCP) models used in biologically-driven intensity-modulated radiotherapy treatment planning. However, experimental data suggests that this may not always be justified when bystander effects are important. We propose a model inspired by the Ising model, a short-range interaction model, to investigate if and when it is important to include voxel to voxel interactions in biologically-driven treatment planning. This Ising-like model for TCP is derived by first showing that the logistic model of tumor control is mathematically equivalent to a non-interacting Ising model. Using this correspondence, the parameters of the logistic model are mapped to the parameters of an Ising-like model and bystander interactions are introduced as a short-range interaction as is the case for the Ising model. As an example, we apply the model to study the effect of bystander interactions in the case of radiation therapy for prostate cancer. The model shows that it is adequate to neglect bystander interactions for dose distributions that completely cover the treatment target and yield TCP estimates that lie in the shoulder of the dose response curve. However, for dose distributions that yield TCP estimates that lie on the steep part of the dose response curve or for inhomogeneous dose distributions having significant hot and/or cold regions, bystander effects may be important. Furthermore, the proposed model highlights a previously unexplored and potentially fruitful connection between the fields of statistical mechanics and tumor control probability/normal tissue complication probability modeling.
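    The distinction the abstract draws between the shoulder and the steep part of the dose-response curve can be made concrete with the common logistic TCP parameterization in terms of D50 and the normalized slope gamma50. This exact form is an assumption (the paper's fitted parameters are not reproduced here):

```python
import math

def logistic_tcp(dose_gy: float, d50_gy: float, gamma50: float) -> float:
    """Common logistic TCP parameterization (an assumption here, not
    necessarily the authors' exact form):
    TCP(D) = 1 / (1 + exp(4 * gamma50 * (1 - D / D50)))."""
    return 1.0 / (1.0 + math.exp(4.0 * gamma50 * (1.0 - dose_gy / d50_gy)))

# Steep part of the curve (D = D50) vs. the shoulder (D well above D50):
print(round(logistic_tcp(70.0, 70.0, 2.0), 2))  # 0.5, mid-slope
print(round(logistic_tcp(90.0, 70.0, 2.0), 2))  # 0.91, shoulder
```

Per the abstract, bystander interactions matter mainly for dose distributions landing on the steep region, where a small dose perturbation moves TCP substantially.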

  10. Association of intraoperative tissue oxygenation with suspected risk factors for tissue hypoxia.

    Science.gov (United States)

    Spruit, R J; Schwarte, L A; Hakenberg, O W; Scheeren, T W L

    2013-10-01

    Tissue hypoxia may cause organ dysfunction, but not much is known about tissue oxygenation in the intraoperative setting. We studied microcirculatory tissue oxygen saturation (StO₂) to determine representative values for anesthetized patients undergoing urological surgery and to test the hypothesis that StO₂ is associated with known perioperative risk factors for morbidity and mortality, conventionally monitored variables, and hypotension requiring norepinephrine. Using near-infrared spectroscopy, we measured StO₂ on the thenar eminence in 160 patients undergoing open urological surgery under general anesthesia (FiO2 0.35-0.4), and calculated its correlations with age, risk level for general perioperative complications and mortality (high if age ≥70 and procedure is radical cystectomy), mean arterial pressure (MAP), hemoglobin concentration (Hb), central venous oxygen saturation (ScvO₂), and norepinephrine use. The time averaged StO₂ was 86 ± 6 % (mean ± SD). In the multivariate analysis, Hb [standardized coefficient (SC) 0.21, p = 0.003] and ScvO₂ (SC 0.53) were independently associated with StO₂. StO₂ was partly dependent on MAP only when this was below 65 mmHg (lowest MAP SC 0.20, p = 0.006; MAP area under the curve <65 mmHg SC 0.03, p = 0.02). Finally, StO₂ was slightly lower in patients requiring norepinephrine (85 ± 6 vs. 89 ± 6 %, p = 0.001). Intraoperative StO₂ in urological patients was comparable to that of healthy volunteers breathing room air as reported in the literature and correlated with known perioperative risk factors. Further research should investigate its association with outcome and the effect of interventions aimed at optimizing StO₂.

  11. Statistical properties of the gyro-averaged standard map

    Science.gov (United States)

    da Fonseca, Julio D.; Sokolov, Igor M.; Del-Castillo-Negrete, Diego; Caldas, Ibere L.

    2015-11-01

    A statistical study of the gyro-averaged standard map (GSM) is presented. The GSM is an area preserving map model proposed as a simplified description of finite Larmor radius (FLR) effects on ExB chaotic transport in magnetized plasmas with zonal flows perturbed by drift waves. The GSM's effective perturbation parameter, gamma, is proportional to the zero-order Bessel function of the particle's Larmor radius. In the limit of zero Larmor radius, the GSM reduces to the standard, Chirikov-Taylor map. We consider plasmas in thermal equilibrium and assume a Larmor radius probability density function (pdf) resulting from a Maxwell-Boltzmann distribution. Since the particles have in general different Larmor radii, each orbit is computed using a different perturbation parameter, gamma. We present analytical and numerical computations of the pdf of gamma for a Maxwellian distribution. We also compute the pdf of global chaos, which gives the probability that a particle with a given Larmor radius exhibits global chaos, i.e. the probability that Kolmogorov-Arnold-Moser (KAM) transport barriers do not exist.
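    The construction described above (a Chirikov-Taylor kick whose strength is scaled by the zero-order Bessel function of the Larmor radius) can be sketched with stdlib-only Python. The J0 implementation uses the standard integral representation; the map form is the usual standard map on the 2-pi torus, which is an assumption about conventions:

```python
import math

def bessel_j0(x: float, n: int = 1000) -> float:
    """J0(x) via its integral representation, (1/pi) * int_0^pi cos(x sin t) dt,
    approximated with a midpoint rule (stdlib-only)."""
    return sum(math.cos(x * math.sin(math.pi * (k + 0.5) / n)) for k in range(n)) / n

def gyro_averaged_standard_map(theta: float, p: float, k: float, larmor_radius: float):
    """One iteration of the gyro-averaged standard map: the Chirikov kick
    strength k is scaled by J0 of the particle's Larmor radius."""
    gamma = k * bessel_j0(larmor_radius)
    p_new = (p + gamma * math.sin(theta)) % (2 * math.pi)
    theta_new = (theta + p_new) % (2 * math.pi)
    return theta_new, p_new

# Zero Larmor radius recovers the standard (Chirikov-Taylor) map: J0(0) = 1.
print(round(bessel_j0(0.0), 6))  # 1.0
theta1, p1 = gyro_averaged_standard_map(1.0, 0.5, k=0.9, larmor_radius=0.0)
```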

  12. Probabilistic atlas-based segmentation of combined T1-weighted and DUTE MRI for calculation of head attenuation maps in integrated PET/MRI scanners.

    Science.gov (United States)

    Poynton, Clare B; Chen, Kevin T; Chonde, Daniel B; Izquierdo-Garcia, David; Gollub, Randy L; Gerstner, Elizabeth R; Batchelor, Tracy T; Catana, Ciprian

    2014-01-01

    We present a new MRI-based attenuation correction (AC) approach for integrated PET/MRI systems that combines both segmentation- and atlas-based methods by incorporating dual-echo ultra-short echo-time (DUTE) and T1-weighted (T1w) MRI data and a probabilistic atlas. Segmented atlases were constructed from CT training data using a leave-one-out framework and combined with T1w, DUTE, and CT data to train a classifier that computes the probability of air/soft tissue/bone at each voxel. This classifier was applied to segment the MRI of the subject of interest and attenuation maps (μ-maps) were generated by assigning specific linear attenuation coefficients (LACs) to each tissue class. The μ-maps generated with this "Atlas-T1w-DUTE" approach were compared to those obtained from DUTE data using a previously proposed method. For validation of the segmentation results, segmented CT μ-maps were considered to be the "silver standard"; the segmentation accuracy was assessed qualitatively and quantitatively through calculation of the Dice similarity coefficient (DSC). Relative change (RC) maps between the CT and MRI-based attenuation corrected PET volumes were also calculated for a global voxel-wise assessment of the reconstruction results. The μ-maps obtained using the Atlas-T1w-DUTE classifier agreed well with those derived from CT; the mean DSCs for the Atlas-T1w-DUTE-based μ-maps across all subjects were higher than those for DUTE-based μ-maps; the atlas-based μ-maps also showed a lower percentage of misclassified voxels across all subjects. RC maps from the atlas-based technique also demonstrated improvement in the PET data compared to the DUTE method, both globally as well as regionally.
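    The Dice similarity coefficient used for validation above has a one-line definition, DSC = 2|A ∩ B| / (|A| + |B|). A minimal sketch on toy voxel sets (the masks below are made-up examples, not data from the study):

```python
def dice_coefficient(a: set, b: set) -> float:
    """Dice similarity coefficient between two voxel label sets:
    DSC = 2 * |A & B| / (|A| + |B|). Two empty sets count as perfect overlap."""
    if not a and not b:
        return 1.0
    return 2.0 * len(a & b) / (len(a) + len(b))

# Toy "bone" masks from a CT-derived and an MRI-derived mu-map:
ct_mask = {(0, 0), (0, 1), (1, 0), (1, 1)}
mr_mask = {(0, 1), (1, 0), (1, 1), (2, 1)}
print(dice_coefficient(ct_mask, mr_mask))  # 2*3/(4+4) = 0.75
```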

  13. Future southcentral US wildfire probability due to climate change

    Science.gov (United States)

    Stambaugh, Michael C.; Guyette, Richard P.; Stroh, Esther D.; Struckhoff, Matthew A.; Whittier, Joanna B.

    2018-01-01

    Globally, changing fire regimes due to climate is one of the greatest threats to ecosystems and society. In this paper, we present projections of future fire probability for the southcentral USA using downscaled climate projections and the Physical Chemistry Fire Frequency Model (PC2FM). Future fire probability is projected to both increase and decrease across the study region of Oklahoma, New Mexico, and Texas. Among all end-of-century projections, change in fire probabilities (CFPs) range from − 51 to + 240%. Greatest absolute increases in fire probability are shown for areas within the range of approximately 75 to 160 cm mean annual precipitation (MAP), regardless of climate model. Although fire is likely to become more frequent across the southcentral USA, spatial patterns may remain similar unless significant increases in precipitation occur, whereby more extensive areas with increased fire probability are predicted. Perhaps one of the most important results is illumination of climate changes where fire probability response (+, −) may deviate (i.e., tipping points). Fire regimes of southcentral US ecosystems occur in a geographic transition zone from reactant- to reaction-limited conditions, potentially making them uniquely responsive to different scenarios of temperature and precipitation changes. Identification and description of these conditions may help anticipate fire regime changes that will affect human health, agriculture, species conservation, and nutrient and water cycling.

  14. Linear and nonlinear optical signals in probability and phase-space representations

    International Nuclear Information System (INIS)

    Man'ko, Margarita A

    2006-01-01

    A review of different representations of signals, including the phase-space representations and tomographic representations, is presented. The signals under consideration are either linear or nonlinear ones. The linear signals satisfy linear quantumlike Schroedinger and von Neumann equations. Nonlinear signals satisfy nonlinear Schroedinger equations as well as the Gross-Pitaevskii equation describing solitons in Bose-Einstein condensate. The Ville-Wigner distributions for solitons are considered in comparison with tomographic-probability densities describing solitons completely. Different kinds of tomography - symplectic tomography, optical tomography and Fresnel tomography - are reviewed. A new kind of map of the signals onto probability distributions of a discrete photon-number-like variable is discussed. Mutual relations between different transformations of signal functions are established in explicit form. Such characteristics of the signal-probability distribution as entropy are discussed.

  15. Identification of titanium in human tissues: probable role in pathologic processes

    International Nuclear Information System (INIS)

    Moran, C.A.; Mullick, F.G.; Ishak, K.G.; Johnson, F.B.; Hummer, W.B.

    1991-01-01

    Six cases of titanium dioxide exposure involving lung, skin, and synovium are described, with a review of the literature. The patients, four men and two women, were between the ages of 22 and 65 years. The pulmonary changes were characterized by fibrosis and numerous macrophages with abundant deposition of a black pigment. Adjacent areas of bronchopneumonia were also observed. In the skin a severe necrotizing lesion involving the subcutaneous tissue with extension to the muscle was observed in one case and a nonspecific inflammatory response was observed in another; both cases showed abundant black pigment deposition. Electron microscopy and energy dispersive x-ray analysis demonstrated the presence of large quantities of titanium in the pigment granules. There may be a combination of black pigment deposition and fibrosis, necrosis, or a xanthomatous or granulomatous reaction, that, together with negative results on special staining and culture studies for organisms, should raise the suspicion of titanium-associated injury and prompt the study of the affected tissues by x-ray analysis for positive identification

  16. Association between increased epicardial adipose tissue volume and coronary plaque composition.

    Science.gov (United States)

    Yamashita, Kennosuke; Yamamoto, Myong Hwa; Ebara, Seitarou; Okabe, Toshitaka; Saito, Shigeo; Hoshimoto, Koichi; Yakushiji, Tadayuki; Isomura, Naoei; Araki, Hiroshi; Obara, Chiaki; Ochiai, Masahiko

    2014-09-01

    To assess the relationship between epicardial adipose tissue volume (EATV) and plaque vulnerability in significant coronary stenosis using a 40-MHz intravascular ultrasound (IVUS) imaging system (iMap-IVUS), we analyzed 130 consecutive patients with coronary stenosis who underwent dual-source computed tomography (CT) and cardiac catheterization. Culprit lesions were imaged by iMap-IVUS before stenting. The iMap-IVUS system classified coronary plaque components as fibrous, lipid, necrotic, or calcified tissue, based on the radiofrequency spectrum. Epicardial adipose tissue was measured as the tissue ranging from -190 to -30 Hounsfield units. EATV, calculated as the sum of the fat areas on short-axis images, was 85.0 ± 34.0 cm(3). There was a positive correlation between EATV and the percentage of necrotic plaque tissue (R(2) = 0.34) and a correlation between EATV and the percentage of fibrous tissue (R(2) = 0.24), and EATV (β = 0.14, P = 0.02) was independently associated with the percentage of necrotic plaque tissue. An increase in EATV was associated with the development of coronary atherosclerosis and, potentially, with the most dangerous type of plaque.

  17. Three-dimensional micro-scale strain mapping in living biological soft tissues.

    Science.gov (United States)

    Moo, Eng Kuan; Sibole, Scott C; Han, Sang Kuy; Herzog, Walter

    2018-04-01

    Non-invasive characterization of the mechanical micro-environment surrounding cells in biological tissues at multiple length scales is important for the understanding of the role of mechanics in regulating the biosynthesis and phenotype of cells. However, there is a lack of imaging methods that allow for characterization of the cell micro-environment in three-dimensional (3D) space. The aims of this study were (i) to develop a multi-photon laser microscopy protocol capable of imprinting 3D grid lines onto living tissue at a high spatial resolution, and (ii) to develop image processing software capable of analyzing the resulting microscopic images and performing high resolution 3D strain analyses. Using articular cartilage as the biological tissue of interest, we present a novel two-photon excitation imaging technique for measuring the internal 3D kinematics in intact cartilage at sub-micrometer resolution, spanning length scales from the tissue to the cell level. Using custom image processing software, we provide accurate and robust 3D micro-strain analysis that allows for detailed qualitative and quantitative assessment of the 3D tissue kinematics. This novel technique preserves tissue structural integrity post-scanning, therefore allowing for multiple strain measurements at different time points in the same specimen. The proposed technique is versatile and opens doors for experimental and theoretical investigations on the relationship between tissue deformation and cell biosynthesis. Studies of this nature may enhance our understanding of the mechanisms underlying cell mechano-transduction, and thus, adaptation and degeneration of soft connective tissues. We presented a novel two-photon excitation imaging technique for measuring the internal 3D kinematics in intact cartilage at sub-micrometer resolution, spanning from tissue length scale to cellular length scale. Using custom image processing software (lsmgridtrack), we provide accurate and robust micro-strain analysis.

  18. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...

  19. Mapping probabilities of extreme continental water storage changes from space gravimetry

    OpenAIRE

    Kusche , Jürgen; Eicker , Annette; Forootan , Ehsan; Springer , Anne; Longuevergne , Laurent

    2016-01-01

    International audience; Using data from the Gravity Recovery And Climate Experiment (GRACE) mission, we derive statistically robust " hot spot " regions of high probability of peak anomalous—i.e., with respect to the seasonal cycle—water storage (of up to 0.7 m one-in-five-year return level) and flux (up to 0.14 m/month). Analysis of, and comparison with, up to 32 years of ERA-Interim reanalysis fields reveals generally good agreement of these hot spot regions to GRACE results and that most e...

  20. Real-time optoacoustic monitoring of temperature in tissues

    International Nuclear Information System (INIS)

    Larina, Irina V; Larin, Kirill V; Esenaliev, Rinat O

    2005-01-01

    To improve the safety and efficacy of thermal therapy, it is necessary to map tissue temperature in real time with submillimetre spatial resolution. Accurate temperature maps may provide the necessary control of the boundaries of the heated regions and minimize thermal damage to surrounding normal tissues. Current imaging modalities fail to monitor tissue temperature in real time with high resolution and accuracy. We investigated a non-invasive optoacoustic method for accurate, real-time monitoring of tissue temperature during thermotherapy. In this study, we induced temperature gradients in tissue and tissue-like samples and monitored the temperature distribution using the optoacoustic technique. The fundamental harmonic of a Q-switched Nd:YAG laser (λ = 1064 nm) was used for optoacoustic wave generation and probing of tissue temperature. The tissue temperature was also monitored with a multi-sensor temperature probe inserted in the samples. Good agreement between optoacoustically measured and actual tissue temperatures was obtained. The accuracy of temperature monitoring was better than 1 °C, while the spatial resolution was about 1 mm. These data suggest that the optoacoustic technique has the potential to be used for non-invasive, real-time temperature monitoring during thermotherapy.

  1. Level set segmentation of medical images based on local region statistics and maximum a posteriori probability.

    Science.gov (United States)

    Cui, Wenchao; Wang, Yi; Lei, Tao; Fan, Yangyu; Feng, Yan

    2013-01-01

    This paper presents a variational level set method for simultaneous segmentation and bias field estimation of medical images with intensity inhomogeneity. In our model, the statistics of image intensities belonging to each different tissue in local regions are characterized by Gaussian distributions with different means and variances. According to maximum a posteriori probability (MAP) and Bayes' rule, we first derive a local objective function for image intensities in a neighborhood around each pixel. Then this local objective function is integrated with respect to the neighborhood center over the entire image domain to give a global criterion. In level set framework, this global criterion defines an energy in terms of the level set functions that represent a partition of the image domain and a bias field that accounts for the intensity inhomogeneity of the image. Therefore, image segmentation and bias field estimation are simultaneously achieved via a level set evolution process. Experimental results for synthetic and real images show desirable performances of our method.
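    The per-pixel decision step in such a MAP framework reduces to Bayes' rule over Gaussian class models: pick the tissue maximizing prior times likelihood. A minimal sketch with made-up local-region statistics (the means, variances, and priors below are hypothetical, and the full method additionally estimates a bias field, which is omitted here):

```python
import math

def gaussian_log_posterior(x: float, mean: float, var: float, prior: float) -> float:
    """Log of prior * N(x; mean, var), dropping the shared 1/sqrt(2*pi) term."""
    return math.log(prior) - 0.5 * math.log(var) - (x - mean) ** 2 / (2.0 * var)

def map_label(intensity: float, tissues: dict) -> str:
    """Assign the tissue class maximizing the posterior probability (MAP rule),
    each class modeled as a Gaussian with its own mean and variance."""
    return max(tissues, key=lambda t: gaussian_log_posterior(intensity, *tissues[t]))

# Hypothetical local-region statistics: (mean, variance, prior) per tissue.
tissues = {"csf": (30.0, 100.0, 0.2), "gm": (80.0, 64.0, 0.5), "wm": (120.0, 64.0, 0.3)}
print(map_label(85.0, tissues))  # gm
```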

  2. Rational Variety Mapping for Contrast-Enhanced Nonlinear Unsupervised Segmentation of Multispectral Images of Unstained Specimen

    Science.gov (United States)

    Kopriva, Ivica; Hadžija, Mirko; Popović Hadžija, Marijana; Korolija, Marina; Cichocki, Andrzej

    2011-01-01

    A methodology is proposed for nonlinear contrast-enhanced unsupervised segmentation of multispectral (color) microscopy images of principally unstained specimens. The methodology exploits spectral diversity and spatial sparseness to find anatomical differences between materials (cells, nuclei, and background) present in the image. It consists of rth-order rational variety mapping (RVM) followed by matrix/tensor factorization. Sparseness constraint implies duality between nonlinear unsupervised segmentation and multiclass pattern assignment problems. Classes not linearly separable in the original input space become separable with high probability in the higher-dimensional mapped space. Hence, RVM mapping has two advantages: it takes implicitly into account nonlinearities present in the image (ie, they are not required to be known) and it increases spectral diversity (ie, contrast) between materials, due to increased dimensionality of the mapped space. This is expected to improve performance of systems for automated classification and analysis of microscopic histopathological images. The methodology was validated using RVM of the second and third orders of the experimental multispectral microscopy images of unstained sciatic nerve fibers (nervus ischiadicus) and of unstained white pulp in the spleen tissue, compared with a manually defined ground truth labeled by two trained pathophysiologists. The methodology can also be useful for additional contrast enhancement of images of stained specimens. PMID:21708116
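    The dimensionality-lifting step can be illustrated with a plain polynomial (Veronese-type) monomial expansion, a simplified stand-in for the paper's rth-order rational variety mapping: a pixel's spectral vector is mapped to all monomials up to the chosen order, so classes not linearly separable in the input space may become separable in the lifted space. Function name and details are illustrative assumptions:

```python
from itertools import combinations_with_replacement

def rational_variety_map(pixel, order: int = 2):
    """Monomial expansion of a multispectral pixel up to the given order --
    a simplified stand-in for rth-order rational variety mapping that lifts
    the spectral vector into a higher-dimensional feature space."""
    features = list(pixel)
    for r in range(2, order + 1):
        for idx in combinations_with_replacement(range(len(pixel)), r):
            m = 1.0
            for i in idx:
                m *= pixel[i]
            features.append(m)
    return features

# An RGB pixel (3 channels) maps to 3 + 6 = 9 second-order features.
print(len(rational_variety_map([0.2, 0.5, 0.8])))  # 9
```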

  3. Mechanisms for the inversion of chirality: Global reaction route mapping of stereochemical pathways in a probable chiral extraterrestrial molecule, 2-aminopropionitrile

    International Nuclear Information System (INIS)

    Kaur, Ramanpreet; Vikas

    2015-01-01

    2-Aminopropionitrile (APN), a probable candidate as a chiral astrophysical molecule, is a precursor to amino-acid alanine. Stereochemical pathways in 2-APN are explored using Global Reaction Route Mapping (GRRM) method employing high-level quantum-mechanical computations. Besides predicting the conventional mechanism for chiral inversion that proceeds through an achiral intermediate, a counterintuitive flipping mechanism is revealed for 2-APN through chiral intermediates explored using the GRRM. The feasibility of the proposed stereochemical pathways, in terms of the Gibbs free-energy change, is analyzed at the temperature conditions akin to the interstellar medium. Notably, the stereoinversion in 2-APN is observed to be more feasible than the dissociation of 2-APN and intermediates involved along the stereochemical pathways, and the flipping barrier is observed to be as low as 3.68 kJ/mol along one of the pathways. The pathways proposed for the inversion of chirality in 2-APN may provide significant insight into the extraterrestrial origin of life

  4. Maps on statistical manifolds exactly reduced from the Perron-Frobenius equations for solvable chaotic maps

    Science.gov (United States)

    Goto, Shin-itiro; Umeno, Ken

    2018-03-01

    Maps on a parameter space for expressing distribution functions are exactly derived from the Perron-Frobenius equations for a generalized Boole transform family. Here the generalized Boole transform family is a one-parameter family of maps, where it is defined on a subset of the real line and its probability distribution function is the Cauchy distribution with some parameters. With this reduction, some relations between the statistical picture and the orbital one are shown. From the viewpoint of information geometry, the parameter space can be identified with a statistical manifold, and then it is shown that the derived maps can be characterized. Also, with an induced symplectic structure from a statistical structure, symplectic and information geometric aspects of the derived maps are discussed.
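    The statistical picture described above can be checked numerically for one member of the family. The sketch below assumes the alpha = 1/2 generalized Boole transform, x -> (x - 1/x)/2, whose invariant probability density is the standard Cauchy distribution; it verifies that the fraction of mapped samples with |x| < 1 stays near the Cauchy value P(|X| < 1) = 1/2:

```python
import math
import random

def generalized_boole(x: float) -> float:
    """The alpha = 1/2 member of the generalized Boole transform family,
    x -> (x - 1/x) / 2, which preserves the standard Cauchy distribution."""
    return 0.5 * (x - 1.0 / x)

# Draw standard-Cauchy samples via the inverse CDF, push them through the map,
# and compare an empirical probability against the invariant-law prediction.
rng = random.Random(0)
samples = [math.tan(math.pi * (rng.random() - 0.5)) for _ in range(200_000)]
mapped = [generalized_boole(x) for x in samples if x != 0.0]
frac = sum(1 for x in mapped if abs(x) < 1.0) / len(mapped)
print(abs(frac - 0.5) < 0.01)  # True: the Cauchy law is preserved
```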

  5. Multivariate Normal Tissue Complication Probability Modeling of Heart Valve Dysfunction in Hodgkin Lymphoma Survivors

    International Nuclear Information System (INIS)

    Cella, Laura; Liuzzi, Raffaele; Conson, Manuel; D’Avino, Vittoria; Salvatore, Marco; Pacelli, Roberto

    2013-01-01

    Purpose: To establish a multivariate normal tissue complication probability (NTCP) model for radiation-induced asymptomatic heart valvular defects (RVD). Methods and Materials: Fifty-six patients treated with sequential chemoradiation therapy for Hodgkin lymphoma (HL) were retrospectively reviewed for RVD events. Clinical information along with whole heart, cardiac chambers, and lung dose distribution parameters was collected, and the correlations to RVD were analyzed by means of Spearman's rank correlation coefficient (Rs). For the selection of the model order and parameters for NTCP modeling, a multivariate logistic regression method using resampling techniques (bootstrapping) was applied. Model performance was evaluated using the area under the receiver operating characteristic curve (AUC). Results: When we analyzed the whole heart, a 3-variable NTCP model including the maximum dose, whole heart volume, and lung volume was shown to be the optimal predictive model for RVD (Rs = 0.573, P<.001, AUC = 0.83). When we analyzed the cardiac chambers individually, for the left atrium and for the left ventricle, an NTCP model based on 3 variables including the percentage volume exceeding 30 Gy (V30), cardiac chamber volume, and lung volume was selected as the most predictive model (Rs = 0.539, P<.001, AUC = 0.83; and Rs = 0.557, P<.001, AUC = 0.82, respectively). The NTCP values increase as heart maximum dose or cardiac chambers V30 increase. They also increase with larger volumes of the heart or cardiac chambers and decrease when lung volume is larger. Conclusions: We propose logistic NTCP models for RVD considering not only heart irradiation dose but also the combined effects of lung and heart volumes. Our study establishes the statistical evidence of the indirect effect of lung size on radio-induced heart toxicity
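    The 3-variable logistic NTCP model described above has the standard form NTCP = 1 / (1 + exp(-(b0 + sum b_i x_i))). A minimal sketch follows; the coefficient values are illustrative placeholders (the paper's fitted values are not reproduced in the abstract), chosen only so that NTCP rises with heart dose and heart volume and falls with lung volume, as reported:

```python
import math

def ntcp_logistic(features, coefficients, intercept):
    """Multivariate logistic NTCP:
    NTCP = 1 / (1 + exp(-(b0 + sum(b_i * x_i))))."""
    z = intercept + sum(b * x for b, x in zip(coefficients, features))
    return 1.0 / (1.0 + math.exp(-z))

# Illustrative inputs: max heart dose (Gy), heart volume (cm^3), lung volume (cm^3).
ntcp = ntcp_logistic([40.0, 600.0, 3000.0], [0.08, 0.004, -0.0006], -4.0)
print(round(ntcp, 2))  # 0.45
```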

  6. Hash function based on piecewise nonlinear chaotic map

    International Nuclear Information System (INIS)

    Akhavan, A.; Samsudin, A.; Akhshani, A.

    2009-01-01

    Chaos-based cryptography appeared recently in the early 1990s as an original application of nonlinear dynamics in the chaotic regime. In this paper, an algorithm for one-way hash function construction based on a piecewise nonlinear chaotic map with a variant probability parameter is proposed. The proposed algorithm is also an attempt to present a new chaotic hash function based on multithreaded programming. In this chaotic scheme, the message is connected to the chaotic map using the probability parameter and other parameters of the chaotic map, such as the control parameter and initial condition, so that the generated hash value is highly sensitive to the message. Simulation results indicate that the proposed algorithm presents several interesting features, such as high flexibility, good statistical properties, high key sensitivity and message sensitivity. These properties make the scheme a suitable choice for practical applications.
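    The core idea (coupling message bytes into a chaotic map's parameters so the final trajectory is message-sensitive) can be sketched with a toy. This is emphatically not the paper's algorithm and not cryptographically secure; the logistic map, the parameter coupling, and the digest folding are all illustrative assumptions:

```python
def chaotic_toy_hash(message: bytes, rounds: int = 16) -> str:
    """Toy sketch only -- not the paper's algorithm and not secure: each
    message byte perturbs the control parameter of a logistic map (kept in
    its mostly-chaotic regime, r in [3.57, 3.99)), and the trajectory is
    folded into a 64-bit hex digest."""
    x, r = 0.5, 3.99  # initial condition and control parameter
    digest = 0
    for byte in message:
        # couple the message byte into the control parameter
        r = 3.57 + 0.42 * (((r - 3.57) / 0.42 + byte / 255.0) % 1.0)
        for _ in range(rounds):
            x = r * x * (1.0 - x)  # logistic map iteration
            digest = (digest * 1000003 + int(x * 1e15)) & 0xFFFFFFFFFFFFFFFF
    return f"{digest:016x}"

print(chaotic_toy_hash(b"hello") != chaotic_toy_hash(b"hellp"))  # sensitivity: True
```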

  7. A methodology for the transfer of probabilities between accident severity categories

    International Nuclear Information System (INIS)

    Whitlow, J.D.; Neuhauser, K.S.

    1993-01-01

    This paper describes a methodology developed to allow accident probabilities associated with one severity category scheme to be transferred to another severity category scheme, permitting some comparisons of different studies at the category level. In this methodology, the severity category schemes to be compared are mapped onto a common set of axes. The axes represent critical accident environments (e.g., impact, thermal, crush, puncture) and indicate the range of accident parameters from zero (no accident) to the most severe credible forces. The choice of critical accident environments for the axes depends on the package being transported and the mode of transportation. The accident probabilities associated with one scheme are then transferred to the other scheme. This transfer of category probabilities is based on the relationships of the critical accident parameters to probability of occurrence. The methodology can be employed to transfer any quantity between category schemes if the appropriate supporting information is available. (J.P.N.)
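    A minimal sketch of the transfer step, assuming a single common axis and probability mass uniform within each source category (the paper maps onto several accident-environment axes; the one-axis case shows the mechanics):

```python
def transfer_probabilities(cats_a, probs_a, cats_b):
    """Redistribute category probabilities from scheme A to scheme B.

    Each category is an interval (lo, hi) on a common accident-parameter axis
    (e.g., impact speed).  Probability mass is assumed uniform within each
    source category and is transferred in proportion to interval overlap.
    """
    probs_b = [0.0] * len(cats_b)
    for (a_lo, a_hi), p in zip(cats_a, probs_a):
        width = a_hi - a_lo
        for j, (b_lo, b_hi) in enumerate(cats_b):
            overlap = max(0.0, min(a_hi, b_hi) - max(a_lo, b_lo))
            probs_b[j] += p * overlap / width
    return probs_b

# Scheme A: two impact-speed categories; scheme B: three finer categories.
probs_b = transfer_probabilities(
    cats_a=[(0, 50), (50, 100)], probs_a=[0.7, 0.3],
    cats_b=[(0, 25), (25, 75), (75, 100)])
assert abs(sum(probs_b) - 1.0) < 1e-12  # total probability is conserved
```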

  8. A fast algorithm for estimating transmission probabilities in QTL detection designs with dense maps

    Directory of Open Access Journals (Sweden)

    Gilbert Hélène

    2009-11-01

    Background: In the case of an autosomal locus, four transmission events from the parents to progeny are possible, specified by the grand-parental origin of the alleles inherited by this individual. Computing the probabilities of these transmission events is essential to perform QTL detection methods. Results: A fast algorithm for the estimation of these probabilities conditional on parental phases has been developed. It is adapted to classical QTL detection designs applied to outbred populations, in particular to designs composed of half- and/or full-sib families. It assumes the absence of interference. Conclusion: The theory is fully developed and an example is given.

  9. Normal mammogram detection based on local probability difference transforms and support vector machines

    International Nuclear Information System (INIS)

    Chiracharit, W.; Kumhom, P.; Chamnongthai, K.; Sun, Y.; Delp, E.J.; Babbs, C.F

    2007-01-01

    Automatic detection of normal mammograms, as a ''first look'' for breast cancer, is a new approach to computer-aided diagnosis. This approach may be limited, however, by two main causes. The first problem is the presence of poorly separable ''crossed-distributions'' in which the correct classification depends upon the value of each feature. The second problem is overlap of the feature distributions that are extracted from digitized mammograms of normal and abnormal patients. Here we introduce a new Support Vector Machine (SVM) based method utilizing the proposed uncrossing mapping and Local Probability Difference (LPD). Crossed-distribution feature pairs are identified and mapped into new features that can be separated by a zero-hyperplane of the new axis. The probability density functions of the features of normal and abnormal mammograms are then sampled and the local probability difference functions are estimated to enhance the features. From 1,000 ground-truth-known mammograms, 250 normal and 250 abnormal cases, including spiculated lesions, circumscribed masses or microcalcifications, are used for training a support vector machine. The classification results tested with another 250 normal and 250 abnormal sets show improved testing performances with 90% sensitivity and 89% specificity. (author)
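    The local probability difference idea can be illustrated with a simple histogram estimate. The bin count, value range, and data below are invented for illustration; the full method additionally involves the uncrossing mapping and an SVM, which this sketch omits.

```python
def local_probability_difference(normal_vals, abnormal_vals, n_bins=10, lo=0.0, hi=1.0):
    """Histogram estimate of LPD(x) = p_normal(x) - p_abnormal(x), per bin."""
    def hist(vals):
        h = [0.0] * n_bins
        for v in vals:
            i = min(n_bins - 1, max(0, int((v - lo) / (hi - lo) * n_bins)))
            h[i] += 1.0 / len(vals)
        return h
    h_normal, h_abnormal = hist(normal_vals), hist(abnormal_vals)
    return [n - a for n, a in zip(h_normal, h_abnormal)]

# A feature concentrated low in normals and high in abnormals yields an LPD
# that is positive in the low bins and negative in the high bins.
lpd = local_probability_difference([0.1, 0.15, 0.2], [0.8, 0.85, 0.9])
assert lpd[1] > 0.0 > lpd[8]
```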

  10. Point-of-care instrument for monitoring tissue health during skin graft repair

    Science.gov (United States)

    Gurjar, R. S.; Seetamraju, M.; Zhang, J.; Feinberg, S. E.; Wolf, D. E.

    2011-06-01

    We have developed the necessary theoretical framework and the basic instrumental design parameters to enable mapping of subsurface blood dynamics and tissue oxygenation for patients undergoing skin graft procedures. This analysis forms the basis for developing a simple patch geometry, which can be used to map by diffuse optical techniques blood flow velocity and tissue oxygenation as a function of depth in subsurface tissue. Keywords: skin graft, diffuse correlation analysis, oxygen saturation.

  11. Developing a probability-based model of aquifer vulnerability in an agricultural region

    Science.gov (United States)

    Chen, Shih-Kai; Jang, Cheng-Shin; Peng, Yi-Huei

    2013-04-01

    Hydrogeological settings of aquifers strongly influence regional groundwater movement and pollution processes. Establishing a map of aquifer vulnerability is critical for planning a scheme of groundwater quality protection. This study developed a novel probability-based DRASTIC model of aquifer vulnerability in the Choushui River alluvial fan, Taiwan, using indicator kriging to determine various risk categories of contamination potentials based on estimated vulnerability indexes. Categories and ratings of six parameters in the probability-based DRASTIC model were probabilistically characterized according to the parameter classification methods of selecting a maximum estimation probability and calculating an expected value. Moreover, the probability-based estimation and assessment gave an excellent insight into propagating the uncertainty of parameters due to limited observation data. To examine the prediction capacity of the developed probability-based DRASTIC model for pollutants, medium, high, and very high risk categories of contamination potentials were compared with observed nitrate-N exceeding 0.5 mg/L, indicating anthropogenic groundwater pollution. The analyzed results reveal that the developed probability-based DRASTIC model is capable of predicting high nitrate-N groundwater pollution and characterizing the parameter uncertainty via the probability estimation processes.
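    The expected-value variant of the parameter characterization can be sketched as below. The parameter count, rating classes, probabilities, and weights are illustrative only; the study uses six probabilistically characterized DRASTIC parameters with its own ratings.

```python
def expected_rating(rating_probs):
    """Expected rating of one DRASTIC parameter, given indicator-kriging
    probabilities for each of its rating classes."""
    return sum(rating * prob for rating, prob in rating_probs)

def drastic_index(param_rating_probs, weights):
    """Probability-based vulnerability index: weighted sum of expected ratings."""
    return sum(w * expected_rating(rp)
               for rp, w in zip(param_rating_probs, weights))

# Two parameters only, for brevity (e.g., depth-to-water and net recharge).
depth = [(3, 0.2), (7, 0.8)]   # rating 3 with prob 0.2, rating 7 with prob 0.8
recharge = [(5, 1.0)]          # rating 5 known with certainty
index = drastic_index([depth, recharge], weights=[5, 3])
assert abs(index - 46.0) < 1e-9  # 5 * 6.2 + 3 * 5
```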

  12. Spectral analysis of noisy nonlinear maps

    International Nuclear Information System (INIS)

    Hirshman, S.P.; Whitson, J.C.

    1982-01-01

    A path integral equation formalism is developed to obtain the frequency spectrum of nonlinear mappings exhibiting chaotic behavior. The one-dimensional map, x_{n+1} = f(x_n), where f is nonlinear and n is a discrete time variable, is analyzed in detail. This map is introduced as a paradigm of systems whose exact behavior is exceedingly complex, and therefore irretrievable, but which nevertheless possess smooth, well-behaved solutions in the presence of small sources of external noise. A Boltzmann integral equation is derived for the probability distribution function p(x,n). This equation is linear and is therefore amenable to spectral analysis. The nonlinear dynamics in f(x) appear as transition probability matrix elements, and the presence of noise appears simply as an overall multiplicative scattering amplitude. This formalism is used to investigate the band structure of the logistic equation and to analyze the effects of external noise on both the invariant measure and the frequency spectrum of x_n for several values of λ ∈ [0,1].
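    The transition-probability-matrix picture can be mimicked numerically: discretize the state space, estimate the matrix of the noisy map by Monte Carlo, and power-iterate to approximate the invariant measure. The logistic map is taken here in its fully chaotic form x_{n+1} = 4x(1-x); the bin count and noise level are arbitrary illustrative choices.

```python
import random

def logistic(x):
    """Fully chaotic logistic map x_{n+1} = 4 x_n (1 - x_n)."""
    return 4.0 * x * (1.0 - x)

def transition_matrix(n_bins=50, noise=0.02, samples=200):
    """Monte Carlo estimate of P[i][j] = P(x_{n+1} in bin j | x_n in bin i)
    for the map perturbed by small additive Gaussian noise."""
    rng = random.Random(0)
    P = [[0.0] * n_bins for _ in range(n_bins)]
    for i in range(n_bins):
        for _ in range(samples):
            x = (i + rng.random()) / n_bins        # sample uniformly within bin i
            y = logistic(x) + rng.gauss(0.0, noise)
            j = min(n_bins - 1, max(0, int(y * n_bins)))  # clip noise overshoot
            P[i][j] += 1.0 / samples
    return P

def invariant_measure(P, iters=200):
    """Power-iterate p <- p P to approximate the invariant probability measure."""
    n = len(P)
    p = [1.0 / n] * n
    for _ in range(iters):
        p = [sum(p[i] * P[i][j] for i in range(n)) for j in range(n)]
    return p

mu = invariant_measure(transition_matrix())
assert abs(sum(mu) - 1.0) < 1e-6  # remains a probability distribution
```

    For the noiseless λ = 1 logistic map, the invariant density is 1/(π√(x(1-x))), so the estimated measure should concentrate near the endpoints of [0,1].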

  13. Impact of Chemotherapy on Normal Tissue Complication Probability Models of Acute Hematologic Toxicity in Patients Receiving Pelvic Intensity Modulated Radiation Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Bazan, Jose G.; Luxton, Gary; Kozak, Margaret M.; Anderson, Eric M.; Hancock, Steven L.; Kapp, Daniel S.; Kidd, Elizabeth A.; Koong, Albert C.; Chang, Daniel T., E-mail: dtchang@stanford.edu

    2013-12-01

    Purpose: To determine how chemotherapy agents affect radiation dose parameters that correlate with acute hematologic toxicity (HT) in patients treated with pelvic intensity modulated radiation therapy (P-IMRT) and concurrent chemotherapy. Methods and Materials: We assessed HT in 141 patients who received P-IMRT for anal, gynecologic, rectal, or prostate cancers, 95 of whom received concurrent chemotherapy. Patients were separated into 4 groups: mitomycin (MMC) + 5-fluorouracil (5FU, 37 of 141), platinum ± 5FU (Cis, 32 of 141), 5FU (26 of 141), and P-IMRT alone (46 of 141). The pelvic bone was contoured as a surrogate for pelvic bone marrow (PBM) and divided into subsites: ilium, lower pelvis, and lumbosacral spine (LSS). The volumes of each region receiving 5-40 Gy were calculated. The endpoint for HT was grade ≥3 (HT3+) leukopenia, neutropenia or thrombocytopenia. Normal tissue complication probability was calculated using the Lyman-Kutcher-Burman model. Logistic regression was used to analyze association between HT3+ and dosimetric parameters. Results: Twenty-six patients experienced HT3+: 10 of 37 (27%) MMC, 14 of 32 (44%) Cis, 2 of 26 (8%) 5FU, and 0 of 46 P-IMRT. PBM dosimetric parameters were correlated with HT3+ in the MMC group but not in the Cis group. LSS dosimetric parameters were well correlated with HT3+ in both the MMC and Cis groups. Constrained optimization (0tissue complication probability curve compared with treatment with Cis. Dose tolerance of PBM and the LSS subsite may be lower for
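    The Lyman-Kutcher-Burman calculation used above for the normal tissue complication probability can be sketched as follows. The DVH bins and the parameter values (TD50, m, n) are illustrative, not those fitted in the study.

```python
import math

def geud(dose_bins, vol_fracs, n):
    """Generalized EUD: (sum_i v_i * D_i^(1/n))^n, with v_i fractional volumes."""
    return sum(v * d ** (1.0 / n) for d, v in zip(dose_bins, vol_fracs)) ** n

def lkb_ntcp(dose_bins, vol_fracs, td50, m, n):
    """Lyman-Kutcher-Burman NTCP = Phi((gEUD - TD50) / (m * TD50))."""
    t = (geud(dose_bins, vol_fracs, n) - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# Illustrative differential DVH (Gy, fractional volume) and model parameters.
dvh_dose = [10.0, 20.0, 30.0, 40.0]
dvh_vol  = [0.4, 0.3, 0.2, 0.1]
p = lkb_ntcp(dvh_dose, dvh_vol, td50=30.0, m=0.2, n=0.5)
assert 0.0 < p < 1.0
```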

  14. Axon diameter mapping in crossing fibers with diffusion MRI

    DEFF Research Database (Denmark)

    Zhang, Hui; Dyrby, Tim B; Alexander, Daniel C

    2011-01-01

    This paper proposes a technique for a previously unaddressed problem, namely, mapping axon diameter in crossing fiber regions, using diffusion MRI. Direct measurement of tissue microstructure of this kind using diffusion MRI offers a new class of biomarkers that give more specific information about...... tissue than measures derived from diffusion tensor imaging. Most existing techniques for axon diameter mapping assume a single axon orientation in the tissue model, which limits their application to only the most coherently oriented brain white matter, such as the corpus callosum, where the single...... model to enable axon diameter mapping in voxels with crossing fibers. We show in simulation that the technique can provide robust axon diameter estimates in a two-fiber crossing with the crossing angle as small as 45 degrees. Using ex vivo imaging data, we further demonstrate the feasibility...

  15. User’s guide for MapMark4GUI—A graphical user interface for the MapMark4 R package

    Science.gov (United States)

    Shapiro, Jason

    2018-05-29

    MapMark4GUI is an R graphical user interface (GUI) developed by the U.S. Geological Survey to support user implementation of the MapMark4 R statistical software package. MapMark4 was developed by the U.S. Geological Survey to implement probability calculations for simulating undiscovered mineral resources in quantitative mineral resource assessments. The GUI provides an easy-to-use tool to input data, run simulations, and format output results for the MapMark4 package. The GUI is written and accessed in the R statistical programming language. This user’s guide includes instructions on installing and running MapMark4GUI and descriptions of the statistical output processes, output files, and test data files.

  16. Contributions to quantum probability

    International Nuclear Information System (INIS)

    Fritz, Tobias

    2010-01-01

    distribution of a quantum-mechanical von Neumann measurement with postselection, given that the scalar product between the initial and the final state is known as well as the success probability of the postselection. An intermediate von Neumann measurement can enhance transition probabilities between states such that the error probability shrinks by a factor of up to 2. Chapter 4: A presentation of the category of stochastic matrices. This chapter gives generators and relations for the strict monoidal category of probabilistic maps on finite cardinals (i.e., stochastic matrices). Chapter 5: Convex Spaces: Definition and Examples. We try to promote convex spaces as an abstract concept of convexity which was introduced by Stone as ''barycentric calculus''. A convex space is a set where one can take convex combinations in a consistent way. By identifying the corresponding Lawvere theory as the category from chapter 4 and using the results obtained there, we give a different proof of a result of Swirszcz which shows that convex spaces can be identified with algebras of a finitary version of the Giry monad. After giving an extensive list of examples of convex sets as they appear throughout mathematics and theoretical physics, we note that there also exist convex spaces that cannot be embedded into a vector space: semilattices are a class of examples of purely combinatorial type. In an information-theoretic interpretation, convex subsets of vector spaces are probabilistic, while semilattices are possibilistic. Convex spaces unify these two concepts. (orig.)

  17. Probability

    CERN Document Server

    Shiryaev, A N

    1996-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, ergodic theory, weak convergence of probability measures, stationary stochastic processes, and the Kalman-Bucy filter. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for self-study. This new edition contains substantial revisions and updated references. The reader will find a deeper study of topics such as the distance between probability measures, metrization of weak convergence, and contiguity of probability measures. Proofs for a number of important results which were merely stated in the first edition have been added. The author has included new material on the probability of large deviations, and on the central limit theorem for sums of dependent random variables.

  18. Probabilistic mapping of flood-induced backscatter changes in SAR time series

    Science.gov (United States)

    Schlaffer, Stefan; Chini, Marco; Giustarini, Laura; Matgen, Patrick

    2017-04-01

    The information content of flood extent maps can be increased considerably by including information on the uncertainty of the flood area delineation. This additional information can be of benefit in flood forecasting and monitoring. Furthermore, flood probability maps can be converted to binary maps showing flooded and non-flooded areas by applying a threshold probability value pF = 0.5. In this study, a probabilistic change detection approach for flood mapping based on synthetic aperture radar (SAR) time series is proposed. For this purpose, conditional probability density functions (PDFs) for land and open water surfaces were estimated from ENVISAT ASAR Wide Swath (WS) time series containing >600 images using a reference mask of permanent water bodies. A pixel-wise harmonic model was used to account for seasonality in backscatter from land areas caused by soil moisture and vegetation dynamics. The approach was evaluated for a large-scale flood event along the River Severn, United Kingdom. The retrieved flood probability maps were compared to a reference flood mask derived from high-resolution aerial imagery by means of reliability diagrams. The obtained performance measures indicate both high reliability and confidence although there was a slight under-estimation of the flood extent, which may in part be attributed to topographically induced radar shadows along the edges of the floodplain. Furthermore, the results highlight the importance of local incidence angle for the separability between flooded and non-flooded areas as specular reflection properties of open water surfaces increase with a more oblique viewing geometry.
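    The pixel-wise probability computation from the two class-conditional PDFs follows Bayes' rule, and the pF = 0.5 threshold mentioned above yields the binary map. Gaussian PDFs and the dB values below are simplifications for illustration; the paper estimates the PDFs empirically from the time series and models land-backscatter seasonality harmonically.

```python
import math

def gauss_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def flood_probability(sigma0_db, land=(-8.0, 2.0), water=(-16.0, 2.5), prior=0.5):
    """Pixel-wise P(flood | backscatter) via Bayes' rule from the class-conditional
    PDFs of SAR backscatter (dB) over land and open water."""
    mu_l, s_l = land
    mu_w, s_w = water
    p_w = gauss_pdf(sigma0_db, mu_w, s_w) * prior
    p_l = gauss_pdf(sigma0_db, mu_l, s_l) * (1.0 - prior)
    return p_w / (p_w + p_l)

# Low (water-like) backscatter maps to pF > 0.5, land-like backscatter to pF < 0.5.
assert flood_probability(-16.0) > 0.5 > flood_probability(-8.0)
```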

  19. Probability of Elevated Nitrate Concentrations in Groundwater in the Eagle River Watershed Valley-Fill Aquifer, Eagle County, North-Central Colorado, 2006-2007

    Science.gov (United States)

    Rupert, Michael G.; Plummer, Niel

    2009-01-01

    This raster data set delineates the predicted probability of elevated nitrate concentrations in groundwater in the Eagle River watershed valley-fill aquifer, Eagle County, North-Central Colorado, 2006-2007. This data set was developed by a cooperative project between the U.S. Geological Survey, Eagle County, the Eagle River Water and Sanitation District, the Town of Eagle, the Town of Gypsum, and the Upper Eagle Regional Water Authority. This project was designed to evaluate potential land-development effects on groundwater and surface-water resources so that informed land-use and water management decisions can be made. This groundwater probability map and its associated probability maps were developed as follows: (1) A point data set of wells with groundwater quality and groundwater age data was overlaid with thematic layers of anthropogenic (related to human activities) and hydrogeologic data by using a geographic information system to assign each well values for depth to groundwater, distance to major streams and canals, distance to gypsum beds, precipitation, soils, and well depth. These data were then downloaded to a statistical software package for analysis by logistic regression. (2) Statistical models predicting the probability of elevated nitrate concentrations, the probability of unmixed young water (using chlorofluorocarbon-11 concentrations and tritium activities), and the probability of elevated volatile organic compound concentrations were developed using logistic regression techniques. (3) The statistical models were entered into a GIS and the probability map was constructed.
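    Step 3, applying a fitted logistic model across the raster, can be sketched as below. The predictors kept and the coefficients are invented for illustration; they are not the study's fitted model.

```python
import math

def p_elevated_nitrate(depth_to_gw_m, dist_to_stream_m, well_depth_m, coeffs):
    """Logistic-regression probability of elevated nitrate for one raster cell."""
    b0, b1, b2, b3 = coeffs
    z = b0 + b1 * depth_to_gw_m + b2 * dist_to_stream_m + b3 * well_depth_m
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical coefficients and a tiny 1x2 grid of (depth, distance, well depth).
coeffs = (1.0, -0.05, -0.002, -0.01)
grid = [[(5.0, 100.0, 30.0), (20.0, 50.0, 60.0)]]
prob_map = [[p_elevated_nitrate(*cell, coeffs) for cell in row] for row in grid]
assert all(0.0 < p < 1.0 for row in prob_map for p in row)
```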

  20. Real-Time Vision-Based Stiffness Mapping †.

    Science.gov (United States)

    Faragasso, Angela; Bimbo, João; Stilli, Agostino; Wurdemann, Helge Arne; Althoefer, Kaspar; Asama, Hajime

    2018-04-26

    This paper presents new findings concerning a hand-held stiffness probe for the medical diagnosis of abnormalities during palpation of soft tissue. Palpation is recognized by the medical community as an essential and low-cost method to detect and diagnose disease in soft tissue. However, differences are often subtle and clinicians need to train for many years before they can conduct a reliable diagnosis. The probe presented here fills this gap, providing a means to easily obtain stiffness values of soft tissue during a palpation procedure. Our stiffness sensor is equipped with a multi-degree-of-freedom (DoF) Aurora magnetic tracker, allowing us to track and record the 3D position of the probe whilst examining a tissue area, and generate a 3D stiffness map in real-time. The stiffness probe was integrated in a robotic arm and tested in an artificial environment representing a good model of soft tissue organs; the results show that the sensor can accurately measure and map the stiffness of a silicone phantom embedded with areas of varying stiffness.
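    One simple way to turn tracked readings into a stiffness map is to bin the tracked probe positions (reduced here to 2D for brevity) and average the stiffness readings per cell. The cell size and readings are arbitrary illustrative values, not the paper's pipeline.

```python
def stiffness_map_2d(samples, cell_size=5.0):
    """Aggregate tracked (x, y, stiffness) probe readings into a grid of mean
    stiffness values, keyed by integer cell coordinates."""
    acc = {}
    for x, y, k in samples:
        key = (int(x // cell_size), int(y // cell_size))
        total, count = acc.get(key, (0.0, 0))
        acc[key] = (total + k, count + 1)
    return {key: total / count for key, (total, count) in acc.items()}

# Two readings fall in cell (0, 0) and are averaged; one falls in cell (2, 0).
readings = [(1.0, 1.0, 2.0), (2.0, 1.5, 2.5), (12.0, 3.0, 5.0)]
smap = stiffness_map_2d(readings)
assert smap[(0, 0)] == 2.25 and smap[(2, 0)] == 5.0
```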

  2. The role of micro-NRA and micro-PIXE in carbon mapping of organic tissues

    International Nuclear Information System (INIS)

    Niekraszewicz, L.A.B.; Souza, C.T. de; Stori, E.M.; Jobim, P.F.C.; Amaral, L.; Dias, J.F.

    2015-01-01

    This study reports the work developed in the Ion Implantation Laboratory (Porto Alegre, RS, Brazil) to implement the micro-NRA technique for the study of light elements in organic tissues. In particular, the work focused on nuclear reactions of protons and alphas with carbon. The (p,p) resonances at 0.475 and 1.734 MeV were investigated. The (α,α) resonance at 4.265 MeV was studied as well. The results indicate that the yields for the 0.475 and 1.734 MeV resonances are similar. Elemental maps of different structures obtained with the micro-NRA technique using the 1.734 MeV resonance were compared with those obtained with micro-PIXE employing a SDD detector equipped with an ultra-thin window. The results show that the use of micro-NRA for carbon at the 1.734 MeV resonance provides good results in some cases at the expense of longer beam times. On the other hand, micro-PIXE provides enhanced yields but is limited to surface analysis since soft X-rays are greatly attenuated by matter.

  3. High-severity fire: evaluating its key drivers and mapping its probability across western US forests

    Science.gov (United States)

    Parks, Sean A.; Holsinger, Lisa M.; Panunto, Matthew H.; Jolly, W. Matt; Dobrowski, Solomon Z.; Dillon, Gregory K.

    2018-04-01

    Wildland fire is a critical process in forests of the western United States (US). Variation in fire behavior, which is heavily influenced by fuel loading, terrain, weather, and vegetation type, leads to heterogeneity in fire severity across landscapes. The relative influence of these factors in driving fire severity, however, is poorly understood. Here, we explore the drivers of high-severity fire for forested ecoregions in the western US over the period 2002–2015. Fire severity was quantified using a satellite-inferred index of severity, the relativized burn ratio. For each ecoregion, we used boosted regression trees to model high-severity fire as a function of live fuel, topography, climate, and fire weather. We found that live fuel, on average, was the most important factor driving high-severity fire among ecoregions (average relative influence = 53.1%) and was the most important factor in 14 of 19 ecoregions. Fire weather was the second most important factor among ecoregions (average relative influence = 22.9%) and was the most important factor in five ecoregions. Climate (13.7%) and topography (10.3%) were less influential. We also predicted the probability of high-severity fire, were a fire to occur, using recent (2016) satellite imagery to characterize live fuel for a subset of ecoregions in which the model skill was deemed acceptable (n = 13). These ‘wall-to-wall’ gridded ecoregional maps provide relevant and up-to-date information for scientists and managers who are tasked with managing fuel and wildland fire. Lastly, we provide an example of the predicted likelihood of high-severity fire under moderate and extreme fire weather before and after fuel reduction treatments, thereby demonstrating how our framework and model predictions can potentially serve as a performance metric for land management agencies tasked with reducing hazardous fuel across large landscapes.

  4. Stabilizing unstable fixed points of chaotic maps via minimum entropy control

    Energy Technology Data Exchange (ETDEWEB)

    Salarieh, Hassan [Center of Excellence in Design, Robotics and Automation, Department of Mechanical Engineering, Sharif University of Technology, P.O. Box 11365-9567, Tehran (Iran, Islamic Republic of)], E-mail: salarieh@mech.sharif.edu; Alasty, Aria [Center of Excellence in Design, Robotics and Automation, Department of Mechanical Engineering, Sharif University of Technology, P.O. Box 11365-9567, Tehran (Iran, Islamic Republic of)

    2008-08-15

    In this paper the problem of chaos control in nonlinear maps using minimization of an entropy function is investigated. The invariant probability measure of a chaotic dynamics can be used to produce an entropy function in the sense of Shannon. It is shown how the entropy control technique is utilized for chaos elimination. Using only the measured states of a chaotic map, the probability measure of the system is numerically estimated and this estimated measure is used to obtain an estimation of the entropy of the chaotic map. The control variable of the chaotic system is determined in such a way that the entropy function descends until the chaotic trajectory of the map is replaced with a regular one. The proposed idea is applied to stabilizing the fixed points of the logistic and the Henon maps as case studies. Simulation results show the effectiveness of the method in chaos rejection when only statistical information is available from the under-study systems.
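    The entropy estimate at the heart of the controller can be sketched from a state histogram: a chaotic trajectory spreads over many bins (high entropy), while a stabilized fixed-point trajectory concentrates in one bin (entropy near zero). The logistic parameter 3.9 and the bin count are arbitrary illustrative choices; the control law itself is not reproduced here.

```python
import math

def logistic_map(x, lam=3.9):
    return lam * x * (1.0 - x)

def shannon_entropy(states, n_bins=20):
    """Estimate the Shannon entropy of a trajectory from a histogram of its
    states on [0, 1] -- the quantity the minimum-entropy controller minimizes."""
    counts = [0] * n_bins
    for x in states:
        counts[min(n_bins - 1, int(x * n_bins))] += 1
    n = len(states)
    return -sum(c / n * math.log(c / n) for c in counts if c > 0)

x, chaotic = 0.1, []
for _ in range(2000):
    x = logistic_map(x)
    chaotic.append(x)
stabilized = [0.7435] * 2000  # near the fixed point x* = 1 - 1/3.9
assert shannon_entropy(stabilized) < shannon_entropy(chaotic)
```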

  5. Source-receptor probability of atmospheric long-distance dispersal of viruses to Israel from the eastern Mediterranean area.

    Science.gov (United States)

    Klausner, Z; Klement, E; Fattal, E

    2018-02-01

    Viruses that affect the health of humans and farm animals can spread over long distances via atmospheric mechanisms. The phenomenon of atmospheric long-distance dispersal (LDD) is associated with severe consequences because it may introduce pathogens into new areas. The introduction of new pathogens to Israel was attributed to LDD events numerous times. This provided the motivation for this study, which aims to identify all the locations in the eastern Mediterranean that may serve as sources for pathogen incursion into Israel via LDD. This aim was achieved by calculating source-receptor relationship probability maps. These maps describe the probability that an infected vector or viral aerosol, once airborne, will have an atmospheric route that can transport it to a distant location. The resultant probability maps demonstrate a seasonal tendency in the probability of specific areas to serve as sources for pathogen LDD into Israel. Specifically, Cyprus' season is the summer; southern Turkey and the Greek islands of Crete, Karpathos and Rhodes are associated with spring and summer; lower Egypt and Jordan may serve as sources all year round, except the summer months. The method used in this study can easily be implemented for any other geographic region. The importance of this study is the ability to provide a climatologically valid and accurate risk assessment tool to support long-term decisions regarding preparatory actions for future outbreaks long before a specific outbreak occurs. © 2017 Blackwell Verlag GmbH.
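    A source-receptor probability of this kind amounts to a hit frequency over an ensemble of simulated atmospheric trajectories. The flat 2D points and hit radius below are purely illustrative; operational trajectory models work on geographic coordinates with meteorological reanalysis data.

```python
def source_receptor_probability(trajectories, receptor, radius):
    """Fraction of simulated trajectories from a source that pass within
    `radius` of the receptor location (a simple hit-frequency estimate)."""
    hits = 0
    for path in trajectories:
        if any((x - receptor[0]) ** 2 + (y - receptor[1]) ** 2 <= radius ** 2
               for x, y in path):
            hits += 1
    return hits / len(trajectories)

# Two toy trajectories: one reaches the receptor at (2, 2), one does not.
paths = [[(0, 0), (1, 1), (2, 2)], [(0, 0), (0, 1), (0, 2)]]
p = source_receptor_probability(paths, receptor=(2, 2), radius=0.5)
assert p == 0.5
```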

  6. Mapping photothermally induced gene expression in living cells and tissues by nanorod-locked nucleic acid complexes.

    Science.gov (United States)

    Riahi, Reza; Wang, Shue; Long, Min; Li, Na; Chiou, Pei-Yu; Zhang, Donna D; Wong, Pak Kin

    2014-04-22

    The photothermal effect of plasmonic nanostructures has numerous applications, such as cancer therapy, photonic gene circuits, large cargo delivery, and nanostructure-enhanced laser tweezers. The photothermal operation can also induce unwanted physical and biochemical effects, which potentially alter cell behaviors. However, there is a lack of techniques for characterizing the dynamic cell responses near the site of photothermal operation with high spatiotemporal resolution. In this work, we show that the incorporation of locked nucleic acid probes with gold nanorods allows photothermal manipulation and real-time monitoring of gene expression near the area of irradiation in living cells and animal tissues. The multimodal gold nanorod serves as an endocytic delivery reagent to transport the probes into the cells, a fluorescence quencher and a binding competitor to detect intracellular mRNA, and a plasmonic photothermal transducer to induce cell ablation. We demonstrate the ability of the gold nanorod-locked nucleic acid complex to detect spatiotemporal gene expression in viable cells and tissues and to induce photothermal ablation of single cells. Using the gold nanorod-locked nucleic acid complex, we systematically characterize the dynamic cellular heat shock responses near the site of photothermal operation. The gold nanorod-locked nucleic acid complex enables mapping of intracellular gene expression and analysis of the photothermal effects of nanostructures toward various biomedical applications.

  7. Re-evaluation of a novel approach for quantitative myocardial oedema detection by analysing tissue inhomogeneity in acute myocarditis using T2-mapping.

    Science.gov (United States)

    Baeßler, Bettina; Schaarschmidt, Frank; Treutlein, Melanie; Stehning, Christian; Schnackenburg, Bernhard; Michels, Guido; Maintz, David; Bunck, Alexander C

    2017-12-01

    To re-evaluate a recently suggested approach of quantifying myocardial oedema and increased tissue inhomogeneity in myocarditis by T2-mapping. Cardiac magnetic resonance data of 99 patients with myocarditis were retrospectively analysed. Thirty healthy volunteers served as controls. T2-mapping data were acquired at 1.5 T using a gradient-spin-echo T2-mapping sequence. T2-maps were segmented according to the 16-segment AHA model. Segmental T2-values, segmental pixel-standard deviation (SD) and the derived parameters maxT2, maxSD and madSD were analysed and compared to the established Lake Louise criteria (LLC). A re-estimation of logistic regression models revealed that all models containing an SD-parameter were superior to any model containing global myocardial T2. Using a combined cut-off of 1.8 ms for madSD + 68 ms for maxT2 resulted in a diagnostic sensitivity of 75% and specificity of 80% and showed a similar diagnostic performance compared to LLC in receiver-operating-curve analyses. Combining madSD, maxT2 and late gadolinium enhancement (LGE) in a model resulted in a superior diagnostic performance compared to LLC (sensitivity 93%, specificity 83%). The results show that the novel T2-mapping-derived parameters exhibit an additional diagnostic value over LGE with the inherent potential to overcome the current limitations of T2-mapping. • A novel quantitative approach to myocardial oedema imaging in myocarditis was re-evaluated. • The T2-mapping-derived parameters maxT2 and madSD were compared to traditional Lake Louise criteria. • Using maxT2 and madSD with dedicated cut-offs performs similarly to Lake Louise criteria. • Adding maxT2 and madSD to LGE results in further increased diagnostic performance. • This novel approach has the potential to overcome the limitations of T2-mapping.
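    A sketch of how the derived parameters and the combined cut-off might be computed from segmental statistics. Two assumptions are made explicit here: madSD is read as the median absolute deviation of the segmental pixel-SDs, and the combined cut-off is applied as an OR of the two criteria; the paper's exact definitions should be checked before reuse.

```python
import statistics

def t2_features(segment_t2_means, segment_t2_sds):
    """maxT2 and madSD from 16-segment AHA statistics.  madSD is taken here as
    the median absolute deviation of the segmental pixel-SDs (an assumption)."""
    max_t2 = max(segment_t2_means)
    med = statistics.median(segment_t2_sds)
    mad_sd = statistics.median(abs(s - med) for s in segment_t2_sds)
    return max_t2, mad_sd

def myocarditis_suspected(max_t2, mad_sd, cut_t2=68.0, cut_mad=1.8):
    """Combined cut-off, assumed here to trigger when either criterion is met."""
    return max_t2 >= cut_t2 or mad_sd >= cut_mad

means = [55.0] * 15 + [70.0]   # one hyperintense segment (ms)
sds = [3.0] * 16               # homogeneous pixel-SDs (ms)
assert myocarditis_suspected(*t2_features(means, sds))
```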

  8. Mapping by monoclonal antibody detection of glycosaminoglycans in connective tissues

    DEFF Research Database (Denmark)

    Couchman, J R; Caterson, B; Christner, J E

    1984-01-01

    Chondroitin sulphate proteoglycans are widespread connective tissue components and chemical analysis of cartilage and other proteoglycans has demonstrated molecular speciation involving the degree and position of sulphation of the carbohydrate chains. This may, in turn, affect the properties...... of the glycosaminoglycan (GAG), particularly with respect to self-association and interactions with other extracellular matrix components. Interactions with specific molecules from different connective tissue types, such as the collagens and their associated glycoproteins, could be favoured by particular charge...... and dermatan sulphate. These provide novel opportunities to study the in vivo distribution of chondroitin sulphate proteoglycans. We demonstrate that chondroitin sulphates exhibit remarkable connective tissue specificity and furthermore provide evidence that some proteoglycans may predominantly carry only one...

  9. Nanomechanical mapping of bone tissue regenerated by magnetic scaffolds.

    Science.gov (United States)

    Bianchi, Michele; Boi, Marco; Sartori, Maria; Giavaresi, Gianluca; Lopomo, Nicola; Fini, Milena; Dediu, Alek; Tampieri, Anna; Marcacci, Maurilio; Russo, Alessandro

    2015-01-01

    Nanoindentation can provide new insights into the maturity stage of regenerating bone. The aim of the present study was to evaluate the nanomechanical properties of newly formed bone tissue at 4 weeks after the implantation of permanent magnets and magnetic scaffolds in the trabecular bone of rabbit femoral condyles. Three different groups were investigated: MAG-A (NdFeB magnet + apatite/collagen scaffold with magnetic nanoparticles directly nucleated on the collagen fibers during scaffold synthesis); MAG-B (NdFeB magnet + apatite/collagen scaffold later infiltrated with magnetic nanoparticles) and MAG (NdFeB magnet). The mechanical properties of bone tissues of different maturity, i.e. newly formed immature, newly formed mature and native trabecular bone, were evaluated for the three groups. Contingent correlations between the elastic modulus and hardness of immature, mature and native bone were examined and discussed, as well as the efficacy of the adopted regeneration method in terms of the "mechanical gap" between newly formed and native bone tissue. The results showed that at 4 weeks after implantation the MAG-B group provided regenerated bone tissue with mechanical properties closer to those of native bone than the MAG-A or MAG groups. Further, whereas the mechanical properties of newly formed immature and mature bone were found to be fairly well correlated, no correlation was detected between immature or mature bone and native bone. These results demonstrate the efficacy of nanoindentation tests for investigating the maturity of newly formed bone not accessible through conventional analyses.

  10. Improved histopathological evaluation of gliomas using tissue fragments obtained by ultrasonic aspiration

    DEFF Research Database (Denmark)

    Neckelmann, K; Kristensen, B W; Schrøder, H D

    2004-01-01

    included in the biopsy removed for peroperative frozen section investigation. When the slides with Sonocut tissue fragments were analyzed, the probability of making the most malignant diagnosis increased from 81.3% to 99.1% when slides from 1 to 5 paraffin blocks were analyzed, respectively. When subgroups...... of small, medium and big tumors were analyzed, it was found that only 2 paraffin blocks from small tumors need to be prepared to reach a 98.3% probability of making the most malignant diagnosis, whereas 5 paraffin blocks from big tumors need to be prepared to reach a 96.8% probability. In conclusion......, the study shows that a limited amount of Sonocut ultrasonic tissue fragments improves the diagnostic evaluation of gliomas. These tissue fragments therefore must not be discarded. Only a few paraffin blocks need to be prepared to reach close to 100% probability of making the most malignant diagnosis, reducing...

  11. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which includes subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...

  12. Evolution of an array of elements with logistic transition probability

    International Nuclear Information System (INIS)

    Majernik, Vladimir; Surda, Anton

    1996-01-01

    The paper addresses the problem of how the state of an array of elements changes if the transition probabilities of its elements are chosen in the form of a logistic map. This problem leads to a special type of discrete-time Markov chain, which we simulated numerically for different transition probabilities and numbers of elements in the array. We show that the time evolution of the array exhibits a wide range of behavior depending on the total number of its elements and on the logistic constant a. We point out that this problem can be applied to the description of a spin system with a certain type of mean field and of multispecies ecosystems with an internal noise. (authors)
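    The dynamics described above can be sketched in a few lines. The coupling used here is an illustrative assumption, not the paper's exact model: each element becomes active with a probability given by the logistic form p = a·m·(1−m), where m is the current fraction of active elements, standing in for the mean-field coupling mentioned in the abstract.

```python
import random

def step(states, a, rng):
    """One synchronous update of the binary array.

    Illustrative coupling (assumption): each element becomes active
    with probability p = a * m * (1 - m), where m is the current
    fraction of active elements, i.e. a logistic transition
    probability with a mean-field flavor.
    """
    m = sum(states) / len(states)
    p = a * m * (1.0 - m)
    return [1 if rng.random() < p else 0 for _ in states]

rng = random.Random(0)
states = [rng.randint(0, 1) for _ in range(1000)]  # random initial array
for _ in range(200):
    states = step(states, a=3.9, rng=rng)
activity = sum(states) / len(states)  # mean activity after 200 steps
```

    Varying the logistic constant a and the array size changes the long-run behavior of the mean activity, which is the dependence the paper studies.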

  13. Topological probability and connection strength induced activity in complex neural networks

    International Nuclear Information System (INIS)

    Du-Qu, Wei; Bo, Zhang; Dong-Yuan, Qiu; Xiao-Shu, Luo

    2010-01-01

    Recent experimental evidence suggests that some brain activities can be assigned to small-world networks. In this work, we investigate how the topological probability p and connection strength C affect the activities of discrete neural networks with small-world (SW) connections. Network elements are described by two-dimensional map neurons (2DMNs) with parameter values at which no activity occurs. It is found that when the value of p is too small or too large, there are no active neurons in the network, no matter what the value of the connection strength is; for a given appropriate connection strength, there is an intermediate range of topological probability where the activity of the 2DMN network is induced and enhanced. On the other hand, for a given intermediate topological probability level, there exists an optimal value of connection strength such that the frequency of activity reaches its maximum. The possible mechanism behind the action of topological probability and connection strength is addressed based on the bifurcation method. Furthermore, the effects of noise and transmission delay on the activity of the neural network are also studied. (general)

  14. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    finite set can occur as the outcome distribution of a quantum-mechanical von Neumann measurement with postselection, given that the scalar product between the initial and the final state is known as well as the success probability of the postselection. An intermediate von Neumann measurement can enhance transition probabilities between states such that the error probability shrinks by a factor of up to 2. Chapter 4: A presentation of the category of stochastic matrices. This chapter gives generators and relations for the strict monoidal category of probabilistic maps on finite cardinals (i.e., stochastic matrices). Chapter 5: Convex Spaces: Definition and Examples. We try to promote convex spaces as an abstract concept of convexity which was introduced by Stone as ''barycentric calculus''. A convex space is a set where one can take convex combinations in a consistent way. By identifying the corresponding Lawvere theory as the category from chapter 4 and using the results obtained there, we give a different proof of a result of Swirszcz which shows that convex spaces can be identified with algebras of a finitary version of the Giry monad. After giving an extensive list of examples of convex sets as they appear throughout mathematics and theoretical physics, we note that there also exist convex spaces that cannot be embedded into a vector space: semilattices are a class of examples of purely combinatorial type. In an information-theoretic interpretation, convex subsets of vector spaces are probabilistic, while semilattices are possibilistic. Convex spaces unify these two concepts. (orig.)

  15. On the Transfer of a Number of Concepts of Statistical Radiophysics to the Theory of One-dimensional Point Mappings

    Directory of Open Access Journals (Sweden)

    Agalar M. Agalarov

    2018-01-01

    In this article, the possibility of using a bispectrum in the investigation of the regular and chaotic behaviour of one-dimensional point mappings is discussed. The effectiveness of transferring this concept to nonlinear dynamics is demonstrated using the Feigenbaum mapping as an example. The application of the Kullback-Leibler entropy in the theory of point mappings is also considered. It is shown that this information-like quantity is able to describe the behaviour of statistical ensembles of one-dimensional mappings, and some general properties of its behaviour are established within this theory. The constructiveness of the Kullback-Leibler entropy in the theory of point mappings is shown by means of its direct calculation for the "saw tooth" mapping with a linear initial probability density. Moreover, for this mapping, the denumerable set of initial probability densities that reach its stationary probability density after a finite number of steps is pointed out.
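    The "saw tooth" calculation can be made concrete. Taking the saw-tooth map to be the doubling map x → 2x mod 1 (an assumption; the abstract does not spell out the normalization), a linear initial density stays linear under the Perron-Frobenius operator, and its Kullback-Leibler divergence from the stationary (uniform) density can be tracked step by step:

```python
import math

def pf_step(a, b):
    """One Perron-Frobenius step of the 'saw tooth' (doubling) map
    x -> 2x mod 1 applied to a linear density f(x) = a + b*x on [0, 1]:
    f'(x) = (f(x/2) + f((x+1)/2)) / 2, so (a, b) -> (a + b/4, b/2)."""
    return a + b / 4.0, b / 2.0

def kl_to_uniform(a, b, n=100000):
    """Kullback-Leibler divergence D(f || uniform) = integral of
    f(x) * ln f(x) over [0, 1], evaluated by the midpoint rule."""
    s = 0.0
    for k in range(n):
        x = (k + 0.5) / n
        f = a + b * x
        if f > 0:
            s += f * math.log(f)
    return s / n

a, b = 0.0, 2.0          # linear initial density f(x) = 2x
divs = []
for _ in range(6):
    divs.append(kl_to_uniform(a, b))
    a, b = pf_step(a, b)
# divs shrinks toward 0 as the density approaches the uniform one
```

    The first divergence has the closed form ln 2 − 1/2 ≈ 0.193, and each iteration halves the slope b, so the divergence decreases monotonically toward zero.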

  16. USING RASCH ANALYSIS TO EXPLORE WHAT STUDENTS LEARN ABOUT PROBABILITY CONCEPTS

    Directory of Open Access Journals (Sweden)

    Zamalia Mahmud

    2015-01-01

    Students’ understanding of probability concepts has been investigated from various perspectives. This study set out to investigate the perceived understanding of probability concepts of forty-four students from the STAT131 Understanding Uncertainty and Variation course at the University of Wollongong, NSW. Rasch measurement, which is based on a probabilistic model, was used to identify concepts that students find easy, moderate and difficult to understand. Data were captured from the e-learning Moodle platform, where students provided their responses through an on-line quiz. As illustrated in the Rasch map, 96% of the students could understand sample space, simple events, mutually exclusive events and tree diagrams, while 67% of the students found the concepts of conditional and independent events rather easy to understand. Keywords: Perceived Understanding, Probability Concepts, Rasch Measurement Model. DOI: dx.doi.org/10.22342/jme.61.1

  17. Differential deposition of H2A.Z in rice seedling tissue during the day-night cycle.

    Science.gov (United States)

    Zhang, Kang; Xu, Wenying; Wang, Chunchao; Yi, Xin; Su, Zhen

    2017-03-04

    Chromatin structure has an important role in modulating gene expression. The incorporation of histone variants into the nucleosome leads to important changes in chromatin structure. The histone variant H2A.Z is highly conserved between different species of fungi, animals, and plants. However, dynamic changes to H2A.Z in rice during the day-night cycle have not been reported. In this study, we generated genome-wide maps of H2A.Z for day and night in harvested seedling tissues by combining chromatin immunoprecipitation and high-throughput sequencing. Analysis of the H2A.Z data sets detected 7099 genes with higher depositions of H2A.Z in seedling tissues harvested at night compared with seedling tissues harvested during the day, whereas 4597 genes had higher H2A.Z depositions in seedlings harvested during the day. The gene expression profile data suggested that H2A.Z probably regulates gene expression negatively during the day-night cycle and is involved in many important biological processes. In general, our results indicate that H2A.Z may play an important role in plant responses to the diurnal oscillation process.

  18. Modelling the electrical properties of tissue as a porous medium

    International Nuclear Information System (INIS)

    Smye, S W; Evans, C J; Robinson, M P; Sleeman, B D

    2007-01-01

    Models of the electrical properties of biological tissue have been the subject of many studies. These models have sought to explain aspects of the dielectric dispersion of tissue. This paper develops a mathematical model of the complex permittivity of tissue as a function of frequency f, in the range 10{sup 4}-10{sup 7} Hz, which is derived from a formulation used to describe the complex permittivity of porous media. The model introduces two parameters, porosity and percolation probability, to the description of the electrical properties of any tissue which comprises a random arrangement of cells. The complex permittivity for a plausible porosity and percolation probability distribution is calculated and compared with the published measured electrical properties of liver tissue. Broad agreement with the experimental data is noted. It is suggested that future detailed experimental measurements should be undertaken to validate the model. The model may be a more convenient method of parameterizing the electrical properties of biological tissue, and subsequent measurement of these parameters in a range of tissues may yield information of biological and clinical significance.

  19. Estimating the sky map in gamma-ray astronomy with a Compton telescope

    International Nuclear Information System (INIS)

    Herbert, T.J.

    1991-01-01

    Compton telescopes represent an effective design for γ-ray astronomy in the 1-30 MeV range. However, the complexity of the system response to incident γ-rays has restricted the formulation of optimal methods for processing the data. Since data are acquired only at considerable expense and difficulty, a significant investment in both algorithm development and computer processing time is warranted. Current methods for processing low-level data form the sky map as either the sum or the product of the probabilities that each recorded γ-ray originated from within an area of the sky map. Instead, we model the unknown sky map itself as the means of a Poisson process generating the γ-rays recorded by the telescope. In this paper the authors formulate the probability density function of the data conditioned upon the sky map and derive an iterative algorithm for estimating the sky map by the method of maximum likelihood.
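    For a Poisson imaging model of this kind, the maximum-likelihood sky map is commonly found with the EM (Richardson-Lucy) iteration. The sketch below uses a toy 2-pixel sky and a hypothetical 2-bin response matrix; the real Compton-telescope response is far more complex:

```python
def em_update(sky, response, counts):
    """One maximum-likelihood EM (Richardson-Lucy) iteration for a
    Poisson model: the expected count in detector bin d is
    sum_j response[d][j] * sky[j]."""
    nd, ns = len(response), len(sky)
    expected = [sum(response[d][j] * sky[j] for j in range(ns))
                for d in range(nd)]
    new_sky = []
    for j in range(ns):
        norm = sum(response[d][j] for d in range(nd))
        back = sum(response[d][j] * counts[d] / expected[d]
                   for d in range(nd) if expected[d] > 0)
        new_sky.append(sky[j] * back / norm if norm > 0 else 0.0)
    return new_sky

# toy 2-pixel sky observed through a hypothetical 2-bin response
response = [[0.8, 0.3], [0.2, 0.7]]
counts = [0.8 * 100.0 + 0.3 * 50.0,      # noise-free data generated
          0.2 * 100.0 + 0.7 * 50.0]      # from the true sky [100, 50]
sky = [75.0, 75.0]                       # flat initial guess
for _ in range(1000):
    sky = em_update(sky, response, counts)
# the iteration climbs the Poisson likelihood toward the true sky
```

    Each iteration multiplies the current sky estimate by a back-projected ratio of observed to expected counts, which keeps the estimate non-negative and increases the Poisson likelihood.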

  20. Predictive analysis and mapping of indoor radon concentrations in a complex environment using kernel estimation: An application to Switzerland

    Energy Technology Data Exchange (ETDEWEB)

    Kropat, Georg, E-mail: georg.kropat@chuv.ch [Institute of Radiation Physics, Lausanne University Hospital, Rue du Grand-Pré 1, 1007 Lausanne (Switzerland); Bochud, Francois [Institute of Radiation Physics, Lausanne University Hospital, Rue du Grand-Pré 1, 1007 Lausanne (Switzerland); Jaboyedoff, Michel [Faculty of Geosciences and Environment, University of Lausanne, GEOPOLIS — 3793, 1015 Lausanne (Switzerland); Laedermann, Jean-Pascal [Institute of Radiation Physics, Lausanne University Hospital, Rue du Grand-Pré 1, 1007 Lausanne (Switzerland); Murith, Christophe; Palacios, Martha [Swiss Federal Office of Public Health, Schwarzenburgstrasse 165, 3003 Berne (Switzerland); Baechler, Sébastien [Institute of Radiation Physics, Lausanne University Hospital, Rue du Grand-Pré 1, 1007 Lausanne (Switzerland); Swiss Federal Office of Public Health, Schwarzenburgstrasse 165, 3003 Berne (Switzerland)

    2015-02-01

    Purpose: The aim of this study was to develop models based on kernel regression and probability estimation in order to predict and map IRC in Switzerland by taking into account all of the following: architectural factors, spatial relationships between the measurements, as well as geological information. Methods: We looked at about 240 000 IRC measurements carried out in about 150 000 houses. As predictor variables we included: building type, foundation type, year of construction, detector type, geographical coordinates, altitude, temperature and lithology into the kernel estimation models. We developed predictive maps as well as a map of the local probability to exceed 300 Bq/m{sup 3}. Additionally, we developed a map of a confidence index in order to estimate the reliability of the probability map. Results: Our models were able to explain 28% of the variations of IRC data. All variables added information to the model. The model estimation revealed a bandwidth for each variable, making it possible to characterize the influence of each variable on the IRC estimation. Furthermore, we assessed the mapping characteristics of kernel estimation overall as well as by municipality. Overall, our model reproduces spatial IRC patterns which were already obtained earlier. On the municipal level, we could show that our model accounts well for IRC trends within municipal boundaries. Finally, we found that different building characteristics result in different IRC maps. Maps corresponding to detached houses with concrete foundations indicate systematically smaller IRC than maps corresponding to farms with earth foundation. Conclusions: IRC mapping based on kernel estimation is a powerful tool to predict and analyze IRC on a large-scale as well as on a local level. 
    This approach makes it possible to develop tailor-made maps for different architectural elements and measurement conditions, while at the same time accounting for geological information and spatial relations between IRC measurements.
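    The core of kernel regression can be illustrated with a one-dimensional Nadaraya-Watson estimate; the study's model extends this idea to many predictors, each with its own bandwidth. The transect coordinates and radon values below are hypothetical:

```python
import math

def gaussian_kernel(u):
    """Gaussian weighting kernel exp(-u^2 / 2)."""
    return math.exp(-0.5 * u * u)

def kernel_regression(x_query, xs, ys, bandwidth):
    """Nadaraya-Watson kernel estimate at x_query: a weighted average
    of the observations, with weights decaying with distance scaled by
    the bandwidth. One-dimensional sketch of the multivariate model."""
    weights = [gaussian_kernel((x_query - x) / bandwidth) for x in xs]
    total = sum(weights)
    return sum(w * y for w, y in zip(weights, ys)) / total

# hypothetical radon measurements (Bq/m^3) along a 1-D transect (km)
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [80.0, 120.0, 300.0, 260.0, 90.0]
est = kernel_regression(2.5, xs, ys, bandwidth=1.0)
# est is a weighted average dominated by the nearby high measurements
```

    The fitted bandwidth per variable, as in the study, quantifies how strongly each predictor influences the estimate: a small bandwidth means nearby values dominate, a large one smooths the variable's influence away.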

  1. Toward a generalized probability theory: conditional probabilities

    International Nuclear Information System (INIS)

    Cassinelli, G.

    1979-01-01

    The main mathematical object of interest in the quantum logic approach to the foundations of quantum mechanics is the orthomodular lattice and a set of probability measures, or states, defined by the lattice. This mathematical structure is studied per se, independently from the intuitive or physical motivation of its definition, as a generalized probability theory. It is thought that the building-up of such a probability theory could eventually throw light on the mathematical structure of Hilbert-space quantum mechanics as a particular concrete model of the generalized theory. (Auth.)

  2. Quantitative evaluation of fMRI retinotopic maps, from V1 to V4, for cognitive experiments

    Directory of Open Access Journals (Sweden)

    Cecile eBordier

    2015-05-01

    fMRI retinotopic mapping is a non-invasive technique for the delineation of low-level visual areas in individual subjects. It generally relies upon the analysis of functional responses to periodic visual stimuli that encode eccentricity or polar angle in the visual field. This technique is used in vision research when the precise assignment of brain activation to retinotopic areas is an issue. It involves processing steps computed with different algorithms and embedded in various software suites. Manual intervention may be needed for some steps. Although the diversity of the available processing suites and manual interventions may potentially introduce some differences in the final delineation of visual areas, no documented comparison between maps obtained with different procedures has been reported in the literature. To explore the effect of the processing steps on the quality of the maps obtained, we used two tools: BALC, which relies on a fully automated procedure, and BrainVoyager, where areas are delineated by hand on the brain surface. To focus on the mapping procedures specifically, we used the same SPM pipeline for preprocessing and the same tissue segmentation tool. We document the consistency and differences of the fMRI retinotopic maps obtained from routine retinotopy experiments on ten subjects. The maps obtained by skilled users are never fully identical. However, the agreement between the maps, around 80% for low-level areas, is probably sufficient for most applications. Our results also indicate that assigning cognitive activations, following a specific experiment (here, color perception), to individual retinotopic maps is not free of errors. We provide measurements of this error, which may help with the cautious interpretation of cognitive activation projections onto fMRI retinotopic maps. On average, the magnitude of the error is about 20%, with much larger differences in a few subjects.

  3. Increase in tumor control and normal tissue complication probabilities in advanced head-and-neck cancer for dose-escalated intensity-modulated photon and proton therapy

    Directory of Open Access Journals (Sweden)

    Annika eJakobi

    2015-11-01

    Introduction: Presently used radio-chemotherapy regimens result in moderate local control rates for patients with advanced head and neck squamous cell carcinoma (HNSCC). Dose escalation (DE) may be an option to improve patient outcome, but may also increase the risk of toxicities in healthy tissue. The presented treatment planning study evaluated the feasibility of two DE levels for advanced HNSCC patients, planned with either intensity-modulated photon therapy (IMXT) or proton therapy (IMPT). Materials and Methods: For 45 HNSCC patients, IMXT and IMPT treatment plans were created including DE via a simultaneous integrated boost (SIB) in the high-risk volume, while maintaining standard fractionation with 2 Gy per fraction in the remaining target volume. Two DE levels for the SIB were compared: 2.3 Gy and 2.6 Gy. Treatment plan evaluation included assessment of tumor control probabilities (TCP) and normal tissue complication probabilities (NTCP). Results: An increase of approximately 10% in TCP was estimated between the DE levels. A pronounced high-dose rim surrounding the SIB volume was identified in IMXT treatment. Compared to IMPT, this extra dose slightly increased the TCP values and, to a larger extent, the NTCP values. For both modalities, the higher DE level led only to a small increase in NTCP values (mean differences < 2% in all models), except for the risk of aspiration, which increased on average by 8% and 6% with IMXT and IMPT, respectively, but showed a considerable patient dependence. Conclusions: Both DE levels appear applicable to patients with IMXT and IMPT since all calculated NTCP values, except for one, increased only little for the higher DE level. The estimated TCP increase is of relevant magnitude. The higher DE schedule needs to be investigated carefully in the setting of a prospective clinical trial, especially regarding toxicities caused by high local doses that lack a sound dose-response description, e.g., ulcers.

  4. Wavelet analysis of polarization azimuths maps for laser images of myocardial tissue for the purpose of diagnosing acute coronary insufficiency

    Science.gov (United States)

    Wanchuliak, O. Ya.; Peresunko, A. P.; Bakko, Bouzan Adel; Kushnerick, L. Ya.

    2011-09-01

    This paper presents the foundations of a large-scale localized wavelet analysis of polarization-inhomogeneous laser images of histological sections of myocardial tissue. Relations between the structures of wavelet coefficients and the causes of death were identified. The optical model of polycrystalline networks of myocardium protein fibrils is presented. A technique for determining the coordinate distribution of the polarization azimuth of the points of laser images of myocardium histological sections is suggested. The results of investigating the interrelation between the values of statistical parameters (statistical moments of the 1st-4th orders) that characterize the distributions of wavelet coefficients of polarization maps of myocardium layers and the causes of death are presented.

  5. Data-driven probability concentration and sampling on manifold

    Energy Technology Data Exchange (ETDEWEB)

    Soize, C., E-mail: christian.soize@univ-paris-est.fr [Université Paris-Est, Laboratoire Modélisation et Simulation Multi-Echelle, MSME UMR 8208 CNRS, 5 bd Descartes, 77454 Marne-La-Vallée Cedex 2 (France); Ghanem, R., E-mail: ghanem@usc.edu [University of Southern California, 210 KAP Hall, Los Angeles, CA 90089 (United States)

    2016-09-15

    A new methodology is proposed for generating realizations of a random vector with values in a finite-dimensional Euclidean space that are statistically consistent with a dataset of observations of this vector. The probability distribution of this random vector, while a priori not known, is presumed to be concentrated on an unknown subset of the Euclidean space. A random matrix is introduced whose columns are independent copies of the random vector and for which the number of columns is the number of data points in the dataset. The approach is based on the use of (i) the multidimensional kernel-density estimation method for estimating the probability distribution of the random matrix, (ii) a MCMC method for generating realizations for the random matrix, (iii) the diffusion-maps approach for discovering and characterizing the geometry and the structure of the dataset, and (iv) a reduced-order representation of the random matrix, which is constructed using the diffusion-maps vectors associated with the first eigenvalues of the transition matrix relative to the given dataset. The convergence aspects of the proposed methodology are analyzed and a numerical validation is explored through three applications of increasing complexity. The proposed method is found to be robust to noise levels and data complexity as well as to the intrinsic dimension of data and the size of experimental datasets. Both the methodology and the underlying mathematical framework presented in this paper contribute new capabilities and perspectives at the interface of uncertainty quantification, statistical data analysis, stochastic modeling and associated statistical inverse problems.
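    Step (i) of the methodology, kernel-density estimation, can be sketched in one dimension (the paper works with the multidimensional version applied to the columns of the random matrix):

```python
import math
import random

def gaussian_kde(x, samples, bandwidth):
    """One-dimensional Gaussian kernel-density estimate at x from the
    observed samples: an average of Gaussian bumps centered on the
    data points."""
    n = len(samples)
    c = 1.0 / (n * bandwidth * math.sqrt(2.0 * math.pi))
    return c * sum(math.exp(-0.5 * ((x - s) / bandwidth) ** 2)
                   for s in samples)

rng = random.Random(1)
samples = [rng.gauss(0.0, 1.0) for _ in range(2000)]  # synthetic dataset
d0 = gaussian_kde(0.0, samples, bandwidth=0.3)  # near the data's center
d3 = gaussian_kde(3.0, samples, bandwidth=0.3)  # far out in the tail
# the estimate concentrates probability where the data actually lie
```

    This concentration property is the point of the methodology: new realizations are drawn only where the estimated density, and hence the underlying dataset, has support.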

  6. A Bayesian-probability-based method for assigning protein backbone dihedral angles based on chemical shifts and local sequences

    Energy Technology Data Exchange (ETDEWEB)

    Wang Jun; Liu Haiyan [University of Science and Technology of China, Hefei National Laboratory for Physical Sciences at the Microscale, and Key Laboratory of Structural Biology, School of Life Sciences (China)], E-mail: hyliu@ustc.edu.cn

    2007-01-15

    Chemical shifts contain substantial information about protein local conformations. We present a method to assign individual protein backbone dihedral angles into specific regions on the Ramachandran map based on the amino acid sequences and the chemical shifts of backbone atoms of tripeptide segments. The method uses a scoring function derived from the Bayesian probability for the central residue of a query tripeptide segment to have a particular conformation. The Ramachandran map is partitioned into representative regions at two levels of resolution. The lower-resolution partitioning is equivalent to the conventional definitions of different secondary structure regions on the map. At the higher resolution level, the α and β regions are further divided into subregions. Predictions are attempted at both levels of resolution. We compared our method with TALOS using the original TALOS database, and obtained comparable results. Although TALOS may produce the best results with currently available databases which are much enlarged, the Bayesian-probability-based approach can provide a quantitative measure for the reliability of predictions.
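    The scoring idea can be sketched as a maximum-a-posteriori choice over Ramachandran regions, log P(region) + log P(shifts | region). The single-shift Gaussian model and the numbers below (priors and Cα secondary-shift statistics) are illustrative stand-ins for the database-derived Bayesian scores the method actually uses:

```python
import math

def gauss_loglik(x, mu, sigma):
    """Log-density of a Gaussian N(mu, sigma^2) at x."""
    return (-0.5 * ((x - mu) / sigma) ** 2
            - math.log(sigma * math.sqrt(2.0 * math.pi)))

def classify(shift, regions):
    """Pick the Ramachandran region maximizing the Bayesian score
    log P(region) + log P(shift | region).  Toy single-shift model;
    the real method scores tripeptide segments against a database."""
    best, best_score = None, -math.inf
    for name, (prior, mu, sigma) in regions.items():
        score = math.log(prior) + gauss_loglik(shift, mu, sigma)
        if score > best_score:
            best, best_score = name, score
    return best

# hypothetical CA secondary-shift statistics (ppm): prior, mean, std
regions = {"alpha": (0.4, 2.6, 1.0),
           "beta": (0.4, -1.4, 1.2),
           "other": (0.2, 0.0, 2.0)}
label = classify(2.5, regions)  # a positive CA secondary shift favors helix
```

    The margin between the best and second-best score is one natural way to quantify the prediction reliability mentioned in the abstract.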

  7. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

    Amosov, GG; Man'ko, [No Value

    2004-01-01

    Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the

  8. Divergence from, and Convergence to, Uniformity of Probability Density Quantiles

    Directory of Open Access Journals (Sweden)

    Robert G. Staudte

    2018-04-01

    We demonstrate that questions of convergence and divergence regarding the shapes of distributions can be carried out in a location- and scale-free environment. This environment is the class of probability density quantiles (pdQs), obtained by normalizing the composition of the density with the associated quantile function. It has earlier been shown that the pdQ is representative of a location-scale family and carries essential information regarding the shape and tail behavior of the family. The pdQs are densities of continuous distributions with a common domain, the unit interval, facilitating metric and semi-metric comparisons. The Kullback–Leibler divergences from uniformity of these pdQs are mapped to illustrate their relative positions with respect to uniformity. To gain more insight into the information that is conserved under the pdQ mapping, we apply the pdQ mapping repeatedly and find that further applications of it are quite generally entropy-increasing, so convergence to the uniform distribution is investigated. New fixed point theorems are established with elementary probabilistic arguments and illustrated by examples.
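    As a concrete example, the pdQ of the exponential distribution is pdQ(u) = 2(1 − u): with Q(u) = −ln(1 − u) and f(x) = e^(−x), the composition f(Q(u)) = 1 − u integrates to 1/2 on the unit interval, so normalization doubles it. Its Kullback–Leibler divergence from uniformity works out to ln 2 − 1/2, which a midpoint-rule integration reproduces:

```python
import math

def pdq_exponential(u):
    """Probability density quantile of the exponential distribution:
    f(Q(u)) = 1 - u, normalized on [0, 1] to pdQ(u) = 2 * (1 - u)."""
    return 2.0 * (1.0 - u)

def kl_from_uniform(pdq, n=200000):
    """D(pdQ || uniform) = integral of pdq(u) * ln pdq(u) over (0, 1),
    evaluated by the midpoint rule."""
    s = 0.0
    for k in range(n):
        u = (k + 0.5) / n
        p = pdq(u)
        if p > 0:
            s += p * math.log(p)
    return s / n

div = kl_from_uniform(pdq_exponential)
# analytic value: ln 2 - 1/2, approximately 0.1931
```

    Because the pdQ lives on the unit interval regardless of the original distribution's location and scale, divergences like this one compare pure shape.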

  9. The influence of phonotactic probability and neighborhood density on children's production of newly learned words.

    Science.gov (United States)

    Heisler, Lori; Goffman, Lisa

    A word learning paradigm was used to teach children novel words that varied in phonotactic probability and neighborhood density. The effects of frequency and density on speech production were examined when phonetic forms were non-referential (i.e., when no referent was attached) and when phonetic forms were referential (i.e., when a referent was attached through fast mapping). Two methods of analysis were included: (1) kinematic variability of speech movement patterning; and (2) measures of segmental accuracy. Results showed that phonotactic frequency influenced the stability of movement patterning whereas neighborhood density influenced phoneme accuracy. Motor learning was observed in both non-referential and referential novel words. Forms with low phonotactic probability and low neighborhood density showed a word learning effect when a referent was assigned during fast mapping. These results elaborate on and specify the nature of interactivity observed across lexical, phonological, and articulatory domains.

  10. Seismic hazard maps for Haiti

    Science.gov (United States)

    Frankel, Arthur; Harmsen, Stephen; Mueller, Charles; Calais, Eric; Haase, Jennifer

    2011-01-01

    We have produced probabilistic seismic hazard maps of Haiti for peak ground acceleration and response spectral accelerations that include the hazard from the major crustal faults, subduction zones, and background earthquakes. The hazard from the Enriquillo-Plantain Garden, Septentrional, and Matheux-Neiba fault zones was estimated using fault slip rates determined from GPS measurements. The hazard from the subduction zones along the northern and southeastern coasts of Hispaniola was calculated from slip rates derived from GPS data and the overall plate motion. Hazard maps were made for a firm-rock site condition and for a grid of shallow shear-wave velocities estimated from topographic slope. The maps show substantial hazard throughout Haiti, with the highest hazard in Haiti along the Enriquillo-Plantain Garden and Septentrional fault zones. The Matheux-Neiba Fault exhibits high hazard in the maps for 2% probability of exceedance in 50 years, although its slip rate is poorly constrained.

  11. A Hybrid Hierarchical Approach for Brain Tissue Segmentation by Combining Brain Atlas and Least Square Support Vector Machine

    Science.gov (United States)

    Kasiri, Keyvan; Kazemi, Kamran; Dehghani, Mohammad Javad; Helfroush, Mohammad Sadegh

    2013-01-01

    In this paper, we present a new semi-automatic brain tissue segmentation method based on a hybrid hierarchical approach that combines a brain atlas as a priori information and a least-squares support vector machine (LS-SVM). The method consists of three steps. In the first two steps, the skull is removed and the cerebrospinal fluid (CSF) is extracted. These two steps are performed using the toolbox FMRIB's Automated Segmentation Tool integrated in the FSL software (FSL-FAST) developed at the Oxford Centre for Functional MRI of the Brain (FMRIB). Then, in the third step, the LS-SVM is used to segment grey matter (GM) and white matter (WM). The training samples for the LS-SVM are selected from the registered brain atlas. The voxel intensities and spatial positions are selected as the two feature groups for training and testing. The SVM, as a powerful discriminator, is able to handle nonlinear classification problems; however, it cannot provide posterior probabilities. Thus, we use a sigmoid function to map the SVM output into probabilities. The proposed method is used to segment CSF, GM and WM from simulated magnetic resonance imaging (MRI) data generated with the Brainweb MRI simulator and from real data provided by the Internet Brain Segmentation Repository. The semi-automatically segmented brain tissues were evaluated by comparison with the corresponding ground truth. The Dice and Jaccard similarity coefficients, sensitivity and specificity were calculated for the quantitative validation of the results. The quantitative results show that the proposed method segments brain tissues accurately with respect to the corresponding ground truth. PMID:24696800
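    Mapping SVM decision values to posterior probabilities with a sigmoid is commonly done via Platt scaling. A minimal sketch, with toy values for the two parameters that would normally be fitted on held-out decision values:

```python
import math

def sigmoid_probability(svm_output, a=-1.0, b=0.0):
    """Map a raw SVM decision value f to a posterior probability with
    a sigmoid, as in Platt scaling:
        P(class | x) = 1 / (1 + exp(a * f + b)).
    The parameters a and b (toy values here) are normally fitted by
    maximum likelihood on held-out decision values."""
    return 1.0 / (1.0 + math.exp(a * svm_output + b))

# decision values far from the margin map to confident probabilities
p_pos = sigmoid_probability(3.0)   # well inside the positive class
p_mid = sigmoid_probability(0.0)   # on the decision boundary -> 0.5
```

    With per-voxel probabilities for GM and WM in hand, each voxel can be assigned to the tissue class with the highest posterior, which is what the third segmentation step requires.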

  12. Incidence Probability of Delayed Health Consequences of the Chernobyl Accident

    International Nuclear Information System (INIS)

    Abdel-Ghani, A.H.; El-Naggar, A.M.; El-Kadi, A.A.

    2000-01-01

    During the first International Conference on the Long-Term Consequences of the Chernobyl Disaster in 1995 at Kiev, and also during the 1996 International Conference at Vienna summing up the consequences of the accident, the data regarding delayed health consequences were mainly related to thyroid cancer, hereditary disorders, general morbidity, mortality and psychological disturbances. Contrary to expectations, the incidences of leukemia and soft-tissue tumors were similar to the spontaneous incidence. The expected delayed effects among the accident survivors, the liquidators and populations resident in contaminated areas would, however, show a higher incidence probability of leukemia. These population groups have been continuously exposed to low-level radiation, both externally and internally. The new ICRP concept of radiation-induced detriment and the nominal probability coefficients for cancer and hereditary effects, for both workers and populations, are used as the rationale to calculate the incidence probability of delayed health effects of the Chernobyl accident.

  13. Probability of Elevated Volatile Organic Compound (VOC) Concentrations in Groundwater in the Eagle River Watershed Valley-Fill Aquifer, Eagle County, North-Central Colorado, 2006-2007

    Science.gov (United States)

    Rupert, Michael G.; Plummer, Niel

    2009-01-01

    This raster data set delineates the predicted probability of elevated volatile organic compound (VOC) concentrations in groundwater in the Eagle River watershed valley-fill aquifer, Eagle County, North-Central Colorado, 2006-2007. This data set was developed by a cooperative project between the U.S. Geological Survey, Eagle County, the Eagle River Water and Sanitation District, the Town of Eagle, the Town of Gypsum, and the Upper Eagle Regional Water Authority. This project was designed to evaluate potential land-development effects on groundwater and surface-water resources so that informed land-use and water management decisions can be made. This groundwater probability map and its associated probability maps were developed as follows: (1) A point data set of wells with groundwater quality and groundwater age data was overlaid with thematic layers of anthropogenic (related to human activities) and hydrogeologic data by using a geographic information system to assign each well values for depth to groundwater, distance to major streams and canals, distance to gypsum beds, precipitation, soils, and well depth. These data were then downloaded to a statistical software package for analysis by logistic regression. (2) Statistical models predicting the probability of elevated nitrate concentrations, the probability of unmixed young water (using chlorofluorocarbon-11 concentrations and tritium activities), and the probability of elevated volatile organic compound concentrations were developed using logistic regression techniques. (3) The statistical models were entered into a GIS and the probability map was constructed.
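
    A logistic-regression probability surface of the kind described in steps (2) and (3) can be sketched as follows; the feature set and coefficients are hypothetical illustrations, not the fitted values from the study:

```python
import numpy as np

def logistic_probability(features, coefs, intercept):
    """Probability surface from a fitted logistic-regression model:
    p = 1 / (1 + exp(-(b0 + X @ b)))."""
    z = intercept + features @ coefs
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical cells with (depth to groundwater [m], distance to major
# stream [m]); coefficients and intercept are illustrative only.
grid = np.array([[2.0, 50.0],     # shallow water table, near a stream
                 [15.0, 400.0]])  # deep water table, far from streams
coefs = np.array([-0.15, -0.004])
p = logistic_probability(grid, coefs, intercept=1.0)
```

    Applied cell by cell over the aquifer grid, this yields the probability raster that the record describes.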

  14. USING RASCH ANALYSIS TO EXPLORE WHAT STUDENTS LEARN ABOUT PROBABILITY CONCEPTS

    Directory of Open Access Journals (Sweden)

    Zamalia Mahmud

    2015-01-01

    Full Text Available Students’ understanding of probability concepts has been investigated from various different perspectives. This study set out to investigate the perceived understanding of probability concepts of forty-four students from the STAT131 Understanding Uncertainty and Variation course at the University of Wollongong, NSW. Rasch measurement, which is based on a probabilistic model, was used to identify concepts that students find easy, moderate and difficult to understand. Data were captured from the e-learning Moodle platform, where students provided their responses through an on-line quiz. As illustrated in the Rasch map, 96% of the students could understand sample space, simple events, mutually exclusive events and tree diagrams, while 67% of the students found the concepts of conditional and independent events rather easy to understand.

  16. Systematic variations in multi-spectral lidar representations of canopy height profiles and gap probability

    Science.gov (United States)

    Chasmer, L.; Hopkinson, C.; Gynan, C.; Mahoney, C.; Sitar, M.

    2015-12-01

    Airborne and terrestrial lidar are increasingly used in forest attribute modeling for carbon, ecosystem and resource monitoring. The near infra-red wavelength at 1064nm has been utilised most in airborne applications due to, for example, diode manufacture costs, surface reflectance and eye safety. Foliage reflects well at 1064nm and most of the literature on airborne lidar forest structure is based on data from this wavelength. However, lidar systems also operate at wavelengths further from the visible spectrum (e.g. 1550nm) for eye safety reasons. This corresponds to a water absorption band and can be sensitive to attenuation if surfaces contain moisture. Alternatively, some systems operate in the visible range (e.g. 532nm) for specialised applications requiring simultaneous mapping of terrestrial and bathymetric surfaces. All these wavelengths provide analogous 3D canopy structure reconstructions and thus offer the potential to be combined for spatial comparisons or temporal monitoring. However, a systematic comparison of wavelength-dependent foliage profile and gap probability (index of transmittance) is needed. Here we report on two multispectral lidar missions carried out in 2013 and 2015 over conifer, deciduous and mixed stands in Ontario, Canada. The first used separate lidar sensors acquiring comparable data at three wavelengths, while the second used a single sensor with 3 integrated laser systems. In both cases, the wavelengths sampled were 532nm, 1064nm and 1550nm. The experiment revealed significant differences in proportions of returns at ground level, the vertical foliage distribution and gap probability across wavelengths. Canopy attenuation was greatest at 532nm due to photosynthetic plant tissue absorption. Relative to 1064nm, foliage was systematically undersampled at the 10% to 60% height percentiles at both 1550nm and 532nm (this was confirmed with coincident terrestrial lidar data). When using all returns to calculate gap probability, all

  17. COVAL, Compound Probability Distribution for Function of Probability Distribution

    International Nuclear Information System (INIS)

    Astolfi, M.; Elbaz, J.

    1979-01-01

    1 - Nature of the physical problem solved: Computation of the probability distribution of a function of variables, given the probability distribution of the variables themselves. 'COVAL' has been applied to reliability analysis of a structure subject to random loads. 2 - Method of solution: Numerical transformation of probability distributions
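
    COVAL performs a numerical transformation of distributions; as a rough illustration of the same problem, here is a Monte Carlo sketch of the distribution of a function of random variables, applied to a hypothetical load/resistance reliability example:

```python
import random
import statistics

def function_distribution_samples(f, sample_x, sample_y, n=100_000):
    """Empirical distribution of Z = f(X, Y) given samplers for X and Y;
    Monte Carlo here stands in for the numerical transformation of
    distributions that a code such as COVAL performs."""
    return [f(sample_x(), sample_y()) for _ in range(n)]

# Hypothetical reliability example: margin Z = R - L for a structure with
# normally distributed resistance R and load L.
random.seed(0)
z = function_distribution_samples(
    lambda r, l: r - l,
    sample_x=lambda: random.gauss(10.0, 1.0),   # resistance R
    sample_y=lambda: random.gauss(6.0, 2.0),    # load L
)
failure_probability = sum(v < 0.0 for v in z) / len(z)
```

    The empirical distribution of the margin Z directly yields the failure probability P(Z < 0) used in structural reliability analysis.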

  18. On Probability Leakage

    OpenAIRE

    Briggs, William M.

    2012-01-01

    The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.
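
    For a concrete (hypothetical) instance of leakage in a regression model, consider a normal predictive distribution for a quantity the evidence says is strictly nonnegative; the leakage is the predictive mass below zero:

```python
from math import erf, sqrt

def normal_cdf(x, mu, sigma):
    """CDF of a normal distribution."""
    return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))

def leakage_below(bound, mu, sigma):
    """Probability mass a normal predictive distribution places below a
    bound that the evidence E declares impossible: this mass is the
    probability leakage."""
    return normal_cdf(bound, mu, sigma)

# A regression model for a strictly nonnegative quantity with predictive
# distribution N(1.0, 0.8) leaks roughly 10% of its mass below zero.
leak = leakage_below(0.0, mu=1.0, sigma=0.8)
```

    A model whose predictive mean sits far from the impossible region leaks a negligible amount, which is why leakage often goes unnoticed in practice.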

  19. Evaluation and comparison of cartilage repair tissue of the patella and medial femoral condyle by using morphological MRI and biochemical zonal T2 mapping

    International Nuclear Information System (INIS)

    Welsch, Goetz H.; Mamisch, Tallal C.; Quirbach, Sebastian; Trattnig, Siegfried; Zak, Lukas; Marlovits, Stefan

    2009-01-01

    The objective of this study was to use advanced MR techniques to evaluate and compare cartilage repair tissue after matrix-associated autologous chondrocyte transplantation (MACT) in the patella and medial femoral condyle (MFC). Thirty-four patients treated with MACT underwent 3-T MRI of the knee. Patients were treated on either patella (n = 17) or MFC (n = 17) cartilage and were matched by age and postoperative interval. For morphological evaluation, the MR observation of cartilage repair tissue (MOCART) score was used, with a 3D-True-FISP sequence. For biochemical assessment, T2 mapping was prepared by using a multiecho spin-echo approach with particular attention to the cartilage zonal structure. Statistical evaluation was done by analyses of variance. The MOCART score showed no significant differences between the patella and MFC (p ≥ 0.05). With regard to biochemical T2 relaxation, higher T2 values were found throughout the MFC (p < 0.05). The zonal increase in T2 values from deep to superficial was significant for control cartilage (p < 0.001) and cartilage repair tissue (p < 0.05), with an earlier onset in the repair tissue of the patella. The assessment of cartilage repair tissue of the patella and MFC afforded comparable morphological results, whereas biochemical T2 values showed differences, possibly due to dissimilar biomechanical loading conditions. (orig.)

  20. Hypothyroidism after primary radiotherapy for head and neck squamous cell carcinoma: Normal tissue complication probability modeling with latent time correction

    International Nuclear Information System (INIS)

    Rønjom, Marianne Feen; Brink, Carsten; Bentzen, Søren M.; Hegedüs, Laszlo; Overgaard, Jens; Johansen, Jørgen

    2013-01-01

    Background and purpose: To develop a normal tissue complication probability (NTCP) model of radiation-induced biochemical hypothyroidism (HT) after primary radiotherapy for head and neck squamous cell carcinoma (HNSCC), with adjustment for latency and clinical risk factors. Patients and methods: Patients with HNSCC receiving definitive radiotherapy with 66–68 Gy without surgery were followed up with serial post-treatment thyrotropin (TSH) assessment. HT was defined as TSH >4.0 mU/l. Data were analyzed with both a logistic and a mixture model (correcting for latency) to determine risk factors for HT and to develop an NTCP model based on mean thyroid dose (MTD) and thyroid volume. Results: 203 patients were included. Median follow-up: 25.1 months. Five-year estimated risk of HT was 25.6%. In the mixture model, the only independent risk factors for HT were thyroid volume (cm³) (OR = 0.75 [95% CI: 0.64–0.85], p …) … cm³, respectively. Conclusions: Comparing the logistic and mixture models demonstrates the importance of latent-time correction in NTCP modeling. Thyroid dose constraints in treatment planning should be individualized based on thyroid volume.
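
    A logistic NTCP model of the general form described, with risk driven by mean thyroid dose and thyroid volume, can be sketched as follows; the coefficients are illustrative placeholders, not the fitted values from the paper:

```python
import math

def ntcp_logistic(mean_dose_gy, thyroid_volume_cm3, b0, b_dose, b_vol):
    """Logistic NTCP model: risk of hypothyroidism as a function of mean
    thyroid dose (Gy) and thyroid volume (cm^3). The coefficients below
    are illustrative placeholders, not the fitted values from the study."""
    z = b0 + b_dose * mean_dose_gy + b_vol * thyroid_volume_cm3
    return 1.0 / (1.0 + math.exp(-z))

# Risk rises with mean dose and falls with thyroid volume (the study's
# odds ratio for volume is below 1).
hi = ntcp_logistic(45.0, 10.0, b0=-2.0, b_dose=0.05, b_vol=-0.29)
lo = ntcp_logistic(25.0, 25.0, b0=-2.0, b_dose=0.05, b_vol=-0.29)
```

    A small thyroid receiving a high mean dose sits on the steep part of the risk curve, which is why the paper argues for volume-individualized dose constraints.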

  1. Color on emergency mapping

    Science.gov (United States)

    Jiang, Lili; Qi, Qingwen; Zhang, An

    2007-06-01

    There are many kinds of emergencies in our daily life, such as typhoons, tsunamis, earthquakes, fires, floods, epidemics, etc. These emergencies cost people their lives and their belongings. Every day, every hour, even every minute, people may face an emergency, so how to handle it and how to reduce the harm it causes are the matters people care about most. If we can map an emergency accurately before or after it occurs, the map will be helpful both to emergency researchers and to people living in the affected area. Through an emergency map, before an emergency occurs we can predict the situation, such as when and where the emergency will happen and where people can take refuge; after a disaster, we can more easily assess the losses, investigate the causes and reduce future losses. The primary purpose of mapping is to offer information to the people who are affected by the emergency and to the researchers who study it. Mapping allows viewers to get a spatial sense of the hazard, and it can provide clues for studying the relationships among the phenomena involved in an emergency. Color, as a basic element of the map, can simplify and clarify the phenomena shown. Color also affects the general perceptibility of the map and elicits subjective reactions from the reader; that is to say, structure, readability, and the reader's psychological reactions can all be affected by the use of color.

  2. Mapping flood hazards under uncertainty through probabilistic flood inundation maps

    Science.gov (United States)

    Stephens, T.; Bledsoe, B. P.; Miller, A. J.; Lee, G.

    2017-12-01

    Changing precipitation, rapid urbanization, and population growth interact to create unprecedented challenges for flood mitigation and management. Standard methods for estimating risk from flood inundation maps generally involve simulations of floodplain hydraulics for an established regulatory discharge of specified frequency. Hydraulic model results are then geospatially mapped and depicted as a discrete boundary of flood extents and a binary representation of the probability of inundation (in or out) that is assumed constant over a project's lifetime. Consequently, existing methods utilized to define flood hazards and assess risk management are hindered by deterministic approaches that assume stationarity in a nonstationary world, failing to account for spatio-temporal variability of climate and land use as they translate to hydraulic models. This presentation outlines novel techniques for portraying flood hazards and the results of multiple flood inundation maps spanning hydroclimatic regions. Flood inundation maps generated through modeling of floodplain hydraulics are probabilistic, reflecting uncertainty quantified through Monte Carlo analyses of model inputs and parameters under current and future scenarios. The likelihood of inundation and range of variability in flood extents resulting from Monte Carlo simulations are then compared with deterministic evaluations of flood hazards from current regulatory flood hazard maps. By facilitating alternative approaches of portraying flood hazards, the novel techniques described in this presentation can contribute to a shifting paradigm in flood management that acknowledges the inherent uncertainty in model estimates and the nonstationary behavior of land use and climate.
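
    The Monte Carlo construction of a probabilistic inundation map can be sketched in miniature; here the water-surface elevation itself is sampled as a stand-in for drawing hydraulic-model inputs (discharge, roughness) and rerunning the floodplain model on each draw:

```python
import random

def inundation_probability(ground_elev_m, n_sims=5000, seed=1):
    """Per-cell inundation probability from Monte Carlo draws of an
    uncertain flood water-surface elevation. A full application would
    sample hydraulic-model inputs and rerun the floodplain model each
    draw; the water surface is sampled directly here as a stand-in."""
    rng = random.Random(seed)
    counts = [0] * len(ground_elev_m)
    for _ in range(n_sims):
        wse = rng.gauss(2.0, 0.5)   # uncertain water-surface elevation (m)
        for i, z in enumerate(ground_elev_m):
            if wse > z:
                counts[i] += 1
    return [c / n_sims for c in counts]

# Cells well below the mean water surface are almost always inundated;
# cells well above it almost never are.
p = inundation_probability([0.5, 2.0, 3.5])
```

    The resulting per-cell likelihoods replace the binary in/out boundary of a deterministic map with a continuous probability surface.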

  3. Going beyond the flood insurance rate map: insights from flood hazard map co-production

    Science.gov (United States)

    Luke, Adam; Sanders, Brett F.; Goodrich, Kristen A.; Feldman, David L.; Boudreau, Danielle; Eguiarte, Ana; Serrano, Kimberly; Reyes, Abigail; Schubert, Jochen E.; AghaKouchak, Amir; Basolo, Victoria; Matthew, Richard A.

    2018-04-01

    Flood hazard mapping in the United States (US) is deeply tied to the National Flood Insurance Program (NFIP). Consequently, publicly available flood maps provide essential information for insurance purposes, but they do not necessarily provide relevant information for non-insurance aspects of flood risk management (FRM) such as public education and emergency planning. Recent calls for flood hazard maps that support a wider variety of FRM tasks highlight the need to deepen our understanding about the factors that make flood maps useful and understandable for local end users. In this study, social scientists and engineers explore opportunities for improving the utility and relevance of flood hazard maps through the co-production of maps responsive to end users' FRM needs. Specifically, two-dimensional flood modeling produced a set of baseline hazard maps for stakeholders of the Tijuana River valley, US, and Los Laureles Canyon in Tijuana, Mexico. Focus groups with natural resource managers, city planners, emergency managers, academia, non-profit, and community leaders refined the baseline hazard maps by triggering additional modeling scenarios and map revisions. Several important end user preferences emerged, such as (1) legends that frame flood intensity both qualitatively and quantitatively, and (2) flood scenario descriptions that report flood magnitude in terms of rainfall, streamflow, and its relation to an historic event. Regarding desired hazard map content, end users' requests revealed general consistency with mapping needs reported in European studies and guidelines published in Australia. However, requested map content that is not commonly produced included (1) standing water depths following the flood, (2) the erosive potential of flowing water, and (3) pluvial flood hazards, or flooding caused directly by rainfall. We conclude that the relevance and utility of commonly produced flood hazard maps can be most improved by illustrating pluvial flood hazards.

  4. Brain water mapping with MR imaging

    International Nuclear Information System (INIS)

    Laine, F.J.; Fatouros, P.P.; Kraft, K.A.

    1990-01-01

    This paper reports on a recently developed MR imaging technique to determine the spatial distribution of brain water in healthy volunteers. A noninvasive MR imaging technique to obtain absolute measurements of brain water has been developed and validated in phantom and animal studies. Confirmation in patients was obtained from independent gravimetric measurements of brain tissue samples harvested by biopsy. The approach entails the production of accurate T1 maps from multiple inversion recovery images of a selected anatomic section and their subsequent conversion into an absolute water image by means of a previously determined calibration curve. Twenty healthy volunteers were studied and their water distribution was determined in a standard section. The following brain water values (means and SD, in grams of water per gram of tissue) were obtained for selected brain regions: white matter, 68.9% ± 1.0; corpus callosum, 67.4% ± 1.1; thalamus, 75.3% ± 1.4; and caudate nucleus, 80.3% ± 1.4. MR imaging water mapping is a valid means of determining water content in a variety of brain tissues.
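
    The T1-to-water conversion relies on a previously determined calibration curve; a sketch assuming a calibration of the common form 1/W = A + B/T1 (the coefficient values below are illustrative, not the study's):

```python
def water_content_from_t1(t1_ms, a=0.935, b=180.0):
    """Fractional water content W from tissue T1 (ms) via a calibration
    curve of the assumed form 1/W = A + B/T1. The coefficients A and B
    here are illustrative; in practice they come from phantom
    calibration."""
    return 1.0 / (a + b / t1_ms)

# Longer T1 maps to higher water content, as expected for brain tissue.
w_wm = water_content_from_t1(600.0)   # white-matter-like T1
w_gm = water_content_from_t1(900.0)   # grey-matter-like T1
```

    Applying such a curve voxel by voxel to the T1 map yields the absolute water image described in the record.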

  5. Culture-Independent Identification of Mycobacterium avium Subspecies paratuberculosis in Ovine Tissues: Comparison with Bacterial Culture and Histopathological Lesions

    Directory of Open Access Journals (Sweden)

    Kamal R. Acharya

    2017-12-01

    Full Text Available Johne’s disease is a chronic debilitating enteropathy of ruminants caused by Mycobacterium avium subspecies paratuberculosis (MAP). Current abattoir surveillance programs detect disease via examination of gross lesions and confirmation by histopathology and/or tissue culture, which is time-consuming and has relatively low sensitivity. This study aimed to investigate whether a high-throughput quantitative PCR (qPCR) test is a viable alternative for tissue testing. Intestine and mesenteric lymph nodes were sourced from sheep experimentally infected with MAP and the DNA was extracted using a protocol developed for tissues, comprising enzymatic digestion of the tissue homogenate, chemical and mechanical lysis, and magnetic bead-based DNA purification. The extracted DNA was tested by adapting a previously validated qPCR for fecal samples, and the results were compared with culture and histopathology results of the corresponding tissues. The MAP tissue qPCR confirmed infection in the majority of sheep with gross lesions on postmortem (37/38). Likewise, almost all tissue culture (61/64) or histopathology (52/58) positives were detected, with good to moderate agreement (Cohen’s kappa statistic) and no significant difference from the reference tests (McNemar’s Chi-square test). Higher MAP DNA quantities corresponded to animals with more severe histopathology (odds ratio: 1.82; 95% confidence interval: 1.60, 2.07). Culture-independent strain typing on tissue DNA was successfully performed. This MAP tissue qPCR method had a sensitivity equivalent to the reference tests and is thus a viable replacement for gross and histopathological examination of tissue samples in abattoirs. In addition, the test could be validated for testing tissue samples intended for human consumption.

  6. Distinguishability notion based on Wootters statistical distance: Application to discrete maps

    Science.gov (United States)

    Gomez, Ignacio S.; Portesi, M.; Lamberti, P. W.

    2017-08-01

    We study the distinguishability notion given by Wootters for states represented by probability density functions. This presents the particularity that it can also be used for defining a statistical distance in chaotic unidimensional maps. Based on that definition, we provide a metric d̄ for an arbitrary discrete map. Moreover, from d̄, we associate a metric space with each invariant density of a given map, which turns out to be the set of all distinguished points when the number of iterations of the map tends to infinity. Also, we give a characterization of the wandering set of a map in terms of the metric d̄, which allows us to identify the dissipative regions in the phase space. We illustrate the results in the case of the logistic and the circle maps numerically and analytically, and we obtain d̄ and the wandering set for some characteristic values of their parameters. Finally, an extension of the metric space associated with arbitrary probability distributions (not necessarily invariant densities) is given along with some consequences. The statistical properties of distributions given by histograms are characterized in terms of the cardinal of the associated metric space. For two conjugate variables, the uncertainty principle is expressed in terms of the diameters of the metric spaces associated with those variables.
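
    For discrete distributions, the Wootters statistical distance underlying this construction is the angle arccos of the Bhattacharyya overlap; a minimal sketch:

```python
import math

def wootters_distance(p, q):
    """Wootters statistical distance between two discrete probability
    distributions: the angle arccos(sum_i sqrt(p_i * q_i))."""
    overlap = sum(math.sqrt(pi * qi) for pi, qi in zip(p, q))
    # Clamp tiny floating-point overshoot above 1 before taking arccos.
    return math.acos(min(1.0, overlap))

d_same = wootters_distance([0.5, 0.5], [0.5, 0.5])   # identical -> 0
d_far = wootters_distance([1.0, 0.0], [0.0, 1.0])    # disjoint -> pi/2
```

    Identical distributions are indistinguishable (distance 0), while distributions with disjoint support are maximally distinguishable (distance π/2).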

  7. Absolute continuity for operator valued completely positive maps on C∗-algebras

    Science.gov (United States)

    Gheondea, Aurelian; Kavruk, Ali Şamil

    2009-02-01

    Motivated by applicability to quantum operations, quantum information, and quantum probability, we investigate the notion of absolute continuity for operator valued completely positive maps on C∗-algebras, previously introduced by Parthasarathy [in Athens Conference on Applied Probability and Time Series Analysis I (Springer-Verlag, Berlin, 1996), pp. 34-54]. We obtain an intrinsic definition of absolute continuity, we show that the Lebesgue decomposition defined by Parthasarathy is the maximal one among all other Lebesgue-type decompositions and that this maximal Lebesgue decomposition does not depend on the jointly dominating completely positive map, we obtain more flexible formulas for calculating the maximal Lebesgue decomposition, and we point out the nonuniqueness of the Lebesgue decomposition as well as a sufficient condition for uniqueness. In addition, we consider Radon-Nikodym derivatives for absolutely continuous completely positive maps that, in general, are unbounded positive self-adjoint operators affiliated to a certain von Neumann algebra, and we obtain a spectral approximation by bounded Radon-Nikodym derivatives. An application to the existence of the infimum of two completely positive maps is indicated, and formulas in terms of Choi's matrices for the Lebesgue decomposition of completely positive maps in matrix algebras are obtained.

  8. Quantum probabilities as Dempster-Shafer probabilities in the lattice of subspaces

    International Nuclear Information System (INIS)

    Vourdas, A.

    2014-01-01

    The orthocomplemented modular lattice of subspaces L[H(d)], of a quantum system with d-dimensional Hilbert space H(d), is considered. A generalized additivity relation which holds for Kolmogorov probabilities is violated by quantum probabilities in the full lattice L[H(d)] (it is only valid within the Boolean subalgebras of L[H(d)]). This suggests the use of more general (than Kolmogorov) probability theories, and here the Dempster-Shafer probability theory is adopted. An operator D(H1, H2), which quantifies deviations from Kolmogorov probability theory, is introduced, and it is shown to be intimately related to the commutator of the projectors P(H1), P(H2) onto the subspaces H1, H2. As an application, it is shown that the proof of the inequalities of Clauser, Horne, Shimony, and Holt for a system of two spin-1/2 particles is valid for Kolmogorov probabilities, but it is not valid for Dempster-Shafer probabilities. The violation of these inequalities in experiments supports the interpretation of quantum probabilities as Dempster-Shafer probabilities.

  9. Geometry of q-Exponential Family of Probability Distributions

    Directory of Open Access Journals (Sweden)

    Shun-ichi Amari

    2011-06-01

    Full Text Available The Gibbs distribution of statistical physics is an exponential family of probability distributions, which has a mathematical basis of duality in the form of the Legendre transformation. Recent studies of complex systems have found many distributions obeying power laws rather than the standard Gibbs-type distributions. The Tsallis q-entropy is a typical example capturing such phenomena. We treat the q-Gibbs distribution, or the q-exponential family, by generalizing the exponential function to the q-family of power functions, which is useful for studying various complex or non-standard physical phenomena. We give a new mathematical structure to the q-exponential family, different from those previously given. It has a dually flat geometrical structure derived from the Legendre transformation, and conformal geometry is useful for understanding it. The q-version of the maximum entropy theorem is naturally induced from the q-Pythagorean theorem. We also show that the maximizer of the q-escort distribution is a Bayesian MAP (Maximum A Posteriori Probability) estimator.
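
    The q-exponential that generalizes the Gibbs exponential can be sketched directly; for q → 1 it reduces to the ordinary exponential, while for q > 1 it produces power-law tails:

```python
import math

def q_exponential(x, q):
    """exp_q(x) = [1 + (1 - q) x]_+^(1/(1 - q)) for q != 1, with the
    convention that it is 0 where the bracket is negative; it reduces
    to the ordinary exponential as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    if base <= 0.0:
        return 0.0
    return base ** (1.0 / (1.0 - q))

# For q = 2 this is 1 / (1 - x) on x < 1, a power-law (Tsallis) tail.
y = q_exponential(0.5, 2.0)
```

    Replacing exp by exp_q in the Gibbs density is what turns the exponential family into the q-exponential family discussed in the record.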

  10. A hydroclimatological approach to predicting regional landslide probability using Landlab

    Directory of Open Access Journals (Sweden)

    R. Strauch

    2018-02-01

    Full Text Available We develop a hydroclimatological approach to the modeling of regional shallow landslide initiation that integrates spatial and temporal dimensions of parameter uncertainty to estimate an annual probability of landslide initiation based on Monte Carlo simulations. The physically based model couples the infinite-slope stability model with a steady-state subsurface flow representation and operates in a digital elevation model. Spatially distributed gridded data for soil properties and vegetation classification are used for parameter estimation of probability distributions that characterize model input uncertainty. Hydrologic forcing to the model is through annual maximum daily recharge to subsurface flow obtained from a macroscale hydrologic model. We demonstrate the model in a steep mountainous region in northern Washington, USA, over 2700 km2. The influence of soil depth on the probability of landslide initiation is investigated through comparisons among model output produced using three different soil depth scenarios reflecting the uncertainty of soil depth and its potential long-term variability. We found elevation-dependent patterns in probability of landslide initiation that showed the stabilizing effects of forests at low elevations, an increased landslide probability with forest decline at mid-elevations (1400 to 2400 m, and soil limitation and steep topographic controls at high alpine elevations and in post-glacial landscapes. These dominant controls manifest themselves in a bimodal distribution of spatial annual landslide probability. Model testing with limited observations revealed similarly moderate model confidence for the three hazard maps, suggesting suitable use as relative hazard products. The model is available as a component in Landlab, an open-source, Python-based landscape earth systems modeling environment, and is designed to be easily reproduced utilizing HydroShare cyberinfrastructure.

  11. A hydroclimatological approach to predicting regional landslide probability using Landlab

    Science.gov (United States)

    Strauch, Ronda; Istanbulluoglu, Erkan; Nudurupati, Sai Siddhartha; Bandaragoda, Christina; Gasparini, Nicole M.; Tucker, Gregory E.

    2018-02-01

    We develop a hydroclimatological approach to the modeling of regional shallow landslide initiation that integrates spatial and temporal dimensions of parameter uncertainty to estimate an annual probability of landslide initiation based on Monte Carlo simulations. The physically based model couples the infinite-slope stability model with a steady-state subsurface flow representation and operates in a digital elevation model. Spatially distributed gridded data for soil properties and vegetation classification are used for parameter estimation of probability distributions that characterize model input uncertainty. Hydrologic forcing to the model is through annual maximum daily recharge to subsurface flow obtained from a macroscale hydrologic model. We demonstrate the model in a steep mountainous region in northern Washington, USA, over 2700 km2. The influence of soil depth on the probability of landslide initiation is investigated through comparisons among model output produced using three different soil depth scenarios reflecting the uncertainty of soil depth and its potential long-term variability. We found elevation-dependent patterns in probability of landslide initiation that showed the stabilizing effects of forests at low elevations, an increased landslide probability with forest decline at mid-elevations (1400 to 2400 m), and soil limitation and steep topographic controls at high alpine elevations and in post-glacial landscapes. These dominant controls manifest themselves in a bimodal distribution of spatial annual landslide probability. Model testing with limited observations revealed similarly moderate model confidence for the three hazard maps, suggesting suitable use as relative hazard products. The model is available as a component in Landlab, an open-source, Python-based landscape earth systems modeling environment, and is designed to be easily reproduced utilizing HydroShare cyberinfrastructure.
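
    The infinite-slope stability core of such a model, wrapped in a Monte Carlo loop over uncertain soil strength and wetness, can be sketched as follows; the functional form is the standard infinite-slope factor of safety with a relative wetness term, and all parameter values are illustrative:

```python
import math
import random

def factor_of_safety(cohesion_pa, phi_deg, slope_deg, soil_depth_m, wetness,
                     gamma_s=18000.0, gamma_w=9810.0):
    """Infinite-slope factor of safety with a relative wetness term
    (0 = dry, 1 = fully saturated). Unit weights are in N/m^3; all
    parameter values used below are illustrative."""
    theta = math.radians(slope_deg)
    phi = math.radians(phi_deg)
    driving = gamma_s * soil_depth_m * math.sin(theta) * math.cos(theta)
    resisting = (cohesion_pa
                 + (gamma_s - wetness * gamma_w) * soil_depth_m
                 * math.cos(theta) ** 2 * math.tan(phi))
    return resisting / driving

def landslide_probability(n=2000, seed=0):
    """Fraction of Monte Carlo draws (uncertain friction angle and
    relative wetness) with FS < 1: an estimated probability of
    initiation for one steep, wet cell."""
    rng = random.Random(seed)
    fails = sum(
        factor_of_safety(2000.0, rng.uniform(28.0, 36.0), 38.0, 1.0,
                         rng.uniform(0.2, 1.0)) < 1.0
        for _ in range(n)
    )
    return fails / n
```

    Run over every grid cell with its own parameter distributions and recharge forcing, this per-cell failure fraction is what builds up the regional probability map.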

  12. Modeling highway travel time distribution with conditional probability models

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira Neto, Francisco Moraes [ORNL; Chin, Shih-Miao [ORNL; Hwang, Ho-Ling [ORNL; Han, Lee [University of Tennessee, Knoxville (UTK)

    2014-01-01

    Under the sponsorship of the Federal Highway Administration's Office of Freight Management and Operations, the American Transportation Research Institute (ATRI) has developed performance measures through the Freight Performance Measures (FPM) initiative. Under this program, travel speed information is derived from data collected using wireless-based global positioning systems. These telemetric data systems are subscribed to and used by the trucking industry as an operations management tool. More than one telemetric operator submits their data dumps to ATRI on a regular basis. Each data transmission contains the truck location, its travel time, and a clock time/date stamp. Data from the FPM program provide a unique opportunity for studying the upstream-downstream speed distributions at different locations, as well as at different times of the day and days of the week. This research is focused on the stochastic nature of successive link travel speed data on the continental United States Interstate network. Specifically, a method to estimate route probability distributions of travel time is proposed. This method uses the concepts of convolution of probability distributions and bivariate, link-to-link, conditional probability to estimate the expected distributions for the route travel time. A major contribution of this study is the consideration of speed correlation between upstream and downstream contiguous Interstate segments through conditional probability. The established conditional probability distributions, between successive segments, can be used to provide travel time reliability measures. This study also suggests an adaptive method for calculating and updating the route travel time distribution as new data or information are added. This methodology can be useful for estimating performance measures as required by the recent Moving Ahead for Progress in the 21st Century Act (MAP-21).
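
    Under an independence assumption, the convolution step for a route travel-time distribution can be sketched as follows (the study goes further and conditions each link's distribution on its upstream neighbor):

```python
import numpy as np

def route_time_distribution(link_pmfs):
    """Distribution of total route travel time as the convolution of
    per-link travel-time distributions, assuming independent links."""
    total = np.array([1.0])
    for pmf in link_pmfs:
        total = np.convolve(total, pmf)
    return total

# Two links; index k is travel time in minutes, value is its probability.
link_a = np.array([0.0, 0.7, 0.3])   # 1 min w.p. 0.7, 2 min w.p. 0.3
link_b = np.array([0.0, 0.5, 0.5])   # 1 min w.p. 0.5, 2 min w.p. 0.5
route = route_time_distribution([link_a, link_b])   # route[k] = P(T == k)
```

    Replacing each link's marginal distribution with its distribution conditional on the upstream link's realized speed is what captures the speed correlation the study emphasizes.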

  13. Targets of DNA-binding proteins in bacterial promoter regions present enhanced probabilities for spontaneous thermal openings

    International Nuclear Information System (INIS)

    Apostolaki, Angeliki; Kalosakas, George

    2011-01-01

    We mapped promoter regions of double-stranded DNA with respect to the probabilities of appearance of relatively large bubble openings arising exclusively from thermal fluctuations at physiological temperatures. We analyzed five well-studied promoter regions of prokaryotic type and found a spatial correlation between the binding sites of transcription factors and the positions of peaks in the probability pattern of large thermal openings. Other distinct peaks of the calculated patterns correlate with potential binding sites of DNA-binding proteins. These results suggest that a DNA molecule would more frequently expose the bases that participate in contacts with proteins, which would probably enhance the probability of the latter reaching their targets. They also support using this method as a means to analyze DNA sequences based on their intrinsic thermal properties.

  14. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    Science.gov (United States)

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  15. Landslide susceptibility map: from research to application

    Science.gov (United States)

    Fiorucci, Federica; Reichenbach, Paola; Ardizzone, Francesca; Rossi, Mauro; Felicioni, Giulia; Antonini, Guendalina

    2014-05-01

    A susceptibility map is an important and essential tool in environmental planning: it supports the evaluation of landslide hazard and risk and a correct and responsible management of the territory. Landslide susceptibility is the likelihood of a landslide occurring in an area on the basis of local terrain conditions. It can be expressed as the probability that any given region will be affected by landslides, i.e. an estimate of "where" landslides are likely to occur. In this work we present two examples of landslide susceptibility maps prepared for the Umbria Region and for the Perugia Municipality. These two maps were realized following official requests from the Regional and Municipal governments to the Research Institute for Geo-Hydrological Protection (CNR-IRPI). The susceptibility map prepared for the Umbria Region represents the development of previous agreements focused on preparing: i) a landslide inventory map that was included in the Urban Territorial Planning (PUT) and ii) a series of maps for the Regional Plan for Multi-risk Prevention. The activities carried out for the Umbria Region were aimed at defining and applying methods and techniques for landslide susceptibility zonation. Susceptibility maps were prepared exploiting a multivariate statistical model (linear discriminant analysis) for the five Civil Protection Alert Zones defined in the regional territory. The five resulting maps were tested and validated using the spatial distribution of recent landslide events that occurred in the region. The susceptibility map for the Perugia Municipality was prepared to be integrated as one of the cartographic products in the Municipal development plan (PRG - Piano Regolatore Generale), as required by the existing legislation. At the strategic level, one of the main objectives of the PRG is to establish a framework of knowledge and legal aspects for the management of geo-hydrological risk.
At the national level, most of the susceptibility maps prepared for the PRG were and still are obtained

  16. Tissue engineering and regenerative medicine: manufacturing challenges.

    Science.gov (United States)

    Williams, D J; Sebastine, I M

    2005-12-01

    Tissue engineering and regenerative medicine are interdisciplinary fields that apply principles of engineering and life sciences to develop biological substitutes, typically composed of biological and synthetic components, that restore, maintain or improve tissue function. Many tissue engineering technologies are still at a laboratory or pre-commercial scale. This short review paper describes the most significant manufacturing and bio-process challenges inherent in the commercialisation and exploitation of the exciting results emerging from the biological and clinical laboratories exploring tissue engineering and regenerative medicine. A three-generation road map of the industry has been used to structure a view of these challenges and to define where the manufacturing community can contribute to the commercial success of the products from these emerging fields. The first-generation industry is characterised by its demonstrated clinical applications and products in the marketplace, the second is characterised by emerging clinical applications, and the third generation is characterised by aspirational clinical applications. The paper focuses on the cost reduction requirement of the first generation of the industry to allow more market penetration and consequent patient impact. It indicates the technological requirements, for instance the creation of three-dimensional tissue structures, and value chain issues in the second generation of the industry. The third-generation industry challenges lie in fundamental biological and clinical science. The paper sets out a road map of these generations to identify areas for research.

  17. A new approach to the statistical treatment of 2D-maps in proteomics using fuzzy logic.

    Science.gov (United States)

    Marengo, Emilio; Robotti, Elisa; Gianotti, Valentina; Righetti, Pier Giorgio

    2003-01-01

    A new approach to the statistical treatment of 2D-maps has been developed. The method is based on the use of fuzzy logic and makes it possible to take into account the typically low reproducibility of 2D-maps. In this approach the signal corresponding to the presence of proteins on the 2D-maps is substituted with probability functions centred on the signal itself. The standard deviation of the bidimensional Gaussian probability function employed to blur the signal makes it possible to assign different uncertainties to the two electrophoretic dimensions. The effects of changing the standard deviation and the digitalisation resolution are investigated.
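
    Replacing a point signal with a bivariate Gaussian probability function, with a separate standard deviation per electrophoretic dimension, can be sketched as follows. The spot position, grid size and sigmas are illustrative values, not taken from the paper.

```python
import math

def gaussian_spot(x0, y0, sx, sy, xs, ys):
    """Bivariate Gaussian probability function centred on a spot at
    (x0, y0), with separate standard deviations sx and sy for the two
    electrophoretic dimensions (illustrative parameters)."""
    norm = 1.0 / (2.0 * math.pi * sx * sy)
    return [[norm * math.exp(-((x - x0) ** 2 / (2 * sx ** 2)
                               + (y - y0) ** 2 / (2 * sy ** 2)))
             for x in xs] for y in ys]

# Replace a hypothetical point signal at (5, 5) with a fuzzy surface;
# a larger sigma along x encodes lower reproducibility in that dimension.
xs = ys = range(11)
surface = gaussian_spot(5, 5, 2.0, 1.0, xs, ys)
print(surface[5][5])  # peak value at the spot centre, 1/(2*pi*sx*sy)
```

    Larger sigmas spread the same unit of probability over more of the map, which is exactly how the fuzzy approach absorbs spot-position variability between gels.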

  18. CerebroMatic: A Versatile Toolbox for Spline-Based MRI Template Creation.

    Science.gov (United States)

    Wilke, Marko; Altaye, Mekibib; Holland, Scott K

    2017-01-01

    Brain image spatial normalization and tissue segmentation rely on prior tissue probability maps. Appropriately selecting these tissue maps becomes particularly important when investigating "unusual" populations, such as young children or elderly subjects. When creating such priors, the disadvantage of applying more deformation must be weighed against the benefit of achieving a crisper image. We have previously suggested that statistically modeling demographic variables, instead of simply averaging images, is advantageous. Both aspects (more vs. less deformation and modeling vs. averaging) were explored here. We used imaging data from 1914 subjects, aged 13 months to 75 years, and employed multivariate adaptive regression splines to model the effects of age, field strength, gender, and data quality. Within the spm/cat12 framework, we compared an affine-only with a low- and a high-dimensional warping approach. As expected, more deformation on the individual level results in lower group dissimilarity. Consequently, effects of age in particular are less apparent in the resulting tissue maps when using a more extensive deformation scheme. Using statistically-described parameters, high-quality tissue probability maps could be generated for the whole age range; they are consistently closer to a gold standard than conventionally-generated priors based on 25, 50, or 100 subjects. Distinct effects of field strength, gender, and data quality were seen. We conclude that an extensive matching for generating tissue priors may model much of the variability inherent in the dataset which is then not contained in the resulting priors. Further, the statistical description of relevant parameters (using regression splines) allows for the generation of high-quality tissue probability maps while controlling for known confounds. The resulting CerebroMatic toolbox is available for download at http://irc.cchmc.org/software/cerebromatic.php.

  19. Tumor spatial heterogeneity in myxoid-containing soft tissue using texture analysis of diffusion-weighted MRI.

    Directory of Open Access Journals (Sweden)

    Hyun Su Kim

    Full Text Available The objective of this study was to examine the tumor spatial heterogeneity in myxoid-containing soft-tissue tumors (STTs) using texture analysis of diffusion-weighted imaging (DWI). A total of 40 patients with myxoid-containing STTs (23 benign and 17 malignant) were included in this study. The region of interest (ROI) was manually drawn on the apparent diffusion coefficient (ADC) map. For texture analysis, the global (mean, standard deviation, skewness, and kurtosis), regional (intensity variability and size-zone variability), and local features (energy, entropy, correlation, contrast, homogeneity, variance, and maximum probability) were extracted from the ADC map. Student's t-test was used to test the difference between group means. Analysis of covariance (ANCOVA) was performed with adjustments for age, sex, and tumor volume. Receiver operating characteristic (ROC) analysis was performed to compare diagnostic performances. Malignant myxoid-containing STTs had significantly higher kurtosis (P = 0.040), energy (P = 0.034), correlation (P < 0.001), and homogeneity (P = 0.003), but significantly lower contrast (P < 0.001) and variance (P = 0.001), compared with benign myxoid-containing STTs. Contrast showed the highest area under the curve (AUC = 0.923, P < 0.001), sensitivity (94.12%), and specificity (86.96%). Our results reveal the potential utility of texture analysis of ADC maps for differentiating benign and malignant myxoid-containing STTs.
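
    The local features named above (energy, entropy, contrast, homogeneity, maximum probability) are conventionally derived from a grey-level co-occurrence matrix (GLCM). A minimal sketch, using a tiny hypothetical 4-level patch and a single horizontal neighbour offset (real analyses average over several offsets):

```python
import math

def glcm_features(img, levels):
    """GLCM for the horizontal neighbour offset, plus a few of the
    local texture features named in the abstract. A minimal sketch,
    not the full pipeline of the study."""
    glcm = [[0.0] * levels for _ in range(levels)]
    n = 0
    for row in img:
        for a, b in zip(row, row[1:]):   # co-occurring pixel pairs
            glcm[a][b] += 1
            n += 1
    p = [[v / n for v in row] for row in glcm]
    feats = {"contrast": 0.0, "homogeneity": 0.0, "energy": 0.0,
             "entropy": 0.0, "max_probability": 0.0}
    for i in range(levels):
        for j in range(levels):
            pij = p[i][j]
            feats["contrast"] += pij * (i - j) ** 2
            feats["homogeneity"] += pij / (1 + abs(i - j))
            feats["energy"] += pij ** 2
            if pij > 0:
                feats["entropy"] -= pij * math.log2(pij)
            feats["max_probability"] = max(feats["max_probability"], pij)
    return feats

# Toy 4-level "ADC map" patch (hypothetical values).
patch = [[0, 0, 1, 1],
         [0, 0, 1, 1],
         [2, 2, 3, 3],
         [2, 2, 3, 3]]
print(glcm_features(patch, 4))
```

    High contrast rewards co-occurring pixels with very different grey levels, while homogeneity and energy reward locally uniform texture, which is why the two move in opposite directions between tumor groups in the study.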

  20. Mapping closure for probability distribution function in low frequency magnetized plasma turbulence

    International Nuclear Information System (INIS)

    Das, A.; Kaw, P.

    1995-01-01

    Recent numerical studies on the Hasegawa--Mima equation and its variants describing low frequency magnetized plasma turbulence indicate that the potential fluctuations have a Gaussian character whereas the vorticity exhibits non-Gaussian features. A theoretical interpretation for this observation using the recently developed mapping closure technique [Chen, Chen, and Kraichnan, Phys. Rev. Lett. 63, 2657 (1989)] has been provided here. It has been shown that non-Gaussian statistics for the vorticity arises because of a competition between nonlinear straining and diffusive damping whereas the Gaussianity of the statistics of φ arises because the only significant nonlinearity is associated with divergence free convection, which produces no strain terms. copyright 1995 American Institute of Physics

  1. The role of ergodicity and mixing in the central limit theorem for Casati-Prosen triangle map variables

    Energy Technology Data Exchange (ETDEWEB)

    Queiros, S.M. Duarte [Unilever R and D Port Sunlight, Quarry Road East, CH63 3JW (United Kingdom)], E-mail: silvio.queiros@unilever.com

    2009-04-13

    In this Letter we analyse the behaviour of the probability density function of the sum of N deterministic variables generated from the triangle map of Casati-Prosen. For the case in which the map is both ergodic and mixing, the resulting probability density function quickly converges to the Normal distribution. However, when the map is weakly chaotic, and fuzzily not mixing, the resulting probability density functions are described by power laws. Moreover, contrary to what would be expected, as the number of added variables N increases the distance to the Gaussian distribution increases. This behaviour goes against the standard central limit theorem. By extrapolation of our finite-size results we predict that in the limit of N going to infinity the distribution has the same asymptotic decay as a Lorentzian (or a q-Gaussian with q = 2).
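
    The general contrast at stake (not the triangle-map dynamics themselves, which require the deterministic map) can be illustrated with random variables: sums of bounded variables converge to a Gaussian, while sums of Lorentzian (Cauchy) variables remain Lorentzian no matter how large N is.

```python
import math
import random
import statistics

random.seed(1)

def normalized_sum(draw, n):
    """Sum n iid variables from `draw` and rescale by sqrt(n)."""
    return sum(draw() for _ in range(n)) / math.sqrt(n)

# CLT case: sums of bounded, zero-mean variables approach a Gaussian
# with standard deviation sqrt(Var) = sqrt(1/3) here.
uniform_sums = [normalized_sum(lambda: random.uniform(-1, 1), 100)
                for _ in range(5000)]

# Heavy-tailed case: Cauchy variables have no variance, and the mean
# of n of them is again standard Cauchy, however large n is.
cauchy_sums = [sum(math.tan(math.pi * (random.random() - 0.5))
                   for _ in range(100)) / 100
               for _ in range(5000)]

print(statistics.stdev(uniform_sums))   # close to sqrt(1/3)
```

    Note the different normalizations (1/sqrt(n) versus 1/n): each family has its own stable attractor, which is the sense in which a Lorentzian limit evades the standard central limit theorem.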

  2. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  3. Normal tissue complication probability: Does simultaneous integrated boost intensity-modulated radiotherapy score over other techniques in treatment of prostate adenocarcinoma

    Directory of Open Access Journals (Sweden)

    Jothy Basu K

    2009-01-01

    Full Text Available Aim: The main objective of this study was to analyze the radiobiological effect of different treatment strategies on high-risk prostate adenocarcinoma. Materials and Methods: Ten cases of high-risk prostate adenocarcinoma were selected for this dosimetric study. Four different treatment strategies used for treating prostate cancer were compared: the conventional four-field box technique covering prostate and nodal volumes followed by a three-field conformal boost (3D + 3DCRT), the four-field box technique followed by an intensity-modulated radiotherapy (IMRT) boost (3D + IMRT), IMRT followed by an IMRT boost (IMRT + IMRT), and simultaneous integrated boost IMRT (SIBIMRT). These were compared in terms of tumor control probability (TCP) and normal tissue complication probability (NTCP). The dose prescription, except for SIBIMRT, was 45 Gy in 25 fractions for the prostate and nodal volumes in the initial phase and 27 Gy in 15 fractions for the prostate in the boost phase. For SIBIMRT, equivalent doses were calculated using the biologically equivalent dose, assuming an α/β ratio of 1.5 Gy, with a dose prescription of 60.75 Gy for the gross tumor volume (GTV) and 45 Gy for the clinical target volume in 25 fractions. IMRT plans were made with 15-MV equispaced seven coplanar fields. NTCP was calculated using the Lyman-Kutcher-Burman (LKB) model. Results: An NTCP of 10.7 ± 0.99%, 8.36 ± 0.66%, 6.72 ± 0.85%, and 1.45 ± 0.11% for the bladder and 14.9 ± 0.99%, 14.04 ± 0.66%, 11.38 ± 0.85%, and 5.12 ± 0.11% for the rectum was seen with 3D + 3DCRT, 3D + IMRT, IMRT + IMRT, and SIBIMRT, respectively. Conclusions: SIBIMRT had the lowest NTCP of all the strategies, with a reduced treatment time (3 weeks less). It should be the technique of choice for dose escalation in prostate carcinoma.
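
    The LKB model used here reduces a dose-volume histogram (DVH) to a generalized equivalent uniform dose (gEUD) and maps it through the standard normal CDF: NTCP = Phi((gEUD - TD50)/(m*TD50)), with gEUD = (sum_i v_i * D_i^(1/n))^n. A minimal sketch; the DVH and parameter values below are illustrative, not the ones fitted in this study.

```python
import math

def lkb_ntcp(dvh, td50, m, n):
    """Lyman-Kutcher-Burman NTCP from a differential DVH given as
    (dose_Gy, fractional_volume) pairs. gEUD reduces the DVH to an
    equivalent uniform dose; Phi is the standard normal CDF."""
    geud = sum(v * d ** (1.0 / n) for d, v in dvh) ** n
    t = (geud - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# Hypothetical rectal DVH and illustrative LKB parameters (small n
# makes the organ behave serially, emphasising the hottest dose bin).
dvh = [(20.0, 0.4), (40.0, 0.35), (65.0, 0.25)]
print(lkb_ntcp(dvh, td50=76.9, m=0.13, n=0.09))
```

    By construction, a uniform dose of exactly TD50 to the whole volume yields NTCP = 0.5, and the slope parameter m controls how steeply the complication probability rises around that point.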

  4. Acoustic methods for cavitation mapping in biomedical applications

    Science.gov (United States)

    Wan, M.; Xu, S.; Ding, T.; Hu, H.; Liu, R.; Bai, C.; Lu, S.

    2015-12-01

    In recent years, cavitation has been increasingly utilized in a wide range of applications in the biomedical field. Monitoring the spatial-temporal evolution of cavitation bubbles is of great significance for efficiency and safety in biomedical applications. In this paper, several acoustic methods for cavitation mapping, proposed or modified on the basis of existing work, are presented. The proposed novel ultrasound line-by-line/plane-by-plane method can depict the cavitation bubble distribution with high spatial and temporal resolution and may be developed into a potential standard 2D/3D cavitation field mapping method. The modified ultrafast active cavitation mapping, based upon plane-wave transmission and reception as well as bubble wavelet and pulse inversion techniques, can clearly enhance the cavitation-to-tissue ratio in tissue and further assist in monitoring cavitation-mediated therapy with good spatial and temporal resolution. The methods presented in this paper provide a foundation for promoting the research and development of cavitation imaging in non-transparent media.

  5. Laboratory Workflow Analysis of Culture of Periprosthetic Tissues in Blood Culture Bottles.

    Science.gov (United States)

    Peel, Trisha N; Sedarski, John A; Dylla, Brenda L; Shannon, Samantha K; Amirahmadi, Fazlollaah; Hughes, John G; Cheng, Allen C; Patel, Robin

    2017-09-01

    Culture of periprosthetic tissue specimens in blood culture bottles is more sensitive than conventional techniques, but the impact on laboratory workflow has yet to be addressed. Herein, we examined the impact of culture of periprosthetic tissues in blood culture bottles on laboratory workflow and cost. The workflow was process mapped, decision tree models were constructed using probabilities of positive and negative cultures drawn from our published study (T. N. Peel, B. L. Dylla, J. G. Hughes, D. T. Lynch, K. E. Greenwood-Quaintance, A. C. Cheng, J. N. Mandrekar, and R. Patel, mBio 7:e01776-15, 2016, https://doi.org/10.1128/mBio.01776-15), and the processing times and resource costs from the laboratory staff time viewpoint were used to compare periprosthetic tissue culture processes using conventional techniques with culture in blood culture bottles. Sensitivity analysis was performed using various rates of positive cultures. Annualized labor savings were estimated based on salary costs from the U.S. Labor Bureau for laboratory staff. The model demonstrated a 60.1% reduction in mean total staff time with the adoption of tissue inoculation into blood culture bottles compared to conventional techniques (mean ± standard deviation, 30.7 ± 27.6 versus 77.0 ± 35.3 h per month, respectively; P < 0.001). The estimated annualized labor cost savings of culture using blood culture bottles was $10,876.83 (±$337.16). Sensitivity analysis was performed using various rates of culture positivity (5 to 50%). Culture in blood culture bottles was cost-effective, based on the estimated labor cost savings of $2,132.71 for each percent increase in test accuracy. In conclusion, culture of periprosthetic tissue in blood culture bottles is not only more accurate than but is also cost-saving compared to conventional culture methods. Copyright © 2017 American Society for Microbiology.
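
    At its core, the decision-tree comparison is a probability-weighted expected-value computation over the positive/negative branches, repeated across positivity rates for the sensitivity analysis. A sketch with made-up per-specimen times (only the structure mirrors the study, not the numbers):

```python
def expected_time(p_positive, t_positive, t_negative):
    """Expected hands-on time per specimen: a probability-weighted
    average over the positive/negative branches of the decision tree."""
    return p_positive * t_positive + (1 - p_positive) * t_negative

# Hypothetical per-specimen processing minutes for each workflow.
for p in (0.05, 0.25, 0.50):        # sensitivity analysis over positivity
    conventional = expected_time(p, t_positive=60, t_negative=30)
    bottles = expected_time(p, t_positive=25, t_negative=10)
    print(p, conventional, bottles, conventional - bottles)
```

    Multiplying the per-specimen difference by monthly specimen volume and an hourly wage gives the annualized labor-saving figure reported in studies of this kind.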

  6. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
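
    A classic example of the kind alluded to above is the derangement problem: the probability that a random permutation has no fixed point tends to 1/e. A quick Monte Carlo check (simulation parameters are arbitrary):

```python
import math
import random

random.seed(0)

def no_fixed_point(n):
    """Shuffle 0..n-1 and report whether no element stays in place."""
    perm = list(range(n))
    random.shuffle(perm)
    return all(perm[i] != i for i in range(n))

# The proportion of derangements among random permutations tends to 1/e
# (the limit is already excellent for n = 10).
trials = 20000
estimate = sum(no_fixed_point(10) for _ in range(trials)) / trials
print(estimate, 1 / math.e)
```

    The exact probability is sum_{k=0}^{n} (-1)^k / k!, the truncated Taylor series of e^{-1}, which converges to 1/e extremely fast in n.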

  7. The probability and severity of decompression sickness

    Science.gov (United States)

    Hada, Ethan A.; Vann, Richard D.; Denoble, Petar J.

    2017-01-01

    Decompression sickness (DCS), which is caused by inert gas bubbles in tissues, is an injury of concern for scuba divers, compressed air workers, astronauts, and aviators. Case reports for 3322 air and N2-O2 dives, resulting in 190 DCS events, were retrospectively analyzed and the outcomes were scored as (1) serious neurological, (2) cardiopulmonary, (3) mild neurological, (4) pain, (5) lymphatic or skin, and (6) constitutional or nonspecific manifestations. Following standard U.S. Navy medical definitions, the data were grouped into mild—Type I (manifestations 4–6)–and serious–Type II (manifestations 1–3). Additionally, we considered an alternative grouping of mild–Type A (manifestations 3–6)–and serious–Type B (manifestations 1 and 2). The current U.S. Navy guidance allows for a 2% probability of mild DCS and a 0.1% probability of serious DCS. We developed a hierarchical trinomial (3-state) probabilistic DCS model that simultaneously predicts the probability of mild and serious DCS given a dive exposure. Both the Type I/II and Type A/B discriminations of mild and serious DCS resulted in a highly significant (p probability of ‘mild’ DCS resulted in a longer allowable bottom time for the same 2% limit. However, for the 0.1% serious DCS limit, we found a vastly decreased allowable bottom dive time for all dive depths. If the Type A/B scoring was assigned to outcome severity, the no decompression limits (NDL) for air dives were still controlled by the acceptable serious DCS risk limit rather than the acceptable mild DCS risk limit. However, in this case, longer NDL limits were allowed than with the Type I/II scoring. The trinomial model mild and serious probabilities agree reasonably well with the current air NDL only with the Type A/B scoring and when 0.2% risk of serious DCS is allowed. PMID:28296928

  8. An analytical calculation of neighbourhood order probabilities for high dimensional Poissonian processes and mean field models

    International Nuclear Information System (INIS)

    Tercariol, Cesar Augusto Sangaletti; Kiipper, Felipe de Moura; Martinez, Alexandre Souto

    2007-01-01

    Consider that the coordinates of N points are randomly generated along the edges of a d-dimensional hypercube (random point problem). The probability P^(d,N)_{m,n} that an arbitrary point is the mth nearest neighbour to its own nth nearest neighbour (Cox probabilities) plays an important role in spatial statistics. It has also been useful in the description of physical processes in disordered media. Here we propose a simpler derivation of the Cox probabilities, in which we stress the role played by the system dimensionality d. In the limit d → ∞, the distances between pairs of points become independent (random link model) and closed analytical forms for the neighbourhood probabilities are obtained, both in the thermodynamic limit and for finite-size systems. Breaking the distance symmetry constraint leads to the random map model, for which the Cox probabilities are obtained for two cases: whether a point is its own nearest neighbour or not.

  9. Large area synchrotron X-ray fluorescence mapping of biological samples

    International Nuclear Information System (INIS)

    Kempson, I.; Thierry, B.; Smith, E.; Gao, M.; De Jonge, M.

    2014-01-01

    Large area mapping of inorganic material in biological samples has suffered severely from prohibitively long acquisition times. With the advent of new detector technology we can now generate statistically relevant information for studying cell populations, intercellular variability and bioinorganic chemistry in large specimens. We have been implementing ultrafast synchrotron-based XRF mapping, afforded by the MAIA detector, for large area mapping of biological material. For example, a 2.5 million pixel map can be acquired in 3 hours, compared to a typical synchrotron XRF set-up needing over 1 month of uninterrupted beamtime. Of particular focus to us is the fate of metals and nanoparticles in cells, 3D tissue models and animal tissues. The large area scanning has for the first time provided statistically significant information on sufficiently large numbers of cells to provide data on intercellular variability in uptake of nanoparticles. Techniques such as flow cytometry generally require analysis of thousands of cells for statistically meaningful comparison, due to the large degree of variability. Large area XRF now gives comparable information in a quantifiable manner. Furthermore, we can now image localised deposition of nanoparticles in tissues that would be highly improbable to 'find' by typical XRF imaging. In addition, the ultrafast nature also makes it viable to conduct 3D XRF tomography over large dimensions. This technology opens new opportunities in biomonitoring and understanding metal and nanoparticle fate ex vivo. Following from this is the extension to molecular imaging through specific antibody-targeted nanoparticles to label specific tissues and monitor cellular processes or biological consequences.

  10. A stereotaxic, population-averaged T1w ovine brain atlas including cerebral morphology and tissue volumes

    Directory of Open Access Journals (Sweden)

    Björn eNitzsche

    2015-06-01

    Full Text Available Standard stereotaxic reference systems play a key role in human brain studies. Stereotaxic coordinate systems have also been developed for experimental animals, including non-human primates, dogs and rodents. However, they are lacking for other species relevant to experimental neuroscience, including sheep. Here, we present a spatial, unbiased ovine brain template with tissue probability maps (TPM) that offer a detailed stereotaxic reference frame for anatomical features and localization of brain areas, thereby enabling inter-individual and cross-study comparability. Three-dimensional data sets from healthy adult Merino sheep (Ovis orientalis aries; 12 ewes and 26 neutered rams) were acquired on a 1.5 T Philips MRI scanner using a T1w sequence. Data were averaged by linear and non-linear registration algorithms. Moreover, animals were subjected to detailed brain volume analysis, including examinations with respect to body weight, age and sex. The created T1w brain template provides an appropriate population-averaged ovine brain anatomy in a spatial standard coordinate system. Additionally, TPM for gray matter (GM) and white matter (WM) as well as cerebrospinal fluid (CSF) classification enabled automatic prior-based tissue segmentation using statistical parametric mapping (SPM). Overall, a positive correlation of GM volume and body weight explained about 15% of the variance of GM, while a positive correlation between WM and age was found. Absolute tissue volume differences were not detected; however, ewes showed significantly more GM per body weight compared with neutered rams. The created framework, including the spatial brain template and TPM, represents a useful tool for unbiased automatic image preprocessing and morphological characterization in sheep. Therefore, the reported results may serve as a starting point for further experimental and/or translational research aiming at in vivo analysis in this species.
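
    Prior-based tissue segmentation, as used here and in the other TPM records above, is Bayesian at the voxel level: a Gaussian intensity likelihood per tissue class is multiplied by the TPM value (the spatial prior) and normalized. A one-voxel sketch with invented intensities, priors and class parameters (not the atlas values):

```python
import math

def posterior_tissue(intensity, priors, means, sds):
    """Posterior tissue probabilities for one voxel: Gaussian intensity
    likelihood per class times the spatial prior from the TPM, then
    normalized. Illustrative parameters, not from any atlas."""
    post = {}
    for tissue, prior in priors.items():
        mu, sd = means[tissue], sds[tissue]
        like = math.exp(-((intensity - mu) ** 2) / (2 * sd ** 2)) / sd
        post[tissue] = prior * like
    z = sum(post.values())
    return {t: p / z for t, p in post.items()}

voxel_prior = {"GM": 0.5, "WM": 0.3, "CSF": 0.2}  # TPM value at this voxel
means = {"GM": 0.55, "WM": 0.8, "CSF": 0.2}       # class intensity means
sds = {"GM": 0.08, "WM": 0.06, "CSF": 0.1}
labels = posterior_tissue(0.58, voxel_prior, means, sds)
print(max(labels, key=labels.get))  # "GM" for these illustrative values
```

    This is why the quality of the priors matters so much in the records above: where the intensity likelihoods of two classes overlap, the TPM value is what tips the posterior.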

  11. Tissue and plasma enzyme activities in juvenile green iguanas.

    Science.gov (United States)

    Wagner, R A; Wetzel, R

    1999-02-01

    To determine activities of intracellular enzymes in 8 major organs in juvenile green iguanas and to compare tissue and plasma activities, 6 green iguanas were studied. High values may not always indicate overt muscle disease. The AMS activity may be specific for the pancreas, but the wide range of plasma activity would likely limit its diagnostic usefulness. Activities of AST and LDH may reflect tissue damage or inflammation, but probably do not reflect damage to specific tissues or organs.

  12. MCSLTT, Monte Carlo Simulation of Light Transport in Tissue

    International Nuclear Information System (INIS)

    2008-01-01

    Description of program or function: Understanding light-tissue interaction is fundamental in the field of Biomedical Optics. It has important implications for both therapeutic and diagnostic technologies. In this program, light transport in scattering tissue is modeled by absorption and scattering events as each photon travels through the tissue. The path of each photon is determined statistically by calculating probabilities of scattering and absorption. Other measured quantities are total reflected light, total transmitted light, and total heat absorbed
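
    The stated scheme, in which each photon's path is decided by drawing step lengths and absorption/scattering events from the relevant probabilities, can be illustrated with a deliberately simplified 1D slab walk (this is a sketch of the general Monte Carlo idea, not the 3D code of the package itself; all coefficients are invented):

```python
import random

random.seed(3)

def simulate_photons(n_photons, mu_a, mu_s, thickness):
    """Minimal 1D random walk of photons through a slab: each step
    length is exponential in the total interaction coefficient, and at
    every interaction the photon is absorbed with probability
    mu_a / (mu_a + mu_s), otherwise scattered."""
    mu_t = mu_a + mu_s
    absorbed = reflected = transmitted = 0
    for _ in range(n_photons):
        z, direction = 0.0, 1.0
        while True:
            z += direction * random.expovariate(mu_t)  # free path
            if z < 0.0:
                reflected += 1      # escaped back through the surface
                break
            if z > thickness:
                transmitted += 1    # escaped through the far side
                break
            if random.random() < mu_a / mu_t:
                absorbed += 1       # interaction was an absorption
                break
            direction = random.choice((-1.0, 1.0))     # isotropic in 1D
    return absorbed, reflected, transmitted

counts = simulate_photons(10000, mu_a=0.1, mu_s=10.0, thickness=1.0)
print(counts)   # (absorbed, reflected, transmitted) tallies
```

    The three tallies are exactly the "total heat absorbed", "total reflected light" and "total transmitted light" quantities the program description mentions, here in their crudest possible geometry.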

  13. Going beyond the flood insurance rate map: insights from flood hazard map co-production

    Directory of Open Access Journals (Sweden)

    A. Luke

    2018-04-01

    Full Text Available Flood hazard mapping in the United States (US) is deeply tied to the National Flood Insurance Program (NFIP). Consequently, publicly available flood maps provide essential information for insurance purposes, but they do not necessarily provide relevant information for non-insurance aspects of flood risk management (FRM) such as public education and emergency planning. Recent calls for flood hazard maps that support a wider variety of FRM tasks highlight the need to deepen our understanding about the factors that make flood maps useful and understandable for local end users. In this study, social scientists and engineers explore opportunities for improving the utility and relevance of flood hazard maps through the co-production of maps responsive to end users' FRM needs. Specifically, two-dimensional flood modeling produced a set of baseline hazard maps for stakeholders of the Tijuana River valley, US, and Los Laureles Canyon in Tijuana, Mexico. Focus groups with natural resource managers, city planners, emergency managers, academia, non-profits, and community leaders refined the baseline hazard maps by triggering additional modeling scenarios and map revisions. Several important end user preferences emerged, such as (1) legends that frame flood intensity both qualitatively and quantitatively, and (2) flood scenario descriptions that report flood magnitude in terms of rainfall, streamflow, and its relation to an historic event. Regarding desired hazard map content, end users' requests revealed general consistency with mapping needs reported in European studies and guidelines published in Australia. However, requested map content that is not commonly produced included (1) standing water depths following the flood, (2) the erosive potential of flowing water, and (3) pluvial flood hazards, or flooding caused directly by rainfall. We conclude that the relevance and utility of commonly produced flood hazard maps can be most improved by illustrating

  14. Normal tissue complication probability model parameter estimation for xerostomia in head and neck cancer patients based on scintigraphy and quality of life assessments

    International Nuclear Information System (INIS)

    Lee, Tsair-Fwu; Chao, Pei-Ju; Wang, Hung-Yu; Hsu, Hsuan-Chih; Chang, PaoShu; Chen, Wen-Cheng

    2012-01-01

    With advances in modern radiotherapy (RT), many patients with head and neck (HN) cancer can be effectively cured. However, xerostomia is a common complication in patients after RT for HN cancer. The purpose of this study was to use the Lyman-Kutcher-Burman (LKB) model to derive parameters for the normal tissue complication probability (NTCP) for xerostomia based on scintigraphy assessments and quality of life (QoL) questionnaires. We performed validation tests of the Quantitative Analysis of Normal Tissue Effects in the Clinic (QUANTEC) guidelines against prospectively collected QoL and salivary scintigraphic data. Thirty-one patients with HN cancer were enrolled. Salivary excretion factors (SEFs) measured by scintigraphy and QoL data from self-reported questionnaires were used for NTCP modeling to describe the incidence of grade 3+ xerostomia. The NTCP parameters estimated from the QoL and SEF datasets were compared. Model performance was assessed using Pearson's chi-squared test, Nagelkerke's R2, the area under the receiver operating characteristic curve, and the Hosmer-Lemeshow test. The negative predictive value (NPV) was checked for the rate of correctly predicting the lack of incidence. Pearson's chi-squared test was used to test the goodness of fit and association. Using the LKB NTCP model and assuming n=1, the dose for uniform irradiation of the whole or partial volume of the parotid gland that results in a 50% probability of a complication (TD50) and the slope of the dose-response curve (m) were determined from the QoL and SEF datasets, respectively. The NTCP-fitted parameters for local disease were TD50=43.6 Gy and m=0.18 with the SEF data, and TD50=44.1 Gy and m=0.11 with the QoL data. The rate of grade 3+ xerostomia for treatment plans meeting the QUANTEC guidelines was specifically predicted, with a NPV of 100%, using either the QoL or SEF dataset. Our study shows the agreement between the NTCP parameter modeling based on SEF and

  15. Normal tissue complication probability model parameter estimation for xerostomia in head and neck cancer patients based on scintigraphy and quality of life assessments

    Science.gov (United States)

    2012-01-01

    Background With advances in modern radiotherapy (RT), many patients with head and neck (HN) cancer can be effectively cured. However, xerostomia is a common complication in patients after RT for HN cancer. The purpose of this study was to use the Lyman–Kutcher–Burman (LKB) model to derive parameters for the normal tissue complication probability (NTCP) for xerostomia based on scintigraphy assessments and quality of life (QoL) questionnaires. We performed validation tests of the Quantitative Analysis of Normal Tissue Effects in the Clinic (QUANTEC) guidelines against prospectively collected QoL and salivary scintigraphic data. Methods Thirty-one patients with HN cancer were enrolled. Salivary excretion factors (SEFs) measured by scintigraphy and QoL data from self-reported questionnaires were used for NTCP modeling to describe the incidence of grade 3+ xerostomia. The NTCP parameters estimated from the QoL and SEF datasets were compared. Model performance was assessed using Pearson’s chi-squared test, Nagelkerke’s R2, the area under the receiver operating characteristic curve, and the Hosmer–Lemeshow test. The negative predictive value (NPV) was checked for the rate of correctly predicting the lack of incidence. Pearson’s chi-squared test was used to test the goodness of fit and association. Results Using the LKB NTCP model and assuming n=1, the dose for uniform irradiation of the whole or partial volume of the parotid gland that results in 50% probability of a complication (TD50) and the slope of the dose–response curve (m) were determined from the QoL and SEF datasets, respectively. The NTCP-fitted parameters for local disease were TD50=43.6 Gy and m=0.18 with the SEF data, and TD50=44.1 Gy and m=0.11 with the QoL data. The rate of grade 3+ xerostomia for treatment plans meeting the QUANTEC guidelines was specifically predicted, with a NPV of 100%, using either the QoL or SEF dataset. Conclusions Our study shows the agreement between the NTCP

  16. Normal tissue complication probability model parameter estimation for xerostomia in head and neck cancer patients based on scintigraphy and quality of life assessments

    Directory of Open Access Journals (Sweden)

    Lee Tsair-Fwu

    2012-12-01

    Full Text Available Abstract Background With advances in modern radiotherapy (RT), many patients with head and neck (HN) cancer can be effectively cured. However, xerostomia is a common complication in patients after RT for HN cancer. The purpose of this study was to use the Lyman–Kutcher–Burman (LKB) model to derive parameters for the normal tissue complication probability (NTCP) for xerostomia based on scintigraphy assessments and quality of life (QoL) questionnaires. We performed validation tests of the Quantitative Analysis of Normal Tissue Effects in the Clinic (QUANTEC) guidelines against prospectively collected QoL and salivary scintigraphic data. Methods Thirty-one patients with HN cancer were enrolled. Salivary excretion factors (SEFs) measured by scintigraphy and QoL data from self-reported questionnaires were used for NTCP modeling to describe the incidence of grade 3+ xerostomia. The NTCP parameters estimated from the QoL and SEF datasets were compared. Model performance was assessed using Pearson’s chi-squared test, Nagelkerke’s R2, the area under the receiver operating characteristic curve, and the Hosmer–Lemeshow test. The negative predictive value (NPV) was checked for the rate of correctly predicting the lack of incidence. Pearson’s chi-squared test was used to test the goodness of fit and association. Results Using the LKB NTCP model and assuming n=1, the dose for uniform irradiation of the whole or partial volume of the parotid gland that results in 50% probability of a complication (TD50) and the slope of the dose–response curve (m) were determined from the QoL and SEF datasets, respectively. The NTCP-fitted parameters for local disease were TD50=43.6 Gy and m=0.18 with the SEF data, and TD50=44.1 Gy and m=0.11 with the QoL data. The rate of grade 3+ xerostomia for treatment plans meeting the QUANTEC guidelines was specifically predicted, with a NPV of 100%, using either the QoL or SEF dataset. Conclusions Our study shows the agreement

  17. Body maps on the human genome.

    Science.gov (United States)

    Cherniak, Christopher; Rodriguez-Esteban, Raul

    2013-12-20

    Chromosomes have territories, or preferred locales, in the cell nucleus. When these sites are taken into account, some large-scale structure of the human genome emerges. The synoptic picture is that genes highly expressed in particular topologically compact tissues are not randomly distributed on the genome. Rather, such tissue-specific genes tend to map somatotopically onto the complete chromosome set. They seem to form a "genome homunculus": a multi-dimensional, genome-wide body representation extending across chromosome territories of the entire sperm-cell nucleus. The antero-posterior axis of the body significantly corresponds to the head-tail axis of the nucleus, and the dorso-ventral body axis to the central-peripheral nucleus axis. This large-scale genomic structure includes thousands of genes. One rationale for a homuncular genome structure would be to minimize connection costs in genetic networks. Somatotopic maps in cerebral cortex have been reported for over a century.

  18. Probability of Unmixed Young Groundwater (defined using chlorofluorocarbon-11 concentrations and tritium activities) in the Eagle River Watershed Valley-Fill Aquifer, Eagle County, North-Central Colorado, 2006-2007

    Science.gov (United States)

    Rupert, Michael G.; Plummer, Niel

    2009-01-01

    This raster data set delineates the predicted probability of unmixed young groundwater (defined using chlorofluorocarbon-11 concentrations and tritium activities) in groundwater in the Eagle River watershed valley-fill aquifer, Eagle County, North-Central Colorado, 2006-2007. This data set was developed by a cooperative project between the U.S. Geological Survey, Eagle County, the Eagle River Water and Sanitation District, the Town of Eagle, the Town of Gypsum, and the Upper Eagle Regional Water Authority. This project was designed to evaluate potential land-development effects on groundwater and surface-water resources so that informed land-use and water management decisions can be made. This groundwater probability map and its associated probability maps were developed as follows: (1) A point data set of wells with groundwater quality and groundwater age data was overlaid with thematic layers of anthropogenic (related to human activities) and hydrogeologic data by using a geographic information system (GIS) to assign each well values for depth to groundwater, distance to major streams and canals, distance to gypsum beds, precipitation, soils, and well depth. These data then were downloaded to a statistical software package for analysis by logistic regression. (2) Statistical models predicting the probability of elevated nitrate concentrations, the probability of unmixed young water (using chlorofluorocarbon-11 concentrations and tritium activities), and the probability of elevated volatile organic compound concentrations were developed using logistic regression techniques. (3) The statistical models were entered into a GIS and the probability map was constructed.
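
    Step 2 of the workflow above is ordinary logistic regression; the sketch below only illustrates the functional form, with hypothetical coefficients and predictors rather than the fitted USGS model values:

```python
import math

def logistic_probability(intercept, coeffs, predictors):
    """Logistic-regression probability for one well/raster cell:
    p = 1 / (1 + exp(-(b0 + sum(b_i * x_i)))).
    Coefficients here are hypothetical, not the fitted USGS values."""
    z = intercept + sum(b * x for b, x in zip(coeffs, predictors))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical predictors: depth to groundwater (m), distance to
# nearest major stream (m), well depth (m)
p = logistic_probability(2.0, [-0.05, -0.001, -0.02], [10.0, 500.0, 30.0])
print(round(p, 3))  # -> 0.599
```

    Evaluating this function over every cell of the predictor rasters yields the probability map described in step 3.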

  19. Linkage mapping of putative regulator genes of barley grain development characterized by expression profiling

    Directory of Open Access Journals (Sweden)

    Wobus Ulrich

    2009-01-01

    Full Text Available Abstract Background Barley (Hordeum vulgare L.) seed development is a highly regulated process with fine-tuned interaction of various tissues controlling distinct physiological events during the prestorage, storage and desiccation phases. As potential regulators involved in this process we studied 172 transcription factors and 204 kinases for their expression behaviour and anchored a subset of them to the barley linkage map to promote marker-assisted studies on barley grains. Results By hierarchical clustering of the expression profiles of 376 potential regulatory genes expressed in 37 different tissues, we found 50 regulators preferentially expressed in one of the three grain tissue fractions pericarp, endosperm and embryo during seed development. In addition, 27 regulators were found to be expressed during both seed development and germination, and 32 additional regulators are characteristically expressed in multiple tissues undergoing cell differentiation events during barley plant ontogeny. Another 96 regulators were, besides in the developing seed, ubiquitously expressed among all tissues of germinating seedlings as well as in reproductive tissues. SNP-marker development for those regulators resulted in anchoring 61 markers on the genetic linkage map of barley and the chromosomal assignment of another 12 loci by using wheat-barley addition lines. The SNP frequency ranged from 0.5 to 1.0 SNP/kb in the parents of the various mapping populations and was 2.3 SNP/kb over all eight lines tested. Exploration of macrosynteny with rice revealed that the chromosomal orders of the mapped putative regulatory factors were predominantly conserved during evolution. Conclusion We identified expression patterns of major transcription factors and signaling-related genes expressed during barley ontogeny and further assigned possible functions based on likely orthologs functionally well characterized in model plant species. The combined linkage map and reference

  20. Foundations of probability

    International Nuclear Information System (INIS)

    Fraassen, B.C. van

    1979-01-01

    The interpretation of probabilities in physical theories, whether quantum or classical, is considered. The following points are discussed: 1) the functions P(μ, Q), in terms of which states and propositions can be represented, are classical (Kolmogoroff) probabilities, formally speaking; 2) these probabilities are generally interpreted as themselves conditional, and the conditions are mutually incompatible where the observables are maximal; and 3) testing of the theory typically takes the form of confronting the expectation values of observable Q calculated with probability measures P(μ, Q) for states μ; hence, of comparing the probabilities P(μ, Q)(E) with the frequencies of occurrence of the corresponding events. It seems that even the interpretation of quantum mechanics, in so far as it concerns what the theory says about the empirical (i.e. actual, observable) phenomena, deals with the confrontation of classical probability measures with observable frequencies. This confrontation is studied. (Auth./C.F.)

  1. Probable maximum flood analysis, Richton Dome, Mississippi-Phase I: Technical report

    International Nuclear Information System (INIS)

    1987-03-01

    This report presents results of a preliminary analysis of the extent of inundation that would result from a probable maximum flood (PMF) event in the overdome area of Richton Dome, Mississippi. Bogue Homo and Thompson Creek watersheds drain the overdome area. The US Army Corps of Engineers' HEC-1 Flood Hydrograph Package was used to calculate runoff hydrographs, route computed flood hydrographs, and determine maximum flood stages at cross sections along overdome tributaries. The area and configuration of stream cross sections were determined from US Geological Survey topographic maps. Using maximum flood stages calculated by the HEC-1 analysis, areas of inundation were delineated on 10-ft (3-m) contour interval topographic maps. Approximately 10% of the overdome area, or 0.9 mi² (2 km²), would be inundated by a PMF event. 34 refs., 3 figs., 1 tab

  2. Differential radiosensitivity on a tissue level in Delphinium ajacis

    Energy Technology Data Exchange (ETDEWEB)

    Mandal, S K; Basu, R K [Bose Research Inst., Calcutta (India). Cryogenetics Lab.

    1980-09-01

    Root, leaf, pollen mother cell and endosperm of D.ajacis showed differential sensitivity as measured by X-ray-induced chromosomal aberrations at mitotic anaphase and telophase stages of the first and second division cycles after irradiation. These tissues differed significantly in Interphase Chromosome Volume (ICV) values. In all the tissues the percentage of aberrant cells increased linearly with increase in X-ray dose. Though endosperm had the largest ICV value it was the most radioresistant tissue tested. The relative radiosensitivity of the other 3 tissues was positively correlated with ICV value. The radioresistance of endosperm is probably due to factors unique to this tissue which remained obscure.

  3. Radiotherapy in patients with connective tissue diseases.

    Science.gov (United States)

    Giaj-Levra, Niccolò; Sciascia, Savino; Fiorentino, Alba; Fersino, Sergio; Mazzola, Rosario; Ricchetti, Francesco; Roccatello, Dario; Alongi, Filippo

    2016-03-01

    The decision to offer radiotherapy in patients with connective tissue diseases continues to be challenging. Radiotherapy might trigger the onset of connective tissue diseases by increasing the expression of self-antigens, diminishing regulatory T-cell activity, and activating effectors of innate immunity (dendritic cells) through Toll-like receptor-dependent mechanisms, all of which could potentially lead to breaks of immune tolerance. This potential risk has raised some debate among radiation oncologists about whether patients with connective tissue diseases can tolerate radiation as well as people without connective tissue diseases. Because the number of patients with cancer and connective tissue diseases needing radiotherapy will probably increase due to improvements in medical treatment and longer life expectancy, the issue of interactions between radiotherapy and connective tissue diseases needs to be clearer. In this Review, we discuss available data and evidence for patients with connective tissue diseases treated with radiotherapy. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned

  5. Digital tissue and what it may reveal about the brain.

    Science.gov (United States)

    Morgan, Josh L; Lichtman, Jeff W

    2017-10-30

    Imaging as a means of scientific data storage has evolved rapidly over the past century from hand drawings, to photography, to digital images. Only recently can sufficiently large datasets be acquired, stored, and processed such that tissue digitization can actually reveal more than direct observation of tissue. One field where this transformation is occurring is connectomics: the mapping of neural connections in large volumes of digitized brain tissue.

  6. Lod scores for gene mapping in the presence of marker map uncertainty.

    Science.gov (United States)

    Stringham, H M; Boehnke, M

    2001-07-01

    Multipoint lod scores are typically calculated for a grid of locus positions, moving the putative disease locus across a fixed map of genetic markers. Changing the order of a set of markers and/or the distances between the markers can make a substantial difference in the resulting lod score curve and the location and height of its maximum. The typical approach of using the best maximum likelihood marker map is not easily justified if other marker orders are nearly as likely and give substantially different lod score curves. To deal with this problem, we propose three weighted multipoint lod score statistics that make use of information from all plausible marker orders. In each of these statistics, the information conditional on a particular marker order is included in a weighted sum, with weight equal to the posterior probability of that order. We evaluate the type 1 error rate and power of these three statistics on the basis of results from simulated data, and compare these results to those obtained using the best maximum likelihood map and the map with the true marker order. We find that the lod score based on a weighted sum of maximum likelihoods improves on using only the best maximum likelihood map, having a type 1 error rate and power closest to that of using the true marker order in the simulation scenarios we considered. Copyright 2001 Wiley-Liss, Inc.
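
    One plausible reading of the weighted statistic (assumed here, since the abstract does not spell out the formula) is a posterior-weighted sum of the per-order maximum likelihoods, converted back to the log10 scale:

```python
import math

def weighted_lod(lods, order_posteriors):
    """Posterior-weighted multipoint lod: weight each marker order's
    maximum likelihood (10**lod, relative to the null) by that order's
    posterior probability, then return to the log10 scale. This is one
    plausible form of the 'weighted sum of maximum likelihoods'
    statistic, not a formula quoted from the paper."""
    total = sum(w * 10.0 ** lod for lod, w in zip(lods, order_posteriors))
    return math.log10(total)

# Two plausible marker orders with hypothetical lods and posteriors
print(round(weighted_lod([3.2, 1.1], [0.7, 0.3]), 2))  # -> 3.05
```

    Because the weights sum to one, an order that is nearly certain dominates the statistic, while genuinely ambiguous orders each contribute.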

  7. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introductio

  8. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  9. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Summary Background Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
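
    The nearest-neighbour "probability machine" idea reduces to nonparametric regression of a 0/1 response; a toy 1-D sketch of that estimator (not the paper's algorithm or its R packages):

```python
def knn_probability(train_x, train_y, query, k=3):
    """Nearest-neighbour probability machine (1-D toy): estimate
    P(y = 1 | x) as the mean 0/1 response among the k training
    points closest to the query point."""
    ranked = sorted(zip(train_x, train_y), key=lambda p: abs(p[0] - query))
    return sum(y for _, y in ranked[:k]) / k

# Toy data where the response switches from 0 to 1 with increasing x
xs = [0.1, 0.2, 0.3, 0.8, 0.9, 1.0]
ys = [0, 0, 0, 1, 1, 1]
print(knn_probability(xs, ys, 0.95))  # -> 1.0
```

    A random forest estimates the same conditional probability by averaging 0/1 responses over the leaves that the query falls into.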

  10. Transport maps and dimension reduction for Bayesian computation

    KAUST Repository

    Marzouk, Youssef

    2015-01-01

    We introduce a new framework for efficient sampling from complex probability distributions, using a combination of optimal transport maps and the Metropolis-Hastings rule. The core idea is to use continuous transportation to transform typical Metropolis proposal mechanisms (e.g., random walks, Langevin methods) into non-Gaussian proposal distributions that can more effectively explore the target density. Our approach adaptively constructs a lower triangular transport map—an approximation of the Knothe-Rosenblatt rearrangement—using information from previous MCMC states, via the solution of an optimization problem. This optimization problem is convex regardless of the form of the target distribution. It is solved efficiently using a Newton method that requires no gradient information from the target probability distribution; the target distribution is instead represented via samples. Sequential updates enable efficient and parallelizable adaptation of the map even for large numbers of samples. We show that this approach uses inexact or truncated maps to produce an adaptive MCMC algorithm that is ergodic for the exact target distribution. Numerical demonstrations on a range of parameter inference problems show order-of-magnitude speedups over standard MCMC techniques, measured by the number of effectively independent samples produced per target density evaluation and per unit of wallclock time. We will also discuss adaptive methods for the construction of transport maps in high dimensions, where use of a non-adapted basis (e.g., a total order polynomial expansion) can become computationally prohibitive. If only samples of the target distribution, rather than density evaluations, are available, then we can construct high-dimensional transformations by composing sparsely parameterized transport maps with rotations of the parameter space. If evaluations of the target density and its gradients are available, then one can exploit the structure of the variational

  12. Comparing 511 keV Attenuation Maps Obtained from Different Energy Mapping Methods for CT Based Attenuation Correction of PET Data

    Directory of Open Access Journals (Sweden)

    Maryam Shirmohammad

    2008-06-01

    of K2HPO4, the three methods (hybrid scaling/segmentation, bilinear and dual energy) produced the lowest relative differences of 10.91, 10.88 and 5%, respectively. For patients, it was found that for soft tissues all the mentioned energy mapping methods produce an acceptable attenuation map at 511 keV. The relative differences of the scaling, segmentation, hybrid, and bilinear methods compared to the TX method were 6.95, 4.51, 7, and 6.45%, respectively. For bony tissues, the quantitative analysis showed that the scaling and segmentation methods produce high relative differences of 26 and 23.2%, respectively, and the relative differences of the hybrid and bilinear methods in comparison to the TX method were 10.7 and 20%, respectively. Discussion and Conclusion: Based on the results obtained from these two studies it can be concluded that for soft tissues all energy mapping methods yield acceptable results, while for bony tissues all the mentioned methods except scaling and segmentation yield acceptable results.
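
    The bilinear method mentioned above is commonly parameterized as two linear segments meeting at water (0 HU); the sketch below uses illustrative textbook attenuation values, not the study's calibration:

```python
def bilinear_mu_511(hu, mu_water=0.096, mu_bone=0.172):
    """Bilinear CT-to-511-keV attenuation conversion sketch.
    Below 0 HU: scale linearly between air (-1000 HU, mu ~ 0) and
    water (0 HU). Above 0 HU: a second, shallower segment toward a
    bone value, since bone's attenuation rises less at 511 keV than
    at CT energies. mu values (cm^-1) are illustrative, not the
    study's calibration."""
    if hu <= 0:
        return mu_water * (1.0 + hu / 1000.0)   # air-water segment
    return mu_water + hu * (mu_bone - mu_water) / 1000.0  # water-bone segment

print(round(bilinear_mu_511(-1000), 3))  # air   -> 0.0
print(round(bilinear_mu_511(0), 3))      # water -> 0.096
```

    The break in slope at 0 HU is what distinguishes the bilinear method from simple global scaling, which is why it behaves better in bone.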

  13. Automated prediction of tissue outcome after acute ischemic stroke in computed tomography perfusion images

    Science.gov (United States)

    Vos, Pieter C.; Bennink, Edwin; de Jong, Hugo; Velthuis, Birgitta K.; Viergever, Max A.; Dankbaar, Jan Willem

    2015-03-01

    Assessment of the extent of cerebral damage on admission in patients with acute ischemic stroke could play an important role in treatment decision making. Computed tomography perfusion (CTP) imaging can be used to determine the extent of damage. However, clinical application is hindered by differences among vendors and in the methodology used. As a result, threshold-based methods and visual assessment of CTP images have not yet been shown to be useful in treatment decision making and predicting clinical outcome. Preliminary results in MR studies have shown the benefit of using supervised classifiers for predicting tissue outcome, but this has not been demonstrated for CTP. We present a novel method for the automatic prediction of tissue outcome by combining multi-parametric CTP images into a tissue outcome probability map. A supervised classification scheme was developed to extract absolute and relative perfusion values from processed CTP images that are summarized by a trained classifier into a likelihood of infarction. Training was performed using follow-up CT scans of 20 acute stroke patients with complete recanalization of the vessel that was occluded on admission. Infarcted regions were annotated by expert neuroradiologists. Multiple classifiers were evaluated in a leave-one-patient-out strategy for their discriminating performance using receiver operating characteristic (ROC) statistics. Results showed that a RandomForest classifier performed optimally with an area under the ROC of 0.90 for discriminating infarct tissue. The obtained results are an improvement over existing thresholding methods and are in line with results found in literature where MR perfusion was used.
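
    The discriminating performance reported above is the area under the ROC curve; a minimal sketch of that statistic via the rank (Mann-Whitney) identity, on hypothetical scores:

```python
def roc_auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney identity:
    the probability that a randomly chosen positive (infarcted)
    sample scores higher than a randomly chosen negative one,
    counting ties as 0.5."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical likelihood-of-infarction scores vs. expert labels
print(round(roc_auc([0.9, 0.25, 0.6, 0.3, 0.2], [1, 1, 1, 0, 0]), 3))  # -> 0.833
```

    An AUC of 0.90, as reported for the RandomForest classifier, means a randomly chosen infarcted voxel outscores a randomly chosen non-infarcted one 90% of the time.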

  14. Identification of QTLs Associated with Callogenesis and Embryogenesis in Oil Palm Using Genetic Linkage Maps Improved with SSR Markers

    Science.gov (United States)

    Ting, Ngoot-Chin; Jansen, Johannes; Nagappan, Jayanthi; Ishak, Zamzuri; Chin, Cheuk-Weng; Tan, Soon-Guan; Cheah, Suan-Choo; Singh, Rajinder

    2013-01-01

    Clonal reproduction of oil palm by means of tissue culture is a very inefficient process. Tissue culturability is known to be genotype dependent, with some genotypes being more amenable to tissue culture than others. In this study, genetic linkage maps enriched with simple sequence repeat (SSR) markers were developed for dura (ENL48) and pisifera (ML161), the two fruit forms of oil palm, Elaeis guineensis. The SSR markers were mapped onto earlier reported parental maps based on amplified fragment length polymorphism (AFLP) and restriction fragment length polymorphism (RFLP) markers. The new linkage map of ENL48 contains 148 markers (33 AFLPs, 38 RFLPs and 77 SSRs) in 23 linkage groups (LGs), covering a total map length of 798.0 cM. The ML161 map contains 240 markers (50 AFLPs, 71 RFLPs and 119 SSRs) in 24 LGs covering a total of 1,328.1 cM. Using the improved maps, two quantitative trait loci (QTLs) associated with tissue culturability were identified, one each for callusing rate and embryogenesis rate. A QTL for callogenesis was identified in LGD4b of ENL48 and explained 17.5% of the phenotypic variation. For embryogenesis rate, a QTL was detected on LGP16b in ML161 and explained 20.1% of the variation. This study is the first attempt to identify QTLs associated with tissue culture amenability in oil palm, which is an important step towards understanding the molecular processes underlying clonal regeneration of oil palm. PMID:23382832

  15. Comprehensive comparison of large-scale tissue expression datasets

    DEFF Research Database (Denmark)

    Santos Delgado, Alberto; Tsafou, Kalliopi; Stolte, Christian

    2015-01-01

    For tissues to carry out their functions, they rely on the right proteins to be present. Several high-throughput technologies have been used to map out which proteins are expressed in which tissues; however, the data have not previously been systematically compared and integrated. We present a comprehensive evaluation of tissue expression data from a variety of experimental techniques and show that these agree surprisingly well with each other and with results from literature curation and text mining. We further found that most datasets support the assumed but not demonstrated distinction between ... (...://tissues.jensenlab.org), which makes all the scored and integrated data available through a single user-friendly web interface.

  16. INTERACTIVE VISUALIZATION OF PROBABILITY AND CUMULATIVE DENSITY FUNCTIONS

    KAUST Repository

    Potter, Kristin; Kirby, Robert Michael; Xiu, Dongbin; Johnson, Chris R.

    2012-01-01

    The probability density function (PDF), and its corresponding cumulative density function (CDF), provide direct statistical insight into the characterization of a random process or field. Typically displayed as a histogram, one can infer probabilities of the occurrence of particular events. When examining a field over some two-dimensional domain in which at each point a PDF of the function values is available, it is challenging to assess the global (stochastic) features present within the field. In this paper, we present a visualization system that allows the user to examine two-dimensional data sets in which PDF (or CDF) information is available at any position within the domain. The tool provides a contour display showing the normed difference between the PDFs and an ansatz PDF selected by the user and, furthermore, allows the user to interactively examine the PDF at any particular position. Canonical examples of the tool are provided to help guide the reader into the mapping of stochastic information to visual cues along with a description of the use of the tool for examining data generated from an uncertainty quantification exercise accomplished within the field of electrophysiology.
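
    The contour quantity described above, the normed difference between a local PDF and a user-chosen ansatz, can be sketched per grid point as follows (histogram PDF and L2 norm; the binning and the uniform ansatz are illustrative choices, not the tool's fixed settings):

```python
import math

def histogram_pdf(samples, bins, lo, hi):
    """Empirical PDF as a normalized histogram over [lo, hi)."""
    width = (hi - lo) / bins
    counts = [0] * bins
    for s in samples:
        counts[min(int((s - lo) / width), bins - 1)] += 1
    return [c / (len(samples) * width) for c in counts]

def l2_pdf_difference(pdf_a, pdf_b, width):
    """Scalar L2 norm of the difference between two binned PDFs --
    the kind of value contoured over the 2-D domain."""
    return math.sqrt(sum((a - b) ** 2 * width for a, b in zip(pdf_a, pdf_b)))

samples = [0.1, 0.2, 0.25, 0.5, 0.55, 0.6, 0.9]
pdf = histogram_pdf(samples, 4, 0.0, 1.0)
ansatz = [1.0] * 4  # ansatz PDF: uniform on [0, 1)
print(round(l2_pdf_difference(pdf, ansatz, 0.25), 3))  # -> 0.474
```

    Evaluating this scalar at every point of the domain gives a field that can be contoured, exactly as the visualization system does.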

  17. Evaluation of normal tissue responses to high-LET radiations

    International Nuclear Information System (INIS)

    Halnan, K.E.

    1979-01-01

    Clinical results presented have been analysed to evaluate normal tissue responses to high-LET radiations. Damage to brain, spinal cord, gut, skin, connective tissue and bone has occurred. A high RBE is probable for brain and possible for spinal cord and gut but other reasons for damage are also discussed. A net gain seems likely. Random controlled trials are advocated. (author)

  18. Association between increased epicardial adipose tissue volume and coronary plaque composition

    OpenAIRE

    Yamashita, Kennosuke; Yamamoto, Myong Hwa; Ebara, Seitarou; Okabe, Toshitaka; Saito, Shigeo; Hoshimoto, Koichi; Yakushiji, Tadayuki; Isomura, Naoei; Araki, Hiroshi; Obara, Chiaki; Ochiai, Masahiko

    2013-01-01

    To assess the relationship between epicardial adipose tissue volume (EATV) and plaque vulnerability in significant coronary stenosis using a 40-MHz intravascular ultrasound (IVUS) imaging system (iMap-IVUS), we analyzed 130 consecutive patients with coronary stenosis who underwent dual-source computed tomography (CT) and cardiac catheterization. Culprit lesions were imaged by iMap-IVUS before stenting. The iMap-IVUS system classified coronary plaque components as fibrous, lipid, necrotic, or ...

  19. Estimation of geometrically undistorted B0 inhomogeneity maps

    International Nuclear Information System (INIS)

    Matakos, A; Balter, J; Cao, Y

    2014-01-01

    Geometric accuracy of MRI is one of the main concerns for its use as a sole image modality in precision radiation therapy (RT) planning. In a state-of-the-art scanner, system level geometric distortions are within acceptable levels for precision RT. However, subject-induced B0 inhomogeneity may vary substantially, especially in air-tissue interfaces. Recent studies have shown distortion levels of more than 2 mm near the sinus and ear canal are possible due to subject-induced field inhomogeneity. These distortions can be corrected with the use of accurate B0 inhomogeneity field maps. Most existing methods estimate these field maps from dual gradient-echo (GRE) images acquired at two different echo-times under the assumption that the GRE images are practically undistorted. However distortion that may exist in the GRE images can result in estimated field maps that are distorted in both geometry and intensity, leading to inaccurate correction of clinical images. This work proposes a method for estimating undistorted field maps from GRE acquisitions using an iterative joint estimation technique. The proposed method yields geometrically corrected GRE images and undistorted field maps that can also be used for the correction of images acquired by other sequences. The proposed method is validated through simulation, phantom experiments and applied to patient data. Our simulation results show that our method reduces the root-mean-squared error of the estimated field map from the ground truth by ten-fold compared to the distorted field map. Both the geometric distortion and the intensity corruption (artifact) in the images caused by the B0 field inhomogeneity are corrected almost completely. Our phantom experiment showed improvement in the geometric correction of approximately 1 mm at an air-water interface using the undistorted field map compared to using a distorted field map. The proposed method for undistorted field map estimation can lead to improved geometric

  20. Estimation of geometrically undistorted B0 inhomogeneity maps

    Science.gov (United States)

    Matakos, A.; Balter, J.; Cao, Y.

    2014-09-01

    Geometric accuracy of MRI is one of the main concerns for its use as a sole image modality in precision radiation therapy (RT) planning. In a state-of-the-art scanner, system level geometric distortions are within acceptable levels for precision RT. However, subject-induced B0 inhomogeneity may vary substantially, especially in air-tissue interfaces. Recent studies have shown distortion levels of more than 2 mm near the sinus and ear canal are possible due to subject-induced field inhomogeneity. These distortions can be corrected with the use of accurate B0 inhomogeneity field maps. Most existing methods estimate these field maps from dual gradient-echo (GRE) images acquired at two different echo-times under the assumption that the GRE images are practically undistorted. However distortion that may exist in the GRE images can result in estimated field maps that are distorted in both geometry and intensity, leading to inaccurate correction of clinical images. This work proposes a method for estimating undistorted field maps from GRE acquisitions using an iterative joint estimation technique. The proposed method yields geometrically corrected GRE images and undistorted field maps that can also be used for the correction of images acquired by other sequences. The proposed method is validated through simulation, phantom experiments and applied to patient data. Our simulation results show that our method reduces the root-mean-squared error of the estimated field map from the ground truth by ten-fold compared to the distorted field map. Both the geometric distortion and the intensity corruption (artifact) in the images caused by the B0 field inhomogeneity are corrected almost completely. Our phantom experiment showed improvement in the geometric correction of approximately 1 mm at an air-water interface using the undistorted field map compared to using a distorted field map. The proposed method for undistorted field map estimation can lead to improved geometric
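    For context, the conventional dual-echo field map that the authors identify as distortion-prone is computed from the inter-echo phase difference. A minimal sketch follows; the echo times and the uniform 50 Hz off-resonance are made-up values, and the paper's iterative joint estimation is not reproduced here:

```python
import numpy as np

def dual_echo_fieldmap(echo1, echo2, delta_te):
    """Conventional B0 map from two complex GRE echoes.

    The phase accrued between echoes is 2*pi*f*delta_te, so the
    off-resonance f (in Hz) is the phase difference over 2*pi*delta_te.
    """
    phase_diff = np.angle(echo2 * np.conj(echo1))  # wrapped to (-pi, pi]
    return phase_diff / (2.0 * np.pi * delta_te)   # Hz

# Synthetic example: uniform 50 Hz off-resonance, echoes at 4 ms and 6 ms.
te1, te2 = 0.004, 0.006
true_hz = 50.0
mag = np.ones((4, 4))
e1 = mag * np.exp(2j * np.pi * true_hz * te1)
e2 = mag * np.exp(2j * np.pi * true_hz * te2)
fmap = dual_echo_fieldmap(e1, e2, te2 - te1)
```

    With a larger echo spacing the phase difference can wrap, which is one reason field-map estimation in practice needs unwrapping or joint estimation rather than this direct formula.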

  1. Automated tissue classification of pediatric brains from magnetic resonance images using age-specific atlases

    Science.gov (United States)

    Metzger, Andrew; Benavides, Amanda; Nopoulos, Peg; Magnotta, Vincent

    2016-03-01

    The goal of this project was to develop two age-appropriate atlases (neonatal and one year old) that account for the rapid growth and maturational changes that occur during early development. Tissue maps from this age group were initially created by manually correcting the resulting tissue maps after applying an expectation maximization (EM) algorithm and an adult atlas to pediatric subjects. The EM algorithm classified each voxel into one of ten possible tissue types including several subcortical structures. This was followed by a novel level set segmentation designed to improve differentiation between distal cortical gray matter and white matter. To minimize the required manual corrections, the adult atlas was registered to the pediatric scans using high-dimensional, symmetric image normalization (SyN) registration. The subject images were then mapped to an age-specific atlas space, again using SyN registration, and the resulting transformation applied to the manually corrected tissue maps. The individual maps were averaged in the age-specific atlas space and blurred to generate the age-appropriate anatomical priors. The resulting anatomical priors were then used by the EM algorithm to re-segment the initial training set as well as an independent testing set. The results from the adult and age-specific anatomical priors were compared to the manually corrected results. The age-appropriate atlas provided superior results as compared to the adult atlas. The image analysis pipeline used in this work was built using the open source software package BRAINSTools.
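    The core EM step, weighting Gaussian intensity likelihoods by per-voxel atlas priors, can be sketched as below. The two-class setup, means, and variances are illustrative placeholders, not the BRAINSTools implementation:

```python
import numpy as np

def classify_with_priors(intensity, priors, means, variances):
    """One E-step of an EM-style tissue classifier: combine per-voxel
    anatomical priors (from an atlas) with per-class Gaussian intensity
    likelihoods, returning (V, K) posterior responsibilities.
    """
    lik = np.exp(-((intensity[:, None] - means) ** 2) / (2.0 * variances))
    lik /= np.sqrt(2.0 * np.pi * variances)
    post = priors * lik
    return post / post.sum(axis=1, keepdims=True)

# Two hypothetical tissue classes (e.g. GM-like and WM-like intensities).
intensity = np.array([0.25, 0.75])
priors = np.full((2, 2), 0.5)        # uninformative atlas prior
means = np.array([0.2, 0.8])
variances = np.array([0.01, 0.01])
resp = classify_with_priors(intensity, priors, means, variances)
```

    Replacing the uniform priors with blurred, age-specific probability maps is what lets the same E-step favor anatomically plausible labels in ambiguous voxels.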

  2. Detection, mapping, and quantification of single walled carbon nanotubes in histological specimens with photoacoustic microscopy.

    Science.gov (United States)

    Avti, Pramod K; Hu, Song; Favazza, Christopher; Mikos, Antonios G; Jansen, John A; Shroyer, Kenneth R; Wang, Lihong V; Sitharaman, Balaji

    2012-01-01

    In the present study, the efficacy of multi-scale photoacoustic microscopy (PAM) was investigated to detect, map, and quantify trace amounts [nanograms (ng) to micrograms (µg)] of SWCNTs in a variety of histological tissue specimens consisting of cancer and benign tissue biopsies (histological specimens from implanted tissue engineering scaffolds). Optical-resolution (OR) and acoustic-resolution (AR) photoacoustic microscopy (PAM) were employed to detect, map and quantify the SWCNTs in a variety of tissue histological specimens and compared with other optical techniques (bright-field optical microscopy, Raman microscopy, near infrared (NIR) fluorescence microscopy). Both optical-resolution and acoustic-resolution PAM allow the detection and quantification of SWCNTs in histological specimens with scalable spatial resolution and depth penetration. The noise-equivalent detection sensitivity to SWCNTs in the specimens was calculated to be as low as ∼7 pg. Image processing analysis further allowed the mapping, distribution, and quantification of the SWCNTs in the histological sections. The results demonstrate the potential of PAM as a promising imaging technique to detect, map, and quantify SWCNTs in histological specimens, and could complement the capabilities of current optical and electron microscopy techniques in the analysis of histological specimens containing SWCNTs.

  3. Detection, mapping, and quantification of single walled carbon nanotubes in histological specimens with photoacoustic microscopy.

    Directory of Open Access Journals (Sweden)

    Pramod K Avti

    Full Text Available In the present study, the efficacy of multi-scale photoacoustic microscopy (PAM) was investigated to detect, map, and quantify trace amounts [nanograms (ng) to micrograms (µg)] of SWCNTs in a variety of histological tissue specimens consisting of cancer and benign tissue biopsies (histological specimens from implanted tissue engineering scaffolds). Optical-resolution (OR) and acoustic-resolution (AR) photoacoustic microscopy (PAM) were employed to detect, map and quantify the SWCNTs in a variety of tissue histological specimens and compared with other optical techniques (bright-field optical microscopy, Raman microscopy, near infrared (NIR) fluorescence microscopy). Both optical-resolution and acoustic-resolution PAM allow the detection and quantification of SWCNTs in histological specimens with scalable spatial resolution and depth penetration. The noise-equivalent detection sensitivity to SWCNTs in the specimens was calculated to be as low as ∼7 pg. Image processing analysis further allowed the mapping, distribution, and quantification of the SWCNTs in the histological sections. The results demonstrate the potential of PAM as a promising imaging technique to detect, map, and quantify SWCNTs in histological specimens, and could complement the capabilities of current optical and electron microscopy techniques in the analysis of histological specimens containing SWCNTs.

  4. Probing the statistics of transport in the Hénon Map

    Science.gov (United States)

    Alus, O.; Fishman, S.; Meiss, J. D.

    2016-09-01

    The phase space of an area-preserving map typically contains infinitely many elliptic islands embedded in a chaotic sea. Orbits near the boundary of a chaotic region have been observed to stick for long times, strongly influencing their transport properties. The boundary is composed of invariant "boundary circles." We briefly report recent results of the distribution of rotation numbers of boundary circles for the Hénon quadratic map and show that the probability of occurrence of small integer entries of their continued fraction expansions is larger than would be expected for a number chosen at random. However, large integer entries occur with probabilities distributed proportionally to the random case. The probability distributions of ratios of fluxes through island chains is reported as well. These island chains are neighbours in the sense of the Meiss-Ott Markov-tree model. Two distinct universality families are found. The distributions of the ratio between the flux and orbital period are also presented. All of these results have implications for models of transport in mixed phase space.
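    The area-preserving Hénon quadratic map studied here is commonly written as x' = x cos(a) - (y - x^2) sin(a), y' = x sin(a) + (y - x^2) cos(a). A minimal sketch of iterating an orbit near the elliptic fixed point at the origin follows; the rotation angle and initial condition are arbitrary illustrative choices:

```python
import math

def henon_ap(x, y, alpha):
    """One step of the area-preserving Hénon quadratic map:
    x' = x cos(a) - (y - x^2) sin(a),  y' = x sin(a) + (y - x^2) cos(a).
    The Jacobian determinant is exactly 1, so phase-space area is preserved.
    """
    c, s = math.cos(alpha), math.sin(alpha)
    return c * x - s * (y - x * x), s * x + c * (y - x * x)

# Iterate a small-amplitude orbit near the elliptic fixed point (0, 0);
# for a generic rotation angle it stays confined by invariant circles.
alpha = 1.0
x, y = 0.1, 0.0
orbit = [(x, y)]
for _ in range(1000):
    x, y = henon_ap(x, y, alpha)
    orbit.append((x, y))
```

    Orbits started further out, near the boundary circles of the chaotic sea, are the ones whose sticking and flux statistics the abstract describes.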

  5. Janus-faced probability

    CERN Document Server

    Rocchi, Paolo

    2014-01-01

    The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.

  6. Three-dimensional optical coherence micro-elastography of skeletal muscle tissue

    OpenAIRE

    Chin, Lixin; Kennedy, Brendan F.; Kennedy, Kelsey M.; Wijesinghe, Philip; Pinniger, Gavin J.; Terrill, Jessica R.; McLaughlin, Robert A.; Sampson, David D.

    2014-01-01

    In many muscle pathologies, impairment of skeletal muscle function is closely linked to changes in the mechanical properties of the muscle constituents. Optical coherence micro-elastography (OCME) uses optical coherence tomography (OCT) imaging of tissue under a quasi-static, compressive mechanical load to map variations in tissue mechanical properties on the micro-scale. We present the first study of OCME on skeletal muscle tissue. We show that this technique can resolve features of muscle t...

  7. Determination of composition and structure of spongy bone tissue in human head of femur by Raman spectral mapping.

    Science.gov (United States)

    Kozielski, M; Buchwald, T; Szybowicz, M; Błaszczak, Z; Piotrowski, A; Ciesielczyk, B

    2011-07-01

    Biomechanical properties of bone depend on the composition and organization of collagen fibers. In this study, Raman microspectroscopy was employed to determine the content of mineral and organic constituents and orientation of collagen fibers in spongy bone in the human head of femur at the microstructural level. Changes in composition and structure of trabecula were illustrated using Raman spectral mapping. The polarized Raman spectra permit separate analysis of local variations in orientation and composition. The ratios of ν₂PO₄³⁻/Amide III, ν₄PO₄³⁻/Amide III and ν₁CO₃²⁻/ν₂PO₄³⁻ are used to describe relative amounts of spongy bone components. The ν₁PO₄³⁻/Amide I ratio is quite susceptible to orientation effect and brings information on collagen fibers orientation. The results presented illustrate the versatility of the Raman method in the study of bone tissue. The study permits better understanding of bone physiology and evaluation of the biomechanical properties of bone.

  8. Isolation of Mycobacterium avium subsp paratuberculosis (Map) from feral cats on a dairy farm with Map-infected cattle.

    Science.gov (United States)

    Palmer, Mitchell V; Stoffregen, William C; Carpenter, Jeremy G; Stabel, Judith R

    2005-07-01

    Paratuberculosis is an economically important disease of dairy cattle caused by Mycobacterium avium subsp. paratuberculosis (Map). The role of nonruminant, nondomestic animals in the epidemiology of paratuberculosis in cattle is unclear. To examine nonruminant, nondomestic animals for the presence of Map, 25 feral cats, nine mice (species unknown), eight rabbits (Sylvilagus floridanus), six raccoons (Procyon lotor), and three opossums (Didelphis virginiana) were collected from a mid-western dairy with known Map-infected cattle. Mycobacterium avium subsp. paratuberculosis was isolated from the mesenteric lymph node from seven of 25 (28%) feral cats. Ileum was culture-positive for three of these seven cats, and an isolation of Map was also made from the ileum of one of nine (11%) mice. Tissue samples from other species were negative as determined by Map culture; microscopic lesions consistent with paratuberculosis were not seen in any animal. Restriction fragment polymorphism analysis of isolates from cats and dairy cattle suggest interspecies transmission. The means by which interspecies transmission occurred may be through ingestion of Map-contaminated feces or waste milk or through ingestion of Map-infected prey. Shedding of Map from infected cats was not evaluated. The epidemiologic role of Map-infected feral cats on dairy farms requires further investigation.

  9. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Lévy processes, Gerber–Shiu functions and dependence.

  10. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  11. Effects of ultrasound frequency and tissue stiffness on the histotripsy intrinsic threshold for cavitation.

    Science.gov (United States)

    Vlaisavljevich, Eli; Lin, Kuang-Wei; Maxwell, Adam; Warnez, Matthew T; Mancia, Lauren; Singh, Rahul; Putnam, Andrew J; Fowlkes, Brian; Johnsen, Eric; Cain, Charles; Xu, Zhen

    2015-06-01

    Histotripsy is an ultrasound ablation method that depends on the initiation of a cavitation bubble cloud to fractionate soft tissue. Previous work has indicated that a cavitation cloud can be formed by a single pulse with one high-amplitude negative cycle, when the negative pressure amplitude directly exceeds a pressure threshold intrinsic to the medium. We hypothesize that the intrinsic threshold in water-based tissues is determined by the properties of the water inside the tissue, and changes in tissue stiffness or ultrasound frequency will have a minimal impact on the histotripsy intrinsic threshold. To test this hypothesis, the histotripsy intrinsic threshold was investigated both experimentally and theoretically. The probability of cavitation was measured by subjecting tissue phantoms with adjustable mechanical properties and ex vivo tissues to a histotripsy pulse of 1-2 cycles produced by 345-kHz, 500-kHz, 1.5-MHz and 3-MHz histotripsy transducers. Cavitation was detected and characterized by passive cavitation detection and high-speed photography, from which the probability of cavitation was measured versus pressure amplitude. The results revealed that the intrinsic threshold (the negative pressure at which probability = 0.5) is independent of stiffness for Young's moduli (E) ultrasound frequency in the hundreds of kilohertz to megahertz range. Copyright © 2015 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.
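    The intrinsic threshold is defined above as the negative pressure at which the measured cavitation probability crosses 0.5. A sketch of extracting that threshold from a fitted sigmoid follows; the curve parameters are illustrative placeholders, not the paper's measurements:

```python
import math

def cavitation_probability(p_neg, p_t=28.0, k=0.6):
    """Sigmoid probability-of-cavitation curve vs. peak negative pressure
    (MPa). p_t and k are illustrative values, not fitted data."""
    return 1.0 / (1.0 + math.exp(-k * (p_neg - p_t)))

def intrinsic_threshold(prob_fn, lo=0.0, hi=60.0, tol=1e-6):
    """Bisect for the pressure where the probability curve crosses 0.5,
    i.e. the 'intrinsic threshold' as defined in the abstract."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if prob_fn(mid) < 0.5:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

thr = intrinsic_threshold(cavitation_probability)
```

    In the study this crossing point is what is compared across phantom stiffnesses and transducer frequencies to test the intrinsic-threshold hypothesis.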

  12. Comparison of tissue viability imaging and colorimetry: skin blanching.

    Science.gov (United States)

    Zhai, Hongbo; Chan, Heidi P; Farahmand, Sara; Nilsson, Gert E; Maibach, Howard I

    2009-02-01

    Operator-independent assessment of skin blanching is important in the development and evaluation of topically applied steroids. Spectroscopic instruments based on hand-held probes, however, include elements of operator dependence such as difference in applied pressure and probe misalignment, while laser Doppler-based methods are better suited for demonstration of skin vasodilatation than for vasoconstriction. To demonstrate the potential of the emerging technology of Tissue Viability Imaging (TiVi) in the objective and operator-independent assessment of skin blanching. The WheelsBridge TiVi600 Tissue Viability Imager was used for quantification of human skin blanching with the Minolta chromameter CR 200 as an independent colorimeter reference method. Desoximetasone gel 0.05% was applied topically on the volar side of the forearm under occlusion for 6 h in four healthy adults. In a separate study, the induction of blanching in the occlusion phase was mapped using a transparent occlusion cover. The relative uncertainty in the blanching estimate produced by the Tissue Viability Imager was about 5% and similar to that of the chromameter operated by a single user and taking the a* parameter as a measure of blanching. Estimation of skin blanching could also be performed in the presence of a transient paradoxical erythema, using the integrated TiVi software. The successive induction of skin blanching during the occlusion phase could readily be mapped by the Tissue Viability Imager. TiVi seems to be suitable for operator-independent and remote mapping of human skin blanching, eliminating the main disadvantages of methods based on hand-held probes.

  13. Estimated Probability of a Cervical Spine Injury During an ISS Mission

    Science.gov (United States)

    Brooker, John E.; Weaver, Aaron S.; Myers, Jerry G.

    2013-01-01

    Introduction: The Integrated Medical Model (IMM) utilizes historical data, cohort data, and external simulations as input factors to provide estimates of crew health, resource utilization and mission outcomes. The Cervical Spine Injury Module (CSIM) is an external simulation designed to provide the IMM with parameter estimates for 1) a probability distribution function (PDF) of the incidence rate, 2) the mean incidence rate, and 3) the standard deviation associated with the mean resulting from injury/trauma of the neck. Methods: An injury mechanism based on an idealized low-velocity blunt impact to the superior posterior thorax of an ISS crewmember was used as the simulated mission environment. As a result of this impact, the cervical spine is inertially loaded from the mass of the head producing an extension-flexion motion deforming the soft tissues of the neck. A multibody biomechanical model was developed to estimate the kinematic and dynamic response of the head-neck system from a prescribed acceleration profile. Logistic regression was performed on a dataset containing AIS1 soft tissue neck injuries from rear-end automobile collisions with published Neck Injury Criterion values producing an injury transfer function (ITF). An injury event scenario (IES) was constructed such that crew 1 is moving through a primary or standard translation path transferring large volume equipment impacting stationary crew 2. The incidence rate for this IES was estimated from in-flight data and used to calculate the probability of occurrence. The uncertainty in the model input factors were estimated from representative datasets and expressed in terms of probability distributions. A Monte Carlo Method utilizing simple random sampling was employed to propagate both aleatory and epistemic uncertain factors. Scatterplots and partial correlation coefficients (PCC) were generated to determine input factor sensitivity. CSIM was developed in the SimMechanics/Simulink environment with a
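    The Monte Carlo propagation step can be sketched as sampling the uncertain input factor and pushing each draw through a logistic injury transfer function. The coefficients and input distribution below are placeholders, not the CSIM regression values:

```python
import math
import random

def injury_probability(nic, b0=-5.0, b1=8.0):
    """Illustrative logistic injury transfer function, P(AIS1 injury | NIC).
    b0 and b1 are hypothetical coefficients, not the published regression."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * nic)))

random.seed(1)
n = 10_000
# Uncertainty in the input factor (a NIC value per injury event scenario),
# represented here by a simple normal distribution as a stand-in.
samples = [random.gauss(0.6, 0.1) for _ in range(n)]
probs = [injury_probability(max(s, 0.0)) for s in samples]
mean_p = sum(probs) / n
```

    Simple random sampling like this propagates both kinds of uncertainty at once; distinguishing aleatory from epistemic contributions, as CSIM does, requires nesting the sampling loops.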

  14. Identification of QTLs associated with callogenesis and embryogenesis in oil palm using genetic linkage maps improved with SSR markers.

    Directory of Open Access Journals (Sweden)

    Ngoot-Chin Ting

    Full Text Available Clonal reproduction of oil palm by means of tissue culture is a very inefficient process. Tissue culturability is known to be genotype dependent with some genotypes being more amenable to tissue culture than others. In this study, genetic linkage maps enriched with simple sequence repeat (SSR markers were developed for dura (ENL48 and pisifera (ML161, the two fruit forms of oil palm, Elaeis guineensis. The SSR markers were mapped onto earlier reported parental maps based on amplified fragment length polymorphism (AFLP and restriction fragment length polymorphism (RFLP markers. The new linkage map of ENL48 contains 148 markers (33 AFLPs, 38 RFLPs and 77 SSRs in 23 linkage groups (LGs, covering a total map length of 798.0 cM. The ML161 map contains 240 markers (50 AFLPs, 71 RFLPs and 119 SSRs in 24 LGs covering a total of 1,328.1 cM. Using the improved maps, two quantitative trait loci (QTLs associated with tissue culturability were identified each for callusing rate and embryogenesis rate. A QTL for callogenesis was identified in LGD4b of ENL48 and explained 17.5% of the phenotypic variation. For embryogenesis rate, a QTL was detected on LGP16b in ML161 and explained 20.1% of the variation. This study is the first attempt to identify QTL associated with tissue culture amenity in oil palm which is an important step towards understanding the molecular processes underlying clonal regeneration of oil palm.

  15. Intensity Based Seismic Hazard Map of Republic of Macedonia

    Science.gov (United States)

    Dojcinovski, Dragi; Dimiskovska, Biserka; Stojmanovska, Marta

    2016-04-01

    The territory of the Republic of Macedonia and the bordering terrains are among the most seismically active parts of the Balkan Peninsula, belonging to the Mediterranean-Trans-Asian seismic belt. Seismological data on the R. Macedonia from the past 16 centuries point to the occurrence of very strong catastrophic earthquakes. The hypocenters of the occurred earthquakes are located above the Mohorovicic discontinuity, most frequently at a depth of 10-20 km. Accurate short-term prognosis of earthquake occurrence, i.e., simultaneous prognosis of the time, place and intensity of occurrence, is still not possible. However, the present methods of seismic zoning have advanced to such an extent that, with great probability, they enable efficient protection against earthquake effects. The seismic hazard maps of the Republic of Macedonia are the result of analysis and synthesis of data from seismological, seismotectonic and other corresponding investigations necessary to define the expected level of seismic hazard for certain time periods. These should be amended from time to time with new data and scientific knowledge. The elaboration of this map does not completely solve all issues related to earthquakes, but it provides the basic empirical data necessary for updating the existing regulations for construction of engineering structures in seismically active areas, which are governed by legal regulations and technical norms whose constituent part is the seismic hazard map. The map has been elaborated based on complex seismological and geophysical investigations of the considered area and synthesis of the results of these investigations. The map was elaborated in two phases. In the first phase, a map of focal zones characterized by the maximum magnitudes of possible earthquakes was elaborated. In the second phase, the intensities of expected earthquakes were computed according to the MCS scale. The map is prognostic, i.e., it provides assessment of the

  16. Direct-to-PCR tissue preservation for DNA profiling.

    Science.gov (United States)

    Sorensen, Amy; Berry, Clare; Bruce, David; Gahan, Michelle Elizabeth; Hughes-Stamm, Sheree; McNevin, Dennis

    2016-05-01

    Disaster victim identification (DVI) often occurs in remote locations with extremes of temperatures and humidities. Access to mortuary facilities and refrigeration are not always available. An effective and robust DNA sampling and preservation procedure would increase the probability of successful DNA profiling and allow faster repatriation of bodies and body parts. If the act of tissue preservation also released DNA into solution, ready for polymerase chain reaction (PCR), the DVI process could be further streamlined. In this study, we explored the possibility of obtaining DNA profiles without DNA extraction, by adding aliquots of preservative solutions surrounding fresh human muscle and decomposing human muscle and skin tissue samples directly to PCR. The preservatives consisted of two custom preparations and two proprietary solutions. The custom preparations were a salt-saturated solution of dimethyl sulfoxide (DMSO) with ethylenediaminetetraacetic (EDTA) and TENT buffer (Tris, EDTA, NaCl, Tween 20). The proprietary preservatives were DNAgard (Biomatrica®) and Tissue Stabilising Kit (DNA Genotek). We obtained full PowerPlex® 21 (Promega) and GlobalFiler® (Life Technologies) DNA profiles from fresh and decomposed tissue preserved at 35 °C for up to 28 days for all four preservatives. The preservative aliquots removed from the fresh muscle tissue samples had been stored at -80 °C for 4 years, indicating that long-term archival does not diminish the probability of successful DNA typing. Rather, storage at -80 °C seems to reduce PCR inhibition.

  17. Quantifying seining detection probability for fishes of Great Plains sand‐bed rivers

    Science.gov (United States)

    Mollenhauer, Robert; Logue, Daniel R.; Brewer, Shannon K.

    2018-01-01

    Species detection error (i.e., imperfect and variable detection probability) is an essential consideration when investigators map distributions and interpret habitat associations. When fish detection error that is due to highly variable instream environments needs to be addressed, sand‐bed streams of the Great Plains represent a unique challenge. We quantified seining detection probability for diminutive Great Plains fishes across a range of sampling conditions in two sand‐bed rivers in Oklahoma. Imperfect detection resulted in underestimates of species occurrence using naïve estimates, particularly for less common fishes. Seining detection probability also varied among fishes and across sampling conditions. We observed a quadratic relationship between water depth and detection probability, in which the exact nature of the relationship was species‐specific and dependent on water clarity. Similarly, the direction of the relationship between water clarity and detection probability was species‐specific and dependent on differences in water depth. The relationship between water temperature and detection probability was also species dependent, where both the magnitude and direction of the relationship varied among fishes. We showed how ignoring detection error confounded an underlying relationship between species occurrence and water depth. Despite imperfect and heterogeneous detection, our results support that determining species absence can be accomplished with two to six spatially replicated seine hauls per 200‐m reach under average sampling conditions; however, required effort would be higher under certain conditions. Detection probability was low for the Arkansas River Shiner Notropis girardi, which is federally listed as threatened, and more than 10 seine hauls per 200‐m reach would be required to assess presence across sampling conditions. Our model allows scientists to estimate sampling effort to confidently assess species occurrence, which
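    The replicate-haul guidance above follows from the standard relation P(detect at least once in n hauls) = 1 - (1 - p)^n. A small helper makes the arithmetic explicit; the per-haul detection probabilities below are illustrative, not the study's estimates:

```python
import math

def hauls_needed(p_detect, target=0.95):
    """Number of replicate seine hauls so that the probability of at least
    one detection, 1 - (1 - p)^n, reaches the target confidence level."""
    return math.ceil(math.log(1.0 - target) / math.log(1.0 - p_detect))

# An easily detected species (per-haul p = 0.6) needs only a few hauls,
# while a hard-to-detect one (p = 0.25) needs more than ten per reach.
few = hauls_needed(0.6)    # 4
many = hauls_needed(0.25)  # 11
```

    This is the calculation behind statements like "two to six hauls under average conditions" versus "more than 10 hauls" for low-detectability species such as the Arkansas River Shiner.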

  18. Spiral waves characterization: Implications for an automated cardiodynamic tissue characterization.

    Science.gov (United States)

    Alagoz, Celal; Cohen, Andrew R; Frisch, Daniel R; Tunç, Birkan; Phatharodom, Saran; Guez, Allon

    2018-07-01

    Spiral waves are phenomena observed in cardiac tissue, especially during fibrillatory activities. Spiral waves are revealed through in-vivo and in-vitro studies using high-density mapping that requires a special experimental setup. Also, in-silico spiral wave analysis and classification is performed using membrane potentials from the entire tissue. In this study, we report a characterization approach that identifies spiral wave behaviors using intracardiac electrogram (EGM) readings obtained with commonly used multipolar diagnostic catheters that perform localized but high-resolution readings. Specifically, the algorithm is designed to distinguish between stationary, meandering, and break-up rotors. The clustering and classification algorithms are tested on simulated data produced using a phenomenological 2D model of cardiac propagation. For EGM measurements, unipolar-bipolar EGM readings from various locations on the tissue using two catheter types are modeled. The distance between spiral behaviors is assessed using normalized compression distance (NCD), an information-theoretical distance. NCD is a universal metric in the sense that it is based solely on the compressibility of the dataset and requires no feature extraction. We also introduce normalized FFT distance (NFFTD), in which compressibility is replaced with an FFT-based parameter. Overall, outstanding clustering performance was achieved across varying EGM reading configurations. We found NCD more effective than NFFTD at distinguishing behaviors. We also demonstrated that distinct spiral activities can be identified on a behaviorally heterogeneous tissue. This report provides a theoretical validation of clustering and classification approaches that map EGM signals to assessments of spiral wave behavior, and hence offers a potential mapping and analysis framework for cardiac tissue wavefront propagation patterns. Copyright © 2018 Elsevier B.V. All rights reserved.
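The NCD used here is a generic, feature-free dissimilarity; a minimal sketch using zlib as the compressor (the toy byte strings merely stand in for EGM traces and are not the paper's data):

```python
import random
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance: similarity based solely on
    compressibility, with no feature extraction."""
    cx = len(zlib.compress(x, 9))
    cy = len(zlib.compress(y, 9))
    cxy = len(zlib.compress(x + y, 9))
    return (cxy - min(cx, cy)) / max(cx, cy)

# Toy stand-ins for EGM recordings: two near-periodic traces vs a noisy one.
periodic_a = b"0123456789" * 200
periodic_b = b"0123456789" * 199 + b"0123456788"
random.seed(0)
noisy = bytes(random.randrange(256) for _ in range(2000))

# Similar signals compress well together, giving a smaller distance.
print(ncd(periodic_a, periodic_b) < ncd(periodic_a, noisy))  # -> True
```

Because the compressor does all the work, the same code applies unchanged to any serialized EGM reading configuration.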

  19. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability, provided the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of ...

  20. Four-dimensional optoacoustic temperature mapping in laser-induced thermotherapy

    Science.gov (United States)

    Oyaga Landa, Francisco Javier; Deán-Ben, Xosé Luís.; Sroka, Ronald; Razansky, Daniel

    2018-02-01

    Photoablative laser therapy is in common use for selective destruction of malignant masses, vascular and brain abnormalities. Tissue ablation and coagulation are irreversible processes occurring shortly after crossing a certain thermal exposure threshold. As a result, accurate mapping of the temperature field is essential for optimizing the outcome of these clinical interventions. Here we demonstrate four-dimensional optoacoustic temperature mapping of the entire photoablated region. Accuracy of the method is investigated in tissue-mimicking phantom experiments. Deviations of the volumetric optoacoustic temperature readings provided at 40 ms intervals remained below 10% for temperature elevations above 3°C, as validated by simultaneous thermocouple measurements. The excellent spatio-temporal resolution of the new temperature monitoring approach promises to improve the safety and efficacy of laser-based photothermal procedures.

  1. Correlation of spatial climate/weather maps and the advantages of using the Mahalanobis metric in predictions

    OpenAIRE

    Stephenson, D. B.

    2011-01-01

    The skill in predicting spatially varying weather/climate maps depends on the definition of the measure of similarity between the maps. Under the justifiable approximation that the anomaly maps are distributed multinormally, it is shown analytically that the choice of weighting metric, used in defining the anomaly correlation between spatial maps, can change the resulting probability distribution of the correlation coefficient. The estimate of the number of degrees of freedom based on the var...
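The metric dependence described here is easy to reproduce numerically; a sketch (the covariance matrix and anomaly fields below are synthetic) comparing the Euclidean and Mahalanobis-weighted anomaly correlations:

```python
import numpy as np

def anomaly_correlation(x, y, W=None):
    """Pattern correlation between two anomaly maps under weighting metric W:
    r = x' W y / sqrt((x' W x)(y' W y)).  W = identity gives the usual
    (Euclidean) anomaly correlation; W = inverse spatial covariance gives
    the Mahalanobis version."""
    if W is None:
        W = np.eye(len(x))
    return (x @ W @ y) / np.sqrt((x @ W @ x) * (y @ W @ y))

rng = np.random.default_rng(1)
n = 5
# Illustrative spatial covariance with strongly correlated grid points.
S = 0.9 * np.ones((n, n)) + 0.1 * np.eye(n)
x = rng.standard_normal(n)
y = rng.standard_normal(n)

r_euclid = anomaly_correlation(x, y)
r_mahal = anomaly_correlation(x, y, np.linalg.inv(S))
print(r_euclid, r_mahal)  # the two metrics generally disagree
```

Since inv(S) is symmetric positive definite, the Mahalanobis correlation is still bounded by ±1 (Cauchy-Schwarz), but its sampling distribution differs from the Euclidean case, which is the paper's point.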

  2. Imaging the spectral reflectance properties of bipolar radiofrequency-fused bowel tissue

    Science.gov (United States)

    Clancy, Neil T.; Arya, Shobhit; Stoyanov, Danail; Du, Xiaofei; Hanna, George B.; Elson, Daniel S.

    2015-07-01

    Delivery of radiofrequency (RF) electrical energy is used during surgery to heat and seal tissue, such as vessels, allowing resection without blood loss. Recent work has suggested that this approach may be extended to allow surgical attachment of larger tissue segments for applications such as bowel anastomosis. In a large series of porcine surgical procedures, bipolar RF energy was used to resect and re-seal the small bowel in vivo with a commercial tissue fusion device (Ligasure; Covidien PLC, USA). The tissue was then imaged with a multispectral imaging laparoscope to obtain a spectral datacube comprising both fused and healthy tissue. Maps of blood volume, oxygen saturation and scattering power were derived from the measured reflectance spectra using an optimised light-tissue interaction model. A 60% increase in reflectance of visible light (460-700 nm) was observed after fusion, with the tissue taking on a white appearance. Despite this, the distinctive shape of the haemoglobin absorption spectrum was still noticeable in the 460-600 nm wavelength range. Scattering power increased in the fused region in comparison to normal serosa, while blood volume and oxygen saturation decreased. Observed fusion-induced changes in the reflectance spectrum are consistent with the biophysical changes induced through tissue denaturation and increased collagen cross-linking. The multispectral imager allows mapping of the spatial extent of these changes and classification of the zone of damaged tissue. Further analysis of the spectral data in parallel with histopathological examination of excised specimens will allow correlation of the optical property changes with microscopic alterations in tissue structure.

  3. Defects in MAP1S-mediated autophagy turnover of fibronectin cause renal fibrosis.

    Science.gov (United States)

    Xu, Guibin; Yue, Fei; Huang, Hai; He, Yongzhong; Li, Xun; Zhao, Haibo; Su, Zhengming; Jiang, Xianhan; Li, Wenjiao; Zou, Jing; Chen, Qi; Liu, Leyuan

    2016-05-01

    Excessive deposition of extracellular matrix proteins in renal tissues causes renal fibrosis and renal function failure. Mammalian cells primarily use the autophagy-lysosome system to degrade misfolded/aggregated proteins and dysfunctional organelles. MAP1S is an autophagy activator and promotes the biogenesis and degradation of autophagosomes. Previously, we reported that MAP1S suppresses hepatocellular carcinogenesis in a mouse model and predicts a better prognosis in patients suffering from clear cell renal cell carcinomas. Furthermore, we have shown that MAP1S enhances the turnover of fibronectin, and that mice overexpressing LC3 but lacking MAP1S accumulate fibronectin and develop liver fibrosis because of the synergistic impact of LC3-induced over-synthesis of fibronectin and impaired fibronectin degradation caused by MAP1S depletion. Here we show that suppression of MAP1S in renal cells impairs autophagy clearance of fibronectin and activates pyroptosis. Depletion of MAP1S in mice leads to an accumulation of fibrosis-related proteins and the development of renal fibrosis in aged mice. Levels of MAP1S were dramatically reduced and levels of fibronectin were greatly elevated in renal fibrotic tissues from patients diagnosed with renal atrophy and renal failure. Therefore, MAP1S deficiency may cause the accumulation of fibronectin and the development of renal fibrosis.

  4. NeuroMap: A spline-based interactive open-source software for spatiotemporal mapping of 2D and 3D MEA data

    Directory of Open Access Journals (Sweden)

    Oussama eAbdoun

    2011-01-01

    Full Text Available A major characteristic of neural networks is the complexity of their organization at various spatial scales, from microscopic local circuits to macroscopic brain-scale areas. Understanding how neural information is processed thus entails the ability to study them at multiple scales simultaneously. This is made possible using microelectrodes array (MEA) technology. Indeed, high-density MEAs provide large-scale coverage (several mm²) of whole neural structures combined with microscopic resolution (about 50 µm) of unit activity. Yet, current options for spatiotemporal representation of MEA-collected data remain limited. Here we present NeuroMap, a new interactive Matlab-based software for spatiotemporal mapping of MEA data. NeuroMap uses thin plate spline interpolation, which provides several assets with respect to conventional mapping methods used currently. First, any MEA design can be considered, including 2D or 3D, regular or irregular, arrangements of electrodes. Second, spline interpolation allows the estimation of activity across the tissue with local extrema not necessarily at recording sites. Finally, this interpolation approach provides a straightforward analytical estimation of the spatial Laplacian for better current sources localization. In this software, coregistration of 2D MEA data on the anatomy of the neural tissue is made possible by fine matching of anatomical data with electrode positions using rigid-deformation-based correction of anatomical pictures. Overall, NeuroMap provides substantial material for detailed spatiotemporal analysis of MEA data. The package is distributed under GNU General Public License (GPL) and available at http://sites.google.com/site/neuromapsoftware.

  5. NeuroMap: A Spline-Based Interactive Open-Source Software for Spatiotemporal Mapping of 2D and 3D MEA Data.

    Science.gov (United States)

    Abdoun, Oussama; Joucla, Sébastien; Mazzocco, Claire; Yvert, Blaise

    2011-01-01

    A major characteristic of neural networks is the complexity of their organization at various spatial scales, from microscopic local circuits to macroscopic brain-scale areas. Understanding how neural information is processed thus entails the ability to study them at multiple scales simultaneously. This is made possible using microelectrodes array (MEA) technology. Indeed, high-density MEAs provide large-scale coverage (several square millimeters) of whole neural structures combined with microscopic resolution (about 50 μm) of unit activity. Yet, current options for spatiotemporal representation of MEA-collected data remain limited. Here we present NeuroMap, a new interactive Matlab-based software for spatiotemporal mapping of MEA data. NeuroMap uses thin plate spline interpolation, which provides several assets with respect to conventional mapping methods used currently. First, any MEA design can be considered, including 2D or 3D, regular or irregular, arrangements of electrodes. Second, spline interpolation allows the estimation of activity across the tissue with local extrema not necessarily at recording sites. Finally, this interpolation approach provides a straightforward analytical estimation of the spatial Laplacian for better current sources localization. In this software, coregistration of 2D MEA data on the anatomy of the neural tissue is made possible by fine matching of anatomical data with electrode positions using rigid-deformation-based correction of anatomical pictures. Overall, NeuroMap provides substantial material for detailed spatiotemporal analysis of MEA data. The package is distributed under GNU General Public License and available at http://sites.google.com/site/neuromapsoftware.
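NeuroMap itself is Matlab-based, but the core thin plate spline idea can be sketched in Python with SciPy's RBFInterpolator (the electrode layout and potentials below are synthetic):

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Irregular electrode positions (any MEA layout works) with measured potentials.
rng = np.random.default_rng(0)
electrodes = rng.uniform(0, 1, size=(30, 2))      # 30 sites in a 1 x 1 mm patch
potentials = np.sin(2 * np.pi * electrodes[:, 0]) * np.cos(2 * np.pi * electrodes[:, 1])

# Thin plate spline interpolation across the whole tissue surface.
tps = RBFInterpolator(electrodes, potentials, kernel='thin_plate_spline')

# Evaluate on a dense grid; local extrema need not fall on recording sites.
gx, gy = np.meshgrid(np.linspace(0, 1, 50), np.linspace(0, 1, 50))
grid = np.column_stack([gx.ravel(), gy.ravel()])
field = tps(grid).reshape(50, 50)
print(field.shape)  # -> (50, 50)
```

With the default zero smoothing the spline passes exactly through the recorded values, and because the interpolant is analytic its spatial Laplacian can likewise be evaluated anywhere, which is the property NeuroMap exploits for current source localization.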

  6. Calculating the Probability of Returning a Loan with Binary Probability Models

    Directory of Open Access Journals (Sweden)

    Julian Vasilev

    2014-12-01

    Full Text Available The purpose of this article is to give a new approach to calculating the probability of returning a loan. Many factors affect the value of this probability. In this article, some influencing factors are identified using statistical and econometric models. The main approach is concerned with applying probit and logit models in loan management institutions, giving a new aspect to credit risk analysis. Calculating the probability of returning a loan is a difficult task. We assume that specific data fields concerning the contract (month of signing, year of signing, given sum) and data fields concerning the borrower of the loan (month of birth, year of birth (age), gender, region where he/she lives) may be independent variables in a binary logistic model with the dependent variable “the probability of returning a loan”. It is shown that the month of signing a contract, the year of signing a contract, and the gender and age of the loan owner do not affect the probability of returning a loan. The probability of returning a loan depends on the sum of the contract, the remoteness of the loan owner and the month of birth. The probability of returning a loan increases with the given sum, decreases with the proximity of the customer, is higher for people born at the beginning of the year and lower for those born at the end of the year.
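A binary logit model of this kind can be sketched with a minimal maximum-likelihood fit (the data below are synthetic, standing in for the article's loan records; the coefficient values are illustrative assumptions):

```python
import numpy as np

def fit_logit(X, y, lr=0.1, steps=5000):
    """Maximum-likelihood logistic regression by gradient ascent.
    Returns weights for P(loan is returned) = sigmoid(w0 + X @ w)."""
    X = np.column_stack([np.ones(len(X)), X])      # prepend intercept column
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w += lr * X.T @ (y - p) / len(y)           # average log-likelihood gradient
    return w

def predict_proba(w, X):
    X = np.column_stack([np.ones(len(X)), X])
    return 1.0 / (1.0 + np.exp(-X @ w))

# Toy data: one standardized predictor (loan sum); larger sums repay more
# often, mirroring the article's finding.
rng = np.random.default_rng(42)
sums = rng.standard_normal(500)
y = (rng.uniform(size=500) < 1 / (1 + np.exp(-(0.3 + 1.2 * sums)))).astype(float)

w = fit_logit(sums.reshape(-1, 1), y)
print(w)  # w[1], the effect of the loan sum, comes out clearly positive
```

In practice one would add the other contract and borrower fields as further columns of X and test their coefficients for significance, as the article does.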

  7. Quantitative maps of protein phosphorylation sites across 14 different rat organs and tissues

    DEFF Research Database (Denmark)

    Lundby, Alicia; Secher, Anna; Lage, Kasper

    2012-01-01

    Deregulated cellular signalling is a common hallmark of disease, and delineating tissue phosphoproteomes is key to unravelling the underlying mechanisms. Here we present the broadest tissue catalogue of phosphoproteins to date, covering 31,480 phosphorylation sites on 7,280 proteins quantified ac...

  8. On the Possibility of Assigning Probabilities to Singular Cases, or: Probability Is Subjective Too!

    Directory of Open Access Journals (Sweden)

    Mark R. Crovelli

    2009-06-01

    Full Text Available Both Ludwig von Mises and Richard von Mises claimed that numerical probability could not be legitimately applied to singular cases. This paper challenges this aspect of the von Mises brothers’ theory of probability. It is argued that their denial that numerical probability could be applied to singular cases was based solely upon Richard von Mises’ exceptionally restrictive definition of probability. This paper challenges Richard von Mises’ definition of probability by arguing that the definition of probability necessarily depends upon whether the world is governed by time-invariant causal laws. It is argued that if the world is governed by time-invariant causal laws, a subjective definition of probability must be adopted. It is further argued that the nature of human action and the relative frequency method for calculating numerical probabilities both presuppose that the world is indeed governed by time-invariant causal laws. It is finally argued that the subjective definition of probability undercuts the von Mises claim that numerical probability cannot legitimately be applied to singular, non-replicable cases.

  9. Local stem cell depletion model for normal tissue damage

    International Nuclear Information System (INIS)

    Yaes, R.J.; Keland, A.

    1987-01-01

    The hypothesis that radiation causes normal tissue damage by completely depleting local regions of tissue of viable stem cells leads to a simple mathematical model for such damage. In organs like skin and spinal cord, where destruction of a small volume of tissue leads to a clinically apparent complication, the complication probability is expressed as a function of dose, volume and stem cell number by a simple triple negative exponential function, analogous to the double exponential function of Munro and Gilbert for tumor control. The steep dose response curves for radiation myelitis that are obtained with our model are compared with the experimental data for radiation myelitis in laboratory rats. The model can be generalized to include other types of organs, high LET radiation, fractionated courses of radiation, and cases where an organ with a heterogeneous stem cell population receives an inhomogeneous dose of radiation. In principle it would thus be possible to determine the probability of tumor control and of damage to any organ within the radiation field if the dose distribution in three-dimensional space within a patient is known.
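The nested-exponential structure can be reconstructed from standard Poisson/binomial cell-kill arguments; the following is an illustrative derivation consistent with the description above, not the paper's exact notation:

```latex
% Munro & Gilbert (tumor control): all N clonogenic cells must be killed,
% with per-cell survival S(D) = e^{-D/D_0}:
\mathrm{TCP}(D) = \left(1 - e^{-D/D_0}\right)^{N}
                \approx \exp\!\left(-N\,e^{-D/D_0}\right).
% Local stem-cell depletion: a complication occurs if ANY of M tissue
% subvolumes, each holding n stem cells, is fully depleted:
P_{\mathrm{dep}}(D) = \left(1 - e^{-D/D_0}\right)^{n}
                    \approx \exp\!\left(-n\,e^{-D/D_0}\right),
\mathrm{NTCP}(D) = 1 - \left(1 - P_{\mathrm{dep}}\right)^{M}
                 \approx 1 - \exp\!\left(-M\,e^{-n\,e^{-D/D_0}}\right).
```

The final expression is a triple nested exponential in dose D, volume (through M) and stem cell number n, which reproduces the very steep dose-response curves the abstract describes.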

  10. Use of probability methods in prospecting-exploration in looking for oil. Primeneniye veroyatnostnykh metodov v poiskovo-razvedochnykh rabotakh na neft'

    Energy Technology Data Exchange (ETDEWEB)

    Kharbukh, Dzh U; Davton, Dzh Kh; Devis, Dzh K

    1981-01-01

    The experience of using probability methods under different geological conditions in the territory of the United States is summarized. The efficiency of systems analysis, simulation modeling of the prospecting-exploration process and of the conditions governing field locations, machine processing of data for plotting different types of structural maps, and probabilistic forecasting of the presence of fields is demonstrated. Special attention is focused on nonstructural traps. A brief dictionary of the mathematical and computer terms used in oil geology is presented.

  11. Fibroblast Growth Factors: Biology, Function, and Application for Tissue Regeneration

    Directory of Open Access Journals (Sweden)

    Ye-Rang Yun

    2010-01-01

    Full Text Available Fibroblast growth factors (FGFs) that signal through FGF receptors (FGFRs) regulate a broad spectrum of biological functions, including cellular proliferation, survival, migration, and differentiation. The FGF signal pathways are the RAS/MAP kinase pathway, PI3 kinase/AKT pathway, and PLCγ pathway, among which the RAS/MAP kinase pathway is known to be predominant. Several recent studies have demonstrated in vitro biological functions of FGFs relevant to tissue regeneration. However, to obtain optimal outcomes in vivo, it is important to enhance the half-life of FGFs and their biological stability. Future applications of FGFs are expected when the biological functions of FGFs are potentiated through the appropriate use of delivery systems and scaffolds. This review will introduce the biology and cellular functions of FGFs and deal with biomaterial-based delivery systems and their current applications for the regeneration of tissues, including skin, blood vessel, muscle, adipose, tendon/ligament, cartilage, bone, tooth, and nerve tissues.

  12. A Bayesian framework for cosmic string searches in CMB maps

    Energy Technology Data Exchange (ETDEWEB)

    Ciuca, Razvan; Hernández, Oscar F., E-mail: razvan.ciuca@mail.mcgill.ca, E-mail: oscarh@physics.mcgill.ca [Department of Physics, McGill University, 3600 rue University, Montréal, QC, H3A 2T8 (Canada)

    2017-08-01

    There exist various proposals to detect cosmic strings from Cosmic Microwave Background (CMB) or 21 cm temperature maps. Current proposals do not aim to find the location of strings on sky maps; rather, all of these approaches can be thought of as statistics on a sky map. We propose a Bayesian interpretation of cosmic string detection and, within that framework, derive a connection between estimates of cosmic string locations and the cosmic string tension Gμ. We use this Bayesian framework to develop a machine learning framework for detecting strings from sky maps and outline how to implement it with neural networks. The neural network we trained was able to detect and locate cosmic strings on noiseless CMB temperature maps down to a string tension of Gμ = 5×10⁻⁹, and when analyzing a CMB temperature map that does not contain strings, the neural network gives a 0.95 probability that Gμ ≤ 2.3×10⁻⁹.

  13. Practical obstacles and their mitigation strategies in compressional optical coherence elastography of biological tissues

    Directory of Open Access Journals (Sweden)

    Vladimir Y. Zaitsev

    2017-11-01

    Full Text Available In this paper, we point out some practical obstacles arising in realization of compressional optical coherence elastography (OCE) that have not attracted sufficient attention previously. Specifically, we discuss (i) complications in quantification of the Young modulus of tissues related to partial adhesion between the OCE probe and soft intervening reference layer sensor, (ii) distorting influence of tissue surface curvature/corrugation on the subsurface strain distribution mapping, (iii) ways of signal-to-noise ratio (SNR) enhancement in OCE strain mapping when periodic averaging is not realized, and (iv) potentially significant influence of tissue elastic nonlinearity on quantification of its stiffness. Potential practical approaches to mitigate the effects of these complications are also described.

  14. Modeling, Designing, and Implementing an Avatar-based Interactive Map

    Directory of Open Access Journals (Sweden)

    Stefan Andrei

    2016-03-01

    Full Text Available Designing interactive maps has always been a challenge due to the geographical complexity of the earth’s landscape and the difficulty of resolving details to a high resolution. In the past decade or so, one of the most impressive map-based software applications, the Global Positioning System (GPS), has had probably the highest level of interaction with the user. This article describes an innovative technique for designing an avatar-based virtual interactive map for the Lamar University campus, covering the buildings’ exteriors as well as their interiors. Many universities provide 2D or 3D maps and even interactive maps. However, these maps do not provide a complete interaction with the user. To the best of our knowledge, this project is the first avatar-based interaction game that allows full interaction with the user. This work provides tremendous help to freshman students and visitors of Lamar University. As an important marketing tool, the main objective is to gain better visibility of the campus worldwide and to increase the number of students attending Lamar University.

  15. Application of wildfire spread and behavior models to assess fire probability and severity in the Mediterranean region

    Science.gov (United States)

    Salis, Michele; Arca, Bachisio; Bacciu, Valentina; Spano, Donatella; Duce, Pierpaolo; Santoni, Paul; Ager, Alan; Finney, Mark

    2010-05-01

    Characterizing the spatial pattern of large fire occurrence and severity is an important feature of fire management planning in the Mediterranean region. The spatial characterization of fire probabilities, fire behavior distributions and value changes are key components for quantitative risk assessment and for prioritizing fire suppression resources, fuel treatments and law enforcement. Because of the growing wildfire severity and frequency in recent years (e.g., Portugal, 2003 and 2005; Italy and Greece, 2007 and 2009), there is an increasing demand for models and tools that can aid in wildfire prediction and prevention. Newer wildfire simulation systems offer promise in this regard, and allow for fine-scale modeling of wildfire severity and probability. Several new applications have resulted from the development of the minimum travel time (MTT) fire spread algorithm (Finney, 2002), which models fire growth by searching for the minimum time for fire to travel among nodes in a 2D network. The MTT approach makes it computationally feasible to simulate thousands of fires and generate burn probability and fire severity maps over large areas. The MTT algorithm is embedded in a number of research and fire modeling applications. High performance computers are typically used for MTT simulations, although the algorithm is also implemented in the FlamMap program (www.fire.org). In this work, we describe the application of the MTT algorithm to estimate spatial patterns of burn probability and to analyze wildfire severity in three fire-prone areas of the Mediterranean Basin, specifically the islands of Sardinia (Italy), Sicily (Italy) and Corsica (France). We assembled fuels and topographic data for the simulations in 500 x 500 m grids for the study areas. The simulations were run using 100,000 ignitions under weather conditions that replicated severe and moderate weather conditions (97th and 70th percentile, July and August weather, 1995-2007).
We used both random ignition locations
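The MTT idea, finding the minimum fire travel time among grid nodes, is at heart a shortest-path computation; a minimal Dijkstra-based sketch (the grid, spread rates and edge-cost rule below are toy assumptions, not FlamMap's fire behavior model):

```python
import heapq

def fire_arrival_times(spread_rate, ignition, cell_size=500.0):
    """Minimum-travel-time fire growth in the spirit of Finney's MTT:
    Dijkstra over a 2D grid where each edge cost is the travel time
    between cell centers at the local spread rate (m/min)."""
    rows, cols = len(spread_rate), len(spread_rate[0])
    INF = float('inf')
    t = [[INF] * cols for _ in range(rows)]
    t[ignition[0]][ignition[1]] = 0.0
    pq = [(0.0, ignition)]
    while pq:
        cur, (r, c) = heapq.heappop(pq)
        if cur > t[r][c]:
            continue
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1),
                       (1, 1), (1, -1), (-1, 1), (-1, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                dist = cell_size * (2 ** 0.5 if dr and dc else 1.0)
                rate = 0.5 * (spread_rate[r][c] + spread_rate[nr][nc])
                nt = cur + dist / rate
                if nt < t[nr][nc]:
                    t[nr][nc] = nt
                    heapq.heappush(pq, (nt, (nr, nc)))
    return t

# 3x3 toy landscape, uniform 10 m/min spread, ignition at one corner.
times = fire_arrival_times([[10.0] * 3 for _ in range(3)], (0, 0))
print(times[0][2])  # two 500 m steps at 10 m/min -> 100.0 minutes
```

Repeating such a computation for many random ignitions and counting how often each cell burns within a fire duration yields the burn probability maps described above.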

  16. Development of multigene expression signature maps at the protein level from digitized immunohistochemistry slides.

    Directory of Open Access Journals (Sweden)

    Gregory J Metzger

    Full Text Available Molecular classification of diseases based on multigene expression signatures is increasingly used for diagnosis, prognosis, and prediction of response to therapy. Immunohistochemistry (IHC is an optimal method for validating expression signatures obtained using high-throughput genomics techniques since IHC allows a pathologist to examine gene expression at the protein level within the context of histologically interpretable tissue sections. Additionally, validated IHC assays may be readily implemented as clinical tests since IHC is performed on routinely processed clinical tissue samples. However, methods have not been available for automated n-gene expression profiling at the protein level using IHC data. We have developed methods to compute expression level maps (signature maps of multiple genes from IHC data digitized on a commercial whole slide imaging system. Areas of cancer for these expression level maps are defined by a pathologist on adjacent, co-registered H&E slides, allowing assessment of IHC statistics and heterogeneity within the diseased tissue. This novel way of representing multiple IHC assays as signature maps will allow the development of n-gene expression profiling databases in three dimensions throughout virtual whole organ reconstructions.

  17. Performance of T2 Maps in the Detection of Prostate Cancer.

    Science.gov (United States)

    Chatterjee, Aritrick; Devaraj, Ajit; Mathew, Melvy; Szasz, Teodora; Antic, Tatjana; Karczmar, Gregory S; Oto, Aytekin

    2018-05-03

    This study compares the performance of T2 maps in the detection of prostate cancer (PCa) in comparison to T2-weighted (T2W) magnetic resonance images. The prospective study was institutional review board approved. Consenting patients (n = 45) with histologic confirmed PCa underwent preoperative 3-T magnetic resonance imaging with or without endorectal coil. Two radiologists, working independently, marked regions of interests (ROIs) on PCa lesions separately on T2W images and T2 maps. Each ROI was assigned a score of 1-5 based on the confidence in accurately detecting cancer, with 5 being the highest confidence. Subsequently, the histologically confirmed PCa lesions (n = 112) on whole-mount sections were matched with ROIs to calculate sensitivity, positive predictive value (PPV), and radiologist confidence score. Quantitative T2 values of PCa and benign tissue ROIs were measured. Sensitivity and confidence score for PCa detection were similar for T2W images (51%, 4.5 ± 0.8) and T2 maps (52%, 4.5 ± 0.6). However, PPV was significantly higher (P = .001) for T2 maps (88%) compared to T2W (72%) images. The use of endorectal coils nominally improved sensitivity (T2W: 55 vs 47%, T2 map: 54% vs 48%) compared to the use of no endorectal coils, but not the PPV and the confidence score. Quantitative T2 values for PCa (105 ± 28 milliseconds) were significantly (P = 9.3 × 10⁻¹⁴) lower than benign peripheral zone tissue (211 ± 71 milliseconds), with moderate significant correlation with Gleason score (ρ = -0.284). Our study shows that review of T2 maps by radiologists has similar sensitivity but higher PPV compared to T2W images. Additional quantitative information obtained from T2 maps is helpful in differentiating cancer from normal prostate tissue and determining its aggressiveness. Copyright © 2018 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
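Quantitative T2 values like those reported are typically estimated from multi-echo data by mono-exponential fitting; a sketch using the study's reported mean T2 values to generate synthetic decays (the echo times and signal amplitudes are assumptions for illustration):

```python
import numpy as np

def fit_t2(echo_times, signal):
    """Mono-exponential T2 estimate via log-linear least squares:
    S(TE) = S0 * exp(-TE / T2)  =>  log S = log S0 - TE / T2."""
    slope, intercept = np.polyfit(echo_times, np.log(signal), 1)
    return -1.0 / slope, np.exp(intercept)

# Synthetic multi-echo signals for cancerous vs benign peripheral-zone voxels,
# using the mean T2 values the study reports (105 ms vs 211 ms).
te = np.array([30.0, 60.0, 90.0, 120.0, 150.0])   # echo times in ms (assumed)
cancer = 1000.0 * np.exp(-te / 105.0)
benign = 1000.0 * np.exp(-te / 211.0)

t2_cancer, _ = fit_t2(te, cancer)
t2_benign, _ = fit_t2(te, benign)
print(round(t2_cancer), round(t2_benign))  # -> 105 211
```

Fitting every voxel this way produces the quantitative T2 map, and the clear gap between the two recovered values illustrates why the map separates cancer from benign tissue better than a qualitative T2W image.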

  18. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton’s laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world’s foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.

  19. Identification and target prediction of miRNAs specifically expressed in rat neural tissue

    Directory of Open Access Journals (Sweden)

    Tu Kang

    2009-05-01

    Full Text Available Abstract Background MicroRNAs (miRNAs) are a large group of RNAs that play important roles in regulating gene expression and protein translation. Several studies have indicated that some miRNAs are specifically expressed in human, mouse and zebrafish tissues. For example, miR-1 and miR-133 are specifically expressed in muscles. Tissue-specific miRNAs may have particular functions. Although previous studies have reported the presence of human, mouse and zebrafish tissue-specific miRNAs, there have been no detailed reports of rat tissue-specific miRNAs. In this study, home-made rat miRNA microarrays established in our previous study were used to investigate rat neural tissue-specific miRNAs and to map their target genes in rat tissues. This study will provide information for the functional analysis of these miRNAs. Results In order to obtain as complete a picture of specific miRNA expression in rat neural tissues as possible, customized miRNA microarrays with 152 selected miRNAs from miRBase were used to detect miRNA expression in 14 rat tissues. After a general clustering analysis, the 14 rat tissues could be clearly classified into neural and non-neural tissues based on the obtained expression profiles with significant p values. Conclusion Our work provides a global view of rat neural tissue-specific miRNA profiles and a target map of miRNAs, which is expected to contribute to future investigations of miRNA regulatory mechanisms in neural systems.

  20. Digital karyotyping reveals probable target genes at 7q21.3 locus in hepatocellular carcinoma

    Directory of Open Access Journals (Sweden)

    Wang Shengyue

    2011-07-01

    Full Text Available Abstract Background Hepatocellular carcinoma (HCC) is a worldwide malignant liver tumor with high incidence in China. Subchromosomal amplifications and deletions account for the major genomic alterations occurring in HCC. Digital karyotyping is an effective method for analyzing genome-wide chromosomal aberrations at high resolution. Methods A digital karyotyping library of HCC was constructed and the 454 Genome Sequencer FLX System (Roche) was applied in large-scale sequencing of the library. Digital Karyotyping Data Viewer software was used to analyze genomic amplifications and deletions. Genomic amplifications of genes detected by digital karyotyping were examined by real-time quantitative PCR. The mRNA expression levels of these genes in tumorous and paired nontumorous tissues were also detected by real-time quantitative RT-PCR. Results A total of 821,252 genomic tags were obtained from the digital karyotyping library of HCC, with 529,162 tags (64%) mapped to unique loci of the human genome. Multiple subchromosomal amplifications and deletions were detected through analyzing the digital karyotyping data, among which the amplification of 7q21.3 drew our special attention. Validation of genes harbored within amplicons at the 7q21.3 locus revealed that genomic amplification of SGCE, PEG10, DYNC1I1 and SLC25A13 occurred in 11 (21%), 11 (21%), 11 (21%) and 23 (44%) of the 52 HCC samples, respectively. Furthermore, the mRNA expression levels of SGCE, PEG10 and DYNC1I1 were significantly up-regulated in tumorous liver tissues compared with corresponding nontumorous counterparts. Conclusions Our results indicated that the subchromosomal region of 7q21.3 was amplified in HCC, and SGCE, PEG10 and DYNC1I1 are probable protooncogenes located within the 7q21.3 locus.

  1. How Do Users Map Points Between Dissimilar Shapes?

    KAUST Repository

    Hecher, Michael

    2017-07-25

    Finding similar points in globally or locally similar shapes has been studied extensively through the use of various point descriptors or shape-matching methods. However, little work exists on finding similar points in dissimilar shapes. In this paper, we present the results of a study where users were given two dissimilar two-dimensional shapes and asked to map a given point in the first shape to the point in the second shape they consider most similar. We find that user mappings in this study correlate strongly with simple geometric relationships between points and shapes. To predict the probability distribution of user mappings between any pair of simple two-dimensional shapes, two distinct statistical models are defined using these relationships. We perform a thorough validation of the accuracy of these predictions and compare our models qualitatively and quantitatively to well-known shape-matching methods. Using our predictive models, we propose an approach to map objects or procedural content between different shapes in different design scenarios.

  2. Sparse PDF maps for non-linear multi-resolution image operations

    KAUST Repository

    Hadwiger, Markus

    2012-11-01

    We introduce a new type of multi-resolution image pyramid for high-resolution images called sparse pdf maps (sPDF-maps). Each pyramid level consists of a sparse encoding of continuous probability density functions (pdfs) of pixel neighborhoods in the original image. The encoded pdfs enable the accurate computation of non-linear image operations directly in any pyramid level with proper pre-filtering for anti-aliasing, without accessing higher or lower resolutions. The sparsity of sPDF-maps makes them feasible for gigapixel images, while enabling direct evaluation of a variety of non-linear operators from the same representation. We illustrate this versatility for antialiased color mapping, O(n) local Laplacian filters, smoothed local histogram filters (e.g., median or mode filters), and bilateral filters. © 2012 ACM.
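The core idea, evaluating a non-linear operator such as the median directly from a neighborhood pdf rather than from the pixels themselves, can be sketched in a few lines. The dense histogram encoding below is a toy stand-in for the paper's sparse pdf representation:

```python
import numpy as np

def neighborhood_pdf(values, bins=16, lo=0.0, hi=1.0):
    # Histogram estimate of the pdf of a pixel neighborhood.
    hist, edges = np.histogram(values, bins=bins, range=(lo, hi))
    pdf = hist / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    return pdf, centers

def median_from_pdf(pdf, centers):
    # A non-linear operator (the median) evaluated from the pdf alone:
    # the median sits where the cdf first reaches 0.5.
    cdf = np.cumsum(pdf)
    return float(centers[np.searchsorted(cdf, 0.5)])

rng = np.random.default_rng(0)
patch = rng.uniform(0.0, 1.0, size=64)   # one pixel neighborhood
pdf, centers = neighborhood_pdf(patch)
med = median_from_pdf(pdf, centers)      # close to np.median(patch)
```

The same pdf could be queried for a mode filter or re-weighted for a bilateral filter, which is the versatility the abstract describes.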

  3. Influence of contact force on voltage mapping: A combined magnetic resonance imaging and electroanatomic mapping study in patients with tetralogy of Fallot.

    Science.gov (United States)

    Teijeira-Fernandez, Elvis; Cochet, Hubert; Bourier, Felix; Takigawa, Masateru; Cheniti, Ghassen; Thompson, Nathaniel; Frontera, Antonio; Camaioni, Claudia; Massouille, Gregoire; Jalal, Zakaria; Derval, Nicolas; Iriart, Xavier; Denis, Arnaud; Hocini, Meleze; Haissaguerre, Michel; Jais, Pierre; Thambo, Jean-Benoit; Sacher, Frederic

    2018-03-20

    Voltage criteria for ventricular mapping have been obtained from small series of patients and prioritizing high specificity. The purpose of this study was to analyse the potential influence of contact force (CF) on voltage mapping and to define voltage cutoff values for right ventricular (RV) scar using the tetralogy of Fallot as a model of transmural RV scar and magnetic resonance imaging (MRI) as reference. Fourteen patients (age 32.6 ± 14.3 years; 5 female) with repaired tetralogy of Fallot underwent high-resolution cardiac MRI (1.25 × 1.25 × 2.5 mm). Scar, defined as pixels with intensity >50% maximum, was mapped over the RV geometry and merged within the CARTO system to RV endocardial voltage maps acquired using a 3.5-mm ablation catheter with CF technology (SmartTouch, Biosense Webster). In total, 2446 points were analyzed, 915 within scars and 1531 in healthy tissue according to MRI. CF correlated to unipolar (ρ = 0.186; P voltage in healthy tissue (ρ = 0.245; P voltage cutoffs of 5.19 mV for unipolar voltage and 1.76 mV for bipolar voltage, yielding sensitivity/specificity of 0.89/0.85 and 0.9/0.9, respectively. CF is an important factor to be taken into account for voltage mapping. If good CF is applied, unipolar and bipolar voltage cutoffs of 5.19 mV and 1.76 mV are optimal for identifying RV scar on endocardial mapping with the SmartTouch catheter. Data on the diagnostic accuracy of different voltage cutoff values are provided. Copyright © 2018 Heart Rhythm Society. Published by Elsevier Inc. All rights reserved.

  4. Brain Injury Lesion Imaging Using Preconditioned Quantitative Susceptibility Mapping without Skull Stripping.

    Science.gov (United States)

    Soman, S; Liu, Z; Kim, G; Nemec, U; Holdsworth, S J; Main, K; Lee, B; Kolakowsky-Hayner, S; Selim, M; Furst, A J; Massaband, P; Yesavage, J; Adamson, M M; Spincemallie, P; Moseley, M; Wang, Y

    2018-04-01

Identifying cerebral microhemorrhage burden can aid in the diagnosis and management of traumatic brain injury, stroke, hypertension, and cerebral amyloid angiopathy. MR imaging susceptibility-based methods are more sensitive than CT for detecting cerebral microhemorrhage, but methods other than quantitative susceptibility mapping provide results that vary with field strength and TE, require additional phase maps to distinguish blood from calcification, and depict cerebral microhemorrhages as bloom artifacts. Quantitative susceptibility mapping provides universal quantification of tissue magnetic properties without these constraints but traditionally requires a mask generated by skull-stripping, which can pose challenges at tissue interfaces. We evaluated the preconditioned quantitative susceptibility mapping MR imaging method, which does not require skull-stripping, for improved depiction of brain parenchyma and pathology. Fifty-six subjects underwent brain MR imaging with a 3D multiecho gradient recalled echo acquisition. Mask-based quantitative susceptibility mapping images were created using a commonly used mask-based quantitative susceptibility mapping method, and preconditioned quantitative susceptibility images were made using precondition-based total field inversion. All images were reviewed by a neuroradiologist and a radiology resident. Ten subjects (18%), all with traumatic brain injury, demonstrated blood products on 3D gradient recalled echo imaging. All lesions were visible on preconditioned quantitative susceptibility mapping, while 6 were not visible on mask-based quantitative susceptibility mapping. Thirty-one subjects (55%) demonstrated brain parenchyma and/or lesions that were visible on preconditioned quantitative susceptibility mapping but not on mask-based quantitative susceptibility mapping. Six subjects (11%) demonstrated pons artifacts on preconditioned quantitative susceptibility mapping and mask-based quantitative susceptibility mapping

  5. Probabilistic storm surge inundation maps for Metro Manila based on Philippine public storm warning signals

    Science.gov (United States)

    Tablazon, J.; Caro, C. V.; Lagmay, A. M. F.; Briones, J. B. L.; Dasallas, L.; Lapidez, J. P.; Santiago, J.; Suarez, J. K.; Ladiero, C.; Gonzalo, L. A.; Mungcal, M. T. F.; Malano, V.

    2015-03-01

A storm surge is the sudden rise of sea water over the astronomical tides, generated by an approaching storm. This event poses a major threat to Philippine coastal areas, as manifested by Typhoon Haiyan on 8 November 2013. This hydro-meteorological hazard was one of the main reasons for the high number of casualties due to the typhoon, with 6300 deaths. It became evident that developing storm surge inundation maps is of utmost importance. To develop these maps, the Nationwide Operational Assessment of Hazards under the Department of Science and Technology (DOST-Project NOAH) simulated historical tropical cyclones that entered the Philippine Area of Responsibility. The Japan Meteorological Agency storm surge model was used to simulate storm surge heights. The frequency distribution of the maximum storm surge heights was calculated using simulation results of tropical cyclones under a specific public storm warning signal (PSWS) that passed through a particular coastal area. This determines the storm surge height corresponding to a given probability of occurrence. The storm surge heights from the model were added to the maximum astronomical tide data from WXTide software. The team then created inundation maps for a specific PSWS using the probability of exceedance derived from the frequency distribution. Buildings and other structures were assigned a probability of exceedance depending on their occupancy category, i.e., 1% probability of exceedance for critical facilities, 10% for special occupancy structures, and 25% for standard occupancy and miscellaneous structures. The maps produced show the storm-surge-vulnerable areas in Metro Manila, illustrated by flood depths of up to 4 m and extents of up to 6.5 km from the coastline. This information can help local government units in developing early warning systems, disaster preparedness and mitigation plans, vulnerability assessments, risk-sensitive land use plans, shoreline
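The exceedance step described above, mapping an occupancy category's probability of exceedance to a design surge height via the empirical frequency distribution, can be sketched as follows (the surge heights are made-up illustrative values, not NOAH model output):

```python
import numpy as np

def exceedance_height(surge_heights_m, p_exceed):
    # Height exceeded with probability p_exceed, read off the empirical
    # distribution of simulated maximum storm surge heights.
    return float(np.quantile(surge_heights_m, 1.0 - p_exceed))

# illustrative simulated maxima (m) for one PSWS at one coastal segment
heights = np.array([0.4, 0.6, 0.8, 1.0, 1.2, 1.5, 1.9, 2.4, 3.1, 4.0])
h_critical = exceedance_height(heights, 0.01)  # critical facilities (1%)
h_special = exceedance_height(heights, 0.10)   # special occupancy (10%)
h_standard = exceedance_height(heights, 0.25)  # standard occupancy (25%)
```

Rarer exceedance probabilities yield larger design heights, which is why critical facilities are mapped against the most extreme surge levels.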

  6. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  7. Evolution of probability measures by cellular automata on algebraic topological Markov chains

    Directory of Open Access Journals (Sweden)

    ALEJANDRO MAASS

    2003-01-01

    Full Text Available In this paper we review some recent results on the evolution of probability measures under cellular automata acting on a fullshift. In particular we discuss the crucial role of the attractiveness of maximal measures. We enlarge the context of the results of a previous study of topological Markov chains that are Abelian groups; the shift map is an automorphism of this group. This is carried out by studying the dynamics of Markov measures by a particular additive cellular automata. Many of these topics were within the focus of Francisco Varela's mathematical interests.

  8. An electrical impedance tomography (EIT) multi-electrode needle-probe device for local assessment of heterogeneous tissue impeditivity.

    Science.gov (United States)

    Meroni, Davide; Maglioli, Camilla Carpano; Bovio, Dario; Greco, Francesco G; Aliverti, Andrea

    2017-07-01

Electrical Impedance Tomography (EIT) is an image reconstruction technique applied in medicine for the electrical imaging of living tissues. The literature provides evidence that large resistivity variations exist between different human tissues. Following this interest in the electrical characterization of biological samples, attention has recently also focused on identifying and characterizing human tissue by studying the homogeneity of its structure. An 8-electrode needle-probe device has been developed with the intent of identifying structural inhomogeneities under the surface layers. Ex-vivo impeditivity measurements were performed by placing the needle-probe in 5 different patterns of fat and lean porcine tissue, and impeditivity maps were obtained with EIDORS, open source software for image reconstruction in electrical impedance. The values composing the maps were analyzed, showing good tissue discrimination and conformity with the real images. We conclude that this device is able to produce impeditivity maps that match reality in position and orientation. In all five patterns presented it is possible to identify and replicate correctly the heterogeneous tissue under test. This new procedure can help medical staff fully characterize a biological sample in different unclear situations.

  9. Plan curvature and landslide probability in regions dominated by earth flows and earth slides

    Science.gov (United States)

    Ohlmacher, G.C.

    2007-01-01

Damaging landslides in the Appalachian Plateau and scattered regions within the Midcontinent of North America highlight the need for landslide-hazard mapping and a better understanding of the geomorphic development of landslide terrains. The Plateau and Midcontinent have the necessary ingredients for landslides including sufficient relief, steep slope gradients, Pennsylvanian and Permian cyclothems that weather into fine-grained soils containing considerable clay, and adequate precipitation. One commonly used parameter in landslide-hazard analysis that is in need of further investigation is plan curvature. Plan curvature is the curvature of the hillside in a horizontal plane or the curvature of the contours on a topographic map. Hillsides can be subdivided into regions of concave outward plan curvature called hollows, convex outward plan curvature called noses, and straight contours called planar regions. Statistical analysis of plan-curvature and landslide datasets indicate that hillsides with planar plan curvature have the highest probability for landslides in regions dominated by earth flows and earth slides in clayey soils (CH and CL). The probability of landslides decreases as the hillsides become more concave or convex. Hollows have a slightly higher probability for landslides than noses. In hollows landslide material converges into the narrow region at the base of the slope. The convergence combined with the cohesive nature of fine-grained soils creates a buttressing effect that slows soil movement and increases the stability of the hillside within the hollow. Statistical approaches that attempt to determine landslide hazard need to account for the complex relationship between plan curvature, type of landslide, and landslide susceptibility. © 2007 Elsevier B.V. All rights reserved.

  10. Normal Tissue Complication Probability Estimation by the Lyman-Kutcher-Burman Method Does Not Accurately Predict Spinal Cord Tolerance to Stereotactic Radiosurgery

    International Nuclear Information System (INIS)

    Daly, Megan E.; Luxton, Gary; Choi, Clara Y.H.; Gibbs, Iris C.; Chang, Steven D.; Adler, John R.; Soltys, Scott G.

    2012-01-01

Purpose: To determine whether normal tissue complication probability (NTCP) analyses of the human spinal cord by use of the Lyman-Kutcher-Burman (LKB) model, supplemented by linear–quadratic modeling to account for the effect of fractionation, predict the risk of myelopathy from stereotactic radiosurgery (SRS). Methods and Materials: From November 2001 to July 2008, 24 spinal hemangioblastomas in 17 patients were treated with SRS. Of the tumors, 17 received 1 fraction with a median dose of 20 Gy (range, 18–30 Gy) and 7 received 20 to 25 Gy in 2 or 3 sessions, with cord maximum doses of 22.7 Gy (range, 17.8–30.9 Gy) and 22.0 Gy (range, 20.2–26.6 Gy), respectively. By use of conventional values for α/β, volume parameter n, 50% complication probability dose TD50, and inverse slope parameter m, a computationally simplified implementation of the LKB model was used to calculate the biologically equivalent uniform dose and NTCP for each treatment. Exploratory calculations were performed with alternate values of α/β and n. Results: In this study 1 case (4%) of myelopathy occurred. The LKB model using radiobiological parameters from Emami and the logistic model with parameters from Schultheiss overestimated complication rates, predicting 13 complications (54%) and 18 complications (75%), respectively. An increase in the volume parameter (n), to assume greater parallel organization, improved the predictive value of the models. Maximum-likelihood LKB fitting of α/β and n yielded better predictions (0.7 complications), with n = 0.023 and α/β = 17.8 Gy. Conclusions: The spinal cord tolerance to the dosimetry of SRS is higher than predicted by the LKB model using any set of accepted parameters. Only a high α/β value in the LKB model and only a large volume effect in the logistic model with Schultheiss data could explain the low number of complications observed. This finding emphasizes that radiobiological models traditionally used to estimate spinal cord NTCP
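For readers unfamiliar with the LKB machinery the abstract relies on, the model reduces to a generalized equivalent uniform dose fed through a probit curve. A minimal sketch (the TD50 and m values below are illustrative, not the Emami parameter set):

```python
import math

def geud(doses_gy, volumes, n):
    # Generalized equivalent uniform dose with volume parameter n (a = 1/n).
    a = 1.0 / n
    total = float(sum(volumes))
    return sum((v / total) * d**a for d, v in zip(doses_gy, volumes)) ** (1.0 / a)

def lkb_ntcp(eud_gy, td50_gy, m):
    # LKB complication probability: probit of t = (EUD - TD50) / (m * TD50).
    t = (eud_gy - td50_gy) / (m * td50_gy)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# uniform 20 Gy to the whole structure; illustrative TD50 and m
eud_uniform = geud([20.0, 20.0], [1.0, 1.0], n=0.05)
p_at_td50 = lkb_ntcp(eud_uniform, td50_gy=20.0, m=0.18)  # 0.5 by construction
```

A small n makes the gEUD track the maximum dose (serial organ); the abstract's maximum-likelihood fit pushed n upward, toward more parallel behavior.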

  11. Direct tissue oxygen monitoring by in vivo photoacoustic lifetime imaging (PALI)

    Science.gov (United States)

    Shao, Qi; Morgounova, Ekaterina; Ashkenazi, Shai

    2014-03-01

    Tissue oxygen plays a critical role in maintaining tissue viability and in various diseases, including response to therapy. Images of oxygen distribution provide the history of tissue hypoxia and evidence of oxygen availability in the circulatory system. Currently available methods of direct measuring or imaging tissue oxygen all have significant limitations. Previously, we have reported a non-invasive in vivo imaging modality based on photoacoustic lifetime. The technique maps the excited triplet state of oxygen-sensitive dye, thus reflects the spatial and temporal distribution of tissue oxygen. We have applied PALI on tumor hypoxia in small animals, and the hypoxic region imaged by PALI is consistent with the site of the tumor imaged by ultrasound. Here, we present two studies of applying PALI to monitor changes of tissue oxygen by modulations. The first study involves an acute ischemia model using a thin thread tied around the hind limb of a normal mouse to reduce the blood flow. PALI images were acquired before, during, and after the restriction. The drop of muscle pO2 and recovery from hypoxia due to reperfusion were observed by PALI tracking the same region. The second study modulates tissue oxygen by controlling the percentage of oxygen the mouse inhales. We demonstrate that PALI is able to reflect the change of oxygen level with respect to both hyperbaric and hypobaric conditions. We expect this technique to be very attractive for a range of clinical applications in which tissue oxygen mapping would improve therapy decision making and treatment planning.

  12. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...

  13. Blind source separation of ex-vivo aorta tissue multispectral images.

    Science.gov (United States)

    Galeano, July; Perez, Sandra; Montoya, Yonatan; Botina, Deivid; Garzón, Johnson

    2015-05-01

Blind Source Separation (BSS) methods aim to decompose a given signal into its main components, or source signals. These techniques have been widely used in the literature for the analysis of biomedical images, in order to extract the main components of an organ or tissue under study. The analysis of skin images for the extraction of melanin and hemoglobin is an example of the use of BSS. This paper presents a proof of concept of the use of source separation on ex-vivo aorta tissue multispectral images. The images are acquired with an interference filter-based imaging system and processed by means of two algorithms: Independent Component Analysis and Non-negative Matrix Factorization. In both cases, it is possible to obtain maps that quantify the concentration of the main chromophores present in aortic tissue. The algorithms also recover the spectral absorbance of the main tissue components. These spectral signatures were compared against theoretical ones using correlation coefficients, which report values close to 0.9, a good indicator of the method's performance. The correlation coefficients also lead to the identification of the concentration maps according to the evaluated chromophore. The results suggest that multi/hyperspectral systems together with image processing techniques are a potential tool for the analysis of cardiovascular tissue.
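As a concrete illustration of the NMF half of such a pipeline, here is a minimal multiplicative-update factorization run on a synthetic two-chromophore mixture (the signatures and concentration maps are made up; this is not the paper's data or algorithm configuration):

```python
import numpy as np

def nmf(V, k, iters=500, seed=0):
    # Minimal multiplicative-update NMF: V ~= W @ H with W, H >= 0.
    # Columns of W play the role of spectral signatures; rows of H are
    # the corresponding concentration maps (flattened over pixels).
    rng = np.random.default_rng(seed)
    W = rng.random((V.shape[0], k)) + 1e-3
    H = rng.random((k, V.shape[1])) + 1e-3
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + 1e-12)
        W *= (V @ H.T) / (W @ H @ H.T + 1e-12)
    return W, H

# synthetic mixture: 3 spectral bands, 2 chromophores, 100 pixels
rng = np.random.default_rng(1)
true_W = np.array([[0.9, 0.1], [0.5, 0.5], [0.1, 0.9]])
V = true_W @ rng.random((2, 100))
W, H = nmf(V, k=2)
rel_err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

The non-negativity constraint is what makes the recovered factors interpretable as absorbance spectra and concentration maps.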

  14. Connective tissue regeneration in skeletal muscle after eccentric contraction-induced injury

    DEFF Research Database (Denmark)

    Mackey, Abigail Louise; Kjaer, Michael

    2017-01-01

Human skeletal muscle has the potential to regenerate completely after injury induced under controlled experimental conditions. The events inside the myofibres as they undergo necrosis, followed closely by satellite cell mediated myogenesis, have been mapped in detail. Much less is known about the adaptation throughout this process of both the connective tissue structures surrounding the myofibres and the fibroblasts, the cells responsible for synthesising this connective tissue. However, the few studies investigating muscle connective tissue remodelling demonstrate a strong response that appears

  15. A brief introduction to probability.

    Science.gov (United States)

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

    The theory of probability has been debated for centuries: back in 1600, French mathematics used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution" that is a function providing the probabilities of occurrence of different possible outcomes of a categorical or continuous variable. Specific attention will be focused on normal distribution that is the most relevant distribution applied to statistical analysis.
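As a worked example of a "probability distribution" in the paper's sense, the normal density and its cumulative form can be written down directly from the standard formulas:

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    # Density of the normal distribution N(mu, sigma^2).
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2.0 * math.pi))

def normal_cdf(x, mu=0.0, sigma=1.0):
    # P(X <= x), expressed through the error function.
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))
```

For instance, `normal_cdf(1.96)` is approximately 0.975, the familiar bound behind two-sided 5% significance tests.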

  16. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...... probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...
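For the simplest ARUM, the multinomial logit, the CPGF is the log-sum-exp of the utilities and its gradient recovers the softmax choice probabilities. A small numerical check of this standard result (not code from the paper):

```python
import math

def logit_cpgf(u):
    # CPGF for multinomial logit: G(u) = log(sum_j exp(u_j)),
    # computed stably by shifting with the maximum utility.
    m = max(u)
    return m + math.log(sum(math.exp(x - m) for x in u))

def choice_probs(u):
    # Gradient of G: the softmax choice probabilities.
    m = max(u)
    e = [math.exp(x - m) for x in u]
    s = sum(e)
    return [x / s for x in e]

u = [1.0, 2.0, 3.0]
G = logit_cpgf(u)
p = choice_probs(u)
# finite-difference check that dG/du_0 matches p[0]
g0 = (logit_cpgf([u[0] + 1e-6, u[1], u[2]]) - G) / 1e-6
```

The gradient-of-CPGF property is exactly what the abstract states in general form for any ARUM.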

  17. Probability of satellite collision

    Science.gov (United States)

    Mccarter, J. W.

    1972-01-01

    A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.
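A common simplification behind such calculations treats debris encounters as a Poisson process, so the collision probability follows from the expected number of encounters. The flux, area, and duration below are illustrative placeholders, not values from the study:

```python
import math

def collision_probability(flux, cross_section, duration):
    # Poisson encounter model: expected encounters N = flux * area * time,
    # so P(at least one collision) = 1 - exp(-N).
    n_expected = flux * cross_section * duration
    return 1.0 - math.exp(-n_expected)

# illustrative: encounters per m^2 per year, station area in m^2, years
p_one_year = collision_probability(1e-7, 500.0, 1.0)
p_ten_years = collision_probability(1e-7, 500.0, 10.0)
```

For small expected counts the probability is approximately N itself, which is why collision risk scales nearly linearly with mission duration in that regime.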

  18. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  19. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  20. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  1. Synthetic aperture tissue and flow ultrasound imaging

    DEFF Research Database (Denmark)

    Nikolov, Svetoslav

imaging applied to medical ultrasound. It is divided into two major parts: tissue and blood flow imaging. Tissue imaging using synthetic aperture algorithms has been investigated for about two decades, but has not been implemented in medical scanners yet. Among other reasons, the conventional scanning and beamformation methods are adequate for the imaging modalities in clinical use - the B-mode imaging of tissue structures, and the color mapping of blood flow. The acquisition time, however, is too long, and these methods fail to perform real-time three-dimensional scans. The synthetic transmit aperture, on the other hand, can create a B-mode image with as little as 2 emissions, thus significantly speeding up the scan procedure. The first part of the dissertation describes synthetic aperture tissue imaging. It starts with an overview of the efforts previously made by other research groups. A classification

  2. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  3. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...

  4. Evaluation of nuclear power plant component failure probability and core damage probability using simplified PSA model

    International Nuclear Information System (INIS)

    Shimada, Yoshio

    2000-01-01

It is anticipated that changing the frequency of surveillance tests, preventive maintenance or parts replacement of safety-related components may change component failure probability and, in turn, core damage probability. It is also anticipated that the change differs depending on the initiating event frequency and the component type. This study assessed the change of core damage probability using a simplified PSA model capable of calculating core damage probability in a short time period, developed by the US NRC to process accident sequence precursors, when the failure probability of various components is changed between 0 and 1, or when Japanese or American initiating event frequency data are used. As a result of the analysis: (1) The frequency of surveillance tests, preventive maintenance or parts replacement of motor-driven pumps (high pressure injection pumps, residual heat removal pumps, auxiliary feedwater pumps) should be changed carefully, since the change in core damage probability is large when the base failure probability increases. (2) Core damage probability is insensitive to changes in surveillance test frequency for motor-operated valves and the turbine-driven auxiliary feedwater pump, since the change in core damage probability is small even when their failure probability changes by an order of magnitude. (3) The change in core damage probability is small when Japanese failure probability data are applied to the emergency diesel generator, even if the failure probability changes by an order of magnitude from the base value. On the other hand, when American failure probability data are applied, the increase in core damage probability is large when the failure probability increases. Therefore, when Japanese failure probability data are applied, core damage probability is insensitive to changes in surveillance test frequency, etc. (author)
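The kind of sensitivity sweep described, varying one component's failure probability between 0 and 1 and watching core damage probability respond, can be sketched on a toy two-train model. The structure below is purely illustrative, not the NRC accident sequence precursor model:

```python
def core_damage_frequency(ie_freq, p_dg, p_pump, p_valve):
    # Toy model: given an initiating event, core damage occurs if the
    # emergency diesel generator fails, or if both the injection pump
    # and its discharge valve fail.
    p_mitigation_fails = p_dg + (1.0 - p_dg) * p_pump * p_valve
    return ie_freq * p_mitigation_fails

# sensitivity sketch: sweep one component's failure probability upward
base = core_damage_frequency(1e-2, 1e-3, 1e-3, 1e-3)
degraded_pump = core_damage_frequency(1e-2, 1e-3, 0.5, 1e-3)
worst_dg = core_damage_frequency(1e-2, 1.0, 1e-3, 1e-3)  # DG certain to fail
```

Even this toy shows the asymmetry the study reports: a single-point component like the diesel generator drives the result far harder than one leg of a redundant pair.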

  5. Comparison of Cerebral Glucose Metabolism between Possible and Probable Multiple System Atrophy

    Directory of Open Access Journals (Sweden)

    Kyum-Yil Kwon

    2009-05-01

Full Text Available Background: To investigate the relationship between presenting clinical manifestations and imaging features of multisystem neuronal dysfunction in MSA patients, using 18F-fluorodeoxyglucose positron emission tomography (18F-FDG PET). Methods: We studied 50 consecutive MSA patients with characteristic brain MRI findings of MSA, including 34 patients with early MSA-parkinsonian (MSA-P) and 16 with early MSA-cerebellar (MSA-C). The cerebral glucose metabolism of all MSA patients was evaluated in comparison with 25 age-matched controls. 18F-FDG PET results were assessed by Statistical Parametric Mapping (SPM) analysis and the regions of interest (ROI) method. Results: The mean time from disease onset to 18F-FDG PET was 25.9±13.0 months in 34 MSA-P patients and 20.1±11.1 months in 16 MSA-C patients. Glucose metabolism of the putamen showed a greater decrease in possible MSA-P than in probable MSA-P (p=0.031). Although the Unified Multiple System Atrophy Rating Scale (UMSARS) score did not differ between possible MSA-P and probable MSA-P, the subscores of rigidity (p=0.04) and bradykinesia (p=0.008) were significantly higher in possible MSA-P than in probable MSA-P. Possible MSA-C showed a greater decrease in glucose metabolism of the cerebellum than probable MSA-C (p=0.016). Conclusions: Our results may suggest that the early neuropathological pattern of possible MSA with a predilection for the striatonigral or olivopontocerebellar system differs from that of probable MSA, which has prominent involvement of the autonomic nervous system in addition to the striatonigral or olivopontocerebellar system.

  6. Mapping the Relative Probability of Common Toad Occurrence in Terrestrial Lowland Farm Habitat in the United Kingdom.

    Directory of Open Access Journals (Sweden)

    Rosie D Salazar

    Full Text Available The common toad (Bufo bufo) is of increasing conservation concern in the United Kingdom (UK) due to dramatic population declines occurring in the past century. Many of these population declines coincided with reductions in both terrestrial and aquatic habitat availability and quality and have been primarily attributed to the effect of agricultural land conversion (of natural and semi-natural habitats to arable and pasture fields) and pond drainage. However, there is little evidence available to link habitat availability with common toad population declines, especially when examined at a broad landscape scale. Assessing such patterns of population decline at the landscape scale, for instance, requires an understanding of how this species uses terrestrial habitat. We intensively studied the terrestrial resource selection of a large population of common toads in Oxfordshire, England, UK. Adult common toads were fitted with passive integrated transponder (PIT) tags to allow detection in the terrestrial environment using a portable PIT antenna once toads left the pond and before going into hibernation (April/May-October 2012 and 2013). We developed a population-level resource selection function (RSF) to assess the relative probability of toad occurrence in the terrestrial environment by collecting location data for 90 recaptured toads. The predicted relative probability of toad occurrence for this population was greatest in wooded habitat near to water bodies; the relative probability of occurrence declined dramatically > 50 m from these habitats. Toads also tended to select habitat near to their breeding pond, and toad occurrence was negatively related to urban environments.
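A resource selection function of the kind used here assigns each habitat cell a relative (not absolute) probability of use, typically of exponential form w(x) = exp(β·x). The sketch below is illustrative only: the covariates and coefficient values are hypothetical, with signs chosen to mirror the reported findings, not the study's fitted values:

```python
import math

# Hypothetical RSF coefficients: selection for woodland near water,
# avoidance of urban cover (signs mirror the abstract; values invented).
beta = {"dist_to_water_m": -0.02, "woodland": 1.2, "urban": -1.5}

def rsf_score(dist_to_water_m, woodland, urban):
    """Exponential RSF w(x) = exp(beta . x); returns a *relative*
    probability of use, meaningful only as a ratio between locations."""
    lin = (beta["dist_to_water_m"] * dist_to_water_m
           + beta["woodland"] * woodland
           + beta["urban"] * urban)
    return math.exp(lin)

near = rsf_score(10, woodland=1, urban=0)   # wooded cell 10 m from water
far = rsf_score(100, woodland=0, urban=1)   # urban cell 100 m from water
print(near / far)  # selection ratio strongly favours the wooded cell
```

In practice the coefficients would be fitted by logistic regression contrasting used (telemetry) locations with available background points.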

  7. Effects of white matter microstructure on phase and susceptibility maps.

    Science.gov (United States)

    Wharton, Samuel; Bowtell, Richard

    2015-03-01

    To investigate the effects on quantitative susceptibility mapping (QSM) and susceptibility tensor imaging (STI) of the frequency variation produced by the microstructure of white matter (WM). The frequency offsets in a WM tissue sample that are not explained by the effect of bulk isotropic or anisotropic magnetic susceptibility, but rather result from the local microstructure, were characterized for the first time. QSM and STI were then applied to simulated frequency maps that were calculated using a digitized whole-brain, WM model formed from anatomical and diffusion tensor imaging data acquired from a volunteer. In this model, the magnitudes of the frequency contributions due to anisotropy and microstructure were derived from the results of the tissue experiments. The simulations suggest that the frequency contribution of microstructure is much larger than that due to bulk effects of anisotropic magnetic susceptibility. In QSM, the microstructure contribution introduced artificial WM heterogeneity. For the STI processing, the microstructure contribution caused the susceptibility anisotropy to be significantly overestimated. Microstructure-related phase offsets in WM yield artifacts in the calculated susceptibility maps. If susceptibility mapping is to become a robust MRI technique, further research should be carried out to reduce the confounding effects of microstructure-related frequency contributions. © 2014 Wiley Periodicals, Inc.
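The forward model underlying QSM maps a susceptibility distribution to a field (frequency) map through the unit dipole kernel in k-space, D(k) = 1/3 - kz²/|k|² (B0 along z). A minimal sketch of this forward step, using a synthetic susceptibility map rather than the paper's whole-brain WM model:

```python
import numpy as np

# Forward dipole model used in susceptibility mapping: multiply the
# k-space susceptibility by D(k) = 1/3 - kz^2/|k|^2 and transform back.
# The synthetic point source is an illustrative stand-in for real data.
def chi_to_field(chi):
    kz, ky, kx = np.meshgrid(np.fft.fftfreq(chi.shape[0]),
                             np.fft.fftfreq(chi.shape[1]),
                             np.fft.fftfreq(chi.shape[2]), indexing="ij")
    k2 = kx**2 + ky**2 + kz**2
    with np.errstate(invalid="ignore", divide="ignore"):
        D = 1.0 / 3.0 - kz**2 / k2
    D[0, 0, 0] = 0.0  # k = 0 term is undefined; set to zero by convention
    return np.real(np.fft.ifftn(D * np.fft.fftn(chi)))

chi = np.zeros((16, 16, 16))
chi[8, 8, 8] = 1.0          # point source of susceptibility
field = chi_to_field(chi)
print(field.shape, abs(field.mean()))  # zero-mean field, since D(0) = 0
```

Inverting this model is the ill-posed step where, per the abstract, microstructure-related frequency offsets that do not follow the dipole model become artifacts in the reconstructed susceptibility map.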

  8. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  9. Joint probability distributions and fluctuation theorems

    International Nuclear Information System (INIS)

    García-García, Reinaldo; Kolton, Alejandro B; Domínguez, Daniel; Lecomte, Vivien

    2012-01-01

    We derive various exact results for Markovian systems that spontaneously relax to a non-equilibrium steady state by using joint probability distribution symmetries of different entropy production decompositions. The analytical approach is applied to diverse problems such as the description of the fluctuations induced by experimental errors, for unveiling symmetries of correlation functions appearing in fluctuation–dissipation relations recently generalized to non-equilibrium steady states, and also for mapping averages between different trajectory-based dynamical ensembles. Many known fluctuation theorems arise as special instances of our approach for particular twofold decompositions of the total entropy production. As a complement, we also briefly review and synthesize the variety of fluctuation theorems applying to stochastic dynamics of both continuous systems described by a Langevin dynamics and discrete systems obeying a Markov dynamics, emphasizing how these results emerge from distinct symmetries of the dynamical entropy of the trajectory followed by the system. For Langevin dynamics, we embed the 'dual dynamics' with a physical meaning, and for Markov systems we show how the fluctuation theorems translate into symmetries of modified evolution operators
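For the particular decomposition in which the total entropy production is taken as a whole, the family of fluctuation theorems discussed above reduces to the canonical detailed and integral forms (quoted here as standard results, not the paper's specific twofold decompositions):

```latex
\frac{P(\Delta S_{\mathrm{tot}} = A)}{P(\Delta S_{\mathrm{tot}} = -A)} = e^{A},
\qquad
\bigl\langle e^{-\Delta S_{\mathrm{tot}}} \bigr\rangle = 1 .
```

Applying Jensen's inequality to the integral form recovers the second-law-like statement ⟨ΔS_tot⟩ ≥ 0 on average, while the detailed form quantifies how strongly entropy-consuming trajectories are suppressed.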

  10. ChIP-seq Accurately Predicts Tissue-Specific Activity of Enhancers

    Energy Technology Data Exchange (ETDEWEB)

    Visel, Axel; Blow, Matthew J.; Li, Zirong; Zhang, Tao; Akiyama, Jennifer A.; Holt, Amy; Plajzer-Frick, Ingrid; Shoukry, Malak; Wright, Crystal; Chen, Feng; Afzal, Veena; Ren, Bing; Rubin, Edward M.; Pennacchio, Len A.

    2009-02-01

    A major yet unresolved quest in decoding the human genome is the identification of the regulatory sequences that control the spatial and temporal expression of genes. Distant-acting transcriptional enhancers are particularly challenging to uncover since they are scattered amongst the vast non-coding portion of the genome. Evolutionary sequence constraint can facilitate the discovery of enhancers, but fails to predict when and where they are active in vivo. Here, we performed chromatin immunoprecipitation with the enhancer-associated protein p300, followed by massively-parallel sequencing, to map several thousand in vivo binding sites of p300 in mouse embryonic forebrain, midbrain, and limb tissue. We tested 86 of these sequences in a transgenic mouse assay, which in nearly all cases revealed reproducible enhancer activity in those tissues predicted by p300 binding. Our results indicate that in vivo mapping of p300 binding is a highly accurate means for identifying enhancers and their associated activities and suggest that such datasets will be useful to study the role of tissue-specific enhancers in human biology and disease on a genome-wide scale.

  11. Investigation of elemental changes in brain tissues following excitotoxic injury

    International Nuclear Information System (INIS)

    Siegele, Rainer; Howell, Nicholas R.; Callaghan, Paul D.; Pastuovic, Zeljko

    2013-01-01

    Recently the ANSTO heavy ion microprobe has been used for elemental mapping of thin brain tissue sections. The fact that a very small portion of the proton energy is used for X-ray excitation combined with small variations of the major element concentrations makes μ-PIXE imaging and GeoPIXE analysis a challenging task. Excitotoxic brain injury underlies the pathology of stroke and various neurodegenerative disorders. Large fluxes in Ca2+ cytosolic concentrations are a key feature of the initiation of this pathophysiological process. In order to understand if these modifications are associated with changes in the elemental composition, several brain sections have been mapped with μ-PIXE. Increases in Ca2+ cytosolic concentrations were indicative of the pathophysiological process continuing 1 week after an initiating neural insult. We were able to measure significant variations in K and Ca concentration distribution across investigated brain tissue. These variations correlate very well with physiological changes visible in the brain tissue. Moreover, the obtained μ-PIXE results clearly demonstrate that the changes in elemental composition correlate significantly with brain trauma.

  12. Investigation of elemental changes in brain tissues following excitotoxic injury

    Energy Technology Data Exchange (ETDEWEB)

    Siegele, Rainer, E-mail: rns@ansto.gov.au [Institute for Environmental Research, ANSTO, Locked Bag 2001, Kirrawee DC, NSW 2232 (Australia); Howell, Nicholas R.; Callaghan, Paul D. [Life Sciences, ANSTO, Locked Bag 2001, Kirrawee DC, NSW 2232 (Australia); Pastuovic, Zeljko [Institute for Environmental Research, ANSTO, Locked Bag 2001, Kirrawee DC, NSW 2232 (Australia)

    2013-07-01

    Recently the ANSTO heavy ion microprobe has been used for elemental mapping of thin brain tissue sections. The fact that a very small portion of the proton energy is used for X-ray excitation combined with small variations of the major element concentrations makes μ-PIXE imaging and GeoPIXE analysis a challenging task. Excitotoxic brain injury underlies the pathology of stroke and various neurodegenerative disorders. Large fluxes in Ca{sup 2+} cytosolic concentrations are a key feature of the initiation of this pathophysiological process. In order to understand if these modifications are associated with changes in the elemental composition, several brain sections have been mapped with μ-PIXE. Increases in Ca{sup 2+} cytosolic concentrations were indicative of the pathophysiological process continuing 1 week after an initiating neural insult. We were able to measure significant variations in K and Ca concentration distribution across investigated brain tissue. These variations correlate very well with physiological changes visible in the brain tissue. Moreover, the obtained μ-PIXE results clearly demonstrate that the changes in elemental composition correlate significantly with brain trauma.

  13. Polarized light microscopy for 3-dimensional mapping of collagen fiber architecture in ocular tissues.

    Science.gov (United States)

    Yang, Bin; Jan, Ning-Jiun; Brazile, Bryn; Voorhees, Andrew; Lathrop, Kira L; Sigal, Ian A

    2018-04-06

    Collagen fibers play a central role in normal eye mechanics and pathology. In ocular tissues, collagen fibers exhibit a complex 3-dimensional (3D) fiber orientation, with both in-plane (IP) and out-of-plane (OP) orientations. Imaging techniques traditionally applied to the study of ocular tissues only quantify IP fiber orientation, providing little information on OP fiber orientation. Accurate description of the complex 3D fiber microstructures of the eye requires quantifying full 3D fiber orientation. Herein, we present 3dPLM, a technique based on polarized light microscopy developed to quantify both IP and OP collagen fiber orientations of ocular tissues. The performance of 3dPLM was examined by simulation and experimental verification and validation. The experiments demonstrated an excellent agreement between extracted and true 3D fiber orientation. Both IP and OP fiber orientations can be extracted from the sclera and the cornea, providing previously unavailable quantitative 3D measures and insight into the tissue microarchitecture. Together, the results demonstrate that 3dPLM is a powerful imaging technique for the analysis of ocular tissues. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Dosimetric precision requirements and quantities for characterizing the response of tumors and normal tissues

    Energy Technology Data Exchange (ETDEWEB)

    Brahme, A [Karolinska Inst., Stockholm (Sweden). Dept. of Radiation Physics

    1996-08-01

    Based on simple radiobiological models, the effect of the distribution of absorbed dose in therapy beams on the radiation response of tumor and normal tissue volumes is investigated. Under the assumption that the dose variation in the treated volume is small it is shown that the response of the tissue to radiation is determined mainly by the mean dose to the tumor or normal tissue volume in question. Quantitative expressions are also given for the increased probability of normal tissue complications and the decreased probability of tumor control as a function of increasing dose variations around the mean dose level to these tissues. When the dose variations are large the minimum tumor dose (to cm{sup 3} size volumes) will generally be better related to tumor control and the highest dose to significant portions of normal tissue correlates best to complications. In order not to lose more than one out of 20 curable patients (95% of highest possible treatment outcome) the required accuracy in the dose distribution delivered to the target volume should be 2.5% (1{sigma}) for a mean dose response gradient {gamma} in the range 2 - 3. For more steeply responding tumors and normal tissues even stricter requirements may be desirable. (author). 15 refs, 6 figs.
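The rule of thumb in the abstract follows from linearizing the dose-response curve: a relative dose error ΔD/D shifts the response probability by approximately γ·ΔD/D, where γ is the normalized dose-response gradient D·dP/dD. A sketch using the standard Poisson tumour-control model (parameter values are illustrative assumptions, not taken from the paper):

```python
import math

# Poisson tumour-control probability: P(D) = exp(-N0 * exp(-alpha * D)).
# N0 = initial clonogen number, alpha = radiosensitivity (both illustrative).
def tcp(D, N0=1e7, alpha=0.3):
    return math.exp(-N0 * math.exp(-alpha * D))

def gamma(D, N0=1e7, alpha=0.3):
    # normalized gradient: gamma = D * dP/dD
    #                            = D * P * N0 * alpha * exp(-alpha * D)
    return D * tcp(D, N0, alpha) * N0 * alpha * math.exp(-alpha * D)

D = 55.0  # Gy, near the steep part of the sigmoid for these parameters
print(f"TCP = {tcp(D):.3f}, gamma = {gamma(D):.2f}")
# A 2.5% (1 sigma) dose error then shifts the response probability by
# roughly gamma * 0.025 in absolute terms.
print(f"approx TCP change for a 2.5% dose error: {gamma(D) * 0.025:.3f}")
```

For the steeper γ of this toy model the probability shift exceeds the 5% patient-loss budget, which is the abstract's point: the steeper the response, the tighter the required dosimetric accuracy.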

  15. Probably not: future prediction using probability and statistical inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: How travel time is affected by congestion, driving speed, and traffic lights Why different gambling ...

  16. Lenses and effective spatial resolution in macroscopic optical mapping

    International Nuclear Information System (INIS)

    Bien, Harold; Parikh, Puja; Entcheva, Emilia

    2007-01-01

    Optical mapping of excitation dynamically tracks electrical waves travelling through cardiac or brain tissue by the use of fluorescent dyes. There are several characteristics that set optical mapping apart from other imaging modalities: dynamically changing signals requiring short exposure times, dim fluorescence demanding sensitive sensors and wide fields of view (low magnification) resulting in poor optical performance. These conditions necessitate the use of optics with good light gathering ability, i.e. lenses having high numerical aperture. Previous optical mapping studies often used sensor resolution to estimate the minimum spatial feature resolvable, assuming perfect optics and infinite contrast. We examine here the influence of finite contrast and real optics on the effective spatial resolution in optical mapping under broad-field illumination for both lateral (in-plane) resolution and axial (depth) resolution of collected fluorescence signals

  17. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of a major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory, Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think...... analytic expression for the distribution function of a sum of random variables. The presence of heavy-tailed random variables complicates the problem even more. The objective of this dissertation is to provide better approximations by means of sharp asymptotic expressions and Monte Carlo estimators...
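One classic way to get sharp Monte Carlo estimates of such tail probabilities P(S_n > u) for heavy-tailed i.i.d. sums is the Asmussen-Kroese conditional estimator, sketched below for Pareto summands (the distribution and parameters are illustrative assumptions, not taken from the dissertation):

```python
import random

# Asmussen-Kroese estimator for P(S_n > u), X_i i.i.d. heavy-tailed:
#   Z = n * Fbar(max(M_{n-1}, u - S_{n-1})),
# where S_{n-1}, M_{n-1} are the sum and max of n-1 samples. It is unbiased
# and has far lower variance than crude Monte Carlo for large u.
def pareto_sample(alpha):
    return random.random() ** (-1.0 / alpha)  # Pareto(alpha), support [1, inf)

def pareto_tail(x, alpha):
    return 1.0 if x <= 1.0 else x ** (-alpha)

def ak_estimate(n, u, alpha, n_sims=10000, seed=1):
    random.seed(seed)
    total = 0.0
    for _ in range(n_sims):
        xs = [pareto_sample(alpha) for _ in range(n - 1)]
        total += n * pareto_tail(max(max(xs), u - sum(xs)), alpha)
    return total / n_sims

est = ak_estimate(n=5, u=50.0, alpha=1.5)
print(est)  # near the asymptotic single-big-jump value n * Fbar(u) ~ 0.014
```

The estimator exploits exactly the sharp asymptotics mentioned in the abstract: for subexponential summands, P(S_n > u) ~ n·Fbar(u), i.e. the sum exceeds u essentially because one summand does.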

  18. Mapping local anisotropy axis for scattering media using backscattering Mueller matrix imaging

    Science.gov (United States)

    He, Honghui; Sun, Minghao; Zeng, Nan; Du, E.; Guo, Yihong; He, Yonghong; Ma, Hui

    2014-03-01

    Mueller matrix imaging techniques can be used to detect the micro-structure variations of superficial biological tissues, including the sizes and shapes of cells, the structures in cells, and the densities of the organelles. Many tissues contain anisotropic fibrous micro-structures, such as collagen fibers, elastin fibers, and muscle fibers. Changes of these fibrous structures are potentially good indicators for some pathological variations. In this paper, we propose a quantitative analysis technique based on Mueller matrix for mapping local anisotropy axis of scattering media. By conducting both experiments on silk sample and Monte Carlo simulation based on the sphere-cylinder scattering model (SCSM), we extract anisotropy axis parameters from different backscattering Mueller matrix elements. Moreover, we investigate the possible applications of these parameters to biological tissues. The preliminary experimental results of human cancerous samples show that these parameters are capable of mapping the local axis of fibers. Since many pathological changes, including early stage cancers, affect the well-aligned structures of tissues, the experimental results indicate that these parameters can be used as potential tools in clinical applications for biomedical diagnosis purposes.

  19. Normal tissue dose-effect models in biological dose optimisation

    International Nuclear Information System (INIS)

    Alber, M.

    2008-01-01

    Sophisticated radiotherapy techniques like intensity modulated radiotherapy with photons and protons rely on numerical dose optimisation. The evaluation of normal tissue dose distributions that deviate significantly from the common clinical routine and also the mathematical expression of desirable properties of a dose distribution is difficult. In essence, a dose evaluation model for normal tissues has to express the tissue specific volume effect. A formalism of local dose effect measures is presented, which can be applied to serial and parallel responding tissues as well as target volumes and physical dose penalties. These models allow a transparent description of the volume effect and an efficient control over the optimum dose distribution. They can be linked to normal tissue complication probability models and the equivalent uniform dose concept. In clinical applications, they provide a means to standardize normal tissue doses in the face of inevitable anatomical differences between patients and a vastly increased freedom to shape the dose, without being overly limiting like sets of dose-volume constraints. (orig.)
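The equivalent uniform dose concept referenced above has a compact generalized form, gEUD = (Σᵢ vᵢ Dᵢᵃ)^(1/a), whose exponent a encodes the tissue-specific volume effect: a → 1 gives the mean dose (parallel-like organs), large a approaches the maximum dose (serial-like organs). A minimal sketch with invented dose values:

```python
import numpy as np

# Generalized equivalent uniform dose over equal-volume dose bins:
#   gEUD = (mean(D_i ** a)) ** (1/a)
# a ~ 1: parallel organ (mean dose matters); a large: serial organ
# (hot spots dominate). The dose values below are illustrative.
def gEUD(doses, a):
    d = np.asarray(doses, dtype=float)
    return np.mean(d ** a) ** (1.0 / a)

dvh = [10, 20, 30, 60]  # Gy, equal-volume voxels of a hypothetical organ
print(gEUD(dvh, a=1))   # parallel-like: plain mean dose, 30.0 Gy
print(gEUD(dvh, a=12))  # serial-like: pulled toward the 60 Gy hot spot
```

This is how a single scalar per organ can stand in for a full dose-volume constraint set during numerical optimisation, as the abstract argues.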

  20. Quantum processes: probability fluxes, transition probabilities in unit time and vacuum vibrations

    International Nuclear Information System (INIS)

    Oleinik, V.P.; Arepjev, Ju D.

    1989-01-01

    Transition probabilities in unit time and probability fluxes are compared in studying the elementary quantum processes -the decay of a bound state under the action of time-varying and constant electric fields. It is shown that the difference between these quantities may be considerable, and so the use of transition probabilities W instead of probability fluxes Π, in calculating the particle fluxes, may lead to serious errors. The quantity W represents the rate of change with time of the population of the energy levels relating partly to the real states and partly to the virtual ones, and it cannot be directly measured in experiment. The vacuum background is shown to be continuously distorted when a perturbation acts on a system. Because of this the viewpoint of an observer on the physical properties of real particles continuously varies with time. This fact is not taken into consideration in the conventional theory of quantum transitions based on using the notion of probability amplitude. As a result, the probability amplitudes lose their physical meaning. All the physical information on quantum dynamics of a system is contained in the mean values of physical quantities. The existence of considerable differences between the quantities W and Π permits one in principle to make a choice of the correct theory of quantum transitions on the basis of experimental data. (author)

  1. Quantitative multi-parameter mapping of R1, PD*, MT and R2* at 3T: a multi-center validation

    Directory of Open Access Journals (Sweden)

    Nikolaus eWeiskopf

    2013-06-01

    Full Text Available Multi-center studies using magnetic resonance imaging facilitate studying small effect sizes, global population variance and rare diseases. The reliability and sensitivity of these multi-center studies crucially depend on the comparability of the data generated at different sites and time points. The level of inter-site comparability is still controversial for conventional anatomical T1-weighted MRI data. Quantitative multi-parameter mapping (MPM) was designed to provide MR parameter measures that are comparable across sites and time points, i.e., 1 mm high-resolution maps of the longitudinal relaxation rate (R1=1/T1), effective proton density (PD*), magnetization transfer saturation (MT) and effective transverse relaxation rate (R2*=1/T2*). MPM was validated at 3T for use in multi-center studies by scanning five volunteers at three different sites. We determined the inter-site bias, inter-site and intra-site coefficient of variation (CoV) for typical morphometric measures (i.e., gray matter probability maps used in voxel-based morphometry) and the four quantitative parameters. The inter-site bias and CoV were smaller than 3.1% and 8%, respectively, except for the inter-site CoV of R2* (< 20%). The gray matter probability maps based on the MT parameter maps had a 14% higher inter-site reproducibility than maps based on conventional T1-weighted images. The low inter-site bias and variance in the parameters and derived gray matter probability maps confirm the high comparability of the quantitative maps across sites and time points. The reliability, short acquisition time, high resolution and the detailed insights into the brain microstructure provided by MPM makes it an efficient tool for multi-center imaging studies.

  2. Optical histology: a method to visualize microvasculature in thick tissue sections of mouse brain.

    Directory of Open Access Journals (Sweden)

    Austin J Moy

    Full Text Available The microvasculature is the network of blood vessels involved in delivering nutrients and gases necessary for tissue survival. Study of the microvasculature often involves immunohistological methods. While useful for visualizing microvasculature at the µm scale in specific regions of interest, immunohistology is not well suited to visualize the global microvascular architecture in an organ. Hence, use of immunohistology precludes visualization of the entire microvasculature of an organ, and thus impedes study of global changes in the microvasculature that occur in concert with changes in tissue due to various disease states. Therefore, there is a critical need for a simple, relatively rapid technique that will facilitate visualization of the microvascular network of an entire tissue. The systemic vasculature of a mouse is stained with the fluorescent lipophilic dye DiI using a method called "vessel painting". The brain, or other organ of interest, is harvested and fixed in 4% paraformaldehyde. The organ is then sliced into 1 mm sections and optically cleared, or made transparent, using FocusClear, a proprietary optical clearing agent. After optical clearing, the DiI-labeled tissue microvasculature is imaged using confocal fluorescence microscopy and adjacent image stacks tiled together to produce a depth-encoded map of the microvasculature in the tissue slice. We demonstrated that the use of optical clearing enhances both the tissue imaging depth and the estimate of the vascular density. Using our "optical histology" technique, we visualized microvasculature in the mouse brain to a depth of 850 µm. Presented here are maps of the microvasculature in 1 mm thick slices of mouse brain. Using combined optical clearing and optical imaging techniques, we devised a methodology to enhance the visualization of the microvasculature in thick tissues. We believe this technique could potentially be used to generate a three-dimensional map of the

  3. Propensity, Probability, and Quantum Theory

    Science.gov (United States)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes' theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  4. Mechanical modulation of nascent stem cell lineage commitment in tissue engineering scaffolds.

    Science.gov (United States)

    Song, Min Jae; Dean, David; Knothe Tate, Melissa L

    2013-07-01

    Taking inspiration from tissue morphogenesis in utero, this study tests the concept of using tissue engineering scaffolds as delivery devices to modulate emergent structure-function relationships at early stages of tissue genesis. We report on the combined use of computational fluid dynamics (CFD) modeling, advanced manufacturing methods, and experimental fluid mechanics (micro-PIV and strain mapping) for the prospective design of tissue engineering scaffold geometries that deliver spatially resolved mechanical cues to stem cells seeded within. When subjected to a constant magnitude global flow regime, the local scaffold geometry dictates the magnitudes of mechanical stresses and strains experienced by a given cell, and in a spatially resolved fashion, similar to patterning during morphogenesis. In addition, early markers of mesenchymal stem cell lineage commitment relate significantly to the local mechanical environment of the cell. Finally, by plotting the range of stress-strain states for all data corresponding to nascent cell lineage commitment (95% CI), we begin to "map the mechanome", defining stress-strain states most conducive to targeted cell fates. In sum, we provide a library of reference mechanical cues that can be delivered to cells seeded on tissue engineering scaffolds to guide target tissue phenotypes in a temporally and spatially resolved manner. Knowledge of these effects allows for prospective scaffold design optimization using virtual models prior to prototyping and clinical implementation. Finally, this approach enables the development of next generation scaffolds cum delivery devices for genesis of complex tissues with heterogeneous properties, e.g., organs, joints or interface tissues such as growth plates. Copyright © 2013 Elsevier Ltd. All rights reserved.

  5. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications....

  6. Automatic Mapping Extraction from Multiecho T2-Star Weighted Magnetic Resonance Images for Improving Morphological Evaluations in Human Brain

    Directory of Open Access Journals (Sweden)

    Shaode Yu

    2013-01-01

    Full Text Available Mapping extraction is useful in medical image analysis. Similarity coefficient mapping (SCM) replaced signal response to time course in tissue similarity mapping with signal response to TE changes in multiecho T2-star weighted magnetic resonance imaging without contrast agent. Since different tissues have different sensitivities to reference signals, a new algorithm is proposed by adding a sensitivity index to SCM. It generates two mappings. One measures relative signal strength (SSM) and the other depicts fluctuation magnitude (FMM). Meanwhile, the new method adaptively generates a proper reference signal by maximizing the sum of the contrast index (CI) from SSM and FMM without manual delineation. Based on four groups of images from multiecho T2-star weighted magnetic resonance imaging, the capacity of SSM and FMM in enhancing image contrast and morphological evaluation is validated. Average contrast improvement index (CII) of SSM is 1.57, 1.38, 1.34, and 1.41. Average CII of FMM is 2.42, 2.30, 2.24, and 2.35. Visual analysis of regions of interest demonstrates that SSM and FMM show better morphological structures than original images, T2-star mapping and SCM. These extracted mappings can be further applied in information fusion, signal investigation, and tissue segmentation.
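The core of similarity coefficient mapping is a per-voxel correlation between the multi-echo signal (its response to TE changes) and a reference decay curve. A minimal sketch on synthetic mono-exponential data (the echo times, T2* values, and two-voxel "image" are illustrative assumptions, not the paper's data or full algorithm):

```python
import numpy as np

# SCM-style map: Pearson correlation of each voxel's echo-train signal
# with a reference decay curve, vectorized over the image plane.
def scm(echoes, reference):
    """echoes: (n_te, H, W); reference: (n_te,). Returns an (H, W) map of
    correlation coefficients with the reference signal."""
    e = echoes - echoes.mean(axis=0)
    r = reference - reference.mean()
    num = np.tensordot(r, e, axes=(0, 0))
    den = np.sqrt((e**2).sum(axis=0)) * np.sqrt((r**2).sum())
    return num / den

te = np.linspace(5, 40, 8)        # ms, hypothetical echo times
ref = np.exp(-te / 20.0)          # reference tissue with T2* = 20 ms
# tiny 1x2 "image": one voxel matches the reference, one decays faster
img = np.stack([np.exp(-t / np.array([[20.0, 5.0]])) for t in te])
print(scm(img, ref))  # matching voxel scores ~1.0; the T2*=5 voxel lower
```

The sensitivity index proposed in the paper then modulates such similarity scores per tissue; that extension is not reproduced here.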

  7. Dynamic Programming for Re-Mapping Noisy Fixations in Translation Tasks

    DEFF Research Database (Denmark)

    Carl, Michael

    2013-01-01

    possible fixated symbols, including those on the line above and below the naïve fixation mapping. In a second step a dynamic programming algorithm applies a number of heuristics to find the best path through the lattice, based on the probable distance in characters, in words and in pixels between...
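The two-step idea can be sketched as a small Viterbi-style dynamic program: each noisy fixation gets a set of candidate symbol positions, and the best path minimizes a per-candidate distance cost plus a jump penalty between successive fixations. Everything below (1-D positions, cost weights, candidate sets) is an illustrative simplification of the heuristics described in the abstract:

```python
# Viterbi-style re-mapping of noisy fixations onto candidate symbols.
def remap(fixations, candidates, jump_penalty=0.1):
    """fixations: list of x positions; candidates: list of candidate-x
    lists, one per fixation. Returns the best candidate index per fixation."""
    # local cost: distance from fixation i to its candidate j
    local = [[abs(f - c) for c in cs] for f, cs in zip(fixations, candidates)]
    best, back = [local[0][:]], []
    for i in range(1, len(fixations)):
        row, ptr = [], []
        for c in candidates[i]:
            # cheapest way to reach candidate c from any previous candidate
            costs = [best[i - 1][k]
                     + jump_penalty * abs(c - candidates[i - 1][k])
                     for k in range(len(candidates[i - 1]))]
            k = min(range(len(costs)), key=costs.__getitem__)
            row.append(abs(fixations[i] - c) + costs[k])
            ptr.append(k)
        best.append(row)
        back.append(ptr)
    # backtrack the lowest-cost path through the lattice
    j = min(range(len(best[-1])), key=best[-1].__getitem__)
    path = [j]
    for ptr in reversed(back):
        j = ptr[j]
        path.append(j)
    return path[::-1]

print(remap([1.2, 4.9, 5.1], [[1, 2], [4, 5], [5, 6]]))  # → [0, 1, 0]
```

The jump penalty is what distinguishes this from a naive nearest-symbol mapping: an isolated noisy fixation is pulled back toward a path consistent with its neighbours.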

  8. Prediction and probability in sciences

    International Nuclear Information System (INIS)

    Klein, E.; Sacquin, Y.

    1998-01-01

    This book reports the 7 presentations made at the third meeting 'physics and fundamental questions' whose theme was probability and prediction. The concept of probability that was invented to apprehend random phenomena has become an important branch of mathematics and its application range spreads from radioactivity to species evolution via cosmology or the management of very weak risks. The notion of probability is the basis of quantum mechanics and then is bound to the very nature of matter. The 7 topics are: - radioactivity and probability, - statistical and quantum fluctuations, - quantum mechanics as a generalized probability theory, - probability and the irrational efficiency of mathematics, - can we foresee the future of the universe?, - chance, eventuality and necessity in biology, - how to manage weak risks? (A.C.)

  9. Stochastic thermodynamics of quantum maps with and without equilibrium.

    Science.gov (United States)

    Barra, Felipe; Lledó, Cristóbal

    2017-11-01

    We study stochastic thermodynamics for a quantum system of interest whose dynamics is described by a completely positive trace-preserving (CPTP) map as a result of its interaction with a thermal bath. We define CPTP maps with equilibrium as CPTP maps with an invariant state such that the entropy production due to the action of the map on the invariant state vanishes. Thermal maps are a subgroup of CPTP maps with equilibrium. In general, for CPTP maps, the thermodynamic quantities, such as the entropy production or work performed on the system, depend on the combined state of the system plus its environment. We show that these quantities can be written in terms of system properties for maps with equilibrium. The relations that we obtain are valid for arbitrary coupling strengths between the system and the thermal bath. The fluctuations of thermodynamic quantities are considered in the framework of a two-point measurement scheme. We derive the entropy production fluctuation theorem for general maps and a fluctuation relation for the stochastic work on a system that starts in the Gibbs state. Some simplifications for the probability distributions in the case of maps with equilibrium are presented. We illustrate our results by considering spin 1/2 systems under thermal maps, nonthermal maps with equilibrium, maps with nonequilibrium steady states, and concatenations of them. Finally, and as an important application, we consider a particular limit in which the concatenation of maps generates a continuous time evolution in Lindblad form for the system of interest, and we show that the concept of maps with and without equilibrium translates into Lindblad equations with and without quantum detailed balance, respectively. The consequences for the thermodynamic quantities in this limit are discussed.
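The defining property described in this abstract can be restated compactly. The sketch below is a hedged restatement, not the paper's notation: $\mathcal{E}$ denotes the CPTP map, $\pi$ its invariant state, and $\Sigma$ the entropy production; the second line assumes the standard thermal-map form with a bath at inverse temperature $\beta$.

```latex
% A CPTP map \mathcal{E} "has equilibrium" if it admits an invariant
% state \pi on which the entropy production vanishes:
\mathcal{E}(\pi) = \pi, \qquad \Sigma(\pi) = 0 .
% For a thermal map with a bath at inverse temperature \beta acting on
% a system state \rho, the entropy production takes the standard form
\Sigma(\rho) = \Delta S_{\mathrm{sys}} - \beta\, Q \;\ge\; 0 ,
% where \Delta S_{\mathrm{sys}} is the change in the system's von
% Neumann entropy and Q is the heat absorbed from the bath.
```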

  10. Global mapping of transposon location.

    Directory of Open Access Journals (Sweden)

    Abram Gabriel

    2006-12-01

    Full Text Available Transposable genetic elements are ubiquitous, yet their presence or absence at any given position within a genome can vary between individual cells, tissues, or strains. Transposable elements have profound impacts on host genomes by altering gene expression, assisting in genomic rearrangements, causing insertional mutations, and serving as sources of phenotypic variation. Characterizing a genome's full complement of transposons requires whole genome sequencing, precluding simple studies of the impact of transposition on interindividual variation. Here, we describe a global mapping approach for identifying transposon locations in any genome, using a combination of transposon-specific DNA extraction and microarray-based comparative hybridization analysis. We use this approach to map the repertoire of endogenous transposons in different laboratory strains of Saccharomyces cerevisiae and demonstrate that transposons are a source of extensive genomic variation. We also apply this method to mapping bacterial transposon insertion sites in a yeast genomic library. This unique whole genome view of transposon location will facilitate our exploration of transposon dynamics, as well as defining bases for individual differences and adaptive potential.

  11. Mapping the Hydropathy of Amino Acids Based on Their Local Solvation Structure

    KAUST Repository

    Bonella, S.; Raimondo, D.; Milanetti, E.; Tramontano, A.; Ciccotti, G.

    2014-01-01

    densities can be used to map the hydrophilic and hydrophobic groups on the amino acids with greater detail than possible with other available methods. Three indicators are then defined based on the features of these probabilities to quantify the specific

  12. Detection, mapping, and quantification of single walled carbon nanotubes in histological specimens with photoacoustic microscopy.

    NARCIS (Netherlands)

    Avti, P.K.; Hu, S.; Favazza, C.; Mikos, A.G.; Jansen, J.A.; Shroyer, K.R.; Wang, L.V.; Sitharaman, B.

    2012-01-01

    AIMS: In the present study, the efficacy of multi-scale photoacoustic microscopy (PAM) was investigated to detect, map, and quantify trace amounts [nanograms (ng) to micrograms (μg)] of SWCNTs in a variety of histological tissue specimens consisting of cancer and benign tissue biopsies

  13. The quantum probability calculus

    International Nuclear Information System (INIS)

    Jauch, J.M.

    1976-01-01

    The Wigner anomaly (1932) for the joint distribution of noncompatible observables is an indication that the classical probability calculus is not applicable for quantum probabilities. It should, therefore, be replaced by another, more general calculus, which is specifically adapted to quantal systems. In this article this calculus is exhibited and its mathematical axioms and the definitions of the basic concepts such as probability field, random variable, and expectation values are given. (B.R.H)

  14. Atomically resolved tissue integration.

    Science.gov (United States)

    Karlsson, Johan; Sundell, Gustav; Thuvander, Mattias; Andersson, Martin

    2014-08-13

    In the field of biomedical technology, a critical aspect is the ability to control and understand the integration of an implantable device in living tissue. Despite the technical advances in the development of biomaterials, the elaborate interplay encompassing materials science and biology on the atomic level is not very well understood. Within implantology, anchoring a biomaterial device into bone tissue is termed osseointegration. In the most accepted theory, osseointegration is defined as an interfacial bonding between implant and bone; however, there is lack of experimental evidence to confirm this. Here we show that atom probe tomography can be used to study the implant-tissue interaction, allowing for three-dimensional atomic mapping of the interface region. Interestingly, our analyses demonstrated that direct contact between Ca atoms and the implanted titanium oxide surface is formed without the presence of a protein interlayer, which means that a pure inorganic interface is created, hence giving experimental support to the current theory of osseointegration. We foresee that this result will be of importance in the development of future biomaterials as well as in the design of in vitro evaluation techniques.

  15. Mass spectrometry imaging enriches biomarker discovery approaches with candidate mapping.

    Science.gov (United States)

    Scott, Alison J; Jones, Jace W; Orschell, Christie M; MacVittie, Thomas J; Kane, Maureen A; Ernst, Robert K

    2014-01-01

    Integral to the characterization of radiation-induced tissue damage is the identification of unique biomarkers. Biomarker discovery is a challenging and complex endeavor requiring both sophisticated experimental design and accessible technology. The resources within the National Institute of Allergy and Infectious Diseases (NIAID)-sponsored Consortium, Medical Countermeasures Against Radiological Threats (MCART), allow for leveraging robust animal models with novel molecular imaging techniques. One such imaging technique, MALDI (matrix-assisted laser desorption ionization) mass spectrometry imaging (MSI), allows for the direct spatial visualization of lipids, proteins, small molecules, and drugs/drug metabolites (that is, candidate biomarkers) in an unbiased manner. MALDI-MSI acquires mass spectra directly from an intact tissue slice at discrete locations across an x, y grid, which are then rendered into a spatial distribution map composed of ion mass and intensity. The unique mass signals can be plotted to generate a spatial map of biomarkers that reflects pathology and molecular events. The crucial unanswered questions that can be addressed with MALDI-MSI include the identification of biomarkers for radiation damage that reflect the response to radiation dose over time and the efficacy of therapeutic interventions. Techniques in MALDI-MSI also enable the integration of biomarker identification among diverse animal models. Analysis of early, sublethally irradiated tissue injury samples from diverse mouse tissues (lung and ileum) shows membrane phospholipid signatures correlated with histological features of these unique tissues. This paper will discuss the application of MALDI-MSI for use in a larger biomarker discovery pipeline.

  16. Excluding joint probabilities from quantum theory

    Science.gov (United States)

    Allahverdyan, Armen E.; Danageozian, Arshag

    2018-03-01

    Quantum theory does not provide a unique definition for the joint probability of two noncommuting observables, which is the next important question after Born's probability for a single observable. Instead, various definitions have been suggested, e.g., via quasiprobabilities or via hidden-variable theories. After reviewing open issues of the joint probability, we relate it to quantum imprecise probabilities, which are noncontextual and consistent with all constraints expected from a quantum probability. We study two noncommuting observables in a two-dimensional Hilbert space and show that there is no precise joint probability that applies for any quantum state and is consistent with imprecise probabilities. This contrasts with theorems by Bell and Kochen-Specker that exclude joint probabilities for more than two noncommuting observables, in Hilbert spaces with dimension larger than two. If measurement contexts are included in the definition, joint probabilities are no longer excluded, but they are still constrained by imprecise probabilities.

  17. Multivariate normal tissue complication probability modeling of gastrointestinal toxicity after external beam radiotherapy for localized prostate cancer

    International Nuclear Information System (INIS)

    Cella, Laura; D’Avino, Vittoria; Liuzzi, Raffaele; Conson, Manuel; Doria, Francesca; Faiella, Adriana; Loffredo, Filomena; Salvatore, Marco; Pacelli, Roberto

    2013-01-01

    The risk of radio-induced gastrointestinal (GI) complications is affected by several factors other than the dose to the rectum, such as patient characteristics, hormonal or antihypertensive therapy, and acute rectal toxicity. The purpose of this work is to study clinical and dosimetric parameters impacting late GI toxicity after prostate external beam radiotherapy (RT) and to establish a multivariate normal tissue complication probability (NTCP) model for radiation-induced GI complications. A total of 57 men who had undergone definitive RT for prostate cancer were evaluated for GI events classified using the RTOG/EORTC scoring system. Their median age was 73 years (range 53–85). The patients were assessed for GI toxicity before, during, and periodically after RT completion. Several clinical variables along with rectum dose-volume parameters (Vx) were collected, and their correlation to GI toxicity was analyzed by Spearman’s rank correlation coefficient (Rs). A multivariate logistic regression method using resampling techniques was applied to select the model order and parameters for NTCP modeling. Model performance was evaluated through the area under the receiver operating characteristic curve (AUC). At a median follow-up of 30 months, 37% (21/57) of patients developed G1-2 acute GI events, while 33% (19/57) were diagnosed with G1-2 late GI events. An NTCP model for late mild/moderate GI toxicity based on three variables, including V65 (OR = 1.03), antihypertensive and/or anticoagulant (AH/AC) drugs (OR = 0.24), and acute GI toxicity (OR = 4.3), was selected as the most predictive model (Rs = 0.47, p < 0.001; AUC = 0.79). This three-variable model outperforms the logistic model based on V65 only (Rs = 0.28, p < 0.001; AUC = 0.69). We propose a logistic NTCP model for late GI toxicity considering not only rectal irradiation dose but also clinical patient-specific factors. Accordingly, the risk of G1-2 late GI toxicity increases as V65 increases and is higher for patients experiencing
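A hedged sketch of how such a three-variable logistic NTCP model evaluates, using the odds ratios quoted in the abstract. The intercept B0 is not reported and is a made-up placeholder, so the probabilities below are illustrative only:

```python
import math

# Odds ratios from the abstract: V65 (per % of rectum volume): 1.03,
# AH/AC drugs: 0.24, acute GI toxicity: 4.3. Logistic coefficients
# are the natural logs of the odds ratios.
B0 = -3.0  # hypothetical intercept, NOT from the study

def ntcp(v65_percent, on_ah_ac_drugs, had_acute_gi):
    """Logistic NTCP for late G1-2 GI toxicity (sketch)."""
    z = (B0
         + math.log(1.03) * v65_percent
         + math.log(0.24) * on_ah_ac_drugs
         + math.log(4.3) * had_acute_gi)
    return 1.0 / (1.0 + math.exp(-z))

low = ntcp(10, True, False)   # small V65, protective drugs, no acute toxicity
high = ntcp(40, False, True)  # large V65, no drugs, acute toxicity
```

As the abstract states, risk rises with V65 and with acute toxicity, and falls for patients on AH/AC drugs (OR below 1).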

  18. Direct dose mapping versus energy/mass transfer mapping for 4D dose accumulation: fundamental differences and dosimetric consequences.

    Science.gov (United States)

    Li, Haisen S; Zhong, Hualiang; Kim, Jinkoo; Glide-Hurst, Carri; Gulam, Misbah; Nurushev, Teamour S; Chetty, Indrin J

    2014-01-06

    The direct dose mapping (DDM) and energy/mass transfer (EMT) mapping are two essential algorithms for accumulating the dose from different anatomic phases to the reference phase when there is organ motion or tumor/tissue deformation during the delivery of radiation therapy. DDM is based on interpolation of the dose values from one dose grid to another and thus lacks rigor in defining the dose when there are multiple dose values mapped to one dose voxel in the reference phase due to tissue/tumor deformation. On the other hand, EMT counts the total energy and mass transferred to each voxel in the reference phase and calculates the dose by dividing the energy by mass. Therefore it is based on fundamentally sound physics principles. In this study, we implemented the two algorithms and integrated them within the Eclipse treatment planning system. We then compared the clinical dosimetric difference between the two algorithms for ten lung cancer patients receiving stereotactic radiosurgery treatment, by accumulating the delivered dose to the end-of-exhale (EE) phase. Specifically, the respiratory period was divided into ten phases and the dose to each phase was calculated and mapped to the EE phase and then accumulated. The displacement vector field generated by Demons-based registration of the source and reference images was used to transfer the dose and energy. The DDM and EMT algorithms produced noticeably different cumulative dose in the regions with sharp mass density variations and/or high dose gradients. For the planning target volume (PTV) and internal target volume (ITV) minimum dose, the difference was up to 11% and 4% respectively. This suggests that DDM might not be adequate for obtaining an accurate dose distribution of the cumulative plan, instead, EMT should be considered.
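The fundamental difference between the two algorithms can be shown on a toy one-dimensional case. All numbers are invented, and the "DDM-style" average below merely stands in for dose interpolation:

```python
import numpy as np

# Two source-phase voxels deform onto a single reference voxel
# (tissue compression). DDM interpolates/averages dose values;
# EMT sums transferred energy and mass, then divides.
mass = np.array([1.0, 3.0])        # g, source voxels
dose = np.array([2.0, 10.0])       # Gy, source voxels
target = np.array([0, 0])          # both voxels map to reference voxel 0

# DDM-style: plain average of the mapped dose values
ddm = dose[target == 0].mean()

# EMT: accumulated dose = total energy / total mass
energy = dose * mass               # Gy*g, energy deposited per voxel
emt = energy[target == 0].sum() / mass[target == 0].sum()
```

Here DDM yields 6.0 Gy while EMT yields 8.0 Gy, because the heavier voxel carries more energy into the reference voxel; this mirrors the paper's point that the methods diverge where mass density varies sharply.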

  19. The Benefits of Including Clinical Factors in Rectal Normal Tissue Complication Probability Modeling After Radiotherapy for Prostate Cancer

    International Nuclear Information System (INIS)

    Defraene, Gilles; Van den Bergh, Laura; Al-Mamgani, Abrahim; Haustermans, Karin; Heemsbergen, Wilma; Van den Heuvel, Frank; Lebesque, Joos V.

    2012-01-01

    Purpose: To study the impact of clinical predisposing factors on rectal normal tissue complication probability modeling using the updated results of the Dutch prostate dose-escalation trial. Methods and Materials: Toxicity data of 512 patients (conformally treated to 68 Gy [n = 284] and 78 Gy [n = 228]) with complete follow-up at 3 years after radiotherapy were studied. Scored end points were rectal bleeding, high stool frequency, and fecal incontinence. Two traditional dose-based models (Lyman-Kutcher-Burman [LKB] and Relative Seriality [RS]) and a logistic model were fitted using a maximum likelihood approach. Furthermore, these model fits were improved by including the most significant clinical factors. The area under the receiver operating characteristic curve (AUC) was used to compare the discriminating ability of all fits. Results: Including clinical factors significantly increased the predictive power of the models for all end points. In the optimal LKB, RS, and logistic models for rectal bleeding and fecal incontinence, the first significant (p = 0.011–0.013) clinical factor was “previous abdominal surgery.” As the second significant (p = 0.012–0.016) factor, “cardiac history” was included in all three rectal bleeding fits, whereas including “diabetes” was significant (p = 0.039–0.048) in fecal incontinence modeling but only in the LKB and logistic models. High stool frequency fits only benefitted significantly (p = 0.003–0.006) from the inclusion of the baseline toxicity score. For all models, rectal bleeding fits had the highest AUC (0.77), where it was 0.63 and 0.68 for high stool frequency and fecal incontinence, respectively. LKB and logistic model fits resulted in similar values for the volume parameter. The steepness parameter was somewhat higher in the logistic model, also resulting in a slightly lower D50. Anal wall DVHs were used for fecal incontinence, whereas anorectal wall dose best described the other two endpoints. Conclusions
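For reference, the LKB model named in this record can be sketched as below. The parameter values (td50, m, n) are illustrative placeholders, not the fitted values from the study:

```python
import math

def geud(doses, volumes, n):
    """Generalized equivalent uniform dose from a (dose, fractional
    volume) DVH; n is the volume-effect parameter."""
    return sum(v * d ** (1.0 / n) for d, v in zip(doses, volumes)) ** n

def lkb_ntcp(doses, volumes, td50=80.0, m=0.15, n=0.1):
    """LKB NTCP: probit of (gEUD - TD50) / (m * TD50)."""
    t = (geud(doses, volumes, n) - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# uniform 80 Gy to the whole organ: gEUD equals TD50, so NTCP is 0.5
p = lkb_ntcp([80.0], [1.0])
```

The steepness parameter m and the D50 mentioned in the abstract correspond directly to the slope and midpoint of this probit curve.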

  20. The Benefits of Including Clinical Factors in Rectal Normal Tissue Complication Probability Modeling After Radiotherapy for Prostate Cancer

    Energy Technology Data Exchange (ETDEWEB)

    Defraene, Gilles, E-mail: gilles.defraene@uzleuven.be [Radiation Oncology Department, University Hospitals Leuven, Leuven (Belgium); Van den Bergh, Laura [Radiation Oncology Department, University Hospitals Leuven, Leuven (Belgium); Al-Mamgani, Abrahim [Department of Radiation Oncology, Erasmus Medical Center - Daniel den Hoed Cancer Center, Rotterdam (Netherlands); Haustermans, Karin [Radiation Oncology Department, University Hospitals Leuven, Leuven (Belgium); Heemsbergen, Wilma [Netherlands Cancer Institute - Antoni van Leeuwenhoek Hospital, Amsterdam (Netherlands); Van den Heuvel, Frank [Radiation Oncology Department, University Hospitals Leuven, Leuven (Belgium); Lebesque, Joos V. [Netherlands Cancer Institute - Antoni van Leeuwenhoek Hospital, Amsterdam (Netherlands)

    2012-03-01

    Purpose: To study the impact of clinical predisposing factors on rectal normal tissue complication probability modeling using the updated results of the Dutch prostate dose-escalation trial. Methods and Materials: Toxicity data of 512 patients (conformally treated to 68 Gy [n = 284] and 78 Gy [n = 228]) with complete follow-up at 3 years after radiotherapy were studied. Scored end points were rectal bleeding, high stool frequency, and fecal incontinence. Two traditional dose-based models (Lyman-Kutcher-Burman [LKB] and Relative Seriality [RS]) and a logistic model were fitted using a maximum likelihood approach. Furthermore, these model fits were improved by including the most significant clinical factors. The area under the receiver operating characteristic curve (AUC) was used to compare the discriminating ability of all fits. Results: Including clinical factors significantly increased the predictive power of the models for all end points. In the optimal LKB, RS, and logistic models for rectal bleeding and fecal incontinence, the first significant (p = 0.011-0.013) clinical factor was 'previous abdominal surgery.' As the second significant (p = 0.012-0.016) factor, 'cardiac history' was included in all three rectal bleeding fits, whereas including 'diabetes' was significant (p = 0.039-0.048) in fecal incontinence modeling but only in the LKB and logistic models. High stool frequency fits only benefitted significantly (p = 0.003-0.006) from the inclusion of the baseline toxicity score. For all models, rectal bleeding fits had the highest AUC (0.77), where it was 0.63 and 0.68 for high stool frequency and fecal incontinence, respectively. LKB and logistic model fits resulted in similar values for the volume parameter. The steepness parameter was somewhat higher in the logistic model, also resulting in a slightly lower D50. Anal wall DVHs were used for fecal incontinence, whereas anorectal wall dose best described the other two endpoints.

  1. Hyaluronic Acid in Normal and Neoplastic Colorectal Tissue: Electrospray Ionization Mass Spectrometric and Fluor Metric Analysis

    Directory of Open Access Journals (Sweden)

    Ana Paula Cleto Marolla

    2016-01-01

    Conclusions: The expression of HA was found to be slightly lower in tumor tissue than in colorectal non-neoplastic mucosa, although this difference was not statistically significant. Compared to normal tissues, HA levels are significantly increased in the tumor tissues unless they exhibit lymph node metastasis. Otherwise, the expression of HA in tumor tissue did not correlate with the other clinicopathological parameters.

  2. Linear positivity and virtual probability

    International Nuclear Information System (INIS)

    Hartle, James B.

    2004-01-01

    We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. The objective of any quantum theory of a closed system, most generally the universe, is the prediction of probabilities for the individual members of sets of alternative coarse-grained histories of the system. Quantum interference between members of a set of alternative histories is an obstacle to assigning probabilities that are consistent with the rules of probability theory. A quantum theory of closed systems therefore requires two elements: (1) a condition specifying which sets of histories may be assigned probabilities and (2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to include values outside the range of 0-1 is described. Alternatives with such virtual probabilities cannot be measured or recorded, but can be used in the intermediate steps of calculations of real probabilities. Extended probabilities give a simple and general way of formulating quantum theory. The various decoherence conditions are compared in terms of their utility for characterizing classicality and the role they might play in further generalizations of quantum mechanics

  3. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more: these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science, which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...

  4. LungMAP: The Molecular Atlas of Lung Development Program.

    Science.gov (United States)

    Ardini-Poleske, Maryanne E; Clark, Robert F; Ansong, Charles; Carson, James P; Corley, Richard A; Deutsch, Gail H; Hagood, James S; Kaminski, Naftali; Mariani, Thomas J; Potter, Steven S; Pryhuber, Gloria S; Warburton, David; Whitsett, Jeffrey A; Palmer, Scott M; Ambalavanan, Namasivayam

    2017-11-01

    The National Heart, Lung, and Blood Institute is funding an effort to create a molecular atlas of the developing lung (LungMAP) to serve as a research resource and public education tool. The lung is a complex organ with lengthy development time driven by interactive gene networks and dynamic cross talk among multiple cell types to control and coordinate lineage specification, cell proliferation, differentiation, migration, morphogenesis, and injury repair. A better understanding of the processes that regulate lung development, particularly alveologenesis, will have a significant impact on survival rates for premature infants born with incomplete lung development and will facilitate lung injury repair and regeneration in adults. A consortium of four research centers, a data coordinating center, and a human tissue repository provides high-quality molecular data of developing human and mouse lungs. LungMAP includes mouse and human data for cross correlation of developmental processes across species. LungMAP is generating foundational data and analysis, creating a web portal for presentation of results and public sharing of data sets, establishing a repository of young human lung tissues obtained through organ donor organizations, and developing a comprehensive lung ontology that incorporates the latest findings of the consortium. The LungMAP website (www.lungmap.net) currently contains more than 6,000 high-resolution lung images and transcriptomic, proteomic, and lipidomic human and mouse data and provides scientific information to stimulate interest in research careers for young audiences. This paper presents a brief description of research conducted by the consortium, database, and portal development and upcoming features that will enhance the LungMAP experience for a community of users. Copyright © 2017 the American Physiological Society.

  5. Diffusion weighted imaging demystified. The technique and potential clinical applications for soft tissue imaging

    International Nuclear Information System (INIS)

    Ahlawat, Shivani; Fayad, Laura M.

    2018-01-01

    Diffusion-weighted imaging (DWI) is a fast, non-contrast technique that is readily available and easy to integrate into an existing imaging protocol. DWI with apparent diffusion coefficient (ADC) mapping offers a quantitative metric for soft tissue evaluation and provides information regarding the cellularity of a region of interest. There are several available methods of performing DWI, and artifacts and pitfalls must be considered when interpreting DWI studies. This review article will review the various techniques of DWI acquisition and utility of qualitative as well as quantitative methods of image interpretation, with emphasis on optimal methods for ADC measurement. The current clinical applications for DWI are primarily related to oncologic evaluation: For the assessment of de novo soft tissue masses, ADC mapping can serve as a useful adjunct technique to routine anatomic sequences for lesion characterization as cyst or solid and, if solid, benign or malignant. For treated soft tissue masses, the role of DWI/ADC mapping in the assessment of treatment response as well as recurrent or residual neoplasm in the setting of operative management is discussed, especially when intravenous contrast medium cannot be given. Emerging DWI applications for non-neoplastic clinical indications are also reviewed. (orig.)

  6. Diffusion weighted imaging demystified. The technique and potential clinical applications for soft tissue imaging

    Energy Technology Data Exchange (ETDEWEB)

    Ahlawat, Shivani [The Johns Hopkins Medical Institutions, The Russell H. Morgan Department of Radiology and Radiological Science, Baltimore, MD (United States); Fayad, Laura M. [The Johns Hopkins Medical Institutions, The Russell H. Morgan Department of Radiology and Radiological Science, Baltimore, MD (United States); The Johns Hopkins Medical Institutions, Department of Oncology, Baltimore, MD (United States); The Johns Hopkins Medical Institutions, Department of Orthopaedic Surgery, Baltimore, MD (United States)

    2018-03-15

    Diffusion-weighted imaging (DWI) is a fast, non-contrast technique that is readily available and easy to integrate into an existing imaging protocol. DWI with apparent diffusion coefficient (ADC) mapping offers a quantitative metric for soft tissue evaluation and provides information regarding the cellularity of a region of interest. There are several available methods of performing DWI, and artifacts and pitfalls must be considered when interpreting DWI studies. This review article will review the various techniques of DWI acquisition and utility of qualitative as well as quantitative methods of image interpretation, with emphasis on optimal methods for ADC measurement. The current clinical applications for DWI are primarily related to oncologic evaluation: For the assessment of de novo soft tissue masses, ADC mapping can serve as a useful adjunct technique to routine anatomic sequences for lesion characterization as cyst or solid and, if solid, benign or malignant. For treated soft tissue masses, the role of DWI/ADC mapping in the assessment of treatment response as well as recurrent or residual neoplasm in the setting of operative management is discussed, especially when intravenous contrast medium cannot be given. Emerging DWI applications for non-neoplastic clinical indications are also reviewed. (orig.)
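The ADC computation underlying such maps is a monoexponential fit, S(b) = S0·exp(-b·ADC), so with two b values ADC = ln(S0/Sb)/b. The sketch below uses synthetic two-point data for illustration only:

```python
import numpy as np

def adc_map(s0, sb, b):
    """Voxelwise ADC from b=0 and b>0 images (monoexponential model).
    Invalid voxels (zero or negative signal) are mapped to 0."""
    with np.errstate(divide="ignore", invalid="ignore"):
        adc = np.log(s0 / sb) / b
    return np.nan_to_num(adc, nan=0.0, posinf=0.0, neginf=0.0)

b = 800.0  # s/mm^2, matching the highest b value in this study
s0 = np.array([1000.0, 1000.0])
# simulate a cellular (low-ADC) and a cystic (high-ADC) voxel
true_adc = np.array([0.8e-3, 2.5e-3])  # mm^2/s
sb = s0 * np.exp(-b * true_adc)
adc = adc_map(s0, sb, b)
```

Low ADC reflects restricted diffusion in cellular (often malignant) tissue, while high ADC suggests free water, which is why a cut-off threshold on the ROI-averaged ADC can separate solid recurrence from cyst or scar.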

  7. Mycobacterium avium subsp. Paratuberculosis (MAP) as a modifying factor in Crohn's disease.

    LENUS (Irish Health Repository)

    Sibartie, Shomik

    2010-02-01

    Crohn's disease (CD) is a multifactorial syndrome with genetic and environmental contributions. Mycobacterium avium subspecies paratuberculosis (MAP) has been frequently isolated from mucosal tissues of patients with CD, but the cellular immune response to this bacterium has been poorly described. Our aim was to examine the influence of MAP on T-cell proliferation and cytokine responses in patients with inflammatory bowel disease (IBD).

  8. Probable Inference and Quantum Mechanics

    International Nuclear Information System (INIS)

    Grandy, W. T. Jr.

    2009-01-01

    In its current very successful interpretation the quantum theory is fundamentally statistical in nature. Although commonly viewed as a probability amplitude whose (complex) square is a probability, the wavefunction or state vector continues to defy consensus as to its exact meaning, primarily because it is not a physical observable. Rather than approach this problem directly, it is suggested that it is first necessary to clarify the precise role of probability theory in quantum mechanics, either as applied to, or as an intrinsic part of the quantum theory. When all is said and done the unsurprising conclusion is that quantum mechanics does not constitute a logic and probability unto itself, but adheres to the long-established rules of classical probability theory while providing a means within itself for calculating the relevant probabilities. In addition, the wavefunction is seen to be a description of the quantum state assigned by an observer based on definite information, such that the same state must be assigned by any other observer based on the same information, in much the same way that probabilities are assigned.

  9. Can MRI diffusion-weighted imaging identify postoperative residual/recurrent soft-tissue sarcomas?

    Directory of Open Access Journals (Sweden)

    Mai Maher ElDaly

    2018-01-01

    Full Text Available Purpose: The aim of this study was to evaluate contrast-enhanced magnetic resonance imaging (CE-MRI) and quantitative diffusion-weighted imaging (DWI) with apparent diffusion coefficient (ADC) mapping in the detection of recurrent/residual postoperative soft tissue sarcomas. Materials and Methods: This study included 36 patients; 27 patients had postoperative recurrent/residual soft tissue sarcomas and 9 patients had postoperative and treatment-related changes (inflammation/fibrosis). The DWI was obtained with 3 b values: 0, 400, and 800 s/mm2. The ADC value of each lesion was calculated by placing a region of interest (ROI) to include the largest area of the lesion. ADC values were compared to histopathology. Results: Our results showed that including CE-MRI improved the diagnostic accuracy and sensitivity in recurrence detection compared to conventional non-enhanced sequences. However, it showed low specificity (55.56%) with a high false-positive rate that may lead to an unnecessary biopsy of a mass such as a region of postoperative scar tissue. Conclusion: The joint use of gadolinium-enhanced MRI and quantitative DWI with ADC mapping offers added value in the detection of recurrent/residual postoperative soft tissue sarcoma. This combined use increased both the diagnostic sensitivity and specificity, with a cut-off average ADC value for detecting nonmyxoid recurrent/residual lesions of ≤1.3 × 10−3 mm2/s (100% specificity and 90.48% sensitivity). Our results showed limited value of DWI with ADC mapping in assessing myxoid sarcomatous tumor recurrences.

  10. Psychophysics of the probability weighting function

    Science.gov (United States)

    Takahashi, Taiki

    2011-03-01

    A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics have widely utilized probability weighting functions, the psychophysical foundations of these functions have remained unknown. Notably, the behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p) = exp(−(−ln p)^α) (0 < α < 1; w(0) = 0, w(1/e) = 1/e, w(1) = 1), which has been studied extensively in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
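    Prelec's one-parameter form is easy to evaluate directly; a short sketch (the α values below are arbitrary illustrations):

```python
import math

def prelec_w(p, alpha=0.65):
    """Prelec (1998) weighting function w(p) = exp(-(-ln p)^alpha), 0 < alpha < 1."""
    if p <= 0.0:
        return 0.0
    if p >= 1.0:
        return 1.0
    return math.exp(-((-math.log(p)) ** alpha))

# Fixed point at p = 1/e for every alpha: w(1/e) = 1/e
p = 1 / math.e
print(abs(prelec_w(p, 0.3) - p) < 1e-12)             # True
# Inverse-S shape: small probabilities overweighted, large ones underweighted
print(prelec_w(0.01) > 0.01, prelec_w(0.99) < 0.99)  # True True
```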

  11. Probability concepts in quality risk management.

    Science.gov (United States)

    Claycamp, H Gregg

    2012-01-01

    Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although risk generally describes a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management methodologies and tools focus on managing severity but are relatively silent on the meaning and uses of "probability." Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle from drug development to manufacture, and marketing to product discontinuation. The probability concept is typically applied by risk managers as a combination of frequency-based calculation and a subjective "degree of belief" meaning of probability. Probability as a concept that is crucial for understanding and managing risk is discussed through examples, from the most general scenario-defining and ranking tools that use probability implicitly to more specific probabilistic tools in risk management. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management.

  12. Cyto- and receptor architectonic mapping of the human brain.

    Science.gov (United States)

    Palomero-Gallagher, Nicola; Zilles, Karl

    2018-01-01

    Mapping of the human brain is more than the generation of an atlas-based parcellation of brain regions using histologic or histochemical criteria. It is the attempt to provide a topographically informed model of the structural and functional organization of the brain. To achieve this goal, a multimodal atlas of the detailed microscopic and neurochemical structure of the brain must be registered to a stereotaxic reference space or brain, which also serves as the reference for topographic assignment of functional data, e.g., functional magnetic resonance imaging, electroencephalography, or magnetoencephalography, as well as metabolic imaging, e.g., positron emission tomography. Although the classic maps remain pioneering steps, they do not match recent concepts of the functional organization of many regions, and they suffer from methodological drawbacks. This chapter provides a summary of the current status of human brain mapping, which is based on multimodal approaches integrating results of quantitative cyto- and receptor architectonic studies, with a focus on the cerebral cortex in a widely used reference brain. Descriptions of the methods for observer-independent and statistically testable cytoarchitectonic parcellations, quantitative multireceptor mapping, and registration to the reference brain, including the concept of probability maps and a toolbox for using the maps in functional neuroimaging studies, are provided. Copyright © 2018 Elsevier B.V. All rights reserved.

  13. Spatial distribution and occurrence probability of regional new particle formation events in eastern China

    Directory of Open Access Journals (Sweden)

    X. Shen

    2018-01-01

    Full Text Available In this work, the spatial extent of new particle formation (NPF) events and the relative probability of observing particles originating from different spatial origins around three rural sites in eastern China were investigated with the NanoMap method, using particle number size distribution (PNSD) data and air mass back trajectories. The lengths of the datasets were 7, 1.5, and 3 years at the rural sites Shangdianzi (SDZ) in the North China Plain (NCP), Mt. Tai (TS) in central eastern China, and Lin'an (LAN) in the Yangtze River Delta region in eastern China, respectively. Regional NPF events were observed to occur with a horizontal extent larger than 500 km at SDZ and TS, favoured by the fast transport of northwesterly air masses. At LAN, however, the spatial footprint of NPF events was mostly observed around the site within 100–200 km. Differences in the horizontal spatial distribution of new particle source areas at the different sites were connected to the typical meteorological conditions at each site. Consecutive large-scale regional NPF events were observed at SDZ and TS simultaneously and were associated with a high surface pressure system dominating over this area. Simultaneous NPF events at SDZ and LAN were seldom observed. At SDZ the polluted air masses arriving over the NCP were associated with a higher particle growth rate (GR) and new particle formation rate (J) than air masses from Inner Mongolia (IM). At TS the same phenomenon was observed for J, but GR was somewhat lower in air masses arriving over the NCP compared to those arriving from IM. The capability of NanoMap to capture the NPF occurrence probability depends on the length of the PNSD dataset but also on the topography around the measurement site and the typical air mass advection speed during NPF events. Thus long-term measurements of PNSD in the planetary boundary layer are necessary in the further study of the spatial extent and probability of NPF events.

  14. Improved Culture Medium (TiKa) for Mycobacterium avium Subspecies Paratuberculosis (MAP) Matches qPCR Sensitivity and Reveals Significant Proportions of Non-viable MAP in Lymphoid Tissue of Vaccinated MAP Challenged Animals

    DEFF Research Database (Denmark)

    Bull, Tim J.; Munshil, Tulika; Melvang, Heidi Mikkelsen

    2017-01-01

    The quantitative detection of viable pathogen load is an important tool in determining the degree of infection in animals and contamination of foodstuffs. Current conventional culture methods are limited in their ability to determine these levels in Mycobacterium avium subspecies paratuberculosis (MAP) due to slow growth, clumping and low recoverability issues. The principal goal of this study was to evaluate a novel culturing process (TiKa) with a unique ability to stimulate MAP growth from low sample loads and dilutions. We demonstrate it was able to stimulate a mean 29-fold increase... TiKa culture equates well with qPCR and provides important evidence that accuracy in estimating viable MAP load using DNA tests alone may vary significantly between samples of mucosal and lymphatic origin.

  15. Streamflow distribution maps for the Cannon River drainage basin, southeast Minnesota, and the St. Louis River drainage basin, northeast Minnesota

    Science.gov (United States)

    Smith, Erik A.; Sanocki, Chris A.; Lorenz, David L.; Jacobsen, Katrin E.

    2017-12-27

    Streamflow distribution maps for the Cannon River and St. Louis River drainage basins were developed by the U.S. Geological Survey, in cooperation with the Legislative-Citizen Commission on Minnesota Resources, to illustrate relative and cumulative streamflow distributions. The Cannon River was selected to provide baseline data to assess the effects of potential surficial sand mining, and the St. Louis River was selected to determine the effects of ongoing Mesabi Iron Range mining. Each drainage basin (Cannon, St. Louis) was subdivided into nested drainage basins: the Cannon River was subdivided into 152 nested drainage basins, and the St. Louis River was subdivided into 353 nested drainage basins. For each smaller drainage basin, the estimated volumes of groundwater discharge (as base flow) and surface runoff flowing into all surface-water features were displayed under the following conditions: (1) extreme low-flow conditions, comparable to an exceedance-probability quantile of 0.95; (2) low-flow conditions, comparable to an exceedance-probability quantile of 0.90; (3) a median condition, comparable to an exceedance-probability quantile of 0.50; and (4) a high-flow condition, comparable to an exceedance-probability quantile of 0.02. Streamflow distribution maps were developed using flow-duration curve exceedance-probability quantiles in conjunction with Soil-Water-Balance model outputs; both the flow-duration curve and Soil-Water-Balance models were built upon previously published U.S. Geological Survey reports. The selected streamflow distribution maps provide a proactive water management tool for State cooperators by illustrating flow rates during a range of hydraulic conditions. Furthermore, after the nested drainage basins are highlighted in terms of surface-water flows, the streamflows can be evaluated in the context of meeting specific ecological flows under different flow regimes and potentially assist with decisions regarding groundwater and surface water.
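    The exceedance-probability quantiles above have a direct sample analogue: the flow exceeded with probability p is the (1 − p) quantile of the flow record. A sketch with synthetic daily flows (the log-normal record is an assumption for illustration; real use would read gauged data):

```python
import numpy as np

def exceedance_quantile(flows, p_exceed):
    """Flow exceeded with probability p_exceed:
    P(flow >= Q) = p_exceed  <=>  Q is the (1 - p_exceed) sample quantile."""
    return float(np.quantile(np.asarray(flows), 1.0 - p_exceed))

# Synthetic 10-year daily streamflow record (m^3/s)
rng = np.random.default_rng(0)
flows = rng.lognormal(mean=2.0, sigma=0.8, size=3650)

for p in (0.95, 0.90, 0.50, 0.02):  # the four conditions used in the report
    print(f"exceedance probability {p:.2f}: {exceedance_quantile(flows, p):8.2f} m^3/s")
```

    Note that a higher exceedance probability corresponds to a lower flow (the 0.95 condition is an extreme low flow, 0.02 a high flow).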

  16. Monitoring soft tissue coagulation by optical spectroscopy

    Science.gov (United States)

    Lihachev, A.; Lihacova, I.; Heinrichs, H.; Spigulis, J.; Trebst, T.; Wehner, M.

    2017-12-01

    Laser tissue welding (LTW) and laser tissue soldering (LTS) have been investigated for many years for the treatment of incisions, wound closure and anastomosis of vessels [1, 2]. Depending on the process, a certain temperature in the range between 65 °C and 85 °C must be reached and held for a few seconds. Care has to be taken not to overheat the tissue; otherwise necrosis or tissue carbonization may occur and will impair wound healing. Usually the temperature is monitored during the process to control the laser power [3]. This requires either bulky equipment or expensive and fragile infrared fibers to feed the temperature signal to an infrared detector. Alternatively, changes in tissue morphology can be observed directly by analysis of spectral reflectance. We investigate spectral changes in the wavelength range between 400 nm and 900 nm. Characteristic spectral changes occur when the temperature of tissue samples increases above 70 °C, which is a typical setpoint value for temperature control of coagulation. We conclude that simple spectroscopy in the visible range can provide valuable information during LTS and LTW and can probably replace the delicate measurement of temperature. A major advantage is that optical measurements can be performed using standard optical fibers and can be easily integrated into a surgical tool.

  17. The response analysis of fractional-order stochastic system via generalized cell mapping method.

    Science.gov (United States)

    Wang, Liang; Xue, Lili; Sun, Chunyan; Yue, Xiaole; Xu, Wei

    2018-01-01

    This paper is concerned with the response of a fractional-order stochastic system. The short memory principle is introduced to ensure that the response of the system is a Markov process. The generalized cell mapping method is applied to display the global dynamics of the noise-free system, such as attractors, basins of attraction, basin boundaries, saddles, and invariant manifolds. The stochastic generalized cell mapping method is employed to obtain the evolutionary process of the probability density functions of the response. The fractional-order ϕ^6 oscillator and the fractional-order smooth and discontinuous oscillator are taken as examples to illustrate our strategies. Studies have shown that the evolutionary direction of the probability density function of the fractional-order stochastic system is consistent with the unstable manifold. The effectiveness of the method is confirmed using Monte Carlo results.
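    The stochastic generalized cell mapping idea can be sketched on a much simpler system than the paper's fractional-order oscillators: partition the state space into cells, estimate cell-to-cell transition probabilities by Monte Carlo sampling, and iterate the probability vector. Here a noisy 1D logistic map stands in for the dynamics (an assumption purely for illustration):

```python
import numpy as np

# Minimal stochastic generalized cell mapping on a noisy 1D map.
rng = np.random.default_rng(1)
N = 100                                 # number of cells covering [0, 1]
edges = np.linspace(0.0, 1.0, N + 1)

def noisy_map(x):
    """Logistic map plus small Gaussian noise, clipped back into [0, 1)."""
    return np.clip(3.8 * x * (1 - x) + 0.01 * rng.standard_normal(x.shape),
                   0.0, 1.0 - 1e-12)

# One-step transition matrix P[i, j] = Prob(cell i -> cell j), by sampling
samples_per_cell = 500
P = np.zeros((N, N))
for i in range(N):
    x0 = rng.uniform(edges[i], edges[i + 1], samples_per_cell)
    j = np.minimum((noisy_map(x0) * N).astype(int), N - 1)
    np.add.at(P[i], j, 1.0 / samples_per_cell)

# Evolve a probability density from a uniform start; rows of P sum to 1,
# so probability mass is conserved at every step.
p = np.full(N, 1.0 / N)
for _ in range(200):
    p = p @ P
print(f"mass conserved: {p.sum():.6f}")
```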

  18. Introduction to probability with R

    CERN Document Server

    Baclawski, Kenneth

    2008-01-01

    FOREWORD PREFACE Sets, Events, and Probability The Algebra of Sets The Bernoulli Sample Space The Algebra of Multisets The Concept of Probability Properties of Probability Measures Independent Events The Bernoulli Process The R Language Finite Processes The Basic Models Counting Rules Computing Factorials The Second Rule of Counting Computing Probabilities Discrete Random Variables The Bernoulli Process: Tossing a Coin The Bernoulli Process: Random Walk Independence and Joint Distributions Expectations The Inclusion-Exclusion Principle General Random Variable

  19. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  20. Thermal and molecular investigation of laser tissue welding

    Energy Technology Data Exchange (ETDEWEB)

    Small, W., IV

    1998-06-01

    Despite the growing number of successful animal and human trials, the exact mechanisms of laser tissue welding remain unknown. Furthermore, the effects of laser heating on tissue on the molecular scale are not fully understood. To address these issues, a multi-front attack on both extrinsic (solder/patch mediated) and intrinsic (laser only) tissue welding was launched using two-color infrared thermometry, computer modeling, weld strength assessment, biochemical assays, and vibrational spectroscopy. The coupling of experimentally measured surface temperatures with the predictive numerical simulations provided insight into the sub-surface dynamics of the laser tissue welding process. Quantification of the acute strength of the welds following the welding procedure enabled comparison among trials within an experiment, with previous experiments, and with other studies in the literature. The acute weld integrity also provided an indication of the probability of long-term success. Molecular effects induced in the tissue by laser irradiation were investigated by measuring the concentrations of specific collagen covalent crosslinks and characterizing the Fourier-transform infrared (FTIR) spectra before and after the laser exposure.

  1. Mapping out Map Libraries

    Directory of Open Access Journals (Sweden)

    Ferjan Ormeling

    2008-09-01

    Full Text Available Discussing the requirements for map data quality, map users and their library/archives environment, the paper focuses on the metadata the user would need for a correct and efficient interpretation of the map data. For such a correct interpretation, knowledge of the rules and guidelines according to which the topographers/cartographers work (such as the kinds of data categories to be collected), and of the degree to which these rules and guidelines were indeed followed, is essential. This is not only valid for the old maps stored in our libraries and archives, but perhaps even more so for the new digital files, as the format in which we now have to access our geospatial data. As this would be too much to ask from map librarians/curators, some sort of web 2.0 environment is sought where comments about data quality, completeness and up-to-dateness from knowledgeable map users regarding the specific maps or map series studied can be collected and tagged to scanned versions of these maps on the web. In order not to be subject to the same disadvantages as Wikipedia, where the ‘communis opinio’, rather than scholarship, seems to be decisive, some checking by map curators of this tagged map-use information would still be needed. Cooperation between map curators and the International Cartographic Association (ICA) map and spatial data use commission to this end is suggested.

  2. MAP Fault Localization Based on Wide Area Synchronous Phasor Measurement Information

    Science.gov (United States)

    Zhang, Yagang; Wang, Zengping

    2015-02-01

    In the research of complicated electrical engineering, the emergence of phasor measurement units (PMU) is a landmark event. The establishment and application of wide area measurement systems (WAMS) in power systems has had a widespread and profound influence on the safe and stable operation of complicated power systems. In this paper, taking full advantage of the wide area synchronous phasor measurement information provided by PMUs, we have carried out precise fault localization based on the principle of maximum a posteriori (MAP) probability. Large numbers of simulation experiments have confirmed that the results of MAP fault localization are accurate and reliable. Even in the presence of white Gaussian noise, the results of MAP classification remain consistent with the actual situation.
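    Under white Gaussian measurement noise, MAP classification over candidate fault locations reduces to maximizing log-prior minus scaled squared residual. A sketch with made-up measurement signatures, priors, and noise level (none of these come from the paper):

```python
import numpy as np

# MAP fault-location classifier: each candidate location k has an expected
# phasor-measurement signature mu_k; measurements carry Gaussian noise.
def map_classify(y, signatures, priors, sigma=0.05):
    # log posterior up to a constant: log(prior_k) - ||y - mu_k||^2 / (2 sigma^2)
    logpost = [np.log(pi) - np.sum((y - mu) ** 2) / (2 * sigma ** 2)
               for mu, pi in zip(signatures, priors)]
    return int(np.argmax(logpost))

rng = np.random.default_rng(42)
signatures = [np.array([1.0, 0.2, 0.1]),
              np.array([0.3, 1.1, 0.4]),
              np.array([0.1, 0.3, 0.9])]
priors = [1 / 3, 1 / 3, 1 / 3]

true_k = 1
y = signatures[true_k] + 0.05 * rng.standard_normal(3)  # noisy measurement
print(map_classify(y, signatures, priors))  # 1
```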

  3. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as function of the ship and the crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands.The most important ship and crew...... characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look out etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds...... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step.The report documents the first step in a probabilistic collision damage analysis. Future work will inlcude calculation of energy released for crushing of structures giving...

  4. Irreversibility and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    The mathematical entropy - unlike physical entropy - is simply a measure of uniformity for probability distributions in general. So understood, conditional entropies have the same logical structure as conditional probabilities. If, as is sometimes supposed, conditional probabilities are time-reversible, then so are conditional entropies and, paradoxically, both then share this symmetry with physical equations of motion. The paradox is, of course, that probabilities yield a direction to time both in statistical mechanics and quantum mechanics, while the equations of motion do not. The supposed time-reversibility of both conditionals seems also to involve a form of retrocausality that is related to, but possibly not the same as, that described by Costa de Beauregard. The retrocausality is paradoxically at odds with the generally presumed irreversibility of the quantum mechanical measurement process. Further paradox emerges if the supposed time-reversibility of the conditionals is linked with the idea that the thermodynamic entropy is the same thing as 'missing information', since this confounds the thermodynamic and mathematical entropies. However, it is shown that irreversibility is a formal consequence of conditional entropies and, hence, of conditional probabilities also. 8 refs. (Author)
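    The structural parallel between conditional entropy and conditional probability can be made concrete: H(X|Y) = Σ_y p(y) H(X|Y=y), from which the chain rule H(X|Y) = H(X,Y) − H(Y) follows. A toy numerical check (the joint distribution is an arbitrary illustration):

```python
import numpy as np

# Conditional entropy built from conditional probabilities, for a toy p(x, y).
joint = np.array([[0.3, 0.1],
                  [0.2, 0.4]])   # rows index x, columns index y

p_y = joint.sum(axis=0)
p_x_given_y = joint / p_y                      # each column is p(x | y)
H_x_given_y = -np.sum(p_x_given_y * np.log2(p_x_given_y), axis=0)
H_cond = float(p_y @ H_x_given_y)              # H(X|Y) = sum_y p(y) H(X|Y=y)

# Chain-rule consistency check: H(X|Y) = H(X,Y) - H(Y)
H_joint = -np.sum(joint * np.log2(joint))
H_y = -np.sum(p_y * np.log2(p_y))
print(f"H(X|Y) = {H_cond:.4f} bits")
print(abs(H_cond - (H_joint - H_y)) < 1e-12)   # True
```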

  5. Epitope mapping of the U1 small nuclear ribonucleoprotein particle in patients with systemic lupus erythematosus and mixed connective tissue disease.

    Science.gov (United States)

    Somarelli, J A; Mesa, A; Rodriguez, R; Avellan, R; Martinez, L; Zang, Y J; Greidinger, E L; Herrera, R J

    2011-03-01

    Systemic lupus erythematosus (SLE) and mixed connective tissue disease (MCTD) are autoimmune illnesses characterized by the presence of high titers of autoantibodies directed against a wide range of 'self' antigens. Proteins of the U1 small nuclear ribonucleoprotein particle (U1 snRNP) are among the most immunogenic molecules in patients with SLE and MCTD. The recent release of a crystallized U1 snRNP provides a unique opportunity to evaluate the effects of tertiary and quaternary structures on autoantigenicity within the U1 snRNP. In the present study, an epitope map was created using the U1 snRNP crystal structure. A total of 15 peptides were tested in a cohort of 68 patients with SLE, 29 with MCTD and 26 healthy individuals and mapped onto the U1 snRNP structure. Antigenic sites were detected in a variety of structures and appear to include RNA binding domains, but mostly exclude regions necessary for protein-protein interactions. These data suggest that while some autoantibodies may target U1 snRNP proteins as monomers or apoptosis-induced, protease-digested fragments, others may recognize epitopes on assembled protein subcomplexes of the U1 snRNP. Although nearly all of the peptides are strong predictors of autoimmune illness, none were successful at distinguishing between SLE and MCTD. The antigenicity of some peptides significantly correlated with several clinical symptoms. This investigation implicitly highlights the complexities of autoimmune epitopes, and autoimmune illnesses in general, and demonstrates the variability of antigens in patient populations, all of which contribute to difficult clinical diagnoses.

  6. Introduction to probability with Mathematica

    CERN Document Server

    Hastings, Kevin J

    2009-01-01

    Discrete Probability The Cast of Characters Properties of Probability Simulation Random Sampling Conditional Probability Independence Discrete Distributions Discrete Random Variables, Distributions, and Expectations Bernoulli and Binomial Random Variables Geometric and Negative Binomial Random Variables Poisson Distribution Joint, Marginal, and Conditional Distributions More on Expectation Continuous Probability From the Finite to the (Very) Infinite Continuous Random Variables and Distributions Continuous Expectation Continuous Distributions The Normal Distribution Bivariate Normal Distribution New Random Variables from Old Order Statistics Gamma Distributions Chi-Square, Student's t, and F-Distributions Transformations of Normal Random Variables Asymptotic Theory Strong and Weak Laws of Large Numbers Central Limit Theorem Stochastic Processes and Applications Markov Chains Poisson Processes Queues Brownian Motion Financial Mathematics Appendix Introduction to Mathematica Glossary of Mathematica Commands for Probability Short Answers...

  7. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision-maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
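    The central effect is easy to reproduce numerically: a threshold set at the fitted 99% quantile from a small sample is breached, on average, more often than the nominal 1%. A sketch for a log-normal loss, working on the log scale (all parameter values are arbitrary illustrations):

```python
import numpy as np
from math import erf, sqrt

# A control threshold set from *estimated* parameters is exceeded, on
# average, more often than the nominal failure probability.
rng = np.random.default_rng(7)
mu, sigma = 0.0, 1.0      # true parameters of the log-losses (unknown in practice)
nominal = 0.01            # required failure probability
n, trials = 20, 20000     # small sample, many estimation replays
z99 = 2.3263478740408408  # standard normal 0.99 quantile

realized = []
for _ in range(trials):
    data = rng.normal(mu, sigma, n)                   # observed log-losses
    threshold = data.mean() + data.std(ddof=1) * z99  # fitted 99% quantile
    # true probability that a fresh loss exceeds the fitted threshold
    realized.append(0.5 * (1 - erf((threshold - mu) / (sigma * sqrt(2)))))

print(f"nominal {nominal:.3f}, expected realized {np.mean(realized):.4f}")
```

    The expected realized frequency exceeds the nominal level, matching the paper's claim; for this normal/log-normal case the exact value is a Student-t tail probability, independent of the true parameters.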

  8. Derivation of elastic stiffness from site-matched mineral density and acoustic impedance maps

    International Nuclear Information System (INIS)

    Raum, Kay; Cleveland, Robin O; Peyrin, Francoise; Laugier, Pascal

    2006-01-01

    200 MHz acoustic impedance maps and site-matched synchrotron radiation micro computed tomography (SR-μCT) maps of tissue degree of mineralization of bone (DMB) were used to derive the elastic coefficient c33 in cross sections of human cortical bone. To accomplish this goal, a model was developed to relate the DMB accessible with SR-μCT to mass density. The formulation incorporates the volume fractions and densities of the major bone tissue components (collagen, mineral and water), and accounts for tissue porosity. We found that the mass density can be well modelled by a second-order polynomial fit to DMB (R² = 0.999) and appears to be consistent with measurements of many different types of mineralized tissues. The derived elastic coefficient c33 correlated more strongly with the acoustic impedance (R² = 0.996) than with mass density (R² = 0.310). This finding suggests that estimates of c33 made from measurements of the acoustic impedance are more reliable than those made from density measurements. Mass density and elastic coefficient were in the ranges of 1.66 to 2.00 g cm⁻³ and 14.8 to 75.4 GPa, respectively. Although SAM inspection is limited to the evaluation of carefully prepared sample surfaces, it provides a two-dimensional quantitative estimate of elastic tissue properties at the tissue level.
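    The derivation rests on the standard longitudinal-wave relations Z = ρv and c33 = ρv², hence c33 = Z²/ρ. A unit-conscious sketch; the example values are arbitrary but fall inside the ranges reported above, and the paper's DMB-to-density polynomial itself is not reproduced here:

```python
# From Z = rho * v (acoustic impedance) and c33 = rho * v^2, it follows
# that c33 = Z^2 / rho.
def c33_from_impedance(z_mrayl, rho_g_cm3):
    """z in MRayl (1e6 kg m^-2 s^-1), rho in g/cm^3 -> c33 in GPa."""
    z = z_mrayl * 1e6            # kg m^-2 s^-1
    rho = rho_g_cm3 * 1e3        # kg m^-3
    return (z * z / rho) / 1e9   # Pa -> GPa

# Example values chosen to lie within the paper's reported ranges
print(f"{c33_from_impedance(8.0, 1.9):.1f} GPa")  # 33.7 GPa
```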

  9. Responses of some normal tissues to low doses of γ-radiation

    International Nuclear Information System (INIS)

    Withers, H.R.

    1975-01-01

    The response of four normal tissues to low doses of γ-radiation was measured in mice using three indirect methods. The survival curves for cells of the tissues studied (colon, jejunum, testis and haemoleucopoietic system) may be exponential over an uncertain dose range (from zero to between 100 and 230 rad), the slope being about one third of that in the high-dose region. Some of the uncertainties in the data probably reflect variations in age-density distribution. (author)

  10. Pancreatic tissue fluid pressure and pain in chronic pancreatitis

    DEFF Research Database (Denmark)

    Ebbehøj, N

    1992-01-01

    A causal relation between pancreatic pressure and pain has been sought for decades, but the lack of appropriate methods for pressure measurements hindered progress. During the 1980s the needle method was used for direct intraoperative pancreatic tissue fluid pressure measurements and later for percutaneous sonographically-guided pressure measurements. Clinical and experimental evaluation of the method showed comparable results at intraoperative and percutaneous measurements and little week-to-week variation. Furthermore, comparable pressures in duct and adjacent pancreatic tissue were found, i.e. the needle pressure mirrors the intraductal pressure. Comparisons of pain registrations, morphological and functional parameters with pancreatic tissue fluid pressure measurements have revealed a relation between pressure and pain which probably is causal. In patients with pain, the high pressures previously...

  11. Topographic brain mapping of emotion-related hemisphere asymmetries.

    Science.gov (United States)

    Roschmann, R; Wittling, W

    1992-03-01

    The study used topographic brain mapping of visual evoked potentials to investigate emotion-related hemisphere asymmetries. The stimulus material consisted of color photographs of human faces, grouped into two emotion-related categories: normal faces (neutral stimuli) and faces deformed by dermatological diseases (emotional stimuli). The pictures were presented tachistoscopically to 20 adult right-handed subjects. Brain activity was recorded by 30 EEG electrodes with linked ears as reference. The waveforms were averaged separately with respect to each of the two stimulus conditions. Statistical analysis by means of significance probability mapping revealed significant differences between stimulus conditions for two periods of time, indicating right hemisphere superiority in emotion-related processing. The results are discussed in terms of a two-stage model of emotional processing in the cerebral hemispheres.

  12. Conditional Probability Modulates Visual Search Efficiency

    Directory of Open Access Journals (Sweden)

    Bryan eCort

    2013-10-01

    Full Text Available We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.
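    The design can be mimicked with a small contingency table in which the marginal color probability is 0.5 while the conditional probability given the cue pair is informative (the counts are hypothetical, not the study's data):

```python
import numpy as np

# Rows: cue pairs; columns: target colors A, B.
counts = np.array([[90, 10],    # cue pair 1: P(A | cues) = 0.9
                   [10, 90]])   # cue pair 2: P(A | cues) = 0.1

p_color_given_cues = counts / counts.sum(axis=1, keepdims=True)
p_color = counts.sum(axis=0) / counts.sum()
print(p_color)               # [0.5 0.5] -> absolute probability is uninformative
print(p_color_given_cues)    # the conditional probability carries the signal
```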

  13. Development of a multivariable normal tissue complication probability (NTCP) model for tube feeding dependence after curative radiotherapy/chemo-radiotherapy in head and neck cancer

    International Nuclear Information System (INIS)

    Wopken, Kim; Bijl, Hendrik P.; Schaaf, Arjen van der; Laan, Hans Paul van der; Chouvalova, Olga; Steenbakkers, Roel J.H.M.; Doornaert, Patricia; Slotman, Ben J.; Oosting, Sjoukje F.; Christianen, Miranda E.M.C.; Laan, Bernard F.A.M. van der; Roodenburg, Jan L.N.; René Leemans, C.; Verdonck-de Leeuw, Irma M.; Langendijk, Johannes A.

    2014-01-01

    Background and purpose: Curative radiotherapy/chemo-radiotherapy for head and neck cancer (HNC) may result in severe acute and late side effects, including tube feeding dependence. The purpose of this prospective cohort study was to develop a multivariable normal tissue complication probability (NTCP) model for tube feeding dependence at 6 months (TUBE M6) after definitive radiotherapy, radiotherapy plus cetuximab or concurrent chemoradiation, based on pre-treatment and treatment characteristics. Materials and methods: The study included 355 patients with HNC. TUBE M6 was scored prospectively in a standard follow-up program. To design the prediction model, the penalized learning method LASSO was used, with TUBE M6 as the endpoint. Results: The prevalence of TUBE M6 was 10.7%. The multivariable model with the best performance consisted of the variables: advanced T-stage, moderate to severe weight loss at baseline, accelerated radiotherapy, chemoradiation, radiotherapy plus cetuximab, and the mean dose to the superior and inferior pharyngeal constrictor muscles, to the contralateral parotid gland and to the cricopharyngeal muscle. Conclusions: We developed a multivariable NTCP model for TUBE M6 to identify patients at risk for tube feeding dependence. The dosimetric variables can be used to optimize radiotherapy treatment planning aiming at prevention of tube feeding dependence and to estimate the benefit of new radiation technologies.

  14. Identification of QTLs Associated with Callogenesis and Embryogenesis in Oil Palm Using Genetic Linkage Maps Improved with SSR Markers.

    NARCIS (Netherlands)

    Ting, N.C.; Jansen, J.; Nagappan, J.; Ishak, Z.; Chin, C.W.; Tan, S.G.; Cheah, S.C.; Singh, R.

    2013-01-01

    Clonal reproduction of oil palm by means of tissue culture is a very inefficient process. Tissue culturability is known to be genotype dependent with some genotypes being more amenable to tissue culture than others. In this study, genetic linkage maps enriched with simple sequence repeat (SSR)

  15. Dynamic Quantitative T1 Mapping in Orthotopic Brain Tumor Xenografts

    Directory of Open Access Journals (Sweden)

    Kelsey Herrmann

    2016-04-01

    Human brain tumors such as glioblastomas are typically detected using conventional, nonquantitative magnetic resonance imaging (MRI) techniques, such as T2-weighted and contrast-enhanced T1-weighted MRI. In this manuscript, we tested whether dynamic quantitative T1 mapping by MRI can localize orthotopic glioma tumors in an objective manner. Quantitative T1 mapping was performed by MRI over multiple time points using the conventional contrast agent Optimark. We compared signal differences to determine the gadolinium concentration in tissues over time. The T1 parametric maps made it easy to identify the regions of contrast enhancement and thus tumor location. Doubling the typical human dose of contrast agent resulted in a clearer demarcation of these tumors. Therefore, T1 mapping of brain tumors is gadolinium dose dependent and improves detection of tumors by MRI. The use of T1 maps provides a quantitative means to evaluate tumor detection by gadolinium-based contrast agents over time. This dynamic quantitative T1 mapping technique will also enable future quantitative evaluation of various targeted MRI contrast agents.
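    Quantitative T1 mapping rests on fitting a relaxation model per voxel. A minimal sketch, assuming a simple saturation-recovery model S(TR) = S0·(1 − exp(−TR/T1)) rather than the study's actual acquisition, with invented timing values:

```python
import numpy as np

# Noiseless synthetic voxel: signal measured at several repetition times.
TR = np.array([0.1, 0.3, 0.6, 1.0, 2.0, 4.0])   # repetition times, seconds
T1_true, S0 = 1.2, 100.0
signal = S0 * (1.0 - np.exp(-TR / T1_true))

def fit_t1(tr, s, s0):
    # linearize: ln(1 - S/S0) = -TR / T1, then least-squares slope through origin
    y = np.log(1.0 - s / s0)
    slope = np.sum(tr * y) / np.sum(tr * tr)
    return -1.0 / slope

print(round(fit_t1(TR, signal, S0), 3))  # recovers T1 = 1.2 s for noiseless data
```

Repeating such a fit at every voxel and time point yields the parametric T1 maps whose change after contrast injection localizes the tumor.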

  16. Analysis Of Transcriptomes In A Porcine Tissue Collection Using RNA-Seq And Genome Assembly 10

    DEFF Research Database (Denmark)

    Hornshøj, Henrik; Thomsen, Bo; Hedegaard, Jakob

    2011-01-01

    The release of Sus scrofa genome assembly 10 supports improvement of the pig genome annotation and in depth transcriptome analyses using next-generation sequencing technologies. In this study we analyze RNA-seq reads from a tissue collection, including 10 separate tissues from Duroc boars and 10...... short read alignment software we mapped the reads to the genome assembly 10. We extracted contig sequences of gene transcripts using the Cufflinks software. Based on this information we identified expressed genes that are present in the genome assembly. The portion of these genes being previously known...... was roughly estimated by sequence comparison to known genes. Similarly, we searched for genes that are expressed in the tissues but not present in the genome assembly by aligning the non-genome-mapped reads to known gene transcripts. For the genes predicted to have alternative transcript variants by Cufflinks...

  17. Improved method for drawing of a glycan map, and the first page of glycan atlas, which is a compilation of glycan maps for a whole organism.

    Directory of Open Access Journals (Sweden)

    Shunji Natsuka

    Glycan Atlas is a set of glycan maps over the whole body of an organism. The glycan map, which includes data on glycan structure and quantity, displays the micro-heterogeneity of the glycans in a tissue, an organ, or cells. Two-dimensional glycan mapping is widely used for structure analysis of N-linked oligosaccharides on glycoproteins. In this study we developed a comprehensive method for the mapping of both N- and O-glycans with and without sialic acid. The mapping data of 150 standard pyridylaminated glycans were collected. The empirical additivity rule proposed in former reports could be adapted to this extended glycan map. The adapted rule is that the elution time of pyridylamino glycans on high performance liquid chromatography (HPLC) is expected to be the simple sum of the partial elution times assigned to each monosaccharide residue. The comprehensive mapping method developed in this study is a powerful tool for describing the micro-heterogeneity of glycans. Furthermore, we prepared 42 pyridylamino (PA-) glycans from human serum and were able to draw the map of human serum N- and O-glycans as an initial step of Glycan Atlas editing.
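    The additivity rule lends itself to a one-line predictor. The partial elution times below are invented placeholders, not the paper's calibrated unit contributions:

```python
# Predicted HPLC elution time of a pyridylaminated glycan = simple sum of
# partial elution times assigned to its monosaccharide residues.
partial_elution = {"GlcNAc": 1.8, "Man": 1.1, "Gal": 1.3, "Fuc": 0.9, "NeuAc": 2.4}

def predicted_elution(composition):
    """composition: residue -> count, e.g. a biantennary complex-type glycan."""
    return sum(partial_elution[res] * count for res, count in composition.items())

glycan = {"GlcNAc": 4, "Man": 3, "Gal": 2}
print(round(predicted_elution(glycan), 2))  # 4*1.8 + 3*1.1 + 2*1.3 = 13.1
```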

  18. Molecular imaging of cannabis leaf tissue with MeV-SIMS method

    Science.gov (United States)

    Jenčič, Boštjan; Jeromel, Luka; Ogrinc Potočnik, Nina; Vogel-Mikuš, Katarina; Kovačec, Eva; Regvar, Marjana; Siketić, Zdravko; Vavpetič, Primož; Rupnik, Zdravko; Bučar, Klemen; Kelemen, Mitja; Kovač, Janez; Pelicon, Primož

    2016-03-01

    To broaden our analytical capabilities with molecular imaging in addition to the existing elemental imaging with micro-PIXE, a linear Time-Of-Flight mass spectrometer for MeV Secondary Ion Mass Spectrometry (MeV-SIMS) was constructed and added to the existing nuclear microprobe at the Jožef Stefan Institute. We measured absolute molecular yields and damage cross-section of reference materials, without significant alteration of the fragile biological samples during the duration of measurements in the mapping mode. We explored the analytical capability of the MeV-SIMS technique for chemical mapping of the plant tissue of medicinal cannabis leaves. A series of hand-cut plant tissue slices were prepared by standard shock-freezing and freeze-drying protocol and deposited on the Si wafer. We show the measured MeV-SIMS spectra showing a series of peaks in the mass area of cannabinoids, as well as their corresponding maps. The indicated molecular distributions at masses of 345.5 u and 359.4 u may be attributed to the protonated THCA and THCA-C4 acids, and show enhancement in the areas with opened trichome morphology.

  19. Molecular imaging of cannabis leaf tissue with MeV-SIMS method

    Energy Technology Data Exchange (ETDEWEB)

    Jenčič, Boštjan, E-mail: bostjan.jencic@ijs.si [Jožef Stefan Institute, Jamova 39, SI-1000 Ljubljana (Slovenia); Jeromel, Luka [Jožef Stefan Institute, Jamova 39, SI-1000 Ljubljana (Slovenia); Ogrinc Potočnik, Nina [Jožef Stefan Institute, Jamova 39, SI-1000 Ljubljana (Slovenia); M4I, Maastricht University, Peter Debijelaan 25A, 6229 HX Maastricht (Netherlands); Vogel-Mikuš, Katarina [Jožef Stefan Institute, Jamova 39, SI-1000 Ljubljana (Slovenia); University of Ljubljana, Biotechnical Faculty, Dept. of Biology, Večna pot 11, SI-1000 Ljubljana (Slovenia); Kovačec, Eva; Regvar, Marjana [University of Ljubljana, Biotechnical Faculty, Dept. of Biology, Večna pot 11, SI-1000 Ljubljana (Slovenia); Siketić, Zdravko [Ruđer Bošković Institute, P.O. Box 180, 10000 Zagreb (Croatia); Vavpetič, Primož; Rupnik, Zdravko; Bučar, Klemen; Kelemen, Mitja; Kovač, Janez; Pelicon, Primož [Jožef Stefan Institute, Jamova 39, SI-1000 Ljubljana (Slovenia)

    2016-03-15

    To broaden our analytical capabilities with molecular imaging in addition to the existing elemental imaging with micro-PIXE, a linear Time-Of-Flight mass spectrometer for MeV Secondary Ion Mass Spectrometry (MeV-SIMS) was constructed and added to the existing nuclear microprobe at the Jožef Stefan Institute. We measured absolute molecular yields and damage cross-section of reference materials, without significant alteration of the fragile biological samples during the duration of measurements in the mapping mode. We explored the analytical capability of the MeV-SIMS technique for chemical mapping of the plant tissue of medicinal cannabis leaves. A series of hand-cut plant tissue slices were prepared by standard shock-freezing and freeze-drying protocol and deposited on the Si wafer. We show the measured MeV-SIMS spectra showing a series of peaks in the mass area of cannabinoids, as well as their corresponding maps. The indicated molecular distributions at masses of 345.5 u and 359.4 u may be attributed to the protonated THCA and THCA-C4 acids, and show enhancement in the areas with opened trichome morphology.

  20. Molecular imaging of cannabis leaf tissue with MeV-SIMS method

    International Nuclear Information System (INIS)

    Jenčič, Boštjan; Jeromel, Luka; Ogrinc Potočnik, Nina; Vogel-Mikuš, Katarina; Kovačec, Eva; Regvar, Marjana; Siketić, Zdravko; Vavpetič, Primož; Rupnik, Zdravko; Bučar, Klemen; Kelemen, Mitja; Kovač, Janez; Pelicon, Primož

    2016-01-01

    To broaden our analytical capabilities with molecular imaging in addition to the existing elemental imaging with micro-PIXE, a linear Time-Of-Flight mass spectrometer for MeV Secondary Ion Mass Spectrometry (MeV-SIMS) was constructed and added to the existing nuclear microprobe at the Jožef Stefan Institute. We measured absolute molecular yields and damage cross-section of reference materials, without significant alteration of the fragile biological samples during the duration of measurements in the mapping mode. We explored the analytical capability of the MeV-SIMS technique for chemical mapping of the plant tissue of medicinal cannabis leaves. A series of hand-cut plant tissue slices were prepared by standard shock-freezing and freeze-drying protocol and deposited on the Si wafer. We show the measured MeV-SIMS spectra showing a series of peaks in the mass area of cannabinoids, as well as their corresponding maps. The indicated molecular distributions at masses of 345.5 u and 359.4 u may be attributed to the protonated THCA and THCA-C4 acids, and show enhancement in the areas with opened trichome morphology.

  1. Applications of condensed matter understanding to medical tissues and disease progression: Elemental analysis and structural integrity of tissue scaffolds

    Energy Technology Data Exchange (ETDEWEB)

    Bradley, D.A., E-mail: d.a.bradley@surrey.ac.u [Centre for Nuclear and Radiation Physics, Department of Physics, University of Surrey, Guildford GU2 7XH (United Kingdom); Farquharson, M.J. [Department of Radiography, School of Community and Health Sciences, City University, London (United Kingdom); Gundogdu, O. [Centre for Nuclear and Radiation Physics, Department of Physics, University of Surrey, Guildford GU2 7XH (United Kingdom); Al-Ebraheem, Alia [Department of Radiography, School of Community and Health Sciences, City University, London (United Kingdom); Che Ismail, Elna [Centre for Nuclear and Radiation Physics, Department of Physics, University of Surrey, Guildford GU2 7XH (United Kingdom); Kaabar, W., E-mail: w.kaabar@surrey.ac.u [Centre for Nuclear and Radiation Physics, Department of Physics, University of Surrey, Guildford GU2 7XH (United Kingdom); Bunk, O. [Paul Scherrer Institute, CH-5232 Villigen (Switzerland); Pfeiffer, F. [Paul Scherrer Institute, CH-5232 Villigen (Switzerland); Ecole Polytechnique Federale de Lausanne, CH-1015 Lausanne (Switzerland); Falkenberg, G. [Hamburger Synchrotronstrahlungslabor HASYLAB at Deutsches Elektronensynchrotron DESY, Notkestr. 85, D-22603 Hamburg (Germany); Bailey, M. [Surrey Ion Beam Centre, Advanced Technology Institute, University of Surrey, Guildford GU2 7XH (United Kingdom)

    2010-02-15

    The investigations reported herein link tissue structure and elemental presence with issues of environmental health and disease, exemplified by uptake and storage of potentially toxic elements in the body, the osteoarthritic condition and malignancy in the breast and other soft tissues. Focus is placed on application of state-of-the-art ionizing radiation techniques, including, micro-synchrotron X-ray fluorescence (mu-SXRF) and particle-induced X-ray emission/Rutherford backscattering mapping (mu-PIXE/RBS), coherent small-angle X-ray scattering (cSAXS) and X-ray phase-contrast imaging, providing information on elemental make-up, the large-scale organisation of collagen and anatomical features of moderate and low atomic number media. For the particular situations under investigation, use of such facilities is allowing information to be obtained at an unprecedented level of detail, yielding new understanding of the affected tissues and the progression of disease.

  2. Applications of condensed matter understanding to medical tissues and disease progression: Elemental analysis and structural integrity of tissue scaffolds

    International Nuclear Information System (INIS)

    Bradley, D.A.; Farquharson, M.J.; Gundogdu, O.; Al-Ebraheem, Alia; Che Ismail, Elna; Kaabar, W.; Bunk, O.; Pfeiffer, F.; Falkenberg, G.; Bailey, M.

    2010-01-01

    The investigations reported herein link tissue structure and elemental presence with issues of environmental health and disease, exemplified by uptake and storage of potentially toxic elements in the body, the osteoarthritic condition and malignancy in the breast and other soft tissues. Focus is placed on application of state-of-the-art ionizing radiation techniques, including, micro-synchrotron X-ray fluorescence (μ-SXRF) and particle-induced X-ray emission/Rutherford backscattering mapping (μ-PIXE/RBS), coherent small-angle X-ray scattering (cSAXS) and X-ray phase-contrast imaging, providing information on elemental make-up, the large-scale organisation of collagen and anatomical features of moderate and low atomic number media. For the particular situations under investigation, use of such facilities is allowing information to be obtained at an unprecedented level of detail, yielding new understanding of the affected tissues and the progression of disease.

  3. Neutron activation analysis of trace elements in biological tissue

    Energy Technology Data Exchange (ETDEWEB)

    Velandia, J A; Perkons, A K

    1974-01-01

    Thermal neutron activation analysis with instrumental Ge(Li) gamma spectrometry was used to determine the amounts of more than 30 trace constituents in heart tissue of rats and kidney tissue of rabbits. The results were confirmed by a rapid ion-exchange group separation method in the initial stages of the experiments. The samples were exposed to thermal neutrons for periods between 3 minutes and 14 hours. Significant differences in the amounts and types of trace elements in the two different tissue types are apparent; these differences, however, are probably due to specific diets. Tables of relevant nuclear data, standard concentrations, radiochemical separation recoveries, and quantitative analytical results are presented. The ion-exchange group separation scheme and typical examples of the instrumental gamma-ray spectra are shown. The techniques developed in this study are being used for a large-scale constituent survey of various diseased and healthy human tissues.

  4. A first course in probability

    CERN Document Server

    Ross, Sheldon

    2014-01-01

    A First Course in Probability, Ninth Edition, features clear and intuitive explanations of the mathematics of probability theory, outstanding problem sets, and a variety of diverse examples and applications. This book is ideal for an upper-level undergraduate or graduate level introduction to probability for math, science, engineering and business students. It assumes a background in elementary calculus.

  5. Constellation and Mapping Optimization of APSK Modulations used in DVB-S2

    Directory of Open Access Journals (Sweden)

    L. Jordanova

    2014-10-01

    This article presents algorithms for APSK constellation and mapping optimization. The dependencies of the symbol error probability Ps on the parameters of the 16APSK and 32APSK constellations are examined, and several options that satisfy the requirements for the minimum value of Ps are selected. Mapping optimization is carried out for the selected APSK constellations. BER characteristics of satellite DVB-S2 channels are presented for optimized and standard 16APSK and 32APSK constellations, and a comparative analysis of the results achieved is made.

  6. Mapping risk of cadmium and lead contamination to human health in soils of Central Iran

    International Nuclear Information System (INIS)

    Amini, M.; Afyuni, M.; Khademi, H.; Abbaspour, K.C.; Schulin, R.

    2005-01-01

    In order to map Cd and Pb contamination in the soils of the region of Isfahan, Central Iran, we performed indicator kriging on a set of 255 topsoil samples (0-20 cm) gathered irregularly from an area of 6800 km². The measured Cd concentrations exceeded the Swiss guide value in more than 80% of the samples, whereas Pb concentrations exceeded the respective guide value in only 2% of the samples. Based on the simulated conditional distribution functions, the probability that the Cd and Pb concentrations exceed their specific thresholds was computed. The results indicated that in most parts of the region the probability of contamination by Cd is very large (>0.95) whereas it is small (<0.5) for Pb. Based on a misclassification analysis, we chose a probability of 0.45 as the optimum threshold to delineate polluted from unpolluted areas for Cd. In addition, we performed a loss analysis to separate risks to human health from potential losses due to remediation costs. Based on this analysis, a probability threshold of 0.8 was found to be optimal for the classification of polluted and unpolluted areas in the case of Cd. Health risks were found to be larger in the western parts of the region. Misclassification analysis was sufficient for risk mapping for Pb, as its concentration did not reach risk levels for human health. A probability of 0.7 for Pb was found to be the optimum threshold for the delineation of polluted and unpolluted lands.
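    The final delineation step above can be sketched directly: given per-cell probabilities of exceeding the guide value (stand-in values below, in place of the indicator-kriging output), cells are classified with the optimized thresholds reported in the abstract (0.8 for Cd, 0.7 for Pb):

```python
def classify(p_exceed, threshold):
    # a cell is flagged as polluted when its exceedance probability
    # reaches the loss-analysis-optimized threshold
    return ["polluted" if p >= threshold else "unpolluted" for p in p_exceed]

p_exceed_cd = [0.97, 0.85, 0.62, 0.41]   # illustrative kriged probabilities
print(classify(p_exceed_cd, threshold=0.8))  # first two cells flagged
```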

  7. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    2014-01-01

    either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake 'calibrating adjustments' to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments...... that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still...

  8. Heterogeneity mapping of protein expression in tumors using quantitative immunofluorescence.

    Science.gov (United States)

    Faratian, Dana; Christiansen, Jason; Gustavson, Mark; Jones, Christine; Scott, Christopher; Um, InHwa; Harrison, David J

    2011-10-25

    Morphologic heterogeneity within an individual tumor is well-recognized by histopathologists in surgical practice. While this often takes the form of areas of distinct differentiation into recognized histological subtypes, or different pathological grade, often there are more subtle differences in phenotype which defy accurate classification (Figure 1). Ultimately, since morphology is dictated by the underlying molecular phenotype, areas with visible differences are likely to be accompanied by differences in the expression of proteins which orchestrate cellular function and behavior, and therefore, appearance. The significance of visible and invisible (molecular) heterogeneity for prognosis is unknown, but recent evidence suggests that, at least at the genetic level, heterogeneity exists in the primary tumor(1,2), and some of these sub-clones give rise to metastatic (and therefore lethal) disease. Moreover, some proteins are measured as biomarkers because they are the targets of therapy (for instance ER and HER2 for tamoxifen and trastuzumab (Herceptin), respectively). If these proteins show variable expression within a tumor then therapeutic responses may also be variable. The widely used histopathologic scoring schemes for immunohistochemistry either ignore, or numerically homogenize, the quantification of protein expression. Similarly, in destructive techniques, where the tumor samples are homogenized (such as gene expression profiling), quantitative information can be elucidated, but spatial information is lost. Genetic heterogeneity mapping approaches in pancreatic cancer have relied either on generation of a single cell suspension(3), or on macrodissection(4). A recent study has used quantum dots in order to map morphologic and molecular heterogeneity in prostate cancer tissue(5), providing proof of principle that morphology and molecular mapping is feasible, but falling short of quantifying the heterogeneity. Since immunohistochemistry is, at best, only semi

  9. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985 the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability, across psychological aspects of formulating subjective probability statements and abstract measure-theoretical considerations, to contributions to theoretical statistics an...

  10. Constructing Binomial Trees Via Random Maps for Analysis of Financial Assets

    Directory of Open Access Journals (Sweden)

    Antonio Airton Carneiro de Freitas

    2010-04-01

    Random maps can be constructed from a priori knowledge of the financial assets. We also address the reverse problem: from a function of an empirical stationary probability density function we set up a random map that naturally leads to an implied binomial tree, allowing the adjustment of models, including the ability to incorporate jumps. An application related to the options market is presented. It is emphasized that the model's ability to incorporate a priori knowledge of the financial asset may be affected, for example, by the skewed vision of the analyst.
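    The implied-tree construction builds on the standard binomial pricing recursion. A minimal Cox-Ross-Rubinstein sketch in its simplest constant-volatility form (illustrative parameters, not the article's calibrated map):

```python
import math

def crr_call(S0, K, r, sigma, T, n):
    """European call price on an n-step CRR binomial tree."""
    dt = T / n
    u = math.exp(sigma * math.sqrt(dt))
    d = 1.0 / u
    q = (math.exp(r * dt) - d) / (u - d)     # risk-neutral up probability
    price = 0.0
    for k in range(n + 1):
        prob = math.comb(n, k) * q**k * (1 - q)**(n - k)
        payoff = max(S0 * u**k * d**(n - k) - K, 0.0)
        price += prob * payoff
    return math.exp(-r * T) * price

# converges toward the Black-Scholes value (~10.45) as n grows
print(round(crr_call(100, 100, 0.05, 0.2, 1.0, 200), 2))
```

An implied tree generalizes this by letting the up/down moves or their probabilities vary with node so the tree reproduces an empirical density, which is where the random-map construction enters.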

  11. Market-implied risk-neutral probabilities, actual probabilities, credit risk and news

    Directory of Open Access Journals (Sweden)

    Shashidhar Murthy

    2011-09-01

    Motivated by the credit crisis, this paper investigates links between risk-neutral probabilities of default implied by markets (e.g. from yield spreads) and their actual counterparts (e.g. from ratings). It discusses differences between the two and clarifies the underlying economic intuition using simple representations of credit risk pricing. Observed large differences across bonds in the ratio of the two probabilities are shown to imply that apparently safer securities can be more sensitive to news.
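    The market-implied side of this comparison can be sketched with the standard one-period approximation spread ≈ p · (1 − recovery). The figures below are illustrative, not data from the article:

```python
def implied_default_prob(spread, recovery):
    # risk-neutral default probability implied by a bond's yield spread
    # under the one-period approximation spread ≈ p * (1 - recovery)
    return spread / (1.0 - recovery)

# a 300 bp spread with 40% recovery implies a ~5% risk-neutral default probability
print(implied_default_prob(0.03, 0.40))
```

Dividing this market-implied probability by a ratings-based (actual) estimate gives the ratio whose variation across bonds the abstract discusses.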

  12. Map-based cloning and expression analysis of BMR-6 in sorghum

    Indian Academy of Sciences (India)

    CAD), using a map-based cloning approach. Genetic complementation confirmed that CAD is responsible for the BMR-6 phenotype. BMR-6 gene was expressed in all tested sorghum tissues, with the highest being in midrib and stem. Transient ...

  13. Conditional Probabilities in the Excursion Set Theory. Generic Barriers and non-Gaussian Initial Conditions

    CERN Document Server

    De Simone, Andrea; Riotto, Antonio

    2011-01-01

    The excursion set theory, where density perturbations evolve stochastically with the smoothing scale, provides a method for computing the dark matter halo mass function. The computation of the mass function is mapped into the so-called first-passage time problem in the presence of a moving barrier. The excursion set theory is also a powerful formalism to study other properties of dark matter halos such as halo bias, accretion rate, formation time, merging rate and the formation history of halos. This is achieved by computing conditional probabilities with non-trivial initial conditions, and the conditional two-barrier first-crossing rate. In this paper we use the recently-developed path integral formulation of the excursion set theory to calculate analytically these conditional probabilities in the presence of a generic moving barrier, including the one describing the ellipsoidal collapse, and for both Gaussian and non-Gaussian initial conditions. The non-Markovianity of the random walks induced by non-Gaussi...

  14. Normal tissue complication probability modeling for cochlea constraints to avoid causing tinnitus after head-and-neck intensity-modulated radiation therapy

    International Nuclear Information System (INIS)

    Lee, Tsair-Fwu; Yeh, Shyh-An; Chao, Pei-Ju; Chang, Liyun; Chiu, Chien-Liang; Ting, Hui-Min; Wang, Hung-Yu; Huang, Yu-Jie

    2015-01-01

    Radiation-induced tinnitus is a side effect of radiotherapy in the inner ear for cancers of the head and neck. Effective dose constraints for protecting the cochlea are under-reported. The aim of this study is to determine the cochlea dose limitation to avoid causing tinnitus after head-and-neck cancer (HNC) intensity-modulated radiation therapy (IMRT). In total 211 patients with HNC were included; the side effects of radiotherapy were investigated for 422 inner ears in the cohort. Forty-nine of the four hundred and twenty-two samples (11.6%) developed grade 2+ tinnitus symptoms after IMRT, as diagnosed by a clinician. The Late Effects of Normal Tissues–Subjective, Objective, Management, Analytic (LENT-SOMA) criteria were used for tinnitus evaluation. The logistic and Lyman-Kutcher-Burman (LKB) normal tissue complication probability (NTCP) models were used for the analyses. The NTCP-fitted parameters were TD50 = 46.31 Gy (95% CI, 41.46-52.50) and γ50 = 1.27 (95% CI, 1.02-1.55) for the logistic model, and TD50 = 46.52 Gy (95% CI, 41.91-53.43) and m = 0.35 (95% CI, 0.30-0.42) for the LKB model. The suggested guideline TD20, the tolerance dose producing a 20% complication rate within a specific period of time, was TD20 = 33.62 Gy (95% CI, 30.15-38.27) for the logistic model and TD20 = 32.82 Gy (95% CI, 29.58-37.69) for the LKB model. To maintain the incidence of grade 2+ tinnitus toxicity below 20% in IMRT, we suggest that the mean dose to the cochlea should be kept below 32 Gy. However, models should not be extrapolated to other patient populations without further verification and should first be confirmed before clinical implementation.
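    The two fitted dose-response curves can be written out with the parameter values reported above (logistic: TD50 = 46.31 Gy, γ50 = 1.27; LKB: TD50 = 46.52 Gy, m = 0.35). The functional forms below are the standard ones and may differ in detail from the authors' exact parameterization:

```python
import math

def ntcp_logistic(dose, td50=46.31, gamma50=1.27):
    # logistic NTCP: 50% at TD50, slope set by gamma50
    return 1.0 / (1.0 + math.exp(4.0 * gamma50 * (1.0 - dose / td50)))

def ntcp_lkb(dose, td50=46.52, m=0.35):
    # Lyman-Kutcher-Burman NTCP: probit curve in t = (D - TD50) / (m * TD50)
    t = (dose - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))  # standard normal CDF

# both curves put roughly 20% complication probability at the reported TD20s
print(round(ntcp_logistic(33.62), 3), round(ntcp_lkb(32.82), 3))
```

Evaluating either curve at a mean cochlear dose of 32 Gy gives a complication probability just under 20%, consistent with the recommended constraint.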

  15. Assessment of tissue viability by polarization spectroscopy

    Science.gov (United States)

    Nilsson, G.; Anderson, C.; Henricson, J.; Leahy, M.; O'Doherty, J.; Sjöberg, F.

    2008-09-01

    A new and versatile method for tissue viability imaging, based on polarization spectroscopy of blood in superficial tissue structures such as the skin, is presented in this paper. Linearly polarized light in the visible wavelength region is partly reflected directly by the skin surface and partly diffusely backscattered from the dermal tissue matrix. Most of the directly reflected light preserves its polarization state, while the light returning from the deeper tissue layers is depolarized. By the use of a polarization filter positioned in front of a sensitive CCD array, the light directly reflected from the tissue surface is blocked, while the depolarized light returning from the deeper tissue layers reaches the detector array. By separating the colour planes of the detected image, spectroscopic information about the amount of red blood cells (RBCs) in the microvascular network of the tissue under investigation can be derived. A theory that utilizes the differences in light absorption of RBCs and bloodless tissue in the red and green wavelength regions forms the basis of an algorithm for displaying a colour-coded map of the RBC distribution in a tissue. Using a fluid model, a linear relationship (correlation coefficient 0.99) between RBC concentration and the output signal was demonstrated within the physiological range 0-4%. In-vivo evaluation using transepidermal application of acetylcholine by way of iontophoresis displayed the heterogeneity pattern of the vasodilatation produced by the vasoactive agent. Applications of this novel technology are likely to be found in drug and skin care product development, in the assessment of skin irritation and tissue repair processes, and ultimately even in clinical case situations.
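    The colour-plane idea can be illustrated with a toy per-pixel index: haemoglobin absorbs green light much more strongly than red, so a normalized red-green contrast tracks RBC content. The reflectance values below are invented; the instrument's actual calibration is not shown:

```python
def rbc_index(red, green):
    # more RBCs -> stronger green absorption -> lower green reflectance
    return (red - green) / (red + green)

pixels = [(200, 180), (200, 140), (200, 100)]  # (red, green) per pixel
print([round(rbc_index(r, g), 3) for r, g in pixels])  # rises with RBC content
```

Applying such an index pixel by pixel, after the polarization filter has removed the surface reflection, yields the colour-coded RBC distribution map described above.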

  16. Genetic Map of Mango: A Tool for Mango Breeding

    Directory of Open Access Journals (Sweden)

    David N. Kuhn

    2017-04-01

    Mango (Mangifera indica) is an economically and nutritionally important tropical/subtropical tree fruit crop. Most of the current commercial cultivars are selections rather than the products of breeding programs. To improve the efficiency of mango breeding, molecular markers have been used to create a consensus genetic map that identifies all 20 linkage groups in seven mapping populations. Polyembryony is an important mango trait, used for clonal propagation of cultivars and rootstocks. In polyembryonic mango cultivars, in addition to a zygotic embryo, several apomictic embryos develop from maternal tissue surrounding the fertilized egg cell. This trait has been associated with linkage group 8 in our consensus genetic map and has been validated in two of the seven mapping populations. In addition, we have observed a significant association between trait and single nucleotide polymorphism (SNP) markers for the vegetative trait of branch habit and the fruit traits of bloom, ground skin color, blush intensity, beak shape, and pulp color.

  17. New methods to enhance cerebral flow maps made by the stable xenon/CT technique

    International Nuclear Information System (INIS)

    Wist, A.O.; Fatouros, P.P.; Kishore, P.R.S.; Weiss, J.; Cothran, S.J.

    1987-01-01

    The authors developed several new techniques to extract the important information from the high-resolution flow maps as they are generated by their improved stable Xe/CT technique. First, they adapted a new morphologic filtering technique to separate white, white/gray, and gray matter. Second, they generated iso-flow lines using the same filtering technique for easier reading of the values in the flow map. Third, by combining the information in both maps, the authors constructed a new map which shows the areas of high, normal, and low blood flow for the whole brain. When combined with anatomic information, this map can indicate probable pathologic areas. Fourth, they were able to reduce the calculation time of the flow by almost a factor of 10 by developing a new, faster algorithm for calculating the flow.

  18. Spatial probability aids visual stimulus discrimination

    Directory of Open Access Journals (Sweden)

    Michael Druker

    2010-08-01

    Full Text Available We investigated whether the statistical predictability of a target's location would influence how quickly and accurately it was classified. Recent results have suggested that spatial probability can be a cue for the allocation of attention in visual search. One explanation for probability cuing is spatial repetition priming. In our two experiments we used probability distributions that were continuous across the display rather than relying on a few arbitrary screen locations. This produced fewer spatial repeats and allowed us to dissociate the effect of a high probability location from that of short-term spatial repetition. The task required participants to quickly judge the color of a single dot presented on a computer screen. In Experiment 1, targets were more probable in an off-center hotspot of high probability that gradually declined to a background rate. Targets garnered faster responses if they were near earlier target locations (priming) and if they were near the high probability hotspot (probability cuing). In Experiment 2, target locations were chosen on three concentric circles around fixation. One circle contained 80% of targets. The value of this ring distribution is that it allowed for a spatially restricted high probability zone in which sequentially repeated trials were not likely to be physically close. Participant performance was sensitive to the high-probability circle in addition to the expected effects of eccentricity and the distance to recent targets. These two experiments suggest that inhomogeneities in spatial probability can be learned and used by participants on-line and without prompting as an aid for visual stimulus discrimination and that spatial repetition priming is not a sufficient explanation for this effect. Future models of attention should consider explicitly incorporating the probabilities of target locations and features.

  19. A probability space for quantum models

    Science.gov (United States)

    Lemmens, L. F.

    2017-06-01

    A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes and probabilities defined for all events. A reformulation in terms of propositions allows one to use the maximum entropy method to assign the probabilities, taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, choosing the constraints and making the probability assignment by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.
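The maximum-entropy assignment step described above can be sketched on a toy three-outcome space with a single mean constraint (the outcomes and the constraint value are our illustration, not the paper's):

```python
import numpy as np
from scipy.optimize import minimize

def maxent(values, mean):
    """Maximum-entropy pmf on `values` subject to a fixed mean.

    Minimizes sum p*log(p) (i.e. maximizes entropy) under the
    normalization and mean constraints, via SLSQP.
    """
    n = len(values)
    neg_entropy = lambda p: np.sum(p * np.log(p + 1e-12))
    cons = ({'type': 'eq', 'fun': lambda p: p.sum() - 1.0},
            {'type': 'eq', 'fun': lambda p: p @ np.asarray(values) - mean})
    res = minimize(neg_entropy, np.full(n, 1.0 / n), method='SLSQP',
                   bounds=[(1e-9, 1.0)] * n, constraints=cons)
    return res.x

# Outcomes 0, 1, 2 with prescribed mean 0.5: the maximum-entropy
# solution is exponential in the outcome, p_i proportional to exp(-lam*i).
p = maxent([0, 1, 2], 0.5)
```

The exponential form of the result is exactly how the Maxwell-Boltzmann-type distributions mentioned in the abstract arise from mean-value constraints.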

  20. Tissue banking for management of nuclear casualties

    International Nuclear Information System (INIS)

    Singh, Rita

    2014-01-01

    The proliferation of nuclear material and technology has made their acquisition and adversarial use more probable than ever. Devastating medical consequences would follow a nuclear detonation due to the thermal, blast and radiation effects of the weapon. The atomic explosions at Hiroshima and Nagasaki demonstrated the human agony on a vast scale. A full range of medical modalities is required to decrease the morbidity and mortality resulting from the use of nuclear weapons. Biological tissues from human donors, like bone, skin, amniotic membrane and other soft tissues, can be used for repair or reconstruction of the injured part of the body. Tissues from human donors can be processed and banked for orthopaedic, spinal, trauma and other surgical procedures. Processed tissues can be provided by tissue banks and can be of great assistance in the treatment of injuries due to nuclear weapons. The use of allograft tissue avoids donor site morbidity and reduces the operating time, expense and trauma associated with the acquisition of autografts. Further, allografts have the added advantage of being available in large quantities. This has led to a global increase in allogeneic transplantation and the development of tissue banking. The aim of a tissue bank is to provide a wide range of processed biological tissues, free from any transmissible disease, that help to restore the growth and function of damaged tissues. Skin dressings or skin substitutes like allograft skin, xenograft skin and amniotic membrane can be used for the treatment of thermal burns and radiation-induced skin injuries. Bone allografts can be used for reconstructive approaches to the skeletal system. Tissue banking would thus help ensure health care for military personnel and the population following a nuclear detonation. (author)

  1. Effects of attenuation map accuracy on attenuation-corrected micro-SPECT images

    NARCIS (Netherlands)

    Wu, C.; Gratama van Andel, H.A.; Laverman, P.; Boerman, O.C.; Beekman, F.J.

    2013-01-01

    Background In single-photon emission computed tomography (SPECT), attenuation of photon flux in tissue affects quantitative accuracy of reconstructed images. Attenuation maps derived from X-ray computed tomography (CT) can be employed for attenuation correction. The attenuation coefficients as well

  2. Discrimination of mixed quantum states. Reversible maps and unambiguous strategies

    Energy Technology Data Exchange (ETDEWEB)

    Kleinmann, Matthias

    2008-06-30

    The discrimination of two mixed quantum states is a fundamental task in quantum state estimation and quantum information theory. In quantum state discrimination a quantum system is assumed to be in one of two possible - in general mixed - non-orthogonal quantum states. The discrimination then consists of a measurement strategy that allows one to decide in which state the system was before the measurement. In unambiguous state discrimination the aim is to make this decision without errors, but it is allowed to give an inconclusive answer. Especially interesting are measurement strategies that minimize the probability of an inconclusive answer. A starting point for the analysis of this optimization problem was a result by Eldar et al. [Phys. Rev. A 69, 062318 (2004)], which provides non-operational necessary and sufficient conditions for a given measurement strategy to be optimal. These conditions are reconsidered and simplified in such a way that they become operational. The simplified conditions are the basis for further central results: It is shown that the optimal measurement strategy is unique, a statement that is of importance, e.g., for the complexity analysis of optimal measurement devices. The optimal measurement strategy is derived for the case where one of the possible input states has at most rank two, which was an open problem for many years. Furthermore, using the optimality criterion it is shown that there always exists a threshold probability for each state, such that below this probability it is optimal to exclude this state from the discrimination strategy. If the two states subject to discrimination can be brought to a diagonal structure with (2 x 2)-dimensional blocks, then the unambiguous discrimination of these states can be reduced to the unambiguous discrimination of pure states. A criterion is presented that allows one to identify the presence of such a structure for two self-adjoint operators. This criterion consists of the evaluation of three

  3. Phytoplankton global mapping from space with a support vector machine algorithm

    Science.gov (United States)

    de Boissieu, Florian; Menkes, Christophe; Dupouy, Cécile; Rodier, Martin; Bonnet, Sophie; Mangeas, Morgan; Frouin, Robert J.

    2014-11-01

    In recent years great progress has been made in global mapping of phytoplankton from space. Two main trends have emerged, the recognition of phytoplankton functional types (PFT) based on reflectance normalized to chlorophyll-a concentration, and the recognition of phytoplankton size class (PSC) based on the relationship between cell size and chlorophyll-a concentration. However, PFTs and PSCs are not decorrelated, and one approach can complement the other in a recognition task. In this paper, we explore the recognition of several dominant PFTs by combining reflectance anomalies, chlorophyll-a concentration and other environmental parameters, such as sea surface temperature and wind speed. Remote sensing pixels are labeled thanks to coincident in-situ pigment data from GeP&CO, NOMAD and MAREDAT datasets, covering various oceanographic environments. The recognition is made with a supervised Support Vector Machine classifier trained on the labeled pixels. This algorithm enables a non-linear separation of the classes in the input space and is especially adapted for small training datasets as available here. Moreover, it provides a class probability estimate, allowing one to enhance the robustness of the classification results through the choice of a minimum probability threshold. A greedy feature selection associated to a 10-fold cross-validation procedure is applied to select the most discriminative input features and evaluate the classification performance. The best classifiers are finally applied on daily remote sensing datasets (SeaWIFS, MODISA) and the resulting dominant PFT maps are compared with other studies. 
Several conclusions are drawn: (1) the feature selection highlights the weight of temperature, chlorophyll-a and wind speed variables in phytoplankton recognition; (2) the classifiers show good results and dominant PFT maps in agreement with phytoplankton distribution knowledge; (3) classification on MODISA data seems to perform better than on SeaWIFS data.
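The classifier-plus-threshold scheme described above can be sketched with scikit-learn. Synthetic features stand in for the real inputs (reflectance anomalies, chlorophyll-a, SST, wind speed), and the 0.6 threshold is illustrative:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.datasets import make_classification

# Synthetic stand-in for the labeled remote-sensing pixels.
X, y = make_classification(n_samples=300, n_features=4, n_informative=3,
                           n_redundant=0, random_state=0)

# SVC with probability=True provides class-probability estimates
# (via Platt scaling), enabling a minimum-probability rejection rule.
clf = SVC(probability=True, random_state=0).fit(X, y)

proba = clf.predict_proba(X)
threshold = 0.6                       # illustrative minimum probability
confident = proba.max(axis=1) >= threshold
labels = np.where(confident, proba.argmax(axis=1), -1)  # -1 = unclassified
```

Pixels whose maximum class probability falls below the threshold are left unclassified, which is the robustness mechanism the abstract refers to.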

  4. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...

  5. Random fixed point equations and inverse problems using "collage method" for contraction mappings

    Science.gov (United States)

    Kunze, H. E.; La Torre, D.; Vrscay, E. R.

    2007-10-01

    In this paper we are interested in the direct and inverse problems for the following class of random fixed point equations: T(ω, x(ω)) = x(ω), where T : Ω × X → X is a given operator, Ω is a probability space and X is a Polish metric space. The inverse problem is solved by recourse to the collage theorem for contractive maps. We then consider two applications: (i) random integral equations, and (ii) random iterated function systems with greyscale maps (RIFSM), for which noise is added to the classical IFSM.
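A one-dimensional toy version of both problems (our construction, far simpler than the paper's random setting) uses the affine family T(x) = a·x + b with |a| < 1: the direct problem iterates to the fixed point, and the inverse problem picks the parameter that minimizes the collage distance |T(x̄) − x̄| at the observed point x̄:

```python
import numpy as np

def fixed_point(T, x0=0.0, n=200):
    """Banach iteration: for a contraction T, x_{n+1} = T(x_n) converges."""
    x = x0
    for _ in range(n):
        x = T(x)
    return x

# Direct problem: T(x) = a*x + b with |a| < 1 is a contraction on R,
# with fixed point b / (1 - a).
a_true, b_true = 0.5, 2.0
x_star = fixed_point(lambda x: a_true * x + b_true)

# Inverse problem via the collage idea: among candidate parameters,
# choose the one whose map moves the observed fixed point the least.
x_obs = 4.0
grid = np.linspace(-0.9, 0.9, 181)
collage = np.abs(grid * x_obs + b_true - x_obs)   # b assumed known here
a_est = grid[np.argmin(collage)]
```

The collage theorem guarantees that a small collage distance bounds the distance between the observed point and the true fixed point of the candidate map, which is why minimizing it solves the inverse problem.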

  6. Statistical characterization of discrete conservative systems: The web map

    Science.gov (United States)

    Ruiz, Guiomar; Tirnakli, Ugur; Borges, Ernesto P.; Tsallis, Constantino

    2017-10-01

    We numerically study the two-dimensional, area preserving, web map. When the map is governed by ergodic behavior, it is, as expected, correctly described by Boltzmann-Gibbs statistics, based on the additive entropic functional S_BG[p(x)] = -k ∫dx p(x) ln p(x). In contrast, possible ergodicity breakdown and transitory sticky dynamical behavior drag the map into the realm of generalized q statistics, based on the nonadditive entropic functional S_q[p(x)] = k (1 - ∫dx [p(x)]^q)/(q - 1) (q ∈ ℝ; S_1 = S_BG). We statistically describe the system (probability distribution of the sum of successive iterates, sensitivity to the initial condition, and entropy production per unit time) for typical values of the parameter that controls the ergodicity of the map. For small (large) values of the external parameter K, we observe q-Gaussian distributions with q = 1.935… (Gaussian distributions), like for the standard map. In contrast, for intermediate values of K, we observe a different scenario, due to the fractal structure of the trajectories embedded in the chaotic sea. Long-standing non-Gaussian distributions are characterized in terms of the kurtosis and the box-counting dimension of the chaotic sea.
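For concreteness, one common four-fold-symmetric form of the area-preserving web map is (u, v) → (v, −u − K sin v); the paper's exact parameterization may differ. The sketch below iterates it and computes the kind of ensemble statistic (sums of iterates, sample kurtosis) described above:

```python
import numpy as np

def web_map_orbit(u0, v0, K, n):
    """Iterate (u, v) -> (v, -u - K*sin(v)); the Jacobian determinant
    is 1, so the map is area preserving.  Returns the u-iterates."""
    u, v = u0, v0
    us = np.empty(n)
    for i in range(n):
        u, v = v, -u - K * np.sin(v)
        us[i] = u
    return us

rng = np.random.default_rng(0)
K, n_iter = 5.0, 1000      # illustrative parameter values
# Sums of iterates over an ensemble of random initial conditions;
# their distribution (Gaussian vs q-Gaussian) is the object of study.
sums = np.array([web_map_orbit(*rng.uniform(-np.pi, np.pi, 2), K, n_iter).sum()
                 for _ in range(500)])
kurt = np.mean((sums - sums.mean())**4) / np.var(sums)**2
```

For strongly chaotic K the distribution of such sums tends toward Gaussian (kurtosis near 3), while sticky regimes produce long-standing non-Gaussian shapes.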

  7. Methods for Reducing Normal Tissue Complication Probabilities in Oropharyngeal Cancer: Dose Reduction or Planning Target Volume Elimination

    Energy Technology Data Exchange (ETDEWEB)

    Samuels, Stuart E.; Eisbruch, Avraham; Vineberg, Karen; Lee, Jae; Lee, Choonik; Matuszak, Martha M.; Ten Haken, Randall K.; Brock, Kristy K., E-mail: kbrock@med.umich.edu

    2016-11-01

    Purpose: Strategies to reduce the toxicities of head and neck radiation (ie, dysphagia [difficulty swallowing] and xerostomia [dry mouth]) are currently underway. However, the predicted benefit of dose and planning target volume (PTV) reduction strategies is unknown. The purpose of the present study was to compare the normal tissue complication probabilities (NTCP) for swallowing and salivary structures in standard plans (70 Gy [P70]), dose-reduced plans (60 Gy [P60]), and plans eliminating the PTV margin. Methods and Materials: A total of 38 oropharyngeal cancer (OPC) plans were analyzed. Standard organ-sparing volumetric modulated arc therapy plans (P70) were created and then modified by eliminating the PTVs and treating the clinical tumor volumes (CTVs) only (C70) or maintaining the PTV but reducing the dose to 60 Gy (P60). NTCP dose models for the pharyngeal constrictors, glottis/supraglottic larynx, parotid glands (PGs), and submandibular glands (SMGs) were analyzed. The minimal clinically important benefit was defined as a mean change in NTCP of >5%. The P70 NTCP thresholds and overlap percentages of the organs at risk with the PTVs (56-59 Gy, vPTV56) were evaluated to identify the predictors for NTCP improvement. Results: With the P60 plans, only the ipsilateral PG (iPG) benefited (23.9% vs 16.2%; P<.01). With the C70 plans, only the iPG (23.9% vs 17.5%; P<.01) and contralateral SMG (cSMG) (NTCP 32.1% vs 22.9%; P<.01) benefited. An iPG NTCP threshold of 20% and 30% predicted NTCP benefits for the P60 and C70 plans, respectively (P<.001). A cSMG NTCP threshold of 30% predicted for an NTCP benefit with the C70 plans (P<.001). Furthermore, for the iPG, a vPTV56 >13% predicted benefit with P60 (P<.001) and C70 (P=.002). For the cSMG, a vPTV56 >22% predicted benefit with C70 (P<.01). Conclusions: PTV elimination and dose reduction lowered the NTCP of the iPG, and PTV elimination lowered the NTCP of the cSMG. NTCP thresholds and the
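As background for the NTCP figures quoted above, the widely used Lyman-Kutcher-Burman form can be evaluated in a few lines. The parameter values below are purely illustrative; the study fit its own dose-response models:

```python
from math import erf, sqrt

def lkb_ntcp(geud, td50, m):
    """Lyman-Kutcher-Burman NTCP: Phi((gEUD - TD50) / (m * TD50)),
    where Phi is the standard normal CDF.  Illustrative parameters only."""
    t = (geud - td50) / (m * td50)
    return 0.5 * (1.0 + erf(t / sqrt(2.0)))

# By construction NTCP = 0.5 when the generalized equivalent uniform
# dose (gEUD) equals TD50 (hypothetical values below).
p_at_td50 = lkb_ntcp(39.9, 39.9, 0.40)
```

Lowering the dose to an organ at risk lowers its gEUD and hence, monotonically, its NTCP, which is the mechanism behind the P60 and C70 benefits reported above.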

  8. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  9. Probability of obliteration and management risk following gamma knife surgery for cerebral AVM

    International Nuclear Information System (INIS)

    Karlsson, B.; Lax, I.

    1998-01-01

    In order to define the optimal treatment for an AVM patient, the probability of cure and the management risk following the treatment must be estimated before the treatment. Here, Gamma Knife surgery has an advantage over microsurgery and embolization with its reproducibility within the variability of the individual radiation sensitivity. Based on more than 2000 treatments, we have developed models to predict the probability for obliteration, the risk for radioinduced complications and the probability for a post treatment hemorrhage within the first two years following a Gamma Knife treatment. The factors determining the overall outcome are the absorbed dose in the target and the brain, the AVM volume and location and the age and clinical history of the patient. The probability for obliteration equals 35.69 × ln(Dmin) − 39.66 and is AVM volume independent. The risk for radioinduced complications relates to the average dose in the 20 cm³ of tissue receiving the most radiation, and it is also related to the clinical history of the patient and the AVM location. Finally, the risk for post treatment hemorrhage increases with the age of the patient, and is higher for larger AVM. It decreases with increasing amount of radiation given, and it is independent of the clinical history of the patient. For retreatments, the model for prediction of obliteration is valid, but the risk for radioinduced complications is higher and the risk for post treatment hemorrhage lower as compared to following the first treatment. (author)
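The fitted obliteration relation can be evaluated directly. In this sketch the clamp to the [0, 100]% range outside the fitted dose region is our addition, not part of the published model:

```python
from math import log

def obliteration_probability(d_min_gy):
    """Obliteration probability (%) from the fitted relation quoted in
    the text: P = 35.69 * ln(D_min) - 39.66, with D_min in Gy.
    Clamping to [0, 100] is our addition for out-of-range doses."""
    p = 35.69 * log(d_min_gy) - 39.66
    return max(0.0, min(100.0, p))

p20 = obliteration_probability(20.0)   # probability at D_min = 20 Gy
```

Because the relation is logarithmic in the minimum target dose, each additional Gy buys progressively less obliteration probability.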

  10. Probability of Failure in Random Vibration

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    1988-01-01

    Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out-crossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval and thus for the first-passage probability...
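The quantity being approximated can also be estimated by brute-force Monte Carlo. The sketch below simulates a lightly damped linear oscillator driven by white noise (Euler-Maruyama scheme; all parameter values are illustrative, and this is a reference baseline, not the paper's integral-equation method):

```python
import numpy as np

def first_passage_probability(barrier, t_max, n_paths=2000, dt=0.01, seed=0):
    """Monte Carlo first-passage probability for the SDOF oscillator
    x'' + 2*zeta*x' + x = sigma * white noise, starting at rest.
    Returns the fraction of paths that exceed `barrier` before t_max."""
    rng = np.random.default_rng(seed)
    zeta, sigma = 0.05, 0.5            # illustrative damping and noise level
    n_steps = int(t_max / dt)
    x = np.zeros(n_paths)
    v = np.zeros(n_paths)
    crossed = np.zeros(n_paths, dtype=bool)
    for _ in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt), n_paths)
        x = x + v * dt
        v = v + (-2.0 * zeta * v - x) * dt + sigma * dw
        crossed |= x > barrier
    return crossed.mean()

p_short = first_passage_probability(1.0, 5.0)
p_long = first_passage_probability(1.0, 20.0)
```

The first-passage probability is non-decreasing in the observation time, which a Monte Carlo run with a common seed reproduces exactly.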

  11. Protein-protein interaction site predictions with three-dimensional probability distributions of interacting atoms on protein surfaces.

    Directory of Open Access Journals (Sweden)

    Ching-Tai Chen

    Full Text Available Protein-protein interactions are key to many biological processes. Computational methodologies devised to predict protein-protein interaction (PPI) sites on protein surfaces are important tools in providing insights into the biological functions of proteins and in developing therapeutics targeting the protein-protein interaction sites. One of the general features of PPI sites is that the core regions from the two interacting protein surfaces are complementary to each other, similar to the interior of proteins in packing density and in the physicochemical nature of the amino acid composition. In this work, we simulated the physicochemical complementarities by constructing three-dimensional probability density maps of non-covalent interacting atoms on the protein surfaces. The interacting probabilities were derived from the interior of known structures. Machine learning algorithms were applied to learn the characteristic patterns of the probability density maps specific to the PPI sites. The trained predictors for PPI sites were cross-validated with the training cases (consisting of 432 proteins) and were tested on an independent dataset (consisting of 142 proteins). The residue-based Matthews correlation coefficient for the independent test set was 0.423; the accuracy, precision, sensitivity, specificity were 0.753, 0.519, 0.677, and 0.779, respectively. The benchmark results indicate that the optimized machine learning models are among the best predictors in identifying PPI sites on protein surfaces. In particular, the PPI site prediction accuracy increases with increasing size of the PPI site and with increasing hydrophobicity in amino acid composition of the PPI interface; the core interface regions are more likely to be recognized with high prediction confidence. 
The results indicate that the physicochemical complementarity patterns on protein surfaces are important determinants in PPIs, and a substantial portion of the PPI sites can be predicted
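The residue-based Matthews correlation coefficient quoted above is computed from confusion-matrix counts. The counts in this example are made up for illustration and are not the paper's:

```python
from math import sqrt

def matthews_corrcoef(tp, fp, tn, fn):
    """Matthews correlation coefficient from confusion-matrix counts:
    (TP*TN - FP*FN) / sqrt((TP+FP)(TP+FN)(TN+FP)(TN+FN)).
    Returns 0.0 when any marginal is empty (undefined denominator)."""
    denom = sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return (tp * tn - fp * fn) / denom if denom else 0.0

# Hypothetical counts for a class-imbalanced residue-level benchmark.
mcc = matthews_corrcoef(tp=677, fp=627, tn=2779, fn=323)
```

Unlike plain accuracy, MCC stays informative under the heavy class imbalance typical of interface-residue prediction, which is why the paper reports it alongside accuracy, precision, sensitivity and specificity.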

  12. Protein-Protein Interaction Site Predictions with Three-Dimensional Probability Distributions of Interacting Atoms on Protein Surfaces

    Science.gov (United States)

    Chen, Ching-Tai; Peng, Hung-Pin; Jian, Jhih-Wei; Tsai, Keng-Chang; Chang, Jeng-Yih; Yang, Ei-Wen; Chen, Jun-Bo; Ho, Shinn-Ying; Hsu, Wen-Lian; Yang, An-Suei

    2012-01-01

    Protein-protein interactions are key to many biological processes. Computational methodologies devised to predict protein-protein interaction (PPI) sites on protein surfaces are important tools in providing insights into the biological functions of proteins and in developing therapeutics targeting the protein-protein interaction sites. One of the general features of PPI sites is that the core regions from the two interacting protein surfaces are complementary to each other, similar to the interior of proteins in packing density and in the physicochemical nature of the amino acid composition. In this work, we simulated the physicochemical complementarities by constructing three-dimensional probability density maps of non-covalent interacting atoms on the protein surfaces. The interacting probabilities were derived from the interior of known structures. Machine learning algorithms were applied to learn the characteristic patterns of the probability density maps specific to the PPI sites. The trained predictors for PPI sites were cross-validated with the training cases (consisting of 432 proteins) and were tested on an independent dataset (consisting of 142 proteins). The residue-based Matthews correlation coefficient for the independent test set was 0.423; the accuracy, precision, sensitivity, specificity were 0.753, 0.519, 0.677, and 0.779 respectively. The benchmark results indicate that the optimized machine learning models are among the best predictors in identifying PPI sites on protein surfaces. In particular, the PPI site prediction accuracy increases with increasing size of the PPI site and with increasing hydrophobicity in amino acid composition of the PPI interface; the core interface regions are more likely to be recognized with high prediction confidence. 
The results indicate that the physicochemical complementarity patterns on protein surfaces are important determinants in PPIs, and a substantial portion of the PPI sites can be predicted correctly with

  13. Design and Selection of Machine Learning Methods Using Radiomics and Dosiomics for Normal Tissue Complication Probability Modeling of Xerostomia.

    Science.gov (United States)

    Gabryś, Hubert S; Buettner, Florian; Sterzing, Florian; Hauswald, Henrik; Bangert, Mark

    2018-01-01

    The purpose of this study is to investigate whether machine learning with dosiomic, radiomic, and demographic features allows for xerostomia risk assessment more precise than normal tissue complication probability (NTCP) models based on the mean radiation dose to parotid glands. A cohort of 153 head-and-neck cancer patients was used to model xerostomia at 0-6 months (early), 6-15 months (late), 15-24 months (long-term), and at any time (a longitudinal model) after radiotherapy. Predictive power of the features was evaluated by the area under the receiver operating characteristic curve (AUC) of univariate logistic regression models. The multivariate NTCP models were tuned and tested with single and nested cross-validation, respectively. We compared predictive performance of seven classification algorithms, six feature selection methods, and ten data cleaning/class balancing techniques using the Friedman test and the Nemenyi post hoc analysis. NTCP models based on the parotid mean dose failed to predict xerostomia, whereas the strongest individual features reached AUCs > 0.85, with dose gradients in the right-left (AUCs > 0.78) and the anterior-posterior (AUCs > 0.72) direction also being predictive. Multivariate models of long-term xerostomia were typically based on the parotid volume, the parotid eccentricity, and the dose-volume histogram (DVH) spread with the generalization AUCs ranging from 0.74 to 0.88. On average, support vector machines and extra-trees were the top performing classifiers, whereas the algorithms based on logistic regression were the best choice for feature selection. We found no advantage in using data cleaning or class balancing methods. We demonstrated that incorporation of organ- and dose-shape descriptors is beneficial for xerostomia prediction in highly conformal radiotherapy treatments. Due to strong reliance on patient-specific, dose-independent factors, our results underscore the need for development of personalized data-driven risk profiles for NTCP models of xerostomia. The facilitated
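Univariate AUC screening like that described above can be done without fitting anything: a univariate logistic model's predicted probability is monotone in its single feature, so its AUC equals the rank-based (Mann-Whitney) AUC of the feature itself. A sketch on synthetic data, with feature names as our stand-ins:

```python
import numpy as np

def auc(feature, labels):
    """Rank-based AUC (Mann-Whitney U / (n1 * n0)); equals the AUC of a
    univariate logistic model, since its output is monotone in the feature."""
    order = np.argsort(feature)
    ranks = np.empty(len(feature))
    ranks[order] = np.arange(1, len(feature) + 1)
    pos = labels == 1
    n1, n0 = pos.sum(), (~pos).sum()
    u = ranks[pos].sum() - n1 * (n1 + 1) / 2
    return u / (n1 * n0)

rng = np.random.default_rng(0)
n = 400
y = rng.integers(0, 2, n)
dose_gradient = y + rng.normal(0.0, 0.8, n)   # informative stand-in feature
noise = rng.normal(0.0, 1.0, n)               # uninformative feature
auc_grad, auc_noise = auc(dose_gradient, y), auc(noise, y)
```

Screening every candidate feature this way and ranking by AUC reproduces the first stage of the pipeline described in the abstract.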

  14. Invariant probabilities of transition functions

    CERN Document Server

    Zaharopol, Radu

    2014-01-01

    The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...

  15. The effect of the overall treatment time of fractionated irradiation on the tumor control probability of a human soft tissue sarcoma xenograft in nude mice

    International Nuclear Information System (INIS)

    Allam, Ayman; Perez, Luis A.; Huang, Peigen; Taghian, Alphonse; Azinovic, Ignacio; Freeman, Jill; Duffy, Michael; Efird, Jimmy; Suit, Herman D.

    1995-01-01

    Purpose: To study the impact of the overall treatment time of fractionated irradiation on the tumor control probability (TCP) of a human soft tissue sarcoma xenograft growing in nude mice, as well as to compare the pretreatment potential doubling time (Tpot) of this tumor to the effective doubling time (Teff) derived from three different schedules of irradiation using the same total number of fractions with different overall treatment times. Methods and Materials: The TCP was assessed using the TCD50 value (the 50% tumor control dose) as an end point. A total of 240 male nude mice, 7-8 weeks old, were used in three experimental groups that received the same total number of fractions (30 fractions) with different overall treatment times. In group 1, the animals received three equal fractions/day for 10 consecutive days; in group 2, they received two equal fractions/day for 15 consecutive days; and in group 3, one fraction/day for 30 consecutive days. All irradiations were given under normal blood flow conditions to air breathing animals. The mean tumor diameter at the start of irradiation was 7-8 mm. The mean interfraction intervals were 8-24 h. The Tpot was measured using iododeoxyuridine (IUdR) labeling and flow cytometry and was compared to Teff. Results: The TCD50 values of the three different treatment schedules were 58.8 Gy, 63.2 Gy, and 75.6 Gy for groups 1, 2, and 3, respectively. This difference in TCD50 values was significant. The Tpot (2.4 days) was longer than the calculated Teff in groups 2 and 3 (1.35 days). Conclusion: Our data show a significant loss in TCP with prolongation of the overall treatment time. This is most probably due to accelerated repopulation of tumor clonogens. The pretreatment Tpot of this tumor model does not reflect the actual doubling of the clonogens in a protracted regimen.

  16. Probable hepatic capillariosis and hydatidosis in an adolescent from the late Roman period buried in Amiens (France

    Directory of Open Access Journals (Sweden)

    Mowlavi Gholamreza

    2014-01-01

    Full Text Available Two calcified objects recovered from a 3rd to 4th-century grave of an adolescent in Amiens (Northern France) were identified as probable hydatid cysts. By using thin-section petrographic techniques, probable Calodium hepaticum (syn. Capillaria hepatica) eggs were identified in the wall of the cysts. Human hepatic capillariosis has not been reported from archaeological material so far, but could be expected given the poor level of environmental hygiene prevalent in this period. Identification of tissue-dwelling parasites such as C. hepaticum in archaeological remains is particularly dependent on preservation conditions and taphonomic changes and should be interpreted with caution due to morphological similarities with Trichuris sp. eggs.

  17. Probable hepatic capillariosis and hydatidosis in an adolescent from the late Roman period buried in Amiens (France).

    Science.gov (United States)

    Mowlavi, Gholamreza; Kacki, Sacha; Dupouy-Camet, Jean; Mobedi, Iraj; Makki, Mahsasadat; Harandi, Majid Fasihi; Naddaf, Saied Reza

    2014-01-01

    Two calcified objects recovered from a 3rd to 4th-century grave of an adolescent in Amiens (Northern France) were identified as probable hydatid cysts. By using thin-section petrographic techniques, probable Calodium hepaticum (syn. Capillaria hepatica) eggs were identified in the wall of the cysts. Human hepatic capillariosis has not been reported from archaeological material so far, but could be expected given the poor level of environmental hygiene prevalent in this period. Identification of tissue-dwelling parasites such as C. hepaticum in archaeological remains is particularly dependent on preservation conditions and taphonomic changes and should be interpreted with caution due to morphological similarities with Trichuris sp. eggs. © G. Mowlavi et al., published by EDP Sciences, 2014.

  18. Recurrence determinism and Li-Yorke chaos for interval maps

    OpenAIRE

    Špitalský, Vladimír

    2017-01-01

    Recurrence determinism, one of the fundamental characteristics of recurrence quantification analysis, measures the predictability of a trajectory of a dynamical system. It is tightly connected with the conditional probability that, given a recurrence, the following states of the trajectory will also be recurrences. In this paper we study recurrence determinism of interval dynamical systems. We show that recurrence determinism distinguishes three main types of $\omega$-limit sets of zero entropy maps: fini...

  19. The concept of probability

    International Nuclear Information System (INIS)

    Bitsakis, E.I.; Nicolaides, C.A.

    1989-01-01

    The concept of probability is now, and always has been, central to the debate on the interpretation of quantum mechanics. Furthermore, probability permeates all of science, as well as our everyday life. The papers included in this volume, written by leading proponents of the ideas expressed, embrace a broad spectrum of thought and results: mathematical, physical, epistemological, and experimental, both specific and general. The contributions are arranged in parts under the following headings: Following Schroedinger's thoughts; Probability and quantum mechanics; Aspects of the arguments on nonlocality; Bell's theorem and EPR correlations; Real or Gedanken experiments and their interpretation; Questions about irreversibility and stochasticity; and Epistemology, interpretation and culture. (author). refs.; figs.; tabs

  20. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications. ... The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models.
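
    For the special case of multinomial logit, the CPGF is the log-sum-exp of the utilities, and taking its gradient recovers the choice probabilities, illustrating the paper's central relation. A small numerical check (the utility values are made up):

```python
import numpy as np

def logit_cpgf(u):
    """CPGF of the multinomial logit model: G(u) = log(sum_j exp(u_j))."""
    m = np.max(u)                          # subtract the max for stability
    return m + np.log(np.sum(np.exp(u - m)))

def choice_probabilities(u):
    """Gradient of the CPGF, which equals the choice probabilities."""
    e = np.exp(u - np.max(u))
    return e / e.sum()

u = np.array([1.0, 2.0, 0.5])              # utilities of three alternatives
p = choice_probabilities(u)

# Check the gradient relation numerically: P_j = dG/du_j.
h = 1e-6
for j in range(len(u)):
    du = np.zeros(len(u)); du[j] = h
    grad_j = (logit_cpgf(u + du) - logit_cpgf(u - du)) / (2 * h)
    assert abs(grad_j - p[j]) < 1e-6
print(np.round(p, 3))
```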

  1. Implementation of fast macromolecular proton fraction mapping on 1.5 and 3 Tesla clinical MRI scanners: preliminary experience

    Science.gov (United States)

    Yarnykh, V.; Korostyshevskaya, A.

    2017-08-01

    Macromolecular proton fraction (MPF) is a biophysical parameter describing the amount of macromolecular protons involved in magnetization exchange with water protons in tissues. MPF represents a significant interest as a magnetic resonance imaging (MRI) biomarker of myelin for clinical applications. A recent fast MPF mapping method enabled clinical translation of MPF measurements due to time-efficient acquisition based on the single-point constrained fit algorithm. However, previous MPF mapping applications utilized only 3 Tesla MRI scanners and modified pulse sequences, which are not commonly available. This study aimed to test the feasibility of implementing MPF mapping on a 1.5 Tesla clinical scanner using standard manufacturer’s sequences and to compare the performance of the method between 1.5 and 3 Tesla scanners. MPF mapping was implemented on 1.5 and 3 Tesla MRI units from one manufacturer with either optimized custom-written or standard product pulse sequences. Whole-brain three-dimensional MPF maps obtained from a single volunteer were compared between field strengths and implementation options. MPF maps demonstrated similar quality at both field strengths. MPF values in segmented brain tissues and specific anatomic regions appeared in close agreement. This experiment demonstrates the feasibility of fast MPF mapping using standard sequences on 1.5 T and 3 T clinical scanners.

  2. SU-G-BRC-15: The Potential Clinical Significance of Dose Mapping Error for Intra- Fraction Dose Mapping for Lung Cancer Patients

    Energy Technology Data Exchange (ETDEWEB)

    Sayah, N [Thomas Cancer Center, Richmond, VA (United States); Weiss, E [Virginia Commonwealth University, Richmond, Virginia (United States); Watkins, W [University of Virginia, Charlottesville, VA (United States); Siebers, J [University of Virginia Health System, Charlottesville, VA (United States)

    2016-06-15

    Purpose: To evaluate the dose-mapping error (DME) inherent to conventional dose-mapping algorithms as a function of dose-matrix resolution. Methods: As DME has been reported to be greatest where dose gradients overlap tissue-density gradients, non-clinical 66 Gy IMRT plans were generated for 11 lung patients with the target edge defined as the maximum 3D density gradient on the 0% (end of inhale) breathing phase. Post-optimization, beams were copied to 9 breathing phases. Monte Carlo dose computed (at 2*2*2 mm{sup 3} resolution) on all 10 breathing phases was deformably mapped to phase 0% using the Monte Carlo energy-transfer method with congruent mass-mapping (EMCM); an externally implemented tri-linear interpolation method with voxel sub-division; Pinnacle’s internal (tri-linear) method; and a post-processing energy-mass voxel-warping method (dTransform). All methods used the same base displacement-vector-field (or its pseudo-inverse as appropriate) for the dose mapping. Mapping was also performed at 4*4*4 mm{sup 3} by merging adjacent dose voxels. Results: Using EMCM as the reference standard, no clinically significant (>1 Gy) DMEs were found for the mean lung dose (MLD), lung V20Gy, or esophagus dose-volume indices, although MLD and V20Gy were statistically different (2*2*2 mm{sup 3}). Pinnacle-to-EMCM target D98% DMEs of 4.4 and 1.2 Gy were observed (2*2*2 mm{sup 3}). However, dTransform, which like EMCM conserves integral dose, had DME >1 Gy for one case. The root mean square (RMS) of the DME for the tri-linear-to-EMCM methods was lower for the smaller voxel volume for the tumor 4D-D98%, lung V20Gy, and cord D1%. Conclusion: When tissue gradients overlap with dose gradients, organs-at-risk DME was statistically significant but not clinically significant. Target-D98%-DME was deemed clinically significant for 2/11 patients (2*2*2 mm{sup 3}). Since RMS-DME between EMCM and the tri-linear method was reduced at 2*2*2 mm{sup 3}, use of this resolution is
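
    The tri-linear methods compared above pull dose back along the displacement field by trilinear interpolation. A minimal sketch of that interpolation step on a toy dose grid, in voxel-unit coordinates (not the clinical implementation):

```python
import numpy as np

def trilinear(dose, p):
    """Trilinear interpolation of a 3D dose grid at point p, given in
    voxel units. Dose mapping evaluates the dose at a voxel's displaced
    position and assigns it back to the reference phase."""
    i, j, k = (int(np.floor(c)) for c in p)
    fx, fy, fz = p[0] - i, p[1] - j, p[2] - k
    c = dose[i:i + 2, j:j + 2, k:k + 2]   # the 8 surrounding voxels
    c = c[0] * (1 - fx) + c[1] * fx       # collapse x
    c = c[0] * (1 - fy) + c[1] * fy       # collapse y
    return c[0] * (1 - fz) + c[1] * fz    # collapse z

# A linear dose field is reproduced exactly by trilinear interpolation.
x, y, z = np.meshgrid(np.arange(4), np.arange(4), np.arange(4), indexing="ij")
dose = 1.0 * x + 2.0 * y + 3.0 * z
print(trilinear(dose, (1.5, 0.25, 2.0)))   # 1.5 + 0.5 + 6.0 = 8.0
```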

  3. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  4. The probability outcome correspondence principle: a dispositional view of the interpretation of probability statements

    NARCIS (Netherlands)

    Keren, G.; Teigen, K.H.

    2001-01-01

    This article presents a framework for lay people's internal representations of probabilities, which supposedly reflect the strength of underlying dispositions, or propensities, associated with the predicted event. From this framework, we derive the probability-outcome correspondence principle, which

  5. Mapping aerial metal deposition in metropolitan areas from tree bark: a case study in Sheffield, England.

    Science.gov (United States)

    Schelle, E; Rawlins, B G; Lark, R M; Webster, R; Staton, I; McLeod, C W

    2008-09-01

    We investigated the use of metals accumulated on tree bark for mapping their deposition across metropolitan Sheffield by sampling 642 trees of three common species. Mean concentrations of metals were generally an order of magnitude greater than in samples from a remote uncontaminated site. We found trivially small differences among tree species with respect to metal concentrations on bark, and in subsequent statistical analyses did not discriminate between them. We mapped the concentrations of As, Cd and Ni by lognormal universal kriging using parameters estimated by residual maximum likelihood (REML). The concentrations of Ni and Cd were greatest close to a large steel works, their probable source, and declined markedly within 500 m of it and from there more gradually over several kilometres. Arsenic was much more evenly distributed, probably as a result of locally mined coal burned in domestic fires for many years. Tree bark seems to integrate airborne pollution over time, and our findings show that sampling and analysing it are cost-effective means of mapping and identifying sources.

  6. Poisson Processes in Free Probability

    OpenAIRE

    An, Guimei; Gao, Mingchu

    2015-01-01

    We prove a multidimensional Poisson limit theorem in free probability, and define joint free Poisson distributions in a non-commutative probability space. We define (compound) free Poisson processes explicitly, similar to the definitions of (compound) Poisson processes in classical probability. We prove that the sum of finitely many freely independent compound free Poisson processes is a compound free Poisson process. We give a step-by-step procedure for constructing a (compound) free Poisso...

  7. Solving probability reasoning based on DNA strand displacement and probability modules.

    Science.gov (United States)

    Zhang, Qiang; Wang, Xiaobiao; Wang, Xiaojun; Zhou, Changjun

    2017-12-01

    In computational biology, DNA strand displacement technology is used to simulate the computation process and has shown strong computing ability. Most researchers use it to solve logic problems, but it is only rarely used in probabilistic reasoning. To perform probabilistic reasoning, a conditional probability derivation model and a total probability model based on DNA strand displacement were established in this paper. The models were assessed through the game "read your mind." They have been shown to enable the application of probabilistic reasoning in genetic diagnosis. Copyright © 2017 Elsevier Ltd. All rights reserved.
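
    The two strand-displacement modules implement the law of total probability and conditional (Bayes) inversion. In ordinary code, the same two rules look like this; the prior and likelihood values are made-up illustrative numbers, not data from the paper:

```python
def total_probability(prior, likelihood):
    """Total probability model: P(E) = sum_i P(H_i) * P(E | H_i)."""
    return sum(prior[h] * likelihood[h] for h in prior)

def posterior(prior, likelihood):
    """Conditional probability (Bayes) model: P(H_i | E)."""
    p_e = total_probability(prior, likelihood)
    return {h: prior[h] * likelihood[h] / p_e for h in prior}

# Made-up genetic-diagnosis numbers, purely illustrative:
prior = {"carrier": 0.01, "non-carrier": 0.99}
likelihood = {"carrier": 0.95, "non-carrier": 0.05}   # P(positive test | H)

print(round(total_probability(prior, likelihood), 6))     # 0.059
print(round(posterior(prior, likelihood)["carrier"], 6))  # 0.161017
```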

  8. A five-colour colour-coded mapping method for DCE-MRI analysis of head and neck tumours

    International Nuclear Information System (INIS)

    Yuan, J.; Chow, S.K.K.; Yeung, D.K.W.; King, A.D.

    2012-01-01

    Aim: To devise a method to convert the time–intensity curves (TICs) of head and neck dynamic contrast-enhanced (DCE) magnetic resonance imaging (MRI) data into a pixel-by-pixel colour-coded map for identifying normal tissues and tumours. Materials and methods: Twenty-three patients with head and neck squamous cell carcinoma (HNSCC) underwent DCE-MRI. TIC patterns of primary tumours, metastatic nodes, and normal tissues were assessed and a program was devised to convert the patterns into a classified colour-coded map. The enhancement patterns of tumours and normal tissue structures were evaluated and categorized into nine grades (0–8) based on the predominance of coloured pixels on maps. Results: Five identified TIC patterns were converted into a colour-coded map consisting of red (maximum enhancement), brown (continuous slow rise-up), yellow (rapid wash-in and wash-out), green (rapid wash-in and plateau), and blue (rapid wash-in and rise-up). The colour-coded map distinguished all 21 primary tumours and 15 metastatic nodes from normal structures. Primary tumours and metastatic nodes were colour coded as predominantly yellow (grades 1–2) in 17/21 and 6/15, green (grades 3–5) in 3/21 and 5/15, and blue (grades 6–7) in 1/21 and 4/15, respectively. Vessels were coded red in 46/46 (grade 0) and muscles were coded brown in 23/23 (grade 8). Salivary glands, thyroid glands, and palatine tonsils were coded into predominantly yellow (grade 1) in 46/46 and 10/10 and 18/22, respectively. Conclusion: DCE-MRI derived five-colour-coded mapping provides an objective easy-to-interpret method to assess the dynamic enhancement pattern of head and neck cancers.
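
    The five TIC patterns lend themselves to a simple per-pixel rule-based classifier. A sketch of such a rule, assuming curves normalized to [0, 1]; the thresholds and exact decision order are illustrative guesses, not the published criteria:

```python
import numpy as np

def classify_tic(curve, washout=0.10, slow=0.10):
    """Map a normalized time-intensity curve (values in [0, 1]) to one of
    the five colour classes. Thresholds are illustrative, not the
    published ones."""
    curve = np.asarray(curve, dtype=float)
    peak = int(curve.argmax())
    if curve.max() >= 0.95 and peak <= 2:
        return "red"                   # maximum enhancement (vessel-like)
    if peak == len(curve) - 1:         # still rising at the last time point
        early_slope = curve[peak] / max(peak, 1)
        return "brown" if early_slope < slow else "blue"
    late = curve[-1] - curve[peak]     # post-peak drift
    return "yellow" if late <= -washout else "green"

print(classify_tic([0, 1.0, 1.0, 0.9]))    # rapid maximum enhancement -> red
print(classify_tic([0, 0.8, 0.9, 0.6]))    # wash-in then wash-out -> yellow
print(classify_tic([0, 0.05, 0.1, 0.15]))  # continuous slow rise-up -> brown
```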

  9. Sensitivity of the probability of failure to probability of detection curve regions

    International Nuclear Information System (INIS)

    Garza, J.; Millwater, H.

    2016-01-01

    Non-destructive inspection (NDI) techniques have been shown to play a vital role in fracture control plans, structural health monitoring, and ensuring availability and reliability of piping, pressure vessels, mechanical and aerospace equipment. Probabilistic fatigue simulations are often used in order to determine the efficacy of an inspection procedure with the NDI method modeled as a probability of detection (POD) curve. These simulations can be used to determine the most advantageous NDI method for a given application. As an aid to this process, a first order sensitivity method of the probability-of-failure (POF) with respect to regions of the POD curve (lower tail, middle region, right tail) is developed and presented here. The sensitivity method computes the partial derivative of the POF with respect to a change in each region of a POD or multiple POD curves. The sensitivities are computed at no cost by reusing the samples from an existing Monte Carlo (MC) analysis. A numerical example is presented considering single and multiple inspections. - Highlights: • Sensitivities of probability-of-failure to a region of probability-of-detection curve. • The sensitivities are computed with negligible cost. • Sensitivities identify the important region of a POD curve. • Sensitivities can be used as a guide to selecting the optimal POD curve.
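
    The sensitivity idea, computing the derivative of POF with respect to a shift of the POD curve in one region by reusing existing Monte Carlo samples, can be sketched on a toy crack-growth model. All distributions and POD parameters below are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def pod(a, a50=2.0, k=2.0):
    """Log-logistic probability-of-detection curve for crack size a (mm)."""
    return 1.0 / (1.0 + (a50 / np.maximum(a, 1e-12)) ** k)

# Monte Carlo fatigue sketch: a crack of random initial size a0 is inspected
# once, then grows; it causes failure if it reaches a_crit undetected.
n = 200_000
a0 = rng.lognormal(0.0, 0.5, n)          # crack size at inspection (mm)
growth = rng.lognormal(1.0, 0.3, n)      # growth factor over remaining life
fails = a0 * growth >= 5.0               # would fail if not detected
pof = np.mean((1.0 - pod(a0)) * fails)   # probability of failure

# Sensitivity of POF to shifting the POD curve upward by d only where a0
# falls in [lo, hi): dPOF/dd = -E[1{lo <= a0 < hi} * 1{fail}], computed
# at no extra cost from the same samples.
def region_sensitivity(lo, hi):
    return -np.mean(((a0 >= lo) & (a0 < hi)) & fails)

for region in [(0.0, 1.0), (1.0, 3.0), (3.0, np.inf)]:
    print(region, region_sensitivity(*region))
```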

  10. Truth, possibility and probability new logical foundations of probability and statistical inference

    CERN Document Server

    Chuaqui, R

    1991-01-01

    Anyone involved in the philosophy of science is naturally drawn into the study of the foundations of probability. Different interpretations of probability, based on competing philosophical ideas, lead to different statistical techniques, and frequently to mutually contradictory consequences. This unique book presents a new interpretation of probability, rooted in the traditional interpretation that was current in the 17th and 18th centuries. Mathematical models are constructed based on this interpretation, and statistical inference and decision theory are applied, including some examples in artificial intelligence, solving the main foundational problems. Nonstandard analysis is extensively developed for the construction of the models and in some of the proofs. Many nonstandard theorems are proved, some of them new, in particular, a representation theorem that asserts that any stochastic process can be approximated by a process defined over a space with equiprobable outcomes.

  11. Azimuthally invariant Mueller-matrix mapping of biological optically anisotropic network

    Science.gov (United States)

    Ushenko, Yu. O.; Vanchuliak, O.; Bodnar, G. B.; Ushenko, V. O.; Grytsyuk, M.; Pavlyukovich, N.; Pavlyukovich, O. V.; Antonyuk, O.

    2017-09-01

    A new technique for Mueller-matrix mapping of the polycrystalline structure of histological sections of biological tissues is suggested. Algorithms are derived for reconstructing the distributions of linear and circular dichroism parameters in histological sections of liver tissue from mice with different degrees of severity of diabetes, and the interconnections between these distributions and the dichroism parameters of the sections are defined. Comparative investigations of the coordinate distributions of the amplitude-anisotropy parameters formed by liver tissue at two severities of diabetes (10 days and 24 days) are performed. The values and ranges of change of the statistical moments (1st to 4th order) characterizing the coordinate distributions of linear and circular dichroism are determined, yielding objective criteria for differentiating the degree of severity of diabetes.

  12. Stokes polarimetry imaging of dog prostate tissue

    Science.gov (United States)

    Kim, Jihoon; Johnston, William K., III; Walsh, Joseph T., Jr.

    2010-02-01

    Prostate cancer was the second leading cause of cancer death among men in the United States in 2009. Radical prostatectomy (complete removal of the prostate) is the most common treatment for prostate cancer; however, differentiating prostate tissue from adjacent bladder, nerves, and muscle is difficult. Improved visualization could improve oncologic outcomes and decrease damage to adjacent nerves and muscle important for preservation of potency and continence. A novel Stokes polarimetry imaging (SPI) system was developed and evaluated using a dog prostate specimen in order to examine the feasibility of the system to differentiate prostate from bladder. Degree of linear polarization (DOLP) image maps from linearly polarized light illumination at different visible wavelengths (475, 510, and 650 nm) were constructed. The SPI system exploits the polarization properties of prostate tissue. The DOLP images allowed advanced differentiation by distinguishing the glandular tissue of the prostate from the muscular-stromal tissue in the bladder. The DOLP image at 650 nm effectively differentiated prostate and bladder through the strong DOLP in bladder. The SPI system has the potential to improve surgical outcomes in open or robotic-assisted laparoscopic removal of the prostate. Further in vivo testing is warranted.

  13. Logic, probability, and human reasoning.

    Science.gov (United States)

    Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P

    2015-04-01

    This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Pre-Aggregation with Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    2006-01-01

    Motivated by the increasing need to analyze complex, uncertain multidimensional data, this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how the distributions can be subsequently used in pre-aggregation. Since the probability distributions can become large, we show how to achieve good time and space efficiency by approximating the distributions. We present the results of several experiments that demonstrate the effectiveness of our methods. The work is motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. This paper is the first to consider the approximate processing of probabilistic OLAP queries over probability distributions.

  15. Probability and stochastic modeling

    CERN Document Server

    Rotar, Vladimir I

    2012-01-01

    Basic Notions: Sample Space and Events; Probabilities; Counting Techniques. Independence and Conditional Probability: Independence; Conditioning; The Borel-Cantelli Theorem. Discrete Random Variables: Random Variables and Vectors; Expected Value; Variance and Other Moments; Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables and the Law of Large Numbers; Conditional Expectation. Generating Functions, Branching Processes, and Random Walk Revisited: Branching Processes; Generating Functions; Branching Processes Revisited; More on Random Walk. Markov Chains: Definitions and Examples; Probability Distributions of Markov Chains; The First Step Analysis; Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity. Continuous Random Variables: Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations. Distributions in the General Case and Simulation: Distribution F...

  16. Segmentation of Connective Tissue in Meat from Microtomography Using a Grating Interferometer

    DEFF Research Database (Denmark)

    Einarsdottir, Hildur; Ersbøll, Bjarne Kjær; Larsen, Rasmus

    Although microtomography provides high resolution, the thin structures of the connective tissues are difficult to segment. This is mainly due to partial-object voxels, image noise and artifacts. The segmentation of connective tissue is important for quantitative analysis purposes: factors such as the surface area, relative volume and the statistics of the electron density of the connective tissue could prove useful for understanding the structural changes occurring in the meat sample due to heat treatment. In this study a two-step segmentation algorithm was implemented in order to segment connective tissue from ... the a priori probability of neighborhood dependencies, and the field can either be isotropic or anisotropic. For the segmentation of connective tissue, local information on structure orientation and coherence is extracted to steer the smoothing (anisotropy) of the final segmentation. The results show...

  17. Mapping Local Cytosolic Enzymatic Activity in Human Esophageal Mucosa with Porous Silicon Nanoneedles.

    Science.gov (United States)

    Chiappini, Ciro; Campagnolo, Paola; Almeida, Carina S; Abbassi-Ghadi, Nima; Chow, Lesley W; Hanna, George B; Stevens, Molly M

    2015-09-16

    Porous silicon nanoneedles can map Cathepsin B activity across normal and tumor human esophageal mucosa. Assembling a peptide-based Cathepsin B-cleavable sensor over a large array of nanoneedles allows the discrimination of cancer cells from healthy ones in mixed culture. The same sensor applied to tissue can map Cathepsin B activity with high resolution across the tumor margin area of esophageal adenocarcinoma. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Transitional Probabilities Are Prioritized over Stimulus/Pattern Probabilities in Auditory Deviance Detection: Memory Basis for Predictive Sound Processing.

    Science.gov (United States)

    Mittag, Maria; Takegata, Rika; Winkler, István

    2016-09-14

    Representations encoding the probabilities of auditory events do not directly support predictive processing. In contrast, information about the probability with which a given sound follows another (transitional probability) allows predictions of upcoming sounds. We tested whether behavioral and cortical auditory deviance detection (the latter indexed by the mismatch negativity event-related potential) relies on probabilities of sound patterns or on transitional probabilities. We presented healthy adult volunteers with three types of rare tone-triplets among frequent standard triplets of high-low-high (H-L-H) or L-H-L pitch structure: proximity deviant (H-H-H/L-L-L), reversal deviant (L-H-L/H-L-H), and first-tone deviant (L-L-H/H-H-L). If deviance detection was based on pattern probability, reversal and first-tone deviants should be detected with similar latency because both differ from the standard at the first pattern position. If deviance detection was based on transitional probabilities, then reversal deviants should be the most difficult to detect because, unlike the other two deviants, they contain no low-probability pitch transitions. The data clearly showed that both behavioral and cortical auditory deviance detection uses transitional probabilities. Thus, the memory traces underlying cortical deviance detection may provide a link between stimulus probability-based change/novelty detectors operating at lower levels of the auditory system and higher auditory cognitive functions that involve predictive processing. Our research presents the first definite evidence for the auditory system prioritizing transitional probabilities over probabilities of individual sensory events. Forming representations for transitional probabilities paves the way for predictions of upcoming sounds. Several recent theories suggest that predictive processing provides the general basis of human perception, including important auditory functions, such as auditory scene analysis. Our
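
    Transitional probabilities of the kind the auditory system is argued to track can be estimated from a stimulus sequence as a first-order Markov matrix. A minimal sketch using the H-L-H standard stream (the estimator is standard; the tie to the experiment is only illustrative):

```python
import numpy as np

def transition_matrix(seq, states="HL"):
    """Maximum-likelihood transitional probabilities P(next | current)
    estimated from a symbol sequence."""
    idx = {s: i for i, s in enumerate(states)}
    counts = np.zeros((len(states), len(states)))
    for a, b in zip(seq, seq[1:]):
        counts[idx[a], idx[b]] += 1
    return counts / counts.sum(axis=1, keepdims=True)

# Standard stream of H-L-H triplets, as in the experiment:
P = transition_matrix("HLH" * 200)
# From L the only observed transition is L->H, so P(L->H) = 1; from H,
# H->L and H->H are both common. A reversal deviant (L-H-L) therefore
# contains no low-probability transitions, which is why it is hardest
# to detect if the system tracks transitional probabilities.
print(P)
```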

  19. Development of optimized segmentation map in dual energy computed tomography

    Science.gov (United States)

    Yamakawa, Keisuke; Ueki, Hironori

    2012-03-01

    Dual energy computed tomography (DECT) has been widely used in clinical practice and has been particularly effective for tissue diagnosis. In DECT, the difference between two attenuation coefficients acquired at two X-ray energies enables tissue segmentation. One problem in conventional DECT is that the segmentation deteriorates in some cases, such as bone removal. This is due to two reasons. Firstly, the segmentation map is optimized without considering the X-ray condition (tube voltage and current). If we consider the tube voltage, it is possible to create an optimized map, but unfortunately we cannot consider the tube current. Secondly, the X-ray condition itself is not optimized. The condition can be set empirically, but this means that the optimized condition is not used correctly. To solve these problems, we have developed methods for optimizing the map (Method-1) and the condition (Method-2). In Method-1, the map is optimized to minimize segmentation errors, with the distribution of the attenuation coefficient modeled by considering the tube current. In Method-2, the optimized condition is chosen to minimize segmentation errors over tube voltage-current combinations while keeping the total exposure constant. We evaluated the effectiveness of Method-1 by performing a phantom experiment under a fixed condition, and of Method-2 by performing a phantom experiment under different combinations calculated from the total exposure constraint. When Method-1 was combined with Method-2, the segmentation error was reduced from 37.8 to 13.5 %. These results demonstrate that our developed methods can achieve highly accurate segmentation while keeping the total exposure constant.
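
    The dependence of segmentation error on the X-ray condition can be illustrated with a toy two-material classifier in the dual-energy attenuation plane. The material means and the noise model below are invented for the sketch, not the paper's values:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical mean attenuations (HU) of two materials at (low, high) kVp;
# the numbers are illustrative only.
materials = {"bone":   np.array([900.0, 500.0]),
             "iodine": np.array([700.0, 300.0])}

def noise_sd(mAs):
    """Image noise falls roughly as 1/sqrt(tube current); the constant
    is made up for this sketch."""
    return 400.0 / np.sqrt(mAs)

def classify(pixel):
    """Nearest-mean label in the dual-energy attenuation plane."""
    return min(materials, key=lambda m: np.sum((pixel - materials[m]) ** 2))

# Segmentation error of simulated bone pixels depends on the tube current,
# which is what optimizing the condition (as in Method-2) exploits:
errors = {}
for mAs in (50, 200):
    noisy = materials["bone"] + noise_sd(mAs) * rng.standard_normal((5000, 2))
    errors[mAs] = np.mean([classify(p) != "bone" for p in noisy])
    print(f"{mAs} mAs: segmentation error = {errors[mAs]:.4f}")
```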

  20. Concomitant administration of nitrous oxide and remifentanil reduces oral tissue blood flow without decreasing blood pressure during sevoflurane anesthesia in rabbits.

    Science.gov (United States)

    Kasahara, Masataka; Ichinohe, Tatsuya; Okamoto, Sota; Okada, Reina; Kanbe, Hiroaki; Matsuura, Nobuyuki

    2015-06-01

    To determine whether continuous administration of nitrous oxide and remifentanil—either alone or together—alters blood flow in oral tissues during sevoflurane anesthesia. Eight male tracheotomized Japanese white rabbits were anesthetized with sevoflurane under mechanical ventilation. Heart rate (HR), systolic blood pressure (SBP), diastolic blood pressure (DBP), mean arterial pressure (MAP), common carotid arterial blood flow (CCBF), tongue mucosal blood flow (TMBF), mandibular bone marrow blood flow (BBF), masseter muscle blood flow (MBF), upper alveolar tissue blood flow (UBF), and lower alveolar tissue blood flow (LBF) were recorded in the absence of all test agents and after administration of the test agents (50 % nitrous oxide, 0.4 μg/kg/min remifentanil, and their combination) for 20 min. Nitrous oxide increased SBP, DBP, MAP, CCBF, BBF, MBF, UBF, and LBF relative to baseline values but did not affect HR or TMBF. Remifentanil decreased all hemodynamic variables except DBP. Combined administration of nitrous oxide and remifentanil recovered SBP, DBP, MAP, and CCBF to baseline levels, but HR and oral tissue blood flow remained lower than control values. Our findings suggest that concomitant administration of nitrous oxide and remifentanil reduces blood flow in oral tissues without decreasing blood pressure during sevoflurane anesthesia in rabbits.