WorldWideScience

Sample records for tissue probability map

  1. The average baboon brain: MRI templates and tissue probability maps from 89 individuals.

    Science.gov (United States)

    Love, Scott A; Marie, Damien; Roth, Muriel; Lacoste, Romain; Nazarian, Bruno; Bertello, Alice; Coulon, Olivier; Anton, Jean-Luc; Meguerditchian, Adrien

    2016-05-15

    The baboon (Papio) brain is a remarkable model for investigating the brain. The current work aimed at creating a population-average baboon (Papio anubis) brain template and its left/right hemisphere symmetric version from a large sample of T1-weighted magnetic resonance images collected from 89 individuals. Averaging the prior probability maps output during the segmentation of each individual also produced the first baboon brain tissue probability maps for gray matter, white matter and cerebrospinal fluid. The templates and the tissue probability maps were created using state-of-the-art, freely available software tools and are being made freely and publicly available: http://www.nitrc.org/projects/haiko89/ or http://lpc.univ-amu.fr/spip.php?article589. It is hoped that these images will aid neuroimaging research of the baboon by, for example, providing a modern, high quality normalization target and accompanying standardized coordinate system as well as probabilistic priors that can be used during tissue segmentation. Copyright © 2016 Elsevier Inc. All rights reserved.
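
    The averaging step described above (turning per-subject tissue priors into population tissue probability maps) reduces to a voxel-wise mean once the segmentations share a common space. A minimal sketch, assuming the per-subject gray-matter probability maps have already been normalized to the template grid and saved as NIfTI files; the file layout and names are placeholders, not the Haiko89 distribution's.

```python
# Average per-subject gray-matter probability maps into a population tissue probability map.
# Assumes all images are already resampled to the same template grid.
import glob
import numpy as np
import nibabel as nib  # standard NIfTI I/O library

paths = sorted(glob.glob("subjects/*_gm_prob.nii.gz"))  # hypothetical file layout
imgs = [nib.load(p) for p in paths]

# Stack the probability volumes and average voxel-wise.
data = np.stack([img.get_fdata(dtype=np.float32) for img in imgs], axis=-1)
mean_gm = data.mean(axis=-1)

# Save the population map with the affine of the first image.
nib.save(nib.Nifti1Image(mean_gm, imgs[0].affine), "population_gm_tpm.nii.gz")
```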

  2. Probability mapping of contaminants

    Energy Technology Data Exchange (ETDEWEB)

    Rautman, C.A.; Kaplan, P.G. [Sandia National Labs., Albuquerque, NM (United States); McGraw, M.A. [Univ. of California, Berkeley, CA (United States); Istok, J.D. [Oregon State Univ., Corvallis, OR (United States); Sigda, J.M. [New Mexico Inst. of Mining and Technology, Socorro, NM (United States)

    1994-04-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds).
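
    The post-processing step described above, converting many equally likely geostatistical realizations into a map of exceedance probability, amounts to counting, cell by cell, the fraction of simulations above a threshold. A minimal sketch with synthetic realizations standing in for the conditional simulations; the threshold is illustrative, not a regulatory value.

```python
# Exceedance-probability map from a stack of equally likely geostatistical realizations.
import numpy as np

rng = np.random.default_rng(0)
n_sims, ny, nx = 200, 50, 50
# Placeholder for conditional simulations of contaminant concentration (e.g. ppm U).
realizations = rng.lognormal(mean=3.0, sigma=0.5, size=(n_sims, ny, nx))

threshold = 35.0  # illustrative clean-up threshold
# Probability of exceeding the threshold = fraction of realizations above it, per cell.
p_exceed = (realizations > threshold).mean(axis=0)

print(p_exceed.shape, p_exceed.min(), p_exceed.max())
```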

  3. Probability mapping of contaminants

    International Nuclear Information System (INIS)

    Rautman, C.A.; Kaplan, P.G.; McGraw, M.A.; Istok, J.D.; Sigda, J.M.

    1994-01-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds)

  4. Cerebral gray matter volume losses in essential tremor: A case-control study using high resolution tissue probability maps.

    Science.gov (United States)

    Cameron, Eric; Dyke, Jonathan P; Hernandez, Nora; Louis, Elan D; Dydak, Ulrike

    2018-03-10

    Essential tremor (ET) is increasingly recognized as a multi-dimensional disorder with both motor and non-motor features. For this reason, imaging studies are more broadly examining regions outside the cerebellar motor loop. Reliable detection of cerebral gray matter (GM) atrophy requires optimized processing, adapted to high-resolution magnetic resonance imaging (MRI). We investigated cerebral GM volume loss in ET cases using automated segmentation of T1-weighted MRI images. MRI was acquired on 47 ET cases and 36 controls. Automated segmentation and voxel-wise comparisons of volume were performed using Statistical Parametric Mapping (SPM) software. To improve upon standard protocols, the high-resolution International Consortium for Brain Mapping (ICBM) 2009a atlas and tissue probability maps were used to process each subject image. Group comparisons were performed: all ET vs. controls, ET with head tremor (ETH) vs. controls, and severe ET vs. controls. An analysis of variance (ANOVA) was performed between ET with and without head tremor and controls. Age, sex, and Montreal Cognitive Assessment (MoCA) score were regressed out from each comparison. We were able to consistently identify regions of cerebral GM volume loss in ET and in ET subgroups in the posterior insula, superior temporal gyri, cingulate cortex, inferior frontal gyri and other occipital and parietal regions. There were no significant increases in GM volume in ET in any comparisons with controls. This study, which uses improved methodologies, provides evidence that GM volume loss in ET is present beyond the cerebellum, and in fact, is widespread throughout the cerebrum as well. Copyright © 2018 Elsevier Ltd. All rights reserved.

  5. Direct probability mapping of contaminants

    International Nuclear Information System (INIS)

    Rautman, C.A.

    1993-01-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. Geostatistical simulation provides powerful tools for investigating contaminant levels, and in particular, for identifying and using the spatial interrelationships among a set of isolated sample values. This additional information can be used to assess the likelihood of encountering contamination at unsampled locations and to evaluate the risk associated with decisions to remediate or not to remediate specific regions within a site. Past operation of the DOE Feed Materials Production Center has contaminated a site near Fernald, Ohio, with natural uranium. Soil geochemical data have been collected as part of the Uranium-in-Soils Integrated Demonstration Project. These data have been used to construct a number of stochastic images of potential contamination for parcels approximately the size of a selective remediation unit. Each such image accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely, statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination. Evaluation of the geostatistical simulations can yield maps representing the expected magnitude of the contamination for various regions and other information that may be important in determining a suitable remediation process or in sizing equipment to accomplish the restoration

  6. Normal tissue complication probability for salivary glands

    International Nuclear Information System (INIS)

    Rana, B.S.

    2008-01-01

    The purpose of radiotherapy is to strike a favourable balance between morbidity (due to side effects of radiation) and cure of malignancy. To achieve this, one needs to know the relation between NTCP (normal tissue complication probability) and the various treatment variables of a schedule, viz. daily dose, duration of treatment, total dose and fractionation, along with tissue conditions. Prospective studies require that a large number of patients be treated with varied schedule parameters and that a statistically acceptable number of patients develop complications, so that a true relation between NTCP and a particular variable is established. In this study, salivary gland complications have been considered. The cases treated on a 60Co teletherapy machine during the period 1994 to 2002 were analyzed, and the clinicians' judgement in ascertaining the end points was the only means of observation. The end points were early and late xerostomia, which were considered for NTCP evaluations over a period of 5 years

  7. Probability mapping of scarred myocardium using texture and intensity features in CMR images

    Science.gov (United States)

    2013-01-01

    Background: The myocardium exhibits a heterogeneous nature due to scarring after Myocardial Infarction (MI). In Cardiac Magnetic Resonance (CMR) imaging, the Late Gadolinium (LG) contrast agent enhances the intensity of the scarred area in the myocardium. Methods: In this paper, we propose a probability mapping technique using texture and intensity features to describe the heterogeneous nature of the scarred myocardium in Cardiac Magnetic Resonance (CMR) images after Myocardial Infarction (MI). Scarred tissue and non-scarred tissue are represented with high and low probabilities, respectively. Intermediate values possibly indicate areas where the scarred and healthy tissues are interwoven. The probability map of scarred myocardium is calculated by using a probability function based on Bayes rule. Any set of features can be used in the probability function. Results: In the present study, we demonstrate the use of two different types of features. One is based on the mean pixel intensity and the other on the underlying texture information of the scarred and non-scarred myocardium. Examples of probability maps computed using the mean pixel intensity and the underlying texture information are presented. We hypothesize that the probability mapping of myocardium offers an alternate visualization, possibly showing details of physiological significance that are difficult to detect visually in the original CMR image. Conclusion: The probability mapping obtained from the two features provides a way to define different cardiac segments and to identify areas in the myocardium of diagnostic importance (such as core and border areas in scarred myocardium). PMID:24053280
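
    A sketch of the Bayes-rule mapping described above, using a single mean-intensity feature with Gaussian class likelihoods estimated from annotated scar and healthy pixels. The feature model, class statistics and prior are assumptions for illustration, not the authors' exact formulation.

```python
# Pixel-wise posterior probability of scar from an intensity feature via Bayes' rule.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
image = rng.normal(100, 30, size=(128, 128))          # placeholder LGE-CMR myocardium intensities

# Training statistics from (hypothetical) annotated regions.
scar_pixels    = rng.normal(160, 20, 500)
healthy_pixels = rng.normal(80, 15, 500)
prior_scar = 0.3                                       # assumed prevalence of scar pixels

like_scar    = norm.pdf(image, scar_pixels.mean(), scar_pixels.std())
like_healthy = norm.pdf(image, healthy_pixels.mean(), healthy_pixels.std())

# Posterior P(scar | intensity); values near 1 = likely scar, near 0 = likely healthy,
# intermediate values = regions where the two tissues may be interwoven.
posterior = like_scar * prior_scar / (like_scar * prior_scar + like_healthy * (1 - prior_scar))
```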

  8. Statistical validation of normal tissue complication probability models

    NARCIS (Netherlands)

    Xu, Cheng-Jian; van der Schaaf, Arjen; van t Veld, Aart; Langendijk, Johannes A.; Schilstra, Cornelis

    2012-01-01

    PURPOSE: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. METHODS AND MATERIALS: A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer.

  9. Probability Maps for the Visualization of Assimilation Ensemble Flow Data

    KAUST Repository

    Hollt, Thomas

    2015-05-25

    Ocean forecasts nowadays are created by running ensemble simulations in combination with data assimilation techniques. Most of these techniques resample the ensemble members after each assimilation cycle. This means that in a time series, after resampling, every member can follow up on any of the members before resampling. Tracking behavior over time, such as all possible paths of a particle in an ensemble vector field, becomes very difficult, as the number of combinations rises exponentially with the number of assimilation cycles. In general a single possible path is not of interest, but only the probabilities that any point in space might be reached by a particle at some point in time. In this work we present an approach using probability-weighted piecewise particle trajectories to allow such a mapping interactively, instead of tracing quadrillions of individual particles. We achieve interactive rates by binning the domain and splitting up the tracing process into the individual assimilation cycles, so that particles that fall into the same bin after a cycle can be treated as a single particle with a larger probability as input for the next time step. As a result we lose the possibility to track individual particles, but can create probability maps for any desired seed at interactive rates.
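
    The core idea, binning the domain after each assimilation cycle and merging particles that land in the same bin into a single particle carrying the summed probability, can be sketched in one dimension. This is a simplified illustration of the binning strategy, not the authors' interactive implementation; the velocities, bins and seed are synthetic.

```python
# Probability-weighted particle tracing with per-cycle binning (1-D sketch).
import numpy as np

rng = np.random.default_rng(2)
n_bins, n_members, n_cycles, dt = 100, 20, 5, 1.0
edges = np.linspace(0.0, 100.0, n_bins + 1)
centers = 0.5 * (edges[:-1] + edges[1:])

# Start with all probability mass at the seed location.
prob = np.zeros(n_bins)
prob[10] = 1.0

for cycle in range(n_cycles):
    # One velocity per ensemble member for this assimilation cycle (synthetic).
    velocities = rng.normal(1.5, 0.5, size=n_members)
    new_prob = np.zeros(n_bins)
    for b in np.nonzero(prob)[0]:
        for v in velocities:
            x_new = centers[b] + v * dt            # advect the bin's representative particle
            idx = np.clip(np.searchsorted(edges, x_new) - 1, 0, n_bins - 1)
            new_prob[idx] += prob[b] / n_members   # each member is equally likely
    prob = new_prob                                 # merged particles: one weight per bin

# 'prob' now approximates the probability that a particle seeded at bin 10 reaches each bin.
```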

  10. Cytoarchitecture, probability maps and functions of the human frontal pole.

    Science.gov (United States)

    Bludau, S; Eickhoff, S B; Mohlberg, H; Caspers, S; Laird, A R; Fox, P T; Schleicher, A; Zilles, K; Amunts, K

    2014-06-01

    The frontal pole has expanded more than any other part of the human brain compared with our ancestors. It plays an important role in specifically human behavior and cognitive abilities, e.g. action selection (Kovach et al., 2012). Evidence of divergent functions of its medial and lateral part has been provided, both in the healthy brain and in psychiatric disorders. The anatomical correlates of such functional segregation, however, are still unknown due to a lack of stereotaxic, microstructural maps obtained in a representative sample of brains. Here we show that the human frontopolar cortex consists of two cytoarchitectonically and functionally distinct areas: lateral frontopolar area 1 (Fp1) and medial frontopolar area 2 (Fp2). Based on observer-independent mapping in serial, cell-body stained sections of 10 brains, three-dimensional, probabilistic maps of areas Fp1 and Fp2 were created. They show, for each position of the reference space, the probability with which each area was found in a particular voxel. Applying these maps as seed regions for a meta-analysis revealed that Fp1 and Fp2 differentially contribute to functional networks: Fp1 was involved in cognition, working memory and perception, whereas Fp2 was part of brain networks underlying affective processing and social cognition. The present study thus disclosed cortical correlates of a functional segregation of the human frontopolar cortex. The probabilistic maps provide a sound anatomical basis for interpreting neuroimaging data in the living human brain, and open new perspectives for analyzing structure-function relationships in the prefrontal cortex. The new data will also serve as a starting point for further comparative studies between human and non-human primate brains, allowing similarities and differences in the organizational principles of the frontal lobe during evolution to be identified as the neurobiological basis for our behavior and cognitive abilities. Copyright © 2013 Elsevier Inc. All rights reserved.

  11. Statistical Validation of Normal Tissue Complication Probability Models

    Energy Technology Data Exchange (ETDEWEB)

    Xu Chengjian, E-mail: c.j.xu@umcg.nl [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schaaf, Arjen van der; Veld, Aart A. van' t; Langendijk, Johannes A. [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schilstra, Cornelis [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Radiotherapy Institute Friesland, Leeuwarden (Netherlands)

    2012-09-01

    Purpose: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. Methods and Materials: A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Results: Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Conclusion: Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use.
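
    A sketch of the validation strategy: nested (double) cross-validation around an L1-penalized (LASSO-type) logistic model, followed by a permutation test of the score. scikit-learn is assumed, and the data are synthetic stand-ins for dose-volume and clinical predictors.

```python
# Nested cross-validation and permutation test for an L1-penalized NTCP-style model.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, cross_val_score, permutation_test_score

X, y = make_classification(n_samples=200, n_features=30, n_informative=5, random_state=0)

lasso_logit = LogisticRegression(penalty="l1", solver="liblinear", max_iter=1000)
inner = GridSearchCV(lasso_logit, {"C": np.logspace(-2, 2, 9)}, cv=5, scoring="roc_auc")

# Outer loop of the double cross-validation: unbiased estimate of predictive performance.
outer_auc = cross_val_score(inner, X, y, cv=5, scoring="roc_auc")
print("nested-CV AUC: %.2f +/- %.2f" % (outer_auc.mean(), outer_auc.std()))

# Permutation test: is the observed score better than chance?
score, perm_scores, p_value = permutation_test_score(
    inner, X, y, cv=5, scoring="roc_auc", n_permutations=100, random_state=0)
print("AUC = %.2f, permutation p = %.3f" % (score, p_value))
```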

  12. Statistical validation of normal tissue complication probability models.

    Science.gov (United States)

    Xu, Cheng-Jian; van der Schaaf, Arjen; Van't Veld, Aart A; Langendijk, Johannes A; Schilstra, Cornelis

    2012-09-01

    To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use. Copyright © 2012 Elsevier Inc. All rights reserved.

  13. Options and pitfalls of normal tissues complication probability models

    International Nuclear Information System (INIS)

    Dorr, Wolfgang

    2011-01-01

    Full text: Technological improvements in the physical administration of radiotherapy have led to increasing conformation of the treatment volume (TV) with the planning target volume (PTV) and of the irradiated volume (IV) with the TV. In this process of improvement of the physical quality of radiotherapy, the total volumes of organs at risk exposed to significant doses have significantly decreased, resulting in increased inhomogeneities in the dose distributions within these organs. This has resulted in a need to identify and quantify volume effects in different normal tissues. Today, irradiated volume must be considered a 6th 'R' of radiotherapy, in addition to the 5 'Rs' defined by Withers and Steel in the mid-to-late 1980s. The current status of knowledge of these volume effects has recently been summarized for many organs and tissues by the QUANTEC (Quantitative Analysis of Normal Tissue Effects in the Clinic) initiative [Int. J. Radiat. Oncol. Biol. Phys. 76 (3) Suppl., 2010]. However, the concept of using dose-volume histogram parameters as a basis for dose constraints, even without applying any models for normal tissue complication probabilities (NTCP), is based on (some) assumptions that are not met in clinical routine treatment planning. First, and most important, dose-volume histogram (DVH) parameters are usually derived from a single, 'snap-shot' CT-scan, without considering physiological (urinary bladder, intestine) or radiation-induced (edema, patient weight loss) changes during radiotherapy. Also, individual variations, or different institutional strategies of delineating organs at risk, are rarely considered. Moreover, the reduction of the 3-dimensional dose distribution into a 2-dimensional DVH parameter implies that the localization of the dose within an organ is irrelevant; there are ample examples that this assumption is not justified. Routinely used dose constraints also do not take into account that the residual function of an organ may be

  14. Normal tissue complication probability (NTCP), the clinician's perspective

    International Nuclear Information System (INIS)

    Yeoh, E.K.

    2011-01-01

    Full text: 3D radiation treatment planning has enabled dose distributions to be related to the volume of normal tissues irradiated. The dose volume histograms thus derived have been utilized to set NTCP dose constraints to facilitate optimization of treatment planning. However, it is not widely appreciated that a number of important variables other than DVHs determine NTCP in the individual patient. These variables will be discussed under the headings of patient-related, treatment-related and tumour-related factors. Patient-related factors include age, co-morbidities such as connective tissue disease and diabetes mellitus, previous tissue/organ damage, tissue architectural organization (parallel or serial), regional tissue/organ and individual tissue/organ radiosensitivities, as well as the development of severe acute toxicity. Treatment-related variables which need to be considered include dose per fraction (if not the conventional 1.80-2.00 Gy/fraction, particularly for IMRT), number of fractions and total dose, dose rate (particularly if combined with brachytherapy) and concurrent chemotherapy or other biological dose modifiers. Tumour-related factors which impact on NTCP include infiltration of normal tissue/organ, usually at presentation, leading to compromised function, but also with recurrent disease after radiation therapy, as well as variable tumour radiosensitivities between and within tumour types. Whilst evaluation of DVH data is a useful guide in the choice of treatment plan, the current state of knowledge requires the clinician to make an educated judgement based on a consideration of the other factors.

  15. Raman spectroscopic biochemical mapping of tissues

    Science.gov (United States)

    Stone, Nicholas; Hart Prieto, Maria C.; Kendall, Catherine A.; Shetty, Geeta; Barr, Hugh

    2006-02-01

    Advances in technologies have brought us closer to routine spectroscopic diagnosis of early malignant disease. However, there is still a poor understanding of the carcinogenesis process. For example it is not known whether many cancers follow a logical sequence from dysplasia, to carcinoma in situ, to invasion. Biochemical tissue changes, triggered by genetic mutations, precede morphological and structural changes. These can be probed using Raman or FTIR microspectroscopy and the spectra analysed for biochemical constituents. Local microscopic distribution of various constituents can then be visualised. Raman mapping has been performed on a number of tissues including oesophagus, breast, bladder and prostate. The biochemical constituents have been calculated at each point using basis spectra and least squares analysis. The residual of the least squares fit indicates any unfit spectral components. The biochemical distribution will be compared with the defined histopathological boundaries. The distribution of nucleic acids, glycogen, actin, collagen I, III, IV, lipids and others appear to follow expected patterns.
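
    The constituent-fitting step, expressing each measured spectrum as a non-negative combination of basis spectra and inspecting the residual of the fit, can be sketched as follows; the basis set and spectrum are synthetic placeholders rather than real Raman data.

```python
# Per-pixel least-squares decomposition of Raman spectra into biochemical basis spectra.
import numpy as np
from scipy.optimize import nnls   # non-negative least squares

rng = np.random.default_rng(3)
n_wavenumbers, n_constituents = 500, 6
basis = np.abs(rng.normal(size=(n_wavenumbers, n_constituents)))   # e.g. DNA, glycogen, collagen ...

# One measured spectrum per map pixel (here a single synthetic example).
true_conc = np.array([0.2, 0.0, 0.5, 0.1, 0.0, 0.2])
spectrum = basis @ true_conc + rng.normal(0, 0.01, n_wavenumbers)

conc, residual_norm = nnls(basis, spectrum)   # fitted constituent loadings
print("estimated concentrations:", np.round(conc, 2))
print("residual (unfitted spectral component):", residual_norm)
```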

  16. Probability of cavitation for single ultrasound pulses applied to tissues and tissue-mimicking materials.

    Science.gov (United States)

    Maxwell, Adam D; Cain, Charles A; Hall, Timothy L; Fowlkes, J Brian; Xu, Zhen

    2013-03-01

    In this study, the negative pressure values at which inertial cavitation consistently occurs in response to a single, two-cycle, focused ultrasound pulse were measured in several media relevant to cavitation-based ultrasound therapy. The pulse was focused into a chamber containing one of the media, which included liquids, tissue-mimicking materials, and ex vivo canine tissue. Focal waveforms were measured by two separate techniques using a fiber-optic hydrophone. Inertial cavitation was identified by high-speed photography in optically transparent media and an acoustic passive cavitation detector. The probability of cavitation (P(cav)) for a single pulse as a function of peak negative pressure (p(-)) followed a sigmoid curve, with the probability approaching one when the pressure amplitude was sufficient. The statistical threshold (defined as P(cav) = 0.5) was between p(-) = 26 and 30 MPa in all samples with high water content but varied between p(-) = 13.7 and >36 MPa in other media. A model for radial cavitation bubble dynamics was employed to evaluate the behavior of cavitation nuclei at these pressure levels. A single bubble nucleus with an inertial cavitation threshold of p(-) = 28.2 MPa was estimated to have a 2.5 nm radius in distilled water. These data may be valuable for cavitation-based ultrasound therapy to predict the likelihood of cavitation at various pressure levels and dimensions of cavitation-induced lesions in tissue. Copyright © 2013 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.
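
    The reported sigmoid dependence of cavitation probability on peak negative pressure can be modelled by fitting a logistic curve to binary cavitation outcomes. A sketch with synthetic outcomes; the assumed threshold and slope are illustrative, chosen only to lie near the reported range.

```python
# Fit a sigmoid P_cav(p-) to single-pulse cavitation outcomes (synthetic data).
import numpy as np
from scipy.optimize import curve_fit

def sigmoid(p, p50, k):
    """Probability of cavitation vs peak negative pressure (MPa)."""
    return 1.0 / (1.0 + np.exp(-(p - p50) / k))

rng = np.random.default_rng(4)
p_neg = np.repeat(np.arange(20.0, 36.0, 1.0), 30)             # tested pressure levels, MPa
outcome = rng.random(p_neg.size) < sigmoid(p_neg, 28.0, 1.5)   # True = cavitation observed

# Estimate the 50% threshold and slope from the binary observations.
levels = np.unique(p_neg)
frac = np.array([outcome[p_neg == lv].mean() for lv in levels])
(p50, k), _ = curve_fit(sigmoid, levels, frac, p0=[27.0, 2.0])
print("estimated threshold p50 = %.1f MPa, slope = %.2f" % (p50, k))
```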

  17. Joint probability discrimination between stationary tissue and blood velocity signals

    DEFF Research Database (Denmark)

    Schlaikjer, Malene; Jensen, Jørgen Arendt

    2001-01-01

    This study presents a new statistical discriminator between stationary tissue and blood velocity signals. Investigation of the RF-signals reveals that features can be derived that distinguish the segments of the signal which do and do not carry information on the blood flow. In this study four features have been determined, including: (a) the energy content in the segments before and after echo-canceling, and (b) the amplitude variations between samples in consecutive RF-signals before and after echo-canceling. The statistical discriminator was obtained by computing the probability density functions (PDFs) for each feature through histogram analysis of data. The discrimination is performed by determining the joint probability of the features for the segment under investigation and choosing the segment type that is most likely. The method was tested on simulated data resembling RF-signals from the carotid artery.
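
    A sketch of the histogram-based joint-probability discriminator: per-feature PDFs are estimated for tissue and blood segments from training data, and a new segment is assigned to the class with the higher joint probability. The two synthetic features and the independence assumption across features are simplifications for illustration.

```python
# Joint-probability discrimination between stationary-tissue and blood segments.
import numpy as np

rng = np.random.default_rng(5)
# Two features per segment, e.g. (a) energy after echo-canceling, (b) amplitude variation.
tissue_train = rng.normal([1.0, 0.5], [0.2, 0.1], size=(1000, 2))
blood_train  = rng.normal([2.0, 1.5], [0.4, 0.3], size=(1000, 2))

bins = [np.linspace(0, 4, 41), np.linspace(0, 3, 31)]

def feature_pdfs(samples):
    """Histogram-based PDF estimate for each feature."""
    return [np.histogram(samples[:, i], bins=bins[i], density=True)[0] for i in range(2)]

pdf_tissue, pdf_blood = feature_pdfs(tissue_train), feature_pdfs(blood_train)

def joint_prob(pdfs, x):
    """Product of per-feature densities at the feature values of one segment."""
    prob = 1.0
    for i, v in enumerate(x):
        idx = np.clip(np.searchsorted(bins[i], v) - 1, 0, len(pdfs[i]) - 1)
        prob *= max(pdfs[i][idx], 1e-12)   # avoid empty bins zeroing the product
    return prob

segment = np.array([1.8, 1.3])             # features of the segment under investigation
label = "blood" if joint_prob(pdf_blood, segment) > joint_prob(pdf_tissue, segment) else "tissue"
print(label)
```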

  18. Probability

    CERN Document Server

    Shiryaev, A N

    1996-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, ergodic theory, weak convergence of probability measures, stationary stochastic processes, and the Kalman-Bucy filter. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for self-study. This new edition contains substantial revisions and updated references. The reader will find a deeper study of topics such as the distance between probability measures, metrization of weak convergence, and contiguity of probability measures. Proofs for a number of important results which were merely stated in the first edition have been added. The author has included new material on the probability of large deviations and on the central limit theorem for sums of dependent random variables.

  19. Linear-fitting-based similarity coefficient map for tissue dissimilarity analysis in T2*-weighted magnetic resonance imaging

    International Nuclear Information System (INIS)

    Yu Shao-De; Wu Shi-Bin; Xie Yao-Qin; Wang Hao-Yu; Wei Xin-Hua; Chen Xin; Pan Wan-Long; Hu Jiani

    2015-01-01

    Similarity coefficient mapping (SCM) aims to improve the morphological evaluation of T2*-weighted magnetic resonance imaging. However, how to interpret the generated SCM map remains an open question. Moreover, is it possible to extract tissue dissimilarity information based on the theory behind SCM? The primary purpose of this paper is to address these two questions. First, the theory of SCM was interpreted from the perspective of linear fitting. Then, a term was embedded for tissue dissimilarity information. Finally, our method was validated with sixteen human brain image series from multi-echo T2*-weighted imaging. Generated maps were investigated in terms of signal-to-noise ratio (SNR) and perceived visual quality, and then interpreted in terms of intra- and inter-tissue intensity. Experimental results show that both the perceptibility of anatomical structures and the tissue contrast are improved. More importantly, tissue similarity or dissimilarity can be quantified and cross-validated from pixel intensity analysis. This method benefits image enhancement, tissue classification, malformation detection and morphological evaluation. (paper)

  20. Compositional cokriging for mapping the probability risk of groundwater contamination by nitrates.

    Science.gov (United States)

    Pardo-Igúzquiza, Eulogio; Chica-Olmo, Mario; Luque-Espinar, Juan A; Rodríguez-Galiano, Víctor

    2015-11-01

    Contamination by nitrates is an important cause of groundwater pollution and represents a potential risk to human health. Management decisions must be made using probability maps that assess the potential of the nitrate concentration to exceed regulatory thresholds. However, these maps are obtained with only a small number of sparse monitoring locations where the nitrate concentrations have been measured. It is therefore of great interest to have an efficient methodology for obtaining those probability maps. In this paper, we make use of the fact that the discrete probability density function is a compositional variable. The spatial discrete probability density function is estimated by compositional cokriging. There are several advantages in using this approach: (i) problems of classical indicator cokriging, like estimates outside the interval (0,1) and order-relation violations, are avoided; (ii) secondary variables (e.g. aquifer parameters) can be included in the estimation of the probability maps; (iii) uncertainty maps of the probability maps can be obtained; (iv) finally, there are modelling advantages, because the variograms and cross-variograms are those of real variables and do not have the restrictions of indicator variograms and indicator cross-variograms. The methodology was applied to the Vega de Granada aquifer in Southern Spain and the advantages of the compositional cokriging approach were demonstrated. Copyright © 2015 Elsevier B.V. All rights reserved.
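
    The compositional step, moving a discrete probability vector off the simplex before spatial interpolation, can be illustrated with an isometric log-ratio (ilr) transform. The sketch below only shows the forward and inverse transform of one composition; in the paper each coordinate field would then be interpolated by cokriging, which is not reproduced here.

```python
# ilr transform of a discrete probability vector (a composition) and its inverse.
import numpy as np

def pivot_basis(D):
    """Contrast matrix of pivot (ilr) coordinates; rows are orthonormal in clr space."""
    V = np.zeros((D - 1, D))
    for i in range(D - 1):
        a = np.sqrt((D - i - 1) / (D - i))
        V[i, i] = a
        V[i, i + 1:] = -a / (D - i - 1)
    return V

def ilr(x):
    """Map a composition (positive parts summing to 1) to unconstrained coordinates."""
    x = np.asarray(x, dtype=float)
    clr = np.log(x) - np.log(x).mean()
    return pivot_basis(x.size) @ clr

def ilr_inv(z):
    """Back-transform ilr coordinates to a composition on the simplex."""
    D = z.size + 1
    y = pivot_basis(D).T @ z
    x = np.exp(y)
    return x / x.sum()

p = np.array([0.7, 0.2, 0.1])          # e.g. P(below), P(near), P(above a threshold)
z = ilr(p)                              # coordinates suitable for (co)kriging
print(z, np.allclose(ilr_inv(z), p))    # back-transform recovers the composition
```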

  1. Linking retinotopic fMRI mapping and anatomical probability maps of human occipital areas V1 and V2.

    Science.gov (United States)

    Wohlschläger, A M; Specht, K; Lie, C; Mohlberg, H; Wohlschläger, A; Bente, K; Pietrzyk, U; Stöcker, T; Zilles, K; Amunts, K; Fink, G R

    2005-05-15

    Using functional MRI, we characterized field sign maps of the occipital cortex and created three-dimensional maps of these areas. By averaging the individual maps into group maps, probability maps of functionally defined V1 or V2 were determined and compared to anatomical probability maps of Brodmann areas BA17 and BA18 derived from cytoarchitectonic analysis (Amunts, K., Malikovic, A., Mohlberg, H., Schormann, T., Zilles, K., 2000. Brodmann's areas 17 and 18 brought into stereotaxic space-where and how variable? NeuroImage 11, 66-84). Comparison of areas BA17/V1 and BA18/V2 revealed good agreement of the anatomical and functional probability maps. Taking into account that our functional stimulation (due to constraints of the visual angle of stimulation achievable in the MR scanner) only identified parts of V1 and V2, for statistical evaluation of the spatial correlation of V1 and BA17, or V2 and BA18, respectively, the a priori measure kappa was calculated testing the hypothesis that a region can only be part of functionally defined V1 or V2 if it is also in anatomically defined BA17 or BA18, respectively. kappa = 1 means the hypothesis is fully true, kappa = 0 means functionally and anatomically defined visual areas are independent. When applying this measure to the probability maps, kappa was equal to 0.84 for both V1/BA17 and V2/BA18. The data thus show a good correspondence of functionally and anatomically derived segregations of early visual processing areas and serve as a basis for employing anatomical probability maps of V1 and V2 in group analyses to characterize functional activations of early visual processing areas.

  2. Impact of statistical learning methods on the predictive power of multivariate normal tissue complication probability models

    NARCIS (Netherlands)

    Xu, Cheng-Jian; van der Schaaf, Arjen; Schilstra, Cornelis; Langendijk, Johannes A.; van t Veld, Aart A.

    2012-01-01

    PURPOSE: To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. METHODS AND MATERIALS: In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator (LASSO), and Bayesian model averaging (BMA), were used to build NTCP models of xerostomia following radiotherapy treatment for head and neck cancer.

  3. Taking potential probability function maps to the local scale and matching them with land use maps

    Science.gov (United States)

    Garg, Saryu; Sinha, Vinayak; Sinha, Baerbel

    2013-04-01

    Source-receptor models have been developed using different methods. Residence-time weighted concentration back trajectory analysis and Potential Source Contribution Function (PSCF) are the two most popular techniques for identification of potential sources of a substance in a defined geographical area. Both techniques use back trajectories calculated using global models and assign values of probability/concentration to various locations in an area. These values represent the probability of threshold exceedances / the average concentration measured at the receptor in air masses with a certain residence time over a source area. Both techniques, however, have only been applied to regional and long-range transport phenomena due to inherent limitations with respect to both the spatial accuracy and the temporal resolution of the back trajectory calculations. Employing the above-mentioned concepts of residence-time weighted concentration back-trajectory analysis and PSCF, we developed a source-receptor model capable of identifying local and regional sources of air pollutants like Particulate Matter (PM), NOx, SO2 and VOCs. We use 1 to 30 minute averages of concentration values and wind direction and speed from a single receptor site or from multiple receptor sites to trace the air mass back in time. The model code assumes all the atmospheric transport to be Lagrangian and linearly extrapolates air masses reaching the receptor location backwards in time for a fixed number of steps. We restrict the model run to the lifetime of the chemical species under consideration. For long lived species the model run is limited to 180 trees/gridsquare); moderate concentrations for agricultural lands with low tree density (1.5-2.5 ppbv for 250 μg/m3 for traffic hotspots in Chandigarh City are observed. Based on the validation against the land use maps, the model appears to do an excellent job in source apportionment and identifying emission hotspots. Acknowledgement: We thank the IISER
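
    The back-extrapolation step, tracing an air mass backwards from the receptor in fixed time steps under the Lagrangian assumption using measured wind speed and direction, can be sketched as below. The receptor coordinates and wind records are placeholders, and the simple linear stepping is an illustration of the idea rather than the authors' code.

```python
# Linear back-extrapolation of an air mass from a receptor using wind speed/direction records.
import numpy as np

# Wind records at the receptor (most recent first): speed in m/s, direction in degrees
# (meteorological convention: the direction the wind blows FROM).
wind_speed = np.array([2.0, 2.5, 1.8, 3.0])        # placeholder 10-min averages
wind_dir   = np.array([270.0, 260.0, 255.0, 250.0])
dt = 600.0                                          # seconds per record

x, y = 0.0, 0.0                                     # receptor location (metres, local grid)
trajectory = [(x, y)]
for spd, wdir in zip(wind_speed, wind_dir):
    theta = np.deg2rad(wdir)
    # Step backwards: the sampled air came from the upwind side of the receptor.
    x += spd * dt * np.sin(theta)
    y += spd * dt * np.cos(theta)
    trajectory.append((x, y))

print(trajectory)   # successive upwind positions of the sampled air mass
```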

  4. Cancerous tissue mapping from random lasing emission spectra

    International Nuclear Information System (INIS)

    Polson, R C; Vardeny, Z V

    2010-01-01

    Random lasing emission spectra have been collected from both healthy and cancerous tissues. The two types of tissue with optical gain have different light scattering properties as obtained from an average power Fourier transform of their random lasing emission spectra. The difference in the power Fourier transform leads to a contrast between cancerous and benign tissues, which is utilized for tissue mapping of healthy and cancerous regions of patients

  5. Inferring the most probable maps of underground utilities using Bayesian mapping model

    Science.gov (United States)

    Bilal, Muhammad; Khan, Wasiq; Muggleton, Jennifer; Rustighi, Emiliano; Jenks, Hugo; Pennock, Steve R.; Atkins, Phil R.; Cohn, Anthony

    2018-03-01

    Mapping the Underworld (MTU), a major initiative in the UK, is focused on addressing the social, environmental and economic consequences arising from the inability to locate buried underground utilities (such as pipes and cables) by developing a multi-sensor mobile device. The aim of the MTU device is to locate different types of buried assets in real time with the use of automated data processing techniques and statutory records. The statutory records, even though typically inaccurate and incomplete, provide useful prior information on what is buried under the ground and where. However, the integration of information from multiple sensors (raw data) with these qualitative maps and their visualization is challenging and requires the implementation of robust machine learning/data fusion approaches. In this paper, an approach for the automated creation of revised maps was developed as a Bayesian mapping model, integrating the knowledge extracted from the sensors' raw data with the available statutory records. The statutory records were combined with the hypotheses from the sensors to provide an initial estimate of what might be found underground and roughly where. The maps were (re)constructed using automated image segmentation techniques for hypothesis extraction and Bayesian classification techniques for segment-manhole connections. The model, consisting of an image segmentation algorithm and various Bayesian classification techniques (segment recognition and the expectation maximization (EM) algorithm), provided robust performance on various simulated as well as real sites in terms of predicting linear/non-linear segments and constructing refined 2D/3D maps.

  6. Tissue-based map of the human proteome

    DEFF Research Database (Denmark)

    Uhlén, Mathias; Fagerberg, Linn; Hallström, Björn M.

    2015-01-01

    Resolving the molecular details of proteome variation in the different tissues and organs of the human body will greatly increase our knowledge of human biology and disease. Here, we present a map of the human tissue proteome based on an integrated omics approach that involves quantitative transcriptomics...

  7. A probable risk factor of female breast cancer: study on benign and malignant breast tissue samples.

    Science.gov (United States)

    Rehman, Sohaila; Husnain, Syed M

    2014-01-01

    The study reports enhanced Fe, Cu, and Zn contents in breast tissues, a probable risk factor of breast cancer in females. Forty-one formalin-fixed breast tissues were analyzed using atomic absorption spectrophotometry. Twenty malignant, six adjacent-to-malignant and 15 benign tissue samples were investigated. The malignant tissue samples were of grade II and of the invasive ductal carcinoma type. The quantitative comparison between the elemental levels measured in the two types of specimen (benign and malignant) tissues (removed after surgery) suggests significant elevation of these metals (Fe, Cu, and Zn) in the malignant tissue. The specimens were collected just after mastectomy from women aged 19 to 59 years from the hospitals of Islamabad and Rawalpindi, Pakistan. Most of the patients belong to urban areas of Pakistan. Findings of the study indicate that these elements have a promising role in the initiation and development of carcinoma, as a consistent pattern of elevation for Fe, Cu, and Zn was observed. The results showed the excessive accumulation of Fe (229 ± 121 mg/L) in malignant breast tissue samples of patients (p factor of breast cancer. In order to validate our method of analysis, the certified reference material muscle tissue lyophilized (IAEA) MA-M-2/TM was analyzed for the metals studied. Determined concentrations were in good agreement with certified levels. An asymmetric concentration distribution for Fe, Cu, and Zn was observed in both malignant and benign tissue samples.

  8. Spatial vent opening probability map of El Hierro Island (Canary Islands, Spain)

    Science.gov (United States)

    Becerril, Laura; Cappello, Annalisa; Galindo, Inés; Neri, Marco; Del Negro, Ciro

    2013-04-01

    The assessment of the probable spatial distribution of new eruptions is useful to manage and reduce volcanic risk. It can be achieved in different ways, but it becomes especially hard when dealing with volcanic areas that are less studied, poorly monitored and characterized by infrequent activity, such as El Hierro. Even though it is the youngest of the Canary Islands, before the 2011 eruption in the "Las Calmas Sea", El Hierro had been the least studied volcanic island of the Canaries, with historical attention devoted mostly to La Palma, Tenerife and Lanzarote. We propose a probabilistic method to build the susceptibility map of El Hierro, i.e. the spatial distribution of vent opening for future eruptions, based on the mathematical analysis of the volcano-structural data collected mostly on the Island and, secondly, on the submerged part of the volcano, up to a distance of ~10-20 km from the coast. The volcano-structural data were collected through new fieldwork measurements, bathymetric information, and analysis of geological maps, orthophotos and aerial photographs. They were divided into different datasets and converted into separate and weighted probability density functions, which were then included in a non-homogeneous Poisson process to produce the volcanic susceptibility map. Future eruptive activity on El Hierro is expected to concentrate mainly on the rift zones, extending also beyond the shoreline. The highest probabilities of hosting new eruptions are located on the distal parts of the South and West rifts, with the maximum reached in the south-western area of the West rift. High probabilities are also observed in the Northeast and South rifts, and in the submarine parts of the rifts. This map represents the first effort to deal with the volcanic hazard at El Hierro and can be a support tool for decision makers in land planning, emergency plans and civil defence actions.

  9. Calculation of normal tissue complication probability and dose-volume histogram reduction schemes for tissues with a critical element architecture

    International Nuclear Information System (INIS)

    Niemierko, Andrzej; Goitein, Michael

    1991-01-01

    The authors investigate a model of normal tissue complication probability for tissues that may be represented by a critical element architecture. They derive formulas for complication probability that apply both to a partial volume irradiation and to an arbitrary inhomogeneous dose distribution. The dose-volume isoeffect relationship which is a consequence of a critical element architecture is discussed and compared to the empirical power law relationship. A dose-volume histogram reduction scheme for a 'pure' critical element model is derived. In addition, a point-based algorithm which does not require precomputation of a dose-volume histogram is derived. The existing published dose-volume histogram reduction algorithms are analyzed. The authors show that the existing algorithms, developed empirically without an explicit biophysical model, have a close relationship to the critical element model at low levels of complication probability. However, it is also shown that they have aspects which are not compatible with a critical element model, and the authors propose a modification to one of them to circumvent its restriction to low complication probabilities. (author). 26 refs.; 7 figs
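
    One closed form commonly associated with a critical element architecture computes the organ NTCP from a differential DVH by combining independent subvolume complication probabilities. The sketch below uses that form; the single-element dose-response function is an illustrative assumption, not the authors' parameterization.

```python
# Critical-element NTCP for an inhomogeneous dose distribution (differential DVH).
import numpy as np

def p_elem(dose, d50=60.0, gamma=2.0):
    """Illustrative complication probability of a single tissue element vs dose (Gy)."""
    return 1.0 / (1.0 + (d50 / np.maximum(dose, 1e-6)) ** (4.0 * gamma))

def ntcp_critical_element(doses, frac_volumes):
    """NTCP = 1 - prod_i (1 - p(D_i))**v_i for fractional volumes v_i receiving dose D_i."""
    return 1.0 - np.prod((1.0 - p_elem(doses)) ** frac_volumes)

# Differential DVH: 40% of the organ at 20 Gy, 40% at 45 Gy, 20% at 65 Gy (illustrative).
doses = np.array([20.0, 45.0, 65.0])
vols  = np.array([0.4, 0.4, 0.2])
print("NTCP = %.3f" % ntcp_critical_element(doses, vols))
```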

  10. Ensemble based system for whole-slide prostate cancer probability mapping using color texture features.

    LENUS (Irish Health Repository)

    DiFranco, Matthew D

    2011-01-01

    We present a tile-based approach for producing clinically relevant probability maps of prostatic carcinoma in histological sections from radical prostatectomy. Our methodology incorporates ensemble learning for feature selection and classification on expert-annotated images. Random forest feature selection performed over varying training sets provides a subset of generalized CIEL*a*b* co-occurrence texture features, while sample selection strategies with minimal constraints reduce training data requirements to achieve reliable results. Ensembles of classifiers are built using expert-annotated tiles from training images, and scores for the probability of cancer presence are calculated from the responses of each classifier in the ensemble. Spatial filtering of tile-based texture features prior to classification results in increased heat-map coherence as well as AUC values of 95% using ensembles of either random forests or support vector machines. Our approach is designed for adaptation to different imaging modalities, image features, and histological decision domains.
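
    A sketch of the tile-based ensemble workflow: a classifier is trained on features from expert-annotated tiles and its per-tile probability scores are reassembled into a whole-slide heat map. The features and tiles are synthetic, a single random forest stands in for the full ensemble strategy, and scikit-learn is assumed.

```python
# Tile-based cancer-probability heat map with a random forest (synthetic features).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(6)
n_train, n_features = 500, 20
X_train = rng.normal(size=(n_train, n_features))       # e.g. CIEL*a*b* co-occurrence texture features
y_train = (X_train[:, 0] + 0.5 * X_train[:, 1] + rng.normal(0, 0.5, n_train)) > 0  # expert labels

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

# Whole-slide grid of tiles: predict the probability of cancer for each tile.
grid_h, grid_w = 30, 40
X_slide = rng.normal(size=(grid_h * grid_w, n_features))
heat_map = clf.predict_proba(X_slide)[:, 1].reshape(grid_h, grid_w)
print(heat_map.shape, heat_map.min(), heat_map.max())
```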

  11. Mapping fire probability and severity in a Mediterranean area using different weather and fuel moisture scenarios

    Science.gov (United States)

    Arca, B.; Salis, M.; Bacciu, V.; Duce, P.; Pellizzaro, G.; Ventura, A.; Spano, D.

    2009-04-01

    Although in many countries lightning is the main cause of ignition, in the Mediterranean Basin forest fires are predominantly ignited by arson or by human negligence. The fire season peaks coincide with extreme weather conditions (mainly strong winds, hot temperatures, low atmospheric water vapour content) and high tourist presence. Many studies have reported that in the Mediterranean Basin the projected impacts of climate change will cause greater weather variability and more extreme weather conditions, with drier and hotter summers and heat waves. At the long-term scale, climate changes could affect the fuel load and the dead/live fuel ratio, and therefore could change the vegetation flammability. At the short-term scale, the increase of extreme weather events could directly affect fuel water status, and it could increase large fire occurrence. In this context, detecting the areas characterized by both high probability of large fire occurrence and high fire severity could represent an important component of fire management planning. In this work we compared several fire probability and severity maps (fire occurrence, rate of spread, fireline intensity, flame length) obtained for a study area located in North Sardinia, Italy, using the FlamMap simulator (USDA Forest Service, Missoula). FlamMap computes the potential fire behaviour characteristics over a defined landscape for given weather, wind and fuel moisture data. Different weather and fuel moisture scenarios were tested to predict the potential impact of climate changes on fire parameters. The study area, characterized by a mosaic of urban areas, protected areas, and other areas subject to anthropogenic disturbances, is mainly composed of fire-prone Mediterranean maquis. The input themes needed to run FlamMap were provided as grids with a 10 m resolution; the wind data, obtained using a computational fluid-dynamic model, were inserted as a gridded file with a resolution of 50 m. The analysis revealed high fire probability and severity in

  12. Radiation risk of tissue late effects, a net consequence of probabilities of various cellular responses

    International Nuclear Information System (INIS)

    Feinendegen, L.E.

    1991-01-01

    Late effects from the exposure to low doses of ionizing radiation are hardly or not at all observed in man, mainly due to the low values of the risk coefficients, which preclude statistical analyses of data from populations that are exposed to doses of less than 0.2 Gy. In order to arrive at an assessment of the potential risk from radiation exposure in the low-dose range, the microdosimetry approach is essential. In the low-dose range, ionizing radiation generates particle tracks, mainly electrons, which are distributed rather heterogeneously within the exposed tissue. Taking the individual cell as the elemental unit of life, observations and calculations of cellular responses to being hit by energy deposition events of the low-LET type are analysed. It emerges that, besides the probability of a hit cell sustaining a detrimental effect with the consequence of malignant transformation, there are probabilities of various adaptive responses that equip the hit cell with a benefit. On the one hand, an improvement of cellular radical detoxification was observed in mouse bone marrow cells; another adaptive response, pertaining to improved DNA repair, was reported for human lymphocytes. The improved radical detoxification in mouse bone marrow cells lasts for a period of 5-10 hours, and the improved DNA repair in human lymphocytes was seen for some 60 hours following acute irradiation. It is speculated that improved radical detoxification and improved DNA repair may reduce the probability of spontaneous carcinogenesis. Thus it is proposed to weigh the probability of detriment for a hit cell within a multicellular system against the probability of benefit through adaptive responses in other hit cells in the same system per radiation exposure. In doing this, the net effect of low doses of low-LET radiation in tissue with individual cells being hit by energy deposition events could be zero or even beneficial. (orig./MG)

  13. Identification of land degradation evidences in an organic farm using probability maps (Croatia)

    Science.gov (United States)

    Pereira, Paulo; Bogunovic, Igor; Estebaranz, Ferran

    2017-04-01

    Land degradation is a biophysical process with important impacts on society, economy and policy. Areas affected by land degradation do not provide services of sufficient quality and capacity to fulfil the needs of the communities that depend on them (Amaya-Romero et al., 2015; Beyene, 2015; Lanckriet et al., 2015). Agricultural activities are one of the main causes of land degradation (Kraaijvanger and Veldkamp, 2015), especially when they decrease soil organic matter (SOM), a crucial element for soil fertility. In temperate areas, the critical level of SOM concentration in agricultural soils is 3.4%. Below this level there is a potential decrease of soil quality (Loveland and Webb, 2003). However, no previous work has been carried out in other environments, such as the Mediterranean. It is important to identify and map the spatial distribution of potentially degraded land in order to identify the areas that need restoration (Brevik et al., 2016; Pereira et al., 2017). The aim of this work is to assess the spatial distribution of areas with evidence of land degradation (SOM below 3.4%) using probability maps in an organic farm located in Croatia. In order to find the best method, we compared several probability methods: Ordinary Kriging (OK), Simple Kriging (SK), Universal Kriging (UK), Indicator Kriging (IK), Probability Kriging (PK) and Disjunctive Kriging (DK). The study area is located on the Istria peninsula (45°3' N; 14°2' E), with a total area of 182 ha. One hundred eighty-two soil samples (0-30 cm) were collected during July of 2015 and SOM was assessed using the wet combustion procedure. The assessment of the best probability method was carried out using the leave-one-out cross-validation method. The probability method with the lowest Root Mean Squared Error (RMSE) was the most accurate. The results showed that the best method to predict the probability of potential land degradation was SK with an RMSE of 0.635, followed by DK (RMSE=0.636), UK (RMSE=0.660), OK (RMSE

  14. Improving normal tissue complication probability models: the need to adopt a "data-pooling" culture.

    Science.gov (United States)

    Deasy, Joseph O; Bentzen, Søren M; Jackson, Andrew; Ten Haken, Randall K; Yorke, Ellen D; Constine, Louis S; Sharma, Ashish; Marks, Lawrence B

    2010-03-01

    Clinical studies of the dependence of normal tissue response on dose-volume factors are often confusingly inconsistent, as the QUANTEC reviews demonstrate. A key opportunity to accelerate progress is to begin storing high-quality datasets in repositories. Using available technology, multiple repositories could be conveniently queried, without divulging protected health information, to identify relevant sources of data for further analysis. After obtaining institutional approvals, data could then be pooled, greatly enhancing the capability to construct predictive models that are more widely applicable and better powered to accurately identify key predictive factors (whether dosimetric, image-based, clinical, socioeconomic, or biological). Data pooling has already been carried out effectively in a few normal tissue complication probability studies and should become a common strategy. Copyright 2010 Elsevier Inc. All rights reserved.

  15. Expression cartography of human tissues using self organizing maps

    Directory of Open Access Journals (Sweden)

    Löffler Markus

    2011-07-01

    Background: Parallel high-throughput microarray and sequencing experiments produce vast quantities of multidimensional data which must be arranged and analyzed in a concerted way. One approach to addressing this challenge is the machine learning technique known as self organizing maps (SOMs). SOMs enable a parallel sample- and gene-centered view of genomic data combined with strong visualization and second-level analysis capabilities. The paper aims at bridging the gap between the potency of SOM machine learning to reduce the dimension of high-dimensional data on the one hand and practical applications with special emphasis on gene expression analysis on the other hand. Results: The method was applied to generate a SOM characterizing the whole genome expression profiles of 67 healthy human tissues selected from ten tissue categories (adipose, endocrine, homeostasis, digestion, exocrine, epithelium, sexual reproduction, muscle, immune system and nervous tissues). SOM mapping reduces the dimension of expression data from tens of thousands of genes to a few thousand metagenes, each representing a minicluster of co-regulated single genes. Tissue-specific and common properties shared between groups of tissues emerge as a handful of localized spots in the tissue maps collecting groups of co-regulated and co-expressed metagenes. The functional context of the spots was discovered using overrepresentation analysis with respect to pre-defined gene sets of known functional impact. We found that tissue-related spots typically contain enriched populations of genes related to specific molecular processes in the respective tissue. Analysis techniques normally used at the gene level, such as two-way hierarchical clustering, are better represented and provide better signal-to-noise ratios if applied to the metagenes. Metagene-based clustering analyses aggregate the tissues broadly into three clusters containing nervous, immune system and the remaining tissues
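
    A minimal SOM sketch in the same spirit: high-dimensional expression profiles are mapped onto a small 2-D grid whose nodes act as metagenes. It assumes the third-party minisom package and uses random data in place of the tissue expression matrix.

```python
# Self-organizing map of expression profiles onto a grid of "metagenes" (synthetic data).
import numpy as np
from minisom import MiniSom   # assumes the 'minisom' package is installed

rng = np.random.default_rng(7)
n_genes, n_tissues = 2000, 67
expression = rng.normal(size=(n_genes, n_tissues))   # rows: genes, columns: tissue profiles

som = MiniSom(x=30, y=30, input_len=n_tissues, sigma=1.5, learning_rate=0.5, random_seed=0)
som.train_random(expression, num_iteration=10000)    # each SOM node becomes a metagene

# Assign every gene to its best-matching node; genes sharing a node form a minicluster.
node_of_gene = [som.winner(g) for g in expression]
print(len(set(node_of_gene)), "occupied metagene nodes")
```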

  16. Can Probability Maps of Swept-Source Optical Coherence Tomography Predict Visual Field Changes in Preperimetric Glaucoma?

    Science.gov (United States)

    Lee, Won June; Kim, Young Kook; Jeoung, Jin Wook; Park, Ki Ho

    2017-12-01

    To determine the usefulness of swept-source optical coherence tomography (SS-OCT) probability maps in detecting locations with significant reduction in visual field (VF) sensitivity or predicting future VF changes in patients with classically defined preperimetric glaucoma (PPG). Forty-three eyes of 43 PPG patients, followed up every 6 months for at least 2 years, were analyzed in this longitudinal study. The patients underwent wide-field SS-OCT scanning and standard automated perimetry (SAP) at the time of enrollment. With this wide-scan protocol, probability maps originating from the corresponding thickness map and overlapping with SAP VF test points could be generated. We evaluated the vulnerable VF points with SS-OCT probability maps as well as the prevalence of locations with significant VF reduction or subsequent VF changes observed in the corresponding damaged areas of the probability maps. The vulnerable VF points were distributed in superior and inferior arcuate patterns near the central fixation. In 19 of 43 PPG eyes (44.2%), significant reduction in baseline VF was detected within the areas of structural change on the SS-OCT probability maps. In 16 of 43 PPG eyes (37.2%), subsequent VF changes within the areas of SS-OCT probability map change were observed over the course of the follow-up. Structural changes on SS-OCT probability maps could detect or predict VF changes on SAP in a considerable number of PPG eyes. Careful comparison of probability maps with SAP results could be useful in diagnosing and monitoring PPG patients in the clinical setting.

  17. Hypothyroidism after primary radiotherapy for head and neck squamous cell carcinoma: Normal tissue complication probability modeling with latent time correction

    DEFF Research Database (Denmark)

    Rønjom, Marianne Feen; Brink, Carsten; Bentzen, Søren

    2013-01-01

    To develop a normal tissue complication probability (NTCP) model of radiation-induced biochemical hypothyroidism (HT) after primary radiotherapy for head and neck squamous cell carcinoma (HNSCC) with adjustment for latency and clinical risk factors.

  18. Impact of Statistical Learning Methods on the Predictive Power of Multivariate Normal Tissue Complication Probability Models

    Energy Technology Data Exchange (ETDEWEB)

    Xu Chengjian, E-mail: c.j.xu@umcg.nl [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schaaf, Arjen van der; Schilstra, Cornelis; Langendijk, Johannes A.; Veld, Aart A. van' t [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands)

    2012-03-15

    Purpose: To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. Methods and Materials: In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator (LASSO), and Bayesian model averaging (BMA), were used to build NTCP models of xerostomia following radiotherapy treatment for head and neck cancer. Performance of each learning method was evaluated by a repeated cross-validation scheme in order to obtain a fair comparison among methods. Results: It was found that the LASSO and BMA methods produced models with significantly better predictive power than that of the stepwise selection method. Furthermore, the LASSO method yields an easily interpretable model as the stepwise method does, in contrast to the less intuitive BMA method. Conclusions: The commonly used stepwise selection method, which is simple to execute, may be insufficient for NTCP modeling. The LASSO method is recommended.
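
    As a rough, non-authoritative illustration of L1-penalized (LASSO-type) logistic NTCP modeling with cross-validated evaluation, the sketch below uses scikit-learn on synthetic data; the predictors, outcome and parameter choices are assumptions, not the study's data or code.

```python
import numpy as np
from sklearn.linear_model import LogisticRegressionCV
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 200
# hypothetical candidate predictors (e.g. mean parotid dose, V30, age, baseline score ...)
X = rng.normal(size=(n, 10))
logit = 0.8 * X[:, 0] + 0.5 * X[:, 1] - 1.0      # only two truly predictive factors
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))    # toy xerostomia yes/no outcome

# L1 (LASSO-type) penalised logistic regression; the penalty strength is chosen by
# internal cross-validation, and a repeated outer CV scores the resulting model.
lasso_ntcp = LogisticRegressionCV(penalty="l1", solver="liblinear", Cs=20, cv=5)
auc = cross_val_score(lasso_ntcp, X, y, cv=5, scoring="roc_auc")
print("cross-validated AUC: %.2f +/- %.2f" % (auc.mean(), auc.std()))

lasso_ntcp.fit(X, y)
print("non-zero coefficients:", np.flatnonzero(lasso_ntcp.coef_))
```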

  19. Impact of statistical learning methods on the predictive power of multivariate normal tissue complication probability models.

    Science.gov (United States)

    Xu, Cheng-Jian; van der Schaaf, Arjen; Schilstra, Cornelis; Langendijk, Johannes A; van't Veld, Aart A

    2012-03-15

    To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator (LASSO), and Bayesian model averaging (BMA), were used to build NTCP models of xerostomia following radiotherapy treatment for head and neck cancer. Performance of each learning method was evaluated by a repeated cross-validation scheme in order to obtain a fair comparison among methods. It was found that the LASSO and BMA methods produced models with significantly better predictive power than that of the stepwise selection method. Furthermore, the LASSO method yields an easily interpretable model as the stepwise method does, in contrast to the less intuitive BMA method. The commonly used stepwise selection method, which is simple to execute, may be insufficient for NTCP modeling. The LASSO method is recommended. Copyright © 2012 Elsevier Inc. All rights reserved.

  20. Impact of Statistical Learning Methods on the Predictive Power of Multivariate Normal Tissue Complication Probability Models

    International Nuclear Information System (INIS)

    Xu Chengjian; Schaaf, Arjen van der; Schilstra, Cornelis; Langendijk, Johannes A.; Veld, Aart A. van’t

    2012-01-01

    Purpose: To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. Methods and Materials: In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator (LASSO), and Bayesian model averaging (BMA), were used to build NTCP models of xerostomia following radiotherapy treatment for head and neck cancer. Performance of each learning method was evaluated by a repeated cross-validation scheme in order to obtain a fair comparison among methods. Results: It was found that the LASSO and BMA methods produced models with significantly better predictive power than that of the stepwise selection method. Furthermore, the LASSO method yields an easily interpretable model as the stepwise method does, in contrast to the less intuitive BMA method. Conclusions: The commonly used stepwise selection method, which is simple to execute, may be insufficient for NTCP modeling. The LASSO method is recommended.

  1. Normal Tissue Complication Probability Modeling of Acute Hematologic Toxicity in Cervical Cancer Patients Treated With Chemoradiotherapy

    International Nuclear Information System (INIS)

    Rose, Brent S.; Aydogan, Bulent; Liang, Yun; Yeginer, Mete; Hasselle, Michael D.; Dandekar, Virag; Bafana, Rounak; Yashar, Catheryn M.; Mundt, Arno J.; Roeske, John C.; Mell, Loren K.

    2011-01-01

    Purpose: To test the hypothesis that increased pelvic bone marrow (BM) irradiation is associated with increased hematologic toxicity (HT) in cervical cancer patients undergoing chemoradiotherapy and to develop a normal tissue complication probability (NTCP) model for HT. Methods and Materials: We tested associations between hematologic nadirs during chemoradiotherapy and the volume of BM receiving ≥10 and 20 Gy (V10 and V20) using a previously developed linear regression model. The validation cohort consisted of 44 cervical cancer patients treated with concurrent cisplatin and pelvic radiotherapy. Subsequently, these data were pooled with data from 37 identically treated patients from a previous study, forming a cohort of 81 patients for normal tissue complication probability analysis. Generalized linear modeling was used to test associations between hematologic nadirs and dosimetric parameters, adjusting for body mass index. Receiver operating characteristic curves were used to derive optimal dosimetric planning constraints. Results: In the validation cohort, significant negative correlations were observed between white blood cell count nadir and V10 (regression coefficient (β) = -0.060, p = 0.009) and V20 (β = -0.044, p = 0.010). In the combined cohort, the (adjusted) β estimates for log(white blood cell) vs. V10 and V20 were as follows: -0.022 (p = 0.025) and -0.021 (p = 0.002), respectively. Patients with V10 ≥ 95% were more likely to experience Grade ≥3 leukopenia (68.8% vs. 24.6%, p < 0.001), as were patients with V20 > 76% (57.7% vs. 21.8%, p = 0.001). Conclusions: These findings support the hypothesis that HT increases with increasing pelvic BM volume irradiated. Efforts to maintain V10 < 95% and V20 < 76% may reduce HT.
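
    A minimal sketch, assuming a synthetic cumulative DVH and cohort, of how V10/V20 can be interpolated from a DVH and related to blood-count nadirs by a simple linear fit; it is not the authors' model, and all values are illustrative.

```python
import numpy as np

def v_dose(dose_bins, cum_volume_pct, threshold):
    """Percent of organ volume receiving at least `threshold` Gy,
    interpolated from a cumulative DVH."""
    return float(np.interp(threshold, dose_bins, cum_volume_pct))

# toy cumulative DVH of pelvic bone marrow: volume % receiving >= dose
dose_bins = np.linspace(0, 50, 101)
cum_volume = 100 * np.exp(-dose_bins / 18.0)      # synthetic, monotonically decreasing
v10, v20 = v_dose(dose_bins, cum_volume, 10), v_dose(dose_bins, cum_volume, 20)

# linear model of log(WBC nadir) vs V20 across a synthetic cohort
rng = np.random.default_rng(2)
V20 = rng.uniform(50, 95, size=80)
log_wbc = 1.3 - 0.021 * (V20 - V20.mean()) + rng.normal(0, 0.15, size=80)
beta, intercept = np.polyfit(V20, log_wbc, 1)     # slope first, then intercept
print(f"V10={v10:.1f}%, V20={v20:.1f}%, fitted slope beta={beta:.3f}")
```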

  2. Mapping probabilities of extreme continental water storage changes from space gravimetry

    Science.gov (United States)

    Kusche, J.; Eicker, A.; Forootan, E.; Springer, A.; Longuevergne, L.

    2016-12-01

    Using data from the Gravity Recovery and Climate Experiment (GRACE) mission, we derive statistically robust 'hotspot' regions of high probability of peak anomalous - i.e. with respect to the seasonal cycle - water storage (of up to 0.7 m one-in-five-year return level) and flux (up to 0.14 m/mon). Analysis of, and comparison with, up to 32 years of ERA-Interim reanalysis fields reveals generally good agreement of these hotspot regions to GRACE results, and that most exceptions are located in the Tropics. However, a simulation experiment reveals that differences observed by GRACE are statistically significant, and further error analysis suggests that by around the year 2020 it will be possible to detect temporal changes in the frequency of extreme total fluxes (i.e. combined effects of mainly precipitation and floods) for at least 10-20% of the continental area, assuming that we have a continuation of GRACE by its follow-up GRACE-FO. J. Kusche et al. (2016): Mapping probabilities of extreme continental water storage changes from space gravimetry, Geophysical Research Letters, accepted online, doi:10.1002/2016GL069538

  3. Cardiac tissue slices: preparation, handling, and successful optical mapping.

    Science.gov (United States)

    Wang, Ken; Lee, Peter; Mirams, Gary R; Sarathchandra, Padmini; Borg, Thomas K; Gavaghan, David J; Kohl, Peter; Bollensdorff, Christian

    2015-05-01

    Cardiac tissue slices are becoming increasingly popular as a model system for cardiac electrophysiology and pharmacology research and development. Here, we describe in detail the preparation, handling, and optical mapping of transmembrane potential and intracellular free calcium concentration transients (CaT) in ventricular tissue slices from guinea pigs and rabbits. Slices cut in the epicardium-tangential plane contained well-aligned in-slice myocardial cell strands ("fibers") in subepicardial and midmyocardial sections. Cut with a high-precision slow-advancing microtome at a thickness of 350 to 400 μm, tissue slices preserved essential action potential (AP) properties of the precutting Langendorff-perfused heart. We identified the need for a postcutting recovery period of 36 min (guinea pig) and 63 min (rabbit) to reach 97.5% of final steady-state values for AP duration (APD) (identified by exponential fitting). There was no significant difference between the postcutting recovery dynamics in slices obtained using 2,3-butanedione 2-monoxime or blebbistatin as electromechanical uncouplers during the cutting process. A rapid increase in APD, seen after cutting, was caused by exposure to ice-cold solution during the slicing procedure, not by tissue injury, differences in uncouplers, or pH-buffers (bicarbonate; HEPES). To characterize intrinsic patterns of CaT, AP, and conduction, a combination of multipoint and field stimulation should be used to avoid misinterpretation based on source-sink effects. In summary, we describe in detail the preparation, mapping, and data analysis approaches for reproducible cardiac tissue slice-based investigations into AP and CaT dynamics. Copyright © 2015 the American Physiological Society.

  4. Chaos optimization algorithms based on chaotic maps with different probability distribution and search speed for global optimization

    Science.gov (United States)

    Yang, Dixiong; Liu, Zhenjun; Zhou, Jilei

    2014-04-01

    Chaos optimization algorithms (COAs) usually utilize a chaotic map, such as the Logistic map, to generate pseudo-random numbers that are mapped to the design variables for global optimization. Many existing studies have indicated that COAs can more easily escape from local minima than classical stochastic optimization algorithms. This paper reveals the inherent mechanism of high efficiency and superior performance of COA, from a new perspective of both the probability distribution property and search speed of chaotic sequences generated by different chaotic maps. The statistical property and search speed of chaotic sequences are represented by the probability density function (PDF) and the Lyapunov exponent, respectively. Meanwhile, the computational performances of hybrid chaos-BFGS algorithms based on eight one-dimensional chaotic maps with different PDF and Lyapunov exponents are compared, in which BFGS is a quasi-Newton method for local optimization. Moreover, several multimodal benchmark examples illustrate that the probability distribution property and search speed of chaotic sequences from different chaotic maps significantly affect the global searching capability and optimization efficiency of COA. To achieve the high efficiency of COA, it is recommended to adopt the appropriate chaotic map generating the desired chaotic sequences with uniform or nearly uniform probability distribution and large Lyapunov exponent.
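
    A small Python sketch of the quantities discussed above: a chaotic sequence from the Logistic map, its Lyapunov exponent, and a histogram approximating its (non-uniform) invariant probability density. The sequence length, bin count and initial condition are illustrative choices.

```python
import numpy as np

def logistic_sequence(n, mu=4.0, x0=0.3):
    """Chaotic sequence from the Logistic map x_{k+1} = mu * x_k * (1 - x_k)."""
    x = np.empty(n)
    x[0] = x0
    for k in range(n - 1):
        x[k + 1] = mu * x[k] * (1 - x[k])
    return x

def lyapunov_logistic(x, mu=4.0):
    """Lyapunov exponent estimate: mean of log|f'(x_k)| along the orbit."""
    return np.mean(np.log(np.abs(mu * (1 - 2 * x))))

x = logistic_sequence(100_000)
print("Lyapunov exponent ~", round(lyapunov_logistic(x), 3))   # ~ ln 2 = 0.693

# the histogram approximates the non-uniform invariant density 1/(pi*sqrt(x(1-x))),
# i.e. the PDF that the paper contrasts with more uniform chaotic maps
hist, _ = np.histogram(x, bins=10, range=(0, 1), density=True)
print(np.round(hist, 2))
```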

  5. The use of normal tissue complication probability to predict radiation hepatitis

    International Nuclear Information System (INIS)

    Keum, Ki Chang; Seong, Jin Sil; Suh, Chang Ok; Lee, Sang Wook; Chung, Eun Ji; Shin, Hyun Soo; Kim, Gwi Eon

    2000-01-01

    Although it has been known that the tolerance of the liver to external beam irradiation depends on the irradiated volume and dose, few data exist which quantify this dependence. Recently, however, with the development of three-dimensional (3-D) treatment planning, the tools to quantify the relationships between dose, volume, and normal tissue complications have become available. The objective of this study is to investigate the relationships between normal tissue complication probability (NTCP) and the risk of radiation hepatitis for patients who received partial liver irradiation at varying doses. From March 1992 to December 1994, 10 patients with hepatoma and 10 patients with bile duct cancer were included in this study. Eighteen patients had normal hepatic function, but 2 patients (prothrombin time 73%, 68%) had mild liver cirrhosis before irradiation. Radiation therapy was delivered with a 10 MV linear accelerator at 180-200 cGy per fraction per day. The total dose ranged from 3,960 cGy to 6,000 cGy (median dose 5,040 cGy). The normal tissue complication probability was calculated by using Lyman's model. Radiation hepatitis was defined as the development of anicteric elevation of alkaline phosphatase of at least twofold and non-malignant ascites in the absence of documented progressive disease. The calculated NTCP ranged from 0.001 to 0.840 (median 0.05). Three of the 20 patients developed radiation hepatitis. The NTCP values of the patients with radiation hepatitis were 0.390, 0.528, and 0.844 (median: 0.58±0.23), whereas those of the patients without radiation hepatitis ranged from 0.001 to 0.308 (median: 0.09±0.09). When the NTCP was calculated by using the volume factor of 0.32, radiation hepatitis was observed only in patients with an NTCP value of more than 0.39. By contrast, clinical results of evolving radiation hepatitis were not well correlated with the NTCP value calculated when the volume factor of 0.69 was applied. On the basis of these observations, volume factor of 0.32 was more
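
    The Lyman model referred to above can be written as a probit function of dose with a power-law volume dependence, TD50(v) = TD50(1) * v^(-n). The sketch below is a generic implementation with illustrative parameter values, not the values fitted in this study.

```python
from math import erf, sqrt

def lyman_ntcp(dose_gy, v, td50_1=45.0, m=0.15, n=0.32):
    """Lyman NTCP for uniform irradiation of a partial volume fraction `v`.

    td50_1 : dose giving 50% complication probability for whole-organ irradiation
    m      : steepness of the dose-response curve
    n      : volume-effect parameter (values near 1 mean a strong volume effect)
    """
    td50_v = td50_1 * v ** (-n)                 # tolerance dose for the partial volume
    t = (dose_gy - td50_v) / (m * td50_v)
    return 0.5 * (1.0 + erf(t / sqrt(2.0)))     # standard normal CDF

# example: 50.4 Gy to half of the liver vs. the whole liver (illustrative parameters)
print(round(lyman_ntcp(50.4, 0.5), 3), round(lyman_ntcp(50.4, 1.0), 3))
```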

  6. A simple method to calculate the influence of dose inhomogeneity and fractionation in normal tissue complication probability evaluation

    International Nuclear Information System (INIS)

    Begnozzi, L.; Gentile, F.P.; Di Nallo, A.M.; Chiatti, L.; Zicari, C.; Consorti, R.; Benassi, M.

    1994-01-01

    Since volumetric dose distributions are available with 3-dimensional radiotherapy treatment planning they can be used in statistical evaluation of response to radiation. This report presents a method to calculate the influence of dose inhomogeneity and fractionation in normal tissue complication probability evaluation. The mathematical expression for the calculation of normal tissue complication probability has been derived combining the Lyman model with the histogram reduction method of Kutcher et al. and using the normalized total dose (NTD) instead of the total dose. The fitting of published tolerance data, in case of homogeneous or partial brain irradiation, has been considered. For the same total or partial volume homogeneous irradiation of the brain, curves of normal tissue complication probability have been calculated with fraction size of 1.5 Gy and of 3 Gy instead of 2 Gy, to show the influence of fraction size. The influence of dose distribution inhomogeneity and α/β value has also been simulated: Considering α/β=1.6 Gy or α/β=4.1 Gy for kidney clinical nephritis, the calculated curves of normal tissue complication probability are shown. Combining NTD calculations and histogram reduction techniques, normal tissue complication probability can be estimated taking into account the most relevant contributing factors, including the volume effect. (orig.) [de
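
    A hedged sketch of the combination described above: bin-wise conversion of a differential DVH to normalized total dose (NTD) via the linear-quadratic model, Kutcher-Burman effective-volume reduction, and evaluation of the Lyman NTCP. The toy DVH, fraction number and parameter values are assumptions for illustration only.

```python
import numpy as np
from math import erf, sqrt

def ntd(total_dose, dose_per_fraction, alpha_beta, ref_fraction=2.0):
    """Normalized total dose: total dose converted to 2-Gy-per-fraction
    equivalents using the linear-quadratic model."""
    return total_dose * (dose_per_fraction + alpha_beta) / (ref_fraction + alpha_beta)

def effective_volume(doses, volumes, n):
    """Kutcher-Burman reduction of a differential DVH to a single effective
    volume fraction irradiated at the maximum dose."""
    d_ref = doses.max()
    return float(np.sum(volumes * (doses / d_ref) ** (1.0 / n))), d_ref

def lyman_ntcp(dose, v_eff, td50_1, m, n):
    td50 = td50_1 * v_eff ** (-n)
    t = (dose - td50) / (m * td50)
    return 0.5 * (1.0 + erf(t / sqrt(2.0)))

# toy differential DVH (fractional volumes receiving each physical dose),
# converted bin-by-bin to NTD assuming alpha/beta = 1.6 Gy and 25 fractions
phys_dose = np.array([10.0, 25.0, 40.0, 50.0])
frac_vol = np.array([0.30, 0.30, 0.25, 0.15])
dose_per_fx = phys_dose / 25.0
ntd_dose = ntd(phys_dose, dose_per_fx, alpha_beta=1.6)
v_eff, d_max = effective_volume(ntd_dose, frac_vol, n=0.32)
print(round(lyman_ntcp(d_max, v_eff, td50_1=45.0, m=0.15, n=0.32), 3))
```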

  7. Normal tissue complication probabilities correlated with late effects in the rectum after prostate conformal radiotherapy

    International Nuclear Information System (INIS)

    Dale, Einar; Olsen, Dag R.; Fossa, Sophie D.

    1999-01-01

    Purpose: Radiation therapy of deep-seated tumours will always result in normal tissue doses to some extent. The aim of this study was to calculate different risk estimates of late effects in the rectum for a group of prostate cancer patients treated with conformal radiation therapy (CRT) and correlate these estimates with the occurrences of late effects. Since the rectum is a hollow organ, several ways of generating dose-volume distributions over the organ are possible, and we wanted to investigate two of them. Methods and Materials: A mathematical model, known as the Lyman-Kutcher model, conventionally used to estimate normal tissue complication probabilities (NTCP) associated with radiation therapy, was applied to a material of 52 prostate cancer patients. The patients were treated with a four-field box technique, with the rectum as organ at risk. Dose-volume histograms (DVH) were generated for the whole rectum (including the cavity) and for the rectum wall. One to two years after the treatment, the patients completed a questionnaire concerning bowel (rectum) related morbidity quantifying the extent of late effects. Results: A correlation analysis using Spearman's rank correlation coefficient, for NTCP values calculated from the DVHs and the patients' scores, gave correlation coefficients which were not statistically significant at the p < 0.05 level. The maximum dose, Dmax, of the whole rectum correlated better with observed late toxicity than Dmax derived from histograms of the rectum wall. Correlation coefficients from 'high-dose' measures were larger than those calculated from the NTCP values. Accordingly, as the volume parameter of the Lyman-Kutcher model was reduced, raising the impact of small high-dose volumes on the NTCP values, the correlation between observed effects and NTCP values became significant at the p < 0.01 level. Conclusions: 1) High-dose levels corresponding to small volume fractions of the cumulative dose-volume histograms were best correlated with the occurrences of late

  8. Occurrence and Probability Maps of Lutzomyia longipalpis and Lutzomyia cruzi (Diptera: Psychodidae: Phlebotominae) in Brazil.

    Science.gov (United States)

    Andrade-Filho, J D; Scholte, R G C; Amaral, A L G; Shimabukuro, P H F; Carvalho, O S; Caldeira, R L

    2017-09-01

    Leishmaniases are serious diseases caused by trypanosomatid protozoans of the genus Leishmania transmitted by the bite of phlebotomine sand flies. We analyzed records pertaining to Lutzomyia longipalpis (Lutz and Neiva, 1912) and Lutzomyia cruzi (Mangabeira, 1938) in Brazil from the following sources: the collection of phlebotomine sand flies of the Centro de Pesquisas René Rachou/Fiocruz (FIOCRUZ-COLFLEB), the "SpeciesLink" (CRIA) database, from systematic surveys of scientific articles and gray literature (dissertations, theses, and communications), and disease data obtained from the Information System for Notifiable Diseases/Ministry of Health (SINAN/MS). Environmental data and ecological niche modeling (ESMS) using the approach of MaxEnt algorithm produced maps of occurrence probability for both Lu. longipalpis and Lu. cruzi. Lutzomyia longipalpis was found in 229 Brazilian municipalities and Lu. cruzi in 27. The species were sympatric in 16 municipalities of the Central-West region of Brazil. Our results show that Lu. longipalpis is widely distributed and associated with the high number of cases of visceral leishmaniasis reported in Brazil. © The Authors 2017. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  9. MR-based automatic delineation of volumes of interest in human brain PET images using probability maps

    DEFF Research Database (Denmark)

    Svarer, Claus; Madsen, Karina; Hasselbalch, Steen G.

    2005-01-01

    The purpose of this study was to develop and validate an observer-independent approach for automatic generation of volume-of-interest (VOI) brain templates to be used in emission tomography studies of the brain. The method utilizes a VOI probability map created on the basis of a database of several...... delineation of the VOI set. The approach was also shown to work equally well in individuals with pronounced cerebral atrophy. Probability-map-based automatic delineation of VOIs is a fast, objective, reproducible, and safe way to assess regional brain values from PET or SPECT scans. In addition, the method...
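
    The probability-map idea can be illustrated with a toy example: binary VOI masks from several subjects, already warped to a common space, are averaged into a voxel-wise probability map that can then be thresholded for a new subject. The array shapes, mask geometry and threshold below are illustrative assumptions, not the published procedure.

```python
import numpy as np

rng = np.random.default_rng(0)
n_subjects, shape = 10, (32, 32, 32)

# binary VOI masks from each subject's MR, already warped to a common space
warped_masks = np.zeros((n_subjects,) + shape, dtype=bool)
for s in range(n_subjects):                      # toy spherical VOIs with jitter
    c = np.array(shape) // 2 + rng.integers(-2, 3, size=3)
    zz, yy, xx = np.ogrid[:shape[0], :shape[1], :shape[2]]
    warped_masks[s] = ((zz - c[0])**2 + (yy - c[1])**2 + (xx - c[2])**2) <= 8**2

# probability map: fraction of subjects in which each voxel belongs to the VOI
prob_map = warped_masks.mean(axis=0)

# delineation for a new subject: warp the map to that subject (omitted here)
# and keep voxels where the VOI probability exceeds a chosen threshold
voi_new = prob_map >= 0.5
print("VOI volume (voxels):", int(voi_new.sum()))
```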

  10. Investigation of normal tissue complication probabilities in prostate and partial breast irradiation radiotherapy techniques

    International Nuclear Information System (INIS)

    Bezak, E.; Takam, R.; Bensaleh, S.; Yeoh, E.; Marcu, L.

    2011-01-01

    Full text: Normal tissue complication probabilities (NTCPs) of the rectum, bladder and urethra following various radiation techniques for prostate cancer were evaluated using the relative-seriality and Lyman models. NTCPs of the lungs, heart and skin, and their dependence on source position and balloon deformation, were also investigated for HDR MammoSite brachytherapy. The prostate treatment techniques included external three-dimensional conformal radiotherapy, Low-Dose-Rate brachytherapy (I-125) and High-Dose-Rate brachytherapy (Ir-192). Dose-volume histograms of critical structures for prostate and breast radiotherapy, retrieved from the corresponding treatment planning systems, were converted to Biological Effective Dose (BEffD)-based and Equivalent Dose (Deq)-based DVHs to account for differences in radiation delivery and fractionation schedule. Literature-based model parameters were used to calculate NTCPs. Hypofractionated 3D-CRT (2.75 Gy/fraction, total dose 55 Gy) NTCPs of rectum, bladder and urethra were less than those for standard fractionated 4-field 3D-CRT (2 Gy/fraction, 64 Gy) and dose-escalated 4- and 5-field 3D-CRT (74 Gy). Rectal and bladder NTCPs (5.2% and 6.6%) following the dose-escalated 4-field 3D-CRT (74 Gy) were the highest among the analyzed techniques. The average NTCPs for rectum and urethra were 0.6% and 24.7% for LDR-BT and 0.5% and 11.2% for HDR-BT. For MammoSite, NTCP was estimated to be 0.1%, 0.1%, 1.2% and 3.5% for skin desquamation, erythema, telangiectasia and fibrosis, respectively (with the source positioned at the balloon centre). A 4 mm MammoSite balloon deformation leads to overdosing of PTV regions by ∼40%, resulting in excessive skin dose and increased NTCP. Conclusions: Prostate brachytherapy resulted in lower NTCPs compared with external beam techniques. MammoSite brachytherapy resulted in no heart/lung complications regardless of balloon deformation; however, a 4 mm deformation caused a 0.6% increase in tissue fibrosis NTCP.

  11. Method for Automatic Selection of Parameters in Normal Tissue Complication Probability Modeling.

    Science.gov (United States)

    Christophides, Damianos; Appelt, Ane L; Gusnanto, Arief; Lilley, John; Sebag-Montefiore, David

    2018-07-01

    To present a fully automatic method to generate multiparameter normal tissue complication probability (NTCP) models and compare its results with those of a published model, using the same patient cohort. Data were analyzed from 345 rectal cancer patients treated with external radiation therapy to predict the risk of patients developing grade 1 or ≥2 cystitis. In total, 23 clinical factors were included in the analysis as candidate predictors of cystitis. Principal component analysis was used to decompose the bladder dose-volume histogram into 8 principal components, explaining more than 95% of the variance. The data set of clinical factors and principal components was divided into training (70%) and test (30%) data sets, with the training data set used by the algorithm to compute an NTCP model. The first step of the algorithm was to obtain a bootstrap sample, followed by multicollinearity reduction using the variance inflation factor and genetic algorithm optimization to determine an ordinal logistic regression model that minimizes the Bayesian information criterion. The process was repeated 100 times, and the model with the minimum Bayesian information criterion was recorded on each iteration. The most frequent model was selected as the final "automatically generated model" (AGM). The published model and AGM were fitted on the training data sets, and the risk of cystitis was calculated. The 2 models had no significant differences in predictive performance, both for the training and test data sets (P value > .05) and found similar clinical and dosimetric factors as predictors. Both models exhibited good explanatory performance on the training data set (P values > .44), which was reduced on the test data sets (P values < .05). The predictive value of the AGM is equivalent to that of the expert-derived published model. It demonstrates potential in saving time, tackling problems with a large number of parameters, and standardizing variable selection in NTCP
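
    The sketch below illustrates, in simplified form, two ingredients of the pipeline described above: PCA decomposition of the DVHs and selection of a logistic model by the Bayesian information criterion. The bootstrap, variance-inflation-factor and genetic-algorithm steps of the published method are omitted, only a handful of components are kept (the paper retained enough to explain ≥95% of variance), and all data are synthetic.

```python
import numpy as np
from itertools import combinations
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_patients = 345
dvh = rng.uniform(0, 100, size=(n_patients, 50))   # toy cumulative bladder DVHs
y = rng.binomial(1, 0.3, size=n_patients)          # toy cystitis grade >= 2 labels

# principal components of the DVHs (8 kept here purely to keep the toy example fast)
pcs = PCA(n_components=8).fit_transform(dvh)

def bic(model, X, y):
    """Bayesian information criterion for a fitted logistic model."""
    p = np.clip(model.predict_proba(X)[:, 1], 1e-9, 1 - 1e-9)
    loglik = np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))
    k = X.shape[1] + 1                              # coefficients + intercept
    return k * np.log(len(y)) - 2 * loglik

# exhaustively score small candidate subsets (the paper uses a genetic algorithm)
best = None
for size in (1, 2, 3):
    for idx in combinations(range(pcs.shape[1]), size):
        X = pcs[:, idx]
        m = LogisticRegression(max_iter=1000).fit(X, y)
        score = bic(m, X, y)
        if best is None or score < best[0]:
            best = (score, idx)
print("lowest-BIC component subset:", best[1])
```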

  12. Multivariate Normal Tissue Complication Probability Modeling of Heart Valve Dysfunction in Hodgkin Lymphoma Survivors

    International Nuclear Information System (INIS)

    Cella, Laura; Liuzzi, Raffaele; Conson, Manuel; D’Avino, Vittoria; Salvatore, Marco; Pacelli, Roberto

    2013-01-01

    Purpose: To establish a multivariate normal tissue complication probability (NTCP) model for radiation-induced asymptomatic heart valvular defects (RVD). Methods and Materials: Fifty-six patients treated with sequential chemoradiation therapy for Hodgkin lymphoma (HL) were retrospectively reviewed for RVD events. Clinical information along with whole heart, cardiac chambers, and lung dose distribution parameters was collected, and the correlations to RVD were analyzed by means of Spearman's rank correlation coefficient (Rs). For the selection of the model order and parameters for NTCP modeling, a multivariate logistic regression method using resampling techniques (bootstrapping) was applied. Model performance was evaluated using the area under the receiver operating characteristic curve (AUC). Results: When we analyzed the whole heart, a 3-variable NTCP model including the maximum dose, whole heart volume, and lung volume was shown to be the optimal predictive model for RVD (Rs = 0.573, P<.001, AUC = 0.83). When we analyzed the cardiac chambers individually, for the left atrium and for the left ventricle, an NTCP model based on 3 variables including the percentage volume exceeding 30 Gy (V30), cardiac chamber volume, and lung volume was selected as the most predictive model (Rs = 0.539, P<.001, AUC = 0.83; and Rs = 0.557, P<.001, AUC = 0.82, respectively). The NTCP values increase as heart maximum dose or cardiac chambers V30 increase. They also increase with larger volumes of the heart or cardiac chambers and decrease when lung volume is larger. Conclusions: We propose logistic NTCP models for RVD considering not only heart irradiation dose but also the combined effects of lung and heart volumes. Our study establishes the statistical evidence of the indirect effect of lung size on radio-induced heart toxicity

  13. Cardiovascular magnetic resonance frontiers: Tissue characterisation with mapping

    Directory of Open Access Journals (Sweden)

    Rebecca Schofield

    2016-11-01

    Full Text Available The clinical use of cardiovascular magnetic resonance (CMR) imaging has expanded rapidly over the last decade. Its role in cardiac morphological and functional assessment is established, with perfusion and late gadolinium enhancement (LGE) imaging for scar increasingly used in day-to-day clinical decision making. LGE allows a virtual histological assessment of the myocardium, with the pattern of scar suggesting disease aetiology, and the extent predicting risk. However, even combined, the full range of pathological processes occurring in the myocardium are not interrogated. Mapping is a new frontier where the intrinsic magnetic properties of heart muscle are measured to probe further. T1, T2 and T2* mapping measures the three fundamental tissue relaxation rate constants before contrast, and the extracellular volume (ECV) after contrast. These are displayed in colour, often providing an immediate appreciation of pathology. These parameters are differently sensitive to pathologies. Iron (cardiac siderosis, intramyocardial haemorrhage) makes T1, T2 and T2* fall. T1 also falls with fat infiltration (Fabry disease). T2 increases with oedema (acute infarction, takotsubo cardiomyopathy, myocarditis, rheumatological disease). Native T1 increases with fibrosis, oedema and amyloid. Some of these changes are large (e.g. iron, oedema, amyloid), others more modest (diffuse fibrosis). They can be used to detect early disease, distinguish aetiology and, in some circumstances, guide therapy. In this review, we discuss these processes, illustrating clinical application and future advances.

  14. Effects of tissue susceptibility on brain temperature mapping.

    Science.gov (United States)

    Maudsley, Andrew A; Goryawala, Mohammed Z; Sheriff, Sulaiman

    2017-02-01

    A method for mapping of temperature over a large volume of the brain using volumetric proton MR spectroscopic imaging has been implemented and applied to 150 normal subjects. Magnetic susceptibility-induced frequency shifts in gray- and white-matter regions were measured and included as a correction in the temperature mapping calculation. Additional sources of magnetic susceptibility variations of the individual metabolite resonance frequencies were also observed that reflect the cellular-level organization of the brain metabolites, with the most notable differences being attributed to changes of the N-Acetylaspartate resonance frequency that reflect the intra-axonal distribution and orientation of the white-matter tracts with respect to the applied magnetic field. These metabolite-specific susceptibility effects are also shown to change with age. Results indicate no change of apparent brain temperature with age from 18 to 84 years old, with a trend for increased brain temperature throughout the cerebrum in females relative to males on the order of 0.1°C; slightly increased temperatures in the left hemisphere relative to the right; and a lower temperature of 0.3°C in the cerebellum relative to that of cerebral white-matter. This study presents a novel acquisition method for noninvasive measurement of brain temperature that is of potential value for diagnostic purposes and treatment monitoring, while also demonstrating limitations of the measurement due to the confounding effects of tissue susceptibility variations. Copyright © 2016 Elsevier Inc. All rights reserved.

  15. High-severity fire: evaluating its key drivers and mapping its probability across western US forests

    Science.gov (United States)

    Parks, Sean A.; Holsinger, Lisa M.; Panunto, Matthew H.; Jolly, W. Matt; Dobrowski, Solomon Z.; Dillon, Gregory K.

    2018-04-01

    Wildland fire is a critical process in forests of the western United States (US). Variation in fire behavior, which is heavily influenced by fuel loading, terrain, weather, and vegetation type, leads to heterogeneity in fire severity across landscapes. The relative influence of these factors in driving fire severity, however, is poorly understood. Here, we explore the drivers of high-severity fire for forested ecoregions in the western US over the period 2002–2015. Fire severity was quantified using a satellite-inferred index of severity, the relativized burn ratio. For each ecoregion, we used boosted regression trees to model high-severity fire as a function of live fuel, topography, climate, and fire weather. We found that live fuel, on average, was the most important factor driving high-severity fire among ecoregions (average relative influence = 53.1%) and was the most important factor in 14 of 19 ecoregions. Fire weather was the second most important factor among ecoregions (average relative influence = 22.9%) and was the most important factor in five ecoregions. Climate (13.7%) and topography (10.3%) were less influential. We also predicted the probability of high-severity fire, were a fire to occur, using recent (2016) satellite imagery to characterize live fuel for a subset of ecoregions in which the model skill was deemed acceptable (n = 13). These ‘wall-to-wall’ gridded ecoregional maps provide relevant and up-to-date information for scientists and managers who are tasked with managing fuel and wildland fire. Lastly, we provide an example of the predicted likelihood of high-severity fire under moderate and extreme fire weather before and after fuel reduction treatments, thereby demonstrating how our framework and model predictions can potentially serve as a performance metric for land management agencies tasked with reducing hazardous fuel across large landscapes.
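
    A minimal scikit-learn sketch of boosted-tree modeling of high-severity fire probability with per-predictor relative influence; the predictor names, data and hyperparameters are illustrative assumptions, not those used in the study.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)
n = 2000
# hypothetical per-pixel predictors: live fuel (NDVI-like), fire weather index,
# climatic water deficit, slope -- names are illustrative only
X = rng.normal(size=(n, 4))
p = 1 / (1 + np.exp(-(1.5 * X[:, 0] + 0.7 * X[:, 1] - 0.5)))
y = rng.binomial(1, p)                       # 1 = high-severity fire

brt = GradientBoostingClassifier(n_estimators=300, learning_rate=0.05, max_depth=3)
brt.fit(X, y)

names = ["live fuel", "fire weather", "climate", "topography"]
for name, imp in sorted(zip(names, brt.feature_importances_), key=lambda t: -t[1]):
    print(f"{name:12s} relative influence ~ {100 * imp:.1f}%")

# probability-of-high-severity map for new pixels (e.g. recent fuel conditions)
new_pixels = rng.normal(size=(5, 4))
print(np.round(brt.predict_proba(new_pixels)[:, 1], 2))
```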

  16. Using widely spaced observations of land use, forest attributes, and intrusions to map resource potential and human impact probability

    Science.gov (United States)

    Victor A. Rudis

    2000-01-01

    Scant information exists about the spatial extent of human impact on forest resource supplies, i.e., depreciative and nonforest uses. I used observations of ground-sampled land use and intrusions on forest land to map the probability of resource use and human impact for broad areas. Data came from a seven State survey region (Alabama, Arkansas, Louisiana, Mississippi,...

  17. Using widely spaced observations of land use, forest attributes, and intrusions to map resource potential and human impact probability

    Science.gov (United States)

    Victor A. Rudis

    2000-01-01

    Scant information exists about the spatial extent of human impact on forest resource supplies, i.e., depreciative and nonforest uses. I used observations of ground-sampled land use and intrusions on forest land to map the probability of resource use and human impact for broad areas. Data came from a seven-state survey region (Alabama, Arkansas, Louisiana, Mississippi,...

  18. Mapping and characterization of iron compounds in Alzheimer's tissue

    International Nuclear Information System (INIS)

    Collingwood, Joanna; Dobson, Jon

    2006-01-01

    Understanding the management of iron in the brain is of great importance in the study of neurodegeneration, where regional iron overload is frequently evident. A variety of approaches have been employed, from quantifying iron in various anatomical structures, to identifying genetic risk factors related to iron metabolism, and exploring chelation approaches to tackle iron overload in neurodegenerative disease. However, the ease with which iron can change valence state ensures that it is present in vivo in a wide variety of forms, both soluble and insoluble. Here, we review recent developments in approaches to locate and identify iron compounds in neurodegenerative tissue. In addition to complementary techniques that allow us to quantify and identify iron compounds using magnetometry, extraction, and electron microscopy, we are utilizing a powerful combined mapping/characterization approach with synchrotron X-rays. This has enabled the location and characterization of iron accumulations containing magnetite and ferritin in human Alzheimer's disease (AD) brain tissue sections in situ at micron-resolution. It is hoped that such approaches will contribute to our understanding of the role of unusual iron accumulations in disease pathogenesis, and optimise the potential to use brain iron as a clinical biomarker for early detection and diagnosis.

  19. An imaging colorimeter for noncontact tissue color mapping.

    Science.gov (United States)

    Balas, C

    1997-06-01

    There has been a considerable effort in several medical fields, for objective color analysis and characterization of biological tissues. Conventional colorimeters have proved inadequate for this purpose, since they do not provide spatial color information and because the measuring procedure randomly affects the color of the tissue. In this paper an imaging colorimeter is presented, where the nonimaging optical photodetector of colorimeters is replaced with the charge-coupled device (CCD) sensor of a color video camera, enabling the independent capturing of the color information for any spatial point within its field-of-view. Combining imaging and colorimetry methods, the acquired image is calibrated and corrected, under several ambient light conditions, providing noncontact reproducible color measurements and mapping, free of the errors and the limitations present in conventional colorimeters. This system was used for monitoring of blood supply changes of psoriatic plaques, that have undergone Psoralens and ultraviolet-A radiation (PUVA) therapy, where reproducible and reliable measurements were demonstrated. These features highlight the potential of the imaging colorimeters as clinical and research tools for the standardization of clinical diagnosis and for the objective evaluation of treatment effectiveness.

  20. Suitable reference tissues for quantitative susceptibility mapping of the brain.

    Science.gov (United States)

    Straub, Sina; Schneider, Till M; Emmerich, Julian; Freitag, Martin T; Ziener, Christian H; Schlemmer, Heinz-Peter; Ladd, Mark E; Laun, Frederik B

    2017-07-01

    Since quantitative susceptibility mapping (QSM) quantifies magnetic susceptibility relative to a reference value, a suitable reference tissue has to be available to compare different subjects and stages of disease. To find such a suitable reference tissue for QSM of the brain, melanoma patients with and without brain metastases were measured. Twelve reference regions were chosen and assessed for stability of susceptibility values with respect to multiple intra-individual and inter-individual measurements, age, and stage of disease. Cerebrospinal fluid (CSF), the internal capsule and one region in the splenium of the corpus callosum are the regions with the smallest standard deviations of the mean susceptibility value. The mean susceptibility is 0.010 ± 0.014 ppm for CSF in the atrium of the lateral ventricles (csf_post), -0.060 ± 0.019 ppm for the posterior limb of the internal capsule (ci2), and -0.008 ± 0.019 ppm for the splenium of the corpus callosum. csf_post and ci2 show nearly no dependence on age or stage of disease, whereas some other regions, e.g., the red nucleus, show moderate dependence on age or disease. The internal capsule and CSF appear to be the most suitable reference regions for QSM of the brain in the melanoma patients studied. Both showed virtually no dependence on age or disease and small variations among patients. Magn Reson Med 78:204-214, 2017. © 2016 International Society for Magnetic Resonance in Medicine.

  1. Collapse susceptibility mapping in karstified gypsum terrain (Sivas basin - Turkey) by conditional probability, logistic regression, artificial neural network models

    Science.gov (United States)

    Yilmaz, Isik; Keskin, Inan; Marschalko, Marian; Bednarik, Martin

    2010-05-01

    This study compares GIS-based collapse susceptibility mapping methods, namely conditional probability (CP), logistic regression (LR) and artificial neural networks (ANN), applied to gypsum rock masses in the Sivas basin (Turkey). A Digital Elevation Model (DEM) was first constructed using GIS software. Collapse-related factors, directly or indirectly related to the causes of collapse occurrence, such as distance from faults, slope angle and aspect, topographical elevation, distance from drainage, topographic wetness index (TWI), stream power index (SPI), Normalized Difference Vegetation Index (NDVI) as a measure of vegetation cover, and distance from roads and settlements, were used in the collapse susceptibility analyses. In the last stage of the analyses, collapse susceptibility maps were produced from the CP, LR and ANN models, and they were then compared by means of their validations. Area Under Curve (AUC) values obtained from all three methodologies showed that the map obtained from the ANN model appears more accurate than those from the other models, and the results also showed that artificial neural networks are a useful tool in the preparation of collapse susceptibility maps and are highly compatible with GIS operating features. Key words: Collapse; doline; susceptibility map; gypsum; GIS; conditional probability; logistic regression; artificial neural networks.
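
    As a hedged illustration of comparing susceptibility models by AUC, the following sketch fits a logistic regression and a small neural network to synthetic conditioning factors; it is not the GIS workflow used in the study, and factor names, data and network size are assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 1500
# hypothetical conditioning factors per grid cell: distance to faults, slope,
# aspect, elevation, distance to drainage, TWI, SPI, NDVI, distance to roads
X = rng.normal(size=(n, 9))
p = 1 / (1 + np.exp(-(1.2 * X[:, 0] - 0.8 * X[:, 4] + 0.5 * X[:, 5])))
y = rng.binomial(1, p)                       # 1 = collapse observed

X = StandardScaler().fit_transform(X)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "logistic regression": LogisticRegression(max_iter=1000),
    "artificial neural network": MLPClassifier(hidden_layer_sizes=(16,),
                                                max_iter=2000, random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"{name}: AUC = {auc:.3f}")        # susceptibility map quality via AUC
```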

  2. A fast algorithm for estimating transmission probabilities in QTL detection designs with dense maps

    Directory of Open Access Journals (Sweden)

    Gilbert Hélène

    2009-11-01

    Full Text Available Abstract Background In the case of an autosomal locus, four transmission events from the parents to progeny are possible, specified by the grandparental origin of the alleles inherited by this individual. Computing the probabilities of these transmission events is essential for performing QTL detection methods. Results A fast algorithm for the estimation of these probabilities conditional on parental phases has been developed. It is adapted to classical QTL detection designs applied to outbred populations, in particular to designs composed of half- and/or full-sib families. It assumes the absence of interference. Conclusion The theory is fully developed and an example is given.

  3. Fully automatized renal parenchyma volumetry using a support vector machine based recognition system for subject-specific probability map generation in native MR volume data

    Science.gov (United States)

    Gloger, Oliver; Tönnies, Klaus; Mensel, Birger; Völzke, Henry

    2015-11-01

    In epidemiological studies as well as in clinical practice the amount of produced medical image data strongly increased in the last decade. In this context organ segmentation in MR volume data gained increasing attention for medical applications. Especially in large-scale population-based studies organ volumetry is highly relevant requiring exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automatized methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework of a two-stepped probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are refined subsequently by using several, extended segmentation strategies. We present a three class-based support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high quality subject-specific parenchyma probability maps. Several refinement strategies including a final shape-based 3D level set segmentation technique are used in subsequent processing modules to segment renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from parenchymal volume, which is important to analyze renal functions. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches.

  4. Fully automatized renal parenchyma volumetry using a support vector machine based recognition system for subject-specific probability map generation in native MR volume data

    International Nuclear Information System (INIS)

    Gloger, Oliver; Völzke, Henry; Tönnies, Klaus; Mensel, Birger

    2015-01-01

    In epidemiological studies as well as in clinical practice the amount of produced medical image data strongly increased in the last decade. In this context organ segmentation in MR volume data gained increasing attention for medical applications. Especially in large-scale population-based studies organ volumetry is highly relevant requiring exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automatized methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework of a two-stepped probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are refined subsequently by using several, extended segmentation strategies. We present a three class-based support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high quality subject-specific parenchyma probability maps. Several refinement strategies including a final shape-based 3D level set segmentation technique are used in subsequent processing modules to segment renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from parenchymal volume, which is important to analyze renal functions. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches. (paper)

  5. Southern pine beetle infestation probability mapping using weights of evidence analysis

    Science.gov (United States)

    Jason B. Grogan; David L. Kulhavy; James C. Kroll

    2010-01-01

    Weights of Evidence (WofE) spatial analysis was used to predict probability of southern pine beetle (Dendroctonus frontalis) (SPB) infestation in Angelina, Nacogdoches, San Augustine and Shelby Co., TX. Thematic data derived from Landsat imagery (1974–2002 Landsat 1–7) were used. Data layers included: forest covertype, forest age, forest patch size...
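
    Weights of Evidence reduces to two log-ratios per binary evidence layer (W+ where the layer is present, W- where it is absent) that update the prior log-odds of the event. Below is a toy Python sketch with hypothetical pine-cover and infestation layers; the grid, rates and layer names are illustrative assumptions.

```python
import numpy as np

def weights_of_evidence(evidence, events):
    """Positive/negative weights for one binary evidence layer.

    evidence : boolean array, True where the layer (e.g. a cover type) is present
    events   : boolean array, True where SPB infestation was observed
    """
    p_b_d = evidence[events].mean()           # P(B | D)
    p_b_nd = evidence[~events].mean()         # P(B | not D)
    w_plus = np.log(p_b_d / p_b_nd)
    w_minus = np.log((1 - p_b_d) / (1 - p_b_nd))
    return w_plus, w_minus, w_plus - w_minus  # contrast C = W+ - W-

rng = np.random.default_rng(0)
n_cells = 10_000
pine = rng.random(n_cells) < 0.4                               # evidence layer
infested = rng.random(n_cells) < np.where(pine, 0.08, 0.02)    # more likely in pine
wp, wm, contrast = weights_of_evidence(pine, infested)
print(f"W+ = {wp:.2f}, W- = {wm:.2f}, contrast = {contrast:.2f}")

# posterior log-odds for a cell = prior log-odds + (W+ if layer present else W-)
prior_logodds = np.log(infested.mean() / (1 - infested.mean()))
posterior_p = 1 / (1 + np.exp(-(prior_logodds + np.where(pine, wp, wm))))
print("mean predicted infestation probability:", round(posterior_p.mean(), 3))
```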

  6. Mapping probabilities of extreme continental water storage changes from space gravimetry

    OpenAIRE

    Kusche , Jürgen; Eicker , Annette; Forootan , Ehsan; Springer , Anne; Longuevergne , Laurent

    2016-01-01

    International audience; Using data from the Gravity Recovery And Climate Experiment (GRACE) mission, we derive statistically robust " hot spot " regions of high probability of peak anomalous—i.e., with respect to the seasonal cycle—water storage (of up to 0.7 m one-in-five-year return level) and flux (up to 0.14 m/month). Analysis of, and comparison with, up to 32 years of ERA-Interim reanalysis fields reveals generally good agreement of these hot spot regions to GRACE results and that most e...

  7. Identification of titanium in human tissues: probable role in pathologic processes

    International Nuclear Information System (INIS)

    Moran, C.A.; Mullick, F.G.; Ishak, K.G.; Johnson, F.B.; Hummer, W.B.

    1991-01-01

    Six cases of titanium dioxide exposure involving lung, skin, and synovium are described, with a review of the literature. The patients, four men and two women, were between the ages of 22 and 65 years. The pulmonary changes were characterized by fibrosis and numerous macrophages with abundant deposition of a black pigment. Adjacent areas of bronchopneumonia were also observed. In the skin a severe necrotizing lesion involving the subcutaneous tissue with extension to the muscle was observed in one case and a nonspecific inflammatory response was observed in another; both cases showed abundant black pigment deposition. Electron microscopy and energy dispersive x-ray analysis demonstrated the presence of large quantities of titanium in the pigment granules. There may be a combination of black pigment deposition and fibrosis, necrosis, or a xanthomatous or granulomatous reaction, that, together with negative results on special staining and culture studies for organisms, should raise the suspicion of titanium-associated injury and prompt the study of the affected tissues by x-ray analysis for positive identification

  8. Mapping closure for probability distribution function in low frequency magnetized plasma turbulence

    International Nuclear Information System (INIS)

    Das, A.; Kaw, P.

    1995-01-01

    Recent numerical studies on the Hasegawa--Mima equation and its variants describing low frequency magnetized plasma turbulence indicate that the potential fluctuations have a Gaussian character whereas the vorticity exhibits non-Gaussian features. A theoretical interpretation for this observation using the recently developed mapping closure technique [Chen, Chen, and Kraichnan, Phys. Rev. Lett. 63, 2657 (1989)] has been provided here. It has been shown that non-Gaussian statistics for the vorticity arises because of a competition between nonlinear straining and diffusive damping whereas the Gaussianity of the statistics of φ arises because the only significant nonlinearity is associated with divergence free convection, which produces no strain terms. copyright 1995 American Institute of Physics

  9. Creating and validating cis-regulatory maps of tissue-specific gene expression regulation

    Science.gov (United States)

    O'Connor, Timothy R.; Bailey, Timothy L.

    2014-01-01

    Predicting which genomic regions control the transcription of a given gene is a challenge. We present a novel computational approach for creating and validating maps that associate genomic regions (cis-regulatory modules–CRMs) with genes. The method infers regulatory relationships that explain gene expression observed in a test tissue using widely available genomic data for ‘other’ tissues. To predict the regulatory targets of a CRM, we use cross-tissue correlation between histone modifications present at the CRM and expression at genes within 1 Mbp of it. To validate cis-regulatory maps, we show that they yield more accurate models of gene expression than carefully constructed control maps. These gene expression models predict observed gene expression from transcription factor binding in the CRMs linked to that gene. We show that our maps are able to identify long-range regulatory interactions and improve substantially over maps linking genes and CRMs based on either the control maps or a ‘nearest neighbor’ heuristic. Our results also show that it is essential to include CRMs predicted in multiple tissues during map-building, that H3K27ac is the most informative histone modification, and that CAGE is the most informative measure of gene expression for creating cis-regulatory maps. PMID:25200088
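
    A toy sketch of the cross-tissue correlation step described above: the histone-mark signal at one CRM is correlated with the expression of nearby genes across tissues, and the CRM is linked to the best-correlated gene. The gene names, signals and tissue count are synthetic assumptions, not the published data.

```python
import numpy as np

rng = np.random.default_rng(0)
n_tissues = 20
# H3K27ac signal at one candidate CRM across the "other" tissues
crm_signal = rng.normal(size=n_tissues)

# expression of genes within 1 Mbp of the CRM across the same tissues;
# gene2 is made to co-vary with the CRM signal in this toy example
genes = {f"gene{i}": rng.normal(size=n_tissues) for i in range(5)}
genes["gene2"] = 0.9 * crm_signal + 0.3 * rng.normal(size=n_tissues)

def pearson(a, b):
    return float(np.corrcoef(a, b)[0, 1])

# link the CRM to the nearby gene whose expression it best explains
scores = {g: pearson(crm_signal, expr) for g, expr in genes.items()}
target, r = max(scores.items(), key=lambda kv: kv[1])
print(f"CRM linked to {target} (cross-tissue r = {r:.2f})")
```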

  10. A new plan-scoring method using normal tissue complication probability for personalized treatment plan decisions in prostate cancer

    Science.gov (United States)

    Kim, Kwang Hyeon; Lee, Suk; Shim, Jang Bo; Yang, Dae Sik; Yoon, Won Sup; Park, Young Je; Kim, Chul Yong; Cao, Yuan Jie; Chang, Kyung Hwan

    2018-01-01

    The aim of this study was to derive a new plan-scoring index using normal tissue complication probabilities to verify different plans in the selection of personalized treatment. Plans for 12 patients treated with tomotherapy were used to compare scoring for ranking. Dosimetric and biological indexes were analyzed for the plans for a clearly distinguishable group (n = 7) and a similar group (n = 12), using treatment plan verification software that we developed. The quality factor (QF) of our support software for treatment decisions was consistent with the final treatment plan for the clearly distinguishable group (average QF = 1.202, 100% match rate, n = 7) and the similar group (average QF = 1.058, 33% match rate, n = 12). Therefore, we propose a normal tissue complication probability (NTCP)-based plan-scoring index for verification of different plans for personalized treatment-plan selection. Scoring using the new QF showed a 100% match rate (average NTCP QF = 1.0420). The NTCP-based new QF scoring method was adequate for obtaining biological verification quality and organ risk saving using the treatment-planning decision-support software we developed for prostate cancer.

  11. A new formula for normal tissue complication probability (NTCP) as a function of equivalent uniform dose (EUD).

    Science.gov (United States)

    Luxton, Gary; Keall, Paul J; King, Christopher R

    2008-01-07

    To facilitate the use of biological outcome modeling for treatment planning, an exponential function is introduced as a simpler equivalent to the Lyman formula for calculating normal tissue complication probability (NTCP). The single parameter of the exponential function is chosen to reproduce the Lyman calculation to within approximately 0.3%, and thus enable easy conversion of data contained in empirical fits of Lyman parameters for organs at risk (OARs). Organ parameters for the new formula are given in terms of Lyman model m and TD(50), and conversely m and TD(50) are expressed in terms of the parameters of the new equation. The role of the Lyman volume-effect parameter n is unchanged from its role in the Lyman model. For a non-homogeneously irradiated OAR, an equation relates d(ref), n, v(eff) and the Niemierko equivalent uniform dose (EUD), where d(ref) and v(eff) are the reference dose and effective fractional volume of the Kutcher-Burman reduction algorithm (i.e. the LKB model). It follows in the LKB model that uniform EUD irradiation of an OAR results in the same NTCP as the original non-homogeneous distribution. The NTCP equation is therefore represented as a function of EUD. The inverse equation expresses EUD as a function of NTCP and is used to generate a table of EUD versus normal tissue complication probability for the Emami-Burman parameter fits as well as for OAR parameter sets from more recent data.
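
    The sketch below computes the Niemierko/generalized EUD from a differential DVH and evaluates the Lyman (LKB) NTCP at that EUD; it does not reproduce the paper's exponential closed form, and the DVH values and organ parameters are illustrative assumptions only.

```python
import numpy as np
from math import erf, sqrt

def eud(doses, volumes, n):
    """Generalized (Niemierko) equivalent uniform dose from a differential DVH:
    EUD = (sum_i v_i * D_i**(1/n)) ** n, with fractional volumes v_i summing to 1."""
    a = 1.0 / n
    return float(np.sum(volumes * doses ** a) ** (1.0 / a))

def ntcp_from_eud(eud_gy, td50, m):
    """LKB normal tissue complication probability for uniform irradiation at EUD."""
    t = (eud_gy - td50) / (m * td50)
    return 0.5 * (1.0 + erf(t / sqrt(2.0)))

doses = np.array([20.0, 40.0, 60.0, 70.0])     # toy differential DVH for an OAR
vols = np.array([0.40, 0.30, 0.20, 0.10])
e = eud(doses, vols, n=0.25)
print(f"EUD = {e:.1f} Gy, NTCP = {ntcp_from_eud(e, td50=65.0, m=0.14):.3f}")
```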

  12. A new formula for normal tissue complication probability (NTCP) as a function of equivalent uniform dose (EUD)

    International Nuclear Information System (INIS)

    Luxton, Gary; Keall, Paul J; King, Christopher R

    2008-01-01

    To facilitate the use of biological outcome modeling for treatment planning, an exponential function is introduced as a simpler equivalent to the Lyman formula for calculating normal tissue complication probability (NTCP). The single parameter of the exponential function is chosen to reproduce the Lyman calculation to within ∼0.3%, and thus enable easy conversion of data contained in empirical fits of Lyman parameters for organs at risk (OARs). Organ parameters for the new formula are given in terms of Lyman model m and TD50, and conversely m and TD50 are expressed in terms of the parameters of the new equation. The role of the Lyman volume-effect parameter n is unchanged from its role in the Lyman model. For a non-homogeneously irradiated OAR, an equation relates dref, n, veff and the Niemierko equivalent uniform dose (EUD), where dref and veff are the reference dose and effective fractional volume of the Kutcher-Burman reduction algorithm (i.e. the LKB model). It follows in the LKB model that uniform EUD irradiation of an OAR results in the same NTCP as the original non-homogeneous distribution. The NTCP equation is therefore represented as a function of EUD. The inverse equation expresses EUD as a function of NTCP and is used to generate a table of EUD versus normal tissue complication probability for the Emami-Burman parameter fits as well as for OAR parameter sets from more recent data.

  13. Mapping by monoclonal antibody detection of glycosaminoglycans in connective tissues

    DEFF Research Database (Denmark)

    Couchman, J R; Caterson, B; Christner, J E

    1984-01-01

    Chondroitin sulphate proteoglycans are widespread connective tissue components and chemical analysis of cartilage and other proteoglycans has demonstrated molecular speciation involving the degree and position of sulphation of the carbohydrate chains. This may, in turn, affect the properties...... of the glycosaminoglycan (GAG), particularly with respect to self-association and interactions with other extracellular matrix components. Interactions with specific molecules from different connective tissue types, such as the collagens and their associated glycoproteins, could be favoured by particular charge...... and dermatan sulphate. These provide novel opportunities to study the in vivo distribution of chondroitin sulphate proteoglycans. We demonstrate that chondroitin sulphates exhibit remarkable connective tissue specificity and furthermore provide evidence that some proteoglycans may predominantly carry only one...

  14. Vaccination with Map specific peptides reduces Map burden in tissues of infected goats

    DEFF Research Database (Denmark)

    Melvang, Heidi Mikkelsen; Hassan, Sufia Butt; Thakur, Aneesh

    As an alternative to protein-based vaccines, we investigated the effect of post-exposure vaccination with Map specific peptides in a goat model aiming at developing a Map vaccine that will neither interfere with diagnosis of paratuberculosis nor bovine tuberculosis. Peptides were initially select...... in the unvaccinated control group seroconverted in ID Screen® ELISA at last sampling prior to euthanasia. These results indicate that a subunit vaccine against Map can induce a protective immune response against paratuberculosis in goats....

  15. Nanomechanical mapping of bone tissue regenerated by magnetic scaffolds.

    Science.gov (United States)

    Bianchi, Michele; Boi, Marco; Sartori, Maria; Giavaresi, Gianluca; Lopomo, Nicola; Fini, Milena; Dediu, Alek; Tampieri, Anna; Marcacci, Maurilio; Russo, Alessandro

    2015-01-01

    Nanoindentation can provide new insights into the maturity stage of regenerating bone. The aim of the present study was the evaluation of the nanomechanical properties of newly-formed bone tissue at 4 weeks from the implantation of permanent magnets and magnetic scaffolds in the trabecular bone of rabbit femoral condyles. Three different groups have been investigated: MAG-A (NdFeB magnet + apatite/collagen scaffold with magnetic nanoparticles directly nucleated on the collagen fibers during scaffold synthesis); MAG-B (NdFeB magnet + apatite/collagen scaffold later infiltrated with magnetic nanoparticles) and MAG (NdFeB magnet). The mechanical properties of different-maturity bone tissues, i.e. newly-formed immature, newly-formed mature and native trabecular bone have been evaluated for the three groups. Contingent correlations between elastic modulus and hardness of immature, mature and native bone have been examined and discussed, as well as the efficacy of the adopted regeneration method in terms of "mechanical gap" between newly-formed and native bone tissue. The results showed that the MAG-B group provided regenerated bone tissue with mechanical properties closer to those of native bone compared with the MAG-A or MAG groups after 4 weeks from implantation. Further, whereas the mechanical properties of newly-formed immature and mature bone were found to be fairly well correlated, no correlation was detected between immature or mature bone and native bone. The reported results evidence the efficacy of nanoindentation tests for the investigation of the maturity of newly-formed bone not accessible through conventional analyses.

  16. MR-based automatic delineation of volumes of interest in human brain PET images using probability maps

    DEFF Research Database (Denmark)

    Svarer, Claus; Madsen, Karina; Hasselbalch, Steen G.

    2005-01-01

    subjects' MR-images, where VOI sets have been defined manually. High-resolution structural MR-images and 5-HT(2A) receptor binding PET-images (in terms of (18)F-altanserin binding) from 10 healthy volunteers and 10 patients with mild cognitive impairment were included for the analysis. A template including...... 35 VOIs was manually delineated on the subjects' MR images. Through a warping algorithm, template VOI sets defined from each individual were transferred to the other subjects' MR-images and the voxel overlap was compared to the VOI set specifically drawn for that particular individual. Comparisons were...... delineation of the VOI set. The approach was also shown to work equally well in individuals with pronounced cerebral atrophy. Probability-map-based automatic delineation of VOIs is a fast, objective, reproducible, and safe way to assess regional brain values from PET or SPECT scans. In addition, the method...
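
    The probability-map idea, combining VOI sets warped from several template subjects into a per-voxel probability and thresholding it, can be sketched as below once the warping/registration step has been done elsewhere; the 50% agreement threshold and the synthetic masks are assumptions, not values from the record.

```python
import numpy as np

def probability_map(warped_masks):
    """Average binary VOI masks that have already been warped to the target MR space."""
    return np.mean(np.stack(warped_masks, axis=0), axis=0)

def delineate(prob_map, threshold=0.5):
    """Keep voxels where at least `threshold` of the template subjects agree."""
    return prob_map >= threshold

rng = np.random.default_rng(5)
masks = [rng.random((4, 4, 4)) > 0.4 for _ in range(10)]  # synthetic stand-ins for warped VOIs
voi = delineate(probability_map(masks))
```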

  17. Probability distribution of dose rates in the body tissue as a function of the rhythm of Sr-90 administration and the age of animals

    International Nuclear Information System (INIS)

    Rasin, I.M.; Sarapul'tsev, I.A.

    1975-01-01

    The probability distribution of tissue radiation doses in the skeleton was studied in experiments on swine and dogs. When Sr-90 is introduced into the organism from the day of birth up to the age of 90 days, the dose rate probability distribution is characterized by one or, for adult animals, by two independent aggregates. Each of these aggregates corresponds to the normal distribution law.

  18. Normal tissue complication probabilities: dependence on choice of biological model and dose-volume histogram reduction scheme

    International Nuclear Information System (INIS)

    Moiseenko, Vitali; Battista, Jerry; Van Dyk, Jake

    2000-01-01

    Purpose: To evaluate the impact of dose-volume histogram (DVH) reduction schemes and models of normal tissue complication probability (NTCP) on ranking of radiation treatment plans. Methods and Materials: Data for liver complications in humans and for spinal cord in rats were used to derive input parameters of four different NTCP models. DVH reduction was performed using two schemes: 'effective volume' and 'preferred Lyman'. DVHs for competing treatment plans were derived from a sample DVH by varying dose uniformity in a high dose region so that the obtained cumulative DVHs intersected. Treatment plans were ranked according to the calculated NTCP values. Results: Whenever the preferred Lyman scheme was used to reduce the DVH, competing plans were indistinguishable as long as the mean dose was constant. The effective volume DVH reduction scheme did allow us to distinguish between these competing treatment plans. However, plan ranking depended on the radiobiological model used and its input parameters. Conclusions: Dose escalation will be a significant part of radiation treatment planning using new technologies, such as 3-D conformal radiotherapy and tomotherapy. Such dose escalation will depend on how the dose distributions in organs at risk are interpreted in terms of expected complication probabilities. The present study indicates considerable variability in predicted NTCP values because of the methods used for DVH reduction and radiobiological models and their input parameters. Animal studies and collection of standardized clinical data are needed to ascertain the effects of non-uniform dose distributions and to test the validity of the models currently in use
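
    The 'effective volume' scheme referred to here is the Kutcher-Burman reduction, which collapses an inhomogeneous DVH to an effective fractional volume irradiated uniformly at the maximum (reference) dose. A minimal sketch under that reading follows; the DVH and the volume-effect parameter are illustrative, and the 'preferred Lyman' scheme is not shown.

```python
import numpy as np

def effective_volume(doses, volumes, n):
    """Kutcher-Burman 'effective volume' reduction: the inhomogeneous DVH is replaced
    by a fraction v_eff of the organ receiving the maximum (reference) dose uniformly."""
    d_ref = doses.max()
    v_eff = np.sum(volumes * (doses / d_ref) ** (1.0 / n))
    return v_eff, d_ref

# Hypothetical differential DVH (fractional volumes sum to 1).
doses = np.array([20.0, 40.0, 60.0])
volumes = np.array([0.5, 0.3, 0.2])
v_eff, d_ref = effective_volume(doses, volumes, n=0.7)
print(f"v_eff = {v_eff:.3f} at reference dose {d_ref} Gy")
```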

  19. Multivariate quantile mapping bias correction: an N-dimensional probability density function transform for climate model simulations of multiple variables

    Science.gov (United States)

    Cannon, Alex J.

    2018-01-01

    Most bias correction algorithms used in climatology, for example quantile mapping, are applied to univariate time series. They neglect the dependence between different variables. Those that are multivariate often correct only limited measures of joint dependence, such as Pearson or Spearman rank correlation. Here, an image processing technique designed to transfer colour information from one image to another—the N-dimensional probability density function transform—is adapted for use as a multivariate bias correction algorithm (MBCn) for climate model projections/predictions of multiple climate variables. MBCn is a multivariate generalization of quantile mapping that transfers all aspects of an observed continuous multivariate distribution to the corresponding multivariate distribution of variables from a climate model. When applied to climate model projections, changes in quantiles of each variable between the historical and projection period are also preserved. The MBCn algorithm is demonstrated on three case studies. First, the method is applied to an image processing example with characteristics that mimic a climate projection problem. Second, MBCn is used to correct a suite of 3-hourly surface meteorological variables from the Canadian Centre for Climate Modelling and Analysis Regional Climate Model (CanRCM4) across a North American domain. Components of the Canadian Forest Fire Weather Index (FWI) System, a complicated set of multivariate indices that characterizes the risk of wildfire, are then calculated and verified against observed values. Third, MBCn is used to correct biases in the spatial dependence structure of CanRCM4 precipitation fields. Results are compared against a univariate quantile mapping algorithm, which neglects the dependence between variables, and two multivariate bias correction algorithms, each of which corrects a different form of inter-variable correlation structure. MBCn outperforms these alternatives, often by a large margin
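
    MBCn generalizes univariate quantile mapping to full multivariate distributions (via iterated rotations) and preserves model-projected changes in quantiles; the sketch below shows only the univariate empirical building block on synthetic data, not the MBCn algorithm itself.

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_proj):
    """Empirical quantile mapping: give each projected model value the observed value
    found at the same quantile of the historical model distribution."""
    q = np.interp(model_proj, np.sort(model_hist), np.linspace(0.0, 1.0, model_hist.size))
    return np.quantile(obs_hist, q)

rng = np.random.default_rng(0)
obs = rng.gamma(2.0, 3.0, 1000)       # "observed" historical series (synthetic)
mod_hist = rng.gamma(2.5, 2.5, 1000)  # biased model output, historical period
mod_proj = rng.gamma(2.5, 2.8, 1000)  # biased model output, projection period
corrected = quantile_map(mod_hist, obs, mod_proj)
```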

  20. Quantitative Susceptibility Mapping of Human Brain Reflects Spatial Variation in Tissue Composition

    Science.gov (United States)

    Li, Wei; Wu, Bing; Liu, Chunlei

    2011-01-01

    Image phase from gradient echo MRI provides a unique contrast that reflects brain tissue composition variations, such as iron and myelin distribution. Phase imaging is emerging as a powerful tool for the investigation of functional brain anatomy and disease diagnosis. However, the quantitative value of phase is compromised by its nonlocal and orientation dependent properties. There is an increasing need for reliable quantification of magnetic susceptibility, the intrinsic property of tissue. In this study, we developed a novel and accurate susceptibility mapping method that is also phase-wrap insensitive. The proposed susceptibility mapping method utilized two complementary equations: (1) the Fourier relationship of phase and magnetic susceptibility; and (2) the first-order partial derivative of the first equation in the spatial frequency domain. In numerical simulation, this method reconstructed the susceptibility map almost free of streaking artifact. Further, the iterative implementation of this method allowed for high quality reconstruction of susceptibility maps of human brain in vivo. The reconstructed susceptibility map provided excellent contrast of iron-rich deep nuclei and white matter bundles from surrounding tissues. Further, it also revealed anisotropic magnetic susceptibility in brain white matter. Hence, the proposed susceptibility mapping method may provide a powerful tool for the study of brain physiology and pathophysiology. Further elucidation of anisotropic magnetic susceptibility in vivo may allow us to gain more insight into the white matter microarchitectures. PMID:21224002
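
    The record's method starts from the Fourier-domain relationship between the measured field (phase) and the susceptibility distribution through the dipole kernel. A minimal baseline that uses that relationship is thresholded k-space division (TKD), sketched below; this is not the paper's derivative-augmented, streaking-suppressing algorithm, and the field-map normalization and the threshold value are assumptions.

```python
import numpy as np

def dipole_kernel(shape, voxel_size=(1.0, 1.0, 1.0), b0_dir=(0, 0, 1)):
    """k-space dipole kernel D(k) = 1/3 - (k . b0_hat)^2 / |k|^2 linking susceptibility
    to the normalized field perturbation."""
    ks = [np.fft.fftfreq(n, d=dx) for n, dx in zip(shape, voxel_size)]
    kx, ky, kz = np.meshgrid(*ks, indexing="ij")
    k2 = kx**2 + ky**2 + kz**2
    k2[0, 0, 0] = np.inf                       # avoid 0/0 at the k-space origin
    d = 1.0 / 3.0 - (kx*b0_dir[0] + ky*b0_dir[1] + kz*b0_dir[2])**2 / k2
    d[0, 0, 0] = 0.0                           # DC term undefined; zero by convention
    return d

def tkd_qsm(field_map, threshold=0.2):
    """Thresholded k-space division: chi(k) ~ field(k) / D(k), clipping small |D(k)|."""
    d = dipole_kernel(field_map.shape)
    d_clipped = np.where(np.abs(d) > threshold, d, threshold * np.where(d < 0, -1.0, 1.0))
    return np.real(np.fft.ifftn(np.fft.fftn(field_map) / d_clipped))
```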

  1. Normal tissue complication probability modeling of radiation-induced hypothyroidism after head-and-neck radiation therapy.

    Science.gov (United States)

    Bakhshandeh, Mohsen; Hashemi, Bijan; Mahdavi, Seied Rabi Mehdi; Nikoofar, Alireza; Vasheghani, Maryam; Kazemnejad, Anoshirvan

    2013-02-01

    To determine the dose-response relationship of the thyroid for radiation-induced hypothyroidism in head-and-neck radiation therapy, according to 6 normal tissue complication probability models, and to find the best-fit parameters of the models. Sixty-five patients treated with primary or postoperative radiation therapy for various cancers in the head-and-neck region were prospectively evaluated. Patient serum samples (tri-iodothyronine, thyroxine, thyroid-stimulating hormone [TSH], free tri-iodothyronine, and free thyroxine) were measured before and at regular time intervals until 1 year after the completion of radiation therapy. Dose-volume histograms (DVHs) of the patients' thyroid gland were derived from their computed tomography (CT)-based treatment planning data. Hypothyroidism was defined as increased TSH (subclinical hypothyroidism) or increased TSH in combination with decreased free thyroxine and thyroxine (clinical hypothyroidism). Thyroid DVHs were converted to 2 Gy/fraction equivalent doses using the linear-quadratic formula with α/β = 3 Gy. The evaluated models included the following: Lyman with the DVH reduced to the equivalent uniform dose (EUD), known as LEUD; Logit-EUD; mean dose; relative seriality; individual critical volume; and population critical volume models. The parameters of the models were obtained by fitting the patients' data using a maximum likelihood analysis method. The goodness of fit of the models was determined by the 2-sample Kolmogorov-Smirnov test. Ranking of the models was made according to Akaike's information criterion. Twenty-nine patients (44.6%) experienced hypothyroidism. None of the models was rejected according to the evaluation of the goodness of fit. The mean dose model was ranked as the best model on the basis of its Akaike's information criterion value. The D(50) estimated from the models was approximately 44 Gy. The implemented normal tissue complication probability models showed a parallel architecture for the
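
    The record converts thyroid DVHs to 2 Gy/fraction equivalent doses with the linear-quadratic model and α/β = 3 Gy. A minimal sketch of that standard EQD2 conversion follows; the bin doses and fraction number below are hypothetical.

```python
import numpy as np

def eqd2(total_dose, n_fractions, alpha_beta=3.0):
    """Convert a physical dose delivered in n_fractions to the 2 Gy/fraction
    equivalent dose using the linear-quadratic model."""
    d_per_fraction = total_dose / n_fractions
    return total_dose * (d_per_fraction + alpha_beta) / (2.0 + alpha_beta)

# Hypothetical thyroid DVH bin doses (Gy) for a 35-fraction course.
bin_doses = np.array([20.0, 35.0, 50.0, 66.0])
print(eqd2(bin_doses, n_fractions=35))
```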

  2. Normal Tissue Complication Probability Modeling of Radiation-Induced Hypothyroidism After Head-and-Neck Radiation Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Bakhshandeh, Mohsen [Department of Medical Physics, Faculty of Medical Sciences, Tarbiat Modares University, Tehran (Iran, Islamic Republic of); Hashemi, Bijan, E-mail: bhashemi@modares.ac.ir [Department of Medical Physics, Faculty of Medical Sciences, Tarbiat Modares University, Tehran (Iran, Islamic Republic of); Mahdavi, Seied Rabi Mehdi [Department of Medical Physics, Faculty of Medical Sciences, Tehran University of Medical Sciences, Tehran (Iran, Islamic Republic of); Nikoofar, Alireza; Vasheghani, Maryam [Department of Radiation Oncology, Hafte-Tir Hospital, Tehran University of Medical Sciences, Tehran (Iran, Islamic Republic of); Kazemnejad, Anoshirvan [Department of Biostatistics, Faculty of Medical Sciences, Tarbiat Modares University, Tehran (Iran, Islamic Republic of)

    2013-02-01

    Purpose: To determine the dose-response relationship of the thyroid for radiation-induced hypothyroidism in head-and-neck radiation therapy, according to 6 normal tissue complication probability models, and to find the best-fit parameters of the models. Methods and Materials: Sixty-five patients treated with primary or postoperative radiation therapy for various cancers in the head-and-neck region were prospectively evaluated. Patient serum samples (tri-iodothyronine, thyroxine, thyroid-stimulating hormone [TSH], free tri-iodothyronine, and free thyroxine) were measured before and at regular time intervals until 1 year after the completion of radiation therapy. Dose-volume histograms (DVHs) of the patients' thyroid gland were derived from their computed tomography (CT)-based treatment planning data. Hypothyroidism was defined as increased TSH (subclinical hypothyroidism) or increased TSH in combination with decreased free thyroxine and thyroxine (clinical hypothyroidism). Thyroid DVHs were converted to 2 Gy/fraction equivalent doses using the linear-quadratic formula with {alpha}/{beta} = 3 Gy. The evaluated models included the following: Lyman with the DVH reduced to the equivalent uniform dose (EUD), known as LEUD; Logit-EUD; mean dose; relative seriality; individual critical volume; and population critical volume models. The parameters of the models were obtained by fitting the patients' data using a maximum likelihood analysis method. The goodness of fit of the models was determined by the 2-sample Kolmogorov-Smirnov test. Ranking of the models was made according to Akaike's information criterion. Results: Twenty-nine patients (44.6%) experienced hypothyroidism. None of the models was rejected according to the evaluation of the goodness of fit. The mean dose model was ranked as the best model on the basis of its Akaike's information criterion value. The D{sub 50} estimated from the models was approximately 44 Gy. Conclusions: The implemented

  3. Hypothyroidism after primary radiotherapy for head and neck squamous cell carcinoma: Normal tissue complication probability modeling with latent time correction

    International Nuclear Information System (INIS)

    Rønjom, Marianne Feen; Brink, Carsten; Bentzen, Søren M.; Hegedüs, Laszlo; Overgaard, Jens; Johansen, Jørgen

    2013-01-01

    Background and purpose: To develop a normal tissue complication probability (NTCP) model of radiation-induced biochemical hypothyroidism (HT) after primary radiotherapy for head and neck squamous cell carcinoma (HNSCC) with adjustment for latency and clinical risk factors. Patients and methods: Patients with HNSCC receiving definitive radiotherapy with 66–68 Gy without surgery were followed up with serial post-treatment thyrotropin (TSH) assessment. HT was defined as TSH >4.0 mU/l. Data were analyzed with both a logistic and a mixture model (correcting for latency) to determine risk factors for HT and develop an NTCP model based on mean thyroid dose (MTD) and thyroid volume. Results: 203 patients were included. Median follow-up: 25.1 months. Five-year estimated risk of HT was 25.6%. In the mixture model, the only independent risk factors for HT were thyroid volume (cm 3 ) (OR = 0.75 [95% CI: 0.64–0.85], p 3 , respectively. Conclusions: Comparing the logistic and mixture models demonstrates the importance of latent-time correction in NTCP-modeling. Thyroid dose constraints in treatment planning should be individualized based on thyroid volume

  4. Evaluation of carrier collection probability in bifacial interdigitated-back-contact crystalline silicon solar cells by the internal quantum efficiency mapping method

    Science.gov (United States)

    Tachibana, Tomihisa; Tanahashi, Katsuto; Mochizuki, Toshimitsu; Shirasawa, Katsuhiko; Takato, Hidetaka

    2018-04-01

    Bifacial interdigitated-back-contact (IBC) silicon solar cells with a high bifaciality of 0.91 were fabricated. Screen printing and firing technology were used to reduce the production cost. For the first time, the relationship between the rear side structure and carrier collection probability was evaluated using internal quantum efficiency (IQE) mapping. The measurement results showed that the screen-printed electrode and back surface field (BSF) area led to low IQE. The low carrier collection probability by BSF area can be explained by electrical shading effects. Thus, it is clear that the IQE mapping system is useful to evaluate the IBC cell.

  5. METABOLIC MAPPING BY ENZYME HISTOCHEMISTRY IN LIVING ANIMALS, TISSUES AND CELLS

    NARCIS (Netherlands)

    van Noorden, C. J. F.

    2009-01-01

    Imaging of reporter molecules such as fluorescent proteins in intact animals, tissues and cells has become an indispensable tool in cell biology. Imaging the activity of enzymes, which is called metabolic mapping, provides information on subcellular localisation in combination with the function of the enzymes.

  6. NIH Scientists Map Genetic Changes That Drive Tumors in a Common Pediatric Soft-Tissue Cancer

    Science.gov (United States)

  7. Spatial cluster analysis of nanoscopically mapped serotonin receptors for classification of fixed brain tissue

    Science.gov (United States)

    Sams, Michael; Silye, Rene; Göhring, Janett; Muresan, Leila; Schilcher, Kurt; Jacak, Jaroslaw

    2014-01-01

    We present a spatial cluster analysis method using nanoscopic dSTORM images to determine changes in protein cluster distributions within brain tissue. Such methods are suitable for investigating human brain tissue and will help to achieve a deeper understanding of brain disease, along with aiding drug development. Human brain tissue samples are usually treated postmortem via standard fixation protocols, which are established in clinical laboratories. Therefore, our localization microscopy-based method was adapted to characterize protein density and protein cluster localization in samples fixed using different protocols followed by common fluorescent immunohistochemistry techniques. The localization microscopy allows nanoscopic mapping of serotonin 5-HT1A receptor groups within a two-dimensional image of a brain tissue slice. These nanoscopically mapped proteins can be confined to clusters by applying the proposed statistical spatial analysis. Selected features of such clusters were subsequently used to characterize and classify the tissue. Samples were obtained from different types of patients, fixed with different preparation methods, and finally stored in a human tissue bank. To verify the proposed method, samples of a cryopreserved healthy brain were compared with epitope-retrieved and paraffin-fixed tissues. Furthermore, samples of healthy brain tissues were compared with data obtained from patients suffering from mental illnesses (e.g., major depressive disorder). Our work demonstrates the applicability of localization microscopy and image analysis methods for the comparison and classification of human brain tissues at a nanoscopic level. Furthermore, the presented workflow marks a unique technological advance in the characterization of protein distributions in brain tissue sections.
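
    The record groups nanoscopically mapped receptor localizations into clusters and derives features from them, but does not spell out the clustering algorithm; the sketch below uses DBSCAN as a stand-in and computes two simple per-cluster features on synthetic 2D localizations.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def cluster_features(points, eps=50.0, min_samples=10):
    """Group 2D localizations (e.g. in nm) into clusters and report per-cluster
    size and radius of gyration; noise points (label -1) are skipped."""
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(points)
    features = []
    for lab in sorted(set(labels) - {-1}):
        pts = points[labels == lab]
        r_gyr = np.sqrt(((pts - pts.mean(axis=0)) ** 2).sum(axis=1).mean())
        features.append({"label": int(lab), "n_localizations": len(pts),
                         "radius_of_gyration": float(r_gyr)})
    return features

rng = np.random.default_rng(1)
points = np.vstack([rng.normal(0, 20, (200, 2)), rng.normal(500, 20, (150, 2))])
print(cluster_features(points))
```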

  8. Mapping absolute tissue endogenous fluorophore concentrations with chemometric wide-field fluorescence microscopy

    Science.gov (United States)

    Xu, Zhang; Reilley, Michael; Li, Run; Xu, Min

    2017-06-01

    We report chemometric wide-field fluorescence microscopy for imaging the spatial distribution and concentration of endogenous fluorophores in thin tissue sections. Nonnegative factorization aided by spatial diversity is used to learn both the spectral signature and the spatial distribution of endogenous fluorophores from microscopic fluorescence color images obtained under broadband excitation and detection. The absolute concentration map of individual fluorophores is derived by comparing the fluorescence from "pure" fluorophores under the identical imaging condition following the identification of the fluorescence species by its spectral signature. This method is then demonstrated by characterizing the concentration map of endogenous fluorophores (including tryptophan, elastin, nicotinamide adenine dinucleotide, and flavin adenine dinucleotide) for lung tissue specimens. The absolute concentrations of these fluorophores are all found to decrease significantly from normal, perilesional, to cancerous (squamous cell carcinoma) tissue. Discriminating tissue types using the absolute fluorophore concentration is found to be significantly more accurate than that achievable with the relative fluorescence strength. Quantification of fluorophores in terms of the absolute concentration map is also advantageous in eliminating the uncertainties due to system responses or measurement details, yielding more biologically relevant data, and simplifying the assessment of competing imaging approaches.
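
    Nonnegative factorization of per-pixel fluorescence spectra into fluorophore signatures and abundance maps can be sketched with scikit-learn's NMF; the channel count, component number, and synthetic image below are assumptions, and the absolute-concentration calibration against pure fluorophores described in the record is not shown.

```python
import numpy as np
from sklearn.decomposition import NMF

# Hypothetical stack: an H x W field of view recorded in C broadband colour channels.
H, W, C = 64, 64, 8
rng = np.random.default_rng(2)
image = rng.random((H, W, C))

# Factor pixel spectra into k nonnegative "fluorophore" signatures and abundance maps.
k = 3
model = NMF(n_components=k, init="nndsvda", max_iter=500)
abundances = model.fit_transform(image.reshape(-1, C))  # (H*W, k) spatial weights
signatures = model.components_                          # (k, C) spectral signatures
abundance_maps = abundances.reshape(H, W, k)            # relative (uncalibrated) maps
```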

  9. Integrating spatial, temporal, and size probabilities for the annual landslide hazard maps in the Shihmen watershed, Taiwan

    Directory of Open Access Journals (Sweden)

    C. Y. Wu

    2013-09-01

    Landslide spatial, temporal, and size probabilities were used to perform a landslide hazard assessment in this study. Eleven intrinsic geomorphological factors and two extrinsic rainfall factors were evaluated as landslide-susceptibility-related factors by means of success rate curves, landslide ratio plots, frequency distributions of landslide and non-landslide groups, and probability–probability plots. Data on landslides caused by Typhoon Aere in the Shihmen watershed were selected to train the susceptibility model. The landslide area probability, based on the power law relationship between the landslide area and its noncumulative number, was analyzed using the Pearson type 5 probability density function. The exceedance probabilities of rainfall with various recurrence intervals, including 2, 5, 10, 20, 50, 100 and 200 yr, were used to determine the temporal probabilities of the events. The study was conducted in the Shihmen watershed, which has an area of 760 km² and is one of the main water sources for northern Taiwan. The validation result of Typhoon Krosa demonstrated that this landslide hazard model could be used to predict the landslide probabilities. The results suggested that integration of spatial, area, and exceedance probabilities to estimate the annual landslide probability of each slope unit is feasible. The advantage of this annual landslide probability model lies in its ability to estimate the annual landslide risk, instead of a scenario-based risk.
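
    One way to read the integration of spatial, size, and rainfall-exceedance probabilities into an annual landslide probability per slope unit is as a product of scenario probabilities combined across recurrence intervals under an independence assumption; the sketch below is illustrative only and is not the study's exact integration scheme.

```python
import numpy as np

def annual_landslide_probability(p_spatial, p_size, recurrence_intervals):
    """Combine spatial failure and landslide-size probabilities with the annual
    occurrence probabilities of rainfall scenarios (recurrence intervals sorted
    ascending; scenarios treated as independent). Each interval T contributes the
    increment between adjacent annual exceedance probabilities 1/T."""
    exceedance = 1.0 / np.asarray(recurrence_intervals, dtype=float)
    increments = np.append(exceedance[:-1] - exceedance[1:], exceedance[-1])
    return 1.0 - np.prod(1.0 - p_spatial * p_size * increments)

# Hypothetical values for a single slope unit.
print(annual_landslide_probability(p_spatial=0.6, p_size=0.3,
                                   recurrence_intervals=[2, 5, 10, 20, 50, 100, 200]))
```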

  10. High-spatial-resolution mapping of the oxygen concentration in cortical tissue (Conference Presentation)

    Science.gov (United States)

    Jaswal, Rajeshwer S.; Yaseen, Mohammad A.; Fu, Buyin; Boas, David A.; Sakadžic, Sava

    2016-03-01

    Due to a lack of tools for high-resolution imaging of cortical tissue oxygenation, detailed maps of the oxygen partial pressure (PO2) around arterioles, venules, and capillaries remain largely unknown. Therefore, we have limited knowledge about the mechanisms that secure sufficient oxygen delivery in microvascular domains during brain activation and that provide some metabolic reserve capacity in diseases affecting either microvascular networks or the regulation of cerebral blood flow (CBF). To address this challenge, we applied two-photon PO2 microscopy to map PO2 at different depths in the mouse cortex. Measurements were performed through a cranial window in anesthetized healthy mice as well as in mouse models of microvascular dysfunction. In addition, microvascular morphology was recorded by two-photon microscopy at the end of each experiment and subsequently segmented. Co-registration of the PO2 measurements and the exact microvascular morphology enabled quantification of the dependence of tissue PO2 on distance from the arterioles, capillaries, and venules at various depths. Our measurements reveal significant spatial heterogeneity of the cortical tissue PO2 distribution, dominated by high oxygenation in periarteriolar spaces. In cases of impaired oxygen delivery due to microvascular dysfunction, a significant reduction in tissue oxygenation away from the arterioles was observed. These tissue domains may be the initial sites of cortical injury that further exacerbate the progression of the disease.

  11. Design and Selection of Machine Learning Methods Using Radiomics and Dosiomics for Normal Tissue Complication Probability Modeling of Xerostomia.

    Science.gov (United States)

    Gabryś, Hubert S; Buettner, Florian; Sterzing, Florian; Hauswald, Henrik; Bangert, Mark

    2018-01-01

    The purpose of this study is to investigate whether machine learning with dosiomic, radiomic, and demographic features allows for xerostomia risk assessment more precise than normal tissue complication probability (NTCP) models based on the mean radiation dose to parotid glands. A cohort of 153 head-and-neck cancer patients was used to model xerostomia at 0-6 months (early), 6-15 months (late), 15-24 months (long-term), and at any time (a longitudinal model) after radiotherapy. Predictive power of the features was evaluated by the area under the receiver operating characteristic curve (AUC) of univariate logistic regression models. The multivariate NTCP models were tuned and tested with single and nested cross-validation, respectively. We compared predictive performance of seven classification algorithms, six feature selection methods, and ten data cleaning/class balancing techniques using the Friedman test and the Nemenyi post hoc analysis. NTCP models based on the parotid mean dose failed to predict xerostomia (AUCs < 0.60). The most informative predictors were found for late and long-term xerostomia. Late xerostomia correlated with the contralateral dose gradient in the anterior-posterior (AUC = 0.72) and the right-left (AUC = 0.68) direction, whereas long-term xerostomia was associated with parotid volumes (AUCs > 0.85), dose gradients in the right-left (AUCs > 0.78), and the anterior-posterior (AUCs > 0.72) direction. Multivariate models of long-term xerostomia were typically based on the parotid volume, the parotid eccentricity, and the dose-volume histogram (DVH) spread with the generalization AUCs ranging from 0.74 to 0.88. On average, support vector machines and extra-trees were the top performing classifiers, whereas the algorithms based on logistic regression were the best choice for feature selection. We found no advantage in using data cleaning or class balancing methods. We demonstrated that incorporation of organ- and dose-shape descriptors is beneficial for xerostomia prediction in highly conformal radiotherapy treatments. Due to strong reliance on patient-specific, dose-independent factors, our results underscore the need for development of personalized data-driven risk profiles for NTCP models of xerostomia. The facilitated
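
    The univariate screening step, ranking single features by the AUC of a logistic regression, can be sketched as follows; the feature names, cohort size, and labels are synthetic, and the nested cross-validation used for the multivariate models is not shown.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import roc_auc_score

def univariate_auc(X, y, feature_names, cv=5):
    """Rank features by the cross-validated AUC of single-feature logistic regressions."""
    scores = {}
    for j, name in enumerate(feature_names):
        prob = cross_val_predict(LogisticRegression(), X[:, [j]], y,
                                 cv=cv, method="predict_proba")[:, 1]
        scores[name] = roc_auc_score(y, prob)
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

rng = np.random.default_rng(3)
X = rng.normal(size=(153, 4))                                # synthetic feature matrix
y = (X[:, 0] + 0.5 * rng.normal(size=153) > 0).astype(int)   # synthetic toxicity labels
print(univariate_auc(X, y, ["dose_grad_ap", "dose_grad_rl", "parotid_volume", "age"]))
```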

  12. Methods for Reducing Normal Tissue Complication Probabilities in Oropharyngeal Cancer: Dose Reduction or Planning Target Volume Elimination

    Energy Technology Data Exchange (ETDEWEB)

    Samuels, Stuart E.; Eisbruch, Avraham; Vineberg, Karen; Lee, Jae; Lee, Choonik; Matuszak, Martha M.; Ten Haken, Randall K.; Brock, Kristy K., E-mail: kbrock@med.umich.edu

    2016-11-01

    Purpose: Strategies to reduce the toxicities of head and neck radiation (ie, dysphagia [difficulty swallowing] and xerostomia [dry mouth]) are currently underway. However, the predicted benefit of dose and planning target volume (PTV) reduction strategies is unknown. The purpose of the present study was to compare the normal tissue complication probabilities (NTCP) for swallowing and salivary structures in standard plans (70 Gy [P70]), dose-reduced plans (60 Gy [P60]), and plans eliminating the PTV margin. Methods and Materials: A total of 38 oropharyngeal cancer (OPC) plans were analyzed. Standard organ-sparing volumetric modulated arc therapy plans (P70) were created and then modified by eliminating the PTVs and treating the clinical tumor volumes (CTVs) only (C70) or maintaining the PTV but reducing the dose to 60 Gy (P60). NTCP dose models for the pharyngeal constrictors, glottis/supraglottic larynx, parotid glands (PGs), and submandibular glands (SMGs) were analyzed. The minimal clinically important benefit was defined as a mean change in NTCP of >5%. The P70 NTCP thresholds and overlap percentages of the organs at risk with the PTVs (56-59 Gy, vPTV{sub 56}) were evaluated to identify the predictors for NTCP improvement. Results: With the P60 plans, only the ipsilateral PG (iPG) benefited (23.9% vs 16.2%; P<.01). With the C70 plans, only the iPG (23.9% vs 17.5%; P<.01) and contralateral SMG (cSMG) (NTCP 32.1% vs 22.9%; P<.01) benefited. An iPG NTCP threshold of 20% and 30% predicted NTCP benefits for the P60 and C70 plans, respectively (P<.001). A cSMG NTCP threshold of 30% predicted for an NTCP benefit with the C70 plans (P<.001). Furthermore, for the iPG, a vPTV{sub 56} >13% predicted benefit with P60 (P<.001) and C70 (P=.002). For the cSMG, a vPTV{sub 56} >22% predicted benefit with C70 (P<.01). Conclusions: PTV elimination and dose-reduction lowered the NTCP of the iPG, and PTV elimination lowered the NTCP of the cSMG. NTCP thresholds and the

  13. Design and Selection of Machine Learning Methods Using Radiomics and Dosiomics for Normal Tissue Complication Probability Modeling of Xerostomia

    Directory of Open Access Journals (Sweden)

    Hubert S. Gabryś

    2018-03-01

    Purpose: The purpose of this study is to investigate whether machine learning with dosiomic, radiomic, and demographic features allows for xerostomia risk assessment more precise than normal tissue complication probability (NTCP) models based on the mean radiation dose to parotid glands. Material and Methods: A cohort of 153 head-and-neck cancer patients was used to model xerostomia at 0–6 months (early), 6–15 months (late), 15–24 months (long-term), and at any time (a longitudinal model) after radiotherapy. Predictive power of the features was evaluated by the area under the receiver operating characteristic curve (AUC) of univariate logistic regression models. The multivariate NTCP models were tuned and tested with single and nested cross-validation, respectively. We compared predictive performance of seven classification algorithms, six feature selection methods, and ten data cleaning/class balancing techniques using the Friedman test and the Nemenyi post hoc analysis. Results: NTCP models based on the parotid mean dose failed to predict xerostomia (AUCs < 0.60). The most informative predictors were found for late and long-term xerostomia. Late xerostomia correlated with the contralateral dose gradient in the anterior–posterior (AUC = 0.72) and the right–left (AUC = 0.68) direction, whereas long-term xerostomia was associated with parotid volumes (AUCs > 0.85), dose gradients in the right–left (AUCs > 0.78), and the anterior–posterior (AUCs > 0.72) direction. Multivariate models of long-term xerostomia were typically based on the parotid volume, the parotid eccentricity, and the dose–volume histogram (DVH) spread with the generalization AUCs ranging from 0.74 to 0.88. On average, support vector machines and extra-trees were the top performing classifiers, whereas the algorithms based on logistic regression were the best choice for feature selection. We found no advantage in using data cleaning or class balancing

  14. The Benefits of Including Clinical Factors in Rectal Normal Tissue Complication Probability Modeling After Radiotherapy for Prostate Cancer

    International Nuclear Information System (INIS)

    Defraene, Gilles; Van den Bergh, Laura; Al-Mamgani, Abrahim; Haustermans, Karin; Heemsbergen, Wilma; Van den Heuvel, Frank; Lebesque, Joos V.

    2012-01-01

    Purpose: To study the impact of clinical predisposing factors on rectal normal tissue complication probability modeling using the updated results of the Dutch prostate dose-escalation trial. Methods and Materials: Toxicity data of 512 patients (conformally treated to 68 Gy [n = 284] and 78 Gy [n = 228]) with complete follow-up at 3 years after radiotherapy were studied. Scored end points were rectal bleeding, high stool frequency, and fecal incontinence. Two traditional dose-based models (Lyman-Kutcher-Burman (LKB) and Relative Seriality (RS)) and a logistic model were fitted using a maximum likelihood approach. Furthermore, these model fits were improved by including the most significant clinical factors. The area under the receiver operating characteristic curve (AUC) was used to compare the discriminating ability of all fits. Results: Including clinical factors significantly increased the predictive power of the models for all end points. In the optimal LKB, RS, and logistic models for rectal bleeding and fecal incontinence, the first significant (p = 0.011–0.013) clinical factor was “previous abdominal surgery.” As the second significant (p = 0.012–0.016) factor, “cardiac history” was included in all three rectal bleeding fits, whereas including “diabetes” was significant (p = 0.039–0.048) in fecal incontinence modeling but only in the LKB and logistic models. High stool frequency fits only benefitted significantly (p = 0.003–0.006) from the inclusion of the baseline toxicity score. For all models, rectal bleeding fits had the highest AUC (0.77), whereas it was 0.63 and 0.68 for high stool frequency and fecal incontinence, respectively. LKB and logistic model fits resulted in similar values for the volume parameter. The steepness parameter was somewhat higher in the logistic model, also resulting in a slightly lower D 50. Anal wall DVHs were used for fecal incontinence, whereas anorectal wall dose best described the other two endpoints. Conclusions
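
    A hedged sketch of the general approach, a maximum-likelihood logistic NTCP fit on a dose metric plus binary clinical factors; the covariates, coefficients, and data below are synthetic and are not the trial's fitted values.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Synthetic per-patient data: one dose metric plus two binary clinical factors.
rng = np.random.default_rng(4)
n = 512
dose_metric = rng.normal(60.0, 8.0, n)        # e.g. a gEUD- or V65-like summary
prev_surgery = rng.integers(0, 2, n)          # previous abdominal surgery (0/1)
cardiac_hist = rng.integers(0, 2, n)          # cardiac history (0/1)
logit = -12.0 + 0.15 * dose_metric + 0.8 * prev_surgery + 0.6 * cardiac_hist
toxicity = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

X = np.column_stack([dose_metric, prev_surgery, cardiac_hist])
model = LogisticRegression().fit(X, toxicity)   # maximum-likelihood logistic NTCP fit
ntcp = model.predict_proba(X)[:, 1]
print("AUC =", round(roc_auc_score(toxicity, ntcp), 2))
```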

  15. Multivariate normal tissue complication probability modeling of gastrointestinal toxicity after external beam radiotherapy for localized prostate cancer

    International Nuclear Information System (INIS)

    Cella, Laura; D’Avino, Vittoria; Liuzzi, Raffaele; Conson, Manuel; Doria, Francesca; Faiella, Adriana; Loffredo, Filomena; Salvatore, Marco; Pacelli, Roberto

    2013-01-01

    The risk of radio-induced gastrointestinal (GI) complications is affected by several factors other than the dose to the rectum, such as patient characteristics, hormonal or antihypertensive therapy, and acute rectal toxicity. The purpose of this work is to study clinical and dosimetric parameters impacting late GI toxicity after prostate external beam radiotherapy (RT) and to establish a multivariate normal tissue complication probability (NTCP) model for radiation-induced GI complications. A total of 57 men who had undergone definitive RT for prostate cancer were evaluated for GI events classified using the RTOG/EORTC scoring system. Their median age was 73 years (range 53–85). The patients were assessed for GI toxicity before, during, and periodically after RT completion. Several clinical variables along with rectum dose-volume parameters (Vx) were collected and their correlation to GI toxicity was analyzed by Spearman’s rank correlation coefficient (Rs). A multivariate logistic regression method using resampling techniques was applied to select model order and parameters for NTCP modeling. Model performance was evaluated through the area under the receiver operating characteristic curve (AUC). At a median follow-up of 30 months, 37% (21/57) of patients developed G1-2 acute GI events while 33% (19/57) were diagnosed with G1-2 late GI events. An NTCP model for late mild/moderate GI toxicity based on three variables including V65 (OR = 1.03), antihypertensive and/or anticoagulant (AH/AC) drugs (OR = 0.24), and acute GI toxicity (OR = 4.3) was selected as the most predictive model (Rs = 0.47, p < 0.001; AUC = 0.79). This three-variable model outperforms the logistic model based on V65 only (Rs = 0.28, p < 0.001; AUC = 0.69). We propose a logistic NTCP model for late GI toxicity considering not only rectal irradiation dose but also clinical patient-specific factors. Accordingly, the risk of G1-2 late GI toxicity increases as V65 increases; it is higher for patients experiencing

  16. The Benefits of Including Clinical Factors in Rectal Normal Tissue Complication Probability Modeling After Radiotherapy for Prostate Cancer

    Energy Technology Data Exchange (ETDEWEB)

    Defraene, Gilles, E-mail: gilles.defraene@uzleuven.be [Radiation Oncology Department, University Hospitals Leuven, Leuven (Belgium); Van den Bergh, Laura [Radiation Oncology Department, University Hospitals Leuven, Leuven (Belgium); Al-Mamgani, Abrahim [Department of Radiation Oncology, Erasmus Medical Center - Daniel den Hoed Cancer Center, Rotterdam (Netherlands); Haustermans, Karin [Radiation Oncology Department, University Hospitals Leuven, Leuven (Belgium); Heemsbergen, Wilma [Netherlands Cancer Institute - Antoni van Leeuwenhoek Hospital, Amsterdam (Netherlands); Van den Heuvel, Frank [Radiation Oncology Department, University Hospitals Leuven, Leuven (Belgium); Lebesque, Joos V. [Netherlands Cancer Institute - Antoni van Leeuwenhoek Hospital, Amsterdam (Netherlands)

    2012-03-01

    Purpose: To study the impact of clinical predisposing factors on rectal normal tissue complication probability modeling using the updated results of the Dutch prostate dose-escalation trial. Methods and Materials: Toxicity data of 512 patients (conformally treated to 68 Gy [n = 284] and 78 Gy [n = 228]) with complete follow-up at 3 years after radiotherapy were studied. Scored end points were rectal bleeding, high stool frequency, and fecal incontinence. Two traditional dose-based models (Lyman-Kutcher-Burman (LKB) and Relative Seriality (RS)) and a logistic model were fitted using a maximum likelihood approach. Furthermore, these model fits were improved by including the most significant clinical factors. The area under the receiver operating characteristic curve (AUC) was used to compare the discriminating ability of all fits. Results: Including clinical factors significantly increased the predictive power of the models for all end points. In the optimal LKB, RS, and logistic models for rectal bleeding and fecal incontinence, the first significant (p = 0.011-0.013) clinical factor was 'previous abdominal surgery.' As the second significant (p = 0.012-0.016) factor, 'cardiac history' was included in all three rectal bleeding fits, whereas including 'diabetes' was significant (p = 0.039-0.048) in fecal incontinence modeling but only in the LKB and logistic models. High stool frequency fits only benefitted significantly (p = 0.003-0.006) from the inclusion of the baseline toxicity score. For all models, rectal bleeding fits had the highest AUC (0.77), whereas it was 0.63 and 0.68 for high stool frequency and fecal incontinence, respectively. LKB and logistic model fits resulted in similar values for the volume parameter. The steepness parameter was somewhat higher in the logistic model, also resulting in a slightly lower D{sub 50}. Anal wall DVHs were used for fecal incontinence, whereas anorectal wall dose best described the other two endpoints.

  17. Towards optical spectroscopic anatomical mapping (OSAM) for lesion validation in cardiac tissue (Conference Presentation)

    Science.gov (United States)

    Singh-Moon, Rajinder P.; Zaryab, Mohammad; Hendon, Christine P.

    2017-02-01

    Electroanatomical mapping (EAM) is an invaluable tool for guiding cardiac radiofrequency ablation (RFA) therapy. The principal roles of EAM are the identification of candidate ablation sites, by detecting regions of abnormal electrogram activity, and lesion validation subsequent to RF energy delivery. However, incomplete lesions may present interim electrical inactivity similar to effective treatment in the acute setting, despite efforts to reveal them with pacing or drugs such as adenosine. Studies report that the misidentification and recovery of such lesions is a leading cause of arrhythmia recurrence and repeat procedures. In previous work, we demonstrated spectroscopic characterization of cardiac tissues using a fiber optic-integrated RF ablation catheter. In this work, we introduce OSAM (optical spectroscopic anatomical mapping), the application of this spectroscopic technique to obtain 2-dimensional biodistribution maps. We demonstrate its diagnostic potential as an auxiliary method for lesion validation in treated swine preparations. Endocardial lesion sets were created on fresh swine cardiac samples using a commercial RFA system. An optically-integrated catheter console fabricated in-house was used for measurement of tissue optical spectra between 600 and 1000 nm. Three-dimensional spatio-spectral datasets were generated by raster scanning of the optical catheter across the treated sample surface in the presence of whole blood. Tissue optical parameters were recovered at each spatial position using an inverse Monte Carlo method. OSAM biodistribution maps showed stark correspondence with gross examination of tetrazolium chloride stained tissue specimens. Specifically, we demonstrate the ability of OSAM to readily distinguish between shallow and deeper lesions, a limitation faced by current EAM techniques. These results showcase OSAM's potential for lesion validation strategies for the treatment of cardiac arrhythmias.

  18. A comparison between probability and information measures of uncertainty in a simulated soil map and the economic value of imperfect soil information.

    Science.gov (United States)

    Lark, R. Murray

    2014-05-01

    Conventionally the uncertainty of a conventional soil map has been expressed in terms of the mean purity of its map units: the probability that the soil profile class examined at a site would be found to correspond to the eponymous class of the simple map unit that is delineated there (Burrough et al, 1971). This measure of uncertainty has an intuitive meaning and is used for quality control in soil survey contracts (Western, 1978). However, it may be of limited value to the manager or policy maker who wants to decide whether the map provides a basis for decision making, and whether the cost of producing a better map would be justified. In this study I extend a published analysis of the economic implications of uncertainty in a soil map (Giasson et al., 2000). A decision analysis was developed to assess the economic value of imperfect soil map information for agricultural land use planning. Random error matrices for the soil map units were then generated, subject to constraints which ensure consistency with fixed frequencies of the different soil classes. For each error matrix the mean map unit purity was computed, and the value of the implied imperfect soil information was computed by the decision analysis. An alternative measure of the uncertainty in a soil map was considered. This is the mean soil map information which is the difference between the information content of a soil observation, at a random location in the region, and the information content of a soil observation given that the map unit is known. I examined the relationship between the value of imperfect soil information and the purity and information measures of map uncertainty. In both cases there was considerable variation in the economic value of possible maps with fixed values of the uncertainty measure. However, the correlation was somewhat stronger with the information measure, and there was a clear upper bound on the value of an imperfect soil map when the mean information takes some
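
    Reading the "mean soil map information" as the mutual information between map unit and true soil class (my interpretation of the description above), both measures can be computed from a joint map-unit-by-class frequency table; the table below is hypothetical.

```python
import numpy as np

def purity_and_information(joint):
    """joint[i, j]: proportion of the region mapped as unit i whose true soil class is j
    (square matrix, map unit i named after class i)."""
    joint = joint / joint.sum()
    unit_area = joint.sum(axis=1)                      # area fraction of each map unit
    mean_purity = np.sum(np.diag(joint))               # area-weighted mean unit purity

    safe_log2 = lambda p: np.log2(np.where(p > 0, p, 1.0))   # treat 0*log(0) as 0
    class_freq = joint.sum(axis=0)
    h_class = -np.sum(class_freq * safe_log2(class_freq))           # prior uncertainty
    h_given_unit = -np.sum(joint * safe_log2(joint / unit_area[:, None]))
    return mean_purity, h_class - h_given_unit         # purity, mean information (bits)

# Hypothetical 3-unit / 3-class confusion proportions.
joint = np.array([[0.30, 0.05, 0.05],
                  [0.04, 0.25, 0.06],
                  [0.05, 0.05, 0.15]])
print(purity_and_information(joint))
```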

  19. Tissue Cancellation in Dual Energy Mammography Using a Calibration Phantom Customized for Direct Mapping.

    Science.gov (United States)

    Han, Seokmin; Kang, Dong-Goo

    2014-01-01

    An easily implementable tissue cancellation method for dual energy mammography is proposed to reduce anatomical noise and enhance lesion visibility. For dual energy calibration, the images of an imaging object are directly mapped onto the images of a customized calibration phantom. Each pixel pair of the low and high energy images of the imaged object was compared to pixel pairs of the low and high energy images of the calibration phantom. The correspondence was measured by the absolute difference between the pixel values of the imaged object and those of the calibration phantom. Then the closest pixel pair of the calibration phantom images is marked and selected. After the calibration using direct mapping, the regions with lesions yielded a different thickness from the background tissues. Taking advantage of this thickness difference, the visibility of cancerous lesions was enhanced with increased contrast-to-noise ratio, depending on the lesion size and breast thickness. However, some tissues near the edge of the imaged object still remained after tissue cancellation. These remaining residuals seem to occur due to the heel effect, scattering, nonparallel X-ray beam geometry and the Poisson distribution of photons. To improve performance further, scattering and the heel effect should be compensated for.
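
    The direct mapping step, finding for each (low, high) pixel pair of the imaged object the closest (low, high) pair in the calibration phantom by absolute difference, can be sketched as a brute-force nearest-neighbour lookup; the label array standing in for the phantom's known composition is an assumption.

```python
import numpy as np

def direct_map(object_low, object_high, phantom_low, phantom_high, phantom_labels):
    """For every (low, high) pixel pair of the imaged object, pick the calibration-phantom
    pixel pair with the smallest summed absolute difference and return its label
    (e.g. the phantom's known equivalent composition/thickness at that pixel)."""
    obj = np.stack([object_low.ravel(), object_high.ravel()], axis=1)     # (N, 2)
    cal = np.stack([phantom_low.ravel(), phantom_high.ravel()], axis=1)   # (M, 2)
    dist = np.abs(obj[:, None, :] - cal[None, :, :]).sum(axis=2)          # (N, M)
    best = dist.argmin(axis=1)
    return phantom_labels.ravel()[best].reshape(object_low.shape)
```

    For full-size mammograms a k-d tree or grid lookup would avoid holding the N x M distance matrix in memory; the brute-force version is kept here only for clarity.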

  20. User’s guide for MapMark4—An R package for the probability calculations in three-part mineral resource assessments

    Science.gov (United States)

    Ellefsen, Karl J.

    2017-06-27

    MapMark4 is a software package that implements the probability calculations in three-part mineral resource assessments. Functions within the software package are written in the R statistical programming language. These functions, their documentation, and a copy of this user’s guide are bundled together in R’s unit of shareable code, which is called a “package.” This user’s guide includes step-by-step instructions showing how the functions are used to carry out the probability calculations. The calculations are demonstrated using test data, which are included in the package.

  1. An automated method for mapping human tissue permittivities by MRI in hyperthermia treatment planning

    International Nuclear Information System (INIS)

    Farace, Paolo; Antolini, Renzo; Pontalti, Rolando; Cristoforetti, Luca; Scarpa, Marina

    1997-01-01

    This paper presents an automatic method to obtain tissue complex permittivity values to be used as input data in the computer modelling for hyperthermia treatment planning. Magnetic resonance (MR) images were acquired and the tissue water content was calculated from the signal intensity of the image pixels. The tissue water content was converted into complex permittivity values by monotonic functions based on mixture theory. To obtain a water content map by MR imaging a gradient-echo pulse sequence was used and an experimental procedure was set up to correct for relaxation and radiofrequency field inhomogeneity effects on signal intensity. Two approaches were followed to assign the permittivity values to fat-rich tissues: (i) fat-rich tissue localization by a segmentation procedure followed by assignment of tabulated permittivity values; (ii) water content evaluation by chemical shift imaging followed by permittivity calculation. Tests were performed on phantoms of known water content to establish the reliability of the proposed method. MRI data were acquired and processed pixel-by-pixel according to the outlined procedure. The signal intensity in the phantom images correlated well with water content. Experiments were performed on volunteers' healthy tissue. In particular two anatomical structures were chosen to calculate permittivity maps: the head and the thigh. The water content and electric permittivity values were obtained from the MRI data and compared to others in the literature. A good agreement was found for muscle, cerebrospinal fluid (CSF) and white and grey matter. The advantages of the reported method are discussed in the light of possible application in hyperthermia treatment planning. (author)
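
    The record maps per-pixel water content to complex permittivity through monotonic mixture-theory functions that are not reproduced here; the sketch below substitutes a simple (hypothetical) linear two-component mixture purely for illustration, with assumed endpoint permittivities.

```python
import numpy as np

# Hypothetical endpoint values at the hyperthermia frequency of interest.
EPS_WATER = 78.0 - 12.0j   # complex relative permittivity assumed for free water
EPS_DRY = 5.0 - 0.5j       # assumed "dry"/lipid-rich tissue component

def permittivity_from_water_content(w):
    """Map fractional water content w (0..1) to a complex permittivity with a simple
    monotonic linear mixture; a stand-in for the paper's mixture-theory functions."""
    w = np.clip(w, 0.0, 1.0)
    return EPS_DRY + w * (EPS_WATER - EPS_DRY)

water_map = np.array([[0.10, 0.45], [0.70, 0.85]])  # per-pixel water content from MRI
print(permittivity_from_water_content(water_map))
```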

  2. An automated method for mapping human tissue permittivities by MRI in hyperthermia treatment planning

    Energy Technology Data Exchange (ETDEWEB)

    Farace, Paolo; Antolini, Renzo [CMBM-ITC, Centro Materiali e Biofisica Medica, 38050 Povo-Trento (Italy); Dipartimento di Fisica and INFM, Universita di Trento, 38050 Povo-Trento (Italy); Pontalti, Rolando; Cristoforetti, Luca [CMBM-ITC, Centro Materiali e Biofisica Medica, 38050 Povo-Trento (Italy); Scarpa, Marina [Dipartimento di Fisica and INFM, Universita di Trento, 38050 Povo-Trento (Italy)

    1997-11-01

    This paper presents an automatic method to obtain tissue complex permittivity values to be used as input data in the computer modelling for hyperthermia treatment planning. Magnetic resonance (MR) images were acquired and the tissue water content was calculated from the signal intensity of the image pixels. The tissue water content was converted into complex permittivity values by monotonic functions based on mixture theory. To obtain a water content map by MR imaging a gradient-echo pulse sequence was used and an experimental procedure was set up to correct for relaxation and radiofrequency field inhomogeneity effects on signal intensity. Two approaches were followed to assign the permittivity values to fat-rich tissues: (i) fat-rich tissue localization by a segmentation procedure followed by assignment of tabulated permittivity values; (ii) water content evaluation by chemical shift imaging followed by permittivity calculation. Tests were performed on phantoms of known water content to establish the reliability of the proposed method. MRI data were acquired and processed pixel-by-pixel according to the outlined procedure. The signal intensity in the phantom images correlated well with water content. Experiments were performed on volunteers' healthy tissue. In particular two anatomical structures were chosen to calculate permittivity maps: the head and the thigh. The water content and electric permittivity values were obtained from the MRI data and compared to others in the literature. A good agreement was found for muscle, cerebrospinal fluid (CSF) and white and grey matter. The advantages of the reported method are discussed in the light of possible application in hyperthermia treatment planning. (author)

  3. An automated method for mapping human tissue permittivities by MRI in hyperthermia treatment planning.

    Science.gov (United States)

    Farace, P; Pontalti, R; Cristoforetti, L; Antolini, R; Scarpa, M

    1997-11-01

    This paper presents an automatic method to obtain tissue complex permittivity values to be used as input data in the computer modelling for hyperthermia treatment planning. Magnetic resonance (MR) images were acquired and the tissue water content was calculated from the signal intensity of the image pixels. The tissue water content was converted into complex permittivity values by monotonic functions based on mixture theory. To obtain a water content map by MR imaging a gradient-echo pulse sequence was used and an experimental procedure was set up to correct for relaxation and radiofrequency field inhomogeneity effects on signal intensity. Two approaches were followed to assign the permittivity values to fat-rich tissues: (i) fat-rich tissue localization by a segmentation procedure followed by assignment of tabulated permittivity values; (ii) water content evaluation by chemical shift imaging followed by permittivity calculation. Tests were performed on phantoms of known water content to establish the reliability of the proposed method. MRI data were acquired and processed pixel-by-pixel according to the outlined procedure. The signal intensity in the phantom images correlated well with water content. Experiments were performed on volunteers' healthy tissue. In particular two anatomical structures were chosen to calculate permittivity maps: the head and the thigh. The water content and electric permittivity values were obtained from the MRI data and compared to others in the literature. A good agreement was found for muscle, cerebrospinal fluid (CSF) and white and grey matter. The advantages of the reported method are discussed in the light of possible application in hyperthermia treatment planning.

  5. Distribution and probable physiological role of esterases in reproductive, digestive, and fat-body tissues of the adult cotton boll weevil, Anthonomus grandis Boh.

    Science.gov (United States)

    Jones, B R; Bancroft, H R

    1986-06-01

    Polyacrylamide gel electrophoresis was used to examine gut, Malpighian tube, fat-body, testes, and ovariole tissues of the adult cotton boll weevil, Anthonomus grandis Boh. Esterases for which the inheritance has been reported previously by Terranova using whole-body homogenates were detected in dissected tissues and the probable physiological function of each allozyme is suggested. EST-1 occurs most frequently in ovarioles and female fat bodies. EST-2 is most often found in fat bodies and may be important in lipid turnover. No sex difference was observed. EST-3S is found in fat bodies and reproductive tissue, while EST-3F is always located in gut tissues, indicating that EST-3 is not controlled by a single autosomal locus with two codominant alleles as previously reported. EST-4, the most abundant esterase, can be detected in gut tissue at any age and is probably involved in digestion. EST-5 contains four allozymes which appear most frequently in testes and may be important during reproduction.

  6. Mapping of Mechanical Strains and Stresses around Quiescent Engineered Three-Dimensional Epithelial Tissues

    Science.gov (United States)

    Gjorevski, Nikolce; Nelson, Celeste M.

    2012-01-01

    Understanding how physical signals guide biological processes requires qualitative and quantitative knowledge of the mechanical forces generated and sensed by cells in a physiologically realistic three-dimensional (3D) context. Here, we used computational modeling and engineered epithelial tissues of precise geometry to define the experimental parameters that are required to measure directly the mechanical stress profile of 3D tissues embedded within native type I collagen. We found that to calculate the stresses accurately in these settings, we had to account for mechanical heterogeneities within the matrix, which we visualized and quantified using confocal reflectance and atomic force microscopy. Using this technique, we were able to obtain traction forces at the epithelium-matrix interface, and to resolve and quantify patterns of mechanical stress throughout the surrounding matrix. We discovered that whereas single cells generate tension by contracting and pulling on the matrix, the contraction of multicellular tissues can also push against the matrix, causing emergent compression. Furthermore, tissue geometry defines the spatial distribution of mechanical stress across the epithelium, which communicates mechanically over distances spanning hundreds of micrometers. Spatially resolved mechanical maps can provide insight into the types and magnitudes of physical parameters that are sensed and interpreted by multicellular tissues during normal and pathological processes. PMID:22828342

  7. Parametric techniques for characterizing myocardial tissue by magnetic resonance imaging (part 1): T1 mapping.

    Science.gov (United States)

    Perea Palazón, R J; Ortiz Pérez, J T; Prat González, S; de Caralt Robira, T M; Cibeira López, M T; Solé Arqués, M

    2016-01-01

    The development of myocardial fibrosis is a common process in the appearance of ventricular dysfunction in many heart diseases. Magnetic resonance imaging makes it possible to accurately evaluate the structure and function of the heart, and its role in the macroscopic characterization of myocardial fibrosis by late enhancement techniques has been widely validated clinically. Recent studies have demonstrated that T1-mapping techniques can quantify diffuse myocardial fibrosis and the expansion of the myocardial extracellular space in absolute terms. However, further studies are necessary to validate the usefulness of this technique in the early detection of tissue remodeling at a time when implementing early treatment would improve a patient's prognosis. This article reviews the state of the art for T1 mapping of the myocardium, its clinical applications, and its limitations. Copyright © 2016 SERAM. Published by Elsevier España, S.L.U. All rights reserved.
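
    The review above mentions quantifying the expansion of the myocardial extracellular space in absolute terms. The short sketch below shows the widely used extracellular volume (ECV) calculation from native and post-contrast T1 values and the hematocrit; the numerical values are illustrative, and the review's own presentation may differ.

```python
# Standard extracellular volume (ECV) estimate from native and post-contrast T1 values.
# A minimal sketch of the widely used formula; the T1 and hematocrit values are illustrative.
def ecv_fraction(t1_myo_pre, t1_myo_post, t1_blood_pre, t1_blood_post, hematocrit):
    """ECV = (1 - Hct) * (dR1_myocardium / dR1_blood), with R1 = 1/T1 (T1 in ms)."""
    d_r1_myo = 1.0 / t1_myo_post - 1.0 / t1_myo_pre
    d_r1_blood = 1.0 / t1_blood_post - 1.0 / t1_blood_pre
    return (1.0 - hematocrit) * d_r1_myo / d_r1_blood

if __name__ == "__main__":
    # Illustrative 1.5 T-like values (ms): native/post-contrast myocardial and blood T1.
    print(round(ecv_fraction(950.0, 450.0, 1550.0, 350.0, 0.42), 3))
```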

  8. Parametric methods for characterizing myocardial tissue by magnetic resonance imaging (part 2): T2 mapping.

    Science.gov (United States)

    Perea Palazón, R J; Solé Arqués, M; Prat González, S; de Caralt Robira, T M; Cibeira López, M T; Ortiz Pérez, J T

    2015-01-01

    Cardiac magnetic resonance imaging is considered the reference technique for characterizing myocardial tissue; for example, T2-weighted sequences make it possible to evaluate areas of edema or myocardial inflammation. However, traditional sequences have many limitations and provide only qualitative information. Moreover, traditional sequences depend on the reference to remote myocardium or skeletal muscle, which limits their ability to detect and quantify diffuse myocardial damage. Recently developed magnetic resonance myocardial mapping techniques enable quantitative assessment of parameters indicative of edema. These techniques have proven better than traditional sequences both in acute cardiomyopathy and in acute ischemic heart disease. This article synthesizes current developments in T2 mapping as well as their clinical applications and limitations. Copyright © 2014 SERAM. Published by Elsevier España, S.L.U. All rights reserved.

  9. Digital integration of geological and aeroradiometric data for probability mapping of uranium occurrences in parts of south-eastern Rajasthan, India

    International Nuclear Information System (INIS)

    Chawla, A.S.; Katti, V.J.; Kak, S.N.; Das, S.K.

    1993-01-01

    Integration and evaluation of geological, radiogeochemical, and magnetic information from the Umra-Udaisagar and Sarara inlier area, Udaipur district, Rajasthan was attempted. Seventeen lithostructural variables interpreted from colour infrared (CIR) photogeological analogue maps, together with radiogeochemical and magnetic variables thematically evaluated from airborne gamma-ray spectrometric (AGRS) and aeromagnetic (AM) digital data, were co-registered on a sequential grid matrix of 500 m x 500 m. The variables were quantified using theme-specific equations and digitized in a simple Boolean representation for each cell, depending on the presence or absence of a variable, its positive or negative interest, and/or whether it exceeded a theme-specific threshold value. The database so generated was subjected to weighted-modelling software in which weights for each variable are computed by a conditional probability method and favourability index maps are generated by discriminant objective analysis. Areas with high probability of uranium mineralisation were delineated by computing the composite coincidence of cells with high favourability index values, considering each variable in turn as a control variable. Critical analysis of the weights computed for each variable in different sets indicates how strongly that variable controls uranium mineralisation. This attempt has resulted in delineating several new high-probability zones of uranium enrichment and indicated a regional structural control in the area. (author). 11 refs., 4 figs
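
    The record computes per-variable weights from conditional probabilities over Boolean cell variables and combines them into a favourability index. A common formulation of this idea is the weights-of-evidence method; the sketch below follows that generic formulation on synthetic data, which may differ in detail from the weighted-modelling software described in the record.

```python
# Hedged sketch of a weights-of-evidence style favourability index over grid cells.
# Each predictor is Boolean per cell; "deposit" marks known occurrences. This follows the
# generic Bonham-Carter formulation, not necessarily the record's exact equations.
import numpy as np

def woe_weights(predictor, deposit, eps=0.5):
    """Positive/negative weights W+ and W- for one Boolean predictor layer."""
    b, d = predictor.astype(bool), deposit.astype(bool)
    # Conditional probabilities with a small continuity correction (eps) to avoid log(0).
    p_b_d   = (np.sum(b & d)   + eps) / (np.sum(d)  + 2 * eps)
    p_b_nd  = (np.sum(b & ~d)  + eps) / (np.sum(~d) + 2 * eps)
    p_nb_d  = (np.sum(~b & d)  + eps) / (np.sum(d)  + 2 * eps)
    p_nb_nd = (np.sum(~b & ~d) + eps) / (np.sum(~d) + 2 * eps)
    return np.log(p_b_d / p_b_nd), np.log(p_nb_d / p_nb_nd)

def favourability(predictors, deposit):
    """Posterior log-odds per cell = prior log-odds + sum of W+ (present) or W- (absent)."""
    prior_odds = deposit.mean() / (1.0 - deposit.mean())
    logit = np.full(deposit.shape, np.log(prior_odds))
    for layer in predictors:
        w_plus, w_minus = woe_weights(layer, deposit)
        logit += np.where(layer.astype(bool), w_plus, w_minus)
    return 1.0 / (1.0 + np.exp(-logit))   # posterior probability map

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    deposit = rng.random((40, 40)) < 0.05                 # sparse known occurrences
    layers = [rng.random((40, 40)) < 0.3 for _ in range(3)]
    layers[0] |= deposit                                   # make one layer informative
    fav = favourability(layers, deposit)
    print("posterior probability range:", round(fav.min(), 3), "-", round(fav.max(), 3))
```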

  10. Mapping the Relative Probability of Common Toad Occurrence in Terrestrial Lowland Farm Habitat in the United Kingdom.

    Directory of Open Access Journals (Sweden)

    Rosie D Salazar

    Full Text Available The common toad (Bufo bufo) is of increasing conservation concern in the United Kingdom (UK) due to dramatic population declines occurring in the past century. Many of these declines coincided with reductions in both terrestrial and aquatic habitat availability and quality and have been attributed primarily to agricultural land conversion (of natural and semi-natural habitats to arable and pasture fields) and pond drainage. However, there is little evidence available to link habitat availability with common toad population declines, especially when examined at a broad landscape scale. Assessing such patterns of population decline at the landscape scale requires, for instance, an understanding of how this species uses terrestrial habitat. We intensively studied the terrestrial resource selection of a large population of common toads in Oxfordshire, England, UK. Adult common toads were fitted with passive integrated transponder (PIT) tags to allow detection in the terrestrial environment using a portable PIT antenna once toads left the pond and before they entered hibernation (April/May-October 2012 and 2013). We developed a population-level resource selection function (RSF) to assess the relative probability of toad occurrence in the terrestrial environment by collecting location data for 90 recaptured toads. The predicted relative probability of toad occurrence for this population was greatest in wooded habitat near to water bodies; the relative probability of occurrence declined dramatically > 50 m from these habitats. Toads also tended to select habitat near to their breeding pond, and toad occurrence was negatively related to urban environments.
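
    The resource selection function (RSF) in the record is population-level and fitted to toad relocation data. One standard way to fit an RSF is a used-versus-available logistic regression whose fitted linear predictor is exponentiated to give a relative probability of occurrence; the covariates and simulated data below are stand-ins suggested by the abstract, not the study's actual design.

```python
# Minimal used-vs-available RSF sketch: logistic regression on habitat covariates,
# then w(x) = exp(beta . x) as a *relative* probability of occurrence.
# Covariate names and the simulated data are illustrative, not the study's data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n_used, n_avail = 300, 3000

def sample(n, used):
    """Covariates: distance to water (m), woodland indicator, urban indicator."""
    dist_water = rng.exponential(30 if used else 120, n)
    woodland = rng.random(n) < (0.6 if used else 0.25)
    urban = rng.random(n) < (0.02 if used else 0.15)
    return np.column_stack([dist_water, woodland, urban])

X = np.vstack([sample(n_used, True), sample(n_avail, False)])
y = np.concatenate([np.ones(n_used), np.zeros(n_avail)])

model = LogisticRegression(max_iter=1000).fit(X, y)
beta = model.coef_.ravel()

def rsf(x):
    """Relative selection strength (unnormalized); larger = more strongly selected."""
    return np.exp(x @ beta)

print("betas (dist_water, woodland, urban):", beta.round(3))
print("wooded cell 10 m from water vs open cell 200 m away:",
      round(rsf(np.array([10.0, 1.0, 0.0])) / rsf(np.array([200.0, 0.0, 0.0])), 2))
```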

  11. The effect of 6 and 15 MV on intensity-modulated radiation therapy prostate cancer treatment: plan evaluation, tumour control probability and normal tissue complication probability analysis, and the theoretical risk of secondary induced malignancies

    Science.gov (United States)

    Hussein, M; Aldridge, S; Guerrero Urbano, T; Nisbet, A

    2012-01-01

    Objective The aim of this study was to investigate the effect of 6- and 15-MV photon energies on intensity-modulated radiation therapy (IMRT) prostate cancer treatment plan outcome and to compare the theoretical risks of secondary induced malignancies. Methods Separate prostate cancer IMRT plans were prepared for 6- and 15-MV beams. Organ-equivalent doses were obtained through thermoluminescent dosemeter measurements in an anthropomorphic Alderson radiation therapy phantom. The neutron dose contribution at 15 MV was measured using polyallyl-diglycol-carbonate neutron track etch detectors. Risk coefficients from the International Commission on Radiological Protection Report 103 were used to compare the risk of fatal secondary induced malignancies in out-of-field organs and tissues for 6 and 15 MV. For the bladder and the rectum, a comparative evaluation of the risk using three separate models was carried out. Dose–volume parameters for the rectum, bladder and prostate planning target volume were evaluated, as well as normal tissue complication probability (NTCP) and tumour control probability calculations. Results There is a small increased theoretical risk of developing a fatal cancer from 6 MV compared with 15 MV, taking into account all the organs. Dose–volume parameters for the rectum and bladder show that 15 MV results in better volume sparing in the regions below 70 Gy, but the volume exposed increases slightly beyond this in comparison with 6 MV, resulting in a higher NTCP for the rectum of 3.6% vs 3.0% (p=0.166). Conclusion The choice to treat using IMRT at 15 MV should not be excluded, but should be based on risk vs benefit while considering the age and life expectancy of the patient together with the relative risk of radiation-induced cancer and NTCPs. PMID:22010028

  12. In-situ Characterization and Mapping of Iron Compounds in Alzheimer's Tissue

    International Nuclear Information System (INIS)

    Collingwood, J.F.; Mikhaylova, A.; Davidson, M.; Batich, C.; Streit, W.J.; Terry, J.; Dobson, J.

    2005-01-01

    There is a well-established link between iron overload in the brain and pathology associated with neurodegeneration in a variety of disorders such as Alzheimer's (AD), Parkinson's (PD) and Huntington's (HD) diseases. This association was first discovered in AD by Goodman in 1953, where, in addition to abnormally high concentrations of iron in autopsy brain tissue, iron has also been shown to accumulate at sites of brain pathology such as senile plaques. However, since this discovery, progress in understanding the origin, role and nature of iron compounds associated with neurodegeneration has been slow. Here we report, for the first time, the location and characterization of iron compounds in human AD brain tissue sections. Iron fluorescence was mapped over a frontal-lobe tissue section from an Alzheimer's patient, and anomalous iron concentrations were identified using synchrotron X-ray absorption techniques at 5 µm spatial resolution. Concentrations of ferritin and magnetite, a magnetic iron oxide potentially indicating disrupted brain-iron metabolism, were evident. These results demonstrate a practical means of correlating iron compounds and disease pathology in-situ and have clear implications for disease pathogenesis and potential therapies.

  13. Prediction of radiation-induced liver disease by Lyman normal-tissue complication probability model in three-dimensional conformal radiation therapy for primary liver carcinoma

    International Nuclear Information System (INIS)

    Xu ZhiYong; Liang Shixiong; Zhu Ji; Zhu Xiaodong; Zhao Jiandong; Lu Haijie; Yang Yunli; Chen Long; Wang Anyu; Fu Xiaolong; Jiang Guoliang

    2006-01-01

    Purpose: To describe the probability of RILD by application of the Lyman-Kutcher-Burman normal tissue complication probability (NTCP) model for primary liver carcinoma (PLC) treated with hypofractionated three-dimensional conformal radiotherapy (3D-CRT). Methods and Materials: A total of 109 PLC patients treated by 3D-CRT were followed for RILD. Of these patients, 93 had liver cirrhosis of Child-Pugh Grade A, and 16 were of Child-Pugh Grade B. The Michigan NTCP model was used to predict the probability of RILD, and then the modified Lyman NTCP model was generated for Child-Pugh A and Child-Pugh B patients by maximum-likelihood analysis. Results: Of all patients, 17 developed RILD, of whom 8 were of Child-Pugh Grade A and 9 of Child-Pugh Grade B. The Michigan model underestimated the probability of RILD in PLC patients. The modified parameters n, m and TD50(1) were 1.1, 0.28 and 40.5 Gy for Child-Pugh A patients and 0.7, 0.43 and 23 Gy for Child-Pugh B patients, respectively, which yielded better estimates of RILD probability. The hepatic tolerable doses (TD5) correspond to a mean dose to normal liver (MDTNL) of 21 Gy and 6 Gy for Child-Pugh A and B patients, respectively. Conclusions: The Michigan model is probably not suited to predicting RILD in PLC patients; a modified Lyman NTCP model for RILD is recommended.
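
    For reference, a minimal implementation of the Lyman-Kutcher-Burman NTCP calculation (gEUD volume reduction followed by a probit dose response) is sketched below, using the Child-Pugh A parameters reported in the record; the liver DVH is invented for illustration.

```python
# Minimal LKB/Lyman NTCP sketch: gEUD volume reduction followed by a probit response.
# Parameters n, m, TD50 are the Child-Pugh A values from the record; the DVH is made up.
import math

def geud(dose_bins_gy, frac_volumes, n):
    """Generalized EUD with volume parameter n (exponent a = 1/n)."""
    a = 1.0 / n
    return sum(v * d ** a for d, v in zip(dose_bins_gy, frac_volumes)) ** (1.0 / a)

def lkb_ntcp(dose_bins_gy, frac_volumes, n, m, td50):
    eud = geud(dose_bins_gy, frac_volumes, n)
    t = (eud - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

if __name__ == "__main__":
    # Illustrative differential DVH of normal liver: (dose bin centre in Gy, fractional volume).
    dvh = [(5, 0.30), (15, 0.25), (25, 0.20), (35, 0.15), (45, 0.10)]
    doses, vols = zip(*dvh)
    print("NTCP (Child-Pugh A: n=1.1, m=0.28, TD50=40.5 Gy):",
          round(lkb_ntcp(doses, vols, n=1.1, m=0.28, td50=40.5), 3))
```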

  14. Surface density mapping of natural tissue by a scanning haptic microscope (SHM).

    Science.gov (United States)

    Moriwaki, Takeshi; Oie, Tomonori; Takamizawa, Keiichi; Murayama, Yoshinobu; Fukuda, Toru; Omata, Sadao; Nakayama, Yasuhide

    2013-02-01

    To expand the performance capacity of the scanning haptic microscope (SHM) beyond surface mapping of elastic modulus or topography, surface density mapping of a natural tissue was performed by applying the measurement theory of the SHM, in which a frequency change occurs upon contact of the sample surface with the SHM sensor - a microtactile sensor (MTS) that vibrates at a pre-determined constant oscillation frequency. This change is mainly stiffness-dependent at a low oscillation frequency and density-dependent at a high oscillation frequency. Two exemplar gels with extremely different densities but similar macroscopic elastic moduli in the range of natural soft tissues were selected: agar hydrogels with extremely low density (less than 25 mg/cm³) and silicone organogels with high density (ca. 1300 mg/cm³). Measurements were performed in saline solution near the second-order resonance frequency, which yields the elastic modulus, and near the third-order resonance frequency. There was little difference in the frequency changes between the two resonance frequencies in agar gels. In contrast, in silicone gels, a large frequency change upon MTS contact was observed near the third-order resonance frequency, indicating that the frequency change near the third-order resonance frequency reflects changes in both density and elastic modulus. A density image of the canine aortic wall was therefore obtained by subtracting the image observed near the second-order resonance frequency from that near the third-order resonance frequency. The elastin-rich region had a higher density than the collagen-rich region.

  15. Quantification of the volumetric benefit of image-guided radiotherapy (I.G.R.T.) in prostate cancer: Margins and presence probability map

    International Nuclear Information System (INIS)

    Cazoulat, G.; Crevoisier, R. de; Simon, A.; Louvel, G.; Manens, J.P.; Haigron, P.; Crevoisier, R. de; Louvel, G.; Manens, J.P.; Lafond, C.

    2009-01-01

    Purpose: To quantify the anatomic variations of the prostate and seminal vesicles (S.V.) in order to choose appropriate margins accounting for intrapelvic anatomic variations, and to quantify the volumetric benefit of image-guided radiotherapy (I.G.R.T.). Patients and methods: Twenty patients, receiving a total dose of 70 Gy to the prostate, had a planning CT scan and eight weekly CT scans during treatment. Prostate and S.V. were manually contoured. Each weekly CT scan was registered to the planning CT scan according to three modalities: radiopaque skin marks, pelvic bone or prostate. For each patient, prostate and S.V. displacements were quantified, and 3-dimensional maps of prostate and S.V. presence probability were established. Volumes including minimal presence probabilities were compared between the three registration modalities. Results: For the intrapelvic prostate displacements, the systematic variations, random variations and maximal displacements for the entire population were 5 mm, 2.7 mm and 16.5 mm in the anteroposterior axis; 2.7 mm, 2.4 mm and 11.4 mm in the supero-inferior axis; and 0.5 mm, 0.8 mm and 3.3 mm laterally. Margins according to the van Herk recipe (to cover the prostate for 90% of the patients with the 95% isodose) were 8 mm, 8.3 mm and 1.9 mm, respectively. The 100% prostate presence probability volumes correspond to 37%, 50% and 61% of the prostate volume according to the registration modality. For the S.V., these volumes correspond to 8%, 14% and 18% of the S.V. volume. Conclusions: Without I.G.R.T., 5 mm prostate posterior margins are insufficient and should be at least 8 mm to account for intrapelvic anatomic variations. Prostate registration almost doubles the 100% presence probability volume compared with skin registration. Deformation of the S.V. will require either a dramatic increase in margins (simple) or replanning (not realistic). (authors)
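
    The margins in the record are attributed to the van Herk recipe, whose standard published form is M = 2.5Σ + 0.7σ for systematic (Σ) and random (σ) variations. The sketch below applies that formula to the per-axis values quoted above; it reproduces the supero-inferior and lateral margins closely, but the quoted anteroposterior margin of 8 mm does not follow directly from these two numbers, so the exact definitions of the reported values may differ from this illustration.

```python
# Standard van Herk PTV margin recipe: M = 2.5 * Sigma + 0.7 * sigma
# (covers the CTV with the 95% isodose for ~90% of patients).
# The per-axis SDs below are the values quoted in the record; interpretation is illustrative.
def van_herk_margin(sigma_systematic_mm, sigma_random_mm):
    return 2.5 * sigma_systematic_mm + 0.7 * sigma_random_mm

axes = {
    "antero-posterior": (5.0, 2.7),   # does not reproduce the 8 mm quoted in the record
    "supero-inferior":  (2.7, 2.4),   # -> ~8.4 mm, close to the 8.3 mm reported
    "lateral":          (0.5, 0.8),   # -> ~1.8 mm, close to the 1.9 mm reported
}
for name, (big_sigma, small_sigma) in axes.items():
    print(f"{name}: {van_herk_margin(big_sigma, small_sigma):.1f} mm")
```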

  16. Dual-energy digital mammography: Calibration and inverse-mapping techniques to estimate calcification thickness and glandular-tissue ratio

    International Nuclear Information System (INIS)

    Kappadath, S. Cheenu; Shaw, Chris C.

    2003-01-01

    Breast cancer may manifest as microcalcifications in x-ray mammography. Small microcalcifications, essential to the early detection of breast cancer, are often obscured by overlapping tissue structures. Dual-energy imaging, where separate low- and high-energy images are acquired and synthesized to cancel the tissue structures, may improve the ability to detect and visualize microcalcifications. Transmission measurements at two different kVp values were made on breast-tissue-equivalent materials under narrow-beam geometry using an indirect flat-panel mammographic imager. The imaging scenario consisted of variable aluminum thickness (to simulate calcifications) and variable glandular ratio (defined as the ratio of the glandular-tissue thickness to the total tissue thickness) for a fixed total tissue thickness--the clinical situation of microcalcification imaging with varying tissue composition under breast compression. The coefficients of the inverse-mapping functions used to determine material composition from dual-energy measurements were calculated by a least-squares analysis. The linear function poorly modeled both the aluminum thickness and the glandular ratio. The inverse-mapping functions were found to vary as analytic functions of second (conic) or third (cubic) order. By comparing the model predictions with the calibration values, the root-mean-square residuals for both the cubic and the conic functions were ∼50 μm for the aluminum thickness and ∼0.05 for the glandular ratio
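
    The calibration described above fits second-order (conic) and third-order (cubic) inverse-mapping functions by least squares. The sketch below shows a generic least-squares fit of a conic surface mapping low- and high-energy log signals to calcification-equivalent aluminum thickness; the data are synthetic and the functional form is only assumed to resemble the paper's.

```python
# Generic sketch of dual-energy inverse-mapping calibration by linear least squares:
# fit a second-order ("conic") polynomial in the low/high-energy log-signals that
# predicts aluminum (calcification-equivalent) thickness. Data are synthetic.
import numpy as np

def conic_design_matrix(s_low, s_high):
    """Columns: 1, L, H, L^2, L*H, H^2 for log-signals L and H."""
    return np.column_stack([np.ones_like(s_low), s_low, s_high,
                            s_low**2, s_low*s_high, s_high**2])

rng = np.random.default_rng(2)
# Synthetic calibration grid: aluminum thickness (mm) x glandular ratio.
t_al = np.repeat(np.linspace(0.0, 1.0, 6), 5)
g_ratio = np.tile(np.linspace(0.0, 1.0, 5), 6)
# Fake log-transmission signals with different energy dependence plus measurement noise.
s_low  = -(2.0 * t_al + 1.0 * g_ratio) + 0.01 * rng.standard_normal(t_al.size)
s_high = -(0.8 * t_al + 0.6 * g_ratio) + 0.01 * rng.standard_normal(t_al.size)

A = conic_design_matrix(s_low, s_high)
coeffs, *_ = np.linalg.lstsq(A, t_al, rcond=None)        # calibration step
t_al_hat = A @ coeffs                                     # apply inverse mapping
rms_residual = np.sqrt(np.mean((t_al_hat - t_al) ** 2))
print("conic-fit RMS residual for Al thickness:", round(rms_residual, 4), "mm")
```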

  17. Quantitative susceptibility mapping (QSM): Decoding MRI data for a tissue magnetic biomarker

    Science.gov (United States)

    Wang, Yi; Liu, Tian

    2015-01-01

    In MRI, the main magnetic field polarizes the electron cloud of a molecule, generating a chemical shift for observer protons within the molecule and a magnetic susceptibility inhomogeneity field for observer protons outside the molecule. The number of water protons surrounding a molecule for detecting its magnetic susceptibility is vastly greater than the number of protons within the molecule for detecting its chemical shift. However, the study of tissue magnetic susceptibility has been hindered by poor molecular specificities of hitherto used methods based on MRI signal phase and T2* contrast, which depend convolutedly on surrounding susceptibility sources. Deconvolution of the MRI signal phase can determine tissue susceptibility but is challenged by the lack of MRI signal in the background and by the zeroes in the dipole kernel. Recently, physically meaningful regularizations, including the Bayesian approach, have been developed to enable accurate quantitative susceptibility mapping (QSM) for studying iron distribution, metabolic oxygen consumption, blood degradation, calcification, demyelination, and other pathophysiological susceptibility changes, as well as contrast agent biodistribution in MRI. This paper attempts to summarize the basic physical concepts and essential algorithmic steps in QSM, to describe clinical and technical issues under active development, and to provide references, codes, and testing data for readers interested in QSM. Magn Reson Med 73:82–101, 2015. © 2014 The Authors. Magnetic Resonance in Medicine published by Wiley Periodicals, Inc. on behalf of the International Society for Magnetic Resonance in Medicine. This is an open access article under the terms of the Creative Commons Attribution License, which permits use, distribution, and reproduction in any medium, provided the original work is properly cited. PMID:25044035
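
    The ill-posedness mentioned above comes from the zeroes of the dipole kernel. In k-space the unit dipole kernel has the well-known form D(k) = 1/3 - kz^2/|k|^2, which vanishes on a cone at the magic angle; the sketch below constructs it and reports how much of k-space is near zero, which is why regularized or Bayesian inversion is needed.

```python
# The k-space unit dipole kernel used in QSM: D(k) = 1/3 - kz^2 / |k|^2.
# Its zeroes (on the ~54.7 degree cone) make naive inversion ill-posed, motivating the
# regularized/Bayesian reconstructions mentioned in the record.
import numpy as np

def dipole_kernel(shape, voxel_size=(1.0, 1.0, 1.0)):
    kx, ky, kz = [np.fft.fftfreq(n, d=d) for n, d in zip(shape, voxel_size)]
    KX, KY, KZ = np.meshgrid(kx, ky, kz, indexing="ij")
    k2 = KX**2 + KY**2 + KZ**2
    with np.errstate(divide="ignore", invalid="ignore"):
        d = 1.0 / 3.0 - KZ**2 / k2
    d[k2 == 0] = 0.0          # DC term is undefined; conventionally set to 0
    return d

D = dipole_kernel((64, 64, 64))
near_zero = np.mean(np.abs(D) < 1e-2)
print(f"fraction of k-space with |D| < 0.01: {near_zero:.3f}")
# Forward model sketch: field = IFFT( D * FFT(susceptibility) ); inversion needs regularization.
```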

  18. Increase in tumor control and normal tissue complication probabilities in advanced head-and-neck cancer for dose-escalated intensity-modulated photon and proton therapy

    Directory of Open Access Journals (Sweden)

    Annika Jakobi

    2015-11-01

    Full Text Available Introduction: Presently used radio-chemotherapy regimens result in moderate local control rates for patients with advanced head and neck squamous cell carcinoma (HNSCC). Dose escalation (DE) may be an option to improve patient outcome, but may also increase the risk of toxicities in healthy tissue. The presented treatment planning study evaluated the feasibility of two DE levels for advanced HNSCC patients, planned with either intensity-modulated photon therapy (IMXT) or intensity-modulated proton therapy (IMPT). Materials and Methods: For 45 HNSCC patients, IMXT and IMPT treatment plans were created including DE via a simultaneous integrated boost (SIB) in the high-risk volume, while maintaining standard fractionation with 2 Gy per fraction in the remaining target volume. Two DE levels for the SIB were compared: 2.3 Gy and 2.6 Gy. Treatment plan evaluation included assessment of tumor control probabilities (TCP) and normal tissue complication probabilities (NTCP). Results: An increase of approximately 10% in TCP was estimated between the DE levels. A pronounced high-dose rim surrounding the SIB volume was identified in the IMXT treatment. Compared to IMPT, this extra dose slightly increased the TCP values and, to a larger extent, the NTCP values. For both modalities, the higher DE level led only to a small increase in NTCP values (mean differences < 2% in all models), except for the risk of aspiration, which increased on average by 8% and 6% with IMXT and IMPT, respectively, but showed a considerable patient dependence. Conclusions: Both DE levels appear applicable to patients with IMXT and IMPT since all calculated NTCP values, except for one, increased only little for the higher DE level. The estimated TCP increase is of relevant magnitude. The higher DE schedule needs to be investigated carefully in the setting of a prospective clinical trial, especially regarding toxicities caused by high local doses that lack a sound dose-response description, e.g., ulcers.

  19. The role of micro-NRA and micro-PIXE in carbon mapping of organic tissues

    International Nuclear Information System (INIS)

    Niekraszewicz, L.A.B.; Souza, C.T. de; Stori, E.M.; Jobim, P.F.C.; Amaral, L.; Dias, J.F.

    2015-01-01

    This study reports the work developed in the Ion Implantation Laboratory (Porto Alegre, RS, Brazil) in order to implement the micro-NRA technique for the study of light elements in organic tissues. In particular, the work focused on nuclear reactions of protons and alphas with carbon. The (p,p) resonances at 0.475 and 1.734 MeV were investigated, and the (α,α) resonance at 4.265 MeV was studied as well. The results indicate that the yields for the 0.475 and 1.734 MeV resonances are similar. Elemental maps of different structures obtained with the micro-NRA technique using the 1.734 MeV resonance were compared with those obtained with micro-PIXE employing an SDD detector equipped with an ultra-thin window. The results show that the use of micro-NRA for carbon at the 1.734 MeV resonance provides good results in some cases, at the expense of longer beam times. On the other hand, micro-PIXE provides enhanced yields but is limited to surface analysis, since soft X-rays are greatly attenuated by matter.

  20. Deep-tissue temperature mapping by multi-illumination photoacoustic tomography aided by a diffusion optical model: a numerical study

    Science.gov (United States)

    Zhou, Yuan; Tang, Eric; Luo, Jianwen; Yao, Junjie

    2018-01-01

    Temperature mapping during thermotherapy can help precisely control the heating process, both temporally and spatially, to efficiently kill the tumor cells and prevent the healthy tissues from heating damage. Photoacoustic tomography (PAT) has been used for noninvasive temperature mapping with high sensitivity, based on the linear correlation between the tissue's Grüneisen parameter and temperature. However, limited by the tissue's unknown optical properties and thus the optical fluence at depths beyond the optical diffusion limit, the reported PAT thermometry usually takes a ratiometric measurement at different temperatures and thus cannot provide absolute measurements. Moreover, ratiometric measurement over time at different temperatures has to assume that the tissue's optical properties do not change with temperatures, which is usually not valid due to the temperature-induced hemodynamic changes. We propose an optical-diffusion-model-enhanced PAT temperature mapping that can obtain the absolute temperature distribution in deep tissue, without the need of multiple measurements at different temperatures. Based on the initial acoustic pressure reconstructed from multi-illumination photoacoustic signals, both the local optical fluence and the optical parameters including absorption and scattering coefficients are first estimated by the optical-diffusion model, then the temperature distribution is obtained from the reconstructed Grüneisen parameters. We have developed a mathematic model for the multi-illumination PAT of absolute temperatures, and our two-dimensional numerical simulations have shown the feasibility of this new method. The proposed absolute temperature mapping method may set the technical foundation for better temperature control in deep tissue in thermotherapy.
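
    The method above rests on the approximately linear temperature dependence of the Grüneisen parameter. As an illustration only, the sketch below assumes the empirical water-based-tissue relation Γ(T) ≈ 0.0043 + 0.0053·T (T in °C) that is often quoted in photoacoustic thermometry; this relation and all numbers used are assumptions, not the paper's calibration.

```python
# Hedged sketch: absolute temperature from a reconstructed Grueneisen parameter, assuming
# the empirical water-based-tissue relation Gamma(T) ~ 0.0043 + 0.0053*T (T in degC).
# This relation and all numbers below are illustrative assumptions, not the paper's values.
GAMMA_A, GAMMA_B = 0.0043, 0.0053

def grueneisen_from_pressure(p0_pa, mu_a_per_m, fluence_j_per_m2):
    """Gamma = p0 / (mu_a * F): needs fluence and absorption, e.g. from a diffusion model."""
    return p0_pa / (mu_a_per_m * fluence_j_per_m2)

def temperature_from_grueneisen(gamma):
    return (gamma - GAMMA_A) / GAMMA_B

if __name__ == "__main__":
    gamma = grueneisen_from_pressure(p0_pa=2.4e4, mu_a_per_m=50.0, fluence_j_per_m2=2.0e3)
    print(f"Gamma = {gamma:.3f} -> T ~ {temperature_from_grueneisen(gamma):.1f} degC")
```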

  1. MAP3K8 (TPL2/COT) affects obesity-induced adipose tissue inflammation without systemic effects in humans and in mice.

    Directory of Open Access Journals (Sweden)

    Dov B Ballak

    Full Text Available Chronic low-grade inflammation in adipose tissue often accompanies obesity, leading to insulin resistance and increasing the risk for metabolic diseases. MAP3K8 (TPL2/COT) is an important signal transductor and activator of pro-inflammatory pathways that has been linked to obesity-induced adipose tissue inflammation. We used human adipose tissue biopsies to study the relationship of MAP3K8 expression with markers of obesity and expression of the pro-inflammatory cytokines IL-1β, IL-6 and IL-8. Moreover, we evaluated obesity-induced adipose tissue inflammation and insulin resistance in mice lacking MAP3K8 and in WT mice on a high-fat diet (HFD) for 16 weeks. Individuals with a BMI >30 displayed higher mRNA expression of MAP3K8 in adipose tissue compared to individuals with a normal BMI. Additionally, high mRNA expression levels of IL-1β, IL-6 and IL-8, but not TNF-α, in human adipose tissue were associated with higher expression of MAP3K8. Moreover, high plasma SAA and CRP did not associate with increased MAP3K8 expression in adipose tissue. Similarly, no association was found for MAP3K8 expression with plasma insulin or glucose levels. Mice lacking MAP3K8 had similar body weight gain to WT mice, yet displayed lower mRNA expression levels of IL-1β, IL-6 and CXCL1 in adipose tissue in response to the HFD as compared to WT animals. However, MAP3K8-deficient mice were not protected against HFD-induced adipose tissue macrophage infiltration or the development of insulin resistance. Together, the data in both human and mouse show that MAP3K8 is involved in local adipose tissue inflammation, specifically for IL-1β and its responsive cytokines IL-6 and IL-8, but does not seem to have systemic effects on insulin resistance.

  2. The effect of the overall treatment time of fractionated irradiation on the tumor control probability of a human soft tissue sarcoma xenograft in nude mice

    International Nuclear Information System (INIS)

    Allam, Ayman; Perez, Luis A.; Huang, Peigen; Taghian, Alphonse; Azinovic, Ignacio; Freeman, Jill; Duffy, Michael; Efird, Jimmy; Suit, Herman D.

    1995-01-01

    Purpose: To study the impact of the overall treatment time of fractionated irradiation on the tumor control probability (TCP) of a human soft tissue sarcoma xenograft growing in nude mice, as well as to compare the pretreatment potential doubling time (Tpot) of this tumor to the effective doubling time (Teff) derived from three different schedules of irradiation using the same total number of fractions with different overall treatment times. Methods and Materials: The TCP was assessed using the TCD50 value (the 50% tumor control dose) as an end point. A total of 240 male nude mice, 7-8 weeks old, were used in three experimental groups that received the same total number of fractions (30 fractions) with different overall treatment times. In group 1, the animals received three equal fractions/day for 10 consecutive days; in group 2, two equal fractions/day for 15 consecutive days; and in group 3, one fraction/day for 30 consecutive days. All irradiations were given under normal blood flow conditions to air-breathing animals. The mean tumor diameter at the start of irradiation was 7-8 mm. The mean interfraction intervals ranged from 8 to 24 h. The Tpot was measured using iododeoxyuridine (IUdR) labeling and flow cytometry and was compared to Teff. Results: The TCD50 values of the three different treatment schedules were 58.8 Gy, 63.2 Gy, and 75.6 Gy for groups 1, 2, and 3, respectively. This difference in TCD50 values was significant (p < 0.05) between groups 1 and 2 (30 fractions/10 days and 30 fractions/15 days) vs. group 3 (30 fractions/30 days). The loss in TCP due to the prolongation of the overall treatment time from 10 days to 30 days was found to be 1.35-1.4 Gy/day. The pretreatment Tpot (2.4 days) was longer than the calculated Teff in groups 2 and 3 (1.35 days). Conclusion: Our data show a significant loss in TCP with prolongation of the overall treatment time. This is most probably due to an accelerated repopulation of tumor clonogens. The pretreatment Tpot of this tumor model does not reflect the actual doubling of the clonogens in a protracted regimen.

  3. Impact of Chemotherapy on Normal Tissue Complication Probability Models of Acute Hematologic Toxicity in Patients Receiving Pelvic Intensity Modulated Radiation Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Bazan, Jose G.; Luxton, Gary; Kozak, Margaret M.; Anderson, Eric M.; Hancock, Steven L.; Kapp, Daniel S.; Kidd, Elizabeth A.; Koong, Albert C.; Chang, Daniel T., E-mail: dtchang@stanford.edu

    2013-12-01

    Purpose: To determine how chemotherapy agents affect radiation dose parameters that correlate with acute hematologic toxicity (HT) in patients treated with pelvic intensity modulated radiation therapy (P-IMRT) and concurrent chemotherapy. Methods and Materials: We assessed HT in 141 patients who received P-IMRT for anal, gynecologic, rectal, or prostate cancers, 95 of whom received concurrent chemotherapy. Patients were separated into 4 groups: mitomycin (MMC) + 5-fluorouracil (5FU, 37 of 141), platinum ± 5FU (Cis, 32 of 141), 5FU (26 of 141), and P-IMRT alone (46 of 141). The pelvic bone was contoured as a surrogate for pelvic bone marrow (PBM) and divided into subsites: ilium, lower pelvis, and lumbosacral spine (LSS). The volumes of each region receiving 5-40 Gy were calculated. The endpoint for HT was grade ≥3 (HT3+) leukopenia, neutropenia or thrombocytopenia. Normal tissue complication probability was calculated using the Lyman-Kutcher-Burman model. Logistic regression was used to analyze association between HT3+ and dosimetric parameters. Results: Twenty-six patients experienced HT3+: 10 of 37 (27%) MMC, 14 of 32 (44%) Cis, 2 of 26 (8%) 5FU, and 0 of 46 P-IMRT. PBM dosimetric parameters were correlated with HT3+ in the MMC group but not in the Cis group. LSS dosimetric parameters were well correlated with HT3+ in both the MMC and Cis groups. Constrained optimization (0tissue complication probability curve compared with treatment with Cis. Dose tolerance of PBM and the LSS subsite may be lower for

  4. Normal tissue complication probability model parameter estimation for xerostomia in head and neck cancer patients based on scintigraphy and quality of life assessments

    International Nuclear Information System (INIS)

    Lee, Tsair-Fwu; Chao, Pei-Ju; Wang, Hung-Yu; Hsu, Hsuan-Chih; Chang, PaoShu; Chen, Wen-Cheng

    2012-01-01

    With advances in modern radiotherapy (RT), many patients with head and neck (HN) cancer can be effectively cured. However, xerostomia is a common complication in patients after RT for HN cancer. The purpose of this study was to use the Lyman–Kutcher–Burman (LKB) model to derive parameters for the normal tissue complication probability (NTCP) for xerostomia based on scintigraphy assessments and quality of life (QoL) questionnaires. We performed validation tests of the Quantitative Analysis of Normal Tissue Effects in the Clinic (QUANTEC) guidelines against prospectively collected QoL and salivary scintigraphic data. Thirty-one patients with HN cancer were enrolled. Salivary excretion factors (SEFs) measured by scintigraphy and QoL data from self-reported questionnaires were used for NTCP modeling to describe the incidence of grade 3+ xerostomia. The NTCP parameters estimated from the QoL and SEF datasets were compared. Model performance was assessed using Pearson's chi-squared test, Nagelkerke's R², the area under the receiver operating characteristic curve, and the Hosmer–Lemeshow test. The negative predictive value (NPV) was checked for the rate of correctly predicting the lack of incidence. Pearson's chi-squared test was used to test the goodness of fit and association. Using the LKB NTCP model and assuming n=1, the dose for uniform irradiation of the whole or partial volume of the parotid gland that results in 50% probability of a complication (TD50) and the slope of the dose–response curve (m) were determined from the QoL and SEF datasets, respectively. The NTCP-fitted parameters for local disease were TD50=43.6 Gy and m=0.18 with the SEF data, and TD50=44.1 Gy and m=0.11 with the QoL data. The rate of grade 3+ xerostomia for treatment plans meeting the QUANTEC guidelines was specifically predicted, with an NPV of 100%, using either the QoL or SEF dataset. Our study shows the agreement between the NTCP parameter modeling based on SEF and QoL assessments.

  5. Normal tissue complication probability model parameter estimation for xerostomia in head and neck cancer patients based on scintigraphy and quality of life assessments

    Science.gov (United States)

    2012-01-01

    Background With advances in modern radiotherapy (RT), many patients with head and neck (HN) cancer can be effectively cured. However, xerostomia is a common complication in patients after RT for HN cancer. The purpose of this study was to use the Lyman–Kutcher–Burman (LKB) model to derive parameters for the normal tissue complication probability (NTCP) for xerostomia based on scintigraphy assessments and quality of life (QoL) questionnaires. We performed validation tests of the Quantitative Analysis of Normal Tissue Effects in the Clinic (QUANTEC) guidelines against prospectively collected QoL and salivary scintigraphic data. Methods Thirty-one patients with HN cancer were enrolled. Salivary excretion factors (SEFs) measured by scintigraphy and QoL data from self-reported questionnaires were used for NTCP modeling to describe the incidence of grade 3+ xerostomia. The NTCP parameters estimated from the QoL and SEF datasets were compared. Model performance was assessed using Pearson's chi-squared test, Nagelkerke's R², the area under the receiver operating characteristic curve, and the Hosmer–Lemeshow test. The negative predictive value (NPV) was checked for the rate of correctly predicting the lack of incidence. Pearson's chi-squared test was used to test the goodness of fit and association. Results Using the LKB NTCP model and assuming n=1, the dose for uniform irradiation of the whole or partial volume of the parotid gland that results in 50% probability of a complication (TD50) and the slope of the dose–response curve (m) were determined from the QoL and SEF datasets, respectively. The NTCP-fitted parameters for local disease were TD50=43.6 Gy and m=0.18 with the SEF data, and TD50=44.1 Gy and m=0.11 with the QoL data. The rate of grade 3+ xerostomia for treatment plans meeting the QUANTEC guidelines was specifically predicted, with an NPV of 100%, using either the QoL or SEF dataset. Conclusions Our study shows the agreement between the NTCP parameter modeling based on SEF and QoL assessments.

  6. Normal tissue complication probability model parameter estimation for xerostomia in head and neck cancer patients based on scintigraphy and quality of life assessments

    Directory of Open Access Journals (Sweden)

    Lee Tsair-Fwu

    2012-12-01

    Full Text Available Abstract Background With advances in modern radiotherapy (RT), many patients with head and neck (HN) cancer can be effectively cured. However, xerostomia is a common complication in patients after RT for HN cancer. The purpose of this study was to use the Lyman–Kutcher–Burman (LKB) model to derive parameters for the normal tissue complication probability (NTCP) for xerostomia based on scintigraphy assessments and quality of life (QoL) questionnaires. We performed validation tests of the Quantitative Analysis of Normal Tissue Effects in the Clinic (QUANTEC) guidelines against prospectively collected QoL and salivary scintigraphic data. Methods Thirty-one patients with HN cancer were enrolled. Salivary excretion factors (SEFs) measured by scintigraphy and QoL data from self-reported questionnaires were used for NTCP modeling to describe the incidence of grade 3+ xerostomia. The NTCP parameters estimated from the QoL and SEF datasets were compared. Model performance was assessed using Pearson's chi-squared test, Nagelkerke's R², the area under the receiver operating characteristic curve, and the Hosmer–Lemeshow test. The negative predictive value (NPV) was checked for the rate of correctly predicting the lack of incidence. Pearson's chi-squared test was used to test the goodness of fit and association. Results Using the LKB NTCP model and assuming n=1, the dose for uniform irradiation of the whole or partial volume of the parotid gland that results in 50% probability of a complication (TD50) and the slope of the dose–response curve (m) were determined from the QoL and SEF datasets, respectively. The NTCP-fitted parameters for local disease were TD50=43.6 Gy and m=0.18 with the SEF data, and TD50=44.1 Gy and m=0.11 with the QoL data. The rate of grade 3+ xerostomia for treatment plans meeting the QUANTEC guidelines was specifically predicted, with an NPV of 100%, using either the QoL or SEF dataset. Conclusions Our study shows the agreement between the NTCP parameter modeling based on SEF and QoL assessments.
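
    The three records above fit the LKB parameters TD50 and m to binary grade 3+ xerostomia outcomes. A generic maximum-likelihood fit of a probit dose response to per-patient mean parotid dose (appropriate for n = 1, where the gEUD reduces to the mean dose) is sketched below on synthetic data; it illustrates the fitting procedure only and will not reproduce the reported parameter values.

```python
# Generic maximum-likelihood fit of LKB parameters (TD50, m) to binary complication data.
# With n = 1 the gEUD reduces to the mean organ dose, so each patient is summarized by one
# mean parotid dose. Data here are synthetic; the fit will not match the paper's values.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(3)
true_td50, true_m = 44.0, 0.15
mean_dose = rng.uniform(20.0, 70.0, size=200)                       # Gy
p_true = norm.cdf((mean_dose - true_td50) / (true_m * true_td50))
complication = rng.random(200) < p_true                             # observed grade 3+ events

def neg_log_likelihood(params):
    td50, m = params
    p = norm.cdf((mean_dose - td50) / (m * td50))
    p = np.clip(p, 1e-9, 1 - 1e-9)
    return -np.sum(np.where(complication, np.log(p), np.log(1.0 - p)))

fit = minimize(neg_log_likelihood, x0=np.array([50.0, 0.3]),
               bounds=[(10.0, 90.0), (0.01, 1.5)], method="L-BFGS-B")
print("fitted TD50, m:", np.round(fit.x, 2))
```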

  7. Mechanisms for the inversion of chirality: Global reaction route mapping of stereochemical pathways in a probable chiral extraterrestrial molecule, 2-aminopropionitrile

    International Nuclear Information System (INIS)

    Kaur, Ramanpreet; Vikas

    2015-01-01

    2-Aminopropionitrile (2-APN), a probable candidate as a chiral astrophysical molecule, is a precursor to the amino acid alanine. Stereochemical pathways in 2-APN are explored using the Global Reaction Route Mapping (GRRM) method, employing high-level quantum-mechanical computations. Besides predicting the conventional mechanism for chiral inversion, which proceeds through an achiral intermediate, a counterintuitive flipping mechanism through chiral intermediates is revealed for 2-APN using the GRRM. The feasibility of the proposed stereochemical pathways, in terms of the Gibbs free-energy change, is analyzed at temperature conditions akin to the interstellar medium. Notably, the stereoinversion in 2-APN is observed to be more feasible than the dissociation of 2-APN and of the intermediates involved along the stereochemical pathways, and the flipping barrier is as low as 3.68 kJ/mol along one of the pathways. The pathways proposed for the inversion of chirality in 2-APN may provide significant insight into the extraterrestrial origin of life.

  8. Identification of Patient Benefit From Proton Therapy for Advanced Head and Neck Cancer Patients Based on Individual and Subgroup Normal Tissue Complication Probability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jakobi, Annika, E-mail: Annika.Jakobi@OncoRay.de [OncoRay-National Center for Radiation Research in Oncology, Faculty of Medicine and University Hospital Carl Gustav Carus, Technische Universität Dresden, Helmholtz-Zentrum Dresden-Rossendorf, Dresden (Germany); Bandurska-Luque, Anna [OncoRay-National Center for Radiation Research in Oncology, Faculty of Medicine and University Hospital Carl Gustav Carus, Technische Universität Dresden, Helmholtz-Zentrum Dresden-Rossendorf, Dresden (Germany); Department of Radiation Oncology, Faculty of Medicine and University Hospital Carl Gustav Carus, Technische Universität Dresden, Dresden (Germany); Stützer, Kristin; Haase, Robert; Löck, Steffen [OncoRay-National Center for Radiation Research in Oncology, Faculty of Medicine and University Hospital Carl Gustav Carus, Technische Universität Dresden, Helmholtz-Zentrum Dresden-Rossendorf, Dresden (Germany); Wack, Linda-Jacqueline [Section for Biomedical Physics, University Hospital for Radiation Oncology, Eberhard Karls Universität Tübingen (Germany); Mönnich, David [Section for Biomedical Physics, University Hospital for Radiation Oncology, Eberhard Karls Universität Tübingen (Germany); German Cancer Research Center, Heidelberg (Germany); German Cancer Consortium, Tübingen (Germany); Thorwarth, Daniela [Section for Biomedical Physics, University Hospital for Radiation Oncology, Eberhard Karls Universität Tübingen (Germany); and others

    2015-08-01

    Purpose: The purpose of this study was to determine, by treatment plan comparison along with normal tissue complication probability (NTCP) modeling, whether a subpopulation of patients with head and neck squamous cell carcinoma (HNSCC) could be identified that would gain substantial benefit from proton therapy in terms of NTCP. Methods and Materials: For 45 HNSCC patients, intensity modulated radiation therapy (IMRT) was compared to intensity modulated proton therapy (IMPT). Physical dose distributions were evaluated as well as the resulting NTCP values, using modern models for acute mucositis, xerostomia, aspiration, dysphagia, laryngeal edema, and trismus. Patient subgroups were defined based on primary tumor location. Results: Generally, IMPT reduced the NTCP values while keeping similar target coverage for all patients. Subgroup analyses revealed a higher individual reduction of swallowing-related side effects by IMPT for patients with tumors in the upper head and neck area, whereas the risk reduction of acute mucositis was more pronounced in patients with tumors in the larynx region. More patients with tumors in the upper head and neck area had a reduction in NTCP of more than 10%. Conclusions: Subgrouping can help to identify patients who may benefit more than others from the use of IMPT and, thus, can be a useful tool for a preselection of patients in the clinic where there are limited PT resources. Because the individual benefit differs within a subgroup, the relative merits should additionally be evaluated by individual treatment plan comparisons.

  9. Identification of Patient Benefit From Proton Therapy for Advanced Head and Neck Cancer Patients Based on Individual and Subgroup Normal Tissue Complication Probability Analysis

    International Nuclear Information System (INIS)

    Jakobi, Annika; Bandurska-Luque, Anna; Stützer, Kristin; Haase, Robert; Löck, Steffen; Wack, Linda-Jacqueline; Mönnich, David; Thorwarth, Daniela

    2015-01-01

    Purpose: The purpose of this study was to determine, by treatment plan comparison along with normal tissue complication probability (NTCP) modeling, whether a subpopulation of patients with head and neck squamous cell carcinoma (HNSCC) could be identified that would gain substantial benefit from proton therapy in terms of NTCP. Methods and Materials: For 45 HNSCC patients, intensity modulated radiation therapy (IMRT) was compared to intensity modulated proton therapy (IMPT). Physical dose distributions were evaluated as well as the resulting NTCP values, using modern models for acute mucositis, xerostomia, aspiration, dysphagia, laryngeal edema, and trismus. Patient subgroups were defined based on primary tumor location. Results: Generally, IMPT reduced the NTCP values while keeping similar target coverage for all patients. Subgroup analyses revealed a higher individual reduction of swallowing-related side effects by IMPT for patients with tumors in the upper head and neck area, whereas the risk reduction of acute mucositis was more pronounced in patients with tumors in the larynx region. More patients with tumors in the upper head and neck area had a reduction in NTCP of more than 10%. Conclusions: Subgrouping can help to identify patients who may benefit more than others from the use of IMPT and, thus, can be a useful tool for a preselection of patients in the clinic where there are limited PT resources. Because the individual benefit differs within a subgroup, the relative merits should additionally be evaluated by individual treatment plan comparisons

  10. Development of a multivariable normal tissue complication probability (NTCP) model for tube feeding dependence after curative radiotherapy/chemo-radiotherapy in head and neck cancer

    International Nuclear Information System (INIS)

    Wopken, Kim; Bijl, Hendrik P.; Schaaf, Arjen van der; Laan, Hans Paul van der; Chouvalova, Olga; Steenbakkers, Roel J.H.M.; Doornaert, Patricia; Slotman, Ben J.; Oosting, Sjoukje F.; Christianen, Miranda E.M.C.; Laan, Bernard F.A.M. van der; Roodenburg, Jan L.N.; René Leemans, C.; Verdonck-de Leeuw, Irma M.; Langendijk, Johannes A.

    2014-01-01

    Background and purpose: Curative radiotherapy/chemo-radiotherapy for head and neck cancer (HNC) may result in severe acute and late side effects, including tube feeding dependence. The purpose of this prospective cohort study was to develop a multivariable normal tissue complication probability (NTCP) model for tube feeding dependence 6 months (TUBEM6) after definitive radiotherapy, radiotherapy plus cetuximab or concurrent chemoradiation based on pre-treatment and treatment characteristics. Materials and methods: The study included 355 patients with HNC. TUBEM6 was scored prospectively in a standard follow-up program. To design the prediction model, the penalized learning method LASSO was used, with TUBEM6 as the endpoint. Results: The prevalence of TUBEM6 was 10.7%. The multivariable model with the best performance consisted of the variables: advanced T-stage, moderate to severe weight loss at baseline, accelerated radiotherapy, chemoradiation, radiotherapy plus cetuximab, the mean dose to the superior and inferior pharyngeal constrictor muscle, to the contralateral parotid gland and to the cricopharyngeal muscle. Conclusions: We developed a multivariable NTCP model for TUBEM6 to identify patients at risk for tube feeding dependence. The dosimetric variables can be used to optimize radiotherapy treatment planning aiming at prevention of tube feeding dependence and to estimate the benefit of new radiation technologies.
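
    The record builds its multivariable NTCP model with the penalized learning method LASSO. A minimal sketch using L1-penalized logistic regression (one common LASSO implementation for a binary endpoint) is shown below; the variables are stand-ins suggested by the abstract and the data are synthetic.

```python
# Minimal LASSO-style multivariable NTCP sketch: L1-penalized logistic regression on a
# binary endpoint (tube feeding dependence at 6 months). Data and variable names are
# synthetic stand-ins inspired by the record, not the study's cohort.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
n = 355
X = np.column_stack([
    rng.integers(0, 2, n),          # advanced T-stage (0/1)
    rng.integers(0, 2, n),          # moderate/severe baseline weight loss (0/1)
    rng.integers(0, 2, n),          # concurrent chemoradiation (0/1)
    rng.uniform(20, 70, n),         # mean dose, superior pharyngeal constrictor (Gy)
    rng.uniform(20, 70, n),         # mean dose, contralateral parotid (Gy)
    rng.uniform(20, 70, n),         # mean dose, cricopharyngeal muscle (Gy)
])
logit = -6.0 + 1.0 * X[:, 0] + 1.0 * X[:, 1] + 0.06 * X[:, 3]       # synthetic ground truth
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

Xs = StandardScaler().fit_transform(X)
lasso = LogisticRegression(penalty="l1", solver="liblinear", C=0.2).fit(Xs, y)
print("selected (nonzero) coefficients:", np.round(lasso.coef_.ravel(), 2))
print("predicted NTCP for first 5 patients:", np.round(lasso.predict_proba(Xs[:5])[:, 1], 2))
```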

  11. Assessing future vent opening locations at the Somma-Vesuvio volcanic complex: 2. Probability maps of the caldera for a future Plinian/sub-Plinian event with uncertainty quantification

    Science.gov (United States)

    Tadini, A.; Bevilacqua, A.; Neri, A.; Cioni, R.; Aspinall, W. P.; Bisson, M.; Isaia, R.; Mazzarini, F.; Valentine, G. A.; Vitale, S.; Baxter, P. J.; Bertagnini, A.; Cerminara, M.; de Michieli Vitturi, M.; Di Roberto, A.; Engwell, S.; Esposti Ongaro, T.; Flandoli, F.; Pistolesi, M.

    2017-06-01

    In this study, we combine reconstructions of volcanological data sets and inputs from a structured expert judgment to produce a first long-term probability map for vent opening location for the next Plinian or sub-Plinian eruption of Somma-Vesuvio. In the past, the volcano has exhibited significant spatial variability in vent location; this can exert a significant control on where hazards materialize (particularly of pyroclastic density currents). The new vent opening probability mapping has been performed through (i) development of spatial probability density maps with Gaussian kernel functions for different data sets and (ii) weighted linear combination of these spatial density maps. The epistemic uncertainties affecting these data sets were quantified explicitly with expert judgments and implemented following a doubly stochastic approach. Various elicitation pooling metrics and subgroupings of experts and target questions were tested to evaluate the robustness of outcomes. Our findings indicate that (a) Somma-Vesuvio vent opening probabilities are distributed inside the whole caldera, with a peak corresponding to the area of the present crater, but with more than 50% probability that the next vent could open elsewhere within the caldera; (b) there is a mean probability of about 30% that the next vent will open west of the present edifice; (c) there is a mean probability of about 9.5% that the next medium-large eruption will enlarge the present Somma-Vesuvio caldera, and (d) there is a nonnegligible probability (mean value of 6-10%) that the next Plinian or sub-Plinian eruption will have its initial vent opening outside the present Somma-Vesuvio caldera.
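
    The mapping approach above combines Gaussian-kernel spatial density maps from several data sets by a weighted linear sum. A minimal two-data-set sketch of that idea on a regular grid is given below; the coordinates, bandwidths and weights are illustrative, not the study's reconstructed data sets or elicited weights.

```python
# Minimal sketch of the vent-opening mapping idea: Gaussian kernel density maps from
# individual data sets (e.g. past vents, fractures), combined by a weighted linear sum.
# Coordinates, bandwidths and weights are illustrative, not the study's elicited values.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(5)
past_vents = rng.normal(loc=[0.0, 0.0], scale=0.8, size=(30, 2)).T     # data set 1 (x, y in km)
fractures  = rng.normal(loc=[-1.5, 0.5], scale=1.2, size=(50, 2)).T    # data set 2

# Evaluation grid over the caldera area.
x, y = np.mgrid[-4:4:200j, -4:4:200j]
grid = np.vstack([x.ravel(), y.ravel()])

density_maps = [gaussian_kde(past_vents)(grid), gaussian_kde(fractures)(grid)]
weights = np.array([0.6, 0.4])                      # e.g. from expert elicitation (illustrative)

combined = sum(w * d for w, d in zip(weights, density_maps))
prob_map = (combined / combined.sum()).reshape(x.shape)   # normalize to a probability per cell
print("peak cell probability:", prob_map.max(), "| total:", round(prob_map.sum(), 6))
```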

  12. Normal tissue complication probability: Does simultaneous integrated boost intensity-modulated radiotherapy score over other techniques in treatment of prostate adenocarcinoma

    Directory of Open Access Journals (Sweden)

    Jothy Basu K

    2009-01-01

    Full Text Available Aim: The main objective of this study was to analyze the radiobiological effect of different treatment strategies on high-risk prostate adenocarcinoma. Materials and Methods: Ten cases of high-risk prostate adenocarcinoma were selected for this dosimetric study. Four different treatment strategies used for treating prostate cancer were compared: the conventional four-field box technique covering prostate and nodal volumes followed by a three-field conformal boost (3D + 3DCRT), the four-field box technique followed by an intensity-modulated radiotherapy (IMRT) boost (3D + IMRT), IMRT followed by an IMRT boost (IMRT + IMRT), and simultaneous integrated boost IMRT (SIBIMRT). These were compared in terms of tumor control probability (TCP) and normal tissue complication probability (NTCP). The dose prescription, except for SIBIMRT, was 45 Gy in 25 fractions for the prostate and nodal volumes in the initial phase and 27 Gy in 15 fractions for the prostate in the boost phase. For SIBIMRT, equivalent doses were calculated using the biologically equivalent dose, assuming an α/β ratio of 1.5 Gy, with a dose prescription of 60.75 Gy for the gross tumor volume (GTV) and 45 Gy for the clinical target volume in 25 fractions. IMRT plans were made with 15-MV equispaced seven coplanar fields. NTCP was calculated using the Lyman-Kutcher-Burman (LKB) model. Results: NTCP values of 10.7 ± 0.99%, 8.36 ± 0.66%, 6.72 ± 0.85%, and 1.45 ± 0.11% for the bladder and 14.9 ± 0.99%, 14.04 ± 0.66%, 11.38 ± 0.85%, and 5.12 ± 0.11% for the rectum were seen with 3D + 3DCRT, 3D + IMRT, IMRT + IMRT, and SIBIMRT, respectively. Conclusions: SIBIMRT had the lowest NTCP of all the strategies, with a reduced treatment time (3 weeks less). It should be the technique of choice for dose escalation in prostate carcinoma.
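
    The SIB prescription above (60.75 Gy to the GTV in 25 fractions) is stated to be biologically equivalent, for α/β = 1.5 Gy, to the sequential 45 Gy + 27 Gy schedule delivered at 1.8 Gy per fraction. The short check below applies the standard BED formula and shows that the two schedules indeed come out nearly equal; it is a sanity check of the stated equivalence, not the authors' calculation.

```python
# Sanity check of the stated dose equivalence using the standard BED formula
# BED = n*d * (1 + d / (alpha/beta)), with alpha/beta = 1.5 Gy for prostate.
def bed(n_fractions, dose_per_fraction_gy, alpha_beta_gy=1.5):
    total = n_fractions * dose_per_fraction_gy
    return total * (1.0 + dose_per_fraction_gy / alpha_beta_gy)

sib_gtv = bed(25, 60.75 / 25)                 # SIB-IMRT: 60.75 Gy in 25 fractions (2.43 Gy/fx)
sequential = bed(25, 1.8) + bed(15, 1.8)      # 45 Gy/25 fx + 27 Gy/15 fx, both at 1.8 Gy/fx
print(f"SIB GTV BED: {sib_gtv:.1f} Gy | sequential 72 Gy BED: {sequential:.1f} Gy")
```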

  13. The effect of the overall treatment time of fractionated irradiation on the tumor control probability of a human soft tissue sarcoma xenograft in nude mice

    Energy Technology Data Exchange (ETDEWEB)

    Allam, Ayman; Perez, Luis A; Huang, Peigen; Taghian, Alphonse; Azinovic, Ignacio; Freeman, Jill; Duffy, Michael; Efird, Jimmy; Suit, Herman D

    1995-04-30

    Purpose: To study the impact of the overall treatment time of fractionated irradiation on the tumor control probability (TCP) of a human soft tissue sarcoma xenograft growing in nude mice, as well as to compare the pretreatment potential doubling time (T{sub pot}) of this tumor to the effective doubling time (T{sub eff}) derived from three different schedules of irradiation using the same total number of fractions with different overall treatment times. Methods and Materials: The TCP was assessed using the TCD{sub 50} value (the 50% tumor control dose) as an end point. A total of 240 male nude mice, 7-8 weeks old were used in three experimental groups that received the same total number of fractions (30 fractions) with different overall treatment times. In group 1, the animals received three equal fractions/day for 10 consecutive days, in group 2 they received two equal fractions/day for 15 consecutive days, and in group 3 one fraction/day for 30 consecutive days. All irradiations were given under normal blood flow conditions to air breathing animals. The mean tumor diameter at the start of irradiation was 7-8 mm. The mean interfraction intervals were from 8-24 h. The T{sub pot} was measured using Iododeoxyuridine (IudR) labeling and flow cytometry and was compared to T{sub eff}. Results: The TCD{sub 50} values of the three different treatment schedules were 58.8 Gy, 63.2 Gy, and 75.6 Gy for groups 1, 2, and 3, respectively. This difference in TCD{sub 50} values was significant (p < 0.05) between groups 1 and 2 (30 fractions/10 days and 30 fractions/15 days) vs. group 3 (30 fractions/30 days). The loss in TCP due to the prolongation of the overall treatment time from 10 days to 30 days was found to be 1.35-1.4 Gy/day. The pretreatment T{sub pot} (2.4 days) was longer than the calculated T{sub eff} in groups 2 and 3 (1.35 days). Conclusion: Our data show a significant loss in TCP with prolongation of the overall treatment time. This is most probably due to an accelerated repopulation of tumor clonogens. The pretreatment T{sub pot} of this tumor model does not reflect the actual doubling of the clonogens in a protracted regimen.

  14. Normal Tissue Complication Probability Estimation by the Lyman-Kutcher-Burman Method Does Not Accurately Predict Spinal Cord Tolerance to Stereotactic Radiosurgery

    International Nuclear Information System (INIS)

    Daly, Megan E.; Luxton, Gary; Choi, Clara Y.H.; Gibbs, Iris C.; Chang, Steven D.; Adler, John R.; Soltys, Scott G.

    2012-01-01

    Purpose: To determine whether normal tissue complication probability (NTCP) analyses of the human spinal cord by use of the Lyman-Kutcher-Burman (LKB) model, supplemented by linear–quadratic modeling to account for the effect of fractionation, predict the risk of myelopathy from stereotactic radiosurgery (SRS). Methods and Materials: From November 2001 to July 2008, 24 spinal hemangioblastomas in 17 patients were treated with SRS. Of the tumors, 17 received 1 fraction with a median dose of 20 Gy (range, 18–30 Gy) and 7 received 20 to 25 Gy in 2 or 3 sessions, with cord maximum doses of 22.7 Gy (range, 17.8–30.9 Gy) and 22.0 Gy (range, 20.2–26.6 Gy), respectively. By use of conventional values for α/β, volume parameter n, 50% complication probability dose TD50, and inverse slope parameter m, a computationally simplified implementation of the LKB model was used to calculate the biologically equivalent uniform dose and NTCP for each treatment. Exploratory calculations were performed with alternate values of α/β and n. Results: In this study 1 case (4%) of myelopathy occurred. The LKB model using radiobiological parameters from Emami and the logistic model with parameters from Schultheiss overestimated complication rates, predicting 13 complications (54%) and 18 complications (75%), respectively. An increase in the volume parameter (n), to assume greater parallel organization, improved the predictive value of the models. Maximum-likelihood LKB fitting of α/β and n yielded better predictions (0.7 complications), with n = 0.023 and α/β = 17.8 Gy. Conclusions: The spinal cord tolerance to the dosimetry of SRS is higher than predicted by the LKB model using any set of accepted parameters. Only a high α/β value in the LKB model and only a large volume effect in the logistic model with Schultheiss data could explain the low number of complications observed. This finding emphasizes that radiobiological models traditionally used to estimate spinal cord NTCP

  15. Three-dimensional micro-scale strain mapping in living biological soft tissues.

    Science.gov (United States)

    Moo, Eng Kuan; Sibole, Scott C; Han, Sang Kuy; Herzog, Walter

    2018-04-01

    Non-invasive characterization of the mechanical micro-environment surrounding cells in biological tissues at multiple length scales is important for the understanding of the role of mechanics in regulating the biosynthesis and phenotype of cells. However, there is a lack of imaging methods that allow for characterization of the cell micro-environment in three-dimensional (3D) space. The aims of this study were (i) to develop a multi-photon laser microscopy protocol capable of imprinting 3D grid lines onto living tissue at a high spatial resolution, and (ii) to develop image processing software capable of analyzing the resulting microscopic images and performing high resolution 3D strain analyses. Using articular cartilage as the biological tissue of interest, we present a novel two-photon excitation imaging technique for measuring the internal 3D kinematics in intact cartilage at sub-micrometer resolution, spanning length scales from the tissue to the cell level. Using custom image processing software, we provide accurate and robust 3D micro-strain analysis that allows for detailed qualitative and quantitative assessment of the 3D tissue kinematics. This novel technique preserves tissue structural integrity post-scanning, therefore allowing for multiple strain measurements at different time points in the same specimen. The proposed technique is versatile and opens doors for experimental and theoretical investigations on the relationship between tissue deformation and cell biosynthesis. Studies of this nature may enhance our understanding of the mechanisms underlying cell mechano-transduction, and thus, adaptation and degeneration of soft connective tissues. We presented a novel two-photon excitation imaging technique for measuring the internal 3D kinematics in intact cartilage at sub-micrometer resolution, spanning from tissue length scale to cellular length scale. Using a custom image processing software (lsmgridtrack), we provide accurate and robust micro
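    The custom software mentioned (lsmgridtrack) is not reproduced here; as a hedged illustration of the core computation, the sketch below fits a locally homogeneous deformation gradient F to matched reference/deformed grid-point coordinates by least squares and converts it to the Green-Lagrange strain tensor E = ½(FᵀF − I). All coordinates are synthetic.

```python
import numpy as np

def deformation_gradient(ref_pts, def_pts):
    """Least-squares fit of x = F @ X + c from matched 3D grid-point coordinates.

    ref_pts, def_pts: (N, 3) arrays of reference and deformed positions (N >= 4).
    """
    X = np.asarray(ref_pts, float)
    x = np.asarray(def_pts, float)
    Xc, xc = X - X.mean(0), x - x.mean(0)          # remove rigid translation
    # Solve Xc @ F.T ~= xc in the least-squares sense
    F_T, *_ = np.linalg.lstsq(Xc, xc, rcond=None)
    return F_T.T

def green_lagrange_strain(F):
    """E = 0.5 * (F^T F - I); diagonal terms are normal strains, off-diagonals shear."""
    return 0.5 * (F.T @ F - np.eye(3))

# Toy example: 8 corners of a unit cube, stretched 5% along z and slightly sheared in x-y
ref = np.array([[i, j, k] for i in (0, 1) for j in (0, 1) for k in (0, 1)], float)
F_true = np.array([[1.00, 0.02, 0.00],
                   [0.00, 1.00, 0.00],
                   [0.00, 0.00, 1.05]])
deformed = ref @ F_true.T
F = deformation_gradient(ref, deformed)
print(np.round(green_lagrange_strain(F), 4))
```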

  16. Normal Tissue Complication Probability Analysis of Acute Gastrointestinal Toxicity in Cervical Cancer Patients Undergoing Intensity Modulated Radiation Therapy and Concurrent Cisplatin

    International Nuclear Information System (INIS)

    Simpson, Daniel R.; Song, William Y.; Moiseenko, Vitali; Rose, Brent S.; Yashar, Catheryn M.; Mundt, Arno J.; Mell, Loren K.

    2012-01-01

    Purpose: To test the hypothesis that increased bowel radiation dose is associated with acute gastrointestinal (GI) toxicity in cervical cancer patients undergoing concurrent chemotherapy and intensity-modulated radiation therapy (IMRT), using a previously derived normal tissue complication probability (NTCP) model. Methods: Fifty patients with Stage I–III cervical cancer undergoing IMRT and concurrent weekly cisplatin were analyzed. Acute GI toxicity was graded using the Radiation Therapy Oncology Group scale, excluding upper GI events. A logistic model was used to test correlations between acute GI toxicity and bowel dosimetric parameters. The primary objective was to test the association between Grade ≥2 GI toxicity and the volume of bowel receiving ≥45 Gy (V 45 ) using the logistic model. Results: Twenty-three patients (46%) had Grade ≥2 GI toxicity. The mean (SD) V 45 was 143 mL (99). The mean V 45 values for patients with and without Grade ≥2 GI toxicity were 176 vs. 115 mL, respectively. Twenty patients (40%) had V 45 >150 mL. The proportion of patients with Grade ≥2 GI toxicity with and without V 45 >150 mL was 65% vs. 33% (p = 0.03). Logistic model parameter estimates V50 and γ were 161 mL (95% confidence interval [CI] 60–399) and 0.31 (95% CI 0.04–0.63), respectively. On multivariable logistic regression, increased V 45 was associated with an increased odds of Grade ≥2 GI toxicity (odds ratio 2.19 per 100 mL, 95% CI 1.04–4.63, p = 0.04). Conclusions: Our results support the hypothesis that increasing bowel V 45 is correlated with increased GI toxicity in cervical cancer patients undergoing IMRT and concurrent cisplatin. Reducing bowel V 45 could reduce the risk of Grade ≥2 GI toxicity by approximately 50% per 100 mL of bowel spared.
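    Assuming the common volume-logistic parameterization NTCP = 1/(1 + (V50/V)^(4·γ)), the quoted point estimates (V50 = 161 mL, γ = 0.31) imply the risk curve sketched below; at the cohort mean V45 of 143 mL this form returns roughly the observed 46% toxicity rate, but the exact model specification used by the authors is not given in the abstract.

```python
def ntcp_logistic_volume(v45_ml, v50_ml=161.0, gamma=0.31):
    """Logistic volume-response: NTCP = 1 / (1 + (V50 / V)^(4*gamma)).

    V50 and gamma are the point estimates quoted in the abstract; the exact
    parameterization fitted by the authors is an assumption here.
    """
    return 1.0 / (1.0 + (v50_ml / float(v45_ml)) ** (4.0 * gamma))

for v45 in (76, 115, 143, 176):
    print(f"V45 = {v45:3d} mL -> predicted Grade >=2 GI toxicity risk {ntcp_logistic_volume(v45):.1%}")
```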

  17. Normal tissue complication probability modeling for cochlea constraints to avoid causing tinnitus after head-and-neck intensity-modulated radiation therapy

    International Nuclear Information System (INIS)

    Lee, Tsair-Fwu; Yeh, Shyh-An; Chao, Pei-Ju; Chang, Liyun; Chiu, Chien-Liang; Ting, Hui-Min; Wang, Hung-Yu; Huang, Yu-Jie

    2015-01-01

    Radiation-induced tinnitus is a side effect of radiotherapy in the inner ear for cancers of the head and neck. Effective dose constraints for protecting the cochlea are under-reported. The aim of this study is to determine the cochlea dose limitation to avoid causing tinnitus after head-and-neck cancer (HNC) intensity-modulated radiation therapy (IMRT). In total 211 patients with HNC were included; the side effects of radiotherapy were investigated for 422 inner ears in the cohort. Forty-nine of the four hundred and twenty-two samples (11.6 %) developed grade 2+ tinnitus symptoms after IMRT, as diagnosed by a clinician. The Late Effects of Normal Tissues–Subjective, Objective, Management, Analytic (LENT-SOMA) criteria were used for tinnitus evaluation. The logistic and Lyman-Kutcher-Burman (LKB) normal tissue complication probability (NTCP) models were used for the analyses. The NTCP-fitted parameters were TD 50 = 46.31 Gy (95 % CI, 41.46–52.50), γ 50 = 1.27 (95 % CI, 1.02–1.55), and TD 50 = 46.52 Gy (95 % CI, 41.91–53.43), m = 0.35 (95 % CI, 0.30–0.42) for the logistic and LKB models, respectively. The suggested guideline TD 20 for the tolerance dose to produce a 20 % complication rate within a specific period of time was TD 20 = 33.62 Gy (95 % CI, 30.15–38.27) (logistic) and TD 20 = 32.82 Gy (95 % CI, 29.58–37.69) (LKB). To maintain the incidence of grade 2+ tinnitus toxicity <20 % in IMRT, we suggest that the mean dose to the cochlea should be <32 Gy. However, models should not be extrapolated to other patient populations without further verification and should first be confirmed before clinical implementation
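    The reported TD20 values can be recovered approximately by inverting the two fitted models; the sketch below assumes the usual parameterizations (a 4·γ50 logistic in dose and a mean-dose probit/LKB form) and, with the quoted point estimates, returns about 33.7 Gy and 32.8 Gy, close to the published 33.62 Gy and 32.82 Gy.

```python
from math import log
from scipy.stats import norm

# Point estimates quoted in the abstract for the mean cochlear dose
TD50_LOGISTIC, GAMMA50 = 46.31, 1.27
TD50_LKB, M = 46.52, 0.35

def dose_for_ntcp_logistic(p, td50=TD50_LOGISTIC, g50=GAMMA50):
    """Invert NTCP = 1 / (1 + exp(4*g50*(1 - D/TD50))) for the dose D."""
    return td50 * (1.0 - log(1.0 / p - 1.0) / (4.0 * g50))

def dose_for_ntcp_lkb(p, td50=TD50_LKB, m=M):
    """Invert the mean-dose probit (LKB) model NTCP = Phi((D - TD50) / (m*TD50))."""
    return td50 * (1.0 + m * norm.ppf(p))

print(f"logistic TD20 ~ {dose_for_ntcp_logistic(0.20):.1f} Gy")  # ~33.7 Gy
print(f"LKB      TD20 ~ {dose_for_ntcp_lkb(0.20):.1f} Gy")       # ~32.8 Gy
```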

  18. Quantitative maps of protein phosphorylation sites across 14 different rat organs and tissues

    DEFF Research Database (Denmark)

    Lundby, Alicia; Secher, Anna; Lage, Kasper

    2012-01-01

    Deregulated cellular signalling is a common hallmark of disease, and delineating tissue phosphoproteomes is key to unravelling the underlying mechanisms. Here we present the broadest tissue catalogue of phosphoproteins to date, covering 31,480 phosphorylation sites on 7,280 proteins quantified ac...

  19. Magnetic resonance tissue phase mapping demonstrates altered left ventricular diastolic function in children with chronic kidney disease

    International Nuclear Information System (INIS)

    Gimpel, Charlotte; Pohl, Martin; Jung, Bernd A.; Jung, Sabine; Brado, Johannes; Odening, Katja E.; Schwendinger, Daniel; Burkhardt, Barbara; Geiger, Julia; Arnold, Raoul

    2017-01-01

    Echocardiographic examinations have revealed functional cardiac abnormalities in children with chronic kidney disease. To assess the feasibility of MRI tissue phase mapping in children and to assess regional left ventricular wall movements in children with chronic kidney disease. Twenty pediatric patients with chronic kidney disease (before or after renal transplantation) and 12 healthy controls underwent tissue phase mapping (TPM) to quantify regional left ventricular function through myocardial long (Vz) and short-axis (Vr) velocities at all 3 levels of the left ventricle. Patients and controls (age: 8 years - 20 years) were matched for age, height, weight, gender and heart rate. Patients had higher systolic blood pressure. No patient had left ventricular hypertrophy on MRI or diastolic dysfunction on echocardiography. Fifteen patients underwent tissue Doppler echocardiography, with normal z-scores for mitral early diastolic (V E ), late diastolic (V A ) and peak systolic (V S ) velocities. Throughout all left ventricular levels, peak diastolic Vz and Vr (cm/s) were reduced in patients: Vz base -10.6 ± 1.9 vs. -13.4 ± 2.0 (P < 0.0003), Vz mid -7.8 ± 1.6 vs. -11 ± 1.5 (P < 0.0001), Vz apex -3.8 ± 1.6 vs. -5.3 ± 1.6 (P = 0.01), Vr base -4.2 ± 0.8 vs. -4.9 ± 0.7 (P = 0.01), Vr mid -4.7 ± 0.7 vs. -5.4 ± 0.7 (P = 0.01), Vr apex -4.7 ± 1.4 vs. -5.6 ± 1.1 (P = 0.05). Tissue phase mapping is feasible in children and adolescents. Children with chronic kidney disease show significantly reduced peak diastolic long- and short-axis left ventricular wall velocities, reflecting impaired early diastolic filling. Thus, tissue phase mapping detects chronic kidney disease-related functional myocardial changes before overt left ventricular hypertrophy or echocardiographic diastolic dysfunction occurs. (orig.)

  20. Magnetic resonance tissue phase mapping demonstrates altered left ventricular diastolic function in children with chronic kidney disease

    Energy Technology Data Exchange (ETDEWEB)

    Gimpel, Charlotte; Pohl, Martin [Medical Center - University of Freiburg, Department of General Pediatrics, Adolescent Medicine and Neonatology, Center for Pediatrics, Freiburg (Germany); Jung, Bernd A. [Inselspital Bern, Institute of Diagnostic, Interventional and Pediatric Radiology, Bern (Switzerland); Jung, Sabine [Medical Center - University of Freiburg, Department of Nuclear Medicine, Freiburg (Germany); Brado, Johannes; Odening, Katja E. [University Heart Center Freiburg, Department of Cardiology and Angiology I, Freiburg (Germany); Schwendinger, Daniel [University Children' s Hospital Zurich, Zurich (Switzerland); Burkhardt, Barbara [University Children' s Hospital Zurich, Pediatric Heart Center, Zurich (Switzerland); Geiger, Julia [University Children' s Hospital Zurich, Department of Radiology, Zurich (Switzerland); Northwestern University, Department of Radiology, Chicago, IL (United States); Arnold, Raoul [University Hospital Heidelberg, Department of Pediatric and Congenital Cardiology, Heidelberg (Germany)

    2017-02-15

    Echocardiographic examinations have revealed functional cardiac abnormalities in children with chronic kidney disease. To assess the feasibility of MRI tissue phase mapping in children and to assess regional left ventricular wall movements in children with chronic kidney disease. Twenty pediatric patients with chronic kidney disease (before or after renal transplantation) and 12 healthy controls underwent tissue phase mapping (TPM) to quantify regional left ventricular function through myocardial long (Vz) and short-axis (Vr) velocities at all 3 levels of the left ventricle. Patients and controls (age: 8 years - 20 years) were matched for age, height, weight, gender and heart rate. Patients had higher systolic blood pressure. No patient had left ventricular hypertrophy on MRI or diastolic dysfunction on echocardiography. Fifteen patients underwent tissue Doppler echocardiography, with normal z-scores for mitral early diastolic (V{sub E}), late diastolic (V{sub A}) and peak systolic (V{sub S}) velocities. Throughout all left ventricular levels, peak diastolic Vz and Vr (cm/s) were reduced in patients: Vz{sub base} -10.6 ± 1.9 vs. -13.4 ± 2.0 (P < 0.0003), Vz{sub mid} -7.8 ± 1.6 vs. -11 ± 1.5 (P < 0.0001), Vz{sub apex} -3.8 ± 1.6 vs. -5.3 ± 1.6 (P = 0.01), Vr{sub base} -4.2 ± 0.8 vs. -4.9 ± 0.7 (P = 0.01), Vr{sub mid} -4.7 ± 0.7 vs. -5.4 ± 0.7 (P = 0.01), Vr{sub apex} -4.7 ± 1.4 vs. -5.6 ± 1.1 (P = 0.05). Tissue phase mapping is feasible in children and adolescents. Children with chronic kidney disease show significantly reduced peak diastolic long- and short-axis left ventricular wall velocities, reflecting impaired early diastolic filling. Thus, tissue phase mapping detects chronic kidney disease-related functional myocardial changes before overt left ventricular hypertrophy or echocardiographic diastolic dysfunction occurs. (orig.)

  1. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Levy processes, Gerber–Shiu functions and dependence.

  2. Generalized Probability-Probability Plots

    NARCIS (Netherlands)

    Mushkudiani, N.A.; Einmahl, J.H.J.

    2004-01-01

    We introduce generalized Probability-Probability (P-P) plots in order to study the one-sample goodness-of-fit problem and the two-sample problem, for real-valued data. These plots, which are constructed by indexing with the class of closed intervals, globally preserve the properties of classical P-P

  3. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  4. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  5. Tissue

    Directory of Open Access Journals (Sweden)

    David Morrissey

    2012-01-01

    Purpose. In vivo gene therapy directed at tissues of mesenchymal origin could potentially augment healing. We aimed to assess the duration and magnitude of transgene expression in vivo in mice and ex vivo in human tissues. Methods. Using bioluminescence imaging, plasmid and adenoviral vector-based transgene expression in murine quadriceps in vivo was examined. Temporal control was assessed using a doxycycline-inducible system. An ex vivo model was developed and optimised using murine tissue, and applied in ex vivo human tissue. Results. In vivo plasmid-based transgene expression did not silence in murine muscle, unlike in liver. Although maximum luciferase expression was higher in muscle with adenoviral delivery compared with plasmid, expression reduced over time. The inducible promoter cassette successfully regulated gene expression with maximum levels a factor of 11 greater than baseline. Expression was re-induced to a similar level on a temporal basis. Luciferase expression was readily detected ex vivo in human muscle and tendon. Conclusions. Plasmid constructs resulted in long-term in vivo gene expression in skeletal muscle, in a controllable fashion utilising an inducible promoter in combination with oral agents. Successful plasmid gene transfection in human ex vivo mesenchymal tissue was demonstrated for the first time.

  6. Wavelet analysis of polarization azimuths maps for laser images of myocardial tissue for the purpose of diagnosing acute coronary insufficiency

    Science.gov (United States)

    Wanchuliak, O. Ya.; Peresunko, A. P.; Bakko, Bouzan Adel; Kushnerick, L. Ya.

    2011-09-01

    This paper presents the foundations of a large-scale, localized wavelet-polarization analysis of inhomogeneous laser images of histological sections of myocardial tissue. Opportunities were identified for defining relations between the structures of wavelet coefficients and causes of death. The optical model of polycrystalline networks of myocardium protein fibrils is presented. The technique of determining the coordinate distribution of polarization azimuth of the points of laser images of myocardium histological sections is suggested. The results of investigating the interrelation between the values of statistical parameters (statistical moments of the 1st-4th order) that characterize the distributions of wavelet coefficients of the polarization maps of myocardium layers and the causes of death are presented.

  7. Polarized light microscopy for 3-dimensional mapping of collagen fiber architecture in ocular tissues.

    Science.gov (United States)

    Yang, Bin; Jan, Ning-Jiun; Brazile, Bryn; Voorhees, Andrew; Lathrop, Kira L; Sigal, Ian A

    2018-04-06

    Collagen fibers play a central role in normal eye mechanics and pathology. In ocular tissues, collagen fibers exhibit a complex 3-dimensional (3D) fiber orientation, with both in-plane (IP) and out-of-plane (OP) orientations. Imaging techniques traditionally applied to the study of ocular tissues only quantify IP fiber orientation, providing little information on OP fiber orientation. Accurate description of the complex 3D fiber microstructures of the eye requires quantifying full 3D fiber orientation. Herein, we present 3dPLM, a technique based on polarized light microscopy developed to quantify both IP and OP collagen fiber orientations of ocular tissues. The performance of 3dPLM was examined by simulation and experimental verification and validation. The experiments demonstrated an excellent agreement between extracted and true 3D fiber orientation. Both IP and OP fiber orientations can be extracted from the sclera and the cornea, providing previously unavailable quantitative 3D measures and insight into the tissue microarchitecture. Together, the results demonstrate that 3dPLM is a powerful imaging technique for the analysis of ocular tissues. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Quantum Probabilities as Behavioral Probabilities

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2017-03-01

    We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.

  9. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of a major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory, Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think...... analytic expression for the distribution function of a sum of random variables. The presence of heavy-tailed random variables complicates the problem even more. The objective of this dissertation is to provide better approximations by means of sharp asymptotic expressions and Monte Carlo estimators...
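    To make the difficulty concrete, here is a minimal sketch (not taken from the dissertation): estimate the tail probability of a sum of Pareto claims by crude Monte Carlo and compare it with the single-big-jump approximation n·P(X₁ > x), which holds asymptotically for subexponential distributions. The growing relative error of the naive estimator at large thresholds is what motivates sharp asymptotics and specialized estimators.

```python
import numpy as np

rng = np.random.default_rng(0)

def pareto_sample(alpha, size):
    """Pareto(alpha) on [1, inf): heavy-tailed, P(X > x) = x**(-alpha)."""
    return (1.0 - rng.random(size)) ** (-1.0 / alpha)

def tail_prob_crude_mc(n_terms, threshold, alpha=1.5, n_sim=200_000):
    """Crude Monte Carlo estimate of P(X1 + ... + Xn > threshold)."""
    sums = pareto_sample(alpha, (n_sim, n_terms)).sum(axis=1)
    return (sums > threshold).mean()

# Single-big-jump heuristic: for subexponential claims, P(S_n > x) ~ n * P(X1 > x) as x grows.
n, alpha = 10, 1.5
for x in (50, 100, 500):
    est = tail_prob_crude_mc(n, x, alpha)
    asym = n * x ** (-alpha)
    print(f"x = {x:4d}: crude MC {est:.5f}   n*P(X>x) {asym:.5f}")
```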

  10. Post-exposure vaccination with multi-stage vaccine significantly reduce map level in tissues without interference in diagnostics

    DEFF Research Database (Denmark)

    Thakur, Aneesh; Aagaard, Claus; Melvang, Heidi Mikkelsen

    A new (Fet11) vaccine against paratuberculosis based on recombinant antigens from acute and latent stages of Map infection was developed to be used without interference with diagnostic tests for bovine TB and Johne’s disease. Calves were orally inoculated with 2x10E10 live Map in their third week of life and randomly assigned to four groups of seven calves each. One group was left unvaccinated, while other calves were post-exposure vaccinated with either a whole-cell vaccine at 16 weeks, or Fet11 vaccine at 3 and 7, or 16 and 20 weeks of age, respectively. Antibody responses were measured by ID Screen® ELISA and individual vaccine protein ELISAs along with FACS and IFN-γ responses to PPDj and to individual vaccine proteins. At termination 8 or 12 months of age, Map burden in a number of gut tissues was determined by quantitative IS900 PCR and histopathology. Fet11 vaccination of calves at 16...

  11. Livers, guts and gills: mapping the decay profiles of soft tissues to understand authigenic mineral replacement.

    Science.gov (United States)

    Clements, Thomas; Purnell, Mark; Gabbott, Sarah

    2016-04-01

    The hard mineralised parts of organisms such as shells, teeth and bones dominate the fossil record. There are, however, sites around the world where soft tissues are preserved, often through rapid replacement of original tissue by rapidly precipitating authigenic minerals. These exceptionally well-preserved soft-bodied fossils are much more informative about the anatomy, physiology, ecology and behaviour of ancient organisms as well as providing a more inclusive picture of ecosystems and evolution throughout geological time. However, despite the wealth of information that soft-bodied fossils can provide, they must first be correctly interpreted, as the processes of both decay and preservation act to modify the carcass from its in vivo condition. Decay leads to alteration of the appearance and topology of anatomy, and ultimately to loss. Preservation is selective, with some anatomical features being more likely to be captured than others. These problems are especially germane to the interpretation of deep-time and/or enigmatic fossils where no modern analogue exists for comparative anatomical analysis. It is therefore of vital importance to understand the processes carcasses undergo during fossilisation, in order to interpret the anatomical remains of fossils and thus distinguish the true evolutionary presence or absence of anatomy from absence due to taphonomic biases. We have designed a series of novel experiments to investigate, in real time, how decay processes affect the fossilisation potential of soft tissues, especially of internal anatomy. Our data allow us to unravel both the timing and sequence of anatomical decay of different organs. At the same time, by measuring Eh and pH in selected organs we can predict when anatomical features will fall into the window of authigenic mineralization and thus potentially become preserved. We can also place constraints on which minerals will operate to capture tissues. Our findings are applied to the fossil record

  12. Probability tales

    CERN Document Server

    Grinstead, Charles M; Snell, J Laurie

    2011-01-01

    This book explores four real-world topics through the lens of probability theory. It can be used to supplement a standard text in probability or statistics. Most elementary textbooks present the basic theory and then illustrate the ideas with some neatly packaged examples. Here the authors assume that the reader has seen, or is learning, the basic theory from another book and concentrate in some depth on the following topics: streaks, the stock market, lotteries, and fingerprints. This extended format allows the authors to present multiple approaches to problems and to pursue promising side discussions in ways that would not be possible in a book constrained to cover a fixed set of topics. To keep the main narrative accessible, the authors have placed the more technical mathematical details in appendices. The appendices can be understood by someone who has taken one or two semesters of calculus.

  13. Determination of glutamate dehydrogenase activity and its kinetics in mouse tissues using metabolic mapping (quantitative enzyme histochemistry).

    Science.gov (United States)

    Botman, Dennis; Tigchelaar, Wikky; Van Noorden, Cornelis J F

    2014-11-01

    Glutamate dehydrogenase (GDH) catalyses the reversible conversion of glutamate into α-ketoglutarate with the concomitant reduction of NAD(P)(+) to NAD(P)H or vice versa. GDH activity is subject to complex allosteric regulation including substrate inhibition. To determine GDH kinetics in situ, we assessed the effects of various glutamate concentrations in combination with either the coenzyme NAD(+) or NADP(+) on GDH activity in mouse liver cryostat sections using metabolic mapping. NAD(+)-dependent GDH V(max) was 2.5-fold higher than NADP(+)-dependent V(max), whereas the K(m) was similar, 1.92 mM versus 1.66 mM, when NAD(+) or NADP(+) was used, respectively. With either coenzyme, V(max) was determined at 10 mM glutamate and substrate inhibition was observed at higher glutamate concentrations with a K(i) of 12.2 and 3.95 for NAD(+) and NADP(+) used as coenzyme, respectively. NAD(+)- and NADP(+)-dependent GDH activities were examined in various mouse tissues. GDH activity was highest in liver and much lower in other tissues. In all tissues, the highest activity was found when NAD(+) was used as a coenzyme. In conclusion, GDH activity in mice is highest in the liver with NAD(+) as a coenzyme and highest GDH activity was determined at a glutamate concentration of 10 mM. © The Author(s) 2014.
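    The abstract reports Km and Ki values but not the fitted rate equation; assuming the standard substrate-inhibition form v = Vmax·S/(Km + S + S²/Ki), the sketch below illustrates the qualitative behaviour (a rate optimum followed by inhibition at high glutamate). Note that this simple form places the predicted optimum below the 10 mM at which the authors measured Vmax, so it is a sketch of the shape, not a reproduction of their analysis.

```python
import numpy as np

def gdh_rate(s_mm, vmax, km_mm, ki_mm):
    """Substrate-inhibition kinetics: v = Vmax * S / (Km + S + S^2 / Ki).

    The rate law itself is an assumption for illustration; the abstract reports
    Km and Ki but not the equation that was fitted.
    """
    s = np.asarray(s_mm, dtype=float)
    return vmax * s / (km_mm + s + s ** 2 / ki_mm)

glutamate = np.linspace(0.1, 40.0, 400)  # mM
v_nad = gdh_rate(glutamate, vmax=2.5, km_mm=1.92, ki_mm=12.2)   # NAD+ (relative Vmax)
v_nadp = gdh_rate(glutamate, vmax=1.0, km_mm=1.66, ki_mm=3.95)  # NADP+

for label, v in (("NAD+ ", v_nad), ("NADP+", v_nadp)):
    print(f"{label}: rate peaks near {glutamate[np.argmax(v)]:.1f} mM glutamate, "
          f"falls to {v[-1] / v.max():.0%} of peak at 40 mM")
```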

  14. Probability theory

    CERN Document Server

    Dorogovtsev, A Ya; Skorokhod, A V; Silvestrov, D S; Skorokhod, A V

    1997-01-01

    This book of problems is intended for students in pure and applied mathematics. There are problems in traditional areas of probability theory and problems in the theory of stochastic processes, which has wide applications in the theory of automatic control, queuing and reliability theories, and in many other modern science and engineering fields. Answers to most of the problems are given, and the book provides hints and solutions for more complicated problems.

  15. Laser-induced Breakdown Spectroscopy: A New Approach for Nanoparticle's Mapping and Quantification in Organ Tissue

    Science.gov (United States)

    Sancey, Lucie; Motto-Ros, Vincent; Kotb, Shady; Wang, Xiaochun; Lux, François; Panczer, Gérard; Yu, Jin; Tillement, Olivier

    2014-01-01

    Emission spectroscopy of laser-induced plasma was applied to elemental analysis of biological samples. Laser-induced breakdown spectroscopy (LIBS) performed on thin sections of rodent tissues (kidneys and tumor) allows the detection of inorganic elements such as (i) Na, Ca, Cu, Mg, P, and Fe, naturally present in the body and (ii) Si and Gd, detected after the injection of gadolinium-based nanoparticles. The animals were euthanized 1 to 24 hr after intravenous injection of particles. A two-dimensional scan of the sample, performed using a motorized micrometric 3D-stage, allowed the infrared laser beam to explore the surface with a lateral resolution of less than 100 μm. Quantitative chemical images of Gd element inside the organ were obtained with sub-mM sensitivity. LIBS offers a simple and robust method to study the distribution of inorganic materials without any specific labeling. Moreover, the compatibility of the setup with standard optical microscopy emphasizes its potential to provide multiple images of the same biological tissue with different types of response: elemental, molecular, or cellular. PMID:24962015

  16. Mapping of NKp46+ cells in healthy human lymphoid and non-lymphoid tissues

    Directory of Open Access Journals (Sweden)

    Elena eTomasello

    2012-11-01

    Understanding Natural Killer (NK) cell anatomical distribution is key to dissect the role of these unconventional lymphocytes in physiological and disease conditions. In mouse, NK cells have been detected in various lymphoid and non-lymphoid organs, while in humans the current knowledge of NK cell distribution at steady state is mainly restricted to lymphoid tissues. The translation to humans of findings obtained in mice is facilitated by the identification of NK cell markers conserved between these two species. The Natural Cytotoxicity Receptor (NCR) NKp46 is a marker of the NK cell lineage evolutionary conserved in mammals. In mice, NKp46 is also present on rare T cell subsets and on a subset of gut Innate Lymphoid Cells (ILCs) expressing the retinoic acid receptor-related orphan receptor γt (RORγt) transcription factor. Here, we documented the distribution and the phenotype of human NKp46+ cells in lymphoid and non-lymphoid tissues isolated from healthy donors. Human NKp46+ cells were found in splenic red pulp, in lymph nodes, in lungs and gut lamina propria, thus mirroring mouse NKp46+ cell distribution. We also identified a novel cell subset of CD56dimNKp46low cells that includes RORγt+ ILCs with a lineage−CD94−CD117brightCD127bright phenotype. The use of NKp46 thus contributes to establish the basis for analyzing quantitative and qualitative changes of NK cell and ILC subsets in human diseases.

  17. Mapping of NKp46+ Cells in Healthy Human Lymphoid and Non-Lymphoid Tissues

    Science.gov (United States)

    Tomasello, Elena; Yessaad, Nadia; Gregoire, Emilie; Hudspeth, Kelly; Luci, Carmelo; Mavilio, Domenico; Hardwigsen, Jean; Vivier, Eric

    2012-01-01

    Understanding Natural Killer (NK) cell anatomical distribution is key to dissect the role of these unconventional lymphocytes in physiological and disease conditions. In mouse, NK cells have been detected in various lymphoid and non-lymphoid organs, while in humans the current knowledge of NK cell distribution at steady state is mainly restricted to lymphoid tissues. The translation to humans of findings obtained in mice is facilitated by the identification of NK cell markers conserved between these two species. The Natural Cytotoxicity Receptor (NCR) NKp46 is a marker of the NK cell lineage evolutionary conserved in mammals. In mice, NKp46 is also present on rare T cell subsets and on a subset of gut Innate Lymphoid Cells (ILCs) expressing the retinoic acid receptor-related orphan receptor γt (RORγt) transcription factor. Here, we documented the distribution and the phenotype of human NKp46+ cells in lymphoid and non-lymphoid tissues isolated from healthy donors. Human NKp46+ cells were found in splenic red pulp, in lymph nodes, in lungs, and gut lamina propria, thus mirroring mouse NKp46+ cell distribution. We also identified a novel cell subset of CD56dimNKp46low cells that includes RORγt+ ILCs with a lineage−CD94−CD117brightCD127bright phenotype. The use of NKp46 thus contributes to establish the basis for analyzing quantitative and qualitative changes of NK cell and ILC subsets in human diseases. PMID:23181063

  18. A Tissue-Mapped Axolotl De Novo Transcriptome Enables Identification of Limb Regeneration Factors

    Directory of Open Access Journals (Sweden)

    Donald M. Bryant

    2017-01-01

    Mammals have extremely limited regenerative capabilities; however, axolotls are profoundly regenerative and can replace entire limbs. The mechanisms underlying limb regeneration remain poorly understood, partly because the enormous and incompletely sequenced genomes of axolotls have hindered the study of genes facilitating regeneration. We assembled and annotated a de novo transcriptome using RNA-sequencing profiles for a broad spectrum of tissues that is estimated to have near-complete sequence information for 88% of axolotl genes. We devised expression analyses that identified the axolotl orthologs of cirbp and kazald1 as highly expressed and enriched in blastemas. Using morpholino anti-sense oligonucleotides, we find evidence that cirbp plays a cytoprotective role during limb regeneration whereas manipulation of kazald1 expression disrupts regeneration. Our transcriptome and annotation resources greatly complement previous transcriptomic studies and will be a valuable resource for future research in regenerative biology.

  19. Mouse tetranectin: cDNA sequence, tissue-specific expression, and chromosomal mapping

    DEFF Research Database (Denmark)

    Ibaraki, K; Kozak, C A; Wewer, U M

    1995-01-01

    regulation, mouse tetranectin cDNA was cloned from a 16-day-old mouse embryo library. Sequence analysis revealed a 992-bp cDNA with an open reading frame of 606 bp, which is identical in length to the human tetranectin cDNA. The deduced amino acid sequence showed high homology to the human cDNA with 76......(s) of tetranectin. The sequence analysis revealed a difference in both sequence and size of the noncoding regions between mouse and human cDNAs. Northern analysis of the various tissues from mouse, rat, and cow showed the major transcript(s) to be approximately 1 kb, which is similar in size to that observed...

  20. The Influence of Phonotactic Probability on Nonword Repetition and Fast Mapping in 3-Year-Olds with a History of Expressive Language Delay

    Science.gov (United States)

    MacRoy-Higgins, Michelle; Dalton, Kevin Patrick

    2015-01-01

    Purpose: The purpose of this study was to examine the influence of phonotactic probability on sublexical (phonological) and lexical representations in 3-year-olds who had a history of being late talkers in comparison with their peers with typical language development. Method: Ten 3-year-olds who were late talkers and 10 age-matched typically…

  1. Mapping photothermally induced gene expression in living cells and tissues by nanorod-locked nucleic acid complexes.

    Science.gov (United States)

    Riahi, Reza; Wang, Shue; Long, Min; Li, Na; Chiou, Pei-Yu; Zhang, Donna D; Wong, Pak Kin

    2014-04-22

    The photothermal effect of plasmonic nanostructures has numerous applications, such as cancer therapy, photonic gene circuit, large cargo delivery, and nanostructure-enhanced laser tweezers. The photothermal operation can also induce unwanted physical and biochemical effects, which potentially alter the cell behaviors. However, there is a lack of techniques for characterizing the dynamic cell responses near the site of photothermal operation with high spatiotemporal resolution. In this work, we show that the incorporation of locked nucleic acid probes with gold nanorods allows photothermal manipulation and real-time monitoring of gene expression near the area of irradiation in living cells and animal tissues. The multimodal gold nanorod serves as an endocytic delivery reagent to transport the probes into the cells, a fluorescence quencher and a binding competitor to detect intracellular mRNA, and a plasmonic photothermal transducer to induce cell ablation. We demonstrate the ability of the gold nanorod-locked nucleic acid complex for detecting the spatiotemporal gene expression in viable cells and tissues and inducing photothermal ablation of single cells. Using the gold nanorod-locked nucleic acid complex, we systematically characterize the dynamic cellular heat shock responses near the site of photothermal operation. The gold nanorod-locked nucleic acid complex enables mapping of intracellular gene expressions and analyzes the photothermal effects of nanostructures toward various biomedical applications.

  2. Development of a fluorescence endoscopic system for pH mapping of gastric tissue

    Science.gov (United States)

    Rochon, Philippe; Mordon, Serge; Buys, Bruno; Dhelin, Guy; Lesage, Jean C.; Chopin, Claude

    2003-10-01

    Measurement of gastrointestinal intramucosal pH (pHim) has been recognized as an important factor in the detection of hypoxia-induced dysfunctions. However, current pH measurement techniques are limited in terms of temporal and spatial resolution. A major advance in accurate pH measurement was the development of the ratiometric fluorescent indicator dye, 2',7'-bis(carboxyethyl)-5,6-carboxyfluorescein (BCECF). BCECF, whose pKa is in the physiological pH range, is suitable for tissue pH measurements in vivo. This study aimed to develop and evaluate an endoscopic imaging system for real-time pH measurements in the stomach, in order to provide the ICU with a new tool for gastrointestinal intramucosal pH (pHim) measurements. This fluorescence imaging technique should allow the temporal exploration of sequential events, particularly in the ICU, where the pHim provides predictive information on the patient's status. The experimental evaluation of this new and innovative endoscopic fluorescence system confirms the accuracy of pH measurement using BCECF.

  3. A draft map of the human ovarian proteome for tissue engineering and clinical applications.

    Science.gov (United States)

    Ouni, Emna; Vertommen, Didier; Chiti, Maria Costanza; Dolmans, Marie-Madeleine; Amorim, Christiani Andrade

    2018-02-23

    Fertility preservation research in women today is increasingly taking advantage of bioengineering techniques to develop new biomimetic materials and solutions to safeguard ovarian cell function and microenvironment in vitro and in vivo. However, available data on the human ovary are limited and fundamental differences between animal models and humans are hampering researchers in their quest for more extensive knowledge of human ovarian physiology and key reproductive proteins that need to be preserved. We therefore turned to multi-dimensional label-free mass spectrometry to analyze human ovarian cortex, as it is a high-throughput and conclusive technique providing information on the proteomic composition of complex tissues like the ovary. In-depth proteomic profiling through two-dimensional liquid chromatography-mass spectrometry, western blot, histological and immunohistochemical analyses, and data mining helped us to confidently identify 1,508 proteins. Moreover, our method allowed us to chart the most complete representation so far of the ovarian matrisome, defined as the ensemble of extracellular matrix proteins and associated factors, including more than 80 proteins. In conclusion, this study will provide a better understanding of ovarian proteomics, with a detailed characterization of the ovarian follicle microenvironment, in order to enable bioengineers to create biomimetic scaffolds for transplantation and three-dimensional in vitro culture. By publishing our proteomic data, we also hope to contribute to accelerating biomedical research into ovarian health and disease in general. Published under license by The American Society for Biochemistry and Molecular Biology, Inc.

  4. In-Field, In Situ, and In Vivo 3-Dimensional Elemental Mapping for Plant Tissue and Soil Analysis Using Laser-Induced Breakdown Spectroscopy

    Directory of Open Access Journals (Sweden)

    Chunjiang Zhao

    2016-10-01

    Sensing and mapping element distributions in plant tissues and their growth environment has great significance for understanding the uptake, transport, and accumulation of nutrients and harmful elements in plants, as well as for understanding interactions between plants and the environment. In this study, we developed a 3-dimensional elemental mapping system based on laser-induced breakdown spectroscopy that can be deployed in-field to directly measure the distribution of multiple elements in living plants as well as in the soil. Mapping is performed by a fast scanning laser, which ablates a micro volume of a sample to form a plasma. The presence and concentration of specific elements are calculated using the atomic, ionic, and molecular spectral characteristics of the plasma emission spectra. Furthermore, we mapped the pesticide residues in maize leaves after spraying to demonstrate the capacity of this method for trace elemental mapping. We also used the system to quantitatively detect the element concentrations in soil, which can be used to further understand the element transport between plants and soil. We demonstrate that this method has great potential for elemental mapping in plant tissues and soil with the advantages of 3-dimensional and multi-elemental mapping, in situ and in vivo measurement, flexible use, and low cost.
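    As a hedged illustration of how such a raster of spectra becomes an elemental map, the sketch below integrates a background-corrected emission line at every scan position and applies a linear calibration; the chosen line (K I near 766.5 nm), window widths and calibration coefficients are assumptions, not values from the study.

```python
import numpy as np

def line_intensity(wavelength_nm, spectrum, center_nm, half_width_nm=0.2):
    """Background-corrected, summed intensity of a single emission line."""
    w = np.asarray(wavelength_nm)
    in_line = np.abs(w - center_nm) <= half_width_nm
    near = np.abs(w - center_nm) <= 3 * half_width_nm
    background = np.median(spectrum[near & ~in_line])
    return float(np.clip(spectrum[in_line] - background, 0, None).sum())

def elemental_map(wavelengths, spectra_grid, center_nm, slope=1.0, intercept=0.0):
    """Concentration map from a (ny, nx, n_channels) raster of LIBS spectra.

    slope/intercept form a hypothetical linear calibration (intensity -> mg/kg)
    that would come from matrix-matched standards.
    """
    ny, nx, _ = spectra_grid.shape
    out = np.empty((ny, nx))
    for i in range(ny):
        for j in range(nx):
            out[i, j] = slope * line_intensity(wavelengths, spectra_grid[i, j], center_nm) + intercept
    return out

# Tiny synthetic raster: 5x5 scan positions, 1000 channels around the K I line at 766.5 nm
wl = np.linspace(760.0, 772.0, 1000)
rng = np.random.default_rng(1)
grid = rng.normal(100.0, 5.0, size=(5, 5, wl.size))
grid[:, :, np.abs(wl - 766.5) < 0.2] += np.linspace(0.0, 400.0, 25).reshape(5, 5, 1)
print(np.round(elemental_map(wl, grid, center_nm=766.5), 1))
```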

  5. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  6. Exceedance probability map: a tool helping the definition of arsenic Natural Background Level (NBL) within the Drainage Basin to the Venice Lagoon (NE Italy)

    Science.gov (United States)

    Dalla Libera, Nico; Fabbri, Paolo; Mason, Leonardo; Piccinini, Leonardo; Pola, Marco

    2017-04-01

    estimates the spatial distribution of the exceedance probabilities with respect to some pre-defined thresholds. This approach is widely reported in the literature for similar environmental problems. To test the validity of the procedure, we used the dataset from the "A.Li.Na" project (funded by the Regional Environmental Agency), which defined regional NBLs of As, Fe, Mn and NH4+ in the DBVL's groundwater. First, we defined two thresholds corresponding respectively to the IDWS and to the median of the data above the IDWS. These values were chosen based on the dataset's statistical structure and the quality criteria of the GWD 2006/118/EC. Subsequently, we evaluated the spatial distribution of the probability of exceeding the defined thresholds using indicator kriging. The results highlight different zones with high exceedance probability, ranging from 75% to 95%, with respect to both the IDWS and the median value. Considering the geological setting of the DBVL, these probability values correspond with the occurrence of both organic matter and reducing conditions. In conclusion, the spatial prediction of the exceedance probability could be useful for defining the areas in which to estimate the local NBLs, enhancing the procedure of NBL definition. In this way, the NBL estimation could be more realistic because it considers the spatial distribution of the studied contaminant, distinguishing areas with high natural concentrations from polluted ones.
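    The A.Li.Na dataset is not reproduced here, but the indicator-kriging step can be sketched generically: transform each observation into an indicator 1{value > threshold}, krige the indicators with an assumed variogram, and read the kriged value at a new location as the exceedance probability. The coordinates, arsenic values and variogram parameters below are toy assumptions.

```python
import numpy as np

def spherical_gamma(h, sill=1.0, rng_m=5000.0, nugget=0.0):
    """Spherical semivariogram model (an assumed model; fit to the data in practice)."""
    h = np.asarray(h, float)
    g = nugget + sill * (1.5 * h / rng_m - 0.5 * (h / rng_m) ** 3)
    return np.where(h >= rng_m, nugget + sill, np.where(h == 0, 0.0, g))

def indicator_kriging(xy, values, threshold, xy_new, **vario):
    """Ordinary kriging of the indicator I = 1{value > threshold}.

    The kriged indicator at a new location estimates P(value > threshold) there.
    """
    ind = (np.asarray(values, float) > threshold).astype(float)
    n = len(ind)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1)); A[:n, :n] = spherical_gamma(d, **vario); A[-1, -1] = 0.0
    out = []
    for p in np.atleast_2d(xy_new):
        b = np.ones(n + 1)
        b[:n] = spherical_gamma(np.linalg.norm(xy - p, axis=1), **vario)
        w = np.linalg.solve(A, b)[:n]
        out.append(float(np.clip(w @ ind, 0.0, 1.0)))
    return np.array(out)

# Toy wells: coordinates in metres, As concentrations in ug/L, IDWS threshold 10 ug/L
xy = np.array([[0, 0], [1500, 300], [3000, 2500], [500, 4000], [4200, 900]], float)
arsenic = np.array([4.0, 25.0, 18.0, 6.0, 2.0])
targets = np.array([[1000, 1000], [3500, 3000]], float)
print(indicator_kriging(xy, arsenic, threshold=10.0, xy_new=targets, sill=1.0, rng_m=4000.0))
```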

  7. MAP3K8 (TPL2/COT) Affects Obesity-Induced Adipose Tissue Inflammation without Systemic Effects in Humans and in Mice

    NARCIS (Netherlands)

    Ballak, D.B.; Essen, P. van; Diepen, J.A. van; Jansen, H.J.; Hijmans, A.G.; Matsuguchi, T.; Sparrer, H.; Tack, C.J.J.; Netea, M.G.; Joosten, L.A.B.; Stienstra, R.

    2014-01-01

    Chronic low-grade inflammation in adipose tissue often accompanies obesity, leading to insulin resistance and increasing the risk for metabolic diseases. MAP3K8 (TPL2/COT) is an important signal transductor and activator of pro-inflammatory pathways that has been linked to obesity-induced adipose

  8. Tissue reduction of map numbers after post-exposure vaccination with single latency antigen is improved by combination with acute-stage antigens in goats

    DEFF Research Database (Denmark)

    Thakur, Aneesh; Aagaard, C.; Melvang, Heidi Mikkelsen

    compared to unvaccinated control goats. FET11 and FET13 vaccination, however, provided significant protection, with absent or very low Map numbers in tissues. No goats seroconverted in the ID Screen® ELISA, except for a single goat in the unvaccinated control group at the last sampling prior to euthanasia. PPDj...

  9. Left ventricular regional myocardial motion and twist function in repaired tetralogy of Fallot evaluated by magnetic resonance tissue phase mapping

    International Nuclear Information System (INIS)

    Chang, Meng-Chu; Peng, Hsu-Hsia; Wu, Ming-Ting; Weng, Ken-Pen; Su, Mao-Yuan; Menza, Marius; Huang, Hung-Chieh

    2018-01-01

    We aimed to characterise regional myocardial motion and twist function in the left ventricle (LV) in patients with repaired tetralogy of Fallot (rTOF) and preserved LV global function. We recruited 47 rTOF patients and 38 age-matched normal volunteers. Tissue phase mapping (TPM) was performed for evaluating the LV myocardial velocity in longitudinal, radial, and circumferential (Vz, Vr, and Vφ) directions in basal, middle, and apical slices. The Vφ peak-to-peak (PTP) during systolic phases, the rotation angle of each slice, and Vφ inconsistency were computed for evaluating LV twist function and Vφ dyssynchrony. As compared to the controls, the rTOF patients presented decreased RV ejection fraction (RVEF) (p = 0.002) and preserved global LV ejection fraction (LVEF). They also demonstrated decreased systolic and diastolic Vz in several LV segments and higher diastolic Vr in the septum (all p < 0.05). A lower Vφ PTP, higher Vφ inconsistency, and reduced peak net rotation angle (all p < 0.05) were observed. The aforementioned indices demonstrated an altered LV twist function in rTOF patients in an early disease stage. MR TPM could provide information about early abnormalities of LV regional motion and twist function in rTOF patients with preserved LV global function. (orig.)
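    As a simplified illustration of the twist-related indices (not the vendor or study implementation), slice rotation can be reconstructed by integrating the circumferential velocity Vφ divided by an effective slice radius, and the systolic peak-to-peak velocity read off directly; the waveforms, radii and timing below are synthetic.

```python
import numpy as np

def rotation_angle_deg(v_phi_cm_s, radius_cm, dt_ms):
    """Cumulative slice rotation (degrees) from circumferential velocity.

    angle(t) = integral of V_phi / r dt, assuming a constant effective radius
    per slice -- a simplification for illustration only.
    """
    omega = np.asarray(v_phi_cm_s, float) / radius_cm          # rad/s
    return np.degrees(np.cumsum(omega) * dt_ms / 1000.0)

def peak_to_peak(velocity, systole_mask):
    """Peak-to-peak velocity within the systolic phases (as used for the Vphi PTP index)."""
    v = np.asarray(velocity, float)[np.asarray(systole_mask, bool)]
    return float(v.max() - v.min())

# Toy cine with 30 phases over an 800 ms cardiac cycle; waveforms are illustrative
t = np.linspace(0.0, 1.0, 30)
v_phi_apex = 3.0 * np.sin(2 * np.pi * t)        # cm/s
v_phi_base = -1.5 * np.sin(2 * np.pi * t)
dt = 800.0 / 30
twist = rotation_angle_deg(v_phi_apex, 2.0, dt) - rotation_angle_deg(v_phi_base, 3.0, dt)
systole = t < 0.4
print(f"peak net twist ~ {twist.max():.1f} deg, "
      f"apical Vphi PTP (systole) ~ {peak_to_peak(v_phi_apex, systole):.1f} cm/s")
```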

  10. Determination of composition and structure of spongy bone tissue in human head of femur by Raman spectral mapping.

    Science.gov (United States)

    Kozielski, M; Buchwald, T; Szybowicz, M; Błaszczak, Z; Piotrowski, A; Ciesielczyk, B

    2011-07-01

    Biomechanical properties of bone depend on the composition and organization of collagen fibers. In this study, Raman microspectroscopy was employed to determine the content of mineral and organic constituents and the orientation of collagen fibers in spongy bone in the human head of femur at the microstructural level. Changes in composition and structure of trabeculae were illustrated using Raman spectral mapping. The polarized Raman spectra permit separate analysis of local variations in orientation and composition. The ratios of ν₂PO₄³⁻/Amide III, ν₄PO₄³⁻/Amide III and ν₁CO₃²⁻/ν₂PO₄³⁻ are used to describe relative amounts of spongy bone components. The ν₁PO₄³⁻/Amide I ratio is quite susceptible to orientation effects and provides information on collagen fiber orientation. The results presented illustrate the versatility of the Raman method in the study of bone tissue. The study permits better understanding of bone physiology and evaluation of the biomechanical properties of bone.
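    A hedged sketch of how such band ratios are typically computed from a single spectrum is given below; the integration windows are common literature choices, not the limits used in this study, and baseline handling is deliberately minimal.

```python
import numpy as np

# Approximate band windows (cm^-1) commonly used for bone Raman spectra; the exact
# integration limits used in the study are not given in the abstract.
BANDS = {
    "nu1_PO4": (930, 980),
    "nu2_PO4": (410, 460),
    "nu4_PO4": (570, 625),
    "nu1_CO3": (1050, 1100),
    "amide_III": (1215, 1300),
    "amide_I": (1620, 1700),
}

def band_area(shift_cm, intensity, window):
    """Summed intensity in a window after linear baseline removal (uniform spacing assumed)."""
    shift_cm, intensity = np.asarray(shift_cm), np.asarray(intensity)
    lo, hi = window
    m = (shift_cm >= lo) & (shift_cm <= hi)
    x, y = shift_cm[m], intensity[m]
    baseline = np.interp(x, [x[0], x[-1]], [y[0], y[-1]])
    return float(np.clip(y - baseline, 0, None).sum())

def composition_ratios(shift_cm, intensity):
    """Mineral-to-matrix, carbonate-substitution and orientation-sensitive ratios."""
    a = {name: band_area(shift_cm, intensity, win) for name, win in BANDS.items()}
    return {
        "nu2PO4/amideIII": a["nu2_PO4"] / a["amide_III"],
        "nu4PO4/amideIII": a["nu4_PO4"] / a["amide_III"],
        "nu1CO3/nu2PO4": a["nu1_CO3"] / a["nu2_PO4"],
        "nu1PO4/amideI": a["nu1_PO4"] / a["amide_I"],
    }

# Usage (per map pixel): ratios = composition_ratios(raman_shift_cm, spectrum)
```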

  11. Gene Expression Profiling Soybean Stem Tissue Early Response to Sclerotinia sclerotiorum and In Silico Mapping in Relation to Resistance Markers

    Directory of Open Access Journals (Sweden)

    Bernarda Calla

    2009-07-01

    White mold, caused by Sclerotinia sclerotiorum (Lib.) de Bary, can be a serious disease of crops grown under cool, moist environments. In many plants, such as soybean [Glycine max (L.) Merr.], complete genetic resistance does not exist. To identify possible genes involved in defense against this pathogen, and to determine possible physiological changes that occur during infection, a microarray screen was conducted using stem tissue to evaluate changes in gene expression between partially resistant and susceptible soybean genotypes at 8 and 14 hours post inoculation. RNA from 15-day-old inoculated plants was labeled and hybridized to soybean cDNA microarrays. ANOVA identified 1270 significant genes from the comparison between time points and 105 genes from the comparison between genotypes. Selected genes were classified into functional categories. The analyses identified changes in cell-wall composition and signaling pathways, as well as suggesting a role for anthocyanin and anthocyanidin synthesis in the defense against S. sclerotiorum. In-silico mapping of both the differentially expressed transcripts and of public markers associated with partial resistance to white mold provided evidence of several differentially expressed genes being closely positioned to white mold resistance markers, with the two most promising genes encoding a PR-5 and anthocyanidin synthase.

  12. Left ventricular regional myocardial motion and twist function in repaired tetralogy of Fallot evaluated by magnetic resonance tissue phase mapping

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Meng-Chu; Peng, Hsu-Hsia [National Tsing Hua University, Department of Biomedical Engineering and Environmental Sciences, Hsinchu (China); Wu, Ming-Ting [Kaohsiung Veterans General Hospital, Department of Radiology, Kaohsiung (China); National Yang-Ming University, Faculty of Medicine, Taipei (China); Weng, Ken-Pen [National Yang-Ming University, Faculty of Medicine, Taipei (China); Kaohsiung Veterans General Hospital, Department of Pediatrics, Kaohsiung (China); Shu-Zen Junior College of Medicine and Management, Department of Physical Therapy, Kaohsiung (China); Su, Mao-Yuan [National Taiwan University Hospital, Department of Medical Imaging, Taipei (China); Menza, Marius [Medical Center University of Freiburg, Faculty of Medicine, University of Freiburg, Department of Radiology, Medical Physics, Freiburg (Germany); Huang, Hung-Chieh [Kaohsiung Veterans General Hospital, Department of Radiology, Kaohsiung (China)

    2018-01-15

    We aimed to characterise regional myocardial motion and twist function in the left ventricle (LV) in patients with repaired tetralogy of Fallot (rTOF) and preserved LV global function. We recruited 47 rTOF patients and 38 age-matched normal volunteers. Tissue phase mapping (TPM) was performed for evaluating the LV myocardial velocity in longitudinal, radial, and circumferential (Vz, Vr, and Vφ) directions in basal, middle, and apical slices. The Vφ peak-to-peak (PTP) during systolic phases, the rotation angle of each slice, and Vφ inconsistency were computed for evaluating LV twist function and Vφ dyssynchrony. As compared to the controls, the rTOF patients presented decreased RV ejection fraction (RVEF) (p = 0.002) and preserved global LV ejection fraction (LVEF). They also demonstrated decreased systolic and diastolic Vz in several LV segments and higher diastolic Vr in the septum (all p < 0.05). A lower Vφ PTP, higher Vφ inconsistency, and reduced peak net rotation angle (all p < 0.05) were observed. The aforementioned indices demonstrated an altered LV twist function in rTOF patients in an early disease stage. MR TPM could provide information about early abnormalities of LV regional motion and twist function in rTOF patients with preserved LV global function. (orig.)

  13. Re-evaluation of a novel approach for quantitative myocardial oedema detection by analysing tissue inhomogeneity in acute myocarditis using T2-mapping

    International Nuclear Information System (INIS)

    Baessler, Bettina; Treutlein, Melanie; Maintz, David; Bunck, Alexander C.; Schaarschmidt, Frank; Stehning, Christian; Schnackenburg, Bernhard; Michels, Guido

    2017-01-01

    To re-evaluate a recently suggested approach of quantifying myocardial oedema and increased tissue inhomogeneity in myocarditis by T2-mapping. Cardiac magnetic resonance data of 99 patients with myocarditis were retrospectively analysed. Thirty healthy volunteers served as controls. T2-mapping data were acquired at 1.5 T using a gradient-spin-echo T2-mapping sequence. T2-maps were segmented according to the 16-segment AHA model. Segmental T2-values, segmental pixel-standard deviation (SD) and the derived parameters maxT2, maxSD and madSD were analysed and compared to the established Lake Louise criteria (LLC). A re-estimation of logistic regression models revealed that all models containing an SD-parameter were superior to any model containing global myocardial T2. Using a combined cut-off of 1.8 ms for madSD + 68 ms for maxT2 resulted in a diagnostic sensitivity of 75% and specificity of 80% and showed a similar diagnostic performance compared to LLC in receiver-operating-curve analyses. Combining madSD, maxT2 and late gadolinium enhancement (LGE) in a model resulted in a superior diagnostic performance compared to LLC (sensitivity 93%, specificity 83%). The results show that the novel T2-mapping-derived parameters exhibit an additional diagnostic value over LGE with the inherent potential to overcome the current limitations of T2-mapping. (orig.)
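    A minimal sketch of the segmental statistics follows. The exact definition of madSD is not spelled out in the abstract, so it is computed here as the mean absolute deviation of the segmental SDs from their median, and the combined cut-off (madSD > 1.8 ms together with maxT2 > 68 ms) is interpreted conjunctively; both readings are assumptions.

```python
import numpy as np

def segmental_t2_statistics(t2_map_ms, segment_labels):
    """Per-segment mean T2 and pixel SD from a T2 map and a 16-segment AHA labelling.

    Returns (maxT2, maxSD, madSD). madSD is computed as the mean absolute deviation
    of the segmental SDs from their median -- an assumed reading of the abbreviation.
    """
    labels = np.unique(segment_labels[segment_labels > 0])
    seg_mean = np.array([t2_map_ms[segment_labels == s].mean() for s in labels])
    seg_sd = np.array([t2_map_ms[segment_labels == s].std(ddof=1) for s in labels])
    mad_sd = np.abs(seg_sd - np.median(seg_sd)).mean()
    return seg_mean.max(), seg_sd.max(), mad_sd

def myocarditis_flag(max_t2_ms, mad_sd_ms, cut_t2=68.0, cut_madsd=1.8):
    """Combined cut-off quoted in the abstract, applied conjunctively (an assumption)."""
    return (mad_sd_ms > cut_madsd) and (max_t2_ms > cut_t2)

# Toy example: 64x64 map, random segment labels 1..16, one artificially "oedematous" segment
rng = np.random.default_rng(2)
labels = rng.integers(1, 17, size=(64, 64))
t2 = rng.normal(55.0, 3.0, size=(64, 64)) + (labels == 7) * 15.0
print(segmental_t2_statistics(t2, labels))
```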

  14. Re-evaluation of a novel approach for quantitative myocardial oedema detection by analysing tissue inhomogeneity in acute myocarditis using T2-mapping

    Energy Technology Data Exchange (ETDEWEB)

    Baessler, Bettina; Treutlein, Melanie; Maintz, David; Bunck, Alexander C. [University Hospital of Cologne, Department of Radiology, Cologne (Germany); Schaarschmidt, Frank [Leibniz Universitaet Hannover, Institute of Biostatistics, Faculty of Natural Sciences, Hannover (Germany); Stehning, Christian [Philips Research, Hamburg (Germany); Schnackenburg, Bernhard [Philips, Healthcare Germany, Hamburg (Germany); Michels, Guido [University Hospital of Cologne, Department III of Internal Medicine, Heart Centre, Cologne (Germany)

    2017-12-15

    To re-evaluate a recently suggested approach of quantifying myocardial oedema and increased tissue inhomogeneity in myocarditis by T2-mapping. Cardiac magnetic resonance data of 99 patients with myocarditis were retrospectively analysed. Thirthy healthy volunteers served as controls. T2-mapping data were acquired at 1.5 T using a gradient-spin-echo T2-mapping sequence. T2-maps were segmented according to the 16-segments AHA-model. Segmental T2-values, segmental pixel-standard deviation (SD) and the derived parameters maxT2, maxSD and madSD were analysed and compared to the established Lake Louise criteria (LLC). A re-estimation of logistic regression models revealed that all models containing an SD-parameter were superior to any model containing global myocardial T2. Using a combined cut-off of 1.8 ms for madSD + 68 ms for maxT2 resulted in a diagnostic sensitivity of 75% and specificity of 80% and showed a similar diagnostic performance compared to LLC in receiver-operating-curve analyses. Combining madSD, maxT2 and late gadolinium enhancement (LGE) in a model resulted in a superior diagnostic performance compared to LLC (sensitivity 93%, specificity 83%). The results show that the novel T2-mapping-derived parameters exhibit an additional diagnostic value over LGE with the inherent potential to overcome the current limitations of T2-mapping. (orig.)

  15. DNA methylation map in circulating leukocytes mirrors subcutaneous adipose tissue methylation pattern: a genome-wide analysis from non-obese and obese patients

    Science.gov (United States)

    Crujeiras, A. B.; Diaz-Lagares, A.; Sandoval, J.; Milagro, F. I.; Navas-Carretero, S.; Carreira, M. C.; Gomez, A.; Hervas, D.; Monteiro, M. P.; Casanueva, F. F.; Esteller, M.; Martinez, J. A.

    2017-01-01

    The characterization of the epigenetic changes within the obesity-related adipose tissue will provide new insights to understand this metabolic disorder, but adipose tissue is not easy to sample in population-based studies. We aimed to evaluate the capacity of circulating leukocytes to reflect the adipose tissue-specific DNA methylation status of obesity susceptibility. DNA samples isolated from subcutaneous adipose tissue and circulating leukocytes were hybridized on the Infinium HumanMethylation450 BeadChip. Data were compared between samples from obese (n = 45) and non-obese (n = 8–10) patients by Wilcoxon rank test, unadjusted for cell type distributions. A global hypomethylation of the differentially methylated CpG sites (DMCpGs) was observed in the obese subcutaneous adipose tissue and leukocytes. The overlap analysis yielded a number of genes mapped by the common DMCpGs that were identified to reflect the obesity state in the leukocytes. Specifically, the methylation levels of FGFRL1, NCAPH2, PNKD and SMAD3 exhibited excellent and statistically significant efficiencies in the discrimination of obesity from non-obesity status (AUC > 0.80; p obesity-related adipose tissue pathogenesis through peripheral blood analysis, an easily accessible and minimally invasive biological material instead of adipose tissue. PMID:28211912

  16. Does the fluence map editing in electronic tissue compensator improve dose homogeneity in bilateral field plan of head and neck patients?

    Directory of Open Access Journals (Sweden)

    Kinhikar Rajesh

    2008-01-01

    Full Text Available The purpose of this study was to evaluate the effect of fluence map editing in an electronic tissue compensator (ETC) on dose homogeneity for head and neck cancer patients. Treatment planning using 6-MV X-rays and a bilateral field arrangement employing ETC was carried out on the computed tomography (CT) datasets of 20 patients with head and neck cancer. All the patients were planned in the Varian Eclipse three-dimensional treatment planning system (3DTPS) with dynamic multileaf collimator (DMLC). The treatment plans, with and without fluence editing, were compared and the effect of pre-editing and post-editing the fluence maps in the treatment field was evaluated. The skin dose was measured with thermoluminescent dosimeters (TLDs) and was compared with the skin dose estimated by the TPS. The mean percentage volume of the tissue receiving at least 107% of the prescription dose was 5.4 (range 1.5-10; SD 2.4). After fluence map editing, the mean percentage volume of the tissue receiving at least 107% of the prescription dose was 0.47 (range 0.1-0.9; SD 0.3). The mean skin dose measured with TLD was 74% (range 71-80%) of the prescribed dose, while the TPS showed a mean skin dose of 85% (range 80-90%); the TPS thus overestimated the skin dose by 11%. Fluence map editing proved to be a potential tool for improving dose homogeneity in head and neck cancer patients planned with ETC, reducing the hot spots in the treatment region. The treatment with ETC is feasible with DMLC and does not take any additional time for setup or delivery. The method used to edit the fluence maps is simple and time efficient. Manual control over a plan is essential to create the best treatment plan possible.
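    The hot-spot metric reported above (percentage of the volume receiving at least 107% of the prescription dose) is straightforward to reproduce from any voxel-wise dose distribution. The sketch below is illustrative only: it assumes the dose grid is available as a NumPy array, and the function name and the synthetic numbers are not from the study.

```python
import numpy as np

def hotspot_volume_percent(dose, prescription, threshold=1.07):
    """Percentage of voxels receiving at least `threshold` x the prescription dose."""
    dose = np.asarray(dose, dtype=float)
    return 100.0 * np.count_nonzero(dose >= threshold * prescription) / dose.size

# Synthetic 1000-voxel dose distribution around a 70 Gy prescription
rng = np.random.default_rng(0)
dose = rng.normal(loc=70.0, scale=2.0, size=1000)
print(f"V107% = {hotspot_volume_percent(dose, 70.0):.1f}% of the volume")
```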

  17. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming framework that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...

  18. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which include subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...

  19. Validity of T2 mapping in characterization of the regeneration tissue by bone marrow derived cell transplantation in osteochondral lesions of the ankle

    Energy Technology Data Exchange (ETDEWEB)

    Battaglia, M., E-mail: milva.battaglia@ior.it [Service of Ecography and Radiology, Rizzoli Orthopaedic Institute, via Pupilli n. 1, 40136 Bologna (Italy); Rimondi, E. [Service of Ecography and Radiology, Rizzoli Orthopaedic Institute, via Pupilli n. 1, 40136 Bologna (Italy); Monti, C. [Service of CT and MRI, Casa di Cura Madre Fortunata Toniolo, Bologna (Italy); Guaraldi, F. [Department of Pathology, The Johns Hopkins University, School of Medicine, Baltimore, MD (United States); Sant' Andrea, A. [Service of CT and MRI, Casa di Cura Madre Fortunata Toniolo, Bologna (Italy); Buda, R.; Cavallo, M.; Giannini, S.; Vannini, F. [Clinical Orthopaedic and Traumatology Unit II, Rizzoli Orthopaedic Institute, Bologna (Italy)

    2011-11-15

    Objective: Bone marrow derived cell transplantation (BMDCT) has been recently suggested as a possible surgical technique to repair osteochondral lesions. To date, no qualitative MRI studies have evaluated its efficacy. The aim of our study is to investigate the validity of the MRI T2-mapping sequence in characterizing the reparative tissue obtained and its ability to correlate with clinical results. Methods and materials: 20 patients with an osteochondral lesion of the talus underwent BMDCT and were evaluated at 2 years follow up using an MRI T2-mapping sequence. 20 healthy volunteers were recruited as controls. MRI images were acquired using a protocol suggested by the International Cartilage Repair Society, the MOCART scoring system and T2 mapping. Results were then correlated with the AOFAS clinical score. Results: The AOFAS score increased from 66.8 ± 14.5 pre-operatively to 91.2 ± 8.3 (p < 0.0005) at 2 years follow-up. A T2-relaxation time of 35-45 ms was derived from the evaluation of healthy ankles, assumed to represent normal hyaline cartilage, and used as a control. Regenerated tissue with a T2-relaxation time comparable to hyaline cartilage was found in all the cases treated, covering a mean of 78% of the repaired lesion area. A high clinical score was directly related to an isointense signal in DPFSE fat sat (p = 0.05) and to the percentage of regenerated hyaline cartilage (p = 0.05), and inversely to the percentage of regenerated fibrocartilage. Lesion depth was negatively related to the integrity of the repaired tissue's surface (tau = -0.523, p = 0.007), and to the percentage of regenerated hyaline cartilage (rho = -0.546, p = 0.013). Conclusions: Because of its ability to assess cartilage quality and its correlation with the clinical score, the MRI T2-mapping sequence integrated with the MOCART score represents a valid, non-invasive technique for qualitative cartilage assessment after regenerative surgical procedures.
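    The "percentage of regenerated hyaline cartilage" reported above rests on classifying repair-tissue pixels by whether their T2 relaxation time falls within the 35-45 ms reference range derived from healthy ankles. A minimal sketch of that classification follows; the array layout, function name and toy numbers are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def hyaline_like_fraction(t2_map, lesion_mask, lo=35.0, hi=45.0):
    """Percentage of lesion pixels whose T2 relaxation time lies in the
    healthy-cartilage reference range (35-45 ms in the study)."""
    values = np.asarray(t2_map, dtype=float)[np.asarray(lesion_mask, dtype=bool)]
    in_range = (values >= lo) & (values <= hi)
    return 100.0 * in_range.sum() / values.size

# Toy 2D T2 map (ms) with a 3x3 lesion region of interest
t2 = np.full((5, 5), 60.0)
t2[1:4, 1:4] = [[38, 41, 44], [52, 40, 39], [37, 61, 42]]
mask = np.zeros((5, 5), dtype=bool)
mask[1:4, 1:4] = True
print(f"{hyaline_like_fraction(t2, mask):.0f}% of lesion area in the 35-45 ms range")
```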

  20. Oxygen Mapping within Healthy and Acutely Infarcted Brain Tissue in Humans Using the NMR Relaxation of Lipids: A Proof-Of-Concept Translational Study.

    Science.gov (United States)

    Colliez, Florence; Safronova, Marta M; Magat, Julie; Joudiou, Nicolas; Peeters, André P; Jordan, Bénédicte F; Gallez, Bernard; Duprez, Thierry

    2015-01-01

    The clinical applicability of brain oxygenation mapping using the MOBILE (Mapping of Oxygen By Imaging Lipids relaxation Enhancement) magnetic resonance (MR) technique was assessed in the clinical setting of normal brain and of acute cerebral ischemia as a founding proof-of-concept translational study. Changes in the oxygenation level within healthy brain tissue can be detected by analyzing the spin-lattice proton relaxation ('Global T1' combining water and lipid protons) because of the paramagnetic properties of molecular oxygen. It was hypothesized that selective measurement of the relaxation of the lipid protons ('Lipids T1') would result in enhanced sensitivity of pO2 mapping because of higher solubility of oxygen in lipids than in water, and this was demonstrated in pre-clinical models using the MOBILE technique. In the present study, 12 healthy volunteers and eight patients with acute (48-72 hours) brain infarction were examined with the same clinical 3T MR system. Both Lipids R1 (R1 = 1/T1) and Global R1 were significantly different in the infarcted area and the contralateral unaffected brain tissue, with a higher statistical significance for Lipids R1 (median difference: 0.408 s-1; pbrain tissue of stroke patients were not significantly different from the R1 values calculated in the brain tissue of healthy volunteers. The main limitations of the present prototypic version of the MOBILE sequence are the long acquisition time (4 min), hampering robustness of data in uncooperative patients, and a 2 mm slice thickness precluding accurate measurements in small infarcts because of partial volume averaging effects.

  1. Molecular cloning, genomic organization, chromosome mapping, tissues expression pattern and identification of a novel splicing variant of porcine CIDEb gene

    International Nuclear Information System (INIS)

    Li, YanHua; Li, AiHua; Yang, Z.Q.

    2016-01-01

    Cell death-inducing DNA fragmentation factor-α-like effector b (CIDEb) is a member of the CIDE family of apoptosis-inducing factors. CIDEa and CIDEc have been reported to be lipid droplet (LD)-associated proteins that promote atypical LD fusion in adipocytes and are responsible for liver steatosis under fasting and obese conditions, whereas CIDEb promotes lipid storage under normal diet conditions [1] and promotes the formation of triacylglyceride-enriched VLDL particles in hepatocytes [2]. Here, we report the gene cloning, chromosome mapping, tissue distribution, genetic expression analysis, and identification of a novel splicing variant of the porcine CIDEb gene. Sequence analysis shows that the open reading frame of the normal porcine CIDEb isoform covers 660 bp and encodes a 219-amino acid polypeptide, whereas its alternative splicing variant encodes a 142-amino acid polypeptide truncated at the fourth exon and comprising the CIDE-N domain and part of the CIDE-C domain. The deduced amino acid sequence of normal porcine CIDEb shows 85.8% similarity to the human protein and 80.0% to the mouse protein. The CIDEb genomic sequence spans approximately 6 kb and comprises five exons and four introns. Radiation hybrid mapping demonstrated that porcine CIDEb is located at chromosome 7q21, at a distance of 57 cR from the most significantly linked marker, S0334, a region that is syntenic with the corresponding region in the human genome. Tissue expression analysis indicated that normal CIDEb mRNA is ubiquitously expressed in many porcine tissues. It was highly expressed in white adipose tissue and was observed at relatively high levels in the liver, lung, small intestine, lymphatic tissue and brain. The normal version of CIDEb was the predominant form in all tested tissues, whereas the splicing variant was expressed at low levels in all examined tissues except the lymphatic tissue. Furthermore, genetic expression analysis indicated that CIDEb mRNA levels were

  2. Molecular cloning, genomic organization, chromosome mapping, tissues expression pattern and identification of a novel splicing variant of porcine CIDEb gene

    Energy Technology Data Exchange (ETDEWEB)

    Li, YanHua, E-mail: liyanhua.1982@aliyun.com [Ministry of Education Key Laboratory of Child Development and Disorders, Chongqing Key Laboratory of Translational Medical Research in Cognitive Development and Learning and Memory Disorders, China International Science and Technology Cooperation base of Child development and Critical Disorders, Children’s Hospital of Chongqing Medical University, Chongqing 400014 (China); Li, AiHua [Chongqing Cancer Institute & Hospital & Cancer Center, Chongqing 404100 (China); Yang, Z.Q. [Key Laboratory of Agricultural Animal Genetics, Breeding and Reproduction of Ministry of Education, College of Life Science and Technology, Huazhong Agricultural University, Wuhan 430070 (China)

    2016-09-09

    Cell death-inducing DNA fragmentation factor-α-like effector b (CIDEb) is a member of the CIDE family of apoptosis-inducing factors. CIDEa and CIDEc have been reported to be lipid droplet (LD)-associated proteins that promote atypical LD fusion in adipocytes and are responsible for liver steatosis under fasting and obese conditions, whereas CIDEb promotes lipid storage under normal diet conditions [1] and promotes the formation of triacylglyceride-enriched VLDL particles in hepatocytes [2]. Here, we report the gene cloning, chromosome mapping, tissue distribution, genetic expression analysis, and identification of a novel splicing variant of the porcine CIDEb gene. Sequence analysis shows that the open reading frame of the normal porcine CIDEb isoform covers 660 bp and encodes a 219-amino acid polypeptide, whereas its alternative splicing variant encodes a 142-amino acid polypeptide truncated at the fourth exon and comprising the CIDE-N domain and part of the CIDE-C domain. The deduced amino acid sequence of normal porcine CIDEb shows 85.8% similarity to the human protein and 80.0% to the mouse protein. The CIDEb genomic sequence spans approximately 6 kb and comprises five exons and four introns. Radiation hybrid mapping demonstrated that porcine CIDEb is located at chromosome 7q21, at a distance of 57 cR from the most significantly linked marker, S0334, a region that is syntenic with the corresponding region in the human genome. Tissue expression analysis indicated that normal CIDEb mRNA is ubiquitously expressed in many porcine tissues. It was highly expressed in white adipose tissue and was observed at relatively high levels in the liver, lung, small intestine, lymphatic tissue and brain. The normal version of CIDEb was the predominant form in all tested tissues, whereas the splicing variant was expressed at low levels in all examined tissues except the lymphatic tissue. Furthermore, genetic expression analysis indicated that CIDEb mRNA levels were

  3. Contributions to quantum probability

    International Nuclear Information System (INIS)

    Fritz, Tobias

    2010-01-01

    distribution of a quantum-mechanical von Neumann measurement with postselection, given that the scalar product between the initial and the final state is known as well as the success probability of the postselection. An intermediate von Neumann measurement can enhance transition probabilities between states such that the error probability shrinks by a factor of up to 2. Chapter 4: A presentation of the category of stochastic matrices. This chapter gives generators and relations for the strict monoidal category of probabilistic maps on finite cardinals (i.e., stochastic matrices). Chapter 5: Convex Spaces: Definition and Examples. We try to promote convex spaces as an abstract concept of convexity which was introduced by Stone as ''barycentric calculus''. A convex space is a set where one can take convex combinations in a consistent way. By identifying the corresponding Lawvere theory as the category from chapter 4 and using the results obtained there, we give a different proof of a result of Swirszcz which shows that convex spaces can be identified with algebras of a finitary version of the Giry monad. After giving an extensive list of examples of convex sets as they appear throughout mathematics and theoretical physics, we note that there also exist convex spaces that cannot be embedded into a vector space: semilattices are a class of examples of purely combinatorial type. In an information-theoretic interpretation, convex subsets of vector spaces are probabilistic, while semilattices are possibilistic. Convex spaces unify these two concepts. (orig.)

  4. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    finite set can occur as the outcome distribution of a quantum-mechanical von Neumann measurement with postselection, given that the scalar product between the initial and the final state is known as well as the success probability of the postselection. An intermediate von Neumann measurement can enhance transition probabilities between states such that the error probability shrinks by a factor of up to 2. Chapter 4: A presentation of the category of stochastic matrices. This chapter gives generators and relations for the strict monoidal category of probabilistic maps on finite cardinals (i.e., stochastic matrices). Chapter 5: Convex Spaces: Definition and Examples. We try to promote convex spaces as an abstract concept of convexity which was introduced by Stone as ''barycentric calculus''. A convex space is a set where one can take convex combinations in a consistent way. By identifying the corresponding Lawvere theory as the category from chapter 4 and using the results obtained there, we give a different proof of a result of Swirszcz which shows that convex spaces can be identified with algebras of a finitary version of the Giry monad. After giving an extensive list of examples of convex sets as they appear throughout mathematics and theoretical physics, we note that there also exist convex spaces that cannot be embedded into a vector space: semilattices are a class of examples of purely combinatorial type. In an information-theoretic interpretation, convex subsets of vector spaces are probabilistic, while semilattices are possibilistic. Convex spaces unify these two concepts. (orig.)

  5. Evaluation and comparison of cartilage repair tissue of the patella and medial femoral condyle by using morphological MRI and biochemical zonal T2 mapping

    International Nuclear Information System (INIS)

    Welsch, Goetz H.; Mamisch, Tallal C.; Quirbach, Sebastian; Trattnig, Siegfried; Zak, Lukas; Marlovits, Stefan

    2009-01-01

    The objective of this study was to use advanced MR techniques to evaluate and compare cartilage repair tissue after matrix-associated autologous chondrocyte transplantation (MACT) in the patella and medial femoral condyle (MFC). Thirty-four patients treated with MACT underwent 3-T MRI of the knee. Patients were treated on either patella (n = 17) or MFC (n = 17) cartilage and were matched by age and postoperative interval. For morphological evaluation, the MR observation of cartilage repair tissue (MOCART) score was used, with a 3D-True-FISP sequence. For biochemical assessment, T2 mapping was prepared by using a multiecho spin-echo approach with particular attention to the cartilage zonal structure. Statistical evaluation was done by analyses of variance. The MOCART score showed no significant differences between the patella and MFC (p ≥ 0.05). With regard to biochemical T2 relaxation, higher T2 values were found throughout the MFC (p < 0.05). The zonal increase in T2 values from deep to superficial was significant for control cartilage (p < 0.001) and cartilage repair tissue (p < 0.05), with an earlier onset in the repair tissue of the patella. The assessment of cartilage repair tissue of the patella and MFC afforded comparable morphological results, whereas biochemical T2 values showed differences, possibly due to dissimilar biomechanical loading conditions. (orig.)

  6. MO-F-CAMPUS-J-04: Tissue Segmentation-Based MR Electron Density Mapping Method for MR-Only Radiation Treatment Planning of Brain

    Energy Technology Data Exchange (ETDEWEB)

    Yu, H [Sunnybrook Health Sciences Centre, Toronto, Ontario (Canada); Lee, Y [Sunnybrook Odette Cancer Centre, Toronto, Ontario (Canada); Ruschin, M [Odette Cancer Centre, Toronto, ON (Canada); Karam, I [Sunnybrook Odette Cancer Center, Toronto, Ontario (Canada); Sahgal, A [University of Toronto, Toronto, ON (Canada)

    2015-06-15

    Purpose: To automatically derive the electron density of tissues from MR images and generate a pseudo-CT for MR-only treatment planning of brain tumours. Methods: T1-weighted MR images and CT images of 20 stereotactic radiosurgery (SRS) patients were retrospectively acquired. First, a semi-automated tissue segmentation algorithm was developed to differentiate tissues with similar MR intensities and large differences in electron densities. The method started with approximately 12 slices of manually contoured spatial regions containing sinuses and airways; then air, bone, brain, cerebrospinal fluid (CSF) and eyes were automatically segmented using edge detection and anatomical information including location, shape, tissue uniformity and relative intensity distribution. Next, the soft tissues (muscle and fat) were segmented based on their relative intensity histograms. Finally, the intensities of voxels in each segmented tissue were mapped into that tissue's electron density range to generate a pseudo-CT by linearly fitting the relative intensity histograms. Co-registered CT was used as the ground truth. The bone segmentations of the pseudo-CT were compared with those of the co-registered CT obtained by using a 300 HU threshold. The average distances between voxels on the external edges of the skull in the pseudo-CT and CT were calculated in the three axial, coronal and sagittal slices with the largest skull width. The mean absolute electron density (in Hounsfield units) difference of voxels in each segmented tissue was calculated. Results: The average distance between voxels on the external skull of the pseudo-CT and CT was 0.6 ± 1.1 mm (mean ± 1 SD). The mean absolute electron density differences for bone, brain, CSF, muscle and fat were 78 ± 114 HU, 21 ± 8 HU, 14 ± 29 HU, 57 ± 37 HU, and 31 ± 63 HU, respectively. Conclusion: The semi-automated MR electron density mapping technique was developed using T1-weighted MR images. The generated pseudo-CT is comparable to that of CT in terms of anatomical position of
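    The final step described above, in which voxel intensities of each segmented tissue are converted to electron density values, can be illustrated with a simplified per-tissue linear rescaling. The sketch below is a rough stand-in for the histogram-fitting the abstract describes; the HU ranges, label names and function are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Nominal HU ranges per tissue label (illustrative values, not from the abstract)
HU_RANGES = {"air": (-1000, -950), "bone": (200, 1500), "brain": (20, 45),
             "csf": (0, 20), "fat": (-120, -80), "muscle": (30, 60)}

def pseudo_ct(mr, labels):
    """Map MR intensities to HU tissue by tissue, linearly rescaling each
    tissue's intensity range onto its nominal HU range (a simplification of
    the histogram-fitting step described in the abstract)."""
    out = np.full(mr.shape, -1000.0)          # default: air
    for tissue, (hu_lo, hu_hi) in HU_RANGES.items():
        m = labels == tissue
        if not np.any(m):
            continue
        v = mr[m].astype(float)
        lo, hi = v.min(), v.max()
        scale = (v - lo) / (hi - lo) if hi > lo else np.zeros_like(v)
        out[m] = hu_lo + scale * (hu_hi - hu_lo)
    return out

# Toy 2x2 MR slice with a label map
mr = np.array([[100.0, 300.0], [800.0, 120.0]])
labels = np.array([["brain", "brain"], ["bone", "csf"]])
print(pseudo_ct(mr, labels))
```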

  7. On Probability Leakage

    OpenAIRE

    Briggs, William M.

    2012-01-01

    The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.

  8. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.

  9. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  10. Re-evaluation of a novel approach for quantitative myocardial oedema detection by analysing tissue inhomogeneity in acute myocarditis using T2-mapping.

    Science.gov (United States)

    Baeßler, Bettina; Schaarschmidt, Frank; Treutlein, Melanie; Stehning, Christian; Schnackenburg, Bernhard; Michels, Guido; Maintz, David; Bunck, Alexander C

    2017-12-01

    To re-evaluate a recently suggested approach of quantifying myocardial oedema and increased tissue inhomogeneity in myocarditis by T2-mapping. Cardiac magnetic resonance data of 99 patients with myocarditis were retrospectively analysed. Thirty healthy volunteers served as controls. T2-mapping data were acquired at 1.5 T using a gradient-spin-echo T2-mapping sequence. T2-maps were segmented according to the 16-segment AHA model. Segmental T2-values, segmental pixel standard deviation (SD) and the derived parameters maxT2, maxSD and madSD were analysed and compared to the established Lake Louise criteria (LLC). A re-estimation of logistic regression models revealed that all models containing an SD-parameter were superior to any model containing global myocardial T2. Using a combined cut-off of 1.8 ms for madSD + 68 ms for maxT2 resulted in a diagnostic sensitivity of 75% and specificity of 80% and showed a similar diagnostic performance compared to LLC in receiver-operating-curve analyses. Combining madSD, maxT2 and late gadolinium enhancement (LGE) in a model resulted in a superior diagnostic performance compared to LLC (sensitivity 93%, specificity 83%). The results show that the novel T2-mapping-derived parameters exhibit an additional diagnostic value over LGE, with the inherent potential to overcome the current limitations of T2-mapping. • A novel quantitative approach to myocardial oedema imaging in myocarditis was re-evaluated. • The T2-mapping-derived parameters maxT2 and madSD were compared to traditional Lake-Louise criteria. • Using maxT2 and madSD with dedicated cut-offs performs similarly to Lake-Louise criteria. • Adding maxT2 and madSD to LGE results in further increased diagnostic performance. • This novel approach has the potential to overcome the limitations of T2-mapping.
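    A hedged sketch of how the reported cut-offs might be applied to per-segment T2 statistics is given below. The abstract does not define madSD precisely or state whether the combined criterion is conjunctive or disjunctive, so the mean-absolute-deviation reading and the "either threshold exceeded" rule used here are assumptions, as are the function and variable names.

```python
import numpy as np

def myocarditis_flag(segment_t2_means, segment_t2_sds,
                     madsd_cutoff=1.8, maxt2_cutoff=68.0):
    """Apply the T2-mapping parameters described in the abstract to 16 AHA
    segments. maxT2 is taken as the highest segmental mean T2; madSD is read
    here as the mean absolute deviation of the segmental pixel-SD values.
    Both readings, and the combination rule below, are assumptions."""
    means = np.asarray(segment_t2_means, dtype=float)
    sds = np.asarray(segment_t2_sds, dtype=float)
    max_t2 = means.max()
    mad_sd = np.mean(np.abs(sds - sds.mean()))
    positive = (mad_sd >= madsd_cutoff) or (max_t2 >= maxt2_cutoff)
    return positive, max_t2, mad_sd

# Example with 16 synthetic segments (values in ms)
rng = np.random.default_rng(7)
flag, max_t2, mad_sd = myocarditis_flag(rng.normal(60, 3, 16), rng.normal(6, 2, 16))
print(f"positive: {flag}, maxT2 = {max_t2:.1f} ms, madSD = {mad_sd:.2f} ms")
```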

  11. In situ biological dose mapping estimates the radiation burden delivered to 'spared' tissue between synchrotron X-ray microbeam radiotherapy tracks.

    Directory of Open Access Journals (Sweden)

    Kai Rothkamm

    Full Text Available Microbeam radiation therapy (MRT) using high doses of synchrotron X-rays can destroy tumours in animal models whilst causing little damage to normal tissues. Determining the spatial distribution of radiation doses delivered during MRT at a microscopic scale is a major challenge. Film and semiconductor dosimetry as well as Monte Carlo methods struggle to provide accurate estimates of dose profiles and peak-to-valley dose ratios at the position of the targeted and traversed tissues whose biological responses determine treatment outcome. The purpose of this study was to utilise γ-H2AX immunostaining as a biodosimetric tool that enables in situ biological dose mapping within an irradiated tissue to provide direct biological evidence for the scale of the radiation burden to 'spared' tissue regions between MRT tracks. γ-H2AX analysis allowed microbeams to be traced and DNA damage foci to be quantified in valleys between beams following MRT treatment of fibroblast cultures and murine skin, where foci yields per unit dose were approximately five-fold lower than in fibroblast cultures. Foci levels in cells located in valleys were compared with calibration curves using known broadbeam synchrotron X-ray doses to generate spatial dose profiles and calculate peak-to-valley dose ratios of 30-40 for cell cultures and approximately 60 for murine skin, consistent with the range obtained with conventional dosimetry methods. This biological dose mapping approach could find several applications both in optimising MRT or other radiotherapeutic treatments and in estimating localised doses following accidental radiation exposure using skin punch biopsies.
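    Converting γ-H2AX foci counts into dose estimates and a peak-to-valley dose ratio (PVDR) can be sketched as inverting a broad-beam calibration curve, here assumed linear. The slope, background and example foci counts below are placeholders for illustration, not values from the study.

```python
def dose_from_foci(foci_per_cell, slope, background):
    """Invert a linear broad-beam calibration: foci = background + slope * dose."""
    return max((foci_per_cell - background) / slope, 0.0)

def peak_to_valley_dose_ratio(peak_foci, valley_foci, slope, background):
    """PVDR estimated from foci counts measured at peak and valley positions."""
    peak_dose = dose_from_foci(peak_foci, slope, background)
    valley_dose = dose_from_foci(valley_foci, slope, background)
    return peak_dose / valley_dose

# Illustrative numbers only (slope in foci per cell per Gy)
pvdr = peak_to_valley_dose_ratio(peak_foci=25.0, valley_foci=0.9,
                                 slope=0.08, background=0.1)
print(f"estimated PVDR = {pvdr:.0f}")
```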

  12. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

    Amosov, GG; Man'ko

    2004-01-01

    Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the

  13. Flood inundation maps and water-surface profiles for tropical storm Irene and selected annual exceedance probability floods for Flint Brook and the Third Branch White River in Roxbury, Vermont

    Science.gov (United States)

    Ahearn, Elizabeth A.; Lombard, Pamela J.

    2014-01-01

    for the 10-, 2-, 1-, or 0.2-percent annual exceedance probabilities. The simulated water-surface elevations for the August 2011 flood equal the elevations of State Route 12A about 500 ft downstream of Thurston Hill Road adjacent to the troughs between the rearing ponds. Four flood mitigation alternatives being considered by the Vermont Agency of Transportation to improve the hydraulic performance of Flint Brook and reduce the risk of flooding at the hatchery include: (A) no changes to the infrastructure or existing alignment of Flint Brook (existing conditions [2014]), (B) structural changes to the bridges and the existing retaining wall along Flint Brook, (C) realignment of Flint Brook to flow along the south side of Oxbow Road to accommodate larger stream discharges, and (D) a diversion channel for flows greater than the 1-percent annual exceedance probability. Although the 10-, 2-, and 1-percent AEP floods do not flood the hatchery under alternative A (no changes to the infrastructure), the 0.2-percent AEP flow still poses a flooding threat to the hatchery because flow will continue to overtop the existing retaining wall and flood the hatchery. Under the other mitigation alternatives (B, C, and D) that include some variation of structural changes to bridges, a retaining wall, and (or) channel, the peak discharges for the 10-, 2-, 1-, and 0.2-percent annual exceedance probabilities do not flood the hatchery. Water-surface profiles and flood inundation maps of the August 2011 flood and the 10-, 2-, 1-, and 0.2-percent AEPs for four mitigation alternatives were developed for Flint Brook and the Third Branch White River in the vicinity of the hatchery and can be used by Federal, State, and local agencies to better understand the potential for future flooding at the hatchery.

  14. Towards a Categorical Account of Conditional Probability

    Directory of Open Access Journals (Sweden)

    Robert Furber

    2015-11-01

    Full Text Available This paper presents a categorical account of conditional probability, covering both the classical and the quantum case. Classical conditional probabilities are expressed as a certain "triangle-fill-in" condition, connecting marginal and joint probabilities, in the Kleisli category of the distribution monad. The conditional probabilities are induced by a map together with a predicate (the condition). The latter is a predicate in the logic of effect modules on this Kleisli category. This same approach can be transferred to the category of C*-algebras (with positive unital maps), whose predicate logic is also expressed in terms of effect modules. Conditional probabilities can again be expressed via a triangle-fill-in property. In the literature, there are several proposals for what quantum conditional probability should be, and also there are extra difficulties not present in the classical case. At this stage, we only describe quantum systems with classical parametrization.
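    For the classical case, the triangle-fill-in idea reduces to ordinary conditioning of a finite joint distribution on a predicate. The following minimal sketch, using plain dictionaries rather than the distribution monad, illustrates that computation; it is an informal illustration, not the paper's categorical construction.

```python
from collections import defaultdict

def condition(joint, predicate):
    """Classical conditional probability on a finite joint distribution.
    `joint` maps (x, y) pairs to probabilities; `predicate` is the condition
    on x. Returns the conditional distribution over y given the predicate."""
    weight = sum(p for (x, _), p in joint.items() if predicate(x))
    if weight == 0:
        raise ValueError("conditioning on a null event")
    cond = defaultdict(float)
    for (x, y), p in joint.items():
        if predicate(x):
            cond[y] += p / weight
    return dict(cond)

# Two fair coin flips; condition on the first being heads
joint = {("H", "H"): 0.25, ("H", "T"): 0.25, ("T", "H"): 0.25, ("T", "T"): 0.25}
print(condition(joint, lambda x: x == "H"))  # {'H': 0.5, 'T': 0.5}
```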

  15. Mapping of the spatial distribution of silver nanoparticles in root tissues of Vicia faba by laser-induced breakdown spectroscopy (LIBS).

    Science.gov (United States)

    Krajcarová, L; Novotný, K; Kummerová, M; Dubová, J; Gloser, V; Kaiser, J

    2017-10-01

    The manuscript presents a procedure for optimal sample preparation and the mapping of the spatial distribution of metal ions and nanoparticles in plant roots using laser-induced breakdown spectroscopy (LIBS) in a double-pulse configuration (DP LIBS) in orthogonal reheating mode. Two Nd:YAG lasers were used; the first one was an ablation laser (UP-266 MACRO, New Wave, USA) with a wavelength of 266 nm, and the second one (Brilliant, Quantel, France), with a fundamental wavelength of 1064 nm, was used to reheat the microplasma. Seedlings of Vicia faba were cultivated for 7 days in CuSO4 or AgNO3 solutions with a concentration of 10 µmol l⁻¹ or in a solution of silver nanoparticles (AgNPs) with a concentration of 10 µmol l⁻¹ of total Ag, and in distilled water as a control. The total contents of the examined metals in the roots after sample mineralization as well as changes in the concentrations of the metals in the cultivation solutions were monitored by ICP-OES. Root samples embedded in the TissueTek medium and cut into 40 µm thick cross sections using the Cryo-Cut Microtome proved to be best suited for an accurate LIBS analysis with a 50 µm spatial resolution. 2D raster maps of elemental distribution were created for the emission lines of Cu(I) at 324.754 nm and Ag(I) at 328.068 nm. The limits of detection of DP LIBS for the root cross sections were estimated to be 4 pg for Cu, 18 pg for Ag, and 3 pg for AgNPs. The results of Ag spatial distribution mapping indicated that unlike Ag+ ions, AgNPs do not penetrate into the inner tissues of Vicia faba roots but stay in their outermost layers. The content of Ag in roots cultivated in the AgNP solution was one order of magnitude lower compared to roots cultivated in the metal ion solutions. The significantly smaller concentration of Ag in root tissues cultivated in the AgNP solution also supports the conclusion that the absorption and uptake of AgNPs by roots of Vicia faba is very slow. LIBS mapping of root sections

  16. Toward a generalized probability theory: conditional probabilities

    International Nuclear Information System (INIS)

    Cassinelli, G.

    1979-01-01

    The main mathematical object of interest in the quantum logic approach to the foundations of quantum mechanics is the orthomodular lattice and a set of probability measures, or states, defined by the lattice. This mathematical structure is studied per se, independently from the intuitive or physical motivation of its definition, as a generalized probability theory. It is thought that the building-up of such a probability theory could eventually throw light on the mathematical structure of Hilbert-space quantum mechanics as a particular concrete model of the generalized theory. (Auth.)

  17. Mueller-matrix mapping of biological tissues in differential diagnosis of optical anisotropy mechanisms of protein networks

    Energy Technology Data Exchange (ETDEWEB)

    Ushenko, V A; Sidor, M I [Yuriy Fedkovych Chernivtsi National University, Chernivtsi (Ukraine); Marchuk, Yu F; Pashkovskaya, N V; Andreichuk, D R [Bukovinian State Medical University, Chernivtsi (Ukraine)

    2015-03-31

    We report a model of Mueller-matrix description of optical anisotropy of protein networks in biological tissues with allowance for the linear birefringence and dichroism. The model is used to construct the reconstruction algorithms of coordinate distributions of phase shifts and the linear dichroism coefficient. In the statistical analysis of such distributions, we have found the objective criteria of differentiation between benign and malignant tissues of the female reproductive system. From the standpoint of evidence-based medicine, we have determined the operating characteristics (sensitivity, specificity and accuracy) of the Mueller-matrix reconstruction method of optical anisotropy parameters and demonstrated its effectiveness in the differentiation of benign and malignant tumours. (laser applications and other topics in quantum electronics)

  18. A Map of General and Specialized Chromatin Readers in Mouse Tissues Generated by Label-free Interaction Proteomics

    DEFF Research Database (Denmark)

    Eberl, H.C.; Mann, M.; Spruijt, C.G.

    2013-01-01

    Posttranslational modifications on core histones can serve as binding scaffolds for chromatin-associated proteins. Proteins that specifically bind to or "read" these modifications were previously identified in mass spectrometry-based proteomics screens based on stable isotope-labeling in cell lines...... the chromatin interaction landscape in mouse tissues, our workflow can be used for peptides with different modifications and cell types of any organism....

  19. Improved Culture Medium (TiKa) for Mycobacterium avium Subspecies Paratuberculosis (MAP) Matches qPCR Sensitivity and Reveals Significant Proportions of Non-viable MAP in Lymphoid Tissue of Vaccinated MAP Challenged Animals

    DEFF Research Database (Denmark)

    Bull, Tim J.; Munshil, Tulika; Melvang, Heidi Mikkelsen

    2017-01-01

    The quantitative detection of viable pathogen load is an important tool in determining the degree of infection in animals and contamination of foodstuffs. Current conventional culture methods are limited in their ability to determine these levels in Mycobacterium avium subspecies paratuberculosis (MAP) due to slow growth, clumping and low recoverability issues. The principal goal of this study was to evaluate a novel culturing process (TiKa) with a unique ability to stimulate MAP growth from low sample loads and dilutions. We demonstrate it was able to stimulate a mean 29-fold increase... TiKa culture equates well with qPCR and provides important evidence that accuracy in estimating viable MAP load using DNA tests alone may vary significantly between samples of mucosal and lymphatic origin.

  20. Three-dimensional optical micro-angiography maps directional blood perfusion deep within microcirculation tissue beds in vivo

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Ruikang K [Department of Biomedical Engineering, Oregon Health and Science University, Portland, OR 97237 (United States)

    2007-12-07

    Optical micro-angiography (OMAG) is a recently developed method of imaging localized blood perfusion at capillary level resolution within microcirculatory beds. This paper reports that the OMAG is capable of directional blood perfusion mapping in vivo. This is achieved simply by translating the mirror located in the reference arm back and forth while 3D imaging is performed. The mirror which moves toward the incident beam gives the blood perfusion that flows away from the beam direction and vice versa. The approach is experimentally demonstrated by imaging of a flow phantom and then cerebro-vascular perfusion of a live mouse with cranium intact.

  1. Incidence of late rectal bleeding in high-dose conformal radiotherapy of prostate cancer using equivalent uniform dose-based and dose-volume-based normal tissue complication probability models

    International Nuclear Information System (INIS)

    Soehn, Matthias; Yan Di; Liang Jian; Meldolesi, Elisa; Vargas, Carlos; Alber, Markus

    2007-01-01

    Purpose: Accurate modeling of rectal complications based on dose-volume histogram (DVH) data are necessary to allow safe dose escalation in radiotherapy of prostate cancer. We applied different equivalent uniform dose (EUD)-based and dose-volume-based normal tissue complication probability (NTCP) models to rectal wall DVHs and follow-up data for 319 prostate cancer patients to identify the dosimetric factors most predictive for Grade ≥ 2 rectal bleeding. Methods and Materials: Data for 319 patients treated at the William Beaumont Hospital with three-dimensional conformal radiotherapy (3D-CRT) under an adaptive radiotherapy protocol were used for this study. The following models were considered: (1) the Lyman model and (2) the logit formula, each with the DVH reduced to a generalized EUD, (3) the serial reconstruction unit (RU) model, (4) the Poisson-EUD model, and (5) the mean-dose and (6) cutoff-dose logistic regression models. The parameters and their confidence intervals were determined using maximum likelihood estimation. Results: Of the patients, 51 (16.0%) showed Grade 2 or higher bleeding. As assessed qualitatively and quantitatively, the Lyman-EUD, logit-EUD, serial RU, and Poisson-EUD models fitted the data very well. Rectal wall mean dose did not correlate with Grade 2 or higher bleeding. For the cutoff-dose model, the volume receiving > 73.7 Gy showed the most significant correlation with bleeding. However, this model fitted the data more poorly than the EUD-based models. Conclusions: Our study clearly confirms a volume effect for late rectal bleeding. This can be described very well by the EUD-like models, of which the serial RU and Poisson-EUD models can describe the data with only two parameters. Dose-volume-based cutoff-dose models performed worse.
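    The two ingredients named above, a generalized EUD computed from a DVH and a Lyman (probit) NTCP curve, can be sketched as follows. The DVH bins and the parameter values a, TD50 and m are placeholders for illustration, not the fitted values reported in the study.

```python
import numpy as np
from math import erf, sqrt

def generalized_eud(doses, volumes, a):
    """gEUD = (sum_i v_i * D_i^a)^(1/a) for a differential DVH whose
    fractional volumes v_i are normalized to sum to 1."""
    v = np.asarray(volumes, dtype=float)
    v = v / v.sum()
    d = np.asarray(doses, dtype=float)
    return float(np.sum(v * d ** a)) ** (1.0 / a)

def lyman_ntcp(eud, td50, m):
    """Lyman model: NTCP = Phi((EUD - TD50) / (m * TD50)), Phi = normal CDF."""
    t = (eud - td50) / (m * td50)
    return 0.5 * (1.0 + erf(t / sqrt(2.0)))

# Toy rectal-wall DVH (dose bins in Gy, fractional volumes); parameters are
# placeholders, not values from the study.
doses = [10.0, 30.0, 50.0, 65.0, 75.0]
volumes = [0.35, 0.25, 0.20, 0.15, 0.05]
eud = generalized_eud(doses, volumes, a=8.0)
print(f"gEUD = {eud:.1f} Gy, NTCP = {lyman_ntcp(eud, td50=80.0, m=0.15):.2%}")
```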

  2. Modelling the probability of building fires

    Directory of Open Access Journals (Sweden)

    Vojtěch Barták

    2014-12-01

    Full Text Available Systematic spatial risk analysis plays a crucial role in preventing emergencies. In the Czech Republic, risk mapping is currently based on the risk accumulation principle, area vulnerability, and preparedness levels of Integrated Rescue System components. Expert estimates are used to determine risk levels for individual hazard types, while statistical modelling based on data from actual incidents and their possible causes is not used. Our model study, conducted in cooperation with the Fire Rescue Service of the Czech Republic as a model within the Liberec and Hradec Králové regions, presents an analytical procedure leading to the creation of building fire probability maps based on recent incidents in the studied areas and on building parameters. In order to estimate the probability of building fires, a prediction model based on logistic regression was used. Probability of fire calculated by means of model parameters and attributes of specific buildings can subsequently be visualized in probability maps.
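    A minimal sketch of the logistic-regression step is given below, assuming building attributes are available as a feature matrix. The attribute choices, synthetic labels and scikit-learn usage are illustrative assumptions rather than the authors' model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic building attributes: [age_years, floor_area_m2, residential(0/1)]
rng = np.random.default_rng(42)
X = np.column_stack([rng.uniform(0, 120, 500),
                     rng.uniform(50, 5000, 500),
                     rng.integers(0, 2, 500)])
# Synthetic "had a fire" labels, loosely increasing with age and floor area
logits = 0.02 * X[:, 0] + 0.0004 * X[:, 1] - 0.5 * X[:, 2] - 3.0
y = rng.random(500) < 1.0 / (1.0 + np.exp(-logits))

model = LogisticRegression(max_iter=1000).fit(X, y)

# Predicted fire probabilities for two hypothetical buildings, ready to be
# joined back to building footprints for a probability map
new_buildings = np.array([[90.0, 2500.0, 0.0], [10.0, 120.0, 1.0]])
print(model.predict_proba(new_buildings)[:, 1])
```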

  3. Seismic Ground Motion Hazards with 2 Percent Probability

    Data.gov (United States)

    Department of Homeland Security — This map layer shows seismic hazard in the United States. The data represent a model showing the probability that ground motion will reach a certain level. This map...

  4. Seismic Ground Motion Hazards with 10 Percent Probability

    Data.gov (United States)

    Department of Homeland Security — This map layer shows seismic hazard in the United States. The data represent a model showing the probability that ground motion will reach a certain level. This map...

  5. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  6. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned probability zero.

  7. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  8. Volumetric segmentation of ADC maps and utility of standard deviation as measure of tumor heterogeneity in soft tissue tumors.

    Science.gov (United States)

    Singer, Adam D; Pattany, Pradip M; Fayad, Laura M; Tresley, Jonathan; Subhawong, Ty K

    2016-01-01

    Determine interobserver concordance of semiautomated three-dimensional volumetric and two-dimensional manual measurements of apparent diffusion coefficient (ADC) values in soft tissue masses (STMs) and explore standard deviation (SD) as a measure of tumor ADC heterogeneity. Concordance correlation coefficients for mean ADC increased with more extensive sampling. Agreement on the SD of tumor ADC values was better for large regions of interest and multislice methods. Correlation between mean and SD ADC was low, suggesting that these parameters are relatively independent. Mean ADC of STMs can be determined by volumetric quantification with high interobserver agreement. STM heterogeneity merits further investigation as a potential imaging biomarker that complements other functional magnetic resonance imaging parameters. Copyright © 2016 Elsevier Inc. All rights reserved.
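    Once a tumour volume has been segmented on the ADC map, the mean ADC and its standard deviation (the heterogeneity measure discussed above) follow directly. The sketch below assumes a NumPy ADC volume and a boolean mask; the function name and toy numbers are illustrative, not the authors' software.

```python
import numpy as np

def adc_statistics(adc_map, tumor_mask):
    """Mean ADC and its standard deviation over a segmented tumour volume;
    the SD serves as the heterogeneity measure discussed in the abstract."""
    values = np.asarray(adc_map, dtype=float)[np.asarray(tumor_mask, dtype=bool)]
    return values.mean(), values.std(ddof=1)

# Toy 3D ADC volume (x10^-3 mm^2/s) with a small segmented region
adc = np.random.default_rng(1).normal(1.2, 0.3, size=(4, 4, 4))
mask = np.zeros((4, 4, 4), dtype=bool)
mask[1:3, 1:3, 1:3] = True
mean_adc, sd_adc = adc_statistics(adc, mask)
print(f"mean ADC = {mean_adc:.2f}, SD = {sd_adc:.2f}")
```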

  9. Polarized Raman anisotropic response of collagen in tendon: towards 3D orientation mapping of collagen in tissues.

    Directory of Open Access Journals (Sweden)

    Leonardo Galvis

    Full Text Available In this study, polarized Raman spectroscopy (PRS was used to characterize the anisotropic response of the amide I band of collagen as a basis for evaluating three-dimensional collagen fibril orientation in tissues. Firstly, the response was investigated theoretically by applying classical Raman theory to collagen-like peptide crystal structures. The theoretical methodology was then tested experimentally, by measuring amide I intensity anisotropy in rat tail as a function of the orientation of the incident laser polarization. For the theoretical study, several collagen-like triple-helical peptide crystal structures obtained from the Protein Data Bank were rotated "in plane" and "out of plane" to evaluate the role of molecular orientation on the intensity of the amide I band. Collagen-like peptides exhibit a sinusoidal anisotropic response when rotated "in plane" with respect to the polarized incident laser. Maximal intensity was obtained when the polarization of the incident light is perpendicular to the molecule and minimal when parallel. In the case of "out of plane" rotation of the molecular structure a decreased anisotropic response was observed, becoming completely isotropic when the structure was perpendicular to the plane of observation. The theoretical Raman response of collagen was compared to that of alpha helical protein fragments. In contrast to collagen, alpha helices have a maximal signal when incident light is parallel to the molecule and minimal when perpendicular. For out-of-plane molecular orientations alpha-helix structures display a decreased average intensity. Results obtained from experiments on rat tail tendon are in excellent agreement with the theoretical predictions, thus demonstrating the high potential of PRS for experimental evaluation of the three-dimensional orientation of collagen fibers in biological tissues.
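    The in-plane behaviour described above, a two-fold sinusoidal dependence of amide I intensity on the incident polarization angle with the maximum perpendicular to the collagen axis, can be recovered from measurements by a simple least-squares fit. The model, simulated data and parameter names below are assumptions for illustration (SciPy is assumed available); this is not the authors' analysis code.

```python
import numpy as np
from scipy.optimize import curve_fit

def amide_i_model(theta_deg, offset, amplitude, peak_angle_deg):
    """Two-fold sinusoidal anisotropy of the amide I intensity; for collagen
    the maximum occurs with polarization perpendicular to the fiber axis."""
    delta = np.radians(theta_deg - peak_angle_deg)
    return offset + amplitude * np.cos(2.0 * delta)

# Simulated measurements: fiber axis at 30 deg, so the intensity peaks at 120 deg
angles = np.arange(0, 180, 15, dtype=float)
truth = amide_i_model(angles, 1.0, 0.4, 120.0)
noisy = truth + np.random.default_rng(3).normal(0, 0.02, angles.size)

popt, _ = curve_fit(amide_i_model, angles, noisy, p0=(1.0, 0.3, 90.0))
offset, amplitude, angle = popt
if amplitude < 0:                 # cos ambiguity: -cos(2d) == cos(2(d - 90 deg))
    amplitude, angle = -amplitude, angle + 90.0
peak_angle = angle % 180          # polarization angle of maximum amide I intensity
fiber_axis = (peak_angle - 90.0) % 180   # maximum is perpendicular to the fiber
print(f"intensity peak at {peak_angle:.0f} deg, fiber axis near {fiber_axis:.0f} deg")
```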

  10. Full-length cDNA sequences from Rhesus monkey placenta tissue: analysis and utility for comparative mapping

    Directory of Open Access Journals (Sweden)

    Lee Sang-Rae

    2010-07-01

    Full Text Available Background: Rhesus monkeys (Macaca mulatta) are widely used as experimental animals in biomedical research and are closely related to other laboratory macaques, such as cynomolgus monkeys (Macaca fascicularis), and to humans, sharing a last common ancestor from about 25 million years ago. Although rhesus monkeys have been studied extensively under field and laboratory conditions, research has been limited by the lack of genetic resources. The present study generated placenta full-length cDNA libraries, characterized the resulting expressed sequence tags, and described their utility for comparative mapping with human RefSeq mRNA transcripts. Results: From rhesus monkey placenta full-length cDNA libraries, 2000 full-length cDNA sequences were determined and 1835 rhesus placenta cDNA sequences longer than 100 bp were collected. These sequences were annotated based on homology to human genes. Homology search against human RefSeq mRNAs revealed that our collection included the sequences of 1462 putative rhesus monkey genes. Moreover, we identified 207 genes containing exon alterations in the coding region and the untranslated region of rhesus monkey transcripts, despite the highly conserved structure of the coding regions. Approximately 10% (187) of all full-length cDNA sequences did not represent any public human RefSeq mRNAs. Intriguingly, two rhesus monkey specific exons derived from the transposable elements of AluYRa2 (SINE family) and MER11B (LTR family) were also identified. Conclusion: The 1835 rhesus monkey placenta full-length cDNA sequences described here could expand genomic resources and information of rhesus monkeys. This increased genomic information will greatly contribute to the development of evolutionary biology and biomedical research.

  11. IgE-Binding Epitope Mapping and Tissue Localization of the Major American Cockroach Allergen Per a 2.

    Science.gov (United States)

    Lee, Mey Fann; Chang, Chia Wei; Song, Pei Pong; Hwang, Guang Yuh; Lin, Shyh Jye; Chen, Yi Hsing

    2015-07-01

    Cockroaches are the second leading allergen in Taiwan. Sensitization to Per a 2, the major American cockroach allergen, correlates with clinical severity among patients with airway allergy, but there is limited information on IgE epitopes and tissue localization of Per a 2. This study aimed to identify Per a 2 linear IgE-binding epitopes and its distribution in the body of a cockroach. The cDNA of Per a 2 was used as a template and combined with oligonucleotide primers specific to the target areas with appropriate restriction enzyme sites. Eleven overlapping fragments of Per a 2 covering the whole allergen molecule, except 20 residues of signal peptide, were generated by PCR. Mature Per a 2 and overlapping deletion mutants were affinity-purified and assayed for IgE reactivity by immunoblotting. Three synthetic peptides comprising the B cell epitopes were evaluated by direct binding ELISA. Rabbit anti-Per a 2 antibody was used for immunohistochemistry. Human linear IgE-binding epitopes of Per a 2 were located at the amino acid sequences 57-86, 200-211, and 299-309. There was positive IgE binding to 10 tested Per a 2-allergic sera in 3 synthetic peptides, but none in the controls. Immunostaining revealed that Per a 2 was localized partly in the mouth and midgut of the cockroach, with the most intense staining observed in the hindgut, suggesting that the Per a 2 allergen might be excreted through the feces. Information on the IgE-binding epitope of Per a 2 may be used for designing more specific diagnostic and therapeutic approaches to cockroach allergy.
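    Generating the overlapping constructs that underpin this kind of linear epitope mapping is mechanically simple; a sketch is given below. The fragment length, overlap and the stand-in sequence are arbitrary illustrative choices, not the eleven Per a 2 fragments used in the study.

```python
def overlapping_fragments(sequence, length=30, overlap=10):
    """Split a protein sequence into overlapping fragments, mimicking the
    overlapping-construct strategy used for linear epitope mapping.
    Returns (1-based start position, fragment) pairs covering the sequence."""
    step = length - overlap
    fragments = []
    for start in range(0, max(len(sequence) - overlap, 1), step):
        fragments.append((start + 1, sequence[start:start + length]))
    return fragments

# Toy sequence standing in for mature Per a 2 (not the real sequence)
toy = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQAPILSRVGDGTQDNLSGAEKAVQVKVKALPDAQ"
for start, frag in overlapping_fragments(toy, length=20, overlap=8):
    print(f"{start:>3}: {frag}")
```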

  12. Foundations of probability

    International Nuclear Information System (INIS)

    Fraassen, B.C. van

    1979-01-01

    The interpretation of probabilities in physical theories is considered, whether quantum or classical. The following points are discussed: 1) the functions P(μ, Q), in terms of which states and propositions can be represented, are classical (Kolmogoroff) probabilities, formally speaking; 2) these probabilities are generally interpreted as themselves conditional, and the conditions are mutually incompatible where the observables are maximal; and 3) testing of the theory typically takes the form of confronting the expectation values of observable Q calculated with probability measures P(μ, Q) for states μ; hence, of comparing the probabilities P(μ, Q)(E) with the frequencies of occurrence of the corresponding events. It seems that even the interpretation of quantum mechanics, in so far as it concerns what the theory says about the empirical (i.e. actual, observable) phenomena, deals with the confrontation of classical probability measures with observable frequencies. This confrontation is studied. (Auth./C.F.)

  13. The quantum probability calculus

    International Nuclear Information System (INIS)

    Jauch, J.M.

    1976-01-01

    The Wigner anomaly (1932) for the joint distribution of noncompatible observables is an indication that the classical probability calculus is not applicable for quantum probabilities. It should, therefore, be replaced by another, more general calculus, which is specifically adapted to quantal systems. In this article this calculus is exhibited and its mathematical axioms and the definitions of the basic concepts such as probability field, random variable, and expectation values are given. (B.R.H)

  14. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...... probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications....

  15. Probability of satellite collision

    Science.gov (United States)

    Mccarter, J. W.

    1972-01-01

    A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.

  16. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...... probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...

  17. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introductio

  18. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  19. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  20. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...

  1. Janus-faced probability

    CERN Document Server

    Rocchi, Paolo

    2014-01-01

    The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.

  2. Epitope mapping of the U1 small nuclear ribonucleoprotein particle in patients with systemic lupus erythematosus and mixed connective tissue disease.

    Science.gov (United States)

    Somarelli, J A; Mesa, A; Rodriguez, R; Avellan, R; Martinez, L; Zang, Y J; Greidinger, E L; Herrera, R J

    2011-03-01

    Systemic lupus erythematosus (SLE) and mixed connective tissue disease (MCTD) are autoimmune illnesses characterized by the presence of high titers of autoantibodies directed against a wide range of 'self ' antigens. Proteins of the U1 small nuclear ribonucleoprotein particle (U1 snRNP) are among the most immunogenic molecules in patients with SLE and MCTD. The recent release of a crystallized U1 snRNP provides a unique opportunity to evaluate the effects of tertiary and quaternary structures on autoantigenicity within the U1 snRNP. In the present study, an epitope map was created using the U1 snRNP crystal structure. A total of 15 peptides were tested in a cohort of 68 patients with SLE, 29 with MCTD and 26 healthy individuals and mapped onto the U1 snRNP structure. Antigenic sites were detected in a variety of structures and appear to include RNA binding domains, but mostly exclude regions necessary for protein-protein interactions. These data suggest that while some autoantibodies may target U1 snRNP proteins as monomers or apoptosis-induced, protease-digested fragments, others may recognize epitopes on assembled protein subcomplexes of the U1 snRNP. Although nearly all of the peptides are strong predictors of autoimmune illness, none were successful at distinguishing between SLE and MCTD. The antigenicity of some peptides significantly correlated with several clinical symptoms. This investigation implicitly highlights the complexities of autoimmune epitopes, and autoimmune illnesses in general, and demonstrates the variability of antigens in patient populations, all of which contribute to difficult clinical diagnoses.

  3. Cardiac tissue geometry as a determinant of unidirectional conduction block: assessment of microscopic excitation spread by optical mapping in patterned cell cultures and in a computer model.

    Science.gov (United States)

    Fast, V G; Kléber, A G

    1995-05-01

    Unidirectional conduction block (UCB) and reentry may occur as a consequence of an abrupt tissue expansion and a related change in the electrical load. The aim of this study was to evaluate critical dimensions of the tissue necessary for establishing UCB in heart cell culture. Neonatal rat heart cell cultures with cell strands of variable width emerging into a large cell area were grown using a technique of patterned cell growth. Action potential upstrokes were measured using a voltage-sensitive dye (RH-237) and a linear array of 10 photodiodes with a 15 micron resolution. A mathematical model was used to relate action potential wave shapes to underlying ionic currents. UCB (block of a single impulse in the anterograde direction - from a strand to a large area - and conduction in the retrograde direction) occurred in narrow cell strands with a width of 15(SD 4) microns (1-2 cells in width, n = 7), whereas there was no conduction block in strands with a width of 31(8) microns (n = 9). Action potential upstrokes at the site of expansion showed multiple rising phases. Mathematical modelling showed that the two rising phases were caused by electrotonic current flow, whereas local ionic current did not coincide with the rising portions of the upstrokes. (1) High resolution optical mapping shows multiphasic action potential upstrokes at the region of abrupt expansion. At the site of the maximum decrement in conduction, these peaks were largely determined by the electrotonus and not by the local ionic current. (2) Unidirectional conduction block occurred in strands with a width of 15(4) microns (1-2 cells).

  4. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  5. The concept of probability

    International Nuclear Information System (INIS)

    Bitsakis, E.I.; Nicolaides, C.A.

    1989-01-01

    The concept of probability is now, and always has been, central to the debate on the interpretation of quantum mechanics. Furthermore, probability permeates all of science, as well as our everyday life. The papers included in this volume, written by leading proponents of the ideas expressed, embrace a broad spectrum of thought and results: mathematical, physical, epistemological, and experimental, both specific and general. The contributions are arranged in parts under the following headings: Following Schroedinger's thoughts; Probability and quantum mechanics; Aspects of the arguments on nonlocality; Bell's theorem and EPR correlations; Real or Gedanken experiments and their interpretation; Questions about irreversibility and stochasticity; and Epistemology, interpretation and culture. (author). refs.; figs.; tabs

  6. Upgrading Probability via Fractions of Events

    Directory of Open Access Journals (Sweden)

    Frič Roman

    2016-08-01

    Full Text Available The influence of “Grundbegriffe” by A. N. Kolmogorov (published in 1933) on education in the area of probability and its impact on research in stochastics cannot be overestimated. We would like to point out three aspects of the classical probability theory “calling for” an upgrade: (i) classical random events are black-and-white (Boolean); (ii) classical random variables do not model quantum phenomena; (iii) basic maps (probability measures and observables, i.e. dual maps to random variables) have very different “mathematical nature”. Accordingly, we propose an upgraded probability theory based on Łukasiewicz operations (multivalued logic) on events and elementary category theory, covering the classical probability theory as a special case. The upgrade can be compared to replacing calculations with integers by calculations with rational (and real) numbers. Namely, to avoid the three objections, we embed the classical (Boolean) random events (represented by the {0, 1}-valued indicator functions of sets) into upgraded random events (represented by measurable [0, 1]-valued functions), the minimal domain of probability containing “fractions” of classical random events, and we upgrade the notions of probability measure and random variable.
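    A toy numerical illustration of the idea of "fractions" of events (not the authors' categorical construction): a classical event is a {0, 1}-valued indicator, an upgraded event is a [0, 1]-valued function, and in both cases the probability is the expectation of the event function. All names below are assumed:

```python
import numpy as np

# Finite sample space with a probability measure.
omega = np.arange(6)                      # outcomes of a fair die: 0..5
p = np.full(6, 1 / 6)                     # classical probability measure

# Classical (Boolean) event: indicator function with values in {0, 1}.
even = (omega % 2 == 0).astype(float)

# Upgraded ("fractional") event: measurable function with values in [0, 1],
# e.g. a graded notion of "large outcome".
large = np.clip(omega / 5.0, 0.0, 1.0)

# In both cases the probability is the expectation of the event function.
def prob(event):
    return float(np.sum(event * p))

print(prob(even))    # 0.5 -- the classical case is recovered as a special case
print(prob(large))   # graded probability of the fuzzy event
```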

  7. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...

  8. Concepts of probability theory

    CERN Document Server

    Pfeiffer, Paul E

    1979-01-01

    Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, and more. For advanced undergraduate students of science, engineering, or math. Includes problems with answers and six appendixes. 1965 edition.

  9. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...

  10. Probability and Statistical Inference

    OpenAIRE

    Prosper, Harrison B.

    2006-01-01

    These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.

  11. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  12. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...

  13. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton‘s laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world‘s foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their  explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive. 

  14. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

    Full Text Available By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.

  15. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  16. Quantum computing and probability

    International Nuclear Information System (INIS)

    Ferry, David K

    2009-01-01

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction. (viewpoint)

  17. Recursive recovery of Markov transition probabilities from boundary value data

    Energy Technology Data Exchange (ETDEWEB)

    Patch, Sarah Kathyrn [Univ. of California, Berkeley, CA (United States)

    1994-04-01

    In an effort to mathematically describe the anisotropic diffusion of infrared radiation in biological tissue Gruenbaum posed an anisotropic diffusion boundary value problem in 1989. In order to accommodate anisotropy, he discretized the temporal as well as the spatial domain. The probabilistic interpretation of the diffusion equation is retained; radiation is assumed to travel according to a random walk (of sorts). In this random walk the probabilities with which photons change direction depend upon their previous as well as present location. The forward problem gives boundary value data as a function of the Markov transition probabilities. The inverse problem requires finding the transition probabilities from boundary value data. Problems in the plane are studied carefully in this thesis. Consistency conditions amongst the data are derived. These conditions have two effects: they prohibit inversion of the forward map but permit smoothing of noisy data. Next, a recursive algorithm which yields a family of solutions to the inverse problem is detailed. This algorithm takes advantage of all independent data and generates a system of highly nonlinear algebraic equations. Pluecker-Grassmann relations are instrumental in simplifying the equations. The algorithm is used to solve the 4 x 4 problem. Finally, the smallest nontrivial problem in three dimensions, the 2 x 2 x 2 problem, is solved.
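    The thesis itself treats planar problems; as a hedged sketch of what a forward map from transition probabilities to boundary value data looks like in the simplest one-dimensional setting (an absorbing random walk, a simplification not taken from the thesis), one can solve a small linear system for the right-exit probabilities:

```python
import numpy as np

def boundary_data(p_right):
    """Forward map: interior transition probabilities -> boundary exit probabilities.

    A walker on sites 0..n+1 moves right with probability p_right[i] and left with
    1 - p_right[i] from the i-th interior site; sites 0 and n+1 absorb. Returns, for
    each interior starting site, the probability of exiting on the right boundary
    (exit on the left has the complementary probability).
    """
    n = len(p_right)
    A = np.eye(n)
    b = np.zeros(n)
    for i in range(n):
        if i > 0:
            A[i, i - 1] = -(1.0 - p_right[i])   # step to the left interior neighbour
        if i < n - 1:
            A[i, i + 1] = -p_right[i]           # step to the right interior neighbour
        else:
            b[i] = p_right[i]                   # step onto the right (absorbing) boundary
    return np.linalg.solve(A, b)

# Homogeneous walk: exit probabilities grow linearly towards the right boundary.
print(boundary_data(np.array([0.5, 0.5, 0.5, 0.5])))   # [0.2 0.4 0.6 0.8]
```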

  18. Irreversibility and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    The mathematical entropy - unlike physical entropy - is simply a measure of uniformity for probability distributions in general. So understood, conditional entropies have the same logical structure as conditional probabilities. If, as is sometimes supposed, conditional probabilities are time-reversible, then so are conditional entropies and, paradoxically, both then share this symmetry with physical equations of motion. The paradox is, of course, that probabilities yield a direction to time both in statistical mechanics and quantum mechanics, while the equations of motion do not. The supposed time-reversibility of both conditionals seems also to involve a form of retrocausality that is related to, but possibly not the same as, that described by Costa de Beauregard. The retrocausality is paradoxically at odds with the generally presumed irreversibility of the quantum mechanical measurement process. Further paradox emerges if the supposed time-reversibility of the conditionals is linked with the idea that the thermodynamic entropy is the same thing as 'missing information', since this confounds the thermodynamic and mathematical entropies. However, it is shown that irreversibility is a formal consequence of conditional entropies and, hence, of conditional probabilities also. 8 refs. (Author)

  19. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more - these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...

  20. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  1. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability provided the same data used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of ...

  2. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice...... probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications....... The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models....

  3. Probability and stochastic modeling

    CERN Document Server

    Rotar, Vladimir I

    2012-01-01

    Basic Notions; Sample Space and Events; Probabilities; Counting Techniques; Independence and Conditional Probability; Independence; Conditioning; The Borel-Cantelli Theorem; Discrete Random Variables; Random Variables and Vectors; Expected Value; Variance and Other Moments. Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables. The Law of Large Numbers; Conditional Expectation; Generating Functions. Branching Processes. Random Walk Revisited; Branching Processes; Generating Functions; Branching Processes Revisited; More on Random Walk; Markov Chains; Definitions and Examples. Probability Distributions of Markov Chains; The First Step Analysis. Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity; Continuous Random Variables; Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case. Simulation; Distribution F...

  4. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew...... characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look-out, etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds...... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving...

  5. Relaxation-compensated CEST-MRI at 7 T for mapping of creatine content and pH--preliminary application in human muscle tissue in vivo.

    Science.gov (United States)

    Rerich, Eugenia; Zaiss, Moritz; Korzowski, Andreas; Ladd, Mark E; Bachert, Peter

    2015-11-01

    The small biomolecule creatine is involved in energy metabolism. Mapping of the total creatine (mostly PCr and Cr) in vivo has been done with chemical shift imaging. Chemical exchange saturation transfer (CEST) allows an alternative detection of creatine via water MRI. Living tissue exhibits CEST effects from different small metabolites, including creatine, with four exchanging protons of its guanidinium group resonating about 2 ppm from the water peak and hence contributing to the amine proton CEST peak. The intermediate exchange rate (≈ 1000 Hz) of the guanidinium protons requires high RF saturation amplitude B1. However, strong B1 fields also label semi-solid magnetization transfer (MT) effects originating from immobile protons with broad linewidths (~kHz) in the tissue. Recently, it was shown that endogenous CEST contrasts are strongly affected by the MT background as well as by T1 relaxation of the water protons. We show that this influence can be corrected in the acquired CEST data by an inverse metric that yields the apparent exchange-dependent relaxation (AREX). AREX has some useful linearity features that enable preparation of both concentration, and--by using the AREX-ratio of two RF irradiation amplitudes B1--purely exchange-rate-weighted CEST contrasts. These two methods could be verified in phantom experiments with different concentration and pH values, but also varying water relaxation properties. Finally, results from a preliminary application to in vivo CEST imaging data of the human calf muscle before and after exercise are presented. The creatine concentration increases during exercise as expected and as confirmed by (31)P NMR spectroscopic imaging. However, the estimated concentrations obtained by our method were higher than the literature values: cCr,rest=24.5±3.74mM to cCr,ex=38.32±13.05mM. The CEST-based pH method shows a pH decrease during exercise, whereas a slight increase was observed by (31)P NMR spectroscopy. Copyright © 2015
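    In the CEST literature the apparent exchange-dependent relaxation is commonly computed as the inverse-metric difference between the label and reference Z-values scaled by the observed longitudinal relaxation rate, AREX = R1 (1/Z_lab - 1/Z_ref). A minimal sketch of that evaluation and of the B1-ratio weighting follows (variable names and the simple two-offset evaluation are assumptions, not the authors' processing pipeline):

```python
import numpy as np

def arex(z_label, z_reference, t1_obs_s):
    """Apparent exchange-dependent relaxation: R1 * (1/Z_lab - 1/Z_ref).

    z_label     : normalized Z-spectrum value at the CEST resonance (e.g. +2 ppm)
    z_reference : normalized value at the opposite offset (e.g. -2 ppm)
    t1_obs_s    : observed water T1 in seconds
    """
    r1 = 1.0 / t1_obs_s
    return r1 * (1.0 / np.asarray(z_label) - 1.0 / np.asarray(z_reference))

# Exchange-rate weighting via the ratio of AREX at two saturation amplitudes B1
# (placeholder Z-values, not measured data).
arex_low  = arex(z_label=0.80, z_reference=0.90, t1_obs_s=1.4)
arex_high = arex(z_label=0.65, z_reference=0.85, t1_obs_s=1.4)
print(arex_low, arex_high, arex_high / arex_low)
```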

  6. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    2014-01-01

    either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake 'calibrating adjustments' to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments...... that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still...

  7. Introduction to imprecise probabilities

    CERN Document Server

    Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M

    2014-01-01

    In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin

  8. Classic Problems of Probability

    CERN Document Server

    Gorroochurn, Prakash

    2012-01-01

    "A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin

  9. Counterexamples in probability

    CERN Document Server

    Stoyanov, Jordan M

    2013-01-01

    While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.

  10. Epistemology and Probability

    CERN Document Server

    Plotnitsky, Arkady

    2010-01-01

    Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrodinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general

  11. Transition probabilities for atoms

    International Nuclear Information System (INIS)

    Kim, Y.K.

    1980-01-01

    Current status of advanced theoretical methods for transition probabilities for atoms and ions is discussed. An experiment on the f values of the resonance transitions of the Kr and Xe isoelectronic sequences is suggested as a test for the theoretical methods

  12. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...

  13. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  14. Waste Package Misload Probability

    International Nuclear Information System (INIS)

    Knudsen, J.K.

    2001-01-01

    The objective of this calculation is to determine the probability of occurrence for fuel assembly (FA) misloads (i.e., FA placed in the wrong location) and FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in the event. Using this information, a probability of occurrence is calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a.
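    The record gives only the qualitative steps; as an illustration of the final step, the following sketch turns event counts into a per-movement probability with a simple normal-approximation upper bound (the counts are placeholders, not values from the Framatome ANP report):

```python
import math

def misload_probability(misload_events, total_fa_movements, z=1.645):
    """Point estimate and one-sided upper bound for the per-movement misload probability."""
    p_hat = misload_events / total_fa_movements
    # Normal-approximation upper confidence bound (illustrative only).
    upper = p_hat + z * math.sqrt(p_hat * (1.0 - p_hat) / total_fa_movements)
    return p_hat, upper

# Placeholder counts -- not taken from the report.
print(misload_probability(misload_events=3, total_fa_movements=50_000))
```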

  15. Probability theory and applications

    CERN Document Server

    Hsu, Elton P

    1999-01-01

    This volume, with contributions by leading experts in the field, is a collection of lecture notes of the six minicourses given at the IAS/Park City Summer Mathematics Institute. It introduces advanced graduates and researchers in probability theory to several of the currently active research areas in the field. Each course is self-contained with references and contains basic materials and recent results. Topics include interacting particle systems, percolation theory, analysis on path and loop spaces, and mathematical finance. The volume gives a balanced overview of the current status of probability theory. An extensive bibliography for further study and research is included. This unique collection presents several important areas of current research and a valuable survey reflecting the diversity of the field.

  16. Paradoxes in probability theory

    CERN Document Server

    Eckhardt, William

    2013-01-01

    Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory.  Some remain the focus of controversy; others have allegedly been solved, however the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies.  Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.

  17. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  18. Model uncertainty and probability

    International Nuclear Information System (INIS)

    Parry, G.W.

    1994-01-01

    This paper discusses the issue of model uncertainty. The use of probability as a measure of an analyst's uncertainty as well as a means of describing random processes has caused some confusion, even though the two uses are representing different types of uncertainty with respect to modeling a system. The importance of maintaining the distinction between the two types is illustrated with a simple example

  19. Retrocausality and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    Costa de Beauregard has proposed that physical causality be identified with conditional probability. The proposal is shown to be vulnerable on two accounts. The first, though mathematically trivial, seems to be decisive so far as the current formulation of the proposal is concerned. The second lies in a physical inconsistency which seems to have its source in a Copenhagen-like disavowal of realism in quantum mechanics. 6 refs. (Author)

  20. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  1. Probability of causation approach

    International Nuclear Information System (INIS)

    Jose, D.E.

    1988-01-01

    Probability of causation (PC) is sometimes viewed as a great improvement by those persons who are not happy with the present rulings of courts in radiation cases. The author does not share that hope and expects that PC will not play a significant role in these issues for at least the next decade. If it is ever adopted in a legislative compensation scheme, it will be used in a way that is unlikely to please most scientists. Consequently, PC is a false hope for radiation scientists, and its best contribution may well lie in some of the spin-off effects, such as an influence on medical practice

  2. Generalized Probability Functions

    Directory of Open Access Journals (Sweden)

    Alexandre Souto Martinez

    2009-01-01

    Full Text Available From the integration of nonsymmetrical hyperbolas, a one-parameter generalization of the logarithmic function is obtained. Inverting this function, one obtains the generalized exponential function. Motivated by mathematical curiosity, we show that these generalized functions are suitable to generalize some probability density functions (pdfs). A very reliable rank distribution can be conveniently described by the generalized exponential function. Finally, we turn our attention to the generalization of one- and two-tail stretched exponential functions. We obtain, as particular cases, the generalized error function, the Zipf-Mandelbrot pdf, the generalized Gaussian and Laplace pdf. Their cumulative functions and moments were also obtained analytically.

  3. Probability in High Dimension

    Science.gov (United States)

    2014-06-30


  4. Probable maximum flood control

    International Nuclear Information System (INIS)

    DeGabriele, C.E.; Wu, C.L.

    1991-11-01

    This study proposes preliminary design concepts to protect the waste-handling facilities and all shaft and ramp entries to the underground from the probable maximum flood (PMF) in the current design configuration for the proposed Nevada Nuclear Waste Storage Investigation (NNWSI) repository. Protection provisions were furnished by the United States Bureau of Reclamation (USBR) or developed from USBR data. Proposed flood protection provisions include site grading, drainage channels, and diversion dikes. Figures are provided to show these proposed flood protection provisions at each area investigated. These areas are the central surface facilities (including the waste-handling building and waste treatment building), tuff ramp portal, waste ramp portal, men-and-materials shaft, emplacement exhaust shaft, and exploratory shafts facility.

  5. Decomposing the Hounsfield unit: probabilistic segmentation of brain tissue in computed tomography.

    Science.gov (United States)

    Kemmling, A; Wersching, H; Berger, K; Knecht, S; Groden, C; Nölte, I

    2012-03-01

    The aim of this study was to present and evaluate a standardized technique for brain segmentation of cranial computed tomography (CT) using probabilistic partial volume tissue maps based on a database of high resolution T1 magnetic resonance images (MRI). Probabilistic tissue maps of white matter (WM), gray matter (GM) and cerebrospinal fluid (CSF) were derived from 600 normal brain MRIs (3.0 Tesla, T1-3D-turbo-field-echo) of 2 large community-based population studies (BiDirect and SEARCH Health studies). After partial tissue segmentation (FAST 4.0), MR images were linearly registered to MNI-152 standard space (FLIRT 5.5) with non-linear refinement (FNIRT 1.0) to obtain non-binary probabilistic volume images for each tissue class which were subsequently used for CT segmentation. From 150 normal cerebral CT scans a customized reference image in standard space was constructed with iterative non-linear registration to MNI-152 space. The inverse warp of tissue-specific probability maps to CT space (MNI-152 to individual CT) was used to decompose a CT image into tissue specific components (GM, WM, CSF). Potential benefits and utility of this novel approach with regard to unsupervised quantification of CT images and possible visual enhancement are addressed. Illustrative examples of tissue segmentation in different pathological cases including perfusion CT are presented. Automated tissue segmentation of cranial CT images using highly refined tissue probability maps derived from high resolution MR images is feasible. Potential applications include automated quantification of WM in leukoaraiosis, CSF in hydrocephalic patients, GM in neurodegeneration and ischemia and perfusion maps with separate assessment of GM and WM.
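    Once the MR-derived tissue probability maps have been warped into the space of an individual CT (e.g. with the inverse FNIRT warp), the decomposition step reduces to a voxel-wise weighting of the CT image by each tissue probability. A minimal nibabel/numpy sketch of that last step, assuming the warped maps are already available as NIfTI files (file names are placeholders, and this is not the authors' exact pipeline):

```python
import nibabel as nib
import numpy as np

# Probability maps already warped from MNI-152 into the individual CT space;
# file names are placeholders.
ct = nib.load("ct_brain.nii.gz")
maps = {t: nib.load(f"prob_{t}_in_ct_space.nii.gz").get_fdata() for t in ("gm", "wm", "csf")}

ct_data = ct.get_fdata()
total = sum(maps.values())
total[total == 0] = 1.0                      # avoid division by zero outside the brain

voxel_ml = np.prod(ct.header.get_zooms()[:3]) / 1000.0   # voxel volume in millilitres

for tissue, prob in maps.items():
    # Decompose the Hounsfield units into a tissue-specific component.
    component = ct_data * prob / total
    nib.save(nib.Nifti1Image(component, ct.affine, ct.header), f"ct_{tissue}_component.nii.gz")

    # Tissue volume: sum of partial-volume fractions times voxel volume.
    print(tissue, float(prob.sum() * voxel_ml), "ml")
```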

  6. Maps showing predicted probabilities for selected dissolved oxygen and dissolved manganese threshold events in depth zones used by the domestic and public drinking water supply wells, Central Valley, California

    Science.gov (United States)

    Rosecrans, Celia Z.; Nolan, Bernard T.; Gronberg, JoAnn M.

    2018-01-31

    The prediction grids for selected redox constituents—dissolved oxygen and dissolved manganese—are intended to provide an understanding of groundwater-quality conditions at the domestic and public-supply drinking water depths. The chemical quality of groundwater and the fate of many contaminants are influenced by redox processes in all aquifers, and understanding the redox conditions horizontally and vertically is critical in evaluating groundwater quality. The redox condition of groundwater—whether oxic (oxygen present) or anoxic (oxygen absent)—strongly influences the oxidation state of a chemical in groundwater. An anoxic dissolved oxygen threshold was used to classify redox conditions; elevated dissolved manganese, which occurs under anoxic conditions, can make drinking water undesirable with respect to taste, staining, or scaling. Three dissolved manganese thresholds were considered for the domestic and public-supply drinking water wells. The 50 µg/L event threshold represents the secondary maximum contaminant level (SMCL) benchmark for manganese (U.S. Environmental Protection Agency, 2017; California Division of Drinking Water, 2014), whereas the 300 µg/L event threshold represents the U.S. Geological Survey (USGS) health-based screening level (HBSL) benchmark, used to put measured concentrations of drinking-water contaminants into a human-health context (Toccalino and others, 2014). The 150 µg/L event threshold represents one-half the USGS HBSL. The resultant dissolved oxygen and dissolved manganese prediction grids may be of interest to water-resource managers, water-quality researchers, and groundwater modelers concerned with the occurrence of natural and anthropogenic contaminants related to anoxic conditions. Prediction grids for selected redox constituents and thresholds were created by the USGS National Water-Quality Assessment (NAWQA) modeling and mapping team.

  7. Ergodicity of polygonal slap maps

    International Nuclear Information System (INIS)

    Del Magno, Gianluigi; Pedro Gaivão, José; Lopes Dias, João; Duarte, Pedro

    2014-01-01

    Polygonal slap maps are piecewise affine expanding maps of the interval obtained by projecting the sides of a polygon along their normals onto the perimeter of the polygon. These maps arise in the study of polygonal billiards with non-specular reflection laws. We study the absolutely continuous invariant probabilities (acips) of the slap maps for several polygons, including regular polygons and triangles. We also present a general method for constructing polygons with slap maps with more than one ergodic acip. (paper)

  8. The profile of potential organ and tissue donors

    Directory of Open Access Journals (Sweden)

    Edvaldo Leal de Moraes

    2009-10-01

    Full Text Available This study aimed to characterize donors according to gender, age group and cause of brain death; to quantify donors with hypernatremia, hyperpotassemia and hypopotassemia; and to identify which organs were most used in transplantations. This quantitative, descriptive, exploratory and retrospective study was performed at the Organ Procurement Organization of the University of São Paulo Medical School Hospital das Clínicas. Data from the medical records of 187 potential donors were analyzed. Cerebrovascular accidents represented 53.48% of all brain death causes, sodium and potassium disorders occurred in 82.36% of cases, and 45.46% of the potential donors were between 41 and 60 years old. The results evidenced that natural death causes exceeded traumatic deaths, and that most donors presented sodium and potassium alterations, likely associated with inappropriate maintenance.

  9. Mapping out Map Libraries

    Directory of Open Access Journals (Sweden)

    Ferjan Ormeling

    2008-09-01

    Full Text Available Discussing the requirements for map data quality, map users and their library/archives environment, the paper focuses on the metadata the user would need for a correct and efficient interpretation of the map data. For such a correct interpretation, knowledge of the rules and guidelines according to which the topographers/cartographers work (such as the kind of data categories to be collected), and the degree to which these rules and guidelines were indeed followed, are essential. This is not only valid for the old maps stored in our libraries and archives, but perhaps even more so for the new digital files as the format in which we now have to access our geospatial data. As this would be too much to ask from map librarians/curators, some sort of web 2.0 environment is sought where comments about data quality, completeness and up-to-dateness from knowledgeable map users regarding the specific maps or map series studied can be collected and tagged to scanned versions of these maps on the web. In order not to be subject to the same disadvantages as Wikipedia, where the 'communis opinio', rather than scholarship, seems to be decisive, some checking by map curators of this tagged map use information would still be needed. Cooperation between map curators and the International Cartographic Association (ICA) map and spatial data use commission to this end is suggested.

  10. Probability and rational choice

    Directory of Open Access Journals (Sweden)

    David Botting

    2014-05-01

    Full Text Available http://dx.doi.org/10.5007/1808-1711.2014v18n1p1 In this paper I will discuss the rationality of reasoning about the future. There are two things that we might like to know about the future: which hypotheses are true and what will happen next. To put it in philosophical language, I aim to show that there are methods by which inferring to a generalization (selecting a hypothesis) and inferring to the next instance (singular predictive inference) can be shown to be normative and the method itself shown to be rational, where this is due in part to being based on evidence (although not in the same way) and in part on a prior rational choice. I will also argue that these two inferences have been confused, being distinct not only conceptually (as nobody disputes) but also in their results (the value given to the probability of the hypothesis being not in general that given to the next instance), and that methods that are adequate for one are not by themselves adequate for the other. A number of debates over method founder on this confusion and do not show what the debaters think they show.

  11. Mapping of lead, magnesium and copper accumulation in plant tissues by laser-induced breakdown spectroscopy and laser-ablation inductively coupled plasma mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Kaiser, J. [Institute of Physical Engineering, Faculty of Mechanical Engineering, Brno University of Technology, Technicka 2896/2, 616 69 Brno (Czech Republic)], E-mail: kaiser@fme.vutbr.cz; Galiova, M.; Novotny, K.; Cervenka, R. [Department of Chemistry, Faculty of Science, Masaryk University, Kotlarska 2, 611 37 Brno (Czech Republic); Reale, L. [Faculty of Sciences, University of L' Aquila, Via Vetoio (Coppito 1), 67010 L' Aquila (Italy); Novotny, J.; Liska, M.; Samek, O. [Institute of Physical Engineering, Faculty of Mechanical Engineering, Brno University of Technology, Technicka 2896/2, 616 69 Brno (Czech Republic); Kanicky, V.; Hrdlicka, A. [Department of Chemistry, Faculty of Science, Masaryk University, Kotlarska 2, 611 37 Brno (Czech Republic); Stejskal, K.; Adam, V.; Kizek, R. [Department of Chemistry and Biochemistry, Faculty of Agronomy, Mendel University of Agriculture and Forestry, Zemedelska 1, 613 00 Brno (Czech Republic)

    2009-01-15

    Laser-Induced Breakdown Spectroscopy (LIBS) and Laser Ablation Inductively Coupled Plasma Mass Spectrometry (LA-ICP-MS) were utilized for mapping the accumulation of Pb, Mg and Cu with a resolution up to 200 µm in an area of up to cm × cm of sunflower (Helianthus annuus L.) leaves. The results obtained by LIBS and LA-ICP-MS are compared with the outcomes from Atomic Absorption Spectrometry (AAS) and Thin-Layer Chromatography (TLC). It is shown that laser-ablation-based analytical methods can substitute or supplement these techniques, mainly in cases when a fast multi-elemental mapping of a large sample area is needed.

  12. COVAL, Compound Probability Distribution for Function of Probability Distribution

    International Nuclear Information System (INIS)

    Astolfi, M.; Elbaz, J.

    1979-01-01

    1 - Nature of the physical problem solved: Computation of the probability distribution of a function of variables, given the probability distribution of the variables themselves. 'COVAL' has been applied to reliability analysis of a structure subject to random loads. 2 - Method of solution: Numerical transformation of probability distributions

  13. Oil spill contamination probability in the southeastern Levantine basin.

    Science.gov (United States)

    Goldman, Ron; Biton, Eli; Brokovich, Eran; Kark, Salit; Levin, Noam

    2015-02-15

    Recent gas discoveries in the eastern Mediterranean Sea led to multiple operations with substantial economic interest, and with them there is a risk of oil spills and their potential environmental impacts. To examine the potential spatial distribution of this threat, we created seasonal maps of the probability of oil spill pollution reaching an area in the Israeli coastal and exclusive economic zones, given knowledge of its initial sources. We performed simulations of virtual oil spills using realistic atmospheric and oceanic conditions. The resulting maps show dominance of the alongshore northerly current, which causes the high probability areas to be stretched parallel to the coast, increasing contamination probability downstream of source points. The seasonal westerly wind forcing determines how wide the high probability areas are, and may also restrict these to a small coastal region near source points. Seasonal variability in probability distribution, oil state, and pollution time is also discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.
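    Probability maps of this kind can be approximated by counting, over many simulated spill trajectories, the fraction of runs in which each grid cell is contaminated. A minimal numpy sketch under that assumption (the particle-tracking model itself is not reproduced; all names are illustrative):

```python
import numpy as np

def contamination_probability(trajectories, grid_shape):
    """Fraction of simulated spills that reach each grid cell.

    trajectories : list of (n_points, 2) integer arrays of (row, col) cell indices
                   visited by the oil in one simulation
    grid_shape   : (n_rows, n_cols) of the probability map
    """
    counts = np.zeros(grid_shape)
    for traj in trajectories:
        hit = np.zeros(grid_shape, dtype=bool)
        hit[traj[:, 0], traj[:, 1]] = True   # each run counts a cell at most once
        counts += hit
    return counts / len(trajectories)

# Two toy trajectories on a 4 x 4 grid.
t1 = np.array([[0, 0], [1, 0], [2, 1]])
t2 = np.array([[0, 0], [0, 1], [1, 1]])
print(contamination_probability([t1, t2], (4, 4)))
```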

  14. A Tale of Two Probabilities

    Science.gov (United States)

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.

  15. A least squares approach to estimating the probability distribution of unobserved data in multiphoton microscopy

    Science.gov (United States)

    Salama, Paul

    2008-02-01

    Multi-photon microscopy has provided biologists with unprecedented opportunities for high resolution imaging deep into tissues. Unfortunately deep tissue multi-photon microscopy images are in general noisy since they are acquired at low photon counts. To aid in the analysis and segmentation of such images it is sometimes necessary to initially enhance the acquired images. One way to enhance an image is to find the maximum a posteriori (MAP) estimate of each pixel comprising an image, which is achieved by finding a constrained least squares estimate of the unknown distribution. In arriving at the distribution it is assumed that the noise is Poisson distributed, the true but unknown pixel values assume a probability mass function over a finite set of non-negative values, and since the observed data also assumes finite values because of low photon counts, the sum of the probabilities of the observed pixel values (obtained from the histogram of the acquired pixel values) is less than one. Experimental results demonstrate that it is possible to closely estimate the unknown probability mass function with these assumptions.
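    As an illustration of the two stages described, estimating a probability mass function over a finite set of true intensities from the observed low-count histogram and then taking the per-pixel MAP estimate under Poisson noise, here is a hedged sketch using non-negative least squares rather than the paper's exact constrained formulation (all names and numbers are illustrative):

```python
import numpy as np
from scipy.optimize import nnls
from scipy.stats import poisson

def estimate_prior(observed_counts, true_values, max_obs):
    """Least-squares estimate of the pmf of the true (unobserved) pixel values."""
    obs_values = np.arange(max_obs + 1)
    hist = np.bincount(observed_counts, minlength=max_obs + 1)[: max_obs + 1]
    hist = hist / hist.sum()                       # empirical pmf of the observed values
    # A[y, x] = P(observe y | true value x) under Poisson noise.
    A = poisson.pmf(obs_values[:, None], true_values[None, :])
    prior, _ = nnls(A, hist)                       # non-negative least squares
    return prior / prior.sum() if prior.sum() > 0 else prior

def map_estimate(pixel, prior, true_values):
    """Maximum a posteriori estimate of a single pixel's true value."""
    posterior = prior * poisson.pmf(pixel, true_values)
    return true_values[np.argmax(posterior)]

rng = np.random.default_rng(1)
true_values = np.arange(0, 21)                      # finite set of admissible true values
image = rng.poisson(rng.choice([2, 8], size=1000))  # noisy low-count "image" (toy data)
prior = estimate_prior(image, true_values, max_obs=30)
print(map_estimate(5, prior, true_values))
```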

  16. Introduction to probability with R

    CERN Document Server

    Baclawski, Kenneth

    2008-01-01

    FOREWORD PREFACE Sets, Events, and Probability The Algebra of Sets The Bernoulli Sample Space The Algebra of Multisets The Concept of Probability Properties of Probability Measures Independent Events The Bernoulli Process The R Language Finite Processes The Basic Models Counting Rules Computing Factorials The Second Rule of Counting Computing Probabilities Discrete Random Variables The Bernoulli Process: Tossing a Coin The Bernoulli Process: Random Walk Independence and Joint Distributions Expectations The Inclusion-Exclusion Principle General Random Variable

  17. Classical probabilities for Majorana and Weyl spinors

    International Nuclear Information System (INIS)

    Wetterich, C.

    2011-01-01

    Highlights: → Map of classical statistical Ising model to fermionic quantum field theory. → Lattice-regularized real Grassmann functional integral for single Weyl spinor. → Emerging complex structure characteristic for quantum physics. → A classical statistical ensemble describes a quantum theory. - Abstract: We construct a map between the quantum field theory of free Weyl or Majorana fermions and the probability distribution of a classical statistical ensemble for Ising spins or discrete bits. More precisely, a Grassmann functional integral based on a real Grassmann algebra specifies the time evolution of the real wave function q_τ(t) for the Ising states τ. The time-dependent probability distribution of a generalized Ising model obtains as p_τ(t) = q_τ(t)². The functional integral employs a lattice regularization for single Weyl or Majorana spinors. We further introduce the complex structure characteristic for quantum mechanics. Probability distributions of the Ising model which correspond to one or many propagating fermions are discussed explicitly. Expectation values of observables can be computed equivalently in the classical statistical Ising model or in the quantum field theory for fermions.

  18. A first course in probability

    CERN Document Server

    Ross, Sheldon

    2014-01-01

    A First Course in Probability, Ninth Edition, features clear and intuitive explanations of the mathematics of probability theory, outstanding problem sets, and a variety of diverse examples and applications. This book is ideal for an upper-level undergraduate or graduate level introduction to probability for math, science, engineering and business students. It assumes a background in elementary calculus.

  19. Direct mapping of 19F in 19FDG-6P in brain tissue at subcellular resolution using soft X-ray fluorescence

    Science.gov (United States)

    Poitry-Yamate, C.; Gianoncelli, A.; Kourousias, G.; Kaulich, B.; Lepore, M.; Gruetter, R.; Kiskinova, M.

    2013-10-01

    Low energy x-ray fluorescence (LEXRF) detection was optimized for imaging cerebral glucose metabolism by mapping the fluorine LEXRF signal of 19F in 19FDG, trapped as intracellular 19F-deoxyglucose-6-phosphate (19FDG-6P) at 1μm spatial resolution from 3μm thick brain slices. 19FDG metabolism was evaluated in brain structures closely resembling the general cerebral cytoarchitecture following formalin fixation of brain slices and their inclusion in an epon matrix. 2-dimensional distribution maps of 19FDG-6P were placed in a cytoarchitectural and morphological context by simultaneous LEXRF mapping of N and O, and scanning transmission x-ray (STXM) imaging. A disproportionately high uptake and metabolism of glucose was found in neuropil relative to intracellular domains of the cell body of hypothalamic neurons, showing directly that neurons, like glial cells, also metabolize glucose. As 19F-deoxyglucose-6P is structurally identical to 18F-deoxyglucose-6P, LEXRF of subcellular 19F provides a link to in vivo 18FDG PET, forming a novel basis for understanding the physiological mechanisms underlying the 18FDG PET image, and the contribution of neurons and glia to the PET signal.

  20. Direct mapping of 19F in 19FDG-6P in brain tissue at subcellular resolution using soft X-ray fluorescence

    International Nuclear Information System (INIS)

    Poitry-Yamate, C; Lepore, M; Gruetter, R; Gianoncelli, A; Kourousias, G; Kiskinova, M; Kaulich, B

    2013-01-01

    Low energy x-ray fluorescence (LEXRF) detection was optimized for imaging cerebral glucose metabolism by mapping the fluorine LEXRF signal of 19F in 19FDG, trapped as intracellular 19F-deoxyglucose-6-phosphate (19FDG-6P), at 1 μm spatial resolution from 3 μm thick brain slices. 19FDG metabolism was evaluated in brain structures closely resembling the general cerebral cytoarchitecture following formalin fixation of brain slices and their inclusion in an epon matrix. 2-dimensional distribution maps of 19FDG-6P were placed in a cytoarchitectural and morphological context by simultaneous LEXRF mapping of N and O, and scanning transmission x-ray (STXM) imaging. A disproportionately high uptake and metabolism of glucose was found in neuropil relative to intracellular domains of the cell body of hypothalamic neurons, showing directly that neurons, like glial cells, also metabolize glucose. As 19F-deoxyglucose-6P is structurally identical to 18F-deoxyglucose-6P, LEXRF of subcellular 19F provides a link to in vivo 18FDG PET, forming a novel basis for understanding the physiological mechanisms underlying the 18FDG PET image, and the contribution of neurons and glia to the PET signal.

  1. A brief introduction to probability.

    Science.gov (United States)

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

    The theory of probability has been debated for centuries: back in the 1600s, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution", which is a function providing the probabilities of occurrence of the different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, which is the most relevant distribution applied to statistical analysis.
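
    As a concrete illustration of a probability distribution, the probability that a normally distributed measurement falls in a given interval is simply the difference of two values of its cumulative distribution function. The short Python sketch below uses arbitrary, purely illustrative numbers.

```python
from scipy.stats import norm

# Probability that a normally distributed variable (mean 120, SD 15)
# falls between 110 and 140: P(110 < X < 140) = F(140) - F(110).
p = norm.cdf(140, loc=120, scale=15) - norm.cdf(110, loc=120, scale=15)
print(round(p, 3))   # ~0.656

# The familiar "within about 1.96 SDs" interval covers ~95% of the distribution.
print(round(norm.cdf(1.96) - norm.cdf(-1.96), 3))   # ~0.95
```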

  2. Monte Carlo methods to calculate impact probabilities

    Science.gov (United States)

    Rickman, H.; Wiśniowski, T.; Wajer, P.; Gabryszewski, R.; Valsecchi, G. B.

    2014-09-01

    infinity, while the Hill sphere method results in a severely underestimated probability. We provide a discussion of the reasons for these differences, and we finally present the results of the MOID method in the form of probability maps for the Earth and Mars on their current orbits. These maps show a relatively flat probability distribution, except for the occurrence of two ridges found at small inclinations and for coinciding projectile/target perihelion distances. Conclusions: Our results verify the standard formulae in the general case, away from the singularities. In fact, severe shortcomings are limited to the immediate vicinity of those extreme orbits. On the other hand, the new Monte Carlo methods can be used without excessive consumption of computer time, and the MOID method avoids the problems associated with the other methods. Appendices are available in electronic form at http://www.aanda.org

  3. Probable functions of calcium oxalate crystals in different tissues of ...

    African Journals Online (AJOL)

    Representatives of seven major edible aroid accessions were screened for calcium oxalate using standard histochemical methods. All the accessions were noted to contain calcium oxalate in the forms of raphide bundles and intra-amylar crystals. The crystals were widely present in all parts of the plants including spongy ...

  4. On the accuracy and reproducibility of a novel probabilistic atlas-based generation for calculation of head attenuation maps on integrated PET/MR scanners.

    Science.gov (United States)

    Chen, Kevin T; Izquierdo-Garcia, David; Poynton, Clare B; Chonde, Daniel B; Catana, Ciprian

    2017-03-01

    To propose an MR-based method for generating continuous-valued head attenuation maps and to assess its accuracy and reproducibility. Demonstrating that novel MR-based photon attenuation correction methods are both accurate and reproducible is essential prior to using them routinely in research and clinical studies on integrated PET/MR scanners. Continuous-valued linear attenuation coefficient maps ("μ-maps") were generated by combining atlases that provided the prior probability of voxel positions belonging to a certain tissue class (air, soft tissue, or bone) and an MR intensity-based likelihood classifier to produce posterior probability maps of tissue classes. These probabilities were used as weights to generate the μ-maps. The accuracy of this probabilistic atlas-based continuous-valued μ-map ("PAC-map") generation method was assessed by calculating the voxel-wise absolute relative change (RC) between the MR-based and scaled CT-based attenuation-corrected PET images. To assess reproducibility, we performed pair-wise comparisons of the RC values obtained from the PET images reconstructed using the μ-maps generated from the data acquired at three time points. The proposed method produced continuous-valued μ-maps that qualitatively reflected the variable anatomy in patients with brain tumor and agreed well with the scaled CT-based μ-maps. The absolute RC comparing the resulting PET volumes was 1.76 ± 2.33 %, quantitatively demonstrating that the method is accurate. Additionally, we also showed that the method is highly reproducible, the mean RC value for the PET images reconstructed using the μ-maps obtained at the three visits being 0.65 ± 0.95 %. Accurate and highly reproducible continuous-valued head μ-maps can be generated from MR data using a probabilistic atlas-based approach.
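
    The posterior-weighting idea can be sketched as follows: per voxel, the atlas prior over the three tissue classes is multiplied by an intensity likelihood, normalized via Bayes' rule, and the resulting posterior probabilities weight the class attenuation coefficients. In the Python sketch below, the Gaussian likelihood, the intensity statistics and the attenuation values are illustrative assumptions, not the authors' trained classifier or calibrated coefficients.

```python
import numpy as np
from scipy.stats import norm

# Class order: air, soft tissue, bone. Approximate 511 keV linear attenuation
# coefficients (cm^-1), used only for illustration.
MU = np.array([0.0, 0.096, 0.151])

def continuous_mu_map(mr_intensity, atlas_prior, means, stds):
    """Posterior-probability-weighted attenuation map.

    mr_intensity : (n_voxels,) MR intensities
    atlas_prior  : (n_voxels, 3) prior probability of each tissue class per voxel
    means, stds  : per-class parameters of a stand-in Gaussian intensity likelihood
    """
    like = norm.pdf(mr_intensity[:, None], loc=means[None, :], scale=stds[None, :])
    unnorm = atlas_prior * like                       # Bayes numerator
    posterior = unnorm / unnorm.sum(axis=1, keepdims=True)
    return posterior @ MU                             # weighted sum of class mu values

# Toy usage with hypothetical intensity statistics and a flat prior.
rng = np.random.default_rng(2)
intensity = rng.uniform(0, 1000, size=5)
prior = np.full((5, 3), 1.0 / 3.0)
mu = continuous_mu_map(intensity, prior, means=np.array([30.0, 600.0, 250.0]),
                       stds=np.array([20.0, 150.0, 120.0]))
print(mu)
```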

  5. On the accuracy and reproducibility of a novel probabilistic atlas-based generation for calculation of head attenuation maps on integrated PET/MR scanners

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Kevin T. [Massachusetts General Hospital and Harvard Medical School, Athinoula A. Martinos Center for Biomedical Imaging, Department of Radiology, Charlestown, MA (United States); Massachusetts Institute of Technology, Division of Health Sciences and Technology, Cambridge, MA (United States); Izquierdo-Garcia, David; Catana, Ciprian [Massachusetts General Hospital and Harvard Medical School, Athinoula A. Martinos Center for Biomedical Imaging, Department of Radiology, Charlestown, MA (United States); Poynton, Clare B. [Massachusetts General Hospital and Harvard Medical School, Athinoula A. Martinos Center for Biomedical Imaging, Department of Radiology, Charlestown, MA (United States); Massachusetts General Hospital, Department of Psychiatry, Boston, MA (United States); University of California, San Francisco, Department of Radiology and Biomedical Imaging, San Francisco, CA (United States); Chonde, Daniel B. [Massachusetts General Hospital and Harvard Medical School, Athinoula A. Martinos Center for Biomedical Imaging, Department of Radiology, Charlestown, MA (United States); Harvard University, Program in Biophysics, Cambridge, MA (United States)

    2017-03-15

    To propose an MR-based method for generating continuous-valued head attenuation maps and to assess its accuracy and reproducibility. Demonstrating that novel MR-based photon attenuation correction methods are both accurate and reproducible is essential prior to using them routinely in research and clinical studies on integrated PET/MR scanners. Continuous-valued linear attenuation coefficient maps (''μ-maps'') were generated by combining atlases that provided the prior probability of voxel positions belonging to a certain tissue class (air, soft tissue, or bone) and an MR intensity-based likelihood classifier to produce posterior probability maps of tissue classes. These probabilities were used as weights to generate the μ-maps. The accuracy of this probabilistic atlas-based continuous-valued μ-map (''PAC-map'') generation method was assessed by calculating the voxel-wise absolute relative change (RC) between the MR-based and scaled CT-based attenuation-corrected PET images. To assess reproducibility, we performed pair-wise comparisons of the RC values obtained from the PET images reconstructed using the μ-maps generated from the data acquired at three time points. The proposed method produced continuous-valued μ-maps that qualitatively reflected the variable anatomy in patients with brain tumor and agreed well with the scaled CT-based μ-maps. The absolute RC comparing the resulting PET volumes was 1.76 ± 2.33 %, quantitatively demonstrating that the method is accurate. Additionally, we also showed that the method is highly reproducible, the mean RC value for the PET images reconstructed using the μ-maps obtained at the three visits being 0.65 ± 0.95 %. Accurate and highly reproducible continuous-valued head μ-maps can be generated from MR data using a probabilistic atlas-based approach. (orig.)

  6. Propensity, Probability, and Quantum Theory

    Science.gov (United States)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  7. Converting dose distributions into tumour control probability

    International Nuclear Information System (INIS)

    Nahum, A.E.

    1996-01-01

    The endpoints in radiotherapy that are truly of relevance are not dose distributions but the probability of local control, sometimes known as the Tumour Control Probability (TCP) and the Probability of Normal Tissue Complications (NTCP). A model for the estimation of TCP based on simple radiobiological considerations is described. It is shown that incorporation of inter-patient heterogeneity into the radiosensitivity parameter a through s a can result in a clinically realistic slope for the dose-response curve. The model is applied to inhomogeneous target dose distributions in order to demonstrate the relationship between dose uniformity and s a . The consequences of varying clonogenic density are also explored. Finally the model is applied to the target-volume DVHs for patients in a clinical trial of conformal pelvic radiotherapy; the effect of dose inhomogeneities on distributions of TCP are shown as well as the potential benefits of customizing the target dose according to normal-tissue DVHs. (author). 37 refs, 9 figs
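
    A minimal sketch of such a TCP calculation, assuming the common Poisson form TCP(α) = exp(-Σ_j N_j exp(-α D_j)) and a normal distribution of the radiosensitivity α across patients, is given below in Python; the clonogen density, DVH and α parameters are placeholders rather than values from the paper.

```python
import numpy as np

def tcp_population(dose_bins, volume_fraction, clonogen_density, volume_cm3,
                   alpha_mean, alpha_sd, n_samples=10000, rng=None):
    """Population-averaged tumour control probability for a dose-volume histogram.

    For a given radiosensitivity alpha, the Poisson model gives
        TCP(alpha) = exp(-sum_j N_j * exp(-alpha * D_j)),
    where N_j is the number of clonogens receiving dose D_j. Inter-patient
    heterogeneity is modelled by averaging TCP(alpha) over a normal
    distribution of alpha (negative draws are clipped to zero).
    """
    rng = rng or np.random.default_rng(0)
    clonogens = clonogen_density * volume_cm3 * np.asarray(volume_fraction)
    alphas = np.clip(rng.normal(alpha_mean, alpha_sd, n_samples), 0.0, None)
    surviving = clonogens[None, :] * np.exp(-alphas[:, None] * np.asarray(dose_bins)[None, :])
    tcp_per_patient = np.exp(-surviving.sum(axis=1))
    return tcp_per_patient.mean()

# Toy DVH: 90% of a 100 cm^3 target at 64 Gy, 10% at 58 Gy (placeholder values).
print(tcp_population(dose_bins=[64.0, 58.0], volume_fraction=[0.9, 0.1],
                     clonogen_density=1e7, volume_cm3=100.0,
                     alpha_mean=0.3, alpha_sd=0.07))
```

    Averaging over the sampled α values is what flattens the dose-response curve: a single α gives a near-step response, while the population spread reproduces the clinically observed shallow slope.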

  8. Converting dose distributions into tumour control probability

    Energy Technology Data Exchange (ETDEWEB)

    Nahum, A E [The Royal Marsden Hospital, London (United Kingdom). Joint Dept. of Physics

    1996-08-01

    The endpoints in radiotherapy that are truly of relevance are not dose distributions but the probability of local control, sometimes known as the Tumour Control Probability (TCP) and the Probability of Normal Tissue Complications (NTCP). A model for the estimation of TCP based on simple radiobiological considerations is described. It is shown that incorporation of inter-patient heterogeneity into the radiosensitivity parameter a through s{sub a} can result in a clinically realistic slope for the dose-response curve. The model is applied to inhomogeneous target dose distributions in order to demonstrate the relationship between dose uniformity and s{sub a}. The consequences of varying clonogenic density are also explored. Finally the model is applied to the target-volume DVHs for patients in a clinical trial of conformal pelvic radiotherapy; the effect of dose inhomogeneities on distributions of TCP are shown as well as the potential benefits of customizing the target dose according to normal-tissue DVHs. (author). 37 refs, 9 figs.

  9. Probability Maps for the Visualization of Assimilation Ensemble Flow Data

    KAUST Repository

    Hollt, Thomas; Hadwiger, Markus; Knio, Omar; Hoteit, Ibrahim

    2015-01-01

    resampling, every member can follow up on any of the members before resampling. Tracking behavior over time, such as all possible paths of a particle in an ensemble vector field, becomes very difficult, as the number of combinations rises exponentially

  10. A (simplified) Bluetooth Maximum A posteriori Probability (MAP) receiver

    NARCIS (Netherlands)

    Schiphorst, Roelof; Hoeksema, F.W.; Slump, Cornelis H.

    2003-01-01

    In our software-defined radio project we aim at combining two standards: Bluetooth and HiperLAN/2. The HiperLAN/2 receiver requires the most computation power in comparison with Bluetooth. We choose to use this computational power also for Bluetooth and look for more advanced demodulation

  11. Determining and Mapping the Probability of Aquatic Plant Colonization

    National Research Council Canada - National Science Library

    Balard, Jerrell

    1999-01-01

    ... at the Waterways Experiment Station. It is principally intended to be a forum whereby information pertaining to and resulting from the Corps of Engineers' nationwide Aquatic Plant Control Research Program (APCRP...

  12. Correlative two-photon and serial block face scanning electron microscopy in neuronal tissue using 3D near-infrared branding maps.

    Science.gov (United States)

    Lees, Robert M; Peddie, Christopher J; Collinson, Lucy M; Ashby, Michael C; Verkade, Paul

    2017-01-01

    Linking cellular structure and function has always been a key goal of microscopy, but obtaining high resolution spatial and temporal information from the same specimen is a fundamental challenge. Two-photon (2P) microscopy allows imaging deep inside intact tissue, bringing great insight into the structural and functional dynamics of cells in their physiological environment. At the nanoscale, the complex ultrastructure of a cell's environment in tissue can be reconstructed in three dimensions (3D) using serial block face scanning electron microscopy (SBF-SEM). This provides a snapshot of high resolution structural information pertaining to the shape, organization, and localization of multiple subcellular structures at the same time. The pairing of these two imaging modalities in the same specimen provides key information to relate cellular dynamics to the ultrastructural environment. Until recently, approaches to relocate a region of interest (ROI) in tissue from 2P microscopy for SBF-SEM have been inefficient or unreliable. However, near-infrared branding (NIRB) overcomes this by using the laser from a multiphoton microscope to create fiducial markers for accurate correlation of 2P and electron microscopy (EM) imaging volumes. The process is quick and can be user defined for each sample. Here, to increase the efficiency of ROI relocation, multiple NIRB marks are used in 3D to target ultramicrotomy. A workflow is described and discussed to obtain a data set for 3D correlated light and electron microscopy, using three different preparations of brain tissue as examples. Copyright © 2017 Elsevier Inc. All rights reserved.

  13. Prediction and probability in sciences

    International Nuclear Information System (INIS)

    Klein, E.; Sacquin, Y.

    1998-01-01

    This book reports the 7 presentations made at the third meeting 'physics and fundamental questions', whose theme was probability and prediction. The concept of probability, invented to apprehend random phenomena, has become an important branch of mathematics, and its applications range from radioactivity to species evolution, via cosmology and the management of very small risks. The notion of probability underlies quantum mechanics and is thus bound to the very nature of matter. The 7 topics are: - radioactivity and probability, - statistical and quantum fluctuations, - quantum mechanics as a generalized probability theory, - probability and the irrational efficiency of mathematics, - can we foresee the future of the universe?, - chance, eventuality and necessity in biology, - how to manage weak risks? (A.C.)

  14. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  15. Poisson Processes in Free Probability

    OpenAIRE

    An, Guimei; Gao, Mingchu

    2015-01-01

    We prove a multidimensional Poisson limit theorem in free probability, and define joint free Poisson distributions in a non-commutative probability space. We define (compound) free Poisson processes explicitly, similar to the definitions of (compound) Poisson processes in classical probability. We prove that the sum of finitely many freely independent compound free Poisson processes is a compound free Poisson process. We give a step by step procedure for constructing a (compound) free Poisso...

  16. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    Science.gov (United States)

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  17. Probability inequalities for decomposition integrals

    Czech Academy of Sciences Publication Activity Database

    Agahi, H.; Mesiar, Radko

    2017-01-01

    Vol. 315, No. 1 (2017), pp. 240-248, ISSN 0377-0427. Institutional support: RVO:67985556. Keywords: Decomposition integral * Superdecomposition integral * Probability inequalities. Subject RIV: BA - General Mathematics. OECD field: Statistics and probability. Impact factor: 1.357, year: 2016. http://library.utia.cas.cz/separaty/2017/E/mesiar-0470959.pdf

  18. Expected utility with lower probabilities

    DEFF Research Database (Denmark)

    Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1994-01-01

    An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory...

  19. Invariant probabilities of transition functions

    CERN Document Server

    Zaharopol, Radu

    2014-01-01

    The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...

  20. Introduction to probability with Mathematica

    CERN Document Server

    Hastings, Kevin J

    2009-01-01

    Discrete Probability The Cast of Characters Properties of Probability Simulation Random Sampling Conditional Probability Independence Discrete Distributions Discrete Random Variables, Distributions, and Expectations Bernoulli and Binomial Random Variables Geometric and Negative Binomial Random Variables Poisson Distribution Joint, Marginal, and Conditional Distributions More on Expectation Continuous Probability From the Finite to the (Very) Infinite Continuous Random Variables and Distributions Continuous Expectation Continuous Distributions The Normal Distribution Bivariate Normal Distribution New Random Variables from Old Order Statistics Gamma Distributions Chi-Square, Student's t, and F-Distributions Transformations of Normal Random Variables Asymptotic Theory Strong and Weak Laws of Large Numbers Central Limit Theorem Stochastic Processes and Applications Markov Chains Poisson Processes Queues Brownian Motion Financial Mathematics Appendix Introduction to Mathematica Glossary of Mathematica Commands for Probability Short Answers...

  1. Linear positivity and virtual probability

    International Nuclear Information System (INIS)

    Hartle, James B.

    2004-01-01

    We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. The objective of any quantum theory of a closed system, most generally the universe, is the prediction of probabilities for the individual members of sets of alternative coarse-grained histories of the system. Quantum interference between members of a set of alternative histories is an obstacle to assigning probabilities that are consistent with the rules of probability theory. A quantum theory of closed systems therefore requires two elements: (1) a condition specifying which sets of histories may be assigned probabilities and (2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to include values outside the range of 0-1 is described. Alternatives with such virtual probabilities cannot be measured or recorded, but can be used in the intermediate steps of calculations of real probabilities. Extended probabilities give a simple and general way of formulating quantum theory. The various decoherence conditions are compared in terms of their utility for characterizing classicality and the role they might play in further generalizations of quantum mechanics

  2. The probability and severity of decompression sickness

    Science.gov (United States)

    Hada, Ethan A.; Vann, Richard D.; Denoble, Petar J.

    2017-01-01

    Decompression sickness (DCS), which is caused by inert gas bubbles in tissues, is an injury of concern for scuba divers, compressed air workers, astronauts, and aviators. Case reports for 3322 air and N2-O2 dives, resulting in 190 DCS events, were retrospectively analyzed and the outcomes were scored as (1) serious neurological, (2) cardiopulmonary, (3) mild neurological, (4) pain, (5) lymphatic or skin, and (6) constitutional or nonspecific manifestations. Following standard U.S. Navy medical definitions, the data were grouped into mild—Type I (manifestations 4–6)–and serious–Type II (manifestations 1–3). Additionally, we considered an alternative grouping of mild–Type A (manifestations 3–6)–and serious–Type B (manifestations 1 and 2). The current U.S. Navy guidance allows for a 2% probability of mild DCS and a 0.1% probability of serious DCS. We developed a hierarchical trinomial (3-state) probabilistic DCS model that simultaneously predicts the probability of mild and serious DCS given a dive exposure. Both the Type I/II and Type A/B discriminations of mild and serious DCS resulted in a highly significant (p probability of ‘mild’ DCS resulted in a longer allowable bottom time for the same 2% limit. However, for the 0.1% serious DCS limit, we found a vastly decreased allowable bottom dive time for all dive depths. If the Type A/B scoring was assigned to outcome severity, the no decompression limits (NDL) for air dives were still controlled by the acceptable serious DCS risk limit rather than the acceptable mild DCS risk limit. However, in this case, longer NDL limits were allowed than with the Type I/II scoring. The trinomial model mild and serious probabilities agree reasonably well with the current air NDL only with the Type A/B scoring and when 0.2% risk of serious DCS is allowed. PMID:28296928

  3. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Summary Background Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
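
    A minimal sketch of the idea, assuming scikit-learn is available: a regression forest fitted to a 0/1 response estimates E[Y | x] = P(Y = 1 | x) directly, so its predictions are individual probabilities. The synthetic data and tuning values below are illustrative only and are not taken from the paper.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Synthetic binary outcome whose true probability depends smoothly on two features.
rng = np.random.default_rng(3)
X = rng.uniform(-2, 2, size=(2000, 2))
p_true = 1.0 / (1.0 + np.exp(-(1.5 * X[:, 0] - X[:, 1])))
y = rng.binomial(1, p_true)

# A regression forest on the 0/1 response estimates E[Y | x] = P(Y = 1 | x) directly.
forest = RandomForestRegressor(n_estimators=500, min_samples_leaf=20, random_state=0)
forest.fit(X, y)

X_new = np.array([[1.0, 0.0], [-1.0, 1.0]])
print(forest.predict(X_new))                                       # estimated probabilities
print(1.0 / (1.0 + np.exp(-(1.5 * X_new[:, 0] - X_new[:, 1]))))    # true values for comparison
```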

  4. Probable Inference and Quantum Mechanics

    International Nuclear Information System (INIS)

    Grandy, W. T. Jr.

    2009-01-01

    In its current very successful interpretation the quantum theory is fundamentally statistical in nature. Although commonly viewed as a probability amplitude whose (complex) square is a probability, the wavefunction or state vector continues to defy consensus as to its exact meaning, primarily because it is not a physical observable. Rather than approach this problem directly, it is suggested that it is first necessary to clarify the precise role of probability theory in quantum mechanics, either as applied to, or as an intrinsic part of the quantum theory. When all is said and done the unsurprising conclusion is that quantum mechanics does not constitute a logic and probability unto itself, but adheres to the long-established rules of classical probability theory while providing a means within itself for calculating the relevant probabilities. In addition, the wavefunction is seen to be a description of the quantum state assigned by an observer based on definite information, such that the same state must be assigned by any other observer based on the same information, in much the same way that probabilities are assigned.

  5. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision-maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
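
    The effect can be checked with a small simulation: if the threshold is set at the estimated 99th percentile of a lognormal risk factor, the expected frequency of exceedances turns out larger than the nominal 1%. The Python sketch below uses illustrative sample sizes and parameters chosen for this note, not figures from the article.

```python
import numpy as np

rng = np.random.default_rng(4)
nominal_eps = 0.01      # target failure probability
n_data = 50             # sample size used to estimate the parameters
n_trials = 20000

failures = 0
for _ in range(n_trials):
    # True (unknown) lognormal risk factor.
    sample = rng.lognormal(mean=0.0, sigma=1.0, size=n_data)
    # Estimate parameters from the data and set the control threshold at the
    # estimated 99th percentile.
    mu_hat, sigma_hat = np.log(sample).mean(), np.log(sample).std(ddof=1)
    threshold = np.exp(mu_hat + sigma_hat * 2.3263)   # z_{0.99}
    # A future realisation fails if it exceeds the threshold.
    failures += rng.lognormal(0.0, 1.0) > threshold

print(failures / n_trials)   # typically noticeably above the nominal 0.01
```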

  6. Probability with applications and R

    CERN Document Server

    Dobrow, Robert P

    2013-01-01

    An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c

  7. A philosophical essay on probabilities

    CERN Document Server

    Laplace, Marquis de

    1996-01-01

    A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application

  8. Logic, probability, and human reasoning.

    Science.gov (United States)

    Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P

    2015-04-01

    This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction. Copyright © 2015 Elsevier Ltd. All rights reserved.

  9. Free probability and random matrices

    CERN Document Server

    Mingo, James A

    2017-01-01

    This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.

  10. Introduction to probability and measure

    CERN Document Server

    Parthasarathy, K R

    2005-01-01

    According to a remark attributed to Mark Kac 'Probability Theory is a measure theory with a soul'. This book with its choice of proofs, remarks, examples and exercises has been prepared taking both these aesthetic and practical aspects into account.

  11. Joint probabilities and quantum cognition

    International Nuclear Information System (INIS)

    Acacio de Barros, J.

    2012-01-01

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.
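
    A standard textbook illustration of such an obstruction (not the paper's neural-oscillator construction) uses three ±1 random variables: for any joint distribution, E[AB] + E[BC] + E[AC] ≥ -1, so pairwise perfect anticorrelation, which would require the sum to be -3, admits no joint distribution. The short Python check below enumerates the bound.

```python
from itertools import product

# For any joint assignment of three +/-1 variables A, B, C, the quantity
# AB + BC + AC is bounded below by -1, so expectations satisfy
# E[AB] + E[BC] + E[AC] >= -1 whenever a joint distribution exists.
print(min(a * b + b * c + a * c for a, b, c in product([-1, 1], repeat=3)))   # -1

# Pairwise perfect anticorrelation would require E[AB] = E[BC] = E[AC] = -1,
# giving a sum of -3 < -1: no joint distribution can produce these moments.
```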

  12. Joint probabilities and quantum cognition

    Energy Technology Data Exchange (ETDEWEB)

    Acacio de Barros, J. [Liberal Studies, 1600 Holloway Ave., San Francisco State University, San Francisco, CA 94132 (United States)

    2012-12-18

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  13. Default probabilities and default correlations

    OpenAIRE

    Erlenmaier, Ulrich; Gersbach, Hans

    2001-01-01

    Starting from the Merton framework for firm defaults, we provide the analytics and robustness of the relationship between default probabilities and default correlations. We show that loans with higher default probabilities will not only have higher variances but also higher correlations between loans. As a consequence, portfolio standard deviation can increase substantially when loan default probabilities rise. This result has two important implications. First, relative prices of loans with different default probabili...

  14. The Probabilities of Unique Events

    Science.gov (United States)

    2012-08-30

    Max Lotstein and Phil Johnson-Laird, Department of Psychology, Princeton University, Princeton, NJ, USA, August 30th 2012. ...social justice and also participated in antinuclear demonstrations. The participants ranked the probability that Linda is a feminist bank teller as... retorted that such a flagrant violation of the probability calculus was a result of a psychological experiment that obscured the rationality of the

  15. Probability Matching, Fast and Slow

    OpenAIRE

    Koehler, Derek J.; James, Greta

    2014-01-01

    A prominent point of contention among researchers regarding the interpretation of probability-matching behavior is whether it represents a cognitively sophisticated, adaptive response to the inherent uncertainty of the tasks or settings in which it is observed, or whether instead it represents a fundamental shortcoming in the heuristics that support and guide human decision making. Put crudely, researchers disagree on whether probability matching is "smart" or "dumb." Here, we consider eviden...

  16. The MAP kinase ERK and its scaffold protein MP1 interact with the chromatin regulator Corto during Drosophila wing tissue development

    Science.gov (United States)

    2011-01-01

    Background Mitogen-activated protein kinase (MAPK) cascades (p38, JNK, ERK pathways) are involved in cell fate acquisition during development. These kinase modules are associated with scaffold proteins that control their activity. In Drosophila, dMP1, that encodes an ERK scaffold protein, regulates ERK signaling during wing development and contributes to intervein and vein cell differentiation. Functional relationships during wing development between a chromatin regulator, the Enhancer of Trithorax and Polycomb Corto, ERK and its scaffold protein dMP1, are examined here. Results Genetic interactions show that corto and dMP1 act together to antagonize rolled (which encodes ERK) in the future intervein cells, thus promoting intervein fate. Although Corto, ERK and dMP1 are present in both cytoplasmic and nucleus compartments, they interact exclusively in nucleus extracts. Furthermore, Corto, ERK and dMP1 co-localize on several sites on polytene chromosomes, suggesting that they regulate gene expression directly on chromatin. Finally, Corto is phosphorylated. Interestingly, its phosphorylation pattern differs between cytoplasm and nucleus and changes upon ERK activation. Conclusions Our data therefore suggest that the Enhancer of Trithorax and Polycomb Corto could participate in regulating vein and intervein genes during wing tissue development in response to ERK signaling. PMID:21401930

  17. The MAP kinase ERK and its scaffold protein MP1 interact with the chromatin regulator Corto during Drosophila wing tissue development.

    Science.gov (United States)

    Mouchel-Vielh, Emmanuèle; Rougeot, Julien; Decoville, Martine; Peronnet, Frédérique

    2011-03-14

    Mitogen-activated protein kinase (MAPK) cascades (p38, JNK, ERK pathways) are involved in cell fate acquisition during development. These kinase modules are associated with scaffold proteins that control their activity. In Drosophila, dMP1, that encodes an ERK scaffold protein, regulates ERK signaling during wing development and contributes to intervein and vein cell differentiation. Functional relationships during wing development between a chromatin regulator, the Enhancer of Trithorax and Polycomb Corto, ERK and its scaffold protein dMP1, are examined here. Genetic interactions show that corto and dMP1 act together to antagonize rolled (which encodes ERK) in the future intervein cells, thus promoting intervein fate. Although Corto, ERK and dMP1 are present in both cytoplasmic and nucleus compartments, they interact exclusively in nucleus extracts. Furthermore, Corto, ERK and dMP1 co-localize on several sites on polytene chromosomes, suggesting that they regulate gene expression directly on chromatin. Finally, Corto is phosphorylated. Interestingly, its phosphorylation pattern differs between cytoplasm and nucleus and changes upon ERK activation. Our data therefore suggest that the Enhancer of Trithorax and Polycomb Corto could participate in regulating vein and intervein genes during wing tissue development in response to ERK signaling.

  18. The MAP kinase ERK and its scaffold protein MP1 interact with the chromatin regulator Corto during Drosophila wing tissue development

    Directory of Open Access Journals (Sweden)

    Peronnet Frédérique

    2011-03-01

    Abstract Background Mitogen-activated protein kinase (MAPK) cascades (p38, JNK, ERK pathways) are involved in cell fate acquisition during development. These kinase modules are associated with scaffold proteins that control their activity. In Drosophila, dMP1, that encodes an ERK scaffold protein, regulates ERK signaling during wing development and contributes to intervein and vein cell differentiation. Functional relationships during wing development between a chromatin regulator, the Enhancer of Trithorax and Polycomb Corto, ERK and its scaffold protein dMP1, are examined here. Results Genetic interactions show that corto and dMP1 act together to antagonize rolled (which encodes ERK) in the future intervein cells, thus promoting intervein fate. Although Corto, ERK and dMP1 are present in both cytoplasmic and nucleus compartments, they interact exclusively in nucleus extracts. Furthermore, Corto, ERK and dMP1 co-localize on several sites on polytene chromosomes, suggesting that they regulate gene expression directly on chromatin. Finally, Corto is phosphorylated. Interestingly, its phosphorylation pattern differs between cytoplasm and nucleus and changes upon ERK activation. Conclusions Our data therefore suggest that the Enhancer of Trithorax and Polycomb Corto could participate in regulating vein and intervein genes during wing tissue development in response to ERK signaling.

  19. Mapping microscopic order in plant and mammalian cells and tissues: novel differential polarization attachment for new generation confocal microscopes (DP-LSM)

    Science.gov (United States)

    Steinbach, G.; Pawlak, K.; Pomozi, I.; Tóth, E. A.; Molnár, A.; Matkó, J.; Garab, G.

    2014-03-01

    Elucidation of the molecular architecture of complex, highly organized molecular macro-assemblies is an important, basic task for biology. Differential polarization (DP) measurements, such as linear (LD) and circular dichroism (CD) or the anisotropy of the fluorescence emission (r), which can be carried out in a dichrograph or spectrofluorimeter, respectively, carry unique, spatially averaged information about the molecular organization of the sample. For inhomogeneous samples—e.g. cells and tissues—measurements on macroscopic scale are not satisfactory, and in some cases not feasible, thus microscopic techniques must be applied. The microscopic DP-imaging technique, when based on confocal laser scanning microscope (LSM), allows the pixel by pixel mapping of anisotropy of a sample in 2D and 3D. The first DP-LSM configuration, which, in fluorescence mode, allowed confocal imaging of different DP quantities in real-time, without interfering with the ‘conventional’ imaging, was built on a Zeiss LSM410. It was demonstrated to be capable of determining non-confocally the linear birefringence (LB) or LD of a sample and, confocally, its FDLD (fluorescence detected LD), the degree of polarization (P) and the anisotropy of the fluorescence emission (r), following polarized and non-polarized excitation, respectively (Steinbach et al 2009 Acta Histochem.111 316-25). This DP-LSM configuration, however, cannot simply be adopted to new generation microscopes with considerably more compact structures. As shown here, for an Olympus FV500, we designed an easy-to-install DP attachment to determine LB, LD, FDLD and r, in new-generation confocal microscopes, which, in principle, can be complemented with a P-imaging unit, but specifically to the brand and type of LSM.

  20. Mapping microscopic order in plant and mammalian cells and tissues: novel differential polarization attachment for new generation confocal microscopes (DP-LSM)

    International Nuclear Information System (INIS)

    Steinbach, G; Pawlak, K; Garab, G; Pomozi, I; Tóth, E A; Molnár, A; Matkó, J

    2014-01-01

    Elucidation of the molecular architecture of complex, highly organized molecular macro-assemblies is an important, basic task for biology. Differential polarization (DP) measurements, such as linear (LD) and circular dichroism (CD) or the anisotropy of the fluorescence emission (r), which can be carried out in a dichrograph or spectrofluorimeter, respectively, carry unique, spatially averaged information about the molecular organization of the sample. For inhomogeneous samples—e.g. cells and tissues—measurements on macroscopic scale are not satisfactory, and in some cases not feasible, thus microscopic techniques must be applied. The microscopic DP-imaging technique, when based on confocal laser scanning microscope (LSM), allows the pixel by pixel mapping of anisotropy of a sample in 2D and 3D. The first DP-LSM configuration, which, in fluorescence mode, allowed confocal imaging of different DP quantities in real-time, without interfering with the ‘conventional’ imaging, was built on a Zeiss LSM410. It was demonstrated to be capable of determining non-confocally the linear birefringence (LB) or LD of a sample and, confocally, its FDLD (fluorescence detected LD), the degree of polarization (P) and the anisotropy of the fluorescence emission (r), following polarized and non-polarized excitation, respectively (Steinbach et al 2009 Acta Histochem.111 316–25). This DP-LSM configuration, however, cannot simply be adopted to new generation microscopes with considerably more compact structures. As shown here, for an Olympus FV500, we designed an easy-to-install DP attachment to determine LB, LD, FDLD and r, in new-generation confocal microscopes, which, in principle, can be complemented with a P-imaging unit, but specifically to the brand and type of LSM. (paper)

  1. Level set segmentation of medical images based on local region statistics and maximum a posteriori probability.

    Science.gov (United States)

    Cui, Wenchao; Wang, Yi; Lei, Tao; Fan, Yangyu; Feng, Yan

    2013-01-01

    This paper presents a variational level set method for simultaneous segmentation and bias field estimation of medical images with intensity inhomogeneity. In our model, the statistics of image intensities belonging to each different tissue in local regions are characterized by Gaussian distributions with different means and variances. According to maximum a posteriori probability (MAP) and Bayes' rule, we first derive a local objective function for image intensities in a neighborhood around each pixel. Then this local objective function is integrated with respect to the neighborhood center over the entire image domain to give a global criterion. In level set framework, this global criterion defines an energy in terms of the level set functions that represent a partition of the image domain and a bias field that accounts for the intensity inhomogeneity of the image. Therefore, image segmentation and bias field estimation are simultaneously achieved via a level set evolution process. Experimental results for synthetic and real images show desirable performances of our method.
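
    The MAP/Bayes step at the core of the method can be illustrated in isolation: each pixel is assigned to the tissue class maximizing a Gaussian likelihood times a class prior. The Python sketch below deliberately omits the paper's local-region statistics, bias-field estimation and level set evolution, and uses placeholder parameters; it is only meant to show the MAP criterion.

```python
import numpy as np
from scipy.stats import norm

def map_labels(image, means, stds, priors):
    """Assign each pixel to the class maximising a global Gaussian posterior.

    This omits the local-region statistics, bias-field estimation and
    level-set evolution of the full method; it only illustrates the MAP step.
    """
    x = image.ravel()[:, None]
    log_post = norm.logpdf(x, loc=np.asarray(means)[None, :],
                           scale=np.asarray(stds)[None, :]) + np.log(priors)[None, :]
    return np.argmax(log_post, axis=1).reshape(image.shape)

# Toy two-class image with intensity inhomogeneity ignored.
rng = np.random.default_rng(5)
truth = (rng.uniform(size=(64, 64)) < 0.4).astype(int)
img = np.where(truth == 1, rng.normal(150, 20, truth.shape), rng.normal(60, 20, truth.shape))
labels = map_labels(img, means=[60, 150], stds=[20, 20], priors=np.array([0.6, 0.4]))
print((labels == truth).mean())
```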

  2. Probably not future prediction using probability and statistical inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: How travel time is affected by congestion, driving speed, and traffic lights Why different gambling ...

  3. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
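
    One way to construct such simultaneous intervals is Monte Carlo calibration: simulate many standardized normal samples, then widen pointwise envelopes for the order statistics until a fraction 1-α of the simulated samples lies entirely inside. The Python sketch below follows that generic recipe and is not necessarily the authors' exact construction; sample sizes and seeds are arbitrary.

```python
import numpy as np

def simultaneous_band(n, alpha=0.05, n_sim=20000, rng=None):
    """Per-order-statistic bounds containing a whole standardised normal sample
    of size n with probability ~ 1 - alpha (Monte Carlo calibration)."""
    rng = rng or np.random.default_rng(0)
    sims = rng.standard_normal((n_sim, n))
    sims = (sims - sims.mean(axis=1, keepdims=True)) / sims.std(axis=1, ddof=1, keepdims=True)
    sims.sort(axis=1)
    # Shrink the pointwise level gamma until the envelopes reach simultaneous coverage 1 - alpha.
    for gamma in np.linspace(alpha, alpha / 50.0, 200):
        lo = np.quantile(sims, gamma / 2.0, axis=0)
        hi = np.quantile(sims, 1.0 - gamma / 2.0, axis=0)
        covered = np.mean(np.all((sims >= lo) & (sims <= hi), axis=1))
        if covered >= 1.0 - alpha:
            return lo, hi
    return lo, hi

# Usage: check whether a standardised, sorted sample stays inside the band.
rng = np.random.default_rng(6)
x = rng.normal(10.0, 2.0, size=30)
z = np.sort((x - x.mean()) / x.std(ddof=1))
lo, hi = simultaneous_band(len(x))
print(bool(np.all((z >= lo) & (z <= hi))))   # True for roughly 95% of normal samples
```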

  4. Cosmic string induced CMB maps

    International Nuclear Information System (INIS)

    Landriau, M.; Shellard, E. P. S.

    2011-01-01

    We compute maps of CMB temperature fluctuations seeded by cosmic strings using high resolution simulations of cosmic strings in a Friedmann-Robertson-Walker universe. We create full-sky, 18 deg. and 3 deg. CMB maps, including the relevant string contribution at each resolution from before recombination to today. We extract the angular power spectrum from these maps, demonstrating the importance of recombination effects. We briefly discuss the probability density function of the pixel temperatures, their skewness, and kurtosis.

  5. Probability theory a foundational course

    CERN Document Server

    Pakshirajan, R P

    2013-01-01

    This book shares the dictum of J. L. Doob in treating Probability Theory as a branch of Measure Theory and establishes this relation early. Probability measures in product spaces are introduced right at the start by way of laying the groundwork to later claim the existence of stochastic processes with prescribed finite dimensional distributions. Other topics analysed in the book include supports of probability measures, zero-one laws in product measure spaces, Erdos-Kac invariance principle, functional central limit theorem and functional law of the iterated logarithm for independent variables, Skorohod embedding, and the use of analytic functions of a complex variable in the study of geometric ergodicity in Markov chains. This book is offered as a textbook for students pursuing graduate programs in Mathematics and/or Statistics. The book aims to help the teacher present the theory with ease, and to help the student sustain his interest and joy in learning the subject.

  6. VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Smirnov Vladimir Alexandrovich

    2012-10-01

    Full Text Available The article deals with the probability analysis of a vibration isolation system for high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Assuming a Gaussian distribution, the author estimates the probability that the relative displacement of the isolated mass remains below the vibration criteria. The problem is solved in a three-dimensional space spanned by the system parameters, including damping and natural frequency. From this probability distribution, the chance of exceeding the vibration criteria for a vibration isolation system is evaluated. Optimal system parameters, damping and natural frequency, are then determined so that the probability of exceeding the VC-E and VC-D vibration criteria is kept below 0.04.
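
    The core calculation can be illustrated in a few lines (the numbers below are hypothetical, not taken from the article): for a zero-mean Gaussian response, the chance of the relative displacement exceeding a criterion amplitude in either direction follows directly from the normal survival function.

      from scipy.stats import norm

      sigma = 0.4e-6   # RMS relative displacement, metres (hypothetical value)
      v_c = 1.0e-6     # vibration criterion amplitude, metres (hypothetical value)

      # P(|X| > v_c) for X ~ N(0, sigma^2): twice the one-sided tail probability.
      p_exceed = 2.0 * norm.sf(v_c / sigma)
      print(f"probability of exceeding the criterion ≈ {p_exceed:.3f}")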

  7. Approximation methods in probability theory

    CERN Document Server

    Čekanavičius, Vydas

    2016-01-01

    This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.

  8. Model uncertainty: Probabilities for models?

    International Nuclear Information System (INIS)

    Winkler, R.L.

    1994-01-01

    Like any other type of uncertainty, model uncertainty should be treated in terms of probabilities. The question is how to do this. The most commonly used approach has a drawback related to the interpretation of the probabilities assigned to the models. If we step back and look at the big picture, asking what the appropriate focus of the model uncertainty question should be in the context of risk and decision analysis, we see that a different probabilistic approach makes more sense, although it raises some implementation questions. Current work that is underway to address these questions looks very promising.

  9. Knowledge typology for imprecise probabilities.

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)

    2002-01-01

    When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.

  10. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2011-01-01

    A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d

  11. Statistical probability tables CALENDF program

    International Nuclear Information System (INIS)

    Ribon, P.

    1989-01-01

    The purpose of probability tables is twofold: to obtain a dense data representation and to calculate integrals by quadratures. They are mainly used in the USA for calculations by Monte Carlo and in the USSR and Europe for self-shielding calculations by the sub-group method. The moment probability tables, in addition to providing a more substantial mathematical basis and calculation methods, are adapted for condensation and mixture calculations, which are the crucial operations for reactor physics specialists. However, their extension is limited by the statistical hypothesis they imply. Efforts are being made to remove this obstacle, at the cost, it must be said, of greater complexity.

  12. Probability, statistics, and queueing theory

    CERN Document Server

    Allen, Arnold O

    1990-01-01

    This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit

  13. Probability and Statistics: 5 Questions

    DEFF Research Database (Denmark)

    Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research and how their work fits in these respects. Interviews with Nick Bingham, Luc Bovens, Terrence L. Fine, Haim Gaifman, Donald Gillies, James Hawthorne, Carl Hoefer, James M. Joyce, Joseph B. Kadane, Isaac Levi, D.H. Mellor, Patrick Suppes, Jan von Plato, Carl Wagner, Sandy Zabell...

  14. Delayed gadolinium-enhanced MRI of cartilage and T2 mapping for evaluation of reparative cartilage-like tissue after autologous chondrocyte implantation associated with Atelocollagen-based scaffold in the knee

    Energy Technology Data Exchange (ETDEWEB)

    Tadenuma, Taku; Uchio, Yuji; Kumahashi, Nobuyuki; Iwasa, Junji [Shimane University School of Medicine, Department of Orthopaedic Surgery, Izumo-shi, Shimane-ken (Japan); Fukuba, Eiji; Kitagaki, Hajime [Shimane University School of Medicine, Department of Radiology, Izumo-shi, Shimane-ken (Japan); Ochi, Mitsuo [Hiroshima University, Department of Orthopaedic Surgery, Integrated Health Sciences, Institute of Biomedical and Health Sciences, Minami-ku, Hiroshima (Japan)

    2016-10-15

    To elucidate the quality of tissue-engineered cartilage after an autologous chondrocyte implantation (ACI) technique with Atelocollagen gel as a scaffold in the knee in the short- to midterm postoperatively, we assessed delayed gadolinium-enhanced magnetic resonance imaging (MRI) of cartilage (dGEMRIC) and T2 mapping and clarified the relationship between T1 and T2 values and clinical results. In this cross-sectional study, T1 and T2 mapping were performed on 11 knees of 8 patients (mean age at ACI, 37.2 years) with a 3.0-T MRI scanner. T1_implant and T2_implant values were compared with those of the control cartilage region (T1_control and T2_control). Lysholm scores were also assessed for clinical evaluation. The relationships between the T1 and T2 values and the clinical Lysholm score were also assessed. There were no significant differences in the T1 values between the T1_implant (386.64 ± 101.78 ms) and T1_control (375.82 ± 62.89 ms) at the final follow-up. The implants showed significantly longer T2 values compared to the control cartilage (53.83 ± 13.89 vs. 38.21 ± 4.43 ms). The postoperative Lysholm scores were significantly higher than the preoperative scores. A significant correlation was observed between T1_implant and clinical outcomes, but not between T2_implant and clinical outcomes. Third-generation ACI implants might have obtained an almost equivalent glycosaminoglycan concentration compared to the normal cartilage, but they had lower collagen density at least 3 years after transplantation. The T1_implant value, but not the T2 value, might be a predictor of clinical outcome after ACI. (orig.)

  15. Delayed gadolinium-enhanced MRI of cartilage and T2 mapping for evaluation of reparative cartilage-like tissue after autologous chondrocyte implantation associated with Atelocollagen-based scaffold in the knee

    International Nuclear Information System (INIS)

    Tadenuma, Taku; Uchio, Yuji; Kumahashi, Nobuyuki; Iwasa, Junji; Fukuba, Eiji; Kitagaki, Hajime; Ochi, Mitsuo

    2016-01-01

    To elucidate the quality of tissue-engineered cartilage after an autologous chondrocyte implantation (ACI) technique with Atelocollagen gel as a scaffold in the knee in the short- to midterm postoperatively, we assessed delayed gadolinium-enhanced magnetic resonance imaging (MRI) of cartilage (dGEMRIC) and T2 mapping and clarified the relationship between T1 and T2 values and clinical results. In this cross-sectional study, T1 and T2 mapping were performed on 11 knees of 8 patients (mean age at ACI, 37.2 years) with a 3.0-T MRI scanner. T1_implant and T2_implant values were compared with those of the control cartilage region (T1_control and T2_control). Lysholm scores were also assessed for clinical evaluation. The relationships between the T1 and T2 values and the clinical Lysholm score were also assessed. There were no significant differences in the T1 values between the T1_implant (386.64 ± 101.78 ms) and T1_control (375.82 ± 62.89 ms) at the final follow-up. The implants showed significantly longer T2 values compared to the control cartilage (53.83 ± 13.89 vs. 38.21 ± 4.43 ms). The postoperative Lysholm scores were significantly higher than the preoperative scores. A significant correlation was observed between T1_implant and clinical outcomes, but not between T2_implant and clinical outcomes. Third-generation ACI implants might have obtained an almost equivalent glycosaminoglycan concentration compared to the normal cartilage, but they had lower collagen density at least 3 years after transplantation. The T1_implant value, but not the T2 value, might be a predictor of clinical outcome after ACI. (orig.)

  16. Delayed gadolinium-enhanced MRI of cartilage and T2 mapping for evaluation of reparative cartilage-like tissue after autologous chondrocyte implantation associated with Atelocollagen-based scaffold in the knee.

    Science.gov (United States)

    Tadenuma, Taku; Uchio, Yuji; Kumahashi, Nobuyuki; Fukuba, Eiji; Kitagaki, Hajime; Iwasa, Junji; Ochi, Mitsuo

    2016-10-01

    To elucidate the quality of tissue-engineered cartilage after an autologous chondrocyte implantation (ACI) technique with Atelocollagen gel as a scaffold in the knee in the short- to midterm postoperatively, we assessed delayed gadolinium-enhanced magnetic resonance imaging (MRI) of cartilage (dGEMRIC) and T2 mapping and clarified the relationship between T1 and T2 values and clinical results. In this cross-sectional study, T1 and T2 mapping were performed on 11 knees of 8 patients (mean age at ACI, 37.2 years) with a 3.0-T MRI scanner. T1implant and T2implant values were compared with those of the control cartilage region (T1control and T2control). Lysholm scores were also assessed for clinical evaluation. The relationships between the T1 and T2 values and the clinical Lysholm score were also assessed. There were no significant differences in the T1 values between the T1implant (386.64 ± 101.78 ms) and T1control (375.82 ± 62.89 ms) at the final follow-up. The implants showed significantly longer T2 values compared to the control cartilage (53.83 ± 13.89 vs. 38.21 ± 4.43 ms). The postoperative Lysholm scores were significantly higher than the preoperative scores. A significant correlation was observed between T1implant and clinical outcomes, but not between T2implant and clinical outcomes. Third-generation ACI implants might have obtained an almost equivalent glycosaminoglycan concentration compared to the normal cartilage, but they had lower collagen density at least 3 years after transplantation. The T1implant value, but not the T2 value, might be a predictor of clinical outcome after ACI.

  17. Dynamic SEP event probability forecasts

    Science.gov (United States)

    Kahler, S. W.; Ling, A.

    2015-10-01

    The forecasting of solar energetic particle (SEP) event probabilities at Earth has been based primarily on the estimates of magnetic free energy in active regions and on the observations of peak fluxes and fluences of large (≥ M2) solar X-ray flares. These forecasts are typically issued for the next 24 h or with no definite expiration time, which can be deficient for time-critical operations when no SEP event appears following a large X-ray flare. It is therefore important to decrease the event probability forecast with time as a SEP event fails to appear. We use the NOAA listing of major (≥10 pfu) SEP events from 1976 to 2014 to plot the delay times from X-ray peaks to SEP threshold onsets as a function of solar source longitude. An algorithm is derived to decrease the SEP event probabilities with time when no event is observed to reach the 10 pfu threshold. In addition, we use known SEP event size distributions to modify probability forecasts when SEP intensity increases occur below the 10 pfu event threshold. An algorithm to provide a dynamic SEP event forecast, Pd, for both situations of SEP intensities following a large flare is derived.
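
    A toy version of such a time-decaying forecast is sketched below; the log-normal delay distribution and its parameters are assumptions standing in for the empirical flare-to-onset delay times used in the paper.

      from scipy.stats import lognorm

      def dynamic_sep_probability(p0, hours_elapsed, median_delay_h=6.0, shape=0.8):
          """Decrease an initial SEP event probability p0 as time passes with no onset.

          Bayes update: P(event | no onset by t) = p0*S(t) / (p0*S(t) + (1 - p0)),
          where S(t) is the survival function of the assumed onset-delay distribution.
          """
          delay = lognorm(s=shape, scale=median_delay_h)   # hypothetical delay model
          surv = delay.sf(hours_elapsed)                   # P(onset later than t | event)
          return p0 * surv / (p0 * surv + (1.0 - p0))

      for t in (0, 6, 12, 24):
          print(t, "h:", round(dynamic_sep_probability(0.4, t), 3))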

  18. Conditional Independence in Applied Probability.

    Science.gov (United States)

    Pfeiffer, Paul E.

    This material assumes the user has the background provided by a good undergraduate course in applied probability. It is felt that introductory courses in calculus, linear algebra, and perhaps some differential equations should provide the requisite experience and proficiency with mathematical concepts, notation, and argument. The document is…

  19. Stretching Probability Explorations with Geoboards

    Science.gov (United States)

    Wheeler, Ann; Champion, Joe

    2016-01-01

    Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…

  20. GPS: Geometry, Probability, and Statistics

    Science.gov (United States)

    Field, Mike

    2012-01-01

    It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…

  1. Swedish earthquakes and acceleration probabilities

    International Nuclear Information System (INIS)

    Slunga, R.

    1979-03-01

    A method to assign probabilities to ground accelerations for Swedish sites is described. As hardly any nearfield instrumental data is available we are left with the problem of interpreting macroseismic data in terms of acceleration. By theoretical wave propagation computations the relation between the seismic strength of the earthquake, focal depth, distance and ground acceleration is calculated. We found that most Swedish earthquakes are shallow; the 1904 earthquake 100 km south of Oslo is an exception and probably had a focal depth exceeding 25 km. For the nuclear power plant sites an annual probability of 10^-5 has been proposed as interesting. This probability gives ground accelerations in the range 5-20 % g for the sites. This acceleration is for a free bedrock site. For consistency all acceleration results in this study are given for bedrock sites. When applying our model to the 1904 earthquake and assuming the focal zone to be in the lower crust we get the epicentral acceleration of this earthquake to be 5-15 % g. The results above are based on an analysis of macroseismic data as relevant instrumental data is lacking. However, the macroseismic acceleration model deduced in this study gives epicentral ground accelerations of small Swedish earthquakes in agreement with existing distant instrumental data. (author)

  2. DECOFF Probabilities of Failed Operations

    DEFF Research Database (Denmark)

    Gintautas, Tomas

    2015-01-01

    A statistical procedure of estimation of Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test case models. Also safety factor influence is investigated. DECOFF statistical method is benchmarked against standard Alpha-factor...

  3. Risk estimation using probability machines

    Science.gov (United States)

    2014-01-01

    Background Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. Results We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. Conclusions The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a “risk machine”, will share properties from the statistical machine that it is derived from. PMID:24581306
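
    The following sketch shows the general flavor of a "probability machine" with a random forest (the simulated data and settings are illustrative assumptions, not the authors' pipeline): conditional probabilities come from the forest's probability output, and a counterfactual shift of one predictor gives a risk-difference style effect size.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      rng = np.random.default_rng(1)
      X = rng.normal(size=(2000, 3))
      logit = 0.8 * X[:, 0] - 1.2 * X[:, 1] + 0.5 * X[:, 0] * X[:, 1]
      y = rng.random(2000) < 1.0 / (1.0 + np.exp(-logit))   # logistic data-generating model

      rf = RandomForestClassifier(n_estimators=500, min_samples_leaf=20, random_state=0)
      rf.fit(X, y)
      p_hat = rf.predict_proba(X)[:, 1]                     # estimated P(Y=1 | X)

      # Counterfactual effect size: change in predicted risk for a +1 shift in x0.
      X_shift = X.copy()
      X_shift[:, 0] += 1.0
      risk_diff = rf.predict_proba(X_shift)[:, 1] - p_hat
      print(f"mean risk difference for a +1 shift in x0: {risk_diff.mean():.3f}")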

  4. Probability and statistics: A reminder

    International Nuclear Information System (INIS)

    Clement, B.

    2013-01-01

    The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from 'data analysis in experimental sciences' given in [1]. (authors)

  5. Nash equilibrium with lower probabilities

    DEFF Research Database (Denmark)

    Groes, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1998-01-01

    We generalize the concept of Nash equilibrium in mixed strategies for strategic form games to allow for ambiguity in the players' expectations. In contrast to other contributions, we model ambiguity by means of so-called lower probability measures or belief functions, which makes it possible...

  6. On probability-possibility transformations

    Science.gov (United States)

    Klir, George J.; Parviz, Behzad

    1992-01-01

    Several probability-possibility transformations are compared in terms of the closeness of preserving second-order properties. The comparison is based on experimental results obtained by computer simulation. Two second-order properties are involved in this study: noninteraction of two distributions and projections of a joint distribution.

  7. Cloning of a cDNA encoding the rat high molecular weight neurofilament peptide (NF-H): Developmental and tissue expression in the rat, and mapping of its human homologue to chromosomes 1 and 22

    International Nuclear Information System (INIS)

    Lieberburg, I.; Spinner, N.; Snyder, S.

    1989-01-01

    Neurofilaments (NFs) are the intermediate filaments specific to nervous tissue. Three peptides with apparent molecular masses of approximately 68 (NF-L), 145 (NF-M), and 200 (NF-H) kDa appear to be the major components of NF. The expression of these peptides is specific to nervous tissue and is developmentally regulated. Recently, complete cDNAs encoding NF-L and NF-M, and partial cDNAs encoding NF-H, have been described. To better understand the normal pathophysiology of NFs the authors chose to clone the cDNA encoding the rat NF-H peptide. Using monoclonal antibodies that recognized NF-H, they screened a rat brain λgt11 library and identified a clone that contained a 2,100-nucleotide cDNA insert representing the carboxyl-terminal portion of the NF-H protein. Levels of NF-H mRNA varied 20-fold among brain regions, with highest levels in pons/medulla, spinal cord, and cerebellum, and lowest levels in olfactory bulb and hypothalamus. Based on these results, the authors infer that half of the developmental increase and most of the interregional variation in the levels of the NF-H mRNA are mediated through message stabilization. Sequence information revealed that the carboxyl-terminal region of the NF-H peptide contained a unique serine-, proline-, alanine-, glutamic acid-, and lysine-rich repeat. Genomic blots revealed a single copy of the gene in the rat genome and two copies in the human genome. In situ hybridizations performed on human chromosomes mapped the NF-H gene to chromosomes 1 and 22

  8. Maximization of regional probabilities using Optimal Surface Graphs

    DEFF Research Database (Denmark)

    Arias Lorza, Andres M.; Van Engelen, Arna; Petersen, Jens

    2018-01-01

    Purpose: We present a segmentation method that maximizes regional probabilities enclosed by coupled surfaces using an Optimal Surface Graph (OSG) cut approach. This OSG cut determines the globally optimal solution given a graph constructed around an initial surface. While most methods for vessel wall segmentation only use edge information, we show that maximizing regional probabilities using an OSG improves the segmentation results. We applied this to automatically segment the vessel wall of the carotid artery in magnetic resonance images. Methods: First, voxel-wise regional probability maps were obtained using a Support Vector Machine classifier trained on local image features. Then, the OSG segments the regions that maximize the regional probabilities considering smoothness and topological constraints. Results: The method was evaluated on 49 carotid arteries from 30 subjects...
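
    A minimal sketch of the first stage only (the voxel-wise probability map, not the graph optimization) is shown below; the feature set and labels are hypothetical and stand in for the local image features and training data described above.

      import numpy as np
      from scipy.ndimage import gaussian_filter
      from sklearn.svm import SVC

      def regional_probability_map(image, labels):
          """Voxel-wise vessel-wall probability map from an SVM with probability outputs.

          image:  2D float array (one MR slice)
          labels: same shape; 1 = vessel wall, 0 = background, -1 = unlabeled
          """
          feats = np.stack([image,
                            gaussian_filter(image, 2),
                            gaussian_filter(image, 4)], axis=-1)   # toy local features
          X = feats.reshape(-1, feats.shape[-1])
          y = labels.ravel()
          train = y >= 0
          clf = SVC(probability=True).fit(X[train], y[train])
          return clf.predict_proba(X)[:, 1].reshape(image.shape)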

  9. Towards a global land subsidence map

    NARCIS (Netherlands)

    Erkens, G.; Sutanudjaja, E. H.

    2015-01-01

    Land subsidence is a global problem, but a global land subsidence map is not available yet. Such a map is crucial to raising global awareness of land subsidence, as land subsidence causes extensive damage (probably in the order of billions of dollars annually). With the global land subsidence map

  10. Acceleration of tissue phase mapping by k-t BLAST: a detailed analysis of the influence of k-t-BLAST for the quantification of myocardial motion at 3T

    Directory of Open Access Journals (Sweden)

    Nienhaus G Ulrich

    2011-01-01

    Full Text Available Abstract Background The assessment of myocardial motion with tissue phase mapping (TPM provides high spatiotemporal resolution and quantitative motion information in three directions. Today, whole volume coverage of the heart by TPM encoding at high spatial and temporal resolution is limited by long data acquisition times. Therefore, a significant increase in imaging speed without deterioration of the quantitative motion information is required. For this purpose, the k-t BLAST acceleration technique was combined with TPM black-blood functional imaging of the heart. Different k-t factors were evaluated with respect to their impact on the quantitative assessment of cardiac motion. Results It is demonstrated that a k-t BLAST factor of two can be used with a marginal, but statistically significant deterioration of the quantitative motion data. Further increasing the k-t acceleration causes substantial alteration of the peak velocities and the motion pattern, but the temporal behavior of the contraction is well maintained up to an acceleration factor of six. Conclusions The application of k-t BLAST for the acceleration of TPM appears feasible. A reduction of the acquisition time of almost 45% could be achieved without substantial loss of quantitative motion information.

  11. Large deviations and idempotent probability

    CERN Document Server

    Puhalskii, Anatolii

    2001-01-01

    In the view of many probabilists, author Anatolii Puhalskii's research results stand among the most significant achievements in the modern theory of large deviations. In fact, his work marked a turning point in the depth of our understanding of the connections between the large deviation principle (LDP) and well-known methods for establishing weak convergence results. Large Deviations and Idempotent Probability expounds upon the recent methodology of building large deviation theory along the lines of weak convergence theory. The author develops an idempotent (or maxitive) probability theory, introduces idempotent analogues of martingales (maxingales), Wiener and Poisson processes, and Ito differential equations, and studies their properties. The large deviation principle for stochastic processes is formulated as a certain type of convergence of stochastic processes to idempotent processes. The author calls this large deviation convergence. The approach to establishing large deviation convergence uses novel com...

  12. Probability biases as Bayesian inference

    Directory of Open Access Journals (Sweden)

    Andre; C. R. Martins

    2006-11-01

    Full Text Available In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated with them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors, they can be understood as adaptations to the solution of real life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as it has been observed.

  13. Probability matching and strategy availability.

    Science.gov (United States)

    Koehler, Derek J; James, Greta

    2010-09-01

    Findings from two experiments indicate that probability matching in sequential choice arises from an asymmetry in strategy availability: The matching strategy comes readily to mind, whereas a superior alternative strategy, maximizing, does not. First, compared with the minority who spontaneously engage in maximizing, the majority of participants endorse maximizing as superior to matching in a direct comparison when both strategies are described. Second, when the maximizing strategy is brought to their attention, more participants subsequently engage in maximizing. Third, matchers are more likely than maximizers to base decisions in other tasks on their initial intuitions, suggesting that they are more inclined to use a choice strategy that comes to mind quickly. These results indicate that a substantial subset of probability matchers are victims of "underthinking" rather than "overthinking": They fail to engage in sufficient deliberation to generate a superior alternative to the matching strategy that comes so readily to mind.

  14. Probability as a Physical Motive

    Directory of Open Access Journals (Sweden)

    Peter Martin

    2007-04-01

    Full Text Available Recent theoretical progress in nonequilibrium thermodynamics, linking the physical principle of Maximum Entropy Production (“MEP”) to the information-theoretical “MaxEnt” principle of scientific inference, together with conjectures from theoretical physics that there may be no fundamental causal laws but only probabilities for physical processes, and from evolutionary theory that biological systems expand “the adjacent possible” as rapidly as possible, all lend credence to the proposition that probability should be recognized as a fundamental physical motive. It is further proposed that spatial order and temporal order are two aspects of the same thing, and that this is the essence of the second law of thermodynamics.

  15. Logic, Probability, and Human Reasoning

    Science.gov (United States)

    2015-01-01

    accordingly suggest a way to integrate probability and deduction. The nature of deductive reasoning: to be rational is to be able to make deductions ... [3–6] and they underlie mathematics, science, and technology [7–10]. Plato claimed that emotions upset reasoning. However, individuals in the grip ... fundamental to human rationality. So, if counterexamples to its principal predictions occur, the theory will at least explain its own refutation

  16. Probability Measures on Groups IX

    CERN Document Server

    1989-01-01

    The latest in this series of Oberwolfach conferences focussed on the interplay between structural probability theory and various other areas of pure and applied mathematics such as Tauberian theory, infinite-dimensional rotation groups, central limit theorems, harmonizable processes, and spherical data. Thus it was attended by mathematicians whose research interests range from number theory to quantum physics in conjunction with structural properties of probabilistic phenomena. This volume contains 5 survey articles submitted on special invitation and 25 original research papers.

  17. Probability matching and strategy availability

    OpenAIRE

    J. Koehler, Derek; Koehler, Derek J.; James, Greta

    2010-01-01

    Findings from two experiments indicate that probability matching in sequential choice arises from an asymmetry in strategy availability: The matching strategy comes readily to mind, whereas a superior alternative strategy, maximizing, does not. First, compared with the minority who spontaneously engage in maximizing, the majority of participants endorse maximizing as superior to matching in a direct comparison when both strategies are described. Second, when the maximizing strategy is brought...

  18. Color on emergency mapping

    Science.gov (United States)

    Jiang, Lili; Qi, Qingwen; Zhang, An

    2007-06-01

    Emergencies such as typhoons, tsunamis, earthquakes, fires, floods, and epidemics are common in daily life, and they cost people their lives and belongings. Every day, every hour, even every minute people may face an emergency, so how to handle it and how to reduce its harm are matters people care about most. If we can map an emergency accurately before or after it occurs, the map will be helpful both to emergency researchers and to people living in the affected area. Through an emergency map, before an emergency occurs we can predict the situation, such as when and where it will happen and where people can take refuge; after a disaster, we can assess the losses, investigate the causes, and reduce future losses. The primary purpose of mapping is to offer information to the people affected by the emergency and to the researchers who want to study it. Mapping allows viewers to get a spatial sense of the hazard and provides clues for studying the relationships among the phenomena involved in an emergency. Color, as a basic element of the map, can simplify and clarify these phenomena. Color also affects the general perceptibility of the map and elicits subjective reactions to it; that is, structure, readability, and the reader's psychological reactions can all be affected by the use of color.

  19. Understanding map projections: Chapter 15

    Science.gov (United States)

    Usery, E. Lynn; Kent, Alexander J.; Vujakovic, Peter

    2018-01-01

    It has probably never been more important in the history of cartography than now that people understand how maps work. With increasing globalization, for example, world maps provide a key format for the transmission of information, but are often poorly used. Examples of poor understanding and use of projections and the resultant maps are many; for instance, the use of rectangular world maps in the United Kingdom press to show Chinese and Korean missile ranges as circles, something which can only be achieved on equidistant projections and then only from one launch point (Vujakovic, 2014).

  20. [Biometric bases: basic concepts of probability calculation].

    Science.gov (United States)

    Dinya, E

    1998-04-26

    The author gives an outline of the basic concepts of probability theory. The bases of event algebra, the definition of probability, the classical probability model and the random variable are presented.

  1. Probability for Weather and Climate

    Science.gov (United States)

    Smith, L. A.

    2013-12-01

    Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design to support of

  2. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2012-01-01

    This book provides a unique and balanced approach to probability, statistics, and stochastic processes.   Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area.  The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and

  3. Probability, statistics, and computational science.

    Science.gov (United States)

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.

  4. Sensitivity analysis using probability bounding

    International Nuclear Information System (INIS)

    Ferson, Scott; Troy Tucker, W.

    2006-01-01

    Probability bounds analysis (PBA) provides analysts a convenient means to characterize the neighborhood of possible results that would be obtained from plausible alternative inputs in probabilistic calculations. We show the relationship between PBA and the methods of interval analysis and probabilistic uncertainty analysis from which it is jointly derived, and indicate how the method can be used to assess the quality of probabilistic models such as those developed in Monte Carlo simulations for risk analyses. We also illustrate how a sensitivity analysis can be conducted within a PBA by pinching inputs to precise distributions or real values
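
    A hedged toy illustration of the "pinching" idea (not the authors' PBA machinery, which works with probability boxes rather than bare intervals): an uncertain input is first carried as an interval, then pinched to a point value to see how much the output range tightens.

      # Product of two nonnegative intervals; endpoints give the exact output range.
      def f_interval(a_lo, a_hi, b_lo, b_hi):
          return a_lo * b_lo, a_hi * b_hi

      wide = f_interval(2.0, 4.0, 1.0, 3.0)      # both inputs uncertain
      pinched = f_interval(3.0, 3.0, 1.0, 3.0)   # "pinch" the first input to a point value
      print(wide, pinched)                        # (2.0, 12.0) vs (3.0, 9.0)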

  5. Local Directional Probability Optimization for Quantification of Blurred Gray/White Matter Junction in Magnetic Resonance Image

    Directory of Open Access Journals (Sweden)

    Xiaoxia Qu

    2017-09-01

    Full Text Available The blurred gray/white matter junction is an important feature of focal cortical dysplasia (FCD) lesions. FCD is the main cause of epilepsy and can be detected through magnetic resonance (MR) imaging. Several earlier studies have focused on computing the gradient magnitude of the MR image and used the resulting map to model the blurred gray/white matter junction. However, gradient magnitude cannot quantify the blurred gray/white matter junction. Therefore, we proposed a novel algorithm called local directional probability optimization (LDPO) for detecting and quantifying the width of the gray/white matter boundary (GWB) within the lesional areas. The proposed LDPO method mainly consists of the following three stages: (1) introduction of a hidden Markov random field-expectation-maximization algorithm to compute the probability images of brain tissues in order to obtain the GWB region; (2) generation of local directions from gray matter (GM) to white matter (WM) passing through the GWB, considering the GWB to be an electric potential field; (3) determination of the optimal local directions for any given voxel of GWB, based on iterative searching of the neighborhood. This was then used to measure the width of the GWB. The proposed LDPO method was tested on real MR images of patients with FCD lesions. The results indicated that the LDPO method could quantify the GWB width. On the GWB width map, the width of the blurred GWB in the lesional region was observed to be greater than that in the non-lesional regions. The proposed GWB width map produced higher F-scores in terms of detecting the blurred GWB within the FCD lesional region as compared to that of FCD feature maps, indicating better trade-off between precision and recall.
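
    A heavily simplified surrogate for a GWB width map is sketched below. It is not the LDPO algorithm: it merely assumes the WM membership rises roughly linearly across the boundary, so the local width can be approximated by the membership span divided by the gradient magnitude of the WM probability map.

      import numpy as np

      def gwb_width_map(p_wm, spacing=1.0, lo=0.25, hi=0.75):
          """Approximate gray/white boundary width from a WM probability map (2D).

          Linear-ramp assumption: if p_wm climbs from lo to hi over a width w,
          then |grad p_wm| ≈ (hi - lo) / w inside the boundary band.
          """
          gy, gx = np.gradient(p_wm, spacing)
          grad_mag = np.hypot(gx, gy)
          boundary = (p_wm > lo) & (p_wm < hi)          # voxels inside the GWB band
          width = np.full(p_wm.shape, np.nan)
          width[boundary] = (hi - lo) / (grad_mag[boundary] + 1e-8)
          return width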

  6. Statistical methods in physical mapping

    International Nuclear Information System (INIS)

    Nelson, D.O.

    1995-05-01

    One of the great success stories of modern molecular genetics has been the ability of biologists to isolate and characterize the genes responsible for serious inherited diseases like fragile X syndrome, cystic fibrosis and myotonic muscular dystrophy. This dissertation concentrates on constructing high-resolution physical maps. It demonstrates how probabilistic modeling and statistical analysis can aid molecular geneticists in the tasks of planning, execution, and evaluation of physical maps of chromosomes and large chromosomal regions. The dissertation is divided into six chapters. Chapter 1 provides an introduction to the field of physical mapping, describing the role of physical mapping in gene isolation and in past efforts at mapping chromosomal regions. The next two chapters review and extend known results on predicting progress in large mapping projects. Such predictions help project planners decide between various approaches and tactics for mapping large regions of the human genome. Chapter 2 shows how probability models have been used in the past to predict progress in mapping projects. Chapter 3 presents new results, based on stationary point process theory, for progress measures for mapping projects based on directed mapping strategies. Chapter 4 describes in detail the construction of an initial high-resolution physical map for human chromosome 19. This chapter introduces the probability and statistical models involved in map construction in the context of a large, ongoing physical mapping project. Chapter 5 concentrates on one such model, the trinomial model. This chapter contains new results on the large-sample behavior of this model, including distributional results, asymptotic moments, and detection error rates. In addition, it contains an optimality result concerning experimental procedures based on the trinomial model. The last chapter explores unsolved problems and describes future work

  7. Statistical methods in physical mapping

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, David O. [Univ. of California, Berkeley, CA (United States)

    1995-05-01

    One of the great success stories of modern molecular genetics has been the ability of biologists to isolate and characterize the genes responsible for serious inherited diseases like fragile X syndrome, cystic fibrosis and myotonic muscular dystrophy. This dissertation concentrates on constructing high-resolution physical maps. It demonstrates how probabilistic modeling and statistical analysis can aid molecular geneticists in the tasks of planning, execution, and evaluation of physical maps of chromosomes and large chromosomal regions. The dissertation is divided into six chapters. Chapter 1 provides an introduction to the field of physical mapping, describing the role of physical mapping in gene isolation and in past efforts at mapping chromosomal regions. The next two chapters review and extend known results on predicting progress in large mapping projects. Such predictions help project planners decide between various approaches and tactics for mapping large regions of the human genome. Chapter 2 shows how probability models have been used in the past to predict progress in mapping projects. Chapter 3 presents new results, based on stationary point process theory, for progress measures for mapping projects based on directed mapping strategies. Chapter 4 describes in detail the construction of an initial high-resolution physical map for human chromosome 19. This chapter introduces the probability and statistical models involved in map construction in the context of a large, ongoing physical mapping project. Chapter 5 concentrates on one such model, the trinomial model. This chapter contains new results on the large-sample behavior of this model, including distributional results, asymptotic moments, and detection error rates. In addition, it contains an optimality result concerning experimental procedures based on the trinomial model. The last chapter explores unsolved problems and describes future work.

  8. MAPS of Cancer

    Science.gov (United States)

    Gray, Lincoln

    1998-01-01

    Our goal was to produce an interactive visualization from a mathematical model that successfully predicts metastases from head and neck cancer. We met this goal early in the project. The visualization is available for the public to view. Our work appears to fill a need for more information about this deadly disease. The idea of this project was to make an easily interpretable visualization based on what we call "functional maps" of disease. A functional map is a graphic summary of medical data, where distances between parts of the body are determined by the probability of disease, not by anatomical distances. Functional maps often bear little resemblance to anatomical maps, but they can be used to predict the spread of disease. The idea of modeling the spread of disease in an abstract multidimensional space is difficult for many people. Our goal was to make the important predictions easy to see. NASA must face this problem frequently: how to help laypersons and professionals see important trends in abstract, complex data. We took advantage of concepts perfected in NASA's graphics libraries. As an analogy, consider a functional map of early America. Suppose we choose travel times, rather than miles, as our measures of inter-city distances. For Abraham Lincoln, travel times would have been the more meaningful measure of separation between cities. In such a map New Orleans would be close to Memphis because of the Mississippi River. St. Louis would be close to Portland because of the Oregon Trail. Oklahoma City would be far from Little Rock because of the Cheyenne. Such a map would look puzzling to those of us who have always seen physical maps, but the functional map would be more useful in predicting the probabilities of inter-site transit. Continuing the analogy, we could predict the spread of social diseases such as gambling along the rivers and cattle rustling along the trails. We could simply print the functional map of America, but it would be more interesting

  9. Lectures on probability and statistics

    International Nuclear Information System (INIS)

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another
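
    A worked example in the spirit of the dice illustration: with two fair dice, all 36 ordered outcomes are equally likely a priori, so the probability of any specified event is just the count of favourable outcomes over 36.

      from fractions import Fraction
      from itertools import product

      outcomes = list(product(range(1, 7), repeat=2))       # 36 equally likely pairs
      p_sum_seven = Fraction(sum(a + b == 7 for a, b in outcomes), len(outcomes))
      p_double_six = Fraction(sum((a, b) == (6, 6) for a, b in outcomes), len(outcomes))
      print(p_sum_seven, p_double_six)                      # 1/6 and 1/36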

  10. Future southcentral US wildfire probability due to climate change

    Science.gov (United States)

    Stambaugh, Michael C.; Guyette, Richard P.; Stroh, Esther D.; Struckhoff, Matthew A.; Whittier, Joanna B.

    2018-01-01

    Globally, changing fire regimes due to climate is one of the greatest threats to ecosystems and society. In this paper, we present projections of future fire probability for the southcentral USA using downscaled climate projections and the Physical Chemistry Fire Frequency Model (PC2FM). Future fire probability is projected to both increase and decrease across the study region of Oklahoma, New Mexico, and Texas. Among all end-of-century projections, change in fire probabilities (CFPs) range from − 51 to + 240%. Greatest absolute increases in fire probability are shown for areas within the range of approximately 75 to 160 cm mean annual precipitation (MAP), regardless of climate model. Although fire is likely to become more frequent across the southcentral USA, spatial patterns may remain similar unless significant increases in precipitation occur, whereby more extensive areas with increased fire probability are predicted. Perhaps one of the most important results is illumination of climate changes where fire probability response (+, −) may deviate (i.e., tipping points). Fire regimes of southcentral US ecosystems occur in a geographic transition zone from reactant- to reaction-limited conditions, potentially making them uniquely responsive to different scenarios of temperature and precipitation changes. Identification and description of these conditions may help anticipate fire regime changes that will affect human health, agriculture, species conservation, and nutrient and water cycling.

  11. The probability representation as a new formulation of quantum mechanics

    International Nuclear Information System (INIS)

    Man'ko, Margarita A; Man'ko, Vladimir I

    2012-01-01

    We present a new formulation of conventional quantum mechanics, in which the notion of a quantum state is identified via a fair probability distribution of the position measured in a reference frame of the phase space with rotated axes. In this formulation, the quantum evolution equation as well as the equation for finding energy levels are expressed as linear equations for the probability distributions that determine the quantum states. We also give the integral transforms relating the probability distribution (called the tomographic-probability distribution or the state tomogram) to the density matrix and the Wigner function and discuss their connection with the Radon transform. Qudit states are considered and the invertible map of the state density operators onto the probability vectors is discussed. The tomographic entropies and entropic uncertainty relations are reviewed. We demonstrate the uncertainty relations for the position and momentum and the entropic uncertainty relations in the tomographic-probability representation, which is suitable for an experimental check of the uncertainty relations.
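
    For orientation, the Radon-like relation between the tomographic-probability distribution (symplectic tomogram) and the Wigner function is usually written as follows (one common normalization convention with ħ = 1; the conventions used in the paper may differ):

      w(X \mid \mu, \nu) = \operatorname{Tr}\left[ \hat{\rho}\, \delta(X - \mu \hat{q} - \nu \hat{p}) \right]
                         = \int W(q, p)\, \delta(X - \mu q - \nu p)\, \frac{dq\, dp}{2\pi},

      W(q, p) = \frac{1}{2\pi} \int w(X \mid \mu, \nu)\, e^{\, i (X - \mu q - \nu p)}\, dX\, d\mu\, d\nu .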

  12. Probability theory a comprehensive course

    CERN Document Server

    Klenke, Achim

    2014-01-01

    This second edition of the popular textbook contains a comprehensive course in modern probability theory. Overall, probabilistic concepts play an increasingly important role in mathematics, physics, biology, financial engineering and computer science. They help us in understanding magnetism, amorphous media, genetic diversity and the perils of random developments at financial markets, and they guide us in constructing more efficient algorithms.   To address these concepts, the title covers a wide variety of topics, many of which are not usually found in introductory textbooks, such as:   • limit theorems for sums of random variables • martingales • percolation • Markov chains and electrical networks • construction of stochastic processes • Poisson point process and infinite divisibility • large deviation principles and statistical physics • Brownian motion • stochastic integral and stochastic differential equations. The theory is developed rigorously and in a self-contained way, with the c...

  13. New exponential, logarithm and q-probability in the non-extensive statistical physics

    OpenAIRE

    Chung, Won Sang

    2013-01-01

    In this paper, a new exponential and logarithm related to non-extensive statistical physics are proposed by using the q-sum and q-product, which satisfy distributivity. We also discuss the q-mapping from an ordinary probability to a q-probability. The q-entropy defined by the idea of q-probability is shown to be q-additive.
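
    For context, the standard Tsallis-type q-deformed functions and the usual q-sum/q-product read as below (the paper proposes modified definitions built so that distributivity holds, which differ from these):

      \ln_q x = \frac{x^{1-q} - 1}{1 - q}, \qquad
      e_q(x) = \left[ 1 + (1 - q)\, x \right]_{+}^{\frac{1}{1-q}},

      x \otimes_q y = \left[ x^{1-q} + y^{1-q} - 1 \right]_{+}^{\frac{1}{1-q}}, \qquad
      x \oplus_q y = x + y + (1 - q)\, x\, y .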

  14. Joint probability distributions and fluctuation theorems

    International Nuclear Information System (INIS)

    García-García, Reinaldo; Kolton, Alejandro B; Domínguez, Daniel; Lecomte, Vivien

    2012-01-01

    We derive various exact results for Markovian systems that spontaneously relax to a non-equilibrium steady state by using joint probability distribution symmetries of different entropy production decompositions. The analytical approach is applied to diverse problems such as the description of the fluctuations induced by experimental errors, for unveiling symmetries of correlation functions appearing in fluctuation–dissipation relations recently generalized to non-equilibrium steady states, and also for mapping averages between different trajectory-based dynamical ensembles. Many known fluctuation theorems arise as special instances of our approach for particular twofold decompositions of the total entropy production. As a complement, we also briefly review and synthesize the variety of fluctuation theorems applying to stochastic dynamics of both continuous systems described by a Langevin dynamics and discrete systems obeying a Markov dynamics, emphasizing how these results emerge from distinct symmetries of the dynamical entropy of the trajectory followed by the system. For Langevin dynamics, we embed the 'dual dynamics' with a physical meaning, and for Markov systems we show how the fluctuation theorems translate into symmetries of modified evolution operators

  15. Topographic mapping

    Science.gov (United States)

    ,

    2008-01-01

    The U.S. Geological Survey (USGS) produced its first topographic map in 1879, the same year it was established. Today, more than 100 years and millions of map copies later, topographic mapping is still a central activity for the USGS. The topographic map remains an indispensable tool for government, science, industry, and leisure. Much has changed since early topographers traveled the unsettled West and carefully plotted the first USGS maps by hand. Advances in survey techniques, instrumentation, and design and printing technologies, as well as the use of aerial photography and satellite data, have dramatically improved mapping coverage, accuracy, and efficiency. Yet cartography, the art and science of mapping, may never before have undergone change more profound than today.

  16. Excluding joint probabilities from quantum theory

    Science.gov (United States)

    Allahverdyan, Armen E.; Danageozian, Arshag

    2018-03-01

    Quantum theory does not provide a unique definition for the joint probability of two noncommuting observables, which is the next important question after the Born probability for a single observable. Instead, various definitions have been suggested, e.g., via quasiprobabilities or via hidden-variable theories. After reviewing open issues of the joint probability, we relate it to quantum imprecise probabilities, which are noncontextual and are consistent with all constraints expected from a quantum probability. We study two noncommuting observables in a two-dimensional Hilbert space and show that there is no precise joint probability that applies for any quantum state and is consistent with imprecise probabilities. This contrasts with the theorems by Bell and Kochen-Specker that exclude joint probabilities for more than two noncommuting observables in Hilbert spaces of dimension larger than two. If measurement contexts are included in the definition, joint probabilities are not excluded anymore, but they are still constrained by imprecise probabilities.

  17. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables.The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector

  18. Introduction to probability theory with contemporary applications

    CERN Document Server

    Helms, Lester L

    2010-01-01

    This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus.Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process

  19. Tissue engineering

    CERN Document Server

    Fisher, John P; Bronzino, Joseph D

    2007-01-01

    Increasingly viewed as the future of medicine, the field of tissue engineering is still in its infancy. As evidenced in both the scientific and popular press, there exists considerable excitement surrounding the strategy of regenerative medicine. To achieve its highest potential, a series of technological advances must be made. Putting the numerous breakthroughs made in this field into a broad context, Tissue Engineering disseminates current thinking on the development of engineered tissues. Divided into three sections, the book covers the fundamentals of tissue engineering, enabling technologies, and tissue engineering applications. It examines the properties of stem cells, primary cells, growth factors, and extracellular matrix as well as their impact on the development of tissue engineered devices. Contributions focus on those strategies typically incorporated into tissue engineered devices or utilized in their development, including scaffolds, nanocomposites, bioreactors, drug delivery systems, and gene t...

  20. On the Inclusion of Short-distance Bystander Effects into a Logistic Tumor Control Probability Model.

    Science.gov (United States)

    Tempel, David G; Brodin, N Patrik; Tomé, Wolfgang A

    2018-01-01

    Currently, interactions between voxels are neglected in the tumor control probability (TCP) models used in biologically-driven intensity-modulated radiotherapy treatment planning. However, experimental data suggest that this may not always be justified when bystander effects are important. We propose a model inspired by the Ising model, a short-range interaction model, to investigate if and when it is important to include voxel-to-voxel interactions in biologically-driven treatment planning. This Ising-like model for TCP is derived by first showing that the logistic model of tumor control is mathematically equivalent to a non-interacting Ising model. Using this correspondence, the parameters of the logistic model are mapped to the parameters of an Ising-like model, and bystander interactions are introduced as a short-range interaction, as is the case for the Ising model. As an example, we apply the model to study the effect of bystander interactions in the case of radiation therapy for prostate cancer. The model shows that it is adequate to neglect bystander interactions for dose distributions that completely cover the treatment target and yield TCP estimates that lie in the shoulder of the dose response curve. However, for dose distributions that yield TCP estimates that lie on the steep part of the dose response curve, or for inhomogeneous dose distributions having significant hot and/or cold regions, bystander effects may be important. Furthermore, the proposed model highlights a previously unexplored and potentially fruitful connection between the fields of statistical mechanics and tumor control probability/normal tissue complication probability modeling.
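
    The paper's exact coupling is not reproduced here, but the following sketch illustrates the idea under stated assumptions: a voxel-wise logistic dose response with hypothetical parameters D50 and γ50, an independent-voxel (non-interacting) TCP taken as the product of voxel kill probabilities, and a schematic nearest-neighbour term that mixes each voxel's dose with its neighbours' mean to mimic a short-range bystander interaction.

```python
import numpy as np

def voxel_kill_probability(dose, d50=50.0, gamma50=2.0):
    """Voxel-wise logistic dose response; gamma50 is the normalized slope at d50 (assumed parameterization)."""
    return 1.0 / (1.0 + np.exp(-4.0 * gamma50 * (dose - d50) / d50))

def tcp(dose_grid, d50=50.0, gamma50=2.0, coupling=0.0):
    """TCP over a 2D dose grid (Gy). coupling=0 recovers the non-interacting logistic model;
    coupling>0 blends each voxel's dose with its 4-neighbour mean as a schematic bystander term."""
    d = np.asarray(dose_grid, dtype=float)
    padded = np.pad(d, 1, mode="edge")
    neighbour_mean = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                      padded[1:-1, :-2] + padded[1:-1, 2:]) / 4.0
    effective_dose = (1.0 - coupling) * d + coupling * neighbour_mean
    return float(np.prod(voxel_kill_probability(effective_dose, d50, gamma50)))

# Homogeneous coverage vs. a cold spot: the interaction term matters most for inhomogeneous plans.
uniform = np.full((8, 8), 70.0)
cold = uniform.copy()
cold[3:5, 3:5] = 45.0
for grid in (uniform, cold):
    print(tcp(grid, coupling=0.0), tcp(grid, coupling=0.3))
```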

  1. K-forbidden transition probabilities

    International Nuclear Information System (INIS)

    Saitoh, T.R.; Sletten, G.; Bark, R.A.; Hagemann, G.B.; Herskind, B.; Saitoh-Hashimoto, N.; Tsukuba Univ., Ibaraki

    2000-01-01

    Reduced hindrance factors of K-forbidden transitions are compiled for nuclei with A ≈ 180 where γ-vibrational states are observed. Correlations between these reduced hindrance factors and Coriolis forces, statistical level mixing and γ-softness have been studied. It is demonstrated that the K-forbidden transition probabilities are related to γ-softness. The decay of the high-K bandheads has been studied by means of two-state mixing, which would be induced by the γ-softness, using a number of K-forbidden transitions compiled in the present work, where high-K bandheads are depopulated by both E2 and ΔI=1 transitions. The validity of the two-state mixing scheme has been examined by using the proposed identity of the B(M1)/B(E2) ratios of transitions depopulating high-K bandheads and levels of low-K bands. A breakdown of the identity might indicate that other levels mediate transitions between high- and low-K states. (orig.)

  2. The maximum entropy method of moments and Bayesian probability theory

    Science.gov (United States)

    Bretthorst, G. Larry

    2013-08-01

    The problem of density estimation occurs in many disciplines. For example, in MRI it is often necessary to classify the types of tissues in an image. To perform this classification one must first identify the characteristics of the tissues to be classified. These characteristics might be the intensity of a T1-weighted image; in MRI, many other types of characteristic weightings (classifiers) may be generated. In a given tissue type there is no single intensity that characterizes the tissue; rather, there is a distribution of intensities. Often this distribution can be characterized by a Gaussian, but just as often it is much more complicated. Either way, estimating the distribution of intensities is an inference problem. In the case of a Gaussian distribution, one must estimate the mean and standard deviation. However, in the non-Gaussian case the shape of the density function itself must be inferred. Three common techniques for estimating density functions are binned histograms [1, 2], kernel density estimation [3, 4], and the maximum entropy method of moments [5, 6]. In the introduction, the maximum entropy method of moments will be reviewed, and some of its problems and the conditions under which it fails will be discussed. Then, in later sections, the functional form of the maximum entropy method of moments probability distribution will be incorporated into Bayesian probability theory. It will be shown that Bayesian probability theory solves all of the problems with the maximum entropy method of moments: one gets posterior probabilities for the Lagrange multipliers and, finally, one can put error bars on the resulting estimated density function.
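
    A minimal sketch of the moment-constrained maximum entropy step described above, assuming a bounded support discretized on a grid and low-order power moments; the Lagrange multipliers are found by minimizing the convex dual, and with only a mean and a second moment the recovered density is (a truncated) Gaussian. The Bayesian extension with posterior uncertainty on the multipliers is not shown.

```python
import numpy as np
from scipy.optimize import minimize

def maxent_density(moments, grid):
    """Maximum-entropy density on a grid matching power moments E[x^k], k = 1..K.
    p(x) is proportional to exp(-sum_k lam_k x^k); the multipliers minimize the convex dual
    log Z(lam) + sum_k lam_k mu_k, whose gradient vanishes when the moments are matched."""
    mu = np.asarray(moments, dtype=float)
    K = len(mu)
    powers = np.vstack([grid ** k for k in range(1, K + 1)])   # shape (K, n_grid)
    dx = grid[1] - grid[0]

    def dual(lam):
        logq = -(lam @ powers)                                 # unnormalized log density
        m = logq.max()                                         # stabilize the exponentials
        log_z = m + np.log(np.sum(np.exp(logq - m)) * dx)
        return log_z + lam @ mu

    lam = minimize(dual, np.zeros(K), method="Nelder-Mead").x  # derivative-free is fine for small K
    logq = -(lam @ powers)
    p = np.exp(logq - logq.max())
    return p / (np.sum(p) * dx)

# Recover a density on [-4, 4] from mean 0 and E[x^2] = 1.
grid = np.linspace(-4.0, 4.0, 801)
p = maxent_density([0.0, 1.0], grid)
dx = grid[1] - grid[0]
print(np.sum(grid * p) * dx, np.sum(grid ** 2 * p) * dx)      # should be close to 0 and 1
```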

  3. Psychophysics of the probability weighting function

    Science.gov (United States)

    Takahashi, Taiki

    2011-03-01

    A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics have widely utilized probability weighting functions, the psychophysical foundations of these functions have remained unknown. Notably, the behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p) = exp(-(-ln p)^α) (0 < α < 1), with w(1/e) = 1/e and w(1) = 1, which has extensively been studied in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
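
    For concreteness, a small sketch of the Prelec weighting function; the one-parameter form below (with the fixed point at 1/e) is the version quoted above, and the α value is illustrative.

```python
import numpy as np

def prelec_weight(p, alpha, beta=1.0):
    """Prelec (1998) probability weighting: w(p) = exp(-beta * (-ln p)**alpha) for 0 < p <= 1.
    With beta = 1, w(1/e) = 1/e and w(1) = 1; alpha < 1 gives the usual inverse-S shape."""
    p = np.asarray(p, dtype=float)
    return np.exp(-beta * (-np.log(p)) ** alpha)

ps = np.array([0.01, 0.1, 1.0 / np.e, 0.5, 0.9, 0.99])
print(prelec_weight(ps, alpha=0.65))   # small probabilities over-weighted, large ones under-weighted
```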

  4. Incidence Probability of Delayed Health Consequences of the Chernobyl Accident

    International Nuclear Information System (INIS)

    Abdel-Ghani, A.H.; El-Naggar, A.M.; El-Kadi, A.A.

    2000-01-01

    During the first international conference on the long-term consequences of the Chernobyl disaster in 1995 at Kiev, and also during the 1996 international conference at Vienna summing up the consequences of the Chernobyl accident, the data regarding delayed health consequences were mainly related to thyroid cancer, hereditary disorders, general morbidity, mortality and psychological disturbances. Contrary to expectations, the incidences of leukemia and soft tissue tumors were similar to the spontaneous incidence. The expected delayed effects among the accident survivors, the liquidators and populations resident in contaminated areas would, however, show a higher incidence probability of leukemia. These population groups have been continuously exposed to low-level radiation, both externally and internally. The new ICRP concept of radiation-induced detriment and the nominal probability coefficients for cancer and hereditary effects, for both workers and populations, are used as the rationale to calculate the incidence probability of occurrence of delayed health effects of the Chernobyl accident.

  5. Participatory Maps

    DEFF Research Database (Denmark)

    Salovaara-Moring, Inka

    2016-01-01

    practice. In particular, mapping environmental damage, endangered species, and human-made disasters has become one focal point for environmental knowledge production. This type of digital map has been highlighted as a processual turn in critical cartography, whereas in related computational journalism… of a geo-visualization within information mapping that enhances embodiment in the experience of the information. InfoAmazonia is defined as a digitally created map-space within which journalistic practice can be seen as dynamic, performative interactions between journalists, ecosystems, space, and species...

  6. THE BLACK HOLE FORMATION PROBABILITY

    Energy Technology Data Exchange (ETDEWEB)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D., E-mail: dclausen@tapir.caltech.edu [TAPIR, Walter Burke Institute for Theoretical Physics, California Institute of Technology, Mailcode 350-17, Pasadena, CA 91125 (United States)

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  7. THE BLACK HOLE FORMATION PROBABILITY

    International Nuclear Information System (INIS)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D.

    2015-01-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  8. The Black Hole Formation Probability

    Science.gov (United States)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D.

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  9. Tissue types (image)

    Science.gov (United States)

    There are 4 basic types of tissue: connective tissue, epithelial tissue, muscle tissue, and nervous tissue. Connective tissue supports other tissues and binds them together (bone, blood, and lymph tissues). Epithelial tissue provides a covering (skin, the linings of the ...

  10. Foundations of the theory of probability

    CERN Document Server

    Kolmogorov, AN

    2018-01-01

    This famous little book remains a foundational text for the understanding of probability theory, important both to students beginning a serious study of probability and to historians of modern mathematics. 1956 second edition.

  11. The Probability Distribution for a Biased Spinner

    Science.gov (United States)

    Foster, Colin

    2012-01-01

    This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)

  12. Conditional Probability Modulates Visual Search Efficiency

    Directory of Open Access Journals (Sweden)

    Bryan eCort

    2013-10-01

    Full Text Available We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.

  13. Analytic Neutrino Oscillation Probabilities in Matter: Revisited

    Energy Technology Data Exchange (ETDEWEB)

    Parke, Stephen J. [Fermilab; Denton, Peter B. [Copenhagen U.; Minakata, Hisakazu [Madrid, IFT

    2018-01-02

    We summarize our recent paper on neutrino oscillation probabilities in matter, explaining the importance, relevance and need for simple, highly accurate approximations to the neutrino oscillation probabilities in matter.
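
    The matter-corrected three-flavor expressions summarized in that paper are lengthy; purely as a point of reference, the sketch below evaluates the textbook two-flavor vacuum oscillation probability, not the matter formulas discussed in the paper. The baseline, energy and mixing parameters are illustrative.

```python
import numpy as np

def p_osc_two_flavor_vacuum(L_km, E_GeV, sin2_2theta=0.085, dm2_eV2=2.5e-3):
    """Two-flavor vacuum oscillation probability:
    P = sin^2(2*theta) * sin^2(1.267 * dm^2 [eV^2] * L [km] / E [GeV])."""
    return sin2_2theta * np.sin(1.267 * dm2_eV2 * L_km / E_GeV) ** 2

# A long-baseline-like configuration with illustrative numbers.
print(p_osc_two_flavor_vacuum(L_km=1300.0, E_GeV=2.5))
```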

  14. Void probability scaling in hadron nucleus interactions

    International Nuclear Information System (INIS)

    Ghosh, Dipak; Deb, Argha; Bhattacharyya, Swarnapratim; Ghosh, Jayita; Bandyopadhyay, Prabhat; Das, Rupa; Mukherjee, Sima

    2002-01-01

    Hegyi, while investigating the rapidity gap probability (which measures the chance of finding no particle in the pseudo-rapidity interval Δη), found that the scaling behavior of the rapidity gap probability has a close correspondence with the scaling of the void probability in galaxy correlation studies. The main aim of this paper is to study the scaling behavior of the rapidity gap probability.

  15. Concept Mapping

    Science.gov (United States)

    Technology & Learning, 2005

    2005-01-01

    Concept maps are graphical ways of working with ideas and presenting information. They reveal patterns and relationships and help students to clarify their thinking, and to process, organize and prioritize. Displaying information visually--in concept maps, word webs, or diagrams--stimulates creativity. Being able to think logically teaches…

  16. Saliency Detection via Absorbing Markov Chain With Learnt Transition Probability.

    Science.gov (United States)

    Lihe Zhang; Jianwu Ai; Bowen Jiang; Huchuan Lu; Xiukui Li

    2018-02-01

    In this paper, we propose a bottom-up saliency model based on an absorbing Markov chain (AMC). First, a sparsely connected graph is constructed to capture the local context information of each node. All image boundary nodes are treated as the absorbing nodes, and all other nodes as the transient nodes, in the absorbing Markov chain. Then, the expected number of visits from each transient node to all other transient nodes can be used to represent the saliency value of that node. The absorbed time depends on the weights on the path and their spatial coordinates, which are completely encoded in the transition probability matrix. Considering the importance of this matrix, we adopt different hierarchies of deep features extracted from fully convolutional networks and learn a transition probability matrix, which is called the learnt transition probability matrix. Although performance is significantly improved, salient objects are not always uniformly highlighted. To solve this problem, an angular embedding technique is investigated to refine the saliency results. Based on pairwise local orderings, which are produced by the saliency maps of AMC and boundary maps, we rearrange the global orderings (saliency values) of all nodes. Extensive experiments demonstrate that the proposed algorithm outperforms the state-of-the-art methods on six publicly available benchmark data sets.
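
    A minimal sketch of the absorbing-Markov-chain core of this approach, with hand-made numbers: boundary nodes are absorbing, the absorbed time of each transient node comes from the fundamental matrix N = (I − Q)⁻¹, and longer absorbed times are read as higher saliency. The learnt, deep-feature-based transition matrix and the angular-embedding refinement from the paper are not reproduced here.

```python
import numpy as np

def absorbed_times(P, transient_idx):
    """Expected number of steps before absorption for each transient node of an absorbing Markov chain.
    P is row-stochastic; Q is its transient-to-transient block; t = (I - Q)^{-1} @ 1."""
    Q = P[np.ix_(transient_idx, transient_idx)]
    N = np.linalg.inv(np.eye(len(transient_idx)) - Q)   # fundamental matrix
    return N @ np.ones(len(transient_idx))

# Toy graph: nodes 0-2 are interior superpixels (transient), node 3 is the image boundary (absorbing).
P = np.array([
    [0.1, 0.6, 0.2, 0.1],
    [0.3, 0.1, 0.3, 0.3],
    [0.1, 0.2, 0.1, 0.6],
    [0.0, 0.0, 0.0, 1.0],
])
t = absorbed_times(P, transient_idx=[0, 1, 2])
saliency = (t - t.min()) / (t.max() - t.min())          # normalize to [0, 1]
print(t, saliency)
```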

  17. Pre-Service Teachers' Conceptions of Probability

    Science.gov (United States)

    Odafe, Victor U.

    2011-01-01

    Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…

  18. Using Playing Cards to Differentiate Probability Interpretations

    Science.gov (United States)

    López Puga, Jorge

    2014-01-01

    The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.

  19. Dependent Human Error Probability Assessment

    International Nuclear Information System (INIS)

    Simic, Z.; Mikulicic, V.; Vukovic, I.

    2006-01-01

    This paper presents an assessment of the dependence between dynamic operator actions modeled in a Nuclear Power Plant (NPP) PRA and an estimate of the associated impact on core damage frequency (CDF). This assessment was done to improve the implementation of HEP dependencies in the existing PRA. All of the dynamic operator actions modeled in the NPP PRA are included in this assessment. Determining the level of HEP dependence and the associated influence on CDF are the major steps of this assessment. A decision on how to apply the results, i.e., whether permanent HEP model changes should be made, is based on the resulting relative CDF increase. A CDF increase was selected as a threshold based on the NPP base CDF value and acceptance guidelines from Regulatory Guide 1.174. HEP dependencies resulting in a CDF increase of > 5E-07 would be considered potential candidates for specific incorporation into the baseline model. The approach used to judge the level of dependence between operator actions is based on the dependency level categories and conditional probabilities developed in the Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications, NUREG/CR-1278. To simplify the process, NUREG/CR-1278 identifies five levels of dependence: ZD (zero dependence), LD (low dependence), MD (moderate dependence), HD (high dependence), and CD (complete dependence). NUREG/CR-1278 also identifies several qualitative factors that could be involved in determining the level of dependence. Based on the NUREG/CR-1278 information, Time, Function, and Spatial attributes were judged to be the most important considerations when determining the level of dependence between operator actions within an accident sequence. These attributes were used to develop qualitative criteria (rules) that were used to judge the level of dependence (CD, HD, MD, LD, ZD) between the operator actions. After the level of dependence between the various HEPs is judged, quantitative values associated with the
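
    The five dependence levels referenced above come with standard conditional-probability equations in NUREG/CR-1278 (THERP). A minimal sketch, assuming the nominal independent HEP of the dependent action is already known:

```python
def conditional_hep(nominal_hep, level):
    """Conditional HEP of a dependent action given failure of the preceding one,
    using the NUREG/CR-1278 (THERP) dependence equations."""
    rules = {
        "ZD": lambda p: p,                     # zero dependence
        "LD": lambda p: (1 + 19 * p) / 20,     # low dependence
        "MD": lambda p: (1 + 6 * p) / 7,       # moderate dependence
        "HD": lambda p: (1 + p) / 2,           # high dependence
        "CD": lambda p: 1.0,                   # complete dependence
    }
    return rules[level](nominal_hep)

# A nominal HEP of 1E-3 under each dependence level.
for level in ("ZD", "LD", "MD", "HD", "CD"):
    print(level, conditional_hep(1e-3, level))
```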

  20. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2014-01-01

    The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability t

  1. Probability of Failure in Random Vibration

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    1988-01-01

    Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out-crossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval and thus for the first-passage probability...

  2. An Objective Theory of Probability (Routledge Revivals)

    CERN Document Server

    Gillies, Donald

    2012-01-01

    This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axioma

  3. Paraconsistent Probabilities: Consistency, Contradictions and Bayes’ Theorem

    Directory of Open Access Journals (Sweden)

    Juliana Bueno-Soler

    2016-09-01

    Full Text Available This paper represents the first steps towards constructing a paraconsistent theory of probability based on the Logics of Formal Inconsistency (LFIs. We show that LFIs encode very naturally an extension of the notion of probability able to express sophisticated probabilistic reasoning under contradictions employing appropriate notions of conditional probability and paraconsistent updating, via a version of Bayes’ theorem for conditionalization. We argue that the dissimilarity between the notions of inconsistency and contradiction, one of the pillars of LFIs, plays a central role in our extended notion of probability. Some critical historical and conceptual points about probability theory are also reviewed.

  4. Tissue Classification

    DEFF Research Database (Denmark)

    Van Leemput, Koen; Puonti, Oula

    2015-01-01

    Computational methods for automatically segmenting magnetic resonance images of the brain have seen tremendous advances in recent years. So-called tissue classification techniques, aimed at extracting the three main brain tissue classes (white matter, gray matter, and cerebrospinal fluid), are now well established. In their simplest form, these methods classify voxels independently based on their intensity alone, although much more sophisticated models are typically used in practice. This article aims to give an overview of often-used computational techniques for brain tissue classification...
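
    In their simplest, intensity-only form such classifiers amount to fitting a mixture model to the voxel intensities. A minimal sketch with a three-class Gaussian mixture on synthetic T1-like intensities; real pipelines add atlas/tissue-probability-map priors, bias-field correction and spatial regularization, none of which are shown here.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def classify_tissues(intensities, n_classes=3, seed=0):
    """Intensity-only tissue classification: fit a Gaussian mixture to brain-voxel intensities and
    return per-voxel posterior probabilities (soft segmentation) plus hard labels.
    Components are sorted by mean so that 0/1/2 roughly map to CSF/GM/WM on a T1-weighted image."""
    x = np.asarray(intensities, dtype=float).reshape(-1, 1)
    gmm = GaussianMixture(n_components=n_classes, random_state=seed).fit(x)
    order = np.argsort(gmm.means_.ravel())
    posteriors = gmm.predict_proba(x)[:, order]
    return posteriors, posteriors.argmax(axis=1)

# Synthetic T1-like intensities for CSF, gray matter and white matter (illustrative means/spreads).
rng = np.random.default_rng(0)
voxels = np.concatenate([rng.normal(m, s, n) for m, s, n in
                         [(30.0, 8.0, 2000), (70.0, 9.0, 5000), (110.0, 10.0, 4000)]])
posteriors, labels = classify_tissues(voxels)
print(np.bincount(labels))
```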

  5. Mapping racism.

    Science.gov (United States)

    Moss, Donald B

    2006-01-01

    The author uses the metaphor of mapping to illuminate a structural feature of racist thought, locating the degraded object along vertical and horizontal axes. These axes establish coordinates of hierarchy and of distance. With the coordinates in place, racist thought begins to seem grounded in natural processes. The other's identity becomes consolidated, and parochialism results. The use of this kind of mapping is illustrated via two patient vignettes. The author presents Freud's (1905, 1927) views in relation to such a "mapping" process, as well as Adorno's (1951) and Baldwin's (1965). Finally, the author conceptualizes the crucial status of primitivity in the workings of racist thought.

  6. Regularizing mappings of Lévy measures

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler; Thorbjørnsen, Steen

    2006-01-01

    the class of selfdecomposable laws onto the so-called Thorin class. Further, partly motivated by our previous studies of infinite divisibility in free probability, we introduce a one-parameter family of one-to-one mappings, which interpolates smoothly between (α = 0) and the identity mapping on (α = 1…). We prove that each of the mappings shares many of the properties of . In particular, they are representable in terms of stochastic integrals with respect to associated Lévy processes....

  7. A hydroclimatological approach to predicting regional landslide probability using Landlab

    Science.gov (United States)

    Strauch, Ronda; Istanbulluoglu, Erkan; Nudurupati, Sai Siddhartha; Bandaragoda, Christina; Gasparini, Nicole M.; Tucker, Gregory E.

    2018-02-01

    We develop a hydroclimatological approach to the modeling of regional shallow landslide initiation that integrates spatial and temporal dimensions of parameter uncertainty to estimate an annual probability of landslide initiation based on Monte Carlo simulations. The physically based model couples the infinite-slope stability model with a steady-state subsurface flow representation and operates in a digital elevation model. Spatially distributed gridded data for soil properties and vegetation classification are used for parameter estimation of probability distributions that characterize model input uncertainty. Hydrologic forcing to the model is through annual maximum daily recharge to subsurface flow obtained from a macroscale hydrologic model. We demonstrate the model in a steep mountainous region in northern Washington, USA, over 2700 km2. The influence of soil depth on the probability of landslide initiation is investigated through comparisons among model output produced using three different soil depth scenarios reflecting the uncertainty of soil depth and its potential long-term variability. We found elevation-dependent patterns in probability of landslide initiation that showed the stabilizing effects of forests at low elevations, an increased landslide probability with forest decline at mid-elevations (1400 to 2400 m), and soil limitation and steep topographic controls at high alpine elevations and in post-glacial landscapes. These dominant controls manifest themselves in a bimodal distribution of spatial annual landslide probability. Model testing with limited observations revealed similarly moderate model confidence for the three hazard maps, suggesting suitable use as relative hazard products. The model is available as a component in Landlab, an open-source, Python-based landscape earth systems modeling environment, and is designed to be easily reproduced utilizing HydroShare cyberinfrastructure.
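
    The Monte Carlo core of such a model can be sketched as below, assuming the classic infinite-slope factor of safety with slope-parallel seepage and illustrative parameter distributions; this is not the Landlab component itself, whose parameterization and recharge forcing differ.

```python
import numpy as np

def landslide_probability(slope_deg, soil_depth_m, n=100000, seed=0):
    """Probability of shallow landslide initiation at one grid cell, estimated as P(FS < 1)
    under Monte Carlo sampling of cohesion, friction angle and relative wetness, with the
    infinite-slope factor of safety for slope-parallel seepage (illustrative parameters)."""
    rng = np.random.default_rng(seed)
    theta = np.radians(slope_deg)
    gamma_s, gamma_w = 18.0e3, 9.81e3                 # unit weights of soil and water (N/m^3)
    c = rng.uniform(2.0e3, 10.0e3, n)                 # effective cohesion incl. roots (Pa)
    phi = np.radians(rng.uniform(28.0, 40.0, n))      # effective friction angle
    wetness = rng.beta(2.0, 3.0, n)                   # water-table height / soil depth, in [0, 1]
    z = soil_depth_m
    h_w = wetness * z
    fs = (c + (gamma_s * z - gamma_w * h_w) * np.cos(theta) ** 2 * np.tan(phi)) / \
         (gamma_s * z * np.sin(theta) * np.cos(theta))
    return float(np.mean(fs < 1.0))

for slope in (20.0, 30.0, 40.0):
    print(slope, landslide_probability(slope, soil_depth_m=1.5))
```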

  8. A hydroclimatological approach to predicting regional landslide probability using Landlab

    Directory of Open Access Journals (Sweden)

    R. Strauch

    2018-02-01

    Full Text Available We develop a hydroclimatological approach to the modeling of regional shallow landslide initiation that integrates spatial and temporal dimensions of parameter uncertainty to estimate an annual probability of landslide initiation based on Monte Carlo simulations. The physically based model couples the infinite-slope stability model with a steady-state subsurface flow representation and operates in a digital elevation model. Spatially distributed gridded data for soil properties and vegetation classification are used for parameter estimation of probability distributions that characterize model input uncertainty. Hydrologic forcing to the model is through annual maximum daily recharge to subsurface flow obtained from a macroscale hydrologic model. We demonstrate the model in a steep mountainous region in northern Washington, USA, over 2700 km2. The influence of soil depth on the probability of landslide initiation is investigated through comparisons among model output produced using three different soil depth scenarios reflecting the uncertainty of soil depth and its potential long-term variability. We found elevation-dependent patterns in probability of landslide initiation that showed the stabilizing effects of forests at low elevations, an increased landslide probability with forest decline at mid-elevations (1400 to 2400 m), and soil limitation and steep topographic controls at high alpine elevations and in post-glacial landscapes. These dominant controls manifest themselves in a bimodal distribution of spatial annual landslide probability. Model testing with limited observations revealed similarly moderate model confidence for the three hazard maps, suggesting suitable use as relative hazard products. The model is available as a component in Landlab, an open-source, Python-based landscape earth systems modeling environment, and is designed to be easily reproduced utilizing HydroShare cyberinfrastructure.

  9. Evolution of an array of elements with logistic transition probability

    International Nuclear Information System (INIS)

    Majernik, Vladimir; Surda, Anton

    1996-01-01

    The paper addresses the problem of how the state of an array of elements changes if the transition probabilities of its elements are chosen in the form of a logistic map. This problem leads to a special type of discrete-time Markov chain, which we simulated numerically for different transition probabilities and numbers of elements in the array. We show that the time evolution of the array exhibits a wide range of behavior depending on the value of the total number of its elements and on the logistic constant a. We point out that this problem can be applied to the description of a spin system with a certain type of mean field and of multispecies ecosystems with an internal noise. (authors)

  10. METHOD OF FOREST FIRES PROBABILITY ASSESSMENT WITH POISSON LAW

    Directory of Open Access Journals (Sweden)

    A. S. Plotnikova

    2016-01-01

    Full Text Available The article describes a method for estimating forest fire burn probability on the basis of the Poisson distribution. The λ parameter is assumed to be the mean daily number of fires detected for each Forest Fire Danger Index class within a specific period of time; thus, λ was calculated separately for the spring, summer and autumn seasons. Multi-annual daily Forest Fire Danger Index values together with an EO-derived hot spot map were the input data for the statistical analysis. The major result of the study is the generation of a database of forest fire burn probability. Results were validated against EO daily data on forest fires detected over Irkutsk oblast in 2013. The daily weighted average probability was shown to be linked with the daily number of detected forest fires. Meanwhile, a number of fires developed when the estimated probability was low. A possible explanation of this phenomenon is provided.
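
    The probability estimate itself is a one-line consequence of the Poisson assumption: with λ the mean daily number of detected fires for a danger class and season, the daily probability of at least one fire is 1 − exp(−λ). A minimal sketch with made-up counts:

```python
import numpy as np

def daily_fire_probability(daily_counts_by_class):
    """For each Forest Fire Danger Index class, estimate the Poisson rate lambda as the mean
    daily number of detected fires and return P(at least one fire per day) = 1 - exp(-lambda)."""
    return {ffdi_class: 1.0 - np.exp(-float(np.mean(counts)))
            for ffdi_class, counts in daily_counts_by_class.items()}

# Hypothetical summer-season hot-spot counts per danger class (not the paper's data).
counts = {"I": [0, 0, 0, 1, 0], "III": [0, 1, 2, 0, 1], "V": [3, 5, 2, 4, 6]}
print(daily_fire_probability(counts))
```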

  11. Genetic Mapping

    Science.gov (United States)

    ... greatly advanced genetics research. The improved quality of genetic data has reduced the time required to identify a ... cases, a matter of months or even weeks. Genetic mapping data generated by the HGP's laboratories is freely accessible ...

  12. Three-dimensional multislice spiral computed tomographic angiography: a potentially useful tool for safer free tissue transfer to complicated regions

    DEFF Research Database (Denmark)

    Demirtas, Yener; Cifci, Mehmet; Kelahmetoglu, Osman

    2009-01-01

    Three-dimensional multislice spiral computed tomographic angiography (3D-MSCTA) is a minimally invasive method of vascular mapping. The aim of this study was to evaluate the clinical usefulness of this imaging technique in delineating the recipient vessels for safer free tissue transfer to complicated regions… be kept in mind, especially in the patients with peripheral vascular disease. 3D-MSCTA has the potential to replace digital subtraction angiography for planning of microvascular reconstructions, and newer devices with higher resolutions will probably increase the reliability of this technique. (c) 2009...

  13. Systematic variations in multi-spectral lidar representations of canopy height profiles and gap probability

    Science.gov (United States)

    Chasmer, L.; Hopkinson, C.; Gynan, C.; Mahoney, C.; Sitar, M.

    2015-12-01

    Airborne and terrestrial lidar are increasingly used in forest attribute modeling for carbon, ecosystem and resource monitoring. The near infra-red wavelength at 1064nm has been utilised most in airborne applications due to, for example, diode manufacture costs, surface reflectance and eye safety. Foliage reflects well at 1064nm and most of the literature on airborne lidar forest structure is based on data from this wavelength. However, lidar systems also operate at wavelengths further from the visible spectrum (e.g. 1550nm) for eye safety reasons. This corresponds to a water absorption band and can be sensitive to attenuation if surfaces contain moisture. Alternatively, some systems operate in the visible range (e.g. 532nm) for specialised applications requiring simultaneous mapping of terrestrial and bathymetric surfaces. All these wavelengths provide analogous 3D canopy structure reconstructions and thus offer the potential to be combined for spatial comparisons or temporal monitoring. However, a systematic comparison of wavelength-dependent foliage profile and gap probability (index of transmittance) is needed. Here we report on two multispectral lidar missions carried out in 2013 and 2015 over conifer, deciduous and mixed stands in Ontario, Canada. The first used separate lidar sensors acquiring comparable data at three wavelengths, while the second used a single sensor with 3 integrated laser systems. In both cases, wavelengths sampled were 532nm, 1064nm and 1550nm. The experiment revealed significant differences in proportions of returns at ground level, the vertical foliage distribution and gap probability across wavelengths. Canopy attenuation was greatest at 532nm due to photosynthetic plant tissue absorption. Relative to 1064nm, foliage was systematically undersampled at the 10% to 60% height percentiles at both 1550nm and 532nm (this was confirmed with coincident terrestrial lidar data). When using all returns to calculate gap probability, all
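
    As a reference for the gap-probability comparison, the sketch below computes a simple discrete-return approximation: the cumulative fraction of returns originating at or below each height, with the near-ground fraction taken as the canopy gap probability. This single-return approximation is far cruder than waveform- or ratio-based estimators used in practice, and the numbers are synthetic.

```python
import numpy as np

def gap_probability_profile(return_heights, ground_threshold=0.5, n_levels=25):
    """Vertical gap-probability profile from discrete lidar return heights (m above ground):
    P_gap(z) = fraction of returns at or below height z; the fraction of near-ground returns
    is taken as the overall canopy gap probability (single-return approximation)."""
    h = np.asarray(return_heights, dtype=float)
    levels = np.linspace(0.0, h.max(), n_levels)
    profile = np.array([(h <= z).mean() for z in levels])
    p_gap_canopy = float((h <= ground_threshold).mean())
    return levels, profile, p_gap_canopy

# Synthetic mix of ground and canopy returns, e.g. for one wavelength of a multispectral survey.
rng = np.random.default_rng(1)
heights = np.concatenate([rng.uniform(0.0, 0.3, 300),                        # ground returns
                          np.clip(rng.normal(14.0, 4.0, 700), 0.0, None)])   # canopy returns
levels, profile, p_gap = gap_probability_profile(heights)
print(p_gap)
```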

  14. Probability concepts in quality risk management.

    Science.gov (United States)

    Claycamp, H Gregg

    2012-01-01

    Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although risk is generally a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management tools are relatively silent on the meaning and uses of "probability." The probability concept is typically applied by risk managers as a combination of frequency-based calculation and a "degree of belief" meaning of probability. Probability as a concept that is crucial for understanding and managing risk is discussed through examples from the most general, scenario-defining and ranking tools that use probability implicitly to more specific probabilistic tools in risk management. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management. Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although "risk" generally describes a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management methodologies and respective tools focus on managing severity but are relatively silent on the in-depth meaning and uses of "probability." Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle from drug development to manufacture, marketing to product discontinuation. A probability concept is typically applied by risk managers as a combination of data-based measures of probability and a subjective "degree of belief" meaning of probability. Probability as

  15. Transition probability spaces in loop quantum gravity

    Science.gov (United States)

    Guo, Xiao-Kan

    2018-03-01

    We study the (generalized) transition probability spaces, in the sense of Mielnik and Cantoni, for spacetime quantum states in loop quantum gravity. First, we show that loop quantum gravity admits the structures of transition probability spaces. This is exemplified by first checking such structures in covariant quantum mechanics and then identifying the transition probability spaces in spin foam models via a simplified version of general boundary formulation. The transition probability space thus defined gives a simple way to reconstruct the discrete analog of the Hilbert space of the canonical theory and the relevant quantum logical structures. Second, we show that the transition probability space and in particular the spin foam model are 2-categories. Then we discuss how to realize in spin foam models two proposals by Crane about the mathematical structures of quantum gravity, namely, the quantum topos and causal sites. We conclude that transition probability spaces provide us with an alternative framework to understand various foundational questions of loop quantum gravity.

  16. UT Biomedical Informatics Lab (BMIL) probability wheel

    Science.gov (United States)

    Huang, Sheng-Cheng; Lee, Sara; Wang, Allen; Cantor, Scott B.; Sun, Clement; Fan, Kaili; Reece, Gregory P.; Kim, Min Soon; Markey, Mia K.

    A probability wheel app is intended to facilitate communication between two people, an "investigator" and a "participant", about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.

  17. A probability space for quantum models

    Science.gov (United States)

    Lemmens, L. F.

    2017-06-01

    A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes and probabilities defined for all events. A reformulation in terms of propositions allows one to use the maximum entropy method to assign the probabilities taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, choosing the constraints and making the probability assignment by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.

  18. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2005-01-01

    This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study.
    * Good and solid introduction to probability theory and stochastic processes
    * Logically organized; writing is presented in a clear manner
    * Choice of topics is comprehensive within the area of probability
    * Ample homework problems are organized into chapter sections

  19. Striatal activity is modulated by target probability.

    Science.gov (United States)

    Hon, Nicholas

    2017-06-14

    Target probability has well-known neural effects. In the brain, target probability is known to affect frontal activity, with lower probability targets producing more prefrontal activation than those that occur with higher probability. Although the effect of target probability on cortical activity is well specified, its effect on subcortical structures such as the striatum is less well understood. Here, I examined this issue and found that the striatum was highly responsive to target probability. This is consistent with its hypothesized role in the gating of salient information into higher-order task representations. The current data are interpreted in light of the fact that different components of the striatum are sensitive to different types of task-relevant information.

  20. Defining Probability in Sex Offender Risk Assessment.

    Science.gov (United States)

    Elwood, Richard W

    2016-12-01

    There is ongoing debate and confusion over using actuarial scales to predict individuals' risk of sexual recidivism. Much of the debate comes from not distinguishing Frequentist from Bayesian definitions of probability. Much of the confusion comes from applying Frequentist probability to individuals' risk. By definition, only Bayesian probability can be applied to the single case. The Bayesian concept of probability resolves most of the confusion and much of the debate in sex offender risk assessment. Although Bayesian probability is well accepted in risk assessment generally, it has not been widely used to assess the risk of sex offenders. I review the two concepts of probability and show how the Bayesian view alone provides a coherent scheme to conceptualize individuals' risk of sexual recidivism.

  1. Spatial probability aids visual stimulus discrimination

    Directory of Open Access Journals (Sweden)

    Michael Druker

    2010-08-01

    Full Text Available We investigated whether the statistical predictability of a target's location would influence how quickly and accurately it was classified. Recent results have suggested that spatial probability can be a cue for the allocation of attention in visual search. One explanation for probability cuing is spatial repetition priming. In our two experiments we used probability distributions that were continuous across the display rather than relying on a few arbitrary screen locations. This produced fewer spatial repeats and allowed us to dissociate the effect of a high probability location from that of short-term spatial repetition. The task required participants to quickly judge the color of a single dot presented on a computer screen. In Experiment 1, targets were more probable in an off-center hotspot of high probability that gradually declined to a background rate. Targets garnered faster responses if they were near earlier target locations (priming) and if they were near the high probability hotspot (probability cuing). In Experiment 2, target locations were chosen on three concentric circles around fixation. One circle contained 80% of targets. The value of this ring distribution is that it allowed for a spatially restricted high probability zone in which sequentially repeated trials were not likely to be physically close. Participant performance was sensitive to the high-probability circle in addition to the expected effects of eccentricity and the distance to recent targets. These two experiments suggest that inhomogeneities in spatial probability can be learned and used by participants on-line and without prompting as an aid for visual stimulus discrimination, and that spatial repetition priming is not a sufficient explanation for this effect. Future models of attention should consider explicitly incorporating the probabilities of target locations and features.

  2. Is probability of frequency too narrow?

    International Nuclear Information System (INIS)

    Martz, H.F.

    1993-01-01

    Modern methods of statistical data analysis, such as empirical and hierarchical Bayesian methods, should find increasing use in future Probabilistic Risk Assessment (PRA) applications. In addition, there will be a more formalized use of expert judgment in future PRAs. These methods require an extension of the probabilistic framework of PRA, in particular, the popular notion of probability of frequency, to consideration of frequency of frequency, frequency of probability, and probability of probability. The genesis, interpretation, and examples of these three extended notions are discussed

  3. Osteosarcoma: correlation of T1 map and histology map

    International Nuclear Information System (INIS)

    Suh, Jin Suck; Yun, Mi Jin; Jeong, Eun Kee; Shin, Kyoo Ho; Yang, Woo Ick

    1999-01-01

    To determine whether T1 mapping shows regional differences between viable and necrotic regions of osteosarcomas after anticancer chemotherapy, and to assess whether this mapping is able to express the characteristics of various intratumoral tissue components. Eleven of 20 osteosarcomas were included in this study, while the remaining nine were excluded because the tumor site was inappropriate for comparison of the T1 map and the tumor macrosection. All patients underwent MR imaging for the purpose of T1 mapping, followed by pre-operative chemotherapy and subsequent limb-salvage surgery. Spin echo pulse sequencing was used with varying TR (100, 200, 400, 800, 1600, and 2400 msec) and a constant TE of 20 msec. Using a C-language software program, T1 relaxation time was calculated on a pixel-by-pixel basis, and a T1 map was then generated with a post-processing program, NIH Image. We attempted to correlate the T1 map with the histologic findings, particularly in regions of interest (ROI) where certain areas differed from other regions on either the T1 or the histologic map. Values were expressed as the average ratio of the T1 of the ROI to the T1 of fat tissue, which was used as an internal reference for normalization of the measurement. Tumor necrosis was 100% (Grade IV) in six specimens, and over 90% (Grade III) in five. Viable tumor cells were found mostly in regions with chondroid matrix and seldom in regions with osteoid matrix. Regardless of cell viability, values ranged from 0.9 to 9.87 (mean, 4.02) in tumor necrotic areas with osteoid matrices, and from 3.04 to 3.9 (mean, 3.55) in areas with chondroid matrices. Other regions with fibrous tissue proliferation, hemorrhage, and fatty necrosis showed values of 2.92-9.83 (mean, 7.20), 2.65-5.96 (mean, 3.59), and 1.43-3.11 (mean, 2.68), respectively. The values of the various tissues overlapped. No statistically significant difference was found between regions in which tumors were viable and those with tumor necrosis. Although we hypothesized
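
    The pixel-by-pixel T1 calculation described above can be sketched as a saturation-recovery fit over the variable-TR spin-echo series; the signal model and parameter starting values below are assumptions for illustration, not the authors' exact implementation.

```python
import numpy as np
from scipy.optimize import curve_fit

TRS = np.array([100.0, 200.0, 400.0, 800.0, 1600.0, 2400.0])   # repetition times (msec), as in the study

def sr_signal(tr, s0, t1):
    """Saturation-recovery signal for a spin-echo series with varying TR and fixed TE:
    S(TR) = S0 * (1 - exp(-TR / T1)); the constant TE/T2 factor is absorbed into S0."""
    return s0 * (1.0 - np.exp(-tr / t1))

def fit_t1_map(images):
    """Pixel-by-pixel T1 fit; `images` has shape (n_TR, ny, nx), one image per TR."""
    _, ny, nx = images.shape
    t1_map = np.full((ny, nx), np.nan)
    for i in range(ny):
        for j in range(nx):
            try:
                (_, t1), _ = curve_fit(sr_signal, TRS, images[:, i, j],
                                       p0=(images[:, i, j].max(), 800.0), maxfev=2000)
                t1_map[i, j] = t1
            except RuntimeError:
                pass                                            # leave non-converged pixels as NaN
    return t1_map

# Synthetic two-pixel example: a short-T1 (fat-like) voxel and a longer-T1 voxel, plus noise.
rng = np.random.default_rng(0)
true_t1 = np.array([[260.0, 1100.0]])
images = sr_signal(TRS[:, None, None], 1000.0, true_t1[None, :, :]) + rng.normal(0.0, 5.0, (len(TRS), 1, 2))
print(fit_t1_map(images))
```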

  4. Seismic hazard maps for Haiti

    Science.gov (United States)

    Frankel, Arthur; Harmsen, Stephen; Mueller, Charles; Calais, Eric; Haase, Jennifer

    2011-01-01

    We have produced probabilistic seismic hazard maps of Haiti for peak ground acceleration and response spectral accelerations that include the hazard from the major crustal faults, subduction zones, and background earthquakes. The hazard from the Enriquillo-Plantain Garden, Septentrional, and Matheux-Neiba fault zones was estimated using fault slip rates determined from GPS measurements. The hazard from the subduction zones along the northern and southeastern coasts of Hispaniola was calculated from slip rates derived from GPS data and the overall plate motion. Hazard maps were made for a firm-rock site condition and for a grid of shallow shear-wave velocities estimated from topographic slope. The maps show substantial hazard throughout Haiti, with the highest hazard in Haiti along the Enriquillo-Plantain Garden and Septentrional fault zones. The Matheux-Neiba Fault exhibits high hazard in the maps for 2% probability of exceedance in 50 years, although its slip rate is poorly constrained.

  5. Digital karyotyping reveals probable target genes at 7q21.3 locus in hepatocellular carcinoma

    Directory of Open Access Journals (Sweden)

    Wang Shengyue

    2011-07-01

    Full Text Available Abstract Background Hepatocellular carcinoma (HCC) is a worldwide malignant liver tumor with high incidence in China. Subchromosomal amplifications and deletions accounted for the major genomic alterations occurring in HCC. Digital karyotyping was an effective method for analyzing genome-wide chromosomal aberrations at high resolution. Methods A digital karyotyping library of HCC was constructed, and the 454 Genome Sequencer FLX System (Roche) was applied for large-scale sequencing of the library. Digital Karyotyping Data Viewer software was used to analyze genomic amplifications and deletions. Genomic amplifications of genes detected by digital karyotyping were examined by real-time quantitative PCR. The mRNA expression levels of these genes in tumorous and paired nontumorous tissues were also detected by real-time quantitative RT-PCR. Results A total of 821,252 genomic tags were obtained from the digital karyotyping library of HCC, with 529,162 tags (64%) mapped to unique loci of the human genome. Multiple subchromosomal amplifications and deletions were detected through analysis of the digital karyotyping data, among which the amplification of 7q21.3 drew our special attention. Validation of genes harbored within amplicons at the 7q21.3 locus revealed that genomic amplification of SGCE, PEG10, DYNC1I1 and SLC25A13 occurred in 11 (21%), 11 (21%), 11 (21%) and 23 (44%) of the 52 HCC samples, respectively. Furthermore, the mRNA expression levels of SGCE, PEG10 and DYNC1I1 were significantly up-regulated in tumorous liver tissues compared with the corresponding nontumorous counterparts. Conclusions Our results indicated that the subchromosomal region 7q21.3 was amplified in HCC, and that SGCE, PEG10 and DYNC1I1 were probable protooncogenes located within the 7q21.3 locus.

  6. Accounting for access costs in validation of soil maps

    NARCIS (Netherlands)

    Yang, Lin; Brus, Dick J.; Zhu, A.X.; Li, Xinming; Shi, Jingjing

    2018-01-01

    The quality of soil maps can best be estimated by collecting additional data at locations selected by probability sampling. These data can be used in design-based estimation of map quality measures such as the population mean of the squared prediction errors (MSE) for continuous soil maps and

  7. Modeling highway travel time distribution with conditional probability models

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira Neto, Francisco Moraes [ORNL; Chin, Shih-Miao [ORNL; Hwang, Ho-Ling [ORNL; Han, Lee [University of Tennessee, Knoxville (UTK)

    2014-01-01

    Under the sponsorship of the Federal Highway Administration's Office of Freight Management and Operations, the American Transportation Research Institute (ATRI) has developed performance measures through the Freight Performance Measures (FPM) initiative. Under this program, travel speed information is derived from data collected using wireless-based global positioning systems. These telemetric data systems are subscribed to and used by the trucking industry as an operations management tool. More than one telemetric operator submits data dumps to ATRI on a regular basis. Each data transmission contains the truck location, its travel time, and a clock time/date stamp. Data from the FPM program provide a unique opportunity for studying upstream-downstream speed distributions at different locations, as well as at different times of the day and days of the week. This research is focused on the stochastic nature of successive link travel speed data on the continental United States Interstate network. Specifically, a method to estimate route probability distributions of travel time is proposed. This method uses the concepts of convolution of probability distributions and bivariate, link-to-link, conditional probability to estimate the expected distributions for the route travel time. A major contribution of this study is the consideration of speed correlation between upstream and downstream contiguous Interstate segments through conditional probability. The established conditional probability distributions between successive segments can be used to provide travel time reliability measures. This study also suggests an adaptive method for calculating and updating route travel time distributions as new data or information are added. This methodology can be useful for estimating performance measures as required by the recent Moving Ahead for Progress in the 21st Century Act (MAP-21).
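
    The route-level construction can be illustrated with a short sketch: a marginal distribution for the first link is combined with a link-to-link conditional distribution for the next link on a discretized time grid, and the route travel-time distribution follows by summing over all combinations. The bins and probabilities below are hypothetical placeholders, not FPM data.

        # Sketch: route travel-time distribution from a marginal for link 1 and a
        # conditional (link-to-link) distribution for link 2, on discrete time bins.
        import numpy as np

        bins = np.arange(1, 6)                           # hypothetical travel-time bins (minutes)
        p_t1 = np.array([0.10, 0.30, 0.40, 0.15, 0.05])  # P(T1 = bins[i])

        # P(T2 = bins[j] | T1 = bins[i]); each row conditions on T1 and captures
        # the upstream-downstream speed correlation.
        p_t2_given_t1 = np.array([
            [0.50, 0.30, 0.10, 0.07, 0.03],
            [0.20, 0.50, 0.20, 0.07, 0.03],
            [0.10, 0.20, 0.50, 0.15, 0.05],
            [0.05, 0.10, 0.30, 0.45, 0.10],
            [0.03, 0.07, 0.20, 0.30, 0.40],
        ])

        # Route time R = T1 + T2: accumulate P(T1 = i) * P(T2 = j | T1 = i) over all (i, j).
        route = {}
        for i, t1 in enumerate(bins):
            for j, t2 in enumerate(bins):
                route[t1 + t2] = route.get(t1 + t2, 0.0) + p_t1[i] * p_t2_given_t1[i, j]

        for total in sorted(route):
            print(f"P(route time = {total} min) = {route[total]:.3f}")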

  8. Tissue-specific expression of the human laminin alpha5-chain, and mapping of the gene to human chromosome 20q13.2-13.3 and to distal mouse chromosome 2 near the locus for the ragged (Ra) mutation

    DEFF Research Database (Denmark)

    Durkin, M E; Loechel, F; Mattei, M G

    1997-01-01

    , heart, lung, skeletal muscle, kidney, and pancreas. The human laminin alpha5-chain gene (LAMA5) was assigned to chromosome 20q13.2-q13.3 by in situ hybridization, and the mouse gene (Lama5) was mapped by linkage analysis to a syntenic region of distal chromosome 2, close to the locus for the ragged (Ra...

  9. Projective mapping

    DEFF Research Database (Denmark)

    Dehlholm, Christian; Brockhoff, Per B.; Bredie, Wender Laurentius Petrus

    2012-01-01

    by the practical testing environment. As a result of the changes, a reasonable assumption would be to question the consequences caused by the variations in method procedures. Here, the aim is to highlight the proven or hypothetical consequences of variations of Projective Mapping. Presented variations will include...... instructions and influence heavily the product placements and the descriptive vocabulary (Dehlholm et al., 2012b). The type of assessors performing the method influences results with an extra aspect in Projective Mapping compared to more analytical tests, as the given spontaneous perceptions are much dependent......Projective Mapping (Risvik et al., 1994) and its Napping (Pagès, 2003) variations have become increasingly popular in the sensory field for rapid collection of spontaneous product perceptions. It has been applied in variations which sometimes are caused by the purpose of the analysis and sometimes...

  10. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents are developed. This implies that probabilities as well as inherent consequences can be analysed and assessed. The present paper outlines a method for evaluation of the probability of ship...

  11. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents have to be developed. This implies that probabilities as well as inherent consequences have to be analyzed and assessed. The present notes outline a method for evaluation of the probability...

  12. Introducing Disjoint and Independent Events in Probability.

    Science.gov (United States)

    Kelly, I. W.; Zwiers, F. W.

    Two central concepts in probability theory are those of independence and mutually exclusive events. This document is intended to provide suggestions to teachers that can be used to equip students with an intuitive, comprehensive understanding of these basic concepts in probability. The first section of the paper delineates mutually exclusive and…

  13. Selected papers on probability and statistics

    CERN Document Server

    2009-01-01

    This volume contains translations of papers that originally appeared in the Japanese journal Sūgaku. The papers range over a variety of topics in probability theory, statistics, and applications. This volume is suitable for graduate students and research mathematicians interested in probability and statistics.

  14. Collective probabilities algorithm for surface hopping calculations

    International Nuclear Information System (INIS)

    Bastida, Adolfo; Cruz, Carlos; Zuniga, Jose; Requena, Alberto

    2003-01-01

    General equations are derived that the transition probabilities of hopping algorithms in surface hopping calculations must obey in order to assure equality between the average quantum and classical populations. These equations are solved for two particular cases. In the first, it is assumed that the probabilities are the same for all trajectories and that the number of hops is kept to a minimum. These assumptions specify the collective probabilities (CP) algorithm, for which the transition probabilities depend on the average populations over all trajectories. In the second case, the probabilities for each trajectory are assumed to be completely independent of the results from the other trajectories. There is then a unique solution of the general equations assuring that the transition probabilities are equal to the quantum population of the target state, which is referred to as the independent probabilities (IP) algorithm. The fewest switches (FS) algorithm developed by Tully is accordingly understood as an approximate hopping algorithm which takes elements from the accurate CP and IP solutions. A numerical test of all these hopping algorithms is carried out for a one-dimensional two-state problem with two avoided crossings, which shows the accuracy and computational efficiency of the proposed collective probabilities algorithm, the limitations of the FS algorithm, and the similarity between the results offered by the IP algorithm and those obtained with the Ehrenfest method
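
    A schematic reading of the collective-probabilities bookkeeping can be sketched for a two-state ensemble: a common hop probability is chosen at each step so that the ensemble-averaged classical population follows the average quantum population with as few hops as possible. The update rule below is an illustrative assumption based on the abstract, not the authors' exact equations.

        # Schematic sketch of collective-probabilities (CP) style bookkeeping for a
        # two-state ensemble of trajectories. The hop probability is shared by all
        # trajectories in the donor state and is set so that the ensemble-averaged
        # classical population tracks the prescribed quantum population.
        import numpy as np

        rng = np.random.default_rng(0)

        def cp_step(states, quantum_pop_1):
            """states: 0/1 classical state labels, one per trajectory.
            quantum_pop_1: ensemble-averaged quantum population of state 1 at t + dt."""
            n = states.size
            delta = quantum_pop_1 - states.mean()
            if delta > 0:        # net hops 0 -> 1 are needed
                donors = (states == 0)
                p_hop = min(1.0, delta / max(donors.mean(), 1e-12))
                states[donors & (rng.random(n) < p_hop)] = 1
            elif delta < 0:      # net hops 1 -> 0 are needed
                donors = (states == 1)
                p_hop = min(1.0, -delta / max(donors.mean(), 1e-12))
                states[donors & (rng.random(n) < p_hop)] = 0
            return states

        # Example: drive 1000 trajectories to follow a prescribed quantum population.
        states = np.zeros(1000, dtype=int)
        for pop in np.linspace(0.0, 0.6, 50):   # hypothetical quantum populations over time
            states = cp_step(states, pop)
        print("final classical population of state 1:", states.mean())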

  15. Examples of Neutrosophic Probability in Physics

    Directory of Open Access Journals (Sweden)

    Fu Yuhua

    2015-01-01

    This paper re-discusses the problems of the so-called “law of nonconservation of parity” and “accelerating expansion of the universe”, and presents the examples of determining Neutrosophic Probability of the experiment of Chien-Shiung Wu et al. in 1957, and determining Neutrosophic Probability of accelerating expansion of the partial universe.

  16. Eliciting Subjective Probabilities with Binary Lotteries

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    objective probabilities. Drawing a sample from the same subject population, we find evidence that the binary lottery procedure induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation...

  17. Probability Issues in without Replacement Sampling

    Science.gov (United States)

    Joarder, A. H.; Al-Sabah, W. S.

    2007-01-01

    Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…

  18. Some open problems in noncommutative probability

    International Nuclear Information System (INIS)

    Kruszynski, P.

    1981-01-01

    A generalization of probability measures to non-Boolean structures is discussed. The starting point of the theory is the Gleason theorem about the form of measures on closed subspaces of a Hilbert space. The problems are formulated in terms of probability on lattices of projections in arbitrary von Neumann algebras. (Auth.)

  19. Probability: A Matter of Life and Death

    Science.gov (United States)

    Hassani, Mehdi; Kippen, Rebecca; Mills, Terence

    2016-01-01

    Life tables are mathematical tables that document probabilities of dying and life expectancies at different ages in a society. Thus, the life table contains some essential features of the health of a population. Probability is often regarded as a difficult branch of mathematics. Life tables provide an interesting approach to introducing concepts…

  20. Teaching Probability: A Socio-Constructivist Perspective

    Science.gov (United States)

    Sharma, Sashi

    2015-01-01

    There is a considerable and rich literature on students' misconceptions in probability. However, less attention has been paid to the development of students' probabilistic thinking in the classroom. This paper offers a sequence, grounded in socio-constructivist perspective for teaching probability.

  1. Stimulus Probability Effects in Absolute Identification

    Science.gov (United States)

    Kent, Christopher; Lamberts, Koen

    2016-01-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…

  2. 47 CFR 1.1623 - Probability calculation.

    Science.gov (United States)

    2010-10-01

    47 CFR 1.1623, Probability calculation (Title 47, Telecommunication; Federal Communications Commission, General Practice and Procedure, Random Selection Procedures for Mass Media Services, General Procedures). § 1.1623 Probability calculation. (a) All calculations shall be...

  3. Simulations of Probabilities for Quantum Computing

    Science.gov (United States)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular, a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  4. Against All Odds: When Logic Meets Probability

    NARCIS (Netherlands)

    van Benthem, J.; Katoen, J.-P.; Langerak, R.; Rensink, A.

    2017-01-01

    This paper is a light walk along interfaces between logic and probability, triggered by a chance encounter with Ed Brinksma. It is not a research paper, or a literature survey, but a pointer to issues. I discuss both direct combinations of logic and probability and structured ways in which logic can

  5. An introduction to probability and stochastic processes

    CERN Document Server

    Melsa, James L

    2013-01-01

    Geared toward college seniors and first-year graduate students, this text is designed for a one-semester course in probability and stochastic processes. Topics covered in detail include probability theory, random variables and their functions, stochastic processes, linear system response to stochastic processes, Gaussian and Markov processes, and stochastic differential equations. 1973 edition.

  6. The probability of the false vacuum decay

    International Nuclear Information System (INIS)

    Kiselev, V.; Selivanov, K.

    1983-01-01

    The closed expression for the probability of the false vacuum decay in (1+1) dimensions is given. The probability of false vacuum decay is expressed as the product of an exponential quasiclassical factor and a functional determinant of the given form. The method for calculation of this determinant is developed and a complete answer for (1+1) dimensions is given

  7. Probability elements of the mathematical theory

    CERN Document Server

    Heathcote, C R

    2000-01-01

    Designed for students studying mathematical statistics and probability after completing a course in calculus and real variables, this text deals with basic notions of probability spaces, random variables, distribution functions and generating functions, as well as joint distributions and the convergence properties of sequences of random variables. Includes worked examples and over 250 exercises with solutions.

  8. The transition probabilities of the reciprocity model

    NARCIS (Netherlands)

    Snijders, T.A.B.

    1999-01-01

    The reciprocity model is a continuous-time Markov chain model used for modeling longitudinal network data. A new explicit expression is derived for its transition probability matrix. This expression can be checked relatively easily. Some properties of the transition probabilities are given, as well

  9. Probability numeracy and health insurance purchase

    NARCIS (Netherlands)

    Dillingh, Rik; Kooreman, Peter; Potters, Jan

    2016-01-01

    This paper provides new field evidence on the role of probability numeracy in health insurance purchase. Our regression results, based on rich survey panel data, indicate that the expenditure on two out of three measures of health insurance first rises with probability numeracy and then falls again.

  10. Affective Maps

    DEFF Research Database (Denmark)

    Salovaara-Moring, Inka

    . In particular, mapping environmental damage, endangered species, and human made disasters has become one of the focal point of affective knowledge production. These ‘more-than-humangeographies’ practices include notions of species, space and territory, and movement towards a new political ecology. This type...... of digital cartographies has been highlighted as the ‘processual turn’ in critical cartography, whereas in related computational journalism it can be seen as an interactive and iterative process of mapping complex and fragile ecological developments. This paper looks at computer-assisted cartography as part...

  11. The enigma of probability and physics

    International Nuclear Information System (INIS)

    Mayants, L.

    1984-01-01

    This volume contains a coherent exposition of the elements of two unique sciences: probabilistics (science of probability) and probabilistic physics (application of probabilistics to physics). Proceeding from a key methodological principle, it starts with the disclosure of the true content of probability and the interrelation between probability theory and experimental statistics. This makes it possible to introduce a proper order in all the sciences dealing with probability and, by conceiving the real content of statistical mechanics and quantum mechanics in particular, to construct both as two interconnected domains of probabilistic physics. Consistent theories of kinetics of physical transformations, decay processes, and intramolecular rearrangements are also outlined. The interrelation between the electromagnetic field, photons, and the theoretically discovered subatomic particle 'emon' is considered. Numerous internal imperfections of conventional probability theory, statistical physics, and quantum physics are exposed and removed - quantum physics no longer needs special interpretation. EPR, Bohm, and Bell paradoxes are easily resolved, among others. (Auth.)

  12. Optimizing Probability of Detection Point Estimate Demonstration

    Science.gov (United States)

    Koshti, Ajay M.

    2017-01-01

    Probability of detection (POD) analysis is used in assessing reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. Real flaws such as cracks and crack-like flaws are desired to be detected using these NDE methods. A reliably detectable crack size is required for safe-life analysis of fracture-critical parts. The paper provides a discussion on optimizing probability of detection (POD) demonstration experiments using the Point Estimate Method. The POD point estimate method is used by NASA for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size within some tolerance is used in the demonstration. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false (POF) calls while keeping the flaw sizes in the set as small as possible.
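
    The binomial point-estimate arithmetic can be sketched directly: for a demonstration that requires all n flaws of a given size to be found, the probability of passing is POD^n, and the familiar 29-of-29 criterion follows from requiring 0.90^n <= 0.05. The sketch below uses generic numbers for illustration, not the optimized values from the paper.

        # Sketch: binomial calculations for a POD point-estimate demonstration.
        from math import comb

        def prob_pass(pod, n, max_misses=0):
            """Probability of passing a demonstration with n flaws when at most
            max_misses misses are allowed and the true POD is pod."""
            return sum(comb(n, k) * (1 - pod) ** k * pod ** (n - k)
                       for k in range(max_misses + 1))

        # Classic 90/95 point estimate: the smallest n with (0.90)**n <= 0.05 is 29.
        n = 1
        while 0.90 ** n > 0.05:
            n += 1
        print("flaws needed for 90/95 with zero misses:", n)       # prints 29

        # Probability of passing the 29-of-29 demonstration for a few true POD values.
        for pod in (0.90, 0.95, 0.99):
            print(f"true POD {pod:.2f}: probability of passing = {prob_pass(pod, 29):.3f}")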

  13. Alternative probability theories for cognitive psychology.

    Science.gov (United States)

    Narens, Louis

    2014-01-01

    Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling. Copyright © 2013 Cognitive Science Society, Inc.

  14. Evaluation of nuclear power plant component failure probability and core damage probability using simplified PSA model

    International Nuclear Information System (INIS)

    Shimada, Yoshio

    2000-01-01

    It is anticipated that changes in the frequency of surveillance tests, preventive maintenance, or parts replacement of safety-related components may change component failure probabilities and, as a result, the core damage probability. It is also anticipated that the change differs depending on the initiating event frequency and the component type. This study assessed the change in core damage probability using a simplified PSA model capable of calculating core damage probability in a short time period, developed by the US NRC to process accident sequence precursors, when the failure probability of various components is varied between 0 and 1, or when Japanese or American initiating event frequency data are used. The analysis showed the following. (1) The frequency of surveillance tests, preventive maintenance, or parts replacement of motor-driven pumps (high pressure injection pumps, residual heat removal pumps, auxiliary feedwater pumps) should be changed carefully, since the change in core damage probability is large when the base failure probability increases. (2) Core damage probability is insensitive to changes in surveillance test frequency for motor-operated valves and the turbine-driven auxiliary feedwater pump, since the change in core damage probability is small even when their failure probabilities change by about one order of magnitude. (3) The change in core damage probability is small when Japanese failure probability data are applied to the emergency diesel generators, even if the failure probability changes by one order of magnitude from the base value. On the other hand, when American failure probability data are applied, the increase in core damage probability is large when the failure probability increases. Therefore, when Japanese failure probability data are applied, core damage probability is insensitive to changes in surveillance test frequency, etc. (author)

  15. Assessing the clinical probability of pulmonary embolism

    International Nuclear Information System (INIS)

    Miniati, M.; Pistolesi, M.

    2001-01-01

    Clinical assessment is a cornerstone of the recently validated diagnostic strategies for pulmonary embolism (PE). Although the diagnostic yield of individual symptoms, signs, and common laboratory tests is limited, the combination of these variables, either by empirical assessment or by a prediction rule, can be used to express a clinical probability of PE. The latter may serve as pretest probability to predict the probability of PE after further objective testing (posterior or post-test probability). Over the last few years, attempts have been made to develop structured prediction models for PE. In a Canadian multicenter prospective study, the clinical probability of PE was rated as low, intermediate, or high according to a model which included assessment of presenting symptoms and signs, risk factors, and presence or absence of an alternative diagnosis at least as likely as PE. Recently, a simple clinical score was developed to stratify outpatients with suspected PE into groups with low, intermediate, or high clinical probability. Logistic regression was used to predict parameters associated with PE. A score ≤ 4 identified patients with low probability of whom 10% had PE. The prevalence of PE in patients with intermediate (score 5-8) and high probability (score ≥ 9) was 38 and 81%, respectively. As opposed to the Canadian model, this clinical score is standardized. The predictor variables identified in the model, however, were derived from a database of emergency ward patients. This model may, therefore, not be valid in assessing the clinical probability of PE in inpatients. In the PISA-PED study, a clinical diagnostic algorithm was developed which rests on the identification of three relevant clinical symptoms and on their association with electrocardiographic and/or radiographic abnormalities specific for PE. Among patients who, according to the model, had been rated as having a high clinical probability, the prevalence of proven PE was 97%, while it was 3

  16. Experimental investigations into sample preparation of Alzheimer tissue specimens for nuclear microprobe analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pinheiro, T [CEC-JRC, Central Bureau for Nuclear Measurements, Geel (Belgium); Tapper, U A.S. [Dept. of Nuclear Physics, Lund Inst. of Science and Tech. (Sweden); Sturesson, K; Brun, A [Div. of Neuropathology, Dept. of Pathology, Lund University Hospital (Sweden)

    1991-03-01

    Nuclear microprobe analysis was applied to the study of elemental distribution in brain sections of patients with a diagnosis of Alzheimer's disease. Stained and nonstained cryosections were studied. The work carried out shows that serious elemental losses follow the sample staining procedure. Major losses occurred in a simple rinse of the tissue section, probably reducing most of the in-vivo gradients, which shows that generally very little information can be gained from stained sections. However, in many cases stained sections are compulsory because of the requirement to recognize the area which is to be studied. All the elemental maps obtained for the neurofibrillary deposits indicate a localized concentration of Si and probably also Al, associated with the senile plaque core. Neither of these elements was found in the staining solutions used. The validity of the results is discussed, as well as the possible link of Al and/or Si to the development of Alzheimer's disease. (orig.).

  17. Energetic map

    International Nuclear Information System (INIS)

    2012-01-01

    This report explains the energetic map of Uruguay as well as the different systems that delimit political frontiers in the region. The importance of the electrical system is discussed, covering electricity, oil and derivatives, natural gas, potential studies, biofuels, and wind and solar energy

  18. Necklace maps

    NARCIS (Netherlands)

    Speckmann, B.; Verbeek, K.A.B.

    2010-01-01

    Statistical data associated with geographic regions is nowadays globally available in large amounts and hence automated methods to visually display these data are in high demand. There are several well-established thematic map types for quantitative data on the ratio-scale associated with regions:

  19. Participatory maps

    DEFF Research Database (Denmark)

    Salovaara-Moring, Inka

    towards a new political ecology. This type of digital cartographies has been highlighted as the ‘processual turn’ in critical cartography, whereas in related computational journalism it can be seen as an interactive and iterative process of mapping complex and fragile ecological developments. This paper...

  20. Divergence from, and Convergence to, Uniformity of Probability Density Quantiles

    Directory of Open Access Journals (Sweden)

    Robert G. Staudte

    2018-04-01

    We demonstrate that questions of convergence and divergence regarding shapes of distributions can be carried out in a location- and scale-free environment. This environment is the class of probability density quantiles (pdQs), obtained by normalizing the composition of the density with the associated quantile function. It has earlier been shown that the pdQ is representative of a location-scale family and carries essential information regarding shape and tail behavior of the family. The pdQs are densities of continuous distributions with common domain, the unit interval, facilitating metric and semi-metric comparisons. The Kullback–Leibler divergences from uniformity of these pdQs are mapped to illustrate their relative positions with respect to uniformity. To gain more insight into the information that is conserved under the pdQ mapping, we repeatedly apply the pdQ mapping and find that further applications of it are quite generally entropy increasing, so convergence to the uniform distribution is investigated. New fixed point theorems are established with elementary probabilistic arguments and illustrated by examples.
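
    The pdQ construction and the divergence-from-uniformity calculation can be sketched numerically. The sketch assumes the pdQ is the composition f(Q(u)) normalized to integrate to one on the unit interval, as the abstract describes, and uses the Exponential(1) distribution as an illustrative example (its pdQ is 2(1 - u), with KL divergence from uniformity log 2 - 1/2).

        # Sketch: probability density quantile (pdQ) of an Exponential(1) distribution
        # and its Kullback-Leibler divergence from the uniform density on [0, 1].
        import numpy as np

        u = np.linspace(1e-6, 1 - 1e-6, 200_001)

        def pdq_exponential(u):
            # f(x) = exp(-x), Q(u) = -log(1 - u)  =>  f(Q(u)) = 1 - u
            g = 1.0 - u
            return g / np.trapz(g, u)                  # normalize on the unit interval

        p = pdq_exponential(u)
        kl_from_uniform = np.trapz(p * np.log(p), u)   # KL(pdQ || Uniform[0, 1])
        print(f"KL divergence from uniformity: {kl_from_uniform:.4f}")   # ~ log(2) - 0.5

        # Repeated application of the pdQ mapping, as studied in the paper, would treat
        # this density as a new f and recompute its pdQ on [0, 1].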

  1. Failure probability analysis of optical grid

    Science.gov (United States)

    Zhong, Yaoquan; Guo, Wei; Sun, Weiqiang; Jin, Yaohui; Hu, Weisheng

    2008-11-01

    The optical grid, an integrated computing environment based on optical networks, is expected to be an efficient infrastructure for supporting advanced data-intensive grid applications. In an optical grid, faults of both computational and network resources are inevitable due to the large scale and high complexity of the system. With optical-network-based distributed computing systems being applied extensively to data processing, the application failure probability has become an important indicator of application quality and an important aspect that operators consider. This paper presents a task-based method for analyzing the application failure probability in an optical grid. The failure probability of the entire application can then be quantified, and the performance of different backup strategies in reducing the application failure probability can be compared, so that the different requirements of different clients can be satisfied. When an application modeled as a DAG (directed acyclic graph) is executed under different backup strategies, the application failure probability and the application completion time differ. This paper proposes a new multi-objective differentiated services algorithm (MDSA). The new application scheduling algorithm can guarantee the required failure probability, improve network resource utilization, and realize a compromise between the network operator and the application submitter. Differentiated services can thus be achieved in an optical grid.
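
    A hedged sketch of the task-based idea: if the application fails whenever any of its tasks fails and task failures are treated as independent, the application failure probability follows directly from the per-task failure probabilities, and a backup resource lowers a task's effective failure probability. The task names, probabilities, and independence assumption below are illustrative, not the paper's MDSA algorithm.

        # Sketch: application failure probability for a set of tasks, assuming
        # independent task failures and that the application fails if any task fails.
        # A task with an independent backup fails only if both resources fail.
        import math

        tasks = {                       # hypothetical per-task failure probabilities
            "stage_in":  0.010,
            "compute_a": 0.020,
            "compute_b": 0.015,
            "reduce":    0.005,
        }
        backups = {"compute_a": 0.020}  # hypothetical backup resources

        def effective_failure(name):
            p = tasks[name]
            return p * backups[name] if name in backups else p

        p_success = math.prod(1.0 - effective_failure(t) for t in tasks)
        print(f"application failure probability: {1.0 - p_success:.4f}")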

  2. Uncertainty about probability: a decision analysis perspective

    International Nuclear Information System (INIS)

    Howard, R.A.

    1988-01-01

    The issue of how to think about uncertainty about probability is framed and analyzed from the viewpoint of a decision analyst. The failure of nuclear power plants is used as an example. The key idea is to think of probability as describing a state of information on an uncertain event, and to pose the issue of uncertainty in this quantity as uncertainty about a number that would be definitive: it has the property that you would assign it as the probability if you knew it. Logical consistency requires that the probability to assign to a single occurrence in the absence of further information be the mean of the distribution of this definitive number, not the median as is sometimes suggested. Any decision that must be made without the benefit of further information must also be made using the mean of the definitive number's distribution. With this formulation, they find further that the probability of r occurrences in n exchangeable trials will depend on the first n moments of the definitive number's distribution. In making decisions, the expected value of clairvoyance on the occurrence of the event must be at least as great as that on the definitive number. If one of the events in question occurs, then the increase in probability of another such event is readily computed. This means, in terms of coin tossing, that unless one is absolutely sure of the fairness of a coin, seeing a head must increase the probability of heads, in distinction to usual thought. A numerical example for nuclear power shows that the failure of one plant of a group with a low probability of failure can significantly increase the probability that must be assigned to failure of a second plant in the group
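
    The closing calculation can be made concrete with a small sketch: if the definitive number p has a distribution over [0, 1], the probability assigned to a single occurrence is its mean E[p], and after one occurrence the probability of another becomes E[p^2]/E[p], which exceeds E[p] unless the distribution is a point mass. The Beta distribution below is an illustrative choice, not part of the paper.

        # Sketch: uncertainty about a probability, represented by a Beta(a, b) distribution
        # over the definitive number p. Observing one occurrence raises the probability
        # assigned to the next occurrence from E[p] to E[p^2] / E[p].
        a, b = 1.0, 99.0                 # illustrative prior; mean failure probability 0.01

        mean_p = a / (a + b)                              # probability of one occurrence
        second_moment = a * (a + 1) / ((a + b) * (a + b + 1))
        p_next_given_one = second_moment / mean_p         # equals (a + 1) / (a + b + 1)

        print(f"P(failure)                      = {mean_p:.4f}")
        print(f"P(second failure | one failure) = {p_next_given_one:.4f}")
        # The conditional probability is larger, matching the coin-tossing remark above.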

  3. Probabilistic atlas-based segmentation of combined T1-weighted and DUTE MRI for calculation of head attenuation maps in integrated PET/MRI scanners.

    Science.gov (United States)

    Poynton, Clare B; Chen, Kevin T; Chonde, Daniel B; Izquierdo-Garcia, David; Gollub, Randy L; Gerstner, Elizabeth R; Batchelor, Tracy T; Catana, Ciprian

    2014-01-01

    We present a new MRI-based attenuation correction (AC) approach for integrated PET/MRI systems that combines both segmentation- and atlas-based methods by incorporating dual-echo ultra-short echo-time (DUTE) and T1-weighted (T1w) MRI data and a probabilistic atlas. Segmented atlases were constructed from CT training data using a leave-one-out framework and combined with T1w, DUTE, and CT data to train a classifier that computes the probability of air/soft tissue/bone at each voxel. This classifier was applied to segment the MRI of the subject of interest and attenuation maps (μ-maps) were generated by assigning specific linear attenuation coefficients (LACs) to each tissue class. The μ-maps generated with this "Atlas-T1w-DUTE" approach were compared to those obtained from DUTE data using a previously proposed method. For validation of the segmentation results, segmented CT μ-maps were considered to be the "silver standard"; the segmentation accuracy was assessed qualitatively and quantitatively through calculation of the Dice similarity coefficient (DSC). Relative change (RC) maps between the CT and MRI-based attenuation corrected PET volumes were also calculated for a global voxel-wise assessment of the reconstruction results. The μ-maps obtained using the Atlas-T1w-DUTE classifier agreed well with those derived from CT; the mean DSCs for the Atlas-T1w-DUTE-based μ-maps across all subjects were higher than those for DUTE-based μ-maps; the atlas-based μ-maps also showed a lower percentage of misclassified voxels across all subjects. RC maps from the atlas-based technique also demonstrated improvement in the PET data compared to the DUTE method, both globally as well as regionally.
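
    The Dice similarity coefficient used above can be computed per tissue class as in the minimal sketch below; the label convention (0 = air, 1 = soft tissue, 2 = bone) and the random volumes standing in for the CT- and MRI-based segmentations are hypothetical.

        # Sketch: Dice similarity coefficient (DSC) between two labeled volumes, per class.
        import numpy as np

        def dice_per_class(seg_a, seg_b, labels=(0, 1, 2)):
            scores = {}
            for lab in labels:
                a = (seg_a == lab)
                b = (seg_b == lab)
                denom = a.sum() + b.sum()
                scores[lab] = 2.0 * np.logical_and(a, b).sum() / denom if denom else np.nan
            return scores

        # Example with small random volumes (placeholders for real segmentations).
        rng = np.random.default_rng(1)
        ct_seg = rng.integers(0, 3, size=(32, 32, 32))
        mri_seg = np.where(rng.random((32, 32, 32)) < 0.9, ct_seg,
                           rng.integers(0, 3, size=(32, 32, 32)))   # ~90% agreement
        print(dice_per_class(ct_seg, mri_seg))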

  4. Probability an introduction with statistical applications

    CERN Document Server

    Kinney, John J

    2014-01-01

    Praise for the First Edition: "This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory." - The Statistician. Thoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h

  5. Dependency models and probability of joint events

    International Nuclear Information System (INIS)

    Oerjasaeter, O.

    1982-08-01

    Probabilistic dependencies between components/systems are discussed with reference to a broad classification of potential failure mechanisms. Further, a generalized time-dependency model, based on conditional probabilities for estimation of the probability of joint events and event sequences is described. The applicability of this model is clarified/demonstrated by various examples. It is concluded that the described model of dependency is a useful tool for solving a variety of practical problems concerning the probability of joint events and event sequences where common cause and time-dependent failure mechanisms are involved. (Auth.)

  6. Handbook of probability theory and applications

    CERN Document Server

    Rudas, Tamas

    2008-01-01

    ""This is a valuable reference guide for readers interested in gaining a basic understanding of probability theory or its applications in problem solving in the other disciplines.""-CHOICEProviding cutting-edge perspectives and real-world insights into the greater utility of probability and its applications, the Handbook of Probability offers an equal balance of theory and direct applications in a non-technical, yet comprehensive, format. Editor Tamás Rudas and the internationally-known contributors present the material in a manner so that researchers of vari

  7. Probabilities on Streams and Reflexive Games

    Directory of Open Access Journals (Sweden)

    Andrew Schumann

    2014-01-01

    Probability measures on streams (e.g. on hypernumbers and p-adic numbers) have been defined. It was shown that these probabilities can be used for simulations of reflexive games. In particular, it can be proved that Aumann's agreement theorem does not hold for these probabilities. Instead of this theorem, there is a statement that is called the reflexion disagreement theorem. Based on this theorem, probabilistic and knowledge conditions can be defined for reflexive games at various reflexion levels up to the infinite level. (original abstract)

  8. Concept of probability in statistical physics

    CERN Document Server

    Guttmann, Y M

    1999-01-01

    Foundational issues in statistical mechanics and the more general question of how probability is to be understood in the context of physical theories are both areas that have been neglected by philosophers of physics. This book fills an important gap in the literature by providing a most systematic study of how to interpret probabilistic assertions in the context of statistical mechanics. The book explores both subjectivist and objectivist accounts of probability, and takes full measure of work in the foundations of probability theory, in statistical mechanics, and in mathematical theory. It will be of particular interest to philosophers of science, physicists and mathematicians interested in foundational issues, and also to historians of science.

  9. Computation of the Complex Probability Function

    Energy Technology Data Exchange (ETDEWEB)

    Trainer, Amelia Jo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ledwith, Patrick John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-22

    The complex probability function is important in many areas of physics and many techniques have been developed in an attempt to compute it for some z quickly and efficiently. Most prominent are the methods that use Gauss-Hermite quadrature, which uses the roots of the nth degree Hermite polynomial and corresponding weights to approximate the complex probability function. This document serves as an overview and discussion of the use, shortcomings, and potential improvements on the Gauss-Hermite quadrature for the complex probability function.
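
    The Gauss-Hermite approach amounts to approximating w(z) = (i/pi) * integral of exp(-t^2)/(z - t) dt (for Im z > 0) by a weighted sum over the Hermite nodes. The sketch below is a minimal illustration of that quadrature; scipy.special.wofz is used only as a reference value.

        # Sketch: Gauss-Hermite approximation of the complex probability (Faddeeva)
        # function w(z), valid in the upper half of the complex plane.
        import numpy as np
        from numpy.polynomial.hermite import hermgauss
        from scipy.special import wofz               # reference implementation

        def w_gauss_hermite(z, n=32):
            nodes, weights = hermgauss(n)            # roots and weights of the nth Hermite polynomial
            return (1j / np.pi) * np.sum(weights / (z - nodes))

        z = 1.5 + 0.8j                               # arbitrary test point with Im(z) > 0
        print("Gauss-Hermite (n=32):", w_gauss_hermite(z, n=32))
        print("scipy wofz reference:", wofz(z))
        # Accuracy degrades as z approaches the real axis, which motivates the
        # improvements discussed in the report.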

  10. Pre-aggregation for Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of {\\em probabilistic} OLAP queries that operate on aggregate values that are probability distributions...... and the techniques to process these queries. The paper also presents the methods for computing the probability distributions, which enables pre-aggregation, and for using the pre-aggregated distributions for further aggregation. In order to achieve good time and space efficiency, the methods perform approximate...... multidimensional data analysis that is considered in this paper (i.e., approximate processing of probabilistic OLAP queries over probability distributions)....

  11. Comparing linear probability model coefficients across groups

    DEFF Research Database (Denmark)

    Holm, Anders; Ejrnæs, Mette; Karlson, Kristian Bernt

    2015-01-01

    of the following three components: outcome truncation, scale parameters and distributional shape of the predictor variable. These results point to limitations in using linear probability model coefficients for group comparisons. We also provide Monte Carlo simulations and real examples to illustrate......This article offers a formal identification analysis of the problem in comparing coefficients from linear probability models between groups. We show that differences in coefficients from these models can result not only from genuine differences in effects, but also from differences in one or more...... these limitations, and we suggest a restricted approach to using linear probability model coefficients in group comparisons....

  12. Modeling experiments using quantum and Kolmogorov probability

    International Nuclear Information System (INIS)

    Hess, Karl

    2008-01-01

    Criteria are presented that permit a straightforward partition of experiments into sets that can be modeled using both quantum probability and the classical probability framework of Kolmogorov. These new criteria concentrate on the operational aspects of the experiments and lead beyond the commonly appreciated partition by relating experiments to commuting and non-commuting quantum operators as well as non-entangled and entangled wavefunctions. In other words the space of experiments that can be understood using classical probability is larger than usually assumed. This knowledge provides advantages for areas such as nanoscience and engineering or quantum computation.

  13. The probability outcome correspondence principle: a dispositional view of the interpretation of probability statements

    NARCIS (Netherlands)

    Keren, G.; Teigen, K.H.

    2001-01-01

    This article presents a framework for lay people's internal representations of probabilities, which supposedly reflect the strength of underlying dispositions, or propensities, associated with the predicted event. From this framework, we derive the probability-outcome correspondence principle, which

  14. Earthquake Probability Assessment for the Active Faults in Central Taiwan: A Case Study

    Directory of Open Access Journals (Sweden)

    Yi-Rui Lee

    2016-06-01

    Frequent high seismic activity occurs in Taiwan due to fast plate motions. According to the historical records, the most destructive earthquakes in Taiwan were caused mainly by inland active faults. The Central Geological Survey (CGS) of Taiwan has published active fault maps of Taiwan since 1998. There are 33 active faults noted in the 2012 active fault map. After the Chi-Chi earthquake, CGS launched a series of projects to investigate the details of, and better understand, each active fault in Taiwan. This article collected these data to develop active fault parameters and referred to experiences from Japan and the United States to establish a methodology for earthquake probability assessment via active faults. We consider the active faults in Central Taiwan as a good example to present the earthquake probability assessment process and results. The appropriate “probability model” was used to estimate the conditional probability of M ≥ 6.5 and M ≥ 7.0 earthquakes. Our results show that the highest probability of an M ≥ 6.5 earthquake occurring in 30, 50, and 100 years in Central Taiwan is on the Tachia-Changhua fault system. Conversely, the lowest earthquake probability is on the Chelungpu fault. The goal of our research is to calculate the earthquake probability for all 33 active faults in Taiwan. The active fault parameters are important information that can be applied in subsequent seismic hazard analysis and seismic simulation.
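
    The conditional-probability step can be illustrated with a short sketch comparing a time-independent (Poisson) model with a simple lognormal renewal model; the recurrence interval, elapsed time, and aperiodicity below are hypothetical values, not the CGS fault parameters.

        # Sketch: conditional probability of an earthquake within the next t years, given a
        # mean recurrence interval and the time elapsed since the last event.
        import math
        from scipy.stats import lognorm

        mean_recurrence = 340.0      # hypothetical mean recurrence interval (years)
        elapsed = 250.0              # hypothetical time since the last event (years)
        windows = (30.0, 50.0, 100.0)
        sigma = 0.5                  # hypothetical aperiodicity for the lognormal renewal model

        # Time-independent Poisson model: P = 1 - exp(-t / Tbar), independent of elapsed time.
        poisson = {t: 1.0 - math.exp(-t / mean_recurrence) for t in windows}

        # Lognormal renewal model: P = [F(elapsed + t) - F(elapsed)] / [1 - F(elapsed)].
        scale = mean_recurrence * math.exp(-0.5 * sigma ** 2)   # so the mean equals Tbar
        F = lambda x: lognorm.cdf(x, s=sigma, scale=scale)
        renewal = {t: (F(elapsed + t) - F(elapsed)) / (1.0 - F(elapsed)) for t in windows}

        for t in windows:
            print(f"{t:5.0f}-yr window: Poisson {poisson[t]:.3f}, lognormal renewal {renewal[t]:.3f}")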

  15. An analytical calculation of neighbourhood order probabilities for high dimensional Poissonian processes and mean field models

    International Nuclear Information System (INIS)

    Tercariol, Cesar Augusto Sangaletti; Kiipper, Felipe de Moura; Martinez, Alexandre Souto

    2007-01-01

    Consider that the coordinates of N points are randomly generated along the edges of a d-dimensional hypercube (random point problem). The probability P^{(d,N)}_{m,n} that an arbitrary point is the mth nearest neighbour to its own nth nearest neighbour (Cox probabilities) plays an important role in spatial statistics. Also, it has been useful in the description of physical processes in disordered media. Here we propose a simpler derivation of Cox probabilities, where we stress the role played by the system dimensionality d. In the limit d → ∞, the distances between pairs of points become independent (random link model) and closed analytical forms for the neighbourhood probabilities are obtained both for the thermodynamic limit and for finite-size systems. Breaking the distance symmetry constraint drives us to the random map model, for which the Cox probabilities are obtained for two cases: whether a point is its own nearest neighbour or not
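
    The Cox probabilities can also be estimated directly by Monte Carlo simulation for the random point problem: generate N points in the d-dimensional hypercube and count how often a point is the m-th nearest neighbour of its own n-th nearest neighbour. The parameters below are illustrative.

        # Monte Carlo sketch of the Cox probability P^{(d,N)}_{m,n}: the probability that a
        # point is the m-th nearest neighbour of its own n-th nearest neighbour, for N points
        # uniformly distributed in a d-dimensional hypercube.
        import numpy as np

        def cox_probability(d, N, m, n, trials=200, seed=0):
            rng = np.random.default_rng(seed)
            hits = 0
            for _ in range(trials):
                pts = rng.random((N, d))
                dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
                np.fill_diagonal(dist, np.inf)
                order = np.argsort(dist, axis=1)      # order[i, k] = (k+1)-th NN of point i
                for i in range(N):
                    j = order[i, n - 1]               # n-th nearest neighbour of point i
                    if order[j, m - 1] == i:          # is i the m-th nearest neighbour of j?
                        hits += 1
            return hits / (trials * N)

        # Example: probability of being mutual nearest neighbours (m = n = 1) in d = 2.
        print(cox_probability(d=2, N=50, m=1, n=1))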

  16. Encounter Probability of Individual Wave Height

    DEFF Research Database (Denmark)

    Liu, Z.; Burcharth, H. F.

    1998-01-01

    wave height corresponding to a certain exceedence probability within a structure lifetime (encounter probability), based on the statistical analysis of long-term extreme significant wave height. Then the design individual wave height is calculated as the expected maximum individual wave height...... associated with the design significant wave height, with the assumption that the individual wave heights follow the Rayleigh distribution. However, the exceedence probability of such a design individual wave height within the structure lifetime is unknown. The paper presents a method for the determination...... of the design individual wave height corresponding to an exceedence probability within the structure lifetime, given the long-term extreme significant wave height. The method can also be applied for estimation of the number of relatively large waves for fatigue analysis of constructions....
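
    Two standard relations behind this abstract can be sketched directly: the encounter probability of exceeding the T-year significant wave height during a lifetime of L years, and a commonly used approximation for the expected maximum of N Rayleigh-distributed individual wave heights. The numerical values are illustrative only.

        # Sketch: encounter probability of the T-year event within a structure lifetime, and
        # the expected maximum individual wave height in a sea state of N Rayleigh waves.
        import math

        def encounter_probability(return_period_years, lifetime_years):
            """Probability that the T-year event is exceeded at least once in L years."""
            return 1.0 - (1.0 - 1.0 / return_period_years) ** lifetime_years

        def expected_max_wave_height(hs, n_waves):
            """Common approximation for the expected maximum of N Rayleigh wave heights."""
            ln_n = math.log(n_waves)
            return hs * (math.sqrt(0.5 * ln_n) + 0.2886 / math.sqrt(2.0 * ln_n))

        print(f"encounter probability (T=100 yr, L=50 yr): {encounter_probability(100, 50):.2f}")
        print(f"expected H_max for Hs=5 m, N=1000 waves:   {expected_max_wave_height(5.0, 1000):.2f} m")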

  17. Predicting binary choices from probability phrase meanings.

    Science.gov (United States)

    Wallsten, Thomas S; Jang, Yoonhee

    2008-08-01

    The issues of how individuals decide which of two events is more likely and of how they understand probability phrases both involve judging relative likelihoods. In this study, we investigated whether derived scales representing probability phrase meanings could be used within a choice model to predict independently observed binary choices. If they can, this simultaneously provides support for our model and suggests that the phrase meanings are measured meaningfully. The model assumes that, when deciding which of two events is more likely, judges take a single sample from memory regarding each event and respond accordingly. The model predicts choice probabilities by using the scaled meanings of individually selected probability phrases as proxies for confidence distributions associated with sampling from memory. Predictions are sustained for 34 of 41 participants but, nevertheless, are biased slightly low. Sequential sampling models improve the fit. The results have both theoretical and applied implications.

  18. Certainties and probabilities of the IPCC

    International Nuclear Information System (INIS)

    2004-01-01

    Based on an analysis of information about climate evolution, simulations of global warming, and the snow-coverage monitoring of Meteo-France, the IPCC presented its certainties and probabilities concerning the greenhouse effect. (A.L.B.)

  19. The probability factor in establishing causation

    International Nuclear Information System (INIS)

    Hebert, J.

    1988-01-01

    This paper discusses the possibilities and limitations of methods using the probability factor in establishing the causal link between bodily injury, whether immediate or delayed, and the nuclear incident presumed to have caused it (NEA) [fr

  20. Bayesian optimization for computationally extensive probability distributions.

    Science.gov (United States)

    Tamura, Ryo; Hukushima, Koji

    2018-01-01

    An efficient method for finding a better maximizer of computationally extensive probability distributions is proposed on the basis of a Bayesian optimization technique. A key idea of the proposed method is to use extreme values of acquisition functions by Gaussian processes for the next training phase, which should be located near a local maximum or a global maximum of the probability distribution. Our Bayesian optimization technique is applied to the posterior distribution in the effective physical model estimation, which is a computationally extensive probability distribution. Even when the number of sampling points on the posterior distributions is fixed to be small, the Bayesian optimization provides a better maximizer of the posterior distributions in comparison to those by the random search method, the steepest descent method, or the Monte Carlo method. Furthermore, the Bayesian optimization improves the results efficiently by combining it with the steepest descent method, and thus it is a powerful tool to search for a better maximizer of computationally extensive probability distributions.

  1. Characteristic length of the knotting probability revisited

    International Nuclear Information System (INIS)

    Uehara, Erica; Deguchi, Tetsuo

    2015-01-01

    We present a self-avoiding polygon (SAP) model for circular DNA in which the radius of impermeable cylindrical segments corresponds to the screening length of double-stranded DNA surrounded by counter ions. For the model we evaluate the probability for a generated SAP with N segments having a given knot K through simulation. We call it the knotting probability of a knot K with N segments for the SAP model. We show that when N is large the most significant factor in the knotting probability is given by the exponentially decaying part exp(−N/N_K), where the estimates of parameter N_K are consistent with the same value for all the different knots we investigated. We thus call it the characteristic length of the knotting probability. We give formulae expressing the characteristic length as a function of the cylindrical radius r_ex, i.e. the screening length of double-stranded DNA. (paper)

  2. Probability of Survival Decision Aid (PSDA)

    National Research Council Canada - National Science Library

    Xu, Xiaojiang; Amin, Mitesh; Santee, William R

    2008-01-01

    A Probability of Survival Decision Aid (PSDA) is developed to predict survival time for hypothermia and dehydration during prolonged exposure at sea in both air and water for a wide range of environmental conditions...

  3. Probability and statistics with integrated software routines

    CERN Document Server

    Deep, Ronald

    2005-01-01

    Probability & Statistics with Integrated Software Routines is a calculus-based treatment of probability concurrent with and integrated with statistics through interactive, tailored software applications designed to enhance the phenomena of probability and statistics. The software programs make the book unique.The book comes with a CD containing the interactive software leading to the Statistical Genie. The student can issue commands repeatedly while making parameter changes to observe the effects. Computer programming is an excellent skill for problem solvers, involving design, prototyping, data gathering, testing, redesign, validating, etc, all wrapped up in the scientific method.See also: CD to accompany Probability and Stats with Integrated Software Routines (0123694698)* Incorporates more than 1,000 engaging problems with answers* Includes more than 300 solved examples* Uses varied problem solving methods

  4. Determining probabilities of geologic events and processes

    International Nuclear Information System (INIS)

    Hunter, R.L.; Mann, C.J.; Cranwell, R.M.

    1985-01-01

    The Environmental Protection Agency has recently published a probabilistic standard for releases of high-level radioactive waste from a mined geologic repository. The standard sets limits for contaminant releases with more than one chance in 100 of occurring within 10,000 years, and less strict limits for releases of lower probability. The standard offers no methods for determining probabilities of geologic events and processes, and no consensus exists in the waste-management community on how to do this. Sandia National Laboratories is developing a general method for determining probabilities of a given set of geologic events and processes. In addition, we will develop a repeatable method for dealing with events and processes whose probability cannot be determined. 22 refs., 4 figs

  5. Pre-Aggregation with Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    2006-01-01

    Motivated by the increasing need to analyze complex, uncertain multidimensional data this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how...... the distributions can be subsequently used in pre-aggregation. Since the probability distributions can become large, we show how to achieve good time and space efficiency by approximating the distributions. We present the results of several experiments that demonstrate the effectiveness of our methods. The work...... is motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. This paper is the first to consider the approximate processing of probabilistic OLAP queries over probability distributions....

  6. Probability of spent fuel transportation accidents

    International Nuclear Information System (INIS)

    McClure, J.D.

    1981-07-01

    The transported volume of spent fuel, incident/accident experience and accident environment probabilities were reviewed in order to provide an estimate of spent fuel accident probabilities. In particular, the accident review assessed the accident experience for large casks of the type that could transport spent (irradiated) nuclear fuel. This review determined that since 1971, the beginning of official US Department of Transportation record keeping for accidents/incidents, there has been one spent fuel transportation accident. This information, coupled with estimated annual shipping volumes for spent fuel, indicated an estimated annual probability of a spent fuel transport accident of 5 x 10^-7 spent fuel accidents per mile. This is consistent with ordinary truck accident rates. A comparison of accident environments and regulatory test environments suggests that the probability of truck accidents exceeding regulatory test for impact is approximately 10^-9/mile

  7. Sampling, Probability Models and Statistical Reasoning Statistical

    Indian Academy of Sciences (India)

    Sampling, Probability Models and Statistical Reasoning: Statistical Inference. Mohan Delampady and V R Padmawar. General Article, Resonance – Journal of Science Education, Volume 1, Issue 5, May 1996, pp. 49-58 ...

  8. Pattern formation, logistics, and maximum path probability

    Science.gov (United States)

    Kirkaldy, J. S.

    1985-05-01

    The concept of pattern formation, which to current researchers is a synonym for self-organization, carries the connotation of deductive logic together with the process of spontaneous inference. Defining a pattern as an equivalence relation on a set of thermodynamic objects, we establish that a large class of irreversible pattern-forming systems, evolving along idealized quasisteady paths, approaches the stable steady state as a mapping upon the formal deductive imperatives of a propositional function calculus. In the preamble the classical reversible thermodynamics of composite systems is analyzed as an externally manipulated system of space partitioning and classification based on ideal enclosures and diaphragms. The diaphragms have discrete classification capabilities which are designated in relation to conserved quantities by descriptors such as impervious, diathermal, and adiabatic. Differentiability in the continuum thermodynamic calculus is invoked as equivalent to analyticity and consistency in the underlying class or sentential calculus. The seat of inference, however, rests with the thermodynamicist. In the transition to an irreversible pattern-forming system the defined nature of the composite reservoirs remains, but a given diaphragm is replaced by a pattern-forming system which by its nature is a spontaneously evolving volume partitioner and classifier of invariants. The seat of volition or inference for the classification system is thus transferred from the experimenter or theoretician to the diaphragm, and with it the full deductive facility. The equivalence relations or partitions associated with the emerging patterns may thus be associated with theorems of the natural pattern-forming calculus. The entropy function, together with its derivatives, is the vehicle which relates the logistics of reservoirs and diaphragms to the analog logistics of the continuum. Maximum path probability or second-order differentiability of the entropy in isolation are

  9. Atomically resolved tissue integration.

    Science.gov (United States)

    Karlsson, Johan; Sundell, Gustav; Thuvander, Mattias; Andersson, Martin

    2014-08-13

    In the field of biomedical technology, a critical aspect is the ability to control and understand the integration of an implantable device in living tissue. Despite the technical advances in the development of biomaterials, the elaborate interplay encompassing materials science and biology on the atomic level is not very well understood. Within implantology, anchoring a biomaterial device into bone tissue is termed osseointegration. In the most accepted theory, osseointegration is defined as an interfacial bonding between implant and bone; however, there is lack of experimental evidence to confirm this. Here we show that atom probe tomography can be used to study the implant-tissue interaction, allowing for three-dimensional atomic mapping of the interface region. Interestingly, our analyses demonstrated that direct contact between Ca atoms and the implanted titanium oxide surface is formed without the presence of a protein interlayer, which means that a pure inorganic interface is created, hence giving experimental support to the current theory of osseointegration. We foresee that this result will be of importance in the development of future biomaterials as well as in the design of in vitro evaluation techniques.

  10. Imprecise Probability Methods for Weapons UQ

    Energy Technology Data Exchange (ETDEWEB)

    Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Vander Wiel, Scott Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-13

    Building on recent work in uncertainty quantification, we examine the use of imprecise probability methods to better characterize expert knowledge and to improve on misleading aspects of Bayesian analysis with informative prior distributions. Quantitative approaches to incorporate uncertainties in weapons certification are subject to rigorous external peer review, and in this regard, certain imprecise probability methods are well established in the literature and attractive. These methods are illustrated using experimental data from LANL detonator impact testing.
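
    To make the general idea concrete, the following is a minimal sketch of one common imprecise-probability device: robust Bayes over a *set* of priors rather than a single informative prior, reporting bounds on the posterior quantity of interest. The Beta-Binomial setup, the credal set of priors, and the test counts are illustrative assumptions, not the analysis described in this record.

```python
# Sketch: bounds on a posterior mean obtained by sweeping over a set of priors
# (a simple imprecise-probability / robust-Bayes device).  All numbers are
# hypothetical and only illustrate the mechanics.

from itertools import product


def posterior_mean_beta(successes: int, trials: int, a: float, b: float) -> float:
    """Posterior mean of a success probability under a Beta(a, b) prior."""
    return (successes + a) / (trials + a + b)


def posterior_mean_bounds(successes, trials, a_range, b_range):
    """Lower/upper posterior means over every prior in the assumed credal set."""
    means = [posterior_mean_beta(successes, trials, a, b)
             for a, b in product(a_range, b_range)]
    return min(means), max(means)


if __name__ == "__main__":
    # Hypothetical outcome: 9 successes in 10 impact tests.
    lo, hi = posterior_mean_bounds(
        successes=9, trials=10,
        a_range=[0.5, 1.0, 2.0],   # assumed spread of prior "pseudo-successes"
        b_range=[0.5, 1.0, 2.0],   # assumed spread of prior "pseudo-failures"
    )
    print(f"posterior mean of reliability lies in [{lo:.3f}, {hi:.3f}]")
```

    Instead of a single posterior summary, the analyst reports the interval spanned by all priors in the set, which makes the sensitivity to prior choice explicit.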

  11. Escape and transmission probabilities in cylindrical geometry

    International Nuclear Information System (INIS)

    Bjerke, M.A.

    1980-01-01

    An improved technique for the generation of escape and transmission probabilities in cylindrical geometry was applied to the existing resonance cross section processing code ROLAIDS. The algorithm of Hwang and Toppel, [ANL-FRA-TM-118] (with modifications) was employed. The probabilities generated were found to be as accurate as those given by the method previously applied in ROLAIDS, while requiring much less computer core storage and CPU time

  12. Probability and statistics for computer science

    CERN Document Server

    Johnson, James L

    2011-01-01

    Comprehensive and thorough development of both probability and statistics for serious computer scientists; goal-oriented: "to present the mathematical analysis underlying probability results". Special emphases on simulation and discrete decision theory. Mathematically rich, but self-contained text, at a gentle pace. Review of calculus and linear algebra in an appendix. Mathematical interludes (in each chapter) which examine mathematical techniques in the context of probabilistic or statistical importance. Numerous section exercises, summaries, historical notes, and Further Readings for reinforcement.

  13. Collision Probabilities for Finite Cylinders and Cuboids

    Energy Technology Data Exchange (ETDEWEB)

    Carlvik, I

    1967-05-15

    Analytical formulae have been derived for the collision probabilities of homogeneous finite cylinders and cuboids. The formula for the finite cylinder contains double integrals, and the formula for the cuboid only single integrals. Collision probabilities have been calculated by means of the formulae and compared with values obtained by other authors. It was found that the calculations using the analytical formulae are much quicker and give higher accuracy than Monte Carlo calculations.
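
    For orientation, the sketch below estimates the same quantity for a finite cylinder by brute-force Monte Carlo: sample a uniform isotropic source point and direction, compute the chord to the surface, and average the first-flight collision probability along that chord. This is the kind of reference calculation the analytical formulae were compared against, not the formulae themselves; the geometry routine and the cross-section value are illustrative assumptions.

```python
# Sketch: Monte Carlo first-flight collision probability for a homogeneous
# finite cylinder with a uniform isotropic source.  Illustrative only.

import math
import random


def distance_to_cylinder_surface(x, y, z, ux, uy, uz, radius, height):
    """Distance from an interior point along (ux, uy, uz) to the boundary of the
    cylinder 0 <= z <= height, x^2 + y^2 <= radius^2."""
    # Exit through a flat end cap.
    if uz > 0.0:
        t_axial = (height - z) / uz
    elif uz < 0.0:
        t_axial = -z / uz
    else:
        t_axial = math.inf
    # Exit through the curved surface: solve |r + t * u_perp|^2 = radius^2.
    a = ux * ux + uy * uy
    if a > 0.0:
        b = x * ux + y * uy
        c = x * x + y * y - radius * radius
        t_radial = (-b + math.sqrt(b * b - a * c)) / a
    else:
        t_radial = math.inf
    return min(t_axial, t_radial)


def collision_probability(sigma_t, radius, height, samples=200_000, seed=1):
    """P(first collision occurs inside the body) for a uniform isotropic source."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(samples):
        # Uniform point inside the cylinder.
        r = radius * math.sqrt(rng.random())
        phi = 2.0 * math.pi * rng.random()
        x, y, z = r * math.cos(phi), r * math.sin(phi), height * rng.random()
        # Isotropic direction.
        mu = 2.0 * rng.random() - 1.0
        psi = 2.0 * math.pi * rng.random()
        s = math.sqrt(1.0 - mu * mu)
        ux, uy, uz = s * math.cos(psi), s * math.sin(psi), mu
        d = distance_to_cylinder_surface(x, y, z, ux, uy, uz, radius, height)
        total += 1.0 - math.exp(-sigma_t * d)
    return total / samples


if __name__ == "__main__":
    # Illustrative values: total cross section 0.5 cm^-1, radius 1 cm, height 2 cm.
    print(f"P_collision ~ {collision_probability(0.5, 1.0, 2.0):.4f}")
```

    The statistical noise of such an estimate illustrates why closed-form expressions, where available, are both faster and more accurate.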

  14. MAPPING INNOVATION

    DEFF Research Database (Denmark)

    Thuesen, Christian Langhoff; Koch, Christian

    2011-01-01

    By adopting a theoretical framework from strategic niche management research (SNM), this paper presents an analysis of the innovation system of the Danish construction industry. The analysis shows a multifaceted landscape of innovation around an existing regime, built around existing ways of working and developed over generations. The regime is challenged from various niches and from the socio-technical landscape through trends such as globalization. Three niches (Lean Construction, BIM and System Deliveries) are subject to a detailed analysis showing partly incompatible rationales and various degrees of innovation potential. The paper further discusses how existing policymaking operates in a number of tensions, one being between government and governance. Based on the concepts from SNM, the paper introduces an innovation map in order to support the development of meta-governance policymaking. By mapping some...

  15. Mapping filmmaking

    DEFF Research Database (Denmark)

    Gilje, Øystein; Frølunde, Lisbeth; Lindstrand, Fredrik

    2010-01-01

    This chapter concerns mapping patterns in regard to how young filmmakers (age 15–20) in the Scandinavian countries learn about filmmaking. To uncover the patterns, we present portraits of four young filmmakers who participated in the Scandinavian research project Making a filmmaker. The focus is on their learning practices and how they create ‘learning paths’ in relation to resources in diverse learning contexts, whether formal, non-formal or informal.

  16. High throughput nonparametric probability density estimation.

    Science.gov (United States)

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample size invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under- and over-fitting the data, offering an alternative to the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.
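
    The core scoring idea can be sketched as follows: push the sample through a trial CDF and score how closely the resulting sorted values behave like uniform order statistics (the k-th of n is Beta(k, n − k + 1) distributed). This is only an illustration of the principle; the paper's sample-size-invariant universal scoring function, and the iterative improvement of the trial CDF, are more elaborate than what is shown here.

```python
# Sketch: score a trial CDF by the likelihood of its probability-integral
# transform under the uniform-order-statistic model.  Higher score = better fit.

import math


def log_beta_pdf(u: float, a: float, b: float) -> float:
    """Log density of a Beta(a, b) variate at u (0 < u < 1)."""
    log_norm = math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b)
    return log_norm + (a - 1.0) * math.log(u) + (b - 1.0) * math.log(1.0 - u)


def order_statistic_score(sample, trial_cdf):
    """Average log-likelihood of the transformed, sorted sample: the k-th of n
    sorted values should look like a Beta(k, n - k + 1) draw if the CDF is right."""
    n = len(sample)
    u = sorted(min(max(trial_cdf(x), 1e-12), 1.0 - 1e-12) for x in sample)
    return sum(log_beta_pdf(u[k], k + 1, n - k) for k in range(n)) / n


if __name__ == "__main__":
    import random
    rng = random.Random(0)
    data = [rng.gauss(0.0, 1.0) for _ in range(500)]
    std_normal_cdf = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    shifted_cdf = lambda x: 0.5 * (1.0 + math.erf((x - 1.0) / math.sqrt(2.0)))
    print("correct CDF score:", round(order_statistic_score(data, std_normal_cdf), 3))
    print("shifted CDF score:", round(order_statistic_score(data, shifted_cdf), 3))
```

    In this toy run the correct CDF scores higher than a shifted one, which is the signal an iterative estimator would use to accept or reject a refinement of the trial distribution.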

  17. Application of a weighted spatial probability model in GIS to analyse landslides in Penang Island, Malaysia

    Directory of Open Access Journals (Sweden)

    Samy Ismail Elmahdy

    2016-01-01

    In the current study, Penang Island, one of several mountainous areas in Malaysia that are often subjected to landslide hazard, was chosen for further investigation. A multi-criteria evaluation combined with a weighted spatial probability approach and model builder was applied to map and analyse landslides on Penang Island. A set of automated algorithms was used to construct new essential geological and morphometric thematic maps from remote sensing data. The maps were ranked using the weighted spatial probability model based on their contribution to landslide hazard. Results obtained showed that sites at an elevation of 100–300 m, with steep slopes of 10°–37° and slope direction (aspect) in the E and SE directions, were areas of very high and high probability of landslide occurrence; the total areas were 21.393 km2 (11.84%) and 58.690 km2 (32.48%), respectively. The obtained map was verified by comparing variogram models of the mapped and the occurred landslide locations and showed a strong correlation with the locations of occurred landslides, indicating that the proposed method can successfully predict the otherwise unpredictable landslide hazard. The method is time and cost effective and can be used as a reference for geological and geotechnical engineers.
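
    The weighted overlay at the heart of such an analysis can be sketched briefly: each ranked thematic raster is combined with a weight reflecting its assumed contribution to landslide hazard, and the resulting score is sliced into susceptibility classes. The layer names, ranks, weights and thresholds below are placeholders, not the values used in the study.

```python
# Sketch: weighted spatial-probability overlay of ranked thematic rasters and
# classification into susceptibility classes.  All inputs are synthetic.

import numpy as np


def weighted_overlay(ranked_layers: dict, weights: dict) -> np.ndarray:
    """Weighted sum of rank rasters; weights are normalised to sum to 1."""
    total_weight = sum(weights.values())
    score = np.zeros_like(next(iter(ranked_layers.values())), dtype=float)
    for name, raster in ranked_layers.items():
        score += (weights[name] / total_weight) * raster
    return score


def classify(score: np.ndarray, thresholds=(2.0, 3.0, 4.0)) -> np.ndarray:
    """0 = low ... 3 = very high susceptibility (thresholds are illustrative)."""
    return np.digitize(score, thresholds)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    shape = (100, 100)  # stand-in for a real DEM-derived grid
    layers = {
        "elevation_rank": rng.integers(1, 6, shape),
        "slope_rank": rng.integers(1, 6, shape),
        "aspect_rank": rng.integers(1, 6, shape),
    }
    weights = {"elevation_rank": 0.3, "slope_rank": 0.5, "aspect_rank": 0.2}
    susceptibility = classify(weighted_overlay(layers, weights))
    for cls, label in enumerate(["low", "moderate", "high", "very high"]):
        share = 100.0 * np.mean(susceptibility == cls)
        print(f"{label:>9}: {share:.1f}% of cells")
```

    In a GIS model builder the same chain of operations is usually expressed graphically, but the arithmetic is this weighted raster algebra.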

  18. Tissue irradiator

    International Nuclear Information System (INIS)

    Hungate, F.P.; Riemath, W.F.; Bunnell, L.R.

    1975-01-01

    A tissue irradiator is provided for the in-vivo irradiation of body tissue. The irradiator comprises a radiation source material contained and completely encapsulated within vitreous carbon. An embodiment for use as an in-vivo blood irradiator comprises a cylindrical body having an axial bore therethrough. A radioisotope is contained within a first portion of vitreous carbon cylindrically surrounding the axial bore, and a containment portion of vitreous carbon surrounds the radioisotope containing portion, the two portions of vitreous carbon being integrally formed as a single unit. Connecting means are provided at each end of the cylindrical body to permit connections to blood-carrying vessels and to provide for passage of blood through the bore. In a preferred embodiment, the radioisotope is thulium-170 which is present in the irradiator in the form of thulium oxide. A method of producing the preferred blood irradiator is also provided, whereby nonradioactive thulium-169 is dispersed within a polyfurfuryl alcohol resin which is carbonized and fired to form the integral vitreous carbon body and the device is activated by neutron bombardment of the thulium-169 to produce the beta-emitting thulium-170

  19. Causal inference, probability theory, and graphical insights.

    Science.gov (United States)

    Baker, Stuart G

    2013-11-10

    Causal inference from observational studies is a fundamental topic in biostatistics. The causal graph literature typically views probability theory as insufficient to express causal concepts in observational studies. In contrast, the view here is that probability theory is a desirable and sufficient basis for many topics in causal inference for the following two reasons. First, probability theory is generally more flexible than causal graphs: besides explaining such causal graph topics as M-bias (adjusting for a collider) and bias amplification and attenuation (when adjusting for an instrumental variable), probability theory is also the foundation of the paired availability design for historical controls, which does not fit into a causal graph framework. Second, probability theory is the basis for insightful graphical displays including the BK-Plot for understanding Simpson's paradox with a binary confounder, the BK2-Plot for understanding bias amplification and attenuation in the presence of an unobserved binary confounder, and the PAD-Plot for understanding the principal stratification component of the paired availability design. Published 2013. This article is a US Government work and is in the public domain in the USA.
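
    A small worked example helps fix what the BK-Plot is designed to make visible. The counts below are the classic kidney-stone textbook illustration of Simpson's paradox with a binary confounder, not data from the article: treatment A looks better within every stratum of the confounder, yet worse after pooling.

```python
# Toy illustration of Simpson's paradox with a binary confounder
# (stratum = condition severity).  Counts are the standard textbook example.

def rate(successes: int, total: int) -> float:
    return successes / total


# (successes, total) by treatment within each confounder stratum.
strata = {
    "mild":   {"A": (81, 87),   "B": (234, 270)},
    "severe": {"A": (192, 263), "B": (55, 80)},
}

# Within every stratum, treatment A has the higher success rate...
for name, arm in strata.items():
    print(f"{name:>6}: A={rate(*arm['A']):.2f}  B={rate(*arm['B']):.2f}")

# ...yet pooling over the confounder reverses the comparison.
pooled = {
    t: (sum(strata[s][t][0] for s in strata), sum(strata[s][t][1] for s in strata))
    for t in ("A", "B")
}
print(f"pooled: A={rate(*pooled['A']):.2f}  B={rate(*pooled['B']):.2f}")
```

    The reversal arises purely from how the confounder is distributed across the two arms, which is exactly the structure a plot of stratum-specific and pooled rates lays bare.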

  20. On the Possibility of Assigning Probabilities to Singular Cases, or: Probability Is Subjective Too!

    Directory of Open Access Journals (Sweden)

    Mark R. Crovelli

    2009-06-01

    Both Ludwig von Mises and Richard von Mises claimed that numerical probability could not be legitimately applied to singular cases. This paper challenges this aspect of the von Mises brothers' theory of probability. It is argued that their denial that numerical probability could be applied to singular cases was based solely upon Richard von Mises' exceptionally restrictive definition of probability. This paper challenges Richard von Mises' definition of probability by arguing that the definition of probability necessarily depends upon whether the world is governed by time-invariant causal laws. It is argued that if the world is governed by time-invariant causal laws, a subjective definition of probability must be adopted. It is further argued that the nature of human action and the relative frequency method for calculating numerical probabilities both presuppose that the world is indeed governed by time-invariant causal laws. It is finally argued that the subjective definition of probability undercuts the von Mises claim that numerical probability cannot legitimately be applied to singular, non-replicable cases.