WorldWideScience

Sample records for probabilistic anatomical mapping

  1. Probabilistic anatomical labeling of brain structures using statistical probabilistic anatomical maps

    International Nuclear Information System (INIS)

    Kim, Jin Su; Lee, Dong Soo; Lee, Byung Il; Lee, Jae Sung; Shin, Hee Won; Chung, June Key; Lee, Myung Chul

    2002-01-01

    The use of the statistical parametric mapping (SPM) program has increased for the analysis of brain PET and SPECT images. The Montreal Neurological Institute (MNI) coordinate system is used in the SPM program as a standard anatomical framework. While most researchers look up the Talairach atlas to report the localization of activations detected by the SPM program, there is significant disparity between the MNI templates and the Talairach atlas. That disparity between Talairach and MNI coordinates makes the interpretation of SPM results time-consuming, subjective and inaccurate. The purpose of this study was to develop a program to provide objective anatomical information for each x-y-z position in the ICBM coordinate system. The program was designed to provide the anatomical information for a given x-y-z position in MNI coordinates based on the statistical probabilistic anatomical map (SPAM) images of ICBM. When an x-y-z position is given to the program, the names of the anatomical structures with non-zero probability and the probabilities that the given position belongs to those structures are tabulated. The program was coded in IDL and Java for easy porting to any operating system or platform. The utility of this program was shown by comparing its results to those of the SPM program. A preliminary validation study was performed by applying the program to the analysis of a PET brain activation study of human memory in which the anatomical information on the activated areas was previously known. Real-time retrieval of probabilistic information with 1-mm spatial resolution was achieved using the programs. The validation study showed the relevance of this program: the probability that the activated area for memory belonged to the hippocampal formation was more than 80%. These programs will be useful for interpreting the results of image analyses performed in MNI coordinates, as done in the SPM program.
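
    The lookup this record describes is simple to express: convert an MNI millimetre coordinate to a voxel index through each SPAM volume's affine, then report every structure with non-zero probability at that voxel. The sketch below (in Python with nibabel, rather than the authors' IDL/Java) assumes a hypothetical layout of one probability volume per structure; the file names are placeholders, not the authors' actual data.

        import numpy as np
        import nibabel as nib

        # Assumed layout: one probability volume per labeled structure,
        # all defined in the same MNI/ICBM space (file names hypothetical).
        SPAM_VOLUMES = {
            "hippocampal formation": "spam_hippocampal_formation.nii",
            "parahippocampal gyrus": "spam_parahippocampal_gyrus.nii",
        }

        def lookup_mni(x_mm, y_mm, z_mm):
            """Return (structure, probability) pairs with non-zero probability
            at the given MNI coordinate, sorted by descending probability."""
            hits = []
            for name, path in SPAM_VOLUMES.items():
                img = nib.load(path)
                # Millimetre coordinate -> voxel index via the inverse affine.
                voxel = np.linalg.inv(img.affine) @ [x_mm, y_mm, z_mm, 1.0]
                i, j, k = np.round(voxel[:3]).astype(int)
                p = float(img.get_fdata()[i, j, k])
                if p > 0.0:
                    hits.append((name, p))
            return sorted(hits, key=lambda hp: -hp[1])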

  2. Automated Analysis of 123I-beta-CIT SPECT Images with Statistical Probabilistic Anatomical Mapping

    International Nuclear Information System (INIS)

    Eo, Jae Seon; Lee, Hoyoung; Lee, Jae Sung; Kim, Yu Kyung; Jeon, Bumseok; Lee, Dong Soo

    2014-01-01

    Population-based statistical probabilistic anatomical maps have been used to generate probabilistic volumes of interest for analyzing perfusion and metabolic brain imaging. We investigated the feasibility of automated analysis of dopamine transporter images using this technique and evaluated striatal binding potentials in Parkinson's disease and Wilson's disease. We analyzed 2β-carbomethoxy-3β-(4-123I-iodophenyl)tropane (123I-beta-CIT) SPECT images acquired from 26 people with Parkinson's disease (M:F = 11:15, mean age = 49±12 years), 9 people with Wilson's disease (M:F = 6:3, mean age = 26±11 years) and 17 normal controls (M:F = 5:12, mean age = 39±16 years). A SPECT template was created using striatal statistical probabilistic map images. All images were spatially normalized onto the template, and probability-weighted regional counts in striatal structures were estimated. The binding potential was calculated using the ratio of specific and nonspecific binding activities at equilibrium. Voxel-based comparisons between groups were also performed using statistical parametric mapping. Qualitative assessment showed that spatial normalization was successful for all images. The striatal binding potentials of participants with Parkinson's disease and Wilson's disease were significantly lower than those of normal controls. Statistical parametric mapping analysis found statistically significant differences only in striatal regions in both disease groups compared to controls. We successfully evaluated the regional 123I-beta-CIT distribution automatically using the SPECT template and probabilistic map data. This procedure allows an objective and quantitative comparison of the binding potential, which in this case showed a significantly decreased binding potential in the striata of patients with Parkinson's disease or Wilson's disease.
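
    The two quantities at the heart of this record are compact enough to state directly. The sketch below is a hedged reconstruction, assuming that "probability-weighted regional counts" means a SPAM-weighted mean over voxels, and taking a reference region (the abstract does not name one) as the nonspecific-binding term.

        import numpy as np

        def weighted_count(image, prob_map):
            """SPAM-weighted mean count of a structure (assumed definition)."""
            return float((image * prob_map).sum() / prob_map.sum())

        def binding_potential(image, striatum_prob, reference_prob):
            """Ratio of specific to nonspecific binding at equilibrium:
            BP = (C_striatum - C_reference) / C_reference."""
            c_str = weighted_count(image, striatum_prob)
            c_ref = weighted_count(image, reference_prob)
            return (c_str - c_ref) / c_ref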

  3. Integration of anatomical and external response mappings explains crossing effects in tactile localization: A probabilistic modeling approach.

    Science.gov (United States)

    Badde, Stephanie; Heed, Tobias; Röder, Brigitte

    2016-04-01

    To act upon a tactile stimulus, its original skin-based, anatomical spatial code has to be transformed into an external, posture-dependent reference frame, a process known as tactile remapping. When the limbs are crossed, anatomical and external location codes are in conflict, leading to a decline in tactile localization accuracy. It is unknown whether this impairment originates from the integration of the resulting external localization response with the original, anatomical one, or from a failure of tactile remapping in crossed postures. We fitted probabilistic models based on these diverging accounts to the data from three tactile localization experiments. Hand crossing disturbed tactile left-right location choices in all experiments. Furthermore, the size of these crossing effects was modulated by stimulus configuration and task instructions. The best model accounted for these results by integration of the external response mapping with the original, anatomical one, while applying identical integration weights for uncrossed and crossed postures. Thus, the model explained the data without assuming failures of remapping. Moreover, performance differences across tasks were accounted for by non-individual parameter adjustments, indicating that individual participants' task adaptation results from one common functional mechanism. These results suggest that remapping is an automatic and accurate process, and that the observed localization impairments in touch result from a cognitively controlled integration process that combines anatomically and externally coded responses.
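
    One plausible formalization of the winning account (a reading of the abstract, not the authors' exact model) is a fixed-weight mixture of the response probabilities implied by the two codes:

        \[
          p(\mathrm{right}) = w_{\mathrm{ana}}\, p_{\mathrm{ana}}(\mathrm{right})
            + w_{\mathrm{ext}}\, p_{\mathrm{ext}}(\mathrm{right}),
          \qquad w_{\mathrm{ana}} + w_{\mathrm{ext}} = 1,
        \]

    with the same weights for crossed and uncrossed postures. With uncrossed hands the two codes agree and the mixture is harmless; with crossed hands they conflict, so the anatomical component pulls responses toward the wrong side, reproducing the crossing effect without any failure of remapping.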

  4. Validation of Simple Quantification Methods for 18F-FP-CIT PET Using Automatic Delineation of Volumes of Interest Based on Statistical Probabilistic Anatomical Mapping and Isocontour Margin Setting

    International Nuclear Information System (INIS)

    Kim, Yong Il; Im, Hyung Jun; Paeng, Jin Chul; Lee, Jae Sung; Eo, Jae Seon; Kim, Dong Hyun; Kim, Euishin E.; Kang, Keon Wook; Chung, June Key; Lee, Dong Soo

    2012-01-01

    18F-FP-CIT positron emission tomography (PET) is an effective imaging method for dopamine transporters. In usual clinical practice, 18F-FP-CIT PET is analyzed visually or quantified using manual delineation of a volume of interest (VOI) for the striatum. In this study, we suggested and validated two simple quantitative methods based on automatic VOI delineation using statistical probabilistic anatomical mapping (SPAM) and isocontour margin setting. Seventy-five 18F-FP-CIT PET images acquired in routine clinical practice were used for this study. A study-specific image template was made and the subject images were normalized to the template. Afterwards, uptakes in the striatal regions and cerebellum were quantified using probabilistic VOIs based on SPAM. A quantitative parameter, QSPAM, was calculated to simulate binding potential. Additionally, the functional volume of each striatal region and its uptake were measured in automatically delineated VOIs using isocontour margin setting. The uptake-volume product (QUVP) was calculated for each striatal region. QSPAM and QUVP were calculated for each visual grading and the influence of cerebral atrophy on the measurements was tested. Image analyses were successful in all cases. Both QSPAM and QUVP were significantly different according to visual grading (P < 0.001). The agreements of QUVP and QSPAM with visual grading were slight to fair for the caudate nucleus (κ = 0.421 and 0.291, respectively) and good to perfect for the putamen (κ = 0.663 and 0.607, respectively). Also, QSPAM and QUVP had a significant correlation with each other (P < 0.001). Cerebral atrophy made a significant difference in the QSPAM and QUVP of the caudate nucleus regions with decreased 18F-FP-CIT uptake. Simple quantitative measurements of QSPAM and QUVP showed acceptable agreement with visual grading. Although QSPAM in some groups may be influenced by cerebral atrophy, these simple methods are expected to be effective in the quantitative analysis of 18F-FP-CIT PET in usual clinical practice.

  5. Statistical parametric mapping and statistical probabilistic anatomical mapping analyses of basal/acetazolamide Tc-99m ECD brain SPECT for efficacy assessment of endovascular stent placement for middle cerebral artery stenosis

    International Nuclear Information System (INIS)

    Lee, Tae-Hong; Kim, Seong-Jang; Kim, In-Ju; Kim, Yong-Ki; Kim, Dong-Soo; Park, Kyung-Pil

    2007-01-01

    Statistical parametric mapping (SPM) and statistical probabilistic anatomical mapping (SPAM) were applied to basal/acetazolamide Tc-99m ECD brain perfusion SPECT images in patients with middle cerebral artery (MCA) stenosis to assess the efficacy of endovascular stenting of the MCA. Enrolled in the study were 11 patients (8 men and 3 women, mean age 54.2 ± 6.2 years) who had undergone endovascular stent placement for MCA stenosis. Using SPM and SPAM analyses, we compared the number of significant voxels and cerebral counts in basal and acetazolamide SPECT images before and after stenting, and assessed the perfusion changes and cerebral vascular reserve index (CVRI). The numbers of hypoperfusion voxels in SPECT images were decreased from 10,083 ± 8,326 to 4,531 ± 5,091 in basal images (P = 0.0317) and from 13,398 ± 14,222 to 7,699 ± 10,199 in acetazolamide images (P = 0.0142) after MCA stenting. On SPAM analysis, the increases in cerebral counts were significant in acetazolamide images (90.9 ± 2.2 to 93.5 ± 2.3, P = 0.0098) but not in basal images (91 ± 2.7 to 92 ± 2.6, P = 0.1602). The CVRI also showed a statistically significant increase from before stenting (median 0.32; 95% CI -2.19 to 2.37) to after stenting (median 1.59; 95% CI -0.85 to 4.16; P = 0.0068). This study revealed the usefulness of voxel-based analysis of basal/acetazolamide brain perfusion SPECT after MCA stent placement. This study showed that SPM and SPAM analyses of basal/acetazolamide Tc-99m brain SPECT could be used to evaluate the short-term hemodynamic efficacy of successful MCA stent placement. (orig.)

  6. Validation of Simple Quantification Methods for (18)F-FP-CIT PET Using Automatic Delineation of Volumes of Interest Based on Statistical Probabilistic Anatomical Mapping and Isocontour Margin Setting.

    Science.gov (United States)

    Kim, Yong-Il; Im, Hyung-Jun; Paeng, Jin Chul; Lee, Jae Sung; Eo, Jae Seon; Kim, Dong Hyun; Kim, Euishin E; Kang, Keon Wook; Chung, June-Key; Lee, Dong Soo

    2012-12-01

    (18)F-FP-CIT positron emission tomography (PET) is an effective imaging for dopamine transporters. In usual clinical practice, (18)F-FP-CIT PET is analyzed visually or quantified using manual delineation of a volume of interest (VOI) for the striatum. In this study, we suggested and validated two simple quantitative methods based on automatic VOI delineation using statistical probabilistic anatomical mapping (SPAM) and isocontour margin setting. Seventy-five (18)F-FP-CIT PET images acquired in routine clinical practice were used for this study. A study-specific image template was made and the subject images were normalized to the template. Afterwards, uptakes in the striatal regions and cerebellum were quantified using probabilistic VOI based on SPAM. A quantitative parameter, QSPAM, was calculated to simulate binding potential. Additionally, the functional volume of each striatal region and its uptake were measured in automatically delineated VOI using isocontour margin setting. Uptake-volume product (QUVP) was calculated for each striatal region. QSPAM and QUVP were compared with visual grading and the influence of cerebral atrophy on the measurements was tested. Image analyses were successful in all the cases. Both the QSPAM and QUVP were significantly different according to visual grading (P < 0.001). The agreements of QUVP and QSPAM with visual grading were slight to fair for the caudate nucleus (κ = 0.421 and 0.291, respectively) and good to perfect for the putamen (κ = 0.663 and 0.607, respectively). Also, QSPAM and QUVP had a significant correlation with each other (P < 0.001). Cerebral atrophy made a significant difference in the QSPAM and QUVP of the caudate nucleus regions with decreased (18)F-FP-CIT uptake. Simple quantitative measurements of QSPAM and QUVP showed acceptable agreement with visual grading. Although QSPAM in some group may be influenced by cerebral atrophy, these simple methods are expected to be effective in the quantitative analysis of (18)F-FP-CIT PET in usual clinical practice.
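
    Neither this record nor record 4 spells out the formulas, so the following Python sketch is an assumption-laden reconstruction: QSPAM as a SPAM-weighted striatum-to-cerebellum uptake ratio that mimics binding potential, and QUVP as mean uptake times the functional volume enclosed by an isocontour margin.

        import numpy as np

        def q_spam(image, striatal_prob, cerebellum_prob):
            """Probability-weighted uptake ratio intended to track binding
            potential (reconstructed, not the authors' published formula)."""
            s = (image * striatal_prob).sum() / striatal_prob.sum()
            c = (image * cerebellum_prob).sum() / cerebellum_prob.sum()
            return float((s - c) / c)

        def q_uvp(image, threshold, voxel_volume_ml):
            """Uptake-volume product: voxels above an isocontour threshold
            define the functional volume of the striatal region."""
            voi = image > threshold
            return float(image[voi].mean() * voi.sum() * voxel_volume_ml)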

  7. Probabilistic Flood Mapping using Volunteered Geographical Information

    Science.gov (United States)

    Rivera, S. J.; Girons Lopez, M.; Seibert, J.; Minsker, B. S.

    2016-12-01

    Flood extent maps are widely used by decision makers and first responders to provide critical information that prevents economic impacts and the loss of human lives. These maps are usually obtained from sensory data and/or hydrologic models, which often have limited coverage in space and time. Recent developments in social media and communication technology have created a wealth of near-real-time, user-generated content during flood events in many urban areas, such as flooded locations, pictures of flooding extent and height, etc. These data could improve decision-making and response operations as events unfold. However, the integration of these data sources has been limited due to the need for methods that can extract and translate the data into useful information for decision-making. This study presents an approach that uses volunteer geographic information (VGI) and non-traditional data sources (i.e., Twitter, Flickr, YouTube, and 911 and 311 calls) to generate/update the flood extent maps in areas where no models and/or gauge data are operational. The approach combines Web-crawling and computer vision techniques to gather information about the location, extent, and water height of the flood from unstructured textual data, images, and videos. These estimates are then used to provide an updated flood extent map for areas surrounding the geo-coordinate of the VGI through the application of a Hydro Growing Region Algorithm (HGRA). HGRA combines hydrologic and image segmentation concepts to estimate a probabilistic flooding extent along the corresponding creeks. Results obtained for a case study in Austin, TX (i.e., 2015 Memorial Day flood) were comparable to those obtained by a calibrated hydrologic model and had good spatial correlation with flooding extents estimated by the Federal Emergency Management Agency (FEMA).
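
    The HGRA itself is only named here, but its stated idea (combining hydrologic and image-segmentation concepts) can be sketched as elevation-constrained region growing from a geo-located VGI report; averaging the binary extents produced by many reports, each with its own uncertain water level, would then give a probabilistic extent. A minimal sketch under those assumptions, not the authors' algorithm:

        from collections import deque
        import numpy as np

        def grow_flood_region(dem, seed, water_level):
            """Grow a flooded region outward from a VGI report location:
            a cell joins the region if it is 4-connected to it and its
            ground elevation is at or below the reported water level."""
            flooded = np.zeros(dem.shape, dtype=bool)
            queue = deque([seed])
            while queue:
                r, c = queue.popleft()
                if flooded[r, c] or dem[r, c] > water_level:
                    continue
                flooded[r, c] = True
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < dem.shape[0] and 0 <= cc < dem.shape[1]:
                        queue.append((rr, cc))
            return flooded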

  8. Mapping flood hazards under uncertainty through probabilistic flood inundation maps

    Science.gov (United States)

    Stephens, T.; Bledsoe, B. P.; Miller, A. J.; Lee, G.

    2017-12-01

    Changing precipitation, rapid urbanization, and population growth interact to create unprecedented challenges for flood mitigation and management. Standard methods for estimating risk from flood inundation maps generally involve simulations of floodplain hydraulics for an established regulatory discharge of specified frequency. Hydraulic model results are then geospatially mapped and depicted as a discrete boundary of flood extents and a binary representation of the probability of inundation (in or out) that is assumed constant over a project's lifetime. Consequently, existing methods utilized to define flood hazards and assess risk management are hindered by deterministic approaches that assume stationarity in a nonstationary world, failing to account for spatio-temporal variability of climate and land use as they translate to hydraulic models. This presentation outlines novel techniques for portraying flood hazards and the results of multiple flood inundation maps spanning hydroclimatic regions. Flood inundation maps generated through modeling of floodplain hydraulics are probabilistic reflecting uncertainty quantified through Monte-Carlo analyses of model inputs and parameters under current and future scenarios. The likelihood of inundation and range of variability in flood extents resulting from Monte-Carlo simulations are then compared with deterministic evaluations of flood hazards from current regulatory flood hazard maps. By facilitating alternative approaches of portraying flood hazards, the novel techniques described in this presentation can contribute to a shifting paradigm in flood management that acknowledges the inherent uncertainty in model estimates and the nonstationary behavior of land use and climate.
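
    The computational core of such a study is a standard Monte Carlo loop: sample the uncertain inputs, run the hydraulic model, and count how often each cell floods. The sketch below assumes a callable run_model standing in for the floodplain hydraulics (e.g., a wrapper around a 1D/2D solver) and purely illustrative input distributions.

        import numpy as np

        rng = np.random.default_rng(0)

        def probabilistic_inundation_map(run_model, n_samples=500):
            """Per-cell probability of inundation across Monte Carlo runs.
            run_model(n_manning, discharge) -> boolean inundation grid."""
            total = None
            for _ in range(n_samples):
                n_manning = rng.lognormal(mean=np.log(0.035), sigma=0.3)
                discharge = rng.normal(loc=850.0, scale=120.0)  # m^3/s, illustrative
                wet = run_model(n_manning, discharge).astype(float)
                total = wet if total is None else total + wet
            return total / n_samples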

  9. Quantification of brain images using Korean standard templates and structural and cytoarchitectonic probabilistic maps

    International Nuclear Information System (INIS)

    Lee, Jae Sung; Lee, Dong Soo; Kim, Yu Kyeong

    2004-01-01

    Population based structural and functional maps of the brain provide effective tools for the analysis and interpretation of complex and individually variable brain data. Brain MRI and PET standard templates and statistical probabilistic maps based on image data of Korean normal volunteers have been developed and probabilistic maps based on cytoarchitectonic data have been introduced. A quantification method using these data was developed for the objective assessment of regional intensity in the brain images. Age, gender and ethnic specific anatomical and functional brain templates based on MR and PET images of Korean normal volunteers were developed. Korean structural probabilistic maps for 89 brain regions and cytoarchitectonic probabilistic maps for 13 Brodmann areas were transformed onto the standard templates. Brain FDG PET and SPGR MR images of normal volunteers were spatially normalized onto the template of each modality and gender. Regional uptake of radiotracers in PET and gray matter concentration in MR images were then quantified by averaging (or summing) regional intensities weighted using the probabilistic maps of brain regions. Regionally specific effects of aging on glucose metabolism in cingulate cortex were also examined. The quantification program could generate results for a single spatially normalized image in 20 seconds. Glucose metabolism change in the cingulate gyrus was regionally specific: ratios of glucose metabolism in the rostral anterior cingulate vs. posterior cingulate and the caudal anterior cingulate vs. posterior cingulate were significantly decreased as the age increased. 'Rostral anterior' / 'posterior' was decreased by 3.1% per decade of age (p < 10⁻¹¹, r = 0.81) and 'caudal anterior' / 'posterior' was decreased by 1.7% (p < 10⁻⁸, r = 0.72). The ethnic-specific standard templates, probabilistic maps and quantification program developed in this study will be useful for the analysis of brain images of Korean people.

  10. Identification of probabilistic approaches and map-based navigation ...

    Indian Academy of Sciences (India)

    B Madhevan

    2018-02-07

    Feb 7, 2018 ... consists of three processes: map learning (ML), localization and path planning (PP) [73-76] ...

  11. Quantification of brain images using Korean standard templates and structural and cytoarchitectonic probabilistic maps

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jae Sung; Lee, Dong Soo; Kim, Yu Kyeong [College of Medicine, Seoul National Univ., Seoul (Korea, Republic of)]; and others

    2004-06-01

    Population based structural and functional maps of the brain provide effective tools for the analysis and interpretation of complex and individually variable brain data. Brain MRI and PET standard templates and statistical probabilistic maps based on image data of Korean normal volunteers have been developed and probabilistic maps based on cytoarchitectonic data have been introduced. A quantification method using these data was developed for the objective assessment of regional intensity in the brain images. Age, gender and ethnic specific anatomical and functional brain templates based on MR and PET images of Korean normal volunteers were developed. Korean structural probabilistic maps for 89 brain regions and cytoarchitectonic probabilistic maps for 13 Brodmann areas were transformed onto the standard templates. Brain FDG PET and SPGR MR images of normal volunteers were spatially normalized onto the template of each modality and gender. Regional uptake of radiotracers in PET and gray matter concentration in MR images were then quantified by averaging (or summing) regional intensities weighted using the probabilistic maps of brain regions. Regionally specific effects of aging on glucose metabolism in cingulate cortex were also examined. The quantification program could generate results for a single spatially normalized image in 20 seconds. Glucose metabolism change in the cingulate gyrus was regionally specific: ratios of glucose metabolism in the rostral anterior cingulate vs. posterior cingulate and the caudal anterior cingulate vs. posterior cingulate were significantly decreased as the age increased. 'Rostral anterior' / 'posterior' was decreased by 3.1% per decade of age (p < 10⁻¹¹, r = 0.81) and 'caudal anterior' / 'posterior' was decreased by 1.7% (p < 10⁻⁸, r = 0.72). The ethnic-specific standard templates, probabilistic maps and quantification program developed in this study will be useful for the analysis of brain images of Korean people.

  12. Tridimensional Regression for Comparing and Mapping 3D Anatomical Structures

    Directory of Open Access Journals (Sweden)

    Kendra K. Schmid

    2012-01-01

    Shape analysis is useful for a wide variety of disciplines and has many applications. There are many approaches to shape analysis, one of which focuses on the analysis of shapes that are represented by the coordinates of predefined landmarks on the object. This paper discusses Tridimensional Regression, a technique that can be used for comparing and mapping 3D anatomical structures represented by sets of three-dimensional landmark coordinates. The degree of similarity between shapes can be quantified using the tridimensional coefficient of determination (R²). An experiment was conducted to evaluate the effectiveness of this technique in correctly matching the image of a face with another image of the same face. These results were compared to the R² values obtained when only two dimensions are used and show that using three dimensions increases the ability to correctly match and discriminate between faces.

  13. A probabilistic map of the human ventral sensorimotor cortex using electrical stimulation.

    Science.gov (United States)

    Breshears, Jonathan D; Molinaro, Annette M; Chang, Edward F

    2015-08-01

    The human ventral sensorimotor cortex (vSMC) is involved in facial expression, mastication, and swallowing, as well as the dynamic and highly coordinated movements of human speech production. However, vSMC organization remains poorly understood, and previously published population-driven maps of its somatotopy do not accurately reflect the variability across individuals in a quantitative, probabilistic fashion. The goal of this study was to describe the responses to electrical stimulation of the vSMC, generate probabilistic maps of function in the vSMC, and quantify the variability across individuals. Photographic, video, and stereotactic MRI data of intraoperative electrical stimulation of the vSMC were collected for 33 patients undergoing awake craniotomy. Stimulation sites were converted to a 2D coordinate system based on anatomical landmarks. Motor, sensory, and speech stimulation responses were reviewed and classified. Probabilistic maps of stimulation responses were generated, and spatial variance was quantified. In 33 patients, the authors identified 194 motor, 212 sensory, 61 speech-arrest, and 27 mixed responses. Responses were complex, stereotyped, and mostly nonphysiological movements, involving hand, orofacial, and laryngeal musculature. Within individuals, the presence of oral movement representations varied; however, the dorsal-ventral order was always preserved. The most robust motor responses were jaw (probability 0.85), tongue (0.64), lips (0.58), and throat (0.52). Vocalizations were seen in 6 patients (0.18), more dorsally near lip and dorsal throat areas. Sensory responses were spatially dispersed; however, patients' subjective reports were highly precise in localization within the mouth. The most robust responses included tongue (0.82) and lips (0.42). The probability of speech arrest was 0.85, highest 15-20 mm anterior to the central sulcus and just dorsal to the sylvian fissure, in the anterior precentral gyrus or pars opercularis.

  14. Cytoarchitectonical analysis and probabilistic mapping of two extrastriate areas of the human posterior fusiform gyrus.

    Science.gov (United States)

    Caspers, Julian; Zilles, Karl; Eickhoff, Simon B; Schleicher, Axel; Mohlberg, Hartmut; Amunts, Katrin

    2013-03-01

    The human extrastriate visual cortex comprises numerous functionally defined areas, which are not identified in the widely used cytoarchitectonical map of Brodmann. The ventral part of the extrastriate cortex is particularly devoted to the identification of visual objects, faces and word forms. We analyzed the region immediately antero-lateral to hOc4v in serially sectioned (20 μm) and cell body-stained human brains using a quantitative observer-independent cytoarchitectonical approach to further identify the anatomical organization of the extrastriate cortex. Two novel cytoarchitectonical areas, FG1 and FG2, were identified on the posterior fusiform gyrus. The results of ten postmortem brains were then registered to their MRI volumes (acquired before histological processing), 3D reconstructed, and spatially normalized to the Montreal Neurological Institute reference brain. Finally, probabilistic maps were generated for each cytoarchitectonical area by superimposing the areas of the individual brains in the reference space. Comparison with recent functional imaging studies showed that both areas are located within the object-related visual cortex. FG1 fills the gap between the retinotopically mapped area VO-1 and a posterior fusiform face patch. FG2 is probably the correlate of this face patch.

  15. Probabilistic Flood Maps to support decision-making: Mapping the Value of Information

    Science.gov (United States)

    Alfonso, L.; Mukolwe, M. M.; Di Baldassarre, G.

    2016-02-01

    Floods are one of the most frequent and disruptive natural hazards that affect man. Annually, significant flood damage is documented worldwide. Flood mapping is a common pre-impact flood hazard mitigation measure, for which advanced methods and tools (such as flood inundation models) are used to estimate potential flood extent maps that are used in spatial planning. However, these tools are affected, largely to an unknown degree, by both epistemic and aleatory uncertainty. Over the past few years, advances in uncertainty analysis with respect to flood inundation modeling show that it is appropriate to adopt Probabilistic Flood Maps (PFM) to account for uncertainty. However, the following question arises: how can probabilistic flood hazard information be incorporated into spatial planning? Thus, a consistent framework to incorporate PFMs into decision-making is required. In this paper, a novel methodology based on Decision-Making under Uncertainty theories, in particular Value of Information (VOI), is proposed. Specifically, the methodology entails the use of a PFM to generate a VOI map, which highlights floodplain locations where additional information is valuable with respect to available floodplain management actions and their potential consequences. The methodology is illustrated with a simplified example and also applied to a real case study in the South of France, where a VOI map is analyzed on the basis of historical land use change decisions over a period of 26 years. Results show that uncertain flood hazard information encapsulated in PFMs can aid decision-making in floodplain planning.
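
    As a worked miniature of the VOI idea (a stylized two-action example, not the paper's exact formulation): for each floodplain cell with PFM flooding probability p, compare the best expected payoff achievable under the prior with the expected payoff if the true flood state were known before acting; the difference is the value of information for that cell.

        def voi_cell(p_flood, damage_if_flooded, development_benefit):
            """Value of information for a 'develop vs. restrict' choice in
            one floodplain cell, given PFM flooding probability p_flood.
            Assumes development_benefit > 0; restricting pays 0."""
            # Best expected payoff acting under the prior alone.
            develop = development_benefit - p_flood * damage_if_flooded
            best_prior = max(develop, 0.0)
            # Expected payoff if the true state were revealed before acting.
            best_informed = (p_flood * max(development_benefit - damage_if_flooded, 0.0)
                             + (1.0 - p_flood) * development_benefit)
            return best_informed - best_prior

    Cells where p_flood sits near the decision threshold receive the highest VOI, which is precisely where extra information would change the planning action.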

  16. Up-to-date Probabilistic Earthquake Hazard Maps for Egypt

    Science.gov (United States)

    Gaber, Hanan; El-Hadidy, Mahmoud; Badawy, Ahmed

    2018-04-01

    An up-to-date earthquake hazard analysis has been performed in Egypt using a probabilistic seismic hazard approach. Through the current study, we use a complete and homogeneous earthquake catalog covering the time period between 2200 BC and 2015 AD. Three seismotectonic models representing the seismic activity in and around Egypt are used. A logic-tree framework is applied to allow for the epistemic uncertainty in the declustering parameters, minimum magnitude, seismotectonic setting and ground-motion prediction equations. The hazard analysis is performed on a 0.5° × 0.5° grid for rock site conditions, for the peak ground acceleration (PGA) and spectral accelerations at 0.2-, 0.5-, 1.0- and 2.0-s periods. The hazard is estimated for three return periods (72, 475 and 2475 years), corresponding to 50, 10 and 2% probability of exceedance in 50 years. The uniform hazard spectra for the cities of Cairo, Alexandria, Aswan and Nuwbia are constructed. The hazard maps show that the highest ground acceleration values are expected in the northeastern part of Egypt around the Gulf of Aqaba (PGA up to 0.4 g for return period 475 years) and in south Egypt around the city of Aswan (PGA up to 0.2 g for return period 475 years). The Western Desert of Egypt is characterized by the lowest level of hazard (PGA lower than 0.1 g for return period 475 years).
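
    The pairing of return periods with exceedance probabilities quoted here follows from the usual Poisson occurrence model, p = 1 - exp(-t/T) for exposure time t. A quick check in Python:

        import math

        def return_period(p_exceed, exposure_years=50.0):
            """Return period T satisfying p = 1 - exp(-t/T) under a
            Poisson occurrence model with exposure time t."""
            return -exposure_years / math.log(1.0 - p_exceed)

        print(round(return_period(0.50)))  # 72 years
        print(round(return_period(0.10)))  # 475 years
        print(round(return_period(0.02)))  # 2475 years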

  17. Common Fixed Points of Mappings and Set-Valued Mappings in Symmetric Spaces with Application to Probabilistic Spaces

    OpenAIRE

    M. Aamri; A. Bassou; S. Bennani; D. El Moutawakil

    2007-01-01

    The main purpose of this paper is to give some common fixed point theorems of mappings and set-valued mappings of a symmetric space with some applications to probabilistic spaces. In order to get these results, we define the concept of E-weak compatibility between set-valued and single-valued mappings of a symmetric space.

  18. Anatomical parcellation of the brainstem and cerebellar white matter: a preliminary probabilistic tractography study at 3 T

    International Nuclear Information System (INIS)

    Habas, Christophe; Cabanis, Emmanuel A.

    2007-01-01

    The aims of this study were: (1) to test whether higher spatial resolution diffusion tensor images and a higher field strength (3 T) enable a more accurate delineation of the anatomical tracts within the brainstem, and, in particular, (2) to try to distinguish the different components of the corticopontocerebellar paths in terms of their cortical origins. The main tracts of the brainstem of four volunteers were studied at 3 T using probabilistic diffusion tensor imaging (DTI) axonal tracking. The resulting tractograms enabled well-delineated anatomical structures to be identified on the diffusion tensor coloured images. We tracked corticopontine, corticospinal, central tegmental, inferior and superior cerebellopeduncular, transverse, medial lemniscal and, possibly, longitudinal medial fibres. Moreover, DTI tracking allowed a broad delineation of the corticopontocerebellar paths. Diffusion tensor coloured images allow rapid and reliable access to a broad white matter parcellation of the brainstem and cerebellum, which can be completed by fibre tracking. However, a more accurate and exhaustive depiction of the anatomical connectivity within the brainstem requires the application of more sophisticated techniques and tractography algorithms, such as diffusion spectrum imaging. (orig.)

  19. Anatomical parcellation of the brainstem and cerebellar white matter: a preliminary probabilistic tractography study at 3 T

    Energy Technology Data Exchange (ETDEWEB)

    Habas, Christophe; Cabanis, Emmanuel A. [UPMC Paris 6, Service de NeuroImagerie, Hopital des Quinze-Vingts, Paris (France)

    2007-10-15

    The aims of this study were: (1) to test whether higher spatial resolution diffusion tensor images and a higher field strength (3 T) enable a more accurate delineation of the anatomical tracts within the brainstem, and, in particular, (2) to try to distinguish the different components of the corticopontocerebellar paths in terms of their cortical origins. The main tracts of the brainstem of four volunteers were studied at 3 T using probabilistic diffusion tensor imaging (DTI) axonal tracking. The resulting tractograms enabled well-delineated anatomical structures to be identified on the diffusion tensor coloured images. We tracked corticopontine, corticospinal, central tegmental, inferior and superior cerebellopeduncular, transverse, medial lemniscal and, possibly, longitudinal medial fibres. Moreover, DTI tracking allowed a broad delineation of the corticopontocerebellar paths. Diffusion tensor coloured images allow rapid and reliable access to a broad white matter parcellation of the brainstem and cerebellum, which can be completed by fibre tracking. However, a more accurate and exhaustive depiction of the anatomical connectivity within the brainstem requires the application of more sophisticated techniques and tractography algorithms, such as diffusion spectrum imaging. (orig.)

  20. Anatomic mapping of molecular subtypes in diffuse glioma.

    Science.gov (United States)

    Tang, Qisheng; Lian, Yuxi; Yu, Jinhua; Wang, Yuanyuan; Shi, Zhifeng; Chen, Liang

    2017-09-15

    Tumor location, an important prognostic factor in glioma patients, has been thought to reflect molecular features according to the cell-of-origin theory. However, the anatomic distribution of distinct molecular subtypes has not been widely investigated, and the relationship between molecular phenotype and histological subgroup with respect to tumor location remains vague. Our group focuses on the anatomic location of distinctive molecular subgroups and histological subtypes in glioma, and explores the possibility of their consistency based on clinical background. We retrospectively reviewed 143 cases with both molecular information (IDH1/TERT/1p19q) and MRI images diagnosed as cerebral diffuse gliomas. The anatomic distribution was analyzed between distinctive molecular subgroups and its relationship with histological subtypes. The influence of tumor location, molecular stratification and histological diagnosis on survival outcome was investigated as well. Anatomic locations of cerebral diffuse glioma indicate varied clinical outcomes. On that basis, gliomas can be stratified into five principal molecular subgroups according to IDH1/TERT/1p19q status. Triple-positive (IDH1 and TERT mutation with 1p19q codeletion) gliomas tended to be oligodendrogliomas presenting with much better clinical outcomes than the TERT-mutation-only group, which leaned toward glioblastoma (median overall survival 39 months vs. 18 months). The five molecular subgroups showed distinctive locational distributions, and these anatomic features are consistent with the corresponding histological subtypes. Each molecular subgroup in glioma has a unique anatomic location which indicates a distinctive clinical outcome. Molecular diagnosis can serve as a perfect complementary tool for precise diagnosis. Integration of histomolecular diagnosis will be much more helpful in routine clinical practice in the future.

  1. Meditation effects within the hippocampal complex revealed by voxel-based morphometry and cytoarchitectonic probabilistic mapping

    Directory of Open Access Journals (Sweden)

    Eileen eLuders

    2013-07-01

    Scientific studies addressing anatomical variations in meditators’ brains have emerged rapidly over the last few years, where significant links are most frequently reported with respect to gray matter (GM). To advance prior work, this study examined GM characteristics in a large sample of 100 subjects (50 meditators, 50 controls), where meditators have been practicing close to twenty years, on average. A standard, whole-brain voxel-based morphometry approach was applied and revealed significant meditation effects in the vicinity of the hippocampus, showing more GM in meditators than in controls as well as positive correlations with the number of years practiced. However, the hippocampal complex is regionally segregated by architecture, connectivity, and functional relevance. Thus, to establish differential effects within the hippocampal formation (cornu ammonis, fascia dentata, entorhinal cortex, subiculum) as well as the hippocampal-amygdaloid transition area, we utilized refined cytoarchitectonic probabilistic maps of (peri-)hippocampal subsections. Significant meditation effects were observed within the subiculum specifically. Since the subiculum is known to play a key role in stress regulation and meditation is an established form of stress reduction, these GM findings may reflect neuronal preservation in long-term meditators – perhaps due to an attenuated release of stress hormones and decreased neurotoxicity.

  2. Meditation effects within the hippocampal complex revealed by voxel-based morphometry and cytoarchitectonic probabilistic mapping

    Science.gov (United States)

    Luders, Eileen; Kurth, Florian; Toga, Arthur W.; Narr, Katherine L.; Gaser, Christian

    2013-01-01

    Scientific studies addressing anatomical variations in meditators' brains have emerged rapidly over the last few years, where significant links are most frequently reported with respect to gray matter (GM). To advance prior work, this study examined GM characteristics in a large sample of 100 subjects (50 meditators, 50 controls), where meditators have been practicing close to 20 years, on average. A standard, whole-brain voxel-based morphometry approach was applied and revealed significant meditation effects in the vicinity of the hippocampus, showing more GM in meditators than in controls as well as positive correlations with the number of years practiced. However, the hippocampal complex is regionally segregated by architecture, connectivity, and functional relevance. Thus, to establish differential effects within the hippocampal formation (cornu ammonis, fascia dentata, entorhinal cortex, subiculum) as well as the hippocampal-amygdaloid transition area, we utilized refined cytoarchitectonic probabilistic maps of (peri-) hippocampal subsections. Significant meditation effects were observed within the subiculum specifically. Since the subiculum is known to play a key role in stress regulation and meditation is an established form of stress reduction, these GM findings may reflect neuronal preservation in long-term meditators—perhaps due to an attenuated release of stress hormones and decreased neurotoxicity. PMID:23847572

  3. Linking retinotopic fMRI mapping and anatomical probability maps of human occipital areas V1 and V2.

    Science.gov (United States)

    Wohlschläger, A M; Specht, K; Lie, C; Mohlberg, H; Wohlschläger, A; Bente, K; Pietrzyk, U; Stöcker, T; Zilles, K; Amunts, K; Fink, G R

    2005-05-15

    Using functional MRI, we characterized field sign maps of the occipital cortex and created three-dimensional maps of these areas. By averaging the individual maps into group maps, probability maps of functionally defined V1 or V2 were determined and compared to anatomical probability maps of Brodmann areas BA17 and BA18 derived from cytoarchitectonic analysis (Amunts, K., Malikovic, A., Mohlberg, H., Schormann, T., Zilles, K., 2000. Brodmann's areas 17 and 18 brought into stereotaxic space-where and how variable? NeuroImage 11, 66-84). Comparison of areas BA17/V1 and BA18/V2 revealed good agreement of the anatomical and functional probability maps. Taking into account that our functional stimulation (due to constraints of the visual angle of stimulation achievable in the MR scanner) only identified parts of V1 and V2, for statistical evaluation of the spatial correlation of V1 and BA17, or V2 and BA18, respectively, the a priori measure kappa was calculated testing the hypothesis that a region can only be part of functionally defined V1 or V2 if it is also in anatomically defined BA17 or BA18, respectively. kappa = 1 means the hypothesis is fully true, kappa = 0 means functionally and anatomically defined visual areas are independent. When applying this measure to the probability maps, kappa was equal to 0.84 for both V1/BA17 and V2/BA18. The data thus show a good correspondence of functionally and anatomically derived segregations of early visual processing areas and serve as a basis for employing anatomical probability maps of V1 and V2 in group analyses to characterize functional activations of early visual processing areas.
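
    The kappa used in this record is an asymmetric containment measure, not Cohen's kappa: it asks what fraction of the functionally defined area lies inside the anatomically defined one. On binarized volumes it reduces to the overlap fraction below (the study itself evaluates it on probability maps, so this is a simplified sketch):

        import numpy as np

        def containment_kappa(functional, anatomical):
            """Fraction of the functionally defined volume (e.g., V1) that
            falls inside the anatomically defined volume (e.g., BA17):
            1 = fully contained, 0 = disjoint."""
            functional = functional.astype(bool)
            anatomical = anatomical.astype(bool)
            return float((functional & anatomical).sum() / functional.sum())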

  4. A Probabilistic Feature Map-Based Localization System Using a Monocular Camera.

    Science.gov (United States)

    Kim, Hyungjin; Lee, Donghwa; Oh, Taekjun; Choi, Hyun-Taek; Myung, Hyun

    2015-08-31

    Image-based localization is one of the most widely researched localization techniques in the robotics and computer vision communities. As enormous image data sets are provided through the Internet, many studies on estimating a location with a pre-built image-based 3D map have been conducted. Most research groups use numerous image data sets that contain sufficient features. In contrast, this paper focuses on image-based localization in the case of insufficient images and features. A more accurate localization method is proposed based on a probabilistic map using 3D-to-2D matching correspondences between a map and a query image. The probabilistic feature map is generated in advance by probabilistic modeling of the sensor system as well as the uncertainties of camera poses. Using the conventional PnP algorithm, an initial camera pose is estimated on the probabilistic feature map. The proposed algorithm is optimized from the initial pose by minimizing Mahalanobis distance errors between features from the query image and the map to improve accuracy. To verify that the localization accuracy is improved, the proposed algorithm is compared with the conventional algorithm in simulation and real environments.
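
    A hedged sketch of the described pipeline, using OpenCV and SciPy: the conventional PnP algorithm supplies the initial pose, and a refinement step minimizes Mahalanobis-weighted reprojection errors. The per-feature 2x2 image-space covariances are assumed to have been derived from the probabilistic feature map beforehand.

        import numpy as np
        import cv2
        from scipy.optimize import least_squares

        def localize(map_pts, map_covs, img_pts, K):
            """map_pts: (N,3) float64 3D map points; map_covs: N 2x2
            image-space covariances; img_pts: (N,2) query-image features;
            K: 3x3 camera intrinsics. Returns the refined (rvec, tvec)."""
            # Initial pose estimate from the conventional PnP algorithm.
            ok, rvec, tvec = cv2.solvePnP(map_pts, img_pts, K, None)

            def residuals(x):
                proj, _ = cv2.projectPoints(map_pts, x[:3], x[3:], K, None)
                err = proj.reshape(-1, 2) - img_pts
                # Whiten each residual so its squared norm equals the
                # Mahalanobis distance under that feature's covariance.
                out = [np.linalg.cholesky(np.linalg.inv(c)).T @ e
                       for e, c in zip(err, map_covs)]
                return np.concatenate(out)

            x0 = np.concatenate([rvec.ravel(), tvec.ravel()])
            sol = least_squares(residuals, x0)
            return sol.x[:3], sol.x[3:]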

  5. A Probabilistic Feature Map-Based Localization System Using a Monocular Camera

    Directory of Open Access Journals (Sweden)

    Hyungjin Kim

    2015-08-01

    Image-based localization is one of the most widely researched localization techniques in the robotics and computer vision communities. As enormous image data sets are provided through the Internet, many studies on estimating a location with a pre-built image-based 3D map have been conducted. Most research groups use numerous image data sets that contain sufficient features. In contrast, this paper focuses on image-based localization in the case of insufficient images and features. A more accurate localization method is proposed based on a probabilistic map using 3D-to-2D matching correspondences between a map and a query image. The probabilistic feature map is generated in advance by probabilistic modeling of the sensor system as well as the uncertainties of camera poses. Using the conventional PnP algorithm, an initial camera pose is estimated on the probabilistic feature map. The proposed algorithm is optimized from the initial pose by minimizing Mahalanobis distance errors between features from the query image and the map to improve accuracy. To verify that the localization accuracy is improved, the proposed algorithm is compared with the conventional algorithm in simulation and real environments.

  6. Development of Probabilistic Flood Inundation Mapping For Flooding Induced by Dam Failure

    Science.gov (United States)

    Tsai, C.; Yeh, J. J. J.

    2017-12-01

    A primary function of flood inundation mapping is to forecast flood hazards and assess potential losses. However, uncertainties limit the reliability of inundation hazard assessments. Major sources of uncertainty should be taken into consideration by an optimal flood management strategy. This study focuses on the 20-km reach downstream of the Shihmen Reservoir in Taiwan. A dam-failure-induced flood provides the upstream boundary conditions for the flood routing herein. The two major sources of uncertainty considered in the hydraulic model and the flood inundation mapping are the uncertainty in the dam break model and the uncertainty of the roughness coefficient. The perturbance moment method is applied to a dam break model and the hydrosystem model to develop probabilistic flood inundation mapping. Various numbers of uncertain variables can be considered in these models and the variability of outputs can be quantified. Probabilistic flood inundation mapping for dam-break-induced floods can thus be developed, with the variability of output taken into account, using the commonly used HEC-RAS model. Different probabilistic flood inundation mappings are discussed and compared. Probabilistic flood inundation mappings are expected to provide new physical insights in support of the evaluation of reservoir flooded areas of concern.

  7. Cytoarchitectonic identification and probabilistic mapping of two distinct areas within the anterior ventral bank of the human intraparietal sulcus

    Science.gov (United States)

    Choi, Hi-Jae; Zilles, Karl; Mohlberg, Hartmut; Schleicher, Axel; Fink, Gereon R.; Armstrong, Este; Amunts, Katrin

    2008-01-01

    Anatomical studies in the macaque cortex and functional imaging studies in humans have demonstrated the existence of different cortical areas within the IntraParietal Sulcus (IPS). Such functional segregation, however, does not correlate with presently available architectonic maps of the human brain. This is particularly true for the classical Brodmann map, which is still widely used as an anatomical reference in functional imaging studies. The aim of this cytoarchitectonic mapping study was to use previously defined algorithms to determine whether consistent regions and borders can be found within the cortex of the anterior IPS in a population of ten postmortem human brains. Two areas, the human IntraParietal area 1 (hIP1) and the human IntraParietal area 2 (hIP2), were delineated in serial histological sections of the anterior, lateral bank of the human IPS. The region hIP1 is located posterior and medial to hIP2, and the former is always within the depths of the IPS. The latter, on the other hand, sometimes reaches the free surface of the superior parietal lobule. The delineations were registered to standard reference space, and probabilistic maps were calculated, thereby quantifying the intersubject variability in location and extent of both areas. In the future, they can be a tool in analyzing structure – function relationships and a basis for determining degrees of homology in the IPS among anthropoid primates. We conclude that the human intraparietal sulcus has a finer grained parcellation than shown in Brodmann’s map. PMID:16432904

  8. A Probabilistic Approach for Improved Sequence Mapping in Metatranscriptomic Studies

    Science.gov (United States)

    Mapping millions of short DNA sequences to a reference genome is a necessary step in many experiments designed to investigate the expression of genes involved in disease resistance. This is a difficult task in which several challenges often arise, resulting in a suboptimal mapping. This mapping process ...

  9. A common fixed point theorem for weakly compatible mappings in Menger probabilistic quasi metric space

    Directory of Open Access Journals (Sweden)

    Badridatt Pant

    2014-02-01

    In this paper, we prove a common fixed point theorem for a finite number of self-mappings in Menger probabilistic quasi-metric space. Our result improves and extends the results of Rezaiyan et al. [A common fixed point theorem in Menger probabilistic quasi-metric spaces, Chaos, Solitons and Fractals 37 (2008) 1153-1157], Miheţ [A note on a fixed point theorem in Menger probabilistic quasi-metric spaces, Chaos, Solitons and Fractals 40 (2009) 2349-2352], Pant and Chauhan [Fixed points theorems in Menger probabilistic quasi metric spaces using weak compatibility, Internat. Math. Forum 5(6) (2010) 283-290] and Sastry et al. [A fixed point theorem in Menger PQM-spaces using weak compatibility, Internat. Math. Forum 5(52) (2010) 2563-2568].

  10. Seismic risk maps of Switzerland; description of the probabilistic method and discussion of some input parameters

    International Nuclear Information System (INIS)

    Mayer-Rosa, D.; Merz, H.A.

    1976-01-01

    The probabilistic model used in a seismic risk mapping project for Switzerland is presented. Some of its advantages and limitations are spelled out. In addition, some earthquake parameters which should be carefully investigated before being used in a seismic risk analysis are discussed.

  11. MRI anatomical mapping and direct stereotactic targeting in the subthalamic region: functional and anatomical correspondence in Parkinson's disease

    International Nuclear Information System (INIS)

    Lemaire, Jean-Jacques; Coste, Jerome; Ouchchane, Lemlih; Hemm, Simone; Derost, Philippe; Ulla, Miguel; Durif, Franck; Siadoux, Severine; Gabrillargues, Jean; Chazal, Jean

    2007-01-01

    Object: Relationships between clinical effects, anatomy, and electrophysiology are not fully understood in DBS of the subthalamic region in Parkinson's disease. We propose an anatomical study based on direct image-guided stereotactic surgery with a multiple-source data analysis. Materials and Methods: A manual anatomical mapping was realized on coronal 1.5-T MRI of 15 patients. Biological data were collected under local anesthesia: the spontaneous neuronal activity, the clinical efficacy, and the appearance of adverse effects. These were related to relevant current values (mA): the benefit threshold (bt, the minimal current producing a clear benefit), the adverse-effect threshold (at, the minimal current producing an adverse effect) and the stimulation margin (sm = at - bt); they were matched with anatomy. Results: We found consistent relationships between anatomy and biological data. The optimal stimulation parameters (low bt + high sm) were noted in the dorsolateral STN. The highest spontaneous neuronal activity was found in the ventromedial STN. The dorsolateral (sensorimotor) STN seems to be the main DBS effector, and the highest spontaneous neuronal activity seems related to the anterior (rostral) ventromedial (limbic) STN. Conclusion: 1.5-T images provide sufficiently detailed subthalamic anatomy for image-guided stereotactic surgery and may aid in understanding DBS mechanisms. (orig.)

  12. Quantitative probabilistic functional diffusion mapping in newly diagnosed glioblastoma treated with radiochemotherapy.

    Science.gov (United States)

    Ellingson, Benjamin M; Cloughesy, Timothy F; Lai, Albert; Nghiemphu, Phioanh L; Liau, Linda M; Pope, Whitney B

    2013-03-01

    Functional diffusion mapping (fDM) is a cancer imaging technique that uses voxel-wise changes in apparent diffusion coefficients (ADC) to evaluate response to treatment. Despite promising initial results, uncertainty in image registration remains the largest barrier to widespread clinical application. The current study introduces a probabilistic approach to fDM quantification to overcome some of these limitations. A total of 143 patients with newly diagnosed glioblastoma who were undergoing standard radiochemotherapy were enrolled in this retrospective study. Traditional and probabilistic fDMs were calculated using ADC maps acquired before and after therapy. Probabilistic fDMs were calculated by applying random, finite translational and rotational perturbations to both pre- and posttherapy ADC maps, then repeating calculation of fDMs reflecting changes after treatment, resulting in probabilistic fDMs showing the voxel-wise probability of fDM classification. Probabilistic fDMs were then compared with traditional fDMs in their ability to predict progression-free survival (PFS) and overall survival (OS). Probabilistic fDMs applied to patients with newly diagnosed glioblastoma treated with radiochemotherapy demonstrated shortened PFS and OS among patients with a large volume of tumor with decreasing ADC evaluated at the posttreatment time with respect to the baseline scans. Alternatively, patients with a large volume of tumor with increasing ADC evaluated at the posttreatment time with respect to baseline scans were more likely to progress later and live longer. Probabilistic fDMs performed better than traditional fDMs at predicting 12-month PFS and 24-month OS using receiver operating characteristic analysis. Univariate log-rank analysis on Kaplan-Meier data also revealed that probabilistic fDMs could better separate patients on the basis of PFS and OS, compared with traditional fDMs. Results suggest that probabilistic fDMs are a more predictive biomarker than traditional fDMs in newly diagnosed glioblastoma.
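
    In outline (a sketch of the stated procedure, with the classification threshold and perturbation ranges as assumptions): perturb the co-registered ADC maps with small random rigid transforms, reclassify every voxel after each perturbation, and report voxel-wise classification probabilities instead of a single hard label.

        import numpy as np
        from scipy.ndimage import rotate, shift

        rng = np.random.default_rng(0)

        def probabilistic_fdm(adc_pre, adc_post, thresh=0.4, n=100):
            """Voxel-wise probabilities of 'ADC decreased'/'ADC increased'
            classification under random rigid registration perturbations.
            thresh is in the units of the ADC maps (value assumed here)."""
            dec = np.zeros(adc_pre.shape)
            inc = np.zeros(adc_pre.shape)
            for _ in range(n):
                # Independent small translations (voxels) and in-plane
                # rotations (degrees) model registration uncertainty.
                pre = rotate(shift(adc_pre, rng.uniform(-1, 1, 3)),
                             rng.uniform(-1, 1), axes=(0, 1), reshape=False)
                post = rotate(shift(adc_post, rng.uniform(-1, 1, 3)),
                              rng.uniform(-1, 1), axes=(0, 1), reshape=False)
                diff = post - pre
                dec += diff < -thresh
                inc += diff > thresh
            return dec / n, inc / n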

  13. A probabilistic approach using deformable organ models for automatic definition of normal anatomical structures for 3D treatment planning

    International Nuclear Information System (INIS)

    Fritsch, Daniel; Yu Liyun; Johnson, Valen; McAuliffe, Matthew; Pizer, Stephen; Chaney, Edward

    1996-01-01

    Purpose/Objective: Current clinical methods for defining normal anatomical structures on tomographic images are time consuming and subject to intra- and inter-user variability. With the widespread implementation of 3D RTP, conformal radiotherapy, and dose escalation, the implications of imprecise object definition have assumed a much higher level of importance. Object definition and volume-weighted metrics for normal anatomy, such as DVHs and NTCPs, play critical roles in aiming, shaping, and weighting beams. Improvements in object definition, including computer automation, are essential to yield reliable volume-weighted metrics and gains in human efficiency. The purpose of this study was to investigate a probabilistic approach using deformable models to automatically recognize and extract normal anatomy in tomographic images. Materials and Methods: Object models were created from normal organs that were segmented by an interactive method which involved placing a cursor near the center of the object on a slice and clicking a mouse button to initiate computation of structures called cores. Cores describe the skeletal and boundary shape of image objects in a manner that, in 2D, associates a location on the skeleton with the width of the object at that location. A significant advantage of cores is stability against image disturbances such as noise and blur. The model was composed of a relatively small set of extracted points on the skeleton and boundary. The points were carefully chosen to summarize the shape information captured by the cores. Neighborhood relationships between points were represented mathematically by energy functions that penalize, according to the degree of warping of the model, the "goodness" of match between the model and the image data at any stage during the segmentation process. The model was matched against the image data using a probabilistic approach based on Bayes theorem, which provides a means for computing a posteriori (posterior) probability from 1) a
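
    The record is cut off mid-derivation, but the Bayesian step it describes has the standard form: the posterior probability of a deformed model configuration given the image combines the core-derived shape prior with an image-match likelihood,

        \[
          P(\mathrm{model} \mid \mathrm{image}) \propto
          P(\mathrm{image} \mid \mathrm{model})\, P(\mathrm{model}),
        \]

    where P(model) penalizes warping away from the trained model geometry and P(image | model) rewards agreement between the deformed model and the image data.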

  14. Probabilistic Cosmological Mass Mapping from Weak Lensing Shear

    Energy Technology Data Exchange (ETDEWEB)

    Schneider, M. D.; Dawson, W. A. [Lawrence Livermore National Laboratory, Livermore, CA 94551 (United States); Ng, K. Y. [University of California, Davis, Davis, CA 95616 (United States); Marshall, P. J. [Kavli Institute for Particle Astrophysics and Cosmology, Stanford University, Stanford, CA 94035 (United States); Meyers, J. E. [Department of Astrophysical Sciences, Princeton University, Princeton, NJ 08544 (United States); Bard, D. J., E-mail: schneider42@llnl.gov, E-mail: dstn@cmu.edu, E-mail: boutigny@in2p3.fr, E-mail: djbard@slac.stanford.edu, E-mail: jmeyers314@stanford.edu [National Energy Research Scientific Computing Center, Lawrence Berkeley National Laboratory, 1 Cyclotron Road, Berkeley, CA 94720-8150 (United States)

    2017-04-10

    We infer gravitational lensing shear and convergence fields from galaxy ellipticity catalogs under a spatial process prior for the lensing potential. We demonstrate the performance of our algorithm with simulated Gaussian-distributed cosmological lensing shear maps and a reconstruction of the mass distribution of the merging galaxy cluster Abell 781 using galaxy ellipticities measured with the Deep Lens Survey. Given interim posterior samples of lensing shear or convergence fields on the sky, we describe an algorithm to infer cosmological parameters via lens field marginalization. In the most general formulation of our algorithm we make no assumptions about weak shear or Gaussian-distributed shape noise or shears. Because we require solutions and matrix determinants of a linear system of dimension that scales with the number of galaxies, we expect our algorithm to require parallel high-performance computing resources for application to ongoing wide field lensing surveys.

  15. Probabilistic Mapping of Human Visual Attention from Head Pose Estimation

    Directory of Open Access Journals (Sweden)

    Andrea Veronese

    2017-10-01

    Full Text Available Effective interaction between a human and a robot requires the bidirectional perception and interpretation of actions and behavior. While actions can be identified as a directly observable activity, this might not be sufficient to deduce actions in a scene. For example, orienting our face toward a book might suggest the action toward “reading.” For a human observer, this deduction requires the direction of gaze, the object identified as a book and the intersection between gaze and book. With this in mind, we aim to estimate and map human visual attention as directed to a scene, and assess how this relates to the detection of objects and their related actions. In particular, we consider human head pose as a measurement to infer the attention of a human engaged in a task and study which prior knowledge should be included in such a detection system. In a user study, we show the successful detection of attention to objects in a typical office task scenario (i.e., reading, working with a computer, studying an object). Our system requires a single external RGB camera for head pose measurements and a pre-recorded 3D point cloud of the environment.
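
    A sketch of one plausible way to turn an estimated head-pose ray into a probabilistic attention map over a pre-recorded point cloud; the Gaussian-over-angle model and all parameter values are assumptions for illustration, not the authors' system:

```python
import numpy as np

def attention_scores(head_pos, gaze_dir, cloud, sigma_deg=10.0):
    """Score every 3-D point by its angular distance to the head-pose ray."""
    v = cloud - head_pos                          # vectors from head to points
    v /= np.linalg.norm(v, axis=1, keepdims=True)
    g = gaze_dir / np.linalg.norm(gaze_dir)
    ang = np.degrees(np.arccos(np.clip(v @ g, -1.0, 1.0)))
    return np.exp(-0.5 * (ang / sigma_deg) ** 2)  # unnormalized attention weight

cloud = np.random.default_rng(1).uniform(0.0, 5.0, size=(1000, 3))
scores = attention_scores(np.zeros(3), np.array([1.0, 0.0, 0.0]), cloud)
```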

  16. Anatomically standardized statistical mapping of 123I-IMP SPECT in brain tumors

    International Nuclear Information System (INIS)

    Shibata, Yasushi; Akimoto, Manabu; Matsushita, Akira; Yamamoto, Tetsuya; Takano, Shingo; Matsumura, Akira

    2010-01-01

    123I-iodoamphetamine Single Photon Emission Computed Tomography (IMP SPECT) is used to evaluate cerebral blood flow. However, the application of IMP SPECT to patients with brain tumors has rarely been reported. Primary central nervous system lymphoma (PCNSL) is a rare tumor that shows delayed IMP uptake. The relatively low spatial resolution of SPECT is a clinical problem in diagnosing brain tumors. We examined anatomically standardized statistical mapping of IMP SPECT in patients with brain lesions. This study included 49 IMP SPECT images for 49 patients with brain lesions: 20 PCNSL, 1 Burkitt's lymphoma, 14 gliomas, 4 other tumors, 7 inflammatory diseases, and 3 without any pathological diagnosis but a clinical diagnosis of PCNSL. After intravenous injection of 222 MBq of 123I-IMP, early (15 minutes) and delayed (4 hours) images were acquired using a multi-detector SPECT machine. All SPECT data were transferred to a newly developed software program, iNeurostat+ (Nihon Medi-physics). SPECT data were anatomically standardized on normal brain images, and regions of increased IMP uptake were statistically mapped on the tomographic images of the normal brain. Eighteen patients showed high uptake in the delayed IMP SPECT images (16 PCNSL, 2 unknown). No other tumors or diseases showed high uptake on delayed IMP SPECT, so there were no false positives. Four patients with pathologically proven PCNSL showed no uptake on the original IMP SPECT images; these tumors were too small to be detected by IMP SPECT. However, statistical mapping revealed IMP uptake in 18 of 20 pathologically verified PCNSL patients. Heterogeneous IMP uptake was seen in tumors that appeared homogeneous on MRI. For patients with hot IMP uptake, statistical mapping showed the uptake more clearly. IMP SPECT is a sensitive test for diagnosing PCNSL, although it produced false-negative results for small posterior fossa tumors. Anatomically standardized statistical mapping is therefore considered a useful method for improving the diagnostic accuracy of IMP SPECT.

  17. Resolution and Probabilistic Models of Components in CryoEM Maps of Mature P22 Bacteriophage

    Science.gov (United States)

    Pintilie, Grigore; Chen, Dong-Hua; Haase-Pettingell, Cameron A.; King, Jonathan A.; Chiu, Wah

    2016-01-01

    CryoEM continues to produce density maps of larger and more complex assemblies with multiple protein components of mixed symmetries. Resolution is not always uniform throughout a cryoEM map, and it can be useful to estimate the resolution in specific molecular components of a large assembly. In this study, we present procedures to 1) estimate the resolution in subcomponents by gold-standard Fourier shell correlation (FSC); 2) validate modeling procedures, particularly at medium resolutions, which can include loop modeling and flexible fitting; and 3) build probabilistic models that combine high-accuracy priors (such as crystallographic structures) with medium-resolution cryoEM densities. As an example, we apply these methods to new cryoEM maps of the mature bacteriophage P22, reconstructed without imposing icosahedral symmetry. Resolution estimates based on gold-standard FSC show the highest resolution in the coat region (7.6 Å), whereas other components are at slightly lower resolutions: portal (9.2 Å), hub (8.5 Å), tailspike (10.9 Å), and needle (10.5 Å). These differences are indicative of inherent structural heterogeneity and/or reconstruction accuracy in different subcomponents of the map. Probabilistic models for these subcomponents provide, to our knowledge, new insights and structural information that take into account the uncertainty imposed by the limitations of the observed density. PMID:26743049
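
    A compact NumPy sketch of a gold-standard FSC between two independent half-maps; the shell spacing and the conventional 0.143 resolution criterion are generic choices, not parameters taken from the paper:

```python
import numpy as np

def fourier_shell_correlation(half1, half2, n_shells=30):
    """FSC(k): normalized cross-correlation of two half-maps per frequency shell."""
    f1, f2 = np.fft.fftn(half1), np.fft.fftn(half2)
    freqs = np.meshgrid(*[np.fft.fftfreq(s) for s in half1.shape], indexing="ij")
    r = np.sqrt(sum(f ** 2 for f in freqs))       # radial frequency per voxel
    edges = np.linspace(0.0, 0.5, n_shells + 1)
    fsc = np.zeros(n_shells)
    for i, (lo, hi) in enumerate(zip(edges[:-1], edges[1:])):
        m = (r >= lo) & (r < hi)
        if not m.any():
            continue
        num = np.sum(f1[m] * np.conj(f2[m]))
        den = np.sqrt(np.sum(np.abs(f1[m]) ** 2) * np.sum(np.abs(f2[m]) ** 2))
        fsc[i] = (num / den).real
    return fsc  # resolution: first shell where FSC falls below 0.143
```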

  18. Probabilistic flood inundation mapping at ungauged streams due to roughness coefficient uncertainty in hydraulic modelling

    Science.gov (United States)

    Papaioannou, George; Vasiliades, Lampros; Loukas, Athanasios; Aronica, Giuseppe T.

    2017-04-01

    Probabilistic flood inundation mapping is performed and analysed at the ungauged Xerias stream reach, Volos, Greece. The study evaluates the uncertainty introduced by roughness coefficient values into hydraulic modelling and flood inundation mapping. The well-established one-dimensional (1-D) hydraulic model HEC-RAS is selected and linked to Monte Carlo simulations of hydraulic roughness. Terrestrial laser scanner data have been used to produce a high-quality DEM, minimising input data uncertainty and improving the accuracy of the stream channel topography required by the hydraulic model. Initial Manning's n roughness coefficient values are based on pebble-count field surveys and empirical formulas. Various theoretical probability distributions are fitted and evaluated on their accuracy in representing the estimated roughness values. Finally, Latin Hypercube Sampling has been used to generate different sets of Manning roughness values, and flood inundation probability maps have been created with the use of Monte Carlo simulations. Historical flood extent data from an extreme historical flash flood event are used for validation of the method. The calibration process is based on binary wet-dry reasoning with the use of the median absolute percentage error evaluation metric. The results show that the proposed procedure supports probabilistic flood hazard mapping at ungauged rivers and provides water resources managers with valuable information for planning and implementing flood risk mitigation strategies.
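
    A minimal sketch of the stratified sampling step, assuming a hypothetical lognormal fit to the field-estimated Manning's n (the study evaluates several candidate distributions before choosing one):

```python
import numpy as np
from scipy.stats import qmc, lognorm

n_dist = lognorm(s=0.25, scale=0.035)      # assumed fit to roughness estimates
sampler = qmc.LatinHypercube(d=1, seed=42)
u = sampler.random(n=500).ravel()          # stratified uniforms in (0, 1)
manning_n = n_dist.ppf(u)                  # one roughness value per model run
# Each sample parameterizes one hydraulic simulation; the probabilistic flood
# map is then the per-cell fraction of Monte Carlo runs predicting inundation
```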

  19. Generation of pseudo-random numbers from given probabilistic distribution with the use of chaotic maps

    Science.gov (United States)

    Lawnik, Marcin

    2018-01-01

    The scope of the paper is the presentation of a new method of generating numbers from a given distribution. The method uses the inverse cumulative distribution function together with a method for flattening probabilistic distributions. On the grounds of these methods, a new construction of chaotic maps was derived which generates values from a given distribution. The analysis of the new method was conducted on the example of newly constructed chaotic recurrences based on the Box-Muller transformation and the quantile function of the exponential distribution. The obtained results confirm that the proposed method may be successfully applied to the construction of pseudo-random number generators.
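
    The flattening-plus-quantile principle can be illustrated with the classic logistic map at r = 4, whose invariant density is known in closed form; this is only an illustration of the idea, not the paper's construction:

```python
import numpy as np

def chaotic_exponential(n, lam=1.0, x0=0.3):
    """Exponential-like variates driven by the logistic map x -> 4x(1-x).
    For r = 4 the invariant density is 1/(pi*sqrt(x(1-x))), so applying its
    CDF, u = (2/pi)*arcsin(sqrt(x)), flattens the orbit to uniform; the
    exponential quantile function then maps u to the target distribution."""
    out = np.empty(n)
    x = x0
    for i in range(n):
        x = 4.0 * x * (1.0 - x)                    # chaotic recurrence
        u = (2.0 / np.pi) * np.arcsin(np.sqrt(x))  # flatten to U(0, 1)
        u = np.clip(u, 1e-12, 1.0 - 1e-12)
        out[i] = -np.log(1.0 - u) / lam            # inverse CDF of Exp(lam)
    return out
```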

  20. Effects of shipping on marine acoustic habitats in Canadian Arctic estimated via probabilistic modeling and mapping.

    Science.gov (United States)

    Aulanier, Florian; Simard, Yvan; Roy, Nathalie; Gervaise, Cédric; Bandet, Marion

    2017-12-15

    Canadian Arctic and Subarctic regions experience a rapid decrease of sea ice accompanied by increasing shipping traffic. The resulting time-space changes in shipping noise are studied for four key regions of this pristine environment, for 2013 traffic conditions and a hypothetical tenfold traffic increase. A probabilistic modeling and mapping framework, called Ramdam, which integrates the intrinsic variability and uncertainties of shipping noise and its effects on marine habitats, is developed and applied. A substantial transformation of soundscapes is observed in areas where shipping noise changes from an occasional, transient contributor at present to a dominant noise source. Examination of impacts on low-frequency mammals within ecologically and biologically significant areas reveals that shipping noise has the potential to trigger behavioral responses and masking in the future, although no risk of temporary or permanent hearing threshold shifts is noted. Such probabilistic modeling and mapping is strategic for marine spatial planning around this emerging noise issue. Crown Copyright © 2017. Published by Elsevier Ltd. All rights reserved.

  1. Statistical probabilistic mapping in the individual brain space: decreased metabolism in epilepsy with FDG PET

    International Nuclear Information System (INIS)

    Oh, Jung Su; Lee, Jae Sung; Kim, Yu Kyeong; Chung, June Key; Lee, Myung Chul; Lee, Dong Soo

    2005-01-01

    In statistical probabilistic mapping, differences between two or more groups of subjects are commonly analyzed following spatial normalization. However, to the best of our knowledge, few studies have performed the statistical mapping in the individual brain space rather than in the stereotaxic brain space, i.e., the template space. Therefore, in the current study, a new method for mapping statistical results from the template space onto the individual brain space was developed. Four young subjects with epilepsy and thirty age-matched normal healthy subjects were recruited. Both FDG PET and T1 structural MRI scans were acquired for these groups. Statistical analysis of the decreased FDG metabolism in epilepsy was performed in SPM with a two-sample t-test (p < 0.001, intensity threshold 100). To map the statistical results onto the individual space, inverse deformation was performed as follows. Using the SPM deformation toolbox, DCT (discrete cosine transform) basis-encoded deformation fields between the individual T1 images and the T1 MNI template were obtained. Afterward, the inverses of those fields, i.e., inverse deformation fields, were computed. Since both PET and T1 images had already been normalized to the same MNI space, the inversely deformed PET results lie in the individual brain MRI space. By applying the inverse deformation field to the statistical results of the PET analysis, statistical maps of decreased metabolism in the individual spaces were obtained. With the statistical results in the template space, the decreased metabolism was localized in the inferior temporal lobe, slightly inferior to the hippocampus. The statistical results in the individual space were commonly located in the hippocampus, where metabolism should be decreased according to a priori neuroscientific knowledge. With our newly developed statistical mapping in the individual spaces, the localization of the functional mapping became more appropriate in the neuroscientific sense.

  2. Statistical probabilistic mapping in the individual brain space: decreased metabolism in epilepsy with FDG PET

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Jung Su; Lee, Jae Sung; Kim, Yu Kyeong; Chung, June Key; Lee, Myung Chul; Lee, Dong Soo [Seoul National University Hospital, Seoul (Korea, Republic of)

    2005-07-01

    In statistical probabilistic mapping, differences between two or more groups of subjects are commonly analyzed following spatial normalization. However, to the best of our knowledge, few studies have performed the statistical mapping in the individual brain space rather than in the stereotaxic brain space, i.e., the template space. Therefore, in the current study, a new method for mapping statistical results from the template space onto the individual brain space was developed. Four young subjects with epilepsy and thirty age-matched normal healthy subjects were recruited. Both FDG PET and T1 structural MRI scans were acquired for these groups. Statistical analysis of the decreased FDG metabolism in epilepsy was performed in SPM with a two-sample t-test (p < 0.001, intensity threshold 100). To map the statistical results onto the individual space, inverse deformation was performed as follows. Using the SPM deformation toolbox, DCT (discrete cosine transform) basis-encoded deformation fields between the individual T1 images and the T1 MNI template were obtained. Afterward, the inverses of those fields, i.e., inverse deformation fields, were computed. Since both PET and T1 images had already been normalized to the same MNI space, the inversely deformed PET results lie in the individual brain MRI space. By applying the inverse deformation field to the statistical results of the PET analysis, statistical maps of decreased metabolism in the individual spaces were obtained. With the statistical results in the template space, the decreased metabolism was localized in the inferior temporal lobe, slightly inferior to the hippocampus. The statistical results in the individual space were commonly located in the hippocampus, where metabolism should be decreased according to a priori neuroscientific knowledge. With our newly developed statistical mapping in the individual spaces, the localization of the functional mapping became more appropriate in the neuroscientific sense.
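
    Conceptually, applying an inverse deformation field amounts to pulling the template-space statistical map back through per-voxel template coordinates; a minimal sketch under an assumed field layout (not SPM's internal file format):

```python
import numpy as np
from scipy.ndimage import map_coordinates

def warp_to_individual(stat_map_template, inv_field):
    """Resample a template-space statistical map into individual space.
    inv_field has shape (3, x, y, z): for every individual-space voxel it
    stores the corresponding (i, j, k) coordinate in template space."""
    return map_coordinates(stat_map_template, inv_field, order=1, mode="nearest")
```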

  3. Towards optical spectroscopic anatomical mapping (OSAM) for lesion validation in cardiac tissue (Conference Presentation)

    Science.gov (United States)

    Singh-Moon, Rajinder P.; Zaryab, Mohammad; Hendon, Christine P.

    2017-02-01

    Electroanatomical mapping (EAM) is an invaluable tool for guiding cardiac radiofrequency ablation (RFA) therapy. The principal roles of EAM are the identification of candidate ablation sites, by detecting regions of abnormal electrogram activity, and lesion validation subsequent to RF energy delivery. However, incomplete lesions may present interim electrical inactivity similar to effective treatment in the acute setting, despite efforts to reveal them with pacing or drugs, such as adenosine. Studies report that the misidentification and recovery of such lesions is a leading cause of arrhythmia recurrence and repeat procedures. In previous work, we demonstrated spectroscopic characterization of cardiac tissues using a fiber optic-integrated RF ablation catheter. In this work, we introduce OSAM (optical spectroscopic anatomical mapping), the application of this spectroscopic technique to obtain 2-dimensional biodistribution maps. We demonstrate its diagnostic potential as an auxiliary method for lesion validation in treated swine preparations. Endocardial lesion sets were created on fresh swine cardiac samples using a commercial RFA system. An optically-integrated catheter console fabricated in-house was used for measurement of tissue optical spectra between 600 and 1000 nm. Three-dimensional, spatio-spectral datasets were generated by raster scanning of the optical catheter across the treated sample surface in the presence of whole blood. Tissue optical parameters were recovered at each spatial position using an inverse Monte Carlo method. OSAM biodistribution maps showed close correspondence with gross examination of tetrazolium chloride-stained tissue specimens. Specifically, we demonstrate the ability of OSAM to readily distinguish between shallow and deeper lesions, a limitation of current EAM techniques. These results showcase OSAM's potential for lesion validation strategies in the treatment of cardiac arrhythmias.

  4. Analysis of lesions in patients with unilateral tactile agnosia using cytoarchitectonic probabilistic maps.

    Science.gov (United States)

    Hömke, Lars; Amunts, Katrin; Bönig, Lutz; Fretz, Christian; Binkofski, Ferdinand; Zilles, Karl; Weder, Bruno

    2009-05-01

    We propose a novel methodological approach to lesion analyses involving high-resolution MR images in combination with probabilistic cytoarchitectonic maps. 3D-MR images of the whole brain and the manually segmented lesion mask are spatially normalized to the reference brain of a stereotaxic probabilistic cytoarchitectonic atlas using a multiscale registration algorithm based on an elastic model. The procedure is demonstrated in three patients suffering from aperceptive tactile agnosia of the right hand due to chronic infarction of the left parietal cortex. Patient 1 presents a lesion in areas of the postcentral sulcus, Patient 3 in areas of the superior parietal lobule and adjacent intraparietal sulcus, and Patient 2 presents lesions in both regions. On the basis of neurobehavioral data, we conjectured degradation of sequential elementary sensory information processing within the postcentral gyrus, impeding texture recognition in Patients 1 and 2, and disturbed kinaesthetic information processing in the posterior parietal lobe, causing degraded shape recognition in Patients 2 and 3. The involvement of Brodmann areas 4a, 4p, 3a, 3b, 1, 2, and areas IP1 and IP2 of the intraparietal sulcus was assessed in terms of the voxel overlap between the spatially transformed lesion masks and the 50%-isocontours of the cytoarchitectonic maps. The disruption of the critical cytoarchitectonic areas and the impaired subfunctions, texture and shape recognition, relate as conjectured above. We conclude that the proposed method represents a promising approach to hypothesis-driven lesion analyses, yielding lesion-function correlates based on a cytoarchitectonic model. Finally, the lesion-function correlates are validated by functional imaging reference data. (c) 2008 Wiley-Liss, Inc.
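
    A minimal sketch of the overlap measure, assuming the lesion mask and the cytoarchitectonic probability map have already been brought into the same reference space:

```python
import numpy as np

def overlap_with_area(lesion_mask, prob_map, threshold=0.5):
    """Voxel overlap between a lesion mask and the 50%-isocontour region
    of a cytoarchitectonic probability map."""
    area = prob_map >= threshold                   # 50% isocontour region
    inter = np.logical_and(lesion_mask > 0, area)
    return int(inter.sum()), inter.sum() / max(int(area.sum()), 1)
```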

  5. Probabilistic mapping of flood-induced backscatter changes in SAR time series

    Science.gov (United States)

    Schlaffer, Stefan; Chini, Marco; Giustarini, Laura; Matgen, Patrick

    2017-04-01

    The information content of flood extent maps can be increased considerably by including information on the uncertainty of the flood area delineation. This additional information can be of benefit in flood forecasting and monitoring. Furthermore, flood probability maps can be converted to binary maps showing flooded and non-flooded areas by applying a threshold probability value pF = 0.5. In this study, a probabilistic change detection approach for flood mapping based on synthetic aperture radar (SAR) time series is proposed. For this purpose, conditional probability density functions (PDFs) for land and open water surfaces were estimated from ENVISAT ASAR Wide Swath (WS) time series containing >600 images using a reference mask of permanent water bodies. A pixel-wise harmonic model was used to account for seasonality in backscatter from land areas caused by soil moisture and vegetation dynamics. The approach was evaluated for a large-scale flood event along the River Severn, United Kingdom. The retrieved flood probability maps were compared to a reference flood mask derived from high-resolution aerial imagery by means of reliability diagrams. The obtained performance measures indicate both high reliability and confidence, although there was a slight underestimation of the flood extent, which may in part be attributed to topographically induced radar shadows along the edges of the floodplain. Furthermore, the results highlight the importance of the local incidence angle for the separability between flooded and non-flooded areas, as specular reflection from open water surfaces increases with a more oblique viewing geometry.
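
    A toy sketch of the pixel-wise Bayes step with assumed Gaussian class-conditional PDFs; in the study, the land parameters would come from the pixel-wise harmonic (seasonal) model rather than constants:

```python
import numpy as np
from scipy.stats import norm

def flood_probability(sigma0, land_mu, land_sd, water_mu, water_sd, prior_w=0.5):
    """Posterior probability of open water given backscatter sigma0 (dB)."""
    p_w = norm.pdf(sigma0, water_mu, water_sd) * prior_w
    p_l = norm.pdf(sigma0, land_mu, land_sd) * (1.0 - prior_w)
    return p_w / (p_w + p_l)

pF = flood_probability(np.array([-18.0, -8.5]), -9.0, 2.0, -17.0, 1.5)
flooded = pF > 0.5          # binary flood map via the threshold pF = 0.5
```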

  6. Developing a Korean standard brain atlas on the basis of statistical and probabilistic approach and visualization tool for functional image analysis

    Energy Technology Data Exchange (ETDEWEB)

    Koo, B. B.; Lee, J. M.; Kim, J. S.; Kim, I. Y.; Kim, S. I. [Hanyang University, Seoul (Korea, Republic of); Lee, J. S.; Lee, D. S.; Kwon, J. S. [Seoul National University College of Medicine, Seoul (Korea, Republic of); Kim, J. J. [Yonsei University College of Medicine, Seoul (Korea, Republic of)

    2003-06-01

    Probabilistic anatomical maps are used to localize functional neuroimages and to characterize morphological variability. A quantitative indicator is very important for determining the anatomical position of an activated region, because functional image data are of low resolution and carry no inherent anatomical information. Although the previously developed MNI probabilistic anatomical map was sufficient to localize such data, it was not suitable for Korean brains because of the morphological differences between Occidental and Oriental populations. In this study, we developed a probabilistic anatomical map for the normal Korean brain. T1-weighted spoiled gradient echo magnetic resonance images of 75 normal brains were acquired on a 1.5-T GE SIGNA scanner. A standard brain was then selected from the group by a clinician who searched for a brain with average properties in the Talairach coordinate system. On this standard brain, an anatomist delineated 89 regions of interest (ROIs) parcellating cortical and subcortical areas. The parcellated ROIs of the standard were warped onto each brain by maximizing intensity similarity, and every brain was automatically labeled with the registered ROIs. Each same-labeled region was linearly normalized to the standard brain, and the occurrence of each region was counted. Finally, 89 probabilistic ROI volumes were generated. This paper presents a probabilistic anatomical map for localizing functional and structural analyses of the normal Korean brain. In the future, we will develop group-specific probabilistic anatomical maps for OCD and schizophrenia.

  7. Developing a Korean standard brain atlas on the basis of statistical and probabilistic approach and visualization tool for functional image analysis

    International Nuclear Information System (INIS)

    Koo, B. B.; Lee, J. M.; Kim, J. S.; Kim, I. Y.; Kim, S. I.; Lee, J. S.; Lee, D. S.; Kwon, J. S.; Kim, J. J.

    2003-01-01

    Probabilistic anatomical maps are used to localize functional neuroimages and to characterize morphological variability. A quantitative indicator is very important for determining the anatomical position of an activated region, because functional image data are of low resolution and carry no inherent anatomical information. Although the previously developed MNI probabilistic anatomical map was sufficient to localize such data, it was not suitable for Korean brains because of the morphological differences between Occidental and Oriental populations. In this study, we developed a probabilistic anatomical map for the normal Korean brain. T1-weighted spoiled gradient echo magnetic resonance images of 75 normal brains were acquired on a 1.5-T GE SIGNA scanner. A standard brain was then selected from the group by a clinician who searched for a brain with average properties in the Talairach coordinate system. On this standard brain, an anatomist delineated 89 regions of interest (ROIs) parcellating cortical and subcortical areas. The parcellated ROIs of the standard were warped onto each brain by maximizing intensity similarity, and every brain was automatically labeled with the registered ROIs. Each same-labeled region was linearly normalized to the standard brain, and the occurrence of each region was counted. Finally, 89 probabilistic ROI volumes were generated. This paper presents a probabilistic anatomical map for localizing functional and structural analyses of the normal Korean brain. In the future, we will develop group-specific probabilistic anatomical maps for OCD and schizophrenia.
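
    The occurrence-counting step can be written in a few lines; a minimal sketch assuming the 75 label volumes have already been registered to the standard brain:

```python
import numpy as np

def probabilistic_roi(label_volumes, roi_label):
    """Probabilistic ROI volume: per-voxel fraction of subjects in which the
    voxel carries the given ROI label after normalization."""
    stack = np.stack([vol == roi_label for vol in label_volumes])
    return stack.mean(axis=0)     # occurrence count divided by N subjects
```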

  8. Resting state cortico-cerebellar functional connectivity networks: A comparison of anatomical and self-organizing map approaches

    Directory of Open Access Journals (Sweden)

    Jessica A Bernard

    2012-08-01

    Full Text Available The cerebellum plays a role in a wide variety of complex behaviors. In order to better understand the role of the cerebellum in human behavior, it is important to know how this structure interacts with cortical and other subcortical regions of the brain. To date, several studies have investigated the cerebellum using resting-state functional connectivity magnetic resonance imaging (fcMRI; Buckner et al., 2011; Krienen & Buckner, 2009; O’Reilly et al., 2009). However, none of this work has taken an anatomically-driven approach. Furthermore, though detailed maps of cerebral cortex and cerebellum networks have been proposed using different network solutions based on the cerebral cortex (Buckner et al., 2011), it remains unknown whether or not an anatomical lobular breakdown best encompasses the networks of the cerebellum. Here, we used fcMRI to create an anatomically-driven cerebellar connectivity atlas. Timecourses were extracted from the lobules of the right hemisphere and vermis. We found distinct networks for the individual lobules with a clear division into motor and non-motor regions. We also used a self-organizing map algorithm to parcellate the cerebellum. This allowed us to investigate redundancy and independence of the anatomically identified cerebellar networks. We found that while anatomical boundaries in the anterior cerebellum provide functional subdivisions of a larger motor grouping defined using our self-organizing map algorithm, in the posterior cerebellum the lobules were made up of sub-regions associated with distinct functional networks. Together, our results indicate that the lobular boundaries of the human cerebellum are not indicative of functional boundaries, though anatomical divisions can be useful, as is the case of the anterior cerebellum. Additionally, driving the analyses from the cerebellum is key to determining the complete picture of functional connectivity within the structure.
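
    A compact, self-contained SOM sketch over voxel timecourses; the grid size, learning schedule, and neighborhood kernel are illustrative choices, not the authors' algorithm:

```python
import numpy as np

def som_parcellate(timecourses, grid=(4, 4), iters=5000, lr0=0.5, sigma0=1.5, seed=0):
    """Map each voxel timecourse to a node of a small 2-D SOM grid; voxels
    sharing a node form one data-driven parcel."""
    rng = np.random.default_rng(seed)
    h, w = grid
    nodes = rng.standard_normal((h * w, timecourses.shape[1]))
    coords = np.array([(i, j) for i in range(h) for j in range(w)], float)
    for t in range(iters):
        x = timecourses[rng.integers(len(timecourses))]
        bmu = np.argmin(((nodes - x) ** 2).sum(axis=1))   # best-matching unit
        frac = t / iters
        lr = lr0 * (1.0 - frac)
        sigma = sigma0 * (1.0 - frac) + 0.5
        d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
        nbh = np.exp(-d2 / (2.0 * sigma ** 2))            # neighborhood kernel
        nodes += lr * nbh[:, None] * (x - nodes)
    dists = ((timecourses[:, None, :] - nodes[None]) ** 2).sum(-1)
    return dists.argmin(axis=1)                           # parcel id per voxel
```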

  9. Satellite Based Probabilistic Snow Cover Extent Mapping (SCE) at Hydro-Québec

    Science.gov (United States)

    Teasdale, Mylène; De Sève, Danielle; Angers, Jean-François; Perreault, Luc

    2016-04-01

    Over 40% of Canada's water resources are in Quebec, and Hydro-Québec has developed the potential to become one of the largest producers of hydroelectricity in the world, with a total installed capacity of 36,643 MW. The Hydro-Québec generating fleet includes 27 large reservoirs with a combined storage capacity of 176 TWh, as well as 668 dams and 98 control structures. Thus, over 98% of all electricity used to supply the domestic market comes from water resources, and the excess output is sold on the wholesale markets. In this perspective, efficient management of water resources is needed, based primarily on good river flow estimation and appropriate hydrological data. Snow on the ground is one of the significant variables, representing 30% to 40% of the annual energy reserve. More specifically, information on snow cover extent (SCE) and snow water equivalent (SWE) is crucial for hydrological forecasting, particularly in northern regions, since the snowmelt provides the water that fills the reservoirs and is subsequently used for hydropower generation. For several years, Hydro-Québec's research institute (IREQ) has developed several algorithms to map SCE and SWE. So far, all of these methods have been deterministic. However, given the need to maximize the efficient use of all resources while ensuring reliability, electrical systems must now be managed taking into account all risks. Since snow cover estimation is based on limited spatial information, it is important to quantify and handle its uncertainty in the hydrological forecasting system. This paper presents the first results of a probabilistic algorithm for mapping SCE by combining Bayesian mixtures of probability distributions and multiple logistic regression models applied to passive microwave data. This approach assigns, for each grid point, probabilities to the set of mutually exclusive discrete outcomes "snow" and "no snow". Its performance was evaluated using the Brier score, since it is particularly appropriate for evaluating probabilistic forecasts of binary events.

  10. Assessment of dynamic probabilistic methods for mapping snow cover in Québec Canada

    Science.gov (United States)

    De Seve, D.; Perreault, L.; Vachon, F.; Guay, F.; choquette, Y.

    2012-04-01

    with an ensemble mapping approach. The ensemble was generated from a Monte Carlo method. The second one relies on a probabilistic clustering method based on Bayesian Gaussian mixture models. Mixtures of probability distributions become natural models to represent data sets where the observations may have arisen from several distinct statistical populations. Each method can provide a map of uncertainty for the ground and the snow classes, which is a huge benefit for forecasters. Initial results have shown the difficulty of mapping the border between the snow and the ground with traditional approaches. In addition, the application of the mixture models reveals the presence of a third class, which seems to characterize the transition zone between snow and soil.
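
    A minimal sketch of the mixture-model step using scikit-learn with synthetic stand-in features; the three-component reading (snow, ground, and a transition zone) follows the text above:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Stand-in passive-microwave features (e.g., brightness-temperature
# differences between channels), one row per grid point
X = np.random.default_rng(1).normal(size=(5000, 2))
gmm = GaussianMixture(n_components=3, random_state=0).fit(X)
post = gmm.predict_proba(X)   # per-point probabilities for the three classes
# Each column yields an uncertainty map for one class instead of a hard label
```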

  11. Mapping Cropland in Smallholder-Dominated Savannas: Integrating Remote Sensing Techniques and Probabilistic Modeling

    Directory of Open Access Journals (Sweden)

    Sean Sweeney

    2015-11-01

    Full Text Available Traditional smallholder farming systems dominate the savanna range countries of sub-Saharan Africa and provide the foundation for the region’s food security. Despite continued expansion of smallholder farming into the surrounding savanna landscapes, food insecurity in the region persists. Central to the monitoring of food security in these countries, and to understanding the processes behind it, are reliable, high-quality datasets of cultivated land. Remote sensing has been frequently used for this purpose but distinguishing crops under certain stages of growth from savanna woodlands has remained a major challenge. Yet, crop production in dryland ecosystems is most vulnerable to seasonal climate variability, amplifying the need for high quality products showing the distribution and extent of cropland. The key objective in this analysis is the development of a classification protocol for African savanna landscapes, emphasizing the delineation of cropland. We integrate remote sensing techniques with probabilistic modeling into an innovative workflow. We present summary results for this methodology applied to a land cover classification of Zambia’s Southern Province. Five primary land cover categories are classified for the study area, producing an overall map accuracy of 88.18%. Omission error within the cropland class is 12.11% and commission error 9.76%.
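
    The reported omission and commission errors follow directly from a confusion matrix; a minimal sketch:

```python
import numpy as np

def class_errors(confusion, k):
    """Omission and commission error for class k, given a confusion matrix
    with reference labels in rows and mapped labels in columns."""
    omission = 1.0 - confusion[k, k] / confusion[k, :].sum()    # missed reference
    commission = 1.0 - confusion[k, k] / confusion[:, k].sum()  # false inclusions
    return omission, commission
```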

  12. Landslide susceptibility mapping along PLUS expressways in Malaysia using probabilistic based model in GIS

    Science.gov (United States)

    Yusof, Norbazlan M.; Pradhan, Biswajeet

    2014-06-01

    PLUS Berhad holds the concession for a total of 987 km of toll expressways in Malaysia, the longest of which is the North-South Expressway (NSE). Acting as the 'backbone' of the west coast of the peninsula, the NSE stretches from the Malaysian-Thai border in the north to the border with neighbouring Singapore in the south, linking several major cities and towns along the way. The North-South Expressway contributes to the country's economic development through the trade, social, and tourism sectors. Presently, the highway is in good condition and connects every state, but some locations need urgent attention. The stability of slopes at these locations is of greatest concern, as any instability can endanger motorists. In this paper, two study locations are analysed: Gua Tempurung (a soil slope) and Jelapang (a rock slope), which clearly have different characteristics. The expressway passes through undulating terrain with steep slopes at these locations, where landslides are common and the probability of slope instability due to human activities in surrounding areas is high. A database of twelve (12) landslide conditioning factors was compiled: factors such as slope degree and slope aspect were extracted from IFSAR (interferometric synthetic aperture radar) data, while land use, lithology, and structural geology were constructed from the interpretation of high-resolution satellite data from WorldView-2, QuickBird, and Ikonos. All this information was analysed in a geographic information system (GIS) environment for landslide susceptibility mapping using a probabilistic frequency-ratio model. In addition, information on the slopes, such as inventories, condition assessments, and maintenance records, was assessed through the total expressway maintenance management system, better known as TEMAN. The above-mentioned system is used by PLUS as an asset management and decision support tool for maintenance activities along the highways as well as for data

  13. Landslide susceptibility mapping along PLUS expressways in Malaysia using probabilistic based model in GIS

    International Nuclear Information System (INIS)

    Yusof, Norbazlan M; Pradhan, Biswajeet

    2014-01-01

    PLUS Berhad holds the concession for a total of 987 km of toll expressways in Malaysia, the longest of which is the North-South Expressway (NSE). Acting as the 'backbone' of the west coast of the peninsula, the NSE stretches from the Malaysian-Thai border in the north to the border with neighbouring Singapore in the south, linking several major cities and towns along the way. The North-South Expressway contributes to the country's economic development through the trade, social, and tourism sectors. Presently, the highway is in good condition and connects every state, but some locations need urgent attention. The stability of slopes at these locations is of greatest concern, as any instability can endanger motorists. In this paper, two study locations are analysed: Gua Tempurung (a soil slope) and Jelapang (a rock slope), which clearly have different characteristics. The expressway passes through undulating terrain with steep slopes at these locations, where landslides are common and the probability of slope instability due to human activities in surrounding areas is high. A database of twelve (12) landslide conditioning factors was compiled: factors such as slope degree and slope aspect were extracted from IFSAR (interferometric synthetic aperture radar) data, while land use, lithology, and structural geology were constructed from the interpretation of high-resolution satellite data from WorldView-2, QuickBird, and Ikonos. All this information was analysed in a geographic information system (GIS) environment for landslide susceptibility mapping using a probabilistic frequency-ratio model. In addition, information on the slopes, such as inventories, condition assessments, and maintenance records, was assessed through the total expressway maintenance management system, better known as TEMAN. The above-mentioned system is used by PLUS as an asset management and decision support tool for maintenance activities along the highways as well as for
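
    A minimal sketch of the frequency-ratio statistic for one conditioning-factor layer (array names are illustrative):

```python
import numpy as np

def frequency_ratio(factor_class, landslide_mask):
    """FR per class = (% of landslide pixels in class) / (% of all pixels in
    class); FR > 1 marks classes more susceptible than average. Summing the
    FR layers over all twelve factors gives a susceptibility index."""
    total_ls, total = landslide_mask.sum(), factor_class.size
    fr = {}
    for c in np.unique(factor_class):
        m = factor_class == c
        fr[c] = (landslide_mask[m].sum() / total_ls) / (m.sum() / total)
    return fr
```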

  14. Isthmus sites identified by Ripple Mapping are usually anatomically stable: A novel method to guide atrial substrate ablation?

    Science.gov (United States)

    Luther, Vishal; Qureshi, Norman; Lim, Phang Boon; Koa-Wing, Michael; Jamil-Copley, Shahnaz; Ng, Fu Siong; Whinnett, Zachary; Davies, D Wyn; Peters, Nicholas S; Kanagaratnam, Prapa; Linton, Nick

    2018-03-01

    Postablation reentrant ATs depend upon conducting isthmuses bordered by scar. Bipolar voltage maps highlight scar as sites of low voltage, but the voltage amplitude of an electrogram depends upon the myocardial activation sequence. Furthermore, a voltage threshold that defines atrial scar is unknown. We used Ripple Mapping (RM) to test whether these isthmuses were anatomically fixed between different activation vectors and atrial rates. We studied post-AF ablation ATs where >1 rhythm was mapped. Multipolar catheters were used with CARTO Confidense for high-density mapping. RM visualized the pattern of activation and the voltage threshold below which no activation was seen. Isthmuses were characterized at this threshold between maps for each patient. Ten patients were studied (Map 1 was AT1; Map 2: sinus 1/10, LA paced 2/10, AT2 with reverse CS activation 3/10; AT2 CL difference 50 ± 30 ms). Point density was similar between maps (Map 1: 2,589 ± 1,330; Map 2: 2,214 ± 1,384; P = 0.31). The RM activation threshold was 0.16 ± 0.08 mV. Thirty-one isthmuses were identified in Map 1 (median 3 per map; width 27 ± 15 mm; 7 anterior; 6 roof; 8 mitral; 9 septal; 1 posterior). Importantly, 7 of 31 (23%) isthmuses were unexpectedly identified within regions without prior ablation. AT1 was treated following ablation of 11/31 (35%) isthmuses. Of the remaining 20 isthmuses, 14 of 16 (88%) were consistent between the two maps (four were inadequately mapped). Wavefront collision caused variation in low-voltage distribution in 2 of 16 (12%). The distribution of isthmuses and nonconducting tissue within the ablated left atrium, as defined by RM, appears concordant between rhythms. This could guide a substrate ablative approach. © 2018 Wiley Periodicals, Inc.

  15. Seismic hazard in Hawaii: High rate of large earthquakes and probabilistic ground-motion maps

    Science.gov (United States)

    Klein, F.W.; Frankel, A.D.; Mueller, C.S.; Wesson, R.L.; Okubo, P.G.

    2001-01-01

    The seismic hazard and earthquake occurrence rates in Hawaii are locally as high as those near the most hazardous faults elsewhere in the United States. We have generated maps of peak ground acceleration (PGA) and spectral acceleration (SA) (at 0.2, 0.3 and 1.0 sec, 5% critical damping) at 2% and 10% exceedance probabilities in 50 years. The highest hazard is on the south side of Hawaii Island, as indicated by the MI 7.0, MS 7.2, and MI 7.9 earthquakes that have occurred there since 1868. Probabilistic values of horizontal PGA (2% in 50 years) on Hawaii's south coast exceed 1.75g. Because some large earthquake aftershock zones and the geometry of flank blocks slipping on subhorizontal decollement faults are known, we use a combination of spatially uniform sources in active flank blocks and smoothed seismicity in other areas to model seismicity. Rates of earthquakes are derived from magnitude distributions of the modern (1959-1997) catalog of the Hawaiian Volcano Observatory's seismic network supplemented by the historic (1868-1959) catalog. Modern magnitudes are ML measured on a Wood-Anderson seismograph or MS. Historic magnitudes may add ML measured on a Milne-Shaw or Bosch-Omori seismograph or MI derived from calibrated areas of MM intensities. Active flank areas, which by far account for the highest hazard, are characterized by distributions with b slopes of about 1.0 below M 5.0 and about 0.6 above M 5.0. The kinked distribution means that large earthquake rates would be grossly underestimated by extrapolating small earthquake rates, and that longer catalogs are essential for estimating or verifying the rates of large earthquakes. Flank earthquakes thus follow a semicharacteristic model, which is a combination of background seismicity and an excess number of large earthquakes. Flank earthquakes are geometrically confined to rupture zones on the volcano flanks by barriers such as rift zones and the seaward edge of the volcano, which may be expressed by a magnitude
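
    A sketch of a kinked Gutenberg-Richter rate using the b slopes quoted above; the a-value and kink magnitude are placeholders, not the paper's fitted values:

```python
import numpy as np

def kinked_gr_rate(m, a=4.0, b_lo=1.0, b_hi=0.6, m_kink=5.0):
    """Cumulative annual rate N(>= m) with slope b_lo below the kink and
    b_hi above it, joined continuously at m_kink."""
    m = np.asarray(m, dtype=float)
    log_n = np.where(m < m_kink,
                     a - b_lo * m,
                     a - b_lo * m_kink - b_hi * (m - m_kink))
    return 10.0 ** log_n

# Extrapolating the steep small-magnitude slope (b = 1.0) past M 5 predicts
# far fewer large events than the flatter b = 0.6 branch actually implies
```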

  16. Probabilistic tephra hazard maps for the Neapolitan area: Quantitative volcanological study of Campi Flegrei eruptions

    Science.gov (United States)

    Mastrolorenzo, G.; Pappalardo, L.; Troise, C.; Panizza, A.; de Natale, G.

    2008-07-01

    Tephra fall is a relevant hazard of the Campi Flegrei caldera (Southern Italy), due to the high vulnerability of the Naples metropolitan area to such an event. Here, tephra derive from magmatic as well as phreatomagmatic activity. On the basis of both new and literature data on known past eruptions (Volcanic Explosivity Index (VEI), grain-size parameters, velocity at the vent, column heights, and erupted mass), and on factors controlling tephra dispersion (wind velocity and direction), 2D numerical simulations of fallout dispersion and deposition have been performed for a large number of case events. A Bayesian inversion has been applied to retrieve the best values of critical parameters (e.g., vertical mass distribution, diffusion coefficients, velocity at the vent) not directly inferable by volcanological study. Simulations are run in parallel on multiple processors to allow a fully probabilistic analysis on a very large catalogue preserving the statistical properties of the past eruptive history. Using the simulation results, hazard maps have been computed for different scenarios: an upper-limit scenario (worst-expected scenario), an eruption-range scenario, and a whole-eruption scenario. Results indicate that although high hazard characterizes the Campi Flegrei caldera, the territory to the east of the caldera center, including the whole district of Naples, is exposed to high hazard values due to the dominant westerly winds. Consistent with the stratigraphic evidence on the nature of past eruptions, our numerical simulations reveal that even in the case of a sub-Plinian eruption (VEI = 3), Naples is exposed to tephra fall thicknesses of some decimeters, thereby exceeding the critical limit for roof collapse. Because of the total number of people living in Campi Flegrei and the city of Naples (ca. two million inhabitants), the tephra fallout risk related to a Plinian eruption of Campi Flegrei largely matches or exceeds the risk related to a similar eruption at Vesuvius.

  17. Engineering Applications Using Probabilistic Aftershock Hazard Analyses: Aftershock Hazard Map and Load Combination of Aftershocks and Tsunamis

    Directory of Open Access Journals (Sweden)

    Byunghyun Choi

    2017-12-01

    Full Text Available After the Tohoku earthquake in 2011, we observed that aftershocks tended to occur over a wide region following such a large earthquake. These aftershocks resulted in secondary damage and delayed rescue and recovery activities. In addition, it has been reported that there are regions where the intensity of the shaking owing to aftershocks was much stronger than that associated with the main shock. Therefore, it is necessary to consider the seismic risk associated with aftershocks. We used aftershock data obtained from the Tohoku earthquake and various other historically large earthquakes, and investigated the spatial and temporal distribution of the aftershocks using the Gutenberg–Richter law and the modified Omori law. On that basis, we previously proposed a probabilistic aftershock occurrence model that is expected to be useful for developing plans for recovery activities after future large earthquakes. In this study, the probabilistic aftershock hazard analysis is used to create aftershock hazard maps. We propose a hazard map focusing on the probability of aftershocks on the scale of the main shock, for use in planning recovery activities. Following the lessons learned from the 2011 Tohoku earthquake, we also focus on the simultaneous occurrence of tsunamis and aftershocks just after a great subduction earthquake. The probabilistic aftershock hazard analysis is used to derive load combination equations for load and resistance factor design, intended to simultaneously consider tsunamis and aftershocks in tsunami-resistant designs of tsunami evacuation buildings.
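
    A minimal sketch of the modified Omori law used for the temporal distribution, with illustrative parameter values:

```python
import numpy as np

def omori_rate(t, K=100.0, c=0.1, p=1.1):
    """Modified Omori law: aftershock rate n(t) = K / (t + c)**p,
    with t in days after the main shock (K, c, p illustrative)."""
    return K / (t + c) ** p

def expected_aftershocks(t1, t2, K=100.0, c=0.1, p=1.1):
    """Expected number of aftershocks in [t1, t2], the integral of n(t) (p != 1)."""
    return K * ((t2 + c) ** (1 - p) - (t1 + c) ** (1 - p)) / (1 - p)
```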

  18. Probabilistic liver atlas construction.

    Science.gov (United States)

    Dura, Esther; Domingo, Juan; Ayala, Guillermo; Marti-Bonmati, Luis; Goceri, E

    2017-01-13

    Anatomical atlases are 3D volumes or shapes representing an organ or structure of the human body. They contain either the prototypical shape of the object of interest together with other shapes representing its statistical variations (statistical atlas) or a probability map of belonging to the object (probabilistic atlas). Probabilistic atlases are mostly built with simple estimations involving only the data at each spatial location. A new method for probabilistic atlas construction that uses a generalized linear model is proposed. This method aims to improve the estimation of the probability of each voxel being covered by the liver. Furthermore, all methods to build an atlas involve previous coregistration of the available sample of shapes. The influence of the geometrical transformation adopted for registration on the quality of the final atlas has not been sufficiently investigated. The ability of an atlas to adapt to a new case is one of the most important quality criteria that should be taken into account. The presented experiments show that some methods for atlas construction are severely affected by the previous coregistration step. We show the good performance of the new approach. Furthermore, the results suggest that extremely flexible registration methods are not always beneficial, since they can reduce the variability of the atlas and hence its ability to give sensible probability values when used as an aid in the segmentation of new cases.
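
    One way to realize a GLM-based atlas is a logistic regression over (normalized) voxel coordinates, sketched below; this is an assumed stand-in in the spirit of the paper's generalized linear model, not its actual formulation:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def glm_atlas(masks):
    """Estimate a liver-coverage probability map from co-registered binary
    masks via a logistic GLM with quadratic spatial terms, instead of the
    simple per-voxel frequency."""
    shape = masks[0].shape
    grids = np.meshgrid(*[np.linspace(0, 1, s) for s in shape], indexing="ij")
    coords = np.stack(grids).reshape(len(shape), -1).T
    X = np.hstack([coords, coords ** 2])
    Xr = np.tile(X, (len(masks), 1))                # one row per voxel per mask
    y = np.concatenate([m.ravel().astype(int) for m in masks])
    glm = LogisticRegression(max_iter=500).fit(Xr, y)
    return glm.predict_proba(X)[:, 1].reshape(shape)
```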

  19. Improving anatomical mapping of complexly deformed anatomy for external beam radiotherapy and brachytherapy dose accumulation in cervical cancer

    International Nuclear Information System (INIS)

    Vásquez Osorio, Eliana M.; Kolkman-Deurloo, Inger-Karine K.; Schuring-Pereira, Monica; Zolnay, András; Heijmen, Ben J. M.; Hoogeman, Mischa S.

    2015-01-01

    Purpose: In the treatment of cervical cancer, large anatomical deformations, caused by, e.g., tumor shrinkage, bladder and rectum filling changes, organ sliding, and the presence of the brachytherapy (BT) applicator, prohibit the accumulation of external beam radiotherapy (EBRT) and BT dose distributions. This work proposes a structure-wise registration with vector field integration (SW+VF) to map the largely deformed anatomies between EBRT and BT, paving the way for 3D dose accumulation between EBRT and BT. Methods: T2w-MRIs acquired before EBRT and as a part of the MRI-guided BT procedure for 12 cervical cancer patients, along with the manual delineations of the bladder, cervix-uterus, and rectum-sigmoid, were used for this study. A rigid transformation was used to align the bony anatomy in the MRIs. The proposed SW+VF method starts by automatically segmenting features in the area surrounding the delineated organs. Then, each organ and feature pair is registered independently using a feature-based nonrigid registration algorithm developed in-house. Additionally, a background transformation is calculated to account for areas far from all organs and features. In order to obtain one transformation that can be used for dose accumulation, the organ-based, feature-based, and the background transformations are combined into one vector field using a weighted sum, where the contribution of each transformation can be directly controlled by its extent of influence (scope size). The optimal scope sizes for organ-based and feature-based transformations were found by an exhaustive analysis. The anatomical correctness of the mapping was independently validated by measuring the residual distances after transformation for delineated structures inside the cervix-uterus (inner anatomical correctness), and for anatomical landmarks outside the organs in the surrounding region (outer anatomical correctness). The results of the proposed method were compared with the results of the

  20. Improving anatomical mapping of complexly deformed anatomy for external beam radiotherapy and brachytherapy dose accumulation in cervical cancer

    Energy Technology Data Exchange (ETDEWEB)

    Vásquez Osorio, Eliana M., E-mail: e.vasquezosorio@erasmusmc.nl; Kolkman-Deurloo, Inger-Karine K.; Schuring-Pereira, Monica; Zolnay, András; Heijmen, Ben J. M.; Hoogeman, Mischa S. [Department of Radiation Oncology, Erasmus MC Cancer Institute, Rotterdam 3075 (Netherlands)

    2015-01-15

    Purpose: In the treatment of cervical cancer, large anatomical deformations, caused by, e.g., tumor shrinkage, bladder and rectum filling changes, organ sliding, and the presence of the brachytherapy (BT) applicator, prohibit the accumulation of external beam radiotherapy (EBRT) and BT dose distributions. This work proposes a structure-wise registration with vector field integration (SW+VF) to map the largely deformed anatomies between EBRT and BT, paving the way for 3D dose accumulation between EBRT and BT. Methods: T2w-MRIs acquired before EBRT and as a part of the MRI-guided BT procedure for 12 cervical cancer patients, along with the manual delineations of the bladder, cervix-uterus, and rectum-sigmoid, were used for this study. A rigid transformation was used to align the bony anatomy in the MRIs. The proposed SW+VF method starts by automatically segmenting features in the area surrounding the delineated organs. Then, each organ and feature pair is registered independently using a feature-based nonrigid registration algorithm developed in-house. Additionally, a background transformation is calculated to account for areas far from all organs and features. In order to obtain one transformation that can be used for dose accumulation, the organ-based, feature-based, and the background transformations are combined into one vector field using a weighted sum, where the contribution of each transformation can be directly controlled by its extent of influence (scope size). The optimal scope sizes for organ-based and feature-based transformations were found by an exhaustive analysis. The anatomical correctness of the mapping was independently validated by measuring the residual distances after transformation for delineated structures inside the cervix-uterus (inner anatomical correctness), and for anatomical landmarks outside the organs in the surrounding region (outer anatomical correctness). The results of the proposed method were compared with the results of the
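
    A sketch of the weighted vector-field combination, assuming Gaussian weights of each voxel's distance to the corresponding structure with the scope size as the Gaussian width; the paper's exact weighting scheme may differ:

```python
import numpy as np

def combine_fields(fields, dists, scope_sizes, background):
    """Combine structure-wise deformation fields into one field. fields is a
    list of (X, Y, Z, 3) arrays, dists a matching list of (X, Y, Z) distance
    maps to each structure, background an (X, Y, Z, 3) fallback transform."""
    w = [np.exp(-0.5 * (d / s) ** 2) for d, s in zip(dists, scope_sizes)]
    w_bg = np.clip(1.0 - sum(w), 0.0, 1.0)     # background takes over far away
    total = sum(w) + w_bg
    out = w_bg[..., None] * background
    for wi, fi in zip(w, fields):
        out += wi[..., None] * fi
    return out / total[..., None]
```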

  1. Probabilistic Mapping of Storm-induced Coastal Inundation for Climate Change Adaptation

    Science.gov (United States)

    Li, N.; Yamazaki, Y.; Roeber, V.; Cheung, K. F.; Chock, G.

    2016-02-01

    Global warming is posing an imminent threat to coastal communities worldwide. Under the IPCC RCP8.5 scenario, we utilize hurricane events downscaled from a CMIP5 global climate model using the stochastic-deterministic method of Emanuel (2013, Proc. Nat. Acad. Sci.) in a pilot study to develop an inundation map with projected sea-level rise for the urban Honolulu coast. The downscaling is performed for a 20-year period from 2081 to 2100 to capture ENSO variability, which strongly influences hurricane activity in the Pacific. A total of 50 simulations provide a quasi-stationary dataset of 1000 years for probabilistic analysis of the flood hazards toward the end of the century. We utilize the meta-model Hakou, which is based on precomputed hurricane scenarios using ADCIRC, SWAN, and a 1D Boussinesq model (Kennedy et al., 2012, Ocean Modelling), to estimate the annual maximum inundation along the project coastline at the present sea level. Screening of the preliminary results identifies the three most severe events for detailed inundation modeling using the package of Li et al. (2014, Ocean Modelling) at the projected sea level. For each event, the third-generation spectral model WAVEWATCH III of Tolman (2008, Ocean Modelling) provides the hurricane waves and the circulation model NEOWAVE of Yamazaki et al. (2009, 2011, Int. J. Num. Meth. Fluids) computes the surge using a system of telescopic nested grids from the open ocean to the project coastline. The output defines the boundary conditions and initial still-water elevation for computation of phase-resolving surf-zone and inundation processes using the 2D Boussinesq model of Roeber and Cheung (2012, Coastal Engineering). Each computed inundation event corresponds to an annual maximum and, with 1000 years of data, has an occurrence probability of 0.1% in a given year. Barring the tail of the distribution, aggregation of the three computed events allows delineation of the inundation zone with annual exceedance probability
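
    With a quasi-stationary set of 1000 simulated years, the annual exceedance probability of an inundation level is a simple empirical frequency; a minimal sketch:

```python
import numpy as np

def annual_exceedance_prob(annual_maxima, level):
    """Fraction of simulated years whose maximum inundation exceeds level;
    e.g., one event in 1000 years gives 0.1% per year."""
    return float((np.asarray(annual_maxima) > level).mean())
```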

  2. First Volcanological-Probabilistic Pyroclastic Density Current and Fallout Hazard Map for Campi Flegrei and Somma Vesuvius Volcanoes.

    Science.gov (United States)

    Mastrolorenzo, G.; Pappalardo, L.; Troise, C.; Panizza, A.; de Natale, G.

    2005-05-01

    An integrated volcanological-probabilistic approach has been used to simulate pyroclastic density currents and fallout and to produce hazard maps for the Campi Flegrei and Somma Vesuvius areas. On the basis of analyses of all types of pyroclastic flows, surges, secondary pyroclastic density currents, and fallout events that occurred in the volcanological history of the two volcanic areas, and of the evaluated probability of each type of event, matrices of input parameters for numerical simulation have been constructed. The multi-dimensional input matrices include the main parameters controlling pyroclast transport, dispersion, and deposition, as well as the set of possible eruptive vents used in the simulation program. The probabilistic hazard maps provide, for each point of the Campanian area, the yearly probability of being affected by a given event of a given intensity and the resulting damage. Probabilities of a few events per thousand years are typical of most areas within ca. 10 km of the volcanoes, including Naples. The results provide constraints for emergency planning in the Neapolitan area.

  3. Anatomical characterization of Cre driver mice for neural circuit mapping and manipulation

    Science.gov (United States)

    Harris, Julie A.; Hirokawa, Karla E.; Sorensen, Staci A.; Gu, Hong; Mills, Maya; Ng, Lydia L.; Bohn, Phillip; Mortrud, Marty; Ouellette, Benjamin; Kidney, Jolene; Smith, Kimberly A.; Dang, Chinh; Sunkin, Susan; Bernard, Amy; Oh, Seung Wook; Madisen, Linda; Zeng, Hongkui

    2014-01-01

    Significant advances in circuit-level analyses of the brain require tools that allow for labeling, modulation of gene expression, and monitoring and manipulation of cellular activity in specific cell types and/or anatomical regions. Large-scale projects and individual laboratories have produced hundreds of gene-specific promoter-driven Cre mouse lines invaluable for enabling genetic access to subpopulations of cells in the brain. However, the potential utility of each line may not be fully realized without systematic whole brain characterization of transgene expression patterns. We established a high-throughput in situ hybridization (ISH), imaging and data processing pipeline to describe whole brain gene expression patterns in Cre driver mice. Currently, anatomical data from over 100 Cre driver lines are publicly available via the Allen Institute's Transgenic Characterization database, which can be used to assist researchers in choosing the appropriate Cre drivers for functional, molecular, or connectional studies of different regions and/or cell types in the brain. PMID:25071457

  4. Anatomical characterization of cre driver mice for neural circuit mapping and manipulation

    Directory of Open Access Journals (Sweden)

    Julie Ann Harris

    2014-07-01

    Full Text Available Significant advances in circuit-level analyses of the brain require tools that allow for labeling, modulation of gene expression, and monitoring and manipulation of cellular activity in specific cell types and/or anatomical regions. Large-scale projects and individual laboratories have produced hundreds of gene-specific promoter-driven Cre mouse lines invaluable for enabling genetic access to subpopulations of cells in the brain. However, the potential utility of each line may not be fully realized without systematic whole brain characterization of transgene expression patterns. We established a high-throughput in situ hybridization, imaging and data processing pipeline to describe whole brain gene expression patterns in Cre driver mice. Currently, anatomical data from over 100 Cre driver lines are publicly available via the Allen Institute’s Transgenic Characterization database, which can be used to assist researchers in choosing the appropriate Cre drivers for functional, molecular, or connectional studies of different regions and/or cell types in the brain.

  5. Anatomical specificity of vascular endothelial growth factor expression in glioblastomas: a voxel-based mapping analysis

    Energy Technology Data Exchange (ETDEWEB)

    Fan, Xing [Capital Medical University, Department of Neurosurgery, Beijing Tiantan Hospital, Beijing (China); Wang, Yinyan [Capital Medical University, Department of Neurosurgery, Beijing Tiantan Hospital, Beijing (China); Capital Medical University, Department of Neuropathology, Beijing Neurosurgical Institute, Beijing (China); Wang, Kai; Ma, Jun; Li, Shaowu [Capital Medical University, Department of Neuroradiology, Beijing Tiantan Hospital, Beijing (China); Liu, Shuai [Chinese Academy of Medical Sciences and Peking Union Medical College, Departments of Neurosurgery, Peking Union Medical College Hospital, Beijing (China); Liu, Yong [Chinese Academy of Sciences, Brainnetome Center, Institute of Automation, Beijing (China); Jiang, Tao [Capital Medical University, Department of Neurosurgery, Beijing Tiantan Hospital, Beijing (China); Beijing Academy of Critical Illness in Brain, Department of Clinical Oncology, Beijing (China)

    2016-01-15

    The expression of vascular endothelial growth factor (VEGF) is a common genetic alteration in malignant gliomas and contributes to the angiogenesis of tumors. This study aimed to investigate the anatomical specificity of VEGF expression levels in glioblastomas using voxel-based neuroimaging analysis. Clinical information, MR scans, and immunohistochemistry stains of 209 patients with glioblastomas were reviewed. All tumor lesions were segmented manually and subsequently registered to standard brain space. Voxel-based regression analysis was performed to correlate the brain regions of tumor involvement with the level of VEGF expression. Brain regions identified as significantly associated with high or low VEGF expression were preserved following permutation correction. High VEGF expression was detected in 123 (58.9 %) of the 209 patients. Voxel-based statistical analysis demonstrated that high VEGF expression was more likely in tumors located in the left frontal lobe and the right caudate, whereas low VEGF expression was more likely in tumors that occurred in the posterior region of the right lateral ventricle. Voxel-based neuroimaging analysis revealed the anatomic specificity of VEGF expression in glioblastoma, which may further our understanding of genetic heterogeneity during tumor origination. This finding provides primary theoretical support for potential future application of customized antiangiogenic therapy. (orig.)

  6. Anatomical specificity of vascular endothelial growth factor expression in glioblastomas: a voxel-based mapping analysis

    International Nuclear Information System (INIS)

    Fan, Xing; Wang, Yinyan; Wang, Kai; Ma, Jun; Li, Shaowu; Liu, Shuai; Liu, Yong; Jiang, Tao

    2016-01-01

    The expression of vascular endothelial growth factor (VEGF) is a common genetic alteration in malignant gliomas and contributes to the angiogenesis of tumors. This study aimed to investigate the anatomical specificity of VEGF expression levels in glioblastomas using voxel-based neuroimaging analysis. Clinical information, MR scans, and immunohistochemistry stains of 209 patients with glioblastomas were reviewed. All tumor lesions were segmented manually and subsequently registered to standard brain space. Voxel-based regression analysis was performed to correlate the brain regions of tumor involvement with the level of VEGF expression. Brain regions identified as significantly associated with high or low VEGF expression were preserved following permutation correction. High VEGF expression was detected in 123 (58.9 %) of the 209 patients. Voxel-based statistical analysis demonstrated that high VEGF expression was more likely in tumors located in the left frontal lobe and the right caudate, whereas low VEGF expression was more likely in tumors that occurred in the posterior region of the right lateral ventricle. Voxel-based neuroimaging analysis revealed the anatomic specificity of VEGF expression in glioblastoma, which may further our understanding of genetic heterogeneity during tumor origination. This finding provides primary theoretical support for potential future application of customized antiangiogenic therapy. (orig.)
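
    The voxel-based analysis described in the two records above pairs a per-voxel association statistic with a max-statistic permutation correction. The following is a hedged sketch of that scheme, not the authors' code; the synthetic data, the simple rate-difference statistic, and the permutation count are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n_pat, n_vox = 209, 5000                    # flattened voxels of a registered grid
lesions = rng.random((n_pat, n_vox)) < 0.1  # True where a tumor covers a voxel
vegf_high = (rng.random(n_pat) < 0.589).astype(float)

def voxel_stat(labels):
    """Difference in VEGF-high rate between patients whose tumors do vs.
    do not involve each voxel (a simple stand-in for the regression)."""
    hit = lesions.astype(float)
    n1 = hit.sum(0)
    n0 = n_pat - n1
    m1 = (hit * labels[:, None]).sum(0) / np.maximum(n1, 1)
    m0 = ((1 - hit) * labels[:, None]).sum(0) / np.maximum(n0, 1)
    return m1 - m0

obs = voxel_stat(vegf_high)

# Max-statistic permutation correction across the voxel-wise family of tests
null_max = np.array([np.abs(voxel_stat(rng.permutation(vegf_high))).max()
                     for _ in range(200)])
significant = np.abs(obs) > np.quantile(null_max, 0.95)
```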

  7. Measuring the Uncertainty of Probabilistic Maps Representing Human Motion for Indoor Navigation

    Directory of Open Access Journals (Sweden)

    Susanna Kaiser

    2016-01-01

    Full Text Available Indoor navigation and mapping have recently become an important field of interest for researchers because global navigation satellite systems (GNSS) are very often unavailable inside buildings. FootSLAM, a SLAM (Simultaneous Localization and Mapping) algorithm for pedestrians based on step measurements, addresses the indoor mapping and positioning problem and can provide accurate positioning in many structured indoor environments. In this paper, we investigate how to compare FootSLAM maps via two entropy metrics. Since collaborative FootSLAM requires the alignment and combination of several individual FootSLAM maps, we also investigate measures that help to align maps that partially overlap. We distinguish between the map entropy conditioned on the sequence of the pedestrian’s poses, which is a measure of the uncertainty of the estimated map, and the entropy rate of the pedestrian’s steps conditioned on the history of poses and conditioned on the estimated map. Because FootSLAM maps are built on a hexagon grid, the entropy and relative entropy metrics are derived for the special case of hexagonal transition maps. The entropy gives us a new insight into the performance of FootSLAM’s map estimation process.
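
    In this hexagonal setting, the entropy rate of the pedestrian's steps conditioned on the map is H = -Σ_h P(h) Σ_e P(e|h) log₂ P(e|h) over the six edge transitions of each hexagon. A minimal sketch, with random transition and occupancy distributions standing in for an estimated FootSLAM map:

```python
import numpy as np

# Hypothetical map: for each hexagon h, a distribution over its six
# outgoing edge transitions P(e|h), plus an occupancy distribution P(h)
# estimated from the pedestrian's pose history.
rng = np.random.default_rng(0)
P_edge = rng.dirichlet(np.ones(6), size=500)   # 500 hexagons x 6 edges
P_hex = rng.dirichlet(np.ones(500))            # occupancy over hexagons

edge_H = -(P_edge * np.log2(P_edge)).sum(axis=1)  # per-hexagon step entropy
entropy_rate = float(P_hex @ edge_H)
# bits per step: 0 = fully predictable walk, log2(6) ≈ 2.585 = uninformative map
```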

  8. A stochastic approach for automatic registration and fusion of left atrial electroanatomic maps with 3D CT anatomical images

    International Nuclear Information System (INIS)

    Cristoforetti, Alessandro; Mase, Michela; Faes, Luca; Centonze, Maurizio; Greco, Maurizio Del; Antolini, Renzo; Nollo, Giandomenico; Ravelli, Flavia

    2007-01-01

    The integration of electroanatomic maps with highly resolved computed tomography cardiac images plays an important role in the successful planning of ablation procedures for arrhythmias. In this paper, we present and validate a fully automated strategy for the registration and fusion of sparse, atrial endocardial electroanatomic maps (CARTO maps) with detailed left atrial (LA) anatomical reconstructions segmented from a pre-procedural MDCT scan. Registration is accomplished by a parameterized geometric transformation of the CARTO points and by a stochastic search of the best parameter set which minimizes the misalignment between transformed CARTO points and the LA surface. The subsequent fusion of electrophysiological information on the registered CT atrium is obtained through radial basis function interpolation. The algorithm is validated by simulation and by real data from 14 patients referred to CT imaging prior to the ablation procedure. Results are presented, which show the validity of the algorithmic scheme as well as the accuracy and reproducibility of the integration process. The obtained results encourage the application of the integration method in post-intervention ablation assessment and basic AF research and suggest the development for real-time applications in catheter guiding during ablation intervention.
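
    A simplified sketch of those two steps follows: a translation-only random search stands in for the paper's full parameterized transformation and stochastic optimizer, and SciPy's RBF interpolator implements the fusion step. All names and the surface representation (a point sampling of the segmented LA) are illustrative.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.interpolate import RBFInterpolator

def register(carto_xyz, la_surface_xyz, iters=2000, step=1.0, seed=0):
    """Random-search registration: find the translation t that minimizes the
    mean distance from transformed CARTO points to the LA surface points."""
    tree = cKDTree(la_surface_xyz)
    rng = np.random.default_rng(seed)
    best_t = np.zeros(3)
    best_cost = tree.query(carto_xyz)[0].mean()
    for _ in range(iters):
        t = best_t + rng.normal(scale=step, size=3)
        cost = tree.query(carto_xyz + t)[0].mean()
        if cost < best_cost:
            best_t, best_cost = t, cost
    return best_t, best_cost

def fuse(carto_xyz, voltages, la_surface_xyz, t):
    """Project electrogram values onto the registered CT surface by
    radial basis function interpolation."""
    rbf = RBFInterpolator(carto_xyz + t, voltages, kernel="thin_plate_spline")
    return rbf(la_surface_xyz)
```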

  9. Computation of probabilistic hazard maps and source parameter estimation for volcanic ash transport and dispersion

    Energy Technology Data Exchange (ETDEWEB)

    Madankan, R. [Department of Mechanical and Aerospace Engineering, University at Buffalo (United States); Pouget, S. [Department of Geology, University at Buffalo (United States); Singla, P., E-mail: psingla@buffalo.edu [Department of Mechanical and Aerospace Engineering, University at Buffalo (United States); Bursik, M. [Department of Geology, University at Buffalo (United States); Dehn, J. [Geophysical Institute, University of Alaska, Fairbanks (United States); Jones, M. [Center for Computational Research, University at Buffalo (United States); Patra, A. [Department of Mechanical and Aerospace Engineering, University at Buffalo (United States); Pavolonis, M. [NOAA-NESDIS, Center for Satellite Applications and Research (United States); Pitman, E.B. [Department of Mathematics, University at Buffalo (United States); Singh, T. [Department of Mechanical and Aerospace Engineering, University at Buffalo (United States); Webley, P. [Geophysical Institute, University of Alaska, Fairbanks (United States)

    2014-08-15

    Volcanic ash advisory centers are charged with forecasting the movement of volcanic ash plumes, for aviation, health and safety preparation. Deterministic mathematical equations model the advection and dispersion of these plumes. However, initial plume conditions – height, profile of particle location, volcanic vent parameters – are known only approximately at best, and other features of the governing system such as the wind field are stochastic. These uncertainties make forecasting plume motion difficult. As a result, ash advisories based on a deterministic approach tend to be conservative and often over- or underestimate the extent of a plume. This paper presents an end-to-end framework for probabilistic ash plume forecasting. This framework uses an ensemble of solutions, guided by the Conjugate Unscented Transform (CUT) method for evaluating expectation integrals. This ensemble is used to construct a polynomial chaos expansion that can be sampled cheaply, to provide a probabilistic model forecast. The CUT method is then combined with a minimum variance condition, to provide a full posterior pdf of the uncertain source parameters, based on observed satellite imagery. The April 2010 eruption of the Eyjafjallajökull volcano in Iceland is employed as a test example. The puff advection/dispersion model is used to hindcast the motion of the ash plume through time, concentrating on the period 14–16 April 2010. Variability in the height and particle loading of that eruption is introduced through a volcano column model called bent. Output uncertainty due to the assumed uncertain input parameter probability distributions, and a probabilistic spatial-temporal estimate of ash presence, are computed.
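
    The probabilistic forecast step can be illustrated with plain Monte Carlo sampling in place of the paper's CUT quadrature and polynomial chaos surrogate. In the hedged sketch below, the toy dispersion model and the priors on column height and particle loading are invented for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)
n_ens, ny, nx = 500, 60, 80

def toy_puff(height_km, load_frac):
    """Stand-in for the puff advection/dispersion model: returns a boolean
    ash-presence grid for one source-parameter sample (purely illustrative)."""
    out = np.zeros((ny, nx), dtype=bool)
    out[: min(ny, int(height_km * 3)), : int(nx * load_frac)] = True
    return out

# Sample the uncertain source parameters from assumed priors.
heights = rng.uniform(2.0, 18.0, n_ens)   # plume height (km)
loads = rng.uniform(0.1, 0.9, n_ens)      # particle loading fraction

presence = np.zeros((ny, nx))
for h, m in zip(heights, loads):
    presence += toy_puff(h, m)
prob_ash = presence / n_ens   # probabilistic spatial estimate of ash presence
```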

  10. Mapping grey matter reductions in schizophrenia: an anatomical likelihood estimation analysis of voxel-based morphometry studies.

    Science.gov (United States)

    Fornito, A; Yücel, M; Patti, J; Wood, S J; Pantelis, C

    2009-03-01

    Voxel-based morphometry (VBM) is a popular tool for mapping neuroanatomical changes in schizophrenia patients. Several recent meta-analyses have identified the brain regions in which patients most consistently show grey matter reductions, although they have not examined whether such changes reflect differences in grey matter concentration (GMC) or grey matter volume (GMV). These measures assess different aspects of grey matter integrity, and may therefore reflect different pathological processes. In this study, we used the Anatomical Likelihood Estimation procedure to analyse significant differences reported in 37 VBM studies of schizophrenia patients, incorporating data from 1646 patients and 1690 controls, and compared the findings of studies using either GMC or GMV to index grey matter differences. Analysis of all studies combined indicated that grey matter reductions in a network of frontal, temporal, thalamic and striatal regions are among the most frequently reported in the literature. GMC reductions were generally larger and more consistent than GMV reductions, and were more frequent in the insula, medial prefrontal, medial temporal and striatal regions. GMV reductions were more frequent in dorso-medial frontal cortex, and lateral and orbital frontal areas. These findings support the primacy of frontal, limbic, and subcortical dysfunction in the pathophysiology of schizophrenia, and suggest that the grey matter changes observed with MRI may not necessarily result from a unitary pathological process.
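
    At its core, the Anatomical Likelihood Estimation statistic treats each reported focus as a Gaussian probability blob and combines studies as a probabilistic union. A minimal sketch; the grid size, foci coordinates, and FWHM are hypothetical, and no weighting by study sample size is included.

```python
import numpy as np

def ma_map(shape, foci, fwhm=10.0):
    """Modeled activation (MA) map for one study: union of Gaussian blobs
    centred on its reported foci (coordinates in grid units)."""
    sigma = fwhm / 2.3548
    grid = np.indices(shape).astype(float)
    keep = np.ones(shape)
    for c in foci:
        d2 = sum((g - ci) ** 2 for g, ci in zip(grid, c))
        keep *= 1.0 - np.exp(-d2 / (2.0 * sigma**2))
    return 1.0 - keep

shape = (40, 48, 40)                      # coarse standard-space grid
studies = [[(20, 24, 20), (10, 30, 15)],  # per-study foci (hypothetical)
           [(21, 23, 19)],
           [(9, 31, 16), (30, 10, 25)]]

# ALE value per voxel: probability that at least one study's modeled
# activation covers the voxel, i.e. the union of the MA maps.
ale = 1.0 - np.prod([1.0 - ma_map(shape, foci) for foci in studies], axis=0)
```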

  11. On the accuracy and reproducibility of a novel probabilistic atlas-based generation for calculation of head attenuation maps on integrated PET/MR scanners.

    Science.gov (United States)

    Chen, Kevin T; Izquierdo-Garcia, David; Poynton, Clare B; Chonde, Daniel B; Catana, Ciprian

    2017-03-01

    To propose an MR-based method for generating continuous-valued head attenuation maps and to assess its accuracy and reproducibility. Demonstrating that novel MR-based photon attenuation correction methods are both accurate and reproducible is essential prior to using them routinely in research and clinical studies on integrated PET/MR scanners. Continuous-valued linear attenuation coefficient maps ("μ-maps") were generated by combining atlases that provided the prior probability of voxel positions belonging to a certain tissue class (air, soft tissue, or bone) and an MR intensity-based likelihood classifier to produce posterior probability maps of tissue classes. These probabilities were used as weights to generate the μ-maps. The accuracy of this probabilistic atlas-based continuous-valued μ-map ("PAC-map") generation method was assessed by calculating the voxel-wise absolute relative change (RC) between the MR-based and scaled CT-based attenuation-corrected PET images. To assess reproducibility, we performed pair-wise comparisons of the RC values obtained from the PET images reconstructed using the μ-maps generated from the data acquired at three time points. The proposed method produced continuous-valued μ-maps that qualitatively reflected the variable anatomy in patients with brain tumor and agreed well with the scaled CT-based μ-maps. The absolute RC comparing the resulting PET volumes was 1.76 ± 2.33 %, quantitatively demonstrating that the method is accurate. Additionally, we also showed that the method is highly reproducible, the mean RC value for the PET images reconstructed using the μ-maps obtained at the three visits being 0.65 ± 0.95 %. Accurate and highly reproducible continuous-valued head μ-maps can be generated from MR data using a probabilistic atlas-based approach.
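
    The μ-map construction described here is, per voxel, a posterior-probability-weighted sum of class attenuation coefficients, and the accuracy metric is a voxel-wise absolute relative change. A minimal numpy sketch with made-up posterior maps; the 511 keV linear attenuation coefficients are approximate textbook values, not the study's calibration.

```python
import numpy as np

# Hypothetical per-voxel posterior probabilities of tissue class
# (air, soft tissue, bone), obtained by combining the atlas prior with
# the MR intensity likelihood; last axis sums to 1.
post = np.random.dirichlet(np.ones(3), size=(64, 64, 64))

lac = np.array([0.0, 0.0975, 0.151])  # approximate 511 keV LACs (1/cm)
mu_map = post @ lac                   # continuous-valued, probability-weighted

def abs_rc(pet_mr, pet_ct, eps=1e-6):
    """Voxel-wise absolute relative change (%) between MR- and CT-based
    attenuation-corrected PET volumes."""
    return 100.0 * np.abs(pet_mr - pet_ct) / np.maximum(pet_ct, eps)
```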

  12. On the accuracy and reproducibility of a novel probabilistic atlas-based generation for calculation of head attenuation maps on integrated PET/MR scanners

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Kevin T. [Massachusetts General Hospital and Harvard Medical School, Athinoula A. Martinos Center for Biomedical Imaging, Department of Radiology, Charlestown, MA (United States); Massachusetts Institute of Technology, Division of Health Sciences and Technology, Cambridge, MA (United States); Izquierdo-Garcia, David; Catana, Ciprian [Massachusetts General Hospital and Harvard Medical School, Athinoula A. Martinos Center for Biomedical Imaging, Department of Radiology, Charlestown, MA (United States); Poynton, Clare B. [Massachusetts General Hospital and Harvard Medical School, Athinoula A. Martinos Center for Biomedical Imaging, Department of Radiology, Charlestown, MA (United States); Massachusetts General Hospital, Department of Psychiatry, Boston, MA (United States); University of California, San Francisco, Department of Radiology and Biomedical Imaging, San Francisco, CA (United States); Chonde, Daniel B. [Massachusetts General Hospital and Harvard Medical School, Athinoula A. Martinos Center for Biomedical Imaging, Department of Radiology, Charlestown, MA (United States); Harvard University, Program in Biophysics, Cambridge, MA (United States)

    2017-03-15

    To propose an MR-based method for generating continuous-valued head attenuation maps and to assess its accuracy and reproducibility. Demonstrating that novel MR-based photon attenuation correction methods are both accurate and reproducible is essential prior to using them routinely in research and clinical studies on integrated PET/MR scanners. Continuous-valued linear attenuation coefficient maps ("μ-maps") were generated by combining atlases that provided the prior probability of voxel positions belonging to a certain tissue class (air, soft tissue, or bone) and an MR intensity-based likelihood classifier to produce posterior probability maps of tissue classes. These probabilities were used as weights to generate the μ-maps. The accuracy of this probabilistic atlas-based continuous-valued μ-map ("PAC-map") generation method was assessed by calculating the voxel-wise absolute relative change (RC) between the MR-based and scaled CT-based attenuation-corrected PET images. To assess reproducibility, we performed pair-wise comparisons of the RC values obtained from the PET images reconstructed using the μ-maps generated from the data acquired at three time points. The proposed method produced continuous-valued μ-maps that qualitatively reflected the variable anatomy in patients with brain tumor and agreed well with the scaled CT-based μ-maps. The absolute RC comparing the resulting PET volumes was 1.76 ± 2.33 %, quantitatively demonstrating that the method is accurate. Additionally, we also showed that the method is highly reproducible, the mean RC value for the PET images reconstructed using the μ-maps obtained at the three visits being 0.65 ± 0.95 %. Accurate and highly reproducible continuous-valued head μ-maps can be generated from MR data using a probabilistic atlas-based approach. (orig.)

  13. Probabilistic storm surge inundation maps for Metro Manila based on Philippine public storm warning signals

    Science.gov (United States)

    Tablazon, J.; Caro, C. V.; Lagmay, A. M. F.; Briones, J. B. L.; Dasallas, L.; Lapidez, J. P.; Santiago, J.; Suarez, J. K.; Ladiero, C.; Gonzalo, L. A.; Mungcal, M. T. F.; Malano, V.

    2015-03-01

    A storm surge is the sudden rise of sea water over the astronomical tides, generated by an approaching storm. This event poses a major threat to the Philippine coastal areas, as manifested by Typhoon Haiyan on 8 November 2013. This hydro-meteorological hazard is one of the main reasons for the high number of casualties due to the typhoon, with 6300 deaths. It became evident that the need to develop a storm surge inundation map is of utmost importance. To develop these maps, the Nationwide Operational Assessment of Hazards under the Department of Science and Technology (DOST-Project NOAH) simulated historical tropical cyclones that entered the Philippine Area of Responsibility. The Japan Meteorological Agency storm surge model was used to simulate storm surge heights. The frequency distribution of the maximum storm surge heights was calculated using simulation results of tropical cyclones under a specific public storm warning signal (PSWS) that passed through a particular coastal area. This determines the storm surge height corresponding to a given probability of occurrence. The storm surge heights from the model were added to the maximum astronomical tide data from WXTide software. The team then created maps of inundation for a specific PSWS using the probability of exceedance derived from the frequency distribution. Buildings and other structures were assigned a probability of exceedance depending on their occupancy category, i.e., 1% probability of exceedance for critical facilities, 10% probability of exceedance for special occupancy structures, and 25% for standard occupancy and miscellaneous structures. The maps produced show the storm-surge-vulnerable areas in Metro Manila, illustrated by the flood depth of up to 4 m and extent of up to 6.5 km from the coastline. This information can help local government units in developing early warning systems, disaster preparedness and mitigation plans, vulnerability assessments, risk-sensitive land use plans, shoreline
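
    The exceedance step reduces to reading design heights off the empirical frequency distribution of simulated maximum surge heights for a given PSWS. A minimal sketch using the occupancy-category exceedance levels named above; the simulated heights are illustrative.

```python
import numpy as np

# Simulated maximum surge heights (m) for cyclones under one PSWS (illustrative)
heights = np.array([0.4, 0.7, 1.1, 1.3, 1.8, 2.2, 2.9, 3.6])

def height_at_exceedance(h, p_exceed):
    """Surge height whose empirical probability of exceedance is p_exceed."""
    return float(np.quantile(h, 1.0 - p_exceed))

for label, p in [("critical facilities", 0.01),
                 ("special occupancy", 0.10),
                 ("standard occupancy", 0.25)]:
    print(f"{label}: {height_at_exceedance(heights, p):.2f} m")
```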

  14. Probabilistic change mapping from airborne LiDAR for post-disaster damage assessment

    Science.gov (United States)

    Jalobeanu, A.; Runyon, S. C.; Kruse, F. A.

    2013-12-01

    When both pre- and post-event LiDAR point clouds are available, change detection can be performed to identify areas that were most affected by a disaster event, and to obtain a map of quantitative changes in terms of height differences. In the case of earthquakes in built-up areas, for instance, first responders can use a LiDAR change map to help prioritize search and recovery efforts. The main challenge consists of producing reliable change maps, robust to collection conditions, free of processing artifacts (due for instance to triangulation or gridding), and taking into account the various sources of uncertainty. Indeed, datasets acquired a few years apart often differ in point density (sometimes by an order of magnitude for recent data) and acquisition geometry, and very likely suffer from georeferencing errors and geometric discrepancies. All these differences might not be important for producing maps from each dataset separately, but they are crucial when performing change detection. We have developed a novel technique for the estimation of uncertainty maps from the LiDAR point clouds, using Bayesian inference, treating all variables as random. The main principle is to grid all points on a common grid before attempting any comparison, as working directly with point clouds is cumbersome and time consuming. A non-parametric approach based on local linear regression was implemented, assuming a locally linear model for the surface. This enabled us to derive error bars on gridded elevations, and then on elevation differences. In this way, a map of statistically significant changes could be computed - whereas a deterministic approach would not allow testing of the significance of differences between the two datasets. This approach allowed us to take into account not only the observation noise (due to ranging, position and attitude errors) but also the intrinsic roughness of the observed surfaces occurring when scanning vegetation. As only
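
    Given gridded elevations and their standard errors from the local linear regression fit, the significance test on elevation differences is straightforward. A minimal sketch; the 1.96 threshold is an assumed 5% two-sided level, not necessarily the authors' choice.

```python
import numpy as np

def significant_change(h1, s1, h2, s2, z_crit=1.96):
    """Per-cell change test between two gridded LiDAR surfaces.
    h1, h2: gridded elevations; s1, s2: their standard errors."""
    dz = h2 - h1
    sigma = np.sqrt(s1**2 + s2**2)   # combined uncertainty of the difference
    z = np.divide(dz, sigma, out=np.zeros_like(dz), where=sigma > 0)
    return dz, np.abs(z) > z_crit    # change map and significance mask
```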

  15. Creating probabilistic maps of the face network in the adolescent brain: A multi-centre functional MRI study

    International Nuclear Information System (INIS)

    Tahmasebi, Amir M.; Mareckova, Klara; Artiges, Eric; Martinot, Jean-Luc; Banaschewski, Tobias; Barker, Gareth J.; Loth, Eva; Schumann, Gunter; Bruehl, Ruediger; Ittermann, Bernd; Buchel, Christian; Conrod, Patricia J.; Flor, Herta; Strohle, Andreas; Garavan, Hugh; Gallinat, Jurgen; Heinz, Andreas; Poline, Jean-Baptiste; Rietschel, Marcella; Smolka, Michael N.; Paus, Tomas

    2012-01-01

    Large-scale magnetic resonance (MR) studies of the human brain offer unique opportunities for identifying genetic and environmental factors shaping the human brain. Here, we describe a dataset collected in the context of a multi-centre study of the adolescent brain, namely the IMAGEN Study. We focus on one of the functional paradigms included in the project to probe the brain network underlying processing of ambiguous and angry faces. Using functional MR (fMRI) data collected in 1,110 adolescents, we constructed probabilistic maps of the neural network engaged consistently while viewing the ambiguous or angry faces; 21 brain regions responding to faces with high probability were identified. We were also able to address several methodological issues, including the minimal sample size yielding a stable location of a test region, namely the fusiform face area (FFA), as well as the effect of acquisition site (eight sites) and scanner (four manufacturers) on the location and magnitude of the fMRI response to faces in the FFA. Finally, we provided a comparison between male and female adolescents in terms of the effect sizes of sex differences in brain response to the ambiguous and angry faces in the 21 regions of interest. Overall, we found a stronger neural response to the ambiguous faces in several cortical regions, including the fusiform face area, in female (vs. male) adolescents, and a slightly stronger response to the angry faces in the amygdala of male (vs. female) adolescents. (authors)
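
    Per voxel, such a probabilistic map is simply the proportion of subjects whose thresholded activation covers the voxel. A minimal sketch with random masks standing in for the subjects' thresholded face-contrast maps:

```python
import numpy as np

# Hypothetical per-subject binary activation masks, shape (subjects, x, y, z)
masks = np.random.rand(1110, 20, 24, 20) < 0.2

prob_map = masks.mean(axis=0)   # voxel-wise probability of engagement
high_prob = prob_map >= 0.5     # regions responding to faces with high probability
```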

  16. Integration of climatic indices in an objective probabilistic model for establishing and mapping viticultural climatic zones in a region

    Science.gov (United States)

    Moral, Francisco J.; Rebollo, Francisco J.; Paniagua, Luis L.; García, Abelardo; Honorio, Fulgencio

    2016-05-01

    Different climatic indices have been proposed to determine the wine suitability in a region. Some of them are related to the air temperature, but the hydric component of climate, which in turn is influenced by the precipitation during the different stages of the grapevine growing and ripening periods, should also be considered. In this study, we propose using the information obtained from ten climatic indices [heliothermal index (HI), cool night index (CI), dryness index (DI), growing season temperature (GST), the Winkler index (WI), September mean thermal amplitude (MTA), annual precipitation (AP), precipitation during flowering (PDF), precipitation before flowering (PBF), and summer precipitation (SP)] as inputs in an objective and probabilistic model, the Rasch model, with the aim of integrating their individual effects, obtaining the climate data that summarize all main climatic indices which could influence wine suitability from a climate viewpoint, and utilizing the Rasch measures to generate homogeneous climatic zones. The use of the Rasch model to estimate viticultural climatic suitability constitutes a new application of great practical importance, making it possible to rationally determine locations in a region where high viticultural potential exists and to establish a ranking of the climatic indices that exert an important influence on wine suitability in a region. Furthermore, from the measures of viticultural climatic suitability at some locations, estimates can be computed using a geostatistical algorithm, and these estimates can be utilized to map viticultural climatic zones in a region. To illustrate the process, an application to Extremadura, southwestern Spain, is shown.

  17. The integration of bioclimatic indices in an objective probabilistic model for establishing and mapping viticulture suitability in a region

    Science.gov (United States)

    Moral García, Francisco J.; Rebollo, Francisco J.; Paniagua, Luis L.; García, Abelardo

    2014-05-01

    Different bioclimatic indices have been proposed to determine the wine suitability in a region. Some of them are related to the air temperature, but the hydric component of climate, which in turn is influenced by the precipitation during the different stages of the grapevine growing and ripening periods, should also be considered. In this work we propose using the information obtained from 10 bioclimatic indices and variables (heliothermal index, HI, cool night index, CI, dryness index, DI, growing season temperature, GST, the Winkler index, WI, September mean thermal amplitude, MTA, annual precipitation, AP, precipitation during flowering, PDF, precipitation before flowering, PBF, and summer precipitation, SP) as inputs in an objective and probabilistic model, the Rasch model, with the aim of integrating their individual effects, obtaining the climate data that summarize all main bioclimatic indices which could influence wine suitability, and utilizing the Rasch measures to generate homogeneous climatic zones. The use of the Rasch model to estimate viticultural suitability constitutes a new application of great practical importance, making it possible to rationally determine locations in a region where high viticultural potential exists and to establish a ranking of the bioclimatic indices or variables that exert an important influence on wine suitability in a region. Furthermore, from the measures of viticultural suitability at some locations, estimates can be computed using a geostatistical algorithm, and these estimates can be utilized to map viticultural suitability potential in a region. To illustrate the process, an application to Extremadura, southwestern Spain, is shown. Keywords: Rasch model, bioclimatic indices, GIS.
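
    The final mapping step in both records interpolates the station-level Rasch measures across the region. The sketch below uses inverse-distance weighting as a simple stand-in for the geostatistical algorithm (kriging would be the usual choice); station locations and measures are hypothetical.

```python
import numpy as np

def idw(xy_known, values, xy_query, power=2.0):
    """Inverse-distance-weighted interpolation of point measures."""
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
    w = 1.0 / np.maximum(d, 1e-9) ** power
    return (w @ values) / w.sum(axis=1)

# Rasch suitability measures at weather-station locations (hypothetical)
stations = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
measures = np.array([-0.8, 0.3, 1.1, 0.2])

gx, gy = np.meshgrid(np.linspace(0, 10, 50), np.linspace(0, 10, 50))
grid = np.column_stack([gx.ravel(), gy.ravel()])
suitability = idw(stations, measures, grid).reshape(gx.shape)  # zone map input
```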

  18. POLARIS: A 30-meter probabilistic soil series map of the contiguous United States

    Science.gov (United States)

    Chaney, Nathaniel W; Wood, Eric F; McBratney, Alexander B; Hempel, Jonathan W; Nauman, Travis; Brungard, Colby W.; Odgers, Nathan P

    2016-01-01

    A new complete map of soil series probabilities has been produced for the contiguous United States at a 30 m spatial resolution. This innovative database, named POLARIS, is constructed using available high-resolution geospatial environmental data and a state-of-the-art machine learning algorithm (DSMART-HPC) to remap the Soil Survey Geographic (SSURGO) database. This 9 billion grid cell database is possible using available high performance computing resources. POLARIS provides a spatially continuous, internally consistent, quantitative prediction of soil series. It offers potential solutions to the primary weaknesses in SSURGO: 1) unmapped areas are gap-filled using survey data from the surrounding regions, 2) the artificial discontinuities at political boundaries are removed, and 3) the use of high resolution environmental covariate data leads to a spatial disaggregation of the coarse polygons. The geospatial environmental covariates that have the largest role in assembling POLARIS over the contiguous United States (CONUS) are fine-scale (30 m) elevation data and coarse-scale (~ 2 km) estimates of the geographic distribution of uranium, thorium, and potassium. A preliminary validation of POLARIS using the NRCS National Soil Information System (NASIS) database shows variable performance over CONUS. In general, the best performance is obtained at grid cells where DSMART-HPC is most able to reduce the chance of misclassification. The important role of environmental covariates in limiting prediction uncertainty suggests including additional covariates is pivotal to improving POLARIS' accuracy. This database has the potential to improve the modeling of biogeochemical, water, and energy cycles in environmental models; enhance availability of data for precision agriculture; and assist hydrologic monitoring and forecasting to ensure food and water security.

  19. Probabilistic logics and probabilistic networks

    CERN Document Server

    Haenni, Rolf; Wheeler, Gregory; Williamson, Jon; Andrews, Jill

    2014-01-01

    Probabilistic Logic and Probabilistic Networks presents a groundbreaking framework within which various approaches to probabilistic logic naturally fit. Additionally, the text shows how to develop computationally feasible methods to mesh with this framework.

  20. PAVA: Physiological and Anatomical Visual Analytics for Mapping of Tissue-Specific Concentration and Time-Course Data

    Science.gov (United States)

    We describe the development and implementation of a Physiological and Anatomical Visual Analytics tool (PAVA), a web browser-based application, used to visualize experimental/simulated chemical time-course data (dosimetry), epidemiological data and Physiologically-Annotated Data ...

  1. Right Hemisphere Cognitive Functions: From Clinical and Anatomic Bases to Brain Mapping During Awake Craniotomy Part I: Clinical and Functional Anatomy.

    Science.gov (United States)

    Bernard, Florian; Lemée, Jean-Michel; Ter Minassian, Aram; Menei, Philippe

    2018-05-12

    The nondominant hemisphere (usually the right) is responsible for primary cognitive functions such as visuospatial and social cognition. Awake surgery using direct electric stimulation for right cerebral tumor removal remains challenging because of the complexity of the functional anatomy and difficulties in adapting standard bedside tasks to awake surgery conditions. An understanding of semiology and anatomic bases, along with an analysis of the available cognitive tasks for visuospatial and social cognition in peroperative mapping, allows neurosurgeons to better appreciate the functional anatomy of the right hemisphere and its relevance to tumor surgery. In this article, the first of a 2-part review, we discuss the anatomic and functional basis of right hemisphere function. Whereas part II of the review focuses primarily on semiology and surgical management of right-sided tumors under awake conditions, this article provides a comprehensive review of knowledge underpinning awake surgery on the right hemisphere. Copyright © 2018 Elsevier Inc. All rights reserved.

  2. Probabilistic atlas-based segmentation of combined T1-weighted and DUTE MRI for calculation of head attenuation maps in integrated PET/MRI scanners.

    Science.gov (United States)

    Poynton, Clare B; Chen, Kevin T; Chonde, Daniel B; Izquierdo-Garcia, David; Gollub, Randy L; Gerstner, Elizabeth R; Batchelor, Tracy T; Catana, Ciprian

    2014-01-01

    We present a new MRI-based attenuation correction (AC) approach for integrated PET/MRI systems that combines both segmentation- and atlas-based methods by incorporating dual-echo ultra-short echo-time (DUTE) and T1-weighted (T1w) MRI data and a probabilistic atlas. Segmented atlases were constructed from CT training data using a leave-one-out framework and combined with T1w, DUTE, and CT data to train a classifier that computes the probability of air/soft tissue/bone at each voxel. This classifier was applied to segment the MRI of the subject of interest, and attenuation maps (μ-maps) were generated by assigning specific linear attenuation coefficients (LACs) to each tissue class. The μ-maps generated with this "Atlas-T1w-DUTE" approach were compared to those obtained from DUTE data using a previously proposed method. For validation of the segmentation results, segmented CT μ-maps were considered the "silver standard"; the segmentation accuracy was assessed qualitatively and quantitatively through calculation of the Dice similarity coefficient (DSC). Relative change (RC) maps between the CT- and MRI-based attenuation-corrected PET volumes were also calculated for a global voxel-wise assessment of the reconstruction results. The μ-maps obtained using the Atlas-T1w-DUTE classifier agreed well with those derived from CT; the mean DSCs for the Atlas-T1w-DUTE-based μ-maps across all subjects were higher than those for DUTE-based μ-maps; the atlas-based μ-maps also showed a lower percentage of misclassified voxels across all subjects. RC maps from the atlas-based technique also demonstrated improvement in the PET data compared to the DUTE method, both globally and regionally.
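
    The validation metric used here, the Dice similarity coefficient, is 2|A∩B| / (|A| + |B|) for two binary segmentations. A minimal sketch:

```python
import numpy as np

def dice(a, b):
    """Dice similarity coefficient between two binary segmentations."""
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# e.g. dice(mr_bone_mask, ct_bone_mask) per tissue class, per subject
```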

  3. Atrial fibrillation driven by micro-anatomic intramural re-entry revealed by simultaneous sub-epicardial and sub-endocardial optical mapping in explanted human hearts.

    Science.gov (United States)

    Hansen, Brian J; Zhao, Jichao; Csepe, Thomas A; Moore, Brandon T; Li, Ning; Jayne, Laura A; Kalyanasundaram, Anuradha; Lim, Praise; Bratasz, Anna; Powell, Kimerly A; Simonetti, Orlando P; Higgins, Robert S D; Kilic, Ahmet; Mohler, Peter J; Janssen, Paul M L; Weiss, Raul; Hummel, John D; Fedorov, Vadim V

    2015-09-14

    The complex architecture of the human atria may create physical substrates for sustained re-entry to drive atrial fibrillation (AF). The existence of sustained, anatomically defined AF drivers in humans has been challenged partly due to the lack of simultaneous endocardial-epicardial (Endo-Epi) mapping coupled with high-resolution 3D structural imaging. Coronary-perfused human right atria from explanted diseased hearts (n = 8, 43-72 years old) were optically mapped simultaneously by three high-resolution CMOS cameras (two aligned Endo-Epi views (330 µm² resolution) and one panoramic view). 3D gadolinium-enhanced magnetic resonance imaging (GE-MRI, 80 µm³ resolution) revealed the atrial wall structure varied in thickness (1.0 ± 0.7-6.8 ± 2.4 mm), transmural fiber angle differences, and interstitial fibrosis causing transmural activation delay from 23 ± 11 to 43 ± 22 ms at increased pacing rates. Sustained AF (>90 min) was induced by burst pacing during pinacidil (30-100 µM) perfusion. Dual-sided sub-Endo-sub-Epi optical mapping revealed that AF was driven by spatially and temporally stable intramural re-entry with 107 ± 50 ms cycle length and transmural activation delay of 67 ± 31 ms. Intramural re-entrant drivers were captured primarily by sub-Endo mapping, while sub-Epi mapping visualized re-entry or 'breakthrough' patterns. Re-entrant drivers were anchored on 3D micro-anatomic tracks (15.4 ± 2.2 × 6.0 ± 2.3 mm², 2.9 ± 0.9 mm depth) formed by atrial musculature characterized by increased transmural fiber angle differences and interstitial fibrosis. Targeted radiofrequency ablation of the tracks verified these re-entries as drivers of AF. Integrated 3D structural-functional mapping of diseased human right atria ex vivo revealed that the complex atrial microstructure caused significant differences between Endo vs. Epi activation during pacing and sustained AF driven by intramural re-entry anchored to fibrosis-insulated atrial bundles.

  4. Clinical applications of the superior epigastric artery perforator (SEAP) flap: anatomical studies and preoperative perforator mapping with multidetector CT.

    Science.gov (United States)

    Hamdi, Moustapha; Van Landuyt, Koenraad; Ulens, Sara; Van Hedent, Eddy; Roche, Nathalie; Monstrey, Stan

    2009-09-01

    Pedicled superior epigastric artery perforator (SEAP) flaps can be raised to cover challenging thoracic defects. We present an anatomical study based on multidetector computerized tomography (MDCT) scan findings of the SEA perforators in addition to the first reported clinical series of SEAP flaps in anterior chest wall reconstruction. (a) In the CT scan study, images of a group of 20 patients who underwent MDCT scan analysis were used to visualise bilaterally the location of musculocutaneous SEAP. X- and Y-axes were used as landmarks to localise the perforators. The X-axis is a horizontal line at the junction of sternum and xyphoid (JCX) and the Y-axis is at the midline. (b) In the clinical study, seven pedicled SEAP flaps were performed in another group of patients. MDCT images revealed totally 157 perforators with a mean of 7.85 perforators per patient. The dominant perforators (137 perforators) were mainly localised in an area between 1.5 and 6.5 cm from the X-axis on both sides and between 3 and 16 cm below the Y-axis. The calibre of these dominant perforators was judged as 'good' to 'very good' in 82.5% of the cases. The average dimension of the flap was 21.7x6.7 cm. All flaps were based on one perforator. Mean harvesting time was 110 min. There were no flap losses. Minor tip necrosis occurred in two flaps. One of them was treated with excision and primary closure. Our clinical experience indicates that the SEAP flap provides a novel and useful approach for reconstruction of anterior chest wall defects. CT-based imaging allows for anatomical assessment of the perforators of the superior epigastric artery (SEA).

  5. Probabilistic insurance

    OpenAIRE

    Wakker, P.P.; Thaler, R.H.; Tversky, A.

    1997-01-01

    textabstractProbabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in the premium to compensate for a 1% default risk. While these preferences are intuitively appealing they are difficult to reconcile with expected utility theory. Under highly plausible assumptions about the utility function, willingness to pay for probabilistic i...

  6. Neuroanatomical substrates of action perception and understanding: an anatomic likelihood estimation meta-analysis of lesion-symptom mapping studies in brain injured patients.

    Directory of Open Access Journals (Sweden)

    Cosimo eUrgesi

    2014-05-01

    Full Text Available Several neurophysiologic and neuroimaging studies suggested that motor and perceptual systems are tightly linked along a continuum rather than providing segregated mechanisms supporting different functions. Using correlational approaches, these studies demonstrated that action observation activates not only visual but also motor brain regions. On the other hand, brain stimulation and brain lesion evidence allows tackling the critical question of whether our action representations are necessary to perceive and understand others’ actions. In particular, recent neuropsychological studies have shown that patients with temporal, parietal and frontal lesions exhibit a number of possible deficits in the visual perception and the understanding of others’ actions. The specific anatomical substrates of such neuropsychological deficits however are still a matter of debate. Here we review the existing literature on this issue and perform an anatomic likelihood estimation meta-analysis of studies using lesion-symptom mapping methods on the causal relation between brain lesions and non-linguistic action perception and understanding deficits. The meta-analysis encompassed data from 361 patients tested in 11 studies and identified regions in the inferior frontal cortex, the inferior parietal cortex and the middle/superior temporal cortex, whose damage is consistently associated with poor performance in action perception and understanding tasks across studies. Interestingly, these areas correspond to the three nodes of the action observation network that are strongly activated in response to visual action perception in neuroimaging research and that have been targeted in previous brain stimulation studies. Thus, brain lesion mapping research provides converging causal evidence that premotor, parietal and temporal regions play a crucial role in action recognition and understanding.

  7. Assignment of functional activations to probabilistic cytoarchitectonic areas revisited.

    Science.gov (United States)

    Eickhoff, Simon B; Paus, Tomas; Caspers, Svenja; Grosbras, Marie-Helene; Evans, Alan C; Zilles, Karl; Amunts, Katrin

    2007-07-01

    Probabilistic cytoarchitectonic maps in standard reference space provide a powerful tool for the analysis of structure-function relationships in the human brain. While these microstructurally defined maps have already been successfully used in the analysis of somatosensory, motor or language functions, several conceptual issues in the analysis of structure-function relationships still demand further clarification. In this paper, we demonstrate the principal approaches for anatomical localisation of functional activations based on probabilistic cytoarchitectonic maps by exemplary analysis of an anterior parietal activation evoked by visual presentation of hand gestures. After consideration of the conceptual basis and implementation of volume or local maxima labelling, we comment on some potential interpretational difficulties, limitations and caveats that could be encountered. Extending and supplementing these methods, we then propose a supplementary approach for quantification of structure-function correspondences based on distribution analysis. This approach relates the cytoarchitectonic probabilities observed at a particular functionally defined location to the areal-specific null distribution of probabilities across the whole brain (i.e., the full probability map). Importantly, this method avoids the need for a unique classification of voxels to a single cortical area and may increase the comparability between results obtained for different areas. Moreover, as distribution-based labelling quantifies the "central tendency" of an activation with respect to anatomical areas, it will, in combination with the established methods, allow an advanced characterisation of the anatomical substrates of functional activations. Finally, the advantages and disadvantages of the various methods are discussed, focussing on the question of which approach is most appropriate for a particular situation.
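
    The proposed distribution-based labelling relates the cytoarchitectonic probability observed at a functional peak to the area's null distribution of probabilities across all voxels where the area is represented. A minimal sketch; the function and variable names are illustrative, not from the paper.

```python
import numpy as np

def central_tendency(prob_map, peak_index):
    """Percentile of the area's probability at a functional peak relative to
    the area's full (nonzero) probability distribution across the brain."""
    null = prob_map[prob_map > 0].ravel()
    p_at_peak = prob_map[peak_index]
    percentile = 100.0 * (null <= p_at_peak).mean()
    return p_at_peak, percentile  # high percentile: peak lies centrally in the area
```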

  8. Probabilistic Networks

    DEFF Research Database (Denmark)

    Jensen, Finn Verner; Lauritzen, Steffen Lilholt

    2001-01-01

    This article describes the basic ideas and algorithms behind specification and inference in probabilistic networks based on directed acyclic graphs, undirected graphs, and chain graphs.

  9. Probabilistic Insurance

    NARCIS (Netherlands)

    Wakker, P.P.; Thaler, R.H.; Tversky, A.

    1997-01-01

    Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in premium to compensate for a 1% default risk. These observations cannot be

  10. Probabilistic Insurance

    NARCIS (Netherlands)

    P.P. Wakker (Peter); R.H. Thaler (Richard); A. Tversky (Amos)

    1997-01-01

    textabstractProbabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in the premium to compensate for a 1% default risk. While these

  11. Characterization of respiratory and cardiac motion from electro-anatomical mapping data for improved fusion of MRI to left ventricular electrograms.

    Directory of Open Access Journals (Sweden)

    Sébastien Roujol

    Full Text Available Accurate fusion of late gadolinium enhancement magnetic resonance imaging (MRI) and electro-anatomical voltage mapping (EAM) is required to evaluate the potential of MRI to identify the substrate of ventricular tachycardia. However, both datasets are not acquired at the same cardiac phase, and EAM data is corrupted with respiratory motion, limiting the accuracy of current rigid fusion techniques. Knowledge of cardiac and respiratory motion during EAM is thus required to enhance the fusion process. In this study, we propose a novel approach to characterize both cardiac and respiratory motion from EAM data using the temporal evolution of the 3D catheter location recorded from clinical EAM systems. Cardiac and respiratory motion components are extracted from the recorded catheter location using multi-band filters. Filters are calibrated for each EAM point using estimates of heart rate and respiratory rate. The method was first evaluated in numerical simulations using 3D models of cardiac and respiratory motions of the heart generated from real time MRI data acquired in 5 healthy subjects. An accuracy of 0.6-0.7 mm was found for both cardiac and respiratory motion estimates in numerical simulations. Cardiac and respiratory motions were then characterized in 27 patients who underwent LV mapping for treatment of ventricular tachycardia. Mean maximum amplitude of cardiac and respiratory motion was 10.2±2.7 mm (min = 5.5, max = 16.9) and 8.8±2.3 mm (min = 4.3, max = 14.8), respectively. 3D cardiac and respiratory motions could thus be estimated from the recorded catheter location; the method does not rely on additional imaging modalities such as X-ray fluoroscopy and can be used in a conventional electrophysiology laboratory setting.
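
    The filtering idea can be illustrated with Butterworth band-pass filters centred on the estimated heart and respiratory rates. In the sketch below, the filter order, bandwidth, and sampling rate are assumptions, not the authors' calibrated multi-band design.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def extract_motion(xyz, fs, heart_hz, resp_hz, half_bw=0.15):
    """Separate cardiac and respiratory components of the recorded 3D
    catheter location (xyz: samples x 3, fs: sampling rate in Hz)."""
    def bandpass(signal, f0):
        b, a = butter(2, [max(f0 - half_bw, 0.01), f0 + half_bw],
                      btype="bandpass", fs=fs)
        return filtfilt(b, a, signal, axis=0)
    cardiac = bandpass(xyz, heart_hz)   # e.g. ~1.2 Hz for 72 bpm
    resp = bandpass(xyz, resp_hz)       # e.g. ~0.25 Hz for 15 breaths/min
    return cardiac, resp
```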

  12. Prospective randomized comparison of rotational angiography with three-dimensional reconstruction and computed tomography merged with electro-anatomical mapping: a two center atrial fibrillation ablation study.

    Science.gov (United States)

    Anand, Rishi; Gorev, Maxim V; Poghosyan, Hermine; Pothier, Lindsay; Matkins, John; Kotler, Gregory; Moroz, Sarah; Armstrong, James; Nemtsov, Sergei V; Orlov, Michael V

    2016-08-01

    To compare the efficacy and accuracy of rotational angiography with three-dimensional reconstruction (3DATG) merged with electro-anatomical mapping (EAM) vs. CT-EAM image fusion. A prospective, randomized, parallel, two-center study conducted in 36 patients (25 men, age 65 ± 10 years) undergoing AF ablation (33 % paroxysmal, 67 % persistent) guided by 3DATG (group 1) vs. CT (group 2) image fusion with EAM. 3DATG was performed on the Philips Allura Xper FD 10 system. Procedural characteristics including time, radiation exposure, outcome, and navigation accuracy were compared between the two groups. There was no significant difference between the groups in total procedure duration or time spent on the various procedural steps. Minor differences in procedural characteristics were present between the two centers. Segmentation and fusion time for 3DATG- or CT-EAM was short and similar at both centers. Accuracy of navigation guided by either method was high and did not depend on left atrial size. Maintenance of sinus rhythm did not differ between the two groups up to 24 months of follow-up. This study did not find superiority of 3DATG-EAM image merging to guide AF ablation when compared to CT-EAM fusion. Both merging techniques result in similar navigation accuracy.

  13. Probabilistic global maps of the CO2 column at daily and monthly scales from sparse satellite measurements

    Science.gov (United States)

    Chevallier, Frédéric; Broquet, Grégoire; Pierangelo, Clémence; Crisp, David

    2017-07-01

    The column-average dry air-mole fraction of carbon dioxide in the atmosphere (XCO2) is measured by scattered satellite measurements like those from the Orbiting Carbon Observatory (OCO-2). We show that global continuous maps of XCO2 (corresponding to level 3 of the satellite data) at daily or coarser temporal resolution can be inferred from these data with a Kalman filter built on a model of persistence. Our application of this approach on 2 years of OCO-2 retrievals indicates that the filter provides better information than a climatology of XCO2 at both daily and monthly scales. Provided that the assigned observation uncertainty statistics are tuned in each grid cell of the XCO2 maps from an objective method (based on consistency diagnostics), the errors predicted by the filter at daily and monthly scales represent the true error statistics reasonably well, except for a bias in the high latitudes of the winter hemisphere and a lack of resolution (i.e., a too small discrimination skill) of the predicted error standard deviations. Due to the sparse satellite sampling, the broad-scale patterns of XCO2 described by the filter seem to lag behind the real signals by a few weeks. Finally, the filter offers interesting insights into the quality of the retrievals, both in terms of random and systematic errors.
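
    Per grid cell, the filter described above amounts to a scalar Kalman filter with a random-walk (persistence) model, updated only on days with a retrieval. A minimal sketch; the process and observation variances are placeholders for the tuned statistics the paper derives from consistency diagnostics.

```python
import numpy as np

def kalman_persistence(obs, obs_var, x0, p0, q):
    """Daily XCO2 estimate for one grid cell. obs: retrievals with np.nan on
    days without sampling; obs_var: their error variances; q: process noise."""
    x, p = x0, p0
    track = []
    for y, r in zip(obs, obs_var):
        p = p + q                 # persistence forecast inflates uncertainty
        if not np.isnan(y):       # update only where a retrieval exists
            k = p / (p + r)
            x = x + k * (y - x)
            p = (1.0 - k) * p
        track.append((x, p))      # level-3 estimate and its error variance
    return track
```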

  14. Image integration into 3-dimensional-electro-anatomical mapping system facilitates safe ablation of ventricular arrhythmias originating from the aortic root and its vicinity.

    Science.gov (United States)

    Jularic, Mario; Akbulak, Ruken Özge; Schäffer, Benjamin; Moser, Julia; Nuehrich, Jana; Meyer, Christian; Eickholt, Christian; Willems, Stephan; Hoffmann, Boris A

    2018-03-01

    During ablation in the vicinity of the coronary arteries, establishing a safe distance from the catheter tip to the relevant vessels is mandatory and usually assessed by fluoroscopy alone. The aim of the study was to investigate the feasibility of an image integration module (IIM) for continuous monitoring of the distance of the ablation catheter tip to the main coronary arteries during ablation of ventricular arrhythmias (VA) originating in the sinus of Valsalva (SOV) and the left ventricular summit, part of which can be reached via the great cardiac vein (GCV). Of 129 patients undergoing mapping for outflow tract arrhythmias from June 2014 to October 2015, a total of 39 patients (52.4 ± 18.1 years, 17 female) had a source of origin in the SOV or the left ventricular summit. Radiofrequency (RF) ablation was performed when a distance of at least 5 mm could be demonstrated with IIM. A safe distance in at least one angiographic plane could be demonstrated in all patients with a source of origin in the SOV, whereas this was not possible in 50% of patients with earliest activation in the summit area. However, using the IIM a safe position at an adjacent site within the GCV could be obtained in three of these cases and successful RF ablation performed safely without any complications. Ablation was successful in 100% of patients with an origin in the SOV, whereas VAs originating from the left ventricular summit could be abolished completely in only 60% of cases. Image integration combining electroanatomical mapping and fluoroscopy allows assessment of the safety of a potential ablation site by continuous real-time monitoring of the spatial relations of the catheter tip to the coronary vessels prior to RF application. It aids ablation in anatomically complex regions like the SOV or the ventricular summit by providing biplane angiograms merged into the three-dimensional electroanatomical map. Published on behalf of the European Society of Cardiology. All rights reserved.

  15. Probabilistic linguistics

    NARCIS (Netherlands)

    Bod, R.; Heine, B.; Narrog, H.

    2010-01-01

    Probabilistic linguistics takes all linguistic evidence as positive evidence and lets statistics decide. It allows for accurate modelling of gradient phenomena in production and perception, and suggests that rule-like behaviour is no more than a side effect of maximizing probability. This chapter

  16. Probabilistic Design

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Burcharth, H. F.

    This chapter describes how partial safety factors can be used in design of vertical wall breakwaters and an example of a code format is presented. The partial safety factors are calibrated on a probabilistic basis. The code calibration process used to calibrate some of the partial safety factors...

  17. Google Earth Engine, Open-Access Satellite Data, and Machine Learning in Support of Large-Area Probabilistic Wetland Mapping

    Directory of Open Access Journals (Sweden)

    Jennifer N. Hird

    2017-12-01

    Full Text Available Modern advances in cloud computing and machine-learning algorithms are shifting the manner in which Earth-observation (EO) data are used for environmental monitoring, particularly as we settle into the era of free, open-access satellite data streams. Wetland delineation represents a particularly worthy application of this emerging research trend, since wetlands are an ecologically important yet chronically under-represented component of contemporary mapping and monitoring programs, particularly at the regional and national levels. Exploiting Google Earth Engine and R Statistical software, we developed a workflow for predicting the probability of wetland occurrence using a boosted regression tree machine-learning framework applied to digital topographic and EO data. Working in a 13,700 km² study area in northern Alberta, our best models produced excellent results, with AUC (area under the receiver-operating characteristic curve) values of 0.898 and explained-deviance values of 0.708. Our results demonstrate the central role of high-quality topographic variables for modeling wetland distribution at regional scales. Including optical and/or radar variables in the workflow substantially improved model performance, though optical data performed slightly better. Converting our wetland probability-of-occurrence model into a binary Wet-Dry classification yielded an overall accuracy of 85%, which is virtually identical to that derived from the Alberta Merged Wetland Inventory (AMWI): the contemporary inventory used by the Government of Alberta. However, our workflow contains several key advantages over that used to produce the AMWI, and provides a scalable foundation for province-wide monitoring initiatives.
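
    The modeling step can be sketched with scikit-learn's gradient boosting in place of the paper's Google Earth Engine / R workflow. The covariates and labels below are synthetic, and the 0.5 Wet-Dry threshold is an assumption.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# X: per-pixel covariates (topographic indices, optical/radar bands);
# y: 1 = wetland, 0 = upland (synthetic stand-ins).
rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 8))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.8, size=5000)) > 0

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)
brt = GradientBoostingClassifier(n_estimators=300, learning_rate=0.05,
                                 max_depth=3).fit(Xtr, ytr)

p_wet = brt.predict_proba(Xte)[:, 1]   # probability-of-occurrence surface
print("AUC:", roc_auc_score(yte, p_wet))
wet_dry = p_wet >= 0.5                 # binary Wet-Dry classification
```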

  18. Probabilistic Logic and Probabilistic Networks

    NARCIS (Netherlands)

    Haenni, R.; Romeijn, J.-W.; Wheeler, G.; Williamson, J.

    2009-01-01

    While in principle probabilistic logics might be applied to solve a range of problems, in practice they are rarely applied at present. This is perhaps because they seem disparate, complicated, and computationally intractable. However, we shall argue in this programmatic paper that several approaches

  19. Validation of in vitro probabilistic tractography

    DEFF Research Database (Denmark)

    Dyrby, Tim B.; Sogaard, L.V.; Parker, G.J.

    2007-01-01

    assessed the anatomical validity and reproducibility of in vitro multi-fiber probabilistic tractography against two invasive tracers: the histochemically detectable biotinylated dextran amine and manganese enhanced magnetic resonance imaging. Post mortern DWI was used to ensure that most of the sources...

  20. Probabilistic Tsunami Hazard Analysis

    Science.gov (United States)

    Thio, H. K.; Ichinose, G. A.; Somerville, P. G.; Polet, J.

    2006-12-01

    thousands of earthquake scenarios. We have carried out preliminary tsunami hazard calculations for different return periods for western North America and Hawaii based on thousands of earthquake scenarios around the Pacific rim and along the coast of North America. We will present tsunami hazard maps for several return periods and also discuss how to use these results for probabilistic inundation and runup mapping. Our knowledge of certain types of tsunami sources is very limited (e.g. submarine landslides), but a probabilistic framework for tsunami hazard evaluation can include even such sources and their uncertainties and present the overall hazard in a meaningful and consistent way.
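
    The aggregation over scenarios described here can be illustrated with a toy hazard calculation: given annual rates for a handful of scenarios, the probability of exceeding an amplitude within a return period follows from treating scenarios as independent Poisson processes. All numbers below are invented for illustration.

      # Toy probabilistic tsunami hazard: each scenario has an annual rate and
      # a modeled amplitude at one coastal site (values are hypothetical).
      import numpy as np

      amplitudes = np.array([0.3, 0.8, 1.5, 2.5, 4.0])     # metres
      rates = np.array([0.05, 0.02, 0.008, 0.003, 0.001])  # events per year

      def exceedance_prob(level, t_years):
          """P(at least one tsunami exceeding `level` within t_years),
          treating the scenarios as independent Poisson processes."""
          total_rate = rates[amplitudes > level].sum()
          return 1.0 - np.exp(-total_rate * t_years)

      for level in (0.5, 1.0, 2.0):
          print(f"amplitude > {level} m in 50 yr: {exceedance_prob(level, 50):.3f}")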

  1. Probabilistic maps of the white matter tracts with known associated functions on the neonatal brain atlas: Application to evaluate longitudinal developmental trajectories in term-born and preterm-born infants.

    Science.gov (United States)

    Akazawa, Kentaro; Chang, Linda; Yamakawa, Robyn; Hayama, Sara; Buchthal, Steven; Alicata, Daniel; Andres, Tamara; Castillo, Deborrah; Oishi, Kumiko; Skranes, Jon; Ernst, Thomas; Oishi, Kenichi

    2016-03-01

    Diffusion tensor imaging (DTI) has been widely used to investigate the development of the neonatal and infant brain, and deviations related to various diseases or medical conditions like preterm birth. In this study, we created a probabilistic map of fiber pathways with known associated functions, on a published neonatal multimodal atlas. The pathways-of-interest include the superficial white matter (SWM) fibers just beneath the specific cytoarchitectonically defined cortical areas, which were difficult to evaluate with existing DTI analysis methods. The Jülich cytoarchitectonic atlas was applied to define cortical areas related to specific brain functions, and the Dynamic Programming (DP) method was applied to delineate the white matter pathways traversing through the SWM. Probabilistic maps were created for pathways related to motor, somatosensory, auditory, visual, and limbic functions, as well as major white matter tracts, such as the corpus callosum, the inferior fronto-occipital fasciculus, and the middle cerebellar peduncle, by delineating these structures in eleven healthy term-born neonates. In order to characterize maturation-related changes in diffusivity measures of these pathways, the probabilistic maps were then applied to DTIs of 49 healthy infants who were longitudinally scanned at three time-points, approximately five weeks apart. First, we investigated the normal developmental pattern based on 19 term-born infants. Next, we analyzed 30 preterm-born infants to identify developmental patterns related to preterm birth. Last, we investigated the difference in diffusion measures between these groups to evaluate the effects of preterm birth on the development of these functional pathways. Term-born and preterm-born infants both demonstrated a time-dependent decrease in diffusivity, indicating postnatal maturation in these pathways, with laterality seen in the corticospinal tract and the optic radiation. The comparison between term- and preterm
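
    One common way to apply such probabilistic maps to individual diffusion data, sketched below under the assumption of co-registered volumes, is to weight each voxel's diffusivity by the map's tract probability; the arrays here are random placeholders, not study data.

      # Probability-weighted mean diffusivity inside a probabilistic tract map.
      import numpy as np

      rng = np.random.default_rng(1)
      tract_prob = rng.uniform(0.0, 1.0, size=(64, 64, 40))        # P(voxel in tract)
      mean_diffusivity = rng.normal(1.0e-3, 1.0e-4, (64, 64, 40))  # mm^2/s

      weighted_md = (tract_prob * mean_diffusivity).sum() / tract_prob.sum()
      print(f"probability-weighted MD: {weighted_md:.3e} mm^2/s")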

  2. Probabilistic methods in combinatorial analysis

    CERN Document Server

    Sachkov, Vladimir N

    2014-01-01

    This 1997 work explores the role of probabilistic methods for solving combinatorial problems. These methods not only provide the means of efficiently using such notions as characteristic and generating functions, the moment method and so on but also let us use the powerful technique of limit theorems. The basic objects under investigation are nonnegative matrices, partitions and mappings of finite sets, with special emphasis on permutations and graphs, and equivalence classes specified on sequences of finite length consisting of elements of partially ordered sets; these specify the probabilist

  3. Virtual brain mapping: Meta-analysis and visualization in functional neuroimaging

    DEFF Research Database (Denmark)

    Nielsen, Finn Årup

    Results from functional neuroimaging such as positron emission tomography and functional magnetic resonance are often reported as sets of 3-dimensional coordinates in Talairach stereotactic space. By utilizing data collected in the BrainMap database and from our own small XML database we can...... data matrix. By conditioning on elements in the databases other than the coordinate data, e.g., anatomical labels associated with many coordinates we can make conditional novelty detection identifying outliers in the database that might be erroneous entries or seldom occurring patterns. In the Brain......Map database we found errors, e.g., stemming from confusion of centimeters and millimeters during entering and errors in the original article. Conditional probability density modeling also enables generation of probabilistic atlases and automatic probabilistic anatomical labeling of new coordinates...

  4. Probabilistic Unawareness

    Directory of Open Access Journals (Sweden)

    Mikaël Cozic

    2016-11-01

    Full Text Available The modeling of awareness and unawareness is a significant topic in the doxastic logic literature, where it is usually tackled in terms of full belief operators. The present paper aims at a treatment in terms of partial belief operators. It draws upon the modal probabilistic logic that was introduced by Aumann (1999) at the semantic level, and then axiomatized by Heifetz and Mongin (2001). The paper embodies in this framework those properties of unawareness that have been highlighted in the seminal paper by Modica and Rustichini (1999). Their paper deals with full belief, but we argue that the properties in question also apply to partial belief. Our main result is a (soundness and completeness) theorem that reunites the two strands—modal and probabilistic—of doxastic logic.

  5. Assessment of Cerebral Hemodynamic Changes in Pediatric Patients with Moyamoya Disease Using Probabilistic Maps on Analysis of Basal/Acetazolamide Stress Brain Perfusion SPECT

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Ho Young; Lee, Jae Sung; Kim, Seung Ki; Wang, Kyu Chang; Cho, Byung Kyu; Chung, June Key; Lee, Myung Chul; Lee, Dong Soo [Seoul National University College of Medicine, Seoul (Korea, Republic of)

    2008-06-15

    lobe of the hemispheres with EDAS and frontal EGS, the post-operative CVRI, and ΔCVRI showed a significant difference between patients with a good and a poor clinical outcome (p<0.05). In a multivariate logistic regression analysis, the ΔCVRI and the post-operative CVRI of the medial frontal gyrus on the hemispheres where EDAS with frontal EGS was performed were the significant predictive factors for the clinical outcome (p=0.002, p=0.015). With probabilistic maps, we could objectively evaluate pre- and post-operative hemodynamic changes in pediatric patients with moyamoya disease. Specifically, the ΔCVRI and the post-operative CVRI of the medial frontal gyrus where EDAS with frontal EGS was done were the significant predictive factors for further clinical outcome.

  6. Mapping visual cortex in monkeys and humans using surface-based atlases

    Science.gov (United States)

    Van Essen, D. C.; Lewis, J. W.; Drury, H. A.; Hadjikhani, N.; Tootell, R. B.; Bakircioglu, M.; Miller, M. I.

    2001-01-01

    We have used surface-based atlases of the cerebral cortex to analyze the functional organization of visual cortex in humans and macaque monkeys. The macaque atlas contains multiple partitioning schemes for visual cortex, including a probabilistic atlas of visual areas derived from a recent architectonic study, plus summary schemes that reflect a combination of physiological and anatomical evidence. The human atlas includes a probabilistic map of eight topographically organized visual areas recently mapped using functional MRI. To facilitate comparisons between species, we used surface-based warping to bring functional and geographic landmarks on the macaque map into register with corresponding landmarks on the human map. The results suggest that extrastriate visual cortex outside the known topographically organized areas is dramatically expanded in human compared to macaque cortex, particularly in the parietal lobe.

  7. Discrimination and anatomical mapping of PET-positive lesions: comparison of CT attenuation-corrected PET images with coregistered MR and CT images in the abdomen

    Energy Technology Data Exchange (ETDEWEB)

    Kuhn, Felix P.; Crook, David W.; Mader, Caecilia E.; Appenzeller, Philippe; Schulthess, G.K. von; Schmid, Daniel T. [University Hospital Zurich, Department of Medical Radiology, Zurich (Switzerland)

    2013-01-15

    PET/MR has the potential to become a powerful tool in clinical oncological imaging. The purpose of this prospective study was to evaluate the performance of a single T1-weighted (T1w) fat-suppressed unenhanced MR pulse sequence of the abdomen in comparison with unenhanced low-dose CT images to characterize PET-positive lesions. A total of 100 oncological patients underwent sequential whole-body ¹⁸F-FDG PET with CT-based attenuation correction (AC), 40 mAs low-dose CT and two-point Dixon-based T1w 3D MRI of the abdomen in a trimodality PET/CT-MR system. PET-positive lesions were assessed by CT and MRI with regard to their anatomical location, conspicuity and additional relevant information for characterization. From among 66 patients with at least one PET-positive lesion, 147 lesions were evaluated. No significant difference between MRI and CT was found regarding anatomical lesion localization. The MR pulse sequence used performed significantly better than CT regarding conspicuity of liver lesions (p < 0.001, Wilcoxon signed ranks test), whereas no difference was noted for extrahepatic lesions. For overall lesion characterization, MRI was considered superior to CT in 40 % of lesions, equal to CT in 49 %, and inferior to CT in 11 %. Fast Dixon-based T1w MRI outperformed low-dose CT in terms of conspicuity and characterization of PET-positive liver lesions and performed similarly in extrahepatic tumour manifestations. Hence, under the assumption that the technical issue of MR AC for whole-body PET examinations is solved, in abdominal PET/MR imaging the replacement of low-dose CT by a single Dixon-based MR pulse sequence for anatomical lesion correlation appears to be valid and robust. (orig.)

  9. Probabilistic biological network alignment.

    Science.gov (United States)

    Todor, Andrei; Dobra, Alin; Kahveci, Tamer

    2013-01-01

    Interactions between molecules are probabilistic events. An interaction may or may not happen with some probability, depending on a variety of factors such as the size, abundance, or proximity of the interacting molecules. In this paper, we consider the problem of aligning two biological networks. Unlike existing methods, we allow one of the two networks to contain probabilistic interactions. Allowing interaction probabilities makes the alignment more biologically relevant at the expense of explosive growth in the number of alternative topologies that may arise from different subsets of interactions that take place. We develop a novel method that efficiently and precisely characterizes this massive search space. We represent the topological similarity between pairs of aligned molecules (i.e., proteins) with the help of random variables and compute their expected values. We validate our method showing that, without sacrificing the running time performance, it can produce novel alignments. Our results also demonstrate that our method identifies biologically meaningful mappings under a comprehensive set of criteria used in the literature as well as the statistical coherence measure that we developed to analyze the statistical significance of the similarity of the functions of the aligned protein pairs.
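
    The expected-value device described in this abstract can be shown on a toy alignment: when one network's interactions are independent Bernoulli events, linearity of expectation gives the expected number of conserved edges without enumerating the exponentially many alternative topologies. The networks and mapping below are hypothetical.

      # Expected number of conserved edges under a fixed node mapping, with
      # probabilistic (independent Bernoulli) interactions in network B.
      edges_a = {("a1", "a2"), ("a2", "a3")}            # deterministic network A
      prob_b = {("b1", "b2"): 0.9, ("b2", "b3"): 0.4}   # P(edge exists in B)
      mapping = {"a1": "b1", "a2": "b2", "a3": "b3"}    # alignment A -> B

      expected_conserved = sum(
          prob_b.get((mapping[u], mapping[v]),
                     prob_b.get((mapping[v], mapping[u]), 0.0))
          for u, v in edges_a
      )
      print("expected conserved edges:", expected_conserved)  # 0.9 + 0.4 = 1.3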

  10. Climatological attribution of wind power ramp events in East Japan and their probabilistic forecast based on multi-model ensembles downscaled by analog ensemble using self-organizing maps

    Science.gov (United States)

    Ohba, Masamichi; Nohara, Daisuke; Kadokura, Shinji

    2016-04-01

    Severe storms or other extreme weather events can interrupt the spin of wind turbines at large scale, causing unexpected "wind ramp events". In this study, we present an application of self-organizing maps (SOMs) for climatological attribution of wind ramp events and for their probabilistic prediction. The SOM is an automatic data-mining clustering technique, which allows us to summarize a high-dimensional data space in terms of a set of reference vectors. The SOM is applied to analyze and connect the relationship between atmospheric patterns over Japan and wind power generation. The SOM is employed on sea level pressure derived from the JRA55 reanalysis over the target area (Tohoku region in Japan), whereby a two-dimensional lattice of weather patterns (WPs) classified during the 1977-2013 period is obtained. To compare with the atmospheric data, the long-term wind power generation is reconstructed by using the high-resolution surface observation network AMeDAS (Automated Meteorological Data Acquisition System) in Japan. Our analysis extracts seven typical WPs, which are linked to frequent occurrences of wind ramp events. Probabilistic forecasts of wind power generation and ramps are conducted by using the obtained SOM. The probabilities are derived from multiple SOM lattices based on the matching of output from TIGGE multi-model global forecasts to the WPs on the lattices. Since this method effectively accounts for the empirical uncertainties in the historical data, wind power generation and ramps are probabilistically forecast from the output of the global models. The forecasts of wind power generation and ramp events show relatively good skill scores under the downscaling technique. The results of this study are expected to provide better guidance to the user community and to contribute to the future development of a system operation model for the transmission grid operator.
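
    The SOM clustering step at the heart of this approach can be sketched compactly; the lattice size, learning schedule and synthetic pressure fields below are illustrative choices, not those of the study.

      # Minimal self-organizing map: cluster daily "pressure fields" into a
      # small lattice of weather patterns (synthetic data).
      import numpy as np

      rng = np.random.default_rng(2)
      days, grid = 1000, 50                  # 1000 days, 50-point field
      data = rng.normal(size=(days, grid))

      rows, cols = 3, 3                      # 3x3 lattice of weather patterns
      weights = rng.normal(size=(rows * cols, grid))
      coords = np.array([(r, c) for r in range(rows) for c in range(cols)])

      for epoch in range(20):
          lr = 0.5 * (1 - epoch / 20)            # decaying learning rate
          sigma = 0.5 + 1.5 * (1 - epoch / 20)   # shrinking neighbourhood
          for x in data:
              bmu = np.argmin(((weights - x) ** 2).sum(axis=1))  # best-matching unit
              d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
              h = np.exp(-d2 / (2 * sigma ** 2))
              weights += lr * h[:, None] * (x - weights)

      # Each day is then attributed to its nearest reference pattern:
      wp = np.argmin(((data[:, None, :] - weights[None]) ** 2).sum(-1), axis=1)
      print(np.bincount(wp, minlength=rows * cols))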

  11. Comparison and validation of shallow landslides susceptibility maps generated by bi-variate and multi-variate linear probabilistic GIS-based techniques. A case study from Ribeira Quente Valley (S. Miguel Island, Azores)

    Science.gov (United States)

    Marques, R.; Amaral, P.; Zêzere, J. L.; Queiroz, G.; Goulart, C.

    2009-04-01

    Slope instability research and susceptibility mapping is a fundamental component of hazard assessment and is of extreme importance for risk mitigation, land-use management and emergency planning. Landslide susceptibility zonation has been actively pursued during the last two decades and several methodologies are still being improved. Among all the methods presented in the literature, indirect quantitative probabilistic methods have been extensively used. In this work different linear probabilistic methods, both bi-variate and multi-variate (Informative Value, Fuzzy Logic, Weights of Evidence and Logistic Regression), were used for the computation of the spatial probability of landslide occurrence, using the pixel as mapping unit. The methods used are based on linear relationships between landslides and 9 considered conditioning factors (altimetry, slope angle, aspect, curvature, distance to streams, wetness index, contribution area, lithology and land-use). It was assumed that future landslides will be conditioned by the same factors as past landslides in the study area. The presented work was developed for Ribeira Quente Valley (S. Miguel Island, Azores), a study area of 9.5 km2, mainly composed of volcanic deposits (ash and pumice lapilli) produced by explosive eruptions in Furnas Volcano. These materials, together with the steepness of the slopes (38.9% of the area has slope angles higher than 35°, reaching a maximum of 87.5°), make the area very prone to landslide activity. A total of 1,495 shallow landslides were mapped (at 1:5,000 scale) and included in a GIS database. The total affected area is 401,744 m2 (4.5% of the study area). Most slope movements are translational slides frequently evolving into debris-flows. The landslides are elongated, with maximum length generally equivalent to the slope extent, and their width normally does not exceed 25 m. The failure depth rarely exceeds 1.5 m and the volume is usually smaller than 700 m3. For modelling
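
    Of the bi-variate methods listed, the Informative Value method is the simplest to sketch: each class of a conditioning factor scores the natural log of the ratio between its landslide density and the overall landslide density, and per-pixel scores are summed over factors. The counts below are invented for illustration.

      # Informative Value scores for the classes of one conditioning factor.
      import math

      total_pixels, landslide_pixels = 1_000_000, 45_000
      classes = {          # class -> (pixels in class, landslide pixels in class)
          "slope_0_15":  (400_000,  4_000),
          "slope_15_35": (350_000, 16_000),
          "slope_35_up": (250_000, 25_000),
      }
      overall_density = landslide_pixels / total_pixels
      for name, (n, n_ls) in classes.items():
          iv = math.log((n_ls / n) / overall_density)
          print(f"{name}: IV = {iv:+.2f}")   # positive IV = landslide-prone class
      # The susceptibility of a pixel is the sum of the IV scores of the
      # classes it falls in, one per conditioning factor.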

  12. Probabilistic mapping of descriptive health status responses onto health state utilities using Bayesian networks: an empirical analysis converting SF-12 into EQ-5D utility index in a national US sample.

    Science.gov (United States)

    Le, Quang A; Doctor, Jason N

    2011-05-01

    As quality-adjusted life years have become the standard metric in health economic evaluations, mapping health-profile or disease-specific measures onto preference-based measures to obtain quality-adjusted life years has become a solution when health utilities are not directly available. However, current mapping methods are limited due to their predictive validity, reliability, and/or other methodological issues. We employ probability theory together with a graphical model, called a Bayesian network, to convert health-profile measures into preference-based measures and to compare the results to those estimated with current mapping methods. A sample of 19,678 adults who completed both the 12-item Short Form Health Survey (SF-12v2) and EuroQoL 5D (EQ-5D) questionnaires from the 2003 Medical Expenditure Panel Survey was split into training and validation sets. Bayesian networks were constructed to explore the probabilistic relationships between each EQ-5D domain and 12 items of the SF-12v2. The EQ-5D utility scores were estimated on the basis of the predicted probability of each response level of the 5 EQ-5D domains obtained from the Bayesian inference process using the following methods: Monte Carlo simulation, expected utility, and most-likely probability. Results were then compared with current mapping methods including multinomial logistic regression, ordinary least squares, and censored least absolute deviations. The Bayesian networks consistently outperformed other mapping models in the overall sample (mean absolute error=0.077, mean square error=0.013, and R overall=0.802), in different age groups, number of chronic conditions, and ranges of the EQ-5D index. Bayesian networks provide a new robust and natural approach to map health status responses into health utility measures for health economic evaluations.
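
    The expected-utility step reported here can be sketched as follows, assuming the Bayesian network has already produced response-level probabilities for each EQ-5D domain; the decrement values are placeholders, not the actual EQ-5D tariff.

      # Expected EQ-5D utility from per-domain response-level probabilities.
      domains = ["mobility", "self_care", "usual_activities", "pain", "anxiety"]
      # P(level 1 / 2 / 3) per domain for one respondent (hypothetical output
      # of the Bayesian inference step):
      probs = {d: (0.7, 0.2, 0.1) for d in domains}
      decrements = {d: (0.0, 0.07, 0.20) for d in domains}  # illustrative only

      utility = 1.0
      for d in domains:
          utility -= sum(p * dec for p, dec in zip(probs[d], decrements[d]))
      print(f"expected EQ-5D utility: {utility:.3f}")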

  13. Probabilistic metric spaces

    CERN Document Server

    Schweizer, B

    2005-01-01

    Topics include special classes of probabilistic metric spaces, topologies, and several related structures, such as probabilistic normed and inner-product spaces. 1983 edition, updated with 3 new appendixes. Includes 17 illustrations.

  14. Some fixed point theorems for weakly compatible mappings in Non-Archimedean Menger probabilistic metric spaces via common limit range property

    Directory of Open Access Journals (Sweden)

    Sunny Chauhan

    2013-11-01

    Full Text Available In this paper, we utilize the notion of common limit range property in Non-Archimedean Menger PM-spaces and prove some fixed point theorems for two pairs of weakly compatible mappings. Some illustrative examples are furnished to support our results. As an application to our main result, we present a common fixed point theorem for four finite families of self mappings. Our results improve and extend several known results existing in the literature.

  15. Learning Probabilistic Logic Models from Probabilistic Examples.

    Science.gov (United States)

    Chen, Jianzhong; Muggleton, Stephen; Santos, José

    2008-10-01

    We revisit an application developed originally using abductive Inductive Logic Programming (ILP) for modeling inhibition in metabolic networks. The example data was derived from studies of the effects of toxins on rats using Nuclear Magnetic Resonance (NMR) time-trace analysis of their biofluids together with background knowledge representing a subset of the Kyoto Encyclopedia of Genes and Genomes (KEGG). We now apply two Probabilistic ILP (PILP) approaches - abductive Stochastic Logic Programs (SLPs) and PRogramming In Statistical modeling (PRISM) to the application. Both approaches support abductive learning and probability predictions. Abductive SLPs are a PILP framework that provides possible worlds semantics to SLPs through abduction. Instead of learning logic models from non-probabilistic examples as done in ILP, the PILP approach applied in this paper is based on a general technique for introducing probability labels within a standard scientific experimental setting involving control and treated data. Our results demonstrate that the PILP approach provides a way of learning probabilistic logic models from probabilistic examples, and the PILP models learned from probabilistic examples lead to a significant decrease in error accompanied by improved insight from the learned results compared with the PILP models learned from non-probabilistic examples.

  16. A Geometric Presentation of Probabilistic Satisfiability

    OpenAIRE

    Morales-Luna, Guillermo

    2010-01-01

    By considering probability distributions over the set of assignments, the expected truth values of propositional variables are extended through linear operators, and the expected truth values of the clauses in any given conjunctive form are likewise extended through linear maps. The probabilistic satisfiability problems are discussed in terms of the introduced linear extensions. The case of multiple truth values is also discussed.
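
    A worked instance of the linear extension just described: under a probability distribution over the four assignments of two variables, the expected truth value of a clause is simply the probability mass on its satisfying assignments.

      # Expected truth value of the clause (x OR y) under a distribution
      # over assignments.
      dist = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}
      clause = lambda x, y: 1 if (x or y) else 0

      expected = sum(p * clause(*a) for a, p in dist.items())
      print("E[x OR y] =", expected)   # 0.2 + 0.3 + 0.4 = 0.9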

  17. Probabilistic Logical Characterization

    DEFF Research Database (Denmark)

    Hermanns, Holger; Parma, Augusto; Segala, Roberto

    2011-01-01

    Probabilistic automata exhibit both probabilistic and non-deterministic choice. They are therefore a powerful semantic foundation for modeling concurrent systems with random phenomena arising in many applications ranging from artificial intelligence, security, systems biology to performance...... modeling. Several variations of bisimulation and simulation relations have proved to be useful as means to abstract and compare different automata. This paper develops a taxonomy of logical characterizations of these relations on image-finite and image-infinite probabilistic automata....

  18. Conditional Probabilistic Population Forecasting

    OpenAIRE

    Sanderson, W.C.; Scherbov, S.; O'Neill, B.C.; Lutz, W.

    2003-01-01

    Since policy makers often prefer to think in terms of scenarios, the question has arisen as to whether it is possible to make conditional population forecasts in a probabilistic context. This paper shows that it is both possible and useful to make these forecasts. We do this with two different kinds of examples. The first is the probabilistic analog of deterministic scenario analysis. Conditional probabilistic scenario analysis is essential for policy makers because it allows them to answer "what if"...

  19. Conditional probabilistic population forecasting

    OpenAIRE

    Sanderson, Warren; Scherbov, Sergei; O'Neill, Brian; Lutz, Wolfgang

    2003-01-01

    Since policy-makers often prefer to think in terms of alternative scenarios, the question has arisen as to whether it is possible to make conditional population forecasts in a probabilistic context. This paper shows that it is both possible and useful to make these forecasts. We do this with two different kinds of examples. The first is the probabilistic analog of deterministic scenario analysis. Conditional probabilistic scenario analysis is essential for policy-makers because it allows them...

  20. Conditional Probabilistic Population Forecasting

    OpenAIRE

    Sanderson, Warren C.; Scherbov, Sergei; O'Neill, Brian C.; Lutz, Wolfgang

    2004-01-01

    Since policy-makers often prefer to think in terms of alternative scenarios, the question has arisen as to whether it is possible to make conditional population forecasts in a probabilistic context. This paper shows that it is both possible and useful to make these forecasts. We do this with two different kinds of examples. The first is the probabilistic analog of deterministic scenario analysis. Conditional probabilistic scenario analysis is essential for policy-makers because...

  1. [Mapping mini-implant anatomic sites in the area of the maxillary first molar with the aid of the NewTom 3G® system].

    Science.gov (United States)

    Dumitrache, Marius; Grenard, Annabelle

    2010-12-01

    The goal of our study was to construct a map of the implant sites in the region of the attached gingiva around the maxillary first molars that would be appropriate locations for placement of miniscrews to serve as orthodontic anchorage. We conducted 58 radiographic examinations with the NewTom 3G(®) cone beam technique. For each interdental space, between upper second bicuspids and first molars (5/6) and between upper first and second molars (6/7), we studied the mesio-distal width and depth of bucco-lingual bone at two different levels, L1 and L2, that corresponded to the lower and upper limits of the attached gingiva in the general population. The widths of the interdental spaces varied very little between L1 and L2 and their variances were comparable. At the level of the 5/6 space, the interdental widths displayed a Gaussian distribution, which made it possible for us to determine the confidence intervals at the two borders of attached gingiva as a function of age: CI(99%) of L1 = [2.045; 3.462] from 12 to 17 years, [1.594; 2.519] from 18 to 24 years, or [1.613; 2.5] from 25 to 48 years, and CI(99%) of L2 = [2.37; 3.69] from 12 to 17 years, [1.5; 2.613] from 18 to 24 years, or [1.546; 2.619] from 25 to 48 years. The interdental depths increased in an apical direction and their variance diminished. Even if the adequacy of the Gaussian law is less reliable in the sagittal plane, we find a greater consistency in depths in the spaces around 5/6 that allows us to establish very precise confidence intervals: CI(99%) of L1 = [9.213; 10.575] and CI(99%) of L2 = [10.295; 11.593]. The mesial areas of the first molars constitute safe zones for implantation of miniscrews, with a maximum width of 2-2.3 mm for 12- to 17-year-olds or 1.5-1.6 mm for 18- to 48-year-olds, and a maximum of 9-10 mm in length, whether the attached gingival level is strong or weak. The distal areas of the first molars, because of their great variability, require an individualized radiographic study before any mini

  2. Anatomical Basis for the Cardiac Interventional Electrophysiologist

    Directory of Open Access Journals (Sweden)

    Damián Sánchez-Quintana

    2015-01-01

    Full Text Available The establishment of radiofrequency catheter ablation techniques as the mainstay in the treatment of tachycardia has renewed interest in cardiac anatomy. The interventional arrhythmologist has drawn attention not only to the gross anatomic details of the heart but also to architectural and histological characteristics of various cardiac regions that are relevant to the development or recurrence of tachyarrhythmias and procedure-related complications of catheter ablation. In this review, therefore, we discuss some anatomic landmarks commonly used in catheter ablations, including the terminal crest, sinus node region, Koch’s triangle, cavotricuspid isthmus, Eustachian ridge and valve, pulmonary venous orifices, venoatrial junctions, and ventricular outflow tracts. We also discuss the anatomical features of important structures in the vicinity of the atria and pulmonary veins, such as the esophagus and phrenic nerves. This paper provides basic anatomic information to improve understanding of the mapping and ablative procedures for cardiac interventional electrophysiologists.

  3. Duplicate Detection in Probabilistic Data

    NARCIS (Netherlands)

    Panse, Fabian; van Keulen, Maurice; de Keijzer, Ander; Ritter, Norbert

    2009-01-01

    Collected data often contains uncertainties. Probabilistic databases have been proposed to manage uncertain data. To combine data from multiple autonomous probabilistic databases, an integration of probabilistic data has to be performed. Until now, however, data integration approaches have focused

  4. Influence of heart rhythm, breathing and arm position during computed tomography scanning on the registration accuracy of electro anatomical map (EAM) images, left atrium three-dimensional computed tomography angiography images, and fluoroscopy time during ablation to treat atrial fibrillation

    International Nuclear Information System (INIS)

    Chono, Taiki; Shimoshige, Shinya; Yoshikawa, Kenta; Mizonobe, Kazuhusa; Ogura, Keishi

    2013-01-01

    In CARTOMERGE for treatment of atrial fibrillation (AF) by ablation, integrating electro anatomical map (EAM) and left atrium three-dimensional computed tomography angiography (3D-CTA) images simplifies identification of the ablation points and allows the procedure to be carried out more rapidly. However, the influence that heart rhythm, breathing and arm position during CT scanning have on registration accuracy and fluoroscopy time is not clear. The aim of this study was to clarify that influence. Each patient was CT-scanned during both sinus rhythm (SR) and AF. We evaluated the registration accuracy of images reconstructed across the cardiac cycle and assessed the registration accuracy and fluoroscopy time of images obtained during inspiratory breath-hold, expiratory breath-hold, and up and down positions of the arm. Although the registration accuracy of the EAM image and left atrium 3D-CTA image showed a significant difference during SR, no significant difference was seen during AF. Expiratory breath-hold and the down position of the arm resulted in the highest registration accuracy and the shortest fluoroscopy time. However, arm position had no significant effect on registration accuracy. Heart rhythm and breathing during CT scanning have a significant effect on the registration accuracy of EAM and left atrium 3D-CTA images and on fluoroscopy time. (author)

  5. Standardized anatomic space for abdominal fat quantification

    Science.gov (United States)

    Tong, Yubing; Udupa, Jayaram K.; Torigian, Drew A.

    2014-03-01

    The ability to accurately measure subcutaneous adipose tissue (SAT) and visceral adipose tissue (VAT) from images is important for improved assessment and management of patients with various conditions such as obesity, diabetes mellitus, obstructive sleep apnea, cardiovascular disease, kidney disease, and degenerative disease. Although imaging and analysis methods to measure the volume of these tissue components have been developed [1, 2], in clinical practice, an estimate of the amount of fat is obtained from just one transverse abdominal CT slice typically acquired at the level of the L4-L5 vertebrae for various reasons including decreased radiation exposure and cost [3-5]. It is generally assumed that such an estimate reliably depicts the burden of fat in the body. This paper sets out to answer two questions related to this issue which have not been addressed in the literature. How does one ensure that the slices used for correlation calculation from different subjects are at the same anatomic location? At what anatomic location do the volumes of SAT and VAT correlate maximally with the corresponding single-slice area measures? To answer these questions, we propose two approaches for slice localization: linear mapping and non-linear mapping which is a novel learning based strategy for mapping slice locations to a standardized anatomic space so that same anatomic slice locations are identified in different subjects. We then study the volume-to-area correlations and determine where they become maximal. We demonstrate on 50 abdominal CT data sets that this mapping achieves significantly improved consistency of anatomic localization compared to current practice. Our results also indicate that maximum correlations are achieved at different anatomic locations for SAT and VAT which are both different from the L4-L5 junction commonly utilized.

  6. Scalable group level probabilistic sparse factor analysis

    DEFF Research Database (Denmark)

    Hinrich, Jesper Løve; Nielsen, Søren Føns Vind; Riis, Nicolai Andre Brogaard

    2017-01-01

    Many data-driven approaches exist to extract neural representations of functional magnetic resonance imaging (fMRI) data, but most of them lack a proper probabilistic formulation. We propose a scalable group level probabilistic sparse factor analysis (psFA) allowing spatially sparse maps, component...... pruning using automatic relevance determination (ARD) and subject specific heteroscedastic spatial noise modeling. For task-based and resting state fMRI, we show that the sparsity constraint gives rise to components similar to those obtained by group independent component analysis. The noise modeling...... shows that noise is reduced in areas typically associated with activation by the experimental design. The psFA model identifies sparse components and the probabilistic setting provides a natural way to handle parameter uncertainties. The variational Bayesian framework easily extends to more complex...

  7. Probabilistic Structural Analysis Program

    Science.gov (United States)

    Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.

    2010-01-01

    NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and life-prediction methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.
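
    A generic Monte Carlo probability-of-failure computation, the quantity that NESSUS targets far more efficiently with methods such as the advanced mean value method and adaptive importance sampling, can be sketched as follows; the limit state and distributions are illustrative, not taken from the program.

      # Monte Carlo probability of failure for a simple limit state
      # g = strength - stress, with uncertain material strength and load.
      import numpy as np

      rng = np.random.default_rng(3)
      n = 1_000_000
      strength = rng.normal(300.0, 20.0, n)                            # MPa
      stress = rng.lognormal(mean=np.log(200.0), sigma=0.15, size=n)   # MPa

      g = strength - stress          # failure when g < 0
      pf = np.mean(g < 0)
      print(f"probability of failure ~ {pf:.2e}")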

  8. Probabilistic programmable quantum processors

    International Nuclear Information System (INIS)

    Buzek, V.; Ziman, M.; Hillery, M.

    2004-01-01

    We analyze how to improve the performance of probabilistic programmable quantum processors. We show how the probability of success of the probabilistic processor can be enhanced by using the processor in loops. In addition, we show that arbitrary SU(2) transformations of qubits can be encoded in the program state of a universal programmable probabilistic quantum processor. The probability of success of this processor can be enhanced by a systematic correction of errors via conditional loops. Finally, we show that all our results can be generalized also for qudits. (Abstract Copyright [2004], Wiley Periodicals, Inc.)

  9. Discrimination of dementia with Lewy bodies from Alzheimer's disease using voxel-based morphometry of white matter by statistical parametric mapping 8 plus diffeomorphic anatomic registration through exponentiated Lie algebra.

    Science.gov (United States)

    Nakatsuka, Tomoya; Imabayashi, Etsuko; Matsuda, Hiroshi; Sakakibara, Ryuji; Inaoka, Tsutomu; Terada, Hitoshi

    2013-05-01

    The purpose of this study was to identify brain atrophy specific for dementia with Lewy bodies (DLB) and to evaluate the discriminatory performance of this specific atrophy between DLB and Alzheimer's disease (AD). We retrospectively reviewed 60 DLB and 30 AD patients who had undergone 3D T1-weighted MRI. We randomly divided the DLB patients into two equal groups (A and B). First, we obtained a target volume of interest (VOI) for DLB-specific atrophy using correlation analysis of the percentage rate of significant whole white matter (WM) atrophy calculated using the Voxel-based Specific Regional Analysis System for Alzheimer's Disease (VSRAD) based on statistical parametric mapping 8 (SPM8) plus diffeomorphic anatomic registration through exponentiated Lie algebra, with segmented WM images in group A. We then evaluated the usefulness of this target VOI for discriminating the remaining 30 DLB patients in group B from the 30 AD patients. Z score values in this target VOI obtained from VSRAD were used as the determinant in receiver operating characteristic (ROC) analysis. Specific target VOIs for DLB were determined in the right-side dominant dorsal midbrain, right-side dominant dorsal pons, and bilateral cerebellum. ROC analysis revealed that the target VOI limited to the midbrain exhibited the highest area under the ROC curves of 0.75. DLB patients showed specific atrophy in the midbrain, pons, and cerebellum. Midbrain atrophy demonstrated the highest power for discriminating DLB and AD. This approach may be useful for determining the contributions of DLB and AD pathologies to the dementia syndrome.

  10. Probabilistic Infinite Secret Sharing

    OpenAIRE

    Csirmaz, László

    2013-01-01

    The study of probabilistic secret sharing schemes using arbitrary probability spaces and possibly infinite number of participants lets us investigate abstract properties of such schemes. It highlights important properties, explains why certain definitions work better than others, connects this topic to other branches of mathematics, and might yield new design paradigms. A probabilistic secret sharing scheme is a joint probability distribution of the shares and the secret together with a colle...

  11. Probabilistic Programming (Invited Talk)

    OpenAIRE

    Yang, Hongseok

    2017-01-01

    Probabilistic programming refers to the idea of using standard programming constructs for specifying probabilistic models from machine learning and statistics, and employing generic inference algorithms for answering various queries on these models, such as posterior inference and estimation of model evidence. Although this idea itself is not new and was, in fact, explored by several programming-language and statistics researchers in the early 2000, it is only in the last few years that proba...

  12. Probabilistic flood extent estimates from social media flood observations

    NARCIS (Netherlands)

    Brouwer, Tom; Eilander, Dirk; Van Loenen, Arnejan; Booij, Martijn J.; Wijnberg, Kathelijne M.; Verkade, Jan S.; Wagemaker, Jurjen

    2017-01-01

    The increasing number and severity of floods, driven by phenomena such as urbanization, deforestation, subsidence and climate change, create a growing need for accurate and timely flood maps. In this paper we present and evaluate a method to create deterministic and probabilistic flood maps from

  13. Cytoarchitecture, probability maps and functions of the human frontal pole.

    Science.gov (United States)

    Bludau, S; Eickhoff, S B; Mohlberg, H; Caspers, S; Laird, A R; Fox, P T; Schleicher, A; Zilles, K; Amunts, K

    2014-06-01

    The frontal pole has expanded more than any other part of the human brain compared to our ancestors. It plays an important role in specifically human behavior and cognitive abilities, e.g. action selection (Kovach et al., 2012). Evidence about divergent functions of its medial and lateral part has been provided, both in the healthy brain and in psychiatric disorders. The anatomical correlates of such functional segregation, however, are still unknown due to a lack of stereotaxic, microstructural maps obtained in a representative sample of brains. Here we show that the human frontopolar cortex consists of two cytoarchitectonically and functionally distinct areas: lateral frontopolar area 1 (Fp1) and medial frontopolar area 2 (Fp2). Based on observer-independent mapping in serial, cell-body stained sections of 10 brains, three-dimensional, probabilistic maps of areas Fp1 and Fp2 were created. They show, for each position of the reference space, the probability with which each area was found in a particular voxel. Applying these maps as seed regions for a meta-analysis revealed that Fp1 and Fp2 differentially contribute to functional networks: Fp1 was involved in cognition, working memory and perception, whereas Fp2 was part of brain networks underlying affective processing and social cognition. The present study thus disclosed cortical correlates of a functional segregation of the human frontopolar cortex. The probabilistic maps provide a sound anatomical basis for interpreting neuroimaging data in the living human brain, and open new perspectives for analyzing structure-function relationships in the prefrontal cortex. The new data will also serve as a starting point for further comparative studies between human and non-human primate brains. This allows finding similarities and differences in the organizational principles of the frontal lobe during evolution as the neurobiological basis for our behavior and cognitive abilities. Copyright © 2013 Elsevier Inc. All
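
    Once each brain's binary area label has been registered to the reference space, the probabilistic map reduces to a voxelwise average of the labels, so that a value of 0.7 means the area was present in 7 of 10 brains at that voxel. The sketch below uses random stand-in labels.

      # Probabilistic map as the voxelwise mean of registered binary labels.
      import numpy as np

      rng = np.random.default_rng(4)
      n_brains, shape = 10, (32, 32, 32)
      labels = rng.random((n_brains, *shape)) > 0.8   # stand-in binary labels

      prob_map = labels.mean(axis=0)   # values in {0.0, 0.1, ..., 1.0}
      print("max overlap:", prob_map.max(),
            "| voxels with p >= 0.5:", int((prob_map >= 0.5).sum()))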

  14. Growing hierarchical probabilistic self-organizing graphs.

    Science.gov (United States)

    López-Rubio, Ezequiel; Palomo, Esteban José

    2011-07-01

    Since the introduction of the growing hierarchical self-organizing map, much work has been done on self-organizing neural models with a dynamic structure. These models allow adjusting the layers of the model to the features of the input dataset. Here we propose a new self-organizing model which is based on a probabilistic mixture of multivariate Gaussian components. The learning rule is derived from the stochastic approximation framework, and a probabilistic criterion is used to control the growth of the model. Moreover, the model is able to adapt to the topology of each layer, so that a hierarchy of dynamic graphs is built. This overcomes the limitations of the self-organizing maps with a fixed topology, and gives rise to a faithful visualization method for high-dimensional data.

  15. Probabilistic Extraction Of Vectors In PIV

    Science.gov (United States)

    Humphreys, William M., Jr.

    1994-01-01

    Probabilistic technique for extraction of velocity vectors in particle-image velocimetry (PIV) implemented with much less computation. Double-exposure photograph of particles in flow illuminated by sheet of light provides data on velocity field of flow. Photograph converted into video image then digitized and processed by computer into velocity-field data. Velocity vectors in interrogation region chosen from magnitude and angle histograms constructed from centroid map of region.

  16. Probabilistic record linkage.

    Science.gov (United States)

    Sayers, Adrian; Ben-Shlomo, Yoav; Blom, Ashley W; Steele, Fiona

    2016-06-01

    Studies involving the use of probabilistic record linkage are becoming increasingly common. However, the methods underpinning probabilistic record linkage are not widely taught or understood, and therefore these studies can appear to be a 'black box' research tool. In this article, we aim to describe the process of probabilistic record linkage through a simple exemplar. We first introduce the concept of deterministic linkage and contrast this with probabilistic linkage. We illustrate each step of the process using a simple exemplar and describe the data structure required to perform a probabilistic linkage. We describe the process of calculating and interpreting matched weights and how to convert matched weights into posterior probabilities of a match using Bayes theorem. We conclude this article with a brief discussion of some of the computational demands of record linkage, how you might assess the quality of your linkage algorithm, and how epidemiologists can maximize the value of their record-linked research using robust record linkage methods. © The Author 2015; Published by Oxford University Press on behalf of the International Epidemiological Association.
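
    The weight calculation and Bayes step outlined in the article can be sketched with a Fellegi-Sunter-style example; the m- and u-probabilities and the prior below are hypothetical values chosen for illustration.

      # Probabilistic record linkage: field-wise match weights and the
      # posterior probability of a match via Bayes' theorem.
      import math

      fields = {     # field: (m = P(agree | match), u = P(agree | non-match))
          "surname":    (0.95, 0.01),
          "birth_year": (0.98, 0.05),
          "postcode":   (0.90, 0.02),
      }
      agreement = {"surname": True, "birth_year": True, "postcode": False}

      weight = sum(
          math.log2(m / u) if agreement[f] else math.log2((1 - m) / (1 - u))
          for f, (m, u) in fields.items()
      )
      prior = 1e-4                                  # P(match) before comparison
      odds = (prior / (1 - prior)) * 2 ** weight    # posterior odds
      posterior = odds / (1 + odds)
      print(f"match weight = {weight:.2f}, posterior P(match) = {posterior:.4f}")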

  17. Anatomical curve identification

    Science.gov (United States)

    Bowman, Adrian W.; Katina, Stanislav; Smith, Joanna; Brown, Denise

    2015-01-01

    Methods for capturing images in three dimensions are now widely available, with stereo-photogrammetry and laser scanning being two common approaches. In anatomical studies, a number of landmarks are usually identified manually from each of these images and these form the basis of subsequent statistical analysis. However, landmarks express only a very small proportion of the information available from the images. Anatomically defined curves have the advantage of providing a much richer expression of shape. This is explored in the context of identifying the boundary of breasts from an image of the female torso and the boundary of the lips from a facial image. The curves of interest are characterised by ridges or valleys. Key issues in estimation are the ability to navigate across the anatomical surface in three-dimensions, the ability to recognise the relevant boundary and the need to assess the evidence for the presence of the surface feature of interest. The first issue is addressed by the use of principal curves, as an extension of principal components, the second by suitable assessment of curvature and the third by change-point detection. P-spline smoothing is used as an integral part of the methods but adaptations are made to the specific anatomical features of interest. After estimation of the boundary curves, the intermediate surfaces of the anatomical feature of interest can be characterised by surface interpolation. This allows shape variation to be explored using standard methods such as principal components. These tools are applied to a collection of images of women where one breast has been reconstructed after mastectomy and where interest lies in shape differences between the reconstructed and unreconstructed breasts. They are also applied to a collection of lip images where possible differences in shape between males and females are of interest. PMID:26041943

  18. Construction and evaluation of quantitative small-animal PET probabilistic atlases for [¹⁸F]FDG and [¹⁸F]FECT functional mapping of the mouse brain.

    Directory of Open Access Journals (Sweden)

    Cindy Casteels

    Full Text Available UNLABELLED: Automated voxel-based or pre-defined volume-of-interest (VOI) analysis of small-animal PET data in mice is necessary for optimal information usage as the number of available resolution elements is limited. We have mapped metabolic ([18F]FDG) and dopamine transporter ([18F]FECT) small-animal PET data onto a 3D Magnetic Resonance Microscopy (MRM) mouse brain template and aligned them in space to the Paxinos co-ordinate system. In this way, ligand-specific templates for sensitive analysis and accurate anatomical localization were created. Next, using a pre-defined VOI approach, test-retest and intersubject variability of various quantification methods were evaluated. Also, the feasibility of mouse brain statistical parametric mapping (SPM) was explored for [18F]FDG and [18F]FECT imaging of 6-hydroxydopamine-lesioned (6-OHDA) mice. METHODS: Twenty-three adult C57BL6 mice were scanned with [18F]FDG and [18F]FECT. Registrations and affine spatial normalizations were performed using SPM8. [18F]FDG data were quantified using (1) an image-derived input function obtained from the liver (cMRglc), (2) standardized uptake values (SUVglc) corrected for blood glucose levels, and (3) normalizing counts to the whole-brain uptake. Parametric [18F]FECT binding images were constructed by reference to the cerebellum. Registration accuracy was determined using random simulated misalignments and vectorial mismatch determination. RESULTS: Registration accuracy was between 0.21-1.11 mm. Regional intersubject variabilities of cMRglc ranged from 15.4% to 19.2%, while test-retest values were between 5.0% and 13.0%. For [18F]FECT uptake in the caudate-putamen, these values were 13.0% and 10.3%, respectively. Regional values of cMRglc positively correlated to SUVglc measured within the 45-60 min time frame (Spearman r = 0.71). Next, SPM analysis of 6-OHDA-lesioned mice showed hypometabolism in the bilateral caudate-putamen and cerebellum, and an
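
    A sketch of a glucose-corrected standardized uptake value of the kind referred to as SUVglc, under the common assumption that tissue uptake is scaled by blood glucose relative to a reference level; the function name, reference level and all numbers are illustrative, not taken from the paper.

      # Glucose-corrected SUV (illustrative formula and values).
      def suv_glc(roi_kbq_per_ml, injected_dose_mbq, body_weight_g,
                  blood_glucose_mmol_l, reference_glucose_mmol_l=5.5):
          """SUV = tissue activity / (injected dose per gram of body weight),
          scaled by blood glucose relative to a reference level (assumption)."""
          suv = roi_kbq_per_ml / (injected_dose_mbq * 1000.0 / body_weight_g)
          return suv * (blood_glucose_mmol_l / reference_glucose_mmol_l)

      print(suv_glc(roi_kbq_per_ml=350.0, injected_dose_mbq=7.4,
                    body_weight_g=25.0, blood_glucose_mmol_l=8.0))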

  19. Formalizing Probabilistic Safety Claims

    Science.gov (United States)

    Herencia-Zapana, Heber; Hagen, George E.; Narkawicz, Anthony J.

    2011-01-01

    A safety claim for a system is a statement that the system, which is subject to hazardous conditions, satisfies a given set of properties. Following work by John Rushby and Bev Littlewood, this paper presents a mathematical framework that can be used to state and formally prove probabilistic safety claims. It also enables hazardous conditions, their uncertainties, and their interactions to be integrated into the safety claim. This framework provides a formal description of the probabilistic composition of an arbitrary number of hazardous conditions and their effects on system behavior. An example is given of a probabilistic safety claim for a conflict detection algorithm for aircraft in a 2D airspace. The motivation for developing this mathematical framework is that it can be used in an automated theorem prover to formally verify safety claims.

  20. Probabilistic Mu-Calculus

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand; Mardare, Radu Iulian; Xue, Bingtian

    2016-01-01

    We introduce a version of the probabilistic µ-calculus (PMC) built on top of a probabilistic modal logic that allows encoding n-ary inequational conditions on transition probabilities. PMC extends previously studied calculi and we prove that, despite its expressiveness, it enjoys a series of good...... metaproperties. Firstly, we prove the decidability of satisfiability checking by establishing the small model property. An algorithm for deciding the satisfiability problem is developed. As a second major result, we provide a complete axiomatization for the alternation-free fragment of PMC. The completeness proof...

  1. Probabilistic conditional independence structures

    CERN Document Server

    Studeny, Milan

    2005-01-01

    Probabilistic Conditional Independence Structures provides the mathematical description of probabilistic conditional independence structures; the author uses non-graphical methods of their description, and takes an algebraic approach.The monograph presents the methods of structural imsets and supermodular functions, and deals with independence implication and equivalence of structural imsets.Motivation, mathematical foundations and areas of application are included, and a rough overview of graphical methods is also given.In particular, the author has been careful to use suitable terminology, and presents the work so that it will be understood by both statisticians, and by researchers in artificial intelligence.The necessary elementary mathematical notions are recalled in an appendix.

  2. Probabilistic approach to mechanisms

    CERN Document Server

    Sandler, BZ

    1984-01-01

    This book discusses the application of probabilistics to the investigation of mechanical systems. The book shows, for example, how random function theory can be applied directly to the investigation of random processes in the deflection of cam profiles, pitch or gear teeth, pressure in pipes, etc. The author also deals with some other technical applications of probabilistic theory, including, amongst others, those relating to pneumatic and hydraulic mechanisms and roller bearings. Many of the aspects are illustrated by examples of applications of the techniques under discussion.

  3. Probabilistic systems coalgebraically: A survey

    Science.gov (United States)

    Sokolova, Ana

    2011-01-01

    We survey the work on both discrete and continuous-space probabilistic systems as coalgebras, starting with how probabilistic systems are modeled as coalgebras and followed by a discussion of their bisimilarity and behavioral equivalence, mentioning results that follow from the coalgebraic treatment of probabilistic systems. It is interesting to note that, for different reasons, for both discrete and continuous probabilistic systems it may be more convenient to work with behavioral equivalence than with bisimilarity. PMID:21998490

  4. Early fetal anatomical sonography.

    LENUS (Irish Health Repository)

    Donnelly, Jennifer C

    2012-10-01

    Over the past decade, prenatal screening and diagnosis has moved from the second into the first trimester, with aneuploidy screening becoming both feasible and effective. With vast improvements in ultrasound technology, sonologists can now image the fetus in greater detail at all gestational ages. In the hands of experienced sonographers, anatomic surveys between 11 and 14 weeks can be carried out with good visualisation rates of many structures. It is important to be familiar with the normal development of the embryo and fetus, and to be aware of the major anatomical landmarks whose absence or presence may be deemed normal or abnormal depending on the gestational age. Some structural abnormalities will nearly always be detected, some will never be and some are potentially detectable depending on a number of factors.

  5. Confluence reduction for probabilistic systems

    NARCIS (Netherlands)

    Timmer, Mark; van de Pol, Jan Cornelis; Stoelinga, Mariëlle Ida Antoinette

    In this presentation we introduce a novel technique for state space reduction of probabilistic specifications, based on a newly developed notion of confluence for probabilistic automata. We proved that this reduction preserves branching probabilistic bisimulation and can be applied on-the-fly.

  6. Probabilistic thread algebra

    NARCIS (Netherlands)

    Bergstra, J.A.; Middelburg, C.A.

    2015-01-01

    We add probabilistic features to basic thread algebra and its extensions with thread-service interaction and strategic interleaving. Here, threads represent the behaviours produced by instruction sequences under execution and services represent the behaviours exhibited by the components of execution

  7. Probabilistic simple sticker systems

    Science.gov (United States)

    Selvarajoo, Mathuri; Heng, Fong Wan; Sarmin, Nor Haniza; Turaev, Sherzod

    2017-04-01

    A model for DNA computing using the recombination behavior of DNA molecules, known as a sticker system, was introduced by L. Kari, G. Paun, G. Rozenberg, A. Salomaa, and S. Yu in the paper entitled "DNA computing, sticker systems and universality" (Acta Informatica, vol. 35, pp. 401-420, 1998). A sticker system uses the Watson-Crick complementarity of DNA molecules: starting from incomplete double-stranded sequences, sticking operations are applied iteratively until a complete double-stranded sequence is obtained. It is known that sticker systems with finite sets of axioms and sticker rules generate only regular languages. Hence, different types of restrictions have been considered to increase the computational power of sticker systems. Recently, a variant of restricted sticker systems, called probabilistic sticker systems, was introduced [4]. In this variant, probabilities are initially associated with the axioms, and the probability of a generated string is computed by multiplying the probabilities of all occurrences of the initial strings in the computation of the string. Strings are selected for the language according to probabilistic requirements. In this paper, we study fundamental properties of probabilistic simple sticker systems. We prove that the probabilistic enhancement increases the computational power of simple sticker systems.
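
    A minimal sketch of the probability computation just described, in Python (the axiom set, its probabilities, and the cut-point are invented for illustration): the probability of a generated string is the product of the probabilities of all axiom occurrences used in its derivation.

      from math import prod

      # Hypothetical axioms (incomplete double strands) with associated probabilities.
      axiom_prob = {"AT": 0.5, "GC": 0.3, "TA": 0.2}

      def derivation_probability(derivation):
          """Probability of a generated string: the product of the probabilities
          of all occurrences of the initial strings used in its computation."""
          return prod(axiom_prob[a] for a in derivation)

      # A derivation that uses axiom "AT" twice and "GC" once.
      p = derivation_probability(["AT", "GC", "AT"])   # 0.5 * 0.3 * 0.5 = 0.075

      # Strings are selected for the language according to a probabilistic
      # requirement, e.g. a cut-point (the threshold here is an assumption).
      in_language = p > 0.05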

  8. Visualizing Probabilistic Proof

    OpenAIRE

    Guerra-Pujol, Enrique

    2015-01-01

    The author revisits the Blue Bus Problem, a famous thought-experiment in law involving probabilistic proof, and presents simple Bayesian solutions to different versions of the blue bus hypothetical. In addition, the author expresses his solutions in standard and visual formats, i.e. in terms of probabilities and natural frequencies.
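
    As a worked illustration of the kind of Bayesian solution the author discusses (the 80/20 bus split and the 75%-reliable witness below are invented numbers, not the paper's), the same update can be expressed in both probability and natural-frequency formats:

      # Prior: fraction of buses in town run by the Blue Bus company (assumed).
      p_blue = 0.8
      # Witness reliability: P(says "blue" | blue) = P(says "red" | red) (assumed).
      acc = 0.75

      # Probability format: Bayes' rule for P(blue | witness says "blue").
      p_says_blue = acc * p_blue + (1 - acc) * (1 - p_blue)
      posterior = acc * p_blue / p_says_blue
      print(round(posterior, 3))  # 0.923

      # Natural-frequency format: out of 1000 accidents, 800 involve blue buses;
      # 600 of those are correctly called blue, while 50 red-bus cases are
      # wrongly called blue.
      print(600 / (600 + 50))     # the same 0.923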

  9. Memristive Probabilistic Computing

    KAUST Repository

    Alahmadi, Hamzah

    2017-10-01

    In the era of the Internet of Things and Big Data, unconventional techniques are rising to accommodate the large size of data and the resource constraints. New computing structures are advancing based on non-volatile memory technologies and different processing paradigms. Additionally, the intrinsic resiliency of current applications leads to the development of creative techniques in computation. In those applications, approximate computing provides a perfect fit to optimize energy efficiency while compromising on accuracy. In this work, we build probabilistic adders based on stochastic memristors. The probabilistic adders are analyzed with respect to the stochastic behavior of the underlying memristors. Multiple adder implementations are investigated and compared. The memristive probabilistic adder provides a different approach from the typical approximate CMOS adders. Furthermore, it allows for high area savings and design flexibility between performance and power saving. To reach a performance level similar to approximate CMOS adders, the memristive adder achieves 60% power saving. An image-compression application is investigated using the memristive probabilistic adders, illustrating the performance and energy trade-off.
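
    A rough behavioral sketch of the idea (not the thesis's device model): each memristive gate is treated as a Boolean gate whose output flips with an assumed stochastic switching probability, and Monte Carlo simulation estimates how often a ripple-carry adder built from such gates is exact.

      import random

      P_FLIP = 0.01  # assumed per-gate stochastic switching error

      def g(value):
          """Gate output that flips with probability P_FLIP."""
          return value ^ (random.random() < P_FLIP)

      def full_adder(a, b, cin):
          s = g(a ^ b ^ cin)
          cout = g((a & b) | (cin & (a ^ b)))
          return s, cout

      def add(x, y, bits=8):
          carry, out = 0, 0
          for i in range(bits):
              s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
              out |= s << i
          return out

      trials = 10_000
      ok = sum(add(a, b) == (a + b) % 256
               for a, b in ((random.randrange(256), random.randrange(256))
                            for _ in range(trials)))
      print(ok / trials)  # fraction of exact 8-bit additions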

  10. Probabilistic Load Flow

    DEFF Research Database (Denmark)

    Chen, Peiyuan; Chen, Zhe; Bak-Jensen, Birgitte

    2008-01-01

    This paper reviews the development of the probabilistic load flow (PLF) techniques. Applications of the PLF techniques in different areas of power system steady-state analysis are also discussed. The purpose of the review is to identify different available PLF techniques and their corresponding...

  11. Transitive probabilistic CLIR models.

    NARCIS (Netherlands)

    Kraaij, W.; de Jong, Franciska M.G.

    2004-01-01

    Transitive translation could be a useful technique to enlarge the number of supported language pairs for a cross-language information retrieval (CLIR) system in a cost-effective manner. The paper describes several setups for transitive translation based on probabilistic translation models.

  12. Anatomical imaging for radiotherapy

    International Nuclear Information System (INIS)

    Evans, Philip M

    2008-01-01

    The goal of radiation therapy is to achieve maximal therapeutic benefit, expressed in terms of a high probability of local control of disease with minimal side effects. Physically this often equates to the delivery of a high dose of radiation to the tumour or target region whilst maintaining an acceptably low dose to other tissues, particularly those adjacent to the target. Techniques such as intensity modulated radiotherapy (IMRT), stereotactic radiosurgery and computer planned brachytherapy provide the means to calculate the radiation dose delivery to achieve the desired dose distribution. Imaging is an essential tool in all state-of-the-art planning and delivery techniques: (i) to enable planning of the desired treatment, (ii) to verify that the treatment is delivered as planned and (iii) to follow up treatment outcome to monitor that the treatment has had the desired effect. Clinical imaging techniques can be loosely classified into anatomic methods, which measure basic physical characteristics of tissue such as density, and biological imaging techniques, which measure functional characteristics such as metabolism. In this review we consider anatomical imaging techniques; biological imaging is considered in another article. Anatomical imaging is generally used for goals (i) and (ii) above. Computed tomography (CT) has been the mainstay of anatomical treatment planning for many years, enabling some delineation of soft tissue as well as radiation attenuation estimation for dose prediction. Magnetic resonance imaging is fast becoming widespread alongside CT, enabling superior soft-tissue visualization. Traditionally, scanning for treatment planning has relied on a single snapshot scan. Recent years have seen the development of techniques such as 4D CT and adaptive radiotherapy (ART). In 4D CT, raw data are encoded with phase information and reconstructed to yield a set of scans detailing motion through the breathing, or cardiac, cycle. In ART, a set of...

  13. Benchmarking Academic Anatomic Pathologists

    Directory of Open Access Journals (Sweden)

    Barbara S. Ducatman MD

    2016-10-01

    The most common benchmarks for faculty productivity are derived from Medical Group Management Association (MGMA) or Vizient-AAMC Faculty Practice Solutions Center® (FPSC) databases. The Association of Pathology Chairs has also collected similar survey data for several years. We examined the Association of Pathology Chairs annual faculty productivity data and compared it with MGMA and FPSC data to understand the value, inherent flaws, and limitations of benchmarking data. We hypothesized that the variability in calculated faculty productivity is due to the type of practice model and clinical effort allocation. Data from the Association of Pathology Chairs survey on 629 surgical pathologists and/or anatomic pathologists from 51 programs were analyzed. From review of service assignments, we were able to assign each pathologist to a specific practice model: general anatomic pathology/surgical pathology, 1 or more subspecialties, or a hybrid of the 2 models. There were statistically significant differences among academic ranks and practice types. When we analyzed our data using each organization's methods, the median results for the general anatomic pathology/surgical pathology practice model were quite close to the MGMA and FPSC results for anatomic and/or surgical pathology. Both MGMA and FPSC data exclude a significant proportion of academic pathologists with clinical duties. We used the more inclusive FPSC definition of clinical "full-time faculty" (0.60 clinical full-time equivalent and above). The correlation between clinical full-time equivalent effort allocation, annual days on service, and annual work relative value unit productivity was poor. This study demonstrates that effort allocations are variable across academic departments of pathology and do not correlate well with either work relative value unit effort or reported days on service. Although the Association of Pathology Chairs-reported median work relative...

  14. Probabilistic assessment of faults

    International Nuclear Information System (INIS)

    Foden, R.W.

    1987-01-01

    Probabilistic safety analysis (PSA) is the process by which the probability (or frequency of occurrence) of reactor fault conditions which could lead to unacceptable consequences is assessed. The basic objective of a PSA is to allow a judgement to be made as to whether or not the principal probabilistic requirement is satisfied. It also gives insights into the reliability of the plant which can be used to identify possible improvements. This is explained in the article. The scope of a PSA and the PSA performed by the National Nuclear Corporation (NNC) for the Heysham II and Torness AGRs and Sizewell-B PWR are discussed. The NNC methods for hazards, common cause failure and operator error are mentioned. (UK)

  15. Recent advances in standards for collaborative Digital Anatomic Pathology

    Science.gov (United States)

    2011-01-01

    Context: Collaborative Digital Anatomic Pathology refers to the use of information technology to support the creation and sharing or exchange of information, including data and images, during the complex workflow performed in an Anatomic Pathology department from specimen reception to report transmission and exploitation. Collaborative Digital Anatomic Pathology can only be fully achieved using medical informatics standards. The goal of the international Integrating the Healthcare Enterprise (IHE) initiative is precisely to specify how medical informatics standards should be implemented to meet specific health care needs, making systems integration more efficient and less expensive. Objective: To define the best use of medical informatics standards in order to share and exchange machine-readable structured reports and their evidence (including whole slide images) within hospitals and across healthcare facilities. Methods: Working groups dedicated to Anatomic Pathology within multiple standards organizations defined standards-based data structures for Anatomic Pathology reports and images, as well as informatics transactions, in order to integrate Anatomic Pathology information into the electronic healthcare enterprise. Results: DICOM supplements 122 and 145 provide flexible object information definitions dedicated, respectively, to specimen description and to whole slide image acquisition, storage and display. The content profile "Anatomic Pathology Structured Report" (APSR) provides standard templates for structured reports in which textual observations may be bound to digital images or regions of interest. Anatomic Pathology observations are encoded using an international controlled vocabulary defined by the IHE Anatomic Pathology domain that is currently being mapped to SNOMED CT concepts. Conclusion: Recent advances in standards for Collaborative Digital Anatomic Pathology are a unique opportunity to share or exchange Anatomic Pathology structured reports...

  16. Probabilistic Model Development

    Science.gov (United States)

    Adam, James H., Jr.

    2010-01-01

    Objective: Develop a probabilistic model for the solar energetic particle environment, and a tool to provide a reference solar particle radiation environment that (1) will not be exceeded at a user-specified confidence level and (2) will provide reference environments for (a) peak flux, (b) event-integrated fluence, and (c) mission-integrated fluence. The reference environments will consist of elemental energy spectra for protons, helium and heavier ions.

  17. Geothermal probabilistic cost study

    Energy Technology Data Exchange (ETDEWEB)

    Orren, L.H.; Ziman, G.M.; Jones, S.C.; Lee, T.K.; Noll, R.; Wilde, L.; Sadanand, V.

    1981-08-01

    A tool is presented to quantify the risks of geothermal projects, the Geothermal Probabilistic Cost Model (GPCM). The GPCM model is used to evaluate a geothermal reservoir for a binary-cycle electric plant at Heber, California. Three institutional aspects of the geothermal risk which can shift the risk among different agents are analyzed. The leasing of geothermal land, contracting between the producer and the user of the geothermal heat, and insurance against faulty performance are examined. (MHR)

  18. Probabilistic approaches to recommendations

    CERN Document Server

    Barbieri, Nicola; Ritacco, Ettore

    2014-01-01

    The importance of accurate recommender systems has been widely recognized by academia and industry, and recommendation is rapidly becoming one of the most successful applications of data mining and machine learning. Understanding and predicting the choices and preferences of users is a challenging task: real-world scenarios involve users behaving in complex situations, where prior beliefs, specific tendencies, and reciprocal influences jointly contribute to determining the preferences of users toward huge amounts of information, services, and products. Probabilistic modeling represents a robust...

  19. Probabilistic finite elements

    Science.gov (United States)

    Belytschko, Ted; Wing, Kam Liu

    1987-01-01

    In the Probabilistic Finite Element Method (PFEM), finite element methods have been efficiently combined with second-order perturbation techniques to provide an effective method for informing the designer of the range of response which is likely in a given problem. The designer must provide as input the statistical character of the input variables, such as yield strength, load magnitude, and Young's modulus, by specifying their mean values and their variances. The output then consists of the mean response and the variance in the response. Thus the designer is given a much broader picture of the predicted performance than with a single response curve alone. These methods are applicable to a wide class of problems, provided that the scale of randomness is not too large and the probability density functions possess decaying tails. By incorporating the computational techniques we have developed in the past 3 years for efficiency, the probabilistic finite element methods are capable of handling large systems with many sources of uncertainty. Sample results are given for an elastic-plastic ten-bar structure and an elastic-plastic plane continuum with a circular hole subject to cyclic loadings, with the yield stress modeled as a random field.
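
    A small sketch of the second-moment idea in Python (the response function below is a stand-in for a finite element solve; independent inputs and first-order perturbation are assumed): given mean values and variances of the inputs, the method returns the mean response and the variance in the response.

      import numpy as np

      def response(x):
          # Illustrative response function standing in for an FE analysis.
          load, E, sy = x
          return load / (E * sy)

      mean = np.array([1000.0, 200e9, 250e6])   # load, Young's modulus, yield strength
      var = np.array([100.0**2, (10e9)**2, (20e6)**2])

      # First-order perturbation: mu_R ~ R(mu); var_R ~ sum_i (dR/dx_i)^2 var_i.
      h = 1e-6 * mean
      grad = np.array([(response(mean + h[i] * np.eye(3)[i]) -
                        response(mean - h[i] * np.eye(3)[i])) / (2 * h[i])
                       for i in range(3)])
      mu_R = response(mean)
      var_R = np.sum(grad**2 * var)
      print(mu_R, np.sqrt(var_R))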

  20. Occipital neuralgia: anatomic considerations.

    Science.gov (United States)

    Cesmebasi, Alper; Muhleman, Mitchel A; Hulsberg, Paul; Gielecki, Jerzy; Matusz, Petru; Tubbs, R Shane; Loukas, Marios

    2015-01-01

    Occipital neuralgia is a debilitating disorder first described in 1821 as recurrent headaches localized in the occipital region. Other symptoms that have been associated with this condition include paroxysmal burning and aching pain in the distribution of the greater, lesser, or third occipital nerves. Several etiologies have been identified as causes of occipital neuralgia, including, but not limited to, trauma, fibrositis, myositis, fracture of the atlas, compression of the C-2 nerve root, C1-2 arthrosis syndrome, atlantoaxial lateral mass osteoarthritis, hypertrophic cervical pachymeningitis, cervical cord tumor, Chiari malformation, and neurosyphilis. The management of occipital neuralgia can include conservative approaches and/or surgical interventions. Occipital neuralgia is a multifactorial problem in which multiple anatomic areas/structures may be involved. A review of these etiologies may provide guidance in better understanding occipital neuralgia. © 2014 Wiley Periodicals, Inc.

  1. Some probabilistic aspects of fracture

    International Nuclear Information System (INIS)

    Thomas, J.M.

    1982-01-01

    Some probabilistic aspects of fracture in structural and mechanical components are examined. The principles of fracture mechanics, material quality and inspection uncertainty are formulated into a conceptual and analytical framework for prediction of failure probability. The role of probabilistic fracture mechanics in a more global context of risk and optimization of decisions is illustrated. An example, where Monte Carlo simulation was used to implement a probabilistic fracture mechanics analysis, is discussed. (orig.)
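
    One way such a Monte Carlo implementation can be set up (the distributions and the stress-intensity model K_I = Y * sigma * sqrt(pi * a) are illustrative assumptions, not the article's data):

      import numpy as np

      rng = np.random.default_rng(0)
      N = 1_000_000
      Y = 1.12                                 # geometry factor (assumed)
      sigma = rng.normal(200.0, 20.0, N)       # applied stress, MPa
      a = rng.lognormal(np.log(2e-3), 0.4, N)  # crack depth, m (inspection uncertainty)
      K_Ic = rng.normal(60.0, 6.0, N)          # fracture toughness, MPa*sqrt(m)

      K_I = Y * sigma * np.sqrt(np.pi * a)     # stress intensity factor
      print(np.mean(K_I >= K_Ic))              # estimated failure probability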

  2. Probabilistic Graph Layout for Uncertain Network Visualization.

    Science.gov (United States)

    Schulz, Christoph; Nocaj, Arlind; Goertler, Jochen; Deussen, Oliver; Brandes, Ulrik; Weiskopf, Daniel

    2017-01-01

    We present a novel uncertain network visualization technique based on node-link diagrams. Nodes expand spatially in our probabilistic graph layout, depending on the underlying probability distributions of edges. The visualization is created by computing a two-dimensional graph embedding that combines samples from the probabilistic graph. A Monte Carlo process is used to decompose a probabilistic graph into its possible instances and to continue with our graph layout technique. Splatting and edge bundling are used to visualize point clouds and network topology. The results provide insights into probability distributions for the entire network, not only for individual nodes and edges. We validate our approach using three data sets that represent a wide range of network types: synthetic data, protein-protein interactions from the STRING database, and travel times extracted from Google Maps. Our approach reveals general limitations of the force-directed layout and allows the user to recognize that some nodes of the graph are at a specific position just by chance.
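
    A condensed sketch of the Monte Carlo step (not the authors' code; the edge probabilities and the use of networkx are assumptions): a probabilistic graph is decomposed into sampled instances, each instance is laid out with a force-directed method, and the per-node point clouds across samples convey positional uncertainty.

      import random
      import networkx as nx

      # Probabilistic graph: edges with independent existence probabilities.
      edges = {("a", "b"): 0.9, ("b", "c"): 0.5, ("a", "c"): 0.2, ("c", "d"): 0.8}
      nodes = {"a", "b", "c", "d"}

      clouds = {v: [] for v in nodes}
      for seed in range(100):                   # Monte Carlo decomposition
          g = nx.Graph()
          g.add_nodes_from(nodes)
          g.add_edges_from(e for e, p in edges.items() if random.random() < p)
          pos = nx.spring_layout(g, seed=seed)  # layout of this possible instance
          for v, xy in pos.items():
              clouds[v].append(tuple(xy))

      # clouds[v] is a point cloud whose spread reflects v's positional uncertainty.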

  3. Next-generation probabilistic seismicity forecasting

    Energy Technology Data Exchange (ETDEWEB)

    Hiemer, S.

    2014-07-01

    The development of probabilistic seismicity forecasts is one of the most important tasks of seismologists at the present time. Such forecasts form the basis of probabilistic seismic hazard assessment, a widely used approach to generate ground motion exceedance maps. These hazard maps guide the development of building codes, and in the absence of the ability to deterministically predict earthquakes, good building and infrastructure planning is key to preventing catastrophes. Probabilistic seismicity forecasts are models that specify the occurrence rate of earthquakes as a function of space, time and magnitude. The models presented in this thesis are time-invariant mainshock occurrence models. Accordingly, reliable estimation of the spatial and size distribution of seismicity is of crucial importance when constructing such probabilistic forecasts. We thereby focus on data-driven approaches to infer these distributions, circumventing the need for arbitrarily chosen external parameters and subjective expert decisions. Kernel estimation has been shown to appropriately transform discrete earthquake locations into spatially continuous probability distributions. However, we show that neglecting the information from fault networks constitutes a considerable shortcoming and thus limits the skill of current seismicity models. We present a novel earthquake rate forecast that applies the kernel-smoothing method to both past earthquake locations and slip rates on mapped crustal faults, applied to Californian and European data. Our model is independent of biases caused by commonly used non-objective seismic zonations, which impose artificial borders of activity that are not expected in nature. Studying the spatial variability of the seismicity size distribution is of great importance. The b-value of the well-established empirical Gutenberg-Richter model forecasts the rates of hazard-relevant large earthquakes based on the observed rates of abundant small events. We propose a...
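
    A toy Python version of the two ingredients named above, with made-up parameters: Gaussian kernel smoothing turns discrete epicenters into a spatially continuous rate density, and the Gutenberg-Richter relation log10 N(>=M) = a - b*M extrapolates the rate of large events from the abundant small ones.

      import numpy as np

      epicenters = np.array([[0.0, 0.0], [1.0, 0.2], [0.9, 0.1]])  # toy catalog
      bw = 0.5                                                     # kernel bandwidth (assumed)

      def rate_density(xy):
          """Continuous rate density from Gaussian kernel smoothing of epicenters."""
          d2 = np.sum((epicenters - xy) ** 2, axis=1)
          return np.sum(np.exp(-d2 / (2 * bw**2))) / (2 * np.pi * bw**2)

      a_val, b_val = 3.0, 1.0               # assumed Gutenberg-Richter a- and b-values
      def annual_rate(M):
          return 10 ** (a_val - b_val * M)  # rate of events with magnitude >= M

      print(rate_density(np.array([0.5, 0.1])), annual_rate(6.0))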

  4. Next-generation probabilistic seismicity forecasting

    International Nuclear Information System (INIS)

    Hiemer, S.

    2014-01-01

    The development of probabilistic seismicity forecasts is one of the most important tasks of seismologists at the present time. Such forecasts form the basis of probabilistic seismic hazard assessment, a widely used approach to generate ground motion exceedance maps. These hazard maps guide the development of building codes, and in the absence of the ability to deterministically predict earthquakes, good building and infrastructure planning is key to preventing catastrophes. Probabilistic seismicity forecasts are models that specify the occurrence rate of earthquakes as a function of space, time and magnitude. The models presented in this thesis are time-invariant mainshock occurrence models. Accordingly, reliable estimation of the spatial and size distribution of seismicity is of crucial importance when constructing such probabilistic forecasts. We thereby focus on data-driven approaches to infer these distributions, circumventing the need for arbitrarily chosen external parameters and subjective expert decisions. Kernel estimation has been shown to appropriately transform discrete earthquake locations into spatially continuous probability distributions. However, we show that neglecting the information from fault networks constitutes a considerable shortcoming and thus limits the skill of current seismicity models. We present a novel earthquake rate forecast that applies the kernel-smoothing method to both past earthquake locations and slip rates on mapped crustal faults, applied to Californian and European data. Our model is independent of biases caused by commonly used non-objective seismic zonations, which impose artificial borders of activity that are not expected in nature. Studying the spatial variability of the seismicity size distribution is of great importance. The b-value of the well-established empirical Gutenberg-Richter model forecasts the rates of hazard-relevant large earthquakes based on the observed rates of abundant small events. We propose a...

  5. Probabilistic safety assessment

    International Nuclear Information System (INIS)

    Hoertner, H.; Schuetz, B.

    1982-09-01

    For the purpose of assessing the applicability and informativeness of risk-analysis methods in licencing procedures under atomic law, the choice of instruments for probabilistic analysis, the problems encountered and experience gained in their application, and the discussion of safety goals with respect to such instruments are of paramount significance. Naturally, such a complex field can only be dealt with step by step, making contributions relative to specific problems. The report at hand presents the essentials of a 'stocktaking' of systems reliability studies in the licencing procedure under atomic law and of an American report (NUREG-0739) on 'Quantitative Safety Goals'. (orig.) [de]

  6. Probabilistic methods for physics

    International Nuclear Information System (INIS)

    Cirier, G

    2013-01-01

    We present an asymptotic method giving the probability of presence of the iterated spots of R^d under a polynomial function f. We use the well-known Perron-Frobenius (PF) operator, which leaves certain sets and measures invariant under f. Probabilistic solutions can exist for the deterministic iteration. While the theoretical result is already known, here we quantify these probabilities. This approach seems useful for computing in situations where deterministic methods fail. Among the examined applications are asymptotic solutions of the Lorenz, Navier-Stokes and Hamilton equations. In this approach, linearity induces many difficult problems, not all of which we have yet resolved.

  7. Quantum probability for probabilists

    CERN Document Server

    Meyer, Paul-André

    1993-01-01

    In recent years, the classical theory of stochastic integration and stochastic differential equations has been extended to a non-commutative set-up to develop models for quantum noises. The author, a specialist in classical stochastic calculus and martingale theory, tries to provide an introduction to this rapidly expanding field in a way which should be accessible to probabilists familiar with the Ito integral. It can also, on the other hand, provide a means of access to the methods of stochastic calculus for physicists familiar with Fock space analysis.

  8. Integration of Probabilistic Exposure Assessment and Probabilistic Hazard Characterization

    NARCIS (Netherlands)

    Voet, van der H.; Slob, W.

    2007-01-01

    A method is proposed for integrated probabilistic risk assessment where exposure assessment and hazard characterization are both included in a probabilistic way. The aim is to specify the probability that a random individual from a defined (sub)population will have an exposure high enough to cause...
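
    A minimal Monte Carlo reading of that integration (all distributions below are invented for illustration): sample an individual exposure and an individual critical-effect dose, and estimate the probability that the former exceeds the latter.

      import numpy as np

      rng = np.random.default_rng(0)
      N = 1_000_000
      exposure = rng.lognormal(mean=0.0, sigma=1.0, size=N)   # individual exposure
      threshold = rng.lognormal(mean=2.0, sigma=0.5, size=N)  # individual critical-effect dose

      # Probability that a random individual's exposure is high enough for an effect.
      print(np.mean(exposure > threshold))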

  9. Quantification of 18F-FDG PET images using probabilistic brain atlas: clinical application in temporal lobe epilepsy patients

    International Nuclear Information System (INIS)

    Kang, Keon Wook; Lee, Dong Soo; Cho, Jae Hoon; Lee, Jae Sung; Yeo, Jeong Seok; Lee, Sang Gun; Chung, June Key; Lee, Myung Chul

    2000-01-01

    A probabilistic atlas of the human brain (Statistical Probability Anatomical Maps: SPAM) was developed by the International Consortium for Brain Mapping (ICBM). After calculating the counts in volumes of interest (VOI) using the product of the probability in SPAM images and the counts in FDG images, asymmetry indexes (AI) were calculated and used for finding epileptogenic zones in temporal lobe epilepsy (TLE). FDG PET images from 28 surgically confirmed TLE patients and 12 age-matched controls were spatially normalized to the averaged brain MRI atlas of the ICBM. The counts from the normalized PET images were multiplied with the probability of 12 VOIs (superior temporal gyrus, middle temporal gyrus, inferior temporal gyrus, hippocampus, parahippocampal gyrus, and amygdala in each hemisphere) from the SPAM images of the Montreal Neurological Institute. Finally, AI was calculated for each pair of VOIs and compared with visual assessment. If AI deviated by more than 2 standard deviations from normal controls, the epileptogenic zone was considered to have been found successfully. The counts of VOIs in normal controls were symmetric (AI <6%, paired t-test p>0.05) except those of the inferior temporal gyrus (p<0.01). AIs in 5 pairs of VOIs excluding the inferior temporal gyrus were deviated to one side in TLE (p<0.05). Lateralization was correct in 23 of 28 patients by AI, but all of the 28 were consistent with visual inspection. In 3 patients with normal AI, metabolism was symmetric on visual inspection. In 2 patients falsely lateralized by AI, metabolism was also visually decreased on the contralateral side. The asymmetry index obtained from the product of the statistical probability anatomical map and FDG PET correlated well with visual assessment in TLE patients. SPAM is useful for quantification of VOIs in functional images.
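
    A schematic of the quantification step in Python (array contents are random stand-ins, and the AI definition used here, 200 * (L - R) / (L + R) in percent, is a common convention assumed rather than quoted from the paper):

      import numpy as np

      def voi_counts(pet, spam_prob):
          """Probability-weighted counts: sum over voxels of SPAM probability x PET counts."""
          return float(np.sum(pet * spam_prob))

      def asymmetry_index(left, right):
          return 200.0 * (left - right) / (left + right)  # percent (assumed definition)

      rng = np.random.default_rng(1)
      pet = rng.poisson(50, (91, 109, 91)).astype(float)  # stand-in normalized FDG image
      spam_left = rng.uniform(0, 1, pet.shape)            # stand-in SPAM probability map
      spam_right = rng.uniform(0, 1, pet.shape)

      ai = asymmetry_index(voi_counts(pet, spam_left), voi_counts(pet, spam_right))
      ctrl_mean, ctrl_sd = 0.0, 3.0                       # assumed control statistics
      lateralized = abs(ai - ctrl_mean) > 2 * ctrl_sd     # the paper's 2-SD criterion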

  10. Quantification of 18F-FDG PET images using probabilistic brain atlas: clinical application in temporal lobe epilepsy patients

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Keon Wook; Lee, Dong Soo; Cho, Jae Hoon; Lee, Jae Sung; Yeo, Jeong Seok; Lee, Sang Gun; Chung, June Key; Lee, Myung Chul [Seoul National Univ., Seoul (Korea, Republic of)

    2000-07-01

    A probabilistic atlas of the human brain (Statistical Probability Anatomical Maps: SPAM) was developed by the International Consortium for Brain Mapping (ICBM). After calculating the counts in volumes of interest (VOI) using the product of the probability in SPAM images and the counts in FDG images, asymmetry indexes (AI) were calculated and used for finding epileptogenic zones in temporal lobe epilepsy (TLE). FDG PET images from 28 surgically confirmed TLE patients and 12 age-matched controls were spatially normalized to the averaged brain MRI atlas of the ICBM. The counts from the normalized PET images were multiplied with the probability of 12 VOIs (superior temporal gyrus, middle temporal gyrus, inferior temporal gyrus, hippocampus, parahippocampal gyrus, and amygdala in each hemisphere) from the SPAM images of the Montreal Neurological Institute. Finally, AI was calculated for each pair of VOIs and compared with visual assessment. If AI deviated by more than 2 standard deviations from normal controls, the epileptogenic zone was considered to have been found successfully. The counts of VOIs in normal controls were symmetric (AI <6%, paired t-test p>0.05) except those of the inferior temporal gyrus (p<0.01). AIs in 5 pairs of VOIs excluding the inferior temporal gyrus were deviated to one side in TLE (p<0.05). Lateralization was correct in 23 of 28 patients by AI, but all of the 28 were consistent with visual inspection. In 3 patients with normal AI, metabolism was symmetric on visual inspection. In 2 patients falsely lateralized by AI, metabolism was also visually decreased on the contralateral side. The asymmetry index obtained from the product of the statistical probability anatomical map and FDG PET correlated well with visual assessment in TLE patients. SPAM is useful for quantification of VOIs in functional images.

  11. Probabilistic Structural Analysis of SSME Turbopump Blades: Probabilistic Geometry Effects

    Science.gov (United States)

    Nagpal, V. K.

    1985-01-01

    A probabilistic study was initiated to evaluate the effects of tolerances in the geometric and material properties on the structural response of turbopump blades. To complete this study, a number of important probabilistic variables were identified which are conceived to affect the structural response of the blade. In addition, a methodology was developed to statistically quantify the influence of these probabilistic variables in an optimized way. The identified variables include random geometric and material property perturbations, different loadings and a probabilistic combination of these loadings. The influences of these probabilistic variables are to be quantified by evaluating the blade structural response. Studies of the geometric perturbations were conducted for a flat plate geometry as well as for a space shuttle main engine blade geometry using a special purpose code based on the finite element approach. Analyses indicate that the variances of the perturbations about given mean values have a significant influence on the response.

  12. Comparison of probabilistic and deterministic fiber tracking of cranial nerves.

    Science.gov (United States)

    Zolal, Amir; Sobottka, Stephan B; Podlesek, Dino; Linn, Jennifer; Rieger, Bernhard; Juratli, Tareq A; Schackert, Gabriele; Kitzler, Hagen H

    2017-09-01

    OBJECTIVE: The depiction of cranial nerves (CNs) using diffusion tensor imaging (DTI) is of great interest in skull base tumor surgery, and DTI used with deterministic tracking methods has been reported previously. However, there are still no good methods usable for the elimination of noise from the resulting depictions. The authors hypothesized that probabilistic tracking could lead to more accurate results, because it more efficiently extracts information from the underlying data. Moreover, the authors adapted a previously described technique for noise elimination using gradual threshold increases to probabilistic tracking. To evaluate the utility of this new approach, a comparison is provided in this work between the gradual threshold increase method in probabilistic and deterministic tracking of CNs. METHODS: Both tracking methods were used to depict CNs II, III, V, and the VII+VIII bundle. Depiction of 240 CNs was attempted with each of the above methods in 30 healthy subjects, which were obtained from 2 public databases: the Kirby repository (KR) and the Human Connectome Project (HCP). Elimination of erroneous fibers was attempted by gradually increasing the respective thresholds (fractional anisotropy [FA] and probabilistic index of connectivity [PICo]). The results were compared with predefined ground truth images based on corresponding anatomical scans. Two label overlap measures (false-positive error and Dice similarity coefficient) were used to evaluate the success of both methods in depicting the CNs. Moreover, the differences between these parameters obtained from the KR and HCP (with higher angular resolution) databases were evaluated. Additionally, visualization of 10 CNs in 5 clinical cases was attempted with both methods and evaluated by comparing the depictions with intraoperative findings. RESULTS: Maximum Dice similarity coefficients were significantly higher with probabilistic tracking (p < ...) ... cranial nerves. Probabilistic tracking with a gradual...
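
    The two label overlap measures used above are simple to state; a sketch with numpy binary masks (the ground-truth and tracking masks below are random placeholders):

      import numpy as np

      def dice(seg, truth):
          """Dice similarity coefficient: 2|A and B| / (|A| + |B|)."""
          inter = np.logical_and(seg, truth).sum()
          return 2.0 * inter / (seg.sum() + truth.sum())

      def false_positive_error(seg, truth):
          """Fraction of the depicted voxels lying outside the ground truth."""
          return np.logical_and(seg, ~truth).sum() / seg.sum()

      rng = np.random.default_rng(0)
      truth = rng.random((64, 64, 64)) < 0.05  # placeholder ground-truth nerve mask
      seg = rng.random((64, 64, 64)) < 0.05    # placeholder tracking result
      print(dice(seg, truth), false_positive_error(seg, truth))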

  13. A General Framework for Probabilistic Characterizing Formulae

    DEFF Research Database (Denmark)

    Sack, Joshua; Zhang, Lijun

    2012-01-01

    Recently, a general framework on characteristic formulae was proposed by Aceto et al. It offers a simple theory that allows one to easily obtain characteristic formulae of many non-probabilistic behavioral relations. Our paper studies their techniques in a probabilistic setting. We provide a general method for determining characteristic formulae of behavioral relations for probabilistic automata using fixed-point probability logics. We consider such behavioral relations as simulations and bisimulations, probabilistic bisimulations, probabilistic weak simulations, and probabilistic forward...

  14. Probabilistic pathway construction.

    Science.gov (United States)

    Yousofshahi, Mona; Lee, Kyongbum; Hassoun, Soha

    2011-07-01

    Expression of novel synthesis pathways in host organisms amenable to genetic manipulations has emerged as an attractive metabolic engineering strategy to overproduce natural products, biofuels, biopolymers and other commercially useful metabolites. We present a pathway construction algorithm for identifying viable synthesis pathways compatible with balanced cell growth. Rather than exhaustive exploration, we investigate probabilistic selection of reactions to construct the pathways. Three different selection schemes are investigated for the selection of reactions: high metabolite connectivity, low connectivity and uniformly random. For all case studies, which involved a diverse set of target metabolites, the uniformly random selection scheme resulted in the highest average maximum yield. When compared to an exhaustive search enumerating all possible reaction routes, our probabilistic algorithm returned nearly identical distributions of yields, while requiring far less computing time (minutes vs. years). The pathways identified by our algorithm have previously been confirmed in the literature as viable, high-yield synthesis routes. Prospectively, our algorithm could facilitate the design of novel, non-native synthesis routes by efficiently exploring the diversity of biochemical transformations in nature. Copyright © 2011 Elsevier Inc. All rights reserved.
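
    A stripped-down sketch of the three probabilistic selection schemes (the reaction network and connectivities below are hypothetical; real use would draw candidate reactions from a metabolic database):

      import random

      # Hypothetical reaction -> participating-metabolite map and connectivities.
      reactions = {"r1": {"glc", "g6p"}, "r2": {"g6p", "f6p"}, "r3": {"f6p", "pyr", "atp"}}
      connectivity = {"glc": 12, "g6p": 7, "f6p": 5, "pyr": 20, "atp": 94}

      def weight(rxn, scheme):
          c = sum(connectivity[m] for m in reactions[rxn])
          if scheme == "high":   # favor reactions touching highly connected metabolites
              return c
          if scheme == "low":    # favor sparsely connected metabolites
              return 1.0 / c
          return 1.0             # uniformly random

      def pick_reaction(candidates, scheme="uniform"):
          w = [weight(r, scheme) for r in candidates]
          return random.choices(candidates, weights=w, k=1)[0]

      print(pick_reaction(list(reactions), scheme="low"))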

  15. Probabilistic risk assessment methodology

    International Nuclear Information System (INIS)

    Shinaishin, M.A.

    1988-06-01

    The objective of this work is to provide the tools necessary for clear identification of: the purpose of a probabilistic risk study, the bounds and depth of the study, the proper modeling techniques to be used, the failure modes contributing to the analysis, the classical and Bayesian approaches for manipulating the data necessary for quantification, ways of treating uncertainties, and available computer codes that may be used in performing such probabilistic analyses. In addition, it provides the means for measuring the importance of a safety feature to maintaining a level of risk at a nuclear power plant and the worth of optimizing a safety system in risk reduction. In applying these techniques so that they accommodate our national resources and needs, it was felt that emphasis should be put on the system reliability analysis level of PRA. Objectives of such studies could include: comparing the systems designs of the various vendors in the bidding stage, and performing grid reliability and human performance analysis using nation-specific data. (author)

  16. Probabilistic population aging

    Science.gov (United States)

    2017-01-01

    We merge two methodologies, prospective measures of population aging and probabilistic population forecasts. We compare the speed of change and variability in forecasts of the old age dependency ratio and the prospective old age dependency ratio as well as the same comparison for the median age and the prospective median age. While conventional measures of population aging are computed on the basis of the number of years people have already lived, prospective measures are computed also taking account of the expected number of years they have left to live. Those remaining life expectancies change over time and differ from place to place. We compare the probabilistic distributions of the conventional and prospective measures using examples from China, Germany, Iran, and the United States. The changes over time and the variability of the prospective indicators are smaller than those that are observed in the conventional ones. A wide variety of new results emerge from the combination of methodologies. For example, for Germany, Iran, and the United States the likelihood that the prospective median age of the population in 2098 will be lower than it is today is close to 100 percent. PMID:28636675

  17. Probabilistic cellular automata.

    Science.gov (United States)

    Agapie, Alexandru; Andreica, Anca; Giuclea, Marius

    2014-09-01

    Cellular automata are binary lattices used for modeling complex dynamical systems. The automaton evolves iteratively from one configuration to another, using some local transition rule based on the number of ones in the neighborhood of each cell. With respect to the number of cells allowed to change per iteration, we speak of either synchronous or asynchronous automata. If randomness is involved to some degree in the transition rule, we speak of probabilistic automata; otherwise they are called deterministic. With either type of cellular automaton, the main theoretical challenge stays the same: starting from an arbitrary initial configuration, predict (with highest accuracy) the end configuration. If the automaton is deterministic, the outcome simplifies to one of two configurations, all zeros or all ones. If the automaton is probabilistic, the whole process is modeled by a finite homogeneous Markov chain, and the outcome is the corresponding stationary distribution. Based on our previous results for the asynchronous case, connecting the probability of a configuration in the stationary distribution to its number of zero-one borders, the article offers both numerical and theoretical insight into the long-term behavior of synchronous cellular automata.
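
    A compact sketch of a synchronous probabilistic automaton of the kind analyzed above: every cell updates simultaneously, becoming 1 with a probability that depends on the number of ones in its neighborhood (the transition probabilities here are arbitrary choices).

      import random

      # P(cell becomes 1 | ones among {left, self, right}); arbitrary values.
      P_ONE = {0: 0.05, 1: 0.3, 2: 0.7, 3: 0.95}

      def step(config):
          n = len(config)
          # All cells change at once: a synchronous update.
          return [int(random.random() < P_ONE[config[(i - 1) % n] + config[i]
                                              + config[(i + 1) % n]])
                  for i in range(n)]

      config = [random.randint(0, 1) for _ in range(100)]
      for _ in range(1000):              # long-run behavior of the Markov chain
          config = step(config)
      print(sum(config) / len(config))   # fraction of ones after many iterations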

  18. Quantum probabilistic logic programming

    Science.gov (United States)

    Balu, Radhakrishnan

    2015-05-01

    We describe a quantum mechanics based logic programming language that supports Horn clauses, random variables, and covariance matrices to express and solve problems in probabilistic logic. The Horn clauses of the language wrap random variables, including infinite-valued ones, to express probability distributions and statistical correlations, a powerful feature to capture relationships between distributions that are not independent. The expressive power of the language is based on a mechanism to implement statistical ensembles and to solve the underlying SAT instances using quantum mechanical machinery. We exploit the fact that classical random variables have quantum decompositions to build the Horn clauses. We establish the semantics of the language in a rigorous fashion by considering an existing probabilistic logic language called PRISM, with classical probability measures defined on the Herbrand base, and extending it to the quantum context. In the classical case, H-interpretations form the sample space and probability measures defined on them lead to a consistent definition of probabilities for well-formed formulae. In the quantum counterpart, we define probability amplitudes on H-interpretations, facilitating model generation and verification via quantum mechanical superpositions and entanglements. We cast the well-formed formulae of the language as quantum mechanical observables, thus providing an elegant interpretation for their probabilities. We discuss several examples that combine statistical ensembles and predicates of first-order logic to reason about situations involving uncertainty.

  19. Probabilistic risk assessment methodology

    Energy Technology Data Exchange (ETDEWEB)

    Shinaishin, M A

    1988-06-15

    The objective of this work is to provide the tools necessary for clear identification of: the purpose of a probabilistic risk study, the bounds and depth of the study, the proper modeling techniques to be used, the failure modes contributing to the analysis, the classical and Bayesian approaches for manipulating the data necessary for quantification, ways of treating uncertainties, and available computer codes that may be used in performing such probabilistic analyses. In addition, it provides the means for measuring the importance of a safety feature to maintaining a level of risk at a nuclear power plant and the worth of optimizing a safety system in risk reduction. In applying these techniques so that they accommodate our national resources and needs, it was felt that emphasis should be put on the system reliability analysis level of PRA. Objectives of such studies could include: comparing the systems designs of the various vendors in the bidding stage, and performing grid reliability and human performance analysis using nation-specific data. (author)

  20. Topics in Probabilistic Judgment Aggregation

    Science.gov (United States)

    Wang, Guanchun

    2011-01-01

    This dissertation is a compilation of several studies that are united by their relevance to probabilistic judgment aggregation. In the face of complex and uncertain events, panels of judges are frequently consulted to provide probabilistic forecasts, and aggregation of such estimates in groups often yields better results than could have been made…

  1. Probabilistic studies of accident sequences

    International Nuclear Information System (INIS)

    Villemeur, A.; Berger, J.P.

    1986-01-01

    For several years, Electricite de France has carried out probabilistic assessments of accident sequences for nuclear power plants. In the framework of this program, many methods were developed. As interest in these studies increased and as adapted methods were developed, Electricite de France undertook a probabilistic safety assessment of a nuclear power plant. [fr]

  2. Compression of Probabilistic XML documents

    NARCIS (Netherlands)

    Veldman, Irma

    2009-01-01

    Probabilistic XML (PXML) files resulting from data integration can become extremely large, which is undesirable. For XML there are several techniques available to compress the document, and since probabilistic XML is in fact (a special form of) XML, it might benefit from these methods even more.

  3. Probabilistic Structural Analysis Theory Development

    Science.gov (United States)

    Burnside, O. H.

    1985-01-01

    The objective of the Probabilistic Structural Analysis Methods (PSAM) project is to develop analysis techniques and computer programs for predicting the probabilistic response of critical structural components for current and future space propulsion systems. This technology will play a central role in establishing system performance and durability. The first year's technical activity is concentrating on probabilistic finite element formulation strategy and code development. Work is also in progress to survey critical materials and space shuttle main engine components. The probabilistic finite element computer program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) is being developed. The final probabilistic code will have, in the general case, the capability of performing nonlinear dynamic analysis of stochastic structures. It is the goal of the approximate methods effort to increase problem-solving efficiency relative to finite element methods by using energy methods to generate trial solutions which satisfy the structural boundary conditions. These approximate methods will be less computer-intensive than the finite element approach.

  4. Probabilistic escalation modelling

    Energy Technology Data Exchange (ETDEWEB)

    Korneliussen, G.; Eknes, M.L.; Haugen, K.; Selmer-Olsen, S. [Det Norske Veritas, Oslo (Norway)

    1997-12-31

    This paper describes how structural reliability methods may successfully be applied within quantitative risk assessment (QRA) as an alternative to traditional event tree analysis. The emphasis is on fire escalation in hydrocarbon production and processing facilities. This choice was made due to potential improvements over current QRA practice associated with both the probabilistic approach and more detailed modelling of the dynamics of escalating events. The physical phenomena important for the events of interest are explicitly modelled as functions of time. Uncertainties are represented through probability distributions. The uncertainty modelling enables the analysis to be simple when possible and detailed when necessary. The methodology features several advantages compared with traditional risk calculations based on event trees. (Author)

  5. Probabilistic fracture finite elements

    Science.gov (United States)

    Liu, W. K.; Belytschko, T.; Lua, Y. J.

    1991-05-01

    Probabilistic Fracture Mechanics (PFM) is a promising method for estimating the fatigue life and inspection cycles for mechanical and structural components. The Probabilistic Finite Element Method (PFEM), which is based on second-moment analysis, has proved to be a promising, practical approach to handling problems with uncertainties. As the PFEM provides a powerful computational tool to determine the first and second moments of random parameters, the second-moment reliability method can easily be combined with the PFEM to obtain measures of the reliability of a structural system. The method is also being applied to fatigue crack growth. Uncertainties in the material properties of advanced materials such as polycrystalline alloys, ceramics, and composites are commonly observed in experimental tests. This is mainly attributed to intrinsic microcracks, which are randomly distributed as a result of the applied load and the residual stress.

  6. Probabilistic retinal vessel segmentation

    Science.gov (United States)

    Wu, Chang-Hua; Agam, Gady

    2007-03-01

    Optic fundus assessment is widely used for diagnosing vascular and non-vascular pathology. Inspection of the retinal vasculature may reveal hypertension, diabetes, arteriosclerosis, cardiovascular disease and stroke. Due to various imaging conditions retinal images may be degraded. Consequently, the enhancement of such images and vessels in them is an important task with direct clinical applications. We propose a novel technique for vessel enhancement in retinal images that is capable of enhancing vessel junctions in addition to linear vessel segments. This is an extension of vessel filters we have previously developed for vessel enhancement in thoracic CT scans. The proposed approach is based on probabilistic models which can discern vessels and junctions. Evaluation shows the proposed filter is better than several known techniques and is comparable to the state of the art when evaluated on a standard dataset. A ridge-based vessel tracking process is applied on the enhanced image to demonstrate the effectiveness of the enhancement filter.

  7. Probabilistic sensory recoding.

    Science.gov (United States)

    Jazayeri, Mehrdad

    2008-08-01

    A hallmark of higher brain functions is the ability to contemplate the world rather than to respond reflexively to it. To do so, the nervous system makes use of a modular architecture in which sensory representations are dissociated from areas that control actions. This flexibility however necessitates a recoding scheme that would put sensory information to use in the control of behavior. Sensory recoding faces two important challenges. First, recoding must take into account the inherent variability of sensory responses. Second, it must be flexible enough to satisfy the requirements of different perceptual goals. Recent progress in theory, psychophysics, and neurophysiology indicate that cortical circuitry might meet these challenges by evaluating sensory signals probabilistically.

  8. GUI program to compute probabilistic seismic hazard analysis

    International Nuclear Information System (INIS)

    Shin, Jin Soo; Chi, H. C.; Cho, J. C.; Park, J. H.; Kim, K. G.; Im, I. S.

    2005-12-01

    The first stage of development of a program to compute probabilistic seismic hazard has been completed, based on a Graphic User Interface (GUI). The main program consists of three parts: the data input processes, the probabilistic seismic hazard analysis, and the result output processes. The first part has been developed and the others are being developed in this term. The probabilistic seismic hazard analysis needs various input data representing attenuation formulae, a seismic zoning map, and an earthquake event catalog. The input procedure of previous programs, based on a text interface, takes much time to prepare the data, and the data cannot be checked directly on screen to prevent erroneous input. The new program simplifies the input process and enables the data to be checked graphically in order to minimize artificial errors as far as possible.

  9. GUI program to compute probabilistic seismic hazard analysis

    International Nuclear Information System (INIS)

    Shin, Jin Soo; Chi, H. C.; Cho, J. C.; Park, J. H.; Kim, K. G.; Im, I. S.

    2006-12-01

    The development of a program to compute probabilistic seismic hazard has been completed, based on a Graphic User Interface (GUI). The main program consists of three parts: the data input processes, the probabilistic seismic hazard analysis, and the result output processes. The probabilistic seismic hazard analysis needs various input data representing attenuation formulae, a seismic zoning map, and an earthquake event catalog. The input procedure of previous programs, based on a text interface, takes much time to prepare the data, and the data cannot be checked directly on screen to prevent erroneous input. The new program simplifies the input process and enables the data to be checked graphically in order to minimize artificial errors as far as possible.

  10. Probabilistic brains: knowns and unknowns

    Science.gov (United States)

    Pouget, Alexandre; Beck, Jeffrey M; Ma, Wei Ji; Latham, Peter E

    2015-01-01

    There is strong behavioral and physiological evidence that the brain both represents probability distributions and performs probabilistic inference. Computational neuroscientists have started to shed light on how these probabilistic representations and computations might be implemented in neural circuits. One particularly appealing aspect of these theories is their generality: they can be used to model a wide range of tasks, from sensory processing to high-level cognition. To date, however, these theories have only been applied to very simple tasks. Here we discuss the challenges that will emerge as researchers start focusing their efforts on real-life computations, with a focus on probabilistic learning, structural learning and approximate inference. PMID:23955561

  11. Probabilistic Decision Graphs - Combining Verification and AI Techniques for Probabilistic Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2004-01-01

    We adopt probabilistic decision graphs developed in the field of automated verification as a tool for probabilistic model representation and inference. We show that probabilistic inference has linear time complexity in the size of the probabilistic decision graph, that the smallest probabilistic ...

  12. Feature-based morphometry: discovering group-related anatomical patterns.

    Science.gov (United States)

    Toews, Matthew; Wells, William; Collins, D Louis; Arbel, Tal

    2010-02-01

    This paper presents feature-based morphometry (FBM), a new fully data-driven technique for discovering patterns of group-related anatomical structure in volumetric imagery. In contrast to most morphometry methods which assume one-to-one correspondence between subjects, FBM explicitly aims to identify distinctive anatomical patterns that may only be present in subsets of subjects, due to disease or anatomical variability. The image is modeled as a collage of generic, localized image features that need not be present in all subjects. Scale-space theory is applied to analyze image features at the characteristic scale of underlying anatomical structures, instead of at arbitrary scales such as global or voxel-level. A probabilistic model describes features in terms of their appearance, geometry, and relationship to subject groups, and is automatically learned from a set of subject images and group labels. Features resulting from learning correspond to group-related anatomical structures that can potentially be used as image biomarkers of disease or as a basis for computer-aided diagnosis. The relationship between features and groups is quantified by the likelihood of feature occurrence within a specific group vs. the rest of the population, and feature significance is quantified in terms of the false discovery rate. Experiments validate FBM clinically in the analysis of normal (NC) and Alzheimer's (AD) brain images using the freely available OASIS database. FBM automatically identifies known structural differences between NC and AD subjects in a fully data-driven fashion, and an equal error classification rate of 0.80 is achieved for subjects aged 60-80 years exhibiting mild AD (CDR=1). Copyright (c) 2009 Elsevier Inc. All rights reserved.
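
    A sketch of the two quantities named above (the feature occurrence counts are fabricated): a per-feature likelihood ratio of occurrence in the group versus the rest of the population, with feature significance screened by Benjamini-Hochberg control of the false discovery rate.

      import numpy as np
      from scipy.stats import fisher_exact

      # occurrences[i] = (subjects with feature i in AD group, in NC group); fabricated.
      occurrences = [(40, 5), (22, 18), (8, 30), (35, 12)]
      n_ad, n_nc = 50, 50

      pvals, ratios = [], []
      for ad, nc in occurrences:
          ratios.append((ad / n_ad) / max(nc / n_nc, 1e-9))  # group-vs-rest likelihood ratio
          table = [[ad, n_ad - ad], [nc, n_nc - nc]]
          pvals.append(fisher_exact(table)[1])

      # Benjamini-Hochberg: largest k with p_(k) <= (k/m) * q controls FDR at level q.
      q, m = 0.05, len(pvals)
      order = np.argsort(pvals)
      passed = [k for k in range(m) if pvals[order[k]] <= (k + 1) / m * q]
      significant = order[: max(passed) + 1] if passed else []
      print(ratios, significant)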

  13. Probabilistic Open Set Recognition

    Science.gov (United States)

    Jain, Lalit Prithviraj

    Real-world tasks in computer vision, pattern recognition and machine learning often touch upon the open set recognition problem: multi-class recognition with incomplete knowledge of the world and many unknown inputs. An obvious way to approach such problems is to develop a recognition system that thresholds probabilities to reject unknown classes. Traditional rejection techniques are not about the unknown; they are about the uncertain boundary and rejection around that boundary. Thus traditional techniques only represent the "known unknowns". However, a proper open set recognition algorithm is needed to reduce the risk from the "unknown unknowns". This dissertation examines this concept and finds existing probabilistic multi-class recognition approaches ineffective for true open set recognition. We hypothesize the cause is weak ad hoc assumptions combined with the closed-world assumptions made by existing calibration techniques. Intuitively, if we could accurately model just the positive data for any known class without overfitting, we could reject the large set of unknown classes even under this assumption of incomplete class knowledge. For this, we formulate the problem as one of modeling positive training data by invoking statistical extreme value theory (EVT) near the decision boundary of positive data with respect to negative data. We provide a new algorithm called the PI-SVM for estimating the unnormalized posterior probability of class inclusion. This dissertation also introduces a new open set recognition model called Compact Abating Probability (CAP), where the probability of class membership decreases in value (abates) as points move from known data toward open space. We show that CAP models improve open set recognition for multiple algorithms. Leveraging the CAP formulation, we go on to describe the novel Weibull-calibrated SVM (W-SVM) algorithm, which combines the useful properties of statistical EVT for score calibration with one-class and binary SVMs.
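
    A rough sketch of EVT-style score calibration in the spirit of the W-SVM (this is not the dissertation's code; the placeholder scores and tail size are assumptions): a Weibull distribution is fitted to the tail of positive-class decision scores nearest the boundary, and its CDF turns a raw score into a probability of class inclusion that abates toward open space.

      import numpy as np
      from scipy.stats import weibull_min

      rng = np.random.default_rng(0)
      pos_scores = rng.normal(1.5, 0.5, 500)   # decision scores of known positives

      tail = np.sort(pos_scores)[:50]          # lowest positive scores, near the boundary
      shift = tail.min() - 1e-6                # shift so the fitted support starts at 0
      c, loc, scale = weibull_min.fit(tail - shift, floc=0)

      def p_inclusion(score):
          """Estimated probability of class inclusion; decreases (abates) as the
          score moves from known data toward open space."""
          return float(weibull_min.cdf(score - shift, c, loc=loc, scale=scale))

      print(p_inclusion(2.0), p_inclusion(0.2))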

  14. Probabilistic broadcasting of mixed states

    International Nuclear Information System (INIS)

    Li Lvjun; Li Lvzhou; Wu Lihua; Zou Xiangfu; Qiu Daowen

    2009-01-01

    It is well known that the non-broadcasting theorem proved by Barnum et al is a fundamental principle of quantum communication. As far as we are aware, optimal broadcasting (OB) is the only method to broadcast noncommuting mixed states approximately. In this paper, motivated by the probabilistic cloning of quantum states proposed by Duan and Guo, we propose a new way of broadcasting noncommuting mixed states: probabilistic broadcasting (PB), and we present a sufficient condition for PB of mixed states. To a certain extent, we generalize the probabilistic cloning theorem from pure states to mixed states, and in particular, we generalize the non-broadcasting theorem, since the case in which commuting mixed states can be exactly broadcast can be thought of as a special instance of PB where the success ratio is 1. Moreover, we discuss probabilistic local broadcasting (PLB) of separable bipartite states.

  15. Evaluation of Probabilistic Disease Forecasts.

    Science.gov (United States)

    Hughes, Gareth; Burnett, Fiona J

    2017-10-01

    The statistical evaluation of probabilistic disease forecasts often involves calculation of metrics defined conditionally on disease status, such as sensitivity and specificity. However, for the purpose of disease management decision making, metrics defined conditionally on the result of the forecast (predictive values) are also important, although less frequently reported. In this context, the application of scoring rules in the evaluation of probabilistic disease forecasts is discussed. An index of separation with application in the evaluation of probabilistic disease forecasts, described in the clinical literature, is also considered and its relation to scoring rules illustrated. Scoring rules provide a principled basis for the evaluation of probabilistic forecasts used in plant disease management. In particular, the decomposition of scoring rules into interpretable components is an advantageous feature of their application in the evaluation of disease forecasts.
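
    The "decomposition of scoring rules into interpretable components" can be made concrete with the Brier score and its classical Murphy decomposition into reliability, resolution, and uncertainty. A minimal sketch, assuming binary disease outcomes and ten equal-width forecast bins (the binning is our choice; the identity is exact only for bin-averaged forecasts):

    ```python
    import numpy as np

    def brier_decomposition(forecast_prob, outcome, n_bins=10):
        """Brier score ~= reliability - resolution + uncertainty (Murphy)."""
        f = np.asarray(forecast_prob, dtype=float)
        o = np.asarray(outcome, dtype=float)          # 1 = diseased, 0 = healthy
        base_rate = o.mean()
        bins = np.minimum((f * n_bins).astype(int), n_bins - 1)
        reliability = resolution = 0.0
        for k in range(n_bins):
            idx = bins == k
            if not idx.any():
                continue
            w = idx.mean()                            # fraction of cases in bin k
            f_k, o_k = f[idx].mean(), o[idx].mean()
            reliability += w * (f_k - o_k) ** 2       # calibration error
            resolution += w * (o_k - base_rate) ** 2  # discrimination
        uncertainty = base_rate * (1.0 - base_rate)
        brier = np.mean((f - o) ** 2)
        return brier, reliability, resolution, uncertainty
    ```

    A well-calibrated forecaster drives reliability toward zero; a useful one drives resolution up toward the irreducible uncertainty set by the disease base rate.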

  16. 14th International Probabilistic Workshop

    CERN Document Server

    Taerwe, Luc; Proske, Dirk

    2017-01-01

    This book presents the proceedings of the 14th International Probabilistic Workshop that was held in Ghent, Belgium in December 2016. Probabilistic methods are currently of crucial importance for research and developments in the field of engineering, which face challenges presented by new materials and technologies and rapidly changing societal needs and values. Contemporary needs related to, for example, performance-based design, service-life design, life-cycle analysis, product optimization, assessment of existing structures and structural robustness give rise to new developments as well as accurate and practically applicable probabilistic and statistical engineering methods to support these developments. These proceedings are a valuable resource for anyone interested in contemporary developments in the field of probabilistic engineering applications.

  17. Cumulative Dominance and Probabilistic Sophistication

    NARCIS (Netherlands)

    Wakker, P.P.; Sarin, R.H.

    2000-01-01

    Machina & Schmeidler (Econometrica, 60, 1992) gave preference conditions for probabilistic sophistication, i.e. decision making where uncertainty can be expressed in terms of (subjective) probabilities without commitment to expected utility maximization. This note shows that simpler and more general

  18. Probabilistic simulation of fermion paths

    International Nuclear Information System (INIS)

    Zhirov, O.V.

    1989-01-01

    Permutation symmetry of the fermion path integral allows (as long as spin degrees of freedom are ignored) any probabilistic algorithm, such as Metropolis or heat bath, to be used in its simulation. 6 refs., 2 tabs

  19. Probabilistic modeling of timber structures

    DEFF Research Database (Denmark)

    Köhler, Jochen; Sørensen, John Dalsgaard; Faber, Michael Havbro

    2007-01-01

    The present paper contains a proposal for the probabilistic modeling of timber material properties. It is produced in the context of the Probabilistic Model Code (PMC) of the Joint Committee on Structural Safety (JCSS) [Joint Committee of Structural Safety. Probabilistic Model Code, Internet Publication: www.jcss.ethz.ch; 2001] and of the COST action E24 'Reliability of Timber Structures' [COST Action E 24, Reliability of timber structures. Several meetings and Publications, Internet Publication: http://www.km.fgg.uni-lj.si/coste24/coste24.htm; 2005]. The present proposal is based on discussions and comments from participants of the COST E24 action and the members of the JCSS. The paper contains a description of the basic reference properties for timber strength parameters and ultimate limit state equations for timber components. The recommended probabilistic model for these basic properties...

  20. An anatomical and functional topography of human auditory cortical areas

    Directory of Open Access Journals (Sweden)

    Michelle Moerel

    2014-07-01

    While advances in magnetic resonance imaging (MRI) throughout the last decades have enabled the detailed anatomical and functional inspection of the human brain non-invasively, to date there is no consensus regarding the precise subdivision and topography of the areas forming the human auditory cortex. Here, we propose a topography of the human auditory areas based on insights into the anatomical and functional properties of human auditory areas as revealed by studies of cyto- and myelo-architecture and fMRI investigations at ultra-high magnetic field (7 Tesla). Importantly, we illustrate that, whereas a group-based approach to analyzing functional (tonotopic) maps is appropriate for highlighting the main tonotopic axis, the examination of tonotopic maps at the single-subject level is required to detail the topography of primary and non-primary areas that may be more variable across subjects. Furthermore, we show that considering multiple maps indicative of anatomical (i.e., myelination) as well as functional properties (e.g., broadness of frequency tuning) is helpful in identifying auditory cortical areas in individual human brains. We propose and discuss a topography of areas that is consistent with older and recent post mortem anatomical characterizations of the human auditory cortex and that may serve as a working model for neuroscience studies of auditory functions.

  1. Probabilistically modeling lava flows with MOLASSES

    Science.gov (United States)

    Richardson, J. A.; Connor, L.; Connor, C.; Gallant, E.

    2017-12-01

    Modeling lava flows through Cellular Automata methods provides a computationally inexpensive means of quickly forecasting lava flow paths and ultimate areal extents. We have developed a lava flow simulator, MOLASSES, that forecasts lava flow inundation over an elevation model from a point source eruption. This modular code can be implemented in a deterministic fashion with given user inputs that will produce a single lava flow simulation. MOLASSES can also be implemented in a probabilistic fashion where given user inputs define parameter distributions that are randomly sampled to create many lava flow simulations. This probabilistic approach enables uncertainty in input data to be expressed in the model results and MOLASSES outputs a probability map of inundation instead of a determined lava flow extent. Since the code is comparatively fast, we use it probabilistically to investigate where potential vents are located that may impact specific sites and areas, as well as the unconditional probability of lava flow inundation of sites or areas from any vent. We have validated the MOLASSES code against community-defined benchmark tests and against the real-world lava flows at Tolbachik (2012-2013) and Pico do Fogo (2014-2015). To determine the efficacy of the MOLASSES simulator at accurately and precisely mimicking the inundation area of real flows, we report goodness of fit using both model sensitivity and the Positive Predictive Value, the latter of which is a Bayesian posterior statistic. Model sensitivity is often used in evaluating lava flow simulators, as it describes how much of the lava flow was successfully modeled by the simulation. We argue that the positive predictive value is equally important in determining how good a simulator is, as it describes the percentage of the simulation space that was actually inundated by lava.
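
    The probabilistic mode of operation described here (sample inputs, simulate many flows, report per-cell inundation frequency) can be sketched with a deliberately simplified spreading rule. The steepest-descent toy automaton below is our stand-in for illustration and is not the MOLASSES algorithm:

    ```python
    import numpy as np

    def toy_flow(dem, vent, volume, rng):
        """Fill cells downhill from the vent until the volume budget is spent."""
        wet = np.zeros(dem.shape, dtype=bool)
        r, c = vent
        while volume > 0:
            wet[r, c] = True
            volume -= 1.0
            nbrs = [(r + dr, c + dc) for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                    if (dr or dc) and 0 <= r + dr < dem.shape[0]
                    and 0 <= c + dc < dem.shape[1]]
            heights = np.array([dem[n] for n in nbrs])
            lowest = np.flatnonzero(heights == heights.min())
            r, c = nbrs[rng.choice(lowest)]           # random tie-break
        return wet

    rng = np.random.default_rng(0)
    dem = np.add.outer(np.linspace(5, 0, 50), np.linspace(5, 0, 50))  # tilted plane
    hits, n_runs = np.zeros_like(dem), 500
    for _ in range(n_runs):
        vent = (int(rng.integers(0, 5)), int(rng.integers(0, 5)))  # sampled vent
        volume = rng.lognormal(mean=3.0, sigma=0.5)                # sampled volume
        hits += toy_flow(dem, vent, volume, rng)
    prob_inundation = hits / n_runs   # the probability map the abstract describes
    ```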

  2. A Methodology for Probabilistic Accident Management

    International Nuclear Information System (INIS)

    Munteanu, Ion; Aldemir, Tunc

    2003-01-01

    While techniques have been developed to tackle different tasks in accident management, there have been very few attempts to develop an on-line operator assistance tool for accident management and none that can be found in the literature that uses probabilistic arguments, which are important in today's licensing climate. The state/parameter estimation capability of the dynamic system doctor (DSD) approach is combined with the dynamic event-tree generation capability of the integrated safety assessment (ISA) methodology to address this issue. The DSD uses the cell-to-cell mapping technique for system representation that models the system evolution in terms of probability of transitions in time between sets of user-defined parameter/state variable magnitude intervals (cells) within a user-specified time interval (e.g., data sampling interval). The cell-to-cell transition probabilities are obtained from the given system model. The ISA follows the system dynamics in tree form and branches every time a setpoint for system/operator intervention is exceeded. The combined approach (a) can automatically account for uncertainties in the monitored system state, inputs, and modeling uncertainties through the appropriate choice of the cells, as well as providing a probabilistic measure to rank the likelihood of possible system states in view of these uncertainties; (b) allows flexibility in system representation; (c) yields the lower and upper bounds on the estimated values of state variables/parameters as well as their expected values; and (d) leads to fewer branchings in the dynamic event-tree generation. Using a simple but realistic pressurizer model, the potential use of the DSD-ISA methodology for on-line probabilistic accident management is illustrated
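
    The cell-to-cell mapping at the core of the DSD can be sketched for a single state variable: discretize its range into cells and estimate one-step transition probabilities by sampling the system model. The relaxation dynamics and the uniform within-cell sampling below are our assumptions, standing in for the pressurizer model of the paper:

    ```python
    import numpy as np

    def build_transition_matrix(step, cell_edges, n_samples=200, dt=1.0, rng=None):
        """P[i, j] = estimated probability of moving from cell i to cell j in dt."""
        rng = rng if rng is not None else np.random.default_rng(0)
        n = len(cell_edges) - 1
        P = np.zeros((n, n))
        for i in range(n):
            # sample initial states uniformly within cell i (one modeling choice)
            x0 = rng.uniform(cell_edges[i], cell_edges[i + 1], n_samples)
            x1 = np.array([step(x, dt, rng) for x in x0])
            j = np.clip(np.searchsorted(cell_edges, x1) - 1, 0, n - 1)
            P[i] = np.bincount(j, minlength=n) / n_samples
        return P

    # Toy dynamics: noisy relaxation toward a setpoint of 10 units.
    step = lambda x, dt, rng: x + dt * (-0.3 * (x - 10.0)) + rng.normal(0.0, 0.1)
    P = build_transition_matrix(step, cell_edges=np.linspace(0.0, 20.0, 21))
    ```

    Iterating a probability vector through P then ranks the likelihood of system states, which is the probabilistic measure the combined DSD-ISA approach exploits.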

  3. On Probabilistic Alpha-Fuzzy Fixed Points and Related Convergence Results in Probabilistic Metric and Menger Spaces under Some Pompeiu-Hausdorff-Like Probabilistic Contractive Conditions

    OpenAIRE

    De la Sen, M.

    2015-01-01

    In the framework of complete probabilistic metric spaces and, in particular, in probabilistic Menger spaces, this paper investigates some relevant properties of convergence of sequences to probabilistic α-fuzzy fixed points under some types of probabilistic contractive conditions.

  4. A Bayesian Network Approach to Ontology Mapping

    National Research Council Canada - National Science Library

    Pan, Rong; Ding, Zhongli; Yu, Yang; Peng, Yun

    2005-01-01

    This paper presents our ongoing effort on developing a principled methodology for automatic ontology mapping based on BayesOWL, a probabilistic framework we developed for modeling uncertainty in semantic web...

  5. Metrics for Probabilistic Geometries

    DEFF Research Database (Denmark)

    Tosi, Alessandra; Hauberg, Søren; Vellido, Alfredo

    2014-01-01

    the distribution over mappings is given by a Gaussian process. We treat the corresponding latent variable model as a Riemannian manifold and we use the expectation of the metric under the Gaussian process prior to define interpolating paths and measure distance between latent points. We show how distances...

  6. A probabilistic approach to delineating functional brain regions

    DEFF Research Database (Denmark)

    Kalbitzer, Jan; Svarer, Claus; Frokjaer, Vibe G

    2009-01-01

    The purpose of this study was to develop a reliable observer-independent approach to delineating volumes of interest (VOIs) for functional brain regions that are not identifiable on structural MR images. The case is made for the raphe nuclei, a collection of nuclei situated in the brain stem known to be densely packed with serotonin transporters (5-hydroxytryptaminic [5-HTT] system). METHODS: A template set for the raphe nuclei, based on their high content of 5-HTT as visualized in parametric (11)C-labeled 3-amino-4-(2-dimethylaminomethyl-phenylsulfanyl)-benzonitrile PET images, was created for 10 healthy subjects. The templates were subsequently included in the region sets used in a previously published automatic MRI-based approach to create an observer- and activity-independent probabilistic VOI map. The probabilistic map approach was tested in a different group of 10 subjects and compared...

  7. Probabilistic Seismic Hazard Analysis for Yemen

    Directory of Open Access Journals (Sweden)

    Rakesh Mohindra

    2012-01-01

    A stochastic-event probabilistic seismic hazard model, which can be used further for estimates of seismic loss and seismic risk analysis, has been developed for the territory of Yemen. An updated composite earthquake catalogue has been compiled using the databases from two basic sources and several research publications. The spatial distribution of earthquakes from the catalogue was used to define and characterize the regional earthquake source zones for Yemen. To capture all possible scenarios in the seismic hazard model, a stochastic event set has been created consisting of 15,986 events generated from 1,583 fault segments in the delineated seismic source zones. Distribution of horizontal peak ground acceleration (PGA) was calculated for all stochastic events considering epistemic uncertainty in ground-motion modeling using three suitable ground-motion prediction relationships, which were applied with equal weight. The probabilistic seismic hazard maps were created showing PGA and MSK seismic intensity at 10% and 50% probability of exceedance in 50 years, considering local soil site conditions. The resulting PGA for 10% probability of exceedance in 50 years (return period 475 years) ranges from 0.2 g to 0.3 g in western Yemen and generally is less than 0.05 g across central and eastern Yemen. The largest contributors to Yemen’s seismic hazard are the events from the West Arabian Shield seismic zone.
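
    The parenthetical "return period 475 years" follows from the exceedance probability under a Poisson occurrence model; the occurrence model is our assumption here, since the abstract does not state one. A quick check:

    ```python
    import math

    def return_period(p_exceed, exposure_years):
        """Return period implied by an exceedance probability over an exposure time."""
        return -exposure_years / math.log(1.0 - p_exceed)

    print(return_period(0.10, 50))   # ~475 years, for the 10%-in-50-years maps
    print(return_period(0.50, 50))   # ~72 years, for the 50%-in-50-years maps
    ```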

  8. A probabilistic atlas of human brainstem pathways based on connectome imaging data.

    Science.gov (United States)

    Tang, Yuchun; Sun, Wei; Toga, Arthur W; Ringman, John M; Shi, Yonggang

    2018-04-01

    The brainstem is a critical structure that regulates vital autonomic functions, houses the cranial nerves and their nuclei, relays motor and sensory information between the brain and spinal cord, and modulates cognition, mood, and emotions. As a primary relay center, the fiber pathways of the brainstem include efferent and afferent connections among the cerebral cortex, spinal cord, and cerebellum. While diffusion MRI has been successfully applied to map various brain pathways, its application for the in vivo imaging of the brainstem pathways has been limited due to inadequate resolution and large susceptibility-induced distortion artifacts. With the release of high-resolution data from the Human Connectome Project (HCP), there is increasing interest in mapping human brainstem pathways. Previous works relying on HCP data to study brainstem pathways, however, did not consider the prevalence (>80%) of large distortions in the brainstem even after the application of correction procedures from the HCP-Pipeline. They were also limited by the lack of adequate consideration of subject variability in either fiber pathways or regions of interest (ROIs) used for bundle reconstruction. To overcome these limitations, we develop in this work a probabilistic atlas of 23 major brainstem bundles using high-quality HCP data passing rigorous quality control. For the large-scale data from the 500-Subject release of HCP, we conducted extensive quality controls to exclude subjects with severe distortions in the brainstem area. After that, we developed a systematic protocol to manually delineate 1300 ROIs on 20 HCP subjects (10 males; 10 females) for the reconstruction of fiber bundles using tractography techniques. Finally, we leveraged our novel connectome modeling techniques, including high order fiber orientation distribution (FOD) reconstruction from multi-shell diffusion imaging and topography-preserving tract filtering algorithms, to successfully reconstruct the 23 fiber bundles.

  9. Uniportal anatomic combined unusual segmentectomies.

    Science.gov (United States)

    González-Rivas, Diego; Lirio, Francisco; Sesma, Julio

    2017-01-01

    Nowadays, sublobar anatomic resections are gaining momentum as a valid alternative for early stage lung cancer. Despite being technically demanding, anatomic segmentectomies can be performed by a uniportal video-assisted thoracic surgery (VATS) approach to combine the benefits of minimal invasiveness with maximum lung sparing. The procedure can be even more complex when a combined resection of multiple segments from different lobes has to be done. Here we report five cases of combined and unusual segmentectomies performed by the same experienced surgeon in high-volume institutions, to show that uniportal VATS is a feasible approach for these complex resections and to share an excellent educational resource.

  10. Probabilistic numerical discrimination in mice.

    Science.gov (United States)

    Berkay, Dilara; Çavdaroğlu, Bilgehan; Balcı, Fuat

    2016-03-01

    Previous studies showed that both human and non-human animals can discriminate between different quantities (i.e., time intervals, numerosities) with a limited level of precision due to their endogenous/representational uncertainty. In addition, other studies have shown that subjects can modulate their temporal categorization responses adaptively by incorporating information gathered regarding probabilistic contingencies into their time-based decisions. Despite the psychophysical similarities between the interval timing and nonverbal counting functions, the sensitivity of count-based decisions to probabilistic information remains an unanswered question. In the current study, we investigated whether exogenous probabilistic information can be integrated into numerosity-based judgments by mice. In the task employed in this study, reward was presented either after few (i.e., 10) or many (i.e., 20) lever presses, the last of which had to be emitted on the lever associated with the corresponding trial type. In order to investigate the effect of probabilistic information on performance in this task, we manipulated the relative frequency of different trial types across different experimental conditions. We evaluated the behavioral performance of the animals under models that differed in terms of their assumptions regarding the cost of responding (e.g., logarithmically increasing vs. no response cost). Our results showed for the first time that mice could adaptively modulate their count-based decisions based on the experienced probabilistic contingencies in directions predicted by optimality.

  11. Probabilistic Design and Analysis Framework

    Science.gov (United States)

    Strack, William C.; Nagpal, Vinod K.

    2010-01-01

    PRODAF is a software package designed to aid analysts and designers in conducting probabilistic analysis of components and systems. PRODAF can integrate multiple analysis programs to ease the tedious process of conducting a complex analysis that requires the use of multiple software packages. The work uses a commercial finite element analysis (FEA) program with modules from NESSUS to conduct a probabilistic analysis of a hypothetical turbine blade, disk, and shaft model. PRODAF applies the response surface method at the component level and extrapolates the component-level responses to the system level. Hypothetical components of a gas turbine engine are first deterministically modeled using FEA. Variations in selected geometrical dimensions and loading conditions are analyzed to determine the effects on the stress state within each component. Geometric variations include the chord length and height for the blade, and the inner radius, outer radius, and thickness for the disk. Probabilistic analysis is carried out using developing software packages such as System Uncertainty Analysis (SUA) and PRODAF. PRODAF was used with a commercial deterministic FEA program in conjunction with modules from the probabilistic analysis program NESTEM to perturb loads and geometries to provide a reliability and sensitivity analysis. PRODAF simplified the handling of data among the various programs involved, and will work with many commercial and open-source deterministic programs, probabilistic programs, or modules.
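
    The response surface method mentioned above can be sketched end to end: fit a cheap quadratic surrogate to a handful of expensive FEA runs, then propagate input uncertainty through the surrogate by Monte Carlo. run_fea and stress_limit below are hypothetical stand-ins, and the quadratic-without-cross-terms surface and all distributions are our assumptions:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def run_fea(x):
        """Toy stand-in for the expensive commercial FEA stress solve."""
        return 100.0 * x[0] ** 2 / x[1] + rng.normal(0.0, 0.5)

    def quad_design(X):
        """Design matrix with constant, linear, and pure quadratic terms."""
        return np.hstack([np.ones((len(X), 1)), X, X ** 2])

    # Small design of experiments over two inputs (e.g., chord length, load factor).
    X_doe = rng.uniform([0.9, 0.9], [1.1, 1.1], size=(15, 2))
    y_doe = np.array([run_fea(x) for x in X_doe])
    beta, *_ = np.linalg.lstsq(quad_design(X_doe), y_doe, rcond=None)

    # Monte Carlo through the surrogate: 100k samples cost almost nothing.
    X_mc = rng.normal(1.0, 0.03, size=(100_000, 2))
    stress = quad_design(X_mc) @ beta
    stress_limit = 110.0                       # hypothetical allowable stress
    print("P(stress > limit) =", np.mean(stress > stress_limit))
    ```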

  12. Probabilistic methods used in NUSS

    International Nuclear Information System (INIS)

    Fischer, J.; Giuliani, P.

    1985-01-01

    Probabilistic considerations are used implicitly or explicitly in all technical areas. In the NUSS codes and guides the two areas of design and siting are those where more use is made of these concepts. A brief review of the relevant documents in these two areas is made in this paper. It covers the documents where either probabilistic considerations are implied or where probabilistic approaches are recommended in the evaluation of situations and of events. In the siting guides the review mainly covers the areas of seismic, hydrological and external man-made events analysis, as well as some aspects of meteorological extreme events analysis. Probabilistic methods are recommended in the design guides but they are not made a requirement. There are several reasons for this, mainly lack of reliable data and the absence of quantitative safety limits or goals against which to judge the design analysis. As far as practical, engineering judgement should be backed up by quantitative probabilistic analysis. Examples are given and the concept of design basis as used in NUSS design guides is explained. (author)

  13. Implications of probabilistic risk assessment

    International Nuclear Information System (INIS)

    Cullingford, M.C.; Shah, S.M.; Gittus, J.H.

    1987-01-01

    Probabilistic risk assessment (PRA) is an analytical process that quantifies the likelihoods, consequences and associated uncertainties of the potential outcomes of postulated events. Starting with planned or normal operation, probabilistic risk assessment covers a wide range of potential accidents and considers the whole plant and the interactions of systems and human actions. Probabilistic risk assessment can be applied in safety decisions in design, licensing and operation of industrial facilities, particularly nuclear power plants. The proceedings include a review of PRA procedures, methods and technical issues in treating uncertainties, operating and licensing issues and future trends. Risk assessments for specific reactor types or components and specific risks (e.g., aircraft crashing onto a reactor) are used to illustrate the points raised. All 52 articles are indexed separately. (U.K.)

  14. The Fate of Anatomical Collections

    NARCIS (Netherlands)

    Knoeff, Rina; Zwijnenberg, Robert

    Almost every medical faculty possesses anatomical and/or pathological collections: human and animal preparations, wax- and other models, as well as drawings, photographs, documents and archives relating to them. In many institutions these collections are well-preserved, but in others they are poorly preserved.

  15. Mapping neuroplastic potential in brain-damaged patients.

    Science.gov (United States)

    Herbet, Guillaume; Maheu, Maxime; Costi, Emanuele; Lafargue, Gilles; Duffau, Hugues

    2016-03-01

    It is increasingly acknowledged that the brain is highly plastic. However, the anatomic factors governing the potential for neuroplasticity have hardly been investigated. To bridge this knowledge gap, we generated a probabilistic atlas of functional plasticity derived from both anatomic magnetic resonance imaging results and intraoperative mapping data on 231 patients having undergone surgery for diffuse, low-grade glioma. The atlas includes detailed level of confidence information and is supplemented with a series of comprehensive, connectivity-based cluster analyses. Our results show that cortical plasticity is generally high in the cortex (except in primary unimodal areas and in a small set of neural hubs) and rather low in connective tracts (especially associative and projection tracts). The atlas sheds new light on the topological organization of critical neural systems and may also be useful in predicting the likelihood of recovery (as a function of lesion topology) in various neuropathological conditions-a crucial factor in improving the care of brain-damaged patients. © The Author (2016). Published by Oxford University Press on behalf of the Guarantors of Brain. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  16. The current and ideal state of anatomic pathology patient safety.

    Science.gov (United States)

    Raab, Stephen Spencer

    2014-01-01

    An anatomic pathology diagnostic error may be secondary to a number of active and latent technical and/or cognitive components, which may occur anywhere along the total testing process in clinical and/or laboratory domains. For the pathologist interpretive steps of diagnosis, we examine Kahneman's framework of slow and fast thinking to explain different causes of error in precision (agreement) and in accuracy (truth). The pathologist cognitive diagnostic process involves image pattern recognition and a slow thinking error may be caused by the application of different rationally-constructed mental maps of image criteria/patterns by different pathologists. This type of error is partly related to a system failure in standardizing the application of these maps. A fast thinking error involves the flawed leap from image pattern to incorrect diagnosis. In the ideal state, anatomic pathology systems would target these cognitive error causes as well as the technical latent factors that lead to error.

  17. Probabilistic coding of quantum states

    International Nuclear Information System (INIS)

    Grudka, Andrzej; Wojcik, Antoni; Czechlewski, Mikolaj

    2006-01-01

    We discuss the properties of probabilistic coding of two qubits to one qutrit and generalize the scheme to higher dimensions. We show that the protocol preserves the entanglement between the qubits to be encoded and the environment and can also be applied to mixed states. We present a protocol that enables encoding of n qudits to one qudit of dimension smaller than the Hilbert space of the original system and then allows probabilistic but error-free decoding of any subset of k qudits. We give a formula for the probability of successful decoding

  18. Probabilistic reasoning in data analysis.

    Science.gov (United States)

    Sirovich, Lawrence

    2011-09-20

    This Teaching Resource provides lecture notes, slides, and a student assignment for a lecture on probabilistic reasoning in the analysis of biological data. General probabilistic frameworks are introduced, and a number of standard probability distributions are described using simple intuitive ideas. Particular attention is focused on random arrivals that are independent of prior history (Markovian events), with an emphasis on waiting times, Poisson processes, and Poisson probability distributions. The use of these various probability distributions is applied to biomedical problems, including several classic experimental studies.
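
    The Markovian-arrival facts the lecture emphasizes can be checked by simulation: exponential (memoryless) waiting times imply Poisson-distributed counts in a fixed window, with mean equal to variance. The rate and window below are arbitrary choices:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    rate, window, n_trials = 2.0, 10.0, 10_000

    counts = []
    for _ in range(n_trials):
        t, n = 0.0, 0
        while True:
            t += rng.exponential(1.0 / rate)   # memoryless waiting time
            if t > window:
                break
            n += 1
        counts.append(n)

    print("mean count:", np.mean(counts), "expected:", rate * window)
    print("variance:  ", np.var(counts), "expected:", rate * window)
    ```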

  19. Probabilistic Modeling of Timber Structures

    DEFF Research Database (Denmark)

    Köhler, J.D.; Sørensen, John Dalsgaard; Faber, Michael Havbro

    2005-01-01

    The present paper contains a proposal for the probabilistic modeling of timber material properties. It is produced in the context of the Probabilistic Model Code (PMC) of the Joint Committee on Structural Safety (JCSS) and of the COST action E24 'Reliability of Timber Structures'. The present proposal is based on discussions and comments from participants of the COST E24 action and the members of the JCSS. The paper contains a description of the basic reference properties for timber strength parameters and ultimate limit state equations for components and connections. The recommended...

  20. Convex sets in probabilistic normed spaces

    International Nuclear Information System (INIS)

    Aghajani, Asadollah; Nourouzi, Kourosh

    2008-01-01

    In this paper we obtain some results on convexity in a probabilistic normed space. We also investigate the concept of CSN-closedness and CSN-compactness in a probabilistic normed space and generalize the corresponding results of normed spaces

  1. Confluence Reduction for Probabilistic Systems (extended version)

    NARCIS (Netherlands)

    Timmer, Mark; Stoelinga, Mariëlle Ida Antoinette; van de Pol, Jan Cornelis

    2010-01-01

    This paper presents a novel technique for state space reduction of probabilistic specifications, based on a newly developed notion of confluence for probabilistic automata. We prove that this reduction preserves branching probabilistic bisimulation and can be applied on-the-fly. To support the

  2. Probabilistic Role Models and the Guarded Fragment

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2004-01-01

    We propose a uniform semantic framework for interpreting probabilistic concept subsumption and probabilistic role quantification through statistical sampling distributions. This general semantic principle serves as the foundation for the development of a probabilistic version of the guarded fragment of first-order logic. A characterization of equivalence in that logic in terms of bisimulations is given.

  3. Probabilistic role models and the guarded fragment

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2006-01-01

    We propose a uniform semantic framework for interpreting probabilistic concept subsumption and probabilistic role quantification through statistical sampling distributions. This general semantic principle serves as the foundation for the development of a probabilistic version of the guarded fragment of first-order logic. A characterization of equivalence in that logic in terms of bisimulations is given.

  4. Making Probabilistic Relational Categories Learnable

    Science.gov (United States)

    Jung, Wookyoung; Hummel, John E.

    2015-01-01

    Theories of relational concept acquisition (e.g., schema induction) based on structured intersection discovery predict that relational concepts with a probabilistic (i.e., family resemblance) structure ought to be extremely difficult to learn. We report four experiments testing this prediction by investigating conditions hypothesized to facilitate…

  5. Probabilistic inductive inference: a survey

    OpenAIRE

    Ambainis, Andris

    2001-01-01

    Inductive inference is a recursion-theoretic theory of learning, first developed by E. M. Gold (1967). This paper surveys developments in probabilistic inductive inference. We mainly focus on finite inference of recursive functions, since this simple paradigm has produced the most interesting (and most complex) results.

  6. Probabilistic Approaches to Video Retrieval

    NARCIS (Netherlands)

    Ianeva, Tzvetanka; Boldareva, L.; Westerveld, T.H.W.; Cornacchia, Roberto; Hiemstra, Djoerd; de Vries, A.P.

    Our experiments for TRECVID 2004 further investigate the applicability of so-called generative probabilistic models to video retrieval. TRECVID 2003 results demonstrated that mixture models computed from video shot sequences improve the precision of "query by examples" results when

  7. Probabilistic safety analysis procedures guide

    International Nuclear Information System (INIS)

    Papazoglou, I.A.; Bari, R.A.; Buslik, A.J.

    1984-01-01

    A procedures guide for the performance of probabilistic safety assessment has been prepared for interim use in the Nuclear Regulatory Commission programs. The probabilistic safety assessment studies performed are intended to produce probabilistic predictive models that can be used and extended by the utilities and by NRC to sharpen the focus of inquiries into a range of issues affecting reactor safety. This guide addresses the determination of the probability (per year) of core damage resulting from accident initiators internal to the plant and from loss of offsite electric power. The scope includes analyses of problem-solving (cognitive) human errors, a determination of importance of the various core damage accident sequences, and an explicit treatment and display of uncertainties for the key accident sequences. Ultimately, the guide will be augmented to include the plant-specific analysis of in-plant processes (i.e., containment performance) and the risk associated with external accident initiators, as consensus is developed regarding suitable methodologies in these areas. This guide provides the structure of a probabilistic safety study to be performed, and indicates what products of the study are essential for regulatory decision making. Methodology is treated in the guide only to the extent necessary to indicate the range of methods which is acceptable; ample reference is given to alternative methodologies which may be utilized in the performance of the study.

  8. Sound Probabilistic #SAT with Projection

    Directory of Open Access Journals (Sweden)

    Vladimir Klebanov

    2016-10-01

    We present an improved method for a sound probabilistic estimation of the model count of a boolean formula under projection. The problem solved can be used to encode a variety of quantitative program analyses, such as those concerning security or resource consumption. We implement the technique and discuss its application to quantifying information flow in programs.

  9. Probabilistic uniformities of uniform spaces

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez Lopez, J.; Romaguera, S.; Sanchis, M.

    2017-07-01

    The theory of metric spaces in the fuzzy context has been shown to be an interesting area of study not only from a theoretical point of view but also for its applications. Nevertheless, it is usual to consider these spaces as classical topological or uniform spaces, and there are not many results about constructing fuzzy topological structures starting from a fuzzy metric. Höhle was perhaps the first to show how to construct a probabilistic uniformity and a Lowen uniformity from a probabilistic pseudometric [Hohle78, Hohle82a]. His method can be directly translated to the context of fuzzy metrics and makes it possible to characterize the categories of probabilistic uniform spaces or Lowen uniform spaces by means of certain families of fuzzy pseudometrics [RL]. On the other hand, other fuzzy uniformities can be constructed in a fuzzy metric space: a Hutton [0,1]-quasi-uniformity [GGPV06]; a fuzzifying uniformity [YueShi10], etc. The paper [GGRLRo] gives a study of several methods of endowing a fuzzy pseudometric space with a probabilistic uniformity and a Hutton [0,1]-quasi-uniformity. In 2010, J. Gutiérrez García, S. Romaguera and M. Sanchis [GGRoSanchis10] proved that the category of uniform spaces is isomorphic to a category formed by sets endowed with a fuzzy uniform structure, i.e. a family of fuzzy pseudometrics satisfying certain conditions. We will show here that, by means of this isomorphism, we can obtain several methods to endow a uniform space with a probabilistic uniformity. Furthermore, these constructions allow us to obtain a factorization of some functors introduced in [GGRoSanchis10]. (Author)

  10. Unification of Sinonasal Anatomical Terminology

    Directory of Open Access Journals (Sweden)

    Voegels, Richard Louis

    2015-07-01

    The advent of endoscopy and computed tomography at the beginning of the 1980s brought to rhinology a revival of anatomy and physiology study. In 1994, the International Conference of Sinus Disease was conceived because the official "Terminologia Anatomica"[1] had little information on the detailed sinonasal anatomy. In addition, there was a lack of uniformity of terminology and definitions. After 20 years, a new conference has been held. The need to use the same terminology led to the publication by the European Society of Rhinology of the "European Position Paper on the Anatomical Terminology of the Internal Nose and Paranasal Sinuses," which can be accessed freely at www.rhinologyjournal.com. Professor Valerie Lund et al.[2] wrote this document reviewing the anatomical terms and comparing them with the official "Terminologia Anatomica" in order to define the structures without eponyms, while respecting embryological development and, above all, universalizing and simplifying the terms. A must-read! The text's purpose lies beyond the review of anatomical terminology: it aims to universalize the language used to refer to structures of the nasal and paranasal cavities. Information about the anatomy, based on an extensive review of the current literature, is arranged in just over 50 pages, which are direct and to the point. The publication should be pleasant reading for learners and teachers of rhinology. This text can be a starting point for the terminology used in Brazil, seeking convergence with this new European proposal for a nomenclature that helps us communicate with our peers in Brazil and the rest of the world. The original text of the European Society of Rhinology provides English terms and avoids the use of Latin, thereby sidestepping the many personal national translations. It would be admirable if we created our own cross-cultural adaptation of this newly suggested anatomical terminology.

  11. [Cellular subcutaneous tissue. Anatomic observations].

    Science.gov (United States)

    Marquart-Elbaz, C; Varnaison, E; Sick, H; Grosshans, E; Cribier, B

    2001-11-01

    We showed in a companion paper that the definition of the French "subcutaneous cellular tissue" varied considerably from the 18th century to the end of the 20th century and has not yet reached a consensus. To address the anatomic reality of this "subcutaneous cellular tissue", we investigated the anatomic structures underlying the fat tissue in normal human skin. Sixty specimens were excised from the surface down to the deep structures (bone, muscle, cartilage) at different body sites of 3 cadavers from the Institut d'Anatomie Normale de Strasbourg. Samples were paraffin-embedded, stained and analysed with a binocular microscope taking x 1 photographs. Specimens were also excised and fixed after subcutaneous injection of India ink, after mechanical tissue splitting, and after performing artificial skin folds. The aspects of the deep parts of the skin varied greatly according to their anatomic localisation. Below the adipose tissue, we often found a lamellar fibrous layer which extended from the interlobular septa and contained horizontally distributed fat cells. No specific tissue below the hypodermis was observed. Artificial skin folds either concerned exclusively the dermis, when they were superficial, or included the hypodermis, but no specific structure was apparent in the center of the fold. India ink diffused to the adipose tissue, mainly along the septa, but did not localise in a specific subcutaneous compartment. This study shows that the histologic aspects of the deep part of the skin depend mainly on the anatomic localisation. Skin is composed of epidermis, dermis and hypodermis, and thus the hypodermis cannot be considered as being "subcutaneous". A fibrous lamellar structure, difficult to individualise, in continuity with the interlobular septa is often found under the fat lobules. This structure is a cleavage line, as is always the case with loose connective tissues, but belongs to the hypodermis (i.e., fat tissue). No specific tissue nor any virtual space was observed.

  12. Inexpensive anatomical trainer for bronchoscopy.

    Science.gov (United States)

    Di Domenico, Stefano; Simonassi, Claudio; Chessa, Leonardo

    2007-08-01

    Flexible fiberoptic bronchoscopy is an indispensable tool for optimal management of intensive care unit patients. However, acquiring sufficient training in bronchoscopy is not straightforward during residency, because of technical and ethical problems. Moreover, the use of commercial simulators is limited by their high cost. In order to overcome these limitations, we built a low-cost anatomical simulator for acquiring and maintaining the basic skills needed to perform bronchoscopy in ventilated patients. We used 1.5 mm diameter iron wire to construct the bronchial tree scaffold; glazier putty was applied to create the anatomical model. The model was covered with several layers of newspaper strips previously immersed in water and vinylic glue. When the model had completely dried, it was detached from the scaffold by cutting it into six pieces, reassembled, painted and fitted with an endotracheal tube. We used very cheap materials and the final cost was 16 euros. The resulting trainer is real-scale and anatomically accurate, with good correspondence between endoscopic views of the model and of patients. All bronchial segments can be explored and easily identified by endoscopic and external vision. This cheap simulator is a valuable tool for practice, particularly in hospitals with limited resources for medical training.

  13. A probabilistic Hu-Washizu variational principle

    Science.gov (United States)

    Liu, W. K.; Belytschko, T.; Besterfield, G. H.

    1987-01-01

    A Probabilistic Hu-Washizu Variational Principle (PHWVP) for the Probabilistic Finite Element Method (PFEM) is presented. This formulation is developed for both linear and nonlinear elasticity. The PHWVP allows incorporation of the probabilistic distributions for the constitutive law, compatibility condition, equilibrium, domain and boundary conditions into the PFEM. Thus, a complete probabilistic analysis can be performed where all aspects of the problem are treated as random variables and/or fields. The Hu-Washizu variational formulation is available in many conventional finite element codes thereby enabling the straightforward inclusion of the probabilistic features into present codes.
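
    For reference, one standard small-strain form of the deterministic Hu-Washizu functional, with displacement u, strain ε and stress σ treated as independent fields (textbook notation, not reproduced from the paper):

    ```latex
    \Pi_{\mathrm{HW}}(u,\varepsilon,\sigma) =
      \int_{\Omega} \Big[ \tfrac{1}{2}\,\varepsilon : \mathbf{C} : \varepsilon
        - \sigma : \big( \varepsilon - \nabla^{s} u \big) - b \cdot u \Big]\, d\Omega
      - \int_{\Gamma_{t}} \bar{t} \cdot u \, d\Gamma .
    ```

    In the probabilistic version described above, the constitutive tensor C, the body force b, the boundary data, and the domain itself carry probability distributions, so the stationarity conditions of the functional are analyzed with random rather than deterministic fields.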

  14. Building a high-resolution T2-weighted MR-based probabilistic model of tumor occurrence in the prostate.

    Science.gov (United States)

    Nagarajan, Mahesh B; Raman, Steven S; Lo, Pechin; Lin, Wei-Chan; Khoshnoodi, Pooria; Sayre, James W; Ramakrishna, Bharath; Ahuja, Preeti; Huang, Jiaoti; Margolis, Daniel J A; Lu, David S K; Reiter, Robert E; Goldin, Jonathan G; Brown, Matthew S; Enzmann, Dieter R

    2018-02-19

    We present a method for generating a T2 MR-based probabilistic model of tumor occurrence in the prostate to guide the selection of anatomical sites for targeted biopsies and serve as a diagnostic tool to aid radiological evaluation of prostate cancer. In our study, the prostate and any radiological findings within were segmented retrospectively on 3D T2-weighted MR images of 266 subjects who underwent radical prostatectomy. Subsequent histopathological analysis determined both the ground truth and the Gleason grade of the tumors. A randomly chosen subset of 19 subjects was used to generate a multi-subject-derived prostate template. Subsequently, a cascading registration algorithm involving both affine and non-rigid B-spline transforms was used to register the prostate of every subject to the template. Corresponding transformation of radiological findings yielded a population-based probabilistic model of tumor occurrence. The quality of our probabilistic model building approach was statistically evaluated by measuring the proportion of correct placements of tumors in the prostate template, i.e., the number of tumors that maintained their anatomical location within the prostate after their transformation into the prostate template space. The probabilistic model built with tumors deemed clinically significant demonstrated a heterogeneous distribution of tumors, with higher likelihood of tumor occurrence at the mid-gland anterior transition zone and the base-to-mid-gland posterior peripheral zones. Of 250 MR lesions analyzed, 248 maintained their original anatomical location with respect to the prostate zones after transformation to the prostate template. We present a robust method for generating a probabilistic model of tumor occurrence in the prostate that could aid clinical decision making, such as selection of anatomical sites for MR-guided prostate biopsies.
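
    Once every subject's findings have been warped into the template by the affine plus B-spline cascade, the probabilistic model itself reduces to a voxelwise average. A minimal sketch, assuming the binary tumor masks are already in template space (the registration step is omitted):

    ```python
    import numpy as np

    def occurrence_map(registered_masks):
        """Voxelwise probability of tumor occurrence from co-registered binary masks.

        registered_masks: iterable of equally shaped 3-D {0,1} arrays in template space.
        """
        masks = np.stack([np.asarray(m, dtype=float) for m in registered_masks])
        return masks.mean(axis=0)
    ```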

  15. Probabilistic costing of transmission services

    International Nuclear Information System (INIS)

    Wijayatunga, P.D.C.

    1992-01-01

    Costing of the transmission services of electrical utilities is required for transactions involving the transport of energy over a power network. The calculation of these costs based on Short Run Marginal Costing (SRMC) is preferred over other methods proposed in the literature because of its economic efficiency. In the research work discussed here, the concept of probabilistic use-of-system costing based on SRMC, which emerges as a consequence of the uncertainties in a power system, is introduced using two different approaches. The first approach, based on the Monte Carlo method, generates a large number of possible system states by simulating random variables in the system using pseudo-random number generators. A second approach to probabilistic use-of-system costing is proposed based on numerical convolution and a multi-area representation of the transmission network. (UK)

  16. Probabilistic Design of Wind Turbines

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Toft, H.S.

    2010-01-01

    Probabilistic design of wind turbines requires definition of the structural elements to be included in the probabilistic basis: e.g., blades, tower, foundation; identification of important failure modes; careful stochastic modeling of the uncertain parameters; and recommendations for target reliability. It is described how uncertainties in wind turbine design related to computational models, statistical data from test specimens, results from a few full-scale tests and from prototype wind turbines can be accounted for using the Maximum Likelihood Method and a Bayesian approach. Assessment of the optimal reliability level by cost-benefit optimization is illustrated by an offshore wind turbine example. Uncertainty modeling is illustrated by an example where physical, statistical and model uncertainties are estimated.

  17. Advances in probabilistic risk analysis

    International Nuclear Information System (INIS)

    Hardung von Hardung, H.

    1982-01-01

    Probabilistic risk analysis can now look back upon almost a quarter century of intensive development. The early studies, whose methods and results are still referred to occasionally, however, only permitted rough estimates to be made of the probabilities of recognizable accident scenarios, failing to provide a method which could have served as a reference base in calculating the overall risk associated with nuclear power plants. The first truly solid attempt was the Rasmussen Study and, partly based on it, the German Risk Study. In those studies, probabilistic risk analysis has been given a much more precise basis. However, new methodologies have been developed in the meantime, which allow much more informative risk studies to be carried out. They have been found to be valuable tools for management decisions with respect to backfitting, reinforcement and risk limitation. Today they are mainly applied by specialized private consultants and have already found widespread application especially in the USA. (orig.)

  18. Probabilistic risk assessment of HTGRs

    International Nuclear Information System (INIS)

    Fleming, K.N.; Houghton, W.J.; Hannaman, G.W.; Joksimovic, V.

    1980-08-01

    Probabilistic Risk Assessment methods have been applied to gas-cooled reactors for more than a decade and to HTGRs for more than six years in the programs sponsored by the US Department of Energy. Significant advancements to the development of PRA methodology in these programs are summarized as are the specific applications of the methods to HTGRs. Emphasis here is on PRA as a tool for evaluating HTGR design options. Current work and future directions are also discussed

  19. Probabilistic methods for rotordynamics analysis

    Science.gov (United States)

    Wu, Y.-T.; Torng, T. Y.; Millwater, H. R.; Fossum, A. F.; Rheinfurth, M. H.

    1991-01-01

    This paper summarizes the development of the methods and a computer program to compute the probability of instability of dynamic systems that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria based upon the eigenvalues or Routh-Hurwitz test functions are investigated. Computational methods based on a fast probability integration concept and an efficient adaptive importance sampling method are proposed to perform efficient probabilistic analysis. A numerical example is provided to demonstrate the methods.
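
    The eigenvalue-based instability criterion can be sketched with plain Monte Carlo; the paper's fast probability integration and adaptive importance sampling are efficiency refinements on top of this. All parameter distributions below are invented for illustration:

    ```python
    import numpy as np

    def is_unstable(M, C, K):
        """True if the state matrix of M q'' + C q' + K q = 0 has an
        eigenvalue with positive real part."""
        n = M.shape[0]
        Minv = np.linalg.inv(M)
        A = np.block([[np.zeros((n, n)), np.eye(n)],
                      [-Minv @ K, -Minv @ C]])
        return np.max(np.linalg.eigvals(A).real) > 0.0

    rng = np.random.default_rng(7)
    n_mc, unstable = 20_000, 0
    for _ in range(n_mc):
        c = rng.normal(0.05, 0.05)      # uncertain damping; may go negative
        k = rng.lognormal(0.0, 0.1)     # uncertain stiffness
        if is_unstable(np.eye(1), np.array([[c]]), np.array([[k]])):
            unstable += 1
    print("P(instability) approx.", unstable / n_mc)
    ```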

  20. Probabilistic analysis and related topics

    CERN Document Server

    Bharucha-Reid, A T

    1983-01-01

    Probabilistic Analysis and Related Topics, Volume 3 focuses on the continuity, integrability, and differentiability of random functions, including operator theory, measure theory, and functional and numerical analysis. The selection first offers information on the qualitative theory of stochastic systems and Langevin equations with multiplicative noise. Discussions focus on phase-space evolution via direct integration, phase-space evolution, linear and nonlinear systems, linearization, and generalizations. The text then ponders on the stability theory of stochastic difference systems and Marko

  1. Probabilistic analysis and related topics

    CERN Document Server

    Bharucha-Reid, A T

    1979-01-01

    Probabilistic Analysis and Related Topics, Volume 2 focuses on the integrability, continuity, and differentiability of random functions, as well as functional analysis, measure theory, operator theory, and numerical analysis.The selection first offers information on the optimal control of stochastic systems and Gleason measures. Discussions focus on convergence of Gleason measures, random Gleason measures, orthogonally scattered Gleason measures, existence of optimal controls without feedback, random necessary conditions, and Gleason measures in tensor products. The text then elaborates on an

  2. Probabilistic risk assessment of HTGRs

    International Nuclear Information System (INIS)

    Fleming, K.N.; Houghton, W.J.; Hannaman, G.W.; Joksimovic, V.

    1981-01-01

    Probabilistic Risk Assessment methods have been applied to gas-cooled reactors for more than a decade and to HTGRs for more than six years in the programs sponsored by the U.S. Department of Energy. Significant advancements to the development of PRA methodology in these programs are summarized as are the specific applications of the methods to HTGRs. Emphasis here is on PRA as a tool for evaluating HTGR design options. Current work and future directions are also discussed. (author)

  3. Utilization management in anatomic pathology.

    Science.gov (United States)

    Lewandrowski, Kent; Black-Schaffer, Steven

    2014-01-01

    There is relatively little published literature concerning utilization management in anatomic pathology. Nonetheless there are many utilization management opportunities that currently exist and are well recognized. Some of these impact only the cost structure within the pathology department itself whereas others reduce charges for third party payers. Utilization management may result in medical legal liabilities for breaching the standard of care. For this reason it will be important for pathology professional societies to develop national utilization guidelines to assist individual practices in implementing a medically sound approach to utilization management. © 2013.

  4. Estimating anatomical wrist joint motion with a robotic exoskeleton.

    Science.gov (United States)

    Rose, Chad G; Kann, Claudia K; Deshpande, Ashish D; O'Malley, Marcia K

    2017-07-01

    Robotic exoskeletons can provide the high intensity, long duration targeted therapeutic interventions required for regaining motor function lost as a result of neurological injury. Quantitative measurements by exoskeletons have been proposed as measures of rehabilitative outcomes. Exoskeletons, in contrast to end effector designs, have the potential to provide a direct mapping between human and robot joints. This mapping rests on the assumption that anatomical axes and robot axes are aligned well, and that movement within the exoskeleton is negligible. These assumptions hold well for simple one degree-of-freedom joints, but may not be valid for multi-articular joints with unique musculoskeletal properties such as the wrist. This paper presents an experiment comparing robot joint kinematic measurements from an exoskeleton to anatomical joint angles measured with a motion capture system. Joint-space position measurements and task-space smoothness metrics were compared between the two measurement modalities. The experimental results quantify the error between joint-level position measurements, and show that exoskeleton kinematic measurements preserve smoothness characteristics found in anatomical measures of wrist movements.

  5. The Science and Politics of Naming: Reforming Anatomical Nomenclature, ca. 1886-1955.

    Science.gov (United States)

    Buklijas, Tatjana

    2017-04-01

    Anatomical nomenclature is medicine's official language. Early in their medical studies, students are expected to memorize not only the bodily geography but also the names for all the structures that, by consensus, constitute the anatomical body. The making and uses of visual maps of the body have received considerable historiographical attention, yet the history of the production, communication, and reception of anatomical names, a history as long as that of anatomy itself, has been studied far less. My essay examines the reforms of anatomical naming between the first modern nomenclature, the 1895 Basel Nomina Anatomica (BNA), and the 1955 Nomina Anatomica Parisiensia (NAP, also known as PNA), which is the basis for current anatomical terminology. I focus on the controversial and ultimately failed attempt to reform anatomical nomenclature, known as the Jena Nomina Anatomica (INA), of 1935. Discussions around nomenclature reveal not only how anatomical names are made and communicated, but also the relationship of anatomy with the clinic; disciplinary controversies within anatomy; national traditions in science; and the interplay between international and scientific disciplinary politics. I show how the current anatomical nomenclature, a successor to the NAP, is an outcome of both political and disciplinary tensions that reached their peak before 1945. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  6. Brain anatomical network and intelligence.

    Directory of Open Access Journals (Sweden)

    Yonghui Li

    2009-05-01

    Intuitively, higher intelligence might be assumed to correspond to more efficient information transfer in the brain, but no direct evidence has been reported from the perspective of brain networks. In this study, we performed extensive analyses to test the hypothesis that individual differences in intelligence are associated with brain structural organization, and in particular that higher scores on intelligence tests are related to greater global efficiency of the brain anatomical network. We constructed binary and weighted brain anatomical networks in each of 79 healthy young adults utilizing diffusion tensor tractography and calculated topological properties of the networks using a graph theoretical method. Based on their IQ test scores, all subjects were divided into general and high intelligence groups and significantly higher global efficiencies were found in the networks of the latter group. Moreover, we showed significant correlations between IQ scores and network properties across all subjects while controlling for age and gender. Specifically, higher intelligence scores corresponded to a shorter characteristic path length and a higher global efficiency of the networks, indicating a more efficient parallel information transfer in the brain. The results were consistently observed not only in the binary but also in the weighted networks, which together provide convergent evidence for our hypothesis. Our findings suggest that the efficiency of brain structural organization may be an important biological basis for intelligence.
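
    The two graph metrics the study links to IQ, characteristic path length and global efficiency, are one-liners with networkx; the random graph below is a toy stand-in for a subject's tractography-derived binary network:

    ```python
    import networkx as nx

    # ~90 nodes mimics a cortical/subcortical parcellation; density is arbitrary.
    G = nx.erdos_renyi_graph(n=90, p=0.15, seed=1)   # connected w.h.p. at this density
    L = nx.average_shortest_path_length(G)           # characteristic path length
    E = nx.global_efficiency(G)                      # mean inverse shortest path length
    print(f"path length L = {L:.2f}, global efficiency E = {E:.3f}")
    # In the study, higher IQ corresponded to smaller L and larger E.
    ```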

  7. Anatomic partial nephrectomy: technique evolution.

    Science.gov (United States)

    Azhar, Raed A; Metcalfe, Charles; Gill, Inderbir S

    2015-03-01

    Partial nephrectomy provides long-term oncologic outcomes equivalent to, and functional outcomes superior to, those of radical nephrectomy for T1a renal masses. Herein, we review the various vascular clamping techniques employed during minimally invasive partial nephrectomy, describe the evolution of our partial nephrectomy technique and provide an update on contemporary thinking about the impact of ischemia on renal function. Recently, partial nephrectomy surgical technique has shifted away from main artery clamping and towards minimizing or eliminating global renal ischemia during partial nephrectomy. Supported by high-fidelity three-dimensional imaging, novel anatomic-based partial nephrectomy techniques have recently been developed, wherein partial nephrectomy can now be performed with segmental, minimal or zero global ischemia to the renal remnant. Sequential innovations have included early unclamping, segmental clamping and super-selective clamping, culminating in anatomic zero-ischemia surgery. By eliminating the 'under-the-gun' time pressure of ischemia for the surgeon, these techniques allow an unhurried, tightly contoured tumour excision with point-specific sutured haemostasis. Recent data indicate that zero-ischemia partial nephrectomy may provide better functional outcomes by minimizing or eliminating global ischemia and preserving greater vascularized kidney volume. Contemporary partial nephrectomy includes a spectrum of surgical techniques ranging from conventional-clamped to novel zero-ischemia approaches. Technique selection should be tailored to each individual case on the basis of tumour characteristics, surgical feasibility, surgeon experience, patient demographics and baseline renal function.

  8. Anatomical variations of paranasal sinuses: what to inform the otolaryngologist?

    International Nuclear Information System (INIS)

    Villela, Caroline Laurita Batista Couto; Gomes, Natalia Delage; Gaiotti, Juliana Oggioni; Costa, Ana Maria Doffemond; Ribeiro, Marcelo Almeida; Motta, Emilia Guerra Pinto Coelho; Moreira, Wanderval; Ramos, Laura Filgueiras Mourao; Diniz, Renata Lopes Furletti Caldeira

    2012-01-01

    Anatomic variations of the paranasal sinuses are common findings in daily practice. A radiologist must know these variations because of the pathological conditions related to them, and also because they are important for planning functional endoscopic endonasal surgery, the procedure of choice for diagnosis, biopsy and treatment of various sinonasal diseases. To ensure that this surgery is done safely, preventing iatrogenic injuries, it is essential that the surgeon has a mapping of these structures. Thus, CT is indispensable for preoperative evaluation of the paranasal sinuses. Since a general radiologist is expected to know these variations and their relationship to pathological conditions, a literature review and an iconographic essay were conducted with the aim of discussing the importance of the major anatomic variations of the paranasal sinuses. (author)

  9. Probabilistic atlas based labeling of the cerebral vessel tree

    Science.gov (United States)

    Van de Giessen, Martijn; Janssen, Jasper P.; Brouwer, Patrick A.; Reiber, Johan H. C.; Lelieveldt, Boudewijn P. F.; Dijkstra, Jouke

    2015-03-01

    Preoperative imaging of the cerebral vessel tree is essential for planning therapy on intracranial stenoses and aneurysms. Usually, a magnetic resonance angiography (MRA) or computed tomography angiography (CTA) is acquired from which the cerebral vessel tree is segmented. Accurate analysis is helped by the labeling of the cerebral vessels, but labeling is non-trivial due to anatomical topological variability and missing branches due to acquisition issues. In recent literature, labeling the cerebral vasculature around the Circle of Willis has mainly been approached as a graph-based problem. The most successful method, however, requires the definition of all possible permutations of missing vessels, which limits application to subsets of the tree and ignores spatial information about the vessel locations. This research aims to perform labeling using probabilistic atlases that model spatial vessel and label likelihoods. A cerebral vessel tree is aligned to a probabilistic atlas and subsequently each vessel is labeled by computing the maximum label likelihood per segment from label-specific atlases. The proposed method was validated on 25 segmented cerebral vessel trees. Labeling accuracies were close to 100% for large vessels, but dropped to 50-60% for small vessels that were only present in less than 50% of the set. With this work we showed that using solely spatial information of the vessel labels, vessel segments from stable vessels (>50% presence) were reliably classified. This spatial information will form the basis for a future labeling strategy with a very loose topological model.
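
    The per-segment decision rule described above can be sketched compactly. Assuming the likelihood of a label is the mean atlas probability over a segment's voxels (our simplification of the paper's maximum label likelihood), a hypothetical implementation looks like this:

      import numpy as np

      def label_segments(segment_voxels, label_atlases):
          """Assign each vessel segment the label with maximum mean atlas likelihood.
          segment_voxels: list of (n_i, 3) integer arrays of voxel coordinates,
                          already aligned to the atlas space.
          label_atlases:  dict mapping label name -> 3-D array of spatial
                          probabilities for that label (one atlas per label)."""
          labels = {}
          for seg_id, vox in enumerate(segment_voxels):
              scores = {
                  name: atlas[vox[:, 0], vox[:, 1], vox[:, 2]].mean()
                  for name, atlas in label_atlases.items()
              }
              labels[seg_id] = max(scores, key=scores.get)
          return labels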

  10. Specifying the brain anatomy underlying temporo-parietal junction activations for theory of mind: A review using probabilistic atlases from different imaging modalities

    NARCIS (Netherlands)

    Schurz, M.; Tholen, M.G.; Perner, J.; Mars, R.B.; Sallet, J.

    2017-01-01

    In this quantitative review, we specified the anatomical basis of brain activity reported in the Temporo-Parietal Junction (TPJ) in Theory of Mind (ToM) research. Using probabilistic brain atlases, we labeled TPJ peak coordinates reported in the literature. This was carried out for four different

  11. Probabilistic finite elements for fracture mechanics

    Science.gov (United States)

    Besterfield, Glen

    1988-01-01

    The probabilistic finite element method (PFEM) is developed for probabilistic fracture mechanics (PFM). A finite element which has the near crack-tip singular strain embedded in the element is used. Probabilistic measures, such as the expectation, covariance and correlation of stress intensity factors, are calculated for random load, random material properties and random crack length. The method is computationally quite efficient and can be used to determine the probability of fracture or the reliability.
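
    PFEM propagates these statistics through the finite element model analytically. As a hedged stand-in for the same idea, the sketch below uses crude Monte Carlo sampling (not the paper's embedded-singularity elements) to estimate moments of a mode-I stress intensity factor and a fracture probability; all numerical values are hypothetical.

      import numpy as np

      rng = np.random.default_rng(42)
      n = 100_000

      # Illustrative random inputs: remote stress (MPa), crack half-length (m),
      # and a fixed geometry factor.
      sigma = rng.normal(100.0, 10.0, n)
      a = rng.lognormal(np.log(0.005), 0.1, n)
      Y = 1.12

      # Mode-I stress intensity factor K = Y * sigma * sqrt(pi * a).
      K = Y * sigma * np.sqrt(np.pi * a)

      K_Ic = 18.0  # fracture toughness (MPa*sqrt(m)), hypothetical
      print("E[K] =", K.mean(), "Std[K] =", K.std())
      print("P(fracture) ~", np.mean(K > K_Ic))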

  12. Probabilistic Harmonic Modeling of Wind Power Plants

    DEFF Research Database (Denmark)

    Guest, Emerson; Jensen, Kim H.; Rasmussen, Tonny Wederberg

    2017-01-01

    A probabilistic sequence domain (SD) harmonic model of a grid-connected voltage-source converter is used to estimate harmonic emissions in a wind power plant (WPP) comprised of Type-IV wind turbines. The SD representation naturally partitions converter-generated voltage harmonics into those with deterministic phase and those with probabilistic phase. A case study performed on a string of ten 3 MW, Type-IV wind turbines implemented in PSCAD was used to verify the probabilistic SD harmonic model. The probabilistic SD harmonic model can be employed in the planning phase of WPP projects to assess harmonic emissions.
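
    The deterministic/probabilistic phase split has a simple numerical interpretation: harmonics with deterministic phase add coherently across turbines, while those with uniformly random phase add incoherently. A small sketch under those assumptions (the magnitudes are hypothetical, and this is not the paper's SD model):

      import numpy as np

      rng = np.random.default_rng(1)
      n_turbines, n_trials = 10, 20_000

      # Hypothetical per-turbine harmonic magnitude (per unit) at one frequency.
      mag = 0.01

      # Deterministic phase: contributions add coherently.
      coherent = n_turbines * mag

      # Probabilistic (uniform) phase: contributions add incoherently.
      phases = rng.uniform(0, 2 * np.pi, size=(n_trials, n_turbines))
      total = np.abs(np.sum(mag * np.exp(1j * phases), axis=1))
      print("coherent sum:", coherent)              # 0.10 pu
      print("mean incoherent sum:", total.mean())   # ~ mag * sqrt(n*pi)/2 = 0.028
      print("95th percentile:", np.percentile(total, 95))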

  13. Students’ difficulties in probabilistic problem-solving

    Science.gov (United States)

    Arum, D. P.; Kusmayadi, T. A.; Pramudya, I.

    2018-03-01

    Many errors can be identified when students solve mathematics problems, particularly probabilistic problems. The present study aims to investigate students' difficulties in solving probabilistic problems, focusing on analyzing and describing students' errors while solving them. This research used a qualitative method with a case study strategy. The subjects were ten 9th-grade students selected by purposive sampling. The data comprise students' probabilistic problem-solving results and recorded interviews regarding their difficulties in solving the problems. The data were analyzed descriptively using Miles and Huberman's steps. The results show that students' difficulties in solving probabilistic problems fall into three categories: first, difficulties in understanding the probabilistic problem; second, difficulties in choosing and using appropriate strategies for solving the problem; and third, difficulties with the computational process involved in solving it. The results suggest that students still have difficulties in solving probabilistic problems, meaning they are not yet able to apply their knowledge and abilities to such problems. It is therefore important for mathematics teachers to plan probabilistic learning that can optimize students' probabilistic thinking ability.

  14. Probabilistic Flood Defence Assessment Tools

    Directory of Open Access Journals (Sweden)

    Slomp Robert

    2016-01-01

    institutions managing the flood defences, and not by just a small number of experts in probabilistic assessment. Therefore, data management and use of software are main issues that have been covered in courses and training in 2016 and 2017. All in all, this is the largest change in the assessment of Dutch flood defences since 1996. In 1996, probabilistic techniques were first introduced to determine hydraulic boundary conditions (water levels and waves (wave height, wave period and direction)) for different return periods. To simplify the process, the assessment continues to consist of a three-step approach, moving from simple decision rules, to methods for semi-probabilistic assessment, and finally to a fully probabilistic analysis to compare the strength of flood defences with the hydraulic loads. The formal assessment results are thus mainly based on the fully probabilistic analysis and the ultimate limit state of the strength of a flood defence. For complex flood defences, additional models and software were developed. The current Hydra software suite (for policy analysis, formal flood defence assessment and design) will be replaced by the model Ringtoets. New stand-alone software has been developed for revetments, geotechnical analysis and slope stability of the foreshore. Design software and policy analysis software, including the Delta model, will be updated in 2018. A fully probabilistic method results in more precise assessments and more transparency in the process of assessment and reconstruction of flood defences. This is of increasing importance, as large-scale infrastructural projects in a highly urbanized environment are increasingly subject to political and societal pressure to add additional features. For this reason, it is of increasing importance to be able to determine which new feature really adds to flood protection, to quantify how much it adds to the level of flood protection and to evaluate whether it is really worthwhile. Please note: The Netherlands

  15. Anatomical landmarks of radical prostatectomy.

    Science.gov (United States)

    Stolzenburg, Jens-Uwe; Schwalenberg, Thilo; Horn, Lars-Christian; Neuhaus, Jochen; Constantinides, Costantinos; Liatsikos, Evangelos N

    2007-03-01

    In the present study, we review the current literature and, based on our experience, present the anatomical landmarks of open and laparoscopic/endoscopic radical prostatectomy. A thorough literature search was performed with the Medline database on the anatomy and the nomenclature of the structures surrounding the prostate gland. Correct handling of the puboprostatic ligaments, external urethral sphincter, prostatic fascias and neurovascular bundle is necessary to avoid malfunction of the urogenital system after radical prostatectomy. When evaluating new prostatectomy techniques, we should always take into account both clinical and final oncological outcomes. The present review adds further knowledge to the existing "postprostatectomy anatomical hazard" debate. It emphasizes the role of the puboprostatic ligaments and the course of the external urethral sphincter in urinary continence. When performing an intrafascial nerve-sparing prostatectomy, most urologists tend to approach as close to the prostatic capsule as possible, even though there is no consensus regarding the nomenclature of the surrounding fascias and the course of the actual neurovascular bundles. After completion of an intrafascial technique, the specimen does not contain any periprostatic tissue, and thus the detection of pT3a disease is not feasible. This becomes especially problematic if the tumour reaches the resection margin. Nerve-sparing open and laparoscopic radical prostatectomy should aim to maintain sexual function and recuperate early continence after surgery, without compromising the final oncological outcome of the procedure. Despite the different approaches to radical prostatectomy, the key to better results is an understanding of the anatomy of the bladder neck and the urethra.

  16. Probabilistic earthquake hazard analysis for Cairo, Egypt

    Science.gov (United States)

    Badawy, Ahmed; Korrat, Ibrahim; El-Hadidy, Mahmoud; Gaber, Hanan

    2016-04-01

    Cairo is the capital of Egypt, the largest city in the Arab world and Africa, and the sixteenth largest metropolitan area in the world. It was founded in the tenth century (969 AD) and is 1046 years old. It has long been a center of the region's political and cultural life; earthquake risk assessment for Cairo is therefore of great importance. The present work aims to analyze the earthquake hazard of Cairo as a key input element for risk assessment. The regional seismotectonic setting shows that Cairo could be affected by both far- and near-field seismic sources. The seismic hazard of Cairo has been estimated using the probabilistic seismic hazard approach. A logic tree framework was used during the calculations, and epistemic uncertainties were taken into account by using alternative seismotectonic models and alternative ground motion prediction equations. Seismic hazard values have been estimated within a grid of 0.1° × 0.1° spacing for all of Cairo's districts at different spectral periods and four return periods (224, 615, 1230, and 4745 years). Moreover, the uniform hazard spectra have been calculated at the same return periods. The pattern of the contour maps shows that the highest values of peak ground acceleration are concentrated in the eastern districts (e.g., El Nozha) and the lowest values in the northern and western districts (e.g., El Sharabiya and El Khalifa).
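
    Under the Poisson assumption standard in probabilistic seismic hazard analysis, the quoted return periods map directly to exceedance probabilities for a chosen exposure time. A small sketch of that mapping (the 50-year exposure time is our illustrative choice):

      import numpy as np

      def exceedance_probability(annual_rate, years):
          # Poisson assumption: P(at least one exceedance in `years`).
          return 1.0 - np.exp(-annual_rate * years)

      # Return period = 1 / annual rate. For the return periods used in the
      # Cairo maps, the probability of exceedance in a 50-year exposure is:
      for rp in (224, 615, 1230, 4745):
          print(rp, "yr:", round(exceedance_probability(1.0 / rp, 50.0), 4))
      # 224 yr -> ~20% in 50 yr; 4745 yr -> ~1% in 50 yr.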

  17. Probabilistic forward model for electroencephalography source analysis

    International Nuclear Information System (INIS)

    Plis, Sergey M; George, John S; Jun, Sung C; Ranken, Doug M; Volegov, Petr L; Schmidt, David M

    2007-01-01

    Source localization by electroencephalography (EEG) requires an accurate model of head geometry and tissue conductivity. The estimation of source time courses from EEG or from EEG in conjunction with magnetoencephalography (MEG) requires a forward model consistent with true activity for the best outcome. Although MRI provides an excellent description of soft tissue anatomy, a high resolution model of the skull (the dominant resistive component of the head) requires CT, which is not justified for routine physiological studies. Although a number of techniques have been employed to estimate tissue conductivity, no present techniques provide the noninvasive 3D tomographic mapping of conductivity that would be desirable. We introduce a formalism for probabilistic forward modeling that allows the propagation of uncertainties in model parameters into possible errors in source localization. We consider uncertainties in the conductivity profile of the skull, but the approach is general and can be extended to other kinds of uncertainties in the forward model. We and others have previously suggested the possibility of extracting conductivity of the skull from measured electroencephalography data by simultaneously optimizing over dipole parameters and the conductivity values required by the forward model. Using Cramer-Rao bounds, we demonstrate that this approach does not improve localization results nor does it produce reliable conductivity estimates. We conclude that the conductivity of the skull has to be either accurately measured by an independent technique, or that the uncertainties in the conductivity values should be reflected in uncertainty in the source location estimates

  18. Aging in probabilistic safety assessment

    International Nuclear Information System (INIS)

    Jordan Cizelj, R.; Kozuh, M.

    1995-01-01

    Aging is a phenomenon that influences the unavailability of all components of the plant. The influence of aging on Probabilistic Safety Assessment calculations was estimated for the Electrical Power Supply System. The average increase of system unavailability due to aging of system components was estimated, and components were prioritized regarding their influence on the change of system unavailability and the relative increase of their unavailability due to aging. Following the analysis of the numerical results, a recommendation is given for detailed research into aging phenomena and their influence on system availability. (author)

  19. Probabilistic assessment of SGTR management

    International Nuclear Information System (INIS)

    Champ, M.; Cornille, Y.; Lanore, J.M.

    1989-04-01

    In the case of a steam generator tube rupture (SGTR) event in France, mitigation of the accident relies on operator intervention, through the application of a specific accident procedure. A detailed probabilistic analysis has been conducted which required assessment of the failure probability of the operator actions; for that purpose it was necessary to estimate the time available for the operator to apply the appropriate procedure for various sequences. The results indicate that, by taking into account the delays and the existence of adequate accident procedures, the risk is reduced to a reasonably low level.

  20. Probabilistic accident sequence recovery analysis

    International Nuclear Information System (INIS)

    Stutzke, Martin A.; Cooper, Susan E.

    2004-01-01

    Recovery analysis is a method that considers alternative strategies for preventing accidents in nuclear power plants during probabilistic risk assessment (PRA). Consideration of possible recovery actions in PRAs has been controversial, and there seems to be a widely held belief among PRA practitioners, utility staff, plant operators, and regulators that the results of recovery analysis should be skeptically viewed. This paper provides a framework for discussing recovery strategies, thus lending credibility to the process and enhancing regulatory acceptance of PRA results and conclusions. (author)

  1. Probabilistic risk assessment: Number 219

    International Nuclear Information System (INIS)

    Bari, R.A.

    1985-01-01

    This report describes a methodology for analyzing the safety of nuclear power plants. A historical overview of plants in the US is provided, and past, present, and future nuclear safety and risk assessment are discussed. A primer on nuclear power plants is provided with a discussion of pressurized water reactors (PWR) and boiling water reactors (BWR) and their operation and containment. Probabilistic Risk Assessment (PRA), utilizing both event-tree and fault-tree analysis, is discussed as a tool in reactor safety, decision making, and communications. (FI)

  2. Axiomatisation of fully probabilistic design

    Czech Academy of Sciences Publication Activity Database

    Kárný, Miroslav; Kroupa, Tomáš

    2012-01-01

    Vol. 186, No. 1 (2012), pp. 105-113. ISSN 0020-0255. R&D Projects: GA MŠk(CZ) 2C06001; GA ČR GA102/08/0567. Institutional research plan: CEZ:AV0Z10750506. Keywords: Bayesian decision making * Fully probabilistic design * Kullback–Leibler divergence * Unified decision making. Subject RIV: BB - Applied Statistics, Operational Research. Impact factor: 3.643, year: 2012. http://library.utia.cas.cz/separaty/2011/AS/karny-0367271.pdf

  3. Probabilistic Analysis of Crack Width

    Directory of Open Access Journals (Sweden)

    J. Marková

    2000-01-01

    Probabilistic analysis of the crack width of a reinforced concrete element is based on the formulas accepted in Eurocode 2 and European Model Code 90. The obtained values of the reliability index β seem to be satisfactory for a reinforced concrete slab that fulfils the crack-width requirements specified in Eurocode 2. However, the reliability of the slab appears insufficient when European Model Code 90 is considered; the reliability index is less than the recommended value of 1.5 for serviceability limit states indicated in Eurocode 1. Analysis of the sensitivity factors of the basic variables makes it possible to identify the variables that significantly affect the total crack width.

  4. Probabilistic approach to EMP assessment

    International Nuclear Information System (INIS)

    Bevensee, R.M.; Cabayan, H.S.; Deadrick, F.J.; Martin, L.C.; Mensing, R.W.

    1980-09-01

    The development of nuclear EMP hardness requirements must account for uncertainties in the environment, in interaction and coupling, and in the susceptibility of subsystems and components. Typical uncertainties of the last two kinds are briefly summarized, and an assessment methodology is outlined, based on a probabilistic approach that encompasses the basic concepts of reliability. It is suggested that statements of survivability be made compatible with system reliability. Validation of the approach taken for simple antenna/circuit systems is performed with experiments and calculations that involve a Transient Electromagnetic Range, numerical antenna modeling, separate device failure data, and a failure analysis computer program

  5. Probabilistic risk assessment, Volume I

    International Nuclear Information System (INIS)

    Anon.

    1982-01-01

    This book contains 158 papers presented at the International Topical Meeting on Probabilistic Risk Assessment held by the American Nuclear Society (ANS) and the European Nuclear Society (ENS) in Port Chester, New York in 1981. The meeting was second in a series of three. The main focus of the meeting was on the safety of light water reactors. The papers discuss safety goals and risk assessment. Quantitative safety goals, risk assessment in non-nuclear technologies, and operational experience and data base are also covered. Included is an address by Dr. Chauncey Starr

  6. Probabilistic safety analysis using microcomputer

    International Nuclear Information System (INIS)

    Futuro Filho, F.L.F.; Mendes, J.E.S.; Santos, M.J.P. dos

    1990-01-01

    The main steps in the execution of a Probabilistic Safety Assessment (PSA) are presented in this report, such as the study of the system description, the construction of event trees and fault trees, and the calculation of the overall unavailability of the systems. The use of microcomputers in performing some of these tasks is also presented, highlighting the main characteristics that software should have to perform the job adequately. A sample case of fault tree construction and calculation is presented, using the PSAPACK software, distributed by the IAEA (International Atomic Energy Agency) for training purposes. (author)

  7. Compression of Probabilistic XML Documents

    Science.gov (United States)

    Veldman, Irma; de Keijzer, Ander; van Keulen, Maurice

    Database techniques to store, query and manipulate data that contains uncertainty receive increasing research interest. Such UDBMSs can be classified according to their underlying data model: relational, XML, or RDF. We focus on uncertain XML DBMSs, with the Probabilistic XML model (PXML) of [10,9] as a representative example. The size of a PXML document is obviously a factor in performance. There are PXML-specific techniques to reduce the size, such as a push-down mechanism that produces equivalent but more compact PXML documents. It can only be applied, however, where possibilities are dependent. For normal XML documents there also exist several techniques for compressing a document. Since Probabilistic XML is (a special form of) normal XML, it might benefit from these methods even more. In this paper, we show that existing compression mechanisms can be combined with PXML-specific compression techniques. We also show that the best compression rates are obtained by combining a PXML-specific technique with a rather simple generic DAG-compression technique.

  8. Living probabilistic safety assessment (LPSA)

    International Nuclear Information System (INIS)

    1999-08-01

    Over the past few years many nuclear power plant organizations have performed probabilistic safety assessments (PSAs) to identify and understand key plant vulnerabilities. As a result of the availability of these PSA studies, there is a desire to use them to enhance plant safety and to operate the nuclear stations in the most efficient manner. PSA is an effective tool for this purpose as it assists plant management to target resources where the largest benefit to plant safety can be obtained. However, any PSA which is to be used in this way must have a credible and defensible basis. Thus, it is very important to have a high quality 'living PSA' accepted by the plant and the regulator. With this background in mind, the IAEA has prepared this report on Living Probabilistic Safety Assessment (LPSA) which addresses the updating, documentation, quality assurance, and management and organizational requirements for LPSA. Deficiencies in the areas addressed in this report would seriously reduce the adequacy of the LPSA as a tool to support decision making at NPPs. This report was reviewed by a working group during a Technical Committee Meeting on PSA Applications to Improve NPP Safety held in Madrid, Spain, from 23 to 27 February 1998

  9. Software for Probabilistic Risk Reduction

    Science.gov (United States)

    Hensley, Scott; Michel, Thierry; Madsen, Soren; Chapin, Elaine; Rodriguez, Ernesto

    2004-01-01

    A computer program implements a methodology, denoted probabilistic risk reduction, that is intended to aid in planning the development of complex software and/or hardware systems. This methodology integrates two complementary prior methodologies: (1) that of probabilistic risk assessment and (2) a risk-based planning methodology, implemented in a prior computer program known as Defect Detection and Prevention (DDP), in which multiple requirements and the beneficial effects of risk-mitigation actions are taken into account. The present methodology and the software are able to accommodate both process knowledge (notably of the efficacy of development practices) and product knowledge (notably of the logical structure of a system, the development of which one seeks to plan). Estimates of the costs and benefits of a planned development can be derived. Functional and non-functional aspects of software can be taken into account, and trades made among them. It becomes possible to optimize the planning process in the sense that it becomes possible to select the best suite of process steps and design choices to maximize the expectation of success while remaining within budget.

  10. Is Probabilistic Evidence a Source of Knowledge?

    Science.gov (United States)

    Friedman, Ori; Turri, John

    2015-01-01

    We report a series of experiments examining whether people ascribe knowledge for true beliefs based on probabilistic evidence. Participants were less likely to ascribe knowledge for beliefs based on probabilistic evidence than for beliefs based on perceptual evidence (Experiments 1 and 2A) or testimony providing causal information (Experiment 2B).…

  11. Probabilistic Cue Combination: Less Is More

    Science.gov (United States)

    Yurovsky, Daniel; Boyer, Ty W.; Smith, Linda B.; Yu, Chen

    2013-01-01

    Learning about the structure of the world requires learning probabilistic relationships: rules in which cues do not predict outcomes with certainty. However, in some cases, the ability to track probabilistic relationships is a handicap, leading adults to perform non-normatively in prediction tasks. For example, in the "dilution effect,"…

  12. Multiobjective optimal allocation problem with probabilistic non ...

    African Journals Online (AJOL)

    This paper considers the optimum compromise allocation in multivariate stratified sampling with a non-linear objective function and a probabilistic non-linear cost constraint. The probabilistic non-linear cost constraint is converted into an equivalent deterministic one by using chance-constrained programming. A numerical ...

  13. Probabilistic reasoning with graphical security models

    NARCIS (Netherlands)

    Kordy, Barbara; Pouly, Marc; Schweitzer, Patrick

    This work provides a computational framework for meaningful probabilistic evaluation of attack–defense scenarios involving dependent actions. We combine the graphical security modeling technique of attack–defense trees with probabilistic information expressed in terms of Bayesian networks. In order

  14. Probabilistic Geoacoustic Inversion in Complex Environments

    Science.gov (United States)

    2015-09-30

    Probabilistic Geoacoustic Inversion in Complex Environments. Jan Dettmer, School of Earth and Ocean Sciences, University of Victoria, Victoria, BC. ... long-range inversion methods can fail to provide sufficient resolution. For proper quantitative examination of variability, parameter uncertainty must ... The project aims to advance probabilistic geoacoustic inversion methods for complex ocean environments for a range of geoacoustic data types. The work is

  15. Application of probabilistic precipitation forecasts from a ...

    African Journals Online (AJOL)

    2014-02-14

    The aim of this paper is to investigate the increase in the lead-time of flash flood warnings of the SAFFG using probabilistic precipitation forecasts from a deterministic model. The procedure is applied to a real flash flood event and the ensemble-based ...

  16. Why do probabilistic finite element analysis ?

    CERN Document Server

    Thacker, Ben H

    2008-01-01

    The intention of this book is to provide an introduction to performing probabilistic finite element analysis. As a short guideline, the objective is to inform the reader of the use, benefits and issues associated with performing probabilistic finite element analysis without excessive theory or mathematical detail.

  17. Branching bisimulation congruence for probabilistic systems

    NARCIS (Netherlands)

    Trcka, N.; Georgievska, S.; Aldini, A.; Baier, C.

    2008-01-01

    The notion of branching bisimulation for the alternating model of probabilistic systems is not a congruence with respect to parallel composition. In this paper we first define another branching bisimulation in the more general model allowing consecutive probabilistic transitions, and we prove that

  18. Probabilistic Reversible Automata and Quantum Automata

    OpenAIRE

    Golovkins, Marats; Kravtsev, Maksim

    2002-01-01

    To study relationship between quantum finite automata and probabilistic finite automata, we introduce a notion of probabilistic reversible automata (PRA, or doubly stochastic automata). We find that there is a strong relationship between different possible models of PRA and corresponding models of quantum finite automata. We also propose a classification of reversible finite 1-way automata.

  19. Bisimulations meet PCTL equivalences for probabilistic automata

    DEFF Research Database (Denmark)

    Song, Lei; Zhang, Lijun; Godskesen, Jens Chr.

    2013-01-01

    Probabilistic automata (PAs) have been successfully applied in formal verification of concurrent and stochastic systems. Efficient model checking algorithms have been studied, where the most often used logics for expressing properties are based on probabilistic computation tree logic (PCTL) and its...

  20. Error Discounting in Probabilistic Category Learning

    Science.gov (United States)

    Craig, Stewart; Lewandowsky, Stephan; Little, Daniel R.

    2011-01-01

    The assumption in some current theories of probabilistic categorization is that people gradually attenuate their learning in response to unavoidable error. However, existing evidence for this error discounting is sparse and open to alternative interpretations. We report 2 probabilistic-categorization experiments in which we investigated error…

  1. Consideration of aging in probabilistic safety assessment

    International Nuclear Information System (INIS)

    Titina, B.; Cepin, M.

    2007-01-01

    Probabilistic safety assessment is a standardised tool for assessing the safety of nuclear power plants, complementing the deterministic safety analyses. Standard probabilistic models of safety equipment assume a constant component failure rate. Ageing of systems, structures and components can theoretically be included in a new, age-dependent probabilistic safety assessment, in which the failure rate generally becomes a function of age. New age-dependent probabilistic safety assessment models, which offer explicit calculation of ageing effects, are developed. Several groups of components are considered, each requiring its own model: e.g., operating components and stand-by components. The developed component-level models are inserted into the probabilistic safety assessment models so that ageing effects are evaluated for complete systems. The preliminary results show that the lack of data needed to consider ageing leads to highly uncertain models and, consequently, uncertain results. (author)
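
    One common way to make a failure rate age-dependent, consistent with the description above, is a linear model lambda(t) = lambda0 + a*t. The sketch below compares the mean unavailability of a periodically tested stand-by component with and without aging; all parameter values are hypothetical, and this is not the paper's specific model.

      import numpy as np

      lam0 = 1e-5   # base failure rate (per hour), hypothetical
      a = 1e-9      # aging slope (per hour^2), hypothetical
      T = 8760.0    # test interval of a stand-by component (hours)

      # Mean unavailability ~ time-average of the failure probability over the
      # test interval; with aging the cumulative hazard gains a t^2 term.
      t = np.linspace(0.0, T, 10_000)
      q_aging = 1.0 - np.exp(-(lam0 * t + 0.5 * a * t**2))
      q_const = 1.0 - np.exp(-lam0 * t)
      print("mean unavailability with aging:   ", q_aging.mean())
      print("mean unavailability, constant rate:", q_const.mean())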

  2. Structural reliability codes for probabilistic design

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager

    1997-01-01

    probabilistic code format has not only strong influence on the formal reliability measure, but also on the formal cost of failure to be associated if a design made to the target reliability level is considered to be optimal. In fact, the formal cost of failure can differ by several orders of magnitude between two different, but by and large equally justifiable, probabilistic code formats. Thus, the consequence is that a code format based on decision-theoretical concepts and formulated as an extension of a probabilistic code format must specify formal values to be used as costs of failure. A principle of prudence is suggested for guiding the choice of the reference probabilistic code format for constant reliability. In the author's opinion there is an urgent need for establishing a standard probabilistic reliability code. This paper presents some considerations that may be debatable, but nevertheless point...

  3. A unifying probabilistic Bayesian approach to derive electron density from MRI for radiation therapy treatment planning

    International Nuclear Information System (INIS)

    Gudur, Madhu Sudhan Reddy; Hara, Wendy; Le, Quynh-Thu; Wang, Lei; Xing, Lei; Li, Ruijiang

    2014-01-01

    MRI significantly improves the accuracy and reliability of target delineation in radiation therapy for certain tumors due to its superior soft tissue contrast compared to CT. A treatment planning process with MRI as the sole imaging modality will eliminate systematic CT/MRI co-registration errors, reduce cost and radiation exposure, and simplify clinical workflow. However, MRI lacks the key electron density information necessary for accurate dose calculation and generating reference images for patient setup. The purpose of this work is to develop a unifying method to derive electron density from standard T1-weighted MRI. We propose to combine both intensity and geometry information into a unifying probabilistic Bayesian framework for electron density mapping. For each voxel, we compute two conditional probability density functions (PDFs) of electron density given its: (1) T1-weighted MRI intensity, and (2) geometry in a reference anatomy, obtained by deformable image registration between the MRI of the atlas and test patient. The two conditional PDFs containing intensity and geometry information are combined into a unifying posterior PDF, whose mean value corresponds to the optimal electron density value under the mean-square error criterion. We evaluated the algorithm's accuracy of electron density mapping and its ability to detect bone in the head for eight patients, using an additional patient as the atlas or template. Mean absolute HU error between the estimated and true CT, as well as receiver operating characteristics for bone detection (HU > 200), were calculated. The performance was compared with a global intensity approach based on T1 and no density correction (set whole head to water). The proposed technique significantly reduced the errors in electron density estimation, with a mean absolute HU error of 126, compared with 139 for deformable registration (p = 2 × 10⁻⁴), 283 for the intensity approach (p = 2 × 10⁻⁶) and 282
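
    The fusion step has a simple closed form when both conditional PDFs are approximated as Gaussians: the posterior is their normalized product and the MMSE estimate is the posterior mean. This is a hedged simplification of the paper's framework, not its actual PDFs; the per-voxel values below are hypothetical.

      import numpy as np

      def fuse_gaussians(mu1, var1, mu2, var2):
          """Normalized product of two Gaussian PDFs: a simple stand-in for
          combining intensity-based and geometry-based beliefs about density."""
          var = 1.0 / (1.0 / var1 + 1.0 / var2)
          mu = var * (mu1 / var1 + mu2 / var2)
          return mu, var

      # Hypothetical per-voxel beliefs about electron density expressed in HU:
      mu_intensity, var_intensity = 300.0, 200.0**2   # from T1 intensity (broad)
      mu_geometry, var_geometry = 900.0, 150.0**2     # from deformable atlas match

      mu_post, var_post = fuse_gaussians(mu_intensity, var_intensity,
                                         mu_geometry, var_geometry)
      # Under a mean-square-error criterion the optimal estimate is the
      # posterior mean:
      print("MMSE estimate (HU):", mu_post)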

  4. Anatomical and palynological characteristics of Salvia willeana ...

    African Journals Online (AJOL)

    In this study, anatomical and palynological features of the roots, stems, petioles and leaves of Salvia willeana (Holmboe) Hedge and Salvia veneris Hedge, Salvia species endemic to Cyprus, were investigated. Among the anatomical characteristics of the stem, it was found that the chlorenchyma was composed of 6 or 7 rows of ...

  5. Structural and functional properties of a probabilistic model of neuronal connectivity in a simple locomotor network

    Science.gov (United States)

    Merrison-Hort, Robert; Soffe, Stephen R; Borisyuk, Roman

    2018-01-01

    Although, in most animals, brain connectivity varies between individuals, behaviour is often similar across a species. What fundamental structural properties are shared across individual networks that define this behaviour? We describe a probabilistic model of connectivity in the hatchling Xenopus tadpole spinal cord which, when combined with a spiking model, reliably produces rhythmic activity corresponding to swimming. The probabilistic model allows calculation of structural characteristics that reflect common network properties, independent of individual network realisations. We use the structural characteristics to study examples of neuronal dynamics, in the complete network and various sub-networks, and this allows us to explain the basis for key experimental findings, and make predictions for experiments. We also study how structural and functional features differ between detailed anatomical connectomes and those generated by our new, simpler, model (meta-model). PMID:29589828
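
    The core idea, a model that specifies connection probabilities rather than a fixed connectome, can be sketched in a few lines. Here each realisation is drawn from a Bernoulli adjacency model with a uniform connection probability; the sizes and probabilities are illustrative stand-ins for the paper's anatomically derived values.

      import numpy as np

      rng = np.random.default_rng(7)

      def sample_connectome(p_connect):
          """Draw one network realisation from a matrix of pairwise
          connection probabilities."""
          n = p_connect.shape[0]
          return rng.random((n, n)) < p_connect

      n_neurons = 50
      p = np.full((n_neurons, n_neurons), 0.1)   # hypothetical uniform probability
      np.fill_diagonal(p, 0.0)

      # Structural characteristics can be averaged over many realisations,
      # independent of any single network instance:
      in_degrees = np.array([sample_connectome(p).sum(axis=0) for _ in range(1000)])
      print("expected in-degree:", in_degrees.mean())   # ~ 0.1 * (n_neurons - 1)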

  6. Probabilistic mapping of deep brain stimulation effects in essential tremor

    Directory of Open Access Journals (Sweden)

    Till A Dembek

    2017-01-01

    Discussion: Our results support the assumption that the ZI might be a very effective target for tremor suppression. However, stimulation inside the ZI and in its close vicinity was also related to the occurrence of stimulation-induced side-effects, so it remains unclear whether the VIM or the ZI is the better overall target. The study demonstrates the use of PSMs for target selection and evaluation. While their accuracy has to be carefully discussed, they can improve the understanding of DBS effects and can be of use for other DBS targets in the therapy of neurological or psychiatric disorders as well. Furthermore, they provide a priori information about expected DBS effects in a certain region and might be helpful to clinicians in programming DBS devices in the future.

  7. Probabilistic landslide hazards and risk mapping on Penang Island ...

    Indian Academy of Sciences (India)

    This paper deals with landslide hazards and risk analysis of Penang Island, Malaysia using Geo- ... require a priori knowledge of the main causes of landslides ... [The remainder of this record is fragmentary table residue listing soil associations (Rengam-Bukit Temiang, Selangor-Kangkong, local alluvium) with area and landslide statistics.]

  8. Prefrontal-Thalamic Anatomical Connectivity and Executive Cognitive Function in Schizophrenia.

    Science.gov (United States)

    Giraldo-Chica, Monica; Rogers, Baxter P; Damon, Stephen M; Landman, Bennett A; Woodward, Neil D

    2018-03-15

    Executive cognitive functions, including working memory, cognitive flexibility, and inhibition, are impaired in schizophrenia. Executive functions rely on coordinated information processing between the prefrontal cortex (PFC) and thalamus, particularly the mediodorsal nucleus. This raises the possibility that anatomical connectivity between the PFC and mediodorsal thalamus may be 1) reduced in schizophrenia and 2) related to deficits in executive function. The current investigation tested these hypotheses. Forty-five healthy subjects and 62 patients with a schizophrenia spectrum disorder completed a battery of tests of executive function and underwent diffusion-weighted imaging. Probabilistic tractography was used to quantify anatomical connectivity between six cortical regions, including PFC, and the thalamus. Thalamocortical anatomical connectivity was compared between healthy subjects and patients with schizophrenia using region-of-interest and voxelwise approaches, and the association between PFC-thalamic anatomical connectivity and severity of executive function impairment was examined in patients. Anatomical connectivity between the thalamus and PFC was reduced in schizophrenia. Voxelwise analysis localized the reduction to areas of the mediodorsal thalamus connected to lateral PFC. Reduced PFC-thalamic connectivity in schizophrenia correlated with impaired working memory but not cognitive flexibility and inhibition. In contrast to reduced PFC-thalamic connectivity, thalamic connectivity with somatosensory and occipital cortices was increased in schizophrenia. The results are consistent with models implicating disrupted PFC-thalamic connectivity in the pathophysiology of schizophrenia and mechanisms of cognitive impairment. PFC-thalamic anatomical connectivity may be an important target for procognitive interventions. Further work is needed to determine the implications of increased thalamic connectivity with sensory cortex.

  9. Anatomic variables affecting interdental papilla

    Directory of Open Access Journals (Sweden)

    Swapna A. Mahale

    2013-01-01

    Aim: The aim of this study is to evaluate the anatomic variables affecting the interdental papilla. Materials and Methods: Thirty adult patients were evaluated. Papilla score (PS), tooth form/shape, gingival thickness, crest bone height and keratinized/attached gingiva were recorded for 150 interproximal sites. Data were analyzed using the SPSS software package (version 7.0), and the significance level was set at a 95% confidence interval. Pearson's correlation was applied to assess the relationship between these factors and the appearance of the papilla. Results: Competent papillae (complete fill interdentally) were associated with: (1) crown width (CW):length ratio ≥0.87; (2) bone crest-contact point distance ≤5 mm; and (3) interproximal gingival tissue thickness ≥1.5 mm. Gingival thickness correlated negatively with PS (r = −0.37 to −0.54) and positively with tissue height (r = 0.23-0.43). Tooth form (i.e., CW-to-length ratio) correlated negatively with PS (r = −0.37 to −0.61). Conclusion: Gingival papilla appearance was associated significantly with tooth form/shape, crestal bone height and interproximal gingival thickness.

  10. Efficient Sensor Placement Optimization Using Gradient Descent and Probabilistic Coverage

    Directory of Open Access Journals (Sweden)

    Vahab Akbarzadeh

    2014-08-01

    We propose an adaptation of the gradient descent method to optimize the position and orientation of sensors for the sensor placement problem. The novelty of the proposed method lies in the combination of gradient descent optimization with a realistic model, which considers both the topography of the environment and a set of sensors with directional probabilistic sensing. The performance of this approach is compared with two other black-box optimization methods over area coverage and processing time. Results show that our proposed method produces competitive results on smaller maps and superior results on larger maps, while requiring much less computation than the other optimization methods to which it has been compared.
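
    A stripped-down version of the approach can be sketched as follows: a smooth probabilistic detection model makes area coverage differentiable, so sensor positions can be improved by gradient steps. This toy version is isotropic and flat (no sensor orientation or topography, unlike the paper's model) and uses finite differences for brevity; all parameter values are assumptions.

      import numpy as np

      rng = np.random.default_rng(3)
      targets = rng.uniform(0, 100, size=(200, 2))   # points to cover (hypothetical)
      sensors = rng.uniform(0, 100, size=(5, 2))     # initial sensor positions

      def coverage(sensors, targets, r=20.0, k=0.3):
          # Smooth probabilistic sensing model: detection probability decays
          # with distance; a point is covered unless missed by every sensor.
          d = np.linalg.norm(targets[None, :, :] - sensors[:, None, :], axis=2)
          p_detect = 1.0 / (1.0 + np.exp(k * (d - r)))
          p_missed = np.prod(1.0 - p_detect, axis=0)
          return np.mean(1.0 - p_missed)

      # Gradient ascent on coverage via finite differences (an analytic
      # gradient is straightforward for this smooth model).
      eps, lr = 1e-4, 50.0
      for step in range(200):
          grad = np.zeros_like(sensors)
          base = coverage(sensors, targets)
          for i in np.ndindex(*sensors.shape):
              pert = sensors.copy()
              pert[i] += eps
              grad[i] = (coverage(pert, targets) - base) / eps
          sensors += lr * grad
      print("final mean coverage:", coverage(sensors, targets))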

  11. SPA: a probabilistic algorithm for spliced alignment.

    Directory of Open Access Journals (Sweden)

    2006-04-01

    Recent large-scale cDNA sequencing efforts show that elaborate patterns of splice variation are responsible for much of the proteome diversity in higher eukaryotes. To obtain an accurate account of the repertoire of splice variants, and to gain insight into the mechanisms of alternative splicing, it is essential that cDNAs are very accurately mapped to their respective genomes. Currently available algorithms for cDNA-to-genome alignment do not reach the necessary level of accuracy because they use ad hoc scoring models that cannot correctly trade off the likelihoods of various sequencing errors against the probabilities of different gene structures. Here we develop a Bayesian probabilistic approach to cDNA-to-genome alignment. Gene structures are assigned prior probabilities based on the lengths of their introns and exons, and based on the sequences at their splice boundaries. A likelihood model for sequencing errors takes into account the rates at which misincorporation, as well as insertions and deletions of different lengths, occurs during sequencing. The parameters of both the prior and likelihood model can be automatically estimated from a set of cDNAs, thus enabling our method to adapt itself to different organisms and experimental procedures. We implemented our method in a fast cDNA-to-genome alignment program, SPA, and applied it to the FANTOM3 dataset of over 100,000 full-length mouse cDNAs and a dataset of over 20,000 full-length human cDNAs. Comparison with the results of four other mapping programs shows that SPA produces alignments of significantly higher quality. In particular, the quality of the SPA alignments near splice boundaries and SPA's mapping of the 5' and 3' ends of the cDNAs are highly improved, allowing for more accurate identification of transcript starts and ends, and accurate identification of subtle splice variations. Finally, our splice boundary analysis on the human dataset suggests the existence of a novel non
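
    The heart of such a Bayesian scorer is the trade-off between gene-structure priors and the sequencing-error likelihood. A toy illustration of that trade-off (all log-probabilities are hypothetical, and SPA's actual models are far richer):

      import math

      def log_posterior(intron_logprior, splice_logprior, n_mismatches,
                        p_mismatch=0.01):
          # Posterior score = structure prior + sequencing-error likelihood.
          return (intron_logprior + splice_logprior
                  + n_mismatches * math.log(p_mismatch))

      # Candidate A: canonical splice site (stronger prior), 3 mismatches.
      # Candidate B: non-canonical site (weaker prior), 1 mismatch.
      a = log_posterior(-2.0, -1.0, 3)
      b = log_posterior(-2.0, -8.0, 1)
      # Here the lower error count outweighs the weaker splice-site prior.
      print("choose", "A" if a > b else "B", round(a, 2), round(b, 2))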

  12. Probabilistic Survivability Versus Time Modeling

    Science.gov (United States)

    Joyner, James J., Sr.

    2016-01-01

    This presentation documents Kennedy Space Center's Independent Assessment work completed on three assessments for the Ground Systems Development and Operations (GSDO) Program to assist the Chief Safety and Mission Assurance Officer during key programmatic reviews and provided the GSDO Program with analyses of how egress time affects the likelihood of astronaut and ground worker survival during an emergency. For each assessment, a team developed probability distributions for hazard scenarios to address statistical uncertainty, resulting in survivability plots over time. The first assessment developed a mathematical model of probabilistic survivability versus time to reach a safe location using an ideal Emergency Egress System at Launch Complex 39B (LC-39B); the second used the first model to evaluate and compare various egress systems under consideration at LC-39B. The third used a modified LC-39B model to determine if a specific hazard decreased survivability more rapidly than other events during flight hardware processing in Kennedy's Vehicle Assembly Building.
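
    Survivability-versus-time curves of this kind compare the distribution of the time needed to egress against the distribution of the time until conditions become non-survivable. A hedged Monte Carlo sketch with entirely hypothetical distributions (the assessments' actual hazard models are not public in this record):

      import numpy as np

      rng = np.random.default_rng(11)
      n = 100_000
      # Hypothetical distributions: time needed to reach a safe location, and
      # time until conditions become non-survivable (both in seconds).
      egress_time = rng.lognormal(mean=np.log(120), sigma=0.3, size=n)
      time_to_fatal = rng.lognormal(mean=np.log(300), sigma=0.5, size=n)
      print("P(survive):", np.mean(egress_time < time_to_fatal))
      # Survivability versus allotted egress time t: P(time_to_fatal > t).
      for t in (60, 120, 240, 480):
          print(t, "s:", np.mean(time_to_fatal > t))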

  13. Probabilistic cloning with supplementary information

    International Nuclear Information System (INIS)

    Azuma, Koji; Shimamura, Junichi; Koashi, Masato; Imoto, Nobuyuki

    2005-01-01

    We consider probabilistic cloning of a state chosen from a mutually nonorthogonal set of pure states, with the help of a party holding supplementary information in the form of pure states. When the number of states is 2, we show that the best efficiency of producing m copies is always achieved by a two-step protocol in which the helping party first attempts to produce m-1 copies from the supplementary state, and if it fails, then the original state is used to produce m copies. On the other hand, when the number of states exceeds two, the best efficiency is not always achieved by such a protocol. We give examples in which the best efficiency is not achieved even if we allow any amount of one-way classical communication from the helping party

  14. Machine learning a probabilistic perspective

    CERN Document Server

    Murphy, Kevin P

    2012-01-01

    Today's Web-enabled deluge of electronic data calls for automated methods of data analysis. Machine learning provides these, developing methods that can automatically detect patterns in data and then use the uncovered patterns to predict future data. This textbook offers a comprehensive and self-contained introduction to the field of machine learning, based on a unified, probabilistic approach. The coverage combines breadth and depth, offering necessary background material on such topics as probability, optimization, and linear algebra as well as discussion of recent developments in the field, including conditional random fields, L1 regularization, and deep learning. The book is written in an informal, accessible style, complete with pseudo-code for the most important algorithms. All topics are copiously illustrated with color images and worked examples drawn from such application domains as biology, text processing, computer vision, and robotics. Rather than providing a cookbook of different heuristic method...

  15. Probabilistic analysis of modernization options

    International Nuclear Information System (INIS)

    Wunderlich, W.O.; Giles, J.E.

    1991-01-01

    This paper reports on benefit-cost analysis for hydropower operations, a standard procedure for reaching planning decisions. Cost overruns and benefit shortfalls are also common occurrences. One reason for the difficulty of predicting future benefits and costs is that they usually cannot be represented with sufficient reliability by single point values, because of the many uncertainties that enter the analysis through assumptions on inputs and system parameters. Therefore, ranges of variables need to be analyzed instead of single values. As a consequence, the decision criteria, such as net benefit and benefit-cost ratio, also vary over some range. A probabilistic approach will be demonstrated as a tool for assessing the reliability of the results

  16. Probabilistic assessments of fuel performance

    International Nuclear Information System (INIS)

    Kelppe, S.; Ranta-Puska, K.

    1998-01-01

    The probabilistic Monte Carlo method, coupled with quasi-random sampling, is applied to the fuel performance analyses. By using known distributions of fabrication parameters and real power histories, with their randomly selected combinations, and by making a large number of ENIGMA code calculations, one expects to find out the state of the whole reactor fuel. Good statistics require thousands of runs. A sample case representing VVER-440 reactor fuel indicates relatively low fuel temperatures and mainly athermal fission gas release, if any. The rod internal pressure remains typically below 2.5 MPa, which leaves a large margin to the system pressure of 12 MPa. Gap conductance, an essential parameter in the accident evaluations, shows no decrease from its start-of-life value. (orig.)

  17. Probabilistic Fatigue Damage Program (FATIG)

    Science.gov (United States)

    Michalopoulos, Constantine

    2012-01-01

    FATIG computes fatigue damage/fatigue life using the stress rms (root mean square) value, the total number of cycles, and S-N curve parameters. The damage is computed by the following methods: (a) the traditional method using Miner's rule, with stress cycles determined from a Rayleigh distribution up to 3*sigma; and (b) the classical fatigue damage formula involving the Gamma function, which is derived from the integral version of Miner's rule. The integration is carried out over all stress amplitudes. This software solves the problem of probabilistic fatigue damage using the integral form of the Palmgren-Miner rule. The software computes fatigue life using an approach involving all stress amplitudes, up to N*sigma, as specified by the user. It can be used in the design of structural components subjected to random dynamic loading, or by any stress analyst with minimal training for fatigue life estimates of structural components.
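
    Method (b) has a well-known closed form: with an S-N curve N(S) = A*S^(-b) and Rayleigh-distributed stress amplitudes (narrow-band Gaussian stress of rms sigma), the integral form of Miner's rule gives D = (N_T/A) * (sqrt(2)*sigma)^b * Gamma(1 + b/2). A sketch of that formula with hypothetical parameter values (this is the standard result, not necessarily FATIG's exact implementation):

      import math

      def fatigue_damage_rayleigh(n_cycles, sigma_rms, A, b):
          """Closed-form Miner damage for Rayleigh-distributed amplitudes,
          assuming an S-N curve of the form N(S) = A * S**(-b)."""
          return ((n_cycles / A) * (math.sqrt(2) * sigma_rms) ** b
                  * math.gamma(1 + b / 2))

      # Hypothetical numbers: 1e7 cycles, 30 MPa rms, S-N parameters A and b.
      D = fatigue_damage_rayleigh(1e7, 30.0, A=1e17, b=5.0)
      print("damage:", D, "-> life fraction used; failure predicted at D >= 1")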

  18. Probabilistic cloning of equidistant states

    International Nuclear Information System (INIS)

    Jimenez, O.; Roa, Luis; Delgado, A.

    2010-01-01

    We study the probabilistic cloning of equidistant states. These states are such that the inner product between them is a complex constant or its conjugate. Thereby, it is possible to study their cloning in a simple way. In particular, we are interested in the behavior of the cloning probability as a function of the phase of the overlap among the involved states. We show that for certain families of equidistant states Duan and Guo's cloning machine leads to cloning probabilities lower than the optimal unambiguous discrimination probability of equidistant states. We propose an alternative cloning machine whose cloning probability is higher than or equal to the optimal unambiguous discrimination probability for any family of equidistant states. Both machines achieve the same probability for equidistant states whose inner product is a positive real number.

  19. Probabilistic safety assessment - regulatory perspective

    International Nuclear Information System (INIS)

    Solanki, R.B.; Paul, U.K.; Hajra, P.; Agarwal, S.K.

    2002-01-01

    Full text: Nuclear power plants (NPPs) have been designed, constructed and operated mainly based on a deterministic safety analysis philosophy. In this approach, a substantial amount of safety margin is incorporated in the design and operational requirements. Additional margin is incorporated by applying the highest-quality engineering codes, standards and practices, and the concept of defence-in-depth in design and operating procedures, by including conservative assumptions and acceptance criteria in plant response analysis of postulated initiating events (PIEs). However, as the probabilistic approach has been improved and refined over the years, it is possible for the designer, operator and regulator to get a more detailed and realistic picture of the safety importance of plant design features, operating procedures and operational practices by using probabilistic safety assessment (PSA) along with the deterministic methodology. At present, many countries including the USA, UK and France are using PSA insights in their decision making along with the deterministic basis. India has also made substantial progress in the development of methods for carrying out PSA. However, consensus on the use of PSA in regulatory decision-making has not yet been achieved. This paper emphasises the requirements (e.g., level of detail, key modelling assumptions, data, modelling aspects, success criteria, sensitivity and uncertainty analysis) for improving the quality and consistency in the performance and use of PSA, which can facilitate meaningful use of PSA insights in regulatory decision-making in India. This paper also provides relevant information on the international scenario and various application areas of PSA, along with the progress made in India. The PSA perspective presented in this paper may help in achieving consensus on the use of PSA for regulatory/utility decision-making in the design and operation of NPPs

  20. Mapping out Map Libraries

    Directory of Open Access Journals (Sweden)

    Ferjan Ormeling

    2008-09-01

    Discussing the requirements for map data quality, map users and their library/archives environment, the paper focuses on the metadata the user would need for a correct and efficient interpretation of the map data. For such a correct interpretation, knowledge of the rules and guidelines according to which the topographers/cartographers work (such as the kinds of data categories to be collected), and the degree to which these rules and guidelines were indeed followed, are essential. This is not only valid for the old maps stored in our libraries and archives, but perhaps even more so for the new digital files, as the format in which we now have to access our geospatial data. As this would be too much to ask from map librarians/curators, some sort of Web 2.0 environment is sought where comments about data quality, completeness and up-to-dateness from knowledgeable map users regarding the specific maps or map series studied can be collected and tagged to scanned versions of these maps on the web. In order not to be subject to the same disadvantages as Wikipedia, where the 'communis opinio', rather than scholarship, seems to be decisive, some checking by map curators of this tagged map-use information would still be needed. Cooperation to this end between map curators and the International Cartographic Association (ICA) map and spatial data use commission is suggested.

  1. Fast algorithm for probabilistic bone edge detection (FAPBED)

    Science.gov (United States)

    Scepanovic, Danilo; Kirshtein, Joshua; Jain, Ameet K.; Taylor, Russell H.

    2005-04-01

    The registration of preoperative CT to intra-operative reality systems is a crucial step in Computer Assisted Orthopedic Surgery (CAOS). The intra-operative sensors include 3D digitizers, fiducials, X-rays and Ultrasound (US). FAPBED is designed to process CT volumes for registration to tracked US data. Tracked US is advantageous because it is real time, noninvasive, and non-ionizing, but it is also known to have inherent inaccuracies which create the need to develop a framework that is robust to various uncertainties, and can be useful in US-CT registration. Furthermore, conventional registration methods depend on accurate and absolute segmentation. Our proposed probabilistic framework addresses the segmentation-registration duality, wherein exact segmentation is not a prerequisite to achieve accurate registration. In this paper, we develop a method for fast and automatic probabilistic bone surface (edge) detection in CT images. Various features that influence the likelihood of the surface at each spatial coordinate are combined using a simple probabilistic framework, which strikes a fair balance between a high-level understanding of features in an image and the low-level number crunching of standard image processing techniques. The algorithm evaluates different features for detecting the probability of a bone surface at each voxel, and compounds the results of these methods to yield a final, low-noise, probability map of bone surfaces in the volume. Such a probability map can then be used in conjunction with a similar map from tracked intra-operative US to achieve accurate registration. Eight sample pelvic CT scans were used to extract feature parameters and validate the final probability maps. An un-optimized fully automatic Matlab code runs in five minutes per CT volume on average, and was validated by comparison against hand-segmented gold standards. The mean probability assigned to nonzero surface points was 0.8, while nonzero non-surface points had a mean
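
    As a purely illustrative sketch of the fusion idea described above (the paper's actual features, parameters and weights are not given in this abstract), a few per-voxel feature likelihoods can be combined into a bone-surface probability map as follows; the intensity and gradient features, their parameters and the convex-combination fusion are assumptions, not the authors' method:

        import numpy as np

        def bone_surface_probability(volume, hu_center=300.0, hu_scale=100.0,
                                     w_intensity=0.5, w_gradient=0.5):
            """Fuse per-voxel features into a bone-surface probability map.

            volume: 3-D array of CT intensities (Hounsfield units). Both
            features and their weights are illustrative assumptions.
            """
            # Feature 1: intensity likelihood (bone is bright on CT).
            p_intensity = 1.0 / (1.0 + np.exp(-(volume - hu_center) / hu_scale))
            # Feature 2: gradient magnitude (surfaces sit on strong edges).
            g0, g1, g2 = np.gradient(volume.astype(float))
            grad_mag = np.sqrt(g0**2 + g1**2 + g2**2)
            p_gradient = 1.0 - np.exp(-grad_mag / (grad_mag.mean() + 1e-9))
            # Probabilistic fusion: convex combination of feature likelihoods.
            return w_intensity * p_intensity + w_gradient * p_gradient

        # Toy volume with a bright slab standing in for bone.
        vol = np.zeros((32, 32, 32))
        vol[12:20, :, :] = 800.0
        prob_map = bone_surface_probability(vol)
        print(prob_map.shape, round(float(prob_map.max()), 3))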

  2. Probabilistic machine learning and artificial intelligence.

    Science.gov (United States)

    Ghahramani, Zoubin

    2015-05-28

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.

  3. Probabilistic assessment of nuclear safety and safeguards

    International Nuclear Information System (INIS)

    Higson, D.J.

    1987-01-01

    Nuclear reactor accidents and diversions of materials from the nuclear fuel cycle are perceived by many people as particularly serious threats to society. Probabilistic assessment is a rational approach to the evaluation of both threats, and may provide a basis for decisions on appropriate actions to control them. Probabilistic methods have become standard tools in the analysis of safety, but there are disagreements on the criteria to be applied when assessing the results of analysis. Probabilistic analysis and assessment of the effectiveness of nuclear material safeguards are still at an early stage of development. (author)

  4. Integrated Deterministic-Probabilistic Safety Assessment Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Kudinov, P.; Vorobyev, Y.; Sanchez-Perea, M.; Queral, C.; Jimenez Varas, G.; Rebollo, M. J.; Mena, L.; Gomez-Magin, J.

    2014-02-01

    IDPSA (Integrated Deterministic-Probabilistic Safety Assessment) is a family of methods which use tightly coupled probabilistic and deterministic approaches to address their respective sources of uncertainty, enabling risk-informed decision-making in a consistent manner. The starting point of the IDPSA framework is that safety justification must be based on the coupling of deterministic (consequences) and probabilistic (frequency) considerations to address the mutual interactions between stochastic disturbances (e.g. failures of the equipment, human actions, stochastic physical phenomena) and the deterministic response of the plant (i.e. transients). This paper gives a general overview of some IDPSA methods as well as some possible applications to PWR safety analyses. (Author)

  5. A History of Probabilistic Inductive Logic Programming

    Directory of Open Access Journals (Sweden)

    Fabrizio eRiguzzi

    2014-09-01

    Full Text Available The field of Probabilistic Logic Programming (PLP) has seen significant advances in the last 20 years, with many proposals for languages that combine probability with logic programming. Since the start, the problem of learning probabilistic logic programs has been the focus of much attention. Learning these programs represents a whole subfield of Inductive Logic Programming (ILP). In Probabilistic ILP (PILP) two problems are considered: learning the parameters of a program given the structure (the rules), and learning both the structure and the parameters. Usually structure learning systems use parameter learning as a subroutine. In this article we present an overview of PILP and discuss the main results.

  6. PROBABILISTIC RELATIONAL MODELS OF COMPLETE IL-SEMIRINGS

    OpenAIRE

    Tsumagari, Norihiro

    2012-01-01

    This paper studies basic properties of probabilistic multirelations, which generalize the semantic domain of probabilistic systems, and then provides two probabilistic models of complete IL-semirings using probabilistic multirelations. It is also shown that these models need not be models of complete idempotent semirings.

  7. A convergence theory for probabilistic metric spaces | Jäger ...

    African Journals Online (AJOL)

    We develop a theory of probabilistic convergence spaces based on Tardiff's neighbourhood systems for probabilistic metric spaces. We show that the resulting category is a topological universe and we characterize a subcategory that is isomorphic to the category of probabilistic metric spaces. Keywords: Probabilistic metric ...

  8. Disjunctive Probabilistic Modal Logic is Enough for Bisimilarity on Reactive Probabilistic Systems

    OpenAIRE

    Bernardo, Marco; Miculan, Marino

    2016-01-01

    Larsen and Skou characterized probabilistic bisimilarity over reactive probabilistic systems with a logic including true, negation, conjunction, and a diamond modality decorated with a probabilistic lower bound. Later on, Desharnais, Edalat, and Panangaden showed that negation is not necessary to characterize the same equivalence. In this paper, we prove that the logical characterization holds also when conjunction is replaced by disjunction, with negation still being not necessary. To this e...

  9. On synchronous parallel computations with independent probabilistic choice

    International Nuclear Information System (INIS)

    Reif, J.H.

    1984-01-01

    This paper introduces probabilistic choice to synchronous parallel machine models, in particular parallel RAMs. The power of probabilistic choice in parallel computations is illustrated by parallelizing some known probabilistic sequential algorithms. The authors characterize the computational complexity of time-, space- and processor-bounded probabilistic parallel RAMs in terms of the computational complexity of probabilistic sequential RAMs. They show that parallelism uniformly speeds up time-bounded probabilistic sequential RAM computations by nearly a quadratic factor. They also show that probabilistic choice can be eliminated from parallel computations by introducing nonuniformity.

  10. Probabilistic Counterfactuals: Semantics, Computation, and Applications

    National Research Council Canada - National Science Library

    Balke, Alexander

    1997-01-01

    ... handled within the framework of standard probability theory. Starting with functional description of physical mechanisms, we were able to derive the standard probabilistic properties of Bayesian networks and to show: (1...

  11. Multiobjective optimal allocation problem with probabilistic non ...

    African Journals Online (AJOL)

    The probabilistic non-linear cost constraint is converted into an equivalent deterministic ... Further, in a survey the costs for enumerating a character in various strata are not known exactly; rather, these are being ... Naval Research Logistics, Vol.

  12. Strategic Team AI Path Plans: Probabilistic Pathfinding

    Directory of Open Access Journals (Sweden)

    Tng C. H. John

    2008-01-01

    Full Text Available This paper proposes a novel method to generate strategic team AI pathfinding plans for computer games and simulations using probabilistic pathfinding. The method is inspired by genetic algorithms (Russell and Norvig, 2002), in that a fitness function is used to test the quality of the path plans. The method generates high-quality path plans by eliminating the low-quality ones: the path plans are generated by probabilistic pathfinding, and the elimination is done by a fitness test of the path plans. This generation method is able to produce varied high-quality paths, which is desirable for games as it increases replay value. This work is an extension of our earlier work on team AI: probabilistic pathfinding (John et al., 2006). We explore ways to combine probabilistic pathfinding and genetic algorithms to create a new method for generating strategic team AI pathfinding plans.
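
    A minimal sketch of the generate-and-filter loop described above, assuming a grid world, a goal-biased random walk as the probabilistic pathfinder, and path length as the fitness function (none of which are taken from the paper):

        import random

        def random_path(start, goal, width, height, max_steps=200):
            """Probabilistic pathfinding: a goal-biased random walk on a grid."""
            path, (x, y) = [start], start
            for _ in range(max_steps):
                if (x, y) == goal:
                    return path
                moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
                if random.random() < 0.7:   # bias: usually step toward the goal
                    moves.sort(key=lambda m: abs(x + m[0] - goal[0])
                                             + abs(y + m[1] - goal[1]))
                    dx, dy = moves[0]
                else:                       # otherwise explore randomly
                    dx, dy = random.choice(moves)
                x = min(max(x + dx, 0), width - 1)
                y = min(max(y + dy, 0), height - 1)
                path.append((x, y))
            return None                     # walk failed to reach the goal

        def fitness(path):
            """Toy fitness: shorter is better; a game would add cover, threat, etc."""
            return -len(path)

        # Generate many candidate plans, eliminate low-quality ones, keep the best.
        candidates = [p for p in (random_path((0, 0), (9, 9), 10, 10)
                                  for _ in range(500)) if p]
        team_plans = sorted(candidates, key=fitness, reverse=True)[:4]
        print([len(p) for p in team_plans])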

  13. Probabilistic Meteorological Characterization for Turbine Loads

    DEFF Research Database (Denmark)

    Kelly, Mark C.; Larsen, Gunner Chr.; Dimitrov, Nikolay Krasimirov

    2014-01-01

    Beyond the existing, limited IEC prescription to describe fatigue loads on wind turbines, we look towards probabilistic characterization of the loads via analogous characterization of the atmospheric flow, particularly for today's "taller" turbines with rotors well above the atmospheric surface...

  14. Probabilistic composition of preferences, theory and applications

    CERN Document Server

    Parracho Sant'Anna, Annibal

    2015-01-01

    Putting forward a unified presentation of the features and possible applications of probabilistic preferences composition, and serving as a methodology for decisions employing multiple criteria, this book maximizes reader insights into evaluation in probabilistic terms and the development of composition approaches that do not depend on assigning weights to the criteria. With key applications in important areas of management such as failure modes and effects analysis and productivity analysis – together with explanations about the application of the concepts involved – this book makes available numerical examples of probabilistic transformation development and probabilistic composition. Useful not only as a reference source for researchers, but also in teaching classes of graduate courses in Production Engineering and Management Science, the key themes of the book will be of especial interest to researchers in the field of Operational Research.

  15. Advanced Test Reactor probabilistic risk assessment

    International Nuclear Information System (INIS)

    Atkinson, S.A.; Eide, S.A.; Khericha, S.T.; Thatcher, T.A.

    1993-01-01

    This report discusses a Level 1 probabilistic risk assessment (PRA), incorporating a full-scope external-events analysis, which has been completed for the Advanced Test Reactor (ATR) located at the Idaho National Engineering Laboratory.

  16. Probabilistic safety assessment for seismic events

    International Nuclear Information System (INIS)

    1993-10-01

    This Technical Document on Probabilistic Safety Assessment for Seismic Events is mainly associated with the Safety Practice on the Treatment of External Hazards in PSA, and discusses in detail one specific external hazard: earthquakes.

  17. Estimating software development project size, using probabilistic ...

    African Journals Online (AJOL)

    Estimating software development project size, using probabilistic techniques. ... of managing the size of software development projects by Purchasers (Clients) and Vendors (Development ...

  18. Comparing Categorical and Probabilistic Fingerprint Evidence.

    Science.gov (United States)

    Garrett, Brandon; Mitchell, Gregory; Scurich, Nicholas

    2018-04-23

    Fingerprint examiners traditionally express conclusions in categorical terms, opining that impressions do or do not originate from the same source. Recently, probabilistic conclusions have been proposed, with examiners estimating the probability of a match between recovered and known prints. This study presented a nationally representative sample of jury-eligible adults with a hypothetical robbery case in which an examiner opined on the likelihood that a defendant's fingerprints matched latent fingerprints in categorical or probabilistic terms. We studied model language developed by the U.S. Defense Forensic Science Center to summarize results of statistical analysis of the similarity between prints. Participant ratings of the likelihood the defendant left prints at the crime scene and committed the crime were similar when exposed to categorical and strong probabilistic match evidence. Participants reduced these likelihoods when exposed to the weaker probabilistic evidence, but did not otherwise discriminate among the prints assigned different match probabilities. © 2018 American Academy of Forensic Sciences.

  19. Probabilistic methods in exotic option pricing

    NARCIS (Netherlands)

    Anderluh, J.H.M.

    2007-01-01

    The thesis presents three ways of calculating the Parisian option price as an illustration of probabilistic methods in exotic option pricing. Moreover, options on commodities are considered, as well as double-sided barrier options in a compound Poisson framework.
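
    For orientation, a Parisian option is knocked in or out only once the underlying spends a continuous excursion of a given length beyond a barrier. The thesis develops probabilistic pricing methods; the sketch below is instead a crude Monte Carlo estimator for a hypothetical Parisian down-and-out call, with all contract and model parameters invented:

        import numpy as np

        def parisian_down_and_out_call(s0=100.0, strike=100.0, barrier=90.0,
                                       window=0.1, r=0.03, sigma=0.2, T=1.0,
                                       n_paths=20_000, n_steps=500, seed=1):
            """Crude Monte Carlo price of a Parisian down-and-out call: the
            option dies once the asset spends a continuous excursion longer
            than `window` (in years) below `barrier`. Time discretization
            biases the excursion clock, so this is an illustration only."""
            rng = np.random.default_rng(seed)
            dt = T / n_steps
            log_s = np.full(n_paths, np.log(s0))
            below_time = np.zeros(n_paths)        # current excursion length
            alive = np.ones(n_paths, dtype=bool)
            for _ in range(n_steps):
                z = rng.standard_normal(n_paths)
                log_s += (r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z
                below = log_s < np.log(barrier)
                below_time = np.where(below, below_time + dt, 0.0)
                alive &= below_time < window      # knocked out once the clock runs out
            payoff = np.where(alive, np.maximum(np.exp(log_s) - strike, 0.0), 0.0)
            return float(np.exp(-r * T) * payoff.mean())

        print(round(parisian_down_and_out_call(), 4))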

  1. Non-unitary probabilistic quantum computing

    Science.gov (United States)

    Gingrich, Robert M.; Williams, Colin P.

    2004-01-01

    We present a method for designing quantum circuits that perform non-unitary quantum computations on n-qubit states probabilistically, and give analytic expressions for the success probability and fidelity.
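
    The paper's circuit constructions are not reproduced in the abstract; the sketch below only illustrates the general mechanism such schemes rely on: embedding a non-unitary contraction A in a larger unitary (a Halmos dilation) and post-selecting on an ancilla, which applies A with some success probability. The example operator and input state are arbitrary:

        import numpy as np
        from scipy.linalg import sqrtm

        def dilation_unitary(A):
            """Halmos dilation of a contraction A (operator norm <= 1):
                U = [[A,  sqrt(I - A A^H)],
                     [sqrt(I - A^H A),  -A^H]]
            is unitary on a space of twice the dimension; measuring the
            ancilla in |0> then applies A probabilistically."""
            n = A.shape[0]
            B = sqrtm(np.eye(n) - A @ A.conj().T)
            C = sqrtm(np.eye(n) - A.conj().T @ A)
            U = np.block([[A, B], [C, -A.conj().T]])
            assert np.allclose(U.conj().T @ U, np.eye(2 * n), atol=1e-8)
            return U

        # Hypothetical non-unitary target: a single-qubit "damping" operator.
        A = np.array([[1.0, 0.0],
                      [0.0, 0.5]])
        U = dilation_unitary(A)

        psi = np.array([1.0, 1.0]) / np.sqrt(2)        # input state
        full = U @ np.concatenate([psi, np.zeros(2)])  # ancilla starts in |0>
        branch = full[:2]                              # keep the |0> ancilla branch
        p_success = float(np.vdot(branch, branch).real)
        print("success probability:", round(p_success, 4))
        print("normalized output:", branch / np.sqrt(p_success))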

  2. A logic for inductive probabilistic reasoning

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2005-01-01

    Inductive probabilistic reasoning is understood as the application of inference patterns that use statistical background information to assign (subjective) probabilities to single events. The simplest such inference pattern is direct inference: from "70% of As are Bs" and "a is an A" infer that a is a B with probability 0.7. Direct inference is generalized by Jeffrey's rule and the principle of cross-entropy minimization. To adequately formalize inductive probabilistic reasoning is an interesting topic for artificial intelligence, as an autonomous system acting in a complex environment may have to base its actions on a probabilistic model of its environment, and the probabilities needed to form this model can often be obtained by combining statistical background information with particular observations made, i.e., by inductive probabilistic reasoning. In this paper a formal framework...

  3. Do probabilistic forecasts lead to better decisions?

    Directory of Open Access Journals (Sweden)

    M. H. Ramos

    2013-06-01

    Full Text Available The last decade has seen growing research in producing probabilistic hydro-meteorological forecasts and increasing their reliability. This followed the promise that, supplied with information about uncertainty, people would take better risk-based decisions. In recent years, therefore, research and operational developments have also started focusing attention on ways of communicating the probabilistic forecasts to decision-makers. Communicating probabilistic forecasts includes preparing tools and products for visualisation, but also requires understanding how decision-makers perceive and use uncertainty information in real time. At the EGU General Assembly 2012, we conducted a laboratory-style experiment in which several cases of flood forecasts and a choice of actions to take were presented as part of a game to participants, who acted as decision-makers. Answers were collected and analysed. In this paper, we present the results of this exercise and discuss if we indeed make better decisions on the basis of probabilistic forecasts.

  4. Anatomic location and somatotopic arrangement of the corticospinal tract at the cerebral peduncle in the human brain.

    Science.gov (United States)

    Kwon, H G; Hong, J H; Jang, S H

    2011-12-01

    Little is known about the detailed anatomic location and somatotopic arrangement of the corticospinal tract (CST) at the cerebral peduncle (CP). Using diffusion tensor tractography (DTT) with FSL tools, we investigated the anatomic location and somatotopic arrangement of the CST at the CP in the human brain. We recruited 43 healthy volunteers for this study. DTI was obtained at 1.5T, and CSTs for the hand and leg were obtained using the FSL tools. The somatotopic location of the CST was evaluated as the highest probabilistic location at the upper and lower midbrain. The posterior boundary was determined as the line between the interpeduncular fossa and the lateral sulcus; we then drew a rectangle on the basis of the boundary of the CP. In the mediolateral direction, the highest probabilistic locations for the hand and leg were an average of 60.46% and 69.98% from the medial boundary at the upper midbrain level and 53.44% and 62.76% at the lower midbrain level, respectively. As for the anteroposterior direction, the highest probabilistic locations for the hand and leg were an average of 28.26% and 32.03% from the anterior boundary at the upper midbrain level and 30.19% and 33.59% at the lower midbrain level, respectively. We found that the hand somatotopy for the CST is located at the middle portion of the CP and the leg somatotopy is located lateral to the hand somatotopy.

  5. Risk assessment using probabilistic standards

    International Nuclear Information System (INIS)

    Avila, R.

    2004-01-01

    A core element of risk is uncertainty, represented by plural outcomes and their likelihood. No risk exists if the future outcome is uniquely known and hence guaranteed. The probability that we will die some day is equal to 1, so there would be no fatal risk if a sufficiently long time frame is assumed. Equally, rain risk does not exist if there were 100% assurance of rain tomorrow, although there would be other risks induced by the rain. In a formal sense, a risk exists if, and only if, more than one outcome is expected at a future time interval. In any practical risk assessment we have to deal with the uncertainties associated with the possible outcomes. One way of dealing with the uncertainties is to be conservative in the assessments. For example, we may compare the maximal exposure to a radionuclide with a conservatively chosen reference value. In this case, if the exposure is below the reference value then it is possible to assure that the risk is low. Since single values are usually compared, this approach is commonly called 'deterministic'. Its main advantage lies in its simplicity and in that it requires minimum information. However, problems arise when the reference values are actually exceeded or might be exceeded, as in the case of potential exposures, and when the costs of realizing the reference values are high. In those cases, the lack of knowledge of the degree of conservatism involved impairs a rational weighing of the risks against other interests. In this presentation we will outline an approach for dealing with uncertainties that in our opinion is more consistent. We will call it a 'fully probabilistic risk assessment'. The essence of this approach consists in measuring the risk in terms of probabilities, where the latter are obtained from a comparison of two probabilistic distributions, one reflecting the uncertainties in the outcomes and one reflecting the uncertainties in the reference value (standard) used for defining adverse outcomes. Our first aim
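
    Although the abstract is cut off, the stated essence of the fully probabilistic approach, a risk measure obtained by comparing a distribution of outcomes with a distribution over the standard itself, can be sketched in a few lines; the lognormal shapes and all parameters are invented for illustration:

        import numpy as np

        rng = np.random.default_rng(42)
        n = 100_000

        # Uncertainty in the outcome, e.g. annual exposure to a radionuclide (mSv).
        exposure = rng.lognormal(mean=np.log(0.3), sigma=0.8, size=n)

        # Uncertainty in the reference value itself: a "probabilistic standard".
        standard = rng.lognormal(mean=np.log(1.0), sigma=0.3, size=n)

        # Fully probabilistic risk measure: probability that the outcome
        # exceeds the standard, with both treated as random variables.
        print(f"P(exposure > standard) = {np.mean(exposure > standard):.4f}")

        # Deterministic-style check, for contrast: point value vs. fixed reference.
        print("max exposure below fixed reference:", exposure.max() < 1.0)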

  6. User perception and interpretation of tornado probabilistic hazard information: Comparison of four graphical designs.

    Science.gov (United States)

    Miran, Seyed M; Ling, Chen; James, Joseph J; Gerard, Alan; Rothfusz, Lans

    2017-11-01

    Effective design for presenting severe weather information is important to reduce the devastating consequences of severe weather. The Probabilistic Hazard Information (PHI) system for severe weather is being developed by the NOAA National Severe Storms Laboratory (NSSL) to communicate probabilistic hazardous weather information. This study investigates the effects of four PHI graphical designs for tornado threat, namely "four-color", "red-scale", "grayscale" and "contour", on users' perception, interpretation of, and reaction to threat information. PHI is presented on either a map background or a radar background. Analysis showed that accuracy was significantly higher and response time faster when PHI was displayed on the map background as compared to the radar background, due to better contrast. When displayed on a radar background, the "grayscale" design resulted in a higher accuracy of responses. Possibly due to familiarity, participants reported the four-color design as their favorite, and it also resulted in the fastest recognition of probability levels on both backgrounds. Our study shows the importance of using intuitive color-coding and sufficient contrast in conveying probabilistic threat information via graphical design. We also found that users follow a rational perceiving-judging-feeling-acting approach in processing probabilistic hazard information for tornadoes. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Learning probabilistic features for robotic navigation using laser sensors.

    Science.gov (United States)

    Aznar, Fidel; Pujol, Francisco A; Pujol, Mar; Rizo, Ramón; Pujol, María-José

    2014-01-01

    SLAM is a popular task used by robots and autonomous vehicles to build a map of an unknown environment and, at the same time, to determine their location within the map. This paper describes a SLAM-based, probabilistic robotic system able to learn the essential features of different parts of its environment. Some previous SLAM implementations had computational complexities ranging from O(Nlog(N)) to O(N(2)), where N is the number of map features. Unlike these methods, our approach reduces the computational complexity to O(N) by using a model to fuse the information from the sensors after applying the Bayesian paradigm. Once the training process is completed, the robot identifies and locates those areas that potentially match the sections that have been previously learned. After the training, the robot navigates and extracts a three-dimensional map of the environment using a single laser sensor. Thus, it perceives different sections of its world. In addition, in order to make our system able to be used in a low-cost robot, low-complexity algorithms that can be easily implemented on embedded processors or microcontrollers are used.
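
    The abstract attributes the O(N) cost to fusing sensor information feature by feature under the Bayesian paradigm, rather than maintaining a joint covariance over all N map features. A minimal sketch of that idea, assuming independent features and invented sensor likelihoods (not the authors' model):

        import numpy as np

        def logit(p):
            return np.log(p / (1.0 - p))

        def update_features(log_odds, observed, p_hit=0.7, p_miss=0.4):
            """One log-odds Bayes update per map feature, so each scan
            costs O(N). p_hit/p_miss are invented sensor likelihoods."""
            update = np.where(observed, logit(p_hit), logit(p_miss))
            return log_odds + update   # independent features: no N x N covariance

        belief = np.zeros(5)           # N = 5 features, prior p = 0.5 each
        belief = update_features(belief, np.array([1, 1, 0, 0, 1], dtype=bool))
        belief = update_features(belief, np.array([1, 0, 0, 0, 1], dtype=bool))
        print(np.round(1.0 / (1.0 + np.exp(-belief)), 3))   # posterior probabilities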

  8. A quantitative comparison of the electrical and anatomical definition of the pulmonary vein ostium.

    Science.gov (United States)

    Spies, Florian; Kühne, Michael; Reichlin, Tobias; Osswald, Stefan; Sticherling, Christian; Knecht, Sven

    2017-11-01

    Anatomically guided pulmonary vein isolation (PVI) is the cornerstone of atrial fibrillation (AF) ablation. However, the position at which to confirm electrical isolation is ill-defined. The aim of the current study was to quantify the relationship between the anatomical and electrical definitions of the pulmonary vein ostium. We analyzed 20 patients with paroxysmal AF undergoing PVI using radiofrequency energy and an electroanatomical mapping system. The anatomical ostium was defined based on the geometry obtained from preprocedural magnetic resonance imaging and computed tomography. The electrical ostium was defined at the position with a far-field atrial signal preceding a sharp pulmonary vein (PV) signal without any isoelectric interval in between. The electrically defined ostia were 8.4 ± 4.7 mm more distal in the PV compared to the anatomically defined ostia. The distances varied considerably between the four PVs and were 10.5 ± 6.5 mm, 7.4 ± 4.3 mm, 5.3 ± 4.0 mm, and 8.3 ± 3.4 mm for the left superior, left inferior, right superior, and right inferior PVs, respectively (P = 0.009). The positions of the electrical and anatomical ostia differ markedly. The site of the electrical ostium is variable within the PV but is always more distal in the PV than the site of the anatomical ostium. © 2017 Wiley Periodicals, Inc.

  9. Probabilistic Tsunami Hazard Assessment: the Seaside, Oregon Pilot Study

    Science.gov (United States)

    Gonzalez, F. I.; Geist, E. L.; Synolakis, C.; Titov, V. V.

    2004-12-01

    A pilot study of Seaside, Oregon is underway, to develop methodologies for probabilistic tsunami hazard assessments that can be incorporated into Flood Insurance Rate Maps (FIRMs) developed by FEMA's National Flood Insurance Program (NFIP). Current NFIP guidelines for tsunami hazard assessment rely on the science, technology and methodologies developed in the 1970s; although generally regarded as groundbreaking and state-of-the-art for its time, this approach is now superseded by modern methods that reflect substantial advances in tsunami research achieved in the last two decades. In particular, post-1990 technical advances include: improvements in tsunami source specification; improved tsunami inundation models; better computational grids by virtue of improved bathymetric and topographic databases; a larger database of long-term paleoseismic and paleotsunami records and short-term, historical earthquake and tsunami records that can be exploited to develop improved probabilistic methodologies; better understanding of earthquake recurrence and probability models. The NOAA-led U.S. National Tsunami Hazard Mitigation Program (NTHMP), in partnership with FEMA, USGS, NSF and Emergency Management and Geotechnical agencies of the five Pacific States, incorporates these advances into site-specific tsunami hazard assessments for coastal communities in Alaska, California, Hawaii, Oregon and Washington. NTHMP hazard assessment efforts currently focus on developing deterministic, "credible worst-case" scenarios that provide valuable guidance for hazard mitigation and emergency management. The NFIP focus, on the other hand, is on actuarial needs that require probabilistic hazard assessments such as those that characterize 100- and 500-year flooding events. There are clearly overlaps in NFIP and NTHMP objectives. NTHMP worst-case scenario assessments that include an estimated probability of occurrence could benefit the NFIP; NFIP probabilistic assessments of 100- and 500-yr

  10. New probabilistic interest measures for association rules

    OpenAIRE

    Hahsler, Michael; Hornik, Kurt

    2008-01-01

    Mining association rules is an important technique for discovering meaningful patterns in transaction databases. Many different measures of interestingness have been proposed for association rules. However, these measures fail to take the probabilistic properties of the mined data into account. In this paper, we start with presenting a simple probabilistic framework for transaction data which can be used to simulate transaction data when no associations are present. We use such data and a rea...
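
    The null model described above, transaction data simulated with no associations present, is easy to reproduce: draw each item independently with a fixed marginal probability and check that an interest measure such as lift stays near 1. The item set and probabilities below are invented:

        import numpy as np

        rng = np.random.default_rng(0)
        n_transactions = 50_000
        item_probs = {"milk": 0.3, "bread": 0.25, "beer": 0.1}  # invented marginals

        # Independence null model: each item occurs independently of the others.
        data = {item: rng.random(n_transactions) < p
                for item, p in item_probs.items()}

        def lift(a, b):
            """lift(A -> B) = P(A and B) / (P(A) * P(B)); about 1 under independence."""
            p_a, p_b = data[a].mean(), data[b].mean()
            p_ab = (data[a] & data[b]).mean()
            return p_ab / (p_a * p_b)

        print(f"lift(milk -> bread) = {lift('milk', 'bread'):.3f}")
        print(f"lift(milk -> beer)  = {lift('milk', 'beer'):.3f}")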

  11. Semantics of probabilistic processes an operational approach

    CERN Document Server

    Deng, Yuxin

    2015-01-01

    This book discusses the semantic foundations of concurrent systems with nondeterministic and probabilistic behaviour. Particular attention is given to clarifying the relationship between testing and simulation semantics and characterising bisimulations from metric, logical, and algorithmic perspectives. Besides presenting recent research outcomes in probabilistic concurrency theory, the book exemplifies the use of many mathematical techniques to solve problems in computer science, which is intended to be accessible to postgraduate students in Computer Science and Mathematics. It can also be us

  12. Probabilistic cloning of three symmetric states

    International Nuclear Information System (INIS)

    Jimenez, O.; Bergou, J.; Delgado, A.

    2010-01-01

    We study the probabilistic cloning of three symmetric states. These states are defined by a single complex quantity, the inner product among them. We show that three different probabilistic cloning machines are necessary to optimally clone all possible families of three symmetric states. We also show that the optimal cloning probability of generating M copies out of one original can be cast as the quotient between the success probability of unambiguously discriminating one and M copies of symmetric states.

  13. Probabilistic Analysis Methods for Hybrid Ventilation

    DEFF Research Database (Denmark)

    Brohus, Henrik; Frier, Christian; Heiselberg, Per

    This paper discusses a general approach for the application of probabilistic analysis methods in the design of ventilation systems. The aims and scope of probabilistic versus deterministic methods are addressed, with special emphasis on hybrid ventilation systems. A preliminary application of stochastic differential equations is presented, comprising a general heat balance for an arbitrary number of loads and zones in a building to determine the thermal behaviour under random conditions.

  14. [Establishment of anatomical terminology in Japan].

    Science.gov (United States)

    Shimada, Kazuyuki

    2008-12-01

    The history of anatomical terminology in Japan began with the publication of Waran Naikei Ihan-teimŏ in 1805 and Chŏtei Kaitai Shinsho in 1826. Although the establishment of Japanese anatomical terminology became necessary during the Meiji era, when many western anatomy books imported into Japan were translated, such terminology was not unified during this period and varied among translators. In 1871, Tsukumo Ono's Kaibŏgaku Gosen was published by the Ministry of Education. Although this book is considered to be the first anatomical glossary in Japan, its contents were incomplete. Overseas, the German Anatomical Society established a unified anatomical terminology in 1895 called the Basle Nomina Anatomica (B.N.A.). Following this development, Kaibŏgaku Meishŭ, which follows the B.N.A., was published by Buntarŏ Suzuki in 1905. With the subsequent establishment of the Jena Nomina Anatomica (J.N.A.) in 1935, the unification of anatomical terminology also accelerated in Japan, leading to the further development of terminology.

  15. Probabilistic causality and radiogenic cancers

    International Nuclear Information System (INIS)

    Groeer, P.G.

    1986-01-01

    A review and scrutiny of the literature on probability and probabilistic causality shows that it is possible, under certain assumptions, to estimate the probability that a certain type of cancer diagnosed in an individual exposed to radiation prior to diagnosis was caused by this exposure. Diagnosis of this causal relationship, like diagnosis of any disease, malignant or not, always requires some subjective judgments by the diagnostician. It is, therefore, illusory to believe that tables based on actuarial data can provide objective estimates of the chance that a cancer diagnosed in an individual is radiogenic. It is argued that such tables can only provide a base from which the diagnostician(s) deviate in one direction or the other according to his (their) individual (consensual) judgment. Acceptance of a physician's diagnostic judgment by patients is commonplace. Similarly widespread acceptance of expert judgment by claimants in radiation compensation cases does not presently exist. Judicious use of the present radioepidemiological tables prepared by the Working Group of the National Institutes of Health, or of updated future versions of similar tables, may improve the situation. 20 references

  16. Dynamical systems probabilistic risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Denman, Matthew R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ames, Arlo Leroy [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-03-01

    Probabilistic Risk Assessment (PRA) is the primary tool used to risk-inform nuclear power regulatory and licensing activities. Risk-informed regulations are intended to reduce inherent conservatism in regulatory metrics (e.g., allowable operating conditions and technical specifications) which are built into the regulatory framework by quantifying both the total risk profile as well as the change in the risk profile caused by an event or action (e.g., in-service inspection procedures or power uprates). Dynamical Systems (DS) analysis has been used to understand unintended time-dependent feedbacks in both industrial and organizational settings. In dynamical systems analysis, feedback loops can be characterized and studied as a function of time to describe the changes to the reliability of plant Structures, Systems and Components (SSCs). While DS has been used in many subject areas, some even within the PRA community, it has not been applied toward creating long-time horizon, dynamic PRAs (with time scales ranging between days and decades depending upon the analysis). Understanding slowly developing dynamic effects, such as wear-out, on SSC reliabilities may be instrumental in ensuring a safely and reliably operating nuclear fleet. Improving the estimation of a plant's continuously changing risk profile will allow for more meaningful risk insights, greater stakeholder confidence in risk insights, and increased operational flexibility.

  17. Computing Distances between Probabilistic Automata

    Directory of Open Access Journals (Sweden)

    Mathieu Tracol

    2011-07-01

    Full Text Available We present relaxed notions of simulation and bisimulation on Probabilistic Automata (PA) that allow some error epsilon. When epsilon is zero we retrieve the usual notions of bisimulation and simulation on PAs. We give logical characterisations of these notions by choosing suitable logics which differ from the elementary ones, L with negation and L without negation, by the modal operator. Using flow networks, we show how to compute the relations in PTIME. This allows the definition of an efficiently computable non-discounted distance between the states of a PA. A natural modification of this distance is introduced, to obtain a discounted distance, which weakens the influence of long-term transitions. We compare our notions of distance to others previously defined and illustrate our approach on various examples. We also show that our distance is not expansive with respect to process algebra operators. Although L without negation is a suitable logic to characterise epsilon-(bi)simulation on deterministic PAs, it is not for general PAs; interestingly, we prove that it does characterise weaker notions, called a priori epsilon-(bi)simulation, which we prove to be NP-hard to decide.

  18. Probabilistic modeling of children's handwriting

    Science.gov (United States)

    Puri, Mukta; Srihari, Sargur N.; Hanson, Lisa

    2013-12-01

    There is little work done in the analysis of children's handwriting, which can be useful in developing automatic evaluation systems and in quantifying handwriting individuality. We consider the statistical analysis of children's handwriting in early grades. Samples of the handwriting of children in Grades 2-4 who were taught the Zaner-Bloser style were considered. The commonly occurring word "and", written in cursive style as well as hand-print, was extracted from extended writing. The samples were assigned feature values by human examiners using a truthing tool. The human examiners looked at how the children constructed letter formations in their writing, looking for similarities and differences from the instructions taught in the handwriting copy book. These similarities and differences were measured using a feature-space distance measure. Results indicate that the handwriting develops towards more conformity with the class characteristics of the Zaner-Bloser copybook which, with practice, is the expected result. Bayesian networks were learnt from the data to enable answering various probabilistic queries, such as determining the students who may continue to produce letter formations as taught during lessons in school, determining the students who will develop different forms or variations of those letter formations, and the number of different types of letter formations.

  19. Probabilistic description of traffic flow

    International Nuclear Information System (INIS)

    Mahnke, R.; Kaupuzs, J.; Lubashevsky, I.

    2005-01-01

    A stochastic description of traffic flow, called probabilistic traffic flow theory, is developed. The general master equation is applied to relatively simple models to describe the formation and dissolution of traffic congestions. Our approach is mainly based on spatially homogeneous systems like periodically closed circular rings without on- and off-ramps. We consider a stochastic one-step process of growth or shrinkage of a car cluster (jam). As generalization we discuss the coexistence of several car clusters of different sizes. The basic problem is to find a physically motivated ansatz for the transition rates of the attachment and detachment of individual cars to a car cluster consistent with the empirical observations in real traffic. The emphasis is put on the analogy with first-order phase transitions and nucleation phenomena in physical systems like supersaturated vapour. The results are summarized in the flux-density relation, the so-called fundamental diagram of traffic flow, and compared with empirical data. Different regimes of traffic flow are discussed: free flow, congested mode as stop-and-go regime, and heavy viscous traffic. The traffic breakdown is studied based on the master equation as well as the Fokker-Planck approximation to calculate mean first passage times or escape rates. Generalizations are developed to allow for on-ramp effects. The calculated flux-density relation and characteristic breakdown times coincide with empirical data measured on highways. Finally, a brief summary of the stochastic cellular automata approach is given
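
    The one-step master-equation picture described above can be simulated directly. The sketch below runs a Gillespie-style simulation of a single car cluster that grows by attachment and shrinks by detachment; both rate forms are illustrative stand-ins, not the paper's empirically motivated ansatz:

        import numpy as np

        def simulate_jam(n0=1, n_cars=60, t_max=500.0, w_plus=0.9, w_minus=0.8,
                         seed=3):
            """Gillespie simulation of a one-step master equation for a single
            car cluster. The rate forms are illustrative stand-ins."""
            rng = np.random.default_rng(seed)
            t, n, history = 0.0, n0, [(0.0, n0)]
            while t < t_max:
                rate_up = w_plus * (n_cars - n) / n_cars  # a free car joins the jam
                rate_down = w_minus if n > 0 else 0.0     # the leading car escapes
                total = rate_up + rate_down
                if total == 0.0:
                    break
                t += rng.exponential(1.0 / total)         # waiting time to next event
                n += 1 if rng.random() < rate_up / total else -1
                history.append((t, n))
            return history

        hist = simulate_jam()
        print("final jam size:", hist[-1][1], "after", len(hist) - 1, "events")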

  1. Distribution functions of probabilistic automata

    Science.gov (United States)

    Vatan, F.

    2001-01-01

    Each probabilistic automaton M over an alphabet A defines a probability measure Prob_M on the set of all finite and infinite words over A. We can identify a k-letter alphabet A with the set {0, 1,..., k-1}, and, hence, we can consider every finite or infinite word w over A as a radix-k expansion of a real number X(w) in the interval [0, 1]. This makes X(w) a random variable, and the distribution function of M is defined as usual: F(x) := Prob_M{ w : X(w) <= x }. We study these distribution functions of probabilistic automata in detail. Automata with continuous distribution functions are characterized. By a new, and much easier, method it is shown that the distribution function F(x) is an analytic function if it is a polynomial. Finally, answering a question posed by D. Knuth and A. Yao, we show that a polynomial distribution function F(x) on [0, 1] can be generated by a probabilistic automaton iff all the roots of F'(x) = 0 in this interval, if any, are rational numbers. For this, we define two dynamical systems on the set of polynomial distributions and study attracting fixed points of random compositions of these two systems.
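
    The simplest instance of this setup is a one-state automaton over {0, 1} that emits each digit independently with probability p, so X(w) is a random radix-2 expansion; for p != 1/2 the resulting F is a classic example of a continuous but purely singular distribution function. A Monte Carlo sketch estimating F at a few points:

        import numpy as np

        def sample_X(p_one=0.3, n_words=200_000, length=40, seed=0):
            """Sample X(w) for random words w over {0, 1}: each digit is an
            independent Bernoulli(p_one) draw, i.e. the output of the
            simplest one-state probabilistic automaton."""
            rng = np.random.default_rng(seed)
            bits = rng.random((n_words, length)) < p_one
            weights = 0.5 ** np.arange(1, length + 1)  # radix-2: 0.b1 b2 b3 ...
            return bits @ weights

        xs = sample_X()
        for x in (0.25, 0.5, 0.75):
            print(f"F({x}) ~ {np.mean(xs <= x):.4f}")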

  2. Probabilistic transport models for fusion

    International Nuclear Information System (INIS)

    Milligen, B.Ph. van; Carreras, B.A.; Lynch, V.E.; Sanchez, R.

    2005-01-01

    A generalization of diffusive (Fickian) transport is considered, in which particle motion is described by probability distributions. We design a simple model that includes a critical mechanism to switch between two transport channels, and show that it exhibits various interesting characteristics, suggesting that the ideas of probabilistic transport might provide a framework for the description of a range of unusual transport phenomena observed in fusion plasmas. The model produces power degradation and profile consistency, as well as a scaling of the confinement time with system size reminiscent of the gyro-Bohm/Bohm scalings observed in fusion plasmas, and rapid propagation of disturbances. In the present work we show how this model may also produce on-axis peaking of the profiles with off-axis fuelling. It is important to note that the fluid limit of a simple model like this, characterized by two transport channels, does not correspond to the usual (Fickian) transport models commonly used for modelling transport in fusion plasmas, and behaves in a fundamentally different way. (author)

  3. Prospects for probabilistic safety assessment

    International Nuclear Information System (INIS)

    Hirschberg, S.

    1992-01-01

    This article provides some reflections on future developments of Probabilistic Safety Assessment (PSA) in view of the present state of the art, and evaluates current trends in the use of PSA for safety management. The main emphasis is on Level 1 PSA, although Level 2 aspects are also highlighted to some extent. As a starting point, the role of PSA is outlined from a historical perspective, demonstrating the rapid expansion of the uses of PSA. In this context the wide spectrum of PSA applications and the associated benefits to the users are in focus. It should be kept in mind, however, that PSA, in spite of its merits, is not a self-standing safety tool. It complements deterministic analysis, and thus improves understanding and facilitates prioritization of safety issues. Significant progress in handling PSA limitations - such as reliability data, common-cause failures, human interactions, external events, accident progression, containment performance, and source-term issues - is described. This forms a background for expected future developments of PSA. Among the most important issues on the agenda for the future are PSA scope extensions, methodological improvements and computer code advancements, and full exploitation of the potential benefits of applications to operational safety management. Many PSA uses, if properly exercised, lead to safety improvements as well as major burden reductions. The article provides, in addition, the International Atomic Energy Agency (IAEA) perspective on the topics covered, as reflected in the current PSA programs of the agency. 74 refs., 6 figs., 1 tab

  4. Future trends in flood risk in Indonesia - A probabilistic approach

    Science.gov (United States)

    Muis, Sanne; Guneralp, Burak; Jongman, Brenden; Ward, Philip

    2014-05-01

    Indonesia is one of the 10 most populous countries in the world and is highly vulnerable to (river) flooding. Catastrophic floods occur on a regular basis; total estimated damages were US$ 0.8 bn in 2010 and US$ 3 bn in 2013. Large parts of Greater Jakarta, the capital city, are annually subject to flooding. Flood risks (i.e. the product of hazard, exposure and vulnerability) are increasing due to rapid increases in exposure, such as strong population growth and ongoing economic development. The increase in risk may also be amplified by increasing flood hazards, such as increasing flood frequency and intensity due to climate change and land subsidence. The implementation of adaptation measures, such as the construction of dykes and strategic urban planning, may counteract these increasing trends. However, despite its importance for adaptation planning, a comprehensive assessment of current and future flood risk in Indonesia is lacking. This contribution addresses this issue and aims to provide insight into how socio-economic trends and climate change projections may shape future flood risks in Indonesia. Flood risks were calculated using an adapted version of the GLOFRIS global flood risk assessment model. Using this approach, we produced probabilistic maps of flood risks (i.e. annual expected damage) at a resolution of 30"x30" (ca. 1km x 1km at the equator). To represent flood exposure, we produced probabilistic projections of urban growth in a Monte-Carlo fashion based on probability density functions of projected population and GDP values for 2030. To represent flood hazard, inundation maps were computed using the hydrological-hydraulic component of GLOFRIS. These maps show flood inundation extent and depth for several return periods and were produced for several combinations of GCMs and future socioeconomic scenarios. Finally, the implementation of different adaptation strategies was incorporated into the model to explore to what extent adaptation may be able to

  5. Anatomical physiology of spatial extinction.

    Science.gov (United States)

    Ciçek, Metehan; Gitelman, Darren; Hurley, Robert S E; Nobre, Anna; Mesulam, Marsel

    2007-12-01

    Neurologically intact volunteers participated in a functional magnetic resonance imaging experiment that simulated the unilateral (focal) and bilateral (global) stimulations used to elicit extinction in patients with hemispatial neglect. In peristriate areas, attentional modulations were selectively sensitive to contralaterally directed attention. A higher level of mapping was observed in the intraparietal sulcus (IPS), inferior parietal lobule (IPL), and inferior frontal gyrus (IFG). In these areas, there was no distinction between contralateral and ipsilateral focal attention, and the need to distribute attention globally led to greater activity than either focal condition. These physiological characteristics were symmetrically distributed in the IPS and IFG, suggesting that the effects of unilateral lesions in these 2 areas can be compensated by the contralateral hemisphere. In the IPL, the greater activation by the bilateral attentional mode was seen only in the right hemisphere. Its contralateral counterpart displayed equivalent activations when attention was distributed to the right, to the left, or bilaterally. Within the context of this experiment, the IPL of the right hemisphere emerged as the one area where unilateral lesions can cause the most uncompensated and selective impairment of global attention (without interfering with unilateral attention to either side), giving rise to the phenomenon of extinction.

  6. Development of test algorithm for semiconductor package with defects by using probabilistic neural network

    International Nuclear Information System (INIS)

    Kim, Jae Yeol; Sim, Jae Gi; Ko, Myoung Soo; Kim, Chang Hyun; Kim, Hun Cho

    2001-01-01

    In this study, an algorithm for evaluating artificial defects in semiconductor packages was developed and applied using pattern recognition technology. For this purpose, estimation software was written in MATLAB; it comprises procedures for ultrasonic image acquisition, equalization filtering, a Self-Organizing Map (SOM) and a Probabilistic Neural Network (PNN), the latter two being neural network methods. The pattern recognition technology was applied to classify three kinds of defect patterns in semiconductor packages. The study estimates the probability density function from a learning sample and presents a method for determining it automatically. The PNN can distinguish flaws that are very difficult to discriminate, and its parallel processing structure makes it a very efficient classifier when applied to large amounts of real process data.
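
    The study's MATLAB pipeline is not reproduced here, but the classification stage, a probabilistic neural network in Specht's sense, is compact enough to sketch; the features, class labels and smoothing parameter below are made up:

        import numpy as np

        class PNN:
            """Minimal probabilistic neural network (Specht-style): each class
            density is a Parzen-window estimate with a Gaussian kernel on
            every training pattern; prediction picks the densest class
            (uniform priors assumed)."""
            def __init__(self, sigma=0.4):
                self.sigma = sigma

            def fit(self, X, y):
                self.classes_ = np.unique(y)
                self.patterns_ = {c: X[y == c] for c in self.classes_}
                return self

            def predict(self, X):
                scores = []
                for c in self.classes_:
                    diff = X[:, None, :] - self.patterns_[c][None, :, :]
                    k = np.exp(-np.sum(diff**2, axis=2) / (2 * self.sigma**2))
                    scores.append(k.mean(axis=1))   # class-conditional density
                return self.classes_[np.argmax(np.array(scores), axis=0)]

        # Made-up 2-D stand-ins for ultrasonic image features of three patterns.
        rng = np.random.default_rng(7)
        X = np.vstack([rng.normal(m, 0.3, (30, 2)) for m in ((0, 0), (2, 0), (1, 2))])
        y = np.repeat(["void", "crack", "delamination"], 30)
        clf = PNN().fit(X, y)
        print(clf.predict(np.array([[0.1, -0.1], [1.9, 0.2], [1.0, 2.1]])))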

  7. A Probabilistic Analysis of Surface Water Flood Risk in London.

    Science.gov (United States)

    Jenkins, Katie; Hall, Jim; Glenis, Vassilis; Kilsby, Chris

    2017-10-30

    Flooding in urban areas during heavy rainfall, often characterized by short-duration, high-intensity events, is known as "surface water flooding." Analyzing surface water flood risk is complex, as it requires understanding of biophysical and human factors, such as the localized scale and nature of heavy precipitation events, the characteristics of the urban area affected (including detailed topography and drainage networks), and the spatial distribution of economic and social vulnerability. Climate change is recognized as having the potential to enhance the intensity and frequency of heavy rainfall events. This study develops a methodology to link high-spatial-resolution probabilistic projections of hourly precipitation with detailed surface water flood depth maps and characterization of urban vulnerability to estimate surface water flood risk. It incorporates probabilistic information on the range of uncertainties in future precipitation in a changing climate. The method is applied to a case study of Greater London and highlights that both the frequency and spatial extent of surface water flood events are set to increase under future climate change. The expected annual damage from surface water flooding is estimated to be £171 million, £343 million, and £390 million per year under the baseline, 2030 high, and 2050 high climate change scenarios, respectively. © 2017 Society for Risk Analysis.
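
    Expected annual damage (EAD) figures such as those quoted above are conventionally obtained by integrating loss over annual exceedance probability. A minimal sketch of that calculation, assuming a hypothetical loss/return-period curve rather than the study's data:

        import numpy as np

        def expected_annual_damage(return_periods, damages):
            """EAD: integrate damage over annual exceedance probability
            (trapezoidal rule). Inputs are event return periods in years
            and the loss associated with each event."""
            p = 1.0 / np.asarray(return_periods, dtype=float)
            order = np.argsort(p)                  # integrate over increasing p
            p, d = p[order], np.asarray(damages, dtype=float)[order]
            return float(np.sum(0.5 * (d[1:] + d[:-1]) * np.diff(p)))

        # Hypothetical loss curve (GBP millions), not the study's data.
        rp = [10, 30, 100, 200, 1000]
        loss = [40, 120, 400, 700, 1500]
        print(f"EAD ~ {expected_annual_damage(rp, loss):.1f} million/year")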

  8. Efficient probabilistic inference in generic neural networks trained with non-probabilistic feedback.

    Science.gov (United States)

    Orhan, A Emin; Ma, Wei Ji

    2017-07-26

    Animals perform near-optimal probabilistic inference in a wide range of psychophysical tasks. Probabilistic inference requires trial-to-trial representation of the uncertainties associated with task variables and subsequent use of this representation. Previous work has implemented such computations using neural networks with hand-crafted and task-dependent operations. We show that generic neural networks trained with a simple error-based learning rule perform near-optimal probabilistic inference in nine common psychophysical tasks. In a probabilistic categorization task, error-based learning in a generic network simultaneously explains a monkey's learning curve and the evolution of qualitative aspects of its choice behavior. In all tasks, the number of neurons required for a given level of performance grows sublinearly with the input population size, a substantial improvement on previous implementations of probabilistic inference. The trained networks develop a novel sparsity-based probabilistic population code. Our results suggest that probabilistic inference emerges naturally in generic neural networks trained with error-based learning rules. Behavioural tasks often require probability distributions to be inferred about task-specific variables. Here, the authors demonstrate that generic neural networks can be trained using a simple error-based learning rule to perform such probabilistic computations efficiently without any need for task-specific operations.

  9. Probabilistic soft sets and dual probabilistic soft sets in decision making with positive and negative parameters

    Science.gov (United States)

    Fatimah, F.; Rosadi, D.; Hakim, R. B. F.

    2018-03-01

    In this paper, we motivate and introduce probabilistic soft sets and dual probabilistic soft sets for handling decision making problem in the presence of positive and negative parameters. We propose several types of algorithms related to this problem. Our procedures are flexible and adaptable. An example on real data is also given.
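
    Definitions of probabilistic soft sets vary in their details across the literature; as a hedged illustration of the decision setting described above, the toy model below maps each parameter to membership probabilities over objects and scores alternatives by positive-parameter mass minus negative-parameter mass. The data and the scoring rule are illustrative assumptions, not necessarily the authors' algorithm:

        # Toy probabilistic soft set: each parameter maps objects to membership
        # probabilities. Scoring by positive minus negative parameter mass is
        # one simple choice, not necessarily the paper's exact procedure.
        positive = {   # desirable parameters
            "cheap":    {"house1": 0.9, "house2": 0.4, "house3": 0.6},
            "spacious": {"house1": 0.3, "house2": 0.8, "house3": 0.7},
        }
        negative = {   # undesirable parameters
            "noisy":    {"house1": 0.7, "house2": 0.2, "house3": 0.5},
        }

        def score(obj):
            pos = sum(table[obj] for table in positive.values())
            neg = sum(table[obj] for table in negative.values())
            return pos - neg

        objects = ["house1", "house2", "house3"]
        print({o: round(score(o), 2) for o in objects})
        print("best choice:", max(objects, key=score))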

  10. A high-resolution probabilistic in vivo atlas of human subcortical brain nuclei.

    Science.gov (United States)

    Pauli, Wolfgang M; Nili, Amanda N; Tyszka, J Michael

    2018-04-17

    Recent advances in magnetic resonance imaging methods, including data acquisition, pre-processing and analysis, have benefited research on the contributions of subcortical brain nuclei to human cognition and behavior. At the same time, these developments have led to an increasing need for a high-resolution probabilistic in vivo anatomical atlas of subcortical nuclei. In order to address this need, we constructed high-spatial-resolution, three-dimensional templates, using high-accuracy diffeomorphic registration of T1- and T2-weighted structural images from 168 typical adults between 22 and 35 years old. In these templates, many tissue boundaries are clearly visible, which would otherwise be impossible to delineate in data from individual studies. The resulting delineations of subcortical nuclei complement current histology-based atlases. We further created a companion library of software tools for atlas development, to offer an open and evolving resource for the creation of a crowd-sourced in vivo probabilistic anatomical atlas of the human brain.

  11. Holographic Transformation, Belief Propagation and Loop Calculus for Generalized Probabilistic Theories

    OpenAIRE

    Mori, Ryuhei

    2015-01-01

    The holographic transformation, belief propagation and loop calculus are generalized to problems in generalized probabilistic theories including quantum mechanics. In this work, the partition function of classical factor graph is represented by an inner product of two high-dimensional vectors both of which can be decomposed to tensor products of low-dimensional vectors. On the representation, the holographic transformation is clearly understood by using adjoint linear maps. Furthermore, on th...

  12. The probabilistic innovation theoretical framework

    Directory of Open Access Journals (Sweden)

    Chris W. Callaghan

    2017-07-01

    Full Text Available Background: Despite technological advances that offer new opportunities for solving societal problems in real time, knowledge management theory development has largely not kept pace with these developments. This article seeks to offer useful insights into how more effective theory development in this area could be enabled. Aim: This article suggests different streams of literature for inclusion into a theoretical framework for an emerging stream of research, termed ‘probabilistic innovation’, which seeks to develop a system of real-time research capability. The objective of this research is therefore to provide a synthesis of a range of diverse literatures, and to provide useful insights into how research enabled by crowdsourced research and development can potentially be used to address serious knowledge problems in real time. Setting: This research suggests that knowledge management theory can provide an anchor for a new stream of research contributing to the development of real-time knowledge problem solving. Methods: This conceptual article seeks to re-conceptualise the problem of real-time research and locate this knowledge problem in relation to a host of rapidly developing streams of literature. In doing so, a novel perspective on societal problem-solving is enabled. Results: An analysis of theory and literature suggests that certain rapidly developing streams of literature might more effectively contribute to societally important real-time research problem solving if these streams are united under a theoretical framework with this goal as its explicit focus. Conclusion: Although the goal of real-time research is as yet not attainable, research that contributes to its attainment may ultimately make an important contribution to society.

  13. Anatomical eponyms - unloved names in medical terminology.

    Science.gov (United States)

    Burdan, F; Dworzański, W; Cendrowska-Pinkosz, M; Burdan, M; Dworzańska, A

    2016-01-01

    Uniform international terminology is a fundamental issue of medicine. Names of various organs or structures have developed since early human history. The first proper anatomical books were written by Hippocrates, Aristotle and Galen, which is why many modern terms originate from Latin or Greek. In modern times the terminology was improved in particular by Vesalius, Fabricius and Harvey. Presently, each known structure has an internationally approved term that is explained in the anatomical or histological terminology. However, some elements received eponyms, terms that incorporate the surname of the people who first described or studied them (e.g., circle of Willis, follicle of Graaf, fossa of Sylvius, foramen of Monro, Adamkiewicz artery). Literary and historical heroes also influenced medical vocabulary (e.g., Achilles tendon and Atlas). According to various scientists, the eponyms bring colour to medicine and embed medical traditions and culture in our history, but they lack accuracy, lead to confusion, and hamper scientific discussion. The current article presents a wide list of anatomical eponyms with their proper anatomical terms or descriptions according to international anatomical terminology. However, since different eponyms are used in various countries, the list could be expanded.

  14. Determining customer satisfaction in anatomic pathology.

    Science.gov (United States)

    Zarbo, Richard J

    2006-05-01

    Measurement of physicians' and patients' satisfaction with laboratory services has become a standard practice in the United States, prompted by national accreditation requirements. Unlike other surveys of hospital-, outpatient care-, or physician-related activities, no ongoing, comprehensive customer satisfaction survey of anatomic pathology services is available for subscription that would allow continual benchmarking against peer laboratories. Pathologists, therefore, must often design their own local assessment tools to determine physician satisfaction in anatomic pathology. To describe satisfaction survey design that would elicit specific information from physician customers about key elements of anatomic pathology services. The author shares his experience in biannually assessing customer satisfaction in anatomic pathology with survey tools designed at the Henry Ford Hospital, Detroit, Mich. Benchmarks for physician satisfaction, opportunities for improvement, and characteristics that correlated with a high level of physician satisfaction were identified nationally from a standardized survey tool used by 94 laboratories in the 2001 College of American Pathologists Q-Probes quality improvement program. In general, physicians are most satisfied with professional diagnostic services and least satisfied with pathology services related to poor communication. A well-designed and conducted customer satisfaction survey is an opportunity for pathologists to periodically educate physician customers about services offered, manage unrealistic expectations, and understand the evolving needs of the physician customer. Armed with current information from physician customers, the pathologist is better able to strategically plan for resources that facilitate performance improvements in anatomic pathology laboratory services that align with evolving clinical needs in health care delivery.

  15. Probabilistic modeling of caprock leakage from seismic reflection data

    DEFF Research Database (Denmark)

    Zunino, Andrea; Hansen, Thomas Mejer; Bergjofd-Kitterød, Ingjerd

    We illustrate a methodology which helps to perform a leakage risk analysis for a CO2 reservoir based on a consistent, probabilistic approach to geophysical and geostatistical inversion. Generally, risk assessments of storage complexes are based on geological models and simulations of CO2 movement within the storage complexes. The geological models are built on top of geophysical data such as seismic surveys, geological information and well logs from the reservoir or nearby regions. The risk assessment of CO2 storage requires a careful analysis which accounts for all sources of uncertainty. However, at present, no well-defined and consistent method for mapping the true uncertainty related to the geophysical data and how that uncertainty affects the overall risk assessment for the potential storage site is available. To properly quantify the uncertainties and to avoid unrealistic

  16. Posterolateral supporting structures of the knee: findings on anatomic dissection, anatomic slices and MR images

    Energy Technology Data Exchange (ETDEWEB)

    Maeseneer, M. de; Shahabpour, M.; Vanderdood, K.; Ridder, F. de; Osteaux, M. [Dept. of Radiology, Free Univ. Brussels (Belgium); Roy, F. van [Dept. of Experimental Anatomy, Free Univ. Brussels (Belgium)

    2001-11-01

    In this article we study the ligaments and tendons of the posterolateral corner of the knee by anatomic dissection, MR-anatomic correlation, and MR imaging. The posterolateral aspect of two fresh cadaveric knee specimens was dissected. The MR-anatomic correlation was performed in three other specimens. The MR images of 122 patients were reviewed and assessed for the visualization of different posterolateral structures. Anatomic dissection and MR-anatomic correlation demonstrated the lateral collateral, fabellofibular, and arcuate ligaments, as well as the biceps and popliteus tendons. On MR images of patients the lateral collateral ligament was depicted in all cases. The fabellofibular, arcuate, and popliteofibular ligaments were visualized in 33, 25, and 38% of patients, respectively. Magnetic resonance imaging allows a detailed appreciation of the posterolateral corner of the knee. (orig.)

  17. Probabilistic diffusion tractography reveals improvement of structural network in musicians.

    Directory of Open Access Journals (Sweden)

    Jianfu Li

    Full Text Available PURPOSE: Musicians experience a large amount of information transfer and integration of complex sensory, motor, and auditory processes when training and playing musical instruments. Therefore, musicians are a useful model in which to investigate neural adaptations in the brain. METHODS: Here, based on diffusion-weighted imaging, probabilistic tractography was used to determine the architecture of white matter anatomical networks in musicians and non-musicians. Furthermore, the features of the white matter networks were analyzed using graph theory. RESULTS: Small-world properties of the white matter network were observed in both groups. Compared with non-musicians, the musicians exhibited significantly increased connectivity strength in the left and right supplementary motor areas, the left calcarine fissure and surrounding cortex and the right caudate nucleus, as well as a significantly larger weighted clustering coefficient in the right olfactory cortex, the left medial superior frontal gyrus, the right gyrus rectus, the left lingual gyrus, the left supramarginal gyrus, and the right pallidum. Furthermore, there were differences in the node betweenness centrality in several regions. However, no significant differences in topological properties were observed at a global level. CONCLUSIONS: We illustrated preliminary findings to extend the network level understanding of white matter plasticity in musicians who have had long-term musical training. These structural, network-based findings may indicate that musicians have enhanced information transmission efficiencies in local white matter networks that are related to musical training.
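
    The graph-theoretic measures named above (connectivity strength, weighted clustering coefficient, betweenness centrality) can be computed with standard tooling; below is a hedged sketch on an invented toy network, not the study's actual connectome pipeline. Region names and weights are placeholders.

      import networkx as nx

      # Toy weighted graph standing in for a white-matter network; nodes are
      # regions, edge weights are connection strengths.
      G = nx.Graph()
      edges = [('SMA_L', 'SMA_R', 0.9), ('SMA_L', 'Calcarine_L', 0.4),
               ('Calcarine_L', 'Caudate_R', 0.3), ('SMA_R', 'Caudate_R', 0.7),
               ('Caudate_R', 'Pallidum_R', 0.8)]
      G.add_weighted_edges_from(edges)

      # Node strength (sum of edge weights): "connectivity strength" per region.
      strength = dict(G.degree(weight='weight'))

      # Weighted clustering coefficient per node.
      clustering = nx.clustering(G, weight='weight')

      # Betweenness centrality; networkx treats 'weight' as a distance, so
      # pass inverse weights when weights encode connection strength.
      nx.set_edge_attributes(G, {(u, v): 1.0 / w for u, v, w in edges}, 'dist')
      betweenness = nx.betweenness_centrality(G, weight='dist')
      print(strength, clustering, betweenness, sep='\n')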

  18. Use and Communication of Probabilistic Forecasts.

    Science.gov (United States)

    Raftery, Adrian E

    2016-12-01

    Probabilistic forecasts are becoming more and more available. How should they be used and communicated? What are the obstacles to their use in practice? I review experience with five problems where probabilistic forecasting played an important role. This leads me to identify five types of potential users: Low Stakes Users, who don't need probabilistic forecasts; General Assessors, who need an overall idea of the uncertainty in the forecast; Change Assessors, who need to know if a change is out of line with expectations; Risk Avoiders, who wish to limit the risk of an adverse outcome; and Decision Theorists, who quantify their loss function and perform the decision-theoretic calculations. This suggests that it is important to interact with users and to consider their goals. The cognitive research tells us that calibration is important for trust in probability forecasts, and that it is important to match the verbal expression with the task. The cognitive load should be minimized, reducing the probabilistic forecast to a single percentile if appropriate. Probabilities of adverse events and percentiles of the predictive distribution of quantities of interest seem often to be the best way to summarize probabilistic forecasts. Formal decision theory has an important role, but in a limited range of applications.
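
    The summaries recommended here are easy to produce from a sample of the predictive distribution; a small illustration follows, with the distribution parameters and the adverse-event threshold invented for the example.

      import numpy as np

      # Samples from a predictive distribution (e.g. an ensemble forecast).
      rng = np.random.default_rng(1)
      forecast = rng.normal(loc=20.0, scale=3.0, size=10000)

      # For a General Assessor: a central interval conveying the uncertainty.
      lo, med, hi = np.percentile(forecast, [10, 50, 90])

      # For a Risk Avoider: the probability of an adverse event, here a
      # hypothetical threshold of 25 units being exceeded.
      p_adverse = np.mean(forecast > 25.0)
      print(f"median {med:.1f}, 80% interval [{lo:.1f}, {hi:.1f}], "
            f"P(>25) = {p_adverse:.3f}")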

  19. Use and Communication of Probabilistic Forecasts

    Science.gov (United States)

    Raftery, Adrian E.

    2015-01-01

    Probabilistic forecasts are becoming more and more available. How should they be used and communicated? What are the obstacles to their use in practice? I review experience with five problems where probabilistic forecasting played an important role. This leads me to identify five types of potential users: Low Stakes Users, who don’t need probabilistic forecasts; General Assessors, who need an overall idea of the uncertainty in the forecast; Change Assessors, who need to know if a change is out of line with expectations; Risk Avoiders, who wish to limit the risk of an adverse outcome; and Decision Theorists, who quantify their loss function and perform the decision-theoretic calculations. This suggests that it is important to interact with users and to consider their goals. The cognitive research tells us that calibration is important for trust in probability forecasts, and that it is important to match the verbal expression with the task. The cognitive load should be minimized, reducing the probabilistic forecast to a single percentile if appropriate. Probabilities of adverse events and percentiles of the predictive distribution of quantities of interest seem often to be the best way to summarize probabilistic forecasts. Formal decision theory has an important role, but in a limited range of applications. PMID:28446941

  20. Probabilistic numerics and uncertainty in computations.

    Science.gov (United States)

    Hennig, Philipp; Osborne, Michael A; Girolami, Mark

    2015-07-08

    We deliver a call to arms for probabilistic numerical methods: algorithms for numerical tasks, including linear algebra, integration, optimization and solving differential equations, that return uncertainties in their calculations. Such uncertainties, arising from the loss of precision induced by numerical calculation with limited time or hardware, are important for much contemporary science and industry. Within applications such as climate science and astrophysics, the need to make decisions on the basis of computations with large and complex data has led to a renewed focus on the management of numerical uncertainty. We describe how several seminal classic numerical methods can be interpreted naturally as probabilistic inference. We then show that the probabilistic view suggests new algorithms that can flexibly be adapted to suit application specifics, while delivering improved empirical performance. We provide concrete illustrations of the benefits of probabilistic numeric algorithms on real scientific problems from astrometry and astronomical imaging, while highlighting open problems with these new algorithms. Finally, we describe how probabilistic numerical methods provide a coherent framework for identifying the uncertainty in calculations performed with a combination of numerical algorithms (e.g. both numerical optimizers and differential equation solvers), potentially allowing the diagnosis (and control) of error sources in computations.
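
    As a minimal illustration of the idea of numerics that return uncertainties, plain Monte Carlo integration already yields an estimate together with a standard error; the GP-based methods the paper surveys are more sophisticated, so this sketch conveys only the flavor.

      import numpy as np

      def mc_integral(f, n, rng):
          """Monte Carlo estimate of the integral of f over [0, 1], returned
          with an uncertainty (standard error) rather than as a bare number."""
          y = f(rng.uniform(size=n))
          return y.mean(), y.std(ddof=1) / np.sqrt(n)

      rng = np.random.default_rng(2)
      est, err = mc_integral(np.exp, 10000, rng)   # true value: e - 1
      print(f"integral ~ {est:.4f} +/- {err:.4f} (exact {np.e - 1:.4f})")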

  1. Lacrimal Gland Pathologies from an Anatomical Perspective

    Directory of Open Access Journals (Sweden)

    Mahmut Sinan Abit

    2015-06-01

    Full Text Available Most of the patients in our daily practice have one or more ocular surface disorders, including conjunctivitis, keratitis, dry eye disease, meibomian gland dysfunction, contact lens related symptoms, refractive errors, and computer vision syndrome. The lacrimal gland plays an important role in all of the above-mentioned pathologies through its major secretory product. Anatomical and physiological knowledge of the lacrimal gland is essential for understanding basic and common ophthalmological cases. In this paper we aim to explain lacrimal gland diseases from an anatomical perspective.

  2. COMICS: Cartoon Visualization of Omics Data in Spatial Context Using Anatomical Ontologies.

    Science.gov (United States)

    Travin, Dmitrii; Popov, Iaroslav; Guler, Arzu Tugce; Medvedev, Dmitry; van der Plas-Duivesteijn, Suzanne; Varela, Monica; Kolder, Iris C R M; Meijer, Annemarie H; Spaink, Herman P; Palmblad, Magnus

    2018-01-05

    COMICS is an interactive and open-access web platform for integration and visualization of molecular expression data in anatomograms of zebrafish, carp, and mouse model systems. Anatomical ontologies are used to map omics data across experiments and between an experiment and a particular visualization in a data-dependent manner. COMICS is built on top of several existing resources. Zebrafish and mouse anatomical ontologies with their controlled vocabulary (CV) and defined hierarchy are used with the ontoCAT R package to aggregate data for comparison and visualization. Libraries from the QGIS geographical information system are used with the R packages "maps" and "maptools" to visualize and interact with molecular expression data in anatomical drawings of the model systems. COMICS allows users to upload their own data from omics experiments, using any gene or protein nomenclature they wish, as long as CV terms are used to define anatomical regions or developmental stages. Additional support is provided for common nomenclatures such as ZFIN gene names and UniProt accessions. COMICS can be used to generate publication-quality visualizations of gene and protein expression across experiments. Unlike previous tools that have used anatomical ontologies to interpret imaging data in several animal models, including zebrafish, COMICS is designed to take spatially resolved data generated by dissection or fractionation and display this data in visually clear anatomical representations rather than large data tables. COMICS is optimized for ease-of-use, with a minimalistic web interface and automatic selection of the appropriate visual representation depending on the input data.
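
    The aggregation step described, mapping uploaded omics values onto anatomical CV terms, can be sketched as a simple group-by. The column names and ontology IDs below are placeholders for illustration and are not COMICS's actual schema.

      import pandas as pd

      # Hypothetical upload: protein abundances measured in dissected
      # fractions, each labelled with an anatomical ontology CV term.
      data = pd.DataFrame({
          'uniprot':   ['P12345', 'P12345', 'Q67890', 'Q67890'],
          'cv_term':   ['ZFA:0000029', 'ZFA:0000135', 'ZFA:0000029', 'ZFA:0000135'],
          'abundance': [10.2, 3.1, 0.8, 7.5],
      })

      # Aggregate per anatomical term, the step that feeds the anatomogram.
      per_region = data.groupby('cv_term')['abundance'].sum()
      print(per_region)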

  3. Seismic hazard maps for Haiti

    Science.gov (United States)

    Frankel, Arthur; Harmsen, Stephen; Mueller, Charles; Calais, Eric; Haase, Jennifer

    2011-01-01

    We have produced probabilistic seismic hazard maps of Haiti for peak ground acceleration and response spectral accelerations that include the hazard from the major crustal faults, subduction zones, and background earthquakes. The hazard from the Enriquillo-Plantain Garden, Septentrional, and Matheux-Neiba fault zones was estimated using fault slip rates determined from GPS measurements. The hazard from the subduction zones along the northern and southeastern coasts of Hispaniola was calculated from slip rates derived from GPS data and the overall plate motion. Hazard maps were made for a firm-rock site condition and for a grid of shallow shear-wave velocities estimated from topographic slope. The maps show substantial hazard throughout Haiti, with the highest hazard along the Enriquillo-Plantain Garden and Septentrional fault zones. The Matheux-Neiba Fault exhibits high hazard in the maps for 2% probability of exceedance in 50 years, although its slip rate is poorly constrained.
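
    Hazard maps of this kind are quoted at a probability of exceedance over an exposure time; under the usual Poisson occurrence assumption this converts directly to an annual rate and return period, as in this small worked example.

      import math

      # 2% probability of exceedance in 50 years. With a Poisson model,
      # P = 1 - exp(-rate * T), so the annual rate follows directly.
      P, T = 0.02, 50.0
      rate = -math.log(1.0 - P) / T
      print(f"annual rate {rate:.6f} -> return period {1.0 / rate:.0f} years")
      # ~0.000404 per year, i.e. the familiar ~2475-year return period.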

  4. bayesPop: Probabilistic Population Projections

    Science.gov (United States)

    Ševčíková, Hana; Raftery, Adrian E.

    2016-01-01

    We describe bayesPop, an R package for producing probabilistic population projections for all countries. This uses probabilistic projections of total fertility and life expectancy generated by Bayesian hierarchical models. It produces a sample from the joint posterior predictive distribution of future age- and sex-specific population counts, fertility rates and mortality rates, as well as future numbers of births and deaths. It provides graphical ways of summarizing this information, including trajectory plots and various kinds of probabilistic population pyramids. An expression language is introduced which allows the user to produce the predictive distribution of a wide variety of derived population quantities, such as the median age or the old age dependency ratio. The package produces aggregated projections for sets of countries, such as UN regions or trading blocs. The methodology has been used by the United Nations to produce their most recent official population projections for all countries, published in the World Population Prospects. PMID:28077933

  5. Probabilistic Modeling and Visualization for Bankruptcy Prediction

    DEFF Research Database (Denmark)

    Antunes, Francisco; Ribeiro, Bernardete; Pereira, Francisco Camara

    2017-01-01

    In accounting and finance domains, bankruptcy prediction is of great utility for all of the economic stakeholders. The challenge of accurate assessment of business failure prediction, especially under scenarios of financial crisis, is known to be complicated. Although there have been many successful studies on bankruptcy detection, probabilistic approaches have seldom been carried out. In this paper we assume a probabilistic point-of-view by applying Gaussian Processes (GP) in the context of bankruptcy prediction, comparing it against the Support Vector Machines (SVM) and the Logistic Regression (LR). Using real-world bankruptcy data, an in-depth analysis is conducted showing that, in addition to a probabilistic interpretation, the GP can effectively improve the bankruptcy prediction performance with high accuracy when compared to the other approaches. We additionally generate a complete graphical
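
    A hedged sketch of the comparison described, Gaussian process classification against logistic regression, on synthetic data standing in for the real-world bankruptcy set used in the paper; hyperparameters are defaults, not the authors' settings.

      from sklearn.datasets import make_classification
      from sklearn.gaussian_process import GaussianProcessClassifier
      from sklearn.gaussian_process.kernels import RBF
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import accuracy_score

      # Synthetic stand-in for bankruptcy data (ratios -> failed or not).
      X, y = make_classification(n_samples=400, n_features=8, random_state=0)
      Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.25, random_state=0)

      for name, clf in [('GP', GaussianProcessClassifier(kernel=1.0 * RBF())),
                        ('LR', LogisticRegression(max_iter=1000))]:
          clf.fit(Xtr, ytr)
          proba = clf.predict_proba(Xte)[:, 1]       # probabilistic output
          acc = accuracy_score(yte, (proba > 0.5).astype(int))
          print(f"{name}: accuracy {acc:.3f}")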

  6. Probabilistic inversion for chicken processing lines

    International Nuclear Information System (INIS)

    Cooke, Roger M.; Nauta, Maarten; Havelaar, Arie H.; Fels, Ine van der

    2006-01-01

    We discuss an application of probabilistic inversion techniques to a model of campylobacter transmission in chicken processing lines. Such techniques are indicated when we wish to quantify a model which is new and perhaps unfamiliar to the expert community. In this case there are no measurements for estimating model parameters, and experts are typically unable to give a considered judgment. In such cases, experts are asked to quantify their uncertainty regarding variables which can be predicted by the model. The experts' distributions (after combination) are then pulled back onto the parameter space of the model, a process termed 'probabilistic inversion'. This study illustrates two such techniques, iterative proportional fitting (IPF) and PARmeter fitting for uncertain models (PARFUM). In addition, we illustrate how expert judgement on predicted observable quantities in combination with probabilistic inversion may be used for model validation and/or model criticism
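
    Of the two techniques named, iterative proportional fitting is the simpler to sketch: a joint table is rescaled until its margins match target marginals. This is a 2D toy case with invented numbers; the paper's application pulls expert distributions back onto model parameters.

      import numpy as np

      def ipf(joint, row_marginal, col_marginal, iters=100):
          """Iterative proportional fitting: alternately rescale rows and
          columns of a joint table toward the target marginals."""
          p = joint.copy()
          for _ in range(iters):
              p *= (row_marginal / p.sum(axis=1))[:, None]
              p *= (col_marginal / p.sum(axis=0))[None, :]
          return p

      start = np.ones((2, 3)) / 6.0               # uninformative starting table
      fitted = ipf(start, np.array([0.7, 0.3]), np.array([0.2, 0.3, 0.5]))
      print(fitted, fitted.sum(axis=1), fitted.sum(axis=0), sep='\n')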

  7. bayesPop: Probabilistic Population Projections

    Directory of Open Access Journals (Sweden)

    Hana Ševčíková

    2016-12-01

    Full Text Available We describe bayesPop, an R package for producing probabilistic population projections for all countries. This uses probabilistic projections of total fertility and life expectancy generated by Bayesian hierarchical models. It produces a sample from the joint posterior predictive distribution of future age- and sex-specific population counts, fertility rates and mortality rates, as well as future numbers of births and deaths. It provides graphical ways of summarizing this information, including trajectory plots and various kinds of probabilistic population pyramids. An expression language is introduced which allows the user to produce the predictive distribution of a wide variety of derived population quantities, such as the median age or the old age dependency ratio. The package produces aggregated projections for sets of countries, such as UN regions or trading blocs. The methodology has been used by the United Nations to produce their most recent official population projections for all countries, published in the World Population Prospects.
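
    bayesPop itself is an R package; the following Python toy conveys only the underlying idea, propagating sampled vital-rate trajectories to a predictive distribution of future population. All rates and the aggregate growth model are invented stand-ins for the package's Bayesian fertility and mortality posteriors.

      import numpy as np

      rng = np.random.default_rng(3)
      pop0, years, n_traj = 10.0, 30, 2000     # population in millions

      # Hypothetical posterior samples of annual growth-rate trajectories.
      growth = rng.normal(loc=0.005, scale=0.004, size=(n_traj, years))

      traj = pop0 * np.exp(np.cumsum(growth, axis=1))   # one path per sample
      lo, med, hi = np.percentile(traj[:, -1], [5, 50, 95])
      print(f"population in {years} years: median {med:.1f}M, "
            f"90% PI [{lo:.1f}, {hi:.1f}]M")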

  8. Probabilistic Design of Wave Energy Devices

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Kofoed, Jens Peter; Ferreira, C.B.

    2011-01-01

    Wave energy has a large potential for contributing significantly to production of renewable energy. However, the wave energy sector is still not able to deliver cost-competitive and reliable solutions, although it has already demonstrated several proofs of concept. The design of wave energy devices is a new and expanding technical area where there is no tradition for probabilistic design—in fact, very few full-scale devices have been built to date, so it can be said that no design tradition really exists in this area. For this reason it is considered to be of great importance to develop and advocate for a probabilistic design approach, as it is assumed (in other areas this has been demonstrated) that this leads to more economical designs compared to designs based on deterministic methods. In the present paper a general framework for probabilistic design and reliability analysis of wave energy

  9. Probabilistic Learning by Rodent Grid Cells.

    Science.gov (United States)

    Cheung, Allen

    2016-10-01

    Mounting evidence shows mammalian brains are probabilistic computers, but the specific cells involved remain elusive. Parallel research suggests that grid cells of the mammalian hippocampal formation are fundamental to spatial cognition, but their diverse response properties still defy explanation. No plausible model exists which explains stable grids in darkness for twenty minutes or longer, even though this was one of the first results ever published on grid cells. Similarly, no current explanation can tie together grid fragmentation and grid rescaling, which show very different forms of flexibility in grid responses when the environment is varied. Other properties such as attractor dynamics and grid anisotropy seem to be at odds with one another unless additional properties are assumed, such as a varying velocity gain. Modelling efforts have largely ignored the breadth of response patterns, while also failing to account for the disastrous effects of sensory noise during spatial learning and recall, especially in darkness. Here, published electrophysiological evidence from a range of experiments is reinterpreted using a novel probabilistic learning model, which shows that grid cell responses are accurately predicted by a probabilistic learning process. Diverse response properties of probabilistic grid cells are statistically indistinguishable from rat grid cells across key manipulations. A simple coherent set of probabilistic computations explains stable grid fields in darkness, partial grid rescaling in resized arenas, low-dimensional attractor grid cell dynamics, and grid fragmentation in hairpin mazes. The same computations also reconcile oscillatory dynamics at the single cell level with attractor dynamics at the cell ensemble level. Additionally, a clear functional role for boundary cells is proposed for spatial learning. These findings provide a parsimonious and unified explanation of grid cell function, and implicate grid cells as an accessible neuronal population

  10. A general XML schema and SPM toolbox for storage of neuro-imaging results and anatomical labels.

    Science.gov (United States)

    Keator, David Bryant; Gadde, Syam; Grethe, Jeffrey S; Taylor, Derek V; Potkin, Steven G

    2006-01-01

    With the increased frequency of multisite, large-scale collaborative neuro-imaging studies, the need for a general, self-documenting framework for the storage and retrieval of activation maps and anatomical labels becomes evident. To address this need, we have developed an extensible markup language (XML) schema and associated tools for the storage of neuro-imaging activation maps and anatomical labels. This schema, as part of the XML-based Clinical Experiment Data Exchange (XCEDE) schema, provides storage capabilities for analysis annotations, activation threshold parameters, and cluster- and voxel-level statistics. Activation parameters contain information describing the threshold, degrees of freedom, FWHM smoothness, search volumes, voxel sizes, expected voxels per cluster, and expected number of clusters in the statistical map. Cluster and voxel statistics can be stored along with the coordinates, threshold, and anatomical label information. Multiple threshold types can be documented for a given cluster or voxel along with the uncorrected and corrected probability values. Multiple atlases can be used to generate anatomical labels and stored for each significant voxel or cluster. Additionally, a toolbox for Statistical Parametric Mapping software (http://www.fil.ion.ucl.ac.uk/spm/) was created to capture the results from activation maps using the XML schema; it supports both the SPM99 and SPM2 versions (http://nbirn.net/Resources/Users/Applications/xcede/SPM_XMLTools.htm). Support for anatomical labeling is available via the Talairach Daemon (http://ric.uthscsa.edu/projects/talairachdaemon.html) and Automated Anatomical Labeling (http://www.cyceron.fr/freeware/).
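
    A sketch of what storing such results might look like, built with Python's ElementTree. The element and attribute names below are invented for illustration and are not the actual XCEDE schema.

      import xml.etree.ElementTree as ET

      # Hypothetical tags, loosely modelled on the capabilities described.
      activation = ET.Element('activationMap', software='SPM2')
      params = ET.SubElement(activation, 'thresholdParameters')
      ET.SubElement(params, 'fwhmSmoothness').text = '8 8 8'
      ET.SubElement(params, 'degreesOfFreedom').text = '24'

      cluster = ET.SubElement(activation, 'cluster', size='142')
      voxel = ET.SubElement(cluster, 'voxel', x='-42', y='18', z='6')
      ET.SubElement(voxel, 'pUncorrected').text = '0.0001'
      ET.SubElement(voxel, 'pCorrected').text = '0.021'
      ET.SubElement(voxel, 'anatomicalLabel',
                    atlas='Talairach').text = 'Inferior Frontal Gyrus'

      print(ET.tostring(activation, encoding='unicode'))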

  11. Probabilistic Damage Stability Calculations for Ships

    DEFF Research Database (Denmark)

    Jensen, Jørgen Juncher

    1996-01-01

    The aim of these notes is to provide background material for the present probabilistic damage stability rules for dry cargo ships. The formulas for the damage statistics are derived, and shortcomings as well as possible improvements are discussed. The advantage of the definition of fictitious compartments in the formulation of a computer-based general procedure for probabilistic damaged stability assessment is shown. Some comments are given on the current state of knowledge on ship survivability in damaged conditions. Finally, problems regarding proper account of water ingress through openings...

  12. Quantum logic networks for probabilistic teleportation

    Institute of Scientific and Technical Information of China (English)

    刘金明; 张永生; et al.

    2003-01-01

    By means of primitive operations consisting of single-qubit gates, two-qubit controlled-not gates, von Neumann measurements and classically controlled operations, we construct efficient quantum logic networks for implementing probabilistic teleportation of a single qubit, a two-particle entangled state, and an N-particle entanglement. Based on the quantum networks, we show that after the partially entangled states are concentrated into maximal entanglement, the above three kinds of probabilistic teleportation are the same as the standard teleportation using the corresponding maximally entangled states as the quantum channels.

  13. Probabilistic Durability Analysis in Advanced Engineering Design

    Directory of Open Access Journals (Sweden)

    A. Kudzys

    2000-01-01

    Full Text Available Expedience of probabilistic durability concepts and approaches in advanced engineering design of building materials, structural members and systems is considered. Target margin values of structural safety and serviceability indices are analyzed and their draft values are presented. Analytical methods of the cumulative coefficient of correlation and the limit transient action effect for the calculation of reliability indices are given. The analysis can be used for probabilistic durability assessment of carrying and enclosure structures of metal, reinforced concrete, wood, plastic and masonry, both homogeneous and sandwich or composite, and of some kinds of equipment. The analysis models can be applied in other engineering fields.

  14. Probabilistic Design of Offshore Structural Systems

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard

    1988-01-01

    Probabilistic design of structural systems is considered in this paper. The reliability is estimated using first-order reliability methods (FORM). The design problem is formulated as the optimization problem to minimize a given cost function such that the reliability of the single elements satisfies given requirements or such that the systems reliability satisfies a given requirement. Based on a sensitivity analysis, optimization procedures to solve the optimization problems are presented. Two of these procedures solve the system reliability-based optimization problem sequentially using quasi-analytical derivatives. Finally an example of probabilistic design of an offshore structure is considered.

  15. Probabilistic Design of Offshore Structural Systems

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard

    Probabilistic design of structural systems is considered in this paper. The reliability is estimated using first-order reliability methods (FORM). The design problem is formulated as the optimization problem to minimize a given cost function such that the reliability of the single elements satisfies given requirements or such that the systems reliability satisfies a given requirement. Based on a sensitivity analysis, optimization procedures to solve the optimization problems are presented. Two of these procedures solve the system reliability-based optimization problem sequentially using quasi-analytical derivatives. Finally an example of probabilistic design of an offshore structure is considered.
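
    For a linear limit state with independent normal variables, the FORM reliability index used in these papers reduces to the closed-form Hasofer-Lind expression; a minimal sketch with invented numbers follows.

      from math import sqrt
      from statistics import NormalDist

      # Limit state g = R - S (resistance minus load), R, S independent normals.
      mu_R, sd_R = 350.0, 35.0     # hypothetical resistance, e.g. MPa
      mu_S, sd_S = 200.0, 40.0     # hypothetical load effect

      beta = (mu_R - mu_S) / sqrt(sd_R**2 + sd_S**2)   # Hasofer-Lind index
      pf = NormalDist().cdf(-beta)                     # failure probability
      print(f"beta = {beta:.2f}, Pf = {pf:.2e}")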

  16. Documentation design for probabilistic risk assessment

    International Nuclear Information System (INIS)

    Parkinson, W.J.; von Herrmann, J.L.

    1985-01-01

    This paper describes a framework for documentation design of probabilistic risk assessment (PRA) and is based on the EPRI document NP-3470, ''Documentation Design for Probabilistic Risk Assessment''. The goals for PRA documentation are stated. Four audiences that PRA documentation must satisfy are identified, and the documentation consistent with the needs of each audience is discussed, i.e., the Summary Report, the Executive Summary, the Main Report, and the Appendices. The authors recommend the documentation specifications discussed herein as guides rather than rigid definitions

  17. Probabilistic calculation of dose commitment from uranium mill tailings

    International Nuclear Information System (INIS)

    1983-10-01

    The report discusses in a general way considerations of uncertainty in relation to probabilistic modelling. An example of a probabilistic calculation applied to the behaviour of uranium mill tailings is given

  18. Probabilistic inversion in priority setting of emerging zoonoses.

    NARCIS (Netherlands)

    Kurowicka, D.; Bucura, C.; Cooke, R.; Havelaar, A.H.

    2010-01-01

    This article presents a methodology for applying probabilistic inversion in combination with expert judgment to a priority-setting problem. Experts rank scenarios according to severity. A linear multi-criteria analysis model underlying the expert preferences is posited. Using probabilistic inversion, a

  19. Application of probabilistic seismic hazard models with special calculation for the waste storage sites in Egypt

    International Nuclear Information System (INIS)

    Othman, A.A.; El-Hemamy, S.T.

    2000-01-01

    Probabilistic strong motion maps of Egypt are derived by applying Gumbel models and the likelihood method to 8 earthquake source zones in Egypt and adjacent regions. Peak horizontal acceleration is mapped. Seismic data are collected from the Helwan Catalog (1900-1997), the regional catalog of earthquakes from the International Seismological Center (ISC, 1910-1993) and earthquake data reports of the US Coast and Geodetic Survey (USCGS, 1900-1994). Iso-seismic maps are also available for some events that occurred in Egypt. Some earthquake source zones are well defined on the basis of both tectonics and average seismicity rates, but a lack of understanding of the near-field effects of large earthquakes prohibits accurate estimates of ground motion in their vicinity. Some source zones have no large-scale crustal features or zones of weakness that can explain the seismicity and must, therefore, be defined simply as concentrations of seismic activity with no geological or geophysical controls on the boundaries. Other source zones lack information on low-magnitude seismicity that would be representative of longer periods of time. The new probabilistic ground motion estimates in Egypt have been compared with equivalent estimates made in 1990. The new ground motion estimates are used to produce a new peak ground acceleration map to replace the 1990 peak acceleration zoning maps in the building code of Egypt. (author)
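
    A hedged sketch of the Gumbel-based step: fit an extreme-value model to (synthetic) annual maxima and read off an exceedance probability for a design level. All parameter values below are invented.

      import numpy as np
      from scipy import stats

      # Hypothetical annual maxima of peak ground acceleration (g) at one site.
      rng = np.random.default_rng(4)
      annual_max_pga = stats.gumbel_r.rvs(loc=0.05, scale=0.02, size=80,
                                          random_state=rng)

      # Fit a Gumbel model by maximum likelihood, then compute the annual
      # probability that a design level is exceeded.
      loc, scale = stats.gumbel_r.fit(annual_max_pga)
      design_pga = 0.15
      p_exceed = stats.gumbel_r.sf(design_pga, loc, scale)
      print(f"annual P(PGA > {design_pga} g) = {p_exceed:.4f}")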

  20. Magnetic resonance angiography: infrequent anatomic variants

    International Nuclear Information System (INIS)

    Trejo, Mariano; Meli, Francisco; Lambre, Hector; Blessing, Ricardo; Gigy Traynor, Ignacio; Miguez, Victor

    2002-01-01

    Using MR angiography (3D TOF) with high-field equipment (1.5 T), we studied several infrequent intracerebral vascular anatomic variants. For their detection we emphasise the value of the post-processed images obtained after conventional angiographic sequences. These post-processed images should be included in routine protocols for evaluation of the intracerebral vascular structures. (author)

  1. Report of a rare anatomic variant

    DEFF Research Database (Denmark)

    De Brucker, Y; Ilsen, B; Muylaert, C

    2015-01-01

    We report the CT findings in a case of partial anomalous pulmonary venous return (PAPVR) from the left upper lobe in an adult. PAPVR is an anatomic variant in which one to three pulmonary veins drain into the right atrium or its tributaries, rather than into the left atrium. This results in a left...

  2. HPV Vaccine Effective at Multiple Anatomic Sites

    Science.gov (United States)

    A new study from NCI researchers finds that the HPV vaccine protects young women from infection with high-risk HPV types at the three primary anatomic sites where persistent HPV infections can cause cancer. The multi-site protection also was observed at l

  3. TIBIAL LANDMARKS IN ACL ANATOMIC REPAIR

    Directory of Open Access Journals (Sweden)

    M. V. Demesсhenko

    2016-01-01

    Full Text Available Purpose: to identify anatomical landmarks on the tibial articular surface to serve as references in preparing the tibial canal with respect to the center of the ACL footprint during single-bundle arthroscopic repair. Materials and methods. Twelve frozen knee joint specimens and 68 unpaired macerated human tibiae were studied using anatomical, morphometric and statistical methods as well as graphic simulation. Results. The center of the tibial ACL footprint was located 13.1±1.7 mm anteriorly from the posterior border of the intercondylar eminence, at 1/3 of the distance along the line connecting the apexes of the internal and external tubercles, and 6.1±0.5 mm anteriorly along the perpendicular raised at this point. Conclusion. The internal and external tubercles, as well as the posterior border of the intercondylar eminence, can be considered anatomical references to determine the center of the tibial ACL footprint and to prepare bone canals for anatomic ligament repair.

  4. Influences on anatomical knowledge: The complete arguments

    NARCIS (Netherlands)

    Bergman, E.M.; Verheijen, I.W.; Scherpbier, A.J.J.A.; Vleuten, C.P.M. van der; Bruin, A.B. De

    2014-01-01

    Eight factors are claimed to have a negative influence on anatomical knowledge of medical students: (1) teaching by nonmedically qualified teachers, (2) the absence of a core anatomy curriculum, (3) decreased use of dissection as a teaching tool, (4) lack of teaching anatomy in context, (5)

  5. Evolution of the Anatomical Theatre in Padova

    Science.gov (United States)

    Macchi, Veronica; Porzionato, Andrea; Stecco, Carla; Caro, Raffaele

    2014-01-01

    The anatomical theatre played a pivotal role in the evolution of medical education, allowing students to directly observe and participate in the process of dissection. Due to the increase of training programs in clinical anatomy, the Institute of Human Anatomy at the University of Padova has renovated its dissecting room. The main guidelines in…

  6. MR urography: Anatomical and quantitative information on ...

    African Journals Online (AJOL)

    Background and Aim: Magnetic resonance urography (MRU) is considered to be the next step in uroradiology. This technique combines superb anatomical images and functional information in a single test. In this article, we aim to present the topic of MRU in children and how it has been implemented in Northern Greece so ...

  7. Anatomically Plausible Surface Alignment and Reconstruction

    DEFF Research Database (Denmark)

    Paulsen, Rasmus R.; Larsen, Rasmus

    2010-01-01

    With the increasing clinical use of 3D surface scanners, there is a need for accurate and reliable algorithms that can produce anatomically plausible surfaces. In this paper, a combined method for surface alignment and reconstruction is proposed. It is based on an implicit surface representation...

  8. Handbook of anatomical models for radiation dosimetry

    CERN Document Server

    Eckerman, Keith F

    2010-01-01

    Covering the history of human model development, this title presents the major anatomical and physical models that have been developed for human body radiation protection, diagnostic imaging, and nuclear medicine therapy. It explores how these models have evolved and the role that modern technologies have played in this development.

  9. Anatomical characteristics of southern pine stemwood

    Science.gov (United States)

    Elaine T. Howard; Floyd G. Manwiller

    1968-01-01

    To obtain a definitive description of the wood and anatomy of all 10 species of southern pine, juvenile, intermediate, and mature wood was sampled at three heights in one tree of each species and examined under a light microscope. Photographs and three-dimensional drawings were made to illustrate the morphology. No significant anatomical differences were found...

  10. Prostatome: A combined anatomical and disease based MRI atlas of the prostate

    Energy Technology Data Exchange (ETDEWEB)

    Rusu, Mirabela; Madabhushi, Anant, E-mail: anant.madabhushi@case.edu [Case Western Reserve University, Cleveland, Ohio 44106 (United States); Bloch, B. Nicolas; Jaffe, Carl C. [Boston University School of Medicine, Boston, Massachusetts 02118 (United States); Genega, Elizabeth M. [Beth Israel Deaconess Medical Center, Boston, Massachusetts 02215 (United States); Lenkinski, Robert E.; Rofsky, Neil M. [UT Southwestern Medical Center, Dallas, Texas 75235 (United States); Feleppa, Ernest [Riverside Research Institute, New York, New York 10038 (United States)

    2014-07-15

    Purpose: In this work, the authors introduce a novel framework, the anatomically constrained registration (AnCoR) scheme, and apply it to create a fused anatomic-disease atlas of the prostate which the authors refer to as the prostatome. The prostatome combines an MRI-based anatomic and a histology-based disease atlas. Statistical imaging atlases allow for the integration of information across multiple scales and imaging modalities into a single canonical representation, in turn enabling a fused anatomical-disease representation which may facilitate the characterization of disease appearance relative to anatomic structures. While statistical atlases have been extensively developed and studied for the brain, approaches that have attempted to combine pathology and imaging data for the study of prostate pathology are not extant. This work seeks to address this gap. Methods: The AnCoR framework optimizes a scoring function composed of two surface (prostate and central gland) misalignment measures and one intensity-based similarity term. This ensures the correct mapping of anatomic regions into the atlas, even when regional MRI intensities are inconsistent or highly variable between subjects. The framework allows for creation of an anatomic imaging and a disease atlas, while enabling their fusion into the anatomic imaging-disease atlas. The atlas presented here was constructed using 83 subjects with biopsy-confirmed cancer who had pre-operative MRI (collected at two institutions) followed by radical prostatectomy. The imaging atlas results from mapping the in vivo MRI into the canonical space, while the anatomic regions serve as domain constraints. Elastic co-registration of MRI and corresponding ex vivo histology provides “ground truth” mapping of cancer extent on in vivo imaging for 23 subjects. Results: AnCoR was evaluated relative to alternative construction strategies that use either MRI intensities or the prostate surface alone for registration. The AnCoR framework

  11. Prostatome: A combined anatomical and disease based MRI atlas of the prostate

    International Nuclear Information System (INIS)

    Rusu, Mirabela; Madabhushi, Anant; Bloch, B. Nicolas; Jaffe, Carl C.; Genega, Elizabeth M.; Lenkinski, Robert E.; Rofsky, Neil M.; Feleppa, Ernest

    2014-01-01

    Purpose: In this work, the authors introduce a novel framework, the anatomically constrained registration (AnCoR) scheme, and apply it to create a fused anatomic-disease atlas of the prostate which the authors refer to as the prostatome. The prostatome combines an MRI-based anatomic and a histology-based disease atlas. Statistical imaging atlases allow for the integration of information across multiple scales and imaging modalities into a single canonical representation, in turn enabling a fused anatomical-disease representation which may facilitate the characterization of disease appearance relative to anatomic structures. While statistical atlases have been extensively developed and studied for the brain, approaches that have attempted to combine pathology and imaging data for the study of prostate pathology are not extant. This work seeks to address this gap. Methods: The AnCoR framework optimizes a scoring function composed of two surface (prostate and central gland) misalignment measures and one intensity-based similarity term. This ensures the correct mapping of anatomic regions into the atlas, even when regional MRI intensities are inconsistent or highly variable between subjects. The framework allows for creation of an anatomic imaging and a disease atlas, while enabling their fusion into the anatomic imaging-disease atlas. The atlas presented here was constructed using 83 subjects with biopsy-confirmed cancer who had pre-operative MRI (collected at two institutions) followed by radical prostatectomy. The imaging atlas results from mapping the in vivo MRI into the canonical space, while the anatomic regions serve as domain constraints. Elastic co-registration of MRI and corresponding ex vivo histology provides “ground truth” mapping of cancer extent on in vivo imaging for 23 subjects. Results: AnCoR was evaluated relative to alternative construction strategies that use either MRI intensities or the prostate surface alone for registration. The AnCoR framework
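
    A toy sketch of a composite score of the kind described, two surface misalignment terms plus an intensity similarity term. The function name, weights and distance inputs are invented; the actual AnCoR objective and its optimization are defined in the paper.

      import numpy as np

      def ancor_like_score(fixed_img, moving_img, surf_dists,
                           weights=(1.0, 1.0, 1.0)):
          """Composite registration score: mean surface distances for the
          prostate and central gland plus a (negated) intensity similarity."""
          w_prostate, w_cg, w_int = weights
          prostate_term = surf_dists['prostate'].mean()
          cg_term = surf_dists['central_gland'].mean()
          # Normalized cross-correlation as the intensity similarity term.
          a = fixed_img - fixed_img.mean()
          b = moving_img - moving_img.mean()
          ncc = (a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b))
          return w_prostate * prostate_term + w_cg * cg_term - w_int * ncc

      rng = np.random.default_rng(5)
      score = ancor_like_score(rng.random((16, 16)), rng.random((16, 16)),
                               {'prostate': rng.random(100),
                                'central_gland': rng.random(100)})
      print(score)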

  12. Review of the Brunswick Steam Electric Plant Probabilistic Risk Assessment

    International Nuclear Information System (INIS)

    Sattison, M.B.; Davis, P.R.; Satterwhite, D.G.; Gilmore, W.E.; Gregg, R.E.

    1989-11-01

    A review of the Brunswick Steam Electric Plant Probabilistic Risk Assessment was conducted with the objective of confirming the safety perspectives brought to light by the probabilistic risk assessment. The scope of the review included the entire Level I probabilistic risk assessment, including external events, which is consistent with the scope of the probabilistic risk assessment itself. The review included an assessment of the assumptions, methods, models, and data used in the study. 47 refs., 14 figs., 15 tabs

  13. Arbitrage and Hedging in a non probabilistic framework

    OpenAIRE

    Alvarez, Alexander; Ferrando, Sebastian; Olivares, Pablo

    2011-01-01

    The paper studies the concepts of hedging and arbitrage in a non-probabilistic framework. It provides conditions for non-probabilistic arbitrage based on the topological structure of the trajectory space and makes connections with the usual notion of arbitrage. Several examples illustrate non-probabilistic arbitrage as well as perfect replication of options under continuous and discontinuous trajectories; the results can then be applied in probabilistic models path by path. The approach is r...

  14. A common fixed point for operators in probabilistic normed spaces

    International Nuclear Information System (INIS)

    Ghaemi, M.B.; Lafuerza-Guillen, Bernardo; Razani, A.

    2009-01-01

    Probabilistic metric spaces were introduced by Karl Menger. Alsina, Schweizer and Sklar gave a general definition of probabilistic normed spaces based on the definition of Menger [Alsina C, Schweizer B, Sklar A. On the definition of a probabilistic normed space. Aequationes Math 1993;46:91-8]. Here, we consider the equicontinuity of a class of linear operators in probabilistic normed spaces, and finally a common fixed point theorem is proved. An application to quantum mechanics is considered.

  15. ANATOMIC STRUCTURE OF CAMPANULA ROTUNDIFOLIA L. GRASS

    Directory of Open Access Journals (Sweden)

    V. N. Bubenchikova

    2017-01-01

    Full Text Available The article presents the results of a study of the anatomical structure of Campanula rotundifolia grass, from the family Campanulaceae. Despite its wide distribution and its application in folk medicine, there are no data on its anatomical structure; therefore, to establish indices of authenticity and quality of the raw material, microdiagnostic features must be developed first, which would help introduce this plant into medical practice. The purpose of this work is to study the anatomical structure of Campanula rotundifolia grass to determine its diagnostic features. Methods. The study of the anatomical structure was carried out in accordance with the requirements of the State Pharmacopoeia, edition XIII. A Micromed laboratory microscope with a digital attachment was used to take microphotographs, and Photoshop CC was used for their processing. Result. We have established that the stalk epidermis is prosenchymal and slightly winding, with straight or splayed end cells. Studying the leaf epidermis, we established that the upper epidermal cells have straight, slightly winding walls, while the cells of the lower epidermis have more winding walls with a longitudinally wrinkled cuticle. Simple, one-celled, thin-walled, rough papillose hairs are present on the leaf and stalk epidermis. The epidermal cells in the fauces of the corolla are prosenchymal with winding walls; those in the cup have straight or winding walls. Papillary excrescences can be found along the cup edges. The stomatal apparatus is anomocytic. Conclusion. As a result of the study, we have examined the anatomical structure of Campanula rotundifolia grass and determined microdiagnostic features for confirming the authenticity of the raw material: the presence of simple, one-celled, thin-walled, rough papillose hairs on both leaf epidermises, along the veins and the leaf edge, and on the stalk epidermis, as well as the presence of epidermal cells with papillary excrescences along the edges of the leaves and cups. Intercellular canals are situated along the

  16. Statistical methods in physical mapping

    International Nuclear Information System (INIS)

    Nelson, D.O.

    1995-05-01

    One of the great success stories of modern molecular genetics has been the ability of biologists to isolate and characterize the genes responsible for serious inherited diseases like fragile X syndrome, cystic fibrosis and myotonic muscular dystrophy. This dissertation concentrates on constructing high-resolution physical maps. It demonstrates how probabilistic modeling and statistical analysis can aid molecular geneticists in the tasks of planning, execution, and evaluation of physical maps of chromosomes and large chromosomal regions. The dissertation is divided into six chapters. Chapter 1 provides an introduction to the field of physical mapping, describing the role of physical mapping in gene isolation and past efforts at mapping chromosomal regions. The next two chapters review and extend known results on predicting progress in large mapping projects. Such predictions help project planners decide between various approaches and tactics for mapping large regions of the human genome. Chapter 2 shows how probability models have been used in the past to predict progress in mapping projects. Chapter 3 presents new results, based on stationary point process theory, for progress measures for mapping projects based on directed mapping strategies. Chapter 4 describes in detail the construction of an initial high-resolution physical map for human chromosome 19. This chapter introduces the probability and statistical models involved in map construction in the context of a large, ongoing physical mapping project. Chapter 5 concentrates on one such model, the trinomial model. This chapter contains new results on the large-sample behavior of this model, including distributional results, asymptotic moments, and detection error rates. In addition, it contains an optimality result concerning experimental procedures based on the trinomial model. The last chapter explores unsolved problems and describes future work.

  17. Statistical methods in physical mapping

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, David O. [Univ. of California, Berkeley, CA (United States)

    1995-05-01

    One of the great success stories of modern molecular genetics has been the ability of biologists to isolate and characterize the genes responsible for serious inherited diseases like fragile X syndrome, cystic fibrosis and myotonic muscular dystrophy. This dissertation concentrates on constructing high-resolution physical maps. It demonstrates how probabilistic modeling and statistical analysis can aid molecular geneticists in the tasks of planning, execution, and evaluation of physical maps of chromosomes and large chromosomal regions. The dissertation is divided into six chapters. Chapter 1 provides an introduction to the field of physical mapping, describing the role of physical mapping in gene isolation and past efforts at mapping chromosomal regions. The next two chapters review and extend known results on predicting progress in large mapping projects. Such predictions help project planners decide between various approaches and tactics for mapping large regions of the human genome. Chapter 2 shows how probability models have been used in the past to predict progress in mapping projects. Chapter 3 presents new results, based on stationary point process theory, for progress measures for mapping projects based on directed mapping strategies. Chapter 4 describes in detail the construction of an initial high-resolution physical map for human chromosome 19. This chapter introduces the probability and statistical models involved in map construction in the context of a large, ongoing physical mapping project. Chapter 5 concentrates on one such model, the trinomial model. This chapter contains new results on the large-sample behavior of this model, including distributional results, asymptotic moments, and detection error rates. In addition, it contains an optimality result concerning experimental procedures based on the trinomial model. The last chapter explores unsolved problems and describes future work.
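
    The dissertation's own models are not reproduced here, but the classic Lander-Waterman formulas give the flavor of predicting progress in a clone-based mapping project; all project numbers below are invented.

      import math

      # Lander-Waterman-style progress prediction for a clone-based map.
      G = 60e6      # target region size (bp), hypothetical
      L = 40e3      # clone insert length (bp)
      N = 3000      # clones fingerprinted so far
      theta = 0.25  # overlap fraction needed to detect an overlap

      c = L * N / G                          # redundancy (coverage)
      sigma = 1.0 - theta
      islands = N * math.exp(-c * sigma)     # expected apparent islands
      covered = 1.0 - math.exp(-c)           # expected fraction covered
      print(f"coverage {c:.1f}x, ~{islands:.0f} islands, {covered:.1%} covered")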

  18. Decomposing the Hounsfield unit: probabilistic segmentation of brain tissue in computed tomography.

    Science.gov (United States)

    Kemmling, A; Wersching, H; Berger, K; Knecht, S; Groden, C; Nölte, I

    2012-03-01

    The aim of this study was to present and evaluate a standardized technique for brain segmentation of cranial computed tomography (CT) using probabilistic partial volume tissue maps based on a database of high resolution T1 magnetic resonance images (MRI). Probabilistic tissue maps of white matter (WM), gray matter (GM) and cerebrospinal fluid (CSF) were derived from 600 normal brain MRIs (3.0 Tesla, T1-3D-turbo-field-echo) of 2 large community-based population studies (BiDirect and SEARCH Health studies). After partial tissue segmentation (FAST 4.0), MR images were linearly registered to MNI-152 standard space (FLIRT 5.5) with non-linear refinement (FNIRT 1.0) to obtain non-binary probabilistic volume images for each tissue class which were subsequently used for CT segmentation. From 150 normal cerebral CT scans a customized reference image in standard space was constructed with iterative non-linear registration to MNI-152 space. The inverse warp of tissue-specific probability maps to CT space (MNI-152 to individual CT) was used to decompose a CT image into tissue specific components (GM, WM, CSF). Potential benefits and utility of this novel approach with regard to unsupervised quantification of CT images and possible visual enhancement are addressed. Illustrative examples of tissue segmentation in different pathological cases including perfusion CT are presented. Automated tissue segmentation of cranial CT images using highly refined tissue probability maps derived from high resolution MR images is feasible. Potential applications include automated quantification of WM in leukoaraiosis, CSF in hydrocephalic patients, GM in neurodegeneration and ischemia and perfusion maps with separate assessment of GM and WM.
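
    Once the probabilistic tissue maps are warped into CT space, probability-weighted quantification is straightforward; a sketch on synthetic arrays follows (real use would load co-registered volumes, e.g. with nibabel, and the voxel size is an invented example).

      import numpy as np

      # Synthetic stand-ins: a CT volume and co-registered probabilistic
      # tissue maps (per-voxel probabilities for GM, WM, CSF, other).
      rng = np.random.default_rng(6)
      shape = (64, 64, 32)
      ct = rng.normal(30.0, 10.0, size=shape)              # Hounsfield units
      p = rng.dirichlet([3.0, 3.0, 1.0, 0.5], size=shape)  # sums to 1 per voxel
      p_gm, p_wm, p_csf = p[..., 0], p[..., 1], p[..., 2]

      voxel_ml = 0.001 * 1.0 * 1.0 * 2.0   # hypothetical 1x1x2 mm voxels, in ml
      for name, prob in [('GM', p_gm), ('WM', p_wm), ('CSF', p_csf)]:
          volume = prob.sum() * voxel_ml             # probability-weighted volume
          mean_hu = (ct * prob).sum() / prob.sum()   # tissue-weighted mean HU
          print(f"{name}: {volume:.0f} ml, mean {mean_hu:.1f} HU")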

  19. Probabilistic seismic hazard estimates incorporating site effects - An example from Indiana, U.S.A

    Science.gov (United States)

    Hasse, J.S.; Park, C.H.; Nowack, R.L.; Hill, J.R.

    2010-01-01

    The U.S. Geological Survey (USGS) has published probabilistic earthquake hazard maps for the United States based on current knowledge of past earthquake activity and geological constraints on earthquake potential. These maps for the central and eastern United States assume standard site conditions with S-wave velocities of 760 m/s in the top 30 m. For urban and infrastructure planning and long-term budgeting, the public is interested in similar probabilistic seismic hazard maps that take into account near-surface geological materials. We have implemented a probabilistic method for incorporating site effects into the USGS seismic hazard analysis that takes into account the first-order effects of the surface geologic conditions. The thicknesses of sediments, which play a large role in amplification, were derived from a P-wave refraction database with over 13,000 profiles, and a preliminary geology-based velocity model was constructed from available information on S-wave velocities. An interesting feature of the preliminary hazard maps incorporating site effects is the approximate factor-of-two increase in the 1-Hz spectral acceleration with 2 percent probability of exceedance in 50 years for parts of the greater Indianapolis metropolitan region and surrounding parts of central Indiana. This effect is primarily due to the relatively thick sequence of sediments infilling ancient bedrock topography that has been deposited since the Pleistocene Epoch. As expected, the Late Pleistocene and Holocene depositional systems of the Wabash and Ohio Rivers produce additional amplification in the southwestern part of Indiana. Ground motions decrease, as would be expected, toward the bedrock units in south-central Indiana, where motions are significantly lower than the values on the USGS maps.

  20. Delineating probabilistic species pools in ecology and biogeography

    OpenAIRE

    Karger, Dirk Nikolaus; Cord, Anna F; Kessler, Michael; Kreft, Holger; Kühn, Ingolf; Pompe, Sven; Sandel, Brody; Sarmento Cabral, Juliano; Smith, Adam B; Svenning, Jens-Christian; Tuomisto, Hanna; Weigelt, Patrick; Wesche, Karsten

    2016-01-01

    Aim: To provide a mechanistic and probabilistic framework for defining the species pool based on species-specific probabilities of dispersal, environmental suitability and biotic interactions within a specific temporal extent, and to show how probabilistic species pools can help disentangle the geographical structure of different community assembly processes. Innovation: Probabilistic species pools provide an improved species pool definition based on probabilities in conjuncti...

  1. Probabilistic analysis of tokamak plasma disruptions

    International Nuclear Information System (INIS)

    Sanzo, D.L.; Apostolakis, G.E.

    1985-01-01

    An approximate analytical solution to the heat conduction equations used in modeling component melting and vaporization resulting from plasma disruptions is presented. This solution is then used to propagate uncertainties in the input data characterizing disruptions, namely, energy density and disruption time, to obtain a probabilistic description of the output variables of interest, material melted and vaporized. (orig.)
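
    The propagation step can be sketched generically: sample the two uncertain disruption inputs and push them through a response model. The paper's approximate analytical heat-conduction solution is not reproduced here; the surrogate melt model and every number below are invented for illustration only:

      import numpy as np

      rng = np.random.default_rng(0)
      n = 100_000

      # Hypothetical input uncertainties (not the paper's values):
      # disruption energy density E [MJ/m^2] and disruption time t [ms].
      E = rng.normal(loc=2.0, scale=0.4, size=n)   # MJ/m^2
      t = rng.uniform(0.1, 1.0, size=n)            # ms

      # Crude energy-balance surrogate: energy conducted away scales with sqrt(t);
      # the excess over that threshold melts a layer d = (E - E_th) / (rho * L).
      E_th = 1.0 * np.sqrt(t)                      # MJ/m^2 absorbed without melting
      rho_L = 25.0                                 # MJ/m^3, density times latent heat
      melt_depth = np.clip(E - E_th, 0.0, None) / rho_L   # melt depth [m]

      print("P(any melting) =", (melt_depth > 0).mean())
      print("95th percentile melt depth [m] =", np.quantile(melt_depth, 0.95))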

  2. Strong Ideal Convergence in Probabilistic Metric Spaces

    Indian Academy of Sciences (India)

    In the present paper we introduce the concepts of strongly ideal convergent sequence and strong ideal Cauchy sequence in a probabilistic metric (PM) space endowed with the strong topology, and establish some basic facts. Next, we define the strong ideal limit points and the strong ideal cluster points of a sequence in this ...

  3. Quantum Probabilistic Dyadic Second-Order Logic

    NARCIS (Netherlands)

    Baltag, A.; Bergfeld, J.M.; Kishida, K.; Sack, J.; Smets, S.J.L.; Zhong, S.; Libkin, L.; Kohlenbach, U.; de Queiroz, R.

    2013-01-01

    We propose an expressive but decidable logic for reasoning about quantum systems. The logic is endowed with tensor operators to capture properties of composite systems, and with probabilistic predication formulas P≥r(s), saying that a quantum system in state s will yield the answer ‘yes’ (i.e.

  4. Probabilistic analysis of a materially nonlinear structure

    Science.gov (United States)

    Millwater, H. R.; Wu, Y.-T.; Fossum, A. F.

    1990-01-01

    A probabilistic finite element program is used to perform probabilistic analysis of a materially nonlinear structure. The program used in this study is NESSUS (Numerical Evaluation of Stochastic Structure Under Stress), under development at Southwest Research Institute. The cumulative distribution function (CDF) of the radial stress of a thick-walled cylinder under internal pressure is computed and compared with the analytical solution. In addition, sensitivity factors showing the relative importance of the input random variables are calculated. Significant plasticity is present in this problem and has a pronounced effect on the probabilistic results. The random input variables are the material yield stress and internal pressure with Weibull and normal distributions, respectively. The results verify the ability of NESSUS to compute the CDF and sensitivity factors of a materially nonlinear structure. In addition, the ability of the Advanced Mean Value (AMV) procedure to assess the probabilistic behavior of structures which exhibit a highly nonlinear response is shown. Thus, the AMV procedure can be applied with confidence to other structures which exhibit nonlinear behavior.
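
    As a toy illustration of the sampling idea (not of NESSUS or the AMV procedure themselves), the sketch below Monte Carlo samples the two random inputs with the stated distribution families and evaluates the purely elastic Lamé radial stress of a thick-walled cylinder; the plasticity that dominates the paper's results is deliberately left out, and all dimensions and distribution parameters are invented:

      import numpy as np

      rng = np.random.default_rng(1)
      n = 200_000

      a, b, r = 0.10, 0.20, 0.15              # inner/outer/evaluation radii [m]
      p = rng.normal(100e6, 10e6, n)          # internal pressure [Pa], normal
      sigma_y = 400e6 * rng.weibull(8.0, n)   # yield stress [Pa], Weibull

      # Elastic Lame radial stress at radius r (compressive, hence negative).
      sigma_r = (a**2 * p) / (b**2 - a**2) * (1.0 - b**2 / r**2)
      print("median sigma_r [MPa]:", np.median(sigma_r) / 1e6)
      print("P(sigma_r < -30 MPa):", (sigma_r < -30e6).mean())  # one point of the CDF

      # Tresca first yield at the bore: sigma_theta - sigma_r = 2*p*b^2/(b^2 - a^2),
      # so plasticity starts once p exceeds sigma_y*(b^2 - a^2)/(2*b^2).
      p_yield = sigma_y * (b**2 - a**2) / (2 * b**2)
      print("P(onset of plasticity):", (p > p_yield).mean())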

  5. Probabilistic Programming : A True Verification Challenge

    NARCIS (Netherlands)

    Katoen, Joost P.; Finkbeiner, Bernd; Pu, Geguang; Zhang, Lijun

    2015-01-01

    Probabilistic programs [6] are sequential programs, written in languages like C, Java, Scala, or ML, with two added constructs: (1) the ability to draw values at random from probability distributions, and (2) the ability to condition values of variables in a program through observations. For a

  6. Probabilistic calculation for angular dependence collision

    International Nuclear Information System (INIS)

    Villarino, E.A.

    1990-01-01

    The collision probability method is broadly used in cylindrical geometry (in one or two dimensions). It constitutes a powerful tool for the heterogeneous Response Method, where the coupling current is of the cosine type, that is, without angular dependence on the azimuthal angle θ and proportional to μ (the cosine of the polar angle θ). (Author) [es

  7. Probabilistic safety assessment in radioactive waste disposal

    International Nuclear Information System (INIS)

    Robinson, P.C.

    1987-07-01

    Probabilistic safety assessment codes are now widely used in radioactive waste disposal assessments. This report gives an overview of the current state of the field. The relationship between the codes and the regulations covering radioactive waste disposal is discussed, and the characteristics of current codes are described. The problems of verification and validation are considered. (author)

  8. Ignorability in Statistical and Probabilistic Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2005-01-01

    When dealing with incomplete data in statistical learning, or incomplete observations in probabilistic inference, one needs to distinguish the fact that a certain event is observed from the fact that the observed event has happened. Since the modeling and computational complexities entailed...

  9. Probabilistic fuzzy systems as additive fuzzy systems

    NARCIS (Netherlands)

    Almeida, R.J.; Verbeek, N.; Kaymak, U.; Costa Sousa, da J.M.; Laurent, A.; Strauss, O.; Bouchon-Meunier, B.; Yager, R.

    2014-01-01

    Probabilistic fuzzy systems combine a linguistic description of the system behaviour with statistical properties of data. It was originally derived based on Zadeh’s concept of probability of a fuzzy event. Two possible and equivalent additive reasoning schemes were proposed, that lead to the

  10. Probabilistic studies for a safety assurance program

    International Nuclear Information System (INIS)

    Iyer, S.S.; Davis, J.F.

    1985-01-01

    The adequate supply of energy is always a matter of concern for any country. Nuclear power has played, and will continue to play an important role in supplying this energy. However, safety in nuclear power production is a fundamental prerequisite in fulfilling this role. This paper outlines a program to ensure safe operation of a nuclear power plant utilizing the Probabilistic Safety Studies

  11. Probabilistic safety goals. Phase 3 - Status report

    Energy Technology Data Exchange (ETDEWEB)

    Holmberg, J.-E. (VTT (Finland)); Knochenhauer, M. (Relcon Scandpower AB, Sundbyberg (Sweden))

    2009-07-15

    The first phase of the project (2006) described the status, concepts and history of probabilistic safety goals for nuclear power plants. The second and third phases (2007-2008) have provided guidance related to the resolution of some of the problems identified, and resulted in a common understanding regarding the definition of safety goals. The basic aim of phase 3 (2009) has been to increase the scope and level of detail of the project, and to start preparations of a guidance document. Based on the conclusions from the previous project phases, the following issues have been covered: 1) Extension of international overview. Analysis of results from the questionnaire performed within the ongoing OECD/NEA WGRISK activity on probabilistic safety criteria, including participation in the preparation of the working report for OECD/NEA/WGRISK (to be finalised in phase 4). 2) Use of subsidiary criteria and relations between these (to be finalised in phase 4). 3) Numerical criteria when using probabilistic analyses in support of deterministic safety analysis (to be finalised in phase 4). 4) Guidance for the formulation, application and interpretation of probabilistic safety criteria (to be finalised in phase 4). (LN)

  12. Safety Verification for Probabilistic Hybrid Systems

    DEFF Research Database (Denmark)

    Zhang, Lijun; She, Zhikun; Ratschan, Stefan

    2010-01-01

    The interplay of random phenomena and continuous real-time control deserves increased attention for instance in wireless sensing and control applications. Safety verification for such systems thus needs to consider probabilistic variations of systems with hybrid dynamics. In safety verification o...... on a number of case studies, tackled using a prototypical implementation....

  13. Ambient Surveillance by Probabilistic-Possibilistic Perception

    NARCIS (Netherlands)

    Bittermann, M.S.; Ciftcioglu, O.

    2013-01-01

    A method for quantifying ambient surveillance is presented, which is based on probabilistic-possibilistic perception. The human surveillance of a scene through observing camera sensed images on a monitor is modeled in three steps. First immersion of the observer is simulated by modeling perception

  14. HERMES probabilistic risk assessment. Pilot study

    International Nuclear Information System (INIS)

    Parisot, F.; Munoz, J.

    1993-01-01

    The study, performed in 1989, of the contribution of probabilistic analysis to the optimal construction of system safety status in the aeronautical and European nuclear industries shows a growing trend towards the incorporation of quantitative safety assessment, and led to an agreement to undertake a prototype proof study on Hermes. The main steps of the study and its results are presented in the paper

  15. Some probabilistic properties of fractional point processes

    KAUST Repository

    Garra, Roberto; Orsingher, Enzo; Scavino, Marco

    2017-01-01

    The probabilities P{T_k^(α) < ∞} are explicitly obtained and analyzed. The processes N^f(t) are time-changed Poisson processes N(H^f(t)) with subordinators H^f(t), and here we study N(Σ_{j=1}^n H^{f_j}(t)) and obtain probabilistic features

  16. Strong Statistical Convergence in Probabilistic Metric Spaces

    OpenAIRE

    Şençimen, Celaleddin; Pehlivan, Serpil

    2008-01-01

    In this article, we introduce the concepts of strongly statistically convergent sequence and strong statistically Cauchy sequence in a probabilistic metric (PM) space endowed with the strong topology, and establish some basic facts. Next, we define the strong statistical limit points and the strong statistical cluster points of a sequence in this space and investigate the relations between these concepts.

  17. Effectiveness of Securities with Fuzzy Probabilistic Return

    Directory of Open Access Journals (Sweden)

    Krzysztof Piasecki

    2011-01-01

    Full Text Available The generalized fuzzy present value of a security is defined here as fuzzy valued utility of cash flow. The generalized fuzzy present value cannot depend on the value of future cash flow. There exists such a generalized fuzzy present value which is not a fuzzy present value in the sense given by some authors. If the present value is a fuzzy number and the future value is a random one, then the return rate is given as a probabilistic fuzzy subset on a real line. This kind of return rate is called a fuzzy probabilistic return. The main goal of this paper is to derive the family of effective securities with fuzzy probabilistic return. Achieving this goal requires the study of the basic parameters characterizing fuzzy probabilistic return. Therefore, fuzzy expected value and variance are determined for this case of return. These results are a starting point for constructing a three-dimensional image. The set of effective securities is introduced as the Pareto optimal set determined by the maximization of the expected return rate and minimization of the variance. Finally, the set of effective securities is distinguished as a fuzzy set. These results are obtained without the assumption that the distribution of future values is Gaussian. (original abstract

  18. Dialectical Multivalued Logic and Probabilistic Theory

    Directory of Open Access Journals (Sweden)

    José Luis Usó Doménech

    2017-02-01

    Full Text Available There are two probabilistic algebras: one for classical probability and the other for quantum mechanics. Naturally, it is the relation to the object that decides, as in the case of logic, which algebra is to be used. From a paraconsistent multivalued logic, therefore, one can derive a probability theory, adding the correspondence between truth value and fortuity.

  19. Revisiting the formal foundation of Probabilistic Databases

    NARCIS (Netherlands)

    Wanders, B.; van Keulen, Maurice

    2015-01-01

    One of the core problems in soft computing is dealing with uncertainty in data. In this paper, we revisit the formal foundation of a class of probabilistic databases with the purpose to (1) obtain data model independence, (2) separate metadata on uncertainty and probabilities from the raw data, (3)

  20. Probabilistic Resource Analysis by Program Transformation

    DEFF Research Database (Denmark)

    Kirkeby, Maja Hanne; Rosendahl, Mads

    2016-01-01

    The aim of a probabilistic resource analysis is to derive a probability distribution of possible resource usage for a program from a probability distribution of its input. We present an automated multi-phase rewriting based method to analyze programs written in a subset of C. It generates...

  1. Application of probabilistic precipitation forecasts from a ...

    African Journals Online (AJOL)

    Application of probabilistic precipitation forecasts from a deterministic model towards increasing the lead-time of flash flood forecasts in South Africa. ... The procedure is applied to a real flash flood event and the ensemble-based rainfall forecasts are verified against rainfall estimated by the SAFFG system. The approach ...

  2. Probabilistic safety assessment goals in Canada

    International Nuclear Information System (INIS)

    Snell, V.G.

    1986-01-01

    CANDU safety philosophy, both in design and in licensing, has always had a strong bias towards quantitative probabilistically-based goals derived from comparative safety. Formal probabilistic safety assessment began in Canada as a design tool. The influence of this carried over later on into the definition of the deterministic safety guidelines used in CANDU licensing. Design goals were further developed which extended the consequence/frequency spectrum of 'acceptable' events, from the two points defined by the deterministic single/dual failure analysis, to a line passing through lower and higher frequencies. Since these were design tools, a complete risk summation was not necessary, allowing a cutoff at low event frequencies while preserving the identification of the most significant safety-related events. These goals gave a logical framework for making decisions on implementing design changes proposed as a result of the Probabilistic Safety Analysis. Performing this analysis became a regulatory requirement, and the design goals remained the framework under which this was submitted. Recently, there have been initiatives to incorporate more detailed probabilistic safety goals into the regulatory process in Canada. These range from far-reaching safety optimization across society, to initiatives aimed at the nuclear industry only. The effectiveness of the latter is minor at very low and very high event frequencies; at medium frequencies, a justification against expenditures per life saved in other industries should be part of the goal setting

  3. Overview of the probabilistic risk assessment approach

    International Nuclear Information System (INIS)

    Reed, J.W.

    1985-01-01

    The techniques of probabilistic risk assessment (PRA) are applicable to Department of Energy facilities. The background and techniques of PRA are given with special attention to seismic, wind and flooding external events. A specific application to seismic events is provided to demonstrate the method. However, the PRA framework is applicable also to wind and external flooding. 3 references, 8 figures, 1 table

  4. Probabilistic safety goals. Phase 3 - Status report

    International Nuclear Information System (INIS)

    Holmberg, J.-E.; Knochenhauer, M.

    2009-07-01

    The first phase of the project (2006) described the status, concepts and history of probabilistic safety goals for nuclear power plants. The second and third phases (2007-2008) have provided guidance related to the resolution of some of the problems identified, and resulted in a common understanding regarding the definition of safety goals. The basic aim of phase 3 (2009) has been to increase the scope and level of detail of the project, and to start preparations of a guidance document. Based on the conclusions from the previous project phases, the following issues have been covered: 1) Extension of international overview. Analysis of results from the questionnaire performed within the ongoing OECD/NEA WGRISK activity on probabilistic safety criteria, including participation in the preparation of the working report for OECD/NEA/WGRISK (to be finalised in phase 4). 2) Use of subsidiary criteria and relations between these (to be finalised in phase 4). 3) Numerical criteria when using probabilistic analyses in support of deterministic safety analysis (to be finalised in phase 4). 4) Guidance for the formulation, application and interpretation of probabilistic safety criteria (to be finalised in phase 4). (LN)

  5. Probabilistic Relational Structures and Their Applications

    Science.gov (United States)

    Domotor, Zoltan

    The principal objects of the investigation reported were, first, to study qualitative probability relations on Boolean algebras, and secondly, to describe applications in the theories of probability logic, information, automata, and probabilistic measurement. The main contribution of this work is stated in 10 definitions and 20 theorems. The basic…

  6. Branching bisimulation congruence for probabilistic systems

    NARCIS (Netherlands)

    Andova, S.; Georgievska, S.; Trcka, N.

    2012-01-01

    A notion of branching bisimilarity for the alternating model of probabilistic systems, compatible with parallel composition, is defined. For a congruence result, an internal transition immediately followed by a non-trivial probability distribution is not considered inert. A weaker definition of

  7. On Probabilistic Automata in Continuous Time

    DEFF Research Database (Denmark)

    Eisentraut, Christian; Hermanns, Holger; Zhang, Lijun

    2010-01-01

    We develop a compositional behavioural model that integrates a variation of probabilistic automata into a conservative extension of interactive Markov chains. The model is rich enough to embody the semantics of generalised stochastic Petri nets. We define strong and weak bisimulations and discuss...

  8. Bisimulations Meet PCTL Equivalences for Probabilistic Automata

    DEFF Research Database (Denmark)

    Song, Lei; Zhang, Lijun; Godskesen, Jens Chr.

    2011-01-01

    Probabilistic automata (PA) [20] have been successfully applied in the formal verification of concurrent and stochastic systems. Efficient model checking algorithms have been studied, where the most often used logics for expressing properties are based on PCTL [11] and its extension PCTL∗ [4...

  9. Searching Algorithms Implemented on Probabilistic Systolic Arrays

    Czech Academy of Sciences Publication Activity Database

    Kramosil, Ivan

    1996-01-01

    Roč. 25, č. 1 (1996), s. 7-45 ISSN 0308-1079 R&D Projects: GA ČR GA201/93/0781 Keywords : searching algorithms * probabilistic algorithms * systolic arrays * parallel algorithms Impact factor: 0.214, year: 1996

  10. Financial Markets Analysis by Probabilistic Fuzzy Modelling

    NARCIS (Netherlands)

    J.H. van den Berg (Jan); W.-M. van den Bergh (Willem-Max); U. Kaymak (Uzay)

    2003-01-01

    For successful trading in financial markets, it is important to develop financial models where one can identify different states of the market for modifying one's actions. In this paper, we propose to use probabilistic fuzzy systems for this purpose. We concentrate on Takagi-Sugeno

  11. Towards decision making via expressive probabilistic ontologies

    NARCIS (Netherlands)

    Acar, Erman; Thorne, Camilo; Stuckenschmidt, Heiner

    2015-01-01

    We propose a framework for automated multi-attribute decision making, employing the probabilistic non-monotonic description logics proposed by Lukasiewicz in 2008. Using this framework, we can model artificial agents in decision-making

  12. The Probabilistic Nature of Preferential Choice

    Science.gov (United States)

    Rieskamp, Jorg

    2008-01-01

    Previous research has developed a variety of theories explaining when and why people's decisions under risk deviate from the standard economic view of expected utility maximization. These theories are limited in their predictive accuracy in that they do not explain the probabilistic nature of preferential choice, that is, why an individual makes…

  13. A Probabilistic Framework for Curve Evolution

    DEFF Research Database (Denmark)

    Dahl, Vedrana Andersen

    2017-01-01

    approach include ability to handle textured images, simple generalization to multiple regions, and efficiency in computation. We test our probabilistic framework in combination with parametric (snakes) and geometric (level-sets) curves. The experimental results on composed and natural images demonstrate...

  14. Probabilistic Output Analysis by Program Manipulation

    DEFF Research Database (Denmark)

    Rosendahl, Mads; Kirkeby, Maja Hanne

    2015-01-01

    The aim of a probabilistic output analysis is to derive a probability distribution of possible output values for a program from a probability distribution of its input. We present a method for performing static output analysis, based on program transformation techniques. It generates a probability...

  15. Improved transformer protection using probabilistic neural network ...

    African Journals Online (AJOL)

    This article presents a novel technique to distinguish between magnetizing inrush current and internal fault current of power transformer. An algorithm has been developed around the theme of the conventional differential protection method in which parallel combination of Probabilistic Neural Network (PNN) and Power ...

  16. Financial markets analysis by probabilistic fuzzy modelling

    NARCIS (Netherlands)

    Berg, van den J.; Kaymak, U.; Bergh, van den W.M.

    2003-01-01

    For successful trading in financial markets, it is important to develop financial models where one can identify different states of the market for modifying one's actions. In this paper, we propose to use probabilistic fuzzy systems for this purpose. We concentrate on Takagi-Sugeno (TS)

  17. Probabilistic solution of the Dirac equation

    International Nuclear Information System (INIS)

    Blanchard, P.; Combe, P.

    1985-01-01

    Various probabilistic representations of the 2-, 3- and 4-dimensional Dirac equation are given in terms of expectations with respect to stochastic jump processes and are used to derive the nonrelativistic limit, even in the presence of an external electromagnetic field. (orig.)

  18. Mastering probabilistic graphical models using Python

    CERN Document Server

    Ankan, Ankur

    2015-01-01

    If you are a researcher or a machine learning enthusiast, or are working in the data science field and have a basic idea of Bayesian learning or probabilistic graphical models, this book will help you to understand the details of graphical models and use them in your data science problems.

  19. Toward uniform probabilistic seismic hazard assessments for Southeast Asia

    Science.gov (United States)

    Chan, C. H.; Wang, Y.; Shi, X.; Ornthammarath, T.; Warnitchai, P.; Kosuwan, S.; Thant, M.; Nguyen, P. H.; Nguyen, L. M.; Solidum, R., Jr.; Irsyam, M.; Hidayati, S.; Sieh, K.

    2017-12-01

    Although most Southeast Asian countries have seismic hazard maps, various methodologies and quality result in appreciable mismatches at national boundaries. We aim to conduct a uniform assessment across the region through standardized earthquake and fault databases, ground-shaking scenarios, and regional hazard maps. Our earthquake database contains earthquake parameters obtained from global and national seismic networks, harmonized by removal of duplicate events and the use of moment magnitude. Our active-fault database includes fault parameters from previous studies and from the databases implemented for national seismic hazard maps. Another crucial input for seismic hazard assessment is proper evaluation of ground-shaking attenuation. Since few ground-motion prediction equations (GMPEs) have used local observations from this region, we evaluated attenuation by comparison of instrumental observations and felt intensities for recent earthquakes with predicted ground shaking from published GMPEs. We then incorporate the best-fitting GMPEs and site conditions into our seismic hazard assessments. Based on the database and proper GMPEs, we have constructed regional probabilistic seismic hazard maps. The assessment shows the highest seismic hazard levels near those faults with high slip rates, including the Sagaing Fault in central Myanmar, the Sumatran Fault in Sumatra, the Palu-Koro, Matano and Lawanopo Faults in Sulawesi, and the Philippine Fault across several islands of the Philippines. In addition, our assessment demonstrates the important fact that regions with low earthquake probability may well have a higher aggregate probability of future earthquakes, since they encompass much larger areas than the areas of high probability. The significant irony then is that in areas of low to moderate probability, where building codes usually provide less seismic resilience, seismic risk is likely to be greater. Infrastructural damage in East Malaysia during the 2015

  20. Probabilistic Seismic Hazard Assessment for Northeast India Region

    Science.gov (United States)

    Das, Ranjit; Sharma, M. L.; Wason, H. R.

    2016-08-01

    Northeast India, bounded by latitudes 20°-30°N and longitudes 87°-98°E, is one of the most seismically active areas in the world. This region has experienced several moderate-to-large-sized earthquakes, including the 12 June, 1897 Shillong earthquake (Mw 8.1) and the 15 August, 1950 Assam earthquake (Mw 8.7), which caused loss of human lives and significant damage to buildings, highlighting the importance of seismic hazard assessment for the region. Probabilistic seismic hazard assessment of the region has been carried out using a unified moment magnitude catalog prepared by an improved General Orthogonal Regression methodology (Geophys J Int, 190:1091-1096, 2012; Probabilistic seismic hazard assessment of Northeast India region, Ph.D. Thesis, Department of Earthquake Engineering, IIT Roorkee, Roorkee, 2013) with events compiled from various databases (ISC, NEIC, GCMT, IMD) and other available catalogs. The study area has been subdivided into nine seismogenic source zones to account for local variation in tectonics and seismicity characteristics. The seismicity parameters are estimated for each of these source zones, which are input variables into seismic hazard estimation of a region. The seismic hazard analysis of the study region has been performed by dividing the area into grids of size 0.1° × 0.1°. Peak ground acceleration (PGA) and spectral acceleration (Sa) values (for periods of 0.2 and 1 s) have been evaluated at bedrock level corresponding to probabilities of exceedance (PE) of 50, 20, 10, 2 and 0.5% in 50 years. These exceedance values correspond to return periods of 100, 225, 475, 2475, and 10,000 years, respectively. The seismic hazard maps have been prepared at the bedrock level, and it is observed that the seismic hazard estimates show a significant local variation in contrast to the uniform hazard value suggested by the Indian standard seismic code [Indian standard, criteria for earthquake-resistant design of structures, fifth edition, Part
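
    The quoted return periods follow from the Poisson relation T = -t / ln(1 - P) for probability of exceedance P over an exposure window of t years: the 225-, 475-, 2475- and 10,000-year values match this formula, while the 100-year figure quoted for 50% PE corresponds to the simpler small-probability approximation T ≈ t/P (the Poisson value would be about 72 years). A short script to verify the conversion:

      import math

      t = 50.0  # exposure window [years]
      for pe in (0.50, 0.20, 0.10, 0.02, 0.005):
          poisson = -t / math.log(1.0 - pe)   # exact under the Poisson model
          approx = t / pe                     # small-probability approximation
          print(f"PE {pe:6.1%} in 50 yr: Poisson {poisson:7.0f} yr, t/P {approx:7.0f} yr")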

  1. A multi-subject evaluation of uncertainty in anatomical landmark location on shoulder kinematic description.

    Science.gov (United States)

    Langenderfer, Joseph E; Rullkoetter, Paul J; Mell, Amy G; Laz, Peter J

    2009-04-01

    An accurate assessment of shoulder kinematics is useful for understanding healthy normal and pathological mechanics. Small variability in identifying and locating anatomical landmarks (ALs) has the potential to affect reported shoulder kinematics. The objectives of this study were to quantify the effect of landmark location variability on scapular and humeral kinematic descriptions for multiple subjects using probabilistic analysis methods, and to evaluate the consistency of results across multiple subjects. Data from 11 healthy subjects performing humeral elevation in the scapular plane were used to calculate Euler angles describing humeral and scapular kinematics. Probabilistic analyses were performed for each subject to simulate uncertainty in the locations of 13 upper-extremity ALs. For standard deviations of 4 mm in landmark location, the analysis predicted Euler angle envelopes between the 1 and 99 percentile bounds of up to 16.6 degrees. While absolute kinematics varied with the subject, the average 1-99% kinematic ranges for the motion were consistent across subjects and sensitivity factors showed no statistically significant differences between subjects. The description of humeral kinematics was most sensitive to the location of landmarks on the thorax, while landmarks on the scapula had the greatest effect on the description of scapular elevation. The findings of this study can provide a better understanding of kinematic variability, which can aid in making accurate clinical diagnoses and refining kinematic measurement techniques.
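
    The probabilistic analysis described above can be sketched as a simple Monte Carlo study: perturb each landmark with an isotropic Gaussian of 4 mm standard deviation, rebuild the anatomical coordinate frame, and summarize the induced orientation error. The sketch below uses hypothetical landmark coordinates and reports a single total-rotation angle rather than the study's per-axis Euler angles:

      import numpy as np

      rng = np.random.default_rng(2)
      sigma = 4.0        # landmark location SD [mm], as in the study
      n = 20_000

      # Hypothetical nominal scapular landmarks [mm]: acromial angle (AA),
      # trigonum spinae (TS), inferior angle (AI). Not the study's data.
      AA = np.array([0.0, 0.0, 0.0])
      TS = np.array([-100.0, 10.0, 0.0])
      AI = np.array([-90.0, -120.0, 10.0])

      def frame(aa, ts, ai):
          """Orthonormal scapular frame from three landmarks (simplified ISB-style)."""
          z = aa - ts
          z /= np.linalg.norm(z)
          x = np.cross(z, ai - ts)       # normal to the scapular plane
          x /= np.linalg.norm(x)
          y = np.cross(z, x)
          return np.column_stack([x, y, z])

      R0 = frame(AA, TS, AI)
      angles = np.empty(n)
      for i in range(n):
          aa, ts, ai = (lm + rng.normal(0.0, sigma, 3) for lm in (AA, TS, AI))
          R_rel = R0.T @ frame(aa, ts, ai)              # rotation due to landmark error
          c = np.clip((np.trace(R_rel) - 1.0) / 2.0, -1.0, 1.0)
          angles[i] = np.degrees(np.arccos(c))          # total angular deviation

      lo, hi = np.percentile(angles, [1, 99])
      print(f"1-99% envelope of frame orientation error: {lo:.1f} to {hi:.1f} deg")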

  2. Probabilistic seismic hazard assessment of southern part of Ghana

    Science.gov (United States)

    Ahulu, Sylvanus T.; Danuor, Sylvester Kojo; Asiedu, Daniel K.

    2018-05-01

    This paper presents a seismic hazard map for the southern part of Ghana prepared using the probabilistic approach, and seismic hazard assessment results for six cities. The seismic hazard map was prepared for 10% probability of exceedance for peak ground acceleration in 50 years. The input parameters used for the computations of hazard were obtained using data from a catalogue that was compiled and homogenised to moment magnitude (Mw). The catalogue covered a period of over a century (1615-2009). The hazard assessment is based on the Poisson model for earthquake occurrence, and hence, dependent events were identified and removed from the catalogue. The following attenuation relations were adopted and used in this study—Allen (for south and eastern Australia), Silva et al. (for Central and eastern North America), Campbell and Bozorgnia (for worldwide active-shallow-crust regions) and Chiou and Youngs (for worldwide active-shallow-crust regions). Logic-tree formalism was used to account for possible uncertainties associated with the attenuation relationships. OpenQuake software package was used for the hazard calculation. The highest level of seismic hazard is found in the Accra and Tema seismic zones, with estimated peak ground acceleration close to 0.2 g. The level of the seismic hazard in the southern part of Ghana diminishes with distance away from the Accra/Tema region to a value of 0.05 g at a distance of about 140 km.

  3. Probabilistic seismic hazard assessment of southern part of Ghana

    Science.gov (United States)

    Ahulu, Sylvanus T.; Danuor, Sylvester Kojo; Asiedu, Daniel K.

    2017-12-01

    This paper presents a seismic hazard map for the southern part of Ghana prepared using the probabilistic approach, and seismic hazard assessment results for six cities. The seismic hazard map was prepared for 10% probability of exceedance for peak ground acceleration in 50 years. The input parameters used for the computations of hazard were obtained using data from a catalogue that was compiled and homogenised to moment magnitude (Mw). The catalogue covered a period of over a century (1615-2009). The hazard assessment is based on the Poisson model for earthquake occurrence, and hence, dependent events were identified and removed from the catalogue. The following attenuation relations were adopted and used in this study—Allen (for south and eastern Australia), Silva et al. (for Central and eastern North America), Campbell and Bozorgnia (for worldwide active-shallow-crust regions) and Chiou and Youngs (for worldwide active-shallow-crust regions). Logic-tree formalism was used to account for possible uncertainties associated with the attenuation relationships. OpenQuake software package was used for the hazard calculation. The highest level of seismic hazard is found in the Accra and Tema seismic zones, with estimated peak ground acceleration close to 0.2 g. The level of the seismic hazard in the southern part of Ghana diminishes with distance away from the Accra/Tema region to a value of 0.05 g at a distance of about 140 km.

  4. Multibeam 3D Underwater SLAM with Probabilistic Registration

    Directory of Open Access Journals (Sweden)

    Albert Palomer

    2016-04-01

    Full Text Available This paper describes a pose-based underwater 3D Simultaneous Localization and Mapping (SLAM) framework using a multibeam echosounder to produce highly consistent underwater maps. The proposed algorithm compounds swath profiles of the seafloor with dead reckoning localization to build surface patches (i.e., point clouds). An Iterative Closest Point (ICP) algorithm with a probabilistic implementation is then used to register the point clouds, taking into account their uncertainties. The registration process is divided into two steps: (1) point-to-point association for coarse registration and (2) point-to-plane association for fine registration. The point clouds of the surfaces to be registered are sub-sampled in order to decrease both the computation time and the potential of falling into local minima during the registration. In addition, a heuristic is used to decrease the complexity of the association step of the ICP from O(n²) to O(n). The performance of the SLAM framework is tested using two real-world datasets: first, a 2.5D bathymetric dataset obtained with the usual down-looking multibeam sonar configuration, and second, a full 3D underwater dataset acquired with a multibeam sonar mounted on a pan and tilt unit.
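
    For readers unfamiliar with ICP, the core loop alternates nearest-neighbour association with a closed-form rigid alignment. A minimal point-to-point sketch in numpy/scipy follows; the paper's probabilistic association, point-to-plane refinement and O(n) heuristic are not reproduced, and the toy point cloud is invented:

      import numpy as np
      from scipy.spatial import cKDTree

      def best_rigid(src, dst):
          """Least-squares rigid transform (R, t) mapping src onto dst (Kabsch/SVD)."""
          cs, cd = src.mean(axis=0), dst.mean(axis=0)
          H = (src - cs).T @ (dst - cd)
          U, _, Vt = np.linalg.svd(H)
          D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflections
          R = Vt.T @ D @ U.T
          return R, cd - R @ cs

      def icp(src, dst, iters=30):
          """Plain point-to-point ICP; the paper adds probabilistic association
          and a point-to-plane refinement stage on top of this basic idea."""
          tree = cKDTree(dst)                  # nearest-neighbour index on the target
          cur = src.copy()
          for _ in range(iters):
              _, idx = tree.query(cur)         # associate each point with nearest target
              R, t = best_rigid(cur, dst[idx])
              cur = cur @ R.T + t              # apply the incremental transform
          return cur

      # Toy usage: re-align a rotated, noisy copy of a random seafloor-like patch.
      rng = np.random.default_rng(3)
      dst = rng.normal(size=(500, 3))
      th = np.radians(10.0)
      Rz = np.array([[np.cos(th), -np.sin(th), 0.0],
                     [np.sin(th),  np.cos(th), 0.0],
                     [0.0, 0.0, 1.0]])
      src = dst @ Rz.T + 0.01 * rng.normal(size=dst.shape)
      aligned = icp(src, dst)
      print("mean residual after ICP:", np.linalg.norm(aligned - dst, axis=1).mean())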

  5. MAPS of Cancer

    Science.gov (United States)

    Gray, Lincoln

    1998-01-01

    Our goal was to produce an interactive visualization from a mathematical model that successfully predicts metastases from head and neck cancer. We met this goal early in the project. The visualization is available for the public to view. Our work appears to fill a need for more information about this deadly disease. The idea of this project was to make an easily interpretable visualization based on what we call "functional maps" of disease. A functional map is a graphic summary of medical data, where distances between parts of the body are determined by the probability of disease, not by anatomical distances. Functional maps often bear little resemblance to anatomical maps, but they can be used to predict the spread of disease. The idea of modeling the spread of disease in an abstract multidimensional space is difficult for many people. Our goal was to make the important predictions easy to see. NASA must face this problem frequently: how to help laypersons and professionals see important trends in abstract, complex data. We took advantage of concepts perfected in NASA's graphics libraries. As an analogy, consider a functional map of early America. Suppose we choose travel times, rather than miles, as our measures of inter-city distances. For Abraham Lincoln, travel times would have been the more meaningful measure of separation between cities. In such a map New Orleans would be close to Memphis because of the Mississippi River. St. Louis would be close to Portland because of the Oregon Trail. Oklahoma City would be far from Little Rock because of the Cheyenne. Such a map would look puzzling to those of us who have always seen physical maps, but the functional map would be more useful in predicting the probabilities of inter-site transit. Continuing the analogy, we could predict the spread of social diseases such as gambling along the rivers and cattle rustling along the trails. We could simply print the functional map of America, but it would be more interesting

  6. Anatomically corrected transposition of great vessels

    International Nuclear Information System (INIS)

    Ivanitskij, A.V.; Sarkisova, T.N.

    1989-01-01

    The paper is concerned with the description of a rare congenital heart disease: anatomically corrected malposition of the major vessels in a girl aged 9 months and 24 days. The diagnosis of this disease was based on the results of angiocardiography; the concomitant congenital heart diseases were described. This abnormality is characterized by concordant atrioventricular and ventriculoarterial connections with an inverted position of the major vessels, and it is always attended by congenital heart diseases. Surgical intervention is aimed at the elimination of the concomitant heart diseases

  7. Multiaxial probabilistic elastic-plastic constitutive simulations of soils

    Science.gov (United States)

    Sadrinezhad, Arezoo

    The Fokker-Planck-Kolmogorov (FPK) equation approach has recently been developed to simulate elastic-plastic constitutive behaviors of materials with uncertain material properties. The FPK equation approach transforms the stochastic constitutive rate equation, which is a stochastic, nonlinear, ordinary differential equation (ODE) in the stress-pseudo-time space, into a second-order accurate, deterministic, linear FPK partial differential equation (PDE) in the probability density of stress-pseudo-time space. This approach does not suffer from the drawbacks of the traditional approaches, such as the Monte Carlo approach and the perturbation approach, for solving nonlinear ODEs with random coefficients. In this study, the existing one-dimensional FPK framework for probabilistic constitutive modeling of soils is extended to multiple dimensions. However, the multivariate FPK PDEs cannot be solved using traditional mathematical techniques such as finite difference techniques due to their high computational cost. Therefore, computationally efficient algorithms based on the Fourier spectral approach are developed for solving a class of FPK PDEs that arises in probabilistic elasto-plasticity. This class includes linear FPK PDEs in (stress) space and (pseudo) time - having space-independent but time-dependent, and both space- and time-dependent coefficients - with impulse initial conditions and reflecting boundary conditions. The solution algorithms rely on first mapping the stress space of the governing PDE between 0 and 2π using the change of coordinates rule, followed by approximating the solution of the PDE in the 2π-periodic domain by a finite Fourier series in the stress space and unknown time-dependent solution coefficients. Finally, the time-dependent solution coefficients are obtained from the initial condition. The accuracy and efficiency of the developed algorithms are tested. The developed algorithms are used to simulate uniaxial and multiaxial, monotonic and cyclic
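
    The Fourier spectral idea can be illustrated on a stripped-down, drift-only (advection) version of the FPK PDE on the mapped 2π-periodic domain; reflecting boundaries and the paper's elastoplastic coefficients are not modeled, and the drift function below is invented:

      import numpy as np

      # Spectral (FFT) solution of dP/dt = -d(v(x)*P)/dx on a 2π-periodic domain.
      N = 256
      dx = 2 * np.pi / N
      x = np.arange(N) * dx
      ik = 1j * 2 * np.pi * np.fft.fftfreq(N, d=dx)   # spectral d/dx factors

      def ddx(f):
          return np.real(np.fft.ifft(ik * np.fft.fft(f)))

      v = 1.0 + 0.3 * np.sin(x)                       # hypothetical drift coefficient

      def rhs(P):
          return -ddx(v * P)

      P = np.exp(-40.0 * (x - np.pi) ** 2)            # smooth stand-in for the impulse IC
      P /= P.sum() * dx

      dt = 1e-3
      for _ in range(2000):                           # classic RK4 time stepping
          k1 = rhs(P)
          k2 = rhs(P + 0.5 * dt * k1)
          k3 = rhs(P + 0.5 * dt * k2)
          k4 = rhs(P + dt * k3)
          P += dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

      print("probability mass conserved:", P.sum() * dx)   # should stay close to 1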

  8. Exploring brain function from anatomical connectivity

    Directory of Open Access Journals (Sweden)

    Gorka Zamora-López

    2011-06-01

    Full Text Available The intrinsic relationship between the architecture of the brain and the range of sensory and behavioral phenomena it produces is a relevant question in neuroscience. Here, we review recent knowledge gained on the architecture of the anatomical connectivity by means of complex network analysis. It has been found that corticocortical networks display a few prominent characteristics: (i) modular organization, (ii) abundant alternative processing paths and (iii) the presence of highly connected hubs. Additionally, we present a novel classification of cortical areas of the cat according to the role they play in multisensory connectivity. All these properties represent an ideal anatomical substrate supporting rich dynamical behaviors, as well as facilitating the capacity of the brain to process sensory information of different modalities in a segregated manner and to integrate it towards a comprehensive perception of the real world. The results presented here are mainly based on anatomical data of the cat brain, but we show how further observations suggest that, from worms to humans, the nervous system of all animals might share fundamental principles of organization.

  9. Anatomic variation of cranial parasympathetic ganglia

    Directory of Open Access Journals (Sweden)

    Selma Siéssere

    2008-06-01

    Full Text Available Having broad knowledge of anatomy is essential for practicing dentistry. Certain anatomical structures call for detailed studies due to their anatomical and functional importance. Nevertheless, some structures are difficult to visualize and identify due to their small volume and complicated access. Such is the case of the parasympathetic ganglia located in the cranial part of the autonomic nervous system, which include: the ciliary ganglion (located deeply in the orbit, laterally to the optic nerve), the pterygopalatine ganglion (located in the pterygopalatine fossa), the submandibular ganglion (located laterally to the hyoglossus muscle, below the lingual nerve), and the otic ganglion (located medially to the mandibular nerve, right beneath the oval foramen). The aim of this study was to present these structures in dissected anatomic specimens and perform a comparative analysis regarding location and morphology. The proximity of the ganglia and associated nerves were also analyzed, as well as the number and volume of fibers connected to them. Human heads were dissected by planes, partially removing the adjacent structures to the point we could reach the parasympathetic ganglia. With this study, we concluded that there was no significant variation regarding the location of the studied ganglia. Morphologically, our observations concur with previous classical descriptions of the parasympathetic ganglia, but we observed variations regarding the proximity of the otic ganglion to the mandibular nerve. We also observed that there were variations regarding the number and volume of fiber bundles connected to the submandibular, otic, and pterygopalatine ganglia.

  10. Laryngeal spaces and lymphatics: current anatomic concepts

    International Nuclear Information System (INIS)

    Welsh, L.W.; Welsh, J.J.; Rizzo, T.A. Jr.

    1983-01-01

    This investigation evaluates the anatomic concepts of individual spaces or compartments within the larynx by isotope and dye diffusion. The authors identified continuity of spaces particularly within the submucosal planes and a relative isolation within the fixed structures resulting from the longitudinal pattern of fibroelastic tissues, muscle bands, and perichondrium. The historical data of anatomic resistance are refuted by the radioisotope patterns of dispersion and the histologic evidence of tissue permeability to the carbon particles. There is little clinical application of the compartment concept to the perimeter of growth and the configuration of extensive endolaryngeal cancers. The internal and extralaryngeal lymphatic network is presented and the regional associations are identified. The normal ipsilateral relationship is distorted by dispersion within the endolarynx supervening the anatomic midline. The effects of lymphatic obstruction caused by regional lymphadenectomy, tumor fixation, and irradiation-infection sequelae are illustrated; these result in widespread bilateral lymphatic nodal terminals. Finally, the evidence suggests that the internal network is modified by external interruption to accommodate an outflow system in continuity with the residual patent lymphatic channels

  11. Combination of Evidence with Different Weighting Factors: A Novel Probabilistic-Based Dissimilarity Measure Approach

    Directory of Open Access Journals (Sweden)

    Mengmeng Ma

    2015-01-01

    Full Text Available To solve the invalidation problem of the Dempster-Shafer theory of evidence (DS) under high conflict in multisensor data fusion, this paper presents a novel combination approach for conflicting evidence with different weighting factors using a new probabilistic dissimilarity measure. Firstly, an improved probabilistic transformation function is proposed to map basic belief assignments (BBAs) to probabilities. Then, a new dissimilarity measure integrating fuzzy nearness and an introduced correlation coefficient is proposed to characterize not only the difference between BBAs but also the divergence degree of the hypotheses that two BBAs support. Finally, the weighting factors used to reassign conflicts on BBAs are developed and Dempster's rule is chosen to combine the discounted sources. Simple numerical examples are employed to demonstrate the merit of the proposed method. Through analysis and comparison of the results, the new combination approach can effectively solve the problem of conflict management with better convergence performance and robustness.
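
    The invalidation problem the paper addresses is easy to reproduce with plain Dempster's rule, which the sketch below implements (the two-sensor masses are Zadeh's classic high-conflict example, not data from the paper): almost all mass is conflict, and renormalization forces certainty onto a hypothesis that both sources barely support.

      from itertools import product

      def dempster(m1, m2):
          """Combine two basic belief assignments (dicts: frozenset -> mass)
          with Dempster's rule; the conflict mass K gets renormalized away."""
          raw, conflict = {}, 0.0
          for (A, a), (B, b) in product(m1.items(), m2.items()):
              C = A & B
              if C:
                  raw[C] = raw.get(C, 0.0) + a * b
              else:
                  conflict += a * b
          if conflict >= 1.0:
              raise ValueError("total conflict: Dempster's rule undefined")
          return {A: v / (1.0 - conflict) for A, v in raw.items()}

      # Classic two-sensor example over hypotheses {a, b, c}: conflict K = 0.99,
      # yet all combined mass collapses onto {b} despite weak support from both.
      m1 = {frozenset('a'): 0.9, frozenset('b'): 0.1}
      m2 = {frozenset('b'): 0.1, frozenset('c'): 0.9}
      print(dempster(m1, m2))   # {frozenset({'b'}): 1.0}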

  12. Introduction: Hazard mapping

    Science.gov (United States)

    Baum, Rex L.; Miyagi, Toyohiko; Lee, Saro; Trofymchuk, Oleksandr M

    2014-01-01

    Twenty papers were accepted into the session on landslide hazard mapping for oral presentation. The papers presented susceptibility and hazard analysis based on approaches ranging from field-based assessments to statistically based models to assessments that combined hydromechanical and probabilistic components. Many of the studies have taken advantage of increasing availability of remotely sensed data and nearly all relied on Geographic Information Systems to organize and analyze spatial data. The studies used a range of methods for assessing performance and validating hazard and susceptibility models. A few of the studies presented in this session also included some element of landslide risk assessment. This collection of papers clearly demonstrates that a wide range of approaches can lead to useful assessments of landslide susceptibility and hazard.

  13. Value of a probabilistic atlas in medical image segmentation regarding non-rigid registration of abdominal CT scans

    Science.gov (United States)

    Park, Hyunjin; Meyer, Charles R.

    2012-10-01

    A probabilistic atlas provides important information to help segmentation and registration applications in medical image analysis. We construct a probabilistic atlas by picking a target geometry and mapping other training scans onto that target and then summing the results into one probabilistic atlas. By choosing an atlas space close to the desired target, we construct an atlas that represents the population well. Image registration used to map one image geometry onto another is a primary task in atlas building. One of the main parameters of registration is the choice of degrees of freedom (DOFs) of the geometric transform. Herein, we measure the effect of the registration's DOFs on the segmentation performance of the resulting probabilistic atlas. Twenty-three normal abdominal CT scans were used, and four organs (liver, spinal cord, left and right kidneys) were segmented for each scan. A well-known manifold learning method, ISOMAP, was used to find the best target space to build an atlas. In summary, segmentation performance was high for high DOF registrations regardless of the chosen target space, while segmentation performance was lowered for low DOF registrations if a target space was far from the best target space. At the 0.05 level of statistical significance, there were no significant differences at high DOF registrations while there were significant differences at low DOF registrations when choosing different targets.
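
    The construction described above (map every training segmentation onto the chosen target, then sum) amounts to a voxelwise average of co-registered binary masks. A minimal sketch, assuming the masks have already been warped onto the target geometry (the toy data below are random arrays, not organ masks):

      import numpy as np

      def build_probabilistic_atlas(label_volumes):
          """Average co-registered binary organ masks into a probability map.

          label_volumes: iterable of 3-D {0,1} arrays, one per training scan,
          all already warped onto the chosen target geometry.
          """
          stack = np.stack([v.astype(float) for v in label_volumes])
          return stack.mean(axis=0)   # voxelwise fraction of subjects with the label

      # Toy usage with 23 fake masks on a 32^3 grid.
      rng = np.random.default_rng(4)
      masks = [(rng.random((32, 32, 32)) > 0.5).astype(np.uint8) for _ in range(23)]
      atlas = build_probabilistic_atlas(masks)
      assert 0.0 <= atlas.min() and atlas.max() <= 1.0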

  14. A methodology for reviewing probabilistic risk assessments

    International Nuclear Information System (INIS)

    Derby, S.L.

    1983-01-01

    The starting point for peer review of a Probabilistic Risk Assessment (PRA) is a clear understanding of how the risk estimate was prepared and of what contributions dominate the calculation. The problem facing the reviewers is how to cut through the complex details of a PRA to gain this understanding. This paper presents a structured, analytical procedure that solves this problem. The effectiveness of this solution is demonstrated by an application on the Zion Probabilistic Safety Study. The procedure found the three dominant initiating events and provided a simplified reconstruction of the calculation of the risk estimate. Significant assessments of uncertainty were also identified. If peer review disputes the accuracy of these judgments, then the revised risk estimate could significantly increase

  15. A probabilistic model of RNA conformational space

    DEFF Research Database (Denmark)

    Frellsen, Jes; Moltke, Ida; Thiim, Martin

    2009-01-01

    The increasing importance of non-coding RNA in biology and medicine has led to a growing interest in the problem of RNA 3-D structure prediction. As is the case for proteins, RNA 3-D structure prediction methods require two key ingredients: an accurate energy function and a conformational sampling... the discrete nature of the fragments necessitates the use of carefully tuned, unphysical energy functions, and their non-probabilistic nature impairs unbiased sampling. We offer a solution to the sampling problem that removes these important limitations: a probabilistic model of RNA structure that allows... conformations for 9 out of 10 test structures, solely using coarse-grained base-pairing information. In conclusion, the method provides a theoretical and practical solution for a major bottleneck on the way to routine prediction and simulation of RNA structure and dynamics in atomic detail.

  16. Generalized probabilistic scale space for image restoration.

    Science.gov (United States)

    Wong, Alexander; Mishra, Akshaya K

    2010-10-01

    A novel generalized sampling-based probabilistic scale space theory is proposed for image restoration. We explore extending the definition of scale space to better account for both noise and observation models, which is important for producing accurately restored images. A new class of scale-space realizations based on sampling and probability theory is introduced to realize this extended definition in the context of image restoration. Experimental results using 2-D images show that generalized sampling-based probabilistic scale-space theory can be used to produce more accurate restored images when compared with state-of-the-art scale-space formulations, particularly under situations characterized by low signal-to-noise ratios and image degradation.

  17. Probabilistic cloning and deleting of quantum states

    International Nuclear Information System (INIS)

    Feng Yuan; Zhang Shengyu; Ying Mingsheng

    2002-01-01

    We construct a probabilistic cloning and deleting machine which, taking several copies of an input quantum state, can output a linear superposition of multiple cloning and deleting states. Since the machine can perform cloning and deleting in a single unitary evolution, the probabilistic cloning and other cloning machines proposed in the previous literature can be thought of as special cases of our machine. A sufficient and necessary condition for successful cloning and deleting is presented, and it requires that the copies of an arbitrarily presumed number of the input states are linearly independent. This simply generalizes some results for cloning. We also derive an upper bound for the success probability of the cloning and deleting machine

  18. Probabilistic Modeling of Wind Turbine Drivetrain Components

    DEFF Research Database (Denmark)

    Rafsanjani, Hesam Mirzaei

    Wind energy is one of several energy sources in the world and a rapidly growing industry in the energy sector. When placed in offshore or onshore locations, wind turbines are exposed to wave excitations, highly dynamic wind loads and/or the wakes from other wind turbines. Therefore, most components...... in a wind turbine experience highly dynamic and time-varying loads. These components may fail due to wear or fatigue, and this can lead to unplanned shutdown repairs that are very costly. The design by deterministic methods using safety factors is generally unable to account for the many uncertainties. Thus......, a reliability assessment should be based on probabilistic methods where stochastic modeling of failures is performed. This thesis focuses on probabilistic models and the stochastic modeling of the fatigue life of the wind turbine drivetrain. Hence, two approaches are considered for stochastic modeling...

  19. A probabilistic model of RNA conformational space

    DEFF Research Database (Denmark)

    Frellsen, Jes; Moltke, Ida; Thiim, Martin

    2009-01-01

    The increasing importance of non-coding RNA in biology and medicine has led to a growing interest in the problem of RNA 3-D structure prediction. As is the case for proteins, RNA 3-D structure prediction methods require two key ingredients: an accurate energy function and a conformational sampling... the discrete nature of the fragments necessitates the use of carefully tuned, unphysical energy functions, and their non-probabilistic nature impairs unbiased sampling. We offer a solution to the sampling problem that removes these important limitations: a probabilistic model of RNA structure that allows... efficient sampling of RNA conformations in continuous space, and with associated probabilities. We show that the model captures several key features of RNA structure, such as its rotameric nature and the distribution of the helix lengths. Furthermore, the model readily generates native-like 3-D...

  20. Probabilistic approach to manipulator kinematics and dynamics

    International Nuclear Information System (INIS)

    Rao, S.S.; Bhatti, P.K.

    2001-01-01

    A high performance, high speed robotic arm must be able to manipulate objects with a high degree of accuracy and repeatability. As with any other physical system, there are a number of factors causing uncertainties in the behavior of a robotic manipulator. These factors include manufacturing and assembling tolerances, and errors in the joint actuators and controllers. In order to study the effect of these uncertainties on the robotic end-effector and to obtain a better insight into the manipulator behavior, the manipulator kinematics and dynamics are modeled using a probabilistic approach. Based on the probabilistic model, kinematic and dynamic performance criteria are defined to provide measures of the behavior of the robotic end-effector. Techniques are presented to compute the kinematic and dynamic reliabilities of the manipulator. The effects of tolerances associated with the various manipulator parameters on the reliabilities are studied. Numerical examples are presented to illustrate the procedures
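
    The probabilistic model can be illustrated on the simplest possible arm: a planar two-link manipulator whose link lengths and joint angles carry tolerances. The sketch below Monte Carlo samples those tolerances, evaluates the forward kinematics, and estimates a kinematic reliability as the probability that the end-effector error stays within a task tolerance; all dimensions and distributions are invented:

      import numpy as np

      rng = np.random.default_rng(5)
      n = 100_000

      # Nominal 2-link planar arm (hypothetical dimensions and tolerances).
      L1, L2 = 0.50, 0.40                           # link lengths [m]
      q1, q2 = np.radians(30.0), np.radians(45.0)   # commanded joint angles

      # Uncertainties: machining tolerance on lengths, actuator error on angles.
      l1 = L1 + rng.normal(0, 0.5e-3, n)            # ~0.5 mm length scatter
      l2 = L2 + rng.normal(0, 0.5e-3, n)
      t1 = q1 + rng.normal(0, np.radians(0.1), n)   # ~0.1 deg joint error
      t2 = q2 + rng.normal(0, np.radians(0.1), n)

      # Forward kinematics of the end-effector, perturbed vs nominal.
      x = l1 * np.cos(t1) + l2 * np.cos(t1 + t2)
      y = l1 * np.sin(t1) + l2 * np.sin(t1 + t2)
      x0 = L1 * np.cos(q1) + L2 * np.cos(q1 + q2)
      y0 = L1 * np.sin(q1) + L2 * np.sin(q1 + q2)

      err = np.hypot(x - x0, y - y0)                # positioning error [m]
      tol = 2e-3                                    # 2 mm task tolerance (illustrative)
      print("kinematic reliability P(err < tol):", (err < tol).mean())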

  1. Probabilistic precursor analysis - an application of PSA

    International Nuclear Information System (INIS)

    Hari Prasad, M.; Gopika, V.; Sanyasi Rao, V.V.S.; Vaze, K.K.

    2011-01-01

    Incidents are inevitably part of the operational life of any complex industrial facility, and it is hard to predict how various contributing factors combine to cause the outcome. However, it should be possible to detect the existence of latent conditions that, together with the triggering failure(s), result in abnormal events. These incidents are called precursors. Precursor study, by definition, focuses on how a particular event might have adversely developed. This paper focuses on the events which can be analyzed to assess their potential to develop into a core damage situation, and looks into extending Probabilistic Safety Assessment techniques to precursor studies, explaining the benefits through a typical case study. A preliminary probabilistic precursor analysis has been carried out for a typical NPP. The major advantage of this approach is the strong potential for augmenting event analysis, which is currently carried out on a purely deterministic basis. (author)

  2. Probabilistic Analysis of Gas Turbine Field Performance

    Science.gov (United States)

    Gorla, Rama S. R.; Pai, Shantaram S.; Rusick, Jeffrey J.

    2002-01-01

    A gas turbine thermodynamic cycle was computationally simulated and probabilistically evaluated in view of the several uncertainties in the performance parameters, which are indices of gas turbine health. Cumulative distribution functions and sensitivity factors were computed for the overall thermal efficiency and net specific power output due to the thermodynamic random variables. These results can be used to quickly identify the most critical design variables in order to optimize the design, enhance performance, increase system availability and make it cost effective. The analysis leads to the selection of the appropriate measurements to be used in the gas turbine health determination and to the identification of both the most critical measurements and parameters. Probabilistic analysis aims at unifying and improving the control and health monitoring of gas turbine aero-engines by increasing the quality and quantity of information available about the engine's health and performance.
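
    As an illustration of this kind of probabilistic cycle evaluation, the sketch below Monte Carlo samples a simple Brayton-cycle model with an uncertain pressure ratio, turbine inlet temperature and component efficiencies, then reports distribution summaries and crude correlation-based sensitivities. The cycle model and every parameter value are illustrative assumptions, not the analysis from the record.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 50_000
cp, gamma, T1 = 1005.0, 1.4, 288.0  # J/(kg K), -, K

# Illustrative random cycle parameters (indices of engine health).
pr   = rng.normal(12.0, 0.3, N)     # compressor pressure ratio
T3   = rng.normal(1400.0, 15.0, N)  # turbine inlet temperature, K
etaC = rng.normal(0.85, 0.01, N)    # compressor isentropic efficiency
etaT = rng.normal(0.88, 0.01, N)    # turbine isentropic efficiency

k = (gamma - 1.0) / gamma
T2 = T1 * (1.0 + (pr**k - 1.0) / etaC)     # compressor exit temperature
T4 = T3 * (1.0 - etaT * (1.0 - pr**(-k)))  # turbine exit temperature

w_net = cp * ((T3 - T4) - (T2 - T1))       # net specific work, J/kg
q_in  = cp * (T3 - T2)                     # heat added, J/kg
eta   = w_net / q_in                       # thermal efficiency

# Distribution summary and a crude sensitivity ranking via correlation.
print(f"eta: mean = {eta.mean():.3f}, "
      f"90% band = ({np.percentile(eta, 5):.3f}, {np.percentile(eta, 95):.3f})")
for name, v in [("pr", pr), ("T3", T3), ("etaC", etaC), ("etaT", etaT)]:
    print(f"corr(eta, {name}) = {np.corrcoef(eta, v)[0, 1]:+.2f}")
```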

  3. Probabilistic safety assessment for research reactors

    International Nuclear Information System (INIS)

    1986-12-01

    Increasing interest in using Probabilistic Safety Assessment (PSA) methods for research reactor safety is being observed in many countries throughout the world. This is mainly because of the ability of this approach to support safe and reliable operation of research reactors. There is also a need to assist developing countries to apply Probabilistic Safety Assessment to existing nuclear facilities, which are simpler and therefore less complicated to analyse than a large Nuclear Power Plant. It may be important, therefore, to develop PSA for research reactors. This might also help to better understand the safety characteristics of the reactor and to base any backfitting on a cost-benefit analysis which would ensure that only necessary changes are made. This document touches on all the key aspects of PSA but places greater emphasis on so-called systems analysis aspects rather than on in-plant or ex-plant consequences

  4. Probabilistic forecasting and Bayesian data assimilation

    CERN Document Server

    Reich, Sebastian

    2015-01-01

    In this book the authors describe the principles and methods behind probabilistic forecasting and Bayesian data assimilation. Instead of focusing on particular application areas, the authors adopt a general dynamical systems approach, with a profusion of low-dimensional, discrete-time numerical examples designed to build intuition about the subject. Part I explains the mathematical framework of ensemble-based probabilistic forecasting and uncertainty quantification. Part II is devoted to Bayesian filtering algorithms, from classical data assimilation algorithms such as the Kalman filter, variational techniques, and sequential Monte Carlo methods, through to more recent developments such as the ensemble Kalman filter and ensemble transform filters. The McKean approach to sequential filtering in combination with coupling of measures serves as a unifying mathematical framework throughout Part II. Assuming only some basic familiarity with probability, this book is an ideal introduction for graduate students in ap...
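
    One building block the book covers, the stochastic (perturbed-observation) ensemble Kalman filter analysis step, can be sketched in a few lines. The toy two-state example and all numbers below are illustrative, not taken from the book.

```python
import numpy as np

rng = np.random.default_rng(2)

def enkf_update(X, y, H, R):
    """One stochastic (perturbed-observation) EnKF analysis step.
    X: (n, m) forecast ensemble (n = state dim, m = members)
    y: (p,) observation;  H: (p, n) observation operator;  R: (p, p) obs covariance.
    """
    n, m = X.shape
    Xm = X.mean(axis=1, keepdims=True)
    A = X - Xm                                       # ensemble anomalies
    Pf = A @ A.T / (m - 1)                           # sample forecast covariance
    K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)   # Kalman gain
    # Perturb the observation for each member, then update.
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, m).T
    return X + K @ (Y - H @ X)                       # analysis ensemble

# Toy example: 2-state system, observe the first component only.
X = rng.normal([[1.0], [0.5]], 0.4, size=(2, 100))
Xa = enkf_update(X, y=np.array([1.3]), H=np.array([[1.0, 0.0]]), R=np.array([[0.05]]))
print("forecast mean:", X.mean(axis=1), " analysis mean:", Xa.mean(axis=1))
```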

  5. Reassessment of probabilistic seismic hazard in the Marmara region

    Science.gov (United States)

    Kalkan, Erol; Gulkan, Polat; Yilmaz, Nazan; Çelebi, Mehmet

    2009-01-01

    In 1999, the eastern coastline of the Marmara region (Turkey) witnessed increased seismic activity on the North Anatolian fault (NAF) system with two damaging earthquakes (M 7.4 Kocaeli and M 7.2 Düzce) that occurred almost three months apart. These events have reduced stress on the western segment of the NAF where it continues under the Marmara Sea. The undersea fault segments have recently been explored using bathymetric and reflection surveys. These recent findings helped scientists to understand the seismotectonic environment of the Marmara basin, which has remained a perplexing tectonic domain. On the basis of the newly collected data, the seismic hazard of the Marmara region is reassessed using a probabilistic approach. Two earthquake source models, (1) the smoothed-gridded seismicity model and (2) the fault model, and alternate magnitude-frequency relations (Gutenberg-Richter and characteristic) were used with local and imported ground-motion-prediction equations. Regional exposure is computed and quantified on a set of hazard maps that provide peak horizontal ground acceleration (PGA) and spectral acceleration at 0.2 and 1.0 sec on a uniform firm-rock site condition (760 m/sec average shear wave velocity in the upper 30 m). These acceleration levels were computed for ground motions having 2% and 10% probabilities of exceedance in 50 yr, corresponding to return periods of about 2475 and 475 yr, respectively. The maximum PGA computed (at a rock site) is 1.5g along the fault segments of the NAF zone extending into the Marmara Sea. The new maps generally show a 10% to 15% increase in PGA and 0.2 and 1.0 sec spectral acceleration values across much of Marmara compared to previous regional hazard maps. Hazard curves and smooth design spectra for three site conditions (rock, soil, and soft-soil) are provided for the Istanbul metropolitan area as possible tools in future risk estimates.
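
    The quoted exceedance probabilities and return periods are related through the standard Poisson occurrence model, T = -t / ln(1 - p); a two-line check in generic Python (no project code implied):

```python
import math

def return_period(p_exceed, t_years):
    # Poisson (memoryless) occurrence model:
    # P(at least one exceedance in t years) = 1 - exp(-t / T)
    return -t_years / math.log(1.0 - p_exceed)

print(return_period(0.10, 50))  # ~474.6 yr  (the "475-yr" map)
print(return_period(0.02, 50))  # ~2474.9 yr (the "2475-yr" map)
```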

  6. Fast probabilistic file fingerprinting for big data.

    Science.gov (United States)

    Tretyakov, Konstantin; Laur, Sven; Smant, Geert; Vilo, Jaak; Prins, Pjotr

    2013-01-01

    Biological data acquisition is raising new challenges, both in data analysis and handling. Not only is it proving hard to analyze the data at the rate it is generated today, but simply reading and transferring data files can be prohibitively slow due to their size. This primarily concerns logistics within and between data centers, but is also important for workstation users in the analysis phase. Common usage patterns, such as comparing and transferring files, are proving computationally expensive and are tying down shared resources. We present an efficient method for calculating file uniqueness for large scientific data files that takes less computational effort than existing techniques. This method, called Probabilistic Fast File Fingerprinting (PFFF), exploits the variation present in biological data and computes file fingerprints by sampling randomly from the file instead of reading it in full. Consequently, it has a flat performance characteristic, correlated with data variation rather than file size. We demonstrate that probabilistic fingerprinting can be as reliable as existing hashing techniques, with provably negligible risk of collisions. We measure the performance of the algorithm on a number of data storage and access technologies, identifying its strengths as well as limitations. Probabilistic fingerprinting may significantly reduce the use of computational resources when comparing very large files. Utilisation of probabilistic fingerprinting techniques can increase the speed of common file-related workflows, both in the data center and for workbench analysis. The implementation of the algorithm is available as an open-source tool named pfff, as a command-line tool as well as a C library. The tool can be downloaded from http://biit.cs.ut.ee/pfff.
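
    The core idea, hashing a deterministic pseudo-random sample of bytes instead of the whole file, can be sketched as below. This is an illustrative re-implementation of the principle, not the pfff tool itself; the sample count, the seeding scheme and the use of SHA-256 are assumptions of the sketch.

```python
import hashlib
import os
import random

def probabilistic_fingerprint(path, n_samples=1024, key=0):
    """Hash n_samples bytes drawn at pseudo-random offsets instead of
    reading the whole file, so the cost is flat in file size.
    Offsets are a deterministic function of (file size, key), so
    fingerprints of identical files match; differing files collide
    only if they agree on every sampled offset."""
    size = os.path.getsize(path)
    rng = random.Random(size * 1_000_003 + key)   # deterministic offsets
    h = hashlib.sha256(size.to_bytes(8, "big"))   # mix in the size itself
    with open(path, "rb") as f:
        for _ in range(min(n_samples, size)):
            f.seek(rng.randrange(size))
            h.update(f.read(1))
    return h.hexdigest()
```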

  7. Probabilistic, meso-scale flood loss modelling

    Science.gov (United States)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2016-04-01

    Flood risk analyses are an important basis for decisions on flood risk management and adaptation. However, such analyses are associated with significant uncertainty, even more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention during the last years, they are still not standard practice for flood risk assessments, and even less so for flood loss modelling. The state of the art in flood loss modelling is still the use of simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood loss models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we demonstrate and evaluate the upscaling of the approach to the meso-scale, namely on the basis of land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany (Botto et al. submitted). The application of bagging decision tree based loss models provides a probability distribution of estimated loss per municipality. Validation is undertaken, on the one hand, via a comparison with eight deterministic loss models including stage-damage functions as well as multi-variate models; on the other hand, the results are compared with official loss data provided by the Saxon Relief Bank (SAB). The results show that uncertainties of loss estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation approach is that it inherently provides quantitative information about the uncertainty of the prediction. References: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64. Botto A, Kreibich H, Merz B, Schröter K (submitted) Probabilistic, multi-variable flood loss modelling on the meso-scale with BT-FLEMO. Risk Analysis.
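
    A sketch of how bagged decision trees yield a loss distribution rather than a point estimate, using generic scikit-learn (version >= 1.2 assumed for the estimator keyword), not the BT-FLEMO code; the predictor names and training data are synthetic stand-ins:

```python
import numpy as np
from sklearn.ensemble import BaggingRegressor
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(3)

# Illustrative training data: predictors (e.g. water depth, duration,
# precaution score) against observed relative loss -- stand-ins only.
X = rng.uniform(0, 1, size=(500, 3))
y = 0.6 * X[:, 0] + 0.2 * X[:, 1] - 0.1 * X[:, 2] + rng.normal(0, 0.05, 500)

model = BaggingRegressor(estimator=DecisionTreeRegressor(),
                         n_estimators=100, random_state=0).fit(X, y)

# Each bagged tree gives one loss estimate; together they form a
# predictive distribution rather than a single deterministic value.
x_new = np.array([[0.7, 0.4, 0.2]])
draws = np.array([t.predict(x_new)[0] for t in model.estimators_])
print("median loss:", np.median(draws),
      "90% interval:", np.percentile(draws, [5, 95]))
```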

  8. Probabilistic Bandwidth Assignment in Wireless Sensor Networks

    OpenAIRE

    Khan, Dawood; Nefzi, Bilel; Santinelli, Luca; Song, Ye-Qiong

    2012-01-01

    With this paper we offer an insight into designing and analyzing wireless sensor networks in a versatile manner. Our framework applies probabilistic and component-based design principles to wireless sensor network modeling and, consequently, analysis, while maintaining flexibility and accuracy. In particular, we address the problem of allocating and reconfiguring the available bandwidth. The framework has been successfully implemented in IEEE 802.15.4 using an Admissi...

  9. Probabilistic real-time contingency ranking method

    International Nuclear Information System (INIS)

    Mijuskovic, N.A.; Stojnic, D.

    2000-01-01

    This paper describes a real-time contingency ranking method based on a probabilistic index: expected energy not supplied. This way it is possible to take into account the stochastic nature of electric power system equipment outages. This approach enables a more comprehensive ranking of contingencies, and it is possible to derive reliability cost values that can form the basis for hourly spot price calculations. The electric power system of Serbia is used as an example for the proposed method. (author)
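
    The ranking principle reduces to ordering contingencies by probability-weighted unserved energy; a minimal sketch with invented outage data (names, probabilities and energies are all illustrative):

```python
# Rank contingencies by expected energy not supplied (EENS):
# each outage i has an hourly probability p_i and, if it occurs,
# an estimated unserved energy E_i (MWh). Values are illustrative.
contingencies = {
    "line_A-B":   (2e-4, 120.0),
    "line_B-C":   (5e-5, 400.0),
    "gen_unit_3": (1e-4,  60.0),
}

eens = {name: p * e for name, (p, e) in contingencies.items()}
for name, value in sorted(eens.items(), key=lambda kv: -kv[1]):
    print(f"{name:12s} EENS = {value:.4f} MWh")
```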

  10. A probabilistic approach to crack instability

    Science.gov (United States)

    Chudnovsky, A.; Kunin, B.

    1989-01-01

    A probabilistic model of brittle fracture is examined with reference to two-dimensional problems. The model is illustrated by using experimental data obtained for 25 macroscopically identical specimens made of short-fiber-reinforced composites. It is shown that the model proposed here provides a predictive formalism for the probability distributions of critical crack depth, critical loads, and crack arrest depths. It also provides similarity criteria for small-scale testing.

  11. A probabilistic maintenance model for diesel engines

    Science.gov (United States)

    Pathirana, Shan; Abeygunawardane, Saranga Kumudu

    2018-02-01

    In this paper, a probabilistic maintenance model is developed for inspection-based preventive maintenance of diesel engines, based on the practical model concepts discussed in the literature. The developed model is solved using real data obtained from inspection and maintenance histories of diesel engines and experts' views. Reliability indices and costs were calculated for the present maintenance policy of diesel engines. A sensitivity analysis is conducted to observe the effect of inspection-based preventive maintenance on the life cycle cost of diesel engines.

  12. Probabilistic theism and the problem of evil

    Directory of Open Access Journals (Sweden)

    Dariusz Łukasiewicz

    2017-01-01

    Full Text Available I would like to present in this article an “omnipotence model of a theodicy of chance”, which is, I believe, compatible with the view called probabilistic theism. I also would like to argue that this model satisfies the criteria of being a good theodicy. By a good theodicy I mean a reasonable and plausible theistic account of evil. A good theodicy should be: (a) comprehensive, (b) adequate, (c) authentic and (d) existentially relevant.

  13. Probabilistic studies for safety at optimum cost

    International Nuclear Information System (INIS)

    Pitner, P.

    1999-01-01

    By definition, the risk of failure of very reliable components is difficult to evaluate. How can the best strategies for in service inspection and maintenance be defined to limit this risk to an acceptable level at optimum cost? It is not sufficient to design structures with margins, it is also essential to understand how they age. The probabilistic approach has made it possible to develop well proven concepts. (author)

  14. Incorporating psychological influences in probabilistic cost analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kujawski, Edouard; Alvaro, Mariana; Edwards, William

    2004-01-08

    Today's typical probabilistic cost analysis assumes an "ideal" project that is devoid of the human and organizational considerations that heavily influence the success and cost of real-world projects. In the real world "Money Allocated Is Money Spent" (MAIMS principle); cost underruns are rarely available to protect against cost overruns while task overruns are passed on to the total project cost. Realistic cost estimates therefore require a modified probabilistic cost analysis that simultaneously models the cost management strategy including budget allocation. Psychological influences such as overconfidence in assessing uncertainties and dependencies among cost elements and risks are other important considerations that are generally not addressed. It should then be no surprise that actual project costs often exceed the initial estimates and are delivered late and/or with a reduced scope. This paper presents a practical probabilistic cost analysis model that incorporates recent findings in human behavior and judgment under uncertainty, dependencies among cost elements, the MAIMS principle, and project management practices. Uncertain cost elements are elicited from experts using the direct fractile assessment method and fitted with three-parameter Weibull distributions. The full correlation matrix is specified in terms of two parameters that characterize correlations among cost elements in the same and in different subsystems. The analysis is readily implemented using standard Monte Carlo simulation tools such as @Risk and Crystal Ball®. The analysis of a representative design and engineering project substantiates that today's typical probabilistic cost analysis is likely to severely underestimate project cost for probability of success values of importance to contractors and procuring activities. The proposed approach provides a framework for developing a viable cost management strategy for
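
    A minimal sketch of the MAIMS effect in a Monte Carlo cost roll-up: each element's cost is drawn from a three-parameter Weibull distribution, and underruns are not returned to the project. The element parameters and budgets are invented, and the paper's inter-element correlations are omitted for brevity:

```python
import numpy as np

rng = np.random.default_rng(4)
N = 100_000

# Three-parameter Weibull cost elements plus allocated budgets.
# (location, scale, shape, budget) in M$ -- illustrative numbers.
tasks = [
    (1.0, 0.8, 1.5, 1.6),
    (2.0, 1.2, 2.0, 2.9),
    (0.5, 0.4, 1.2, 0.8),
]

total_ideal = np.zeros(N)
total_maims = np.zeros(N)
for loc, scale, shape, budget in tasks:
    cost = loc + scale * rng.weibull(shape, N)
    total_ideal += cost
    # MAIMS: underruns are not returned, overruns pass through.
    total_maims += np.maximum(cost, budget)

for label, t in [("ideal", total_ideal), ("MAIMS", total_maims)]:
    print(f"{label}: mean = {t.mean():.2f} M$, "
          f"P80 = {np.percentile(t, 80):.2f} M$")
```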

  15. Characterizing the topology of probabilistic biological networks.

    Science.gov (United States)

    Todor, Andrei; Dobra, Alin; Kahveci, Tamer

    2013-01-01

    Biological interactions are often uncertain events that may or may not take place with some probability. This uncertainty leads to a massive number of alternative interaction topologies for each such network. The existing studies analyze the degree distribution of biological networks by assuming that all the given interactions take place under all circumstances. This strong and often incorrect assumption can lead to misleading results. In this paper, we address this problem and develop a sound mathematical basis to characterize networks in the presence of uncertain interactions. Using our mathematical representation, we develop a method that can accurately describe the degree distribution of such networks. We also take one more step and extend our method to accurately compute the joint-degree distributions of node pairs connected by edges. The number of possible network topologies grows exponentially with the number of uncertain interactions. However, the mathematical model we develop allows us to compute these degree distributions in polynomial time in the number of interactions. Our method works quickly even for entire protein-protein interaction (PPI) networks. It also helps us find an adequate mathematical model using MLE. We perform a comparative study of node-degree and joint-degree distributions in two types of biological networks: the classical deterministic networks and the more flexible probabilistic networks. Our results confirm that power-law and log-normal models best describe degree distributions for both probabilistic and deterministic networks. Moreover, the inverse correlation of degrees of neighboring nodes shows that, in probabilistic networks, nodes with a large number of interactions prefer to interact with those with a small number of interactions more frequently than expected. We also show that probabilistic networks are more robust for node-degree distribution computation than the deterministic ones. all the data sets used, the software
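
    For a single node, the degree distribution under independent uncertain edges is Poisson-binomial and is computable exactly in polynomial time, in line with the abstract's claim; a small dynamic-programming sketch (a generic illustration, not the authors' code):

```python
def degree_distribution(edge_probs):
    """Exact degree distribution of one node whose incident edges
    exist independently with the given probabilities (Poisson-binomial),
    computed by dynamic programming in O(k^2) for k incident edges."""
    dist = [1.0]                           # P(degree = 0) with no edges yet
    for p in edge_probs:
        new = [0.0] * (len(dist) + 1)
        for d, prob in enumerate(dist):
            new[d]     += prob * (1 - p)   # edge absent
            new[d + 1] += prob * p         # edge present
        dist = new
    return dist

# A node with four uncertain interactions:
print(degree_distribution([0.9, 0.7, 0.5, 0.2]))
```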

  16. Failure probabilistic model of CNC lathes

    International Nuclear Information System (INIS)

    Wang Yiqiang; Jia Yazhou; Yu Junyi; Zheng Yuhua; Yi Shangfeng

    1999-01-01

    A field failure analysis of computerized numerical control (CNC) lathes is described. Field failure data were collected over a period of two years on approximately 80 CNC lathes. A coding system to code failure data was devised and a failure analysis data bank of CNC lathes was established. The failure position and subsystem, failure mode and cause were analyzed to indicate the weak subsystem of a CNC lathe. Also, a probabilistic failure model of CNC lathes was analyzed by fuzzy multicriteria comprehensive evaluation

  17. Insights gained through probabilistic risk assessments

    International Nuclear Information System (INIS)

    Hitchler, M.J.; Burns, N.L.; Liparulo, N.J.; Mink, F.J.

    1987-01-01

    A comparison of seven probabilistic risk assessment (PRA) studies (Italian PUN, Sizewell B, Ringhals 2, Millstone 3, Zion 1 and 2, Oconee 3, and Seabrook) yielded insights regarding the adequacy of the PRA technology utilized in the studies and potential areas for improvement, as well as insights regarding the adequacy of the plant designs and how PRA has been utilized to enhance the design and operation of nuclear power plants

  18. Towards probabilistic synchronisation of local controllers

    Czech Academy of Sciences Publication Activity Database

    Herzallah, R.; Kárný, Miroslav

    2017-01-01

    Roč. 48, č. 3 (2017), s. 604-615 ISSN 0020-7721 R&D Projects: GA ČR GA13-13502S Institutional support: RVO:67985556 Keywords: cooperative control * optimal control * complex systems * stochastic systems * fully probabilistic design Subject RIV: BB - Applied Statistics, Operational Research OBOR OECD: Statistics and probability Impact factor: 2.285, year: 2016

  19. Fully probabilistic design of hierarchical Bayesian models

    Czech Academy of Sciences Publication Activity Database

    Quinn, A.; Kárný, Miroslav; Guy, Tatiana Valentine

    2016-01-01

    Roč. 369, č. 1 (2016), s. 532-547 ISSN 0020-0255 R&D Projects: GA ČR GA13-13502S Institutional support: RVO:67985556 Keywords: Fully probabilistic design * Ideal distribution * Minimum cross-entropy principle * Bayesian conditioning * Kullback-Leibler divergence * Bayesian nonparametric modelling Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 4.832, year: 2016 http://library.utia.cas.cz/separaty/2016/AS/karny-0463052.pdf

  20. Probabilistic Models for Solar Particle Events

    Science.gov (United States)

    Adams, James H., Jr.; Dietrich, W. F.; Xapsos, M. A.; Welton, A. M.

    2009-01-01

    Probabilistic Models of Solar Particle Events (SPEs) are used in space mission design studies to provide a description of the worst-case radiation environment that the mission must be designed to tolerate. The models determine the worst-case environment using a description of the mission and a user-specified confidence level that the provided environment will not be exceeded. This poster will focus on completing the existing suite of models by developing models for peak flux and event-integrated fluence elemental spectra for the Z>2 elements. It will also discuss methods to take into account uncertainties in the database and the uncertainties resulting from the limited number of solar particle events in the database. These new probabilistic models are based on an extensive survey of SPE measurements of peak and event-integrated elemental differential energy spectra. Attempts are made to fit the measured spectra with eight different published models. The model giving the best fit to each spectrum is chosen and used to represent that spectrum for any energy in the energy range covered by the measurements. The set of all such spectral representations for each element is then used to determine the worst-case spectrum as a function of confidence level. The spectral representation that best fits these worst-case spectra is found and its dependence on confidence level is parameterized. This procedure creates probabilistic models for the peak and event-integrated spectra.
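
    The confidence-level construction can be illustrated by taking per-event fluences and reading off empirical quantiles per energy bin; the stand-in lognormal "database" below replaces the real SPE survey and fitted spectral models:

```python
import numpy as np

rng = np.random.default_rng(5)

# Illustrative stand-in database: event-integrated proton fluences
# (cm^-2) for 30 historical events in three energy bins (>10, >30,
# >100 MeV). The real models fit analytic spectral forms per event.
fluences = rng.lognormal(mean=[20.0, 18.5, 16.0], sigma=1.0, size=(30, 3))

def worst_case(fluences, confidence):
    # Per-energy-bin fluence not exceeded with the given confidence,
    # estimated directly from the empirical event distribution.
    return np.quantile(fluences, confidence, axis=0)

for c in (0.50, 0.90, 0.99):
    print(f"{c:.0%} confidence:", worst_case(fluences, c))
```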

  1. Probabilistic safety analysis and interpretation thereof

    International Nuclear Information System (INIS)

    Steininger, U.; Sacher, H.

    1999-01-01

    Increasing use is being made of PSA instruments in Germany for quantitative technical safety assessment, for example with regard to reportable incidents and the forwarding of information, especially in the case of modifications of nuclear plants. The Commission for Nuclear Reactor Safety recommends regular execution of PSA at ten-year intervals. According to the PSA guidance instructions, probabilistic analyses serve for assessing the degree of safety of the entire plant, expressed as the expectation value for the frequency of endangering conditions. The authors describe the method, action sequence and evaluation of probabilistic safety analyses. The limits of probabilistic safety analyses become apparent in practical implementation. Normally the guidance instructions for PSA are confined to the safety systems, so that in practice they are at best suitable for operational optimisation to a limited extent. The present restriction of the analyses has a similar effect on power output operation of the plant. This seriously degrades the utilitarian value of these analyses for the plant operators. In order to further develop PSA as a supervisory and operational optimisation instrument, both authors consider it appropriate to bring together the specific know-how of analysts, manufacturers, plant operators and experts. (orig.) [de

  2. PRECIS -- A probabilistic risk assessment system

    International Nuclear Information System (INIS)

    Peterson, D.M.; Knowlton, R.G. Jr.

    1996-01-01

    A series of computer tools has been developed to conduct the exposure assessment and risk characterization phases of human health risk assessments within a probabilistic framework. The tools are collectively referred to as the Probabilistic Risk Evaluation and Characterization Investigation System (PRECIS). With this system, a risk assessor can calculate the doses and risks associated with multiple environmental and exposure pathways, for both chemicals and radioactive contaminants. Exposure assessment models in the system account for transport of contaminants to receptor points from a source zone originating in unsaturated soils above the water table. In addition to performing calculations of dose and risk based on initial concentrations, PRECIS can also be used in an inverse manner to compute soil concentrations in the source area that must not be exceeded if prescribed limits on dose or risk are to be met. Such soil contaminant levels, referred to as soil guidelines, are computed for both single contaminants and chemical mixtures and can be used as action levels or cleanup levels. Probabilistic estimates of risk, dose and soil guidelines are derived using Monte Carlo techniques

  3. Applications of probabilistic techniques at NRC

    International Nuclear Information System (INIS)

    Thadani, A.; Rowsome, F.; Speis, T.

    1984-01-01

    The NRC is currently making extensive use of probabilistic safety assessment in reactor regulation. Most of these applications have been introduced into regulatory activities in the past few years. Plant Probabilistic Safety Studies are being utilized as a design tool for applications for standard designs and for the assessment of plants located in regions of particularly high population density. There is considerable motivation for licensees to perform plant-specific probabilistic studies for many, if not all, of the existing operating nuclear power plants as a tool for prioritizing the implementation of the many outstanding licensing actions of these plants as well as recommending the elimination of a number of these issues which are judged to be insignificant in terms of their contribution to safety and risk. Risk assessment perspectives are being used in the prioritization of generic safety issues, development of technical resolutions of unresolved safety issues, assessing the safety significance of proposed new regulatory requirements, assessment of the safety significance of some of the occurrences at operating facilities, and in environmental impact analyses of license applicants as required by the National Environmental Policy Act. (orig.)

  4. Symbolic Computing in Probabilistic and Stochastic Analysis

    Directory of Open Access Journals (Sweden)

    Kamiński Marcin

    2015-12-01

    Full Text Available The main aim is to present recent developments in applications of symbolic computing in probabilistic and stochastic analysis, and this is done using the example of the well-known MAPLE system. The key theoretical methods discussed are (i) analytical derivations, (ii) the classical Monte-Carlo simulation approach, (iii) the stochastic perturbation technique, as well as (iv) some semi-analytical approaches. It is demonstrated in particular how to engage the basic symbolic tools implemented in any system to derive the basic equations for the stochastic perturbation technique and how to make an efficient implementation of the semi-analytical methods using automatic differentiation and integration provided by the computer algebra program itself. The second important illustration is the probabilistic extension of the finite element and finite difference methods coded in MAPLE, showing how to solve boundary value problems with random parameters in the environment of symbolic computing. The response function method belongs to the third group, where interference of classical deterministic software with the non-linear fitting numerical techniques available in various symbolic environments is displayed. We recover in this context the probabilistic structural response in engineering systems and show how to solve partial differential equations including Gaussian randomness in their coefficients.
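
    The MAPLE derivations for the second-order stochastic perturbation technique have a direct analogue in any computer algebra system; below is a Python/SymPy sketch with an illustrative response function u(b) = 1/b (the function and symbol names are assumptions of the example, not taken from the paper):

```python
import sympy as sp

b, b0, var_b = sp.symbols("b b0 sigma_b2", positive=True)

# Example response function: u(b) = 1/b (e.g. a stiffness entering
# the denominator). Any sufficiently smooth u(b) works the same way.
u = 1 / b

# Second-order stochastic perturbation about the mean b0:
#   E[u]   ~ u(b0) + (1/2) u''(b0) Var(b)
#   Var[u] ~ (u'(b0))^2 Var(b)
du  = sp.diff(u, b)
d2u = sp.diff(u, b, 2)

mean_u = (u + sp.Rational(1, 2) * d2u * var_b).subs(b, b0)
var_u  = (du**2 * var_b).subs(b, b0)

print("E[u]   ~", sp.simplify(mean_u))  # 1/b0 + sigma_b2/b0**3
print("Var[u] ~", sp.simplify(var_u))   # sigma_b2/b0**4
```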

  5. Contrasting Connectivity of the Vim and Vop Nuclei of the Motor Thalamus Demonstrated by Probabilistic Tractography

    DEFF Research Database (Denmark)

    Hyam, Jonathan A; Owen, Sarah L F; Kringelbach, Morten L.

    2011-01-01

    BACKGROUND: Targeting of the motor thalamus for the treatment of tremor has traditionally been achieved by a combination of anatomical atlases and neuro-imaging, intra-operative clinical assessment, and physiological recordings. OBJECTIVE: To evaluate whether thalamic nuclei targeted in tremor surgery could be identified by virtue of their differing connections using non-invasive neuro-imaging, thereby providing an extra factor to aid successful targeting. METHODS: Diffusion tensor tractography was performed in seventeen healthy control subjects using diffusion data acquired at 1.5T magnetic resonance imaging (60 directions, b-value = 1000 s/mm², 2x2x2 mm voxels). The ventralis intermedius (Vim) and ventralis oralis posterior (Vop) nuclei were identified by a stereotactic neurosurgeon and these sites were used as seeds for probabilistic tractography. The expected cortical connections...

  6. Exploring the human body space: A geographical information system based anatomical atlas

    Directory of Open Access Journals (Sweden)

    Antonio Barbeito

    2016-06-01

    Full Text Available Anatomical atlases allow mapping the anatomical structures of the human body. Early versions of these systems consisted of analogue representations with informative text and labeled images of the human body. With computer systems, digital versions emerged and the third and fourth dimensions were introduced. Consequently, these systems increased their efficiency, allowing more realistic visualizations with improved interactivity and functionality. The 4D atlases allow modeling changes over time in the structures represented. Anatomical atlases based on geographic information system (GIS) environments allow the creation of platforms with a high degree of interactivity and new tools to explore and analyze the human body. In this study we expand the functions of a human body representation system by creating new vector data, topology, functions, and an improved user interface. The new prototype emulates a 3D GIS with a topological model of the human body, replicates the information provided by anatomical atlases, and provides a higher level of functionality and interactivity. At this stage, the developed system is intended to be used as an educational tool and integrates into the same interface the typical representations of surface and sectional atlases.

  7. Familial intracranial aneurysms: is anatomic vulnerability heritable?

    Science.gov (United States)

    Mackey, Jason; Brown, Robert D; Moomaw, Charles J; Hornung, Richard; Sauerbeck, Laura; Woo, Daniel; Foroud, Tatiana; Gandhi, Dheeraj; Kleindorfer, Dawn; Flaherty, Matthew L; Meissner, Irene; Anderson, Craig; Rouleau, Guy; Connolly, E Sander; Deka, Ranjan; Koller, Daniel L; Abruzzo, Todd; Huston, John; Broderick, Joseph P

    2013-01-01

    Previous studies have suggested that family members with intracranial aneurysms (IAs) often harbor IAs in similar anatomic locations. IA location is important because of its association with rupture. We tested the hypothesis that anatomic susceptibility to IA location exists using a family-based IA study. We identified all affected probands and first-degree relatives (FDRs) with a definite or probable phenotype in each family. We stratified each IA of the probands by major arterial territory and calculated each family's proband-FDR territory concordance and overall contribution to the concordance analysis. We then matched each family unit to an unrelated family unit selected randomly with replacement and performed 1001 simulations. The median concordance proportions, odds ratios (ORs), and P values from the 1001 logistic regression analyses were used to represent the final results of the analysis. There were 323 family units available for analysis, including 323 probands and 448 FDRs, with a total of 1176 IAs. IA territorial concordance was higher in the internal carotid artery (55.4% versus 45.6%; OR, 1.54 [1.04-2.27]; P=0.032), middle cerebral artery (45.8% versus 30.5%; OR, 1.99 [1.22-3.22]; P=0.006), and vertebrobasilar system (26.6% versus 11.3%; OR, 2.90 [1.05-8.24]; P=0.04) distributions in the true family compared with the comparison family. Concordance was also higher when any location was considered (53.0% versus 40.7%; OR, 1.82 [1.34-2.46]; P<0.001). In this family-based study of IA development, we found that IA territorial concordance was higher when probands were compared with their own affected FDRs than with comparison FDRs, which suggests that anatomic vulnerability to IA formation exists. Future studies of IA genetics should consider stratifying cases by IA location.

  8. Chronic ankle instability: Arthroscopic anatomical repair.

    Science.gov (United States)

    Arroyo-Hernández, M; Mellado-Romero, M; Páramo-Díaz, P; García-Lamas, L; Vilà-Rico, J

    Ankle sprains are one of the most common injuries. Despite appropriate conservative treatment, approximately 20-40% of patients continue to have chronic ankle instability and pain. In 75-80% of cases there is an isolated rupture of the anterior talofibular ligament. A retrospective observational study was conducted on 21 patients surgically treated for chronic ankle instability by means of an arthroscopic anatomical repair, between May 2012 and January 2013. There were 15 men and 6 women, with a mean age of 30.43 years (range 18-48). The mean follow-up was 29 months (range 25-33). All patients were treated by arthroscopic anatomical repair of the anterior talofibular ligament. Four (19%) patients were found to have varus hindfoot deformity. Associated injuries were present in 13 (62%) patients. There were 6 cases of osteochondral lesions, 3 cases of posterior ankle impingement syndrome, and 6 cases of peroneal pathology. All these injuries were surgically treated during the same operation. A clinical-functional study was performed using the American Orthopaedic Foot and Ankle Society (AOFAS) score. The mean score before surgery was 66.12 (range 60-71), and after surgery it increased to a mean of 96.95 (range 90-100). All patients were able to return to their previous sport activity within a mean of 21.5 weeks (range 17-28). Complications were found in 3 (14%) patients. The arthroscopic anatomical ligament repair technique has excellent clinical-functional results with a low percentage of complications, and enables patients to return to their previous sport activity within a short period of time. Copyright © 2016 SECOT. Published by Elsevier España, S.L.U. All rights reserved.

  9. Optimization of abdominal fat quantification on CT imaging through use of standardized anatomic space: A novel approach

    Energy Technology Data Exchange (ETDEWEB)

    Tong, Yubing; Udupa, Jayaram K., E-mail: jay@mail.med.upenn.edu [Department of Radiology, Medical Image Processing Group, University of Pennsylvania, Philadelphia, Pennsylvania 19104-6021 (United States); Torigian, Drew A. [Department of Radiology, University of Pennsylvania, Philadelphia, Pennsylvania 19104-6021 (United States)

    2014-06-15

    Purpose: The quantification of body fat plays an important role in the study of numerous diseases. It is common current practice to use the fat area at a single abdominal computed tomography (CT) slice as a marker of the body fat content in studying various disease processes. This paper sets out to answer three questions related to this issue which have not been addressed in the literature. At what single anatomic slice location do the areas of subcutaneous adipose tissue (SAT) and visceral adipose tissue (VAT) estimated from the slice correlate maximally with the corresponding fat volume measures? How does one ensure that the slices used for correlation calculation from different subjects are at the same anatomic location? Are there combinations of multiple slices (not necessarily contiguous) whose area sum correlates better with volume than does single slice area with volume? Methods: The authors propose a novel strategy for mapping slice locations to a standardized anatomic space so that same anatomic slice locations are identified in different subjects. The authors then study the volume-to-area correlations and determine where they become maximal. To address the third issue, the authors carry out similar correlation studies by utilizing two and three slices for calculating area sum. Results: Based on 50 abdominal CT data sets, the proposed mapping achieves significantly improved consistency of anatomic localization compared to current practice. Maximum correlations are achieved at different anatomic locations for SAT and VAT which are both different from the L4-L5 junction commonly utilized currently for single slice area estimation as a marker. Conclusions: The maximum area-to-volume correlation achieved is quite high, suggesting that it may be reasonable to estimate body fat by measuring the area of fat from a single anatomic slice at the site of maximum correlation and use this as a marker. The site of maximum correlation is not at L4-L5 as commonly assumed
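
    The slice-standardization and correlation search can be sketched as follows: resample each subject's per-slice fat areas onto a common [0, 1] axis between two anatomic landmarks, then locate the standardized position whose area correlates best with total volume. The synthetic area profiles below stand in for real CT measurements, and the simple linear mapping is an assumption of the sketch, not the authors' exact method:

```python
import numpy as np

rng = np.random.default_rng(6)
n_subj, n_std = 50, 101   # subjects, standardized slice locations

# Illustrative stand-in data: per-subject fat areas sampled on each
# subject's own slice grid between two anatomic landmarks.
areas_std = np.empty((n_subj, n_std))
volumes = np.empty(n_subj)
for s in range(n_subj):
    n_native = rng.integers(40, 60)            # native slice count varies
    profile = rng.gamma(2.0, 50.0, n_native)   # fat area per native slice
    # Map native slice indices onto the standardized [0, 1] axis and
    # resample by linear interpolation -- the "standardized anatomic space".
    native_x = np.linspace(0.0, 1.0, n_native)
    std_x = np.linspace(0.0, 1.0, n_std)
    areas_std[s] = np.interp(std_x, native_x, profile)
    volumes[s] = profile.sum()                 # proxy for total fat volume

# Correlate area with volume at every standardized location; the best
# single-slice marker is the location with maximal correlation.
corrs = [np.corrcoef(areas_std[:, i], volumes)[0, 1] for i in range(n_std)]
best = int(np.argmax(corrs))
print(f"max r = {max(corrs):.2f} at standardized location {best / (n_std - 1):.2f}")
```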

  10. Talocalcaneal luxation: an anatomic and clinical study

    International Nuclear Information System (INIS)

    Gorse, M.J.; Purinton, P.T.; Penwick, R.C.; Aron, D.N.; Roberts, R.E.

    1990-01-01

    Talocalcaneal luxation in dogs was studied by anatomic dissection of the talocalcaneal joint in cadavers and review of five clinical cases. The integrity of the talocalcaneal joint was maintained by two strong ligaments traversing the tarsal sinus between the two bones. The joint was found to be a low motion joint. Luxation in clinical cases was not always apparent on standard radiographic views. Three dogs were treated surgically with a screw inserted in lag fashion from talus to calcaneus. One luxation was treated surgically with figure-of-eight orthopedic wires and one was treated with external coaptation. Four dogs returned to their previous levels of function without clinically detectable lameness

  11. Embryologic and anatomic basis of inguinal herniorrhaphy.

    Science.gov (United States)

    Skandalakis, J E; Colborn, G L; Androulakis, J A; Skandalakis, L J; Pemberton, L B

    1993-08-01

    The embryology and surgical anatomy of the inguinal area is presented with emphasis on embryologic and anatomic entities related to surgery. We have presented the factors, such as patent processus vaginalis and defective posterior wall of the inguinal canal, that may be responsible for the genesis of congenital inguinofemoral herniation. These, together with impaired collagen synthesis and trauma, are responsible for the formation of the acquired inguinofemoral hernia. Still, we do not have all the answers for an ideal repair. Despite the latest successes in repair, we, to paraphrase Ritsos, are awaiting the triumphant return of Theseus.

  12. Probabilistic safety analysis vs probabilistic fracture mechanics - relation and necessary merging

    International Nuclear Information System (INIS)

    Nilsson, Fred

    1997-01-01

    A comparison is made between some general features of probabilistic fracture mechanics (PFM) and probabilistic safety assessment (PSA) in its standard form. We conclude the following. The result from PSA is a numerically expressed level of confidence in the system based on the state of current knowledge; it is thus not an objective measure of risk. It is important to carefully define the precise nature of the probabilistic statement and relate it to a well defined situation. Standardisation of PFM methods is necessary. PFM seems to be the only way to obtain estimates of the pipe break probability. Service statistics are of doubtful value because of the scarcity of data and statistical inhomogeneity. Collection of service data should be directed towards the occurrence of growing cracks

  13. [Anatomical study of men's nipple areola complex].

    Science.gov (United States)

    Vaucher, R; Dast, S; Assaf, N; Sinna, R

    2016-06-01

    The surgical approach to gynecomastia, sexual reassignment surgery in female-to-male transsexuals, and the increasing number of obese patients turning to plastic surgery led us to deepen the anatomical knowledge of the nipple areola complex (NAC) in men, which is poorly detailed in the literature. Drawing on the methodology of a Japanese study, we studied 50 healthy male volunteers, aged 18 to 55 years, from July to August 2015. We measured various distances relative to the NAC to define its vertical and horizontal position, as well as the internipple distance as a function of height, weight and body mass index (BMI). From this analysis, we were able to demonstrate a lower vertical thoracic position of the NAC in taller subjects, a more lateral horizontal position in subjects with a high BMI, and a linear relation between BMI and the internipple distance (Em) given by Em = 8.96 × BMI. The surgeon's judgment and the patient's wishes remain the essential basis of therapeutic decisions, which may be supported by this anatomical study, which establishes an initial cartography of the NAC in men. It will be necessary to compare it with other, larger-scale studies. Copyright © 2016 Elsevier Masson SAS. All rights reserved.

  14. On probabilistic forecasting of wind power time-series

    DEFF Research Database (Denmark)

    Pinson, Pierre

    ...power dynamics. In both cases, the model parameters are adaptively and recursively estimated, time-adaptivity being the result of exponential forgetting of past observations. The probabilistic forecasting methodology is applied at the Horns Rev wind farm in Denmark, for 10-minute-ahead probabilistic forecasting of wind power generation. Probabilistic forecasts generated from the proposed methodology clearly have higher skill than those obtained from a classical Gaussian assumption about wind power predictive densities. Corresponding point forecasts also exhibit significantly lower error criteria.
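
    Exponential forgetting of past observations amounts to recursive least squares with a forgetting factor; a generic sketch is below (not the author's wind power model, whose structure is richer; the drifting toy data are invented):

```python
import numpy as np

def rls_forgetting(phi, y, lam=0.98, theta=None, P=None):
    """One step of recursive least squares with exponential forgetting:
    older observations are down-weighted by the forgetting factor lam,
    which keeps the estimates adaptive to changing dynamics."""
    n = len(phi)
    theta = np.zeros(n) if theta is None else theta
    P = 1e3 * np.eye(n) if P is None else P
    k = P @ phi / (lam + phi @ P @ phi)        # gain
    theta = theta + k * (y - phi @ theta)      # parameter update
    P = (P - np.outer(k, phi @ P)) / lam       # covariance update
    return theta, P

# Toy usage: track a slowly drifting linear relation y = a*x + b.
rng = np.random.default_rng(7)
theta, P = None, None
for t in range(500):
    x = rng.uniform(0, 1)
    a = 1.0 + 0.002 * t                        # drifting "true" slope
    phi = np.array([x, 1.0])
    y = a * x + 0.2 + rng.normal(0, 0.02)
    theta, P = rls_forgetting(phi, y, 0.98, theta, P)
print("estimated [slope, intercept]:", theta)
```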

  15. Volume 2. Probabilistic analysis of HTGR application studies. Supporting data

    International Nuclear Information System (INIS)

    1980-09-01

    Volume II, Probabilistic Analysis of HTGR Application Studies - Supporting Data, gives the detailed data, both deterministic and probabilistic, employed in the calculations presented in Volume I. The HTGR plants and the fossil plants considered in the study are listed. GCRA provided the technical experts from whom the data were obtained by MAC personnel. The names of the technical experts (interviewees) and the analysts (interviewers) are given for the probabilistic data

  16. Probabilistic structural analysis of aerospace components using NESSUS

    Science.gov (United States)

    Shiao, Michael C.; Nagpal, Vinod K.; Chamis, Christos C.

    1988-01-01

    Probabilistic structural analysis of a Space Shuttle main engine turbopump blade is conducted using the computer code NESSUS (numerical evaluation of stochastic structures under stress). The goal of the analysis is to derive probabilistic characteristics of blade response given probabilistic descriptions of uncertainties in blade geometry, material properties, and temperature and pressure distributions. Probability densities are derived for critical blade responses. Risk assessment and failure life analysis are conducted assuming different failure models.

  17. Probabilistic safety assessment as a standpoint for decision making

    International Nuclear Information System (INIS)

    Cepin, M.

    2001-01-01

    This paper focuses on the role of probabilistic safety assessment in decision-making. The prerequisites for use of the results of probabilistic safety assessment and the criteria for decision-making based on probabilistic safety assessment are discussed. The decision-making process is described. It provides a risk evaluation of the impact of the issue under investigation. Selected examples are discussed, which highlight the described process. (authors)

  18. A framework for probabilistic pluvial flood nowcasting for urban areas

    Science.gov (United States)

    Ntegeka, Victor; Murla, Damian; Wang, Lipen; Foresti, Loris; Reyniers, Maarten; Delobbe, Laurent; Van Herk, Kristine; Van Ootegem, Luc; Willems, Patrick

    2016-04-01

    Pluvial flood nowcasting is gaining ground, not least because of the advancements in rainfall forecasting schemes. Short-term forecasts and applications have benefited from the availability of such forecasts with high resolution in space (~1 km) and time (~5 min). In this regard, it is vital to evaluate the potential of nowcasting products for urban inundation applications. One of the most advanced Quantitative Precipitation Forecasting (QPF) techniques is the Short-Term Ensemble Prediction System, which was originally co-developed by the UK Met Office and the Australian Bureau of Meteorology. The scheme was further tuned to better estimate extreme and moderate events for the Belgian area (STEPS-BE). Against this backdrop, a probabilistic framework has been developed that consists of: (1) rainfall nowcasts; (2) a sewer hydraulic model; (3) flood damage estimation; and (4) urban inundation risk mapping. STEPS-BE forecasts are provided at high resolution (1 km/5 min) with 20 ensemble members and a lead time of up to 2 hours, using a composite of 4 C-band radars as input. Forecast verification was performed over the cities of Leuven and Ghent and biases were found to be small. The hydraulic model consists of the 1D sewer network and an innovative 'nested' 2D surface model to model 2D urban surface inundations at high resolution. The surface components are categorized into three groups and each group is modelled using triangular meshes at different resolutions; these include streets (3.75 - 15 m2), high flood hazard areas (12.5 - 50 m2) and low flood hazard areas (75 - 300 m2). Functions describing urban flood damage and social consequences were empirically derived based on questionnaires given to people in the region who were recently affected by sewer floods. Probabilistic urban flood risk maps were prepared based on spatial interpolation techniques of flood inundation. The method has been implemented and tested for the villages Oostakker and Sint-Amandsberg, which are part of the
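
    From a 20-member ensemble, a probabilistic hazard map is simply the per-cell fraction of members exceeding a threshold; a sketch with synthetic inundation depths standing in for the STEPS-BE/hydraulic-model chain (the grid, depths and threshold are all invented):

```python
import numpy as np

rng = np.random.default_rng(8)

# Stand-in for ensemble output: 20 members of simulated maximum
# inundation depth (m) on a small grid.
members, ny, nx = 20, 50, 50
depth = rng.gamma(shape=1.5, scale=0.08, size=(members, ny, nx))

# Probabilistic flood hazard map: per-cell probability that depth
# exceeds a damage-relevant threshold, estimated across the ensemble.
threshold = 0.25  # m, illustrative
p_exceed = (depth > threshold).mean(axis=0)
print("cells with P(depth > 0.25 m) >= 0.5:", int((p_exceed >= 0.5).sum()))
```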

  19. Quantifying agreement between anatomical and functional interhemispheric correspondences in the resting brain.

    Directory of Open Access Journals (Sweden)

    Hang Joon Jo

    Full Text Available The human brain is composed of two broadly symmetric cerebral hemispheres, with an abundance of reciprocal anatomical connections between homotopic locations. However, to date, studies of hemispheric symmetries have not identified correspondences precisely, due to variable cortical folding patterns. Here we present a method to establish accurate correspondence using position on the unfolded cortical surface relative to gyral and sulcal landmarks. The landmark method is shown to outperform the method of reversing standard volume coordinates, and it is used to quantify the functional symmetry in resting fMRI data throughout the cortex. Resting brain activity was found to be maximally correlated with locations less than 1 cm away on the cortical surface from the corresponding anatomical location in nearly half of the cortex. While select locations exhibited asymmetric patterns, precise symmetric relationships were found to be the norm, with fine-grained symmetric functional maps demonstrated in motor, occipital, and inferior frontal cortex.

  20. Probabilistic logic networks a comprehensive framework for uncertain inference

    CERN Document Server

    Goertzel, Ben; Goertzel, Izabela Freire; Heljakka, Ari

    2008-01-01

    This comprehensive book describes Probabilistic Logic Networks (PLN), a novel conceptual, mathematical and computational approach to uncertain inference. A broad scope of reasoning types is considered.