WorldWideScience

Sample records for subject-specific probabilistic atlas

  1. Probabilistic liver atlas construction.

    Science.gov (United States)

    Dura, Esther; Domingo, Juan; Ayala, Guillermo; Marti-Bonmati, Luis; Goceri, E

    2017-01-13

    Anatomical atlases are 3D volumes or shapes representing an organ or structure of the human body. They contain either the prototypical shape of the object of interest together with other shapes representing its statistical variations (statistical atlas) or a probability map of belonging to the object (probabilistic atlas). Probabilistic atlases are mostly built with simple estimations that only involve the data at each spatial location. A new method for probabilistic atlas construction that uses a generalized linear model is proposed; it aims to improve the estimation of the probability that each voxel is covered by the liver. Furthermore, all methods for building an atlas involve prior coregistration of the available sample of shapes, and the influence of the geometrical transformation adopted for registration on the quality of the final atlas has not been sufficiently investigated. The ability of an atlas to adapt to a new case is one of the most important quality criteria that should be taken into account. The presented experiments show that some methods for atlas construction are severely affected by the preceding coregistration step, and that the new approach performs well. Furthermore, the results suggest that extremely flexible registration methods are not always beneficial, since they can reduce the variability of the atlas and hence its ability to give sensible probability values when used as an aid in segmenting new cases.
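
    The "simple estimations only involving the data at each spatial location" that the abstract contrasts with its generalized linear model amount to a voxel-wise coverage frequency over coregistered training masks. A minimal sketch of that baseline (data and function names are illustrative, not the paper's code):

```python
def probabilistic_atlas(masks):
    """Baseline voxel-wise atlas: the probability at each voxel is the
    fraction of coregistered binary training masks that cover it.
    `masks` is a list of equally sized binary volumes flattened to 1D."""
    n = len(masks)
    size = len(masks[0])
    assert all(len(m) == size for m in masks), "masks must share a common grid"
    return [sum(m[v] for m in masks) / n for v in range(size)]

# Three toy 1D "volumes" standing in for registered liver masks:
atlas = probabilistic_atlas([
    [0, 1, 1, 1, 0],
    [0, 1, 1, 0, 0],
    [1, 1, 1, 1, 0],
])
# atlas[1] == 1.0 (all masks agree); atlas[0] == 1/3
```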

  2. Validating atlas-guided DOT: a comparison of diffuse optical tomography informed by atlas and subject-specific anatomies.

    Science.gov (United States)

    Cooper, Robert J; Caffini, Matteo; Dubb, Jay; Fang, Qianqian; Custo, Anna; Tsuzuki, Daisuke; Fischl, Bruce; Wells, William; Dan, Ippeita; Boas, David A

    2012-09-01

    We describe the validation of an anatomical brain atlas approach to the analysis of diffuse optical tomography (DOT). Using MRI data from 32 subjects, we compare the diffuse optical images of simulated cortical activation reconstructed using a registered atlas with those obtained using a subject's true anatomy. The error in localization of the simulated cortical activations when using a registered atlas is due to a combination of imperfect registration, anatomical differences between atlas and subject anatomies and the localization error associated with diffuse optical image reconstruction. When using a subject-specific MRI, any localization error is due to diffuse optical image reconstruction only. In this study we determine that using a registered anatomical brain atlas results in an average localization error of approximately 18 mm in Euclidean space. The corresponding error when the subject's own MRI is employed is 9.1 mm. In general, the cost of using atlas-guided DOT in place of subject-specific MRI-guided DOT is a doubling of the localization error. Our results show that despite this increase in error, reasonable anatomical localization is achievable even in cases where the subject-specific anatomy is unavailable. Copyright © 2012 Elsevier Inc. All rights reserved.

  3. Pancreas segmentation from 3D abdominal CT images using patient-specific weighted subspatial probabilistic atlases

    Science.gov (United States)

    Karasawa, Kenichi; Oda, Masahiro; Hayashi, Yuichiro; Nimura, Yukitaka; Kitasaka, Takayuki; Misawa, Kazunari; Fujiwara, Michitaka; Rueckert, Daniel; Mori, Kensaku

    2015-03-01

    Abdominal organ segmentation from CT volumes is now widely used in computer-aided diagnosis and surgery-assistance systems. Among abdominal organs, the pancreas is especially difficult to segment because of its large inter-individual variation in shape and position. In this paper, we propose a new pancreas segmentation method for 3D abdominal CT volumes using patient-specific weighted subspatial probabilistic atlases. First, we normalize the organ shapes in the training volumes and the input volume, extract the volume of interest (VOI) of the pancreas from each, and divide each training VOI and the input VOI into cubic regions. We use a nonrigid registration method to register the cubic regions of each training VOI to the corresponding regions of the input VOI. Based on the registration results, we calculate similarities between each cubic region of a training VOI and the corresponding region of the input VOI, and select the cubic regions of the training volumes with the top N similarities in each region. We then construct subspatial probabilistic atlases weighted by these similarities, integrate the per-region atlases into one, and perform a rough-to-precise segmentation of the pancreas using the combined atlas. Experiments showed that using the training volumes with the top N similarities in each cubic region led to good pancreas segmentation results: the Jaccard index and average surface distance were 58.9% and 2.04 mm on average, respectively.
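
    The Jaccard index used in the evaluation above is the standard overlap ratio between a segmentation and its reference. A minimal version on sets of voxel coordinates (illustrative, not the authors' evaluation code):

```python
def jaccard_index(seg, ref):
    """Jaccard index |A & B| / |A | B| between two voxel-coordinate sets;
    the abstract's 58.9% corresponds to a value of 0.589."""
    seg, ref = set(seg), set(ref)
    union = seg | ref
    return len(seg & ref) / len(union) if union else 1.0

# Toy example: 3 overlapping voxels out of 5 in the union.
j = jaccard_index({(0, 0), (0, 1), (1, 0), (1, 1)},
                  {(0, 1), (1, 0), (1, 1), (2, 1)})
# j == 0.6
```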

  4. Atlas-Based Automatic Generation of Subject-Specific Finite Element Tongue Meshes.

    Science.gov (United States)

    Bijar, Ahmad; Rohan, Pierre-Yves; Perrier, Pascal; Payan, Yohan

    2016-01-01

    Generation of subject-specific 3D finite element (FE) models requires the processing of numerous medical images in order to precisely extract geometrical information about subject-specific anatomy. This processing remains extremely challenging. To overcome this difficulty, we present an automatic atlas-based method that generates subject-specific FE meshes via a 3D registration guided by magnetic resonance images. The method extracts a 3D transformation by registering the atlas volume image to the subject's, establishing a one-to-one correspondence between the two volumes. The 3D transformation field deforms the atlas mesh to generate the subject-specific FE mesh. To preserve the quality of the subject-specific mesh, a diffeomorphic non-rigid registration based on B-spline free-form deformations is used, which guarantees a non-folding, one-to-one transformation. Two evaluations of the method are provided. First, a publicly available CT database is used to assess the method's ability to accurately capture the complexity of each subject's lung geometry. Second, FE tongue meshes are generated from MR images for two healthy volunteers and two patients suffering from tongue cancer. The method is shown to generate an appropriate representation of the subject-specific geometry while preserving the quality of the FE meshes for subsequent FE analysis. To demonstrate the importance of our method in a clinical context, a subject-specific mesh is used to simulate the tongue's biomechanical response to the activation of an important tongue muscle, before and after cancer surgery.

  5. Subject-Specific Sparse Dictionary Learning for Atlas-Based Brain MRI Segmentation.

    Science.gov (United States)

    Roy, Snehashis; He, Qing; Sweeney, Elizabeth; Carass, Aaron; Reich, Daniel S; Prince, Jerry L; Pham, Dzung L

    2015-09-01

    Quantitative measurements from segmentations of human brain magnetic resonance (MR) images provide important biomarkers for normal aging and disease progression. In this paper, we propose a patch-based tissue classification method for MR images that uses a sparse dictionary learning approach and atlas priors. Training data for the method consist of an atlas MR image, prior-information maps depicting where different tissues are expected to be located, and a hard segmentation. Unlike most atlas-based classification methods, which require deformable registration of the atlas priors to the subject, only affine registration is required between the subject and the training atlas. A subject-specific patch dictionary is created by learning relevant patches from the atlas. The subject patches are then modeled as sparse combinations of learned atlas patches, leading to tissue memberships at each voxel. Combining prior information with an example-based framework enables us to distinguish tissues having similar intensities but different spatial locations. We demonstrate the efficacy of the approach on whole-brain tissue segmentation in subjects with healthy anatomy and normal-pressure hydrocephalus, as well as lesion segmentation in multiple sclerosis patients. For each application, quantitative comparisons are made against publicly available state-of-the-art approaches.
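
    The abstract models each subject patch as a sparse combination of learned atlas patches. As a drastically simplified, hypothetical sketch, the 1-sparse special case reduces to picking the single best-matching atlas patch by normalized correlation and transferring its tissue membership (all data and names below are invented for illustration):

```python
import math

def best_atom(patch, dictionary):
    """1-sparse special case of sparse coding: return the index of the
    dictionary patch with the highest normalized correlation to `patch`.
    (The paper uses sparse *combinations* of atoms; this keeps one atom.)"""
    def ncc(a, b):
        na = math.sqrt(sum(x * x for x in a)) or 1.0
        nb = math.sqrt(sum(x * x for x in b)) or 1.0
        return sum(x * y for x, y in zip(a, b)) / (na * nb)
    return max(range(len(dictionary)), key=lambda i: ncc(patch, dictionary[i]))

# Hypothetical atlas patches with known tissue memberships:
atoms = [[1.0, 1.0, 1.0], [1.0, 0.0, 0.0]]   # uniform bright vs. edge pattern
memberships = ["WM", "CSF"]
label = memberships[best_atom([0.9, 1.1, 1.0], atoms)]
# label == "WM"
```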

  6. Probabilistic atlas based labeling of the cerebral vessel tree

    Science.gov (United States)

    Van de Giessen, Martijn; Janssen, Jasper P.; Brouwer, Patrick A.; Reiber, Johan H. C.; Lelieveldt, Boudewijn P. F.; Dijkstra, Jouke

    2015-03-01

    Preoperative imaging of the cerebral vessel tree is essential for planning therapy for intracranial stenoses and aneurysms. Usually, a magnetic resonance angiography (MRA) or computed tomography angiography (CTA) is acquired, from which the cerebral vessel tree is segmented. Accurate analysis is helped by labeling of the cerebral vessels, but labeling is non-trivial due to anatomical and topological variability and to branches missing because of acquisition issues. In recent literature, labeling the cerebral vasculature around the Circle of Willis has mainly been approached as a graph-based problem. The most successful method, however, requires the definition of all possible permutations of missing vessels, which limits application to subsets of the tree and ignores spatial information about the vessel locations. This research aims to perform labeling using probabilistic atlases that model spatial vessel and label likelihoods. A cerebral vessel tree is aligned to a probabilistic atlas, and each vessel is subsequently labeled by computing the maximum label likelihood per segment from label-specific atlases. The proposed method was validated on 25 segmented cerebral vessel trees. Labeling accuracies were close to 100% for large vessels, but dropped to 50-60% for small vessels that were present in less than 50% of the set. With this work we showed that, using solely spatial information about the vessel labels, segments from stable vessels (>50% presence) were reliably classified. This spatial information will form the basis for a future labeling strategy with a very loose topological model.
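
    The per-segment labeling step described above, taking the label whose atlas gives the highest mean likelihood over the segment, can be sketched as follows (the vessel names, data layout, and probabilities are illustrative assumptions, not the paper's data):

```python
def label_segment(segment_voxels, label_atlases):
    """Assign a vessel segment the label whose probabilistic atlas gives
    the highest mean likelihood over the segment's (aligned) voxels.
    `label_atlases` maps label -> {voxel coordinate: probability}."""
    def mean_likelihood(atlas):
        return sum(atlas.get(v, 0.0) for v in segment_voxels) / len(segment_voxels)
    return max(label_atlases, key=lambda lbl: mean_likelihood(label_atlases[lbl]))

# Toy label-specific atlases over a handful of voxel positions:
atlases = {
    "ICA": {(0, 0): 0.9, (0, 1): 0.8, (1, 0): 0.1},
    "MCA": {(0, 0): 0.2, (0, 1): 0.3, (1, 0): 0.9},
}
# A segment lying on (0,0) and (0,1) gets "ICA" (mean 0.85 vs. 0.25).
```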

  7. A probabilistic atlas of human brainstem pathways based on connectome imaging data.

    Science.gov (United States)

    Tang, Yuchun; Sun, Wei; Toga, Arthur W; Ringman, John M; Shi, Yonggang

    2018-04-01

    The brainstem is a critical structure that regulates vital autonomic functions, houses the cranial nerves and their nuclei, relays motor and sensory information between the brain and spinal cord, and modulates cognition, mood, and emotions. As a primary relay center, the fiber pathways of the brainstem include efferent and afferent connections among the cerebral cortex, spinal cord, and cerebellum. While diffusion MRI has been successfully applied to map various brain pathways, its application to in vivo imaging of the brainstem pathways has been limited by inadequate resolution and large susceptibility-induced distortion artifacts. With the release of high-resolution data from the Human Connectome Project (HCP), there is increasing interest in mapping human brainstem pathways. Previous works relying on HCP data to study brainstem pathways, however, did not consider the prevalence (>80%) of large distortions in the brainstem even after the application of correction procedures from the HCP-Pipeline. They were also limited by inadequate consideration of subject variability in either the fiber pathways or the regions of interest (ROIs) used for bundle reconstruction. To overcome these limitations, we develop in this work a probabilistic atlas of 23 major brainstem bundles using high-quality HCP data passing rigorous quality control. For the large-scale data from the 500-Subject release of HCP, we conducted extensive quality control to exclude subjects with severe distortions in the brainstem area. After that, we developed a systematic protocol to manually delineate 1300 ROIs on 20 HCP subjects (10 males; 10 females) for the reconstruction of fiber bundles using tractography techniques. Finally, we leveraged our novel connectome modeling techniques, including high-order fiber orientation distribution (FOD) reconstruction from multi-shell diffusion imaging and topography-preserving tract filtering algorithms, to successfully reconstruct the 23 fiber bundles

  8. Value of a probabilistic atlas in medical image segmentation regarding non-rigid registration of abdominal CT scans

    Science.gov (United States)

    Park, Hyunjin; Meyer, Charles R.

    2012-10-01

    A probabilistic atlas provides important information to help segmentation and registration applications in medical image analysis. We construct a probabilistic atlas by picking a target geometry, mapping the other training scans onto that target, and summing the results into one probabilistic atlas. By choosing an atlas space close to the desired target, we construct an atlas that represents the population well. Image registration, used to map one image geometry onto another, is a primary task in atlas building; one of its main parameters is the choice of degrees of freedom (DOFs) of the geometric transform. Herein, we measure the effect of the registration's DOFs on the segmentation performance of the resulting probabilistic atlas. Twenty-three normal abdominal CT scans were used, and four organs (liver, spinal cord, left and right kidneys) were segmented in each scan. A well-known manifold learning method, ISOMAP, was used to find the best target space in which to build an atlas. In summary, segmentation performance was high for high-DOF registrations regardless of the chosen target space, while it was lower for low-DOF registrations if the target space was far from the best one. At the 0.05 level of statistical significance, there were no significant differences between targets for high-DOF registrations, while there were significant differences for low-DOF registrations.

  9. A high-resolution probabilistic in vivo atlas of human subcortical brain nuclei.

    Science.gov (United States)

    Pauli, Wolfgang M; Nili, Amanda N; Tyszka, J Michael

    2018-04-17

    Recent advances in magnetic resonance imaging methods, including data acquisition, pre-processing and analysis, have benefited research on the contributions of subcortical brain nuclei to human cognition and behavior. At the same time, these developments have led to an increasing need for a high-resolution probabilistic in vivo anatomical atlas of subcortical nuclei. In order to address this need, we constructed high spatial resolution, three-dimensional templates, using high-accuracy diffeomorphic registration of T1- and T2-weighted structural images from 168 typical adults between 22 and 35 years old. In these templates, many tissue boundaries are clearly visible, which would otherwise be impossible to delineate in data from individual studies. The resulting delineations of subcortical nuclei complement current histology-based atlases. We further created a companion library of software tools for atlas development, to offer an open and evolving resource for the creation of a crowd-sourced in vivo probabilistic anatomical atlas of the human brain.

  10. Probabilistic atlas-based segmentation of combined T1-weighted and DUTE MRI for calculation of head attenuation maps in integrated PET/MRI scanners.

    Science.gov (United States)

    Poynton, Clare B; Chen, Kevin T; Chonde, Daniel B; Izquierdo-Garcia, David; Gollub, Randy L; Gerstner, Elizabeth R; Batchelor, Tracy T; Catana, Ciprian

    2014-01-01

    We present a new MRI-based attenuation correction (AC) approach for integrated PET/MRI systems that combines segmentation- and atlas-based methods by incorporating dual-echo ultra-short echo-time (DUTE) and T1-weighted (T1w) MRI data and a probabilistic atlas. Segmented atlases were constructed from CT training data using a leave-one-out framework and combined with T1w, DUTE, and CT data to train a classifier that computes the probability of air/soft tissue/bone at each voxel. This classifier was applied to segment the MRI of the subject of interest, and attenuation maps (μ-maps) were generated by assigning specific linear attenuation coefficients (LACs) to each tissue class. The μ-maps generated with this "Atlas-T1w-DUTE" approach were compared to those obtained from DUTE data using a previously proposed method. For validation of the segmentation results, segmented CT μ-maps were considered the "silver standard"; segmentation accuracy was assessed qualitatively and quantitatively through calculation of the Dice similarity coefficient (DSC). Relative change (RC) maps between the CT- and MRI-based attenuation-corrected PET volumes were also calculated for a global voxel-wise assessment of the reconstruction results. The μ-maps obtained using the Atlas-T1w-DUTE classifier agreed well with those derived from CT: the mean DSCs for the Atlas-T1w-DUTE-based μ-maps were higher than those for the DUTE-based μ-maps across all subjects, and the atlas-based μ-maps also showed a lower percentage of misclassified voxels. RC maps from the atlas-based technique likewise demonstrated improvement in the PET data over the DUTE method, both globally and regionally.
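
    The final step of turning a tissue classification into a μ-map, assigning a class-specific linear attenuation coefficient (LAC) to each voxel, can be sketched as below. The LAC values shown are commonly cited approximations at 511 keV (in 1/cm), not values taken from the paper, and the data layout is an assumption:

```python
# Hypothetical class-specific LACs; the paper assigns its own values.
LACS = {"air": 0.0, "soft_tissue": 0.096, "bone": 0.151}

def mu_map(class_probs):
    """Build a mu-map by assigning each voxel the LAC of its most probable
    tissue class. `class_probs` is a list of {class: probability} dicts,
    one per voxel, as a classifier might output."""
    return [LACS[max(p, key=p.get)] for p in class_probs]

voxels = [
    {"air": 0.8, "soft_tissue": 0.15, "bone": 0.05},
    {"air": 0.05, "soft_tissue": 0.25, "bone": 0.7},
]
# mu_map(voxels) == [0.0, 0.151]
```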

  11. Gyri of the human parietal lobe: Volumes, spatial extents, automatic labelling, and probabilistic atlases.

    Directory of Open Access Journals (Sweden)

    Heather M Wild

    Accurately describing the anatomy of individual brains enables interlaboratory communication of functional and developmental studies and is crucial for possible surgical interventions. The human parietal lobe participates in multimodal sensory integration, including language processing, and also contains the primary somatosensory area. We describe detailed protocols to subdivide the parietal lobe, analyze morphological and volumetric characteristics, and create probabilistic atlases in MNI152 stereotaxic space. The parietal lobe was manually delineated on 3D T1 MR images of 30 healthy subjects and divided into four regions: supramarginal gyrus (SMG), angular gyrus (AG), superior parietal lobe (supPL) and postcentral gyrus (postCG). There was the expected correlation of male gender with larger brain and intracranial volume. We examined a wide range of anatomical features of the gyri and of the sulci separating them. At least a rudimentary primary intermediate sulcus of Jensen (PISJ) separating SMG and AG was identified in nearly all (59/60) hemispheres. The presence of additional gyri in SMG and AG was related to sulcal features and volumetric characteristics. The parietal lobe was slightly (2%) larger on the left, driven by leftward asymmetries of the postCG and SMG. Intersubject variability was highest for SMG and AG, and lowest for postCG. Overall, the morphological characteristics tended to be symmetrical, and volumes also tended to covary between hemispheres. This may reflect developmental as well as maturation factors. To assess the accuracy with which the labels can be used to segment newly acquired (unlabelled) T1-weighted brain images, we applied multi-atlas label propagation software (MAPER) in a leave-one-out experiment and compared the resulting automatic labels with the manually prepared ones. The results showed strong agreement (mean Jaccard index 0.69, corresponding to a mean Dice index of 0.82; average mean volume error of 0.6%). Stereotaxic

  12. Mindboggle: Automated brain labeling with multiple atlases

    International Nuclear Information System (INIS)

    Klein, Arno; Mensh, Brett; Ghosh, Satrajit; Tourville, Jason; Hirsch, Joy

    2005-01-01

    To make inferences about brain structures or activity across multiple individuals, one first needs to determine the structural correspondences across their image data. We have recently developed Mindboggle as a fully automated, feature-matching approach to assign anatomical labels to cortical structures and activity in human brain MRI data. Label assignment is based on structural correspondences between labeled atlases and unlabeled image data, where an atlas consists of a set of labels manually assigned to a single brain image. In the present work, we study the influence of using variable numbers of individual atlases to nonlinearly label human brain image data. Each brain image voxel of each of 20 human subjects is assigned a label by each of the remaining 19 atlases using Mindboggle. The most common label is selected and is given a confidence rating based on the number of atlases that assigned that label. The automatically assigned labels for each subject brain are compared with the manual labels for that subject (its atlas). Unlike recent approaches that transform subject data to a labeled, probabilistic atlas space (constructed from a database of atlases), Mindboggle labels a subject by each atlas in a database independently. When Mindboggle labels a human subject's brain image with at least four atlases, the resulting label agreement with coregistered manual labels is significantly higher than when only a single atlas is used. Different numbers of atlases provide significantly higher label agreements for individual brain regions. Increasing the number of reference brains used to automatically label a human subject brain improves labeling accuracy with respect to manually assigned labels. Mindboggle software can provide confidence measures for labels based on probabilistic assignment of labels and could be applied to large databases of brain images
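
    The fusion rule described above, selecting the most common label across atlases and attaching a confidence based on the number of agreeing atlases, can be sketched for a single voxel (function and label names are illustrative, not Mindboggle's API):

```python
from collections import Counter

def fuse_labels(votes):
    """Majority vote over the labels assigned to one voxel by several
    atlases, with confidence = fraction of atlases agreeing."""
    label, count = Counter(votes).most_common(1)[0]
    return label, count / len(votes)

# One voxel labeled by five atlases:
label, confidence = fuse_labels(["precentral", "precentral", "postcentral",
                                 "precentral", "postcentral"])
# label == "precentral", confidence == 0.6
```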

  13. Bayesian longitudinal segmentation of hippocampal substructures in brain MRI using subject-specific atlases.

    Science.gov (United States)

    Iglesias, Juan Eugenio; Van Leemput, Koen; Augustinack, Jean; Insausti, Ricardo; Fischl, Bruce; Reuter, Martin

    2016-11-01

    The hippocampal formation is a complex, heterogeneous structure that consists of a number of distinct, interacting subregions. Atrophy of these subregions is implicated in a variety of neurodegenerative diseases, most prominently Alzheimer's disease (AD). Thanks to the increasing resolution of MR images and computational atlases, automatic segmentation of hippocampal subregions is becoming feasible in MRI scans. Here we introduce a generative model for dedicated longitudinal segmentation that relies on subject-specific atlases. The segmentations of the scans at the different time points are computed jointly using Bayesian inference, and all time points are treated the same to avoid processing bias. We evaluate this approach using over 4700 scans from two publicly available datasets (ADNI and MIRIAD). In test-retest reliability experiments, the proposed method yielded significantly lower volume differences and significantly higher Dice overlaps than the cross-sectional approach for nearly every subregion (average across subregions: 4.5% vs. 6.5%; Dice overlap: 81.8% vs. 75.4%). The longitudinal algorithm also demonstrated increased sensitivity to group differences: in MIRIAD (69 subjects: 46 with AD and 23 controls), it found differences in atrophy rates between AD and controls that the cross-sectional method could not detect in a number of subregions: right parasubiculum, left and right presubiculum, right subiculum, left dentate gyrus, left CA4, left HATA and right tail. In ADNI (836 subjects: 369 with AD, 215 with early mild cognitive impairment (eMCI) and 252 controls), all methods found significant differences between AD and controls, but the proposed longitudinal algorithm detected differences between controls and eMCI, and between eMCI and AD, that the cross-sectional method could not find: left presubiculum, right subiculum, left and right parasubiculum, left and right HATA. Moreover, many of the differences that the cross-sectional method already found

  14. Encoding atlases by randomized classification forests for efficient multi-atlas label propagation.

    Science.gov (United States)

    Zikic, D; Glocker, B; Criminisi, A

    2014-12-01

    We propose a method for multi-atlas label propagation (MALP) based on encoding the individual atlases by randomized classification forests. Most current approaches perform a non-linear registration between all atlases and the target image, followed by a sophisticated fusion scheme. While these approaches can achieve high accuracy, in general they do so at high computational cost. This might negatively affect the scalability to large databases and experimentation. To tackle this issue, we propose to use a small and deep classification forest to encode each atlas individually in reference to an aligned probabilistic atlas, resulting in an Atlas Forest (AF). Our classifier-based encoding differs from current MALP approaches, which represent each point in the atlas either directly as a single image/label value pair, or by a set of corresponding patches. At test time, each AF produces one probabilistic label estimate, and their fusion is done by averaging. Our scheme performs only one registration per target image, achieves good results with a simple fusion scheme, and allows for efficient experimentation. In contrast to standard forest schemes, in which each tree would be trained on all atlases, our approach retains the advantages of the standard MALP framework. The target-specific selection of atlases remains possible, and incorporation of new scans is straightforward without retraining. The evaluation on four different databases shows accuracy within the range of the state of the art at a significantly lower running time. Copyright © 2014 Elsevier B.V. All rights reserved.
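
    The fusion by averaging of the per-Atlas-Forest probabilistic label estimates can be sketched for a single voxel (label names and data layout are illustrative assumptions, not the paper's implementation):

```python
def fuse_atlas_forests(prob_estimates):
    """Average the probabilistic label estimates produced by several atlas
    forests at one voxel, then return the label with maximum average
    probability. `prob_estimates` is a list of {label: probability} dicts."""
    labels = prob_estimates[0].keys()
    avg = {lbl: sum(p[lbl] for p in prob_estimates) / len(prob_estimates)
           for lbl in labels}
    return max(avg, key=avg.get)

# Three atlas forests voting on one voxel:
estimates = [
    {"GM": 0.6, "WM": 0.3, "CSF": 0.1},
    {"GM": 0.2, "WM": 0.7, "CSF": 0.1},
    {"GM": 0.7, "WM": 0.2, "CSF": 0.1},
]
# fuse_atlas_forests(estimates) == "GM"  (average 0.5 vs. 0.4 vs. 0.1)
```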

  15. Automated tissue classification of pediatric brains from magnetic resonance images using age-specific atlases

    Science.gov (United States)

    Metzger, Andrew; Benavides, Amanda; Nopoulos, Peg; Magnotta, Vincent

    2016-03-01

    The goal of this project was to develop two age-appropriate atlases (neonatal and one-year-old) that account for the rapid growth and maturational changes that occur during early development. Tissue maps for this age group were initially created by applying an expectation-maximization (EM) algorithm with an adult atlas to pediatric subjects and manually correcting the resulting tissue maps. The EM algorithm classified each voxel into one of ten possible tissue types, including several subcortical structures. This was followed by a novel level-set segmentation designed to improve differentiation between distal cortical gray matter and white matter. To minimize the required manual corrections, the adult atlas was registered to the pediatric scans using high-dimensional, symmetric image normalization (SyN) registration. The subject images were then mapped to an age-specific atlas space, again using SyN registration, and the resulting transformation was applied to the manually corrected tissue maps. The individual maps were averaged in the age-specific atlas space and blurred to generate the age-appropriate anatomical priors. The resulting anatomical priors were then used by the EM algorithm to re-segment the initial training set as well as an independent testing set. The results from the adult and age-specific anatomical priors were compared to the manually corrected results; the age-appropriate atlas provided superior results compared to the adult atlas. The image analysis pipeline used in this work was built using the open-source software package BRAINSTools.

  16. Monitoring the injured brain: registered, patient specific atlas models to improve accuracy of recovered brain saturation values

    Science.gov (United States)

    Clancy, Michael; Belli, Antonio; Davies, David; Lucas, Samuel J. E.; Su, Zhangjie; Dehghani, Hamid

    2015-07-01

    The subject of superficial contamination and signal origins remains a widely debated topic in the field of near-infrared spectroscopy (NIRS), yet the concept of using the technology to monitor an injured brain in a clinical setting poses additional challenges concerning the quantitative accuracy of recovered parameters. Using high-density diffuse optical tomography probes, quantitatively accurate parameters for different layers (skin, bone and brain) can be recovered from subject-specific reconstruction models. This study assesses the use of registered atlas models for situations where subject-specific models are not available. Data simulated from subject-specific models were reconstructed using the eight registered atlas models, implementing a regional (layered) parameter recovery in NIRFAST. A three-region recovery based on the atlas model yielded recovered brain saturation values that were accurate to within 4.6% (percentage error) of the simulated values, validating the technique. The recovered saturations in the superficial regions were not quantitatively accurate. These findings highlight differences in superficial (skin and bone) layer thickness between the subject and atlas models; this layer-thickness mismatch propagated through the reconstruction process, decreasing parameter accuracy.

  17. Specifying the brain anatomy underlying temporo-parietal junction activations for theory of mind: A review using probabilistic atlases from different imaging modalities.

    Science.gov (United States)

    Schurz, Matthias; Tholen, Matthias G; Perner, Josef; Mars, Rogier B; Sallet, Jerome

    2017-09-01

    In this quantitative review, we specified the anatomical basis of brain activity reported in the Temporo-Parietal Junction (TPJ) in Theory of Mind (ToM) research. Using probabilistic brain atlases, we labeled TPJ peak coordinates reported in the literature. This was carried out for four different atlas modalities: (i) gyral parcellation, (ii) sulco-gyral parcellation, (iii) cytoarchitectonic parcellation and (iv) connectivity-based parcellation. In addition, our review distinguished between two ToM task types (false belief and social animations) and a nonsocial task (attention reorienting). We estimated the mean probabilities of activation for each atlas label, and found that for all three task types part of the TPJ activations fell into the same areas: (i) Angular Gyrus (AG) and Lateral Occipital Cortex (LOC) in terms of a gyral atlas, (ii) AG and Superior Temporal Sulcus (STS) in terms of a sulco-gyral atlas, (iii) areas PGa and PGp in terms of cytoarchitecture and (iv) area TPJp in terms of a connectivity-based parcellation atlas. Besides these commonalities, we also found that individual task types showed preferential activation for particular labels. The main findings for the right hemisphere were preferential activation for false belief tasks in AG/PGa, and in Supramarginal Gyrus (SMG)/PFm for attention reorienting. Social animations showed the strongest selective activation in the left hemisphere, specifically in the left Middle Temporal Gyrus (MTG). We discuss how our results (i.e., the identified atlas structures) can provide a new reference for describing future findings, with the aim of integrating the different labels and terminologies used for studying brain activity around the TPJ. Hum Brain Mapp 38:4788-4805, 2017. © 2017 Wiley Periodicals, Inc.
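
    The core labeling operation, looking up a reported peak coordinate in each probabilistic atlas and taking the most probable structure, can be sketched as follows (the structure names, coordinate, and probabilities are invented for illustration):

```python
def label_peak(peak, atlas):
    """Label an activation peak with the atlas structure of highest
    probability at that coordinate. `atlas` maps structure name ->
    {coordinate: probability}."""
    probs = {name: pmap.get(peak, 0.0) for name, pmap in atlas.items()}
    best = max(probs, key=probs.get)
    return best, probs[best]

# Toy probabilistic parcellation around one MNI-like coordinate:
atlas = {
    "AG":  {(54, -54, 24): 0.62},
    "SMG": {(54, -54, 24): 0.25},
}
# label_peak((54, -54, 24), atlas) == ("AG", 0.62)
```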

  18. Specifying the brain anatomy underlying temporo-parietal junction activations for theory of mind: A review using probabilistic atlases from different imaging modalities

    NARCIS (Netherlands)

    Schurz, M.; Tholen, M.G.; Perner, J.; Mars, R.B.; Sallet, J.

    2017-01-01

    In this quantitative review, we specified the anatomical basis of brain activity reported in the Temporo-Parietal Junction (TPJ) in Theory of Mind (ToM) research. Using probabilistic brain atlases, we labeled TPJ peak coordinates reported in the literature. This was carried out for four different

  19. TU-CD-BRA-05: Atlas Selection for Multi-Atlas-Based Image Segmentation Using Surrogate Modeling

    International Nuclear Information System (INIS)

    Zhao, T; Ruan, D

    2015-01-01

    Purpose: The growing size and heterogeneity of training atlases necessitate sophisticated schemes to identify only the most relevant atlases for a specific multi-atlas-based image segmentation problem. This study aims to develop a model to infer the inaccessible oracle geometric relevance metric from surrogate image similarity metrics and, based on such a model, provide guidance for atlas selection in multi-atlas-based image segmentation. Methods: We relate the oracle geometric relevance metric in label space to the surrogate metric in image space by a monotonically non-decreasing function with additive random perturbations. Subsequently, a surrogate's ability to prognosticate the oracle order for atlas subset selection is quantified probabilistically. Finally, important insights and guidance are provided for the design of the fusion set size, balancing the competing demands to include the most relevant atlases and to exclude the most irrelevant ones. A systematic solution is derived based on an optimization framework. Model verification and performance assessment were performed on clinical prostate MR images. Results: The proposed surrogate model was exemplified by a linear map with normally distributed perturbation, and verified with several commonly used surrogates, including MSD, NCC and (N)MI. The derived behaviors of different surrogates in atlas selection, and their corresponding performance in the ultimate label estimate, were validated. The performance of NCC and (N)MI was similarly superior to MSD, with a 10% higher atlas selection probability and a segmentation performance increase in DSC of 0.10, with first and third quartiles of (0.83, 0.89), compared to (0.81, 0.89). The derived optimal fusion set size, valued at 7/8/8/7 for MSD/NCC/MI/NMI, agreed well with the appropriate range [4, 9] from empirical observation. Conclusion: This work has developed an efficacious probabilistic model to characterize the image-based surrogate metric on atlas selection
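
    The surrogate model described in this abstract can be illustrated with a small toy simulation (not the authors' code; the monotone map, noise levels and atlas counts below are invented): the oracle relevance is mapped to a surrogate similarity by a monotone function plus Gaussian perturbation, and we estimate the probability that the surrogate ranks the oracle-best atlas first.

```python
# Toy sketch of the surrogate-vs-oracle model: oracle relevance r (label space)
# relates to surrogate similarity s (image space) by a monotone map with
# additive Gaussian noise. A noisier surrogate should select the oracle-best
# atlas less often, mirroring the MSD-vs-NCC comparison in the abstract.
import numpy as np

rng = np.random.default_rng(0)

def selection_probability(n_atlases=20, noise_sd=0.05, n_trials=2000):
    """Estimate P(surrogate's top-ranked atlas == oracle's top-ranked atlas)."""
    hits = 0
    for _ in range(n_trials):
        r = rng.uniform(0.0, 1.0, n_atlases)                 # oracle relevance
        s = 2.0 * r + rng.normal(0.0, noise_sd, n_atlases)   # monotone map + noise
        hits += int(np.argmax(s) == np.argmax(r))
    return hits / n_trials

p_good = selection_probability(noise_sd=0.02)   # low-noise surrogate
p_bad = selection_probability(noise_sd=0.5)     # high-noise surrogate
```

    Under this sketch, the lower-noise surrogate recovers the oracle ordering far more reliably, which is the mechanism behind the reported differences in atlas selection probability.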

  20. Site-specific Probabilistic Analysis of DCGLs Using RESRAD Code

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jeongju; Yoon, Suk Bon; Sohn, Wook [KHNP CRI, Daejeon (Korea, Republic of)]

    2016-10-15

    In general, DCGLs can be conservative (screening DCGLs) if they do not take into account site-specific factors. Use of such conservative DCGLs can lead to additional remediation that would not be required if the effort were made to develop site-specific DCGLs. Therefore, the objective of this work is to provide an example of the use of the RESRAD 6.0 probabilistic (site-specific) dose analysis for comparison with the screening DCGL. Site release regulations state that a site will be considered acceptable for unrestricted use if the residual radioactivity that is distinguishable from background radiation results in a Total Effective Dose Equivalent (TEDE) to an average member of the critical group of less than the site release criteria, for example 0.25 mSv per year in the U.S. Utilities use computer dose modeling codes to establish an acceptable level of contamination, the derived concentration guideline level (DCGL), that will meet this regulatory limit. Since the DCGL value is the principal measure of residual radioactivity, it is critical to understand the technical basis of these dose modeling codes. The objective of this work was to provide an example of nuclear power plant decommissioning dose analysis in a probabilistic analysis framework. The focus was on the demonstration of regulatory compliance for surface soil contamination using the RESRAD 6.0 code. Both the screening and site-specific probabilistic dose analysis methodologies were examined. Example analyses performed with the screening probabilistic dose analysis confirmed the conservatism of the NRC screening values and indicated the effectiveness of probabilistic dose analysis in reducing the conservatism in DCGL derivation.

  1. ATLAS: Applications experiences and further developments

    International Nuclear Information System (INIS)

    Beraha, D.; Pointner, W.; Voggenberger, T.

    1999-01-01

    An overview of the plant analyzer ATLAS is given, describing its configuration, the process models and the supplementary modules which enhance the functionality of ATLAS for a range of applications in reactor safety analysis. These modules include the Reliability Advisory System, which supports the user with information from probabilistic safety analysis, the Procedure Analysis module for the development and testing of emergency operating procedures, and a diagnostic system for steam-generator tube rupture. The development of plant-specific analysers for various power plants is described, and the associated user experience is related. Finally, the intended directions of further development are discussed, centering on a tracking simulator, the migration of the visualisation system to Windows NT, and the construction of the Analysis Center as a multimedia environment for the operation of ATLAS. (author)

  2. Predicting BCI subject performance using probabilistic spatio-temporal filters.

    Directory of Open Access Journals (Sweden)

    Heung-Il Suk

    Recently, spatio-temporal filtering to enhance decoding for Brain-Computer Interfacing (BCI) has become increasingly popular. In this work, we discuss a novel, fully Bayesian (and thereby probabilistic) framework, called Bayesian Spatio-Spectral Filter Optimization (BSSFO), and apply it to a large data set of 80 non-invasive EEG-based BCI experiments. Across the full frequency range, the BSSFO framework allows analysis of which spatio-spectral parameters are common and which differ across the subject population. As expected, large variability of brain rhythms is observed between subjects. We clustered subjects according to similarities in their spectral characteristics from the BSSFO model, which is found to reflect their BCI performance well. In BCI, a considerable percentage of subjects is unable to use a BCI for communication, owing to an inability to modulate their brain rhythms, a phenomenon sometimes denoted as BCI illiteracy or BCI inability. Predicting an individual subject's performance before the actual, time-consuming BCI experiment enhances the usage of BCIs, e.g., by detecting users with BCI inability. This work additionally contributes by using the novel BSSFO method to predict BCI performance from only 2 minutes and 3 channels of resting-state EEG data recorded before the actual BCI experiment. Specifically, by grouping the individual frequency characteristics we classified subjects into 'prototypes' (such as μ- or β-rhythm type subjects, or users unable to communicate with a BCI), and by further building a linear regression model based on this grouping we could predict subjects' performance with a maximum correlation coefficient of 0.581 with the performance later seen in the actual BCI session.
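
    The prediction step described in this abstract (group subjects by resting-state spectral features, then regress performance on those features) can be sketched with synthetic data. This is not the BSSFO code; the feature (μ-rhythm power), the linear relationship and the noise level are all invented for illustration.

```python
# Toy sketch: a resting-state spectral feature predicts later BCI performance.
# Subjects are grouped into crude "prototypes" by thresholding the feature,
# and a least-squares linear fit predicts performance from the feature.
import numpy as np

rng = np.random.default_rng(5)

n = 80                                             # subjects, as in the study
mu_power = rng.uniform(0.0, 1.0, n)                # invented resting-state feature
performance = 0.5 + 0.4 * mu_power + rng.normal(0.0, 0.1, n)

# Crude prototype grouping: high- vs. low-feature subjects.
groups = (mu_power > 0.5).astype(int)
high_mean = performance[groups == 1].mean()
low_mean = performance[groups == 0].mean()

# Linear regression of performance on the feature.
A = np.stack([mu_power, np.ones(n)], axis=1)
coef, *_ = np.linalg.lstsq(A, performance, rcond=None)
r = float(np.corrcoef(A @ coef, performance)[0, 1])
```

    In this synthetic setting the high-feature group outperforms the low-feature group and the fitted model correlates with performance, analogous in spirit (though not in magnitude) to the 0.581 correlation reported above.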

  3. A probabilistic atlas of the basal ganglia using 7 T MRI

    NARCIS (Netherlands)

    Keuken, M.C.; Forstmann, B.U.

    2015-01-01

    A common localization procedure in functional imaging studies includes the overlay of statistical parametric functional magnetic resonance imaging (fMRI) maps or coordinates with neuroanatomical atlases in standard space, e.g., MNI-space. This procedure allows the identification of specific brain

  4. Spatially adapted augmentation of age-specific atlas-based segmentation using patch-based priors

    Science.gov (United States)

    Liu, Mengyuan; Seshamani, Sharmishtaa; Harrylock, Lisa; Kitsch, Averi; Miller, Steven; Chau, Van; Poskitt, Kenneth; Rousseau, Francois; Studholme, Colin

    2014-03-01

    One of the most common approaches to MRI brain tissue segmentation is to employ an atlas prior to initialize an Expectation-Maximization (EM) image labeling scheme using a statistical model of MRI intensities. This prior is commonly derived from a set of manually segmented training data from the population of interest. However, in cases where subject anatomy varies significantly from the prior anatomical average model (for example in the case where extreme developmental abnormalities or brain injuries occur), the prior tissue map does not provide adequate information about the observed MRI intensities to ensure the EM algorithm converges to an anatomically accurate labeling of the MRI. In this paper, we present a novel approach for automatic segmentation of such cases. This approach augments the atlas-based EM segmentation by exploring methods to build a hybrid tissue segmentation scheme that seeks to learn where an atlas prior fails (due to inadequate representation of anatomical variation in the statistical atlas) and utilize an alternative prior derived from a patch-driven search of the atlas data. We describe a framework for incorporating this patch-based augmentation of EM (PBAEM) into a 4D age-specific atlas-based segmentation of developing brain anatomy. The proposed approach was evaluated on a set of MRI brain scans of premature neonates with ages ranging from 27.29 to 46.43 gestational weeks (GWs). Results indicated superior performance compared to the conventional atlas-based segmentation method, providing improved segmentation accuracy for gray matter, white matter, ventricles and sulcal CSF regions.
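
    The core mechanism of atlas-prior EM labeling described above can be shown in one dimension (an illustrative sketch, not the PBAEM implementation; the intensities, noise levels and the simulated atlas prior are invented): the E-step posterior combines a per-voxel spatial prior with a Gaussian intensity likelihood, and the M-step re-estimates class means.

```python
# Minimal atlas-prior EM on a synthetic 1-D "image" with two tissue classes.
import numpy as np

rng = np.random.default_rng(1)

# Ground truth: two classes with different mean intensities plus noise.
true_label = (rng.uniform(size=500) < 0.5).astype(int)
intensity = np.where(true_label == 0, 30.0, 70.0) + rng.normal(0, 5.0, 500)

# Simulated atlas prior: per-voxel class-1 probability, weakly aligned with
# the true anatomy (as an atlas registered to the subject would be).
prior1 = np.clip(0.5 + 0.3 * (2 * true_label - 1) + rng.normal(0, 0.1, 500), 0.05, 0.95)
prior = np.stack([1 - prior1, prior1], axis=1)          # shape (voxels, classes)

mu, sigma = np.array([40.0, 60.0]), 10.0                # rough initial means
for _ in range(20):
    # E-step: posterior ∝ atlas prior × Gaussian intensity likelihood
    lik = np.exp(-0.5 * ((intensity[:, None] - mu) / sigma) ** 2)
    post = prior * lik
    post /= post.sum(axis=1, keepdims=True)
    # M-step: posterior-weighted mean intensity per class
    mu = (post * intensity[:, None]).sum(axis=0) / post.sum(axis=0)

labels = post.argmax(axis=1)
accuracy = float((labels == true_label).mean())
```

    When the subject's anatomy matches the prior, this scheme converges to accurate class means; the paper's point is precisely that it can fail when the prior misrepresents the subject, motivating the patch-based augmentation.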

  5. Probabilistic anatomical labeling of brain structures using statistical probabilistic anatomical maps

    International Nuclear Information System (INIS)

    Kim, Jin Su; Lee, Dong Soo; Lee, Byung Il; Lee, Jae Sung; Shin, Hee Won; Chung, June Key; Lee, Myung Chul

    2002-01-01

    The use of the statistical parametric mapping (SPM) program has increased for the analysis of brain PET and SPECT images. The Montreal Neurological Institute (MNI) coordinate system is used in the SPM program as a standard anatomical framework. While most researchers consult the Talairach atlas to report the localization of the activations detected in the SPM program, there is significant disparity between the MNI templates and the Talairach atlas. That disparity between Talairach and MNI coordinates makes the interpretation of SPM results time-consuming, subjective and inaccurate. The purpose of this study was to develop a program to provide objective anatomical information for each x-y-z position in the ICBM coordinate system. The program was designed to provide the anatomical information for a given x-y-z position in MNI coordinates based on the statistical probabilistic anatomical map (SPAM) images of the ICBM. When an x-y-z position is given to the program, the names of the anatomical structures with non-zero probability, and the probabilities that the given position belongs to those structures, are tabulated. The program was coded in IDL and Java for easy transplantation to any operating system or platform. The utility of this program was shown by comparing its results to those of the SPM program. A preliminary validation study was performed by applying the program to the analysis of a PET brain activation study of human memory, in which the anatomical information on the activated areas was previously known. Real-time retrieval of probabilistic information with 1 mm spatial resolution was achieved using the program. The validation study showed the relevance of the program: the probability that the activated area for memory belonged to the hippocampal formation was more than 80%. These programs will be useful for interpreting the results of image analyses performed in MNI coordinates, as done in the SPM program.
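
    The lookup the abstract describes reduces to a simple query over a stack of probability volumes. The sketch below is illustrative only (tiny made-up 3×3×3 maps stand in for the ICBM-derived SPAM volumes, and the structure names and probabilities are invented): given an x-y-z voxel index, it tabulates every structure with non-zero probability there.

```python
# Minimal SPAM-style lookup: one probability volume per structure,
# queried at a voxel to tabulate (structure, probability) pairs.
import numpy as np

spam = {
    "hippocampal formation": np.zeros((3, 3, 3)),
    "parahippocampal gyrus": np.zeros((3, 3, 3)),
}
# Invented probabilities at a single voxel, for illustration.
spam["hippocampal formation"][1, 1, 1] = 0.85
spam["parahippocampal gyrus"][1, 1, 1] = 0.10

def label_position(x, y, z, maps):
    """Return structures with non-zero probability at (x, y, z), sorted descending."""
    hits = [(name, float(vol[x, y, z])) for name, vol in maps.items() if vol[x, y, z] > 0]
    return sorted(hits, key=lambda t: -t[1])

result = label_position(1, 1, 1, spam)
```

    A real implementation would additionally convert millimetre MNI coordinates to voxel indices via the volume's affine transform; that step is omitted here.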

  6. Probabilistic maps of the white matter tracts with known associated functions on the neonatal brain atlas: Application to evaluate longitudinal developmental trajectories in term-born and preterm-born infants.

    Science.gov (United States)

    Akazawa, Kentaro; Chang, Linda; Yamakawa, Robyn; Hayama, Sara; Buchthal, Steven; Alicata, Daniel; Andres, Tamara; Castillo, Deborrah; Oishi, Kumiko; Skranes, Jon; Ernst, Thomas; Oishi, Kenichi

    2016-03-01

    Diffusion tensor imaging (DTI) has been widely used to investigate the development of the neonatal and infant brain, and deviations related to various diseases or medical conditions like preterm birth. In this study, we created a probabilistic map of fiber pathways with known associated functions, on a published neonatal multimodal atlas. The pathways-of-interest include the superficial white matter (SWM) fibers just beneath the specific cytoarchitectonically defined cortical areas, which were difficult to evaluate with existing DTI analysis methods. The Jülich cytoarchitectonic atlas was applied to define cortical areas related to specific brain functions, and the Dynamic Programming (DP) method was applied to delineate the white matter pathways traversing through the SWM. Probabilistic maps were created for pathways related to motor, somatosensory, auditory, visual, and limbic functions, as well as major white matter tracts, such as the corpus callosum, the inferior fronto-occipital fasciculus, and the middle cerebellar peduncle, by delineating these structures in eleven healthy term-born neonates. In order to characterize maturation-related changes in diffusivity measures of these pathways, the probabilistic maps were then applied to DTIs of 49 healthy infants who were longitudinally scanned at three time-points, approximately five weeks apart. First, we investigated the normal developmental pattern based on 19 term-born infants. Next, we analyzed 30 preterm-born infants to identify developmental patterns related to preterm birth. Last, we investigated the difference in diffusion measures between these groups to evaluate the effects of preterm birth on the development of these functional pathways. Term-born and preterm-born infants both demonstrated a time-dependent decrease in diffusivity, indicating postnatal maturation in these pathways, with laterality seen in the corticospinal tract and the optic radiation. The comparison between term- and preterm

  7. Mapping visual cortex in monkeys and humans using surface-based atlases

    Science.gov (United States)

    Van Essen, D. C.; Lewis, J. W.; Drury, H. A.; Hadjikhani, N.; Tootell, R. B.; Bakircioglu, M.; Miller, M. I.

    2001-01-01

    We have used surface-based atlases of the cerebral cortex to analyze the functional organization of visual cortex in humans and macaque monkeys. The macaque atlas contains multiple partitioning schemes for visual cortex, including a probabilistic atlas of visual areas derived from a recent architectonic study, plus summary schemes that reflect a combination of physiological and anatomical evidence. The human atlas includes a probabilistic map of eight topographically organized visual areas recently mapped using functional MRI. To facilitate comparisons between species, we used surface-based warping to bring functional and geographic landmarks on the macaque map into register with corresponding landmarks on the human map. The results suggest that extrastriate visual cortex outside the known topographically organized areas is dramatically expanded in human compared to macaque cortex, particularly in the parietal lobe.

  8. On the accuracy and reproducibility of a novel probabilistic atlas-based generation for calculation of head attenuation maps on integrated PET/MR scanners.

    Science.gov (United States)

    Chen, Kevin T; Izquierdo-Garcia, David; Poynton, Clare B; Chonde, Daniel B; Catana, Ciprian

    2017-03-01

    To propose an MR-based method for generating continuous-valued head attenuation maps and to assess its accuracy and reproducibility. Demonstrating that novel MR-based photon attenuation correction methods are both accurate and reproducible is essential prior to using them routinely in research and clinical studies on integrated PET/MR scanners. Continuous-valued linear attenuation coefficient maps ("μ-maps") were generated by combining atlases that provided the prior probability of voxel positions belonging to a certain tissue class (air, soft tissue, or bone) and an MR intensity-based likelihood classifier to produce posterior probability maps of tissue classes. These probabilities were used as weights to generate the μ-maps. The accuracy of this probabilistic atlas-based continuous-valued μ-map ("PAC-map") generation method was assessed by calculating the voxel-wise absolute relative change (RC) between the MR-based and scaled CT-based attenuation-corrected PET images. To assess reproducibility, we performed pair-wise comparisons of the RC values obtained from the PET images reconstructed using the μ-maps generated from the data acquired at three time points. The proposed method produced continuous-valued μ-maps that qualitatively reflected the variable anatomy in patients with brain tumor and agreed well with the scaled CT-based μ-maps. The absolute RC comparing the resulting PET volumes was 1.76 ± 2.33 %, quantitatively demonstrating that the method is accurate. Additionally, we also showed that the method is highly reproducible, the mean RC value for the PET images reconstructed using the μ-maps obtained at the three visits being 0.65 ± 0.95 %. Accurate and highly reproducible continuous-valued head μ-maps can be generated from MR data using a probabilistic atlas-based approach.
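
    The PAC-map construction described above (posterior tissue probabilities used as weights on class attenuation values) can be sketched in a few lines. This is not the authors' implementation: the per-voxel priors and likelihoods are invented, and the linear attenuation coefficients are approximate illustrative values.

```python
# Sketch: continuous-valued μ-map from posterior tissue probabilities.
# posterior ∝ atlas prior × MR-intensity likelihood; μ = Σ_k P(class k) · μ_k.
import numpy as np

MU = np.array([0.0, 0.096, 0.151])   # air, soft tissue, bone (cm^-1, approximate)

def pac_map(prior, likelihood):
    """prior, likelihood: arrays of shape (voxels, 3); returns μ per voxel."""
    post = prior * likelihood
    post /= post.sum(axis=1, keepdims=True)   # posterior tissue probabilities
    return post @ MU                           # probability-weighted μ

# Two invented voxels: one soft-tissue-dominated, one bone-dominated.
prior = np.array([[0.10, 0.80, 0.10],
                  [0.05, 0.15, 0.80]])
likelihood = np.array([[0.2, 0.7, 0.1],
                       [0.1, 0.2, 0.7]])
mu_map = pac_map(prior, likelihood)
```

    Because the weights are probabilities rather than hard class assignments, the resulting μ values vary continuously between the class coefficients, which is what distinguishes this approach from segmentation-based attenuation correction.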

  9. On the accuracy and reproducibility of a novel probabilistic atlas-based generation for calculation of head attenuation maps on integrated PET/MR scanners

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Kevin T. [Massachusetts General Hospital and Harvard Medical School, Athinoula A. Martinos Center for Biomedical Imaging, Department of Radiology, Charlestown, MA (United States); Massachusetts Institute of Technology, Division of Health Sciences and Technology, Cambridge, MA (United States); Izquierdo-Garcia, David; Catana, Ciprian [Massachusetts General Hospital and Harvard Medical School, Athinoula A. Martinos Center for Biomedical Imaging, Department of Radiology, Charlestown, MA (United States); Poynton, Clare B. [Massachusetts General Hospital and Harvard Medical School, Athinoula A. Martinos Center for Biomedical Imaging, Department of Radiology, Charlestown, MA (United States); Massachusetts General Hospital, Department of Psychiatry, Boston, MA (United States); University of California, San Francisco, Department of Radiology and Biomedical Imaging, San Francisco, CA (United States); Chonde, Daniel B. [Massachusetts General Hospital and Harvard Medical School, Athinoula A. Martinos Center for Biomedical Imaging, Department of Radiology, Charlestown, MA (United States); Harvard University, Program in Biophysics, Cambridge, MA (United States)

    2017-03-15

    To propose an MR-based method for generating continuous-valued head attenuation maps and to assess its accuracy and reproducibility. Demonstrating that novel MR-based photon attenuation correction methods are both accurate and reproducible is essential prior to using them routinely in research and clinical studies on integrated PET/MR scanners. Continuous-valued linear attenuation coefficient maps ("μ-maps") were generated by combining atlases that provided the prior probability of voxel positions belonging to a certain tissue class (air, soft tissue, or bone) and an MR intensity-based likelihood classifier to produce posterior probability maps of tissue classes. These probabilities were used as weights to generate the μ-maps. The accuracy of this probabilistic atlas-based continuous-valued μ-map ("PAC-map") generation method was assessed by calculating the voxel-wise absolute relative change (RC) between the MR-based and scaled CT-based attenuation-corrected PET images. To assess reproducibility, we performed pair-wise comparisons of the RC values obtained from the PET images reconstructed using the μ-maps generated from the data acquired at three time points. The proposed method produced continuous-valued μ-maps that qualitatively reflected the variable anatomy in patients with brain tumor and agreed well with the scaled CT-based μ-maps. The absolute RC comparing the resulting PET volumes was 1.76 ± 2.33 %, quantitatively demonstrating that the method is accurate. Additionally, we also showed that the method is highly reproducible, the mean RC value for the PET images reconstructed using the μ-maps obtained at the three visits being 0.65 ± 0.95 %. Accurate and highly reproducible continuous-valued head μ-maps can be generated from MR data using a probabilistic atlas-based approach. (orig.)

  10. Windows on the brain: the emerging role of atlases and databases in neuroscience

    Science.gov (United States)

    Van Essen, David C.; VanEssen, D. C. (Principal Investigator)

    2002-01-01

    Brain atlases and associated databases have great potential as gateways for navigating, accessing, and visualizing a wide range of neuroscientific data. Recent progress towards realizing this potential includes the establishment of probabilistic atlases, surface-based atlases and associated databases, combined with improvements in visualization capabilities and internet access.

  11. Optimisation of technical specifications using probabilistic methods

    International Nuclear Information System (INIS)

    Ericsson, G.; Knochenhauer, M.; Hultqvist, G.

    1986-01-01

    During the last few years the development of methods for modifying and optimising nuclear power plant Technical Specifications (TS) for plant operations has received increased attention. Probabilistic methods in general, and the plant and system models of probabilistic safety assessment (PSA) in particular, seem to provide the most powerful tools for optimisation. This paper first gives some general comments on optimisation, identifying important parameters, and then describes recent Swedish experience from the use of nuclear power plant PSA models and results for TS optimisation

  12. A novel approach of fMRI-guided tractography analysis within a group: construction of an fMRI-guided tractographic atlas.

    Science.gov (United States)

    Preti, Maria Giulia; Makris, Nikos; Laganà, Maria Marcella; Papadimitriou, George; Baglio, Francesca; Griffanti, Ludovica; Nemni, Raffaello; Cecconi, Pietro; Westin, Carl-Fredrik; Baselli, Giuseppe

    2012-01-01

    Diffusion Tensor Imaging (DTI) tractography and functional Magnetic Resonance Imaging (fMRI) investigate two complementary aspects of brain networks: white matter (WM) anatomical connectivity and gray matter (GM) function. However, integration standards have yet to be defined; namely, individual fMRI-driven tractography is usually applied, and only a few studies address group analysis. This work proposes an efficient method of fMRI-driven tractography at the group level through the creation of a tractographic atlas starting from the GM areas activated by a verbal fluency task in 11 healthy subjects. The individual tracts were registered to MNI space. Selection ROIs derived by GM masking and dilation of group activated areas were applied to obtain the fMRI-driven subsets within tracts. An atlas of the tracts recruited among the population was obtained by selecting for each subject the fMRI-guided tracts passing through the high-probability voxels (the voxels recruited by 90% of the subjects) and merging them together. The reliability of this approach was assessed by comparing it with the probabilistic atlas previously introduced in the literature. The introduced method allowed successful reconstruction of the activated tracts, which comprised the corpus callosum, the left cingulum and arcuate, a small portion of the right arcuate, both cortico-spinal tracts and the inferior fronto-occipital fasciculi. Moreover, it proved to give results concordant with the previously introduced probabilistic approach, additionally allowing reconstruction of the 3D trajectories of the activated fibers, which appear particularly helpful in the detection of WM connections.

  13. Atlas C++ Coding Standard Specification

    CERN Document Server

    Albrand, S; Barberis, D; Bosman, M; Jones, B; Stavrianakou, M; Arnault, C; Candlin, D; Candlin, R; Franck, E; Hansl-Kozanecka, Traudl; Malon, D; Qian, S; Quarrie, D; Schaffer, R D

    2001-01-01

    This document defines the ATLAS C++ coding standard, that should be adhered to when writing C++ code. It has been adapted from the original "PST Coding Standard" document (http://pst.cern.ch/HandBookWorkBook/Handbook/Programming/programming.html) CERN-UCO/1999/207. The "ATLAS standard" comprises modifications, further justification and examples for some of the rules in the original PST document. All changes were discussed in the ATLAS Offline Software Quality Control Group and feedback from the collaboration was taken into account in the "current" version.

  14. What Data to Co-register for Computing Atlases

    Science.gov (United States)

    Yeo, B.T. Thomas; Sabuncu, Mert; Mohlberg, Hartmut; Amunts, Katrin; Zilles, Karl; Golland, Polina; Fischl, Bruce

    2015-01-01

    We argue that registration should be thought of as a means to an end, and not as a goal by itself. In particular, we consider the problem of predicting the locations of hidden labels of a test image using observable features, given a training set with both the hidden labels and observable features. For example, the hidden labels could be segmentation labels or activation regions in fMRI, while the observable features could be sulcal geometry or MR intensity. We analyze a probabilistic framework for computing an optimal atlas, and the subsequent registration of a new subject using only the observable features to optimize the hidden label alignment to the training set. We compare two approaches for co-registering training images for the atlas construction: the traditional approach of only using observable features and a novel approach of only using hidden labels. We argue that the alternative approach is superior particularly when the relationship between the hidden labels and observable features is complex and unknown. As an application, we consider the task of registering cortical folds to optimize Brodmann area localization. We show that the alignment of the Brodmann areas improves by up to 25% when using the alternative atlas compared with the traditional atlas. To the best of our knowledge, these are the most accurate Brodmann area localization results (achieved via cortical fold registration) reported to date. PMID:26082678

  15. Standardized approach for developing probabilistic exposure factor distributions

    Energy Technology Data Exchange (ETDEWEB)

    Maddalena, Randy L.; McKone, Thomas E.; Sohn, Michael D.

    2003-03-01

    The effectiveness of a probabilistic risk assessment (PRA) depends critically on the quality of input information that is available to the risk assessor and specifically on the probabilistic exposure factor distributions that are developed and used in the exposure and risk models. Deriving probabilistic distributions for model inputs can be time consuming and subjective. The absence of a standard approach for developing these distributions can result in PRAs that are inconsistent and difficult to review by regulatory agencies. We present an approach that reduces subjectivity in the distribution development process without limiting the flexibility needed to prepare relevant PRAs. The approach requires two steps. First, we analyze data pooled at a population scale to (1) identify the most robust demographic variables within the population for a given exposure factor, (2) partition the population data into subsets based on these variables, and (3) construct archetypal distributions for each subpopulation. Second, we sample from these archetypal distributions according to site- or scenario-specific conditions to simulate exposure factor values and use these values to construct the scenario-specific input distribution. It is envisaged that the archetypal distributions from step 1 will be generally applicable so risk assessors will not have to repeatedly collect and analyze raw data for each new assessment. We demonstrate the approach for two commonly used exposure factors, body weight (BW) and exposure duration (ED), using data for the U.S. population. For these factors we provide a first set of subpopulation-based archetypal distributions along with methodology for using these distributions to construct relevant scenario-specific probabilistic exposure factor distributions.
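
    The two-step procedure above can be sketched with invented numbers (this is not the authors' data: the age-group partition, the lognormal medians and the geometric standard deviations are illustrative stand-ins for the archetypal distributions of step 1).

```python
# Step 1 (assumed done): archetypal lognormal body-weight distributions,
# one per demographic subpopulation: (median in kg, geometric std. dev.).
# Step 2: sample from the archetypes according to a scenario-specific mix.
import numpy as np

rng = np.random.default_rng(2)

archetypes = {"child": (25.0, 1.25), "adult": (75.0, 1.20)}   # invented values

def scenario_samples(mix, n=10000):
    """mix: {group: fraction}; returns scenario-specific BW samples."""
    out = []
    for group, frac in mix.items():
        median, gsd = archetypes[group]
        k = int(round(n * frac))
        out.append(rng.lognormal(np.log(median), np.log(gsd), k))
    return np.concatenate(out)

# Example scenario: a school setting dominated by children.
bw = scenario_samples({"child": 0.8, "adult": 0.2})
```

    The resulting mixture is the scenario-specific input distribution the abstract describes; a risk assessor would feed these samples directly into the exposure model in place of a single point estimate.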

  16. Are subject-specific musculoskeletal models robust to the uncertainties in parameter identification?

    Directory of Open Access Journals (Sweden)

    Giordano Valente

    Subject-specific musculoskeletal modeling can be applied to study musculoskeletal disorders, allowing the inclusion of personalized anatomy and properties. Independent of the tools used for model creation, there are unavoidable uncertainties associated with parameter identification, whose effect on model predictions is still not fully understood. The aim of the present study was to analyze the sensitivity of subject-specific model predictions (i.e., joint angles, joint moments, muscle and joint contact forces) during walking to the uncertainties in the identification of body landmark positions, maximum muscle tension and musculotendon geometry. To this aim, we created an MRI-based musculoskeletal model of the lower limbs, defined as a 7-segment, 10-degree-of-freedom articulated linkage, actuated by 84 musculotendon units. We then performed a Monte-Carlo probabilistic analysis, perturbing model parameters according to their uncertainty, and solving a typical inverse dynamics and static optimization problem using 500 models that included the different sets of perturbed variable values. Model creation and gait simulations were performed using freely available software that we developed to standardize the process of model creation, integrate with OpenSim and create probabilistic simulations of movement. The uncertainties in input variables had a moderate effect on model predictions, as muscle and joint contact forces showed a maximum standard deviation of 0.3 times body-weight and a maximum range of 2.1 times body-weight. In addition, the output variables significantly correlated with few input variables (up to 7 out of 312) across the gait cycle, including the geometry definition of larger muscles and the maximum muscle tension in limited gait portions. Although we found the subject-specific models not markedly sensitive to parameter identification, researchers should be aware of the model precision in relation to the intended application. In fact, force
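
    The Monte-Carlo probabilistic analysis described above can be illustrated with a deliberately tiny stand-in model (not the authors' OpenSim pipeline; the single-muscle force model, the nominal values and the uncertainty levels are all invented): perturb each parameter according to its uncertainty, re-evaluate the model for every sample, then summarize the output spread and input-output correlations.

```python
# Toy Monte-Carlo sensitivity analysis: 500 perturbed parameter sets
# (as in the study) propagated through a stand-in contact-force model.
import numpy as np

rng = np.random.default_rng(3)

def contact_force(moment_arm, max_tension, activation=0.5):
    """Stand-in model: force from one muscle (arbitrary units, invented)."""
    return activation * max_tension * moment_arm / 0.05

# Perturb parameters around nominal values with ~10% uncertainty.
arm = rng.normal(0.05, 0.005, 500)       # moment arm (m)
tension = rng.normal(1.0, 0.1, 500)      # max muscle tension (normalized)

forces = contact_force(arm, tension)
spread = float(forces.std())                       # output uncertainty
corr_arm = float(np.corrcoef(arm, forces)[0, 1])   # input-output correlation
```

    Ranking inputs by such correlations is how the study identified which of the 312 perturbed variables (e.g., the geometry of larger muscles) actually drive the output uncertainty.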

  17. Optimization of technical specifications by use of probabilistic methods

    International Nuclear Information System (INIS)

    Laakso, K.

    1990-01-01

    The Technical Specifications of a nuclear power plant specify the limits for plant operation from the safety point of view. These operational safety rules were originally defined on the basis of deterministic analyses and engineering judgement. As experience has accumulated, it has proved necessary to consider problems and make specific modifications to these rules. Developments in probabilistic safety assessment have provided a new tool to analyse, present and compare the risk effects of proposed rule modifications. The main areas covered in the project are operational decisions in failure situations, preventive maintenance during power operation and surveillance tests of standby safety systems. (author)

  18. Probabilistic Networks

    DEFF Research Database (Denmark)

    Jensen, Finn Verner; Lauritzen, Steffen Lilholt

    2001-01-01

    This article describes the basic ideas and algorithms behind specification and inference in probabilistic networks based on directed acyclic graphs, undirected graphs, and chain graphs.

  19. Probabilistic atlas-guided eigen-organ method for simultaneous bounding box estimation of multiple organs in volumetric CT images

    International Nuclear Information System (INIS)

    Yao, Cong; Wada, Takashige; Shimizu, Akinobu; Kobatake, Hidefumi; Nawano, Shigeru

    2006-01-01

    We propose an approach for the simultaneous bounding box estimation of multiple organs in volumetric CT images. Local eigen-organ spaces are constructed for different types of training organs, and a global eigen-space, which describes the spatial relationships between the organs, is also constructed. Each volume of interest in the abdominal CT image is projected into the local eigen-organ spaces, and several candidate locations are determined. The final selection of the organ locations is made by projecting the set of candidate locations into the global eigen-space. A probabilistic atlas of organs is used to eliminate locations with low probability and to guide the selection of candidate locations. Evaluation by the leave-one-out method using 10 volumetric abdominal CT images showed that the proposed method provided an average accuracy of 80.38% for 11 different organ types. (author)
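The candidate-scoring idea above — projecting a volume of interest into a local eigen-organ space — can be sketched with a PCA basis and reconstruction error as the plausibility score. The training vectors below are random stand-ins for flattened organ VOIs; the paper's global eigen-space and probabilistic-atlas steps are omitted.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy training set: 20 'organ' appearance vectors (flattened VOIs).
train = rng.normal(size=(20, 50))
mean = train.mean(axis=0)
# Local eigen-organ space: top principal components of the training organs.
_, _, vt = np.linalg.svd(train - mean, full_matrices=False)
basis = vt[:5]  # keep 5 eigen-organs

def reconstruction_error(voi):
    """Project a candidate VOI into the eigen-organ space and measure
    how well the space explains it; low error = plausible organ location."""
    centered = voi - mean
    coeffs = basis @ centered
    return float(np.linalg.norm(centered - basis.T @ coeffs))

organ_like = train[0] + 0.1 * rng.normal(size=50)   # near the training set
random_voi = 3.0 * rng.normal(size=50)              # unrelated content
print(reconstruction_error(organ_like), reconstruction_error(random_voi))
```

A VOI resembling the training organs reconstructs with a much smaller residual than unrelated content, which is what makes the projection usable for ranking candidate locations.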

  20. An Open-Source Label Atlas Correction Tool and Preliminary Results on Huntington's Disease Whole-Brain MRI Atlases.

    Science.gov (United States)

    Forbes, Jessica L; Kim, Regina E Y; Paulsen, Jane S; Johnson, Hans J

    2016-01-01

    The creation of high-quality medical imaging reference atlas datasets with consistent dense anatomical region labels is a challenging task. Reference atlases have many uses in medical image applications and are essential components of atlas-based segmentation tools commonly used for producing personalized anatomical measurements for individual subjects. The process of manual identification of anatomical regions by experts is regarded as a so-called gold standard; however, it is usually impractical because of the labor-intensive costs. Further, as the number of regions of interest increases, these manually created atlases often contain many small inconsistently labeled or disconnected regions that need to be identified and corrected. This project proposes an efficient process to drastically reduce the time necessary for manual revision in order to improve atlas label quality. We introduce the LabelAtlasEditor tool, a SimpleITK-based open-source label atlas correction tool distributed within the image visualization software 3D Slicer. LabelAtlasEditor incorporates several 3D Slicer widgets into one consistent interface and provides label-specific correction tools, allowing for rapid identification, navigation, and modification of the small, disconnected erroneous labels within an atlas. The technical details for the implementation and performance of LabelAtlasEditor are demonstrated using an application of improving a set of 20 Huntington's Disease-specific multi-modal brain atlases. Additionally, we present the advantages and limitations of automatic atlas correction. After the correction of atlas inconsistencies and small, disconnected regions, the number of unidentified voxels for each dataset was reduced on average by 68.48%.
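The small-disconnected-region detection that such a tool automates can be illustrated with a connected-component pass over a label map. A minimal sketch, using scipy rather than the SimpleITK machinery LabelAtlasEditor is built on; the label map and size threshold below are made up:

```python
import numpy as np
from scipy import ndimage

# Toy 2D label map: one large region labeled 1 plus a stray 2-voxel island
# of the same label, mimicking a disconnected atlas labeling error.
labels = np.zeros((10, 10), dtype=int)
labels[1:6, 1:6] = 1              # main region (25 voxels)
labels[8, 8] = labels[8, 9] = 1   # disconnected fragment

def small_components(label_map, label, min_size=4):
    """Return a mask of connected components of `label` smaller than min_size."""
    comps, n = ndimage.label(label_map == label)
    sizes = np.bincount(comps.ravel())
    small = [i for i in range(1, n + 1) if sizes[i] < min_size]
    return np.isin(comps, small)

mask = small_components(labels, label=1)
print(mask.sum())  # the 2-voxel island is flagged
```

In an interactive tool the flagged voxels would then be presented for navigation and relabeling rather than deleted automatically.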

  1. Development of Nuclear Plant Specific Analysis Simulators with ATLAS

    International Nuclear Information System (INIS)

    Jakubowski, Z.; Draeger, P.; Horche, W.; Pointner, W.

    2006-01-01

    The simulation software ATLAS, based on the best-estimate code ATHLET, has been developed by GRS for a range of applications in the field of nuclear plant safety analysis. Through the application of versatile simulation tools and graphical interfaces, the user should be able to analyse all essential accident scenarios with ATLAS. Detailed analysis simulators for several German and Russian NPPs are being constructed on the basis of ATLAS. An overview of ATLAS is presented in the paper, describing its configuration, the functions performed by its main components and the relationships among them. An essential part of any power plant simulator is the set of balance-of-plant (BOP) models, not only because all plant transients and non-LOCA accidents can be initiated by the operation of BOP systems, but also because the response of the plant to transients or accidents is strongly influenced by the automatic operation of BOP systems. Modelling aspects of BOP systems are shown in detail, as is the interface between the process model and the BOP systems. Special emphasis has been put on the BOP model builder based on the methodology developed at GRS. The BOP modeller, called GCSM-Generator, is an object-oriented tool which runs on the online expert system G2. It is equipped with utilities to edit the BOP models, to verify them and to generate GCSM code specific to ATLAS. The communication system of ATLAS presents the results of the simulation graphically and allows the user to interactively influence the execution of the simulation process (malfunctions, manual control). Displays for communication with simulated processes and presentation of calculation results are also presented. In the framework of the verification of simulation models, different tools are used, e.g. the PC code MATHCAD for calculation and documentation, ATLET-Input-Graphic for the control of geometry data and the expert system G2 for the development of BOP models. The validation procedure and selected analyses results

  2. Atlas-based head modeling and spatial normalization for high-density diffuse optical tomography: in vivo validation against fMRI.

    Science.gov (United States)

    Ferradal, Silvina L; Eggebrecht, Adam T; Hassanpour, Mahlega; Snyder, Abraham Z; Culver, Joseph P

    2014-01-15

    Diffuse optical imaging (DOI) is increasingly becoming a valuable neuroimaging tool when fMRI is precluded. Recent developments in high-density diffuse optical tomography (HD-DOT) overcome previous limitations of sparse DOI systems, providing improved image quality and brain specificity. These improvements in instrumentation prompt the need for advancements in both i) realistic forward light modeling for accurate HD-DOT image reconstruction, and ii) spatial normalization for voxel-wise comparisons across subjects. Individualized forward light models derived from subject-specific anatomical images provide the optimal inverse solutions, but such modeling may not be feasible in all situations. In the absence of subject-specific anatomical images, atlas-based head models registered to the subject's head using cranial fiducials provide an alternative solution. In addition, a standard atlas is attractive because it defines a common coordinate space in which to compare results across subjects. The question therefore arises as to whether atlas-based forward light modeling ensures adequate HD-DOT image quality at the individual and group level. Herein, we demonstrate the feasibility of using atlas-based forward light modeling and spatial normalization methods. Both techniques are validated using subject-matched HD-DOT and fMRI data sets for visual evoked responses measured in five healthy adult subjects. HD-DOT reconstructions obtained with the registered atlas anatomy (i.e. atlas DOT) had an average localization error of 2.7mm relative to reconstructions obtained with the subject-specific anatomical images (i.e. subject-MRI DOT), and 6.6mm relative to fMRI data. At the group level, the localization error of atlas DOT reconstruction was 4.2mm relative to subject-MRI DOT reconstruction, and 6.1mm relative to fMRI. 
These results show that atlas-based image reconstruction provides a viable approach to individual head modeling for HD-DOT when anatomical imaging is not available.

  3. Test Specification of A1-1 Test for OECD-ATLAS Project

    International Nuclear Information System (INIS)

    Kang, Kyoung-Ho; Moon, Sang-Ki; Lee, Seung-Wook; Choi, Ki-Yong; Song, Chul-Hwa

    2014-01-01

    In the OECD-ATLAS project, design extension conditions (DECs) such as a station blackout (SBO) and a total loss of feedwater (TLOFW) will be experimentally investigated to meet the international interest in multiple high-risk DECs raised after the Fukushima accident. The proposed test matrix for the OECD-ATLAS project is summarized in Table 1. In this study, a detailed specification of the first test in the OECD-ATLAS project, named A1-1, is described. The target scenario of the A1-1 test is a prolonged SBO with delayed supply of turbine-driven auxiliary feedwater to only SG number 2 (SG-2), in order to consider an accident mitigation measure. An SBO is one of the most important DECs in that, without proper operator actions, a total loss of heat sink leads to core uncovery, to core damage and ultimately to a core melt-down scenario under high pressure. Owing to this safety importance, an SBO is considered a base test item of the OECD-ATLAS project. A pre-test analysis using the MARS code was performed with the aim of setting up the detailed test procedures for the A1-1 test and of gaining physical insight into a prolonged SBO transient. In the A1-1 test, a prolonged SBO transient will be simulated in two temporal phases: Phase (I), a conservative SBO transient without supply of turbine-driven auxiliary feedwater, and Phase (II), asymmetric cooling via a single-train supply of turbine-driven auxiliary feedwater

  4. Scalable group level probabilistic sparse factor analysis

    DEFF Research Database (Denmark)

    Hinrich, Jesper Løve; Nielsen, Søren Føns Vind; Riis, Nicolai Andre Brogaard

    2017-01-01

    Many data-driven approaches exist to extract neural representations of functional magnetic resonance imaging (fMRI) data, but most of them lack a proper probabilistic formulation. We propose a scalable group level probabilistic sparse factor analysis (psFA) allowing spatially sparse maps, component pruning using automatic relevance determination (ARD) and subject-specific heteroscedastic spatial noise modeling. For task-based and resting state fMRI, we show that the sparsity constraint gives rise to components similar to those obtained by group independent component analysis. The noise modeling shows that noise is reduced in areas typically associated with activation by the experimental design. The psFA model identifies sparse components and the probabilistic setting provides a natural way to handle parameter uncertainties. The variational Bayesian framework easily extends to more complex...

  5. Characterizing the human hippocampus in aging and Alzheimer's disease using a computational atlas derived from ex vivo MRI and histology.

    Science.gov (United States)

    Adler, Daniel H; Wisse, Laura E M; Ittyerah, Ranjit; Pluta, John B; Ding, Song-Lin; Xie, Long; Wang, Jiancong; Kadivar, Salmon; Robinson, John L; Schuck, Theresa; Trojanowski, John Q; Grossman, Murray; Detre, John A; Elliott, Mark A; Toledo, Jon B; Liu, Weixia; Pickup, Stephen; Miller, Michael I; Das, Sandhitsu R; Wolk, David A; Yushkevich, Paul A

    2018-04-17

    Although the hippocampus is one of the most studied structures in the human brain, limited quantitative data exist on its 3D organization, anatomical variability, and effects of disease on its subregions. Histological studies provide restricted reference information due to their 2D nature. In this paper, high-resolution (∼200 × 200 × 200 μm³) ex vivo MRI scans of 31 human hippocampal specimens are combined using a groupwise diffeomorphic registration approach into a 3D probabilistic atlas that captures average anatomy and anatomic variability of hippocampal subfields. Serial histological imaging in 9 of the 31 specimens was used to label hippocampal subfields in the atlas based on cytoarchitecture. Specimens were obtained from autopsies in patients with a clinical diagnosis of Alzheimer's disease (AD; 9 subjects, 13 hemispheres), of other dementia (9 subjects, 9 hemispheres), and in subjects without dementia (7 subjects, 9 hemispheres), and morphometric analysis was performed in atlas space to measure effects of age and AD on hippocampal subfields. Disproportional involvement of the cornu ammonis (CA) 1 subfield and stratum radiatum lacunosum moleculare was found in AD, with lesser involvement of the dentate gyrus and CA2/3 subfields. An association with age was found for the dentate gyrus and, to a lesser extent, for CA1. Three-dimensional patterns of variability and disease and aging effects discovered via the ex vivo hippocampus atlas provide information highly relevant to the active field of in vivo hippocampal subfield imaging.

  6. Site-specific probabilistic seismic hazard analyses for the Idaho National Engineering Laboratory. Volume 1: Final report

    International Nuclear Information System (INIS)

    1996-05-01

    This report describes and summarizes a probabilistic evaluation of ground motions for the Idaho National Engineering Laboratory (INEL). The purpose of this evaluation is to provide a basis for updating the seismic design criteria for the INEL. In this study, site-specific seismic hazard curves were developed for seven facility sites as prescribed by DOE Standards 1022-93 and 1023-96. These sites include the: Advanced Test Reactor (ATR); Argonne National Laboratory West (ANL); Idaho Chemical Processing Plant (ICPP or CPP); Power Burst Facility (PBF); Radioactive Waste Management Complex (RWMC); Naval Reactor Facility (NRF); and Test Area North (TAN). The results, probabilistic peak ground accelerations and uniform hazard spectra, contained in this report are not to be used for purposes of seismic design at INEL. A subsequent study will be performed to translate the results of this probabilistic seismic hazard analysis to site-specific seismic design values for the INEL as per the requirements of DOE Standard 1020-94. These site-specific seismic design values will be incorporated into the INEL Architectural and Engineering Standards

  7. Age-Specific Mortality and Fertility Rates for Probabilistic Population Projections

    OpenAIRE

    Ševčíková, Hana; Li, Nan; Kantorová, Vladimíra; Gerland, Patrick; Raftery, Adrian E.

    2015-01-01

    The United Nations released official probabilistic population projections (PPP) for all countries for the first time in July 2014. These were obtained by projecting the period total fertility rate (TFR) and life expectancy at birth ($e_0$) using Bayesian hierarchical models, yielding a large set of future trajectories of TFR and $e_0$ for all countries and future time periods to 2100, sampled from their joint predictive distribution. Each trajectory was then converted to age-specific mortalit...

  8. Efficient probabilistic inference in generic neural networks trained with non-probabilistic feedback.

    Science.gov (United States)

    Orhan, A Emin; Ma, Wei Ji

    2017-07-26

    Animals perform near-optimal probabilistic inference in a wide range of psychophysical tasks. Probabilistic inference requires trial-to-trial representation of the uncertainties associated with task variables and subsequent use of this representation. Previous work has implemented such computations using neural networks with hand-crafted and task-dependent operations. We show that generic neural networks trained with a simple error-based learning rule perform near-optimal probabilistic inference in nine common psychophysical tasks. In a probabilistic categorization task, error-based learning in a generic network simultaneously explains a monkey's learning curve and the evolution of qualitative aspects of its choice behavior. In all tasks, the number of neurons required for a given level of performance grows sublinearly with the input population size, a substantial improvement on previous implementations of probabilistic inference. The trained networks develop a novel sparsity-based probabilistic population code. Our results suggest that probabilistic inference emerges naturally in generic neural networks trained with error-based learning rules. Behavioural tasks often require probability distributions to be inferred about task-specific variables. Here, the authors demonstrate that generic neural networks can be trained using a simple error-based learning rule to perform such probabilistic computations efficiently without any need for task-specific operations.

  9. A Generative Probabilistic Model and Discriminative Extensions for Brain Lesion Segmentation - With Application to Tumor and Stroke

    DEFF Research Database (Denmark)

    Menze, Bjoern H.; Van Leemput, Koen; Lashkari, Danial

    2016-01-01

    ... to estimate the label map for a new image. Our model augments the probabilistic atlas of the healthy tissues with a latent atlas of the lesion. We derive an estimation algorithm with closed-form EM update equations. The method extracts a latent atlas prior distribution and the lesion posterior distributions jointly from the image data. It delineates lesion areas individually in each channel, allowing for differences in lesion appearance across modalities, an important feature of many brain tumor imaging sequences. We also propose discriminative model extensions to map the output of the generative model...

  10. Generating patient specific pseudo-CT of the head from MR using atlas-based regression

    International Nuclear Information System (INIS)

    Sjölund, J; Forsberg, D; Andersson, M; Knutsson, H

    2015-01-01

    Radiotherapy planning and attenuation correction of PET images require simulation of radiation transport. The necessary physical properties are typically derived from computed tomography (CT) images, but in some cases, including stereotactic neurosurgery and combined PET/MR imaging, only magnetic resonance (MR) images are available. With these applications in mind, we describe how a realistic, patient-specific, pseudo-CT of the head can be derived from anatomical MR images. We refer to the method as atlas-based regression, because of its similarity to atlas-based segmentation. Given a target MR and an atlas database comprising MR and CT pairs, atlas-based regression works by registering each atlas MR to the target MR, applying the resulting displacement fields to the corresponding atlas CTs and, finally, fusing the deformed atlas CTs into a single pseudo-CT. We use a deformable registration algorithm known as the Morphon and augment it with a certainty mask that allows a tailoring of the influence certain regions are allowed to have on the registration. Moreover, we propose a novel method of fusion, wherein the collection of deformed CTs is iteratively registered to their joint mean and find that the resulting mean CT becomes more similar to the target CT. However, the voxelwise median provided even better results; at least as good as earlier work that required special MR imaging techniques. This makes atlas-based regression a good candidate for clinical use. (paper)
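The fusion step of atlas-based regression — deforming each atlas CT into the target space and combining them voxelwise — can be sketched as follows. The volumes are synthetic stand-ins (registration is replaced by already-deformed arrays); the point is the robustness of the voxelwise median when one registration fails, which echoes the abstract's observation that the voxelwise median performed well.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical stand-in: 7 atlas CTs already deformed into the target space
# (each a small 3D volume of Hounsfield-unit-like values). In practice these
# come from registering each atlas MR to the target MR and applying the
# resulting displacement field to the paired atlas CT.
true_ct = rng.normal(40.0, 10.0, size=(4, 4, 4))
deformed_cts = np.stack([true_ct + rng.normal(0, 15, size=true_ct.shape)
                         for _ in range(7)])
# Simulate one badly registered atlas: gross outlier values.
deformed_cts[0] += 300.0

mean_fusion = deformed_cts.mean(axis=0)
median_fusion = np.median(deformed_cts, axis=0)

# The voxelwise median is robust to the failed registration,
# while the mean is pulled toward the outlier atlas.
err_mean = np.abs(mean_fusion - true_ct).mean()
err_median = np.abs(median_fusion - true_ct).mean()
print(err_median < err_mean)
```

With real data the comparison would be made against an acquired CT of the target subject rather than a known ground truth.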

  11. A bi-ventricular cardiac atlas built from 1000+ high resolution MR images of healthy subjects and an analysis of shape and motion.

    Science.gov (United States)

    Bai, Wenjia; Shi, Wenzhe; de Marvao, Antonio; Dawes, Timothy J W; O'Regan, Declan P; Cook, Stuart A; Rueckert, Daniel

    2015-12-01

    Atlases encode valuable anatomical and functional information from a population. In this work, a bi-ventricular cardiac atlas was built from a unique data set, which consists of high resolution cardiac MR images of 1000+ normal subjects. Based on the atlas, statistical methods were used to study the variation of cardiac shapes and the distribution of cardiac motion across the spatio-temporal domain. We have shown how statistical parametric mapping (SPM) can be combined with a general linear model to study the impact of gender and age on regional myocardial wall thickness. Finally, we have also investigated the influence of the population size on atlas construction and atlas-based analysis. The high resolution atlas, the statistical models and the SPM method will benefit more studies on cardiac anatomy and function analysis in the future. Copyright © 2015 Elsevier B.V. All rights reserved.
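The SPM-with-general-linear-model analysis mentioned above amounts, per region (or per voxel), to an ordinary least-squares fit of wall thickness against covariates. All numbers below are synthetic; only the model structure (thickness ~ intercept + age + sex) follows the abstract.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic cohort: regional wall thickness per subject modeled as a linear
# function of age and sex plus noise (illustrative numbers only).
n = 200
age = rng.uniform(20, 80, size=n)
sex = rng.integers(0, 2, size=n)          # 0 = female, 1 = male
thickness = 6.0 + 0.02 * age + 0.8 * sex + rng.normal(0, 0.3, size=n)

# General linear model: thickness ~ intercept + age + sex, fit for a single
# region by ordinary least squares; an SPM analysis repeats this per voxel
# and then assesses the fitted coefficients statistically.
X = np.column_stack([np.ones(n), age, sex])
beta, *_ = np.linalg.lstsq(X, thickness, rcond=None)
print(np.round(beta, 2))  # approximately [6.0, 0.02, 0.8]
```

The recovered coefficients estimate the regional effects of age and gender on thickness, which is the quantity the atlas study maps spatially.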

  12. Optimization of structures subjected to dynamic load: deterministic and probabilistic methods

    Directory of Open Access Journals (Sweden)

    Élcio Cassimiro Alves

    Full Text Available Abstract This paper deals with the deterministic and probabilistic optimization of structures against bending when submitted to dynamic loads. The deterministic optimization problem considers the plate submitted to a time-varying load, while the probabilistic one takes into account a random loading defined by a power spectral density function. The correlation between the two problems is made by a Fourier transform. The finite element method is used to model the structures. The sensitivity analysis is performed through the analytical method, and the optimization problem is dealt with by the interior point method. A comparison between the deterministic optimization and the probabilistic one with a power spectral density function compatible with the time-varying load shows very good results.
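The link the abstract draws between the time-domain (deterministic) and PSD-based (probabilistic) load descriptions is the Fourier transform. A minimal sketch, assuming a simple periodogram estimate of a one-sided PSD for a made-up load history, verifies Parseval's relation between the load variance and the integrated PSD:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical time-varying load sampled at fs Hz (made zero-mean).
fs, n = 100.0, 4096
t = np.arange(n) / fs
load = np.sin(2 * np.pi * 5.0 * t) + 0.3 * rng.normal(size=n)
load -= load.mean()

# One-sided power spectral density via the periodogram.
spec = np.fft.rfft(load)
freqs = np.fft.rfftfreq(n, d=1.0 / fs)
psd = (np.abs(spec) ** 2) / (fs * n)
psd[1:-1] *= 2.0  # fold negative frequencies into the one-sided PSD

# Parseval check: the PSD integrated over frequency recovers the variance
# of the time-domain load, linking the two problem statements.
df = freqs[1] - freqs[0]
print(np.isclose(psd.sum() * df, load.var()))
```

In the paper's setting the PSD would then drive the probabilistic response analysis, while the time history drives the deterministic one.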

  13. Quantification of 18F-FDG PET images using probabilistic brain atlas: clinical application in temporal lobe epilepsy patients

    International Nuclear Information System (INIS)

    Kang, Keon Wook; Lee, Dong Soo; Cho, Jae Hoon; Lee, Jae Sung; Yeo, Jeong Seok; Lee, Sang Gun; Chung, June Key; Lee, Myung Chul

    2000-01-01

    A probabilistic atlas of the human brain (Statistical Probability Anatomical Maps: SPAM) was developed by the International Consortium for Brain Mapping (ICBM). After calculating the counts in each volume of interest (VOI) as the product of the probability in the SPAM images and the counts in the FDG images, asymmetry indexes (AI) were calculated and used for finding epileptogenic zones in temporal lobe epilepsy (TLE). FDG PET images from 28 surgically confirmed TLE patients and 12 age-matched controls were spatially normalized to the averaged brain MRI atlas of the ICBM. The counts from the normalized PET images were multiplied with the probability of 12 VOIs (superior temporal gyrus, middle temporal gyrus, inferior temporal gyrus, hippocampus, parahippocampal gyrus, and amygdala in each hemisphere) of the SPAM images of the Montreal Neurological Institute. Finally, the AI was calculated for each pair of VOIs and compared with visual assessment. If the AI deviated by more than 2 standard deviations from normal controls, the epileptogenic zone was considered to have been found successfully. The counts of the VOIs in normal controls were symmetric (p>0.05) except those of the inferior temporal gyrus (p<0.01). The AIs in 5 pairs of VOIs, excluding the inferior temporal gyrus, were deviated to one side in TLE (p<0.05). Lateralization was correct in 23 of 28 patients by AI, and all 28 were consistent with visual inspection. In 3 patients with normal AI, metabolism was symmetric on visual inspection. In the 2 patients falsely lateralized using AI, metabolism was also visually decreased on the contralateral side. The asymmetry index obtained as the product of the statistical probability anatomical map and FDG PET correlated well with visual assessment in TLE patients. SPAM is useful for the quantification of VOIs in functional images
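The probability-weighted VOI quantification described above can be sketched with synthetic volumes. The asymmetry-index formula used below, 2(L − R)/(L + R), is a common definition and an assumption here; the paper's exact formula may differ, and the PET values and probability maps are made up.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy normalized FDG-PET volume and SPAM-like probability maps for a
# left/right VOI pair (e.g. hippocampus); values are synthetic.
pet = rng.uniform(0.5, 1.0, size=(8, 8, 8))
prob_left = np.zeros_like(pet);  prob_left[:, :, :4] = 0.8
prob_right = np.zeros_like(pet); prob_right[:, :, 4:] = 0.8
pet[:, :, :4] *= 0.7  # simulate left-sided hypometabolism

def voi_counts(pet_img, prob_map):
    """Probability-weighted mean counts in a VOI (product of SPAM and PET)."""
    return float((pet_img * prob_map).sum() / prob_map.sum())

left, right = voi_counts(pet, prob_left), voi_counts(pet, prob_right)
ai = 2.0 * (left - right) / (left + right)
print(ai < 0)  # left hypometabolism gives a negative AI
```

Comparing such an AI against its distribution in controls (e.g. a 2-standard-deviation threshold, as in the abstract) is what flags a candidate epileptogenic side.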

  14. Probabilistic fatigue life of balsa cored sandwich composites subjected to transverse shear

    DEFF Research Database (Denmark)

    Dimitrov, Nikolay Krasimirov; Berggreen, Christian

    2015-01-01

    A probabilistic fatigue life model for end-grain balsa cored sandwich composites subjected to transverse shear is proposed. The model is calibrated to measured three-point bending constant-amplitude fatigue test data using the maximum likelihood method. Some possible applications of the probabilistic

  15. Confluence reduction for probabilistic systems

    NARCIS (Netherlands)

    Timmer, Mark; van de Pol, Jan Cornelis; Stoelinga, Mariëlle Ida Antoinette

    In this presentation we introduce a novel technique for state space reduction of probabilistic specifications, based on a newly developed notion of confluence for probabilistic automata. We proved that this reduction preserves branching probabilistic bisimulation and can be applied on-the-fly. To

  16. Probabilistic Material Strength Degradation Model for Inconel 718 Components Subjected to High Temperature, Mechanical Fatigue, Creep and Thermal Fatigue Effects

    Science.gov (United States)

    Bast, Callie Corinne Scheidt

    1994-01-01

    This thesis presents the on-going development of methodology for a probabilistic material strength degradation model. The probabilistic model, in the form of a postulated randomized multifactor equation, provides for quantification of uncertainty in the lifetime material strength of aerospace propulsion system components subjected to a number of diverse random effects. This model is embodied in the computer program entitled PROMISS, which can include up to eighteen different effects. Presently, the model includes four effects that typically reduce lifetime strength: high temperature, mechanical fatigue, creep, and thermal fatigue. Statistical analysis was conducted on experimental Inconel 718 data obtained from the open literature. This analysis provided regression parameters for use as the model's empirical material constants, thus calibrating the model specifically for Inconel 718. Model calibration was carried out for four variables, namely, high temperature, mechanical fatigue, creep, and thermal fatigue. Methodology to estimate standard deviations of these material constants for input into the probabilistic material strength model was developed. Using the current version of PROMISS, entitled PROMISS93, a sensitivity study for the combined effects of mechanical fatigue, creep, and thermal fatigue was performed. Results, in the form of cumulative distribution functions, illustrated the sensitivity of lifetime strength to any current value of an effect. In addition, verification studies comparing a combination of mechanical fatigue and high temperature effects by model to the combination by experiment were conducted. Thus, for Inconel 718, the basic model assumption of independence between effects was evaluated. Results from this limited verification study strongly supported this assumption.

  17. Diffeomorphic Anatomical Registration Through Exponentiated Lie Algebra provides reduced effect of scanner for cortex volumetry with atlas-based method in healthy subjects.

    Science.gov (United States)

    Goto, Masami; Abe, Osamu; Aoki, Shigeki; Hayashi, Naoto; Miyati, Tosiaki; Takao, Hidemasa; Iwatsubo, Takeshi; Yamashita, Fumio; Matsuda, Hiroshi; Mori, Harushi; Kunimatsu, Akira; Ino, Kenji; Yano, Keiichi; Ohtomo, Kuni

    2013-07-01

    This study aimed to investigate whether the effect of scanner for cortex volumetry with atlas-based method is reduced using Diffeomorphic Anatomical Registration Through Exponentiated Lie Algebra (DARTEL) normalization compared with standard normalization. Three-dimensional T1-weighted magnetic resonance images (3D-T1WIs) of 21 healthy subjects were obtained and evaluated for effect of scanner in cortex volumetry. 3D-T1WIs of the 21 subjects were obtained with five MRI systems. Imaging of each subject was performed on each of five different MRI scanners. We used the Voxel-Based Morphometry 8 tool implemented in Statistical Parametric Mapping 8 and WFU PickAtlas software (Talairach brain atlas theory). The following software default settings were used as bilateral region-of-interest labels: "Frontal Lobe," "Hippocampus," "Occipital Lobe," "Orbital Gyrus," "Parietal Lobe," "Putamen," and "Temporal Lobe." Effect of scanner for cortex volumetry using the atlas-based method was reduced with DARTEL normalization compared with standard normalization in Frontal Lobe, Occipital Lobe, Orbital Gyrus, Putamen, and Temporal Lobe; was the same in Hippocampus and Parietal Lobe; and showed no increase with DARTEL normalization for any region of interest (ROI). DARTEL normalization reduces the effect of scanner, which is a major problem in multicenter studies.

  18. Diffeomorphic Anatomical Registration Through Exponentiated Lie Algebra provides reduced effect of scanner for cortex volumetry with atlas-based method in healthy subjects

    International Nuclear Information System (INIS)

    Goto, Masami; Ino, Kenji; Yano, Keiichi; Abe, Osamu; Aoki, Shigeki; Hayashi, Naoto; Miyati, Tosiaki; Takao, Hidemasa; Mori, Harushi; Kunimatsu, Akira; Ohtomo, Kuni; Iwatsubo, Takeshi; Yamashita, Fumio; Matsuda, Hiroshi

    2013-01-01

    This study aimed to investigate whether the effect of scanner for cortex volumetry with atlas-based method is reduced using Diffeomorphic Anatomical Registration Through Exponentiated Lie Algebra (DARTEL) normalization compared with standard normalization. Three-dimensional T1-weighted magnetic resonance images (3D-T1WIs) of 21 healthy subjects were obtained and evaluated for effect of scanner in cortex volumetry. 3D-T1WIs of the 21 subjects were obtained with five MRI systems. Imaging of each subject was performed on each of five different MRI scanners. We used the Voxel-Based Morphometry 8 tool implemented in Statistical Parametric Mapping 8 and WFU PickAtlas software (Talairach brain atlas theory). The following software default settings were used as bilateral region-of-interest labels: "Frontal Lobe," "Hippocampus," "Occipital Lobe," "Orbital Gyrus," "Parietal Lobe," "Putamen," and "Temporal Lobe." Effect of scanner for cortex volumetry using the atlas-based method was reduced with DARTEL normalization compared with standard normalization in Frontal Lobe, Occipital Lobe, Orbital Gyrus, Putamen, and Temporal Lobe; was the same in Hippocampus and Parietal Lobe; and showed no increase with DARTEL normalization for any region of interest (ROI). DARTEL normalization reduces the effect of scanner, which is a major problem in multicenter studies. (orig.)

  19. Fusion set selection with surrogate metric in multi-atlas based image segmentation

    International Nuclear Information System (INIS)

    Zhao, Tingting; Ruan, Dan

    2016-01-01

    Multi-atlas based image segmentation sees unprecedented opportunities but also demanding challenges in the big data era. Relevant atlas selection before label fusion plays a crucial role in reducing potential performance loss from heterogeneous data quality and high computation cost from extensive data. This paper starts with investigating the image similarity metric (termed ‘surrogate’), an alternative to the inaccessible geometric agreement metric (termed ‘oracle’) in atlas relevance assessment, and probes into the problem of how to select the ‘most-relevant’ atlases and how many such atlases to incorporate. We propose an inference model to relate the surrogates and the oracle geometric agreement metrics. Based on this model, we quantify the behavior of the surrogates in mimicking oracle metrics for atlas relevance ordering. Finally, analytical insights on the choice of fusion set size are presented from a probabilistic perspective, with the integrated goal of including the most relevant atlases and excluding the irrelevant ones. Empirical evidence and performance assessment are provided based on prostate and corpus callosum segmentation. (paper)
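The surrogate-based relevance ordering discussed above can be illustrated with normalized cross-correlation as the image-similarity surrogate (one common choice; the abstract does not commit to a specific metric) and a fixed fusion-set size. The target and atlas images are synthetic.

```python
import numpy as np

rng = np.random.default_rng(6)

def ncc(a, b):
    """Normalized cross-correlation, a common surrogate similarity metric."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float((a * b).mean())

# Toy target image and 10 candidate atlases; atlases closer to the target
# should receive higher surrogate scores and enter the fusion set first.
target = rng.normal(size=(16, 16))
atlases = [target + rng.normal(0, s, size=target.shape)
           for s in np.linspace(0.2, 2.0, 10)]

scores = [ncc(target, a) for a in atlases]
order = np.argsort(scores)[::-1]   # most-relevant atlases first
fusion_set = order[:4]             # keep a fixed-size fusion set
print(fusion_set)
```

The paper's contribution is to model how well such surrogate orderings track the inaccessible "oracle" geometric agreement, and to choose the fusion-set size accordingly rather than fixing it a priori as done here.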

  20. A Probabilistic Analysis of Data Popularity in ATLAS Data Caching

    CERN Document Server

    Titov, M; The ATLAS collaboration; Záruba, G; De, K

    2012-01-01

    Efficient distribution of physics data over ATLAS grid sites is one of the most important tasks for user data processing. ATLAS' initial static data distribution model over-replicated some unpopular data and under-replicated popular data, creating heavy disk space loads while underutilizing some processing resources due to low data availability. Thus, a new data distribution mechanism was implemented, PD2P (PanDA Dynamic Data Placement) within the production and distributed analysis system PanDA that dynamically reacts to user data needs, basing dataset distribution principally on user demand. Data deletion is also demand driven, reducing replica counts for unpopular data. This dynamic model has led to substantial improvements in efficient utilization of storage and processing resources. Based on this experience, in this work we seek to further improve data placement policy by investigating in detail how data popularity is calculated. For this it is necessary to precisely define what data popularity means, wh...

  1. Probabilistic analysis of 900 MWe PWR. Shutdown technical specifications

    International Nuclear Information System (INIS)

    Mattei, J.M.; Bars, G.

    1987-11-01

    During annual shutdown, preventive maintenance and modifications which are made on PWRs cause scheduled unavailabilities of equipment or systems which might harm the safety of the installation, in spite of the low level of decay heat during this period. The pumps in the auxiliary feedwater system, component cooling water system, service water system, the water injection arrays (LPIS, HPIS, CVCS), and the containment spray system may have scheduled unavailability, as well as the power supply of the electricity boards. The EDF utility is aware of the risks related to these situations for which accident procedures have been set up and hence has proposed limiting downtime for this equipment during the shutdown period, through technical specifications. The project defines the equipment required to ensure the functions important for safety during the various shutdown phases (criticality, water inventory, evacuation of decay heat, containment). In order to be able to judge the acceptability of these specifications, the IPSN, the technical support of the Service Central de Surete des Installations Nucleaires, has used probabilistic methodology to analyse the impact on the core melt probability of these specifications, for a French 900 MWe PWR

  2. Developing a Korean standard brain atlas on the basis of statistical and probabilistic approach and visualization tool for functional image analysis

    Energy Technology Data Exchange (ETDEWEB)

    Koo, B. B.; Lee, J. M.; Kim, J. S.; Kim, I. Y.; Kim, S. I. [Hanyang University, Seoul (Korea, Republic of); Lee, J. S.; Lee, D. S.; Kwon, J. S. [Seoul National University College of Medicine, Seoul (Korea, Republic of); Kim, J. J. [Yonsei University College of Medicine, Seoul (Korea, Republic of)

    2003-06-01

    Probabilistic anatomical maps are used to localize functional neuroimages and to characterize morphological variability. A quantitative indicator is important for determining the anatomical position of an activated region, because functional image data have low resolution and carry no inherent anatomical information. Although the previously developed MNI probabilistic anatomical map was sufficient to localize such data, it was not suitable for Korean brains because of morphological differences between Occidental and Oriental populations. In this study, we developed a probabilistic anatomical map for normal Korean brains. Seventy-five normal brains were imaged with T1-weighted spoiled gradient echo magnetic resonance sequences on a 1.5-T GE SIGNA scanner. A standard brain was then selected from the group by a clinician searching for a brain of average properties in the Talairach coordinate system. On the standard brain, an anatomist delineated 89 regions of interest (ROIs) parcellating cortical and subcortical areas. The parcellated ROIs of the standard brain were warped and overlapped onto each brain by maximizing intensity similarity, and every brain was automatically labeled with the registered ROIs. Each same-labeled region was linearly normalized to the standard brain, and the occurrence of each region was counted. Finally, 89 probabilistic ROI volumes were generated. This paper presents a probabilistic anatomical map for localizing functional and structural analyses of normal Korean brains. In the future, we will develop group-specific probabilistic anatomical maps for OCD and schizophrenia.
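The final step of the pipeline above, counting at each voxel how many of the registered label volumes cover it, is what turns the 89 ROI delineations into probabilistic volumes. A minimal sketch, with tiny 1-D lists standing in for registered 3-D binary masks:

```python
def probabilistic_map(masks):
    # masks: one binary label mask per subject, all already registered to
    # the standard brain. The probability at each voxel is simply the
    # fraction of subjects whose mask covers that voxel.
    n = len(masks)
    return [sum(m[v] for m in masks) / n for v in range(len(masks[0]))]

# Four subjects, three voxels: voxel 0 is covered by all, voxel 2 by one.
pmap = probabilistic_map([[1, 1, 0], [1, 0, 0], [1, 1, 1], [1, 0, 0]])
```

The same per-voxel averaging underlies most probabilistic atlases; what differs between methods is the registration that aligns the masks beforehand.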

  3. Developing a Korean standard brain atlas on the basis of statistical and probabilistic approach and visualization tool for functional image analysis

    International Nuclear Information System (INIS)

    Koo, B. B.; Lee, J. M.; Kim, J. S.; Kim, I. Y.; Kim, S. I.; Lee, J. S.; Lee, D. S.; Kwon, J. S.; Kim, J. J.

    2003-01-01

    Probabilistic anatomical maps are used to localize functional neuroimages and to characterize morphological variability. A quantitative indicator is important for determining the anatomical position of an activated region, because functional image data have low resolution and carry no inherent anatomical information. Although the previously developed MNI probabilistic anatomical map was sufficient to localize such data, it was not suitable for Korean brains because of morphological differences between Occidental and Oriental populations. In this study, we developed a probabilistic anatomical map for normal Korean brains. Seventy-five normal brains were imaged with T1-weighted spoiled gradient echo magnetic resonance sequences on a 1.5-T GE SIGNA scanner. A standard brain was then selected from the group by a clinician searching for a brain of average properties in the Talairach coordinate system. On the standard brain, an anatomist delineated 89 regions of interest (ROIs) parcellating cortical and subcortical areas. The parcellated ROIs of the standard brain were warped and overlapped onto each brain by maximizing intensity similarity, and every brain was automatically labeled with the registered ROIs. Each same-labeled region was linearly normalized to the standard brain, and the occurrence of each region was counted. Finally, 89 probabilistic ROI volumes were generated. This paper presents a probabilistic anatomical map for localizing functional and structural analyses of normal Korean brains. In the future, we will develop group-specific probabilistic anatomical maps for OCD and schizophrenia.

  4. Diffeomorphic Anatomical Registration Through Exponentiated Lie Algebra provides reduced effect of scanner for cortex volumetry with atlas-based method in healthy subjects

    Energy Technology Data Exchange (ETDEWEB)

    Goto, Masami; Ino, Kenji; Yano, Keiichi [University of Tokyo Hospital, Department of Radiological Technology, Bunkyo-ku, Tokyo (Japan); Abe, Osamu [Nihon University School of Medicine, Department of Radiology, Itabashi-ku, Tokyo (Japan); Aoki, Shigeki [Juntendo University, Department of Radiology, Bunkyo-ku, Tokyo (Japan); Hayashi, Naoto [University of Tokyo Hospital, Department of Computational Diagnostic Radiology and Preventive Medicine, Bunkyo-ku, Tokyo (Japan); Miyati, Tosiaki [Kanazawa University, Graduate School of Medical Science, Kanazawa (Japan); Takao, Hidemasa; Mori, Harushi; Kunimatsu, Akira; Ohtomo, Kuni [University of Tokyo Hospital, Department of Radiology and Department of Computational Diagnostic Radiology and Preventive Medicine, Bunkyo-ku, Tokyo (Japan); Iwatsubo, Takeshi [University of Tokyo, Department of Neuropathology, Bunkyo-ku, Tokyo (Japan); Yamashita, Fumio [Iwate Medical University, Department of Radiology, Yahaba, Iwate (Japan); Matsuda, Hiroshi [Integrative Brain Imaging Center National Center of Neurology and Psychiatry, Department of Nuclear Medicine, Kodaira, Tokyo (Japan); Collaboration: Japanese Alzheimer' s Disease Neuroimaging Initiative

    2013-07-15

    This study investigated whether the effect of scanner on cortex volumetry with an atlas-based method is reduced by Diffeomorphic Anatomical Registration Through Exponentiated Lie Algebra (DARTEL) normalization compared with standard normalization. Three-dimensional T1-weighted magnetic resonance images (3D-T1WIs) of 21 healthy subjects were evaluated for the effect of scanner on cortex volumetry; each subject was imaged on each of five different MRI scanners. We used the Voxel-Based Morphometry 8 tool implemented in Statistical Parametric Mapping 8 and WFU PickAtlas software (Talairach brain atlas theory). The following software default settings were used as bilateral region-of-interest labels: "Frontal Lobe," "Hippocampus," "Occipital Lobe," "Orbital Gyrus," "Parietal Lobe," "Putamen," and "Temporal Lobe." The effect of scanner on cortex volumetry with the atlas-based method was reduced with DARTEL normalization compared with standard normalization in the Frontal Lobe, Occipital Lobe, Orbital Gyrus, Putamen, and Temporal Lobe; was the same in the Hippocampus and Parietal Lobe; and showed no increase with DARTEL normalization for any region of interest (ROI). DARTEL normalization reduces the effect of scanner, which is a major problem in multicenter studies. (orig.)

  5. Quantification of {sup 18}F-FDG PET images using probabilistic brain atlas: clinical application in temporal lobe epilepsy patients

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Keon Wook; Lee, Dong Soo; Cho, Jae Hoon; Lee, Jae Sung; Yeo, Jeong Seok; Lee, Sang Gun; Chung, June Key; Lee, Myung Chul [Seoul National Univ., Seoul (Korea, Republic of)

    2000-07-01

    A probabilistic atlas of the human brain (Statistical Probability Anatomical Maps: SPAM) was developed by the International Consortium for Brain Mapping (ICBM). After calculating the counts in each volume of interest (VOI) as the product of the SPAM probability images and the counts in the FDG images, asymmetry indexes (AI) were calculated and used for finding epileptogenic zones in temporal lobe epilepsy (TLE). FDG PET images from 28 surgically confirmed TLE patients and 12 age-matched controls were spatially normalized to the averaged brain MRI atlas of the ICBM. The counts from the normalized PET images were multiplied with the probabilities of 12 VOIs (superior temporal gyrus, middle temporal gyrus, inferior temporal gyrus, hippocampus, parahippocampal gyrus, and amygdala in each hemisphere) from the SPAM images of the Montreal Neurological Institute. Finally, the AI was calculated for each pair of VOIs and compared with visual assessment. If the AI deviated by more than 2 standard deviations from normal controls, the epileptogenic zone was considered to be found successfully. The counts of the VOIs in normal controls were symmetric (AI <6%, paired t-test p>0.05) except those of the inferior temporal gyrus (p<0.01). AIs in the 5 pairs of VOIs excluding the inferior temporal gyrus were deviated to one side in TLE (p<0.05). Lateralization was correct in 23/28 patients by AI, and all 28 were consistent with visual inspection. In 3 patients with normal AI, metabolism was symmetric on visual inspection. In the 2 patients falsely lateralized by AI, metabolism was also visually decreased on the contralateral side. The asymmetry index obtained from the product of the statistical probability anatomical map and FDG PET correlated well with visual assessment in TLE patients. SPAM is useful for quantification of VOIs in functional images.
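The two quantities at the heart of this record, probability-weighted VOI counts and the asymmetry index between homologous hemispheric VOIs, can be sketched as follows. The AI formula shown, 200·(L−R)/(L+R), is the conventional percent form and is an assumption on my part, since the abstract does not spell it out:

```python
def voi_count(pet, prob):
    # Probability-weighted count in a VOI: sum over voxels of
    # (SPAM probability of the VOI at that voxel) * (FDG PET count).
    return sum(p * c for p, c in zip(prob, pet))

def asymmetry_index(left, right):
    # Percent asymmetry between homologous left/right VOI counts.
    return 200.0 * (left - right) / (left + right)

# Toy 3-voxel example: counts weighted by a soft probability map.
count = voi_count([10, 20, 30], [1.0, 0.5, 0.0])   # = 20.0
ai = asymmetry_index(120.0, 80.0)                  # = 40.0 (left-dominant)
```

A large positive or negative AI relative to the control distribution would then flag the hypometabolic (epileptogenic) side.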

  6. Safety-specific benefit of the probabilistic evaluation of older nuclear power plants

    International Nuclear Information System (INIS)

    Hoertner, H.; Koeberlein, K.

    1991-01-01

    The report summarizes the experience of the GRS obtained within the framework of a probabilistic evaluation of older nuclear power plants and the German risk study. The applied methodology and the problems involved are explained first. After a brief summary of probabilistic analyses carried out for German nuclear power plants, reliability analyses for older systems are discussed in detail. The findings from the probabilistic safety analyses and the conclusions drawn are presented. (orig.) [de

  7. 77 FR 29391 - An Approach for Probabilistic Risk Assessment in Risk-Informed Decisions on Plant-Specific...

    Science.gov (United States)

    2012-05-17

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0110] An Approach for Probabilistic Risk Assessment in Risk-Informed Decisions on Plant-Specific Changes to the Licensing Basis AGENCY: Nuclear Regulatory Commission. ACTION: Draft regulatory guide; request for comment. SUMMARY: The U.S. Nuclear Regulatory...

  8. Confluence Reduction for Probabilistic Systems (extended version)

    NARCIS (Netherlands)

    Timmer, Mark; Stoelinga, Mariëlle Ida Antoinette; van de Pol, Jan Cornelis

    2010-01-01

    This paper presents a novel technique for state space reduction of probabilistic specifications, based on a newly developed notion of confluence for probabilistic automata. We prove that this reduction preserves branching probabilistic bisimulation and can be applied on-the-fly. To support the

  9. Students’ difficulties in probabilistic problem-solving

    Science.gov (United States)

    Arum, D. P.; Kusmayadi, T. A.; Pramudya, I.

    2018-03-01

    Many errors can be identified when students solve mathematics problems, particularly probabilistic problems. This study aims to investigate students' difficulties in solving probabilistic problems, focusing on analyzing and describing students' errors during problem solving. The research used a qualitative method with a case study strategy. The subjects were ten 9th-grade students selected by purposive sampling. The data comprise students' probabilistic problem-solving results and recorded interviews regarding their difficulties in solving the problems, analyzed descriptively using Miles and Huberman's steps. The results show that students' difficulties in solving probabilistic problems fall into three categories: first, difficulties in understanding the probabilistic problem; second, difficulties in choosing and using appropriate solution strategies; and third, difficulties with the computational process. The results suggest that students are not yet able to apply their knowledge and abilities to probabilistic problems. It is therefore important for mathematics teachers to plan probabilistic learning that optimizes students' probabilistic thinking ability.

  10. Delineating probabilistic species pools in ecology and biogeography

    OpenAIRE

    Karger, Dirk Nikolaus; Cord, Anna F; Kessler, Michael; Kreft, Holger; Kühn, Ingolf; Pompe, Sven; Sandel, Brody; Sarmento Cabral, Juliano; Smith, Adam B; Svenning, Jens-Christian; Tuomisto, Hanna; Weigelt, Patrick; Wesche, Karsten

    2016-01-01

    Aim To provide a mechanistic and probabilistic framework for defining the species pool based on species-specific probabilities of dispersal, environmental suitability and biotic interactions within a specific temporal extent, and to show how probabilistic species pools can help disentangle the geographical structure of different community assembly processes. Innovation Probabilistic species pools provide an improved species pool definition based on probabilities in conjuncti...

  11. Contrasting Connectivity of the Vim and Vop Nuclei of the Motor Thalamus Demonstrated by Probabilistic Tractography

    DEFF Research Database (Denmark)

    Hyam, Jonathan A; Owen, Sarah L F; Kringelbach, Morten L.

    2011-01-01

    BACKGROUND: Targeting of the motor thalamus for the treatment of tremor has traditionally been achieved by a combination of anatomical atlases and neuro-imaging, intra-operative clinical assessment, and physiological recordings. OBJECTIVE: To evaluate whether thalamic nuclei targeted in tremor surgery could be identified by virtue of their differing connections using non-invasive neuro-imaging, thereby providing an extra factor to aid successful targeting. METHODS: Diffusion tensor tractography was performed in seventeen healthy control subjects using diffusion data acquired at 1.5T magnetic resonance imaging (60 directions, b-value=1000 s/mm², 2x2x2 mm voxels). The ventralis intermedius (Vim) and ventralis oralis posterior (Vop) nuclei were identified by a stereotactic neurosurgeon and these sites were used as seeds for probabilistic tractography. The expected cortical connections

  12. On the logical specification of probabilistic transition models

    CSIR Research Space (South Africa)

    Rens, G

    2013-05-01

    Full Text Available We investigate the requirements for specifying the behaviors of actions in a stochastic domain. That is, we propose how to write sentences in a logical language to capture a model of probabilistic transitions due to the execution of actions of some...

  13. Description of the Probabilistic Wind Atlas Methodology, Deliverable D3.1

    DEFF Research Database (Denmark)

    Hahmann, Andrea N.; Witha, Björn; Rife, Daran L.

    against data from 10 meteorological masts in South Africa, part of the Wind Atlas of South Africa (WASA) project, where a long-term set of high-quality observations exist. The results of the ensemble simulations are encouraging, but further analysis is needed to quantify their utility. A key disadvantage...

  14. Formalizing Probabilistic Safety Claims

    Science.gov (United States)

    Herencia-Zapana, Heber; Hagen, George E.; Narkawicz, Anthony J.

    2011-01-01

    A safety claim for a system is a statement that the system, which is subject to hazardous conditions, satisfies a given set of properties. Following work by John Rushby and Bev Littlewood, this paper presents a mathematical framework that can be used to state and formally prove probabilistic safety claims. It also enables hazardous conditions, their uncertainties, and their interactions to be integrated into the safety claim. This framework provides a formal description of the probabilistic composition of an arbitrary number of hazardous conditions and their effects on system behavior. An example is given of a probabilistic safety claim for a conflict detection algorithm for aircraft in a 2D airspace. The motivation for developing this mathematical framework is that it can be used in an automated theorem prover to formally verify safety claims.

  15. Probabilistic logics and probabilistic networks

    CERN Document Server

    Haenni, Rolf; Wheeler, Gregory; Williamson, Jon; Andrews, Jill

    2014-01-01

    Probabilistic Logic and Probabilistic Networks presents a groundbreaking framework within which various approaches to probabilistic logic naturally fit. Additionally, the text shows how to develop computationally feasible methods to mesh with this framework.

  16. Cortex Parcellation Associated Whole White Matter Parcellation in Individual Subjects

    Directory of Open Access Journals (Sweden)

    Patrick Schiffler

    2017-07-01

    The investigation of specific white matter areas is a growing field in neurological research and is typically achieved through the use of atlases. However, the definition of anatomically based regions remains challenging for the white matter and thus hinders region-specific analysis in individual subjects. In this article, we focus on creating a whole white matter parcellation method for individual subjects in which these areas can be associated with cortex regions. This is done by combining cortex parcellation and fiber tracking data. By tracking fibers out of each cortex region and labeling the fibers according to their origin, we populate a candidate image. We then derive the white matter parcellation by classifying each white matter voxel according to the distribution of labels in the corresponding voxel of the candidate image. The parcellation of the white matter with the presented method is highly reliable and is not as dependent on registration as with white matter atlases. This method allows for the parcellation of the whole white matter into individual cortex-region-associated areas and, therefore, associates white matter alterations with cortex regions. In addition, we compare the results from the presented method to existing atlases. The areas generated by the presented method are not as sharply defined as the areas in most existing atlases; however, they are computed directly in the DWI space of the subject and, therefore, do not suffer from distortion caused by registration. The presented approach might be a promising tool for clinical and basic research to investigate modality- or system-specific microstructural alterations of white matter areas in a quantitative manner.
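The classification step described above, assigning each white-matter voxel the cortex label that dominates the fiber-label distribution at that voxel, can be sketched like this. Representing the candidate image as one label-to-fiber-count dictionary per voxel is an illustrative simplification, not the paper's data structure:

```python
def parcellate(candidate):
    # candidate: per-voxel dict mapping cortex-region label -> number of
    # tracked fibers from that region passing through the voxel.
    # Each voxel gets the label with the highest fiber count; voxels that
    # no fiber reached stay unlabeled (None).
    parcellation = []
    for votes in candidate:
        if votes:
            parcellation.append(max(votes, key=votes.get))
        else:
            parcellation.append(None)
    return parcellation
```

Because the labels come from the subject's own cortex parcellation and tractography, the result lives in the subject's DWI space with no atlas registration involved.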

  17. Implications of probabilistic risk assessment

    International Nuclear Information System (INIS)

    Cullingford, M.C.; Shah, S.M.; Gittus, J.H.

    1987-01-01

    Probabilistic risk assessment (PRA) is an analytical process that quantifies the likelihoods, consequences and associated uncertainties of the potential outcomes of postulated events. Starting with planned or normal operation, probabilistic risk assessment covers a wide range of potential accidents and considers the whole plant and the interactions of systems and human actions. Probabilistic risk assessment can be applied in safety decisions in design, licensing and operation of industrial facilities, particularly nuclear power plants. The proceedings include a review of PRA procedures, methods and technical issues in treating uncertainties, operating and licensing issues and future trends. Risk assessments for specific reactor types or components and specific risks (e.g. aircraft crashing onto a reactor) are used to illustrate the points raised. All 52 articles are indexed separately. (U.K.)

  18. OCA-P, a deterministic and probabilistic fracture-mechanics code for application to pressure vessels

    International Nuclear Information System (INIS)

    Cheverton, R.D.; Ball, D.G.

    1984-05-01

    The OCA-P code is a probabilistic fracture-mechanics code that was prepared specifically for evaluating the integrity of pressurized-water reactor vessels when subjected to overcooling-accident loading conditions. The code has two-dimensional- and some three-dimensional-flaw capability; it is based on linear-elastic fracture mechanics; and it can treat cladding as a discrete region. Both deterministic and probabilistic analyses can be performed. For the former analysis, it is possible to conduct a search for critical values of the fluence and the nil-ductility reference temperature corresponding to incipient initiation of the initial flaw. The probabilistic portion of OCA-P is based on Monte Carlo techniques, and simulated parameters include fluence, flaw depth, fracture toughness, nil-ductility reference temperature, and concentrations of copper, nickel, and phosphorous. Plotting capabilities include the construction of critical-crack-depth diagrams (deterministic analysis) and various histograms (probabilistic analysis)
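The probabilistic portion described above, Monte Carlo simulation of uncertain inputs to estimate a crack-initiation probability, can be illustrated with a deliberately simplified sketch. The distributions, the stress-intensity model, and all numbers below are invented for illustration and are not OCA-P's actual models:

```python
import random

def failure_probability(trials=100_000, seed=1):
    # Toy Monte Carlo in the spirit of a probabilistic fracture-mechanics
    # code: sample an uncertain flaw depth and fracture toughness, and
    # count the fraction of trials in which the applied stress intensity
    # exceeds the sampled toughness.
    random.seed(seed)
    failures = 0
    for _ in range(trials):
        depth = random.lognormvariate(0.0, 0.5)    # flaw depth (invented units)
        toughness = random.gauss(100.0, 15.0)      # fracture toughness (invented)
        applied_k = 40.0 * depth ** 0.5            # toy K_I ~ sqrt(depth)
        if applied_k > toughness:
            failures += 1
    return failures / trials

p = failure_probability()
```

In a real code like OCA-P, fluence, nil-ductility reference temperature, and copper/nickel/phosphorus concentrations would be sampled as well, and the conditional probability would be folded into histograms and critical-crack-depth diagrams.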

  19. A logic for inductive probabilistic reasoning

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2005-01-01

    Inductive probabilistic reasoning is understood as the application of inference patterns that use statistical background information to assign (subjective) probabilities to single events. The simplest such inference pattern is direct inference: from "70% of As are Bs" and "a is an A" infer that a is a B with probability 0.7. Direct inference is generalized by Jeffrey's rule and the principle of cross-entropy minimization. To adequately formalize inductive probabilistic reasoning is an interesting topic for artificial intelligence, as an autonomous system acting in a complex environment may have to base its actions on a probabilistic model of its environment, and the probabilities needed to form this model can often be obtained by combining statistical background information with particular observations made, i.e., by inductive probabilistic reasoning. In this paper a formal framework

  20. Probabilistic programming in Python using PyMC3

    Directory of Open Access Journals (Sweden)

    John Salvatier

    2016-04-01

    Probabilistic programming allows for automatic Bayesian inference on user-defined probabilistic models. Recent advances in Markov chain Monte Carlo (MCMC) sampling allow inference on increasingly complex models. This class of MCMC, known as Hamiltonian Monte Carlo, requires gradient information which is often not readily available. PyMC3 is a new open source probabilistic programming framework written in Python that uses Theano to compute gradients via automatic differentiation as well as compile probabilistic programs on the fly to C for increased speed. Contrary to other probabilistic programming languages, PyMC3 allows model specification directly in Python code. The lack of a domain-specific language allows for great flexibility and direct interaction with the model. This paper is a tutorial-style introduction to this software package.
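The kind of inference PyMC3 automates can be illustrated with a minimal, dependency-free Metropolis sampler for the mean of normally distributed data. This is a generic MCMC sketch, not PyMC3 code; PyMC3 itself would use Hamiltonian Monte Carlo with gradients computed by Theano:

```python
import math
import random

def log_posterior(mu, data):
    # Unnormalized log posterior: standard-normal prior on mu,
    # unit-variance normal likelihood for the data.
    lp = -0.5 * mu * mu
    lp += sum(-0.5 * (x - mu) ** 2 for x in data)
    return lp

def metropolis(data, steps=5000, step_size=0.5, seed=0):
    random.seed(seed)
    mu, samples = 0.0, []
    for _ in range(steps):
        prop = mu + random.gauss(0.0, step_size)
        # Accept the proposal with probability min(1, posterior ratio).
        if math.log(random.random()) < log_posterior(prop, data) - log_posterior(mu, data):
            mu = prop
        samples.append(mu)
    return samples

data = [2.1, 1.9, 2.3, 2.0, 1.8]
samples = metropolis(data)
posterior_mean = sum(samples[1000:]) / len(samples[1000:])
```

With this prior and likelihood the exact posterior mean is n·x̄/(n+1) ≈ 1.68, which the sampler estimate approaches after discarding burn-in; a probabilistic programming framework derives and runs an equivalent (but far more efficient) sampler from the model specification alone.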

  1. Report to users of ATLAS, January 1998

    International Nuclear Information System (INIS)

    Ahmad, I.; Hofman, D.

    1998-01-01

    This report is aimed at informing users about the operating schedule, user policies, and recent changes in research capabilities. It covers the following subjects: (1) status of the Argonne Tandem-Linac Accelerator System (ATLAS) accelerator; (2) the move of Gammasphere from LBNL to ANL; (3) commissioning of the CPT mass spectrometer at ATLAS; (4) highlights of recent research at ATLAS; (5) Program Advisory Committee; and (6) ATLAS User Group Executive Committee

  2. Development of specific data of plant for a safety probabilistic analysis

    International Nuclear Information System (INIS)

    Gonzalez C, M.; Nelson E, P.

    2004-01-01

    This work describes the development of plant-specific data for the Probabilistic Safety Analysis (APS) of the Laguna Verde plant. The description of the methods used concentrates on obtaining the equipment failure rates and the frequencies of the initiating events modeled in the APS, with mention of other types of data that also draw on plant-specific sources. The method for obtaining equipment failure rates exploits the component-failure and system-unavailability information collected in compliance with the Maintenance Rule (10CFR50.65). The method for developing initiating-event frequencies takes into account the operational experience recorded as reportable events. In both cases, plant experience is combined with published generic data using Bayesian techniques. Details are provided on the gathering of information and on the checks for consistency and needs for adjustment, with examples of the results obtained. (Author)
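The Bayesian combination of generic data with plant experience mentioned above is conventionally done with a conjugate gamma-Poisson update for failure rates: a generic gamma prior updated with the plant's observed failure count over its exposure time. The prior parameters and evidence below are invented for illustration, not Laguna Verde data:

```python
def gamma_poisson_update(alpha, beta, n_failures, exposure_hours):
    # Conjugate Bayesian update for a constant failure rate (failures/hour):
    # gamma(alpha, beta) prior + Poisson evidence of n failures in T hours
    # -> gamma(alpha + n, beta + T) posterior, with mean alpha'/beta'.
    post_alpha = alpha + n_failures
    post_beta = beta + exposure_hours
    return post_alpha, post_beta, post_alpha / post_beta

# Generic prior with mean 1e-5 failures/h (alpha=0.5, beta=5e4 h, invented);
# plant-specific evidence: 2 failures in 100,000 operating hours.
alpha_p, beta_p, rate = gamma_poisson_update(0.5, 5.0e4, 2, 1.0e5)
```

The posterior mean (here 2.5/150000 ≈ 1.7e-5 failures/h) sits between the generic prior mean and the raw plant estimate, weighted by how much plant evidence is available.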

  3. Construction of patient specific atlases from locally most similar anatomical pieces

    Science.gov (United States)

    Ramus, Liliane; Commowick, Olivier; Malandain, Grégoire

    2010-01-01

    Radiotherapy planning requires accurate delineations of the critical structures. To avoid manual contouring, atlas-based segmentation can be used to get automatic delineations. However, the results strongly depend on the chosen atlas, especially for the head and neck region where the anatomical variability is high. To address this problem, atlases adapted to the patient’s anatomy may allow for a better registration, and already showed an improvement in segmentation accuracy. However, building such atlases requires the definition of a criterion to select among a database the images that are the most similar to the patient. Moreover, the inter-expert variability of manual contouring may be high, and therefore bias the segmentation if selecting only one image for each region. To tackle these issues, we present an original method to design a piecewise most similar atlas. Given a query image, we propose an efficient criterion to select for each anatomical region the K most similar images among a database by considering local volume variations possibly induced by the tumor. Then, we present a new approach to combine the K images selected for each region into a piecewise most similar template. Our results obtained with 105 CT images of the head and neck show that our method reduces the over-segmentation seen with an average atlas while being robust to inter-expert manual segmentation variability. PMID:20879395

  4. Multi-atlas labeling with population-specific template and non-local patch-based label fusion

    DEFF Research Database (Denmark)

    Fonov, Vladimir; Coupé, Pierrick; Eskildsen, Simon Fristed

    We propose a new method combining a population-specific nonlinear template atlas approach with non-local patch-based structure segmentation for whole brain segmentation into individual structures. This way, we benefit from the efficient intensity-driven segmentation of the non-local means framework and from the global shape constraints imposed by the nonlinear template matching.

  5. Evaluation of Probabilistic Disease Forecasts.

    Science.gov (United States)

    Hughes, Gareth; Burnett, Fiona J

    2017-10-01

    The statistical evaluation of probabilistic disease forecasts often involves calculation of metrics defined conditionally on disease status, such as sensitivity and specificity. However, for the purpose of disease management decision making, metrics defined conditionally on the result of the forecast-predictive values-are also important, although less frequently reported. In this context, the application of scoring rules in the evaluation of probabilistic disease forecasts is discussed. An index of separation with application in the evaluation of probabilistic disease forecasts, described in the clinical literature, is also considered and its relation to scoring rules illustrated. Scoring rules provide a principled basis for the evaluation of probabilistic forecasts used in plant disease management. In particular, the decomposition of scoring rules into interpretable components is an advantageous feature of their application in the evaluation of disease forecasts.
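The metrics contrasted in this record, those conditioned on disease status (sensitivity, specificity) versus those conditioned on the forecast result (predictive values), plus the Brier score as a representative scoring rule, can all be computed from paired forecasts and outcomes. The counts and probabilities below are invented:

```python
def confusion_metrics(tp, fp, fn, tn):
    # Metrics conditional on disease status vs. on the forecast result.
    return {
        "sensitivity": tp / (tp + fn),   # P(positive forecast | diseased)
        "specificity": tn / (tn + fp),   # P(negative forecast | healthy)
        "ppv": tp / (tp + fp),           # P(diseased | positive forecast)
        "npv": tn / (tn + fn),           # P(healthy | negative forecast)
    }

def brier_score(probs, outcomes):
    # Mean squared error of probabilistic forecasts against 0/1 outcomes;
    # lower is better, and the Brier score is a strictly proper scoring rule.
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

m = confusion_metrics(tp=30, fp=10, fn=5, tn=55)
b = brier_score([0.9, 0.2, 0.7, 0.1], [1, 0, 1, 0])
```

Note that sensitivity and specificity are unchanged if disease prevalence changes, while the predictive values are not, which is one reason both families of metrics matter for management decisions.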

  6. The analysis of probability task completion; Taxonomy of probabilistic thinking-based across gender in elementary school students

    Science.gov (United States)

    Sari, Dwi Ivayana; Budayasa, I. Ketut; Juniati, Dwi

    2017-08-01

    The formulation of mathematical learning goals is now oriented not only toward cognitive products but also toward cognitive processes, including probabilistic thinking, which students need in order to make decisions. Elementary school students are required to develop probabilistic thinking as a foundation for learning probability at higher levels. A framework of students' probabilistic thinking had been developed using the SOLO taxonomy, consisting of prestructural, unistructural, multistructural, and relational probabilistic thinking. This study aimed to analyze probability task completion based on this taxonomy of probabilistic thinking. The subjects were two fifth-grade students, a boy and a girl, selected on the basis of a mathematical ability test for high mathematical ability. The subjects were given probability tasks covering sample space, the probability of an event, and probability comparison. Data analysis consisted of categorization, reduction, interpretation, and conclusion; credibility of the data was established by time triangulation. The results showed that the boy's probabilistic thinking in completing the probability tasks was at the multistructural level, while the girl's was at the unistructural level, indicating that the boy's level of probabilistic thinking was higher than the girl's. These results could help curriculum developers formulate probability learning goals for elementary school students, and teachers could teach probability with regard to gender differences.

  7. Automated probabilistic reconstruction of white-matter pathways in health and disease using an atlas of the underlying anatomy

    Directory of Open Access Journals (Sweden)

    Anastasia eYendiki

    2011-10-01

    We have developed a method for automated probabilistic reconstruction of a set of major white-matter pathways from diffusion-weighted MR images. Our method is called TRACULA (TRActs Constrained by UnderLying Anatomy) and utilizes prior information on the anatomy of the pathways from a set of training subjects. By incorporating this prior knowledge in the reconstruction procedure, our method obviates the need for manual interaction with the tract solutions at a later stage and thus facilitates the application of tractography to large studies. In this paper we illustrate the application of the method on data from a schizophrenia study and investigate whether the inclusion of both patients and healthy subjects in the training set affects our ability to reconstruct the pathways reliably. We show that, since our method does not constrain the exact spatial location or shape of the pathways but only their trajectory relative to the surrounding anatomical structures, a set of healthy training subjects can be used to reconstruct the pathways accurately in patients as well as in controls.

  8. Construction of an in vivo human spinal cord atlas based on high-resolution MR images at cervical and thoracic levels: preliminary results.

    Science.gov (United States)

    Taso, Manuel; Le Troter, Arnaud; Sdika, Michaël; Ranjeva, Jean-Philippe; Guye, Maxime; Bernard, Monique; Callot, Virginie

    2014-06-01

    Our goal was to build a probabilistic atlas and anatomical template of the human cervical and thoracic spinal cord (SC) that could be used for segmentation algorithm improvement, parametric group studies, and enrichment of biomechanical modelling. High-resolution axial T2*-weighted images were acquired at 3T on 15 healthy volunteers using a multi-echo gradient-echo sequence (1 slice per vertebral level from C1 to L2). After manual segmentation, linear and affine co-registrations were performed, providing either inter-individual morphometric variability maps or substructure probabilistic maps [CSF, white and grey matter (WM/GM)] and an anatomical SC template. The largest inter-individual morphometric variations were observed at the thoraco-lumbar levels and in the posterior GM. Mean SC diameters were in agreement with the literature and higher than post-mortem measurements. A representative SC MR template was generated, and values up to 90 and 100% were observed on the GM and WM probability maps. This work provides a probabilistic SC atlas and a template that offer great potential for parametric MRI analysis (DTI/MTR/fMRI) and group studies, similar to what has already been performed using brain atlases. It also offers great perspective for biomechanical models, which are usually based on post-mortem or generic data. Further work will consider integration into an automated SC segmentation pipeline.
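    The substructure probability maps described above follow the usual recipe: after coregistration, the probability that a voxel belongs to a structure is estimated as the fraction of subjects whose binary segmentation covers it. A toy sketch with hypothetical 1-D masks:

```python
import numpy as np

# Five coregistered binary segmentations of one structure (toy 1-D "volumes")
masks = np.array([
    [0, 1, 1, 1, 0],
    [0, 1, 1, 0, 0],
    [1, 1, 1, 1, 0],
    [0, 1, 1, 1, 0],
    [0, 1, 1, 1, 1],
])

prob_map = masks.mean(axis=0)   # voxelwise coverage frequency in [0, 1]
consensus = prob_map >= 0.5     # majority-vote shape, usable as a template mask
```

    Voxels where every subject's mask agrees get probability 1, as in the 100% WM values reported above; the quality of the result hinges on the coregistration step, which is the point the surrounding records make.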

  9. Anatomical guidance for functional near-infrared spectroscopy: AtlasViewer tutorial.

    Science.gov (United States)

    Aasted, Christopher M; Yücel, Meryem A; Cooper, Robert J; Dubb, Jay; Tsuzuki, Daisuke; Becerra, Lino; Petkov, Mike P; Borsook, David; Dan, Ippeita; Boas, David A

    2015-04-01

    Functional near-infrared spectroscopy (fNIRS) is an optical imaging method used to noninvasively measure cerebral hemoglobin concentration changes induced by brain activation. Using structural guidance in fNIRS research enhances the interpretation of results and facilitates comparisons between studies. AtlasViewer is an open-source software package we have developed that incorporates multiple spatial registration tools to enable structural guidance in the interpretation of fNIRS studies. We introduce the reader to the layout of the AtlasViewer graphical user interface, the folder structure, and the user files required to create fNIRS probes containing sources and detectors registered to desired locations on the head; we then describe how to evaluate probe fabrication error and intersubject probe placement variability, and present different procedures for estimating measurement sensitivity to different brain regions as well as image reconstruction performance. Further, we detail how AtlasViewer provides a generic head atlas for guiding interpretation of fNIRS results, but also permits users to provide subject-specific head anatomies to interpret their results. We anticipate that AtlasViewer will be a valuable tool in improving the anatomical interpretation of fNIRS studies.

  10. Development and selection of Asian-specific humeral implants based on statistical atlas: toward planning minimally invasive surgery.

    Science.gov (United States)

    Wu, K; Daruwalla, Z J; Wong, K L; Murphy, D; Ren, H

    2015-08-01

    Commercial humeral implants based on the Western population are currently not entirely compatible with Asian patients, owing to differences in bone size, shape and structure. Surgeons may have to compromise or use less conforming implants, which may cause complications as well as compromise implant positioning. The construction of Asian humerus atlases for different clusters has therefore been proposed to eradicate this problem and to facilitate planning of minimally invasive surgical procedures [6,31]. Based on the features of the atlases, new implants could be designed specifically for different patients. Furthermore, an automatic implant selection algorithm has been proposed to reduce the complications caused by implant-bone mismatch. Prior to implant design, data clustering and extraction of the relevant features were carried out on the datasets of each gender; the fuzzy C-means clustering method is explored in this paper. In addition, two new implant selection schemes, a Procrustes analysis-based scheme and a group average distance-based scheme, are proposed to better match implants from the database to new patients. Neither algorithm has previously been used in this area, yet both show excellent performance in implant selection. Additionally, algorithms to calculate matching scores between various implants and the patient data are proposed to assist the implant selection procedure. The results obtained indicate the feasibility of the proposed development and selection scheme. The 16 sets of male data were divided into two clusters of 8 subjects each, and the 11 female datasets were divided into two clusters of 5 and 6 subjects, respectively. Based on the features of each cluster, the implants designed by the proposed algorithm fit very well on their reference humeri and the proposed
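    The fuzzy C-means step mentioned above can be sketched in a few lines; this is a generic textbook implementation, not the authors' code, and the 1-D "bone feature" data below are invented:

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, n_iter=100, seed=0):
    """Plain fuzzy C-means: returns cluster centres and membership matrix U."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)          # memberships sum to 1 per sample
    for _ in range(n_iter):
        W = U ** m                             # fuzzified memberships
        centres = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2) + 1e-12
        U = d ** (-2.0 / (m - 1.0))            # standard membership update
        U /= U.sum(axis=1, keepdims=True)
    return centres, U

# Two well-separated groups of 1-D feature values (hypothetical measurements)
X = np.array([[0.0], [0.2], [0.4], [5.0], [5.2], [5.4]])
centres, U = fuzzy_c_means(X, c=2)
labels = U.argmax(axis=1)                      # hard assignment for inspection
```

    Unlike hard k-means, each subject retains a graded membership in every cluster, which is what makes the method "fuzzy"; taking `argmax` over memberships recovers a conventional partition when one is needed.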

  11. Cumulative Dominance and Probabilistic Sophistication

    NARCIS (Netherlands)

    Wakker, P.P.; Sarin, R.H.

    2000-01-01

    Machina & Schmeidler (Econometrica, 60, 1992) gave preference conditions for probabilistic sophistication, i.e. decision making where uncertainty can be expressed in terms of (subjective) probabilities without commitment to expected utility maximization. This note shows that simpler and more general

  12. Probabilistic finite elements

    Science.gov (United States)

    Belytschko, Ted; Wing, Kam Liu

    1987-01-01

    In the Probabilistic Finite Element Method (PFEM), finite element methods have been efficiently combined with second-order perturbation techniques to provide an effective method for informing the designer of the range of response that is likely in a given problem. The designer must provide as input the statistical character of the input variables, such as yield strength, load magnitude, and Young's modulus, by specifying their mean values and their variances. The output then consists of the mean response and the variance in the response. Thus the designer is given a much broader picture of the predicted performance than with a single response curve alone. These methods are applicable to a wide class of problems, provided that the scale of randomness is not too large and the probability density functions possess decaying tails. By incorporating the computational techniques we have developed in the past 3 years for efficiency, the probabilistic finite element methods are capable of handling large systems with many sources of uncertainty. Sample results are given for an elastic-plastic ten-bar structure and an elastic-plastic plane continuum with a circular hole, subject to cyclic loadings with the yield stress modeled as a random field.
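    A stripped-down version of the idea, reduced to first order and a single random input so it fits in a few lines (the bar dimensions and statistics are invented, and this is perturbation propagation in general, not the PFEM code itself): propagate the mean and variance of Young's modulus through the response function and check against Monte Carlo sampling.

```python
import numpy as np

# Tip displacement of an axial bar: u = F*L / (E*A)
F, L, A = 10e3, 2.0, 1e-3            # deterministic load [N], length [m], area [m^2]
E_mean, E_std = 200e9, 20e9          # Young's modulus treated as random [Pa]

u_mean = F * L / (E_mean * A)        # response evaluated at the mean input
dudE = -F * L / (E_mean**2 * A)      # sensitivity (derivative) at the mean
u_var = dudE**2 * E_std**2           # first-order variance estimate

# Cross-check the perturbation estimate against brute-force Monte Carlo
rng = np.random.default_rng(0)
E = rng.normal(E_mean, E_std, 1_000_000)
u = F * L / (E * A)
```

    With a 10% coefficient of variation the first-order estimate agrees with Monte Carlo to within a few percent; the second-order terms the abstract mentions tighten the mean estimate when the response is more strongly nonlinear in the random inputs.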

  13. Automated Multi-Atlas Segmentation of Hippocampal and Extrahippocampal Subregions in Alzheimer's Disease at 3T and 7T: What Atlas Composition Works Best?

    Science.gov (United States)

    Xie, Long; Shinohara, Russell T; Ittyerah, Ranjit; Kuijf, Hugo J; Pluta, John B; Blom, Kim; Kooistra, Minke; Reijmer, Yael D; Koek, Huiberdina L; Zwanenburg, Jaco J M; Wang, Hongzhi; Luijten, Peter R; Geerlings, Mirjam I; Das, Sandhitsu R; Biessels, Geert Jan; Wolk, David A; Yushkevich, Paul A; Wisse, Laura E M

    2018-01-01

    Multi-atlas segmentation, a popular technique implemented in the Automated Segmentation of Hippocampal Subfields (ASHS) software, utilizes multiple expert-labelled images ("atlases") to delineate medial temporal lobe substructures. As this multi-atlas method is increasingly being employed in early Alzheimer's disease (AD) research, it is becoming important to know how the composition of the atlas set, in terms of proportions of controls and patients with mild cognitive impairment (MCI) and/or AD, affects segmentation accuracy. Our aim was to evaluate whether the proportion of controls in the training set affects the segmentation accuracy of both controls and patients with MCI and/or early AD at 3T and 7T. We performed cross-validation experiments varying the proportion of control subjects in the training set, ranging from a patient-only to a control-only set. Segmentation accuracy on the test set was evaluated with the Dice similarity coefficient (DSC). A two-stage statistical analysis was applied to determine whether atlas composition is linked to segmentation accuracy in control subjects and patients, at 3T and 7T. The different atlas compositions did not significantly affect segmentation accuracy at 3T, or for patients at 7T. For controls at 7T, including more control subjects in the training set significantly improved segmentation accuracy, but only marginally, with a maximum improvement of 0.0003 DSC per percentage-point increase of control subjects in the training set. ASHS proved robust in this study, and the results indicate that future studies investigating hippocampal subfields in early AD populations can be flexible in the selection of their atlas compositions.
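    The Dice similarity coefficient used as the accuracy measure here is straightforward to compute from two binary masks; a minimal sketch with toy masks:

```python
import numpy as np

def dice(a, b):
    """Dice similarity coefficient, DSC = 2|A intersect B| / (|A| + |B|)."""
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

auto   = np.array([[1, 1, 0], [0, 1, 0]])   # automated segmentation (toy)
manual = np.array([[1, 0, 0], [0, 1, 1]])   # expert reference (toy)
# Overlap = 2 voxels, |A| = 3, |B| = 3, so DSC = 4/6
```

    A DSC of 1 means perfect overlap and 0 means none, which puts the 0.0003-per-percentage-point effect reported above into perspective.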

  14. Probabilistic numerical discrimination in mice.

    Science.gov (United States)

    Berkay, Dilara; Çavdaroğlu, Bilgehan; Balcı, Fuat

    2016-03-01

    Previous studies showed that both human and non-human animals can discriminate between different quantities (i.e., time intervals, numerosities) with a limited level of precision due to their endogenous/representational uncertainty. In addition, other studies have shown that subjects can modulate their temporal categorization responses adaptively by incorporating information gathered regarding probabilistic contingencies into their time-based decisions. Despite the psychophysical similarities between the interval timing and nonverbal counting functions, the sensitivity of count-based decisions to probabilistic information remains an unanswered question. In the current study, we investigated whether exogenous probabilistic information can be integrated into numerosity-based judgments by mice. In the task employed in this study, reward was presented either after few (i.e., 10) or many (i.e., 20) lever presses, the last of which had to be emitted on the lever associated with the corresponding trial type. In order to investigate the effect of probabilistic information on performance in this task, we manipulated the relative frequency of different trial types across different experimental conditions. We evaluated the behavioral performance of the animals under models that differed in terms of their assumptions regarding the cost of responding (e.g., logarithmically increasing vs. no response cost). Our results showed for the first time that mice could adaptively modulate their count-based decisions based on the experienced probabilistic contingencies in directions predicted by optimality.

  15. Undecidability of model-checking branching-time properties of stateless probabilistic pushdown process

    OpenAIRE

    Lin, T.

    2014-01-01

    In this paper, we settle a problem in probabilistic verification of infinite-state processes (specifically, probabilistic pushdown processes). We show that model checking stateless probabilistic pushdown processes (pBPA) against probabilistic computational tree logic (PCTL) is undecidable.

  16. Computational and mathematical methods in brain atlasing.

    Science.gov (United States)

    Nowinski, Wieslaw L

    2017-12-01

    Brain atlases have a wide range of use from education to research to clinical applications. Mathematical methods as well as computational methods and tools play a major role in the process of brain atlas building and developing atlas-based applications. Computational methods and tools cover three areas: dedicated editors for brain model creation, brain navigators supporting multiple platforms, and atlas-assisted specific applications. Mathematical methods in atlas building and developing atlas-aided applications deal with problems in image segmentation, geometric body modelling, physical modelling, atlas-to-scan registration, visualisation, interaction and virtual reality. Here I overview computational and mathematical methods in atlas building and developing atlas-assisted applications, and share my contribution to and experience in this field.

  17. Multiatlas whole heart segmentation of CT data using conditional entropy for atlas ranking and selection

    Energy Technology Data Exchange (ETDEWEB)

    Zhuang, Xiahai, E-mail: zhuangxiahai@sjtu.edu.cn; Qian, Xiaohua [SJTU-CU International Cooperative Research Center, Department of Engineering Mechanics, School of Naval Architecture Ocean and Civil Engineering, Shanghai Jiao Tong University, Shanghai 200240 (China); Bai, Wenjia; Shi, Wenzhe; Rueckert, Daniel [Biomedical Image Analysis Group, Department of Computing, Imperial College London, 180 Queens Gate, London SW7 2AZ (United Kingdom); Song, Jingjing; Zhan, Songhua [Shuguang Hospital Affiliated to Shanghai University of Traditional Chinese Medicine, Shanghai 201203 (China); Lian, Yanyun [Shanghai Advanced Research Institute, Chinese Academy of Sciences, Shanghai 201210 (China)

    2015-07-15

    Purpose: Cardiac computed tomography (CT) is widely used in clinical diagnosis of cardiovascular diseases. Whole heart segmentation (WHS) plays a vital role in developing new clinical applications of cardiac CT. However, the shape and appearance of the heart can vary greatly across different scans, making the automatic segmentation particularly challenging. The objective of this work is to develop and evaluate a multiatlas segmentation (MAS) scheme using a new atlas ranking and selection algorithm for automatic WHS of CT data. Research on different MAS strategies and their influence on WHS performance are limited. This work provides a detailed comparison study evaluating the impacts of label fusion, atlas ranking, and sizes of the atlas database on the segmentation performance. Methods: Atlases in a database were registered to the target image using a hierarchical registration scheme specifically designed for cardiac images. A subset of the atlases were selected for label fusion, according to the authors’ proposed atlas ranking criterion which evaluated the performance of each atlas by computing the conditional entropy of the target image given the propagated atlas labeling. Joint label fusion was used to combine multiple label estimates to obtain the final segmentation. The authors used 30 clinical cardiac CT angiography (CTA) images to evaluate the proposed MAS scheme and to investigate different segmentation strategies. Results: The mean WHS Dice score of the proposed MAS method was 0.918 ± 0.021, and the mean runtime for one case was 13.2 min on a workstation. This MAS scheme using joint label fusion generated significantly better Dice scores than the other label fusion strategies, including majority voting (0.901 ± 0.276, p < 0.01), locally weighted voting (0.905 ± 0.0247, p < 0.01), and probabilistic patch-based fusion (0.909 ± 0.0249, p < 0.01). In the atlas ranking study, the proposed criterion based on conditional entropy yielded a performance curve
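    The atlas-ranking criterion above, the conditional entropy of the target image given a propagated atlas labelling, can be estimated from a joint histogram of target intensities and labels; a simplified sketch (the binning choices here are my own, not the paper's):

```python
import numpy as np

def conditional_entropy(target, labels, bins=32):
    """H(T | L): entropy of target intensities given the propagated atlas
    labels, estimated from a joint histogram. Lower = better-fitting atlas."""
    edges = np.histogram_bin_edges(target, bins)
    t = np.digitize(np.asarray(target).ravel(), edges[1:-1])
    l = np.asarray(labels).ravel()
    joint = np.zeros((bins, l.max() + 1))
    np.add.at(joint, (t, l), 1.0)          # joint histogram of (T, L)
    p = joint / joint.sum()
    p_l = p.sum(axis=0)                    # marginal over labels
    with np.errstate(divide="ignore", invalid="ignore"):
        return float(-np.nansum(p * np.log(p / p_l)))
```

    If the propagated labelling predicts the target intensities well, knowing the label leaves little residual intensity entropy, so atlases can be ranked by ascending H(T | L) before label fusion.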

  18. A Probabilistic Analysis of Data Popularity in ATLAS Data Caching

    International Nuclear Information System (INIS)

    Titov, M; Záruba, G; De, K; Klimentov, A

    2012-01-01

    One of the most important aspects of any distributed computing system is efficient data replication across storage and computing centers, which guarantees high data availability and low resource-utilization cost. In this paper we propose a data distribution scheme for PanDA, the production and distributed analysis system of the ATLAS experiment. Our proposed scheme is based on an investigation of data usage. The paper therefore focuses on the main concepts of data popularity in the PanDA system and their utilization. Data popularity is represented as a set of parameters used to predict the future state of data in terms of popularity levels.

  19. Fully automatized renal parenchyma volumetry using a support vector machine based recognition system for subject-specific probability map generation in native MR volume data

    Science.gov (United States)

    Gloger, Oliver; Tönnies, Klaus; Mensel, Birger; Völzke, Henry

    2015-11-01

    In epidemiological studies as well as in clinical practice, the amount of medical image data produced has increased strongly over the last decade. In this context, organ segmentation in MR volume data has gained increasing attention for medical applications. Especially in large-scale population-based studies, organ volumetry is highly relevant, requiring exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automatized methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework of a two-step probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are subsequently refined using several extended segmentation strategies. We present a three-class support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high-quality subject-specific parenchyma probability maps. Several refinement strategies, including a final shape-based 3D level set segmentation technique, are used in subsequent processing modules to segment renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from the parenchymal volume, which is important for analyzing renal function. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches.
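    Fourier descriptors of the kind used as shape features here can be computed from the complex representation of a closed contour; a generic sketch, not the authors' exact normalisation:

```python
import numpy as np

def fourier_descriptors(contour, k=8):
    """Magnitudes of the first k Fourier coefficients of a closed 2-D contour,
    dropping the DC term (translation invariance) and dividing by |c_1|
    (scale invariance); taking magnitudes discards rotation and start point."""
    z = contour[:, 0] + 1j * contour[:, 1]   # encode (x, y) points as complex
    mag = np.abs(np.fft.fft(z))
    return mag[1:k + 1] / mag[1]

# Sanity check on a synthetic circle contour
t = np.linspace(0, 2 * np.pi, 64, endpoint=False)
circle = np.c_[np.cos(t), np.sin(t)]
fd = fourier_descriptors(circle)
# Scaling the contour leaves the normalised descriptors unchanged
assert np.allclose(fd, fourier_descriptors(3.0 * circle))
```

    These invariances are what make such descriptors usable as SVM features across subjects whose organs differ in position and size.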

  20. Fully automatized renal parenchyma volumetry using a support vector machine based recognition system for subject-specific probability map generation in native MR volume data

    International Nuclear Information System (INIS)

    Gloger, Oliver; Völzke, Henry; Tönnies, Klaus; Mensel, Birger

    2015-01-01

    In epidemiological studies as well as in clinical practice, the amount of medical image data produced has increased strongly over the last decade. In this context, organ segmentation in MR volume data has gained increasing attention for medical applications. Especially in large-scale population-based studies, organ volumetry is highly relevant, requiring exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automatized methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework of a two-step probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are subsequently refined using several extended segmentation strategies. We present a three-class support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high-quality subject-specific parenchyma probability maps. Several refinement strategies, including a final shape-based 3D level set segmentation technique, are used in subsequent processing modules to segment renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from the parenchymal volume, which is important for analyzing renal function. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches.

  1. A Probabilistic Analysis of the Sacco and Vanzetti Evidence

    CERN Document Server

    Kadane, Joseph B

    2011-01-01

    A Probabilistic Analysis of the Sacco and Vanzetti Evidence is a Bayesian analysis of the trial and post-trial evidence in the Sacco and Vanzetti case, based on subjectively determined probabilities and assumed relationships among evidential events. It applies the ideas of charting evidence and probabilistic assessment to this case, which is perhaps the ranking cause celebre in all of American legal history. Modern computation methods applied to inference networks are used to show how the inferential force of evidence in a complicated case can be graded. The authors employ probabilistic assess

  2. OCA-P, PWR Vessel Probabilistic Fracture Mechanics

    International Nuclear Information System (INIS)

    Cheverton, R.D.; Ball, D.G.

    2001-01-01

    1 - Description of program or function: OCA-P is a probabilistic fracture-mechanics code prepared specifically for evaluating the integrity of pressurized-water reactor vessels subjected to overcooling-accident loading conditions. Based on linear-elastic fracture mechanics, it has two- and limited three-dimensional flaw capability, and can treat cladding as a discrete region. Both deterministic and probabilistic analyses can be performed. For deterministic analysis, it is possible to conduct a search for critical values of the fluence and the nil-ductility reference temperature corresponding to incipient initiation of the initial flaw. The probabilistic portion of OCA-P is based on Monte Carlo techniques, and simulated parameters include fluence, flaw depth, fracture toughness, nil-ductility reference temperature, and concentrations of copper, nickel, and phosphorous. Plotting capabilities include the construction of critical-crack-depth diagrams (deterministic analysis) and a variety of histograms (probabilistic analysis). 2 - Method of solution: OCA-P accepts as input the reactor primary-system pressure and the reactor pressure-vessel downcomer coolant temperature, as functions of time in the specified transient. Then, the wall temperatures and stresses are calculated as a function of time and radial position in the wall, and the fracture-mechanics analysis is performed to obtain the stress intensity factors as a function of crack depth and time in the transient. In a deterministic analysis, values of the static crack initiation toughness and the crack arrest toughness are also calculated for all crack depths and times in the transient. A comparison of these values permits an evaluation of flaw behavior. For a probabilistic analysis, OCA-P generates a large number of reactor pressure vessels, each with a different combination of the various values of the parameters involved in the analysis of flaw behavior. For each of these vessels, a deterministic fracture
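    The Monte Carlo portion can be caricatured in a few lines: sample the uncertain inputs for each simulated vessel, compute an applied stress intensity factor, and count how often it exceeds the sampled toughness. All distributions and numbers below are invented for illustration and are not OCA-P's models:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200_000                                        # simulated vessels

# Invented input distributions (NOT the OCA-P models):
a    = rng.exponential(scale=15.0, size=n)         # flaw depth [mm]
K_Ic = rng.normal(loc=110.0, scale=20.0, size=n)   # initiation toughness [MPa*sqrt(m)]
sigma = 300.0                                      # assumed transient stress [MPa]

# Textbook surface-crack stress intensity factor, K_I = 1.12*sigma*sqrt(pi*a)
K_I = 1.12 * sigma * np.sqrt(np.pi * a / 1000.0)   # a converted from mm to m

p_initiation = np.mean(K_I > K_Ic)                 # fraction of vessels initiating
```

    In the real code the applied K_I comes from the transient thermal-stress analysis described above rather than a closed-form handbook formula, and the sampled parameters also include fluence, nil-ductility reference temperature and chemistry.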

  3. The ATLAS Analysis Model

    CERN Multimedia

    Amir Farbin

    The ATLAS Analysis Model is a continually developing vision of how to reconcile physics analysis requirements with the ATLAS offline software and computing model constraints. In the past year this vision has influenced the evolution of the ATLAS Event Data Model, the Athena software framework, and physics analysis tools. These developments, along with the October Analysis Model Workshop and the planning for CSC analyses have led to a rapid refinement of the ATLAS Analysis Model in the past few months. This article introduces some of the relevant issues and presents the current vision of the future ATLAS Analysis Model. Event Data Model The ATLAS Event Data Model (EDM) consists of several levels of details, each targeted for a specific set of tasks. For example the Event Summary Data (ESD) stores calorimeter cells and tracking system hits thereby permitting many calibration and alignment tasks, but will be only accessible at particular computing sites with potentially large latency. In contrast, the Analysis...

  4. White matter atlas of the human spinal cord with estimation of partial volume effect.

    Science.gov (United States)

    Lévy, S; Benhamou, M; Naaman, C; Rainville, P; Callot, V; Cohen-Adad, J

    2015-10-01

    Template-based analysis has proven to be an efficient, objective and reproducible way of extracting relevant information from multi-parametric MRI data. Using common atlases, it is possible to quantify MRI metrics within specific regions without the need for manual segmentation. This method is therefore free from user-bias and amenable to group studies. While template-based analysis is common procedure for the brain, there is currently no atlas of the white matter (WM) spinal pathways. The goals of this study were: (i) to create an atlas of the white matter tracts compatible with the MNI-Poly-AMU template and (ii) to propose methods to quantify metrics within the atlas that account for partial volume effect. The WM atlas was generated by: (i) digitalizing an existing WM atlas from a well-known source (Gray's Anatomy), (ii) registering this atlas to the MNI-Poly-AMU template at the corresponding slice (C4 vertebral level), (iii) propagating the atlas throughout all slices of the template (C1 to T6) using regularized diffeomorphic transformations and (iv) computing partial volume values for each voxel and each tract. Several approaches were implemented and validated to quantify metrics within the atlas, including weighted-average and Gaussian mixture models. Proof-of-concept application was done in five subjects for quantifying magnetization transfer ratio (MTR) in each tract of the atlas. The resulting WM atlas showed consistent topological organization and smooth transitions along the rostro-caudal axis. The median MTR across tracts was 26.2. Significant differences were detected across tracts, vertebral levels and subjects, but not across laterality (right-left). Among the different tested approaches to extract metrics, the maximum a posteriori showed highest performance with respect to noise, inter-tract variability, tract size and partial volume effect. This new WM atlas of the human spinal cord overcomes the biases associated with manual delineation and partial volume effect.
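    Of the metric-extraction approaches compared above, the weighted-average variant is the simplest to state: weight each voxel's metric value by the tract's partial-volume fraction there. A minimal sketch with invented MTR and partial-volume values:

```python
import numpy as np

def weighted_tract_metric(metric_map, tract_pv):
    """Partial-volume-weighted average of a metric (e.g. MTR) within one tract."""
    w = np.asarray(tract_pv, float).ravel()    # fraction of each voxel in the tract
    m = np.asarray(metric_map, float).ravel()
    return float(np.sum(w * m) / np.sum(w))

mtr = np.array([[30.0, 25.0], [20.0, 10.0]])   # toy metric values
pv  = np.array([[1.0, 0.5], [0.25, 0.0]])      # toy partial-volume weights
value = weighted_tract_metric(mtr, pv)
```

    Voxels entirely outside the tract (weight 0) contribute nothing, and edge voxels contribute in proportion to their overlap, which is precisely the partial-volume correction the atlas's per-voxel tract fractions enable; the maximum a posteriori approach favoured in the study additionally models the mixing of tract signals within each voxel.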

  5. Exploiting Tensor Rank-One Decomposition in Probabilistic Inference

    Czech Academy of Sciences Publication Activity Database

    Savický, Petr; Vomlel, Jiří

    2007-01-01

    Roč. 43, č. 5 (2007), s. 747-764 ISSN 0023-5954 R&D Projects: GA MŠk 1M0545; GA MŠk 1M0572; GA ČR GA201/04/0393 Institutional research plan: CEZ:AV0Z10300504; CEZ:AV0Z10750506 Keywords : graphical probabilistic models * probabilistic inference * tensor rank Subject RIV: BD - Theory of Information Impact factor: 0.552, year: 2007 http://dml.cz/handle/10338.dmlcz/135810

  6. Probabilistic Decision Graphs - Combining Verification and AI Techniques for Probabilistic Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2004-01-01

    We adopt probabilistic decision graphs developed in the field of automated verification as a tool for probabilistic model representation and inference. We show that probabilistic inference has linear time complexity in the size of the probabilistic decision graph, that the smallest probabilistic ...

  7. Integrating Networking into ATLAS

    CERN Document Server

    Mc Kee, Shawn Patrick; The ATLAS collaboration

    2018-01-01

    Networking is foundational to the ATLAS distributed infrastructure and there are many ongoing activities related to networking both within and outside of ATLAS. We will report on the progress in a number of areas exploring ATLAS's use of networking and our ability to monitor the network, analyze metrics from the network, and tune and optimize application and end-host parameters to make the most effective use of the network. Specific topics will include work on Open vSwitch for production systems, network analytics, FTS testing and tuning, and network problem alerting and alarming.

  8. Learning Probabilistic Logic Models from Probabilistic Examples.

    Science.gov (United States)

    Chen, Jianzhong; Muggleton, Stephen; Santos, José

    2008-10-01

    We revisit an application developed originally using abductive Inductive Logic Programming (ILP) for modeling inhibition in metabolic networks. The example data was derived from studies of the effects of toxins on rats using Nuclear Magnetic Resonance (NMR) time-trace analysis of their biofluids together with background knowledge representing a subset of the Kyoto Encyclopedia of Genes and Genomes (KEGG). We now apply two Probabilistic ILP (PILP) approaches - abductive Stochastic Logic Programs (SLPs) and PRogramming In Statistical modeling (PRISM) to the application. Both approaches support abductive learning and probability predictions. Abductive SLPs are a PILP framework that provides possible worlds semantics to SLPs through abduction. Instead of learning logic models from non-probabilistic examples as done in ILP, the PILP approach applied in this paper is based on a general technique for introducing probability labels within a standard scientific experimental setting involving control and treated data. Our results demonstrate that the PILP approach provides a way of learning probabilistic logic models from probabilistic examples, and the PILP models learned from probabilistic examples lead to a significant decrease in error accompanied by improved insight from the learned results compared with the PILP models learned from non-probabilistic examples.

  9. Deliverable D74.2. Probabilistic analysis methods for support structures

    DEFF Research Database (Denmark)

    Gintautas, Tomas

    2018-01-01

    Relevant Description: Report describing the probabilistic analysis for offshore substructures and results attained. This includes comparison with experimental data and with conventional design. Specific targets: 1) Estimate current reliability level of support structures 2) Development of basis for probabilistic calculations and evaluation of reliability for offshore support structures (substructures) 3) Development of a probabilistic model for stiffness and strength of soil parameters and for modeling geotechnical load bearing capacity 4) Comparison between probabilistic analysis and deterministic...

  10. Site-specific seismic probabilistic tsunami hazard analysis: performances and potential applications

    Science.gov (United States)

    Tonini, Roberto; Volpe, Manuela; Lorito, Stefano; Selva, Jacopo; Orefice, Simone; Graziani, Laura; Brizuela, Beatriz; Smedile, Alessandra; Romano, Fabrizio; De Martini, Paolo Marco; Maramai, Alessandra; Piatanesi, Alessio; Pantosti, Daniela

    2017-04-01

    Seismic Probabilistic Tsunami Hazard Analysis (SPTHA) provides probabilities to exceed different thresholds of tsunami hazard intensity, at a specific site or region and in a given time span, for tsunamis caused by seismic sources. Results obtained by SPTHA (i.e., probabilistic hazard curves and inundation maps) represent a very important input to risk analyses and land use planning. However, the large variability of source parameters implies the definition of a huge number of potential tsunami scenarios, whose omission could lead to a biased analysis. Moreover, tsunami propagation from source to target requires the use of very expensive numerical simulations. At regional scale, the computational cost can be reduced by using assumptions on the tsunami modeling (i.e., neglecting non-linear effects, using coarse topo-bathymetric meshes, empirically extrapolating maximum wave heights on the coast). On the other hand, moving to local scale, a much higher resolution is required and such assumptions no longer hold, since detailed inundation maps require significantly greater computational resources. In this work we apply a multi-step method to perform a site-specific SPTHA, which can be summarized in the following steps: i) perform a regional hazard assessment to account for both the aleatory and epistemic uncertainties of the seismic source, by combining the use of an event tree and an ensemble modeling technique; ii) apply a filtering procedure which uses cluster analysis to define a significantly reduced number of representative scenarios contributing to the hazard of a specific target site; iii) perform high resolution numerical simulations only for these representative scenarios and for a subset of near field sources placed in very shallow waters and/or whose coseismic displacements induce ground uplift or subsidence at the target.
The method is applied to three target areas in the Mediterranean located around the cities of Milazzo (Italy), Thessaloniki (Greece) and
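
Step (ii) of the method, reducing a large scenario set by cluster analysis, can be sketched with a plain k-means reduction. The parameterization and the numbers below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def representative_scenarios(params, k, iters=50, seed=0):
    """Cluster scenario parameter vectors with k-means and return, for each
    cluster, the index of the scenario closest to the centroid."""
    rng = np.random.default_rng(seed)
    n = len(params)
    centroids = params[rng.choice(n, size=k, replace=False)]
    for _ in range(iters):
        # assign each scenario to its nearest centroid
        d = np.linalg.norm(params[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # move centroids to cluster means (keep old centroid if cluster is empty)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = params[labels == j].mean(axis=0)
    reps = []
    for j in range(k):
        members = np.flatnonzero(labels == j)
        if members.size:
            dists = np.linalg.norm(params[members] - centroids[j], axis=1)
            reps.append(int(members[dists.argmin()]))
    return sorted(set(reps))

# 200 synthetic scenarios, each described by e.g. (magnitude, depth, strike)
rng = np.random.default_rng(1)
scenarios = rng.normal(size=(200, 3))
reps = representative_scenarios(scenarios, k=10)
```

Only the scenarios indexed by `reps` would then be run through the expensive high-resolution inundation simulations.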

  11. Are Autonomous and Controlled Motivations School-Subjects-Specific?

    Science.gov (United States)

    Chanal, Julien; Guay, Frédéric

    2015-01-01

    This research sought to test whether autonomous and controlled motivations are specific to school subjects or more general to the school context. In two cross-sectional studies, 252 elementary school children (43.7% male; mean age = 10.7 years, SD = 1.3 years) and 334 junior high school children (49.7% male, mean age = 14.07 years, SD = 1.01 years) were administered a questionnaire assessing their motivation for various school subjects. Results based on structural equation modeling using the correlated trait-correlated method minus one model (CTCM-1) showed that autonomous and controlled motivations assessed at the school subject level are not equally school-subject-specific. We found larger specificity effects for autonomous (intrinsic and identified) than for controlled (introjected and external) motivation. In both studies, results of factor loadings and the correlations with self-concept and achievement demonstrated that more evidence of specificity was obtained for autonomous regulations than for controlled ones. These findings suggest a new understanding of the hierarchical and multidimensional academic structure of autonomous and controlled motivations and of the mechanisms involved in the development of types of regulations for school subjects. PMID:26247788

  12. Adaptation of a 3D prostate cancer atlas for transrectal ultrasound guided target-specific biopsy

    International Nuclear Information System (INIS)

    Narayanan, R; Suri, J S; Werahera, P N; Barqawi, A; Crawford, E D; Shinohara, K; Simoneau, A R

    2008-01-01

    Due to the lack of imaging modalities to identify prostate cancer in vivo, current TRUS guided prostate biopsies are taken randomly. Consequently, many important cancers are missed during initial biopsies. The purpose of this study was to determine the potential clinical utility of a high-speed registration algorithm for a 3D prostate cancer atlas. This 3D prostate cancer atlas provides voxel-level likelihood of cancer and optimized biopsy locations on a template space (Zhan et al 2007). The atlas was constructed from 158 expert-annotated, 3D reconstructed radical prostatectomy specimens outlined for cancers (Shen et al 2004). For successful clinical implementation, the prostate atlas needs to be registered to each patient's TRUS image with high registration accuracy in a time-efficient manner. This is implemented in a two-step procedure: the segmentation of the prostate gland from a patient's TRUS image, followed by the registration of the prostate atlas. We have developed a fast registration algorithm suitable for clinical applications of this prostate cancer atlas. The registration algorithm was implemented on a graphics processing unit (GPU) to meet the critical processing speed requirements for atlas guided biopsy. A color overlay of the atlas superposed on the TRUS image was presented to help pick statistically likely regions known to harbor cancer. We validated our fast registration algorithm using computer simulations of two optimized 7- and 12-core biopsy protocols to maximize the overall detection rate. Using a GPU, patient TRUS image segmentation and atlas registration took less than 12 s. The prostate cancer atlas guided 7- and 12-core biopsy protocols had cancer detection rates of 84.81% and 89.87%, respectively, when validated on the same set of data, whereas the sextant biopsy approach without the 3D cancer atlas detected only 70.5% of the cancers using the same histology data. We estimate a 10-20% increase in prostate cancer detection rates

  13. Creation of computerized 3D MRI-integrated atlases of the human basal ganglia and thalamus

    Directory of Open Access Journals (Sweden)

    Abbas F. Sadikot

    2011-09-01

    Functional brain imaging and neurosurgery in subcortical areas often require visualization of brain nuclei beyond the resolution of current Magnetic Resonance Imaging (MRI) methods. We present techniques used to create: (1) a lower resolution 3D atlas, based on the Schaltenbrand and Wahren print atlas, which was integrated into a stereotactic neurosurgery planning and visualization platform (VIPER); and (2) a higher resolution 3D atlas derived from a single set of manually segmented histological slices containing nuclei of the basal ganglia, thalamus, basal forebrain and medial temporal lobe. Both atlases were integrated to a canonical MRI (Colin27) from a young male participant by manually identifying homologous landmarks. The lower resolution atlas was then warped to fit the MRI based on the identified landmarks. A pseudo-MRI representation of the high-resolution atlas was created, and a nonlinear transformation was calculated in order to match the atlas to the template MRI. The atlas can then be warped to match the anatomy of Parkinson's disease surgical candidates by using 3D automated nonlinear deformation methods. By way of functional validation of the atlas, the location of the sensory thalamus was correlated with stereotactic intraoperative physiological data. The positions of subthalamic electrodes in patients with Parkinson's disease were also evaluated in the atlas-integrated MRI space. Finally, probabilistic maps of subthalamic stimulation electrodes were developed, in order to allow group analysis of the location of contacts associated with the best motor outcomes. We have therefore developed, and are continuing to validate, a high-resolution computerized MRI-integrated 3D histological atlas, which is useful in functional neurosurgery, and for functional and anatomical studies of the human basal ganglia, thalamus and basal forebrain.

  14. Probabilistic approaches for geotechnical site characterization and slope stability analysis

    CERN Document Server

    Cao, Zijun; Li, Dianqing

    2017-01-01

    This is the first book to revisit geotechnical site characterization from a probabilistic point of view and provide rational tools to probabilistically characterize geotechnical properties and underground stratigraphy using limited information obtained from a specific site. The book not only provides new probabilistic approaches for geotechnical site characterization and slope stability analysis, but also tackles the difficulties in the practical implementation of these approaches. In addition, it develops efficient Monte Carlo simulation approaches for slope stability analysis and implements these approaches in a commonly available spreadsheet environment. These approaches and the software package are readily available to geotechnical practitioners, relieving them of the need to implement reliability computation algorithms themselves. Readers will find the information a non-specialist needs to determine project-specific statistics of geotechnical properties and to perform probabilistic analysis of slope stability.
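
The spreadsheet-style Monte Carlo slope stability analysis described here can be approximated in a few lines. Below is a hedged sketch using an infinite-slope factor-of-safety model with invented soil statistics; the book's actual models and parameters differ:

```python
import numpy as np

def prob_of_failure(n=100_000, seed=0):
    """Monte Carlo estimate of P(FS < 1) for an infinite-slope model with
    uncertain soil strength parameters (all values are illustrative)."""
    rng = np.random.default_rng(seed)
    beta = np.radians(30.0)                      # slope angle
    gamma, h = 18.0, 5.0                         # unit weight (kN/m^3), slip depth (m)
    c = rng.lognormal(np.log(10.0), 0.3, n)      # cohesion (kPa), lognormal
    phi = np.radians(rng.normal(32.0, 3.0, n))   # friction angle (deg), normal
    # infinite-slope factor of safety (dry slope, no pore pressure)
    fs = (c + gamma * h * np.cos(beta) ** 2 * np.tan(phi)) / (
        gamma * h * np.sin(beta) * np.cos(beta))
    return float((fs < 1.0).mean())

pf = prob_of_failure()
```

The same loop is what a spreadsheet implementation performs cell by cell; vectorizing it makes the sampling explicit and fast.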

  15. The evaluation of a population based diffusion tensor image atlas using a ground truth method

    Science.gov (United States)

    Van Hecke, Wim; Leemans, Alexander; D'Agostino, Emiliano; De Backer, Steve; Vandervliet, Evert; Parizel, Paul M.; Sijbers, Jan

    2008-03-01

    Purpose: Voxel based morphometry (VBM) is increasingly being used to detect diffusion tensor (DT) image abnormalities in patients with different pathologies. An important requisite for these VBM studies is the use of a high-dimensional, non-rigid coregistration technique, which is able to align both the spatial and the orientational information. Recent studies furthermore indicate that high-dimensional DT information should be included during coregistration for an optimal alignment. In this context, a population based DTI atlas is created that preserves the orientational DT information robustly and contains minimal bias towards any specific individual data set. Methods: A ground truth evaluation method is developed using a single subject DT image that is deformed with 20 deformation fields. Thereafter, an atlas is constructed from the 20 resulting images. The non-rigid coregistration algorithm is based on a viscous fluid model and on mutual information. The fractional anisotropy (FA) maps as well as the DT elements are used as DT image information during the coregistration algorithm, in order to minimize orientational alignment inaccuracies. Results: The population based DT atlas is compared with the ground truth image using accuracy and precision measures of spatially and orientationally dependent metrics. Results indicate that the population based atlas preserves the orientational information in a robust way. Conclusion: A subject independent population based DT atlas is constructed and evaluated with a ground truth method. This atlas contains all available orientational information and can be used in future VBM studies as a reference system.

  16. Bayesian longitudinal segmentation of hippocampal substructures in brain MRI using subject-specific atlases

    DEFF Research Database (Denmark)

    Iglesias, Juan Eugenio; Van Leemput, Koen; Augustinack, Jean

    2016-01-01

    and 23 controls), it found differences in atrophy rates between AD and controls that the cross sectional method could not detect in a number of subregions: right parasubiculum, left and right presubiculum, right subiculum, left dentate gyrus, left CA4, left HATA and right tail. In ADNI (836 subjects: 369...... not find: left presubiculum, right subiculum, left and right parasubiculum, left and right HATA. Moreover, many of the differences that the cross-sectional method already found were detected with higher significance. The presented algorithm will be made available as part of the open-source neuroimaging...... differences and significantly higher Dice overlaps than the cross-sectional approach for nearly every subregion (average across subregions: 4.5% vs. 6.5%, Dice overlap: 81.8% vs. 75.4%). The longitudinal algorithm also demonstrated increased sensitivity to group differences: in MIRIAD (69 subjects: 46 with AD...

  17. ATLAS Facility and Instrumentation Description Report

    International Nuclear Information System (INIS)

    Kang, Kyoung Ho; Moon, Sang Ki; Park, Hyun Sik

    2009-06-01

    A thermal-hydraulic integral effect test facility, ATLAS (Advanced Thermal-hydraulic Test Loop for Accident Simulation), has been constructed at KAERI (Korea Atomic Energy Research Institute). The ATLAS is a half-height and 1/288-volume scaled test facility with respect to the APR1400. The fluid system of the ATLAS consists of a primary system, a secondary system, a safety injection system, a break simulating system, a containment simulating system, and auxiliary systems. The primary system includes a reactor vessel, two hot legs, four cold legs, a pressurizer, four reactor coolant pumps, and two steam generators. The secondary system of the ATLAS is simplified to a circulating loop type. Most of the safety injection features of the APR1400 and the OPR1000 are incorporated into the safety injection system of the ATLAS. In the ATLAS test facility, about 1300 instruments are installed to precisely investigate the thermal-hydraulic behavior in simulations of the various test scenarios. This report describes the scaling methodology, the geometric data of the individual components, and the specifications and locations of the instrumentation specific to the simulation of a 50% DVI line break accident of the APR1400, supporting the 50th OECD/NEA International Standard Problem Exercise (ISP-50)

  18. Compression of Probabilistic XML Documents

    Science.gov (United States)

    Veldman, Irma; de Keijzer, Ander; van Keulen, Maurice

    Database techniques to store, query and manipulate data that contains uncertainty receive increasing research interest. Such uncertain database management systems (UDBMSs) can be classified according to their underlying data model: relational, XML, or RDF. We focus on uncertain XML DBMSs, with the Probabilistic XML model (PXML) of [10,9] as a representative example. The size of a PXML document is obviously a factor in performance. There are PXML-specific techniques to reduce the size, such as a push down mechanism that produces equivalent but more compact PXML documents. It can only be applied, however, where possibilities are dependent. For normal XML documents there also exist several techniques for compressing a document. Since Probabilistic XML is (a special form of) normal XML, it might benefit from these methods even more. In this paper, we show that existing compression mechanisms can be combined with PXML-specific compression techniques. We also show that the best compression rates are obtained with a combination of a PXML-specific technique and a rather simple generic DAG-compression technique.
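
Generic DAG-compression of the kind this paper combines with PXML-specific techniques amounts to hash-consing: identical subtrees are stored once and shared. A minimal sketch on a toy tuple-based tree representation (not the paper's actual data structures):

```python
def dag_compress(node, table=None):
    """Share identical XML subtrees.  A node is (tag, (children...)).
    Returns the canonical node and the table of unique subtrees."""
    if table is None:
        table = {}
    tag, children = node
    canon = (tag, tuple(dag_compress(c, table)[0] for c in children))
    # hash-consing: structurally identical subtrees collapse to one entry
    table.setdefault(canon, canon)
    return table[canon], table

leaf = ("name", ())
person = ("person", (leaf, ("age", ())))
doc = ("people", (person, person, person))   # three identical subtrees
root, table = dag_compress(doc)
```

The tree `doc` has 10 nodes, but the DAG stores only 4 unique subtrees; repeated children of the root become references to the same shared object.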

  19. Area-specific information processing in prefrontal cortex during a probabilistic inference task: a multivariate fMRI BOLD time series analysis.

    Directory of Open Access Journals (Sweden)

    Charmaine Demanuele

    Discriminating spatiotemporal stages of information processing involved in complex cognitive processes remains a challenge for neuroscience. This is especially so in the prefrontal cortex, whose subregions, such as the dorsolateral prefrontal (DLPFC), anterior cingulate (ACC) and orbitofrontal (OFC) cortices, are known to have differentiable roles in cognition. Yet it is much less clear how these subregions contribute to different cognitive processes required by a given task. To investigate this, we use functional MRI data recorded from a group of healthy adults during a "Jumping to Conclusions" probabilistic reasoning task. We used a novel approach combining multivariate test statistics with bootstrap-based procedures to discriminate between different task stages reflected in the fMRI blood oxygenation level dependent signal pattern and to unravel differences in task-related information encoded by these regions. Furthermore, we implemented a new feature extraction algorithm that selects voxels from any set of brain regions that are jointly maximally predictive about specific task stages. Using both the multivariate statistics approach and the algorithm that searches for maximally informative voxels, we show that during the Jumping to Conclusions task, the DLPFC and ACC contribute more to the decision making phase comprising the accumulation of evidence and probabilistic reasoning, while the OFC is more involved in choice evaluation and uncertainty feedback. Moreover, we show that in presumably non-task-related regions (temporal cortices) all the information there was about task processing could be extracted from just one voxel (indicating the unspecific nature of that information), while for prefrontal areas a wider multivariate pattern of activity was maximally informative. We present a new approach to reveal the different roles of brain regions during the processing of one task from multivariate activity patterns measured by fMRI.
This method can be a valuable

  20. Comparison of plant-specific probabilistic safety assessments and lessons learned

    International Nuclear Information System (INIS)

    Balfanz, H.P.; Berg, H.P.; Steininger, U.

    2001-01-01

    Probabilistic safety assessments (PSA) have been performed for all German nuclear power plants in operation. These assessments are mainly based on the recent German PSA guide and an earlier draft, respectively. However, a comparison of these PSAs shows differences in the results, which are discussed in this paper. Lessons learned from this comparison and further developments of the PSA methodology are described. (orig.) [de

  1. Mechanical construction and installation of the ATLAS tile calorimeter

    Czech Academy of Sciences Publication Activity Database

    Abdallah, J.; Adragna, P.; Alexa, C.; Lokajíček, Miloš; Němeček, Stanislav; Přibyl, Lukáš

    2013-01-01

    Roč. 8, Nov (2013), 1-26 ISSN 1748-0221 Institutional support: RVO:68378271 Keywords : calorimeter * ATLAS * iron * scintillation counter * central region * CERN Lab * rapidity * ATLAS * CERN LHC Coll Subject RIV: BG - Nuclear, Atomic and Molecular Physics, Colliders Impact factor: 1.526, year: 2013

  2. Estimation of mouse organ locations through registration of a statistical mouse atlas with micro-CT images.

    Science.gov (United States)

    Wang, Hongkai; Stout, David B; Chatziioannou, Arion F

    2012-01-01

    Micro-CT is widely used in preclinical studies of small animals. Due to the low soft-tissue contrast in typical studies, segmentation of soft tissue organs from noncontrast enhanced micro-CT images is a challenging problem. Here, we propose an atlas-based approach for estimating the major organs in mouse micro-CT images. A statistical atlas of major trunk organs was constructed based on 45 training subjects. The statistical shape model technique was used to include inter-subject anatomical variations. The shape correlations between different organs were described using a conditional Gaussian model. For registration, first the high-contrast organs in micro-CT images were registered by fitting the statistical shape model, while the low-contrast organs were subsequently estimated from the high-contrast organs using the conditional Gaussian model. The registration accuracy was validated based on 23 noncontrast-enhanced and 45 contrast-enhanced micro-CT images. Three different accuracy metrics (Dice coefficient, organ volume recovery coefficient, and surface distance) were used for evaluation. The Dice coefficients vary from 0.45 ± 0.18 for the spleen to 0.90 ± 0.02 for the lungs, the volume recovery coefficients vary from 0.96 ± 0.10 for the liver to 1.30 ± 0.75 for the spleen, the surface distances vary from 0.18 ± 0.01 mm for the lungs to 0.72 ± 0.42 mm for the spleen. The registration accuracy of the statistical atlas was compared with two publicly available single-subject mouse atlases, i.e., the MOBY phantom and the DIGIMOUSE atlas, and the results proved that the statistical atlas is more accurate than the single atlases. To evaluate the influence of the training subject size, different numbers of training subjects were used for atlas construction and registration. The results showed an improvement of the registration accuracy when more training subjects were used for the atlas construction. The statistical atlas-based registration was also compared with
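
The conditional Gaussian step used here, estimating the low-contrast organs from the fitted high-contrast ones, follows the standard formula mu_h + S_ho S_oo^{-1} (x_o - mu_o). A toy numerical sketch (the parameter vector and covariance below are invented, not the mouse atlas model):

```python
import numpy as np

def conditional_gaussian(mu, cov, idx_obs, x_obs):
    """Given a joint Gaussian over shape parameters, return the mean of the
    unobserved block conditioned on the observed block."""
    idx_obs = np.asarray(idx_obs)
    idx_hid = np.setdiff1d(np.arange(len(mu)), idx_obs)
    mu_o, mu_h = mu[idx_obs], mu[idx_hid]
    S_oo = cov[np.ix_(idx_obs, idx_obs)]
    S_ho = cov[np.ix_(idx_hid, idx_obs)]
    # conditional mean: mu_h + S_ho S_oo^{-1} (x_o - mu_o)
    return mu_h + S_ho @ np.linalg.solve(S_oo, x_obs - mu_o)

# toy joint model: two "high-contrast" parameters and one "low-contrast" one
mu = np.array([0.0, 0.0, 1.0])
cov = np.array([[1.0, 0.2, 0.5],
                [0.2, 1.0, 0.3],
                [0.5, 0.3, 1.0]])
est = conditional_gaussian(mu, cov, [0, 1], np.array([1.0, 2.0]))
```

In the paper's setting the observed block would be the fitted shape parameters of lungs and skeleton, and the hidden block those of organs like the spleen or liver.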

  3. PROBABILISTIC SEISMIC ASSESSMENT OF BASE-ISOLATED NPPS SUBJECTED TO STRONG GROUND MOTIONS OF TOHOKU EARTHQUAKE

    Directory of Open Access Journals (Sweden)

    AHMER ALI

    2014-10-01

    The probabilistic seismic performance of a standard Korean nuclear power plant (NPP) with an idealized isolation is investigated in the present work. A probabilistic seismic hazard analysis (PSHA) of the Wolsong site on the Korean peninsula is performed by considering peak ground acceleration (PGA) as an earthquake intensity measure. A procedure is reported on the categorization and selection of two sets of ground motions of the Tohoku earthquake, i.e. long-period and common as Set A and Set B respectively, for the nonlinear time history response analysis of the base-isolated NPP. Limit state values as multiples of the displacement responses of the NPP base isolation are considered for the fragility estimation. The seismic risk of the NPP is further assessed by incorporation of the rate of frequency exceedance and conditional failure probability curves. Furthermore, this framework attempts to show the unacceptable performance of the isolated NPP in terms of the probabilistic distribution and annual probability of limit states. The comparative results for long and common ground motions are discussed to contribute to the future safety of nuclear facilities against drastic events like Tohoku.

  4. Probabilistic seismic assessment of base-isolated NPPs subjected to strong ground motions of Tohoku earthquake

    Energy Technology Data Exchange (ETDEWEB)

    Ali, Ahmer; Hayah, Nadin Abu; Kim, Doo Kie [Dept. of Civil and Environmental Engineering, Kunsan National University, Kunsan (Korea, Republic of); Cho, Sung Gook [R and D Center, JACE KOREA Company, Gyeonggido (Korea, Republic of)

    2014-10-15

    The probabilistic seismic performance of a standard Korean nuclear power plant (NPP) with an idealized isolation is investigated in the present work. A probabilistic seismic hazard analysis (PSHA) of the Wolsong site on the Korean peninsula is performed by considering peak ground acceleration (PGA) as an earthquake intensity measure. A procedure is reported on the categorization and selection of two sets of ground motions of the Tohoku earthquake, i.e. long-period and common as Set A and Set B respectively, for the nonlinear time history response analysis of the base-isolated NPP. Limit state values as multiples of the displacement responses of the NPP base isolation are considered for the fragility estimation. The seismic risk of the NPP is further assessed by incorporation of the rate of frequency exceedance and conditional failure probability curves. Furthermore, this framework attempts to show the unacceptable performance of the isolated NPP in terms of the probabilistic distribution and annual probability of limit states. The comparative results for long and common ground motions are discussed to contribute to the future safety of nuclear facilities against drastic events like Tohoku.
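
The final risk step in both records, combining the rate of frequency exceedance with conditional failure probability (fragility) curves, can be sketched as a discrete convolution of a hazard curve with a lognormal fragility. All numbers below are illustrative, not the Wolsong site values:

```python
import numpy as np
from math import erf, log, sqrt

def frag(a, median, beta):
    """Lognormal fragility: probability of reaching the limit state at PGA a."""
    return 0.5 * (1.0 + erf(log(a / median) / (beta * sqrt(2.0))))

def annual_failure_rate(pga, exceed_rate, median, beta):
    """Convolve a hazard curve (annual rate of exceeding each PGA level) with
    the fragility curve: lambda_f = sum_i P(fail | a_i) * |d lambda_i|."""
    rate_increments = -np.diff(exceed_rate, append=0.0)  # rate in each PGA bin
    return float(sum(frag(a, median, beta) * dr
                     for a, dr in zip(pga, rate_increments)))

pga = np.array([0.1, 0.2, 0.4, 0.8, 1.6])               # g, illustrative
exceed_rate = np.array([1e-2, 4e-3, 1e-3, 2e-4, 2e-5])  # per year, illustrative
lam = annual_failure_rate(pga, exceed_rate, median=0.9, beta=0.4)
```

The result `lam` is the annual rate of the limit state; for isolated versus non-isolated designs the comparison would use the corresponding displacement-based fragility parameters.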

  5. Documentation design for probabilistic risk assessment

    International Nuclear Information System (INIS)

    Parkinson, W.J.; von Herrmann, J.L.

    1985-01-01

    This paper describes a framework for the documentation design of probabilistic risk assessments (PRA) and is based on the EPRI document NP-3470, "Documentation Design for Probabilistic Risk Assessment". The goals for PRA documentation are stated. Four audiences are identified which PRA documentation must satisfy, and the documentation components consistent with the needs of the various audiences are discussed, i.e., the Summary Report, the Executive Summary, the Main Report, and the Appendices. The authors recommend the documentation specifications discussed herein as guides rather than rigid definitions

  6. Probabilistic numerics and uncertainty in computations.

    Science.gov (United States)

    Hennig, Philipp; Osborne, Michael A; Girolami, Mark

    2015-07-08

    We deliver a call to arms for probabilistic numerical methods: algorithms for numerical tasks, including linear algebra, integration, optimization and solving differential equations, that return uncertainties in their calculations. Such uncertainties, arising from the loss of precision induced by numerical calculation with limited time or hardware, are important for much contemporary science and industry. Within applications such as climate science and astrophysics, the need to make decisions on the basis of computations with large and complex data has led to a renewed focus on the management of numerical uncertainty. We describe how several seminal classic numerical methods can be interpreted naturally as probabilistic inference. We then show that the probabilistic view suggests new algorithms that can flexibly be adapted to suit application specifics, while delivering improved empirical performance. We provide concrete illustrations of the benefits of probabilistic numerical algorithms on real scientific problems from astrometry and astronomical imaging, while highlighting open problems with these new algorithms. Finally, we describe how probabilistic numerical methods provide a coherent framework for identifying the uncertainty in calculations performed with a combination of numerical algorithms (e.g. both numerical optimizers and differential equation solvers), potentially allowing the diagnosis (and control) of error sources in computations.
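
A minimal example of the paper's theme, a numerical routine that reports an uncertainty alongside its answer, is Monte Carlo integration with a standard-error estimate (a generic sketch, not one of the paper's case studies):

```python
import numpy as np

def mc_integrate(f, a, b, n=200_000, seed=0):
    """Estimate the integral of f on [a, b], returning both the point
    estimate and its numerical uncertainty (standard error)."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(a, b, n)
    y = (b - a) * f(x)
    return y.mean(), y.std(ddof=1) / np.sqrt(n)

est, se = mc_integrate(np.sin, 0.0, np.pi)   # exact value is 2
```

A downstream decision can then weigh the reported standard error instead of treating the estimate as exact, which is precisely the behavior the authors advocate for numerical pipelines.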

  7. Two-stage atlas subset selection in multi-atlas based image segmentation

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, Tingting, E-mail: tingtingzhao@mednet.ucla.edu; Ruan, Dan, E-mail: druan@mednet.ucla.edu [The Department of Radiation Oncology, University of California, Los Angeles, California 90095 (United States)

    2015-06-15

    Purpose: Fast growing access to large databases and cloud stored data presents a unique opportunity for multi-atlas based image segmentation and also presents challenges in heterogeneous atlas quality and computation burden. This work aims to develop a novel two-stage method tailored to the special needs in the face of large atlas collection with varied quality, so that high-accuracy segmentation can be achieved with low computational cost. Methods: An atlas subset selection scheme is proposed to substitute a significant portion of the computationally expensive full-fledged registration in the conventional scheme with a low-cost alternative. More specifically, the authors introduce a two-stage atlas subset selection method. In the first stage, an augmented subset is obtained based on a low-cost registration configuration and a preliminary relevance metric; in the second stage, the subset is further narrowed down to a fusion set of desired size, based on full-fledged registration and a refined relevance metric. An inference model is developed to characterize the relationship between the preliminary and refined relevance metrics, and a proper augmented subset size is derived to ensure that the desired atlases survive the preliminary selection with high probability. Results: The performance of the proposed scheme has been assessed with cross validation based on two clinical datasets consisting of manually segmented prostate and brain magnetic resonance images, respectively. The proposed scheme demonstrates comparable end-to-end segmentation performance as the conventional single-stage selection method, but with significant computation reduction. Compared with the alternative computation reduction method, their scheme improves the mean and median Dice similarity coefficient value from (0.74, 0.78) to (0.83, 0.85) and from (0.82, 0.84) to (0.95, 0.95) for prostate and corpus callosum segmentation, respectively, with statistical significance. Conclusions: The authors

  8. Two-stage atlas subset selection in multi-atlas based image segmentation.

    Science.gov (United States)

    Zhao, Tingting; Ruan, Dan

    2015-06-01

    Fast growing access to large databases and cloud stored data presents a unique opportunity for multi-atlas based image segmentation and also presents challenges in heterogeneous atlas quality and computation burden. This work aims to develop a novel two-stage method tailored to the special needs in the face of large atlas collection with varied quality, so that high-accuracy segmentation can be achieved with low computational cost. An atlas subset selection scheme is proposed to substitute a significant portion of the computationally expensive full-fledged registration in the conventional scheme with a low-cost alternative. More specifically, the authors introduce a two-stage atlas subset selection method. In the first stage, an augmented subset is obtained based on a low-cost registration configuration and a preliminary relevance metric; in the second stage, the subset is further narrowed down to a fusion set of desired size, based on full-fledged registration and a refined relevance metric. An inference model is developed to characterize the relationship between the preliminary and refined relevance metrics, and a proper augmented subset size is derived to ensure that the desired atlases survive the preliminary selection with high probability. The performance of the proposed scheme has been assessed with cross validation based on two clinical datasets consisting of manually segmented prostate and brain magnetic resonance images, respectively. The proposed scheme demonstrates comparable end-to-end segmentation performance as the conventional single-stage selection method, but with significant computation reduction. Compared with the alternative computation reduction method, their scheme improves the mean and median Dice similarity coefficient value from (0.74, 0.78) to (0.83, 0.85) and from (0.82, 0.84) to (0.95, 0.95) for prostate and corpus callosum segmentation, respectively, with statistical significance. The authors have developed a novel two-stage atlas

  9. Two-stage atlas subset selection in multi-atlas based image segmentation

    International Nuclear Information System (INIS)

    Zhao, Tingting; Ruan, Dan

    2015-01-01

    Purpose: Fast growing access to large databases and cloud stored data presents a unique opportunity for multi-atlas based image segmentation and also presents challenges in heterogeneous atlas quality and computation burden. This work aims to develop a novel two-stage method tailored to the special needs in the face of large atlas collection with varied quality, so that high-accuracy segmentation can be achieved with low computational cost. Methods: An atlas subset selection scheme is proposed to substitute a significant portion of the computationally expensive full-fledged registration in the conventional scheme with a low-cost alternative. More specifically, the authors introduce a two-stage atlas subset selection method. In the first stage, an augmented subset is obtained based on a low-cost registration configuration and a preliminary relevance metric; in the second stage, the subset is further narrowed down to a fusion set of desired size, based on full-fledged registration and a refined relevance metric. An inference model is developed to characterize the relationship between the preliminary and refined relevance metrics, and a proper augmented subset size is derived to ensure that the desired atlases survive the preliminary selection with high probability. Results: The performance of the proposed scheme has been assessed with cross validation based on two clinical datasets consisting of manually segmented prostate and brain magnetic resonance images, respectively. The proposed scheme demonstrates comparable end-to-end segmentation performance as the conventional single-stage selection method, but with significant computation reduction. Compared with the alternative computation reduction method, their scheme improves the mean and median Dice similarity coefficient value from (0.74, 0.78) to (0.83, 0.85) and from (0.82, 0.84) to (0.95, 0.95) for prostate and corpus callosum segmentation, respectively, with statistical significance. Conclusions: The authors
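
The two-stage selection idea common to these three records, a cheap preliminary ranking followed by a full-fledged ranking of the survivors, can be sketched with sum-of-squared-differences similarity on synthetic images. The metrics and image sizes here are simple stand-ins for the registration-based relevance metrics in the paper:

```python
import numpy as np

def two_stage_select(target, atlases, m, n):
    """Stage 1: rank all atlases with a cheap low-resolution similarity and
    keep an augmented subset of size m.  Stage 2: re-rank only those m with a
    refined full-resolution similarity and keep the fusion set of size n."""
    def ssd(a, b):  # sum of squared differences (lower = more similar)
        return float(((a - b) ** 2).sum())
    def coarse(img):  # cheap surrogate: 32x32 -> 8x8 block average
        return img.reshape(8, 4, 8, 4).mean(axis=(1, 3))
    stage1 = sorted(range(len(atlases)),
                    key=lambda i: ssd(coarse(atlases[i]), coarse(target)))[:m]
    stage2 = sorted(stage1, key=lambda i: ssd(atlases[i], target))[:n]
    return stage1, stage2

# synthetic "atlases": the target corrupted by increasing amounts of noise
rng = np.random.default_rng(0)
target = rng.normal(size=(32, 32))
atlases = [target + rng.normal(scale=s, size=(32, 32))
           for s in np.linspace(0.1, 2.0, 40)]
augmented, fusion = two_stage_select(target, atlases, m=10, n=4)
```

Only the `m` survivors of the cheap stage ever see the expensive metric, which is where the computation reduction comes from; the inference model in the paper is what justifies choosing `m` large enough for the truly relevant atlases to survive stage 1.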

  10. bayesPop: Probabilistic Population Projections

    Directory of Open Access Journals (Sweden)

    Hana Ševčíková

    2016-12-01

    Full Text Available We describe bayesPop, an R package for producing probabilistic population projections for all countries. This uses probabilistic projections of total fertility and life expectancy generated by Bayesian hierarchical models. It produces a sample from the joint posterior predictive distribution of future age- and sex-specific population counts, fertility rates and mortality rates, as well as future numbers of births and deaths. It provides graphical ways of summarizing this information, including trajectory plots and various kinds of probabilistic population pyramids. An expression language is introduced which allows the user to produce the predictive distribution of a wide variety of derived population quantities, such as the median age or the old age dependency ratio. The package produces aggregated projections for sets of countries, such as UN regions or trading blocs. The methodology has been used by the United Nations to produce their most recent official population projections for all countries, published in the World Population Prospects.
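bayesPop itself is an R package; as a language-neutral sketch in Python of the underlying idea, a sample from a posterior predictive distribution of future population counts can be summarized into a median projection and probability intervals. All numbers below are simulated stand-ins, not bayesPop output.

```python
import numpy as np

rng = np.random.default_rng(42)
n_traj, n_years = 1000, 5   # 1000 predictive trajectories, 5 future time points
pop0 = 10_000_000           # current population count (illustrative)

# Uncertain annual growth rates stand in for the fertility/mortality draws
# that a Bayesian hierarchical model would supply.
growth = rng.normal(0.01, 0.005, size=(n_traj, n_years))
trajectories = pop0 * np.cumprod(1 + growth, axis=1)

median = np.median(trajectories, axis=0)
lo, hi = np.percentile(trajectories, [10, 90], axis=0)  # 80% probability interval
print(median.astype(int))
print(lo.astype(int), hi.astype(int))
```

Derived quantities (median age, dependency ratios, aggregates over country sets) are computed trajectory-by-trajectory in the same way, then summarized.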

  11. bayesPop: Probabilistic Population Projections

    Science.gov (United States)

    Ševčíková, Hana; Raftery, Adrian E.

    2016-01-01

    We describe bayesPop, an R package for producing probabilistic population projections for all countries. This uses probabilistic projections of total fertility and life expectancy generated by Bayesian hierarchical models. It produces a sample from the joint posterior predictive distribution of future age- and sex-specific population counts, fertility rates and mortality rates, as well as future numbers of births and deaths. It provides graphical ways of summarizing this information, including trajectory plots and various kinds of probabilistic population pyramids. An expression language is introduced which allows the user to produce the predictive distribution of a wide variety of derived population quantities, such as the median age or the old age dependency ratio. The package produces aggregated projections for sets of countries, such as UN regions or trading blocs. The methodology has been used by the United Nations to produce their most recent official population projections for all countries, published in the World Population Prospects. PMID:28077933

  12. ATLAS Award for Difficult Task

    CERN Multimedia

    2004-01-01

    Two Russian companies were honoured with an ATLAS Award last week for the supply of the ATLAS Inner Detector barrel support structure elements. On 23 March the Russian company ORPE Technologiya and its subcontractor, RSP Khrunitchev, were jointly presented with an ATLAS Supplier Award. Since 1998, ORPE Technologiya has been actively involved in the development of the carbon-fibre reinforced plastic elements of the ATLAS Inner Detector barrel support structure. After three years of joint research and development, CERN and ORPE Technologiya launched the manufacturing contract. It had a tight delivery schedule and very demanding specifications in terms of mechanical tolerance and stability. The contract was successfully completed with the arrival of the last element of the structure at CERN on 8 January 2004. The delivery of this key component of the Inner Detector deserves an ATLAS Award given the difficulty of manufacturing the end-frames, which very few companies in the world would have been able to do at an ...

  13. Automated Loads Analysis System (ATLAS)

    Science.gov (United States)

    Gardner, Stephen; Frere, Scot; O’Reilly, Patrick

    2013-01-01

    ATLAS is a generalized solution that can be used for launch vehicles. ATLAS is used to produce modal transient analysis and quasi-static analysis results (i.e., accelerations, displacements, and forces) for payload math models on a specific Shuttle Transport System (STS) flight, using the shuttle math model and associated forcing functions. This innovation solves the problem of coupling payload math models into a shuttle math model. It performs a transient loads analysis simulating liftoff, landing, and all flight events between liftoff and landing. ATLAS utilizes efficient and numerically stable algorithms available in MSC/NASTRAN.

  14. Probabilistic Power Flow Simulation allowing Temporary Current Overloading

    NARCIS (Netherlands)

    W.S. Wadman (Wander); G. Bloemhof; D.T. Crommelin (Daan); J.E. Frank (Jason)

    2012-01-01

    This paper presents a probabilistic power flow model subject to connection temperature constraints. Renewable power generation is included and modelled stochastically in order to reflect its intermittent nature. In contrast to conventional models that enforce connection current

  15. Next-generation probabilistic seismicity forecasting

    Energy Technology Data Exchange (ETDEWEB)

    Hiemer, S.

    2014-07-01

    The development of probabilistic seismicity forecasts is one of the most important tasks of seismologists at present. Such forecasts form the basis of probabilistic seismic hazard assessment, a widely used approach to generating ground motion exceedance maps. These hazard maps guide the development of building codes, and in the absence of the ability to deterministically predict earthquakes, good building and infrastructure planning is key to preventing catastrophes. Probabilistic seismicity forecasts are models that specify the occurrence rate of earthquakes as a function of space, time and magnitude. The models presented in this thesis are time-invariant mainshock occurrence models. Accordingly, reliable estimation of the spatial and size distributions of seismicity is of crucial importance when constructing such probabilistic forecasts. We therefore focus on data-driven approaches to inferring these distributions, circumventing the need for arbitrarily chosen external parameters and subjective expert decisions. Kernel estimation has been shown to appropriately transform discrete earthquake locations into spatially continuous probability distributions. However, we show that neglecting the information from fault networks constitutes a considerable shortcoming and thus limits the skill of current seismicity models. We present a novel earthquake rate forecast that applies the kernel-smoothing method to both past earthquake locations and slip rates on mapped crustal faults, using Californian and European data. Our model is free from biases caused by commonly used non-objective seismic zonations, which impose artificial borders of activity that are not expected in nature. Studying the spatial variability of the seismicity size distribution is of great importance. The b-value of the well-established empirical Gutenberg-Richter model forecasts the rates of hazard-relevant large earthquakes based on the observed rates of abundant small events.
We propose a
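The b-value mentioned above is commonly estimated with the Aki maximum-likelihood formula; a minimal sketch follows (this is the standard continuous-magnitude estimator, not necessarily the thesis's exact implementation, and the synthetic catalog is illustrative).

```python
import numpy as np

def b_value_ml(mags, m_min):
    """Aki (1965) maximum-likelihood b-value for the Gutenberg-Richter law
    log10 N(>=M) = a - b*M, using events with magnitude >= m_min.
    (Continuous-magnitude form, no binning correction.)"""
    m = np.asarray(mags, dtype=float)
    m = m[m >= m_min]
    return np.log10(np.e) / (m.mean() - m_min)

# Synthetic catalog with a true b-value of 1.0: magnitudes above the
# completeness magnitude m_min = 2.0 are exponentially distributed.
rng = np.random.default_rng(1)
beta = 1.0 * np.log(10)                       # b = beta / ln(10)
mags = 2.0 + rng.exponential(1 / beta, size=100_000)
print(round(b_value_ml(mags, 2.0), 2))        # close to the true value 1.0
```

Spatial variability of the b-value is then studied by applying such an estimator to events selected around each grid node.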

  16. Next-generation probabilistic seismicity forecasting

    International Nuclear Information System (INIS)

    Hiemer, S.

    2014-01-01

    The development of probabilistic seismicity forecasts is one of the most important tasks of seismologists at present. Such forecasts form the basis of probabilistic seismic hazard assessment, a widely used approach to generating ground motion exceedance maps. These hazard maps guide the development of building codes, and in the absence of the ability to deterministically predict earthquakes, good building and infrastructure planning is key to preventing catastrophes. Probabilistic seismicity forecasts are models that specify the occurrence rate of earthquakes as a function of space, time and magnitude. The models presented in this thesis are time-invariant mainshock occurrence models. Accordingly, reliable estimation of the spatial and size distributions of seismicity is of crucial importance when constructing such probabilistic forecasts. We therefore focus on data-driven approaches to inferring these distributions, circumventing the need for arbitrarily chosen external parameters and subjective expert decisions. Kernel estimation has been shown to appropriately transform discrete earthquake locations into spatially continuous probability distributions. However, we show that neglecting the information from fault networks constitutes a considerable shortcoming and thus limits the skill of current seismicity models. We present a novel earthquake rate forecast that applies the kernel-smoothing method to both past earthquake locations and slip rates on mapped crustal faults, using Californian and European data. Our model is free from biases caused by commonly used non-objective seismic zonations, which impose artificial borders of activity that are not expected in nature. Studying the spatial variability of the seismicity size distribution is of great importance. The b-value of the well-established empirical Gutenberg-Richter model forecasts the rates of hazard-relevant large earthquakes based on the observed rates of abundant small events.
We propose a

  17. Multi-dimensional Analysis for SLB Transient in ATLAS Facility as Activity of DSP (Domestic Standard Problem)

    International Nuclear Information System (INIS)

    Bae, B. U.; Park, Y. S.; Kim, J. R.; Kang, K. H.; Choi, K. Y.; Sung, H. J.; Hwang, M. J.; Kang, D. H.; Lim, S. G.; Jun, S. S.

    2015-01-01

    Participants in DSP-03 were divided into three groups, each focusing on a specific subject related to enhancement of the code analysis. Group A investigated the scaling capability of the ATLAS test data by comparison with a code analysis of the prototype, and group C investigated the effect of various models in the one-dimensional codes. This paper briefly summarizes the code analysis results from the group B participants in DSP-03 for the ATLAS test facility. The code analysis by group B focused on investigating the multi-dimensional thermal-hydraulic phenomena in the ATLAS facility during the SLB transient. Even though a one-dimensional system analysis code cannot simulate the whole ATLAS facility with a nodalization at CFD (Computational Fluid Dynamics) scale, the reactor pressure vessel can be modeled with multi-dimensional components to reflect the thermal mixing phenomena inside the downcomer and the core. Also, CFD can give useful information for understanding complex phenomena in specific components such as the reactor pressure vessel. In the analysis activity of group B in ATLAS DSP-03, participants adopted a multi-dimensional approach to the code analysis of the SLB transient in the ATLAS test facility. The main purpose of the analysis was to investigate the prediction capability of multi-dimensional analysis tools for the SLB experiment results. In particular, the asymmetric cooling and thermal mixing phenomena in the reactor pressure vessel were a significant focus for modeling the multi-dimensional components

  18. A probabilistic approach to delineating functional brain regions

    DEFF Research Database (Denmark)

    Kalbitzer, Jan; Svarer, Claus; Frokjaer, Vibe G

    2009-01-01

    The purpose of this study was to develop a reliable observer-independent approach to delineating volumes of interest (VOIs) for functional brain regions that are not identifiable on structural MR images. The case is made for the raphe nuclei, a collection of nuclei situated in the brain stem known to be densely packed with serotonin transporters (5-hydroxytryptaminic [5-HTT] system). METHODS: A template set for the raphe nuclei, based on their high content of 5-HTT as visualized in parametric (11)C-labeled 3-amino-4-(2-dimethylaminomethyl-phenylsulfanyl)-benzonitrile PET images, was created for 10 healthy subjects. The templates were subsequently included in the region sets used in a previously published automatic MRI-based approach to create an observer- and activity-independent probabilistic VOI map. The probabilistic map approach was tested in a different group of 10 subjects and compared...
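A probabilistic VOI map of this kind can be sketched as the voxel-wise average of binary VOI masks from several subjects after spatial normalization. The random masks and the 50% consensus threshold below are illustrative stand-ins, not the published method's parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
n_subjects = 10

# Binary VOI masks in a common template space (random stand-ins here).
masks = rng.random((n_subjects, 16, 16, 16)) > 0.7

# Voxel value = fraction of subjects whose VOI covers that voxel.
prob_map = masks.mean(axis=0)

# Thresholding gives an observer-independent consensus VOI.
consensus_voi = prob_map >= 0.5
print(prob_map.max(), int(consensus_voi.sum()))
```

The probability map can also be used directly as a weight image when extracting regional PET values, rather than being thresholded.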

  19. Probabilistic insurance

    OpenAIRE

    Wakker, P.P.; Thaler, R.H.; Tversky, A.

    1997-01-01

    Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in the premium to compensate for a 1% default risk. While these preferences are intuitively appealing they are difficult to reconcile with expected utility theory. Under highly plausible assumptions about the utility function, willingness to pay for probabilistic i...

  20. Handbook of statistical methods single subject design

    CERN Document Server

    Satake, Eiki; Maxwell, David L

    2008-01-01

    This book is a practical guide to the most commonly used approaches in analyzing and interpreting single-subject data. It arranges the methodologies in a logical sequence, using an array of research studies from the published literature to illustrate specific applications. The book provides a brief discussion of each approach, such as the visual, inferential, and probabilistic models, the applications for which it is intended, and a step-by-step illustration of the test as used in an actual research study.

  1. Affective and cognitive factors influencing sensitivity to probabilistic information.

    Science.gov (United States)

    Tyszka, Tadeusz; Sawicki, Przemyslaw

    2011-11-01

    In study 1 different groups of female students were randomly assigned to one of four probabilistic information formats. Five different levels of probability of a genetic disease in an unborn child were presented to participants (within-subject factor). After the presentation of the probability level, participants were requested to indicate the acceptable level of pain they would tolerate to avoid the disease (in their unborn child), their subjective evaluation of the disease risk, and their subjective evaluation of being worried by this risk. The results of study 1 confirmed the hypothesis that an experience-based probability format decreases the subjective sense of worry about the disease, thus, presumably, weakening the tendency to overrate the probability of rare events. Study 2 showed that for the emotionally laden stimuli, the experience-based probability format resulted in higher sensitivity to probability variations than other formats of probabilistic information. These advantages of the experience-based probability format are interpreted in terms of two systems of information processing: the rational deliberative versus the affective experiential and the principle of stimulus-response compatibility. © 2011 Society for Risk Analysis.

  2. Probabilistic safety assessment for seismic events

    International Nuclear Information System (INIS)

    1993-10-01

    This Technical Document on Probabilistic Safety Assessment for Seismic Events is mainly associated with the Safety Practice on Treatment of External Hazards in PSA and discusses in detail one specific external hazard, i.e. earthquakes

  3. Poster - 32: Atlas Selection for Automated Segmentation of Pelvic CT for Prostate Radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Mallawi, Abrar; Farrell, Tom; Diamond, Kevin-Ross; Wierzbicki, Marcin [McMaster University / National Guard Health Affairs, Radiation Oncology Department, Riyadh, Saudi Arabia, McMaster University / Juravinski Cancer Centre, McMaster University / Juravinski Cancer Centre, McMaster University / Juravinski Cancer Centre (Saudi Arabia)

    2016-08-15

    Atlas-based segmentation has recently been evaluated for use in prostate radiotherapy. In a typical approach, the essential step is the selection of an atlas from a database that best matches the target image. This work proposes an atlas selection strategy and evaluates its impact on final segmentation accuracy. Several anatomical parameters were measured to indicate the overall prostate and body shape, all obtained from CT images. A brute force procedure was first performed for a training dataset of 20 patients, using image registration to pair subjects with similar contours; each subject served as a target image to which all remaining 19 images were affinely registered. The overlap between the prostate and femoral heads was quantified for each pair using the Dice Similarity Coefficient (DSC). Finally, an atlas selection procedure was designed, relying on the computation of a similarity score defined as a weighted sum of differences between the target and atlas subject anatomical measurements. The algorithm's ability to predict the most similar atlas was excellent, achieving mean DSCs of 0.78 ± 0.07 and 0.90 ± 0.02 for the CTV and either femoral head. The proposed atlas selection yielded 0.72 ± 0.11 and 0.87 ± 0.03 for the CTV and either femoral head. The DSCs obtained with the proposed selection method were slightly lower than the maximum established using brute force, but this does not include potential improvements expected with deformable registration. The proposed atlas selection method provides reasonable segmentation accuracy.
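The two quantities this record relies on, the Dice Similarity Coefficient and a weighted similarity score over anatomical measurements, are simple to sketch. The masks, features and weights below are illustrative; the authors' actual measurements and weights are not specified here.

```python
import numpy as np

def dice(a, b):
    """Dice Similarity Coefficient between two binary masks."""
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

def similarity_score(target_feats, atlas_feats, weights):
    """Weighted sum of absolute anatomical-measurement differences,
    as described in the abstract (feature names/weights are illustrative)."""
    diffs = np.abs(np.asarray(target_feats) - np.asarray(atlas_feats))
    return float(np.dot(weights, diffs))   # lower = more similar

a = np.zeros((4, 4), bool); a[:2] = True   # 8 voxels
b = np.zeros((4, 4), bool); b[1:3] = True  # 8 voxels, 4 overlapping
print(dice(a, b))  # 2*4 / (8+8) = 0.5
```

The atlas with the lowest similarity score would be selected for registration to the target, and DSC against manual contours quantifies the resulting overlap.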

  4. Poster - 32: Atlas Selection for Automated Segmentation of Pelvic CT for Prostate Radiotherapy

    International Nuclear Information System (INIS)

    Mallawi, Abrar; Farrell, Tom; Diamond, Kevin-Ross; Wierzbicki, Marcin

    2016-01-01

    Atlas-based segmentation has recently been evaluated for use in prostate radiotherapy. In a typical approach, the essential step is the selection of an atlas from a database that best matches the target image. This work proposes an atlas selection strategy and evaluates its impact on final segmentation accuracy. Several anatomical parameters were measured to indicate the overall prostate and body shape, all obtained from CT images. A brute force procedure was first performed for a training dataset of 20 patients, using image registration to pair subjects with similar contours; each subject served as a target image to which all remaining 19 images were affinely registered. The overlap between the prostate and femoral heads was quantified for each pair using the Dice Similarity Coefficient (DSC). Finally, an atlas selection procedure was designed, relying on the computation of a similarity score defined as a weighted sum of differences between the target and atlas subject anatomical measurements. The algorithm's ability to predict the most similar atlas was excellent, achieving mean DSCs of 0.78 ± 0.07 and 0.90 ± 0.02 for the CTV and either femoral head. The proposed atlas selection yielded 0.72 ± 0.11 and 0.87 ± 0.03 for the CTV and either femoral head. The DSCs obtained with the proposed selection method were slightly lower than the maximum established using brute force, but this does not include potential improvements expected with deformable registration. The proposed atlas selection method provides reasonable segmentation accuracy.

  5. Towards probabilistic synchronisation of local controllers

    Czech Academy of Sciences Publication Activity Database

    Herzallah, R.; Kárný, Miroslav

    2017-01-01

    Roč. 48, č. 3 (2017), s. 604-615 ISSN 0020-7721 R&D Projects: GA ČR GA13-13502S Institutional support: RVO:67985556 Keywords: cooperative control * optimal control * complex systems * stochastic systems * fully probabilistic design Subject RIV: BB - Applied Statistics, Operational Research OBOR OECD: Statistics and probability Impact factor: 2.285, year: 2016

  6. ATLAS Facility Description Report

    International Nuclear Information System (INIS)

    Kang, Kyoung Ho; Moon, Sang Ki; Park, Hyun Sik; Cho, Seok; Choi, Ki Yong

    2009-04-01

    A thermal-hydraulic integral effect test facility, ATLAS (Advanced Thermal-hydraulic Test Loop for Accident Simulation), has been constructed at KAERI (Korea Atomic Energy Research Institute). ATLAS has the same two-loop features as the APR1400 and is designed according to the well-known scaling method suggested by Ishii and Kataoka to simulate the various test scenarios as realistically as possible. It is a half-height, 1/288-volume scaled test facility with respect to the APR1400. The fluid system of ATLAS consists of a primary system, a secondary system, a safety injection system, a break simulating system, a containment simulating system, and auxiliary systems. The primary system includes a reactor vessel, two hot legs, four cold legs, a pressurizer, four reactor coolant pumps, and two steam generators. The secondary system of ATLAS is simplified to a circulating loop type. Most of the safety injection features of the APR1400 and the OPR1000 are incorporated into the safety injection system of ATLAS. In the ATLAS test facility, about 1,300 instruments are installed to precisely investigate the thermal-hydraulic behavior during simulation of the various test scenarios. This report describes the scaling methodology, the geometric data of the individual components, and the specifications and locations of the instrumentation in detail

  7. A digital atlas of the dog brain.

    Directory of Open Access Journals (Sweden)

    Ritobrato Datta

    Full Text Available There is a long history and a growing interest in the canine as a subject of study in neuroscience research and in translational neurology. In the last few years, anatomical and functional magnetic resonance imaging (MRI) studies of awake and anesthetized dogs have been reported. Such efforts can be enhanced by a population atlas of canine brain anatomy to implement group analyses. Here we present a canine brain atlas derived as the diffeomorphic average of a population of fifteen mesaticephalic dogs. The atlas includes: (1) a brain template derived from in-vivo, T1-weighted imaging at 1 mm isotropic resolution at 3 Tesla (with and without the soft tissues of the head); (2) a co-registered, high-resolution (0.33 mm isotropic) template created from imaging of ex-vivo brains at 7 Tesla; (3) a surface representation of the gray matter/white matter boundary of the high-resolution atlas (including labeling of gyral and sulcal features). The properties of the atlas are considered in relation to historical nomenclature and the evolutionary taxonomy of the Canini tribe. The atlas is available for download (https://cfn.upenn.edu/aguirre/wiki/public:data_plosone_2012_datta).

  8. An Example-Based Multi-Atlas Approach to Automatic Labeling of White Matter Tracts.

    Science.gov (United States)

    Yoo, Sang Wook; Guevara, Pamela; Jeong, Yong; Yoo, Kwangsun; Shin, Joseph S; Mangin, Jean-Francois; Seong, Joon-Kyung

    2015-01-01

    We present an example-based multi-atlas approach for classifying white matter (WM) tracts into anatomic bundles. Our approach exploits expert-provided example data to automatically classify the WM tracts of a subject. Multiple atlases are constructed to model the example data from multiple subjects in order to reflect the individual variability of bundle shapes and trajectories over subjects. For each example subject, an atlas is maintained to allow the example data of a subject to be added or deleted flexibly. A voting scheme is proposed to facilitate the multi-atlas exploitation of example data. For conceptual simplicity, we adopt the same metrics in both example data construction and WM tract labeling. Due to the huge number of WM tracts in a subject, it is time-consuming to label each WM tract individually. Thus, the WM tracts are grouped according to their shape similarity, and WM tracts within each group are labeled simultaneously. To further enhance the computational efficiency, we implemented our approach on the graphics processing unit (GPU). Through nested cross-validation we demonstrated that our approach yielded high classification performance. The average sensitivities for bundles in the left and right hemispheres were 89.5% and 91.0%, respectively, and their average false discovery rates were 14.9% and 14.2%, respectively.
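The voting scheme across multiple example-subject atlases can be sketched as a label vote per tract group. This is a plain-count illustration; the paper's actual voting uses shape-similarity metrics between fiber tracts, and the bundle names below are hypothetical.

```python
from collections import Counter

def vote_label(atlas_labels, min_votes=1):
    """Majority vote over the labels that each example-subject atlas assigns
    to one group of WM tracts. None means an atlas offered no label.
    (Sketch: real multi-atlas voting weights votes by shape similarity.)"""
    counts = Counter(lbl for lbl in atlas_labels if lbl is not None)
    if not counts:
        return None
    label, votes = counts.most_common(1)[0]
    return label if votes >= min_votes else None

# Each element is one atlas's proposed bundle label for the same tract group.
print(vote_label(["arcuate", "arcuate", "cingulum", None, "arcuate"]))
```

Grouping tracts by shape similarity first, as the abstract describes, lets one vote label an entire group at once instead of every streamline individually.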

  9. Measuring Single Event Upsets in the ATLAS Inner Tracker

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    When the HL-LHC starts collecting data, the electronics inside will be subject to massive amounts of radiation. As a result, single event upsets could pose a threat to the ATLAS readout chain. The ABC130, a prototype front-end ASIC for the ATLAS inner tracker, must be tested for its susceptibility to single event upsets.

  10. Probabilistic risk assessment in nuclear power plant regulation

    Energy Technology Data Exchange (ETDEWEB)

    Wall, J B

    1980-09-01

    A specific program is recommended to utilize probabilistic risk assessment more effectively in nuclear power plant regulation. It is based upon the engineering insights from the Reactor Safety Study (WASH-1400) and some follow-on risk assessment research by the USNRC. The Three Mile Island accident is briefly discussed from a risk viewpoint to illustrate a weakness in current practice. The development of a probabilistic safety goal is recommended, with some suggestions on underlying principles. Some ongoing work on risk perception and the draft probabilistic safety goal being reviewed in Canada are described. Some suggestions are offered on further risk assessment research. Finally, some recent U.S. Nuclear Regulatory Commission actions are described.

  11. ATLAS DataFlow Infrastructure: Recent results from ATLAS cosmic and first-beam data-taking

    Energy Technology Data Exchange (ETDEWEB)

    Vandelli, Wainer, E-mail: wainer.vandelli@cern.c

    2010-04-01

    The ATLAS DataFlow infrastructure is responsible for the collection and conveyance of event data from the detector front-end electronics to mass storage. Several optimized and multi-threaded applications fulfill this purpose, operating over a multi-stage Gigabit Ethernet network which is the backbone of the ATLAS Trigger and Data Acquisition System. The system must be able to transport event data efficiently and with high reliability, while providing aggregated bandwidths larger than 5 GByte/s and coping with many thousands of network connections. In addition, routing and streaming capabilities as well as monitoring and data accounting functionalities are fundamental requirements. During 2008, a few months of ATLAS cosmic data-taking and the first experience with the LHC beams provided an unprecedented test-bed for evaluating the performance of the ATLAS DataFlow in terms of functionality, robustness and stability. Moreover, operating the system far from its design specifications helped in exercising its flexibility and contributed to understanding its limitations. The integration with the detector and the interfacing with the off-line data processing and management were also able to take advantage of this extended data-taking period. In this paper we report on the usage of the DataFlow infrastructure during ATLAS data-taking. These results, backed up by complementary performance tests, validate the architecture of the ATLAS DataFlow and prove that the system is robust, flexible and scalable enough to cope with the final requirements of the ATLAS experiment.

  12. ATLAS DataFlow Infrastructure: Recent results from ATLAS cosmic and first-beam data-taking

    International Nuclear Information System (INIS)

    Vandelli, Wainer

    2010-01-01

    The ATLAS DataFlow infrastructure is responsible for the collection and conveyance of event data from the detector front-end electronics to mass storage. Several optimized and multi-threaded applications fulfill this purpose, operating over a multi-stage Gigabit Ethernet network which is the backbone of the ATLAS Trigger and Data Acquisition System. The system must be able to transport event data efficiently and with high reliability, while providing aggregated bandwidths larger than 5 GByte/s and coping with many thousands of network connections. In addition, routing and streaming capabilities as well as monitoring and data accounting functionalities are fundamental requirements. During 2008, a few months of ATLAS cosmic data-taking and the first experience with the LHC beams provided an unprecedented test-bed for evaluating the performance of the ATLAS DataFlow in terms of functionality, robustness and stability. Moreover, operating the system far from its design specifications helped in exercising its flexibility and contributed to understanding its limitations. The integration with the detector and the interfacing with the off-line data processing and management were also able to take advantage of this extended data-taking period. In this paper we report on the usage of the DataFlow infrastructure during ATLAS data-taking. These results, backed up by complementary performance tests, validate the architecture of the ATLAS DataFlow and prove that the system is robust, flexible and scalable enough to cope with the final requirements of the ATLAS experiment.

  13. Probabilistic safety analysis procedures guide

    International Nuclear Information System (INIS)

    Papazoglou, I.A.; Bari, R.A.; Buslik, A.J.

    1984-01-01

    A procedures guide for the performance of probabilistic safety assessment has been prepared for interim use in Nuclear Regulatory Commission programs. The probabilistic safety assessment studies performed are intended to produce probabilistic predictive models that can be used and extended by the utilities and by the NRC to sharpen the focus of inquiries into a range of issues affecting reactor safety. This guide addresses the determination of the probability (per year) of core damage resulting from accident initiators internal to the plant and from loss of offsite electric power. The scope includes analyses of problem-solving (cognitive) human errors, a determination of the importance of the various core damage accident sequences, and an explicit treatment and display of uncertainties for the key accident sequences. Ultimately, the guide will be augmented to include the plant-specific analysis of in-plant processes (i.e., containment performance) and the risk associated with external accident initiators, as consensus is developed regarding suitable methodologies in these areas. This guide provides the structure of a probabilistic safety study to be performed, and indicates which products of the study are essential for regulatory decision making. Methodology is treated in the guide only to the extent necessary to indicate the range of methods that is acceptable; ample reference is given to alternative methodologies which may be utilized in the performance of the study

  14. Subject-specific bone attenuation correction for brain PET/MR: can ZTE-MRI substitute CT scan accurately?

    Science.gov (United States)

    Khalifé, Maya; Fernandez, Brice; Jaubert, Olivier; Soussan, Michael; Brulon, Vincent; Buvat, Irène; Comtat, Claude

    2017-10-01

    In brain PET/MR applications, accurate attenuation maps are required for accurate PET image quantification. An implemented attenuation correction (AC) method for brain imaging is the single-atlas approach, which estimates an AC map from an averaged CT template. As an alternative, we propose to use a zero echo time (ZTE) pulse sequence to segment bone, air and soft tissue. A linear relationship between histogram-normalized ZTE intensity and measured CT density in Hounsfield units (HU) in bone was established using a CT-MR database of 16 patients. Continuous AC maps were computed based on the segmented ZTE by setting a fixed linear attenuation coefficient (LAC) for air and soft tissue and by using the linear relationship to generate continuous μ values for the bone. Additionally, for the purpose of comparison, four other AC maps were generated: a ZTE-derived AC map with a fixed LAC for the bone, an AC map based on the single-atlas approach as provided by the PET/MR manufacturer, a soft-tissue-only AC map and, finally, the CT-derived attenuation map used as the gold standard (CTAC). All these AC maps were used with different levels of smoothing for PET image reconstruction with and without time-of-flight (TOF). The subject-specific AC map generated by combining ZTE-based segmentation and linear scaling of the normalized ZTE signal into HU was found to be a good substitute for the measured CTAC map in brain PET/MR when used with a Gaussian smoothing kernel of 4 mm, corresponding to the PET scanner's intrinsic resolution. As expected, TOF reduces the AC error regardless of the AC method. The continuous ZTE-AC performed better than the other alternative MR-derived AC methods, reducing the quantification error between the MRAC-corrected PET image and the reference CTAC-corrected PET image.
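The continuous ZTE-based AC map described above can be sketched as a piecewise rule: fixed LACs for air and soft tissue, and a linear ZTE-to-HU mapping (then HU to LAC) in bone. Every numeric constant below is an illustrative placeholder, not the slope, intercept or LAC values fitted from the paper's 16-patient database.

```python
import numpy as np

MU_AIR, MU_TISSUE = 0.0, 0.096       # LACs in cm^-1 at 511 keV (approximate)
SLOPE, INTERCEPT = -2000.0, 2000.0   # hypothetical normalized-ZTE -> HU bone fit

def mu_map(zte_norm, labels):
    """labels: 0 = air, 1 = soft tissue, 2 = bone (from the ZTE segmentation)."""
    hu_bone = SLOPE * zte_norm + INTERCEPT        # linear ZTE -> HU mapping in bone
    mu_bone = 9.6e-5 * (hu_bone + 1000.0)         # simple HU -> LAC rule (placeholder)
    mu = np.where(labels == 0, MU_AIR, MU_TISSUE) # fixed LACs for air / soft tissue
    return np.where(labels == 2, mu_bone, mu)

zte = np.array([0.2, 0.9, 0.5])   # normalized ZTE intensities
lab = np.array([2, 1, 0])         # bone, soft tissue, air
print(mu_map(zte, lab))
```

Because bone μ varies continuously with the ZTE signal, denser bone attenuates more in the reconstruction, which is what distinguishes this map from the fixed-bone-LAC alternative.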

  15. Probabilistic evaluations for CANTUP computer code analysis improvement

    International Nuclear Information System (INIS)

    Florea, S.; Pavelescu, M.

    2004-01-01

Structural analysis with the finite element method is today a usual way to evaluate and predict the behavior of structural assemblies subjected to harsh conditions in order to ensure their safety and reliability during operation. A CANDU 600 fuel channel is an example of an assembly working in harsh conditions, in which, besides corrosive and thermal aggression, long-term irradiation interferes, with implicit consequences for the evolution of material properties. That leads inevitably to scattering of the time-dependent material properties, their dynamic evolution being subject to a great degree of uncertainty. These are the reasons for developing, alongside deterministic evaluations with computer codes, probabilistic and statistical methods to predict the structural component response. This work initiates the possibility of extending the deterministic thermomechanical evaluation of fuel channel components to a probabilistic structural mechanics approach, starting with deterministic analysis performed with the CANTUP computer code, a code developed to predict the long-term mechanical behavior of the pressure tube - calandria tube assembly. To this purpose the structure of the deterministic CANTUP computer code was reviewed. The code was adapted from the LAHEY 77 platform to the Microsoft Developer Studio - Fortran Power Station platform. In order to perform probabilistic evaluations, a module was added to the deterministic code which, using a subroutine from the IMSL library of the Microsoft Developer Studio - Fortran Power Station platform, generates pseudo-random values of a specified variable. A normal distribution around the deterministic value, with a 5% standard deviation, was simulated for the Young's modulus material property in order to verify the statistical calculation of the creep behavior. The tube deflection and effective stresses were the properties subjected to probabilistic evaluation. 
All the values of these properties obtained for all the values for
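The sampling scheme the abstract describes — a normal distribution centred on the deterministic value with 5% standard deviation, propagated through a deterministic response calculation — can be sketched as follows. The beam-deflection formula is a simple stand-in response, not the CANTUP creep model, and all numeric parameters are illustrative:

```python
import random
import statistics

def sample_young_modulus(mean_e=2.0e11, rel_std=0.05, n=10000, seed=42):
    """Pseudo-random Young's modulus values: normal distribution around the
    deterministic value with 5% standard deviation, as in the abstract."""
    rng = random.Random(seed)
    return [rng.gauss(mean_e, rel_std * mean_e) for _ in range(n)]

def beam_deflection(e, load=1.0e4, length=6.0, inertia=1.0e-5):
    # Illustrative stand-in response: midspan deflection of a simply
    # supported beam, delta = 5 q L^4 / (384 E I). Not the CANTUP model.
    return 5 * load * length**4 / (384 * e * inertia)

samples = sample_young_modulus()
deflections = [beam_deflection(e) for e in samples]
print(statistics.mean(deflections), statistics.stdev(deflections))
```

The spread of the output deflections then quantifies how the input uncertainty propagates to the structural response.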

  16. Multi-atlas attenuation correction supports full quantification of static and dynamic brain PET data in PET-MR

    Science.gov (United States)

    Mérida, Inés; Reilhac, Anthonin; Redouté, Jérôme; Heckemann, Rolf A.; Costes, Nicolas; Hammers, Alexander

    2017-04-01

    In simultaneous PET-MR, attenuation maps are not directly available. Essential for absolute radioactivity quantification, they need to be derived from MR or PET data to correct for gamma photon attenuation by the imaged object. We evaluate a multi-atlas attenuation correction method for brain imaging (MaxProb) on static [18F]FDG PET and, for the first time, on dynamic PET, using the serotoninergic tracer [18F]MPPF. A database of 40 MR/CT image pairs (atlases) was used. The MaxProb method synthesises subject-specific pseudo-CTs by registering each atlas to the target subject space. Atlas CT intensities are then fused via label propagation and majority voting. Here, we compared these pseudo-CTs with the real CTs in a leave-one-out design, contrasting the MaxProb approach with a simplified single-atlas method (SingleAtlas). We evaluated the impact of pseudo-CT accuracy on reconstructed PET images, compared to PET data reconstructed with real CT, at the regional and voxel levels for the following: radioactivity images; time-activity curves; and kinetic parameters (non-displaceable binding potential, BPND). On static [18F]FDG, the mean bias for MaxProb ranged between 0 and 1% for 73 out of 84 regions assessed, and exceptionally peaked at 2.5% for only one region. Statistical parametric map analysis of MaxProb-corrected PET data showed significant differences in less than 0.02% of the brain volume, whereas SingleAtlas-corrected data showed significant differences in 20% of the brain volume. On dynamic [18F]MPPF, most regional errors on BPND ranged from -1 to  +3% (maximum bias 5%) for the MaxProb method. With SingleAtlas, errors were larger and had higher variability in most regions. PET quantification bias increased over the duration of the dynamic scan for SingleAtlas, but not for MaxProb. We show that this effect is due to the interaction of the spatial tracer-distribution heterogeneity variation over time with the degree of accuracy of the attenuation maps. This
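The fusion step named in the abstract — label propagation followed by majority voting over registered atlas CTs — can be sketched per voxel as below. This is a simplified reading of the MaxProb idea, not the authors' exact implementation:

```python
from collections import Counter

def fuse_voxel(propagated_labels, propagated_hus):
    """Majority-vote label fusion at one voxel: the most frequent propagated
    atlas label wins, and the pseudo-CT value is the mean HU of the atlases
    that voted for the winning label (simplified MaxProb sketch)."""
    winner, _ = Counter(propagated_labels).most_common(1)[0]
    votes = [hu for lab, hu in zip(propagated_labels, propagated_hus)
             if lab == winner]
    return winner, sum(votes) / len(votes)
```

Applying this at every voxel yields the subject-specific pseudo-CT that is then converted to an attenuation map.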

  17. Aluminum 7075-T6 fatigue data generation and probabilistic life prediction formulation

    OpenAIRE

    Kemna, John G.

    1998-01-01

    Approved for public release; distribution is unlimited. The life extension of aging fleet aircraft requires an assessment of the safe-life remaining after refurbishment. Risk can be estimated by conventional deterministic fatigue analysis coupled with a subjective factor of safety. Alternatively, risk can be quantitatively and objectively predicted by probabilistic analysis. In this investigation, a general probabilistic life formulation is specialized for constant amplitude, fully reverse...

  18. Human reliability in probabilistic safety assessments

    International Nuclear Information System (INIS)

    Nunez Mendez, J.

    1989-01-01

Nowadays a growing interest in environmental aspects is detected in our country. It implies an assessment of the risk involved in industrial processes and installations in order to determine whether these are within acceptable limits. In these safety assessments, among which PSA (Probabilistic Safety Assessment) can be pointed out, the role played by the human being in the system is one of the most relevant subjects. (This relevance has been demonstrated in the accidents that have happened.) However, in Spain there are no manuals specifically dedicated to assessing the human contribution to risk in the frame of PSAs. This report aims to improve this situation by providing: a) a theoretical background to help the reader in understanding the nature of human error, b) a guide to carrying out a Human Reliability Analysis, and c) a selected overview of the techniques and methodologies currently applied in this area. (Author)

  19. ATLAS software stack on ARM64

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00529764; The ATLAS collaboration; Stewart, Graeme; Seuster, Rolf; Quadt, Arnulf

    2017-01-01

    This paper reports on the port of the ATLAS software stack onto new prototype ARM64 servers. This included building the “external” packages that the ATLAS software relies on. Patches were needed to introduce this new architecture into the build as well as patches that correct for platform specific code that caused failures on non-x86 architectures. These patches were applied such that porting to further platforms will need no or only very little adjustments. A few additional modifications were needed to account for the different operating system, Ubuntu instead of Scientific Linux 6 / CentOS7. Selected results from the validation of the physics outputs on these ARM 64-bit servers will be shown. CPU, memory and IO intensive benchmarks using ATLAS specific environment and infrastructure have been performed, with a particular emphasis on the performance vs. energy consumption.

  20. ATLAS software stack on ARM64

    Science.gov (United States)

    Smith, Joshua Wyatt; Stewart, Graeme A.; Seuster, Rolf; Quadt, Arnulf; ATLAS Collaboration

    2017-10-01

    This paper reports on the port of the ATLAS software stack onto new prototype ARM64 servers. This included building the “external” packages that the ATLAS software relies on. Patches were needed to introduce this new architecture into the build as well as patches that correct for platform specific code that caused failures on non-x86 architectures. These patches were applied such that porting to further platforms will need no or only very little adjustments. A few additional modifications were needed to account for the different operating system, Ubuntu instead of Scientific Linux 6 / CentOS7. Selected results from the validation of the physics outputs on these ARM 64-bit servers will be shown. CPU, memory and IO intensive benchmarks using ATLAS specific environment and infrastructure have been performed, with a particular emphasis on the performance vs. energy consumption.

  1. Radiologic atlas of rheumatic diseases

    International Nuclear Information System (INIS)

    Dihlmann, W.

    1986-01-01

This book is an ''atlas of rheumatic joint disease'' selected from 20 years of personal experience by the author. The author sets a goal of demonstrating the value of soft-tissue imaging in the diagnosis of early joint disease. This goal is achieved with high quality reproductions, many of which are presented in duplicate to illustrate bone and soft-tissue changes. The contents include an introductory overview of the ''Mosaic of Arthritis'' followed by sections on adult rheumatoid arthritis, seronegative spondyloarthropathies, classic collagen disease, enthesiopathies, and lastly a section on gout and pseudogout. The subject index is specific and indexes figures with boldface type. Each section is introduced by a brief outline or overview of the radiographic spectrum of the joint disorder to be illustrated

  2. ATLAS DataFlow Infrastructure recent results from ATLAS cosmic and first-beam data-taking

    CERN Document Server

    Vandelli, W

    2010-01-01

The ATLAS DataFlow infrastructure is responsible for the collection and conveyance of event data from the detector front-end electronics to mass storage. Several optimized and multi-threaded applications fulfill this purpose, operating over a multi-stage Gigabit Ethernet network which is the backbone of the ATLAS Trigger and Data Acquisition System. The system must be able to efficiently transport event data with high reliability, while providing aggregated bandwidths larger than 5 GByte/s and coping with many thousands of network connections. Moreover, routing and streaming capabilities as well as monitoring and data-accounting functionalities are fundamental requirements. During 2008, a few months of ATLAS cosmic data-taking and the first experience with the LHC beams provided an unprecedented testbed for the evaluation of the performance of the ATLAS DataFlow in terms of functionality, robustness and stability. Besides, operating the system far from its design specifications helped in exercising its fle...

  3. Probabilistic Structural Analysis of SSME Turbopump Blades: Probabilistic Geometry Effects

    Science.gov (United States)

    Nagpal, V. K.

    1985-01-01

A probabilistic study was initiated to evaluate the effects of the geometric and material property tolerances on the structural response of turbopump blades. To complete this study, a number of important probabilistic variables were identified which are conceived to affect the structural response of the blade. In addition, a methodology was developed to statistically quantify the influence of these probabilistic variables in an optimized way. The identified variables include random geometric and material property perturbations, different loadings and a probabilistic combination of these loadings. Influences of these probabilistic variables are planned to be quantified by evaluating the blade structural response. Studies of the geometric perturbations were conducted for a flat plate geometry as well as for a space shuttle main engine blade geometry using a special purpose code which uses the finite element approach. Analyses indicate that the variances of the perturbations about given mean values have significant influence on the response.

  4. SU-E-J-128: Two-Stage Atlas Selection in Multi-Atlas-Based Image Segmentation

    International Nuclear Information System (INIS)

    Zhao, T; Ruan, D

    2015-01-01

Purpose: In the new era of big data, multi-atlas-based image segmentation is challenged by heterogeneous atlas quality and the high computation burden from extensive atlas collections, demanding efficient identification of the most relevant atlases. This study aims to develop a two-stage atlas selection scheme to achieve computational economy with a performance guarantee. Methods: We develop a low-cost fusion set selection scheme by introducing a preliminary selection to trim the full atlas collection into an augmented subset, alleviating the need for extensive full-fledged registrations. More specifically, fusion set selection is performed in two successive steps: preliminary selection and refinement. An augmented subset is first roughly selected from the whole atlas collection with a simple registration scheme and the corresponding preliminary relevance metric; the augmented subset is further refined into the desired fusion set size, using full-fledged registration and the associated relevance metric. The main novelty of this work is the introduction of an inference model to relate the preliminary and refined relevance metrics, based on which the augmented subset size is rigorously derived to ensure the desired atlases survive the preliminary selection with high probability. Results: The performance and complexity of the proposed two-stage atlas selection method were assessed using a collection of 30 prostate MR images. It achieved comparable segmentation accuracy to the conventional one-stage method with full-fledged registration, but significantly reduced computation time to 1/3 (from 30.82 to 11.04 min per segmentation). Compared with an alternative one-stage cost-saving approach, the proposed scheme yielded superior performance with mean and median DSC of (0.83, 0.85) compared to (0.74, 0.78). Conclusion: This work has developed a model-guided two-stage atlas selection scheme to achieve significant cost reduction while guaranteeing high segmentation accuracy. 
The benefit
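The two successive selection steps described above can be sketched generically: rank the full collection with a cheap preliminary metric, keep an augmented subset, then re-rank the subset with the expensive metric. The scoring functions below are toy stand-ins, not the paper's relevance metrics:

```python
def two_stage_select(atlases, target, cheap_score, full_score, m, k):
    """Stage 1: rank the whole collection with a cheap relevance metric
    (simple registration) and keep an augmented subset of size m.
    Stage 2: re-rank that subset with the expensive full-registration
    metric and keep the final fusion set of size k."""
    subset = sorted(atlases, key=lambda a: cheap_score(a, target),
                    reverse=True)[:m]
    return sorted(subset, key=lambda a: full_score(a, target),
                  reverse=True)[:k]
```

Only the m atlases surviving stage 1 ever incur a full-fledged registration, which is the source of the reported 3x speed-up; the paper's inference model picks m large enough that the truly relevant atlases survive stage 1 with high probability.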

  5. Atlas warping for brain morphometry

    Science.gov (United States)

    Machado, Alexei M. C.; Gee, James C.

    1998-06-01

    In this work, we describe an automated approach to morphometry based on spatial normalizations of the data, and demonstrate its application to the analysis of gender differences in the human corpus callosum. The purpose is to describe a population by a reduced and representative set of variables, from which a prior model can be constructed. Our approach is rooted in the assumption that individual anatomies can be considered as quantitative variations on a common underlying qualitative plane. We can therefore imagine that a given individual's anatomy is a warped version of some referential anatomy, also known as an atlas. The spatial warps which transform a labeled atlas into anatomic alignment with a population yield immediate knowledge about organ size and shape in the group. Furthermore, variation within the set of spatial warps is directly related to the anatomic variation among the subjects. Specifically, the shape statistics--mean and variance of the mappings--for the population can be calculated in a special basis, and an eigendecomposition of the variance performed to identify the most significant modes of shape variation. The results obtained with the corpus callosum study confirm the existence of substantial anatomical differences between males and females, as reported in previous experimental work.
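The shape statistics described above — mean and variance of the warp mappings, with an eigendecomposition of the variance to find the dominant modes — amount to a PCA over warp parameter vectors. A generic sketch (not the paper's specific basis):

```python
import numpy as np

def shape_modes(warp_coeffs):
    """Mean and principal modes of variation for a set of warp parameter
    vectors (rows = subjects, columns = warp coefficients)."""
    mean = warp_coeffs.mean(axis=0)
    cov = np.cov(warp_coeffs, rowvar=False)  # variance of the warps
    evals, evecs = np.linalg.eigh(cov)       # ascending eigenvalues
    order = np.argsort(evals)[::-1]          # sort by explained variance
    return mean, evals[order], evecs[:, order]
```

The leading eigenvectors are the most significant modes of shape variation in the population, e.g. the gender-linked callosal differences the study reports.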

  6. Increasing Drought Sensitivity and Decline of Atlas Cedar (Cedrus atlantica) in the Moroccan Middle Atlas Forests

    Directory of Open Access Journals (Sweden)

    Jesús Julio Camarero

    2011-09-01

An understanding of the interactions between climate change and forest structure and their effects on tree growth is needed for decision making in forest conservation and management. In this paper, we investigated the relative contribution of tree features and stand structure to Atlas cedar (Cedrus atlantica) radial growth in forests that have experienced heavy grazing and logging in the past. Dendrochronological methods were applied to quantify patterns in basal-area increment and drought sensitivity of Atlas cedar in the Middle Atlas, northern Morocco. We estimated the tree-to-tree competition intensity and quantified the structure in Atlas cedar stands with contrasting tree density, age, and decline symptoms. The relative contribution of tree age and size and stand structure to Atlas cedar growth decline was estimated by variance partitioning using partial-redundancy analyses. Recurrent drought events and temperature increases have been identified from local climate records since the 1970s. We detected consistent growth declines and increased drought sensitivity in Atlas cedar across all sites since the early 1980s. Specifically, we determined that previous growth rates and tree age were the strongest tree features, while Quercus rotundifolia basal area was the strongest stand structure measure related to Atlas cedar decline. As a result, we suggest that Atlas cedar forests that have experienced severe drought in combination with grazing and logging may be in the process of shifting dominance toward more drought-tolerant species such as Q. rotundifolia.

  7. Axiomatisation of fully probabilistic design

    Czech Academy of Sciences Publication Activity Database

    Kárný, Miroslav; Kroupa, Tomáš

    2012-01-01

    Roč. 186, č. 1 (2012), s. 105-113 ISSN 0020-0255 R&D Projects: GA MŠk(CZ) 2C06001; GA ČR GA102/08/0567 Institutional research plan: CEZ:AV0Z10750506 Keywords : Bayesian decision making * Fully probabilistic design * Kullback–Leibler divergence * Unified decision making Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 3.643, year: 2012 http://library.utia.cas.cz/separaty/2011/AS/karny-0367271.pdf

  8. Finite element modeling of the human kidney for probabilistic occupant models: Statistical shape analysis and mesh morphing.

    Science.gov (United States)

    Yates, Keegan M; Untaroiu, Costin D

    2018-04-16

Statistical shape analysis was conducted on 15 pairs (left and right) of human kidneys. It was shown that the left and right kidney were significantly different in size and shape. In addition, several common modes of kidney variation were identified using statistical shape analysis. Semi-automatic mesh morphing techniques have been developed to efficiently create subject-specific meshes from a template mesh with a similar geometry. Subject-specific meshes as well as probabilistic kidney meshes were created from a template mesh. Mesh quality remained about the same as for the template mesh while taking only a fraction of the time needed to create a mesh from scratch or to morph with manually identified landmarks. This technique can help enhance the quality of information gathered from experimental testing with subject-specific meshes, as well as help to more efficiently predict injury by creating models with the mean shape and models at the extremes of each principal component.
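Landmark-driven mesh morphing of the kind mentioned above can be sketched with radial basis functions: displacements known at a few landmarks are interpolated to every template vertex. This is illustrative of the general idea, not the authors' semi-automatic pipeline, and the Gaussian kernel and its width are assumptions:

```python
import numpy as np

def rbf_morph(vertices, src_lmk, dst_lmk, sigma=1.0, reg=1e-8):
    """Morph template vertices so that source landmarks move to target
    landmarks, interpolating displacements with Gaussian radial basis
    functions (small regularization keeps the solve well-posed)."""
    def kernel(a, b):
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * sigma ** 2))
    K = kernel(src_lmk, src_lmk) + reg * np.eye(len(src_lmk))
    weights = np.linalg.solve(K, dst_lmk - src_lmk)  # per-landmark weights
    return vertices + kernel(vertices, src_lmk) @ weights
```

With subject-specific landmarks as targets, the same template mesh (and hence its element quality) carries over to each new subject.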

  9. Probabilistic tsunami hazard assessment for Point Lepreau Generating Station

    Energy Technology Data Exchange (ETDEWEB)

    Mullin, D., E-mail: dmullin@nbpower.com [New Brunswick Power Corporation, Point Lepreau Generating Station, Point Lepreau (Canada); Alcinov, T.; Roussel, P.; Lavine, A.; Arcos, M.E.M.; Hanson, K.; Youngs, R., E-mail: trajce.alcinov@amecfw.com, E-mail: patrick.roussel@amecfw.com [AMEC Foster Wheeler Environment & Infrastructure, Dartmouth, NS (Canada)

    2015-07-01

In 2012 the Geological Survey of Canada published a preliminary probabilistic tsunami hazard assessment in Open File 7201 that presents the most up-to-date information on all potential tsunami sources in a probabilistic framework on a national level, thus providing the underlying basis for conducting site-specific tsunami hazard assessments. However, the assessment identified a poorly constrained hazard for the Atlantic coastline and recommended further evaluation. As a result, NB Power has embarked on performing a Probabilistic Tsunami Hazard Assessment (PTHA) for Point Lepreau Generating Station. This paper provides the methodology and progress of hazard evaluation results for Point Lepreau G.S. (author)

  10. Funding ATLAS 2012 key indicators for publicly funded research in Germany

    CERN Document Server

    Deutsche Forschungsgemeinschaft (DFG)

    2013-01-01

    The Funding ATLAS is a reporting system (previously referred to as the Funding Ranking) employed by the German Research Foundation (DFG) to provide information in the form of indicators of key developments in publicly funded research in Germany every three years. This English version of the Funding ATLAS 2012 presents selected findings from the more comprehensive German edition. At the core of the report are indicators that provide information on which subject areas have received funding at higher education and other research institutions in the period 2008-2010. This report also includes, as a supplement not found in the German edition, the decisions on the Excellence Initiative, which were taken shortly after the German edition of the Funding ATLAS 2012 was published. The report also addresses the subject of internationality by presenting selected indicators that show how attractive Germany's research institutions are for visiting scientists. In summary, the DFG Funding ATLAS furnishes reliable indicators o...

  11. On Probabilistic Alpha-Fuzzy Fixed Points and Related Convergence Results in Probabilistic Metric and Menger Spaces under Some Pompeiu-Hausdorff-Like Probabilistic Contractive Conditions

    OpenAIRE

    De la Sen, M.

    2015-01-01

    In the framework of complete probabilistic metric spaces and, in particular, in probabilistic Menger spaces, this paper investigates some relevant properties of convergence of sequences to probabilistic α-fuzzy fixed points under some types of probabilistic contractive conditions.

  12. Probabilistic Learning by Rodent Grid Cells.

    Science.gov (United States)

    Cheung, Allen

    2016-10-01

Mounting evidence shows mammalian brains are probabilistic computers, but the specific cells involved remain elusive. Parallel research suggests that grid cells of the mammalian hippocampal formation are fundamental to spatial cognition, but their diverse response properties still defy explanation. No plausible model exists that explains stable grids in darkness for twenty minutes or longer, even though this was one of the first results ever published on grid cells. Similarly, no current explanation can tie together grid fragmentation and grid rescaling, which show very different forms of flexibility in grid responses when the environment is varied. Other properties such as attractor dynamics and grid anisotropy seem to be at odds with one another unless additional properties are assumed, such as a varying velocity gain. Modelling efforts have largely ignored the breadth of response patterns, while also failing to account for the disastrous effects of sensory noise during spatial learning and recall, especially in darkness. Here, published electrophysiological evidence from a range of experiments is reinterpreted using a novel probabilistic learning model, which shows that grid cell responses are accurately predicted by a probabilistic learning process. Diverse response properties of probabilistic grid cells are statistically indistinguishable from rat grid cells across key manipulations. A simple coherent set of probabilistic computations explains stable grid fields in darkness, partial grid rescaling in resized arenas, low-dimensional attractor grid cell dynamics, and grid fragmentation in hairpin mazes. The same computations also reconcile oscillatory dynamics at the single cell level with attractor dynamics at the cell ensemble level. Additionally, a clear functional role for boundary cells is proposed for spatial learning. 
These findings provide a parsimonious and unified explanation of grid cell function, and implicate grid cells as an accessible neuronal population
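The predict-update cycle underlying probabilistic spatial estimation can be illustrated with a minimal histogram filter on a 1-D ring: path integration convolves the position belief with motion noise (so uncertainty grows in darkness), and a sensory cue multiplies in a likelihood (so the belief sharpens). This is a generic textbook schematic, not the paper's grid cell model:

```python
def step(belief, motion_kernel, likelihood=None):
    """One cycle of Bayesian position tracking on a ring of len(belief)
    bins: convolve with motion noise, optionally weight by a sensory
    likelihood, then renormalise to a probability distribution."""
    n = len(belief)
    pred = [sum(belief[(i - j) % n] * motion_kernel[j]
                for j in range(len(motion_kernel))) for i in range(n)]
    if likelihood is not None:
        pred = [p * l for p, l in zip(pred, likelihood)]
    z = sum(pred)
    return [p / z for p in pred]
```

Running prediction-only steps flattens the belief (darkness), while a single informative likelihood restores a sharp estimate, mirroring the noise-versus-learning trade-off discussed above.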

  13. Probabilistic Insurance

    NARCIS (Netherlands)

    Wakker, P.P.; Thaler, R.H.; Tversky, A.

    1997-01-01

    Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in premium to compensate for a 1% default risk. These observations cannot be

  14. Probabilistic Insurance

    NARCIS (Netherlands)

    P.P. Wakker (Peter); R.H. Thaler (Richard); A. Tversky (Amos)

    1997-01-01

Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in the premium to compensate for a 1% default risk. While these

  15. Poster — Thur Eve — 59: Atlas Selection for Automated Segmentation of Pelvic CT for Prostate Radiotherapy

    International Nuclear Information System (INIS)

    Mallawi, A; Farrell, T; Diamond, K; Wierzbicki, M

    2014-01-01

    Automated atlas-based segmentation has recently been evaluated for use in planning prostate cancer radiotherapy. In the typical approach, the essential step is the selection of an atlas from a database that best matches the target image. This work proposes an atlas selection strategy and evaluates its impact on the final segmentation accuracy. Prostate length (PL), right femoral head diameter (RFHD), and left femoral head diameter (LFHD) were measured in CT images of 20 patients. Each subject was then taken as the target image to which all remaining 19 images were affinely registered. For each pair of registered images, the overlap between prostate and femoral head contours was quantified using the Dice Similarity Coefficient (DSC). Finally, we designed an atlas selection strategy that computed the ratio of PL (prostate segmentation), RFHD (right femur segmentation), and LFHD (left femur segmentation) between the target subject and each subject in the atlas database. Five atlas subjects yielding ratios nearest to one were then selected for further analysis. RFHD and LFHD were excellent parameters for atlas selection, achieving a mean femoral head DSC of 0.82 ± 0.06. PL had a moderate ability to select the most similar prostate, with a mean DSC of 0.63 ± 0.18. The DSC obtained with the proposed selection method were slightly lower than the maximums established using brute force, but this does not include potential improvements expected with deformable registration. Atlas selection based on PL for prostate and femoral diameter for femoral heads provides reasonable segmentation accuracy
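The two quantities central to this record — the Dice Similarity Coefficient used for evaluation, and selection of the atlas subjects whose measurement ratio to the target is nearest to one — can be sketched directly. The measurement values in the usage test are invented for illustration:

```python
def dice(a, b):
    """Dice Similarity Coefficient between two sets of contoured voxels:
    2|A∩B| / (|A| + |B|)."""
    a, b = set(a), set(b)
    return 2.0 * len(a & b) / (len(a) + len(b))

def select_atlases(target_measure, atlas_measures, n=5):
    """Keep the n atlas subjects whose measurement (e.g. prostate length
    or femoral head diameter) has a ratio to the target nearest to one."""
    return sorted(atlas_measures,
                  key=lambda s: abs(atlas_measures[s] / target_measure - 1.0))[:n]
```

Separate calls with PL, RFHD, and LFHD as the measurement reproduce the per-structure selection strategy described above.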

  16. Poster — Thur Eve — 59: Atlas Selection for Automated Segmentation of Pelvic CT for Prostate Radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Mallawi, A [McMaster University, Medical Physics and Applied Radiation Sciences Department, Hamilton, Ontario (Canada); Farrell, T; Diamond, K; Wierzbicki, M [McMaster University, Medical Physics and Applied Radiation Sciences Department, Hamilton, Ontario (Canada); Juravinski Cancer Centre, Medical Physics Department, Hamilton, Ontario (Canada)

    2014-08-15

    Automated atlas-based segmentation has recently been evaluated for use in planning prostate cancer radiotherapy. In the typical approach, the essential step is the selection of an atlas from a database that best matches the target image. This work proposes an atlas selection strategy and evaluates its impact on the final segmentation accuracy. Prostate length (PL), right femoral head diameter (RFHD), and left femoral head diameter (LFHD) were measured in CT images of 20 patients. Each subject was then taken as the target image to which all remaining 19 images were affinely registered. For each pair of registered images, the overlap between prostate and femoral head contours was quantified using the Dice Similarity Coefficient (DSC). Finally, we designed an atlas selection strategy that computed the ratio of PL (prostate segmentation), RFHD (right femur segmentation), and LFHD (left femur segmentation) between the target subject and each subject in the atlas database. Five atlas subjects yielding ratios nearest to one were then selected for further analysis. RFHD and LFHD were excellent parameters for atlas selection, achieving a mean femoral head DSC of 0.82 ± 0.06. PL had a moderate ability to select the most similar prostate, with a mean DSC of 0.63 ± 0.18. The DSC obtained with the proposed selection method were slightly lower than the maximums established using brute force, but this does not include potential improvements expected with deformable registration. Atlas selection based on PL for prostate and femoral diameter for femoral heads provides reasonable segmentation accuracy.

  17. ATLAS overview week highlights

    CERN Multimedia

    D. Froidevaux

    2005-01-01

    A warm and early October afternoon saw the beginning of the 2005 ATLAS overview week, which took place Rue de La Montagne Sainte-Geneviève in the heart of the Quartier Latin in Paris. All visitors had been warned many times by the ATLAS management and the organisers that the premises would be the subject of strict security clearance because of the "plan Vigipirate", which remains at some level of alert in all public buildings across France. The public building in question is now part of the Ministère de La Recherche, but used to host one of the so-called French "Grandes Ecoles", called l'Ecole Polytechnique (in France there is only one Ecole Polytechnique, whereas there are two in Switzerland) until the end of the seventies, a little while after it opened its doors also to women. In fact, the setting chosen for this ATLAS overview week by our hosts from LPNHE Paris has turned out to be ideal and the security was never an ordeal. For those seeing Paris for the first time, there we...

  18. Documentation of probabilistic fracture mechanics codes used for reactor pressure vessels subjected to pressurized thermal shock loading: Parts 1 and 2. Final report

    International Nuclear Information System (INIS)

    Balkey, K.; Witt, F.J.; Bishop, B.A.

    1995-06-01

    Significant attention has been focused on the issue of reactor vessel pressurized thermal shock (PTS) for many years. Pressurized thermal shock transient events are characterized by a rapid cooldown at potentially high pressure levels that could lead to a reactor vessel integrity concern for some pressurized water reactors. As a result of regulatory and industry efforts in the early 1980's, a probabilistic risk assessment methodology has been established to address this concern. Probabilistic fracture mechanics analyses are performed as part of this methodology to determine conditional probability of significant flaw extension for given pressurized thermal shock events. While recent industry efforts are underway to benchmark probabilistic fracture mechanics computer codes that are currently used by the nuclear industry, Part I of this report describes the comparison of two independent computer codes used at the time of the development of the original U.S. Nuclear Regulatory Commission (NRC) pressurized thermal shock rule. The work that was originally performed in 1982 and 1983 to compare the U.S. NRC - VISA and Westinghouse (W) - PFM computer codes has been documented and is provided in Part I of this report. Part II of this report describes the results of more recent industry efforts to benchmark PFM computer codes used by the nuclear industry. This study was conducted as part of the USNRC-EPRI Coordinated Research Program for reviewing the technical basis for pressurized thermal shock (PTS) analyses of the reactor pressure vessel. The work focused on the probabilistic fracture mechanics (PFM) analysis codes and methods used to perform the PTS calculations. An in-depth review of the methodologies was performed to verify the accuracy and adequacy of the various different codes. The review was structured around a series of benchmark sample problems to provide a specific context for discussion and examination of the fracture mechanics methodology

  19. Probabilistic forecasting and Bayesian data assimilation

    CERN Document Server

    Reich, Sebastian

    2015-01-01

    In this book the authors describe the principles and methods behind probabilistic forecasting and Bayesian data assimilation. Instead of focusing on particular application areas, the authors adopt a general dynamical systems approach, with a profusion of low-dimensional, discrete-time numerical examples designed to build intuition about the subject. Part I explains the mathematical framework of ensemble-based probabilistic forecasting and uncertainty quantification. Part II is devoted to Bayesian filtering algorithms, from classical data assimilation algorithms such as the Kalman filter, variational techniques, and sequential Monte Carlo methods, through to more recent developments such as the ensemble Kalman filter and ensemble transform filters. The McKean approach to sequential filtering in combination with coupling of measures serves as a unifying mathematical framework throughout Part II. Assuming only some basic familiarity with probability, this book is an ideal introduction for graduate students in ap...

  20. Multiple brain atlas database and atlas-based neuroimaging system.

    Science.gov (United States)

    Nowinski, W L; Fang, A; Nguyen, B T; Raphel, J K; Jagannathan, L; Raghavan, R; Bryan, R N; Miller, G A

    1997-01-01

    For the purpose of developing multiple, complementary, fully labeled electronic brain atlases and an atlas-based neuroimaging system for analysis, quantification, and real-time manipulation of cerebral structures in two and three dimensions, we have digitized, enhanced, segmented, and labeled the following print brain atlases: Co-Planar Stereotaxic Atlas of the Human Brain by Talairach and Tournoux, Atlas for Stereotaxy of the Human Brain by Schaltenbrand and Wahren, Referentially Oriented Cerebral MRI Anatomy by Talairach and Tournoux, and Atlas of the Cerebral Sulci by Ono, Kubik, and Abernathey. Three-dimensional extensions of these atlases have been developed as well. All two- and three-dimensional atlases are mutually preregistered and may be interactively registered with an actual patient's data. An atlas-based neuroimaging system has been developed that provides support for reformatting, registration, visualization, navigation, image processing, and quantification of clinical data. The anatomical index contains about 1,000 structures and over 400 sulcal patterns. Several new applications of the brain atlas database also have been developed, supported by various technologies such as virtual reality, the Internet, and electronic publishing. Fusion of information from multiple atlases assists the user in comprehensively understanding brain structures and identifying and quantifying anatomical regions in clinical data. The multiple brain atlas database and atlas-based neuroimaging system have substantial potential impact in stereotactic neurosurgery and radiotherapy by assisting in visualization and real-time manipulation in three dimensions of anatomical structures, in quantitative neuroradiology by allowing interactive analysis of clinical data, in three-dimensional neuroeducation, and in brain function studies.

  1. Advanced Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Technical Exchange Meeting

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2013-09-01

    During FY13, the INL developed an advanced SMR PRA framework which has been described in the report Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Detailed Technical Framework Specification, INL/EXT-13-28974 (April 2013). In this framework, the following areas are considered: probabilistic models to provide information specific to advanced SMRs; representation of specific SMR design issues such as having co-located modules and passive safety features; use of modern open-source and readily available analysis methods; internal and external events resulting in impacts to safety; all-hazards considerations; methods to support the identification of design vulnerabilities; and mechanistic and probabilistic data needs to support modeling and tools. In order to describe this framework more fully and obtain feedback on the proposed approaches, the INL hosted a technical exchange meeting during August 2013. This report describes the outcomes of that meeting.

  2. A four-dimensional motion field atlas of the tongue from tagged and cine magnetic resonance imaging

    Science.gov (United States)

    Xing, Fangxu; Prince, Jerry L.; Stone, Maureen; Wedeen, Van J.; El Fakhri, Georges; Woo, Jonghye

    2017-02-01

    Representation of human tongue motion using three-dimensional vector fields over time can be used to better understand tongue function during speech, swallowing, and other lingual behaviors. To characterize the inter-subject variability of the tongue's shape and motion in a population carrying out one of these functions, it is desirable to build a statistical model of the four-dimensional (4D) tongue. In this paper, we propose a method to construct a spatio-temporal atlas of tongue motion using magnetic resonance (MR) images acquired from fourteen healthy human subjects. First, cine MR images revealing the anatomical features of the tongue are used to construct a 4D intensity image atlas. Second, tagged MR images acquired to capture internal motion are used to compute a dense motion field at each time frame using a phase-based motion tracking method. Third, motion fields from each subject are pulled back to the cine atlas space using the deformation fields computed during the cine atlas construction. Finally, a spatio-temporal motion field atlas is created to show a sequence of mean motion fields and their inter-subject variation. The quality of the atlas was evaluated by deforming cine images in the atlas space. Comparison between deformed and original cine images showed high correspondence. The proposed method provides a quantitative representation to observe the commonality and variability of the tongue motion field for the first time, and shows potential in the evaluation of common properties such as strains and other tensors based on motion fields.
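The final atlas step described above (averaging the pulled-back motion fields and summarizing their inter-subject variation) can be sketched voxelwise with NumPy. This is an illustrative reconstruction under stated assumptions, not the authors' code: the function name `motion_field_atlas` and the choice of RMS vector deviation as the variability summary are assumptions.

```python
import numpy as np

def motion_field_atlas(fields):
    """Combine registered per-subject motion fields into a mean field plus
    a per-voxel inter-subject variability map (RMS deviation of vectors).

    fields : (n_subjects, X, Y, Z, 3) array of 3D displacement vectors,
             all assumed already pulled back into a common atlas space.
    """
    fields = np.asarray(fields, dtype=float)
    mean_field = fields.mean(axis=0)                          # (X, Y, Z, 3)
    deviation = np.linalg.norm(fields - mean_field, axis=-1)  # (n, X, Y, Z)
    variability = np.sqrt((deviation ** 2).mean(axis=0))      # (X, Y, Z)
    return mean_field, variability
```

In the real pipeline this would be run once per time frame, yielding the sequence of mean fields and variation maps that make up the spatio-temporal atlas.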

  3. Integration of Probabilistic Exposure Assessment and Probabilistic Hazard Characterization

    NARCIS (Netherlands)

    Voet, van der H.; Slob, W.

    2007-01-01

    A method is proposed for integrated probabilistic risk assessment where exposure assessment and hazard characterization are both included in a probabilistic way. The aim is to specify the probability that a random individual from a defined (sub)population will have an exposure high enough to cause a
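The integration described here, treating both the exposure and the individual's critical threshold probabilistically, can be illustrated with a small Monte Carlo sketch. The function name and the lognormal distributions in the usage example are illustrative assumptions, not the authors' model.

```python
import numpy as np

def exceedance_probability(exposure_sampler, threshold_sampler,
                           n=100_000, seed=0):
    """Monte Carlo estimate of P(exposure > individual threshold) for a
    random individual, with both quantities drawn from distributions."""
    rng = np.random.default_rng(seed)
    exposure = exposure_sampler(rng, n)    # one exposure per individual
    threshold = threshold_sampler(rng, n)  # one critical dose per individual
    return float((exposure > threshold).mean())

# Hypothetical usage: lognormal exposure vs. lognormal individual threshold.
p = exceedance_probability(
    lambda rng, n: rng.lognormal(mean=0.0, sigma=1.0, size=n),
    lambda rng, n: rng.lognormal(mean=2.0, sigma=0.5, size=n),
)
```

The estimate `p` is then the probability that a random individual from the (sub)population experiences an exposure high enough to cause the effect.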

  4. Human Reliability in Probabilistic Safety Assessments

    International Nuclear Information System (INIS)

    Nunez Mendez, J.

    1989-01-01

    Nowadays a growing interest in environmental aspects is detected in our country. It implies an assessment of the risk involved in industrial processes and installations in order to determine whether those are within acceptable limits. In these safety assessments, among which PSAs (Probabilistic Safety Assessments) can be pointed out, the role played by the human being in the system is one of the most relevant subjects (a relevance that has been demonstrated in past accidents). However, in Spain there are no manuals specifically dedicated to assessing the human contribution to risk in the frame of PSAs. This report aims to improve this situation by providing: a) a theoretical background to help the reader understand the nature of human error, b) a guide to carrying out a Human Reliability Analysis, and c) a selected overview of the techniques and methodologies currently applied in this area. (Author) 20 refs

  5. Cognitive Development Effects of Teaching Probabilistic Decision Making to Middle School Students

    Science.gov (United States)

    Mjelde, James W.; Litzenberg, Kerry K.; Lindner, James R.

    2011-01-01

    This study investigated the comprehension and effectiveness of teaching formal, probabilistic decision-making skills to middle school students. Two specific objectives were to determine (1) if middle school students can comprehend a probabilistic decision-making approach, and (2) if exposure to the modeling approaches improves middle school…

  6. The probabilistic approach and the deterministic licensing procedure

    International Nuclear Information System (INIS)

    Fabian, H.; Feigel, A.; Gremm, O.

    1984-01-01

    If safety goals are given, the creativity of the engineers is necessary to transform the goals into actual safety measures. That is, safety goals are not sufficient for the derivation of a safety concept; the licensing process asks ''What does a safe plant look like?'' The answer cannot be given by a probabilistic procedure, but needs definite deterministic statements; the conclusion is that the licensing process needs a deterministic approach. The probabilistic approach should be used in a complementary role in cases where deterministic criteria are incomplete, not detailed enough, or inconsistent, and where additional arguments for decision making in connection with the adequacy of a specific measure are necessary. But also in these cases the probabilistic answer has to be transformed into a clear deterministic statement. (orig.)

  7. AGIS: Integration of new technologies used in ATLAS Distributed Computing

    Science.gov (United States)

    Anisenkov, Alexey; Di Girolamo, Alessandro; Alandes Pradillo, Maria

    2017-10-01

    The variety of the ATLAS Distributed Computing infrastructure requires a central information system to define the topology of computing resources and to store different parameters and configuration data which are needed by various ATLAS software components. The ATLAS Grid Information System (AGIS) is the system designed to integrate configuration and status information about resources, services and topology of the computing infrastructure used by ATLAS Distributed Computing applications and services. Being an intermediate middleware system between clients and external information sources (like central BDII, GOCDB, MyOSG), AGIS defines the relations between experiment-specific used resources and physical distributed computing capabilities. Having been in production during LHC Run 1, AGIS became the central information system for Distributed Computing in ATLAS and it is continuously evolving to fulfil new user requests, enable enhanced operations and follow the extension of the ATLAS Computing model. The ATLAS Computing model and the data structures used by Distributed Computing applications and services are continuously evolving and tend to fit newer requirements from the ADC community. In this note, we describe the evolution and the recent developments of AGIS functionalities related to the integration of new technologies that have recently become widely used in ATLAS Computing, such as the flexible utilization of opportunistic Cloud and HPC computing resources, ObjectStore services integration for the Distributed Data Management (Rucio) and ATLAS workload management (PanDA) systems, unified storage protocols declaration required for the PanDA Pilot site movers, and others. The improvements of the information model and general updates are also shown; in particular, we explain how other collaborations outside ATLAS could benefit from the system as a computing resources information catalogue. AGIS is evolving towards a common information system, not coupled to a specific experiment.

  8. Atlas of Skeletal SPECT/CT Clinical Images

    International Nuclear Information System (INIS)

    2016-01-01

    The atlas focuses specifically on single photon emission computed tomography/computed tomography (SPECT/CT) in musculoskeletal imaging, and thus illustrates the inherent advantages of the combination of the metabolic and anatomical component in a single procedure. In addition, the atlas provides information on the usefulness of several sets of specific indications. The publication, which serves more as a training tool rather than a textbook, will help to further integrate the SPECT and CT experience in clinical practice by presenting a series of typical cases with many different patterns of SPECT/CT seen in bone scintigraphy

  9. Probabilistic Logical Characterization

    DEFF Research Database (Denmark)

    Hermanns, Holger; Parma, Augusto; Segala, Roberto

    2011-01-01

    Probabilistic automata exhibit both probabilistic and non-deterministic choice. They are therefore a powerful semantic foundation for modeling concurrent systems with random phenomena arising in many applications ranging from artificial intelligence, security, and systems biology to performance modeling. Several variations of bisimulation and simulation relations have proved to be useful as means to abstract and compare different automata. This paper develops a taxonomy of logical characterizations of these relations on image-finite and image-infinite probabilistic automata.

  10. Probabilistic metric spaces

    CERN Document Server

    Schweizer, B

    2005-01-01

    Topics include special classes of probabilistic metric spaces, topologies, and several related structures, such as probabilistic normed and inner-product spaces. 1983 edition, updated with 3 new appendixes. Includes 17 illustrations.

  11. AGIS: Integration of new technologies used in ATLAS Distributed Computing

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00291854; The ATLAS collaboration; Di Girolamo, Alessandro; Alandes Pradillo, Maria

    2017-01-01

    The variety of the ATLAS Distributed Computing infrastructure requires a central information system to define the topology of computing resources and to store different parameters and configuration data which are needed by various ATLAS software components. The ATLAS Grid Information System (AGIS) is the system designed to integrate configuration and status information about resources, services and topology of the computing infrastructure used by ATLAS Distributed Computing applications and services. Being an intermediate middleware system between clients and external information sources (like central BDII, GOCDB, MyOSG), AGIS defines the relations between experiment-specific used resources and physical distributed computing capabilities. Having been in production during LHC Run 1, AGIS became the central information system for Distributed Computing in ATLAS and it is continuously evolving to fulfil new user requests, enable enhanced operations and follow the extension of the ATLAS Computing model. The ATLAS Computin...

  12. Probabilistic record linkage.

    Science.gov (United States)

    Sayers, Adrian; Ben-Shlomo, Yoav; Blom, Ashley W; Steele, Fiona

    2016-06-01

    Studies involving the use of probabilistic record linkage are becoming increasingly common. However, the methods underpinning probabilistic record linkage are not widely taught or understood, and therefore these studies can appear to be a 'black box' research tool. In this article, we aim to describe the process of probabilistic record linkage through a simple exemplar. We first introduce the concept of deterministic linkage and contrast this with probabilistic linkage. We illustrate each step of the process using a simple exemplar and describe the data structure required to perform a probabilistic linkage. We describe the process of calculating and interpreting matched weights and how to convert matched weights into posterior probabilities of a match using Bayes theorem. We conclude this article with a brief discussion of some of the computational demands of record linkage, how you might assess the quality of your linkage algorithm, and how epidemiologists can maximize the value of their record-linked research using robust record linkage methods. © The Author 2015; Published by Oxford University Press on behalf of the International Epidemiological Association.
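The weight-and-posterior calculation the article describes can be sketched in the classic Fellegi-Sunter style: per-field log agreement weights are summed, and Bayes' theorem converts the total weight plus a prior into a posterior match probability. This is an illustrative sketch; the function names and the single-field example values are assumptions, not the article's own data.

```python
import math

def match_weight(m_probs, u_probs, agreements):
    """Total log2 match weight across comparison fields.
    m = P(field agrees | records are a true match),
    u = P(field agrees | records are a non-match)."""
    w = 0.0
    for m, u, agree in zip(m_probs, u_probs, agreements):
        w += math.log2(m / u) if agree else math.log2((1 - m) / (1 - u))
    return w

def posterior_match_probability(weight, prior):
    """Convert a total weight into P(match | data) via Bayes' theorem:
    posterior odds = 2**weight * prior odds."""
    odds = (2.0 ** weight) * prior / (1.0 - prior)
    return odds / (1.0 + odds)
```

For a single field with m = 0.9 and u = 0.1, agreement contributes log2(9), roughly 3.17, to the weight; starting from a 50% prior, the posterior match probability is 0.9.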

  13. Probabilistic and technology-specific modeling of emissions from municipal solid-waste incineration.

    Science.gov (United States)

    Koehler, Annette; Peyer, Fabio; Salzmann, Christoph; Saner, Dominik

    2011-04-15

    The European legislation increasingly directs waste streams which cannot be recycled toward thermal treatment. Models are therefore needed that help to quantify emissions of waste incineration and thus reveal potential risks and mitigation needs. This study presents a probabilistic model which computes emissions as a function of waste composition and technological layout of grate incineration plants and their pollution-control equipment. In contrast to previous waste-incineration models, this tool is based on a broader empirical database and allows uncertainties in emission loads to be quantified. Comparison to monitoring data of 83 actual European plants showed no significant difference between modeled emissions and measured data. An inventory of all European grate incineration plants including technical characteristics and plant capacities was established, and waste material mixtures were determined for different European countries, including generic elemental waste-material compositions. The model thus allows for calculation of country-specific and material-dependent emission factors and enables identification and tracking of emission sources. It thereby helps to develop strategies to decrease plant emissions by reducing or redirecting problematic waste fractions to other treatment options or adapting the technological equipment of waste incinerators.

  14. The ATLAS online High Level Trigger framework experience reusing offline software components in the ATLAS trigger

    CERN Document Server

    Wiedenmann, W

    2009-01-01

    Event selection in the Atlas High Level Trigger is accomplished to a large extent by reusing software components and event selection algorithms developed and tested in an offline environment. Many of these offline software modules are not specifically designed to run in a heavily multi-threaded online data flow environment. The Atlas High Level Trigger (HLT) framework based on the Gaudi and Atlas Athena frameworks, forms the interface layer, which allows the execution of the HLT selection and monitoring code within the online run control and data flow software. While such an approach provides a unified environment for trigger event selection across all of Atlas, it also poses strict requirements on the reused software components in terms of performance, memory usage and stability. Experience of running the HLT selection software in the different environments and especially on large multi-node trigger farms has been gained in several commissioning periods using preloaded Monte Carlo events, in data taking peri...

  15. TU-CD-BRA-04: Evaluation of An Atlas-Based Segmentation Method for Prostate and Peripheral Zone Regions On MRI

    International Nuclear Information System (INIS)

    Nelson, AS; Piper, J; Curry, K; Swallen, A; Padgett, K; Pollack, A; Stoyanova, RS

    2015-01-01

    Purpose: Prostate MRI plays an important role in diagnosis, biopsy guidance, and therapy planning for prostate cancer. Prostate MRI contours can be used to aid in image fusion for ultrasound biopsy guidance and delivery of radiation. Our goal in this study is to evaluate an automatic atlas-based segmentation method for generating prostate and peripheral zone (PZ) contours on MRI. Methods: T2-weighted MRIs were acquired on a 3T Discovery MR750 System (GE, Milwaukee). The Volumes of Interest (VOIs), prostate and PZ, were outlined by an expert radiation oncologist and used to create an atlas library for atlas-based segmentation. The atlas-segmentation accuracy was evaluated using a leave-one-out analysis. The method involved automatically finding the atlas subject that best matched the test subject, followed by a normalized intensity-based free-form deformable registration of the atlas subject to the test subject. The prostate and PZ contours were transformed to the test subject using the same deformation. For each test subject the three best matches were used and the final contour was combined using Majority Vote. The atlas-segmentation process was fully automatic. Dice similarity coefficients (DSC) and mean Hausdorff values were used for comparison. Results: VOI contours were available for 28 subjects. For the prostate, the atlas-based segmentation method resulted in an average DSC of 0.88±0.08 and a mean Hausdorff distance of 1.1±0.9 mm. The number of patients (#) in DSC ranges is as follows: 0.60–0.69(1), 0.70–0.79(2), 0.80–0.89(13), >0.89(11). For the PZ, the average DSC was 0.72±0.17 and the average Hausdorff distance 0.9±0.9 mm. The number of patients (#) in DSC ranges is as follows: 0.89(1). Conclusion: The MRI atlas-based segmentation method achieved good results for both the whole prostate and PZ compared to expert-defined VOIs. The technique is fast, fully automatic, and has the potential to provide significant time savings for prostate VOI
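The two agreement metrics used in this and the following contouring studies can be sketched directly from their standard definitions (Dice similarity coefficient over binary masks, Hausdorff distance over boundary point sets). This is an illustrative NumPy implementation, not the evaluation code used in the study.

```python
import numpy as np

def dice(a, b):
    """Dice similarity coefficient between two binary masks:
    2|A intersect B| / (|A| + |B|)."""
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

def hausdorff(a_pts, b_pts):
    """Symmetric Hausdorff distance between two (n, d) point sets:
    the larger of the two directed max-of-min distances."""
    d = np.linalg.norm(np.asarray(a_pts, float)[:, None, :]
                       - np.asarray(b_pts, float)[None, :, :], axis=-1)
    return float(max(d.min(axis=1).max(), d.min(axis=0).max()))
```

A DSC of 1.0 means perfect overlap and 0.0 means none, which is why the prostate scores around 0.88 above indicate close agreement with the expert contours.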

  16. Development of the ATLAS simulation framework

    International Nuclear Information System (INIS)

    DellAcqua, A.; Stavrianakou, M.; Amako, K.; Kanzaki, J.; Morita, Y.; Murakami, K.; Sasaki; Kurashige, H.; Rimoldi, A.; Saeki, T.; Ueda, I.; Tanaka, S.; Yoshida, H.

    2001-01-01

    The object-oriented (OO) approach is the key technology for developing a software system in the LHC/ATLAS experiment. The authors developed an OO simulation framework based on the Geant4 general-purpose simulation toolkit. Because of the complexity of simulation in ATLAS, the authors paid particular attention to scalability in the design. Although the first target of this framework is to implement the ATLAS full detector simulation program, it contains no experiment-specific code; therefore, it can be utilized for the development of any simulation package, not only for HEP experiments but also for various other research domains. The authors discuss their approach to the design and implementation of the framework.

  17. Probabilistic risk assessment methodology

    International Nuclear Information System (INIS)

    Shinaishin, M.A.

    1988-06-01

    The objective of this work is to provide the tools necessary for clear identification of: the purpose of a Probabilistic Risk Study, the bounds and depth of the study, the proper modeling techniques to be used, the failure modes contributing to the analysis, the classical and Bayesian approaches for manipulating the data necessary for quantification, ways of treating uncertainties, and available computer codes that may be used in performing such probabilistic analyses. In addition, it provides the means for measuring the importance of a safety feature to maintaining a level of risk at a Nuclear Power Plant and the worth of optimizing a safety system in risk reduction. In applying these techniques so that they accommodate our national resources and needs, it was felt that emphasis should be put on the system reliability analysis level of PRA. Objectives of such studies could include: comparing the systems' designs of the various vendors in the bidding stage, and performing grid reliability and human performance analyses using nation-specific data. (author)

  18. Probabilistic risk assessment methodology

    Energy Technology Data Exchange (ETDEWEB)

    Shinaishin, M A

    1988-06-15

    The objective of this work is to provide the tools necessary for clear identification of: the purpose of a Probabilistic Risk Study, the bounds and depth of the study, the proper modeling techniques to be used, the failure modes contributing to the analysis, the classical and Bayesian approaches for manipulating the data necessary for quantification, ways of treating uncertainties, and available computer codes that may be used in performing such probabilistic analyses. In addition, it provides the means for measuring the importance of a safety feature to maintaining a level of risk at a Nuclear Power Plant and the worth of optimizing a safety system in risk reduction. In applying these techniques so that they accommodate our national resources and needs, it was felt that emphasis should be put on the system reliability analysis level of PRA. Objectives of such studies could include: comparing the systems' designs of the various vendors in the bidding stage, and performing grid reliability and human performance analyses using nation-specific data. (author)

  19. Probabilistic vs. deterministic fiber tracking and the influence of different seed regions to delineate cerebellar-thalamic fibers in deep brain stimulation.

    Science.gov (United States)

    Schlaier, Juergen R; Beer, Anton L; Faltermeier, Rupert; Fellner, Claudia; Steib, Kathrin; Lange, Max; Greenlee, Mark W; Brawanski, Alexander T; Anthofer, Judith M

    2017-06-01

    This study compared tractography approaches for identifying cerebellar-thalamic fiber bundles relevant to planning target sites for deep brain stimulation (DBS). In particular, probabilistic and deterministic tracking of the dentate-rubro-thalamic tract (DRTT) and differences between the spatial courses of the DRTT and the cerebello-thalamo-cortical (CTC) tract were compared. Six patients with movement disorders were examined by magnetic resonance imaging (MRI), including two sets of diffusion-weighted images (12 and 64 directions). Probabilistic and deterministic tractography was applied on each diffusion-weighted dataset to delineate the DRTT. Results were compared with regard to their sensitivity in revealing the DRTT and additional fiber tracts and processing time. Two sets of regions-of-interests (ROIs) guided deterministic tractography of the DRTT or the CTC, respectively. Tract distances to an atlas-based reference target were compared. Probabilistic fiber tracking with 64 orientations detected the DRTT in all twelve hemispheres. Deterministic tracking detected the DRTT in nine (12 directions) and in only two (64 directions) hemispheres. Probabilistic tracking was more sensitive in detecting additional fibers (e.g. ansa lenticularis and medial forebrain bundle) than deterministic tracking. Probabilistic tracking lasted substantially longer than deterministic. Deterministic tracking was more sensitive in detecting the CTC than the DRTT. CTC tracts were located adjacent but consistently more posterior to DRTT tracts. These results suggest that probabilistic tracking is more sensitive and robust in detecting the DRTT but harder to implement than deterministic approaches. Although sensitivity of deterministic tracking is higher for the CTC than the DRTT, targets for DBS based on these tracts likely differ. © 2017 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.

  20. Creation of RTOG compliant patient CT-atlases for automated atlas based contouring of local regional breast and high-risk prostate cancers.

    Science.gov (United States)

    Velker, Vikram M; Rodrigues, George B; Dinniwell, Robert; Hwee, Jeremiah; Louie, Alexander V

    2013-07-25

    Increasing use of IMRT to treat breast and prostate cancers at high risk of regional nodal spread relies on accurate contouring of targets and organs at risk, which is subject to significant inter- and intra-observer variability. This study sought to evaluate the performance of an atlas based deformable registration algorithm to create multi-patient CT based atlases for automated contouring. Breast and prostate multi-patient CT atlases (n = 50 and 14 respectively) were constructed to be consistent with RTOG consensus contouring guidelines. A commercially available software algorithm was evaluated by comparison of atlas-predicted contours against manual contours using Dice Similarity coefficients. High levels of agreement were demonstrated for prediction of OAR contours of lungs, heart, femurs, and minor editing required for the CTV breast/chest wall. CTVs generated for axillary nodes, supraclavicular nodes, prostate, and pelvic nodes demonstrated modest agreement. Small and highly variable structures, such as internal mammary nodes, lumpectomy cavity, rectum, penile bulb, and seminal vesicles had poor agreement. A method to construct and validate performance of CT-based multi-patient atlases for automated atlas based auto-contouring has been demonstrated, and can be adopted for clinical use in planning of local regional breast and high-risk prostate radiotherapy.

  1. Creation of RTOG compliant patient CT-atlases for automated atlas based contouring of local regional breast and high-risk prostate cancers

    International Nuclear Information System (INIS)

    Velker, Vikram M; Rodrigues, George B; Dinniwell, Robert; Hwee, Jeremiah; Louie, Alexander V

    2013-01-01

    Increasing use of IMRT to treat breast and prostate cancers at high risk of regional nodal spread relies on accurate contouring of targets and organs at risk, which is subject to significant inter- and intra-observer variability. This study sought to evaluate the performance of an atlas based deformable registration algorithm to create multi-patient CT based atlases for automated contouring. Breast and prostate multi-patient CT atlases (n = 50 and 14 respectively) were constructed to be consistent with RTOG consensus contouring guidelines. A commercially available software algorithm was evaluated by comparison of atlas-predicted contours against manual contours using Dice Similarity coefficients. High levels of agreement were demonstrated for prediction of OAR contours of lungs, heart, femurs, and minor editing required for the CTV breast/chest wall. CTVs generated for axillary nodes, supraclavicular nodes, prostate, and pelvic nodes demonstrated modest agreement. Small and highly variable structures, such as internal mammary nodes, lumpectomy cavity, rectum, penile bulb, and seminal vesicles had poor agreement. A method to construct and validate performance of CT-based multi-patient atlases for automated atlas based auto-contouring has been demonstrated, and can be adopted for clinical use in planning of local regional breast and high-risk prostate radiotherapy

  2. Probabilistic Graph Layout for Uncertain Network Visualization.

    Science.gov (United States)

    Schulz, Christoph; Nocaj, Arlind; Goertler, Jochen; Deussen, Oliver; Brandes, Ulrik; Weiskopf, Daniel

    2017-01-01

    We present a novel uncertain network visualization technique based on node-link diagrams. Nodes expand spatially in our probabilistic graph layout, depending on the underlying probability distributions of edges. The visualization is created by computing a two-dimensional graph embedding that combines samples from the probabilistic graph. A Monte Carlo process is used to decompose a probabilistic graph into its possible instances and to continue with our graph layout technique. Splatting and edge bundling are used to visualize point clouds and network topology. The results provide insights into probability distributions for the entire network, not only for individual nodes and edges. We validate our approach using three data sets that represent a wide range of network types: synthetic data, protein-protein interactions from the STRING database, and travel times extracted from Google Maps. Our approach reveals general limitations of the force-directed layout and allows the user to recognize that some nodes of the graph are at a specific position just by chance.
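The Monte Carlo decomposition step (sampling deterministic instances of a probabilistic graph) can be sketched as follows. The representation of edges as `(u, v, p)` triples and the function names are assumptions for illustration, not the paper's implementation.

```python
import random

def sample_instance(edges, rng):
    """Draw one possible deterministic instance of a probabilistic graph:
    each edge (u, v, p) is kept independently with probability p."""
    return {(u, v) for u, v, p in edges if rng.random() < p}

def edge_frequencies(edges, n_samples=1000, seed=0):
    """Monte Carlo estimate of how often each edge appears across the
    sampled instances (approximating its inclusion probability)."""
    rng = random.Random(seed)
    counts = {(u, v): 0 for u, v, _ in edges}
    for _ in range(n_samples):
        for e in sample_instance(edges, rng):
            counts[e] += 1
    return {e: c / n_samples for e, c in counts.items()}
```

In the paper's pipeline, each sampled instance would be laid out (or contribute positions to a combined embedding), and the resulting point clouds are then splatted to show where nodes plausibly sit.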

  3. A review of structural and functional brain networks: small world and atlas.

    Science.gov (United States)

    Yao, Zhijun; Hu, Bin; Xie, Yuanwei; Moore, Philip; Zheng, Jiaxiang

    2015-03-01

    Brain networks can be divided into two categories: structural and functional networks. Many studies of neuroscience have reported that the complex brain networks are characterized by small-world or scale-free properties. The identification of nodes is the key factor in studying the properties of networks on the macro-, micro- or mesoscale in both structural and functional networks. In the study of brain networks, nodes are always determined by atlases. Therefore, the selection of atlases is critical, and appropriate atlases are helpful to combine the analyses of structural and functional networks. Currently, some problems still exist in the establishment or usage of atlases, which are often caused by the segmentation or the parcellation of the brain. We suggest that quantification of brain networks might be affected by the selection of atlases to a large extent. In the process of building atlases, the influences of single subjects and groups should be balanced. In this article, we focused on the effects of atlases on the analysis of brain networks and the improved divisions based on the tractography or connectivity in the parcellation of atlases.

  4. The ATLAS simulation infrastructure

    Czech Academy of Sciences Publication Activity Database

    Aad, G.; Abbott, B.; Abdallah, J.; Bazalová, Magdalena; Böhm, Jan; Chudoba, Jiří; Gallus, Petr; Gunther, Jaroslav; Havránek, Miroslav; Hruška, I.; Jahoda, M.; Juránek, Vojtěch; Kepka, Oldřich; Kupčo, Alexander; Kůs, Vlastimil; Kvasnička, Jiří; Lipinský, L.; Lokajíček, Miloš; Marčišovský, Michal; Mikeštíková, Marcela; Myška, Miroslav; Němeček, Stanislav; Panušková, M.; Popule, Jiří; Růžička, Pavel; Schovancová, Jaroslava; Šícho, Petr; Staroba, Pavel; Šťastný, Jan; Taševský, Marek; Tic, Tomáš; Tomášek, Lukáš; Tomášek, Michal; Valenta, J.; Vrba, Václav

    2010-01-01

    Roč. 70, č. 3 (2010), s. 823-874 ISSN 1434-6044 R&D Projects: GA MŠk LC527; GA MŠk LA08015; GA MŠk LA08032 Institutional research plan: CEZ:AV0Z10100502 Keywords : ATLAS * simulation Subject RIV: BF - Elementary Particles and High Energy Physics Impact factor: 3.248, year: 2010 http://arxiv.org/pdf/1005.4568

  5. Disjunctive Probabilistic Modal Logic is Enough for Bisimilarity on Reactive Probabilistic Systems

    OpenAIRE

    Bernardo, Marco; Miculan, Marino

    2016-01-01

    Larsen and Skou characterized probabilistic bisimilarity over reactive probabilistic systems with a logic including true, negation, conjunction, and a diamond modality decorated with a probabilistic lower bound. Later on, Desharnais, Edalat, and Panangaden showed that negation is not necessary to characterize the same equivalence. In this paper, we prove that the logical characterization holds also when conjunction is replaced by disjunction, with negation still being not necessary. To this e...

  6. Probabilistic Compositional Models: solution of an equivalence problem

    Czech Academy of Sciences Publication Activity Database

    Kratochvíl, Václav

    2013-01-01

    Roč. 54, č. 5 (2013), s. 590-601 ISSN 0888-613X R&D Projects: GA ČR GA13-20012S Institutional support: RVO:67985556 Keywords : Probabilistic model * Compositional model * Independence * Equivalence Subject RIV: BA - General Mathematics Impact factor: 1.977, year: 2013 http://library.utia.cas.cz/separaty/2013/MTR/kratochvil-0391079.pdf

  7. Probabilistic Structural Analysis Program

    Science.gov (United States)

    Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.

    2010-01-01

    NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and life-prediction methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available, such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.
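The probability-of-failure quantity at the core of such tools can be illustrated with the simplest possible estimator, plain Monte Carlo on a limit-state function (a toy sketch with made-up input distributions, not the NASA/NESSUS advanced mean value or adaptive importance sampling algorithms):

```python
import numpy as np

def prob_of_failure(limit_state, sample_inputs, n_samples=100_000, seed=0):
    """Plain Monte Carlo estimate of P[g(X) < 0], where g is the limit-state
    function and negative values denote failure."""
    rng = np.random.default_rng(seed)
    x = sample_inputs(rng, n_samples)
    g = limit_state(x)
    return float(np.mean(g < 0.0))

# Toy limit state: capacity R ~ N(10, 1) minus load S ~ N(7, 1); failure when R < S.
def sample_inputs(rng, n):
    return rng.normal([10.0, 7.0], [1.0, 1.0], size=(n, 2))

pf = prob_of_failure(lambda x: x[:, 0] - x[:, 1], sample_inputs)
```

Plain sampling needs very many samples when failures are rare, which is exactly why production codes favor the more efficient methods named in the abstract.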

  8. Volume measurements of individual muscles in human quadriceps femoris using atlas-based segmentation approaches.

    Science.gov (United States)

    Le Troter, Arnaud; Fouré, Alexandre; Guye, Maxime; Confort-Gouny, Sylviane; Mattei, Jean-Pierre; Gondin, Julien; Salort-Campana, Emmanuelle; Bendahan, David

    2016-04-01

    Atlas-based segmentation is a powerful method for automatic structural segmentation of several sub-structures in many organs. However, such an approach has been very scarcely used in the context of muscle segmentation, and so far no study has assessed such a method for the automatic delineation of individual muscles of the quadriceps femoris (QF). In the present study, we have evaluated a fully automated multi-atlas method and a semi-automated single-atlas method for the segmentation and volume quantification of the four muscles of the QF and for the QF as a whole. The study was conducted in 32 young healthy males, using high-resolution magnetic resonance images (MRI) of the thigh. The multi-atlas-based segmentation method was conducted in 25 subjects. Different non-linear registration approaches based on free-form deformable (FFD) and symmetric diffeomorphic normalization (SyN) algorithms were assessed. Optimal parameters of two fusion methods, i.e., STAPLE and STEPS, were determined on the basis of the highest Dice similarity index (DSI), considering manual segmentation (MSeg) as the ground truth. Validation and reproducibility of this pipeline were determined using another MRI dataset recorded in seven healthy male subjects, on the basis of additional metrics such as muscle volume similarity values, the intraclass correlation coefficient, and the coefficient of variation. Both non-linear registration methods (FFD and SyN) were also evaluated as part of a single-atlas strategy in order to assess longitudinal muscle volume measurements. The multi- and the single-atlas approaches were compared for the segmentation and the volume quantification of the four muscles of the QF and for the QF as a whole. Considering each muscle of the QF, the DSI of the multi-atlas-based approach was high (0.87 ± 0.11), and the best results were obtained with the combination of two deformation fields resulting from the SyN registration method and the STEPS fusion algorithm. The optimal variables for FFD
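The Dice similarity index used as the evaluation criterion above has a direct definition on binary masks, 2|A∩B| / (|A| + |B|). A minimal sketch (function name and toy masks are illustrative):

```python
import numpy as np

def dice(a, b):
    """Dice similarity coefficient between two binary masks: 2|A∩B| / (|A| + |B|).
    Ranges from 0 (no overlap) to 1 (identical masks)."""
    a = np.asarray(a, dtype=bool)
    b = np.asarray(b, dtype=bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 1.0  # both masks empty: define as perfect agreement
    return 2.0 * np.logical_and(a, b).sum() / denom

# Toy automatic vs. manual segmentation on a 4x4 grid.
auto = np.zeros((4, 4), dtype=bool)
auto[1:3, 1:3] = True       # 4 voxels
manual = np.zeros((4, 4), dtype=bool)
manual[1:3, 1:4] = True     # 6 voxels, overlapping auto in 4
score = dice(auto, manual)  # 2*4 / (4 + 6) = 0.8
```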

  9. Building a high-resolution T2-weighted MR-based probabilistic model of tumor occurrence in the prostate.

    Science.gov (United States)

    Nagarajan, Mahesh B; Raman, Steven S; Lo, Pechin; Lin, Wei-Chan; Khoshnoodi, Pooria; Sayre, James W; Ramakrishna, Bharath; Ahuja, Preeti; Huang, Jiaoti; Margolis, Daniel J A; Lu, David S K; Reiter, Robert E; Goldin, Jonathan G; Brown, Matthew S; Enzmann, Dieter R

    2018-02-19

    We present a method for generating a T2 MR-based probabilistic model of tumor occurrence in the prostate to guide the selection of anatomical sites for targeted biopsies and serve as a diagnostic tool to aid radiological evaluation of prostate cancer. In our study, the prostate and any radiological findings within were segmented retrospectively on 3D T2-weighted MR images of 266 subjects who underwent radical prostatectomy. Subsequent histopathological analysis determined both the ground truth and the Gleason grade of the tumors. A randomly chosen subset of 19 subjects was used to generate a multi-subject-derived prostate template. Subsequently, a cascading registration algorithm involving both affine and non-rigid B-spline transforms was used to register the prostate of every subject to the template. Corresponding transformation of radiological findings yielded a population-based probabilistic model of tumor occurrence. The quality of our probabilistic model building approach was statistically evaluated by measuring the proportion of correct placements of tumors in the prostate template, i.e., the number of tumors that maintained their anatomical location within the prostate after their transformation into the prostate template space. The probabilistic model built with tumors deemed clinically significant demonstrated a heterogeneous distribution of tumors, with higher likelihood of tumor occurrence at the mid-gland anterior transition zone and the base-to-mid-gland posterior peripheral zones. Of 250 MR lesions analyzed, 248 maintained their original anatomical location with respect to the prostate zones after transformation into the prostate template space. We present a robust method for generating a probabilistic model of tumor occurrence in the prostate that could aid clinical decision making, such as selection of anatomical sites for MR-guided prostate biopsies.
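Once all findings are registered to a common template, the population-based probability map described above amounts to a voxel-wise average of binary lesion masks. A minimal sketch, assuming the masks have already been transformed into template space (function name and toy masks are ours):

```python
import numpy as np

def occurrence_map(masks):
    """Voxel-wise probability of tumor occurrence: the fraction of subjects
    whose (template-registered) binary lesion mask covers each voxel."""
    stack = np.stack([np.asarray(m, dtype=float) for m in masks])
    return stack.mean(axis=0)

# Three toy subjects already in template space (2x2 "prostates").
m1 = np.array([[0, 1], [0, 1]])
m2 = np.array([[0, 1], [0, 0]])
m3 = np.array([[0, 1], [1, 0]])
pmap = occurrence_map([m1, m2, m3])  # e.g. top-right voxel: lesion in 3/3 subjects
```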

  10. Spracovanie dát na experimente ATLAS [Data processing at the ATLAS experiment]

    Czech Academy of Sciences Publication Activity Database

    Marčišovský, Michal; Kubeš, T.; Chudoba, Jiří

    2008-01-01

    Roč. 58, č. 6 (2008), 354-359 ISSN 0009-0700 R&D Projects: GA MŠk LA08032 Institutional research plan: CEZ:AV0Z10100502 Keywords : Atlas * computing * DCS * Grid Subject RIV: BF - Elementary Particles and High Energy Physics

  11. The ATLAS semiconductor tracker end-cap module

    Czech Academy of Sciences Publication Activity Database

    Abdesselam, A.; Adkin, P. J.; Allport, P.; Böhm, Jan; Šťastný, Jan

    2007-01-01

    Roč. 575, - (2007), s. 353-389 ISSN 0168-9002 Institutional research plan: CEZ:AV0Z10100502 Keywords : ATLAS * SCT * silicon * microstrip * module * LHC Subject RIV: BF - Elementary Particles and High Energy Physics Impact factor: 1.114, year: 2007

  12. Applications of probabilistic techniques at NRC

    International Nuclear Information System (INIS)

    Thadani, A.; Rowsome, F.; Speis, T.

    1984-01-01

    The NRC is currently making extensive use of probabilistic safety assessment in the reactor regulation. Most of these applications have been introduced in the regulatory activities in the past few years. Plant Probabilistic Safety Studies are being utilized as a design tool for applications for standard designs and for assessment of plants located in regions of particularly high population density. There is considerable motivation for licenses to perform plant-specific probabilistic studies for many, if not all, of the existing operating nuclear power plants as a tool for prioritizing the implementation of the many outstanding licensing actions of these plants as well as recommending the elimination of a number of these issues which are judged to be insignificant in terms of their contribution to safety and risk. Risk assessment perspectives are being used in the priorization of generic safety issues, development of technical resolution of unresolved safety issues, assessing safety significance of proposed new regulatory requirements, assessment of safety significance of some of the occurrences at operating facilities and in environmental impact analyses of license applicants as required by the National Environmental Policy Act. (orig.)

  13. Duplicate Detection in Probabilistic Data

    NARCIS (Netherlands)

    Panse, Fabian; van Keulen, Maurice; de Keijzer, Ander; Ritter, Norbert

    2009-01-01

    Collected data often contains uncertainties. Probabilistic databases have been proposed to manage uncertain data. To combine data from multiple autonomous probabilistic databases, an integration of probabilistic data has to be performed. Until now, however, data integration approaches have focused

  14. ATLAS & Google — "Data Ocean" R&D Project

    CERN Document Server

    The ATLAS collaboration

    2017-01-01

    ATLAS is facing several challenges with respect to its computing requirements for the LHC Run-3 (2020-2023) and HL-LHC runs (2025-2034). The challenges are not specific to ATLAS or the LHC, but are common to the HENP computing community. Most importantly, storage continues to be the driving cost factor and at the current growth rate cannot absorb the increased physics output of the experiment. Novel computing models with a more dynamic use of storage and computing resources need to be considered. This project aims to start an R&D effort for evaluating and adopting novel IT technologies for HENP computing. ATLAS and Google plan to launch an R&D project to integrate Google cloud resources (Storage and Compute) into the ATLAS distributed computing environment. After a series of teleconferences, a face-to-face brainstorming meeting in Denver, CO at the Supercomputing 2017 conference resulted in this proposal for a first prototype of the "Data Ocean" project. The idea is threefold: (a) to allow ATLAS to explore the...

  15. TU-CD-BRA-04: Evaluation of An Atlas-Based Segmentation Method for Prostate and Peripheral Zone Regions On MRI

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, AS; Piper, J; Curry, K; Swallen, A [MIM Software Inc., Cleveland, OH (United States); Padgett, K; Pollack, A; Stoyanova, RS [University of Miami, Miami, FL (United States)

    2015-06-15

    Purpose: Prostate MRI plays an important role in diagnosis, biopsy guidance, and therapy planning for prostate cancer. Prostate MRI contours can be used to aid in image fusion for ultrasound biopsy guidance and delivery of radiation. Our goal in this study is to evaluate an automatic atlas-based segmentation method for generating prostate and peripheral zone (PZ) contours on MRI. Methods: T2-weighted MRIs were acquired on a 3T Discovery MR750 system (GE, Milwaukee). The volumes of interest (VOIs), prostate and PZ, were outlined by an expert radiation oncologist and used to create an atlas library for atlas-based segmentation. The atlas-segmentation accuracy was evaluated using a leave-one-out analysis. The method involved automatically finding the atlas subject that best matched the test subject, followed by a normalized intensity-based free-form deformable registration of the atlas subject to the test subject. The prostate and PZ contours were transformed to the test subject using the same deformation. For each test subject the three best matches were used, and the final contour was combined using majority vote. The atlas-segmentation process was fully automatic. Dice similarity coefficients (DSC) and mean Hausdorff values were used for comparison. Results: VOI contours were available for 28 subjects. For the prostate, the atlas-based segmentation method resulted in an average DSC of 0.88 ± 0.08 and a mean Hausdorff distance of 1.1 ± 0.9 mm. The number of patients in each DSC range was as follows: 0.60–0.69 (1), 0.70–0.79 (2), 0.80–0.89 (13), >0.89 (11). For the PZ, the average DSC was 0.72 ± 0.17 and the average Hausdorff distance 0.9 ± 0.9 mm. The number of patients in each DSC range was as follows: <0.60 (4), 0.60–0.69 (6), 0.70–0.79 (7), 0.80–0.89 (9), >0.89 (1). Conclusion: The MRI atlas-based segmentation method achieved good results for both the whole prostate and PZ compared to expert-defined VOIs. The technique is fast, fully automatic, and has the potential
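The majority-vote step used above to combine the three best-matching atlases can be sketched for binary masks as follows (an illustrative toy, not the MIM Software implementation):

```python
import numpy as np

def majority_vote(label_masks):
    """Fuse binary label masks from several registered atlases: a voxel is
    labeled foreground when more than half of the atlases label it so."""
    stack = np.stack([np.asarray(m, dtype=int) for m in label_masks])
    return stack.sum(axis=0) > (len(label_masks) / 2)

# Three toy atlas-propagated contours for the same test subject.
a1 = np.array([[1, 1, 0]])
a2 = np.array([[1, 0, 0]])
a3 = np.array([[1, 1, 1]])
fused = majority_vote([a1, a2, a3])  # votes 3, 2, 1 -> [[True, True, False]]
```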

  16. New experimental results in atlas-based brain morphometry

    Science.gov (United States)

    Gee, James C.; Fabella, Brian A.; Fernandes, Siddharth E.; Turetsky, Bruce I.; Gur, Ruben C.; Gur, Raquel E.

    1999-05-01

    In a previous meeting, we described a computational approach to MRI morphometry, in which a spatial warp mapping a reference or atlas image into anatomic alignment with the subject is first inferred. Shape differences with respect to the atlas are then studied by calculating the pointwise Jacobian determinant for the warp, which provides a measure of the change in differential volume about a point in the reference as it transforms to its corresponding position in the subject. In this paper, the method is used to analyze sex differences in the shape and size of the corpus callosum in an ongoing study of a large population of normal controls. The preliminary results of the current analysis support findings in the literature that have observed the splenium to be larger in females than in males.
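The pointwise Jacobian determinant used above as the shape-change measure can be computed by finite differences on the displacement field of the warp φ(x) = x + u(x). A 2D sketch (the morphometry described is 3D; this toy uses NumPy gradients and our own function name):

```python
import numpy as np

def jacobian_determinant_2d(ux, uy):
    """Pointwise Jacobian determinant of the warp phi(x) = x + u(x), given the
    displacement components on a regular unit grid. Values > 1 indicate local
    expansion of the reference as it maps to the subject; values < 1 shrinkage."""
    dux_dy, dux_dx = np.gradient(ux)  # np.gradient returns derivatives per axis
    duy_dy, duy_dx = np.gradient(uy)  # axis 0 = y, axis 1 = x here
    return (1.0 + dux_dx) * (1.0 + duy_dy) - dux_dy * duy_dx

# A uniform 10% dilation along x only: u_x = 0.1 * x, u_y = 0,
# so the differential volume should grow by exactly 1.1 everywhere.
_, x = np.mgrid[0:8, 0:8].astype(float)
jac = jacobian_determinant_2d(0.1 * x, np.zeros_like(x))
```

Integrating the determinant over a labeled region of the reference (e.g. the corpus callosum) then yields that structure's volume in the subject.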

  17. The Detector Safety System of the ATLAS experiment

    International Nuclear Information System (INIS)

    Beltramello, O; Burckhart, H J; Franz, S; Jaekel, M; Jeckel, M; Lueders, S; Morpurgo, G; Santos Pedrosa, F dos; Pommes, K; Sandaker, H

    2009-01-01

    The ATLAS detector at the Large Hadron Collider at CERN is one of the most advanced detectors for High Energy Physics experiments ever built. It consists of on the order of ten functionally independent sub-detectors, which all have dedicated services such as power, cooling, and gas supply. A Detector Safety System has been built to detect possible operational problems and abnormal and potentially dangerous situations at an early stage and, if needed, to bring the relevant part of ATLAS automatically into a safe state. The procedures and the configuration specific to ATLAS are described in detail and first operational experience is given.

  18. Volume 1. Probabilistic analysis of HTGR application studies. Technical discussion

    International Nuclear Information System (INIS)

    May, J.; Perry, L.

    1980-01-01

    The HTGR Program encompasses a number of decisions facing both industry and government which are being evaluated under the HTGR application studies being conducted by the GCRA. This report is in support of these application studies, specifically by developing comparative probabilistic energy costs of the alternative HTGR plant types under study at this time and of competitive PWR and coal-fired plants. Management decision analytic methodology was used as the basis for the development of the comparative probabilistic data. This study covers the probabilistic comparison of various HTGR plant types at a commercial development stage with comparative PWR and coal-fired plants. Subsequent studies are needed to address the sequencing of HTGR plants from the lead plant to the commercial plants and to integrate the R and D program into the plant construction sequence. The probabilistic results cover the comparison of the 15-year levelized energy costs for commercial plants, all with 1995 startup dates. For comparison with the HTGR plants, PWR and fossil-fired plants have been included in the probabilistic analysis, both as steam electric plants and as combined steam electric and process heat plants

  19. Fully probabilistic design of hierarchical Bayesian models

    Czech Academy of Sciences Publication Activity Database

    Quinn, A.; Kárný, Miroslav; Guy, Tatiana Valentine

    2016-01-01

    Roč. 369, č. 1 (2016), s. 532-547 ISSN 0020-0255 R&D Projects: GA ČR GA13-13502S Institutional support: RVO:67985556 Keywords : Fully probabilistic design * Ideal distribution * Minimum cross-entropy principle * Bayesian conditioning * Kullback-Leibler divergence * Bayesian nonparametric modelling Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 4.832, year: 2016 http://library.utia.cas.cz/separaty/2016/AS/karny-0463052.pdf

  20. Probabilistic Characterization of Adversary Behavior in Cyber Security

    Energy Technology Data Exchange (ETDEWEB)

    Meyers, C A; Powers, S S; Faissol, D M

    2009-10-08

    The objective of this SMS effort is to provide a probabilistic characterization of adversary behavior in cyber security. This includes both quantitative (data analysis) and qualitative (literature review) components. A set of real LLNL email data was obtained for this study, consisting of several years' worth of unfiltered traffic sent to a selection of addresses at ciac.org. The email data was subjected to three interrelated analyses: a textual study of the header data and subject matter, an examination of threats present in message attachments, and a characterization of the maliciousness of embedded URLs.

  1. Probabilistic systems coalgebraically: A survey

    Science.gov (United States)

    Sokolova, Ana

    2011-01-01

    We survey the work on both discrete and continuous-space probabilistic systems as coalgebras, starting with how probabilistic systems are modeled as coalgebras and followed by a discussion of their bisimilarity and behavioral equivalence, mentioning results that follow from the coalgebraic treatment of probabilistic systems. It is interesting to note that, for different reasons, for both discrete and continuous probabilistic systems it may be more convenient to work with behavioral equivalence than with bisimilarity. PMID:21998490

  2. The "Digital Turn" of the European Historic Towns Atlas: Comparing Solutions for Digital Atlas Production and online Presentation

    Czech Academy of Sciences Publication Activity Database

    Chodějovská, Eva; Gearty, S.; Stracke, D.

    2015-01-01

    Roč. 10, č. 1 (2015), s. 89-121 ISSN 1828-6364 R&D Projects: GA ČR GA13-11425S Institutional support: RVO:67985963 Keywords : Historic Towns Atlas * International Commission for the History of Towns * urban history * cartography Subject RIV: AB - History

  3. ATLAS DDM integration in ARC

    International Nuclear Information System (INIS)

    Behrmann, G; Cameron, D; Ellert, M; Kleist, J; Taga, A

    2008-01-01

    The Nordic Data Grid Facility (NDGF) consists of Grid resources running ARC middleware in Denmark, Finland, Norway and Sweden. These resources serve many virtual organisations and contribute a large fraction of total worldwide resources for the ATLAS experiment, whose data is distributed and managed by the DQ2 software. Managing ATLAS data within NDGF and between NDGF and other Grids used by ATLAS (the Enabling Grids for E-sciencE Grid and the Open Science Grid) presents a unique challenge for several reasons. Firstly, the entry point for data, the Tier 1 centre, is physically distributed among heterogeneous resources in several countries and yet must present a single access point for all data stored within the centre. The middleware framework used in NDGF differs significantly from other Grids, specifically in the way that all data movement and registration is performed by services outside the worker node environment. Also, the service used for cataloging the location of data files is different from other Grids but must still be usable by DQ2 and ATLAS users to locate data within NDGF. This paper presents in detail how we solve these issues to allow seamless access worldwide to data within NDGF

  4. On the Probabilistic Characterization of Robustness and Resilience

    DEFF Research Database (Denmark)

    Faber, Michael Havbro; Qin, J.; Miraglia, Simona

    2017-01-01

    Over the last decade significant research efforts have been devoted to the probabilistic modeling and analysis of system characteristics. Especially performance characteristics of systems subjected to random disturbances, such as robustness and resilience, have been in the focus of these efforts … in the modeling of robustness and resilience in the research areas of natural disaster risk management, socio-ecological systems and social systems, and we propose a generic decision analysis framework for the modeling and analysis of systems across application areas. The proposed framework extends the concept … of direct and indirect consequences and associated risks in probabilistic systems modeling formulated by the Joint Committee on Structural Safety (JCSS) to facilitate the modeling and analysis of resilience in addition to robustness and vulnerability. Moreover, based on recent insights in the modeling...

  5. Construction of a consistent high-definition spatio-temporal atlas of the developing brain using adaptive kernel regression.

    Science.gov (United States)

    Serag, Ahmed; Aljabar, Paul; Ball, Gareth; Counsell, Serena J; Boardman, James P; Rutherford, Mary A; Edwards, A David; Hajnal, Joseph V; Rueckert, Daniel

    2012-02-01

    Medical imaging has shown that, during early development, the brain undergoes more changes in size, shape and appearance than at any other time in life. A better understanding of brain development requires a spatio-temporal atlas that characterizes the dynamic changes during this period. In this paper we present an approach for constructing a 4D atlas of the developing brain, between 28 and 44 weeks post-menstrual age at time of scan, using T1 and T2 weighted MR images from 204 premature neonates. The method used for the creation of the average 4D atlas utilizes non-rigid registration between all pairs of images to eliminate bias in the atlas toward any of the original images. In addition, kernel regression is used to produce age-dependent anatomical templates. A novelty in our approach is the use of a time-varying kernel width, to overcome the variations in the distribution of subjects at different ages. This leads to an atlas that retains a consistent level of detail at every time-point. Comparisons between the resulting atlas and atlases constructed using affine and non-rigid registration are presented. The resulting 4D atlas has greater anatomic definition than currently available 4D atlases created using various affine and non-rigid registration approaches, an important factor in improving registrations between the atlas and individual subjects. Also, the resulting 4D atlas can serve as a good representative of the population of interest as it reflects both global and local changes. The atlas is publicly available at www.brain-development.org. Copyright © 2011 Elsevier Inc. All rights reserved.
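The adaptive kernel regression idea above, a kernel width that widens where subjects are sparse so every time-point averages over a comparable number of scans, can be sketched in one dimension (toy data and function names are ours; the published atlas regresses registered MR volumes, not scalars):

```python
import numpy as np

def adaptive_template(images, ages, t, k=3):
    """Age-dependent template by Gaussian kernel regression over subject age.
    The kernel width adapts to the local sampling density: sigma(t) is the
    distance from t to its k-th nearest subject age, so sparsely sampled ages
    borrow from a wider window while densely sampled ages stay sharp."""
    ages = np.asarray(ages, dtype=float)
    dists = np.abs(ages - t)
    sigma = max(np.sort(dists)[min(k, len(ages) - 1)], 1e-6)
    w = np.exp(-dists ** 2 / (2.0 * sigma ** 2))
    w /= w.sum()  # normalized kernel weights
    return np.tensordot(w, np.stack(images), axes=1)  # weighted mean image

# Toy 1-voxel "images" whose intensity grows linearly with age (weeks).
ages = [28.0, 30.0, 34.0, 40.0, 44.0]
images = [np.array([a]) for a in ages]
template = adaptive_template(images, ages, t=34.0)  # intensity near 34
```

With a fixed sigma, the template at 44 weeks would be built from very few scans and become noisy; the adaptive width trades a little temporal sharpness there for a consistent level of detail across the age range.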

  6. Uncertainty and sensitivity analysis using probabilistic system assessment code. 1

    International Nuclear Information System (INIS)

    Honma, Toshimitsu; Sasahara, Takashi.

    1993-10-01

    This report presents the results obtained when applying the probabilistic system assessment code under development to the PSACOIN Level 0 intercomparison exercise organized by the Probabilistic System Assessment Code User Group in the Nuclear Energy Agency (NEA) of OECD. This exercise is one of a series designed to compare and verify probabilistic codes in the performance assessment of geological radioactive waste disposal facilities. The computations were performed using the Monte Carlo sampling code PREP and post-processor code USAMO. The submodels in the waste disposal system were described and coded with the specification of the exercise. Besides the results required for the exercise, further additional uncertainty and sensitivity analyses were performed and the details of these are also included. (author)

  7. Probabilistic approaches to recommendations

    CERN Document Server

    Barbieri, Nicola; Ritacco, Ettore

    2014-01-01

    The importance of accurate recommender systems has been widely recognized by academia and industry, and recommendation is rapidly becoming one of the most successful applications of data mining and machine learning. Understanding and predicting the choices and preferences of users is a challenging task: real-world scenarios involve users behaving in complex situations, where prior beliefs, specific tendencies, and reciprocal influences jointly contribute to determining the preferences of users toward huge amounts of information, services, and products. Probabilistic modeling represents a robus...

  8. Next generation PanDA pilot for ATLAS and other experiments

    International Nuclear Information System (INIS)

    Nilsson, P; De, K; Megino, F Barreiro; Llamas, R Medrano; Bejar, J Caballero; Hover, J; Maeno, T; Wenaus, T; Love, P; Walker, R

    2014-01-01

    The Production and Distributed Analysis system (PanDA) has been in use in the ATLAS Experiment since 2005. It uses a sophisticated pilot system to execute submitted jobs on the worker nodes. While originally designed for ATLAS, the PanDA Pilot has recently been refactored to facilitate use outside of ATLAS. Experiments are now handled as plug-ins such that a new PanDA Pilot user only has to implement a set of prototyped methods in the plug-in classes, and provide a script that configures and runs the experiment-specific payload. We will give an overview of the Next Generation PanDA Pilot system and will present major features and recent improvements including live user payload debugging, data access via the Federated XRootD system, stage-out to alternative storage elements, support for the new ATLAS DDM system (Rucio), and an improved integration with glExec, as well as a description of the experiment-specific plug-in classes. The performance of the pilot system in processing LHC data on the OSG, LCG and Nordugrid infrastructures used by ATLAS will also be presented. We will describe plans for future development on the time scale of the next few years.

  9. Conditional Probabilistic Population Forecasting

    OpenAIRE

    Sanderson, Warren C.; Scherbov, Sergei; O'Neill, Brian C.; Lutz, Wolfgang

    2004-01-01

    Since policy-makers often prefer to think in terms of alternative scenarios, the question has arisen as to whether it is possible to make conditional population forecasts in a probabilistic context. This paper shows that it is both possible and useful to make these forecasts. We do this with two different kinds of examples. The first is the probabilistic analog of deterministic scenario analysis. Conditional probabilistic scenario analysis is essential for policy-makers because...

  10. Morphometric Atlas Selection for Automatic Brachial Plexus Segmentation

    International Nuclear Information System (INIS)

    Van de Velde, Joris; Wouters, Johan; Vercauteren, Tom; De Gersem, Werner; Duprez, Fréderic; De Neve, Wilfried; Van Hoof, Tom

    2015-01-01

    Purpose: The purpose of this study was to determine the effects of atlas selection based on different morphometric parameters, on the accuracy of automatic brachial plexus (BP) segmentation for radiation therapy planning. The segmentation accuracy was measured by comparing all of the generated automatic segmentations with anatomically validated gold standard atlases developed using cadavers. Methods and Materials: Twelve cadaver computed tomography (CT) atlases (3 males, 9 females; mean age: 73 years) were included in the study. One atlas was selected to serve as a patient, and the other 11 atlases were registered separately onto this “patient” using deformable image registration. This procedure was repeated for every atlas as a patient. Next, the Dice and Jaccard similarity indices and inclusion index were calculated for every registered BP with the original gold standard BP. In parallel, differences in several morphometric parameters that may influence the BP segmentation accuracy were measured for the different atlases. Specific brachial plexus-related CT-visible bony points were used to define the morphometric parameters. Subsequently, correlations between the similarity indices and morphometric parameters were calculated. Results: A clear negative correlation between difference in protraction-retraction distance and the similarity indices was observed (mean Pearson correlation coefficient = −0.546). All of the other investigated Pearson correlation coefficients were weak. Conclusions: Differences in the shoulder protraction-retraction position between the atlas and the patient during planning CT influence the BP autosegmentation accuracy. A greater difference in the protraction-retraction distance between the atlas and the patient reduces the accuracy of the BP automatic segmentation result

  11. ATLAS insertable B-layer

    Czech Academy of Sciences Publication Activity Database

    Marčišovský, Michal

    2011-01-01

    Roč. 633, č. 1 (2011), "S224"-"S225" ISSN 0168-9002. [International workshop on radiation imaging detectors /11./. Praha, 26.06.2009-02.07.2009] R&D Projects: GA MŠk LA08015; GA MŠk LA08032 Institutional research plan: CEZ:AV0Z10100502 Keywords : ATLAS * pixel detector * insertable B-layer Subject RIV: BF - Elementary Particles and High Energy Physics Impact factor: 1.207, year: 2011

  12. Enhanced subject-specific resting-state network detection and extraction with fast fMRI.

    Science.gov (United States)

    Akin, Burak; Lee, Hsu-Lei; Hennig, Jürgen; LeVan, Pierre

    2017-02-01

    Resting-state networks have become an important tool for the study of brain function. An ultra-fast imaging technique called Magnetic Resonance Encephalography (MREG) measures brain function at an order of magnitude higher temporal resolution than standard echo-planar imaging (EPI). This new sequence helps to correct physiological artifacts and improves the sensitivity of the fMRI analysis. In this study, EPI is compared with MREG in terms of the capability to extract resting-state networks. Healthy controls underwent two consecutive resting-state scans, one with EPI and the other with MREG. Subject-level independent component analyses (ICA) were performed separately for each of the two datasets. Using Stanford FIND atlas parcels as network templates, the presence of ICA maps corresponding to each network was quantified in each subject. The number of detected individual networks was significantly higher in the MREG data set than for EPI. Moreover, using short time segments of MREG data, such as 50 seconds, one can still detect and track consistent networks. Fast fMRI thus results in an increased capability to extract distinct functional regions at the individual subject level for the same scan times, and also allows the extraction of consistent networks within shorter time intervals than when using EPI, which is notably relevant for the analysis of dynamic functional connectivity fluctuations. Hum Brain Mapp 38:817-830, 2017. © 2016 Wiley Periodicals, Inc.

  13. The ATLAS Fast Tracker

    CERN Document Server

    Volpi, Guido; The ATLAS collaboration

    2015-01-01

    The use of tracking information at the trigger level in the LHC Run II period is crucial for the trigger and data acquisition (TDAQ) system. The tracking precision is in fact important to identify specific decay products of the Higgs boson or new phenomena, as well as to distinguish the contributions coming from the many simultaneous collisions that occur at every bunch crossing. However, track reconstruction is among the most demanding tasks performed by the TDAQ computing farm; full reconstruction at the full Level-1 trigger accept rate (100 kHz) is not possible. In order to overcome this limitation, the ATLAS experiment is planning the installation of a specific processor: the Fast Tracker (FTK), which is aimed at achieving this goal. The FTK is a pipeline of high-performance electronics, based on custom and commercial devices, which is expected to reconstruct, with high resolution, the trajectories of charged tracks with a transverse momentum above 1 GeV, using the ATLAS inner tracker information. Patte...

  14. Advances in ATLAS@Home towards a major ATLAS computing resource

    CERN Document Server

    Cameron, David; The ATLAS collaboration

    2018-01-01

    The volunteer computing project ATLAS@Home has been providing a stable computing resource for the ATLAS experiment since 2013. It has recently undergone some significant developments and as a result has become one of the largest resources contributing to ATLAS computing, by expanding its scope beyond traditional volunteers and into exploitation of idle computing power in ATLAS data centres. Removing the need for virtualization on Linux and instead using container technology has significantly lowered the entry barrier for data centre participation, and in this paper we describe the implementation and results of this change. We also present other recent changes and improvements in the project. In early 2017 the ATLAS@Home project was merged into a combined LHC@Home platform, providing a unified gateway to all CERN-related volunteer computing projects. The ATLAS Event Service shifts data processing from file-level to event-level and we describe how ATLAS@Home was incorporated into this new paradigm. The finishing...

  15. Modelling and subject-specific validation of the heart-arterial tree system.

    Science.gov (United States)

    Guala, Andrea; Camporeale, Carlo; Tosello, Francesco; Canuto, Claudio; Ridolfi, Luca

    2015-01-01

    A modeling approach integrated with a novel subject-specific characterization is proposed here for the assessment of hemodynamic values of the arterial tree. A 1D model is adopted to characterize large-to-medium arteries, while the left ventricle, aortic valve and distal micro-circulation sectors are described by lumped submodels. A new velocity profile and a new formulation of the non-linear viscoelastic constitutive relation suitable for the {Q, A} modeling are also proposed. The model is first verified semi-quantitatively against literature data. A simple but effective procedure for obtaining subject-specific model characterization from non-invasive measurements is then designed. A detailed subject-specific validation against in vivo measurements from a population of six healthy young men is also performed. Several key quantities of heart dynamics (mean ejected flow, ejection fraction, and left-ventricular end-diastolic, end-systolic and stroke volumes) and the pressure waveforms (at the central, radial, brachial, femoral, and posterior tibial sites) are compared with measured data. Mean errors of around 5 and 8%, obtained for the heart and arterial quantities, respectively, testify to the effectiveness of the model and its subject-specific characterization.
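
    The lumped distal sub-models used to terminate such 1D arterial trees are commonly Windkessel-type compartments. As an illustration only (not the authors' formulation), the sketch below integrates a two-element Windkessel with invented, non-subject-specific parameter values.

```python
def windkessel_2e(q_in, R=1.0, C=1.5, p0=80.0, dt=0.001):
    """Explicit-Euler integration of a two-element Windkessel:

        C * dP/dt = Q(t) - P/R

    q_in : inflow samples; R is peripheral resistance and C compliance
    (all parameter values here are illustrative placeholders).
    Returns the pressure trace sampled at the same instants.
    """
    p = p0
    trace = []
    for q in q_in:
        p += dt * (q - p / R) / C   # one Euler step of the ODE
        trace.append(p)
    return trace
```

For a constant inflow the pressure relaxes toward Q·R with time constant R·C, which is the basic behaviour subject-specific fitting procedures exploit.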

  16. Probabilistic safety assessment

    International Nuclear Information System (INIS)

    Hoertner, H.; Schuetz, B.

    1982-09-01

    For the purpose of assessing the applicability and informativeness of risk-analysis methods in licencing procedures under atomic law, the choice of instruments for probabilistic analysis, the problems and experience gained in their application, and the discussion of safety goals with respect to such instruments are of paramount significance. Naturally, such a complex field can only be dealt with step by step, making contributions on specific problems. The report on hand presents the essentials of a 'stocktaking' of systems reliability studies in the licencing procedure under atomic law and of an American report (NUREG-0739) on 'Quantitative Safety Goals'. (orig.) [de

  17. Ethics and access to teaching materials in the medical library: the case of the Pernkopf atlas.

    Science.gov (United States)

    Atlas, M C

    2001-01-01

    Conflicts can occur between the principle of freedom of information treasured by librarians and ethical standards of scientific research involving the propriety of using data derived from immoral or dishonorable experimentation. A prime example of this conflict was brought to the attention of the medical and library communities in 1995, when articles appeared claiming that the subjects of the illustrations in the classic anatomy atlas, Eduard Pernkopf's Topographische Anatomie des Menschen, were victims of the Nazi holocaust. While few have disputed the accuracy or the artistic and educational value of the Pernkopf atlas, some have argued that the use of such subjects violates standards of medical ethics involving inhuman and degrading treatment of subjects or disrespect of a human corpse. Efforts were made to remove the book from medical libraries. In this article, the history of the Pernkopf atlas and the controversy surrounding it are reviewed. The results of a survey of academic medical libraries concerning their treatment of the Pernkopf atlas are reported, and the ethical implications of these issues as they affect the responsibilities of librarians are discussed.

  18. European Wind Atlas and Wind Resource Research in Denmark

    DEFF Research Database (Denmark)

    Mortensen, Niels Gylling

    to estimate the actual wind climate at any specific site and height within this region. The Danish and European Wind Atlases are examples of how the wind atlas methodology can be employed to estimate the wind resource potential for a country or a sub-continent. Recently, the methodology has also been used...... - from wind measurements at prospective sites to wind tunnel simulations and advanced flow modelling. Among these approaches, the wind atlas methodology - developed at Risø National Laboratory over the last 25 years - has gained widespread recognition and is presently considered by many as the industry......-standard tool for wind resource assessment and siting of wind turbines. The PC-implementation of the methodology, the Wind Atlas Analysis and Application Program (WAsP), has been applied in more than 70 countries and territories world-wide. The wind atlas methodology is based on physical descriptions and models...

  19. ATLAS Silicon Microstrip Tracker

    CERN Document Server

    Haefner, Petra; The ATLAS collaboration

    2010-01-01

    The SemiConductor Tracker (SCT), made up from silicon micro-strip detectors, is the key precision tracking device in ATLAS, one of the experiments at the CERN LHC. The completed SCT is in very good shape: 99.3% of the SCT strips are operational, and noise occupancy and hit efficiency exceed the design specifications. In the talk the current status of the SCT will be reviewed. We will report on the operation of the detector and observed problems, with stress on the sensor and electronics performance. TWEPP Summary In December 2009 the ATLAS experiment at the CERN Large Hadron Collider (LHC) recorded the first proton-proton collisions at a centre-of-mass energy of 900 GeV and this was followed by the unprecedented energy of 7 TeV in March 2010. The SemiConductor Tracker (SCT) is the key precision tracking device in ATLAS, made up from silicon micro-strip detectors processed in the planar p-in-n technology. The signal from the strips is processed in the front-end ASICs ABCD3TA, working in the binary readout mode. Data i...

  20. Conditional Probabilistic Population Forecasting

    OpenAIRE

    Sanderson, W.C.; Scherbov, S.; O'Neill, B.C.; Lutz, W.

    2003-01-01

    Since policy makers often prefer to think in terms of scenarios, the question has arisen as to whether it is possible to make conditional population forecasts in a probabilistic context. This paper shows that it is both possible and useful to make these forecasts. We do this with two different kinds of examples. The first is the probabilistic analog of deterministic scenario analysis. Conditional probabilistic scenario analysis is essential for policy makers because it allows them to answer "what if"...

  1. Conditional probabilistic population forecasting

    OpenAIRE

    Sanderson, Warren; Scherbov, Sergei; O'Neill, Brian; Lutz, Wolfgang

    2003-01-01

    Since policy-makers often prefer to think in terms of alternative scenarios, the question has arisen as to whether it is possible to make conditional population forecasts in a probabilistic context. This paper shows that it is both possible and useful to make these forecasts. We do this with two different kinds of examples. The first is the probabilistic analog of deterministic scenario analysis. Conditional probabilistic scenario analysis is essential for policy-makers because it allows them...

  2. The German National Analysis Facility as a tool for ATLAS analyses

    International Nuclear Information System (INIS)

    Ehrenfeld, W; Leffhalm, K; Mehlhase, S

    2011-01-01

    In 2008 the German National Analysis Facility (NAF) at DESY was established. It is attached to and builds on top of the DESY Grid infrastructure. The facility is designed to provide the best possible analysis infrastructure for high energy particle physics of the ATLAS, CMS, LHCb and ILC experiments. The Grid and local infrastructure of the NAF is reviewed with a focus on the ATLAS part. Both parts include large scale storage and a batch system. Emphasis is put on ATLAS specific customisation and utilisation of the NAF. This refers not only to the NAF components but also to the different components of the ATLAS analysis framework. Experience from operating and supporting ATLAS users on the NAF is presented in this paper. The ATLAS usage of the different components is shown, including some typical use cases of user analysis. Finally, the question is addressed whether the design of the NAF meets the ATLAS expectations for efficient data analysis in the era of LHC data taking.

  3. The effect of morphometric atlas selection on multi-atlas-based automatic brachial plexus segmentation

    International Nuclear Information System (INIS)

    Van de Velde, Joris; Wouters, Johan; Vercauteren, Tom; De Gersem, Werner; Achten, Eric; De Neve, Wilfried; Van Hoof, Tom

    2015-01-01

    The present study aimed to measure the effect of a morphometric atlas selection strategy on the accuracy of multi-atlas-based BP autosegmentation using the commercially available software package ADMIRE® and to determine the optimal number of selected atlases to use. Autosegmentation accuracy was measured by comparing all generated automatic BP segmentations with anatomically validated gold standard segmentations that were developed using cadavers. Twelve cadaver computed tomography (CT) atlases were included in the study. One atlas was selected as a patient in ADMIRE®, and multi-atlas-based BP autosegmentation was first performed with a group of morphometrically preselected atlases. In this group, the atlases were selected on the basis of similarity in the shoulder protraction position with the patient. The number of selected atlases used started at two and increased up to eight. Subsequently, a group of randomly chosen, non-selected atlases was taken. In this second group, every possible combination of 2 to 8 random atlases was used for multi-atlas-based BP autosegmentation. For both groups, the average Dice similarity coefficient (DSC), Jaccard index (JI) and inclusion index (INI) were calculated, measuring the similarity of the generated automatic BP segmentations and the gold standard segmentation. Similarity indices of both groups were compared using an independent sample t-test, and the optimal number of selected atlases was investigated using an equivalence trial. For each number of atlases, the average similarity indices of the morphometrically selected atlas group were significantly higher than those of the random group (p < 0.05). In this study, the highest similarity indices were achieved using multi-atlas autosegmentation with 6 selected atlases (average DSC = 0.598; average JI = 0.434; average INI = 0.733). Morphometric atlas selection on the basis of the protraction position of the patient significantly improves multi-atlas-based BP autosegmentation accuracy.
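
    The morphometric preselection step described above amounts to ranking candidate atlases by how closely their protraction-retraction measurement matches the patient's and keeping the k most similar. The function and data below are a hypothetical sketch, not the ADMIRE® implementation.

```python
def select_atlases(patient_prd, atlas_prds, k=6):
    """Keep the k atlases closest to the patient in protraction-retraction
    distance (PRD). k = 6 gave the best indices in the study cited above.

    atlas_prds : dict mapping atlas id -> PRD measurement; the ids and
    values used in the test are invented for illustration.
    """
    ranked = sorted(atlas_prds, key=lambda a: abs(atlas_prds[a] - patient_prd))
    return ranked[:k]
```

The selected subset would then be deformably registered to the patient and their label maps fused, as in any multi-atlas segmentation pipeline.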

  4. Probabilistic Analysis of Failures Mechanisms of Large Dams

    NARCIS (Netherlands)

    Shams Ghahfarokhi, G.

    2014-01-01

    Risk and reliability analysis is presently performed in almost all fields of engineering, with methods adapted to each specific field and its particular problem areas. Probabilistic risk analysis (PRA), also called quantitative risk analysis (QRA), is a central feature of hydraulic engineering structural design.

  5. The ATLAS online High Level Trigger framework: Experience reusing offline software components in the ATLAS trigger

    International Nuclear Information System (INIS)

    Wiedenmann, Werner

    2010-01-01

    Event selection in the ATLAS High Level Trigger is accomplished to a large extent by reusing software components and event selection algorithms developed and tested in an offline environment. Many of these offline software modules are not specifically designed to run in a heavily multi-threaded online data flow environment. The ATLAS High Level Trigger (HLT) framework based on the GAUDI and ATLAS ATHENA frameworks, forms the interface layer, which allows the execution of the HLT selection and monitoring code within the online run control and data flow software. While such an approach provides a unified environment for trigger event selection across all of ATLAS, it also poses strict requirements on the reused software components in terms of performance, memory usage and stability. Experience of running the HLT selection software in the different environments and especially on large multi-node trigger farms has been gained in several commissioning periods using preloaded Monte Carlo events, in data taking periods with cosmic events and in a short period with proton beams from LHC. The contribution discusses the architectural aspects of the HLT framework, its performance and its software environment within the ATLAS computing, trigger and data flow projects. Emphasis is also put on the architectural implications for the software by the use of multi-core processors in the computing farms and the experiences gained with multi-threading and multi-process technologies.

  6. From Sensory Signals to Modality-Independent Conceptual Representations: A Probabilistic Language of Thought Approach.

    Directory of Open Access Journals (Sweden)

    Goker Erdogan

    2015-11-01

    People learn modality-independent, conceptual representations from modality-specific sensory signals. Here, we hypothesize that any system that accomplishes this feat will include three components: a representational language for characterizing modality-independent representations, a set of sensory-specific forward models for mapping from modality-independent representations to sensory signals, and an inference algorithm for inverting forward models; that is, an algorithm for using sensory signals to infer modality-independent representations. To evaluate this hypothesis, we instantiate it in the form of a computational model that learns object shape representations from visual and/or haptic signals. The model uses a probabilistic grammar to characterize modality-independent representations of object shape, uses a computer graphics toolkit and a human hand simulator to map from object representations to visual and haptic features, respectively, and uses a Bayesian inference algorithm to infer modality-independent object representations from visual and/or haptic signals. Simulation results show that the model infers identical object representations when an object is viewed, grasped, or both. That is, the model's percepts are modality invariant. We also report the results of an experiment in which different subjects rated the similarity of pairs of objects in different sensory conditions, and show that the model provides a very accurate account of subjects' ratings. Conceptually, this research significantly contributes to our understanding of modality invariance, an important type of perceptual constancy, by demonstrating how modality-independent representations can be acquired and used.
Methodologically, it provides an important contribution to cognitive modeling, particularly an emerging probabilistic language-of-thought approach, by showing how symbolic and statistical approaches can be combined in order to understand aspects of human perception.

  7. A General Framework for Probabilistic Characterizing Formulae

    DEFF Research Database (Denmark)

    Sack, Joshua; Zhang, Lijun

    2012-01-01

    Recently, a general framework on characteristic formulae was proposed by Aceto et al. It offers a simple theory that allows one to easily obtain characteristic formulae of many non-probabilistic behavioral relations. Our paper studies their techniques in a probabilistic setting. We provide...... a general method for determining characteristic formulae of behavioral relations for probabilistic automata using fixed-point probability logics. We consider such behavioral relations as simulations and bisimulations, probabilistic bisimulations, probabilistic weak simulations, and probabilistic forward...

  8. A probabilistic Hu-Washizu variational principle

    Science.gov (United States)

    Liu, W. K.; Belytschko, T.; Besterfield, G. H.

    1987-01-01

    A Probabilistic Hu-Washizu Variational Principle (PHWVP) for the Probabilistic Finite Element Method (PFEM) is presented. This formulation is developed for both linear and nonlinear elasticity. The PHWVP allows incorporation of the probabilistic distributions for the constitutive law, compatibility condition, equilibrium, domain and boundary conditions into the PFEM. Thus, a complete probabilistic analysis can be performed where all aspects of the problem are treated as random variables and/or fields. The Hu-Washizu variational formulation is available in many conventional finite element codes thereby enabling the straightforward inclusion of the probabilistic features into present codes.

  9. Prodeto, a computer code for probabilistic fatigue design

    Energy Technology Data Exchange (ETDEWEB)

    Braam, H [ECN-Solar and Wind Energy, Petten (Netherlands); Christensen, C J; Thoegersen, M L [Risoe National Lab., Roskilde (Denmark); Ronold, K O [Det Norske Veritas, Hoevik (Norway)

    1999-03-01

    A computer code for structural reliability analyses of wind turbine rotor blades subjected to fatigue loading is presented. With pre-processors that can transform measured and theoretically predicted load series to load range distributions by rain-flow counting, and with a family of generic distribution models for parametric representation of these distributions, this computer program can carry out probabilistic fatigue analyses of rotor blades. (au)
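
    The rain-flow counting pre-processing step mentioned above can be sketched with the classic three-point rule. This is a simplified illustration, not the Prodeto code: it counts every closed range as a full cycle, which departs slightly from the ASTM E1049 treatment of cycles touching the start of the history.

```python
def turning_points(series):
    """Reduce a load series to its local extrema (turning points)."""
    tp = []
    for x in series:
        if len(tp) >= 2 and (tp[-1] - tp[-2]) * (x - tp[-1]) > 0:
            tp[-1] = x                # still moving in the same direction
        elif not tp or x != tp[-1]:   # skip repeated values
            tp.append(x)
    return tp

def rainflow(series):
    """Simplified three-point rain-flow count.

    Returns (closed_ranges, residual): closed_ranges lists the ranges of
    the closed cycles, residual the turning points left on the stack.
    """
    stack, closed = [], []
    for p in turning_points(series):
        stack.append(p)
        while len(stack) >= 3:
            x = abs(stack[-1] - stack[-2])
            y = abs(stack[-2] - stack[-3])
            if x < y:
                break
            closed.append(y)          # cycle of range y is closed
            del stack[-3:-1]          # remove the two points forming it
    return closed, stack
```

The closed ranges (plus half-cycles from the residual) are what a pre-processor like the one described would feed into a parametric load-range distribution model.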

  10. Comparison of plant-specific probabilistic safety assessments and lessons learned

    Energy Technology Data Exchange (ETDEWEB)

    Balfanz, H.P. [TUeV Nord, Hamburg (Germany); Berg, H.P. [Bundesamt fuer Strahlenschutz, Salzgitter (Germany); Steininger, U. [TUeV Energie- und Systemtechnik GmbH, Unternehmensgruppe TUeV Sueddeutschland, Muenchen (Germany)

    2001-11-01

    Probabilistic safety assessments (PSA) have been performed for all German nuclear power plants in operation. These assessments are mainly based on the recent German PSA guide and an earlier draft, respectively. However, comparison of these PSA shows differences in the results which are discussed in this paper. Lessons learned from this comparison and further development of the PSA methodology are described. (orig.)

  11. Memristive Probabilistic Computing

    KAUST Repository

    Alahmadi, Hamzah

    2017-10-01

    In the era of Internet of Things and Big Data, unconventional techniques are rising to accommodate the large size of data and the resource constraints. New computing structures are advancing based on non-volatile memory technologies and different processing paradigms. Additionally, the intrinsic resiliency of current applications leads to the development of creative techniques in computations. In those applications, approximate computing provides a perfect fit to optimize energy efficiency while compromising on accuracy. In this work, we build probabilistic adders based on stochastic memristors. The probabilistic adders are analyzed with respect to the stochastic behavior of the underlying memristors. Multiple adder implementations are investigated and compared. The memristive probabilistic adder provides a different approach from the typical approximate CMOS adders. Furthermore, it allows for high area savings and design flexibility in trading performance against power saving. To reach a similar performance level as approximate CMOS adders, the memristive adder achieves 60% power saving. An image-compression application is investigated using the memristive probabilistic adders, illustrating the performance and energy trade-off.
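
    The behaviour of a probabilistic adder can be illustrated with a ripple-carry adder whose per-bit outputs are each correct only with some probability. This is a behavioural abstraction standing in for the stochastic memristor switching, not the paper's device model; with p_correct = 1.0 it reduces to an exact 8-bit adder.

```python
import random

def probabilistic_add(a, b, n_bits=8, p_correct=1.0, rng=None):
    """Ripple-carry adder in which each computed sum and carry bit is
    correct with probability p_correct and flipped otherwise.
    The carry out of the top bit is dropped (result is mod 2**n_bits)."""
    rng = rng or random.Random()
    flip = lambda bit: bit if rng.random() < p_correct else 1 - bit
    carry, result = 0, 0
    for i in range(n_bits):
        x = (a >> i) & 1
        y = (b >> i) & 1
        s = flip(x ^ y ^ carry)                    # noisy sum bit
        carry = flip((x & y) | (carry & (x ^ y)))  # noisy carry bit
        result |= s << i
    return result
```

Sweeping p_correct below 1.0 and averaging |result − (a+b)| over many trials gives the kind of error-versus-reliability curve used to evaluate approximate adders.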

  12. Information fusion in signal and image processing major probabilistic and non-probabilistic numerical approaches

    CERN Document Server

    Bloch, Isabelle

    2010-01-01

    The area of information fusion has grown considerably during the last few years, leading to a rapid and impressive evolution. In such fast-moving times, it is important to take stock of the changes that have occurred. As such, this book offers an overview of the general principles and specificities of information fusion in signal and image processing, as well as covering the main numerical methods (probabilistic approaches, fuzzy sets and possibility theory, and belief functions).

  13. ATLAS Pixel Detector Operational Experience

    CERN Document Server

    Di Girolamo, B; The ATLAS collaboration

    2011-01-01

    The ATLAS Pixel Detector is the innermost detector of the ATLAS experiment at the Large Hadron Collider at CERN, providing high-resolution measurements of charged particle tracks in the high radiation environment close to the collision region. This capability is vital for the identification and measurement of proper decay times of long-lived particles such as b-hadrons, and thus vital for the ATLAS physics program. The detector provides hermetic coverage with three cylindrical layers and three layers of forward and backward pixel detectors. It consists of approximately 80 million pixels that are individually read out via chips bump-bonded to 1744 n-in-n silicon substrates. In this talk, results from the successful operation of the Pixel Detector at the LHC will be presented, including monitoring, calibration procedures, timing optimization and detector performance. The detector performance is excellent: 96.9% of the pixels are operational, noise occupancy and hit efficiency exceed the design specification, an...

  14. Bayesian probabilistic network approach for managing earthquake risks of cities

    DEFF Research Database (Denmark)

    Bayraktarli, Yahya; Faber, Michael

    2011-01-01

    This paper considers the application of Bayesian probabilistic networks (BPNs) to large-scale risk based decision making in regard to earthquake risks. A recently developed risk management framework is outlined which utilises Bayesian probabilistic modelling, generic indicator based risk models...... and a fourth module on the consequences of an earthquake. Each of these modules is integrated into a BPN. Special attention is given to aggregated risk, i.e. the risk contribution from assets at multiple locations in a city subjected to the same earthquake. The application of the methodology is illustrated...... on an example considering a portfolio of reinforced concrete structures in a city located close to the western part of the North Anatolian Fault in Turkey....
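
    The chain-structured reasoning in such a Bayesian probabilistic network can be illustrated by exact enumeration over a single magnitude-to-damage link. Every probability below is invented for illustration; a real seismic BPN would chain many more nodes (exposure, soil, vulnerability, consequences).

```python
# Hypothetical two-node BPN: magnitude -> damage, solved by enumeration.
P_mag = {"low": 0.70, "moderate": 0.25, "high": 0.05}          # prior
P_damage_given_mag = {"low": 0.01, "moderate": 0.15, "high": 0.60}  # CPT

def p_damage():
    """Marginal P(damage) = sum_m P(damage | m) * P(m)."""
    return sum(P_mag[m] * P_damage_given_mag[m] for m in P_mag)

def p_mag_given_damage(m):
    """Diagnostic inversion via Bayes' rule: P(m | damage)."""
    return P_mag[m] * P_damage_given_mag[m] / p_damage()
```

The same enumeration generalizes to aggregated risk by summing the damage node over a portfolio of assets exposed to one shared ground-motion node.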

  15. Neuroinformatics of the Allen Mouse Brain Connectivity Atlas.

    Science.gov (United States)

    Kuan, Leonard; Li, Yang; Lau, Chris; Feng, David; Bernard, Amy; Sunkin, Susan M; Zeng, Hongkui; Dang, Chinh; Hawrylycz, Michael; Ng, Lydia

    2015-02-01

    The Allen Mouse Brain Connectivity Atlas is a mesoscale whole brain axonal projection atlas of the C57Bl/6J mouse brain. Anatomical trajectories throughout the brain were mapped into a common 3D space using a standardized platform to generate a comprehensive and quantitative database of inter-areal and cell-type-specific projections. This connectivity atlas has several desirable features, including brain-wide coverage, validated and versatile experimental techniques, a single standardized data format, a quantifiable and integrated neuroinformatics resource, and an open-access public online database (http://connectivity.brain-map.org/). Meaningful informatics data quantification and comparison is key to effective use and interpretation of connectome data. This relies on successful definition of a high fidelity atlas template and framework, mapping precision of raw data sets into the 3D reference framework, accurate signal detection and quantitative connection strength algorithms, and effective presentation in an integrated online application. Here we describe key informatics pipeline steps in the creation of the Allen Mouse Brain Connectivity Atlas and include basic application use cases. Copyright © 2014 Elsevier Inc. All rights reserved.

  16. Probabilistic safety analysis and interpretation thereof

    International Nuclear Information System (INIS)

    Steininger, U.; Sacher, H.

    1999-01-01

    Increasing use of PSA instruments is being made in Germany for quantitative technical safety assessment, for example with regard to incidents which must be reported and the forwarding of information, especially in the case of modification of nuclear plants. The Commission for Nuclear Reactor Safety recommends regular execution of PSA on a ten-year cycle. According to the PSA guidance instructions, probabilistic analyses serve to assess the degree of safety of the entire plant, expressed as the expectation value for the frequency of endangering conditions. The authors describe the method, action sequence and evaluation of probabilistic safety analyses. The limits of probabilistic safety analyses arise in the practical implementation. Normally the guidance instructions for PSA are confined to the safety systems, so that in practice they are at best suitable for operational optimisation only to a limited extent. The present restriction of the analyses has a similar effect on power output operation of the plant. This seriously degrades the utilitarian value of these analyses for the plant operators. In order to further develop PSA as a supervisory and operational optimisation instrument, both authors consider it appropriate to bring together the specific know-how of analysts, manufacturers, plant operators and experts. (orig.) [de

  17. Probabilistic Modelling of Fatigue Life of Composite Laminates Using Bayesian Inference

    DEFF Research Database (Denmark)

    Dimitrov, Nikolay Krasimirov; Kiureghian, Armen Der

    2014-01-01

    A probabilistic model for estimating the fatigue life of laminated composite plates subjected to constant-amplitude or variable-amplitude loading is developed. The model is based on lamina-level input data, making it possible to predict fatigue properties for a wide range of laminate configuratio...

  18. Cerebellar tDCS does not improve performance in probabilistic classification learning

    NARCIS (Netherlands)

    N. Seyed Majidi; M.C. Verhage (Claire); O. Donchin (Opher); P.J. Holland (Peter); M.A. Frens (Maarten); J.N. van der Geest (Jos)

    2016-01-01

    textabstractIn this study, the role of the cerebellum in a cognitive learning task using transcranial direct current stimulation (tDCS) was investigated. Using a weather prediction task, subjects had to learn the probabilistic associations between a stimulus (a combination of cards) and an outcome

  19. ATLAS Outreach Highlights

    CERN Document Server

    Cheatham, Susan; The ATLAS collaboration

    2016-01-01

    The ATLAS outreach team is very active, promoting particle physics to a broad range of audiences including physicists, general public, policy makers, students and teachers, and media. A selection of current outreach activities and new projects will be presented. Recent highlights include the new ATLAS public website and ATLAS Open Data, the very recent public release of 1 fb-1 of ATLAS data.

  20. Planetary Data Systems (PDS) Imaging Node Atlas II

    Science.gov (United States)

    Stanboli, Alice; McAuley, James M.

    2013-01-01

    The Planetary Image Atlas (PIA) is a Rich Internet Application (RIA) that serves planetary imaging data to the science community and the general public. PIA also utilizes the USGS Unified Planetary Coordinate system (UPC) and the on-Mars map server. The Atlas was designed to provide the ability to search and filter through greater than 8 million planetary image files. This software is a three-tier Web application that contains a search engine backend (MySQL, JAVA), a Web service interface (SOAP) between server and client, and a GWT Google Maps API client front end. This application allows for the search, retrieval, and download of planetary images and associated meta-data from the following missions: 2001 Mars Odyssey, Cassini, Galileo, LCROSS, Lunar Reconnaissance Orbiter, Mars Exploration Rover, Mars Express, Magellan, Mars Global Surveyor, Mars Pathfinder, Mars Reconnaissance Orbiter, MESSENGER, Phoenix, Viking Lander, Viking Orbiter, and Voyager. The Atlas utilizes the UPC to translate mission-specific coordinate systems into a unified coordinate system, allowing the end user to query across missions of similar targets. If desired, the end user can also use a mission-specific view of the Atlas. The mission-specific views rely on the same code base. This application is a major improvement over the initial version of the Planetary Image Atlas. It is a multi-mission search engine. This tool includes both basic and advanced search capabilities, providing a product search tool to interrogate the collection of planetary images. This tool lets the end user query information about each image, and ignores the data that the user has no interest in. Users can reduce the number of images to look at by defining an area of interest with latitude and longitude ranges.

  1. Common accounting system for monitoring the ATLAS distributed computing resources

    International Nuclear Information System (INIS)

    Karavakis, E; Andreeva, J; Campana, S; Saiz, P; Gayazov, S; Jezequel, S; Sargsyan, L; Schovancova, J; Ueda, I

    2014-01-01

    This paper covers in detail a variety of accounting tools used to monitor the utilisation of the available computational and storage resources within ATLAS Distributed Computing during the first three years of Large Hadron Collider data taking. The Experiment Dashboard provides a set of common accounting tools that combine monitoring information originating from many different information sources, either generic or ATLAS-specific. This set of tools provides high-quality, scalable solutions that are flexible enough to support the constantly evolving requirements of the ATLAS user community.

  2. The barrel modules of the ATLAS semiconductor tracker

    Czech Academy of Sciences Publication Activity Database

    Abdesselam, A.; Akimoto, T.; Allport, P.; Böhm, Jan; Šťastný, Jan

    2006-01-01

    Roč. 568, - (2006), s. 642-671 ISSN 0168-9002 Institutional research plan: CEZ:AV0Z10100502 Keywords : ATLAS * SCT * silicon * microstrip * module * LHC * barrel Subject RIV: BF - Elementary Particles and High Energy Physics Impact factor: 1.185, year: 2006

  3. Improved detection of congestive heart failure via probabilistic symbolic pattern recognition and heart rate variability metrics.

    Science.gov (United States)

    Mahajan, Ruhi; Viangteeravat, Teeradache; Akbilgic, Oguz

    2017-12-01

    A timely diagnosis of congestive heart failure (CHF) is crucial to avert a life-threatening event. This paper presents a novel probabilistic symbol pattern recognition (PSPR) approach to detect CHF in subjects from their cardiac interbeat (R-R) intervals. PSPR discretizes each continuous R-R interval time series by mapping it onto an eight-symbol alphabet and then models the pattern transition behavior in the symbolic representation of the series. The PSPR-based analysis of the discretized series from 107 subjects (69 normal and 38 CHF subjects) yielded discernible features to distinguish normal subjects from subjects with CHF. In addition to PSPR features, we also extracted features using time-domain heart rate variability measures such as the average and standard deviation of R-R intervals. An ensemble of bagged decision trees was used to classify the two groups, resulting in a five-fold cross-validation accuracy, specificity, and sensitivity of 98.1%, 100%, and 94.7%, respectively. A 20% holdout validation yielded an accuracy, specificity, and sensitivity of 99.5%, 100%, and 98.57%, respectively. Results from this study suggest that features obtained with the combination of PSPR and long-term heart rate variability measures can be used in developing automated CHF diagnosis tools. Copyright © 2017 Elsevier B.V. All rights reserved.
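The discretize-then-model pipeline described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the eight-symbol alphabet is built here from equal-frequency quantile bins (an assumption; the paper's binning scheme may differ), and the pattern-transition behavior is summarized as a row-normalized first-order transition matrix used as a feature vector.

```python
import numpy as np

def symbolize(rr, n_symbols=8):
    """Map each R-R interval to one of n_symbols (equal-frequency quantile bins)."""
    edges = np.quantile(rr, np.linspace(0, 1, n_symbols + 1)[1:-1])
    return np.digitize(rr, edges)  # integer symbols in 0..n_symbols-1

def transition_features(symbols, n_symbols=8):
    """Row-normalized first-order symbol-transition matrix, flattened to a vector."""
    counts = np.zeros((n_symbols, n_symbols))
    for a, b in zip(symbols[:-1], symbols[1:]):
        counts[a, b] += 1
    rows = counts.sum(axis=1, keepdims=True)
    probs = np.divide(counts, rows, out=np.zeros_like(counts), where=rows > 0)
    return probs.ravel()

rng = np.random.default_rng(0)
rr = 0.8 + 0.05 * rng.standard_normal(500)  # synthetic R-R series, in seconds
feats = transition_features(symbolize(rr))
print(feats.shape)  # one feature per ordered symbol pair
```

One such feature vector per subject, concatenated with the time-domain HRV statistics, could then feed a bagged-tree classifier as in the study.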

  4. Probabilistic Logic and Probabilistic Networks

    NARCIS (Netherlands)

    Haenni, R.; Romeijn, J.-W.; Wheeler, G.; Williamson, J.

    2009-01-01

    While in principle probabilistic logics might be applied to solve a range of problems, in practice they are rarely applied at present. This is perhaps because they seem disparate, complicated, and computationally intractable. However, we shall argue in this programmatic paper that several approaches

  5. A tiered approach for probabilistic ecological risk assessment of contaminated sites

    International Nuclear Information System (INIS)

    Zolezzi, M.; Nicolella, C.; Tarazona, J.V.

    2005-01-01

    This paper presents a tiered methodology for probabilistic ecological risk assessment. The proposed approach starts from a deterministic comparison (ratio) of a single exposure concentration and a threshold or safe level calculated from a dose-response relationship, goes through comparison of probabilistic distributions that describe exposure values and toxicological responses of organisms to the chemical of concern, and finally determines the so-called distribution-based quotients (DBQs). In order to illustrate the proposed approach, soil concentrations of 1,2,4-trichlorobenzene (1,2,4-TCB) measured in an industrial contaminated site were used for site-specific probabilistic ecological risk assessment. By using probabilistic distributions, the risk, which exceeds a level of concern for soil organisms under the deterministic approach, is associated with the presence of hot spots reaching concentrations able to acutely affect more than 50% of the soil species, while the large majority of the area presents 1,2,4-TCB concentrations below those reported as toxic.
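The move from a single deterministic quotient to distribution-based quotients can be illustrated with a small Monte Carlo sketch. The lognormal distributions and their parameters below are purely illustrative assumptions, not the site-specific 1,2,4-TCB data.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative lognormal models for exposure concentration and for the
# toxicity threshold across species; parameters are invented for this sketch.
exposure = rng.lognormal(mean=0.0, sigma=1.0, size=100_000)
threshold = rng.lognormal(mean=1.5, sigma=0.8, size=100_000)

dbq = exposure / threshold        # distribution-based quotients
risk = np.mean(dbq > 1.0)         # probability that exposure exceeds a threshold
print(f"P(DBQ > 1) = {risk:.3f}")
```

A deterministic tier compares one number to one threshold; the DBQ tier instead reports the probability mass of the quotient distribution above 1, which separates a few hot spots from a mostly unaffected area.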

  6. CERN Open Days 2013, Point 1 - ATLAS: ATLAS Experiment

    CERN Multimedia

    CERN Photolab

    2013-01-01

    Stand description: The ATLAS Experiment at CERN is one of the largest and most complex scientific endeavours ever assembled. The detector, located at collision point 1 of the LHC, is designed to explore the fundamental components of nature and to study the forces that shape our universe. The past year’s discovery of a Higgs boson is one of the most important scientific achievements of our time, yet this is only one of many key goals of ATLAS. During a brief break in their journey, some of the 3000-member ATLAS collaboration will be taking time to share the excitement of this exploration with you. On surface, no restricted access. The exhibit at Point 1 will give visitors a chance to meet these modern-day explorers and to learn from them how answers to the most fundamental questions of mankind are being sought. Activities will include a visit to the ATLAS detector, located 80m below ground; watching the prize-winning ATLAS movie in the ATLAS cinema; seeing real particle tracks in a cloud chamber and discussi...

  7. Studies into tau reconstruction, missing transverse energy and photon induced processes with the ATLAS detector at the LHC

    International Nuclear Information System (INIS)

    Prabhu, Robindra P.

    2011-09-01

    The ATLAS experiment is currently recording data from proton-proton collisions delivered by CERN's Large Hadron Collider. As more data is amassed, studies of both Standard Model processes and searches for new physics beyond will intensify. This dissertation presents a three-part study providing new methods to help facilitate these efforts. The first part presents a novel τ-reconstruction algorithm for ATLAS inspired by the ideas of particle flow calorimetry. The algorithm is distinguished from traditional τ-reconstruction approaches in ATLAS, insofar that it seeks to recognize decay topologies consistent with a (hadronically) decaying τ-lepton using resolved energy flow objects in the calorimeters. This procedure allows for an early classification of τ-candidates according to their decay mode and the use of decay mode specific discrimination against fakes. A detailed discussion of the algorithm is provided along with early performance results derived from simulated data. The second part presents a Monte Carlo simulation tool which by way of a pseudorapidity-dependent parametrization of the jet energy resolution, provides a probabilistic estimate for the magnitude of instrumental contributions to missing transverse energy arising from jet fluctuations. The principles of the method are outlined and it is shown how the method can be used to populate tails of simulated missing transverse energy distributions suffering from low statistics. The third part explores the prospect of detecting photon-induced leptonic final states in early data. Such processes are distinguished from the more copious hadronic interactions at the LHC by cleaner final states void of hadronic debris, however the soft character of the final state leptons poses challenges to both trigger and offline selections. New trigger items enabling the online selection of such final states are presented, along with a study into the feasibility of detecting the two-photon exchange process pp(γγ →
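The second part's idea, estimating instrumental missing transverse energy by repeatedly smearing jet energies with a pseudorapidity-dependent resolution, can be sketched as follows. The resolution parametrization and jet kinematics here are invented for illustration; they are not the ATLAS parametrization from the dissertation.

```python
import numpy as np

rng = np.random.default_rng(1)

def jet_sigma(pt, eta):
    """Toy eta-dependent relative jet energy resolution (illustrative numbers)."""
    return (0.6 / np.sqrt(pt) + 0.03) * (1.0 + 0.2 * np.abs(eta))

def smeared_met(jets, n_trials=10_000):
    """Sample the instrumental missing-ET magnitude caused by jet fluctuations."""
    pt, eta, phi = (np.array(v) for v in zip(*jets))
    mets = np.empty(n_trials)
    for i in range(n_trials):
        fluct = rng.normal(0.0, jet_sigma(pt, eta) * pt)  # per-jet pT fluctuation
        mex = -(fluct * np.cos(phi)).sum()  # transverse imbalance from fluctuations
        mey = -(fluct * np.sin(phi)).sum()
        mets[i] = np.hypot(mex, mey)
    return mets

jets = [(50.0, 0.5, 0.1), (40.0, 1.8, 2.9), (25.0, 2.5, -1.5)]  # (pT [GeV], eta, phi)
mets = smeared_met(jets)
print(round(float(mets.mean()), 2))
```

Sampling many trials per event is what lets the method populate the high-MET tails that a limited simulated sample would otherwise leave empty.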

  8. Body of evidence: integrating Eduard Pernkopf's Atlas into a librarian-led medical humanities seminar.

    Science.gov (United States)

    Mages, Keith C; Lohr, Linda A

    2017-04-01

    Anatomical subjects depicted in Eduard Pernkopf's richly illustrated Topographische Anatomie des Menschen may be victims of the Nazi regime. Special collections librarians in the history of medicine can use this primary resource to initiate dialogs about ethics with medical students. Reported here is the authors' use of Pernkopf's Atlas in an interactive medical humanities seminar designed for third-year medical students. Topical articles, illustrations, and interviews introduced students to Pernkopf, his Atlas, and the surrounding controversies. We aimed to illustrate how this controversial historical publication can successfully foster student discussion and ethical reflection. Pernkopf's Atlas and our mix of contextual resources facilitated thoughtful discussions about history and ethics amongst the group. Anonymous course evaluations showed student interest in the subject matter, relevance to their studies, and appreciation of our special collection's space and contents.

  9. Quantification of brain images using Korean standard templates and structural and cytoarchitectonic probabilistic maps

    International Nuclear Information System (INIS)

    Lee, Jae Sung; Lee, Dong Soo; Kim, Yu Kyeong

    2004-01-01

    Population based structural and functional maps of the brain provide effective tools for the analysis and interpretation of complex and individually variable brain data. Brain MRI and PET standard templates and statistical probabilistic maps based on image data of Korean normal volunteers have been developed and probabilistic maps based on cytoarchitectonic data have been introduced. A quantification method using these data was developed for the objective assessment of regional intensity in the brain images. Age, gender and ethnic specific anatomical and functional brain templates based on MR and PET images of Korean normal volunteers were developed. Korean structural probabilistic maps for 89 brain regions and cytoarchitectonic probabilistic maps for 13 Brodmann areas were transformed onto the standard templates. Brain FDG PET and SPGR MR images of normal volunteers were spatially normalized onto the template of each modality and gender. Regional uptake of radiotracers in PET and gray matter concentration in MR images were then quantified by averaging (or summing) regional intensities weighted using the probabilistic maps of brain regions. Regionally specific effects of aging on glucose metabolism in cingulate cortex were also examined. The quantification program could generate results for a single spatially normalized image in 20 seconds. Glucose metabolism change in the cingulate gyrus was regionally specific: ratios of glucose metabolism in the rostral anterior cingulate vs. posterior cingulate and the caudal anterior cingulate vs. posterior cingulate decreased significantly as age increased. 'Rostral anterior' / 'posterior' decreased by 3.1% per decade of age (p<10⁻¹¹, r=0.81) and 'caudal anterior' / 'posterior' decreased by 1.7% (p<10⁻⁸, r=0.72).
The ethnic-specific standard templates, probabilistic maps, and quantification program developed in this study will be useful for the analysis of brain images of Korean people since the difference
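The probability-weighted quantification step described above (regional intensities weighted by a probabilistic map) reduces to a weighted average per region. A minimal sketch with synthetic arrays; the shapes and values are illustrative, not the Korean template data:

```python
import numpy as np

def regional_uptake(image, prob_map):
    """Probability-weighted mean intensity of one region.

    Both arrays are assumed spatially normalized to the same template space;
    prob_map[v] is the probability that voxel v belongs to the region."""
    w = prob_map.sum()
    return float((image * prob_map).sum() / w) if w > 0 else 0.0

rng = np.random.default_rng(7)
image = 100.0 * rng.random((4, 4, 4))  # synthetic uptake volume
prob = np.zeros((4, 4, 4))
prob[1:3, 1:3, 1:3] = 0.9              # toy probabilistic map of a single region
print(regional_uptake(image, prob))
```

Repeating this over the 89 regional maps (or 13 Brodmann maps) yields one weighted uptake value per region per subject; ratios between regions, such as rostral anterior vs. posterior cingulate, follow directly.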

  10. Quantification of brain images using Korean standard templates and structural and cytoarchitectonic probabilistic maps

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jae Sung; Lee, Dong Soo; Kim, Yu Kyeong [College of Medicine, Seoul National Univ., Seoul (Korea, Republic of)] [and others]

    2004-06-01

    Population based structural and functional maps of the brain provide effective tools for the analysis and interpretation of complex and individually variable brain data. Brain MRI and PET standard templates and statistical probabilistic maps based on image data of Korean normal volunteers have been developed and probabilistic maps based on cytoarchitectonic data have been introduced. A quantification method using these data was developed for the objective assessment of regional intensity in the brain images. Age, gender and ethnic specific anatomical and functional brain templates based on MR and PET images of Korean normal volunteers were developed. Korean structural probabilistic maps for 89 brain regions and cytoarchitectonic probabilistic maps for 13 Brodmann areas were transformed onto the standard templates. Brain FDG PET and SPGR MR images of normal volunteers were spatially normalized onto the template of each modality and gender. Regional uptake of radiotracers in PET and gray matter concentration in MR images were then quantified by averaging (or summing) regional intensities weighted using the probabilistic maps of brain regions. Regionally specific effects of aging on glucose metabolism in cingulate cortex were also examined. The quantification program could generate results for a single spatially normalized image in 20 seconds. Glucose metabolism change in the cingulate gyrus was regionally specific: ratios of glucose metabolism in the rostral anterior cingulate vs. posterior cingulate and the caudal anterior cingulate vs. posterior cingulate decreased significantly as age increased. 'Rostral anterior' / 'posterior' decreased by 3.1% per decade of age (p<10⁻¹¹, r=0.81) and 'caudal anterior' / 'posterior' decreased by 1.7% (p<10⁻⁸, r=0.72). The ethnic-specific standard templates, probabilistic maps, and quantification program developed in this study will be useful for the analysis

  11. Probabilistic programmable quantum processors

    International Nuclear Information System (INIS)

    Buzek, V.; Ziman, M.; Hillery, M.

    2004-01-01

    We analyze how to improve the performance of probabilistic programmable quantum processors. We show how the probability of success of the probabilistic processor can be enhanced by using the processor in loops. In addition, we show that arbitrary SU(2) transformations of qubits can be encoded in the program state of a universal programmable probabilistic quantum processor. The probability of success of this processor can be enhanced by a systematic correction of errors via conditional loops. Finally, we show that all our results can be generalized to qudits. (Abstract Copyright [2004], Wiley Periodicals, Inc.)
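The loop-based enhancement has a simple form when each failure of the probabilistic processor is heralded, so the program can be conditionally rerun: n iterations of the loop succeed with probability 1 − (1 − p)ⁿ. A tiny numeric check (the single-pass success probability p = 0.25 is only an example value):

```python
# If one pass of the probabilistic processor succeeds with probability p and a
# failure is heralded (so the program can be rerun), n conditional iterations
# of the loop succeed with probability 1 - (1 - p)**n.
def loop_success(p, n):
    return 1.0 - (1.0 - p) ** n

for n in (1, 2, 5, 10):
    print(n, round(loop_success(0.25, n), 4))  # success probability grows with n
```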

  12. Parcellation of the Healthy Neonatal Brain into 107 Regions Using Atlas Propagation through Intermediate Time Points in Childhood.

    Science.gov (United States)

    Blesa, Manuel; Serag, Ahmed; Wilkinson, Alastair G; Anblagan, Devasuda; Telford, Emma J; Pataky, Rozalia; Sparrow, Sarah A; Macnaught, Gillian; Semple, Scott I; Bastin, Mark E; Boardman, James P

    2016-01-01

    Neuroimage analysis pipelines rely on parcellated atlases generated from healthy individuals to provide anatomic context to structural and diffusion MRI data. Atlases constructed using adult data introduce bias into studies of early brain development. We aimed to create a neonatal brain atlas of healthy subjects that can be applied to multi-modal MRI data. Structural and diffusion 3T MRI scans were acquired soon after birth from 33 typically developing neonates born at term (mean postmenstrual age at birth 39(+5) weeks, range 37(+2)-41(+6)). An adult brain atlas (SRI24/TZO) was propagated to the neonatal data using temporal registration via childhood templates with dense temporal samples (NIH Pediatric Database), with the final atlas (Edinburgh Neonatal Atlas, ENA33) constructed using the Symmetric Group Normalization (SyGN) method. After this step, the computed final transformations were applied to T2-weighted data, and fractional anisotropy, mean diffusivity, and tissue segmentations to provide a multi-modal atlas with 107 anatomical regions; a symmetric version was also created to facilitate studies of laterality. Volumes of each region of interest were measured to provide reference data from normal subjects. Because this atlas is generated from step-wise propagation of adult labels through intermediate time points in childhood, it may serve as a useful starting point for modeling brain growth during development.

  13. Parcellation of the healthy neonatal brain into 107 regions using atlas propagation through intermediate time points in childhood

    Directory of Open Access Journals (Sweden)

    Manuel eBlesa Cabez

    2016-05-01

    Neuroimage analysis pipelines rely on parcellated atlases generated from healthy individuals to provide anatomic context to structural and diffusion MRI data. Atlases constructed using adult data introduce bias into studies of early brain development. We aimed to create a neonatal brain atlas of healthy subjects that can be applied to multi-modal MRI data. Structural and diffusion 3T MRI scans were acquired soon after birth from 33 typically developing neonates born at term (mean postmenstrual age at birth 39+5 weeks, range 37+2-41+6). An adult brain atlas (SRI24/TZO) was propagated to the neonatal data using temporal registration via childhood templates with dense temporal samples (NIH Pediatric Database), with the final atlas (Edinburgh Neonatal Atlas, ENA33) constructed using the Symmetric Group Normalization method. After this step, the computed final transformations were applied to T2-weighted data, and fractional anisotropy, mean diffusivity, and tissue segmentations to provide a multi-modal atlas with 107 anatomical regions; a symmetric version was also created to facilitate studies of laterality. Volumes of each region of interest were measured to provide reference data from normal subjects. Because this atlas is generated from step-wise propagation of adult labels through intermediate time points in childhood, it may serve as a useful starting point for modelling brain growth during development.

  14. Prostatome: A combined anatomical and disease based MRI atlas of the prostate

    Energy Technology Data Exchange (ETDEWEB)

    Rusu, Mirabela; Madabhushi, Anant, E-mail: anant.madabhushi@case.edu [Case Western Reserve University, Cleveland, Ohio 44106 (United States); Bloch, B. Nicolas; Jaffe, Carl C. [Boston University School of Medicine, Boston, Massachusetts 02118 (United States); Genega, Elizabeth M. [Beth Israel Deaconess Medical Center, Boston, Massachusetts 02215 (United States); Lenkinski, Robert E.; Rofsky, Neil M. [UT Southwestern Medical Center, Dallas, Texas 75235 (United States); Feleppa, Ernest [Riverside Research Institute, New York, New York 10038 (United States)

    2014-07-15

    Purpose: In this work, the authors introduce a novel framework, the anatomically constrained registration (AnCoR) scheme, and apply it to create a fused anatomic-disease atlas of the prostate which the authors refer to as the prostatome. The prostatome combines an MRI-based anatomic and a histology-based disease atlas. Statistical imaging atlases allow for the integration of information across multiple scales and imaging modalities into a single canonical representation, in turn enabling a fused anatomical-disease representation which may facilitate the characterization of disease appearance relative to anatomic structures. While statistical atlases have been extensively developed and studied for the brain, approaches that have attempted to combine pathology and imaging data for study of prostate pathology are not extant. This work seeks to address this gap. Methods: The AnCoR framework optimizes a scoring function composed of two surface (prostate and central gland) misalignment measures and one intensity-based similarity term. This ensures the correct mapping of anatomic regions into the atlas, even when regional MRI intensities are inconsistent or highly variable between subjects. The framework allows for creation of an anatomic imaging and a disease atlas, while enabling their fusion into the anatomic imaging-disease atlas. The atlas presented here was constructed using 83 subjects with biopsy confirmed cancer who had pre-operative MRI (collected at two institutions) followed by radical prostatectomy. The imaging atlas results from mapping the in vivo MRI into the canonical space, while the anatomic regions serve as domain constraints. Elastic co-registration of MRI and corresponding ex vivo histology provides “ground truth” mapping of cancer extent on in vivo imaging for 23 subjects. Results: AnCoR was evaluated relative to alternative construction strategies that use either MRI intensities or the prostate surface alone for registration. The AnCoR framework

  15. Prostatome: A combined anatomical and disease based MRI atlas of the prostate

    International Nuclear Information System (INIS)

    Rusu, Mirabela; Madabhushi, Anant; Bloch, B. Nicolas; Jaffe, Carl C.; Genega, Elizabeth M.; Lenkinski, Robert E.; Rofsky, Neil M.; Feleppa, Ernest

    2014-01-01

    Purpose: In this work, the authors introduce a novel framework, the anatomically constrained registration (AnCoR) scheme, and apply it to create a fused anatomic-disease atlas of the prostate which the authors refer to as the prostatome. The prostatome combines an MRI-based anatomic and a histology-based disease atlas. Statistical imaging atlases allow for the integration of information across multiple scales and imaging modalities into a single canonical representation, in turn enabling a fused anatomical-disease representation which may facilitate the characterization of disease appearance relative to anatomic structures. While statistical atlases have been extensively developed and studied for the brain, approaches that have attempted to combine pathology and imaging data for study of prostate pathology are not extant. This work seeks to address this gap. Methods: The AnCoR framework optimizes a scoring function composed of two surface (prostate and central gland) misalignment measures and one intensity-based similarity term. This ensures the correct mapping of anatomic regions into the atlas, even when regional MRI intensities are inconsistent or highly variable between subjects. The framework allows for creation of an anatomic imaging and a disease atlas, while enabling their fusion into the anatomic imaging-disease atlas. The atlas presented here was constructed using 83 subjects with biopsy confirmed cancer who had pre-operative MRI (collected at two institutions) followed by radical prostatectomy. The imaging atlas results from mapping the in vivo MRI into the canonical space, while the anatomic regions serve as domain constraints. Elastic co-registration of MRI and corresponding ex vivo histology provides “ground truth” mapping of cancer extent on in vivo imaging for 23 subjects. Results: AnCoR was evaluated relative to alternative construction strategies that use either MRI intensities or the prostate surface alone for registration. The AnCoR framework

  16. Performance of the Electronic Readout of the ATLAS Liquid Argon Calorimeters

    CERN Document Server

    Abreu, H; Aleksa, M; Aperio Bella, L; Archambault, JP; Arfaoui, S; Arnaez, O; Auge, E; Aurousseau, M; Bahinipati, S; Ban, J; Banfi, D; Barajas, A; Barillari, T; Bazan, A; Bellachia, F; Beloborodova, O; Benchekroun, D; Benslama, K; Berger, N; Berghaus, F; Bernat, P; Bernier, R; Besson, N; Binet, S; Blanchard, JB; Blondel, A; Bobrovnikov, V; Bohner, O; Boonekamp, M; Bordoni, S; Bouchel, M; Bourdarios, C; Bozzone, A; Braun, HM; Breton, D; Brettel, H; Brooijmans, G; Caputo, R; Carli, T; Carminati, L; Caughron, S; Cavalleri, P; Cavalli, D; Chareyre, E; Chase, RL; Chekulaev, SV; Chen, H; Cheplakov, A; Chiche, R; Citterio, M; Cojocaru, C; Colas, J; Collard, C; Collot, J; Consonni, M; Cooke, M; Copic, K; Costa, GC; Courneyea, L; Cuisy, D; Cwienk, WD; Damazio, D; Dannheim, D; De Cecco, S; De La Broise, X; De La Taille, C; de Vivie, JB; Debennerot, B; Delagnes, E; Delmastro, M; Derue, F; Dhaliwal, S; Di Ciaccio, L; Doan, O; Dudziak, F; Duflot, L; Dumont-Dayot, N; Dzahini, D; Elles, S; Ertel, E; Escalier, M; Etienvre, AI; Falleau, I; Fanti, M; Farooque, T; Favre, P; Fayard, Louis; Fent, J; Ferencei, J; Fischer, A; Fournier, D; Fournier, L; Fras, M; Froeschl, R; Gadfort, T; Gallin-Martel, ML; Gibson, A; Gillberg, D; Gingrich, DM; Göpfert, T; Goodson, J; Gouighri, M; Goy, C; Grassi, V; Gray, J; Guillemin, T; Guo, B; Habring, J; Handel, C; Heelan, L; Heintz, H; Helary, L; Henrot-Versille, S; Hervas, L; Hobbs, J; Hoffman, J; Hostachy, JY; Hoummada, A; Hrivnac, J; Hrynova, T; Hubaut, F; Huber, J; Iconomidou-Fayard, L; Iengo, P; Imbert, P; Ishmukhametov, R; Jantsch, A; Javadov, N; Jezequel, S; Jimenez Belenguer, M; Ju, XY; Kado, M; Kalinowski, A; Kar, D; Karev, A; Katsanos, I; Kazarinov, M; Kerschen, N; Kierstead, J; Kim, MS; Kiryunin, A; Kladiva, E; Knecht, N; Kobel, M; Koletsou, I; König, S; Krieger, P; Kukhtin, V; Kuna, M; Kurchaninov, L; Labbe, J; Lacour, D; Ladygin, E; Lafaye, R; Laforge, B; Lamarra, D; Lampl, W; Lanni, F; Laplace, S; Laskus, H; Le Coguie, A; Le Dortz, O; 
Le Maner, C; Lechowski, M; Lee, SC; Lefebvre, M; Leonhardt, K; Lethiec, L; Leveque, J; Liang, Z; Liu, C; Liu, T; Liu, Y; Loch, P; Lu, J; Ma, H; Mader, W; Majewski, S; Makovec, N; Makowiecki, D; Mandelli, L; Mangeard, PS; Mansoulie, B; Marchand, JF; Marchiori, G; Martin, D; Martin-Chassard, G; Martin dit Latour, B; Marzin, A; Maslennikov, A; Massol, N; Matricon, P; Maximov, D; Mazzanti, M; McCarthy, T; McPherson, R; Menke, S; Meyer, JP; Ming, Y; Monnier, E; Mooshofer, P; Neganov, A; Niedercorn, F; Nikolic-Audit, I; Nugent, IM; Oakham, G; Oberlack, H; Ocariz, J; Odier, J; Oram, CJ; Orlov, I; Orr, R; Parsons, JA; Peleganchuk, S; Penson, A; Perini, L; Perrodo, P; Perrot, G; Perus, A; Petit, E; Pisarev, I; Plamondon, M; Poffenberger, P; Poggioli, L; Pospelov, G; Pralavorio, P; Prast, J; Prudent, X; Przysiezniak, H; Puzo, P; Quentin, M; Radeka, V; Rajagopalan, S; Rauter, E; Reimann, O; Rescia, S; Resende, B; Richer, JP; Ridel, M; Rios, R; Roos, L; Rosenbaum, G; Rosenzweig, H; Rossetto, O; Roudil, W; Rousseau, D; Ruan, X; Rudert, A; Rusakovich, N; Rusquart, P; Rutherfoord, J; Sauvage, G; Savine, A; Schaarschmidt, J; Schacht, P; Schaffer, A; Schram, M; Schwemling, P; Seguin Moreau, N; Seifert, F; Serin, L; Seuster, R; Shalyugin, A; Shupe, M; Simion, S; Sinervo, P; Sippach, W; Skovpen, K; Sliwa, R; Soukharev, A; Spano, F; Stavina, P; Straessner, A; Strizenec, P; Stroynowski, R; Talyshev, A; Tapprogge, S; Tarrade, F; Tartarelli, GF; Teuscher, R; Tikhonov, Yu; Tocut, V; Tompkins, D; Thompson, P; Tisserant, S; Todorov, T; Tomasz, F; Trincaz-Duvoid, S; Trinh, Thi N; Trochet, S; Trocme, B; Tschann-Grimm, K; Tsionou, D; Ueno, R; Unal, G; Urbaniec, D; Usov, Y; Voss, K; Veillet, JJ; Vincter, M; Vogt, S; Weng, Z; Whalen, K; Wicek, F; Wilkens, H; Wingerter-Seez, I; Wulf, E; Yang, Z; Ye, J; Yuan, L; Yurkewicz, A; Zarzhitsky, P; Zerwas, D; Zhang, H; Zhang, L; Zhou, N; Zimmer, J; Zitoun, R; Zivkovic, L

    2010-01-01

    The ATLAS detector has been designed for operation at the Large Hadron Collider at CERN. ATLAS includes electromagnetic and hadronic liquid argon calorimeters, with almost 200,000 channels of data that must be sampled at the LHC bunch crossing frequency of 40 MHz. The calorimeter electronics calibration and readout are performed by custom electronics developed specifically for these purposes. This paper describes the system performance of the ATLAS liquid argon calibration and readout electronics, including noise, energy and time resolution, and long term stability, with data taken mainly from full-system calibration runs performed after installation of the system in the ATLAS detector hall at CERN.

  17. Next Generation PanDA Pilot for ATLAS and Other Experiments

    CERN Document Server

    Nilsson, P; The ATLAS collaboration; Caballero Bejar, J; De, K; Hover, J; Love, P; Maeno, T; Medrano Llamas, R; Walker, R; Wenaus, T

    2013-01-01

    The Production and Distributed Analysis system (PanDA) has been in use in the ATLAS Experiment since 2005. It uses a sophisticated pilot system to execute submitted jobs on the worker nodes. While originally designed for ATLAS, the PanDA Pilot has recently been refactored to facilitate use outside of ATLAS. Experiments are now handled as plug-ins such that a new PanDA Pilot user only has to implement a set of prototyped methods in the plug-in classes, and provide a script that configures and runs the experiment specific payload. We will give an overview of the Next Generation PanDA Pilot system and will present major features and recent improvements including live user payload debugging, data access via the Federated XRootD system, stage-out to alternative storage elements, support for the new ATLAS DDM system (Rucio), and an improved integration with glExec, as well as a description of the experiment specific plug-in classes. The performance of the pilot system in processing LHC data on the OSG, LCG and Nord...

  18. Next Generation PanDA Pilot for ATLAS and Other Experiments

    CERN Document Server

    Nilsson, P; The ATLAS collaboration; Caballero Bejar, J; De, K; Hover, J; Love, P; Maeno, T; Medrano Llamas, R; Walker, R; Wenaus, T

    2014-01-01

    The Production and Distributed Analysis system (PanDA) has been in use in the ATLAS Experiment since 2005. It uses a sophisticated pilot system to execute submitted jobs on the worker nodes. While originally designed for ATLAS, the PanDA Pilot has recently been refactored to facilitate use outside of ATLAS. Experiments are now handled as plug-ins such that a new PanDA Pilot user only has to implement a set of prototyped methods in the plug-in classes, and provide a script that configures and runs the experiment specific payload. We will give an overview of the Next Generation PanDA Pilot system and will present major features and recent improvements including live user payload debugging, data access via the Federated XRootD system, stage-out to alternative storage elements, support for the new ATLAS DDM system (Rucio), and an improved integration with glExec, as well as a description of the experiment specific plug-in classes. The performance of the pilot system in processing LHC data on the OSG, LCG and Nord...

  19. Iterative principles of recognition in probabilistic neural networks

    Czech Academy of Sciences Publication Activity Database

    Grim, Jiří; Hora, Jan

    2008-01-01

    Roč. 21, č. 6 (2008), s. 838-846 ISSN 0893-6080 R&D Projects: GA MŠk 1M0572; GA ČR GA102/07/1594 Grant - others:GA MŠk(CZ) 2C06019 Institutional research plan: CEZ:AV0Z10750506 Keywords : Probabilistic neural networks * Distribution mixtures * EM algorithm * Recognition of numerals * Recurrent reasoning Subject RIV: IN - Informatics, Computer Science Impact factor: 2.656, year: 2008

  20. PROBABILISTIC RELATIONAL MODELS OF COMPLETE IL-SEMIRINGS

    OpenAIRE

    Tsumagari, Norihiro

    2012-01-01

    This paper studies basic properties of probabilistic multirelations, which generalize the semantic domain of probabilistic systems, and then provides two probabilistic models of complete IL-semirings using probabilistic multirelations. It is also shown that these models need not be models of complete idempotent semirings.

  1. Proceedings (Mathematical Sciences) SUBJECT INDEX

    Indian Academy of Sciences (India)

    R. Narasimhan (Krishtel eMaging) 1461 1996 Oct 15 13:05:22

    SUBJECT INDEX. Abel's summation formula. Analogues of Euler and Poisson summation formulae. 213 ... theorems of Wiener and Lévy on absolutely convergent Fourier series. 179. Brownian motion. Probabilistic representations of solutions to the heat equation. 321. Cesàro matrix. Necessary and sufficient conditions for.

  2. Damage Atlas for Photographic materials

    Directory of Open Access Journals (Sweden)

    Kristel Van Camp

    2010-11-01

    Full Text Available In order to rescue a damaged photographic object, preventive or conservative actions are needed, guided by the object's state of conservation. Knowing the specific characteristics of different types of damage is therefore crucial, and a damage atlas can provide them: with such an atlas the damage can be recognised and appropriate actions taken. The damage atlas presented here offers a first attempt at such a characterisation in the field of photography. It contains images and the necessary information about damage on photographic material, classifying the types of deterioration by their specific characteristics and level of severity, illustrating and describing them with precise terminology, and proposing the most appropriate intervention for each. The atlas, with special annotations on terminology and the grade of the damage, is meant for everybody who works with photographic material, whether in conservation or in the arts, in museums as well as in archives.

  3. Probabilistic risk assessment of HTGRs

    International Nuclear Information System (INIS)

    Fleming, K.N.; Houghton, W.J.; Hannaman, G.W.; Joksimovic, V.

    1980-08-01

    Probabilistic Risk Assessment methods have been applied to gas-cooled reactors for more than a decade, and to HTGRs for more than six years, in the programs sponsored by the US Department of Energy. Significant advancements in the development of PRA methodology in these programs are summarized, as are the specific applications of the methods to HTGRs. Emphasis here is on PRA as a tool for evaluating HTGR design options. Current work and future directions are also discussed.

  4. Fault-tree analysis for probabilistic assessment of radioactive-waste segregation: an application to a plastic clay formation at a specific site

    International Nuclear Information System (INIS)

    D'Alessandro, M.; Bonne, A.

    1982-01-01

    This study concerns a probabilistic safety analysis of a potential nuclear-waste repository which may be mined into a Tertiary clay formation underlying the Nuclear Research Centre at Mol (Belgium). The value of the geological barrier has been analyzed in probabilistic terms through the application of Fault-Tree Analysis (FTA), which can answer two main questions: how can the barrier fail, and what is the failure probability? FTA has previously been applied to conceptual radioactive-waste disposal systems; in this paper the methodology is applied to a specific clay formation, to test the applicability of the procedure to a potential site. With this aim, release probabilities to three different receptors (groundwater, land surface, and atmosphere) were estimated for four different time periods. Because of obvious uncertainties in geological predictive capabilities, a probability band has been obtained. Faulting phenomena are among the main mechanisms with the potential to cause release to groundwater, whereas direct releases to the land surface may be linked to various glacial phenomena; in the short term, different types of human actions may be important. The overall failure probabilities seem to be sufficiently low to offer a good safety margin. (author)
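The fault-tree calculation behind such release probabilities can be sketched in a few lines: with independent basic events, an AND gate multiplies probabilities and an OR gate combines them as 1 - prod(1 - p_i). The event probabilities below are purely illustrative, not values from the study.

```python
# Minimal fault-tree evaluation sketch with independent basic events.

def and_gate(*ps):
    out = 1.0
    for p in ps:
        out *= p
    return out

def or_gate(*ps):
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

# Illustrative per-period probabilities for three release mechanisms (assumed):
p_fault = 1e-4      # faulting reaching the groundwater pathway
p_glacial = 5e-5    # glacial erosion exposing the repository
p_human = 2e-4      # inadvertent human intrusion

p_release = or_gate(p_fault, p_glacial, p_human)
print("%.2e" % p_release)  # → 3.50e-04
```

For small probabilities the OR gate is close to the simple sum, which is why the top-event probability here is just under 3.5e-4.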

  5. Site-specific probabilistic ecological risk assessment of a volatile chlorinated hydrocarbon-contaminated tidal estuary.

    Science.gov (United States)

    Hunt, James; Birch, Gavin; Warne, Michael St J

    2010-05-01

    Groundwater contaminated with volatile chlorinated hydrocarbons (VCHs) was identified as discharging to Penrhyn Estuary, an intertidal embayment of Botany Bay, New South Wales, Australia. A screening-level hazard assessment of surface water in Penrhyn Estuary identified an unacceptable hazard to marine organisms posed by VCHs. Given the limitations of hazard assessments, the present study conducted a higher-tier, quantitative probabilistic risk assessment using the joint probability curve (JPC) method, which accounted for variability in exposure and toxicity profiles to quantify risk (delta). Risk was assessed for 24 scenarios, covering four areas of the estuary, three exposure scenarios (low tide, high tide, and both low and high tides), and two toxicity scenarios (chronic no-observed-effect concentrations [NOEC] and 50% effect concentrations [EC50]). Risk (delta) was greater at low tide than at high tide and varied throughout the tidal cycle. Spatial distributions of risk in the estuary were similar using both NOEC and EC50 data. The exposure scenario combining data from both tides was considered the most accurate representation of the ecological risk in the estuary. When assessing risk using data across both tides, the greatest risk was identified in the Springvale tributary (delta = 25%), closest to the source area, followed by the inner estuary (delta = 4%) and the Floodvale tributary (delta = 2%), with the lowest risk in the outer estuary (delta = 0.1%), farthest from the source area. Going from the screening-level ecological risk assessment (ERA) to the probabilistic ERA changed the risk from unacceptable to acceptable in 50% of exposure scenarios in two of the four areas within the estuary. The probabilistic ERA provided a more realistic assessment of risk than the screening-level hazard assessment. Copyright (c) 2010 SETAC.
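The joint-probability idea can be sketched numerically: delta is the expected fraction of species whose toxicity threshold is exceeded, averaged over the distribution of exposure concentrations. The lognormal distributions below are synthetic stand-ins, not the study's data.

```python
# Hedged Monte Carlo sketch of a joint-probability-curve risk estimate.
import random

random.seed(1)

# Synthetic exposure concentrations and species sensitivity (toxicity) data:
exposures = [random.lognormvariate(0.0, 1.0) for _ in range(10000)]
toxicities = [random.lognormvariate(2.0, 0.8) for _ in range(200)]

def risk_delta(exposures, toxicities):
    # Fraction of species affected at concentration c, averaged over exposures.
    frac = lambda c: sum(t < c for t in toxicities) / len(toxicities)
    return sum(frac(c) for c in exposures) / len(exposures)

print("delta = %.1f%%" % (100 * risk_delta(exposures, toxicities)))
```

Shifting the exposure distribution upward (e.g. the low-tide scenario, with less dilution) raises delta, which matches the tidal pattern reported in the abstract.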

  6. Probabilistic safety analysis vs probabilistic fracture mechanics -relation and necessary merging

    International Nuclear Information System (INIS)

    Nilsson, Fred

    1997-01-01

    A comparison is made between some general features of probabilistic fracture mechanics (PFM) and probabilistic safety assessment (PSA) in its standard form. We conclude the following. The result of a PSA is a numerically expressed level of confidence in the system based on the state of current knowledge; it is thus not an objective measure of risk. It is important to carefully define the precise nature of the probabilistic statement and relate it to a well-defined situation. Standardisation of PFM methods is necessary. PFM seems to be the only way to obtain estimates of the pipe-break probability, since service statistics are of doubtful value because of scarcity of data and statistical inhomogeneity. Collection of service data should instead be directed towards the occurrence of growing cracks.
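A PFM estimate of a failure probability is typically a Monte Carlo integration over uncertain inputs: sample a crack size and a fracture toughness, and count the samples where the stress intensity factor exceeds the toughness. The distributions and the simple surface-crack stress model below are assumptions for illustration, not the article's model.

```python
# Illustrative Monte Carlo PFM sketch (all numbers assumed).
import math
import random

random.seed(7)

sigma = 300.0  # applied stress, MPa (assumed)

def fails():
    a = random.lognormvariate(math.log(0.01), 1.0)  # crack depth, m (assumed)
    kic = random.normalvariate(120.0, 15.0)         # toughness, MPa*sqrt(m)
    k = sigma * math.sqrt(math.pi * a)              # K = sigma * sqrt(pi * a)
    return k > kic

n = 100000
p_fail = sum(fails() for _ in range(n)) / n
print("estimated failure probability: %.1e" % p_fail)
```

This is exactly the kind of estimate the abstract argues cannot be obtained from service statistics alone: rare failures leave too few recorded events, while the input distributions (crack sizes, toughness) can be measured directly.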

  7. ATLAS Thesis Award 2017

    CERN Multimedia

    Anthony, Katarina

    2018-01-01

    Winners of the ATLAS Thesis Award were presented with certificates and glass cubes during a ceremony on 22 February 2018. They are pictured here with Karl Jakobs (ATLAS Spokesperson), Max Klein (ATLAS Collaboration Board Chair) and Katsuo Tokushuku (ATLAS Collaboration Board Deputy Chair).

  8. Operational intervention levels in a nuclear emergency, general concepts and a probabilistic approach

    International Nuclear Information System (INIS)

    Lauritzen, B.; Baeverstam, U.; Naadland Holo, E.; Sinkko, K.

    1997-12-01

    This report deals with Operational Intervention Levels (OILs) in a nuclear or radiation emergency. OILs are defined as the values of environmental measurements, in particular dose rate measurements, above which specific protective actions should be carried out in emergency exposure situations. The derivation and the application of OILs are discussed, and an overview of the presently adopted values is provided, with emphasis on the situation in the Nordic countries. A new, probabilistic approach to deriving OILs is presented, and the method is illustrated by calculating dose rate OILs in a simplified setting. Contrary to the standard approach, the probabilistic approach allows for optimization of OILs. It is argued that optimized OILs may be much larger than the presently adopted or suggested values. It is recommended that the probabilistic approach be further developed and employed in determining site-specific OILs and in optimizing environmental measuring strategies. (au)
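The standard (non-probabilistic) derivation of a dose-rate OIL can be sketched in one line of arithmetic: given an intervention level for projected dose and a model linking measured ambient dose rate to the dose accumulated over the exposure period, the OIL is the dose rate at which the projected dose reaches the intervention level. All numbers below are illustrative assumptions, not values from the report.

```python
# Simplified dose-rate OIL derivation (illustrative numbers only).

intervention_level_msv = 10.0  # projected dose triggering the action, mSv (assumed)
exposure_hours = 48.0          # integration period for the projected dose (assumed)
shielding_factor = 0.4         # fraction of the outdoor dose actually received (assumed)

# projected dose [mSv] = rate [mSv/h] * hours * shielding  =>  solve for the rate:
oil_msv_per_h = intervention_level_msv / (exposure_hours * shielding_factor)
print("OIL = %.3f mSv/h" % oil_msv_per_h)  # → OIL = 0.521 mSv/h
```

The probabilistic approach of the report would instead treat quantities such as the shielding factor and exposure duration as distributions, so that the OIL can be optimized rather than fixed by a single conservative scenario.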

  9. The effect of teacher interpersonal behaviour on students' subject-specific motivation

    NARCIS (Netherlands)

    den Brok, P.; Levy, J.; Brekelmans, M.; Wubbels, Th.

    2006-01-01

    This study brings together insights from research on teaching and learning in specific subjects, learning environments research and effectiveness research by linking teacher interpersonal behaviour to students’ subject-related attitudes. Teaching was studied in terms of a model originating from

  10. ATLAS

    CERN Multimedia

    Akhnazarov, V; Canepa, A; Bremer, J; Burckhart, H; Cattai, A; Voss, R; Hervas, L; Kaplon, J; Nessi, M; Werner, P; Ten kate, H; Tyrvainen, H; Vandelli, W; Krasznahorkay, A; Gray, H; Alvarez gonzalez, B; Eifert, T F; Rolando, G; Oide, H; Barak, L; Glatzer, J; Backhaus, M; Schaefer, D M; Maciejewski, J P; Milic, A; Jin, S; Von torne, E; Limbach, C; Medinnis, M J; Gregor, I; Levonian, S; Schmitt, S; Waananen, A; Monnier, E; Muanza, S G; Pralavorio, P; Talby, M; Tiouchichine, E; Tocut, V M; Rybkin, G; Wang, S; Lacour, D; Laforge, B; Ocariz, J H; Bertoli, W; Malaescu, B; Sbarra, C; Yamamoto, A; Sasaki, O; Koriki, T; Hara, K; Da silva gomes, A; Carvalho maneira, J; Marcalo da palma, A; Chekulaev, S; Tikhomirov, V; Snesarev, A; Buzykaev, A; Maslennikov, A; Peleganchuk, S; Sukharev, A; Kaplan, B E; Swiatlowski, M J; Nef, P D; Schnoor, U; Oakham, G F; Ueno, R; Orr, R S; Abouzeid, O; Haug, S; Peng, H; Kus, V; Vitek, M; Temming, K K; Dang, N P; Meier, K; Schultz-coulon, H; Geisler, M P; Sander, H; Schaefer, U; Ellinghaus, F; Rieke, S; Nussbaumer, A; Liu, Y; Richter, R; Kortner, S; Fernandez-bosman, M; Ullan comes, M; Espinal curull, J; Chiriotti alvarez, S; Caubet serrabou, M; Valladolid gallego, E; Kaci, M; Carrasco vela, N; Lancon, E C; Besson, N E; Gautard, V; Bracinik, J; Bartsch, V C; Potter, C J; Lester, C G; Moeller, V A; Rosten, J; Crooks, D; Mathieson, K; Houston, S C; Wright, M; Jones, T W; Harris, O B; Byatt, T J; Dobson, E; Hodgson, P; Hodgkinson, M C; Dris, M; Karakostas, K; Ntekas, K; Oren, D; Duchovni, E; Etzion, E; Oren, Y; Ferrer, L M; Testa, M; Doria, A; Merola, L; Sekhniaidze, G; Giordano, R; Ricciardi, S; Milazzo, A; Falciano, S; De pedis, D; Dionisi, C; Veneziano, S; Cardarelli, R; Verzegnassi, C; Soualah, R; Ochi, A; Ohshima, T; Kishiki, S; Linde, F L; Vreeswijk, M; Werneke, P; Muijs, A; Vankov, P H; Jansweijer, P P M; Dale, O; Lund, E; Bruckman de renstrom, P; Dabrowski, W; Adamek, J D; Wolters, H; Micu, L; Pantea, D; Tudorache, V; Mjoernmark, J; 
Klimek, P J; Ferrari, A; Abdinov, O; Akhoundov, A; Hashimov, R; Shelkov, G; Khubua, J; Ladygin, E; Lazarev, A; Glagolev, V; Dedovich, D; Lykasov, G; Zhemchugov, A; Zolnikov, Y; Ryabenko, M; Sivoklokov, S; Vasilyev, I; Shalimov, A; Lobanov, M; Paramoshkina, E; Mosidze, M; Bingul, A; Nodulman, L J; Guarino, V J; Yoshida, R; Drake, G R; Calafiura, P; Haber, C; Quarrie, D R; Alonso, J R; Anderson, C; Evans, H; Lammers, S W; Baubock, M; Anderson, K; Petti, R; Suhr, C A; Linnemann, J T; Richards, R A; Tollefson, K A; Holzbauer, J L; Stoker, D P; Pier, S; Nelson, A J; Isakov, V; Martin, A J; Adelman, J A; Paganini, M; Gutierrez, P; Snow, J M; Pearson, B L; Cleland, W E; Savinov, V; Wong, W; Goodson, J J; Li, H; Lacey, R A; Gordeev, A; Gordon, H; Lanni, F; Nevski, P; Rescia, S; Kierstead, J A; Liu, Z; Yu, W W H; Bensinger, J; Hashemi, K S; Bogavac, D; Cindro, V; Hoeferkamp, M R; Coelli, S; Iodice, M; Piegaia, R N; Alonso, F; Wahlberg, H P; Barberio, E L; Limosani, A; Rodd, N L; Jennens, D T; Hill, E C; Pospisil, S; Smolek, K; Schaile, D A; Rauscher, F G; Adomeit, S; Mattig, P M; Wahlen, H; Volkmer, F; Calvente lopez, S; Sanchis peris, E J; Pallin, D; Podlyski, F; Says, L; Boumediene, D E; Scott, W; Phillips, P W; Greenall, A; Turner, P; Gwilliam, C B; Kluge, T; Wrona, B; Sellers, G J; Millward, G; Adragna, P; Hartin, A; Alpigiani, C; Piccaro, E; Bret cano, M; Hughes jones, R E; Mercer, D; Oh, A; Chavda, V S; Carminati, L; Cavasinni, V; Fedin, O; Patrichev, S; Ryabov, Y; Nesterov, S; Grebenyuk, O; Sasso, J; Mahmood, H; Polsdofer, E; Dai, T; Ferretti, C; Liu, H; Hegazy, K H; Benjamin, D P; Zobernig, G; Ban, J; Brooijmans, G H; Keener, P; Williams, H H; Le geyt, B C; Hines, E J; Fadeyev, V; Schumm, B A; Law, A T; Kuhl, A D; Neubauer, M S; Shang, R; Gagliardi, G; Calabro, D; Conta, C; Zinna, M; Jones, G; Li, J; Stradling, A R; Hadavand, H K; Mcguigan, P; Chiu, P; Baldelomar, E; Stroynowski, R A; Kehoe, R L; De groot, N; Timmermans, C; Lach-heb, F; Addy, T N; Nakano, I; Moreno 
lopez, D; Grosse-knetter, J; Tyson, B; Rude, G D; Tafirout, R; Benoit, P; Danielsson, H O; Elsing, M; Fassnacht, P; Froidevaux, D; Ganis, G; Gorini, B; Lasseur, C; Lehmann miotto, G; Kollar, D; Aleksa, M; Sfyrla, A; Duehrssen-debling, K; Fressard-batraneanu, S; Van der ster, D C; Bortolin, C; Schumacher, J; Mentink, M; Geich-gimbel, C; Yau wong, K H; Lafaye, R; Crepe-renaudin, S; Albrand, S; Hoffmann, D; Pangaud, P; Meessen, C; Hrivnac, J; Vernay, E; Perus, A; Henrot versille, S L; Le dortz, O; Derue, F; Piccinini, M; Polini, A; Terada, S; Arai, Y; Ikeno, M; Fujii, H; Nagano, K; Ukegawa, F; Aguilar saavedra, J A; Conde muino, P; Castro, N F; Eremin, V; Kopytine, M; Sulin, V; Tsukerman, I; Korol, A; Nemethy, P; Bartoldus, R; Glatte, A; Chelsky, S; Van nieuwkoop, J; Bellerive, A; Sinervo, J K; Battaglia, A; Barbier, G J; Pohl, M; Rosselet, L; Alexandre, G B; Prokoshin, F; Pezoa rivera, R A; Batkova, L; Kladiva, E; Stastny, J; Kubes, T; Vidlakova, Z; Esch, H; Homann, M; Herten, L G; Zimmermann, S U; Pfeifer, B; Stenzel, H; Andrei, G V; Wessels, M; Buescher, V; Kleinknecht, K; Fiedler, F M; Schroeder, C D; Fernandez, E; Mir martinez, L; Vorwerk, V; Bernabeu verdu, J; Salt, J; Civera navarrete, J V; Bernard, R; Berriaud, C P; Chevalier, L P; Hubbard, R; Schune, P; Nikolopoulos, K; Batley, J R; Brochu, F M; Phillips, A W; Teixeira-dias, P J; Rose, M B D; Buttar, C; Buckley, A G; Nurse, E L; Larner, A B; Boddy, C; Henderson, J; Costanzo, D; Tarem, S; Maccarrone, G; Laurelli, P F; Alviggi, M; Chiaramonte, R; Izzo, V; Palumbo, V; Fraternali, M; Crosetti, G; Marchese, F; Yamaguchi, Y; Hessey, N P; Mechnich, J M; Liebig, W; Kastanas, K A; Sjursen, T B; Zalieckas, J; Cameron, D G; Banka, P; Kowalewska, A B; Dwuznik, M; Mindur, B; Boldea, V; Hedberg, V; Smirnova, O; Sellden, B; Allahverdiyev, T; Gornushkin, Y; Koultchitski, I; Tokmenin, V; Chizhov, M; Gongadze, A; Khramov, E; Sadykov, R; Krasnoslobodtsev, I; Smirnova, L; Kramarenko, V; Minaenko, A; Zenin, O; Beddall, A J; 
Ozcan, E V; Hou, S; Wang, S; Moyse, E; Willocq, S; Chekanov, S; Le compte, T J; Love, J R; Ciocio, A; Hinchliffe, I; Tsulaia, V; Gomez, A; Luehring, F; Zieminska, D; Huth, J E; Gonski, J L; Oreglia, M; Tang, F; Shochet, M J; Costin, T; Mcleod, A; Uzunyan, S; Martin, S P; Pope, B G; Schwienhorst, R H; Brau, J E; Ptacek, E S; Milburn, R H; Sabancilar, E; Lauer, R; Saleem, M; Mohamed meera lebbai, M R; Lou, X; Reeves, K B; Rijssenbeek, M; Novakova, P N; Rahm, D; Steinberg, P A; Wenaus, T J; Paige, F; Ye, S; Kotcher, J R; Assamagan, K A; Oliveira damazio, D; Maeno, T; Henry, A; Dushkin, A; Costa, G; Meroni, C; Resconi, S; Lari, T; Biglietti, M; Lohse, T; Gonzalez silva, M L; Monticelli, F G; Saavedra, A F; Patel, N D; Ciodaro xavier, T; Asevedo nepomuceno, A; Lefebvre, M; Albert, J E; Kubik, P; Faltova, J; Turecek, D; Solc, J; Schaile, O; Ebke, J; Losel, P J; Zeitnitz, C; Sturm, P D; Barreiro alonso, F; Modesto alapont, P; Soret medel, J; Garzon alama, E J; Gee, C N; Mccubbin, N A; Sankey, D; Emeliyanov, D; Dewhurst, A L; Houlden, M A; Klein, M; Burdin, S; Lehan, A K; Eisenhandler, E; Lloyd, S; Traynor, D P; Ibbotson, M; Marshall, R; Pater, J; Freestone, J; Masik, J; Haughton, I; Manousakis katsikakis, A; Sampsonidis, D; Krepouri, A; Roda, C; Sarri, F; Fukunaga, C; Nadtochiy, A; Kara, S O; Timm, S; Alam, S M; Rashid, T; Goldfarb, S; Espahbodi, S; Marley, D E; Rau, A W; Dos anjos, A R; Haque, S; Grau, N C; Havener, L B; Thomson, E J; Newcomer, F M; Hansl-kozanecki, G; Deberg, H A; Takeshita, T; Goggi, V; Ennis, J S; Olness, F I; Kama, S; Ordonez sanz, G; Koetsveld, F; Elamri, M; Mansoor-ul-islam, S; Lemmer, B; Kawamura, G; Bindi, M; Schulte, S; Kugel, A; Kretz, M P; Kurchaninov, L; Blanchot, G; Chromek-burckhart, D; Di girolamo, B; Francis, D; Gianotti, F; Nordberg, M Y; Pernegger, H; Roe, S; Boyd, J; Wilkens, H G; Pauly, T; Fabre, C; Tricoli, A; Bertet, D; Ruiz martinez, M A; Arnaez, O L; Lenzi, B; Boveia, A J; Gillberg, D I; Davies, J M; Zimmermann, R; Uhlenbrock, M; 
Kraus, J K; Narayan, R T; John, A; Dam, M; Padilla aranda, C; Bellachia, F; Le flour chollet, F M; Jezequel, S; Dumont dayot, N; Fede, E; Mathieu, M; Gensolen, F D; Alio, L; Arnault, C; Bouchel, M; Ducorps, A; Kado, M M; Lounis, A; Zhang, Z P; De vivie de regie, J; Beau, T; Bruni, A; Bruni, G; Grafstrom, P; Romano, M; Lasagni manghi, F; Massa, L; Shaw, K; Ikegami, Y; Tsuno, S; Kawanishi, Y; Benincasa, G; Blagov, M; Fedorchuk, R; Shatalov, P; Romaniouk, A; Belotskiy, K; Timoshenko, S; Hooft van huysduynen, L; Lewis, G H; Wittgen, M M; Mader, W F; Rudolph, C J; Gumpert, C; Mamuzic, J; Rudolph, G; Schmid, P; Corriveau, F; Belanger-champagne, C; Yarkoni, S; Leroy, C; Koffas, T; Harack, B D; Weber, M S; Beck, H; Leger, A; Gonzalez sevilla, S; Zhu, Y; Gao, J; Zhang, X; Blazek, T; Rames, J; Sicho, P; Kouba, T; Sluka, T; Lysak, R; Ristic, B; Kompatscher, A E; Von radziewski, H; Groll, M; Meyer, C P; Oberlack, H; Stonjek, S M; Cortiana, G; Werthenbach, U; Ibragimov, I; Czirr, H S; Cavalli-sforza, M; Puigdengoles olive, C; Tallada crespi, P; Marti i garcia, S; Gonzalez de la hoz, S; Guyot, C; Meyer, J; Schoeffel, L O; Garvey, J; Hawkes, C; Hillier, S J; Staley, R J; Salvatore, P F; Santoyo castillo, I; Carter, J; Yusuff, I B; Barlow, N R; Berry, T S; Savage, G; Wraight, K G; Steele, G E; Hughes, G; Walder, J W; Love, P A; Crone, G J; Waugh, B M; Boeser, S; Sarkar, A M; Holmes, A; Massey, R; Pinder, A; Nicholson, R; Korolkova, E; Katsoufis, I; Maltezos, S; Tsipolitis, G; Leontsinis, S; Levinson, L J; Shoa, M; Abramowicz, H E; Bella, G; Gershon, A; Urkovsky, E; Taiblum, N; Gatti, C; Della pietra, M; Lanza, A; Negri, A; Flaminio, V; Lacava, F; Petrolo, E; Pontecorvo, L; Rosati, S; Zanello, L; Pasqualucci, E; Di ciaccio, A; Giordani, M; Yamazaki, Y; Jinno, T; Nomachi, M; De jong, P J; Ferrari, P; Homma, J; Van der graaf, H; Igonkina, O B; Stugu, B S; Buanes, T; Pedersen, M; Turala, M; Olszewski, A J; Koperny, S Z; Onofre, A; Castro nunes fiolhais, M; Alexa, C; Cuciuc, C M; 
Akesson, T P A; Hellman, S L; Milstead, D A; Bondyakov, A; Pushnova, V; Budagov, Y; Minashvili, I; Romanov, V; Sniatkov, V; Tskhadadze, E; Kalinovskaya, L; Shalyugin, A; Tavkhelidze, A; Rumyantsev, L; Karpov, S; Soloshenko, A; Vostrikov, A; Borissov, E; Solodkov, A; Vorob'ev, A; Sidorov, S; Malyaev, V; Lee, S; Grudzinski, J J; Virzi, J S; Vahsen, S E; Lys, J; Penwell, J W; Yan, Z; Bernard, C S; Barreiro guimaraes da costa, J P; Oliver, J N; Merritt, F S; Brubaker, E M; Kapliy, A; Kim, J; Zutshi, V V; Burghgrave, B O; Abolins, M A; Arabidze, G; Caughron, S A; Frey, R E; Radloff, P T; Schernau, M; Murillo garcia, R; Porter, R A; Mccormick, C A; Karn, P J; Sliwa, K J; Demers konezny, S M; Strauss, M G; Mueller, J A; Izen, J M; Klimentov, A; Lynn, D; Polychronakos, V; Radeka, V; Sondericker, J I I I; Bathe, S; Duffin, S; Chen, H; De castro faria salgado, P E; Kersevan, B P; Lacker, H M; Schulz, H; Kubota, T; Tan, K G; Yabsley, B D; Nunes de moura junior, N; Pinfold, J; Soluk, R A; Ouellette, E A; Leitner, R; Sykora, T; Solar, M; Sartisohn, G; Hirschbuehl, D; Huning, D; Fischer, J; Terron cuadrado, J; Glasman kuguel, C B; Lacasta llacer, C; Lopez-amengual, J; Calvet, D; Chevaleyre, J; Daudon, F; Montarou, G; Guicheney, C; Calvet, S P J; Tyndel, M; Dervan, P J; Maxfield, S J; Hayward, H S; Beck, G; Cox, B; Da via, C; Paschalias, P; Manolopoulou, M; Ragusa, F; Cimino, D; Ezzi, M; Fiuza de barros, N F; Yildiz, H; Ciftci, A K; Turkoz, S; Zain, S B; Tegenfeldt, F; Chapman, J W; Panikashvili, N; Bocci, A; Altheimer, A D; Martin, F F; Fratina, S; Jackson, B D; Grillo, A A; Seiden, A; Watts, G T; Mangiameli, S; Johns, K A; O'grady, F T; Errede, D R; Darbo, G; Ferretto parodi, A; Leahu, M C; Farbin, A; Ye, J; Liu, T; Wijnen, T A; Naito, D; Takashima, R; Sandoval usme, C E; Zinonos, Z; Moreno llacer, M; Agricola, J B; Mcgovern, S A; Sakurai, Y; Trigger, I M; Qing, D; De silva, A S; Butin, F; Dell'acqua, A; Hawkings, R J; Lamanna, M; Mapelli, L; Passardi, G; Rembser, C; Tremblet, 
L; Andreazza, W; Dobos, D A; Koblitz, B; Bianco, M; Dimitrov, G V; Schlenker, S; Armbruster, A J; Rammensee, M C; Romao rodrigues, L F; Peters, K; Pozo astigarraga, M E; Yi, Y; Desch, K K; Huegging, F G; Muller, K K; Stillings, J A; Schaetzel, S; Xella, S; Hansen, J D; Colas, J; Daguin, G; Wingerter, I; Ionescu, G D; Ledroit, F; Lucotte, A; Clement, B E; Stark, J; Clemens, J; Djama, F; Knoops, E; Coadou, Y; Vigeolas-choury, E; Feligioni, L; Iconomidou-fayard, L; Imbert, P; Schaffer, A C; Nikolic, I; Trincaz-duvoid, S; Warin, P; Camard, A F; Ridel, M; Pires, S; Giacobbe, B; Spighi, R; Villa, M; Negrini, M; Sato, K; Gavrilenko, I; Akimov, A; Khovanskiy, V; Talyshev, A; Voronkov, A; Hakobyan, H; Mallik, U; Shibata, A; Konoplich, R; Barklow, T L; Koi, T; Straessner, A; Stelzer, B; Robertson, S H; Vachon, B; Stoebe, M; Keyes, R A; Wang, K; Billoud, T R V; Strickland, V; Batygov, M; Krieger, P; Palacino caviedes, G D; Gay, C W; Jiang, Y; Han, L; Liu, M; Zenis, T; Lokajicek, M; Staroba, P; Tasevsky, M; Popule, J; Svatos, M; Seifert, F; Landgraf, U; Lai, S T; Schmitt, K H; Achenbach, R; Schuh, N; Kiesling, C; Macchiolo, A; Nisius, R; Schacht, P; Von der schmitt, J G; Kortner, O; Atlay, N B; Segura sole, E; Grinstein, S; Neissner, C; Bruckner, D M; Oliver garcia, E; Boonekamp, M; Perrin, P; Gaillot, F M; Wilson, J A; Thomas, J P; Thompson, P D; Palmer, J D; Falk, I E; Chavez barajas, C A; Sutton, M R; Robinson, D; Kaneti, S A; Wu, T; Robson, A; Shaw, C; Buzatu, A; Qin, G; Jones, R; Bouhova-thacker, E V; Viehhauser, G; Weidberg, A R; Gilbert, L; Johansson, P D C; Orphanides, M; Vlachos, S; Behar harpaz, S; Papish, O; Lellouch, D J H; Turgeman, D; Benary, O; La rotonda, L; Vena, R; Tarasio, A; Marzano, F; Gabrielli, A; Di stante, L; Liberti, B; Aielli, G; Oda, S; Nozaki, M; Takeda, H; Hayakawa, T; Miyazaki, K; Maeda, J; Sugimoto, T; Pettersson, N E; Bentvelsen, S; Groenstege, H L; Lipniacka, A; Vahabi, M; Ould-saada, F; Chwastowski, J J; Hajduk, Z; Kaczmarska, A; Olszowska, J 
B; Trzupek, A; Staszewski, R P; Palka, M; Constantinescu, S; Jarlskog, G; Lundberg, B L A; Pearce, M; Ellert, M F; Bannikov, A; Fechtchenko, A; Iambourenko, V; Kukhtin, V; Pozdniakov, V; Topilin, N; Vorozhtsov, S; Khassanov, A; Fliaguine, V; Kharchenko, D; Nikolaev, K; Kotenov, K; Kozhin, A; Zenin, A; Ivashin, A; Golubkov, D; Beddall, A; Su, D; Dallapiccola, C J; Cranshaw, J M; Price, L; Stanek, R W; Gieraltowski, G; Zhang, J; Gilchriese, M; Shapiro, M; Ahlen, S; Morii, M; Taylor, F E; Miller, R J; Phillips, F H; Torrence, E C; Wheeler, S J; Benedict, B H; Napier, A; Hamilton, S F; Petrescu, T A; Boyd, G R J; Jayasinghe, A L; Smith, J M; Mc carthy, R L; Adams, D L; Le vine, M J; Zhao, X; Patwa, A M; Baker, M; Kirsch, L; Krstic, J; Simic, L; Filipcic, A; Seidel, S C; Cantore-cavalli, D; Baroncelli, A; Kind, O M; Scarcella, M J; Maidantchik, C L L; Seixas, J; Balabram filho, L E; Vorobel, V; Spousta, M; Strachota, P; Vokac, P; Slavicek, T; Bergmann, B L; Biebel, O; Kersten, S; Srinivasan, M; Trefzger, T; Vazeille, F; Insa, C; Kirk, J; Middleton, R; Burke, S; Klein, U; Morris, J D; Ellis, K V; Millward, L R; Giokaris, N; Ioannou, P; Angelidakis, S; Bouzakis, K; Andreazza, A; Perini, L; Chtcheguelski, V; Spiridenkov, E; Yilmaz, M; Kaya, U; Ernst, J; Mahmood, A; Saland, J; Kutnink, T; Holler, J; Kagan, H P; Wang, C; Pan, Y; Xu, N; Ji, H; Willis, W J; Tuts, P M; Litke, A; Wilder, M; Rothberg, J; Twomey, M S; Rizatdinova, F; Loch, P; Rutherfoord, J P; Varnes, E W; Barberis, D; Osculati-becchi, B; Brandt, A G; Turvey, A J; Benchekroun, D; Nagasaka, Y; Thanakornworakij, T; Quadt, A; Nadal serrano, J; Magradze, E; Nackenhorst, O; Musheghyan, H; Kareem, M; Chytka, L; Perez codina, E; Stelzer-chilton, O; Brunel, B; Henriques correia, A M; Dittus, F; Hatch, M; Haug, F; Hauschild, M; Huhtinen, M; Lichard, P; Schuh-erhard, S; Spigo, G; Avolio, G; Tsarouchas, C; Ahmad, I; Backes, M P; Barisits, M; Gadatsch, S; Cerv, M; Sicoe, A D; Nattamai sekar, L P; Fazio, D; Shan, L; Sun, X; 
Gaycken, G F; Hemperek, T; Petersen, T C; Alonso diaz, A; Moynot, M; Werlen, M; Hryn'ova, T; Gallin-martel, M; Wu, M; Touchard, F; Menouni, M; Fougeron, D; Le guirriec, E; Chollet, J C; Veillet, J; Barrillon, P; Prat, S; Krasny, M W; Roos, L; Boudarham, G; Lefebvre, G; Boscherini, D; Valentinetti, S; Acharya, B S; Miglioranzi, S; Kanzaki, J; Unno, Y; Yasu, Y; Iwasaki, H; Tokushuku, K; Maio, A; Rodrigues fernandes, B J; Pinto figueiredo raimundo ribeiro, N M; Bot, A; Shmeleva, A; Zaidan, R; Djilkibaev, R; Mincer, A I; Salnikov, A; Aracena, I A; Schwartzman, A G; Silverstein, D J; Fulsom, B G; Anulli, F; Kuhn, D; White, M J; Vetterli, M J; Stockton, M C; Mantifel, R L; Azuelos, G; Shoaleh saadi, D; Savard, P; Clark, A; Ferrere, D; Gaumer, O P; Diaz gutierrez, M A; Liu, Y; Dubnickova, A; Sykora, I; Strizenec, P; Weichert, J; Zitek, K; Naumann, T; Goessling, C; Klingenberg, R; Jakobs, K; Rurikova, Z; Werner, M W; Arnold, H R; Buscher, D; Hanke, P; Stamen, R; Dietzsch, T A; Kiryunin, A; Salihagic, D; Buchholz, P; Pacheco pages, A; Sushkov, S; Porto fernandez, M D C; Cruz josa, R; Vos, M A; Schwindling, J; Ponsot, P; Charignon, C; Kivernyk, O; Goodrick, M J; Hill, J C; Green, B J; Quarman, C V; Bates, R L; Allwood-spiers, S E; Quilty, D; Chilingarov, A; Long, R E; Barton, A E; Konstantinidis, N; Simmons, B; Davison, A R; Christodoulou, V; Wastie, R L; Gallas, E J; Cox, J; Dehchar, M; Behr, J K; Pickering, M A; Filippas, A; Panagoulias, I; Tenenbaum katan, Y D; Roth, I; Pitt, M; Citron, Z H; Benhammou, Y; Amram, N Y N; Soffer, A; Gorodeisky, R; Antonelli, M; Chiarella, V; Curatolo, M; Esposito, B; Nicoletti, G; Martini, A; Sansoni, A; Carlino, G; Del prete, T; Bini, C; Vari, R; Kuna, M; Pinamonti, M; Itoh, Y; Colijn, A P; Klous, S; Garitaonandia elejabarrieta, H; Rosendahl, P L; Taga, A V; Malecki, P; Malecki, P; Wolter, M W; Kowalski, T; Korcyl, G M; Caprini, M; Caprini, I; Dita, P; Olariu, A; Tudorache, A; Lytken, E; Hidvegi, A; Aliyev, M; Alexeev, G; Bardin, D; 
Kakurin, S; Lebedev, A; Golubykh, S; Chepurnov, V; Gostkin, M; Kolesnikov, V; Karpova, Z; Davkov, K I; Yeletskikh, I; Grishkevich, Y; Rud, V; Myagkov, A; Nikolaenko, V; Starchenko, E; Zaytsev, A; Fakhrutdinov, R; Cheine, I; Istin, S; Sahin, S; Teng, P; Chu, M L; Trilling, G H; Heinemann, B; Richoz, N; Degeorge, C; Youssef, S; Pilcher, J; Cheng, Y; Purohit, M V; Kravchenko, A; Calkins, R E; Blazey, G; Hauser, R; Koll, J D; Reinsch, A; Brost, E C; Allen, B W; Lankford, A J; Ciobotaru, M D; Slagle, K J; Haffa, B; Mann, A; Loginov, A; Cummings, J T; Loyal, J D; Skubic, P L; Boudreau, J F; Lee, B E; Redlinger, G; Wlodek, T; Carcassi, G; Sexton, K A; Yu, D; Deng, W; Metcalfe, J E; Panitkin, S; Sijacki, D; Mikuz, M; Kramberger, G; Tartarelli, G F; Farilla, A; Stanescu, C; Herrberg, R; Alconada verzini, M J; Brennan, A J; Varvell, K; Marroquim, F; Gomes, A A; Do amaral coutinho, Y; Gingrich, D; Moore, R W; Dolejsi, J; Valkar, S; Broz, J; Jindra, T; Kohout, Z; Kral, V; Mann, A W; Calfayan, P P; Langer, T; Hamacher, K; Sanny, B; Wagner, W; Flick, T; Redelbach, A R; Ke, Y; Higon-rodriguez, E; Donini, J N; Lafarguette, P; Adye, T J; Baines, J; Barnett, B; Wickens, F J; Martin, V J; Jackson, J N; Prichard, P; Kretzschmar, J; Martin, A J; Walker, C J; Potter, K M; Kourkoumelis, C; Tzamarias, S; Houiris, A G; Iliadis, D; Fanti, M; Bertolucci, F; Maleev, V; Sultanov, S; Rosenberg, E I; Krumnack, N E; Bieganek, C; Diehl, E B; Mc kee, S P; Eppig, A P; Harper, D R; Liu, C; Schwarz, T A; Mazor, B; Looper, K A; Wiedenmann, W; Huang, P; Stahlman, J M; Battaglia, M; Nielsen, J A; Zhao, T; Khanov, A; Kaushik, V S; Vichou, E; Liss, A M; Gemme, C; Morettini, P; Parodi, F; Passaggio, S; Rossi, L; Kuzhir, P; Ignatenko, A; Ferrari, R; Spairani, M; Pianori, E; Sekula, S J; Firan, A I; Cao, T; Hetherly, J W; Gouighri, M; Vassilakopoulos, V; Long, M C; Shimojima, M; Sawyer, L H; Brummett, R E; Losada, M A; Schorlemmer, A L; Mantoani, M; Bawa, H S; Mornacchi, G; Nicquevert, B; Palestini, S; 
Stapnes, S; Veness, R; Kotamaki, M J; Sorde, C; Iengo, P; Campana, S; Goossens, L; Zajacova, Z; Pribyl, L; Poveda torres, J; Marzin, A; Conti, G; Carrillo montoya, G D; Kroseberg, J; Gonella, L; Velz, T; Schmitt, S; Lobodzinska, E M; Lovschall-jensen, A E; Galster, G; Perrot, G; Cailles, M; Berger, N; Barnovska, Z; Delsart, P; Lleres, A; Tisserant, S; Grivaz, J; Matricon, P; Bellagamba, L; Bertin, A; Bruschi, M; De castro, S; Semprini cesari, N; Fabbri, L; Rinaldi, L; Quayle, W B; Truong, T N L; Kondo, T; Haruyama, T; Ng, C; Do valle wemans, A; Almeida veloso, F M; Konovalov, S; Ziegler, J M; Su, D; Lukas, W; Prince, S; Ortega urrego, E J; Teuscher, R J; Knecht, N; Pretzl, K; Borer, C; Gadomski, S; Koch, B; Kuleshov, S; Brooks, W K; Antos, J; Kulkova, I; Chudoba, J; Chyla, J; Tomasek, L; Bazalova, M; Messmer, I; Tobias, J; Sundermann, J E; Kuehn, S S; Kluge, E; Scharf, V L; Barillari, T; Kluth, S; Menke, S; Weigell, P; Schwegler, P; Ziolkowski, M; Casado lechuga, P M; Garcia, C; Sanchez, J; Costa mezquita, M J; Valero biot, J A; Laporte, J; Nikolaidou, R; Virchaux, M; Nguyen, V T H; Charlton, D; Harrison, K; Slater, M W; Newman, P R; Parker, A M; Ward, P; Mcgarvie, S A; Kilvington, G J; D'auria, S; O'shea, V; Mcglone, H M; Fox, H; Henderson, R; Kartvelishvili, V; Davies, B; Sherwood, P; Fraser, J T; Lancaster, M A; Tseng, J C; Hays, C P; Apolle, R; Dixon, S D; Parker, K A; Gazis, E; Papadopoulou, T; Panagiotopoulou, E; Karastathis, N; Hershenhorn, A D; Milov, A; Groth-jensen, J; Bilokon, H; Miscetti, S; Canale, V; Rebuzzi, D M; Capua, M; Bagnaia, P; De salvo, A; Gentile, S; Safai tehrani, F; Solfaroli camillocci, E; Sasao, N; Tsunada, K; Massaro, G; Magrath, C A; Van kesteren, Z; Beker, M G; Van den wollenberg, W; Bugge, L; Buran, T; Read, A L; Gjelsten, B K; Banas, E A; Turnau, J; Derendarz, D K; Kisielewska, D; Chesneanu, D; Rotaru, M; Maurer, J B; Wong, M L; Lund-jensen, B; Asman, B; Jon-and, K B; Silverstein, S B; Johansen, M; Alexandrov, I; Iatsounenko, I; 
Krumshteyn, Z; Peshekhonov, V; Rybaltchenko, K; Samoylov, V; Cheplakov, A; Kekelidze, G; Lyablin, M; Teterine, V; Bednyakov, V; Kruchonak, U; Shiyakova, M M; Demichev, M; Denisov, S P; Fenyuk, A; Djobava, T; Salukvadze, G; Cetin, S A; Brau, B P; Pais, P R; Proudfoot, J; Van gemmeren, P; Zhang, Q; Beringer, J A; Ely, R; Leggett, C; Pengg, F X; Barnett, M R; Quick, R E; Williams, S; Gardner jr, R W; Huston, J; Brock, R; Wanotayaroj, C; Unel, G N; Taffard, A C; Frate, M; Baker, K O; Tipton, P L; Hutchison, A; Walsh, B J; Norberg, S R; Su, J; Tsybyshev, D; Caballero bejar, J; Ernst, M U; Wellenstein, H; Vudragovic, D; Vidic, I; Gorelov, I V; Toms, K; Alimonti, G; Petrucci, F; Kolanoski, H; Smith, J; Jeng, G; Watson, I J; Guimaraes ferreira, F; Miranda vieira xavier, F; Araujo pereira, R; Poffenberger, P; Sopko, V; Elmsheuser, J; Wittkowski, J; Glitza, K; Gorfine, G W; Ferrer soria, A; Fuster verdu, J A; Sanchis lozano, A; Reinmuth, G; Busato, E; Haywood, S J; Mcmahon, S J; Qian, W; Villani, E G; Laycock, P J; Poll, A J; Rizvi, E S; Foster, J M; Loebinger, F; Forti, A; Plano, W G; Brown, G J A; Kordas, K; Vegni, G; Ohsugi, T; Iwata, Y; Cherkaoui el moursli, R; Sahin, M; Akyazi, E; Carlsen, A; Kanwal, B; Cochran jr, J H; Aronnax, M V; Lockner, M J; Zhou, B; Levin, D S; Weaverdyck, C J; Grom, G F; Rudge, A; Ebenstein, W L; Jia, B; Yamaoka, J; Jared, R C; Wu, S L; Banerjee, S; Lu, Q; Hughes, E W; Alkire, S P; Degenhardt, J D; Lipeles, E D; Spencer, E N; Savine, A; Cheu, E C; Lampl, W; Veatch, J R; Roberts, K; Atkinson, M J; Odino, G A; Polesello, G; Martin, T; White, A P; Stephens, R; Grinbaum sarkisyan, E; Vartapetian, A; Yu, J; Sosebee, M; Thilagar, P A; Spurlock, B; Bonde, R; Filthaut, F; Klok, P; Hoummada, A; Ouchrif, M; Pellegrini, G; Rafi tatjer, J M; Navarro, G A; Blumenschein, U; Weingarten, J C; Mueller, D; Graber, L; Gao, Y; Bode, A; Capeans garrido, M D M; Carli, T; Wells, P; Beltramello, O; Vuillermet, R; Dudarev, A; Salzburger, A; Torchiani, C I; Serfon, C L 
G; Sloper, J E; Duperrier, G; Lilova, P T; Knecht, M O; Lassnig, M; Anders, G; Deviveiros, P; Young, C; Sforza, F; Shaochen, C; Lu, F; Wermes, N; Wienemann, P; Schwindt, T; Hansen, P H; Hansen, J B; Pingel, A M; Massol, N; Elles, S L; Hallewell, G D; Rozanov, A; Vacavant, L; Fournier, D A; Poggioli, L; Puzo, P M; Tanaka, R; Escalier, M A; Makovec, N; Rezynkina, K; De cecco, S; Cavalleri, P G; Massa, I; Zoccoli, A; Tanaka, S; Odaka, S; Mitsui, S; Tomasio pina, J A; Santos, H F; Satsounkevitch, I; Harkusha, S; Baranov, S; Nechaeva, P; Kayumov, F; Kazanin, V; Asai, M; Mount, R P; Nelson, T K; Smith, D; Kenney, C J; Malone, C M; Kobel, M; Friedrich, F; Grohs, J P; Jais, W J; O'neil, D C; Warburton, A T; Vincter, M; Mccarthy, T G; Groer, L S; Pham, Q T; Taylor, W J; La marra, D; Perrin, E; Wu, X; Bell, W H; Delitzsch, C M; Feng, C; Zhu, C; Tokar, S; Bruncko, D; Kupco, A; Marcisovsky, M; Jakoubek, T; Bruneliere, R; Aktas, A; Narrias villar, D I; Tapprogge, S; Mattmann, J; Kroha, H; Crespo, J; Korolkov, I; Cavallaro, E; Cabrera urban, S; Mitsou, V; Kozanecki, W; Mansoulie, B; Pabot, Y; Etienvre, A; Bauer, F; Chevallier, F; Bouty, A R; Watkins, P; Watson, A; Faulkner, P J W; Curtis, C J; Murillo quijada, J A; Grout, Z J; Chapman, J D; Cowan, G D; George, S; Boisvert, V; Mcmahon, T R; Doyle, A T; Thompson, S A; Britton, D; Smizanska, M; Campanelli, M; Butterworth, J M; Loken, J; Renton, P; Barr, A J; Issever, C; Short, D; Crispin ortuzar, M; Tovey, D R; French, R; Rozen, Y; Alexander, G; Kreisel, A; Conventi, F; Raulo, A; Schioppa, M; Susinno, G; Tassi, E; Giagu, S; Luci, C; Nisati, A; Cobal, M; Ishikawa, A; Jinnouchi, O; Bos, K; Verkerke, W; Vermeulen, J; Van vulpen, I B; Kieft, G; Mora, K D; Olsen, F; Rohne, O M; Pajchel, K; Nilsen, J K; Wosiek, B K; Wozniak, K W; Badescu, E; Jinaru, A; Bohm, C; Johansson, E K; Sjoelin, J B R; Clement, C; Buszello, C P; Huseynova, D; Boyko, I; Popov, B; Poukhov, O; Vinogradov, V; Tsiareshka, P; Skvorodnev, N; Soldatov, A; Chuguev, A; 
Gushchin, V; Yazici, E; Lutz, M S; Malon, D; Vanyashin, A; Lavrijsen, W; Spieler, H; Biesiada, J L; Bahr, M; Kong, J; Tatarkhanov, M; Ogren, H; Van kooten, R J; Cwetanski, P; Butler, J M; Shank, J T; Chakraborty, D; Ermoline, I; Sinev, N; Whiteson, D O; Corso radu, A; Huang, J; Werth, M P; Kastoryano, M; Meirose da silva costa, B; Namasivayam, H; Hobbs, J D; Schamberger jr, R D; Guo, F; Potekhin, M; Popovic, D; Gorisek, A; Sokhrannyi, G; Hofsajer, I W; Mandelli, L; Ceradini, F; Graziani, E; Giorgi, F; Zur nedden, M E G; Grancagnolo, S; Volpi, M; Nunes hanninger, G; Rados, P K; Milesi, M; Cuthbert, C J; Black, C W; Fink grael, F; Fincke-keeler, M; Keeler, R; Kowalewski, R V; Berghaus, F O; Qi, M; Davidek, T; Tas, P; Jakubek, J; Duckeck, G; Walker, R; Mitterer, C A; Harenberg, T; Sandvoss, S A; Del peso, J; Llorente merino, J; Gonzalez millan, V; Irles quiles, A; Crouau, M; Gris, P L Y; Liauzu, S; Romano saez, S M; Gallop, B J; Jones, T J; Austin, N C; Morris, J; Duerdoth, I; Thompson, R J; Kelly, M P; Leisos, A; Garas, A; Pizio, C; Venda pinto, B A; Kudin, L; Qian, J; Wilson, A W; Mietlicki, D; Long, J D; Sang, Z; Arms, K E; Rahimi, A M; Moss, J J; Oh, S H; Parker, S I; Parsons, J; Cunitz, H; Vanguri, R S; Sadrozinski, H; Lockman, W S; Martinez-mc kinney, G; Goussiou, A; Jones, A; Lie, K; Hasegawa, Y; Olcese, M; Gilewsky, V; Harrison, P F; Janus, M; Spangenberg, M; De, K; Ozturk, N; Pal, A K; Darmora, S; Bullock, D J; Oviawe, O; Derkaoui, J E; Rahal, G; Sircar, A; Frey, A S; Stolte, P; Rosien, N; Zoch, K; Li, L; Schouten, D W; Catinaccio, A; Ciapetti, M; Delruelle, N; Ellis, N; Farthouat, P; Hoecker, A; Klioutchnikova, T; Macina, D; Malyukov, S; Spiwoks, R D; Unal, G P; Vandoni, G; Petersen, B A; Pommes, K; Nairz, A M; Wengler, T; Mladenov, D; Solans sanchez, C A; Lantzsch, K; Schmieden, K; Jakobsen, S; Ritsch, E; Sciuccati, A; Alves dos santos, A M; Ouyang, Q; Zhou, M; Brock, I C; Janssen, J; Katzy, J; Anders, C F; Nilsson, B S; Bazan, A; Di ciaccio, L; Yildizkaya, 
T; Collot, J; Malek, F; Trocme, B S; Breugnon, P; Godiot, S; Adam bourdarios, C; Coulon, J; Duflot, L; Petroff, P G; Zerwas, D; Lieuvin, M; Calderini, G; Laporte, D; Ocariz, J; Gabrielli, A; Ohska, T K; Kurochkin, Y; Kantserov, V; Vasilyeva, L; Speransky, M; Smirnov, S; Antonov, A; Bulekov, O; Tikhonov, Y; Sargsyan, L; Vardanyan, G; Budick, B; Kocian, M L; Luitz, S; Young, C C; Grenier, P J; Kelsey, M; Black, J E; Kneringer, E; Jussel, P; Horton, A J; Beaudry, J; Chandra, A; Ereditato, A; Topfel, C M; Mathieu, R; Bucci, F; Muenstermann, D; White, R M; He, M; Urban, J; Straka, M; Vrba, V; Schumacher, M; Parzefall, U; Mahboubi, K; Sommer, P O; Koepke, L H; Bethke, S; Moser, H; Wiesmann, M; Walkowiak, W A; Fleck, I J; Martinez-perez, M; Sanchez sanchez, C A; Jorgensen roca, S; Accion garcia, E; Sainz ruiz, C A; Valls ferrer, J A; Amoros vicente, G; Vives torrescasana, R; Ouraou, A; Formica, A; Hassani, S; Watson, M F; Cottin buracchio, G F; Bussey, P J; Saxon, D; Ferrando, J E; Collins-tooth, C L; Hall, D C; Cuhadar donszelmann, T; Dawson, I; Duxfield, R; Argyropoulos, T; Brodet, E; Livneh, R; Shougaev, K; Reinherz, E I; Guttman, N; Beretta, M M; Vilucchi, E; Aloisio, A; Patricelli, S; Caprio, M; Cevenini, F; De vecchi, C; Livan, M; Rimoldi, A; Vercesi, V; Ayad, R; Mastroberardino, A; Ciapetti, G; Luminari, L; Rescigno, M; Santonico, R; Salamon, A; Del papa, C; Kurashige, H; Homma, Y; Tomoto, M; Horii, Y; Sugaya, Y; Hanagaki, K; Bobbink, G; Kluit, P M; Koffeman, E N; Van eijk, B; Lee, H; Eigen, G; Dorholt, O; Strandlie, A; Strzempek, P B; Dita, S; Stoicea, G; Chitan, A; Leven, S S; Moa, T; Brenner, R; Ekelof, T J C; Olshevskiy, A; Roumiantsev, V; Chlachidze, G; Zimine, N; Gusakov, Y; Grigalashvili, N; Mineev, M; Potrap, I; Barashkou, A; Shoukavy, D; Shaykhatdenov, B; Pikelner, A; Gladilin, L; Ammosov, V; Abramov, A; Arik, M; Sahinsoy, M; Uysal, Z; Azizi, K; Hotinli, S C; Zhou, S; Berger, E; Blair, R; Underwood, D G; Einsweiler, K; Garcia-sciveres, M A; Siegrist, J L; 
Kipnis, I; Dahl, O; Holland, S; Barbaro galtieri, A; Smith, P T; Parua, N; Franklin, M; Mercurio, K M; Tong, B; Pod, E; Cole, S G; Hopkins, W H; Guest, D H; Severini, H; Marsicano, J J; Abbott, B K; Wang, Q; Lissauer, D; Ma, H; Takai, H; Rajagopalan, S; Protopopescu, S D; Snyder, S S; Undrus, A; Popescu, R N; Begel, M A; Blocker, C A; Amelung, C; Mandic, I; Macek, B; Tucker, B H; Citterio, M; Troncon, C; Orestano, D; Taccini, C; Romeo, G L; Dova, M T; Taylor, G N; Gesualdi manhaes, A; Mcpherson, R A; Sobie, R; Taylor, R P; Dolezal, Z; Kodys, P; Slovak, R; Sopko, B; Vacek, V; Sanders, M P; Hertenberger, R; Meineck, C; Becks, K; Kind, P; Sandhoff, M; Cantero garcia, J; De la torre perez, H; Castillo gimenez, V; Ros, E; Hernandez jimenez, Y; Chadelas, R; Santoni, C; Washbrook, A J; O'brien, B J; Wynne, B M; Mehta, A; Vossebeld, J H; Landon, M; Teixeira dias castanheira, M; Cerrito, L; Keates, J R; Fassouliotis, D; Chardalas, M; Manousos, A; Grachev, V; Seliverstov, D; Sedykh, E; Cakir, O; Ciftci, R; Edson, W; Prell, S A; Rosati, M; Stroman, T; Jiang, H; Neal, H A; Li, X; Gan, K K; Smith, D S; Kruse, M C; Ko, B R; Leung fook cheong, A M; Cole, B; Angerami, A R; Greene, Z S; Kroll, J I; Van berg, R P; Forbush, D A; Lubatti, H; Raisher, J; Shupe, M A; Wolin, S; Oshita, H; Gaudio, G; Das, R; Konig, A C; Croft, V A; Harvey, A; Maaroufi, F; Melo, I; Greenwood jr, Z D; Shabalina, E; Mchedlidze, G; Drechsler, E; Rieger, J K; Blackston, M; Colombo, T

    2002-01-01

    ATLAS is a general-purpose experiment for recording proton-proton collisions at LHC. The ATLAS collaboration consists of 144 participating institutions (June 1998) with more than 1750 physicists and engineers (700 from non-Member States). The detector design has been optimized to cover the largest possible range of LHC physics: searches for Higgs bosons and alternative schemes for the spontaneous symmetry-breaking mechanism; searches for supersymmetric particles, new gauge bosons, leptoquarks, and quark and lepton compositeness indicating extensions to the Standard Model and new physics beyond it; studies of the origin of CP violation via high-precision measurements of CP-violating B-decays; high-precision measurements of the third quark family such as the top-quark mass and decay properties, rare decays of B-hadrons, spectroscopy of rare B-hadrons, and $B^0_s$-mixing. The ATLAS detector, shown in the Figure, includes an inner tracking detector inside a 2 T solenoid providing an axial...

  11. Supporting ATLAS

    CERN Multimedia

    2003-01-01

    Eighteen feet made of stainless steel will support the barrel ATLAS detector in the cavern at Point 1. In total, the ATLAS feet system will carry approximately 6000 tons, and will give the same inclination to the detector as the LHC accelerator. The installation of the feet is scheduled to finish during January 2004 with an installation precision at the 1 mm level despite their height of 5.3 metres. The manufacture was carried out in Russia (Company Izhorskiye Zavody in St. Petersburg), as part of a Russian and JINR Dubna in-kind contribution to ATLAS. Involved in the installation is a team from IHEP-Protvino (Russia), the ATLAS technical co-ordination team at CERN, and the CERN survey team. In all, about 15 people are involved. After the feet are in place, the barrel toroid magnet and the barrel calorimeters will be installed. This will keep the ATLAS team busy for the entire year 2004.

  12. On synchronous parallel computations with independent probabilistic choice

    International Nuclear Information System (INIS)

    Reif, J.H.

    1984-01-01

    This paper introduces probabilistic choice to synchronous parallel machine models, in particular parallel RAMs. The power of probabilistic choice in parallel computations is illustrated by parallelizing some known probabilistic sequential algorithms. The authors characterize the computational complexity of time-, space-, and processor-bounded probabilistic parallel RAMs in terms of the computational complexity of probabilistic sequential RAMs. They show that parallelism uniformly speeds up time-bounded probabilistic sequential RAM computations by nearly a quadratic factor. They also show that probabilistic choice can be eliminated from parallel computations by introducing nonuniformity.

  13. Probabilistic Structural Analysis Theory Development

    Science.gov (United States)

    Burnside, O. H.

    1985-01-01

    The objective of the Probabilistic Structural Analysis Methods (PSAM) project is to develop analysis techniques and computer programs for predicting the probabilistic response of critical structural components for current and future space propulsion systems. This technology will play a central role in establishing system performance and durability. The first year's technical activity is concentrating on probabilistic finite element formulation strategy and code development. Work is also in progress to survey critical materials and Space Shuttle main engine components. The probabilistic finite element computer program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) is being developed. The final probabilistic code will have, in the general case, the capability of performing nonlinear dynamic analysis of stochastic structures. It is the goal of the approximate methods effort to increase problem solving efficiency relative to finite element methods by using energy methods to generate trial solutions which satisfy the structural boundary conditions. These approximate methods will be less computer intensive relative to the finite element approach.

  14. ATLAS SemiConductor Tracker Operation and Performance

    CERN Document Server

    Tojo, J; The ATLAS collaboration

    2011-01-01

    The SemiConductor Tracker (SCT), comprising silicon micro-strip detectors, is one of the key precision tracking devices in the ATLAS Inner Detector. ATLAS is one of the experiments at the CERN LHC. The completed SCT is in very good shape, with 99.3% of its 4088 modules (a total of 6.3 million strips) operational. The noise occupancy and hit efficiency exceed the design specifications. In the talk the current status of the SCT will be reviewed. We will report on the operation of the detector, its performance and observed problems, with stress on the sensor and electronics performance. In December 2009 the ATLAS experiment at the CERN Large Hadron Collider (LHC) recorded the first proton-proton collisions at a centre-of-mass energy of 900 GeV and this was followed by the unprecedented energy of 7 TeV in March 2010. The Semi-Conductor Tracker (SCT) is the key precision tracking device in ATLAS, made from silicon micro-strip detectors processed in the planar p-in-n technology. The signals from the stri...

  15. ATLAS Silicon Microstrip Tracker Operation and Performance

    CERN Document Server

    Yamada, M; The ATLAS collaboration

    2011-01-01

    The SemiConductor Tracker (SCT), comprising silicon micro-strip detectors, is one of the key precision tracking devices in the ATLAS Inner Detector. ATLAS is one of the experiments at the CERN LHC. The completed SCT is in very good shape, with 99.3% of its 4088 modules (a total of 6.3 million strips) operational. The noise occupancy and hit efficiency exceed the design specifications. In the talk the current status of the SCT will be reviewed. We will report on the operation of the detector, its performance and observed problems, with stress on the sensor and electronics performance. In December 2009 the ATLAS experiment at the CERN Large Hadron Collider (LHC) recorded the first proton-proton collisions at a centre-of-mass energy of 900 GeV and this was followed by the unprecedented energy of 7 TeV in March 2010. The Semi-Conductor Tracker (SCT) is the key precision tracking device in ATLAS, made from silicon micro-strip detectors processed in the planar p-in-n technology. The signals from the strip...

  16. The SysteMHC Atlas project.

    Science.gov (United States)

    Shao, Wenguang; Pedrioli, Patrick G A; Wolski, Witold; Scurtescu, Cristian; Schmid, Emanuel; Vizcaíno, Juan A; Courcelles, Mathieu; Schuster, Heiko; Kowalewski, Daniel; Marino, Fabio; Arlehamn, Cecilia S L; Vaughan, Kerrie; Peters, Bjoern; Sette, Alessandro; Ottenhoff, Tom H M; Meijgaarden, Krista E; Nieuwenhuizen, Natalie; Kaufmann, Stefan H E; Schlapbach, Ralph; Castle, John C; Nesvizhskii, Alexey I; Nielsen, Morten; Deutsch, Eric W; Campbell, David S; Moritz, Robert L; Zubarev, Roman A; Ytterberg, Anders Jimmy; Purcell, Anthony W; Marcilla, Miguel; Paradela, Alberto; Wang, Qi; Costello, Catherine E; Ternette, Nicola; van Veelen, Peter A; van Els, Cécile A C M; Heck, Albert J R; de Souza, Gustavo A; Sollid, Ludvig M; Admon, Arie; Stevanovic, Stefan; Rammensee, Hans-Georg; Thibault, Pierre; Perreault, Claude; Bassani-Sternberg, Michal; Aebersold, Ruedi; Caron, Etienne

    2018-01-04

    Mass spectrometry (MS)-based immunopeptidomics investigates the repertoire of peptides presented at the cell surface by major histocompatibility complex (MHC) molecules. The broad clinical relevance of MHC-associated peptides, e.g. in precision medicine, provides a strong rationale for the large-scale generation of immunopeptidomic datasets and recent developments in MS-based peptide analysis technologies now support the generation of the required data. Importantly, the availability of diverse immunopeptidomic datasets has resulted in an increasing need to standardize, store and exchange this type of data to enable better collaborations among researchers, to advance the field more efficiently and to establish quality measures required for the meaningful comparison of datasets. Here we present the SysteMHC Atlas (https://systemhcatlas.org), a public database that aims at collecting, organizing, sharing, visualizing and exploring immunopeptidomic data generated by MS. The Atlas includes raw mass spectrometer output files collected from several laboratories around the globe, a catalog of context-specific datasets of MHC class I and class II peptides, standardized MHC allele-specific peptide spectral libraries consisting of consensus spectra calculated from repeat measurements of the same peptide sequence, and links to other proteomics and immunology databases. The SysteMHC Atlas project was created and will be further expanded using a uniform and open computational pipeline that controls the quality of peptide identifications and peptide annotations. Thus, the SysteMHC Atlas disseminates quality controlled immunopeptidomic information to the public domain and serves as a community resource toward the generation of a high-quality comprehensive map of the human immunopeptidome and the support of consistent measurement of immunopeptidomic sample cohorts. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  17. 17 April 2008 - Head of Internal Audit Network meeting visiting the ATLAS experimental area with CERN ATLAS Team Leader P. Fassnacht, ATLAS Technical Coordinator M. Nessi and ATLAS Resources Manager M. Nordberg.

    CERN Multimedia

    Mona Schweizer

    2008-01-01

    17 April 2008 - Head of Internal Audit Network meeting visiting the ATLAS experimental area with CERN ATLAS Team Leader P. Fassnacht, ATLAS Technical Coordinator M. Nessi and ATLAS Resources Manager M. Nordberg.

  18. Remote control of ATLAS-MPX Network and Data Visualization

    International Nuclear Information System (INIS)

    Turecek, D.; Holy, T.; Pospisil, S.; Vykydal, Z.

    2011-01-01

    The ATLAS-MPX Network is a network of 15 Medipix2-based detector devices, installed in various positions in the ATLAS detector at CERN, Geneva. The aim of the network is to perform a real-time measurement of the spectral characteristics and the composition of radiation inside the ATLAS detector during its operation. The remote control system of ATLAS-MPX controls and configures all the devices from one place, via a web interface accessible from different operating systems. The Data Visualization application, also with a web interface, has been developed in order to present measured data to the scientific community. It allows users to browse through recorded frames from all devices and to search for specific frames by date and time. Charts containing the number of different types of tracks in each frame as a function of time may be rendered from the database.

  19. Probabilistic estimation of residential air exchange rates for ...

    Science.gov (United States)

    Residential air exchange rates (AERs) are a key determinant in the infiltration of ambient air pollution indoors. Population-based human exposure models using probabilistic approaches to estimate personal exposure to air pollutants have relied on input distributions from AER measurements. An algorithm for probabilistically estimating AER was developed based on the Lawrence Berkeley National Laboratory Infiltration model, utilizing housing characteristics and meteorological data with adjustment for window opening behavior. The algorithm was evaluated by comparing modeled and measured AERs in four US cities (Los Angeles, CA; Detroit, MI; Elizabeth, NJ; and Houston, TX), inputting study-specific data. The impact on the modeled AER of using publicly available housing data representative of the region for each city was also assessed. Finally, modeled AER based on region-specific inputs was compared with those estimated using literature-based distributions. While modeled AERs were similar in magnitude to the measured AERs, they were consistently lower for all cities except Houston. AERs estimated using region-specific inputs were lower than those using study-specific inputs due to differences in window opening probabilities. The algorithm produced more spatially and temporally variable AERs compared with literature-based distributions, reflecting within- and between-city differences and helping reduce error in estimates of air pollutant exposure. Published in the Journal of
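    The probabilistic estimation described above amounts to a Monte Carlo draw over distributions of housing and weather inputs fed through an infiltration model. The sketch below is a minimal Python/NumPy illustration of an LBL-style calculation; the input distributions and the stack/wind coefficients are placeholders chosen for illustration, not the study's published values.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Illustrative input distributions -- placeholders, not the published ones
leak_cm2 = rng.lognormal(mean=6.5, sigma=0.4, size=n)   # effective leakage area (cm^2)
vol_m3 = rng.normal(350, 60, size=n).clip(min=150)      # house volume (m^3)
delta_t = np.abs(rng.normal(10, 5, size=n))             # indoor-outdoor temp diff (K)
wind = rng.normal(3, 1, size=n).clip(min=0)             # wind speed (m/s)

# LBL-style infiltration: flow (L/s) ~ leakage area * sqrt(stack + wind terms)
cs, cw = 0.000145, 0.000104                             # example stack/wind coefficients
q_l_s = leak_cm2 * np.sqrt(cs * delta_t + cw * wind**2)

# Convert L/s to air changes per hour: 3600 s/h / 1000 L/m^3 = factor 3.6
aer_per_h = 3.6 * q_l_s / vol_m3

print(round(float(np.median(aer_per_h)), 3))
```

    The result is a full distribution of AER rather than a point estimate, which is what downstream exposure models consume; window-opening adjustments would be layered on top of this baseline infiltration term.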

  20. 14th International Probabilistic Workshop

    CERN Document Server

    Taerwe, Luc; Proske, Dirk

    2017-01-01

    This book presents the proceedings of the 14th International Probabilistic Workshop that was held in Ghent, Belgium in December 2016. Probabilistic methods are currently of crucial importance for research and developments in the field of engineering, which face challenges presented by new materials and technologies and rapidly changing societal needs and values. Contemporary needs related to, for example, performance-based design, service-life design, life-cycle analysis, product optimization, assessment of existing structures and structural robustness give rise to new developments as well as accurate and practically applicable probabilistic and statistical engineering methods to support these developments. These proceedings are a valuable resource for anyone interested in contemporary developments in the field of probabilistic engineering applications.

  1. Nonlocal atlas-guided multi-channel forest learning for human brain labeling.

    Science.gov (United States)

    Ma, Guangkai; Gao, Yaozong; Wu, Guorong; Wu, Ligang; Shen, Dinggang

    2016-02-01

    It is important for many quantitative brain studies to label meaningful anatomical regions in MR brain images. However, due to high complexity of brain structures and ambiguous boundaries between different anatomical regions, the anatomical labeling of MR brain images is still quite a challenging task. In many existing label fusion methods, appearance information is widely used. However, since local anatomy in the human brain is often complex, the appearance information alone is limited in characterizing each image point, especially for identifying the same anatomical structure across different subjects. Recent progress in computer vision suggests that the context features can be very useful in identifying an object from a complex scene. In light of this, the authors propose a novel learning-based label fusion method by using both low-level appearance features (computed from the target image) and high-level context features (computed from warped atlases or tentative labeling maps of the target image). In particular, the authors employ a multi-channel random forest to learn the nonlinear relationship between these hybrid features and target labels (i.e., corresponding to certain anatomical structures). Specifically, at each of the iterations, the random forest will output tentative labeling maps of the target image, from which the authors compute spatial label context features and then use in combination with original appearance features of the target image to refine the labeling. Moreover, to accommodate the high inter-subject variations, the authors further extend their learning-based label fusion to a multi-atlas scenario, i.e., they train a random forest for each atlas and then obtain the final labeling result according to the consensus of results from all atlases. The authors have comprehensively evaluated their method on both public LONI_LBPA40 and IXI datasets. 
To quantitatively evaluate the labeling accuracy, the authors use the dice similarity coefficient
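    The iterative appearance-plus-context scheme can be sketched with a random forest: a first forest trained on appearance features produces tentative label probabilities, which are then appended as (greatly simplified) context features for a second forest. A minimal sketch using scikit-learn and synthetic per-voxel data; the real method derives spatial label-context features from neighborhoods of the tentative label maps and warped atlases, which this toy version does not attempt.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)

# Toy per-voxel appearance features and binary anatomical labels
X_app = rng.normal(size=(300, 4))
y = (X_app[:, 0] + 0.5 * X_app[:, 1] > 0).astype(int)

# Iteration 1: appearance only -> tentative labeling probabilities
rf1 = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_app, y)
context = rf1.predict_proba(X_app)  # stand-in for spatial label-context maps

# Iteration 2: hybrid appearance + context features refine the labeling
X_hybrid = np.hstack([X_app, context])
rf2 = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_hybrid, y)

print(rf2.score(X_hybrid, y) >= rf1.score(X_app, y))
```

    In the multi-atlas extension, one such forest would be trained per atlas and the per-atlas labelings fused by consensus.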

  2. Daily dose monitoring with atlas-based auto-segmentation on diagnostic quality CT for prostate cancer

    Energy Technology Data Exchange (ETDEWEB)

    Li, Wen; Vassil, Andrew; Xia, Ping [Department of Radiation Oncology, Cleveland Clinic Foundation, Cleveland, Ohio 44106 (United States); Zhong, Yahua [Department of Radiation Oncology, Zhongnan Hospital, Wuhan 430071 (China)

    2013-11-15

    Purpose: To evaluate the feasibility of daily dose monitoring using a patient specific atlas-based autosegmentation method on diagnostic quality verification images. Methods: Seven patients, who were treated for prostate cancer with intensity modulated radiotherapy under daily imaging guidance of a CT-on-rails system, were selected for this study. The prostate, rectum, and bladder were manually contoured on the first six and last seven sets of daily verification images. For each patient, three patient specific atlases were constructed using manual contours from planning CT alone (1-image atlas), planning CT plus first three verification CTs (4-image atlas), and planning CT plus first six verification CTs (7-image atlas). These atlases were subsequently applied to the last seven verification image sets of the same patient to generate the auto-contours. Daily dose was calculated by applying the original treatment plans to the daily beam isocenters. The autocontours and manual contours were compared geometrically using the dice similarity coefficient (DSC), and dosimetrically using the dose to 99% of the prostate CTV (D99) and the D5 of rectum and bladder. Results: The DSC of the autocontours obtained with the 4-image atlases were 87.0% ± 3.3%, 84.7% ± 8.6%, and 93.6% ± 4.3% for the prostate, rectum, and bladder, respectively. These indices were higher than those from the 1-image atlases (p < 0.01) and comparable to those from the 7-image atlases (p > 0.05). Daily prostate D99 of the autocontours was comparable to those of the manual contours (p = 0.55). For the bladder and rectum, the daily D5 were 95.5% ± 5.9% and 99.1% ± 2.6% of the planned D5 for the autocontours compared to 95.3% ± 6.7% (p = 0.58) and 99.8% ± 2.3% (p < 0.01) for the manual contours. Conclusions: With patient specific 4-image atlases, atlas-based autosegmentation can adequately facilitate daily dose monitoring for prostate cancer.
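    The geometric comparison above uses the Dice similarity coefficient, DSC = 2|A∩B| / (|A| + |B|), between an auto-contour and a manual contour. A minimal NumPy sketch on toy binary masks (the mask shapes and offsets below are illustrative, not from the study):

```python
import numpy as np

def dice_similarity(a, b):
    """Dice similarity coefficient between two binary masks: 2|A&B| / (|A|+|B|)."""
    a = np.asarray(a, dtype=bool)
    b = np.asarray(b, dtype=bool)
    inter = np.logical_and(a, b).sum()
    denom = a.sum() + b.sum()
    return 2.0 * inter / denom if denom else 1.0

# Two overlapping 6x6 squares inside a 10x10 grid, offset by one voxel
auto = np.zeros((10, 10), dtype=bool)
auto[2:8, 2:8] = True
manual = np.zeros((10, 10), dtype=bool)
manual[3:9, 3:9] = True

print(round(dice_similarity(auto, manual), 3))  # -> 0.694 (overlap 25, sizes 36+36)
```

    A DSC of 1.0 means identical contours; values above roughly 0.85-0.9, as reported for the 4-image atlases, are generally considered good agreement for pelvic organs.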

  3. The dialectical thinking about deterministic and probabilistic safety analysis

    International Nuclear Information System (INIS)

    Qian Yongbai; Tong Jiejuan; Zhang Zuoyi; He Xuhong

    2005-01-01

    There are two methods for designing and analysing the safety performance of a nuclear power plant: the traditional deterministic method and the probabilistic method. To date, the design of nuclear power plants has been based on the deterministic method. It has been proved in practice that the deterministic method is effective for current nuclear power plants. However, the probabilistic method (Probabilistic Safety Assessment - PSA) considers a much wider range of faults, takes an integrated look at the plant as a whole, and uses realistic criteria for the performance of the systems and constructions of the plant. PSA can be seen, in principle, to provide a broader and more realistic perspective on safety issues than the deterministic approaches. In this paper, the historical origins and development trends of the above two methods are reviewed and briefly summarized. Based on the discussion of two application cases - one being the changes to specific design provisions of the general design criteria (GDC), and the other the risk-informed categorization of structures, systems and components - it can be concluded that the deterministic and probabilistic methods are dialectical and unified, that they are gradually being merged into each other, and that they are being used in coordination. (authors)

  4. Atlas-guided prostate intensity modulated radiation therapy (IMRT) planning

    International Nuclear Information System (INIS)

    Sheng, Yang; Li, Taoran; Zhang, You; Lee, W Robert; Yin, Fang-Fang; Wu, Q Jackie; Ge, Yaorong

    2015-01-01

    An atlas-based IMRT planning technique for prostate cancer was developed and evaluated. A multi-dose atlas was built based on the anatomy patterns of the patients, more specifically, the percent distance to the prostate and the concaveness angle formed by the seminal vesicles relative to the anterior-posterior axis. A 70-case dataset was classified using a k-medoids clustering analysis to recognize anatomy pattern variations in the dataset. The best classification, defined by the number of classes or medoids, was determined by the largest value of the average silhouette width. Reference plans from each class formed a multi-dose atlas. The atlas-guided planning (AGP) technique started with matching the new case anatomy pattern to one of the reference cases in the atlas; then a deformable registration between the atlas and new case anatomies transferred the dose from the atlas to the new case to guide inverse planning with full automation. 20 additional clinical cases were re-planned to evaluate the AGP technique. Dosimetric properties between AGP and clinical plans were evaluated. The classification analysis determined that the 5-case atlas would best represent anatomy patterns for the patient cohort. AGP took approximately 1 min on average (corresponding to 70 iterations of optimization) for all cases. When dosimetric parameters were compared, the differences between AGP and clinical plans were less than 3.5%, albeit with some statistically significant differences observed: homogeneity index (p > 0.05), conformity index (p < 0.01), bladder gEUD (p < 0.01), and rectum gEUD (p = 0.02). Atlas-guided treatment planning is feasible and efficient. Atlas predicted dose can effectively guide the optimizer to achieve plan quality comparable to that of clinical plans. (paper)
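    Choosing the atlas size by the largest average silhouette width can be sketched as below. Note the record uses k-medoids clustering; scikit-learn ships k-means rather than k-medoids, so this sketch substitutes KMeans purely for illustration, on synthetic 2-D anatomy features (stand-ins for percent distance and concaveness angle).

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(0)

# Synthetic 2-D anatomy features for 90 cases, drawn from 3 underlying patterns
X = np.vstack([rng.normal(loc, 0.3, size=(30, 2))
               for loc in ([0, 0], [3, 0], [0, 3])])

# Pick the number of classes that maximizes the average silhouette width
best_k, best_width = None, -1.0
for k in range(2, 8):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    width = silhouette_score(X, labels)
    if width > best_width:
        best_k, best_width = k, width

print(best_k)  # recovers the 3 underlying anatomy classes
```

    In the study the same criterion selected a 5-class (5-case) atlas for the 70-case cohort; one reference plan per class then forms the multi-dose atlas.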

  5. Time Alignment as a Necessary Step in the Analysis of Sleep Probabilistic Curves

    Science.gov (United States)

    Rošt'áková, Zuzana; Rosipal, Roman

    2018-02-01

    Sleep can be characterised as a dynamic process that has a finite set of sleep stages during the night. The standard Rechtschaffen and Kales sleep model produces a discrete representation of sleep and does not take into account its dynamic structure. In contrast, the continuous sleep representation provided by the probabilistic sleep model accounts for the dynamics of the sleep process. However, analysis of the sleep probabilistic curves is problematic when time misalignment is present. In this study, we highlight the necessity of curve synchronisation before further analysis. Original and time-aligned sleep probabilistic curves were transformed into a finite-dimensional vector space, and their ability to predict subjects' age or daily measures was evaluated. We conclude that curve alignment significantly improves the prediction of the daily measures, especially in the case of the S2-related sleep states or slow wave sleep.
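    As a toy illustration of why alignment matters, the sketch below recovers the time shift between two otherwise identical probability curves by minimizing the mean squared error over integer shifts. This is a crude stand-in for the curve-registration methods used on sleep probabilistic curves; the function and curve names are illustrative only.

```python
import numpy as np

def align_by_shift(ref, curve, max_shift=50):
    """Return the integer circular shift that best aligns `curve` to `ref` (min MSE)."""
    best_s, best_err = 0, np.inf
    for s in range(-max_shift, max_shift + 1):
        err = np.mean((ref - np.roll(curve, s)) ** 2)
        if err < best_err:
            best_s, best_err = s, err
    return best_s

t = np.linspace(0, 8 * np.pi, 400)
ref = 0.5 * (1 + np.sin(t))      # a smooth "sleep state probability" curve in [0, 1]
lagged = np.roll(ref, 17)        # the same curve, misaligned in time

print(align_by_shift(ref, lagged))  # -> -17, undoing the 17-sample lag
```

    Without this synchronisation step, any vector-space transform of the curves mixes shape differences with pure timing differences, which is exactly the confound the study warns against.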

  6. The ATLAS Computing Agora: a resource web site for citizen science projects

    CERN Document Server

    Bourdarios, Claire; The ATLAS collaboration

    2016-01-01

    The ATLAS collaboration has recently set up a number of citizen science projects which have a strong IT component and could not have been envisaged without the growth of general public computing resources and network connectivity: event simulation through volunteer computing, algorithm improvement via Machine Learning challenges, event display analysis on citizen science platforms, use of open data, etc. Most of the interactions with volunteers are handled through message boards, but specific outreach material was also developed, giving enhanced visibility to the ATLAS software and computing techniques, challenges and community. In this talk the ATLAS Computing Agora (ACA) web platform will be presented, as well as some of the specific material developed for these projects.

  7. Silicon microstrip detectors for the ATLAS SCT

    Czech Academy of Sciences Publication Activity Database

    Robinson, D.; Allport, P.; Andricek, L.; Böhm, Jan; Buttar, C.; Carter, J. R.; Chilingarov, A.; Clark, A. G.; Feriere, D.; Fuster, J.

    2002-01-01

    Roč. 485, 1-2 (2002), s. 84-88 ISSN 0168-9002 R&D Projects: GA MPO RP-4210/69 Institutional research plan: CEZ:AV0Z1010920 Keywords : ATLAS SCT * silicon microstrip detectors * irradiation * quality control Subject RIV: BF - Elementary Particles and High Energy Physics Impact factor: 1.167, year: 2002

  8. Effects of instructions and cue subjectiveness on specificity of autobiographical memory recall

    Directory of Open Access Journals (Sweden)

    Jorge J. Ricarte-Trives

    2014-10-01

    The first aim of this study was to determine the effect of instructions on the specificity of autobiographical memory as measured with the Autobiographical Memory Test (AMT; Williams & Broadbent, 1986), and the efficacy of cue-word selection criteria based on subjective parameters obtained with a standardized lexical program. Results showed a strong effect of specific written instructions, in contrast to non-directed memory recall to the same list of words three weeks later, in a counterbalanced repeated-measures within-subjects design. This effect was stronger when subjects had first performed the non-specific recall task. Word lists matched using the "Buscapalabras" program (Davis & Perea, 2005) behaved very similarly. These results indicate that the same stimuli can be used repeatedly to obtain voluntary and involuntary retrieval by changing only the instructions. Additionally, standardized lexical programs can be employed to adapt cue words for memory recall tasks while controlling for subjective differences in language parameters (frequency, imageability and familiarity).

  9. A Sorghum bicolor expression atlas reveals dynamic genotype-specific expression profiles for vegetative tissues of grain, sweet and bioenergy sorghums

    Energy Technology Data Exchange (ETDEWEB)

    Shakoor, N; Nair, R; Crasta, O; Morris, G; Feltus, A; Kresovich, S

    2014-01-23

    Background: Effective improvement in sorghum crop development necessitates a genomics-based approach to identify functional genes and QTLs. With the genome sequenced in 2009, a comprehensive annotation of the sorghum genome and the development of functional genomics resources are key to enabling the discovery and deployment of regulatory and metabolic genes and gene networks for crop improvement. Results: This study utilizes the first commercially available whole-transcriptome sorghum microarray (Sorgh-WTa520972F) to identify tissue- and genotype-specific expression patterns for all identified Sorghum bicolor exons and UTRs. The genechip contains 1,026,373 probes covering 149,182 exons (27,577 genes) across the Sorghum bicolor nuclear, chloroplast, and mitochondrial genomes. Specific probesets were also included for putative non-coding RNAs that may play a role in gene regulation (e.g., microRNAs), and confirmed functional small RNAs in related species (maize and sugarcane) were also included in the array design. We generated expression data for 78 samples with a combination of four different tissue types (shoot, root, leaf and stem), two dissected stem tissues (pith and rind) and six diverse genotypes, which included 6 public sorghum lines (R159, Atlas, Fremont, PI152611, AR2400 and PI455230) representing grain, sweet, forage, and high biomass ideotypes. Conclusions: Here we present a summary of the microarray dataset, including analysis of tissue-specific gene expression profiles and associated expression profiles of relevant metabolic pathways. With an aim to enable identification and functional characterization of genes in sorghum, this expression atlas presents a new and valuable resource to the research community.

  10. Probabilistic risk assessment of HTGRs

    International Nuclear Information System (INIS)

    Fleming, K.N.; Houghton, W.J.; Hannaman, G.W.; Joksimovic, V.

    1981-01-01

    Probabilistic Risk Assessment methods have been applied to gas-cooled reactors for more than a decade and to HTGRs for more than six years in the programs sponsored by the U.S. Department of Energy. Significant advancements to the development of PRA methodology in these programs are summarized as are the specific applications of the methods to HTGRs. Emphasis here is on PRA as a tool for evaluating HTGR design options. Current work and future directions are also discussed. (author)

  11. Some probabilistic aspects of fracture

    International Nuclear Information System (INIS)

    Thomas, J.M.

    1982-01-01

    Some probabilistic aspects of fracture in structural and mechanical components are examined. The principles of fracture mechanics, material quality and inspection uncertainty are formulated into a conceptual and analytical framework for prediction of failure probability. The role of probabilistic fracture mechanics in a more global context of risk and optimization of decisions is illustrated. An example, where Monte Carlo simulation was used to implement a probabilistic fracture mechanics analysis, is discussed. (orig.)
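The Monte Carlo implementation mentioned in the abstract can be sketched as follows. All distributions and parameter values here are invented for illustration, not taken from the paper; failure is declared when the sampled applied stress intensity factor exceeds the sampled fracture toughness.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000  # Monte Carlo samples

# Hypothetical input distributions (illustrative values only):
a = rng.lognormal(mean=np.log(0.005), sigma=0.5, size=n)  # crack depth, m
sigma = rng.normal(300.0, 30.0, size=n)                   # applied stress, MPa
k_ic = rng.normal(50.0, 5.0, size=n)                      # toughness, MPa*sqrt(m)

Y = 1.12  # assumed geometry factor for a shallow surface crack
k_applied = Y * sigma * np.sqrt(np.pi * a)  # stress intensity factor

# Failure probability = fraction of samples where K exceeds toughness.
p_failure = float(np.mean(k_applied > k_ic))
print(f"estimated failure probability: {p_failure:.3f}")
```

The same sampling loop extends naturally to inspection uncertainty (e.g. a detection probability that removes large cracks from the sample) as described in the framework above.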

  12. Fully probabilistic control design in an adaptive critic framework

    Czech Academy of Sciences Publication Activity Database

    Herzallah, R.; Kárný, Miroslav

    2011-01-01

    Roč. 24, č. 10 (2011), s. 1128-1135 ISSN 0893-6080 R&D Projects: GA ČR GA102/08/0567 Institutional research plan: CEZ:AV0Z10750506 Keywords : Stochastic control design * Fully probabilistic design * Adaptive control * Adaptive critic Subject RIV: BC - Control Systems Theory Impact factor: 2.182, year: 2011 http://library.utia.cas.cz/separaty/2011/AS/karny-0364820.pdf

  13. Performance on a probabilistic inference task in healthy subjects receiving ketamine compared with patients with schizophrenia

    Science.gov (United States)

    Almahdi, Basil; Sultan, Pervez; Sohanpal, Imrat; Brandner, Brigitta; Collier, Tracey; Shergill, Sukhi S; Cregg, Roman; Averbeck, Bruno B

    2012-01-01

    Evidence suggests that some aspects of schizophrenia can be induced in healthy volunteers through acute administration of the non-competitive NMDA-receptor antagonist, ketamine. In probabilistic inference tasks, patients with schizophrenia have been shown to ‘jump to conclusions’ (JTC) when asked to make a decision. We aimed to test whether healthy participants receiving ketamine would adopt a JTC response pattern resembling that of patients. The paradigmatic task used to investigate JTC has been the ‘urn’ task, where participants are shown a sequence of beads drawn from one of two ‘urns’, each containing coloured beads in different proportions. Participants make a decision when they think they know the urn from which beads are being drawn. We compared performance on the urn task between controls receiving acute ketamine or placebo with that of patients with schizophrenia and another group of controls matched to the patient group. Patients were shown to exhibit a JTC response pattern relative to their matched controls, whereas JTC was not evident in controls receiving ketamine relative to placebo. Ketamine does not appear to promote JTC in healthy controls, suggesting that ketamine does not affect probabilistic inferences. PMID:22389244
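The "urn" (beads) task lends itself to a small Bayesian sketch: posterior updating after each bead, with draws-to-decision determined by a confidence threshold. The 85/15 bead proportions and the threshold values below are illustrative assumptions, not parameters from the study.

```python
def posteriors(beads, p=0.85):
    """Posterior P(urn A) after each bead, urn A holding 85% red beads
    and urn B 85% blue, starting from equal priors."""
    post, out = 0.5, []
    for bead in beads:
        like_a = p if bead == "r" else 1 - p
        like_b = 1 - p if bead == "r" else p
        post = post * like_a / (post * like_a + (1 - post) * like_b)
        out.append(post)
    return out

def draws_to_decision(beads, threshold=0.9, p=0.85):
    """Number of beads drawn before the posterior for either urn first
    crosses `threshold` (None if it never does)."""
    for i, post in enumerate(posteriors(beads, p), start=1):
        if post >= threshold or post <= 1 - threshold:
            return i
    return None

seq = list("rrbrr")
print(draws_to_decision(seq, threshold=0.9))  # a cautious responder
print(draws_to_decision(seq, threshold=0.7))  # a "jumping to conclusions" pattern
```

Lowering the decision threshold reproduces the JTC pattern: the same bead sequence yields a decision after fewer draws.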

  14. Multi-atlas Based Segmentation Editing with Interaction-Guided Constraints

    OpenAIRE

    Park, Sang Hyun; Gao, Yaozong; Shen, Dinggang

    2015-01-01

    We propose a novel multi-atlas based segmentation method to address the editing scenario, when given an incomplete segmentation along with a set of training label images. Unlike previous multi-atlas based methods, which depend solely on appearance features, we incorporate interaction-guided constraints to find appropriate training labels and derive their voting weights. Specifically, we divide user interactions, provided on erroneous parts, into multiple local interaction combinations, and th...

  15. Subject-specific knee joint geometry improves predictions of medial tibiofemoral contact forces

    Science.gov (United States)

    Gerus, Pauline; Sartori, Massimo; Besier, Thor F.; Fregly, Benjamin J.; Delp, Scott L.; Banks, Scott A.; Pandy, Marcus G.; D’Lima, Darryl D.; Lloyd, David G.

    2013-01-01

    Estimating tibiofemoral joint contact forces is important for understanding the initiation and progression of knee osteoarthritis. However, tibiofemoral contact force predictions are influenced by many factors, including muscle forces and anatomical representations of the knee joint. This study aimed to investigate the influence of subject-specific geometry and knee joint kinematics on the prediction of tibiofemoral contact forces using a calibrated EMG-driven neuromusculoskeletal model of the knee. One participant fitted with an instrumented total knee replacement walked at a self-selected speed while medial and lateral tibiofemoral contact forces, ground reaction forces, whole-body kinematics, and lower-limb muscle activity were simultaneously measured. Combinations of generic and subject-specific knee joint geometry and kinematics resulted in four different OpenSim models used to estimate muscle-tendon lengths and moment arms. The subject-specific geometric model was created from CT scans, and the subject-specific knee joint kinematics, representing the translation of the tibia relative to the femur, were obtained from fluoroscopy. The EMG-driven model was calibrated using one walking trial, but with three different cost functions that tracked the knee flexion/extension moments with and without constraints on the estimated joint contact forces. The calibrated models then predicted the medial and lateral tibiofemoral contact forces for five other walking trials. The use of subject-specific models with minimization of the peak tibiofemoral contact forces improved the accuracy of medial and lateral contact force predictions by 47% and 7%, respectively, compared with the generic musculoskeletal model. PMID:24074941

  16. The silicon microstrip sensors of the ATLAS semiconductor tracker

    Czech Academy of Sciences Publication Activity Database

    Ahmad, A.; Albrechtskirchinger, Z.; Allport, P.; Böhm, Jan; Mikeštíková, Marcela; Šťastný, Jan

    2007-01-01

    Roč. 578, - (2007), s. 98-118 ISSN 0168-9002 Institutional research plan: CEZ:AV0Z10100502 Keywords : ATLAS * SCT * silicon * microstrip * module * LHC Subject RIV: BF - Elementary Particles and High Energy Physics Impact factor: 1.114, year: 2007

  17. Report to users of ATLAS

    International Nuclear Information System (INIS)

    Ahmad, I.; Glagola, B.

    1995-05-01

    This report contains discussions in the following areas: status of the ATLAS accelerator; highlights of recent research at ATLAS; a concept for an advanced exotic beam facility based on ATLAS; the program advisory committee; the ATLAS executive committee; and ATLAS and the ANL Physics Division on the World Wide Web

  18. Probabilistic Tsunami Hazard Assessment: the Seaside, Oregon Pilot Study

    Science.gov (United States)

    Gonzalez, F. I.; Geist, E. L.; Synolakis, C.; Titov, V. V.

    2004-12-01

    A pilot study of Seaside, Oregon is underway, to develop methodologies for probabilistic tsunami hazard assessments that can be incorporated into Flood Insurance Rate Maps (FIRMs) developed by FEMA's National Flood Insurance Program (NFIP). Current NFIP guidelines for tsunami hazard assessment rely on the science, technology and methodologies developed in the 1970s; although generally regarded as groundbreaking and state-of-the-art for its time, this approach is now superseded by modern methods that reflect substantial advances in tsunami research achieved in the last two decades. In particular, post-1990 technical advances include: improvements in tsunami source specification; improved tsunami inundation models; better computational grids by virtue of improved bathymetric and topographic databases; a larger database of long-term paleoseismic and paleotsunami records and short-term, historical earthquake and tsunami records that can be exploited to develop improved probabilistic methodologies; better understanding of earthquake recurrence and probability models. The NOAA-led U.S. National Tsunami Hazard Mitigation Program (NTHMP), in partnership with FEMA, USGS, NSF and Emergency Management and Geotechnical agencies of the five Pacific States, incorporates these advances into site-specific tsunami hazard assessments for coastal communities in Alaska, California, Hawaii, Oregon and Washington. NTHMP hazard assessment efforts currently focus on developing deterministic, "credible worst-case" scenarios that provide valuable guidance for hazard mitigation and emergency management. The NFIP focus, on the other hand, is on actuarial needs that require probabilistic hazard assessments such as those that characterize 100- and 500-year flooding events. There are clearly overlaps in NFIP and NTHMP objectives. NTHMP worst-case scenario assessments that include an estimated probability of occurrence could benefit the NFIP; NFIP probabilistic assessments of 100- and 500-yr

  19. Overview of the probabilistic risk assessment approach

    International Nuclear Information System (INIS)

    Reed, J.W.

    1985-01-01

    The techniques of probabilistic risk assessment (PRA) are applicable to Department of Energy facilities. The background and techniques of PRA are given with special attention to seismic, wind and flooding external events. A specific application to seismic events is provided to demonstrate the method. However, the PRA framework is applicable also to wind and external flooding. 3 references, 8 figures, 1 table

  20. ATLAS Distributed Computing Automation

    CERN Document Server

    Schovancova, J; The ATLAS collaboration; Borrego, C; Campana, S; Di Girolamo, A; Elmsheuser, J; Hejbal, J; Kouba, T; Legger, F; Magradze, E; Medrano Llamas, R; Negri, G; Rinaldi, L; Sciacca, G; Serfon, C; Van Der Ster, D C

    2012-01-01

    The ATLAS Experiment benefits from computing resources distributed worldwide at more than 100 WLCG sites. The ATLAS Grid sites provide over 100k CPU job slots and over 100 PB of storage space on disk or tape. Monitoring the status of such a complex infrastructure is essential. The ATLAS Grid infrastructure is monitored 24/7 by two teams of shifters distributed world-wide, by the ATLAS Distributed Computing experts, and by site administrators. In this paper we summarize automation efforts performed within the ATLAS Distributed Computing team in order to reduce manpower costs and improve the reliability of the system. Different aspects of the automation process are described: from the ATLAS Grid site topology provided by the ATLAS Grid Information System, via automatic site testing by HammerCloud, to automatic exclusion from production or analysis activities.

  1. Probabilistic stability and "tall" wind profiles: theory and method for use in wind resource assessment

    DEFF Research Database (Denmark)

    Kelly, Mark C.; Troen, Ib

    2016-01-01

    A model has been derived for calculating the aggregate effects of stability and the finite height of the planetary boundary layer upon the long-term mean wind profile. A practical implementation of this probabilistic extended similarity-theory model is made, including its incorporation within the European Wind Atlas (EWA) methodology for site-to-site application. Theoretical and practical implications of the EWA methodology are also derived and described, including unprecedented documentation of the theoretical framework encompassing vertical extrapolation, as well as some improvement to the methodology. Results of the modeling are shown for a number of sites, with discussion of the models' efficacy and the relative improvement shown by the new model, for situations where a user lacks local heat flux information, as well as performance of the new model using measured flux statistics.

  2. Low-complexity atlas-based prostate segmentation by combining global, regional, and local metrics

    Energy Technology Data Exchange (ETDEWEB)

    Xie, Qiuliang; Ruan, Dan, E-mail: druan@mednet.ucla.edu [The Department of Radiation Oncology, University of California Los Angeles, California 90095 (United States)

    2014-04-15

    Purpose: To improve the efficiency of atlas-based segmentation without compromising accuracy, and to demonstrate the validity of the proposed method on an MRI-based prostate segmentation application. Methods: Accurate and efficient automatic structure segmentation is an important task in medical image processing. Atlas-based methods, as the state of the art, provide good segmentation at the cost of a large number of computationally intensive nonrigid registrations, for anatomical sites/structures that are subject to deformation. In this study, the authors propose to utilize a combination of global, regional, and local metrics to improve the accuracy yet significantly reduce the number of required nonrigid registrations. The authors first perform an affine registration to minimize the global mean squared error (gMSE) to coarsely align each atlas image to the target. Subsequently, a target-specific regional MSE (rMSE), demonstrated to be a good surrogate for the dice similarity coefficient (DSC), is used to select a relevant subset from the training atlas. Only within this subset are nonrigid registrations performed between the training images and the target image, to minimize a weighted combination of gMSE and rMSE. Finally, structure labels are propagated from the selected training samples to the target via the estimated deformation fields, and label fusion is performed based on a weighted combination of rMSE and local MSE (lMSE) discrepancy, with proper total-variation-based spatial regularization. Results: The proposed method was applied to a public database of 30 prostate MR images with expert-segmented structures. The authors' method, utilizing only eight nonrigid registrations, achieved a median/mean DSC of over 0.87/0.86, outperforming the state-of-the-art full-fledged atlas-based segmentation approach, for which the median/mean DSC was 0.84/0.82 when applied to the same data set. Conclusions: The proposed method requires a fixed number of nonrigid

  3. Failure probability assessment of wall-thinned nuclear pipes using probabilistic fracture mechanics

    International Nuclear Information System (INIS)

    Lee, Sang-Min; Chang, Yoon-Suk; Choi, Jae-Boong; Kim, Young-Jin

    2006-01-01

    The integrity of nuclear piping systems has to be maintained during operation. In order to maintain this integrity, reliable assessment procedures, including fracture mechanics analysis, are required. Up to now, such assessments have been performed using conventional deterministic approaches, even though many uncertainties hinder a rational evaluation. In this respect, probabilistic approaches are considered an appropriate method for piping system evaluation. The objectives of this paper are to estimate the failure probabilities of wall-thinned pipes in nuclear secondary systems and to propose limited operating conditions under different types of loadings. To do this, a probabilistic assessment program using reliability indices and simulation techniques was developed and applied to evaluate the failure probabilities of wall-thinned pipes subjected to internal pressure, bending moment, and their combined loading. The sensitivity analyses, as well as the prototypical integrity assessment results, showed the promising applicability of the probabilistic assessment program, the necessity of practical evaluations reflecting combined loading conditions, and of operation within limited operating conditions

  4. Probabilistic methods used in NUSS

    International Nuclear Information System (INIS)

    Fischer, J.; Giuliani, P.

    1985-01-01

    Probabilistic considerations are used implicitly or explicitly in all technical areas. In the NUSS codes and guides, design and siting are the two areas where most use is made of these concepts. A brief review of the relevant documents in these two areas is made in this paper. It covers the documents where either probabilistic considerations are implied or where probabilistic approaches are recommended in the evaluation of situations and of events. In the siting guides the review mainly covers the analysis of seismic, hydrological and external man-made events, as well as some aspects of the analysis of extreme meteorological events. Probabilistic methods are recommended in the design guides, but they are not made a requirement. There are several reasons for this, mainly the lack of reliable data and the absence of quantitative safety limits or goals against which to judge the design analysis. As far as practical, engineering judgement should be backed up by quantitative probabilistic analysis. Examples are given, and the concept of design basis as used in the NUSS design guides is explained. (author)

  5. ATLAS-AWS

    International Nuclear Information System (INIS)

    Gehrcke, Jan-Philip; Stonjek, Stefan; Kluth, Stefan

    2010-01-01

    We show how the ATLAS offline software is ported to the Amazon Elastic Compute Cloud (EC2). We prepare an Amazon Machine Image (AMI) on the basis of the standard ATLAS platform, Scientific Linux 4 (SL4). An instance of the SL4 AMI is then started on EC2, and we install and validate a recent release of the ATLAS offline software distribution kit. The installed software is archived as an image on the Amazon Simple Storage Service (S3) and can be quickly retrieved and connected to new SL4 AMI instances using the Amazon Elastic Block Store (EBS). ATLAS jobs can then configure against the release kit using the ATLAS configuration management tool (cmt) in the standard way. The output of jobs is exported to S3 before the SL4 AMI is terminated. Job status information is transferred to the Amazon SimpleDB service. The whole process of launching instances of our AMI, starting, monitoring and stopping jobs, and retrieving job output from S3 is controlled from a client machine using python scripts implementing the Amazon EC2/S3 API via the boto library, working together with small scripts embedded in the SL4 AMI. We report our experience with setting up and operating the system using standard ATLAS job transforms.

  6. A quality control atlas for scintillation camera systems

    International Nuclear Information System (INIS)

    Busemann Sokole, E.; Graham, L.S.; Todd-Pokropek, A.; Wegst, A.; Robilotta, C.C.

    2002-01-01

    The accurate interpretation of quality control and clinical nuclear medicine image data is coupled to an understanding of image patterns and quantitative results. Understanding is gained by learning from different examples and from knowledge of the underlying principles of image production. An atlas of examples has been created to assist with interpreting quality control tests and recognizing artifacts in clinical examples. The project was initiated and supported by the International Atomic Energy Agency (IAEA). The Atlas was developed and written by Busemann Sokole from image examples submitted by nuclear medicine users from around the world. The descriptive text was written in a consistent format to accompany each image or image set. Each example in the atlas consists of the images; a brief description of the data acquisition, the radionuclide/radiopharmaceutical, and the specific circumstances under which the image was produced; results describing the images and subsequent conclusions; comments, where appropriate, giving guidelines for follow-up strategies and troubleshooting; and occasional literature references. Hardcopy images required digitizing into JPEG format for inclusion in a digital document. Where possible, an example was contained on one page. The atlas was reviewed by an international group of experts. A total of about 250 examples were compiled into 6 sections: planar, SPECT, whole body, camera/computer interface, environment/radioactivity, and display/hardcopy. Subtle loss of image quality may be difficult to detect. SPECT examples, therefore, include simulations demonstrating the effects of deterioration in camera performance (e.g. center-of-rotation offset, non-uniformity) or suboptimal clinical performance. The atlas includes normal results, results from poor adjustment of the camera system, poor results obtained at acceptance testing, artifacts due to system malfunction, and artifacts due to environmental situations. Some image patterns are

  7. MRI-based treatment planning with pseudo CT generated through atlas registration.

    Science.gov (United States)

    Uh, Jinsoo; Merchant, Thomas E; Li, Yimei; Li, Xingyu; Hua, Chiaho

    2014-05-01

    To evaluate the feasibility and accuracy of magnetic resonance imaging (MRI)-based treatment planning using pseudo CTs generated through atlas registration. A pseudo CT, providing the electron density information needed for dose calculation, was generated by deforming atlas CT images previously acquired from other patients. The authors tested 4 schemes of synthesizing a pseudo CT from single or multiple deformed atlas images: use of a single arbitrarily selected atlas, an arithmetic mean process using 6 atlases, and pattern recognition with Gaussian process (PRGP) using 6 or 12 atlases. The deformation required for the atlas CT images was derived from a nonlinear registration of the conjugated atlas MR images to that of the patient of interest. The contrasts of the atlas MR images were adjusted by histogram matching to reduce the effect of different sets of acquisition parameters. For comparison, the authors also tested a simple scheme assigning the Hounsfield unit of water to the entire patient volume. All pseudo CT generating schemes were applied to 14 patients with common pediatric brain tumors. The image similarity of the real patient-specific CT and the pseudo CTs constructed by the different schemes was compared, and differences in computation times were also calculated. The real CT in the treatment planning system was replaced with the pseudo CT, and the dose distribution was recalculated to determine the difference. The atlas approach generally performed better than assigning a bulk CT number to the entire patient volume. Among the atlas-based schemes, those using multiple atlases outperformed the single atlas scheme. For the multiple atlas schemes, the pseudo CTs were similar to the real CTs (correlation coefficient, 0.787-0.819), and the calculated dose distribution was in close agreement with the original dose: nearly the entire patient volume (98.3%-98.7%) satisfied the chi-evaluation criteria, suggesting that MRI-based treatment planning is feasible for pediatric brain tumor patients. The doses calculated from pseudo CTs agreed well with those from real CTs
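The simplest of the schemes described above, the arithmetic mean of deformed atlas CTs, reduces to a voxelwise average once the atlas CTs have been deformed into the patient's space. A toy sketch follows; the 1D "slices" and Hounsfield values are invented for illustration.

```python
import numpy as np

def mean_pseudo_ct(deformed_atlas_cts):
    """Arithmetic-mean pseudo CT: voxelwise average of atlas CTs that
    have already been deformed into the patient's MR space."""
    return np.stack(deformed_atlas_cts, axis=0).mean(axis=0)

# Toy 1D "slices" in Hounsfield units standing in for deformed atlas CTs
# (air-like, soft-tissue-like, and bone-like voxels).
atlas_cts = [np.array([-1000.0, 40.0, 1000.0]),
             np.array([-980.0, 60.0, 950.0]),
             np.array([-990.0, 50.0, 980.0])]
pseudo = mean_pseudo_ct(atlas_cts)
print(pseudo)
```

The PRGP schemes replace this uniform average with voxel weights learned from local MR patch similarity, but the output has the same shape and role in dose calculation.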

  8. EnviroAtlas

    Data.gov (United States)

    City and County of Durham, North Carolina — This EnviroAtlas web service supports research and online mapping activities related to EnviroAtlas (https://www.epa.gov/enviroatlas). The layers in this web...

  9. Comparison of probabilistic and deterministic fiber tracking of cranial nerves.

    Science.gov (United States)

    Zolal, Amir; Sobottka, Stephan B; Podlesek, Dino; Linn, Jennifer; Rieger, Bernhard; Juratli, Tareq A; Schackert, Gabriele; Kitzler, Hagen H

    2017-09-01

    OBJECTIVE The depiction of cranial nerves (CNs) using diffusion tensor imaging (DTI) is of great interest in skull base tumor surgery, and DTI used with deterministic tracking methods has been reported previously. However, there are still no good methods for eliminating noise from the resulting depictions. The authors hypothesized that probabilistic tracking could lead to more accurate results, because it extracts information from the underlying data more efficiently. Moreover, the authors adapted a previously described technique for noise elimination using gradual threshold increases to probabilistic tracking. To evaluate the utility of this new approach, a comparison is provided between the gradual threshold increase method in probabilistic and deterministic tracking of CNs. METHODS Both tracking methods were used to depict CNs II, III, V, and the VII+VIII bundle. Depiction of 240 CNs was attempted with each of the above methods in 30 healthy subjects, whose data were obtained from 2 public databases: the Kirby repository (KR) and the Human Connectome Project (HCP). Elimination of erroneous fibers was attempted by gradually increasing the respective thresholds (fractional anisotropy [FA] and probabilistic index of connectivity [PICo]). The results were compared with predefined ground truth images based on corresponding anatomical scans. Two label overlap measures (false-positive error and Dice similarity coefficient) were used to evaluate the success of both methods in depicting the CNs. Moreover, the differences between these parameters obtained from the KR and HCP (with higher angular resolution) databases were evaluated. Additionally, visualization of 10 CNs in 5 clinical cases was attempted with both methods and evaluated by comparing the depictions with intraoperative findings. RESULTS Maximum Dice similarity coefficients were significantly higher with probabilistic tracking. Probabilistic tracking with a gradual
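The evaluation machinery named in the abstract — the Dice similarity coefficient, the false-positive error, and a gradual threshold increase on a probabilistic connectivity map — can be sketched on synthetic 1D data. The data and threshold grid below are our own illustration, not the study's protocol.

```python
import numpy as np

def dice(a, b):
    """Dice similarity coefficient between two binary masks."""
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

def false_positive_error(tract, truth):
    """Fraction of the depicted tract lying outside the ground truth."""
    return np.logical_and(tract, ~truth).sum() / tract.sum()

def best_threshold(prob_map, truth, thresholds):
    """Gradually raise the threshold on a probabilistic connectivity map
    and keep the binarisation with the highest Dice against ground truth."""
    return max(thresholds, key=lambda t: dice(prob_map >= t, truth))

# Synthetic 1D "nerve": high connectivity values inside, noise outside.
truth = np.zeros(100, dtype=bool)
truth[40:60] = True
noise = np.random.default_rng(2).normal(0, 0.05, 100)
prob = np.clip(np.where(truth, 0.8, 0.1) + noise, 0, 1)

t = best_threshold(prob, truth, np.linspace(0.05, 0.95, 19))
mask = prob >= t
print(round(float(dice(mask, truth)), 3),
      round(float(false_positive_error(mask, truth)), 3))
```

Raising the threshold trades false-positive fibers against missed voxels; the sweep picks the operating point where the overlap with ground truth peaks.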

  10. Daily dose monitoring with atlas-based auto-segmentation on diagnostic quality CT for prostate cancer

    International Nuclear Information System (INIS)

    Li, Wen; Vassil, Andrew; Xia, Ping; Zhong, Yahua

    2013-01-01

    Purpose: To evaluate the feasibility of daily dose monitoring using a patient-specific atlas-based autosegmentation method on diagnostic quality verification images. Methods: Seven patients, who were treated for prostate cancer with intensity modulated radiotherapy under daily imaging guidance of a CT-on-rails system, were selected for this study. The prostate, rectum, and bladder were manually contoured on the first six and last seven sets of daily verification images. For each patient, three patient-specific atlases were constructed using manual contours from the planning CT alone (1-image atlas), the planning CT plus the first three verification CTs (4-image atlas), and the planning CT plus the first six verification CTs (7-image atlas). These atlases were subsequently applied to the last seven verification image sets of the same patient to generate the auto-contours. Daily dose was calculated by applying the original treatment plans to the daily beam isocenters. The autocontours and manual contours were compared geometrically using the Dice similarity coefficient (DSC), and dosimetrically using the dose to 99% of the prostate CTV (D99) and the D5 of the rectum and bladder. Results: The DSC of the autocontours obtained with the 4-image atlases were 87.0% ± 3.3%, 84.7% ± 8.6%, and 93.6% ± 4.3% for the prostate, rectum, and bladder, respectively. These indices were higher than those from the 1-image atlases (p < 0.05). Daily prostate D99 of the autocontours was comparable to that of the manual contours (p = 0.55). For the bladder and rectum, the daily D5 were 95.5% ± 5.9% and 99.1% ± 2.6% of the planned D5 for the autocontours, compared to 95.3% ± 6.7% (p = 0.58) and 99.8% ± 2.3% (p < 0.01) for the manual contours. Conclusions: With patient-specific 4-image atlases, atlas-based autosegmentation can adequately facilitate daily dose monitoring for prostate cancer
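The dosimetric quantities used here, D99 and D5, are quantiles of the voxel dose distribution and are straightforward to compute from a dose array. A minimal sketch follows, using toy dose values rather than patient data.

```python
import numpy as np

def dose_to_top_fraction(dose_voxels, fraction):
    """Dx metric: the minimum dose received by the hottest `fraction`
    of the structure volume (fraction=0.99 -> D99, fraction=0.05 -> D5)."""
    return float(np.quantile(dose_voxels, 1.0 - fraction))

doses = np.linspace(0.0, 80.0, 10001)  # toy voxel doses in Gy
print(dose_to_top_fraction(doses, 0.99))  # D99: near-minimum dose to the target
print(dose_to_top_fraction(doses, 0.05))  # D5: near-maximum dose to an organ at risk
```

Applied to the dose inside an auto-contour versus the corresponding manual contour, such quantiles give exactly the D99/D5 comparisons reported above.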

  11. Dear ATLAS colleagues,

    CERN Multimedia

    PH Department

    2008-01-01

    We are collecting old pairs of glasses to take out to Mali, where they can be re-used by people there. The price for a pair of glasses can often exceed 3 months salary, so they are prohibitively expensive for many people. If you have any old spectacles you can donate, please put them in the special box in the ATLAS secretariat, bldg.40-4-D01 before the Christmas closure on 19 December so we can take them with us when we leave for Africa at the end of the month. (more details in ATLAS e-news edition of 29 September 2008: http://atlas-service-enews.web.cern.ch/atlas-service-enews/news/news_mali.php) many thanks! Katharine Leney co-driver of the ATLAS car on the Charity Run to Mali

  12. Effects of methamphetamine administration on information gathering during probabilistic reasoning in healthy humans.

    Science.gov (United States)

    Ermakova, Anna O; Ramachandra, Pranathi; Corlett, Philip R; Fletcher, Paul C; Murray, Graham K

    2014-01-01

    Jumping to conclusions (JTC) during probabilistic reasoning is a cognitive bias repeatedly demonstrated in people with schizophrenia and shown to be associated with delusions. Little is known about the neurochemical basis of probabilistic reasoning. We tested the hypothesis that catecholamines influence data gathering and probabilistic reasoning by administering intravenous methamphetamine, which is known to cause synaptic release of the catecholamines noradrenaline and dopamine, to healthy humans whilst they undertook a probabilistic inference task. Our study used a randomised, double-blind, cross-over design. Seventeen healthy volunteers on three visits were administered either placebo or methamphetamine or methamphetamine preceded by amisulpride. In all three conditions participants performed the "beads" task in which participants decide how much information to gather before making a probabilistic inference, and which measures the cognitive bias towards jumping to conclusions. Psychotic symptoms triggered by methamphetamine were assessed using Comprehensive Assessment of At-Risk Mental States (CAARMS). Methamphetamine induced mild psychotic symptoms, but there was no effect of drug administration on the number of draws to decision (DTD) on the beads task. DTD was a stable trait that was highly correlated within subjects across visits (intra-class correlation coefficients of 0.86 and 0.91 on two versions of the task). The less information was sampled in the placebo condition, the more psychotic-like symptoms the person had after the methamphetamine plus amisulpride condition (p = 0.028). Our results suggest that information gathering during probabilistic reasoning is a stable trait, not easily modified by dopaminergic or noradrenergic modulation.

  13. Development of probabilistic fast reactor fuel design method

    International Nuclear Information System (INIS)

    Ozawa, Takayuki

    1997-01-01

    Under the current method of evaluating fuel robustness in FBR fuel rod design, a variety of uncertain quantities including fuel production tolerance and power density are estimated conservatively. In the future, in order to proceed with improvements in the FBR core's performance and optimize the fuel's specifications, a rationalization of fuel design tolerance is required. Among the measures aimed at realizing this rationalization, the introduction of a probabilistic fast reactor fuel design method is currently under consideration. I have developed a probabilistic fast reactor fuel design code named BORNFREE, in order to make use of this method in FBR fuel design. At the same time, I have carried out a trial calculation of the cladding stress using this code and made a study and an evaluation of the possibility of employing tolerance rationalization in fuel rod design. In this paper, I provide an outline description of BORNFREE and report the results of the above study and evaluation. After performing cladding stress trial calculations using the probabilistic method, I was able to confirm that this method promises more rational design evaluation results than the conventional deterministic method. (author)

  14. Recent ATLAS Articles on WLAP

    CERN Multimedia

    Goldfarb, S

    2005-01-01

    As reported in the September 2004 ATLAS eNews, the Web Lecture Archive Project is a system for the archiving and publishing of multimedia presentations, using the Web as medium. We list here newly available WLAP items relating to ATLAS: Atlas Software Week Plenary 6-10 December 2004 North American ATLAS Physics Workshop (Tucson) 20-21 December 2004 (17 talks) Physics Analysis Tools Tutorial (Tucson) 19 December 2004 Full Chain Tutorial 21 September 2004 ATLAS Plenary Sessions, 17-18 February 2005 (17 talks) Coming soon: ATLAS Tutorial on Electroweak Physics, 14 Feb. 2005 Software Workshop, 21-22 February 2005 Click here to browse WLAP for all ATLAS lectures.

  15. Attention as Inference: Selection Is Probabilistic; Responses Are All-or-None Samples

    Science.gov (United States)

    Vul, Edward; Hanus, Deborah; Kanwisher, Nancy

    2009-01-01

    Theories of probabilistic cognition postulate that internal representations are made up of multiple simultaneously held hypotheses, each with its own probability of being correct (henceforth, "probability distributions"). However, subjects make discrete responses and report the phenomenal contents of their mind to be all-or-none states rather than…

  16. Multi-atlas pancreas segmentation: Atlas selection based on vessel structure.

    Science.gov (United States)

    Karasawa, Ken'ichi; Oda, Masahiro; Kitasaka, Takayuki; Misawa, Kazunari; Fujiwara, Michitaka; Chu, Chengwen; Zheng, Guoyan; Rueckert, Daniel; Mori, Kensaku

    2017-07-01

    Automated organ segmentation from medical images is an indispensable component for clinical applications such as computer-aided diagnosis (CAD) and computer-assisted surgery (CAS). We utilize a multi-atlas segmentation scheme, which has recently been used in different approaches in the literature to achieve more accurate and robust segmentation of anatomical structures in computed tomography (CT) volume data. Among abdominal organs, the pancreas has large inter-patient variability in its position, size and shape. Moreover, the CT intensity of the pancreas closely resembles adjacent tissues, rendering its segmentation a challenging task. Due to this, conventional intensity-based atlas selection for pancreas segmentation often fails to select atlases that are similar in pancreas position and shape to those of the unlabeled target volume. In this paper, we propose a new atlas selection strategy based on vessel structure around the pancreatic tissue and demonstrate its application to a multi-atlas pancreas segmentation. Our method utilizes vessel structure around the pancreas to select atlases with high pancreatic resemblance to the unlabeled volume. Also, we investigate two types of applications of the vessel structure information to the atlas selection. Our segmentations were evaluated on 150 abdominal contrast-enhanced CT volumes. The experimental results showed that our approach can segment the pancreas with an average Jaccard index of 66.3% and an average Dice overlap coefficient of 78.5%. Copyright © 2017 Elsevier B.V. All rights reserved.
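The atlas-selection-plus-fusion idea described above can be illustrated with a toy sketch: rank candidate atlases by the similarity of a surrogate feature mask (standing in here for the vessel structure around the pancreas) to the unlabeled target, then fuse the selected atlases' labels by per-voxel majority vote. All names and masks below are hypothetical, not the paper's actual pipeline:

```python
import numpy as np

def select_atlases(target_feat, atlas_feats, k=3):
    """Rank candidate atlases by similarity of a feature mask to the
    unlabeled target and return the indices of the k most similar."""
    def dice(a, b):
        a, b = a.astype(bool), b.astype(bool)
        s = a.sum() + b.sum()
        return 1.0 if s == 0 else 2.0 * (a & b).sum() / s
    scores = [dice(target_feat, f) for f in atlas_feats]
    return sorted(range(len(atlas_feats)), key=lambda i: -scores[i])[:k]

def majority_vote(labels):
    """Fuse the selected atlases' propagated labels by per-voxel majority vote."""
    return np.stack([l.astype(int) for l in labels]).mean(axis=0) >= 0.5

# Toy target: left half of a 4x4 grid; three candidate "atlases"
target = np.zeros((4, 4), dtype=bool); target[:, :2] = True
atlases = [target.copy(), np.zeros((4, 4), dtype=bool), np.ones((4, 4), dtype=bool)]
print(select_atlases(target, atlases, k=2))  # [0, 2]
```

In a real multi-atlas pipeline the label masks would first be warped to the target by deformable registration; that step is omitted here.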

  17. ATLAS people can run!

    CERN Multimedia

    Claudia Marcelloni de Oliveira; Pauline Gagnon

    It must be all the training we are getting every day, running around trying to get everything ready for the start of the LHC next year. This year, the ATLAS runners were in fine form and came in force. Nine ATLAS teams signed up for the 37th Annual CERN Relay Race with six runners per team. Under a blasting sun on Wednesday 23rd May 2007, each team covered the distances of 1000m, 800m, 800m, 500m, 500m and 300m taking the runners around the whole Meyrin site, hills included. A small reception took place in the ATLAS secretariat a week later to award the ATLAS Cup to the best ATLAS team. For the details on this complex calculation which takes into account the age of each runner, their gender and the color of their shoes, see the July 2006 issue of ATLAS e-news. The ATLAS Running Athena Team, the only all-women team enrolled this year, won the much coveted ATLAS Cup for the second year in a row. In fact, they are so good that Peter Schmid and Patrick Fassnacht are wondering about reducing the women's bonus in...

  18. Fully Automated Atlas-Based Hippocampus Volumetry for Clinical Routine: Validation in Subjects with Mild Cognitive Impairment from the ADNI Cohort.

    Science.gov (United States)

    Suppa, Per; Hampel, Harald; Spies, Lothar; Fiebach, Jochen B; Dubois, Bruno; Buchert, Ralph

    2015-01-01

    Hippocampus volumetry based on magnetic resonance imaging (MRI) has not yet been translated into everyday clinical diagnostic patient care, at least in part due to limited availability of appropriate software tools. In the present study, we evaluate a fully-automated and computationally efficient processing pipeline for atlas based hippocampal volumetry using freely available Statistical Parametric Mapping (SPM) software in 198 amnestic mild cognitive impairment (MCI) subjects from the Alzheimer's Disease Neuroimaging Initiative (ADNI1). Subjects were grouped into MCI stable and MCI to probable Alzheimer's disease (AD) converters according to follow-up diagnoses at 12, 24, and 36 months. Hippocampal grey matter volume (HGMV) was obtained from baseline T1-weighted MRI and then corrected for total intracranial volume and age. Average processing time per subject was less than 4 minutes on a standard PC. The area under the receiver operator characteristic curve of the corrected HGMV for identification of MCI to probable AD converters within 12, 24, and 36 months was 0.78, 0.72, and 0.71, respectively. Thus, hippocampal volume computed with the fully-automated processing pipeline provides similar power for prediction of MCI to probable AD conversion as computationally more expensive methods. The whole processing pipeline has been made freely available as an SPM8 toolbox. It is easily set up and integrated into everyday clinical patient care.
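The reported areas under the receiver operator characteristic curve can in principle be computed from a simple rank statistic; a minimal sketch with hypothetical volume values (not ADNI data):

```python
def auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the probability that a randomly chosen positive case scores higher
    than a randomly chosen negative one (ties count half)."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Illustrative only: smaller corrected hippocampal volume should flag
# converters, so score = -volume makes converters the high-scoring positives.
converters = [-2.1, -2.4, -2.0]       # hypothetical corrected volumes, negated
stable     = [-2.6, -2.9, -2.3, -2.8]
print(auc(converters, stable))
```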

  19. EnviroAtlas Connects Urban Ecosystem Services and Human ...

    Science.gov (United States)

    Ecosystem services in urban areas can improve public health and well-being by mitigating natural and anthropogenic pollution, and by promoting healthy lifestyles that include engagement with nature and enhanced opportunities for physical activity and social interaction. EPA’s EnviroAtlas online mapping tool identifies urban environmental features linked in the scientific and medical literature to specific aspects of public health and well-being. EnviroAtlas researchers have synthesized newly-generated one-meter resolution landcover data, downscaled census population data, and other existing datasets such as roads and parks. Resulting geospatial metrics represent health-related indicators of urban ecosystem services supply and demand by census block-group and finer scales. EnviroAtlas maps include percent of the population with limited window views of trees, tree cover along walkable roads, overall neighborhood green space, and proximity to parks. Demographic data can be overlaid to perform analyses of disproportionate distribution of urban ecosystem services across population groups. Together with the Eco-Health Relationship Browser, EnviroAtlas data can be linked to numerous aspects of public health and well-being including school performance, physical fitness, social capital, and longevity. EnviroAtlas maps have been developed using consistent methods to allow for comparisons between neighborhoods and across multiple U.S. communities. To feature eco-heal

  20. Recent ATLAS Articles on WLAP

    CERN Multimedia

    J. Herr

    As reported in the September 2004 ATLAS eNews, the Web Lecture Archive Project is a system for the archiving and publishing of multimedia presentations, using the Web as medium. We list here newly available WLAP items relating to ATLAS: Atlas Physics Workshop 6-11 June 2005 June 2005 ATLAS Week Plenary Session Click here to browse WLAP for all ATLAS lectures.

  1. Interactive microbial distribution analysis using BioAtlas

    DEFF Research Database (Denmark)

    Lund, Jesper; List, Markus; Baumbach, Jan

    2017-01-01

    BioAtlas is an interactive web application that closes the gap between sequence databases, taxonomy profiling and geo/body-location information, enabling users to analyze microbial distribution in a location-specific context. It enables users to browse taxonomically annotated sequences across (i) the world map, (ii) human body maps and (iii) user-defined maps. It further allows for (iv) uploading of own sample data, which can be placed on existing maps to (v) browse the distribution of the associated taxonomies. Finally, BioAtlas enables users to (vi) contribute custom maps (e.g. for plants or animals) and to map...

  2. ATLAS Silicon Microstrip Tracker Operation and Performance

    CERN Document Server

    Yamada, M; The ATLAS collaboration

    2011-01-01

    The SemiConductor Tracker (SCT), comprising silicon micro-strip detectors, is one of the key precision tracking devices in the ATLAS Inner Detector. ATLAS is one of the experiments at the CERN LHC. The completed SCT is in very good shape, with 99.3% of its 4088 modules (a total of 6.3 million strips) operational. The noise occupancy and hit efficiency exceed the design specifications. In this talk the current status of the SCT will be reviewed. We will report on the operation of the detector, its performance and observed problems, with stress on the sensor and electronics performance.

  3. Hybrid Processing of Measurable and Subjective Data; TOPICAL

    International Nuclear Information System (INIS)

    COOPER, J. ARLIN; ROGINSKI, ROBERT J.

    2001-01-01

    Conventional systems surety analysis is basically restricted to measurable or physical-model-derived data. However, most analyses, including high-consequence system surety analysis, must also utilize subjective information. In order to address this need, there has been considerable effort on analytically incorporating engineering judgment. For example, Dempster-Shafer theory establishes a framework in which frequentist probability and Bayesian incorporation of new data are subsets. Although Bayesian and Dempster-Shafer methodology both allow judgment, neither derives results that can indicate the relative amounts of subjective judgment and measurable data in the results. The methodology described in this report addresses these problems through a hybrid-mathematics-based process that allows tracking of the degree of subjective information in the output, thereby providing more informative (as well as more appropriate) results. In addition, most high consequence systems offer difficult-to-analyze situations. For example, in the Sandia National Laboratories nuclear weapons program, the probability that a weapon responds safely when exposed to an abnormal environment (e.g., lightning, crush, metal-melting temperatures) must be assured to meet a specific requirement. There are also non-probabilistic DOE and DoD requirements (e.g., for determining the adequacy of positive measures). The type of processing required for these and similar situations transcends conventional probabilistic and human factors methodology. The results described herein address these situations by efficiently utilizing subjective and objective information in a hybrid mathematical structure in order to directly apply to the surety assessment of high consequence systems. The results can also improve the quality of the information currently provided to decision-makers. To this end, objective inputs are processed in a conventional manner, while subjective inputs are derived from the combined engineering

  4. The Historical Town’s Atlas of the Czech Republic – as a part of the European project of historical atlases for comparative history of towns

    Czech Academy of Sciences Publication Activity Database

    Semotanová, Eva

    2012-01-01

    Roč. 38, č. 1 (2012), s. 215-222 ISSN 0323-0988 R&D Projects: GA ČR(CZ) GBP410/12/G113 Institutional support: RVO:67985963 Keywords : towns * historical geography * atlas * history Subject RIV: AB - History

  5. Probabilistic fuel rod analyses using the TRANSURANUS code

    Energy Technology Data Exchange (ETDEWEB)

    Lassmann, K; O'Carroll, C; Laar, J Van De [CEC Joint Research Centre, Karlsruhe (Germany)]

    1997-08-01

    After more than 25 years of fuel rod modelling research, the basic concepts are well established and the limitations of the specific approaches are known. However, the widely used mechanistic approach leads in many cases to discrepancies between theoretical predictions and experimental evidence indicating that models are not exact and that some of the physical processes encountered are of stochastic nature. To better understand uncertainties and their consequences, the mechanistic approach must therefore be augmented by statistical analyses. In the present paper the basic probabilistic methods are briefly discussed. Two such probabilistic approaches are included in the fuel rod performance code TRANSURANUS: the Monte Carlo method and the Numerical Noise Analysis. These two techniques are compared and their capabilities are demonstrated. (author). 12 refs, 4 figs, 2 tabs.
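The Monte Carlo branch of such a probabilistic analysis can be sketched as sampling the uncertain inputs and propagating them through a response model to obtain a distribution of the output. The stress model, input names and tolerances below are invented placeholders for illustration, not TRANSURANUS physics:

```python
import random
import statistics

def clad_stress(gap_um, power_w_cm):
    """Purely illustrative response model (NOT the TRANSURANUS physics):
    stress grows with linear power and shrinks with the fuel-clad gap."""
    return 50.0 + 0.8 * power_w_cm - 0.1 * gap_um

random.seed(0)
samples = []
for _ in range(10_000):
    gap = random.gauss(150.0, 10.0)    # fabrication tolerance, um (assumed)
    power = random.gauss(400.0, 20.0)  # local linear power, W/cm (assumed)
    samples.append(clad_stress(gap, power))

samples.sort()
mean = statistics.fmean(samples)
p95 = samples[int(0.95 * len(samples))]
print(f"mean stress ~ {mean:.1f} MPa, 95th percentile ~ {p95:.1f} MPa")
```

A deterministic design would instead stack all tolerances at their conservative extremes; comparing that bounding value with the sampled 95th (or 99th) percentile is what makes the probabilistic result "more rational" in the sense of the abstract.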

  6. Probabilistic broadcasting of mixed states

    International Nuclear Information System (INIS)

    Li Lvjun; Li Lvzhou; Wu Lihua; Zou Xiangfu; Qiu Daowen

    2009-01-01

    It is well known that the non-broadcasting theorem proved by Barnum et al is a fundamental principle of quantum communication. As we are aware, optimal broadcasting (OB) is the only method to broadcast noncommuting mixed states approximately. In this paper, motivated by the probabilistic cloning of quantum states proposed by Duan and Guo, we propose a new way for broadcasting noncommuting mixed states-probabilistic broadcasting (PB), and we present a sufficient condition for PB of mixed states. To a certain extent, we generalize the probabilistic cloning theorem from pure states to mixed states, and in particular, we generalize the non-broadcasting theorem, since the case that commuting mixed states can be exactly broadcast can be thought of as a special instance of PB where the success ratio is 1. Moreover, we discuss probabilistic local broadcasting (PLB) of separable bipartite states
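The commutation condition that separates exact from probabilistic broadcasting can be checked numerically; a minimal sketch for 2x2 density matrices (the states below are illustrative):

```python
import numpy as np

def commute(rho, sigma, tol=1e-10):
    """Check whether two density matrices commute, i.e. [rho, sigma] = 0.
    Commuting mixed states can be exactly broadcast; noncommuting ones
    cannot, which is where probabilistic broadcasting comes in."""
    return np.allclose(rho @ sigma - sigma @ rho, 0.0, atol=tol)

# Diagonal (effectively classical) states always commute ...
rho = np.diag([0.7, 0.3])
sigma = np.diag([0.2, 0.8])
print(commute(rho, sigma))   # True

# ... while a diagonal state and a rotated pure state generally do not.
tau = 0.5 * np.array([[1.0, 1.0], [1.0, 1.0]])  # |+><+|
print(commute(rho, tau))     # False
```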

  7. Probabilistic modeling of timber structures

    DEFF Research Database (Denmark)

    Köhler, Jochen; Sørensen, John Dalsgaard; Faber, Michael Havbro

    2007-01-01

    The present paper contains a proposal for the probabilistic modeling of timber material properties. It is produced in the context of the Probabilistic Model Code (PMC) of the Joint Committee on Structural Safety (JCSS) [Joint Committee of Structural Safety. Probabilistic Model Code, Internet...... Publication: www.jcss.ethz.ch; 2001] and of the COST action E24 ‘Reliability of Timber Structures' [COST Action E 24, Reliability of timber structures. Several meetings and Publications, Internet Publication: http://www.km.fgg.uni-lj.si/coste24/coste24.htm; 2005]. The present proposal is based on discussions...... and comments from participants of the COST E24 action and the members of the JCSS. The paper contains a description of the basic reference properties for timber strength parameters and ultimate limit state equations for timber components. The recommended probabilistic model for these basic properties...

  8. New format for ATLAS e-news

    CERN Multimedia

    Pauline Gagnon

    ATLAS e-news got a new look! As of November 30, 2007, we have a new format for ATLAS e-news. Please go to: http://atlas-service-enews.web.cern.ch/atlas-service-enews/index.html . ATLAS e-news will now be published on a weekly basis. If you are not an ATLAS collaboration member but still want to know how the ATLAS experiment is doing, we will soon have a version of ATLAS e-news intended for the general public. Information will be sent out in due time.

  9. Measurements of the Higgs boson properties with the ATLAS detector

    CERN Document Server

    Tomoto, M; The ATLAS collaboration

    2013-01-01

    Slide draft for the Crimea 2013 workshop. The subject of the talk will be measurements of the Higgs boson properties, including the spin, mass, signal strength, and couplings of a new boson discovered in 2012 at the ATLAS experiment.

  10. Structural reliability codes for probabilistic design

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager

    1997-01-01

    probabilistic code format has not only strong influence on the formal reliability measure, but also on the formal cost of failure to be associated if a design made to the target reliability level is considered to be optimal. In fact, the formal cost of failure can be different by several orders of size for two...... different, but by and large equally justifiable probabilistic code formats. Thus, the consequence is that a code format based on decision theoretical concepts and formulated as an extension of a probabilistic code format must specify formal values to be used as costs of failure. A principle of prudence...... is suggested for guiding the choice of the reference probabilistic code format for constant reliability. In the author's opinion there is an urgent need for establishing a standard probabilistic reliability code. This paper presents some considerations that may be debatable, but nevertheless point...

  11. Circular, Cryogenic Structures from the Hirnantian Deglaciation Sequence (Anti-Atlas, Morocco)

    Czech Academy of Sciences Publication Activity Database

    Nutz, A.; Ghienne, J.-F.; Štorch, Petr

    2013-01-01

    Roč. 83, č. 1 (2013), s. 115-131 ISSN 1527-1404 Institutional support: RVO:67985831 Keywords : Ordovician * Anti-Atlas (Morocco) * cryogenic structure Subject RIV: DB - Geology ; Mineralogy Impact factor: 1.943, year: 2013

  12. Probabilistic Analysis of Gas Turbine Field Performance

    Science.gov (United States)

    Gorla, Rama S. R.; Pai, Shantaram S.; Rusick, Jeffrey J.

    2002-01-01

    A gas turbine thermodynamic cycle was computationally simulated and probabilistically evaluated in view of the several uncertainties in the performance parameters, which are indices of gas turbine health. Cumulative distribution functions and sensitivity factors were computed for the overall thermal efficiency and net specific power output due to the thermodynamic random variables. These results can be used to quickly identify the most critical design variables in order to optimize the design, enhance performance, increase system availability and make it cost effective. The analysis leads to the selection of the appropriate measurements to be used in the gas turbine health determination and to the identification of both the most critical measurements and parameters. Probabilistic analysis aims at unifying and improving the control and health monitoring of gas turbine aero-engines by increasing the quality and quantity of information available about the engine's health and performance.
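Sensitivity factors of this kind can be approximated by correlating each sampled input with the sampled output of the cycle simulation. The surrogate efficiency model, variable names and spreads below are assumptions for illustration only, not the study's thermodynamic model:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20_000

# Hypothetical thermodynamic random variables (names and spreads assumed)
t_inlet = rng.normal(1400.0, 15.0, n)  # turbine inlet temperature, K
pr = rng.normal(30.0, 0.6, n)          # compressor pressure ratio
eta_c = rng.normal(0.88, 0.01, n)      # compressor efficiency

# Illustrative linear surrogate for overall thermal efficiency
# (a real analysis would run the full cycle model per sample)
eta_th = (0.02 * (t_inlet / 100.0) + 0.004 * pr + 0.3 * eta_c
          + rng.normal(0.0, 0.002, n))

for name, x in [("T_inlet", t_inlet), ("PR", pr), ("eta_c", eta_c)]:
    r = np.corrcoef(x, eta_th)[0, 1]
    print(f"sensitivity of eta_th to {name}: {r:+.2f}")
```

Sorting the sampled `eta_th` values also yields the cumulative distribution function mentioned in the abstract, so the same sample serves both purposes.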

  13. Probabilistic assessment of SGTR management

    International Nuclear Information System (INIS)

    Champ, M.; Cornille, Y.; Lanore, J.M.

    1989-04-01

    In case of steam generator tube rupture (SGTR) event, in France, the mitigation of accident relies on operator intervention, by applying a specific accidental procedure. A detailed probabilistic analysis has been conducted which required the assessment of the failure probability of the operator actions, and for that purpose it was necessary to estimate the time available for the operator to apply the adequate procedure for various sequences. The results indicate that by taking into account the delays and the existence of adequate accidental procedures, the risk is reduced to a reasonably low level

  14. ATLAS Virtual Visits bringing the world into the ATLAS control room

    CERN Document Server

    AUTHOR|(CDS)2051192; The ATLAS collaboration; Yacoob, Sahal

    2016-01-01

    ATLAS Virtual Visits is a project initiated in 2011 for the Education & Outreach program of the ATLAS Experiment at CERN. Its goal is to promote public appreciation of the LHC physics program and particle physics, in general, through direct dialogue between ATLAS physicists and remote audiences. A Virtual Visit is an IP-based videoconference, coupled with a public webcast and video recording, between ATLAS physicists and remote locations around the world, that typically include high school or university classrooms, Masterclasses, science fairs, or other special events, usually hosted by collaboration members. Over the past two years, more than 10,000 people, from all of the world’s continents, have actively participated in ATLAS Virtual Visits, with many more enjoying the experience from the publicly available webcasts and recordings. We present an overview of our experience and discuss potential development for the future.

  15. Development of a Quantitative Framework for Regulatory Risk Assessments: Probabilistic Approaches

    International Nuclear Information System (INIS)

    Wilmot, R.D.

    2003-11-01

    The Swedish regulators have been active in the field of performance assessment for many years and have developed sophisticated approaches to the development of scenarios and other aspects of assessments. These assessments have generally used dose as the assessment end-point and have been based on deterministic calculations. Recently introduced Swedish regulations have introduced a risk criterion for radioactive waste disposal: the annual risk of harmful effects after closure of a disposal facility should not exceed 10^-6 for a representative individual in the group exposed to the greatest risk. A recent review of the overall structure of risk assessments in safety cases concluded that there are a number of decisions and assumptions in the development of a risk assessment methodology that could potentially affect the calculated results. Regulatory understanding of these issues, potentially supported by independent calculations, is important in preparing for review of a proponent's risk assessment. One approach to evaluating risk in performance assessments is to use the concept of probability to express uncertainties, and to propagate these probabilities through the analysis. This report describes the various approaches available for undertaking such probabilistic analyses, both as a means of accounting for uncertainty in the determination of risk and more generally as a means of sensitivity and uncertainty analysis. The report discusses the overall nature of probabilistic analyses and how they are applied to both the calculation of risk and sensitivity analyses. Several approaches are available, including differential analysis, response surface methods and simulation. Simulation is the approach most commonly used, both in assessments for radioactive waste disposal and in other subject areas, and the report describes the key stages of this approach in detail. Decisions relating to the development of input PDFs, sampling methods (including approaches to the treatment

  16. Guidance for the definition and application of probabilistic safety criteria

    International Nuclear Information System (INIS)

    Holmberg, J.-E.; Knochenhauer, M.

    2011-05-01

    The project 'The Validity of Safety Goals' has been financed jointly by NKS (Nordic Nuclear Safety Research), SSM (Swedish Radiation Safety Authority) and the Swedish and Finnish nuclear utilities. The national financing went through NPSAG, the Nordic PSA Group (Swedish contributions) and SAFIR2010, the Finnish research programme on NPP safety (Finnish contributions). The project has been performed in four phases during 2006-2010. This guidance document aims at describing, on the basis of the work performed throughout the project, issues to consider when defining, applying and interpreting probabilistic safety criteria. Thus, the basic aim of the document is to serve as a checklist and toolbox for the definition and application of probabilistic safety criteria. The document describes the terminology and concepts involved, the levels of criteria and relations between these, how to define a probabilistic safety criterion, how to apply a probabilistic safety criterion, on what to apply the probabilistic safety criterion, and how to interpret the result of the application. The document specifically deals with what makes up a probabilistic safety criterion, i.e., the risk metric, the frequency criterion, the PSA used for assessing compliance and the application procedure for the criterion. It also discusses the concept of subsidiary criteria, i.e., different levels of safety goals. The results from the project can be used as a platform for discussions at the utilities on how to define and use quantitative safety goals. The results can also be used by safety authorities as a reference for risk-informed regulation. The outcome can have an impact on the requirements on PSA, e.g., regarding quality, scope, level of detail, and documentation. Finally, the results can be expected to support on-going activities concerning risk-informed applications. (Author)

  17. Guidance for the definition and application of probabilistic safety criteria

    Energy Technology Data Exchange (ETDEWEB)

    Holmberg, J.-E. (VTT Technical Research Centre of Finland (Finland)); Knochenhauer, M. (Scandpower AB (Sweden))

    2011-05-15

    The project 'The Validity of Safety Goals' has been financed jointly by NKS (Nordic Nuclear Safety Research), SSM (Swedish Radiation Safety Authority) and the Swedish and Finnish nuclear utilities. The national financing went through NPSAG, the Nordic PSA Group (Swedish contributions) and SAFIR2010, the Finnish research programme on NPP safety (Finnish contributions). The project has been performed in four phases during 2006-2010. This guidance document aims at describing, on the basis of the work performed throughout the project, issues to consider when defining, applying and interpreting probabilistic safety criteria. Thus, the basic aim of the document is to serve as a checklist and toolbox for the definition and application of probabilistic safety criteria. The document describes the terminology and concepts involved, the levels of criteria and relations between these, how to define a probabilistic safety criterion, how to apply a probabilistic safety criterion, on what to apply the probabilistic safety criterion, and how to interpret the result of the application. The document specifically deals with what makes up a probabilistic safety criterion, i.e., the risk metric, the frequency criterion, the PSA used for assessing compliance and the application procedure for the criterion. It also discusses the concept of subsidiary criteria, i.e., different levels of safety goals. The results from the project can be used as a platform for discussions at the utilities on how to define and use quantitative safety goals. The results can also be used by safety authorities as a reference for risk-informed regulation. The outcome can have an impact on the requirements on PSA, e.g., regarding quality, scope, level of detail, and documentation. Finally, the results can be expected to support on-going activities concerning risk-informed applications. (Author)

  18. Integration of Advanced Probabilistic Analysis Techniques with Multi-Physics Models

    Energy Technology Data Exchange (ETDEWEB)

    Cetiner, Mustafa Sacit [ORNL]; Flanagan, George F. [ORNL]; Poore III, Willis P. [ORNL]; Muhlheim, Michael David [ORNL]

    2014-07-30

    An integrated simulation platform that couples probabilistic analysis-based tools with model-based simulation tools can provide valuable insights for reactive and proactive responses to plant operating conditions. The objective of this work is to demonstrate the benefits of a partial implementation of the Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Detailed Framework Specification through the coupling of advanced PRA capabilities and accurate multi-physics plant models. Coupling a probabilistic model with a multi-physics model will aid in design, operations, and safety by providing a more accurate understanding of plant behavior. This represents the first attempt at actually integrating these two types of analyses for a control system used for operations, on a faster than real-time basis. This report documents the development of the basic communication capability to exchange data with the probabilistic model using Reliability Workbench (RWB) and the multi-physics model using Dymola. The communication pathways from injecting a fault (i.e., failing a component) to the probabilistic and multi-physics models were successfully completed. This first version was tested with prototypic models represented in both RWB and Modelica. First, a simple event tree/fault tree (ET/FT) model was created to develop the software code to implement the communication capabilities between the dynamic-link library (dll) and RWB. A program, written in C#, successfully communicates faults to the probabilistic model through the dll. A systems model of the Advanced Liquid-Metal Reactor–Power Reactor Inherently Safe Module (ALMR-PRISM) design developed under another DOE project was upgraded using Dymola to include proper interfaces to allow data exchange with the control application (ConApp). A program, written in C++, successfully communicates faults to the multi-physics model. The results of the example simulation were successfully plotted.

  19. Standalone vertex finding in the ATLAS muon spectrometer

    Czech Academy of Sciences Publication Activity Database

    Aad, G.; Abajyan, T.; Abbott, B.; Böhm, Jan; Chudoba, Jiří; Hejbal, Jiří; Jakoubek, Tomáš; Kepka, Oldřich; Kupčo, Alexander; Kůs, Vlastimil; Lokajíček, Miloš; Lysák, Roman; Marčišovský, Michal; Mikeštíková, Marcela; Myška, Miroslav; Němeček, Stanislav; Dos Santos, D.R.; Růžička, P.; Šícho, Petr; Staroba, Pavel; Svatoš, Michal; Taševský, Marek; Tic, Tomáš; Vrba, Václav

    2014-01-01

    Roč. 9, Feb (2014), s. 1-22 ISSN 1748-0221 R&D Projects: GA MŠk(CZ) LG13009 Institutional support: RVO:68378271 Keywords : muon spectrometers * performance of high energy physics * detectors * ATLAS * CERN LHC Coll Subject RIV: BF - Elementary Particles and High Energy Physics Impact factor: 1.399, year: 2014

  20. Charged track reconstruction and b-tagging performance in ATLAS

    CERN Document Server

    Favareto, A; The ATLAS collaboration

    2012-01-01

    The ATLAS Inner Detector is designed to provide precision tracking information at LHC luminosities with a hermetic detector covering 5 units in pseudorapidity. It features a large silicon tracker subdivided into a pixel and a strip system for precise tracking and primary/secondary vertex reconstruction and to provide excellent b-tagging capabilities. A Transition Radiation Tracker improves the momentum reconstruction and provides electron identification information. The subject of these proceedings is the performance of the ATLAS Inner Detector achieved after its first 2 years of operation. The excellent detector performance and more than a decade of simulation studies provided a good basis for the commissioning of the offline track and vertex reconstruction. Early studies with cosmic events and the ever increasing amount of high quality p-p collision data allowed for rapid progress in understanding of the detector. Today the ATLAS Inner Detector approaches its design values in most relevant performance c...

  1. Quantification of Wave Model Uncertainties Used for Probabilistic Reliability Assessments of Wave Energy Converters

    DEFF Research Database (Denmark)

    Ambühl, Simon; Kofoed, Jens Peter; Sørensen, John Dalsgaard

    2015-01-01

    Wave models used for site assessments are subject to model uncertainties, which need to be quantified when using wave model results for probabilistic reliability assessments. This paper focuses on the determination of wave model uncertainties. Four different wave models are considered, and validation data are collected from published scientific research. The bias and the root-mean-square error, as well as the scatter index, are considered for the significant wave height as well as the mean zero-crossing wave period. Based on an illustrative generic example, this paper presents how the quantified uncertainties can be implemented in probabilistic reliability assessments.
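
    The three validation statistics named above are simple to compute. A minimal sketch, with made-up significant-wave-height values; the scatter-index convention used here (RMSE normalised by the observed mean) is one common choice, not necessarily the paper's:

```python
import math

def bias(model, obs):
    return sum(m - o for m, o in zip(model, obs)) / len(obs)

def rmse(model, obs):
    return math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / len(obs))

def scatter_index(model, obs):
    # RMSE normalised by the mean of the observations (one common convention)
    return rmse(model, obs) / (sum(obs) / len(obs))

hs_model = [1.2, 2.1, 3.0, 1.8]   # modelled significant wave height [m]
hs_obs   = [1.0, 2.3, 2.8, 1.9]   # observed significant wave height [m]
print(bias(hs_model, hs_obs), rmse(hs_model, hs_obs), scatter_index(hs_model, hs_obs))
```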

  2. Determination of Wave Model Uncertainties used for Probabilistic Reliability Assessments of Wave Energy Devices

    DEFF Research Database (Denmark)

    Ambühl, Simon; Kofoed, Jens Peter; Sørensen, John Dalsgaard

    2014-01-01

    Wave models used for site assessments are subject to model uncertainties, which need to be quantified when using wave model results for probabilistic reliability assessments. This paper focuses on the determination of wave model uncertainties. Four different wave models are considered, and validation data are collected from published scientific research. The bias and the root-mean-square error, as well as the scatter index, are considered for the significant wave height as well as the mean zero-crossing wave period. Based on an illustrative generic example, it is shown how the estimated uncertainties can be implemented in probabilistic reliability assessments.

  3. Estimating the actual subject-specific genetic correlations in behavior genetics.

    Science.gov (United States)

    Molenaar, Peter C M

    2012-10-01

    Generalizing the standard longitudinal behavior genetic factor model for the analysis of interindividual phenotypic variation to a genetic state space model for the analysis of intraindividual variation makes it possible to estimate subject-specific heritabilities.

  4. Supporting ATLAS

    CERN Multimedia

    maximilien brice

    2003-01-01

    Eighteen feet made of stainless steel will support the barrel ATLAS detector in the cavern at Point 1. In total, the ATLAS feet system will carry approximately 6000 tons, and will give the same inclination to the detector as the LHC accelerator.

  5. SU-E-I-71: Quality Assessment of Surrogate Metrics in Multi-Atlas-Based Image Segmentation

    International Nuclear Information System (INIS)

    Zhao, T; Ruan, D

    2015-01-01

    Purpose: With the ever-growing data of heterogeneous quality, relevance assessment of atlases becomes increasingly critical for multi-atlas-based image segmentation. However, there is no universally recognized best relevance metric, and even a standard to compare amongst candidates remains elusive. This study, for the first time, designs a quantification to assess relevance metrics’ quality, based on a novel perspective of the metric as a surrogate for inferring the inaccessible oracle geometric agreement. Methods: We first develop an inference model to relate surrogate metrics in image space to the underlying oracle relevance metric in segmentation label space, with a monotonically non-decreasing function subject to random perturbations. Subsequently, we investigate model parameters to reveal key contributing factors to surrogates’ ability in prognosticating the oracle relevance value, for the specific task of atlas selection. Finally, we design an effective contrast-to-noise ratio (eCNR) to quantify surrogates’ quality based on insights from these analyses and empirical observations. Results: The inference model was specialized to a linear function with normally distributed perturbations, with surrogate metrics exemplified by several widely used image similarity metrics, i.e., MSD/NCC/(N)MI. Surrogates’ behaviors in selecting the most relevant atlases were assessed under varying eCNR, showing that surrogates with high eCNR dominated those with low eCNR in retaining the most relevant atlases. In an end-to-end validation, NCC/(N)MI with an eCNR of 0.12 yielded statistically better segmentation than MSD with an eCNR of 0.10: mean DSC of about 0.85 with first and third quartiles of (0.83, 0.89), versus a mean DSC of 0.84 with quartiles of (0.81, 0.89) for MSD. Conclusion: The designed eCNR is capable of characterizing surrogate metrics’ quality in prognosticating the oracle relevance value. It has been demonstrated to be
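
    The surrogate metrics named in the abstract (MSD and NCC) reduce to short expressions. The sketch below, with 1-D arrays standing in for registered images, shows how either surrogate ranks candidate atlases by relevance to a target; it is an illustration of the metrics only, not the authors' eCNR analysis.

```python
import numpy as np

def msd(a, b):
    """Mean squared difference: lower means more relevant."""
    return float(np.mean((a - b) ** 2))

def ncc(a, b):
    """Normalized cross-correlation (Pearson-style): higher means more relevant."""
    a0, b0 = a - a.mean(), b - b.mean()
    return float((a0 * b0).sum() / (np.linalg.norm(a0) * np.linalg.norm(b0) + 1e-12))

rng = np.random.default_rng(0)
target = rng.normal(size=256)                     # stand-in for the target image
atlases = {
    "atlas1": target + rng.normal(scale=0.1, size=256),   # close to the target
    "atlas2": rng.normal(size=256),                       # unrelated
}

best_by_msd = min(atlases, key=lambda k: msd(target, atlases[k]))
best_by_ncc = max(atlases, key=lambda k: ncc(target, atlases[k]))
print(best_by_msd, best_by_ncc)
```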

  6. Probabilistic finite elements for fracture mechanics

    Science.gov (United States)

    Besterfield, Glen

    1988-01-01

    The probabilistic finite element method (PFEM) is developed for probabilistic fracture mechanics (PFM). A finite element which has the near crack-tip singular strain embedded in the element is used. Probabilistic distributions, such as expectation, covariance and correlation stress intensity factors, are calculated for random load, random material and random crack length. The method is computationally quite efficient and can be expected to determine the probability of fracture or reliability.
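
    The probabilistic quantities named above (expectation, variance, and probability of fracture under random load and random crack length) can be illustrated with a direct Monte Carlo stand-in. This is not the PFEM itself, which propagates the randomness through the finite element model far more efficiently; here a closed-form stress intensity factor K = Y·σ·√(πa) is sampled instead, and all distribution parameters are made up.

```python
import math, random

random.seed(1)
Y = 1.12          # geometry factor (illustrative)
K_IC = 35.0       # fracture toughness [MPa*sqrt(m)] (illustrative)
N = 100_000

samples = []
for _ in range(N):
    sigma = random.gauss(100.0, 10.0)            # random load [MPa]
    a = max(random.gauss(0.02, 0.004), 1e-6)     # random crack length [m]
    samples.append(Y * sigma * math.sqrt(math.pi * a))

mean_K = sum(samples) / N                                   # expectation of K
var_K = sum((k - mean_K) ** 2 for k in samples) / N         # variance of K
p_fracture = sum(k > K_IC for k in samples) / N             # P(K > K_IC)
print(round(mean_K, 2), round(p_fracture, 4))
```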

  7. Optimal Multitrial Prediction Combination and Subject-Specific Adaptation for Minimal Training Brain Switch Designs

    NARCIS (Netherlands)

    Spyrou, L.; Blokland, Y.M.; Farquhar, J.D.R.; Bruhn, J.

    2016-01-01

    Brain-Computer Interface (BCI) systems are traditionally designed by taking into account user-specific data to enable practical use. More recently, subject independent (SI) classification algorithms have been developed which bypass the subject specific adaptation and enable rapid use of the system.

  8. Optimal multitrial prediction combination and subject-specific adaptation for minimal training brain switch designs

    NARCIS (Netherlands)

    Spyrou, L.; Blokland, Y.M.; Farquhar, J.D.R.; Bruhn, J.

    2016-01-01

    Brain-Computer Interface systems are traditionally designed by taking into account user-specific data to enable practical use. More recently, subject independent (SI) classification algorithms have been developed which bypass the subject specific adaptation and enable rapid use of the system. A

  9. The applicability of Greulich and Pyle atlas to assess skeletal age for four ethnic groups.

    Science.gov (United States)

    Mansourvar, Marjan; Ismail, Maizatul Akmar; Raj, Ram Gopal; Kareem, Sameem Abdul; Aik, Saw; Gunalan, Roshan; Antony, Chermaine Deepa

    2014-02-01

    Recently, determination of skeletal age, defined as the assessment of bone age, has become an important task for forensic experts and radiologists. The Greulich-Pyle (GP) atlas is one of the most frequently used methods for the assessment of skeletal age around the world. Since the GP approach to bone age estimation was introduced, much research has been conducted to examine the usability of this method in various geographic or ethnic categories. This study investigates, on a small scale, the reliability of the GP atlas for assessment of bone age across four ethnic groups - Asian, African/American, Caucasian and Hispanic - over a range of ages. Plain radiographs of the left hand and wrist were taken from 184 healthy males between 1 and 18 years of age across the four ethnic groups. The skeletal age (SA) was estimated by a radiologist using the GP atlas; the blind method was utilized. The mean SA was compared with the mean chronological age (CA) for each ethnic group. SPSS was used to conduct the analysis, and the paired t-test was applied to test the difference between the mean CA and the mean SA obtained from the GP atlas. In Asian subjects the mean difference was 0.873 years; the GP atlas showed delayed bone age at ages 2-7 (by 0.2 to 2.3 years) and advanced bone age from age 8. In the African/American subjects the difference between CA and SA was statistically significant (P-value = 0.048). The mean differences in the Caucasian and Hispanic subjects showed no considerable distinction, with standard deviations (SD) of 0.3088 and 0.3766, respectively (P-value >0.05 for both groups). According to the present study, it is concluded that although the GP atlas is reliable for Caucasian and Hispanic ethnic groups, it is not applicable to other ethnic groups across different age ranges, especially in the sample of the male African
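
    The paired comparison used in the study (mean chronological age versus mean GP skeletal age) is a standard paired t-test on the per-subject differences. A minimal sketch with illustrative ages, not the study's data:

```python
import math

# Illustrative chronological ages (CA) and GP-estimated skeletal ages (SA) [yr]
ca = [5.0, 6.5, 8.0, 9.2, 10.1, 12.0, 13.5, 15.0]
sa = [4.4, 5.9, 8.3, 8.8, 9.6, 11.5, 13.0, 14.2]

d = [c - s for c, s in zip(ca, sa)]            # paired differences CA - SA
n = len(d)
mean_d = sum(d) / n                            # mean bone-age delay
var_d = sum((x - mean_d) ** 2 for x in d) / (n - 1)
t_stat = mean_d / math.sqrt(var_d / n)         # paired t statistic, df = n - 1
print(round(mean_d, 3), round(t_stat, 3))
```

    The t statistic is then compared against the t distribution with n-1 degrees of freedom (as SPSS does) to obtain the P-value.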

  10. Introspection of subjective feelings is sensitive and specific.

    Science.gov (United States)

    Questienne, Laurence; van Dijck, Jean-Philippe; Gevers, Wim

    2018-02-01

    Contrary to behaviorist ideas, recent studies suggest that introspection can be accurate and reliable. However, an unresolved question is whether people are able to report specific aspects of their phenomenal experience, or whether they report more general nonspecific experiences. To address this question, we investigated the sensitivity and validity of our introspection for different types of conflict. Taking advantage of the congruency sequence effect, we dissociated response conflict while keeping visual conflict unchanged in a Stroop and in a priming task. Participants were subsequently asked to report on either their experience of urge to err or on their feeling of visual conflict. Depending on the focus of the introspection, subjective reports specifically followed either the response conflict or the visual conflict. These results demonstrate that our introspective reports can be sensitive and that we are able to dissociate specific aspects of our phenomenal experiences in a valid manner. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  11. Probabilistic Mu-Calculus

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand; Mardare, Radu Iulian; Xue, Bingtian

    2016-01-01

    We introduce a version of the probabilistic µ-calculus (PMC) built on top of a probabilistic modal logic that allows encoding n-ary inequational conditions on transition probabilities. PMC extends previously studied calculi and we prove that, despite its expressiveness, it enjoys a series of good metaproperties. Firstly, we prove the decidability of satisfiability checking by establishing the small model property. An algorithm for deciding the satisfiability problem is developed. As a second major result, we provide a complete axiomatization for the alternation-free fragment of PMC. The completeness proof...

  12. ATLAS software stack on ARM64

    CERN Document Server

    Smith, Joshua Wyatt; The ATLAS collaboration

    2016-01-01

    The ATLAS experiment explores new hardware and software platforms that, in the future, may be more suited to its data intensive workloads. One such alternative hardware platform is the ARM architecture, which is designed to be extremely power efficient and is found in most smartphones and tablets. CERN openlab recently installed a small cluster of ARM 64-bit evaluation prototype servers. Each server is based on a single-socket ARM 64-bit system on a chip, with 32 Cortex-A57 cores. In total, each server has 128 GB RAM connected with four fast memory channels. This paper reports on the port of the ATLAS software stack onto these new prototype ARM64 servers. This included building the "external" packages that the ATLAS software relies on. Patches were needed to introduce this new architecture into the build as well as patches that correct for platform specific code that caused failures on non-x86 architectures. These patches were applied such that porting to further platforms will need no or only very little adj...

  13. Report to users of Atlas

    International Nuclear Information System (INIS)

    Ahmad, I.; Glagola, B.

    1996-06-01

    This report contains the following topics: Status of the ATLAS Accelerator; Highlights of Recent Research at ATLAS; Program Advisory Committee; ATLAS User Group Executive Committee; FMA Information Available On The World Wide Web; Conference on Nuclear Structure at the Limits; and Workshop on Experiments with Gammasphere at ATLAS

  14. Operational experience with the ATLAS Pixel Detector

    CERN Document Server

    Ince, T; The ATLAS collaboration

    2012-01-01

    The ATLAS Pixel Detector is the innermost element of the ATLAS experiment at the Large Hadron Collider at CERN, providing high-resolution measurements of charged particle tracks in the high radiation environment close to the collision region. This capability is vital for the identification and measurement of proper decay times of long-lived particles such as b-hadrons, and thus vital for the ATLAS physics program. The detector provides hermetic coverage with three cylindrical layers and three layers of forward and backward pixel detectors. It consists of approximately 80 million pixels that are individually read out via chips bump-bonded to 1744 n-in-n silicon substrates. In this paper, results from the successful operation of the Pixel Detector at the LHC will be presented, including monitoring, calibration procedures, timing optimization and detector performance. The detector performance is excellent: 96.2% of the pixels are operational, noise occupancy and hit efficiency exceed the design specification, an...

  15. Operational experience of the ATLAS Pixel detector

    CERN Document Server

    Hirschbuehl, D; The ATLAS collaboration

    2011-01-01

    The ATLAS Pixel Detector is the innermost detector of the ATLAS experiment at the Large Hadron Collider at CERN, providing high-resolution measurements of charged particle tracks in the high radiation environment close to the collision region. This capability is vital for the identification and measurement of proper decay times of long-lived particles such as b-hadrons, and thus vital for the ATLAS physics program. The detector provides hermetic coverage with three cylindrical layers and three layers of forward and backward pixel detectors. It consists of approximately 80 million pixels that are individually read out via chips bump-bonded to 1744 n-in-n silicon substrates. In this talk, results from the successful operation of the Pixel Detector at the LHC will be presented, including monitoring, calibration procedures, timing optimization and detector performance. The detector performance is excellent: 97.5% of the pixels are operational, noise occupancy and hit efficiency exceed the design specification, an...

  16. Operational experience of the ATLAS Pixel Detector

    CERN Document Server

    Marcisovsky, M; The ATLAS collaboration

    2011-01-01

    The ATLAS Pixel Detector is the innermost detector of the ATLAS experiment at the Large Hadron Collider at CERN, providing high-resolution measurements of charged particle tracks in the high radiation environment close to the collision region. This capability is vital for the identification and measurement of proper decay times of long-lived particles such as b-hadrons, and thus vital for the ATLAS physics program. The detector provides hermetic coverage with three cylindrical layers and three layers of forward and backward pixel detectors. It consists of approximately 80 million pixels that are individually read out via chips bump-bonded to 1744 n-in-n silicon substrates. In this talk, results from the successful operation of the Pixel Detector at the LHC will be presented, including monitoring, calibration procedures, timing optimization and detector performance. The detector performance is excellent: 97.5% of the pixels are operational, noise occupancy and hit efficiency exceed the design specification, an...

  17. Scaling up ATLAS Event Service to production levels on opportunistic computing platforms

    CERN Document Server

    Benjamin, Douglas; The ATLAS collaboration; Ernst, Michael; Guan, Wen; Hover, John; Lesny, David; Maeno, Tadashi; Nilsson, Paul; Tsulaia, Vakhtang; van Gemmeren, Peter; Vaniachine, Alexandre; Wang, Fuquan; Wenaus, Torre

    2016-01-01

    Continued growth in public cloud and HPC resources is on track to overcome the dedicated resources available for ATLAS on the WLCG. Examples of such platforms are Amazon AWS EC2 Spot Instances, the Edison Cray XC30 supercomputer, backfill at Tier-2 and Tier-3 sites, opportunistic resources at the Open Science Grid, and the ATLAS High Level Trigger farm between data-taking periods. Because of the specifics of opportunistic resources, such as preemptive job scheduling and data I/O, their efficient usage requires workflow innovations provided by the ATLAS Event Service. Thanks to the finer granularity of the Event Service data processing workflow, the opportunistic resources are used more efficiently. We report on our progress in scaling opportunistic resource usage to double-digit levels in ATLAS production.

  18. Dynamical systems probabilistic risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Denman, Matthew R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ames, Arlo Leroy [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-03-01

    Probabilistic Risk Assessment (PRA) is the primary tool used to risk-inform nuclear power regulatory and licensing activities. Risk-informed regulations are intended to reduce inherent conservatism in regulatory metrics (e.g., allowable operating conditions and technical specifications) which are built into the regulatory framework by quantifying both the total risk profile as well as the change in the risk profile caused by an event or action (e.g., in-service inspection procedures or power uprates). Dynamical Systems (DS) analysis has been used to understand unintended time-dependent feedbacks in both industrial and organizational settings. In dynamical systems analysis, feedback loops can be characterized and studied as a function of time to describe the changes to the reliability of plant Structures, Systems and Components (SSCs). While DS has been used in many subject areas, some even within the PRA community, it has not been applied toward creating long-time horizon, dynamic PRAs (with time scales ranging between days and decades depending upon the analysis). Understanding slowly developing dynamic effects, such as wear-out, on SSC reliabilities may be instrumental in ensuring a safely and reliably operating nuclear fleet. Improving the estimation of a plant's continuously changing risk profile will allow for more meaningful risk insights, greater stakeholder confidence in risk insights, and increased operational flexibility.
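
    The wear-out effect mentioned above can be illustrated with a time-dependent hazard. A minimal sketch, assuming a Weibull cumulative hazard with shape parameter greater than one (the usual wear-out model; the parameter values are illustrative): the probability of failing during a fixed mission window grows as the component ages, which is exactly the slowly developing change in the risk profile that a static PRA misses.

```python
import math

def mission_failure_prob(age_yr, mission_yr=1.0, eta=40.0, beta=2.5):
    """P(component fails within the next mission, given survival to age).
    Weibull cumulative hazard H(t) = (t/eta)**beta; beta > 1 models wear-out."""
    H = lambda t: (t / eta) ** beta
    return 1.0 - math.exp(-(H(age_yr + mission_yr) - H(age_yr)))

early = mission_failure_prob(5.0)    # young component
late = mission_failure_prob(30.0)    # aged component, same mission length
print(round(early, 5), round(late, 5))
```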

  19. Performance of b-jet identification in the ATLAS experiment

    Czech Academy of Sciences Publication Activity Database

    Aad, G.; Abbott, B.; Abdallah, J.; Chudoba, Jiří; Havránek, Miroslav; Hejbal, Jiří; Jakoubek, Tomáš; Kepka, Oldřich; Kupčo, Alexander; Kůs, Vlastimil; Lokajíček, Miloš; Lysák, Roman; Marčišovský, Michal; Mikeštíková, Marcela; Němeček, Stanislav; Šícho, Petr; Staroba, Pavel; Svatoš, Michal; Taševský, Marek; Vrba, Václav

    2016-01-01

    Roč. 11, Apr (2016), s. 1-127, č. článku P04008. ISSN 1748-0221 Institutional support: RVO:68378271 Keywords : efficiency * calibration * correlation * hadron * ATLAS * impact parameter * CERN LHC Coll Subject RIV: BF - Elementary Particles and High Energy Physics Impact factor: 1.220, year: 2016

  20. The application of probabilistic fracture analysis to residual life evaluation of embrittled reactor vessels

    International Nuclear Information System (INIS)

    Dickson, T.L.; Simonen, F.A.

    1992-01-01

    Probabilistic fracture mechanics analysis is a major element of the comprehensive probabilistic methodology on which current NRC regulatory requirements for pressurized water reactor vessel integrity evaluation are based. Computer codes such as OCA-P and VISA-II perform probabilistic fracture analyses to estimate the increase in vessel failure probability that occurs as the vessel material accumulates radiation damage over the operating life of the vessel. The results of such analyses, when compared with limits of acceptable failure probabilities, provide an estimation of the residual life of a vessel. Such codes can be applied to evaluate the potential benefits of plant-specific mitigating actions designed to reduce the probability of failure of a reactor vessel
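
    The residual-life logic described above can be sketched numerically. This is not OCA-P or VISA-II; it is a hedged toy model in which radiation embrittlement is represented as a linear drop in fracture toughness with operating years, the failure probability is estimated by Monte Carlo, and the residual life is the first year at which that probability exceeds an acceptance limit. Every number is illustrative.

```python
import random

random.seed(2)

def failure_prob(years, n=20_000):
    """Monte Carlo P(applied K exceeds toughness) after `years` of operation."""
    k_ic = 80.0 - 1.2 * years          # toughness degrades with accumulated fluence
    fails = 0
    for _ in range(n):
        K = random.gauss(45.0, 8.0)    # applied stress intensity [MPa*sqrt(m)]
        if K > k_ic:
            fails += 1
    return fails / n

limit = 0.05                           # acceptable failure probability (illustrative)
residual_life = next(y for y in range(0, 61) if failure_prob(y) > limit)
print(residual_life)
```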

  1. The application of probabilistic fracture analysis to residual life evaluation of embrittled reactor vessels

    International Nuclear Information System (INIS)

    Dickson, T.L.; Simonen, F.A.

    1992-01-01

    Probabilistic fracture mechanics analysis is a major element of comprehensive probabilistic methodology on which current NRC regulatory requirements for pressurized water reactor vessel integrity evaluation are based. Computer codes such as OCA-P and VISA-II perform probabilistic fracture analyses to estimate the increase in vessel failure probability that occurs as the vessel material accumulates radiation damage over the operating life of the vessel. The results of such analyses, when compared with limits of acceptable failure probabilities, provide an estimation of the residual life of a vessel. Such codes can be applied to evaluate the potential benefits of plant-specific mitigating actions designed to reduce the probability of failure of a reactor vessel. 10 refs

  2. Compression of Probabilistic XML documents

    NARCIS (Netherlands)

    Veldman, Irma

    2009-01-01

    Probabilistic XML (PXML) files resulting from data integration can become extremely large, which is undesired. For XML there are several techniques available to compress the document and since probabilistic XML is in fact (a special form of) XML, it might benefit from these methods even more. In

  3. Atlas of prostate cancer heritability in European and African-American men pinpoints tissue-specific regulation

    DEFF Research Database (Denmark)

    Gusev, Alexander; Shi, Huwenbo; Kichaev, Gleb

    2016-01-01

    Although genome-wide association studies have identified over 100 risk loci that explain ∼33% of familial risk for prostate cancer (PrCa), their functional effects on risk remain largely unknown. Here we use genotype data from 59,089 men of European and African American ancestries combined with cell-type-specific epigenetic data to build a genomic atlas of single-nucleotide polymorphism (SNP) heritability in PrCa. We find significant differences in heritability between variants in prostate-relevant epigenetic marks defined in normal versus tumour tissue as well as between tissue and cell lines. The majority of SNP heritability lies in regions marked by H3K27 acetylation in the prostate adenocarcinoma cell line (LNCaP) or by DNaseI hypersensitive sites in cancer cell lines. We find a high degree of similarity between European and African American ancestries suggesting a similar genetic...

  4. Nonlocal atlas-guided multi-channel forest learning for human brain labeling

    Energy Technology Data Exchange (ETDEWEB)

    Ma, Guangkai [Space Control and Inertial Technology Research Center, Harbin Institute of Technology, Harbin 150001, China and Department of Radiology and BRIC, University of North Carolina at Chapel Hill, Chapel Hill, North Carolina 27599 (United States); Gao, Yaozong; Wu, Guorong [Department of Computer Science, Department of Radiology, and BRIC, University of North Carolina at Chapel Hill, Chapel Hill, North Carolina 27599 (United States); Wu, Ligang [Space Control and Inertial Technology Research Center, Harbin Institute of Technology, Harbin 150001 (China); Shen, Dinggang, E-mail: dgshen@med.unc.edu [Department of Radiology and BRIC, University of North Carolina at Chapel Hill, Chapel Hill, North Carolina 27599 and Department of Brain and Cognitive Engineering, Korea University, Seoul 02841 (Korea, Republic of)

    2016-02-15

    Purpose: It is important for many quantitative brain studies to label meaningful anatomical regions in MR brain images. However, due to high complexity of brain structures and ambiguous boundaries between different anatomical regions, the anatomical labeling of MR brain images is still quite a challenging task. In many existing label fusion methods, appearance information is widely used. However, since local anatomy in the human brain is often complex, the appearance information alone is limited in characterizing each image point, especially for identifying the same anatomical structure across different subjects. Recent progress in computer vision suggests that the context features can be very useful in identifying an object from a complex scene. In light of this, the authors propose a novel learning-based label fusion method by using both low-level appearance features (computed from the target image) and high-level context features (computed from warped atlases or tentative labeling maps of the target image). Methods: In particular, the authors employ a multi-channel random forest to learn the nonlinear relationship between these hybrid features and target labels (i.e., corresponding to certain anatomical structures). Specifically, at each of the iterations, the random forest will output tentative labeling maps of the target image, from which the authors compute spatial label context features and then use in combination with original appearance features of the target image to refine the labeling. Moreover, to accommodate the high inter-subject variations, the authors further extend their learning-based label fusion to a multi-atlas scenario, i.e., they train a random forest for each atlas and then obtain the final labeling result according to the consensus of results from all atlases. Results: The authors have comprehensively evaluated their method on both public LONI-LBPA40 and IXI datasets. To quantitatively evaluate the labeling accuracy, the authors use the
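
    The two-iteration scheme in the Methods section (a forest on appearance features, whose tentative label probabilities become context features for a second forest) can be sketched on synthetic data. This is a toy stand-in for the authors' pipeline, assuming scikit-learn's RandomForestClassifier; real use would compute spatial label context from warped atlases, not raw probabilities.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 600
appearance = rng.normal(size=(n, 4))                     # low-level features
labels = (appearance[:, 0] + 0.5 * appearance[:, 1] > 0).astype(int)
tr, te = slice(0, 400), slice(400, None)

# Iteration 1: forest trained on appearance features only
rf1 = RandomForestClassifier(n_estimators=50, random_state=0)
rf1.fit(appearance[tr], labels[tr])
context = rf1.predict_proba(appearance)     # tentative label probabilities
# (note: context for training rows comes from a forest that saw their
#  labels; a cross-validated pass would be cleaner in practice)

# Iteration 2: forest trained on appearance + context features
hybrid = np.hstack([appearance, context])
rf2 = RandomForestClassifier(n_estimators=50, random_state=0)
rf2.fit(hybrid[tr], labels[tr])

acc = float((rf2.predict(hybrid[te]) == labels[te]).mean())
print(acc)
```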

  5. Nonlocal atlas-guided multi-channel forest learning for human brain labeling

    International Nuclear Information System (INIS)

    Ma, Guangkai; Gao, Yaozong; Wu, Guorong; Wu, Ligang; Shen, Dinggang

    2016-01-01

    Purpose: It is important for many quantitative brain studies to label meaningful anatomical regions in MR brain images. However, due to high complexity of brain structures and ambiguous boundaries between different anatomical regions, the anatomical labeling of MR brain images is still quite a challenging task. In many existing label fusion methods, appearance information is widely used. However, since local anatomy in the human brain is often complex, the appearance information alone is limited in characterizing each image point, especially for identifying the same anatomical structure across different subjects. Recent progress in computer vision suggests that the context features can be very useful in identifying an object from a complex scene. In light of this, the authors propose a novel learning-based label fusion method by using both low-level appearance features (computed from the target image) and high-level context features (computed from warped atlases or tentative labeling maps of the target image). Methods: In particular, the authors employ a multi-channel random forest to learn the nonlinear relationship between these hybrid features and target labels (i.e., corresponding to certain anatomical structures). Specifically, at each of the iterations, the random forest will output tentative labeling maps of the target image, from which the authors compute spatial label context features and then use in combination with original appearance features of the target image to refine the labeling. Moreover, to accommodate the high inter-subject variations, the authors further extend their learning-based label fusion to a multi-atlas scenario, i.e., they train a random forest for each atlas and then obtain the final labeling result according to the consensus of results from all atlases. Results: The authors have comprehensively evaluated their method on both public LONI-LBPA40 and IXI datasets. To quantitatively evaluate the labeling accuracy, the authors use the

  6. Probabilistic Infinite Secret Sharing

    OpenAIRE

    Csirmaz, László

    2013-01-01

    The study of probabilistic secret sharing schemes using arbitrary probability spaces and possibly infinite number of participants lets us investigate abstract properties of such schemes. It highlights important properties, explains why certain definitions work better than others, connects this topic to other branches of mathematics, and might yield new design paradigms. A probabilistic secret sharing scheme is a joint probability distribution of the shares and the secret together with a colle...
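
    A minimal concrete instance of a scheme in the abstract's sense (a joint probability distribution of shares and secret) is 2-out-of-2 additive sharing over Z_p: the first share is uniformly random, so either share alone reveals nothing about the secret, while both together determine it. The prime below is an illustrative choice.

```python
import random

random.seed(3)
p = 101            # field size (illustrative small prime)

def share(secret):
    """2-out-of-2 additive sharing over Z_p."""
    s1 = random.randrange(p)        # uniformly random first share
    s2 = (secret - s1) % p          # second share forced so s1 + s2 = secret (mod p)
    return s1, s2

def reconstruct(s1, s2):
    return (s1 + s2) % p

s1, s2 = share(42)
print(s1, s2, reconstruct(s1, s2))
```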

  7. Perception of Risk and Terrorism-Related Behavior Change: Dual Influences of Probabilistic Reasoning and Reality Testing

    Science.gov (United States)

    Denovan, Andrew; Dagnall, Neil; Drinkwater, Kenneth; Parker, Andrew; Clough, Peter

    2017-01-01

    The present study assessed the degree to which probabilistic reasoning performance and thinking style influenced perception of risk and self-reported levels of terrorism-related behavior change. A sample of 263 respondents, recruited via convenience sampling, completed a series of measures comprising probabilistic reasoning tasks (perception of randomness, base rate, probability, and conjunction fallacy), the Reality Testing subscale of the Inventory of Personality Organization (IPO-RT), the Domain-Specific Risk-Taking Scale, and a terrorism-related behavior change scale. Structural equation modeling examined three progressive models. Firstly, the Independence Model assumed that probabilistic reasoning, perception of risk and reality testing independently predicted terrorism-related behavior change. Secondly, the Mediation Model supposed that probabilistic reasoning and reality testing correlated, and indirectly predicted terrorism-related behavior change through perception of risk. Lastly, the Dual-Influence Model proposed that probabilistic reasoning indirectly predicted terrorism-related behavior change via perception of risk, independent of reality testing. Results indicated that performance on probabilistic reasoning tasks most strongly predicted perception of risk, and preference for an intuitive thinking style (measured by the IPO-RT) best explained terrorism-related behavior change. The combination of perception of risk with probabilistic reasoning ability in the Dual-Influence Model enhanced the predictive power of the analytical-rational route, with conjunction fallacy having a significant indirect effect on terrorism-related behavior change via perception of risk. The Dual-Influence Model possessed superior fit and reported similar predictive relations between intuitive-experiential and analytical-rational routes and terrorism-related behavior change. The discussion critically examines these findings in relation to dual-processing frameworks. This
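
    The mediation structure tested above (probabilistic reasoning predicting behavior change indirectly through perception of risk) reduces numerically to an indirect effect equal to the product of the X→M and M→Y path coefficients. A hedged sketch on simulated data, not the study's, using plain OLS rather than a full SEM fit:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 2000
x = rng.normal(size=n)                           # probabilistic reasoning score
m = 0.5 * x + rng.normal(scale=0.8, size=n)      # perception of risk (mediator)
y = 0.6 * m + 0.1 * x + rng.normal(size=n)       # behavior change

# a path: regress M on X (single-predictor OLS slope)
a = float(np.cov(x, m, bias=True)[0, 1] / np.var(x))

# b path: effect of M on Y controlling for X (multiple regression)
X = np.column_stack([np.ones(n), m, x])
b = float(np.linalg.lstsq(X, y, rcond=None)[0][1])

indirect = a * b                                 # indirect (mediated) effect
print(round(a, 2), round(b, 2), round(indirect, 2))
```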

  8. Perception of Risk and Terrorism-Related Behavior Change: Dual Influences of Probabilistic Reasoning and Reality Testing.

    Science.gov (United States)

    Denovan, Andrew; Dagnall, Neil; Drinkwater, Kenneth; Parker, Andrew; Clough, Peter

    2017-01-01

    The present study assessed the degree to which probabilistic reasoning performance and thinking style influenced perception of risk and self-reported levels of terrorism-related behavior change. A sample of 263 respondents, recruited via convenience sampling, completed a series of measures comprising probabilistic reasoning tasks (perception of randomness, base rate, probability, and conjunction fallacy), the Reality Testing subscale of the Inventory of Personality Organization (IPO-RT), the Domain-Specific Risk-Taking Scale, and a terrorism-related behavior change scale. Structural equation modeling examined three progressive models. Firstly, the Independence Model assumed that probabilistic reasoning, perception of risk and reality testing independently predicted terrorism-related behavior change. Secondly, the Mediation Model supposed that probabilistic reasoning and reality testing correlated, and indirectly predicted terrorism-related behavior change through perception of risk. Lastly, the Dual-Influence Model proposed that probabilistic reasoning indirectly predicted terrorism-related behavior change via perception of risk, independent of reality testing. Results indicated that performance on probabilistic reasoning tasks most strongly predicted perception of risk, and preference for an intuitive thinking style (measured by the IPO-RT) best explained terrorism-related behavior change. The combination of perception of risk with probabilistic reasoning ability in the Dual-Influence Model enhanced the predictive power of the analytical-rational route, with conjunction fallacy having a significant indirect effect on terrorism-related behavior change via perception of risk. The Dual-Influence Model possessed superior fit and reported similar predictive relations between intuitive-experiential and analytical-rational routes and terrorism-related behavior change. The discussion critically examines these findings in relation to dual-processing frameworks. This

  9. Perception of Risk and Terrorism-Related Behavior Change: Dual Influences of Probabilistic Reasoning and Reality Testing

    Directory of Open Access Journals (Sweden)

    Andrew Denovan

    2017-10-01

Full Text Available The present study assessed the degree to which probabilistic reasoning performance and thinking style influenced perception of risk and self-reported levels of terrorism-related behavior change. A sample of 263 respondents, recruited via convenience sampling, completed a series of measures comprising probabilistic reasoning tasks (perception of randomness, base rate, probability, and conjunction fallacy), the Reality Testing subscale of the Inventory of Personality Organization (IPO-RT), the Domain-Specific Risk-Taking Scale, and a terrorism-related behavior change scale. Structural equation modeling examined three progressive models. Firstly, the Independence Model assumed that probabilistic reasoning, perception of risk and reality testing independently predicted terrorism-related behavior change. Secondly, the Mediation Model supposed that probabilistic reasoning and reality testing correlated, and indirectly predicted terrorism-related behavior change through perception of risk. Lastly, the Dual-Influence Model proposed that probabilistic reasoning indirectly predicted terrorism-related behavior change via perception of risk, independent of reality testing. Results indicated that performance on probabilistic reasoning tasks most strongly predicted perception of risk, and preference for an intuitive thinking style (measured by the IPO-RT) best explained terrorism-related behavior change. The combination of perception of risk with probabilistic reasoning ability in the Dual-Influence Model enhanced the predictive power of the analytical-rational route, with conjunction fallacy having a significant indirect effect on terrorism-related behavior change via perception of risk. The Dual-Influence Model possessed superior fit and reported similar predictive relations between intuitive-experiential and analytical-rational routes and terrorism-related behavior change. The discussion critically examines these findings in relation to dual

  10. Unsteady Probabilistic Analysis of a Gas Turbine System

    Science.gov (United States)

    Brown, Marilyn

    2003-01-01

    In this work, we have considered an annular cascade configuration subjected to unsteady inflow conditions. The unsteady response calculation has been implemented into the time marching CFD code, MSUTURBO. The computed steady state results for the pressure distribution demonstrated good agreement with experimental data. We have computed results for the amplitudes of the unsteady pressure over the blade surfaces. With the increase in gas turbine engine structural complexity and performance over the past 50 years, structural engineers have created an array of safety nets to ensure against component failures in turbine engines. In order to reduce what is now considered to be excessive conservatism and yet maintain the same adequate margins of safety, there is a pressing need to explore methods of incorporating probabilistic design procedures into engine development. Probabilistic methods combine and prioritize the statistical distributions of each design variable, generate an interactive distribution and offer the designer a quantified relationship between robustness, endurance and performance. The designer can therefore iterate between weight reduction, life increase, engine size reduction, speed increase etc.

  11. A normative probabilistic design of a fair governmental decision strategy

    Czech Academy of Sciences Publication Activity Database

    Kárný, Miroslav; Kracík, Jan

    2004-01-01

Vol. 12, No. 2-3 (2004), pp. 1-15. ISSN 1057-9214. R&D Projects: GA AV ČR IBS1075351; GA ČR GA102/03/0049. Grant - others: ESF(FR) TED. Institutional research plan: CEZ:AV0Z1075907. Keywords: fully probabilistic design * E-democracy * multiple participant decision making. Subject RIV: BB - Applied Statistics, Operational Research. http://library.utia.cas.cz/separaty/historie/karny-0106249.pdf

  12. Fixed-point Characterization of Compositionality Properties of Probabilistic Processes Combinators

    Directory of Open Access Journals (Sweden)

    Daniel Gebler

    2014-08-01

Full Text Available Bisimulation metric is a robust behavioural semantics for probabilistic processes. Given any SOS specification of probabilistic processes, we provide a method to compute for each operator of the language its respective metric compositionality property. The compositionality property of an operator is defined as its modulus of continuity, which gives the relative increase of the distance between processes when they are combined by that operator. The compositionality property of an operator is computed by recursively counting how many times the combined processes are copied along their evolution. The compositionality properties allow one to derive an upper bound on the distance between processes by purely inspecting the operators used to specify those processes.

  13. Optimal Portfolio Allocation under a Probabilistic Risk Constraint and the Incentives for Financial Innovation

    NARCIS (Netherlands)

    J. Daníelsson (Jón); B.N. Jorgensen (Bjørn); C.G. de Vries (Casper); X. Yang (Xiaoguang)

    2001-01-01

We derive, in a complete markets environment, an investor's optimal portfolio allocation subject to both a budget constraint and a probabilistic risk constraint. We demonstrate that the set of feasible portfolios need not be connected or convex, while the number of local optima increases
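A probabilistic risk constraint of the kind analyzed above bounds the *probability* of a large loss rather than a variance. A minimal Monte Carlo sketch (the two-asset return parameters, loss limit, and grid search below are hypothetical, purely to illustrate how such a constraint carves out the feasible set):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated one-period returns for two assets (all parameters hypothetical).
n_scenarios = 20_000
returns = rng.multivariate_normal(
    mean=[0.05, 0.02],
    cov=[[0.04, 0.005], [0.005, 0.0025]],
    size=n_scenarios,
)

def satisfies_risk_constraint(weights, loss_limit=0.10, max_prob=0.05):
    """Probabilistic risk constraint: P(loss > loss_limit) <= max_prob."""
    portfolio = returns @ np.asarray(weights)
    return np.mean(-portfolio > loss_limit) <= max_prob

def expected_return(weights):
    return float((returns @ np.asarray(weights)).mean())

# Scan the budget line w + (1 - w) = 1 and keep the best feasible portfolio.
best = None
for w in np.linspace(0.0, 1.0, 101):
    weights = (w, 1.0 - w)
    if satisfies_risk_constraint(weights):
        mu = expected_return(weights)
        if best is None or mu > best[1]:
            best = (weights, mu)
```

Because feasibility is decided by a tail probability, adding or removing scenarios can make disjoint regions of the weight line feasible, which is one intuition for the non-connectedness result.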

  14. Studies into tau reconstruction, missing transverse energy and photon induced processes with the ATLAS detector at the LHC

    Energy Technology Data Exchange (ETDEWEB)

    Prabhu, Robindra P.

    2011-09-15

The ATLAS experiment is currently recording data from proton-proton collisions delivered by CERN's Large Hadron Collider. As more data is amassed, studies of both Standard Model processes and searches for new physics beyond will intensify. This dissertation presents a three-part study providing new methods to help facilitate these efforts. The first part presents a novel {tau}-reconstruction algorithm for ATLAS inspired by the ideas of particle flow calorimetry. The algorithm is distinguished from traditional {tau}-reconstruction approaches in ATLAS, insofar as it seeks to recognize decay topologies consistent with a (hadronically) decaying {tau}-lepton using resolved energy flow objects in the calorimeters. This procedure allows for an early classification of {tau}-candidates according to their decay mode and the use of decay mode specific discrimination against fakes. A detailed discussion of the algorithm is provided along with early performance results derived from simulated data. The second part presents a Monte Carlo simulation tool which, by way of a pseudorapidity-dependent parametrization of the jet energy resolution, provides a probabilistic estimate for the magnitude of instrumental contributions to missing transverse energy arising from jet fluctuations. The principles of the method are outlined and it is shown how the method can be used to populate tails of simulated missing transverse energy distributions suffering from low statistics. The third part explores the prospect of detecting photon-induced leptonic final states in early data. Such processes are distinguished from the more copious hadronic interactions at the LHC by cleaner final states void of hadronic debris; however, the soft character of the final state leptons poses challenges to both trigger and offline selections. New trigger items enabling the online selection of such final states are presented, along with a study into the feasibility of detecting the two-photon exchange process
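The second part's idea, estimating instrumental missing transverse energy from jet fluctuations via an eta-dependent resolution, can be sketched with a toy Monte Carlo. The resolution parametrization, jet list, and toy count below are invented for illustration, not the dissertation's actual parametrization:

```python
import numpy as np

rng = np.random.default_rng(42)

def jet_sigma(pt, eta):
    # Hypothetical eta-dependent resolution: a calorimeter-like sqrt(pt)
    # stochastic term plus a constant term that worsens toward forward eta.
    stochastic = 0.8 * np.sqrt(pt)              # GeV
    constant = (0.03 + 0.02 * abs(eta)) * pt    # GeV
    return np.hypot(stochastic, constant)

def smeared_met(jets, n_toys=10_000):
    """Toy-MC distribution of fake |MET| from jet energy fluctuations.

    jets: list of (pt [GeV], eta, phi) tuples for one event.
    """
    px = np.zeros(n_toys)
    py = np.zeros(n_toys)
    for pt, eta, phi in jets:
        dpt = rng.normal(0.0, jet_sigma(pt, eta), size=n_toys)
        # A fluctuation +dpt on a jet unbalances the event by -dpt.
        px -= dpt * np.cos(phi)
        py -= dpt * np.sin(phi)
    return np.hypot(px, py)

met = smeared_met([(100.0, 0.5, 0.0), (80.0, 2.5, 3.0), (40.0, 1.0, 1.5)])
```

Drawing many toys per event is what lets the method populate low-statistics tails of the missing transverse energy distribution.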

  15. Probabilistic disaggregation model with application to natural hazard risk assessment of portfolios

    DEFF Research Database (Denmark)

    Custer, Rocco; Nishijima, Kazuyoshi

In natural hazard risk assessment, a resolution mismatch between hazard data and aggregated exposure data is often observed. A possible solution to this issue is the disaggregation of exposure data to match the spatial resolution of hazard data. Disaggregation models available in literature are usually deterministic and make use of auxiliary indicators, such as land cover, to spatially distribute exposures. As the dependence between auxiliary indicator and disaggregated number of exposures is generally imperfect, uncertainty arises in disaggregation. This paper therefore proposes a probabilistic disaggregation model that considers the uncertainty in the disaggregation, taking basis in the scaled Dirichlet distribution. The proposed probabilistic disaggregation model is applied to a portfolio of residential buildings in the Canton Bern, Switzerland, subject to flood risk. Thereby, the model is verified...
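The core idea, splitting an aggregated exposure count over grid cells with shares that follow an auxiliary indicator but remain uncertain, can be sketched with a plain (unscaled) Dirichlet as a simplification of the paper's scaled Dirichlet; the cell indicator values, total, and concentration parameter below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(7)

def disaggregate(total, indicator, concentration=50.0, n_samples=1000):
    """Probabilistically split an aggregated exposure count over grid cells.

    Expected cell shares follow the auxiliary indicator (e.g. built-up area);
    `concentration` controls how tightly samples cluster around those shares,
    encoding the imperfect indicator-exposure dependence.
    """
    indicator = np.asarray(indicator, dtype=float)
    shares = indicator / indicator.sum()
    sampled_shares = rng.dirichlet(concentration * shares, size=n_samples)
    return total * sampled_shares   # shape (n_samples, n_cells)

# 1200 buildings spread over three cells with land-cover weights 5:3:2.
samples = disaggregate(total=1200, indicator=[5.0, 3.0, 2.0])
mean_alloc = samples.mean(axis=0)
```

Each row is one plausible disaggregation, so downstream risk estimates can propagate the disaggregation uncertainty instead of fixing a single deterministic split.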

  16. Evolution of the ATLAS PanDA Workload Management System for Exascale Computational Science

    OpenAIRE

    Maeno, T; De, K; Klimentov, A; Nilsson, P; Oleynik, D; Panitkin, S; Petrosyan, A; Schovancova, J; Vaniachine, A; Wenaus, T; Yu, D

    2013-01-01

    An important foundation underlying the impressive success of data processing and analysis in the ATLAS experiment [1] at the LHC [2] is the Production and Distributed Analysis (PanDA) workload management system [3]. PanDA was designed specifically for ATLAS and proved to be highly successful in meeting all the distributed computing needs of the experiment. However, the core design of PanDA is not experiment specific. The PanDA workload management system is capable of meeting the needs of othe...

17. Mapy energií v Atlase krajiny České republiky (Energy Maps in the Landscape Atlas of the Czech Republic)

    Czech Academy of Sciences Publication Activity Database

    Kolejka, Jaromír

    2008-01-01

Vol. 42, No. 6 (2008), pp. 292-297. ISSN 0044-4863. Grant - others: GA MŽP(CZ) SK/600/1/03. Institutional research plan: CEZ:AV0Z30860518. Keywords: map * energy * Landscape Atlas of the Czech Republic. Subject RIV: DE - Earth Magnetism, Geodesy, Geography

  18. An overview-probabilistic safety analysis for research reactors

    International Nuclear Information System (INIS)

    Liu Jinlin; Peng Changhong

    2015-01-01

Through long-term application, Probabilistic Safety Analysis (PSA) has proved to be a valuable tool for improving the safety and reliability of power reactors. In China, the 'Nuclear safety and radioactive pollution prevention Twelfth Five Year Plan and the 2020 vision' explicitly calls for the development of probabilistic safety analysis and aging evaluation for research reactors. Compared with power reactors, research reactors have some specific features: lower operating power, lower coolant temperature and pressure, etc. However, their core configurations may change very often, and human actions play an important safety role in research reactors due to their specific experimental requirements. As a result, there is a need to conduct PSA for research reactors. This paper discusses the special characteristics related to the structure and operation of research reactors and the methods to develop their PSA, including initiating event analysis, event tree analysis, fault tree analysis, dependent failure analysis, human reliability analysis and quantification, as well as experimental and external event evaluation, through an investigation of various research reactors and their PSAs at home and abroad, to present the current situation and features of research reactor PSAs. (author)
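The fault tree quantification step listed above is commonly done with minimal cut sets under the rare-event approximation. A minimal sketch; the basic events, probabilities, and cut sets below are invented for illustration:

```python
def top_event_probability(cut_sets, basic_probs):
    """Rare-event approximation: the top-event probability is bounded by
    the sum over minimal cut sets of the product of basic-event probabilities."""
    total = 0.0
    for cut in cut_sets:
        p = 1.0
        for event in cut:
            p *= basic_probs[event]
        total += p
    return total

# Hypothetical fault tree: loss of cooling requires pump failure AND
# (valve failure OR operator error), giving two minimal cut sets.
basic = {"pump": 1e-3, "valve": 5e-3, "operator": 1e-2}
cuts = [("pump", "valve"), ("pump", "operator")]
p_top = top_event_probability(cuts, basic)   # 5e-6 + 1e-5 = 1.5e-5
```

The same structure extends to human-error basic events, which, as the abstract notes, carry particular weight in research reactor PSAs.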

  19. Probabilistic brains: knowns and unknowns

    Science.gov (United States)

    Pouget, Alexandre; Beck, Jeffrey M; Ma, Wei Ji; Latham, Peter E

    2015-01-01

    There is strong behavioral and physiological evidence that the brain both represents probability distributions and performs probabilistic inference. Computational neuroscientists have started to shed light on how these probabilistic representations and computations might be implemented in neural circuits. One particularly appealing aspect of these theories is their generality: they can be used to model a wide range of tasks, from sensory processing to high-level cognition. To date, however, these theories have only been applied to very simple tasks. Here we discuss the challenges that will emerge as researchers start focusing their efforts on real-life computations, with a focus on probabilistic learning, structural learning and approximate inference. PMID:23955561

  20. Probabilistic simple sticker systems

    Science.gov (United States)

    Selvarajoo, Mathuri; Heng, Fong Wan; Sarmin, Nor Haniza; Turaev, Sherzod

    2017-04-01

A model for DNA computing using the recombination behavior of DNA molecules, known as a sticker system, was introduced by L. Kari, G. Paun, G. Rozenberg, A. Salomaa, and S. Yu in the paper 'DNA computing, sticker systems and universality' (Acta Informatica, vol. 35, pp. 401-420, 1998). A sticker system uses the Watson-Crick complementarity of DNA molecules: starting from incomplete double-stranded sequences, sticking operations are applied iteratively until a complete double-stranded sequence is obtained. It is known that sticker systems with finite sets of axioms and sticker rules generate only regular languages. Hence, different types of restrictions have been considered to increase the computational power of sticker systems. Recently, a variant of restricted sticker systems, called probabilistic sticker systems, has been introduced [4]. In this variant, probabilities are initially associated with the axioms, and the probability of a generated string is computed by multiplying the probabilities of all occurrences of the initial strings in the computation of the string. Strings for the language are selected according to some probabilistic requirements. In this paper, we study fundamental properties of probabilistic simple sticker systems. We prove that the probabilistic enhancement increases the computational power of simple sticker systems.
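The probability rule described above (multiply the probabilities of all axiom occurrences used in a derivation, then filter strings by a probabilistic requirement) can be sketched in a few lines. The axiom fragments, their probabilities, and the threshold requirement below are hypothetical:

```python
from math import prod

# Hypothetical axioms (initial DNA fragments) with attached probabilities.
axiom_probs = {"AT": 0.5, "GC": 0.3, "TA": 0.2}

def string_probability(derivation):
    """Probability of a generated string: the product of the probabilities
    of every axiom occurrence used in its derivation."""
    return prod(axiom_probs[a] for a in derivation)

def accepted(derivation, threshold=0.05):
    """One possible probabilistic requirement: keep a string only if its
    derivation probability meets a cut-off."""
    return string_probability(derivation) >= threshold

p = string_probability(["AT", "GC", "AT"])   # 0.5 * 0.3 * 0.5 = 0.075
```

It is this selection step, not the sticking operations themselves, that lets the probabilistic variant generate languages beyond the regular ones reachable by plain simple sticker systems.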

  1. ATLAS Distributed Computing

    CERN Document Server

    Schovancova, J; The ATLAS collaboration

    2011-01-01

The poster details the different aspects of the ATLAS Distributed Computing experience after the first year of LHC data taking. We describe the performance of the ATLAS distributed computing system and the lessons learned during the 2010 run, pointing out parts of the system which were in good shape, and also spotting areas which required improvement. Improvements ranged from hardware upgrades on the ATLAS Tier-0 computing pools to improve data distribution rates, tuning of FTS channels between CERN and Tier-1s, and studying data access patterns for Grid analysis to improve the global processing rate. We show recent software development driven by operational needs with emphasis on data management and job execution in the ATLAS production system.

  2. On the Origins of Suboptimality in Human Probabilistic Inference

    Science.gov (United States)

    Acerbi, Luigi; Vijayakumar, Sethu; Wolpert, Daniel M.

    2014-01-01

    Humans have been shown to combine noisy sensory information with previous experience (priors), in qualitative and sometimes quantitative agreement with the statistically-optimal predictions of Bayesian integration. However, when the prior distribution becomes more complex than a simple Gaussian, such as skewed or bimodal, training takes much longer and performance appears suboptimal. It is unclear whether such suboptimality arises from an imprecise internal representation of the complex prior, or from additional constraints in performing probabilistic computations on complex distributions, even when accurately represented. Here we probe the sources of suboptimality in probabilistic inference using a novel estimation task in which subjects are exposed to an explicitly provided distribution, thereby removing the need to remember the prior. Subjects had to estimate the location of a target given a noisy cue and a visual representation of the prior probability density over locations, which changed on each trial. Different classes of priors were examined (Gaussian, unimodal, bimodal). Subjects' performance was in qualitative agreement with the predictions of Bayesian Decision Theory although generally suboptimal. The degree of suboptimality was modulated by statistical features of the priors but was largely independent of the class of the prior and level of noise in the cue, suggesting that suboptimality in dealing with complex statistical features, such as bimodality, may be due to a problem of acquiring the priors rather than computing with them. We performed a factorial model comparison across a large set of Bayesian observer models to identify additional sources of noise and suboptimality. Our analysis rejects several models of stochastic behavior, including probability matching and sample-averaging strategies. Instead we show that subjects' response variability was mainly driven by a combination of a noisy estimation of the parameters of the priors, and by
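The estimation task above has a simple Bayesian-observer counterpart: given the displayed prior density and a noisy cue, the optimal estimate under squared-error loss is the posterior mean. A minimal numerical sketch; the grid, bimodal prior mixture, and noise level are invented for illustration:

```python
import numpy as np

x = np.linspace(-5.0, 5.0, 1001)   # discretized space of target locations

# Hypothetical bimodal prior shown to the subject: Gaussians at -2 and +2.
prior = 0.5 * np.exp(-0.5 * (x + 2.0) ** 2) + 0.5 * np.exp(-0.5 * (x - 2.0) ** 2)
prior /= prior.sum()

def posterior_mean(prior_pmf, cue, cue_sigma):
    """Bayes-optimal location estimate under squared-error loss:
    combine the displayed prior with a Gaussian likelihood around the cue."""
    likelihood = np.exp(-0.5 * ((x - cue) / cue_sigma) ** 2)
    post = prior_pmf * likelihood
    post /= post.sum()
    return float((x * post).sum())

estimate = posterior_mean(prior, cue=1.0, cue_sigma=1.0)
```

For a cue at +1 the ideal observer is pulled toward the nearer prior mode at +2, which is the kind of prediction against which subjects' (generally suboptimal) responses are compared.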

  3. On the origins of suboptimality in human probabilistic inference.

    Directory of Open Access Journals (Sweden)

    Luigi Acerbi

    2014-06-01

Full Text Available Humans have been shown to combine noisy sensory information with previous experience (priors), in qualitative and sometimes quantitative agreement with the statistically-optimal predictions of Bayesian integration. However, when the prior distribution becomes more complex than a simple Gaussian, such as skewed or bimodal, training takes much longer and performance appears suboptimal. It is unclear whether such suboptimality arises from an imprecise internal representation of the complex prior, or from additional constraints in performing probabilistic computations on complex distributions, even when accurately represented. Here we probe the sources of suboptimality in probabilistic inference using a novel estimation task in which subjects are exposed to an explicitly provided distribution, thereby removing the need to remember the prior. Subjects had to estimate the location of a target given a noisy cue and a visual representation of the prior probability density over locations, which changed on each trial. Different classes of priors were examined (Gaussian, unimodal, bimodal). Subjects' performance was in qualitative agreement with the predictions of Bayesian Decision Theory although generally suboptimal. The degree of suboptimality was modulated by statistical features of the priors but was largely independent of the class of the prior and level of noise in the cue, suggesting that suboptimality in dealing with complex statistical features, such as bimodality, may be due to a problem of acquiring the priors rather than computing with them. We performed a factorial model comparison across a large set of Bayesian observer models to identify additional sources of noise and suboptimality. Our analysis rejects several models of stochastic behavior, including probability matching and sample-averaging strategies. Instead we show that subjects' response variability was mainly driven by a combination of a noisy estimation of the parameters of the

  4. Probabilistic Design and Analysis Framework

    Science.gov (United States)

    Strack, William C.; Nagpal, Vinod K.

    2010-01-01

PRODAF is a software package designed to aid analysts and designers in conducting probabilistic analysis of components and systems. PRODAF can integrate multiple analysis programs to ease the tedious process of conducting a complex analysis that requires the use of multiple software packages. The work uses a commercial finite element analysis (FEA) program with modules from NESSUS to conduct a probabilistic analysis of a hypothetical turbine blade, disk, and shaft model. PRODAF applies the response surface method at the component level and extrapolates the component-level responses to the system level. Hypothetical components of a gas turbine engine are first deterministically modeled using FEA. Variations in selected geometrical dimensions and loading conditions are analyzed to determine the effects on the stress state within each component. Geometric variations include the cord length and height for the blade, and the inner radius, outer radius, and thickness for the disk. Probabilistic analysis is carried out using developing software packages such as System Uncertainty Analysis (SUA) and PRODAF. PRODAF was used with a commercial deterministic FEA program in conjunction with modules from the probabilistic analysis program, NESTEM, to perturb loads and geometries to provide a reliability and sensitivity analysis. PRODAF simplified the handling of data among the various programs involved, and will work with many commercial and open-source deterministic programs, probabilistic programs, or modules.
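The response surface method mentioned above replaces the expensive deterministic solver with a cheap fitted surrogate before running the probabilistic step. A minimal sketch, with a hypothetical closed-form function standing in for an FEA stress evaluation and invented scatter on a length and a load variable:

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for an expensive deterministic FEA run (hypothetical closed form).
def fea_stress(length, load):
    return 200.0 + 3.0 * load + 0.5 * load * length + 0.1 * length ** 2

def basis(l, p):
    # Quadratic response-surface basis in the two design variables.
    return np.column_stack([np.ones_like(l), l, p, l * p, l ** 2, p ** 2])

# 1) Run a small design of experiments and fit the response surface.
doe = rng.uniform([8.0, 40.0], [12.0, 60.0], size=(30, 2))
y = np.array([fea_stress(l, p) for l, p in doe])
coef, *_ = np.linalg.lstsq(basis(doe[:, 0], doe[:, 1]), y, rcond=None)

def surrogate(l, p):
    return basis(l, p) @ coef

# 2) Cheap Monte Carlo on the surrogate: probability that stress exceeds a limit.
n = 100_000
l_mc = rng.normal(10.0, 0.3, n)   # geometric scatter (assumed)
p_mc = rng.normal(50.0, 4.0, n)   # load scatter (assumed)
p_fail = float(np.mean(surrogate(l_mc, p_mc) > 650.0))
```

Running 100,000 samples through the surrogate costs almost nothing, whereas the same study on the real FEA model would be prohibitive, which is the point of the method.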

  5. Probabilistic Role Models and the Guarded Fragment

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2004-01-01

We propose a uniform semantic framework for interpreting probabilistic concept subsumption and probabilistic role quantification through statistical sampling distributions. This general semantic principle serves as the foundation for the development of a probabilistic version of the guarded fragment of first-order logic. A characterization of equivalence in that logic in terms of bisimulations is given.

  6. Probabilistic role models and the guarded fragment

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2006-01-01

We propose a uniform semantic framework for interpreting probabilistic concept subsumption and probabilistic role quantification through statistical sampling distributions. This general semantic principle serves as the foundation for the development of a probabilistic version of the guarded fragment of first-order logic. A characterization of equivalence in that logic in terms of bisimulations is given.

  7. Illustrative Example of Distributed Analysis in ATLAS Spanish Tier-2 and Tier-3 centers

    CERN Document Server

    Oliver, E; The ATLAS collaboration; González de la Hoz, S; Kaci, M; Lamas, A; Salt, J; Sánchez, J; Villaplana, M

    2011-01-01

Data taking in ATLAS has been going on for more than one year. The necessity of a computing infrastructure for data storage, access by thousands of users and the processing of hundreds of millions of events has been confirmed in this period. Fortunately, this task has been managed by the GRID infrastructure and by the manpower that has also been developing specific GRID tools for the ATLAS community. An example of a physics analysis using this infrastructure is shown: searches for the decay of a heavy resonance into a ttbar pair, concretely using the ATLAS Spanish Tier-2 and the IFIC Tier-3. At this moment, the ATLAS Distributed Computing group is working to improve the connectivity among centers in order to be ready for the foreseen increase in ATLAS activity in the next years.

  8. Probabilistic Programming (Invited Talk)

    OpenAIRE

    Yang, Hongseok

    2017-01-01

    Probabilistic programming refers to the idea of using standard programming constructs for specifying probabilistic models from machine learning and statistics, and employing generic inference algorithms for answering various queries on these models, such as posterior inference and estimation of model evidence. Although this idea itself is not new and was, in fact, explored by several programming-language and statistics researchers in the early 2000, it is only in the last few years that proba...

  9. ATLAS Review Office

    CERN Multimedia

    Szeless, B

The ATLAS internal reviews, be it the mandatory Production Readiness Reviews, the newly installed Production Advancement Reviews, or the increasingly requested Design Reviews, have become a part of our ATLAS culture over the past years. The Activity Systems Status Overviews are, for the time being, one-time events and should be held for each system as soon as possible to be meaningful. There seems to be a consensus that the reviews have become a useful project tool for the ATLAS management, but even more so for the sub-systems themselves, making achievements as well as possible shortcomings visible. One other recognized byproduct is the increasing cross talk between the systems, a very important ingredient for all the systems to profit from the large collective knowledge we dispose of in ATLAS. In the last two months, the first two PARs were organized for the MDT End Caps and the TRT Barrel Modules, both part of the US contribution to the ATLAS Project. Furthermore several different design...

  10. ATLAS Forward Proton (AFP) time-of-flight (ToF) detector: construction & existing experiences

    CERN Document Server

    Sykora, Tomas; The ATLAS collaboration

    2018-01-01

In 2017 the ATLAS collaboration successfully completed the installation of the ATLAS Forward Proton (AFP) detector to measure diffractive protons leaving the ATLAS proton-proton interaction point at very small angles (hundreds of microradians). The AFP tags and measures forward protons scattered in single diffraction or hard central diffraction, where two protons are emitted and a central system is created. In addition, the AFP has the potential to measure two-photon exchange processes, and to be sensitive to eventual anomalous quartic couplings of vector bosons: γγW+W−, γγZZ, and γγγγ. Such measurements at high luminosities will be possible only due to the combination of high-resolution tracking (semi-edgeless 3D silicon pixel) detectors and ultra-high-precision ToF (quartz Cherenkov) detectors on both sides of the ATLAS detector. The construction of the ToF detector and experience with its operation are the subject of this talk.

  11. Development of representative magnetic resonance imaging-based atlases of the canine brain and evaluation of three methods for atlas-based segmentation.

    Science.gov (United States)

    Milne, Marjorie E; Steward, Christopher; Firestone, Simon M; Long, Sam N; O'Brien, Terrence J; Moffat, Bradford A

    2016-04-01

    To develop representative MRI atlases of the canine brain and to evaluate 3 methods of atlas-based segmentation (ABS). 62 dogs without clinical signs of epilepsy and without MRI evidence of structural brain disease. The MRI scans from 44 dogs were used to develop 4 templates on the basis of brain shape (brachycephalic, mesaticephalic, dolichocephalic, and combined mesaticephalic and dolichocephalic). Atlas labels were generated by segmenting the brain, ventricular system, hippocampal formation, and caudate nuclei. The MRI scans from the remaining 18 dogs were used to evaluate 3 methods of ABS (manual brain extraction and application of a brain shape-specific template [A], automatic brain extraction and application of a brain shape-specific template [B], and manual brain extraction and application of a combined template [C]). The performance of each ABS method was compared by calculation of the Dice and Jaccard coefficients, with manual segmentation used as the gold standard. Method A had the highest mean Jaccard coefficient and was the most accurate ABS method assessed. Measures of overlap for ABS methods that used manual brain extraction (A and C) ranged from 0.75 to 0.95 and compared favorably with repeated measures of overlap for manual extraction, which ranged from 0.88 to 0.97. Atlas-based segmentation was an accurate and repeatable method for segmentation of canine brain structures. It could be performed more rapidly than manual segmentation, which should allow the application of computer-assisted volumetry to large data sets and clinical cases and facilitate neuroimaging research and disease diagnosis.
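The Dice and Jaccard coefficients used above to score atlas-based segmentation against the manual gold standard are straightforward overlap measures on binary masks. A minimal sketch on toy 1-D masks (the masks are invented; real use would be on 3-D label volumes):

```python
import numpy as np

def dice(a, b):
    """Dice coefficient between two binary masks: 2|A∩B| / (|A| + |B|)."""
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

def jaccard(a, b):
    """Jaccard coefficient between two binary masks: |A∩B| / |A∪B|."""
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    inter = np.logical_and(a, b).sum()
    union = np.logical_or(a, b).sum()
    return inter / union

# Toy masks: manual (gold standard) vs atlas-based segmentation.
manual = np.array([0, 1, 1, 1, 1, 0, 0, 0], bool)
atlas = np.array([0, 0, 1, 1, 1, 1, 0, 0], bool)
d, j = dice(manual, atlas), jaccard(manual, atlas)   # 0.75 and 0.6
```

The two measures are monotonically related (J = D / (2 - D)), so ranking segmentation methods by either one gives the same ordering.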

  12. Teaching atlas of mammography

    International Nuclear Information System (INIS)

    Tabar, L.; Dean, P.B.

    1985-01-01

The illustrated case reports in this teaching atlas cover practically the entire range of possible pathological changes and are based on in-patient case material and 80,000 screening documents. The two basic approaches, detection and analysis of changes, are taught comprehensively and in great detail. A systematic procedure for analysing the mammographies in order to detect even the very least changes, and its practical application, is explained using mammographies showing unclear findings at first sight. A system of coordinates is presented which allows precise localisation of the changes. Exercises for practising the technique of identifying the pathological changes round out the methodological chapters. Additional imaging technical enhancements and detail enlargements are of great help in interpreting the findings. The specific approach adopted for this teaching atlas is a 'reverse procedure', which leaves the beaten track and starts with analysing the mammographies and evaluating the radiographic findings, in order to finally derive the diagnosis. (orig./CB) [de

  13. The ATLAS semi-conductor tracker operation and performance

    International Nuclear Information System (INIS)

    Robinson, D.

    2013-01-01

    The Semi-Conductor Tracker (SCT) is a silicon strip detector and one of the key precision tracking devices in the Inner Detector of the ATLAS experiment at the CERN Large Hadron Collider (LHC). The SCT was installed and commissioned within ATLAS in 2007, and has been used to exploit fully the physics potential of the LHC since the first proton–proton collisions at 7 TeV were delivered in 2009. In this paper, its operational status throughout data taking up to the end of 2011 is presented, and its tracking performance is reviewed. -- Highlights: ► The operation and performance of the ATLAS Semi-Conductor Tracker (SCT) is reviewed. ► More than 99% of the SCT strips have remained operational in all data taking periods so far. ► Tracking performance indicators have met or exceeded design specifications. ► Radiation damage effects match closely expectations from delivered fluence.

  14. Berliner Philharmoniker ATLAS visit

    CERN Multimedia

    ATLAS Collaboration

    2017-01-01

    The Berliner Philharmoniker is on tour through Europe. They stopped on June 27th in Geneva for a concert at the Victoria Hall. An ATLAS visit was organised the morning after, led by the ATLAS spokesperson Karl Jakobs (welcome and overview talk) and two ATLAS guides (AVC visit and 3D movie).

  15. A computational atlas of the hippocampal formation using ex vivo, ultra-high resolution MRI: Application to adaptive segmentation of in vivo MRI

    DEFF Research Database (Denmark)

    Iglesias, Juan Eugenio; Augustinack, Jean C.; Nguyen, Khoa

    2015-01-01

    level using ultra-high resolution, ex vivo MRI. Fifteen autopsy samples were scanned at 0.13 mm isotropic resolution (on average) using customized hardware. The images were manually segmented into 13 different hippocampal substructures using a protocol specifically designed for this study; precise...... datasets with different types of MRI contrast. The results show that the atlas and companion segmentation method: 1) can segment T1 and T2 images, as well as their combination, 2) replicate findings on mild cognitive impairment based on high-resolution T2 data, and 3) can discriminate between Alzheimer......'s disease subjects and elderly controls with 88% accuracy in standard resolution (1 mm) T1 data, significantly outperforming the atlas in FreeSurfer version 5.3 (86% accuracy) and classification based on whole hippocampal volume (82% accuracy)....

  16. Multi-threading in the ATLAS High-Level Trigger

    CERN Document Server

    Barton, Adam Edward; The ATLAS collaboration

    2017-01-01

    Over the next decade of LHC data-taking the instantaneous luminosity will reach up to 7.5 times the design value, with over 200 interactions per bunch-crossing, and will pose unprecedented challenges for the ATLAS trigger system. We report on an HLT prototype in which the need for HLT-specific components has been reduced to a minimum while retaining the key aspects of trigger functionality, including regional reconstruction and early event rejection. We report on the first experience of migrating trigger algorithms to this new framework and present the next steps towards a full implementation of the ATLAS trigger within AthenaMT.

  17. Operational experience of ATLAS SCT and Pixel Detector

    CERN Document Server

    Kocian, Martin; The ATLAS collaboration

    2017-01-01

    The ATLAS Inner Detector based on silicon sensors consists of a strip detector (SCT) and a pixel detector. It is the crucial component for vertexing and tracking in the ATLAS experiment. With the excellent performance of the LHC, well beyond the original specification, the silicon tracking detectors are facing substantial challenges in terms of data acquisition, radiation damage to the sensors, and SEUs in the readout ASICs. We present how the detector systems cope with the demands of high-luminosity operation while maintaining excellent performance, through hardware upgrades, software and firmware algorithms, and operational settings.

  18. Probabilistic Harmonic Modeling of Wind Power Plants

    DEFF Research Database (Denmark)

    Guest, Emerson; Jensen, Kim H.; Rasmussen, Tonny Wederberg

    2017-01-01

    A probabilistic sequence domain (SD) harmonic model of a grid-connected voltage-source converter is used to estimate harmonic emissions in a wind power plant (WPP) comprised of Type-IV wind turbines. The SD representation naturally partitioned converter generated voltage harmonics into those...... with deterministic phase and those with probabilistic phase. A case study performed on a string of ten 3 MW, Type-IV wind turbines implemented in PSCAD was used to verify the probabilistic SD harmonic model. The probabilistic SD harmonic model can be employed in the planning phase of WPP projects to assess harmonic...

  19. Topics in Probabilistic Judgment Aggregation

    Science.gov (United States)

    Wang, Guanchun

    2011-01-01

    This dissertation is a compilation of several studies that are united by their relevance to probabilistic judgment aggregation. In the face of complex and uncertain events, panels of judges are frequently consulted to provide probabilistic forecasts, and aggregation of such estimates in groups often yield better results than could have been made…

  20. Advanced methods for a probabilistic safety analysis of fires. Development of advanced methods for performing as far as possible realistic plant specific fire risk analysis (fire PSA)

    International Nuclear Information System (INIS)

    Hofer, E.; Roewekamp, M.; Tuerschmann, M.

    2003-07-01

    In the frame of the research project RS 1112 'Development of Methods for a Recent Probabilistic Safety Analysis, Particularly Level 2' funded by the German Federal Ministry of Economics and Technology (BMWi), advanced methods, in particular for performing as far as possible realistic plant specific fire risk analyses (fire PSA), should be developed. The present Technical Report gives an overview on the methodologies developed in this context for assessing the fire hazard. In the context of developing advanced methodologies for fire PSA, a probabilistic dynamics analysis with a fire simulation code including an uncertainty and sensitivity study has been performed for an exemplary scenario of a cable fire induced by an electric cabinet inside the containment of a modern Konvoi type German nuclear power plant taking into consideration the effects of fire detection and fire extinguishing means. With the present study, it was possible for the first time to determine the probabilities of specified fire effects from a class of fire events by means of probabilistic dynamics supplemented by uncertainty and sensitivity analyses. The analysis applies a deterministic dynamics model, consisting of a dynamic fire simulation code and a model of countermeasures, considering effects of the stochastics (so-called aleatory uncertainties) as well as uncertainties in the state of knowledge (so-called epistemic uncertainties). By this means, probability assessments including uncertainties are provided to be used within the PSA. (orig.) [de
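The separation of stochastic (aleatory) and state-of-knowledge (epistemic) uncertainties described above is commonly handled with a two-loop sampling scheme: epistemic parameters are drawn in an outer loop, and for each knowledge state the aleatory variability is sampled in an inner loop. The sketch below is a toy illustration of that structure only; all distributions and the damage criterion are invented, not taken from the report.

```python
import numpy as np

rng = np.random.default_rng(7)

# Outer loop: epistemic parameters (uncertain state of knowledge).
# Inner loop: aleatory scenario-to-scenario variability.
n_epistemic, n_aleatory = 200, 1000
prob_estimates = []
for _ in range(n_epistemic):
    growth_mean = rng.uniform(0.8, 1.2)    # hypothetical mean fire growth rate
    detect_scale = rng.uniform(4.0, 8.0)   # hypothetical detection-time scale
    growth = rng.lognormal(np.log(growth_mean), 0.3, n_aleatory)
    detect_time = rng.exponential(detect_scale, n_aleatory)
    damage = growth * detect_time > 5.0    # invented "specified fire effect"
    prob_estimates.append(damage.mean())

prob_estimates = np.array(prob_estimates)
# Epistemic uncertainty band on the aleatory damage probability:
print(np.percentile(prob_estimates, [5, 50, 95]))
```

Each outer-loop sample yields one aleatory damage probability, so the spread of `prob_estimates` expresses the epistemic uncertainty on that probability, which is the kind of assessment the PSA then uses.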

  1. Scaling up ATLAS Event Service to production levels on opportunistic computing platforms

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00066086; The ATLAS collaboration; Caballero, Jose; Ernst, Michael; Guan, Wen; Hover, John; Lesny, David; Maeno, Tadashi; Nilsson, Paul; Tsulaia, Vakhtang; van Gemmeren, Peter; Vaniachine, Alexandre; Wang, Fuquan; Wenaus, Torre

    2016-01-01

    Continued growth in public cloud and HPC resources is on track to exceed the dedicated resources available for ATLAS on the WLCG. Examples of such platforms are Amazon AWS EC2 Spot Instances, Edison Cray XC30 supercomputer, backfill at Tier 2 and Tier 3 sites, opportunistic resources at the Open Science Grid (OSG), and ATLAS High Level Trigger farm between the data taking periods. Because of specific aspects of opportunistic resources such as preemptive job scheduling and data I/O, their efficient usage requires workflow innovations provided by the ATLAS Event Service. Thanks to the finer granularity of the Event Service data processing workflow, the opportunistic resources are used more efficiently. We report on our progress in scaling opportunistic resource usage to double-digit levels in ATLAS production.

  2. A convergence theory for probabilistic metric spaces | Jäger ...

    African Journals Online (AJOL)

    We develop a theory of probabilistic convergence spaces based on Tardiff's neighbourhood systems for probabilistic metric spaces. We show that the resulting category is a topological universe and we characterize a subcategory that is isomorphic to the category of probabilistic metric spaces. Keywords: Probabilistic metric ...

  3. Software Validation in ATLAS

    International Nuclear Information System (INIS)

    Hodgkinson, Mark; Seuster, Rolf; Simmons, Brinick; Sherwood, Peter; Rousseau, David

    2012-01-01

    The ATLAS collaboration operates an extensive set of protocols to validate the quality of the offline software in a timely manner. This is essential in order to process the large amounts of data being collected by the ATLAS detector in 2011 without complications on the offline software side. We will discuss a number of different strategies used to validate the ATLAS offline software; running the ATLAS framework software, Athena, in a variety of configurations daily on each nightly build via the ATLAS Nightly System (ATN) and Run Time Tester (RTT) systems; the monitoring of these tests and checking the compilation of the software via distributed teams of rotating shifters; monitoring of and follow up on bug reports by the shifter teams and periodic software cleaning weeks to improve the quality of the offline software further.

  4. ATLAS Open Data project

    CERN Document Server

    The ATLAS collaboration

    2018-01-01

    The current ATLAS model of Open Access to recorded and simulated data offers the opportunity to access datasets with a focus on education, training and outreach. This mandate supports the creation of platforms, projects, software, and educational products used all over the planet. We describe the overall status of ATLAS Open Data (http://opendata.atlas.cern) activities, from core ATLAS activities and releases to individual and group efforts, as well as educational programs, and final web or software-based (and hard-copy) products that have been produced or are under development. The relatively large number of heterogeneous use cases currently documented is driving an upcoming release of more data and resources for the ATLAS Community and anyone interested to explore the world of experimental particle physics and the computer sciences through data analysis.

  5. A method of 2D/3D registration of a statistical mouse atlas with a planar X-ray projection and an optical photo.

    Science.gov (United States)

    Wang, Hongkai; Stout, David B; Chatziioannou, Arion F

    2013-05-01

    The development of sophisticated and high throughput whole body small animal imaging technologies has created a need for improved image analysis and increased automation. The registration of a digital mouse atlas to individual images is a prerequisite for automated organ segmentation and uptake quantification. This paper presents a fully automatic method for registering a statistical mouse atlas with individual subjects based on an anterior-posterior X-ray projection and a lateral optical photo of the mouse silhouette. The mouse atlas was trained as a statistical shape model based on 83 organ-segmented micro-CT images. For registration, a hierarchical approach is applied which first registers high contrast organs, and then estimates low contrast organs based on the registered high contrast organs. To register the high contrast organs, a 2D-registration-back-projection strategy is used that deforms the 3D atlas based on the 2D registrations of the atlas projections. For validation, this method was evaluated using 55 subjects of preclinical mouse studies. The results showed that this method can compensate for moderate variations of animal postures and organ anatomy. Two different metrics, the Dice coefficient and the average surface distance, were used to assess the registration accuracy of major organs. The Dice coefficients vary from 0.31 ± 0.16 for the spleen to 0.88 ± 0.03 for the whole body, and the average surface distance varies from 0.54 ± 0.06 mm for the lungs to 0.85 ± 0.10 mm for the skin. The method was compared with a direct 3D deformation optimization (without 2D-registration-back-projection) and a single-subject atlas registration (instead of using the statistical atlas). The comparison revealed that the 2D-registration-back-projection strategy significantly improved the registration accuracy, and the use of the statistical mouse atlas led to more plausible organ shapes than the single-subject atlas. This method was also tested with shoulder
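The Dice coefficient used above to quantify registration accuracy is straightforward to compute from two binary organ masks; a minimal NumPy sketch with toy masks (not the paper's data):

```python
import numpy as np

def dice_coefficient(mask_a, mask_b):
    """Dice similarity of two binary masks: 2|A ∩ B| / (|A| + |B|)."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * np.logical_and(a, b).sum() / denom

# Toy example: two overlapping 2D "organ" masks of 4 voxels each, 2 shared.
a = np.zeros((4, 4), dtype=bool); a[1:3, 1:3] = True
b = np.zeros((4, 4), dtype=bool); b[1:3, 2:4] = True
print(dice_coefficient(a, b))  # 2*2 / (4+4) = 0.5
```

A Dice value of 1 means identical masks and 0 means no overlap, which is why the whole-body score (0.88) is much higher than the small, variable spleen (0.31).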

  6. AGIS: The ATLAS Grid Information System

    CERN Document Server

    Anisenkov, A; The ATLAS collaboration; Klimentov, A; Senchenko, A

    2012-01-01

    The ATLAS Computing model embraces the Grid paradigm and a high degree of decentralization, with computing resources able to meet ATLAS requirements for petabyte-scale data operations. In this paper we present the ATLAS Grid Information System (AGIS), designed to integrate configuration and status information about resources, services and topology of the whole ATLAS Grid, as needed by ATLAS Distributed Computing applications and services.

  7. Survey of probabilistic methods in safety and risk assessment for nuclear power plant licensing

    International Nuclear Information System (INIS)

    1984-04-01

    After an overview of the goals and general methods of probabilistic approaches in nuclear safety, the main features of probabilistic safety or risk assessment (PRA) methods are discussed. In practical applications, a full-fledged PRA is mostly not applied; rather, various levels of analysis are used, ranging from unavailability assessment of systems, through the more complex analysis of the probable core damage stages, up to the assessment of the overall health effects on the total population from a certain practice. The various types of application are discussed in relation to their limitations and benefits for different stages of design or operation of nuclear power plants. This gives guidance to licensing staff in judging the usefulness of the various methods for their licensing decisions. Examples of the application of probabilistic methods in several countries are given. Two appendices, on reliability analysis and on containment and consequence analysis, provide more details on these subjects. (author)

  8. Approximative determination of failure probabilities in probabilistic fracture mechanics

    International Nuclear Information System (INIS)

    Riesch-Oppermann, H.; Brueckner, A.

    1987-01-01

    The possibility of using FORM in probabilistic fracture mechanics (PFM) is investigated. After a short review of the method and a description of some specific problems occurring in PFM applications, results obtained with FORM for the failure probabilities in a typical PFM problem (fatigue crack growth) are compared with those determined by a Monte Carlo simulation. (orig./HP)
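The comparison above, FORM versus Monte Carlo for a failure probability, can be illustrated with a toy limit state g = R - S (resistance minus load), failing when g < 0. All distributions and parameters below are invented for illustration. For lognormal R and S the limit state is linear in standard normal space, so the FORM result Phi(-beta) is exact and serves as the reference for the Monte Carlo estimate:

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(42)

# Invented lognormal parameters (of ln R and ln S) for illustration only.
mu_R, sig_R = 2.0, 0.2
mu_S, sig_S = 1.0, 0.4

# Monte Carlo estimate of the failure probability P(R < S).
n = 200_000
R = rng.lognormal(mu_R, sig_R, n)
S = rng.lognormal(mu_S, sig_S, n)
pf_mc = np.mean(R < S)

# FORM reference: ln R - ln S is normal, so the limit state is linear in
# standard normal space, the reliability index beta is exact, pf = Phi(-beta).
beta = (mu_R - mu_S) / sqrt(sig_R**2 + sig_S**2)
pf_exact = 0.5 * (1.0 + erf(-beta / sqrt(2.0)))
print(pf_mc, pf_exact)
```

For nonlinear limit states such as fatigue crack growth, FORM is only an approximation, which is exactly the discrepancy the cited study quantifies against Monte Carlo.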

  9. Cartea de Colorat a Experimentului ATLAS - ATLAS Experiment Colouring Book in Romanian

    CERN Multimedia

    Anthony, Katarina

    2018-01-01

    Language: Romanian - The ATLAS Experiment Colouring Book is a free-to-download educational book, ideal for kids aged 5-9. It aims to introduce children to the field of High-Energy Physics, as well as the work being carried out by the ATLAS Collaboration. Limba: Română - Cartea de Colorat a Experimentului ATLAS este o carte educativă gratuită, ideală pentru copiii cu vârsta cuprinsă între 5-9 ani. Scopul său este de a introduce copii în domeniul fizicii de înaltă energie, precum și activitatea desfășurată de colaborarea ATLAS.

  10. System Architecture Modeling for Technology Portfolio Management using ATLAS

    Science.gov (United States)

    Thompson, Robert W.; O'Neil, Daniel A.

    2006-01-01

    Strategic planners and technology portfolio managers have traditionally relied on consensus-based tools, such as Analytical Hierarchy Process (AHP) and Quality Function Deployment (QFD) in planning the funding of technology development. While useful to a certain extent, these tools are limited in their ability to fully quantify the impact of a technology choice on system mass, system reliability, project schedule, and lifecycle cost. The Advanced Technology Lifecycle Analysis System (ATLAS) aims to provide strategic planners a decision support tool for analyzing technology selections within a Space Exploration Architecture (SEA). Using ATLAS, strategic planners can select physics-based system models from a library, configure the systems with technologies and performance parameters, and plan the deployment of a SEA. Key parameters for current and future technologies have been collected from subject-matter experts and other documented sources in the Technology Tool Box (TTB). ATLAS can be used to compare the technical feasibility and economic viability of a set of technology choices for one SEA, and compare it against another set of technology choices or another SEA. System architecture modeling in ATLAS is a multi-step process. First, the modeler defines the system level requirements. Second, the modeler identifies technologies of interest whose impact on an SEA is to be assessed. Third, the system modeling team creates models of architecture elements (e.g. launch vehicles, in-space transfer vehicles, crew vehicles) if they are not already in the model library. Finally, the architecture modeler develops a script for the ATLAS tool to run, and the results for comparison are generated.

  11. Arbitrage and Hedging in a non probabilistic framework

    OpenAIRE

    Alvarez, Alexander; Ferrando, Sebastian; Olivares, Pablo

    2011-01-01

    The paper studies the concepts of hedging and arbitrage in a non probabilistic framework. It provides conditions for non probabilistic arbitrage based on the topological structure of the trajectory space and makes connections with the usual notion of arbitrage. Several examples illustrate the non probabilistic arbitrage as well as perfect replication of options under continuous and discontinuous trajectories; the results can then be applied in probabilistic models path by path. The approach is r...

  12. Beam tests of ATLAS SCT silicon strip detector modules

    Czech Academy of Sciences Publication Activity Database

    Campabadal, F.; Fleta, C.; Key, M.; Böhm, Jan; Mikeštíková, Marcela; Šťastný, Jan

    2005-01-01

    Roč. 538, - (2005), s. 384-407 ISSN 0168-9002 R&D Projects: GA MŠk(CZ) 1P04LA212 Institutional research plan: CEZ:AV0Z10100502 Keywords : ATLAS * silicon * micro-strip * beam * test Subject RIV: BF - Elementary Particles and High Energy Physics Impact factor: 1.224, year: 2005

  13. ATLAS ABCD Hybrid Fatal Charge Dosage Test

    CERN Document Server

    Kuhl, A; The ATLAS collaboration; Grillo, AA; Martinez-McKinney, F; Nielsen, J; Spencer, E; Wilder, M

    2011-01-01

    The Semi-Conductor Tracker (SCT) in the ATLAS experiment at the Large Hadron Collider (LHC) could be subject to various beam loss scenarios. If a severe beam loss event were to occur, it would be beneficial to know how SCT components would be affected. In the SCT detector modules, a key component is the ABCD application specific integrated circuit (ASIC), the onboard readout electronics of the system. This ASIC has design specifications that it should withstand a 5 nC charge injection within 25 ns, which is the period of the LHC bunch crossing. The first test performed is designed to test this limit, reaching a maximum of 10 nC deposited in 25 ns. One model for beam loss predicts that a large charge, of the order of 10^6 MIPS, could be incident on the detector. According to detector studies, this causes a local field breakdown between the backplane of the sensor, held at 450 V, and the strips. In this case the signal seen on the readout strip has a rise time of about 1 μs due to a charge screening effect. A seco...

  14. ATLAS ABCD Hybrid Fatal Charge Dosage Test

    CERN Document Server

    Kuhl, A; Grillo, A A; Martinez-McKinney, F; Nielsen, J; Spencer, E; Wilder, M

    2011-01-01

    The Semi-Conductor Tracker (SCT) in the ATLAS experiment at the Large Hadron Collider (LHC) could be subject to various beam loss scenarios. If a severe beam loss event were to occur, it would be beneficial to know how SCT components would be affected. In the SCT detector modules, a key component is the ABCD application specific integrated circuit (ASIC), the onboard readout electronics of the system. This ASIC has design specifications that it should withstand a 5 nC charge application within 25 ns, which is the period of the LHC bunch crossing. The first test performed is designed to test this limit, reaching a maximum of 10 nC deposited in 25 ns. One model for beam loss predicts that a large charge, of the order of 10^6 MIPS, could be incident on the detector. According to detector studies, this causes a local field breakdown between the backplane of the sensor, held at 450 V, and the strips. In this case the signal seen on the readout strip has a rise time of about 1 μs due to a charge screening effect. A...

  15. AGIS: The ATLAS Grid Information System

    CERN Document Server

    Anisenkov, Alexey; Di Girolamo, Alessandro; Gayazov, Stavro; Klimentov, Alexei; Oleynik, Danila; Senchenko, Alexander

    2012-01-01

    ATLAS is a particle physics experiment at the Large Hadron Collider at CERN. The experiment produces petabytes of data annually through simulation production and tens of petabytes of data per year from the detector itself. The ATLAS Computing model embraces the Grid paradigm and a high degree of decentralization, with computing resources able to meet ATLAS requirements for petabyte-scale data operations. In this paper we present the ATLAS Grid Information System (AGIS), designed to integrate configuration and status information about resources, services and topology of the whole ATLAS Grid, as needed by ATLAS Distributed Computing applications and services.

  16. A tough truck for ATLAS

    CERN Multimedia

    2003-01-01

    One of the mobile support structures that will be used to manoeuvre and assemble components of the ATLAS detector in its cavern was put through its paces at the end of July and passed its load tests with flying colours. The tests, which involved the surveyors taking measurements to detect any load-induced mechanical deformations, were carried out in Building 191. "The "truck" has been subjected to static tests with loads of up to 1250 tonnes and can carry and transport on air cushions a nominal load of up to 1000 tonnes at a top speed of 30 cm per minute," explains project leader Tommi Nyman. "It took two weeks to assemble the truck's components, the last of which arrived at CERN on 24 June. It then took a further 20 days to load the truck up for the test." The 8.5 metre-high truck will be used for final assembly of some of the ATLAS components, including the calorimeters, in cavern UX15. This powerful device is the result of a collaboration between CERN and the Henryk Niewodniczanski Institute of Nuclear ...

  17. Brain Atlas Fusion from High-Thickness Diagnostic Magnetic Resonance Images by Learning-Based Super-Resolution.

    Science.gov (United States)

    Zhang, Jinpeng; Zhang, Lichi; Xiang, Lei; Shao, Yeqin; Wu, Guorong; Zhou, Xiaodong; Shen, Dinggang; Wang, Qian

    2017-03-01

    It is fundamentally important to fuse the brain atlas from magnetic resonance (MR) images for many imaging-based studies. Most existing works focus on fusing the atlases from high-quality MR images. However, for low-quality diagnostic images (i.e., with high inter-slice thickness), the problem of atlas fusion has not been addressed yet. In this paper, we intend to fuse the brain atlas from the high-thickness diagnostic MR images that are prevalent for clinical routines. The main idea of our works is to extend the conventional groupwise registration by incorporating a novel super-resolution strategy. The contribution of the proposed super-resolution framework is two-fold. First, each high-thickness subject image is reconstructed to be isotropic by the patch-based sparsity learning. Then, the reconstructed isotropic image is enhanced for better quality through the random-forest-based regression model. In this way, the images obtained by the super-resolution strategy can be fused together by applying the groupwise registration method to construct the required atlas. Our experiments have shown that the proposed framework can effectively solve the problem of atlas fusion from the low-quality brain MR images.
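The random-forest regression step described above (enhancing a reconstructed image by learning a mapping from image patches to high-quality intensities) can be sketched in heavily simplified form. Synthetic arrays stand in for the MR data, and `RandomForestRegressor` from scikit-learn replaces the paper's trained model; this is an illustration of the technique, not the authors' pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Synthetic stand-ins: a high-quality target image and a degraded version
# playing the role of the patch-based sparse reconstruction.
hi = rng.random((32, 32))
lo = hi + 0.1 * rng.standard_normal((32, 32))

def patches(img, k=3):
    """Return flattened k x k patches and the coordinates of their centres."""
    h, w = img.shape
    r = k // 2
    X, idx = [], []
    for i in range(r, h - r):
        for j in range(r, w - r):
            X.append(img[i - r:i + r + 1, j - r:j + r + 1].ravel())
            idx.append((i, j))
    return np.array(X), idx

X, idx = patches(lo)
y = np.array([hi[i, j] for i, j in idx])  # regression target: true centre voxel

model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)
pred = model.predict(X)

mse_before = np.mean((np.array([lo[i, j] for i, j in idx]) - y) ** 2)
mse_after = np.mean((pred - y) ** 2)
print(mse_before, mse_after)
```

In a realistic setting the forest would be trained on separate training subjects and applied to unseen reconstructed images; here the error reduction is shown on the training data only, for brevity.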

  18. Probabilistic model for fatigue crack growth and fracture of welded joints in civil engineering structures

    NARCIS (Netherlands)

    Maljaars, J.; Steenbergen, H.M.G.M.; Vrouwenvelder, A.C.W.M.

    2012-01-01

    This paper presents a probabilistic assessment model for linear elastic fracture mechanics (LEFM). The model allows the determination of the failure probability of a structure subjected to fatigue loading. The distributions of the random variables for civil engineering structures are provided, and

  19. Situational analysis: preliminary regional review of the Mental Health Atlas 2014.

    Science.gov (United States)

    Gater, R; Chew, Z; Saeed, K

    2015-09-28

    The WHO comprehensive Mental Health Action Plan 2013-2020 established goals and objectives that Member States have agreed to meet by 2020. To update the Atlas of Mental Health 2011, specific indicators from the Mental Health Action Plan and additional indicators on service coverage were incorporated into the questionnaire for the Atlas 2014. The data will help facilitate improvement in information gathering and focus efforts towards implementation of the Mental Health Action Plan. The questionnaire was completed by the national mental health focal point of each country. This preliminary review seeks to consolidate data from the initial response to the Atlas 2014 questionnaire by Member States in the Eastern Mediterranean Region. Data for this review were analysed for the whole Region, by health systems groupings and by individual countries. Where possible, data are compared with the Mental Health Atlas 2011 to give a longitudinal perspective.

  20. FEDERAL USERS CONFERENCE PRODUCT LINE TOOL SET (PLTS) MAP PRODUCTION SYSTEM (MPS) ATLAS CUSTOM GRIDS [Rev 0 was draft

    Energy Technology Data Exchange (ETDEWEB)

    HAYENGA, J.L.

    2006-12-19

    Maps, and more importantly Atlases, are assisting the user community in managing a large land area with complex issues, the most complex of which is the management of nuclear waste. The techniques and experiences discussed herein were gained while developing several atlases for use at the US Department of Energy's Hanford Site. The user community requires the ability to locate not only waste sites, but other features as well. Finding a specific waste site on a map and in the field is a difficult task at a site the size of Hanford. To find a specific waste site, the user begins by locating the item or object in an index, then locating the feature on the corresponding map within an atlas. Locating features requires a method for indexing them. The location index and how to place it on a map or atlas is the central theme presented in this article. The user requirements for atlases forced the design team to develop new and innovative solutions for requirements that Product Line Tool Set (PLTS) Map Production System (MPS)-Atlas was not designed to handle. The layout of the most complex atlases includes custom reference grids, multiple data frames, multiple map series, and up to 250 maps. All of these functional requirements are at the extreme edge of the capabilities of PLTS MPS-Atlas. This document outlines the setup of an atlas using PLTS MPS-Atlas to meet these requirements.

  1. FEDERAL USERS CONFERENCE PRODUCT LINE TOOL SET (PLTS) MAP PRODUCTION SYSTEM (MPS) ATLAS CUSTOM GRIDS [Rev 0 was draft

    International Nuclear Information System (INIS)

    HAYENGA, J.L.

    2006-01-01

    Maps, and more importantly Atlases, are assisting the user community in managing a large land area with complex issues, the most complex of which is the management of nuclear waste. The techniques and experiences discussed herein were gained while developing several atlases for use at the US Department of Energy's Hanford Site. The user community requires the ability to locate not only waste sites, but other features as well. Finding a specific waste site on a map and in the field is a difficult task at a site the size of Hanford. To find a specific waste site, the user begins by locating the item or object in an index, then locating the feature on the corresponding map within an atlas. Locating features requires a method for indexing them. The location index and how to place it on a map or atlas is the central theme presented in this article. The user requirements for atlases forced the design team to develop new and innovative solutions for requirements that Product Line Tool Set (PLTS) Map Production System (MPS)-Atlas was not designed to handle. The layout of the most complex atlases includes custom reference grids, multiple data frames, multiple map series, and up to 250 maps. All of these functional requirements are at the extreme edge of the capabilities of PLTS MPS-Atlas. This document outlines the setup of an atlas using PLTS MPS-Atlas to meet these requirements

  2. Silicon Strip Detectors for the ATLAS sLHC Upgrade

    CERN Document Server

    Miñano, M; The ATLAS collaboration

    2011-01-01

    While the Large Hadron Collider (LHC) at CERN is continuing to deliver an ever-increasing luminosity to the experiments, plans for an upgraded machine called Super-LHC (sLHC) are progressing. The upgrade is foreseen to increase the LHC design luminosity by a factor ten. The ATLAS experiment will need to build a new tracker for sLHC operation, which needs to be suited to the harsh sLHC conditions in terms of particle rates. In order to cope with the increase in pile-up backgrounds at the higher luminosity, an all silicon detector is being designed. To successfully face the increased radiation dose, a new generation of extremely radiation hard silicon detectors is being designed. The left part of figure 1 shows the simulated layout for the ATLAS tracker upgrade to be installed in the volume taken up by the current ATLAS pixel, strip and transition radiation detectors. Silicon sensors with sufficient radiation hardness are the subject of an international R&D programme, working on pixel and strip sensors. The...

  3. Probabilistic studies of accident sequences

    International Nuclear Information System (INIS)

    Villemeur, A.; Berger, J.P.

    1986-01-01

    For several years, Electricite de France has carried out probabilistic assessment of accident sequences for nuclear power plants. In the framework of this program many methods were developed. As the interest in these studies was increasing and as adapted methods were developed, Electricite de France has undertaken a probabilistic safety assessment of a nuclear power plant [fr

  4. MRI-based treatment planning with pseudo CT generated through atlas registration

    International Nuclear Information System (INIS)

    Uh, Jinsoo; Merchant, Thomas E.; Hua, Chiaho; Li, Yimei; Li, Xingyu

    2014-01-01

    Purpose: To evaluate the feasibility and accuracy of magnetic resonance imaging (MRI)-based treatment planning using pseudo CTs generated through atlas registration. Methods: A pseudo CT, providing electron density information for dose calculation, was generated by deforming atlas CT images previously acquired on other patients. The authors tested 4 schemes of synthesizing a pseudo CT from single or multiple deformed atlas images: use of a single arbitrarily selected atlas, arithmetic mean process using 6 atlases, and pattern recognition with Gaussian process (PRGP) using 6 or 12 atlases. The required deformation for atlas CT images was derived from a nonlinear registration of conjugated atlas MR images to that of the patient of interest. The contrasts of atlas MR images were adjusted by histogram matching to reduce the effect of different sets of acquisition parameters. For comparison, the authors also tested a simple scheme assigning the Hounsfield unit of water to the entire patient volume. All pseudo CT generating schemes were applied to 14 patients with common pediatric brain tumors. The image similarity of real patient-specific CT and pseudo CTs constructed by different schemes was compared. Differences in computation times were also calculated. The real CT in the treatment planning system was replaced with the pseudo CT, and the dose distribution was recalculated to determine the difference. Results: The atlas approach generally performed better than assigning a bulk CT number to the entire patient volume. Comparing atlas-based schemes, those using multiple atlases outperformed the single atlas scheme. For multiple atlas schemes, the pseudo CTs were similar to the real CTs (correlation coefficient, 0.787–0.819). The calculated dose distribution was in close agreement with the original dose. Nearly the entire patient volume (98.3%–98.7%) satisfied the criteria of chi-evaluation (<2% maximum dose and 2 mm range). The dose to 95% of the volume and the
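The histogram-matching step described above (adjusting atlas MR contrast to the patient's MR before registration) can be sketched with a small NumPy implementation; the intensity arrays are synthetic and the paper's actual preprocessing may differ:

```python
import numpy as np

def match_histograms(source, reference):
    """Remap source intensities so their empirical CDF matches the reference's."""
    s_vals, s_inv, s_counts = np.unique(
        source.ravel(), return_inverse=True, return_counts=True)
    r_vals, r_counts = np.unique(reference.ravel(), return_counts=True)
    s_cdf = np.cumsum(s_counts) / source.size
    r_cdf = np.cumsum(r_counts) / reference.size
    # For each source quantile, take the reference intensity at that quantile.
    matched_vals = np.interp(s_cdf, r_cdf, r_vals)
    return matched_vals[s_inv].reshape(source.shape)

rng = np.random.default_rng(1)
atlas_mr = rng.normal(100, 20, (64, 64))    # hypothetical atlas MR intensities
patient_mr = rng.normal(140, 35, (64, 64))  # hypothetical patient MR intensities
adjusted = match_histograms(atlas_mr, patient_mr)
print(adjusted.mean(), adjusted.std())
```

After matching, the adjusted atlas image inherits the patient image's intensity distribution, which makes the subsequent nonlinear MR-to-MR registration less sensitive to differing acquisition parameters.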

  5. MRI-based treatment planning with pseudo CT generated through atlas registration

    Energy Technology Data Exchange (ETDEWEB)

    Uh, Jinsoo, E-mail: jinsoo.uh@stjude.org; Merchant, Thomas E.; Hua, Chiaho [Department of Radiological Sciences, St. Jude Children's Research Hospital, Memphis, Tennessee 38105 (United States); Li, Yimei; Li, Xingyu [Department of Biostatistics, St. Jude Children's Research Hospital, Memphis, Tennessee 38105 (United States)

    2014-05-15

    Purpose: To evaluate the feasibility and accuracy of magnetic resonance imaging (MRI)-based treatment planning using pseudo CTs generated through atlas registration. Methods: A pseudo CT, providing electron density information for dose calculation, was generated by deforming atlas CT images previously acquired on other patients. The authors tested 4 schemes of synthesizing a pseudo CT from single or multiple deformed atlas images: use of a single arbitrarily selected atlas, arithmetic mean process using 6 atlases, and pattern recognition with Gaussian process (PRGP) using 6 or 12 atlases. The required deformation for atlas CT images was derived from a nonlinear registration of conjugated atlas MR images to that of the patient of interest. The contrasts of atlas MR images were adjusted by histogram matching to reduce the effect of different sets of acquisition parameters. For comparison, the authors also tested a simple scheme assigning the Hounsfield unit of water to the entire patient volume. All pseudo CT generating schemes were applied to 14 patients with common pediatric brain tumors. The image similarity of real patient-specific CT and pseudo CTs constructed by different schemes was compared. Differences in computation times were also calculated. The real CT in the treatment planning system was replaced with the pseudo CT, and the dose distribution was recalculated to determine the difference. Results: The atlas approach generally performed better than assigning a bulk CT number to the entire patient volume. Comparing atlas-based schemes, those using multiple atlases outperformed the single atlas scheme. For multiple atlas schemes, the pseudo CTs were similar to the real CTs (correlation coefficient, 0.787–0.819). The calculated dose distribution was in close agreement with the original dose. Nearly the entire patient volume (98.3%–98.7%) satisfied the criteria of chi-evaluation (<2% maximum dose and 2 mm range). The dose to 95% of the volume and the
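
Of the fusion schemes above, the arithmetic-mean process is the simplest to illustrate. The sketch below assumes the atlas CTs have already been deformed onto the patient's MR geometry; the array sizes and HU values are invented for the example, not taken from the study.

```python
import numpy as np

def mean_pseudo_ct(deformed_atlas_cts):
    """Fuse deformed atlas CTs into a pseudo CT by a voxel-wise arithmetic
    mean, as in the arithmetic-mean scheme described above. Each input is a
    Hounsfield-unit volume already registered to the patient's MR geometry
    (the nonlinear registration step is assumed to have been done)."""
    return np.stack(deformed_atlas_cts, axis=0).mean(axis=0)

# Six toy atlases on a tiny 2x2x2 grid (HU values invented for illustration).
atlases = [np.full((2, 2, 2), hu, dtype=float) for hu in (-20, -10, 0, 10, 20, 30)]
pseudo_ct = mean_pseudo_ct(atlases)
# Every voxel averages the six HU values: (-20 - 10 + 0 + 10 + 20 + 30) / 6 = 5.0
```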

  6. Convex sets in probabilistic normed spaces

    International Nuclear Information System (INIS)

    Aghajani, Asadollah; Nourouzi, Kourosh

    2008-01-01

    In this paper we obtain some results on convexity in a probabilistic normed space. We also investigate the concept of CSN-closedness and CSN-compactness in a probabilistic normed space and generalize the corresponding results of normed spaces

  7. Wind Atlas for Egypt

    DEFF Research Database (Denmark)

    The results of a comprehensive, 8-year wind resource assessment programme in Egypt are presented. The objective has been to provide reliable and accurate wind atlas data sets for evaluating the potential wind power output from large electricity-producing wind turbine installations. The regional wind...... climates of Egypt have been determined by two independent methods: a traditional wind atlas based on observations from more than 30 stations all over Egypt, and a numerical wind atlas based on long-term reanalysis data and a mesoscale model (KAMM). The mean absolute error comparing the two methods is about...... 10% for two large-scale KAMM domains covering all of Egypt, and typically about 5% for several smaller-scale regional domains. The numerical wind atlas covers all of Egypt, whereas the meteorological stations are concentrated in six regions. The Wind Atlas for Egypt represents a significant step...

  8. Wind Atlas for Egypt

    DEFF Research Database (Denmark)

    Mortensen, Niels Gylling; Said Said, Usama; Badger, Jake

    2006-01-01

    The results of a comprehensive, 8-year wind resource assessment programme in Egypt are presented. The objective has been to provide reliable and accurate wind atlas data sets for evaluating the potential wind power output from large electricity-producing wind turbine installations. The regional wind...... climates of Egypt have been determined by two independent methods: a traditional wind atlas based on observations from more than 30 stations all over Egypt, and a numerical wind atlas based on long-term reanalysis data and a mesoscale model (KAMM). The mean absolute error comparing the two methods is about...... 10% for two large-scale KAMM domains covering all of Egypt, and typically about 5% for several smaller-scale regional domains. The numerical wind atlas covers all of Egypt, whereas the meteorological stations are concentrated in six regions. The Wind Atlas for Egypt represents a significant step...

  9. Accelerator complex for a radioactive ion beam facility at ATLAS

    International Nuclear Information System (INIS)

    Nolen, J.A.

    1995-01-01

    Since the superconducting heavy ion linac ATLAS is an ideal post-accelerator for radioactive beams, plans are being developed for expansion of the facility with the addition of a driver accelerator, a production target/ion source combination, and a low q/m pre-accelerator for radioactive ions. A working group including staff from the ANL Physics Division and current ATLAS users are preparing a radioactive beam facility proposal. The present paper reviews the specifications of the accelerators required for the facility

  10. Grammaticality, Acceptability, and Probability: A Probabilistic View of Linguistic Knowledge.

    Science.gov (United States)

    Lau, Jey Han; Clark, Alexander; Lappin, Shalom

    2017-07-01

    The question of whether humans represent grammatical knowledge as a binary condition on membership in a set of well-formed sentences, or as a probabilistic property, has been the subject of debate among linguists, psychologists, and cognitive scientists for many decades. Acceptability judgments present a serious problem for both classical binary and probabilistic theories of grammaticality. These judgments are gradient in nature, and so cannot be directly accommodated in a binary formal grammar. However, it is also not possible to simply reduce acceptability to probability. The acceptability of a sentence is not the same as the likelihood of its occurrence, which is, in part, determined by factors like sentence length and lexical frequency. In this paper, we present the results of a set of large-scale experiments using crowd-sourced acceptability judgments that demonstrate gradience to be a pervasive feature in acceptability judgments. We then show how one can predict acceptability judgments on the basis of probability by augmenting probabilistic language models with an acceptability measure. This is a function that normalizes probability values to eliminate the confounding factors of length and lexical frequency. We describe a sequence of modeling experiments with unsupervised language models drawn from state-of-the-art machine learning methods in natural language processing. Several of these models achieve very encouraging levels of accuracy in the acceptability prediction task, as measured by the correlation between the acceptability measure scores and mean human acceptability values. We consider the relevance of these results to the debate on the nature of grammatical competence, and we argue that they support the view that linguistic knowledge can be intrinsically probabilistic. Copyright © 2016 Cognitive Science Society, Inc.
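
One measure of the kind the abstract describes, normalizing a language-model log probability to remove the length and lexical-frequency confounds, is the syntactic log-odds ratio (SLOR). The numbers below are an invented toy example, not figures from the paper.

```python
def slor(logprob_sentence, unigram_logprobs, length):
    """Syntactic log-odds ratio:
    (log P_model(s) - log P_unigram(s)) / |s|.

    Subtracting the summed unigram log probability removes the
    lexical-frequency confound; dividing by sentence length removes the
    length confound.
    """
    return (logprob_sentence - sum(unigram_logprobs)) / length

# Toy numbers (invented): a 4-word sentence scored log P = -10 by the model,
# whose words have unigram log probabilities summing to -18.
score = slor(-10.0, [-4.0, -5.0, -4.0, -5.0], 4)
# (-10 - (-18)) / 4 = 2.0
```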

  11. Recent ATLAS Articles on WLAP

    CERN Multimedia

    Goldfarb, S.

    As reported in the September 2004 ATLAS eNews, the Web Lecture Archive Project is a system for the archiving and publishing of multimedia presentations, using the Web as the medium. We list here newly available WLAP items relating to ATLAS: June ATLAS Plenary Meeting Tutorial on Physics EDM and Tools (June) Freiburg Overview Week Ketevi Assamagan's Tutorial on Analysis Tools Click here to browse WLAP for all ATLAS lectures.

  12. Probabilistic risk assessment course documentation. Volume 2. Probability and statistics for PRA applications

    International Nuclear Information System (INIS)

    Iman, R.L.; Prairie, R.R.; Cramond, W.R.

    1985-08-01

    This course is intended to provide the necessary probabilistic and statistical skills to perform a PRA. Fundamental background information is reviewed, but the principal purpose is to address specific techniques used in PRAs and to illustrate them with applications. Specific examples and problems are presented for most of the topics

  13. Feature extraction through parallel Probabilistic Principal Component Analysis for heart disease diagnosis

    Science.gov (United States)

    Shah, Syed Muhammad Saqlain; Batool, Safeera; Khan, Imran; Ashraf, Muhammad Usman; Abbas, Syed Hussnain; Hussain, Syed Adnan

    2017-09-01

    Automatic diagnosis of human diseases is mostly achieved through decision support systems. The performance of these systems depends mainly on the selection of the most relevant features. This becomes harder when the dataset contains missing values for some of the features. Probabilistic Principal Component Analysis (PPCA) has a reputation for dealing with missing attribute values. This research presents a methodology which uses the results of medical tests as input, extracts a reduced-dimensional feature subset, and provides a diagnosis of heart disease. The proposed methodology extracts high-impact features in a new projection by using Probabilistic Principal Component Analysis (PPCA). PPCA extracts the projection vectors which contribute the highest covariance, and these projection vectors are used to reduce the feature dimension. The selection of projection vectors is done through Parallel Analysis (PA). The feature subset with the reduced dimension is provided to radial basis function (RBF) kernel based Support Vector Machines (SVM). The RBF-based SVM serves the purpose of classification into two categories, i.e., Heart Patient (HP) and Normal Subject (NS). The proposed methodology is evaluated through accuracy, specificity and sensitivity over three UCI datasets, i.e., Cleveland, Switzerland and Hungarian. The statistical results achieved through the proposed technique are presented in comparison to the existing research, showing its impact. The proposed technique achieved an accuracy of 82.18%, 85.82% and 91.30% for the Cleveland, Hungarian and Switzerland datasets, respectively.
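
A minimal sketch of the covariance-eigenvector step: plain PCA on the sample covariance is used here as a stand-in for PPCA (which additionally models noise and missing values), and the data shapes and variance scale are illustrative assumptions.

```python
import numpy as np

def top_projection_vectors(data, k):
    """Return the k eigenvectors of the sample covariance with the largest
    eigenvalues, i.e. the projection vectors contributing the highest
    covariance (plain PCA, standing in for the PPCA step)."""
    cov = np.cov(data, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigh returns ascending order
    return eigvecs[:, ::-1][:, :k]           # reorder to top-k descending

# Illustrative data: 200 samples in 5 dimensions with one dominant direction.
rng = np.random.default_rng(0)
base = rng.standard_normal((200, 1))
X = base @ np.ones((1, 5)) * 4.0 + rng.standard_normal((200, 5)) * 0.1
W = top_projection_vectors(X, 1)   # projection vectors (5 x 1)
reduced = X @ W                    # reduced-dimensional feature subset (200 x 1)
```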

  14. Teacher learning about probabilistic reasoning in relation to teaching it in an Advanced Certificate in Education (ACE) programme

    Directory of Open Access Journals (Sweden)

    Faaiz Gierdien

    2008-02-01

    I report on what teachers in an Advanced Certificate in Education (ACE) in-service programme learned about probabilistic reasoning in relation to teaching it. I worked 'on the inside', using my practice as a site for studying teaching and learning. The teachers were from three different towns in the Northern Cape province and had limited teaching contact time, as is the nature of ACE programmes. Findings revealed a complicated picture, where some teachers were prepared to consider the influence of their intuitive probabilistic reasoning on formal probabilistic reasoning when it came to teaching. The 'genuineness' of teacher learning, however, was the issue the findings had to address. A speculative, hopeful strategy for effecting teacher learning in mathematics teacher education practice is therefore to sustain disequilibrium between dichotomies such as formal and intuitive probabilistic reasoning, which has analogies in content and pedagogy, and subject matter and method.

  15. Estimate of the neutron fields in ATLAS based on ATLAS-MPX detectors data

    International Nuclear Information System (INIS)

    Bouchami, J; Dallaire, F; Gutierrez, A; Idarraga, J; Leroy, C; Picard, S; Scallon, O; Kral, V; PospIsil, S; Solc, J; Suk, M; Turecek, D; Vykydal, Z; Zemlieka, J

    2011-01-01

    The ATLAS-MPX detectors are based on Medipix2 silicon devices designed by CERN for the detection of different types of radiation. These detectors are covered with converting layers of 6LiF and polyethylene (PE) to increase their sensitivity to thermal and fast neutrons, respectively. These devices allow the measurement of the composition and spectroscopic characteristics of the radiation field in ATLAS, particularly of neutrons. These detectors can operate in low or high preset energy threshold mode. The signatures of particles interacting in an ATLAS-MPX detector at low threshold are clusters of adjacent pixels, with size and form depending on the particle type, energy and incidence angle. The classification of particles into different categories can be done using the geometrical parameters of these clusters. The Medipix analysis framework (MAFalda), based on the ROOT application, allows the recognition of particle tracks left in ATLAS-MPX devices located at various positions in the ATLAS detector and cavern. The pattern recognition obtained from the application of MAFalda was configured to distinguish the response of neutrons from other radiation. The neutron response at low threshold is characterized by clusters of adjoining pixels (heavy tracks and heavy blobs) left by protons and heavy ions resulting from neutron interactions in the converting layers of the ATLAS-MPX devices. The neutron detection efficiency of ATLAS-MPX devices has been determined by exposing two reference detectors to radionuclide neutron sources (252Cf and 241AmBe). With these results, an estimate of the neutron fields produced at the device locations during ATLAS operation was made.
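
The geometric cluster classification described can be sketched as follows. The category names echo the abstract's "heavy tracks" and "heavy blobs", but the thresholds and the elongation test are invented for illustration and are not MAFalda's actual rules.

```python
import numpy as np

def classify_cluster(pixels):
    """Toy geometric classification of a Medipix pixel cluster.

    `pixels` is an (n, 2) array of hit coordinates. The categories and
    thresholds here are illustrative assumptions, not MAFalda's rules."""
    n = len(pixels)
    if n == 1:
        return "dot"
    # Elongation from the eigenvalues of the coordinate covariance matrix.
    cov = np.cov(np.asarray(pixels, float), rowvar=False)
    eigs = np.linalg.eigvalsh(cov)                      # ascending order
    elongated = eigs[-1] > 10 * max(eigs[0], 1e-9)      # invented threshold
    if n >= 4 and not elongated:
        return "heavy blob"    # large round cluster: proton/heavy-ion-like
    return "heavy track" if elongated else "small blob"

straight = np.array([[0, i] for i in range(8)])            # 8 collinear hits
round_blob = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])    # 2x2 square
# straight -> "heavy track"; round_blob -> "heavy blob"
```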

  16. STEAM Education and Communication with Art at ATLAS and CMS

    CERN Document Server

    Paolucci, Pierluigi; Hoch, Michael; Adam-Bourdarios, Claire

    2016-01-01

    Recent developments in science education policy and practice suggest that successful learning in the 21st century requires horizontal connectedness across areas of knowledge, linking the arts and humanities with science, technology, engineering and mathematics (STEM) subjects. The rapidly growing STEAM movement calls for arts integration into science teaching and learning to help school students develop skills that are necessary to thrive in an innovation economy. Education and outreach in high-energy physics are no exception to these developments. In this talk, I will describe specific education and outreach initiatives by the ATLAS and CMS collaborations that use a cross-disciplinary approach to engaging the public, and especially young people, not only with the excitement of scientific research in particle physics but also with its positive technological and social externalities.

  17. Probabilistic reasoning with graphical security models

    NARCIS (Netherlands)

    Kordy, Barbara; Pouly, Marc; Schweitzer, Patrick

    This work provides a computational framework for meaningful probabilistic evaluation of attack–defense scenarios involving dependent actions. We combine the graphical security modeling technique of attack–defense trees with probabilistic information expressed in terms of Bayesian networks. In order

  18. Static magnetic forces and torques in ATLAS

    International Nuclear Information System (INIS)

    Morozov, N.A.; Samsonov, E.V.; Vorozhtsov, S.B.

    1998-01-01

    The magnetic forces acting on the various metallic objects around the ATLAS detector are the subject of this paper. A system designer could use the information on the global forces and torques acting on various components, obtained in this report, to optimize them. The results of the force calculations could also serve as additional criteria for replacing the baseline magnetic material of various structures with nonmagnetic ones

  19. Development of probabilistic seismic hazard analysis for international sites, challenges and guidelines

    Energy Technology Data Exchange (ETDEWEB)

    Fernandez Ares, Antonio, E-mail: antonio.fernandez@rizzoassoc.com [Paul C. Rizzo Associates, Inc., 500 Penn Center Boulevard, Penn Center East, Suite 100, Pittsburgh, PA 15235 (United States); Fatehi, Ali, E-mail: ali.fatehi@rizzoassoc.com [Paul C. Rizzo Associates, Inc., 500 Penn Center Boulevard, Penn Center East, Suite 100, Pittsburgh, PA 15235 (United States)

    2013-06-15

    Research highlights: ► Site-specific seismic hazard study and suggestions for overcoming those challenges that are inherent to the significant amounts of epistemic uncertainty for sites at remote locations. ► Main aspects of probabilistic seismic hazard analysis (PSHA). ► Regional and site geology in the context of a probabilistic seismic hazard analysis (PSHA), including state-of-the-art ground motion estimation methods and geophysical conditions. ► The Senior Seismic Hazard Analysis Committee (SSHAC) process as a means to incorporate the opinions and contributions of the informed scientific community. -- Abstract: This article provides guidance for conducting a site-specific seismic hazard study, giving suggestions for overcoming the challenges inherent to the significant amounts of epistemic uncertainty for sites at remote locations. The text follows the general process of a seismic hazard study, describing both the deterministic and probabilistic approaches. Key and controversial items are identified in the areas of recorded seismicity, seismic sources, magnitude, ground motion models, and local site effects. A case history corresponding to a seismic hazard study in the Middle East for a greenfield site in a remote location is incorporated alongside the development of the recommendations. Other examples of analysis case histories throughout the world are presented as well.

  20. Monitored Drift Chambers in the ATLAS Detector

    CERN Multimedia

    Herten, G

    Monitored Drift Tube (MDT) chambers are used in the ATLAS detector to measure the momentum of high-energy muons. They consist of drift tubes, which are filled with an Ar-CO2 gas mixture at 3 bar gas pressure. About 1200 drift chambers are required for ATLAS. They are up to 6 m long; nevertheless, the position of every wire needs to be known with a precision of 20 µm within a chamber. In addition, optical alignment sensors are required to measure the relative position of adjacent chambers with a precision of 30 µm. This gigantic task seems impossible at first. Indeed, it took many years of R&D to invent the right tools and methods before the first chamber could be built according to specifications. Today, at the time when 50% of the chambers have been produced, we are confident that the goal for ATLAS can be reached. The mechanical precision of the chambers could be verified with the X-ray tomograph at CERN. This ingenious device, developed for the MDT system, is able to measure the wire position insid...

  1. ATLAS TDAQ System Administration: an overview and evolution

    CERN Document Server

    LEE, CJ; The ATLAS collaboration; BOGDANCHIKOV, A; BRASOLIN, F; CONTESCU, AC; DARLEA, G-L; KOROL, A; SCANNICCHIO, DA; TWOMEY, M; VALSAN, ML

    2013-01-01

    The ATLAS Trigger and Data Acquisition (TDAQ) system is responsible for the online processing of live data streaming from the ATLAS experiment at the Large Hadron Collider (LHC) at CERN. The system processes the direct data readout from ~100 million channels on the detector through multiple trigger levels, selecting interesting events for analysis with a factor of $10^{7}$ reduction in the data rate, with a latency of less than a few seconds. Most of the functionality is implemented on ~3000 servers composing the online farm. Due to the critical functionality of the system, a sophisticated computing environment is maintained, covering the online farm and ATLAS control rooms, as well as a number of development and testing labs. The specificity of the system required the development of dedicated applications (e.g. ConfDB, BWM) for system configuration and maintenance; in parallel, other Open Source tools (Puppet and Quattor) are used to centrally configure the operating systems. The health monitoring of the TDAQ s...

  2. ATLAS TDAQ System Administration: an overview and evolution

    CERN Document Server

    LEE, CJ; The ATLAS collaboration; BOGDANCHIKOV, A; BRASOLIN, F; CONTESCU, AC; DARLEA, GL; KOROL, A; SCANNICCHIO, DA; TWOMEY, M; VALSAN, ML

    2013-01-01

    The ATLAS Trigger and Data Acquisition (TDAQ) system is responsible for the online processing of live data streaming from the ATLAS experiment at the Large Hadron Collider (LHC) at CERN. The system processes the direct data readout from ~100 million channels on the detector through three trigger levels, selecting interesting events for analysis with a factor of 10^7 reduction in the data rate, with a latency of less than a few seconds. Most of the functionality is implemented on ~3000 servers composing the online farm. Due to the critical functionality of the system, a sophisticated computing environment is maintained, covering the online farm and ATLAS control rooms, as well as a number of development and testing labs. The specificity of the system required the development of dedicated applications (e.g. ConfDB, BWM) for system configuration and maintenance; in parallel, other Open Source tools (Puppet and Quattor) are used to centrally configure the operating systems. The health monitoring of the TDAQ system h...

  3. Large Scale Software Building with CMake in ATLAS

    CERN Document Server

    Elmsheuser, Johannes; The ATLAS collaboration; Obreshkov, Emil; Undrus, Alexander

    2016-01-01

    The offline software of the ATLAS experiment at the LHC (Large Hadron Collider) serves as the platform for detector data reconstruction, simulation and analysis. It is also used in the detector trigger system to select LHC collision events during data taking. ATLAS offline software consists of several million lines of C++ and Python code organized in a modular design of more than 2000 specialized packages. Because of different workflows many stable numbered releases are in parallel production use. To accommodate specific workflow requests, software patches with modified libraries are distributed on top of existing software releases on a daily basis. The different ATLAS software applications require a flexible build system that strongly supports unit and integration tests. Within the last year this build system was migrated to CMake. A CMake configuration has been developed that allows one to easily set up and build the mentioned software packages. This also makes it possible to develop and test new and modifi...

  4. Large scale software building with CMake in ATLAS

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00218447; The ATLAS collaboration; Elmsheuser, Johannes; Obreshkov, Emil; Undrus, Alexander

    2017-01-01

    The offline software of the ATLAS experiment at the LHC (Large Hadron Collider) serves as the platform for detector data reconstruction, simulation and analysis. It is also used in the detector trigger system to select LHC collision events during data taking. ATLAS offline software consists of several million lines of C++ and Python code organized in a modular design of more than 2000 specialized packages. Because of different workflows many stable numbered releases are in parallel production use. To accommodate specific workflow requests, software patches with modified libraries are distributed on top of existing software releases on a daily basis. The different ATLAS software applications require a flexible build system that strongly supports unit and integration tests. Within the last year this build system was migrated to CMake. A CMake configuration has been developed that allows one to easily set up and build the mentioned software packages. This also makes it possible to develop and test new and modifi...

  5. Managing demand uncertainty: probabilistic selling versus inventory substitution

    OpenAIRE

    Zhang, Y.; Hua, Guowei; Wang, Shouyang; Zhang, Juliang; Fernández Alarcón, Vicenç

    2018-01-01

    Demand variability prevails in today's rapidly changing business environment, which makes it difficult for a retailer that sells multiple substitutable products to determine the optimal inventory. To combat demand uncertainty, the strategies of inventory substitution and probabilistic selling can both be used. Although the two strategies differ in operation, we believe that they share a common feature in combating demand uncertainty by encouraging some customers to give up some specific ...

  6. WE-E-213CD-02: Gaussian Weighted Multi-Atlas Based Segmentation for Head and Neck Radiotherapy Planning.

    Science.gov (United States)

    Peroni, M; Sharp, G C; Golland, P; Baroni, G

    2012-06-01

    To develop a multi-atlas segmentation strategy for IMRT head and neck therapy planning. The method was tested on thirty-one head and neck simulation CTs, without demographic or pathology pre-clustering. We compare Fixed Number (FN) and Thresholding (TH) selection (based on normalized mutual information ranking) of the atlases to be included for current patient segmentation. The next step is a pairwise demons Deformable Registration (DR) onto the current patient CT. DR was extended to automatically compensate for the patient's different field of view. Propagated labels are combined according to a Gaussian Weighted (GW) fusion rule, adapted to poor soft-tissue contrast. Agreement with manual segmentation was quantified in terms of the Dice Similarity Coefficient (DSC). Selection methods, the number of atlases used, as well as GW, average and majority-voting fusion were discriminated by means of the Friedman test (α = 5%). Experimental tuning of the algorithm parameters was performed on five patients, deriving an optimal configuration for each structure. DSC reduction was not significant when ten or more atlases were selected, whereas the DSC for single most-similar-atlas selection is 10% lower in median. The DSCs of the FN selection rule were significantly higher for most structures. Tubular structures may benefit from computing an average contour rather than looking at the singular voxel contribution, whereas the best performing strategy for all other structures was GW. When half the database is selected, the final median DSCs were 0.86, 0.80, 0.51, 0.81, 0.69 and 0.79 for mandible, spine, optical nerves, eyes, parotids and brainstem, respectively. We developed an efficient algorithm for multi-atlas based segmentation of planning CT volumes, based on DR and GW. FN selection of database atlases is foreseen to increase computational efficiency. The absence of clinical pre-clustering and a specific imaging protocol for database subjects makes the results closer to real clinical application. "Progetto Roberto Rocca" funded by
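
The Dice Similarity Coefficient used as the agreement measure above can be computed as follows; the two small binary masks are invented for illustration.

```python
import numpy as np

def dice(mask_a, mask_b):
    """Dice Similarity Coefficient: 2|A ∩ B| / (|A| + |B|); 1 = perfect overlap."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Two invented 4x4 binary contour masks, each covering 4 voxels.
auto = np.zeros((4, 4), int)
auto[1:3, 1:3] = 1
manual = np.zeros((4, 4), int)
manual[2:4, 1:3] = 1
# They overlap on 2 voxels, so DSC = 2 * 2 / (4 + 4) = 0.5
```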

  7. Risk management through dynamic technical specifications

    International Nuclear Information System (INIS)

    Klopp, George T.; Petersen, Thomas A.

    2004-01-01

    The wide deployment of plant-specific probabilistic risk assessments for nuclear power plants has provided the means to effect a fresh risk management perspective and a fresh, risk-based regulatory outlook on nuclear power. There has been a great deal of conversation on risk-based regulation within the U.S. nuclear power industry but, curiously, very little on effective risk management. This paper proposes a means to link the two subjects through the plant Technical Specifications. A revised concept for Technical Specifications is suggested which is based on deterministic analyses and probabilistic risk assessments for each plant. The revised Technical Specifications would consider, on a real-time basis, the exact state of the plant in terms of the status of key components and systems. It would depict current plant risk levels and compare those levels to the desired and limiting (alert/action) levels. It would advise the plant operator on the risk impact of proposed actions through a simple query system and illustrate the impact of such actions on plant status relative to designated risk values. The basis for the proposed approach lies in realistic deterministic plant analyses and probabilistic risk assessment (PRA) deployment tools being developed, in parallel, by a number of parties in the U.S. today. These PRAs are based primarily on the existing plant responses to Generic Letter 88-20, 'Individual Plant Examinations' (IPEs). Each of these tools allows the plant operator to input, on a real-time basis, the status of key equipment and systems. The tools then provide explicit illustrations of dependency effects; updated, 'real-time' risk status indications such as core damage frequency; and, in some cases, allow the operator to assess the risk impact of removing selected components from service for maintenance or testing. These systems generally operate on personal computers and provide nearly instantaneous responses to plant queries. Moving from these tools to

  8. ATLAS@Home looks for CERN volunteers

    CERN Multimedia

    Rosaria Marraffino

    2014-01-01

    ATLAS@Home is a CERN volunteer computing project that runs simulated ATLAS events. As the project ramps up, the project team is looking for CERN volunteers to test the system before planning a bigger promotion for the public.   The ATLAS@home outreach website. ATLAS@Home is a large-scale research project that runs ATLAS experiment simulation software inside virtual machines hosted by volunteer computers. “People from all over the world offer up their computers’ idle time to run simulation programmes to help physicists extract information from the large amount of data collected by the detector,” explains Claire Adam Bourdarios of the ATLAS@Home project. “The ATLAS@Home project aims to extrapolate the Standard Model at a higher energy and explore what new physics may look like. Everything we’re currently running is preparation for next year's run.” ATLAS@Home became an official BOINC (Berkeley Open Infrastructure for Network ...

  9. Prospective Randomized Double-Blind Pilot Study of Site-Specific Consensus Atlas Implementation for Rectal Cancer Target Volume Delineation in the Cooperative Group Setting

    International Nuclear Information System (INIS)

    Fuller, Clifton D.; Nijkamp, Jasper; Duppen, Joop C.; Rasch, Coen R.N.; Thomas, Charles R.; Wang, Samuel J.; Okunieff, Paul; Jones, William E.; Baseman, Daniel; Patel, Shilpen; Demandante, Carlo G.N.; Harris, Anna M.; Smith, Benjamin D.; Katz, Alan W.; McGann, Camille

    2011-01-01

    Purpose: Variations in target volume delineation represent a significant hurdle in clinical trials involving conformal radiotherapy. We sought to determine the effect of a consensus guideline-based visual atlas on contouring the target volumes. Methods and Materials: A representative case was contoured (Scan 1) by 14 physician observers and a reference expert with and without target volume delineation instructions derived from a proposed rectal cancer clinical trial involving conformal radiotherapy. The gross tumor volume (GTV), and two clinical target volumes (CTVA, including the internal iliac, presacral, and perirectal nodes, and CTVB, which included the external iliac nodes) were contoured. The observers were randomly assigned to receipt (Group A) or nonreceipt (Group B) of a consensus guideline and atlas for anorectal cancers and then instructed to recontour the same case/images (Scan 2). Observer variation was analyzed volumetrically using the conformation number (CN, where CN = 1 equals total agreement). Results: Of 14 evaluable contour sets (1 expert and 7 Group A and 6 Group B observers), greater agreement was found for the GTV (mean CN, 0.75) than for the CTVs (mean CN, 0.46-0.65). Atlas exposure for Group A led to significantly increased interobserver agreement for CTVA (mean initial CN, 0.68, after atlas use, 0.76; p = .03) and increased agreement with the expert reference (initial mean CN, 0.58; after atlas use, 0.69; p = .02). For the GTV and CTVB, neither the interobserver nor the expert agreement was altered after atlas exposure. Conclusion: Consensus guideline atlas implementation resulted in a detectable difference in interobserver agreement and a greater approximation of expert volumes for the CTVA but not for the GTV or CTVB in the specified case. Visual atlas inclusion should be considered as a feature in future clinical trials incorporating conformal RT.
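
The conformation number (CN) used above to quantify volumetric agreement can be sketched for a pair of contours as below. This is one common pairwise form (the product of the two overlap fractions); the study's exact multi-observer variant may differ, and the masks are invented for illustration.

```python
import numpy as np

def conformation_number(mask_a, mask_b):
    """Conformation number for two contours:
    CN = (|A ∩ B| / |A|) * (|A ∩ B| / |B|), so CN = 1 means total agreement.
    (A common pairwise form, used here as an illustrative assumption.)"""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    inter = np.logical_and(a, b).sum()
    return (inter / a.sum()) * (inter / b.sum())

# Two invented 5x5 observer contours, each covering 9 voxels.
obs1 = np.zeros((5, 5), int)
obs1[0:3, 0:3] = 1
obs2 = np.zeros((5, 5), int)
obs2[1:4, 0:3] = 1
# Intersection = 6 voxels, so CN = (6/9) * (6/9) = 4/9 ≈ 0.444
```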

  10. Prospective randomized double-blind pilot study of site-specific consensus atlas implementation for rectal cancer target volume delineation in the cooperative group setting

    Science.gov (United States)

    Fuller, Clifton D.; Nijkamp, Jasper; Duppen, Joop; Rasch, Coen R.N.; Thomas, Charles R.; Wang, Samuel J.; Okunieff, Paul; Jones, William E.; Baseman, Daniel; Patel, Shilpen; Demandante, Carlo G. N.; Harris, Anna M.; Smith, Benjamin D.; Katz, Alan W.; McGann, Camille; Harper, Jennifer L.; Chang, Daniel T.; Smalley, Stephen; Marshall, David T.; Goodman, Karyn A.; Papanikolaou, Niko; Kachnic, Lisa A.

    2010-01-01

    Purpose Variation in target volume delineation represents a significant hurdle in clinical trials involving conformal radiotherapy. We sought to determine the impact of a consensus guideline-based visual atlas on contouring of target volumes. Methods A representative case and target volume delineation instructions derived from a proposed rectal cancer clinical trial involving conformal radiotherapy were contoured (Scan1) by 14 physician observers and a reference expert. Gross tumor volume (GTV), and 2 clinical target volumes (CTVA, comprising internal iliac, pre-sacral, and peri-rectal nodes, and CTVB, external iliac nodes) were contoured. Observers were randomly assigned to receipt (Group_A)/non-receipt (Group_B) of a consensus guideline and atlas for anorectal cancers, then instructed to re-contour the same case/images (Scan2). Observer variation was analyzed volumetrically using conformation number (CN, where CN=1 equals total agreement). Results In 14 evaluable contour sets (1 expert, 7 Group_A, 6 Group_B), there was greater agreement for GTV (mean CN 0.75) than CTVs (mean CN 0.46–0.65). Atlas exposure for Group_A led to a significant increase in inter-observer agreement for CTVA (mean initial CN 0.68, post-atlas 0.76; p=0.03), as well as increased agreement with the expert reference (initial mean CN 0.58, post-atlas 0.69; p=0.02). For GTV and CTVB, neither inter-observer nor expert agreement was altered after atlas exposure. Conclusion Consensus guideline atlas implementation resulted in a detectable difference in inter-observer agreement and greater approximation of expert volumes for CTVA, but not GTV or CTVB, in the specified case. Visual atlas inclusion should be considered as a feature in future clinical trials incorporating conformal radiotherapy. PMID:20400244
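
    A small sketch may clarify the conformation number (CN) used in this record. The formula below follows the van't Riet definition common in radiotherapy (an assumption here, since the abstract does not reproduce the exact formula), applied to two hypothetical binary contour masks:

```python
import numpy as np

def conformation_number(a, b):
    """CN = |A intersect B|^2 / (|A| * |B|) for two binary masks.

    This is the van't Riet form; it returns 1.0 only when two non-empty
    masks coincide exactly.
    """
    a = np.asarray(a, dtype=bool)
    b = np.asarray(b, dtype=bool)
    inter = np.logical_and(a, b).sum()
    return float(inter) ** 2 / (float(a.sum()) * float(b.sum()))

# Two hypothetical observers contouring the same 10x10x10 volume.
obs1 = np.zeros((10, 10, 10), dtype=bool)
obs2 = np.zeros((10, 10, 10), dtype=bool)
obs1[2:8, 2:8, 2:8] = True   # a 6x6x6 cube
obs2[3:9, 3:9, 3:9] = True   # same-sized cube shifted by one voxel

cn = conformation_number(obs1, obs2)   # partial agreement: 0 < cn < 1
```

    Any mismatch in position or shape pulls the measure below 1, matching the abstract's reading of CN=1 as total agreement.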

  11. Consideration of aging in probabilistic safety assessment

    International Nuclear Information System (INIS)

    Titina, B.; Cepin, M.

    2007-01-01

    Probabilistic safety assessment is a standardised tool for assessing the safety of nuclear power plants and a complement to deterministic safety analyses. Standard probabilistic models of safety equipment assume a constant component failure rate. Ageing of systems, structures and components can be included in an age-dependent probabilistic safety assessment, which in general makes the failure rate a function of age. New age-dependent probabilistic safety assessment models, which offer explicit calculation of ageing effects, are developed. Several groups of components that require their own models are considered, e.g. operating components and stand-by components. The developed component-level models are inserted into the probabilistic safety assessment models so that ageing effects can be evaluated for complete systems. The preliminary results show that the lack of data needed to consider ageing makes the models, and consequently the results, highly uncertain. (author)
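
    The abstract does not give the authors' equations; a minimal sketch of the idea, assuming a linear ageing law and the standard q ≈ λT/2 approximation for a periodically tested stand-by component, might look like:

```python
import numpy as np

# Illustrative age-dependent failure model (not the paper's): the failure
# rate grows linearly with component age t (in years).
lam0 = 1.0e-3   # failure rate per year at age zero (assumed)
a = 0.05        # ageing coefficient per year (assumed)

def failure_rate(t):
    return lam0 * (1.0 + a * t)

def standby_unavailability(t, test_interval=0.25):
    """Mean unavailability of a periodically tested stand-by component,
    via the common approximation q ~= lambda(t) * T / 2."""
    return failure_rate(t) * test_interval / 2.0

ages = np.array([0.0, 20.0, 40.0])   # component ages in years
q = standby_unavailability(ages)     # unavailability grows with age
```

    Feeding such age-dependent unavailabilities into the system-level fault trees, instead of constants, is what lets ageing effects propagate to complete-system results.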

  12. Probabilistic estimates of drought impacts on agricultural production

    Science.gov (United States)

    Madadgar, Shahrbanou; AghaKouchak, Amir; Farahmand, Alireza; Davis, Steven J.

    2017-08-01

    Increases in the severity and frequency of drought in a warming climate may negatively impact agricultural production and food security. Unlike previous studies that have estimated agricultural impacts of climate condition using single-crop yield distributions, we develop a multivariate probabilistic model that uses projected climatic conditions (e.g., precipitation amount or soil moisture) throughout a growing season to estimate the probability distribution of crop yields. We demonstrate the model by an analysis of the historical period 1980-2012, including the Millennium Drought in Australia (2001-2009). We find that precipitation and soil moisture deficit in dry growing seasons reduced the average annual yield of the five largest crops in Australia (wheat, broad beans, canola, lupine, and barley) by 25-45% relative to the wet growing seasons. Our model can thus produce region- and crop-specific agricultural sensitivities to climate conditions and variability. Probabilistic estimates of yield may help decision-makers in government and business to quantitatively assess the vulnerability of agriculture to climate variations. We develop a multivariate probabilistic model that uses precipitation to estimate the probability distribution of crop yields. The proposed model shows how the probability distribution of crop yield changes in response to droughts. During Australia's Millennium Drought precipitation and soil moisture deficit reduced the average annual yield of the five largest crops.
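
    As an illustration of conditioning yield on seasonal climate, the sketch below contrasts yield distributions in dry versus wet growing seasons on synthetic data (the data, the tercile split and the linear yield model are assumptions, not the paper's multivariate model):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic growing seasons: yield rises with seasonal precipitation
# (entirely illustrative; the paper fits observed crop/climate records).
n = 500
precip = rng.gamma(shape=4.0, scale=50.0, size=n)        # mm per season
yield_t = 0.01 * precip + rng.normal(0.0, 0.5, size=n)   # t/ha, noisy

dry = precip <= np.quantile(precip, 1 / 3)   # driest tercile of seasons
wet = precip >= np.quantile(precip, 2 / 3)   # wettest tercile

mean_dry = yield_t[dry].mean()
mean_wet = yield_t[wet].mean()
loss_pct = 100.0 * (1.0 - mean_dry / mean_wet)   # relative yield reduction
```

    Replacing the empirical split with a fitted joint distribution of climate and yield gives the full conditional yield distribution rather than just a conditional mean.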

  13. Probabilistic forecasting for extreme NO2 pollution episodes

    International Nuclear Information System (INIS)

    Aznarte, José L.

    2017-01-01

    In this study, we investigate the suitability of quantile regression for predicting extreme concentrations of NO2. In contrast to usual point forecasting, where a single value is forecast for each horizon, probabilistic forecasting through quantile regression allows prediction of the full probability distribution, which in turn allows building models specifically fit to the tails of this distribution. Using data from the city of Madrid, including NO2 concentrations as well as meteorological measurements, we build models that predict extreme NO2 concentrations, outperforming point-forecasting alternatives, and we prove that the predictions are accurate, reliable and sharp. Besides, we study the relative importance of the independent variables involved, and show how the variables important for the median quantile differ from those important for the upper quantiles. Furthermore, we present a method to compute the probability of exceedance of thresholds, which is a simple and comprehensible way to present probabilistic forecasts while maximizing their usefulness. - Highlights: • A new probabilistic forecasting system is presented to predict NO2 concentrations. • While predicting the full distribution, it also outperforms other point-forecasting models. • Forecasts show good properties and peak concentrations are properly predicted. • It forecasts the probability of exceedance of thresholds, key to decision makers. • Relative forecasting importance of the variables is obtained as a by-product.
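
    Quantile regression fits a conditional quantile by minimizing the pinball (tilted absolute) loss. A minimal sketch on synthetic heteroscedastic data (the data-generating process and the two-parameter linear model are assumptions, not the paper's NO2 models):

```python
import numpy as np
from scipy.optimize import minimize

def pinball_loss(params, x, y, q):
    """Tilted absolute loss; its minimiser is the conditional q-quantile."""
    a, b = params
    resid = y - (a + b * x)
    return np.mean(np.where(resid >= 0, q * resid, (q - 1.0) * resid))

rng = np.random.default_rng(1)
x = rng.uniform(0.0, 10.0, 1000)                      # e.g. a meteorological driver
y = 20.0 + 5.0 * x + rng.normal(0.0, 3.0 + 0.5 * x)   # heteroscedastic levels

x0 = np.polyfit(x, y, 1)[::-1]   # [intercept, slope] least-squares warm start
fits = {q: minimize(pinball_loss, x0, args=(x, y, q), method="Nelder-Mead").x
        for q in (0.5, 0.9)}

# Where the spread grows with x, the 0.9-quantile line sits above the median.
gap_at_10 = ((fits[0.9][0] + 10.0 * fits[0.9][1])
             - (fits[0.5][0] + 10.0 * fits[0.5][1]))
```

    Fitting one such model per quantile yields a discretised predictive distribution, from which exceedance probabilities for a pollution threshold can be read off.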

  14. Specification of test criteria and probabilistic approach: the case of plutonium air transport

    International Nuclear Information System (INIS)

    Hubert, P.; Pages, P.; Ringot, C.; Tomachewsky, E.

    1989-03-01

    The safety of international transportation relies on compliance with IAEA regulations, which specify a series of tests that the package must withstand. For plutonium air transport, some national regulations, notably that of the US, are more stringent than the IAEA's: for example, the drop test is to be performed at 129 m·s⁻¹ instead of 13.4 m·s⁻¹. The development of international plutonium exchanges has raised the question of the adequacy of both standards. The purpose of this paper is to show how a probabilistic approach helps in assessing the efficiency of a move towards more stringent tests.
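
    The probabilistic argument can be sketched by comparing each test speed with an assumed distribution of accident impact speeds; every number below is illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative accident impact-speed distribution for air transport.
# The lognormal form and its parameters are assumptions, not paper data.
speeds = rng.lognormal(mean=np.log(60.0), sigma=0.8, size=100_000)   # m/s

iaea_test = 13.4   # IAEA drop-test impact speed, m/s
us_test = 129.0    # more stringent US test speed, m/s

frac_above_iaea = float(np.mean(speeds > iaea_test))
frac_above_us = float(np.mean(speeds > us_test))

# The gap between the two fractions is the share of simulated accidents
# whose severity the weaker test does not demonstrably cover.
```

    Weighting that gap by accident frequency and release consequences is what turns the comparison of test criteria into a risk statement.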

  15. Branching bisimulation congruence for probabilistic systems

    NARCIS (Netherlands)

    Trcka, N.; Georgievska, S.; Aldini, A.; Baier, C.

    2008-01-01

    The notion of branching bisimulation for the alternating model of probabilistic systems is not a congruence with respect to parallel composition. In this paper we first define another branching bisimulation in the more general model allowing consecutive probabilistic transitions, and we prove that

  16. Implementation of the ATLAS trigger within the ATLAS Multi-Threaded Software Framework AthenaMT

    CERN Document Server

    Wynne, Benjamin; The ATLAS collaboration

    2016-01-01

    We present an implementation of the ATLAS High Level Trigger that provides parallel execution of trigger algorithms within the ATLAS multi-threaded software framework, AthenaMT. This development will enable the ATLAS High Level Trigger to meet future challenges due to the evolution of computing hardware and upgrades of the Large Hadron Collider, LHC, and ATLAS Detector. During the LHC data-taking period starting in 2021, luminosity will reach up to three times the original design value. Luminosity will increase further, to up to 7.5 times the design value, in 2026 following LHC and ATLAS upgrades. This includes an upgrade of the ATLAS trigger architecture that will result in an increase in the High Level Trigger input rate by a factor of 4 to 10 compared to the current maximum rate of 100 kHz. The current ATLAS multiprocess framework, AthenaMP, manages a number of processes that process events independently, executing algorithms sequentially in each process. AthenaMT will provide a fully multi-threaded env...

  17. A Subject-Specific Kinematic Model to Predict Human Motion in Exoskeleton-Assisted Gait

    Science.gov (United States)

    Torricelli, Diego; Cortés, Camilo; Lete, Nerea; Bertelsen, Álvaro; Gonzalez-Vargas, Jose E.; del-Ama, Antonio J.; Dimbwadyo, Iris; Moreno, Juan C.; Florez, Julian; Pons, Jose L.

    2018-01-01

    The relative motion between human and exoskeleton is a crucial factor that has remarkable consequences on the efficiency, reliability and safety of human-robot interaction. Unfortunately, its quantitative assessment has been largely overlooked in the literature. Here, we present a methodology that allows predicting the motion of the human joints from the knowledge of the angular motion of the exoskeleton frame. Our method combines a subject-specific skeletal model with a kinematic model of a lower limb exoskeleton (H2, Technaid), imposing specific kinematic constraints between them. To calibrate the model and validate its ability to predict the relative motion in a subject-specific way, we performed experiments on seven healthy subjects during treadmill walking tasks. We demonstrate a prediction accuracy lower than 3.5° globally, and around 1.5° at the hip level, which represents an improvement of up to 66% compared to the traditional approach assuming no relative motion between the user and the exoskeleton. PMID:29755336

  18. A Subject-Specific Kinematic Model to Predict Human Motion in Exoskeleton-Assisted Gait.

    Science.gov (United States)

    Torricelli, Diego; Cortés, Camilo; Lete, Nerea; Bertelsen, Álvaro; Gonzalez-Vargas, Jose E; Del-Ama, Antonio J; Dimbwadyo, Iris; Moreno, Juan C; Florez, Julian; Pons, Jose L

    2018-01-01

    The relative motion between human and exoskeleton is a crucial factor that has remarkable consequences on the efficiency, reliability and safety of human-robot interaction. Unfortunately, its quantitative assessment has been largely overlooked in the literature. Here, we present a methodology that allows predicting the motion of the human joints from the knowledge of the angular motion of the exoskeleton frame. Our method combines a subject-specific skeletal model with a kinematic model of a lower limb exoskeleton (H2, Technaid), imposing specific kinematic constraints between them. To calibrate the model and validate its ability to predict the relative motion in a subject-specific way, we performed experiments on seven healthy subjects during treadmill walking tasks. We demonstrate a prediction accuracy lower than 3.5° globally, and around 1.5° at the hip level, which represents an improvement of up to 66% compared to the traditional approach assuming no relative motion between the user and the exoskeleton.

  19. Probabilistic fracture mechanics applied for lbb case study: international benchmark

    International Nuclear Information System (INIS)

    Radu, V.

    2015-01-01

    An application of probabilistic fracture mechanics to evaluate structural integrity for a case study chosen from the experimental mock-ups of the FP7 STYLE project is described. The reliability model for probabilistic structural integrity, focused on the assessment of a through-wall crack (TWC) in the pipe weld under complex loading (bending moment and residual stress), has been set up. The basic model is the model of fracture for a through-wall cracked pipe under elastic-plastic conditions. The corresponding structural reliability approach is developed with the probabilities of failure associated with the maximum load for crack initiation and net-section collapse, but also with the evaluation of instability loads. The probabilities of failure for a through-wall crack in a pipe subject to pure bending are evaluated by using crude Monte Carlo simulations. The results from the international benchmark are presented for the mentioned case in the context of ageing and lifetime management of pressure boundary/pressure circuit components. (authors)
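
    A crude Monte Carlo failure-probability sketch in the spirit of the abstract, with assumed normal distributions for the limit (collapse) moment and the applied bending moment rather than the benchmark's actual inputs:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000

# Crude Monte Carlo sketch (assumed distributions, not the benchmark's
# inputs): the pipe section fails when the applied bending moment reaches
# the net-section limit (collapse) moment.
limit_moment = rng.normal(loc=500.0, scale=50.0, size=n)     # kN·m, capacity
applied_moment = rng.normal(loc=300.0, scale=60.0, size=n)   # kN·m, load

p_failure = float(np.mean(applied_moment >= limit_moment))
```

    The same sampling loop extends to the full benchmark problem by drawing crack size and material properties as well and evaluating initiation, collapse and instability criteria per sample.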

  20. The ATLAS Production System Evolution

    CERN Document Server

    Borodin, Mikhail; The ATLAS collaboration

    2017-01-01

    The second generation of the ATLAS Production System, called ProdSys2, is a distributed workload manager that runs hundreds of thousands of jobs daily, from dozens of different ATLAS-specific workflows, across more than a hundred heterogeneous sites. It achieves high utilization by combining dynamic job definition based upon many criteria, such as input and output size, memory requirements and CPU consumption, with manageable scheduling policies, and by supporting different kinds of computational resources, such as GRID, clouds, supercomputers and volunteer computers. The system dynamically assigns a group of jobs (a task) to a group of geographically distributed computing resources. Dynamic assignment and resource utilization are among the major features of the system. The Production System has a sophisticated job fault recovery mechanism, which allows multi-terabyte tasks to run efficiently without human intervention. We have implemented new features which allow automatic task submission and chaining of differe...

  1. Probabilistic predictive modelling of carbon nanocomposites for medical implants design.

    Science.gov (United States)

    Chua, Matthew; Chui, Chee-Kong

    2015-04-01

    Modelling of the mechanical properties of carbon nanocomposites based on input variables like the percentage weight of Carbon Nanotube (CNT) inclusions is important for the design of medical implants and other structural scaffolds. Current constitutive models for the mechanical properties of nanocomposites may not predict well due to differences in conditions, fabrication techniques and inconsistencies in reagent properties used across industries and laboratories. Furthermore, the mechanical properties of the designed products are not deterministic, but exist as a probabilistic range. A predictive model based on a modified probabilistic surface response algorithm is proposed in this paper to address this issue. Tensile testing of three groups of carbon nanocomposite samples with different CNT weight fractions displays scattered stress-strain curves, with the instantaneous stresses assumed to vary according to a normal distribution at a specific strain. From the probabilistic density function of the experimental data, a two-factor Central Composite Design (CCD) experimental matrix, based on strain and CNT weight fraction inputs with their corresponding stress distributions, was established. Monte Carlo simulation was carried out on this design matrix to generate a predictive probabilistic polynomial equation. The equation and method were subsequently validated with more tensile experiments and Finite Element (FE) studies. The method was subsequently demonstrated in the design of an artificial tracheal implant. Our algorithm provides an effective way to accurately model the mechanical properties of implants of various compositions based on experimental data of samples. Copyright © 2015 Elsevier Ltd. All rights reserved.
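
    The response-surface idea can be sketched in miniature: fit a quadratic polynomial in strain and CNT weight fraction to scattered stress samples, then query the fitted surface (the synthetic data and all coefficients are assumptions, not the paper's measurements):

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic "tensile test" data: stress depends on strain and CNT weight
# fraction with normally distributed scatter (all coefficients assumed).
n = 400
strain = rng.uniform(0.0, 0.05, n)
cnt_wt = rng.uniform(0.0, 0.05, n)                # CNT weight fraction
stress = ((2000.0 + 20000.0 * cnt_wt) * strain
          - 8000.0 * strain ** 2
          + rng.normal(0.0, 0.5, n))              # MPa, with scatter

# Quadratic response surface in the two factors: [1, s, w, s^2, w^2, s*w].
X = np.column_stack([np.ones(n), strain, cnt_wt,
                     strain ** 2, cnt_wt ** 2, strain * cnt_wt])
coef, *_ = np.linalg.lstsq(X, stress, rcond=None)

def predict(s, w):
    return coef @ np.array([1.0, s, w, s ** 2, w ** 2, s * w])

# At fixed strain, predicted stress should rise with CNT fraction.
low, high = predict(0.03, 0.01), predict(0.03, 0.04)
```

    Sampling the residual scatter around such a fitted polynomial, as in the paper's Monte Carlo step, yields a stress distribution rather than a single deterministic value per composition.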

  2. Report to users of ATLAS

    International Nuclear Information System (INIS)

    Ahmad, I.; Glagola, B.

    1997-03-01

    This report covers the following topics: (1) status of the ATLAS accelerator; (2) progress in R and D towards a proposal for a National ISOL Facility; (3) highlights of recent research at ATLAS; (4) the move of gammasphere from LBNL to ANL; (5) Accelerator Target Development laboratory; (6) Program Advisory Committee; (7) ATLAS User Group Executive Committee; and (8) ATLAS user handbook available in the World Wide Web. A brief summary is given for each topic

  3. Estimate of the neutron fields in ATLAS based on ATLAS-MPX detectors data

    Energy Technology Data Exchange (ETDEWEB)

    Bouchami, J; Dallaire, F; Gutierrez, A; Idarraga, J; Leroy, C; Picard, S; Scallon, O [Universite de Montreal, Montreal, Quebec H3C 3J7 (Canada); Kral, V; Pospisil, S; Solc, J; Suk, M; Turecek, D; Vykydal, Z; Zemlicka, J, E-mail: scallon@lps.umontreal.ca [Institute of Experimental and Applied Physics of the CTU in Prague, Horska 3a/22, CZ-12800 Praha2 - Albertov (Czech Republic)

    2011-01-15

    The ATLAS-MPX detectors are based on Medipix2 silicon devices designed by CERN for the detection of different types of radiation. These detectors are covered with converting layers of {sup 6}LiF and polyethylene (PE) to increase their sensitivity to thermal and fast neutrons, respectively. These devices allow the measurement of the composition and spectroscopic characteristics of the radiation field in ATLAS, particularly of neutrons. These detectors can operate in low or high preset energy threshold mode. The signature of particles interacting in an ATLAS-MPX detector at low threshold is clusters of adjacent pixels of different size and form, depending on particle type, energy and incidence angle. The classification of particles into different categories can be done using the geometrical parameters of these clusters. The Medipix analysis framework (MAFalda) - based on the ROOT application - allows the recognition of particle tracks left in ATLAS-MPX devices located at various positions in the ATLAS detector and cavern. The pattern recognition obtained from the application of MAFalda was configured to distinguish the response of neutrons from other radiation. The neutron response at low threshold is characterized by clusters of adjoining pixels (heavy tracks and heavy blobs) left by protons and heavy ions resulting from neutron interactions in the converting layers of the ATLAS-MPX devices. The neutron detection efficiency of ATLAS-MPX devices has been determined by exposing two reference detectors to radionuclide neutron sources ({sup 252}Cf and {sup 241}AmBe). With these results, an estimate of the neutron fields produced at the device locations during ATLAS operation was made.

  4. Estimate of the neutron fields in ATLAS based on ATLAS-MPX detectors data

    Science.gov (United States)

    Bouchami, J.; Dallaire, F.; Gutiérrez, A.; Idarraga, J.; Král, V.; Leroy, C.; Picard, S.; Pospíšil, S.; Scallon, O.; Solc, J.; Suk, M.; Turecek, D.; Vykydal, Z.; Žemlička, J.

    2011-01-01

    The ATLAS-MPX detectors are based on Medipix2 silicon devices designed by CERN for the detection of different types of radiation. These detectors are covered with converting layers of 6LiF and polyethylene (PE) to increase their sensitivity to thermal and fast neutrons, respectively. These devices allow the measurement of the composition and spectroscopic characteristics of the radiation field in ATLAS, particularly of neutrons. These detectors can operate in low or high preset energy threshold mode. The signature of particles interacting in an ATLAS-MPX detector at low threshold is clusters of adjacent pixels of different size and form, depending on particle type, energy and incidence angle. The classification of particles into different categories can be done using the geometrical parameters of these clusters. The Medipix analysis framework (MAFalda) — based on the ROOT application — allows the recognition of particle tracks left in ATLAS-MPX devices located at various positions in the ATLAS detector and cavern. The pattern recognition obtained from the application of MAFalda was configured to distinguish the response of neutrons from other radiation. The neutron response at low threshold is characterized by clusters of adjoining pixels (heavy tracks and heavy blobs) left by protons and heavy ions resulting from neutron interactions in the converting layers of the ATLAS-MPX devices. The neutron detection efficiency of ATLAS-MPX devices has been determined by exposing two reference detectors to radionuclide neutron sources (252Cf and 241AmBe). With these results, an estimate of the neutron fields produced at the device locations during ATLAS operation was made.

  5. Examining geographic patterns of mortality: the atlas of mortality in small areas in Spain (1987-1995).

    Science.gov (United States)

    Benach, Joan; Yasui, Yutaka; Borrell, Carme; Rosa, Elisabeth; Pasarín, M Isabel; Benach, Núria; Español, Esther; Martínez, José Miguel; Daponte, Antonio

    2003-06-01

    Small-area mortality atlases have been demonstrated to be a useful tool for both showing general geographical patterns in mortality data and identifying specific high-risk locations. In Spain no study has so far systematically examined geographic patterns of small-area mortality for the main causes of death. This paper presents the main features, contents and potential uses of the Spanish Atlas of Mortality in small areas (1987-1995). Population data for 2,218 small areas were drawn from the 1991 Census. Aggregated mortality data for 14 specific causes of death for the period 1987-1995 were obtained for each small area. Empirical Bayes-model-based estimates of age-adjusted relative risk were displayed in small-area maps for each cause/gender/age group (0-64 or 65 and over) combination using the same range of values (i.e. septiles) and colour schemes. The 'Spanish Atlas of Mortality' includes multiple choropleth (area-shaded) small-area maps and graphs to answer different questions about the data. The atlas is divided into three main sections. Section 1 includes the methods and comments on the main maps. Section 2 presents a two-page layout for each leading cause of death by gender including 1) a large map with relative risk estimates, 2) a map that indicates high- and low-risk small areas, 3) a graph with median and interquartile range of relative risk estimates for 17 large regions of Spain, and 4) relative-risk maps for two age groups. Section 3 provides specific information on the geographical units of analysis, statistical methods and other supplemental maps. The 'Spanish Atlas of Mortality' is a useful tool for examining geographical patterns of mortality risk and identifying specific high-risk areas. Mortality patterns displayed in the atlas may have important implications for research and social/health policy planning purposes.
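
    The empirical Bayes smoothing mentioned above can be sketched with the standard Poisson-gamma model: each area's raw standardised ratio is shrunk toward the prior mean, most strongly where expected counts are small (synthetic data and fixed prior parameters are assumptions, not the atlas's fitted model):

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic small areas: observed deaths O_i ~ Poisson(theta_i * E_i), with
# true relative risks theta_i scattered around 1 (illustrative data only).
n = 200
expected = rng.uniform(2.0, 50.0, n)                     # expected deaths E_i
theta = rng.gamma(shape=20.0, scale=1.0 / 20.0, size=n)  # true RR, mean 1
observed = rng.poisson(theta * expected)

smr = observed / expected    # raw standardised mortality ratio

# Poisson-gamma empirical Bayes: with a Gamma(a, b) prior on theta_i, the
# posterior mean is (O_i + a) / (E_i + b). Here a and b are fixed plausible
# values rather than moment estimates, to keep the sketch short.
a, b = 20.0, 20.0
eb = (observed + a) / (expected + b)

# Shrinkage: smoothed relative risks vary less than the raw SMRs.
```

    Mapping the shrunk estimates instead of raw SMRs is what keeps sparsely populated areas from dominating the extreme septiles of such an atlas.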

  6. Probabilistic approach to mechanisms

    CERN Document Server

    Sandler, BZ

    1984-01-01

    This book discusses the application of probabilistics to the investigation of mechanical systems. The book shows, for example, how random function theory can be applied directly to the investigation of random processes in the deflection of cam profiles, pitch or gear teeth, pressure in pipes, etc. The author also deals with some other technical applications of probabilistic theory, including, amongst others, those relating to pneumatic and hydraulic mechanisms and roller bearings. Many of the aspects are illustrated by examples of applications of the techniques under discussion.

  7. A computational atlas of the hippocampal formation using ex vivo, ultra-high resolution MRI: Application to adaptive segmentation of in vivo MRI.

    Science.gov (United States)

    Iglesias, Juan Eugenio; Augustinack, Jean C; Nguyen, Khoa; Player, Christopher M; Player, Allison; Wright, Michelle; Roy, Nicole; Frosch, Matthew P; McKee, Ann C; Wald, Lawrence L; Fischl, Bruce; Van Leemput, Koen

    2015-07-15

    Automated analysis of MRI data of the subregions of the hippocampus requires computational atlases built at a higher resolution than those that are typically used in current neuroimaging studies. Here we describe the construction of a statistical atlas of the hippocampal formation at the subregion level using ultra-high resolution, ex vivo MRI. Fifteen autopsy samples were scanned at 0.13 mm isotropic resolution (on average) using customized hardware. The images were manually segmented into 13 different hippocampal substructures using a protocol specifically designed for this study; precise delineations were made possible by the extraordinary resolution of the scans. In addition to the subregions, manual annotations for neighboring structures (e.g., amygdala, cortex) were obtained from a separate dataset of in vivo, T1-weighted MRI scans of the whole brain (1mm resolution). The manual labels from the in vivo and ex vivo data were combined into a single computational atlas of the hippocampal formation with a novel atlas building algorithm based on Bayesian inference. The resulting atlas can be used to automatically segment the hippocampal subregions in structural MRI images, using an algorithm that can analyze multimodal data and adapt to variations in MRI contrast due to differences in acquisition hardware or pulse sequences. The applicability of the atlas, which we are releasing as part of FreeSurfer (version 6.0), is demonstrated with experiments on three different publicly available datasets with different types of MRI contrast. The results show that the atlas and companion segmentation method: 1) can segment T1 and T2 images, as well as their combination, 2) replicate findings on mild cognitive impairment based on high-resolution T2 data, and 3) can discriminate between Alzheimer's disease subjects and elderly controls with 88% accuracy in standard resolution (1mm) T1 data, significantly outperforming the atlas in FreeSurfer version 5.3 (86% accuracy) and
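
    Independently of the Bayesian atlas-building algorithm described above, the basic notion of a probabilistic atlas can be sketched as a voxelwise average of co-registered binary segmentations (the geometry below is entirely synthetic):

```python
import numpy as np

rng = np.random.default_rng(6)

# Voxelwise probabilistic atlas from co-registered binary segmentations:
# each voxel stores the fraction of subjects in which it carries the label.
shape = (32, 32, 32)
masks = []
for _ in range(15):                        # e.g. fifteen scanned samples
    c = 16 + rng.integers(-2, 3, size=3)   # per-subject jitter of the centre
    m = np.zeros(shape)
    m[c[0] - 6:c[0] + 6, c[1] - 6:c[1] + 6, c[2] - 6:c[2] + 6] = 1.0
    masks.append(m)

atlas = np.mean(masks, axis=0)   # probability map, values in [0, 1]

core = atlas[16, 16, 16]     # always inside the jittered cube -> 1.0
outside = atlas[2, 2, 2]     # never labelled -> 0.0
# Voxels near the boundary take intermediate probabilities.
```

    Such probability maps act as spatial priors; segmentation methods like the one in this record combine them with intensity models to label new scans.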

  8. Probabilistic analysis in the life cycle management of construction deficiencies

    International Nuclear Information System (INIS)

    Zebroski, E.; Starr, C.

    1985-01-01

    The author discusses the urgent need for better systems and procedures for evaluating actual or suspected construction deficiencies in nuclear power plants. The following topics of interest are discussed: summary of tools available, use of plant-specific probabilistic risk assessments, general process for the rational management of construction deficiencies, rationales for the timing of required corrective actions, example of deficiency management in France, proposed screening process, deficiencies calling for corrective actions, institutional obstacles, and specific recommendations

  9. ATLAS Colouring Book

    CERN Multimedia

    Anthony, Katarina

    2016-01-01

    The ATLAS Experiment Colouring Book is a free-to-download educational book, ideal for kids aged 5-9. It aims to introduce children to the field of High-Energy Physics, as well as the work being carried out by the ATLAS Collaboration.

  10. Bayesian parameter estimation in probabilistic risk assessment

    International Nuclear Information System (INIS)

    Siu, Nathan O.; Kelly, Dana L.

    1998-01-01

    Bayesian statistical methods are widely used in probabilistic risk assessment (PRA) because of their ability to provide useful estimates of model parameters when data are sparse and because the subjective probability framework, from which these methods are derived, is a natural framework to address the decision problems motivating PRA. This paper presents a tutorial on Bayesian parameter estimation especially relevant to PRA. It summarizes the philosophy behind these methods, approaches for constructing likelihood functions and prior distributions, some simple but realistic examples, and a variety of cautions and lessons regarding practical applications. References are also provided for more in-depth coverage of various topics
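
    The canonical PRA example of this kind is the conjugate gamma-Poisson update of a failure rate; the sketch below uses assumed prior parameters and operating data:

```python
# Conjugate gamma-Poisson update, the textbook PRA case: a failure rate
# lambda with a Gamma(a0, b0) prior, updated with n failures observed over
# t component-hours, yields a Gamma(a0 + n, b0 + t) posterior.
a0, b0 = 0.5, 1000.0     # assumed prior: Jeffreys-like shape, 1000 h pseudo-time
n_fail, t = 2, 50_000.0  # assumed sparse data: 2 failures in 50,000 hours

a1, b1 = a0 + n_fail, b0 + t
prior_mean = a0 / b0        # 5.0e-4 failures per hour
posterior_mean = a1 / b1    # pulled down by the long failure-free exposure
```

    The conjugate form shows why Bayesian estimation suits sparse PRA data: even two observed failures update the prior smoothly, with the exposure time b1 controlling how much the data dominate.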

  11. Probabilistic machine learning and artificial intelligence.

    Science.gov (United States)

    Ghahramani, Zoubin

    2015-05-28

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.

  12. Probabilistic machine learning and artificial intelligence

    Science.gov (United States)

    Ghahramani, Zoubin

    2015-05-01

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.

  13. Use and Communication of Probabilistic Forecasts.

    Science.gov (United States)

    Raftery, Adrian E

    2016-12-01

    Probabilistic forecasts are becoming more and more available. How should they be used and communicated? What are the obstacles to their use in practice? I review experience with five problems where probabilistic forecasting played an important role. This leads me to identify five types of potential users: Low Stakes Users, who don't need probabilistic forecasts; General Assessors, who need an overall idea of the uncertainty in the forecast; Change Assessors, who need to know if a change is out of line with expectations; Risk Avoiders, who wish to limit the risk of an adverse outcome; and Decision Theorists, who quantify their loss function and perform the decision-theoretic calculations. This suggests that it is important to interact with users and to consider their goals. The cognitive research tells us that calibration is important for trust in probability forecasts, and that it is important to match the verbal expression with the task. The cognitive load should be minimized, reducing the probabilistic forecast to a single percentile if appropriate. Probabilities of adverse events and percentiles of the predictive distribution of quantities of interest seem often to be the best way to summarize probabilistic forecasts. Formal decision theory has an important role, but in a limited range of applications.

  14. Use and Communication of Probabilistic Forecasts

    Science.gov (United States)

    Raftery, Adrian E.

    2015-01-01

    Probabilistic forecasts are becoming more and more available. How should they be used and communicated? What are the obstacles to their use in practice? I review experience with five problems where probabilistic forecasting played an important role. This leads me to identify five types of potential users: Low Stakes Users, who don't need probabilistic forecasts; General Assessors, who need an overall idea of the uncertainty in the forecast; Change Assessors, who need to know if a change is out of line with expectations; Risk Avoiders, who wish to limit the risk of an adverse outcome; and Decision Theorists, who quantify their loss function and perform the decision-theoretic calculations. This suggests that it is important to interact with users and to consider their goals. Cognitive research tells us that calibration is important for trust in probability forecasts, and that it is important to match the verbal expression with the task. The cognitive load should be minimized, reducing the probabilistic forecast to a single percentile if appropriate. Probabilities of adverse events and percentiles of the predictive distribution of quantities of interest often seem to be the best way to summarize probabilistic forecasts. Formal decision theory has an important role, but in a limited range of applications. PMID:28446941

  15. Evolution of the ReadOut System of the ATLAS experiment

    CERN Document Server

    Borga, A; The ATLAS collaboration; Joos, M; Schumacher, J; Tremblet, L; Vandelli, W; Vermeulen, J; Werner, P; Wickens, F

    2014-01-01

    The ReadOut System (ROS) is a central and essential part of the ATLAS data-acquisition system. It receives and buffers event data accepted from all sub-detectors and first-level trigger subsystems. Event data are subsequently forwarded to the High-Level Trigger system and Event Builder via a GbE-based network. The ATLAS ROS will be completely renewed in view of the demanding conditions expected during LHC Run 2 and Run 3. The new ROS will consist of roughly 100 Linux-based 2U-high rack-mounted server PCs, each equipped with 2 PCIe I/O cards and four 10GbE interfaces. The FPGA-based PCIe I/O cards, developed by the ALICE collaboration, will be configured with ATLAS-specific firmware, called RobinNP. They will provide connectivity to about 2000 point-to-point optical links conveying the ATLAS event data. This dense configuration provides an excellent test bench for studying I/O efficiency and challenges in current COTS PC architectures with non-uniform memory and I/O access paths. In this paper the requirements...

  16. Evolution of the ReadOut System of the ATLAS experiment

    CERN Document Server

    Borga, A; The ATLAS collaboration; Green, B; Kugel, A; Joos, M; Panduro Vazquez, W; Schumacher, J; Teixeira-Dias, P; Tremblet, L; Vandelli, W; Vermeulen, J; Werner, P; Wickens, F

    2014-01-01

    The ReadOut System (ROS) is a central and essential part of the ATLAS DAQ system. It receives and buffers data of events accepted by the first-level trigger from all subdetectors and first-level trigger subsystems. Event data are subsequently forwarded to the High-Level Trigger system and Event Builder via a 1 GbE-based network. The ATLAS ROS is being completely renewed in view of the demanding conditions expected during LHC Run 2 and Run 3 and to replace obsolete technologies; space constraints require the new system to be compact. The new ROS will consist of roughly 100 Linux-based 2U-high rack-mounted server PCs, each equipped with 2 PCIe I/O cards and four 10 GbE interfaces. The FPGA-based PCIe I/O cards, developed by the ALICE collaboration, will be configured with ATLAS-specific firmware, the so-called RobinNP firmware. They will provide the connectivity to about 2000 optical point-to-point links conveying the ATLAS event data. This dense configuration provides an excellent test bench for studying I/O efficiency and ...

  17. The Use and Development of Probabilistic Safety Assessment in NEA Member Countries

    International Nuclear Information System (INIS)

    2002-01-01

    these countries as of 1 April 2002. Since this information is subject to change, due to advances in methodologies, changes in research programmes, etc., the reader should take these types of occurrences into account. Chapter 2 (PSA Environment) defines the background information on the use of PSA in Member countries. The varied political and historical development of each nation contributing to this report has led to differences in how the use of PSA has matured. The evolution of PSAs, whether or not they are legally required, who performs the PSAs and who reviews them are covered in this section. Chapter 3 (Quantitative Safety Guidelines) builds on the information provided in Chapter 1 and presents an overview of each country's practices regarding the use of quantitative and probabilistic safety guidelines. Chapter 4 (Status of PSA Programmes) provides a summary of the current status of PSA programmes in Member countries; an appendix to this chapter (Appendix A) presents this status in tabular form. Chapter 5 (PSA Applications) presents information on how PSAs are being applied and identifies specific applications being used for decision-making. Chapter 6 (PSA Related Research and Development) provides input from the Member countries on current and proposed areas of PSA research activities. Chapter 7 (PSA Plant Based Modifications) presents information on insights that have been gained and the role PSA has had in safety decision-making. References are provided to establish a contact point for obtaining further information or details about the PSA programmes within the contributing countries and for providing information on specific documents. Appendix A provides the status of PSA programmes in Member countries in tabular form.

  18. ATLAS Cloud R&D

    CERN Document Server

    Panitkin, S; The ATLAS collaboration; Caballero Bejar, J; Benjamin, D; DiGirolamo, A; Gable, I; Hendrix, V; Hover, J; Kucharczuk, K; Medrano LLamas, R; Love, P; Ohman, H; Paterson, M; Sobie, R; Taylor, R; Walker, R; Zaytsev, A

    2014-01-01

    The computing model of the ATLAS experiment was designed around the concept of grid computing and, since the start of data taking, this model has proven very successful. However, new cloud computing technologies bring attractive features to improve the operations and elasticity of scientific distributed computing. ATLAS sees grid and cloud computing as complementary technologies that will coexist at different levels of resource abstraction, and two years ago created an R&D working group to investigate the different integration scenarios. The ATLAS Cloud Computing R&D has been able to demonstrate the feasibility of offloading work from grid to cloud sites and, as of today, is able to integrate transparently various cloud resources into the PanDA workload management system. The ATLAS Cloud Computing R&D is operating various PanDA queues on private and public resources and has provided several hundred thousand CPU days to the experiment. As a result, the ATLAS Cloud Computing R&D group has gained...

  19. ATLAS MPGD production status

    CERN Document Server

    Schioppa, Marco; The ATLAS collaboration

    2018-01-01

    Micromegas (MICRO MEsh GAseous Structure) chambers are Micro-Pattern Gaseous Detectors designed to provide high spatial resolution and reasonably good time resolution in highly irradiated environments. In 2007 an ambitious long-term R&D activity was started in the context of the ATLAS experiment at CERN: the Muon ATLAS Micromegas Activity (MAMMA). After years of tests on prototypes and technology breakthroughs, Micromegas chambers were chosen as tracking detectors for an upgrade of the ATLAS Muon Spectrometer. These novel detectors will be installed in 2020 at the end of the second long shutdown of the Large Hadron Collider, and will serve mainly as precision detectors in the innermost part of the forward ATLAS Muon Spectrometer. Four different types of Micromegas modules, eight layers each, up to 3 m² in area (of unprecedented size), will cover a surface of 150 m² for a total active area of about 1200 m². With this upgrade the ATLAS muon system will maintain the full acceptance of its excellent...

  20. ATLAS' major cooling project

    CERN Multimedia

    2005-01-01

    In 2005, a considerable effort was put into commissioning the various units of ATLAS' complex cryogenic system, in preparation for the imminent cooling of some of the largest components of the detector in their final underground configuration. [Photo: the liquid helium and nitrogen ATLAS refrigerators in USA 15.] Cryogenics plays a vital role in operating massive detectors such as ATLAS. In many ways the liquefied argon, nitrogen and helium are the life-blood of the detector. ATLAS could not function without the cryogens that will be constantly pumped via proximity systems to the superconducting magnets and subdetectors. In recent weeks compressors at the surface and underground refrigerators, dewars, pumps, linkages and all manner of other components related to the cryogenic system have been tested and commissioned. Fifty metres underground: the helium and nitrogen refrigerators, installed inside the service cavern, are an important part of the ATLAS cryogenic system. Two independent helium refrigerators ...