WorldWideScience

Sample records for highly segmented hpge-detectors

  1. Compton imaging with a highly-segmented, position-sensitive HPGe detector

    Energy Technology Data Exchange (ETDEWEB)

Steinbach, T.; Hirsch, R.; Reiter, P.; Birkenbach, B.; Bruyneel, B.; Eberth, J.; Hess, H.; Lewandowski, L. [Universitaet zu Koeln, Institut fuer Kernphysik, Koeln (Germany)]; Gernhaeuser, R.; Maier, L.; Schlarb, M.; Weiler, B.; Winkel, M. [Technische Universitaet Muenchen, Physik Department, Garching (Germany)]

    2017-02-15

A Compton camera based on a highly segmented high-purity germanium (HPGe) detector and a double-sided silicon-strip detector (DSSD) was developed, tested, and put into operation; the origin of γ radiation was determined successfully. The Compton camera is operated in two different modes. Coincidences of Compton-scattered γ-ray events between the DSSD and the HPGe detector give the best angular resolution, while the high-efficiency mode exploits the position sensitivity of the highly segmented HPGe detector alone. In this mode the setup is sensitive to the whole 4π solid angle. The interaction-point positions in the 36-fold segmented large-volume HPGe detector are determined by pulse-shape analysis (PSA) of all HPGe detector signals. Imaging algorithms were developed for each mode and successfully implemented. The angular resolution depends sensitively on parameters such as geometry, selected multiplicity and interaction-point distances. Best results were obtained by taking into account the crosstalk properties, the time alignment of the signals and the distance metric of the PSA for both operation modes. An angular resolution between 13.8° and 19.1°, depending on the minimal interaction-point distance, was achieved in the high-efficiency mode at an energy of 1275 keV. In the coincidence mode, an improved angular resolution of 4.6° was obtained for the same γ-ray energy. (orig.)
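The cone-reconstruction step at the heart of such a Compton camera follows directly from the Compton formula; the sketch below (Python, with illustrative energies only) shows how the cone opening angle is obtained from the energy deposited at the first interaction and the known total γ-ray energy:

```python
import math

ME_C2 = 511.0  # electron rest energy, keV

def compton_cone_angle(first_deposit_kev, total_kev):
    """Opening angle (degrees) of the Compton cone reconstructed from the
    energy deposited at the first (scatter) interaction and the total
    gamma-ray energy: cos(theta) = 1 - me*c^2 * (1/E' - 1/E0),
    with E0 the incident energy and E' = E0 - deposit."""
    e_prime = total_kev - first_deposit_kev
    cos_theta = 1.0 - ME_C2 * (1.0 / e_prime - 1.0 / total_kev)
    return math.degrees(math.acos(cos_theta))

# A 1275 keV gamma ray (as in the high-efficiency-mode test above)
# depositing 400 keV at the first interaction point:
angle = compton_cone_angle(400.0, 1275.0)  # roughly 35 degrees
```

The source position is then constrained to lie on this cone; intersecting many cones (one per event) yields the image.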

  2. Crosstalk corrections for improved energy resolution with highly segmented HPGe-detectors

    International Nuclear Information System (INIS)

    Bruyneel, Bart; Reiter, Peter; Wiens, Andreas; Eberth, Juergen; Hess, Herbert; Pascovici, Gheorghe; Warr, Nigel; Aydin, Sezgin; Bazzacco, Dino; Recchia, Francesco

    2009-01-01

Crosstalk effects in the 36-fold segmented, large-volume AGATA HPGe detectors cause shifts in the γ-ray energy measured by the inner core and outer segments as a function of segment multiplicity. The positions of the segment-sum energy peaks vary approximately linearly with increasing segment multiplicity, and the resolution of these peaks also deteriorates linearly as a function of segment multiplicity. Based on single-event treatment, two methods were developed within the AGATA Collaboration to correct for the crosstalk-induced effects by employing a linear transformation. The matrix elements are deduced from coincidence measurements of γ-rays of various energies as recorded with digital electronics. A very efficient way to determine the matrix elements is to measure the baseline shifts of untriggered segments using γ-ray detection events in which energy is deposited in a single segment. A second approach is based on measuring segment energy values for γ-ray interaction events in which energy is deposited in only two segments. After the crosstalk correction, the investigated detector shows good agreement between the core energy and the segment-sum energy at all multiplicities, as well as an improved energy resolution of the segment-sum energy peaks. The corrected core energy resolution equals the segment-sum energy resolution, which at all folds is superior to the individual uncorrected energy resolutions. This is achieved by combining the two independent energy measurements made with the core contact on the one hand and the segment contacts on the other.
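The linear-transformation correction described above can be illustrated with a toy model; the 4-segment size and the uniform coupling strength below are assumptions for illustration, not AGATA values:

```python
import numpy as np

n = 4  # toy segment count; the real AGATA detectors have 36 segments
# Assumed uniform proportional coupling of -0.1% from every segment
# into every other channel (illustrative value only)
x = np.full((n, n), -1e-3)
np.fill_diagonal(x, 0.0)
T = np.eye(n) + x            # measured = T @ true

true_e = np.array([800.0, 400.0, 0.0, 0.0])   # a fold-2 event, keV
measured = T @ true_e        # crosstalk-shifted segment energies

# Single-event crosstalk correction: invert the linear transformation
corrected = np.linalg.solve(T, measured)
```

The fold dependence of the segment-sum shift follows because each fired segment contributes one coupling term to every other channel, so the total deficit grows with multiplicity.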

  3. Pulse shape analysis optimization with segmented HPGe-detectors

    Energy Technology Data Exchange (ETDEWEB)

    Lewandowski, Lars; Birkenbach, Benedikt; Reiter, Peter [Institute for Nuclear Physics, University of Cologne (Germany); Bruyneel, Bart [CEA, Saclay (France); Collaboration: AGATA-Collaboration

    2014-07-01

Measurements with the position-sensitive, highly segmented AGATA HPGe detectors rely on the γ-ray tracking (GRT) technique, which allows the interaction points of the individual γ-rays hitting the detector to be determined. GRT is based on pulse-shape analysis (PSA) of the preamplifier signals from the 36 segments and the central electrode of the detector. The achieved performance and position resolution of the AGATA detector are well within the specifications. However, an unexpectedly inhomogeneous distribution of interaction points inside the detector volume is observed as a result of the PSA, even when the measurement is performed with an isotropically radiating γ-ray source. This clustering of interaction points motivated a study to optimize the PSA algorithm and its ingredients. Position-resolution results were investigated by including contributions from differential crosstalk of the detector electronics, an improved preamplifier response function and a new time alignment. Moreover, the spatial distribution is quantified by employing different χ²-minimization procedures.
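The χ²-minimization at the core of the PSA amounts to a nearest-neighbour search over a signal basis; a minimal sketch (with synthetic pulses, and an L1 metric added purely to illustrate varying the distance metric) might look like:

```python
import numpy as np

def psa_grid_search(signal, basis, metric="chi2"):
    """Index of the basis pulse closest to the measured `signal`.
    `basis` holds one simulated pulse shape per known grid position;
    the distance metric is one of the tunable PSA ingredients."""
    if metric == "chi2":
        d = ((basis - signal) ** 2).sum(axis=1)
    elif metric == "abs":
        d = np.abs(basis - signal).sum(axis=1)
    else:
        raise ValueError(f"unknown metric: {metric}")
    return int(np.argmin(d))

# Synthetic basis: three pulses with different rise times
t = np.linspace(0.0, 1.0, 100)
basis = np.vstack([1.0 - np.exp(-t / tau) for tau in (0.05, 0.15, 0.30)])
observed = 1.0 - np.exp(-t / 0.15)       # matches the middle basis pulse
best = psa_grid_search(observed, basis)  # index 1
```

Changing the metric reweights which parts of the pulse dominate the fit, which is one way such a search can develop the position clustering described above.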

  4. Crosstalk properties of 36-fold segmented symmetric hexagonal HPGe detectors

    International Nuclear Information System (INIS)

    Bruyneel, Bart; Reiter, Peter; Wiens, Andreas; Eberth, Juergen; Hess, Herbert; Pascovici, Gheorghe; Warr, Nigel; Weisshaar, Dirk

    2009-01-01

Crosstalk properties of three 36-fold segmented, symmetric, large-volume HPGe detectors from the AGATA Collaboration were deduced from coincidence measurements performed with digitized segment and core signals after interaction of γ rays with energies of 1.33 MeV. The mean energy values measured by the core signal fluctuate for γ-ray interactions with energy deposited in two segments; a regular pattern is observed depending on the hit-segment combination. The core energy shifts deviate by 0.03-0.06% from the average energy calibration. The segment-sum energy is reduced with respect to the core energy as a function of the decoupling capacitance and the segment multiplicity. The deviation of the segment-sum energies for multiplicity-two events fluctuates within an interval of less than 0.1%, depending on the segment combination. The energy shifts caused by crosstalk in the core and segment signals are comparable for all three detectors. A linear electronic model of the detector and preamplifier assembly was developed to evaluate the results. The fold-dependent energy shifts of the segment-sum energies are reproduced; the model yields a constant shift in all segments, proportional to the core signal. The measured crosstalk pattern and its intensity variation across the segments agree well with the calculated values. The regular variation observed in the core energies cannot be directly related to crosstalk and may be caused by other effects such as electron trapping.

  5. Pulse shape analysis and position determination in segmented HPGe detectors: The AGATA detector library

    Energy Technology Data Exchange (ETDEWEB)

    Bruyneel, B. [Universitaet zu Koeln, Institut fuer Kernphysik, Koeln (Germany); Service de Physique Nucleaire, CEA Saclay, Gif-sur-Yvette (France); Birkenbach, B.; Reiter, P. [Universitaet zu Koeln, Institut fuer Kernphysik, Koeln (Germany)

    2016-03-15

The AGATA Detector Library (ADL) was developed for the calculation of signals from highly segmented, large-volume high-purity germanium (HPGe) detectors. ADL basis sets comprise a large number of calculated, position-dependent detector pulse shapes. Such a basis set is needed for pulse-shape analysis (PSA), by means of which the interaction position of a γ-ray inside the active detector volume is determined. The theoretical concepts of the calculations are introduced and cover the relevant aspects of signal formation in HPGe. The approximations and the realization of the computer code, with its input parameters, are explained in detail. ADL is a versatile and modular computer code; new detectors can be implemented in this library. Measured position resolutions of the AGATA detectors based on ADL are discussed. (orig.)
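The signal formation that such a library computes rests on the Shockley-Ramo theorem; the following toy 1D sketch (not ADL code) shows how a weighting potential turns a carrier trajectory into an induced-charge pulse:

```python
import numpy as np

def induced_charge(path, weighting_potential, q):
    """Shockley-Ramo theorem: the charge induced on an electrode by a
    carrier of charge q moving along `path` is
    Q(t) = -q * (Phi_w(r(t)) - Phi_w(r(0))),
    where Phi_w is 1 on that electrode and 0 on all others."""
    phi = np.array([weighting_potential(r) for r in path])
    return -q * (phi - phi[0])

# Toy 1D planar geometry: weighting potential rises linearly across a
# 10 mm gap towards the collecting electrode at x = 10 mm
wp = lambda x: x / 10.0
hole_path = np.linspace(2.0, 10.0, 9)           # hole drifting to the electrode
signal = induced_charge(hole_path, wp, q=+1.0)  # charge induced on electrode
```

In a real segmented detector each segment has its own weighting potential, so the same trajectory produces a net-charge signal on the hit segment and transient signals on its neighbours.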

  6. Performance of HPGe detectors in high magnetic fields

    Czech Academy of Sciences Publication Activity Database

    Lorente, A.S.; Achenbach, P.; Agnello, M.; Majling, Lubomír

    2007-01-01

Vol. 573, No. 3 (2007), pp. 410-417. ISSN 0168-9002. R&D Projects: GA ČR GA202/05/2142. Institutional research plan: CEZ:AV0Z10480505. Keywords: hypernuclear gamma-spectroscopy * HPGe detectors. Subject RIV: BE - Theoretical Physics. Impact factor: 1.114, year: 2007

  7. Study of the performance of HPGe detectors operating in very high magnetic fields

    International Nuclear Information System (INIS)

    Agnello, M.; Botta, E.; Bressani, T.; Bruschi, M.; Bufalino, S.; De Napoli, M.; Feliciello, A.; Fontana, A.; Giacobbe, B.; Lavezzi, L.; Raciti, G.; Rapisarda, E.; Rotondi, A.; Sbarra, C.; Sfienti, C.; Zoccoli, A.

    2009-01-01

A new generation of high-resolution hypernuclear γ-spectroscopy experiments using high-purity germanium (HPGe) detectors is presently being designed for the FINUDA spectrometer at DAΦNE, the Frascati Φ-factory, and for PANDA, the p-pbar hadron spectrometer at the future FAIR facility. In both spectrometers the HPGe detectors have to be operated in strong magnetic fields. In this paper we report on a series of measurements performed on an HPGe detector inserted in a magnetic field of up to 2.5 T, the highest ever reached for operation of an HPGe, and with different orientations of the detector axis with respect to the field direction. A significant worsening of the energy resolution was found, but only a moderate loss of efficiency. The most relevant features of the peak shapes, described by bi-Gaussian functions, are parametrized in terms of field intensity and energy; this allows the spectra measured in a magnetic field to be corrected and the energy resolution to be recovered almost completely.
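A bi-Gaussian peak shape of the kind used for the parametrization can be written down in a few lines; the numbers in any fit would of course come from the measured field- and energy-dependent spectra (the values below are illustrative):

```python
import math

def bi_gaussian(x, amp, mu, sigma_left, sigma_right):
    """Asymmetric Gaussian peak: width sigma_left below the centroid,
    sigma_right above it. A field-degraded peak typically needs
    sigma_left != sigma_right."""
    sigma = sigma_left if x < mu else sigma_right
    return amp * math.exp(-0.5 * ((x - mu) / sigma) ** 2)
```

Parametrizing sigma_left and sigma_right as functions of field intensity and energy is what allows the measured spectra to be corrected afterwards.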

  8. High precision efficiency calibration of a HPGe detector

    International Nuclear Information System (INIS)

    Nica, N.; Hardy, J.C.; Iacob, V.E.; Helmer, R.G.

    2003-01-01

Many experiments involving measurements of γ rays require a very precise efficiency calibration. Since γ-ray detection and identification also require good energy resolution, the most commonly used detectors are of the coaxial HPGe type. We have calibrated our 70% HPGe to ∼0.2% precision, motivated by the measurement of precise branching ratios (BRs) in superallowed 0+ → 0+ β decays. These BRs are essential ingredients in extracting the ft-values needed to test the Standard Model via the unitarity of the Cabibbo-Kobayashi-Maskawa matrix, a test that it currently fails by more than two standard deviations. To achieve the required high precision in our efficiency calibration, we measured 17 radioactive sources at a source-detector distance of 15 cm. Some of these were commercial 'standard' sources, but we achieved the highest relative precision with 'home-made' sources selected because they have simple decay schemes with negligible side feeding, thus providing exactly matched γ-ray intensities. These latter sources were produced by us at Texas A&M by n-activation or by nuclear reactions. Another critical source among the 17 was a 60Co source produced by Physikalisch-Technische Bundesanstalt, Braunschweig, Germany: its absolute activity was quoted to better than 0.06%. We used it to establish our absolute efficiency, while all the other sources were used to determine relative efficiencies, extending our calibration over a large energy range (40-3500 keV). Efficiencies were also determined with Monte Carlo calculations performed with the CYLTRAN code. The physical parameters of the Ge crystal were independently determined and only two (unmeasurable) dead layers were adjusted, within physically reasonable limits, to achieve precise absolute agreement with our measured efficiencies. The combination of measured efficiencies at more than 60 individual energies and Monte Carlo calculations to interpolate between them allows us to quote the efficiency of our
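The interpolation between measured efficiency points is commonly done with a low-order polynomial in log-log space; a sketch with hypothetical calibration points (not the measured Texas A&M values) is:

```python
import numpy as np

# Hypothetical calibration points (energy in keV, absolute efficiency);
# the real calibration used 17 sources and >60 energies from 40-3500 keV
energies = np.array([122.0, 344.0, 662.0, 1173.0, 1332.0, 2754.0])
effs = np.array([2.9e-3, 1.6e-3, 1.0e-3, 6.6e-4, 5.9e-4, 3.3e-4])

# Fit ln(efficiency) as a low-order polynomial in ln(energy)
coeffs = np.polyfit(np.log(energies), np.log(effs), deg=2)

def efficiency(e_kev):
    """Interpolated absolute full-energy-peak efficiency (sketch)."""
    return float(np.exp(np.polyval(coeffs, np.log(e_kev))))
```

In practice the Monte Carlo curve, anchored to the absolute 60Co point, plays the role of this interpolating function.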

  9. HPGe detectors long time behaviour in high-resolution γ spectrometry

    International Nuclear Information System (INIS)

    Sajo-Bohus, L.; Rosso, D.; Sajo Castelli, A.M.; Napoli, D.R.; Fioretto, E.; Menegazzo, R.; Barros, H.; Ur, C.A.; Palacios, D.; Liendo, J.

    2011-01-01

A large set of data on the long-term performance of n-type HPGe detectors used in the GASP, EUROBALL and CLARA γ spectrometers, as well as in environmental measurements, has been collected over two decades. In this paper a detailed statistical analysis of these data is given and the detectors' long-term behaviour is provided to the scientific community. We include failure, failure mode, repair frequency, repair outcome and its influence on the detection efficiency and energy resolution. A remarkable result is that the life-span distribution is exponential: a detector's failure is a memoryless process, in which a previous failure does not influence the next one. Repaired spectrometers show high reliability, with deep implications for the management of large-scale high-resolution gamma-spectrometry projects. The findings show that, on average, the detectors' initial counting efficiency is slightly lower (∼2%) than that reported by the manufacturers, and that the repair process (including annealing) does not significantly affect the efficiency, even after a long period of use. Repaired-detector energy-resolution statistics show that the probability that a repaired detector will be at least as good as it was originally is more than 3/4.
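The memoryless property of the exponential life-span distribution reported here can be stated quantitatively: the probability of surviving a further time t is independent of the age already reached. A two-line sketch (with an arbitrary mean life) makes the identity explicit:

```python
import math

def survival(t, mean_life):
    """P(T > t) for an exponential life-span distribution."""
    return math.exp(-t / mean_life)

def conditional_survival(t_extra, t_already, mean_life):
    """P(T > t_already + t_extra | T > t_already)."""
    return survival(t_already + t_extra, mean_life) / survival(t_already, mean_life)
```

The exponential is the only continuous distribution with this property, which is why observing it supports the "previous failure does not influence the next one" conclusion.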

  10. Segmented quasi-coaxial HP-Ge detectors optimized for spatial localization of the events

    International Nuclear Information System (INIS)

    Ripamonti, Giancarlo; Pulici, Paolo; Abbiati, Roberto

    2006-01-01

A methodology for the design of segmented high-purity germanium detectors is presented. It is motivated by the need to make it easier to derive fast algorithms for measuring the γ-ray interaction position. Using this study, detector geometries can be designed that allow a first estimate of the interaction coordinate along the carrier drift direction by analyzing the shape of the signal of a single segment. The maximum achievable resolution and the corresponding requirements on the electronics are highlighted: basic unavoidable constraints limit the resolution to around 3 mm, but this first position estimate can be used, at least in principle, as a starting point for more accurate, although computationally heavy, algorithms

  11. Simulation and real-time analysis of pulse shapes from segmented HPGe-detectors

    Energy Technology Data Exchange (ETDEWEB)

    Schlarb, Michael Christian

    2009-11-17

The capabilities of future HPGe arrays consisting of highly segmented detectors, like AGATA, will depend heavily on the performance of γ-ray tracking. The most crucial component in the whole concept is the pulse-shape analysis (PSA). The working principle of PSA is to compare the experimental signal shape with signals available from a basis set with known interaction locations. The efficiency of the tracking algorithm hinges on the ability of the PSA to reconstruct the interaction locations accurately, especially for multiple γ-interactions. Given the size of the arrays, the PSA algorithm must run in a real-time environment. A prerequisite for successful PSA is accurate knowledge of the detector's response. Making a full coincidence scan of a single AGATA detector, however, takes between two and three months, which is too long to produce an experimental signal basis for all detector elements. A straightforward possibility is to use a precise simulation of the detector and to provide a basis of simulated signals. For this purpose the Java Agata Signal Simulation (JASS) was developed in the course of this thesis. The geometry of the detector is given with numerical precision, and models describing the anisotropic mobilities of the charge carriers in germanium were taken from the literature. The pulse shapes of the transient and net-charge signals are calculated using weighting potentials on a finite grid. Special care was taken that the interpolation routine not only reproduces the weighting potentials precisely in the highly varying areas of the segment boundaries but also that its performance is independent of the location within the detector. Finally, data from a coincidence scan and a pencil-beam experiment were used to verify JASS. The experimental signals are reproduced accurately by the simulation. Pulse-shape analysis reconstructs the positions of the individual interactions and the corresponding energy deposits within the detector.

  13. Alpha-event and surface characterisation in segmented true-coaxial HPGe detectors

    Energy Technology Data Exchange (ETDEWEB)

    Abt, I.; Garbini, L., E-mail: luciagarbini86@gmail.com.mpg.de; Gooch, C.; Irlbeck, S.; Liu, X.; Palermo, M.; Schulz, O.

    2017-06-21

A detailed study of alpha interactions on the passivation layer on the end-plate of a true-coaxial high-purity germanium detector is presented. The observation of alpha events on such a surface indicates an unexpectedly thin so-called “effective dead layer” of less than 20 µm thickness. In addition, the influence of the metalisation close to the end-plate on the time evolution of the output pulses is discussed. The results indicate that alpha contamination can result in events which could be mistaken for signals of neutrinoless double beta decay, and they provide some guidance on how to prevent this.

  14. HPGe detector shielding adjustment

    International Nuclear Information System (INIS)

    Trnkova, L.; Rulik, P.

    2008-01-01

Low-level background shielding of HPGe detectors is used mainly for environmental samples with very low radionuclide content. The National Radiation Protection Institute (SURO) in Prague is equipped with 14 HPGe detectors with relative efficiencies up to 150%. The detectors are placed in a room built from materials with low content of natural radionuclides and equipped with double isolation of the floor against radon. The detectors themselves are placed in lead or steel shielding. The steel shielding housing one of these detectors, with a relative efficiency of 100%, was chosen to be rebuilt to achieve a lower minimum detectable activity (MDA). Additional lead and copper shielding was built inside the original steel shielding to reduce the volume of the inner space, which was flushed with nitrogen by means of evaporating liquid nitrogen. The additional lead and copper shielding, the consequent reduction of the inner volume and the supply of evaporated nitrogen decreased the background count rate and accordingly the MDA values. The effect of the nitrogen flushing on the net areas of peaks belonging to radon daughters is significant. The enhanced shielding has the biggest influence in the low-energy range, as can be seen in the collected data. MDA values in the energy range from 30 keV to 400 keV decreased to 0.65-0.85 of their original values; in the energy range from 400 keV to 2 MeV they fell to 0.70-0.97 of their original values. (authors)
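The MDA figures quoted above are conventionally computed with the Currie formula; a sketch with assumed numbers (not SURO's) shows why halving the continuum background lowers the MDA by roughly a factor of √2:

```python
import math

def currie_mda(bkg_counts, live_time_s, efficiency, emission_prob):
    """Currie minimum detectable activity in Bq:
    MDA = (2.71 + 4.65*sqrt(B)) / (eff * p_gamma * t),
    with B the background counts under the peak region."""
    return (2.71 + 4.65 * math.sqrt(bkg_counts)) / (
        efficiency * emission_prob * live_time_s
    )

# Assumed numbers: 1-day count, 2% efficiency, 85% emission probability
before = currie_mda(10000.0, 86400.0, 0.02, 0.85)  # original shielding
after = currie_mda(5000.0, 86400.0, 0.02, 0.85)    # background halved
```

Because the MDA scales essentially with √B, the 0.65-0.85 reductions reported correspond to background reductions of roughly a factor of 1.4-2.4.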

  15. A Multi-Contact, Low Capacitance HPGe Detector for High Rate Gamma Spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

Cox, Christopher [XIA LLC, Hayward, CA (United States)]

    2014-12-04

The detection, identification and non-destructive assay of special nuclear materials and nuclear-fission by-products are critically important activities in support of nuclear non-proliferation programs. Both national and international nuclear safeguards agencies recognize that current accounting methods for spent nuclear fuel are inadequate from a safeguards perspective. Radiation detection and analysis by gamma-ray spectroscopy is a key tool in this field, but no instrument exists that can deliver the required performance (energy resolution and detection sensitivity) in the presence of the very high background count rates encountered in the nuclear safeguards arena. This project addresses that critical need by developing a unique gamma-ray detector based on high-purity germanium that has the previously unachievable property of operating in the 1 million counts-per-second range while achieving the state-of-the-art energy resolution necessary to identify and analyze the isotopes of interest. The technical approach was to design and fabricate a germanium detector with multiple segmented electrodes coupled to multi-channel high-rate spectroscopy electronics. Dividing the germanium detector's signal electrode into smaller sections offers two advantages: firstly, the energy resolution of the detector is potentially improved, and secondly, the detector is able to operate at higher count rates. The design challenges included the following: determining the optimum electrode configuration to meet the stringent energy-resolution and count-rate requirements; determining the electronic noise (and therefore energy resolution) of the completed system after multiple signals are recombined; designing the germanium crystal housing and vacuum cryostat; and customizing electronics to perform the signal recombination in real time. In this Phase I work, commercial off-the-shelf electrostatic modeling software was used to develop the segmented germanium crystal geometry.

  16. Characterization of the first true coaxial 18-fold segmented n-type prototype HPGe detector for the GERDA project

    International Nuclear Information System (INIS)

    Abt, I.; Caldwell, A.; Gutknecht, D.; Kroeninger, K.; Lampert, M.; Liu, X.; Majorovits, B.; Quirion, D.; Stelzer, F.; Wendling, P.

    2007-01-01

The first true-coaxial 18-fold segmented n-type HPGe prototype detector produced by Canberra-France for the GERDA neutrinoless double-beta-decay project was tested both at Canberra-France and at the Max-Planck-Institut für Physik in Munich. The main characteristics of the detector are given and measurements of the detector properties are described. A novel method to establish contacts between the crystal and a Kapton cable is presented

  17. Absolute efficiency calibration of HPGe detector by simulation method

    International Nuclear Information System (INIS)

    Narayani, K.; Pant, Amar D.; Verma, Amit K.; Bhosale, N.A.; Anilkumar, S.

    2018-01-01

High-resolution gamma-ray spectrometry with HPGe detectors is a powerful radio-analytical technique for estimating the activity of various radionuclides. In the present work the absolute efficiency calibration of the HPGe detector was carried out using the Monte Carlo simulation technique, and the results are compared with those obtained by experiment using the standard radionuclides 152Eu and 133Ba. The coincidence-summing correction factors for the measurement of these nuclides were also calculated

  18. Electrically-cooled HPGe detector for advanced x-ray spectroscopy and imaging

    Energy Technology Data Exchange (ETDEWEB)

    Marian, V.; Clauss, J.; Pirard, B.; Quirin, P.; Flamanc, J.; Lampert, M.O. [CANBERRA France, Parc des Tanneries, 1, chemin de la roseraie, 67380 Lingolsheim (France)

    2015-07-01

High-purity germanium (HPGe) detectors are used for high-resolution x- and gamma-ray spectroscopy. The necessary cryogenic cooling is provided by liquid nitrogen or by electromechanical coolers. Although these are mature and industrialized solutions, most HPGe detectors integrating electrical coolers show limited spectroscopic performance due to the generated mechanical vibrations and electromagnetic interference. This paper describes a novel HPGe detector specifically designed to address the challenges of ultimate x-ray spectroscopy and imaging applications. Owing to the stringent demands of nano-scale imaging in synchrotron applications, a custom-designed cryostat was built around a Canberra CP5-Plus electrical cooler featuring extremely low vibration levels and high cooling power. The heat generated by the cryo-cooler itself, as well as by the electronics, is evacuated via an original liquid cooling circuit. This architecture can also be used at high ambient temperatures, where conventional cryo-coolers cannot work properly. The multichannel detector head can consist of a segmented monolithic HPGe sensor or several closely packed sensors. Each sensor channel is read out by state-of-the-art pulse-reset preamplifiers in order to achieve excellent energy resolution at count rates in excess of 1 Mcps. The sensitive electronics are located in EMI-proof housings to avoid any interference from other devices on a beamline. The front end of the detector is built from selected high-purity materials and alloys to avoid any fluorescence effects. We present a detailed description of the detector design and report on its performance. A discussion is also given of the use of electrically cooled HPGe detectors for applications requiring ultimate energy resolution, such as synchrotron science, medicine or the nuclear industry. (authors)

  19. Measurement of β-decay end point energy with planar HPGe detector

    Science.gov (United States)

    Bhattacharjee, T.; Pandit, Deepak; Das, S. K.; Chowdhury, A.; Das, P.; Banerjee, D.; Saha, A.; Mukhopadhyay, S.; Pal, S.; Banerjee, S. R.

    2014-12-01

The β-γ coincidence measurement has been performed with a segmented planar hyper-pure germanium (HPGe) detector and a single coaxial HPGe detector to determine the end-point energies of nuclear β-decays. The experimental end-point energies have been determined for some of the known β-decays in 106Rh → 106Pd. The end-point energies corresponding to three weak branches in the 106Rh → 106Pd decay have been measured for the first time. The γ-ray and β-particle responses of the planar HPGe detector were simulated using the Monte Carlo based code GEANT3. The experimentally obtained β spectra were successfully reproduced by the simulation.
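End-point energies of allowed β branches are classically extracted from a Kurie plot, in which the transformed spectrum is linear in kinetic energy and crosses zero at the end point; the sketch below uses a synthetic allowed-shape spectrum (Fermi function neglected, end-point value assumed for illustration):

```python
import numpy as np

ME = 511.0  # electron rest energy, keV

def kurie_endpoint(kinetic_kev, counts):
    """End-point estimate from a Kurie plot of an allowed, background-free
    beta spectrum: K = sqrt(N / (p*W)) is linear in the kinetic energy T
    and crosses zero at the end point (Coulomb/Fermi function neglected)."""
    w = kinetic_kev + ME                 # total energy
    p = np.sqrt(w ** 2 - ME ** 2)       # momentum (keV/c units)
    k = np.sqrt(counts / (p * w))
    slope, intercept = np.polyfit(kinetic_kev, k, 1)
    return -intercept / slope           # K(T) = 0 at the end point

# Synthetic allowed spectrum with an assumed 3541 keV end point
q = 3541.0
t = np.linspace(200.0, 3300.0, 60)
w = t + ME
counts = np.sqrt(w ** 2 - ME ** 2) * w * (q - t) ** 2
estimate = kurie_endpoint(t, counts)
```

In the real measurement the β response simulated with GEANT3 must first be deconvolved from the measured spectrum before such an extrapolation is meaningful.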

  20. Characterization of large volume HPGe detectors. Part II: Experimental results

    International Nuclear Information System (INIS)

    Bruyneel, Bart; Reiter, Peter; Pascovici, Gheorghe

    2006-01-01

Measurements on a 12-fold segmented, n-type, large-volume, irregularly shaped HPGe detector were performed in order to determine the parameters of the anisotropic mobility of electrons and holes as charge carriers created by γ-ray interactions. To characterize the electron mobility, the complete outer detector surface was scanned in small steps employing photopeak interactions at 60 keV. A precise measurement of the hole-drift anisotropy was performed with 356 keV γ-rays. The drift-velocity anisotropy and the crystal geometry cause considerable rise-time differences in the pulse shapes, depending on the position of the spatial charge-carrier creation. Pulse shapes of direct and transient signals are reproduced by weighting-potential calculations with high precision. The measured angular dependence of the rise times is caused by the anisotropic mobility, the crystal geometry, the changing field strength and space-charge effects. The preamplified signals were processed employing digital spectroscopy electronics. Response functions, crosstalk contributions and averaging procedures were taken into account, requiring novel methods due to the segmentation of the Ge crystal and the digital signal-processing electronics

  1. Characterization of HPGe detectors using Computed Tomography

    Energy Technology Data Exchange (ETDEWEB)

    Hedman, A., E-mail: Angelica.Hedman@foi.se [Swedish Defence Research Agency, Division of CBRN Defence and Security, SE-90182 Umeå (Sweden); Umeå University, Department of Radiation Sciences, Radiation Physics, SE-90187 Umeå (Sweden); Bahar Gogani, J.; Granström, M. [Swedish Defence Research Agency, Division of CBRN Defence and Security, SE-90182 Umeå (Sweden); Johansson, L.; Andersson, J.S. [Umeå University, Department of Radiation Sciences, Radiation Physics, SE-90187 Umeå (Sweden); Ramebäck, H. [Swedish Defence Research Agency, Division of CBRN Defence and Security, SE-90182 Umeå (Sweden); Chalmers University of Technology, Department of Chemical and Biological Engineering, Nuclear Chemistry, SE-41296 Göteborg (Sweden)

    2015-06-11

Computed Tomography (CT) high-resolution imaging has been used to investigate whether there is a significant change in the crystal-to-window distance, i.e. the air-gap thickness, in a small n-type detector cooled to 77 K and in a medium-sized p-type HPGe detector cooled to 100 K. The findings were compared with detector dimension data made available by the manufacturer. The air-gap thickness increased by (0.38±0.07) mm for the n-type detector and by (0.40±0.15) mm for the p-type detector when the detectors were cooled to 77 K and 100 K, respectively, compared to room temperature. Monte Carlo calculations indicate that these differences have a significant impact on the efficiency in close geometries (<5 cm). In the energy range 40–700 keV, with a source placed directly on the endcap, the change in detector efficiency with temperature is 1.9–2.9% for the n-type detector and 0.3–2.1% for the p-type detector. The measured air-gap thickness of the cooled detector was 1.1 mm larger than the manufacturer data for the n-type detector and 0.2 mm larger for the p-type detector. In the energy range 40–700 keV and with a source on the endcap, this results in a change in detector efficiency of 5.2–7.1% for the n-type detector and 0.2–1.0% for the p-type detector, i.e. the detector efficiency is overestimated when using the data available from the manufacturer.
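The sensitivity of close-geometry efficiency to a millimetre-scale air gap is essentially a solid-angle effect; a point-source, on-axis sketch with assumed dimensions (not the actual crystal sizes) reproduces the right order of magnitude:

```python
import math

def fractional_solid_angle(distance_mm, radius_mm):
    """Fraction of 4*pi subtended by the flat crystal face (radius r)
    seen from an on-axis point source at distance d:
    Omega / 4pi = (1 - d / sqrt(d^2 + r^2)) / 2."""
    d, r = distance_mm, radius_mm
    return 0.5 * (1.0 - d / math.hypot(d, r))

# Assumed dimensions: 25 mm crystal-face radius, source nominally 5 mm
# from the crystal; an unaccounted extra 1.1 mm of air gap shrinks the
# subtended solid angle, so the nominal geometry overestimates efficiency
nominal = fractional_solid_angle(5.0, 25.0)
actual = fractional_solid_angle(5.0 + 1.1, 25.0)
overestimate_pct = 100.0 * (nominal - actual) / actual
```

With these assumed numbers the overestimate comes out at a few percent, comparable to the 5.2–7.1% reported for the n-type detector.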

  2. Measurement of β-decay end point energy with planar HPGe detector

    Energy Technology Data Exchange (ETDEWEB)

    Bhattacharjee, T., E-mail: btumpa@vecc.gov.in [Physics Group, Variable Energy Cyclotron Centre, Kolkata 700 064 (India); Pandit, Deepak [Physics Group, Variable Energy Cyclotron Centre, Kolkata 700 064 (India); Das, S.K. [RCD-BARC, Variable Energy Cyclotron Centre, Kolkata 700 064 (India); Chowdhury, A.; Das, P. [Physics Group, Variable Energy Cyclotron Centre, Kolkata 700 064 (India); Banerjee, D. [RCD-BARC, Variable Energy Cyclotron Centre, Kolkata 700 064 (India); Saha, A.; Mukhopadhyay, S.; Pal, S.; Banerjee, S.R. [Physics Group, Variable Energy Cyclotron Centre, Kolkata 700 064 (India)

    2014-12-11

    The β–γ coincidence measurement has been performed with a segmented planar Hyper-Pure Germanium (HPGe) detector and a single coaxial HPGe detector to determine the end point energies of nuclear β-decays. The experimental end point energies have been determined for some of the known β-decays in {sup 106}Rh→{sup 106}Pd. The end point energies corresponding to three weak branches in {sup 106}Rh→{sup 106}Pd decay have been measured for the first time. The γ ray and β particle responses for the planar HPGe detector were simulated using the Monte Carlo based code GEANT3. The experimentally obtained β spectra were successfully reproduced with the simulation.

  3. Method applied for the HPGe detector characterization

    International Nuclear Information System (INIS)

    Guillot, Nicolas; Monestier, Mathieu; Saurel, Nicolas

    2013-06-01

    Gamma-ray spectrometry is a passive non-destructive assay most commonly used to identify and quantify the radionuclides present in large, complex objects such as nuclear waste packages. The treatment of spectra from the measurement of nuclear waste is performed in two steps: the first step is to extract the raw data from the spectra (energies and net photoelectric absorption peak areas), and the second step is to determine the detection efficiency of the measured scene. Establishing the detection efficiency of the measured scene by numerical modeling requires modeling both the measuring device (in this case a hyper-pure germanium detector, HPGe) and the measured object. Numerical detector modeling, also called diode characterization, yields a spatial response equivalent to that of the real HPGe detector. This characterization is essential for the quantification of complex, non-reproducible large objects for which the detection efficiency cannot be determined empirically. The Nuclear Measurement and Valuation Laboratory (LMNE) at the Atomic Energy Commission Valduc (CEA Valduc) has developed a new methodology for characterizing the HPGe detector. It has been tested experimentally with a real diode present in the laboratory (a P-type planar detector). The characterization obtained with this methodology is similar to that of the real HPGe detector, with an uncertainty approaching 5 percent. It is valid for distances ranging from 10 cm to 150 cm, angles ranging from 0 to 90 degrees and energies from 53 keV to 1112 keV. The energy range is covered with a source of Barium-133 and a source of Europium-152. The continuity of the detection efficiency curve is checked between the two sources with an uncertainty of less than 2 percent. In addition, this methodology can be extrapolated to any planar detector crystal geometry. (authors)

  4. HPGe detectors timing using pulse shape analysis techniques

    International Nuclear Information System (INIS)

    Crespi, F.C.L.; Vandone, V.; Brambilla, S.; Camera, F.; Million, B.; Riboldi, S.; Wieland, O.

    2010-01-01

    In this work Pulse Shape Analysis has been used to improve the time resolution of High-Purity Germanium (HPGe) detectors. A set of time-aligned signals was acquired in a coincidence measurement using a coaxial HPGe and a cerium-doped lanthanum chloride (LaCl3:Ce) scintillation detector. The analysis of a Constant Fraction Discriminator (CFD) time output versus the HPGe signal shape shows that the time resolution ranges from 2 to 12 ns depending on the slope in the initial part of the signal. An optimization procedure for the CFD parameters gives the same final time resolution (8 ns) as the one achieved after a correction of the CFD output based on the position of the current-pulse maximum. Finally, an algorithm based on Pulse Shape Analysis was applied to the experimental data and a time resolution between 3 and 4 ns was obtained, corresponding to a 50% improvement over that given by standard CFDs.
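    A digital constant-fraction discriminator of the kind discussed above can be sketched in a few lines: the signal is attenuated by a fraction, a delayed copy is subtracted, and the zero crossing of the difference gives the timestamp. The fraction, delay and pulse shape below are illustrative, not the optimized parameters from the study:

```python
import numpy as np

def cfd_crossing(pulse, frac=0.3, delay=10):
    """Digital constant-fraction discriminator: return the zero-crossing sample
    of frac*pulse[t] - pulse[t-delay], interpolated linearly between samples."""
    shifted = np.zeros_like(pulse)
    shifted[delay:] = pulse[:-delay]
    cfd = frac * pulse - shifted
    # find the first positive-to-negative sign change of the CFD signal
    for i in range(1, len(cfd)):
        if cfd[i - 1] > 0 and cfd[i] <= 0:
            return (i - 1) + cfd[i - 1] / (cfd[i - 1] - cfd[i])
    return None

# Synthetic preamplifier-like pulse: fast rise at t0, slow exponential decay
t = np.arange(200, dtype=float)
t0 = 50.0
pulse = np.where(t < t0, 0.0,
                 (1 - np.exp(-(t - t0) / 5.0)) * np.exp(-(t - t0) / 500.0))

print(cfd_crossing(pulse))  # crossing time in samples, shortly after the onset
```

    Because the crossing point of the difference signal is (ideally) independent of the pulse amplitude, the CFD avoids the amplitude walk of a fixed-threshold trigger; the residual shape dependence is what the pulse-shape-based corrections in the record address.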

  5. The influence of anisotropic electron drift velocity on the signal shapes of closed-end HPGe detectors

    CERN Document Server

    Mihailescu, L; Lieder, R M; Brands, H; Jaeger, H

    2000-01-01

    This study is concerned with the anisotropy of the electron drift velocity in germanium crystals at high electric fields and low temperature, and its influence on the charge collection process in n-type, high-purity germanium (HPGe) detectors of closed-end, coaxial geometry. The electron trajectories inside HPGe detectors are simulated using a phenomenological model to calculate the dependence of the drift velocity on the angle between the electric field and the crystal orientation. The resulting induced currents and pulse shapes for a given detector geometry and preamplifier bandwidth are compared to experiment. Experimentally, the dependence of the pulse shapes on the conductivity anisotropy in closed-end HPGe detectors was observed. The experimental data on pulse shapes were obtained by sampling preamplifier signals of an encapsulated, hexaconical EUROBALL detector, which was irradiated by collimated 22 Na and 241 Am sources. The crystal orientation was measured by neutron reflection...

  6. Relative efficiency calculation of a HPGe detector using MCNPX code

    International Nuclear Information System (INIS)

    Medeiros, Marcos P.C.; Rebello, Wilson F.; Lopes, Jose M.; Silva, Ademir X.

    2015-01-01

    High-purity germanium (HPGe) detectors are indispensable tools for gamma spectrometry because of their excellent energy resolution. The efficiency of such detectors, quoted in the manufacturer's list of specifications, frequently refers to the relative full-energy peak efficiency, defined with respect to the absolute full-energy peak efficiency of a 7.6 cm x 7.6 cm (diameter x height) NaI(Tl) crystal, based on the 1.33 MeV peak of a 60 Co source positioned 25 cm from the detector. In this study, we used the MCNPX code to simulate an HPGe detector (Canberra GC3020) from the Real-Time Neutrongraphy Laboratory of UFRJ, surveying the spectrum of a 60 Co source located 25 cm from the detector in order to calculate and confirm the efficiency declared by the manufacturer. Agreement between experimental and simulated data was achieved. The model under development will be used for calculation and comparison against the detector calibration curve from the Genie2000™ software, also serving as a reference for future studies. (author)

  7. Gamma-ray Full Spectrum Analysis for Environmental Radioactivity by HPGe Detector

    Science.gov (United States)

    Jeong, Meeyoung; Lee, Kyeong Beom; Kim, Kyeong Ja; Lee, Min-Kie; Han, Ju-Bong

    2014-12-01

    Odyssey, one of NASA's Mars exploration missions, and SELENE (Kaguya), a Japanese lunar orbiter, carry a Gamma-Ray Spectrometer (GRS) payload for analyzing the radioactive chemical elements of the atmosphere and the surface. Today, gamma-ray spectroscopy with a High-Purity Germanium (HPGe) detector is widely used for activity measurements of natural radionuclides contained in the soil of the Earth. The energy spectra obtained by HPGe detectors have generally been analyzed by means of the Window Analysis (WA) method, in which activity concentrations are determined using the net counts in an energy window around individual peaks. An alternative method, the so-called Full Spectrum Analysis (FSA) method, uses count numbers not only from full-absorption peaks but also from the contributions of Compton scattering of the gamma-rays. Consequently, while it takes a substantial time to obtain a statistically significant result with the WA method, the FSA method requires a much shorter time to reach the same level of statistical significance. This study presents validation results for the FSA method. We have compared the activity concentrations of 40K, 232Th and 238U in soil measured by the WA method and the FSA method, respectively. The gamma-ray spectra of reference materials (RGU, RGTh and KCl) and soil samples were measured with a 120% HPGe detector equipped with a cosmic-muon veto detector. From the comparison of activity concentrations between the FSA and the WA, we conclude that the FSA method is validated against the WA method. This implies that the FSA method can be used in harsh measurement environments, such as gamma-ray measurements on the Moon, in which a given level of statistical significance must be reached in a much shorter data acquisition time than the WA method allows.
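    The WA method's net-count extraction described above amounts to summing a window around a peak and subtracting a linear (trapezoidal) background estimated from flanking channels. A minimal sketch, using a synthetic spectrum rather than the paper's data:

```python
import numpy as np

def window_net_counts(spectrum, peak_lo, peak_hi, bg_width=3):
    """Net counts in a peak window: gross counts minus a linear (trapezoidal)
    background estimated from bg_width channels on each side of the window."""
    spectrum = np.asarray(spectrum, dtype=float)
    gross = spectrum[peak_lo:peak_hi + 1].sum()
    n_chan = peak_hi - peak_lo + 1
    left = spectrum[peak_lo - bg_width:peak_lo].mean()
    right = spectrum[peak_hi + 1:peak_hi + 1 + bg_width].mean()
    return gross - 0.5 * (left + right) * n_chan

# Synthetic spectrum: flat background of 10 counts/channel plus a 320-count peak
spec = np.full(100, 10.0)
spec[48:53] += np.array([20.0, 80.0, 120.0, 80.0, 20.0])

print(window_net_counts(spec, 48, 52))  # → 320.0
```

    FSA, by contrast, fits the full spectral shape (peaks plus Compton continua of standard spectra) to the measured spectrum, which is why it reaches the same statistical significance with fewer counts.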

  8. Influence of Cell Phone Waves on the Performance of HPGe Detector

    International Nuclear Information System (INIS)

    Mansour, N.A.; Hassan, M.F.

    2012-01-01

    Radiation search systems constructed with high-resolution germanium (HPGe) detectors are currently being installed at locations worldwide. This reflects a general desire for improved performance and a reduction in the time needed to make a good decision in interdiction cases. An integrated gamma-ray spectrometer, incorporating a mechanically-cooled HPGe detector, digital signal-processing electronics, an MCA, and communications, has been developed to meet the detection and environmental needs of these systems. The HPGe detectors are designed to have good low- and medium-energy detection efficiency and excellent spectral peak resolution in order to eliminate peak overlaps and thereby remove problems caused by common industrial and medical radionuclides. Systems using detectors with inferior resolution, regardless of efficiency, are unable to separate the radiation signals of NORM and illicit nuclides. The absolute full-energy peak efficiency of the detector and the background count-rate in the peak energy region determine the signal-to-noise ratio. The measurements presented show the impact of shielding and masking on performance, and the results illustrate the applicability of the design to a variety of monitoring situations for the detection of illicit material. In the present work we studied the effects of waves from different types of mobile phones on the performance of a 70% HPGe X- and gamma-ray detector. The detected interference lies in the energy range 30-100 keV. A correction was estimated as a function of time versus cell phone type, so that the quality of the measured gamma-ray spectra can be corrected in the low X-ray region. The effect of these waves on the performance of the main detector amplifier was also studied. Results were obtained for the Etisalat, Vodafone and Mobinil networks. The method introduced can be adapted to other devices exhibiting the same interference effect.

  9. Uncertainty analysis in the simulation of an HPGe detector using the Monte Carlo Code MCNP5

    International Nuclear Information System (INIS)

    Gallardo, Sergio; Pozuelo, Fausto; Querol, Andrea; Verdu, Gumersindo; Rodenas, Jose; Ortiz, J.; Pereira, Claubia

    2013-01-01

    A gamma spectrometer including an HPGe detector is commonly used for environmental radioactivity measurements. Many works have focused on the simulation of the HPGe detector using Monte Carlo codes such as MCNP5. However, the simulation of this kind of detector presents important difficulties owing to the lack of information from manufacturers and to the loss of intrinsic properties in aging detectors. Some parameters, such as the active volume or the Ge dead-layer thickness, are often unknown and are estimated during simulations. In this work, a detailed model of an HPGe detector and a petri dish containing a certified gamma source has been developed. The certified gamma source contains nuclides covering the energy range between 50 and 1800 keV. As a result of the simulation, the Pulse Height Distribution (PHD) is obtained, and the efficiency curve can be calculated from the net peak areas, taking into account the certified activity of the source. In order to avoid errors in the net-area calculation, the simulated PHD is treated using the GammaVision software. Furthermore, it is proposed to use the Noether-Wilks formula to perform an uncertainty analysis of the model, with the main goal of determining the efficiency curve of this detector and its associated uncertainty. The uncertainty analysis has been focused on the dead-layer thickness at different positions on the crystal. The results confirm the important role of the dead-layer thickness in the low-energy range of the efficiency curve. In the high-energy range (from 300 to 1800 keV), the main contribution to the absolute uncertainty comes from variations in the active volume. (author)
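    The efficiency-curve calculation described above (net peak area divided by the number of photons emitted at that energy, given the certified source activity) can be sketched as follows; the numbers are illustrative, not taken from the study:

```python
def peak_efficiency(net_counts, activity_bq, emission_prob, live_time_s):
    """Full-energy-peak efficiency: net peak counts divided by the number of
    gamma rays emitted at that energy during the (live) measurement time."""
    emitted = activity_bq * emission_prob * live_time_s
    return net_counts / emitted

# Illustrative numbers: a 662 keV line (emission probability ~0.851),
# a 1 kBq source, 600 s live time
eff = peak_efficiency(net_counts=10212, activity_bq=1000.0,
                      emission_prob=0.851, live_time_s=600.0)
print(f"efficiency at 662 keV: {eff:.4f}")  # → 0.0200
```

    Repeating this at each certified line and fitting a smooth function through the points yields the efficiency curve whose uncertainty the record analyzes.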

  10. Gamma-ray Full Spectrum Analysis for Environmental Radioactivity by HPGe Detector

    Directory of Open Access Journals (Sweden)

    Meeyoung Jeong

    2014-12-01

    Full Text Available Odyssey, one of NASA's Mars exploration missions, and SELENE (Kaguya), a Japanese lunar orbiter, carry a Gamma-Ray Spectrometer (GRS) payload for analyzing the radioactive chemical elements of the atmosphere and the surface. Today, gamma-ray spectroscopy with a High-Purity Germanium (HPGe) detector is widely used for activity measurements of natural radionuclides contained in the soil of the Earth. The energy spectra obtained by HPGe detectors have generally been analyzed by means of the Window Analysis (WA) method, in which activity concentrations are determined using the net counts in an energy window around individual peaks. An alternative method, the so-called Full Spectrum Analysis (FSA) method, uses count numbers not only from full-absorption peaks but also from the contributions of Compton scattering of the gamma-rays. Consequently, while it takes a substantial time to obtain a statistically significant result with the WA method, the FSA method requires a much shorter time to reach the same level of statistical significance. This study presents validation results for the FSA method. We have compared the activity concentrations of 40K, 232Th and 238U in soil measured by the WA method and the FSA method, respectively. The gamma-ray spectra of reference materials (RGU, RGTh and KCl) and soil samples were measured with a 120% HPGe detector equipped with a cosmic-muon veto detector. From the comparison of activity concentrations between the FSA and the WA, we conclude that the FSA method is validated against the WA method. This implies that the FSA method can be used in harsh measurement environments, such as gamma-ray measurements on the Moon, in which a given level of statistical significance must be reached in a much shorter data acquisition time than the WA method allows.

  11. Uncertainty analysis in the simulation of an HPGe detector using the Monte Carlo Code MCNP5

    Energy Technology Data Exchange (ETDEWEB)

    Gallardo, Sergio; Pozuelo, Fausto; Querol, Andrea; Verdu, Gumersindo; Rodenas, Jose, E-mail: sergalbe@upv.es [Universitat Politecnica de Valencia, Valencia, (Spain). Instituto de Seguridad Industrial, Radiofisica y Medioambiental (ISIRYM); Ortiz, J. [Universitat Politecnica de Valencia, Valencia, (Spain). Servicio de Radiaciones. Lab. de Radiactividad Ambiental; Pereira, Claubia [Universidade Federal de Minas Gerais (UFMG), Belo Horizonte, MG (Brazil). Departamento de Engenharia Nuclear

    2013-07-01

    A gamma spectrometer including an HPGe detector is commonly used for environmental radioactivity measurements. Many works have focused on the simulation of the HPGe detector using Monte Carlo codes such as MCNP5. However, the simulation of this kind of detector presents important difficulties owing to the lack of information from manufacturers and to the loss of intrinsic properties in aging detectors. Some parameters, such as the active volume or the Ge dead-layer thickness, are often unknown and are estimated during simulations. In this work, a detailed model of an HPGe detector and a petri dish containing a certified gamma source has been developed. The certified gamma source contains nuclides covering the energy range between 50 and 1800 keV. As a result of the simulation, the Pulse Height Distribution (PHD) is obtained, and the efficiency curve can be calculated from the net peak areas, taking into account the certified activity of the source. In order to avoid errors in the net-area calculation, the simulated PHD is treated using the GammaVision software. Furthermore, it is proposed to use the Noether-Wilks formula to perform an uncertainty analysis of the model, with the main goal of determining the efficiency curve of this detector and its associated uncertainty. The uncertainty analysis has been focused on the dead-layer thickness at different positions on the crystal. The results confirm the important role of the dead-layer thickness in the low-energy range of the efficiency curve. In the high-energy range (from 300 to 1800 keV), the main contribution to the absolute uncertainty comes from variations in the active volume. (author)

  12. MCNP HPGe detector benchmark with previously validated Cyltran model.

    Science.gov (United States)

    Hau, I D; Russ, W R; Bronson, F

    2009-05-01

    An exact copy of the detector model generated for Cyltran was reproduced as an MCNP input file, and the detection efficiency was calculated following the methodology used in previous experimental measurements and simulations of a 280 cm(3) HPGe detector. Below 1000 keV the MCNP data agreed with the Cyltran results to within 0.5%, while above this energy the difference between MCNP and Cyltran increased to about 6% at 4800 keV, depending on the electron cut-off energy.

  13. Dose measurements with a HPGe detector - a technical manual

    Energy Technology Data Exchange (ETDEWEB)

    Lidström, K.; Nordenfors, C.; Ågren, G.

    2000-06-01

    This paper is a technical manual for estimating dose from a gamma spectrum. The method used is based on the Monte Carlo code EGS4. Since dose estimates from spectra are specific to each detector, this work was performed on two mobile HPGe detectors at FOA NBC Defence in Umeå. The manual describes the method in three steps: Part 1 explains how to construct a model of the detector geometry and the specific materials for a new detector. Part 2 describes the underlying Monte Carlo simulation of a detector with a given geometry and materials. Part 3 describes dose estimation from a gamma spectrum.

  14. Sourceless efficiency calibration for HPGe detector based on medical images

    International Nuclear Information System (INIS)

    Chen Chaobin; She Ruogu; Xiao Gang; Zuo Li

    2012-01-01

    A digital phantom of the patient and a region of interest (assumed to be filled with an isotropic volume source) are built from medical CT images. They are used to calculate the detection efficiency of HPGe detectors located outside the human body, by a sourceless calibration method based on a fast integral technique and by the MCNP code, respectively. The results from the two codes are in good agreement, apart from a maximum difference of about 5% in the intermediate energy region. The software produced in this work performs better than the Monte Carlo code not only in computation time but also in the complexity of problems it can handle. (authors)

  15. Charge collection performance of a segmented planar high-purity germanium detector

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, R.J. [Department of Physics, The University of Liverpool, Oliver Lodge Laboratory, Liverpool Merseyside L69 7ZE (United Kingdom)], E-mail: R.Cooper@liverpool.ac.uk; Boston, A.J.; Boston, H.C.; Cresswell, J.R.; Grint, A.N.; Harkness, L.J.; Nolan, P.J.; Oxley, D.C.; Scraggs, D.P. [Department of Physics, The University of Liverpool, Oliver Lodge Laboratory, Liverpool Merseyside L69 7ZE (United Kingdom); Lazarus, I.; Simpson, J. [STFC Daresbury Laboratory, Warrington, Cheshire WA4 4AD (United Kingdom); Dobson, J. [Rosemere Cancer Centre, Royal Preston Hospital, Preston PR2 9HT (United Kingdom)

    2008-10-01

    High-precision scans of a segmented planar high-purity germanium (HPGe) detector have been performed with a range of finely collimated gamma ray beams allowing the response as a function of gamma ray interaction position to be quantified. This has allowed the development of parametric pulse shape analysis (PSA) techniques and algorithms for the correction of imperfections in performance. In this paper we report on the performance of this detector, designed for use in a positron emission tomography (PET) development system.

  16. Assessment of applicability of portable HPGe detector with in situ object counting system based on performance evaluation of thyroid radiobioassays

    Energy Technology Data Exchange (ETDEWEB)

    Park, Min Seok; Kwon, Tae Eun; Pak, Min Jung; Park, Se Young; Ha, Wi Ho; Jin, Young Woo [National Radiation Emergency Medical Center, Korea Institute of Radiological and Medical Sciences, Seoul (Korea, Republic of)

    2017-06-15

    Different cases exist in the measurement of thyroid radiobioassays owing to the individual characteristics of the subjects, especially the potential variation in the counting efficiency. The In situ Object Counting System (ISOCS) was developed to perform efficiency calibrations based on Monte Carlo calculations, as an alternative to conventional calibration methods. The purpose of this study is to evaluate the applicability of ISOCS to thyroid radiobioassays by comparison with a conventional thyroid monitoring system. The efficiency calibration of a portable high-purity germanium (HPGe) detector was performed using the ISOCS software, whereas a conventional efficiency calibration, which requires a radioactive reference source, was applied to a scintillator-based thyroid monitor. Four radioiodine samples containing 125I and 131I, in both aqueous-solution and gel form, were measured to evaluate the radioactivity in the thyroid. The ANSI/HPS N13.30 performance criteria, which include the relative bias, relative precision, and root-mean-squared error, were applied to evaluate the performance of the measurement systems. The portable HPGe detector could measure both radioiodines with ISOCS, but the thyroid monitor could not measure 125I because of the limited energy resolution of the NaI(Tl) scintillator. The 131I results from both detectors agreed to within 5% with the certified results, and the 125I results from the portable HPGe detector agreed to within 10% with the certified results. All measurement results complied with the ANSI/HPS N13.30 performance criteria. The results of the intercomparison program indicate the feasibility of applying the ISOCS software to direct thyroid radiobioassays. A portable HPGe detector with ISOCS software can provide convenient efficiency calibration and higher energy resolution for identifying photopeaks, compared with a conventional thyroid monitor with a NaI(Tl) scintillator. The application of ISOCS software in a radiation
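    The N13.30 figures of merit named above are commonly computed as the mean and standard deviation of the relative errors of the measured activities, combined in quadrature; a sketch under that assumption, with invented data:

```python
import math

def n13_30_metrics(measured, reference):
    """Performance figures of merit in the spirit of ANSI/HPS N13.30:
    relative bias B_r (mean relative error), relative precision S_B
    (sample std. dev. of the relative errors), and RMSE = sqrt(B_r^2 + S_B^2)."""
    rel = [(m - r) / r for m, r in zip(measured, reference)]
    n = len(rel)
    b_r = sum(rel) / n
    s_b = math.sqrt(sum((x - b_r) ** 2 for x in rel) / (n - 1))
    rmse = math.sqrt(b_r ** 2 + s_b ** 2)
    return b_r, s_b, rmse

# Invented example: four measurements of a 100-unit certified activity
measured = [102.0, 98.0, 105.0, 97.0]
reference = [100.0, 100.0, 100.0, 100.0]
print(n13_30_metrics(measured, reference))
```

    The standard then places acceptance limits on these quantities; a measurement system passes when bias, precision and RMSE all stay within the prescribed bounds.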

  17. Development of the MCNPX model for the portable HPGe detector

    International Nuclear Information System (INIS)

    Koleska, Michal; Viererbl, Ladislav; Marek, Milan

    2014-01-01

    The portable HPGe coaxial detector Canberra Big MAC is used at the LVR-15 research reactor for spectrometric measurements of spent nuclear fuel. The fuel is measured in a dedicated system located in the spent-fuel pool near the reactor. For the purpose of calibrating the spectrometric system, the detector was precisely modeled with the MCNPX code. The model was constructed using data from the technical specification provided by the manufacturer and from radiography of the crystal. The detector model was verified against experimental data measured with available standard radionuclide sources and an on-site prepared 110m Ag source. - Highlights: • The inner structure of the HPGe detector is determined. • An MCNPX model of the detector is developed. • The model is verified using different sources for two measurement geometries

  18. New approach for calibration the efficiency of HPGe detectors

    International Nuclear Information System (INIS)

    Alnour, I.A.; Wagiran, H.; Suhaimi Hamzah; Siong, W.B.; Mohd Suhaimi Elias

    2013-01-01

    Full-text: This work evaluates the efficiency calibration of HPGe detectors, a Canberra GC3018 with Genie 2000 software and an Ortec GEM25-76-XLB-C with Gamma Vision software, available at the Neutron Activation Analysis laboratory of the Malaysian Nuclear Agency (NM). The efficiency calibration curve was constructed from measurements of an IAEA standard gamma point-source set consisting of 241 Am, 57 Co, 133 Ba, 152 Eu, 137 Cs and 60 Co. The efficiency calibrations were performed for three different geometries: 5, 10 and 15 cm distances from the detector end cap. Polynomial functions were fitted to the experimental data points with a computer program, MATLAB, in order to find an accurate fit. The efficiency equation was established from the fitted parameters, which allows the efficiency to be evaluated at any particular energy of interest. The study shows significant deviations in the efficiency, depending on the source-detector distance and photon energy. (author)
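    The record fits polynomial efficiency functions in MATLAB; a common parametrization for HPGe efficiency curves, shown here as a sketch in Python with invented calibration points, is a polynomial in log(E) fitted to log(efficiency):

```python
import numpy as np

# Invented calibration points: energy in keV, measured full-energy-peak efficiency
energy = np.array([59.5, 122.1, 356.0, 661.7, 1173.2, 1332.5])
eff = np.array([0.012, 0.025, 0.014, 0.0085, 0.0052, 0.0047])

# Common parametrization: ln(eff) as a low-order polynomial in ln(E)
coeffs = np.polyfit(np.log(energy), np.log(eff), deg=3)

def efficiency(e_kev):
    """Interpolated full-energy-peak efficiency at an arbitrary energy (keV)."""
    return np.exp(np.polyval(coeffs, np.log(e_kev)))

print(efficiency(661.7))  # should land near the measured 0.0085 at that energy
```

    Fitting in log-log space keeps the efficiency positive everywhere and handles the several-decade dynamic range of the curve; one such fit is needed per counting geometry.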

  19. Response function of a p type - HPGe detector

    International Nuclear Information System (INIS)

    Lopez-Pino, Neivy; Cabral, Fatima Padilla; D'Alessandro, Katia; Maidana, Nora Lia; Vanin, Vito Roberto

    2011-01-01

    The response function of an HPGe detector depends on the Ge crystal dimensions and dead-layer thicknesses; most of these are not given by the manufacturers, or change with detector damage from neutrons or contact with the atmosphere, and therefore must be determined experimentally. The response function is obtained by a Monte Carlo simulation procedure based on the Ge crystal characteristics. In this work, a p-type coaxial HPGe detector with 30% relative efficiency, manufactured in 1989, was investigated. The crystal radius and length and the inner hole dimensions were obtained by scanning the capsule in both the radial and axial directions using 4 mm collimated beams from 137 Cs and 207 Bi point sources placed on an x-y table, in steps of 2.00 mm. These dimensions were estimated by comparing the experimental peak areas with those obtained by simulation using several hole configurations. In a similar procedure, the frontal dead-layer thickness was determined using 2 mm collimated beams of the 59 keV gamma-rays from 241 Am and the 81 keV gamma-rays from 133 Ba sources, hitting the detector at 90 deg and 45 deg with respect to the capsule surface. The Monte Carlo detector model included, besides the crystal, hole and capsule sizes, the Ge dead layers. The obtained spectra were folded with a Gaussian resolution function to account for electronic noise. The comparison of simulated and experimental response functions for 4 mm collimated beams of 60 Co, 137 Cs and 207 Bi point sources placed at distances of 7, 11 and 17 cm from the detector end cap showed relative deviations of about 10% in general and below 10% in the peaks. The frontal dead-layer thickness determined by our procedure differed from that specified by the detector manufacturer. (author)

  20. A New Virtual Point Detector Concept for a HPGe detector

    International Nuclear Information System (INIS)

    Byun, Jong In; Yun, Ju Yong

    2009-01-01

    For the last several decades, radiation measurement and radioactivity analysis techniques using gamma detectors have been well established. In particular, the study of detection efficiency has been an important part of gamma spectrometry. The detection efficiency depends strongly on the source-to-detector distance, and its variation with distance can be expressed as a complex function of the geometry and physical characteristics of the detector. In order to simplify this relation, the virtual point detector concept was introduced by Notea. Recently, further studies concerning the virtual point detector have been performed. In previous work, the virtual point detector has been considered a fictitious point located behind the detector end cap. However, the virtual point detector positions for the front and the side of voluminous detectors might differ because of their different effective central axes. In order to define the relation more accurately, one should therefore consider virtual point detectors for the front as well as for the side and off-center directions. The aim of this study is to accurately define the relation between the detection efficiency and the source-to-detector distance using the virtual point detector. This paper demonstrates a method to locate the virtual point detectors for an HPGe detector. The new virtual point detector concept was introduced for three areas of the detector, and its characteristics were demonstrated using the Monte Carlo simulation method. We found that the detector has three virtual point detectors, excluding its rear area. This shows that a virtual point detector should be considered for each area when applying the concept to radiation measurements. The concept can be applied to accurate geometric simplification of the detector and radioactive sources.
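    If one assumes, as in the virtual-point-detector literature, that the efficiency follows an inverse-square law in the distance measured from a fictitious point at offset d0 behind the reference plane, that offset can be recovered from efficiency-versus-distance data by a simple linearization (the data below are synthetic, generated with a known d0):

```python
import numpy as np

def fit_virtual_point(distances, efficiencies):
    """Fit eff = k / (d + d0)^2 by noting that 1/sqrt(eff) is linear in d;
    returns the virtual-point-detector offset d0 behind the reference plane."""
    y = 1.0 / np.sqrt(np.asarray(efficiencies, dtype=float))
    slope, intercept = np.polyfit(np.asarray(distances, dtype=float), y, 1)
    return intercept / slope

# Synthetic data generated with d0 = 2.5 cm, so the fit should recover 2.5
d0_true, k = 2.5, 0.9
d = np.array([5.0, 10.0, 15.0, 25.0])
eff = k / (d + d0_true) ** 2

print(fit_virtual_point(d, eff))  # ≈ 2.5
```

    Finding different d0 values when the same fit is applied to frontal, lateral and off-center data is exactly the behavior the record reports: one virtual point per area of the detector.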

  1. Application of PHOTON simulation software on calibration of HPGe detectors

    Energy Technology Data Exchange (ETDEWEB)

    Nikolic, J., E-mail: jnikolic@vinca.rs [University of Belgrade Institute for Nuclear Sciences Vinča, Mike Petrovica Alasa 12-16, 11001 Belgrade (Serbia); Puzovic, J. [University of Belgrade Faculty of Physics, Studentski trg 6, 11000 Belgrade (Serbia); Todorovic, D.; Rajacic, M. [University of Belgrade Institute for Nuclear Sciences Vinča, Mike Petrovica Alasa 12-16, 11001 Belgrade (Serbia)

    2015-11-01

    One of the major difficulties in gamma spectrometry of voluminous environmental samples is the efficiency calibration of the detectors used for the measurement. The direct measurement of different calibration sources, containing isolated γ-ray emitters within the energy range of interest, with subsequent fitting to a parametric function, is the most accurate and at the same time most complicated and time-consuming method of efficiency calibration. Many other methods have been developed over time, some of them using Monte Carlo simulation. One such method is the dedicated, user-friendly program PHOTON, developed to simulate the passage of photons through different media and geometries. This program was used for the efficiency calibration of three HPGe detectors routinely used in the Laboratory for Environment and Radiation Protection of the Institute for Nuclear Sciences Vinča, Belgrade, Serbia. The simulation produced the spectral response of the detectors for fixed energies and for different sample geometries and matrices. The efficiencies thus obtained were compared to the values obtained by measurement of secondary reference materials and to the results of a GEANT4 simulation, in order to establish whether the simulated values agree with the experimental ones. To further analyze the results, realistic measurements of materials provided by the IAEA within different interlaboratory proficiency tests were performed. The activities obtained using the simulated efficiencies were compared to the reference values provided by the organizer. Good agreement was obtained in the mid-energy section of the spectrum, while at low energies the lack of some parameters in the simulation libraries produced unacceptable discrepancies.

  2. Placement of HPGE detectors for whole body counting applications using simulations of voxel phantoms

    International Nuclear Information System (INIS)

    Marzocchi, O.; Breustedt, B.; Zankl, M.

    2010-01-01

    The partial body counter at KIT is being rebuilt to replace the old Ge detectors with four new HPGe detectors. The new installation will also add whole-body counting capabilities to the system, thanks to improved mechanics able to position the detectors with a high degree of freedom inside the chamber. When defining the detector positions, a compromise had to be sought between the opposing goals of high efficiency and low dependence of the detection efficiency on the position of the source. High detection efficiency requires placing the detector near the skin, where the photon flux is maximal, while the second goal requires placing the detectors at a greater distance from the body. The same concept was applied when defining the partial-body measurement configurations, but there the goal was to increase the specificity of the measurement. In addition, the mechanical installation poses some constraints: two detectors are mounted on carts and can therefore be placed independently around the subject, but not in front of it, while the other two detectors hang from carts on the same ceiling rail, so their distance from the subject is constrained by the maximum offset between them. (orig.)

  3. Optimization of the n-type HPGe detector parameters to theoretical determination of efficiency curves

    International Nuclear Information System (INIS)

    Rodriguez-Rodriguez, A.; Correa-Alfonso, C.M.; Lopez-Pino, N.; Padilla-Cabal, F.; D'Alessandro, K.; Corrales, Y.; Garcia-Alvarez, J. A.; Perez-Mellor, A.; Baly-Gil, L.; Machado, A.

    2011-01-01

    A highly detailed characterization of a 130 cm³ n-type HPGe detector, employed in low-background gamma spectrometry measurements, was performed. Precise measured data and several Monte Carlo (MC) calculations were combined to optimize the detector parameters. The location of the HPGe crystal inside the aluminum end-cap, as well as its dimensions, including the borehole radius and height, were determined from frontal and lateral scans. Additionally, X-ray radiography and computed axial tomography (CT) studies were carried out to complement the information on the detector features. Using seven calibrated point sources (241Am, 133Ba, 57,60Co, 137Cs, 22Na and 152Eu), photo-peak efficiency curves at three different source-detector distances (SDD) were obtained. Taking the experimental values into account, an optimization procedure by means of MC simulations (MCNPX 2.6 code) was performed. MC efficiency curves were calculated by specifying the optimized detector parameters in the MCNPX input files. The calculated efficiencies agree with the empirical data, showing relative deviations of less than 10%. (Author)
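The optimization loop described above can be sketched as a simple parameter search: vary one model parameter in the simulation and keep the value that minimizes the least-squares deviation from the measured efficiencies. Everything below is a toy stand-in (the actual MCNPX workflow and parameter values are not given in the record):

```python
def best_parameter(candidates, simulate, measured):
    """Return the candidate parameter whose simulated efficiencies best match
    the measured ones, in a least-squares sense on relative deviations."""
    def cost(p):
        sim = simulate(p)  # simulate(p) -> efficiencies at the calibration energies
        return sum(((s - m) / m) ** 2 for s, m in zip(sim, measured))
    return min(candidates, key=cost)

# Toy stand-in for an MC run: efficiency shrinks as a dead-layer parameter grows.
simulate = lambda dead_mm: [0.050 * (1 - dead_mm), 0.020 * (1 - dead_mm)]
measured = [0.045, 0.018]  # pretend measurements consistent with 0.1 mm
print(best_parameter([0.0, 0.05, 0.10, 0.15], simulate, measured))  # 0.1
```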

  4. New approach to calculate the true-coincidence effect of HpGe detector

    Energy Technology Data Exchange (ETDEWEB)

    Alnour, I. A., E-mail: aaibrahim3@live.utm.my, E-mail: ibrahim.elnour@yahoo.com [Department of Physics, Faculty of Pure and Applied Science, International University of Africa, 12223 Khartoum (Sudan); Wagiran, H. [Department of Physics, Faculty of Science, Universiti Teknologi Malaysia, 81310 UTM Skudai,Johor (Malaysia); Ibrahim, N. [Faculty of Defence Science and Technology, National Defence University of Malaysia, Kem Sungai Besi, 57000 Kuala Lumpur (Malaysia); Hamzah, S.; Elias, M. S. [Malaysia Nuclear Agency (MNA), Bangi, 43000 Kajang, Selangor D.E. (Malaysia); Siong, W. B. [Chemistry Department, Faculty of Resource Science & Technology, Universiti Malaysia Sarawak, 94300 Kota Samarahan, Sarawak (Malaysia)

    2016-01-22

    Corrections for true-coincidence effects in HPGe detectors are important, especially at small source-to-detector distances. This work established an approach to calculate the true-coincidence effects experimentally for the HPGe detectors of type Canberra GC3018 and Ortec GEM25-76-XLB-C in operation at the neutron activation analysis laboratory of the Malaysia Nuclear Agency (MNA). The correction for true-coincidence effects was performed close to the detector, at distances of 2 and 5 cm, using {sup 57}Co, {sup 60}Co, {sup 133}Ba and {sup 137}Cs standard point sources. The correction factors ranged from 0.93 to 1.10 at 2 cm and from 0.97 to 1.00 at 5 cm for the Canberra HPGe detector, and from 0.92 to 1.13 and from 0.95 to 1.00 at 2 and 5 cm, respectively, for the Ortec HPGe detector. The change in the efficiency calibration curve of each detector at 2 and 5 cm after correction was found to be less than 1%. Moreover, polynomial parameter functions were fitted with a MATLAB program in order to obtain an accurate fit to the experimental data points.
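A fit of correction factor versus energy, in the spirit of the MATLAB fit mentioned above, can be sketched with a closed-form least-squares line in log-energy (the linear-in-ln(E) form and the data are assumptions for illustration, not the authors' exact functional form):

```python
import math

def fit_tcs_linear(energies_kev, factors):
    """Least-squares straight line f(E) = a + b*ln(E) through measured
    TCS correction factors."""
    xs = [math.log(e) for e in energies_kev]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(factors) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, factors))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return lambda e_kev: a + b * math.log(e_kev)

# Made-up factors in the reported 0.93-1.10 range:
corr = fit_tcs_linear([122.0, 356.0, 662.0, 1173.0, 1332.0],
                      [0.93, 0.97, 1.00, 1.05, 1.07])
```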

  5. Experimental and simulated efficiency of a HPGe detector in the energy range of 0.06∼11 MeV

    International Nuclear Information System (INIS)

    Park, Chang Su; Choi, H. D.; Sun, Gwang Min

    2003-01-01

    The full-energy peak efficiency of a Hyper Pure Germanium (HPGe) detector was calibrated over a wide energy range, from 0.06 to 11 MeV. Both an experimental technique and the Monte Carlo method were used for the efficiency calibration. The measurement was performed using standard radioisotopes in the low-energy region of 60∼1408 keV and was extended up to 11 MeV by using the 14N(n,γ) and 35Cl(n,γ) reactions. The GEANT Monte Carlo code was used for the efficiency calculation. The calculated efficiency showed the same dependence on γ-ray energy as the measurement, and the discrepancy between calculation and measurement was minimized by fine-tuning the detector geometry. From the calculated results, the efficiency curve of the HPGe detector was reliably determined, particularly in the high-energy region above several MeV, where the number of measured efficiency points is relatively small despite the wide energy range. The calculated efficiency agreed with the measurement within about 7%. In addition to the efficiency calculation, the origin of the local minimum near 600 keV on the efficiency curve was analyzed as a general characteristic of HPGe detectors

  6. Validation of an efficiency calibration procedure for a coaxial n-type and a well-type HPGe detector used for the measurement of environmental radioactivity

    Energy Technology Data Exchange (ETDEWEB)

    Morera-Gómez, Yasser, E-mail: ymore24@gamail.com [Centro de Estudios Ambientales de Cienfuegos, AP 5. Ciudad Nuclear, CP 59350 Cienfuegos (Cuba); Departamento de Química y Edafología, Universidad de Navarra, Irunlarrea No 1, Pamplona 31009, Navarra (Spain); Cartas-Aguila, Héctor A.; Alonso-Hernández, Carlos M.; Nuñez-Duartes, Carlos [Centro de Estudios Ambientales de Cienfuegos, AP 5. Ciudad Nuclear, CP 59350 Cienfuegos (Cuba)

    2016-05-11

    To obtain reliable measurements of environmental radionuclide activity using HPGe (High Purity Germanium) detectors, knowledge of the absolute peak efficiency is required. This work presents a practical procedure for the efficiency calibration of a coaxial n-type and a well-type HPGe detector using experimental and Monte Carlo simulation methods. The method covers the energy range from 40 to 1460 keV and can be used for both solid and liquid environmental samples. The calibration was initially verified by measuring several reference materials provided by the IAEA (International Atomic Energy Agency). Finally, the validity of the developed procedure was confirmed through participation in two Proficiency Tests organized by the IAEA for the members of the ALMERA network (Analytical Laboratories for the Measurement of Environmental Radioactivity). The validation also showed that measurement of {sup 226}Ra should be conducted using the coaxial n-type HPGe detector in order to minimize the true-coincidence summing effect. - Highlights: • An efficiency calibration for a coaxial and a well-type HPGe detector was performed. • The calibration was made using experimental and Monte Carlo simulation methods. • The procedure was verified by measuring several reference materials provided by the IAEA. • Calibrations were validated through participation in 2 ALMERA Proficiency Tests.
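Proficiency-test results of the kind used for this validation are commonly judged with a z-score against the reference value. A minimal sketch (the ±2 acceptance band is a typical convention, not necessarily the exact IAEA criterion):

```python
def z_score(measured, reference, sigma):
    # Standardized deviation of a laboratory result from the reference activity.
    return (measured - reference) / sigma

def acceptable(measured, reference, sigma, limit=2.0):
    # A result "passes" when |z| is within the chosen band.
    return abs(z_score(measured, reference, sigma)) <= limit

print(z_score(105.0, 100.0, 2.5))     # 2.0
print(acceptable(105.0, 100.0, 2.5))  # True (on the boundary)
```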

  7. Dual photon absorptiometer utilizing a HpGe detector and microprocessor controller

    International Nuclear Information System (INIS)

    Ellis, K.J.; Vartsky, D.; Pearlstein, T.B.; Alberi, J.L.; Cohn, S.H.

    1978-01-01

    The analysis of bone mineral content (BMC) using a single-energy photon beam assumes that only two materials are present: bone mineral and a uniform soft-tissue component. The uncertainty in the BMC value increases with varying adipose tissue content in the transmitted beam. These errors, however, are reduced by the dual-energy technique, and extension to additional energies further identifies the separate constituents of the soft-tissue component. A multi-energy bone-scanning apparatus with data acquisition and analysis capability sufficient to perform multi-energy analysis of bone mineral content was designed and developed. The present work reports on the development of the device operated in the dual-energy mode. The high purity germanium (HpGe) detector is an integral component of the scanner. Errors in BMC due to multiple small-angle scatters are reduced by the excellent energy resolution of the detector (530 eV at 60 keV), and the need to filter the source or add collimation at the detector is eliminated. A new dual-source holder was designed using 200 mCi of 125I and 100 mCi of 241Am. The active areas of the two source capsules are aligned on a common axis; the congruence of the dual source was verified by measuring the collimator response function. This holder design ensures that the same tissue mass simultaneously attenuates both sources. The controller portion of the microprocessor allows variation of the total scan length, step size, and counting time per step. These options allow multiple measurements without changes to the detector, source, or collimator. The system has been successfully used to determine the BMC of different bones.
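For two materials, the dual-energy principle reduces to solving two attenuation equations, ln(I0/I) = μ_b·M_b + μ_s·M_s, one per photon energy, for the bone and soft-tissue areal densities. An illustrative solve by Cramer's rule (all coefficients hypothetical, not calibration data from this device):

```python
def dual_energy_areal_densities(ln_t1, ln_t2, mu_b1, mu_s1, mu_b2, mu_s2):
    """Solve the 2x2 system
       ln(I0/I)_E1 = mu_b1*M_bone + mu_s1*M_soft
       ln(I0/I)_E2 = mu_b2*M_bone + mu_s2*M_soft
    for the areal densities M_bone, M_soft (g/cm^2)."""
    det = mu_b1 * mu_s2 - mu_b2 * mu_s1
    m_bone = (ln_t1 * mu_s2 - ln_t2 * mu_s1) / det
    m_soft = (mu_b1 * ln_t2 - mu_b2 * ln_t1) / det
    return m_bone, m_soft

# Hypothetical mass-attenuation coefficients at the two source energies:
print(dual_energy_areal_densities(5.6, 2.4, 3.0, 0.2, 0.5, 0.18))
```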

  8. Performance revaluation of a N-type coaxial HPGe detector with front edges crystal using MCNPX.

    Science.gov (United States)

    Azli, Tarek; Chaoui, Zine-El-Abidine

    2015-03-01

    The MCNPX code was used to determine the efficiency of a N-type HPGe detector after two decades of operation. Accounting for the roundedness of the crystal's front edges and an inhomogeneous description of the detector's dead layers were shown to achieve better agreement between measured and simulated efficiencies. The calculations were experimentally verified using point sources in the energy range from 50 keV to 1400 keV, and an overall uncertainty of less than 2% was achieved. In order to use the detector for different matrices and geometries in radioactivity measurements, the suggested model was validated by changing the counting geometry and by using multi-gamma disc sources. The introduced simulation approach permitted re-evaluation of the performance of the HPGe detector in comparison with its initial condition, and is a useful tool for precise determination of the thickness of the inhomogeneous dead layer. Copyright © 2014 Elsevier Ltd. All rights reserved.

  9. Studies on a pulse shaping system for fast coincidence with very large volume HPGe detectors

    International Nuclear Information System (INIS)

    Bose, S.; Chatterjee, M.B.; Sinha, B.K.; Bhattacharya, R.

    1987-01-01

    A variant of leading-edge timing (LET) has been proposed which compensates the "walk" due to risetime spread in very large volume (≈100 cm³) HPGe detectors. The method, shape-compensated leading-edge timing (SCLET), can be used over a wide dynamic range of energies with 100% efficiency and has been compared with the LET and ARC methods. A time resolution of 10 ns FWHM and 21 ns FWTM has been obtained with 22Na gamma rays and two HPGe detectors of 96 and 114 cm³ volume. The circuit is easy to duplicate and use, and can be a low-cost alternative to commercial circuits in experiments requiring a large number of detectors. (orig.)

  10. Using lattice tools and unfolding methods for hpge detector efficiency simulation with the Monte Carlo code MCNP5

    International Nuclear Information System (INIS)

    Querol, A.; Gallardo, S.; Ródenas, J.; Verdú, G.

    2015-01-01

    In environmental radioactivity measurements, High Purity Germanium (HPGe) detectors are commonly used due to their excellent resolution. Efficiency calibration of the detectors is essential to determine the activity of radionuclides. The Monte Carlo method has proved to be a powerful tool to complement efficiency calculations. In aged detectors, the efficiency is partially degraded because the dead layer grows and, consequently, the active volume shrinks. The characterization of radiation transport in the dead layer is essential for a realistic HPGe simulation. In this work, the MCNP5 code is used to calculate the detector efficiency. The F4MESH tally is used to determine the photon and electron fluence in the dead layer and the active volume, the energy deposited in the Ge is analyzed using the *F8 tally, and the F8 tally is used to obtain spectra and to calculate the detector efficiency. Once the photon fluence and the energy deposition in the crystal are known, unfolding methods can be used to estimate the activity of a given source. In this way, the efficiency is obtained and serves to verify the value obtained by other methods. - Highlights: • The MCNP5 code is used to estimate the dead layer thickness of an HPGe detector. • The F4MESH tally is applied to verify where interactions occur in the Ge crystal. • The PHD and the energy deposited are obtained with the F8 and *F8 tallies, respectively. • An average dead layer between 70 and 80 µm is obtained for the HPGe studied. • The efficiency is calculated by applying the TSVD method to the response matrix.
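The truncated-SVD unfolding mentioned in the highlights can be sketched in a few lines: invert the response relation counts = R·source while discarding the smallest singular values, which otherwise amplify noise. The matrix below is a toy example; the real response matrix comes from the MCNP5 tallies:

```python
import numpy as np

def tsvd_unfold(response, counts, k):
    """Estimate the source vector from counts = response @ source, keeping
    only the k largest singular values (truncated SVD pseudo-inverse)."""
    u, s, vt = np.linalg.svd(response, full_matrices=False)
    pseudo_inverse = vt[:k].T @ np.diag(1.0 / s[:k]) @ u[:, :k].T
    return pseudo_inverse @ counts
```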

  11. Determination of efficiency curves for HPGE detector in different counting geometries

    International Nuclear Information System (INIS)

    Rodrigues, Josianne L.; Kastner, Geraldo F.; Ferreira, Andrea V.

    2011-01-01

    This paper presents the first experimental results on the determination of efficiency curves for an HPGe detector in different counting geometries. The detector is a Canberra GX2520 belonging to CDTN/CNEN. Efficiency curves for point sources were determined using a certified set of gamma sources, for three counting geometries. Efficiency curves for extended samples were then determined using standard radionuclide solutions in 500 ml and 1000 ml Marinelli beakers

  12. Mod 1 ICS TI Report: ICS Conversion of a 140% HPGe Detector

    Energy Technology Data Exchange (ETDEWEB)

    Bounds, John Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-07-05

    This report evaluates the Mod 1 ICS, an electrically cooled 140% HPGe detector. It is a custom version of the ORTEC Integrated Cooling System (ICS) modified to make it more practical for us to use in the field. Performance and operating characteristics of the Mod 1 ICS are documented, noting both pros and cons. The Mod 1 ICS is deemed a success. Recommendations for a Mod 2 ICS, a true field prototype, are provided.

  13. Performance revaluation of a N-type coaxial HPGe detector with front edges crystal using MCNPX

    International Nuclear Information System (INIS)

    Azli, Tarek; Chaoui, Zine-El-Abidine

    2015-01-01

    The MCNPX code was used to determine the efficiency of a N-type HPGe detector after two decades of operation. Accounting for the roundedness of the crystal's front edges and an inhomogeneous description of the detector's dead layers were shown to achieve better agreement between measured and simulated efficiencies. The calculations were experimentally verified using point sources in the energy range from 50 keV to 1400 keV, and an overall uncertainty of less than 2% was achieved. In order to use the detector for different matrices and geometries in radioactivity measurements, the suggested model was validated by changing the counting geometry and by using multi-gamma disc sources. The introduced simulation approach permitted re-evaluation of the performance of an HPGe detector in comparison with its initial condition, and is a useful tool for precise determination of the thickness of the inhomogeneous dead layer. - Highlights: • Monte Carlo (MCNPX) simulation of the performance of an HPGe detector after more than two decades in use. • Investigation of the influence of the crystal's rounded front edges. • Good agreement between Monte Carlo simulation and experiment achieved through an inhomogeneous description of the detector dead layers

  14. Determination Performance Of Gamma Spectrometry Co-Axial HPGE Detector In Radiochemistry And Environment Group, Nuclear Malaysia

    International Nuclear Information System (INIS)

    Mei-Woo, Y.

    2014-01-01

    A gamma spectrometry system is used to measure gamma-emitting radionuclides qualitatively and quantitatively. The accuracy of the measurement depends strongly on the performance specifications of the HPGe detectors. This study found that all seven coaxial HPGe detectors in the Radiochemistry and Environment Group, Nuclear Malaysia, are in good working condition, based on verification of the performance specifications, namely resolution, peak shape, peak-to-Compton ratio and relative efficiency, against the values warranted by the manufacturers. (author)
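Of the specifications verified here, resolution (FWHM) is the easiest to illustrate: a coarse channel-count estimate of the full width at half maximum of an isolated, background-free peak. This is a sketch only; real verification software interpolates between channels and subtracts the continuum:

```python
import math

def fwhm_channels(spectrum):
    """Whole channels between the half-maximum crossings around the largest peak."""
    peak = max(spectrum)
    ch = spectrum.index(peak)
    half = peak / 2.0
    lo = ch
    while lo > 0 and spectrum[lo] > half:
        lo -= 1          # walk left until we drop below half maximum
    hi = ch
    while hi < len(spectrum) - 1 and spectrum[hi] > half:
        hi += 1          # walk right until we drop below half maximum
    return hi - lo

# Gaussian test peak with sigma = 3 channels: true FWHM = 2.355 * 3 ≈ 7.1
spec = [math.exp(-((i - 50) ** 2) / (2 * 3.0 ** 2)) for i in range(100)]
print(fwhm_channels(spec))  # 8 (channel quantization overestimates slightly)
```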

  15. Position sensitivity of the first SmartPET HPGe detector

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, R.J. [Department of Physics, University of Liverpool, Liverpool (United Kingdom)]. E-mail: rjc@ns.ph.liv.ac.uk; Turk, G. [Department of Physics, University of Liverpool, Liverpool (United Kingdom); Boston, A.J. [Department of Physics, University of Liverpool, Liverpool (United Kingdom); Boston, H.C. [Department of Physics, University of Liverpool, Liverpool (United Kingdom); Cresswell, J.R. [Department of Physics, University of Liverpool, Liverpool (United Kingdom); Mather, A.R. [Department of Physics, University of Liverpool, Liverpool (United Kingdom); Nolan, P.J. [Department of Physics, University of Liverpool, Liverpool (United Kingdom); Hall, C.J. [CCLRC Daresbury, Warrington, Cheshire (United Kingdom); Lazarus, I. [CCLRC Daresbury, Warrington, Cheshire (United Kingdom); Simpson, J. [CCLRC Daresbury, Warrington, Cheshire (United Kingdom); Berry, A. [School of Physics and materials Engineering, Monash University, Melbourne (Australia); Beveridge, T. [School of Physics and materials Engineering, Monash University, Melbourne (Australia); Gillam, J. [School of Physics and materials Engineering, Monash University, Melbourne (Australia); Lewis, R.A. [School of Physics and materials Engineering, Monash University, Melbourne (Australia)

    2007-04-01

    In this paper we discuss the Smart Positron Emission Tomography (PET) imaging system being developed by the University of Liverpool in conjunction with CCLRC Daresbury Laboratory. We describe the motivation for developing a semiconductor-based PET system and the advantages it will offer over current tomographs. Details of the detectors and the associated electronics are discussed, and the results of high-precision scans are presented. Analysis of the scan data has enabled full characterization of the detector response function and calibration of the three-dimensional position sensitivity. This work presents the analysis of the depth sensitivity of the detector.

  16. Determination of the dead layer and full-energy peak efficiency of an HPGe detector using the MCNP code and experimental results

    Directory of Open Access Journals (Sweden)

    M Moeinifar

    2017-02-01

    Full Text Available One important factor in using a High Purity Germanium (HPGe) detector is its efficiency, which depends strongly on geometry and absorption factors, so that whenever the source-detector geometry is changed the detector efficiency must be re-measured. The best way of determining the efficiency of a detector is to measure standard sources, but since standard sources are not always available, determining the efficiency by simulation, which gives sufficient accuracy in less time, is important. In this study, the dead layer thickness and the full-energy peak efficiency of an HPGe detector were obtained by Monte Carlo simulation using the MCNPX code. Gamma-ray spectra were first measured for different sources placed at various distances from the detector and stored; the same spectra were then simulated under similar conditions. At first, the whole volume of germanium was regarded as active, and the calculated spectra were compared with the corresponding experimental spectra; the comparison showed considerable differences. By making small variations in the dead layer thickness of the detector (about a few hundredths of a millimeter) in the simulation program, these differences were removed, and in this way a dead layer of 0.57 mm was obtained for the detector. Incorporating this value for the dead layer in the simulation program, the full-energy peak efficiency of the detector was then obtained both by experiment and by simulation, for various sources at various distances from the detector, and the two methods showed good agreement. Using the MCNP code with an exact description of the measurement system, one can thus conclude that the efficiency of an HPGe detector for various source-detector geometries can be calculated with rather good accuracy by simulation

  17. Fitted curve parameters for the efficiency of a coaxial HPGe Detector

    International Nuclear Information System (INIS)

    Supian Samat

    1996-01-01

    Using Ngraph software, the parameters of various functions were determined by least-squares fits to experimental efficiencies ε_f of a coaxial HPGe detector for gamma rays in the energy range 59 keV to 1836 keV. Once the parameters had been determined, their reliability was tested with the calculated goodness-of-fit parameter χ²_cal. It is shown that the function ln ε_f = Σ_{j=0}^{n} a_j (ln(E/E_0))^j, with n = 3, gives satisfactory results
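The cubic-in-log fit used here is easy to reproduce as a least-squares polynomial in x = ln(E/E_0) (a sketch with NumPy rather than Ngraph; coefficient values below are made up):

```python
import numpy as np

def fit_log_efficiency(energies, efficiencies, e0=1.0, n=3):
    """Least-squares fit of ln(eff) = sum_{j=0..n} a_j * (ln(E/e0))**j.
    Returns the coefficients ordered a_0 ... a_n."""
    x = np.log(np.asarray(energies) / e0)
    # np.polyfit returns highest order first; reverse to a_0 ... a_n.
    return np.polyfit(x, np.log(efficiencies), n)[::-1]
```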

  18. Operation of bare HPGe detectors in LAr/LN{sub 2} for the GERDA experiment

    Energy Technology Data Exchange (ETDEWEB)

    Heider, M Barnabe; Chkvorets, O; Schoenert, S [MPI fuer Kernphysik, Heidelberg (Germany); Cattadori, C [INFN-Milano Bicocca, Milano (Italy); Vacri, A di [INFN-LNGS, L' Aquila (Italy); Gusev, K; Shirchenko, M [Russian Research Center Kurchatov Institute, Moscow, Russia and JINR, Dubna (Russian Federation)], E-mail: assunta.divacri@lngs.infn.it

    2008-11-01

    GERDA is designed to search for 0{nu}{beta}{beta}-decay of {sup 76}Ge using high purity germanium (HPGe) detectors, enriched ({approx} 85%) in {sup 76}Ge, directly immersed in LAr, which acts both as a shield against {gamma} radiation and as the cooling medium. The cryostat is located in a stainless steel water tank providing an additional shield against external background. The GERDA experiment aims at a background (b) {approx}< 10{sup -3} cts/(kg{center_dot}keV{center_dot}y) and an energy resolution (FWHM) {<=} 4 keV at Q{sub {beta}{beta}} = 2039 keV, and is foreseen to proceed in two phases. For Phase I, eight reprocessed enriched HPGe detectors from the past HdM [C Balysh et al., Phys. Rev. D 66 (1997) 54] and IGEX [C E Aalseth et al., Phys. of Atomic Nuclei 63 (2000) 1225] experiments ({approx} 18 kg) and six reprocessed natural HPGe detectors ({approx} 15 kg) from the Genius Test-Facility [H V Klapdor et al., NIM A 481 (2002) 149] will be deployed in strings. In Phase I, GERDA aims at b {approx}< 10{sup -2} cts/(kg{center_dot}keV{center_dot}y). With an exposure of {approx} 15 kg{center_dot}y of {sup 76}Ge and a resolution of {approx} 3.6 keV, the sensitivity on the half-life will be T{sup 0{nu}}{sub 1/2} > 3 {center_dot} 10{sup 25} y (90% C.L.), corresponding to m{sub ee} < 270 meV [V A Rodin et al., Nucl. Phys. A 766 (2006) 107]. In Phase II, new diodes, able to discriminate between single- and multi-site events, will be added ({approx} 20 kg of {sup 76}Ge with intrinsic b {approx} 10{sup -2} cts/(kg{center_dot}keV{center_dot}y)). With an exposure of {approx} 120 kg{center_dot}y, it is expected that T{sup 0{nu}}{sub 1/2} > 1.5 {center_dot} 10{sup 26} y (90% C.L.), corresponding to m{sub ee} < 110 meV [V A Rodin et al., Nucl. Phys. A 766 (2006) 107]. 
Three natural p-type HPGe prototypes (different passivation layer designs) are available in the GERDA underground facility at LNGS to investigate the effect of the detector assembly (low-mass low-activity holder), of the handling procedure and of the
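The half-life sensitivities quoted above follow the standard counting relation T_1/2 > ln2 · N_atoms · ε · t / N_up. An order-of-magnitude sketch with illustrative efficiency and count-limit numbers (this is not GERDA's actual analysis, which includes backgrounds and resolution effects):

```python
import math

AVOGADRO = 6.022e23

def halflife_limit_yr(exposure_kg_yr, enrichment, efficiency, n_upper,
                      molar_mass_g_mol=76.0):
    """Lower limit on the 0nbb half-life from an exposure (kg*yr of detector
    mass), isotopic enrichment, signal efficiency and an upper limit n_upper
    on the number of signal counts."""
    atoms_times_years = (exposure_kg_yr * 1000.0 / molar_mass_g_mol
                         * AVOGADRO * enrichment)
    return math.log(2) * atoms_times_years * efficiency / n_upper

# Phase-I-like exposure of 15 kg*yr at 86% enrichment, assumed 80% efficiency
# and fewer than ~3 signal counts -> a limit of order 10^25 yr.
print(halflife_limit_yr(15.0, 0.86, 0.80, 3.0))
```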

  19. Development of Educational Simulation on Spectrum of HPGe Detector and Implementation of Education Program

    International Nuclear Information System (INIS)

    Seo, K. W.; Joo, Y. C.; Ji, Y. J.; Lee, M. O.; Lee, S. Y.; Jun, Y. K.

    2005-12-01

    In this development, the characteristics of Aptec, Genie2000 (Canberra, USA) and GammaVision (Ortec, USA), the analysis programs most commonly used in Korean radioactivity measurement laboratories, are covered: peak search, peak fitting, peak centroid and area calculation, spectrum correction and methods for activity calculation. Corrections for source geometry, self-absorption in the sample, and coincidence summing are also developed, and the results are reflected in the spectrum-analysis teaching material. The simulated HPGe detector spectra developed here comprise spectra for correction, spectra for correcting source-detector geometry, sample spectra that require self-absorption correction of the source, peak-search spectra for optimizing the peak-search offset setting, and background spectra. These spectra are made similar to real spectra by processing peaks and background measured from a mixed standard volume source. The teaching material focuses on practice rather than theory, and the simulated spectra are used in the spectrum-analysis exercises. Optimal choice of spectrum-analysis conditions, spectrum correction, geometry correction and background spectrum analysis are included, as is the 'spectrum analysis program test' procedure recommended by ANSI N42. The Aptec, Genie2000 and GammaVision software manuals are included in an appendix. To check the text of the developed simulated HPGe detector spectra, they were used in 2004 and 2005 in the regular training course for supervisors of radioisotope handling, and the text and practical procedures were reviewed and revised through the course

  20. Deep-water gamma-spectrometer based on HP(Ge) detector

    International Nuclear Information System (INIS)

    Sokolov, A.; Danengirsh, S.; Popov, S.; Pchelincev, A; Gostilo, V.; Kravchenko, S.; Shapovalov, V.; Druzhinin, A.

    1995-01-01

    Full text: For radionuclide monitoring of the sea bottom near underwater storage sites of high-activity waste from the nuclear industry and near sites of accidents involving nuclear submarines, gamma-ray spectrometers that allow measurements at great depth are needed. Usually these problems are solved with devices lowered into the water on a rope, which transmit their signals to the surface by cable; however, the depth of immersion is limited by this arrangement and the measurement conditions are often complicated. A deep-water gamma spectrometer based on an HP(Ge) detector for measurements at depths up to 3000 m has been developed. The spectrometer is completely autonomous and is placed at the selected location using the manipulator of a deep-water vehicle. It is built in two cylindrical cases, 170 mm in diameter and 1100 mm long, which withstand the high hydrostatic pressure. The part of the case around the detector is made of titanium and has a special thin-walled construction to increase the registration efficiency for low-energy gamma radiation. Cooling of the semiconductor detector is provided by a coolant that maintains the working temperature of the detector for more than 24 hours. The electronic system of the spectrometer includes the high-voltage supply for the detector, a preamplifier, an analog processor, an analog-to-digital converter and a device for collecting and storing information in flash memory. Power is provided by a battery of accumulators, which can be recharged at the surface. The processor is programmed before immersion by connecting the spectrometer to a personal computer via a standard RS-232 interface. Over 24 hours the spectrometer records 16 spectra of 4096 channels each. 
The reading of the information by the computer is carried out after lifting the spectrometer up to the surface in the same

  1. Development of CANDLES low background HPGe detector and half-life measurement of 180Tam

    Science.gov (United States)

    Chan, W. M.; Kishimoto, T.; Umehara, S.; Matsuoka, K.; Suzuki, K.; Yoshida, S.; Nakajima, K.; Iida, T.; Fushimi, K.; Nomachi, M.; Ogawa, I.; Tamagawa, Y.; Hazama, R.; Takemoto, Y.; Nakatani, N.; Takihira, Y.; Tozawa, M.; Kakubata, H.; Trang, V. T. T.; Ohata, T.; Tetsuno, K.; Maeda, T.; Khai, B. T.; Li, X. L.; Batpurev, T.

    2018-01-01

    A low-background HPGe detector system was developed at the CANDLES Experimental Hall for multipurpose use. Various low-background techniques were employed, including a hermetic shield design, radon gas suppression, and background-reduction analysis. A new pulse shape discrimination (PSD) method was created specifically for the coaxial Ge detector; using it, microphonic noise and background events in the low-energy region below 200 keV can be rejected effectively. Monte Carlo simulation with GEANT4 was performed to obtain the detection efficiency and to study the interaction of gamma rays with the detector system. For rare-decay measurement, the detector was used to search for the decay of nature's most stable isomer, tantalum-180m (180Tam). Two phases of the tantalum physics run were completed with a total livetime of 358.2 days, Phase II with an upgraded shield configuration. The world's most stringent half-life limit for 180Tam was achieved.

  2. True coincidence summing corrections for an extended energy range HPGe detector

    Energy Technology Data Exchange (ETDEWEB)

    Venegas-Argumedo, Y. [Centro de Investigación en Materiales Avanzados (CIMAV), Miguel de Cervantes 120, Chihuahua, Chih 31109 (Mexico); M.S. Student at CIMAV (Mexico); Montero-Cabrera, M. E., E-mail: elena.montero@cimav.edu.mx [Centro de Investigación en Materiales Avanzados (CIMAV), Miguel de Cervantes 120, Chihuahua, Chih 31109 (Mexico)

    2015-07-23

    The true coincidence summing (TCS) effect for the natural radioactive families of U-238 and Th-232 is a problem when an environmental sample is measured in a close source-detector geometry. By using a certified multi-nuclide standard source to calibrate an extended energy range (XtRa) HPGe detector, it is possible to obtain an intensity spectrum only slightly affected by the TCS effect at energies from 46 to 1836 keV. In this work, the equations and other considerations required to calculate the TCS correction factor for isotopes of the natural radioactive chains are described. A validation of the calibration, performed with the IAEA-CU-2006-03 samples (soil and water), is planned

  3. Measurement of radiation shielding properties of polymer composites by using HPGe detector

    International Nuclear Information System (INIS)

    Gupta, Anil; Pillay, H.C.M.; Kale, P.K.; Datta, D.; Suman, S.K.; Gover, V.

    2014-01-01

    Lead is the most common radiation shield, and its composites with polymers can be used as flexible radiation shields for different applications. However, lead is highly hazardous and has been associated with neurological disorders, kidney failure and hematotoxicity. A lead-free radiation shield material has been developed by synthesizing radiation-crosslinked PDMS/Bi2O3 polymer composites. To qualify the lead-free shield, the relevant shielding properties, namely the linear attenuation coefficient, half value thickness (HVT) and tenth value thickness (TVT), have been measured using an HPGe detector. The present study describes the methodology of measurement of the shielding properties of the lead-free shield material. The measurements considered gamma energies of 59.537 keV (241Am), and 122.061 keV and 136.474 keV (57Co).
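The quantities named above follow directly from a measured transmission: μ = ln(I0/I)/x, HVT = ln 2/μ and TVT = ln 10/μ. A minimal sketch with illustrative numbers (not data from this work):

```python
import math

def linear_attenuation(i0, i, x_cm):
    """Linear attenuation coefficient mu (1/cm) from the peak intensities
    measured without (i0) and with (i) an absorber of thickness x_cm."""
    return math.log(i0 / i) / x_cm

def hvt(mu):
    """Half value thickness: absorber thickness halving the intensity."""
    return math.log(2.0) / mu

def tvt(mu):
    """Tenth value thickness: thickness reducing the intensity tenfold."""
    return math.log(10.0) / mu
```

For example, a 1 cm slab that halves the count rate in a photopeak gives μ = ln 2 per cm, hence HVT = 1 cm and TVT ≈ 3.32 cm.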

  4. Calibration efficiency of HPGe detector in the 50-1800 keV energy range

    International Nuclear Information System (INIS)

    Venturini, Luzia

    1996-01-01

    This paper describes the efficiency of an HPGe detector in the 50-1800 keV energy range for two geometries used in water measurements: a Marinelli beaker (850 ml) and a polyethylene flask (100 ml). The experimental data were corrected for the summing effect and fitted to a continuous, differentiable, energy-dependent function given by ln(ε) = b0 + b1·ln(E/E0) + β·ln(E/E0)², where β = b2 if E > E0 and β = a2 if E ≤ E0; ε is the full absorption peak efficiency, E is the gamma-ray energy, and {b0, b1, b2, a2, E0} is the parameter set to be fitted. (author)
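The fitted function is easy to evaluate directly; note that continuity at E0 holds automatically because ln(E/E0) vanishes there, regardless of which β branch is taken. A sketch with illustrative parameter values (not the paper's fitted values):

```python
import math

def log_efficiency(E, b0, b1, b2, a2, E0):
    """ln(eps) = b0 + b1*ln(E/E0) + beta*ln(E/E0)^2, with beta = b2 for
    E > E0 and beta = a2 for E <= E0. Continuous at E0 because the
    quadratic term vanishes there."""
    u = math.log(E / E0)
    beta = b2 if E > E0 else a2
    return b0 + b1 * u + beta * u * u

def efficiency(E, *params):
    """Full absorption peak efficiency eps(E)."""
    return math.exp(log_efficiency(E, *params))
```
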

  5. Characterization of segmented large volume, high purity germanium detectors

    Energy Technology Data Exchange (ETDEWEB)

    Bruyneel, B. [Koeln Univ. (Germany). Inst. fuer Kernphysik

    2006-07-01

    {gamma}-ray tracking in future HPGe arrays like AGATA will rely on pulse shape analysis (PSA) of multiple {gamma}-interactions. For this purpose, a simple and fast procedure was developed which enabled the first full characterization of a segmented large-volume HPGe detector. An analytical model for the hole mobility in a Ge crystal lattice was developed to describe the hole drift anisotropy, with experimental velocity values along the crystal axes as parameters. The new model is based on the drifted Maxwellian hole distribution in Ge. It is verified by successfully reproducing experimental longitudinal hole anisotropy data. A comparison between electron and hole mobility shows large differences in the longitudinal and tangential velocity anisotropy as a function of the electric field orientation. Measurements on a 12-fold segmented, n-type, large-volume, irregularly shaped HPGe detector were performed in order to determine the parameters of anisotropic mobility for electrons and holes as charge carriers created by {gamma}-ray interactions. To characterize the electron mobility, the complete outer detector surface was scanned in small steps employing photopeak interactions at 60 keV. A precise measurement of the hole drift anisotropy was performed with 356 keV {gamma} rays. The drift velocity anisotropy and crystal geometry cause considerable rise-time differences in pulse shapes, depending on the position of the spatial charge carrier creation. Pulse shapes of direct and transient signals are reproduced by weighting potential calculations with high precision. The measured angular dependence of rise times is caused by the anisotropic mobility, crystal geometry, changing field strength and space charge effects. Preamplified signals were processed employing digital spectroscopy electronics. Response functions, crosstalk contributions and averaging procedures were taken into account, requiring novel methods due to the segmentation of the Ge crystal and the digital electronics.

  7. Measurements of radionuclide in Par Pond sediments with an underwater HPGe detector

    International Nuclear Information System (INIS)

    Winn, W.G.

    1993-01-01

    Savannah River Site (SRS) effluent gamma-emitting radionuclides in Par Pond sediment were examined in situ with an underwater HPGe detector prior to and following a 19 ft drawdown of the pond in 1991 to address dam repairs. These measurements provide a map of the 137Cs concentrations of the pond sediment, indicating that 9.4 ± 1.5 Ci is exposed by the drawdown and that 46.6 ± 7.2 Ci is the entire pond inventory. The highest individual 137Cs concentration was 25 μCi/m² for the exposed sediment and 50 μCi/m² for the entire pond. The results are consistent with parallel studies conducted by SREL, as well as historical data. Aside from 137Cs, the only other SRS-produced isotope observed was 60Co, with an activity of only about 1% of that of 137Cs. This observation was also confirmed in grab samples of pond sediment and vegetation, which were returned to the laboratory for ultra-low-level gamma spectrometry analysis. A special effort was required to calibrate the underwater HPGe detector, for which both measurements and calculational models were used. The effects of sediment depth profiles for density and 137Cs concentration were addressed in the calibration. Calibration factors for sediment surface concentrations (μCi/m²/cpm) and sediment mass concentrations (pCi/kg/cpm) were obtained. In general, the μCi/m²/cpm factor is recommended, as the pCi/kg/cpm factor depends on the depth location of the sediment of interest. However, a pCi/kg/cpm factor, which depends on the depth within the sediment, is presented to address dose calculations that require it.

  8. Use of planar HPGe detector as a part of X-ray fluorescent spectrometer for educational purposes

    International Nuclear Information System (INIS)

    Verenchikova, M.S.; Kalinin, V.N.; Mikhajlov, V.A.

    2011-01-01

    This work demonstrates the possibility of using a nondedicated gamma- and X-ray detection head, based on a planar HPGe detector with a large sensitive area of 2000 mm², as part of an X-ray fluorescence spectrometer in a students' practicum.

  9. An investigation of the performance of a coaxial HPGe detector operating in a magnetic resonance imaging field

    Energy Technology Data Exchange (ETDEWEB)

    Harkness, L.J., E-mail: ljh@ns.ph.liv.ac.u [Department of Physics, University of Liverpool, Liverpool L69 7ZE (United Kingdom); Boston, A.J.; Boston, H.C.; Cole, P.; Cresswell, J.R.; Filmer, F.; Jones, M.; Judson, D.S.; Nolan, P.J.; Oxley, D.C.; Sampson, J.A.; Scraggs, D.P.; Slee, M.J. [Department of Physics, University of Liverpool, Liverpool L69 7ZE (United Kingdom); Bimson, W.E.; Kemp, G.J. [MARIARC, University of Liverpool, Liverpool L69 3GE (United Kingdom); Groves, J.; Headspith, J.; Lazarus, I.; Simpson, J. [STFC Daresbury Laboratory, Daresbury, Warrington WA4 4AD (United Kingdom); Cooper, R.J. [Joint Institute for Heavy Ion Research, Oak Ridge National Laboratory, Oak Ridge, TN 37831-6371 (United States)

    2011-05-11

    Nuclear medical imaging modalities such as positron emission tomography and single photon emission computed tomography are used to probe physiological functions of the body by detecting gamma rays emitted from biologically targeted radiopharmaceuticals. A system capable of simultaneous data acquisition for nuclear medical imaging and magnetic resonance imaging is highly sought after by the medical imaging community. Such a device could provide a more complete medical insight into the functions of the body within a well-defined structural context. However, acquiring simultaneous nuclear/MRI sequences is technically challenging due to the conventional photomultiplier-tube readout employed by most existing scintillator detector systems. A promising solution is a nuclear imaging device composed of semiconductor detectors that can be operated with a standard MRI scanner. However, the influence of placing a semiconductor detector such as high purity germanium (HPGe) within or close to the bore of an MRI scanner, where high magnetic fields are present, is not well understood. In this paper, the performance of an HPGe detector operating in a high-strength static (B{sub S}) MRI field, along with fast-switching gradient fields and radiofrequency from the MRI system, has been assessed. The influence of the B{sub S} field on the energy resolution of the detector has been investigated for various positions and orientations of the detector within the magnetic field. The results have then been interpreted in terms of the influence of the B{sub S} field on the charge collection properties. MRI images have been acquired with the detector situated at the entrance of the MRI bore to investigate the effects of simultaneous data acquisition on detector performance and MRI imaging.

  10. An automated measuring system based on gamma spectrometry with HPGe detectors

    International Nuclear Information System (INIS)

    Mala, Helena; Rulik, Petr; Hyza, Miroslav; Dragounova, Lenka; Helebrant, Jan; Hroznicek, Marek; Jelinek, Pavel; Zak, Jan

    2016-01-01

    An automatic system for unattended gamma-spectrometric measurements of bulk samples (“Gamma Automat”, GA) was developed by the National Radiation Protection Institute and Nuvia, Inc. as part of a research project. The basic parts include a detection system with two HPGe detectors in lead-shielded chambers, a sample changer, a sample tray and a control unit. The GA enables counting in two geometries: (i) cylindrical containers (200 ml), either one placed at the detector face or 2-6 placed around the detector, or (ii) Marinelli beakers (600 ml). The shelf can accommodate 180 cylindrical containers or 54 Marinelli beakers. Samples are changed by a robotic arm. The sample data and the required analysis are passed to the GA by a matrix code (generated within the laboratory system) located on the lid of a sample container, from which the GA reads the information. Spectrometric analysis is performed automatically after the counting. The current status of the GA can be monitored remotely. Notifications about the activities of the GA, measurement completion or equipment failures are automatically generated and sent to a mobile phone and the operator PC. A presentation of the GA is available at https://youtu.be/1lQhfo0Fljo. (orig.)

  11. Influence of the geometrical characteristics of an HpGe detector on its efficiency

    International Nuclear Information System (INIS)

    Vargas, M.J.; Timon, A.F.; Sanchez, D.P.

    2002-01-01

    Computer codes based on Monte Carlo calculations have been extensively developed for computing the efficiency in gamma-ray spectrometry. Errors in the specific parameters of the detector, due to the lack of precise knowledge of its characteristics, usually represent one of the most important sources of inaccuracy in this simulation technique. The influence of several detector parameters on the efficiency of a typical coaxial n-type HPGe detector is presented. Full-energy peak efficiencies were calculated by means of a Monte Carlo code in the range 122-1836 keV for several source configurations: a point source, a cellulose filter, and two different cylindrical boxes containing a solid SiO2 matrix. The detector parameters varied were the crystal diameter, crystal height, diameter of the internal core, and the position of the crystal with respect to the beryllium window. Significant deviations in the efficiency, depending on the source geometry and the photon energy, can be produced by varying some of the detector parameters only slightly. (author)
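As a toy illustration of this sensitivity (not the code or geometry used in the paper), a few-line Monte Carlo for a point source on the axis of a bare cylindrical crystal shows how a small change in crystal radius shifts the computed efficiency. The attenuation coefficient and the neglect of side escape and partial-energy deposits are crude simplifying assumptions:

```python
import math, random

def fep_efficiency(radius_cm, length_cm, dist_cm, mu_cm=0.3, n=100_000, seed=1):
    """Toy Monte Carlo for a point source on the axis of a bare cylindrical
    crystal at distance dist_cm. Each history is weighted by its probability
    of interacting inside the crystal; side escape and partial-energy
    deposits are ignored, so this is only a qualitative illustration."""
    rng = random.Random(seed)
    score = 0.0
    for _ in range(n):
        cos_t = rng.uniform(-1.0, 1.0)              # isotropic emission
        if cos_t <= 0.0:
            continue                                # emitted away from crystal
        # radial offset where the photon crosses the front-face plane
        r_front = dist_cm * math.sqrt(1.0 - cos_t * cos_t) / cos_t
        if r_front >= radius_cm:
            continue                                # misses the front face
        path = length_cm / cos_t                    # slant path through crystal
        score += 1.0 - math.exp(-mu_cm * path)      # interaction probability
    return score / n
```

Even under these toy assumptions, enlarging the crystal radius from 2.5 cm to 2.75 cm visibly raises the computed efficiency, mirroring the paper's point that small geometry errors propagate directly into the efficiency.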

  12. Comparison of HPGe detector response data for low energy photons using MCNP, EGS, and its codes

    International Nuclear Information System (INIS)

    Kim, Soon Young; Kim, Jong Kyung

    1995-01-01

    In this study, the photopeak efficiency and the K α and K β escape fractions of an HPGe detector (100 mm² × 10 mm) are calculated and tabulated as a function of incident X-ray energy from 12 to 60 keV in 2-keV increments. Compton, elastic, and penetration fractions are not tabulated in this work since they are negligible in this energy range. The results are compared with earlier Monte Carlo results obtained by Chin-Tu Chen et al. The comparison shows that the results from the different codes diverge considerably as the incident photon energy approaches 12 keV, compared with the 50-60 keV range. In X-ray dosimetry and diagnostic radiology, accurate knowledge of X-ray spectra is essential for studies of patient dose and image quality. When X-ray spectra are measured with a detection system, distortions can occur due to incomplete absorption of primary photons, or their escape before interacting, in a detector of finite dimensions.

  13. Scoping measurements of radionuclides in L Lake with an underwater HPGe detector

    International Nuclear Information System (INIS)

    Dunn, D.L.; Win, W.G.; Bresnahan, P.J.

    1996-01-01

    This study of L Lake was conducted to determine whether the distribution of man-made radiation levels had changed from the time preceding the filling of the newly created lake in 1985. Overflight gamma measurements by EG&G in 1985 mapped the man-made radiation levels, indicating that significant levels were detected only over former stream beds that were to be covered by the lake. The present scoping gamma measurements were consistent with these earlier findings, indicating no major evidence of movement of the radioactivity. These results will be available to guide decisions concerning future plans for the lake. Gamma-emitting radionuclides of L Lake were examined in situ with an underwater HPGe detector and further studied by retrieving various sediment samples for analysis by HPGe gamma spectrometry in the Underground Counting Facility. The predominant man-made radionuclide detected was 137Cs; its activity was about 100 times greater than that of 60Co, the only other man-made radionuclide detected above trace levels.

  14. Study of Efficiency Calibrations of HPGe Detectors for Radioactivity Measurements of Environmental Samples

    International Nuclear Information System (INIS)

    Harb, S.; Salahel Din, K.; Abbady, A.

    2009-01-01

    In this paper we describe a method for the efficiency calibration of HPGe gamma-ray spectrometry systems for bulk environmental samples (tea, crops, water, and soil), a significant part of environmental radioactivity measurements. We discuss the full energy peak efficiency (FEPE) of three HPGe detectors; as a consequence, the efficiency must be determined for each set-up employed. To take full advantage of gamma-ray spectrometry, a set of efficiencies at several energies covering a wide energy range is needed: the wider the range, the larger the number of radionuclides whose concentration can be determined. To measure the main natural gamma-ray emitters, the efficiency should be known at least from 46.54 keV (210Pb) to 1836 keV (88Y). Radioactive sources were prepared from two different standards: the first, mixed standard QCY40, containing 210Pb, 241Am, 109Cd, and 57Co, and the second, QCY48, containing 241Am, 109Cd, 57Co, 139Ce, 113Sn, 85Sr, 137Cs, 88Y, and 60Co; these are necessary in order to calculate the activity of the different radionuclides contained in a sample. In this work we study the efficiency calibration as a function of different parameters: energy of the gamma ray from 46.54 keV (210Pb) to 1836 keV (88Y); three different detectors A, B, and C; container geometry (point source, Marinelli beaker, and 1 L cylindrical bottle); height of the standard soil samples in a 250 ml bottle; and density of the standard environmental samples. These environmental samples must be measured before the standard solution is added, because the same samples are used in order to account for self-absorption and composition, especially in the case of volume samples.
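At each calibration energy the FEPE follows from the certified source data as ε = N_net/(t_live · A · I_γ), with N_net the net peak area, A the source activity and I_γ the gamma emission probability. A minimal sketch (all numbers illustrative, not values from this work):

```python
def fepe(net_counts, live_time_s, activity_bq, gamma_intensity):
    """Full-energy peak efficiency from a certified standard source:
    eps = N_net / (t_live * A * I_gamma)."""
    return net_counts / (live_time_s * activity_bq * gamma_intensity)
```

Repeating this for every line of the mixed standards yields the set of (energy, efficiency) points through which an efficiency curve is then fitted for each detector and geometry.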

  15. SU-F-T-368: Improved HPGe Detector Precise Efficiency Calibration with Monte Carlo Simulations and Radioactive Sources

    Energy Technology Data Exchange (ETDEWEB)

    Zhai, Y. John [Vanderbilt University, Vanderbilt-Ingram Cancer Center, Nashville, TN 37232 (United States)

    2016-06-15

    Purpose: To obtain an improved precise gamma efficiency calibration curve of an HPGe (High Purity Germanium) detector with a new comprehensive approach. Methods: Both radioactive sources and Monte Carlo simulation (CYLTRAN) are used to determine the HPGe gamma efficiency for the energy range of 0-8 MeV. The HPGe is a GMX coaxial 280 cm{sup 3} N-type 70% gamma detector. Using the Momentum Achromat Recoil Spectrometer (MARS) at the K500 superconducting cyclotron of Texas A&M University, the radioactive nucleus {sup 24}Al was produced and separated. This nucleus undergoes positron decay followed by gamma transitions up to 8 MeV from {sup 24}Mg excited states, which are used for the HPGe efficiency calibration. Results: With the {sup 24}Al gamma energy spectrum up to 8 MeV, the efficiency for the 7.07 MeV γ ray at 4.9 cm distance from the radioactive source {sup 24}Al was obtained as 0.194(4)%, carefully considering factors such as positron annihilation, the peak summing effect, beta detector efficiency and the internal conversion effect. The Monte Carlo simulation (CYLTRAN) gave a value of 0.189%, in agreement with the experimental measurement. Applying the same procedure to different energy points, a precise efficiency calibration curve of the HPGe detector up to 7.07 MeV at 4.9 cm from the source was obtained. Using the same data analysis procedure, the efficiency for the 7.07 MeV gamma ray at 15.1 cm from the source was obtained as 0.0387(6)%. The MC simulation gave a similar value of 0.0395%. This discrepancy led us to assign an uncertainty of 3% to the efficiency at 15.1 cm up to 7.07 MeV. The MC calculations also reproduced the intensity of the observed single- and double-escape peaks, provided that the effects of positron annihilation in flight were incorporated. Conclusion: The improved-precision gamma efficiency calibration curve provides more accurate radiation detection and dose calculation for cancer radiotherapy treatment.

  16. Development of a technique for the efficiency calibration of a HPGe detector for the off gas samples of a nuclear reactor

    International Nuclear Information System (INIS)

    Singh, Sarbjit; Agarwal, Chhavi; Ramaswami, A.; Manchanda, V.K.

    2007-01-01

    Regular monitoring of off-gases released to the environment from a nuclear reactor is mandatory. The gaseous fission products are estimated by gamma-ray spectrometry using an HPGe detector coupled to a multichannel analyser. In view of the lack of available gaseous fission product standards, an indirect method based on the charcoal absorption technique was developed for the efficiency calibration of the HPGe detector system using 133Ba and 152Eu standards. Known activities of 133Ba and 152Eu are uniformly distributed in a vial containing activated charcoal and counted on the HPGe detector system at liquid nitrogen temperature to determine the gamma-ray efficiency for the vial containing activated charcoal. The ratio of the gamma-ray efficiencies of off-gas present in the normal vial and in the vial containing activated charcoal at liquid nitrogen temperature is used to determine the gamma-ray efficiency of off-gas present in the normal vial. (author)

  17. Theoretical determination of spectrum-exposure rate conversion operator of HPGe detector and its application to the measurement of environmental gamma-ray exposure rate

    International Nuclear Information System (INIS)

    Park, Ch.M.; Choi, B.I.; Kwak, S.S.; Ji, P.K.; Kim, T.W.; Park, Y.W.; Yoon, B.K.

    1993-01-01

    A conversion operator between spectrum and exposure rate, using a portable HPGe detector for environmental radiation monitoring, was determined theoretically under the assumption of uniform distribution of radiation source on the ground surface. The measurement results were compared with those of a pressurized ionization chamber. The results obtained with the HPGe detector were slightly lower. The method can be easily applied to any gamma ray detector to obtain a spectrum - exposure-rate conversion factor for computing the exposure rate of environmental gamma radiation. (N.T.) 15 refs.; 6 figs.; 3 tabs
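Once the conversion operator G(E) has been determined, applying it is a weighted sum over the measured spectrum, X = Σ_i G(E_i)·N(E_i). The sketch below shows only this final application step; the per-channel G values in the test are placeholders, not the operator derived in the paper:

```python
def exposure_rate(counts, g_factors):
    """Spectrum-to-exposure-rate conversion: X = sum_i G(E_i) * N(E_i),
    applying the per-channel conversion operator G to the measured
    count-rate spectrum N."""
    if len(counts) != len(g_factors):
        raise ValueError("spectrum and operator must have equal channel counts")
    return sum(n * g for n, g in zip(counts, g_factors))
```
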

  18. Measurements and simulation-based optimization of TIGRESS HPGe detector array performance

    International Nuclear Information System (INIS)

    Schumaker, M.A.

    2005-01-01

    TIGRESS is a new γ-ray detector array being developed for installation at the new ISAC-II facility at TRIUMF in Vancouver. When complete, it will consist of twelve large-volume segmented HPGe clover detectors, fitted with segmented Compton suppression shields. The combined operation of prototypes of both a TIGRESS detector and a suppression shield has been tested. Peak-to-total ratios, relative photopeak efficiencies, and energy resolution functions have been determined in order to characterize the performance of TIGRESS. This information was then used to refine a GEANT4 simulation of the full detector array. Using this simulation, methods to overcome the degradation of the photopeak efficiency and peak-to-total response that occurs with high γ-ray multiplicity events were explored. These methods take advantage of the high segmentation of both the HPGe clovers and the suppression shields to suppress or sum detector interactions selectively. For a range of γ-ray energies and multiplicities, optimal analysis methods have been determined, which has resulted in significant gains in the expected performance of TIGRESS. (author)

  19. Optimization of Compton-suppression and summing schemes for the TIGRESS HPGe detector array

    Science.gov (United States)

    Schumaker, M. A.; Svensson, C. E.; Andreoiu, C.; Andreyev, A.; Austin, R. A. E.; Ball, G. C.; Bandyopadhyay, D.; Boston, A. J.; Chakrawarthy, R. S.; Churchman, R.; Drake, T. E.; Finlay, P.; Garrett, P. E.; Grinyer, G. F.; Hackman, G.; Hyland, B.; Jones, B.; Maharaj, R.; Morton, A. C.; Pearson, C. J.; Phillips, A. A.; Sarazin, F.; Scraggs, H. C.; Smith, M. B.; Valiente-Dobón, J. J.; Waddington, J. C.; Watters, L. M.

    2007-04-01

    Methods of optimizing the performance of an array of Compton-suppressed, segmented HPGe clover detectors have been developed which rely on the physical position sensitivity of both the HPGe crystals and the Compton-suppression shields. These relatively simple analysis procedures promise to improve the precision of experiments with the TRIUMF-ISAC Gamma-Ray Escape-Suppressed Spectrometer (TIGRESS). Suppression schemes will improve the efficiency and peak-to-total ratio of TIGRESS for high γ-ray multiplicity events by taking advantage of the 20-fold segmentation of the Compton-suppression shields, while the use of different summing schemes will improve results for a wide range of experimental conditions. The benefits of these methods are compared for many γ-ray energies and multiplicities using a GEANT4 simulation, and the optimal physical configuration of the TIGRESS array under each set of conditions is determined.
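One of the simplest summing schemes is clover addback: the energies deposited in several crystals of one clover within a coincidence window are summed, restoring a Compton-scattered gamma to the full-energy peak. The sketch below is a generic illustration of that idea, not the optimized position-sensitive schemes studied for TIGRESS:

```python
def addback(hits):
    """Sum the energies deposited in the crystals of one clover within a
    coincidence window, restoring a scattered gamma to its full energy."""
    return sum(energy for _, energy in hits)

def build_spectrum(events, mode="addback"):
    """Return the list of energies to histogram, either summed per event
    ('addback') or crystal-by-crystal ('singles')."""
    energies = []
    for hits in events:                   # hits: [(crystal_id, energy_keV), ...]
        if mode == "addback":
            energies.append(addback(hits))
        else:
            energies.extend(energy for _, energy in hits)
    return energies
```

A gamma that scatters 800 keV into one crystal and 374.5 keV into a neighbour thus contributes to the 1174.5 keV peak in addback mode, but only to the Compton continuum in singles mode.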

  20. Calibration of a HPGe detector for Marinelli vessel geometry for measurement of gaseous 41Ar activity using MCNP

    International Nuclear Information System (INIS)

    Raghunath, T.; Narasimhanath, V.; Sunil, C.N.; Kumaravel, S.; Ramakrishna, V.; Prashanth Kumar, M.; Nair, B.S.K.; Purohit, R.G.; Sarkar, P.K.

    2012-01-01

    To carry out measurements of gaseous 41Ar activity, an attempt is made to calibrate the HPGe detector for the Standard Measuring Flask (SMF) and Marinelli vessel geometries and to compare their efficiencies. As a standard gaseous source of 41Ar is not available, the calibration is done using a liquid standard source of 22Na (whose 1274.5 keV gamma energy is close to the 1293.6 keV gamma energy of 41Ar). The HPGe detector and both geometries are simulated, and Full Energy Peak (FEP) efficiencies are obtained using MCNP. The correction factor for energy and sample matrix is obtained from the simulated efficiencies. By applying these correction factors the calibration is done. (author)
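The transfer described above amounts to scaling the measured 22Na efficiency by the ratio of simulated efficiencies, which carries the energy and matrix corrections. A sketch with illustrative numbers (not values from the paper):

```python
def efficiency_41ar(eps_meas_22na, eps_sim_41ar, eps_sim_22na):
    """Transfer the efficiency measured with the 22Na liquid standard
    (1274.5 keV) to gaseous 41Ar (1293.6 keV) via the ratio of simulated
    full-energy-peak efficiencies:
    eps(41Ar) = eps_meas(22Na) * eps_sim(41Ar) / eps_sim(22Na)."""
    return eps_meas_22na * eps_sim_41ar / eps_sim_22na
```
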

  1. A fitting algorithm based on simulated annealing techniques for efficiency calibration of HPGe detectors using different mathematical functions

    Energy Technology Data Exchange (ETDEWEB)

    Hurtado, S. [Servicio de Radioisotopos, Centro de Investigacion, Tecnologia e Innovacion (CITIUS), Universidad de Sevilla, Avda. Reina Mercedes s/n, 41012 Sevilla (Spain)], E-mail: shurtado@us.es; Garcia-Leon, M. [Departamento de Fisica Atomica, Molecular y Nuclear, Facultad de Fisica, Universidad de Sevilla, Aptd. 1065, 41080 Sevilla (Spain); Garcia-Tenorio, R. [Departamento de Fisica Aplicada II, E.T.S.A. Universidad de Sevilla, Avda, Reina Mercedes 2, 41012 Sevilla (Spain)

    2008-09-11

    In this work several mathematical functions are compared in order to perform the full-energy peak efficiency calibration of HPGe detectors, using a 126 cm{sup 3} coaxial HPGe detector and gamma-ray energies ranging from 36 to 1460 keV. Statistical tests and Monte Carlo simulations were used to study the performance of the fitting curve equations. Furthermore, fitting these complex functional forms to experimental data is a non-linear multi-parameter minimization problem. In gamma-ray spectrometry, non-linear least-squares fitting algorithms (Levenberg-Marquardt method) usually provide fast convergence while minimizing {chi}{sub R}{sup 2}; however, they sometimes reach only local minima. In order to overcome that shortcoming, a hybrid algorithm based on simulated annealing (HSA) techniques is proposed. Additionally, a new function is suggested that models the efficiency curve of germanium detectors in gamma-ray spectrometry.
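The global-search idea behind the HSA can be illustrated with a bare-bones simulated-annealing fit of a simple log-log efficiency model, ln ε = a + b·ln E, to synthetic data; the paper's hybrid algorithm, functional forms and statistical tests are more elaborate, and the cooling schedule and step sizes below are arbitrary choices for the example:

```python
import math, random

def chi2(params, data):
    """Sum of squared residuals of ln(eps) = a + b*ln(E) over (E, y) pairs."""
    a, b = params
    return sum(((a + b * math.log(E)) - y) ** 2 for E, y in data)

def anneal(data, start=(0.0, 0.0), steps=20_000, t0=1.0, seed=7):
    """Minimal simulated annealing: Gaussian proposals, Boltzmann acceptance
    of uphill moves, linear cooling, best-so-far tracking."""
    rng = random.Random(seed)
    cur = best = start
    f_cur = f_best = chi2(cur, data)
    for k in range(steps):
        T = t0 * (1 - k / steps) + 1e-6                  # linear cooling
        cand = (cur[0] + rng.gauss(0, 0.1), cur[1] + rng.gauss(0, 0.1))
        f_cand = chi2(cand, data)
        # accept downhill always, uphill with Boltzmann probability
        if f_cand < f_cur or rng.random() < math.exp(-(f_cand - f_cur) / T):
            cur, f_cur = cand, f_cand
            if f_cur < f_best:
                best, f_best = cur, f_cur
    return best, f_best
```

Unlike a pure gradient-based fit, occasional uphill acceptances let the search escape shallow local minima before the temperature drops and the walk becomes effectively greedy.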

  2. Wilcoxon signed-rank-based technique for the pulse-shape analysis of HPGe detectors

    Science.gov (United States)

    Martín, S.; Quintana, B.; Barrientos, D.

    2016-07-01

    The characterization of the electric response of segmented-contact high-purity germanium detectors requires scanning systems capable of accurately associating each pulse with the position of the interaction that generated it. This process requires an algorithm sensitive to changes above the electronic noise in the pulse shapes produced at different positions, depending on the resolution of the Ge crystal. In this work, a pulse-shape comparison technique based on the Wilcoxon signed-rank test has been developed. It provides a method to distinguish pulses coming from different interaction points in the germanium crystal. Therefore, this technique is a necessary step for building a reliable pulse-shape database that can be used later for the determination of the position of interaction for γ-ray tracking spectrometry devices such as AGATA, GRETA or GERDA. The method was validated by comparison with a χ2 test using simulated and experimental pulses corresponding to a Broad Energy germanium detector (BEGe).
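The statistic at the core of the test can be computed in a few lines of standard-library Python. This sketch implements only the signed-rank statistic W (zero differences dropped, tied absolute differences average-ranked), not the significance thresholds the full pulse-comparison procedure would also need:

```python
def wilcoxon_w(pulse_a, pulse_b):
    """Wilcoxon signed-rank statistic W = min(W+, W-) for two paired
    sample sequences; zero differences are dropped and ties in |d|
    receive their average rank."""
    diffs = [a - b for a, b in zip(pulse_a, pulse_b) if a != b]
    order = sorted(range(len(diffs)), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * len(diffs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and abs(diffs[order[j + 1]]) == abs(diffs[order[i]]):
            j += 1                                   # extend the tie group
        avg = (i + j) / 2 + 1                        # average rank, 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    w_plus = sum(r for d, r in zip(diffs, ranks) if d > 0)
    w_minus = sum(r for d, r in zip(diffs, ranks) if d < 0)
    return min(w_plus, w_minus)
```

Two identical pulses give W = 0 trivially; in practice one compares W (or its normal approximation) against a threshold chosen from the electronic-noise level to decide whether two pulses came from the same interaction position.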

  5. Numerical expressions for the computation of coincidence-summing correction factors in gamma-ray spectrometry with HPGe detectors

    International Nuclear Information System (INIS)

    Rizzo, S.; Tomarchio, E.

    2008-01-01

    to a 60% relative efficiency HPGe detector - are also reported. Point source peak and total efficiency curves are determined by measuring 'single-line' calibration sources while the computation of 'effective efficiencies' was performed by means of a Monte Carlo MCNP5 simulation of HPGe detector response

  6. Direct and precise determination of environmental radionuclides in solid materials using a modified Marinelli beaker and a HPGe detector

    International Nuclear Information System (INIS)

    Seo, B.K.; Lee, D.W.; Lee, K.Y.; Yoon, Y.Y.

    2001-01-01

    A simple but precise detection method was studied for the determination of natural radionuclides using a conventional HPGe detector. A new aluminium beaker, instead of a plastic Marinelli beaker, was constructed and examined to reach radioactive equilibrium between radon and its daughter elements without the escape of gaseous radon. Using this beaker, fifteen natural radionuclides from the three natural decay series could be determined by direct γ-ray measurement, and sixteen radionuclides could be determined indirectly after radioactive equilibrium had been reached. Analytical results from ground water were compared with those from conventional α spectroscopy and agreed well, within a 12% difference. A nitrogen gas purge was used to replace the air surrounding the detector in order to obtain a stable background and to reduce the interference of atmospheric radon daughter nuclides. The use of nitrogen purging and the aluminium Marinelli beaker results in an approximately tenfold increase in sensitivity and a decrease of the detection limit for 226Ra to about 0.74 Bq kg⁻¹ in soil samples. (orig.)

  7. Direct and precise determination of environmental radionuclides in solid materials using a modified Marinelli beaker and a HPGe detector.

    Science.gov (United States)

    Seo, B K; Lee, K Y; Yoon, Y Y; Lee, D W

    2001-06-01

    A simple but precise detection method was studied for the determination of natural radionuclides using a conventional HPGe detector. A new aluminium beaker, instead of a plastic Marinelli beaker, was constructed and examined to reach radioactive equilibrium between radon and its daughter elements without the escape of gaseous radon. Using this beaker, fifteen natural radionuclides from the three natural decay series could be determined by direct gamma-ray measurement, and sixteen radionuclides could be determined indirectly after radioactive equilibrium had been reached. Analytical results from ground water were compared with those from conventional alpha spectroscopy and agreed well, within a 12% difference. A nitrogen gas purge was used to replace the air surrounding the detector in order to obtain a stable background and to reduce the interference of atmospheric radon daughter nuclides. The use of nitrogen purging and the aluminium Marinelli beaker results in an approximately tenfold increase in sensitivity and a decrease of the detection limit for 226Ra to about 0.74 Bq kg⁻¹ in soil samples.

  8. Ninth degree polynomial fit function for calculation of efficiency calibrations for Ge(Li) and HPGe detectors

    International Nuclear Information System (INIS)

    Uosif, M.A.M.

    2006-01-01

    A new 9th-degree polynomial fit function has been constructed to calculate the absolute γ-ray detection efficiencies (η_th) of Ge(Li) and HPGe detectors at any γ-energy of interest in the energy range between 25 and 2000 keV and at distances between 6 and 148 cm. The total absolute γ-ray detection efficiencies were calculated for six detectors, three Ge(Li) and three HPGe, at different distances. The absolute efficiency of each detector was calculated at the specific energies of the standard sources for each measuring distance. In this calculation, both experimental (η_exp) and fitted (η_fit) efficiencies were obtained. Seven calibrated point sources (Am-241, Ba-133, Co-57, Co-60, Cs-137, Eu-152 and Ra-226) were used. The uncertainties of the efficiency calibration were also calculated for quality control. The measured (η_exp) and fitted (η_fit) efficiency values were compared with the efficiency calculated by the Gray fit function; the results obtained on the basis of η_exp and η_fit are in very good agreement.
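A high-degree polynomial fit of log-efficiency against log-energy, as used in this record, can be sketched with NumPy. The data below are synthetic (the actual calibration-source data are in the thesis), and the use of `numpy.polynomial.Polynomial.fit` rather than the authors' fitting routine is our assumption.

```python
import numpy as np

# Synthetic ground truth: a smooth log-log efficiency curve
# (illustrative only; real points come from the seven calibration sources).
energies = np.geomspace(25.0, 2000.0, 24)          # keV
ln_e = np.log(energies)
true_ln_eff = -3.0 - 0.8 * (ln_e - 5.0) - 0.05 * (ln_e - 5.0) ** 2

# Fit ln(efficiency) vs ln(E) with a 9th-degree polynomial.
# Polynomial.fit rescales the abscissa internally, which keeps the
# high-degree least-squares problem well conditioned.
fit = np.polynomial.Polynomial.fit(ln_e, true_ln_eff, deg=9)
residual = float(np.max(np.abs(fit(ln_e) - true_ln_eff)))

# The fitted function interpolates the efficiency at any energy of interest:
eff_100 = float(np.exp(fit(np.log(100.0))))        # efficiency at 100 keV
```

Because the model contains the true curve, the fit reproduces the sample points essentially exactly; with real, noisy calibration data the residuals quantify the calibration uncertainty instead.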

  9. Development of twin Ge detector for high energy photon measurement and its performance

    Energy Technology Data Exchange (ETDEWEB)

    Shigetome, Yoshiaki; Harada, Hideo [Power Reactor and Nuclear Fuel Development Corp., Tokai, Ibaraki (Japan). Tokai Works

    1998-03-01

    A prototype twin HPGe detector composed of two large HPGe crystals was developed to obtain the better detection efficiency (ε) and peak-to-total (P/T) ratio required for high-energy photon spectroscopy. In this work, the performance of the twin HPGe detector was evaluated by computer simulation employing the EGS4 code. (author)

  10. Efficiency calibration of a HPGe detector for [18F] FDG activity measurements

    International Nuclear Information System (INIS)

    Fragoso, Maria da Conceicao de Farias; Lacerda, Isabelle Viviane Batista de; Albuquerque, Antonio Morais de Sa

    2013-01-01

    The radionuclide 18F, in the form of fluorodeoxyglucose (FDG), is the most widely used radiopharmaceutical for Positron Emission Tomography (PET). Due to the increasing demand for [18F]FDG, it is important to ensure high-quality activity measurements in nuclear medicine practice. Therefore, standardized reference sources are necessary to calibrate 18F measuring systems. Usually, the activity measurements are performed in re-entrant ionization chambers, also known as radionuclide calibrators. Among the existing alternatives for the standardization of radioactive sources, gamma spectrometry is widely used for short-lived radionuclides, since it is essential to minimize source preparation time. The purpose of this work was to standardize the [18F]FDG solution by gamma spectrometry. In addition, the reference sources calibrated by this method can be used to calibrate and test the radionuclide calibrators of the Divisao de Producao de Radiofarmacos (DIPRA) of the Centro Regional de Ciencias Nucleares do Nordeste (CRCN-NE). Standard sources of 152Eu, 137Cs and 68Ge were used for the efficiency calibration of the spectrometer system. As a result, the efficiency curve as a function of energy was determined over a wide energy range, from 122 to 1408 keV. Reference sources obtained by this method can be used in [18F]FDG activity-measurement comparison programs for PET services located in the Brazilian Northeast region. (author)

  11. Surface Alpha Interactions in P-Type Point-Contact HPGe Detectors: Maximizing Sensitivity of 76Ge Neutrinoless Double-Beta Decay Searches

    Science.gov (United States)

    Gruszko, Julieta

    Though the existence of neutrino oscillations proves that neutrinos must have non-zero mass, Beyond-the-Standard-Model physics is needed to explain the origin of that mass. One intriguing possibility is that neutrinos are Majorana particles, i.e., they are their own anti-particles. Such a mechanism could naturally explain the observed smallness of the neutrino masses, and would have consequences that go far beyond neutrino physics, with implications for Grand Unification and leptogenesis. If neutrinos are Majorana particles, they could undergo neutrinoless double-beta decay (0νββ), a hypothesized rare decay in which two antineutrinos annihilate one another. This process, if it exists, would be exceedingly rare, with a half-life over 10^25 years. Therefore, searching for it requires experiments with extremely low background rates. One promising technique in the search for 0νββ is the use of P-type point-contact (P-PC) high-purity germanium (HPGe) detectors enriched in 76Ge, operated in large low-background arrays. This approach is used, with some key differences, by the MAJORANA and GERDA Collaborations. A problematic background in such large granular detector arrays is posed by alpha particles incident on the surfaces of the detectors, often caused by 222Rn contamination of parts or of the detectors themselves. In the MAJORANA DEMONSTRATOR, events have been observed that are consistent with energy-degraded alphas originating near the passivated surface of the detectors, leading to a potential background contribution in the region-of-interest for neutrinoless double-beta decay. However, it is also observed that when energy deposition occurs very close to the passivated surface, high charge trapping occurs along with subsequent slow charge re-release. This leads to both a reduced prompt signal and a measurable change in slope of the tail of a recorded pulse. Here we discuss the characteristics of these events and the development of a filter that can identify the

  12. Applying a low energy HPGe detector gamma ray spectrometric technique for the evaluation of Pu/Am ratio in biological samples.

    Science.gov (United States)

    Singh, I S; Mishra, Lokpati; Yadav, J R; Nadar, M Y; Rao, D D; Pradeepkumar, K S

    2015-10-01

    The estimation of the Pu/241Am ratio in biological samples is an important input for the assessment of the internal dose received by workers. Radiochemical separation of Pu isotopes and 241Am in a sample, followed by alpha spectrometry, is a widely used technique for determining the Pu/241Am ratio. However, this method is time consuming, and a quicker estimate is often required. In this work, the Pu/241Am ratio in biological samples was estimated with HPGe-detector-based measurements using the gamma/X-rays emitted by these radionuclides. The results were compared with those obtained from alpha spectroscopy of the samples after radiochemical analysis and were found to be in good agreement.

  13. Computational characterization of HPGe detectors usable for a wide variety of source geometries by using Monte Carlo simulation and a multi-objective evolutionary algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Guerra, J.G., E-mail: jglezg2002@gmail.es [Departamento de Física, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Rubiano, J.G. [Departamento de Física, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Instituto Universitario de Estudios Ambientales y Recursos Naturales, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Winter, G. [Instituto Universitario de Sistemas Inteligentes y Aplicaciones Numéricas en la Ingeniería, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Guerra, A.G.; Alonso, H.; Arnedo, M.A.; Tejera, A.; Martel, P. [Departamento de Física, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Instituto Universitario de Estudios Ambientales y Recursos Naturales, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Bolivar, J.P. [Departamento de Física Aplicada, Universidad de Huelva, 21071 Huelva (Spain)

    2017-06-21

    In this work, we have developed a computational methodology for characterizing HPGe detectors by implementing in parallel a multi-objective evolutionary algorithm together with a Monte Carlo simulation code. The evolutionary algorithm searches for the geometrical parameters of a detector model by minimizing the differences between the efficiencies calculated by Monte Carlo simulation and two reference sets of Full Energy Peak Efficiencies (FEPEs) corresponding to two given sample geometries: a beaker of small diameter laid over the detector window and a beaker of large capacity which wraps the detector. This methodology is a generalization of previously published work, which was limited to beakers placed over the window of the detector with a diameter equal to or smaller than the crystal diameter, so that the crystal mount cap (which surrounds the lateral surface of the crystal) was not considered in the detector model. The generalization has been accomplished not only by including such a mount cap in the model, but also by using multi-objective optimization instead of mono-objective, with the aim of building a model sufficiently accurate for a wider variety of beakers commonly used for the measurement of environmental samples by gamma spectrometry, such as Marinelli beakers, Petri dishes, or any other beaker with a diameter larger than the crystal diameter, for which part of the detected radiation has to pass through the mount cap. The proposed methodology has been applied to an HPGe XtRa detector, providing a detector model which has been successfully verified for different source-detector geometries and materials and experimentally validated using CRMs. - Highlights: • A computational method for characterizing HPGe detectors has been generalized. • The new version is usable for a wider range of sample geometries. • It starts from reference FEPEs obtained through a standard calibration procedure. • A model of an HPGe XtRa detector has been
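The two-objective fitness at the heart of this record can be sketched as follows. The misfit definition (sum of squared relative FEPE differences), the Pareto-dominance test, and all numbers are illustrative assumptions, not the authors' actual objective functions or data.

```python
# Hypothetical sketch of evaluating a candidate detector-model geometry
# against two reference sample geometries (small beaker / wrapping beaker).

def misfit(simulated, reference):
    """Sum of squared relative differences between simulated and
    reference full-energy-peak efficiencies (FEPEs)."""
    return sum(((s - r) / r) ** 2 for s, r in zip(simulated, reference))

def dominates(obj_a, obj_b):
    """True if candidate A Pareto-dominates candidate B: no worse in
    every objective and strictly better in at least one."""
    return all(a <= b for a, b in zip(obj_a, obj_b)) and \
           any(a < b for a, b in zip(obj_a, obj_b))

ref_small = [0.12, 0.08, 0.05]      # FEPEs for the small beaker (made up)
ref_large = [0.09, 0.06, 0.04]      # FEPEs for the wrapping beaker (made up)

# Each candidate geometry yields one misfit per reference geometry;
# the evolutionary algorithm keeps the non-dominated candidates.
cand_a = (misfit([0.121, 0.079, 0.050], ref_small),
          misfit([0.091, 0.060, 0.041], ref_large))
cand_b = (misfit([0.130, 0.070, 0.060], ref_small),
          misfit([0.100, 0.070, 0.030], ref_large))
```

Using two separate objectives, rather than summing them into one, keeps a geometry from over-fitting one sample shape at the expense of the other, which is the motivation the abstract gives for moving from mono- to multi-objective optimization.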

  14. Computational characterization of HPGe detectors usable for a wide variety of source geometries by using Monte Carlo simulation and a multi-objective evolutionary algorithm

    International Nuclear Information System (INIS)

    Guerra, J.G.; Rubiano, J.G.; Winter, G.; Guerra, A.G.; Alonso, H.; Arnedo, M.A.; Tejera, A.; Martel, P.; Bolivar, J.P.

    2017-01-01

    In this work, we have developed a computational methodology for characterizing HPGe detectors by implementing in parallel a multi-objective evolutionary algorithm together with a Monte Carlo simulation code. The evolutionary algorithm searches for the geometrical parameters of a detector model by minimizing the differences between the efficiencies calculated by Monte Carlo simulation and two reference sets of Full Energy Peak Efficiencies (FEPEs) corresponding to two given sample geometries: a beaker of small diameter laid over the detector window and a beaker of large capacity which wraps the detector. This methodology is a generalization of previously published work, which was limited to beakers placed over the window of the detector with a diameter equal to or smaller than the crystal diameter, so that the crystal mount cap (which surrounds the lateral surface of the crystal) was not considered in the detector model. The generalization has been accomplished not only by including such a mount cap in the model, but also by using multi-objective optimization instead of mono-objective, with the aim of building a model sufficiently accurate for a wider variety of beakers commonly used for the measurement of environmental samples by gamma spectrometry, such as Marinelli beakers, Petri dishes, or any other beaker with a diameter larger than the crystal diameter, for which part of the detected radiation has to pass through the mount cap. The proposed methodology has been applied to an HPGe XtRa detector, providing a detector model which has been successfully verified for different source-detector geometries and materials and experimentally validated using CRMs. - Highlights: • A computational method for characterizing HPGe detectors has been generalized. • The new version is usable for a wider range of sample geometries. • It starts from reference FEPEs obtained through a standard calibration procedure. • A model of an HPGe XtRa detector has been

  15. Study of the γ decay of high-lying states in 208Pb via inelastic scattering of 17O ions

    Directory of Open Access Journals (Sweden)

    Crespi F.C.L.

    2014-03-01

    Full Text Available A measurement of the high-lying states in 208Pb has been made using 17O beams at 20 MeV/u. The gamma decay following inelastic excitation was measured with the AGATA Demonstrator detector system, based on segmented HPGe detectors, coupled to an array of large-volume LaBr3:Ce scintillators and to an array of Si detectors. Preliminary results for states in the 5-8 MeV energy interval are presented in comparison with (γ,γ′) data.

  16. Computational characterization of HPGe detectors usable for a wide variety of source geometries by using Monte Carlo simulation and a multi-objective evolutionary algorithm

    Science.gov (United States)

    Guerra, J. G.; Rubiano, J. G.; Winter, G.; Guerra, A. G.; Alonso, H.; Arnedo, M. A.; Tejera, A.; Martel, P.; Bolivar, J. P.

    2017-06-01

    In this work, we have developed a computational methodology for characterizing HPGe detectors by implementing in parallel a multi-objective evolutionary algorithm together with a Monte Carlo simulation code. The evolutionary algorithm searches for the geometrical parameters of a detector model by minimizing the differences between the efficiencies calculated by Monte Carlo simulation and two reference sets of Full Energy Peak Efficiencies (FEPEs) corresponding to two given sample geometries: a beaker of small diameter laid over the detector window and a beaker of large capacity which wraps the detector. This methodology is a generalization of previously published work, which was limited to beakers placed over the window of the detector with a diameter equal to or smaller than the crystal diameter, so that the crystal mount cap (which surrounds the lateral surface of the crystal) was not considered in the detector model. The generalization has been accomplished not only by including such a mount cap in the model, but also by using multi-objective optimization instead of mono-objective, with the aim of building a model sufficiently accurate for a wider variety of beakers commonly used for the measurement of environmental samples by gamma spectrometry, such as Marinelli beakers, Petri dishes, or any other beaker with a diameter larger than the crystal diameter, for which part of the detected radiation has to pass through the mount cap. The proposed methodology has been applied to an HPGe XtRa detector, providing a detector model which has been successfully verified for different source-detector geometries and materials and experimentally validated using CRMs.

  17. Resolution, efficiency and stability of HPGe detector operating in a magnetic field at various gamma-ray energies

    International Nuclear Information System (INIS)

    Szymanska, K.; Achenbach, P.; Agnello, M.; Botta, E.; Bracco, A.; Bressani, T.; Camera, F.; Cederwall, B.; Feliciello, A.; Ferro, F.; Gerl, J.; Iazzi, F.; Kavatsyuk, M.; Kojouharov, I.; Pochodzalla, J.; Raciti, G.; Saito, T.R.; Sanchez Lorente, A.; Tegner, P.-E.; Wieland, O.

    2008-01-01

    The use of High Purity Germanium (HPGe) detectors has been planned in some future hadronic-physics experiments. The crystals will be located close to large spectrometers, where the magnetic fringing field is not negligible and detector performance might change; moreover, these experiments require high precision. The combined study presented here, of magnetic-field effects together with long-term measurements, is unique. In this paper, the results of systematic measurements of the resolution, stability and efficiency of a crystal operating inside a magnetic field of 0.8 T, using radioactive sources in the energy range from 0.08 to 1.33 MeV, are reported. The measurements were repeated over several months in order to test whether any permanent damage occurred. The resolution for the 1.173 and 1.332 MeV gamma rays from a 60Co source was measured at different magnetic fields in the range 0-0.8 T, and the results are compared with previous data

  18. Consistent empirical physical formula construction for recoil energy distribution in HPGe detectors by using artificial neural networks

    International Nuclear Information System (INIS)

    Akkoyun, Serkan; Yildiz, Nihat

    2012-01-01

    The gamma-ray tracking technique is a highly efficient detection method in experimental nuclear structure physics. On the basis of this method, two gamma-ray tracking arrays, AGATA in Europe and GRETA in the USA, are currently being tested. The interactions of neutrons in these detectors lead to an unwanted background in the gamma-ray spectra. Thus, the interaction points of neutrons in these detectors have to be determined in the gamma-ray tracking process in order to improve photo-peak efficiencies and peak-to-total ratios of the gamma-ray peaks. In this paper, the recoil energy distributions of germanium nuclei due to inelastic scattering of 1-5 MeV neutrons were first obtained by simulation experiments. Secondly, as a novel approach, consistent empirical physical formulas (EPFs) were constructed for these highly nonlinear detector responses of recoiling germanium nuclei by appropriate layered feedforward neural networks (LFNNs). The LFNN-EPFs are of explicit mathematical functional form. Therefore, the LFNN-EPFs can be used to derive further physical functions which could be potentially relevant for the determination of neutron interactions in the gamma-ray tracking process.

  19. A comparative study for the correction of random gamma ray summing effect in HPGe - detector based gamma ray spectrometry

    International Nuclear Information System (INIS)

    Rajput, M.U.

    2007-01-01

    Random coincidence summing of gamma rays is a potential source of error in gamma-ray spectrometry. The effect is of little significance at low counting rates but becomes increasingly important at high counting rates, and careful corrections are required to avoid introducing errors into quantitative measurements. Several correction methods have been proposed; the most common is the pulser method, which requires a precision pulse generator in the electronic circuitry to provide a reference peak. In this work, a comparative study has been carried out using both the pulser method and a radioactive-source-based method, with 137Cs as a fixed source and 241Am as a varied source. The dead time of the system was varied, and acquisition of a spectrum at each position yielded peak areas affected by pulse pile-up losses. Linear regression of the data was then carried out. The study established a consistent factor that can be used as a characteristic of the detector, thereby removing the need for a calibrated or precise pulse generator. (author)
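The regression step described in this abstract can be sketched as follows. The counts, the loss model, and the helper name are synthetic illustrations: the idea is that pile-up losses grow with the total count rate, so extrapolating the observed fixed-source peak area back to zero rate recovers the loss-free area.

```python
# Observed 137Cs peak area vs. total input count rate (synthetic data):
# the varied 241Am source raises the rate and increases pile-up losses.

def linear_fit(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

rates = [1000.0, 5000.0, 10000.0, 20000.0, 40000.0]   # total counts/s
true_area = 50000.0                                   # loss-free 137Cs area
loss_per_cps = 0.25                                   # synthetic loss factor
areas = [true_area - loss_per_cps * r for r in rates]

slope, intercept = linear_fit(rates, areas)
# The intercept estimates the loss-free peak area; the slope is the
# rate-dependent loss characteristic of the detector/electronics chain.
```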

  20. Absolute standardization of radionuclides with complex decay by the peak-sum coincidence method and photon spectrometry with HPGe detector

    International Nuclear Information System (INIS)

    Silva, Ronaldo Lins da

    2017-01-01

    This study presents a new methodology for the absolute standardization of 133Ba, a radionuclide with a complex decay scheme, using the peak-sum coincidence method associated with gamma spectrometry with a high-resolution germanium detector. The use of direct matrix multiplication allowed all sum-coincidence energies to be identified, together with their detection probabilities, which made possible the calculation of the detection probabilities of the interfering energies. In addition, deconvolution software made it possible to obtain peak areas free of interference from other sums and, by means of the equation deduced for the peak-sum method, to standardize 133Ba. The resulting activity was compared with those found by the absolute methods existing at the LNMRI, among which the peak-sum coincidence result stood out. The estimated uncertainties were below 0.30%, compatible with results reported in the literature for other absolute methods. The methodology was thus shown to standardize 133Ba precisely, accurately, easily and quickly. The relevance of this doctoral thesis is to provide the National Metrology Laboratory of Ionizing Radiation (LNMRI) with a new absolute standardization methodology for radionuclides with complex decay schemes. (author)
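For intuition, the classical sum-peak relation for an idealized two-gamma cascade can be sketched below. This is a drastic simplification of the thesis's matrix treatment of the complex 133Ba scheme (angular correlations and all interfering sums are ignored), and every number is made up.

```python
# Ideal two-gamma cascade: the activity A follows from the two
# full-energy peak areas N1, N2, the sum-peak area N12, and the total
# count T, without knowing the efficiencies:
#     A = N1*N2/N12 + T

def sum_peak_activity(n1, n2, n12, total):
    return n1 * n2 / n12 + total

# Self-consistency check with assumed efficiencies eps1, eps2:
A, eps1, eps2 = 1.0e5, 0.05, 0.04
n1 = A * eps1 * (1 - eps2)            # gamma 1 detected, gamma 2 missed
n2 = A * eps2 * (1 - eps1)            # gamma 2 detected, gamma 1 missed
n12 = A * eps1 * eps2                 # both detected -> sum peak
total = A * (1 - (1 - eps1) * (1 - eps2))   # at least one detected
estimate = sum_peak_activity(n1, n2, n12, total)
```

Algebraically, N1·N2/N12 = A(1−ε1)(1−ε2) and T = A − A(1−ε1)(1−ε2), so their sum recovers A exactly in this idealized model; the matrix method in the thesis generalizes this bookkeeping to many cascading transitions.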

  1. Development of a self-absorption correction method used for a HPGe detector by means of a Monte Carlo simulation

    International Nuclear Information System (INIS)

    Itadzu, Hidesuke; Iguchi, Tetsuo; Suzuki, Toshikazu

    2013-01-01

    Quantitative analysis of food products and natural samples, to determine the activity of each radionuclide, can be made by using a high-purity germanium (HPGe) gamma-ray spectrometer system. The analysis procedure is, in general, based upon the guidelines established by the Nuclear Safety Division of the Ministry of Education, Culture, Sports, Science and Technology in Japan (JP MEXT). In gamma-ray spectrum analysis of large-volume samples, re-entrant (Marinelli) containers are commonly used. The effect of photon attenuation in a large-volume sample, so-called “self-absorption”, should be corrected for precise determination of the activity. For Marinelli containers, two exact geometries are given in the JP MEXT guidelines, for 700-milliliter and 2-liter volumes, together with functions for obtaining the self-absorption coefficients of these specific shapes. Self-absorption corrections have therefore so far been carried out only for these two containers with practical media. However, to measure radioactivity in containers of volumes other than those described in the guidelines, the self-absorption correction functions must be obtained by measuring at least two standard multinuclide volume sources consisting of different media, i.e. with different linear attenuation coefficients. In this work, we developed a method to obtain these functions over a wide range of linear attenuation coefficients for self-absorption in various shapes of Marinelli containers using a Monte Carlo simulation. The method was applied to a 1-liter Marinelli container, which is widely used for the above quantitative analysis although its self-absorption correction function had not yet been established. The validity of the method was experimentally checked through an analysis of natural samples with known activity levels. (author)
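The role of a self-absorption correction can be illustrated with a much simpler model than the Monte Carlo treatment in this record: a uniform slab of thickness t and linear attenuation coefficient μ. The function names and the slab geometry are our simplifying assumptions, not the paper's method.

```python
import math

# Slab-geometry escape fraction F(mu, t) = (1 - exp(-mu*t)) / (mu*t):
# the average probability that a photon emitted uniformly in a slab
# reaches the far face. A real Marinelli geometry needs the Monte
# Carlo treatment described in the abstract; this is only a sketch.

def slab_self_absorption(mu, thickness):
    x = mu * thickness
    if x < 1e-9:                      # transparent limit -> no absorption
        return 1.0
    return (1.0 - math.exp(-x)) / x

def correction_factor(mu_sample, mu_calib, thickness):
    """Ratio by which a measured efficiency must be scaled when the
    sample medium differs from the calibration medium."""
    return slab_self_absorption(mu_calib, thickness) / \
           slab_self_absorption(mu_sample, thickness)
```

A denser sample (larger μ) absorbs more of its own photons, so its correction factor relative to a lighter calibration medium exceeds 1; the paper's functions play the same role for arbitrary container shapes.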

  2. Peak-to-valley ratios for three different HPGe detectors for the assessment of 137Cs deposition on the ground and the impact of the detector field-of-view.

    Science.gov (United States)

    Östlund, Karl; Samuelsson, Christer; Mattsson, Sören; Rääf, Christopher L

    2017-02-01

    The peak-to-valley (PTV) method was investigated experimentally by comparing PTV ratios for three HPGe detectors, with complementary Monte Carlo simulations of scatter in air for larger source-detector distances. The measured PTV ratios for 137Cs in air were similar for the three detectors for incident angles between 0 and 90°. The study indicated that the PTV method can differentiate between surface and shallow-depth sources if the detector field of view is limited to a radius of less than 3.5 m.
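The quantity compared in this record can be sketched directly: the PTV ratio is the counts in the full-energy peak window divided by the counts in the scatter "valley" just below it. The channel windows and the ten-channel toy spectra below are illustrative, not the authors' data.

```python
# Peak-to-valley ratio from a recorded spectrum. A buried source
# produces more down-scattered photons, filling the valley below the
# 662 keV peak and lowering the ratio relative to a surface source.

def window_sum(spectrum, lo, hi):
    return sum(spectrum[lo:hi])

def peak_to_valley(spectrum, peak_win, valley_win):
    return window_sum(spectrum, *peak_win) / window_sum(spectrum, *valley_win)

# Synthetic 10-channel "spectra" (channels 4-6 = peak, 0-3 = valley):
surface = [5, 5, 5, 5, 900, 950, 900, 5, 5, 5]
buried  = [60, 60, 60, 60, 700, 750, 700, 60, 60, 60]
ptv_surface = peak_to_valley(surface, (4, 7), (0, 4))
ptv_buried  = peak_to_valley(buried,  (4, 7), (0, 4))
```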

  3. Calibration efficiency of HPGe detector in the 50-1800 KeV energy range; Calibracao em eficiencia de um detector HPGe na faixa de energias 50 - 1800KeV

    Energy Technology Data Exchange (ETDEWEB)

    Venturini, Luzia [Instituto de Pesquisas Energeticas e Nucleares (IPEN), Sao Paulo, SP (Brazil). Servico de Monitoracao Ambiental

    1996-07-01

    This paper describes the efficiency calibration of an HPGe detector in the 50-1800 keV energy range for two water-measurement geometries: a Marinelli beaker (850 ml) and a polyethylene flask (100 ml). The experimental data were corrected for the summing effect and fitted to a continuous, differentiable, energy-dependent function given by ln(ε) = b0 + b1·ln(E/E0) + β·[ln(E/E0)]², where β = b2 if E > E0 and β = a2 if E ≤ E0; ε is the full-absorption-peak efficiency, E is the gamma-ray energy, and {b0, b1, b2, a2, E0} is the parameter set to be fitted. (author)
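The quoted fit function can be implemented directly. The parameter values below are made up for illustration; only the functional form comes from the abstract.

```python
import math

# ln(eps) = b0 + b1*ln(E/E0) + beta*ln(E/E0)**2,
# with beta = b2 above the knee energy E0 and beta = a2 below it.

def efficiency(E, b0, b1, b2, a2, E0):
    u = math.log(E / E0)
    beta = b2 if E > E0 else a2
    return math.exp(b0 + b1 * u + beta * u * u)

params = dict(b0=-4.0, b1=-0.9, b2=-0.02, a2=0.3, E0=120.0)

# The two branches join continuously at E0, because ln(E/E0) = 0 there
# kills both the linear and quadratic terms: eps(E0) = exp(b0).
eps_at_knee = efficiency(120.0, **params)
```

The continuity at E0 (and, since the linear coefficient b1 is shared, continuity of the first derivative too) is what makes the piecewise form "continuous and differentiable" as the abstract states.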

  4. Modern utilization of accurate methods for gamma-ray spectral analysis detected by high pure germanium (HPGE) detectors through different applications

    International Nuclear Information System (INIS)

    El-Sayed, M.M.

    2005-01-01

    This thesis presents a novel application of wavelet transform theory in gamma-ray spectroscopy. The technique was applied to searching for real and weak peaks, resolving multiplets, smoothing and de-noising gamma-ray spectra, and identifying peaks with an artificial neural network. A brief description of gamma-ray spectrum analysis is presented, and the necessary formulas and algorithms of wavelet theory for solving these main problems in gamma-ray spectrum analysis are discussed. The peak-search algorithm was applied to different types of spectra, including IAEA spectra and gamma spectra from other sources. The multiplet-resolution algorithm was applied successfully to different types of multiplets, and the de-noising algorithm was applied successfully to spectra from different sources

  5. modern utilization of accurate methods for gamma-ray spectral analysis detected by high pure germanium (HPGE) detectors through different applications

    International Nuclear Information System (INIS)

    El-Sayed, M.M.

    2006-01-01

    This thesis presents a novel application of wavelet transform theory in gamma-ray spectroscopy. The technique was applied to searching for real and weak peaks, resolving multiplets, smoothing and de-noising gamma-ray spectra, and identifying peaks with an artificial neural network. A brief description of gamma-ray spectrum analysis is presented, and the necessary formulas and algorithms of wavelet theory for solving these main problems in gamma-ray spectrum analysis are discussed. The peak-search algorithm was applied to different types of spectra, including IAEA spectra and gamma spectra from other sources. The multiplet-resolution algorithm was applied successfully to different types of multiplets, and the de-noising algorithm was applied successfully to spectra from different sources. Finally, a database for the neutron activation laboratory was created. This database consists of five routines: wavelet gamma-spectrum analysis, peak identification, elemental concentration, neutron flux determination, and detector efficiency calculation
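A minimal stand-in for the wavelet de-noising mentioned in these two records is a one-level Haar transform with soft thresholding of the detail coefficients. The thesis uses a full multi-level transform and a neural-network peak identifier; everything below is our simplified sketch.

```python
import math

SQRT2 = math.sqrt(2.0)

def haar_forward(x):
    """One-level Haar transform; len(x) must be even."""
    approx = [(x[2*i] + x[2*i+1]) / SQRT2 for i in range(len(x) // 2)]
    detail = [(x[2*i] - x[2*i+1]) / SQRT2 for i in range(len(x) // 2)]
    return approx, detail

def haar_inverse(approx, detail):
    out = []
    for a, d in zip(approx, detail):
        out.append((a + d) / SQRT2)
        out.append((a - d) / SQRT2)
    return out

def soft_threshold(coeffs, t):
    """Shrink each coefficient toward zero by t (soft thresholding)."""
    return [math.copysign(max(abs(c) - t, 0.0), c) for c in coeffs]

def denoise(spectrum, threshold):
    a, d = haar_forward(spectrum)
    return haar_inverse(a, soft_threshold(d, threshold))

# With threshold 0 the transform reconstructs the input exactly
# (perfect reconstruction), which is what makes thresholding safe:
spec = [10.0, 12.0, 11.0, 95.0, 98.0, 12.0, 10.0, 11.0]
roundtrip = denoise(spec, 0.0)
```

Raising the threshold suppresses small, noise-like detail coefficients while broad peaks, which live mainly in the approximation coefficients, survive.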

  6. Assessment of ambient-temperature, high-resolution detectors for nuclear safeguards applications

    International Nuclear Information System (INIS)

    Ruhter, W.D.; McQuaid, J.H.; Lavietes, A.

    1993-01-01

    High-resolution gamma- and x-ray spectrometry is used routinely in nuclear safeguards verification measurements of plutonium and uranium in the field. These measurements are now performed with high-purity germanium (HPGe) detectors that require cooling to liquid-nitrogen temperatures, thus limiting their utility in field and unattended safeguards measurement applications. Ambient-temperature semiconductor detectors may complement HPGe detectors for certain safeguards verification applications. Their potential will be determined by criteria such as performance, commercial availability, stage of development, and cost. We have conducted an assessment of ambient-temperature detectors for safeguards measurement applications with these criteria in mind

  7. Performance of A Compact Multi-crystal High-purity Germanium Detector Array for Measuring Coincident Gamma-ray Emissions

    Energy Technology Data Exchange (ETDEWEB)

    Howard, Chris [Univ. of North Carolina, Chapel Hill, NC (United States); Triangle Univ. Nuclear Lab., Durham, NC (United States); Daigle, Stephen [Univ. of North Carolina, Chapel Hill, NC (United States); Triangle Univ. Nuclear Lab., Durham, NC (United States); Buckner, Matt [Univ. of North Carolina, Chapel Hill, NC (United States); Triangle Univ. Nuclear Lab., Durham, NC (United States); Erikson, Luke E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Runkle, Robert C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Stave, Sean C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Champagne, Art [Univ. of North Carolina, Chapel Hill, NC (United States); Triangle Univ. Nuclear Lab., Durham, NC (United States); Cooper, Andrew [Univ. of North Carolina, Chapel Hill, NC (United States); Triangle Univ. Nuclear Lab., Durham, NC (United States); Downen, Lori [Univ. of North Carolina, Chapel Hill, NC (United States); Triangle Univ. Nuclear Lab., Durham, NC (United States); Glasgow, Brian D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kelly, Keegan [Univ. of North Carolina, Chapel Hill, NC (United States); Triangle Univ. Nuclear Lab., Durham, NC (United States); Sallaska, Anne [Univ. of North Carolina, Chapel Hill, NC (United States); Triangle Univ. Nuclear Lab., Durham, NC (United States)

    2015-02-18

    The Multi-sensor Airborne Radiation Survey (MARS) detector is a 14-crystal array of high-purity germanium (HPGe) detectors housed in a single cryostat. The array was used to measure the astrophysical S-factor for the 14N(p,γ)15O* reaction for several transition energies at an effective center of mass energy of 163 keV. Owing to the segmented nature of the MARS detector, the effect of gamma-ray summing was greatly reduced in comparison to past experiments which utilized large, single-crystal detectors. The new S-factor values agree within the uncertainties with the past measurements. Details of the analysis and detector performance will be presented.

  8. Determination of Barium and selected rare-earth elements in geological materials employing a HpGe detector by radioisotope excited x-ray fluorescence

    International Nuclear Information System (INIS)

    LaBrecque, J.J.; Preiss, I.L.

    1984-01-01

    The laterite material (geological) from Cerro Impacto was first studied by airborne radiometric techniques in the 1970s and was found to have an abnormally high radioactive background. Further studies showed this deposit to be rich in thorium, columbium, barium and rare-earth elements (mostly La, Ce, Pr and Nd). Similar work has been reported for the analysis of Brazil's lateritic material from Morro do Ferro, in which elemental compositions (including barium and rare-earth elements) and their relationship to the mobilization of thorium from the deposit were determined using a Co-57 radioisotope source. The objective of this work was to develop an analytical method to determine the barium and rare-earth elements present in Venezuelan lateritic material from Cerro Impacto. We have previously employed a method using a Si(Li) detector, but owing to its low detection efficiency in the rare-earth K-line region (about 30 keV - 40 keV), we decided to study the improvement in sensitivities and detection limits obtained with a hyperpure germanium detector.

  9. Characterizing and Reaching High-Risk Drinkers Using Audience Segmentation

    Science.gov (United States)

    Moss, Howard B.; Kirby, Susan D.; Donodeo, Fred

    2010-01-01

    Background: Market or audience segmentation is widely used in social marketing efforts to help planners identify segments of a population to target for tailored program interventions. Market-based segments are typically defined by behaviors, attitudes, knowledge, opinions, or lifestyles. They are more helpful to health communication and marketing planning than epidemiologically defined groups because market-based segments are similar with respect to how they behave or might react to marketing and communication efforts. However, market segmentation has rarely been used in alcohol research. As an illustration of its utility, we employed commercial data that describe the sociodemographic characteristics of high-risk drinkers as an audience segment: where they tend to live, their lifestyles, interests, consumer behaviors, alcohol consumption behaviors, other health-related behaviors, and cultural values. Such information can be extremely valuable in targeting and planning public health campaigns, targeted mailings, prevention interventions and research efforts. Methods: We describe the results of a segmentation analysis of those individuals who self-report consuming five or more drinks per drinking episode at least twice in the last 30 days. The study used the proprietary PRIZM™ audience segmentation database merged with the Centers for Disease Control and Prevention's (CDC) Behavioral Risk Factor Surveillance System (BRFSS) database. The top ten of the 66 PRIZM™ audience segments for this risky drinking pattern are described. For five of these segments we provide additional in-depth details about consumer behavior and estimates of the market areas where these risky drinkers reside. Results: The top ten audience segments (PRIZM clusters) most likely to engage in high-risk drinking are described. The cluster with the highest concentration of binge drinking behavior is referred to as the "Cyber Millennials." This cluster is characterized as "the nation's tech-savvy singles

  10. Characterizing and reaching high-risk drinkers using audience segmentation.

    Science.gov (United States)

    Moss, Howard B; Kirby, Susan D; Donodeo, Fred

    2009-08-01

    Market or audience segmentation is widely used in social marketing efforts to help planners identify segments of a population to target for tailored program interventions. Market-based segments are typically defined by behaviors, attitudes, knowledge, opinions, or lifestyles. They are more helpful to health communication and marketing planning than epidemiologically defined groups because market-based segments are similar with respect to how they behave or might react to marketing and communication efforts. However, market segmentation has rarely been used in alcohol research. As an illustration of its utility, we employed commercial data that describe the sociodemographic characteristics of high-risk drinkers as an audience segment, including where they tend to live, lifestyles, interests, consumer behaviors, alcohol consumption behaviors, other health-related behaviors, and cultural values. Such information can be extremely valuable in targeting and planning public health campaigns, targeted mailings, prevention interventions, and research efforts. We described the results of a segmentation analysis of those individuals who self-reported consuming 5 or more drinks per drinking episode at least twice in the last 30 days. The study used the proprietary PRIZM (Claritas, Inc., San Diego, CA) audience segmentation database merged with the Centers for Disease Control and Prevention's (CDC) Behavioral Risk Factor Surveillance System (BRFSS) database. The top 10 of the 66 PRIZM audience segments for this risky drinking pattern are described. For five of these segments we provided additional in-depth details about consumer behavior and the estimates of the market areas where these risky drinkers resided. The top 10 audience segments (PRIZM clusters) most likely to engage in high-risk drinking are described. The cluster with the highest concentration of binge-drinking behavior is referred to as the "Cyber Millennials." This cluster is characterized as "the nation's tech

  11. TIGRESS highly-segmented high-purity germanium clover detector

    Science.gov (United States)

    Scraggs, H. C.; Pearson, C. J.; Hackman, G.; Smith, M. B.; Austin, R. A. E.; Ball, G. C.; Boston, A. J.; Bricault, P.; Chakrawarthy, R. S.; Churchman, R.; Cowan, N.; Cronkhite, G.; Cunningham, E. S.; Drake, T. E.; Finlay, P.; Garrett, P. E.; Grinyer, G. F.; Hyland, B.; Jones, B.; Leslie, J. R.; Martin, J.-P.; Morris, D.; Morton, A. C.; Phillips, A. A.; Sarazin, F.; Schumaker, M. A.; Svensson, C. E.; Valiente-Dobón, J. J.; Waddington, J. C.; Watters, L. M.; Zimmerman, L.

    2005-05-01

    The TRIUMF-ISAC Gamma-Ray Escape-Suppressed Spectrometer (TIGRESS) will consist of twelve units of four high-purity germanium (HPGe) crystals in a common cryostat. The outer contacts of each crystal will be divided into four quadrants and two lateral segments for a total of eight outer contacts. The performance of a prototype HPGe four-crystal unit has been investigated. Integrated noise spectra for all contacts were measured. Energy resolutions, relative efficiencies for both individual crystals and for the entire unit, and peak-to-total ratios were measured with point-like sources. Position-dependent performance was measured by moving a collimated source across the face of the detector.

  12. MWPC with highly segmented cathode pad readout

    International Nuclear Information System (INIS)

    Debbe, R.; Fischer, J.; Lissauer, D.

    1989-01-01

    Experiments being conducted with high energy heavy ion beams at Brookhaven National Laboratory and at CERN have shown the importance of developing position sensitive detectors capable of handling events with high multiplicity in environments of high track density as will also be the case in future high luminosity colliders like SSC and RHIC. In addition, these detectors are required to have a dynamic range wide enough to detect minimum ionizing particles and heavy ions like oxygen or silicon. We present here a description of work being done on a prototype of such a detector at BNL. Results from a similar counter are also presented in this Conference. The ''pad chamber'' is a detector with a cathode area subdivided into a very large number of pixel-like elements such that a charged particle traversing the detector at normal incidence leaves an induced charge on a few localized pads. The pads are interconnected by a resistive strip, and readout amplifiers are connected to the resistive strip at appropriate, carefully determined spacings. The pattern of tracks in a multi-hit event is easily recognized, and a centroid-finding readout system allows position determination to a small fraction of the basic cell size. 5 refs., 9 figs
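
The centroid-finding readout described above can be sketched as follows; the pad pitch, charge values, and function name are illustrative assumptions, not measured values from the BNL prototype:

```python
import numpy as np

def pad_centroid(charges, pitch_mm=4.0):
    """Charge-weighted centroid position (mm) along a row of cathode pads,
    giving a position estimate well below the basic pad pitch."""
    charges = np.asarray(charges, dtype=float)
    centers = np.arange(len(charges)) * pitch_mm   # pad-center coordinates
    return np.sum(centers * charges) / np.sum(charges)

# A track at normal incidence induces charge on a few neighbouring pads
q = [0.1, 1.0, 2.5, 1.0, 0.1]           # symmetric cluster around pad 2
x = pad_centroid(q)                      # -> 8.0 mm (center of pad 2)
```

Because the induced charge is localized to a few pads, multiple clusters in a high-multiplicity event stay separable, and each can be fed to this centroid independently.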

  13. Automation of the Characterization of High Purity Germanium Detectors

    Science.gov (United States)

    Dugger, Charles "Chip"

    2014-09-01

    Neutrinoless double beta decay is a rare hypothesized process that may yield valuable insight into the fundamental properties of the neutrino. Currently there are several experiments trying to observe this process, including the Majorana DEMONSTRATOR experiment, which uses high purity germanium (HPGe) detectors to generate and search for these events. Because the event happens internally, it is essential to have the lowest background possible. This is achieved through passive detector shielding, as well as event discrimination techniques that distinguish between multi-site events characteristic of gamma radiation and single-site events characteristic of neutrinoless double beta decay. Before fielding such an experiment, the radiation response of the detectors must be characterized. A robotic arm is being tested for future calibration of HPGe detectors. The arm will hold a source at locations relative to the crystal while data are acquired. Several radioactive sources of varying energies will be used to determine the characteristics of the crystal. In this poster, I will present our work with the robot, as well as the characterization data we took with an underground HPGe detector at the WIPP facility in Carlsbad, NM (2013).

  14. High-dynamic-range imaging for cloud segmentation

    Science.gov (United States)

    Dev, Soumyabrata; Savoy, Florian M.; Lee, Yee Hui; Winkler, Stefan

    2018-04-01

    Sky-cloud images obtained from ground-based sky cameras are usually captured using a fisheye lens with a wide field of view. However, the sky exhibits a large dynamic range in terms of luminance, more than a conventional camera can capture. It is thus difficult to capture the details of an entire scene with a regular camera in a single shot. In most cases, the circumsolar region is overexposed, and the regions near the horizon are underexposed. This renders cloud segmentation for such images difficult. In this paper, we propose HDRCloudSeg - an effective method for cloud segmentation using high-dynamic-range (HDR) imaging based on multi-exposure fusion. We describe the HDR image generation process and release a new database to the community for benchmarking. Our proposed approach is the first using HDR radiance maps for cloud segmentation and achieves very good results.
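
A minimal, numpy-only sketch of the multi-exposure fusion step that HDR imaging of this kind builds on: each image is weighted per pixel by how well exposed that pixel is (close to mid-grey), then blended. The Gaussian well-exposedness weight and its sigma are assumptions in the style of standard exposure fusion, not HDRCloudSeg's exact pipeline:

```python
import numpy as np

def fuse_exposures(stack, sigma=0.2):
    """Pixel-wise multi-exposure fusion with a well-exposedness weight."""
    stack = np.asarray(stack, dtype=float)          # (n, H, W), values in [0, 1]
    w = np.exp(-((stack - 0.5) ** 2) / (2 * sigma ** 2))
    w /= w.sum(axis=0, keepdims=True)               # normalise weights per pixel
    return (w * stack).sum(axis=0)

# Under-, mid- and over-exposed captures of the same (flat) scene
under = np.full((4, 4), 0.05)
mid   = np.full((4, 4), 0.50)
over  = np.full((4, 4), 0.95)
hdr = fuse_exposures([under, mid, over])  # dominated by the mid exposure
```

In a sky-camera setting the overexposed circumsolar pixels would draw their result from the shorter exposures and the dark horizon from the longer ones, which is exactly the detail-recovery effect the abstract is after.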

  15. Fast iterative segmentation of high resolution medical images

    International Nuclear Information System (INIS)

    Hebert, T.J.

    1996-01-01

    Various applications in positron emission tomography (PET), single photon emission computed tomography (SPECT) and magnetic resonance imaging (MRI) require segmentation of 20 to 60 high-resolution images of size 256x256 pixels in 3-9 seconds per image. This places particular constraints on the design of image segmentation algorithms. This paper examines the trade-offs in segmenting images based on fitting a density function to the pixel intensities using curve-fitting versus the maximum-likelihood method. A quantized data representation is proposed, and the EM algorithm for fitting a finite mixture density function to the quantized representation of an image is derived. A Monte Carlo evaluation of mean estimation error and classification error showed that the resulting quantized EM algorithm dramatically reduces the required computation time without loss of accuracy.
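
A minimal sketch of the quantized-EM idea described above, fitting a two-component Gaussian mixture to a histogram (bin centers plus counts) instead of every pixel; all parameter values, names, and the two-class setup are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def em_quantized(centers, counts, mu, sigma, pi, n_iter=50):
    """EM for a 2-component Gaussian mixture fitted to a quantized image
    histogram; bin counts act as weights, so the per-iteration cost scales
    with the number of bins rather than the number of pixels."""
    centers = np.asarray(centers, float)
    counts = np.asarray(counts, float)
    mu, sigma, pi = np.array(mu, float), np.array(sigma, float), np.array(pi, float)
    for _ in range(n_iter):
        # E-step: responsibilities per histogram bin (constants cancel)
        p = (pi[:, None] / sigma[:, None]) * np.exp(
            -0.5 * ((centers - mu[:, None]) / sigma[:, None]) ** 2)
        r = p / p.sum(axis=0)
        # M-step: weighted by responsibility times bin count
        w = r * counts
        n_k = w.sum(axis=1)
        mu = (w * centers).sum(axis=1) / n_k
        sigma = np.sqrt((w * (centers - mu[:, None]) ** 2).sum(axis=1) / n_k)
        pi = n_k / n_k.sum()
    return mu, sigma, pi

# Histogram of a synthetic 256x256 image with two intensity classes
rng = np.random.default_rng(0)
pixels = np.concatenate([rng.normal(60, 8, 40000), rng.normal(160, 12, 25536)])
counts, edges = np.histogram(pixels, bins=256, range=(0, 256))
centers = 0.5 * (edges[:-1] + edges[1:])
mu, sigma, pi = em_quantized(centers, counts, [50, 150], [10, 10], [0.5, 0.5])
```

With 256 bins the M-step touches 256 values per component instead of 65,536 pixels, which is the source of the speed-up the abstract reports.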

  16. Epidermal segmentation in high-definition optical coherence tomography.

    Science.gov (United States)

    Li, Annan; Cheng, Jun; Yow, Ai Ping; Wall, Carolin; Wong, Damon Wing Kee; Tey, Hong Liang; Liu, Jiang

    2015-01-01

    Epidermis segmentation is a crucial step in many dermatological applications. Recently, high-definition optical coherence tomography (HD-OCT) has been developed and applied to imaging subsurface skin tissues. In this paper, a novel epidermis segmentation method using HD-OCT is proposed in which the epidermis is segmented in three steps: weighted-least-squares-based pre-processing, graph-based skin-surface detection, and local-integral-projection-based dermal-epidermal junction detection. Using a dataset of five 3D volumes, we found that this method correlates well with the conventional method of manually marking out the epidermis. It can therefore serve to effectively and rapidly delineate the epidermis for the study and clinical management of skin diseases.
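
The graph-based surface-detection step can be illustrated with a toy dynamic-programming version: per column, pick the row of the strongest dark-to-bright transition, linked across columns by a minimum-cost path whose row may change by at most one per column. This is a simplified stand-in under stated assumptions, not the authors' implementation:

```python
import numpy as np

def detect_surface(bscan):
    """Toy graph-based surface detection on a 2-D B-scan (rows x columns)."""
    cost = -np.diff(bscan.astype(float), axis=0)  # low cost at dark-to-bright edges
    R, C = cost.shape
    acc = cost.copy()
    step = np.zeros((R, C), dtype=int)            # predecessor row offset
    for c in range(1, C):
        prev = acc[:, c - 1]
        cands = np.full((3, R), np.inf)
        cands[0, :] = prev                        # stay on the same row
        cands[1, 1:] = prev[:-1]                  # come from the row above
        cands[2, :-1] = prev[1:]                  # come from the row below
        k = cands.argmin(axis=0)
        acc[:, c] = cost[:, c] + cands[k, np.arange(R)]
        step[:, c] = np.where(k == 1, -1, np.where(k == 2, 1, 0))
    rows = np.empty(C, dtype=int)
    rows[-1] = int(acc[:, -1].argmin())
    for c in range(C - 1, 0, -1):                 # backtrack the minimum-cost path
        rows[c - 1] = rows[c] + step[rows[c], c]
    return rows

# Synthetic B-scan: dark air above row 8, bright tissue below
bscan = np.zeros((20, 10))
bscan[8:, :] = 1.0
surface = detect_surface(bscan)                   # gradient row index per column
```

The smoothness constraint (at most one row per column) is what distinguishes the graph formulation from a naive per-column gradient maximum, which is not robust to speckle noise.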

  17. Power ramp tests of high burnup BWR segment rods

    International Nuclear Information System (INIS)

    Hayashi, H.; Etoh, Y.; Tsukuda, Y.; Shimada, S.; Sakurai, H.

    2002-01-01

    Lead use assemblies (LUAs) of a high-burnup 8x8 fuel design for Japanese BWRs were irradiated for up to 5 cycles in Fukushima Daini Nuclear Power Station Unit No. 2. Segment rods were installed in the LUAs and used for power ramp tests in the Japan Materials Testing Reactor (JMTR). Post-irradiation examinations (PIEs) of the segment rods were carried out at Nippon Nuclear Fuel Development Co., Ltd. before and after the ramp tests. Maximum linear heat rates of the LUAs were kept above 300 W/cm in the first cycle and above 250 W/cm in the second and third cycles, and decreased to 200 W/cm in the fourth cycle and 80 W/cm in the fifth cycle. The integrity of the high-burnup 8x8 fuel was confirmed up to a bundle burnup of 48 GWd/t after 5 cycles of irradiation. Systematic, high-quality data were collected through detailed PIEs. The main results are as follows. The oxide on the outer surface of the cladding tubes was uniform, and its thickness was less than 20 micrometers after 5 cycles of irradiation, almost independent of burnup. Hydrogen contents in the cladding tubes were less than 150 ppm after 5 cycles of irradiation, although hydrogen contents increased during the fourth and fifth irradiation cycles. Mechanical properties of the cladding tubes were on the extrapolated line of previous data up to 5 cycles of irradiation. Fission gas release rates remained low (mainly less than 6%) up to 5 cycles of irradiation owing to the design, which decreases pellet temperature. Pellet-cladding bonding layers were observed after the third cycle, and almost full bonding was observed after the fifth cycle. Pellet volume increased with burnup in proportion to the solid swelling rate up to the fourth cycle. After the fifth cycle, slightly higher pellet swelling was confirmed. Power ramp tests were carried out, and satisfactory performance of the Zr-lined cladding tube was confirmed up to 60 GWd/t (segment-average burnup).
One segment rod irradiated for 3 cycles failed by a single step ramp test at terminal ramp power of 614 W

  18. High-resolution gamma-ray measurement systems using a compact electro- mechanically cooled detector system and intelligent software

    International Nuclear Information System (INIS)

    Buckley, W.M.; Carlson, J.B.; Neufeld, K.W.

    1995-01-01

    Obtaining high-resolution gamma-ray measurements with high-purity germanium (HPGe) detectors in the field has been of limited practicality due to the need to use and maintain a supply of liquid nitrogen (LN2). The same constraint limits high-resolution gamma measurements in unattended safeguards or treaty verification applications. We are developing detectors and software to greatly extend the applicability of high-resolution germanium-based measurements in these situations.

  19. Multi-granularity synthesis segmentation for high spatial resolution Remote sensing images

    International Nuclear Information System (INIS)

    Yi, Lina; Liu, Pengfei; Qiao, Xiaojun; Zhang, Xiaoning; Gao, Yuan; Feng, Boyan

    2014-01-01

    Traditional segmentation methods can only partition an image in a single granularity space, with segmentation accuracy limited to that single granularity space. This paper proposes a multi-granularity synthesis segmentation method for high-spatial-resolution remote sensing images based on a quotient space model. Firstly, we divide the whole image area into multiple granules (regions), where each region consists of ground objects that have a similar optimal segmentation scale, and then select and synthesize the sub-optimal segmentations of each region to obtain the final segmentation result. To validate this method, a land cover category map is used to guide the scale synthesis of multi-scale image segmentations for Quickbird-image land use classification. Firstly, the image is coarsely divided into multiple regions, each belonging to a certain land cover category. Then multi-scale segmentation results are generated by the Mumford-Shah function based region merging method. For each land cover category, the optimal segmentation scale is selected by a supervised segmentation accuracy assessment method. Finally, the optimal scales of the segmentation results are synthesized under the guidance of the land cover category. Experiments show that multi-granularity synthesis segmentation produces more accurate segmentation than that of a single granularity space and benefits classification.

  20. Construction of precast high performance concrete segmental bridges.

    OpenAIRE

    Ruiz Ripoll, Lidia

    2016-01-01

    The construction of both medium and long span precast concrete segmental bridges is widespread throughout Spain. Usually, the segments have multiple-keyed epoxy joints and are assembled by internal prestressing. Yet there is a more recent type of bridge with dry joints and external prestressing. In the latter, shear is transferred through physical support between keys and friction between the faces of the compressed joint. This shear force is evaluated using friction coefficients from t...

  1. High intensity region segmentation in MR imaging of multiple sclerosis

    International Nuclear Information System (INIS)

    Rodrigo, F; Filipuzzi, M; Graffigna, J P; Isoardi, R; Noceti, M

    2013-01-01

    Numerous pathologies manifest in Magnetic Resonance Imaging (MRI) as hyperintense or bright regions compared to normal tissue. It is of particular interest to develop an algorithm to detect, identify and define those Regions of Interest (ROI) when analyzing MRI studies, particularly for lesions of Multiple Sclerosis (MS). The objective of this study is to analyze the parameters that optimize segmentation of the areas of interest. To establish which areas should be considered hyperintense regions, we developed a database (DB) of studies of patients diagnosed with MS. This disease causes axonal demyelination and is expressed as bright regions in PD, T2 and FLAIR MRI sequences. Thus, with more than 4300 hyperintense regions validated by an expert physician, an algorithm was developed to detect such spots, approximating the results obtained by the expert. Alongside these hyperintense lesion regions, it also detected bone regions with high intensity levels, similar to the intensity of the lesions but with other features that allow good differentiation. The algorithm then detects ROIs with similar intensity levels and performs classification through data-mining techniques.
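
A minimal sketch of hyperintensity detection of the kind described above, flagging voxels far above the normal-tissue intensity distribution via a robust z-score; the threshold, intensities, and median/MAD normalization are illustrative assumptions, not the paper's algorithm:

```python
import numpy as np

def hyperintense_mask(flair, brain_mask, z_thresh=3.0):
    """Flag voxels whose intensity is far above the normal-tissue
    distribution, using median/MAD as a robust mean/sigma estimate."""
    vals = flair[brain_mask]
    med = np.median(vals)
    mad = 1.4826 * np.median(np.abs(vals - med))   # robust sigma estimate
    z = (flair - med) / mad
    return brain_mask & (z > z_thresh)

# Synthetic FLAIR slice: normal tissue ~N(100, 5) plus one bright lesion
rng = np.random.default_rng(1)
flair = rng.normal(100.0, 5.0, (64, 64))
flair[10:13, 10:13] = 200.0                        # simulated MS lesion
mask = hyperintense_mask(flair, np.ones((64, 64), bool))
```

The robust statistics keep the threshold stable even when lesions skew the intensity histogram; a real pipeline would follow this with connected-component analysis to separate lesions from the bright bone regions the abstract mentions.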

  2. Optimized digital filtering techniques for radiation detection with HPGe detectors

    Energy Technology Data Exchange (ETDEWEB)

    Salathe, Marco, E-mail: marco.salathe@mpi-hd.mpg.de; Kihm, Thomas, E-mail: mizzi@mpi-hd.mpg.de

    2016-02-01

    This paper describes state-of-the-art digital filtering techniques that are part of GEANA, an automatic data analysis software used for the GERDA experiment. The discussed filters include a novel, nonlinear correction method for ballistic deficits, which is combined with one of three shaping filters: a pseudo-Gaussian, a modified trapezoidal, or a modified cusp filter. The performance of the filters is demonstrated with a 762 g Broad Energy Germanium (BEGe) detector, produced by Canberra, that measures γ-ray lines from radioactive sources in an energy range between 59.5 and 2614.5 keV. At 1332.5 keV, together with the ballistic deficit correction method, all filters produce a comparable energy resolution of ~1.61 keV FWHM. This value is superior to those measured by the manufacturer and those found in publications with detectors of a similar design and mass. At 59.5 keV, the modified cusp filter without a ballistic deficit correction produced the best result, with an energy resolution of 0.46 keV. It is observed that the loss in resolution by using a constant shaping time over the entire energy range is small when using the ballistic deficit correction method.
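
To illustrate the kind of shaping filter discussed above, here is a textbook moving-sum trapezoidal shaper applied to a noiseless step pulse; this is a hedged sketch, not GEANA's implementation, and it omits the pole-zero/ballistic-deficit corrections the paper adds for real preamplifier signals:

```python
import numpy as np

def trapezoid(v, k, m):
    """Trapezoidal shaper with rise time k samples and flat top m samples:
    the difference of two length-k moving sums separated by k+m samples,
    normalised by k. For an ideal step of height A the flat top sits at A."""
    c = np.cumsum(np.concatenate(([0.0], v)))
    s = lambda n0: c[n0 + k] - c[n0]               # sum of v[n0:n0+k]
    n = len(v) - (2 * k + m) + 1
    return np.array([(s(i + k + m) - s(i)) / k for i in range(n)])

# Noiseless step of amplitude 1.0, standing in for a preamplifier pulse
# with a very long decay time
v = np.concatenate([np.zeros(50), np.ones(200)])
out = trapezoid(v, k=20, m=10)
# The flat-top maximum recovers the step amplitude (1.0 here)
```

Reading the amplitude on the flat top rather than at a peak is what makes the trapezoid tolerant of charge-collection-time variations, the effect the paper's ballistic-deficit correction addresses for short flat tops.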

  3. Real-time numerical processing of HPGe detector signals

    International Nuclear Information System (INIS)

    Eric Barat; Thomas Dautremer; Laurent Laribiere; Jean Christophe Trama

    2006-01-01

    Full text of publication follows: In gamma spectrometry, progress in processor technology makes complex real-time digital processing both feasible and attractive, yet only simplified and rigid treatments have been available commercially until now. Indeed, the historical solution used for 50 years consists of applying a so-called 'cusp' filter and perturbing the optimal shape in order to shrink and/or truncate it. This tuning, largely determined by the input count rate (ICR) the user expects to measure, is then a compromise between resolution and throughput. Because the filter cannot be tuned for each pulse, a kind of 'levelling down' results: the energy of each pulse is not estimated as well as it could be. The new approach proposed here avoids this restrictive hand tuning entirely. The innovation lies in modelling the shot-noise signal as a jump Markov linear system, where the jump is the occurrence of a pulse in the signal. From this model we developed an algorithm that makes on-line estimation of the energies possible without having to broaden the pulses in time as the cusp filter does. The algorithm first determines whether a pulse is present at each instant and then, conditionally on this information, applies an optimal Kalman smoother. Thanks to this global optimisation, the throughput-versus-resolution compromise is dramatically improved, gaining an important factor over a commercial device in admissible ICR (more than 1 million counts per second). A major advantage of the absence of hand tuning is that the system accepts fluctuating ICR. To validate the concept we built a real-time demonstrator. Our equipment comprises an electronic stage that conditions the signal coming from the detector preamplifier and optimises the signal-to-noise ratio. The signal is then sampled at 10 MHz, and the power of two Pentium processors running at 3 GHz is sufficient to perform the whole treatment. Both for count rate and for metrology, the result is far more efficient than the simple cusp filter. With our first laboratory version it was possible, for example, to continuously monitor a 60 Co source (7600 cps ICR) perturbed by a 137 Cs source (0 to 1,600,000 cps ICR) while maintaining good resolution (from 1.8 keV to 3.2 keV at 1332 keV). The reference activity was quantitatively good (2.7% peak-to-peak dispersion over the whole perturbed-activity range). (authors)

  4. Coincidence corrected efficiency calibration of Compton-suppressed HPGe detectors

    Energy Technology Data Exchange (ETDEWEB)

    Aucott, Timothy [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Brand, Alexander [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); DiPrete, David [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-04-20

    The authors present a reliable method to calibrate the full-energy efficiency and the coincidence correction factors using a commonly-available mixed source gamma standard. This is accomplished by measuring the peak areas from both summing and non-summing decay schemes and simultaneously fitting both the full-energy efficiency, as well as the total efficiency, as functions of energy. By using known decay schemes, these functions can then be used to provide correction factors for other nuclides not included in the calibration standard.
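
A hedged sketch of the calibration idea: fit log(efficiency) as a polynomial in log(energy) for both the full-energy and total efficiencies, then form a summing-out correction for a two-step cascade. The data points, the quadratic log-log form, and the variable names are illustrative assumptions, not the authors' fitted values:

```python
import numpy as np

# Illustrative calibration points (152Eu-like line energies in keV)
E    = np.array([121.8, 344.3, 778.9, 964.1, 1408.0])
eff  = np.array([0.030, 0.015, 0.0080, 0.0068, 0.0050])  # full-energy efficiency
teff = np.array([0.120, 0.080, 0.055, 0.050, 0.042])     # total efficiency

# Quadratic fits in log-log space, a common empirical efficiency form
p_full = np.polyfit(np.log(E), np.log(eff), 2)
p_tot  = np.polyfit(np.log(E), np.log(teff), 2)
eps  = lambda e: np.exp(np.polyval(p_full, np.log(e)))   # full-energy eff.
epsT = lambda e: np.exp(np.polyval(p_tot, np.log(e)))    # total eff.

# For gamma-1 emitted in prompt coincidence with gamma-2, the gamma-1
# full-energy peak loses counts whenever gamma-2 deposits any energy,
# so the measured area is multiplied by this factor to recover the truth:
correction = 1.0 / (1.0 - epsT(1332.5))
```

With both efficiency functions in hand, the same construction extends to other nuclides and more complex decay schemes, which is the point of calibrating both curves simultaneously.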

  5. Information Extraction of High Resolution Remote Sensing Images Based on the Calculation of Optimal Segmentation Parameters

    Science.gov (United States)

    Zhu, Hongchun; Cai, Lijie; Liu, Haiying; Huang, Wei

    2016-01-01

    Multi-scale image segmentation and the selection of optimal segmentation parameters are the key processes in the object-oriented information extraction of high-resolution remote sensing images. The accuracy of remote sensing special subject information depends on this extraction. On the basis of WorldView-2 high-resolution data, the following processes were conducted in this study to determine the optimal segmentation parameters for object-oriented image segmentation and high-resolution image information extraction. Firstly, the best combination of bands and weights was determined for the information extraction of the high-resolution remote sensing image. An improved weighted mean-variance method was proposed and used to calculate the optimal segmentation scale. Thereafter, the best shape factor and compactness factor parameters were computed with the use of control variables and the combination of heterogeneity and homogeneity indexes. Different types of image segmentation parameters were obtained according to the surface features. The high-resolution remote sensing images were multi-scale segmented with the optimal segmentation parameters. A hierarchical network structure was established by setting the information extraction rules to achieve object-oriented information extraction. This study presents an effective and practical method that can explain expert input judgment by reproducible quantitative measurements. Furthermore, the results of this procedure may be incorporated into a classification scheme. PMID:27362762
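
A simplified stand-in for the weighted mean-variance idea above: score each candidate scale by the area-weighted variance of segment mean values, preferring scales whose segments separate distinct land covers. The scoring function and numbers are illustrative assumptions, not the paper's improved method:

```python
import numpy as np

def weighted_mean_variance(seg_means, seg_areas):
    """Area-weighted variance of segment mean values: a heuristic score
    for comparing candidate segmentation scales (larger inter-segment
    variance suggests segments that separate land covers better)."""
    m = np.asarray(seg_means, float)
    a = np.asarray(seg_areas, float)
    mu = np.average(m, weights=a)
    return np.average((m - mu) ** 2, weights=a)

# Two candidate scales: the finer one splits a field/road boundary cleanly,
# the coarser one merges the two covers into near-identical segments
fine   = weighted_mean_variance([40, 42, 120, 118], [100, 90, 110, 95])
coarse = weighted_mean_variance([80, 81], [200, 195])
# fine > coarse, so this score would prefer the finer scale here
```

In the paper this kind of score is computed per land cover category, so each category can end up with its own optimal scale before the final synthesis.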

  6. Performance of a highly segmented scintillating fibres electromagnetic calorimeter

    International Nuclear Information System (INIS)

    Asmone, A.; Bertino, M.; Bini, C.; De Zorzi, G.; Diambrini Palazzi, G.; Di Cosimo, G.; Di Domenico, A.; Garufi, F.; Gauzzi, P.; Zanello, D.

    1993-01-01

    A prototype scintillating-fibre electromagnetic calorimeter has been constructed and tested with 2, 4 and 8 GeV electron beams at the CERN PS. The calorimeter modules consist of a Bi-Pb-Sn alloy and scintillating fibres. The fibres are parallel to the modules' longer axis, and nearly parallel to the direction of the incident electrons. The calorimeter has two segmentation regions of 24x24 mm² and 8x24 mm² cross-sectional area, respectively. Results on energy and impact-point spatial resolution are obtained and compared for the two granularities. (orig.)

  7. Contextually guided very-high-resolution imagery classification with semantic segments

    Science.gov (United States)

    Zhao, Wenzhi; Du, Shihong; Wang, Qiao; Emery, William J.

    2017-10-01

    Contextual information, revealing relationships and dependencies between image objects, is among the most important information for the successful interpretation of very-high-resolution (VHR) remote sensing imagery. Over the last decade, the geographic object-based image analysis (GEOBIA) technique has been widely used first to divide images into homogeneous parts and then to assign semantic labels according to the properties of image segments. However, due to the complexity and heterogeneity of VHR images, segments without semantic labels (i.e., semantic-free segments) generated with low-level features often fail to represent geographic entities (for example, building roofs are usually partitioned into chimney/antenna/shadow parts). As a result, it is hard to capture contextual information across geographic entities when using semantic-free segments. In contrast to low-level features, "deep" features can be used to build robust segments with accurate labels (i.e., semantic segments) that represent geographic entities at higher levels. Based on these semantic segments, semantic graphs can be constructed to capture contextual information in VHR images. In this paper, semantic segments were first explored with convolutional neural networks (CNN), and a conditional random field (CRF) model was then applied to model the contextual information between semantic segments. Experimental results on two challenging VHR datasets (i.e., the Vaihingen and Beijing scenes) indicate that the proposed method is an improvement over existing image classification techniques in classification performance (overall accuracy ranges from 82% to 96%).

  8. Track segments in hadronic showers in a highly granular scintillator-steel hadron calorimeter

    CERN Document Server

    Adloff, C.; Chefdeville, M.; Drancourt, C.; Gaglione, R.; Geffroy, N.; Karyotakis, Y.; Koletsou, I.; Prast, J.; Vouters, G.; Francis, K.; Repond, J.; Schlereth, J.; Smith, J.; Xia, L.; Baldolemar, E.; Li, J.; Park, S.T.; Sosebee, M.; White, A.P.; Yu, J.; Eigen, G.; Mikami, Y.; Watson, N.K.; Mavromanolakis, G.; Thomson, M.A.; Ward, D.R.; Yan, W.; Benchekroun, D.; Hoummada, A.; Khoulaki, Y.; Apostolakis, J.; Dannheim, D.; Dotti, A.; Folger, G.; Ivantchenko, V.; Klempt, W.; Kraaij, E.van der; Lucaci-Timoce, A.-I; Ribon, A.; Schlatter, D.; Uzhinskiy, V.; Cârloganu, C.; Gay, P.; Manen, S.; Royer, L.; Tytgat, M.; Zaganidis, N.; Blazey, G.C.; Dyshkant, A.; Lima, J.G.R.; Zutshi, V.; Hostachy, J.-Y; Morin, L.; Cornett, U.; David, D.; Falley, G.; Gadow, K.; Göttlicher, P.; Günter, C.; Hartbrich, O.; Hermberg, B.; Karstensen, S.; Krivan, F.; Krüger, K.; Lu, S.; Morozov, S.; Morgunov, V.; Reinecke, M.; Sefkow, F.; Smirnov, P.; Terwort, M.; Feege, N.; Garutti, E.; Laurien, S.; Marchesini, I.; Matysek, M.; Ramilli, M.; Briggl, K.; Eckert, P.; Harion, T.; Schultz-Coulon, H.-Ch; Shen, W.; Stamen, R.; Bilki, B.; Norbeck, E.; Onel, Y.; Wilson, G.W.; Kawagoe, K.; Sudo, Y.; Yoshioka, T.; Dauncey, P.D.; Magnan, A.-M; Bartsch, V.; Wing, M.; Salvatore, F.; Gil, E.Cortina; Mannai, S.; Baulieu, G.; Calabria, P.; Caponetto, L.; Combaret, C.; Negra, R.Della; Grenier, G.; Han, R.; Ianigro, J-C; Kieffer, R.; Laktineh, I.; Lumb, N.; Mathez, H.; Mirabito, L.; Petrukhin, A.; Steen, A.; Tromeur, W.; Donckt, M.Vander; Zoccarato, Y.; Alamillo, E.Calvo; Fouz, M.-C; Puerta-Pelayo, J.; Corriveau, F.; Bobchenko, B.; Chadeeva, M.; Danilov, M.; Epifantsev, A.; Markin, O.; Mizuk, R.; Novikov, E.; Popov, V.; Rusinov, V.; Tarkovsky, E.; Kirikova, N.; Kozlov, V.; Smirnov, P.; Soloviev, Y.; Buzhan, P.; Ilyin, A.; Kantserov, V.; Kaplin, V.; Karakash, A.; Popova, E.; Tikhomirov, V.; Kiesling, C.; Seidel, K.; Simon, F.; Soldner, C.; Szalay, M.; Tesar, M.; Weuste, L.; Amjad, M.S.; Bonis, J.; Callier, S.; 
Lorenzo, S.Conforti di; Cornebise, P.; Doublet, Ph; Dulucq, F.; Fleury, J.; Frisson, T.; der Kolk, N.van; Li, H.; Martin-Chassard, G.; Richard, F.; Taille, Ch de la; Pöschl, R.; Raux, L.; Rouëné, J.; Seguin-Moreau, N.; Anduze, M.; Balagura, V.; Boudry, V.; Brient, J-C; Cornat, R.; Frotin, M.; Gastaldi, F.; Guliyev, E.; Haddad, Y.; Magniette, F.; Musat, G.; Ruan, M.; Tran, T.H.; Videau, H.; Bulanek, B.; Zacek, J.; Cvach, J.; Gallus, P.; Havranek, M.; Janata, M.; Kvasnicka, J.; Lednicky, D.; Marcisovsky, M.; Polak, I.; Popule, J.; Tomasek, L.; Tomasek, M.; Ruzicka, P.; Sicho, P.; Smolik, J.; Vrba, V.; Zalesak, J.; Belhorma, B.; Ghazlane, H.; Kotera, K.; Takeshita, T.; Uozumi, S.; Jeans, D.; Götze, M.; Sauer, J.; Weber, S.; Zeitnitz, C.

    2013-01-01

    We investigate the three dimensional substructure of hadronic showers in the CALICE scintillator-steel hadronic calorimeter. The high granularity of the detector is used to find track segments of minimum ionising particles within hadronic showers, providing sensitivity to the spatial structure and the details of secondary particle production in hadronic cascades. The multiplicity, length and angular distribution of identified track segments are compared to GEANT4 simulations with several different shower models. Track segments also provide the possibility for in-situ calibration of highly granular calorimeters.

  9. High-resolution satellite image segmentation using Hölder exponents

    Indian Academy of Sciences (India)

Keywords: high resolution image; texture analysis; segmentation; IKONOS; Hölder exponent; cluster.

  10. Pyramidal Watershed Segmentation Algorithm for High-Resolution Remote Sensing Images Using Discrete Wavelet Transforms

    Directory of Open Access Journals (Sweden)

    K. Parvathi

    2009-01-01

Full Text Available The watershed transformation is a useful morphological segmentation tool for a variety of grey-scale images. However, over-segmentation and under-segmentation are the key problems of the conventional algorithm. In this paper, an efficient segmentation method for high-resolution remote sensing image analysis is presented. Wavelet analysis is one of the most popular techniques for detecting local intensity variation, and hence the wavelet transform is used to analyze the image. The wavelet transform is applied to the image, producing detail (horizontal, vertical, and diagonal) and approximation coefficients. The image gradient with selective regional minima is estimated with grey-scale morphology for the approximation image at a suitable resolution, and the watershed is then applied to the gradient image to avoid over-segmentation. The segmented image is projected up to high resolution using the inverse wavelet transform. Because the watershed segmentation is applied to a small subset-size image, it demands less computational time. We have applied our new approach to analyze remote sensing images. The algorithm was implemented in MATLAB. Experimental results demonstrated the method to be effective.
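The wavelet-then-watershed pipeline described above can be sketched in a few lines. This is a minimal illustration (not the authors' MATLAB code), assuming a one-level Haar transform and SciPy's IFT watershed on a toy two-region image, with hand-placed regional-minimum markers:

```python
import numpy as np
from scipy import ndimage

def haar_approx(img):
    """One level of a 2-D Haar DWT; returns the approximation subband."""
    rows = (img[0::2] + img[1::2]) / 2.0          # average row pairs
    return (rows[:, 0::2] + rows[:, 1::2]) / 2.0  # average column pairs

# toy image: two homogeneous regions separated by a vertical edge
img = np.zeros((8, 8))
img[:, 4:] = 100.0
approx = haar_approx(img)                         # 4 x 4 approximation

# gradient magnitude of the approximation, rescaled to uint8 for the watershed
grad = np.hypot(ndimage.sobel(approx, axis=0), ndimage.sobel(approx, axis=1))
grad_u8 = (255.0 * grad / grad.max()).astype(np.uint8)

# regional-minimum markers: one seed inside each region
markers = np.zeros(grad_u8.shape, dtype=np.int16)
markers[1, 0] = 1   # left-region seed
markers[1, 3] = 2   # right-region seed

labels = ndimage.watershed_ift(grad_u8, markers)
```

Projecting the labels back up to full resolution via the inverse transform, as in the paper, is omitted here.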

  11. Fragmented esophageal smooth muscle contraction segments on high resolution manometry: a marker of esophageal hypomotility.

    Science.gov (United States)

    Porter, R F; Kumar, N; Drapekin, J E; Gyawali, C P

    2012-08-01

Esophageal peristalsis consists of a chain of contracting striated and smooth muscle segments on high resolution manometry (HRM). We compared smooth muscle contraction segments in symptomatic subjects with reflux disease to healthy controls. High resolution manometry Clouse plots were analyzed in 110 subjects with reflux disease (50 ± 1.4 years, 51.5% women) and 15 controls (27 ± 2.1 years, 60.0% women). Using the 30 mmHg isobaric contour tool, sequences were designated fragmented if either smooth muscle contraction segment was absent or if the two smooth muscle segments were separated by a pressure trough, and failed if both smooth muscle contraction segments were absent. The discriminative value of contraction segment analysis was assessed. A total of 1115 swallows were analyzed (reflux group: 965, controls: 150). Reflux subjects had lower peak and averaged contraction amplitudes compared with controls. Contraction segment analysis adds value to HRM analysis; specifically, fragmented smooth muscle contraction segments may be a marker of esophageal hypomotility. © 2012 Blackwell Publishing Ltd.

  12. High Performance Shape Memory Polyurethane Synthesized with High Molecular Weight Polyol as the Soft Segment

    Directory of Open Access Journals (Sweden)

    Manzoor Ahmad

    2012-05-01

Full Text Available Shape memory polyurethanes (SMPUs) are typically synthesized using polyols of low molecular weight (MW ~2,000 g/mol), as it is believed that the high density of cross-links formed with these low molecular weight polyols is essential for high mechanical strength and a good shape memory effect. In this study, polyethylene glycol (PEG-6000, MW ~6,000 g/mol) as the soft segment and a diisocyanate as the hard segment were used to synthesize SMPUs, and the results were compared with SMPUs based on polycaprolactone (PCL-2000). The study revealed that although the PEG-6000-based SMPUs have lower maximum elongations at break (425%) and recovery stresses than the PCL-based SMPUs, they have much better recovery ratios (up to 98%) and shape fixity (up to 95%), hence a better shape memory effect. Furthermore, PEG-based SMPUs showed a much shorter actuation time of < 10 s for up to 90% shape recovery, compared to typical actuation times of tens of seconds to a few minutes for common SMPUs, demonstrating their great potential for applications in microsystems and other engineering components.

  13. Textural Segmentation of High-Resolution Sidescan Sonar Images

    National Research Council Canada - National Science Library

    Kalcic, Maria; Bibee, Dale

    1995-01-01

… The high resolution of the 455 kHz sonar imagery also provides much information about the surficial bottom sediments; however, their acoustic scattering properties are not well understood at high frequencies …

  14. Chemically synthesized metal-oxide-metal segmented nanowires with high ferroelectric response

    Energy Technology Data Exchange (ETDEWEB)

    Herderick, Edward D; Padture, Nitin P [Department of Materials Science and Engineering, Center for Emergent Materials, Ohio State University, Columbus, OH 43210 (United States); Polomoff, Nicholas A; Huey, Bryan D, E-mail: padture.1@osu.edu [Department of Chemical, Materials, and Biomolecular Engineering, Institute of Materials Science, University of Connecticut, Storrs, CT 06269 (United States)

    2010-08-20

    A chemical synthesis method is presented for the fabrication of high-definition segmented metal-oxide-metal (MOM) nanowires in two different ferroelectric oxide systems: Au-BaTiO{sub 3}-Au and Au-PbTiO{sub 3}-Au. This method entails electrodeposition of segmented nanowires of Au-TiO{sub 2}-Au inside anodic aluminum oxide (AAO) templates, followed by topotactic hydrothermal conversion of the TiO{sub 2} segments into BaTiO{sub 3} or PbTiO{sub 3} segments. Two-terminal devices from individual MOM nanowires are fabricated, and their ferroelectric properties are measured directly, without the aid of scanning probe microscopy (SPM) methods. The MOM nanowire architecture provides high-quality end-on electrical contacts to the oxide segments, and allows direct measurement of properties of nanoscale volume, strain-free oxide segments. Unusually high ferroelectric responses, for chemically synthesized oxides, in these MOM nanowires are reported, and are attributed to the lack of residual strain in the oxides. The ability to measure directly the active properties of nanoscale volume, strain-free oxides afforded by the MOM nanowire architecture has important implications for fundamental studies of not only ferroelectric nanostructures but also nanostructures in the emerging field of multiferroics.

  15. Breast tumor segmentation in high resolution x-ray phase contrast analyzer based computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Brun, E., E-mail: emmanuel.brun@esrf.fr [European Synchrotron Radiation Facility (ESRF), Grenoble 380000, France and Department of Physics, Ludwig-Maximilians University, Garching 85748 (Germany); Grandl, S.; Sztrókay-Gaul, A.; Gasilov, S. [Institute for Clinical Radiology, Ludwig-Maximilians-University Hospital Munich, 81377 Munich (Germany); Barbone, G. [Department of Physics, Harvard University, Cambridge, Massachusetts 02138 (United States); Mittone, A.; Coan, P. [Department of Physics, Ludwig-Maximilians University, Garching 85748, Germany and Institute for Clinical Radiology, Ludwig-Maximilians-University Hospital Munich, 81377 Munich (Germany); Bravin, A. [European Synchrotron Radiation Facility (ESRF), Grenoble 380000 (France)

    2014-11-01

Purpose: Phase contrast computed tomography has emerged as an imaging method which is able to outperform present day clinical mammography in breast tumor visualization while maintaining an equivalent average dose. To this day, no segmentation technique takes into account the specificity of the phase contrast signal. In this study, the authors propose a new mathematical framework for human-guided breast tumor segmentation. This method has been applied to high-resolution images of excised human organs, each of several gigabytes. Methods: The authors present a segmentation procedure based on the viscous watershed transform and demonstrate the efficacy of this method on analyzer based phase contrast images. The segmentation of tumors inside two full human breasts is then shown as an example of this procedure’s possible applications. Results: A correct and precise identification of the tumor boundaries was obtained and confirmed by manual contouring performed independently by four experienced radiologists. Conclusions: The authors demonstrate that applying the viscous watershed transform allows them to perform the segmentation of tumors in high-resolution x-ray analyzer based phase contrast breast computed tomography images. Combining the additional information provided by the segmentation procedure with the already high definition of morphological details and tissue boundaries offered by phase contrast imaging techniques will represent a valuable multistep procedure to be used in future medical diagnostic applications.

  16. Chemically synthesized metal-oxide-metal segmented nanowires with high ferroelectric response

    International Nuclear Information System (INIS)

    Herderick, Edward D; Padture, Nitin P; Polomoff, Nicholas A; Huey, Bryan D

    2010-01-01

A chemical synthesis method is presented for the fabrication of high-definition segmented metal-oxide-metal (MOM) nanowires in two different ferroelectric oxide systems: Au-BaTiO3-Au and Au-PbTiO3-Au. This method entails electrodeposition of segmented nanowires of Au-TiO2-Au inside anodic aluminum oxide (AAO) templates, followed by topotactic hydrothermal conversion of the TiO2 segments into BaTiO3 or PbTiO3 segments. Two-terminal devices from individual MOM nanowires are fabricated, and their ferroelectric properties are measured directly, without the aid of scanning probe microscopy (SPM) methods. The MOM nanowire architecture provides high-quality end-on electrical contacts to the oxide segments, and allows direct measurement of properties of nanoscale-volume, strain-free oxide segments. Unusually high ferroelectric responses, for chemically synthesized oxides, in these MOM nanowires are reported, and are attributed to the lack of residual strain in the oxides. The ability to measure directly the active properties of nanoscale-volume, strain-free oxides afforded by the MOM nanowire architecture has important implications for fundamental studies of not only ferroelectric nanostructures but also nanostructures in the emerging field of multiferroics.

  17. Breast tumor segmentation in high resolution x-ray phase contrast analyzer based computed tomography.

    Science.gov (United States)

    Brun, E; Grandl, S; Sztrókay-Gaul, A; Barbone, G; Mittone, A; Gasilov, S; Bravin, A; Coan, P

    2014-11-01

Phase contrast computed tomography has emerged as an imaging method which is able to outperform present day clinical mammography in breast tumor visualization while maintaining an equivalent average dose. To this day, no segmentation technique takes into account the specificity of the phase contrast signal. In this study, the authors propose a new mathematical framework for human-guided breast tumor segmentation. This method has been applied to high-resolution images of excised human organs, each of several gigabytes. The authors present a segmentation procedure based on the viscous watershed transform and demonstrate the efficacy of this method on analyzer based phase contrast images. The segmentation of tumors inside two full human breasts is then shown as an example of this procedure's possible applications. A correct and precise identification of the tumor boundaries was obtained and confirmed by manual contouring performed independently by four experienced radiologists. The authors demonstrate that applying the viscous watershed transform allows them to perform the segmentation of tumors in high-resolution x-ray analyzer based phase contrast breast computed tomography images. Combining the additional information provided by the segmentation procedure with the already high definition of morphological details and tissue boundaries offered by phase contrast imaging techniques will represent a valuable multistep procedure to be used in future medical diagnostic applications.

  18. Specifics of marketing strategy in the segment of high fashion

    OpenAIRE

    Butigan, Ružica; Grilec Kaurić, Alica; Ujević, Darko

    2013-01-01

The success of high fashion designers lies not only in the specificity of their products but also in a specific and very well executed marketing strategy. Emphasis is placed on the design of a very specific marketing program and marketing strategies that must consider all the characteristics of the high fashion market. Therefore, a scientific research problem is defined as follows: although the market of high fashion at first glance does not imply a completely different marketing approach than other fa...

  19. Brookhaven segment interconnect

    International Nuclear Information System (INIS)

    Morse, W.M.; Benenson, G.; Leipuner, L.B.

    1983-01-01

We have performed a high energy physics experiment using a multisegment Brookhaven FASTBUS system. The system was composed of three crate segments and two cable segments. We discuss the segment interconnect module, which permits communication between the various segments.

  20. Comparing Individual Tree Segmentation Based on High Resolution Multispectral Image and Lidar Data

    Science.gov (United States)

    Xiao, P.; Kelly, M.; Guo, Q.

    2014-12-01

This study compares the use of high-resolution multispectral WorldView images and high-density Lidar data for individual tree segmentation. The application focuses on coniferous and deciduous forests in the Sierra Nevada Mountains. Tree objects are obtained in two ways: a hybrid region-merging segmentation method applied to the multispectral images, and top-down and bottom-up region-growing methods applied to the Lidar data. The hybrid region-merging method is used to segment individual trees from the multispectral images. It integrates the advantages of global-oriented and local-oriented region-merging strategies into a unified framework. The globally most-similar pair of regions is used to determine the starting point of a growing region. The merging iterations are constrained within the local vicinity, so the segmentation is accelerated and can reflect the local context. The top-down region-growing method is adopted in coniferous forest to delineate individual trees from the Lidar data. It exploits the spacing between the tops of trees to identify and group points into a single tree based on simple rules of proximity and likely tree shape. The bottom-up region-growing method, based on the intensity and 3D structure of the Lidar data, is applied in deciduous forest. It segments tree trunks based on the intensity and topological relationships of the points, and then allocates the remaining points to the corresponding tree crowns according to distance. The accuracies of each method are evaluated with field survey data in several test sites covering dense and sparse canopy. Three types of segmentation results are produced: a true positive represents a correctly segmented individual tree, a false negative represents a tree that is not detected and is assigned to a nearby tree, and a false positive represents a point or pixel cluster segmented as a tree that does not in fact exist. They respectively represent correct, under-, and over-segmentation. Three types of index are compared for segmenting individual tree
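The three outcome counts (true positive, false negative, false positive) map directly onto standard detection scores; a small sketch, with illustrative counts rather than the study's data:

```python
def tree_segmentation_scores(tp, fn, fp):
    """Recall, precision and F-score from matched-tree counts:
    tp = correctly segmented trees, fn = missed trees (under-segmentation),
    fp = spurious detections (over-segmentation)."""
    recall = tp / (tp + fn)
    precision = tp / (tp + fp)
    f_score = 2 * precision * recall / (precision + recall)
    return recall, precision, f_score

# illustrative counts for one hypothetical test site
r, p, f = tree_segmentation_scores(tp=90, fn=5, fp=10)
```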

  1. Segmenting high-frequency intracardiac ultrasound images of myocardium into infarcted, ischemic, and normal regions.

    Science.gov (United States)

    Hao, X; Bruce, C J; Pislaru, C; Greenleaf, J F

    2001-12-01

    Segmenting abnormal from normal myocardium using high-frequency intracardiac echocardiography (ICE) images presents new challenges for image processing. Gray-level intensity and texture features of ICE images of myocardium with the same structural/perfusion properties differ. This significant limitation conflicts with the fundamental assumption on which existing segmentation techniques are based. This paper describes a new seeded region growing method to overcome the limitations of the existing segmentation techniques. Three criteria are used for region growing control: 1) Each pixel is merged into the globally closest region in the multifeature space. 2) "Geographic similarity" is introduced to overcome the problem that myocardial tissue, despite having the same property (i.e., perfusion status), may be segmented into several different regions using existing segmentation methods. 3) "Equal opportunity competence" criterion is employed making results independent of processing order. This novel segmentation method is applied to in vivo intracardiac ultrasound images using pathology as the reference method for the ground truth. The corresponding results demonstrate that this method is reliable and effective.
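A minimal sketch of seeded region growing in this spirit, assuming a single intensity feature and a global priority queue so that the result does not depend on scan order (a simplified stand-in for the paper's three criteria; the function and toy image are hypothetical):

```python
import heapq
import numpy as np

def seeded_region_growing(img, seeds):
    """Each unlabeled pixel, when reached, joins the neighbouring region
    whose mean intensity is closest; a global priority queue fixes the
    processing order, so results are independent of scan direction."""
    labels = np.zeros(img.shape, dtype=int)
    sums, counts, heap = {}, {}, []

    def push_neighbours(r, c, lab):
        mean = sums[lab] / counts[lab]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if (0 <= rr < img.shape[0] and 0 <= cc < img.shape[1]
                    and labels[rr, cc] == 0):
                heapq.heappush(heap, (abs(float(img[rr, cc]) - mean), rr, cc, lab))

    for lab, (r, c) in enumerate(seeds, start=1):
        labels[r, c] = lab
        sums[lab], counts[lab] = float(img[r, c]), 1
    for lab, (r, c) in enumerate(seeds, start=1):
        push_neighbours(r, c, lab)

    while heap:
        _, r, c, lab = heapq.heappop(heap)
        if labels[r, c]:
            continue                       # already claimed by a closer region
        labels[r, c] = lab
        sums[lab] += float(img[r, c])
        counts[lab] += 1
        push_neighbours(r, c, lab)
    return labels

img = np.array([[10., 10., 80., 80.],
                [10., 12., 78., 80.],
                [11., 10., 79., 81.]])
labels = seeded_region_growing(img, seeds=[(0, 0), (0, 3)])
```

With two seeds, the dark and bright halves of the toy image grow into two regions regardless of processing order.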

  2. Segmentation of High Angular Resolution Diffusion MRI using Sparse Riemannian Manifold Clustering

    Science.gov (United States)

    Wright, Margaret J.; Thompson, Paul M.; Vidal, René

    2015-01-01

    We address the problem of segmenting high angular resolution diffusion imaging (HARDI) data into multiple regions (or fiber tracts) with distinct diffusion properties. We use the orientation distribution function (ODF) to represent HARDI data and cast the problem as a clustering problem in the space of ODFs. Our approach integrates tools from sparse representation theory and Riemannian geometry into a graph theoretic segmentation framework. By exploiting the Riemannian properties of the space of ODFs, we learn a sparse representation for each ODF and infer the segmentation by applying spectral clustering to a similarity matrix built from these representations. In cases where regions with similar (resp. distinct) diffusion properties belong to different (resp. same) fiber tracts, we obtain the segmentation by incorporating spatial and user-specified pairwise relationships into the formulation. Experiments on synthetic data evaluate the sensitivity of our method to image noise and the presence of complex fiber configurations, and show its superior performance compared to alternative segmentation methods. Experiments on phantom and real data demonstrate the accuracy of the proposed method in segmenting simulated fibers, as well as white matter fiber tracts of clinical importance in the human brain. PMID:24108748
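The core of the approach, spectral clustering of a similarity matrix built from the ODF representations, reduces for two regions to thresholding the Fiedler vector; a toy sketch with a hand-built similarity matrix rather than real HARDI data:

```python
import numpy as np

def spectral_bipartition(W):
    """Two-way spectral clustering of a similarity matrix W: split by the
    sign of the Fiedler vector of the normalized graph Laplacian."""
    d = W.sum(axis=1)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    L = np.eye(len(W)) - d_inv_sqrt @ W @ d_inv_sqrt
    _, vecs = np.linalg.eigh(L)          # eigenvalues in ascending order
    fiedler = vecs[:, 1]                 # eigenvector of 2nd-smallest eigenvalue
    return (fiedler > 0).astype(int)

# toy similarity matrix: two groups of "ODFs", similar within, dissimilar across
W = np.array([[1.0, 0.9, 0.1, 0.1],
              [0.9, 1.0, 0.1, 0.1],
              [0.1, 0.1, 1.0, 0.9],
              [0.1, 0.1, 0.9, 1.0]])
labels = spectral_bipartition(W)
```

For more than two regions, k-means on the leading eigenvectors replaces the sign split.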

  3. Segmentation of low‐cost high efficiency oxide‐based thermoelectric materials

    DEFF Research Database (Denmark)

    Le, Thanh Hung; Van Nong, Ngo; Linderoth, Søren

    2015-01-01

Thermoelectric (TE) oxide materials have attracted great interest in advanced renewable energy research owing to the fact that they consist of abundant elements, can be manufactured by low-cost processing, sustain high temperatures, are robust and provide long lifetime. However, the low conversion efficiency of TE oxides has been a major drawback limiting broader application of these materials. In this work, theoretical calculations are used to predict how segmentation of oxide and semimetal materials, utilizing the benefits of both types of materials, can provide high-efficiency, high-temperature oxide-based segmented legs. The materials for segmentation are selected by their compatibility factors and their conversion efficiency versus material cost, i.e., the "efficiency ratio". Numerical modelling results showed that conversion efficiency could reach values of more than 10% for unicouples using
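Segment selection by compatibility factor can be illustrated with the standard formula s = (√(1 + zT) − 1)/(S·T); the material values below are illustrative placeholders, not the paper's measurements:

```python
import numpy as np

def compatibility_factor(zT, seebeck, T):
    """Thermoelectric compatibility factor s = (sqrt(1 + zT) - 1) / (S*T),
    in 1/V; segments are conventionally considered compatible when their
    s values differ by less than about a factor of two."""
    return (np.sqrt(1.0 + zT) - 1.0) / (seebeck * T)

# illustrative (not measured) values for two candidate segments
s_oxide = compatibility_factor(zT=0.3, seebeck=200e-6, T=900.0)         # hot side
s_half_heusler = compatibility_factor(zT=0.8, seebeck=220e-6, T=600.0)  # cold side
ratio = max(s_oxide, s_half_heusler) / min(s_oxide, s_half_heusler)
compatible = ratio < 2.0
```

With these toy numbers the pair fails the factor-of-two rule, illustrating why compatibility screening precedes the efficiency-versus-cost ranking.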

  4. Development of a segmented gamma ray scanning system

    International Nuclear Information System (INIS)

    Zhu Rongbao; Tan Yajun; Yuan Xiaoxin

    1994-01-01

A segmented gamma ray scanning system is developed for non-destructive assay of the uranium, plutonium or fission product content of packed low- or medium-density nuclear residues, scraps or wastes. A near-field three-dimensional model for computing the attenuation correction factor CF(AT) is used for the cylindrical sample and container. The system consists of a transmission source wheel, a rotatable scanning plate, a beam shutter, an annular shielding body, stepping motors and their control system, an HPGe detector, nuclear electronics and a computer. Full-scale scanning of samples, spectrum accumulation and data reduction can be performed automatically according to preset standard procedures. The radioisotopes 169 Yb and 75 Se are used as the transmission sources for assaying 235 U and, potentially, 239 Pu respectively. Calibration experiments using 1 liter solution samples of 192 Ir and 235 U were performed. The standard deviations obtained for the 192 Ir γ rays of 295 keV, 308 keV and 316 keV are ± 0.41%, ± 0.29% and ± 0.42% respectively. The standard deviation for the 235 U 185 keV γ ray is ± 0.62%.
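In segmented gamma scanning, each segment's net peak counts are typically scaled by an attenuation correction factor derived from the measured transmission; the sketch below uses the classical far-field form CF = ln(T)/(T − 1) rather than the paper's near-field three-dimensional model:

```python
import math

def attenuation_cf(transmission):
    """Far-field attenuation correction factor for one scan segment:
    CF = ln(T) / (T - 1), where T is the measured transmission of the
    external source through the segment (CF -> 1 as T -> 1)."""
    if transmission >= 1.0:
        return 1.0
    return math.log(transmission) / (transmission - 1.0)

def corrected_activity(net_counts, efficiency, transmission):
    """Segment activity proportional to net peak counts, corrected for
    self-attenuation and detection efficiency."""
    return net_counts * attenuation_cf(transmission) / efficiency

cf = attenuation_cf(0.5)   # = 2*ln(2) ~ 1.386 for 50% transmission
```

Summing the corrected segment activities over the full scan gives the total assay.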

  5. MULTI-SCALE SEGMENTATION OF HIGH RESOLUTION REMOTE SENSING IMAGES BY INTEGRATING MULTIPLE FEATURES

    Directory of Open Access Journals (Sweden)

    Y. Di

    2017-05-01

Full Text Available Most multi-scale segmentation algorithms are not aimed at high resolution remote sensing images and have difficulty communicating and using information across layers. In view of this, we propose a method for multi-scale segmentation of high resolution remote sensing images that integrates multiple features. First, the Canny operator is used to extract edge information, and then a band-weighted distance function is built to obtain the edge weights. According to this criterion, the initial segmentation objects of the color image are obtained by the Kruskal minimum spanning tree algorithm. Finally, segmentation images are obtained by an adaptive Mumford–Shah region-merging rule combined with spectral and texture information. The proposed method is evaluated precisely using analog images and ZY-3 satellite images through quantitative and qualitative analysis. The experimental results show that multi-scale segmentation of high resolution remote sensing images by integrating multiple features outperforms the fractal net evolution algorithm (FNEA) of the eCognition software in accuracy and is slightly inferior to FNEA in efficiency.
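The Kruskal minimum-spanning-tree step can be sketched as a union-find over pixel-graph edges sorted by weight; here the merging criterion is the Felzenszwalb–Huttenlocher rule, used as a simplified stand-in for the paper's band-weighted edge criterion:

```python
import numpy as np

def mst_segmentation(img, k=10.0):
    """Kruskal-style merging on a 4-connected pixel grid: edges sorted by
    intensity difference; two components merge when the edge weight is
    below min(internal difference + k/size) of the two components."""
    h, w = img.shape
    parent = list(range(h * w))
    size = [1] * (h * w)
    internal = [0.0] * (h * w)   # largest edge inside each component

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    edges = []
    for r in range(h):
        for c in range(w):
            if c + 1 < w:
                edges.append((abs(float(img[r, c]) - float(img[r, c + 1])),
                              r * w + c, r * w + c + 1))
            if r + 1 < h:
                edges.append((abs(float(img[r, c]) - float(img[r + 1, c])),
                              r * w + c, (r + 1) * w + c))

    for wgt, a, b in sorted(edges):
        ra, rb = find(a), find(b)
        if ra == rb:
            continue
        if wgt <= min(internal[ra] + k / size[ra], internal[rb] + k / size[rb]):
            parent[ra] = rb
            size[rb] += size[ra]
            internal[rb] = max(internal[ra], internal[rb], wgt)

    return np.array([find(i) for i in range(h * w)]).reshape(h, w)

img = np.array([[0, 0, 90, 90],
                [0, 1, 91, 90],
                [1, 0, 90, 92]], dtype=float)
seg = mst_segmentation(img, k=10.0)
```

On the toy image the low-contrast left half and the bright right half end up as two components.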

  6. Profiling the high frequency wine consumer by price segmentation in the US market

    Directory of Open Access Journals (Sweden)

    Liz Thach

    2015-06-01

    Full Text Available Heavy users of consumer products are important to marketers as a profitable target segment. This is equally true in the wine industry, but with the added precaution of encouraging responsible consumption. This study examines the attributes and behaviors of 681 high frequency (heavy-user wine consumers in the US, based on a price segmentation of High, Moderate, and Low Spenders. For this study, price segmentation was defined as the price typically paid for a bottle of wine for home consumption. Significant differences were discovered based on gender, age, income, wine involvement, shopping channel, ecommerce/social media usage and other key areas. Implications for marketing managers as well as areas of future research are described.

  7. Automatic and manual segmentation of healthy retinas using high-definition optical coherence tomography.

    Science.gov (United States)

    Golbaz, Isabelle; Ahlers, Christian; Goesseringer, Nina; Stock, Geraldine; Geitzenauer, Wolfgang; Prünte, Christian; Schmidt-Erfurth, Ursula Margarethe

    2011-03-01

This study compared automatic and manual segmentation modalities in the retina of healthy eyes using high-definition optical coherence tomography (HD-OCT). Twenty retinas in 20 healthy individuals were examined using an HD-OCT system (Carl Zeiss Meditec, Inc.). Three-dimensional imaging was performed with an axial resolution of 6 μm at a maximum scanning speed of 25,000 A-scans/second. Volumes of 6 × 6 × 2 mm were scanned. Scans were analysed using a MATLAB-based algorithm and a manual segmentation software system (3D-Doctor). The volume values calculated by the two methods were compared. Statistical analysis revealed a high correlation between automatic and manual modes of segmentation. The automatic mode of measuring retinal volume and the corresponding three-dimensional images provided similar results to the manual segmentation procedure. Both methods were able to visualize retinal and subretinal features accurately. This study compared two methods of assessing retinal volume using HD-OCT scans in healthy retinas. Both methods were able to provide realistic volumetric data when applied to raster scan sets. Manual segmentation methods represent an adequate tool with which to control automated processes and to identify clinically relevant structures, whereas automatic procedures will be needed to obtain data in larger patient populations. © 2009 The Authors. Journal compilation © 2009 Acta Ophthalmol.
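Once the retina is segmented, whether manually or automatically, the reported volume reduces to counting labeled voxels and scaling by the voxel size; a sketch using the 6 × 6 × 2 mm scan volume from the abstract and a hypothetical 10-voxel-per-axis grid:

```python
import numpy as np

# binary segmentation mask (True = retinal tissue) on a toy 3-D grid
mask = np.zeros((10, 10, 10), dtype=bool)
mask[:, :, 3:7] = True               # a 4-voxel-thick "retinal" slab

# voxel size for a 6 x 6 x 2 mm scan volume sampled at 10 voxels per axis
voxel_mm3 = (6.0 / 10) * (6.0 / 10) * (2.0 / 10)
volume_mm3 = mask.sum() * voxel_mm3  # 400 voxels * 0.072 mm^3 each
```

Comparing the automatic and manual modalities then amounts to correlating the two volume estimates per eye.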

  8. Hierarchical graph-based segmentation for extracting road networks from high-resolution satellite images

    Science.gov (United States)

    Alshehhi, Rasha; Marpu, Prashanth Reddy

    2017-04-01

    Extraction of road networks in urban areas from remotely sensed imagery plays an important role in many urban applications (e.g. road navigation, geometric correction of urban remote sensing images, updating geographic information systems, etc.). It is normally difficult to accurately differentiate road from its background due to the complex geometry of the buildings and the acquisition geometry of the sensor. In this paper, we present a new method for extracting roads from high-resolution imagery based on hierarchical graph-based image segmentation. The proposed method consists of: 1. Extracting features (e.g., using Gabor and morphological filtering) to enhance the contrast between road and non-road pixels, 2. Graph-based segmentation consisting of (i) Constructing a graph representation of the image based on initial segmentation and (ii) Hierarchical merging and splitting of image segments based on color and shape features, and 3. Post-processing to remove irregularities in the extracted road segments. Experiments are conducted on three challenging datasets of high-resolution images to demonstrate the proposed method and compare with other similar approaches. The results demonstrate the validity and superior performance of the proposed method for road extraction in urban areas.
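Step 1's Gabor filtering relies on oriented kernels tuned to elongated, road-like structures; a minimal kernel constructor (the parameter values are illustrative, not the paper's settings):

```python
import numpy as np

def gabor_kernel(size, wavelength, theta, sigma, gamma=0.5):
    """Real (even-symmetric) Gabor kernel: a cosine carrier along the
    rotated x-axis under an elongated Gaussian envelope, as used to
    enhance linear structures at orientation theta."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)     # rotated coordinates
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + (gamma * yr)**2) / (2.0 * sigma**2))
    return envelope * np.cos(2.0 * np.pi * xr / wavelength)

k = gabor_kernel(size=9, wavelength=4.0, theta=0.0, sigma=2.0)
```

Convolving the image with a bank of such kernels over several orientations yields the road-enhancing feature maps fed into the graph construction.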

  9. Conditioning the gamma spectrometer for activity measurement at very high background

    OpenAIRE

    Yan, Weihua; Zhang, Liguo; Zhang, Zhao; Xiao, Zhigang

    2013-01-01

The application of a high purity germanium (HPGe) gamma spectrometer to determining the fuel element burnup in a future reactor is studied. The HPGe detector is exposed to a 60Co source with the irradiation rate varying from 10 kcps to 150 kcps to simulate the input counting rate in a real reactor environment. A 137Cs and a 152Eu source are positioned at given distances to generate a certain event rate in the detector, with the former proposed as a labeling nuclide to measure the burnup of fuel ...

  10. Action potential generation requires a high sodium channel density in the axon initial segment

    NARCIS (Netherlands)

    Kole, Maarten H. P.; Ilschner, Susanne U.; Kampa, Björn M.; Williams, Stephen R.; Ruben, Peter C.; Stuart, Greg J.

    2008-01-01

The axon initial segment (AIS) is a specialized region in neurons where action potentials are initiated. It is commonly assumed that this process requires a high density of voltage-gated sodium (Na(+)) channels. Paradoxically, the results of patch-clamp studies suggest that the Na(+) channel

  11. General Staining and Segmentation Procedures for High Content Imaging and Analysis.

    Science.gov (United States)

    Chambers, Kevin M; Mandavilli, Bhaskar S; Dolman, Nick J; Janes, Michael S

    2018-01-01

    Automated quantitative fluorescence microscopy, also known as high content imaging (HCI), is a rapidly growing analytical approach in cell biology. Because automated image analysis relies heavily on robust demarcation of cells and subcellular regions, reliable methods for labeling cells is a critical component of the HCI workflow. Labeling of cells for image segmentation is typically performed with fluorescent probes that bind DNA for nuclear-based cell demarcation or with those which react with proteins for image analysis based on whole cell staining. These reagents, along with instrument and software settings, play an important role in the successful segmentation of cells in a population for automated and quantitative image analysis. In this chapter, we describe standard procedures for labeling and image segmentation in both live and fixed cell samples. The chapter will also provide troubleshooting guidelines for some of the common problems associated with these aspects of HCI.
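A minimal version of nuclear-based image segmentation, thresholding the DNA-stain channel and then labelling connected components, can be sketched with SciPy (the mean-plus-one-SD threshold heuristic and the toy image are assumptions for illustration, not a recommended protocol):

```python
import numpy as np
from scipy import ndimage

def segment_nuclei(img, threshold=None):
    """Minimal nuclear demarcation for HCI: global threshold on the
    DNA-stain channel (mean + 1 SD when none is given), followed by
    connected-component labelling; returns the label image and count."""
    if threshold is None:
        threshold = img.mean() + img.std()
    labels, n = ndimage.label(img > threshold)
    return labels, n

# synthetic field with two bright "nuclei" on a dark background
img = np.zeros((20, 20))
img[3:7, 3:7] = 100.0
img[12:16, 12:16] = 100.0
labels, n = segment_nuclei(img)
```

In a real HCI workflow the label image then seeds per-cell measurements; whole-cell stains extend each nuclear label outward to the cell boundary.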

  12. Principal component analysis for neural electron/jet discrimination in highly segmented calorimeters

    International Nuclear Information System (INIS)

    Vassali, M.R.; Seixas, J.M.

    2001-01-01

A neural electron/jet discriminator based on calorimetry is developed for the second-level trigger system of the ATLAS detector. As preprocessing of the calorimeter information, a principal component analysis is performed on each segment of the two sections (electromagnetic and hadronic) of the calorimeter system, in order to significantly reduce the dimension of the input data space and fully explore the detailed energy deposition profile provided by the highly-segmented calorimeter system. It is shown that, projecting calorimeter data onto 33 segmented principal components, the discrimination efficiency of the neural classifier reaches 98.9% for electrons (with only 1% false alarm probability). Furthermore, restricting the data projection onto only 9 components, an electron efficiency of 99.1% is achieved (with 3% false alarm), which confirms that a fast triggering system may be designed using few components.
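The preprocessing step, projecting each segment's data onto its leading principal components, can be sketched as an eigendecomposition of the covariance matrix (the data below are toy values, not calorimeter signals):

```python
import numpy as np

def pca_project(X, n_components):
    """Project rows of X onto the leading principal components, i.e. the
    top eigenvectors of the covariance matrix of the centred data."""
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / (len(X) - 1)
    vals, vecs = np.linalg.eigh(cov)
    order = np.argsort(vals)[::-1]           # eigh returns ascending order
    return Xc @ vecs[:, order[:n_components]]

# toy "calorimeter segment" data: 100 events x 8 cells on a 2-D subspace
rng = np.random.default_rng(0)
latent = rng.normal(size=(100, 2))
X = latent @ rng.normal(size=(2, 8))
Z = pca_project(X, n_components=2)
```

Because the toy data span only a 2-D subspace, two components retain all the variance, mirroring how few components suffice for the trigger.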

  13. High-resolution γ-ray spectroscopy: a versatile tool for nuclear β-decay studies at TRIUMF-ISAC

    Science.gov (United States)

    Ball, G. C.; Achtzehn, T.; Albers, D.; Khalili, J. S. Al; Andreoiu, C.; Andreyev, A.; Ashley, S. F.; Austin, R. A. E.; Becker, J. A.; Bricault, P.; Chan, S.; Chakrawarthy, R. S.; Churchman, R.; Coombes, H.; Cunningham, E. S.; Daoud, J.; Dombsky, M.; Drake, T. E.; Eshpeter, B.; Finlay, P.; Garrett, P. E.; Geppert, C.; Grinyer, G. F.; Hackman, G.; Hanemaayer, V.; Hyland, B.; Jones, G. A.; Koopmans, K. A.; Kulp, W. D.; Lassen, J.; Lavoie, J. P.; Leslie, J. R.; Litvinov, Y.; Macdonald, J. A.; Mattoon, C.; Melconian, D.; Morton, A. C.; Osborne, C. J.; Pearson, C. J.; Pearson, M.; Phillips, A. A.; Ressler, J. J.; Sarazin, F.; Schumaker, M. A.; Schwarzenberg, J.; Scraggs, H. C.; Smith, M. B.; Svensson, C. E.; Valiente-Dobon, J. J.; Waddington, J. C.; Walker, P. M.; Wendt, K.; Williams, S. J.; Wood, J. L.; Zganjar, E. F.

    2005-10-01

    High-resolution γ-ray spectroscopy is essential to fully exploit the unique, high-quality beams available at the next generation of radioactive ion beam facilities such as the TRIUMF isotope separator and accelerator (ISAC). The 8π spectrometer, which consists of 20 Compton-suppressed HPGe detectors, has recently been reconfigured for a vigorous research programme in weak interaction and nuclear structure physics. With the addition of a variety of ancillary detectors it has become the world's most powerful device dedicated to β-decay studies. This paper provides a brief overview of the apparatus and highlights from recent experiments.

  14. Automated ventricular systems segmentation in brain CT images by combining low-level segmentation and high-level template matching

    Directory of Open Access Journals (Sweden)

    Ward Kevin R

    2009-11-01

Full Text Available Abstract Background Accurate analysis of CT brain scans is vital for diagnosis and treatment of Traumatic Brain Injuries (TBI). Automatic processing of these CT brain scans could speed up the decision making process, lower the cost of healthcare, and reduce the chance of human error. In this paper, we focus on automatic processing of CT brain images to segment and identify the ventricular systems. The segmentation of ventricles provides quantitative measures on the changes of ventricles in the brain that form vital diagnosis information. Methods First all CT slices are aligned by detecting the ideal midlines in all images. The initial estimation of the ideal midline of the brain is found based on skull symmetry and then the initial estimate is further refined using detected anatomical features. Then a two-step method is used for ventricle segmentation. First a low-level segmentation on each pixel is applied on the CT images. For this step, both Iterated Conditional Mode (ICM) and Maximum A Posteriori Spatial Probability (MASP) are evaluated and compared. The second step applies a template matching algorithm to identify objects in the initial low-level segmentation as ventricles. Experiments for ventricle segmentation are conducted using a relatively large CT dataset containing mild and severe TBI cases. Results Experiments show that the acceptable rate of the ideal midline detection is over 95%. Two measurements are defined to evaluate ventricle recognition results. The first measure is a sensitivity-like measure and the second is a false positive-like measure. For the first measurement, the rate is 100% indicating that all ventricles are identified in all slices. The false positive-like measurement is 8.59%. We also point out the similarities and differences between the ICM and MASP algorithms through both mathematical relationships and segmentation results on CT images. Conclusion The experiments show the reliability of the proposed algorithms.
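The record's second step, identifying low-level segments as ventricles by template matching, can be sketched with a simple overlap score. The masks, threshold, and Dice criterion below are illustrative assumptions, not the paper's actual matching algorithm:

```python
def dice(a, b):
    """Dice overlap between two equal-size binary masks (flattened lists)."""
    inter = sum(x & y for x, y in zip(a, b))
    return 2 * inter / (sum(a) + sum(b))

def match_ventricles(candidates, template, threshold=0.6):
    """Keep the low-level segments that overlap the ventricle template well."""
    return [i for i, mask in enumerate(candidates) if dice(mask, template) >= threshold]

# toy 3x3 masks flattened to lists
template = [0, 1, 1, 1, 0, 0, 1, 1, 0]
candidates = [
    [0, 1, 1, 0, 0, 0, 1, 1, 0],   # overlaps the template well
    [1, 0, 0, 0, 1, 1, 0, 0, 1],   # background blob, no overlap
]
print(match_ventricles(candidates, template))  # [0]
```

In practice a real pipeline would align the template to each slice first; the overlap test alone only illustrates the matching criterion.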

  15. ROLE OF HIGH RESOLUTION ULTRASONOGRAPHY IN THE EVALUATION OF POSTERIOR SEGMENT LESIONS OF THE EYE

    Directory of Open Access Journals (Sweden)

    Rashmi Nagaraju

    2015-01-01

Full Text Available BACKGROUND: The superficial location of the eye, its cystic composition, and the advent of high-frequency ultrasound make sonography ideal for imaging the eye. Ultrasonography is a simple, readily available, non-invasive, non-ionizing, highly accurate, real-time and cost-effective modality. OBJECTIVES: 1) To evaluate the accuracy of high resolution B-mode ultrasonography in the diagnosis of posterior segment lesions of the eye as compared to ophthalmoscopic examination, particularly in cases of opaque conducting media. 2) To evaluate sonographic appearances of various posterior segment lesions of the eye. MATERIALS AND METHODS: 1) A prospective study was carried out on 62 cases with suspected posterior segment lesions of the eye. All patients clinically suspected to have posterior segment lesions in the presence of opaque conducting media were included in the study. Cases suspected to have isolated anterior segmental and extraocular lesions were excluded. 2) HRUS was performed with a Philips IU22 using a high frequency probe (5 to 17 MHz) utilizing the contact method. 3) Sonological diagnosis was made based on sonographic features such as location, morphology, echo pattern, color Doppler characteristics, kinetics of the lesion with eye movements and acoustic characteristics of the lesion. 4) Subsequent clinical and lab investigations, surgical and histopathological examinations were carried out as applicable and a final diagnosis was made, which was correlated with the sonological diagnosis. Sonological diagnosis was also compared with ophthalmoscopic diagnosis. STATISTICAL ANALYSES: The validities and diagnostic accuracies of high resolution ultrasound and ophthalmoscopic examinations were calculated and compared. RESULTS AND CONCLUSIONS: 1) Ultrasound was the initial imaging modality opted for in most of the cases as it was readily available, simple and cost effective. 
It establishes the diagnosis in a significant number of cases, superseding the accuracy

  16. CellSegm - a MATLAB toolbox for high-throughput 3D cell segmentation

    Science.gov (United States)

    2013-01-01

The application of fluorescence microscopy in cell biology often generates a huge amount of imaging data. Automated whole cell segmentation of such data enables the detection and analysis of individual cells, where a manual delineation is often time consuming, or practically not feasible. Furthermore, compared to manual analysis, automation normally has a higher degree of reproducibility. CellSegm, the software presented in this work, is a Matlab based command line software toolbox providing an automated whole cell segmentation of images showing surface stained cells, acquired by fluorescence microscopy. It has options for both fully automated and semi-automated cell segmentation. Major algorithmic steps are: (i) smoothing, (ii) Hessian-based ridge enhancement, (iii) marker-controlled watershed segmentation, and (iv) feature-based classification of cell candidates. Using a wide selection of image recordings and code snippets, we demonstrate that CellSegm has the ability to detect various types of surface stained cells in 3D. After detection and outlining of individual cells, the cell candidates can be subject to software based analysis, specified and programmed by the end-user, or they can be analyzed by other software tools. A segmentation of tissue samples with appropriate characteristics is also shown to be resolvable in CellSegm. The command-line interface of CellSegm facilitates scripting of the separate tools, all implemented in Matlab, offering a high degree of flexibility and tailored workflows for the end-user. The modularity and scripting capabilities of CellSegm enable automated workflows and quantitative analysis of microscopic data, suited for high-throughput image based screening. PMID:23938087

  17. Retinal blood vessel segmentation in high resolution fundus photographs using automated feature parameter estimation

    Science.gov (United States)

    Orlando, José Ignacio; Fracchia, Marcos; del Río, Valeria; del Fresno, Mariana

    2017-11-01

    Several ophthalmological and systemic diseases are manifested through pathological changes in the properties and the distribution of the retinal blood vessels. The characterization of such alterations requires the segmentation of the vasculature, which is a tedious and time-consuming task that is infeasible to be performed manually. Numerous attempts have been made to propose automated methods for segmenting the retinal vasculature from fundus photographs, although their application in real clinical scenarios is usually limited by their ability to deal with images taken at different resolutions. This is likely due to the large number of parameters that have to be properly calibrated according to each image scale. In this paper we propose to apply a novel strategy for automated feature parameter estimation, combined with a vessel segmentation method based on fully connected conditional random fields. The estimation model is learned by linear regression from structural properties of the images and known optimal configurations, that were previously obtained for low resolution data sets. Our experiments in high resolution images show that this approach is able to estimate appropriate configurations that are suitable for performing the segmentation task without requiring to re-engineer parameters. Furthermore, our combined approach reported state of the art performance on the benchmark data set HRF, as measured in terms of the F1-score and the Matthews correlation coefficient.

  18. Pig epidermal growth factor precursor contains segments that are highly conserved among species

    DEFF Research Database (Denmark)

    Jørgensen, P E; Jensen, L.G.; Sørensen, B S

    1998-01-01

    segment with that of the human, the rat and the mouse EGF precursors, in order to identify highly conserved domains. The examined part of the precursor contains EGF itself and six so-called EGF-like modules. The overall amino acid identity among the four species is 64%. However, the amino acid identity...... differed from around 30% in some segments to around 70% in others. The highest amino acid identity, 71%, was observed for a 345-aa segment that contains three EGF-like modules and which is homologous to a part of the low-density lipoprotein receptor (LDL receptor). The amino acid identities are 64% for EGF...... itself, and 50-67% for the remaining three EGF-like modules. The segment of the LDL receptor that is homologous to a part of the EGF precursor is important for the function of the LDL receptor, and EGF-like modules seem to be involved in protein-protein interactions in a number of proteins. In conclusion...

  19. High-Resolution Gamma-Ray Imaging Measurements Using Externally Segmented Germanium Detectors

    Science.gov (United States)

    Callas, J.; Mahoney, W.; Skelton, R.; Varnell, L.; Wheaton, W.

    1994-01-01

    Fully two-dimensional gamma-ray imaging with simultaneous high-resolution spectroscopy has been demonstrated using an externally segmented germanium sensor. The system employs a single high-purity coaxial detector with its outer electrode segmented into 5 distinct charge collection regions and a lead coded aperture with a uniformly redundant array (URA) pattern. A series of one-dimensional responses was collected around 511 keV while the system was rotated in steps through 180 degrees. A non-negative, linear least-squares algorithm was then employed to reconstruct a 2-dimensional image. Corrections for multiple scattering in the detector, and the finite distance of source and detector are made in the reconstruction process.

  20. Highly polarized light emission by isotropic quantum dots integrated with magnetically aligned segmented nanowires

    International Nuclear Information System (INIS)

    Uran, Can; Erdem, Talha; Guzelturk, Burak; Perkgöz, Nihan Kosku; Jun, Shinae; Jang, Eunjoo; Demir, Hilmi Volkan

    2014-01-01

In this work, we demonstrate a proof-of-concept system for generating highly polarized light from colloidal quantum dots (QDs) coupled with magnetically aligned segmented Au/Ni/Au nanowires (NWs). Optical characterizations reveal that the optimized QD-NW coupled structures emit highly polarized light with an s- to p-polarization (s/p) contrast as high as 15:1, corresponding to a degree of polarization of 0.88. These experimental results are supported by the finite-difference time-domain simulations, which demonstrate the interplay between the inter-NW distance and the degree of polarization.
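The reported degree of polarization follows directly from the s/p contrast. A one-line check, using only the numbers quoted in the abstract:

```python
def degree_of_polarization(i_s, i_p):
    """DoP = (Is - Ip) / (Is + Ip) for s- and p-polarized intensities."""
    return (i_s - i_p) / (i_s + i_p)

# the reported 15:1 s/p contrast gives the quoted DoP of ~0.88
print(degree_of_polarization(15, 1))  # 0.875
```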

  1. Event-Based Color Segmentation With a High Dynamic Range Sensor

    Directory of Open Access Journals (Sweden)

    Alexandre Marcireau

    2018-04-01

    Full Text Available This paper introduces a color asynchronous neuromorphic event-based camera and a methodology to process color output from the device to perform color segmentation and tracking at the native temporal resolution of the sensor (down to one microsecond. Our color vision sensor prototype is a combination of three Asynchronous Time-based Image Sensors, sensitive to absolute color information. We devise a color processing algorithm leveraging this information. It is designed to be computationally cheap, thus showing how low level processing benefits from asynchronous acquisition and high temporal resolution data. The resulting color segmentation and tracking performance is assessed both with an indoor controlled scene and two outdoor uncontrolled scenes. The tracking's mean error to the ground truth for the objects of the outdoor scenes ranges from two to twenty pixels.

  2. MARKET SEGMENTATION: IDENTIFYING THE HIGH-GROWTH EXPORT MARKETS FOR U.S. AGRICULTURE

    OpenAIRE

    Reed, Michael R.; Salvacruz, Joseph C.

    1994-01-01

    A cluster analysis based on a five-year growth rate of agricultural imports from the United States was conducted on 86 countries and revealed two significant market segments for U.S. agriculture: the high-growth markets and the low-growth markets. Multiple discriminant analysis was then used to test the significance of the countries' trade-related and macroeconomic variables to their market growth classification. The discriminant function was used to predict the high-growth markets for U.S. a...

  3. In-beam measurement of the position resolution of a highly segmented coaxial germanium detector

    International Nuclear Information System (INIS)

    Descovich, M.; Lee, I.Y.; Fallon, P.; Cromaz, M.; Macchiavelli, A.O.; Radford, D.C.; Vetter, K.; Clark, R.M.; Deleplanque, M.A.; Stephens, F.S.; Ward, D.

    2005-01-01

The position resolution of a highly segmented coaxial germanium detector was determined by analyzing the 2055 keV γ-ray transition of 90Zr excited in a fusion-evaporation reaction. The high velocity of the 90Zr nuclei imparted large Doppler shifts. Digital analysis of the detector signals recovered the energy and position of individual γ-ray interactions. The location of the first interaction in the crystal was used to correct the Doppler energy shift. Comparison of the measured energy resolution with simulations implied a position resolution (root mean square) of 2 mm in three dimensions.

  4. Rule-based land cover classification from very high-resolution satellite image with multiresolution segmentation

    Science.gov (United States)

    Haque, Md. Enamul; Al-Ramadan, Baqer; Johnson, Brian A.

    2016-07-01

    Multiresolution segmentation and rule-based classification techniques are used to classify objects from very high-resolution satellite images of urban areas. Custom rules are developed using different spectral, geometric, and textural features with five scale parameters, which exploit varying classification accuracy. Principal component analysis is used to select the most important features out of a total of 207 different features. In particular, seven different object types are considered for classification. The overall classification accuracy achieved for the rule-based method is 95.55% and 98.95% for seven and five classes, respectively. Other classifiers that are not using rules perform at 84.17% and 97.3% accuracy for seven and five classes, respectively. The results exploit coarse segmentation for higher scale parameter and fine segmentation for lower scale parameter. The major contribution of this research is the development of rule sets and the identification of major features for satellite image classification where the rule sets are transferable and the parameters are tunable for different types of imagery. Additionally, the individual objectwise classification and principal component analysis help to identify the required object from an arbitrary number of objects within images given ground truth data for the training.
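A rule set of the kind described, thresholding per-object spectral, geometric, and textural features, can be sketched as follows. The feature names, class list, and thresholds are hypothetical placeholders, not the rules developed in the study:

```python
def classify_object(obj):
    """Toy rule-based classifier over per-segment features.

    All thresholds are illustrative; a real rule set would be tuned
    per scale parameter and imagery type, as the study describes.
    """
    if obj["ndvi"] > 0.3:
        return "vegetation"
    if obj["brightness"] < 60:
        return "shadow"
    if obj["rectangular_fit"] > 0.8 and obj["area"] > 100:
        return "building"
    if obj["length_width_ratio"] > 4:
        return "road"
    return "bare_soil"

segments = [
    {"ndvi": 0.55, "brightness": 120, "rectangular_fit": 0.4,
     "area": 80, "length_width_ratio": 1.2},
    {"ndvi": 0.05, "brightness": 210, "rectangular_fit": 0.9,
     "area": 450, "length_width_ratio": 1.5},
]
print([classify_object(s) for s in segments])  # ['vegetation', 'building']
```

The appeal of such rule sets, as the record notes, is that they are transferable: the structure stays fixed while thresholds are re-tuned for new imagery.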

  5. Image processing pipeline for segmentation and material classification based on multispectral high dynamic range polarimetric images.

    Science.gov (United States)

    Martínez-Domingo, Miguel Ángel; Valero, Eva M; Hernández-Andrés, Javier; Tominaga, Shoji; Horiuchi, Takahiko; Hirai, Keita

    2017-11-27

We propose a method for the capture of high dynamic range (HDR), multispectral (MS), polarimetric (Pol) images of indoor scenes using a liquid crystal tunable filter (LCTF). We have included the adaptive exposure estimation (AEE) method to fully automatize the capturing process. We also propose a pre-processing method which can be applied for the registration of HDR images after they are already built as the result of combining different low dynamic range (LDR) images. This method is applied to ensure a correct alignment of the different polarization HDR images for each spectral band. We have focused our efforts on two main applications: object segmentation and classification into metal and dielectric classes. We have simplified the segmentation using mean shift combined with cluster averaging and region merging techniques. We compare the performance of our segmentation with that of the Ncut and Watershed methods. For the classification task, we propose to use information not only in the highlight regions but also in their surrounding area, extracted from the degree of linear polarization (DoLP) maps. We present experimental results which prove that the proposed image processing pipeline outperforms previous techniques developed specifically for MSHDRPol image cubes.

  6. Multi-surface segmentation of OCT images with AMD using sparse high order potentials.

    Science.gov (United States)

    Oliveira, Jorge; Pereira, Sérgio; Gonçalves, Luís; Ferreira, Manuel; Silva, Carlos A

    2017-01-01

In age-related macular degeneration (AMD), the quantification of drusen is important because it is correlated with the evolution of the disease to an advanced stage. Therefore, we propose an algorithm based on a multi-surface framework for the segmentation of the limiting boundaries of drusen: the inner boundary of the retinal pigment epithelium + drusen complex (IRPEDC) and the Bruch's membrane (BM). Several segmentation methods have been considerably successful in segmenting retinal layers of healthy retinas in optical coherence tomography (OCT) images. These methods are successful because they incorporate prior information and regularization. Nonetheless, these factors tend to hinder the segmentation for diseased retinas. The proposed algorithm takes into account the presence of drusen and geographic atrophy (GA) related to AMD by excluding prior information and regularization valid only for healthy regions. However, even with this algorithm, prior information and regularization still cause the oversmoothing of drusen in some locations. Thus, we propose the integration of a local shape prior in the form of sparse high order potentials (SHOPs) into the algorithm to reduce the oversmoothing of drusen. The proposed algorithm was evaluated on a public database. The mean unsigned errors, relative to the average of two experts, for the inner limiting membrane (ILM), IRPEDC and BM were 2.94±2.69, 5.53±5.66 and 4.00±4.00 µm, respectively. Drusen area measurements were evaluated, relative to the average of two expert graders, by the mean absolute area difference and overlap ratio, which were 1579.7±2106.8 µm² and 0.78±0.11, respectively.

  7. Study on a digital pulse processing algorithm based on template-matching for high-throughput spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Wen, Xianfei; Yang, Haori

    2015-06-01

    A major challenge in utilizing spectroscopy techniques for nuclear safeguards is to perform high-resolution measurements at an ultra-high throughput rate. Traditionally, piled-up pulses are rejected to ensure good energy resolution. To improve throughput rate, high-pass filters are normally implemented to shorten pulses. However, this reduces signal-to-noise ratio and causes degradation in energy resolution. In this work, a pulse pile-up recovery algorithm based on template-matching was proved to be an effective approach to achieve high-throughput gamma ray spectroscopy. First, a discussion of the algorithm was given in detail. Second, the algorithm was then successfully utilized to process simulated piled-up pulses from a scintillator detector. Third, the algorithm was implemented to analyze high rate data from a NaI detector, a silicon drift detector and a HPGe detector. The promising results demonstrated the capability of this algorithm to achieve high-throughput rate without significant sacrifice in energy resolution. The performance of the template-matching algorithm was also compared with traditional shaping methods. - Highlights: • A detailed discussion on the template-matching algorithm was given. • The algorithm was tested on data from a NaI and a Si detector. • The algorithm was successfully implemented on high rate data from a HPGe detector. • The performance of the algorithm was compared with traditional shaping methods. • The advantage of the algorithm in active interrogation was discussed.
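The core idea of template-matching pile-up recovery, recovering the amplitudes of overlapping pulses by fitting shifted copies of a known pulse template, can be sketched as a linear least-squares problem. The pulse shape, arrival times, and amplitudes below are synthetic assumptions for illustration, not the algorithm's actual parameters:

```python
import numpy as np

def template(t, t0, tau_r=2.0, tau_d=10.0):
    """Toy normalized detector pulse: fast rise, exponential decay."""
    return np.where(t >= t0,
                    (1 - np.exp(-(t - t0) / tau_r)) * np.exp(-(t - t0) / tau_d),
                    0.0)

t = np.arange(0, 100, 1.0)
true_amps, arrivals = (3.0, 1.5), (10.0, 18.0)   # two piled-up pulses
y = sum(a * template(t, t0) for a, t0 in zip(true_amps, arrivals))

# with arrival times known (e.g. from a fast trigger), the piled-up
# amplitudes follow from a linear least-squares fit of shifted templates
A = np.column_stack([template(t, t0) for t0 in arrivals])
amps, *_ = np.linalg.lstsq(A, y, rcond=None)
print(np.round(amps, 3))  # ≈ [3.0, 1.5]
```

Because the fit uses the full template rather than a shortened pulse, the amplitudes (and hence energies) are recovered without the resolution penalty of aggressive high-pass shaping.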

  8. Wavelet-space correlation imaging for high-speed MRI without motion monitoring or data segmentation.

    Science.gov (United States)

    Li, Yu; Wang, Hui; Tkach, Jean; Roach, David; Woods, Jason; Dumoulin, Charles

    2015-12-01

    This study aims to (i) develop a new high-speed MRI approach by implementing correlation imaging in wavelet-space, and (ii) demonstrate the ability of wavelet-space correlation imaging to image human anatomy with involuntary or physiological motion. Correlation imaging is a high-speed MRI framework in which image reconstruction relies on quantification of data correlation. The presented work integrates correlation imaging with a wavelet transform technique developed originally in the field of signal and image processing. This provides a new high-speed MRI approach to motion-free data collection without motion monitoring or data segmentation. The new approach, called "wavelet-space correlation imaging", is investigated in brain imaging with involuntary motion and chest imaging with free-breathing. Wavelet-space correlation imaging can exceed the speed limit of conventional parallel imaging methods. Using this approach with high acceleration factors (6 for brain MRI, 16 for cardiac MRI, and 8 for lung MRI), motion-free images can be generated in static brain MRI with involuntary motion and nonsegmented dynamic cardiac/lung MRI with free-breathing. Wavelet-space correlation imaging enables high-speed MRI in the presence of involuntary motion or physiological dynamics without motion monitoring or data segmentation. © 2014 Wiley Periodicals, Inc.

  9. Impact of image segmentation on high-content screening data quality for SK-BR-3 cells

    Directory of Open Access Journals (Sweden)

    Li Yizheng

    2007-09-01

    Full Text Available Abstract Background High content screening (HCS is a powerful method for the exploration of cellular signalling and morphology that is rapidly being adopted in cancer research. HCS uses automated microscopy to collect images of cultured cells. The images are subjected to segmentation algorithms to identify cellular structures and quantitate their morphology, for hundreds to millions of individual cells. However, image analysis may be imperfect, especially for "HCS-unfriendly" cell lines whose morphology is not well handled by current image segmentation algorithms. We asked if segmentation errors were common for a clinically relevant cell line, if such errors had measurable effects on the data, and if HCS data could be improved by automated identification of well-segmented cells. Results Cases of poor cell body segmentation occurred frequently for the SK-BR-3 cell line. We trained classifiers to identify SK-BR-3 cells that were well segmented. On an independent test set created by human review of cell images, our optimal support-vector machine classifier identified well-segmented cells with 81% accuracy. The dose responses of morphological features were measurably different in well- and poorly-segmented populations. Elimination of the poorly-segmented cell population increased the purity of DNA content distributions, while appropriately retaining biological heterogeneity, and simultaneously increasing our ability to resolve specific morphological changes in perturbed cells. Conclusion Image segmentation has a measurable impact on HCS data. The application of a multivariate shape-based filter to identify well-segmented cells improved HCS data quality for an HCS-unfriendly cell line, and could be a valuable post-processing step for some HCS datasets.
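The post-processing filter described, a trained classifier that keeps only well-segmented cells, reduces in its simplest form to a decision function over shape features. The sketch below stands in for the paper's support-vector machine with a hand-set linear rule; the feature names and weights are hypothetical:

```python
def well_segmented(features, w, b):
    """Linear decision function (stand-in for a trained SVM filter)."""
    return sum(wi * fi for wi, fi in zip(w, features)) + b > 0

# hypothetical shape features per cell: [solidity, cell/nucleus area ratio]
w, b = [2.0, 1.5], -2.5          # illustrative weights, not trained values
cells = [[0.95, 0.9],            # plausible cell body outline
         [0.40, 0.3]]            # fragmented, poorly segmented outline
kept = [well_segmented(c, w, b) for c in cells]
print(kept)  # [True, False]
```

Downstream HCS statistics would then be computed only over the retained population, which is the data-quality gain the record reports.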

  10. Highly segmented large-area hybrid photodiodes with bialkali photocathodes and enclosed VLSI readout electronics

    CERN Document Server

    Braem, André; Filthaut, Frank; Go, A; Joram, C; Weilhammer, Peter; Wicht, P; Dulinski, W; Séguinot, Jacques; Wenzel, H; Ypsilantis, Thomas

    2000-01-01

We report on the principles, design, fabrication, and operation of a highly segmented, large-area hybrid photodiode, which is being developed in the framework of the LHCb RICH project. The device consists of a cylindrical, 127 mm diameter vacuum envelope capped with a spherical borosilicate UV-glass entrance window, with an active-to-total-area fraction of 81%. A fountain-focusing electron optics is used to demagnify the image onto a 50 mm diameter silicon sensor, containing 2048 pads of size 1×1 mm². (10 refs).

  11. A Pole Pair Segment of a 2-MW High-Temperature Superconducting Wind Turbine Generator

    DEFF Research Database (Denmark)

    Song, Xiaowei (Andy); Mijatovic, Nenad; Kellers, Jürgen

    2017-01-01

A 2-MW high-temperature superconducting (HTS) generator with 24 pole pairs has been designed for the wind turbine application. In order to identify potential challenges and obtain practical knowledge prior to production, a full-size stationary experimental setup, which is one pole pair segment...... and the setup in terms of the flux density, the operating condition of the HTS winding, and the force-generation capability. Finite element (FE) software MagNet is used to carry out numerical simulations. The findings show that the HTS winding in the setup is a good surrogate for those that would be used....

  12. A Pole Pair Segment of a 2 MW High Temperature Superconducting Wind Turbine Generator

    DEFF Research Database (Denmark)

    Song, Xiaowei (Andy); Mijatovic, Nenad; Kellers, Jürgen

    2016-01-01

    A 2 MW high temperature superconducting (HTS) generator with 24 pole pairs has been designed for the wind turbine application. In order to identify potential challenges and obtain practical knowledge prior to production, a fullsize stationary experimental set-up, which is one pole pair segment...... generator and the set-up in terms of the flux density, the operating condition of the HTS winding, and the force-generation capability. Finite element (FE) software MagNet is used to carry out numerical simulations. The findings show that the HTS winding in the set-up is a good surrogate...

  13. HIGH QUALITY FACADE SEGMENTATION BASED ON STRUCTURED RANDOM FOREST, REGION PROPOSAL NETWORK AND RECTANGULAR FITTING

    Directory of Open Access Journals (Sweden)

    K. Rahmani

    2018-05-01

Full Text Available In this paper we present a pipeline for high quality semantic segmentation of building facades using a Structured Random Forest (SRF), a Region Proposal Network (RPN) based on a Convolutional Neural Network (CNN), as well as rectangular fitting optimization. Our main contribution is that we employ features created by the RPN as channels in the SRF. We empirically show that this is very effective, especially for doors and windows. Our pipeline is evaluated on two datasets where we outperform current state-of-the-art methods. Additionally, we quantify the contribution of the RPN and the rectangular fitting optimization to the accuracy of the result.

  14. Efficacy of highly hydrophilic soft contact lenses for persistent corneal epithelial defects after anterior segment surgery

    Directory of Open Access Journals (Sweden)

    Zhi-Wei Peng

    2015-02-01

Full Text Available AIM: To investigate the efficacy of highly hydrophilic soft contact lenses for persistent corneal epithelial defects. METHODS: In this retrospective case analysis, 28 patients (28 eyes) with persistent corneal epithelial defects after anterior segment surgery from January 2011 to June 2013 in our hospital were reviewed. After regular treatment for at least 2wk, the persistent corneal epithelial defects were treated with highly hydrophilic soft contact lenses until the corneal epithelium healed. The same lens was worn for no more than 3wk before being replaced with a new one as needed. All cases were followed up for 6mo. Key indicators of corneal epithelial healing, corneal fluorescein staining and improvement of ocular symptoms were observed. RESULTS: Twenty-one eyes were cured (75.00%), treatment was markedly effective in 5 eyes (17.86%) and effective in 2 eyes (7.14%), with no invalid cases, for a total efficiency of 100.00%. Ocular symptoms were relieved within 2d in 25 cases (89.29%) and within 1wk in the remaining 3 cases (10.71%). The corneal epithelium repaired in 3wk in 6 cases (21.43%), in 6wk in 13 cases (46.43%), in 9wk in 7 cases (25.00%), and over 12wk in 2 cases (7.14%). There were no signs of secondary infection and no evidence of recurrence in 6mo. CONCLUSION: Highly hydrophilic soft contact lenses could significantly promote the repair of persistent corneal epithelial defects after anterior segment surgery, while quickly and effectively relieving a variety of ocular irritation symptoms.

  15. High resolution gamma-ray spectroscopy at high count rates with a prototype High Purity Germanium detector

    Science.gov (United States)

    Cooper, R. J.; Amman, M.; Vetter, K.

    2018-04-01

    High-resolution gamma-ray spectrometers are required for applications in nuclear safeguards, emergency response, and fundamental nuclear physics. To overcome one of the shortcomings of conventional High Purity Germanium (HPGe) detectors, we have developed a prototype device capable of achieving high event throughput and high energy resolution at very high count rates. This device, the design of which we have previously reported on, features a planar HPGe crystal with a reduced-capacitance strip electrode geometry. This design is intended to provide good energy resolution at the short shaping or digital filter times that are required for high rate operation and which are enabled by the fast charge collection afforded by the planar geometry crystal. In this work, we report on the initial performance of the system at count rates up to and including two million counts per second.

  16. 18F half-life measurement using a high-purity germanium detector

    International Nuclear Information System (INIS)

    Han, Jubong; Lee, K.B.; Park, T.S.; Lee, J.M.; Oh, P.J.; Lee, S.H.; Kang, Y.S.; Ahn, J.K.

    2012-01-01

The half-life of 18F has been measured using HPGe detectors with a 137Cs reference source. The counting ratio of 511 keV γ-rays from 18F to 662 keV γ-rays from 137Cs was fitted for the half-life with a weighted least-squares method. Uncertainties due to systematic effects arising from the measurement of a high-activity 18F source were studied in detail. The half-life of 18F was found to be (109.72±0.19) min. The result is in good agreement with the recommended value of (109.728±0.019) min evaluated at the Laboratoire National Henri Becquerel (LNHB). - Highlights: ► The 18F half-life was measured with and without a reference source using HPGe detectors. ► We found the systematic 'activity dynamic range effect' by monitoring the counts of the reference source. ► This activity dynamic range effect was corrected by using the reference source method. ► The 18F half-life using the reference source method was in good agreement with the recommended value of LNHB.
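The weighted least-squares fit of the counting ratio can be sketched by linearizing the decay: ln R(t) = ln R0 − λt, with λ = ln 2 / T½. The data below are synthetic, generated from the recommended half-life, not the measurement itself:

```python
import math

def fit_half_life(times, ratios, sigmas):
    """Weighted linear least squares of ln(ratio) = ln(R0) - lambda * t.

    A minimal sketch of the ratio method; sigma_ln = sigma / ratio,
    so the weight for each point is (ratio / sigma)**2.
    """
    w = [(r / s) ** 2 for r, s in zip(ratios, sigmas)]
    y = [math.log(r) for r in ratios]
    S = sum(w)
    Sx = sum(wi * t for wi, t in zip(w, times))
    Sy = sum(wi * yi for wi, yi in zip(w, y))
    Sxx = sum(wi * t * t for wi, t in zip(w, times))
    Sxy = sum(wi * t * yi for wi, t, yi in zip(w, times, y))
    slope = (S * Sxy - Sx * Sy) / (S * Sxx - Sx * Sx)
    lam = -slope                      # decay constant, 1/min
    return math.log(2) / lam          # half-life, min

# synthetic ratios built from the recommended 18F half-life of 109.728 min
t12 = 109.728
times = [10 * i for i in range(20)]           # minutes
ratios = [5.0 * 0.5 ** (t / t12) for t in times]
print(round(fit_half_life(times, ratios, [0.01] * len(times)), 3))  # 109.728
```

On noise-free data the fit returns the input half-life exactly; with real counting data, the per-point sigmas propagate into the quoted ±0.19 min uncertainty.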

  17. High Purity Germanium Detector as part of Health Canada's Mobile Nuclear Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Stocki, Trevor J.; Bouchard, Claude; Rollings, John; Boudreau, Marc-Oliver; McCutcheon-Wickham, Rory; Bergman, Lauren [Radiation Protection Bureau, Health Canada, AL6302D, 775 Brookfield Road, Ottawa, K1A 0K9 (Canada)

    2014-07-01

    In the event of a nuclear emergency on Canadian soil, Health Canada has designed and equipped two Mobile Nuclear Labs (MNLs) which can be deployed near a radiological accident site to provide radiological measurement capabilities. These measurements would help public authorities to make informed decisions for radiation protection recommendations. One of the MNLs has been outfitted with a High Purity Germanium (HPGe) detector within a lead castle, which can be used for identification as well as quantification of gamma emitting radioisotopes in contaminated soil, water, and other samples. By spring 2014, Health Canada's second MNL will be equipped with a similar detector to increase sample analysis capacity and also provide redundancy if one of the detectors requires maintenance. The Mobile Nuclear Lab (MNL) with the HPGe detector has been successfully deployed in the field for various exercises. One of these field exercises was a dirty bomb scenario where an unknown radioisotope required identification. A second exercise was an inter-comparison between the measurements of spiked soil and water samples, by two field teams and a certified laboratory. A third exercise was the deployment of the MNL as part of a full scale nuclear exercise simulating an emergency at a Canadian nuclear power plant. The lessons learned from these experiences will be discussed. (authors)

  18. Microstrip Resonator for High Field MRI with Capacitor-Segmented Strip and Ground Plane

    DEFF Research Database (Denmark)

    Zhurbenko, Vitaliy; Boer, Vincent; Petersen, Esben Thade

    2017-01-01

    …) segmenting strip and ground plane of the resonator with series capacitors. The design equations for capacitors providing symmetric current distribution are derived. The performance of two types of segmented resonators is investigated experimentally. To the authors' knowledge, a microstrip resonator where both strip and ground plane are capacitor-segmented is shown here for the first time.

  19. Fast Segmentation and Classification of Very High Resolution Remote Sensing Data Using SLIC Superpixels

    Directory of Open Access Journals (Sweden)

    Ovidiu Csillik

    2017-03-01

    Full Text Available Speed and accuracy are important factors when dealing with time-constrained events for disaster, risk, and crisis-management support. Object-based image analysis can be a time-consuming task in extracting information from large images because most segmentation algorithms use the pixel grid for the initial object representation. It would be more natural and efficient to work with perceptually meaningful entities that are derived from pixels using a low-level grouping process (superpixels). Firstly, we tested a new workflow for image segmentation of remote sensing data, starting the multiresolution segmentation (MRS), using the ESP2 tool, from the superpixel level and aiming at reducing the amount of time needed to automatically partition relatively large datasets of very high resolution remote sensing data. Secondly, we examined whether a Random Forest classification based on an oversegmentation produced by the Simple Linear Iterative Clustering (SLIC) superpixel algorithm performs comparably, in terms of accuracy, to a traditional object-based classification. Tests were applied on QuickBird and WorldView-2 data with different extents, scene content complexities, and numbers of bands to assess how the computational time and classification accuracy are affected by these factors. The proposed segmentation approach is compared with the traditional one, starting the MRS from the pixel level, regarding geometric accuracy of the objects and computational time. The computational time was reduced in all cases, the biggest improvement being from 5 h 35 min to 13 min for a WorldView-2 scene with eight bands and an extent of 12.2 million pixels, while the geometric accuracy is kept similar or slightly better. SLIC superpixel-based classification had similar or better overall accuracy values when compared to MRS-based classification, but the results were obtained faster and without the parameterization of the MRS. These two approaches

  20. High-resolution imaging gamma-ray spectroscopy with externally segmented germanium detectors

    Science.gov (United States)

    Callas, J. L.; Mahoney, W. A.; Varnell, L. S.; Wheaton, W. A.

    1993-01-01

    Externally segmented germanium detectors promise a breakthrough in gamma-ray imaging capabilities while retaining the superb energy resolution of germanium spectrometers. An angular resolution of 0.2 deg becomes practical by combining position-sensitive germanium detectors having a segment thickness of a few millimeters with a one-dimensional coded aperture located about a meter from the detectors. Correspondingly higher angular resolutions are possible with larger separations between the detectors and the coded aperture. Two-dimensional images can be obtained by rotating the instrument. Although the basic concept is similar to optical or X-ray coded-aperture imaging techniques, several complicating effects arise because of the penetrating nature of gamma rays. The complications include partial transmission through the coded aperture elements, Compton scattering in the germanium detectors, and high background count rates. Extensive electron-photon Monte Carlo modeling of a realistic detector/coded-aperture/collimator system has been performed. Results show that these complicating effects can be characterized and accounted for with no significant loss in instrument sensitivity.

  1. A high resolution portable spectroscopy system

    International Nuclear Information System (INIS)

    Kulkarni, C.P.; Vaidya, P.P.; Paulson, M.; Bhatnagar, P.V.; Pande, S.S.; Padmini, S.

    2003-01-01

    Full text: This paper describes the system details of a High Resolution Portable Spectroscopy System (HRPSS) developed at the Electronics Division, BARC. The system can be used for laboratory-class, high-resolution nuclear spectroscopy applications. The HRPSS consists of a specially designed compact NIM bin, with built-in power supplies, accommodating a low-power, high-resolution MCA and an on-board embedded computer for spectrum building and communication. A NIM-based spectroscopy amplifier and an HV module for detector bias are integrated (plug-in) in the bin. The system communicates with a host PC via a serial link. Along with a laptop PC and a portable HP-Ge detector, the HRPSS offers laboratory-class performance for portable applications

  2. Analysis of high-identity segmental duplications in the grapevine genome

    Directory of Open Access Journals (Sweden)

    Carelli Francesco N

    2011-08-01

    Full Text Available Abstract Background Segmental duplications (SDs) are blocks of genomic sequence of 1-200 kb that map to different loci in a genome and share a sequence identity > 90%. SDs show at the sequence level the same characteristics as other regions of the human genome: they contain both high-copy repeats and gene sequences. SDs play an important role in genome plasticity by creating new genes and modeling genome structure. Although data are plentiful for mammals, not much was known about the representation of SDs in plant genomes. In this regard, we performed a genome-wide analysis of high-identity SDs on the sequenced grapevine (Vitis vinifera) genome (PN40024). Results We demonstrate that recent SDs (> 94% identity and >= 10 kb in size) are a relevant component of the grapevine genome (85 Mb, 17% of the genome sequence). We detected mitochondrial and plastid DNA and genes (10% of gene annotation) in segmentally duplicated regions of the nuclear genome. In particular, the nine highest-copy-number genes have a copy in either or both organelle genomes. Further, we showed that several duplicated genes take part in the biosynthesis of compounds involved in plant response to environmental stress. Conclusions These data show the great influence of SDs and organelle DNA transfers in modeling the Vitis vinifera nuclear DNA structure as well as the impact of SDs in contributing to the adaptive capacity of grapevine and the nutritional content of grape products through genome variation. This study represents a step forward in the full characterization of duplicated genes important for grapevine cultural needs and human health.

  3. Shadow Detection from Very High Resolution Satellite Image Using Grabcut Segmentation and Ratio-Band Algorithms

    Science.gov (United States)

    Kadhim, N. M. S. M.; Mourshed, M.; Bray, M. T.

    2015-03-01

    Very-High-Resolution (VHR) satellite imagery is a powerful source of data for detecting and extracting information about urban constructions. Shadow in the VHR satellite imageries provides vital information on urban construction forms, illumination direction, and the spatial distribution of the objects that can help to further understanding of the built environment. However, to extract shadows, the automated detection of shadows from images must be accurate. This paper reviews current automatic approaches that have been used for shadow detection from VHR satellite images and comprises two main parts. In the first part, shadow concepts are presented in terms of shadow appearance in the VHR satellite imageries, current shadow detection methods, and the usefulness of shadow detection in urban environments. In the second part, we adopted two approaches considered state-of-the-art for shadow detection and segmentation, using WorldView-3 and Quickbird images. In the first approach, the ratios between the NIR and visible bands were computed on a pixel-by-pixel basis, which allows for disambiguation between shadows and dark objects. To obtain an accurate shadow candidate map, we further refine the shadow map after applying the ratio algorithm on the Quickbird image. The second selected approach is the GrabCut segmentation approach for examining its performance in detecting the shadow regions of urban objects using the true colour image from WorldView-3. Further refinement was applied to attain a segmented shadow map. Although the detection of shadow regions is a very difficult task when they are derived from a VHR satellite image that comprises a visible spectrum range (RGB true colour), the results demonstrate that the detection of shadow regions in the WorldView-3 image is a reasonable separation from other objects by applying the GrabCut algorithm.
In addition, the derived shadow map from the Quickbird image indicates significant performance of
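The per-pixel ratio idea can be illustrated with a toy example (the paper's exact ratio-band formulation is not reproduced here; this sketch assumes a simple normalized NIR-to-visible index and illustrative thresholds, and the pixel values are invented):

```python
import numpy as np

# Toy 2x3 patch with 4 bands (R, G, B, NIR) as reflectances.
# Columns: 0 = sunlit surfaces, 1 = shadow, 2 = dark (unshadowed) vegetation.
r   = np.array([[0.10, 0.04, 0.05], [0.30, 0.03, 0.06]])
g   = np.array([[0.12, 0.05, 0.06], [0.28, 0.04, 0.07]])
b   = np.array([[0.08, 0.04, 0.05], [0.25, 0.03, 0.06]])
nir = np.array([[0.60, 0.05, 0.40], [0.45, 0.04, 0.35]])

vis = (r + g + b) / 3.0                 # mean visible reflectance
index = (nir - vis) / (nir + vis)       # NIR excess disambiguates dark objects
dark = vis < 0.08                       # low visible radiance candidates
shadow = dark & (index < 0.3)           # dark AND lacking a strong NIR excess
```

Dark vegetation keeps a high NIR excess even at low visible radiance, so only the genuinely shadowed column survives the combined test.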

  4. SHADOW DETECTION FROM VERY HIGH RESOLUTION SATELLITE IMAGE USING GRABCUT SEGMENTATION AND RATIO-BAND ALGORITHMS

    Directory of Open Access Journals (Sweden)

    N. M. S. M. Kadhim

    2015-03-01

    Full Text Available Very-High-Resolution (VHR) satellite imagery is a powerful source of data for detecting and extracting information about urban constructions. Shadow in the VHR satellite imageries provides vital information on urban construction forms, illumination direction, and the spatial distribution of the objects that can help to further understanding of the built environment. However, to extract shadows, the automated detection of shadows from images must be accurate. This paper reviews current automatic approaches that have been used for shadow detection from VHR satellite images and comprises two main parts. In the first part, shadow concepts are presented in terms of shadow appearance in the VHR satellite imageries, current shadow detection methods, and the usefulness of shadow detection in urban environments. In the second part, we adopted two approaches considered state-of-the-art for shadow detection and segmentation, using WorldView-3 and Quickbird images. In the first approach, the ratios between the NIR and visible bands were computed on a pixel-by-pixel basis, which allows for disambiguation between shadows and dark objects. To obtain an accurate shadow candidate map, we further refine the shadow map after applying the ratio algorithm on the Quickbird image. The second selected approach is the GrabCut segmentation approach for examining its performance in detecting the shadow regions of urban objects using the true colour image from WorldView-3. Further refinement was applied to attain a segmented shadow map. Although the detection of shadow regions is a very difficult task when they are derived from a VHR satellite image that comprises a visible spectrum range (RGB true colour), the results demonstrate that the detection of shadow regions in the WorldView-3 image is a reasonable separation from other objects by applying the GrabCut algorithm. In addition, the derived shadow map from the Quickbird image indicates

  5. Comparison of high-resolution Scheimpflug and high-frequency ultrasound biomicroscopy to anterior-segment OCT corneal thickness measurements

    Directory of Open Access Journals (Sweden)

    Kanellopoulos AJ

    2013-11-01

    Full Text Available Anastasios John Kanellopoulos,1,2 George Asimellis1 1Laservision.gr Eye Institute, Athens, Greece; 2New York University Medical School, New York, NY, USA Background: The purpose of this study was to compare and correlate central corneal thickness in healthy, nonoperated eyes with three advanced anterior-segment imaging systems: a high-resolution Scheimpflug tomography camera (Oculyzer II), a spectral-domain anterior-segment optical coherence tomography (AS-OCT) system, and a high-frequency ultrasound biomicroscopy (HF-UBM) system. Methods: Fifty eyes randomly selected from 50 patients were included in the study. Inclusion criteria were healthy, nonoperated eyes examined consecutively by the same examiner. Corneal imaging was performed by three different methods, ie, Oculyzer II, spectral-domain AS-OCT, and HF-UBM. Central corneal thickness measurements were compared using scatter diagrams, Bland-Altman plots (with bias and 95% confidence intervals), and two-paired analysis. Results: The coefficient of determination (r2) between the Oculyzer II and AS-OCT measurements was 0.895. Likewise, the coefficient was 0.893 between the Oculyzer II and HF-UBM and 0.830 between the AS-OCT and HF-UBM. The trend-line coefficients of linearity were 0.925 between the Oculyzer II and the AS-OCT, 1.006 between the Oculyzer II and HF-UBM, and 0.841 between the AS-OCT and HF-UBM. The differences in average corneal thickness between the three pairs of CCT measurements were –6.86 µm between the Oculyzer II and HF-UBM, –12.20 µm between the AS-OCT and Oculyzer II, and +19.06 µm between the HF-UBM and AS-OCT. Conclusion: The three methods used for corneal thickness measurement are highly correlated. Compared with the Scheimpflug and ultrasound devices, the AS-OCT appears to report a more accurate, but overall thinner, corneal pachymetry. Keywords: anterior eye segment, high-frequency ultrasound biomicroscopy, optical coherence tomography, high-resolution Pentacam
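A Bland-Altman comparison of the kind used above reduces to the mean paired difference (bias) and its 95% limits of agreement; a minimal sketch with invented pachymetry values (the offsets below are illustrative, not the study's data):

```python
import numpy as np

def bland_altman(a, b):
    """Return bias and 95% limits of agreement between paired measurements."""
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical central corneal thickness (um) from two devices
oct_um = np.array([531.0, 545.0, 552.0, 560.0, 548.0])
ubm_um = oct_um + np.array([18.0, 20.0, 19.0, 21.0, 17.0])  # UBM reads thicker here

bias, lo, hi = bland_altman(oct_um, ubm_um)
print(round(bias, 1))  # -19.0
```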

  6. Classification of semiurban landscapes from very high-resolution satellite images using a regionalized multiscale segmentation approach

    Science.gov (United States)

    Kavzoglu, Taskin; Erdemir, Merve Yildiz; Tonbul, Hasan

    2017-07-01

    In object-based image analysis, obtaining representative image objects is an important prerequisite for a successful image classification. The major threat is the issue of scale selection due to the complex spatial structure of landscapes portrayed as an image. This study proposes a two-stage approach to conduct regionalized multiscale segmentation. In the first stage, an initial high-level segmentation is applied through a "broadscale," and a set of image objects characterizing natural borders of the landscape features are extracted. Contiguous objects are then merged to create regions by considering their normalized difference vegetation index resemblance. In the second stage, optimal scale values are estimated for the extracted regions, and multiresolution segmentation is applied with these settings. Two satellite images with different spatial and spectral resolutions were utilized to test the effectiveness of the proposed approach and its transferability to different geographical sites. Results were compared to those of image-based single-scale segmentation and it was found that the proposed approach outperformed the single-scale segmentations. Using the proposed methodology, significant improvement in terms of segmentation quality and classification accuracy (up to 5%) was achieved. In addition, the highest classification accuracies were produced using fine-scale values.
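The NDVI resemblance used to merge contiguous objects is the standard normalized difference of NIR and red reflectance; a merge test on two regions' mean NDVI could be sketched as follows (the tolerance and reflectance values are illustrative assumptions, not the paper's settings):

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and red reflectance."""
    return (nir - red) / (nir + red)

def resemble(ndvi_a, ndvi_b, tol=0.1):
    # Illustrative merge criterion: contiguous objects with similar mean NDVI
    return abs(ndvi_a - ndvi_b) < tol

veg = ndvi(0.60, 0.10)    # dense vegetation, ~0.71
soil = ndvi(0.30, 0.25)   # bare soil, ~0.09
print(resemble(veg, soil))  # False -> keep the regions separate
```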

  7. aMAP is a validated pipeline for registration and segmentation of high-resolution mouse brain data

    Science.gov (United States)

    Niedworok, Christian J.; Brown, Alexander P. Y.; Jorge Cardoso, M.; Osten, Pavel; Ourselin, Sebastien; Modat, Marc; Margrie, Troy W.

    2016-01-01

    The validation of automated image registration and segmentation is crucial for accurate and reliable mapping of brain connectivity and function in three-dimensional (3D) data sets. While validation standards are necessarily high and routinely met in the clinical arena, they have to date been lacking for high-resolution microscopy data sets obtained from the rodent brain. Here we present a tool for optimized automated mouse atlas propagation (aMAP) based on clinical registration software (NiftyReg) for anatomical segmentation of high-resolution 3D fluorescence images of the adult mouse brain. We empirically evaluate aMAP as a method for registration and subsequent segmentation by validating it against the performance of expert human raters. This study therefore establishes a benchmark standard for mapping the molecular function and cellular connectivity of the rodent brain. PMID:27384127

  8. ROBUST MOTION SEGMENTATION FOR HIGH DEFINITION VIDEO SEQUENCES USING A FAST MULTI-RESOLUTION MOTION ESTIMATION BASED ON SPATIO-TEMPORAL TUBES

    OpenAIRE

    Brouard , Olivier; Delannay , Fabrice; Ricordel , Vincent; Barba , Dominique

    2007-01-01

    4 pages; International audience; Motion segmentation methods are effective for tracking video objects. However, object segmentation methods based on motion need to know the global motion of the video in order to back-compensate it before computing the segmentation. In this paper, we propose a method which estimates the global motion of a High Definition (HD) video shot and then segments it using the remaining motion information. First, we develop a fast method for multi-resolution motion est...

  9. Automated Segmentation of High-Resolution Photospheric Images of Active Regions

    Science.gov (United States)

    Yang, Meng; Tian, Yu; Rao, Changhui

    2018-02-01

    Due to the development of ground-based, large-aperture solar telescopes with adaptive optics (AO) resulting in increasing resolving ability, more accurate sunspot identifications and characterizations are required. In this article, we have developed a set of automated segmentation methods for high-resolution solar photospheric images. Firstly, a local-intensity-clustering level-set method is applied to roughly separate solar granulation and sunspots. Then reinitialization-free level-set evolution is adopted to adjust the boundaries of the photospheric patch; an adaptive intensity threshold is used to discriminate between umbra and penumbra; light bridges are selected according to their regional properties from candidates produced by morphological operations. The proposed method is applied to the solar high-resolution TiO 705.7-nm images taken by the 151-element AO system and Ground-Layer Adaptive Optics prototype system at the 1-m New Vacuum Solar Telescope of the Yunnan Observatory. Experimental results show that the method achieves satisfactory robustness and efficiency with low computational cost on high-resolution images. The method could also be applied to full-disk images, and the calculated sunspot areas correlate well with the data given by the National Oceanic and Atmospheric Administration (NOAA).
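The abstract does not specify the adaptive threshold rule used to split umbra from penumbra; an Otsu-style between-class-variance threshold is one common choice, sketched here on synthetic intensities (the intensity distributions are invented):

```python
import numpy as np

def otsu_threshold(values, nbins=256):
    """Threshold maximizing between-class variance (a common adaptive choice)."""
    hist, edges = np.histogram(values, bins=nbins)
    p = hist.astype(float) / hist.sum()
    idx = np.arange(nbins)
    omega = np.cumsum(p)                  # class-0 probability up to bin k
    mu = np.cumsum(p * idx)               # unnormalized class-0 mean
    mu_t = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    k = int(np.nanargmax(sigma_b))
    return edges[k + 1]

rng = np.random.default_rng(0)
umbra = rng.normal(0.2, 0.02, 500)        # dark sunspot-core intensities
penumbra = rng.normal(0.8, 0.02, 500)     # brighter surrounding filaments
intensities = np.concatenate([umbra, penumbra])
t = otsu_threshold(intensities)
umbra_mask = intensities < t              # pixels classified as umbra
```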

  10. Influence of “J”-Curve Spring Stiffness on Running Speeds of Segmented Legs during High-Speed Locomotion

    Directory of Open Access Journals (Sweden)

    Runxiao Wang

    2016-01-01

    Full Text Available Both the linear leg spring model and the two-segment leg model with constant spring stiffness have been broadly used as template models to investigate bouncing gaits for legged robots with compliant legs. In addition to these two models, other leg spring stiffness models developed using inspiration from biological characteristics have the potential to improve the high-speed running capacity of spring-legged robots. In this paper, we investigate the effects of “J”-curve spring stiffness inspired by biological materials on running speeds of segmented legs during high-speed locomotion. A mathematical formulation of the relationship between the virtual leg force and the virtual leg compression is established. When the SLIP model and the two-segment leg model with constant spring stiffness and with “J”-curve spring stiffness have the same dimensionless reference stiffness, the two-segment leg model with “J”-curve spring stiffness reveals that (1) both the largest tolerated range of running speeds and the tolerated maximum running speed are found and (2) at fast running speeds from 25 to 40/92 m s−1 both the tolerated range of landing angle and the stability region are the largest. It is suggested that the two-segment leg model with “J”-curve spring stiffness is more advantageous for high-speed running compared with the SLIP model and the two-segment leg model with constant spring stiffness.
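The abstract does not give the force-compression law explicitly; a commonly used "J"-curve (progressively stiffening) relation is F = F_ref (Δ/Δ_ref)^n with n > 1, under which the local stiffness dF/dΔ grows with compression. A sketch under that assumption (the function name and parameter values are hypothetical):

```python
import numpy as np

def j_curve_force(delta, f_ref=1.0, delta_ref=1.0, n=2.5):
    """Hypothetical 'J'-curve law: force rises faster than linearly."""
    return f_ref * (delta / delta_ref) ** n

delta = np.linspace(0.01, 1.0, 100)       # dimensionless leg compression
force = j_curve_force(delta)
stiffness = np.gradient(force, delta)     # local slope dF/d(delta)

# The local stiffness increases monotonically with compression: the 'J' shape
print(bool(np.all(np.diff(stiffness) > 0)))  # True
```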

  11. Intra-arterial high signals on arterial spin labeling perfusion images predict the occluded internal carotid artery segment

    International Nuclear Information System (INIS)

    Sogabe, Shu; Satomi, Junichiro; Tada, Yoshiteru; Kanematsu, Yasuhisa; Kuwayama, Kazuyuki; Yagi, Kenji; Yoshioka, Shotaro; Mizobuchi, Yoshifumi; Mure, Hideo; Yamaguchi, Izumi; Kitazato, Keiko T.; Nagahiro, Shinji; Abe, Takashi; Harada, Masafumi; Yamamoto, Nobuaki; Kaji, Ryuji

    2017-01-01

    Arterial spin labeling (ASL) involves perfusion imaging using the inverted magnetization of arterial water. If the arterial arrival times are longer than the post-labeling delay, labeled spins are visible on ASL images as bright, high intra-arterial signals (IASs); such signals were found within occluded vessels of patients with acute ischemic stroke. The identification of the occluded segment in the internal carotid artery (ICA) is crucial for endovascular treatment. We tested our hypothesis that high IASs on ASL images can predict the occluded segment. Our study included 13 patients with acute ICA occlusion who had undergone angiographic and ASL studies within 48 h of onset. We retrospectively identified the high IAS on ASL images and angiograms and recorded the occluded segment and the number of high IAS-positive slices on ASL images. The ICA segments were classified as cervical (C1), petrous (C2), cavernous (C3), and supraclinoid (C4). Of seven patients with intracranial ICA occlusion, five demonstrated high IASs at C1-C2, suggesting that high IASs could identify stagnant flow proximal to the occluded segment. Among six patients with extracranial ICA occlusion, five presented with high IASs at C3-C4, suggesting that signals could identify the collateral flow via the ophthalmic artery. None had high IASs at C1-C2. The mean number of high IAS-positive slices was significantly higher in patients with intra- than extracranial ICA occlusion. High IASs on ASL images can identify slow stagnant and collateral flow through the ophthalmic artery in patients with acute ICA occlusion and help to predict the occlusion site. (orig.)

  12. Intra-arterial high signals on arterial spin labeling perfusion images predict the occluded internal carotid artery segment

    Energy Technology Data Exchange (ETDEWEB)

    Sogabe, Shu; Satomi, Junichiro; Tada, Yoshiteru; Kanematsu, Yasuhisa; Kuwayama, Kazuyuki; Yagi, Kenji; Yoshioka, Shotaro; Mizobuchi, Yoshifumi; Mure, Hideo; Yamaguchi, Izumi; Kitazato, Keiko T.; Nagahiro, Shinji [Tokushima University Graduate School, Department of Neurosurgery, Tokushima (Japan); Abe, Takashi; Harada, Masafumi [Tokushima University Graduate School, Department of Radiology, Tokushima (Japan); Yamamoto, Nobuaki; Kaji, Ryuji [Tokushima University Graduate School, Department of Clinical Neurosciences, Institute of Biomedical Biosciences, Tokushima (Japan)

    2017-06-15

    Arterial spin labeling (ASL) involves perfusion imaging using the inverted magnetization of arterial water. If the arterial arrival times are longer than the post-labeling delay, labeled spins are visible on ASL images as bright, high intra-arterial signals (IASs); such signals were found within occluded vessels of patients with acute ischemic stroke. The identification of the occluded segment in the internal carotid artery (ICA) is crucial for endovascular treatment. We tested our hypothesis that high IASs on ASL images can predict the occluded segment. Our study included 13 patients with acute ICA occlusion who had undergone angiographic and ASL studies within 48 h of onset. We retrospectively identified the high IAS on ASL images and angiograms and recorded the occluded segment and the number of high IAS-positive slices on ASL images. The ICA segments were classified as cervical (C1), petrous (C2), cavernous (C3), and supraclinoid (C4). Of seven patients with intracranial ICA occlusion, five demonstrated high IASs at C1-C2, suggesting that high IASs could identify stagnant flow proximal to the occluded segment. Among six patients with extracranial ICA occlusion, five presented with high IASs at C3-C4, suggesting that signals could identify the collateral flow via the ophthalmic artery. None had high IASs at C1-C2. The mean number of high IAS-positive slices was significantly higher in patients with intra- than extracranial ICA occlusion. High IASs on ASL images can identify slow stagnant and collateral flow through the ophthalmic artery in patients with acute ICA occlusion and help to predict the occlusion site. (orig.)

  13. Hourglass-Shape Network Based Semantic Segmentation for High Resolution Aerial Imagery

    Directory of Open Access Journals (Sweden)

    Yu Liu

    2017-05-01

    Full Text Available A new convolutional neural network (CNN) architecture for semantic segmentation of high resolution aerial imagery is proposed in this paper. The proposed architecture follows an hourglass-shaped network (HSN) design, structured into encoding and decoding stages. By taking advantage of recent advances in CNN designs, we use the composed inception module to replace common convolutional layers, providing the network with multi-scale receptive areas with rich context. Additionally, in order to reduce spatial ambiguities in the up-sampling stage, skip connections with residual units are also employed to feed forward encoding-stage information directly to the decoder. Moreover, overlap inference is employed to alleviate boundary effects occurring when high resolution images are inferred from small-sized patches. Finally, we also propose a post-processing method based on weighted belief propagation to visually enhance the classification results. Extensive experiments based on the Vaihingen and Potsdam datasets demonstrate that the proposed architectures outperform three reference state-of-the-art network designs both numerically and visually.

  14. Time-optimized high-resolution readout-segmented diffusion tensor imaging.

    Directory of Open Access Journals (Sweden)

    Gernot Reishofer

    Full Text Available Readout-segmented echo planar imaging with 2D navigator-based reacquisition is an emerging technique enabling the sampling of high-resolution diffusion images with reduced susceptibility artifacts. However, low signal from the small voxels and long scan times hamper its clinical applicability. Therefore, we introduce a regularization algorithm based on total variation that is applied directly to the entire diffusion tensor. The spatially varying regularization parameter is determined automatically depending on spatial variations in signal-to-noise ratio, thus avoiding over- or under-regularization. Information about the noise distribution in the diffusion tensor is extracted from the diffusion-weighted images by means of complex independent component analysis. Moreover, the combination of these features enables fully user-independent processing of the diffusion data. Tractography from in vivo data and from a software phantom demonstrates the advantage of the spatially varying regularization compared to un-regularized data with respect to parameters relevant for fiber tracking such as Mean Fiber Length, Track Count, Volume and Voxel Count. Specifically, for in vivo data, findings suggest that tractography from the regularized diffusion tensor based on one measurement (16 min) generates results comparable to the un-regularized data with three averages (48 min). This significant reduction in scan time renders high-resolution (1 × 1 × 2.5 mm³) diffusion tensor imaging of the entire brain applicable in a clinical context.
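Total-variation regularization of this kind can be sketched in one dimension with plain subgradient descent on a fidelity-plus-TV objective (a toy version: the paper uses a spatially varying λ on the full diffusion tensor, whereas this sketch uses a fixed λ on a 1D signal):

```python
import numpy as np

def tv_denoise_1d(y, lam=0.3, step=0.02, n_iter=400):
    """Minimize 0.5*||x - y||^2 + lam * TV(x) by subgradient descent."""
    x = y.copy()
    for _ in range(n_iter):
        d = np.diff(x)
        g = x - y                       # gradient of the fidelity term
        s = np.sign(d)                  # subgradient of |x_{i+1} - x_i|
        g[:-1] -= lam * s               # d|d_i|/dx_i     = -sign(d_i)
        g[1:] += lam * s                # d|d_i|/dx_{i+1} = +sign(d_i)
        x = x - step * g
    return x

def tv(x):
    return np.abs(np.diff(x)).sum()

rng = np.random.default_rng(1)
clean = np.repeat([0.0, 1.0], 50)                 # piecewise-constant signal
noisy = clean + rng.normal(0.0, 0.05, clean.size)
denoised = tv_denoise_1d(noisy)
```

The regularizer flattens the noise within each constant segment while largely preserving the step, which is the property that benefits tensor-derived tractography measures.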

  15. Information Extraction of High-Resolution Remotely Sensed Image Based on Multiresolution Segmentation

    Directory of Open Access Journals (Sweden)

    Peng Shao

    2014-08-01

    Full Text Available The principle of multiresolution segmentation is presented in detail in this study, and the Canny algorithm is applied for edge detection of a remotely sensed image based on this principle. The target image was divided into regions based on object-oriented multiresolution segmentation and edge detection. Furthermore, an object hierarchy was created, and a series of features (water bodies, vegetation, roads, residential areas, bare land and other information) were extracted by spectral and geometrical features. The results indicate that edge detection has a positive effect on multiresolution segmentation, and the overall accuracy of information extraction reaches 94.6% as measured by the confusion matrix.
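The full Canny detector adds Gaussian smoothing, non-maximum suppression, and hysteresis on top of a gradient step; that shared gradient step can be sketched with 3x3 Sobel kernels (a reduced illustration, not the complete Canny pipeline):

```python
import numpy as np

def sobel_magnitude(img):
    """Gradient magnitude via 3x3 Sobel kernels (valid region only)."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
    ky = kx.T
    h, w = img.shape
    out = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            patch = img[i:i + 3, j:j + 3]
            gx = (patch * kx).sum()
            gy = (patch * ky).sum()
            out[i, j] = np.hypot(gx, gy)
    return out

# Vertical step edge between two flat regions
img = np.zeros((6, 6))
img[:, 3:] = 1.0
mag = sobel_magnitude(img)
edges = mag > 0.5 * mag.max()   # strong responses straddle the step
```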

  16. High-speed MRF-based segmentation algorithm using pixonal images

    DEFF Research Database (Denmark)

    Nadernejad, Ehsan; Hassanpour, H.; Naimi, H. M.

    2013-01-01

    Segmentation is one of the most complicated procedures in image processing and has an important role in image analysis. In this paper, an improved pixon-based method for image segmentation is proposed. In the proposed algorithm, complex partial differential equations (PDEs) are used as a kernel function to produce the pixonal image. Using this kernel function reduces image noise and prevents over-segmentation when the pixon-based method is used. Utilising the PDE-based method eliminates some unnecessary details and results in a smaller pixon number, faster performance and more robustness against unwanted environmental noises. As the next step, the appropriate pixons are extracted and, eventually, the image is segmented with the use of a Markov random field. The experimental results indicate that the proposed pixon-based approach has a reduced computational load...

  17. Experimental test of the background rejection, through imaging capability, of a highly segmented AGATA germanium detector

    International Nuclear Information System (INIS)

    Doncel, M.; Recchia, F.; Quintana, B.; Gadea, A.; Farnea, E.

    2010-01-01

    The development of highly segmented germanium detectors, together with algorithms to identify the position of each interaction within the crystal, opens the possibility of locating the γ-ray source using Compton imaging algorithms. While the Compton-suppression shield coupled to the germanium detectors in conventional arrays also works as an active filter against γ rays originating outside the target, the new generation of position-sensitive γ-ray detector arrays has to rely fully on tracking capabilities for this purpose. Under specific experimental conditions, such as those foreseen at radioactive beam facilities, the ability to discriminate background radiation improves the sensitivity of the gamma spectrometer. In this work we present the results of a measurement performed at the Laboratori Nazionali di Legnaro (LNL) aimed at evaluating the capability of an AGATA detector to discriminate the origin of γ rays on an event-by-event basis. It is shown that, by exploiting the Compton scattering formula, it is possible to track back γ rays coming from different positions and assign them to specific emitting locations. These imaging capabilities are quantified for a single AGATA crystal.
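
    The Compton scattering formula that such tracking exploits relates the scattering angle to the incident and scattered photon energies; a minimal sketch of the kinematics (the tracking algorithms compare this kinematic angle with the geometric angle defined by the measured interaction points):

```python
import math

ME_C2 = 511.0  # electron rest energy in keV

def compton_angle(e_in_kev, e_scat_kev):
    """Scattering angle (degrees) from the Compton formula:
    cos(theta) = 1 - me*c^2 * (1/E' - 1/E)."""
    cos_t = 1.0 - ME_C2 * (1.0 / e_scat_kev - 1.0 / e_in_kev)
    if not -1.0 <= cos_t <= 1.0:
        raise ValueError("energies not kinematically consistent")
    return math.degrees(math.acos(cos_t))

def scattered_energy(e_in_kev, theta_deg):
    """Energy of the scattered photon at angle theta."""
    c = math.cos(math.radians(theta_deg))
    return e_in_kev / (1.0 + (e_in_kev / ME_C2) * (1.0 - c))

# Example: a 1332 keV photon scattered through 90 degrees.
e_scat = scattered_energy(1332.0, 90.0)
```

    Events whose kinematic and geometric angles disagree beyond the position and energy resolution can be rejected as background, which is the basis of the discrimination quantified in the paper.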

  18. Forest Stand Segmentation Using Airborne LIDAR Data and Very High Resolution Multispectral Imagery

    Science.gov (United States)

    Dechesne, Clément; Mallet, Clément; Le Bris, Arnaud; Gouet, Valérie; Hervieu, Alexandre

    2016-06-01

    Forest stands are the basic units for forest inventory and mapping. Stands are large forested areas (e.g., ≥ 2 ha) of homogeneous tree species composition. The accurate delineation of forest stands is usually performed by visual analysis of very high resolution (VHR) optical images by human operators. This task is highly time-consuming and should be automated for scalability purposes. In this paper, a method based on the fusion of airborne laser scanning data (lidar) and very high resolution multispectral imagery for automatic forest stand delineation and forest land-cover database update is proposed. The multispectral images give access to the tree species, whereas the 3D lidar point clouds provide geometric information on the trees. Therefore, multi-modal features are computed at both the pixel and object levels. The objects are individual trees extracted from the lidar data. A supervised classification is performed at the object level on the computed features in order to coarsely discriminate the tree species present in the area of interest. The analysis at tree level is particularly relevant since it significantly improves the tree species classification. A probability map is generated through the tree species classification and combined with the pixel-based feature map in an energy-minimization framework. The proposed energy is then minimized using a standard graph-cut method (namely QPBO with α-expansion) in order to produce a segmentation map with a controlled level of detail. Comparison with an existing forest land-cover database shows that our method provides satisfactory results both in terms of stand labelling and delineation (matching rates between 94% and 99%).

  19. Fused silica segments: a possible solution for x-ray telescopes with very high angular resolution like Lynx/XRS

    Science.gov (United States)

    Salmaso, Bianca; Basso, Stefano; Civitani, Marta; Ghigo, Mauro; Hołyszko, Joanna; Spiga, Daniele; Vecchi, Gabriele; Pareschi, Giovanni

    2017-09-01

    In order to look beyond Chandra, the Lynx/XRS mission has been proposed in the USA and is currently being studied by NASA. The optics will have an effective area of 2.5 m2 and an angular resolution of 0.5 arcsec HEW at 1 keV. In order to fulfill these requirements, different technologies are being considered, following both full-shell and segmented-shell approaches (which could possibly also be combined). Concerning the production of segmented mirrors, a variety of thin substrates (glass, metal, silicon) are envisaged, which can be produced using either direct polishing or replication methods. Innovative post-fabrication correction methods (such as piezoelectric or magnetostrictive film actuators on the back surface, differential deposition, and ion implantation) are also being considered in order to reach the final tolerances. In this paper we present a technology development based on fused silica (SiO2) segmented substrates, chosen owing to the low coefficient of thermal expansion of fused silica and its high chemical stability compared to other glasses. Thin SiO2 segmented substrates (typically 2 mm thick) are figured by direct polishing combined with a final ion-figuring profile correction, while the roughness reduction is achieved with pitch tools. For the profile and roughness correction, the segments are glued to a substrate. In this paper we present the current status of this technology.

  20. Segmentation of solid subregion of high grade gliomas in MRI images based on active contour model (ACM)

    Science.gov (United States)

    Seow, P.; Win, M. T.; Wong, J. H. D.; Abdullah, N. A.; Ramli, N.

    2016-03-01

    Gliomas are tumours arising from the interstitial tissue of the brain; they are heterogeneous, infiltrative and possess ill-defined borders. Tumour subregions (e.g. the solid enhancing part, edema and necrosis) are often used for tumour characterisation. Tumour demarcation into substructures facilitates glioma staging and provides essential information. Manual segmentation has several drawbacks: it is laborious, time-consuming, subject to intra- and inter-rater variability, and hindered by the diversity in the appearance of tumour tissues. In this work, an active contour model (ACM) was used to segment the solid enhancing subregion of the tumour. 2D brain image acquisition data, obtained using a 3T MRI fast spoiled gradient echo sequence post gadolinium in four histologically proven high-grade glioma patients, were used. Preprocessing of the images, which included subtraction and skull stripping, was performed, followed by ACM segmentation. The results of the automatic segmentation method were compared against manual delineation of the tumour by a trainee radiologist. Both results were further validated by an experienced neuroradiologist, and brief quantitative evaluations (pixel area and difference ratio) were performed. Preliminary results on the clinical data showed the potential of the ACM model for fast and large-scale tumour segmentation in medical imaging.
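
    The quantitative comparison of an automatic mask against a manual delineation can be sketched with simple overlap and area measures; the "difference ratio" below is one plausible reading of the metric named in the abstract, not necessarily the authors' exact definition:

```python
import numpy as np

def dice(a, b):
    """Dice similarity between two boolean masks."""
    a, b = a.astype(bool), b.astype(bool)
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

def area_difference_ratio(auto, manual):
    """Relative pixel-area difference between the two segmentations
    (an assumed definition; the paper's exact formula is not given
    in the abstract)."""
    return abs(int(auto.sum()) - int(manual.sum())) / manual.sum()

# Toy masks standing in for the ACM output and the manual delineation.
manual = np.zeros((64, 64), bool); manual[20:40, 20:40] = True   # 400 px
auto = np.zeros((64, 64), bool);   auto[22:40, 20:42] = True     # 396 px
```

    Both measures compare only pixel areas and overlap; boundary-distance metrics would be needed to assess the contour itself.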

  1. Segmentation of solid subregion of high grade gliomas in MRI images based on active contour model (ACM)

    International Nuclear Information System (INIS)

    Seow, P; Win, M T; Wong, J H D; Ramli, N; Abdullah, N A

    2016-01-01

    Gliomas are tumours arising from the interstitial tissue of the brain; they are heterogeneous, infiltrative and possess ill-defined borders. Tumour subregions (e.g. the solid enhancing part, edema and necrosis) are often used for tumour characterisation. Tumour demarcation into substructures facilitates glioma staging and provides essential information. Manual segmentation has several drawbacks: it is laborious, time-consuming, subject to intra- and inter-rater variability, and hindered by the diversity in the appearance of tumour tissues. In this work, an active contour model (ACM) was used to segment the solid enhancing subregion of the tumour. 2D brain image acquisition data, obtained using a 3T MRI fast spoiled gradient echo sequence post gadolinium in four histologically proven high-grade glioma patients, were used. Preprocessing of the images, which included subtraction and skull stripping, was performed, followed by ACM segmentation. The results of the automatic segmentation method were compared against manual delineation of the tumour by a trainee radiologist. Both results were further validated by an experienced neuroradiologist, and brief quantitative evaluations (pixel area and difference ratio) were performed. Preliminary results on the clinical data showed the potential of the ACM model for fast and large-scale tumour segmentation in medical imaging. (paper)

  2. High-Resolution Isotropic Three-Dimensional MR Imaging of the Extraforaminal Segments of the Cranial Nerves.

    Science.gov (United States)

    Wen, Jessica; Desai, Naman S; Jeffery, Dean; Aygun, Nafi; Blitz, Ari

    2018-02-01

    High-resolution isotropic three-dimensional (3D) MR imaging with and without contrast is now routinely used for the imaging evaluation of cranial nerve anatomy and pathologic conditions. The anatomic details of the extraforaminal segments are well visualized with these techniques. A wide range of pathologic entities may cause enhancement or displacement of the nerve, which is now visible to an extent not available with standard 2D imaging. This article highlights the anatomy of the extraforaminal segments of the cranial nerves and uses select cases to illustrate the utility and power of these sequences, with a focus on constructive interference in steady state. Copyright © 2017 Elsevier Inc. All rights reserved.

  3. Segmentation of high-resolution InSAR data of tropical forest using Fourier parameterised deformable models

    NARCIS (Netherlands)

    Varekamp, C.; Hoekman, D.H.

    2001-01-01

    Currently, tree maps are produced from field measurements that are time consuming and expensive. Application of existing techniques based on aerial photography is often hindered by cloud cover. This has initiated research into the segmentation of high resolution airborne interferometric Synthetic

  4. High-permeability criterion for BCS classification: segmental/pH dependent permeability considerations.

    Science.gov (United States)

    Dahan, Arik; Miller, Jonathan M; Hilfinger, John M; Yamashita, Shinji; Yu, Lawrence X; Lennernäs, Hans; Amidon, Gordon L

    2010-10-04

    The FDA classifies a drug substance as high-permeability when the fraction of dose absorbed (F(abs)) in humans is 90% or higher. This direct correlation between human permeability and F(abs) has recently been controversial, since the β-blocker sotalol showed high F(abs) (90%) and low Caco-2 permeability. The purpose of this study was to investigate the scientific basis for this disparity between permeability and F(abs). The effective permeabilities (P(eff)) of sotalol and metoprolol, an FDA standard for the low/high P(eff) class boundary, were investigated in the rat perfusion model, in three different intestinal segments with pHs corresponding to the physiological pH in each region: (1) proximal jejunum, pH 6.5; (2) mid small intestine, pH 7.0; and (3) distal ileum, pH 7.5. Both metoprolol and sotalol showed pH-dependent permeability, with higher P(eff) at higher pH. At any given pH, sotalol showed lower permeability than metoprolol; however, the permeability of sotalol determined at pH 7.5 exceeded/matched metoprolol's at pH 6.5 and 7.0, respectively. Physicochemical analysis based on ionization, pK(a) and partitioning of these drugs predicted the same trend and clarified the mechanism behind the observed results. Octanol-buffer partitioning experiments confirmed the theoretical curves. An oral dose of metoprolol has been reported to be completely absorbed in the upper small intestine; it follows that metoprolol's P(eff) value at pH 7.5 is not likely physiologically relevant for an immediate-release dosage form, and the permeability at pH 6.5 represents the relevant value for the low/high permeability class boundary. Although sotalol's permeability is low at pH 6.5 and 7.0, at pH 7.5 it exceeds/matches the threshold of metoprolol at pH 6.5 and 7.0, which is most likely responsible for its high F(abs). In conclusion, we have shown that, in fact, there is no discrepancy between P(eff) and F(abs) in sotalol's absorption; the data emphasize that
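
    The pH dependence of the unionized (membrane-permeant) fraction described above follows the Henderson-Hasselbalch relationship; a short sketch with an assumed pKa for a basic β-blocker (an illustrative value, not one from the paper):

```python
def fraction_unionized_base(pka, ph):
    """Henderson-Hasselbalch for a monoprotic base: the neutral,
    membrane-permeant fraction rises with pH."""
    return 1.0 / (1.0 + 10.0 ** (pka - ph))

# Assumed pKa for a basic beta-blocker (illustrative only).
pka = 9.5
for ph in (6.5, 7.0, 7.5):
    f = fraction_unionized_base(pka, ph)
    print(f"pH {ph}: unionized fraction = {f:.2e}")
```

    For a base with pKa well above the intestinal pH range, each unit increase in pH raises the unionized fraction roughly tenfold, which is the mechanism behind the higher P(eff) observed in the distal ileum.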

  5. A High Torque Segmented Outer Rotor Permanent Magnet Flux Switching Motor for Motorcycle Propulsion

    Directory of Open Access Journals (Sweden)

    Mbadiwe I Enwelum

    2018-01-01

    Full Text Available Electric scooters, also known as electric motorcycles, are a viable personal means of road transportation and have been making their way into world markets, because in them the combustion engine and its fuel oil have been completely eliminated for economic and environmental imperatives. An electric motor, which converts electrical energy into mechanical energy, is used to avoid the complications of the combustion engine. As a result, manufacturers have exhibited interest, making research on electric motors very attractive. Meanwhile, a surface permanent magnet synchronous motor (SPMSM) has been successfully developed with an output torque of 110 Nm, but the motor assembly lacked mechanical strength between the rotor yoke and the mounted permanent magnets (PMs), which heat up during high-speed operation, resulting in poor performance. To overcome the challenges faced by the SPMSM, this paper presents a novel design of a 24-stator, 14-pole segmented outer rotor permanent magnet flux switching motor (SOR-PMFSM) capable of high torque and high performance. It employs an unconventional segmented rotor with a short flux path, and it adopts alternate stator tooth windings to reduce material cost. The design specifications and input DC current restriction are the same as for the SPMSM. 2D-FEA with JMAG version 14 is used to examine the performance of the proposed motor in terms of cogging torque, back-EMF, average torque, power and efficiency. Preliminary results show that the torque and power output of the proposed motor are 1.9 and 5.8 times those of the SPMSM, respectively, with an efficiency of 84%, so the motor can sustain acceleration for long-distance travel.

  6. Recoil distance method lifetime measurements of the 2⁺₁ excited states in ⁸⁴Kr and ⁹⁴Sr

    OpenAIRE

    Chester, Aaron Stuart

    2017-01-01

    Intense re-accelerated beams delivered by the Isotope Separator and Accelerator (ISAC-II) facility at TRIUMF, Canada’s national laboratory for particle and nuclear physics, permit access to nuclear structure information for a wide range of radionuclides via in-beam γ-ray spectroscopy with the TRIUMF-ISAC Gamma-Ray Escape Suppressed Spectrometer (TIGRESS), a high-efficiency and Compton-suppressed segmented high-purity germanium (HPGe) detector array. Electromagnetic transition rates measured v...

  7. Individual Building Rooftop and Tree Crown Segmentation from High-Resolution Urban Aerial Optical Images

    Directory of Open Access Journals (Sweden)

    Jichao Jiao

    2016-01-01

    Full Text Available We segment buildings and trees from aerial photographs using superpixels, and we estimate tree parameters using a cost function proposed in this paper. A method based on image complexity is proposed to refine superpixel boundaries. In order to distinguish buildings from ground and trees from grass, salient feature vectors that include colors, Features from Accelerated Segment Test (FAST) corners, and Gabor edges are extracted from the refined superpixels. The vectors are used to train a Naive Bayes classifier, which then classifies refined superpixels as object or non-object. The properties of a tree, including its location and radius, are estimated by minimizing the cost function. The shadow is used to calculate the tree height from the sun angle and the time when the image was taken. Our segmentation algorithm is compared with two other state-of-the-art segmentation algorithms, and the tree parameters obtained in this paper are compared to ground truth data. Experiments show that the proposed method segments trees and buildings appropriately, yielding higher precision and better recall rates, and that the tree parameters are in good agreement with the ground truth data.
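
    The shadow-based height estimate mentioned above reduces to simple trigonometry once the solar elevation is known; a minimal sketch (the paper derives the sun angle from the acquisition time and location, which is omitted here):

```python
import math

def tree_height_from_shadow(shadow_len_m, sun_elevation_deg):
    """Height from shadow length and solar elevation: h = L * tan(alpha).
    The solar elevation angle is assumed to be given directly."""
    return shadow_len_m * math.tan(math.radians(sun_elevation_deg))

# Example: a 6 m shadow with the sun 45 degrees above the horizon.
h = tree_height_from_shadow(6.0, 45.0)
```

    In practice the shadow must be measured along the solar azimuth on flat ground; sloped terrain or occluded shadows require corrections.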

  8. Fully automatic segmentation of femurs with medullary canal definition in high and in low resolution CT scans.

    Science.gov (United States)

    Almeida, Diogo F; Ruben, Rui B; Folgado, João; Fernandes, Paulo R; Audenaert, Emmanuel; Verhegghe, Benedict; De Beule, Matthieu

    2016-12-01

    Femur segmentation can be an important tool in orthopedic surgical planning. However, in order to remove the need for an experienced user with extensive knowledge of the techniques, segmentation should be fully automatic. In this paper a new fully automatic femur segmentation method for CT images is presented. The method is also able to define the medullary canal automatically and performs well even in low-resolution CT scans. Fully automatic femoral segmentation was performed by adapting a template mesh of the femoral volume to the medical images. To achieve this, an adaptation of the active shape model (ASM) technique based on the statistical shape model (SSM) and local appearance model (LAM) of the femur, with a novel initialization method, was used to drive the template mesh deformation to fit the in-image femoral shape in a time-effective approach. With the proposed method a 98% convergence rate was achieved. For the high-resolution CT image group, the average error is less than 1 mm. For the low-resolution image group the results are also accurate, with an average error of less than 1.5 mm. The proposed segmentation pipeline is accurate, robust and requires no user interaction. The method is robust to patient orientation, image artifacts and poorly defined edges. The results excelled even in CT images with a significant slice thickness, i.e., above 5 mm. Medullary canal segmentation increases the geometric information that can be used in orthopedic surgical planning or in finite element analysis. Copyright © 2016 IPEM. Published by Elsevier Ltd. All rights reserved.
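
    The statistical shape model underlying the ASM step can be sketched as a PCA over aligned training shapes; a generic illustration on toy data, with the alignment, local appearance model and novel initialization all omitted:

```python
import numpy as np

def build_ssm(shapes):
    """Statistical shape model: mean shape plus principal modes of
    variation from aligned training shapes (n_shapes x n_coords)."""
    mean = shapes.mean(axis=0)
    centered = shapes - mean
    # SVD of the centered data gives the eigenmodes of the covariance.
    _, s, vt = np.linalg.svd(centered, full_matrices=False)
    variances = s ** 2 / (len(shapes) - 1)
    return mean, vt, variances

def synthesize(mean, modes, b):
    """New shape = mean + sum_i b_i * mode_i (in practice b_i is
    limited to about +/- 3 standard deviations per mode)."""
    return mean + b @ modes[: len(b)]

rng = np.random.default_rng(1)
# 20 toy 'shapes', each 10 landmarks in 2D flattened to 20 coordinates.
shapes = rng.normal(size=(20, 20)) * np.linspace(2.0, 0.1, 20)
mean, modes, var = build_ssm(shapes)
new_shape = synthesize(mean, modes, np.array([1.0, -0.5]))
```

    During fitting, the mode coefficients b constrain the deforming template mesh to statistically plausible femoral shapes while the appearance model pulls each landmark toward the image evidence.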

  9. Robust automatic high resolution segmentation of SOFC anode porosity in 3D

    DEFF Research Database (Denmark)

    Jørgensen, Peter Stanley; Bowen, Jacob R.

    2008-01-01

    Routine use of 3D characterization of SOFCs by focused ion beam (FIB) serial sectioning is generally restricted by the time-consuming task of manually delineating structures within each image slice. We apply advanced image analysis algorithms to automatically segment the porosity phase of an SOFC...... anode in 3D. The technique is based on numerical approximations to partial differential equations that evolve a 3D surface to the desired phase boundary. Vector fields derived from the experimentally acquired data are used as the driving force. The automatic segmentation compared to manual delineation...... reveals a good correspondence, and the two approaches are quantitatively compared. It is concluded that the automatic approach is more robust, more reproducible and orders of magnitude quicker than manual segmentation of SOFC anode porosity for subsequent quantitative 3D analysis. Lastly...

  10. Multiblock copolymers with highly sulfonated blocks containing di- and tetrasulfonated arylene sulfone segments for proton exchange membrane fuel cell applications

    Energy Technology Data Exchange (ETDEWEB)

    Takamuku, Shogo; Jannasch, Patric [Polymer and Materials Chemistry, Department of Chemistry, Lund University (Sweden)

    2012-01-15

    Multiblock copoly(arylene ether sulfone)s with different block lengths and ionic contents are tailored for durable, proton-conducting electrolyte membranes. Two series of fully aromatic copolymers are prepared by coupling reactions between non-sulfonated hydrophobic precursor blocks and highly sulfonated hydrophilic precursor blocks containing either fully disulfonated diarylsulfone or fully tetrasulfonated tetraaryldisulfone segments. The sulfonic acid groups are introduced exclusively in ortho positions to the sulfone bridges to impede desulfonation reactions and give the blocks ion exchange capacities (IECs) of 4.1 and 4.6 meq. g{sup -1}, respectively. Solvent-cast block copolymer membranes show well-connected hydrophilic nanophase domains for proton transport and high decomposition temperatures, above 310 °C under air. Despite their higher IEC values, membranes containing tetrasulfonated tetraaryldisulfone segments display a markedly lower water uptake than the corresponding ones with disulfonated diarylsulfone segments when immersed in water at 100 °C, presumably because of the much higher chain stiffness and glass transition temperature of the former segments. The former membranes have proton conductivities on the level of a perfluorosulfonic acid membrane (NRE212) under fully humidified conditions. A membrane with an IEC of 1.83 meq. g{sup -1} reaches above 6 mS cm{sup -1} at 30% relative humidity and 80 °C, to be compared with 10 mS cm{sup -1} for NRE212 under the same conditions. (Copyright © 2012 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  11. Absolute standardization of radionuclides with complex decay by the peak-sum coincidence method and photon spectrometry with HPGe detector; Padronização primária de radionuclídeos com decaimento complexo pelo método de coincidência pico-soma espectrometria de fótons com detector GeHP

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Ronaldo Lins da

    2017-07-01

    This study presents a new methodology for the absolute standardization of {sup 133}Ba, a radionuclide with a complex decay, using the peak-sum coincidence method combined with gamma spectrometry with a high-resolution germanium detector. The use of direct matrix multiplication allowed all sum-coincidence energies to be identified, together with their detection probabilities, which made it possible to calculate the detection probabilities of the interfering energies. In addition, with the use of deconvolution software it was possible to obtain the peak areas free of interference from other sums, and by means of the equation deduced for the peak-sum method it was possible to standardize {sup 133}Ba. The resulting activity was compared with those found by the absolute methods existing at the LNMRI, among which the peak-sum coincidence result stood out. The estimated uncertainties were below 0.30%, compatible with the results found in the literature for other absolute methods. Thus, the methodology was shown to standardize the radionuclide {sup 133}Ba with precision, accuracy, ease and speed. The relevance of this doctoral thesis is to provide the National Metrology Laboratory of Ionizing Radiation (LNMRI) with a new absolute standardization methodology for radionuclides with complex decay. (author)

  12. Sampling strategies for subsampled segmented EPI PRF thermometry in MR guided high intensity focused ultrasound

    Science.gov (United States)

    Odéen, Henrik; Todd, Nick; Diakite, Mahamadou; Minalga, Emilee; Payne, Allison; Parker, Dennis L.

    2014-01-01

    Purpose: To investigate k-space subsampling strategies to achieve fast, large field-of-view (FOV) temperature monitoring using segmented echo planar imaging (EPI) proton resonance frequency shift thermometry for MR guided high intensity focused ultrasound (MRgHIFU) applications. Methods: Five different k-space sampling approaches were investigated, varying sample spacing (equally vs nonequally spaced within the echo train), sampling density (variable sampling density in zero, one, and two dimensions), and utilizing sequential or centric sampling. Three of the schemes utilized sequential sampling with the sampling density varied in zero, one, and two dimensions, to investigate sampling the k-space center more frequently. Two of the schemes utilized centric sampling to acquire the k-space center with a longer echo time for improved phase measurements, and vary the sampling density in zero and two dimensions, respectively. Phantom experiments and a theoretical point spread function analysis were performed to investigate their performance. Variable density sampling in zero and two dimensions was also implemented in a non-EPI GRE pulse sequence for comparison. All subsampled data were reconstructed with a previously described temporally constrained reconstruction (TCR) algorithm. Results: The accuracy of each sampling strategy in measuring the temperature rise in the HIFU focal spot was measured in terms of the root-mean-square-error (RMSE) compared to fully sampled “truth.” For the schemes utilizing sequential sampling, the accuracy was found to improve with the dimensionality of the variable density sampling, giving values of 0.65 °C, 0.49 °C, and 0.35 °C for density variation in zero, one, and two dimensions, respectively. The schemes utilizing centric sampling were found to underestimate the temperature rise, with RMSE values of 1.05 °C and 1.31 °C, for variable density sampling in zero and two dimensions, respectively. Similar subsampling schemes
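
    The idea of variable-density subsampling, with the k-space centre sampled more often than the periphery, can be sketched as a random 2D mask; this is a generic illustration of the sampling principle, not the EPI sequence implemented in the paper:

```python
import numpy as np

def vd_mask(ny, nx, target_fraction=0.25, power=2.0, seed=0):
    """Random variable-density undersampling mask in two dimensions:
    the sampling probability falls off with k-space radius, so the
    centre (contrast and phase information) is sampled most densely."""
    ky = np.linspace(-1, 1, ny)[:, None]
    kx = np.linspace(-1, 1, nx)[None, :]
    r = np.sqrt(ky ** 2 + kx ** 2)
    prob = (1.0 - np.clip(r, 0, 1)) ** power
    prob *= target_fraction * ny * nx / prob.sum()   # scale toward target rate
    rng = np.random.default_rng(seed)
    return rng.random((ny, nx)) < np.clip(prob, 0, 1)

mask = vd_mask(128, 128, target_fraction=0.25)
```

    A new random mask per time frame gives the incoherent, temporally varying sampling that constrained reconstructions such as TCR rely on.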

  13. High-resolution CISS MR imaging with and without contrast for evaluation of the upper cranial nerves: segmental anatomy and selected pathologic conditions of the cisternal through extraforaminal segments.

    Science.gov (United States)

    Blitz, Ari M; Macedo, Leonardo L; Chonka, Zachary D; Ilica, Ahmet T; Choudhri, Asim F; Gallia, Gary L; Aygun, Nafi

    2014-02-01

    The authors review the course and appearance of the major segments of the upper cranial nerves from their apparent origin at the brainstem through the proximal extraforaminal region, focusing on the imaging and anatomic features of particular relevance to high-resolution magnetic resonance imaging evaluation. Selected pathologic entities are included in the discussion of the corresponding cranial nerve segments for illustrative purposes. Copyright © 2014 Elsevier Inc. All rights reserved.

  14. A multi-scale tensor voting approach for small retinal vessel segmentation in high resolution fundus images.

    Science.gov (United States)

    Christodoulidis, Argyrios; Hurtut, Thomas; Tahar, Houssem Ben; Cheriet, Farida

    2016-09-01

    Segmenting the retinal vessels from fundus images is a prerequisite for many CAD systems for the automatic detection of diabetic retinopathy lesions. So far, research efforts have concentrated mainly on the accurate localization of the large to medium diameter vessels. However, failure to detect the smallest vessels at the segmentation step can lead to false positive lesion detection counts in a subsequent lesion analysis stage. In this study, a new hybrid method for the segmentation of the smallest vessels is proposed. Line detection and perceptual organization techniques are combined in a multi-scale scheme. Small vessels are reconstructed from the perceptual-based approach via tracking and pixel painting. The segmentation was validated in a high resolution fundus image database including healthy and diabetic subjects using pixel-based as well as perceptual-based measures. The proposed method achieves 85.06% sensitivity rate, while the original multi-scale line detection method achieves 81.06% sensitivity rate for the corresponding images (p<0.05). The improvement in the sensitivity rate for the database is 6.47% when only the smallest vessels are considered (p<0.05). For the perceptual-based measure, the proposed method improves the detection of the vasculature by 7.8% against the original multi-scale line detection method (p<0.05). Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. Design and Fabrication of High Gain Multi-element Multi-segment Quarter-sector Cylindrical Dielectric Resonator Antenna

    Science.gov (United States)

    Ranjan, Pinku; Gangwar, Ravi Kumar

    2017-12-01

    A novel design and analysis of a quarter cylindrical dielectric resonator antenna (q-CDRA) with a multi-element and multi-segment (MEMS) approach is presented. The MEMS q-CDRA was designed by splitting a solid cylinder into four identical quarters and then applying a multi-segmentation approach to the q-CDRA design. The proposed antenna is designed for bandwidth enhancement as well as high gain. For bandwidth enhancement, the multi-segmentation method is explained with respect to the selection of the dielectric constants of the materials. The performance of the proposed MEMS q-CDRA is demonstrated with design guidelines for the MEMS approach. To validate the antenna performance, a three-segment q-CDRA was fabricated and analyzed experimentally. The simulated results are in good agreement with the measured ones. The MEMS q-CDRA has a wide impedance bandwidth (|S11| ≤ -10 dB) of 133.8% with a monopole-like radiation pattern. It operates in the TM01δ mode with a measured gain of 6.65 dBi and a minimum gain of 4.5 dBi over the entire operating frequency band (5.1-13.7 GHz). The proposed MEMS q-CDRA may find applications in the WiMAX and WLAN bands.

  16. Automated analysis of high-throughput B-cell sequencing data reveals a high frequency of novel immunoglobulin V gene segment alleles.

    Science.gov (United States)

    Gadala-Maria, Daniel; Yaari, Gur; Uduman, Mohamed; Kleinstein, Steven H

    2015-02-24

    Individual variation in germline and expressed B-cell immunoglobulin (Ig) repertoires has been associated with aging, disease susceptibility, and differential response to infection and vaccination. Repertoire properties can now be studied at large-scale through next-generation sequencing of rearranged Ig genes. Accurate analysis of these repertoire-sequencing (Rep-Seq) data requires identifying the germline variable (V), diversity (D), and joining (J) gene segments used by each Ig sequence. Current V(D)J assignment methods work by aligning sequences to a database of known germline V(D)J segment alleles. However, existing databases are likely to be incomplete and novel polymorphisms are hard to differentiate from the frequent occurrence of somatic hypermutations in Ig sequences. Here we develop a Tool for Ig Genotype Elucidation via Rep-Seq (TIgGER). TIgGER analyzes mutation patterns in Rep-Seq data to identify novel V segment alleles, and also constructs a personalized germline database containing the specific set of alleles carried by a subject. This information is then used to improve the initial V segment assignments from existing tools, like IMGT/HighV-QUEST. The application of TIgGER to Rep-Seq data from seven subjects identified 11 novel V segment alleles, including at least one in every subject examined. These novel alleles constituted 13% of the total number of unique alleles in these subjects, and impacted 3% of V(D)J segment assignments. These results reinforce the highly polymorphic nature of human Ig V genes, and suggest that many novel alleles remain to be discovered. The integration of TIgGER into Rep-Seq processing pipelines will increase the accuracy of V segment assignments, thus improving B-cell repertoire analyses.

  17. Calculation of HPGe Detector Response for NRF Photons Scattered from Threat Materials

    International Nuclear Information System (INIS)

    Park, B. G.; Choi, H. D.

    2009-01-01

    Nuclear Resonance Fluorescence (NRF) is a process of resonant nuclear absorption of photons, followed by deexcitation with emission of fluorescence photons. The peak cross section of the NRF process is given by σ_i^max ≡ 2π(λ/2π)² · (2J+1)/(2J₀+1) · Γ₀Γ_i/Γ_tot², where λ is the wavelength of the photon, J₀ and J are the nuclear spins of the ground state and the excited state, respectively, and Γ₀, Γ_i and Γ_tot are the decay widths for deexcitation to the ground state, for deexcitation to the i-th state, and the total decay width, respectively. NRF-based security inspection techniques use the signatures of the resonance energies of fluorescence photons scattered from nuclides of illicit materials in a cargo container. NRF can be used to identify the material type, quantity and location. This is performed by measuring the fluorescence-photon and transmitted-photon spectra while irradiating the sample with a bremsstrahlung photon beam.
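
    The peak cross-section formula above can be evaluated directly once the level parameters are known; a short sketch with illustrative values (not a specific measured resonance), ignoring Doppler broadening, which lowers the effective peak value in practice:

```python
import math

HBARC_MEV_FM = 197.3269804  # hbar*c in MeV*fm

def nrf_peak_cross_section(e_mev, j0, j, gamma0_ev, gamma_i_ev, gamma_tot_ev):
    """Peak NRF cross section (barns) from the Breit-Wigner form quoted
    above: sigma_max = 2*pi*(lambda/2pi)^2 * (2J+1)/(2J0+1)
                       * Gamma0*Gamma_i / Gamma_tot^2."""
    lambda_bar = HBARC_MEV_FM / e_mev            # reduced wavelength in fm
    spin_factor = (2 * j + 1) / (2 * j0 + 1)
    sigma_fm2 = (2 * math.pi * lambda_bar ** 2 * spin_factor
                 * gamma0_ev * gamma_i_ev / gamma_tot_ev ** 2)
    return sigma_fm2 / 100.0                     # 1 barn = 100 fm^2

# Illustrative case: a 4 MeV dipole state (J=1) in an even-even
# nucleus (J0=0) that decays only to the ground state.
sigma_b = nrf_peak_cross_section(4.0, j0=0, j=1, gamma0_ev=0.1,
                                 gamma_i_ev=0.1, gamma_tot_ev=0.1)
```

    The resulting cross sections of hundreds of barns at the (narrow) resonance energy are what make the NRF signatures detectable above the bremsstrahlung continuum.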

  18. A method for the determination of counting efficiencies in γ-spectrometric measurements with HPGe detectors

    International Nuclear Information System (INIS)

    Bolivar, J.P.; Garcia-Leon, M.

    1996-01-01

    In this paper a general method for γ-ray efficiency calibration is presented. The method takes into account the differences in density and counting geometry between the real sample and the calibration sample. It is based on the γ-transmission method and gives the correction factor f as a function of E_γ, the density and the counting geometry. Although developed for soil samples, its underlying working philosophy is useful for any sample whose geometry can be adequately reproduced. (orig.)
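
    The abstract does not give the explicit form of the correction factor f, but a common γ-transmission treatment models the sample as a homogeneous slab, whose average self-absorption follows from the measured transmission T = e^(−μt) of an external beam through the sample. A sketch under that assumption (function names are illustrative):

```python
import math

def self_absorption_factor(T):
    """Average photon self-absorption factor of a homogeneous slab source,
    written in terms of the measured transmission T = exp(-mu * t) of an
    external beam through the sample: A = (1 - T) / (-ln T), with A -> 1
    as the sample becomes transparent."""
    if not 0.0 < T <= 1.0:
        raise ValueError("transmission must lie in (0, 1]")
    if T == 1.0:
        return 1.0
    return (1.0 - T) / (-math.log(T))

def efficiency_correction(T_sample, T_calibration):
    """Factor f that rescales the calibration-sample efficiency to the real
    sample's density/geometry: f = A_sample / A_calibration."""
    return self_absorption_factor(T_sample) / self_absorption_factor(T_calibration)
```

    Measuring T at each γ-ray energy of interest then gives f as a function of E_γ and density without re-deriving μ explicitly.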

  19. X-ray fluorescence analysis in environmental radiological surveillance using HPGe detectors

    International Nuclear Information System (INIS)

    Herrera Peraza, E.; Renteria Villalobos, M.; Montero Cabrera, M.E.; Munoz Romero, A.

    2004-01-01

    X-ray fluorescence (XRF) has been proven to be a valuable tool for determining trace quantities of heavy metals, such as uranium and lead, in different types of samples. The present paper demonstrates the applicability of XRF spectrometry to measure the concentrations of these heavy metals in samples from natural ore and soil. The uranium concentrations in rock from the Pena Blanca uranium ore, in Chihuahua, Mexico, were calculated for the purpose of precertifying the rock powder samples. Comparison with other techniques, such as inductively coupled plasma atomic emission spectrometry, atomic absorption spectrometry, alpha spectrometry and electron microscopy, was used to complete the precertification process, so that the sample powders may be used as secondary standards. The source-sample-detector geometry and the incident angle are the most important factors for obtaining low detection limits. The selected system uses a 57Co source of about 0.1 mCi to excite the K X-rays of uranium and lead. X-rays were recorded with a CANBERRA HPGe coaxial detector. Comparative results for two incident angles (90° and 180°), obtained previously by other authors, show that the best geometry is the backscattering geometry. In the present paper, using Monte Carlo simulation with the EGS4 code system, it was possible to determine the location and distribution of the background produced by the Compton edge in the optimized geometry. This procedure made it possible to find the minimum detectable concentrations of uranium and lead, which were experimentally determined using standards. The possibility of performing in vivo measurements rapidly and easily, as well as the factors affecting accuracy and the minimum detectable concentration in several samples, are also discussed.

  20. X-ray fluorescence analysis in environmental radiological surveillance using HPGe detectors

    Energy Technology Data Exchange (ETDEWEB)

    Herrera Peraza, E. [Department of Environmental Radiological Surveillance, Centro de Investigacion en Materiales Avanzados (CIMAV), P.O. Box 31109, Miguel de Cervantes no. 120, Complejo Industrial Chihuahua, Chihuahua (Mexico)]. E-mail: eduardo.herrera@cimav.edu.mx; Renteria Villalobos, M. [Department of Environmental Radiological Surveillance, Centro de Investigacion en Materiales Avanzados (CIMAV), P.O. Box 31109, Miguel de Cervantes no. 120, Complejo Industrial Chihuahua, Chihuahua (Mexico); Montero Cabrera, M.E. [Department of Environmental Radiological Surveillance, Centro de Investigacion en Materiales Avanzados (CIMAV), P.O. Box 31109, Miguel de Cervantes no. 120, Complejo Industrial Chihuahua, Chihuahua (Mexico); Munoz Romero, A. [Department of Environmental Radiological Surveillance, Centro de Investigacion en Materiales Avanzados (CIMAV), P.O. Box 31109, Miguel de Cervantes no. 120, Complejo Industrial Chihuahua, Chihuahua (Mexico)

    2004-10-08

    X-ray fluorescence (XRF) has been proven to be a valuable tool for determining trace quantities of heavy metals, such as uranium and lead, in different types of samples. The present paper demonstrates the applicability of XRF spectrometry to measure the concentrations of these heavy metals in samples from natural ore and soil. The uranium concentrations in rock from the Pena Blanca uranium ore, in Chihuahua, Mexico, were calculated for the purpose of precertifying the rock powder samples. Comparison with other techniques, such as inductively coupled plasma atomic emission spectrometry, atomic absorption spectrometry, alpha spectrometry and electron microscopy, was used to complete the precertification process, so that the sample powders may be used as secondary standards. The source-sample-detector geometry and the incident angle are the most important factors for obtaining low detection limits. The selected system uses a {sup 57}Co source of about 0.1 mCi to excite the K X-rays of uranium and lead. X-rays were recorded with a CANBERRA HPGe coaxial detector. Comparative results for two incident angles (90° and 180°), obtained previously by other authors, show that the best geometry is the backscattering geometry. In the present paper, using Monte Carlo simulation with the EGS4 code system, it was possible to determine the location and distribution of the background produced by the Compton edge in the optimized geometry. This procedure made it possible to find the minimum detectable concentrations of uranium and lead, which were experimentally determined using standards. The possibility of performing in vivo measurements rapidly and easily, as well as the factors affecting accuracy and the minimum detectable concentration in several samples, are also discussed.
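
    The minimum detectable concentration discussed above is commonly estimated with the Currie detection limit, L_D = 2.71 + 4.65·√B counts for background B. The abstract does not state which expression the authors used, so the following is an illustrative sketch only:

```python
import math

def currie_mda(background_counts, efficiency, live_time_s, emission_prob, mass_g):
    """Minimum detectable activity concentration (Bq/g) from the Currie
    detection limit L_D = 2.71 + 4.65 * sqrt(B) counts, divided by the
    photopeak efficiency, counting live time, photon emission probability,
    and sample mass."""
    l_d = 2.71 + 4.65 * math.sqrt(background_counts)
    return l_d / (efficiency * live_time_s * emission_prob * mass_g)
```

    Lowering the Compton-edge background B in the optimized backscattering geometry directly lowers L_D, and with it the minimum detectable concentration.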

  1. Monte Carlo based geometrical model for efficiency calculation of an n-type HPGe detector

    Energy Technology Data Exchange (ETDEWEB)

    Padilla Cabal, Fatima, E-mail: fpadilla@instec.c [Instituto Superior de Tecnologias y Ciencias Aplicadas, ' Quinta de los Molinos' Ave. Salvador Allende, esq. Luaces, Plaza de la Revolucion, Ciudad de la Habana, CP 10400 (Cuba); Lopez-Pino, Neivy; Luis Bernal-Castillo, Jose; Martinez-Palenzuela, Yisel; Aguilar-Mena, Jimmy; D' Alessandro, Katia; Arbelo, Yuniesky; Corrales, Yasser; Diaz, Oscar [Instituto Superior de Tecnologias y Ciencias Aplicadas, ' Quinta de los Molinos' Ave. Salvador Allende, esq. Luaces, Plaza de la Revolucion, Ciudad de la Habana, CP 10400 (Cuba)

    2010-12-15

    A procedure to optimize the geometrical model of an n-type detector is described. Sixteen lines from seven point sources ({sup 241}Am, {sup 133}Ba, {sup 22}Na, {sup 60}Co, {sup 57}Co, {sup 137}Cs and {sup 152}Eu) placed at three different source-to-detector distances (10, 20 and 30 cm) were used to calibrate a low-background gamma spectrometer between 26 and 1408 keV. Direct Monte Carlo techniques using the MCNPX 2.6 and GEANT4 9.2 codes, together with a semi-empirical procedure, were applied to obtain theoretical efficiency curves. Since discrepancies were found between the experimental data and the data calculated using the manufacturer's detector parameters, a detailed study of the crystal dimensions and the geometrical configuration was carried out. After the parameters were optimized, the mean relative deviation from the experimental data decreased from 18% to 4%.

  2. Assessment of activity incorporated in human body by means of a HPGe-detector

    Energy Technology Data Exchange (ETDEWEB)

    Boshkova, T; Minev, L [Sofia Univ. (Bulgaria). Fizicheski Fakultet; Konstantinov, V [Kombinat Atomna Energetika, Kozloduj (Bulgaria)

    1996-12-31

    Human body models (phantoms) have been used to study the relations between the efficiency of radionuclide-activity measurements and the measurement conditions. A comparison is made between activity measurements with and without motion. When the phantom is moved under the detector, the absolute efficiency increases by 30-40% for homogeneously distributed activity and by a factor of 2-3 for activity concentrated in the lungs. Incorrect calibration is found to introduce an error of 30-40% in the measurement with motion, while without motion the error can reach 150-300%. The effects of object weight, size and shielding are studied. A procedure for accurate measurement has been developed, and the activity incorporated in a human body has been measured using it. Two Kozloduy-2 staff members were measured. Cs-137 activity was homogeneously distributed, with values of 175 Bq in person X and 1050 Bq in person Y. Co-60 activity was concentrated in the lungs, with values of 110 Bq and 470 Bq, respectively. 14 refs., 5 tabs.

  3. CREPT-MCNP code for efficiency calibration of HPGe detectors with the representative point method.

    Science.gov (United States)

    Saegusa, Jun

    2008-01-01

    The representative point method for the efficiency calibration of volume samples has been previously proposed. For smoothly implementing the method, a calculation code named CREPT-MCNP has been developed. The code estimates the position of a representative point which is intrinsic to each shape of volume sample. The self-absorption correction factors are also given to make correction on the efficiencies measured at the representative point with a standard point source. Features of the CREPT-MCNP code are presented.

  4. A detailed investigation of interactions within the shielding to HPGe detector response using MCNP code

    Energy Technology Data Exchange (ETDEWEB)

    Thanh, Tran Thien; Tao, Chau Van; Loan, Truong Thi Hong; Nhon, Mai Van; Chuong, Huynh Dinh; Au, Bui Hai [Vietnam National Univ., Ho Chi Minh City (Viet Nam). Dept. of Nuclear Physics

    2012-12-15

    The accuracy of coincidence-summing corrections in gamma spectrometry depends on the total efficiency calibration, which is difficult to obtain over the whole energy range because the required experimental conditions are not easily attained. Monte Carlo simulations using the MCNP5 code were performed in order to estimate the effect of the shielding on the total efficiency. The effect on the HPGe detector response is also shown. (orig.)

  5. Two efficiency correction functions of source self-absorption of an HPGe detector

    International Nuclear Information System (INIS)

    Gao Zheng; Ma Yusheng; Luo Jianghua; Chen Luning

    2007-01-01

    The efficiency correction function for source self-absorption of an HPGe γ detector was determined experimentally over the energy range from 59.5 keV to 1408 keV and the density range from 0.3 g/cm³ to 2.0 g/cm³. Polynomial and sigmoidal fits are compared. The results show that the sigmoidal fit is better than the polynomial fit, and with it the detection efficiency at any energy and density within the calibrated range can be conveniently calculated. (authors)
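
    A minimal sketch of such a comparison, fitting a Boltzmann-type sigmoid and a second-order polynomial to synthetic correction-factor data over the calibrated density range (the model form, the crude grid-search fitter, and the data are assumptions for illustration, not the authors' procedure):

```python
import numpy as np

def boltzmann(rho, a, x0, k):
    """Sigmoidal (Boltzmann) model for the density-dependent correction:
    f(rho) = a / (1 + exp((rho - x0) / k))."""
    return a / (1.0 + np.exp((rho - x0) / k))

def fit_sigmoid(rho, f, a_grid, x0_grid, k_grid):
    """Crude grid-search least-squares fit (a stand-in for a proper
    nonlinear fitter); returns the best (a, x0, k) and its SSE."""
    best, best_sse = None, np.inf
    for a in a_grid:
        for x0 in x0_grid:
            for k in k_grid:
                sse = float(np.sum((boltzmann(rho, a, x0, k) - f) ** 2))
                if sse < best_sse:
                    best, best_sse = (a, x0, k), sse
    return best, best_sse

# Synthetic correction-factor data over the calibrated density range.
rho = np.linspace(0.3, 2.0, 12)
truth = boltzmann(rho, 1.0, 1.1, 0.4)

params, sse_sig = fit_sigmoid(rho, truth,
                              np.linspace(0.8, 1.2, 9),
                              np.linspace(0.9, 1.3, 9),
                              np.linspace(0.2, 0.6, 9))
# Second-order polynomial fit for comparison.
sse_poly = float(np.sum((np.polyval(np.polyfit(rho, truth, 2), rho) - truth) ** 2))
```

    On S-shaped data like this, the quadratic leaves a systematic residual while the sigmoid does not, which mirrors the paper's conclusion that the sigmoidal form fits better.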

  6. Highly CO2-Selective Gas Separation Membranes Based on Segmented Copolymers of Poly(Ethylene oxide) Reinforced with Pentiptycene-Containing Polyimide Hard Segments.

    Science.gov (United States)

    Luo, Shuangjiang; Stevens, Kevin A; Park, Jae Sung; Moon, Joshua D; Liu, Qiang; Freeman, Benny D; Guo, Ruilan

    2016-01-27

    Poly(ethylene oxide) (PEO)-containing polymer membranes are attractive for CO2-related gas separations due to their high selectivity toward CO2. However, the development of PEO-rich membranes is frequently challenged by weak mechanical properties and a high crystallization tendency of PEO that hinders gas transport. Here we report a new series of highly CO2-selective, amorphous PEO-containing segmented copolymers prepared from commercial Jeffamine polyetheramines and pentiptycene-based polyimide. The copolymers are much more mechanically robust than the nonpentiptycene containing counterparts due to the molecular reinforcement mechanism of supramolecular chain threading and interlocking interactions induced by the pentiptycene structures, which also effectively suppresses PEO crystallization leading to a completely amorphous structure even at 60% PEO weight content. Membrane transport properties are sensitively affected by both PEO weight content and PEO chain length. A nonlinear correlation between CO2 permeability with PEO weight content was observed due to the competition between solubility and diffusivity contributions, whereby the copolymers change from being size-selective to solubility-selective when PEO content reaches 40%. CO2 selectivities over H2 and N2 increase monotonically with both PEO content and chain length, indicating strong CO2-philicity of the copolymers. The copolymer film with the longest PEO sequence (PEO2000) and highest PEO weight content (60%) showed a measured CO2 pure gas permeability of 39 Barrer, and ideal CO2/H2 and CO2/N2 selectivities of 4.1 and 46, respectively, at 35 °C and 3 atm, making them attractive for hydrogen purification and carbon capture.

  7. High-degree atrioventricular block complicating ST-segment elevation myocardial infarction in the era of primary percutaneous coronary intervention

    DEFF Research Database (Denmark)

    Gang, Uffe Jakob Ortved; Hvelplund, Anders; Pedersen, Sune

    2012-01-01

    Primary percutaneous coronary intervention (pPCI) has replaced thrombolysis as the treatment of choice for ST-segment elevation myocardial infarction (STEMI). However, the incidence and prognostic significance of high-degree atrioventricular block (HAVB) in STEMI patients in the pPCI era have been only sparsely investigated. The objective of this study was to assess the incidence, predictors and prognostic significance of HAVB in STEMI patients treated with pPCI.

  8. Parallel segmented outlet flow high performance liquid chromatography with multiplexed detection

    International Nuclear Information System (INIS)

    Camenzuli, Michelle; Terry, Jessica M.; Shalliker, R. Andrew; Conlan, Xavier A.; Barnett, Neil W.; Francis, Paul S.

    2013-01-01

    Highlights: •Multiplexed detection for liquid chromatography. •‘Parallel segmented outlet flow’ distributes inner and outer portions of the analyte zone. •Three detectors were used simultaneously for the determination of opiate alkaloids. -- Abstract: We describe a new approach to multiplex detection for HPLC, exploiting parallel segmented outlet flow – a new column technology that provides pressure-regulated control of eluate flow through multiple outlet channels, which minimises the additional dead volume associated with conventional post-column flow splitting. Using three detectors: one UV-absorbance and two chemiluminescence systems (tris(2,2′-bipyridine)ruthenium(III) and permanganate), we examine the relative responses for six opium poppy (Papaver somniferum) alkaloids under conventional and multiplexed conditions, where approximately 30% of the eluate was distributed to each detector and the remaining solution directed to a collection vessel. The parallel segmented outlet flow mode of operation offers advantages in terms of solvent consumption, waste generation, total analysis time and solute band volume when applying multiple detectors to HPLC, but the manner in which each detection system is influenced by changes in solute concentration and solution flow rates must be carefully considered.

  9. Parallel segmented outlet flow high performance liquid chromatography with multiplexed detection

    Energy Technology Data Exchange (ETDEWEB)

    Camenzuli, Michelle [Australian Centre for Research on Separation Science (ACROSS), School of Science and Health, University of Western Sydney (Parramatta), Sydney, NSW (Australia); Terry, Jessica M. [Centre for Chemistry and Biotechnology, School of Life and Environmental Sciences, Deakin University, Geelong, Victoria 3216 (Australia); Shalliker, R. Andrew, E-mail: r.shalliker@uws.edu.au [Australian Centre for Research on Separation Science (ACROSS), School of Science and Health, University of Western Sydney (Parramatta), Sydney, NSW (Australia); Conlan, Xavier A.; Barnett, Neil W. [Centre for Chemistry and Biotechnology, School of Life and Environmental Sciences, Deakin University, Geelong, Victoria 3216 (Australia); Francis, Paul S., E-mail: paul.francis@deakin.edu.au [Centre for Chemistry and Biotechnology, School of Life and Environmental Sciences, Deakin University, Geelong, Victoria 3216 (Australia)

    2013-11-25

    Highlights: •Multiplexed detection for liquid chromatography. •‘Parallel segmented outlet flow’ distributes inner and outer portions of the analyte zone. •Three detectors were used simultaneously for the determination of opiate alkaloids. -- Abstract: We describe a new approach to multiplex detection for HPLC, exploiting parallel segmented outlet flow – a new column technology that provides pressure-regulated control of eluate flow through multiple outlet channels, which minimises the additional dead volume associated with conventional post-column flow splitting. Using three detectors: one UV-absorbance and two chemiluminescence systems (tris(2,2′-bipyridine)ruthenium(III) and permanganate), we examine the relative responses for six opium poppy (Papaver somniferum) alkaloids under conventional and multiplexed conditions, where approximately 30% of the eluate was distributed to each detector and the remaining solution directed to a collection vessel. The parallel segmented outlet flow mode of operation offers advantages in terms of solvent consumption, waste generation, total analysis time and solute band volume when applying multiple detectors to HPLC, but the manner in which each detection system is influenced by changes in solute concentration and solution flow rates must be carefully considered.

  10. Automatic segmentation in three-dimensional analysis of fibrovascular pigmentepithelial detachment using high-definition optical coherence tomography.

    Science.gov (United States)

    Ahlers, C; Simader, C; Geitzenauer, W; Stock, G; Stetson, P; Dastmalchi, S; Schmidt-Erfurth, U

    2008-02-01

    A limited number of scans compromises the ability of conventional optical coherence tomography (OCT) to track chorioretinal disease in its full extension. Failures in edge-detection algorithms falsify the results of retinal mapping even further. High-definition OCT (HD-OCT) is based on raster scanning and was used to visualise the localisation and volume of intra- and sub-pigment-epithelial (RPE) changes in fibrovascular pigment epithelial detachments (fPED). Two different scanning patterns were evaluated. 22 eyes with fPED were imaged using a frequency-domain, high-speed prototype of the Cirrus HD-OCT. The axial resolution was 6 μm, and the scanning speed was 25,000 A-scans/s. Two different scanning patterns covering an area of 6 x 6 mm in the macular retina were compared. Three-dimensional topographic reconstructions and volume calculations were performed using MATLAB-based automatic segmentation software. Detailed information about the layer-specific distribution of fluid accumulation and volumetric measurements can be obtained for retinal and sub-RPE volumes. Both raster scans show a high correlation (p0.89) of measured values, that is, PED volume/area, retinal volume and mean retinal thickness. Quality control of the automatic segmentation revealed reasonable results in over 90% of the examinations. Automatic segmentation allows for detailed quantitative and topographic analysis of the RPE and the overlying retina. In fPED, the 128 x 512 scanning pattern shows mild advantages when compared with the 256 x 256 scan. Together with the ability for automatic segmentation, HD-OCT clearly improves the clinical monitoring of chorioretinal disease by adding relevant new parameters. HD-OCT is likely capable of enhancing the understanding of pathophysiology and benefits of treatment for current anti-CNV strategies in future.

  11. Segmented block copolymers with monodisperse aramide end-segments

    NARCIS (Netherlands)

    Araichimani, A.; Gaymans, R.J.

    2008-01-01

    Segmented block copolymers were synthesized using monodisperse diaramide (TT) as hard segments and PTMO with a molecular weight of 2900 g·mol⁻¹ as soft segments. The aramide:PTMO segment ratio was increased from 1:1 to 2:1, thereby changing the structure from a high molecular weight multi-block

  12. An automated, high-throughput plant phenotyping system using machine learning-based plant segmentation and image analysis.

    Science.gov (United States)

    Lee, Unseok; Chang, Sungyul; Putra, Gian Anantrio; Kim, Hyoungseok; Kim, Dong Hwan

    2018-01-01

    A high-throughput plant phenotyping system automatically observes and grows many plant samples. Many plant sample images are acquired by the system to determine the characteristics of the plants (populations). Stable image acquisition and processing is very important to accurately determine the characteristics. However, hardware for acquiring plant images rapidly and stably, while minimizing plant stress, is lacking. Moreover, most software cannot adequately handle large-scale plant imaging. To address these problems, we developed a new, automated, high-throughput plant phenotyping system using simple and robust hardware, and an automated plant-imaging-analysis pipeline consisting of machine-learning-based plant segmentation. Our hardware acquires images reliably and quickly and minimizes plant stress. Furthermore, the images are processed automatically. In particular, large-scale plant-image datasets can be segmented precisely using a classifier developed using a superpixel-based machine-learning algorithm (Random Forest), and variations in plant parameters (such as area) over time can be assessed using the segmented images. We performed comparative evaluations to identify an appropriate learning algorithm for our proposed system, and tested three robust learning algorithms. We developed not only an automatic analysis pipeline but also a convenient means of plant-growth analysis that provides a learning data interface and visualization of plant growth trends. Thus, our system allows end-users such as plant biologists to analyze plant growth via large-scale plant image data easily.

  13. High-contrast visible nulling coronagraph for segmented and arbitrary telescope apertures

    Science.gov (United States)

    Hicks, Brian A.; Lyon, Richard G.; Bolcar, Matthew R.; Clampin, Mark; Petrone, Peter

    2014-08-01

    Exoplanet coronagraphy will be driven by the telescope architectures available and thus the system designer must have available one or more suitable coronagraphic instrument choices that spans the set of telescope apertures, including filled (off-axis), obscured (e.g. with secondary mirror spiders and struts), segmented apertures, such as JWST, and interferometric apertures. In this work we present one such choice of coronagraph, known as the visible nulling coronagraph (VNC), that spans all four types of aperture and also employs differential sensing and control.

  14. Large microcalorimeter arrays for high-resolution X- and gamma-ray spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Hoover, A.S., E-mail: ahoover@lanl.gov [Los Alamos National Laboratory, Los Alamos, NM 87545 (United States); Hoteling, N.; Rabin, M.W. [Los Alamos National Laboratory, Los Alamos, NM 87545 (United States); Ullom, J.N.; Bennett, D.A. [National Institute of Standards and Technology, Boulder, CO 80305 (United States); Karpius, P.J.; Vo, D.T. [Los Alamos National Laboratory, Los Alamos, NM 87545 (United States); Doriese, W.B.; Hilton, G.C.; Horansky, R.D.; Irwin, K.D.; Kotsubo, V. [National Institute of Standards and Technology, Boulder, CO 80305 (United States); Lee, D.W. [Los Alamos National Laboratory, Los Alamos, NM 87545 (United States); Vale, L.R. [National Institute of Standards and Technology, Boulder, CO 80305 (United States)

    2011-10-01

    Microcalorimeter detectors provide unprecedented energy resolution for the measurement of X-rays and soft gamma-rays. Energy resolution in the 100 keV region can be up to an order of magnitude better than planar high-purity germanium (HPGe) detectors. The technology is well-suited to analysis of materials with complex spectra presenting closely spaced photopeaks. One application area is the measurement and assay of nuclear materials for safeguards and fuel cycle applications. In this paper, we discuss the operation and performance of a 256-pixel array, and present results of a head-to-head comparison of isotopic determination measurements with high-purity germanium using a plutonium standard. We show that the uncertainty of a single measurement is smaller for the microcalorimeter data compared to the HPGe data when photopeak areas are equal. We identify several key areas where analysis codes can be optimized that will likely lead to improvement in the microcalorimeter performance.

  15. Automated Slide Scanning and Segmentation in Fluorescently-labeled Tissues Using a Widefield High-content Analysis System.

    Science.gov (United States)

    Poon, Candice C; Ebacher, Vincent; Liu, Katherine; Yong, Voon Wee; Kelly, John James Patrick

    2018-05-03

    Automated slide scanning and segmentation of fluorescently-labeled tissues is the most efficient way to analyze whole slides or large tissue sections. Unfortunately, many researchers spend large amounts of time and resources developing and optimizing workflows that are only relevant to their own experiments. In this article, we describe a protocol that can be used by those with access to a widefield high-content analysis system (WHCAS) to image any slide-mounted tissue, with options for customization within pre-built modules found in the associated software. Not originally intended for slide scanning, the steps detailed in this article make it possible to acquire slide scanning images in the WHCAS which can be imported into the associated software. In this example, the automated segmentation of brain tumor slides is demonstrated, but the automated segmentation of any fluorescently-labeled nuclear or cytoplasmic marker is possible. Furthermore, there are a variety of other quantitative software modules including assays for protein localization/translocation, cellular proliferation/viability/apoptosis, and angiogenesis that can be run. This technique will save researchers time and effort and create an automated protocol for slide analysis.

  16. Development and Characterization of a High Sensitivity Segmented Fast Neutron Spectrometer (FaNS-2).

    Science.gov (United States)

    Langford, T J; Beise, E J; Breuer, H; Heimbach, C R; Ji, G; Nico, J S

    2016-01-01

    We present the development of a segmented fast neutron spectrometer (FaNS-2) based upon plastic scintillator and ³He proportional counters. It was designed to measure both the flux and spectrum of fast neutrons in the energy range of a few MeV to 1 GeV. FaNS-2 utilizes capture-gated spectroscopy to identify neutron events and reject backgrounds. Neutrons deposit energy in the plastic scintillator before capturing on a ³He nucleus in the proportional counters. Segmentation improves neutron energy reconstruction while the large volume of scintillator increases sensitivity to low neutron fluxes. A main goal of its design is to study comparatively low neutron fluxes, such as cosmogenic neutrons at the Earth's surface, in an underground environment, or from low-activity neutron sources. In this paper, we present details of its design and construction as well as its characterization with a calibrated ²⁵²Cf source and monoenergetic neutron fields of 2.5 MeV and 14 MeV. Detected monoenergetic neutron spectra are unfolded using a Singular Value Decomposition method, demonstrating a 5% energy resolution at 14 MeV. Finally, we discuss plans for measuring the surface and underground cosmogenic neutron spectra with FaNS-2.
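
    SVD-based unfolding can be sketched as applying a truncated pseudo-inverse of the detector response matrix to the measured pulse-height spectrum; truncation suppresses the noise amplification of small singular values. This is a generic illustration, not the FaNS-2 analysis code:

```python
import numpy as np

def svd_unfold(response, measured, rcond=1e-3):
    """Unfold a measured spectrum with a truncated-SVD pseudo-inverse of the
    detector response matrix; singular values below rcond * s_max are
    discarded to limit noise amplification."""
    U, s, Vt = np.linalg.svd(np.asarray(response, dtype=float),
                             full_matrices=False)
    keep = s > rcond * s[0]
    inv_s = np.zeros_like(s)
    inv_s[keep] = 1.0 / s[keep]
    return Vt.T @ (inv_s * (U.T @ np.asarray(measured, dtype=float)))
```

    For a well-conditioned response matrix the truncation keeps all singular values and the unfolded spectrum reproduces the true one exactly; with noisy data, raising `rcond` trades resolution for stability.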

  17. Efficiency correction for disk sources using coaxial High-Purity Ge detectors

    International Nuclear Information System (INIS)

    Chatani, Hiroshi.

    1993-03-01

    Efficiency correction factors for disk sources were determined using closed-ended coaxial High-Purity Ge (HPGe) detectors whose relative efficiencies (with respect to a 3″×3″ NaI(Tl) at 1.3 MeV γ-rays) were 30% and 10%, respectively. Parameters for the correction by the mapping method were obtained systematically, using several monoenergetic (i.e. free of coincidence-summing losses) γ-ray sources produced by irradiation in the Kyoto University Reactor (KUR) core. It was found that (1) the Gaussian fitting parameters, calculated from the relative efficiency distributions of the HPGe detectors, vary systematically with γ-ray energy, (2) the efficiency distributions deviate from Gaussian distributions outside the radii of the HPGe crystals, and (3) the mapping method is of practical use with satisfactory accuracy, as shown by comparison with the disk-source measurements. (author)

  18. High-Precision Half-life Measurements for the Superallowed β+ Emitter 14O

    Science.gov (United States)

    Laffoley, A. T.; Svensson, C. E.; Andreoiu, C.; Austin, R. A. E.; Ball, G. C.; Blank, B.; Bouzomita, H.; Cross, D. S.; Diaz Varela, A.; Dunlop, R.; Finlay, P.; Garnsworthy, A. B.; Garrett, P. E.; Giovinazzo, J.; Grinyer, G. F.; Hackman, G.; Hadinia, B.; Jamieson, D. S.; Ketelhut, S.; Leach, K. G.; Leslie, J. R.; Tardiff, E. R.; Thomas, J. C.; Unsworth, C.

    2014-03-01

    The half-life of 14O, a superallowed Fermi β+ emitter, has been determined via simultaneous γ and β counting experiments at TRIUMF's Isotope Separator and Accelerator facility. Following the implantation of 14O samples at the center of the 8π spectrometer, a γ counting measurement was performed by detecting the 2313 keV γ-rays emitted from the first excited state of the daughter 14N using 20 high-purity germanium (HPGe) detectors. A simultaneous β counting experiment was performed using a fast plastic scintillator positioned directly behind the implantation site. The results, T½(γ) = 70.632 ± 0.094 s and T½(β) = 70.610 ± 0.030 s, are consistent with one another and, together with eight previous measurements, establish a new average for the 14O half-life of T½ = 70.619 ± 0.011 s with a reduced χ² of 0.99.
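
    The averaging quoted above is the standard inverse-variance weighted mean with a reduced χ². As a sketch, combining just the two measurements reported in this abstract (the published average additionally folds in the eight previous measurements):

```python
import math

def weighted_average(values, errors):
    """Inverse-variance weighted mean, its 1-sigma uncertainty, and the
    reduced chi-squared of the combination (ndf = N - 1)."""
    weights = [1.0 / e ** 2 for e in errors]
    wsum = sum(weights)
    mean = sum(w * v for w, v in zip(weights, values)) / wsum
    uncertainty = math.sqrt(1.0 / wsum)
    chi2 = sum(w * (v - mean) ** 2 for w, v in zip(weights, values))
    ndf = len(values) - 1
    return mean, uncertainty, chi2 / ndf if ndf else float("nan")

# Combining only the two results quoted in this abstract:
mean, unc, red_chi2 = weighted_average([70.632, 70.610], [0.094, 0.030])
```

    With only these two inputs the weighted mean comes out near 70.612 ± 0.029 s, dominated by the more precise β-counting result.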

  19. A High-Precision Branching-Ratio Measurement for the Superallowed β+ Emitter 74Rb

    Science.gov (United States)

    Dunlop, R.; Chagnon-Lessard, S.; Finlay, P.; Garrett, P. E.; Hadinia, B.; Leach, K. G.; Svensson, C. E.; Wong, J.; Ball, G.; Garnsworthy, A. B.; Glister, J.; Hackman, G.; Tardiff, E. R.; Triambak, S.; Williams, S. J.; Leslie, J. R.; Andreoiu, C.; Chester, A.; Cross, D.; Starosta, K.; Yates, S. W.; Zganjar, E. F.

    2013-03-01

    Precision measurements of superallowed Fermi beta decay allow for tests of the unitarity of the Cabibbo-Kobayashi-Maskawa (CKM) matrix, the conserved vector current hypothesis, and the magnitude of isospin-symmetry-breaking effects in nuclei. A high-precision measurement of the branching ratio for the β+ decay of 74Rb has been performed at the Isotope Separator and ACcelerator (ISAC) facility at TRIUMF. The 8π spectrometer, an array of 20 close-packed HPGe detectors, was used to detect gamma rays emitted following the decay of 74Rb. PACES, an array of 5 Si(Li) detectors, was used to detect emitted conversion electrons, while SCEPTAR, an array of plastic scintillators, was used to detect emitted beta particles. A total of 51 γ rays have been identified following the decay of 21 excited states in the daughter nucleus 74Kr.

  20. Gamma ray spectrum of Am 241 in a backscattering geometry using a high purity germanium detector

    International Nuclear Information System (INIS)

    Chong Chon Sing; Ibrahim Salih Elyaseery; Ahmad Shukri Mustapa Kamal; Abdul Aziz Tajuddin

    1997-01-01

    A backscattering geometry using an annular Am-241 source and an HPGe detector has been set up to study both the coherent and incoherent scattering of the photon emissions of Am-241 from medium-Z and high-Z elements. Besides the coherently and incoherently scattered peaks of the source emissions, the gamma-ray spectra from the different target elements, obtained using a microcomputer-based multichannel analyser, showed the presence of several other peaks. These peaks have been identified as arising from fluorescence of the targets, fluorescence of the shielding material Pb, and as fluorescence sum peaks and X-ray escape peaks of the detector material Ge. The spectra are presented for three target elements, viz. Mo, Zn and W

  1. Simulation for photon detection in spectrometric system of high purity (HPGe) using MCNPX code

    International Nuclear Information System (INIS)

    Correa, Guilherme Jorge de Souza

    2013-01-01

    The Brazilian National Commission of Nuclear Energy defines parameters for the classification and management of radioactive waste in accordance with the activity of the materials. The efficiency of a detection system is crucial for determining the real activity of a radioactive source. When possible, the system should be calibrated using a standard source. Unfortunately, this can be done in only a few cases, given the difficulty of obtaining appropriate standard sources for each type of measurement. Computer simulations can therefore be performed to assist in calculating the efficiency of the system and, consequently, in classifying radioactive waste. This study aims to model a high-purity germanium (HPGe) detector with the MCNPX code, bringing the computationally obtained spectral values closer to those obtained experimentally for the 137Cs photopeak. The approach is made through changes in the outer dead layer of the modelled germanium crystal. (author)
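    The quantity such a simulation ultimately estimates is the full-energy peak efficiency, i.e. the fraction of emitted photons that deposit their full energy in the crystal; the efficiency then converts measured peak counts into an activity. A minimal sketch of that arithmetic (the 661.7 keV line and the 0.851 emission probability are standard 137Cs values; all count numbers below are invented for illustration, not results from the study):

```python
# Full-energy peak efficiency from a Monte Carlo run, and its use in
# activity determination. All tally and count values are invented.

def peak_efficiency(counts_in_photopeak, photons_emitted):
    """Absolute full-energy peak efficiency of the modelled detector."""
    return counts_in_photopeak / photons_emitted

def activity_from_counts(net_peak_counts, efficiency, gamma_yield, live_time_s):
    """Source activity in Bq inferred from a measured photopeak."""
    return net_peak_counts / (efficiency * gamma_yield * live_time_s)

# Hypothetical MCNPX pulse-height tally: 1e7 source photons at 661.7 keV,
# of which 2.4e5 deposit their full energy in the Ge crystal.
eff = peak_efficiency(2.4e5, 1e7)

# Apply it to a hypothetical measurement: 8640 net peak counts in 3600 s,
# with the 137Cs gamma emission probability of 0.851.
activity_bq = activity_from_counts(8640, eff, 0.851, 3600)
print(eff, activity_bq)
```

Tuning the modelled dead-layer thickness, as the study describes, changes `eff` and hence every activity derived from it.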

  2. The high-efficiency γ-ray spectroscopy setup γ{sup 3} at HIγS

    Energy Technology Data Exchange (ETDEWEB)

    Löher, B., E-mail: b.loeher@gsi.de [ExtreMe Matter Institute EMMI and Research Division, GSI Helmholtzzentrum für Schwerionenforschung, Planckstr. 1, 64291 Darmstadt (Germany); Frankfurt Institute for Advanced Studies FIAS, Ruth-Moufang-Str. 1, 60438 Frankfurt am Main (Germany); Derya, V. [Institut für Kernphysik, Universität zu Köln, Zülpicher Str. 77, D-50937 Köln (Germany); Aumann, T. [Institut für Kernphysik, TU Darmstadt, Schlossgartenstr. 9, 64289 Darmstadt (Germany); GSI Helmholtzzentrum für Schwerionenforschung, Planckstr. 1, 64291 Darmstadt (Germany); Beller, J. [Institut für Kernphysik, TU Darmstadt, Schlossgartenstr. 9, 64289 Darmstadt (Germany); Cooper, N. [WNSL, Yale University, P.O. Box 208120, New Haven, CT 06520-8120 (United States); Duchêne, M. [Institut für Kernphysik, TU Darmstadt, Schlossgartenstr. 9, 64289 Darmstadt (Germany); Endres, J. [Institut für Kernphysik, Universität zu Köln, Zülpicher Str. 77, D-50937 Köln (Germany); Fiori, E.; Isaak, J. [ExtreMe Matter Institute EMMI and Research Division, GSI Helmholtzzentrum für Schwerionenforschung, Planckstr. 1, 64291 Darmstadt (Germany); Frankfurt Institute for Advanced Studies FIAS, Ruth-Moufang-Str. 1, 60438 Frankfurt am Main (Germany); and others

    2013-09-21

    The existing Nuclear Resonance Fluorescence (NRF) setup at the HIγS facility at the Triangle Universities Nuclear Laboratory at Duke University has been extended in order to perform γ–γ coincidence experiments. The new setup combines large-volume LaBr{sub 3}:Ce detectors and high-resolution HPGe detectors in a very close geometry to offer high efficiency, high energy resolution, and high count-rate capability at the same time. The combination of a highly efficient γ-ray spectroscopy setup with the mono-energetic high-intensity photon beam of HIγS provides a worldwide unique experimental facility to investigate the γ-decay pattern of dipole excitations in atomic nuclei. The performance of the new setup has been assessed by studying the nucleus {sup 32}S at 8.125 MeV beam energy. The relative γ-decay branching ratio from the 1{sup +} level at 8125.4 keV to the first excited 2{sup +} state was determined to be 15.7(3)%. -- Author-Highlights: • We have extended the existing NRF setup at HIγS at TUNL to combine large LaBr and HPGe detectors. • NRF experiments with the mono-energetic beam in combination with gamma coincidences are possible. • We describe the changes to the experimental setup and data acquisition as well as data analysis. • The performance of the new setup was assessed by investigating the nucleus 32S. • We present a more precisely measured value for the branching ratio of the 1+→2+ transition.

  3. a Comparison of Tree Segmentation Methods Using Very High Density Airborne Laser Scanner Data

    Science.gov (United States)

    Pirotti, F.; Kobal, M.; Roussel, J. R.

    2017-09-01

    Developments of LiDAR technology are decreasing the unit cost per single point (e.g. single-photon counting). This opens the possibility of future LiDAR datasets having very dense point clouds. In this work, we process a very dense point cloud (~200 points per square meter) using three different methods for segmenting single trees and extracting tree positions and other metrics of interest in forestry, such as tree height distribution and canopy area distribution. The three algorithms are tested at decreasing densities, down to a lowest density of 5 points per square meter. Accuracy assessment is done using Kappa, recall, precision and F-score metrics, comparing results with tree positions from ground-truth measurements in six ground plots where tree positions and heights were surveyed manually. Results show that one method provides better Kappa and recall accuracy for all cases, and that different point densities, in the range used in this study, do not affect accuracy significantly. Processing time is also considered; the method with better accuracy is several times slower than the other two methods, and its run time increases exponentially with point density. The best performer gave Kappa = 0.7. The implications of these metrics for assessing the accuracy of detected point positions are reported. Reasons for the different performance of the three methods are discussed, and further research directions are proposed.
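    The accuracy measures named in the abstract (precision, recall, F-score, Cohen's Kappa) all derive from the counts of matched and unmatched detections. A minimal sketch with invented counts (note that "true negatives" are not naturally defined for tree detection; this sketch assumes a set of candidate positions that can be correctly rejected):

```python
# Detection accuracy metrics from matched/unmatched counts.
# All counts below are invented for illustration.

def precision(tp, fp):
    return tp / (tp + fp)

def recall(tp, fn):
    return tp / (tp + fn)

def f_score(tp, fp, fn):
    p, r = precision(tp, fp), recall(tp, fn)
    return 2 * p * r / (p + r)

def cohens_kappa(tp, fp, fn, tn):
    n = tp + fp + fn + tn
    observed = (tp + tn) / n
    # Chance agreement estimated from the marginal totals of both "raters"
    expected = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
    return (observed - expected) / (1 - expected)

# 40 trees correctly detected, 10 false detections, 10 missed trees,
# 40 candidate positions correctly rejected
print(precision(40, 10), recall(40, 10),
      f_score(40, 10, 10), cohens_kappa(40, 10, 10, 40))
```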

  4. High-resolution magnetic resonance imaging reveals nuclei of the human amygdala: manual segmentation to automatic atlas.

    Science.gov (United States)

    Saygin, Z M; Kliemann, D; Iglesias, J E; van der Kouwe, A J W; Boyd, E; Reuter, M; Stevens, A; Van Leemput, K; McKee, A; Frosch, M P; Fischl, B; Augustinack, J C

    2017-07-15

    The amygdala is composed of multiple nuclei with unique functions and connections in the limbic system and to the rest of the brain. However, standard in vivo neuroimaging tools to automatically delineate the amygdala into its multiple nuclei are still rare. By scanning postmortem specimens at high resolution (100-150 µm) at 7T field strength (n = 10), we were able to visualize and label nine amygdala nuclei (anterior amygdaloid, cortico-amygdaloid transition area; basal, lateral, accessory basal, central, cortical medial, paralaminar nuclei). We created an atlas from these labels using a recently developed atlas building algorithm based on Bayesian inference. This atlas, which will be released as part of FreeSurfer, can be used to automatically segment nine amygdala nuclei from a standard resolution structural MR image. We applied this atlas to two publicly available datasets (ADNI and ABIDE) with standard resolution T1 data, used individual volumetric data of the amygdala nuclei as the measure and found that our atlas i) discriminates between Alzheimer's disease participants and age-matched control participants with 84% accuracy (AUC=0.915), and ii) discriminates between individuals with autism and age-, sex- and IQ-matched neurotypically developed control participants with 59.5% accuracy (AUC=0.59). For both datasets, the new ex vivo atlas significantly outperformed (all p < 0.05) the whole amygdala derived from the segmentation in FreeSurfer 5.1 (ADNI: 75%, ABIDE: 54% accuracy), as well as classification based on whole amygdala volume (using the sum of all amygdala nuclei volumes; ADNI: 81%, ABIDE: 55% accuracy). This new atlas and the segmentation tools that utilize it will provide neuroimaging researchers with the ability to explore the function and connectivity of the human amygdala nuclei with unprecedented detail in healthy adults as well as those with neurodevelopmental and neurodegenerative disorders. Copyright © 2017 Elsevier Inc. All rights reserved.

  5. Estimation of thorium intake due to consumption of vegetables by inhabitants of high background radiation area by INAA

    International Nuclear Information System (INIS)

    Sathyapriya, R.S.; Suma Nair; Prabhath, R.K.; Madhu Nair; Rao, D.D.

    2012-01-01

    A study was conducted to estimate the thorium concentration in locally grown vegetables in the high background radiation area (HBRA) of the southern coastal regions of India. The vegetables were collected from the HBRA, and their thorium concentration was quantified using instrumental neutron activation analysis. The samples were irradiated at the CIRUS reactor and counted using a 40% relative efficiency HPGe detector coupled to an MCA. The annual intake of thorium was evaluated using the consumption data provided by the National Nutrition Monitoring Board. The daily intake of 232Th from the four food categories (green leafy vegetables, other vegetables, roots and tubers, and fruits) ranged between 0.27 and 5.352 mBq d⁻¹. The annual internal dose due to ingestion of thorium from these food categories was 46.8 × 10⁻⁸ Sv y⁻¹ for females and 58.6 × 10⁻⁸ Sv y⁻¹ for males. (author)
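    The step from daily intake to annual committed dose is a simple product of intake rate, days per year, and an ingestion dose coefficient. A sketch using the intake range from the record and an assumed ICRP-style adult ingestion coefficient for 232Th of 2.3e-7 Sv/Bq (the coefficient actually used in the study is not stated in this record, so the value is an assumption):

```python
# Annual committed effective dose from daily thorium intake.
# The dose coefficient is an assumption (ICRP-style adult ingestion
# value for 232Th); the study may have used a different coefficient.

DOSE_COEFF_SV_PER_BQ = 2.3e-7   # assumed ingestion dose coefficient

def annual_dose_sv(daily_intake_mbq, days_per_year=365.25):
    """Committed effective dose (Sv/y) from a daily intake in mBq/d."""
    intake_bq_per_year = daily_intake_mbq * 1e-3 * days_per_year
    return intake_bq_per_year * DOSE_COEFF_SV_PER_BQ

# Intake range reported in the record: 0.27 to 5.352 mBq per day
low, high = annual_dose_sv(0.27), annual_dose_sv(5.352)
print(low, high)   # on the order of 1e-8 to 1e-7 Sv/y
```

The resulting order of magnitude (tenths of a microsievert per year) is consistent with the annual doses quoted in the record.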

  6. Performance of a compact multi-crystal high-purity germanium detector array for measuring coincident gamma-ray emissions

    Energy Technology Data Exchange (ETDEWEB)

    Howard, Chris; Daigle, Stephen; Buckner, Matt [University of North Carolina at Chapel Hill, Chapel Hill, NC 27599 (United States); Triangle Universities Nuclear Laboratory, Durham, NC 27708 (United States); Erikson, Luke E.; Runkle, Robert C. [Pacific Northwest National Laboratory, Richland, WA 99352 (United States); Stave, Sean C., E-mail: Sean.Stave@pnnl.gov [Pacific Northwest National Laboratory, Richland, WA 99352 (United States); Champagne, Arthur E.; Cooper, Andrew; Downen, Lori [University of North Carolina at Chapel Hill, Chapel Hill, NC 27599 (United States); Triangle Universities Nuclear Laboratory, Durham, NC 27708 (United States); Glasgow, Brian D. [Pacific Northwest National Laboratory, Richland, WA 99352 (United States); Kelly, Keegan; Sallaska, Anne [University of North Carolina at Chapel Hill, Chapel Hill, NC 27599 (United States); Triangle Universities Nuclear Laboratory, Durham, NC 27708 (United States)

    2015-05-21

    The Multi-sensor Airborne Radiation Survey (MARS) detector is a 14-crystal array of high-purity germanium (HPGe) detectors housed in a single cryostat. The array was used to measure the astrophysical S-factor for the {sup 14}N(p,γ){sup 15}O{sup ⁎} reaction for several transition energies at an effective center-of-mass energy of 163 keV. Owing to the granular nature of the MARS detector, the effect of gamma-ray summing was greatly reduced in comparison to past experiments which utilized large, single-crystal detectors. The new S-factor values agree within their uncertainties with the past measurements. Details of the analysis and detector performance are presented.

  7. The Influence Of Dead Layer Effect On The Characteristics Of The High Purity Germanium P-Type Detector

    International Nuclear Information System (INIS)

    Ngo Quang Huy

    2011-01-01

    The present work reviews studies of the influence of the dead-layer effect on the characteristics of a high-purity germanium (HPGe) p-type detector, obtained by the author and his colleagues in recent years. The object of study was the HPGe GC1518 detector-based gamma spectrometer of the Center for Nuclear Techniques, Ho Chi Minh City. The problems studied were: the modeling of an HPGe detector-based gamma spectrometer using the MCNP code; the determination of the dead-layer thickness from experimental gamma-spectrum measurements combined with MCNP calculations; the influence of material parameters and the dead layer on detector efficiency; the increase of dead-layer thickness over the operating time of the GC1518 detector; the influence of this increase on the decrease of detector efficiency; and the dead-layer effect on the gamma spectra measured with the GC1518 detector. (author)

  8. Germanium detectors for nuclear spectroscopy: Current research and development activity at LNL

    Energy Technology Data Exchange (ETDEWEB)

    Napoli, D. R., E-mail: daniel.r.napoli@lnl.infn.it [Istituto Nazionale di Fisica Nucleare, Laboratori Nazionali di Legnaro, Viale dell’Università 2, 35020 Legnaro, Padova (Italy); Maggioni, G., E-mail: maggioni@lnl.infn.it; Carturan, S.; Gelain, M. [Istituto Nazionale di Fisica Nucleare, Laboratori Nazionali di Legnaro, Viale dell’Università 2, 35020 Legnaro, Padova (Italy); Department of Physics and Astronomy “G. Galilei”, University of Padova, Via Marzolo 8, 35121 Padova (Italy); Eberth, J. [Institut für Kernphysik, Universität zu Köln, Zülpicher Straße 77, D-50937 Köln (Germany); Grimaldi, M. G.; Tatí, S. [Department of Physics and Astronomy, University of Catania (Italy); Riccetto, S. [University of Camerino and INFN of Perugia (Italy); Mea, G. Della [Istituto Nazionale di Fisica Nucleare, Laboratori Nazionali di Legnaro, Viale dell’Università 2, 35020 Legnaro, Padova (Italy); University of Trento (Italy)

    2016-07-07

    High-purity Germanium (HPGe) detectors have reached an unprecedented level of sophistication and are still the best solution for high-resolution gamma spectroscopy. In the present work, we will show the results of the characterization of new surface treatments for the production of these detectors, studied in the framework of our multidisciplinary research program in HPGe detector technologies.

  9. Active Segmentation.

    Science.gov (United States)

    Mishra, Ajay; Aloimonos, Yiannis

    2009-01-01

    The human visual system observes and understands a scene/image by making a series of fixations. Every fixation point lies inside a particular region of arbitrary shape and size in the scene, which can be either an object or just a part of one. We define as a basic segmentation problem the task of segmenting the region containing the fixation point. Segmenting the region containing the fixation is equivalent to finding the enclosing contour - a connected set of boundary edge fragments in the edge map of the scene - around the fixation. This enclosing contour should be a depth boundary. We present here a novel algorithm that finds this bounding contour and achieves the segmentation of one object, given the fixation. The proposed segmentation framework combines monocular cues (color/intensity/texture) with stereo and/or motion, in a cue-independent manner. The semantic robots of the immediate future will be able to use this algorithm to automatically find objects in any environment. The capability of automatically segmenting objects in their visual field can bring visual processing to the next level. Our approach is different from current approaches: while existing work attempts to segment the whole scene at once into many areas, we segment only one image region, specifically the one containing the fixation point. Experiments with real imagery collected by our active robot and from known databases demonstrate the promise of the approach.

  10. High resolution 148Nd(3He, nγ) two-proton stripping reaction and the structure of the 0₂⁺ state in 150Sm

    International Nuclear Information System (INIS)

    Sharpey-Schafer, J.F.; Dinoko, T.S.; Herbert, M.S.; Orce, J.N.; Papka, P.; Kheswa, B.V.; Ndayishimye, J.; Bvumbi, S.P.; Jones, P.M.; Bucher, T.D.; Lawrie, E.A.; Lawrie, J.J.; Negi, D.; Shirinda, O.; Wiedeking, M.; Vymers, P.; Easton, J.L.; Noncolela, S.P.; Sithole, P.; Khumalo, N.; Majola, S.N.T.; Stankiewicz, M.A.

    2014-01-01

    The challenge of achieving high resolution in binary reactions involving an outgoing high-energy neutron is solved by detecting the γ-ray decay of populated excited states in an array of escape-suppressed HPGe detectors, in coincidence with fast neutrons detected in a wall of scintillator detectors 2 m downstream of the target. The selectivity of the arrangement is of the order of 1 in 1000. The time-of-flight difference is sufficient to separate fast neutrons from direct reactions from a large background of statistical neutrons from fusion-evaporation reactions. Our interest is in the wavefunction of the 0₂⁺ state at 740 keV in the N=88 nucleus 150Sm which, with the 0₂⁺ state in 100Ru, are the only two excited states observed in 2β2ν double β-decay. (authors)
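    The time-of-flight separation works because, over a 2 m flight path, fast direct-reaction neutrons arrive tens of nanoseconds earlier than lower-energy evaporation neutrons. A relativistic-kinematics sketch (the example kinetic energies of 10 MeV and 2 MeV are illustrative choices, not values from the paper):

```python
import math

NEUTRON_MASS_MEV = 939.565   # neutron rest mass (MeV/c^2)
C_M_PER_S = 2.998e8          # speed of light

def flight_time_ns(kinetic_energy_mev, path_m=2.0):
    """Time of flight (ns) of a neutron of given kinetic energy over path_m."""
    gamma = 1.0 + kinetic_energy_mev / NEUTRON_MASS_MEV
    beta = math.sqrt(1.0 - 1.0 / gamma**2)
    return path_m / (beta * C_M_PER_S) * 1e9

# A 10 MeV direct-reaction neutron vs a 2 MeV evaporation neutron
t_fast = flight_time_ns(10.0)
t_slow = flight_time_ns(2.0)
print(t_fast, t_slow)   # roughly 46 ns vs 102 ns over 2 m
```

A separation of ~50 ns is comfortably resolvable with scintillator timing, which is what makes the coincidence selection feasible.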

  11. Segmentation methodology for automated classification and differentiation of soft tissues in multiband images of high-resolution ultrasonic transmission tomography.

    Science.gov (United States)

    Jeong, Jeong-Won; Shin, Dae C; Do, Synho; Marmarelis, Vasilis Z

    2006-08-01

    This paper presents a novel segmentation methodology for automated classification and differentiation of soft tissues using multiband data obtained with the newly developed system of high-resolution ultrasonic transmission tomography (HUTT) for imaging biological organs. This methodology extends and combines two existing approaches: the L-level set active contour (AC) segmentation approach and the agglomerative hierarchical kappa-means approach for unsupervised clustering (UC). To prevent the trapping of the current iterative minimization AC algorithm in a local minimum, we introduce a multiresolution approach that applies the level set functions at successively increasing resolutions of the image data. The resulting AC clusters are subsequently rearranged by the UC algorithm that seeks the optimal set of clusters yielding the minimum within-cluster distances in the feature space. The presented results from Monte Carlo simulations and experimental animal-tissue data demonstrate that the proposed methodology outperforms other existing methods without depending on heuristic parameters and provides a reliable means for soft tissue differentiation in HUTT images.
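    The UC stage described above seeks the cluster set minimizing within-cluster distances in feature space. The paper uses an agglomerative hierarchical variant; the sketch below shows only the simpler flat k-means criterion it builds on, applied to 1-D feature values (all data invented, and the real method operates on multiband feature vectors):

```python
import random

def kmeans_1d(values, k, iterations=50, seed=0):
    """Minimal 1-D k-means: assign each value to the nearest centroid,
    then recompute centroids as cluster means, and repeat."""
    rng = random.Random(seed)
    centroids = rng.sample(values, k)
    for _ in range(iterations):
        clusters = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda i: abs(v - centroids[i]))
            clusters[nearest].append(v)
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return sorted(centroids)

# Two well-separated groups of invented feature values
data = [0.9, 1.0, 1.1, 9.8, 10.0, 10.2]
print(kmeans_1d(data, 2))   # centroids near 1.0 and 10.0
```

The agglomerative version instead starts from many small clusters and repeatedly merges the pair whose merge least increases within-cluster distance, which avoids fixing k in advance.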

  12. Shallow high-resolution geophysical investigation along the western segment of the Victoria Lines Fault (island of Malta)

    Science.gov (United States)

    Villani, Fabio; D'Amico, Sebastiano; Panzera, Francesco; Vassallo, Maurizio; Bozionelos, George; Farrugia, Daniela; Galea, Pauline

    2018-01-01

    The Victoria Lines Fault (island of Malta) is a >15 km-long, N260°-striking segmented normal fault-system, probably inactive since the late Pliocene. In its westernmost part, the Fomm Ir-Rih segment displays comparable geologic throw and escarpment height (~150-170 m); moreover, its hangingwall hosts thin patches of Middle Pleistocene clastic continental deposits (red beds), which are poorly preserved elsewhere. We acquired two seismic transects, collecting ambient vibration recordings processed using horizontal-to-vertical spectral ratios, complemented by one high-resolution 2-D refraction tomography survey crossing this fault where it is locally covered by red beds and recent colluvial deposits. We found a resonance peak at 1.0 Hz in the hangingwall block, whereas clear peaks in the range 5.0-10.0 Hz appear when approaching the subsurface fault, and we relate them to the fractured bedrock within the fault zone. The best-fit tomographic model shows a relatively high-Vp shallow body (Vp ~2200-2400 m/s) that we relate to the weathered top of the Miocene Upper Coralline Limestone Fm., bounded on both sides by low-Vp regions (230 m/s above the weathered top-bedrock. Our results depict a clear seismic signature of the Victoria Lines Fault, characterized by low seismic velocity and high amplification of ground motion. We hypothesize that, during the Middle Pleistocene, faulting may have affected the basal part of the red beds, so that this part of the investigated complex fault-system may be considered inactive since ~0.6 Myr ago.
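    A resonance peak like the 1.0 Hz one reported for the hangingwall is conventionally interpreted through the quarter-wavelength relation f0 = Vs/(4h) for a soft layer over stiffer bedrock. A one-line sketch (the shear-wave velocity and layer thickness below are illustrative assumptions, not measurements from this survey):

```python
def resonance_frequency_hz(vs_m_per_s, thickness_m):
    """Fundamental resonance of a soft layer over stiff bedrock: f0 = Vs / (4h)."""
    return vs_m_per_s / (4.0 * thickness_m)

# e.g. a 50 m soft cover with shear-wave velocity 200 m/s resonates at 1.0 Hz
print(resonance_frequency_hz(200.0, 50.0))
```

Inverting the same relation gives an order-of-magnitude layer thickness once an Vs estimate is available, which is how HVSR peaks are commonly turned into structural constraints.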

  13. Segmentation: Identification of consumer segments

    DEFF Research Database (Denmark)

    Høg, Esben

    2005-01-01

    It is very common to categorise people, especially in the advertising business. Traditional marketing theory has also taken up consumer segments as a favorite topic. Segmentation is closely related to the broader concept of classification. From a historical point of view, classification has its origin in other sciences, for example biology, anthropology, etc. From an economic point of view, it is called segmentation when specific scientific techniques are used to classify consumers into different characteristic groupings. What is the purpose of segmentation? For example, to be able to obtain a basic understanding of grouping people. Advertising agencies may use segmentation to target advertisements, while food companies may use segmentation to develop products for various groups of consumers. MAPP has, for example, investigated the positioning of fish in relation to other food products...

  14. High-resolution magnetic resonance imaging reveals nuclei of the human amygdala: manual segmentation to automatic atlas

    DEFF Research Database (Denmark)

    Saygin, Z M; Kliemann, D; Iglesias, J. E.

    2017-01-01

    The amygdala is composed of multiple nuclei with unique functions and connections in the limbic system and to the rest of the brain. However, standard in vivo neuroimaging tools to automatically delineate the amygdala into its multiple nuclei are still rare. By scanning postmortem specimens at high resolution (100-150 µm) at 7T field strength (n = 10), we were able to visualize and label nine amygdala nuclei (anterior amygdaloid, cortico-amygdaloid transition area; basal, lateral, accessory basal, central, cortical medial, paralaminar nuclei). We created an atlas from these labels using a recently developed atlas building algorithm based on Bayesian inference. This atlas, which will be released as part of FreeSurfer, can be used to automatically segment nine amygdala nuclei from a standard resolution structural MR image. We applied this atlas to two publicly available datasets (ADNI and ABIDE...

  15. Segmental Vitiligo.

    Science.gov (United States)

    van Geel, Nanja; Speeckaert, Reinhart

    2017-04-01

    Segmental vitiligo is characterized by its early onset, rapid stabilization, and unilateral distribution. Recent evidence suggests that segmental and nonsegmental vitiligo could represent variants of the same disease spectrum. Observational studies with respect to its distribution pattern point to a possible role of cutaneous mosaicism, whereas the original stated dermatomal distribution seems to be a misnomer. Although the exact pathogenic mechanism behind the melanocyte destruction is still unknown, increasing evidence has been published on the autoimmune/inflammatory theory of segmental vitiligo. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. A highly segmented and compact liquid argon calorimeter for the LHC the TGT calorimeter

    CERN Document Server

    Berger, C; Geulig, H; Pierschel, G; Siedling, R; Tutas, J; Wlochal, M; Wotschack, J; Cheplakov, A P; Eremeev, R V; Feshchenko, A; Gavrishchuk, O P; Kazarinov, Yu M; Khrenov, Yu V; Kukhtin, V V; Ladygin, E; Obudovskij, V; Shalyugin, A N; Tolmachev, V T; Volodko, A G; Geweniger, C; Hanke, P; Kluge, E E; Krause, J; Putzer, A; Tittel, K; Wunsch, M; Bán, J; Bruncko, Dusan; Kriván, F; Kurca, T; Murín, P; Sándor, L; Spalek, J; Aderholz, Michael; Brettel, H; Dydak, Friedrich; Fent, J; Huber, J; Hajduk, L; Jakobs, K; Kiesling, C; Oberlack, H; Schacht, P; Stiegler, U; Bogolyubsky, M Yu; Chekulaev, S V; Kiryunin, A E; Kurchaninov, L L; Levitsky, M S; Maximov, V V; Minaenko, A A; Moiseev, A M; Semenov, P A; CERN. Geneva. Detector Research and Development Committee

    1992-01-01

    The development of a fast, highly granular and compact electromagnetic liquid argon calorimeter is proposed as an R&D project for an LHC calorimeter with full rapidity coverage. The proposed "Thin Gap Turbine" (TGT) calorimeter offers uniform energy response and constant energy resolution, independent of the production angle of the impinging particle and of its impact position at the calorimeter. An important aspect of the project is the development of electronics for fast signal processing, matched to the short charge collection time in the TGT read-out cell. The system aspects of integrating a high degree of signal processing into the liquid argon would also be investigated.

  17. Track segments in hadronic showers in a highly granular scintillator-steel hadron calorimeter

    Czech Academy of Sciences Publication Activity Database

    Adloff, C.; Blaising, J.J.; Chefdeville, M.; Cvach, Jaroslav; Gallus, Petr; Havránek, Miroslav; Janata, Milan; Kvasnička, Jiří; Lednický, Denis; Marčišovský, Michal; Polák, Ivo; Popule, Jiří; Tomášek, Lukáš; Růžička, Pavel; Šícho, Petr; Smolík, Jan; Vrba, Václav; Zálešák, Jaroslav

    2013-01-01

    Roč. 8, Sep (2013), s. 1-22 ISSN 1748-0221 R&D Projects: GA MŠk LC527; GA MŠk LA09042 Institutional support: RVO:68378271 Keywords : calorimeters * detector modelling and simulations * analysis and statistical methods Subject RIV: BF - Elementary Particles and High Energy Physics Impact factor: 1.526, year: 2013

  18. A highly-segmented neutron detector for the A1 experiment at MAMI

    Energy Technology Data Exchange (ETDEWEB)

    Schoth, Matthias [Institut fuer Kernphysik, Mainz (Germany); Collaboration: A1-Collaboration

    2015-07-01

    The electric and magnetic form factors of the neutron are among the defining properties that characterize its structure quantitatively. A planned physics program to improve the database significantly requires high-performance detection of relativistic neutrons. Exploiting the full potential of the high luminosity supplied by the MAMI accelerator, a novel neutron detector is being developed within the A1 collaboration. A large active detector volume of 0.96 m{sup 3} is required to achieve a high raw detection efficiency. The detector is subdivided into 2048 plastic scintillators to cope with high background rates. The light is extracted via wavelength-shifting fibres and guided to multi-anode photomultipliers. The signal is read out with FPGA-based TDCs (TRBv3, developed at GSI). The energy of the signal is obtained via time-over-threshold information in combination with a suitable shaping and discriminating circuit. Prototype tests have been performed to optimize the choice of materials and geometry. The capability to detect neutrons in the relevant momentum range has been demonstrated using pion production. A Geant4 simulation with tracking algorithms that evaluate the deposited energy is used to optimize key detector properties such as particle-ID efficiency, multiplicity, and the effective analyzing power for double-polarized scattering experiments.
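    Time-over-threshold (ToT) gives an energy estimate because, for a pulse of known shape, the time the signal spends above a fixed threshold is a monotonic function of its amplitude. For an exponentially decaying pulse the relation inverts in closed form; a sketch under that assumed pulse shape (the detector's actual shaping circuit will produce a different, calibrated relation):

```python
import math

def time_over_threshold(amplitude, threshold, tau):
    """ToT of an exponential pulse A*exp(-t/tau) crossing a fixed threshold."""
    if amplitude <= threshold:
        return 0.0
    return tau * math.log(amplitude / threshold)

def amplitude_from_tot(tot, threshold, tau):
    """Invert the relation to recover pulse amplitude from measured ToT."""
    return threshold * math.exp(tot / tau)

# Round trip: a pulse of amplitude 80 (arb. units), threshold 10, tau = 20 ns
tot = time_over_threshold(80.0, 10.0, 20.0)
print(tot, amplitude_from_tot(tot, 10.0, 20.0))   # ~41.6 ns, back to 80.0
```

In practice the ToT-to-energy mapping is measured with calibration pulses rather than derived analytically, but the monotonic relation is what makes a TDC-only readout sufficient.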

  19. Informing Extension Program Development through Audience Segmentation: Targeting High Water Users

    Science.gov (United States)

    Huang, Pei-wen; Lamm, Alexa J.; Dukes, Michael D.

    2016-01-01

    Human reliance on water has led to water issues globally. Although extension professionals have made successful efforts to educate the general public about water conservation to enhance water resource sustainability, it has been difficult to reach high water users, defined as residents applying excessive landscape irrigation…

  20. Design and performance simulation of a segmented-absorber based muon detection system for high energy heavy ion collision experiments

    International Nuclear Information System (INIS)

    Ahmad, S.; Bhaduri, P.P.; Jahan, H.; Senger, A.; Adak, R.; Samanta, S.; Prakash, A.; Dey, K.; Lebedev, A.; Kryshen, E.; Chattopadhyay, S.; Senger, P.; Bhattacharjee, B.; Ghosh, S.K.; Raha, S.; Irfan, M.; Ahmad, N.; Farooq, M.; Singh, B.

    2015-01-01

    A muon detection system (MUCH) based on a novel concept using a segmented and instrumented absorber has been designed for high-energy heavy-ion collision experiments. The system consists of 6 hadron absorber blocks and 6 tracking detector triplets. Behind each absorber block a detector triplet is located which measures the tracks of charged particles traversing the absorber. The performance of such a system has been simulated for the CBM experiment at FAIR (Germany), which is scheduled to start taking data on heavy-ion collisions in the beam energy range of 6–45 A GeV from 2019. The muon detection system is mounted downstream of a Silicon Tracking System (STS) that is located in a large-aperture dipole magnet and provides momentum information for the charged-particle tracks. The reconstructed tracks from the STS are matched to the hits measured by the muon detector triplets behind the absorber segments. This method allows the identification of muon tracks over a broad range of momenta, including tracks of soft muons which do not pass through all the absorber layers. Pairs of oppositely charged muons identified by MUCH can therefore be combined to measure invariant masses over a wide range, from low-mass vector mesons (LMVM) up to charmonia. The properties of the absorber (material, thickness, position) and of the tracking chambers (granularity, geometry) have been varied in simulations of heavy-ion collision events generated with the UrQMD generator and propagated through the setup using GEANT3, the particle transport code. The tracks are reconstructed by a Cellular Automaton algorithm followed by a Kalman Filter. The simulations demonstrate that low-mass vector mesons and charmonia can be clearly identified in central Au+Au collisions at beam energies provided by the international Facility for Antiproton and Ion Research (FAIR).
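    Track reconstruction as described above proceeds in two stages: a cellular automaton proposes track candidates, and a Kalman filter refines each candidate by alternating a prediction step with a measurement update. A real track fit propagates a multi-dimensional state vector through the magnetic field; the scalar sketch below (with invented noise values and data) only illustrates the predict/update cycle itself:

```python
def kalman_1d(measurements, process_var=1e-4, meas_var=0.25):
    """Scalar Kalman filter estimating a nearly constant quantity
    from a sequence of noisy measurements."""
    x, p = measurements[0], 1.0      # initial state estimate and variance
    for z in measurements[1:]:
        p += process_var             # predict: state unchanged, variance grows
        k = p / (p + meas_var)       # Kalman gain
        x += k * (z - x)             # update toward the new measurement
        p *= (1.0 - k)               # updated (reduced) variance
    return x, p

# Noisy measurements of a quantity near 5.0 (invented data)
estimate, variance = kalman_1d([5.2, 4.9, 5.1, 4.8, 5.0, 5.1])
print(estimate, variance)
```

The same gain/update structure, with matrices in place of scalars, is what lets the filter weight precise STS hits more heavily than multiple-scattering-smeared hits behind thick absorber layers.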

  1. Monte Carlo simulation of gamma-ray interactions in an over-square high-purity germanium detector for in-vivo measurements

    Science.gov (United States)

    Saizu, Mirela Angela

    2016-09-01

    The developments of high-purity germanium detectors match very well the requirements of in-vivo human body measurements regarding the gamma energy ranges of the radionuclides to be measured, the shape of the extended radioactive sources, and the measurement geometries. The Whole Body Counter (WBC) at IFIN-HH is based on an “over-square” high-purity germanium (HPGe) detector and performs accurate measurements of incorporated radionuclides emitting X and gamma rays in the energy range of 10 keV-1500 keV, under conditions of good shielding, suitable collimation, and calibration. As an alternative to the experimental efficiency calibration method, which uses reference calibration sources with gamma energy lines covering the whole energy range of interest, it is proposed to perform the efficiency calibration of the WBC by the Monte Carlo method using the radiation transport code MCNP5. The HPGe detector was modelled, and the gamma energy lines of 241Am, 57Co, 133Ba, 137Cs, 60Co, and 152Eu were simulated in order to obtain the virtual efficiency calibration curve of the WBC. The Monte Carlo method was validated by comparing the simulated results with experimental measurements using point-like sources. For optimum matching, the impact of varying the front dead-layer thickness and the materials of the detector's photon-absorbing layers on the HPGe detector efficiency was studied, and the detector model was refined. To perform the WBC efficiency calibration for realistic monitoring of people, further numerical calculations were carried out, simulating extended sources of specific shape according to standard-man characteristics.

  2. Calocube—A highly segmented calorimeter for a space based experiment

    International Nuclear Information System (INIS)

    D'Alessandro, R.; Adriani, O.; Agnesi, A.; Albergo, S.; Auditore, L.; Basti, A.; Berti, E.; Bigongiari, G.; Bonechi, L.; Bonechi, S.; Bongi, M.; Bonvicini, V.

    2016-01-01

    Future research in high-energy cosmic-ray physics concerns fundamental questions on their origin, acceleration mechanism, and composition. Unambiguous measurements of the energy spectra and of the composition of cosmic rays at the “knee” region could provide some of the answers to these questions. Only ground-based observations, which rely on sophisticated models describing high-energy interactions in the earth's atmosphere, have been possible so far due to the extremely low particle rates at these energies. A calorimeter-based space experiment can provide not only flux measurements but also energy spectra and particle identification, especially when coupled to a dE/dx-measuring detector, and can thus overcome some of the limitations plaguing ground-based experiments. For this to be possible, very large acceptances are needed if enough statistics are to be collected in a reasonable time. This contrasts with the lightness and compactness requirements for space-based experiments. A novel idea in calorimetry is discussed here which addresses these issues while limiting the mass and volume of the detector. A small prototype is currently being built and tested with ions. In this paper the results obtained are presented in light of the simulations performed.

  3. Calocube—A highly segmented calorimeter for a space based experiment

    Energy Technology Data Exchange (ETDEWEB)

    D'Alessandro, R., E-mail: candi@fi.infn.it [University of Florence, Department of Physics and Astronomy, via G. Sansone 1, I-50019 Sesto Fiorentino (Firenze) (Italy); INFN Firenze, via B. Rossi 1, I-50019 Sesto Fiorentino (Firenze) (Italy); Adriani, O. [University of Florence, Department of Physics and Astronomy, via G. Sansone 1, I-50019 Sesto Fiorentino (Firenze) (Italy); INFN Firenze, via B. Rossi 1, I-50019 Sesto Fiorentino (Firenze) (Italy); Agnesi, A. [University of Pavia, Dipartimento di Ingegneria Industriale e dell'Informazione, Pavia (Italy); INFN Pavia, via A. Bassi 6, I-27100 Pavia (Italy); Albergo, S. [University of Catania, Department of Physics and Astronomy, via S. Sofia 64, I-95123 Catania (Italy); INFN Catania, via S. Sofia 64, I-95123 Catania (Italy); Auditore, L. [University of Messina, Department of Physics, sal. Sperone 31, I-98166 Messina (Italy); INFN Catania, via S. Sofia 64, I-95123 Catania (Italy); Basti, A. [University of Siena, Department of Physical Sciences, Earth and Environment, I-53100 Siena (Italy); INFN Pisa, via F. Buonarroti 2, I-56127 Pisa (Italy); Berti, E. [University of Florence, Department of Physics and Astronomy, via G. Sansone 1, I-50019 Sesto Fiorentino (Firenze) (Italy); INFN Firenze, via B. Rossi 1, I-50019 Sesto Fiorentino (Firenze) (Italy); Bigongiari, G. [University of Siena, Department of Physical Sciences, Earth and Environment, I-53100 Siena (Italy); INFN Pisa, via F. Buonarroti 2, I-56127 Pisa (Italy); Bonechi, L. [INFN Firenze, via B. Rossi 1, I-50019 Sesto Fiorentino (Firenze) (Italy); Bonechi, S. [University of Siena, Department of Physical Sciences, Earth and Environment, I-53100 Siena (Italy); INFN Pisa, via F. Buonarroti 2, I-56127 Pisa (Italy); Bongi, M. [University of Florence, Department of Physics and Astronomy, via G. Sansone 1, I-50019 Sesto Fiorentino (Firenze) (Italy); INFN Firenze, via B. Rossi 1, I-50019 Sesto Fiorentino (Firenze) (Italy); Bonvicini, V. [INFN Trieste, via Valerio 2, I-34127 Trieste (Italy); and others

    2016-07-11

    Future research in high-energy cosmic-ray physics concerns fundamental questions about the origin, acceleration mechanism, and composition of cosmic rays. Unambiguous measurements of the energy spectra and of the composition of cosmic rays in the “knee” region could provide some of the answers to these questions. Because of the extremely low particle rates at these energies, only ground-based observations, which rely on sophisticated models of high-energy interactions in the Earth's atmosphere, have been possible so far. A calorimeter-based space experiment can provide not only flux measurements but also energy spectra and particle identification, especially when coupled to a dE/dx measuring detector, and can thus overcome some of the limitations plaguing ground-based experiments. For this to be possible, very large acceptances are needed if sufficient statistics are to be collected in a reasonable time. This contrasts with the lightness and compactness required of space-based experiments. A novel idea in calorimetry is discussed here which addresses these issues while limiting the mass and volume of the detector; a small prototype is currently being built and tested with ions. In this paper the results obtained are presented in light of the simulations performed.

  4. New segmentation-based tone mapping algorithm for high dynamic range image

    Science.gov (United States)

    Duan, Weiwei; Guo, Huinan; Zhou, Zuofeng; Huang, Huimin; Cao, Jianzhong

    2017-07-01

    Traditional tone mapping algorithms for displaying high dynamic range (HDR) images have the drawback of losing the impression of brightness, contrast, and color information. To overcome this drawback, we propose a new tone mapping algorithm based on dividing the image into regions of different exposure. Firstly, the over-exposure region is determined using the Local Binary Pattern information of the HDR image. Then, based on the peak and average gray level of the histogram, the under-exposure and normal-exposure regions of the HDR image are selected separately. Finally, the different exposure regions are mapped by differentiated tone mapping methods to obtain the final result. The experimental results show that the proposed algorithm achieves better performance than other algorithms, both in visual quality and according to an objective contrast criterion.
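The region-wise idea above can be illustrated with a toy sketch. This is not the authors' implementation: the LBP-based over-exposure detection and the histogram peak/average rules are replaced here by simple luminance thresholds, and the per-region gamma curves are arbitrary choices:

```python
import numpy as np

def tone_map(hdr, over_thresh=0.9, under_frac=0.5):
    """Region-wise tone mapping sketch: split an HDR luminance image into
    over-, under- and normal-exposure regions and map each differently.
    (Simple thresholding stands in for the paper's LBP/histogram rules.)"""
    lum = hdr / hdr.max()                # normalise luminance to [0, 1]
    over = lum > over_thresh             # tentative over-exposed pixels
    under = lum < under_frac * lum.mean()  # tentative under-exposed pixels
    normal = ~(over | under)

    out = np.empty_like(lum)
    out[over] = lum[over] ** 1.5         # compress highlights
    out[under] = lum[under] ** 0.5       # boost shadows
    out[normal] = lum[normal]            # leave mid-tones unchanged
    return out
```

Each exposure class receiving its own mapping curve is the essential point; a real implementation would also blend region boundaries to avoid visible seams.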

  5. A Character Segmentation Proposal for High-Speed Visual Monitoring of Expiration Codes on Beverage Cans

    Directory of Open Access Journals (Sweden)

    José C. Rodríguez-Rodríguez

    2016-04-01

    Expiration date labels are ubiquitous in the food industry. With the passage of time, almost any food becomes unhealthy, even when well preserved. The expiration date is estimated from the type and manufacture/packaging time of each particular food unit, and is then printed on the container so that it is available to the end user at the time of consumption. MONICOD (MONItoring of CODes), an industrial validator of expiration codes, reads the expiration code printed on a drink can immediately after printing. MONICOD faces difficulties due to the high printing rate (35 cans per second) and the problematic lighting caused by the metallic surface on which the code is printed. This article describes a solution that allows MONICOD to extract shapes, and presents quantitative results for its speed and quality.

  6. Artifact free T2*-weighted imaging at high spatial resolution using segmented EPI sequences

    Energy Technology Data Exchange (ETDEWEB)

    Heiler, Patrick Michael; Schad, Lothar Rudi [Heidelberg Univ., Mannheim (Germany). Computer Assisted Clinical Medicine; Schmitter, Sebastian [German Cancer Research Center, Heidelberg (Germany). Dept. of Medical Physics in Radiology

    2010-07-01

    The aim of this work was the development of novel measurement techniques that acquire high-resolution T2*-weighted datasets in measurement times as short as possible, without noticeable blurring and ghosting artifacts. Two new measurement techniques were therefore developed that acquire a smoother k-space than generic multi-shot echo planar imaging sequences: one is based on the principle of echo train shifting, the other on the reversed gradient method. Simulations and phantom measurements demonstrate that echo train shifting works properly and reduces artifacts in multi-shot echo planar imaging. For maximum SNR efficiency, this technique was further improved by adding a second contrast. Both contrasts can be acquired with a prolongation of the measurement time by a factor of 1.5, leading to an SNR increase of approximately √2. Furthermore, it is demonstrated that the reversed gradient method markedly reduces artifacts caused by a discontinuous k-space weighting. Assuming sequence parameters feasible for fMRI experiments, artifact-free T2*-weighted images with a matrix size of 256 x 256, yielding an in-plane resolution in the submillimeter range, can be obtained in about 2 s per slice. (orig.)
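The √2 gain quoted above is the standard averaging result: combining two independent acquisitions of the same signal raises SNR by √N with N = 2, at the cost of doubling the acquisition portion of the scan. A quick numerical check with synthetic data (not MRI data):

```python
import numpy as np

rng = np.random.default_rng(0)
signal, n = 1.0, 200_000

# One acquisition vs. the average of two independent acquisitions,
# each with unit-variance Gaussian noise.
single = signal + rng.normal(0.0, 1.0, n)
averaged = (single + signal + rng.normal(0.0, 1.0, n)) / 2.0

snr_single = signal / single.std()
snr_averaged = signal / averaged.std()
ratio = snr_averaged / snr_single   # ≈ sqrt(2) ≈ 1.414
```

The 1.5× (rather than 2×) time penalty reported above is possible because the second contrast reuses part of the sequence, so the SNR-per-unit-time trade is better than plain averaging.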

  7. Artifact free T2*-weighted imaging at high spatial resolution using segmented EPI sequences

    International Nuclear Information System (INIS)

    Heiler, Patrick Michael; Schad, Lothar Rudi; Schmitter, Sebastian

    2010-01-01

    The aim of this work was the development of novel measurement techniques that acquire high-resolution T2*-weighted datasets in measurement times as short as possible, without noticeable blurring and ghosting artifacts. Two new measurement techniques were therefore developed that acquire a smoother k-space than generic multi-shot echo planar imaging sequences: one is based on the principle of echo train shifting, the other on the reversed gradient method. Simulations and phantom measurements demonstrate that echo train shifting works properly and reduces artifacts in multi-shot echo planar imaging. For maximum SNR efficiency, this technique was further improved by adding a second contrast. Both contrasts can be acquired with a prolongation of the measurement time by a factor of 1.5, leading to an SNR increase of approximately √2. Furthermore, it is demonstrated that the reversed gradient method markedly reduces artifacts caused by a discontinuous k-space weighting. Assuming sequence parameters feasible for fMRI experiments, artifact-free T2*-weighted images with a matrix size of 256 x 256, yielding an in-plane resolution in the submillimeter range, can be obtained in about 2 s per slice. (orig.)

  8. Automation system for measurement of gamma-ray spectra of induced activity for multi-element high-volume neutron activation analysis at the IBR-2 reactor of FLNP at JINR

    International Nuclear Information System (INIS)

    Pavlov, S.S.; Dmitriev, A.Yu.; Chepurchenko, I.A.; Frontas'eva, M.V.

    2014-01-01

    An automation system for the measurement of gamma-ray spectra of induced activity for multi-element high-volume neutron activation analysis (NAA) was designed, developed, and implemented at the IBR-2 reactor. The system consists of three automatic sample changers serving three Canberra HPGe detector-based gamma spectrometry systems. Each sample changer consists of a two-axis linear positioning module (M202A by DriveSet, DriveSet.de) and a disk with 45 slots for containers with samples. The automatic sample changer is controlled by a Xemo S360U controller by Systec (systec.de). The positioning accuracy can reach 0.1 mm. Special software performs automatic changing of samples and measurement of gamma spectra, interacting continuously with the NAA database. The system is unique and can be recommended to other laboratories as one possible route to integrated automation of NAA.
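The measurement workflow described above (position a container under the detector, acquire a spectrum, log it to the NAA database, and repeat over the 45 disk slots) can be sketched as a simple control loop. All object interfaces below are hypothetical stand-ins, not the real Xemo controller or Canberra spectrometer APIs:

```python
DISK_SLOTS = 45  # containers per changer disk, as described above

def run_measurement_cycle(changer, spectrometer, db, count_time_s):
    """Hypothetical control loop for one sample changer: move each sample
    into the counting position, acquire a gamma spectrum, and store it.
    `changer`, `spectrometer` and `db` are illustrative stand-in objects."""
    for slot in range(DISK_SLOTS):
        changer.move_to(slot)                    # position the container
        spectrum = spectrometer.acquire(count_time_s)
        db.store(slot, spectrum)                 # log to the NAA database
```

In the real system each of the three changers runs such a cycle against its own spectrometer, with the NAA database coordinating which samples still need counting.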

  9. Breast ultrasound image segmentation: an optimization approach based on super-pixels and high-level descriptors

    Science.gov (United States)

    Massich, Joan; Lemaître, Guillaume; Martí, Joan; Mériaudeau, Fabrice

    2015-04-01

    Breast cancer is the second most common cancer and the leading cause of cancer death among women. Medical imaging has become an indispensable tool for its diagnosis and follow-up. During the last decade, the medical community has promoted the incorporation of ultrasound (US) screening into the standard routine. The main reason for using US imaging is its capability, compared to other imaging techniques, to differentiate benign from malignant masses. The increasing use of US imaging encourages the development of Computer Aided Diagnosis (CAD) systems applied to Breast Ultra-Sound (BUS) images. However, accurate delineation of the lesions and structures of the breast is essential for CAD systems in order to extract the information needed to perform diagnosis. This article proposes a highly modular and flexible framework for segmenting lesions and tissues present in BUS images. The proposal takes advantage of optimization strategies using super-pixels and high-level descriptors, which are analogous to the visual cues used by radiologists. Qualitative and quantitative results are provided, demonstrating performance within the range of the state of the art.

  10. A computational atlas of the hippocampal formation using ex vivo, ultra-high resolution MRI: Application to adaptive segmentation of in vivo MRI

    DEFF Research Database (Denmark)

    Iglesias, Juan Eugenio; Augustinack, Jean C.; Nguyen, Khoa

    2015-01-01

    level using ultra-high resolution, ex vivo MRI. Fifteen autopsy samples were scanned at 0.13 mm isotropic resolution (on average) using customized hardware. The images were manually segmented into 13 different hippocampal substructures using a protocol specifically designed for this study; precise...... datasets with different types of MRI contrast. The results show that the atlas and companion segmentation method: 1) can segment T1 and T2 images, as well as their combination, 2) replicate findings on mild cognitive impairment based on high-resolution T2 data, and 3) can discriminate between Alzheimer......'s disease subjects and elderly controls with 88% accuracy in standard resolution (1 mm) T1 data, significantly outperforming the atlas in FreeSurfer version 5.3 (86% accuracy) and classification based on whole hippocampal volume (82% accuracy)....

  11. Mixed segmentation

    DEFF Research Database (Denmark)

    Hansen, Allan Grutt; Bonde, Anders; Aagaard, Morten

    content analysis and audience segmentation in a single-source perspective. The aim is to explain and understand target groups in relation to, on the one hand, emotional response to commercials or other forms of audio-visual communication and, on the other hand, living preferences and personality traits...

  12. Multi-feature machine learning model for automatic segmentation of green fractional vegetation cover for high-throughput field phenotyping.

    Science.gov (United States)

    Sadeghi-Tehran, Pouria; Virlet, Nicolas; Sabermanesh, Kasra; Hawkesford, Malcolm J

    2017-01-01

    Accurately segmenting vegetation from the background within digital images is both a fundamental and a challenging task in phenotyping. The performance of traditional methods is satisfactory in homogeneous environments; however, performance decreases when they are applied to images acquired in dynamic field environments. In this paper, a multi-feature learning method is proposed to quantify vegetation growth in outdoor field conditions. The introduced technique is compared with state-of-the-art and other learning methods on digital images. All methods are compared and evaluated under different environmental conditions and with the following criteria: (1) comparison with ground-truth images, (2) variation along a day with changes in ambient illumination, (3) comparison with manual measurements, and (4) an estimation of performance along the full life cycle of a wheat canopy. The method described is capable of coping with the environmental challenges faced in field conditions, with a high level of adaptiveness and without the need to adjust a threshold for each digital image. The proposed method is also an ideal candidate for processing a time series of phenotypic information acquired in the field throughout crop growth. Moreover, the introduced method has the advantage that it is not limited to growth measurements but can also be applied to other tasks such as identifying weeds, diseases, and stress.
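A toy version of per-pixel, feature-based vegetation segmentation can make the idea concrete. This is a sketch only: the Excess Green/Excess Red colour indices are standard vegetation features, but the nearest-class-mean rule below stands in for the paper's multi-feature learned model:

```python
import numpy as np

def features(rgb):
    """Per-pixel colour-index features: Excess Green (ExG) and Excess Red (ExR)."""
    r, g, b = (rgb[..., i].astype(float) for i in range(3))
    total = r + g + b + 1e-9
    r, g, b = r / total, g / total, b / total   # chromatic coordinates
    exg = 2 * g - r - b                          # Excess Green index
    exr = 1.4 * r - g                            # Excess Red index
    return np.stack([exg, exr], axis=-1)

def train(feats, labels):
    """Class mean features for background (0) and vegetation (1) pixels."""
    return {c: feats[labels == c].mean(axis=0) for c in (0, 1)}

def segment(rgb, means):
    """Label each pixel by its nearest class mean in feature space."""
    f = features(rgb)
    d0 = np.linalg.norm(f - means[0], axis=-1)
    d1 = np.linalg.norm(f - means[1], axis=-1)
    return (d1 < d0).astype(np.uint8)            # 1 = vegetation
```

Because the decision is learned from labeled pixels rather than from a fixed colour threshold, the same model can be applied across images with differing illumination, which is the property the paper emphasizes.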

  13. Multi-feature machine learning model for automatic segmentation of green fractional vegetation cover for high-throughput field phenotyping

    Directory of Open Access Journals (Sweden)

    Pouria Sadeghi-Tehran

    2017-11-01

    Background: Accurately segmenting vegetation from the background within digital images is both a fundamental and a challenging task in phenotyping. The performance of traditional methods is satisfactory in homogeneous environments; however, performance decreases when they are applied to images acquired in dynamic field environments. Results: In this paper, a multi-feature learning method is proposed to quantify vegetation growth in outdoor field conditions. The introduced technique is compared with state-of-the-art and other learning methods on digital images. All methods are compared and evaluated under different environmental conditions and with the following criteria: (1) comparison with ground-truth images, (2) variation along a day with changes in ambient illumination, (3) comparison with manual measurements, and (4) an estimation of performance along the full life cycle of a wheat canopy. Conclusion: The method described is capable of coping with the environmental challenges faced in field conditions, with a high level of adaptiveness and without the need to adjust a threshold for each digital image. The proposed method is also an ideal candidate for processing a time series of phenotypic information acquired in the field throughout crop growth. Moreover, the introduced method has the advantage that it is not limited to growth measurements but can also be applied to other tasks such as identifying weeds, diseases, and stress.

  14. High performance p-type segmented leg of misfit-layered cobaltite and half-Heusler alloy

    International Nuclear Information System (INIS)

    Hung, Le Thanh; Van Nong, Ngo; Snyder, G. Jeffrey; Viet, Man Hoang; Balke, Benjamin; Han, Li; Stamate, Eugen; Linderoth, Søren; Pryds, Nini

    2015-01-01

    Highlights: • A p-type segmented leg of oxide and half-Heusler materials was demonstrated for the first time. • The maximum conversion efficiency reached a value of about 5%. • The results are among the highest values reported so far for oxide-based legs. • Oxide-based segmented legs are very promising for generating electricity. Abstract: In this study, a segmented p-type leg of the doped misfit-layered cobaltite Ca2.8Lu0.15Ag0.05Co4O9+δ and the half-Heusler alloy Ti0.3Zr0.35Hf0.35CoSb0.8Sn0.2 was fabricated and characterized. The thermoelectric properties of the single components and of the segmented leg, together with the electrical contact resistance of the joint, were measured as a function of temperature. The output power generation characteristics of the segmented legs were characterized in air under various temperature gradients ΔT, with the hot-side temperature up to 1153 K. At ΔT ≈ 756 K, the maximum conversion efficiency reached a value of ∼5%, which is about 65% of that expected from the materials without parasitic losses. A long-term stability investigation over two weeks at hot- and cold-side temperatures of 1153 K and 397 K shows that the segmented leg has good durability as a result of stable, low-resistance electrical contacts.

  15. An Automatic Segmentation Method Combining an Active Contour Model and a Classification Technique for Detecting Polycomb-group Proteins in High-Throughput Microscopy Images.

    Science.gov (United States)

    Gregoretti, Francesco; Cesarini, Elisa; Lanzuolo, Chiara; Oliva, Gennaro; Antonelli, Laura

    2016-01-01

    The large amount of data generated in biological experiments that rely on advanced microscopy can be handled only with automated image analysis. Most analyses require reliable cell image segmentation, ideally capable of detecting subcellular structures. We present an automatic segmentation method to detect Polycomb group (PcG) protein areas isolated from nuclei regions in high-resolution fluorescent cell image stacks. It combines two segmentation algorithms, based on an active contour model and a classification technique, and serves as a tool to better understand the subcellular three-dimensional distribution of PcG proteins in live cell image sequences. We obtained accurate results across several cell image datasets, coming from different cell types and corresponding to different fluorescent labels, without requiring elaborate adjustments for each dataset.

  16. Combined use of high-definition and volumetric optical coherence tomography for the segmentation of neural canal opening in cases of optic nerve edema

    Science.gov (United States)

    Wang, Jui-Kai; Kardon, Randy H.; Garvin, Mona K.

    2015-03-01

    In cases of optic-nerve-head edema, the swelling reduces the visibility of the underlying neural canal opening (NCO) within spectral-domain optical coherence tomography (SD-OCT) volumes. Consequently, traditional SD-OCT-based NCO segmentation methods often overestimate the size of the NCO. The visibility of the NCO can be improved using high-definition 2D raster scans, but such scans do not provide 3D contextual image information. In this work, we present a semi-automated approach for the segmentation of the NCO in cases of optic disc edema that combines image information from volumetric and high-definition raster SD-OCT image sequences. In particular, for each subject, five high-definition OCT B-scans and the OCT volume are first segmented separately, and then the five high-definition B-scans are automatically registered to the OCT volume. Next, six NCO points are placed (manually, in this work) in the central three high-definition OCT B-scans (two points for each central B-scan) and are automatically transferred into the OCT volume. Utilizing a combination of these mapped points and the 3D image information from the volumetric scans, a graph-based approach is used to identify the complete NCO on the OCT en-face image. The segmented NCO points obtained with the new approach were significantly closer to expert-marked points than those obtained with a traditional approach (root mean square differences in pixels: 5.34 vs. 21.71, p < 0.001).

  17. Sci-Thur AM: YIS – 03: Combining sagittally-reconstructed 3D and live-2D ultrasound for high-dose-rate prostate brachytherapy needle segmentation

    Energy Technology Data Exchange (ETDEWEB)

    Hrinivich, Thomas; Hoover, Douglas; Surry, Kathleen; Edirisinghe, Chandima; D’Souza, David; Fenster, Aaron; Wong, Eugene [University of Western Ontario, London Regional Cancer Program/LHSC, London Regional Cancer Program/LHSC, Robarts Research Institute, London Regional Cancer Program/LHSC, Robarts Research Institute, University of Western Ontario (Canada)

    2016-08-15

    Ultrasound-guided high-dose-rate prostate brachytherapy (HDR-BT) needle segmentation is performed clinically using live-2D sagittal images. Organ segmentation is then performed using axial images, introducing a source of geometric uncertainty. Sagittally-reconstructed 3D (SR3D) ultrasound enables both needle and organ segmentation, but suffers from shadow artifacts. We present a needle segmentation technique augmenting SR3D with live-2D sagittal images using mechanical probe tracking to mitigate image artifacts, and compare it to the clinical standard. Seven prostate cancer patients underwent TRUS-guided HDR-BT during which the clinical and proposed segmentation techniques were completed in parallel using dual ultrasound video outputs. Calibrated needle end-length measurements were used to calculate insertion depth errors (IDEs), and the dosimetric impact of IDEs was evaluated by perturbing clinical treatment plan source positions. The proposed technique provided smaller IDEs than the clinical approach, with mean±SD of −0.3±2.2 mm and −0.5±3.7 mm respectively. The proposed and clinical techniques resulted in 84% and 43% of needles with IDEs within ±3 mm, and IDE ranges across all needles of [−7.7 mm, 5.9 mm] and [−9.3 mm, 7.7 mm] respectively. The proposed and clinical IDEs lead to mean±SD changes in the volume of the prostate receiving the prescription dose of −0.6±0.9% and −2.0±5.3% respectively. The proposed technique provides improved HDR-BT needle segmentation accuracy over the clinical technique, leading to decreased dosimetric uncertainty by eliminating the axial-to-sagittal registration, and mitigates the effect of shadow artifacts by incorporating mechanically registered live-2D sagittal images.

  18. Experimental Evaluation for the Microvibration Performance of a Segmented PC Method Based High Technology Industrial Facility Using 1/2 Scale Test Models

    Directory of Open Access Journals (Sweden)

    Sijun Kim

    2017-01-01

    The precast concrete (PC) method used in the construction of high-technology industrial facilities is limited when applied to facilities with greater span lengths, due to the transport length restriction (maximum length of 15~16 m in Korea) set by traffic laws. To resolve this, this study introduces a structural system with a segmented PC system, and a 1/2-scale model with a width of 9000 mm (hereafter, the Segmented Model) is manufactured to evaluate its vibration performance. Since a real vibrational environment cannot be reproduced for vibration testing using a scale model, a comparative analysis of relative performance is conducted in this study. For this purpose, a 1/2-scale model with a width of 7200 mm (hereafter, the Nonsegmented Model) of a high-technology industrial facility is additionally prepared using the conventional PC method. By applying the same experimental method to both scale models and comparing the results, the relative vibration performance of the Segmented Model is observed. Through impact testing, the natural frequencies of the two scale models are compared. Also, in order to analyze the estimated response induced by equipment, the vibration responses due to an exciter are compared. The experimental results show that the Segmented Model exhibits similar or superior performance compared to the Nonsegmented Model.

  19. Market segmentation using perceived constraints

    Science.gov (United States)

    Jinhee Jun; Gerard Kyle; Andrew Mowen

    2008-01-01

    We examined the practical utility of segmenting potential visitors to Cleveland Metroparks using their constraint profiles. Our analysis identified three segments based on their scores on the dimensions of constraints: Other priorities--visitors who scored the highest on 'other priorities' dimension; Highly Constrained--visitors who scored relatively high on...

  20. The Simulation of Energy Distribution of Electrons Detected by Segmental Ionization Detector in High Pressure Conditions of ESEM

    Czech Academy of Sciences Publication Activity Database

    Neděla, Vilém; Konvalina, Ivo; Oral, Martin; Hudec, Jiří

    2015-01-01

    Vol. 21, S4 (2015), pp. 264-269 ISSN 1431-9276 R&D Projects: GA ČR(CZ) GA14-22777S Institutional support: RVO:68081731 Keywords: electron-gas interactions * Monte Carlo simulation * signal amplification * segmented ionization detector Subject RIV: JA - Electronics; Optoelectronics, Electrical Engineering Impact factor: 1.730, year: 2015

  1. An efficient and high fidelity method for amplification, cloning and sequencing of complete tospovirus genomic RNA segments

    Science.gov (United States)

    Amplification and sequencing of the complete M- and S-RNA segments of Tomato spotted wilt virus and Impatiens necrotic spot virus as a single fragment is useful for whole genome sequencing of tospoviruses co-infecting a single host plant. It avoids issues associated with overlapping amplicon-based ...

  2. Segmental colonic dilation is associated with premature termination of high-amplitude propagating contractions in children with intractable functional constipation

    NARCIS (Netherlands)

    Koppen, I. J. N.; Thompson, B. P.; Ambeba, E. J.; Lane, V. A.; Bates, D. G.; Minneci, P. C.; Deans, K. J.; Levitt, M. A.; Wood, R. J.; Benninga, M. A.; Di Lorenzo, C.; Yacob, D.

    2017-01-01

    Background: Colonic dilation is common in children with intractable functional constipation (FC). Our aim was to describe the association between segmental colonic dilation and colonic dysmotility in children with FC. Methods: We performed a retrospective study on 30 children with intractable FC

  3. High performance p-type segmented leg of misfit-layered cobaltite and half-Heusler alloy

    DEFF Research Database (Denmark)

    Le, Thanh Hung; Van Nong, Ngo; Snyder, Gerald Jeffrey

    2015-01-01

    of the joint part were measured as a function of temperature. The output power generation characteristics of segmented legs were characterized in air under various temperature gradients, ΔT, with the hot side temperature up to 1153 K. At ΔT ≈756 K, the maximum conversion efficiency reached a value of ∼5...

  4. High-resolution coronary MR angiography for evaluation of patients with anomalous coronary arteries: visualization of the intramural segment

    Energy Technology Data Exchange (ETDEWEB)

    Biko, David M. [UCSF Benioff Children's Hospital Oakland, Department of Diagnostic Imaging, Oakland, CA (United States); The Children's Hospital of Philadelphia, Department of Radiology, Philadelphia, PA (United States); Chung, Claudia; Chung, Taylor [UCSF Benioff Children's Hospital Oakland, Department of Diagnostic Imaging, Oakland, CA (United States); Hitt, David M. [Philips Healthcare, Cleveland, OH (United States); Kurio, Gregory [UCSF Benioff Children's Hospital Oakland, Department of Cardiology, Oakland, CA (United States); Reinhartz, Olaf [UCSF Benioff Children's Hospital Oakland, Department of Cardiac Surgery, Oakland, CA (United States)

    2015-08-15

    Anomalous origin of the coronary artery from the contralateral coronary sinus is a rare coronary anomaly associated with sudden death. The inter-arterial course is most closely associated with sudden death, but it has been suggested that the presence of an intramural segment of an anomalous right coronary artery is associated with more symptoms and may therefore be an important criterion for intervention in these patients. Our aim was to demonstrate that MR angiography can accurately determine the presence or absence of an intramural segment in an anomalous coronary artery. All studies of children who underwent MR angiography for the evaluation of an anomalous coronary artery were retrospectively reviewed by two pediatric radiologists in consensus. Criteria for an intramural anomalous coronary artery were the presence of a small or slit-like ostium and the relatively smaller size of the proximal intramural portion of the coronary artery in relation to the more distal epicardial coronary artery. The anomalous coronary artery was classified as not intramural if these two findings were absent. These findings were correlated with operative reports confirming the presence or absence of an intramural segment. Twelve patients (86%) met MR angiography criteria for the presence of an intramural course, and only 2 patients (14%) met the criteria for a non-intramural course. When correlated with intraoperative findings, MR angiography successfully distinguished between intramural and non-intramural anomalous coronary arteries in all cases (P = 0.01). MR angiography may be able to reliably identify the intramural segment of an anomalous coronary artery in older children using the imaging criteria of a small or slit-like ostium and a relative decrease in size of the proximal portion of the anomalous coronary artery compared to the distal portion. Determining the presence of the intramural segment may help with surgical planning and may be an important

  5. Rediscovering market segmentation.

    Science.gov (United States)

    Yankelovich, Daniel; Meer, David

    2006-02-01

    In 1964, Daniel Yankelovich introduced in the pages of HBR the concept of nondemographic segmentation, by which he meant the classification of consumers according to criteria other than age, residence, income, and such. The predictive power of marketing studies based on demographics was no longer strong enough to serve as a basis for marketing strategy, he argued. Buying patterns had become far better guides to consumers' future purchases. In addition, properly constructed nondemographic segmentations could help companies determine which products to develop, which distribution channels to sell them in, how much to charge for them, and how to advertise them. But more than 40 years later, nondemographic segmentation has become just as unenlightening as demographic segmentation had been. Today, the technique is used almost exclusively to fulfill the needs of advertising, which it serves mainly by populating commercials with characters that viewers can identify with. It is true that psychographic types like "High-Tech Harry" and "Joe Six-Pack" may capture some truth about real people's lifestyles, attitudes, self-image, and aspirations. But they are no better than demographics at predicting purchase behavior. Thus they give corporate decision makers very little idea of how to keep customers or capture new ones. Now, Daniel Yankelovich returns to these pages, with consultant David Meer, to argue the case for a broad view of nondemographic segmentation. They describe the elements of a smart segmentation strategy, explaining how segmentations meant to strengthen brand identity differ from those capable of telling a company which markets it should enter and what goods to make. And they introduce their "gravity of decision spectrum", a tool that focuses on the form of consumer behavior that should be of the greatest interest to marketers--the importance that consumers place on a product or product category.

  6. A general strategy for performing temperature-programming in high performance liquid chromatography--prediction of segmented temperature gradients.

    Science.gov (United States)

    Wiese, Steffen; Teutenberg, Thorsten; Schmidt, Torsten C

    2011-09-28

In the present work it is shown that the linear elution strength (LES) model, adapted from temperature-programmed gas chromatography (GC), can also be employed to predict retention times for segmented temperature gradients in liquid chromatography (LC) with high accuracy, based on temperature-gradient input data. The LES model assumes that retention times for isothermal separations can be predicted from two temperature-gradient runs, and it is employed to calculate the retention factor of an analyte when the start temperature of the temperature gradient is changed. In this study it was investigated whether this approach can also be employed in LC. It was shown that this approximation cannot be transferred to temperature-programmed LC over the investigated temperature range of 60°C up to 180°C: major relative errors of up to 169.6% were observed for isothermal retention-factor predictions. In order to predict retention times for temperature gradients with different start temperatures in LC, another relationship is required to describe the influence of temperature on retention. Therefore, retention times for isothermal separations were predicted from isothermal input runs using a plot of the natural logarithm of the retention factor vs. the inverse temperature and a plot of the natural logarithm of the retention factor vs. temperature. It could be shown that a plot of ln k vs. T yields more reliable isothermal/isocratic retention time predictions than the usually employed plot of ln k vs. 1/T. Hence, in order to predict retention times for temperature gradients with different start temperatures in LC, two temperature-gradient and two isothermal measurements were employed. In this case, retention times can be predicted with a maximal relative error of 5.5% (average relative error: 2.9%). In comparison, if the start temperature of the simulated temperature gradient is equal to the start temperature of the input data, only two temperature
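As a rough illustration of the ln k vs. T relationship favoured in the abstract, the following sketch fits ln k linearly in T from two isothermal input runs and interpolates the retention factor at an intermediate temperature. The retention data are hypothetical, not the paper's.

```python
import numpy as np

# Two isothermal input runs (hypothetical data): column temperature (deg C)
# and the measured retention factor k of one analyte.
T_in = np.array([60.0, 180.0])
k_in = np.array([9.0, 0.5])

# The ln k vs. T model the abstract found more reliable than the
# van 't Hoff-style ln k vs. 1/T plot:  ln k = a + b*T
b, a = np.polyfit(T_in, np.log(k_in), 1)

def predict_k(T):
    """Predict the isothermal retention factor at temperature T (deg C)."""
    return np.exp(a + b * T)

# Interpolated retention factor at an intermediate temperature.
k_120 = predict_k(120.0)
```

Because the model is linear in ln k, the prediction at the midpoint temperature is the geometric mean of the two input retention factors.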

  7. Scorpion image segmentation system

    Science.gov (United States)

    Joseph, E.; Aibinu, A. M.; Sadiq, B. A.; Bello Salau, H.; Salami, M. J. E.

    2013-12-01

Death as a result of scorpion stings has been a major public health problem in developing countries. Despite the high rate of death resulting from scorpion stings, few reports exist in the literature on intelligent devices and systems for the automatic detection of scorpions. This paper proposes a digital image processing approach, based on the fluorescence of scorpions under ultraviolet (UV) light, for their automatic detection and identification. The acquired UV-based images undergo pre-processing to equalize uneven illumination, followed by colour-space channel separation. The extracted channels are then segmented into two non-overlapping classes. It has been observed that simple thresholding of the green channel of the acquired RGB UV-based image is sufficient for segmenting the scorpion from other background components in the acquired image. Two approaches to image segmentation are proposed in this work, namely, a simple average-thresholding technique and K-means image segmentation. The proposed algorithm has been tested on over 40 UV scorpion images obtained from different parts of the world, and the results show an average accuracy of 97.7% in correctly classifying the pixels into two non-overlapping clusters. The proposed system will eliminate the problems associated with some of the existing manual approaches presently in use for scorpion detection.
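The two segmentation routes described in the abstract, global thresholding of the green channel and two-class K-means on pixel intensities, can be sketched on a synthetic stand-in image. All data and parameters here are hypothetical, not the authors'.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the green channel of a UV image: bright
# fluorescing "scorpion" pixels on a darker background (values in [0, 255]).
image = rng.normal(40, 10, size=(64, 64))
image[20:40, 20:44] = rng.normal(200, 15, size=(20, 24))  # scorpion region
image = np.clip(image, 0, 255)

# Approach 1: simple global threshold on the channel (the mean is a crude
# split; Otsu's method would be a refinement).
threshold = image.mean()
mask_threshold = image > threshold

# Approach 2: one-dimensional K-means with k=2 on pixel intensities.
def kmeans_two_classes(values, iters=20):
    centers = np.array([values.min(), values.max()], dtype=float)
    for _ in range(iters):
        labels = np.abs(values[:, None] - centers[None, :]).argmin(axis=1)
        for c in range(2):
            if np.any(labels == c):
                centers[c] = values[labels == c].mean()
    return labels, centers

labels, centers = kmeans_two_classes(image.ravel())
mask_kmeans = labels.reshape(image.shape) == centers.argmax()  # bright cluster
```

On this toy image both masks recover the bright rectangle; on real data the K-means boundary adapts to the intensity distribution instead of sitting at the global mean.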

  8. An amino-terminal segment of hantavirus nucleocapsid protein presented on hepatitis B virus core particles induces a strong and highly cross-reactive antibody response in mice

    International Nuclear Information System (INIS)

    Geldmacher, Astrid; Skrastina, Dace; Petrovskis, Ivars; Borisova, Galina; Berriman, John A.; Roseman, Alan M.; Crowther, R. Anthony; Fischer, Jan; Musema, Shamil; Gelderblom, Hans R.; Lundkvist, Aake; Renhofa, Regina; Ose, Velta; Krueger, Detlev H.; Pumpens, Paul; Ulrich, Rainer

    2004-01-01

Previously, we have demonstrated that hepatitis B virus (HBV) core particles tolerate the insertion of the amino-terminal 120 amino acids (aa) of the Puumala hantavirus nucleocapsid (N) protein. Here, we demonstrate that the insertion of the 120 amino-terminal aa of N proteins from the highly virulent Dobrava and Hantaan hantaviruses also allows the formation of chimeric core particles. These particles expose the inserted foreign protein segments, at least in part, on their surface. Analysis by electron cryomicroscopy of chimeric particles harbouring the Puumala virus (PUUV) N segment revealed 90% T = 3 and 10% T = 4 shells. A map computed from T = 3 shells shows additional density splaying out from the tips of the spikes, producing the effect of an extra shell of density at an outer radius compared with wild-type shells. The inserted Puumala virus N protein segment is flexibly linked to the core spikes and only partially icosahedrally ordered. Immunisation of mice of two different haplotypes (BALB/c and C57BL/6) with chimeric core particles induces a high-titered and highly cross-reactive N-specific antibody response in both mouse strains.

  9. Segmentation of 3-D High-Frequency Ultrasound Images of Human Lymph Nodes Using Graph Cut With Energy Functional Adapted to Local Intensity Distribution.

    Science.gov (United States)

    Kuo, Jen-Wei; Mamou, Jonathan; Wang, Yao; Saegusa-Beecroft, Emi; Machi, Junji; Feleppa, Ernest J

    2017-10-01

    Previous studies by our group have shown that 3-D high-frequency quantitative ultrasound (QUS) methods have the potential to differentiate metastatic lymph nodes (LNs) from cancer-free LNs dissected from human cancer patients. To successfully perform these methods inside the LN parenchyma (LNP), an automatic segmentation method is highly desired to exclude the surrounding thin layer of fat from QUS processing and accurately correct for ultrasound attenuation. In high-frequency ultrasound images of LNs, the intensity distribution of LNP and fat varies spatially because of acoustic attenuation and focusing effects. Thus, the intensity contrast between two object regions (e.g., LNP and fat) is also spatially varying. In our previous work, nested graph cut (GC) demonstrated its ability to simultaneously segment LNP, fat, and the outer phosphate-buffered saline bath even when some boundaries are lost because of acoustic attenuation and focusing effects. This paper describes a novel approach called GC with locally adaptive energy to further deal with spatially varying distributions of LNP and fat caused by inhomogeneous acoustic attenuation. The proposed method achieved Dice similarity coefficients of 0.937±0.035 when compared with expert manual segmentation on a representative data set consisting of 115 3-D LN images obtained from colorectal cancer patients.
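The agreement metric reported above, the Dice similarity coefficient, is straightforward to compute from two binary masks. A minimal sketch with toy masks (the masks are illustrative, not the study's data):

```python
import numpy as np

def dice_coefficient(seg_a, seg_b):
    """Dice similarity coefficient between two binary masks:
    DSC = 2|A ∩ B| / (|A| + |B|)."""
    seg_a = np.asarray(seg_a, dtype=bool)
    seg_b = np.asarray(seg_b, dtype=bool)
    denom = seg_a.sum() + seg_b.sum()
    if denom == 0:
        return 1.0  # both masks empty: define as perfect agreement
    return 2.0 * np.logical_and(seg_a, seg_b).sum() / denom

# Toy example: an automatic LNP mask vs. a slightly shifted "manual" mask.
auto = np.zeros((50, 50), dtype=bool)
auto[10:40, 10:40] = True
manual = np.zeros((50, 50), dtype=bool)
manual[12:42, 10:40] = True

dsc = dice_coefficient(auto, manual)
```

A DSC of 1.0 means identical masks; values around 0.94, as reported, indicate near-complete overlap with small boundary differences.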

  10. High-fidelity and low-latency mobile fronthaul based on segment-wise TDM and MIMO-interleaved arraying.

    Science.gov (United States)

    Li, Longsheng; Bi, Meihua; Miao, Xin; Fu, Yan; Hu, Weisheng

    2018-01-22

In this paper, we demonstrate an advanced arraying scheme for TDM-based analog mobile fronthaul systems that enhances signal fidelity, in which a segment of the antenna carrier signal (AxC) of appropriate length serves as the granularity for TDM aggregation. Without introducing extra processing, the entire system can be realized with simple DSP. A theoretical analysis is presented to verify the feasibility of this scheme, and to evaluate its effectiveness, an experiment with ~7-GHz bandwidth and 20 groups of 8 × 8 MIMO signals is conducted. Results show that segment-wise TDM is fully compatible with MIMO-interleaved arraying, which is employed in an existing TDM scheme to improve bandwidth efficiency. Moreover, compared to existing TDM schemes, our scheme not only satisfies the latency requirement of 5G but also significantly reduces the multiplexed signal bandwidth, hence providing higher signal fidelity in the bandwidth-limited fronthaul system. The experimental EVM results verify that 256-QAM is supportable using segment-wise TDM arraying with only 250-ns latency, whereas with ordinary TDM arraying only 64-QAM is attainable.
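A minimal sketch of the segment-wise TDM idea (not the authors' implementation): each AxC stream is cut into fixed-length segments that are interleaved round-robin into one multiplexed stream, and the receiver reverses the operation. Stream counts and segment length below are arbitrary.

```python
import numpy as np

def tdm_multiplex(streams, seg_len):
    """Segment-wise TDM: take seg_len samples from each AxC stream in turn."""
    streams = np.asarray(streams)
    n_streams, n_samples = streams.shape
    assert n_samples % seg_len == 0
    # (stream, segment, sample) -> (segment, stream, sample) -> flat stream
    segs = streams.reshape(n_streams, -1, seg_len)
    return segs.transpose(1, 0, 2).reshape(-1)

def tdm_demultiplex(flat, n_streams, seg_len):
    """Inverse operation at the receiver."""
    segs = np.asarray(flat).reshape(-1, n_streams, seg_len)
    return segs.transpose(1, 0, 2).reshape(n_streams, -1)

# Four hypothetical AxC streams of 12 samples each, segment length 3.
streams = np.arange(48).reshape(4, 12)
muxed = tdm_multiplex(streams, seg_len=3)
recovered = tdm_demultiplex(muxed, n_streams=4, seg_len=3)
```

With seg_len=1 this degenerates to sample-by-sample MIMO interleaving; a longer segment is the "appropriate length" knob the abstract refers to.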

  11. Metrological tests of a 200 L calibration source for HPGE detector systems for assay of radioactive waste drums.

    Science.gov (United States)

    Boshkova, T; Mitev, K

    2016-03-01

    In this work we present test procedures, approval criteria and results from two metrological inspections of a certified large volume (152)Eu source (drum about 200L) intended for calibration of HPGe gamma assay systems used for activity measurement of radioactive waste drums. The aim of the inspections was to prove the stability of the calibration source during its working life. The large volume source was designed and produced in 2007. It consists of 448 identical sealed radioactive sources (modules) apportioned in 32 transparent plastic tubes which were placed in a wooden matrix which filled the drum. During the inspections the modules were subjected to tests for verification of their certified characteristics. The results show a perfect compliance with the NIST basic guidelines for the properties of a radioactive certified reference material (CRM) and demonstrate the stability of the large volume CRM-drum after 7 years of operation. Copyright © 2015 Elsevier Ltd. All rights reserved.

  12. Investigation of n{sup +} surface events in HPGe detectors for liquid argon background rejection in GERDA

    Energy Technology Data Exchange (ETDEWEB)

    Lehnert, Bjoern [TU-Dresden, Dresden (Germany); Collaboration: GERDA-Collaboration

    2015-07-01

The GERDA experiment is searching for neutrinoless double beta decay (0νββ) in {sup 76}Ge using an array of germanium detectors immersed in liquid argon (LAr). Phase II of the experiment aims to improve the background level by a factor of 10 in order to reach 10{sup -3} counts/(kg.keV.yr). A strong suppression technique is required for the intrinsic LAr background from {sup 42}Ar/{sup 42}K. 30 newly produced p-type Broad Energy Germanium (BEGe) detectors will be deployed in Phase II. The n{sup +} electrode of the GERDA BEGe detectors covers 96-98 % of the surface and is between 0.5 and 1.2 mm thick. Betas from the {sup 42}K decay can penetrate the detector surface and deposit energies within the 0νββ region. Experience from GERDA Phase I shows that these surface events are the dominant background component without suppression. Energy depositions inside the n{sup +} layer create pulse shapes that are slower than those from interactions in the bulk. This talk presents a rejection technique for those events. The signal development inside the n{sup +} layer is modeled and applied in Geant4 Monte Carlo simulations. The simulations are compared with data from {sup 241}Am and {sup 90}Sr calibration source measurements. The suppression capabilities are extrapolated to {sup 42}K in GERDA Phase II.

  13. Response function of an HPGe detector simulated through MCNP 4A varying the density and chemical composition of the matrix

    International Nuclear Information System (INIS)

    Leal A, B.; Mireles G, F.; Quirino T, L.; Pinedo, J.L.

    2005-01-01

In the area of radiological safety, a detection system calibrated in energy and efficiency is required for the determination of activity concentrations in samples that vary in chemical composition and therefore in density. The nuclear engineering area requires the degree of isotopic enrichment of the uranium in the Chicago 9000 Mark subcritical nuclear assembly to be determined. Given the experimental importance of determining the efficiency curves for establishing quantitative results, the response function of the detector used at the Regional Center of Nuclear Studies was simulated within the energy range of 80 keV to 1400 keV, varying the density and chemical composition of the matrix, by means of the Monte Carlo code MCNP-4A. The results obtained in the simulation of the response function of the detector show good agreement in the range from 500 to 1400 keV, with a percentage discrepancy smaller than 10%; in the low-energy range from 59 to 400 keV, the percentage discrepancy varies from 17% to 30%, which is reflected in the isotopic ratios found for 5 fuel rods of the subcritical nuclear assembly. (Author)

  14. Metrological tests of a 200 L calibration source for HPGE detector systems for assay of radioactive waste drums

    International Nuclear Information System (INIS)

    Boshkova, T.; Mitev, K.

    2016-01-01

In this work we present test procedures, approval criteria and results from two metrological inspections of a certified large volume 152Eu source (drum about 200 L) intended for calibration of HPGe gamma assay systems used for activity measurement of radioactive waste drums. The aim of the inspections was to prove the stability of the calibration source during its working life. The large volume source was designed and produced in 2007. It consists of 448 identical sealed radioactive sources (modules) apportioned in 32 transparent plastic tubes which were placed in a wooden matrix which filled the drum. During the inspections the modules were subjected to tests for verification of their certified characteristics. The results show a perfect compliance with the NIST basic guidelines for the properties of a radioactive certified reference material (CRM) and demonstrate the stability of the large volume CRM-drum after 7 years of operation. - Highlights: • Large (200 L) volume drum source designed, produced and certified as CRM in 2007. • Source contains 448 identical sealed radioactive 152Eu sources (modules). • Two metrological inspections in 2011 and 2014. • No statistically significant changes of the certified characteristics over time. • Stable calibration source for HPGe-gamma radioactive waste assay systems.

  15. Efficiency calibration of x-ray HPGe detectors for photons with energies above the Ge K binding energy

    Energy Technology Data Exchange (ETDEWEB)

    Maidana, Nora L., E-mail: nmaidana@if.usp.br [Instituto de Física, Universidade de São Paulo, Travessa R 187, Cidade Universitária, CEP:05508-900 São Paulo, SP (Brazil); Vanin, Vito R.; Jahnke, Viktor [Instituto de Física, Universidade de São Paulo, Travessa R 187, Cidade Universitária, CEP:05508-900 São Paulo, SP (Brazil); Fernández-Varea, José M. [Facultat de Física (ECM and ICC), Universitat de Barcelona, Diagonal 645, E-08028 Barcelona (Spain); Martins, Marcos N. [Instituto de Física, Universidade de São Paulo, Travessa R 187, Cidade Universitária, CEP:05508-900 São Paulo, SP (Brazil); Brualla, Lorenzo [NCTeam, Strahlenklinik, Universitätsklinikum Essen, Hufelandstraße 55, D-45122 Essen (Germany)

    2013-11-21

We report on the efficiency calibration of an HPGe x-ray detector using radioactive sources and an analytical expression taken from the literature, in two different arrangements, with and without a broad-angle collimator. The frontal surface of the Ge crystal was scanned with pencil beams of photons. The Ge dead layer was found to be nonuniform, with central and intermediate regions that have thin (μm range) and thick (mm range) dead layers, respectively, surrounded by an insensitive ring. We discuss how this fact explains the observed efficiency curves and generalize the adopted model. We show that changes in the thickness of the Ge-crystal dead layer affect the efficiency of x-ray detectors, but the use of an appropriate broad-beam external collimator limiting the photon flux to the thin dead layer in the central region leads to the expected efficiency dependence on energy and renders the calibration simpler.

  16. Determination of the detection efficiency of a HPGe detector by means of the MCNP 4A simulation code

    International Nuclear Information System (INIS)

    Leal, B.

    2004-01-01

In most laboratories, the efficiency calibration of the detector is carried out by measuring standard sources of gamma photons that have a known activity, or matrices that contain a variety of radionuclides spanning the energy range of interest. Given the experimental importance of determining the efficiency curves for establishing quantitative results, the response function of the detector used at the Regional Center of Nuclear Studies was simulated within the energy range of 80 keV to 1400 keV, varying the density of the matrix, by means of the Monte Carlo code MCNP-4A. The fit obtained shows good agreement in the range of 100 to 600 keV, with a percentage discrepancy smaller than 5%. (Author)

  17. Calibration of a portable HPGe detector using MCNP code for the determination of 137Cs in soils

    International Nuclear Information System (INIS)

    Gutierrez-Villanueva, J.L.; Martin-Martin, A.; Pena, V.; Iniguez, M.P.; Celis, B. de

    2008-01-01

    In situ gamma spectrometry provides a fast method to determine 137 Cs inventories in soils. To improve the accuracy of the estimates, one can use not only the information on the photopeak count rates but also on the peak to forward-scatter ratios. Before applying this procedure to field measurements, a calibration including several experimental simulations must be carried out in the laboratory. In this paper it is shown that Monte Carlo methods are a valuable tool to minimize the number of experimental measurements needed for the calibration

  18. Calibration of a portable HPGe detector using MCNP code for the determination of 137Cs in soils.

    Science.gov (United States)

    Gutiérrez-Villanueva, J L; Martín-Martín, A; Peña, V; Iniguez, M P; de Celis, B; de la Fuente, R

    2008-10-01

    In situ gamma spectrometry provides a fast method to determine (137)Cs inventories in soils. To improve the accuracy of the estimates, one can use not only the information on the photopeak count rates but also on the peak to forward-scatter ratios. Before applying this procedure to field measurements, a calibration including several experimental simulations must be carried out in the laboratory. In this paper it is shown that Monte Carlo methods are a valuable tool to minimize the number of experimental measurements needed for the calibration.

  19. Calibration of HPGe detector for in situ measurements of 137Cs in soil by 'peak to valley' method

    International Nuclear Information System (INIS)

    Fueloep, M.

    2000-01-01

The contamination of soil with gamma-ray emitters can be measured in two ways: the soil-sampling method and in situ spectrometry of the ambient gamma radiation. The conventional soil-sampling method has two disadvantages: samples may not be representative of large areas, and determination of the depth distribution of a radionuclide requires the measurement of several samples taken from different depths. In situ measurement of radionuclide activity in soil is more sensitive and provides more representative data than soil sample collection and subsequent laboratory analysis. In emergency situations, the time needed to assess the contamination is critical; for rapid assessment of the deposited activity, direct measurements of the ambient gamma radiation are used. In order to obtain accurate measurements of radionuclides in the soil, the detector should be placed on relatively even and open terrain. It is customary practice to place the detector 1 m above the soil surface. At this height, a tripod-mounted detector can be handled easily and still provides a radius of view for gamma-emitting sources out to about 10 m. The 'field of view' actually varies, being somewhat larger for higher sources. Depending upon source energy, the detector effectively sees down to a depth of 15-30 cm. The method commonly used for field gamma spectrometry is that of Beck (1). Its most important disadvantage is that the accuracy of the analysis depends on separate knowledge of the radioactivity distribution with soil depth. This information can be obtained by calculations using data from in situ measurements, the energy dependence of absorption and scattering of photons in soil, and the track-length distribution of photons in soil (2). A method for in situ measurement of 137 Cs in soil, in which the radionuclide distribution in the soil profile is calculated by unfolding the detector responses in the full-energy-peak net area at 0.662 MeV and in the valley under the peak (peak-to-valley method), has been developed at IPCM (3). The detector is used with and without a collimator in order to obtain more independent responses for unfolding, which results in better resolution of the radionuclide distribution. (author)
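The unfolding step in the peak-to-valley method can be illustrated with a toy linear response model (all numbers below are hypothetical, not IPCM calibration data): the full-energy-peak net area and the valley counts respond with different weights to surface and buried activity, so two measured responses suffice to solve for two depth components.

```python
import numpy as np

# Toy response matrix: rows are the two measured quantities (0.662 MeV
# full-energy-peak net area; valley continuum under the peak), columns are
# two depth components of the 137Cs activity (surface, buried).
# Buried activity feeds the valley more, via downscattered photons.
R = np.array([[0.80, 0.30],   # peak counts per unit activity
              [0.10, 0.45]])  # valley counts per unit activity

true_activity = np.array([5.0, 2.0])   # surface, buried (arbitrary units)
measured = R @ true_activity           # simulated detector responses

# Unfolding: invert the linear response model to recover the depth profile.
unfolded = np.linalg.solve(R, measured)
```

Measuring with and without a collimator, as described above, adds further rows to R, making the unfolding overdetermined and the recovered depth profile better resolved.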

  20. A method for the complete analysis of NORM building materials by γ-ray spectrometry using HPGe detectors.

    Science.gov (United States)

    Quintana, B; Pedrosa, M C; Vázquez-Canelas, L; Santamaría, R; Sanjuán, M A; Puertas, F

    2018-04-01

    A methodology including software tools for analysing NORM building materials and residues by low-level gamma-ray spectrometry has been developed. It comprises deconvolution of gamma-ray spectra using the software GALEA with focus on the natural radionuclides and Monte Carlo simulations for efficiency and true coincidence summing corrections. The methodology has been tested on a range of building materials and validated against reference materials. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Highly segmented CVD diamond detectors and high-resolution momentum measurements in knockout reactions; Hochsegmentierte CVD Diamant Detektoren und hochaufloesende Impulsmessungen in Knockout Reaktionen

    Energy Technology Data Exchange (ETDEWEB)

    Schwertel, Sabine

    2009-11-26

    highly segmented detectors with an efficiency {epsilon}>98 % could be built from this material. The diamond detectors were segmented in our laboratory and achieved a time resolution of {sigma}{sub t}=75 ps. Medium-size (25.4 x 25.4 mm{sup 2}) micro-strip detectors were tested at the FRS and at the ALADIN/LAND setup at GSI. The obtained position resolution was in the range of the strip size of 200 {mu}m. First full-size detectors (50 x 50 mm{sup 2}) will be completed soon. (orig.)

  2. Segmental distribution of high-volume caudal anesthesia in neonates, infants, and toddlers as assessed by ultrasonography.

    Science.gov (United States)

    Lundblad, Märit; Lönnqvist, Per-Arne; Eksborg, Staffan; Marhofer, Peter

    2011-02-01

The aim of this prospective, age-stratified, observational study was to determine the cranial extent of spread of a large-volume (1.5 ml·kg(-1), ropivacaine 0.2%), single-shot caudal epidural injection using real-time ultrasonography. Fifty ASA I-III children were included in the study, stratified into three age groups: neonates, infants (1-12 months), and toddlers (1-4 years). The caudal blocks were performed during ultrasonographic observation of the spread of local anesthetic (LA) in the epidural space. A significant inverse relationship was found between age, weight, and height, and the maximal cranial level reached by 1.5 ml·kg(-1) of LA. In neonates, 93% of the blocks reached a cranial level of ≥Th12 vs 73% and 25% in infants and toddlers, respectively. Based on our data, a predictive equation of segmental spread was generated: Dose (ml/spinal segment) = 0.1539·(BW in kg) - 0.0937. This study found an inverse relationship between age, weight, and height and the number of segments covered by a caudal injection of 1.5 ml·kg(-1) of ropivacaine 0.2% in children 0-4 years of age. However, the cranial spread of local anesthetics within the spinal canal as assessed by immediate ultrasound visualization was found to be in poor agreement with previously published predictive equations that are based on actual cutaneous dermatomal testing. © 2010 Blackwell Publishing Ltd.

  3. Study of the morphology exhibited by linear segmented polyurethanes

    International Nuclear Information System (INIS)

    Pereira, I.M.; Orefice, R.L.

    2009-01-01

Five series of segmented polyurethanes with different hard-segment contents were prepared by the prepolymer mixing method. The nano-morphology of the obtained polyurethanes and their microphase separation were investigated by infrared spectroscopy, modulated differential scanning calorimetry and small-angle X-ray scattering. Although highly hydrogen-bonded hard segments were formed, high hard-segment contents promoted phase mixing and decreased chain mobility, suppressing hard-segment domain precipitation and soft-segment crystallization. The applied techniques were able to show that the hard-segment content and the hard-segment interactions were the two controlling factors determining the structure of segmented polyurethanes. (author)

  4. Can masses of non-experts train highly accurate image classifiers? A crowdsourcing approach to instrument segmentation in laparoscopic images.

    Science.gov (United States)

    Maier-Hein, Lena; Mersmann, Sven; Kondermann, Daniel; Bodenstedt, Sebastian; Sanchez, Alexandro; Stock, Christian; Kenngott, Hannes Gotz; Eisenmann, Mathias; Speidel, Stefanie

    2014-01-01

    Machine learning algorithms are gaining increasing interest in the context of computer-assisted interventions. One of the bottlenecks so far, however, has been the availability of training data, typically generated by medical experts with very limited resources. Crowdsourcing is a new trend that is based on outsourcing cognitive tasks to many anonymous untrained individuals from an online community. In this work, we investigate the potential of crowdsourcing for segmenting medical instruments in endoscopic image data. Our study suggests that (1) segmentations computed from annotations of multiple anonymous non-experts are comparable to those made by medical experts and (2) training data generated by the crowd is of the same quality as that annotated by medical experts. Given the speed of annotation, scalability and low costs, this implies that the scientific community might no longer need to rely on experts to generate reference or training data for certain applications. To trigger further research in endoscopic image processing, the data used in this study will be made publicly available.

  5. A quantitative analysis of two-dimensional manually segmented transrectal ultrasound axial images in planning high dose rate brachytherapy for prostate cancer

    Directory of Open Access Journals (Sweden)

    Dabić-Stanković Kata

    2017-01-01

Background/Aim. Prostate delineation, pre-planning and catheter implantation procedures in high-dose-rate brachytherapy (HDR-BT) are commonly based on manually segmented transrectal ultrasound (TRUS) images of the prostate. The aim of this study was to quantitatively analyze the consistency of prostate capsule delineation, done by a single therapist prior to each HDR-BT fraction, and the changes in the shape of the prostate capsule during HDR-BT, using two-dimensional (2D) TRUS axial images. Methods. A group of 16 patients was treated at the Medical System Belgrade Brachytherapy Department with definitive HDR-BT. The total applied median dose of 52 Gy was divided into four individual fractions, each fraction being delivered 2-3 weeks apart. Real-time axial visualization of the prostate and manual segmentation prior to each fraction were performed using B-K Medical ultrasound. Quantitative analyses of area and shape were applied to the 2D-TRUS axial images of the prostate. Area analysis was used to calculate the average value of the cross-sectional area of the prostate image. The parameters of prostate shape, the fractal dimension and the circularity ratio of the prostate capsule contour, were estimated at the maximum axial cross-section of the prostate image. Results. The sample group consisted of four phases, each performed prior to the first, second, third and fourth HDR-BT fraction, respectively. Statistical analysis showed that during the HDR-BT fractions there were no significant differences in the average value of the area or in the maximum shape of the prostate capsule. Conclusions. Quantitative analysis of the segmented TRUS axial prostate images shows successful capsule delineation in the series of manually segmented TRUS images, with the maximum prostate shape remaining unchanged during the HDR-BT fractions.
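One of the shape parameters named above, the circularity ratio, is commonly defined as 4πA/P², which is 1 for a perfect circle and smaller for irregular capsule outlines. A small sketch under that assumed definition, on toy contours:

```python
import numpy as np

def circularity_ratio(x, y):
    """Circularity 4*pi*A/P^2 of a closed contour given as vertex arrays."""
    # Shoelace formula for the enclosed area of the polygon.
    area = 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))
    # Perimeter of the closed polygon (closing edge included).
    perim = np.sqrt(np.diff(np.append(x, x[0])) ** 2 +
                    np.diff(np.append(y, y[0])) ** 2).sum()
    return 4.0 * np.pi * area / perim ** 2

# A many-sided regular polygon approximates a circle: ratio -> 1.
t = np.linspace(0.0, 2.0 * np.pi, 200, endpoint=False)
c_circle = circularity_ratio(np.cos(t), np.sin(t))

# A square is less "circular": 4*pi*s^2 / (4s)^2 = pi/4.
c_square = circularity_ratio(np.array([0.0, 1.0, 1.0, 0.0]),
                             np.array([0.0, 0.0, 1.0, 1.0]))
```

Applied to a delineated capsule contour, a stable ratio across fractions is one way to quantify that the capsule shape did not change.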

  6. Gamma-Ray Spectroscopy at TRIUMF-ISAC: the New Frontier of Radioactive Ion Beam Research

    Science.gov (United States)

    Ball, G. C.; Andreoiu, C.; Austin, R. A. E.; Bandyopadhyay, D.; Becker, J. A.; Bricault, P.; Brown, N.; Chan, S.; Churchman, R.; Colosimo, S.; Coombes, H.; Cross, D.; Demand, G.; Drake, T. E.; Dombsky, M.; Ettenauer, S.; Finlay, P.; Furse, D.; Garnsworthy, A.; Garrett, P. E.; Green, K. L.; Grinyer, G. F.; Hyland, B.; Hackman, G.; Kanungo, R.; Kulp, W. D.; Lassen, J.; Leach, K. G.; Leslie, J. R.; Mattoon, C.; Melconian, D.; Morton, A. C.; Pearson, C. J.; Phillips, A. A.; Rand, E.; Sarazin, F.; Svensson, C. E.; Sumithrarachchi, S.; Schumaker, M. A.; Triambak, S.; Waddington, J. C.; Walker, P. M.; Williams, S. J.; Wood, J. L.; Wong, J.; Zganjar, E. F.

    2009-03-01

High-resolution gamma-ray spectroscopy is essential to fully exploit the unique scientific opportunities at next-generation radioactive ion beam facilities such as the TRIUMF Isotope Separator and Accelerator (ISAC). At ISAC, the 8π spectrometer and its associated auxiliary detectors are optimized for β-decay studies, while TIGRESS, an array of segmented clover HPGe detectors, has been designed for studies with accelerated beams. This paper gives a brief overview of these facilities and also presents recent examples of the diverse experimental program carried out at the 8π spectrometer.

  7. A new high background radiation area in the Geothermal region of Eastern Ghats Mobile Belt (EGMB) of Orissa, India

    International Nuclear Information System (INIS)

    Baranwal, V.C.; Sharma, S.P.; Sengupta, D.; Sandilya, M.K.; Bhaumik, B.K.; Guin, R.; Saha, S.K.

    2006-01-01

A high natural radiation zone is investigated for the first time in a geothermal region of the Eastern Ghats Mobile Belt (EGMB) of Orissa state in India. The surrounding geothermal region was surveyed using a portable pulsed Geiger-Muller counter. On the basis of the GM counter findings, an area was marked as a high radiation zone. Soil and rock samples collected from the high radiation zone were analyzed by γ-ray spectrometry (GRS) using a NaI(Tl) detector. The radioactivity is found to be contributed mainly by thorium, whose concentration is very high compared to its normal abundance in crustal rocks. The concentrations of 238 U and 40 K are also high compared to their normal abundance in crustal rocks, but to a lesser degree than thorium. The average concentrations of 238 U (i.e. U(β-γ)), 232 Th and 40 K are found to be 33 ppm, 459 ppm and 3%, respectively, in soils and 312 ppm, 1723 ppm and 5%, respectively, in the granitic rocks. The maximum concentrations of 238 U, 232 Th and 40 K are found to be 95 ppm, 1194 ppm and 4%, respectively, in soils and 1434 ppm, 10,590 ppm and 8%, respectively, in the granitic rocks. A radioactive element emits photons of various energies along its decay chain. The high-energy lines are used to estimate the concentrations of 238 U, 232 Th and 40 K with a NaI(Tl) detector, whereas the low-energy lines are used for the same purpose with an HPGe detector. Some of the rock samples (eight in number) were also analyzed using an HPGe detector to study the behavior of the low-energy lines emitted in the decay series of uranium and thorium. The absorbed gamma dose rate in air and the external annual dose rate of the high radiation zone are calculated to be 2431 nGy/h and 3.0 mSv/y, respectively, approximately 10 times greater than the dose rates obtained outside the high radiation zone. The high concentration of uranium and thorium may be one of the possible heat sources, together with the normal geothermal gradient, for the hot springs
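The step from elemental concentrations to an absorbed dose rate in air can be sketched with the widely quoted UNSCEAR-style conversion coefficients. The specific-activity equivalents and dose coefficients below are standard assumed values of this sketch, not numbers taken from the paper, so the result only indicates the order of magnitude, dominated by thorium as in the abstract.

```python
# Approximate specific-activity equivalents (assumed values):
BQ_PER_KG_PER_PPM_U = 12.35    # 238U activity (Bq/kg) for 1 ppm U
BQ_PER_KG_PER_PPM_TH = 4.06    # 232Th activity (Bq/kg) for 1 ppm Th
BQ_PER_KG_PER_PCT_K = 313.0    # 40K activity (Bq/kg) for 1 % natural K

# Dose-rate coefficients in nGy/h per Bq/kg (assumed UNSCEAR-style values):
D_U, D_TH, D_K = 0.462, 0.604, 0.0417

def dose_rate_nGy_per_h(u_ppm, th_ppm, k_pct):
    """Absorbed gamma dose rate in air for given soil concentrations."""
    return (D_U * BQ_PER_KG_PER_PPM_U * u_ppm
            + D_TH * BQ_PER_KG_PER_PPM_TH * th_ppm
            + D_K * BQ_PER_KG_PER_PCT_K * k_pct)

# Average soil values reported in the abstract: 33 ppm U, 459 ppm Th, 3% K.
d_soil = dose_rate_nGy_per_h(33.0, 459.0, 3.0)
```

For the average soil values this gives a dose rate in the thousands of nGy/h, the same order as the 2431 nGy/h reported for the zone as a whole, with the thorium term contributing the bulk of it.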

  8. Parallel fuzzy connected image segmentation on GPU

    OpenAIRE

    Zhuge, Ying; Cao, Yong; Udupa, Jayaram K.; Miller, Robert W.

    2011-01-01

    Purpose: Image segmentation techniques using fuzzy connectedness (FC) principles have shown their effectiveness in segmenting a variety of objects in several large applications. However, one challenge in these algorithms has been their excessive computational requirements when processing large image datasets. Nowadays, commodity graphics hardware provides a highly parallel computing environment. In this paper, the authors present a parallel fuzzy connected image segmentation algorithm impleme...

  9. Highly accurate determination of relative gamma-ray detection efficiency for Ge detector and its application

    International Nuclear Information System (INIS)

    Miyahara, H.; Mori, C.; Fleming, R.F.; Dewaraja, Y.K.

    1997-01-01

    When quantitative measurements of γ-rays using High-Purity Germanium (HPGe) detectors are made for a variety of applications, accurate knowledge of the γ-ray detection efficiency is required. The emission rates of γ-rays from sources can be determined quickly when the absolute peak efficiency is calibrated. On the other hand, relative peak efficiencies can be used to determine intensity ratios for plural samples and for comparison to a standard source. Thus, both absolute and relative detection efficiencies are important in the use of γ-ray detectors. The objective of this work is to determine the relative γ-ray peak detection efficiency for an HPGe detector with an uncertainty approaching 0.1%. We used nuclides which emit at least two γ-rays with energies from 700 to 2400 keV for which the relative emission probabilities are known with uncertainties much smaller than 0.1%. The relative peak detection efficiencies were calculated from measurements of the nuclides ⁴⁶Sc, ⁴⁸Sc, ⁶⁰Co and ⁹⁴Nb, each emitting two γ-rays with emission probabilities of almost unity. It is important that the various corrections for the emission probabilities, the cascade summing effect and the self-absorption are small. A third-order polynomial in the logarithms of both energy and efficiency was fitted to the data, and the peak efficiency predicted at a given energy from the covariance matrix showed an uncertainty of less than 0.5%, except near 700 keV. As an application, the emission probabilities of the 1037.5 and 1212.9 keV γ-rays of ⁴⁸Sc were determined using the highly precise relative peak efficiency function: 0.9777 ± 0.00079 and 0.02345 ± 0.00017, respectively. The sum of these probabilities is close to unity within the uncertainty, which indicates that the results are reliable and that the accuracy has been improved considerably.
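
The fitting procedure described above (a third-order polynomial in log energy vs. log efficiency) can be sketched as follows. The data points are synthetic power-law values standing in for the measured ⁴⁶Sc/⁴⁸Sc/⁶⁰Co/⁹⁴Nb efficiencies, which the record does not tabulate:

```python
import numpy as np

# Synthetic relative-efficiency points following a plausible power law;
# the real measured values are not given in the record above.
energy = np.array([702.6, 871.1, 1037.5, 1173.2, 1212.9, 1332.5, 2312.7])
eff = (energy / 700.0) ** -0.9

# Third-order polynomial fit in log(energy) vs log(efficiency),
# as described in the record.
coeffs = np.polyfit(np.log(energy), np.log(eff), deg=3)

def rel_efficiency(e_kev):
    """Relative peak efficiency interpolated from the fitted curve."""
    return np.exp(np.polyval(coeffs, np.log(e_kev)))
```

An intensity ratio between two γ-ray lines then follows from the count-rate ratio divided by the ratio of `rel_efficiency` at the two energies.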

  10. A computational atlas of the hippocampal formation using ex vivo, ultra-high resolution MRI: Application to adaptive segmentation of in vivo MRI.

    Science.gov (United States)

    Iglesias, Juan Eugenio; Augustinack, Jean C; Nguyen, Khoa; Player, Christopher M; Player, Allison; Wright, Michelle; Roy, Nicole; Frosch, Matthew P; McKee, Ann C; Wald, Lawrence L; Fischl, Bruce; Van Leemput, Koen

    2015-07-15

    Automated analysis of MRI data of the subregions of the hippocampus requires computational atlases built at a higher resolution than those that are typically used in current neuroimaging studies. Here we describe the construction of a statistical atlas of the hippocampal formation at the subregion level using ultra-high resolution, ex vivo MRI. Fifteen autopsy samples were scanned at 0.13 mm isotropic resolution (on average) using customized hardware. The images were manually segmented into 13 different hippocampal substructures using a protocol specifically designed for this study; precise delineations were made possible by the extraordinary resolution of the scans. In addition to the subregions, manual annotations for neighboring structures (e.g., amygdala, cortex) were obtained from a separate dataset of in vivo, T1-weighted MRI scans of the whole brain (1 mm resolution). The manual labels from the in vivo and ex vivo data were combined into a single computational atlas of the hippocampal formation with a novel atlas building algorithm based on Bayesian inference. The resulting atlas can be used to automatically segment the hippocampal subregions in structural MRI images, using an algorithm that can analyze multimodal data and adapt to variations in MRI contrast due to differences in acquisition hardware or pulse sequences. The applicability of the atlas, which we are releasing as part of FreeSurfer (version 6.0), is demonstrated with experiments on three different publicly available datasets with different types of MRI contrast. The results show that the atlas and companion segmentation method: 1) can segment T1 and T2 images, as well as their combination, 2) replicate findings on mild cognitive impairment based on high-resolution T2 data, and 3) can discriminate between Alzheimer's disease subjects and elderly controls with 88% accuracy in standard resolution (1 mm) T1 data, significantly outperforming the atlas in FreeSurfer version 5.3 (86% accuracy) and

  11. Market segmentation: Venezuelan ADRs

    Directory of Open Access Journals (Sweden)

    Urbi Garay

    2012-12-01

    Full Text Available The foreign exchange controls imposed by Venezuela in 2003 constitute a natural experiment that allows researchers to observe the effects of exchange controls on stock market segmentation. This paper provides empirical evidence that although the Venezuelan capital market as a whole was highly segmented before the controls were imposed, the shares in the firm CANTV were, through their American Depositary Receipts (ADRs), partially integrated with the global market. Following the imposition of the exchange controls this integration was lost. The paper also documents the spectacular and apparently contradictory rise experienced by the Caracas Stock Exchange during the serious economic crisis of 2003. It is argued that, as happened in Argentina in 2002, the rise in share prices occurred because the depreciation of the bolívar in the parallel currency market increased the local price of the stocks that had associated ADRs, which were negotiated in dollars.

  12. Analysis of the dead layer of an ultra-pure germanium detector with the Monte Carlo code SWORD-GEANT; Analisis del dead layer de un detector de germanio ultrapuro con el codigo de Monte Carlo SWORDS-GEANT

    Energy Technology Data Exchange (ETDEWEB)

    Gallardo, S.; Querol, A.; Ortiz, J.; Rodenas, J.; Verdu, G.

    2014-07-01

    In this paper the use of the Monte Carlo code SWORD-GEANT is proposed to simulate an ultra-pure High-Purity Germanium (HPGe) detector, specifically an ORTEC GMX40P4 with coaxial geometry. (Author)

  13. Evaluation of the Anterior Segment Angle-to-Angle Scan of Cirrus High-Definition Optical Coherence Tomography and Comparison With Gonioscopy and With the Visante OCT.

    Science.gov (United States)

    Tun, Tin A; Baskaran, Mani; Tan, Shayne S; Perera, Shamira A; Aung, Tin; Husain, Rahat

    2017-01-01

    To evaluate the diagnostic performance of the anterior segment angle-to-angle scan of the Cirrus high-definition optical coherence tomography (HD-OCT) in detecting eyes with closed angles. All subjects underwent dark-room gonioscopy by an ophthalmologist. A technician performed anterior segment imaging with Cirrus (n = 202) and Visante OCT (n = 85) under dark-room conditions. All eyes were categorized by two masked graders according to the number of closed quadrants. Each quadrant of the anterior chamber angle was categorized as closed if the posterior trabecular meshwork could not be seen on gonioscopy or if there was any irido-corneal contact anterior to the scleral spur in Cirrus and Visante images. An eye was graded as having a closed angle if two or more quadrants were closed. Agreement and area-under-the-curve (AUC) analyses were performed. There were 50 (24.8%) eyes with closed angles. The agreement of closed-angle diagnosis (by eye) between Cirrus HD-OCT and gonioscopy (k = 0.59; 95% confidence interval (CI) 0.45-0.72; AC1 = 0.76) and between Cirrus and Visante OCT (k = 0.65; 95% CI 0.48-0.82; AC1 = 0.77) was moderate. The AUC for diagnosing an eye with a gonioscopic closed angle by Cirrus HD-OCT was good (AUC = 0.86; sensitivity = 83.3%; specificity = 77.8%). The diagnostic performance of Cirrus HD-OCT in detecting eyes with closed angles was similar to that of Visante (AUC 0.87 vs. 0.9, respectively; P = 0.51). The anterior segment angle-to-angle scans of Cirrus HD-OCT demonstrated similar diagnostic performance to Visante in detecting gonioscopic closed angles. The agreement between Cirrus and gonioscopy for detecting eyes with closed angles was moderate.
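
The eye-level agreement figures quoted above are Cohen's kappa values. A minimal computation, using a hypothetical 2×2 closed/open-angle table rather than the study's (untabulated) counts, looks like this:

```python
import numpy as np

def cohens_kappa(table):
    """Cohen's kappa for a square agreement table (rows: method A, cols: method B)."""
    table = np.asarray(table, dtype=float)
    n = table.sum()
    p_observed = np.trace(table) / n                       # raw agreement
    p_expected = (table.sum(axis=1) @ table.sum(axis=0)) / n**2  # chance agreement
    return (p_observed - p_expected) / (1.0 - p_expected)

# Hypothetical counts for 202 eyes (closed/open on each method);
# illustrative only, not the study's data.
agreement = [[40, 10],
             [12, 140]]
```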

  14. Effects of high-frequency stimulation of the internal pallidal segment on neuronal activity in the thalamus in parkinsonian monkeys

    Science.gov (United States)

    Kammermeier, Stefan; Pittard, Damien; Hamada, Ikuma

    2016-01-01

    Deep brain stimulation of the internal globus pallidus (GPi) is a major treatment for advanced Parkinson's disease. The effects of this intervention on electrical activity patterns in targets of GPi output, specifically in the thalamus, are poorly understood. The experiments described here examined these effects using electrophysiological recordings in two Rhesus monkeys rendered moderately parkinsonian through treatment with 1-methyl-4-phenyl-1,2,3,6-tetrahydropyridine (MPTP), after sampling control data in the same animals. Analysis of spontaneous spiking activity of neurons in the basal ganglia-receiving areas of the ventral thalamus showed that MPTP-induced parkinsonism is associated with a reduction of firing rates in data segments that contained neither bursts nor decelerations, and with increased burst firing. Spectral analyses revealed an increase of power in the 3- to 13-Hz band and a reduction in the γ-range in the spiking activity of these neurons. Electrical stimulation of the ventrolateral motor territory of GPi with macroelectrodes, mimicking deep brain stimulation in parkinsonian patients (bipolar electrodes, 0.5 mm intercontact distance, biphasic stimuli, 120 Hz, 100 μs/phase, 200 μA), had antiparkinsonian effects. The stimulation markedly reduced oscillations in thalamic firing in the 13- to 30-Hz range and uncoupled the spiking activity of recorded neurons from simultaneously recorded local field potential (LFP) activity. These results confirm that oscillatory and nonoscillatory characteristics of spontaneous activity in the basal ganglia-receiving ventral thalamus are altered in MPTP-induced parkinsonism. Electrical stimulation of GPi did not entrain thalamic activity but changed oscillatory activity in the ventral thalamus and altered the relationship between spikes and simultaneously recorded LFPs. PMID:27683881

  15. A new high-spin isomer in {sup 195}Bi

    Energy Technology Data Exchange (ETDEWEB)

    Roy, T.; Mukherjee, G.; Rana, T.K.; Bhattacharya, Soumik; Asgar, Md.A.; Bhattacharya, C.; Bhattacharya, S.; Bhattacharyya, S.; Pai, H. [Variable Energy Cyclotron Centre, Kolkata (India); Madhavan, N.; Bala, I.; Gehlot, J.; Gurjar, R.K.; Jhingan, A.; Kumar, R.; Muralithar, S.; Nath, S.; Singh, R.P.; Varughese, T. [Inter University Acclerator Centre, New Delhi (India); Basu, K.; Bhattacharjee, S.S.; Ghugre, S.S.; Raut, R.; Sinha, A.K. [UGC-DAE-CSR Kolkata Centre, Kolkata (India); Palit, R. [Tata Institute of Fundamental Research, Department of Nuclear and Atomic Physics, Mumbai (India)

    2015-11-15

    A new high-spin isomer has been identified in {sup 195}Bi at the focal plane of the HYbrid Recoil mass Analyser (HYRA) used in the gas-filled mode. The fusion-evaporation reactions {sup 169}Tm({sup 30}Si, xn){sup 193,195}Bi were used at beam energies of 168 and 146 MeV for the 6n and 4n channels, respectively. The evaporation residues, separated from the fission fragments, and their decays were detected at the focal plane of HYRA using MWPC, Si-Pad and clover HPGe detectors. The half-life of the new isomer in {sup 195}Bi has been measured to be 1.6(1) μs. The configuration of the new isomer has been proposed and compared with the other isomers in this region. Total Routhian Surface (TRS) calculations for the three-quasiparticle configurations corresponding to the new isomer suggest an oblate deformation for this isomeric state. The same calculations for different configurations in {sup 195}Bi and for the even-even {sup 194}Pb core indicate that the proton i{sub 13/2} orbital has a large shape-driving effect towards oblate shape in these nuclei. (orig.)

  16. Correction for hole trapping in AGATA detectors using pulse shape analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bruyneel, B. [CEA Saclay, DSM/IRFU/SPhN, Gif-sur-Yvette Cedex (France); Universitaet zu Koeln, Institut fuer Kernphysik, Koeln (Germany); Birkenbach, B.; Eberth, J.; Hess, H.; Pascovici, Gh.; Reiter, P.; Wiens, A. [Universitaet zu Koeln, Institut fuer Kernphysik, Koeln (Germany); Bazzacco, D.; Farnea, E.; Michelagnoli, C.; Recchia, F. [INFN, Sezione di Padova, Padova (Italy); Collaboration: for the AGATA Collaboration

    2013-05-15

    Data from the highly segmented High-Purity Germanium (HPGe) detectors of the AGATA spectrometer show that segments are more sensitive to neutron damage than the central core contact. Calculations on the collection efficiency of charge carriers inside the HPGe detector were performed in order to understand this phenomenon. The trapping sensitivity, an expression based on the collection efficiencies for electrons and holes, is put forward to quantify the effect of charge carrier trapping. The sensitivity is evaluated for each position in the detector volume with respect to the different electrodes and the collected charge carrier type. Using the position information obtained by pulse shape analysis from the position-sensitive AGATA detectors, it is possible to correct for the energy deficit employing detector specific sensitivity values. We report on the successful correction of the energy peaks from heavily neutron-damaged AGATA detectors for core and segment electrode signals. The original energy resolution can optimally be recovered up to a certain quantifiable limit of degradation due to statistical fluctuations caused by trapping effects. (orig.)
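
The correction idea described above can be sketched minimally: the measured energy is scaled by a collection efficiency built from detector-specific trapping sensitivities at the PSA-derived interaction points. The function names, the sensitivity map and the linear loss model are assumptions for illustration, not the AGATA implementation:

```python
def corrected_energy(e_measured, interaction_points, sensitivity, scale):
    """
    Correct a core/segment energy for charge-carrier trapping (sketch).

    e_measured         : recorded energy (keV)
    interaction_points : list of (position, energy_fraction) pairs from PSA
    sensitivity        : callable mapping a position to a trapping
                         sensitivity value (detector specific)
    scale              : per-detector neutron-damage scaling factor
    """
    # Energy-weighted trapping loss summed over all interaction points.
    loss = sum(frac * scale * sensitivity(pos)
               for pos, frac in interaction_points)
    return e_measured / (1.0 - loss)
```

For a single interaction with 1% trapping loss, a 990 keV measured energy is restored to 1000 keV; real sensitivity values would come from the per-position calculations the paper describes.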

  17. Detection and Segmentation of Vine Canopy in Ultra-High Spatial Resolution RGB Imagery Obtained from Unmanned Aerial Vehicle (UAV): A Case Study in a Commercial Vineyard

    Directory of Open Access Journals (Sweden)

    Carlos Poblete-Echeverría

    2017-03-01

    Full Text Available The use of Unmanned Aerial Vehicles (UAVs) in viticulture permits the capture of aerial Red-Green-Blue (RGB) images with an ultra-high spatial resolution. Recent studies have demonstrated that RGB images can be used to monitor spatial variability of vine biophysical parameters. However, for estimating these parameters, accurate and automated segmentation methods are required to extract relevant information from RGB images. Manual segmentation of aerial images is a laborious and time-consuming process. Traditional classification methods have shown satisfactory results in the segmentation of RGB images for diverse applications and surfaces; however, in the case of commercial vineyards, it is necessary to consider some particularities inherent to canopy size in the vertical trellis systems (VSP), such as the shadow effect and different soil conditions in inter-rows (mixed information of soil and weeds). Therefore, the objective of this study was to compare the performance of four classification methods (K-means, Artificial Neural Networks (ANN), Random Forest (RForest) and Spectral Indices (SI)) to detect canopy in a vineyard trained on VSP. Six flights were carried out from post-flowering to harvest in a commercial vineyard cv. Carménère using a low-cost UAV equipped with a conventional RGB camera. The results show that the ANN and the simple SI method, complemented with the Otsu method for thresholding, presented the best performance for the detection of the vine canopy, with high overall accuracy values for all study days. Spectral indices presented the best performance in the detection of the Plant class (vine canopy) with an overall accuracy of around 0.99. However, considering the performance pixel by pixel, the spectral indices are not able to discriminate between the Soil and Shadow classes. The best performance in the classification of three classes (Plant, Soil, and Shadow) of vineyard RGB images was obtained when the SI values were used as input data in trained
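
The spectral-index-plus-Otsu pipeline mentioned above can be sketched with a plain-NumPy Otsu implementation. The excess-green index and the random test image are illustrative stand-ins for the study's UAV imagery and index choice:

```python
import numpy as np

def otsu_threshold(values, bins=256):
    """Otsu's method: pick the threshold maximizing between-class variance."""
    hist, edges = np.histogram(values, bins=bins)
    p = hist.astype(float) / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(p)             # class-0 (below threshold) weight
    mu = np.cumsum(p * centers)   # cumulative mean
    mu_total = mu[-1]
    valid = (w0 > 0) & (w0 < 1)
    between = np.zeros_like(w0)
    between[valid] = (mu_total * w0[valid] - mu[valid]) ** 2 / (
        w0[valid] * (1.0 - w0[valid]))
    return centers[np.argmax(between)]

# Hypothetical RGB patch; a real input would be the UAV orthomosaic.
rgb = np.random.default_rng(0).random((64, 64, 3))
exg = 2 * rgb[..., 1] - rgb[..., 0] - rgb[..., 2]   # excess-green index
canopy_mask = exg > otsu_threshold(exg.ravel())      # True where "vine canopy"
```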

  18. Magnitude and consequences of undertreatment of high-risk patients with non-ST segment elevation acute coronary syndromes: insights from the DESCARTES Registry.

    Science.gov (United States)

    Heras, M; Bueno, H; Bardají, A; Fernández-Ortiz, A; Martí, H; Marrugat, J

    2006-11-01

    To analyse the intensity of treatment of high-risk patients with non-ST elevation acute coronary syndromes (NSTEACS) included in the DESCARTES (Descripción del Estado de los Sindromes Coronarios Agudos en un Registro Temporal Español) registry. Patients with NSTEACS (n = 1877) admitted to 45 randomly selected Spanish hospitals in April and May 2002 were studied. Patients with ST segment depression and troponin rise were considered high risk (n = 478) and were compared with non-high-risk patients (n = 1399). 46.9% of high-risk patients versus 39.5% of non-high-risk patients underwent angiography (p = 0.005), 23.2% versus 18.8% (p = 0.038) underwent percutaneous revascularisation, and 24.9% versus 7.4% (p [...]) [...] ≥ 4, 2-3 and [...] ≥ 4 (OR 2.87, 95% CI 1.27 to 6.52, p = 0.012). Class I recommended treatments were underused in high-risk patients in the DESCARTES registry. This undertreatment was an independent predictor of death of patients with an acute coronary syndrome.

  19. Comparison of experimental and theoretical efficiency of HPGe X-ray detector

    International Nuclear Information System (INIS)

    Mohanty, B.P.; Balouria, P.; Garg, M.L.; Nandi, T.K.; Mittal, V.K.; Govil, I.M.

    2008-01-01

    Low-energy high-purity germanium (HPGe) detectors are being increasingly used for the quantitative estimation of elements using X-ray spectrometric techniques. The software packages used for quantitative estimation normally evaluate a model-based detector efficiency using the manufacturer-supplied detector physical parameters. The present work shows that the manufacturer-supplied parameters for low-energy HPGe detectors need to be verified by comparing the model-based efficiency with the experimental one. This is particularly crucial for detectors with ion-implanted P-type contacts

  20. Transulnar sheathless percutaneous coronary intervention during bivalirudin infusion in high-risk elderly female with non-ST segment elevation myocardial infarction

    Directory of Open Access Journals (Sweden)

    Marina Mustilli

    2012-06-01

    Full Text Available Due to the ageing population and increased life expectancy, elderly patients are increasingly referred for percutaneous coronary intervention (PCI) during acute coronary syndromes (ACS). Bleeding complications are not infrequent during ACS, occurring in 2-5% of patients, with prognostic and economic consequences. In particular, periprocedural bleeding and vascular complications are associated with worse clinical outcome, prolonged hospital stay and increased short- and long-term mortality, especially in elderly patients with acute coronary syndromes. We report the case of an 83-year-old female referred to our hospital because of non-ST segment elevation myocardial infarction, with high bleeding risk and an unsuitable radial artery, undergoing transulnar sheathless PCI during bivalirudin infusion. The clinical, technical, pharmacological and prognostic implications are discussed.

  1. Optimal design of a double-sided linear motor with a multi-segmented trapezoidal magnet array for a high precision positioning system

    International Nuclear Information System (INIS)

    Lee, Moon G.; Gweon, Dae-Gab

    2004-01-01

    A comparative analysis is performed for linear motors adopting conventional and multi-segmented trapezoidal (MST) magnet arrays, respectively, for a high-precision positioning system. The proposed MST magnet array is a modified version of a Halbach magnet array. The MST array has trapezoidal magnets with variable shape and dimensions, while the Halbach magnet array generally has rectangular magnets with identical dimensions. We propose a new model that can describe the magnetic field resulting from the complex-shaped magnets. The model can be applied to both MST and conventional magnet arrays. Using the model, a design optimization of the two types of linear motors is performed and compared. The magnet array with trapezoidal magnets can produce more force than one with rectangular magnets when they are arrayed in a linear motor where there is a yoke with high permeability. After the optimization and comparison, we conclude that the linear motor with the MST magnet array can generate more actuating force per volume than the motor with the conventional array. In order to satisfy the requirements of next generation systems, such as high resolution, high speed, and long stroke, the use of a linear motor with an MST array as an actuator in a high precision positioning system is recommended on the basis of the results obtained here

  2. Segmented trapped vortex cavity

    Science.gov (United States)

    Grammel, Jr., Leonard Paul (Inventor); Pennekamp, David Lance (Inventor); Winslow, Jr., Ralph Henry (Inventor)

    2010-01-01

    An annular trapped vortex cavity assembly segment includes a cavity forward wall, a cavity aft wall, and a cavity radially outer wall there between defining a cavity segment therein. A cavity opening extends between the forward and aft walls at a radially inner end of the assembly segment. Radially spaced apart pluralities of air injection first and second holes extend through the forward and aft walls respectively. The segment may include first and second expansion joint features at distal first and second ends respectively of the segment. The segment may include a forward subcomponent including the cavity forward wall attached to an aft subcomponent including the cavity aft wall. The forward and aft subcomponents include forward and aft portions of the cavity radially outer wall respectively. A ring of the segments may be circumferentially disposed about an axis to form an annular segmented vortex cavity assembly.

  3. Pavement management segment consolidation

    Science.gov (United States)

    1998-01-01

    Dividing roads into "homogeneous" segments has been a major problem for all areas of highway engineering. SDDOT uses Deighton Associates Limited software, dTIMS, to analyze life-cycle costs for various rehabilitation strategies on each segment of roa...

  4. Speaker segmentation and clustering

    OpenAIRE

    Kotti, M; Moschou, V; Kotropoulos, C

    2008-01-01

    This survey focuses on two challenging speech processing topics, namely: speaker segmentation and speaker clustering. Speaker segmentation aims at finding speaker change points in an audio stream, whereas speaker clustering aims at grouping speech segments based on speaker characteristics. Model-based, metric-based, and hybrid speaker segmentation algorithms are reviewed. Concerning speaker...

  5. Spinal segmental dysgenesis

    Directory of Open Access Journals (Sweden)

    N Mahomed

    2009-06-01

    Full Text Available Spinal segmental dysgenesis is a rare congenital spinal abnormality, seen in neonates and infants, in which a segment of the spine and spinal cord fails to develop normally. The condition is segmental, with normal vertebrae above and below the malformation. It is commonly associated with various abnormalities that affect the heart, the genitourinary and gastrointestinal tracts, and the skeletal system. We report two cases of spinal segmental dysgenesis and the associated abnormalities.

  6. Automatic Melody Segmentation

    NARCIS (Netherlands)

    Rodríguez López, Marcelo

    2016-01-01

    The work presented in this dissertation investigates music segmentation. In the field of Musicology, segmentation refers to a score analysis technique, whereby notated pieces or passages of these pieces are divided into “units” referred to as sections, periods, phrases, and so on. Segmentation

  7. Prognosis and high-risk complication identification in unselected patients with ST-segment elevation myocardial infarction treated with primary percutaneous coronary intervention

    DEFF Research Database (Denmark)

    Andersson, Hedvig; Ripa, Maria Sejersten; Clemmensen, Peter

    2010-01-01

    The aim of this study was to evaluate treatment with primary percutaneous coronary intervention (PCI) in unselected patients with ST-segment elevation myocardial infarction (STEMI).

  8. Semantic segmentation of forest stands of pure species combining airborne lidar data and very high resolution multispectral imagery

    Science.gov (United States)

    Dechesne, Clément; Mallet, Clément; Le Bris, Arnaud; Gouet-Brunet, Valérie

    2017-04-01

    Forest stands are the basic units for forest inventory and mapping. Stands are defined as large forested areas (e.g., ⩾ 2 ha) of homogeneous tree species composition and age. Their accurate delineation is usually performed by human operators through visual analysis of very high resolution (VHR) infra-red images. This task is tedious and highly time consuming, and should be automated for scalability and efficient updating purposes. In this paper, a method based on the fusion of airborne lidar data and VHR multispectral images is proposed for the automatic delineation of forest stands containing one dominant species (purity above 75%). This is the key preliminary task for forest land-cover database updating. The multispectral images give information about the tree species, whereas the 3D lidar point clouds provide geometric information on the trees and allow their individual extraction. Multi-modal features are computed at both pixel and object levels: the objects are individual trees extracted from the lidar data. A supervised classification is then performed at the object level in order to coarsely discriminate the existing tree species in each area of interest. The classification results are further processed to obtain homogeneous areas with smooth borders by employing an energy minimization framework, in which additional constraints are joined to form the energy function. The experimental results show that the proposed method provides very satisfactory results both in terms of stand labeling and delineation (overall accuracy ranges between 84% and 99%).

  9. A highly variable segment of human subterminal 16p reveals a history of population growth for modern humans outside Africa

    Science.gov (United States)

    Alonso, Santos; Armour, John A. L.

    2001-01-01

    We have sequenced a highly polymorphic subterminal noncoding region from human chromosome 16p13.3, flanking the 5′ end of the hypervariable minisatellite MS205, in 100 chromosomes sampled from different African and Euroasiatic populations. Coalescence analysis indicates that the time to the most recent common ancestor (approximately 1 million years) predates the appearance of anatomically modern human forms. The root of the network describing this variability lies in Africa. African populations show a greater level of diversity and deeper branches. Most Euroasiatic variability seems to have been generated after a recent out-of-Africa range expansion. A history of population growth is the most likely scenario for the Euroasiatic populations. This pattern of nuclear variability can be reconciled with inferences based on mitochondrial DNA. PMID:11158547

  10. Brain Tumor Image Segmentation in MRI Image

    Science.gov (United States)

    Peni Agustin Tjahyaningtijas, Hapsari

    2018-04-01

    Brain tumor segmentation plays an important role in medical image processing. Treatment of patients with brain tumors is highly dependent on early detection of these tumors, and early detection will improve a patient's life chances. Diagnosis of brain tumors by experts usually relies on manual segmentation, which is difficult and time consuming; hence automatic segmentation is necessary. Nowadays automatic segmentation is very popular and can be a solution to the problem of brain tumor segmentation with better performance. The purpose of this paper is to provide a review of MRI-based brain tumor segmentation methods. There are a number of existing review papers focusing on traditional methods for MRI-based brain tumor image segmentation. In this paper, we focus on the recent trend of automatic segmentation in this field. First, an introduction to brain tumors and methods for brain tumor segmentation is given. Then, the state-of-the-art algorithms, with a focus on the recent trend of fully automatic segmentation, are discussed. Finally, an assessment of the current state is presented and future developments to standardize MRI-based brain tumor segmentation methods into daily clinical routine are addressed.

  11. High-precision half-life measurements for the superallowed Fermi β+ emitter 14O

    Science.gov (United States)

    Laffoley, A. T.; Svensson, C. E.; Andreoiu, C.; Austin, R. A. E.; Ball, G. C.; Blank, B.; Bouzomita, H.; Cross, D. S.; Diaz Varela, A.; Dunlop, R.; Finlay, P.; Garnsworthy, A. B.; Garrett, P. E.; Giovinazzo, J.; Grinyer, G. F.; Hackman, G.; Hadinia, B.; Jamieson, D. S.; Ketelhut, S.; Leach, K. G.; Leslie, J. R.; Tardiff, E.; Thomas, J. C.; Unsworth, C.

    2013-07-01

    The half-life of the superallowed Fermi β+ emitter 14O has been determined via simultaneous direct β and γ counting experiments at TRIUMF's Isotope Separator and Accelerator (ISAC) facility. A γ-ray counting measurement was performed by detecting the 2312.6-keV γ rays emitted from an excited state of the daughter 14N following the implantation of samples at the center of the 8π γ-ray spectrometer, a spherical array of 20 high-purity germanium (HPGe) detectors. A simultaneous β counting experiment was performed using a fast plastic scintillator positioned behind the implantation site with a solid angle coverage of ˜20%. The results, T1/2(β) = 70.610 ± 0.030 s and T1/2(γ) = 70.632 ± 0.094 s, form a consistent set and, together with eight previous measurements, establish a new average for the 14O half-life of T1/2 = 70.619 ± 0.011 s with a reduced χ2 of 0.99.
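
The averaging quoted at the end (an uncertainty-weighted mean checked with a reduced χ²) can be reproduced for the two values reported in this work; the published world average additionally folds in the eight earlier measurements, which are not listed in the record:

```python
import numpy as np

# Half-life values (s) and 1-sigma uncertainties from the two
# simultaneous measurements quoted above (beta and gamma counting).
values = np.array([70.610, 70.632])
sigmas = np.array([0.030, 0.094])

weights = 1.0 / sigmas**2
mean = np.sum(weights * values) / np.sum(weights)    # weighted mean
mean_sigma = 1.0 / np.sqrt(np.sum(weights))          # its uncertainty

# Reduced chi-squared of the set about its weighted mean
# (N - 1 degrees of freedom); values near or below 1 indicate consistency.
chi2_red = np.sum(weights * (values - mean) ** 2) / (len(values) - 1)
```

With these two inputs the weighted mean lands near 70.612 s, dominated by the more precise β-counting result, and the small reduced χ² confirms the "consistent set" statement.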

  12. High-precision branching-ratio measurement for the superallowed β+ emitter 74Rb

    Science.gov (United States)

    Dunlop, R.; Ball, G. C.; Leslie, J. R.; Svensson, C. E.; Towner, I. S.; Andreoiu, C.; Chagnon-Lessard, S.; Chester, A.; Cross, D. S.; Finlay, P.; Garnsworthy, A. B.; Garrett, P. E.; Glister, J.; Hackman, G.; Hadinia, B.; Leach, K. G.; Rand, E. T.; Starosta, K.; Tardiff, E. R.; Triambak, S.; Williams, S. J.; Wong, J.; Yates, S. W.; Zganjar, E. F.

    2013-10-01

    A high-precision branching-ratio measurement for the superallowed β+ decay of 74Rb was performed at the TRIUMF Isotope Separator and Accelerator (ISAC) radioactive ion-beam facility. The scintillating electron-positron tagging array (SCEPTAR), composed of 10 thin plastic scintillators, was used to detect the emitted β particles; the 8π spectrometer, an array of 20 Compton-suppressed HPGe detectors, was used for detecting γ rays that were emitted following Gamow-Teller and nonanalog Fermi β+ decays of 74Rb; and the Pentagonal Array of Conversion Electron Spectrometers (PACES), an array of 5 Si(Li) detectors, was employed for measuring β-delayed conversion electrons. Twenty-three excited states were identified in 74Kr following 8.241(4)×10⁸ detected 74Rb β decays. A total of 58 γ-ray and electron transitions were placed in the decay scheme, allowing the superallowed branching ratio to be determined as B0 = 99.545(31)%. Combined with previous half-life and Q-value measurements, the superallowed branching ratio measured in this work leads to a superallowed ft value of 3082.8(65) s. Comparisons between this superallowed ft value and the world-average corrected F̄t value, as well as the nonanalog Fermi branching ratios determined in this work, provide guidance for theoretical models of the isospin-symmetry-breaking corrections in this mass region.

  13. Automated high-volume aerosol sampling station for environmental radiation monitoring

    International Nuclear Information System (INIS)

    Toivonen, H.; Honkamaa, T.; Ilander, T.; Leppaenen, A.; Nikkinen, M.; Poellaenen, R.; Ylaetalo, S.

    1998-07-01

    An automated high-volume aerosol sampling station, known as CINDERELLA.STUK, for environmental radiation monitoring has been developed by the Radiation and Nuclear Safety Authority (STUK), Finland. The sample is collected on a glass-fibre filter (attached to a cassette); the airflow through the filter is 800 m³/h at maximum. During sampling, the filter is continuously monitored with NaI scintillation detectors. After sampling, the large filter is automatically cut into 15 pieces that form a small sample, and after ageing, the pile of filter pieces is moved onto an HPGe detector. These actions are performed automatically by a robot. The system is operated on a duty cycle of 1 d sampling, 1 d decay and 1 d counting. Minimum detectable concentrations of radionuclides in air are typically 1-10 × 10⁻⁶ Bq/m³. The station is equipped with various sensors to reveal unauthorized admittance. These sensors can be monitored remotely in real time via the Internet or telephone lines. The processes and operation of the station are monitored, and partly controlled, by computer. The present approach fulfils the requirements of the CTBTO for aerosol monitoring. The concept is also well suited to nuclear material safeguards.
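
    Minimum-detectable-concentration figures of this kind are conventionally derived from a Currie-style detection limit. A hedged sketch follows; every number below is an illustrative assumption, not a parameter of the CINDERELLA.STUK station:

```python
import math

# Hedged sketch of a Currie-style minimum detectable concentration (MDC),
# the usual basis for figures on the order of 1e-6 Bq/m3 as quoted above.
def mdc_bq_per_m3(background_counts, efficiency, gamma_yield,
                  count_time_s, air_volume_m3):
    """Currie detection limit (95% confidence) converted to Bq/m3 of air."""
    ld_counts = 2.71 + 4.65 * math.sqrt(background_counts)  # detection limit
    activity_bq = ld_counts / (efficiency * gamma_yield * count_time_s)
    return activity_bq / air_volume_m3

# Assumed: 1 d counting, 1 d of sampling at 800 m3/h, 3% photopeak
# efficiency, 85% gamma emission probability, 500 background counts.
mdc = mdc_bq_per_m3(background_counts=500, efficiency=0.03,
                    gamma_yield=0.85, count_time_s=86400,
                    air_volume_m3=800 * 24)
print(f"{mdc:.1e} Bq/m3")
```

With these assumed inputs the result lands in the 10⁻⁶ Bq/m³ range quoted for the station.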

  14. High-precision gamma-ray spectroscopy of 61Cu, an emerging medical isotope used in positron emission tomography

    Science.gov (United States)

    Nelson, N.; Ellison, P.; Nickles, R.; McCutchan, E.; Sonzogni, A.; Smith, S.; Greene, J.; Carpenter, M.; Zhu, S.; Lister, C.; Moran, K.

    2017-09-01

    61Cu (t1/2 = 3.339 h) is an important medical isotope used in positron emission tomography (PET) tumor hypoxia imaging scans; however, its beta-plus decay and the subsequent gamma decay of 61Ni have not been studied in over 30 years. High-quality decay data for 61Cu are therefore desired to determine the overall dose delivered to a patient. In this study, 61Cu was produced at the University of Wisconsin - Madison cyclotron and then assayed using the Gammasphere array at Argonne National Laboratory. Consisting of 70 Compton-suppressed high-purity germanium (HPGe) detectors, Gammasphere provides decay data of a precision exceeding that of previous 61Cu studies. γ-ray singles and coincidence data were recorded and then analyzed using the Radware gf3m software. Through γ-γ coincidence techniques, new γ-ray transitions were identified and high-precision determinations of γ-ray intensities were made. These modifications and additions to the current decay scheme will be presented, and their impact on the resulting dose estimates will be discussed. The DOE Isotope Program is acknowledged for funding ST5001030. Work supported by the U.S. DOE under Grant No. DE-FG02-94ER40848 and Contract Nos. DE-AC02-98CH10946 and DE-AC02-06CH11357 and by the Science Undergraduate Laboratory Internship Program (SULI).
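
    The γ-γ coincidence technique mentioned above can be sketched with synthetic data (illustrative only; the energies, timing and coincidence window below are assumptions, not 61Cu data):

```python
import numpy as np

# Toy sketch of gamma-gamma coincidence gating. Each two-step cascade emits
# a 500 keV gamma ray followed 30 ns later by a 300 keV gamma ray; gating on
# one energy within a time window reveals its coincident partner.
n = 200
t_first = np.arange(n) * 10000.0             # cascades well separated (ns)
times = np.concatenate([t_first, t_first + 30.0])
energies = np.concatenate([np.full(n, 500.0), np.full(n, 300.0)])
order = np.argsort(times)
times, energies = times[order], energies[order]

window = 100.0  # ns coincidence window
pairs = []
for i in range(len(times)):
    j = i + 1
    while j < len(times) and times[j] - times[i] <= window:
        pairs.append((energies[i], energies[j]))
        j += 1

# Gate on the 500 keV transition: every coincident partner is 300 keV
gated = [e2 for e1, e2 in pairs if abs(e1 - 500.0) < 2.0]
print(len(gated), set(gated))  # -> 200 {300.0}
```

In real data the gated spectrum is histogrammed, and new transitions show up as peaks in coincidence with a known line.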

  15. Interactive segmentation techniques algorithms and performance evaluation

    CERN Document Server

    He, Jia; Kuo, C-C Jay

    2013-01-01

    This book focuses on interactive segmentation techniques, which have been extensively studied in recent decades. Interactive segmentation emphasizes the clear extraction of objects of interest, whose locations are roughly indicated by human interactions based on high-level perception. This book will first introduce classic graph-cut segmentation algorithms and then discuss state-of-the-art techniques, including graph-matching methods, region merging and label propagation, clustering methods, and segmentation methods based on edge detection. A comparative analysis of these methods will be provided.

  16. A new method for automated high-dimensional lesion segmentation evaluated in vascular injury and applied to the human occipital lobe.

    Science.gov (United States)

    Mah, Yee-Haur; Jager, Rolf; Kennard, Christopher; Husain, Masud; Nachev, Parashkev

    2014-07-01

    Making robust inferences about the functional neuroanatomy of the brain is critically dependent on experimental techniques that examine the consequences of focal loss of brain function. Unfortunately, the use of the most comprehensive such technique, lesion-function mapping, is complicated by the need for time-consuming and subjective manual delineation of the lesions, greatly limiting the practicability of the approach. Here we exploit a recently-described general measure of statistical anomaly, zeta, to devise a fully-automated, high-dimensional algorithm for identifying the parameters of lesions within a brain image given a reference set of normal brain images. We proceed to evaluate such an algorithm in the context of diffusion-weighted imaging of the commonest type of lesion used in neuroanatomical research: ischaemic damage. Summary performance metrics exceed those previously published for diffusion-weighted imaging and approach the current gold standard, manual segmentation, sufficiently closely for fully-automated lesion-mapping studies to become a possibility. We apply the new method to 435 unselected images of patients with ischaemic stroke to derive a probabilistic map of the pattern of damage in lesions involving the occipital lobe, demonstrating the variation in anatomical resolvability of occipital areas so as to guide future lesion-function studies of the region. Copyright © 2012 Elsevier Ltd. All rights reserved.

  17. High temperature in-situ observations of multi-segmented metal nanowires encapsulated within carbon nanotubes by in-situ filling technique.

    Science.gov (United States)

    Hayashi, Yasuhiko; Tokunaga, Tomoharu; Iijima, Toru; Iwata, Takuya; Kalita, Golap; Tanemura, Masaki; Sasaki, Katsuhiro; Kuroda, Kotaro

    2012-08-08

    Multi-segmented one-dimensional metal nanowires were encapsulated within carbon nanotubes (CNTs) through an in-situ filling technique during a plasma-enhanced chemical vapor deposition process. Transmission electron microscopy (TEM) and environmental TEM were employed to characterize the as-prepared sample at room temperature and at high temperature. Selected-area electron diffraction revealed that a Pd4Si nanowire and a face-centered-cubic Co nanowire on top of the Pd nanowire were encapsulated within the bottom and tip parts of the multiwall CNT, respectively. Although strain-induced deformation of the graphite walls was observed, the solid-state phases of Pd4Si and Co-Pd remained even above their expected melting temperatures, up to 1,550 ± 50°C. Finally, the encapsulated metals melted and flowed out from the tip of the CNT after 2 h at the same temperature, due to the increase of the internal pressure of the CNT.

  18. Segmental analysis of cochlea on three-dimensional MR imaging and high-resolution CT. Application to pre-operative assessment of cochlear implant candidates

    International Nuclear Information System (INIS)

    Akiba, Hidenari; Himi, Tetsuo; Hareyama, Masato

    2002-01-01

    High-resolution computed tomography (HRCT) and magnetic resonance imaging (MRI) have recently become standard pre-operative examinations for cochlear implant candidates. HRCT can demonstrate ossification and narrowing of the cochlea, but subtle calcification or soft tissue obstruction may not be detected by this method alone, and so conventional T2 weighted image (T2WI) on MRI has been recommended to disclose them. In this study, segmental analyses of the cochlea were made on three-dimensional MRI (3DMRI) and HRCT in order to predict cochlear implant difficulties. The study involved 59 consecutive patients with bilateral profound sensorineural hearing loss who underwent MRI and HRCT from November 1992 to February 1998. Etiologies of deafness were meningogenic labyrinthitis (n=9), tympanogenic labyrinthitis (n=12), and others (n=38). Pulse sequence of heavy T2WI was steady state free precession and 3DMRI was reconstructed by maximum intensity projection method. HRCT was reconstructed by bone algorithm focusing on the temporal bone. For alternative segmental analysis, cochleas were anatomically divided into five parts and each of them was classified by three ranks of score depending on 3DMRI or HRCT findings. There was a close correlation by ranks between the total score of the five parts on 3DMRI and HRCT (rs=0.86, P<0.001), and a statistically significant difference was identified between causes of deafness in the total score on 3DMRI or HRCT (P<0.001, respectively). There was a significant difference in the score among the five parts on each examination (P<0.001, respectively), and abnormal findings were more frequent in the inferior horizontal part (IHP) of the basal turn. Of the 35 patients who underwent cochlear implantation, no one had ossification in the IHP on HRCT and only one patient had an obstacle to implantation. When no signal void in the IHP on 3DMRI and no ossification in the IHP on HRCT were assumed to be the criteria for candidacy for cochlear

  19. High-resolution, time-resolved MRA provides superior definition of lower-extremity arterial segments compared to 2D time-of-flight imaging.

    Science.gov (United States)

    Thornton, F J; Du, J; Suleiman, S A; Dieter, R; Tefera, G; Pillai, K R; Korosec, F R; Mistretta, C A; Grist, T M

    2006-08-01

    To evaluate a novel time-resolved contrast-enhanced (CE) projection reconstruction (PR) magnetic resonance angiography (MRA) method for identifying potential bypass graft target vessels in patients with Class II-IV peripheral vascular disease. Twenty patients (M:F = 15:5, mean age = 58 years, range = 48-83 years) were recruited from routine MRA referrals. All imaging was performed on a 1.5 T MRI system with fast gradients (Signa LX; GE Healthcare, Waukesha, WI). Images were acquired with a novel technique that combined undersampled PR with a time-resolved acquisition to yield an MRA method with high temporal and spatial resolution. The method is called PR hyper time-resolved imaging of contrast kinetics (PR-hyperTRICKS). Quantitative and qualitative analyses were used to compare two-dimensional (2D) time-of-flight (TOF) and PR-hyperTRICKS in 13 arterial segments per lower extremity. Statistical analysis was performed with the Wilcoxon signed-rank test. Fifteen percent (77/517) of the vessels were scored as missing or nondiagnostic with 2D TOF, but were scored as diagnostic with PR-hyperTRICKS. Image quality was superior with PR-hyperTRICKS vs. 2D TOF (on a four-point scale, mean rank = 3.3 ± 1.2 vs. 2.9 ± 1.2, P < 0.0001). PR-hyperTRICKS produced images with high contrast-to-noise ratios (CNR) and high spatial and temporal resolution. 2D TOF images were of inferior quality due to moderate spatial resolution, inferior CNR, greater flow-related artifacts, and absence of temporal resolution. PR-hyperTRICKS provides superior preoperative assessment of lower limb ischemia compared to 2D TOF.

  20. Application of INAA for chemical quality control analysis of C-C composite and high purity graphite by determining trace elemental concentrations

    International Nuclear Information System (INIS)

    Shinde, Amol D.; Reddy, A.V.R.; Acharya, R.; Venugopalan, Ramani

    2015-01-01

    Carbon-based materials like graphite and C-C composites are used for various scientific and technological applications. Owing to its low neutron-capture cross section and good moderating properties, graphite is used as a moderator or reflector in nuclear reactors. For high-temperature reactors like the CHTR, graphite and C-C composites are proposed as structural materials. Studies are in progress to use C-C composites as a prospective candidate instead of graphite because of their excellent mechanical and thermal properties; the advantage of a carbon-carbon composite is that its microstructure and properties can be tailor-made. Impurities such as rare-earth elements and neutron poisons, which have high neutron-absorption cross sections, and elements whose activation products have long half-lives, like 60Co (5.27 y), 65Zn (244.3 d) and 59Fe (44.5 d), are not desired in structural materials. For chemical quality control (CQC) it is necessary to evaluate the impurity concentrations accurately using a suitable non-destructive analytical technique. In the present work, two carbon-carbon composite samples and two high-purity graphite samples were analyzed by Instrumental Neutron Activation Analysis (INAA) using high-flux reactor neutrons. Samples, sealed in Al foil, were irradiated in the tray-rod position of the Dhruva reactor, BARC, at a neutron flux of ∼5 × 10¹³ cm⁻² s⁻¹. Radioactive assay was carried out by high-resolution gamma-ray spectrometry using a 40% relative-efficiency HPGe detector.
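
    The concern about long-lived activation products follows from simple decay arithmetic; a minimal sketch (illustrative, not from the paper) using the half-lives quoted above:

```python
import math

# Fraction of an activation product's activity remaining a time t after
# irradiation, from the exponential decay law.
def remaining_fraction(t_days, half_life_days):
    return math.exp(-math.log(2.0) * t_days / half_life_days)

# 60Co (5.27 y) barely decays in a year, which is why such impurities are
# unwanted in structural materials; shorter-lived 59Fe (44.5 d) mostly dies away.
print(remaining_fraction(365.0, 5.27 * 365.25))  # 60Co after one year
print(remaining_fraction(365.0, 44.5))           # 59Fe after one year
```
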

  1. A comprehensive segmentation analysis of crude oil market based on time irreversibility

    Science.gov (United States)

    Xia, Jianan; Shang, Pengjian; Lu, Dan; Yin, Yi

    2016-05-01

    In this paper, we perform a comprehensive entropic segmentation analysis of crude oil futures prices from 1983 to 2014, using the Jensen-Shannon divergence as the statistical distance between segments, and analyze the results from the original series S and the series beginning in 1986 (marked as S∗) to find common segments that have the same boundaries. We then apply time-irreversibility analysis to each segment to divide all segments into two groups according to their degree of asymmetry. Based on the temporal distribution of the common segments and the high-asymmetry segments, we find that these two types of segments appear alternately and essentially do not overlap in the daily group, while the common portions are also high-asymmetry segments in the weekly group. In addition, the temporal distribution of the common segments lies fairly close to the times of crises, wars and other events, because the impact of severe events on the oil price makes these common segments quite different from their adjacent segments. The common segments can be confirmed in the daily group series, or in the weekly group series, owing to the large divergence between common segments and their neighbors. The identification of high-asymmetry segments, in turn, helps to recognize segments that were not badly affected by the events and could recover to steady states automatically. Finally, we rearrange the segments by merging connected common segments or high-asymmetry segments into a single segment, and conjoin the connected segments that are neither common nor highly asymmetric.
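
    As a hedged illustration (not the authors' code), the Jensen-Shannon divergence between the empirical distributions of two series segments can be computed as follows:

```python
import numpy as np

# Jensen-Shannon divergence, in bits, between the empirical (histogram)
# distributions of two segments of a time series.
def js_divergence(x, y, bins=20):
    lo = min(x.min(), y.min())
    hi = max(x.max(), y.max())
    p, _ = np.histogram(x, bins=bins, range=(lo, hi))
    q, _ = np.histogram(y, bins=bins, range=(lo, hi))
    p = p / p.sum()
    q = q / q.sum()
    m = 0.5 * (p + q)

    def kl(a, b):
        nz = a > 0           # m > 0 wherever a > 0, so the ratio is safe
        return np.sum(a[nz] * np.log2(a[nz] / b[nz]))

    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Two synthetic "segments" drawn from different regimes
rng = np.random.default_rng(0)
seg1 = rng.normal(0.0, 1.0, 1000)
seg2 = rng.normal(2.0, 1.0, 1000)
print(js_divergence(seg1, seg2))  # between 0 (identical) and 1 (disjoint)
```

In segmentation schemes of this kind, a boundary is accepted where the divergence between the two candidate sub-segments is statistically significant.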

  2. Current segmented gamma-ray scanner technology

    International Nuclear Information System (INIS)

    Bjork, C.W.

    1987-01-01

    A new generation of segmented gamma-ray scanners has been developed at Los Alamos for scrap and waste measurements at the Savannah River Plant and the Los Alamos Plutonium Facility. The new designs are highly automated and exhibit special features such as good segmentation and thorough shielding to improve performance

  3. Preliminary Content Evaluation of the North Anna High Burn-Up Sister Fuel Rod Segments for Transportation in the 10-160B and NAC-LWT

    Energy Technology Data Exchange (ETDEWEB)

    Ketusky, E. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2016-08-09

    The U.S. Department of Energy's (DOE's) Used Fuel Disposition Campaign (UFDC) Program has transported high-burnup sister fuel rods from a commercial nuclear power plant for purposes of evaluation and testing. The evaluation and testing of high-burnup used nuclear fuel is integral to DOE initiatives to collect information useful in determining the integrity of fuel cladding for future safe transportation of the fuel, and for determining the effects of aging on the integrity of UNF subjected to extended storage and subsequent transportation. The UFDC Program, in collaboration with the U.S. Nuclear Regulatory Commission and the commercial nuclear industry, has obtained individual used nuclear fuel rods for testing. The rods have been received at Oak Ridge National Laboratory (ORNL) for both separate effects testing (SET) and small-scale testing (SST). To meet the research objectives, testing on multiple 6-inch fuel rod pins cut from the rods at ORNL will be performed at Pacific Northwest National Laboratory (PNNL). Up to 10 rod equivalents will be shipped. Options were evaluated for multiple shipments using the 10-160B (based on 4.5 rod equivalents) and a single shipment using the NAC-LWT. Based on the original INL/Virginia Power transfer agreement, the rods are assumed to be 152 inches in length with a 0.374-inch diameter. This report provides a preliminary content evaluation for use of the 10-160B and NAC-LWT for transporting those fuel rod pins from ORNL to PNNL. This report documents the acceptability of using these packagings to transport the fuel segments from ORNL to PNNL based on the following evaluations: enrichment, A2 evaluation, Pu-239 FGE evaluation, heat load, shielding (both gamma and neutron), and content weight/structural evaluation.

  4. Gamma-ray observations of SN 1987A with an array of high-purity germanium detectors

    International Nuclear Information System (INIS)

    Sandie, W.G.; Nakano, G.H.; Chase, L.F. Jr.; Fishman, G.J.; Meegan, C.A.; Wilson, R.B.; Paciesas, W.

    1988-01-01

    A balloon-borne gamma-ray spectrometer, comprising an array of high-purity n-type germanium (HPGe) detectors with a geometric area of 119 cm² and a resolution of 2.5 keV at 1.0 MeV, surrounded by an active NaI(Tl) collimator and a Compton-suppressing anticoincidence shield nominally 10 cm thick, was flown from Alice Springs, Northern Territory, Australia, on May 29-30, 1987, 96 days after the observed neutrino pulse. The average column depth of residual atmosphere in the direction of SN 1987A at float altitude was 6.3 g cm⁻² during the observation. SN 1987A was within the 22-deg full-width-at-half-maximum (FWHM) field of view for about 3300 s during May 29.9-30.3 UT. No excess gamma rays were observed at energies appropriate to the 56Ni-56Co decay chain or from other lines in the energy region from 0.1 to 3.0 MeV. With 80% of the data analyzed, the 3-sigma upper limit obtained for the 1238-keV line from 56Co at the instrument resolution (about 3 keV) is 1.3 × 10⁻³ photons cm⁻² s⁻¹.

  5. Segmentation, advertising and prices

    NARCIS (Netherlands)

    Galeotti, Andrea; Moraga González, José

    This paper explores the implications of market segmentation on firm competitiveness. In contrast to earlier work, here market segmentation is minimal in the sense that it is based on consumer attributes that are completely unrelated to tastes. We show that when the market is comprised by two

  6. Sipunculans and segmentation

    DEFF Research Database (Denmark)

    Wanninger, Andreas; Kristof, Alen; Brinkmann, Nora

    2009-01-01

    mechanisms may act on the level of gene expression, cell proliferation, tissue differentiation and organ system formation in individual segments. Accordingly, in some polychaete annelids the first three pairs of segmental peripheral neurons arise synchronously, while the metameric commissures of the ventral...

  7. Design of an Online Fission Gas Monitoring System for Post-irradiation Examination Heating Tests of Coated Fuel Particles for High-Temperature Gas-Cooled Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Dawn Scates

    2010-10-01

    A new Fission Gas Monitoring System (FGMS) has been designed at the Idaho National Laboratory (INL) to monitor fission gas released online during fuel heating tests. The FGMS will be used with the Fuel Accident Condition Simulator (FACS) at the Hot Fuels Examination Facility (HFEF), located at the Materials and Fuels Complex (MFC) on the INL campus. Preselected Advanced Gas Reactor (AGR) TRISO (tri-isotropic) fuel compacts will undergo testing to assess their fission-product retention characteristics under high-temperature accident conditions. The FACS furnace will heat the fuel to temperatures up to 2,000°C in a helium atmosphere. Released fission products, such as Kr and Xe isotopes, will be transported downstream to the FGMS, where they will accumulate in cryogenically cooled collection traps and be monitored with high-purity germanium (HPGe) detectors during the heating process. Special INL-developed software will be used to monitor the accumulated fission products and will report data in near real-time; these data will then be reported in a form readily available to the INL reporting database. This paper describes the details of the FGMS design, the control and acquisition software, system calibration, and the expected performance of the FGMS. Preliminary online data may be available for presentation at the High Temperature Reactor (HTR) conference.

  8. Characterization of a segmented plasma torch assisted High Heat Flux (HHF) system for performance evaluation of plasma facing components in fusion devices

    International Nuclear Information System (INIS)

    Ngangom, Aomoa; Sarmah, Trinayan; Sah, Puspa; Kakati, Mayur; Ghosh, Joydeep

    2015-01-01

    A wide variety of high heat and particle flux test facilities are used by the fusion community to evaluate the thermal performance of plasma-facing materials/components, including electron-beam, ion-beam, neutral-beam and thermal-plasma-assisted sources. In addition to simulating heat loads, plasma sources have the advantage of reproducing fusion-plasma-like conditions in terms of plasma density, temperature and particle flux. At CPP-IPR, Assam, we have developed a high heat and particle flux facility using a DC, non-transferred, segmented thermal plasma torch system, which can produce a constricted, stabilized plasma jet with high ion density. In this system, the plasma torch exhausts into a low-pressure chamber containing the materials to be irradiated, producing an expanded plasma jet with more uniform profiles than plasma torches operated at atmospheric pressure. The heat flux of the plasma beam was studied using circular calorimeters of different diameters (2 and 3 cm) for different input powers (5-55 kW). The effects of changing the gas (argon) flow rate and of mixing gases (argon + hydrogen) were also studied. The heat profile of the plasma beam was also studied using a pipe calorimeter, from which the radial heat flux was calculated by Abel inversion. The required heat flux of 10 MW/m² is achievable in our system for pure argon plasma as well as for plasma with gas mixtures. The plasma parameters, i.e. the temperature, density and beam velocity, were studied by optical emission spectroscopy using a McPherson model 209 spectrometer with a 1.33-m focal length; a plane grating with 1800 grooves/mm was used, giving a spectral resolution of 0.007 nm. A detailed characterization with respect to these plasma parameters for different gas (argon) flow rates and gas mixtures (argon + hydrogen) at different input powers will be presented in this paper. The plasma
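
    Recovering a radial profile from chord-integrated (line-of-sight) data, as the Abel inversion of the pipe-calorimeter measurements does, can be sketched with the standard onion-peeling method. This is a generic illustration under assumed geometry, not the authors' implementation:

```python
import numpy as np

# Onion-peeling Abel inversion: divide the cross-section into concentric
# annuli and solve the (upper-triangular) system of chord integrals from
# the outermost shell inward.
def onion_peel(F, dr):
    """Invert chord integrals F[i] (measured at y_i = i*dr) to radial values."""
    n = len(F)
    L = np.zeros((n, n))  # path length of chord i through annulus j
    for i in range(n):
        for j in range(i, n):
            y = i * dr
            r_in, r_out = j * dr, (j + 1) * dr
            L[i, j] = 2.0 * (np.sqrt(r_out**2 - y**2)
                             - np.sqrt(max(r_in**2 - y**2, 0.0)))
    return np.linalg.solve(L, F)

# Check against a known profile: f(r) = 1 for r < R (uniform disc), whose
# exact chord integral is F(y) = 2*sqrt(R^2 - y^2).
n, R = 50, 1.0
dr = R / n
y = np.arange(n) * dr
F = 2.0 * np.sqrt(R**2 - y**2)
f = onion_peel(F, dr)
print(f[:5])  # each value close to 1.0
```

Real measurements would substitute the calorimeter's chord-integrated heat-flux profile for the analytic F used in the check.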

  9. Pancreas and cyst segmentation

    Science.gov (United States)

    Dmitriev, Konstantin; Gutenko, Ievgeniia; Nadeem, Saad; Kaufman, Arie

    2016-03-01

    Accurate segmentation of abdominal organs from medical images is an essential part of surgical planning and computer-aided disease diagnosis. Many existing algorithms are specialized for the segmentation of healthy organs. Cystic pancreas segmentation is especially challenging due to its low-contrast boundaries and its variability in shape and location, as well as the stage of the pancreatic cancer. We present a semi-automatic segmentation algorithm for pancreata with cysts. In contrast to existing automatic approaches for healthy-pancreas segmentation, which are amenable to atlas/statistical-shape approaches, a pancreas with cysts can have even higher variability in shape due to the size and shape of the cyst(s); hence, fine results are better attained with semi-automatic, steerable approaches. We use a novel combination of random-walker and region-growing approaches to delineate the boundaries of the pancreas and cysts, with respective best Dice coefficients of 85.1% and 86.7%, and respective best volumetric overlap errors of 26.0% and 23.5%. Results show that the proposed algorithm for pancreas and pancreatic-cyst segmentation is accurate and stable.
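
    A minimal region-growing sketch (a toy illustration, not the paper's semi-automatic pipeline) conveys the basic idea of growing a segment outward from a user-placed seed:

```python
import numpy as np
from collections import deque

# Grow a region from a seed pixel over 4-connected neighbors whose
# intensity stays within a tolerance of the seed value.
def region_grow(image, seed, tol=0.1):
    h, w = image.shape
    mask = np.zeros((h, w), dtype=bool)
    seed_val = image[seed]
    queue = deque([seed])
    mask[seed] = True
    while queue:
        y, x = queue.popleft()
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if (0 <= ny < h and 0 <= nx < w and not mask[ny, nx]
                    and abs(image[ny, nx] - seed_val) <= tol):
                mask[ny, nx] = True
                queue.append((ny, nx))
    return mask

# Toy image: a bright 10x10 square "organ" on a dark background
img = np.zeros((20, 20))
img[5:15, 5:15] = 1.0
seg = region_grow(img, seed=(10, 10), tol=0.1)
print(seg.sum())  # -> 100 (the bright square)
```

The random-walker component used in the paper refines such boundaries probabilistically where simple intensity thresholds fail.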

  10. High cut-off haemodialysis induces remission of recurrent idiopathic focal segmental glomerulosclerosis after renal transplantation but is no alternative to plasmapheresis

    NARCIS (Netherlands)

    I. Noorlander (Iris); D.A. Hesselink (Dennis); M. Wabbijn (Marike); M.G.H. Betjes (Michiel)

    2011-01-01

    A 26-year-old male experienced a recurrence of idiopathic focal segmental glomerulosclerosis (iFSGS) after his second renal transplant. Reduction of proteinuria was rapidly induced by plasmapheresis (PP) and the patient has remained in remission with a once-weekly PP regimen, which has

  11. Segmentation of consumer's markets and evaluation of market's segments

    OpenAIRE

    ŠVECOVÁ, Iveta

    2013-01-01

    The goal of this bachelor thesis was to describe a possible segmentation of consumer markets for a chosen company and to present a goods offer suited to the needs of the selected segments. The work is divided into a theoretical and a practical part. The first part describes marketing, segmentation, segmentation of consumer markets, the consumer market, market segments and other terms. The second part describes the evaluation of a questionnaire survey, the discovery of market segment...

  12. Segmentation of liver tumors on CT images

    International Nuclear Information System (INIS)

    Pescia, D.

    2011-01-01

    This thesis is dedicated to the 3D segmentation of liver tumors in CT images. This is a task of great clinical interest, since it gives physicians reproducible and reliable methods for segmenting such lesions; accurate segmentation would help them during the evaluation of the lesions, the choice of treatment and treatment planning. Such a complex segmentation task must cope with three main scientific challenges: (i) the highly variable shape of the structures being sought, (ii) their similarity of appearance to the surrounding medium and (iii) the low signal-to-noise ratio of these images. The problem is addressed in a clinical context through a two-step approach, consisting of segmenting the entire liver envelope before segmenting the tumors present within it. We begin by proposing an atlas-based approach for computing pathological liver envelopes: images are first pre-processed to compute envelopes that wrap around binary masks, in an attempt to obtain liver envelopes from an estimated segmentation of healthy liver parenchyma. A new statistical atlas is then introduced and used for segmentation through its diffeomorphic registration to the new image. This segmentation is achieved by combining image-matching costs with spatial and appearance priors, using a multi-scale approach with Markov random fields (MRFs). The second step of our approach is dedicated to segmenting the lesions contained within the envelopes using a combination of machine-learning techniques and graph-based methods. First, an appropriate feature space is considered, involving texture descriptors determined through filtering at various scales and orientations. Then, state-of-the-art machine-learning techniques are used to determine the most relevant features, as well as the hyperplane that separates the feature space of tumoral voxels from that of healthy tissues. Segmentation is then

  13. Improving image segmentation by learning region affinities

    Energy Technology Data Exchange (ETDEWEB)

    Prasad, Lakshman [Los Alamos National Laboratory; Yang, Xingwei [TEMPLE UNIV.; Latecki, Longin J [TEMPLE UNIV.

    2010-11-03

    We utilize the context information of other regions in hierarchical image segmentation to learn new region affinities. It is well known that a single choice of quantization of an image space is highly unlikely to be a common optimal quantization level for all categories; each level of quantization has its own benefits. Therefore, we utilize the hierarchical information among different quantizations as well as the spatial proximity of their regions. The proposed affinity learning takes into account higher-order relations among image regions, both local and long-range, making it robust to instabilities and errors in the original, pairwise region affinities. Once the learnt affinities are obtained, we use a standard image segmentation algorithm to get the final segmentation. Moreover, the learnt affinities can be naturally utilized in interactive segmentation. Experimental results on the Berkeley Segmentation Dataset and the MSRC Object Recognition Dataset are comparable to, and in some aspects better than, state-of-the-art methods.

  14. Segmental tuberculosis verrucosa cutis

    Directory of Open Access Journals (Sweden)

    Hanumanthappa H

    1994-01-01

    A case of segmental tuberculosis verrucosa cutis is reported in a 10-year-old boy. The condition resembled the ascending lymphangitic type of sporotrichosis. The lesions cleared on treatment with INH 150 mg daily for 6 months.

  15. Chromosome condensation and segmentation

    International Nuclear Information System (INIS)

    Viegas-Pequignot, E.M.

    1981-01-01

    Some aspects of chromosome condensation in mammals, especially humans, were studied by means of the cytogenetic techniques of chromosome banding. Two approaches were adopted: a study of normal condensation as early as prophase, and an analysis of chromosome segmentation induced by physical (temperature and γ-rays) or chemical agents (base analogues, antibiotics, ...) in order to identify the factors liable to affect condensation. Here 'segmentation' means an abnormal chromosome condensation that appears systematically and is reproducible. The study of normal condensation was made possible by the development of a technique based on cell synchronization by thymidine, giving prophasic and prometaphasic cells. Moreover, the possibility of inducing R-banding segmentations in these cells with BrdU (5-bromodeoxyuridine) allowed a much finer analysis of karyotypes. Another technique was developed using 5-ACR (5-azacytidine); it made it possible to induce a segmentation similar to the one obtained with BrdU and to identify heterochromatic areas rich in G-C base pairs. [fr]

  16. International EUREKA: Initialization Segment

    International Nuclear Information System (INIS)

    1982-02-01

    The Initialization Segment creates the starting description of the uranium market. The starting description includes the international boundaries of trade, the geologic provinces, resources, reserves, production, uranium demand forecasts, and existing market transactions. The Initialization Segment is designed to accept information of various degrees of detail, depending on what is known about each region. It must transform this information into the specific data structure required by the Market Segment of the model, filling in gaps in the information through a predetermined sequence of defaults and built-in assumptions. A principal function of the Initialization Segment is to create diagnostic messages indicating any inconsistencies in the data and explaining which assumptions were used to organize the data base. This permits the user to manipulate the data base until the user is satisfied that all the assumptions used are reasonable and that any inconsistencies have been resolved in a satisfactory manner.

  17. High-precision gamma-ray spectroscopy of 82Rb and 72As, two important medical isotopes used in positron emission tomography

    Science.gov (United States)

    Nino, Michael; McCutchan, E.; Smith, S.; Sonzogni, A.; Muench, L.; Greene, J.; Carpenter, M.; Zhu, S.; Lister, C.

    2015-10-01

    Both 82Rb and 72As are very important medical isotopes used in imaging procedures, yet their full decay schemes were last studied decades ago using low-sensitivity detection systems; high quality decay data is necessary to determine the total dose received by the patient, the background in imaging technologies, and shielding requirements in production facilities. To improve the decay data of these two isotopes, sources were produced at the Brookhaven Linac Isotope Producer (BLIP) and then the Gammasphere array, consisting of 89 Compton-suppressed HPGe detectors, at Argonne National Laboratory was used to analyze the gamma-ray emissions from the daughter nuclei 82Kr and 72Ge. Gamma-ray singles and coincidence information were recorded and analyzed using Radware Gf3m software. Significant revisions were made to the level schemes including the observation of many new transitions and levels as well as a reduction in uncertainty on measured γ-ray intensities and deduced β-feedings. The new decay schemes as well as their impact on dose calculations will be presented. DOE Isotope Program is acknowledged for funding ST5001030. Work supported by the U.S. DOE under Grant No. DE-FG02-94ER40848 and Contract Nos. DE-AC02-98CH10946 and DE-AC02-06CH11357 and by the Science Undergraduate Laboratory Internships Program (SULI).

  18. Adolescent audience segmentation on alcohol attitudes : A further exploration

    NARCIS (Netherlands)

    Janssen, M.M.; Mathijssen, J.J.P.; van Bon, M.J.H.; van Oers, J.A.M.; Garretsen, H.F.L.

    2015-01-01

    Introduction: In an earlier audience segmentation study, Dutch adolescents aged 12–18 years were segmented into five alcohol attitudes segments: ordinaries, high spirits, consciously sobers, ordinary sobers and socials. The current study explores several aspects of alcohol consumption and leisure

  19. A hybrid concept (segmented plus monolithic fused silica shells) for a high-throughput and high-angular resolution x-ray mission (Lynx/X-Ray Surveyor like)

    Science.gov (United States)

    Basso, Stefano; Civitani, Marta; Pareschi, Giovanni; Parodi, Giancarlo

    2017-09-01

    Lynx is a large-area and high-angular-resolution X-ray mission being studied by NASA to be presented to the next Decadal Survey for implementation in the next decade. It aims to realize an X-ray telescope with an effective area similar to Athena (2 m2 at 1 keV) but with the angular resolution of Chandra and a much larger Field Of View (up to 20 arcmin x 20 arcmin). The science of X-ray Surveyor requires a large-throughput mirror assembly with sub-arcsec angular resolution. These future X-ray mirrors have a set of requirements which, collectively, represent very substantial advances over any currently in operation or planned for missions other than X-ray Surveyor. Of particular importance is achieving low mass per unit collecting area while maintaining Chandra-like angular resolution. Among the possible solutions under study, the direct polishing of both thin monolithic pseudo-cylindrical shells and segments made of fused silica is being considered as a viable approach for the implementation of the mirrors. Fused silica has very good thermomechanical parameters (including a very low CTE), making the material particularly well suited for the production of the Lynx mirrors. It should be noted that the use of closed shells is also very attractive, since the operations for the integration of the shells will be greatly simplified and the area lost due to vignetting from the interfacing structures minimized, even if the handling of such large (3 m diameter) and thin shells still has to be demonstrated. In this paper we discuss a possible basic layout for a full-shell mirror and, as a second solution, a hybrid concept (segmented plus monolithic shells made of fused silica) for the Lynx/XRS telescope, presenting preliminary results in terms of optical and mechanical performance.

  20. Fluence map segmentation

    International Nuclear Information System (INIS)

    Rosenwald, J.-C.

    2008-01-01

    The lecture addressed the following topics: 'Interpreting' the fluence map; The sequencer; Reasons for difference between desired and actual fluence map; Principle of 'Step and Shoot' segmentation; Large number of solutions for given fluence map; Optimizing 'step and shoot' segmentation; The interdigitation constraint; Main algorithms; Conclusions on segmentation algorithms (static mode); Optimizing intensity levels and monitor units; Sliding window sequencing; Synchronization to avoid the tongue-and-groove effect; Accounting for physical characteristics of MLC; Importance of corrections for leaf transmission and offset; Accounting for MLC mechanical constraints; The 'complexity' factor; Incorporating the sequencing into optimization algorithm; Data transfer to the treatment machine; Interface between R and V and accelerator; and Conclusions on fluence map segmentation (Segmentation is part of the overall inverse planning procedure; 'Step and Shoot' and 'Dynamic' options are available for most TPS (depending on accelerator model); The segmentation phase tends to come into the optimization loop; The physical characteristics of the MLC have a large influence on final dose distribution; The IMRT plans (MU and relative dose distribution) must be carefully validated). (P.A.)
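
    As a concrete illustration of 'Step and Shoot' segmentation, here is a minimal sketch of the classic unidirectional sweep rule for a single MLC leaf pair: an integer 1D fluence profile is decomposed into unit-weight segments, with a segment opening at every unit increase of the profile and closing at the matching unit decrease. Real sequencers additionally handle 2D apertures, interdigitation, and other MLC constraints; this is only the core decomposition idea.

```python
def sweep_segments(profile):
    """Decompose a 1D integer fluence profile into unit-weight segments.

    Each segment is a half-open interval (left, right) covering indices
    left..right-1 of the profile; summing all segments reproduces it.
    """
    padded = [0] + list(profile) + [0]
    opens, segments = [], []
    for i in range(1, len(padded)):
        diff = padded[i] - padded[i - 1]
        for _ in range(max(diff, 0)):       # each unit increase opens a segment
            opens.append(i - 1)             # index into the original profile
        for _ in range(max(-diff, 0)):      # each unit decrease closes one
            segments.append((opens.pop(), i - 1))
    return segments

segs = sweep_segments([0, 2, 3, 1, 0])      # 3 segments: sum of positive jumps
```

    The number of segments equals the sum of the positive gradients of the profile, which is why sequencers try to smooth the fluence map before decomposition.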

  1. Gamifying Video Object Segmentation.

    Science.gov (United States)

    Spampinato, Concetto; Palazzo, Simone; Giordano, Daniela

    2017-10-01

    Video object segmentation can be considered one of the most challenging computer vision problems. Indeed, so far, no existing solution is able to deal effectively with the peculiarities of real-world videos, especially in cases of articulated motion and object occlusions; these limitations appear even more evident when the performance of automated methods is compared with that of humans. However, manually segmenting objects in videos is largely impractical, as it requires a lot of time and concentration. To address this problem, in this paper we propose an interactive video object segmentation method which exploits, on the one hand, the capability of humans to correctly identify objects in visual scenes, and on the other hand, collective human brainpower to solve challenging, large-scale tasks. In particular, our method relies on a game with a purpose to collect human inputs on object locations, followed by an accurate segmentation phase achieved by optimizing an energy function encoding spatial and temporal constraints between object regions as well as human-provided location priors. Performance analysis carried out on complex video benchmarks, exploiting data provided by over 60 users, demonstrated that our method shows a better trade-off between annotation time and segmentation accuracy than interactive video annotation and automated video object segmentation approaches.

  2. Strategic market segmentation

    Directory of Open Access Journals (Sweden)

    Maričić Branko R.

    2015-01-01

    Full Text Available Strategic planning of marketing activities is the basis of business success in a modern business environment. Customers are not homogeneous in their preferences and expectations. Formulating an adequate marketing strategy, focused on the realization of the company's strategic objectives, requires a segmented approach to the market that appreciates differences in the expectations and preferences of customers. One of the significant activities in strategic planning of marketing activities is market segmentation. Strategic planning imposes a need to plan marketing activities according to strategically important segments on a long-term basis. At the same time, there is a need to revise and adapt marketing activities on a short-term basis. There are a number of criteria based on which market segmentation is performed. The paper considers the effectiveness and efficiency of different market segmentation criteria based on empirical research of customer expectations and preferences. The analysis includes traditional criteria and criteria based on a behavioral model. The research implications are analyzed from the perspective of selecting the most adequate market segmentation criteria in strategic planning of marketing activities.

  3. Integrative image segmentation optimization and machine learning approach for high quality land-use and land-cover mapping using multisource remote sensing data

    Science.gov (United States)

    Gibril, Mohamed Barakat A.; Idrees, Mohammed Oludare; Yao, Kouame; Shafri, Helmi Zulhaidi Mohd

    2018-01-01

    The growing use of optimization for geographic object-based image analysis and the possibility of deriving a wide range of information about the image in textual form make machine learning (data mining) a versatile tool for information extraction from multiple data sources. This paper presents an application of data mining for land-cover classification by fusing SPOT-6, RADARSAT-2, and derived datasets. First, the images and other derived indices (normalized difference vegetation index, normalized difference water index, and soil-adjusted vegetation index) were combined and subjected to a segmentation process, with optimal segmentation parameters obtained using a combination of spatial and Taguchi statistical optimization. The image objects, which carry all the attributes of the input datasets, were extracted and related to the target land-cover classes through a data mining algorithm (decision tree) for classification. To evaluate the performance, the result was compared with two nonparametric classifiers: support vector machine (SVM) and random forest (RF). Furthermore, the decision tree classification result was evaluated against six unoptimized trials segmented using arbitrary parameter combinations. The results show that the optimized process produces better land-use/land-cover classification, with an overall classification accuracy of 91.79%, compared with 87.25% and 88.69% for SVM and RF, respectively, while the six unoptimized classifications yield overall accuracies between 84.44% and 88.08%. The higher accuracy of the optimized data mining classification approach compared to the unoptimized results indicates that the optimization process has a significant impact on classification quality.

  4. SEGMENTATION OF SME PORTFOLIO IN BANKING SYSTEM

    Directory of Open Access Journals (Sweden)

    Namolosu Simona Mihaela

    2013-07-01

    Full Text Available The Small and Medium Enterprises (SMEs) represent an important target market for commercial banks. In this respect, finding the best methods for designing and implementing optimal marketing strategies for this target is a continuous concern for marketing specialists and researchers in the banking system; the purpose is to find the most suitable service model for these companies. The SME portfolio of a bank is not homogeneous, as different characteristics and behaviours can be identified. The current paper reveals empirical evidence about SME portfolio characteristics and the segmentation methods used in the banking system. Its purpose is to identify whether segmentation has an impact on finding the optimal marketing strategies and service model, and whether this hypothesis is applicable to any commercial bank, irrespective of country or region. Some banks segment the SME portfolio by a single criterion: the annual (official) company turnover; others also consider profitability and other financial indicators of the company. In some cases, even banking behaviour becomes a criterion. In all cases, creating scenarios with different thresholds and estimating the impact on profitability and volumes are two mandatory steps in establishing the final segmentation criteria matrix. Details about each of these segmentation methods may be found in the paper. Testing the final matrix of criteria is also detailed, with the purpose of making realistic estimations. An example for lending products is provided; the product offer is presented as responding to the needs of the targeted sub-segment and therefore correlated with the sub-segment characteristics. Identifying key issues and trends leads to a further action plan proposal. Depending on the overall strategy and commercial targets of the bank, the focus may shift, with one or more sub-segments becoming high priority (for acquisition/activation/retention/cross-sell/up-sell/increased profitability etc., while

  5. Manual segmentation of the fornix, fimbria, and alveus on high-resolution 3T MRI: Application via fully-automated mapping of the human memory circuit white and grey matter in healthy and pathological aging.

    Science.gov (United States)

    Amaral, Robert S C; Park, Min Tae M; Devenyi, Gabriel A; Lynn, Vivian; Pipitone, Jon; Winterburn, Julie; Chavez, Sofia; Schira, Mark; Lobaugh, Nancy J; Voineskos, Aristotle N; Pruessner, Jens C; Chakravarty, M Mallar

    2018-04-15

    Recently, much attention has been focused on the definition and structure of the hippocampus and its subfields, while the projections from the hippocampus have been relatively understudied. Here, we derive a reliable protocol for manual segmentation of hippocampal white matter regions (alveus, fimbria, and fornix) using high-resolution magnetic resonance images that are complementary to our previous definitions of the hippocampal subfields, both of which are freely available at https://github.com/cobralab/atlases. Our segmentation methods demonstrated high inter- and intra-rater reliability, were validated as inputs in automated segmentation, and were used to analyze the trajectory of these regions in both healthy aging (OASIS), and Alzheimer's disease (AD) and mild cognitive impairment (MCI; using ADNI). We observed significant bilateral decreases in the fornix in healthy aging while the alveus and cornu ammonis (CA) 1 were well preserved (all p's<0.006). MCI and AD demonstrated significant decreases in fimbriae and fornices. Many hippocampal subfields exhibited decreased volume in both MCI and AD, yet no significant differences were found between MCI and AD cohorts themselves. Our results suggest a neuroprotective or compensatory role for the alveus and CA1 in healthy aging and suggest that an improved understanding of the volumetric trajectories of these structures is required. Copyright © 2016 Elsevier Inc. All rights reserved.

  6. Coupled dictionary learning for joint MR image restoration and segmentation

    Science.gov (United States)

    Yang, Xuesong; Fan, Yong

    2018-03-01

    To achieve better segmentation of MR images, image restoration is typically used as a preprocessing step, especially for low-quality MR images. Recent studies have demonstrated that dictionary learning methods could achieve promising performance for both image restoration and image segmentation. These methods typically learn paired dictionaries of image patches from different sources and use a common sparse representation to characterize paired image patches, such as low-quality image patches and their corresponding high quality counterparts for the image restoration, and image patches and their corresponding segmentation labels for the image segmentation. Since learning these dictionaries jointly in a unified framework may improve the image restoration and segmentation simultaneously, we propose a coupled dictionary learning method to concurrently learn dictionaries for joint image restoration and image segmentation based on sparse representations in a multi-atlas image segmentation framework. Particularly, three dictionaries, including a dictionary of low quality image patches, a dictionary of high quality image patches, and a dictionary of segmentation label patches, are learned in a unified framework so that the learned dictionaries of image restoration and segmentation can benefit each other. Our method has been evaluated for segmenting the hippocampus in MR T1 images collected with scanners of different magnetic field strengths. The experimental results have demonstrated that our method achieved better image restoration and segmentation performance than state of the art dictionary learning and sparse representation based image restoration and image segmentation methods.

  7. Segmentation of complex document

    Directory of Open Access Journals (Sweden)

    Souad Oudjemia

    2014-06-01

    Full Text Available In this paper we present a method for segmentation of document images with complex structure. The technique is based on the GLCM (Grey Level Co-occurrence Matrix) and is used to segment this type of document into three regions, namely 'graphics', 'background' and 'text'. Very briefly, the method divides the document image into blocks of a size chosen after a series of tests, and then applies the co-occurrence matrix to each block in order to extract five textural parameters: energy, entropy, sum entropy, difference entropy and standard deviation. These parameters are then used to classify the image into three regions using the k-means algorithm; the last step of the segmentation is obtained by grouping connected pixels. Two performance measurements are performed for both the graphics and text zones; we obtained a classification rate of 98.3% and a misclassification rate of 1.79%.
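
    The per-block GLCM-feature-plus-k-means pipeline can be sketched in pure numpy. The block size, grey-level quantisation, offset, and feature subset below are illustrative choices, not the paper's exact parameters, and the toy blocks merely stand in for 'background', 'graphics', and 'text' regions.

```python
# Minimal sketch of GLCM texture features followed by k-means clustering.
# All parameters (8 grey levels, horizontal offset, 3 of the 5 features)
# are illustrative, not the paper's settings.
import numpy as np

def glcm_features(block, levels=8):
    """Energy, entropy and standard deviation from a horizontal-offset GLCM."""
    q = (block * (levels - 1)).astype(int)          # quantise to `levels` grey levels
    glcm = np.zeros((levels, levels))
    for a, b in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
        glcm[a, b] += 1                             # count horizontal neighbour pairs
    p = glcm / glcm.sum()
    nz = p[p > 0]
    return np.array([np.sum(p ** 2),                # energy
                     -np.sum(nz * np.log2(nz)),     # entropy
                     block.std()])                  # standard deviation

def kmeans(X, k=3, iters=20, seed=0):
    """Plain k-means on feature rows of X."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels

# Toy blocks in [0, 1]: uniform 'background', noisy 'graphics', striped 'text'.
rng = np.random.default_rng(1)
blocks = [np.zeros((16, 16)),
          rng.random((16, 16)),
          np.tile([0.0, 1.0], (16, 8))]
X = np.array([glcm_features(b) for b in blocks])
labels = kmeans(X, k=3)
```

    On real documents each block of the page would be a row of X, and the final step would regroup connected blocks sharing a cluster label into regions.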

  8. SU-F-J-93: Automated Segmentation of High-Resolution 3D WholeBrain Spectroscopic MRI for Glioblastoma Treatment Planning

    Energy Technology Data Exchange (ETDEWEB)

    Schreibmann, E; Shu, H [Department of Radiation Oncology, Winship Cancer Institute of Emory University, Atlanta, GA (United States); Cordova, J; Gurbani, S; Holder, C; Cooper, L; Shim, H [Department of Radiology and Imaging Sciences, Emory University, Atlanta, GA (United States)

    2016-06-15

    Purpose: We report on an automated segmentation algorithm for defining radiation therapy target volumes using spectroscopic MR images (sMRI) acquired at a nominal voxel resolution of 100 microliters. Methods: Whole-brain sMRI combining 3D echo-planar spectroscopic imaging, generalized auto-calibrating partially-parallel acquisitions, and elliptical k-space encoding was conducted on a 3T MRI scanner with a 32-channel head coil array. Metabolite maps generated include choline (Cho), creatine (Cr), and N-acetylaspartate (NAA), as well as Cho/NAA, Cho/Cr, and NAA/Cr ratio maps. Automated segmentation was achieved by concomitantly considering the sMRI metabolite maps with standard contrast-enhanced (CE) imaging in a pipeline that first uses the water signal for skull stripping. Subsequently, an initial blob of tumor region is identified by searching for regions of FLAIR abnormality that also display reduced NAA activity, using a mean-ratio correlation and morphological filters. These regions are used as the starting point for a geodesic level-set refinement that adapts the initial blob to the fine details specific to each metabolite. Results: The accuracy of the segmentation model was tested on a cohort of 12 patients with sMRI datasets acquired pre-, mid- and post-treatment, providing a broad range of enhancement patterns. Compared to classical imaging, where heterogeneity in tumor appearance and shape across patients posed a greater challenge to the algorithm, regions of abnormal activity were easily detected in the sMRI metabolite maps when combining the detail available in the standard imaging with the local enhancement produced by the metabolites. Results can be imported into treatment planning, leading in general to an increase in the target volumes (GTV60) when using sMRI+CE MRI compared to the standard CE MRI alone. Conclusion: Integration of automated segmentation of sMRI metabolite maps into planning is feasible and will likely streamline acceptance of this
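
    The seeding step described in the abstract (FLAIR abnormality combined with reduced NAA, cleaned by morphological filters) can be caricatured as follows. The thresholds, array names, and synthetic data are hypothetical, and the geodesic level-set refinement is deliberately omitted; this only shows the initial-blob logic.

```python
# Schematic sketch of the initial tumor-blob step: flag voxels that are both
# FLAIR-hyperintense and NAA-suppressed, clean the mask morphologically, and
# keep the largest connected component. Thresholds are illustrative.
import numpy as np
from scipy import ndimage

def initial_tumor_blob(flair, naa, flair_z=2.0, naa_frac=0.5):
    z = (flair - flair.mean()) / flair.std()         # FLAIR abnormality as z-score
    mask = (z > flair_z) & (naa < naa_frac * naa.mean())
    mask = ndimage.binary_opening(mask)              # drop isolated voxels
    labels, n = ndimage.label(mask)                  # connected components
    if n == 0:
        return mask
    sizes = ndimage.sum(mask, labels, range(1, n + 1))
    return labels == (int(np.argmax(sizes)) + 1)     # largest blob only

# Synthetic volumes with a bright-FLAIR, NAA-suppressed lesion.
rng = np.random.default_rng(0)
flair = rng.normal(0.0, 1.0, (32, 32, 32))
naa = np.ones((32, 32, 32))
flair[10:18, 10:18, 10:18] += 5.0
naa[10:18, 10:18, 10:18] = 0.1
blob = initial_tumor_blob(flair, naa)
```

    In the paper's pipeline this blob would then initialize the level-set refinement for each metabolite map.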

  9. Connecting textual segments

    DEFF Research Database (Denmark)

    Brügger, Niels

    2017-01-01

    In “Connecting textual segments: A brief history of the web hyperlink” Niels Brügger investigates the history of one of the most fundamental features of the web: the hyperlink. Based on the argument that the web hyperlink is best understood if it is seen as another step in a much longer and broader history than just the years of the emergence of the web, the chapter traces the history of how segments of text have deliberately been connected to each other by the use of specific textual and media features, from clay tablets, manuscripts on parchment, and print, among others, to hyperlinks on stand...

  10. Treatment of the X and γ rays lung monitoring spectra obtained by using HP-Ge detectors in case of exposures to uranium

    International Nuclear Information System (INIS)

    Berard, P.; Pourret, O.; Aussel, J.P.; Rongier, E.

    1996-01-01

    A lung monitoring counting spectrum can be described as a random phenomenon. Channel-by-channel Poisson-type modelling was verified for cases of pure background. When carrying out spectral analysis for qualitative research, one must work with the sum of the detectors; the quantification must be calculated detector by detector. Statistical tests make it possible to certify that one or several peaks are really present in the organism. The calculations are currently made with automatic spectral analysis: peak search, specific area, statistics, and the probability of the real presence of the analysed photopeak, taking into account the morphological parameters of the worker. The results are analysed detector by detector, with and without the background of the room. Detection limits obtained in Pierrelatte under monitoring measurement conditions were assessed for variable tissue thicknesses covering the range of subjects to be examined. For each subject, the calculations are made taking into account the equivalent tissue thicknesses derived from individual morphological parameters. This method makes it possible to quantify lung activities with a detection limit of 3.9 Bq (235U; thirty-minute counting time; reference-man parameters) and to monitor exposure to the different compounds of uranium. (author)
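
    The abstract does not spell out the statistical test used, so the sketch below uses Currie's classic formulation, the standard way such Poisson-background detection limits are expressed: for a background of B counts in the peak region, the detection limit is approximately 2.71 + 4.65·sqrt(B) counts (5% false-positive and false-negative rates). All numerical inputs are illustrative, not the paper's values.

```python
# Currie-style detection limit converted to an activity. The efficiency,
# branching ratio and background counts below are hypothetical examples.
import math

def detection_limit_bq(background_counts, efficiency, branching, live_time_s):
    """Currie detection limit L_D expressed as an activity in Bq."""
    ld_counts = 2.71 + 4.65 * math.sqrt(background_counts)
    return ld_counts / (efficiency * branching * live_time_s)

# Illustrative numbers for the 185.7 keV 235U line, 30-minute counting time:
ld = detection_limit_bq(background_counts=2000, efficiency=0.002,
                        branching=0.572, live_time_s=1800)
```

    The factor efficiency × branching × live time converts the count-domain limit into Bq; per-subject limits vary with the equivalent tissue thickness through the efficiency term.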

  11. Automatic modeling using PENELOPE of two HPGe detectors used for measurement of environmental samples by γ-spectrometry from a few sets of experimental efficiencies

    Science.gov (United States)

    Guerra, J. G.; Rubiano, J. G.; Winter, G.; Guerra, A. G.; Alonso, H.; Arnedo, M. A.; Tejera, A.; Mosqueda, F.; Martel, P.; Bolivar, J. P.

    2018-02-01

    The aim of this paper is to characterize two HPGe gamma-ray detectors used in two different laboratories for environmental radioactivity measurements, so as to perform efficiency calibrations by means of Monte Carlo Simulation. To achieve such an aim, methodologies developed in previous papers have been applied, based on the automatic optimization of the model of detector, so that the differences between computational and reference FEPEs are minimized. In this work, such reference FEPEs have been obtained experimentally from several measurements of the IAEA RGU-1 reference material for specific source-detector arrangements. The models of both detectors built through these methodologies have been validated by comparing with experimental results for several reference materials and different measurement geometries, showing deviations below 10% in most cases.
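
    The optimisation idea, adjusting the detector model until computed and reference FEPEs agree, can be caricatured with a one-parameter grid search. The forward model below is a hypothetical stand-in for the PENELOPE simulation (the real work varies several geometric parameters), and every number is illustrative.

```python
# Toy illustration: vary a hypothetical dead-layer thickness and keep the
# value whose simulated full-energy-peak efficiencies (FEPEs) best match
# the reference set. The forward model is a stand-in, not PENELOPE.
import numpy as np

def simulated_fepe(energies_kev, dead_layer_mm):
    """Hypothetical stand-in for a Monte Carlo FEPE calculation."""
    return 0.05 * np.exp(-dead_layer_mm * 50.0 / energies_kev)

energies = np.array([186.0, 609.0, 1001.0])      # RGU-1-like gamma lines
reference = simulated_fepe(energies, 0.7)        # pretend these were measured

candidates = np.linspace(0.1, 1.5, 141)          # 0.01 mm steps
costs = [np.sum((simulated_fepe(energies, d) / reference - 1.0) ** 2)
         for d in candidates]
best = candidates[int(np.argmin(costs))]
```

    The published methodology replaces this brute-force scan with an automatic optimiser over several detector-model parameters, but the cost function, squared relative deviation between computed and reference FEPEs, plays the same role.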

  12. Trace radioactive measurement in foodstuffs using high purity germanium detector

    International Nuclear Information System (INIS)

    Morco, Ryan P.; Racho, Joseph Michael D.; Castaneda, Soledad S.; Almoneda, Rosalina V.; Pabroa, Preciosa Corazon B.; Sucgang, Raymond J.

    2010-01-01

    Trace radioactivity in food has been seriously considered a source of potential harm after the accidental radioactive releases of the last decades, which led to contamination of the food chain. Countermeasures are being used to reduce the radiological health risk to the population and to ensure that public safety and international commitments are met. An investigation of radioactive traces in foods was carried out by gamma-ray spectrometry. The radionuclides measured were the fission products 137Cs and 134Cs and naturally occurring 40K. Gamma-ray measurements were performed using a hybrid gamma-ray counting system with a coaxial p-type Tennelec High Purity Germanium (HPGe) detector with a relative efficiency of 18.4%. Channels were calibrated to energies using a standard check source containing 137Cs and 60Co. Self-shielding within samples was taken into account by comparing directly with reference standards of similar matrix and geometry. The efficiencies for the radionuclides of interest were accounted for in calculating the activity concentrations in the samples. The efficiency calibration curve was generated using an in-house validated program called FINDPEAK, which applies a least-squares fit of a polynomial up to sixth order. The Lower Limits of Detection (LLD) obtained for both 137Cs and 134Cs range from 1-6 Bq/kg depending on the sample matrix. In the last five years, no foodstuffs analyzed have exceeded the local and international regulatory limit of 1000 Bq/kg for the summed activities of 137Cs and 134Cs. (author)
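
    The two computational steps described, a least-squares polynomial fit of the efficiency curve and the activity concentration derived from a net peak area, can be sketched as follows. FINDPEAK itself is in-house and not public, so the calibration points, fit degree, and count numbers below are illustrative only; the 661.7 keV energy and 85.1% emission probability of 137Cs are standard nuclide data.

```python
# Sketch of an efficiency calibration fit and an activity calculation.
# Calibration points and counts are illustrative, not the paper's data.
import numpy as np

# Fit log(efficiency) vs log(energy) with a low-order polynomial.
cal_energy = np.array([122.0, 344.0, 662.0, 1173.0, 1332.0])      # keV
cal_eff = np.array([0.0200, 0.0097, 0.0061, 0.0041, 0.0037])
coeffs = np.polyfit(np.log(cal_energy), np.log(cal_eff), deg=2)

def efficiency(energy_kev):
    return np.exp(np.polyval(coeffs, np.log(energy_kev)))

def activity_bq_per_kg(net_counts, energy_kev, branching, live_time_s, mass_kg):
    """A = N / (eps * p_gamma * t * m)."""
    return net_counts / (efficiency(energy_kev) * branching
                         * live_time_s * mass_kg)

# 137Cs line at 661.7 keV, 85.1% emission probability, illustrative counts:
a = activity_bq_per_kg(net_counts=1500, energy_kev=661.7,
                       branching=0.851, live_time_s=3600, mass_kg=0.5)
```

    A low-order log-log polynomial is used here for brevity; FINDPEAK fits up to sixth order, which matters when the calibration spans the low-energy roll-off of the efficiency curve.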

  13. Segmentation in cinema perception.

    Science.gov (United States)

    Carroll, J M; Bever, T G

    1976-03-12

    Viewers perceptually segment moving picture sequences into their cinematically defined units: excerpts that follow short film sequences are recognized faster when the excerpt originally came after a structural cinematic break (a cut or change in the action) than when it originally came before the break.

  14. Dictionary Based Image Segmentation

    DEFF Research Database (Denmark)

    Dahl, Anders Bjorholm; Dahl, Vedrana Andersen

    2015-01-01

    We propose a method for weakly supervised segmentation of natural images, which may contain both textured and non-textured regions. Our texture representation is based on a dictionary of image patches. To divide an image into separated regions with similar texture we use an implicit level sets...

  15. Unsupervised Image Segmentation

    Czech Academy of Sciences Publication Activity Database

    Haindl, Michal; Mikeš, Stanislav

    2014-01-01

    Roč. 36, č. 4 (2014), s. 23-23 R&D Projects: GA ČR(CZ) GA14-10911S Institutional support: RVO:67985556 Keywords : unsupervised image segmentation Subject RIV: BD - Theory of Information http://library.utia.cas.cz/separaty/2014/RO/haindl-0434412.pdf

  16. Metrics for image segmentation

    Science.gov (United States)

    Rees, Gareth; Greenway, Phil; Morray, Denise

    1998-07-01

    An important challenge in mapping image-processing techniques onto applications is the lack of quantitative performance measures. From a systems engineering perspective these are essential if system level requirements are to be decomposed into sub-system requirements which can be understood in terms of algorithm selection and performance optimization. Nowhere in computer vision is this more evident than in the area of image segmentation. This is a vigorous and innovative research activity, but even after nearly two decades of progress, it remains almost impossible to answer the question 'what would the performance of this segmentation algorithm be under these new conditions?' To begin to address this shortcoming, we have devised a well-principled metric for assessing the relative performance of two segmentation algorithms. This allows meaningful objective comparisons to be made between their outputs. It also estimates the absolute performance of an algorithm given ground truth. Our approach is an information theoretic one. In this paper, we describe the theory and motivation of our method, and present practical results obtained from a range of state of the art segmentation methods. We demonstrate that it is possible to measure the objective performance of these algorithms, and to use the information so gained to provide clues about how their performance might be improved.
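
    The abstract does not reproduce the metric's formula. A standard information-theoretic measure in the same spirit is the variation of information, VI(X, Y) = H(X) + H(Y) - 2·I(X; Y), which is zero exactly when two segmentations agree up to a relabelling; the sketch below uses it as a concrete example of comparing two label maps, not as the paper's own metric.

```python
# Variation of information between two segmentations, in pure numpy.
import numpy as np

def variation_of_information(seg_a, seg_b):
    a, b = seg_a.ravel(), seg_b.ravel()
    la, lb = np.unique(a), np.unique(b)
    joint = np.zeros((len(la), len(lb)))
    for i, x in enumerate(la):
        for j, y in enumerate(lb):
            joint[i, j] = np.sum((a == x) & (b == y))
    p = joint / joint.sum()                 # joint label distribution
    px, py = p.sum(1), p.sum(0)

    def H(q):                               # Shannon entropy in bits
        q = q[q > 0]
        return -np.sum(q * np.log2(q))

    mi = H(px) + H(py) - H(p.ravel())       # mutual information
    return H(px) + H(py) - 2 * mi

seg1 = np.array([[0, 0, 1, 1]] * 4)
seg2 = np.array([[1, 1, 0, 0]] * 4)         # same partition, labels swapped
seg3 = np.array([[0, 1, 0, 1]] * 4)         # a genuinely different partition
vi_same = variation_of_information(seg1, seg2)
vi_diff = variation_of_information(seg1, seg3)
```

    Being invariant to label permutation and forming a true metric on partitions, measures of this kind support exactly the comparisons the paper calls for: relative ranking of two algorithms, and absolute scoring against ground truth.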

  17. Retina image–based optic disc segmentation

    Directory of Open Access Journals (Sweden)

    Ching-Lin Wang

    2016-05-01

    Full Text Available The change of optic disc can be used to diagnose many eye diseases, such as glaucoma, diabetic retinopathy and macular degeneration. Moreover, retinal blood vessel pattern is unique for human beings even for identical twins. It is a highly stable pattern in biometric identification. Since optic disc is the beginning of the optic nerve and main blood vessels in retina, it can be used as a reference point of identification. Therefore, optic disc segmentation is an important technique for developing a human identity recognition system and eye disease diagnostic system. This article hence presents an optic disc segmentation method to extract the optic disc from a retina image. The experimental results show that the optic disc segmentation method can give impressive results in segmenting the optic disc from a retina image.

  18. Electromechanically cooled germanium radiation detector system

    International Nuclear Information System (INIS)

    Lavietes, Anthony D.; Joseph Mauger, G.; Anderson, Eric H.

    1999-01-01

    We have successfully developed and fielded an electromechanically cooled germanium radiation detector (EMC-HPGe) at Lawrence Livermore National Laboratory (LLNL). This detector system was designed to provide optimum energy resolution, long lifetime, and extremely reliable operation for unattended and portable applications. For most analytical applications, high purity germanium (HPGe) detectors are the standard detectors of choice, providing an unsurpassed combination of high energy resolution performance and exceptional detection efficiency. Logistical difficulties associated with providing the required liquid nitrogen (LN) for cooling are the primary reason that these systems are found mainly in laboratories. The EMC-HPGe detector system described in this paper successfully provides HPGe detector performance in a portable instrument that allows for isotopic analysis in the field. It incorporates a unique active vibration control system that allows the use of a Sunpower Stirling cycle cryocooler unit without significant spectral degradation from microphonics. All standard isotopic analysis codes, including MGA and MGA++, GAMANL, GRPANL and MGAU, typically used with HPGe detectors can be used with this system with excellent results. Several national and international Safeguards organisations including the International Atomic Energy Agency (IAEA) and U.S. Department of Energy (DOE) have expressed interest in this system. The detector was combined with custom software and demonstrated as a rapid Field Radiometric Identification System (FRIS) for the U.S. Customs Service. The European Communities' Safeguards Directorate (EURATOM) is field-testing the first Safeguards prototype in their applications. The EMC-HPGe detector system design, recent applications, and results will be highlighted.

  19. Prognostic validation of a 17-segment score derived from a 20-segment score for myocardial perfusion SPECT interpretation.

    Science.gov (United States)

    Berman, Daniel S; Abidov, Aiden; Kang, Xingping; Hayes, Sean W; Friedman, John D; Sciammarella, Maria G; Cohen, Ishac; Gerlach, James; Waechter, Parker B; Germano, Guido; Hachamovitch, Rory

    2004-01-01

    Recently, a 17-segment model of the left ventricle has been recommended as an optimally weighted approach for interpreting myocardial perfusion single photon emission computed tomography (SPECT). Methods to convert databases from previous 20- to new 17-segment data and criteria for abnormality for the 17-segment scores are needed. Initially, for derivation of the conversion algorithm, 65 patients were studied (algorithm population) (pilot group, n = 28; validation group, n = 37). Three conversion algorithms were derived: algorithm 1, which used mid, distal, and apical scores; algorithm 2, which used distal and apical scores alone; and algorithm 3, which used maximal scores of the distal septal, lateral, and apical segments in the 20-segment model for 3 corresponding segments of the 17-segment model. The prognosis population comprised 16,020 consecutive patients (mean age, 65 +/- 12 years; 41% women) who had exercise or vasodilator stress technetium 99m sestamibi myocardial perfusion SPECT and were followed up for 2.1 +/- 0.8 years. In this population, 17-segment scores were derived from 20-segment scores by use of algorithm 2, which demonstrated the best agreement with expert 17-segment reading in the algorithm population. The prognostic value of the 20- and 17-segment scores was compared by converting the respective summed scores into percent myocardium abnormal. Conversion algorithm 2 was found to be highly concordant with expert visual analysis by the 17-segment model (r = 0.982; kappa = 0.866) in the algorithm population. In the prognosis population, 456 cardiac deaths occurred during follow-up. When the conversion algorithm was applied, extent and severity of perfusion defects were nearly identical by 20- and derived 17-segment scores. The receiver operating characteristic curve areas by 20- and 17-segment perfusion scores were identical for predicting cardiac death (both 0.77 +/- 0.02, P = not significant). The optimal prognostic cutoff value for either 20
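
    The comparison of 20- and 17-segment summed scores hinges on converting each score to "percent myocardium abnormal". The usual normalisation in this literature divides the summed score by the maximum possible score, 4 points per segment; the score values below are illustrative.

```python
# Summed perfusion score expressed as percent myocardium abnormal,
# using the standard 4-points-per-segment normalisation.
def percent_myocardium_abnormal(summed_score, n_segments):
    """100 * score / (4 * segments): score as a share of the maximum."""
    return 100.0 * summed_score / (4 * n_segments)

pct_20 = percent_myocardium_abnormal(summed_score=12, n_segments=20)
pct_17 = percent_myocardium_abnormal(summed_score=12, n_segments=17)
```

    Note that the same raw score maps to different percentages under the two models; the near-identical extent and severity reported in the study arise because the conversion algorithm also reduces the raw 17-segment score relative to the 20-segment one.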

  20. The effectiveness of a high output/short duration radiofrequency current application technique in segmental pulmonary vein isolation for atrial fibrillation

    DEFF Research Database (Denmark)

    Nilsson, Brian; Chen, Xu; Pehrson, Steen

    2006-01-01

AIMS: Segmental pulmonary vein (PV) isolation by radiofrequency (RF) catheter ablation has become a curative therapy for atrial fibrillation (AF). However, the long procedure time limits the wide application of this procedure. The aim of the current study was to compare a novel ablation technique ... groups. In the conventional group (Group 1, 45 patients), the power output was limited to 30 W with a target temperature of 50 degrees C and an RF preset duration of 120 s. In the novel group (Group 2, 45 patients), the maximum power output was preset to 45 W, with a target temperature of 55 degrees C and a duration of 20 s. In Group 2, a significant reduction in the PV isolation time (127+/-57 vs. 94+/-33 min, P

  1. High power laser therapy treatment compared to simple segmental physical rehabilitation in whiplash injuries (1° and 2° grade of the Quebec Task Force classification) involving muscles and ligaments.

    Science.gov (United States)

    Conforti, Maria; Fachinetti, Giorgio Paolo

    2013-04-01

Whiplash is a frequent post-traumatic pathology caused by overstretching of muscle, tendon and capsular elements. The authors conducted a short-term prospective randomised study to test the effectiveness of multi-wave High Power Laser Therapy (HPLT) versus conventional simple segmental physical rehabilitation (PT) as included in the Italian tariff nomenclature of physiotherapy services. Study design: prospective randomised study (Level II). The authors identified 135 homogeneous patients with whiplash grade 1-2 of the Quebec Task Force classification (QTFC). INAIL, the Italian National Workers Insurance, based in Milan, was a reliable source for identifying patients. All patients with whiplash injuries grade 1 or 2 QTFC were eligible for the study, from April 28, 2010 to September 30, 2010. Patients were referred to a coordinator (C.M.) who applied the inclusion and exclusion criteria. Patients who agreed to participate were randomly assigned to one of the two treatment groups. Dates for the initial treatment session were arranged, including cervical spine X-ray and assessment. Each patient gave informed consent for participation and agreed to adopt only the study treatment for 6 weeks. Group A (84 patients) was treated with High Power Laser Therapy (HPLT); Group B (51 patients) received conventional simple segmental physical rehabilitation (PT). During the treatment period, no other electro-medical therapy, analgesics or anti-inflammatory drugs were allowed. All patients were assessed at baseline (T0) and at the end of the treatment period (T1) using a Visual Analogue Scale (VAS); the date of return to work (T2) was registered afterwards. There was a reduction in VAS pain scores at T1: Group A (VAS = 20) versus Group B (VAS = 34.8) (p = 0.0048). Laser treatment allowed quicker recovery and return to work (T2): 48 days in Group A against 66 days in Group B (p = 0.0005). The results suggest that High Power Laser Therapy is an effective treatment in patients with whiplash injury

  2. Status of the segment interconnect, cable segment ancillary logic, and the cable segment hybrid driver projects

    International Nuclear Information System (INIS)

    Swoboda, C.; Barsotti, E.; Chappa, S.; Downing, R.; Goeransson, G.; Lensy, D.; Moore, G.; Rotolo, C.; Urish, J.

    1985-01-01

    The FASTBUS Segment Interconnect (SI) provides a communication path between two otherwise independent, asynchronous bus segments. In particular, the Segment Interconnect links a backplane crate segment to a cable segment. All standard FASTBUS address and data transactions can be passed through the SI or any number of SIs and segments in a path. Thus systems of arbitrary connection complexity can be formed, allowing simultaneous independent processing, yet still permitting devices associated with one segment to be accessed from others. The model S1 Segment Interconnect and the Cable Segment Ancillary Logic covered in this report comply with all the mandatory features stated in the FASTBUS specification document DOE/ER-0189. A block diagram of the SI is shown

  3. The melt rheological behavior of AB, ABA, BAB, and (AB)n block copolymers with monodisperse aramide segments

    NARCIS (Netherlands)

    Araichimani, A.; Dullaert, Konraad; Gaymans, R.J.

    2009-01-01

    The melt rheological behavior of segmented block copolymers with high melting diamide (A) hard segments (HS) and polyether (B) soft segments was studied. The block copolymers can be classified as B (monoblock), AB (diblock), ABA (triblock, diamide end segment), BAB (triblock, diamide mid-segment)

  4. Osmotic and Heat Stress Effects on Segmentation.

    Directory of Open Access Journals (Sweden)

    Julian Weiss

    Full Text Available During vertebrate embryonic development, early skin, muscle, and bone progenitor populations organize into segments known as somites. Defects in this conserved process of segmentation lead to skeletal and muscular deformities, such as congenital scoliosis, a curvature of the spine caused by vertebral defects. Environmental stresses such as hypoxia or heat shock produce segmentation defects, and significantly increase the penetrance and severity of vertebral defects in genetically susceptible individuals. Here we show that a brief exposure to a high osmolarity solution causes reproducible segmentation defects in developing zebrafish (Danio rerio embryos. Both osmotic shock and heat shock produce border defects in a dose-dependent manner, with an increase in both frequency and severity of defects. We also show that osmotic treatment has a delayed effect on somite development, similar to that observed in heat shocked embryos. Our results establish osmotic shock as an alternate experimental model for stress, affecting segmentation in a manner comparable to other known environmental stressors. The similar effects of these two distinct environmental stressors support a model in which a variety of cellular stresses act through a related response pathway that leads to disturbances in the segmentation process.

  5. Scintillation counter, segmented shield

    International Nuclear Information System (INIS)

    Olson, R.E.; Thumim, A.D.

    1975-01-01

A scintillation counter, particularly for counting gamma-ray photons, includes a massive lead radiation shield surrounding a sample-receiving zone. The shield is disassemblable into a plurality of segments to allow facile installation and removal of a photomultiplier tube assembly, the segments being so constructed as to prevent straight-line access of external radiation through the shield into radiation-responsive areas. Provisions are made for accurately aligning the photomultiplier tube with respect to one or more sample-transmitting bores extending through the shield to the sample-receiving zone. A sample elevator, used in transporting samples into the zone, is designed to provide a maximum gamma-receiving aspect to maximize the gamma-detecting efficiency. (U.S.)

  6. Head segmentation in vertebrates

    OpenAIRE

    Kuratani, Shigeru; Schilling, Thomas

    2008-01-01

    Classic theories of vertebrate head segmentation clearly exemplify the idealistic nature of comparative embryology prior to the 20th century. Comparative embryology aimed at recognizing the basic, primary structure that is shared by all vertebrates, either as an archetype or an ancestral developmental pattern. Modern evolutionary developmental (Evo-Devo) studies are also based on comparison, and therefore have a tendency to reduce complex embryonic anatomy into overly simplified patterns. Her...

  7. Video segmentation using keywords

    Science.gov (United States)

    Ton-That, Vinh; Vong, Chi-Tai; Nguyen-Dao, Xuan-Truong; Tran, Minh-Triet

    2018-04-01

At the DAVIS-2016 Challenge, many state-of-the-art video segmentation methods achieved promising results, but they still depend heavily on annotated frames to distinguish between background and foreground, and creating these frames accurately takes considerable time and effort. In this paper, we introduce a method to segment objects from video based on keywords given by the user. First, we use a real-time object detection system, YOLOv2, to identify regions in the first frame containing objects whose labels match the given keywords. Then, for each region identified in the previous step, we use the Pyramid Scene Parsing Network to assign each pixel as foreground or background. These frames can be used as input frames for the Object Flow algorithm to perform segmentation on the entire video. We conduct experiments on a subset of the DAVIS-2016 dataset at half its original size, which shows that our method can handle many popular classes in the PASCAL VOC 2012 dataset with acceptable accuracy, about 75.03%. We suggest wider testing, combining other methods, to improve this result in the future.

  8. CT-based manual segmentation and evaluation of paranasal sinuses.

    Science.gov (United States)

    Pirner, S; Tingelhoff, K; Wagner, I; Westphal, R; Rilk, M; Wahl, F M; Bootz, F; Eichhorn, Klaus W G

    2009-04-01

    Manual segmentation of computed tomography (CT) datasets was performed for robot-assisted endoscope movement during functional endoscopic sinus surgery (FESS). Segmented 3D models are needed for the robots' workspace definition. A total of 50 preselected CT datasets were each segmented in 150-200 coronal slices with 24 landmarks being set. Three different colors for segmentation represent diverse risk areas. Extension and volumetric measurements were performed. Three-dimensional reconstruction was generated after segmentation. Manual segmentation took 8-10 h for each CT dataset. The mean volumes were: right maxillary sinus 17.4 cm(3), left side 17.9 cm(3), right frontal sinus 4.2 cm(3), left side 4.0 cm(3), total frontal sinuses 7.9 cm(3), sphenoid sinus right side 5.3 cm(3), left side 5.5 cm(3), total sphenoid sinus volume 11.2 cm(3). Our manually segmented 3D-models present the patient's individual anatomy with a special focus on structures in danger according to the diverse colored risk areas. For safe robot assistance, the high-accuracy models represent an average of the population for anatomical variations, extension and volumetric measurements. They can be used as a database for automatic model-based segmentation. None of the segmentation methods so far described provide risk segmentation. The robot's maximum distance to the segmented border can be adjusted according to the differently colored areas.

  9. Market segmentation in behavioral perspective.

    OpenAIRE

    Wells, V.K.; Chang, S.W.; Oliveira-Castro, J.M.; Pallister, J.

    2010-01-01

    A segmentation approach is presented using both traditional demographic segmentation bases (age, social class/occupation, and working status) and a segmentation by benefits sought. The benefits sought in this case are utilitarian and informational reinforcement, variables developed from the Behavioral Perspective Model (BPM). Using data from 1,847 consumers and from a total of 76,682 individual purchases, brand choice and price and reinforcement responsiveness were assessed for each segment a...

  10. EXILL—a high-efficiency, high-resolution setup for γ-spectroscopy at an intense cold neutron beam facility

    Science.gov (United States)

    Jentschel, M.; Blanc, A.; de France, G.; Köster, U.; Leoni, S.; Mutti, P.; Simpson, G.; Soldner, T.; Ur, C.; Urban, W.; Ahmed, S.; Astier, A.; Augey, L.; Back, T.; Baczyk, P.; Bajoga, A.; Balabanski, D.; Belgya, T.; Benzoni, G.; Bernards, C.; Biswas, D. C.; Bocchi, G.; Bottoni, S.; Britton, R.; Bruyneel, B.; Burnett, J.; Cakirli, R. B.; Carroll, R.; Catford, W.; Cederwall, B.; Celikovic, I.; Cieplicka-Oryńczak, N.; Clement, E.; Cooper, N.; Crespi, F.; Csatlos, M.; Curien, D.; Czerwiński, M.; Danu, L. S.; Davies, A.; Didierjean, F.; Drouet, F.; Duchêne, G.; Ducoin, C.; Eberhardt, K.; Erturk, S.; Fraile, L. M.; Gottardo, A.; Grente, L.; Grocutt, L.; Guerrero, C.; Guinet, D.; Hartig, A.-L.; Henrich, C.; Ignatov, A.; Ilieva, S.; Ivanova, D.; John, B. V.; John, R.; Jolie, J.; Kisyov, S.; Krticka, M.; Konstantinopoulos, T.; Korgul, A.; Krasznahorkay, A.; Kröll, T.; Kurpeta, J.; Kuti, I.; Lalkovski, S.; Larijani, C.; Leguillon, R.; Lica, R.; Litaize, O.; Lozeva, R.; Magron, C.; Mancuso, C.; Ruiz Martinez, E.; Massarczyk, R.; Mazzocchi, C.; Melon, B.; Mengoni, D.; Michelagnoli, C.; Million, B.; Mokry, C.; Mukhopadhyay, S.; Mulholland, K.; Nannini, A.; Napoli, D. R.; Olaizola, B.; Orlandi, R.; Patel, Z.; Paziy, V.; Petrache, C.; Pfeiffer, M.; Pietralla, N.; Podolyak, Z.; Ramdhane, M.; Redon, N.; Regan, P.; Regis, J. M.; Regnier, D.; Oliver, R. J.; Rudigier, M.; Runke, J.; Rzaca-Urban, T.; Saed-Samii, N.; Salsac, M. D.; Scheck, M.; Schwengner, R.; Sengele, L.; Singh, P.; Smith, J.; Stezowski, O.; Szpak, B.; Thomas, T.; Thürauf, M.; Timar, J.; Tom, A.; Tomandl, I.; Tornyi, T.; Townsley, C.; Tuerler, A.; Valenta, S.; Vancraeyenest, A.; Vandone, V.; Vanhoy, J.; Vedia, V.; Warr, N.; Werner, V.; Wilmsen, D.; Wilson, E.; Zerrouki, T.; Zielinska, M.

    2017-11-01

In the EXILL campaign a highly efficient array of high-purity germanium (HPGe) detectors was operated at the cold neutron beam facility PF1B of the Institut Laue-Langevin (ILL) to carry out nuclear structure studies, via measurements of γ-rays following neutron-induced capture and fission reactions. The setup consisted of a collimation system producing a pencil beam with a thermal-capture-equivalent flux of about 10^8 n s^-1 cm^-2 at the target position and negligible neutron halo. The target was surrounded by an array of eight to ten anti-Compton shielded EXOGAM Clover detectors, four to six anti-Compton shielded large coaxial GASP detectors and two standard Clover detectors. For part of the campaign the array was combined with 16 LaBr3:(Ce) detectors from the FATIMA collaboration. The detectors were arranged in an array of rhombicuboctahedron geometry, providing the possibility to carry out very precise angular correlation and directional-polarization correlation measurements. The triggerless acquisition system allowed a signal collection rate of up to 6 × 10^5 Hz. The data allowed multi-fold coincidences to be set to obtain decay schemes and, in combination with the FATIMA array of LaBr3:(Ce) detectors, to analyze half-lives of excited levels in the pico- to microsecond range. Precise energy and efficiency calibrations of EXILL were performed using standard calibration sources of 133Ba, 60Co and 152Eu as well as data from the reactions 27Al(n,γ)28Al and 35Cl(n,γ)36Cl in the energy range from 30 keV up to 10 MeV.

  11. Segmenting the Adult Education Market.

    Science.gov (United States)

    Aurand, Tim

    1994-01-01

    Describes market segmentation and how the principles of segmentation can be applied to the adult education market. Indicates that applying segmentation techniques to adult education programs results in programs that are educationally and financially satisfying and serve an appropriate population. (JOW)

  12. Market Segmentation for Information Services.

    Science.gov (United States)

    Halperin, Michael

    1981-01-01

    Discusses the advantages and limitations of market segmentation as strategy for the marketing of information services made available by nonprofit organizations, particularly libraries. Market segmentation is defined, a market grid for libraries is described, and the segmentation of information services is outlined. A 16-item reference list is…

  13. Albedo estimation for scene segmentation

    Energy Technology Data Exchange (ETDEWEB)

    Lee, C H; Rosenfeld, A

    1983-03-01

    Standard methods of image segmentation do not take into account the three-dimensional nature of the underlying scene. For example, histogram-based segmentation tacitly assumes that the image intensity is piecewise constant, and this is not true when the scene contains curved surfaces. This paper introduces a method of taking 3d information into account in the segmentation process. The image intensities are adjusted to compensate for the effects of estimated surface orientation; the adjusted intensities can be regarded as reflectivity estimates. When histogram-based segmentation is applied to these new values, the image is segmented into parts corresponding to surfaces of constant reflectivity in the scene. 7 references.
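The intensity adjustment described above can be sketched under a simple Lambertian assumption (observed intensity = reflectivity × cosine of the angle between surface normal and illumination direction), which is a common simplification and not necessarily the paper's exact model; all names below are illustrative:

```python
import math

def reflectivity_estimates(intensities, surface_angles_deg, eps=1e-6):
    """Adjust image intensities for estimated surface orientation.
    Under a Lambertian model, I = rho * cos(theta), so the reflectivity
    estimate is rho = I / cos(theta), clamped away from grazing angles."""
    out = []
    for i, a in zip(intensities, surface_angles_deg):
        c = max(math.cos(math.radians(a)), eps)
        out.append(i / c)
    return out

# A curved surface of constant reflectivity 0.8: the raw intensities
# vary with orientation, but the adjusted values are constant again,
# so histogram-based segmentation on them recovers one region.
raw = [0.80, 0.69, 0.40]        # observed at 0, 30 and 60 degrees
angles = [0.0, 30.0, 60.0]
print([round(r, 2) for r in reflectivity_estimates(raw, angles)])
```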

  14. FRAMEWORK FOR COMPARING SEGMENTATION ALGORITHMS

    Directory of Open Access Journals (Sweden)

    G. Sithole

    2015-05-01

Full Text Available The notion of a ‘best’ segmentation does not exist. A segmentation algorithm is chosen based on the features it yields, the properties of the segments (point sets) it generates, and the complexity of its algorithm. The segmentation is then assessed based on a variety of metrics such as homogeneity, heterogeneity, fragmentation, etc. Even after an algorithm is chosen its performance is still uncertain, because the landscape/scenarios represented in a point cloud have a strong influence on the eventual segmentation. Thus selecting an appropriate segmentation algorithm is a process of trial and error. Automating the selection of segmentation algorithms and their parameters first requires methods to evaluate segmentations. Three common approaches for evaluating segmentation algorithms are ‘goodness methods’, ‘discrepancy methods’ and ‘benchmarks’, with benchmarks considered the most comprehensive method of evaluation. In this paper shortcomings in current benchmark methods are identified, and a framework is proposed that permits both a visual and numerical evaluation of segmentations for different algorithms, algorithm parameters and evaluation metrics. The concept of the framework is demonstrated on a real point cloud. Current results are promising and suggest that it can be used to predict the performance of segmentation algorithms.
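As a toy instance of a ‘discrepancy method’, the sketch below scores each reference segment by the best Jaccard (IoU) overlap that any candidate segment achieves; the paper's framework is far more general (arbitrary metrics plus visual evaluation), and all names here are illustrative:

```python
def discrepancy_table(reference, candidate):
    """For each segment label in the reference labelling, report the
    best Jaccard (intersection-over-union) overlap achieved by any
    segment in the candidate labelling. Both inputs are per-point
    label lists of equal length."""
    ref_sets, cand_sets = {}, {}
    for idx, lab in enumerate(reference):
        ref_sets.setdefault(lab, set()).add(idx)
    for idx, lab in enumerate(candidate):
        cand_sets.setdefault(lab, set()).add(idx)
    return {lab: max(len(pts & c) / len(pts | c)
                     for c in cand_sets.values())
            for lab, pts in ref_sets.items()}

ref  = [0, 0, 0, 1, 1, 1, 2, 2]
cand = [5, 5, 5, 5, 7, 7, 9, 9]   # first reference segment over-merged
print(discrepancy_table(ref, cand))
```

Low per-segment scores flag under- or over-segmentation; aggregating such tables across algorithms and parameter settings is exactly the kind of numerical comparison the proposed framework supports.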

  15. Investigation of the pulse shape analysis for the position sensitive γ-ray spectrometer AGATA

    Energy Technology Data Exchange (ETDEWEB)

    Lewandowski, Lars; Birkenbach, Benedikt; Reiter, Peter [Institut fuer Kernphysik Koeln (Germany); Collaboration: AGATA-Collaboration

    2015-07-01

The next generation of γ-ray spectrometers like AGATA will provide high-quality γ-ray spectra by the new Gamma-Ray Tracking (GRT) technique. Position-sensitive HPGe detectors will allow for precise Doppler correction and small line broadening for spectroscopy at relativistic energies. GRT is based on the interaction positions of the γ-rays within the volume of the highly segmented germanium detectors, provided by Pulse Shape Analysis (PSA) methods. The proof of principle of GRT was already demonstrated with great success; however, systematic deviations from expected results occur. The parameterization of the following detector properties and their impact on PSA were thoroughly investigated and optimized: electron and hole mobility, crystal axis orientation, space charge distributions, crystal impurities, response functions of preamplifiers and digitizers, linear and differential crosstalk, time alignment of pulses and the distance metric. Results of an improved PSA performance are presented.

  16. Fast and effective determination of strontium-90 in high volumes water samples

    International Nuclear Information System (INIS)

    Basarabova, B.; Dulanska, S.

    2014-01-01

A simple and fast method was developed for the determination of 90Sr in high volumes of water samples from the vicinity of nuclear power facilities. Samples were taken from the environment near the Nuclear Power Plants in Jaslovske Bohunice and Mochovce in Slovakia. 90Sr was determined by solid phase extraction using the commercial sorbent Analig(R) Sr-01 from IBC Advanced Technologies, Inc. Determination of 90Sr was performed with dilute HNO3 solution (1.5-2 M) and was also tested in basic medium with NaOH. EDTA at pH 8-9 was used as the eluent for 90Sr. To achieve fast determination, automation was applied, which brings a significant reduction of the separation time: concentration of the water samples by evaporation was not necessary, and separation was performed immediately after filtration of the analyzed samples. The aim of this study was the development of a less expensive, time-saving and energy-saving method for the determination of 90Sr in comparison with conventional methods. The separation time in fast-flow mode for a 10 dm3 water sample was 3.5 hours (flow rate approximately 3.2 dm3 per hour). The radiochemical strontium yield was traced using the radionuclide 85Sr; samples were measured with an HPGe (high-purity germanium) detector at the energy Eγ = 514 keV. Using Analig(R) Sr-01, yields in the range 72-96% were achieved. Separation based on solid phase extraction with Analig(R) Sr-01, combined with automation, offers a new, fast and effective method for the determination of 90Sr in water matrices. After ingrowth of yttrium, samples were measured with a Packard Tricarb 2900 TR liquid scintillation spectrometer with Quanta Smart software. (authors)
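The yield tracing with 85Sr amounts to a simple activity ratio: the tracer activity recovered after separation (from the 514 keV peak count rate, corrected for detector efficiency and gamma emission probability) divided by the activity added before separation. A hedged sketch with illustrative numbers, not values from the paper:

```python
def chemical_yield(measured_cps, detector_eff, gamma_intensity, added_bq):
    """Chemical recovery traced with an 85Sr spike: fraction of the
    added tracer activity recovered after separation, computed from
    the 514 keV peak count rate on the HPGe detector. A simple
    point-source efficiency is assumed; all numbers are illustrative."""
    recovered_bq = measured_cps / (detector_eff * gamma_intensity)
    return recovered_bq / added_bq

# 85Sr emits its 514 keV gamma in roughly 96% of decays:
y = chemical_yield(measured_cps=0.83, detector_eff=0.01,
                   gamma_intensity=0.96, added_bq=100.0)
print(round(y, 2))  # a yield of ~0.86, inside the reported 72-96% range
```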

  17. Adapting Mask-RCNN for Automatic Nucleus Segmentation

    OpenAIRE

    Johnson, Jeremiah W.

    2018-01-01

    Automatic segmentation of microscopy images is an important task in medical image processing and analysis. Nucleus detection is an important example of this task. Mask-RCNN is a recently proposed state-of-the-art algorithm for object detection, object localization, and object instance segmentation of natural images. In this paper we demonstrate that Mask-RCNN can be used to perform highly effective and efficient automatic segmentations of a wide range of microscopy images of cell nuclei, for ...

  18. Optimally segmented magnetic structures

    DEFF Research Database (Denmark)

    Insinga, Andrea Roberto; Bahl, Christian; Bjørk, Rasmus

We present a semi-analytical algorithm for magnet design problems, which calculates the optimal way to subdivide a given design region into uniformly magnetized segments. The availability of powerful rare-earth magnetic materials such as Nd-Fe-B has broadened the range of applications of permanent magnets[1][2]. However, the powerful rare-earth magnets are generally expensive, so both the scientific and industrial communities have devoted a lot of effort into developing suitable design methods. Even so, many magnet optimization algorithms are either based on heuristic approaches[3... ... is not available. We will illustrate the results for magnet design problems from different areas, such as electric motors/generators (as the example in the picture), beam focusing for particle accelerators and magnetic refrigeration devices.

  19. Open-source software platform for medical image segmentation applications

    Science.gov (United States)

    Namías, R.; D'Amato, J. P.; del Fresno, M.

    2017-11-01

Segmenting 2D and 3D images is a crucial and challenging problem in medical image analysis. Although several image segmentation algorithms have been proposed for different applications, no universal method currently exists. Moreover, their use is usually limited when detection of complex and multiple adjacent objects of interest is needed. In addition, the continually increasing volumes of medical imaging scans require more efficient segmentation software design and highly usable applications. In this context, we present an extension of our previous segmentation framework which allows the combination of existing explicit deformable models in an efficient and transparent way, handling different segmentation strategies simultaneously and interacting with a graphical user interface (GUI). We present the object-oriented design and the general architecture, which consists of two layers: the GUI at the top layer, and the processing core filters at the bottom layer. We apply the framework to different real-case medical image scenarios on publicly available datasets, including bladder and prostate segmentation from 2D MRI, and heart segmentation in 3D CT. Our experiments on these concrete problems show that this framework facilitates complex and multi-object segmentation goals while providing a fast-prototyping open-source segmentation tool.

  20. Generalized pixel profiling and comparative segmentation with application to arteriovenous malformation segmentation.

    Science.gov (United States)

    Babin, D; Pižurica, A; Bellens, R; De Bock, J; Shang, Y; Goossens, B; Vansteenkiste, E; Philips, W

    2012-07-01

Extraction of structural and geometric information from 3-D images of blood vessels is a well-known and widely addressed segmentation problem. The segmentation of cerebral blood vessels is of great importance in diagnostic and clinical applications, with a special application in diagnostics and surgery on arteriovenous malformations (AVM). However, techniques addressing the problem of segmenting the inner structure of AVMs are rare. In this work we present a novel method of pixel profiling with application to the segmentation of 3-D angiography AVM images. Our algorithm stands out in situations with low-resolution images and high variability of pixel intensity. Another advantage of our method is that the parameters are set automatically, requiring little manual user intervention. The results on phantoms and real data demonstrate its effectiveness and potential for fine delineation of the AVM structure. Copyright © 2012 Elsevier B.V. All rights reserved.

  1. Segmentation of the Infant Food Market

    OpenAIRE

    Hrůzová, Daniela

    2015-01-01

    The theoretical part covers general market segmentation, namely the marketing importance of differences among consumers, the essence of market segmentation, its main conditions and the process of segmentation, which consists of four consecutive phases - defining the market, determining important criteria, uncovering segments and developing segment profiles. The segmentation criteria, segmentation approaches, methods and techniques for the process of market segmentation are also described in t...

  2. Voluminal modelling for the characterization of wastes packages by gamma emission computed tomography

    International Nuclear Information System (INIS)

    Pettier, J.L.; Thierry, R.

    2001-01-01

The aim of this work is to model the measurement process used for multi-photon emission computed tomography of nuclear waste drums. Our model MEPHISTO (Multi-Energy PHoton Imagery through Segmented TOmography) takes into account all phenomena influencing the emergent gamma flux and the high-resolution spectrometric measurements made with an HPGe detector through a collimator aperture. These phenomena are absorption and Compton scattering of gamma photons in the waste drum, geometrical blur, and the spatial and energetic response of the detector. The analysis of results shows better localization and quantification performance compared with a ray-driven method. It proves the importance of accurate modelling of collimated measurements to reduce noise and stabilize iterative image reconstructions. (authors)
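The kind of forward model MEPHISTO embodies can be illustrated, in a deliberately simplified 1-D form, by Beer-Lambert attenuation of each voxel's emission along the ray toward the detector; scattering, collimator geometry and detector response are omitted, and all names are illustrative:

```python
import math

def emergent_flux(activity, mu, voxel_len=1.0):
    """Attenuated emission along one ray through a voxelized drum.
    Each voxel i emits activity[i] photons toward the detector (at the
    end of the list); photons are attenuated by the material between
    voxel i and the drum surface, following the Beer-Lambert law."""
    flux = 0.0
    for i, a in enumerate(activity):
        # attenuation path: voxels i+1 .. end, closer to the detector
        path = sum(mu[i + 1:]) * voxel_len
        flux += a * math.exp(-path)
    return flux

# The same source contributes less from deep inside the drum than
# from near the surface, which is why an accurate attenuation model
# matters for localization and quantification:
mu = [0.2, 0.2, 0.2, 0.2]                 # attenuation per voxel
print(emergent_flux([1.0, 0, 0, 0], mu))  # deep voxel: exp(-0.6)
print(emergent_flux([0, 0, 0, 1.0], mu))  # surface voxel: 1.0
```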

  3. Pulse shape analysis for γ-ray tracking. Part I: Pulse shape simulation with JASS

    International Nuclear Information System (INIS)

    Schlarb, M.; Gernhaeuser, R.; Klupp, S.; Kruecken, R.

    2011-01-01

Next-generation γ-ray spectrometers based on highly segmented HPGe detectors use the recent technique of γ-ray tracking to significantly improve efficiency and Doppler-correction capabilities. Precise reconstruction of the individual interaction locations within the active material is possible through pulse shape analysis (PSA), which in turn demands accurate knowledge of the detector response. We developed JASS, a Java-based simulation software package, to generate pulse shapes for the AGATA detectors from physics constraints and basic material parameters. To verify the simulation, experimental data from a coincidence scan with known interaction locations were used. The achieved position resolution, on the order of a few millimeters, is within the requirements of the γ-ray tracking array. (orig.)

  4. Calibration of lung counter using a CT model of Torso phantom and Monte Carlo method

    International Nuclear Information System (INIS)

    Zhang Binquan; Ma Jizeng; Yang Duanjie; Liu Liye; Cheng Jianping

    2006-01-01

A tomography image of a Torso phantom was obtained from a CT scan. The Torso phantom represents the trunk of an adult man 170 cm tall and weighing 65 kg. After these images were segmented, cropped, and resized, a 3-dimensional voxel phantom was created. The voxel phantom includes more than 2 million voxels, each 2.73 mm x 2.73 mm x 3 mm in size. This model can be used for the calibration of a lung counter with the Monte Carlo method. On the assumption that radioactive material was homogeneously distributed throughout the lung, counting efficiencies of an HPGe detector in different positions were calculated as the adipose mass fraction (AMF) of the soft tissue in the chest was varied. The results showed that the counting efficiencies of the lung counter changed by up to 67% for the 17.5 keV γ ray and 20% for the 25 keV γ ray when the AMF changed from 0 to 40%. (authors)

  5. Innovative visualization and segmentation approaches for telemedicine

    Science.gov (United States)

    Nguyen, D.; Roehrig, Hans; Borders, Marisa H.; Fitzpatrick, Kimberly A.; Roveda, Janet

    2014-09-01

In health care applications, we obtain, manage, store and communicate high-quality, large-volume image data through integrated devices. In this paper we propose several promising methods that can assist physicians in image data processing and communication. We design a new semi-automated segmentation approach for radiological images, such as CT and MRI, to clearly identify the areas of interest. This approach combines the advantages of both region-based and boundary-based methods. It comprises three key steps: coarse segmentation using a fuzzy affinity and homogeneity operator, image division and reclassification using the Voronoi diagram, and refinement of boundary lines using the level set model.

  6. Laser Truss Sensor for Segmented Telescope Phasing

    Science.gov (United States)

    Liu, Duncan T.; Lay, Oliver P.; Azizi, Alireza; Erlig, Herman; Dorsky, Leonard I.; Asbury, Cheryl G.; Zhao, Feng

    2011-01-01

    A paper describes the laser truss sensor (LTS) for detecting piston motion between two adjacent telescope segment edges. LTS is formed by two point-to-point laser metrology gauges in a crossed geometry. A high-resolution (distribution can be optimized using the range-gated metrology (RGM) approach.

  7. An Improved FCM Medical Image Segmentation Algorithm Based on MMTD

    Directory of Open Access Journals (Sweden)

    Ningning Zhou

    2014-01-01

Full Text Available Image segmentation plays an important role in medical image processing. Fuzzy c-means (FCM) is one of the popular clustering algorithms for medical image segmentation, but FCM is highly vulnerable to noise because it does not consider spatial information. This paper introduces the medium mathematics system, which is employed to process fuzzy information for image segmentation. It establishes a medium similarity measure based on the measure of medium truth degree (MMTD) and uses the correlation of a pixel and its neighbors to define the medium membership function. An improved FCM medical image segmentation algorithm based on MMTD, which takes some spatial features into account, is proposed in this paper. The experimental results show that the proposed algorithm is more robust to noise than the standard FCM, with more certainty and less fuzziness. This will lead to practical and effective applications in medical image segmentation.
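For reference, the standard FCM baseline that the paper improves on can be sketched in a few lines for 1-D data; the MMTD-based membership itself is not reproduced here, and the spread-out center initialization is an illustrative choice:

```python
def fcm(data, c=2, m=2.0, iters=50):
    """Standard fuzzy c-means on 1-D data. Alternates the usual two
    updates that minimize sum_ik u_ik^m * (x_k - v_i)^2 subject to
    sum_i u_ik = 1. Centers start evenly spread over the data range
    (requires c >= 2)."""
    lo, hi = min(data), max(data)
    centers = [lo + (hi - lo) * i / (c - 1) for i in range(c)]
    for _ in range(iters):
        # membership update: u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1))
        u = []
        for x in data:
            d = [abs(x - v) + 1e-12 for v in centers]
            u.append([1.0 / sum((d[i] / d[j]) ** (2.0 / (m - 1.0))
                                for j in range(c)) for i in range(c)])
        # center update: v_i = sum_k u_ik^m x_k / sum_k u_ik^m
        centers = [sum(u[k][i] ** m * x for k, x in enumerate(data)) /
                   sum(u[k][i] ** m for k in range(len(data)))
                   for i in range(c)]
    return sorted(centers)

# Two clear intensity clusters; the centers land near 10 and 100:
print(fcm([9, 10, 11, 99, 100, 101]))
```

Because each pixel is treated independently of its neighbors, noisy pixels get noisy memberships, which is exactly the weakness the MMTD-based membership function is designed to address.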

  8. Natural color image segmentation using integrated mechanism

    Institute of Scientific and Technical Information of China (English)

    Jie Xu (徐杰); Pengfei Shi (施鹏飞)

    2003-01-01

    A new method for natural color image segmentation using an integrated mechanism is proposed in this paper. Edges are first detected in terms of high phase congruency in the gray-level image. K-means clustering is used to label long edge lines based on global color information, to roughly estimate the distribution of objects in the image, while short edges are merged based on their positions and local color differences to eliminate the negative effects caused by texture or other trivial features in the image. A region growing technique is employed to achieve the final segmentation results. The proposed method unifies edges, global and local color distributions, as well as spatial information to solve the natural image segmentation problem. The feasibility and effectiveness of this method have been demonstrated by various experiments.

  9. CERES: A new cerebellum lobule segmentation method.

    Science.gov (United States)

    Romero, Jose E; Coupé, Pierrick; Giraud, Rémi; Ta, Vinh-Thong; Fonov, Vladimir; Park, Min Tae M; Chakravarty, M Mallar; Voineskos, Aristotle N; Manjón, Jose V

    2017-02-15

    The human cerebellum is involved in language, motor tasks and cognitive processes such as attention and emotional processing. An automatic and accurate segmentation method is therefore highly desirable for measuring and understanding the cerebellum's role in normal and pathological brain development. In this work, we propose a patch-based multi-atlas segmentation tool called CERES (CEREbellum Segmentation) that is able to automatically parcellate the cerebellum lobules. The proposed method works with standard-resolution magnetic resonance T1-weighted images and uses the Optimized PatchMatch algorithm to speed up the patch-matching process. The proposed method was compared with recent state-of-the-art methods, showing competitive results in both accuracy (average DICE of 0.7729) and execution time (around 5 minutes). Copyright © 2016 Elsevier Inc. All rights reserved.

  10. SIDES - Segment Interconnect Diagnostic Expert System

    International Nuclear Information System (INIS)

    Booth, A.W.; Forster, R.; Gustafsson, L.; Ho, N.

    1989-01-01

    It is well known that the FASTBUS Segment Interconnect (SI) provides a communication path between two otherwise independent, asynchronous bus segments. The SI is probably the most important module in any FASTBUS data acquisition network, since its failure to function can cause whole segments of the network to be inaccessible and sometimes inoperable. This paper describes SIDES, an intelligent program designed to diagnose SIs both in situ, as they operate in a data acquisition network, and in the laboratory in an acceptance/repair environment. The paper discusses important issues such as knowledge acquisition: extracting knowledge from human experts and other knowledge sources. SIDES can benefit high energy physics experiments, where SI problems can be diagnosed and solved more quickly. Equipment pool technicians can also benefit from SIDES, first because it decreases the number of SIs erroneously turned in for repair, and second because SIDES acts as an intelligent assistant to the technician in the diagnosis and repair process.

  11. Fast globally optimal segmentation of cells in fluorescence microscopy images.

    Science.gov (United States)

    Bergeest, Jan-Philip; Rohr, Karl

    2011-01-01

    Accurate and efficient segmentation of cells in fluorescence microscopy images is of central importance for the quantification of protein expression in high-throughput screening applications. We propose a new approach for segmenting cell nuclei which is based on active contours and convex energy functionals. Compared to previous work, our approach determines the global solution. Thus, the approach does not suffer from local minima and the segmentation result does not depend on the initialization. We also suggest a numeric approach for efficiently computing the solution. The performance of our approach has been evaluated using fluorescence microscopy images of different cell types. We have also performed a quantitative comparison with previous segmentation approaches.
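
    A toy illustration of why convexity removes the dependence on initialization (this is only a two-region data term with fixed region means c1 and c2; the paper's functional also includes regularization, which this sketch omits):

```python
def segment_global(pixels, c1, c2):
    """For the data term E(u) = sum_i u_i*(I_i - c1)^2 + (1 - u_i)*(I_i - c2)^2,
    each u_i in [0, 1] appears linearly, so the global minimum is attained
    pointwise: assign each pixel to the closer of the two region means.
    No initialization is involved, hence no local minima."""
    return [1 if (p - c1) ** 2 < (p - c2) ** 2 else 0 for p in pixels]

# Bright nuclei (mean ~200) against a dark background (mean ~20)
mask = segment_global([10, 40, 200, 180, 30], c1=200, c2=20)
```

    The convex active-contour functionals in the paper keep this globality while also penalizing contour length, which a simple pointwise rule cannot do.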

  12. 256-pixel microcalorimeter array for high-resolution γ-ray spectroscopy of mixed-actinide materials

    Energy Technology Data Exchange (ETDEWEB)

    Winkler, R., E-mail: rwinkler@lanl.gov [Los Alamos National Laboratory, Los Alamos, NM (United States); Hoover, A.S.; Rabin, M.W. [Los Alamos National Laboratory, Los Alamos, NM (United States); Bennett, D.A.; Doriese, W.B.; Fowler, J.W.; Hays-Wehle, J.; Horansky, R.D.; Reintsema, C.D.; Schmidt, D.R.; Vale, L.R.; Ullom, J.N. [National Institute of Standards and Technology, Boulder, CO (United States)

    2015-01-11

    The application of cryogenic microcalorimeter detectors to γ-ray spectroscopy allows for measurements with unprecedented energy resolution. These detectors are ideally suited for γ-ray spectroscopy applications for which the measurement quality is limited by the spectral overlap of many closely spaced transitions using conventional detector technologies. The non-destructive analysis of mixed-isotope Pu materials is one such application where the precision can be potentially improved utilizing microcalorimeter detectors compared to current state-of-the-art high-purity Ge detectors (HPGe). The LANL-NIST γ-ray spectrometer, a 256-pixel microcalorimeter array based on transition-edge sensors (TESs), was recently commissioned and used to collect data on a variety of Pu isotopic standards to characterize the instrument performance. These measurements represent the first time the simultaneous readout of all 256 pixels for measurements of mixed-isotope Pu materials has been achieved. The LANL-NIST γ-ray spectrometer has demonstrated an average pixel resolution of 55 eV full-width-at-half-maximum at 100 keV, nearly an order of magnitude better than HPGe detectors. Some challenges of the analysis of many-channel ultra-high resolution data and the techniques used to produce quality spectra for isotopic analysis will be presented. The LANL-NIST γ-ray spectrometer has also demonstrated stable operation and obtained high resolution measurements at total array event rates beyond 1 kHz. For a total event rate of 1.25 kHz, approximately 5.6 cps/pixel, a 72.2 eV average FWHM for the 103 keV photopeak of {sup 153}Gd was achieved.

  13. Phasing multi-segment undulators

    International Nuclear Information System (INIS)

    Chavanne, J.; Elleaume, P.; Vaerenbergh, P. Van

    1996-01-01

    An important issue in the manufacture of multi-segment undulators as a source of synchrotron radiation or as a free-electron laser (FEL) is the phasing between successive segments. The state of the art is briefly reviewed, after which a novel pure permanent magnet phasing section that is passive and does not require any current is presented. The phasing section allows the introduction of a 6 mm longitudinal gap between each segment, resulting in complete mechanical independence and reduced magnetic interaction between segments. The tolerance of the longitudinal positioning of one segment with respect to the next is found to be 2.8 times lower than that of conventional phasing. The spectrum at all gaps and useful harmonics is almost unchanged when compared with a single-segment undulator of the same total length. (au) 3 refs

  14. The LOFT Ground Segment

    DEFF Research Database (Denmark)

    Bozzo, E.; Antonelli, A.; Argan, A.

    2014-01-01

    targets per orbit (~90 minutes), providing roughly ~80 GB of proprietary data per day (the proprietary period will be 12 months). The WFM continuously monitors about 1/3 of the sky at a time and provides data for about ~100 sources a day, resulting in a total of ~20 GB of additional telemetry. The LOFT...... Burst alert System additionally identifies on-board bright impulsive events (e.g., Gamma-ray Bursts, GRBs) and broadcasts the corresponding position and trigger time to the ground using a dedicated system of ~15 VHF receivers. All WFM data are planned to be made public immediately. In this contribution...... we summarize the planned organization of the LOFT ground segment (GS), as established in the mission Yellow Book 1 . We describe the expected GS contributions from ESA and the LOFT consortium. A review is provided of the planned LOFT data products and the details of the data flow, archiving...

  15. Segmented heat exchanger

    Science.gov (United States)

    Baldwin, Darryl Dean; Willi, Martin Leo; Fiveland, Scott Byron; Timmons, Kristine Ann

    2010-12-14

    A segmented heat exchanger system for transferring heat energy from an exhaust fluid to a working fluid. The heat exchanger system may include a first heat exchanger for receiving incoming working fluid and the exhaust fluid. The working fluid and exhaust fluid may travel through at least a portion of the first heat exchanger in a parallel flow configuration. In addition, the heat exchanger system may include a second heat exchanger for receiving working fluid from the first heat exchanger and exhaust fluid from a third heat exchanger. The working fluid and exhaust fluid may travel through at least a portion of the second heat exchanger in a counter flow configuration. Furthermore, the heat exchanger system may include a third heat exchanger for receiving working fluid from the second heat exchanger and exhaust fluid from the first heat exchanger. The working fluid and exhaust fluid may travel through at least a portion of the third heat exchanger in a parallel flow configuration.

  16. International EUREKA: Market Segment

    International Nuclear Information System (INIS)

    1982-03-01

    The purpose of the Market Segment of the EUREKA model is to simultaneously project uranium market prices, uranium supply and purchasing activities. The regional demands are extrinsic. However, annual forward contracting activities to meet these demands as well as inventory requirements are calculated. The annual price forecast is based on relatively short term, forward balances between available supply and desired purchases. The forecasted prices and extrapolated price trends determine decisions related to exploration and development, new production operations, and the operation of existing capacity. Purchasing and inventory requirements are also adjusted based on anticipated prices. The calculation proceeds one year at a time. Conditions calculated at the end of one year become the starting conditions for the calculation in the subsequent year
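
    The year-by-year feedback described above can be sketched roughly as follows (the linear price response, the investment rule and all coefficients are invented for illustration; the EUREKA model's actual equations are not reproduced here):

```python
def simulate_market(years, demand, capacity, price, elasticity=0.5,
                    invest_threshold=30.0, invest_step=5.0):
    """Toy year-by-year loop in the spirit of the record above: each
    year's price responds to the short-term supply/demand balance, and
    anticipated high prices trigger new production capacity; the state
    at the end of one year seeds the calculation for the next."""
    history = []
    for _ in range(years):
        balance = (demand - capacity) / capacity   # shortage > 0, glut < 0
        price = max(1.0, price * (1.0 + elasticity * balance))
        if price > invest_threshold:               # anticipated high prices
            capacity += invest_step                # bring on new production
        history.append((round(price, 2), capacity))
    return history

history = simulate_market(5, demand=100.0, capacity=80.0, price=25.0)
```

    With a persistent shortage, prices climb until they cross the investment threshold, after which capacity additions begin to close the gap, mirroring the exploration/production decisions the text describes.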

  17. Probabilistic retinal vessel segmentation

    Science.gov (United States)

    Wu, Chang-Hua; Agam, Gady

    2007-03-01

    Optic fundus assessment is widely used for diagnosing vascular and non-vascular pathology. Inspection of the retinal vasculature may reveal hypertension, diabetes, arteriosclerosis, cardiovascular disease and stroke. Due to various imaging conditions retinal images may be degraded. Consequently, the enhancement of such images and vessels in them is an important task with direct clinical applications. We propose a novel technique for vessel enhancement in retinal images that is capable of enhancing vessel junctions in addition to linear vessel segments. This is an extension of vessel filters we have previously developed for vessel enhancement in thoracic CT scans. The proposed approach is based on probabilistic models which can discern vessels and junctions. Evaluation shows the proposed filter is better than several known techniques and is comparable to the state of the art when evaluated on a standard dataset. A ridge-based vessel tracking process is applied on the enhanced image to demonstrate the effectiveness of the enhancement filter.

  18. Segmented rail linear induction motor

    Science.gov (United States)

    Cowan, Jr., Maynard; Marder, Barry M.

    1996-01-01

    A segmented rail linear induction motor has a segmented rail consisting of a plurality of nonferrous electrically conductive segments aligned along a guideway. The motor further includes a carriage including at least one pair of opposed coils fastened to the carriage for moving the carriage. A power source applies an electric current to the coils to induce currents in the conductive surfaces to repel the coils from adjacent edges of the conductive surfaces.

  19. Segmentation-Driven Tomographic Reconstruction

    DEFF Research Database (Denmark)

    Kongskov, Rasmus Dalgas

    such that the segmentation subsequently can be carried out by use of a simple segmentation method, for instance just a thresholding method. We tested the advantages of going from a two-stage reconstruction method to a one stage segmentation-driven reconstruction method for the phase contrast tomography reconstruction......The tomographic reconstruction problem is concerned with creating a model of the interior of an object from some measured data, typically projections of the object. After reconstructing an object it is often desired to segment it, either automatically or manually. For computed tomography (CT...

  20. Automated medical image segmentation techniques

    Directory of Open Access Journals (Sweden)

    Sharma Neeraj

    2010-01-01

    Full Text Available Accurate segmentation of medical images is a key step in contouring during radiotherapy planning. Computed tomography (CT) and magnetic resonance (MR) imaging are the most widely used radiographic techniques in diagnosis, clinical studies and treatment planning. This review provides details of automated segmentation methods, discussed specifically in the context of CT and MR images. The motive is to discuss the problems encountered in segmentation of CT and MR images, and the relative merits and limitations of the methods currently available for segmentation of medical images.

  1. ADVANCED CLUSTER BASED IMAGE SEGMENTATION

    Directory of Open Access Journals (Sweden)

    D. Kesavaraja

    2011-11-01

    Full Text Available This paper presents efficient and portable implementations of a useful image segmentation technique that makes use of a faster variant of the conventional connected-components algorithm, which we call parallel components. Today, many doctors need image segmentation as a service for various purposes, and they expect the system to run fast and securely. Conventional image segmentation algorithms, despite several ongoing research efforts, are often not fast enough. We therefore propose a cluster computing environment for parallel image segmentation to provide faster results. This paper describes a real-time implementation of distributed image segmentation on a cluster of nodes. We demonstrate the effectiveness and feasibility of our method on a set of medical CT scan images. Our general framework is a single-address-space, distributed-memory programming model. We use efficient techniques for distributing and coalescing data as well as efficient combinations of task and data parallelism. The image segmentation algorithm makes use of an efficient cluster process that applies a novel approach to parallel merging. Our experimental results are consistent with the theoretical analysis, and the method provides faster execution times for segmentation than the conventional method. Our test data are CT scan images from a medical database. More efficient implementations of image segmentation will likely result in even faster execution times.
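
    As a sequential baseline for the connected-components step (the paper's parallel, cluster-based merging is not reproduced here), a standard two-pass union-find labelling of a binary image can be sketched as:

```python
def label_components(grid):
    """4-connected component labelling with union-find (two passes)."""
    parent = {}

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]   # path halving
            a = parent[a]
        return a

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[rb] = ra

    h, w = len(grid), len(grid[0])
    # pass 1: register foreground pixels and merge with up/left neighbours
    for y in range(h):
        for x in range(w):
            if grid[y][x]:
                parent[(y, x)] = (y, x)
                if y and grid[y - 1][x]:
                    union((y - 1, x), (y, x))
                if x and grid[y][x - 1]:
                    union((y, x - 1), (y, x))
    # pass 2: assign consecutive labels per root
    roots, labels = {}, [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if grid[y][x]:
                labels[y][x] = roots.setdefault(find((y, x)), len(roots) + 1)
    return labels

labels = label_components([[1, 1, 0],
                           [0, 0, 0],
                           [0, 1, 1]])
```

    The parallel variant in the paper partitions the image across nodes, labels each tile independently, and then merges labels across tile boundaries, which is where the novel merging step comes in.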

  2. Lay-out and construction of a pressure vessel built-up of cast steel segments for a pebble-bed high temperature reactor with a thermal power of 3000 MW

    International Nuclear Information System (INIS)

    Voigt, J.

    1978-03-01

    The prestressed cast-steel vessel is an alternative to the prestressed concrete vessel for large high temperature reactors. In this report, different cast-steel vessel concepts for an HTR for electricity generation with 3000 MW(th) are compared with respect to their feasibility and economy. The most favourable variant serves as the basis for the lay-out of the individual vessel components: cast steel segments, bracing, cooling and outer sealing. The currently available possibilities of production and transport are taken into account. For the concept worked out, possibilities of inspection and repair are suggested. A cost comparison with corresponding industry proposals for a prestressed concrete and a cast iron pressure vessel examines the economic competitiveness. (orig.) [de

  3. Cochlea Segmentation using Iterated Random Walks with Shape Prior

    DEFF Research Database (Denmark)

    Ruiz Pujadas, Esmeralda; Kjer, Hans Martin; Vera, Sergio

    2016-01-01

    Cochlear implants can restore hearing to deaf or partially deaf patients. In order to plan the intervention, a model is to be built from accurate cochlea segmentations of high resolution μCT images and then adapted to a patient-specific model. Thus, a precise segmentation is required to build...

  4. Coincidence gamma-ray spectrometry

    DEFF Research Database (Denmark)

    Markovic, Nikola; Roos, Per; Nielsen, Sven Poul

    2017-01-01

    Gamma-ray spectrometry with high-purity germanium (HPGe) detectors is often the technique of choice in an environmental radioactivity laboratory. When measuring environmental samples, associated activities are usually low, so an important parameter that describes the performance of the spectrometer...... for a nuclide of interest is the minimum detectable activity (MDA). There are many ways of lowering the MDAs in gamma spectrometry. Recently, developments of fast and compact digital acquisition systems have led to a growing number of multiple-HPGe-detector spectrometers. In these applications all detected...
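
    A common textbook estimate of the MDA mentioned above is Currie's formula; a sketch follows (the paper's own MDA procedure for coincidence spectrometry may differ):

```python
import math

def currie_mda(background_counts, efficiency, emission_prob, live_time_s):
    """Currie's minimum detectable activity at ~95% confidence:
    L_D = 2.71 + 4.65*sqrt(B) detected counts, converted to activity (Bq)
    via full-energy-peak efficiency, gamma emission probability and
    counting live time."""
    l_d = 2.71 + 4.65 * math.sqrt(background_counts)
    return l_d / (efficiency * emission_prob * live_time_s)

# Illustrative numbers: 100 background counts under the peak, 2% efficiency,
# 85% emission probability, one day of counting
mda = currie_mda(100, efficiency=0.02, emission_prob=0.85, live_time_s=86400)
```

    The formula makes the leverage points explicit: coincidence techniques lower the MDA mainly by suppressing the background term B under the peak of interest.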

  5. Parallel fuzzy connected image segmentation on GPU.

    Science.gov (United States)

    Zhuge, Ying; Cao, Yong; Udupa, Jayaram K; Miller, Robert W

    2011-07-01

    Image segmentation techniques using fuzzy connectedness (FC) principles have shown their effectiveness in segmenting a variety of objects in several large applications. However, one challenge in these algorithms has been their excessive computational requirements when processing large image datasets. Nowadays, commodity graphics hardware provides a highly parallel computing environment. In this paper, the authors present a parallel fuzzy connected image segmentation algorithm implementation on NVIDIA's Compute Unified Device Architecture (CUDA) platform for segmenting medical image data sets. In the FC algorithm, there are two major computational tasks: (i) computing the fuzzy affinity relations and (ii) computing the fuzzy connectedness relations. These two tasks are implemented as CUDA kernels and executed on the GPU. A dramatic improvement in speed for both tasks is achieved as a result. Our experiments based on three data sets of small, medium, and large data size demonstrate the efficiency of the parallel algorithm, which achieves speed-up factors of 24.4x, 18.1x, and 10.3x, respectively, for the three data sets on the NVIDIA Tesla C1060 over the CPU implementation, and takes 0.25, 0.72, and 15.04 s, respectively, for the three data sets. The authors developed a parallel algorithm of the widely used fuzzy connected image segmentation method on NVIDIA GPUs, which are far more cost- and speed-effective than both clusters of workstations and multiprocessing systems. A near-interactive speed of segmentation has been achieved, even for the large data set.
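
    The first of the two tasks, computing fuzzy affinity relations, is embarrassingly parallel, which is what makes it a good CUDA kernel; a NumPy stand-in shows the structure (the Gaussian affinity form is one common choice, not necessarily the paper's exact definition):

```python
import numpy as np

def affinity_right(img, sigma=0.1):
    """Fuzzy affinity between each pixel and its right-hand neighbour,
    here a Gaussian of the intensity difference (one common choice).
    Every output element depends on a single pixel pair, so the
    computation maps naturally onto a CUDA kernel with one thread
    per pair."""
    diff = img[:, 1:] - img[:, :-1]
    return np.exp(-(diff ** 2) / (2.0 * sigma ** 2))

img = np.array([[0.10, 0.12, 0.90],
                [0.10, 0.11, 0.88]])
aff = affinity_right(img)   # high inside the dark region, ~0 across the edge
```

    The second task, propagating fuzzy connectedness, has data dependencies between pixels and is the harder one to parallelize, which is where the bulk of the GPU implementation effort lies.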

  6. Region segmentation along image sequence

    International Nuclear Information System (INIS)

    Monchal, L.; Aubry, P.

    1995-01-01

    A method to extract regions in a sequence of images is proposed. Regions are not matched from one image to the next; instead, the result of segmenting one image is used as the initialization for segmenting the following image, so that the region is tracked along the sequence. The image sequence is exploited as a spatio-temporal event. (authors). 12 refs., 8 figs

  7. Market Segmentation: An Instructional Module.

    Science.gov (United States)

    Wright, Peter H.

    A concept-based introduction to market segmentation is provided in this instructional module for undergraduate and graduate transportation-related courses. The material can be used in many disciplines including engineering, business, marketing, and technology. The concept of market segmentation is primarily a transportation planning technique by…

  8. IFRS 8 – OPERATING SEGMENTS

    Directory of Open Access Journals (Sweden)

    BOCHIS LEONICA

    2009-05-01

    Full Text Available Segment reporting in accordance with IFRS 8 will be mandatory for annual financial statements covering periods beginning on or after 1 January 2009. The standard replaces IAS 14, Segment Reporting, from that date. The objective of IFRS 8 is to require

  9. Reduplication Facilitates Early Word Segmentation

    Science.gov (United States)

    Ota, Mitsuhiko; Skarabela, Barbora

    2018-01-01

    This study explores the possibility that early word segmentation is aided by infants' tendency to segment words with repeated syllables ("reduplication"). Twenty-four nine-month-olds were familiarized with passages containing one novel reduplicated word and one novel non-reduplicated word. Their central fixation times in response to…

  10. The Importance of Marketing Segmentation

    Science.gov (United States)

    Martin, Gillian

    2011-01-01

    The rationale behind marketing segmentation is to allow businesses to focus on their consumers' behaviors and purchasing patterns. If done effectively, marketing segmentation allows an organization to achieve its highest return on investment (ROI) in turn for its marketing and sales expenses. If an organization markets its products or services to…

  11. Essays in international market segmentation

    NARCIS (Netherlands)

    Hofstede, ter F.

    1999-01-01

    The primary objective of this thesis is to develop and validate new methodologies to improve the effectiveness of international segmentation strategies. The current status of international market segmentation research is reviewed in an introductory chapter, which provided a number of

  12. Planning and delivering high doses to targets surrounding the spinal cord at the lower neck and upper mediastinal levels: static beam-segmentation technique executed by a multileaf collimator

    International Nuclear Information System (INIS)

    Schelfhout, J.; Derycke, S.; Fortan, L.; Van Duyse, B.; Colle, C.; De Wagter, C.; De Neve, W.

    1995-01-01

    The possibility to plan and deliver beam intensity modulated radiotherapy using a general purpose 3D-planning system (Sherouse's GRATISTM) and a linear accelerator equipped with a standard multileaf collimator (MLC) was investigated in view of limiting the dose at the spinal cord below tolerance. During the planning process, dose homogenization at the target is obtained by the calculation of the weights, given to beam segments of a specific predetermined geometry. This specific geometry maximizes the area of each segment and thus reduces the number of segments. With a virtual patient in supine position, a first planning using a single isocenter, with gantry positions of -60, -30, 0, 30 and 60 degrees was performed. Medial edges of all segments were located tangential to the spinal cord. The resulting dose distribution allowed to encompass the target by an isodose surface of 66-70 Gy without exceeding spinal cord tolerance but required 42 segments distributed over 5 gantry angles. Therefore, dose-volume histogram analysis were performed for those cases where: 1) for some gantry positions, all beam segments could be omitted; 2) at the remaining gantry angles, segments could be omitted; 3) at least 2 segments could be traded off against 1 additional gantry angle. This procedure resulted in a final plan containing 22 segments spread over 8 gantry angles. Preliminary dosimetric results on a RANDO phantom support the robustness of the method. The first clinical applications have been planned. Although up to 99 beam segments can be programmed on the Philips SL25 linear accelerator, it remained impossible to use these segments synchronized with the MLC. From a clinical viewpoint, the proposed treatment for irradiating lower neck and upper mediastinal targets could be used as a standard against which other solutions might be tested
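
    The weight-calculation step can be sketched as a small nonnegative least-squares problem (the matrix entries, the projected-gradient solver and all numbers below are illustrative assumptions, not the planning system's algorithm):

```python
def solve_weights(dose_per_segment, prescription, iters=2000, lr=0.01):
    """Find nonnegative segment weights w so that the delivered dose
    D_i = sum_k A[i][k] * w[k] approaches the prescription at every
    target point i (projected gradient descent on the squared error)."""
    n_pts, n_seg = len(dose_per_segment), len(dose_per_segment[0])
    w = [1.0] * n_seg
    for _ in range(iters):
        # residual r_i = delivered dose minus prescribed dose
        r = [sum(dose_per_segment[i][k] * w[k] for k in range(n_seg))
             - prescription[i] for i in range(n_pts)]
        for k in range(n_seg):
            grad = sum(2.0 * r[i] * dose_per_segment[i][k]
                       for i in range(n_pts))
            w[k] = max(0.0, w[k] - lr * grad)   # clip to keep weights >= 0
    return w

# Two target points, two segments; unit-weight dose contributions (made up)
A = [[1.0, 0.2],
     [0.3, 1.0]]
w = solve_weights(A, prescription=[70.0, 70.0])
```

    In the technique above the segment geometry is fixed in advance, so homogenizing the target dose reduces to exactly this kind of weight optimization over the predetermined segments.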

  13. Planning and delivering high doses to targets surrounding the spinal cord at the lower neck and upper mediastinal levels: static beam-segmentation technique executed by a multileaf collimator

    Energy Technology Data Exchange (ETDEWEB)

    Schelfhout, J; Derycke, S; Fortan, L; Van Duyse, B; Colle, C; De Wagter, C; De Neve, W [Ghent Rijksuniversiteit (Belgium). Kliniek voor Radiotherapie en Kerngeneeskunde

    1995-12-01

    The possibility to plan and deliver beam intensity modulated radiotherapy using a general purpose 3D-planning system (Sherouse's GRATISTM) and a linear accelerator equipped with a standard multileaf collimator (MLC) was investigated in view of limiting the dose at the spinal cord below tolerance. During the planning process, dose homogenization at the target is obtained by the calculation of the weights, given to beam segments of a specific predetermined geometry. This specific geometry maximizes the area of each segment and thus reduces the number of segments. With a virtual patient in supine position, a first planning using a single isocenter, with gantry positions of -60, -30, 0, 30 and 60 degrees was performed. Medial edges of all segments were located tangential to the spinal cord. The resulting dose distribution allowed to encompass the target by an isodose surface of 66-70 Gy without exceeding spinal cord tolerance but required 42 segments distributed over 5 gantry angles. Therefore, dose-volume histogram analysis were performed for those cases where: (1) for some gantry positions, all beam segments could be omitted; (2) at the remaining gantry angles, segments could be omitted; (3) at least 2 segments could be traded off against 1 additional gantry angle. This procedure resulted in a final plan containing 22 segments spread over 8 gantry angles. Preliminary dosimetric results on a RANDO phantom support the robustness of the method. The first clinical applications have been planned. Although up to 99 beam segments can be programmed on the Philips SL25 linear accelerator, it remained impossible to use these segments synchronized with the MLC. From a clinical viewpoint, the proposed treatment for irradiating lower neck and upper mediastinal targets could be used as a standard against which other solutions might be tested.

  14. Evaluation of anterior segment changes using ultrasound biomicroscopy following phacoemulsification and implantation of one-piece and three-piece intraocular lenses in high myopia

    Directory of Open Access Journals (Sweden)

    Reham Mohamed Samy

    2017-01-01

    The aim of studying this correlation is to help answer an important question: does one IOL diameter fit all capsular bags? The statistical analyses point to a statistically significant correlation between the diameters of the capsular bags and the diameters of the implanted IOLs in both groups A and B, which demonstrates that neither sort of IOL fits the enlarged capsular bags of highly myopic eyes, i.e. it highlights the divergence between the size of the IOLs and that of the capsular bag.

  15. LOCA testing of high burnup PWR fuel in the HBWR. Additional PIE on the cladding of the segment 650-5

    Energy Technology Data Exchange (ETDEWEB)

    Oberlaender, B.C.; Espeland, M.; Jenssen, H.K.

    2008-07-01

    IFA-650.5, a test with pre-irradiated fuel in the Halden Project LOCA test series, was conducted on October 23rd, 2006. The fuel rod had been used in a commercial PWR and had a high burnup of 83 MWd/kgU. The experimental arrangements of the fifth test were similar to those of the preceding LOCA tests. The peak cladding temperature (PCT) level, 1050 C, was higher than in the third and fourth tests. A peak temperature close to the target was achieved, and cladding burst occurred at approx. 750 C. Within the joint programme framework of the Halden Project, PIE was performed, consisting of gamma scanning, visual inspection, neutron radiography, hydrogen analysis and metallography/ceramography. An additional extensive PIE, including metallography, hydrogen analysis, and hardness measurements of cross-sections at seven axial elevations, was carried out to study the high-burnup and LOCA-induced effects on the Zr-4 cladding, namely the migration of oxygen into the cladding from the inside surface, the cladding distension, and the burst (author)

  16. Segmental vitiligo with segmental morphea: An autoimmune link?

    Directory of Open Access Journals (Sweden)

    Pravesh Yadav

    2014-01-01

    Full Text Available An 18-year-old girl with segmental vitiligo involving the left side of the trunk and left upper limb, together with segmental morphea involving the right side of the trunk and right upper limb without any deeper involvement, is illustrated. There was no history of preceding drug intake, vaccination, trauma, radiation therapy, infection, or hormonal therapy. A family history of stable vitiligo in her brother and a history of type II diabetes mellitus in the father were elicited. Screening for autoimmune diseases and antithyroid antibody was negative. An autoimmune link explaining the co-occurrence has been proposed. Cutaneous mosaicism could explain the presence of both pathologies in a segmental distribution.

  17. Name segmentation using hidden Markov models and its application in record linkage

    Directory of Open Access Journals (Sweden)

    Rita de Cassia Braga Gonçalves

    2014-10-01

    Full Text Available This study aimed to evaluate the use of hidden Markov models (HMM for the segmentation of person names and its influence on record linkage. A HMM was applied to the segmentation of patient’s and mother’s names in the databases of the Mortality Information System (SIM, Information Subsystem for High Complexity Procedures (APAC, and Hospital Information System (AIH. A sample of 200 patients from each database was segmented via HMM, and the results were compared to those from segmentation by the authors. The APAC-SIM and APAC-AIH databases were linked using three different segmentation strategies, one of which used HMM. Conformity of segmentation via HMM varied from 90.5% to 92.5%. The different segmentation strategies yielded similar results in the record linkage process. This study suggests that segmentation of Brazilian names via HMM is no more effective than traditional segmentation approaches in the linkage process.
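
    The HMM segmentation step can be sketched with a standard Viterbi decoder over name tokens (the states, probabilities and tokens below are hypothetical toy values, not the trained model from the study):

```python
def viterbi(tokens, states, start_p, trans_p, emit_p):
    """Most likely state sequence for a token sequence under an HMM
    (unseen tokens get a small floor probability of 1e-6)."""
    v = [{s: (start_p[s] * emit_p[s].get(tokens[0], 1e-6), [s])
          for s in states}]
    for tok in tokens[1:]:
        prev, cur = v[-1], {}
        for s in states:
            p, path = max((prev[r][0] * trans_p[r][s]
                           * emit_p[s].get(tok, 1e-6), prev[r][1])
                          for r in states)
            cur[s] = (p, path + [s])
        v.append(cur)
    return max(v[-1].values())[1]

# Hypothetical toy model: tag each token of a name as given name or surname.
states = ("GIVEN", "SURNAME")
start_p = {"GIVEN": 0.9, "SURNAME": 0.1}
trans_p = {"GIVEN": {"GIVEN": 0.4, "SURNAME": 0.6},
           "SURNAME": {"GIVEN": 0.1, "SURNAME": 0.9}}
emit_p = {"GIVEN": {"rita": 0.5, "cassia": 0.3},
          "SURNAME": {"braga": 0.4, "goncalves": 0.4}}
tags = viterbi(["rita", "braga", "goncalves"], states, start_p, trans_p, emit_p)
```

    In the record-linkage setting, the decoded tags split each full name into its segments, which can then be compared field by field across databases.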

  18. Benchmark for license plate character segmentation

    Science.gov (United States)

    Gonçalves, Gabriel Resende; da Silva, Sirlene Pio Gomes; Menotti, David; Shwartz, William Robson

    2016-09-01

    Automatic license plate recognition (ALPR) has been the focus of much research in recent years. In general, ALPR is divided into the following problems: detection of on-track vehicles, license plate detection, segmentation of license plate characters, and optical character recognition (OCR). Even though commercial solutions are available for controlled acquisition conditions, e.g., the entrance of a parking lot, ALPR is still an open problem when dealing with data acquired in uncontrolled environments, such as roads and highways, when relying only on imaging sensors. Due to the multiple orientations and scales of the license plates captured by the camera, a very challenging task in ALPR is the license plate character segmentation (LPCS) step, because its effectiveness must be (near) optimal to achieve a high recognition rate in the OCR. To tackle the LPCS problem, this work proposes a benchmark composed of a dataset designed to focus specifically on the character segmentation step of ALPR, within an evaluation protocol. Furthermore, we propose the Jaccard-centroid coefficient, an evaluation measure more suitable than the Jaccard coefficient regarding the location of the bounding box within the ground-truth annotation. The dataset is composed of 2000 Brazilian license plates consisting of 14000 alphanumeric symbols and their corresponding bounding-box annotations. We also present a straightforward approach to perform LPCS efficiently. Finally, we provide an experimental evaluation of the dataset based on five LPCS approaches and demonstrate the importance of character segmentation for achieving accurate OCR.
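
    The plain Jaccard coefficient for axis-aligned bounding boxes is shown below for reference; the paper's Jaccard-centroid variant, which additionally accounts for centroid location, is not reproduced here:

```python
def jaccard(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    iw = max(0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih
    union = ((ax2 - ax1) * (ay2 - ay1)
             + (bx2 - bx1) * (by2 - by1) - inter)
    return inter / union if union else 0.0

j = jaccard((0, 0, 2, 2), (1, 0, 3, 2))   # 2 overlap / 6 union ≈ 0.333
```

    The motivation for the centroid-aware variant is that two predicted boxes with identical Jaccard scores can still differ in how well their centers line up with the ground-truth character.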

  19. Document flow segmentation for business applications

    Science.gov (United States)

    Daher, Hani; Belaïd, Abdel

    2013-12-01

    The aim of this paper is to propose a supervised document-flow segmentation approach applied to real-world heterogeneous documents. Our algorithm treats the flow of documents as couples of consecutive pages and studies the relationship that exists between them. First, sets of features are extracted from the pages, and we propose an approach to model each couple of pages as a single feature-vector representation. This representation is provided to a binary classifier, which classifies the relationship as either segmentation or continuity. In the case of segmentation, we consider that we have a complete document, and the analysis of the flow continues by starting a new document. In the case of continuity, the couple of pages is assigned to the same document and the analysis continues along the flow. If there is uncertainty as to whether the relationship between the couple of pages should be classified as continuity or segmentation, a rejection is decided and the pages analyzed up to this point are considered as a "fragment". The first classification already provides good results, approaching 90% on certain documents, which is high at this level of the system.
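
    The couple-of-pages decision described above can be sketched as a pairwise feature vector plus a three-way decision with a reject band. The feature construction and the thresholds below are hypothetical, not the paper's; any real system would use a trained classifier score in place of the raw score here.

    ```python
    # Sketch of the page-pair decision with a reject option (hypothetical features/thresholds).

    def couple_features(page_a, page_b):
        """Model two consecutive pages as one vector: each page's features
        plus their element-wise absolute differences."""
        diffs = [abs(x - y) for x, y in zip(page_a, page_b)]
        return page_a + page_b + diffs

    def decide(continuity_score, t_low=0.4, t_high=0.6):
        """Map a classifier's continuity score to a three-way decision."""
        if continuity_score >= t_high:
            return "continuity"      # same document continues
        if continuity_score <= t_low:
            return "segmentation"    # a new document starts
        return "reject"              # ambiguous: emit the pages seen so far as a fragment

    print(decide(0.9), decide(0.1), decide(0.5))
    # → continuity segmentation reject
    ```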

  20. Using Predictability for Lexical Segmentation.

    Science.gov (United States)

    Çöltekin, Çağrı

    2017-09-01

    This study investigates a strategy based on predictability of consecutive sub-lexical units in learning to segment a continuous speech stream into lexical units using computational modeling and simulations. Lexical segmentation is one of the early challenges during language acquisition, and it has been studied extensively through psycholinguistic experiments as well as computational methods. However, despite strong empirical evidence, the explicit use of predictability of basic sub-lexical units in models of segmentation is underexplored. This paper presents an incremental computational model of lexical segmentation for exploring the usefulness of predictability for lexical segmentation. We show that the predictability cue is a strong cue for segmentation. Contrary to earlier reports in the literature, the strategy yields state-of-the-art segmentation performance with an incremental computational model that uses only this particular cue in a cognitively plausible setting. The paper also reports an in-depth analysis of the model, investigating the conditions affecting the usefulness of the strategy. Copyright © 2016 Cognitive Science Society, Inc.
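
    The predictability cue can be illustrated with transitional probabilities between consecutive sub-lexical units: a word boundary is posited where the probability of the next unit given the current one dips. The local-minimum rule and the toy syllable corpus below are illustrative assumptions, not the paper's exact incremental model.

    ```python
    # Boundary insertion at local minima of transitional probability (illustrative).
    from collections import Counter

    def segment_by_predictability(corpus, stream):
        """Estimate P(next | current) from a training corpus of units, then place a
        boundary in the stream wherever that probability is a local minimum."""
        bigrams = Counter(zip(corpus, corpus[1:]))
        unigrams = Counter(corpus)
        tp = lambda a, b: bigrams[(a, b)] / unigrams[a] if unigrams[a] else 0.0
        probs = [tp(a, b) for a, b in zip(stream, stream[1:])]
        words, start = [], 0
        for i in range(1, len(probs)):
            prev_ok = probs[i] < probs[i - 1]
            next_ok = i + 1 >= len(probs) or probs[i] < probs[i + 1]
            if prev_ok and next_ok:            # dip between units i and i+1
                words.append(stream[start:i + 1])
                start = i + 1
        words.append(stream[start:])
        return words

    corpus = ["ba", "by", "lu", "la", "ba", "by", "lu", "la", "ba", "by"]
    print(segment_by_predictability(corpus, ["ba", "by", "lu", "la", "ba", "by"]))
    ```

    Within-word transitions ("ba"→"by") are highly predictable; across-word transitions ("by"→"lu") are less so, and the dip marks the boundary.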

  1. Proposal of a segmentation procedure for skid resistance data

    International Nuclear Information System (INIS)

    Tejeda, S. V.; Tampier, Hernan de Solominihac; Navarro, T.E.

    2008-01-01

    Skid resistance of pavements presents a high spatial variability along a road. This pavement characteristic is directly related to wet-weather accidents; therefore, it is important to identify and characterize homogeneous segments of skid resistance along a road in order to implement proper road safety management. Several data segmentation methods have been applied to other pavement characteristics (e.g. roughness). However, no application to skid resistance data was found during the literature review for this study. Typical segmentation methods are either too general or too specific to ensure a detailed segmentation of skid resistance data that can be used for managing pavement performance. The main objective of this paper is to propose a procedure for segmenting skid resistance data, based on existing data segmentation methods. The procedure needs to be efficient and to fulfill road management requirements. The proposed procedure considers the Leverage method to identify outlier data, the CUSUM method to accomplish initial data segmentation and a statistical method to group consecutive segments that are statistically similar. The statistical method applies the Student's t-test of mean equality, along with analysis of variance and the Tukey test for the multiple comparison of means. The proposed procedure was applied to a sample of skid resistance data measured with SCRIM (Sideway-force Coefficient Routine Investigation Machine) on a 4.2 km section of Chilean road and was compared to conventional segmentation methods. Results showed that the proposed procedure is more efficient than the conventional segmentation procedures, achieving the minimum weighted sum of square errors (SSEp) with all the identified segments statistically different. Due to its mathematical basis, the proposed procedure can be easily adapted and programmed for use in road safety management. (author)
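
    Of the three steps, the CUSUM stage is easy to sketch: the cumulative sum of deviations from the global mean changes slope wherever the level of the data shifts, so its extremum marks the most likely boundary. The single-change-point version below (applied recursively to each half in practice) is an illustrative simplification, not the paper's full procedure.

    ```python
    # Single change-point detection via the CUSUM curve (illustrative).

    def cusum_changepoint(values):
        """Return the index where the cumulative sum of deviations from the global
        mean peaks in magnitude: the most likely boundary between two segments."""
        mean = sum(values) / len(values)
        s, curve = 0.0, []
        for v in values:
            s += v - mean
            curve.append(s)
        return max(range(len(curve)), key=lambda i: abs(curve[i]))

    # Skid resistance drops from ~50 to ~70 halfway through: boundary after index 4.
    print(cusum_changepoint([50, 50, 50, 50, 50, 70, 70, 70, 70, 70]))
    # → 4
    ```

    After splitting, the statistical grouping step would test adjacent segments for equal means and re-merge those that are not significantly different.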

  2. Comparison of novel ultra-high molecular weight polyethylene tape versus conventional metal wire for sublaminar segmental fixation in the treatment of adolescent idiopathic scoliosis.

    Science.gov (United States)

    Takahata, Masahiko; Ito, Manabu; Abumi, Kuniyoshi; Kotani, Yoshihisa; Sudo, Hideki; Ohshima, Shigeki; Minami, Akio

    2007-08-01

    Retrospective study. To compare the surgical outcomes of posterior translational correction and fusion using hybrid instrumentation systems with either sublaminar Nesplon tape or sublaminar metal wire to treat adolescent idiopathic scoliosis (AIS). Nesplon tape, which consists of a thread of ultra-high molecular weight polyethylene fibers, has advantages over metal wire: (1) its soft and flexible properties avoid neural damage and (2) its flat configuration avoids focal distribution of stresses to the lamina; however, the efficacy of Nesplon tape in the correction of spinal deformity is as yet unclear. Thirty AIS patients at a single institution underwent posterior correction and fusion using hybrid instrumentation containing hooks, pedicle screws, and either sublaminar polyethylene taping (15) or sublaminar metal wiring (15). Patients were evaluated preoperatively, immediately after surgery, and at a 2-year follow-up according to the radiographic changes in curve correction, operating time, intraoperative blood loss, complications, and the Scoliosis Research Society patient questionnaire (SRS-24) score. The average correction rate was 63.0% in the Nesplon tape group and 59.9% in the metal wire group immediately after surgery (P = 0.62). Fusion was obtained in all the patients without significant correction loss in both groups. There was no significant difference in operative time, intraoperative blood loss, and postoperative SRS-24 scores between the 2 groups. Complications were superficial skin infection in a single patient in the Nesplon tape group, and transient sensory disturbance in 1 patient and transient superior mesenteric artery syndrome in another patient in the metal wire group. The efficacy of Nesplon tape in correction of deformity is equivalent to that of metal wire, and fusion was completed without significant correction loss. The soft and flexible properties and flat configuration of Nesplon tape make it a safe option for the treatment of AIS.

  3. Validation of {sup 226}Ra, {sup 228}Ra and {sup 210}Pb measurements in soil and sediment samples through high resolution gamma ray spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Dias, Danila Carrijo da Silva; Silva, Nivaldo Carlos da; Bonifacio, Rodrigo Leandro; Guerrero, Eder Tadeu Zenun [Comissao Nacional de Energia Nuclear (LAPOC/CNEN-MG), Pocos de Caldas, MG (Brazil). Laboratorio de Pocos de Caldas

    2013-07-01

    Radionuclides found in ore extraction waste materials are a great source of concern regarding public health and environmental safety. One technique to determine the concentration of such substances is high resolution gamma ray spectrometry using HPGe detectors. Validating a measurement technique is essential to ensure high levels of quality in any scientific work. The Laboratory of Pocos de Caldas of the Brazilian Commission for Nuclear Energy participates in a Quality Management System project, seeking accreditation under ISO/IEC 17025 through the validation of techniques of chemical and radiometric analysis of environmental samples of water, soil and sediment. The focus of the Radon Laboratory at LAPOC is the validation of Ra-226, Ra-228 and Pb-210 concentration determinations in soil and sediment through a gamma spectrometer system. The stages of this validation process included sample reception and preparation, detector calibration and sample analysis. Dried samples were sealed in metallic containers and analyzed after radioactive equilibrium was reached between Ra-226 and its daughters Pb-214 and Bi-214. Gamma spectrometry was performed using a CANBERRA HPGe detector and the gamma spectrum software Genie 2000. The photopeaks used for the Ra-226 determination were 609 keV and 1120 keV of Bi-214 and 351 keV of Pb-214. For the Ra-228 determination, the 911 keV photopeak of its short half-life daughter Ac-228 (T1/2 = 6.12 h) was used. For Pb-210, the 46.5 keV photopeak was used, which required a self-absorption correction due to its low energy. Parameters such as precision, bias/accuracy, linearity, detection limit and uncertainty were evaluated. The results were satisfactory. (author)
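
    The photopeak-based determination reduces to a standard formula: the activity concentration is the net peak count divided by detection efficiency, gamma emission probability, live time, and sample mass, A = N / (ε · p_γ · t · m). The numbers below are hypothetical, chosen only to show the arithmetic; real values come from the efficiency calibration and nuclide data tables.

    ```python
    # Activity concentration from a single photopeak (hypothetical example values).

    def specific_activity(net_counts, efficiency, gamma_yield, live_time_s, mass_kg):
        """Activity concentration in Bq/kg: A = N / (eps * p_gamma * t * m)."""
        return net_counts / (efficiency * gamma_yield * live_time_s * mass_kg)

    # Example: 609 keV line of Bi-214 (p_gamma ≈ 0.455); all other numbers invented.
    a = specific_activity(net_counts=9100, efficiency=0.02, gamma_yield=0.455,
                          live_time_s=100000, mass_kg=0.5)
    print(round(a, 3), "Bq/kg")
    ```

    In practice the net counts are background- and continuum-subtracted, and for the 46.5 keV Pb-210 line an additional self-absorption correction factor multiplies the denominator.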


  5. The Hierarchy of Segment Reports

    Directory of Open Access Journals (Sweden)

    Danilo Dorović

    2015-05-01

    Full Text Available The article presents an attempt to find the connection between reports created for managers responsible for different business segments. With this purpose, a hierarchy of business reporting segments is proposed. This can lead to a better understanding of expenses under the common responsibility of more than one manager, since these expenses should appear in more than one report. A cost structure defined along the business segment hierarchy, new and unusual but relevant for management, can thus be established. Both could potentially bring new information benefits for management in the context of profit reporting.

  6. Segmental dilatation of the ileum

    Directory of Open Access Journals (Sweden)

    Tune-Yie Shih

    2017-01-01

    Full Text Available A 2-year-old boy was brought to the emergency department with the chief complaint of abdominal pain for 1 day. He had just been discharged from the pediatric ward with a diagnosis of mycoplasmal pneumonia and paralytic ileus. After initial examinations and radiographic investigations, midgut volvulus was suspected. An emergency laparotomy was performed. Segmental dilatation of the ileum with volvulus was found. The operative procedure was resection of the dilated ileal segment with anastomosis. The postoperative recovery was uneventful. This unique abnormality of the gastrointestinal tract, segmental dilatation of the ileum, is described in detail and the literature is reviewed.

  7. Stacking and metamorphism of continuous segments of subducted lithosphere in a high-pressure wedge: The example of Alpine Corsica (France)

    Science.gov (United States)

    Vitale Brovarone, Alberto; Beyssac, Olivier; Malavieille, Jacques; Molli, Giancarlo; Beltrando, Marco; Compagnoni, Roberto

    2013-01-01

    Alpine Corsica consists of a stack of variably metamorphosed units of continental and Tethys-derived rocks. It represents an excellent example of a high-pressure (HP) orogenic belt, like the Western Alps, exposed over a small and accessible area. Compared to the Western Alps, however, the geology of Alpine Corsica is poorly understood. During the 1970s-80s, based on either lithostratigraphic or metamorphic field observations, various classifications of the belt were proposed, but these classifications have rarely been matched with one another. Furthermore, through time, the internal complexity of large domains has progressively been left aside in the frame of large-scale geodynamic reconstructions. As a consequence, major open questions on the internal structure of the belt have remained unsolved. Apart from a few local studies, Alpine Corsica has not benefited from modern developments in petrology and basin research. This results in several uncertainties when combining lithostratigraphic and metamorphic patterns and, consequently, in the definition of an exhaustive architecture of the belt. In this paper we provide a review of the geology of Alpine Corsica, paying particular attention to the available lithostratigraphic and metamorphic classifications of the metamorphic terranes. These data are completed by a new and exhaustive metamorphic dataset obtained by means of thermometry based on Raman Spectroscopy of Carbonaceous Material (RSCM). This technique provides reliable insights into the peak temperature of the metamorphic history for CM-bearing metasediments. A detailed metamorphic characterization of metasediments, previously largely ignored due to retrogression or to the lack of diagnostic mineralogy, is thus obtained and fruitfully coupled with the available lithostratigraphic data. Nine main tectono-metamorphic units are defined, from subgreenschist (ca. 280-300 °C) to lawsonite-eclogite-facies (ca. 500-550 °C) conditions.

  8. Accounting for segment correlations in segmented gamma-ray scans

    International Nuclear Information System (INIS)

    Sheppard, G.A.; Prettyman, T.H.; Piquette, E.C.

    1994-01-01

    In a typical segmented gamma-ray scanner (SGS), the detector's field of view is collimated so that a complete horizontal slice or segment of the desired thickness is visible. Ordinarily, the collimator is not deep enough to exclude gamma rays emitted from sample volumes above and below the segment aligned with the collimator. This can lead to assay biases, particularly for certain radioactive-material distributions. Another consequence of the collimator's low aspect ratio is that segment assays at the top and bottom of the sample are biased low because the detector's field of view is not filled. This effect is ordinarily countered by placing the sample on a low-Z pedestal and scanning one or more segment thicknesses below and above the sample. This takes extra time, however. We have investigated a number of techniques that both account for correlated segments and correct for end effects in SGS assays. Also, we have developed an algorithm that facilitates estimates of assay precision. Six calculation methods have been compared by evaluating the results of thousands of simulated assays for three types of gamma-ray source distribution and ten masses. We will report on these computational studies and their experimental verification.
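
    One way to account for correlated segments, sketched below under strong assumptions, is to model the measured counts at each detector position as a linear response to the true segment activities, m = R·a, and invert the system. The 3×3 response matrix (each position also sees 20% of each neighboring segment) is entirely hypothetical; the abstract does not specify the paper's actual calculation methods.

    ```python
    # Illustrative unfolding of segment cross-talk by solving R a = m.

    def unfold_segments(R, m):
        """Solve R a = m by Gauss-Jordan elimination with partial pivoting
        (fine for the small systems that arise with a handful of segments)."""
        n = len(m)
        A = [row[:] + [m[i]] for i, row in enumerate(R)]
        for col in range(n):
            piv = max(range(col, n), key=lambda r: abs(A[r][col]))
            A[col], A[piv] = A[piv], A[col]
            for r in range(n):
                if r != col and A[r][col]:
                    f = A[r][col] / A[col][col]
                    A[r] = [x - f * y for x, y in zip(A[r], A[col])]
        return [A[i][n] / A[i][i] for i in range(n)]

    # Three segments; each detector position also sees 20% of each neighbor.
    R = [[1.0, 0.2, 0.0],
         [0.2, 1.0, 0.2],
         [0.0, 0.2, 1.0]]
    true_a = [5.0, 0.0, 3.0]
    measured = [sum(R[i][j] * true_a[j] for j in range(3)) for i in range(3)]
    print(unfold_segments(R, measured))
    ```

    Unfolding recovers the true per-segment activities from the correlated measurements; a naive segment-by-segment assay would instead report the middle (empty) segment as active.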

  9. Continuously live image processor for drift chamber track segment triggering

    International Nuclear Information System (INIS)

    Berenyi, A.; Chen, H.K.; Dao, K.

    1999-01-01

    The first portion of the BaBar experiment Level 1 Drift Chamber Trigger pipeline is the Track Segment Finder (TSF). Using a novel method incorporating both occupancy and drift-time information, the TSF system continually searches for segments in the supercells of the full 7104-wire Drift Chamber hit image at 3.7 MHz. The TSF was constructed to operate in a potentially high beam-background environment while achieving high segment-finding efficiency, deadtime-free operation, a spatial resolution of 5 simulated physics events

  10. Variational segmentation problems using prior knowledge in imaging and vision

    DEFF Research Database (Denmark)

    Fundana, Ketut

    This dissertation addresses variational formulation of segmentation problems using prior knowledge. Variational models are among the most successful approaches for solving many Computer Vision and Image Processing problems. The models aim at finding the solution to a given energy functional defined...; prior knowledge is needed to obtain the desired solution. The introduction of shape priors in particular has proven to be an effective way to segment objects of interest. Firstly, we propose a prior-based variational segmentation model to segment objects of interest in image sequences, that can deal... Many objects have high variability in shape and orientation. This often leads to unsatisfactory results when using a segmentation model with a single shape template. One way to solve this is by using more sophisticated shape models. We propose to incorporate shape priors from a shape sub...

  11. Objectness Supervised Merging Algorithm for Color Image Segmentation

    Directory of Open Access Journals (Sweden)

    Haifeng Sima

    2016-01-01

    Full Text Available Ideal color image segmentation needs both low-level cues and high-level semantic features. This paper proposes a two-hierarchy segmentation model based on merging homogeneous superpixels. First, a region-growing strategy is designed for producing homogeneous and compact superpixels in different partitions. Total-variation smoothing features are adopted in the growing procedure for locating real boundaries. Before merging, we define a combined color-texture histogram feature for superpixel description and, meanwhile, a novel objectness feature is proposed to supervise the region merging procedure for reliable segmentation. Both color-texture histograms and objectness are computed to measure regional similarities between region pairs, and the mixed standard deviation of the union features is exploited to define the stopping criterion for the merging process. Experimental results on a popular benchmark dataset demonstrate the better segmentation performance of the proposed model compared to other well-known segmentation algorithms.
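
    The merging step can be sketched with histogram intersection as the region similarity and a fixed similarity threshold as the stopping rule. This is a simplification for illustration: the paper combines color-texture histograms with an objectness feature and a standard-deviation-based criterion, none of which is reproduced here, and averaging merged histograms stands in for proper size weighting.

    ```python
    # Greedy region merging by histogram similarity (illustrative simplification).

    def hist_intersection(h1, h2):
        """Similarity of two normalized histograms: sum of bin-wise minima, in [0, 1]."""
        return sum(min(a, b) for a, b in zip(h1, h2))

    def merge_regions(hists, threshold=0.7):
        """Repeatedly merge the most similar pair of regions until no pair's
        similarity exceeds the threshold."""
        hists = [h[:] for h in hists]
        while len(hists) > 1:
            pairs = [(hist_intersection(hists[i], hists[j]), i, j)
                     for i in range(len(hists)) for j in range(i + 1, len(hists))]
            best, i, j = max(pairs)
            if best < threshold:
                break
            merged = [(a + b) / 2 for a, b in zip(hists[i], hists[j])]
            hists = [h for k, h in enumerate(hists) if k not in (i, j)] + [merged]
        return hists

    # Two near-identical regions merge; the dissimilar third survives on its own.
    print(len(merge_regions([[0.5, 0.5, 0, 0], [0.45, 0.55, 0, 0], [0, 0, 0.5, 0.5]])))
    # → 2
    ```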

  12. GPU accelerated fuzzy connected image segmentation by using CUDA.

    Science.gov (United States)

    Zhuge, Ying; Cao, Yong; Miller, Robert W

    2009-01-01

    Image segmentation techniques using fuzzy connectedness principles have shown their effectiveness in segmenting a variety of objects in several large applications in recent years. However, one problem of these algorithms has been their excessive computational requirements when processing large image datasets. Nowadays, commodity graphics hardware provides high parallel computing power. In this paper, we present a parallel fuzzy connected image segmentation algorithm on Nvidia's Compute Unified Device Architecture (CUDA) platform for segmenting large medical image data sets. Our experiments, based on three data sets of small, medium, and large size, demonstrate the efficiency of the parallel algorithm, which achieves speed-up factors of 7.2x, 7.3x, and 14.4x, respectively, for the three data sets over the sequential implementation of the fuzzy connected image segmentation algorithm on CPU.
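
    The computation being parallelized can be stated sequentially as a best-weakest-link propagation from a seed: a path's strength is its weakest affinity, and each pixel's connectedness is the strongest path from the seed, computable with a Dijkstra-like scan. The sketch below is a CPU-only illustration on a tiny grid; the intensity-similarity affinity is a simple stand-in, not the paper's definition.

    ```python
    # Sequential fuzzy connectedness from a single seed (illustrative, CPU-only).
    import heapq

    def fuzzy_connectedness(image, seed):
        """Return a grid of connectedness values: for each pixel, the strength of the
        strongest path from the seed, where a path is as strong as its weakest link."""
        h, w = len(image), len(image[0])

        def affinity(a, b):
            # Higher when intensities are similar (simple choice for illustration).
            return 1.0 / (1.0 + abs(a - b))

        conn = [[0.0] * w for _ in range(h)]
        sy, sx = seed
        conn[sy][sx] = 1.0
        pq = [(-1.0, sy, sx)]                      # max-heap via negated strengths
        while pq:
            neg, y, x = heapq.heappop(pq)
            if -neg < conn[y][x]:                  # stale entry
                continue
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w:
                    s = min(-neg, affinity(image[y][x], image[ny][nx]))
                    if s > conn[ny][nx]:
                        conn[ny][nx] = s
                        heapq.heappush(pq, (-s, ny, nx))
        return conn

    conn = fuzzy_connectedness([[10, 10, 100], [10, 10, 100]], seed=(0, 0))
    ```

    Pixels similar to the seed end up with connectedness near 1.0, while the bright right column is cut off by its weak link; thresholding `conn` yields the segmented object.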

  13. Automatic Segmentation of Vessels in In-Vivo Ultrasound Scans

    DEFF Research Database (Denmark)

    Tamimi-Sarnikowski, Philip; Brink-Kjær, Andreas; Moshavegh, Ramin

    2017-01-01

    Ultrasound has become highly popular for monitoring atherosclerosis by scanning the carotid artery. The screening involves measuring the thickness of the vessel wall and the diameter of the lumen. An automatic segmentation of the vessel lumen can enable the determination of the lumen diameter. This paper presents a fully automatic segmentation algorithm for robustly segmenting the vessel lumen in longitudinal B-mode ultrasound images. The automatic segmentation is performed using a combination of B-mode and power Doppler images. The proposed algorithm includes a series of preprocessing steps and performs a vessel segmentation by use of the marker-controlled watershed transform. The ultrasound images used in the study were acquired using the bk3000 ultrasound scanner (BK Ultrasound, Herlev, Denmark) with two transducers, "8L2 Linear" and "10L2w Wide Linear" (BK Ultrasound, Herlev, Denmark). The algorithm...

  14. Fully convolutional network with cluster for semantic segmentation

    Science.gov (United States)

    Ma, Xiao; Chen, Zhongbi; Zhang, Jianlin

    2018-04-01

    At present, image semantic segmentation technology is an active research topic for scientists in the fields of computer vision and artificial intelligence. In particular, the extensive research on deep neural networks in image recognition has greatly promoted the development of semantic segmentation. This paper puts forward a method based on a fully convolutional network combined with the k-means clustering algorithm. The clustering algorithm, which uses the image's low-level features and initializes the cluster centers from a superpixel segmentation, is proposed to correct the set of points with low reliability, which are most likely misclassified, using the set of points with high reliability in each cluster region. This method refines the segmentation of the target contour and improves the accuracy of the image segmentation.
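
    The k-means step at the core of the refinement can be sketched in plain Python. This is a generic k-means on feature vectors, not the paper's superpixel-seeded variant: the seeding here simply takes the first k points, where the paper initializes centers from a superpixel segmentation.

    ```python
    # Plain k-means on feature vectors (illustrative; first k points seed the centers).

    def kmeans(points, k, iters=20):
        """Return (assignments, centers) after a fixed number of Lloyd iterations."""
        centers = [list(p) for p in points[:k]]
        assign = [0] * len(points)
        for _ in range(iters):
            # Assignment step: each point goes to its nearest center.
            for i, p in enumerate(points):
                assign[i] = min(range(k),
                                key=lambda c: sum((a - b) ** 2
                                                  for a, b in zip(p, centers[c])))
            # Update step: each center moves to the mean of its members.
            for c in range(k):
                members = [points[i] for i in range(len(points)) if assign[i] == c]
                if members:
                    centers[c] = [sum(col) / len(members) for col in zip(*members)]
        return assign, centers

    assign, centers = kmeans([(0, 0), (0, 1), (10, 10), (10, 11)], k=2)
    print(assign)
    # → [0, 0, 1, 1]
    ```

    In the paper's setting, the points would be per-pixel low-level feature vectors, and cluster membership is then used to flip low-confidence network predictions toward the majority label of their cluster.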

  15. BlobContours: adapting Blobworld for supervised color- and texture-based image segmentation

    Science.gov (United States)

    Vogel, Thomas; Nguyen, Dinh Quyen; Dittmann, Jana

    2006-01-01

    Extracting features is the first and one of the most crucial steps in the modern image retrieval process. While the color features and the texture features of digital images can be extracted rather easily, the shape features and the layout features depend on reliable image segmentation. Unsupervised image segmentation, often used in image analysis, works on a merely syntactic basis. That is, what an unsupervised segmentation algorithm can segment is only regions, not objects. To obtain high-level objects, which is desirable in image retrieval, human assistance is needed. Supervised image segmentation schemes can improve the reliability of segmentation and segmentation refinement. In this paper we propose a novel interactive image segmentation technique that combines the reliability of a human expert with the precision of automated image segmentation. The iterative procedure can be considered a variation on the Blobworld algorithm introduced by Carson et al. from the EECS Department, University of California, Berkeley. Starting with an initial segmentation as provided by the Blobworld framework, our algorithm, namely BlobContours, gradually updates it by recalculating every blob, based on the original features and the updated number of Gaussians. Since the original algorithm was hardly designed for interactive processing, we had to consider additional requirements for realizing a supervised segmentation scheme on the basis of Blobworld. Increasing the transparency of the algorithm by applying user-controlled iterative segmentation, providing different types of visualization for displaying the segmented image, and decreasing the computational time of segmentation are three major requirements which are discussed in detail.

  16. Managerial segmentation of service offerings in work commuting.

    Science.gov (United States)

    2015-03-01

    Methodology to efficiently segment markets for public transportation offerings has been introduced and exemplified in an : application to an urban travel corridor in which high tech companies predominate. The principal objective has been to introduce...

  17. Pyramidal approach to license plate segmentation

    Science.gov (United States)

    Postolache, Alexandru; Trecat, Jacques C.

    1996-07-01

    Car identification is a goal in traffic control, transport planning, travel time measurement, managing parking lot traffic, and so on. Most car identification algorithms contain a standalone plate segmentation process followed by reading of the plate contents. A pyramidal algorithm for license plate segmentation, looking for textured regions, has been developed on a PC-based system running Unix. It can be used directly in applications not requiring real time. When input images are relatively small, real-time performance is in fact achieved by the algorithm. When using large images, porting the algorithm to special digital signal processors can easily preserve real-time performance. Experimental results, for stationary and moving cars in outdoor scenes, showed high accuracy and high scores in detecting the plate. The algorithm also deals with cases where many character strings are present in the image, not only the one corresponding to the plate. This is done by means of a constrained classification of texture regions.

  18. What are Segments in Google Analytics

    Science.gov (United States)

    Segments find all sessions that meet a specific condition. You can then apply this segment to any report in Google Analytics (GA). Segments are a way of identifying sessions and users while filters identify specific events, like pageviews.

  19. Graph run-length matrices for histopathological image segmentation.

    Science.gov (United States)

    Tosun, Akif Burak; Gunduz-Demir, Cigdem

    2011-03-01

    The histopathological examination of tissue specimens is essential for cancer diagnosis and grading. However, this examination is subject to a considerable amount of observer variability as it mainly relies on visual interpretation of pathologists. To alleviate this problem, it is very important to develop computational quantitative tools, for which image segmentation constitutes the core step. In this paper, we introduce an effective and robust algorithm for the segmentation of histopathological tissue images. This algorithm incorporates the background knowledge of the tissue organization into segmentation. For this purpose, it quantifies spatial relations of cytological tissue components by constructing a graph and uses this graph to define new texture features for image segmentation. This new texture definition makes use of the idea of gray-level run-length matrices. However, it considers the runs of cytological components on a graph to form a matrix, instead of considering the runs of pixel intensities. Working with colon tissue images, our experiments demonstrate that the texture features extracted from "graph run-length matrices" lead to high segmentation accuracies, also providing a reasonable number of segmented regions. Compared with four other segmentation algorithms, the results show that the proposed algorithm is more effective in histopathological image segmentation.
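
    The run-length idea can be shown on a 1-D sequence of node labels: count how often each (label, run length) pair occurs. The paper computes runs of cytological component labels along a graph of tissue components; the sketch below reduces that to a single path of nodes for simplicity, which is an assumption, not the paper's full graph construction.

    ```python
    # Run-length counts over a sequence of node labels (1-D stand-in for graph runs).
    from collections import Counter

    def run_length_matrix(labels):
        """Return a Counter mapping (label, run_length) -> number of such runs,
        the 1-D analogue of a run-length matrix."""
        runs = Counter()
        i = 0
        while i < len(labels):
            j = i
            while j < len(labels) and labels[j] == labels[i]:
                j += 1                      # extend the current run
            runs[(labels[i], j - i)] += 1
            i = j
        return runs

    # e.g. nuclei / lumen components along a path through the tissue graph:
    print(run_length_matrix(["n", "n", "l", "n", "n", "n"]))
    ```

    Texture features such as short-run emphasis or long-run emphasis are then derived from this matrix, exactly as with gray-level run-length matrices but over component labels instead of pixel intensities.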

  20. Segmentation and packaging reactor vessels internals

    International Nuclear Information System (INIS)

    Boucau, Joseph

    2014-01-01

    Document available in abstract form only, full text follows: With more than 25 years of experience in the development of reactor vessel internals and reactor vessel segmentation and packaging technology, Westinghouse has accumulated significant know-how in the reactor dismantling market. The primary challenges of a segmentation and packaging project are to separate the highly activated materials from the less-activated materials and package them into appropriate containers for disposal. Since disposal cost is a key factor, it is important to plan and optimize waste segmentation and packaging. The choice of the optimum cutting technology is also important for successful project implementation and depends on some specific constraints. Detailed 3-D modeling is the basis for tooling design and provides invaluable support in determining the optimum strategy for component cutting and disposal in waste containers, taking into account the radiological and packaging constraints. The usual method is to start at the end of the process, by evaluating handling of the containers, the waste disposal requirements, and what type and size of containers are available for the different disposal options, and working backwards to select a cutting method and finally the cut geometry required. The 3-D models can include intelligent data such as weight, center of gravity, curie content, etc., for each segmented piece, which is very useful when comparing various cutting, handling and packaging options. The detailed 3-D analyses and thorough characterization assessment can draw attention to material potentially subject to clearance, either directly or after a certain period of decay, to allow recycling and further disposal cost reduction. Westinghouse has developed a variety of special cutting and handling tools, support fixtures, service bridges, water filtration systems, video-monitoring systems and customized rigging, all of which are required for a successful reactor vessel internals segmentation and packaging project.

  1. Metal segmenting using abrasive and reciprocating saws

    International Nuclear Information System (INIS)

    Allen, R.P.; Fetrow, L.K.; Haun, F.E. Jr.

    1987-06-01

    This paper evaluates a lightweight, high-power abrasive saw for segmenting radioactively contaminated metal components. A unique application of a reciprocating mechanical saw for the remote disassembly of equipment in a hot cell is also described. The results of this work suggest that use of these techniques for selected remote sectioning applications could minimize operational and access problems and be very cost-effective in comparison with other inherently faster sectioning methods. 2 refs., 7 figs

  2. Transmission Line Resonator Segmented with Series Capacitors

    DEFF Research Database (Denmark)

    Zhurbenko, Vitaliy; Boer, Vincent; Petersen, Esben Thade

    2016-01-01

    Transmission line resonators are often used as coils in high-field MRI. Due to the distributed nature of such resonators, coils based on them produce an inhomogeneous field. This work investigates the application of series capacitors to improve field homogeneity along the resonator. The equations for optimal values of evenly distributed capacitors are presented. The performances of the segmented resonator and a regular transmission line resonator are compared.

  3. Fast Streaming 3D Level set Segmentation on the GPU for Smooth Multi-phase Segmentation

    DEFF Research Database (Denmark)

    Sharma, Ojaswa; Zhang, Qin; Anton, François

    2011-01-01

    Level set method based segmentation provides an efficient tool for topological and geometrical shape handling, but it is slow due to high computational burden. In this work, we provide a framework for streaming computations on large volumetric images on the GPU. A streaming computational model...

  4. Automatic blood vessel based-liver segmentation using the portal phase abdominal CT

    Science.gov (United States)

    Maklad, Ahmed S.; Matsuhiro, Mikio; Suzuki, Hidenobu; Kawata, Yoshiki; Niki, Noboru; Shimada, Mitsuo; Iinuma, Gen

    2018-02-01

    Liver segmentation is the basis for computer-based planning of hepatic surgical interventions. In the diagnosis and analysis of hepatic diseases and in surgery planning, automatic segmentation of the liver is highly important. Blood vessels (BVs) have shown high performance in liver segmentation. In our previous work, we developed a semi-automatic method that segments the liver through the portal phase abdominal CT images in two stages. The first stage was interactive segmentation of abdominal blood vessels (ABVs) and their subsequent classification into hepatic (HBVs) and non-hepatic (non-HBVs). This stage had five interactions: a selective threshold for bone segmentation, selection of two seed points for kidney segmentation, selection of the inferior vena cava (IVC) entrance for starting ABVs segmentation, identification of the portal vein (PV) entrance to the liver, and identification of the IVC exit for classifying HBVs from other ABVs (non-HBVs). The second stage is automatic segmentation of the liver based on the segmented ABVs as described in [4]. Toward full automation, we developed a method [5] that segments ABVs automatically, tackling the first three interactions. In this paper, we propose full automation of the classification of ABVs into HBVs and non-HBVs and, consequently, full automation of the liver segmentation proposed in [4]. Results illustrate that the method is effective at segmentation of the liver through the portal abdominal CT images.

  5. CLG for Automatic Image Segmentation

    OpenAIRE

    Christo Ananth; S.Santhana Priya; S.Manisha; T.Ezhil Jothi; M.S.Ramasubhaeswari

    2017-01-01

    This paper proposes an automatic segmentation method which effectively combines Active Contour Model, Live Wire method and Graph Cut approach (CLG). The aim of Live wire method is to provide control to the user on segmentation process during execution. Active Contour Model provides a statistical model of object shape and appearance to a new image which are built during a training phase. In the graph cut technique, each pixel is represented as a node and the distance between those nodes is rep...

  6. Market segmentation, targeting and positioning

    OpenAIRE

    Camilleri, Mark Anthony

    2017-01-01

    Businesses may not be in a position to satisfy all of their customers, every time. It may prove difficult to meet the exact requirements of each individual customer. People do not have identical preferences, so rarely does one product completely satisfy everyone. Many companies may usually adopt a strategy that is known as target marketing. This strategy involves dividing the market into segments and developing products or services to these segments. A target marketing strategy is focused on ...

  7. Recognition Using Classification and Segmentation Scoring

    National Research Council Canada - National Science Library

    Kimball, Owen; Ostendorf, Mari; Rohlicek, Robin

    1992-01-01

    We describe an approach to connected word recognition that allows the use of segmental information through an explicit decomposition of the recognition criterion into classification and segmentation scoring...

  8. Single-Molecule FISH Reveals Non-selective Packaging of Rift Valley Fever Virus Genome Segments.

    Directory of Open Access Journals (Sweden)

    Paul J Wichgers Schreur

    2016-08-01

    The bunyavirus genome comprises a small (S), medium (M), and large (L) RNA segment of negative polarity. Although genome segmentation confers evolutionary advantages by enabling genome reassortment events with related viruses, it also complicates genome replication and packaging. Accumulating evidence suggests that genomes of viruses with eight or more genome segments are incorporated into virions by highly selective processes. Remarkably, little is known about the genome packaging process of the tri-segmented bunyaviruses. Here, we evaluated, by single-molecule RNA fluorescence in situ hybridization (FISH), the intracellular spatio-temporal distribution and replication kinetics of the Rift Valley fever virus (RVFV) genome and determined the segment composition of mature virions. The results reveal that the RVFV genome segments start to replicate near the site of infection before spreading and replicating throughout the cytoplasm, followed by translocation to the virion assembly site at the Golgi network. Although the average intracellular ratio of S, M and L genome segments approached 1:1:1, major differences in genome segment ratios were observed among cells. We also observed a significant number of cells lacking evidence of M-segment replication. Analysis of two-segmented replicons and four-segmented viruses subsequently confirmed the previous notion that Golgi recruitment is mediated by the Gn glycoprotein. The absence of colocalization of the different segments in the cytoplasm and the successful rescue of a tri-segmented variant with a codon-shuffled M-segment suggested that inter-segment interactions are unlikely to drive the copackaging of the different segments into a single virion. The latter was confirmed by direct visualization of RNPs inside mature virions, which showed that the majority of virions lack one or more genome segments. Altogether, this study suggests that RVFV genome packaging is a non-selective process.

  9. TIGRESS: TRIUMF-ISAC gamma-ray escape-suppressed spectrometer

    Science.gov (United States)

    Svensson, C. E.; Amaudruz, P.; Andreoiu, C.; Andreyev, A.; Austin, R. A. E.; Ball, G. C.; Bandyopadhyay, D.; Boston, A. J.; Chakrawarthy, R. S.; Chen, A. A.; Churchman, R.; Drake, T. E.; Finlay, P.; Garrett, P. E.; Grinyer, G. F.; Hackman, G.; Hyland, B.; Jones, B.; Kanungo, R.; Maharaj, R.; Martin, J. P.; Morris, D.; Morton, A. C.; Pearson, C. J.; Phillips, A. A.; Ressler, J. J.; Roy, R.; Sarazin, F.; Schumaker, M. A.; Scraggs, H. C.; Smith, M. B.; Starinsky, N.; Valiente-Dobón, J. J.; Waddington, J. C.; Watters, L. M.

    2005-10-01

    The TRIUMF-ISAC gamma-ray escape-suppressed spectrometer (TIGRESS) is a new γ-ray detector array being developed for use at TRIUMF's Isotope Separator and Accelerator (ISAC) radioactive ion beam facility. TIGRESS will comprise twelve 32-fold segmented clover-type HPGe detectors coupled with 20-fold segmented modular Compton suppression shields and custom digital signal processing electronics. This paper provides an overview of the TIGRESS project and progress in its development to date.

  10. Planning and delivering high doses to targets surrounding the spinal cord at the lower neck and upper mediastinal levels: static beam-segmentation technique executed with a multileaf collimator

    International Nuclear Information System (INIS)

    Neve, W. de; Wagter, C. de; Jaeger, K. de; Thienpont, M.; Colle, C.; Derycke, S.; Schelfhout, J.

    1996-01-01

    Background and purpose. It remains a technical challenge to limit the dose to the spinal cord below tolerance if, in head and neck or thyroid cancer, the planning target volume reaches to a level below the shoulders. In order to avoid these dose limitations, we developed a standard plan involving beam intensity modulation (BIM) executed by a static technique of beam segmentation. In this standard plan, many machine parameters (gantry angles, couch position, relative beam and segment weights) as well as the beam segmentation rules were identical for all patients. Materials and methods. The standard plan involved: the use of static beams with a single isocenter; BIM by field segmentation executable with a standard Philips multileaf collimator; virtual simulation and dose computation on a general 3D-planning system (Sherouse's GRATIS®); and heuristic computation of segment intensities with optimization (improving the dose distribution and reducing the execution time) by human intelligence. The standard plan used 20 segments spread over 8 gantry angles plus 2 non-segmented wedged beams (2 gantry angles). Results. The dose that could be achieved at the lowest target voxel, without exceeding the tolerance of the spinal cord (50 Gy at the highest voxel), was 70-80 Gy. The in-target 3D dose inhomogeneity was ∼25%. The shortest execution time of a treatment (22 segments) on a patient (unpublished) was 25 min. Conclusions. A heuristic model has been developed and investigated to obtain a 3D concave dose distribution applicable to irradiating targets in the lower neck and upper mediastinal regions. The technique efficiently spares the spinal cord and allows the delivery of higher target doses than conventional techniques. It can be planned as a standard plan using conventional 3D-planning technology. The routine clinical implementation is performed with commercially available equipment, however, at the expense of extended execution times

  11. 2D.03: IMPROVING DIAGNOSTIC STRATEGY IN PATIENTS WITH LONG-STANDING HYPERTENSION, CHEST PAIN AND NORMAL RESTING ECG: VALUE OF THE EXERCISE HIGH-FREQUENCY QRS VERSUS ST-SEGMENT ANALYSIS.

    Science.gov (United States)

    Conti, A; Bianchi, S; Grifoni, C; Trausi, F; Angeli, E; Paolini, D; Catarzi, S; Perrotta, M E; Covelli, A; Renzi, N; Bertolini, P; Mazzucchelli, M

    2015-06-01

    The novel exercise computer-assisted high-frequency QRS analysis (ex-HF/QRS) has demonstrated improved sensitivity and specificity over the conventional exercise ST-segment/ECG analysis (ex-ST/ECG) in the detection of myocardial ischemia. The aim of the present study was to test the added diagnostic value of the ex-HF/QRS in patients with hypertension and chest pain (CP) versus the conventional ex-ST/ECG analysis alone. Patients with long-standing hypertension, CP, and normal ECG, troponin and echocardiography were enrolled. All patients underwent the ex-ST/ECG and ex-HF/QRS. A decrease ≥50% of the ex-HF/QRS signal intensity recorded in at least two contiguous leads was considered an index of ischaemia, as was ST-segment depression ≥2 mm, or ≥1 mm with CP, on ex-ST/ECG. Exclusion criteria were QRS duration ≥120 ms and inability to exercise. The end-point was the composite of coronary stenosis >50% or acute coronary syndrome, revascularization, or cardiovascular death at 3-month follow-up. Six hundred thirty-one patients were enrolled (age 61±15 y). Patients reached 88±10% of the age-adjusted maximal predicted heart rate, and the maximal systolic blood pressure was 169±22 mmHg. Twenty-seven patients achieved the end-point. On multivariate analysis, both the ex-ST/ECG and ex-HF/QRS were predictors of the end-point. The ex-HF/QRS showed higher sensitivity (88% vs 50%; p = 0.003), lower specificity (77% vs 97%; p = 0.245) and comparable negative predictive value (99% vs 99%; p = NS) when compared to ex-ST/ECG. Receiver operating characteristic (ROC) analysis showed the incremental diagnostic value of the ex-HF/QRS (area: 0.64, 95% Confidence Interval, CI 0.51-0.77) over the conventional ex-ST/ECG (0.60, CI 0.52-0.66) and the Chest Pain Score (0.53, CI 0.48-0.59); p = NS on pairwise C-statistic. In patients with long-standing hypertension and CP submitted to risk stratification with exercise tolerance test, the novel ex

  12. Neutron multiplicity of fission fragments

    Energy Technology Data Exchange (ETDEWEB)

    Abdelrahman, Y S [Physics Department, Mu'tah University, Al-Karak (Jordan)]

    1995-10-01

    The total average neutron multiplicity of the fission fragments produced by the spontaneous fission of {sup 248}Cm has been measured. The measurement was made using a new experimental technique, which mainly depends on {gamma}-{gamma} coincidence using a very high resolution high-purity germanium (HPGe) detector. 2 figs.

  13. Malignant pleural mesothelioma segmentation for photodynamic therapy planning.

    Science.gov (United States)

    Brahim, Wael; Mestiri, Makram; Betrouni, Nacim; Hamrouni, Kamel

    2018-04-01

    Medical imaging modalities such as computed tomography (CT), combined with computer-aided diagnostic processing, have already become an important part of clinical routine, especially for pleural diseases. The segmentation of the thoracic cavity represents an extremely important task in medical imaging for different reasons. Multiple features can be extracted by analyzing the thoracic cavity space, and these features are signs of pleural diseases, including the malignant pleural mesothelioma (MPM) which is the main focus of our research. This paper presents a method that detects the MPM in the thoracic cavity and plans the photodynamic therapy in the preoperative phase. This is achieved by using a texture analysis of the MPM region combined with a thoracic cavity segmentation method. The algorithm to segment the thoracic cavity consists of multiple stages. First, the rib cage structure is segmented using various image processing techniques. We used the segmented rib cage to detect feature points which represent the thoracic cavity boundaries. Next, the proposed method segments the structures of the inner thoracic cage and fits 2D closed curves to the detected pleural cavity features in each slice. The missing bone structures are interpolated using prior knowledge from manual segmentation performed by an expert. Next, the tumor region is segmented inside the thoracic cavity using a texture analysis approach. Finally, the contact surface between the tumor region and the thoracic cavity curves is reconstructed in order to plan the photodynamic therapy. Using the adjusted output of the thoracic cavity segmentation method and the MPM segmentation method, we evaluated the generated contact surface by comparing it to the ground truth. For this evaluation, we used 10 CT scans with pathologically confirmed MPM at stages 1 and 2. We obtained a high similarity rate between the manually planned surface and the one produced by our proposed method. The average value of the Jaccard index

  14. Development of a segmented grating mount system for FIREX-1

    International Nuclear Information System (INIS)

    Ezaki, Y; Tabata, M; Kihara, M; Horiuchi, Y; Endo, M; Jitsuno, T

    2008-01-01

    A mount system for segmented meter-sized gratings has been developed, with a high-precision grating support mechanism and a drive mechanism that minimize both deformation of the optical surfaces and misalignment when setting a segmented grating, so as to obtain sufficient performance from the pulse compressor. From analytical calculations, deformation of the grating surface is less than λ/20 RMS, and the estimated drive resolution for piston and tilt motion of the segmented grating is λ/20; both are compliant with the requirements for the rear-end subsystem of FIREX-1

  15. Methods of evaluating segmentation characteristics and segmentation of major faults

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Kie Hwa; Chang, Tae Woo; Kyung, Jai Bok [Seoul National Univ., Seoul (Korea, Republic of)] (and others)

    2000-03-15

    Seismological, geological, and geophysical studies were made for reasonable segmentation of the Ulsan fault, with the following results. One- and two-dimensional electrical surveys clearly revealed that the fault fracture zone enlarges systematically northward and southward from the vicinity of Mohwa-ri, indicating that Mohwa-ri is at a seismic segment boundary. Field geological survey and microscopic observation of fault gouge indicate that the Quaternary faults in the area are reactivated products of preexisting faults. A trench survey of the Chonbuk fault at Galgok-ri revealed thrust faults and cumulative vertical displacement due to faulting during the late Quaternary, with about 1.1-1.9 m displacement per event; the latest event occurred 14000 to 25000 yrs. BP. The seismic survey showed that the basement surface is cut by numerous reverse faults and indicated the possibility that the boundary between Kyeongsangbukdo and Kyeongsangnamdo may be a segment boundary.

  16. Segmentation by Large Scale Hypothesis Testing - Segmentation as Outlier Detection

    DEFF Research Database (Denmark)

    Darkner, Sune; Dahl, Anders Lindbjerg; Larsen, Rasmus

    2010-01-01

    We propose a novel and efficient way of performing local image segmentation. For many applications a threshold of pixel intensities is sufficient, but determining the appropriate threshold value can be difficult. In cases with large global intensity variation the threshold value has to be adapted locally. We propose a method based on large scale hypothesis testing with a consistent method for selecting an appropriate threshold for the given data. By estimating the background distribution we characterize the segment of interest as a set of outliers with a certain probability based on the estimated distribution. The method is demonstrated on particle images acquired through a microscope, and we show how the method can handle transparent particles with significant glare points. The method generalizes to other problems, as illustrated by applying it to camera calibration images and MRI of the midsagittal plane for gray and white matter separation and segmentation
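
The outlier formulation lends itself to a compact sketch. The following is a minimal illustration, not the authors' implementation: the background intensity distribution is estimated from the pixels themselves (a robust estimator would be preferable in practice, since the segment biases the estimate), and a pixel is labelled as belonging to the segment of interest when it is an outlier at a Bonferroni-corrected significance level. The function name and the Gaussian background assumption are ours.

```python
from statistics import NormalDist

def segment_as_outliers(pixels, alpha=0.05):
    """Label each pixel True (segment) when it is an outlier under a
    Gaussian background model, at a Bonferroni-corrected level alpha."""
    n = len(pixels)
    mu = sum(pixels) / n
    sd = (sum((p - mu) ** 2 for p in pixels) / (n - 1)) ** 0.5
    # two-sided critical z-value for the corrected level alpha / n
    z_crit = NormalDist().inv_cdf(1 - alpha / (2 * n))
    return [abs(p - mu) / sd > z_crit for p in pixels]
```

The per-pixel correction keeps the probability of flagging any background pixel at roughly alpha over the whole image, which is the large-scale-testing idea the abstract alludes to.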

  17. Methods of evaluating segmentation characteristics and segmentation of major faults

    International Nuclear Information System (INIS)

    Lee, Kie Hwa; Chang, Tae Woo; Kyung, Jai Bok

    2000-03-01

    Seismological, geological, and geophysical studies were made for reasonable segmentation of the Ulsan fault, with the following results. One- and two-dimensional electrical surveys clearly revealed that the fault fracture zone enlarges systematically northward and southward from the vicinity of Mohwa-ri, indicating that Mohwa-ri is at a seismic segment boundary. Field geological survey and microscopic observation of fault gouge indicate that the Quaternary faults in the area are reactivated products of preexisting faults. A trench survey of the Chonbuk fault at Galgok-ri revealed thrust faults and cumulative vertical displacement due to faulting during the late Quaternary, with about 1.1-1.9 m displacement per event; the latest event occurred 14000 to 25000 yrs. BP. The seismic survey showed that the basement surface is cut by numerous reverse faults and indicated the possibility that the boundary between Kyeongsangbukdo and Kyeongsangnamdo may be a segment boundary

  18. A spatiotemporal-based scheme for efficient registration-based segmentation of thoracic 4-D MRI.

    Science.gov (United States)

    Yang, Y; Van Reeth, E; Poh, C L; Tan, C H; Tham, I W K

    2014-05-01

    Dynamic three-dimensional (3-D) (four-dimensional, 4-D) magnetic resonance (MR) imaging is gaining importance in the study of pulmonary motion for respiratory diseases and of pulmonary tumor motion for radiotherapy. To perform quantitative analysis using 4-D MR images, segmentation of anatomical structures such as the lung and pulmonary tumor is required. Manual segmentation of entire thoracic 4-D MRI data, which typically contains many 3-D volumes acquired over several breathing cycles, is extremely tedious and time consuming, and suffers from high user variability. This calls for the development of new automated segmentation schemes for 4-D MRI data. Registration-based segmentation, which uses automatic registration methods for segmentation, has been shown to be an accurate way to segment structures in 4-D data series. However, directly applying registration-based segmentation to 4-D MRI series lacks efficiency. Here we propose an automated 4-D registration-based segmentation scheme based on spatiotemporal information for the segmentation of thoracic 4-D MR lung images. The proposed scheme saved up to 95% of the computation while achieving segmentation accuracy comparable to directly applying registration-based segmentation to the 4-D dataset. The scheme facilitates rapid 3-D/4-D visualization of the lung and tumor motion and, potentially, the tracking of the tumor during radiation delivery.

  19. White blood cell segmentation by color-space-based k-means clustering.

    Science.gov (United States)

    Zhang, Congcong; Xiao, Xiaoyan; Li, Xiaomei; Chen, Ying-Jie; Zhen, Wu; Chang, Jun; Zheng, Chengyun; Liu, Zhi

    2014-09-01

    White blood cell (WBC) segmentation, which is important for cytometry, is a challenging issue because of the morphological diversity of WBCs and the complex and uncertain background of blood smear images. This paper proposes a novel method for the nucleus and cytoplasm segmentation of WBCs for cytometry. A color adjustment step was also introduced before segmentation. Color space decomposition and k-means clustering were combined for segmentation. A database including 300 microscopic blood smear images was used to evaluate the performance of our method. The proposed segmentation method achieves 95.7% and 91.3% overall accuracy for nucleus segmentation and cytoplasm segmentation, respectively. Experimental results demonstrate that the proposed method can segment WBCs effectively with high accuracy.
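
The core clustering step can be sketched compactly. Below is a minimal pure-Python k-means on a single decomposed color channel (for example, one component of a Lab-like space, where nucleus, cytoplasm and background tend to separate); the paper combines several channels and a color-adjustment step, and all names here are illustrative.

```python
def kmeans_1d(values, k=3, iters=50):
    """Cluster scalar channel values into k groups (e.g., nucleus,
    cytoplasm, background) with Lloyd's algorithm."""
    lo, hi = min(values), max(values)
    # deterministic initialization: centers spread across the value range
    centers = [lo + (i + 0.5) * (hi - lo) / k for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda c: abs(v - centers[c]))
            clusters[nearest].append(v)
        new = [sum(c) / len(c) if c else centers[i]
               for i, c in enumerate(clusters)]
        if new == centers:   # converged
            break
        centers = new
    labels = [min(range(k), key=lambda c: abs(v - centers[c])) for v in values]
    return centers, labels
```

In a full pipeline each pixel's feature would be multi-dimensional, but the assignment/update structure is identical.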

  20. Simultaneous minimizing monitor units and number of segments without leaf end abutment for segmental intensity modulated radiation therapy delivery

    International Nuclear Information System (INIS)

    Li Kaile; Dai Jianrong; Ma Lijun

    2004-01-01

    Leaf end abutment is seldom studied when delivering segmental intensity modulated radiation therapy (IMRT) fields. We developed an efficient leaf sequencing method to eliminate leaf end abutment for segmental IMRT delivery. Our method uses simple matrix and sorting operations to obtain a solution that simultaneously minimizes total monitor units and the number of segments without leaf end abutment between segments. We implemented and demonstrated our method for multiple clinical cases and compared its results with those from an exhaustive search method. We found that our solution without leaf end abutment produced results equivalent to the unconstrained solutions in terms of minimum total monitor units and minimum number of leaf segments. We conclude that leaf end abutment can be avoided without affecting the efficiency of segmental IMRT delivery. The major strength of our method is its simplicity and high computing speed. This potentially provides a useful means for generating segmental IMRT fields that require high spatial resolution or complex intensity distributions
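
For intuition about the monitor-unit objective: the minimum total MU needed to deliver a one-dimensional intensity profile with a unidirectional multileaf sweep is the sum of the profile's positive increments, a classical result in leaf sequencing. The sketch below illustrates that quantity only; it is not the authors' abutment-free sequencing algorithm.

```python
def min_monitor_units(profile):
    """Minimum total monitor units for a 1-D intensity profile delivered
    by a unidirectional leaf sweep: the sum of positive increments."""
    mu = max(profile[0], 0)
    for prev, cur in zip(profile, profile[1:]):
        mu += max(cur - prev, 0)
    return mu
```

Any abutment-free sequencing scheme that still attains this bound, as the abstract claims, is therefore MU-optimal.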

  1. Automatic segmentation of psoriasis lesions

    Science.gov (United States)

    Ning, Yang; Shi, Chenbo; Wang, Li; Shu, Chang

    2014-10-01

    The automatic segmentation of psoriatic lesions has been widely researched in recent years. It is an important step in computer-aided calculation of PASI for the assessment of lesions. Current algorithms can only handle erythema alone or deal only with scaling segmentation; in practice, scaling and erythema are often mixed together. To segment the whole lesion area, this paper proposes an algorithm based on random forests with color and texture features. The algorithm has three steps. In the first step, polarized light is applied during imaging, exploiting the skin's Tyndall effect to eliminate reflections, and the Lab color space is used to match human perception. In the second step, a sliding window and its sub-windows are used to extract texture and color features. In this step, an image-roughness feature is defined so that scaling can easily be separated from normal skin. Finally, random forests are used to ensure the generalization ability of the algorithm. The algorithm gives reliable segmentation results even when images have different lighting conditions and skin types. In the data set offered by Union Hospital, more than 90% of the images can be segmented accurately.
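
The image-roughness feature mentioned in the second step could, for instance, be the local standard deviation in a sliding window, which is high on scaly texture and near zero on smooth skin. The following sketch is an illustrative stand-in, not the paper's exact definition.

```python
def local_roughness(gray, win=3):
    """Per-pixel roughness of a 2-D grayscale image (list of rows):
    standard deviation of the win x win neighbourhood. Border pixels,
    where the window does not fit, are left at 0."""
    h, w = len(gray), len(gray[0])
    r = win // 2
    out = [[0.0] * w for _ in range(h)]
    for y in range(r, h - r):
        for x in range(r, w - r):
            vals = [gray[y + dy][x + dx]
                    for dy in range(-r, r + 1) for dx in range(-r, r + 1)]
            m = sum(vals) / len(vals)
            out[y][x] = (sum((v - m) ** 2 for v in vals) / len(vals)) ** 0.5
    return out
```

A classifier such as a random forest would then receive this roughness value alongside the color features for each window.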

  2. Physical basis for river segmentation from water surface observables

    Science.gov (United States)

    Samine Montazem, A.; Garambois, P. A.; Calmant, S.; Moreira, D. M.; Monnier, J.; Biancamaria, S.

    2017-12-01

    With the advent of satellite missions such as SWOT, we will have access to high-resolution estimates of the elevation, slope and width of the free surface. A segmentation strategy is required in order to sub-sample the data set into reach master points for further hydraulic analyses and inverse modelling. The question that arises is: what is the best node repartition strategy that preserves the hydraulic properties of river flow? The concept of hydraulic visibility introduced by Garambois et al. (2016) is investigated in order to highlight and characterize the spatio-temporal variations of water surface slope and curvature for different flow regimes and reach geometries. We show that free-surface curvature is a powerful proxy for characterizing the hydraulic behavior of a reach, since the concavity of the water surface is driven by variations in channel geometry that impact the hydraulic properties of the flow. We evaluated the performance of three segmentation strategies by means of a well-documented case, that of the Garonne river in France. We conclude that local extrema of free-surface curvature appear as the best candidates for locating segment boundaries for an optimal hydraulic representation of the segmented river. We show that for a given river different segmentation scales are possible, from a fine-scale segmentation driven by fine-scale hydraulics to a large-scale segmentation driven by large-scale geomorphology. The segmentation technique is then applied to high-resolution GPS profiles of free-surface elevation collected on the Negro river basin, a major contributor of the Amazon river. We propose two segmentations: a low-resolution one that can be used for basin hydrology and a higher-resolution one better suited for local hydrodynamic studies.
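
The boundary criterion of local curvature extrema can be sketched with discrete second differences on a regularly sampled elevation profile. This is an illustrative reading of the strategy, with hypothetical function names; real SWOT-like processing would have to cope with noisy, unevenly spaced observations.

```python
def curvature(z):
    """Discrete second difference of a regularly sampled elevation profile."""
    return [z[i - 1] - 2 * z[i] + z[i + 1] for i in range(1, len(z) - 1)]

def segment_boundaries(z, min_mag=1e-9):
    """Profile indices where |curvature| has a local maximum:
    candidate reach (segment) boundaries."""
    c = [abs(v) for v in curvature(z)]
    return [i + 1                       # shift back to profile indexing
            for i in range(1, len(c) - 1)
            if c[i] >= c[i - 1] and c[i] >= c[i + 1] and c[i] > min_mag]
```

On a piecewise-linear profile the curvature is zero inside each reach and spikes where the slope breaks, so the slope-break points come out as boundaries.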

  3. Segmenting articular cartilage automatically using a voxel classification approach

    DEFF Research Database (Denmark)

    Folkesson, Jenny; Dam, Erik B; Olsen, Ole F

    2007-01-01

    We present a fully automatic method for articular cartilage segmentation from magnetic resonance imaging (MRI) which we use as the foundation of a quantitative cartilage assessment. We evaluate our method by comparisons to manual segmentations by a radiologist and by examining the interscan reproducibility of the volume and area estimates. Training and evaluation of the method is performed on a data set consisting of 139 scans of knees with a status ranging from healthy to severely osteoarthritic. This is, to our knowledge, the only fully automatic cartilage segmentation method that has good agreement with manual segmentations, an interscan reproducibility as good as that of a human expert, and enables the separation between healthy and osteoarthritic populations. While high-field scanners offer high-quality imaging from which the articular cartilage has been evaluated extensively using manual…

  4. Skip segment Hirschsprung disease and Waardenburg syndrome

    Directory of Open Access Journals (Sweden)

    Erica R. Gross

    2015-04-01

    Skip segment Hirschsprung disease describes a segment of ganglionated bowel between two segments of aganglionated bowel. It is a rare phenomenon that is difficult to diagnose. We describe a recent case of skip segment Hirschsprung disease in a neonate with a family history of Waardenburg syndrome and the genetic profile that was identified.

  5. U.S. Army Custom Segmentation System

    Science.gov (United States)

    2007-06-01

    The basis for segmentation is individual or intergroup differences in response to marketing-mix variables. Presumptions about segments: • different demands in a product or service category; • different responses to changes in the marketing mix. Criteria for segments: • the segments must exist in the environment

  6. Skip segment Hirschsprung disease and Waardenburg syndrome

    OpenAIRE

    Gross, Erica R.; Geddes, Gabrielle C.; McCarrier, Julie A.; Jarzembowski, Jason A.; Arca, Marjorie J.

    2015-01-01

    Skip segment Hirschsprung disease describes a segment of ganglionated bowel between two segments of aganglionated bowel. It is a rare phenomenon that is difficult to diagnose. We describe a recent case of skip segment Hirschsprung disease in a neonate with a family history of Waardenburg syndrome and the genetic profile that was identified.

  7. Active noise canceling system for mechanically cooled germanium radiation detectors

    Science.gov (United States)

    Nelson, Karl Einar; Burks, Morgan T

    2014-04-22

    A microphonics noise cancellation system and method for improving the energy resolution for mechanically cooled high-purity Germanium (HPGe) detector systems. A classical adaptive noise canceling digital processing system using an adaptive predictor is used in an MCA to attenuate the microphonics noise source making the system more deployable.
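
An adaptive predictor of the classical noise-canceling type can be sketched with a least-mean-squares (LMS) filter: a reference input correlated with the microphonic disturbance (e.g., a vibration sensor on the cryocooler) is used to predict and subtract the noise from the detector channel. This is a generic LMS sketch under those assumptions, not the patented MCA implementation.

```python
def lms_cancel(primary, reference, taps=4, mu=0.05):
    """LMS adaptive noise canceller: the filter learns to predict the
    component of `primary` that is correlated with `reference`; the
    prediction error is the cleaned output signal."""
    w = [0.0] * taps
    cleaned = []
    for n in range(len(primary)):
        # most recent `taps` reference samples (zero-padded at the start)
        x = [reference[n - k] if n - k >= 0 else 0.0 for k in range(taps)]
        y = sum(wi * xi for wi, xi in zip(w, x))    # predicted noise
        e = primary[n] - y                          # cleaned sample
        w = [wi + 2 * mu * e * xi for wi, xi in zip(w, x)]
        cleaned.append(e)
    return cleaned
```

Any γ-ray pulses in `primary` are uncorrelated with the reference, so they pass through in `e` while the periodic cooler noise is attenuated.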

  8. The issue of gamma spectral system sourceless object calibration software using in radioactive environment measurement

    International Nuclear Information System (INIS)

    Shen Ming; Zhu Yuelong; Zhao Yanzi

    2009-01-01

    The paper introduces the characteristics and the HPGe-detector-based method of LabSOCS (Laboratory Sourceless Object Calibration Software). Measured efficiencies were compared with LabSOCS efficiencies for different point sources; the tolerance is about 6% in the middle and high energy range. For cylindrical samples of dirt, animal ash and plant ash, the verification results are 7%-10%. (authors)

  9. Pramana – Journal of Physics | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    The γ-ray and conversion electron spectra following 131Ba decay are investigated using an HPGe detector and a mini-orange electron spectrometer. Attention is particularly focussed on identifying weak transitions associated with low-energy, high-spin levels in the 131Cs level scheme, earlier inferred in reaction studies but not yet ...

  10. A Novel Iris Segmentation Scheme

    Directory of Open Access Journals (Sweden)

    Chen-Chung Liu

    2014-01-01

    One of the key steps in an iris recognition system is the accurate segmentation of the iris from its surrounding noise, including the pupil, sclera, eyelashes, and eyebrows, in a captured eye-image. This paper presents a novel iris segmentation scheme which utilizes the orientation matching transform to outline the outer and inner iris boundaries initially. It then employs Delogne-Kåsa circle fitting (instead of the traditional Hough transform) to further eliminate outlier points and extract a more precise iris area from an eye-image. In the extracted iris region, the proposed scheme further utilizes the differences in the intensity and positional characteristics of the iris, eyelids, and eyelashes to detect and remove these noise sources. The scheme is then applied to the iris image database UBIRIS.v1. The experimental results show that the presented scheme provides more effective and efficient iris segmentation than other conventional methods.
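
The Delogne-Kåsa fit reduces circle fitting to a linear least-squares problem: writing the circle as x² + y² = a·x + b·y + c, the parameters follow from a 3×3 normal-equation system, and the center is (a/2, b/2) with radius √(c + a²/4 + b²/4). A minimal sketch (our own implementation, not the paper's code):

```python
def kasa_circle_fit(pts):
    """Delogne-Kåsa algebraic circle fit: least-squares solution of
    x^2 + y^2 = a*x + b*y + c over the points, returning (cx, cy, r)."""
    Sxx = Sxy = Syy = Sx = Sy = 0.0
    Sxz = Syz = Sz = 0.0
    n = len(pts)
    for x, y in pts:
        z = x * x + y * y
        Sxx += x * x; Sxy += x * y; Syy += y * y
        Sx += x; Sy += y
        Sxz += x * z; Syz += y * z; Sz += z
    # normal equations A^T A [a, b, c]^T = A^T z for design rows [x, y, 1]
    A = [[Sxx, Sxy, Sx], [Sxy, Syy, Sy], [Sx, Sy, n]]
    rhs = [Sxz, Syz, Sz]
    a, b, c = _solve3(A, rhs)
    cx, cy = a / 2, b / 2
    r = (c + cx * cx + cy * cy) ** 0.5
    return cx, cy, r

def _solve3(A, b):
    """Solve a 3x3 linear system by Gaussian elimination with pivoting."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, 3):
            f = M[r][i] / M[i][i]
            for col in range(i, 4):
                M[r][col] -= f * M[i][col]
    x = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):
        x[i] = (M[i][3] - sum(M[i][j] * x[j] for j in range(i + 1, 3))) / M[i][i]
    return x
```

Because the problem is linear, no iteration or Hough accumulator is needed, which is why this fit is attractive for outlier-pruned boundary points.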

  11. Differential segmentation responses to an alcohol social marketing program.

    Science.gov (United States)

    Dietrich, Timo; Rundle-Thiele, Sharyn; Schuster, Lisa; Drennan, Judy; Russell-Bennett, Rebekah; Leo, Cheryl; Gullo, Matthew J; Connor, Jason P

    2015-10-01

    This study seeks to establish whether meaningful subgroups exist within a 14-16 year old adolescent population and if these segments respond differently to the Game On: Know Alcohol (GOKA) intervention, a school-based alcohol social marketing program. This study is part of a larger cluster randomized controlled evaluation of the GOKA program implemented in 14 schools in 2013/2014. TwoStep cluster analysis was conducted to segment 2,114 high school adolescents (14-16 years old) on the basis of 22 demographic, behavioral, and psychographic variables. Program effects on knowledge, attitudes, behavioral intentions, social norms, alcohol expectancies, and drinking refusal self-efficacy of identified segments were subsequently examined. Three segments were identified: (1) Abstainers, (2) Bingers, and (3) Moderate Drinkers. Program effects varied significantly across segments. The strongest positive change effects post-participation were observed for Bingers, while mixed effects were evident for Moderate Drinkers and Abstainers. These findings provide preliminary empirical evidence supporting the application of social marketing segmentation in alcohol education programs. Development of targeted programs that meet the unique needs of each of the three identified segments will extend the social marketing footprint in alcohol education. Copyright © 2015 Elsevier Ltd. All rights reserved.

  12. On exploiting wavelet bases in statistical region-based segmentation

    DEFF Research Database (Denmark)

    Stegmann, Mikkel Bille; Forchhammer, Søren

    2002-01-01

    Statistical region-based segmentation methods such as the Active Appearance Models establish dense correspondences by modelling variation of shape and pixel intensities in low-resolution 2D images. Unfortunately, for high-resolution 2D and 3D images, this approach is rendered infeasible due to ex...... 9-7 wavelet on cardiac MRIs and human faces show that the segmentation accuracy is minimally degraded at compression ratios of 1:10 and 1:20, respectively....
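
    The paper works with the CDF 9-7 wavelet; a sketch of the simpler Haar transform still illustrates the sub-band decomposition that makes wavelet-domain appearance models compact (the example image is a toy patch, not an MRI):

```python
def haar2d_level(img):
    """One analysis level of the 2D Haar transform: split an even-sized
    greyscale image into average (LL) and detail (LH, HL, HH) sub-bands."""
    h, w = len(img), len(img[0])
    half_h, half_w = h // 2, w // 2
    LL = [[0.0] * half_w for _ in range(half_h)]
    LH = [[0.0] * half_w for _ in range(half_h)]
    HL = [[0.0] * half_w for _ in range(half_h)]
    HH = [[0.0] * half_w for _ in range(half_h)]
    for i in range(0, h, 2):
        for j in range(0, w, 2):
            a, b = img[i][j], img[i][j + 1]
            c, d = img[i + 1][j], img[i + 1][j + 1]
            LL[i // 2][j // 2] = (a + b + c + d) / 4.0   # local average
            LH[i // 2][j // 2] = (a + b - c - d) / 4.0   # horizontal detail
            HL[i // 2][j // 2] = (a - b + c - d) / 4.0   # vertical detail
            HH[i // 2][j // 2] = (a - b - c + d) / 4.0   # diagonal detail
    return LL, LH, HL, HH

# A flat 4x4 patch concentrates all energy in LL; the detail bands vanish,
# which is why smooth texture compresses well in the wavelet domain.
img = [[10.0] * 4 for _ in range(4)]
LL, LH, HL, HH = haar2d_level(img)
```

    Discarding small detail coefficients is what yields the 1:10 and 1:20 compression ratios the abstract reports with little loss of segmentation accuracy.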

  13. Optical coherence tomography in anterior segment imaging

    Science.gov (United States)

    Kalev-Landoy, Maya; Day, Alexander C.; Cordeiro, M. Francesca; Migdal, Clive

    2008-01-01

    Purpose To evaluate the ability of optical coherence tomography (OCT), designed primarily to image the posterior segment, to visualize the anterior chamber angle (ACA) in patients with different angle configurations. Methods In a prospective observational study, the anterior segments of 26 eyes of 26 patients were imaged using the Zeiss Stratus OCT, model 3000. Imaging of the anterior segment was achieved by adjusting the focusing control on the Stratus OCT. A total of 16 patients had abnormal angle configurations including narrow or closed angles and plateau irides, and 10 had normal angle configurations as determined by prior full ophthalmic examination, including slit-lamp biomicroscopy and gonioscopy. Results In all cases, OCT provided high-resolution information regarding iris configuration. The ACA itself was clearly visualized in patients with narrow or closed angles, but not in patients with open angles. Conclusions Stratus OCT offers a non-contact, convenient and rapid method of assessing the configuration of the anterior chamber. Despite its limitations, it may be of help during the routine clinical assessment and treatment of patients with glaucoma, particularly when gonioscopy is not possible or difficult to interpret. PMID:17355288

  14. Segmented Thermoelectric Oxide-based Module

    DEFF Research Database (Denmark)

    Le, Thanh Hung; Linderoth, Søren

    for a more stable high temperature material. In this study, thermoelectric properties from 300 to 1200 K of Ca0.9Y0.1Mn1-xFexO3 for 0 ≤ x ≤ 0.25 were systematically investigated in terms of Y and Fe co-doping at the Ca- and Mn-sites, respectively. It was found that with increasing the content of Fe doping......-performance segmented oxide-based module comprising 4 unicouples using segmentation of the half-Heusler Ti0.3Zr0.35Hf0.35CoSb0.8Sn0.2 and the misfit-layered cobaltite Ca3Co4O9+δ as the p-leg and 2% Al-doped ZnO as the n-leg was successfully fabricated and characterized. The results (presented in Chapter 5) show...... result, although a slight degradation tendency could be observed after 48 hours of operation in air. Nevertheless, the total conversion efficiency of this segmented module is still low, less than 2%, and needs to be further improved. A degradation mechanism was observed, which was attributed to the increase...


  15. Document segmentation via oblique cuts

    Science.gov (United States)

    Svendsen, Jeremy; Branzan-Albu, Alexandra

    2013-01-01

    This paper presents a novel solution for the layout segmentation of graphical elements in Business Intelligence documents. We propose a generalization of the recursive X-Y cut algorithm, which allows for cutting along arbitrary oblique directions. An intermediate processing step consisting of line and solid region removal is also necessary due to presence of decorative elements. The output of the proposed segmentation is a hierarchical structure which allows for the identification of primitives in pie and bar charts. The algorithm was tested on a database composed of charts from business documents. Results are very promising.
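
    The oblique cuts are the paper's contribution; the classic axis-aligned recursive X-Y cut that they generalize can be sketched on a binary ink mask (the example "page" is a toy array, not a real chart document):

```python
def xy_cut(ink, top, bottom, left, right, min_gap=1):
    """Classic (axis-aligned) recursive X-Y cut: split a block at the widest
    interior empty valley of its row/column projection profiles, then recurse
    until no valley remains. Returns leaf blocks as (top, bottom, left, right)."""
    rows = [sum(ink[r][left:right]) for r in range(top, bottom)]
    cols = [sum(ink[r][c] for r in range(top, bottom)) for c in range(left, right)]

    def widest_gap(profile):
        best = (0, None, None)          # (length, start, end), interior gaps only
        start = None
        for i, v in enumerate(profile):
            if v == 0 and start is None:
                start = i
            elif v != 0 and start is not None:
                if 0 < start and i - start > best[0]:
                    best = (i - start, start, i)
                start = None
        return best

    glen, gs, ge = widest_gap(rows)
    if glen >= min_gap:                 # horizontal cut through a blank row band
        return (xy_cut(ink, top, top + gs, left, right, min_gap)
                + xy_cut(ink, top + ge, bottom, left, right, min_gap))
    glen, gs, ge = widest_gap(cols)
    if glen >= min_gap:                 # vertical cut through a blank column band
        return (xy_cut(ink, top, bottom, left, left + gs, min_gap)
                + xy_cut(ink, top, bottom, left + ge, right, min_gap))
    return [(top, bottom, left, right)]

# Two "text" blocks side by side, with a third block below a blank band.
page = [
    [1, 1, 0, 1, 1],
    [1, 1, 0, 1, 1],
    [0, 0, 0, 0, 0],
    [1, 1, 1, 1, 1],
]
blocks = xy_cut(page, 0, len(page), 0, len(page[0]))
```

    The generalization in the paper replaces the axis-aligned projection profiles with profiles taken along arbitrary oblique directions, after stripping lines and solid decorative regions.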

  16. Optimally segmented permanent magnet structures

    DEFF Research Database (Denmark)

    Insinga, Andrea Roberto; Bjørk, Rasmus; Smith, Anders

    2016-01-01

    We present an optimization approach which can be employed to calculate the globally optimal segmentation of a two-dimensional magnetic system into uniformly magnetized pieces. For each segment the algorithm calculates the optimal shape and the optimal direction of the remanent flux density vector......, with respect to a linear objective functional. We illustrate the approach with results for magnet design problems from different areas, such as a permanent magnet electric motor, a beam focusing quadrupole magnet for particle accelerators and a rotary device for magnetic refrigeration....
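
    The algorithm optimizes both segment shapes and remanence directions; the direction step alone is easy to illustrate. For a linear objective functional S = Σᵢ gᵢ·mᵢ with fixed remanence magnitude |mᵢ| = B_r, each segment's optimal remanence aligns with its objective-gradient vector gᵢ. A sketch under that assumption (gradient values invented):

```python
import math

def optimal_remanence(gradients, b_r=1.2):
    """For a linear objective S = sum_i g_i . m_i with fixed magnitude
    |m_i| = b_r, the optimum per segment is the remanence vector
    aligned with that segment's objective-gradient g_i."""
    out = []
    for gx, gy in gradients:
        norm = math.hypot(gx, gy)
        out.append((b_r * gx / norm, b_r * gy / norm))
    return out

# Hypothetical per-segment gradient vectors of the objective functional.
g = [(1.0, 0.0), (1.0, 1.0)]
m = optimal_remanence(g, b_r=1.0)
```

    The global optimization in the paper additionally chooses the segment boundaries, which this direction-only sketch takes as given.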

  17. Intercalary bone segment transport in treatment of segmental tibial defects

    International Nuclear Information System (INIS)

    Iqbal, A.; Amin, M.S.

    2002-01-01

    Objective: To evaluate the results and complications of intercalary bone segment transport in the treatment of segmental tibial defects. Design: This is a retrospective analysis of patients with segmental tibial defects who were treated with the intercalary bone segment transport method. Place and Duration of Study: The study was carried out at Combined Military Hospital, Rawalpindi from September 1997 to April 2001. Subjects and methods: Thirteen patients were included in the study who had developed tibial defects either due to open fractures with bone loss or subsequent to bone debridement of infected non-unions. The mean bone defect was 6.4 cm and there were eight associated soft tissue defects. A locally made unilateral 'Naseer-Awais' (NA) fixator was used for bone segment transport. Distraction was done at the rate of 1 mm/day after 7-10 days of osteotomy. The patients were followed up fortnightly during distraction and monthly thereafter. The mean follow-up duration was 18 months. Results: The mean time in external fixation was 9.4 months. The mean 'healing index' was 1.47 months/cm. Satisfactory union was achieved in all cases. Six cases (46.2%) required bone grafting at the target site and in one of them grafting was required at the level of regeneration as well. All the wounds healed well with no residual infection. There was no residual leg length discrepancy of more than 20 mm and one angular deformity of more than 5 degrees. The commonest complication encountered was pin track infection, seen in 38% of the Schanz screws applied. Loosening occurred in 6.8% of Schanz screws, requiring re-adjustment. Ankle joint contracture with equinus deformity and peroneal nerve paresis occurred in one case each. The functional results were graded as 'good' in seven, 'fair' in four, and 'poor' in two patients. Overall, thirteen patients had 31 (minor/major) complications with a ratio of 2.38 complications per patient. To treat the bone defects and associated complications, a mean of
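
    The reported 'healing index' is simply the external-fixation time divided by the transported defect length, which the study's mean values reproduce:

```python
# Healing index = external-fixation time divided by defect length.
fixation_months = 9.4   # mean time in external fixation (months)
defect_cm = 6.4         # mean bone defect transported (cm)
healing_index = fixation_months / defect_cm   # months per cm
```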

  18. Hydrophilic segmented block copolymers based on poly(ethylene oxide) and monodisperse amide segments

    NARCIS (Netherlands)

    Husken, D.; Feijen, Jan; Gaymans, R.J.

    2007-01-01

    Segmented block copolymers based on poly(ethylene oxide) (PEO) flexible segments and monodisperse crystallizable bisester tetra-amide segments were made via a polycondensation reaction. The molecular weight of the PEO segments varied from 600 to 4600 g/mol and a bisester tetra-amide segment (T6T6T)

  19. Influence of nuclei segmentation on breast cancer malignancy classification

    Science.gov (United States)

    Jelen, Lukasz; Fevens, Thomas; Krzyzak, Adam

    2009-02-01

    Breast cancer is one of the deadliest cancers affecting middle-aged women. Accurate diagnosis and prognosis are crucial to reduce the high death rate. Nowadays there are numerous diagnostic tools for breast cancer diagnosis. In this paper we discuss the role of nuclei segmentation from fine needle aspiration biopsy (FNA) slides and its influence on malignancy classification. Classification of malignancy plays a very important role during the diagnosis process of breast cancer. Of all cancer diagnostic tools, FNA slides provide the most valuable information about the cancer malignancy grade, which helps to choose an appropriate treatment. This process involves assessing numerous nuclear features, and therefore precise segmentation of nuclei is very important. In this work we compare three powerful segmentation approaches and test their impact on the classification of breast cancer malignancy. The studied approaches involve level set segmentation, fuzzy c-means segmentation and textural segmentation based on the co-occurrence matrix. Segmented nuclei were used to extract nuclear features for malignancy classification. For classification purposes four different classifiers were trained and tested with the previously extracted features. The compared classifiers are Multilayer Perceptron (MLP), Self-Organizing Maps (SOM), Principal Component-based Neural Network (PCA) and Support Vector Machines (SVM). The presented results show that level set segmentation yields the best results over the three compared approaches and leads to good feature extraction, with the lowest average error rate of 6.51% over four different classifiers. The best performance was recorded for the multilayer perceptron with an error rate of 3.07% using fuzzy c-means segmentation.
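
    Of the three compared approaches, fuzzy c-means is compact enough to sketch. A minimal scalar-intensity version (toy pixel values, not FNA slide data) might look like:

```python
import random

def fuzzy_cmeans(data, c=2, m=2.0, iters=50, seed=0):
    """Fuzzy c-means on scalar intensities: each point gets soft memberships
    u[i][k] in [0, 1] rather than a hard label; m > 1 is the fuzzifier."""
    rng = random.Random(seed)
    centers = rng.sample(data, c)
    for _ in range(iters):
        # Update memberships from distances to each centre.
        u = []
        for x in data:
            d = [abs(x - ck) or 1e-12 for ck in centers]
            u.append([1.0 / sum((d[i] / d[j]) ** (2.0 / (m - 1.0)) for j in range(c))
                      for i in range(c)])
        # Update centres as membership-weighted means.
        centers = [sum(u[n][i] ** m * data[n] for n in range(len(data)))
                   / sum(u[n][i] ** m for n in range(len(data)))
                   for i in range(c)]
    return centers, u

# Toy "pixel intensities": dark nuclei around 40, bright background around 200.
pixels = [38.0, 40.0, 42.0, 44.0, 198.0, 200.0, 202.0, 204.0]
centers, u = fuzzy_cmeans(pixels, c=2)
```

    Thresholding the membership map (rather than raw intensities) is what gives the method its robustness to staining variation in nuclei images.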

  20. Quality assurance of the dose delivered by small radiation segments

    International Nuclear Information System (INIS)

    Hansen, Vebeke N.; Evans, Philip M.; Budgell, Geoffrey J.; Mott, Judith H.L.; Williams, Peter C.; Brugmans, Marco J.P.; Wittkaemper, Frits W.; Mijnheer, Ben J.; Brown, Kevin

    1998-01-01

    The use of intensity modulation with multiple static fields has been suggested by many authors as a way to achieve highly conformal fields in radiotherapy. However, quality assurance of linear accelerators is generally done only for beam segments of 100 MU or higher, and by measuring beam profiles once the beam has stabilized. We propose a set of measurements to check the stability of dose delivery in small segments, and present measured data from three radiotherapy centres. The dose delivered per monitor unit, MU, was measured for various numbers of MU segments. The field flatness and symmetry were measured using either photographic films that are subsequently scanned by a densitometer, or by using a diode array. We performed the set of measurements at the three radiotherapy centres on a set of five different Philips SL accelerators with energies of 6 MV, 8 MV, 10 MV and 18 MV. The dose per monitor unit over the range of 1 to 100 MU was found to be accurate to within ±5% of the nominal dose per monitor unit as defined for the delivery of 100 MU for all the energies. For four out of the five accelerators the dose per monitor unit over the same range was even found to be accurate to within ±2%. The flatness and symmetry were in some cases found to be larger for small segments by a maximum of 9% of the flatness/symmetry for large segments. The result of this study provides the dosimetric evidence that the delivery of small segment doses as top-up fields for beam intensity modulation is feasible. However, it should be stressed that linear accelerators have different characteristics for the delivery of small segments, hence this type of measurement should be performed for each machine before the delivery of small dose segments is approved. In some cases it may be advisable to use a low pulse repetition frequency (PRF) to obtain more accurate dose delivery of small segments. (author)
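
    A tolerance check of the kind described, comparing small-segment dose-per-MU readings against the value defined for a 100 MU delivery, could be sketched as follows (the readings are hypothetical, not the paper's measurements):

```python
def within_tolerance(dose_per_mu, reference, tol=0.05):
    """Return True if every dose-per-MU reading lies within the fractional
    tolerance `tol` of the reference value defined at 100 MU."""
    return all(abs(d / reference - 1.0) <= tol for d in dose_per_mu)

# Hypothetical readings (cGy/MU) for segments of 1, 2, 5, 10 and 100 MU.
reference = 1.000
readings = [0.962, 0.985, 0.992, 0.998, 1.000]
ok_5pct = within_tolerance(readings, reference, tol=0.05)   # the paper's ±5% bound
ok_2pct = within_tolerance(readings, reference, tol=0.02)   # the tighter ±2% bound
```

    In this invented example the 1 MU segment (0.962) passes the ±5% criterion all five accelerators met, but fails the ±2% criterion that four of the five met.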