WorldWideScience

Sample records for bins histograms allocates

  1. optBINS: Optimal Binning for histograms

    Science.gov (United States)

    Knuth, Kevin H.

    2018-03-01

    optBINS (optimal binning) determines the optimal number of bins in a uniform bin-width histogram by deriving the posterior probability for the number of bins in a piecewise-constant density model after assigning a multinomial likelihood and a non-informative prior. The maximum of the posterior probability occurs at a point where the prior probability and the joint likelihood are balanced. The interplay between these opposing factors effectively implements Occam's razor by selecting the simplest model that best describes the data.
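
    The posterior above has a closed form (up to an additive constant) in terms of the bin counts n_k, so the rule is easy to try out. A minimal sketch, assuming Knuth's published log-posterior and standard NumPy/SciPy; the function names are illustrative and not part of the optBINS package:

        import numpy as np
        from scipy.special import gammaln

        def optbins_log_posterior(data, m):
            """Relative log posterior for m equal-width bins (Knuth's rule)."""
            n = len(data)
            counts, _ = np.histogram(data, bins=m)
            return (n * np.log(m)
                    + gammaln(m / 2.0)
                    - m * gammaln(0.5)
                    - gammaln(n + m / 2.0)
                    + gammaln(counts + 0.5).sum())

        def optbins(data, max_bins=100):
            """Bin count that maximizes the posterior over 1..max_bins."""
            ms = np.arange(1, max_bins + 1)
            logp = [optbins_log_posterior(data, m) for m in ms]
            return int(ms[np.argmax(logp)])

    Scanning m and taking the argmax exhibits the Occam trade-off described above: the n log m term rewards resolution, while the gamma-function terms penalize model complexity.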

  2. Histogram bin width selection for time-dependent Poisson processes

    International Nuclear Information System (INIS)

    Koyama, Shinsuke; Shinomoto, Shigeru

    2004-01-01

    In constructing a time histogram of the event sequences derived from a nonstationary point process, we wish to determine the bin width such that the mean squared error of the histogram from the underlying rate of occurrence is minimized. We find that the optimal bin widths obtained for a doubly stochastic Poisson process and a sinusoidally regulated Poisson process exhibit different scaling relations with respect to the number of sequences, time scale and amplitude of rate modulation, but both diverge under similar parametric conditions. This implies that under these conditions, no determination of the time-dependent rate can be made. We also apply the kernel method to these point processes, and find that the optimal kernels do not exhibit any critical phenomena, unlike the time histogram method.
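
    The estimator itself is not reproduced in this abstract. A widely used empirical recipe of the same MISE-minimizing kind, taken from later work by the same group (the Shimazaki-Shinomoto cost function, which is an assumption here rather than a result of this paper), can be sketched as:

        import numpy as np

        def mise_cost(event_times, t_start, t_end, n_bins):
            """Empirical cost (2*mean - var)/width^2 of the bin counts;
            its minimizer approximates the MISE-optimal bin width."""
            counts, _ = np.histogram(event_times, bins=n_bins,
                                     range=(t_start, t_end))
            width = (t_end - t_start) / n_bins
            return (2.0 * counts.mean() - counts.var()) / width**2

        def best_bin_count(event_times, t_start, t_end,
                           candidates=range(2, 200)):
            """Scan candidate bin counts and keep the cheapest one."""
            return min(candidates,
                       key=lambda m: mise_cost(event_times, t_start, t_end, m))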

  3. Histogram bin width selection for time-dependent Poisson processes

    Energy Technology Data Exchange (ETDEWEB)

    Koyama, Shinsuke; Shinomoto, Shigeru [Department of Physics, Graduate School of Science, Kyoto University, Sakyo-ku, Kyoto 606-8502 (Japan)

    2004-07-23

    In constructing a time histogram of the event sequences derived from a nonstationary point process, we wish to determine the bin width such that the mean squared error of the histogram from the underlying rate of occurrence is minimized. We find that the optimal bin widths obtained for a doubly stochastic Poisson process and a sinusoidally regulated Poisson process exhibit different scaling relations with respect to the number of sequences, time scale and amplitude of rate modulation, but both diverge under similar parametric conditions. This implies that under these conditions, no determination of the time-dependent rate can be made. We also apply the kernel method to these point processes, and find that the optimal kernels do not exhibit any critical phenomena, unlike the time histogram method.

  4. Bin Ratio-Based Histogram Distances and Their Application to Image Classification.

    Science.gov (United States)

    Hu, Weiming; Xie, Nianhua; Hu, Ruiguang; Ling, Haibin; Chen, Qiang; Yan, Shuicheng; Maybank, Stephen

    2014-12-01

    Large variations in image background may cause partial matching and normalization problems for histogram-based representations, i.e., the histograms of the same category may have bins which are significantly different, and normalization may produce large changes in the differences between corresponding bins. In this paper, we deal with this problem by using the ratios between bin values of histograms, rather than the bin-value differences used in traditional histogram distances. We propose a bin ratio-based histogram distance (BRD), which is an intra-cross-bin distance, in contrast with previous bin-to-bin distances and cross-bin distances. The BRD is robust to partial matching and histogram normalization, and captures correlations between bins with only a linear computational complexity. We combine the BRD with the ℓ1 histogram distance and the χ² histogram distance to generate the ℓ1 BRD and the χ² BRD, respectively. These combinations exploit and benefit from the robustness of the BRD under partial matching and the robustness of the ℓ1 and χ² distances to small noise. We propose a method for assessing the robustness of histogram distances to partial matching. The BRDs and logistic regression-based histogram fusion are applied to image classification. The experimental results on synthetic data sets show the robustness of the BRDs to partial matching, and the experiments on seven benchmark data sets demonstrate promising results of the BRDs for image classification.
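
    The exact BRD formula is not reproduced in this abstract, so the following is only a toy illustration of the underlying idea: compare histograms through pairwise bin ratios, which survive global rescaling of either histogram. The published BRD achieves linear complexity; this naive version is quadratic in the number of bins and purely illustrative:

        import numpy as np

        def ratio_matrix(h, eps=1e-12):
            """R[i, j] = h[i] / h[j]: relative bin structure that is
            unchanged when the whole histogram is rescaled."""
            h = np.asarray(h, dtype=float) + eps
            return h[:, None] / h[None, :]

        def toy_bin_ratio_distance(h, g):
            """L1 gap between squashed ratio matrices; squashing keeps a
            single corrupted bin from dominating the distance."""
            rh, rg = ratio_matrix(h), ratio_matrix(g)
            rh, rg = rh / (1.0 + rh), rg / (1.0 + rg)
            return float(np.abs(rh - rg).mean())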

  5. Multifractal diffusion entropy analysis: Optimal bin width of probability histograms

    Science.gov (United States)

    Jizba, Petr; Korbel, Jan

    2014-11-01

    In the framework of Multifractal Diffusion Entropy Analysis we propose a method for choosing an optimal bin-width in histograms generated from underlying probability distributions of interest. The presented method uses techniques of Rényi's entropy and mean squared error analysis to discuss the conditions under which the error in the multifractal spectrum estimation is minimal. We illustrate the utility of our approach by focusing on the scaling behavior of financial time series. In particular, we analyze the S&P500 stock index as sampled at a daily rate in the time period 1950-2013. In order to demonstrate the strength of the proposed method we compare the multifractal δ-spectrum for various bin-widths and show the robustness of the method, especially for large values of q. For such values, other methods in use, e.g., those based on moment estimation, tend to fail for heavy-tailed data or data with long correlations. The connection between the δ-spectrum and Rényi's q parameter is also discussed and illustrated with a simple example of multiscale time series.

  6. Optimizing 4DCBCT projection allocation to respiratory bins

    International Nuclear Information System (INIS)

    O’Brien, Ricky T; Kipritidis, John; Shieh, Chun-Chien; Keall, Paul J

    2014-01-01

    4D cone beam computed tomography (4DCBCT) is an emerging image guidance strategy used in radiotherapy where projections acquired during a scan are sorted into respiratory bins based on the respiratory phase or displacement. 4DCBCT reduces the motion blur caused by respiratory motion but increases streaking artefacts due to projection under-sampling, a result of the irregular nature of patient breathing and the binning algorithms used. For displacement binning the streak artefacts are so severe that displacement binning is rarely used clinically. The purpose of this study is to investigate whether sharing projections between respiratory bins and adjusting the location of respiratory bins in an optimal manner can reduce or eliminate streak artefacts in 4DCBCT images. We introduce a mathematical optimization framework and a heuristic solution method, which we call the optimized projection allocation algorithm, to determine where to position the respiratory bins and which projections to source from neighbouring respiratory bins. Five 4DCBCT datasets from three patients were used to reconstruct 4DCBCT images. Projections were sorted into respiratory bins using equispaced, equal-density and optimized projection allocation. The standard deviation of the angular separation between projections was used to assess streaking, and the consistency of the segmented volume of a fiducial gold marker was used to assess motion blur. The standard deviation of the angular separation between projections using displacement binning and optimized projection allocation was 30%–50% smaller than with conventional phase-based binning and 59%–76% smaller than with conventional displacement binning, indicating more uniformly spaced projections and fewer streaking artefacts. The standard deviation in the marker volume was 20%–90% smaller when using optimized projection allocation than when using conventional phase-based binning, suggesting more uniform marker segmentation and less motion blur.
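
    The first figure of merit above is straightforward to reproduce: given the gantry angles of the projections allocated to one respiratory bin, measure the spread of the angular gaps. A small helper of our own (not the authors' code):

        import numpy as np

        def angular_separation_sd(angles_deg):
            """SD of gaps between successive projection angles in a bin;
            smaller values mean more uniform coverage, fewer streaks."""
            a = np.sort(np.asarray(angles_deg, dtype=float) % 360.0)
            gaps = np.diff(np.concatenate([a, [a[0] + 360.0]]))  # wrap around
            return float(gaps.std())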

  7. Efficient visibility-driven medical image visualisation via adaptive binned visibility histogram.

    Science.gov (United States)

    Jung, Younhyun; Kim, Jinman; Kumar, Ashnil; Feng, David Dagan; Fulham, Michael

    2016-07-01

    'Visibility' is a fundamental optical property that represents the proportion of the voxels in a volume that is observable by users during interactive volume rendering. The manipulation of this 'visibility' improves the volume rendering process, for instance by ensuring the visibility of regions of interest (ROIs) or by guiding the identification of an optimal rendering view-point. The construction of visibility histograms (VHs), which represent the distribution of the visibility of all voxels in the rendered volume, enables users to explore the volume with real-time feedback about occlusion patterns among spatially related structures during volume rendering manipulations. Volume rendered medical images have been a primary beneficiary of VHs, given the need to ensure that specific ROIs are visible relative to the surrounding structures, e.g. the visualisation of tumours that may otherwise be occluded by neighbouring structures. VH construction and its subsequent manipulations, however, are computationally expensive due to the histogram binning of the visibilities. This limits the real-time application of VHs to medical images that have large intensity ranges and volume dimensions and require a large number of histogram bins. In this study, we introduce an efficient adaptive binned visibility histogram (AB-VH) in which a smaller number of histogram bins are used to represent the visibility distribution of the full VH. We adaptively bin medical images by using a cluster analysis algorithm that groups the voxels according to their intensity similarities into a smaller subset of bins while preserving the distribution of the intensity range of the original images. We increase efficiency by exploiting the parallel computation and multiple render targets (MRT) extension of modern graphical processing units (GPUs), and this enables efficient computation of the histogram. We show the application of our method to single-modality computed tomography (CT), magnetic resonance
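
    The abstract specifies cluster analysis on voxel intensities but not a particular algorithm. A hedged sketch using k-means as a stand-in (scikit-learn and the subsampling step are our assumptions, not details from the paper):

        import numpy as np
        from sklearn.cluster import KMeans

        def adaptive_bin_centres(intensities, n_bins=32, sample=200_000,
                                 seed=0):
            """Cluster voxel intensities into a small set of adaptive bins;
            returns sorted bin centres."""
            rng = np.random.default_rng(seed)
            x = np.asarray(intensities, dtype=float).ravel()
            if x.size > sample:                      # subsample for speed
                x = rng.choice(x, size=sample, replace=False)
            km = KMeans(n_clusters=n_bins, n_init=4,
                        random_state=seed).fit(x[:, None])
            return np.sort(km.cluster_centers_.ravel())

        def bin_index(values, centres):
            """Nearest-centre assignment used when accumulating visibility."""
            v = np.asarray(values, dtype=float).ravel()
            return np.abs(v[:, None] - centres[None, :]).argmin(axis=1)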

  8. Searching for Similar Products via Automatic Online Annotation from the Web and Content-Based Matching Using Color Histogram Bins and SURF Descriptors

    Directory of Open Access Journals (Sweden)

    Putra Pandu Adikara

    2018-03-01

    The large number of e-commerce sites makes it easy for users to search for and buy products such as food, medicine, electronics, and daily necessities. Searching for a product across several e-commerce sites is difficult, however, because of the many sites to choose from, the many merchants/sellers offering the same item, and the time spent moving from site to site until the desired product is found. Moreover, with camera-equipped smartphones and augmented reality, a search query may consist of only an image, yet image-based product search is generally not supported by e-commerce sites. In this research, a content-based meta search engine was developed that uses an image as the query and merges the search results of several e-commerce sites. For a query image whose name is unknown, tags or keywords are generated through the Google reverse image search engine. These keywords are then submitted to each e-commerce site for searching. The features used to match the query against products are textual features, color histogram bins, and the presence of the target object in the image, detected using SURF descriptors. These features determine the relevance of the retrieved results. The system can give good results, with precision@20 and recall of up to 1 and average precision@20 and recall of 0.564 and 0.608, respectively, but it can also fail with precision@20 and recall of 0. These poor results occur when the generated tags are too generic, so that the e-commerce sites also return generic results.

  9. TH-E-17A-05: Optimizing Four Dimensional Cone Beam Computed Tomography Projection Allocation to Respiratory Bins

    International Nuclear Information System (INIS)

    O'Brien, R; Shieh, C; Kipritidis, J; Keall, P

    2014-01-01

    Purpose: Four dimensional cone beam computed tomography (4DCBCT) is an emerging image guidance strategy but it can suffer from poor image quality. To avoid repeating scans it is beneficial to make the best use of the imaging data obtained. For conventional 4DCBCT the location and size of respiratory bins is fixed and each projection is allocated to the respiratory bin within which it falls. Strictly adhering to this rule is unnecessary and can compromise image quality. In this study we optimize the size and location of respiratory bins and allow projections to be sourced from adjacent phases of the respiratory cycle. Methods: A mathematical optimization framework using mixed integer quadratic programming has been developed that determines when to source projections from adjacent respiratory bins and optimizes the size and location of the bins. The method, which we will call projection sharing, runs in under 2 seconds of CPU time. Five 4DCBCT datasets of stage III-IV lung cancer patients were used to test the algorithm. The standard deviation of the angular separation between projections (SD-A) and the standard deviation in the volume of the reconstructed fiducial gold coil (SD-V) were used as proxies to measure streaking artefacts and motion blur, respectively. Results: The SD-A using displacement binning and projection sharing was 30%–50% smaller than with conventional phase based binning and 59%–76% smaller than with conventional displacement binning, indicating more uniformly spaced projections and fewer streaking artefacts. The SD-V was 20%–90% smaller when using projection sharing than when using conventional phase based binning, suggesting more uniform marker segmentation and less motion blur. Conclusion: Image quality was visibly and significantly improved with projection sharing. Projection sharing does not require any modifications to existing hardware and offers a more robust replacement to phase based binning, or, an option if phase based reconstruction is not of a...

  10. Cardinality reasoning for bin-packing constraint : Application to a tank allocation problem

    NARCIS (Netherlands)

    Schaus, Pierre; Régin, Jean Charles; Van Schaeren, Rowan; Dullaert, Wout; Raa, Birger

    2012-01-01

    Flow reasoning has been successfully used in CP for more than a decade. It was originally introduced by Régin in the well-known Alldifferent and Global Cardinality Constraint (GCC) available in most of the CP solvers. The BinPacking constraint was introduced by Shaw and mainly uses an independent

  11. Multi-site study of diffusion metric variability: effects of site, vendor, field strength, and echo time on regions-of-interest and histogram-bin analyses.

    Science.gov (United States)

    Helmer, K G; Chou, M-C; Preciado, R I; Gimi, B; Rollins, N K; Song, A; Turner, J; Mori, S

    2016-02-27

    It is now common for magnetic-resonance-imaging (MRI) based multi-site trials to include diffusion-weighted imaging (DWI) as part of the protocol. It is also common for these sites to possess MR scanners of different manufacturers, different software and hardware, and different software licenses. These differences mean that scanners may not be able to acquire data with the same number of gradient amplitude values and number of available gradient directions. Variability can also occur in achievable b-values and minimum echo times. The challenge of a multi-site study, then, is to create a common protocol by understanding and then minimizing the effects of scanner variability and identifying reliable and accurate diffusion metrics. This study describes the effect of site, scanner vendor, field strength, and TE on two diffusion metrics: the first moment of the diffusion tensor field (mean diffusivity, MD) and the fractional anisotropy (FA), using two common analyses (region-of-interest and mean-bin value of whole-brain histograms). The goal of the study was to identify sources of variability in diffusion-sensitized imaging and their influence on commonly reported metrics. The results demonstrate that site, vendor, field strength, and echo time all contribute to variability in FA and MD, though to different extents. We conclude that characterization of the variability of DTI metrics due to site, vendor, field strength, and echo time is a worthwhile step in the construction of multi-center trials.

  12. Bi-Histogram Equalization with Brightness Preservation Using Contrast Enhancement

    OpenAIRE

    A. Anitha Rani; Gowthami Rajagopal; A. Jagadeswaran

    2014-01-01

    Contrast enhancement is an important factor in the image preprocessing step. One of the widely accepted contrast enhancement methods is histogram equalization. Although histogram equalization achieves comparatively better performance on almost all types of image, global histogram equalization sometimes produces excessive visual deterioration. A new extension of bi-histogram equalization called Bi-Histogram Equalization with Neighborhood Metric (BHENM) is proposed. First, large histogram bins that cause w...

  13. Complexity of possibly gapped histogram and analysis of histogram

    Science.gov (United States)

    Fushing, Hsieh; Roy, Tania

    2018-02-01

    We demonstrate that gaps and distributional patterns embedded within real-valued measurements are inseparable biological and mechanistic information contents of the system. Such patterns are discovered through a data-driven, possibly gapped histogram, which further leads to the geometry-based analysis of histogram (ANOHT). Constructing a possibly gapped histogram is a complex problem of statistical mechanics, because the ensemble of candidate histograms is captured by a two-layer Ising model. This construction is also a distinctive problem of information theory from the perspective of data compression via uniformity. By defining a Hamiltonian (or energy) as a sum of the total coding lengths of boundaries and the total decoding errors within bins, the problem of computing the minimum-energy macroscopic states is surprisingly resolved by applying the hierarchical clustering algorithm. Thus, a possibly gapped histogram corresponds to a macro-state. The first phase of ANOHT is then developed for simultaneous comparison of multiple treatments, while the second phase is developed, based on classical empirical process theory, for a tree-geometry that can check the authenticity of branches of the treatment tree. The well-known Iris data are used to illustrate our technical developments. Also, a large baseball pitching dataset and a heavily right-censored divorce dataset are analysed to showcase the existential gaps and utilities of ANOHT.

  14. A comparison of automatic histogram constructions

    NARCIS (Netherlands)

    Davies, P.L.; Gather, U.; Nordman, D.J.; Weinert, H.

    2009-01-01

    Even for a well-trained statistician the construction of a histogram for a given real-valued data set is a difficult problem. It is even more difficult to construct a fully automatic procedure which specifies the number and widths of the bins in a satisfactory manner for a wide range of data sets.

  15. Subtracting and Fitting Histograms using Profile Likelihood

    CERN Document Server

    D'Almeida, F M L

    2008-01-01

    It is known that many interesting signals expected at LHC are of unknown shape and strongly contaminated by background events. These signals will be difficult to detect during the first years of LHC operation due to the initial low luminosity. In this work, we present a method of subtracting histograms based on the profile likelihood function when the background is previously estimated by Monte Carlo events and one has low statistics. Estimators for the signal in each bin of the histogram difference are calculated, as well as limits for the signals at 68.3% confidence level, in a low-statistics case with an exponential background and a Gaussian signal. The method can also be used to fit histograms when the signal shape is known. Our results show a good performance and avoid the problem of negative values when subtracting histograms.

  16. Chi-square tests for comparing weighted histograms

    International Nuclear Information System (INIS)

    Gagunashvili, N.D.

    2010-01-01

    Weighted histograms in Monte Carlo simulations are often used for the estimation of probability density functions. They are obtained as a result of random experiments with random events that have weights. In this paper, the bin contents of a weighted histogram are considered as a sum of random variables with a random number of terms. Generalizations of the classical chi-square test for comparing weighted histograms are proposed. Numerical examples illustrate an application of the tests for histograms with different statistics of events and different weight functions. The proposed tests can be used for the comparison of experimental data histograms with simulated data histograms, as well as for two simulated data histograms.
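
    For orientation, the classical homogeneity chi-square for two unweighted histograms, the test that this paper generalizes to weighted entries, looks as follows; the weighted statistics themselves are defined in the paper:

        import numpy as np
        from scipy.stats import chi2

        def chi2_two_histograms(n1, n2):
            """Classical chi-square comparison of two unweighted histograms
            with counts n1, n2; returns the statistic and its p-value."""
            n1 = np.asarray(n1, dtype=float)
            n2 = np.asarray(n2, dtype=float)
            N1, N2 = n1.sum(), n2.sum()
            m = (n1 + n2) > 0                        # skip jointly empty bins
            stat = ((N2 * n1[m] - N1 * n2[m]) ** 2
                    / (N1 * N2 * (n1[m] + n2[m]))).sum()
            dof = int(m.sum()) - 1
            return stat, chi2.sf(stat, dof)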

  17. The grumpy bin

    DEFF Research Database (Denmark)

    Altarriba, Ferran; Funk, Mathias; Lanzani, Stefano Eugenio

    2017-01-01

    Domestic food waste is a world-wide problem that is complex and difficult to tackle as it touches diverse habits and social behaviors. This paper introduces the Grumpy Bin, a smart food waste bin designed for the context of student housing. The Grumpy Bin contributes to the state of the art...

  18. The Amazing Histogram.

    Science.gov (United States)

    Vandermeulen, H.; DeWreede, R. E.

    1983-01-01

    Presents a histogram drawing program which sorts real numbers in up to 30 categories. Entered data are sorted and saved in a text file which is then used to generate the histogram. Complete Applesoft program listings are included. (JN)

  19. Mohammed A Bin Hussain

    Indian Academy of Sciences (India)

    Mohammed A Bin Hussain. Articles written in Bulletin of Materials Science. Volume 38 Issue 7 December 2015, pp 1731-1736. Sintered gahnite–cordierite glass-ceramic based on raw materials with different fluorine sources · Esmat M A Hamzawy, Mohammed A Bin Hussain.

  1. The histogramming tool hparse

    International Nuclear Information System (INIS)

    Nikulin, V.; Shabratova, G.

    2005-01-01

    A general-purpose package aimed at simplifying histogramming in data analysis is described. The proposed dedicated language for writing histogramming scripts provides an effective and flexible tool for defining a complicated histogram set. The script is more transparent and much easier to maintain than the corresponding C++ code. In TTree analysis it can be a good complement to the TTreeViewer class: the TTreeViewer is used to choose the required histogram/cut set, while hparse generates code for systematic analysis.

  2. Infrared Contrast Enhancement Through Log-Power Histogram Modification

    NARCIS (Netherlands)

    Toet, A.; Wu, T.

    2015-01-01

    A simple power-logarithm histogram modification operator is proposed to enhance infrared (IR) image contrast. The algorithm combines a logarithm operator that smoothes the input image histogram while retaining the relative ordering of the original bins, with a power operator that restores the

  3. Refractory bin for burning

    Energy Technology Data Exchange (ETDEWEB)

    McPherson, D.L.; McPherson, T.L.

    1989-12-26

    This patent describes a refractory bin. It has a generally rectangular horizontal cross sectional configuration. It has wall structures each comprising an upper and a lower pair of elongated horizontal vertically spaced generally parallel support beams each having a vertical flange defining a support edge along its upper surface, a first generally rectangular refractory panel arranged with its lower edge at the bottom of the bin and with its outer surface in flat face contacting relation with the vertical flanges of the lower pair of support beams, a plurality of brackets each having a horizontal part and a vertical part and being secured to the outer surface of the first refractory panel.

  4. Centroid and full-width at half maximum uncertainties of histogrammed data with an underlying Gaussian distribution -- The moments method

    International Nuclear Information System (INIS)

    Valentine, J.D.; Rana, A.E.

    1996-01-01

    The effect of approximating a continuous Gaussian distribution with histogrammed data is studied. The expressions for theoretical uncertainties in centroid and full-width at half maximum (FWHM), as determined by calculation of moments, are derived using the error propagation method for a histogrammed Gaussian distribution. The results are compared with the corresponding pseudo-experimental uncertainties for computer-generated histogrammed Gaussian peaks to demonstrate the effect of binning the data. It is shown that increasing the number of bins in the histogram improves the continuous distribution approximation. For example, FWHM ≥ 9 and FWHM ≥ 12 bins are needed to reduce the pseudo-experimental standard deviation of the FWHM to within 5% and 1%, respectively, of the theoretical value for a peak containing 10,000 counts. In addition, the uncertainties in the centroid and FWHM as a function of peak area are studied. Finally, Sheppard's correction is applied to partially correct for the binning effect.
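
    The moments method itself is short enough to sketch. Assuming equal-width bins, the centroid and FWHM follow from the first two moments of the counts, with Sheppard's correction subtracting w²/12 from the variance:

        import numpy as np

        FWHM_PER_SIGMA = 2.0 * np.sqrt(2.0 * np.log(2.0))  # ~2.3548, Gaussian

        def centroid_fwhm(bin_centres, counts, bin_width, sheppard=True):
            """Centroid and FWHM of a histogrammed Gaussian peak via moments;
            Sheppard's correction partially undoes the binning effect."""
            x = np.asarray(bin_centres, dtype=float)
            n = np.asarray(counts, dtype=float)
            total = n.sum()
            centroid = (x * n).sum() / total
            var = ((x - centroid) ** 2 * n).sum() / total
            if sheppard:
                var -= bin_width ** 2 / 12.0
            return centroid, FWHM_PER_SIGMA * np.sqrt(var)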

  5. DPAK and HPAK: a versatile display and histogramming package

    International Nuclear Information System (INIS)

    Logg, C.A.; Boyarski, A.M.; Cook, A.J.; Cottrell, R.L.A.; Sund, S.

    1979-07-01

    The features of a display and histogram package which requires a minimal number of subroutine calls in order to generate graphic output in many flavors on a variety of devices are described. Default options are preset to values that are generally most wanted, but the default values may be readily changed to the user's needs. The description falls naturally into two parts, namely, the set of routines (DPAK) for displaying data on some device, and the set of routines (HPAK) for generating histograms. HPAK provides a means of allocating memory for histograms, accumulating data into histograms, and subsequently displaying the histograms via calls to the DPAK routines. Histograms and displays of either one or two independent variables can be made.

  6. Machine assisted histogram classification

    Science.gov (United States)

    Benyó, B.; Gaspar, C.; Somogyi, P.

    2010-04-01

    LHCb is one of the four major experiments under completion at the Large Hadron Collider (LHC). Monitoring the quality of the acquired data is important, because it allows the verification of the detector performance. Anomalies, such as missing values or unexpected distributions, can be indicators of a malfunctioning detector, resulting in poor data quality. Spotting faulty or ageing components can be done either visually, using instruments such as the LHCb Histogram Presenter, or with the help of automated tools. In order to assist detector experts in handling the vast monitoring information resulting from the sheer size of the detector, we propose a graph-based clustering tool combined with a machine learning algorithm and demonstrate its use by processing histograms representing 2D hitmap events. We prove the concept by detecting ion feedback events in the LHCb experiment's RICH subdetector.

  7. Machine assisted histogram classification

    Energy Technology Data Exchange (ETDEWEB)

    Benyo, B; Somogyi, P [BME-IIT, H-1117 Budapest, Magyar tudosok koerutja 2. (Hungary); Gaspar, C, E-mail: Peter.Somogyi@cern.c [CERN-PH, CH-1211 Geneve 23 (Switzerland)

    2010-04-01

    LHCb is one of the four major experiments under completion at the Large Hadron Collider (LHC). Monitoring the quality of the acquired data is important, because it allows the verification of the detector performance. Anomalies, such as missing values or unexpected distributions, can be indicators of a malfunctioning detector, resulting in poor data quality. Spotting faulty or ageing components can be done either visually, using instruments such as the LHCb Histogram Presenter, or with the help of automated tools. In order to assist detector experts in handling the vast monitoring information resulting from the sheer size of the detector, we propose a graph-based clustering tool combined with a machine learning algorithm and demonstrate its use by processing histograms representing 2D hitmap events. We prove the concept by detecting ion feedback events in the LHCb experiment's RICH subdetector.

  8. Histogram Estimators of Bivariate Densities

    National Research Council Canada - National Science Library

    Husemann, Joyce A

    1986-01-01

    One-dimensional fixed-interval histogram estimators of univariate probability density functions are less efficient than the analogous variable-interval estimators which are constructed from intervals...

  9. COLOUR IMAGE ENHANCEMENT BASED ON HISTOGRAM EQUALIZATION

    OpenAIRE

    Kanika Kapoor and Shaveta Arora

    2015-01-01

    Histogram equalization is a nonlinear technique for adjusting the contrast of an image using its histogram. It increases the brightness of a gray scale image which is different from the mean brightness of the original image. There are various types of Histogram equalization techniques like Histogram Equalization, Contrast Limited Adaptive Histogram Equalization, Brightness Preserving Bi Histogram Equalization, Dualistic Sub Image Histogram Equalization, Minimum Mean Brightness Error Bi Histog...

  10. The Maximum Resource Bin Packing Problem

    DEFF Research Database (Denmark)

    Boyar, J.; Epstein, L.; Favrholdt, L.M.

    2006-01-01

    Usually, for bin packing problems, we try to minimize the number of bins used, or, in the case of the dual bin packing problem, maximize the number or total size of accepted items. This paper presents results for the opposite problems, where we would like to maximize the number of bins used... algorithms, First-Fit-Increasing and First-Fit-Decreasing, for the maximum resource variant of classical bin packing. For the on-line variant, we define maximum resource variants of classical and dual bin packing. For dual bin packing, no on-line algorithm is competitive. For classical bin packing, we find...
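
    The abstract names the classical greedy heuristics First-Fit-Increasing and First-Fit-Decreasing. For reference, a minimal First-Fit-Decreasing routine (the maximum resource variant studied in the paper scores such heuristics by the opposite objective; this is only the textbook algorithm):

        def first_fit_decreasing(sizes, capacity=1.0):
            """Place each item, largest first, into the first bin with room;
            open a new bin when none fits. Returns the packing."""
            free = []                                # remaining room per bin
            packing = []                             # items per bin
            for s in sorted(sizes, reverse=True):
                for i, room in enumerate(free):
                    if s <= room + 1e-12:
                        free[i] -= s
                        packing[i].append(s)
                        break
                else:                                # no existing bin fits
                    free.append(capacity - s)
                    packing.append([s])
            return packing

    First-Fit-Increasing is the same routine with reverse=False.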

  11. Investigating Student Understanding of Histograms

    Science.gov (United States)

    Kaplan, Jennifer J.; Gabrosek, John G.; Curtiss, Phyllis; Malone, Chris

    2014-01-01

    Histograms are adept at revealing the distribution of data values, especially the shape of the distribution and any outlier values. They are included in introductory statistics texts, research methods texts, and in the popular press, yet students often have difficulty interpreting the information conveyed by a histogram. This research identifies…

  12. The Histogram-Area Connection

    Science.gov (United States)

    Gratzer, William; Carpenter, James E.

    2008-01-01

    This article demonstrates an alternative approach to the construction of histograms--one based on the notion of using area to represent relative density in intervals of unequal length. The resulting histograms illustrate the connection between the area of the rectangles associated with particular outcomes and the relative frequency (probability)…

  13. Reproducibility of brain ADC histograms

    International Nuclear Information System (INIS)

    Steens, S.C.A.; Buchem, M.A. van; Admiraal-Behloul, F.; Schaap, J.A.; Hoogenraad, F.G.C.; Wheeler-Kingshott, C.A.M.; Tofts, P.S.; Cessie, S. le

    2004-01-01

    The aim of this study was to assess the effect of differences in acquisition technique on whole-brain apparent diffusion coefficient (ADC) histogram parameters, as well as to assess scan-rescan reproducibility. Diffusion-weighted imaging (DWI) was performed in 7 healthy subjects with b-values 0-800, 0-1000, and 0-1500 s/mm², and fluid-attenuated inversion recovery (FLAIR) DWI with b-values 0-1000 s/mm². All sequences were repeated with and without repositioning. The peak location, peak height, and mean ADC of the ADC histograms and the mean ADC of a region of interest (ROI) in the white matter were compared using paired-sample t tests. Scan-rescan reproducibility was assessed using paired-sample t tests, and repeatability coefficients were reported. With increasing maximum b-values, ADC histograms shifted to lower values, with an increase in peak height (p<0.01). With FLAIR DWI, the ADC histogram shifted to lower values with a significantly higher, narrower peak (p<0.01), although the ROI mean ADC showed no significant differences. For scan-rescan reproducibility, no significant differences were observed. Different DWI pulse sequences give rise to different ADC histograms. With a given pulse sequence, however, ADC histogram analysis is a robust and reproducible technique. Using FLAIR DWI, the partial-voluming effect of cerebrospinal fluid, and thus its confounding effect on histogram analyses, can be reduced

  14. Image Enhancement via Subimage Histogram Equalization Based on Mean and Variance

    Directory of Open Access Journals (Sweden)

    Liyun Zhuang

    2017-01-01

    This paper puts forward a novel image enhancement method via Mean and Variance based Subimage Histogram Equalization (MVSIHE), which effectively increases the contrast of the input image with brightness and details well preserved compared with some other methods based on histogram equalization (HE). Firstly, the histogram of the input image is divided into four segments based on the mean and variance of the luminance component, and the histogram bins of each segment are modified and equalized, respectively. Secondly, the result is obtained via the concatenation of the processed subhistograms. Lastly, the normalization method is deployed on intensity levels, and the integration of the processed image with the input image is performed. 100 benchmark images from a public image database named CVG-UGR-Database are used for comparison with other state-of-the-art methods. The experiment results show that the algorithm can not only enhance image information effectively but also well preserve brightness and details of the original image.
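
    As a rough illustration of the segmentation step, here is a hedged sketch that splits the intensity range at mu - sigma, mu, and mu + sigma and equalizes each segment independently; the published MVSIHE additionally performs the normalization and blending steps described above:

        import numpy as np

        def four_segment_equalize(gray):
            """Equalize four sub-histograms delimited by mean/variance
            statistics of the luminance; a sketch, not the full MVSIHE."""
            g = np.asarray(gray, dtype=float)
            mu, sigma = g.mean(), g.std()
            edges = [g.min(), mu - sigma, mu, mu + sigma, g.max() + 1e-6]
            out = np.empty_like(g)
            for lo, hi in zip(edges[:-1], edges[1:]):
                m = (g >= lo) & (g < hi)
                if not m.any():                      # degenerate segment
                    continue
                ranks = np.searchsorted(np.sort(g[m]), g[m], side="right")
                out[m] = lo + (hi - lo) * ranks / m.sum()  # equalize segment
            return out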

  15. Image Enhancement via Subimage Histogram Equalization Based on Mean and Variance

    Science.gov (United States)

    2017-01-01

    This paper puts forward a novel image enhancement method via Mean and Variance based Subimage Histogram Equalization (MVSIHE), which effectively increases the contrast of the input image with brightness and details well preserved compared with some other methods based on histogram equalization (HE). Firstly, the histogram of the input image is divided into four segments based on the mean and variance of the luminance component, and the histogram bins of each segment are modified and equalized, respectively. Secondly, the result is obtained via the concatenation of the processed subhistograms. Lastly, the normalization method is deployed on intensity levels, and the integration of the processed image with the input image is performed. 100 benchmark images from a public image database named CVG-UGR-Database are used for comparison with other state-of-the-art methods. The experiment results show that the algorithm can not only enhance image information effectively but also well preserve brightness and details of the original image. PMID:29403529

  16. Efficient Human Action and Gait Analysis Using Multiresolution Motion Energy Histogram

    Directory of Open Access Journals (Sweden)

    Kuo-Chin Fan

    2010-01-01

    Average Motion Energy (AME) images are a good way to describe human motions. However, they face a computational efficiency problem as the number of database templates increases. In this paper, we propose a histogram-based approach to improve computational efficiency. We convert the human action/gait recognition problem into a histogram matching problem. In order to speed up the recognition process, we adopt a multiresolution structure on the Motion Energy Histogram (MEH). To utilize the multiresolution structure more efficiently, we propose an automated uneven partitioning method which is achieved by utilizing the quadtree decomposition results of the MEH. In that case, the computation time depends only on the number of partitioned histogram bins, which is much smaller than in the AME method. Two applications, action recognition and gait classification, are conducted in the experiments to demonstrate the feasibility and validity of the proposed approach.

  17. Image Enhancement via Subimage Histogram Equalization Based on Mean and Variance.

    Science.gov (United States)

    Zhuang, Liyun; Guan, Yepeng

    2017-01-01

    This paper puts forward a novel image enhancement method via Mean and Variance based Subimage Histogram Equalization (MVSIHE), which effectively increases the contrast of the input image with brightness and details well preserved compared with some other methods based on histogram equalization (HE). Firstly, the histogram of the input image is divided into four segments based on the mean and variance of the luminance component, and the histogram bins of each segment are modified and equalized, respectively. Secondly, the result is obtained via the concatenation of the processed subhistograms. Lastly, the normalization method is deployed on intensity levels, and the integration of the processed image with the input image is performed. 100 benchmark images from a public image database named CVG-UGR-Database are used for comparison with other state-of-the-art methods. The experiment results show that the algorithm can not only enhance image information effectively but also well preserve brightness and details of the original image.

  18. Radio Frequency Identification (RFID) and communication technologies for solid waste bin and truck monitoring system.

    Science.gov (United States)

    Hannan, M A; Arebey, Maher; Begum, R A; Basri, Hassan

    2011-12-01

    This paper deals with the integration of Radio Frequency Identification (RFID) and communication technologies for a solid waste bin and truck monitoring system. RFID, GPS, GPRS, and GIS, along with camera technologies, have been integrated to develop an intelligent bin and truck monitoring system. A new kind of integrated theoretical framework, hardware architecture, and interface algorithm has been introduced between the technologies for the successful implementation of the proposed system. In this system, the bin and truck databases have been developed in such a way that information such as bin and truck ID, date and time of waste collection, bin status, amount of waste, and bin and truck GPS coordinates is compiled and stored for monitoring and management activities. The results showed that the real-time image processing, histogram analysis, waste estimation, and other bin information are displayed in the GUI of the monitoring system. The real-time tests and experimental results showed that the performance of the developed system was stable and satisfied the monitoring requirements with high practicability and validity. Copyright © 2011 Elsevier Ltd. All rights reserved.

  19. CHIL - a comprehensive histogramming language

    International Nuclear Information System (INIS)

    Milner, W.T.; Biggerstaff, J.A.

    1984-01-01

    A high level language, CHIL, has been developed for use in processing event-by-event experimental data at the Holifield Heavy Ion Research Facility (HHIRF) using PERKIN-ELMER 3230 computers. CHIL has been fully integrated into all software which supports on-line and off-line histogramming and off-line preprocessing. CHIL supports simple gates, free-form-gates (2-D regions of arbitrary shape), condition test and branch statements, bit-tests, loops, calls to up to three user supplied subroutines and histogram generating statements. Any combination of 1, 2, 3 or 4-D histograms (32 megachannels max) may be recorded at 16 or 32 bits/channel. User routines may intercept the data being processed and modify it as desired. The CPU-intensive part of the processing utilizes microcoded routines which enhance performance by about a factor of two

  20. CHIL: A comprehensive histogramming language

    International Nuclear Information System (INIS)

    Milner, W.T.; Biggerstaff, J.A.

    1985-01-01

    A high level language, CHIL, has been developed for use in processing event-by-event experimental data at the Holifield Heavy Ion Research Facility (HHIRF) using PERKIN-ELMER 3230 computers. CHIL has been fully integrated into all software which supports on-line and off-line histogramming and off-line preprocessing. CHIL supports simple gates, free-form-gates (2-D regions of arbitrary shape), condition test and branch statements, bit-tests, loops, calls to up to three user supplied subroutines and histogram generating statements. Any combination of 1, 2, 3 or 4-D histograms (32 megachannels max) may be recorded at 16 or 32 bits/channel. User routines may intercept the data being processed and modify it as desired. The CPU-intensive part of the processing utilizes microcoded routines which enhance performance by about a factor of two

  1. Multispectral histogram normalization contrast enhancement

    Science.gov (United States)

    Soha, J. M.; Schwartz, A. A.

    1979-01-01

    A multispectral histogram normalization or decorrelation enhancement which achieves effective color composites by removing interband correlation is described. The enhancement procedure employs either linear or nonlinear transformations to equalize principal component variances. An additional rotation to any set of orthogonal coordinates is thus possible, while full histogram utilization is maintained by avoiding the reintroduction of correlation. For the three-dimensional case, the enhancement procedure may be implemented with a lookup table. An application of the enhancement to Landsat multispectral scanning imagery is presented.
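
    The core of such an enhancement, equalizing principal-component variances and rotating back, is compact. A minimal decorrelation-stretch sketch for an H x W x bands array (the lookup-table implementation mentioned above is an optimization on top of this):

        import numpy as np

        def decorrelation_stretch(img):
            """Whiten the principal components of the bands, then restore
            the original band variances; removes interband correlation."""
            h, w, b = img.shape
            x = img.reshape(-1, b).astype(float)
            mean = x.mean(axis=0)
            cov = np.cov(x - mean, rowvar=False)
            evals, evecs = np.linalg.eigh(cov)
            whiten = np.diag(1.0 / np.sqrt(np.maximum(evals, 1e-12)))
            recolor = np.diag(np.sqrt(np.diag(cov)))
            t = evecs @ whiten @ evecs.T @ recolor
            return ((x - mean) @ t + mean).reshape(h, w, b)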

  2. Live histograms in moving windows

    International Nuclear Information System (INIS)

    Zhil'tsov, V.E.

    1989-01-01

    The application of computer graphics to specific hardware testing is discussed. The hardware is a position-sensitive detector (multiwire proportional chamber) used in high energy physics experiments, and the read-out electronics for it. The testing program (XPERT), which utilises a multi-window user interface, is described. Data are represented as histograms in windows. The windows on the screen may be moved, reordered, and their sizes may be changed. Histograms may be put in any window, and a hardcopy may be made. Some program internals are discussed. The computer environment is quite simple: MS-DOS IBM PC/XT, 256 KB RAM, CGA, 5.25'' FD, Epson MX. 4 refs.; 7 figs

  3. Histogram deconvolution - An aid to automated classifiers

    Science.gov (United States)

    Lorre, J. J.

    1983-01-01

    It is shown that N-dimensional histograms are convolved by the addition of noise in the picture domain. Three methods are described which provide the ability to deconvolve such noise-affected histograms. The purpose of the deconvolution is to provide automated classifiers with a higher quality N-dimensional histogram from which to obtain classification statistics.

  4. Theory and Application of DNA Histogram Analysis.

    Science.gov (United States)

    Bagwell, Charles Bruce

    The underlying principles and assumptions associated with DNA histograms are discussed along with the characteristics of fluorescent probes. Information theory was described and used to calculate the information content of a DNA histogram. Two major types of DNA histogram analyses are proposed: parametric and nonparametric analysis. Three levels…

  5. LHCb: Machine assisted histogram classification

    CERN Multimedia

    Somogyi, P; Gaspar, C

    2009-01-01

    LHCb is one of the four major experiments under completion at the Large Hadron Collider (LHC). Monitoring the quality of the acquired data is important, because it allows the verification of the detector performance. Anomalies, such as missing values or unexpected distributions can be indicators of a malfunctioning detector, resulting in poor data quality. Spotting faulty components can be either done visually using instruments such as the LHCb Histogram Presenter, or by automated tools. In order to assist detector experts in handling the vast monitoring information resulting from the sheer size of the detector, a graph-theoretic based clustering tool, combined with machine learning algorithms is proposed and demonstrated by processing histograms representing 2D event hitmaps. The concept is proven by detecting ion feedback events in the LHCb RICH subdetector.

  6. Visualizing Contour Trees within Histograms

    DEFF Research Database (Denmark)

    Kraus, Martin

    2010-01-01

    Many of the topological features of the isosurfaces of a scalar volume field can be compactly represented by its contour tree. Unfortunately, the contour trees of most real-world volume data sets are too complex to be visualized by dot-and-line diagrams. Therefore, we propose a new visualization that is suitable for large contour trees and efficiently conveys the topological structure of the most important isosurface components. This visualization is integrated into a histogram of the volume data; thus, it offers strictly more information than a traditional histogram. We present algorithms to automatically compute the graph layout and to calculate appropriate approximations of the contour tree and the surface area of the relevant isosurface components. The benefits of this new visualization are demonstrated with the help of several publicly available volume data sets.

  7. Consistency Check for the Bin Packing Constraint Revisited

    Science.gov (United States)

    Dupuis, Julien; Schaus, Pierre; Deville, Yves

    The bin packing problem (BP) consists in finding the minimum number of bins necessary to pack a set of items so that the total size of the items in each bin does not exceed the bin capacity C. The bin capacity is common for all the bins.

  8. Glioma grade assessment by using histogram analysis of diffusion tensor imaging-derived maps

    International Nuclear Information System (INIS)

    Jakab, Andras; Berenyi, Ervin; Molnar, Peter; Emri, Miklos

    2011-01-01

    Current endeavors in neuro-oncology include morphological validation of imaging methods by histology, including molecular and immunohistochemical techniques. Diffusion tensor imaging (DTI) is an up-to-date methodology of intracranial diagnostics that has gained importance in studies of neoplasia. Our aim was to assess the feasibility of discriminant analysis applied to histograms of preoperative diffusion tensor imaging-derived images for the prediction of glioma grade validated by histomorphology. Tumors of 40 consecutive patients included 13 grade II astrocytomas, seven oligoastrocytomas, six grade II oligodendrogliomas, three grade III oligoastrocytomas, and 11 glioblastoma multiformes. Preoperative DTI data comprised: unweighted (B0) images, fractional anisotropy, longitudinal and radial diffusivity maps, directionally averaged diffusion-weighted imaging, and trace images. Sampling consisted of generating histograms for gross tumor volumes; 25 histogram bins per scalar map were calculated. The histogram bins that allowed the most precise determination of low-grade (LG) or high-grade (HG) classification were selected by multivariate discriminant analysis. The accuracy of the model was defined by the success rate of leave-one-out cross-validation. Statistical descriptors of voxel value distribution did not differ between LG and HG tumors and did not allow classification. The histogram model had 88.5% specificity and 85.7% sensitivity in the separation of LG and HG gliomas; specificity was improved when cases with oligodendroglial components were omitted. Constructing histograms of preoperative radiological images over the tumor volume allows representation of the grade and enables discrimination of LG and HG gliomas, which has been confirmed by histopathology.
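
    The pipeline, per-tumor histogram features followed by discriminant analysis with leave-one-out validation, can be sketched as below. The bin-selection step is simplified away and scikit-learn is our assumption, so this is an outline of the analysis, not the authors' implementation:

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import LeaveOneOut, cross_val_score

        def loo_grade_accuracy(tumor_maps, grades, n_bins=25):
            """Leave-one-out accuracy of an LDA trained on 25-bin
            normalized histograms of one DTI-derived map per tumor."""
            lo = min(float(m.min()) for m in tumor_maps)
            hi = max(float(m.max()) for m in tumor_maps)
            X = np.array([np.histogram(m, bins=n_bins, range=(lo, hi),
                                       density=True)[0]
                          for m in tumor_maps])
            scores = cross_val_score(LinearDiscriminantAnalysis(),
                                     X, grades, cv=LeaveOneOut())
            return scores.mean()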

  9. Integrality gap analysis for bin packing games

    NARCIS (Netherlands)

    Kern, Walter; Qui, X.

    A cooperative bin packing game is an $N$-person game, where the player set $N$ consists of $k$ bins of capacity 1 each and $n$ items of sizes $a_1,\\dots,a_n$. The value $v(S)$ of a coalition $S$ of players is defined to be the maximum total size of items in $S$ that can be packed into the bins of

  10. Improved Taxation Rate for Bin Packing Games

    Science.gov (United States)

    Kern, Walter; Qiu, Xian

    A cooperative bin packing game is an N-person game, where the player set N consists of k bins of capacity 1 each and n items of sizes a_1, …, a_n. The value of a coalition of players is defined to be the maximum total size of items in the coalition that can be packed into the bins of the coalition. We present an alternative proof for the non-emptiness of the 1/3-core for all bin packing games and show how to improve this bound ε = 1/3 slightly. We conjecture that the true best possible value is ε = 1/7.

  11. Bioinformatics and Astrophysics Cluster (BinAc)

    Science.gov (United States)

    Krüger, Jens; Lutz, Volker; Bartusch, Felix; Dilling, Werner; Gorska, Anna; Schäfer, Christoph; Walter, Thomas

    2017-09-01

    BinAC provides central high performance computing capacities for bioinformaticians and astrophysicists from the state of Baden-Württemberg. The bwForCluster BinAC is part of the implementation concept for scientific computing for the universities in Baden-Württemberg. Community specific support is offered through the bwHPC-C5 project.

  12. Time-bin quantum RAM

    Science.gov (United States)

    Moiseev, E. S.; Moiseev, S. A.

    2016-11-01

    We have proposed a compact scheme of quantum random access memory (qRAM) based on the impedance-matched multi-qubit photon echo quantum memory incorporated with a control four-level atom in two coupled QED cavities. A set of matching conditions for the basic physical parameters of the qRAM scheme that provides efficient quantum control of fast single photon storage and readout has been found. In particular, it has been discovered that the efficient qRAM operations are determined by the specific properties of the excited photonic molecule coupling the two QED cavities. Herein, the maximal efficiency of the qRAM is realized when the cooperativity parameter of the photonic molecule equals unity, which is experimentally achievable. We have also elaborated a new quantum addressing scheme where the multi-time-bin photon state is used for the control of the four-level atom during the readout of the photonic qubits from the quantum memory. The scheme reduces the required number of logical elements to one. Experimental implementation by means of current quantum technologies in the optical and microwave domains is also discussed.

  13. Clinical Utility of Blood Cell Histogram Interpretation.

    Science.gov (United States)

    Thomas, E T Arun; Bhagya, S; Majeed, Abdul

    2017-09-01

    An automated haematology analyser provides blood cell histograms by plotting the sizes of different blood cells on X-axis and their relative number on Y-axis. Histogram interpretation needs careful analysis of Red Blood Cell (RBC), White Blood Cell (WBC) and platelet distribution curves. Histogram analysis is often a neglected part of the automated haemogram which if interpreted well, has significant potential to provide diagnostically relevant information even before higher level investigations are ordered.

  14. System for histogram entry, retrieval, and plotting

    International Nuclear Information System (INIS)

    Kellogg, M.; Gallup, J.M.; Shlaer, S.; Spencer, N.

    1977-10-01

    This manual describes the systems for producing histograms and dot plots that were designed for use in connection with the Q general-purpose data-acquisition system. These systems allow for the creation of histograms; the entry, retrieval, and plotting of data in the form of histograms; and the dynamic display of scatter plots as data are acquired. Although the systems are designed for use with Q, they can also be used as a part of other applications. 3 figures

  15. Information granules in image histogram analysis.

    Science.gov (United States)

    Wieclawek, Wojciech

    2018-04-01

    A concept of granular computing employed in intensity-based image enhancement is discussed. First, a weighted granular computing idea is introduced. Then, the implementation of this term in the image processing area is presented. Finally, multidimensional granular histogram analysis is introduced. The proposed approach is dedicated to digital images, especially medical images acquired by Computed Tomography (CT). Like the histogram equalization approach, this method is based on image histogram analysis. Yet, unlike the histogram equalization technique, it works on a selected range of the pixel intensity and is controlled by two parameters. Performance is tested on anonymous clinical CT series. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Algorithms for adaptive histogram equalization

    International Nuclear Information System (INIS)

    Pizer, S.M.; Austin, J.D.; Cromartie, R.; Geselowitz, A.; Ter Haar Romeny, B.; Zimmerman, J.B.; Zuiderveld, K.

    1986-01-01

    Adaptive histogram equalization (ahe) is a contrast enhancement method designed to be broadly applicable and with demonstrated effectiveness [Zimmerman, 1985]. However, slow speed and the overenhancement of noise it produces in relatively homogeneous regions are two problems. The authors summarize algorithms designed to overcome these and other concerns. These algorithms include interpolated ahe, to speed up the method on general purpose computers; a version of interpolated ahe designed to run in a few seconds on feedback processors; a version of full ahe designed to run in under one second on custom VLSI hardware; and clipped ahe, designed to overcome the problem of overenhancement of noise contrast. The authors conclude that clipped ahe should become a method of choice in medical imaging and probably also in other areas of digital imaging, and that clipped ahe can be made adequately fast to be routinely applied in the normal display sequence
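
    The clipping idea is simple to state in code: cap each histogram bin at a multiple of the mean count and redistribute the excess before building the mapping. A minimal single-region, single-pass sketch (production clipped ahe applies this per contextual region and interpolates between regions):

        import numpy as np

        def clipped_equalize(tile, n_bins=256, clip_limit=4.0):
            """Contrast-limited equalization of one region: clipping the
            histogram caps contrast amplification in flat areas."""
            hist, edges = np.histogram(tile, bins=n_bins)
            limit = clip_limit * hist.mean()
            excess = np.maximum(hist - limit, 0.0).sum()
            hist = np.minimum(hist, limit) + excess / n_bins  # redistribute
            cdf = hist.cumsum() / hist.sum()
            idx = np.digitize(tile, edges[1:-1])     # bin index per pixel
            return cdf[idx]                          # output in (0, 1]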

  17. WORKER, a program for histogram manipulation

    International Nuclear Information System (INIS)

    Bolger, J.E.; Ellinger, H.; Moore, C.F.

    1979-01-01

    A set of programs is provided which may link to any user-written program, permitting dynamic creation of histograms as well as display, manipulation and transfer of histogrammed data. With wide flexibility, constants within the user's code may be set or monitored at any time during execution. (Auth.)

  18. Interpreting Histograms. As Easy as It Seems?

    Science.gov (United States)

    Lem, Stephanie; Onghena, Patrick; Verschaffel, Lieven; Van Dooren, Wim

    2014-01-01

    Histograms are widely used, but recent studies have shown that they are not as easy to interpret as it might seem. In this article, we report on three studies on the interpretation of histograms in which we investigated (1) whether the misinterpretation by university students can be considered to be the result of heuristic reasoning, (2)…

  19. Spline smoothing of histograms by linear programming

    Science.gov (United States)

    Bennett, J. O.

    1972-01-01

    An algorithm is presented for obtaining an approximating function to a frequency distribution from a sample of size n. To obtain the approximating function, a histogram is made from the data. Next, Euclidean space approximations to the graph of the histogram, using central B-splines as basis elements, are obtained by linear programming. The approximating function has area one and is nonnegative.

  20. Combining resummed Higgs predictions across jet bins

    Energy Technology Data Exchange (ETDEWEB)

    Boughezal, Radja [Argonne National Laboratory, IL (United States). High Energy Physics Division; Liu, Xiaohui; Petriello, Frank [Argonne National Laboratory, IL (United States). High Energy Physics Division; Northwestern Univ., Evanston, IL (United States). Dept. of Physics and Astronomy; Tackmann, Frank J. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Walsh, Jonathan R. [California Univ., Berkeley, CA (United States). Ernest Orlando Lawrence Berkeley Laboratory; California Univ., Berkeley, CA (United States). Center for Theoretical Physics

    2013-12-15

    Experimental analyses often use jet binning to distinguish between different kinematic regimes and separate contributions from background processes. To accurately model theoretical uncertainties in these measurements, a consistent description of the jet bins is required. We present a complete framework for the combination of resummed results for production processes in different exclusive jet bins, focusing on Higgs production in gluon fusion as an example. We extend the resummation of the H+1-jet cross section into the challenging low transverse momentum region, lowering the uncertainties considerably. We provide combined predictions with resummation for cross sections in the H+0-jet and H+1-jet bins, and give an improved theory covariance matrix for use in experimental studies. We estimate that the relevant theoretical uncertainties on the signal strength in the H→WW* analysis are reduced by nearly a factor of 2 compared to the current value.

  1. Segmentation and Location Computation of Bin Objects

    Directory of Open Access Journals (Sweden)

    C.R. Hema

    2008-11-01

    In this paper we present a stereo vision based system for segmentation and location computation of partially occluded objects in bin picking environments. Algorithms to segment partially occluded objects and to find the object location (midpoint: x, y and z coordinates) with respect to the bin area are proposed. The z coordinate is computed using stereo images and neural networks. The proposed algorithms are tested using two neural network architectures, namely Radial Basis Function nets and simple feedforward nets. The training results of the feedforward nets are found to be more suitable for the current application. The proposed stereo vision system is interfaced with an Adept SCARA Robot to perform bin picking operations. The vision system is found to be effective for partially occluded objects, in the absence of albedo effects. The results are validated through real time bin picking experiments on the Adept Robot.

  2. Predicting pathologic tumor response to chemoradiotherapy with histogram distances characterizing longitudinal changes in 18F-FDG uptake patterns

    Science.gov (United States)

    Tan, Shan; Zhang, Hao; Zhang, Yongxue; Chen, Wengen; D’Souza, Warren D.; Lu, Wei

    2013-01-01

    Purpose: A family of fluorine-18 (18F)-fluorodeoxyglucose (18F-FDG) positron-emission tomography (PET) features based on histogram distances is proposed for predicting pathologic tumor response to neoadjuvant chemoradiotherapy (CRT). These features describe the longitudinal change of FDG uptake distribution within a tumor. Methods: Twenty patients with esophageal cancer treated with CRT plus surgery were included in this study. All patients underwent PET/CT scans before (pre-) and after (post-) CRT. The two scans were first rigidly registered, and the original tumor sites were then manually delineated on the pre-PET/CT by an experienced nuclear medicine physician. Two histograms representing the FDG uptake distribution were extracted from the pre- and the registered post-PET images, respectively, both within the delineated tumor. Distances between the two histograms quantify longitudinal changes in FDG uptake distribution resulting from CRT, and thus are potential predictors of tumor response. A total of 19 histogram distances were examined and compared to both traditional PET response measures and Haralick texture features. Receiver operating characteristic analyses and Mann-Whitney U test were performed to assess their predictive ability. Results: Among all tested histogram distances, seven bin-to-bin and seven crossbin distances outperformed traditional PET response measures using maximum standardized uptake value (AUC = 0.70) or total lesion glycolysis (AUC = 0.80). The seven bin-to-bin distances were: L2 distance (AUC = 0.84), χ2 distance (AUC = 0.83), intersection distance (AUC = 0.82), cosine distance (AUC = 0.83), squared Euclidean distance (AUC = 0.83), L1 distance (AUC = 0.82), and Jeffrey distance (AUC = 0.82). The seven crossbin distances were: quadratic-chi distance (AUC = 0.89), earth mover distance (AUC = 0.86), fast earth mover distance (AUC = 0.86), diffusion distance (AUC = 0.88), Kolmogorov-Smirnov distance (AUC = 0.88), quadratic form distance
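
    Several of the bin-to-bin distances named above reduce to a few lines of array arithmetic. The sketch below shows plausible textbook definitions for a subset of them, assuming two pre- and post-CRT uptake histograms over identical bins; it is an illustration, not the authors' code, and the Jeffrey distance is written as the symmetric divergence to the average histogram.

    ```python
    import numpy as np

    def bin_to_bin_distances(h1, h2, eps=1e-12):
        """A few classic bin-to-bin histogram distances (a sketch).

        h1, h2: 1D uptake histograms over the same bins, e.g. pre- and
        post-CRT; both are normalized to unit sum before comparison.
        """
        h1 = np.asarray(h1, dtype=float)
        h2 = np.asarray(h2, dtype=float)
        h1 = h1 / (h1.sum() + eps)
        h2 = h2 / (h2.sum() + eps)
        m = (h1 + h2) / 2.0
        return {
            "L1": np.abs(h1 - h2).sum(),
            "L2": np.sqrt(((h1 - h2) ** 2).sum()),
            "chi2": (((h1 - h2) ** 2) / (h1 + h2 + eps)).sum(),
            "intersection": 1.0 - np.minimum(h1, h2).sum(),  # as a distance
            "cosine": 1.0 - (h1 @ h2) / (np.linalg.norm(h1) * np.linalg.norm(h2) + eps),
            "jeffrey": (h1 * np.log((h1 + eps) / (m + eps))).sum()
                       + (h2 * np.log((h2 + eps) / (m + eps))).sum(),
        }
    ```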

  3. A New Method of Histogram Computation for Efficient Implementation of the HOG Algorithm

    Directory of Open Access Journals (Sweden)

    Mariana-Eugenia Ilas

    2018-03-01

    In this paper we introduce a new histogram computation method to be used within the histogram of oriented gradients (HOG) algorithm. The new method replaces the arctangent with a slope computation, and the classical magnitude allocation based on interpolation with a simpler algorithm. The new method allows a more efficient implementation of HOG in general, and particularly in field-programmable gate arrays (FPGAs), by considerably reducing the area (thus increasing the level of parallelism), while maintaining classification accuracy very close to that of the original algorithm. Thus, the new method is attractive for many applications, including car detection and classification.
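
    As a rough illustration of the arctangent-free idea (not the authors' exact scheme): unsigned orientation can be represented on (-90, 90] degrees, where the tangent is monotone, so the bin can be found by comparing the slope gy/gx against precomputed tangents of the bin boundaries, and the magnitude is allocated wholly to that bin instead of being interpolated. Names and the bin layout below are illustrative.

    ```python
    import numpy as np

    def orientation_bins_by_slope(gx, gy, n_bins=9):
        """Bin unsigned gradient orientations without computing an arctangent.

        Sketch only: tan() is monotone on (-90, 90) degrees, so comparing
        the slope gy/gx with the tangents of the bin boundaries gives the
        same bin that atan2 would. The whole magnitude goes to that bin
        (no interpolation between neighboring bins).
        """
        gx = np.asarray(gx, dtype=float)
        gy = np.asarray(gy, dtype=float)
        # Slopes of the interior bin boundaries on (-90, 90) degrees.
        bounds = np.tan(np.deg2rad(np.linspace(-90.0, 90.0, n_bins + 1)[1:-1]))
        with np.errstate(divide="ignore", invalid="ignore"):
            slope = gy / gx                    # +/-inf for vertical gradients
        bins = np.digitize(slope, bounds)      # indices 0 .. n_bins-1
        magnitude = np.hypot(gx, gy)           # zero gradients weigh nothing
        return bins, magnitude
    ```

    The per-cell histogram then follows from, e.g., np.bincount(bins.ravel(), weights=magnitude.ravel(), minlength=n_bins); in hardware the division itself can be avoided by cross-multiplying gy against the boundary slopes times gx.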

  4. Bin Set 1 Calcine Retrieval Feasibility Study

    Energy Technology Data Exchange (ETDEWEB)

    R. D. Adams; S. M. Berry; K. J. Galloway; T. A. Langenwalter; D. A. Lopez; C. M. Noakes; H. K. Peterson; M. I. Pope; R. J. Turk

    1999-10-01

    At the Department of Energy's Idaho Nuclear Technology and Engineering Center, as an interim waste management measure, both mixed high-level liquid waste and sodium bearing waste have been solidified by a calcination process and are stored in the Calcine Solids Storage Facilities. This calcined product will eventually be treated to allow final disposal in a national geologic repository. The Calcine Solids Storage Facilities comprise seven ''bin sets.'' Bin Set 1, the first to be constructed, was completed in 1959, and has been in service since 1963. It is the only bin set that does not meet current safe-shutdown earthquake seismic criteria. In addition, it is the only bin set that lacks built-in features to aid in calcine retrieval. One option to alleviate the seismic compliance issue is to transport the calcine from Bin Set 1 to another bin set which has the required capacity and which is seismically qualified. This report studies the feasibility of retrieving the calcine from Bin Set 1 and transporting it into Bin Set 6, which is located approximately 650 feet away. Because Bin Set 1 was not designed for calcine retrieval, and because of the high radiation levels and potential contamination spread from the calcined material, this is a challenging engineering task. This report presents preconceptual design studies for remotely-operated, low-density, pneumatic vacuum retrieval and transport systems and equipment that are based on past work performed by the Raytheon Engineers and Constructors architectural engineering firm. The designs presented are considered feasible; however, future development work will be needed in several areas during the subsequent conceptual design phase.

  5. Bin Set 1 Calcine Retrieval Feasibility Study

    International Nuclear Information System (INIS)

    Adams, R.D.; Berry, S.M.; Galloway, K.J.; Langenwalter, T.A.; Lopez, D.A.; Noakes, C.M.; Peterson, H.K.; Pope, M.I.; Turk, R.J.

    1999-01-01

    At the Department of Energy's Idaho Nuclear Technology and Engineering Center, as an interim waste management measure, both mixed high-level liquid waste and sodium bearing waste have been solidified by a calcination process and are stored in the Calcine Solids Storage Facilities. This calcined product will eventually be treated to allow final disposal in a national geologic repository. The Calcine Solids Storage Facilities comprise seven ''bin sets.'' Bin Set 1, the first to be constructed, was completed in 1959, and has been in service since 1963. It is the only bin set that does not meet current safe-shutdown earthquake seismic criteria. In addition, it is the only bin set that lacks built-in features to aid in calcine retrieval. One option to alleviate the seismic compliance issue is to transport the calcine from Bin Set 1 to another bin set which has the required capacity and which is seismically qualified. This report studies the feasibility of retrieving the calcine from Bin Set 1 and transporting it into Bin Set 6, which is located approximately 650 feet away. Because Bin Set 1 was not designed for calcine retrieval, and because of the high radiation levels and potential contamination spread from the calcined material, this is a challenging engineering task. This report presents preconceptual design studies for remotely-operated, low-density, pneumatic vacuum retrieval and transport systems and equipment that are based on past work performed by the Raytheon Engineers and Constructors architectural engineering firm. The designs presented are considered feasible; however, future development work will be needed in several areas during the subsequent conceptual design phase

  6. Regionally adaptive histogram equalization of the chest

    International Nuclear Information System (INIS)

    Sherrier, R.H.; Johnson, G.A.

    1986-01-01

    Advances in digital chest radiography have resulted in the acquisition of high-quality digital images of the human chest. With these advances there arises a genuine need for image processing algorithms specific to chest images. The author has implemented the technique of histogram equalization, noting the problems encountered when it is adapted to chest images. These problems have been successfully solved with a regionally adaptive histogram equalization method. Histograms are calculated locally and then modified according to both the mean pixel value of a given region and certain characteristics of the cumulative distribution function. The method allows certain regions of the chest radiograph to be enhanced differentially.

  7. Color Histogram Diffusion for Image Enhancement

    Science.gov (United States)

    Kim, Taemin

    2011-01-01

    Various color histogram equalization (CHE) methods have been proposed to extend grayscale histogram equalization (GHE) for color images. In this paper a new method called histogram diffusion that extends the GHE method to arbitrary dimensions is proposed. Ranges in a histogram are specified as overlapping bars of uniform heights and variable widths which are proportional to their frequencies. This diagram is called the vistogram. As an alternative approach to GHE, the squared error of the vistogram from the uniform distribution is minimized. Each bar in the vistogram is approximated by a Gaussian function. Gaussian particles in the vistogram diffuse as a nonlinear autonomous system of ordinary differential equations. CHE results on color images showed that the approach is effective.

  8. Adaptive histogram equalization and its variations

    NARCIS (Netherlands)

    Pizer, S.M.; Amburn, E.P.; Austin, J.D.; Cromartie, R.; Geselowitz, A.; Greer, Trey; Haar Romenij, ter B.M.; Zimmerman, J.B.; Zuiderveld, K.J.

    1987-01-01

    Adaptive histogram equalization (ahe) is a contrast enhancement method designed to be broadly applicable and having demonstrated effectiveness. However, slow speed and the overenhancement of noise it produces in relatively homogeneous regions are two problems. We report algorithms designed to

  9. The Online Histogram Presenter for the ATLAS experiment: A modular system for histogram visualization

    International Nuclear Information System (INIS)

    Dotti, Andrea; Adragna, Paolo; Vitillo, Roberto A

    2010-01-01

    The Online Histogram Presenter (OHP) is the ATLAS tool to display histograms produced by the online monitoring system. In spite of the name, the Online Histogram Presenter is much more than just a histogram display. To cope with the large amount of data, the application has been designed to minimise the network traffic; sophisticated caching, hashing and filtering algorithms reduce memory and CPU usage. The system uses Qt and ROOT for histogram visualisation and manipulation. In addition, histogram visualisation can be extensively customised through configuration files. Finally, its very modular architecture features a lightweight plug-in system, allowing extensions to accommodate specific user needs. After an architectural overview of the application, the paper presents in detail the solutions adopted to increase performance and a description of the plug-in system.

  10. The Online Histogram Presenter for the ATLAS experiment: A modular system for histogram visualization

    Energy Technology Data Exchange (ETDEWEB)

    Dotti, Andrea [CERN, CH-1211 Genve 23 Switzerland (Switzerland); Adragna, Paolo [Physics Department, Queen Mary, University of London Mile End Road London E1 4RP UK (United Kingdom); Vitillo, Roberto A, E-mail: andrea.dotti@cern.c [INFN Sezione di Pisa, Ed. C Largo Bruno Pontecorvo 3, 56127 Pisa (Italy)

    2010-04-01

    The Online Histogram Presenter (OHP) is the ATLAS tool to display histograms produced by the online monitoring system. In spite of the name, the Online Histogram Presenter is much more than just a histogram display. To cope with the large amount of data, the application has been designed to minimise the network traffic; sophisticated caching, hashing and filtering algorithms reduce memory and CPU usage. The system uses Qt and ROOT for histogram visualisation and manipulation. In addition, histogram visualisation can be extensively customised through configuration files. Finally, its very modular architecture features a lightweight plug-in system, allowing extensions to accommodate specific user needs. After an architectural overview of the application, the paper presents in detail the solutions adopted to increase performance and a description of the plug-in system.

  11. The equivalent Histograms in clinical practice

    International Nuclear Information System (INIS)

    Pizarro Trigo, F.; Teijeira Garcia, M.; Zaballos Carrera, S.

    2013-01-01

    The tolerances established for organs at risk [1] in standard fractionation schemes (2 Gy/session, 5 sessions per week) are frequently misused when applied to dose-volume histograms from non-standard schemes. The purpose of this work is to establish when this misuse may be most significant, and to carry out a transformation of dose-volume histograms from non-standard fractionation. A case that can be useful for making clinical decisions is presented. (Author)

  12. Osama bin Ladeni uus imidzh / Urmas Kiil

    Index Scriptorium Estoniae

    Kiil, Urmas

    2007-01-01

    After a three-year pause, the leader of the terrorist organization al-Qaeda, Osama bin Laden, appeared before the world public twice again. Most experts hold that the video addresses of the world's top terrorist were authentic and were probably recorded in July or August.

  13. Binning metagenomic contigs by coverage and composition

    NARCIS (Netherlands)

    Alneberg, J.; Bjarnason, B.S.; Bruijn, de I.; Schirmer, M.; Quick, J.; Ijaz, U.Z.; Lahti, L.M.; Loman, N.J.; Andersson, A.F.; Quince, C.

    2014-01-01

    Shotgun sequencing enables the reconstruction of genomes from complex microbial communities, but because assembly does not reconstruct entire genomes, it is necessary to bin genome fragments. Here we present CONCOCT, a new algorithm that combines sequence composition and coverage across multiple

  14. The Research of Histogram Enhancement Technique Based on Matlab Software

    Directory of Open Access Journals (Sweden)

    Li Kai

    2014-08-01

    Histogram enhancement has been widely applied as a typical technique in digital image processing. Based on Matlab software, this paper uses histogram equalization and histogram specification to process darker images, employing partial equalization and histogram mapping to transform the original histograms and thereby enhance the image information. The results show that both techniques can significantly improve image quality and enhance image features.
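
    For reference, the first of the two techniques is compact enough to sketch outside Matlab; the following NumPy version (an illustration, not the paper's code) maps 8-bit gray levels through the normalized cumulative histogram.

    ```python
    import numpy as np

    def equalize(image):
        """Global histogram equalization of an 8-bit grayscale image (sketch).

        Gray levels are mapped through the normalized cumulative histogram
        so they spread evenly over the full range; 'image' is a uint8 array.
        """
        hist = np.bincount(image.ravel(), minlength=256)
        cdf = np.cumsum(hist).astype(float)
        cdf_min = cdf[cdf > 0][0]                # first occupied gray level
        lut = np.round(255.0 * (cdf - cdf_min) / max(cdf[-1] - cdf_min, 1.0))
        return np.clip(lut, 0, 255).astype(np.uint8)[image]
    ```

    Histogram specification works the same way except that the image's cumulative mapping is composed with the inverse cumulative mapping of a chosen target histogram.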

  15. ESG Allocations

    Data.gov (United States)

    Department of Housing and Urban Development — This report displays the Emergency Solutions Grants (ESG), formerly Emergency Shelter Grants, allocation by jurisdiction. The website allows users to look at...

  16. ACTION RECOGNITION USING SALIENT NEIGHBORING HISTOGRAMS

    DEFF Research Database (Denmark)

    Ren, Huamin; Moeslund, Thomas B.

    2013-01-01

    Combining spatio-temporal interest points with Bag-of-Words models achieves state-of-the-art performance in action recognition. However, existing methods based on “bag-of-words” models either are too local to capture the variance in space/time or fail to solve the ambiguity problem in the spatial and temporal dimensions. Instead, we propose a salient vocabulary construction algorithm to select visual words from a global point of view, and form compact descriptors to represent discriminative histograms in the neighborhoods. Those salient neighboring histograms are then trained to model different actions...

  17. Oriented Shape Index Histograms for Cell Classification

    DEFF Research Database (Denmark)

    Larsen, Anders Boesen Lindbo; Dahl, Anders Bjorholm; Larsen, Rasmus

    2015-01-01

    We propose a novel extension to the shape index histogram feature descriptor where the orientation of the second-order curvature is included in the histograms. The orientation of the shape index is reminiscent of, but not equal to, gradient orientation, which is widely used for feature description. We evaluate our new feature descriptor using a public dataset consisting of HEp-2 cell images from indirect immunofluorescence lighting. Our results show that we can improve classification performance significantly when including the shape index orientation. Notably, we show that shape index orientation...

  18. Pakistan andis Osama bin Ladeni jahtimisel alla / Heiki Suurkask

    Index Scriptorium Estoniae

    Suurkask, Heiki, 1972-

    2004-01-01

    According to Pakistani President Pervez Musharraf, the search for Osama bin Laden in South Waziristan has been fruitless. Henceforth the search will focus on North Waziristan; it is presumed that bin Laden may also be hiding in the Tora Bora cave complex. Annex: the toothless US intelligence service.

  19. Neutrosophic Similarity Score Based Weighted Histogram for Robust Mean-Shift Tracking

    Directory of Open Access Journals (Sweden)

    Keli Hu

    2017-10-01

    Visual object tracking is a critical task in computer vision, and challenges always exist when an object needs to be tracked. For instance, background clutter is one of the most challenging problems. The mean-shift tracker is quite popular because of its efficiency and performance in a range of conditions. However, background clutter also disturbs its performance. In this article, we propose a novel weighted histogram based on a neutrosophic similarity score to help the mean-shift tracker discriminate the target from the background. The neutrosophic set (NS) is a new branch of philosophy for dealing with incomplete, indeterminate, and inconsistent information. In this paper, we utilize the single valued neutrosophic set (SVNS), a subclass of NS, to improve the mean-shift tracker. First, two kinds of criteria are considered, the object feature similarity and the background feature similarity, and each bin of the weight histogram is represented in the SVNS domain via three membership functions T (truth), I (indeterminacy), and F (falsity). Second, the neutrosophic similarity score function is introduced to fuse those two criteria and to build the final weight histogram. Finally, a novel neutrosophic weighted mean-shift tracker is proposed. The proposed tracker is compared with several mean-shift based trackers on a dataset of 61 public sequences. The results reveal that our method outperforms the other trackers, especially when confronting background clutter.

  20. Control system of hexacopter using color histogram footprint and convolutional neural network

    Science.gov (United States)

    Ruliputra, R. N.; Darma, S.

    2017-07-01

    The development of unmanned aerial vehicles (UAVs) has been growing rapidly in recent years. Logic implemented in program algorithms is needed to make a smart system. By using visual input from a camera, a UAV is able to fly autonomously by detecting a target. However, some weaknesses arise, as usage in an outdoor environment may change the target's color intensity. The color histogram footprint overcomes this problem because it divides color intensity into separate bins, which makes the detection tolerant to slight changes in color intensity. Template matching compares the detection result with a template of the reference image to determine the target position, which is used to position the vehicle in the middle of the target with visual feedback control based on a Proportional-Integral-Derivative (PID) controller. The color histogram footprint method localizes the target by calculating the back projection of its histogram. It has an average success rate of 77% from a distance of 1 meter, and it can position the vehicle in the middle of the target using visual feedback control with an average positioning time of 73 seconds. After the hexacopter is in the middle of the target, a Convolutional Neural Network (CNN) classifies a number contained in the target image to determine a task depending on the classified number: either landing, yawing, or return to launch. The recognition result shows an optimum success rate of 99.2%.
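
    The back-projection step mentioned above can be sketched in a few lines. The fragment below is an illustration under stated assumptions (a hue-only feature, hue range [0, 180) as in OpenCV's HSV conversion, illustrative names and bin count): each frame pixel is scored by its bin's value in the target's normalized hue histogram.

    ```python
    import numpy as np

    def hue_backprojection(frame_hue, target_hue, n_bins=30):
        """Score frame pixels by the target's hue histogram (a sketch).

        Each pixel of the frame receives the value of its bin in the
        target's normalized hue histogram; high values mark target-colored
        regions. Hue is assumed in [0, 180), as in OpenCV's HSV images.
        """
        hist, edges = np.histogram(target_hue, bins=n_bins, range=(0.0, 180.0))
        hist = hist / max(hist.max(), 1)                 # scale to [0, 1]
        idx = np.clip(np.digitize(frame_hue, edges[1:-1]), 0, n_bins - 1)
        return hist[idx]                                 # likelihood map
    ```

    The centroid of the thresholded likelihood map then yields the position error that the PID-based visual feedback loop drives to zero.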

  1. Infrared Small Moving Target Detection via Saliency Histogram and Geometrical Invariability

    Directory of Open Access Journals (Sweden)

    Minjie Wan

    2017-06-01

    In order to detect both bright and dark small moving targets effectively in infrared (IR) video sequences, a method based on a saliency histogram and geometrical invariability is presented in this paper. First, a saliency map that roughly highlights the salient regions of the original image is obtained by tuning its amplitude spectrum in the frequency domain. Then, a saliency histogram is constructed by averaging the accumulated saliency value of each gray level in the map, through which bins corresponding to bright and dark targets are assigned large values in the histogram. Next, single-frame detection of candidate targets is accomplished by binarized segmentation using an adaptive threshold, and their centroid coordinates with sub-pixel accuracy are calculated through a connected components labeling method as well as a gray-weighted criterion. Finally, considering the motion characteristics in consecutive frames, an inter-frame false alarm suppression method based on geometrical invariability is developed to further improve the precision rate. Quantitative analyses demonstrate that the detection precision of the proposed approach can reach 97%, and Receiver Operating Characteristic (ROC) curves further verify that our method outperforms other state-of-the-art methods in both detection rate and false alarm rate.

  2. The grumpy bin : reducing food waste through playful social interactions

    NARCIS (Netherlands)

    Altarriba, F.; Funk, M.; Lanzani, S.E.; Torralba, A.

    2017-01-01

    Domestic food waste is a world-wide problem that is complex and difficult to tackle as it touches diverse habits and social behaviors. This paper introduces the Grumpy Bin, a smart food waste bin designed for the context of student housing. The Grumpy Bin contributes to the state of the art of food

  3. MaxBin 2.0: an automated binning algorithm to recover genomes from multiple metagenomic datasets

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Yu-Wei [Joint BioEnergy Inst. (JBEI), Emeryville, CA (United States); Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Simmons, Blake A. [Joint BioEnergy Inst. (JBEI), Emeryville, CA (United States); Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Singer, Steven W. [Joint BioEnergy Inst. (JBEI), Emeryville, CA (United States); Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-10-29

    The recovery of genomes from metagenomic datasets is a critical step to defining the functional roles of the underlying uncultivated populations. We previously developed MaxBin, an automated binning approach for high-throughput recovery of microbial genomes from metagenomes. Here, we present an expanded binning algorithm, MaxBin 2.0, which recovers genomes from co-assembly of a collection of metagenomic datasets. Tests on simulated datasets revealed that MaxBin 2.0 is highly accurate in recovering individual genomes, and the application of MaxBin 2.0 to several metagenomes from environmental samples demonstrated that it could achieve two complementary goals: recovering more bacterial genomes compared to binning a single sample as well as comparing the microbial community composition between different sampling environments. Availability and implementation: MaxBin 2.0 is freely available at http://sourceforge.net/projects/maxbin/ under BSD license. Supplementary information: Supplementary data are available at Bioinformatics online.

  4. Maintainability allocation

    International Nuclear Information System (INIS)

    Guyot, Christian.

    1980-06-01

    The author gives the general lines of a method for the allocation and the evaluation of the maintainability of complex systems, which is to be developed during the conference. The maintainability objective is assumed to be formulated in the form of a mean time to repair (M.T.T.R.). [fr]

  5. Comparing Online Algorithms for Bin Packing Problems

    DEFF Research Database (Denmark)

    Epstein, Leah; Favrholdt, Lene Monrad; Kohrt, Jens Svalgaard

    2012-01-01

    The relative worst-order ratio is a measure of the quality of online algorithms. In contrast to the competitive ratio, this measure compares two online algorithms directly instead of using an intermediate comparison with an optimal offline algorithm. In this paper, we apply the relative worst-order ratio to online algorithms for several common variants of the bin packing problem. We mainly consider pairs of algorithms that are not distinguished by the competitive ratio and show that the relative worst-order ratio prefers the intuitively better algorithm of each pair.
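
    As a concrete example of the kind of algorithm being compared, here is a minimal sketch of online First-Fit, one of the classic bin packing strategies (the choice of First-Fit here is ours, for illustration): each arriving item goes into the first open bin with room, and a new bin opens only when none fits.

    ```python
    def first_fit(items, capacity=1.0):
        """Online First-Fit bin packing (a sketch; illustrative names).

        Each arriving item is placed into the first open bin with enough
        room; a new bin is opened only when no open bin fits the item.
        Items are sizes in (0, capacity]. Returns the number of bins used.
        """
        bins = []                            # remaining capacity per bin
        for size in items:
            for i, free in enumerate(bins):
                if size <= free + 1e-12:     # tolerate float round-off
                    bins[i] = free - size
                    break
            else:                            # no open bin fits
                bins.append(capacity - size)
        return len(bins)
    ```

    Pairs such as First-Fit versus Best-Fit, which share the same asymptotic competitive ratio, are exactly the cases where a direct pairwise measure like the relative worst-order ratio can still express a preference.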

  6. Histogram-based adaptive gray level scaling for texture feature classification of colorectal polyps

    Science.gov (United States)

    Pomeroy, Marc; Lu, Hongbing; Pickhardt, Perry J.; Liang, Zhengrong

    2018-02-01

    Texture features have played an ever increasing role in computer aided detection (CADe) and diagnosis (CADx) methods since their inception. Texture features are often used as a method of false positive reduction for CADe packages, especially for detecting colorectal polyps and distinguishing them from falsely tagged residual stool and healthy colon wall folds. While texture features have shown great success there, the performance of texture features for CADx has lagged behind, primarily because of the more similar features among different polyp types. In this paper, we present an adaptive gray level scaling and compare it to the conventional equal spacing of gray level bins. We use a dataset taken from computed tomography colonography patients, with 392 polyp regions of interest (ROIs) identified and a confirmed diagnosis through pathology. Using the histogram information from the entire ROI dataset, we generate the gray level bins such that each bin contains roughly the same number of voxels. Each image ROI is then scaled down to two different numbers of gray levels, using both an equal spacing of Hounsfield units for each bin and our adaptive method. We compute a set of texture features from the scaled images, including 30 gray level co-occurrence matrix (GLCM) features and 11 gray level run length matrix (GLRLM) features. Using a random forest classifier to distinguish between hyperplastic polyps and all others (adenomas and adenocarcinomas), we find that the adaptive gray level scaling can improve performance, based on the area under the receiver operating characteristic curve, by up to 4.6%.
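
    The adaptive scaling described above amounts to equal-frequency (quantile) binning. A minimal sketch, with illustrative names, assuming the bin edges are derived from the pooled intensity histogram of the whole ROI dataset:

    ```python
    import numpy as np

    def adaptive_gray_levels(pooled_voxels, roi, n_levels=32):
        """Equal-frequency (adaptive) gray level quantization (a sketch).

        Bin edges are the quantiles of the pooled intensity distribution of
        the whole ROI dataset, so each gray level holds roughly the same
        number of voxels, instead of spanning equal Hounsfield-unit widths.
        """
        qs = np.linspace(0.0, 1.0, n_levels + 1)[1:-1]   # interior quantiles
        edges = np.quantile(pooled_voxels, qs)           # dataset-wide edges
        return np.digitize(roi, edges)                   # levels 0..n_levels-1
    ```

    The quantized ROI then feeds the GLCM and GLRLM texture computations in place of the equally spaced Hounsfield-unit quantization.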

  7. Efficient contrast enhancement through log-power histogram modification

    NARCIS (Netherlands)

    Wu, T.; Toet, A.

    2014-01-01

    A simple power-logarithm histogram modification operator is proposed to enhance digital image contrast. First a logarithm operator reduces the effect of spikes and transforms the image histogram into a smoothed one that approximates a uniform histogram while retaining the relative size ordering of

  8. SU-G-BRC-08: Evaluation of Dose Mass Histogram as a More Representative Dose Description Method Than Dose Volume Histogram in Lung Cancer Patients

    Energy Technology Data Exchange (ETDEWEB)

    Liu, J; Eldib, A; Ma, C [Fox Chase Cancer Center, Philadelphia, PA (United States); Lin, M [The University of Texas Southwestern Medical Ctr, Dallas, TX (United States); Li, J [Cyber Medical Inc, Xian, Shaanxi (China); Mora, G [Universidade de Lisboa, Codex, Lisboa (Portugal)

    2016-06-15

    Purpose: The dose-volume-histogram (DVH) is widely used for plan evaluation in radiation treatment. The concept of the dose-mass-histogram (DMH) is expected to provide a more representative description as it accounts for heterogeneity in tissue density. This study is intended to assess the difference between DVH and DMH for evaluating treatment planning quality. Methods: 12 lung cancer treatment plans were exported from the treatment planning system. DVHs for the planning target volume (PTV), the normal lung and other structures of interest were calculated. DMHs were calculated in a similar way to DVHs except that the voxel density converted from the CT number was used in tallying the dose histogram bins. The equivalent uniform dose (EUD) was calculated based on voxel volume and mass, respectively. The normal tissue complication probability (NTCP) in relation to the EUD was calculated for the normal lung to provide a quantitative comparison of DVHs and DMHs for evaluating the radiobiological effect. Results: Large differences were observed between DVHs and DMHs for lungs and PTVs. For PTVs with dense tumor cores, DMHs are higher than DVHs due to the larger mass weighting in the high dose conformal core regions. For the normal lungs, DMHs can either be higher or lower than DVHs depending on the target location within the lung. When the target is close to the lower lung, DMHs show higher values than DVHs because the lower lung has a higher density than the central portion or the upper lung. DMHs are lower than DVHs for targets in the upper lung. The calculated NTCPs showed a large range of difference between DVHs and DMHs. Conclusion: The heterogeneity of the lung can be well considered using DMH for evaluating target coverage and normal lung pneumonitis. Further studies are warranted to quantify the benefits of DMH over DVH for plan quality evaluation.
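
    The difference between the two histograms comes down to the per-voxel weight used when tallying dose bins. A minimal sketch (illustrative names, not the authors' code): weighting every voxel by the same volume yields the conventional DVH, while weighting by voxel mass, i.e. volume times the density converted from the CT number, yields the DMH.

    ```python
    import numpy as np

    def cumulative_dose_histogram(dose, weights, edges):
        """Cumulative dose histogram with per-voxel weights (a sketch).

        dose, weights: 1D arrays over one structure's voxels. The returned
        curve gives the weighted fraction receiving at least each bin dose.
        """
        h, _ = np.histogram(dose, bins=edges, weights=weights)
        tail = h[::-1].cumsum()[::-1]            # weight receiving >= dose
        return tail / max(weights.sum(), 1e-12)

    # Illustrative use: the same volume per voxel gives the DVH, volume
    # times the CT-derived density gives the DMH.
    # edges = np.linspace(0.0, dose.max(), 101)
    # dvh = cumulative_dose_histogram(dose, np.full_like(dose, voxel_volume), edges)
    # dmh = cumulative_dose_histogram(dose, voxel_volume * density, edges)
    ```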

  9. Gamma histograms for radiotherapy plan evaluation

    International Nuclear Information System (INIS)

    Spezi, Emiliano; Lewis, D. Geraint

    2006-01-01

    Background and purpose: The technique known as the 'γ evaluation method' incorporates pass-fail criteria for both distance-to-agreement and dose difference analysis of 3D dose distributions and provides a numerical index (γ) as a measure of the agreement between two datasets. As the γ evaluation index is being adopted in more centres as part of treatment plan verification procedures for 2D and 3D dose maps, the development of methods capable of encapsulating the information provided by this technique is recommended. Patients and methods: In this work the concept of γ index was extended to create gamma histograms (GH) in order to provide a measure of the agreement between two datasets in two or three dimensions. Gamma area histogram (GAH) and gamma volume histogram (GVH) graphs were produced using one or more 2D γ maps generated for each slice of the irradiated volume. GHs were calculated for IMRT plans, evaluating the 3D dose distribution from a commercial treatment planning system (TPS) compared to a Monte Carlo (MC) calculation used as reference dataset. Results: The extent of local anatomical inhomogenities in the plans under consideration was strongly correlated with the level of difference between reference and evaluated calculations. GHs provided an immediate visual representation of the proportion of the treated volume that fulfilled the γ criterion and offered a concise method for comparative numerical evaluation of dose distributions. Conclusions: We have introduced the concept of GHs and investigated its applications to the evaluation and verification of IMRT plans. The gamma histogram concept set out in this paper can provide a valuable technique for quantitative comparison of dose distributions and could be applied as a tool for the quality assurance of treatment planning systems

  10. Robust histogram-based image retrieval

    Czech Academy of Sciences Publication Activity Database

    Höschl, Cyril; Flusser, Jan

    2016-01-01

    Roč. 69, č. 1 (2016), s. 72-81 ISSN 0167-8655 R&D Projects: GA ČR GA15-16928S Institutional support: RVO:67985556 Keywords : Image retrieval * Noisy image * Histogram * Convolution * Moments * Invariants Subject RIV: JD - Computer Applications, Robotics Impact factor: 1.995, year: 2016 http://library.utia.cas.cz/separaty/2015/ZOI/hoschl-0452147.pdf

  11. Unstable Periodic Orbit Analysis of Histograms of Chaotic Time Series

    International Nuclear Information System (INIS)

    Zoldi, S.M.

    1998-01-01

    Using the Lorenz equations, we have investigated whether unstable periodic orbits (UPOs) associated with a strange attractor may predict the occurrence of the robust sharp peaks in histograms of some experimental chaotic time series. Histograms with sharp peaks occur for the Lorenz parameter value r=60.0 but not for r=28.0 , and the sharp peaks for r=60.0 do not correspond to a histogram derived from any single UPO. However, we show that histograms derived from the time series of a non-Axiom-A chaotic system can be accurately predicted by an escape-time weighting of UPO histograms. copyright 1998 The American Physical Society

  12. Corrosion monitoring of storage bins for radioactive calcines

    International Nuclear Information System (INIS)

    Hoffman, T.L.

    1975-01-01

    Highly radioactive liquid waste produced at the Idaho Chemical Processing Plant is calcined to a granular solid for long term storage in stainless steel bins. Corrosion evaluation of coupons withdrawn from these bins indicates excellent performance for the materials of construction of the bins. At exposure periods of up to six years the average penetration rates are 0.01 and 0.05 mils per year for Types 304 and 405 stainless steels, respectively. (auth)

  13. A histogram-free multicanonical Monte Carlo algorithm for the construction of analytical density of states

    Energy Technology Data Exchange (ETDEWEB)

    Eisenbach, Markus [ORNL; Li, Ying Wai [ORNL

    2017-06-01

    We report a new multicanonical Monte Carlo (MC) algorithm to obtain the density of states (DOS) for physical systems with continuous state variables in statistical mechanics. Our algorithm is able to obtain an analytical form for the DOS expressed in a chosen basis set, instead of a numerical array of finite resolution as in previous variants of this class of MC methods such as the multicanonical (MUCA) sampling and Wang-Landau (WL) sampling. This is enabled by storing the visited states directly in a data set and avoiding the explicit collection of a histogram. This practice also has the advantage of avoiding undesirable artificial errors caused by the discretization and binning of continuous state variables. Our results show that this scheme is capable of obtaining converged results with a much reduced number of Monte Carlo steps, leading to a significant speedup over existing algorithms.

  14. SPHINX--an algorithm for taxonomic binning of metagenomic sequences.

    Science.gov (United States)

    Mohammed, Monzoorul Haque; Ghosh, Tarini Shankar; Singh, Nitin Kumar; Mande, Sharmila S

    2011-01-01

    Compared with composition-based binning algorithms, the binning accuracy and specificity of alignment-based binning algorithms are significantly higher. However, being alignment-based, the latter class of algorithms requires an enormous amount of time and computing resources for binning huge metagenomic datasets. The motivation was to develop a binning approach that can analyze metagenomic datasets as rapidly as composition-based approaches, but nevertheless has the accuracy and specificity of alignment-based algorithms. This article describes a hybrid binning approach (SPHINX) that achieves high binning efficiency by utilizing the principles of both 'composition'- and 'alignment'-based binning algorithms. Validation results with simulated sequence datasets indicate that SPHINX is able to analyze metagenomic sequences as rapidly as composition-based algorithms. Furthermore, the binning efficiency (in terms of accuracy and specificity of assignments) of SPHINX is observed to be comparable with results obtained using alignment-based algorithms. A web server for the SPHINX algorithm is available at http://metagenomics.atc.tcs.com/SPHINX/.

  15. Implementation of Bin Packing Model for Reed Switch Production Planning

    Energy Technology Data Exchange (ETDEWEB)

    Parra, Rainier Romero [Polytechnic University of Baja California, Calle de la Claridad S/N, Col Plutarco Elias Calles, Mexicali, B. C., 21376 (Mexico); Burtseva, Larysa [Engineering Institute, Autonomous University of Baja California, Calle de la Normal S/N, Col. Insurgentes Este, Mexicali, BC, 21280 (Mexico)

    2010-06-17

    This paper presents an approach to resolving a real problem of efficient material selection in reed switch manufacturing. Meeting consumer demand depends on the stochastic results of the classification process, where each lot of switches is distributed into bins according to an electric measure value. Various glass types are employed for switch manufacturing. The effect caused by glass type variation on the switch classification results was investigated. Based on a statistical analysis of real data, the problem is reduced to minimizing the number of lots, taking the glass type into consideration, and is interpreted as a generalization of the bin packing problem. In contrast to the classic bin packing problem, in the considered case an item represents a set of pieces; a container is divided into a number of bins (sub-containers); the bin capacity is variable; there are assignment restrictions between bins and sets of pieces; and items are allowed to be fragmented across bins and containers. The problem has high complexity. A heuristic offline algorithm is proposed to find the quantity, types and packing sequence of containers, and the item fragments associated with containers and bins. The bin capacities do not affect the algorithm.

  16. Implementation of Bin Packing Model for Reed Switch Production Planning

    International Nuclear Information System (INIS)

    Parra, Rainier Romero; Burtseva, Larysa

    2010-01-01

    This paper presents an approach to resolving a real problem of efficient material selection in reed switch manufacturing. Meeting consumer demand depends on the stochastic results of the classification process, where each lot of switches is distributed into bins according to an electric measure value. Various glass types are employed for switch manufacturing. The effect caused by glass type variation on the switch classification results was investigated. Based on a statistical analysis of real data, the problem is reduced to minimizing the number of lots, taking the glass type into consideration, and is interpreted as a generalization of the bin packing problem. In contrast to the classic bin packing problem, in the considered case an item represents a set of pieces; a container is divided into a number of bins (sub-containers); the bin capacity is variable; there are assignment restrictions between bins and sets of pieces; and items are allowed to be fragmented across bins and containers. The problem has high complexity. A heuristic offline algorithm is proposed to find the quantity, types and packing sequence of containers, and the item fragments associated with containers and bins. The bin capacities do not affect the algorithm.

  17. Efficient binning for bitmap indices on high-cardinality attributes

    Energy Technology Data Exchange (ETDEWEB)

    Rotem, Doron; Stockinger, Kurt; Wu, Kesheng

    2004-11-17

    Bitmap indexing is a common technique for indexing high-dimensional data in data warehouses and scientific applications. Though efficient for low-cardinality attributes, query processing can be rather costly for high-cardinality attributes due to the large storage requirements for the bitmap indices. Binning is a common technique for reducing storage costs of bitmap indices. This technique partitions the attribute values into a number of ranges, called bins, and uses bitmap vectors to represent bins (attribute ranges) rather than distinct values. Although binning may reduce storage costs, it may increase the access costs of queries that do not fall on exact bin boundaries (edge bins). For this kind of query, the original data values associated with edge bins must be accessed in order to check them against the query constraints. In this paper we study the problem of finding optimal locations for the bin boundaries in order to minimize these access costs subject to storage constraints. We propose a dynamic programming algorithm for optimal partitioning of attribute values into bins that takes into account query access patterns as well as data distribution statistics. Mathematical analysis and experiments on real life data sets show that the optimal partitioning achieved by this algorithm can lead to a significant improvement in the access costs of bitmap indexing systems for high-cardinality attributes.

  18. Boundary condition histograms for modulated phases

    International Nuclear Information System (INIS)

    Benakli, M.; Gabay, M.; Saslow, W.M.

    1997-11-01

    Boundary conditions strongly affect the results of numerical computations for finite size inhomogeneous or incommensurate structures. We present a method which allows one to deal with this problem, both for ground state and for critical properties: it combines fluctuating boundary conditions and specific histogram techniques. Our approach concerns classical as well as quantum systems. In particular, current-current correlation functions, which probe the large scale coherence of the states, can be accurately evaluated. We illustrate our method on a frustrated two dimensional XY model. (author)

  19. Miks Osama bin Ladenit pole siiani tabatud? / Peeter Ernits

    Index Scriptorium Estoniae

    Ernits, Peeter, 1953-

    2008-01-01

    The aim of the journalist who spent time in Afghanistan was to find out why, despite all efforts, violence in Afghanistan is not decreasing, and where al-Qaeda leader Osama bin Laden is hiding. Annexes: Osama bin Laden; September 11, 2001.

  20. Design and implement of BESIII online histogramming software

    International Nuclear Information System (INIS)

    Li Fei; Wang Liang; Liu Yingjie; Chinese Academy of Sciences, Beijing; Zhu Kejun; Zhao Jingwei

    2007-01-01

    The online histogramming software is an important part of the BESIII DAQ (Data Acquisition) system. This article introduces the main requirements and design of the online histogramming software and presents how histograms are produced, transmitted and gathered in the distributed environment in the current software implementation. The article also illustrates a simple, easily extensible setup based on an XML configuration database. (authors)

  1. A monitoring program of the histograms based on ROOT package

    International Nuclear Information System (INIS)

    Zhou Yongzhao; Liang Hao; Chen Yixin; Xue Jundong; Yang Tao; Gong Datao; Jin Ge; Yu Xiaoqi

    2002-01-01

    KHBOOK is a histogram monitor and browser based on the ROOT package. It reads histogram files in HBOOK format from Physmon, converts them into ROOT format, and browses the histograms in Repeat and Overlap modes to monitor and trace the quality of the data from the DAQ. KHBOOK is a small-memory, easily maintained and fast-running program, using mono-behavior classes and a C++ communication class.

  2. Decomposition analysis of differential dose volume histograms

    International Nuclear Information System (INIS)

    Heuvel, Frank van den

    2006-01-01

    Dose volume histograms are a common tool to assess the value of a treatment plan for various forms of radiation therapy treatment. The purpose of this work is to introduce, validate, and apply a set of tools to analyze differential dose volume histograms by decomposing them into physically and clinically meaningful normal distributions. A weighted sum of the decomposed normal distributions (e.g., weighted dose) is proposed as a new measure of target dose, rather than the more unstable point dose. The method and its theory are presented and validated using simulated distributions. Additional validation is performed by analyzing simple four field box techniques encompassing a predefined target, using different treatment energies inside a water phantom. Furthermore, two clinical situations are analyzed using this methodology to illustrate practical usefulness. A treatment plan for a breast patient using a tangential field setup with wedges is compared to a comparable geometry using dose compensators. Finally, a normal tissue complication probability (NTCP) calculation is refined using this decomposition. The NTCP calculation is performed on the liver as the organ at risk in a treatment of a mesothelioma patient with involvement of the right lung. The comparison of the wedged breast treatment versus the compensator technique yields comparable classical dose parameters (e.g., conformity index ≅1 and equal dose at the ICRU dose point). The methodology proposed here shows a 4% difference in weighted dose, outlining the difference in treatment using a single parameter instead of at least two in a classical analysis (e.g., mean dose and maximal dose, or total dose variance). NTCP calculations for the mesothelioma case are generated automatically and show a 3% decrease with respect to the classical calculation. The decrease is slightly dependent on the fractionation and on the α/β-value utilized. In conclusion, this method is able to distinguish clinically
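
    One plausible way to perform such a decomposition, sketched below with illustrative choices (scikit-learn's Gaussian mixture fit with BIC model selection, which is our assumption rather than the author's stated procedure), is to fit normal components directly to the per-voxel dose sample underlying the differential DVH; the component weights and means then give the weighted dose as their weighted sum.

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    def decompose_differential_dvh(voxel_doses, max_components=4):
        """Decompose a differential DVH into normal components (a sketch).

        Gaussian mixtures of increasing size are fitted to the per-voxel
        dose sample and the model preferred by BIC is kept; sum(w_k * mu_k)
        over the components is then the 'weighted dose'. BIC-based model
        selection is an assumption made here for illustration.
        """
        x = np.asarray(voxel_doses, dtype=float).reshape(-1, 1)
        fits = [GaussianMixture(n_components=k, random_state=0).fit(x)
                for k in range(1, max_components + 1)]
        best = min(fits, key=lambda m: m.bic(x))
        w = best.weights_
        mu = best.means_.ravel()
        return w, mu, float(w @ mu)      # weights, means, weighted dose
    ```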

  3. Design and Development of a Smart Waste Bin

    Directory of Open Access Journals (Sweden)

    Michael E.

    2017-10-01

    For years the waste bin has been part of our lives, and this has prompted many inventions and innovations to automate it. In this light, much research has been channeled towards opening and closing the bin when the presence of a human is sensed. However, this may be considered less smart, since the bin operates whenever a human is sensed even when there is no intention to use it. To avert this ill, this paper presents the design and development of a smart waste bin. The objective of this paper is to develop a smart waste bin that detects the presence of a person at a particular distance (1 meter) for usage, so as not to spill the dirt, and obeys voice commands to open or close the lid. This is achieved by the use of a PIR sensor, an ultrasonic module, a voice recognition module, an Arduino and a servo motor. Results obtained after testing show that the developed waste bin attains a better level of smartness compared to existing waste bins.

  4. Value-at-risk estimation with fuzzy histograms

    NARCIS (Netherlands)

    Almeida, R.J.; Kaymak, U.

    2008-01-01

    Value at risk (VaR) is a measure for senior management that summarises the financial risk a company faces into one single number. In this paper, we consider the use of fuzzy histograms for quantifying the value-at-risk of a portfolio. It is shown that the use of fuzzy histograms provides a good

  5. Multiple histogram method and static Monte Carlo sampling

    NARCIS (Netherlands)

    Inda, M.A.; Frenkel, D.

    2004-01-01

    We describe an approach to use multiple-histogram methods in combination with static, biased Monte Carlo simulations. To illustrate this, we computed the force-extension curve of an athermal polymer from multiple histograms constructed in a series of static Rosenbluth Monte Carlo simulations. From

  6. Calibration of 14C Histograms : A Comparison of Methods

    NARCIS (Netherlands)

    Stolk, Ad; Törnqvist, Torbjörn E.; Hekhuis, Kilian P.V.; Berendsen, Henk J.A.; Plicht, Johannes van der

    1994-01-01

    The interpretation of C-14 histograms is complicated by the non-linearity of the C-14 time scale in terms of Calendar years, which may result in clustering of C-14 ages in certain time intervals unrelated to the (geologic or archaeologic) phenomenon of interest. One can calibrate C-14 histograms for

  7. Vicarious revenge and the death of Osama bin Laden.

    Science.gov (United States)

    Gollwitzer, Mario; Skitka, Linda J; Wisneski, Daniel; Sjöström, Arne; Liberman, Peter; Nazir, Syed Javed; Bushman, Brad J

    2014-05-01

    Three hypotheses were derived from research on vicarious revenge and tested in the context of the assassination of Osama bin Laden in 2011. In line with the notion that revenge aims at delivering a message (the "message hypothesis"), Study 1 shows that Americans' vengeful desires in the aftermath of 9/11 predicted a sense of justice achieved after bin Laden's death, and that this effect was mediated by perceptions that his assassination sent a message to the perpetrators to not "mess" with the United States. In line with the "blood lust hypothesis," his assassination also sparked a desire to take further revenge and to continue the "war on terror." Finally, in line with the "intent hypothesis," Study 2 shows that Americans (but not Pakistanis or Germans) considered the fact that bin Laden was killed intentionally more satisfactory than the possibility of bin Laden being killed accidentally (e.g., in an airplane crash).

  8. Soft-collinear factorization and zero-bin subtractions

    International Nuclear Information System (INIS)

    Chiu Juiyu; Fuhrer, Andreas; Kelley, Randall; Manohar, Aneesh V.; Hoang, Andre H.

    2009-01-01

    We study the Sudakov form factor for a spontaneously broken gauge theory using a (new) Δ-regulator. To be well defined, the effective theory requires zero-bin subtractions for the collinear sectors. The zero-bin subtractions depend on the gauge boson mass M and are not scaleless. They have both finite and 1/ε contributions and are needed to give the correct anomalous dimension and low-scale matching contributions. We also demonstrate the necessity of zero-bin subtractions for soft-collinear factorization. We find that after zero-bin subtractions the form factor is the sum of the collinear contributions minus a soft mass-mode contribution, in agreement with a previous result of Idilbi and Mehen in QCD. This appears to conflict with the method-of-regions approach, where one gets the sum of contributions from different regions.

  9. AHIMSA - Ad hoc histogram information measure sensing algorithm for feature selection in the context of histogram inspired clustering techniques

    Science.gov (United States)

    Dasarathy, B. V.

    1976-01-01

    An algorithm is proposed for dimensionality reduction in the context of clustering techniques based on histogram analysis. The approach is based on an evaluation of the hills and valleys in the unidimensional histograms along the different features and provides an economical means of assessing the significance of the features in a nonparametric unsupervised data environment. The method has relevance to remote sensing applications.
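
    The hill-and-valley evaluation can be illustrated with a simple peak count on each feature's one-dimensional histogram. The sketch below is an approximation of the idea rather than the AHIMSA measure itself; the prominence threshold and the score are illustrative choices.

    ```python
    import numpy as np
    from scipy.signal import find_peaks

    def histogram_modality_score(values, n_bins=64, prominence=0.01):
        """Score one feature by the hills in its 1D histogram (a sketch).

        A feature whose histogram shows several well-separated modes is a
        better candidate for unsupervised clustering than a unimodal one.
        The prominence threshold and the peak count used as the score are
        illustrative choices, not the AHIMSA measure itself.
        """
        hist, _ = np.histogram(values, bins=n_bins, density=True)
        hist = hist / max(hist.max(), 1e-12)   # normalize tallest hill to 1
        peaks, _ = find_peaks(hist, prominence=prominence)
        return len(peaks)
    ```

    Ranking features by such a score and retaining only the most multimodal ones is the dimensionality-reduction step in this setting.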

  10. Residency Allocation Database

    Data.gov (United States)

    Department of Veterans Affairs — The Residency Allocation Database is used to determine allocation of funds for residency programs offered by Veterans Affairs Medical Centers (VAMCs). Information...

  11. The volatile compound BinBase mass spectral database.

    Science.gov (United States)

    Skogerson, Kirsten; Wohlgemuth, Gert; Barupal, Dinesh K; Fiehn, Oliver

    2011-08-04

    Volatile compounds comprise diverse chemical groups with wide-ranging sources and functions. These compounds originate from major pathways of secondary metabolism in many organisms and play essential roles in chemical ecology in both plant and animal kingdoms. In past decades, sampling methods and instrumentation for the analysis of complex volatile mixtures have improved; however, design and implementation of database tools to process and store the complex datasets have lagged behind. The volatile compound BinBase (vocBinBase) is an automated peak annotation and database system developed for the analysis of GC-TOF-MS data derived from complex volatile mixtures. The vocBinBase DB is an extension of the previously reported metabolite BinBase software developed to track and identify derivatized metabolites. The BinBase algorithm uses deconvoluted spectra and peak metadata (retention index, unique ion, spectral similarity, peak signal-to-noise ratio, and peak purity) from the Leco ChromaTOF software, and annotates peaks using a multi-tiered filtering system with stringent thresholds. The vocBinBase algorithm assigns the identity of compounds existing in the database. Volatile compound assignments are supported by the Adams mass spectral-retention index library, which contains over 2,000 plant-derived volatile compounds. Novel molecules that are not found within vocBinBase are automatically added using strict mass spectral and experimental criteria. Users obtain fully annotated data sheets with quantitative information for all volatile compounds for studies that may consist of thousands of chromatograms. The vocBinBase database may also be queried across different studies, comprising currently 1,537 unique mass spectra generated from 1.7 million deconvoluted mass spectra of 3,435 samples (18 species). Mass spectra with retention indices and volatile profiles are available as free download under the CC-BY agreement (http://vocbinbase.fiehnlab.ucdavis.edu). The Bin

  12. The volatile compound BinBase mass spectral database

    Directory of Open Access Journals (Sweden)

    Barupal Dinesh K

    2011-08-01

    Full Text Available Abstract Background Volatile compounds comprise diverse chemical groups with wide-ranging sources and functions. These compounds originate from major pathways of secondary metabolism in many organisms and play essential roles in chemical ecology in both plant and animal kingdoms. In past decades, sampling methods and instrumentation for the analysis of complex volatile mixtures have improved; however, design and implementation of database tools to process and store the complex datasets have lagged behind. Description The volatile compound BinBase (vocBinBase) is an automated peak annotation and database system developed for the analysis of GC-TOF-MS data derived from complex volatile mixtures. The vocBinBase DB is an extension of the previously reported metabolite BinBase software developed to track and identify derivatized metabolites. The BinBase algorithm uses deconvoluted spectra and peak metadata (retention index, unique ion, spectral similarity, peak signal-to-noise ratio, and peak purity) from the Leco ChromaTOF software, and annotates peaks using a multi-tiered filtering system with stringent thresholds. The vocBinBase algorithm assigns the identity of compounds existing in the database. Volatile compound assignments are supported by the Adams mass spectral-retention index library, which contains over 2,000 plant-derived volatile compounds. Novel molecules that are not found within vocBinBase are automatically added using strict mass spectral and experimental criteria. Users obtain fully annotated data sheets with quantitative information for all volatile compounds for studies that may consist of thousands of chromatograms. The vocBinBase database may also be queried across different studies, comprising currently 1,537 unique mass spectra generated from 1.7 million deconvoluted mass spectra of 3,435 samples (18 species). Mass spectra with retention indices and volatile profiles are available as free download under the CC-BY agreement (http://vocbinbase.fiehnlab.ucdavis.edu).

  13. IMPLEMENTASI METODE HISTOGRAM EQUALIZATION UNTUK MENINGKATKAN KUALITAS CITRA DIGITAL

    Directory of Open Access Journals (Sweden)

    Isa Akhlis

    2012-02-01

    Full Text Available Radiography can be used to help diagnose diseases in the medical field. Radiographic images generally still appear blurred, so processing is needed to remove or reduce the blur. The aim of this study was to design software to improve the quality of digital X-ray images by increasing their contrast. One method for increasing the contrast of a digital image is histogram equalization, which spreads the gray levels of the image evenly over all gray levels. The results show that the histogram equalization method can be used to increase image contrast; this can be seen directly on the monitor screen. Keywords: radiographic image, histogram equalization
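
    A minimal Python sketch of the global histogram equalization transform described above (the standard cumulative-histogram remapping, not the authors' software):

      import numpy as np

      def equalize(img):
          # Map each gray level through the normalized cumulative histogram so
          # the output levels spread over the full 0..255 range.
          hist = np.bincount(img.ravel(), minlength=256)
          cdf = hist.cumsum()
          cdf_min = cdf[cdf > 0][0]        # CDF at the first occupied level
          if img.size == cdf_min:          # constant image: nothing to spread
              return img.copy()
          lut = np.round((cdf - cdf_min) / float(img.size - cdf_min) * 255.0)
          return np.clip(lut, 0, 255).astype(np.uint8)[img]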

  14. HPLOT: the graphics interface package for the HBOOK histogramming package

    International Nuclear Information System (INIS)

    Watkins, H.

    1978-01-01

    The subroutine package HPLOT described in this report enables the CERN histogramming package HBOOK to produce high-quality pictures by means of high-resolution devices such as plotters. HPLOT can be implemented on any scientific computing system with a Fortran IV compiler and can be interfaced with any graphics package; special routines in addition to the basic ones enable users to embellish their histograms. Examples are also given of the use of HPLOT as a graphics package for plotting simple pictures without histograms. (Auth.)

  15. Solar Radiation Pressure Binning for the Geosynchronous Orbit

    Science.gov (United States)

    Hejduk, M. D.; Ghrist, R. W.

    2011-01-01

    Orbital maintenance parameters for individual satellites or groups of satellites have traditionally been set by examining orbital parameters alone, such as through apogee and perigee height binning; this approach ignored the other factors that governed an individual satellite's susceptibility to non-conservative forces. In the atmospheric drag regime, this problem has been addressed by the introduction of the "energy dissipation rate," a quantity that represents the amount of energy being removed from the orbit; such an approach is able to consider both atmospheric density and satellite frontal area characteristics and thus serve as a mechanism for binning satellites of similar behavior. The geo-synchronous orbit (of broader definition than the geostationary orbit -- here taken to be from 1300 to 1800 minutes in orbital period) is not affected by drag; rather, its principal non-conservative force is that of solar radiation pressure -- the momentum imparted to the satellite by solar radiometric energy. While this perturbation is solved for as part of the orbit determination update, no binning or division scheme, analogous to the drag regime, has been developed for the geo-synchronous orbit. The present analysis has begun such an effort by examining the behavior of geosynchronous rocket bodies and non-stabilized payloads as a function of solar radiation pressure susceptibility. A preliminary examination of binning techniques used in the drag regime gives initial guidance regarding the criteria for useful bin divisions. Applying these criteria to the object type, solar radiation pressure, and resultant state vector accuracy for the analyzed dataset, a single division of "large" satellites into two bins for the purposes of setting related sensor tasking and orbit determination (OD) controls is suggested. When an accompanying analysis of high area-to-mass objects is complete, a full set of binning recommendations for the geosynchronous orbit will be available.

  16. Extracting rate coefficients from single-molecule photon trajectories and FRET efficiency histograms for a fast-folding protein.

    Science.gov (United States)

    Chung, Hoi Sung; Gopich, Irina V; McHale, Kevin; Cellmer, Troy; Louis, John M; Eaton, William A

    2011-04-28

    Recently developed statistical methods by Gopich and Szabo were used to extract folding and unfolding rate coefficients from single-molecule Förster resonance energy transfer (FRET) data for proteins with kinetics too fast to measure waiting time distributions. Two types of experiments and two different analyses were performed. In one experiment, bursts of photons were collected from donor and acceptor fluorophores attached to a 73-residue protein, α(3)D, freely diffusing through the illuminated volume of a confocal microscope system. In the second, the protein was immobilized by linkage to a surface, and photons were collected until one of the fluorophores bleached. Folding and unfolding rate coefficients and mean FRET efficiencies for the folded and unfolded subpopulations were obtained from a photon-by-photon analysis of the trajectories using a maximum likelihood method. The ability of the method to describe the data in terms of a two-state model was checked by recoloring the photon trajectories with the extracted parameters and comparing the calculated FRET efficiency histograms with the measured histograms. The sum of the rate coefficients for the two-state model agreed to within 30% with the relaxation rate obtained from the decay of the donor-acceptor cross-correlation function, confirming the high accuracy of the method. Interestingly, apparently reliable rate coefficients could be extracted using the maximum likelihood method even at low count rates. Rate coefficients and mean FRET efficiencies were also obtained in an approximate procedure by simply fitting the FRET efficiency histograms, calculated by binning the donor and acceptor photons, with a sum of three Gaussian functions. The kinetics are exposed in these histograms by the growth of a FRET efficiency peak at values intermediate between the folded and unfolded peaks as the bin size increases, a phenomenon with similarities to NMR exchange broadening. When comparable populations of folded and unfolded

  17. Transport Infrastructure Slot Allocation

    NARCIS (Netherlands)

    Koolstra, K.

    2005-01-01

    In this thesis, transport infrastructure slot allocation has been studied, focusing on selection slot allocation, i.e. on longer-term slot allocation decisions determining the traffic patterns served by infrastructure bottlenecks, rather than timetable-related slot allocation problems. The

  18. Face recognition algorithm using extended vector quantization histogram features.

    Science.gov (United States)

    Yan, Yan; Lee, Feifei; Wu, Xueqian; Chen, Qiu

    2018-01-01

    In this paper, we propose a face recognition algorithm based on a combination of vector quantization (VQ) and Markov stationary features (MSF). The VQ algorithm has been shown to be an effective method for generating features; it extracts a codevector histogram as a facial feature representation for face recognition. Still, the VQ histogram features are unable to convey spatial structural information, which to some extent limits their usefulness in discrimination. To alleviate this limitation of VQ histograms, we utilize Markov stationary features (MSF) to extend the VQ histogram-based features so as to add spatial structural information. We demonstrate the effectiveness of our proposed algorithm by achieving recognition results superior to those of several state-of-the-art methods on publicly available face databases.

  19. A Modified Image Comparison Algorithm Using Histogram Features

    OpenAIRE

    Al-Oraiqat, Anas M.; Kostyukova, Natalya S.

    2018-01-01

    This article discusses the problem of color image content comparison. In particular, methods of image content comparison are analyzed, the restrictions of color histograms are described, and a modified method of image content comparison is proposed. This method uses color histograms and considers color locations. Testing and analysis of the base and modified algorithms are performed. The modified method shows 97% average precision for a collection containing about 700 images without loss of the adv...

  20. Multi-dimensional Bin Packing Problems with Guillotine Constraints

    DEFF Research Database (Denmark)

    Amossen, Rasmus Resen; Pisinger, David

    2010-01-01

    The problem addressed in this paper is the decision problem of determining if a set of multi-dimensional rectangular boxes can be orthogonally packed into a rectangular bin while satisfying the requirement that the packing should be guillotine cuttable. That is, there should exist a series of face-parallel straight cuts that can recursively cut the bin into pieces so that each piece contains a box and no box has been intersected by a cut. The unrestricted problem is known to be NP-hard. In this paper we present a generalization of a constructive algorithm for the multi-dimensional bin packing problem, with and without the guillotine constraint, based on constraint programming.

  1. Bin-packing problems with load balancing and stability constraints

    DEFF Research Database (Denmark)

    Trivella, Alessio; Pisinger, David

    Bin-packing problems (BPP) appear in a wide range of disciplines, including transportation and logistics, computer science, engineering, economics and manufacturing. The problem is well known to be NP-hard and difficult to solve in practice, especially when dealing with the multi-dimensional cases. Closely connected to the BPP is the container loading problem (CLP). This work considers realistic constraints related to e.g. load balancing, cargo stability and weight limits in the multi-dimensional BPP. The BPP poses additional challenges compared to the CLP due to the supplementary objective of minimizing the number of bins. In particular, in section 2 we discuss how to integrate bin-packing and load balancing of items. The problem has only been considered in the literature in simplified versions, e.g. balancing a single bin or introducing a feasible region for the barycenter. In section 3 we generalize the problem to handle cargo stability and weight constraints.

  2. Grasp Densities for Grasp Refinement in Industrial Bin Picking

    DEFF Research Database (Denmark)

    Hupfauf, Benedikt; Hahn, Heiko; Bodenhagen, Leon

    This work addresses the use of data generated in industrial bin-picking for grasp learning. This aim is achieved by using the novel concept of grasp densities (Detry et al., 2010). Grasp densities can describe the full variety of grasps that apply to specific objects using specific grippers. They represent the likelihood of grasp success in terms of object-relative gripper pose, can be learned from empirical experience, and allow the automatic choice of optimal grasps in a given scene context (object pose, workspace constraints, etc.). We will show grasp densities extracted from empirical data in a real industrial bin-picking context.

  3. Automatic Grasp Generation and Improvement for Industrial Bin-Picking

    DEFF Research Database (Denmark)

    Kraft, Dirk; Ellekilde, Lars-Peter; Rytz, Jimmy Alison

    2014-01-01

    This paper presents work on automatic grasp generation and grasp learning for reducing the manual setup time and increasing grasp success rates within bin-picking applications. We propose an approach that is able to generate good grasps automatically using a dynamic grasp simulator and a newly developed ... We show that the automatically generated grasps achieve comparable results and that our learning approach can improve system performance significantly. Automatic bin-picking is an important industrial process that can lead to significant savings and potentially keep production in countries with high labour costs rather than outsourcing it.

  4. Tensor network states in time-bin quantum optics

    Science.gov (United States)

    Lubasch, Michael; Valido, Antonio A.; Renema, Jelmer J.; Kolthammer, W. Steven; Jaksch, Dieter; Kim, M. S.; Walmsley, Ian; García-Patrón, Raúl

    2018-06-01

    The current shift in the quantum optics community towards experiments with many modes and photons necessitates new classical simulation techniques that efficiently encode many-body quantum correlations and go beyond the usual phase-space formulation. To address this pressing demand we formulate linear quantum optics in the language of tensor network states. We extensively analyze the quantum and classical correlations of time-bin interference in a single fiber loop. We then generalize our results to more complex time-bin quantum setups and identify different classes of architectures for high-complexity and low-overhead boson sampling experiments.

  5. Vision guided robot bin picking of cylindrical objects

    DEFF Research Database (Denmark)

    Christensen, Georg Kronborg; Dyhr-Nielsen, Carsten

    1997-01-01

    In order to achieve increased flexibility on robotic production lines, an investigation of the robot bin-picking problem is presented. In the paper, the limitations of previous attempts to solve the problem are pointed out and a set of innovative methods is presented. The main elements...

  6. Rumsfeld: Osama bin Ladenil polnud sidemeid Saddamiga / Heiki Suurkask

    Index Scriptorium Estoniae

    Suurkask, Heiki, 1972-

    2004-01-01

    The claim that there was close cooperation between Saddam Hussein and bin Laden has proven false. According to the author, Richard Cheney, Donald Rumsfeld and the current US Deputy Secretary of Defense Paul Wolfowitz were already investigating options for a new war against Iraq in the mid-1990s. Addendum: Bremer accuses the White House.

  7. Binning sequences using very sparse labels within a metagenome

    Directory of Open Access Journals (Sweden)

    Halgamuge Saman K

    2008-04-01

    Full Text Available Abstract Background In metagenomic studies, a process called binning is necessary to assign contigs that belong to multiple species to their respective phylogenetic groups. Most of the current methods of binning, such as BLAST, k-mer and PhyloPythia, involve assigning sequence fragments by comparing sequence similarity or sequence composition with already-sequenced genomes that are still far from comprehensive. We propose a semi-supervised seeding method for binning that does not depend on knowledge of completed genomes. Instead, it extracts the flanking sequences of highly conserved 16S rRNA from the metagenome and uses them as seeds (labels) to assign other reads based on their compositional similarity. Results The proposed seeding method is implemented on an unsupervised Growing Self-Organising Map (GSOM), and called Seeded GSOM (S-GSOM). We compared it with four well-known semi-supervised learning methods in a preliminary test, separating random-length prokaryotic sequence fragments sampled from the NCBI genome database. We identified the flanking sequences of the highly conserved 16S rRNA as suitable seeds that could be used to group the sequence fragments according to their species. S-GSOM showed superior performance compared to the semi-supervised methods tested. Additionally, S-GSOM may also be used to visually identify some species that do not have seeds. The proposed method was then applied to simulated metagenomic datasets using two different confidence threshold settings and compared with PhyloPythia, k-mer and BLAST. At the reference taxonomic level Order, S-GSOM outperformed all k-mer and BLAST results and showed comparable results with PhyloPythia for each of the corresponding confidence settings, where S-GSOM performed better than PhyloPythia in the ≥ 10 reads datasets and comparable in the ≥ 8 kb benchmark tests. Conclusion In the task of binning using semi-supervised learning methods, results indicate S-GSOM to be the best of

  8. Steganalytic methods for the detection of histogram shifting data-hiding schemes

    OpenAIRE

    Lerch Hostalot, Daniel

    2011-01-01

    In this paper, some steganalytic techniques designed to detect the existence of hidden messages using histogram shifting methods are presented. Firstly, some techniques to identify specific methods of histogram shifting, based on visible marks on the histogram or abnormal statistical distributions are suggested. Then, we present a general technique capable of detecting all histogram shifting techniques analyzed. This technique is based on the effect of histogram shifting methods on the "volat...

  9. Proposed first-generation WSQ bit allocation procedure

    Energy Technology Data Exchange (ETDEWEB)

    Bradley, J.N.; Brislawn, C.M.

    1993-09-08

    The Wavelet/Scalar Quantization (WSQ) gray-scale fingerprint image compression algorithm involves a symmetric wavelet transform (SWT) image decomposition followed by uniform scalar quantization of each subband. The algorithm is adaptive insofar as the bin widths for the scalar quantizers are image-specific and are included in the compressed image format. Since the decoder requires only the actual bin width values -- but not the method by which they were computed -- the standard allows for future refinements of the WSQ algorithm by improving the method used to select the scalar quantizer bin widths. This report proposes a bit allocation procedure for use with the first-generation WSQ encoder. In previous work a specific formula is provided for the relative sizes of the scalar quantizer bin widths in terms of the variances of the SWT subbands. An explicit specification for the constant of proportionality, q, that determines the absolute bin widths was not given. The actual compression ratio produced by the WSQ algorithm will generally vary from image to image depending on the amount of coding gain obtained by the run-length and Huffman coding stages of the algorithm, but testing performed by the FBI established that WSQ compression produces archival-quality images at compression ratios of around 20 to 1. The bit allocation procedure described in this report possesses a control parameter, r, that can be set by the user to achieve a predetermined amount of lossy compression, effectively giving the user control over the amount of distortion introduced by quantization noise. The variability observed in final compression ratios is thus due only to differences in lossless coding gain from image to image, chiefly a result of the varying amounts of blank background surrounding the print area in the images. Experimental results are presented that demonstrate the proposed method's effectiveness.

  10. Discrimination of paediatric brain tumours using apparent diffusion coefficient histograms

    International Nuclear Information System (INIS)

    Bull, Jonathan G.; Clark, Christopher A.; Saunders, Dawn E.

    2012-01-01

    To determine if histograms of apparent diffusion coefficients (ADC) can be used to differentiate paediatric brain tumours. Imaging of histologically confirmed tumours with pre-operative ADC maps was reviewed (54 cases, 32 male, mean age 6.1 years; range 0.1-15.8 years) comprising 6 groups. Whole-tumour ADC histograms were calculated and normalised for volume. Stepwise logistic regression analysis was used to differentiate tumour types using histogram metrics, initially for all groups and then for specific subsets. All 6 groups (5 dysembryoplastic neuroectodermal tumours, 22 primitive neuroectodermal tumours (PNET), 5 ependymomas, 7 choroid plexus papillomas, 4 atypical teratoid rhabdoid tumours (ATRT) and 9 juvenile pilocytic astrocytomas (JPA)) were compared. 74% (40/54) were correctly classified using logistic regression of ADC histogram parameters. In the analysis of posterior fossa tumours, 80% of ependymomas, 100% of astrocytomas and 94% of PNET-medulloblastomas were classified correctly. All PNETs were discriminated from ATRTs (22 PNET and 4 supratentorial ATRTs) (100%). ADC histograms are useful in differentiating paediatric brain tumours, in particular the common posterior fossa tumours of childhood. PNETs were differentiated from supratentorial ATRTs in all cases, which has important implications in terms of clinical management. (orig.)

  11. Defect detection based on extreme edge of defective region histogram

    Directory of Open Access Journals (Sweden)

    Zouhir Wakaf

    2018-01-01

    Full Text Available Automatic thresholding has been used by many applications in image processing and pattern recognition systems. Specific attention was given during inspection for quality control purposes in various industries like steel processing and textile manufacturing. Automatic thresholding problem has been addressed well by the commonly used Otsu method, which provides suitable results for thresholding images based on a histogram of bimodal distribution. However, the Otsu method fails when the histogram is unimodal or close to unimodal. Defects have different shapes and sizes, ranging from very small to large. The gray-level distributions of the image histogram can vary between unimodal and multimodal. Furthermore, Otsu-revised methods, like the valley-emphasis method and the background histogram mode extents, which overcome the drawbacks of the Otsu method, require preprocessing steps and fail to use the general threshold for multimodal defects. This study proposes a new automatic thresholding algorithm based on the acquisition of the defective region histogram and the selection of its extreme edge as the threshold value to segment all defective objects in the foreground from the image background. To evaluate the proposed defect-detection method, common standard images for experimentation were used. Experimental results of the proposed method show that the proposed method outperforms the current methods in terms of defect detection.
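
    For context, a short Python sketch: classical Otsu thresholding, plus a hedged reading of the proposed "extreme edge of the defective-region histogram" rule in which the defective part of the histogram is isolated against a defect-free reference image (the reference image is an assumption of this sketch, not something the abstract specifies):

      import numpy as np

      def otsu_threshold(img):
          # Classical Otsu: pick the threshold maximizing between-class variance.
          hist = np.bincount(img.ravel(), minlength=256).astype(np.float64)
          p = hist / hist.sum()
          omega = np.cumsum(p)                    # class-0 probability
          mu = np.cumsum(p * np.arange(256))      # class-0 first moment
          with np.errstate(divide="ignore", invalid="ignore"):
              sigma_b = (mu[-1] * omega - mu) ** 2 / (omega * (1.0 - omega))
          return int(np.nanargmax(sigma_b))

      def extreme_edge_threshold(test_img, reference_img):
          # Isolate the histogram mass in excess of a defect-free reference and
          # take the far (extreme) edge of that excess as the threshold.
          h_test = np.bincount(test_img.ravel(), minlength=256)
          h_ref = np.bincount(reference_img.ravel(), minlength=256)
          excess = np.maximum(h_test - h_ref, 0)
          nz = np.flatnonzero(excess)
          return int(nz[-1]) if nz.size else otsu_threshold(test_img)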

  12. Finding significantly connected voxels based on histograms of connection strengths

    DEFF Research Database (Denmark)

    Kasenburg, Niklas; Pedersen, Morten Vester; Darkner, Sune

    2016-01-01

    We explore a new approach for structural connectivity based segmentations of subcortical brain regions. Connectivity based segmentations are usually based on fibre connections from a seed region to predefined target regions. We present a method for finding significantly connected voxels based on the distribution of connection strengths. Paths from seed voxels to all voxels in a target region are obtained from a shortest-path tractography. For each seed voxel we approximate the distribution with a histogram of path scores. We hypothesise that the majority of estimated connections are false-positives and that their connection strength is distributed differently from true-positive connections. Therefore, an empirical null-distribution is defined for each target region as the average normalized histogram over all voxels in the seed region. Single histograms are then tested against the corresponding null

  13. 3D Model Retrieval Based on Vector Quantisation Index Histograms

    International Nuclear Information System (INIS)

    Lu, Z M; Luo, H; Pan, J S

    2006-01-01

    This paper proposes a novel technique to retrieve 3D mesh models using vector quantisation index histograms. Firstly, points are sampled uniformly on the mesh surface. Secondly, for each point, five features representing global and local properties are extracted, yielding a feature vector per point. Thirdly, we select several models from each class and employ their feature vectors as a training set. After training with the LBG algorithm, a public codebook is constructed. Next, the codeword index histograms of the query model and of the models in the database are computed. The last step is to compute the distance between the histogram of the query and those of the models in the database. Experimental results show the effectiveness of our method.
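
    A compact Python sketch of this retrieval pipeline under stated assumptions: scikit-learn's k-means stands in for LBG codebook training, and random vectors stand in for the five per-point shape features:

      import numpy as np
      from sklearn.cluster import KMeans

      def index_histogram(features, codebook):
          # Quantise each feature vector to its nearest codeword and count how
          # often each codeword index occurs (the VQ index histogram).
          d = ((features[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
          h = np.bincount(d.argmin(axis=1), minlength=len(codebook))
          return h / h.sum()

      rng = np.random.default_rng(1)
      train = rng.normal(size=(2000, 5))   # stand-in for per-point features
      codebook = KMeans(n_clusters=64, n_init=4,
                        random_state=0).fit(train).cluster_centers_
      query = rng.normal(size=(500, 5))
      model = rng.normal(size=(500, 5))
      dist = np.abs(index_histogram(query, codebook)
                    - index_histogram(model, codebook)).sum()
      print("L1 distance between index histograms:", dist)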

  14. PENGARUH HISTOGRAM EQUALIZATION UNTUK PERBAIKAN KUALITAS CITRA DIGITAL

    Directory of Open Access Journals (Sweden)

    Sisilia Daeng Bakka Mau

    2016-04-01

    Full Text Available This study discusses the use of the histogram equalization method for image quality improvement. Image enhancement is one of the initial steps in improving image quality. Such improvement is needed because the images under consideration often have poor quality, for example they suffer from noise or blur, are too dark or too bright, lack sharpness, and so on. Image quality improvement is the process of clarifying and sharpening certain features of an image so that the image is easier to perceive and to analyse more precisely. The results of this study prove that the histogram equalization method can be used to increase image contrast and improve image quality, so that the information contained in the image is more clearly visible. Keywords: image quality improvement, histogram equalization, digital image

  15. VHDL implementation on histogram with ADC CAMAC module

    International Nuclear Information System (INIS)

    Ruby Santhi, R.; Satyanarayana, V.V.V.; Ajith Kumar, B.P.

    2007-01-01

    In modern nuclear spectroscopy systems, data acquisition and analysis in experimental science have been undergoing major changes because of faster speeds and higher resolutions. The CAMAC module described here, an FPGA-based 8K x 24 bit histogram memory integrated with an ADC on a single board, has been designed and fabricated. This module accepts input from a spectroscopy amplifier for pulse height analysis and provides single spectra for a few selected parameters. These online histograms are used to monitor the progress of experiments while they are running.

  16. Integral Histogram with Random Projection for Pedestrian Detection.

    Directory of Open Access Journals (Sweden)

    Chang-Hua Liu

    Full Text Available In this paper, we present a systematic study reporting several deep insights into the HOG, one of the most widely used features in modern computer vision and image processing applications. We first show that its gradient magnitudes can be randomly projected with a random matrix. To handle over-fitting, an integral histogram based on the differences of randomly selected blocks is proposed. The experiments show that both the random projection and the integral histogram clearly outperform the HOG feature. Finally, the two ideas are combined into a new descriptor termed IHRP, which outperforms the HOG feature with fewer dimensions and higher speed.
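
    To illustrate the integral-histogram machinery the abstract builds on (a generic sketch, not the paper's IHRP descriptor), the following Python code precomputes a cumulative table so that the histogram of any axis-aligned rectangle costs four lookups per bin:

      import numpy as np

      def integral_histogram(img, n_bins=16):
          # ih[y, x, b] = number of pixels in bin b within img[:y+1, :x+1].
          bins = (img.astype(np.int32) * n_bins) // 256
          onehot = np.eye(n_bins, dtype=np.int64)[bins]      # H x W x B
          return onehot.cumsum(axis=0).cumsum(axis=1)

      def rect_histogram(ih, top, left, bottom, right):
          # Histogram of img[top:bottom, left:right] by inclusion-exclusion.
          h = ih[bottom - 1, right - 1].copy()
          if top > 0:
              h -= ih[top - 1, right - 1]
          if left > 0:
              h -= ih[bottom - 1, left - 1]
          if top > 0 and left > 0:
              h += ih[top - 1, left - 1]
          return h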

  17. Demonstration of analyzers for multimode photonic time-bin qubits

    Science.gov (United States)

    Jin, Jeongwan; Agne, Sascha; Bourgoin, Jean-Philippe; Zhang, Yanbao; Lütkenhaus, Norbert; Jennewein, Thomas

    2018-04-01

    We demonstrate two approaches for unbalanced interferometers as time-bin qubit analyzers for quantum communication, robust against mode distortions and polarization effects as expected from free-space quantum communication systems, including wavefront deformations, path fluctuations, pointing errors, and optical elements. Despite strong spatial and temporal distortions of the optical mode of a time-bin qubit, entangled with a separate polarization qubit, we verify entanglement using the Negative Partial Transpose, with a measured visibility of up to 0.85 ± 0.01. The robustness of the analyzers is further demonstrated for various angles of incidence up to 0.2°. The output of the interferometers is coupled into multimode fiber, yielding a high system throughput of 0.74. Therefore, these analyzers are suitable and efficient for quantum communication over multimode optical channels.

  18. Benchmarking motion planning algorithms for bin-picking applications

    DEFF Research Database (Denmark)

    Iversen, Thomas Fridolin; Ellekilde, Lars-Peter

    2017-01-01

    Purpose: For robot motion planning there exists a large number of different algorithms, each appropriate for a certain domain, and the right choice of planner depends on the specific use case. The purpose of this paper is to consider the application of bin picking and benchmark a set of motion planning algorithms to identify which are most suited in the given context. Design/methodology/approach: The paper presents a selection of motion planning algorithms and defines benchmarks based on three different bin-picking scenarios. The evaluation is done based on a fixed set of tasks, which are planned and executed on a real and a simulated robot. Findings: The benchmarking shows a clear difference between the planners and generally indicates that algorithms integrating optimization, despite longer planning time, perform better due to a faster execution. Originality/value: The originality of this work lies...

  19. Osama bin Ladeni tapmine toonuks tuumapõrgu / Kaivo Kopli

    Index Scriptorium Estoniae

    Kopli, Kaivo

    2011-01-01

    Notes from the interrogations of Guantánamo prisoners have been made public through Wikileaks. According to one al-Qaida commander, the terrorists have a nuclear bomb hidden somewhere in Europe that would have been detonated if Osama bin Laden had been captured or killed. It emerges that about 150 of the Guantanamo detainees were completely innocent.

  20. Bin mode estimation methods for Compton camera imaging

    International Nuclear Information System (INIS)

    Ikeda, S.; Odaka, H.; Uemura, M.; Takahashi, T.; Watanabe, S.; Takeda, S.

    2014-01-01

    We study the image reconstruction problem of a Compton camera which consists of semiconductor detectors. The image reconstruction is formulated as a statistical estimation problem. We employ a bin-mode estimation (BME) and extend an existing framework to a Compton camera with multiple scatterers and absorbers. Two estimation algorithms are proposed: an accelerated EM algorithm for the maximum likelihood estimation (MLE) and a modified EM algorithm for the maximum a posteriori (MAP) estimation. Numerical simulations demonstrate the potential of the proposed methods

  1. Osama bin Laden võib olla uues piiramisrõngas / Heiki Suurkask

    Index Scriptorium Estoniae

    Suurkask, Heiki, 1972-

    2004-01-01

    According to the British press, bin Laden's location has been pinpointed. On the Pakistani Mir Aimal Kasi, who killed two US intelligence agents in front of CIA headquarters in 1993, and on the hunt for bin Laden. See also: The Taliban gathers new strength.

  2. Integrated technologies for solid waste bin monitoring system.

    Science.gov (United States)

    Arebey, Maher; Hannan, M A; Basri, Hassan; Begum, R A; Abdullah, Huda

    2011-06-01

    Communication technologies such as radio frequency identification (RFID), the global positioning system (GPS), the general packet radio system (GPRS), and a geographic information system (GIS) are integrated with a camera to construct a solid waste monitoring system. The aim is to improve the way of responding to customers' inquiries and emergency cases and to estimate the solid waste amount without any involvement of the truck driver. The proposed system consists of an RFID tag mounted on the bin, an RFID reader in the truck, GPRS/GSM as the link to a web server, and GIS as map server, database server, and control server. The tracking devices mounted in the trucks collect location information in real time via GPS. This information is transferred continuously through GPRS to a central database. The users are able to view the current location of each truck in the collection stage via a web-based application and thereby manage the fleet. The truck positions and trash bin information are displayed on a digital map, which is made available by a map server. Thus, the solid waste of the bin and the truck are monitored using the developed system.

  3. Comments on 'Reconsidering the definition of a dose-volume histogram'-dose-mass histogram (DMH) versus dose-volume histogram (DVH) for predicting radiation-induced pneumonitis

    International Nuclear Information System (INIS)

    Mavroidis, Panayiotis; Plataniotis, Georgios A; Gorka, Magdalena Adamus; Lind, Bengt K

    2006-01-01

    In a recently published paper (Nioutsikou et al 2005 Phys. Med. Biol. 50 L17) the authors showed that the dose-mass histogram (DMH) concept is a more accurate descriptor of the dose delivered to lung than the traditionally used dose-volume histogram (DVH) concept. Furthermore, they state that if a functional imaging modality could also be registered to the anatomical imaging modality, providing a functional weighting across the organ (functional mass), then the more general and realistic concept of the dose-functioning mass histogram (D[F]MH) could be an even more appropriate descriptor. The comments of the present letter to the editor are in line with the basic arguments of that work, since their general conclusions appear to be supported by the comparison of the DMH and DVH concepts using radiobiological measures. In this study, it is examined whether the dose-mass histogram (DMH) concept deviates significantly from the widely used dose-volume histogram (DVH) concept regarding the expected lung complications, and whether there are clinical indications supporting these results. The problem was investigated theoretically by applying two hypothetical dose distributions (Gaussian and semi-Gaussian shaped) to two lungs of uniform and varying densities. The influence of the deviation between DVHs and DMHs on the treatment outcome was estimated using the relative seriality and LKB models with the Gagliardi et al (2000 Int. J. Radiat. Oncol. Biol. Phys. 46 373) and Seppenwoolde et al (2003 Int. J. Radiat. Oncol. Biol. Phys. 55 724) parameter sets for radiation pneumonitis, respectively. Furthermore, the biological equivalent of their difference was estimated by the biologically effective uniform dose (D-bar) and equivalent uniform dose (EUD) concepts, respectively. It is shown that the relation between the DVHs and DMHs varies depending on the underlying cell density distribution and the applied dose distribution. However, the range of their deviation in terms of

  4. Fuzzy Logic-Based Histogram Equalization for Image Contrast Enhancement

    Directory of Open Access Journals (Sweden)

    V. Magudeeswaran

    2013-01-01

    Full Text Available Fuzzy logic-based histogram equalization (FHE) is proposed for image contrast enhancement. The FHE consists of two stages. First, a fuzzy histogram is computed based on fuzzy set theory to handle the inexactness of gray level values in a better way compared to classical crisp histograms. In the second stage, the fuzzy histogram is divided into two subhistograms based on the median value of the original image, and these are then equalized independently to preserve image brightness. The qualitative and quantitative performance of the proposed FHE algorithm is evaluated using two well-known metrics, average information content (AIC) and the natural image quality evaluator (NIQE) index, for various images. From the qualitative and quantitative measures, it is interesting to see that this proposed method provides optimum results, giving better contrast enhancement while preserving the local information of the original image. Experimental results show that the proposed method can effectively and significantly eliminate the washed-out appearance and adverse artifacts induced by several existing methods. The proposed method has been tested using several images and gives better visual quality as compared to the conventional methods.
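
    A simplified Python sketch of the brightness-preserving second stage (median-split equalization, here with a crisp rather than fuzzy histogram; the fuzzy first stage is not reproduced):

      import numpy as np

      def bi_histogram_equalize(img):
          # Split at the median gray level and equalize the two sub-histograms
          # independently, which tends to preserve the input's mean brightness.
          m = int(np.median(img))
          out = img.copy()
          for lo, hi in ((0, m), (m + 1, 255)):
              mask = (img >= lo) & (img <= hi)
              if not mask.any() or hi <= lo:
                  continue
              hist = np.bincount(img[mask] - lo, minlength=hi - lo + 1)
              cdf = hist.cumsum().astype(np.float64)
              lut = lo + np.round(cdf / cdf[-1] * (hi - lo)).astype(np.int64)
              out[mask] = lut[img[mask] - lo].astype(img.dtype)
          return out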

  5. Hybrid Histogram Descriptor: A Fusion Feature Representation for Image Retrieval.

    Science.gov (United States)

    Feng, Qinghe; Hao, Qiaohong; Chen, Yuqi; Yi, Yugen; Wei, Ying; Dai, Jiangyan

    2018-06-15

    Currently, visual sensors are becoming increasingly affordable and fashionable, accelerating the growth of image data. Image retrieval has attracted increasing interest due to space exploration, industrial, and biomedical applications. Nevertheless, designing an effective feature representation is acknowledged as a hard yet fundamental issue. This paper presents a fusion feature representation called a hybrid histogram descriptor (HHD) for image retrieval. The proposed descriptor jointly comprises two histograms: a perceptually uniform histogram, which is extracted by exploiting the color and edge orientation information in perceptually uniform regions; and a motif co-occurrence histogram, which is acquired by calculating the probability of a pair of motif patterns. To evaluate the performance, we benchmarked the proposed descriptor on the RSSCN7, AID, Outex-00013, Outex-00014 and ETHZ-53 datasets. Experimental results suggest that the proposed descriptor is more effective and robust than ten recent fusion-based descriptors under the content-based image retrieval framework. The computational complexity was also analyzed to give an in-depth evaluation. Furthermore, compared with state-of-the-art convolutional neural network (CNN)-based descriptors, the proposed descriptor also achieves comparable performance, but does not require any training process.

  6. Histogram equalization with Bayesian estimation for noise robust speech recognition.

    Science.gov (United States)

    Suh, Youngjoo; Kim, Hoirin

    2018-02-01

    The histogram equalization approach is an efficient feature normalization technique for noise robust automatic speech recognition. However, it suffers from performance degradation when some fundamental conditions are not satisfied in the test environment. To remedy these limitations of the original histogram equalization methods, class-based histogram equalization approach has been proposed. Although this approach showed substantial performance improvement under noise environments, it still suffers from performance degradation due to the overfitting problem when test data are insufficient. To address this issue, the proposed histogram equalization technique employs the Bayesian estimation method in the test cumulative distribution function estimation. It was reported in a previous study conducted on the Aurora-4 task that the proposed approach provided substantial performance gains in speech recognition systems based on the acoustic modeling of the Gaussian mixture model-hidden Markov model. In this work, the proposed approach was examined in speech recognition systems with deep neural network-hidden Markov model (DNN-HMM), the current mainstream speech recognition approach where it also showed meaningful performance improvement over the conventional maximum likelihood estimation-based method. The fusion of the proposed features with the mel-frequency cepstral coefficients provided additional performance gains in DNN-HMM systems, which otherwise suffer from performance degradation in the clean test condition.
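
    A minimal Python sketch of the underlying equalization step for one feature dimension, using the plain order-statistics estimate of the test CDF that the proposed Bayesian estimation refines (the Bayesian smoothing itself is not reproduced):

      import numpy as np
      from scipy.stats import norm

      def histogram_equalize_feature(x, ref_mean=0.0, ref_std=1.0):
          # Map each value through the empirical CDF of the utterance and then
          # through the inverse CDF of a reference Gaussian distribution.
          ranks = np.argsort(np.argsort(x))
          u = (ranks + 0.5) / len(x)          # empirical CDF values in (0, 1)
          return norm.ppf(u) * ref_std + ref_mean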

  7. Improved LSB matching steganography with histogram characters reserved

    Science.gov (United States)

    Chen, Zhihong; Liu, Wenyao

    2008-03-01

    This letter builds on research into the LSB (least significant bit, i.e. the last bit of a binary pixel value) matching steganographic method and the steganalytic methods that target the histograms of cover images, and proposes a modification to LSB matching. In LSB matching, if the LSB of the next cover pixel matches the next bit of secret data, nothing is done; otherwise, one is added to or subtracted from the cover pixel value at random. In our improved method, a steganographic information table is defined that records the changes introduced by the embedded secret bits. Using the table, for the next pixel with the same value, the choice between adding and subtracting one is made dynamically so that the change to the cover image's histogram is minimized. The modified method therefore allows embedding the same payload as LSB matching but with improved steganographic security and less vulnerability to attacks compared with LSB matching. The experimental results of the new method show that the histograms maintain their attributes, such as peak values and alternating trends, to an acceptable degree, and that the method performs better than LSB matching in terms of histogram distortion and resistance against existing steganalysis.
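
    For reference, a Python sketch of the baseline LSB matching embedder that the letter modifies (the histogram-preserving information table of the improved method is not reproduced here):

      import numpy as np

      def lsb_match_embed(cover, bits, seed=0):
          # Plain LSB matching: if a pixel's LSB already equals the secret bit,
          # leave it; otherwise add or subtract 1 at random (len(bits) must not
          # exceed cover.size).
          rng = np.random.default_rng(seed)
          stego = cover.astype(np.int16).ravel()
          for i, b in enumerate(bits):
              if (stego[i] & 1) != b:
                  step = rng.choice((-1, 1))
                  if stego[i] == 0:            # stay inside the 8-bit range
                      step = 1
                  elif stego[i] == 255:
                      step = -1
                  stego[i] += step
          return stego.astype(np.uint8).reshape(cover.shape)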

  8. 30 CFR 56.16002 - Bins, hoppers, silos, tanks, and surge piles.

    Science.gov (United States)

    2010-07-01

    30 CFR § 56.16002, Materials Storage and Handling: (a) Bins, hoppers, silos, tanks, and surge piles, where loose unconsolidated materials are stored, handled or...

  9. 30 CFR 57.16002 - Bins, hoppers, silos, tanks, and surge piles.

    Science.gov (United States)

    2010-07-01

    30 CFR § 57.16002, Materials Storage and Handling (Metal and Nonmetal Mines): (a) Bins, hoppers, silos, tanks, and surge piles, where loose unconsolidated materials are stored, handled...

  10. Meeting the EU recycling targets by introducing a 2-compartment bin to households

    DEFF Research Database (Denmark)

    Jensen, Morten Bang; Scheutz, Charlotte; Møller, Jacob

    A Danish municipality has introduced a 2-compartment bin in its waste collection scheme; this bin should increase recycling of dry household recyclables. An extensive waste sorting campaign was conducted and the efficiency of the bin assessed. The waste sorting campaign yielded a full waste ... Although the targets can be fulfilled, there is still room for improvement (increased source separation), especially for hard plastic and metals.

  11. Estimation of pneumonitis risk in three-dimensional treatment planning using dose-volume histogram analysis

    International Nuclear Information System (INIS)

    Oetzel, Dieter; Schraube, Peter; Hensley, Frank; Sroka-Perez, Gabriele; Menke, Markus; Flentje, Michael

    1995-01-01

    Purpose: Investigations to study correlations between the estimations of biophysical models in three dimensional (3D) treatment planning and clinical observations are scarce. The development of clinically symptomatic pneumonitis in the radiotherapy of thoracic malignomas was chosen to test the predictive power of Lyman's normal tissue complication probability (NTCP) model for the assessment of side effects for nonuniform irradiation. Methods and Materials: In a retrospective analysis, individual computed-tomography-based 3D dose distributions of a random sample of 46 (20) patients with lung (esophageal) cancer were reconstructed. All patients received tumor doses between 50 and 60 Gy in a conventional treatment schedule. Biologically isoeffective dose-volume histograms (DVHs) were used for the calculation of complication probabilities after applying Lyman's and Kutcher's DVH-reduction algorithm. Lung dose statistics were performed for single lungs (involved ipsilateral and contralateral) and for the lung as a paired organ. Results: In the lung cancer group, about 20% of the patients (9 out of 46) developed pneumonitis 3-12 (median 7.5) weeks after completion of radiotherapy. For the majority of these lung cancer patients, the involved ipsilateral lung received a much higher dose than the contralateral lung, and the pneumonitis patients had on average a higher lung exposure with a doubling of the predicted complication risk (38% vs. 20%). The lower lung exposure for the esophagus patients resulted in a mean lung dose of 13.2 Gy (lung cancer: 20.5 Gy) averaged over all patients, in correlation with an almost zero complication risk and only one observed case of pneumonitis (1 out of 20). To compare the pneumonitis risk estimations with observed complication rates, the patients were ranked into bins of mean ipsilateral lung dose. Particularly in the bins with the highest patient numbers, a good correlation was achieved. Agreement was not reached for the lung functioning as
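
    A minimal Python sketch of the Lyman NTCP calculation with the Kutcher DVH reduction used in such analyses; the parameter values below are illustrative lung-pneumonitis numbers, not the exact fitted sets from the cited literature:

      import numpy as np
      from scipy.stats import norm

      def lkb_ntcp(bin_doses, bin_volumes, TD50=24.5, m=0.18, n=0.87):
          # Reduce the differential DVH to an effective uniform dose
          # (generalized mean with exponent 1/n), then apply Lyman's probit.
          v = np.asarray(bin_volumes, dtype=float)
          v = v / v.sum()                       # fractional organ volumes
          d = np.asarray(bin_doses, dtype=float)
          d_eff = (v * d ** (1.0 / n)).sum() ** n
          t = (d_eff - TD50) / (m * TD50)
          return norm.cdf(t)

      # Toy differential DVH: 40% of lung at 5 Gy, 35% at 15 Gy, 25% at 40 Gy.
      print(lkb_ntcp([5.0, 15.0, 40.0], [0.40, 0.35, 0.25]))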

  12. 3D facial expression recognition based on histograms of surface differential quantities

    KAUST Repository

    Li, Huibin; Morvan, Jean-Marie; Chen, Liming

    2011-01-01

    To characterize the shape information of the local neighborhood of facial landmarks, we calculate the weighted statistical distributions of surface differential quantities, including the histogram of mesh gradient (HoG) and the histogram of shape index (HoS). Normal cycle

  13. ADC histogram analysis for adrenal tumor histogram analysis of apparent diffusion coefficient in differentiating adrenal adenoma from pheochromocytoma.

    Science.gov (United States)

    Umanodan, Tomokazu; Fukukura, Yoshihiko; Kumagae, Yuichi; Shindo, Toshikazu; Nakajo, Masatoyo; Takumi, Koji; Nakajo, Masanori; Hakamada, Hiroto; Umanodan, Aya; Yoshiura, Takashi

    2017-04-01

    To determine the diagnostic performance of apparent diffusion coefficient (ADC) histogram analysis in diffusion-weighted (DW) magnetic resonance imaging (MRI) for differentiating adrenal adenoma from pheochromocytoma. We retrospectively evaluated 52 adrenal tumors (39 adenomas and 13 pheochromocytomas) in 47 patients (21 men, 26 women; mean age, 59.3 years; range, 16-86 years) who underwent DW 3.0T MRI. Histogram parameters of ADC (b-values of 0 and 200 [ADC200], 0 and 400 [ADC400], and 0 and 800 s/mm2 [ADC800]) - mean, variance, coefficient of variation (CV), kurtosis, skewness, and entropy - were compared between adrenal adenomas and pheochromocytomas using the Mann-Whitney U-test. Receiver operating characteristic (ROC) curves for the histogram parameters were generated to differentiate adrenal adenomas from pheochromocytomas. Sensitivity and specificity were calculated by using a threshold criterion that would maximize the average of sensitivity and specificity. Variance and CV of ADC800 were significantly higher in pheochromocytomas than in adrenal adenomas (P < 0.05). The largest areas under the ROC curve among the histogram parameters for diagnosing adrenal adenomas were 0.82 (ADC200), 0.87 (ADC400), and 0.92 (ADC800), with sensitivity of 84.6% and specificity of 84.6% (cutoff, ≤2.82) with ADC200; sensitivity of 89.7% and specificity of 84.6% (cutoff, ≤2.77) with ADC400; and sensitivity of 94.9% and specificity of 92.3% (cutoff, ≤2.67) with ADC800. ADC histogram analysis of DW MRI can help differentiate adrenal adenoma from pheochromocytoma. J. Magn. Reson. Imaging 2017;45:1195-1203. © 2016 International Society for Magnetic Resonance in Medicine.
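
    A short Python sketch of the whole-lesion histogram parameters compared in such studies, computed from a 1-D array of per-voxel ADC values (the bin count used for the entropy is an illustrative choice):

      import numpy as np
      from scipy.stats import kurtosis, skew

      def adc_histogram_metrics(adc_values, n_bins=64):
          x = np.asarray(adc_values, dtype=float)
          counts, _ = np.histogram(x, bins=n_bins)
          p = counts[counts > 0] / x.size       # occupied-bin probabilities
          return {
              "mean": x.mean(),
              "variance": x.var(ddof=1),
              "cv": x.std(ddof=1) / x.mean(),   # coefficient of variation
              "skewness": skew(x),
              "kurtosis": kurtosis(x),
              "entropy": float(-(p * np.log2(p)).sum()),
          }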

  14. Improving a Spectral Bin Microphysical Scheme Using TRMM Satellite Observations

    Science.gov (United States)

    Li, Xiaowen; Tao, Wei-Kuo; Matsui, Toshihisa; Liu, Chuntao; Masunaga, Hirohiko

    2010-01-01

    Comparisons between cloud model simulations and observations are crucial in validating model performance and improving the physical processes represented in the model. These modeled physical processes are idealized representations and almost always leave large room for improvement. In this study, we use data from two different sensors onboard the TRMM (Tropical Rainfall Measurement Mission) satellite to improve the microphysical scheme in the Goddard Cumulus Ensemble (GCE) model. TRMM-observed mature-stage squall lines from late spring and early summer in the central US over a 9-year period are compiled and compared with a case simulation by the GCE model. A unique aspect of the GCE model is that it has a state-of-the-art spectral bin microphysical scheme, which uses 33 different bins to represent the particle size distribution of each of the seven hydrometeor species. A forward radiative transfer model calculates TRMM Precipitation Radar (PR) reflectivity and TRMM Microwave Imager (TMI) 85 GHz brightness temperatures from simulated particle size distributions. Comparisons between model outputs and observations reveal that the model overestimates the sizes of snow/aggregates in the stratiform region of the squall line. After adjusting temperature-dependent collection coefficients among ice-phase particles, the PR comparisons become good while the TMI comparisons worsen. Further investigations show that the partitioning between graupel (a high-density form of aggregate) and snow (a low-density form of aggregate) needs to be adjusted in order to have good comparisons in both PR reflectivity and TMI brightness temperature. This study shows that long-term satellite observations, especially those with multiple sensors, can be very useful in constraining model microphysics. It is also the first study to validate and improve a sophisticated spectral bin microphysical scheme against long-term satellite observations.

  15. Adaptive Motion Planning in Bin-Picking with Object Uncertainties

    DEFF Research Database (Denmark)

    Iversen, Thomas Fridolin; Ellekilde, Lars-Peter; Miró, Jaime Valls

    2017-01-01

    Doing motion planning for bin-picking with object uncertainties requires either a re-grasp of picked objects or an online sensor system. Using the latter is advantageous in terms of computational time, as no time is wasted doing an extra pick-and-place action. It does, however, put extra requirements on the motion planner, as the target position may change on-the-fly. This paper solves that problem by using a state-adjusting Partially Observable Markov Decision Process, where the state space is modified between runs to better fit earlier solved problems. The approach relies on a set...

  16. Histogram analysis for smartphone-based rapid hematocrit determination

    Science.gov (United States)

    Jalal, Uddin M.; Kim, Sang C.; Shim, Joon S.

    2017-01-01

    A novel and rapid analysis technique using histograms has been proposed for the colorimetric quantification of blood hematocrits. A smartphone-based "Histogram" app for the detection of hematocrits has been developed, integrating the smartphone's embedded camera with a microfluidic chip via a custom-made optical platform. The developed histogram analysis is effective for the automatic detection of the sample channel, including auto-calibration, and can analyze single-channel as well as multi-channel images. Furthermore, the analysis method is advantageous for quantifying blood hematocrit under both uniform and varying optical conditions. The rapid determination of blood hematocrits carries a great deal of information regarding physiological disorders, and the use of such reproducible, cost-effective, and standard techniques may effectively help with the diagnosis and prevention of a number of human diseases. PMID:28717569

  17. Histogram Equalization to Model Adaptation for Robust Speech Recognition

    Directory of Open Access Journals (Sweden)

    Suh Youngjoo

    2010-01-01

    Full Text Available We propose a new model adaptation method based on the histogram equalization technique for providing robustness in noisy environments. The trained acoustic mean models of a speech recognizer are adapted into environmentally matched conditions by using the histogram equalization algorithm on a single utterance basis. For more robust speech recognition in the heavily noisy conditions, trained acoustic covariance models are efficiently adapted by the signal-to-noise ratio-dependent linear interpolation between trained covariance models and utterance-level sample covariance models. Speech recognition experiments on both the digit-based Aurora2 task and the large vocabulary-based task showed that the proposed model adaptation approach provides significant performance improvements compared to the baseline speech recognizer trained on the clean speech data.

  18. Multifractal analysis of three-dimensional histogram from color images

    International Nuclear Information System (INIS)

    Chauveau, Julien; Rousseau, David; Richard, Paul; Chapeau-Blondeau, Francois

    2010-01-01

    Natural images, especially color or multicomponent images, are complex information-carrying signals. To contribute to the characterization of this complexity, we investigate the possibility of multiscale organization in the colorimetric structure of natural images. This is realized by means of a multifractal analysis applied to the three-dimensional histogram of natural color images. The observed behaviors are confronted with those of reference models with known multifractal properties. We use for this purpose synthetic random images with trivial monofractal behavior, and multidimensional multiplicative cascades known for their actual multifractal behavior. The behaviors observed on natural images exhibit similarities with those of the multifractal multiplicative cascades and display the signature of elaborate multiscale organizations stemming from the histograms of natural color images. This type of characterization of colorimetric properties can be helpful in various tasks of digital image processing, for instance modeling, classification, and indexing.
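
    As a hedged sketch of the ingredients (not the authors' pipeline), the following Python code builds the three-dimensional color histogram and a coarse-grained partition function; the scaling of log Z(q) against the log box size across several coarse-graining factors is what a multifractal analysis examines:

      import numpy as np

      def color_histogram_3d(img, bins_per_axis=64):
          # Normalized 3-D (R, G, B) histogram of an 8-bit color image.
          idx = (img.reshape(-1, 3).astype(np.int32) * bins_per_axis) // 256
          flat = (idx[:, 0] * bins_per_axis + idx[:, 1]) * bins_per_axis + idx[:, 2]
          h = np.bincount(flat, minlength=bins_per_axis ** 3).astype(np.float64)
          return (h / h.sum()).reshape((bins_per_axis,) * 3)

      def partition_function(p, q, factor):
          # Sum box masses (boxes 'factor' cells wide) raised to the power q.
          b = p.shape[0] // factor
          c = p[:b * factor, :b * factor, :b * factor]
          boxes = c.reshape(b, factor, b, factor, b, factor).sum(axis=(1, 3, 5))
          return (boxes[boxes > 0] ** q).sum()

      rng = np.random.default_rng(0)
      img = rng.integers(0, 256, size=(128, 128, 3), dtype=np.uint8)
      p = color_histogram_3d(img)
      for s in (2, 4, 8):
          print(s, partition_function(p, q=2.0, factor=s))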

  19. A novel parallel architecture for local histogram equalization

    Science.gov (United States)

    Ohannessian, Mesrob I.; Choueiter, Ghinwa F.; Diab, Hassan

    2005-07-01

    Local histogram equalization is an image enhancement algorithm that has found wide application in the pre-processing stage of areas such as computer vision, pattern recognition and medical imaging. The computationally intensive nature of the procedure, however, is a major limitation for real-time interactive applications. This work explores the possibility of performing parallel local histogram equalization using an array of special-purpose elementary processors, through an HDL implementation that targets FPGA or ASIC platforms. A novel parallelization scheme is presented and the corresponding architecture is derived. The algorithm is reduced to pixel-level operations. Processing elements are assigned image blocks to maintain a reasonable performance-cost ratio. To further simplify both processor and memory organizations, a bit-serial access scheme is used. A brief performance assessment is provided to illustrate and quantify the merit of the approach.

  20. Histogram specification as a method of density modification

    International Nuclear Information System (INIS)

    Harrison, R.W.

    1988-01-01

    A new method for improving the quality and extending the resolution of Fourier maps is described. The method is based on a histogram analysis of the electron density. The distribution of electron density values in the map is forced to be 'ideal'. The 'ideal' distribution is assumed to be Gaussian. The application of the method to improve the electron density map for the protein Acinetobacter asparaginase, which is a tetrameric enzyme of molecular weight 140000 daltons, is described. (orig.)
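
    A generic Python sketch of histogram specification toward a Gaussian target of the kind the abstract describes (the crystallographic details of the map-improvement procedure are not reproduced):

      import numpy as np
      from scipy.stats import norm

      def specify_gaussian(rho, n_bins=1000):
          # Replace each density value by the Gaussian quantile at its empirical
          # CDF position, forcing the value distribution to be Gaussian with the
          # input map's mean and standard deviation.
          flat = rho.ravel()
          hist, edges = np.histogram(flat, bins=n_bins)
          cdf = hist.cumsum() / flat.size
          centers = 0.5 * (edges[:-1] + edges[1:])
          u = np.clip(np.interp(flat, centers, cdf), 1e-6, 1.0 - 1e-6)
          return (norm.ppf(u) * flat.std() + flat.mean()).reshape(rho.shape)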

  1. Breast density pattern characterization by histogram features and texture descriptors

    OpenAIRE

    Carneiro,Pedro Cunha; Franco,Marcelo Lemos Nunes; Thomaz,Ricardo de Lima; Patrocinio,Ana Claudia

    2017-01-01

    Abstract Introduction Breast cancer is the leading cause of death for women in Brazil, as in most countries in the world. Because breast density is related to the risk of breast cancer, breast density classification is part of medical practice; however, it is merely visual and dependent on professional experience, making the task very subjective. The purpose of this paper is to investigate image features based on histograms and Haralick texture descriptors so as to separate mammographic images into categories of breast density using an Artificial Neural Network.

  2. Motor Oil Classification using Color Histograms and Pattern Recognition Techniques.

    Science.gov (United States)

    Ahmadi, Shiva; Mani-Varnosfaderani, Ahmad; Habibi, Biuck

    2018-04-20

    Motor oil classification is important for quality control and the identification of oil adulteration. In this work, we propose a simple, rapid, inexpensive and nondestructive approach based on image analysis and pattern recognition techniques for the classification of nine different types of motor oils according to their corresponding color histograms. For this, we applied color histograms in different color spaces such as red green blue (RGB), grayscale, and hue saturation intensity (HSI) in order to extract features that can help with the classification procedure. These color histograms and their combinations were used as input for model development and then were statistically evaluated by using linear discriminant analysis (LDA), quadratic discriminant analysis (QDA), and support vector machine (SVM) techniques. Here, two common solutions for solving a multiclass classification problem were applied: (1) transformation to a binary classification problem using a one-against-all (OAA) approach and (2) extension from binary classifiers to a single globally optimized multilabel classification model. In the OAA strategy, LDA, QDA, and SVM reached up to 97% in terms of accuracy, sensitivity, and specificity for both the training and test sets. In the extension from the binary case, despite good performances by the SVM classification model, QDA and LDA provided better results up to 92% for RGB-grayscale-HSI color histograms and up to 93% for the HSI color map, respectively. In order to reduce the number of independent variables for modeling, a principal component analysis algorithm was used. Our results suggest that the proposed method is promising for the identification and classification of different types of motor oils.
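
    A minimal sketch of this kind of pipeline, assuming scikit-learn and simplified RGB-plus-grayscale features (the paper additionally uses HSI channels), might look as follows; variable names and the bin count are illustrative.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        def color_histogram_features(image, bins=32):
            # Concatenate normalized per-channel RGB histograms with a
            # grayscale histogram (the paper additionally uses HSI channels).
            feats = []
            for c in range(3):
                h, _ = np.histogram(image[..., c], bins=bins, range=(0, 256))
                feats.append(h / h.sum())
            g, _ = np.histogram(image.mean(axis=2), bins=bins, range=(0, 256))
            feats.append(g / g.sum())
            return np.concatenate(feats)

        # Hypothetical usage: X rows are per-image features, y the oil types.
        # X = np.stack([color_histogram_features(img) for img in images])
        # clf = LinearDiscriminantAnalysis().fit(X, y)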

  3. Retrospective Reconstructions of Active Bone Marrow Dose-Volume Histograms

    International Nuclear Information System (INIS)

    Veres, Cristina; Allodji, Rodrigue S.; Llanas, Damien; Vu Bezin, Jérémi; Chavaudra, Jean; Mège, Jean Pierre; Lefkopoulos, Dimitri; Quiniou, Eric; Deutsh, Eric; Vathaire, Florent de; Diallo, Ibrahima

    2014-01-01

    Purpose: To present a method for calculating dose-volume histograms (DVHs) to the active bone marrow (ABM) of patients who had undergone radiation therapy (RT) and subsequently developed leukemia. Methods and Materials: The study focuses on 15 patients treated between 1961 and 1996. Whole-body RT planning computed tomographic (CT) data were not available. We therefore generated representative whole-body CTs similar to patient anatomy. In addition, we developed a method enabling us to obtain information on the density distribution of ABM all over the skeleton. Dose could then be calculated in a series of points distributed all over the skeleton in such a way that their local density reflected age-specific data for ABM distribution. Dose to particular regions and dose-volume histograms of the entire ABM were estimated for all patients. Results: Depending on patient age, the total number of dose calculation points generated ranged from 1,190,970 to 4,108,524. The average dose to ABM ranged from 0.3 to 16.4 Gy. Dose-volume histogram analysis showed that the median doses (D50%) ranged from 0.06 to 12.8 Gy. We also evaluated the inhomogeneity of individual patient ABM dose distribution according to clinical situation. It was evident that the coefficient of variation of the dose for the whole ABM ranged from 1.0 to 5.7, which means that the standard deviation could be more than 5 times higher than the mean. Conclusions: For patients with available long-term follow-up data, our method provides reconstruction of dose-volume data comparable to detailed dose calculations, which have become standard in modern CT-based 3-dimensional RT planning. Our strategy of using dose-volume histograms offers new perspectives to retrospective epidemiological studies.
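
    A cumulative DVH of the kind reported here can be sketched in a few lines: for each dose level, it gives the fraction of calculation points (and hence of ABM volume, with points equally volume-weighted) receiving at least that dose. The dose samples below are hypothetical, for illustration only.

        import numpy as np

        def cumulative_dvh(point_doses, dose_grid):
            # Fraction of ABM calculation points receiving at least each
            # dose on the grid (points are equally volume-weighted here).
            d = np.sort(np.asarray(point_doses))
            below = np.searchsorted(d, dose_grid, side='left')
            return 1.0 - below / d.size

        doses = np.random.gamma(2.0, 1.5, size=100_000)   # hypothetical doses (Gy)
        grid = np.linspace(0.0, doses.max(), 200)
        dvh = cumulative_dvh(doses, grid)
        d50 = grid[np.argmin(np.abs(dvh - 0.5))]          # median dose D50%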

  4. Independent histogram pursuit for segmentation of skin lesions

    DEFF Research Database (Denmark)

    Gomez, D.D.; Butakoff, C.; Ersbøll, Bjarne Kjær

    2008-01-01

    In this paper, an unsupervised algorithm, called the Independent Histogram Pursuit (HIP), for segmenting dermatological lesions is proposed. The algorithm estimates a set of linear combinations of image bands that enhance different structures embedded in the image. In particular, the first estima...... to deal with different types of dermatological lesions. The boundary detection precision using k-means segmentation was close to 97%. The proposed algorithm can be easily combined with the majority of classification algorithms....

  5. Color and Contrast Enhancement by Controlled Piecewise Affine Histogram Equalization

    Directory of Open Access Journals (Sweden)

    Jose-Luis Lisani

    2012-10-01

    Full Text Available This paper presents a simple contrast enhancement algorithm based on histogram equalization (HE). The proposed algorithm performs a piecewise affine transform of the intensity levels of a digital image such that the new cumulative distribution function will be approximately uniform (as with HE), but where the stretching of the range is locally controlled to avoid brutal noise enhancement. We call this algorithm Piecewise Affine Equalization (PAE). Several experiments show that, in general, the new algorithm improves HE results.
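
    The PAE algorithm itself is specified in the paper; as a rough illustration of the idea of slope-limited piecewise affine equalization, the toy sketch below clamps the per-segment stretch that plain HE would apply. The segment count, slope cap, and renormalization are our own assumptions, not the PAE parameters.

        import numpy as np

        def piecewise_affine_equalization(image, pieces=8, max_slope=3.0):
            # HE-like transform, affine on each intensity segment, with the
            # slope clamped so quasi-flat (noisy) regions are not
            # over-stretched.  Assumes a non-degenerate 8-bit histogram.
            hist = np.bincount(image.ravel(), minlength=256).astype(float)
            cdf = np.cumsum(hist) / hist.sum()
            knots = np.linspace(0, 255, pieces + 1).astype(int)
            lut = np.zeros(256)
            y = 0.0
            for a, b in zip(knots[:-1], knots[1:]):
                he_slope = 255 * (cdf[b] - cdf[a]) / max(b - a, 1)
                slope = np.clip(he_slope, 1.0 / max_slope, max_slope)
                lut[a:b + 1] = y + slope * (np.arange(a, b + 1) - a)
                y = lut[b]
            lut = np.clip(255 * lut / lut[-1], 0, 255)   # renormalize range
            return lut[image].astype(np.uint8)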

  6. Breast density pattern characterization by histogram features and texture descriptors

    Directory of Open Access Journals (Sweden)

    Pedro Cunha Carneiro

    2017-04-01

    Full Text Available Abstract Introduction Breast cancer is the leading cause of death for women in Brazil, as in most countries in the world. Because breast density is related to the risk of breast cancer, breast density classification is part of medical practice; however, it is merely visual and dependent on professional experience, making the task very subjective. The purpose of this paper is to investigate image features based on histograms and Haralick texture descriptors so as to separate mammographic images into categories of breast density using an Artificial Neural Network. Methods We used 307 mammographic images from the INbreast digital database, extracting histogram features and texture descriptors of all mammograms and selecting them with the K-means technique. Then, these groups of selected features were used as inputs of an Artificial Neural Network to classify the images automatically into the four categories reported by radiologists. Results An average accuracy of 92.9% was obtained in a few tests using only some of the Haralick texture descriptors. Also, the accuracy rate increased to 98.95% when texture descriptors were mixed with some features based on a histogram. Conclusion Texture descriptors have proven to be better than gray-level features at differentiating breast densities in mammographic images. From this paper, it was possible to automate the feature selection and the classification with acceptable error rates, since the extraction of the features is suited to the characteristics of the images involved in the problem.

  7. Contrast Enhancement Algorithm Based on Gap Adjustment for Histogram Equalization

    Directory of Open Access Journals (Sweden)

    Chung-Cheng Chiu

    2016-06-01

    Full Text Available Image enhancement methods have been widely used to improve the visual effects of images. Owing to its simplicity and effectiveness, histogram equalization (HE) is one of the methods used for enhancing image contrast. However, HE may result in over-enhancement and feature loss problems that lead to an unnatural look and loss of details in the processed images. Researchers have proposed various HE-based methods to solve the over-enhancement problem; however, they have largely ignored the feature loss problem. Therefore, a contrast enhancement algorithm based on gap adjustment for histogram equalization (CegaHE) is proposed. It refers to a visual contrast enhancement algorithm based on histogram equalization (VCEA), which generates visually pleasing enhanced images, and improves the enhancement effects of VCEA. CegaHE adjusts the gaps between two gray values based on the adjustment equation, which takes the properties of human visual perception into consideration, to solve the over-enhancement problem. It also alleviates the feature loss problem and further enhances the textures in the dark regions of the images to improve the quality of the processed images for human visual perception. Experimental results demonstrate that CegaHE is a reliable method for contrast enhancement and that it significantly outperforms VCEA and other methods.

  8. Contrast Enhancement Algorithm Based on Gap Adjustment for Histogram Equalization

    Science.gov (United States)

    Chiu, Chung-Cheng; Ting, Chih-Chung

    2016-01-01

    Image enhancement methods have been widely used to improve the visual effects of images. Owing to its simplicity and effectiveness, histogram equalization (HE) is one of the methods used for enhancing image contrast. However, HE may result in over-enhancement and feature loss problems that lead to an unnatural look and loss of details in the processed images. Researchers have proposed various HE-based methods to solve the over-enhancement problem; however, they have largely ignored the feature loss problem. Therefore, a contrast enhancement algorithm based on gap adjustment for histogram equalization (CegaHE) is proposed. It refers to a visual contrast enhancement algorithm based on histogram equalization (VCEA), which generates visually pleasing enhanced images, and improves the enhancement effects of VCEA. CegaHE adjusts the gaps between two gray values based on the adjustment equation, which takes the properties of human visual perception into consideration, to solve the over-enhancement problem. It also alleviates the feature loss problem and further enhances the textures in the dark regions of the images to improve the quality of the processed images for human visual perception. Experimental results demonstrate that CegaHE is a reliable method for contrast enhancement and that it significantly outperforms VCEA and other methods. PMID:27338412

  9. Test Plan: WIPP bin-scale CH TRU waste tests

    International Nuclear Information System (INIS)

    Molecke, M.A.

    1990-08-01

    This WIPP Bin-Scale CH TRU Waste Test program described herein will provide relevant composition and kinetic rate data on gas generation and consumption resulting from TRU waste degradation, as impacted by synergistic interactions due to multiple degradation modes, waste form preparation, long-term repository environmental effects, engineered barrier materials, and, possibly, engineered modifications to be developed. Similar data on waste-brine leachate compositions and potentially hazardous volatile organic compounds released by the wastes will also be provided. The quantitative data output from these tests and associated technical expertise are required by the WIPP Performance Assessment (PA) program studies, and for the scientific benefit of the overall WIPP project. This Test Plan describes the necessary scientific and technical aspects, justifications, and rationale for successfully initiating and conducting the WIPP Bin-Scale CH TRU Waste Test program. This Test Plan is the controlling scientific design definition and overall requirements document for this WIPP in situ test, as defined by Sandia National Laboratories (SNL), scientific advisor to the US Department of Energy, WIPP Project Office (DOE/WPO). 55 refs., 16 figs., 19 tabs

  10. Smart Bin: Internet-of-Things Garbage Monitoring System

    Directory of Open Access Journals (Sweden)

    Mustafa M.R

    2017-01-01

    Full Text Available This work introduces the design and development of a garbage monitoring system for a smart, green environment, which measures the garbage level in real time and alerts the municipality whenever a bin is full, according to the type of garbage. The proposed system consists of ultrasonic sensors that measure the garbage level and an ARM microcontroller that controls system operation, with everything connected to ThingSpeak. This work demonstrates a system that allows waste management to monitor bins based on the depth of garbage inside the dustbin. The system shows the status of four different types of garbage (domestic waste, paper, glass and plastic) through an LCD and ThingSpeak in real time, storing the data for future use and analysis, such as prediction of the peak level of garbage bin fullness. It is expected that this system can create a greener environment by monitoring and controlling the collection of garbage smartly through the Internet of Things.

  11. Non-parametric comparison of histogrammed two-dimensional data distributions using the Energy Test

    International Nuclear Information System (INIS)

    Reid, Ivan D; Lopes, Raul H C; Hobson, Peter R

    2012-01-01

    When monitoring complex experiments, comparison is often made between regularly acquired histograms of data and reference histograms which represent the ideal state of the equipment. With the larger HEP experiments now ramping up, there is a need for automation of this task since the volume of comparisons could overwhelm human operators. However, the two-dimensional histogram comparison tools available in ROOT have been noted in the past to exhibit shortcomings. We discuss a newer comparison test for two-dimensional histograms, based on the Energy Test of Aslan and Zech, which provides more conclusive discrimination between histograms of data coming from different distributions than methods provided in a recent ROOT release.
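
    A sketch of the energy statistic for two binned 2D distributions, following the Aslan-Zech distance weighting R(r) = -ln(r + eps), is given below. Treating bin centers as weighted points and including the regularized self-distance terms are our own simplifications; in practice, significance would come from permutation tests.

        import numpy as np

        def energy_statistic(counts_a, counts_b, centers, eps=1e-3):
            # Aslan-Zech energy statistic between two histograms given as
            # flattened bin contents over common (nbins, 2) bin centers.
            # R(r) = -ln(r + eps); eps regularizes the (included)
            # self-distance terms -- a simplification for binned data.
            u = counts_a / counts_a.sum()
            v = counts_b / counts_b.sum()
            d = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=2)
            R = -np.log(d + eps)
            # Smaller values indicate more similar distributions; a p-value
            # would come from permutation tests over the pooled entries.
            return 0.5 * (u @ R @ u) + 0.5 * (v @ R @ v) - u @ R @ v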

  12. Risk capital allocation

    DEFF Research Database (Denmark)

    Hougaard, Jens Leth; Smilgins, Aleksandrs

    Risk capital allocation problems have been widely discussed in the academic literature. We consider a company with multiple subunits having individual portfolios. Hence, when portfolios of subunits are merged, a diversification benefit arises: the risk of the company as a whole is smaller than the sum of the risks of the individual subunits. The question is how to allocate the risk capital of the company among the subunits in a fair way. In this paper we propose to use the Lorenz set as an allocation method. We show that the Lorenz set is operational and coherent. Moreover, we propose a set of new axioms related directly to the problem of risk capital allocation and show that the Lorenz set satisfies these new axioms in contrast to other well-known coherent methods. Finally, we discuss how to deal with non-uniqueness of the Lorenz set.

  13. CPD Allocations and Awards

    Data.gov (United States)

    Department of Housing and Urban Development — The CPD Allocation and Award database provides filterable on-screen and exportable reports on select programs, such as the Community Development Block Grant Program,...

  14. Brassinosteroids regulate pavement cell growth by mediating BIN2-induced microtubule stabilization.

    Science.gov (United States)

    Liu, Xiaolei; Yang, Qin; Wang, Yuan; Wang, Linhai; Fu, Ying; Wang, Xuelu

    2018-02-23

    Brassinosteroids (BRs), a group of plant steroid hormones, play important roles in regulating plant development. The cytoskeleton also affects key developmental processes and a deficiency in BR biosynthesis or signaling leads to abnormal phenotypes similar to those of microtubule-defective mutants. However, how BRs regulate microtubule and cell morphology remains unknown. Here, using liquid chromatography-tandem mass spectrometry, we identified tubulin proteins that interact with Arabidopsis BRASSINOSTEROID INSENSITIVE2 (BIN2), a negative regulator of BR responses in plants. In vitro and in vivo pull-down assays confirmed that BIN2 interacts with tubulin proteins. High-speed co-sedimentation assays demonstrated that BIN2 also binds microtubules. The Arabidopsis genome also encodes two BIN2 homologs, BIN2-LIKE 1 (BIL1) and BIL2, which function redundantly with BIN2. In the bin2-3 bil1 bil2 triple mutant, cortical microtubules were more sensitive to treatment with the microtubule-disrupting drug oryzalin than in wild-type, whereas in the BIN2 gain-of-function mutant bin2-1, cortical microtubules were insensitive to oryzalin treatment. These results provide important insight into how BR regulates plant pavement cell and leaf growth by mediating the stabilization of microtubules by BIN2.

  15. MRI histogram analysis enables objective and continuous classification of intervertebral disc degeneration.

    Science.gov (United States)

    Waldenberg, Christian; Hebelka, Hanna; Brisby, Helena; Lagerstrand, Kerstin Magdalena

    2018-05-01

    Magnetic resonance imaging (MRI) is the best diagnostic imaging method for low back pain. However, the technique is currently not utilized in its full capacity, often failing to depict painful intervertebral discs (IVDs), potentially due to the rough degeneration classification system used clinically today. MR image histograms, which reflect the IVD heterogeneity, may offer sensitive imaging biomarkers for IVD degeneration classification. This study investigates the feasibility of using histogram analysis as means of objective and continuous grading of IVD degeneration. Forty-nine IVDs in ten low back pain patients (six males, 25-69 years) were examined with MRI (T2-weighted images and T2-maps). Each IVD was semi-automatically segmented on three mid-sagittal slices. Histogram features of the IVD were extracted from the defined regions of interest and correlated to Pfirrmann grade. Both T2-weighted images and T2-maps displayed similar histogram features. Histograms of well-hydrated IVDs displayed two separate peaks, representing annulus fibrosus and nucleus pulposus. Degenerated IVDs displayed decreased peak separation, where the separation was shown to correlate strongly with Pfirrmann grade, while IVDs of the same Pfirrmann grade could display different histogram appearances. Histogram features correlated well with IVD degeneration, suggesting that IVD histogram analysis is a suitable tool for objective and continuous IVD degeneration classification. As histogram analysis revealed IVD heterogeneity, it may be a clinical tool for characterization of regional IVD degeneration effects. To elucidate the usefulness of histogram analysis in patient management, IVD histogram features between asymptomatic and symptomatic individuals need to be compared.

  16. Low-Light Image Enhancement Using Adaptive Digital Pixel Binning

    Directory of Open Access Journals (Sweden)

    Yoonjong Yoo

    2015-06-01

    Full Text Available This paper presents an image enhancement algorithm for low-light scenes in an environment with insufficient illumination. Simple amplification of intensity exhibits various undesired artifacts: noise amplification, intensity saturation, and loss of resolution. In order to enhance low-light images without undesired artifacts, a novel digital binning algorithm is proposed that considers brightness, context, noise level, and anti-saturation of a local region in the image. The proposed algorithm does not require any modification of the image sensor or additional frame-memory; it needs only two line-memories in the image signal processor (ISP). Since the proposed algorithm does not use an iterative computation, it can be easily embedded in an existing digital camera ISP pipeline containing a high-resolution image sensor.

  17. Bin packing problem solution through a deterministic weighted finite automaton

    Science.gov (United States)

    Zavala-Díaz, J. C.; Pérez-Ortega, J.; Martínez-Rebollar, A.; Almanza-Ortega, N. N.; Hidalgo-Reyes, M.

    2016-06-01

    In this article, the solution of the one-dimensional Bin Packing problem through a weighted finite automaton is presented. The construction of the automaton and its application to three different instances, one synthetic data set and two benchmarks, are presented: N1C1W1_A.BPP, belonging to data set Set_1, and BPP13.BPP, belonging to hard28. The optimal solution of the synthetic data is obtained. On the first benchmark, the solution obtained uses one container more than the ideal number of containers, and on the second benchmark, the solution uses two containers more than the ideal solution (approximately 2.5%). The runtime in all three cases was less than one second.

  18. BinAligner: a heuristic method to align biological networks.

    Science.gov (United States)

    Yang, Jialiang; Li, Jun; Grünewald, Stefan; Wan, Xiu-Feng

    2013-01-01

    The advances in high throughput omics technologies have made it possible to characterize molecular interactions within and across various species. Alignments and comparison of molecular networks across species will help detect orthologs and conserved functional modules and provide insights on the evolutionary relationships of the compared species. However, such analyses are not trivial due to the complexity of the networks and the high computational cost. Here we develop BinAligner, an algorithm for network alignment that mixes global and local approaches. Based on the hypotheses that the similarity between two vertices across networks would be context-dependent and that the information from the edges and the structures of subnetworks can be more informative than vertices alone, two scoring schemes, 1-neighborhood subnetwork and graphlet, were introduced to derive the scoring matrices between networks, besides the commonly used scoring scheme from vertices. Then the alignment problem is formulated as an assignment problem, which is solved by a combinatorial optimization algorithm, such as the Hungarian method. The proposed algorithm was applied and validated in aligning the protein-protein interaction network of Kaposi's sarcoma associated herpesvirus (KSHV) and that of varicella zoster virus (VZV). Interestingly, we identified several putative functional orthologous proteins with similar functions but very low sequence similarity between the two viruses. For example, KSHV open reading frame 56 (ORF56) and VZV ORF55 are helicase-primase subunits with sequence identity 14.6%, and KSHV ORF75 and VZV ORF44 are tegument proteins with sequence identity 15.3%. These functional pairs cannot be identified if one restricts the alignment to orthologous protein pairs. In addition, BinAligner identified a conserved pathway between the two viruses, which consists of 7 orthologous protein pairs whose proteins are connected by conserved links. This pathway might be crucial for virus packing and

  19. The ESCRT-III pathway facilitates cardiomyocyte release of cBIN1-containing microparticles.

    Directory of Open Access Journals (Sweden)

    Bing Xu

    2017-08-01

    Full Text Available Microparticles (MPs) are cell-cell communication vesicles derived from the cell surface plasma membrane, although they are not known to originate from cardiac ventricular muscle. In ventricular cardiomyocytes, the membrane deformation protein cardiac bridging integrator 1 (cBIN1 or BIN1+13+17) creates transverse-tubule (t-tubule) membrane microfolds, which facilitate ion channel trafficking and modulate local ionic concentrations. The microfold-generated microdomains continuously reorganize, adapting in response to stress to modulate the calcium signaling apparatus. We explored the possibility that cBIN1-microfolds are externally released from cardiomyocytes. Using electron microscopy imaging with immunogold labeling, we found in mouse plasma that cBIN1 exists in membrane vesicles about 200 nm in size, which is consistent with the size of MPs. In mice with cardiac-specific heterozygous Bin1 deletion, flow cytometry identified 47% less cBIN1-MPs in plasma, supporting cardiac origin. Cardiac release was also evidenced by the detection of cBIN1-MPs in medium bathing a pure population of isolated adult mouse cardiomyocytes. In human plasma, osmotic shock increased cBIN1 detection by enzyme-linked immunosorbent assay (ELISA), and cBIN1 level decreased in humans with heart failure, a condition with reduced cardiac muscle cBIN1, both of which support cBIN1 release in MPs from human hearts. Exploring putative mechanisms of MP release, we found that the membrane fission complex endosomal sorting complexes required for transport (ESCRT)-III subunit charged multivesicular body protein 4B (CHMP4B) colocalizes and coimmunoprecipitates with cBIN1, an interaction enhanced by actin stabilization. In HeLa cells with cBIN1 overexpression, knockdown of CHMP4B reduced the release of cBIN1-MPs. Using truncation mutants, we identified that the N-terminal BAR (N-BAR) domain in cBIN1 is required for CHMP4B binding and MP release. This study links the BAR protein superfamily

  20. The ESCRT-III pathway facilitates cardiomyocyte release of cBIN1-containing microparticles.

    Science.gov (United States)

    Xu, Bing; Fu, Ying; Liu, Yan; Agvanian, Sosse; Wirka, Robert C; Baum, Rachel; Zhou, Kang; Shaw, Robin M; Hong, TingTing

    2017-08-01

    Microparticles (MPs) are cell-cell communication vesicles derived from the cell surface plasma membrane, although they are not known to originate from cardiac ventricular muscle. In ventricular cardiomyocytes, the membrane deformation protein cardiac bridging integrator 1 (cBIN1 or BIN1+13+17) creates transverse-tubule (t-tubule) membrane microfolds, which facilitate ion channel trafficking and modulate local ionic concentrations. The microfold-generated microdomains continuously reorganize, adapting in response to stress to modulate the calcium signaling apparatus. We explored the possibility that cBIN1-microfolds are externally released from cardiomyocytes. Using electron microscopy imaging with immunogold labeling, we found in mouse plasma that cBIN1 exists in membrane vesicles about 200 nm in size, which is consistent with the size of MPs. In mice with cardiac-specific heterozygous Bin1 deletion, flow cytometry identified 47% less cBIN1-MPs in plasma, supporting cardiac origin. Cardiac release was also evidenced by the detection of cBIN1-MPs in medium bathing a pure population of isolated adult mouse cardiomyocytes. In human plasma, osmotic shock increased cBIN1 detection by enzyme-linked immunosorbent assay (ELISA), and cBIN1 level decreased in humans with heart failure, a condition with reduced cardiac muscle cBIN1, both of which support cBIN1 release in MPs from human hearts. Exploring putative mechanisms of MP release, we found that the membrane fission complex endosomal sorting complexes required for transport (ESCRT)-III subunit charged multivesicular body protein 4B (CHMP4B) colocalizes and coimmunoprecipitates with cBIN1, an interaction enhanced by actin stabilization. In HeLa cells with cBIN1 overexpression, knockdown of CHMP4B reduced the release of cBIN1-MPs. Using truncation mutants, we identified that the N-terminal BAR (N-BAR) domain in cBIN1 is required for CHMP4B binding and MP release. This study links the BAR protein superfamily to the ESCRT

  1. An experimental comparison of some heuristics for cardinality constrained bin packing problem

    Directory of Open Access Journals (Sweden)

    Maja Remic

    2012-01-01

    Full Text Available Background: Bin packing is an NP-hard optimization problem of packing items of given sizes into a minimum number of capacity-limited bins. Besides the basic problem, numerous other variants of bin packing exist. The cardinality constrained bin packing adds an additional constraint that the number of items in a bin must not exceed a given limit Nmax. Objectives: The goal of the paper is to present a preliminary experimental study which demonstrates adaptations of the new algorithms to the general cardinality constrained bin packing problem. Methods/Approach: Straightforward modifications of First Fit Decreasing (FFD), Refined First Fit (RFF) and the algorithm by Zhang et al. for the bin packing problem are compared to four cardinality constrained bin packing problem specific algorithms on random lists of items with 0%, 10%, 30% and 50% of large items. The behaviour of all algorithms when the cardinality constraint Nmax increases is also studied. Results: Results show that all specific algorithms outperform the general algorithms on lists with a low percentage of big items. Conclusions: One of the specific algorithms performs better or equally well even on lists with a high percentage of big items and is therefore of significant interest. The behaviour when Nmax increases shows that specific algorithms can be used for solving the general bin packing problem as well.
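
    The straightforward FFD modification referred to above can be sketched as follows: a bin accepts an item only if both the residual capacity and the Nmax count limit allow it. The function name and example data are illustrative, not the paper's test instances.

        def first_fit_decreasing_cc(items, capacity, nmax):
            # First Fit Decreasing with the cardinality constraint: a bin
            # takes an item only if both the residual capacity and the
            # Nmax item-count limit allow it.
            bins = []       # (remaining_capacity, item_count) per open bin
            packing = []    # items placed in each bin
            for size in sorted(items, reverse=True):
                for i, (room, count) in enumerate(bins):
                    if size <= room and count < nmax:
                        bins[i] = (room - size, count + 1)
                        packing[i].append(size)
                        break
                else:
                    bins.append((capacity - size, 1))
                    packing.append([size])
            return packing

        # Example: six items, unit capacity, at most two items per bin -> 3 bins.
        print(len(first_fit_decreasing_cc([0.6, 0.5, 0.4, 0.3, 0.2, 0.2], 1.0, 2)))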

  2. Histogram specification as a method of density modification

    Energy Technology Data Exchange (ETDEWEB)

    Harrison, R.W.

    1988-12-01

    A new method for improving the quality and extending the resolution of Fourier maps is described. The method is based on a histogram analysis of the electron density. The distribution of electron density values in the map is forced to be 'ideal'. The 'ideal' distribution is assumed to be Gaussian. The application of the method to improve the electron density map for the protein Acinetobacter asparaginase, which is a tetrameric enzyme of molecular weight 140000 daltons, is described.

  3. Histogram Modification and Wavelet Transform for High Performance Watermarking

    Directory of Open Access Journals (Sweden)

    Ying-Shen Juang

    2012-01-01

    Full Text Available This paper proposes a reversible watermarking technique for natural images. According to the similarity of neighbor coefficients' values in the wavelet domain, most differences between two adjacent pixels are close to zero. The histogram is built based on these difference statistics. As more peak points can be used for secret data hiding, the hiding capacity is improved compared with conventional methods. Moreover, as the concentration of the differences around zero is improved, the transparency of the host image can be increased. Experimental results and comparisons show that the proposed method has advantages in both hiding capacity and transparency.

  4. High capacity, high speed histogramming data acquisition memory

    International Nuclear Information System (INIS)

    Epstein, A.; Boulin, C.

    1996-01-01

    A double-width CAMAC DRAM store module was developed for use as a histogramming memory in fast time-resolved synchrotron radiation applications to molecular biology. High-speed direct memory modify (3 MHz) is accomplished by using a discrete DRAM controller and fast page mode access. The module can be configured using standard SIMMs to sizes of up to 64M-words. The word width is 16 bits and the module can handle overflows by storing the overflow addresses in a dedicated FIFO. Simultaneous front panel DMM/DMI access and CAMAC readout of the overflow addresses is supported.

  5. WASP (Write a Scientific Paper) using Excel - 4: Histograms.

    Science.gov (United States)

    Grech, Victor

    2018-02-01

    Plotting data into graphs is a crucial step in data analysis as part of an initial descriptive statistics exercise since it gives the researcher an overview of the shape and nature of the data. Outlier values may also be identified, and these may be incorrect data, or true and important outliers. This paper explains how to access Microsoft Excel's Analysis Toolpak and provides some pointers for the utilisation of the histogram tool within the Toolpak.

  6. Text-Independent Speaker Identification Using the Histogram Transform Model

    DEFF Research Database (Denmark)

    Ma, Zhanyu; Yu, Hong; Tan, Zheng-Hua

    2016-01-01

    In this paper, we propose a novel probabilistic method for the task of text-independent speaker identification (SI). In order to capture the dynamic information during SI, we design super-MFCC features by cascading three neighboring Mel-frequency cepstral coefficients (MFCCs) frames together. These super-MFCC vectors are utilized for probabilistic model training such that the speaker's characteristics can be sufficiently captured. The probability density function (PDF) of the aforementioned super-MFCC features is estimated by the recently proposed histogram transform (HT) method.
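
    The super-MFCC construction, cascading each frame with its two neighbors, reduces to a simple array-stacking operation; a sketch (our own, not the authors' code) follows.

        import numpy as np

        def super_mfcc(mfcc):
            # Cascade each frame with its two neighbors: frame t becomes
            # [mfcc[t-1]; mfcc[t]; mfcc[t+1]].  Input (T, D) -> output
            # (T-2, 3*D).  A sketch, not the authors' implementation.
            return np.hstack([mfcc[:-2], mfcc[1:-1], mfcc[2:]])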

  7. Allocating multiple units

    DEFF Research Database (Denmark)

    Tranæs, Torben; Krishna, Kala

    2002-01-01

    This paper studies the allocation and rent distribution in multi-unit, combinatorial-bid auctions under complete information. We focus on the natural multi-unit analogue of the first-price auction, where buyers bid total payments, pay their bids, and where the seller allocates goods to maximize his...... auction, which is the multi-unit analogue of a second-price auction. Furthermore, we characterize these equilibria when valuations take a number of different forms: diminishing marginal valuations, increasing average valuations, and marginal valuations with single turning points

  8. Potential fitting biases resulting from grouping data into variable width bins

    International Nuclear Information System (INIS)

    Towers, S.

    2014-01-01

    When reading peer-reviewed scientific literature describing any analysis of empirical data, it is natural and correct to proceed with the underlying assumption that experiments have made good faith efforts to ensure that their analyses yield unbiased results. However, particle physics experiments are expensive and time consuming to carry out, thus if an analysis has inherent bias (even if unintentional), much money and effort can be wasted trying to replicate or understand the results, particularly if the analysis is fundamental to our understanding of the universe. In this note we discuss the significant biases that can result from data binning schemes. As we will show, if data are binned such that they provide the best comparison to a particular (but incorrect) model, the resulting model parameter estimates when fitting to the binned data can be significantly biased, leading us to too often accept the model hypothesis when it is not in fact true. When using binned likelihood or least squares methods there is of course no a priori requirement that data bin sizes need to be constant, but we show that fitting to data grouped into variable width bins is particularly prone to produce biased results if the bin boundaries are chosen to optimize the comparison of the binned data to a wrong model. The degree of bias that can be achieved simply with variable binning can be surprisingly large. Fitting the data with an unbinned likelihood method, when possible to do so, is the best way for researchers to show that their analyses are not biased by binning effects. Failing that, equal bin widths should be employed as a cross-check of the fitting analysis whenever possible

  9. Potential fitting biases resulting from grouping data into variable width bins

    Energy Technology Data Exchange (ETDEWEB)

    Towers, S., E-mail: smtowers@asu.edu

    2014-07-30

    When reading peer-reviewed scientific literature describing any analysis of empirical data, it is natural and correct to proceed with the underlying assumption that experiments have made good faith efforts to ensure that their analyses yield unbiased results. However, particle physics experiments are expensive and time consuming to carry out, thus if an analysis has inherent bias (even if unintentional), much money and effort can be wasted trying to replicate or understand the results, particularly if the analysis is fundamental to our understanding of the universe. In this note we discuss the significant biases that can result from data binning schemes. As we will show, if data are binned such that they provide the best comparison to a particular (but incorrect) model, the resulting model parameter estimates when fitting to the binned data can be significantly biased, leading us to too often accept the model hypothesis when it is not in fact true. When using binned likelihood or least squares methods there is of course no a priori requirement that data bin sizes need to be constant, but we show that fitting to data grouped into variable width bins is particularly prone to produce biased results if the bin boundaries are chosen to optimize the comparison of the binned data to a wrong model. The degree of bias that can be achieved simply with variable binning can be surprisingly large. Fitting the data with an unbinned likelihood method, when possible to do so, is the best way for researchers to show that their analyses are not biased by binning effects. Failing that, equal bin widths should be employed as a cross-check of the fitting analysis whenever possible.

  10. Differentially Private Event Histogram Publication on Sequences over Graphs

    Institute of Scientific and Technical Information of China (English)

    Ning Wang; Yu Gu; Jia Xu; Fang-Fang Li; Ge Yu

    2017-01-01

    The big data era is coming with strong and ever-growing demands for analyzing personal information and footprints in the cyber world. To enable such analysis without privacy leak risk, differential privacy (DP) has been quickly rising in recent years as the first practical privacy protection model with a rigorous theoretical guarantee. This paper discusses how to publish differentially private histograms on events in the time series domain, with sequences of personal events over graphs with events as edges. Such individual-generated sequences commonly appear in formalized industrial workflows, online game logs, and spatial-temporal trajectories. Directly publishing the statistics of sequences may compromise personal privacy. While existing DP mechanisms mainly target normalized domains with fixed and aligned dimensions, our problem raises new challenges when the sequences can follow arbitrary paths on the graph. To tackle the problem, we reformulate it with a three-step framework, which 1) carefully truncates the original sequences, trading off errors introduced by the truncation against those introduced by the noise added to guarantee privacy, 2) decomposes the event graph into path sub-domains based on a group of event pivots, and 3) employs a deeply optimized tree-based histogram construction approach for each sub-domain to benefit from less noise addition. We present a careful analysis of our framework to support thorough optimizations over each step, and verify the huge improvements of our proposals over state-of-the-art solutions.
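
    As background, the basic differentially private histogram publication primitive is the Laplace mechanism; a minimal sketch follows. The unit sensitivity assumes each individual affects one bin, which is exactly what the paper's truncation and decomposition steps are designed to ensure in the sequence setting.

        import numpy as np

        def dp_histogram(counts, epsilon, sensitivity=1.0):
            # Laplace mechanism: add Lap(sensitivity/epsilon) noise per bin.
            # Unit sensitivity assumes each individual affects one bin; the
            # paper's truncation/decomposition steps bound sensitivity in
            # the sequence setting before this primitive applies.
            counts = np.asarray(counts, dtype=float)
            noise = np.random.laplace(0.0, sensitivity / epsilon, counts.shape)
            return np.maximum(counts + noise, 0.0)   # clip for usability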

  11. Variational Histogram Equalization for Single Color Image Defogging

    Directory of Open Access Journals (Sweden)

    Li Zhou

    2016-01-01

    Full Text Available Foggy images taken in bad weather inevitably suffer from contrast loss and color distortion. Existing defogging methods concentrate on recovering an accurate scene transmission while ignoring the unpleasing distortions and high complexity this introduces. Different from previous works, we propose a simple but powerful method based on histogram equalization and the physical degradation model. By revising two constraints in a variational histogram equalization framework, the intensity component of a fog-free image can be estimated in HSI color space, since the airlight is inferred through a color attenuation prior in advance. To cut down the time consumption, a general variation filter is proposed to obtain a numerical solution from the revised framework. After getting the estimated intensity component, it is easy to infer the saturation component from the physical degradation model in the saturation channel. Accordingly, the fog-free image can be restored with the estimated intensity and saturation components. In the end, the proposed method is tested on several foggy images and assessed by two no-reference indexes. Experimental results reveal that our method is relatively superior to three groups of relevant and state-of-the-art defogging methods.
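
    For reference, the physical degradation model used here is I = J*t + A*(1 - t); once the airlight A and transmission t are known, the scene radiance can be inverted directly, as the sketch below shows (the clamping threshold is our own assumption, and the paper estimates the components variationally rather than assuming them given).

        import numpy as np

        def recover_scene(I, t, A, t_min=0.1):
            # Invert the degradation model I = J*t + A*(1 - t):
            #   J = (I - A) / t + A.
            # A (airlight) and t (transmission) are assumed given here; the
            # paper instead estimates the intensity component variationally
            # and infers saturation from this same model.  t_min is our own
            # clamp to avoid division blow-up.
            t = np.maximum(t, t_min)
            return (I - A) / t + A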

  12. Visual Contrast Enhancement Algorithm Based on Histogram Equalization

    Directory of Open Access Journals (Sweden)

    Chih-Chung Ting

    2015-07-01

    Full Text Available Image enhancement techniques primarily improve the contrast of an image to lend it a better appearance. One of the popular enhancement methods is histogram equalization (HE) because of its simplicity and effectiveness. However, it is rarely applied to consumer electronics products because it can cause excessive contrast enhancement and feature loss problems. These problems make the images processed by HE look unnatural and introduce unwanted artifacts in them. In this study, a visual contrast enhancement algorithm (VCEA) based on HE is proposed. VCEA considers the requirements of the human visual perception in order to address the drawbacks of HE. It effectively solves the excessive contrast enhancement problem by adjusting the spaces between two adjacent gray values of the HE histogram. In addition, VCEA reduces the effects of the feature loss problem by using the obtained spaces. Furthermore, VCEA enhances the detailed textures of an image to generate an enhanced image with better visual quality. Experimental results show that images obtained by applying VCEA have higher contrast and are more suited to human visual perception than those processed by HE and other HE-based methods.

  13. Visual Contrast Enhancement Algorithm Based on Histogram Equalization

    Science.gov (United States)

    Ting, Chih-Chung; Wu, Bing-Fei; Chung, Meng-Liang; Chiu, Chung-Cheng; Wu, Ya-Ching

    2015-01-01

    Image enhancement techniques primarily improve the contrast of an image to lend it a better appearance. One of the popular enhancement methods is histogram equalization (HE) because of its simplicity and effectiveness. However, it is rarely applied to consumer electronics products because it can cause excessive contrast enhancement and feature loss problems. These problems make the images processed by HE look unnatural and introduce unwanted artifacts in them. In this study, a visual contrast enhancement algorithm (VCEA) based on HE is proposed. VCEA considers the requirements of the human visual perception in order to address the drawbacks of HE. It effectively solves the excessive contrast enhancement problem by adjusting the spaces between two adjacent gray values of the HE histogram. In addition, VCEA reduces the effects of the feature loss problem by using the obtained spaces. Furthermore, VCEA enhances the detailed textures of an image to generate an enhanced image with better visual quality. Experimental results show that images obtained by applying VCEA have higher contrast and are more suited to human visual perception than those processed by HE and other HE-based methods. PMID:26184219

  14. Accelerated weight histogram method for exploring free energy landscapes

    Energy Technology Data Exchange (ETDEWEB)

    Lindahl, V.; Lidmar, J.; Hess, B. [Department of Theoretical Physics and Swedish e-Science Research Center, KTH Royal Institute of Technology, 10691 Stockholm (Sweden)

    2014-07-28

    Calculating free energies is an important and notoriously difficult task for molecular simulations. The rapid increase in computational power has made it possible to probe increasingly complex systems, yet extracting accurate free energies from these simulations remains a major challenge. Fully exploring the free energy landscape of, say, a biological macromolecule typically requires sampling large conformational changes and slow transitions. Often, the only feasible way to study such a system is to simulate it using an enhanced sampling method. The accelerated weight histogram (AWH) method is a new, efficient extended ensemble sampling technique which adaptively biases the simulation to promote exploration of the free energy landscape. The AWH method uses a probability weight histogram which allows for efficient free energy updates and results in an easy discretization procedure. A major advantage of the method is its general formulation, making it a powerful platform for developing further extensions and analyzing its relation to already existing methods. Here, we demonstrate its efficiency and general applicability by calculating the potential of mean force along a reaction coordinate for both a single dimension and multiple dimensions. We make use of a non-uniform, free energy dependent target distribution in reaction coordinate space so that computational efforts are not wasted on physically irrelevant regions. We present numerical results for molecular dynamics simulations of lithium acetate in solution and chignolin, a 10-residue long peptide that folds into a β-hairpin. We further present practical guidelines for setting up and running an AWH simulation.

  15. Image compression using moving average histogram and RBF network

    International Nuclear Information System (INIS)

    Khowaja, S.; Ismaili, I.A.

    2015-01-01

    Modernization and globalization have made multimedia technology one of the fastest growing fields in recent times, but the optimal use of bandwidth and storage has been one of the topics attracting the research community. Considering that images have a lion's share in multimedia communication, efficient image compression techniques have become a basic need for the optimal use of bandwidth and space. This paper proposes a novel method for image compression based on the fusion of a moving average histogram and an RBF (Radial Basis Function) network. The proposed technique employs the concept of reducing color intensity levels using a moving average histogram technique, followed by the correction of color intensity levels using RBF networks at the reconstruction phase. Existing methods have used low-resolution images for testing purposes, but the proposed method has been tested on various image resolutions to allow a clear assessment of the said technique. The proposed method has been tested on 35 images with varying resolutions and has been compared with existing algorithms in terms of CR (Compression Ratio), MSE (Mean Square Error), PSNR (Peak Signal to Noise Ratio), and computational complexity. The outcome shows that the proposed methodology is a better trade-off technique in terms of compression ratio, PSNR, which determines the quality of the image, and computational complexity. (author)

  16. Stochastic Learning of Multi-Instance Dictionary for Earth Mover's Distance based Histogram Comparison

    OpenAIRE

    Fan, Jihong; Liang, Ru-Ze

    2016-01-01

    Dictionary plays an important role in multi-instance data representation. It maps bags of instances to histograms. Earth mover's distance (EMD) is the most effective histogram distance metric for the application of multi-instance retrieval. However, up to now, there are no existing multi-instance dictionary learning methods designed for EMD-based histogram comparison. To fill this gap, we develop the first EMD-optimal dictionary learning method using a stochastic optimization method. In the stoc...

  17. Increasing Donations to Supermarket Food-Bank Bins Using Proximal Prompts

    Science.gov (United States)

    Farrimond, Samantha J.; Leland, Louis S., Jr.

    2006-01-01

    There has been little research into interventions to increase participation in donating items to food-bank bins. In New Zealand, there has been an increased demand from food banks (Stewart, 2002). This study demonstrated that point-of-sale prompts can be an effective method of increasing donations to a supermarket food-bank bin.

  18. Increasing Donations to Supermarket Food-Bank Bins Using Proximal Prompts

    OpenAIRE

    Farrimond, Samantha J; Leland, Louis S

    2006-01-01

    There has been little research into interventions to increase participation in donating items to food-bank bins. In New Zealand, there has been an increased demand from food banks (Stewart, 2002). This study demonstrated that point-of-sale prompts can be an effective method of increasing donations to a supermarket food-bank bin.

  19. A compost bin for handling privy wastes: its fabrication and use

    Science.gov (United States)

    R.E. Leonard; S.C. Fay

    1978-01-01

    A 24-ft³ (6.8-m³) fiberglass bin was constructed and tested for its effectiveness in composting privy wastes. A mixture of ground hardwood bark and raw sewage was used for composting. Temperatures in excess of 60°C for 36 hours were produced in the bin by aerobic, thermophilic composting. This temperature is...

  20. Using a combination of binning strategies and taxonomic approaches to unravel the anaerobic digestion microbiome

    DEFF Research Database (Denmark)

    Campanaro, Stefano; Treu, Laura; Kougias, Panagiotis

    of scaffolds comprising thousands of genome sequences, but the binning of these scaffolds into OTUs representative of microbial genomes is still challenging. In an attempt to obtain a deep characterization of the anaerobic digestion microbiome, different metagenomic binning approaches were integrated

  1. Waste Isolation Pilot Plant Dry Bin-Scale Integrated Systems Checkout Plan

    International Nuclear Information System (INIS)

    1991-04-01

    In order to determine the long-term performance of the Waste Isolation Pilot Plant (WIPP) disposal system, in accordance with the requirements of the US Environmental Protection Agency (EPA) Standard 40 CFR 191, Subpart B, Sections 13 and 15, two performance assessment tests will be conducted. The tests are titled WIPP Bin-Scale Contact Handled (CH) Transuranic (TRU) Waste Tests and WIPP In Situ Alcove CH TRU Waste Tests. These tests are designed to measure the gas generation characteristics of CH TRU waste. Much of the waste will be specially prepared to provide data for a better understanding of the interactions due to differing degradation modes, waste forms, and repository environmental effects. The bin-scale test is designed to emplace nominally 146 bins. The majority of the bins will contain various forms of waste. Eight bins will be used as reference bins and will contain no waste. This checkout plan exercises the systems, operating procedures, and training readiness of personnel to safely carry out those specifically dedicated activities associated with conducting the bin-scale test plan for dry bins only. The plan does not address the entire WIPP facility readiness state. 18 refs., 6 figs., 3 tabs

  2. Using histograms to introduce randomization in the generation of ensembles of decision trees

    Science.gov (United States)

    Kamath, Chandrika; Cantu-Paz, Erick; Littau, David

    2005-02-22

    A system for decision tree ensembles that includes a module to read the data, a module to create a histogram, a module to evaluate a potential split according to some criterion using the histogram, a module to select a split point randomly in an interval around the best split, a module to split the data, and a module to combine multiple decision trees in ensembles. The decision tree method includes the steps of reading the data; creating a histogram; evaluating a potential split according to some criterion using the histogram; selecting a split point randomly in an interval around the best split; splitting the data; and combining multiple decision trees in ensembles.
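
    The described method can be paraphrased in a short sketch: histogram the feature, score each bin boundary, then draw the split point at random from an interval around the best boundary. The criterion (variance reduction) and window width below are our own illustrative assumptions.

        import numpy as np

        def histogram_random_split(x, y, n_bins=32, window=2, rng=None):
            # Histogram the feature, score every bin boundary, then draw the
            # split point uniformly from an interval around the best boundary.
            # The criterion (variance reduction) and window are illustrative.
            rng = np.random.default_rng(rng)
            edges = np.histogram_bin_edges(x, bins=n_bins)
            best, best_score = None, -np.inf
            for k in range(1, n_bins):
                left, right = y[x < edges[k]], y[x >= edges[k]]
                if len(left) == 0 or len(right) == 0:
                    continue
                score = y.var() - (len(left) * left.var()
                                   + len(right) * right.var()) / len(y)
                if score > best_score:
                    best, best_score = k, score
            if best is None:                      # degenerate feature
                return float(np.median(x))
            lo = edges[max(best - window, 0)]
            hi = edges[min(best + window, n_bins)]
            return rng.uniform(lo, hi)            # the randomized split point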

  3. Evaluation Of Plutonium Oxide Destructive Chemical Analyses For Validity Of Original 3013 Container Binning

    International Nuclear Information System (INIS)

    Mcclard, J.; Kessinger, G.

    2010-01-01

    The surveillance program for 3013 containers is based, in part, on the separation of containers into various bins related to potential container failure mechanisms. The containers are assigned to bins based on moisture content and pre-storage estimates of content chemistry. While moisture content is measured during the packaging of each container, chemistry estimates are made by using a combination of process knowledge, packaging data and prompt gamma analyses to establish the moisture and chloride/fluoride content of the materials. Packages with high moisture and chloride/fluoride contents receive more detailed surveillances than packages with less chloride/fluoride and/or moisture. Moisture verification measurements and chemical analyses performed during the surveillance program provided an opportunity to validate the binning process. Validation results demonstrated that the binning effort was generally successful in placing the containers in the appropriate bin for surveillance and analysis.

  4. Designing a power supply for Nim-bin formatted equipment

    International Nuclear Information System (INIS)

    Banuelos G, L. E.; Hernandez D, V. M.; Vega C, H. R.

    2016-09-01

    From an old Nuclear Chicago power supply that was practically in the trash, we were able to recover the 19-inch casing, the rear connectors, and the housing where the circuits were mounted. From here, all mechanical parts were cleaned and the electronic design was started to replace the original voltage and current functions of this equipment. The cards for the ±6, ±12 and ±24 V supplies were designed, simulated and tested with circuitry that does not rely on specialized components sold only by the equipment manufacturer. In the current handled at each operating voltage, it was possible to match the specifications of manufacturers such as Ortec or Canberra, whose comparable power supply models deliver 160 Watts. Basic tests, such as the full-load regulation index and the noise level in the supply voltages, were performed to show that the behavior is very similar to that of commercial equipment. Our Nim-bin voltage source is therefore viable for use in our institution's laboratories. (Author)

  5. A lifelong learning hyper-heuristic method for bin packing.

    Science.gov (United States)

    Sim, Kevin; Hart, Emma; Paechter, Ben

    2015-01-01

    We describe a novel hyper-heuristic system that continuously learns over time to solve a combinatorial optimisation problem. The system continuously generates new heuristics and samples problems from its environment; and representative problems and heuristics are incorporated into a self-sustaining network of interacting entities inspired by methods in artificial immune systems. The network is plastic in both its structure and content, leading to the following properties: it exploits existing knowledge captured in the network to rapidly produce solutions; it can adapt to new problems with widely differing characteristics; and it is capable of generalising over the problem space. The system is tested on a large corpus of 3,968 new instances of 1D bin-packing problems as well as on 1,370 existing problems from the literature; it shows excellent performance in terms of the quality of solutions obtained across the datasets and in adapting to dynamically changing sets of problem instances compared to previous approaches. As the network self-adapts to sustain a minimal repertoire of both problems and heuristics that form a representative map of the problem space, the system is further shown to be computationally efficient and therefore scalable.

  6. Looking at large data sets using binned data plots

    Energy Technology Data Exchange (ETDEWEB)

    Carr, D.B.

    1990-04-01

    This report addresses the monumental challenge of developing exploratory analysis methods for large data sets. The goals of the report are to increase awareness of large data set problems and to contribute simple graphical methods that address some of the problems. The graphical methods focus on two- and three-dimensional data and common tasks such as finding outliers and tail structure, assessing central structure, and comparing central structures. The methods handle large sample size problems through binning, incorporate information from statistical models, and adapt image processing algorithms. Examples demonstrate the application of the methods to a variety of publicly available large data sets. The most novel application addresses the "too many plots to examine" problem by using cognostics, computer guiding diagnostics, to prioritize plots. The particular application prioritizes views of computational fluid dynamics solution sets on the fly. That is, as each time step of a solution set is generated on a parallel processor, the cognostics algorithms assess virtual plots based on the previous time step. Work in such areas is in its infancy and the examples suggest numerous challenges that remain. 35 refs., 15 figs.
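
    A modern analogue of the report's binned data plots is a hexagonally binned density plot; the matplotlib sketch below (synthetic data, illustrative parameters) shows how binning keeps a million-point scatter readable.

        import numpy as np
        import matplotlib.pyplot as plt

        # A million points would overplot badly as a raw scatter; binned
        # counts (a log scale reveals tail structure) stay readable.
        rng = np.random.default_rng(0)
        x = rng.standard_normal(1_000_000)
        y = 0.5 * x + 0.5 * rng.standard_normal(1_000_000)

        plt.hexbin(x, y, gridsize=60, bins='log')
        plt.colorbar(label='log10(count)')
        plt.xlabel('x'); plt.ylabel('y')
        plt.show()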

  7. Fontes binárias supermoles de raios X

    Science.gov (United States)

    Pires, A. M.; Janot Pacheco, E.

    2003-08-01

    The physical characteristics of supersoft X-ray sources (SSS) are studied using optical and high-energy data, within the scope of an undergraduate research (IC) project. These are binaries that present a very soft X-ray spectrum, low temperatures and high bolometric luminosities. The systems consist of a white dwarf undergoing fusion at its surface, fed by matter lost from the companion star. Fusion residues accumulate on the surface of the white dwarf, which may exceed the Chandrasekhar limit and undergo gravitational collapse, this being one of the scenarios proposed for SN Ia explosions. We present in this communication the state of the art of the physical characteristics of SSS sources, placing them within the context of cataclysmic variables (CVs). We also seek to relate these objects to the galactic V Sge variables, insofar as the two groups present certain rather similar characteristics. The methodology adopted is the classical pedagogical-cognitive one of an undergraduate research project in the exact sciences.

  8. Solid waste bin detection and classification using Dynamic Time Warping and MLP classifier

    Energy Technology Data Exchange (ETDEWEB)

    Islam, Md. Shafiqul, E-mail: shafique@eng.ukm.my [Dept. of Electrical, Electronic and Systems Engineering, Universiti Kebangsaan Malaysia, Bangi 43600, Selangor (Malaysia); Hannan, M.A., E-mail: hannan@eng.ukm.my [Dept. of Electrical, Electronic and Systems Engineering, Universiti Kebangsaan Malaysia, Bangi 43600, Selangor (Malaysia); Basri, Hassan [Dept. of Civil and Structural Engineering, Universiti Kebangsaan Malaysia, Bangi 43600, Selangor (Malaysia); Hussain, Aini; Arebey, Maher [Dept. of Electrical, Electronic and Systems Engineering, Universiti Kebangsaan Malaysia, Bangi 43600, Selangor (Malaysia)

    2014-02-15

    Highlights: • Solid waste bin level detection using Dynamic Time Warping (DTW). • Gabor wavelet filter is used to extract the solid waste image features. • Multi-Layer Perceptron classifier network is used for bin image classification. • The classification performance was evaluated by ROC curve analysis. - Abstract: The increasing requirement for Solid Waste Management (SWM) has become a significant challenge for municipal authorities. A number of integrated systems and methods have been introduced to overcome this challenge. Many researchers have aimed to develop an ideal SWM system, including approaches involving software-based routing, Geographic Information Systems (GIS), Radio-Frequency Identification (RFID), or sensor-intelligent bins. Image processing solutions for Solid Waste (SW) collection have also been developed; however, when capturing the bin image it is challenging to position the camera so that the bin area is centered in the image. As yet, there is no ideal system which can correctly estimate the amount of SW. This paper briefly discusses an efficient image processing solution to overcome these problems. Dynamic Time Warping (DTW) was used for detecting and cropping the bin area and a Gabor wavelet (GW) was introduced for feature extraction of the waste bin image. The image features were used to train the classifier. A Multi-Layer Perceptron (MLP) classifier was used to classify the waste bin level and estimate the amount of waste inside the bin. The area under the Receiver Operating Characteristic (ROC) curve was used to statistically evaluate classifier performance. The results of this developed system are comparable to previous image-processing-based systems. The system demonstration using DTW with GW for feature extraction and an MLP classifier led to promising results with respect to the accuracy of waste level estimation (98.50%). The application can be used to optimize the routing of waste collection based on the estimated bin level.

  9. Surface contamination of hazardous drug pharmacy storage bins and pharmacy distributor shipping containers.

    Science.gov (United States)

    Redic, Kimberly A; Fang, Kayleen; Christen, Catherine; Chaffee, Bruce W

    2018-03-01

    Purpose This study was conducted to determine whether there is contamination on exterior drug packaging, using shipping totes from the distributor and carousel storage bins as surrogate markers of external packaging contamination. Methods A two-part study was conducted to measure the presence of 5-fluorouracil, ifosfamide, cyclophosphamide, docetaxel and paclitaxel using surrogate markers for external drug packaging. In Part I, 10 drug distributor shipping totes designated for transport of hazardous drugs provided a snapshot view of contamination from regular use and transit in and out of the pharmacy. An additional two totes designated for transport of non-hazardous drugs served as controls. In Part II, old carousel storage bins (i.e. those in use pre-study) were wiped for a snapshot view of hazardous drug contamination on storage bins. New carousel storage bins were then put into use for storage of the five tested drugs and used for routine storage and inventory maintenance activities. Carousel bins were wiped at 0, 8, 16 and 52 weeks to measure surface contamination. Results Two of the 10 hazardous shipping totes were contaminated. Three of the five old carousel bins were contaminated with cyclophosphamide. One of the old carousel bins was also contaminated with ifosfamide. There were no detectable levels of hazardous drugs on any of the new storage bins at 0, 8 or 16 weeks. However, at Week 52 there was a detectable level of 5-FU present in the 5-FU carousel bin. Conclusions Contamination of the surrogate markers suggests that external packaging for hazardous drugs is contaminated, either during the manufacturing process or during routine chain-of-custody activities. These results demonstrate that occupational exposure may occur due to contamination from shipping totes and storage bins, and that handling practices including the use of personal protective equipment are warranted.

  10. TOP-DRAWER, Histograms, Scatterplots, Curve-Smoothing

    International Nuclear Information System (INIS)

    Chaffee, R.B.

    1988-01-01

    Description of program or function: TOP DRAWER produces histograms, scatterplots, data points with error bars and plot symbols, and curves passing through data points, with elaborate titles. It also does smoothing and calculates frequency distributions. There is little facility, however, for arithmetic manipulation. Because of its restricted applicability, TOP DRAWER can be controlled by a relatively simple set of commands, and this control is further simplified by the choice of reasonable default values for all parameters. Despite this emphasis on simplicity, TOP DRAWER plots are of exceptional quality and are suitable for publication. Input is normally from card-image records, although a set of subroutines is provided to accommodate FORTRAN calls. The program contains switches which can be set to generate code suitable for execution on IBM, DEC VAX, and PRIME computers

  11. Steam leak detection method in pipeline using histogram analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Se Oh; Jeon, Hyeong Seop; Son, Ki Sung; Chae, Gyung Sun [Saean Engineering Corp, Seoul (Korea, Republic of); Park, Jong Won [Dept. of Information Communications Engineering, Chungnam National University, Daejeon (Korea, Republic of)

    2015-10-15

    Leak detection in a pipeline usually involves acoustic emission sensors, such as contact-type sensors. These contact-type sensors are difficult to install and cannot operate in areas of high temperature and radiation. Therefore, recently, many researchers have studied leak detection using a camera. Leak detection by camera has the advantages of long-distance monitoring and wide-area surveillance. However, the conventional leak detection method using difference images often mistakes the vibration of a structure for a leak. In this paper, we propose a method for steam leakage detection using the moving average of difference images and histogram analysis. The proposed method can separate leakage from the vibration of a structure. The working performance of the proposed method is verified by comparison with experimental results.
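
    The reported pipeline can be sketched as follows (an illustrative reconstruction, not the authors' code; the window length, bin count and tail-mass score are assumptions): average the difference images over time so that oscillatory structural vibration is smeared out while a persistent steam plume survives, then inspect the intensity histogram of the result.

        import numpy as np
        from scipy.ndimage import uniform_filter1d

        def leak_evidence(frames, window=10, bins=64):
            # frames: (T, H, W) grayscale video, values in [0, 255]
            diffs = np.abs(np.diff(frames.astype(float), axis=0))
            # Moving average of the difference images over time: transient
            # vibration averages down, a persistent plume does not.
            avg = uniform_filter1d(diffs, size=window, axis=0)
            # Histogram analysis: a plume adds mass to the upper tail of the
            # averaged difference image's intensity histogram.
            hist, _ = np.histogram(avg[-1], bins=bins, range=(0.0, 255.0))
            return hist[bins // 2:].sum() / max(hist.sum(), 1)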

  12. Histogram plots and cutoff energies for nuclear discrete levels

    International Nuclear Information System (INIS)

    Belgya, T.; Molnar, G.; Fazekas, B.; Oestoer, J.

    1997-05-01

    Discrete level schemes for 1277 nuclei, from ⁶Li through ²⁵¹Es, extracted from the Evaluated Nuclear Structure Data File were analyzed. Cutoff energies (Umax), indicating the upper limit of level scheme completeness, were deduced from the inspection of histograms of the cumulative number of levels. Parameters of the constant-temperature level density formula (nuclear temperature T and energy shift U0) were obtained by means of a least-squares fit of the formula to the known levels below the cutoff energy. The results are tabulated for all 1277 nuclei, allowing for an easy and reliable application of the constant-temperature level density approach. A complete set of cumulative plots of discrete levels is also provided. (author). 5 figs, 2 tabs
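
    The fit described amounts to least-squares fitting the constant-temperature form N(U) = exp((U - U0)/T) to the cumulative level count below Umax. A minimal sketch (the level energies below are invented for the example; the real data are tabulated in the report):

        import numpy as np
        from scipy.optimize import curve_fit

        def n_ct(U, T, U0):
            # Constant-temperature formula: cumulative number of levels at U
            return np.exp((U - U0) / T)

        levels = np.array([0.0, 0.8, 1.3, 1.7, 2.0, 2.2, 2.5, 2.7])  # MeV, sorted
        N_cum = np.arange(1, levels.size + 1)    # cumulative count at each level
        (T, U0), _ = curve_fit(n_ct, levels, N_cum, p0=(1.0, -1.0))
        print(f"T = {T:.2f} MeV, U0 = {U0:.2f} MeV")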

  13. TSimpleAnalysis: histogramming many trees in parallel

    CERN Document Server

    Giommi, Luca

    2016-01-01

    I worked in the ROOT team of the EP-SFT group. My project focused on writing a ROOT class that creates histograms from a TChain. The name of the class is TSimpleAnalysis and it is already integrated in ROOT. The work that I did was to write the source and header files of the class, and also a Python script that allows the user to use the class through the command line. This represents a great improvement with respect to the usual user code, which requires lines and lines of code to do the same thing. (Link for the class: https://root.cern.ch/doc/master/classTSimpleAnalysis.html)

  14. Fast Graph Partitioning Active Contours for Image Segmentation Using Histograms

    Directory of Open Access Journals (Sweden)

    Nath, Sumit K.

    2009-01-01

    Full Text Available Abstract We present a method to improve the accuracy and speed, as well as significantly reduce the memory requirements, of the recently proposed Graph Partitioning Active Contours (GPAC) algorithm for image segmentation in the work of Sumengen and Manjunath (2006). Instead of computing an approximate but still expensive dissimilarity matrix, whose size is quadratic in the number of pixels of a 2D image partitioned into regular tiles, we use fixed-length histograms and an intensity-based symmetric-centrosymmetric extensor matrix to jointly compute the terms associated with the complete dissimilarity matrix. This computationally efficient reformulation of GPAC with a very small memory footprint offers two distinct advantages over the original implementation. It speeds up convergence of the evolving active contour and seamlessly extends the performance of GPAC to multidimensional images.

  15. Applied cost allocation

    DEFF Research Database (Denmark)

    Bogetoft, Peter; Hougaard, Jens Leth; Smilgins, Aleksandrs

    2016-01-01

    This paper deals with empirical computation of Aumann–Shapley cost shares for joint production. We show that if one uses a mathematical programming approach with its non-parametric estimation of the cost function, there may be observations in the data set for which we have multiple Aumann–Shapley prices ... of assumptions concerning firm behavior. These assumptions enable us to connect inefficient with efficient production and thereby provide consistent ways of allocating the costs arising from inefficiency.

  16. Emissions allocation in transportation routes

    NARCIS (Netherlands)

    Leenders, B.P.J.; Velázquez Martínez, J.; Fransoo, J.C.

    2017-01-01

    This article studies the allocation of CO2 emissions to a specific shipment in routing transportation. The authors show that this problem differs from a cost allocation problem specifically because the concavity condition does not necessarily hold in the CO2 allocation problem. This implies that a

  17. Symbol recognition via statistical integration of pixel-level constraint histograms: a new descriptor.

    Science.gov (United States)

    Yang, Su

    2005-02-01

    A new descriptor for symbol recognition is proposed. 1) A histogram is constructed for every pixel to figure out the distribution of the constraints among the other pixels. 2) All the histograms are statistically integrated to form a feature vector with fixed dimension. The robustness and invariance were experimentally confirmed.

  18. Hand Vein Images Enhancement Based on Local Gray-level Information Histogram

    Directory of Open Access Journals (Sweden)

    Jun Wang

    2015-06-01

    Full Text Available Based on histogram equalization theory, this paper presents a novel histogram concept to realize contrast enhancement of hand vein images while avoiding the loss of topological vein structure or the introduction of fake vein information. Firstly, we propose the concept of the gray-level information histogram, whose fundamental characteristic is that the amplitudes of its components objectively reflect the contribution of the gray levels and information to the representation of image information. Then, we propose a histogram equalization method composed of an automatic histogram separation module and an intensity transformation module. The separation module combines the proposed prompt multiple-threshold procedure with an optimum peak signal-to-noise ratio (PSNR) calculation to separate the histogram into small-scale details, and the intensity transformation module enhances the vein images with preservation of the vein topological structure and gray information for each generated sub-histogram. Experimental results show that the proposed method achieves an extremely good contrast enhancement effect.

  19. Thresholding using two-dimensional histogram and watershed algorithm in the luggage inspection system

    International Nuclear Information System (INIS)

    Chen Jingyun; Cong Peng; Song Qi

    2006-01-01

    The authors present a new DR image segmentation method based on two-dimensional histogram and watershed algorithm. The authors use watershed algorithm to locate threshold on the vertical projection plane of two-dimensional histogram. This method is applied to the segmentation of DR images produced by luggage inspection system with DR-CT. The advantage of this method is also analyzed. (authors)

  20. Curvature histogram features for retrieval of images of smooth 3D objects

    International Nuclear Information System (INIS)

    Zhdanov, I; Scherbakov, O; Potapov, A; Peterson, M

    2014-01-01

    We consider image features based on histograms of oriented gradients (HOG) with the addition of a contour curvature histogram (HOG-CH), and compare them with results of the well-known scale-invariant feature transform (SIFT) approach as applied to retrieval of images of smooth 3D objects.

  1. Cross-interval histogram analysis of neuronal activity on multi-electrode arrays

    NARCIS (Netherlands)

    Castellone, P.; Rutten, Wim; Marani, Enrico

    2003-01-01

    Cross-neuron-interval histogram (CNIH) analysis has been performed in order to study correlated activity and connectivity between pairs of neurons in a spontaneously active developing cultured network of rat cortical cells. Thirty-eight histograms could be analyzed using two parameters, one for the

  2. Test Plan Addendum No. 1: Waste Isolation Pilot Plant bin-scale CH TRU waste tests

    International Nuclear Information System (INIS)

    Molecke, M.A.; Lappin, A.R.

    1990-12-01

    This document is the first major revision to the Test Plan: WIPP Bin-Scale CH TRU Waste Tests. Factors that make this revision necessary are described and justified in Section 1, and elaborated upon in Section 4. This addendum contains recommended estimates of, and details for: (1) The total separation of waste leaching/solubility tests from bin-scale gas tests, including preliminary details and quantities of leaching tests required for testing of Levels 1, 2, and 3 WIPP CH TRU wastes; (2) An initial description and quantification of bin-scale gas test Phase 0, added to provide a crucial tie to pretest waste characterization representativeness and overall test statistical validation; (3) A revision to the number of test bins required for Phases 1 and 2 of the bin gas test program, and specification of the numbers of additional bin tests required for incorporating gas testing of Level 2 wastes into test Phase 3. Contingencies are stated for the total number of test bins required, both positive and negative, including the supporting assumptions, logic, and decision points. (4) Several other general test detail updates occurring since the Test Plan was approved and published in January 1990. Possible impacts of recommended revisions included in this Addendum on WIPP site operations are called out and described. 56 refs., 12 tabs

  3. Treatment plan evaluation using dose-volume histogram (DVH) and spatial dose-volume histogram (zDVH)

    International Nuclear Information System (INIS)

    Cheng, C.-W.; Das, Indra J.

    1999-01-01

    Objective: The dose-volume histogram (DVH) has been accepted as a tool for treatment-plan evaluation. However, the DVH lacks spatial information. A new concept, the z-dependent dose-volume histogram (zDVH), is presented as a supplement to the DVH in three-dimensional (3D) treatment planning, to provide the spatial variation as well as the size and magnitude of the different dose regions within a region of interest. Materials and Methods: Three-dimensional dose calculations were carried out with various plans for three disease sites: lung, breast, and prostate. DVHs were calculated for the entire volume. A zDVH is defined as a differential dose-volume histogram with respect to a computed tomographic (CT) slice position. In this study, zDVHs were calculated for each CT slice in the treatment field. DVHs and zDVHs were compared. Results: In the irradiation of the lung, the DVH calculation indicated that the treatment plan satisfied the dose-volume constraint placed on the lung, whereas the zDVH of the lung revealed that a sizable fraction of the lung centered about the central axis (CAX) received a significant dose, a situation that warranted a modification of the treatment plan due to the removal of one lung. In the irradiation of the breast with tangential fields, the DVH showed that about 7% of the breast volume received at least 110% of the prescribed dose (PD) and about 11% of the breast received less than 98% PD. However, the zDVHs of the breast volume in each of seven planes showed the existence of high-dose regions of 34% and 15%, respectively, of the volume in the two caudal-most planes, and cold spots of about 40% in the two cephalic planes. In prostate treatment planning, DVHs showed that about 15% of the bladder and 40% of the rectum received 102% PD, whereas about 30% of the bladder and 50% of the rectum received the full dose. Taking into account the hollow structure of both the bladder and the rectum, the dose-surface histograms (DSH) showed larger hot-spot volume, about
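
    Both quantities are easy to state in code. A minimal sketch, assuming a 3D dose grid and a boolean ROI mask indexed as (z, y, x); bin count and layout are arbitrary choices, not from the paper:

        import numpy as np

        def dvh(dose, mask, bins=100):
            # Cumulative DVH: fraction of ROI volume receiving at least dose D.
            d = dose[mask]
            hist, edges = np.histogram(d, bins=bins, range=(0.0, dose.max()))
            volume_at_least = hist[::-1].cumsum()[::-1] / d.size
            return edges[:-1], volume_at_least

        def zdvh(dose, mask, z, bins=100):
            # zDVH: differential dose-volume histogram restricted to CT slice z,
            # exposing the spatial (slice-by-slice) dose variation the DVH hides.
            d = dose[z][mask[z]]
            hist, edges = np.histogram(d, bins=bins, range=(0.0, dose.max()))
            return edges[:-1], hist / max(d.size, 1)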

  4. Neutron stars as X-ray burst sources. II. Burst energy histograms and why they burst

    International Nuclear Information System (INIS)

    Baan, W.A.

    1979-01-01

    In this work we explore some of the implications of a model for X-ray burst sources where bursts are caused by Kruskal-Schwarzschild instabilities at the magnetopause of an accreting and rotating neutron star. A number of simplifying assumptions are made in order to test the model using observed burst-energy histograms for the rapid burster MXB 1730-335. The predicted histograms have a correct general shape, but it appears that other effects are important as well, and that mode competition, for instance, may suppress the histograms at high burst energies. An explanation is ventured for the enhancement in the histogram at the highest burst energies, which produces the bimodal shape in high accretion rate histograms. Quantitative criteria are given for deciding when accreting neutron stars are steady sources or burst sources, and these criteria are tested using the X-ray pulsars

  5. Whole-Lesion Histogram Analysis of Apparent Diffusion Coefficient for the Assessment of Cervical Cancer.

    Science.gov (United States)

    Guan, Yue; Shi, Hua; Chen, Ying; Liu, Song; Li, Weifeng; Jiang, Zhuoran; Wang, Huanhuan; He, Jian; Zhou, Zhengyang; Ge, Yun

    2016-01-01

    The aim of this study was to explore the application of whole-lesion histogram analysis of apparent diffusion coefficient (ADC) values in cervical cancer. A total of 54 women (mean age, 53 years) with cervical cancers prospectively underwent 3-T diffusion-weighted imaging with b values of 0 and 800 s/mm². Whole-lesion histogram analysis of ADC values was performed. A paired-sample t test was used to compare differences in ADC histogram parameters between cervical cancers and normal cervical tissues. Receiver operating characteristic curves were constructed to identify the optimal threshold of each parameter. All histogram parameters in this study, including ADCmean, ADCmin, ADC10%-ADC90%, mode, skewness, and kurtosis, were significantly lower for cervical cancers than for normal cervical tissues (all P values statistically significant). Whole-lesion histogram analysis of ADC maps is useful in the assessment of cervical cancer.
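
    The parameter set used here (ADCmean, ADCmin, ADC10%-ADC90%, mode, skewness, kurtosis) can be reproduced from an ADC map and a lesion mask roughly as follows; a sketch, with the bin count for the mode an arbitrary choice:

        import numpy as np
        from scipy import stats

        def adc_histogram_features(adc_map, lesion_mask):
            v = adc_map[lesion_mask]                     # whole-lesion ADC values
            hist, edges = np.histogram(v, bins=50)
            mode = 0.5 * (edges[np.argmax(hist)] + edges[np.argmax(hist) + 1])
            pct = {f"ADC{p}%": np.percentile(v, p) for p in range(10, 100, 10)}
            return {"ADCmean": v.mean(), "ADCmin": v.min(), "mode": mode,
                    "skewness": stats.skew(v), "kurtosis": stats.kurtosis(v),
                    **pct}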

  6. Regulation of the interaction between the neuronal BIN1 isoform 1 and Tau proteins - role of the SH3 domain.

    Science.gov (United States)

    Malki, Idir; Cantrelle, François-Xavier; Sottejeau, Yoann; Lippens, Guy; Lambert, Jean-Charles; Landrieu, Isabelle

    2017-10-01

    Bridging integrator 1 (bin1) gene is a genetic determinant of Alzheimer's disease (AD) and has been reported to modulate Alzheimer's pathogenesis through pathway(s) involving Tau. The functional impact of the Tau/BIN1 interaction as well as the molecular details of this interaction are still not fully resolved. As a consequence, how BIN1 through its interaction with Tau affects AD risk is also still not determined. To progress in this understanding, the interaction of Tau with two BIN1 isoforms was investigated using Nuclear Magnetic Resonance spectroscopy. ¹H,¹⁵N spectra showed that the C-terminal SH3 domain of BIN1 isoform 1 (BIN1Iso1) is not mobile in solution but locked with the core of the protein. In contrast, the SH3 domain of BIN1 isoform 9 (BIN1Iso9) behaves as an independent mobile domain. This reveals an equilibrium between closed and open conformations for the SH3 domain. Interestingly, a 334-376 peptide from the clathrin- and AP-2-binding (CLAP) domain of BIN1Iso1, which contains a SH3-binding site, is able to compete with the BIN1-SH3 intramolecular interaction. For both BIN1 isoforms, the SH3 domain can interact with the Tau(210-240) sequence. The Tau(210-240) peptide can indeed displace the intramolecular interaction of the BIN1-SH3 of BIN1Iso1 and form a complex with the released domain. The measured Kd values were in agreement with a stronger affinity of the Tau peptide. Both the CLAP and Tau peptides occupied the same surface on the BIN1-SH3 domain, showing that their interactions are mutually exclusive. These results emphasize an additional level of complexity in the regulation of the interaction between BIN1 and Tau dependent on the BIN1 isoforms. © 2017 Federation of European Biochemical Societies.

  7. GeneBins: a database for classifying gene expression data, with application to plant genome arrays

    Directory of Open Access Journals (Sweden)

    Weiller Georg

    2007-03-01

    Full Text Available Abstract Background To interpret microarray experiments, several ontological analysis tools have been developed. However, current tools are limited to specific organisms. Results We developed a bioinformatics system to assign the probe set sequences of any organism to a hierarchical functional classification modelled on KEGG ontology. The GeneBins database currently supports the functional classification of expression data from four Affymetrix arrays; Arabidopsis thaliana, Oryza sativa, Glycine max and Medicago truncatula. An online analysis tool to identify relevant functions is also provided. Conclusion GeneBins provides resources to interpret gene expression results from microarray experiments. It is available at http://bioinfoserver.rsbs.anu.edu.au/utils/GeneBins/

  8. Cost allocation with limited information

    DEFF Research Database (Denmark)

    Hougaard, Jens Leth; Tind, Jørgen

    This article investigates progressive development of Aumann-Shapley cost allocation in a multilevel organizational or production structure. In particular, we study a linear parametric programming setup utilizing the Dantzig-Wolfe decomposition procedure. Typically, cost allocation takes place after all activities have been performed, for example after finishing all outputs. Here the allocation is made progressively, with suggestions for activities. In other words, cost allocation is performed in parallel with, for example, a production planning process. This development does not require detailed information about some technical constraints in order to make the cost allocation.

  9. Stochastic learning of multi-instance dictionary for earth mover’s distance-based histogram comparison

    KAUST Repository

    Fan, Jihong; Liang, Ru-Ze

    2016-01-01

    Dictionary plays an important role in multi-instance data representation. It maps bags of instances to histograms. Earth mover’s distance (EMD) is the most effective histogram distance metric for the application of multi-instance retrieval. However
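
    For one-dimensional histograms, EMD coincides with the 1-Wasserstein distance, so the kind of histogram comparison described can be sketched with SciPy (the bin centers and counts below are invented for the example):

        import numpy as np
        from scipy.stats import wasserstein_distance

        centers = np.arange(10)                  # shared histogram bin centers
        h1 = np.array([0, 2, 5, 9, 5, 2, 0, 0, 0, 0], dtype=float)
        h2 = np.array([0, 0, 0, 2, 5, 9, 5, 2, 0, 0], dtype=float)
        emd = wasserstein_distance(centers, centers, u_weights=h1, v_weights=h2)
        print(emd)  # 2.0: every unit of mass must travel two bins

    Unlike bin-to-bin metrics, the result reflects how far mass must move, which is why EMD is robust to small shifts between otherwise similar histograms.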

  10. Osama bin Laden - elus või siiski ammu surnud? / Aadu Hiietamm

    Index Scriptorium Estoniae

    Hiietamm, Aadu, 1954-

    2009-01-01

    The USA has been hunting Osama bin Laden, the leader of the al Qaeda terror network, for eight years already, but without results. Conspiracy theorists believe he has long been dead, but according to the head of the US Central Intelligence Agency he is hiding in Pakistan

  11. BinMag: Widget for comparing stellar observed with theoretical spectra

    Science.gov (United States)

    Kochukhov, O.

    2018-05-01

    BinMag examines theoretical stellar spectra computed with the Synth/SynthMag/Synmast/Synth3/SME spectrum synthesis codes and compares them to observations. An IDL widget program, BinMag applies radial velocity shifts and broadening to the theoretical spectra to account for the effects of stellar rotation, radial-tangential macroturbulence, and instrumental smearing. The code can also simulate spectra of spectroscopic binary stars by appropriate co-addition of two synthetic spectra. Additionally, BinMag can be used to measure equivalent widths, fit line profile shapes with analytical functions, and automatically determine radial velocity and broadening parameters. BinMag interfaces with the Synth3 (ascl:1212.010) and SME (ascl:1202.013) codes, allowing the user to determine chemical abundances and stellar atmospheric parameters from the observed spectra.

  12. Deterministically swapping frequency-bin entanglement from photon-photon to atom-photon hybrid systems

    Science.gov (United States)

    Ou, Bao-Quan; Liu, Chang; Sun, Yuan; Chen, Ping-Xing

    2018-02-01

    Inspired by recent developments in research on the atom-photon quantum interface and on energy-time entanglement between single-photon pulses, we are motivated to study a deterministic protocol for the frequency-bin entanglement of the atom-photon hybrid system, which is analogous to the frequency-bin entanglement between single-photon pulses. We show that such entanglement arises naturally when considering the interaction between a frequency-bin entangled single-photon pulse pair and a single atom coupled to an optical cavity, via straightforward atom-photon phase gate operations. Its anticipated properties and preliminary examples of its potential application in quantum networking are also demonstrated. Moreover, we construct a specific quantum entanglement witness tool to detect such extended frequency-bin entanglement from a reasonably general set of separable states, and prove its capability theoretically. We focus on energy-time considerations throughout the analysis.

  13. MetaBAT: Metagenome Binning based on Abundance and Tetranucleotide frequency

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Dongwan; Froula, Jeff; Egan, Rob; Wang, Zhong

    2014-03-21

    Grouping large fragments assembled from shotgun metagenomic sequences to deconvolute complex microbial communities, or metagenome binning, enables the study of individual organisms and their interactions. Here we developed automated metagenome binning software, called MetaBAT, which integrates empirical probabilistic distances of genome abundance and tetranucleotide frequency. On synthetic datasets MetaBAT on average achieves 98% precision and 90% recall at the strain level with 281 near-complete unique genomes. Applying MetaBAT to a human gut microbiome data set, we recovered 176 genome bins with 92% precision and 80% recall. Further analyses suggest MetaBAT is able to recover genome fragments missed in reference genomes up to 19%, while 53 genome bins are novel. In summary, we believe MetaBAT is a powerful tool to facilitate comprehensive understanding of complex microbial communities.

  14. Independent technical review of the Bin and Alcove test programs at the Waste Isolation Pilot Plant

    International Nuclear Information System (INIS)

    1993-12-01

    This Independent Technical Review (ITR) assessed the need for and technical validity of the proposed Bin and Alcove test programs using TRU-waste at the WIPP site. The ITR Team recommends that the planned Bin and Alcove tests be abandoned, and that new activities be initiated in support of the WIPP regulatory compliance processes. Recommendations in this report offer an alternate path for expeditiously attaining disposal certification and permitting

  15. Max–min Bin Packing Algorithm and its application in nano-particles filling

    International Nuclear Information System (INIS)

    Zhu, Dingju

    2016-01-01

    For existing bin packing algorithms, higher packing efficiency often comes at the cost of lower packing speed, and higher packing speed at the cost of lower packing efficiency. The packing speed and packing efficiency of existing bin packing algorithms, including NFD, NF, FF, FFD, BF and BFD, correlate negatively with each other, so existing bin packing algorithms fail to satisfy the demand of nano-particle filling for both high speed and high efficiency. The paper provides a new bin packing algorithm, the Max–min Bin Packing Algorithm (MM), which realizes both high packing speed and high packing efficiency. MM has the same packing speed as NFD (whose packing speed ranks No. 1 among existing bin packing algorithms); when the size repetition rate of the objects to be packed is over 5, MM realizes almost the same packing efficiency as BFD (whose packing efficiency ranks No. 1 among existing bin packing algorithms), and when the size repetition rate is over 500, MM achieves exactly the same packing efficiency as BFD. In nano-particle filling, the size repetition rate of the particles to be packed is usually in the thousands or tens of thousands, far higher than 5 or 500. Consequently, in this application the packing efficiency of MM is exactly equal to that of BFD. The irreconcilable conflict between packing speed and packing efficiency is thus successfully removed by MM, which gives MM better packing performance than any existing bin packing algorithm. In practice, there are few cases in which the size repetition rate of the objects to be packed is lower than 5. MM is therefore not limited to nano-particle filling and can also be widely used in other applications. In particular, MM has significant value in nano-particle filling applications such as nano printing and nano tooth filling.
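
    For reference, the efficiency baseline that MM is measured against, First Fit Decreasing, is a one-pass greedy procedure; a minimal sketch (MM itself, which exploits repeated object sizes, is not reproduced here):

        def ffd(sizes, capacity):
            # First Fit Decreasing: sort objects largest-first, place each in
            # the first bin with enough free capacity, opening bins as needed.
            bins = []  # each entry is a bin's remaining free capacity
            for s in sorted(sizes, reverse=True):
                for i, free in enumerate(bins):
                    if s <= free:
                        bins[i] = free - s
                        break
                else:
                    bins.append(capacity - s)
            return len(bins)

        print(ffd([4, 8, 1, 4, 2, 1], capacity=10))  # -> 2 bins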

  16. Application of Genetic Algorithm for the Bin Packing Problem with a New Representation Scheme

    Directory of Open Access Journals (Sweden)

    N. Mohamadi

    2010-10-01

    Full Text Available The Bin Packing Problem (BPP) is to find the minimum number of bins needed to pack a given set of objects of known sizes so that they do not exceed the capacity of each bin. This problem is known to be NP-Hard [5]; hence many heuristic procedures for its solution have been suggested. In this paper we propose a new representation scheme and solve the problem by a Genetic Algorithm. Limited computational results show the efficiency of this scheme.

  17. Genetic patterns in European geometrid moths revealed by the Barcode Index Number (BIN system.

    Directory of Open Access Journals (Sweden)

    Axel Hausmann

    Full Text Available BACKGROUND: The geometrid moths of Europe are one of the best investigated insect groups in traditional taxonomy, making them an ideal model group to test the accuracy of the Barcode Index Number (BIN) system of BOLD (Barcode of Life Datasystems), a method that supports automated, rapid species delineation and identification. METHODOLOGY/PRINCIPAL FINDINGS: This study provides a DNA barcode library for 219 of the 249 European geometrid moth species (88%) in five selected subfamilies. The data set includes COI sequences for 2130 specimens. Most species (93%) were found to possess diagnostic barcode sequences at the European level, while only three species pairs (3%) were genetically indistinguishable in areas of sympatry. As a consequence, 97% of the European species we examined were unequivocally discriminated by barcodes within their natural areas of distribution. We found a 1:1 correspondence between BINs and traditionally recognized species for 67% of these species. Another 17% of the species (15 pairs, three triads) shared BINs, while specimens from the remaining species (18%) were divided among two or more BINs. Five of these species are mixtures, both sharing and splitting BINs. For 82% of the species with two or more BINs, the genetic splits involved allopatric populations, many of which have previously been hypothesized to represent distinct species or subspecies. CONCLUSIONS/SIGNIFICANCE: This study confirms the effectiveness of DNA barcoding as a tool for species identification and illustrates the potential of the BIN system to characterize formal genetic units independently of an existing classification. This suggests the system can be used to efficiently assess the biodiversity of large, poorly known assemblages of organisms. For the moths examined in this study, cases of discordance between traditionally recognized species and BINs arose from several causes including overlooked species, synonymy, and cases where DNA barcodes revealed

  18. Afghanistan, the Taliban, and Osama bin Laden: The Background to September 11

    Science.gov (United States)

    Social Education, 2011

    2011-01-01

    On May 1, 2011, a group of U.S. soldiers boarded helicopters at a base in Afghanistan, hoping to find a man named Osama bin Laden. Bin Laden, the leader of the al Qaeda terrorist network, was responsible for a number of terrorist attacks around the world, including those of September 11, 2001, that killed nearly 3,000 people in the United States.…

  19. Histogram-based ionogram displays and their application to autoscaling

    Science.gov (United States)

    Lynn, Kenneth J. W.

    2018-03-01

    A simple method is described for displaying and autoscaling the basic ionogram parameters foF2 and h'F2, as well as some additional layer parameters, from digital ionograms. The technique employed is based on forming frequency and height histograms in each ionogram. This technique has now been applied specifically to ionograms produced by the IPS5D ionosonde developed and operated by the Australian Space Weather Service (SWS). The SWS ionograms are archived in a cleaned format and readily available from the SWS internet site. However, the method is applicable to any ionosonde which produces ionograms in a digital format at a useful signal-to-noise level. The most novel feature of the technique for autoscaling is its simplicity and its avoidance of the mathematical imaging and line-fitting techniques often used. The program arose from the necessity to display many days of ionogram output to allow the location of specific types of ionospheric events, such as ionospheric storms, travelling ionospheric disturbances and repetitive ionospheric height changes, for further investigation and measurement. Examples and applications of the method are given, including the removal of sporadic E and spread F.

  20. Efficient Scalable Median Filtering Using Histogram-Based Operations.

    Science.gov (United States)

    Green, Oded

    2018-05-01

    Median filtering is a smoothing technique for noise removal in images. While there are various implementations of median filtering for a single-core CPU, there are few implementations for accelerators and multi-core systems. Many parallel implementations of median filtering use a sorting algorithm for rearranging the values within a filtering window and taking the median of the sorted values. While using sorting algorithms allows for simple parallel implementations, the cost of the sorting becomes prohibitive as the filtering windows grow. This makes such algorithms, sequential and parallel alike, inefficient. In this work, we introduce the first software parallel median filtering that is not sorting-based. The new algorithm uses efficient histogram-based operations. These reduce the computational requirements of the new algorithm while also accessing the image fewer times. We show an implementation of our algorithm for both the CPU and NVIDIA's CUDA-supported graphics processing unit (GPU). The new algorithm is compared with several other leading CPU and GPU implementations. The CPU implementation shows near-perfect linear scaling on a quad-core system. The GPU implementation is several orders of magnitude faster than the other GPU implementations for mid-size median filters. For small kernels, comparison-based approaches are preferable as fewer operations are required. Lastly, the new algorithm is open-source and can be found in the OpenCV library.
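
    The non-sorting principle can be illustrated with Huang-style running-median filtering of a 1D 8-bit signal (a simplified sketch of the histogram-based idea, not the paper's parallel implementation): keep a 256-bin histogram of the window, update it incrementally, and walk it to the median rank.

        import numpy as np

        def median_filter_row(row, radius):
            # Running median of an 8-bit signal, window width 2*radius + 1.
            # No sorting: the window histogram is updated in O(1) per step.
            n = len(row)
            pad = np.pad(row, radius, mode='edge')
            hist = np.zeros(256, dtype=int)
            for v in pad[:2 * radius + 1]:
                hist[v] += 1
            out = np.empty(n, dtype=row.dtype)
            for i in range(n):
                cum = 0
                for g in range(256):
                    cum += hist[g]
                    if cum > radius:        # rank radius + 1 is the median
                        out[i] = g
                        break
                if i + 1 < n:               # slide: drop oldest, add newest
                    hist[pad[i]] -= 1
                    hist[pad[i + 1 + 2 * radius]] += 1
            return out

        print(median_filter_row(np.array([10, 200, 12, 11, 13], dtype=np.uint8), 1))
        # -> [10 12 12 12 13]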

  1. Landmark Detection in Orbital Images Using Salience Histograms

    Science.gov (United States)

    Wagstaff, Kiri L.; Panetta, Julian; Schorghofer, Norbert; Greeley, Ronald; Pendleton Hoffer, Mary; Bunte, Melissa

    2010-01-01

    NASA's planetary missions have collected, and continue to collect, massive volumes of orbital imagery. The volume is such that it is difficult to manually review all of the data and determine its significance. As a result, images are indexed and searchable by location and date but generally not by their content. A new automated method analyzes images and identifies "landmarks," or visually salient features such as gullies, craters, dust devil tracks, and the like. This technique uses a statistical measure of salience derived from information theory, so it is not associated with any specific landmark type. It identifies regions that are unusual or that stand out from their surroundings, so the resulting landmarks are context-sensitive areas that can be used to recognize the same area when it is encountered again. A machine learning classifier is used to identify the type of each discovered landmark. Using a specified window size, an intensity histogram is computed for each such window within the larger image (sliding the window across the image). Next, a salience map is computed that specifies, for each pixel, the salience of the window centered at that pixel. The salience map is thresholded to identify landmark contours (polygons) using the upper quartile of salience values. Descriptive attributes are extracted for each landmark polygon: size, perimeter, mean intensity, standard deviation of intensity, and shape features derived from an ellipse fit.
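
    One way to realize such a salience measure is to score each window's intensity histogram by its divergence from the whole-image histogram; a hedged sketch (the exact statistic in the paper may differ, and the window size, bin count and KL choice here are assumptions), with the upper-quartile threshold matching the description above:

        import numpy as np

        def window_salience(img, win=32, bins=16, eps=1e-9):
            # Global reference histogram for the whole image.
            g_hist, edges = np.histogram(img, bins=bins)
            g = g_hist / g_hist.sum() + eps
            h, w = img.shape
            sal = np.zeros((h - win, w - win))
            for i in range(h - win):
                for j in range(w - win):
                    p_hist, _ = np.histogram(img[i:i+win, j:j+win], bins=edges)
                    p = p_hist / p_hist.sum() + eps
                    sal[i, j] = np.sum(p * np.log(p / g))  # KL(window || image)
            # Threshold at the upper quartile of salience values to get
            # candidate landmark regions.
            return sal, np.quantile(sal, 0.75)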

  2. Using the Bootstrap Method for a Statistical Significance Test of Differences between Summary Histograms

    Science.gov (United States)

    Xu, Kuan-Man

    2006-01-01

    A new method is proposed to compare statistical differences between summary histograms, which are the histograms summed over a large ensemble of individual histograms. It consists of choosing a distance statistic for measuring the difference between summary histograms and using a bootstrap procedure to calculate the statistical significance level. Bootstrapping is an approach to statistical inference that makes few assumptions about the underlying probability distribution that describes the data. Three distance statistics are compared in this study. They are the Euclidean distance, the Jeffries-Matusita distance and the Kuiper distance. The data used in testing the bootstrap method are satellite measurements of cloud systems called cloud objects. Each cloud object is defined as a contiguous region/patch composed of individual footprints or fields of view. A histogram of measured values over footprints is generated for each parameter of each cloud object and then summary histograms are accumulated over all individual histograms in a given cloud-object size category. The results of statistical hypothesis tests using all three distances as test statistics are generally similar, indicating the validity of the proposed method. The Euclidean distance is determined to be most suitable after comparing the statistical tests of several parameters with distinct probability distributions among three cloud-object size categories. Impacts on the statistical significance levels resulting from differences in the total lengths of satellite footprint data between two size categories are also discussed.
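
    A minimal sketch of the proposed test with the Euclidean distance (the array layout and resampling details are assumptions): resample the pooled individual histograms with replacement under the null hypothesis of a common population, and locate the observed summary-histogram distance in the bootstrap distribution.

        import numpy as np

        def bootstrap_histogram_test(h1, h2, n_boot=2000, seed=0):
            # h1, h2: (n_objects, n_bins) arrays of individual histograms.
            rng = np.random.default_rng(seed)
            # Euclidean distance between normalized summary histograms.
            dist = lambda a, b: np.linalg.norm(a.sum(0) / a.sum()
                                               - b.sum(0) / b.sum())
            observed = dist(h1, h2)
            pooled = np.vstack([h1, h2])
            n1 = len(h1)
            null = np.empty(n_boot)
            for b in range(n_boot):
                idx = rng.integers(len(pooled), size=len(pooled))  # resample
                sample = pooled[idx]
                null[b] = dist(sample[:n1], sample[n1:])
            return (null >= observed).mean()   # bootstrap p-value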

  3. Histogram analysis of diffusion measures in clinically isolated syndromes and relapsing-remitting multiple sclerosis

    International Nuclear Information System (INIS)

    Yu Chunshui; Lin Fuchun; Liu Yaou; Duan Yunyun; Lei Hao; Li Kuncheng

    2008-01-01

    Objective: The purposes of our study were to employ diffusion tensor imaging (DTI)-based histogram analysis to determine the presence of occult damage in clinically isolated syndrome (CIS), to compare its severity with relapsing-remitting multiple sclerosis (RRMS), and to determine correlations between DTI histogram measures and clinical and MRI indices in these two diseases. Materials and methods: DTI scans were performed in 19 CIS and 19 RRMS patients and 19 matched healthy volunteers. Histogram analyses of mean diffusivity and fractional anisotropy were performed in normal-appearing brain tissue (NABT), normal-appearing white matter (NAWM) and gray matter (NAGM). Correlations were analyzed between these measures and expanded disability status scale (EDSS) scores, T2WI lesion volumes (LV) and normalized brain tissue volumes (NBTV) in CIS and RRMS patients. Results: Significant differences were found among the CIS, RRMS and control groups in the NBTV and most of the DTI histogram measures of the NABT, NAWM and NAGM. In CIS patients, some DTI histogram measures showed significant correlations with LV and NBTV, but none of them with EDSS. In RRMS patients, however, some DTI histogram measures were significantly correlated with LV, NBTV and EDSS. Conclusion: Occult damage occurs in both NAGM and NAWM in CIS, but its severity is milder than in RRMS. In CIS and RRMS, the occult damage might be related to both T2 lesion load and brain tissue atrophy. Some DTI histogram measures might be useful for assessing disease progression in RRMS patients

  4. Parameterization of the Age-Dependent Whole Brain Apparent Diffusion Coefficient Histogram

    Science.gov (United States)

    Batra, Marion; Nägele, Thomas

    2015-01-01

    Purpose. The distribution of apparent diffusion coefficient (ADC) values in the brain can be used to characterize age effects and pathological changes of the brain tissue. The aim of this study was the parameterization of the whole brain ADC histogram by an advanced model with influence of age considered. Methods. Whole brain ADC histograms were calculated for all data and for seven age groups between 10 and 80 years. Modeling of the histograms was performed for two parts of the histogram separately: the brain tissue part was modeled by two Gaussian curves, while the remaining part was fitted by the sum of a Gaussian curve, a biexponential decay, and a straight line. Results. A consistent fitting of the histograms of all age groups was possible with the proposed model. Conclusions. This study confirms the strong dependence of the whole brain ADC histograms on the age of the examined subjects. The proposed model can be used to characterize changes of the whole brain ADC histogram in certain diseases under consideration of age effects. PMID:26609526
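
    The brain-tissue part of the model, two Gaussian curves, can be fitted as below; an illustrative sketch on synthetic ADC values (the CSF/background terms of the full model, and all starting values, are assumptions):

        import numpy as np
        from scipy.optimize import curve_fit

        def two_gaussians(x, a1, m1, s1, a2, m2, s2):
            # Sum of two Gaussians modeling the brain-tissue histogram peak(s).
            return (a1 * np.exp(-(x - m1)**2 / (2 * s1**2)) +
                    a2 * np.exp(-(x - m2)**2 / (2 * s2**2)))

        rng = np.random.default_rng(1)
        adc = np.concatenate([rng.normal(0.75, 0.08, 40_000),   # tissue pool 1
                              rng.normal(0.90, 0.10, 30_000)])  # tissue pool 2
        hist, edges = np.histogram(adc, bins=120, range=(0.4, 1.4))
        centers = 0.5 * (edges[:-1] + edges[1:])
        p0 = (hist.max(), 0.7, 0.1, hist.max() / 2, 1.0, 0.1)
        params, _ = curve_fit(two_gaussians, centers, hist, p0=p0)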

  5. Histogram analysis of T2*-based pharmacokinetic imaging in cerebral glioma grading.

    Science.gov (United States)

    Liu, Hua-Shan; Chiang, Shih-Wei; Chung, Hsiao-Wen; Tsai, Ping-Huei; Hsu, Fei-Ting; Cho, Nai-Yu; Wang, Chao-Ying; Chou, Ming-Chung; Chen, Cheng-Yu

    2018-03-01

    To investigate the feasibility of histogram analysis of the T2*-based permeability parameter volume transfer constant (Ktrans) for glioma grading and to explore the diagnostic performance of the histogram analysis of Ktrans and blood plasma volume (vp). We recruited 31 and 11 patients with high- and low-grade gliomas, respectively. The histogram parameters of Ktrans and vp, derived from first-pass pharmacokinetic modeling based on T2* dynamic susceptibility-weighted contrast-enhanced perfusion-weighted magnetic resonance imaging (T2* DSC-PW-MRI) of the entire tumor volume, were evaluated for differentiating glioma grades. Histogram parameters of Ktrans and vp showed significant differences between high- and low-grade gliomas and exhibited significant correlations with tumor grades. The mean Ktrans derived from T2* DSC-PW-MRI had the highest sensitivity and specificity for differentiating high-grade from low-grade gliomas compared with other histogram parameters of Ktrans and vp. Histogram analysis of T2*-based pharmacokinetic imaging is useful for cerebral glioma grading. The histogram parameters of the entire-tumor Ktrans measurement can provide increased accuracy with additional information regarding microvascular permeability changes for identifying high-grade brain tumors. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Parameterization of the Age-Dependent Whole Brain Apparent Diffusion Coefficient Histogram

    Directory of Open Access Journals (Sweden)

    Uwe Klose

    2015-01-01

    Full Text Available Purpose. The distribution of apparent diffusion coefficient (ADC) values in the brain can be used to characterize age effects and pathological changes of the brain tissue. The aim of this study was the parameterization of the whole brain ADC histogram by an advanced model with influence of age considered. Methods. Whole brain ADC histograms were calculated for all data and for seven age groups between 10 and 80 years. Modeling of the histograms was performed for two parts of the histogram separately: the brain tissue part was modeled by two Gaussian curves, while the remaining part was fitted by the sum of a Gaussian curve, a biexponential decay, and a straight line. Results. A consistent fitting of the histograms of all age groups was possible with the proposed model. Conclusions. This study confirms the strong dependence of the whole brain ADC histograms on the age of the examined subjects. The proposed model can be used to characterize changes of the whole brain ADC histogram in certain diseases under consideration of age effects.

  7. Quadrant Dynamic with Automatic Plateau Limit Histogram Equalization for Image Enhancement

    Directory of Open Access Journals (Sweden)

    P. Jagatheeswari

    2014-01-01

    Full Text Available The fundamental and important preprocessing stage in image processing is the image contrast enhancement technique. Histogram equalization is an effective contrast enhancement technique. In this paper, a histogram equalization based technique called quadrant dynamic with automatic plateau limit histogram equalization (QDAPLHE) is introduced. In this method, a hybrid of dynamic and clipped histogram equalization methods is used to increase the brightness preservation and to reduce overenhancement. Initially, the proposed QDAPLHE algorithm passes the input image through a median filter to remove the noise present in the image. Then the histogram of the filtered image is divided into four subhistograms, with the second separation point kept at the mean brightness. The clipping process is implemented by automatically calculating the plateau limit as the clipping level. The clipped portion of the histogram is modified to reduce the loss of image intensity values. Finally the clipped portion is redistributed uniformly to the entire dynamic range and conventional histogram equalization is executed in each subhistogram independently. Based on qualitative and quantitative analysis, the QDAPLHE method outperforms some existing methods in the literature.
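
    The clipping-and-redistribution step shared by plateau-limit methods is compact; a sketch of a single clipped equalization (QDAPLHE additionally median-filters the input and splits the histogram into four sub-histograms about the mean brightness, which is omitted here, and the plateau choice below is a simple placeholder, not the paper's automatic limit):

        import numpy as np

        def clipped_equalize(img, plateau=None):
            # img: 8-bit grayscale image. Cap the histogram at the plateau,
            # redistribute the clipped excess uniformly, then equalize.
            hist = np.bincount(img.ravel(), minlength=256).astype(float)
            if plateau is None:
                plateau = hist.mean()            # placeholder automatic limit
            excess = np.maximum(hist - plateau, 0).sum()
            hist = np.minimum(hist, plateau) + excess / 256.0
            cdf = hist.cumsum() / hist.sum()
            lut = np.round(255 * cdf).astype(np.uint8)
            return lut[img]

    Clipping bounds the slope of the mapping, which is what limits overenhancement relative to plain histogram equalization.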

  8. Quantitative image quality evaluation of pixel-binning in a flat-panel detector for x-ray fluoroscopy

    International Nuclear Information System (INIS)

    Srinivas, Yogesh; Wilson, David L.

    2004-01-01

    X-ray fluoroscopy places stringent design requirements on new flat-panel (FP) detectors, requiring both low-noise electronics and high data transfer rates. Pixel-binning, wherein data from more than one detector pixel are collected simultaneously, not only lowers the data transfer rate but also increases x-ray counts and pixel signal-to-noise ratio (SNR). In this study, we quantitatively assessed image quality of image sequences from four acquisition methods, no-binning and three types of binning, in synthetic images using a clinically relevant task of detecting an extended guidewire in a four-alternative forced-choice paradigm. Binning methods were conventional data-line (D) and gate-line (G) binning, and a novel method in which alternate frames in an image sequence used D and G binning. Two detector orientations placed the data lines either parallel or perpendicular to the guidewire. At a low exposure of 0.6 μR (1.548×10⁻¹⁰ C/kg) per frame, irrespective of detector orientation, D binning with its reduced electronic noise was significantly (p<0.1) better than the other methods. At a higher exposure per frame, with data lines parallel to the guidewire, detection with D binning was significantly (p<0.1) better than G binning. However, with data lines perpendicular to the guidewire, G binning was significantly (p<0.1) better than D binning because the partial area effect was reduced. Alternate binning was the best binning method when results were averaged over both orientations, and it was as good as the best binning method at either orientation. In addition, at low and high exposures, alternate binning gave a temporally fused image with a smooth guidewire, an important image quality feature not assessed in a detection experiment. While at high exposure detection with no binning was as good as, or better than, the best binning method, it might be impractical at fluoroscopy imaging rates. A computational observer model based on signal detection theory successfully fit data and was used to predict effects of

  9. A robust and accurate binning algorithm for metagenomic sequences with arbitrary species abundance ratio.

    Science.gov (United States)

    Leung, Henry C M; Yiu, S M; Yang, Bin; Peng, Yu; Wang, Yi; Liu, Zhihua; Chen, Jingchi; Qin, Junjie; Li, Ruiqiang; Chin, Francis Y L

    2011-06-01

    With the rapid development of next-generation sequencing techniques, metagenomics, also known as environmental genomics, has emerged as an exciting research area that enables us to analyze the microbial environment in which we live. An important step for metagenomic data analysis is the identification and taxonomic characterization of DNA fragments (reads or contigs) resulting from sequencing a sample of mixed species. This step is referred to as 'binning'. Binning algorithms that are based on sequence similarity and sequence composition markers rely heavily on the reference genomes of known microorganisms or phylogenetic markers. Due to the limited availability of reference genomes and the bias and low availability of markers, these algorithms may not be applicable in all cases. Unsupervised binning algorithms which can handle fragments from unknown species provide an alternative approach. However, existing unsupervised binning algorithms only work on datasets either with balanced species abundance ratios or rather different abundance ratios, but not both. In this article, we present MetaCluster 3.0, an integrated binning method based on the unsupervised top-down separation and bottom-up merging strategy, which can bin metagenomic fragments of species with very balanced abundance ratios (say 1:1) to very different abundance ratios (e.g. 1:24) with consistently higher accuracy than existing methods. MetaCluster 3.0 can be downloaded at http://i.cs.hku.hk/~alse/MetaCluster/.
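
    The composition signal underlying such unsupervised binners can be sketched as follows; this illustrates only the tetranucleotide-frequency representation, with MetaCluster's top-down/bottom-up procedure replaced by plain k-means, and the number of clusters an assumption:

        from itertools import product
        import numpy as np
        from sklearn.cluster import KMeans

        # Index the 256 possible 4-mers over the DNA alphabet.
        KMERS = {''.join(k): i for i, k in enumerate(product('ACGT', repeat=4))}

        def tetra_freq(seq):
            # Normalized tetranucleotide frequency vector of one contig
            # (uppercase A/C/G/T assumed; other characters are skipped).
            v = np.zeros(len(KMERS))
            for i in range(len(seq) - 3):
                j = KMERS.get(seq[i:i + 4])
                if j is not None:
                    v[j] += 1
            return v / max(v.sum(), 1)

        def bin_contigs(contigs, k=3):
            X = np.array([tetra_freq(s) for s in contigs])
            return KMeans(n_clusters=k, n_init=10).fit_predict(X)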

  10. An evaluation of an improved method for computing histograms in dynamic tracer studies using positron-emission tomography

    International Nuclear Information System (INIS)

    Ollinger, J.M.; Snyder, D.L.

    1986-01-01

    A method for computing approximate minimum-mean-square-error estimates of histograms from list-mode data for use in dynamic tracer studies is evaluated. Parameters estimated from these histograms are significantly more accurate than those estimated from histograms computed by a commonly used method

  11. Theory of stable allocations

    Directory of Open Access Journals (Sweden)

    Pantelić Svetlana

    2014-01-01

    Full Text Available The Swedish Royal Academy awarded the 2012 Nobel Prize in Economics to Lloyd Shapley and Alvin Roth, for the theory of stable allocations and the practice of market design. These two American researchers worked independently from each other, combining basic theory and empirical investigations. Through their experiments and practical design they generated a flourishing field of research and improved the performance of many markets. Born in 1923 in Cambridge, Massachusetts, Shapley defended his doctoral thesis at Princeton University in 1953. For many years he worked at RAND, and for more than thirty years he was a professor at UCLA University. He published numerous scientific papers, either by himself or in cooperation with other economists.

  12. SSC accelerator availability allocation

    International Nuclear Information System (INIS)

    Dixon, K.T.; Franciscovich, J.

    1991-03-01

    Superconducting Super Collider (SSC) operational availability is an area of major concern, judged by the Central Design Group to present such risk that use of modern engineering tools would be essential to program success. Experience has shown that as accelerator beam availability falls below about 80%, efficiency of physics experiments degrades rapidly due to inability to maintain adequate coincident accelerator and detector operation. For this reason, the SSC availability goal has been set at 80%, even though the Fermi National Accelerator Laboratory accelerator, with a fraction of the SSC's complexity, has only recently approached that level. This paper describes the allocation of the top-level goal to part-level reliability and maintainability requirements, and it gives the results of parameter sensitivity studies designed to help identify the best approach to achieve the needed system availability within funding and schedule constraints. 1 ref., 12 figs., 4 tabs

  13. HEp-2 Cell Classification Using Shape Index Histograms With Donut-Shaped Spatial Pooling

    DEFF Research Database (Denmark)

    Larsen, Anders Boesen Lindbo; Vestergaard, Jacob Schack; Larsen, Rasmus

    2014-01-01

    We present a new method for automatic classification of indirect immunoflourescence images of HEp-2 cells into different staining pattern classes. Our method is based on a new texture measure called shape index histograms that captures second-order image structure at multiple scales. Moreover, we...... datasets. Our results show that shape index histograms are superior to other popular texture descriptors for HEp-2 cell classification. Moreover, when comparing to other automated systems for HEp-2 cell classification we show that shape index histograms are very competitive; especially considering...

  14. An alternative to γ histograms for ROI-based quantitative dose comparisons

    International Nuclear Information System (INIS)

    Dvorak, P

    2009-01-01

    An alternative to gamma (γ) histograms for ROI-based quantitative comparisons of dose distributions using the γ concept is proposed. The method provides minimum values of dose difference and distance-to-agreement such that a pre-set fraction of the region of interest passes the γ test. Compared to standard γ histograms, the method provides more information in terms of pass rate per γ calculation. This is achieved at negligible additional calculation cost and without loss of accuracy. The presented method is proposed as a useful and complementary alternative to standard γ histograms, increasing both the quantity and quality of information for use in acceptance or rejection decisions. (note)

  15. Importance measures and resource allocation

    International Nuclear Information System (INIS)

    Guey, C.N.; Morgan, T.; Hughes, E.A.

    1987-01-01

    This paper discusses various importance measures and their practical relevance to allocating resources. The characteristics of importance measures are illustrated through simple examples. Important factors associated with effectively allocating resources to improve plant system performance or to prevent system degradation are discussed. It is concluded that importance measures are only indicative of, and not equal to, the risk significance of a component, system, or event. A decision framework is suggested to provide a comprehensive basis for resource allocation.

  16. IPO Allocations: Discriminatory or Discretionary?

    OpenAIRE

    William Wilhelm; Alexander Ljungqvist

    2001-01-01

    We estimate the structural links between IPO allocations, pre-market information production, and initial underpricing returns, within the context of theories of bookbuilding. Using a sample of both US and international IPOs we find evidence of the following: IPO allocation policies favour institutional investors, both in the US and worldwide; increasing institutional allocations results in offer prices that deviate more from the indicative price range established prior to bankers' efforts...

  17. Zinc allocation and re-allocation in rice

    NARCIS (Netherlands)

    Stomph, T.J.; Jiang, W.; Putten, van der P.E.L.; Struik, P.C.

    2014-01-01

    Aims: Agronomy and breeding actively search for options to enhance cereal grain Zn density. Quantifying internal (re-)allocation of Zn as affected by soil and crop management or genotype is crucial. We present experiments supporting the development of a conceptual model of whole plant Zn allocation

  18. Research on allocation efficiency of the daisy chain allocation algorithm

    Science.gov (United States)

    Shi, Jingping; Zhang, Weiguo

    2013-03-01

    With the improvement of aircraft performance in reliability, maneuverability and survivability, the number of control effectors has increased considerably. How to distribute the three-axis moments among the control surfaces reasonably becomes an important problem. The daisy chain method is simple and easy to carry out in the design of the allocation system, but it cannot solve the allocation problem over the entire attainable moment subset. For the lateral-directional allocation problem, the allocation efficiency of the daisy chain can be directly measured by the area of its subset of attainable moments. Because of the non-linear allocation characteristic, the subset of attainable moments of the daisy-chain method is a complex non-convex polygon, which is difficult to compute directly. By analyzing the two-dimensional allocation problem with a "micro-element" idea, a numerical calculation algorithm is proposed to compute the area of the non-convex polygon, as sketched below. In order to improve the allocation efficiency, a genetic algorithm with the allocation efficiency chosen as the fitness function is proposed to find the best pseudo-inverse matrix.
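
    A minimal version of the "micro-element" area computation (a sketch of the idea only; the grid resolution and the membership predicate are assumptions, and the paper's actual algorithm operates on the daisy-chain attainable moment set):

```python
# "Micro-element" area estimate for a possibly non-convex planar region: tile
# the bounding box with small cells and sum the areas of cells whose centers
# satisfy a membership predicate.
import numpy as np

def region_area(inside, xlim, ylim, n=400):
    """inside(x, y) -> bool; xlim/ylim bound the region; n grid points per axis."""
    xs = np.linspace(xlim[0], xlim[1], n)
    ys = np.linspace(ylim[0], ylim[1], n)
    dx = (xlim[1] - xlim[0]) / (n - 1)
    dy = (ylim[1] - ylim[0]) / (n - 1)
    X, Y = np.meshgrid(xs, ys)
    hits = sum(inside(x, y) for x, y in zip(X.ravel(), Y.ravel()))
    return hits * dx * dy

# Example: a non-convex L-shaped region of true area 3.
area = region_area(lambda x, y: (0 <= x <= 2 and 0 <= y <= 1) or
                                (0 <= x <= 1 and 0 <= y <= 2),
                   xlim=(0, 2), ylim=(0, 2))
```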

  19. Histogram-driven cupping correction (HDCC) in CT

    Science.gov (United States)

    Kyriakou, Y.; Meyer, M.; Lapp, R.; Kalender, W. A.

    2010-04-01

    Typical cupping correction methods are pre-processing methods which require either pre-calibration measurements or simulations of standard objects to approximate and correct for beam hardening and scatter. Some of them require knowledge of the spectra, detector characteristics, etc. The aim of this work was to develop a practical histogram-driven cupping correction (HDCC) method to post-process the reconstructed images. We use a polynomial representation of the raw data generated by forward projection of the reconstructed images; forward and backprojection are performed on graphics processing units (GPU). The coefficients of the polynomial are optimized using a simplex minimization of the joint entropy of the CT image and its gradient. The algorithm was evaluated using simulations and measurements of homogeneous and inhomogeneous phantoms. For the measurements, a C-arm flat-detector CT (FD-CT) system with a 30×40 cm² detector, a kilovoltage on-board imager (radiation therapy simulator) and a micro-CT system were used. The algorithm reduced cupping artifacts both in simulations and measurements using a fourth-order polynomial and was in good agreement with the reference. The minimization algorithm required fewer than 70 iterations to adjust the coefficients, performing only a linear combination of basis images and thus executing without time-consuming operations. HDCC reduced cupping artifacts without the necessity of pre-calibration or other scan information, enabling a retrospective improvement of CT image homogeneity. However, the method can work with other cupping correction algorithms or in a calibration manner as well.
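
    The core of the optimization can be sketched in a few lines (a deliberately simplified toy: here the polynomial is applied directly to image grey values rather than to forward-projected raw data, and the bin count and polynomial order are assumptions):

```python
# Simplex (Nelder-Mead) minimization of the joint entropy of an image and its
# gradient magnitude over polynomial correction coefficients.
import numpy as np
from scipy.optimize import minimize

def joint_entropy(img, bins=64):
    grad = np.hypot(*np.gradient(img))
    h, _, _ = np.histogram2d(img.ravel(), grad.ravel(), bins=bins)
    p = h[h > 0] / h.sum()
    return -(p * np.log(p)).sum()

def hdcc_like(img, order=4):
    def cost(coeffs):
        # Polynomial with zero constant term, added as a correction.
        return joint_entropy(img + np.polyval(np.append(coeffs, 0.0), img))
    res = minimize(cost, x0=np.zeros(order), method="Nelder-Mead")
    return img + np.polyval(np.append(res.x, 0.0), img)
```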

  20. Utilization of deletion bins to anchor and order sequences along the wheat 7B chromosome.

    Science.gov (United States)

    Belova, Tatiana; Grønvold, Lars; Kumar, Ajay; Kianian, Shahryar; He, Xinyao; Lillemo, Morten; Springer, Nathan M; Lien, Sigbjørn; Olsen, Odd-Arne; Sandve, Simen R

    2014-09-01

    A total of 3,671 sequence contigs and scaffolds were mapped to deletion bins on wheat chromosome 7B, providing a foundation for developing a high-resolution integrated physical map for this chromosome. Bread wheat (Triticum aestivum L.) has a large, complex and highly repetitive genome which is challenging to assemble into high-quality pseudo-chromosomes. As part of the international effort to sequence the hexaploid bread wheat genome by the International Wheat Genome Sequencing Consortium (IWGSC), we are focused on assembling a reference sequence for chromosome 7B. The successful completion of the reference chromosome sequence is highly dependent on the integration of genetic and physical maps. To aid the integration of these two types of maps, we have constructed a high-density deletion bin map of chromosome 7B. Using the 270 K Nimblegen comparative genomic hybridization (CGH) array on a set of cv. Chinese Spring deletion lines, a total of 3,671 sequence contigs and scaffolds (~7.8 % of the chromosome 7B physical length) were mapped into nine deletion bins. Our method of genotyping deletions on chromosome 7B relied on a model-based clustering algorithm (Mclust) to accurately predict the presence or absence of a given genomic sequence in a deletion line. The bin mapping results were validated using three different approaches, viz. (a) PCR-based amplification of randomly selected bin-mapped sequences, (b) comparison with previously mapped ESTs and (c) comparison with a 7B genetic map developed in the present study. Validation of the bin mapping results suggested a high accuracy of the assignment of 7B sequence contigs and scaffolds to the 7B deletion bins.

  1. Unsupervised binning of environmental genomic fragments based on an error robust selection of l-mers.

    Science.gov (United States)

    Yang, Bin; Peng, Yu; Leung, Henry Chi-Ming; Yiu, Siu-Ming; Chen, Jing-Chi; Chin, Francis Yuk-Lun

    2010-04-16

    With the rapid development of genome sequencing techniques, traditional research methods based on the isolation and cultivation of microorganisms are being gradually replaced by metagenomics, which is also known as environmental genomics. The first step, which is still a major bottleneck, of metagenomics is the taxonomic characterization of DNA fragments (reads) resulting from sequencing a sample of mixed species. This step is usually referred to as "binning". Existing binning methods are based on supervised or semi-supervised approaches which rely heavily on reference genomes of known microorganisms and phylogenetic marker genes. Due to the limited availability of reference genomes and the bias and instability of marker genes, existing binning methods may not be applicable in many cases. In this paper, we present an unsupervised binning method based on the distribution of a carefully selected set of l-mers (substrings of length l in DNA fragments). From our experiments, we show that our method can accurately bin DNA fragments with various lengths and relative species abundance ratios without using any reference or training datasets. Another feature of our method is its error robustness: the binning accuracy decreases by less than 1% when the sequencing error rate increases from 0% to 5%. Note that the typical sequencing error rate of existing commercial sequencing platforms is less than 2%. We provide a new and effective tool to solve the metagenome binning problem without using any reference datasets or marker information from any known reference genomes (species). The source code of our software tool, the reference genomes of the species for generating the test datasets and the corresponding test datasets are available at http://i.cs.hku.hk/~alse/MetaCluster/.
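
    One simple way to make an l-mer representation robust to sequencing errors is to drop rare l-mers, since isolated base errors overwhelmingly create l-mers that occur only a few times; the sketch below illustrates this (the threshold, the value of l, and the selection rule are assumptions, not the paper's exact criterion):

```python
# Build per-fragment profiles over an error-robust set of l-mers: count l-mers
# across all fragments, keep only those above a frequency threshold, and
# row-normalize so fragment length does not dominate.
from collections import Counter

import numpy as np

def lmers(seq, l):
    return (seq[i:i + l] for i in range(len(seq) - l + 1))

def robust_profiles(fragments, l=5, min_count=3):
    counts = Counter(m for f in fragments for m in lmers(f, l))
    kept = sorted(m for m, c in counts.items() if c >= min_count)
    index = {m: i for i, m in enumerate(kept)}
    profiles = np.zeros((len(fragments), len(kept)))
    for row, f in enumerate(fragments):
        for m in lmers(f, l):
            if m in index:
                profiles[row, index[m]] += 1
    return profiles / np.maximum(profiles.sum(axis=1, keepdims=True), 1), kept
```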

  2. Efficient Metropolitan Resource Allocation

    Directory of Open Access Journals (Sweden)

    Richard Arnott

    2016-05-01

    Full Text Available Over the past 30 years Calgary has doubled in size, from a population of 640,645 in 1985 to 1,230,915 in 2015. During that time the City has had five different mayors, hosted the Winter Olympics, and expanded the C-Train from 25 platforms to 45. Calgary’s Metropolitan Area has grown too, with Airdrie, Chestermere, Okotoks and Cochrane growing into full-fledged cities, ripe with inter-urban commuters. And with changes to provincial legislation in the mid-’90s, rural Rocky View County and the Municipal District of Foothills are now real competitors for residential, commercial and industrial development that in the past would have been considered urban. In this metropolitan system, where people live, their household structure, and their place of work inform the services they need to conduct their daily lives, and directly impact the spatial character of the City and the broader region. In sum, Metropolitan Calgary is increasingly complex. Calgary and the broader metropolitan area will continue to grow, even with the current economic slowdown. Frictions within Calgary, between the various municipalities in the metropolitan area, and the priorities of other local authorities (such as the School Boards and Alberta Health Services) will continue to impact the agendas of local politicians and their ability to answer to the needs of their residents. How resources – whether hard infrastructure, affordable housing, classrooms, or hospital beds – are allocated over space, and how these resources are funded, directly impacts these relationships. This technical paper provides my perspective as an urban economist on the efficient allocation of resources within a metropolitan system in general, with reference to Calgary where appropriate, and serves as a companion to the previously released “Reflections on Calgary’s Spatial Structure: An Urban Economist’s Critique of Municipal Planning in Calgary.” It is hoped that the concepts reviewed

  3. Reliability Study Regarding the Use of Histogram Similarity Methods for Damage Detection

    Directory of Open Access Journals (Sweden)

    Nicoleta Gillich

    2013-01-01

    Full Text Available The paper analyses the reliability of three dissimilarity estimators for comparing histograms, as support for a frequency-based damage detection method able to identify structural changes in beam-like structures. First a brief presentation of our own damage detection method is made, with a focus on damage localization. It actually consists of comparing a histogram derived from measurement results with a large series of histograms, namely the damage location indexes for all locations along the beam, obtained by calculus. We tested dissimilarity estimators such as the Minkowski-form distances, the Kullback-Leibler divergence and the histogram intersection, and found the Minkowski distance to be the method providing the best results. It was tested for numerous locations, using real measurement results and results artificially degraded by noise, proving its reliability.
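
    The three estimators compared in the paper are compact enough to state directly (a generic NumPy rendering; identical bin layouts and histograms normalized to sum to one are assumed):

```python
# Minkowski-form distance, Kullback-Leibler divergence, and histogram
# intersection between two normalized histograms h1 and h2.
import numpy as np

def minkowski(h1, h2, p=2):
    return np.sum(np.abs(h1 - h2) ** p) ** (1.0 / p)

def kl_divergence(h1, h2, eps=1e-12):
    h1, h2 = h1 + eps, h2 + eps            # avoid division by, or log of, zero
    return np.sum(h1 * np.log(h1 / h2))

def histogram_intersection(h1, h2):
    return np.sum(np.minimum(h1, h2))      # a similarity; 1 - value acts as a distance
```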

  4. Adaptive Histogram Equalization Based Image Forensics Using Statistics of DC DCT Coefficients

    Directory of Open Access Journals (Sweden)

    Neetu Singh

    2018-01-01

    Full Text Available Digital images are increasingly vulnerable to manipulation, which has motivated a research area dealing with digital image forgeries. Certifying the origin and content of digital images is an open problem in the multimedia world. One of the ways to establish the truth of an image is to detect the presence of any type of contrast enhancement. In this work, a novel and simple machine-learning tool is proposed to detect the presence of histogram equalization using statistical parameters of DC Discrete Cosine Transform (DCT) coefficients. The statistical parameters of a Gaussian Mixture Model (GMM) fitted to the DC DCT coefficients are used as features for classifying original and histogram-equalized images. An SVM classifier has been developed to classify original and histogram-equalized images, and it can detect histogram-equalized images with accuracy greater than 95% when the false rate is less than 5%.
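
    The pipeline is straightforward to sketch (the 8×8 block size, two mixture components and an RBF kernel are assumptions for illustration; the paper's exact feature set may differ):

```python
# Fit a GMM to the DC coefficients of block DCTs and use its parameters as
# features for an SVM distinguishing original from histogram-equalized images.
import numpy as np
from scipy.fft import dctn
from sklearn.mixture import GaussianMixture
from sklearn.svm import SVC

def dc_dct_features(img: np.ndarray) -> np.ndarray:
    h, w = (d - d % 8 for d in img.shape)          # crop to whole 8x8 blocks
    dc = [dctn(img[i:i + 8, j:j + 8], norm="ortho")[0, 0]
          for i in range(0, h, 8) for j in range(0, w, 8)]
    gmm = GaussianMixture(n_components=2).fit(np.array(dc).reshape(-1, 1))
    return np.r_[gmm.means_.ravel(), gmm.covariances_.ravel(), gmm.weights_]

# Training sketch: X = np.array([dc_dct_features(im) for im in images])
# clf = SVC(kernel="rbf").fit(X, labels)
```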

  5. Automatic analysis of flow cytometric DNA histograms from irradiated mouse male germ cells

    International Nuclear Information System (INIS)

    Lampariello, F.; Mauro, F.; Uccelli, R.; Spano, M.

    1989-01-01

    An automatic procedure for recovering the DNA content distribution of mouse irradiated testis cells from flow cytometric histograms is presented. First, a suitable mathematical model is developed to represent the pattern of DNA content and fluorescence distribution in the sample. Then a parameter estimation procedure, based on the maximum likelihood approach, is constructed by means of an optimization technique. This procedure has been applied to a set of DNA histograms relative to different doses of 0.4-MeV neutrons and to different time intervals after irradiation. In each case, a good agreement between the measured histograms and the corresponding fits has been obtained. The results indicate that the proposed method for the quantitative analysis of germ cell DNA histograms can be usefully applied to the study of the cytotoxic and mutagenic action of agents of toxicological interest such as ionizing radiations. 18 refs.

  6. Histograms of Arecibo World Days Measurements and Linear-H Fits Between 1985 and 1995

    National Research Council Canada - National Science Library

    Melendez-Alvira, D

    1998-01-01

    This document presents histograms of linear-H model fits to electron density profiles measured with the incoherent scatter radar of the Arecibo Observatory in Puerto Rico during the World Days between 1985 and 1995...

  7. Dose-volume histograms for optimization of treatment plans illustrated by the example of oesophagus carcinoma

    International Nuclear Information System (INIS)

    Roth, J.; Huenig, R.; Huegli, C.

    1995-01-01

    Using the example of oesophagus carcinoma, dose-volume histograms for diverse treatment techniques are calculated and judged by means of multiplanar isodose representations. The selected treatment plans are ranked with the aid of the dose-volume histograms. We distinguish the tissue inside and outside of the target volume. The description of the spatial dose distribution in dependence on the different volumes, and the respective fractions of the tumor dose therein, with the help of dose-volume histograms establishes a correlation between the physical parameters and the biological effects. In addition, one has to bear in mind the consequences of measures that influence the reaction and the side-effects of radiotherapy (e.g. chemotherapy), i.e. the recuperation of the tissues that were irradiated intentionally or inevitably. Taking all that into account, it is evident that dose-volume histograms are a powerful tool for assessing the quality of treatment plans. (orig./MG)
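
    The underlying quantity is simple to compute from a dose grid and a structure mask (a generic cumulative-DVH sketch; the bin count is an arbitrary choice):

```python
# Cumulative dose-volume histogram: for each dose level, the fraction of the
# structure's volume receiving at least that dose.
import numpy as np

def cumulative_dvh(dose, mask, n_bins=200):
    d = dose[mask]                                   # doses inside the structure
    levels = np.linspace(0.0, d.max(), n_bins)
    volume_fraction = np.array([(d >= level).mean() for level in levels])
    return levels, volume_fraction
```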

  8. The equivalent Histograms in clinical practice; Los histogramas equivalentes en la practica clinica

    Energy Technology Data Exchange (ETDEWEB)

    Pizarro Trigo, F.; Teijeira Garcia, M.; Zaballos Carrera, S.

    2013-07-01

    The tolerances established for organs at risk [1] under standard fractionation schemes (2 Gy/session, 5 sessions per week) are frequently misapplied to dose-volume histograms of non-standard schemes. The purpose of this work is to establish when this misuse may be most significant and to carry out a transformation of non-standard-fractionation dose-volume histograms. A case that can be useful for making clinical decisions is presented. (Author)
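
    One standard way to carry out such a transformation is the linear-quadratic EQD2 conversion, which re-expresses a dose delivered at d Gy per fraction in 2 Gy-per-fraction equivalents (shown here as the generic textbook formula; the authors' exact procedure is not specified in the abstract):

```python
# EQD2 = D * (d + a/b) / (2 + a/b), with total dose D, dose per fraction d,
# and the tissue's alpha/beta ratio in Gy.
def eqd2(total_dose_gy: float, dose_per_fraction_gy: float, alpha_beta_gy: float) -> float:
    return total_dose_gy * (dose_per_fraction_gy + alpha_beta_gy) / (2.0 + alpha_beta_gy)

# Example: 45 Gy in 3 Gy fractions, late-responding tissue (a/b = 3 Gy):
# eqd2(45, 3, 3) -> 54.0 Gy in 2 Gy equivalents
```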

  9. Endogenously arising network allocation rules

    NARCIS (Netherlands)

    Slikker, M.

    2006-01-01

    In this paper we study endogenously arising network allocation rules. We focus on three allocation rules: the Myerson value, the position value and the component-wise egalitarian solution. For any of these three rules we provide a characterization based on component efficiency and some balanced

  10. Carbon allocation in forest ecosystems

    Science.gov (United States)

    Creighton M. Litton; James W. Raich; Michael G. Ryan

    2007-01-01

    Carbon allocation plays a critical role in forest ecosystem carbon cycling. We reviewed existing literature and compiled annual carbon budgets for forest ecosystems to test a series of hypotheses addressing the patterns, plasticity, and limits of three components of allocation: biomass, the amount of material present; flux, the flow of carbon to a component per unit...

  11. Risk allocation under liquidity constraints

    NARCIS (Netherlands)

    Csóka, P.; Herings, P.J.J.

    2013-01-01

    Risk allocation games are cooperative games that are used to attribute the risk of a financial entity to its divisions. In this paper, we extend the literature on risk allocation games by incorporating liquidity considerations. A liquidity policy specifies state-dependent liquidity requirements that

  12. Histogram based analysis of lung perfusion of children after congenital diaphragmatic hernia repair.

    Science.gov (United States)

    Kassner, Nora; Weis, Meike; Zahn, Katrin; Schaible, Thomas; Schoenberg, Stefan O; Schad, Lothar R; Zöllner, Frank G

    2018-05-01

    To investigate a histogram-based approach to characterize the distribution of perfusion in the whole left and right lung by descriptive statistics, and to show how histograms can be used to visually explore perfusion defects in two-year-old children after Congenital Diaphragmatic Hernia (CDH) repair. 28 children (age 24.2 ± 1.7 months; all left-sided hernia; 9 after extracorporeal membrane oxygenation therapy) underwent quantitative DCE-MRI of the lung. Segmentations of the left and right lung were manually drawn to mask the calculated pulmonary blood flow maps and then to derive histograms for each lung side. Individual and group-wise analysis of histograms of the left and right lung was performed. Ipsilateral and contralateral lung show significant differences in shape and in descriptive statistics derived from the histogram (Wilcoxon signed-rank test) as well as in other histogram-derived parameters. Histogram analysis can be a valuable tool to characterize and visualize whole lung perfusion of children after CDH repair. It allows for several possibilities to analyze the data, both describing the perfusion differences between the right and left lung and exploring and visualizing localized perfusion patterns in the 3D lung volume. Subgroup analysis will be possible given sufficient sample sizes. Copyright © 2017 Elsevier Inc. All rights reserved.

  13. ADC histogram analysis of muscle lymphoma - Correlation with histopathology in a rare entity.

    Science.gov (United States)

    Meyer, Hans-Jonas; Pazaitis, Nikolaos; Surov, Alexey

    2018-06-21

    Diffusion weighted imaging (DWI) is able to reflect histopathology architecture. A novel imaging approach, namely histogram analysis, is used to further characterize lesions on MRI. The purpose of this study is to correlate histogram parameters derived from apparent diffusion coefficient (ADC) maps with histopathology parameters in muscle lymphoma. Eight patients (mean age 64.8 years, range 45-72 years) with histopathologically confirmed muscle lymphoma were retrospectively identified. Cell count, total nucleic and average nucleic areas were estimated using ImageJ. Additionally, the Ki67-index was calculated. DWI was obtained on a 1.5 T scanner by using b-values of 0 and 1000 s/mm2. Histogram analysis was performed as a whole-lesion measurement by using a custom-made Matlab-based application. The correlation analysis revealed statistically significant correlations between cell count and ADCmean (ρ = -0.76, P = 0.03) as well as ADCp75 (ρ = -0.79, P = 0.02). Kurtosis and entropy correlated with average nucleic area (ρ = -0.81, P = 0.02 and ρ = 0.88, P = 0.007, respectively). None of the analyzed ADC parameters correlated with total nucleic area or with the Ki67-index. This study identified significant correlations between cellularity and histogram parameters derived from ADC maps in muscle lymphoma. Thus, histogram analysis parameters reflect histopathology in muscle tumors. Advances in knowledge: Whole-lesion ADC histogram analysis is able to reflect histopathology parameters in muscle lymphomas.
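
    The whole-lesion histogram parameters used above are generic descriptive statistics and can be reproduced as follows (a re-implementation sketch, not the authors' custom Matlab application; the bin count is an assumption):

```python
# Whole-lesion ADC histogram metrics from an ADC map and a boolean lesion mask.
import numpy as np
from scipy import stats

def adc_histogram_params(adc_map, mask, bins=128):
    values = adc_map[mask]
    hist, _ = np.histogram(values, bins=bins)
    p = hist[hist > 0] / hist.sum()
    return {
        "mean": values.mean(),
        "p25": np.percentile(values, 25),
        "p75": np.percentile(values, 75),
        "skewness": stats.skew(values),
        "kurtosis": stats.kurtosis(values),
        "entropy": -(p * np.log2(p)).sum(),
    }
```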

  14. Histogram-based quantitative evaluation of endobronchial ultrasonography images of peripheral pulmonary lesion.

    Science.gov (United States)

    Morikawa, Kei; Kurimoto, Noriaki; Inoue, Takeo; Mineshita, Masamichi; Miyazawa, Teruomi

    2015-01-01

    Endobronchial ultrasonography using a guide sheath (EBUS-GS) is an increasingly common bronchoscopic technique, but currently no methods have been established to quantitatively evaluate EBUS images of peripheral pulmonary lesions. The purpose of this study was to evaluate whether histogram data collected from EBUS-GS images can contribute to the diagnosis of lung cancer. Histogram-based analyses focusing on the brightness of EBUS images were retrospectively conducted: 60 patients (38 lung cancer; 22 inflammatory diseases) with clear EBUS images were included. For each patient, a 400-pixel region of interest was selected, typically located at a 3- to 5-mm radius from the probe, from EBUS images recorded during bronchoscopy. Histogram height, width, height/width ratio, standard deviation, kurtosis and skewness were investigated as diagnostic indicators. Median histogram height, width, height/width ratio and standard deviation were significantly different between lung cancer and benign lesions, with the best discrimination provided by the histogram standard deviation. Histogram standard deviation appears to be the most useful characteristic for diagnosing lung cancer using EBUS images. © 2015 S. Karger AG, Basel.

  15. De-striping for TDI CCD remote sensing image based on statistical features of histogram

    Directory of Open Access Journals (Sweden)

    H.-T. Gao

    2016-06-01

    Full Text Available Aiming at the striping noise caused by the non-uniform response of remote sensing TDI CCD detectors, a novel de-striping method based on statistical features of the image histogram is put forward. By analysing the distribution of histograms, the centroid of the histogram is selected as an eigenvalue representing the uniformity of ground objects. The histogram centroids of the whole image and of each detector element (image column) are calculated first, and the differences between them are regarded as rough correction coefficients. Then, in order to avoid the sensitivity caused by a single parameter, and considering the strong continuity and correlation of ground objects between two adjacent pixels, the correlation coefficient of the histograms is introduced to reflect the similarity between them; a fine correction coefficient is obtained by searching around the rough correction coefficient. Additionally, in view of the influence of bright clouds on the histogram, automatic cloud detection based on multiple features including grey level, texture, fractal dimension and edges is used to pre-process the image. Two 0-level panchromatic images of the SJ-9A satellite with obvious stripe noise were processed by the proposed method to evaluate its performance. Results show that the visual quality of the images is improved because the stripe noise is entirely removed; we quantitatively analyse the result by calculating the non-uniformity, which reaches about 1% and is better than the histogram matching method.
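
    The rough correction step relies on the fact that the centroid of a grey-level histogram is just the mean grey level, so per-detector (per-column) centroids can be compared with the whole-image centroid (a stripped-down sketch; the histogram-correlation refinement and cloud masking are omitted, and column-aligned stripes are assumed):

```python
# First-pass de-striping: per-column gain coefficients from histogram centroids.
import numpy as np

def rough_destripe(img: np.ndarray) -> np.ndarray:
    global_centroid = img.mean()
    column_centroids = img.mean(axis=0)        # one detector element per column
    gains = global_centroid / np.maximum(column_centroids, 1e-6)
    return img * gains[np.newaxis, :]
```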

  16. Fungal volatiles associated with moldy grain in ventilated and non-ventilated bin-stored wheat.

    Science.gov (United States)

    Sinha, R N; Tuma, D; Abramson, D; Muir, W E

    1988-01-01

    The fungal odor compounds 3-methyl-1-butanol, 1-octen-3-ol and 3-octanone were monitored in nine experimental bins in Winnipeg, Manitoba containing a hard red spring wheat during the autumn, winter and summer seasons of 1984-85. Quality changes were associated with seed-borne microflora and moisture content in both ventilated and non-ventilated bins containing wheat of 15.6 and 18.2% initial moisture content. All three odor compounds occurred in considerably greater amounts in bulk wheat in non-ventilated than in ventilated bins, particularly in those with wheat having 18.2% moisture content. The presence of these compounds usually coincided with infection of the seeds by the fungi Alternaria alternata (Fr.) Keissler, Aspergillus repens DeBarry, A. versicolor (Vuill.) Tiraboschi, Penicillium crustosum Thom, P. oxalicum Currie and Thom, P. aurantiogriseum Dierckx, and P. citrinum Thom. High production of all three odor compounds in damp wheat stored in non-ventilated bins was associated with heavy fungal infection of the seeds and reduction in seed germinability. High initial moisture content of the harvested grain accelerated the production of all three fungal volatiles in non-ventilated bins.

  17. Fast and accurate taxonomic assignments of metagenomic sequences using MetaBin.

    Directory of Open Access Journals (Sweden)

    Vineet K Sharma

    Full Text Available Taxonomic assignment of sequence reads is a challenging task in metagenomic data analysis, for which the present methods mainly use either composition- or homology-based approaches. Though the homology-based methods are more sensitive and accurate, they suffer primarily from the time needed to generate the Blast alignments. We developed the MetaBin program and web server for better homology-based taxonomic assignments using an ORF-based approach. By implementing Blat as the faster alignment method in place of Blastx, the analysis time has been reduced severalfold. It is benchmarked using both simulated and real metagenomic datasets, and can be used for both single and paired-end sequence reads of varying lengths (≥45 bp). To our knowledge, MetaBin is the only available program that can be used for the taxonomic binning of short reads (<100 bp) with high accuracy and high sensitivity using a homology-based approach. The MetaBin web server can be used to carry out the taxonomic analysis, by either submitting reads or Blastx output. It provides several options including construction of taxonomic trees, creation of a composition chart, functional analysis using COGs, and comparative analysis of multiple metagenomic datasets. MetaBin web server and a standalone version for high-throughput analysis are available freely at http://metabin.riken.jp/.

  18. Robotic vision system for random bin picking with dual-arm robots

    Directory of Open Access Journals (Sweden)

    Kang Sangseung

    2016-01-01

    Full Text Available Random bin picking is one of the most challenging industrial robotics applications available. It constitutes a complicated interaction between the vision system, robot, and control system. For a packaging operation requiring a pick-and-place task, the robot system utilized should be able to perform certain functions for recognizing the applicable target object from randomized objects in a bin. In this paper, we introduce a robotic vision system for bin picking using industrial dual-arm robots. The proposed system recognizes the best object from randomized target candidates based on stereo vision, and estimates the position and orientation of the object. It then sends the result to the robot control system. The system was developed for use in the packaging process of cell phone accessories using dual-arm robots.

  19. Zinc allocation and re-allocation in rice

    Science.gov (United States)

    Stomph, Tjeerd Jan; Jiang, Wen; Van Der Putten, Peter E. L.; Struik, Paul C.

    2014-01-01

    Aims: Agronomy and breeding actively search for options to enhance cereal grain Zn density. Quantifying internal (re-)allocation of Zn as affected by soil and crop management or genotype is crucial. We present experiments supporting the development of a conceptual model of whole plant Zn allocation and re-allocation in rice. Methods: Two solution culture experiments using 70Zn applications at different times during crop development and an experiment on within-grain distribution of Zn are reported. In addition, results from two earlier published experiments are re-analyzed and re-interpreted. Results: A budget analysis showed that plant zinc accumulation during grain filling was larger than zinc allocation to the grains. Isotope data showed that zinc taken up during grain filling was only partly transported directly to the grains and partly allocated to the leaves. Zinc taken up during grain filling and allocated to the leaves replaced zinc re-allocated from leaves to grains. Within the grains, no major transport barrier was observed between vascular tissue and endosperm. At low tissue Zn concentrations, rice plants maintained concentrations of about 20 mg Zn kg−1 dry matter in leaf blades and reproductive tissues, but let Zn concentrations in stems, sheath, and roots drop below this level. When plant zinc concentrations increased, Zn levels in leaf blades and reproductive tissues only showed a moderate increase while Zn levels in stems, roots, and sheaths increased much more and in that order. Conclusions: In rice, the major barrier to enhanced zinc allocation towards grains is between stem and reproductive tissues. Enhancing root to shoot transfer will not contribute proportionally to grain zinc enhancement. PMID:24478788

  20. Solar-Powered Compaction Garbage Bins in Public Areas: A Preliminary Economic and Environmental Evaluation

    Directory of Open Access Journals (Sweden)

    Long Duc Nghiem

    2010-02-01

    Full Text Available An Excel-based model was developed to evaluate the economic and environmental benefits of solar-powered compaction garbage bins in public areas in Australia. Input data were collected from Brisbane and Wollongong City councils, and Sydney Olympic Park. The results demonstrate that solar-powered compaction garbage bins would provide environmental benefits in all scenarios. However, the results of the economic analysis of the three studied areas varied significantly. The unique situation of Sydney Olympic Park made implementation in that facility particularly appealing. A lower monthly rental cost is needed for the implementation of this novel waste management practice.

  1. High-visibility time-bin entanglement for testing chained Bell inequalities

    Science.gov (United States)

    Tomasin, Marco; Mantoan, Elia; Jogenfors, Jonathan; Vallone, Giuseppe; Larsson, Jan-Åke; Villoresi, Paolo

    2017-03-01

    The violation of Bell's inequality requires a well-designed experiment to validate the result. In experiments using energy-time and time-bin entanglement, initially proposed by Franson in 1989, there is an intrinsic loophole due to the high postselection. To obtain a violation in this type of experiment, a chained Bell inequality must be used. However, the local realism bound requires a high visibility in excess of 94.63% in the time-bin entangled state. In this work, we show how such a high visibility can be reached in order to violate a chained Bell inequality with six, eight, and ten terms.

  2. Propagation and survival of frequency-bin entangled photons in metallic nanostructures

    Directory of Open Access Journals (Sweden)

    Olislager Laurent

    2015-01-01

    Full Text Available We report on the design of two plasmonic nanostructures and the propagation of frequency-bin entangled photons through them. The experimental findings clearly show the robustness of frequency-bin entanglement, which survives after interactions with both a hybrid plasmo-photonic structure, and a nano-pillar array. These results confirm that quantum states can be encoded into the collective motion of a many-body electronic system without demolishing their quantum nature, and pave the way towards applications of plasmonic structures in quantum information.

  3. Interpretation of erythrocyte histograms obtained from automated hematology analyzers in hematologic diseases

    Directory of Open Access Journals (Sweden)

    Ali Maleki

    2015-12-01

    Full Text Available Background: The graphical data on blood cells (histograms and cytograms or scattergrams) that are now available in all modern automated hematology analyzers are an integral part of the automated complete blood count (CBC). To detect incorrect results from an automated hematology analyzer and to identify the samples that require additional analysis, laboratory staff can use those data for quality control of the obtained results and for help in identifying complex and troublesome cases. Methods: In this descriptive analytic study, in addition to erythrocyte graphs from a variety of patients referred to our clinical laboratory (Zagros Hospital, Kermanshah, Iran) from March 2013 to February 2014, papers published in the relevant literature as well as available published manuals of automated blood cell counters were used. Articles related to the keywords erythrocyte graphs, erythrogram, erythrocyte histogram and hematology analyzer graphs involved in the diagnosis of hematological disorders were searched in valid databases such as SpringerLink, Google Scholar, PubMed and ScienceDirect and selected for this study. Results: Histograms and various automated CBC parameters become abnormal in different pathologic conditions and can present important clues for the diagnosis and treatment of hematologic and non-hematologic disorders. In several instances, these histograms have characteristic appearances in a wide range of pathological conditions. In some hematologic disorders, such as iron deficiency or megaloblastic anemia, sequential histograms can clearly show the progress of treatment and management. Conclusion: These graphical data are often accompanied by other automated CBC parameters and microscopic examination of peripheral blood smears (PBS), and can help in monitoring and

  4. Can histogram analysis of MR images predict aggressiveness in pancreatic neuroendocrine tumors?

    Science.gov (United States)

    De Robertis, Riccardo; Maris, Bogdan; Cardobi, Nicolò; Tinazzi Martini, Paolo; Gobbo, Stefano; Capelli, Paola; Ortolani, Silvia; Cingarlini, Sara; Paiella, Salvatore; Landoni, Luca; Butturini, Giovanni; Regi, Paolo; Scarpa, Aldo; Tortora, Giampaolo; D'Onofrio, Mirko

    2018-06-01

    To evaluate MRI-derived whole-tumour histogram analysis parameters in predicting pancreatic neuroendocrine neoplasm (panNEN) grade and aggressiveness. Pre-operative MR examinations of 42 consecutive patients with panNENs >1 cm were retrospectively analysed. T1-/T2-weighted images and ADC maps were analysed. Histogram-derived parameters were compared to histopathological features using the Mann-Whitney U test. Diagnostic accuracy was assessed by ROC-AUC analysis; sensitivity and specificity were assessed for each histogram parameter. ADC entropy was significantly higher in G2-3 tumours with ROC-AUC 0.757; sensitivity and specificity were 83.3 % (95 % CI: 61.2-94.5) and 61.1 % (95 % CI: 36.1-81.7). ADC kurtosis was higher in panNENs with vascular involvement, nodal and hepatic metastases (p = .008, .021 and .008; ROC-AUC = 0.820, 0.709 and 0.820); sensitivity and specificity were: 85.7/74.3 % (95 % CI: 42-99.2/56.4-86.9), 36.8/96.5 % (95 % CI: 17.2-61.4/76-99.8) and 100/62.8 % (95 % CI: 56.1-100/44.9-78.1). No significant differences between groups were found for other histogram-derived parameters (p > .05). Whole-tumour histogram analysis of ADC maps may be helpful in predicting tumour grade, vascular involvement, nodal and liver metastases in panNENs. ADC entropy and ADC kurtosis are the most accurate parameters for identification of panNENs with malignant behaviour. • Whole-tumour ADC histogram analysis can predict aggressiveness in pancreatic neuroendocrine neoplasms. • ADC entropy and kurtosis are higher in aggressive tumours. • ADC histogram analysis can quantify tumour diffusion heterogeneity. • Non-invasive quantification of tumour heterogeneity can provide adjunctive information for prognostication.

  5. International Development Aid Allocation Determinants

    OpenAIRE

    Tapas Mishra; Bazoumana Ouattara; Mamata Parhi

    2012-01-01

    This paper investigates the factors explaining aid allocation by bilateral and multilateral donors. We use data for 146 aid recipient countries over the period 1990-2007 and employ the Bayesian Averaging of Classical Estimates (BACE) approach, finding that both the recipient need and donor interest motives are 'significant' determinants of the bilateral and multilateral aid allocation process. Our results also indicate that the measures of recipient need and donor interest vary from bilate...

  6. Application of an allocation methodology

    International Nuclear Information System (INIS)

    Youngblood, R.

    1989-01-01

    This paper presents a method for allocating resources to elements of a system for the purpose of achieving prescribed levels of defense-in-depth at minimal cost. The method makes extensive use of logic modelling. An analysis of a simplified high-level waste repository is used as an illustrative application of the method. It is shown that it is possible to allocate quality control costs (or demonstrate performance) in an optimal way over elements of a conceptual design

  7. How should INGOs allocate resources?

    Directory of Open Access Journals (Sweden)

    Scott Wisor

    2012-02-01

    Full Text Available International Non-governmental Organizations (INGOs) face difficult choices when allocating resources. Given that the resources made available to INGOs fall far short of what is needed to reduce massive human rights deficits, any chosen scheme of resource allocation requires failing to reach other individuals in great need. Facing these moral opportunity costs, what moral reasons should guide INGO resource allocation? Two reasons that clearly matter, and are recognized by philosophers and development practitioners, are the consequences (or benefit or harm reduction) of any given resource allocation and the need (or priority) of individual beneficiaries. If accepted, these reasons should lead INGOs to allocate resources to a limited number of countries where the most prioritarian-weighted harm reduction will be achieved. I raise three critiques of this view. First, on grounds the consequentialist accepts, I argue that INGOs ought to maintain a reasonably wide distribution of resources. Second, I argue that even if one is a consequentialist, consequentialism ought not act as an action-guiding principle for INGOs. Third, I argue that additional moral reasons should influence decision making about INGO resource allocation. Namely, INGO decision making should attend to relational reasons, desert, respect for agency, concern for equity, and the importance of expressing a view of moral wrongs.

  8. DSP+FPGA-based real-time histogram equalization system of infrared image

    Science.gov (United States)

    Gu, Dongsheng; Yang, Nansheng; Pi, Defu; Hua, Min; Shen, Xiaoyan; Zhang, Ruolan

    2001-10-01

    Histogram modification is a simple but effective method to enhance an infrared image. Several methods exist to equalize an infrared image's histogram, reflecting the different characteristics of different infrared images: the traditional HE (Histogram Equalization) method, and the improved HP (Histogram Projection) and PE (Plateau Equalization) methods, among others. To realize these methods in a single system, the system must have a large amount of memory and extremely high speed. In our system, we introduce DSP + FPGA based real-time processing technology to do these things together. The FPGA realizes the part common to these methods while the DSP handles the parts that differ. The choice of method and its parameters can be input from a keyboard or a computer. By this means, the system is powerful yet easy to operate and maintain. In this article, we give the diagram of the system and the software flow chart of the methods, and at the end we show an infrared image and its histogram before and after processing with the HE method.
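
    Of the variants mentioned, plateau equalization is the easiest to show compactly (a minimal sketch assuming 8-bit integer grey levels; the plateau value is a free parameter: plateau=1 reduces the mapping to histogram projection, and a plateau above the histogram maximum reduces it to plain HE):

```python
# Plateau equalization: clip the histogram at a plateau before the cumulative
# mapping, so large uniform backgrounds do not swamp the grey-level range.
import numpy as np

def plateau_equalize(img: np.ndarray, plateau: int, levels: int = 256) -> np.ndarray:
    hist, _ = np.histogram(img, bins=levels, range=(0, levels))
    cdf = np.cumsum(np.minimum(hist, plateau)).astype(float)
    lut = np.round((levels - 1) * cdf / cdf[-1]).astype(img.dtype)
    return lut[img]          # img assumed integer-valued in [0, levels)
```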

  9. Particle swarm optimization-based local entropy weighted histogram equalization for infrared image enhancement

    Science.gov (United States)

    Wan, Minjie; Gu, Guohua; Qian, Weixian; Ren, Kan; Chen, Qian; Maldague, Xavier

    2018-06-01

    Infrared image enhancement plays a significant role in intelligent urban surveillance systems for smart city applications. Unlike existing methods that only exaggerate the global contrast, we propose a particle swarm optimization-based local entropy weighted histogram equalization which involves the enhancement of both local details and fore- and background contrast. First of all, a novel local entropy weighted histogram depicting the distribution of detail information is calculated based on a modified hyperbolic tangent function. Then, the histogram is divided into two parts via a threshold maximizing the inter-class variance in order to improve the contrast of foreground and background, respectively. To avoid over-enhancement and noise amplification, double plateau thresholds of the presented histogram are formulated by means of a particle swarm optimization algorithm. Lastly, each sub-image is equalized independently according to the constrained sub-local entropy weighted histogram. Comparative experiments implemented on real infrared images show that our algorithm outperforms other state-of-the-art methods in terms of both visual and quantized evaluations.
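
    The "threshold maximizing the inter-class variance" is Otsu's classical criterion, which operates purely on the histogram (a standard implementation sketch; the PSO search for the double plateau thresholds is beyond this snippet):

```python
# Otsu's threshold: maximize between-class variance over all histogram splits.
import numpy as np

def otsu_threshold(hist: np.ndarray) -> int:
    p = hist / hist.sum()
    bins = np.arange(len(hist))
    w0 = np.cumsum(p)                  # probability of the lower class
    mu = np.cumsum(p * bins)           # cumulative first moment
    w1 = 1.0 - w0
    valid = (w0 > 0) & (w1 > 0)
    between = np.zeros_like(w0)
    between[valid] = (mu[-1] * w0[valid] - mu[valid]) ** 2 / (w0[valid] * w1[valid])
    return int(np.argmax(between))
```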

  10. Histogram Curve Matching Approaches for Object-based Image Classification of Land Cover and Land Use

    Science.gov (United States)

    Toure, Sory I.; Stow, Douglas A.; Weeks, John R.; Kumar, Sunil

    2013-01-01

    The classification of image-objects is usually done using parametric statistical measures of central tendency and/or dispersion (e.g., mean or standard deviation). The objectives of this study were to analyze digital number histograms of image objects and evaluate classification measures exploiting characteristic signatures of such histograms. Two histogram-matching classifiers were evaluated and compared to the standard nearest neighbor to mean classifier. An ADS40 airborne multispectral image of San Diego, California was used for assessing the utility of curve matching classifiers in a geographic object-based image analysis (GEOBIA) approach. The classifications were performed with data sets having 0.5 m, 2.5 m, and 5 m spatial resolutions. Results show that histograms are reliable features for characterizing classes. Also, both histogram-matching classifiers consistently performed better than the one based on the standard nearest neighbor to mean rule. The highest classification accuracies were produced with images having 2.5 m spatial resolution. PMID:24403648
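
    The contrast between the two classifier families can be sketched in a few lines (histogram intersection is used here as one plausible matching score; the paper's exact matching measures are not named in the abstract):

```python
# Nearest neighbor to (class) mean versus histogram curve matching.
import numpy as np

def nearest_mean_label(obj_values, class_means):
    """class_means: dict {label: scalar mean}; classify by the closest mean."""
    m = obj_values.mean()
    return min(class_means, key=lambda c: abs(m - class_means[c]))

def histogram_match_label(obj_hist, class_hists):
    """class_hists: dict {label: normalized histogram}; larger intersection wins."""
    return max(class_hists, key=lambda c: np.minimum(obj_hist, class_hists[c]).sum())
```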

  11. Histogram Analysis of Diffusion Tensor Imaging Parameters in Pediatric Cerebellar Tumors.

    Science.gov (United States)

    Wagner, Matthias W; Narayan, Anand K; Bosemani, Thangamadhan; Huisman, Thierry A G M; Poretti, Andrea

    2016-05-01

    Apparent diffusion coefficient (ADC) values have been shown to assist in differentiating cerebellar pilocytic astrocytomas and medulloblastomas. Previous studies have applied only ADC measurements and calculated the mean/median values. Here we investigated the value of diffusion tensor imaging (DTI) histogram characteristics of the entire tumor for differentiation of cerebellar pilocytic astrocytomas and medulloblastomas. Presurgical DTI data were analyzed with a region of interest (ROI) approach to include the entire tumor. For each tumor, histogram-derived metrics including the 25th percentile, 75th percentile, and skewness were calculated for fractional anisotropy (FA) and mean (MD), axial (AD), and radial (RD) diffusivity. The histogram metrics were used as primary predictors of interest in a logistic regression model. The histogram skewness showed statistically significant differences for MD between low- and high-grade tumors (P = .008). The 25th percentile for MD yields the best results for the presurgical differentiation between pediatric cerebellar pilocytic astrocytomas and medulloblastomas. The analysis of other DTI metrics does not provide additional diagnostic value. Our study confirms the diagnostic value of the quantitative histogram analysis of DTI data in pediatric neuro-oncology. Copyright © 2015 by the American Society of Neuroimaging.

  12. Value of MR histogram analyses for prediction of microvascular invasion of hepatocellular carcinoma.

    Science.gov (United States)

    Huang, Ya-Qin; Liang, He-Yue; Yang, Zhao-Xia; Ding, Ying; Zeng, Meng-Su; Rao, Sheng-Xiang

    2016-06-01

    The objective is to explore the value of preoperative magnetic resonance (MR) histogram analyses in predicting microvascular invasion (MVI) of hepatocellular carcinoma (HCC). Fifty-one patients with histologically confirmed HCC who underwent diffusion-weighted and contrast-enhanced MR imaging were included. Histogram analyses were performed and the mean, variance, skewness, kurtosis, and 1st, 10th, 50th, 90th, and 99th percentiles were derived. Quantitative histogram parameters were compared between HCCs with and without MVI. Receiver operating characteristic (ROC) analyses were generated to compare the diagnostic performance of tumor size, histogram analyses of apparent diffusion coefficient (ADC) maps, and MR enhancement. The mean, 1st, 10th, and 50th percentiles of the ADC maps, and the mean, variance, and 1st, 10th, 50th, 90th, and 99th percentiles of the portal venous phase (PVP) images were significantly different between the groups with and without MVI. MR histogram analyses, in particular the 1st percentile for PVP images, held promise for prediction of MVI of HCC.

  13. Improved Steganographic Method Preserving Pixel-Value Differencing Histogram with Modulus Function

    Directory of Open Access Journals (Sweden)

    Lee Hae-Yeoun

    2010-01-01

    Full Text Available We herein advance a secure steganographic algorithm that uses a turnover policy and a novel adjusting process. Although the method of Wang et al. uses Pixel-Value Differencing (PVD) and their modulus function provides high capacity and good image quality, the embedding process causes a number of artifacts, such as abnormal increases and fluctuations in the PVD histogram, which may reveal the existence of the hidden message. In order to enhance the security of the algorithm, a turnover policy is used that prevents abnormal increases in the histogram values and a novel adjusting process is devised to remove the fluctuations at the border of the subrange in the PVD histogram. The proposed method therefore eliminates all the weaknesses of the PVD steganographic methods thus far proposed and guarantees secure communication. In the experiments described herein, the proposed algorithm is compared with other PVD steganographic algorithms by using well-known steganalysis techniques, such as RS-analysis, steganalysis for LSB matching, and histogram-based attacks. The results support our contention that the proposed method enhances security by keeping the PVD histogram similar to the cover, while also providing high embedding capacity and good imperceptibility to the naked eye.
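
    The statistic whose shape the method must preserve is the PVD histogram itself, which is easy to compute for inspection (one common pairing convention, non-overlapping horizontal pairs, is assumed here):

```python
# Pixel-value differencing (PVD) histogram: distribution of grey-level
# differences over non-overlapping horizontal pixel pairs.
import numpy as np

def pvd_histogram(img: np.ndarray) -> np.ndarray:
    left = img[:, 0:-1:2].astype(int)
    right = img[:, 1::2].astype(int)
    diffs = (right - left).ravel()
    hist, _ = np.histogram(diffs, bins=np.arange(-255, 257))   # bins -255..255
    return hist
```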

  15. Histogram analysis of noise performance on fractional anisotropy brain MR image with different diffusion gradient numbers

    International Nuclear Information System (INIS)

    Chang, Yong Min; Kim, Yong Sun; Kang, Duk Sik; Lee, Young Joo; Sohn, Chul Ho; Woo, Seung Koo; Suh, Kyung Jin

    2005-01-01

    We wished to analyze, qualitatively and quantitatively, the noise performance of fractional anisotropy brain images as a function of the number of diffusion gradient directions by using the histogram method. Diffusion tensor images were acquired using a 3.0 T MR scanner from ten normal volunteers who had no neurological symptoms. Single-shot spin-echo EPI with a Stejskal-Tanner type diffusion gradient scheme was employed for the diffusion tensor measurement. With a b-value of 1000 s/mm2, the diffusion tensor images were obtained for 6, 11, 23, 35 and 47 diffusion gradient directions. FA images were generated for each DTI scheme. Histograms were then obtained at selected ROIs for the anatomical structures on the FA image. At the same ROI location, the mean FA value and the standard deviation of the mean FA value were calculated. The quality of the FA image improved as the number of diffusion gradient directions increased, showing better contrast between the WM and GM. The histogram showed that the variance of FA values was reduced as the number of diffusion gradient directions increased. This histogram analysis was in good agreement with the result obtained using quantitative analysis. The image quality of the FA map was significantly improved as the number of diffusion gradient directions increased. The histogram analysis demonstrated well that the improvement in the FA images resulted from the reduction in the variance of the FA values included in the ROI.
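
    For reference, the FA value whose per-ROI histogram is analyzed above follows directly from the diffusion tensor eigenvalues (the standard definition, vectorized over an eigenvalue array):

```python
# Fractional anisotropy: FA = sqrt(3/2 * sum((l_i - l_mean)^2) / sum(l_i^2)).
import numpy as np

def fractional_anisotropy(ev: np.ndarray) -> np.ndarray:
    """ev: (..., 3) array of diffusion tensor eigenvalues."""
    md = ev.mean(axis=-1, keepdims=True)              # mean diffusivity
    num = np.sum((ev - md) ** 2, axis=-1)
    den = np.maximum(np.sum(ev ** 2, axis=-1), 1e-20)
    return np.sqrt(1.5 * num / den)
```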

  16. Genetic patterns in European Geometrid Moths revealed by the Barcode Index Number (BIN) System

    NARCIS (Netherlands)

    Hausmann, A.; Godfray, H.C.J.; Huemer, J.; Mutane, M.; Rougerie, R.; Nieukerken, van E.J.; Ratnasingham, S.; Hebert, P.D.N.

    2013-01-01

    Background: The geometrid moths of Europe are one of the best investigated insect groups in traditional taxonomy making them an ideal model group to test the accuracy of the Barcode Index Number (BIN) system of BOLD (Barcode of Life Datasystems), a method that supports automated, rapid species

  17. Exploiting jet binning to identify the initial state of high-mass resonances

    Science.gov (United States)

    Ebert, Markus A.; Liebler, Stefan; Moult, Ian; Stewart, Iain W.; Tackmann, Frank J.; Tackmann, Kerstin; Zeune, Lisa

    2016-09-01

    If a new high-mass resonance is discovered at the Large Hadron Collider, model-independent techniques to identify the production mechanism will be crucial to understand its nature and effective couplings to Standard Model particles. We present a powerful and model-independent method to infer the initial state in the production of any high-mass color-singlet system by using a tight veto on accompanying hadronic jets to divide the data into two mutually exclusive event samples (jet bins). For a resonance of several hundred GeV, the jet binning cut needed to discriminate quark and gluon initial states is in the experimentally accessible range of several tens of GeV. It also yields comparable cross sections for both bins, making this method viable already with the small event samples available shortly after a discovery. Theoretically, the method is made feasible by utilizing an effective field theory setup to compute the jet cut dependence precisely and model independently and to systematically control all sources of theoretical uncertainties in the jet binning, as well as their correlations. We use a 750 GeV scalar resonance as an example to demonstrate the viability of our method.

  19. The effect of stocking density and bin feeder space on performance ...

    African Journals Online (AJOL)

    Unknown

    Pigs housed individually have been shown to have higher feed intakes ...

  20. The effect of stocking density and bin feeder space on performance ...

    African Journals Online (AJOL)

    The effect of stocking density and bin feeder space on performance in pigs. G.A. Lavers, N.S. Ferguson. Abstract. (South African J of Animal Science, 2000, 30, Supplement 1: 70-71).

  1. [Analysis of the optimization level of a genetic algorithm under the Hardy-Weinberg equilibrium law on the bin packing problem]

    Directory of Open Access Journals (Sweden)

    Terry Noviar Panggabean

    2016-08-01

    Full Text Available Abstract—Because combinatorial optimization problems are abstract representations of real decision-making systems in everyday life, they are generally very difficult to solve. The bin packing problem is a classic combinatorial optimization problem, in which a finite set of objects must be packed into bins as efficiently as possible. A series of hybrid approaches has been developed to solve the bin packing problem. Metaheuristics are high-level approaches that guide the modification of other heuristic methods in search of better optimization levels. The genetic algorithm is one such metaheuristic method, used to solve a wide range of optimization problems. Genetic algorithms come in many variants. This study presents the taxonomy of parallel genetic algorithms, which outperform conventional genetic algorithms in performance and scalability, but are suitable only for heterogeneous computer networks and distributed systems. Based on previous research and the description above, the authors investigate how to apply the Hardy-Weinberg equilibrium law from biology to a genetic algorithm and analyze its optimization level on the bin packing problem. Keywords—Genetic Algorithm, Hardy-Weinberg, Bin Packing Problem.
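    A minimal Python sketch of the bin packing problem discussed above, using the classic first-fit decreasing heuristic as the kind of baseline a genetic algorithm would try to improve on; the item sizes and bin capacity below are made up for illustration.

        def first_fit_decreasing(sizes, capacity):
            """Pack items into as few bins as possible (a heuristic, not optimal)."""
            bins = []  # each bin is a list of item sizes
            for size in sorted(sizes, reverse=True):
                for b in bins:
                    if sum(b) + size <= capacity:
                        b.append(size)  # item fits into an already-open bin
                        break
                else:
                    bins.append([size])  # no bin fits: open a new one
            return bins

        items = [4, 8, 1, 4, 2, 1, 7, 3, 6]
        print(first_fit_decreasing(items, capacity=10))  # packs into 4 bins here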

  2. Research and Development of a New Waste Collection Bin to Facilitate Education in Plastic Recycling

    Science.gov (United States)

    Chow, Cheuk-fai; So, Wing-Mui Winnie; Cheung, Tsz-Yan

    2016-01-01

    Plastic recycling has been an alternative method for solid waste management apart from landfill and incineration. However, recycling quality is affected when all plastics are discarded into a single recycling bin that increases cross contaminations and operation cost to the recycling industry. Following the engineering design process, a new…

  3. The Peril of Hasty Triumphalism and Osama bin Laden’s Death

    Directory of Open Access Journals (Sweden)

    Eugenio Lilli

    2011-05-01

    Full Text Available On May 1, 2011 the headlines of a large number of newspapers and TV channels around the world were saying “justice has been done”. Those were the words used by the US President Barack Obama to announce to the world the killing of Osama bin Laden, the number one terrorist on the US most-wanted list.

  4. Effects of Number and Location of Bins on Plastic Recycling at a University

    Science.gov (United States)

    O'Connor, Ryan T.; Lerman, Dorothea C.; Fritz, Jennifer N.; Hodde, Henry B.

    2010-01-01

    The proportion of plastic bottles that consumers placed in appropriate recycling receptacles rather than trash bins was examined across 3 buildings on a university campus. We extended previous research on interventions to increase recycling by controlling the number of recycling receptacles across conditions and by examining receptacle location…

  5. Monitoring household waste recycling centres performance using mean bin weight analyses.

    Science.gov (United States)

    Maynard, Sarah; Cherrett, Tom; Waterson, Ben

    2009-02-01

    This paper describes a modelling approach used to investigate the significance of key factors (vehicle type, compaction type, site design, temporal effects) in influencing the variability in observed nett amenity bin weights produced by household waste recycling centres (HWRCs). This new method can help to quickly identify sites that are producing significantly lighter bins, enabling detailed back-end analyses to be efficiently targeted and best practice in HWRC operation identified. Tested on weigh ticket data from nine HWRCs across West Sussex, UK, the model suggests that compaction technique, vehicle type, month and site design explained 76% of the variability in the observed nett amenity weights. For each factor, a weighting coefficient was calculated to generate a predicted nett weight for each bin transaction and three sites were subsequently identified as having similar characteristics but returned significantly different mean nett bin weights. Waste and site audits were then conducted at the three sites to try and determine the possible sources of the remaining variability. Significant differences were identified in the proportions of contained waste (bagged), wood, and dry recyclables entering the amenity waste stream, particularly at one site where significantly less contaminated waste and dry recyclables were observed.

  6. Markov chain Monte Carlo linkage analysis: effect of bin width on the probability of linkage.

    Science.gov (United States)

    Slager, S L; Juo, S H; Durner, M; Hodge, S E

    2001-01-01

    We analyzed part of the Genetic Analysis Workshop (GAW) 12 simulated data using Monte Carlo Markov chain (MCMC) methods that are implemented in the computer program Loki. The MCMC method reports the "probability of linkage" (PL) across the chromosomal regions of interest. The point of maximum PL can then be taken as a "location estimate" for the location of the quantitative trait locus (QTL). However, Loki does not provide a formal statistical test of linkage. In this paper, we explore how the bin width used in the calculations affects the max PL and the location estimate. We analyzed age at onset (AO) and quantitative trait number 5, Q5, from 26 replicates of the general simulated data in one region where we knew a major gene, MG5, is located. For each trait, we found the max PL and the corresponding location estimate, using four different bin widths. We found that bin width, as expected, does affect the max PL and the location estimate, and we recommend that users of Loki explore how their results vary with different bin widths.

  7. BinQuasi: a peak detection method for ChIP-sequencing data with biological replicates.

    Science.gov (United States)

    Goren, Emily; Liu, Peng; Wang, Chao; Wang, Chong

    2018-04-19

    ChIP-seq experiments that are aimed at detecting DNA-protein interactions require biological replication to draw inferential conclusions; however, there is no current consensus on how to analyze ChIP-seq data with biological replicates. Very few methodologies exist for the joint analysis of replicated ChIP-seq data, with approaches ranging from combining the results of analyzing replicates individually to joint modeling of all replicates. Combining the results of individual replicates analyzed separately can lead to reduced peak classification performance compared to joint modeling. Currently available methods for joint analysis may fail to control the false discovery rate at the nominal level. We propose BinQuasi, a peak caller for replicated ChIP-seq data, that jointly models biological replicates using a generalized linear model framework and employs a one-sided quasi-likelihood ratio test to detect peaks. When applied to simulated data and real datasets, BinQuasi performs favorably compared to existing methods, including better control of false discovery rate than existing joint modeling approaches. BinQuasi offers a flexible approach to joint modeling of replicated ChIP-seq data which is preferable to combining the results of replicates analyzed individually. Source code is freely available for download at https://cran.r-project.org/package=BinQuasi, implemented in R. pliu@iastate.edu or egoren@iastate.edu. Supplementary material is available at Bioinformatics online.

  8. Effects of Outside Air Temperature on Movement of Phosphine Gas in Concrete Elevator Bins

    Science.gov (United States)

    Studies that measured the movement and concentration of phosphine gas in upright concrete bins over time indicated that fumigant movement was dictated by air currents, which in turn, were a function of the difference between the average grain temperature and the average outside air temperature during…

  9. Infrared face recognition based on LBP histogram and KW feature selection

    Science.gov (United States)

    Xie, Zhihua

    2014-07-01

    The conventional LBP-based feature, as represented by the local binary pattern (LBP) histogram, still has room for performance improvements. This paper focuses on the dimension reduction of LBP micro-patterns and proposes an improved infrared face recognition method based on LBP histogram representation. To extract the local robust features in infrared face images, LBP is chosen to get the composition of micro-patterns of sub-blocks. Based on statistical test theory, the Kruskal-Wallis (KW) feature selection method is proposed to get the LBP patterns which are suitable for infrared face recognition. The experimental results show that combining LBP with KW feature selection improves the performance of infrared face recognition; the proposed method outperforms the traditional methods based on the LBP histogram, discrete cosine transform (DCT) or principal component analysis (PCA).
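    The following minimal sketch illustrates this pipeline, assuming grayscale face images as NumPy arrays and one class label per image; the LBP settings and the number of retained bins are illustrative choices, not the authors' exact parameters.

        import numpy as np
        from scipy.stats import kruskal
        from skimage.feature import local_binary_pattern

        def lbp_histogram(image, P=8, R=1):
            """Non-rotation-invariant uniform LBP histogram (59 bins for P=8)."""
            codes = local_binary_pattern(image, P, R, method="nri_uniform")
            n_bins = P * (P - 1) + 3
            hist, _ = np.histogram(codes, bins=n_bins, range=(0, n_bins), density=True)
            return hist

        def kw_select(histograms, labels, n_keep=20):
            """Rank LBP bins by the Kruskal-Wallis H statistic across classes."""
            histograms, labels = np.asarray(histograms), np.asarray(labels)
            scores = []
            for j in range(histograms.shape[1]):
                groups = [histograms[labels == c, j] for c in np.unique(labels)]
                scores.append(kruskal(*groups)[0])
            return np.argsort(scores)[::-1][:n_keep]  # most discriminative bins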

  10. [Analysis of underwater image quality enhancement based on gamma correction and histogram equalization]

    Directory of Open Access Journals (Sweden)

    Aria Hendrawan

    2016-11-01

    Full Text Available Underwater images are dark and of low quality, depending on the depth of the water at the time of image acquisition. This poor image quality adversely affects the matching of underwater image pairs with the SIFT algorithm. This research uses Gamma Correction and Histogram Equalization as image preprocessing methods to improve the quality of underwater images. The results showed a 27.76% improvement when using Gamma Correction and Histogram Equalization preprocessing compared with no image enhancement. A paired t-test rejected the null hypothesis, indicating a significant difference between applying Gamma Correction with Histogram Equalization and applying no image enhancement.
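    A minimal sketch of the two preprocessing steps named above, operating on an 8-bit grayscale frame stored as a uint8 NumPy array; the gamma value is an assumption.

        import numpy as np

        def gamma_correction(img, gamma=0.6):
            """Brighten dark underwater frames: out = 255 * (in / 255) ** gamma."""
            lut = (255.0 * (np.arange(256) / 255.0) ** gamma).astype(np.uint8)
            return lut[img]

        def histogram_equalization(img):
            """Spread the intensity histogram using its cumulative distribution."""
            hist = np.bincount(img.ravel(), minlength=256)
            cdf = hist.cumsum() / img.size
            lut = np.round(255.0 * cdf).astype(np.uint8)
            return lut[img]

        # enhanced = histogram_equalization(gamma_correction(raw_frame))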

  11. A novel JPEG steganography method based on modulus function with histogram analysis

    Directory of Open Access Journals (Sweden)

    V. Banoci

    2012-06-01

    Full Text Available In this paper, we present a novel steganographic method for embedding secret data in still grayscale JPEG images. In order to provide a large capacity while maintaining good visual quality of the stego-image, the embedding process is performed in the quantized transform coefficients of the discrete cosine transform (DCT) by modifying coefficients according to a modulo function, which gives the steganography system a blind extraction capability. The after-embedding histogram of the proposed Modulo Histogram Fitting (MHF) method is analyzed to secure the steganography system against steganalysis attacks. In addition, AES ciphering was implemented to increase security and improve the after-embedding histogram characteristics of the proposed steganography system, as the experimental results show.
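    A rough sketch of modulus-based embedding in quantized DCT coefficients (assumed to be an integer NumPy array), only to illustrate the principle named above; the published MHF method, its modulus choice, and its AES layer are more involved than this.

        import numpy as np

        def embed_bits(coeffs, bits, modulus=2):
            """Nudge each nonzero coefficient so that c mod modulus encodes one bit."""
            out, k = coeffs.copy(), 0
            for i, c in enumerate(out):
                if c == 0 or k >= len(bits):  # zero coefficients are skipped
                    continue
                shift = bits[k] - (c % modulus)
                out[i] = c + shift
                if out[i] == 0:               # never create a new zero
                    out[i] += modulus         # adding the modulus keeps the residue
                k += 1
            return out

        def extract_bits(coeffs, n, modulus=2):
            """Blind extraction: read the residues of the nonzero coefficients."""
            return [int(c % modulus) for c in coeffs if c != 0][:n]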

  12. The binning of metagenomic contigs for microbial physiology of mixed cultures.

    Science.gov (United States)

    Strous, Marc; Kraft, Beate; Bisdorf, Regina; Tegetmeyer, Halina E

    2012-01-01

    So far, microbial physiology has dedicated itself mainly to pure cultures. In nature, cross feeding and competition are important aspects of microbial physiology and these can only be addressed by studying complete communities such as enrichment cultures. Metagenomic sequencing is a powerful tool to characterize such mixed cultures. In the analysis of metagenomic data, well established algorithms exist for the assembly of short reads into contigs and for the annotation of predicted genes. However, the binning of the assembled contigs or unassembled reads is still a major bottleneck and required to understand how the overall metabolism is partitioned over different community members. Binning consists of the clustering of contigs or reads that apparently originate from the same source population. In the present study eight metagenomic samples from the same habitat, a laboratory enrichment culture, were sequenced. Each sample contained 13-23 Mb of assembled contigs and up to eight abundant populations. Binning was attempted with existing methods but they were found to produce poor results, were slow, dependent on non-standard platforms or produced errors. A new binning procedure was developed based on multivariate statistics of tetranucleotide frequencies combined with the use of interpolated Markov models. Its performance was evaluated by comparison of the results between samples with BLAST and in comparison to existing algorithms for four publicly available metagenomes and one previously published artificial metagenome. The accuracy of the new approach was comparable or higher than existing methods. Further, it was up to 100 times faster. It was implemented in Java Swing as a complete open source graphical binning application available for download and further development (http://sourceforge.net/projects/metawatt).
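    A minimal sketch of tetranucleotide-frequency binning in the spirit of the approach above; the published tool also uses interpolated Markov models, which are omitted here, and the k-means clustering step is an assumed stand-in for its multivariate statistics.

        import numpy as np
        from itertools import product
        from sklearn.cluster import KMeans

        KMERS = ["".join(p) for p in product("ACGT", repeat=4)]  # 256 tetranucleotides
        INDEX = {k: i for i, k in enumerate(KMERS)}

        def tetranucleotide_profile(contig):
            """Normalized 4-mer frequency vector of one contig (uppercase string)."""
            counts = np.zeros(len(KMERS))
            for i in range(len(contig) - 3):
                j = INDEX.get(contig[i:i + 4])
                if j is not None:  # windows containing N are skipped
                    counts[j] += 1
            return counts / max(counts.sum(), 1.0)

        def bin_contigs(contigs, n_bins=8):
            profiles = np.array([tetranucleotide_profile(c) for c in contigs])
            return KMeans(n_clusters=n_bins, n_init=10).fit_predict(profiles)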

  13. The binning of metagenomic contigs for microbial physiology of mixed cultures

    Directory of Open Access Journals (Sweden)

    Marc Strous

    2012-12-01

    Full Text Available So far, microbial physiology has dedicated itself mainly to pure cultures. In nature, cross feeding and competition are important aspects of microbial physiology and these can only be addressed by studying complete communities such as enrichment cultures. Metagenomic sequencing is a powerful tool to characterize such mixed cultures. In the analysis of metagenomic data, well established algorithms exist for the assembly of short reads into contigs and for the annotation of predicted genes. However, the binning of the assembled contigs or unassembled reads is still a major bottleneck and required to understand how the overall metabolism is partitioned over different community members. Binning consists of the clustering of contigs or reads that apparently originate from the same source population. In the present study eight metagenomic samples originating from the same habitat, a laboratory enrichment culture, were sequenced. Each sample contained 13-23 Mb of assembled contigs and up to eight abundant populations. Binning was attempted with existing methods but they were found to produce poor results, were slow, dependent on non-standard platforms or produced errors. A new binning procedure was developed based on multivariate statistics of tetranucleotide frequencies combined with the use of interpolated Markov models. Its performance was evaluated by comparison of the results between samples with BLAST and in comparison to existing algorithms for four publicly available metagenomes and one previously published artificial metagenome. The accuracy of the new approach was comparable or higher than existing methods. Further, it was up to a hundred times faster. It was implemented in Java Swing as a complete open source graphical binning application available for download and further development (http://sourceforge.net/projects/metawatt).

  14. Histogram analysis of apparent diffusion coefficient maps for differentiating primary CNS lymphomas from tumefactive demyelinating lesions.

    Science.gov (United States)

    Lu, Shan Shan; Kim, Sang Joon; Kim, Namkug; Kim, Ho Sung; Choi, Choong Gon; Lim, Young Min

    2015-04-01

    This study intended to investigate the usefulness of histogram analysis of apparent diffusion coefficient (ADC) maps for discriminating primary CNS lymphomas (PCNSLs), especially atypical PCNSLs, from tumefactive demyelinating lesions (TDLs). Forty-seven patients with PCNSLs and 18 with TDLs were enrolled in our study. Hyperintense lesions seen on T2-weighted images were defined as ROIs after ADC maps were registered to the corresponding T2-weighted image. ADC histograms were calculated from the ROIs containing the entire lesion on every section and on a voxel-by-voxel basis. The ADC histogram parameters were compared among all PCNSLs and TDLs as well as between the subgroup of atypical PCNSLs and TDLs. ROC curves were constructed to evaluate the diagnostic performance of the histogram parameters and to determine the optimum thresholds. The differences between the PCNSLs and TDLs were found in the minimum ADC values (ADCmin) and in the 5th and 10th percentiles (ADC5% and ADC10%) of the cumulative ADC histograms. However, no statistical significance was found in the mean ADC value or in the ADC value concerning the mode, kurtosis, and skewness. The ADCmin, ADC5%, and ADC10% were also lower in atypical PCNSLs than in TDLs. ADCmin was the best indicator for discriminating atypical PCNSLs from TDLs, with a threshold of 556×10⁻⁶ mm²/s (sensitivity, 81.3%; specificity, 88.9%). Histogram analysis of ADC maps may help to discriminate PCNSLs from TDLs and may be particularly useful in differentiating atypical PCNSLs from TDLs.
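    A minimal sketch of how the cumulative-histogram parameters compared above can be computed directly from the ADC values of the ROI voxels.

        import numpy as np

        def adc_histogram_parameters(roi_voxels):
            v = np.asarray(roi_voxels, dtype=float)
            return {
                "ADCmin": v.min(),
                "ADC5%": np.percentile(v, 5),
                "ADC10%": np.percentile(v, 10),
                "ADCmean": v.mean(),
            }

        # Example decision rule using the study's reported ADCmin threshold
        # (ADC in units of 10^-6 mm^2/s): values below 556 suggest atypical PCNSL.
        # is_pcnsl = adc_histogram_parameters(roi)["ADCmin"] < 556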

  15. Development of a new bin filler for apple harvesting and infield sorting with a review of existing technologies

    Science.gov (United States)

    The bin filler, which receives apples from the sorting system and then places them in the bin evenly without causing bruise damage, plays a critical role for the self-propelled apple harvest and infield sorting (HIS) machine that is being developed in our laboratory. Two major technical challenges ...

  16. Evaluation of five sampling methods for Liposcelis entomophila (Enderlein) and L. decolor (Pearman) (Psocoptera: Liposcelididae) in steel bins containing wheat

    Science.gov (United States)

    An evaluation of five sampling methods for studying psocid population levels was conducted in two steel bins containing 32.6 metric tonnes of wheat in Manhattan, KS. Psocids were sampled using a 1.2-m open-ended trier, corrugated cardboard refuges placed on the underside of the bin hatch or the surf...

  17. Cost allocation in distribution planning

    Energy Technology Data Exchange (ETDEWEB)

    Engevall, S

    1997-12-31

    This thesis concerns cost allocation problems in distribution planning. The cost allocation problems we study are illustrated using the distribution planning situation at the Logistics department of Norsk Hydro Olje AB. The planning situation is modeled as a Traveling Salesman Problem and a Vehicle Routing Problem with an inhomogeneous fleet. The cost allocation problems are the problems of how to divide the transportation costs among the customers served in each problem. The cost allocation problems are formulated as cooperative games, in characteristic function form, where the customers are defined to be the players. The games contain five and 21 players respectively. Game theoretical solution concepts such as the core, the nucleolus, the Shapley value and the τ-value are discussed. From the empirical results we can, among other things, conclude that the core of the Traveling Salesman Game is large, and that the core of the Vehicle Routing Game is empty. In the accounting of Norsk Hydro the cost per m³ can be found for each tour. We conclude that for a certain definition of the characteristic function, a cost allocation according to this principle will not be included in the core of the Traveling Salesman Game. The models and methods presented in this thesis can be applied to transportation problems similar to that of Norsk Hydro, independent of the type of products that are delivered. 96 refs, 11 figs, 26 tabs

  18. Cost allocation in distribution planning

    Energy Technology Data Exchange (ETDEWEB)

    Engevall, S.

    1996-12-31

    This thesis concerns cost allocation problems in distribution planning. The cost allocation problems we study are illustrated using the distribution planning situation at the Logistics department of Norsk Hydro Olje AB. The planning situation is modeled as a Traveling Salesman Problem and a Vehicle Routing Problem with an inhomogeneous fleet. The cost allocation problems are the problems of how to divide the transportation costs among the customers served in each problem. The cost allocation problems are formulated as cooperative games, in characteristic function form, where the customers are defined to be the players. The games contain five and 21 players respectively. Game theoretical solution concepts such as the core, the nucleolus, the Shapley value and the τ-value are discussed. From the empirical results we can, among other things, conclude that the core of the Traveling Salesman Game is large, and that the core of the Vehicle Routing Game is empty. In the accounting of Norsk Hydro the cost per m³ can be found for each tour. We conclude that for a certain definition of the characteristic function, a cost allocation according to this principle will not be included in the core of the Traveling Salesman Game. The models and methods presented in this thesis can be applied to transportation problems similar to that of Norsk Hydro, independent of the type of products that are delivered. 96 refs, 11 figs, 26 tabs

  19. Cost allocation in distribution planning

    International Nuclear Information System (INIS)

    Engevall, S.

    1996-01-01

    This thesis concerns cost allocation problems in distribution planning. The cost allocation problems we study are illustrated using the distribution planning situation at the Logistics department of Norsk Hydro Olje AB. The planning situation is modeled as a Traveling Salesman Problem and a Vehicle Routing Problem with an inhomogeneous fleet. The cost allocation problems are the problems of how to divide the transportation costs among the customers served in each problem. The cost allocation problems are formulated as cooperative games, in characteristic function form, where the customers are defined to be the players. The games contain five and 21 players respectively. Game theoretical solution concepts such as the core, the nucleolus, the Shapley value and the τ-value are discussed. From the empirical results we can, among other things, conclude that the core of the Traveling Salesman Game is large, and that the core of the Vehicle Routing Game is empty. In the accounting of Norsk Hydro the cost per m³ can be found for each tour. We conclude that for a certain definition of the characteristic function, a cost allocation according to this principle will not be included in the core of the Traveling Salesman Game. The models and methods presented in this thesis can be applied to transportation problems similar to that of Norsk Hydro, independent of the type of products that are delivered. 96 refs, 11 figs, 26 tabs
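    As a small illustration of one solution concept discussed above, the following sketch computes the Shapley value of a three-player cost game from all orderings of the players; the cost function is invented for illustration, not taken from the thesis.

        from itertools import permutations

        def shapley(players, cost):
            """cost maps a frozenset of players to that coalition's stand-alone cost."""
            phi = {p: 0.0 for p in players}
            orders = list(permutations(players))
            for order in orders:
                coalition = frozenset()
                for p in order:
                    phi[p] += cost[coalition | {p}] - cost[coalition]  # marginal cost
                    coalition = coalition | {p}
            return {p: v / len(orders) for p, v in phi.items()}

        cost = {frozenset(): 0, frozenset("A"): 6, frozenset("B"): 7, frozenset("C"): 8,
                frozenset("AB"): 9, frozenset("AC"): 10, frozenset("BC"): 11,
                frozenset("ABC"): 12}
        print(shapley("ABC", cost))  # allocations sum to the grand-coalition cost, 12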

  20. Histogram-based normalization technique on human brain magnetic resonance images from different acquisitions.

    Science.gov (United States)

    Sun, Xiaofei; Shi, Lin; Luo, Yishan; Yang, Wei; Li, Hongpeng; Liang, Peipeng; Li, Kuncheng; Mok, Vincent C T; Chu, Winnie C W; Wang, Defeng

    2015-07-28

    Intensity normalization is an important preprocessing step in brain magnetic resonance image (MRI) analysis. During MR image acquisition, different scanners or parameters would be used for scanning different subjects or the same subject at a different time, which may result in large intensity variations. This intensity variation will greatly undermine the performance of subsequent MRI processing and population analysis, such as image registration, segmentation, and tissue volume measurement. In this work, we proposed a new histogram normalization method to reduce the intensity variation between MRIs obtained from different acquisitions. In our experiment, we scanned each subject twice on two different scanners using different imaging parameters. With noise estimation, the image with lower noise level was determined and treated as the high-quality reference image. Then the histogram of the low-quality image was normalized to the histogram of the high-quality image. The normalization algorithm includes two main steps: (1) intensity scaling (IS), where, for the high-quality reference image, the intensities of the image are first rescaled to a range between the low intensity region (LIR) value and the high intensity region (HIR) value; and (2) histogram normalization (HN), where the histogram of the low-quality image as input image is stretched to match the histogram of the reference image, so that the intensity range in the normalized image will also lie between LIR and HIR. We performed three sets of experiments to evaluate the proposed method, i.e., image registration, segmentation, and tissue volume measurement, and compared this with the existing intensity normalization method. It is then possible to validate that our histogram normalization framework can achieve better results in all the experiments. It is also demonstrated that the brain template with normalization preprocessing is of higher quality than the template with no normalization processing. We have proposed…
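    A minimal sketch of the two steps described above, with scikit-image's generic histogram matching standing in for the paper's exact stretching; the LIR and HIR values are assumptions.

        import numpy as np
        from skimage.exposure import match_histograms

        def intensity_scaling(ref, lir=0.0, hir=4095.0):
            """Step 1 (IS): rescale the high-quality reference image to [LIR, HIR]."""
            r = ref.astype(float)
            return lir + (r - r.min()) * (hir - lir) / (r.max() - r.min())

        def histogram_normalization(low_quality, ref_scaled):
            """Step 2 (HN): stretch the input histogram to match the reference
            histogram, so the output intensities also lie in [LIR, HIR]."""
            return match_histograms(low_quality.astype(float), ref_scaled)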

  1. Quick cytogenetic screening of breeding bulls using flow cytometric sperm DNA histogram analysis.

    Science.gov (United States)

    Nagy, Szabolcs; Polgár, Péter J; Andersson, Magnus; Kovács, András

    2016-09-01

    The aim of the present study was to test the FXCycle PI/RNase kit for routine DNA analyses in order to detect breeding bulls and/or insemination doses carrying cytogenetic aberrations. In a series of experiments we first established basic DNA histogram parameters of cytogenetically healthy breeding bulls by measuring the intraspecific genome size variation of three animals, then we compared the histogram profiles of bulls carrying cytogenetic defects to the baseline values. With the exception of one case the test was able to identify bulls with cytogenetic defects. Therefore, we conclude that the assay could be incorporated into the laboratory routine where flow cytometry is applied for semen quality control.

  2. A 64 Mbyte VME histogramming memory card for the GA.SP gamma spectrometer

    International Nuclear Information System (INIS)

    Cavedini, Z.; DePoli, M.; Maron, G.; Vedovato, G.

    1990-01-01

    This paper reports on a 64 Mbyte VME histogramming memory card designed and built to cover the on-line and off-line data analysis needs of the GA.SP spectrometer (a 40 HpGe gamma detector array in development at LNL). The card combines the standard features of the VME/VSB bus with some special built-in functions: single cycle fast histogramming operations (typical channel increment time of 550 ns including the bus arbitration), fast clear of the whole memory (∼1 second to erase 64 Mbyte) and data broadcasting

  3. A novel approach to find and optimize bin locations and collection routes using a geographic information system.

    Science.gov (United States)

    Erfani, Seyed Mohammad Hassan; Danesh, Shahnaz; Karrabi, Seyed Mohsen; Shad, Rouzbeh

    2017-07-01

    One of the major challenges in big cities is planning and implementation of an optimized, integrated solid waste management system. This optimization is crucial if environmental problems are to be prevented and the expenses to be reduced. A solid waste management system consists of many stages including collection, transfer and disposal. In this research, an integrated model was proposed and used to optimize two functional elements of municipal solid waste management (storage and collection systems) in the Ahmadabad neighbourhood located in the City of Mashhad - Iran. The integrated model was performed by modelling and solving the location allocation problem and capacitated vehicle routing problem (CVRP) through Geographic Information Systems (GIS). The results showed that the current collection system is not efficient owing to its incompatibility with the existing urban structure and population distribution. Application of the proposed model could significantly improve the storage and collection system. Based on the results of minimizing facilities analyses, scenarios with 100, 150 and 180 m walking distance were considered to find optimal bin locations for Alamdasht, C-metri and Koohsangi. The total number of daily collection tours was reduced to seven as compared to the eight tours carried out in the current system (12.50% reduction). In addition, the total number of required crews was minimized and reduced by 41.70% (24 crews in the current collection system vs 14 in the system provided by the model). The total collection vehicle routing was also optimized such that the total travelled distances during night and day working shifts were cut back by 53%.

  4. Application of an allocation methodology

    International Nuclear Information System (INIS)

    Youngblood, R.; de Oliveira, L.F.S.

    1989-01-01

    This paper presents a method for allocating resources to elements of a system for the purpose of achieving prescribed levels of defense-in-depth at minimal cost. The method makes extensive use of logic modelling. An analysis of a simplified high-level waste repository is used as an illustrative application of the method. It is shown that it is possible to allocate quality control costs (or demonstrated performance) in an optimal way over elements of a conceptual design. 6 refs., 3 figs., 2 tabs

  5. Allocation Problems and Market Design

    DEFF Research Database (Denmark)

    Smilgins, Aleksandrs

    The thesis contains six independent papers with a common theme: Allocation problems and market design. The first paper is concerned with fair allocation of risk capital where independent autonomous subunits have risky activities and together constitute the entity's total risk, whose associated risk...... at a certain point in time involves countries that have excess demand and countries that have surplus of green energy. The problem addressed here is how the gains from trade ought to influence the way that members of the grid share common costs. The fifth paper extends the classical two-sided one...

  6. Comparison of an alternative and existing binning methods to reduce the acquisition duration of 4D PET/CT

    International Nuclear Information System (INIS)

    Didierlaurent, David; Ribes, Sophie; Caselles, Olivier; Jaudet, Cyril; Dierickx, Lawrence O.; Zerdoud, Slimane; Brillouet, Severine; Weits, Kathleen; Batatia, Hadj; Courbon, Frédéric

    2014-01-01

    Purpose: Respiratory motion is a source of artifacts that reduce image quality in PET. Four dimensional (4D) PET/CT is one approach to overcome this problem. Existing techniques for limiting the effects of respiratory motion are based on prospective phase binning, which requires a long acquisition duration (15–25 min). This time is uncomfortable for the patients and limits the clinical exploitation of 4D PET/CT. In this work, the authors evaluated an existing method and an alternative retrospective binning method to reduce the acquisition duration of 4D PET/CT. Methods: The authors studied an existing mixed-amplitude binning (MAB) method and an alternative binning method by mixed phases (MPhB). Before implementing MPhB, they analyzed the regularity of the breathing patterns in patients. They studied the breathing signal drift and missing CT slices that could be challenging for implementing MAB. They compared the performance of MAB and MPhB with current binning methods to measure the maximum uptake, internal volume, and maximal range of tumor motion. Results: MPhB can be implemented depending on an optimal phase (on average, the exhalation peak phase −4.1% of the entire breathing cycle duration). Signal drift of patients was on average 35% relative to the breathing amplitude. Even after correcting this drift, MAB was feasible in 4D CT for only 64% of patients. No significant differences appeared between the different binning methods in measuring the maximum uptake, internal volume, and maximal range of tumor motion. The authors also determined the inaccuracies of MAB and MPhB in measuring the maximum amplitude of tumor motion with three bins (less than 3 mm for movement inferior to 12 mm, up to 6.4 mm for a 21 mm movement). Conclusions: The authors proposed an alternative binning method by mixed-phase binning that halves the acquisition duration of 4D PET/CT. Mixed-amplitude binning was challenging because of signal drift and missing CT slices. They showed that more…

  7. Robust Face Recognition by Computing Distances from Multiple Histograms of Oriented Gradients

    NARCIS (Netherlands)

    Karaaba, Mahir; Surinta, Olarik; Schomaker, Lambertus; Wiering, Marco

    2015-01-01

    The Single Sample per Person Problem is a challenging problem for face recognition algorithms. Patch-based methods have obtained some promising results for this problem. In this paper, we propose a new face recognition algorithm that is based on a combination of different histograms of oriented gradients…

  8. Using color histogram normalization for recovering chromatic illumination-changed images.

    Science.gov (United States)

    Pei, S C; Tseng, C L; Wu, C C

    2001-11-01

    We propose a novel image-recovery method using the covariance matrix of the red-green-blue (R-G-B) color histogram and tensor theories. The image-recovery method is called the color histogram normalization algorithm. It is known that the color histograms of an image taken under varied illuminations are related by a general affine transformation of the R-G-B coordinates when the illumination is changed. We propose a simplified affine model for application with illumination variation. This simplified affine model considers the effects of only three basic forms of distortion: translation, scaling, and rotation. According to this principle, we can estimate the affine transformation matrix necessary to recover images whose color distributions are varied as a result of illumination changes. We compare the normalized color histogram of the standard image with that of the tested image. By performing some operations of simple linear algebra, we can estimate the matrix of the affine transformation between two images under different illuminations. To demonstrate the performance of the proposed algorithm, we divide the experiments into two parts: computer-simulated images and real images corresponding to illumination changes. Simulation results show that the proposed algorithm is effective for both types of images. We also explain the noise-sensitive skew-rotation estimation that exists in the general affine model and demonstrate that the proposed simplified affine model without the use of skew rotation is better than the general affine model for such applications.
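    The following sketch captures the spirit of the simplified affine model above by aligning the mean (translation) and covariance (scaling and rotation) of a test image's R-G-B distribution with those of a reference image; it is a generic moment-matching transform, not the paper's exact estimator.

        import numpy as np

        def _sqrtm_psd(c):
            """Matrix square root of a symmetric positive semi-definite matrix."""
            w, v = np.linalg.eigh(c)
            return v @ np.diag(np.sqrt(np.clip(w, 0.0, None))) @ v.T

        def color_normalize(test_rgb, ref_rgb):
            """Affine map x' = A(x - mu_test) + mu_ref matching mean and covariance."""
            t = test_rgb.reshape(-1, 3).astype(float)
            r = ref_rgb.reshape(-1, 3).astype(float)
            a = _sqrtm_psd(np.cov(r.T)) @ np.linalg.inv(_sqrtm_psd(np.cov(t.T)))
            out = (t - t.mean(0)) @ a.T + r.mean(0)
            return np.clip(out, 0, 255).reshape(test_rgb.shape)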

  9. Strain histograms are equal to strain ratios in predicting malignancy in breast tumours

    DEFF Research Database (Denmark)

    Carlsen, Jonathan Frederik; Ewertsen, Caroline; Sletting, Susanne

    2017-01-01

    Objectives: To assess whether strain histograms are equal to strain ratios in predicting breast tumour malignancy and to see if either could be used to upgrade Breast Imaging Reporting and Data System (BI-RADS) 3 tumours for immediate biopsy. Methods: Ninety-nine breast tumours were examined using…

  10. DIF Testing with an Empirical-Histogram Approximation of the Latent Density for Each Group

    Science.gov (United States)

    Woods, Carol M.

    2011-01-01

    This research introduces, illustrates, and tests a variation of IRT-LR-DIF, called EH-DIF-2, in which the latent density for each group is estimated simultaneously with the item parameters as an empirical histogram (EH). IRT-LR-DIF is used to evaluate the degree to which items have different measurement properties for one group of people versus…

  11. The application of the distance histogram in microdosimetry for evaluating heterogeneity

    International Nuclear Information System (INIS)

    Dieren, E.B. van; Lingen, A. van; Roos, J.C.; Teule, G.J.J.

    1992-01-01

    Heterogeneity of radionuclide distributions at a microscopic level is relevant for the dosimetry of short path-length emissions. The present study explores the methodological aspects and the limitations of source-target histograms by using computer simulations of radionuclide distributions. Sources were formed by labeled cells, containing 50 decay sites each. Cell nuclei were considered as targets. Within a matrix of 2,500 cells, the authors investigated uniform distributions (MIRD assumption), various cluster sizes, the single labeled cell, and a random distribution. Furthermore, four different intracellular source localizations were studied in a matrix of one cell. The distance histograms for both matrices were combined. For both 125I and 131I, absorbed doses in the targets were calculated from multiplication of the distance histograms by the point-source absorbed radiation dose distribution. The presented results indicate that the use of distance histograms might be a mathematically convenient approach to microdosimetrical studies. They provide a means to study combinations of source distributions at various levels of magnification for several radionuclides within a reasonable calculation time
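    A minimal sketch of the multiplication described above: the absorbed dose at a target is the distance histogram of decay sites folded with the radionuclide's point-source dose kernel. The kernel values below are placeholders, not tabulated 125I or 131I data.

        import numpy as np

        def absorbed_dose(distance_hist, kernel):
            """distance_hist[i]: decay sites in distance bin i;
            kernel[i]: dose to the target per decay at that distance."""
            return float(np.dot(distance_hist, kernel))

        r = np.arange(1.0, 51.0)              # bin centres in micrometres
        hist = np.random.poisson(5, r.size)   # decays per distance bin (toy data)
        kernel = 1.0 / r**2                   # placeholder point-kernel fall-off
        print(absorbed_dose(hist, kernel))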

  12. Histogram-based automatic thresholding for bruise detection of apples by structured-illumination reflectance imaging

    Science.gov (United States)

    Thresholding is an important step in the segmentation of image features, and the existing methods are not all effective when the image histogram exhibits a unimodal pattern, which is common in defect detection of fruit. This study was aimed at developing a general automatic thresholding methodology ...

  13. Optimization of radiation therapy, III: a method of assessing complication probabilities from dose-volume histograms

    International Nuclear Information System (INIS)

    Lyman, J.T.; Wolbarst, A.B.

    1987-01-01

    To predict the likelihood of success of a therapeutic strategy, one must be able to assess the effects of the treatment upon both diseased and healthy tissues. This paper proposes a method for determining the probability that a healthy organ that receives a non-uniform distribution of X-irradiation, heat, chemotherapy, or other agent will escape complications. Starting with any given dose distribution, a dose-cumulative-volume histogram for the organ is generated. This is then reduced by an interpolation scheme (involving the volume-weighting of complication probabilities) to a slightly different histogram that corresponds to the same overall likelihood of complications, but which contains one less step. The procedure is repeated, one step at a time, until there remains a final, single-step histogram, for which the complication probability can be determined. The formalism makes use of a complication response function C(D, V) which, for the given treatment schedule, represents the probability of complications arising when the fraction V of the organ receives dose D and the rest of the organ gets none. Although the data required to generate this function are sparse at present, it should be possible to obtain the necessary information from in vivo and clinical studies. Volume effects are taken explicitly into account in two ways: the precise shape of the patient's histogram is employed in the calculation, and the complication response function is a function of the volume
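    A minimal sketch of the starting point of the method above: turning a per-voxel dose distribution into the dose-cumulative-volume histogram that the interpolation scheme then reduces one step at a time.

        import numpy as np

        def cumulative_dvh(voxel_doses, bin_width=0.5):
            """Return dose levels D and the fraction of organ volume receiving >= D."""
            d = np.asarray(voxel_doses, dtype=float)
            levels = np.arange(0.0, d.max() + bin_width, bin_width)
            volume_fraction = np.array([(d >= lv).mean() for lv in levels])
            return levels, volume_fraction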

  14. [Clinical application of MRI histogram in evaluation of muscle fatty infiltration].

    Science.gov (United States)

    Zheng, Y M; Du, J; Li, W Z; Wang, Z X; Zhang, W; Xiao, J X; Yuan, Y

    2016-10-18

    To describe a method based on analysis of the histogram of intensity values produced from magnetic resonance imaging (MRI) for quantifying the degree of fatty infiltration. The study included 25 patients with dystrophinopathy. All the subjects underwent muscle MRI at thigh level. The histogram M values of 250 muscles, adjusted for subcutaneous fat and representing the degree of fatty infiltration, were compared with expert visual reading using the modified Mercuri scale. There was a significant positive correlation between the histogram M values and the scores of visual reading (r=0.854). The distribution of the histogram M values was similar to that of visual reading and results in the literature. The histogram M values had stronger correlations with the clinical data than the scores of visual reading, for example with age (r=0.730). Histogram M value analysis also had better repeatability than visual reading, with an interclass correlation coefficient of 0.998 (95% CI: 0.997-0.998). Histogram M value analysis of MRI, with the advantages of repeatability and objectivity, can be used to evaluate the degree of muscle fatty infiltration.

  15. Quantitative Evaluation for Differentiating Malignant and Benign Thyroid Nodules Using Histogram Analysis of Grayscale Sonograms.

    Science.gov (United States)

    Nam, Se Jin; Yoo, Jaeheung; Lee, Hye Sun; Kim, Eun-Kyung; Moon, Hee Jung; Yoon, Jung Hyun; Kwak, Jin Young

    2016-04-01

    To evaluate the diagnostic value of histogram analysis using grayscale sonograms for differentiation of malignant and benign thyroid nodules. From July 2013 through October 2013, 579 nodules in 563 patients who had undergone ultrasound-guided fine-needle aspiration were included. For the grayscale histogram analysis, pixel echogenicity values in regions of interest were measured as 0 to 255 (0, black; 255, white) with in-house software. Five parameters (mean, skewness, kurtosis, standard deviation, and entropy) were obtained for each thyroid nodule. With principal component analysis, an index was derived. Diagnostic performance rates for the 5 histogram parameters and the principal component analysis index were calculated. A total of 563 patients were included in the study (mean age ± SD, 50.3 ± 12.3 years; range, 15-79 years). Of the 579 nodules, 431 were benign, and 148 were malignant. Among the 5 parameters and the principal component analysis index, the standard deviation differed significantly between malignant and benign nodules (75.546 ± 14.153 versus 62.761 ± 16.01). Grayscale histogram analysis was feasible for differentiating malignant and benign thyroid nodules but did not show better diagnostic performance than subjective analysis performed by radiologists. Further technical advances will be needed to objectify interpretations of thyroid grayscale sonograms. © 2016 by the American Institute of Ultrasound in Medicine.
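    A minimal sketch of the five histogram parameters listed above, computed from the 0-255 pixel values of a sonogram ROI.

        import numpy as np
        from scipy.stats import entropy, kurtosis, skew

        def histogram_parameters(roi_pixels):
            v = np.asarray(roi_pixels).ravel().astype(float)
            p = np.bincount(v.astype(int), minlength=256) / v.size  # 8-bit histogram
            return {
                "mean": v.mean(),
                "sd": v.std(ddof=1),
                "skewness": skew(v),
                "kurtosis": kurtosis(v),
                "entropy": entropy(p[p > 0], base=2),
            }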

  16. Reducing variability in the output of pattern classifiers using histogram shaping

    International Nuclear Information System (INIS)

    Gupta, Shalini; Kan, Chih-Wen; Markey, Mia K.

    2010-01-01

    Purpose: The authors present a novel technique based on histogram shaping to reduce the variability in the output and (sensitivity, specificity) pairs of pattern classifiers with identical ROC curves, but differently distributed outputs. Methods: The authors identify different sources of variability in the output of linear pattern classifiers with identical ROC curves, which also result in classifiers with differently distributed outputs. They theoretically develop a novel technique based on the matching of the histograms of these differently distributed pattern classifier outputs to reduce the variability in their (sensitivity, specificity) pairs at fixed decision thresholds, and to reduce the variability in their actual output values. They empirically demonstrate the efficacy of the proposed technique by means of analyses on the simulated data and real world mammography data. Results: For the simulated data, with three different known sources of variability, and for the real world mammography data with unknown sources of variability, the proposed classifier output calibration technique significantly reduced the variability in the classifiers' (sensitivity, specificity) pairs at fixed decision thresholds. Furthermore, for classifiers with monotonically or approximately monotonically related output variables, the histogram shaping technique also significantly reduced the variability in their actual output values. Conclusions: Classifier output calibration based on histogram shaping can be successfully employed to reduce the variability in the output values and (sensitivity, specificity) pairs of pattern classifiers with identical ROC curves, but differently distributed outputs.

  17. Whole-tumor MRI histogram analyses of hepatocellular carcinoma: Correlations with Ki-67 labeling index.

    Science.gov (United States)

    Hu, Xin-Xing; Yang, Zhao-Xia; Liang, He-Yue; Ding, Ying; Grimm, Robert; Fu, Cai-Xia; Liu, Hui; Yan, Xu; Ji, Yuan; Zeng, Meng-Su; Rao, Sheng-Xiang

    2017-08-01

    To evaluate whether whole-tumor histogram-derived parameters for an apparent diffusion coefficient (ADC) map and contrast-enhanced magnetic resonance imaging (MRI) could aid in assessing the Ki-67 labeling index (LI) of hepatocellular carcinoma (HCC). In all, 57 patients with HCC who underwent pretreatment MRI with a 3T MR scanner were included retrospectively. Histogram parameters including mean, median, standard deviation, skewness, kurtosis, and percentiles (5th, 25th, 75th, 95th) were derived from the ADC map and MR enhancement. Correlations between histogram parameters and Ki-67 LI were evaluated and differences between low Ki-67 (≤10%) and high Ki-67 (>10%) groups were assessed. Mean, median, and 5th, 25th, and 75th percentiles of ADC, and mean, median, and 25th, 75th, and 95th percentiles of enhancement of the arterial phase (AP) demonstrated significant inverse correlations with Ki-67 LI (rho up to -0.48 for ADC, -0.43 for AP) and showed significant differences between the low and high Ki-67 groups. Histogram-derived parameters of ADC and AP were potentially helpful for predicting the Ki-67 LI of HCC. Level of Evidence: 3. Technical Efficacy: Stage 3. J. Magn. Reson. Imaging 2017;46:383-392. © 2016 International Society for Magnetic Resonance in Medicine.

  18. Flat-histogram methods in quantum Monte Carlo simulations: Application to the t-J model

    International Nuclear Information System (INIS)

    Diamantis, Nikolaos G.; Manousakis, Efstratios

    2016-01-01

    We discuss how flat-histogram techniques can be appropriately applied in the sampling of quantum Monte Carlo simulations in order to improve the statistical quality of the results at long imaginary time or low excitation energy. Typical imaginary-time correlation functions calculated in quantum Monte Carlo are subject to exponentially growing errors as the range of imaginary time grows, and this smears the information on the low energy excitations. We show that we can extract the low energy physics by modifying the Monte Carlo sampling technique to one in which configurations which contribute to making the histogram of certain quantities flat are promoted. We apply the diagrammatic Monte Carlo (diag-MC) method to the motion of a single hole in the t-J model and we show that the implementation of flat-histogram techniques allows us to calculate the Green's function in a wide range of imaginary time. In addition, we show that applying the flat-histogram technique alleviates the "sign" problem associated with the simulation of the single-hole Green's function at long imaginary time.
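    A generic Wang-Landau flat-histogram sketch on a toy model (counting configurations of 20 coins by the number of heads) to illustrate the kind of flat-histogram sampling discussed above; the paper applies the idea inside diagrammatic Monte Carlo, which is considerably more involved.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 20                          # "energy" of a configuration = number of heads
        state = rng.integers(0, 2, n)
        log_g = np.zeros(n + 1)         # running estimate of the log density of states
        visits = np.zeros(n + 1)
        f = 1.0                         # modification factor, halved whenever flat

        while f > 1e-4:
            for _ in range(20000):
                i = rng.integers(n)
                e_old = int(state.sum())
                e_new = e_old + 1 - 2 * state[i]   # energy after flipping coin i
                # accept with min(1, g(e_old)/g(e_new)); this flattens the histogram
                if np.log(rng.random()) < log_g[e_old] - log_g[e_new]:
                    state[i] ^= 1
                e = int(state.sum())
                log_g[e] += f
                visits[e] += 1
            if visits.min() > 0.8 * visits.mean():  # crude flatness criterion
                f /= 2.0
                visits[:] = 0.0

        log_g -= log_g[0]  # normalise so that exp(log_g[k]) estimates C(20, k)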

  19. Whole-lesion apparent diffusion coefficient histogram analysis: significance in T and N staging of gastric cancers.

    Science.gov (United States)

    Liu, Song; Zhang, Yujuan; Chen, Ling; Guan, Wenxian; Guan, Yue; Ge, Yun; He, Jian; Zhou, Zhengyang

    2017-10-02

    Whole-lesion apparent diffusion coefficient (ADC) histogram analysis has been introduced and proved effective in the assessment of multiple tumors. However, the application of whole-volume ADC histogram analysis in gastrointestinal tumors has just started and has never been reported for T and N staging of gastric cancers. Eighty patients with pathologically confirmed gastric carcinomas underwent diffusion weighted (DW) magnetic resonance imaging before surgery prospectively. Whole-lesion ADC histogram analysis was performed by two radiologists independently. The differences of ADC histogram parameters among different T and N stages were compared with the independent-samples Kruskal-Wallis test. Receiver operating characteristic (ROC) analysis was performed to evaluate the performance of ADC histogram parameters in differentiating particular T or N stages of gastric cancers. There were significant differences in all the ADC histogram parameters for gastric cancers at different T (except ADCmin and ADCmax) and N (except ADCmax) stages. Most ADC histogram parameters differed significantly between T1 vs T3, T1 vs T4, T2 vs T4, N0 vs N1, and N0 vs N3, and some parameters (ADC5%, ADC10%, ADCmin) differed significantly between N0 vs N2 and N2 vs N3. Whole-lesion ADC histogram parameters held great potential for differentiating different T and N stages of gastric cancers preoperatively.

  20. Stochastic learning of multi-instance dictionary for earth mover’s distance-based histogram comparison

    KAUST Repository

    Fan, Jihong

    2016-09-17

    Dictionary plays an important role in multi-instance data representation. It maps bags of instances to histograms. Earth mover's distance (EMD) is the most effective histogram distance metric for the application of multi-instance retrieval. However, up to now, there are no existing multi-instance dictionary learning methods designed for EMD-based histogram comparison. To fill this gap, we develop the first EMD-optimal dictionary learning method using a stochastic optimization method. In the stochastic learning framework, we have one triplet of bags, including one basic bag, one positive bag, and one negative bag. These bags are mapped to histograms using a multi-instance dictionary. We argue that the EMD between the basic histogram and the positive histogram should be smaller than that between the basic histogram and the negative histogram. Based on this condition, we design a hinge loss. By minimizing this hinge loss and some regularization terms of the dictionary, we update the dictionary instances. The experiments over multi-instance retrieval applications show its effectiveness when compared to other dictionary learning methods over the problems of medical image retrieval and natural language relation classification. © 2016 The Natural Computing Applications Forum
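    A minimal sketch of the triplet hinge loss described above, using SciPy's one-dimensional earth mover's distance between bag histograms; the stochastic update of the dictionary itself is omitted, and nearest-word assignment is an assumed simplification of the bag-to-histogram mapping.

        import numpy as np
        from scipy.stats import wasserstein_distance

        def bag_histogram(bag, dictionary):
            """Map a bag of instances (n x d) to a histogram over dictionary words."""
            d = np.linalg.norm(bag[:, None, :] - dictionary[None, :, :], axis=2)
            counts = np.bincount(d.argmin(axis=1), minlength=len(dictionary))
            return counts / counts.sum()

        def triplet_hinge_loss(basic, positive, negative, dictionary, margin=0.1):
            support = np.arange(len(dictionary), dtype=float)
            h = [bag_histogram(b, dictionary) for b in (basic, positive, negative)]
            d_pos = wasserstein_distance(support, support, h[0], h[1])
            d_neg = wasserstein_distance(support, support, h[0], h[2])
            return max(0.0, margin + d_pos - d_neg)  # want d_pos + margin <= d_neg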

  1. ADC Histogram Analysis of Cervical Cancer Aids Detecting Lymphatic Metastases-a Preliminary Study.

    Science.gov (United States)

    Schob, Stefan; Meyer, Hans Jonas; Pazaitis, Nikolaos; Schramm, Dominik; Bremicker, Kristina; Exner, Marc; Höhn, Anne Kathrin; Garnov, Nikita; Surov, Alexey

    2017-12-01

    Apparent diffusion coefficient (ADC) histogram analysis has been used to some extent in cervical cancer (CC) to distinguish between low-grade and high-grade tumors. Although this differentiation is undoubtedly helpful, it would be even more crucial in the presurgical setting to determine whether a tumor has already gained the potential to metastasize via the lymphatic system. So far, no studies have investigated the potential of 3T ADC histogram analysis in CC to differentiate between nodal-positive and nodal-negative entities. Therefore, the principal aim of our study was to investigate the potential of 3T ADC histogram analysis to differentiate between CC with and without lymph node metastasis. The second aim was to elucidate possible differences in ADC histogram parameters between CC with limited vs. advanced tumor stages and well-differentiated vs. undifferentiated lesions. Finally, correlations of p53 expression and Ki-67 index with ADC parameters were analyzed. Eighteen female patients (mean age 55.4 years, range 32-79 years) with histopathologically confirmed squamous cell carcinoma of the uterine cervix were prospectively enrolled. Tumor stages, tumor grading, status of metastatic dissemination, Ki-67 index, and p53 expression were assessed in these patients. Diffusion weighted imaging (DWI) was obtained in a 3T scanner using b values of b0 and b1000 s/mm². Group comparisons using the Mann-Whitney U test revealed that nodal-positive CC had statistically significantly lower ADC parameters (ADCmin, ADCmean, median ADC, Mode, p10, p25, p75, and p90) than nodal-negative CC, indicating that the two entities can be differentiated by ADC histogram analysis in 3T DWI. This information is crucial for the gynecological surgeon to identify the optimal treatment strategy for patients suffering from CC. Furthermore, ADCentropy was identified as a potential imaging biomarker for tumor heterogeneity and might be able to indicate further molecular changes like loss of p53 expression.

  2. Regulating nutrient allocation in plants

    Science.gov (United States)

    Udvardi, Michael; Yang, Jiading; Worley, Eric

    2014-12-09

    The invention provides coding and promoter sequences for a VS-1 and AP-2 gene, which affects the developmental process of senescence in plants. Vectors, transgenic plants, seeds, and host cells comprising heterologous VS-1 and AP-2 genes are also provided. Additionally provided are methods of altering nutrient allocation and composition in a plant using the VS-1 and AP-2 genes.

  3. Centralized Allocation in Multiple Markets

    DEFF Research Database (Denmark)

    Monte, Daniel; Tumennasan, Norovsambuu

    The problem of allocating indivisible objects to different agents, where each individual is assigned at most one object, has been widely studied. Pápai (2000) shows that the set of strategy-proof, nonbossy, Pareto optimal and reallocation-proof rules are hierarchical exchange rules, generalizations… and nonbossy rules are sequential dictatorships, a special case of Pápai's hierarchical exchange rules.

  4. Designing for dynamic task allocation

    NARCIS (Netherlands)

    Dongen, van K.; Maanen, van P.P.

    2005-01-01

    Future platforms are envisioned in which human-machine teams are able to share and trade tasks as demands in situations change. It seems that human-machine coordination has not received the attention it deserves by past and present approaches to task allocation. In this paper a simple way to make…

  5. Planning and Resource Allocation Management.

    Science.gov (United States)

    Coleman, Jack W.

    1986-01-01

    Modern scientific management techniques provide college administrators with valuable planning and resource allocation insights and enhance the decision process. The planning model should incorporate assessment, strategic planning, dynamic and long-term budgeting, operational planning, and feedback and control for actual operations. (MSE)

  6. User's manual for BINIAC: A computer code to translate APET bins

    International Nuclear Information System (INIS)

    Gough, S.T.

    1994-03-01

    This report serves as the user's manual for the FORTRAN code BINIAC. BINIAC is a utility code designed to format the output from the Defense Waste Processing Facility (DWPF) Accident Progression Event Tree (APET) methodology. BINIAC inputs the accident progression bins from the APET methodology, converts the frequency from occurrences per hour to occurrences per year, sorts the progression bins, and converts the individual dimension character codes into facility attributes. Without the use of BINIAC, this process would be done manually at great time expense. BINIAC was written under the quality assurance control of IQ34 QAP IV-1, revision 0, section 4.1.4. Configuration control is established through the use of a proprietor and a cognizant users list
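    A rough Python sketch of the workflow the manual describes (the real BINIAC is FORTRAN): convert bin frequencies from occurrences per hour to occurrences per year, decode the dimension characters into facility attributes, and sort the bins. The one-letter code table below is invented for illustration.

        HOURS_PER_YEAR = 8766  # 365.25 days

        CODE_TABLE = {0: {"A": "seismic initiator", "B": "fire initiator"},
                      1: {"Y": "confinement intact", "N": "confinement failed"}}

        def translate_bins(bins):
            """bins: list of (code_string, frequency_per_hour) tuples."""
            out = []
            for code, freq_per_hour in bins:
                attributes = [CODE_TABLE[i][ch] for i, ch in enumerate(code)]
                out.append((code, attributes, freq_per_hour * HOURS_PER_YEAR))
            return sorted(out, key=lambda rec: rec[2], reverse=True)

        print(translate_bins([("AY", 1.2e-9), ("BN", 4.0e-11)]))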

  7. Apparent diffusion coefficient histogram analysis can evaluate radiation-induced parotid damage and predict late xerostomia degree in nasopharyngeal carcinoma.

    Science.gov (United States)

    Zhou, Nan; Guo, Tingting; Zheng, Huanhuan; Pan, Xia; Chu, Chen; Dou, Xin; Li, Ming; Liu, Song; Zhu, Lijing; Liu, Baorui; Chen, Weibo; He, Jian; Yan, Jing; Zhou, Zhengyang; Yang, Xiaofeng

    2017-09-19

    We investigated apparent diffusion coefficient (ADC) histogram analysis to evaluate radiation-induced parotid damage and predict xerostomia degrees in nasopharyngeal carcinoma (NPC) patients receiving radiotherapy. The imaging of bilateral parotid glands in NPC patients was conducted 2 weeks before radiotherapy (time point 1), one month after radiotherapy (time point 2), and four months after radiotherapy (time point 3). From time point 1 to 2, parotid volume, skewness, and kurtosis decreased, while the other ADC histogram parameters increased. Early mean change rates for bilateral parotid SD and ADCmax could predict late xerostomia degrees at seven months after radiotherapy (three months after time point 3), with AUCs of 0.781 and 0.818 (P = 0.014 and 0.005, respectively). ADC histogram parameters were reproducible (intraclass correlation coefficient, 0.830-0.999). ADC histogram analysis could be used to evaluate radiation-induced parotid damage noninvasively and predict late xerostomia degrees of NPC patients treated with radiotherapy.

  8. Histogram and gray level co-occurrence matrix on gray-scale ultrasound images for diagnosing lymphocytic thyroiditis.

    Science.gov (United States)

    Shin, Young Gyung; Yoo, Jaeheung; Kwon, Hyeong Ju; Hong, Jung Hwa; Lee, Hye Sun; Yoon, Jung Hyun; Kim, Eun-Kyung; Moon, Hee Jung; Han, Kyunghwa; Kwak, Jin Young

    2016-08-01

    The objective of the study was to evaluate whether texture analysis using histogram and gray level co-occurrence matrix (GLCM) parameters can help clinicians diagnose lymphocytic thyroiditis (LT) and differentiate LT according to pathologic grade. The background thyroid pathology of 441 patients was classified into no evidence of LT, chronic LT (CLT), and Hashimoto's thyroiditis (HT). Histogram and GLCM parameters were extracted from the regions of interest on ultrasound. The diagnostic performances of the parameters for diagnosing and differentiating LT were calculated. Of the histogram and GLCM parameters, the mean on histogram had the highest Az (0.63) and VUS (0.303). As the degrees of LT increased, the mean decreased and the standard deviation and entropy increased. The mean on histogram from gray-scale ultrasound showed the best diagnostic performance as a single parameter in differentiating LT according to pathologic grade as well as in diagnosing LT.

  9. Efficient use of design-based binning methodology in a DRAM fab

    Science.gov (United States)

    Karsenti, Laurent; Wehner, Arno; Fischer, Andreas; Seifert, Uwe; Goeckeritz, Jens; Geshel, Mark; Gscheidlen, Dieter; Bartov, Avishai

    2009-03-01

    It is a well established fact that as design rules and printed features shrink, sophisticated techniques are required to ensure the design intent is indeed printed on the wafer. Techniques of this kind are Optical Proximity Correction (OPC), Resolution Enhancement Techniques (RET) and Design for Manufacturing (DFM). As these methods are applied to the overall chip and rely on complex modeling and simulations, they increase the risk of creating local areas or layouts with a limiting process window. Hence, it is necessary to verify the manufacturability (sufficient depth of focus) of the overall die and not only of a pre-defined set of metrology structures. The verification process is commonly based on full-chip defect density inspection of a Focus Exposure Matrix (FEM) wafer, combined with appropriate post-processing of the inspection data. This is necessary to avoid a time-consuming search for the Defects of Interest (DOIs), as defect counts are usually too high to be handled by manual SEM review. One way to post-process defect density data is the so-called design-based binning (DBB). The Litho Qualification Monitor (LQM) system allows defects to be classified and also binned based on design information. In this paper we present an efficient way to combine classification and binning in order to check design rules and to determine the marginal features (layouts with low depth of focus). The design-based binning has been connected to the Yield Management System (YMS) to allow new process monitoring approaches towards design-based SPC. This can dramatically cut the time to detect systematic defects inline.

  10. Morphological and Strength Properties of Tanjung Bin Coal Ash Mixtures for Applied in Geotechnical Engineering Work

    OpenAIRE

    Awang, Abd. Rahim; Marto, Aminaton; Makhtar, Ahmad Maher

    2012-01-01

    In Malaysia, coal has been used as a raw material to generate electricity since 1988. In the past, most of the waste from coal burning, especially the bottom ash, was not managed properly: it was dumped in waste ponds, where it accumulated drastically. This paper focuses on some properties of coal ash mixtures (fly ash and bottom ash mixtures) from the Tanjung Bin power plant. The characteristics studied were morphological properties, compaction behaviour and strength properties. Strength properties...

  11. Two-Bin Kanban: Ordering Impact at Navy Medical Center San Diego

    Science.gov (United States)

    2016-06-01

    cost of healthcare. This is done by "focusing on quality, eliminating waste, reducing unwarranted variation, and considering not just the cost of... of the two-bin inventory management system was the supermarket. Food waste is a very high expense for many supermarkets. To mitigate food waste, the... stockpiled medical supplies. Stockpiling supplies is called the "bullwhip effect" and it is a wasteful consequence of a poorly managed supply system. The...

  12. Incremental Prognostic Value of Apparent Diffusion Coefficient Histogram Analysis in Head and Neck Squamous Cell Carcinoma.

    Science.gov (United States)

    Li, Xiaoxia; Yuan, Ying; Ren, Jiliang; Shi, Yiqian; Tao, Xiaofeng

    2018-03-26

    We aimed to investigate the incremental prognostic value of apparent diffusion coefficient (ADC) histogram analysis in patients with head and neck squamous cell carcinoma (HNSCC) and integrate it into a multivariate prognostic model. A retrospective review of magnetic resonance imaging findings was conducted in patients with pathologically confirmed HNSCC between June 2012 and December 2015. For each tumor, six histogram parameters were derived: the 10th, 50th, and 90th percentiles of ADC (ADC10, ADC50, and ADC90); mean ADC values (ADCmean); kurtosis; and skewness. The clinical variables included age, sex, smoking status, tumor volume, and tumor node metastasis stage. The association of these histogram and clinical variables with overall survival (OS) was determined. Further validation of the histogram parameters as independent biomarkers was performed using multivariate Cox proportional hazard models combined with clinical variables, which were compared to the clinical model. Models were assessed with C index and receiver operating characteristic curve analyses for the 12- and 36-month OS. Ninety-six patients were eligible for analysis. Median follow-up was 877 days (range, 54-1516 days). A total of 29 patients died during follow-up (30%). Patients with higher ADC values (ADC10 > 0.958 × 10⁻³ mm²/s, ADC50 > 1.089 × 10⁻³ mm²/s, ADC90 > 1.152 × 10⁻³ mm²/s, ADCmean > 1.047 × 10⁻³ mm²/s) and lower kurtosis (≤0.967) were significant predictors of poor OS. ADC histogram analysis has incremental prognostic value in patients with HNSCC and increases the performance of a multivariable prognostic model in addition to clinical variables.

  13. Calculation of complication probability of pion treatment at PSI using dose-volume histograms

    International Nuclear Information System (INIS)

    Nakagawa, Keiichi; Akanuma, Atsuo; Aoki, Yukimasa

    1991-01-01

    In the conformation technique a target volume is irradiated uniformly, as in conventional radiation, whereas surrounding tissues and organs are irradiated nonuniformly. Clinical data on radiation injuries accumulated with conventional radiation are therefore not applicable without appropriate compensation. Recently a putative solution to this problem was proposed by Lyman using dose-volume histograms. This histogram reduction method reduces a given dose-volume histogram of an organ to a single step which corresponds to the equivalent complication probability by interpolation. As a result it converts nonuniform radiation into a unique dose to the whole organ with the equivalent likelihood of radiation injury. The method is based on low-LET radiation with conventional fractionation schedules. When it is applied to high-LET radiation such as negative pion treatment, the high-LET dose should be converted to an equivalent photon dose using an appropriate value of RBE. In the present study the histogram reduction method was applied to actual patients treated by the negative pion conformation technique at the Paul Scherrer Institute. Of 90 evaluable cases of pelvic tumors, 16 developed grade III-IV bladder injury, and 7 developed grade III-IV rectal injury. The 90 cases were divided into roughly equal groups according to the equivalent doses to the entire bladder and rectum. Complication rates and equivalent doses to the full organs in these groups could be represented by a sigmoid dose-effect relation. When the RBE converting a pion dose to a photon dose was assumed to be 2.1, the rates of bladder complications fit the theoretical complication curve best; when the RBE value was 2.3, the rates of rectal injury fit the theoretical curve best. These values are close to the conversion factor of 2.0 used in clinical practice at PSI. This agreement suggests the clinical feasibility of the histogram reduction method in conformation radiotherapy. (author)
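
    The histogram reduction step described above can be illustrated with the Kutcher-Burman effective-volume variant of Lyman's method, which collapses a DVH into a single step (the whole effective volume at the maximum dose). A minimal Python sketch under that assumption; the volume-effect parameter n and the example DVH are illustrative, not values from the paper:

        import numpy as np

        def effective_volume(doses, volumes, n):
            """Reduce a differential DVH (dose bins, fractional volumes)
            to a one-step histogram: effective volume v_eff at D_max."""
            d_max = doses.max()
            v_eff = np.sum(volumes * (doses / d_max) ** (1.0 / n))
            return d_max, v_eff

        # Example: half the organ at 40 Gy, half at 60 Gy, with n = 0.5.
        print(effective_volume(np.array([40.0, 60.0]), np.array([0.5, 0.5]), n=0.5))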

  14. Correlation of histogram analysis of apparent diffusion coefficient with uterine cervical pathologic finding.

    Science.gov (United States)

    Lin, Yuning; Li, Hui; Chen, Ziqian; Ni, Ping; Zhong, Qun; Huang, Huijuan; Sandrasegaran, Kumar

    2015-05-01

    The purpose of this study was to investigate the application of histogram analysis of the apparent diffusion coefficient (ADC) in characterizing pathologic features of cervical cancer and benign cervical lesions. This prospective study was approved by the institutional review board, and written informed consent was obtained. Seventy-three patients with cervical cancer (33-69 years old; 35 patients with International Federation of Gynecology and Obstetrics stage IB cervical cancer) and 38 patients (38-61 years old) with normal cervix or benign cervical lesions (control group) were enrolled. All patients underwent 3-T diffusion-weighted imaging (DWI) with b values of 0 and 800 s/mm(2). ADC values of the entire tumor in the patient group and the whole cervix volume in the control group were assessed. Mean ADC, median ADC, 25th and 75th percentiles of ADC, skewness, and kurtosis were calculated. Histogram parameters were compared between different pathologic features, as well as between the stage IB cervical cancer and control groups. Mean ADC, median ADC, and 25th percentile of ADC were significantly higher for adenocarcinoma (p = 0.021, 0.006, and 0.004, respectively), and skewness was significantly higher for squamous cell carcinoma (p = 0.011). Median ADC was statistically significantly higher for well or moderately differentiated tumors (p = 0.044), and skewness was statistically significantly higher for poorly differentiated tumors (p = 0.004). No statistically significant difference in the ADC histogram was observed between lymphovascular space invasion subgroups. All histogram parameters differed significantly between the stage IB cervical cancer and control groups. ADC histogram analysis may help to distinguish early-stage cervical cancer from normal cervix or benign cervical lesions and may be useful for evaluating the different pathologic features of cervical cancer.
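
    Histogram parameters of the kind used here are straightforward to compute from the voxel ADC values of a segmented volume. A small sketch (the ADC sample below is synthetic, for illustration only):

        import numpy as np
        from scipy import stats

        def adc_histogram_params(adc_voxels):
            """Summary statistics of the ADC distribution in a region of interest."""
            return {
                "mean": np.mean(adc_voxels),
                "median": np.median(adc_voxels),
                "p25": np.percentile(adc_voxels, 25),
                "p75": np.percentile(adc_voxels, 75),
                "skewness": stats.skew(adc_voxels),
                "kurtosis": stats.kurtosis(adc_voxels),
            }

        adc = np.random.default_rng(0).normal(1.0e-3, 2.0e-4, 5000)  # mm^2/s
        print(adc_histogram_params(adc))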

  15. Principal component analysis of the CT density histogram to generate parametric response maps of COPD

    Science.gov (United States)

    Zha, N.; Capaldi, D. P. I.; Pike, D.; McCormack, D. G.; Cunningham, I. A.; Parraga, G.

    2015-03-01

    Pulmonary x-ray computed tomography (CT) may be used to characterize emphysema and airways disease in patients with chronic obstructive pulmonary disease (COPD). One analysis approach, parametric response mapping (PRM), utilizes registered inspiratory and expiratory CT image volumes and CT-density-histogram thresholds, but there is no consensus regarding the threshold values used, or their clinical meaning. Principal component analysis (PCA) of the CT density histogram can be exploited to quantify emphysema using data-driven CT-density-histogram thresholds. Thus, the objective of this proof-of-concept demonstration was to develop a PRM approach using PCA-derived thresholds in COPD patients and ex-smokers without airflow limitation. Methods: Fifteen COPD ex-smokers and 5 normal ex-smokers were evaluated. Thoracic CT images were acquired at full inspiration and full expiration, and these images were non-rigidly co-registered. PCA was performed on the CT density histograms, from which the components with eigenvalues greater than one were summed. Since the values of the principal component curve correlate directly with the variability in the sample, the maximum and minimum points on the curve were used as threshold values for the PCA-adjusted PRM technique. Results: A significant correlation was determined between conventional and PCA-adjusted PRM with 3He MRI apparent diffusion coefficient (p<0.001), with CT RA950 (p<0.0001), as well as with 3He MRI ventilation defect percent, a measurement of both small airways disease (p=0.049 and p=0.06, respectively) and emphysema (p=0.02). Conclusions: PRM generated using PCA thresholds of the CT density histogram showed significant correlations with CT and 3He MRI measurements of emphysema, but not airways disease.

  16. Artificial intelligent techniques for optimizing water allocation in a reservoir watershed

    Science.gov (United States)

    Chang, Fi-John; Chang, Li-Chiu; Wang, Yu-Chung

    2014-05-01

    This study proposes a systematic water allocation scheme that integrates system analysis with artificial intelligence techniques for reservoir operation, in consideration of the great hydrometeorological uncertainty, to mitigate drought impacts on the public and irrigation sectors. The AI techniques mainly include a genetic algorithm and an adaptive-network-based fuzzy inference system (ANFIS). We first derive evaluation diagrams through systematic interactive evaluations on long-term hydrological data to provide a clear simulation perspective of all possible drought conditions tagged with their corresponding water shortages; then search for the optimal reservoir operating histogram using a genetic algorithm (GA), based on given demands and hydrological conditions, which can be recognized as the optimal base of input-output training patterns for modelling; and finally build a suitable water allocation scheme by constructing an ANFIS model that learns the mechanism between designed inputs (water discount rates and hydrological conditions) and outputs (two scenarios: simulated and optimized water deficiency levels). The effectiveness of the proposed approach is tested on the operation of the Shihmen Reservoir in northern Taiwan for the first paddy crop in the study area, to assess the water allocation mechanism during drought periods. We demonstrate that the proposed water allocation scheme enables water managers to reliably determine a suitable discount rate on water supply for both the irrigation and public sectors, and thus can reduce the drought risk and the compensation amount induced by restrictions on agricultural water use.

  17. Quantum secret sharing based on modulated high-dimensional time-bin entanglement

    International Nuclear Information System (INIS)

    Takesue, Hiroki; Inoue, Kyo

    2006-01-01

    We propose a scheme for quantum secret sharing (QSS) that uses a modulated high-dimensional time-bin entanglement. By modulating the relative phase randomly by {0,π}, a sender with the entanglement source can randomly change the sign of the correlation of the measurement outcomes obtained by two distant recipients. The two recipients must cooperate if they are to obtain the sign of the correlation, which is used as a secret key. We show that our scheme is secure against intercept-and-resend (IR) and beam-splitting attacks by an outside eavesdropper thanks to the nonorthogonality of high-dimensional time-bin entangled states. We also show that a cheating attempt based on an IR attack by one of the recipients can be detected by changing the dimension of the time-bin entanglement randomly and inserting two 'vacant' slots between the packets; cheating attempts can then be detected by monitoring the count rate in the vacant slots. The proposed scheme has better experimental feasibility than previously proposed entanglement-based QSS schemes.

  18. Allocating application to group of consecutive processors in fault-tolerant deadlock-free routing path defined by routers obeying same rules for path selection

    Science.gov (United States)

    Leung, Vitus J [Albuquerque, NM; Phillips, Cynthia A [Albuquerque, NM; Bender, Michael A [East Northport, NY; Bunde, David P [Urbana, IL

    2009-07-21

    In a multiple processor computing apparatus, directional routing restrictions and a logical channel construct permit fault tolerant, deadlock-free routing. Processor allocation can be performed by creating a linear ordering of the processors based on routing rules used for routing communications between the processors. The linear ordering can assume a loop configuration, and bin-packing is applied to this loop configuration. The interconnection of the processors can be conceptualized as a generally rectangular 3-dimensional grid, and the MC allocation algorithm is applied with respect to the 3-dimensional grid.
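
    The allocation idea (order the processors linearly according to the routing rules, treat the ordering as a loop, and assign each job a contiguous run) can be sketched as a first-fit search over circular intervals. This is a simplified illustration of the concept, not the patented MC algorithm itself:

        def allocate(free, n_procs, request):
            """First-fit allocation of `request` consecutive processors on a
            loop; `free` is the set of free indices in the linear ordering."""
            for start in range(n_procs):
                run = [(start + k) % n_procs for k in range(request)]
                if all(p in free for p in run):
                    for p in run:
                        free.discard(p)
                    return run
            return None  # no contiguous run of that length is available

        free = set(range(16)) - {3, 9}   # two processors already busy
        print(allocate(free, 16, 5))     # -> [4, 5, 6, 7, 8]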

  19. Heuristics for the Variable Sized Bin Packing Problem Using a Hybrid P-System and CUDA Architecture

    OpenAIRE

    AlEnezi, Qadha'a; AboElFotoh, Hosam; AlBdaiwi, Bader; AlMulla, Mohammad Ali

    2016-01-01

    The Variable Sized Bin Packing Problem has a wide range of application areas including packing, scheduling, and manufacturing. Given a list of items and variable sized bin types, the objective is to minimize the total size of the used bins. This problem is known to be NP-hard. In this article, we present two new heuristics for solving the problem using a new variation of P systems with active membranes, which we call a hybrid P system, implemented in CUDA. Our hybrid P-system model allows usi...
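
    Setting aside the P-system and CUDA machinery, the optimization target is easy to state. As a baseline, a plain first-fit-decreasing heuristic for variable sized bin packing might look like the following; this is a generic textbook heuristic, not one of the article's two:

        def ffd_variable_bins(items, bin_types):
            """First-fit-decreasing packing with an unlimited supply of each
            bin type; the goal is to minimize the total size of used bins."""
            bins = []  # each bin is [remaining_capacity, size]
            for item in sorted(items, reverse=True):
                for b in bins:
                    if b[0] >= item:
                        b[0] -= item
                        break
                else:
                    size = min(s for s in bin_types if s >= item)
                    bins.append([size - item, size])  # open smallest fitting type
            return sum(b[1] for b in bins), len(bins)

        print(ffd_variable_bins([7, 5, 4, 3, 2, 2], bin_types=[6, 10]))  # (28, 4)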

  20. Off-Line Signature Verification System Based on Histogram of Oriented Gradient (HOG) and Histogram of Curvature (HoC) Features

    Directory of Open Access Journals (Sweden)

    Agus Wahyu Widodo

    2015-08-01

    A signature, with its unique properties, is one of the many personal attributes that are widely accepted for verifying a person's identity and proving ownership of transactions or documents. The successful use of gradient and curvature features in pattern recognition research, and the fact that a signature is essentially handwriting composed of various lines and curves with direction or orientation, are the reasons these two features are used here as the basis of an offline signature verification method. The implementations of pre-processing, feature extraction and representation, and SVM learning carried out in this study show that the HOG and HoC features can be exploited for offline signature verification. On the GPDS960Signature database, HOG and HoC computed on 30 x 30 pixel cells gave a best %FRR of 26.90 with %FAR of 37.56, while on the FUM-PHSDB database, HOG and HoC computed on 60 x 60 pixel cells gave a best %FRR of 4 with %FAR of 57. Keywords: signature verification, curvature, orientation, gradient, histogram of curvature (HoC), histogram of oriented gradient (HOG)
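
    On the HOG side, standard library support exists. A minimal sketch using scikit-image with the 30 x 30 pixel cell size reported above (the image loading, binarization, and block-normalization settings are assumptions, not details from the paper):

        import numpy as np
        from skimage.feature import hog

        def hog_signature_features(gray_image):
            """HOG descriptor over 30 x 30 pixel cells of a signature image."""
            return hog(gray_image, orientations=9, pixels_per_cell=(30, 30),
                       cells_per_block=(1, 1), feature_vector=True)

        image = np.random.rand(150, 300)  # stand-in for a preprocessed signature
        print(hog_signature_features(image).shape)  # 5 x 10 cells x 9 bins = (450,)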

  1. The Classification of Knowledge from the Perspective of Jābir bin Ḥayyān

    Directory of Open Access Journals (Sweden)

    Asep N. Musadad

    2015-12-01

    Abstract: The central purpose of this article is to provide a preliminary exploration of the epistemological background of the classical Islamic sciences through an investigation of the classification of knowledge. One of the earliest exponents to turn to this problem is Jābir bin Ḥayyān (Geber) (721-815 AD), known as the father of Arabic and, indirectly, Latin alchemy. Starting with Jābir's profile and his significance in Islamic philosophy, this article discusses the classification of knowledge set out in his book "Kitāb al-Ḥudūd" (Book of Limits), in which he makes his own classification of the various kinds of knowledge. It finally deals with the philosophical basis of the classification and the intellectual perspective of its author. Keywords: Jabir bin Hayyan, classification of knowledge, Islamic philosophy, natural sciences.

  2. Moleculo Long-Read Sequencing Facilitates Assembly and Genomic Binning from Complex Soil Metagenomes

    Energy Technology Data Exchange (ETDEWEB)

    White, Richard Allen; Bottos, Eric M.; Roy Chowdhury, Taniya; Zucker, Jeremy D.; Brislawn, Colin J.; Nicora, Carrie D.; Fansler, Sarah J.; Glaesemann, Kurt R.; Glass, Kevin; Jansson, Janet K.; Langille, Morgan

    2016-06-28

    Soil metagenomics has been touted as the "grand challenge" for metagenomics, as the high microbial diversity and spatial heterogeneity of soils make them unamenable to current assembly platforms. Here, we aimed to improve soil metagenomic sequence assembly by applying the Moleculo synthetic long-read sequencing technology. In total, we obtained 267 Gbp of raw sequence data from a native prairie soil; these data included 109.7 Gbp of short-read data (~100 bp) from the Joint Genome Institute (JGI), an additional 87.7 Gbp of rapid-mode read data (~250 bp), plus 69.6 Gbp (>1.5 kbp) from Moleculo sequencing. The Moleculo data alone yielded over 5,600 reads of >10 kbp in length, and over 95% of the unassembled reads mapped to contigs of >1.5 kbp. Hybrid assembly of all data resulted in more than 10,000 contigs over 10 kbp in length. We mapped three replicate metatranscriptomes derived from the same parent soil to the Moleculo subassembly and found that 95% of the predicted genes, based on their assignments to Enzyme Commission (EC) numbers, were expressed. The Moleculo subassembly also enabled binning of >100 microbial genome bins. We obtained via direct binning the first complete genome, that of "Candidatus Pseudomonas sp. strain JKJ-1", from a native soil metagenome. By mapping metatranscriptome sequence reads back to the bins, we found that several bins corresponding to low-relative-abundance Acidobacteria were highly transcriptionally active, whereas bins corresponding to high-relative-abundance Verrucomicrobia were not. These results demonstrate that Moleculo sequencing provides a significant advance for resolving complex soil microbial communities.

    IMPORTANCE: Soil microorganisms carry out key processes for life on our planet, including cycling of carbon and other nutrients and supporting growth of plants. However, there is poor molecular-level understanding of their...

  3. Learning and strategic asset allocation

    OpenAIRE

    Kearns, Michael

    2016-01-01

    This thesis investigates whether or not models that portray the relationship between what an investor learns and how he allocates his portfolio can explain phenomena related to household behaviour in the stock market. Endogenous modelling of household learning is utilised, which builds on a growing literature called bounded rationality with increasing explanatory power, offering an alternative to the classical rational expectations theory. Such phenomena include firstly why households often h...

  4. Subtype differentiation of renal tumors using voxel-based histogram analysis of intravoxel incoherent motion parameters.

    Science.gov (United States)

    Gaing, Byron; Sigmund, Eric E; Huang, William C; Babb, James S; Parikh, Nainesh S; Stoffel, David; Chandarana, Hersh

    2015-03-01

    The aim of this study was to determine if voxel-based histogram analysis of intravoxel incoherent motion imaging (IVIM) parameters can differentiate various subtypes of renal tumors, including benign and malignant lesions. A total of 44 patients with renal tumors who underwent surgery and had histopathology available were included in this Health Insurance Portability and Accountability Act-compliant, institutional review board-approved, single-institution prospective study. In addition to a routine renal magnetic resonance imaging examination performed on a 1.5-T system, all patients were imaged with axial diffusion-weighted imaging using 8 b values (range, 0-800 s/mm(2)). A biexponential model was fitted to the diffusion signal data using a segmented algorithm to extract the IVIM parameters perfusion fraction (fp), tissue diffusivity (Dt), and pseudodiffusivity (Dp) for each voxel. Mean and histogram measures of heterogeneity (standard deviation, skewness, and kurtosis) of IVIM parameters were correlated with pathology results of tumor subtype using unequal variance t tests to compare subtypes in terms of each measure. Correction for multiple comparisons was accomplished using the Tukey honestly significant difference procedure. A total of 44 renal tumors, including 23 clear cell (ccRCC), 4 papillary (pRCC), 5 chromophobe, and 5 cystic renal cell carcinomas, as well as benign lesions, 4 oncocytomas (Onc) and 3 angiomyolipomas (AMLs), were included in our analysis. Mean IVIM parameters fp and Dt differentiated 8 of 15 pairs of renal tumors. Histogram analysis of IVIM parameters differentiated 9 of 15 subtype pairs. One subtype pair (ccRCC vs pRCC) was differentiated by mean analysis but not by histogram analysis. However, 2 other subtype pairs (AML vs Onc and ccRCC vs Onc) were differentiated by histogram distribution parameters exclusively. The standard deviation of Dt [σ(Dt)] differentiated ccRCC (0.362 ± 0.136 × 10⁻³ mm²/s) from AML (0.199 ± 0.043 × 10⁻³ mm²/s) (P = 0...

  5. Longitudinal Acceleration Tests of Overhead Luggage Bins and Auxiliary Fuel Tank in a Transport Airplane Airframe Section

    National Research Council Canada - National Science Library

    McGuire, Robert

    1999-01-01

    This report contains the description and test results of overhead stowage bin calibrations and longitudinal impact testing of a 10-foot transport airframe section conducted at the Transportation Research Center Inc. (TRC...

  6. Longitudinal Acceleration Test of Overhead Luggage Bins and Auxiliary Fuel Tank in a Transport Airplane Airframe Section, Part 2

    National Research Council Canada - National Science Library

    McGuire, Robert

    2000-01-01

    This report contains the description and test results of overhead stowage bin calibrations and longitudinal impact testing of a 10-foot transport airframe section conducted at the Transportation Research Center Inc. (TRC...

  7. Resource allocation in networks via coalitional games

    NARCIS (Netherlands)

    Shams, F.

    2016-01-01

    The main goal of this dissertation is to manage resource allocation in network engineering problems and to introduce efficient cooperative algorithms to obtain high performance, ensuring fairness and stability. Specifically, this dissertation introduces new approaches for resource allocation in...

  8. Asset Allocation of Mutual Fund Investors

    OpenAIRE

    Dengpan Luo

    2003-01-01

    This paper studies mutual fund investors' asset allocation decisions using monthly flow data of U.S mutual fund industry from 1984 to 1998. We find that mutual fund investors change their asset allocations between stocks and bonds in reaction to business conditions tracked by changes in expected stock market returns. They tend to allocate less into stock funds during the trough of a business cycle when expected stock market returns are higher and to allocate more into stock funds during the p...

  9. Optimal resource allocation for distributed video communication

    CERN Document Server

    He, Yifeng

    2013-01-01

    While most books on the subject focus on resource allocation in just one type of network, this book is the first to examine the common characteristics of multiple distributed video communication systems. Comprehensive and systematic, Optimal Resource Allocation for Distributed Video Communication presents a unified optimization framework for resource allocation across these systems. The book examines the techniques required for optimal resource allocation over Internet, wireless cellular networks, wireless ad hoc networks, and wireless sensor networks. It provides you with the required foundat...

  10. Intelligent tactical asset allocation support system

    NARCIS (Netherlands)

    Hiemstra, Y.

    1995-01-01

    This paper presents an advanced support system for Tactical Asset Allocation. Asset allocation explains over 90% of portfolio performance (Brinson, Hood and Beebower, 1988). Tactical asset allocation adjusts a strategic portfolio on the basis of short term market outlooks. The system includes...

  11. Contrast Enhancement Using Brightness Preserving Histogram Equalization Technique for Classification of Date Varieties

    Directory of Open Access Journals (Sweden)

    G Thomas

    2014-06-01

    Computer vision techniques are becoming popular for quality assessment of many products in the food industry. Image enhancement is the first step in analyzing the images in order to obtain detailed information for the determination of quality. In this study, a brightness-preserving histogram equalization technique was used to enhance the features of grayscale images to classify three date varieties (Khalas, Fard and Madina). Mean, entropy, kurtosis and skewness features were extracted from the original and enhanced images. Mean and entropy from the original images and kurtosis from the enhanced images were selected based on Lukka's feature selection approach. An overall classification efficiency of 93.72% was achieved with just three features. The brightness-preserving histogram equalization technique has great potential to improve classification across various quality attributes of food and agricultural products with minimal features.
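
    Brightness-preserving bi-histogram equalization splits the histogram at the image mean and equalizes each part within its own gray-level range, so the output brightness stays close to the input brightness. A compact sketch of the generic technique (not the authors' exact implementation):

        import numpy as np

        def bbhe(gray):
            """Brightness-preserving bi-histogram equalization, uint8 input."""
            mean = int(gray.mean())

            def equalize(sub, lo, hi):
                hist, _ = np.histogram(sub, bins=hi - lo + 1, range=(lo, hi + 1))
                cdf = hist.cumsum() / max(hist.sum(), 1)
                lut = (lo + cdf * (hi - lo)).astype(np.uint8)
                return lut[sub - lo]

            out = np.empty_like(gray)
            low, high = gray <= mean, gray > mean
            out[low] = equalize(gray[low], 0, mean)
            out[high] = equalize(gray[high], mean + 1, 255)
            return out

    Because each sub-histogram is equalized only over its own range, the overall mean gray level (and hence perceived brightness) is approximately preserved, unlike plain histogram equalization.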

  12. Real time object localization based on histogram of s-RGB

    Science.gov (United States)

    Mudjirahardjo, Panca; Suyono, Hadi; Setyawan, Raden Arief

    2017-09-01

    Object localization is the first task in pattern detection and recognition. It is very important because it reduces the time spent searching for the object of interest. In this paper we introduce a novel method for object localization based on a color feature: a histogram of s-RGB. This histogram is used in the training phase to determine the dominant color in the initial region of interest (ROI). This information is then used to label the object of interest. To reduce noise and localize the object, we apply row and column pixel density functions. Compared with other approaches, our system gives the best result and takes a short computation time of 26.56 ms at a video rate of 15 frames per second (fps).

  13. 3D facial expression recognition based on histograms of surface differential quantities

    KAUST Repository

    Li, Huibin

    2011-01-01

    3D face models accurately capture facial surfaces, making precise description of facial activities possible. In this paper, we present a novel mesh-based method for 3D facial expression recognition using two local shape descriptors. To characterize the shape information of the local neighborhood of facial landmarks, we calculate weighted statistical distributions of surface differential quantities, including a histogram of mesh gradient (HoG) and a histogram of shape index (HoS). A normal-cycle-theory-based curvature estimation method is employed on the 3D face models, along with the common cubic-fitting curvature estimation method, for comparison. Based on the basic fact that different expressions involve different local shape deformations, the SVM classifier with both linear and RBF kernels outperforms state-of-the-art results on the subset of the BU-3DFE database with the same experimental setting.

  14. Clarification of the use of chi-square and likelihood functions in fits to histograms

    International Nuclear Information System (INIS)

    Baker, S.; Cousins, R.D.

    1984-01-01

    We consider the problem of fitting curves to histograms in which the data obey multinomial or Poisson statistics. Techniques commonly used by physicists are examined in light of standard results found in the statistics literature. We review the relationship between multinomial and Poisson distributions, and clarify a sufficient condition for equality of the area under the fitted curve and the number of events on the histogram. Following the statisticians, we use the likelihood ratio test to construct a general χ² statistic, χ²_λ, which yields parameter and error estimates identical to those of the method of maximum likelihood. The χ²_λ statistic is further useful for testing goodness-of-fit since the value of its minimum asymptotically obeys a classical chi-square distribution. One should be aware, however, of the potential for statistical bias, especially when the number of events is small. (orig.)
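
    For Poisson-distributed bin contents n_i and fitted predictions y_i(θ), the likelihood-ratio statistic discussed here takes the standard Baker-Cousins form (bins with n_i = 0 contribute only the first term):

        \chi^2_{\lambda,P} = 2 \sum_i \left[ y_i(\theta) - n_i + n_i \ln\frac{n_i}{y_i(\theta)} \right]

    Minimizing this statistic reproduces the maximum-likelihood estimates, and its minimum value asymptotically follows a classical chi-square distribution, which is what makes it usable as a goodness-of-fit measure.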

  15. Kernel Learning of Histogram of Local Gabor Phase Patterns for Face Recognition

    Directory of Open Access Journals (Sweden)

    Bineng Zhong

    2008-06-01

    This paper proposes a new face recognition method, named kernel learning of histogram of local Gabor phase patterns (K-HLGPP), which is based on Daugman's method for iris recognition and the local XOR pattern (LXP) operator. Unlike traditional Gabor usage exploiting the magnitude part in face recognition, we encode the Gabor phase information for face classification by the quadrant bit coding (QBC) method. Two schemes are proposed for face recognition. One is based on the nearest-neighbor classifier with chi-square as the similarity measurement, and the other performs kernel discriminant analysis for HLGPP (K-HLGPP) using histogram intersection and Gaussian-weighted chi-square kernels. The comparative experiments show that K-HLGPP achieves a higher recognition rate than other well-known face recognition systems on the large-scale standard FERET, FERET200, and CAS-PEAL-R1 databases.

  16. Adaptive local backlight dimming algorithm based on local histogram and image characteristics

    DEFF Research Database (Denmark)

    Nadernejad, Ehsan; Burini, Nino; Korhonen, Jari

    2013-01-01

    Liquid Crystal Displays (LCDs) with Light Emitting Diode (LED) backlight are a very popular display technology, used for instance in television sets, monitors and mobile phones. This paper presents a new backlight dimming algorithm that exploits the characteristics of the target image, such as the local histograms and the average pixel intensity of each backlight segment, to reduce the power consumption of the backlight and enhance image quality. The local histogram of the pixels within each backlight segment is calculated and, based on this average, an adaptive quantile value is extracted... The proposed method provides a better trade-off between power consumption and image quality preservation than the other algorithms representing the state of the art among feature-based backlight algorithms.

  17. An improved contrast enhancement algorithm for infrared images based on adaptive double plateaus histogram equalization

    Science.gov (United States)

    Li, Shuo; Jin, Weiqi; Li, Li; Li, Yiyang

    2018-05-01

    Infrared thermal images can reflect the thermal-radiation distribution of a particular scene. However, the contrast of the infrared images is usually low. Hence, it is generally necessary to enhance the contrast of infrared images in advance to facilitate subsequent recognition and analysis. Based on the adaptive double plateaus histogram equalization, this paper presents an improved contrast enhancement algorithm for infrared thermal images. In the proposed algorithm, the normalized coefficient of variation of the histogram, which characterizes the level of contrast enhancement, is introduced as feedback information to adjust the upper and lower plateau thresholds. The experiments on actual infrared images show that compared to the three typical contrast-enhancement algorithms, the proposed algorithm has better scene adaptability and yields better contrast-enhancement results for infrared images with more dark areas or a higher dynamic range. Hence, it has high application value in contrast enhancement, dynamic range compression, and digital detail enhancement for infrared thermal images.
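
    In double-plateau equalization the histogram is clipped from above, so large uniform backgrounds cannot dominate the mapping, and raised from below, so sparse gray levels do not vanish, before the usual cumulative mapping. A minimal fixed-threshold sketch; the adaptive feedback on the coefficient of variation described above is omitted, and the threshold values are illustrative:

        import numpy as np

        def double_plateau_equalize(img, t_up, t_down, out_levels=256):
            """Plateau-clipped histogram equalization for high-dynamic-range
            (e.g., 14-bit) infrared frames, mapped to 8-bit output."""
            hist = np.bincount(img.ravel())
            clipped = np.clip(hist, 0, t_up)
            clipped[(hist > 0) & (clipped < t_down)] = t_down  # lower plateau
            cdf = clipped.cumsum() / clipped.sum()
            lut = np.round(cdf * (out_levels - 1)).astype(np.uint8)
            return lut[img]

        frame = np.random.randint(0, 2**14, size=(240, 320))
        print(double_plateau_equalize(frame, t_up=200, t_down=5).shape)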

  18. Multi-stream LSTM-HMM decoding and histogram equalization for noise robust keyword spotting.

    Science.gov (United States)

    Wöllmer, Martin; Marchi, Erik; Squartini, Stefano; Schuller, Björn

    2011-09-01

    Highly spontaneous, conversational, and potentially emotional and noisy speech is known to be a challenge for today's automatic speech recognition (ASR) systems, which highlights the need for advanced algorithms that improve speech features and models. Histogram equalization is an efficient method to reduce the mismatch between clean and noisy conditions by normalizing all moments of the probability distribution of the feature vector components. In this article, we propose to combine histogram equalization and multi-condition training for robust keyword detection in noisy speech. To better cope with conversational speaking styles, we show how contextual information can be effectively exploited in a multi-stream ASR framework that dynamically models context-sensitive phoneme estimates generated by a long short-term memory neural network. The proposed techniques are evaluated on the SEMAINE database, a corpus containing emotionally colored conversations with a cognitive system for "Sensitive Artificial Listening".

  19. Locally advanced rectal cancer: post-chemoradiotherapy ADC histogram analysis for predicting a complete response.

    Science.gov (United States)

    Cho, Seung Hyun; Kim, Gab Chul; Jang, Yun-Jin; Ryeom, Hunkyu; Kim, Hye Jung; Shin, Kyung-Min; Park, Jun Seok; Choi, Gyu-Seog; Kim, See Hyung

    2015-09-01

    The value of diffusion-weighted imaging (DWI) for reliable differentiation between pathologic complete response (pCR) and residual tumor is still unclear. Recently, a few studies reported that histogram analysis can be helpful for monitoring the therapeutic response in various cancers. We investigated whether post-chemoradiotherapy (CRT) apparent diffusion coefficient (ADC) histogram analysis can help predict a pCR in locally advanced rectal cancer (LARC). Fifty patients who underwent preoperative CRT followed by surgery were enrolled in this retrospective study: non-pCR (n = 41) and pCR (n = 9). ADC histogram analysis encompassing the whole tumor was performed on two post-CRT ADC600 and ADC1000 (b factors 0, 600 vs. 0, 1000 s/mm(2)) maps. Mean, minimum, maximum, SD, mode, 10th, 25th, 50th, 75th, 90th percentile ADCs, skewness, and kurtosis were derived. Diagnostic performance for predicting pCR was evaluated and compared. On both maps, the 10th and 25th percentile ADCs showed better diagnostic performance than the mean ADC. The 10th percentile ADCs revealed the best diagnostic performance on both the ADC600 (AZ 0.841, sensitivity 100%, specificity 70.7%) and ADC1000 (AZ 0.821, sensitivity 77.8%, specificity 87.8%) maps. In a comparison between the 10th percentile and mean ADC, the specificity was significantly improved on both the ADC600 (70.7% vs. 53.7%; P = 0.031) and ADC1000 (87.8% vs. 73.2%; P = 0.039) maps. Post-CRT ADC histogram analysis is helpful for predicting pCR in LARC and, especially, improves specificity compared with the mean ADC.

  20. Histogram Matching Extends Acceptable Signal Strength Range on Optical Coherence Tomography Images

    Science.gov (United States)

    Chen, Chieh-Li; Ishikawa, Hiroshi; Wollstein, Gadi; Bilonick, Richard A.; Sigal, Ian A.; Kagemann, Larry; Schuman, Joel S.

    2015-01-01

    Purpose. We minimized the influence of image quality variability, as measured by signal strength (SS), on optical coherence tomography (OCT) thickness measurements using the histogram matching (HM) method. Methods. We scanned 12 eyes from 12 healthy subjects with the Cirrus HD-OCT device to obtain a series of OCT images with a wide range of SS (maximal range, 1–10) at the same visit. For each eye, the histogram of an image with the highest SS (best image quality) was set as the reference. We applied HM to the images with lower SS by shaping the input histogram into the reference histogram. Retinal nerve fiber layer (RNFL) thickness was automatically measured before and after HM processing (defined as original and HM measurements), and compared to the device output (device measurements). Nonlinear mixed effects models were used to analyze the relationship between RNFL thickness and SS. In addition, the lowest tolerable SSs, which gave the RNFL thickness within the variability margin of manufacturer recommended SS range (6–10), were determined for device, original, and HM measurements. Results. The HM measurements showed less variability across a wide range of image quality than the original and device measurements (slope = 1.17 vs. 4.89 and 1.72 μm/SS, respectively). The lowest tolerable SS was successfully reduced to 4.5 after HM processing. Conclusions. The HM method successfully extended the acceptable SS range on OCT images. This would qualify more OCT images with low SS for clinical assessment, broadening the OCT application to a wider range of subjects. PMID:26066749
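
    Histogram matching itself is a standard operation: push each input gray level through the input CDF, then through the inverse of the reference CDF. A generic sketch of that step (not the authors' Cirrus-specific pipeline; the arrays below are synthetic stand-ins for B-scans):

        import numpy as np

        def match_histogram(source, reference):
            """Reshape `source`'s histogram into `reference`'s via CDF matching."""
            s_vals, s_counts = np.unique(source.ravel(), return_counts=True)
            r_vals, r_counts = np.unique(reference.ravel(), return_counts=True)
            s_cdf = np.cumsum(s_counts) / source.size
            r_cdf = np.cumsum(r_counts) / reference.size
            # For each source quantile, look up the reference gray level.
            matched = np.interp(s_cdf, r_cdf, r_vals)
            return np.interp(source.ravel(), s_vals, matched).reshape(source.shape)

        rng = np.random.default_rng(1)
        low_ss = rng.normal(80, 10, (64, 64))    # low signal strength scan
        high_ss = rng.normal(120, 25, (64, 64))  # highest-SS reference scan
        print(match_histogram(low_ss, high_ss).mean())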

  1. Treatment and Combination of Data Quality Monitoring Histograms to Perform Data vs. Monte Carlo Validation

    CERN Document Server

    Colin, Nolan

    2013-01-01

    In CMS's automated data quality validation infrastructure, it is not currently possible to assess how well Monte Carlo simulations describe data from collisions, if at all. In order to guarantee high quality data, a novel work flow was devised to perform `data vs. Monte Carlo' validation. Support for this comparison was added by allowing distributions from several Monte Carlo samples to be combined, matched to the data and then displayed in a histogram stack, overlaid with the experimental data.

  2. A DNA-based registry for all animal species: the barcode index number (BIN) system.

    Directory of Open Access Journals (Sweden)

    Sujeevan Ratnasingham

    Because many animal species are undescribed, and because the identification of known species is often difficult, interim taxonomic nomenclature has often been used in biodiversity analysis. By assigning individuals to presumptive species, called operational taxonomic units (OTUs), these systems speed investigations into the patterning of biodiversity and enable studies that would otherwise be impossible. Although OTUs have conventionally been separated through their morphological divergence, DNA-based delineations are not only feasible, but have important advantages. OTU designation can be automated, data can be readily archived, and results can be easily compared among investigations. This study exploits these attributes to develop a persistent, species-level taxonomic registry for the animal kingdom based on the analysis of patterns of nucleotide variation in the barcode region of the cytochrome c oxidase I (COI) gene. It begins by examining the correspondence between groups of specimens identified to a species through prior taxonomic work and those inferred from the analysis of COI sequence variation using one new (RESL) and four established (ABGD, CROP, GMYC, jMOTU) algorithms. It subsequently describes the implementation and structural attributes of the Barcode Index Number (BIN) system. Aside from a pragmatic role in biodiversity assessments, BINs will aid revisionary taxonomy by flagging possible cases of synonymy, and by collating geographical information, descriptive metadata, and images for specimens that are likely to belong to the same species, even if it is undescribed. More than 274,000 BIN web pages are now available, creating a biodiversity resource that is positioned for rapid growth.

  3. A novel method for the evaluation of uncertainty in dose-volume histogram computation.

    Science.gov (United States)

    Henríquez, Francisco Cutanda; Castrillón, Silvia Vargas

    2008-03-15

    Dose-volume histograms (DVHs) are a useful tool in state-of-the-art radiotherapy treatment planning, and it is essential to recognize their limitations. Even after a specific dose-calculation model is optimized, dose distributions computed by treatment-planning systems are affected by several sources of uncertainty, such as algorithm limitations, measurement uncertainty in the data used to model the beam, and residual differences between measured and computed dose. This report presents a novel method to take these uncertainties into account: a probabilistic approach using a new kind of histogram, the dose-expected volume histogram. The expected value of the volume in the region of interest receiving an absorbed dose equal to or greater than a certain value is found by using the probability distribution of the dose at each point. A rectangular probability distribution is assumed for this point dose, and a formulation that accounts for uncertainties associated with point dose is presented for practical computations. This method is applied to a set of DVHs for different regions of interest, including 6 brain patients, 8 lung patients, 8 pelvis patients, and 6 prostate patients planned for intensity-modulated radiation therapy. Results show a greater effect on planning target volume coverage than on organs at risk. In cases of steep DVH gradients, such as planning target volumes, the new method shows the largest differences from the corresponding DVH; thus, the effect of the uncertainty is larger.
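
    Under the rectangular (uniform) point-dose distribution assumed here, the probability that a voxel's true dose reaches a threshold has a closed form, and the expected volume is the sum of those probabilities over voxels. A small sketch of that computation; the voxel doses and half-width delta are illustrative:

        import numpy as np

        def expected_volume_histogram(doses, delta, thresholds):
            """Expected fractional volume receiving >= d for each threshold d,
            with each voxel dose uniform on [D - delta, D + delta]."""
            d = np.asarray(doses, dtype=float)[:, None]
            t = np.asarray(thresholds, dtype=float)[None, :]
            p = np.clip((d + delta - t) / (2 * delta), 0.0, 1.0)  # P(dose >= t)
            return p.mean(axis=0)

        voxel_doses = np.array([58.0, 59.5, 60.0, 61.0])  # Gy
        print(expected_volume_histogram(voxel_doses, delta=1.5,
                                        thresholds=[58.0, 60.0, 62.0]))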

  4. Android-Based Meat Quality Detection System Using Histogram Equalization and Thresholding

    Directory of Open Access Journals (Sweden)

    Anggit Sri Herlambang

    2016-04-01

    The rising demand for beef is often exploited by beef sellers to commit fraud, usually with respect to the quality of the meat. Meat quality is determined by several parameters, including size, texture, color characteristics, smell, and others; such parameters are important factors in determining quality. Meat quality is generally judged by eye, so the manual approach remains subjective. This study aims to design a meat quality detection application, tested on a sample of 20 meat images. This Android-based system for detecting meat quality with histogram equalization and thresholding was built using an Android programming environment integrated with the Android SDK, Eclipse, and the OpenCV library. The method applies histogram equalization as pre-processing and thresholding as the segmentation step of image processing. Meat quality is detected by computing statistical feature-extraction values from the meat image data of the study. The results give the mean and standard deviation of the images processed with histogram equalization and thresholding, together with an analysis of beef image quality. Black-box testing of the application showed that all of its functions ran as intended. It is hoped that this work can support subsequent stages of research.

  5. RGB Color Cube-Based Histogram Specification for Hue-Preserving Color Image Enhancement

    Directory of Open Access Journals (Sweden)

    Kohei Inoue

    2017-07-01

    A large number of color image enhancement methods are based on methods for grayscale image enhancement, in which the main interest is contrast enhancement. However, since colors usually have three attributes, including hue, saturation and intensity, rather than the single attribute of a grayscale value, the naive application of grayscale methods to color images often produces unsatisfactory results. Conventional hue-preserving color image enhancement methods utilize histogram equalization (HE) for enhancing the contrast. However, they cannot always enhance the saturation simultaneously. In this paper, we propose a histogram specification (HS) method for enhancing the saturation in hue-preserving color image enhancement. The proposed method computes the target histogram for HS on the basis of the geometry of RGB (red, green and blue) color space, whose shape is a cube with unit side length. Therefore, the proposed method includes no parameters to be set by users. Experimental results show that the proposed method achieves higher color saturation than recent parameter-free methods for hue-preserving color image enhancement. As a result, the proposed method can be used as an alternative to HE in hue-preserving color image enhancement.

  6. Introducing the Jacobian-volume-histogram of deforming organs: application to parotid shrinkage evaluation

    International Nuclear Information System (INIS)

    Fiorino, Claudio; Maggiulli, Eleonora; Broggi, Sara; Cattaneo, Giovanni Mauro; Calandrino, Riccardo; Liberini, Simone; Faggiano, Elena; Rizzo, Giovanna; Dell'Oca, Italo; Di Muzio, Nadia

    2011-01-01

    The Jacobian of the deformation field of an elastic registration between images taken during radiotherapy is a measure of inter-fraction local deformation. The histogram of the Jacobian values (Jac) within an organ was introduced (JVH: Jacobian-volume histogram) and first applied in quantifying parotid shrinkage. MVCTs of 32 patients previously treated with helical tomotherapy for head-neck cancers were collected. Parotid deformation was evaluated through elastic registration between MVCTs taken at the first and last fractions. Jac was calculated for each voxel of all parotids, and integral JVHs were calculated for each parotid; the correlation between the JVH and the planning dose-volume histogram (DVH) was investigated. On average, 82% (±17%) of the voxels shrink (Jac < 1), and a fraction of the voxels shrink by more than 50% (Jac < 0.5). The best correlation between the DVH and the JVH was found between V10 and V15, and Jac < 0.4-0.6 (p < 0.01). The best constraint predicting a higher number of strongly compressing voxels (fraction with Jac < 0.5 above its 7.5% median value) was V15 ≥ 75% (OR: 7.6, p = 0.002). Jac and the JVH are promising tools for scoring/modelling toxicity and for evaluating organ/contour variations, with potential applications in adaptive radiotherapy.

  7. Whole brain magnetization transfer histogram analysis of pediatric acute lymphoblastic leukemia patients receiving intrathecal methotrexate therapy

    Energy Technology Data Exchange (ETDEWEB)

    Yamamoto, Akira [Department of Diagnostic Imaging and Nuclear Medicine, Graduate School of Medicine, Kyoto University, 54 Kawahara-cho, Shogoin, Sakyo-ku, Kyoto-shi Kyoto 606-8507 (Japan)]. E-mail: yakira@kuhp.kyoto-u.ac.jp; Miki, Yukio [Department of Diagnostic Imaging and Nuclear Medicine, Graduate School of Medicine, Kyoto University, 54 Kawahara-cho, Shogoin, Sakyo-ku, Kyoto-shi Kyoto 606-8507 (Japan)]. E-mail: mikiy@kuhp.kyoto-u.ac.jp; Adachi, Souichi [Department of Pediatrics, Graduate School of Medicine, Kyoto University, 54 Kawahara-cho, Shogoin, Sakyo-ku, Kyoto-shi Kyoto 606-8507 (Japan)]. E-mail: sadachi@kuhp.kyoto-u.ac.jp (and others)

    2006-03-15

    Background and purpose: The purpose of this prospective study was to evaluate the hypothesis that magnetization transfer ratio (MTR) histogram analysis of the whole brain could detect early and subtle brain changes nonapparent on conventional magnetic resonance imaging (MRI) in children with acute lymphoblastic leukemia (ALL) receiving methotrexate (MTX) therapy. Materials and methods: Subjects in this prospective study comprised 10 children with ALL (mean age, 6 years; range, 0-16 years). In addition to conventional MRI, magnetization transfer images were obtained before and after intrathecal and intravenous MTX therapy. MTR values were calculated and plotted as a histogram, and peak height and location were calculated. Differences in peak height and location between pre- and post-MTX therapy scans were statistically analyzed. Conventional MRI was evaluated for abnormal signal area in white matter. Results: MTR peak height was significantly lower on post-MTX therapy scans than on pre-MTX therapy scans (p = 0.002). No significant differences in peak location were identified between pre- and post-chemotherapy imaging. No abnormal signals were noted in white matter on either pre- or post-MTX therapy conventional MRI. Conclusions: This study demonstrates that MTR histogram analysis allows better detection of early and subtle brain changes in ALL patients who receive MTX therapy than conventional MRI.

  8. LOR-OSEM: statistical PET reconstruction from raw line-of-response histograms

    International Nuclear Information System (INIS)

    Kadrmas, Dan J

    2004-01-01

    Iterative statistical reconstruction methods are becoming the standard in positron emission tomography (PET). Conventional maximum-likelihood expectation-maximization (MLEM) and ordered-subsets (OSEM) algorithms act on data which have been pre-processed into corrected, evenly-spaced histograms; however, such pre-processing corrupts the Poisson statistics. Recent advances have incorporated attenuation, scatter and randoms compensation into the iterative reconstruction. The objective of this work was to incorporate the remaining pre-processing steps, including arc correction, to reconstruct directly from raw unevenly-spaced line-of-response (LOR) histograms. This exactly preserves Poisson statistics and full spatial information in a manner closely related to listmode ML, making full use of the ML statistical model. The LOR-OSEM algorithm was implemented using a rotation-based projector which maps directly to the unevenly-spaced LOR grid. Simulation and phantom experiments were performed to characterize resolution, contrast and noise properties for 2D PET. LOR-OSEM provided a beneficial noise-resolution tradeoff, outperforming AW-OSEM by about the same margin that AW-OSEM outperformed pre-corrected OSEM. The relationship between LOR-ML and listmode ML algorithms was explored, and implementation differences are discussed. LOR-OSEM is a viable alternative to AW-OSEM for histogram-based reconstruction with improved spatial resolution and noise properties

  9. BED-Volume histograms calculation for routine clinical dosimetry in brachytherapy

    International Nuclear Information System (INIS)

    Galelli, M.; Feroldi, P.

    1995-01-01

    The consideration of volumes is essential in brachytherapy clinical dosimetry (ICRU). Indeed, several indices, all based on dose-volume histograms (DVHs), have been designed to evaluate, before therapy, the volumetric quality of different possible implant geometries and, during therapy, the consistency of the real and the planned implants. Radiobiological evaluations, which consider the temporal pattern of dose deposition in the treatment, can usefully be added to dosimetric calculations to compare different treatment schedules. The linear-quadratic model is the most widely used radiobiological model, and the biologically effective dose (BED) is its principal related dosimetric quantity. The consideration of BED-volume histograms (BED-VHs) is therefore a straightforward extension of DVHs. In practice, BED-VHs, combined with dose-volume histograms, can help relative comparisons and optimizations in treatment planning. Since 1994 the dosimetric calculations for all gynecological brachytherapy treatments have also considered DVHs and BED-VHs. In this presentation we show the methods of BED-VH calculation, together with some typical results.
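
    For reference, the linear-quadratic BED underlying a BED-VH, for n fractions of dose d to a tissue with a given α/β ratio, is the standard expression (without the repopulation or dose-rate corrections that brachytherapy schedules may require):

        \mathrm{BED} = n\,d \left( 1 + \frac{d}{\alpha/\beta} \right)

    A BED-VH is then obtained exactly like a DVH, but with each voxel's physical dose replaced by its BED.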

  10. Histogram analysis of diffusion kurtosis imaging of nasopharyngeal carcinoma: Correlation between quantitative parameters and clinical stage.

    Science.gov (United States)

    Xu, Xiao-Quan; Ma, Gao; Wang, Yan-Jun; Hu, Hao; Su, Guo-Yi; Shi, Hai-Bin; Wu, Fei-Yun

    2017-07-18

    To evaluate the correlation between histogram parameters derived from diffusion kurtosis (DK) imaging and the clinical stage of nasopharyngeal carcinoma (NPC). Histogram parameters, including the mean, median, 10th and 90th percentiles, skewness and kurtosis of Dapp and Kapp, were calculated. Patients were divided into low and high T, N and clinical stage based on the American Joint Committee on Cancer (AJCC) staging system. Differences in histogram parameters between low and high T, N and AJCC stages were compared using the t test. Multiple receiver operating characteristic (ROC) curves were used to determine and compare the value of significant parameters in predicting high T, N and AJCC stage, respectively. High T-stage (T3/4) NPC showed significantly higher Kapp-mean (P = 0.018), Kapp-median (P = 0.029) and Kapp-90th (P = 0.003) than low T-stage (T1/2) NPC. High N-stage (N2/3) NPC showed significantly lower Dapp-mean (P = 0.002), Dapp-median (P = 0.002) and Dapp-10th than low N-stage NPC. DK imaging-derived parameters correlated well with the clinical stage of NPC, and could therefore serve as an adjunctive imaging technique for evaluating NPC.

  11. Digital image classification with the help of artificial neural network by simple histogram.

    Science.gov (United States)

    Dey, Pranab; Banerjee, Nirmalya; Kaur, Rajwant

    2016-01-01

    Visual image classification is a great challenge for the cytopathologist in routine day-to-day work. An artificial neural network (ANN) may be helpful in this matter. In this study, we have tried to classify digital images of malignant and benign cells in effusion cytology smears with the help of simple histogram data and an ANN. A total of 404 digital images consisting of 168 benign cells and 236 malignant cells were selected for this study. The simple histogram data were extracted from these digital images and an ANN was constructed with the help of Neurointelligence software [Alyuda Neurointelligence 2.2 (577), Cupertino, California, USA]. The network architecture was 6-3-1. The images were divided into a training set (281), a validation set (63), and a test set (60). The online backpropagation training algorithm was used for this study. A total of 10,000 iterations were done to train the ANN system at a speed of 609.81 iterations/s. After adequate training of this ANN model, the system was able to identify all 34 malignant cell images and 24 out of 26 benign cells. The ANN model can be used for the identification of individual malignant cells with the help of simple histogram data. This study will be helpful in the future to identify malignant cells in unknown situations.
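
    The 6-3-1 network was built in commercial software; a rough open-source analogue with scikit-learn, assuming a matrix X of six histogram features per cell image and binary labels y, might look like this sketch (the split sizes only approximate the paper's).

        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPClassifier

        def train_histogram_ann(X, y):
            # Hold out a test set roughly matching the paper's 281/63/60 split.
            X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.15,
                                                      random_state=0)
            # One hidden layer of 3 units approximates the 6-3-1 topology.
            clf = MLPClassifier(hidden_layer_sizes=(3,), solver="sgd",
                                max_iter=10000, random_state=0)
            clf.fit(X_tr, y_tr)
            return clf, clf.score(X_te, y_te)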

  12. Multipeak Mean Based Optimized Histogram Modification Framework Using Swarm Intelligence for Image Contrast Enhancement

    Directory of Open Access Journals (Sweden)

    P. Babu

    2015-01-01

    Full Text Available A novel approach, the multipeak mean based optimized histogram modification framework (MMOHM), is introduced for the purpose of enhancing the contrast as well as preserving essential details for any given gray-scale or colour image. The basic idea of this technique is the calculation of multiple peaks (local maxima) from the original histogram. The mean value of the multiple peaks is computed and the input image's histogram is segmented into two subhistograms based on this multipeak mean (mmean) value. Then, a bicriteria optimization problem is formulated and the subhistograms are modified by selecting optimal contrast enhancement parameters. While formulating the enhancement parameters, particle swarm optimization is employed to find their optimal values. Finally, the union of the modified subhistograms produces a contrast-enhanced and detail-preserving output image. This mechanism enhances the contrast of the input image better than existing contemporary HE methods. The performance of the proposed method is well supported by contrast enhancement quantitative metrics such as discrete entropy, natural image quality evaluator, and absolute mean brightness error.
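
    A minimal sketch of the histogram-splitting step only: the local maxima of a 256-bin histogram are averaged into the multipeak mean (mmean) and the histogram is divided there. The bicriteria optimization and the PSO search for enhancement parameters are not reproduced.

        import numpy as np

        def multipeak_mean_split(image):
            """Split a gray-scale histogram at the mean of its local maxima."""
            hist, _ = np.histogram(image, bins=256, range=(0, 256))
            i = np.arange(1, 255)
            # Local maxima: bins strictly higher than both neighbors
            peaks = i[(hist[i] > hist[i - 1]) & (hist[i] > hist[i + 1])]
            mmean = int(round(peaks.mean())) if peaks.size else 128
            return hist[:mmean + 1], hist[mmean + 1:], mmean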

  13. Differential diagnosis of normal pressure hydrocephalus by MRI mean diffusivity histogram analysis.

    Science.gov (United States)

    Ivkovic, M; Liu, B; Ahmed, F; Moore, D; Huang, C; Raj, A; Kovanlikaya, I; Heier, L; Relkin, N

    2013-01-01

    Accurate diagnosis of normal pressure hydrocephalus is challenging because the clinical symptoms and radiographic appearance of NPH often overlap those of other conditions, including age-related neurodegenerative disorders such as Alzheimer and Parkinson diseases. We hypothesized that radiologic differences between NPH and AD/PD can be characterized by a robust and objective MR imaging DTI technique that does not require intersubject image registration or operator-defined regions of interest, thus avoiding many pitfalls common in DTI methods. We collected 3T DTI data from 15 patients with probable NPH and 25 controls with AD, PD, or dementia with Lewy bodies. We developed a parametric model for the shape of intracranial mean diffusivity histograms that separates brain and ventricular components from a third component composed mostly of partial volume voxels. To accurately fit the shape of the third component, we constructed a parametric function named the generalized Voss-Dyke function. We then examined the use of the fitting parameters for the differential diagnosis of NPH from AD, PD, and DLB. Using parameters for the MD histogram shape, we distinguished clinically probable NPH from the 3 other disorders with 86% sensitivity and 96% specificity. The technique yielded 86% sensitivity and 88% specificity when differentiating NPH from AD only. An adequate parametric model for the shape of intracranial MD histograms can distinguish NPH from AD, PD, or DLB with high sensitivity and specificity.

  14. Conductance histogram evolution of an EC-MCBJ fabricated Au atomic point contact

    Energy Technology Data Exchange (ETDEWEB)

    Yang Yang; Liu Junyang; Chen Zhaobin; Tian Jinghua; Jin Xi; Liu Bo; Yang Fangzu; Tian Zhongqun [State Key Laboratory of Physical Chemistry of Solid Surfaces and Department of Chemistry, College of Chemistry and Chemical Engineering, Xiamen University, Xiamen 361005 (China); Li Xiulan; Tao Nongjian [Center for Bioelectronics and Biosensors, Biodesign Institute, Department of Electrical Engineering, Arizona State University, Tempe, AZ 85287-6206 (United States); Luo Zhongzi; Lu Miao, E-mail: zqtian@xmu.edu.cn [Micro-Electro-Mechanical Systems Research Center, Pen-Tung Sah Micro-Nano Technology Institute, Xiamen University, Xiamen 361005 (China)

    2011-07-08

    This work presents a study of Au conductance quantization based on a combined electrochemical deposition and mechanically controllable break junction (MCBJ) method. We describe the microfabrication process and discuss improved features of our microchip structure compared to the previous one. The improved structure prolongs the available life of the microchip and also increases the success rate of the MCBJ experiment. Stepwise changes in the current were observed at the last stage of atomic point contact breakdown and conductance histograms were constructed. The evolution of the 1G₀ peak height in conductance histograms was used to investigate the probability of formation of an atomic point contact. It has been shown that the success rate in forming an atomic point contact can be improved by decreasing the stretching speed and the degree that the two electrodes are brought into contact. The repeated breakdown and formation over thousands of cycles led to a distinctive increase of the 1G₀ peak height in the conductance histograms, and this increased probability of forming a single atomic point contact is discussed.

  15. Conductance histogram evolution of an EC-MCBJ fabricated Au atomic point contact

    International Nuclear Information System (INIS)

    Yang Yang; Liu Junyang; Chen Zhaobin; Tian Jinghua; Jin Xi; Liu Bo; Yang Fangzu; Tian Zhongqun; Li Xiulan; Tao Nongjian; Luo Zhongzi; Lu Miao

    2011-01-01

    This work presents a study of Au conductance quantization based on a combined electrochemical deposition and mechanically controllable break junction (MCBJ) method. We describe the microfabrication process and discuss improved features of our microchip structure compared to the previous one. The improved structure prolongs the available life of the microchip and also increases the success rate of the MCBJ experiment. Stepwise changes in the current were observed at the last stage of atomic point contact breakdown and conductance histograms were constructed. The evolution of the 1G₀ peak height in conductance histograms was used to investigate the probability of formation of an atomic point contact. It has been shown that the success rate in forming an atomic point contact can be improved by decreasing the stretching speed and the degree that the two electrodes are brought into contact. The repeated breakdown and formation over thousands of cycles led to a distinctive increase of the 1G₀ peak height in the conductance histograms, and this increased probability of forming a single atomic point contact is discussed.

  16. A frequency bin-wise nonlinear masking algorithm in convolutive mixtures for speech segregation.

    Science.gov (United States)

    Chi, Tai-Shih; Huang, Ching-Wen; Chou, Wen-Sheng

    2012-05-01

    A frequency bin-wise nonlinear masking algorithm is proposed in the spectrogram domain for speech segregation in convolutive mixtures. The contributive weight from each speech source to a time-frequency unit of the mixture spectrogram is estimated by a nonlinear function based on location cues. For each sound source, a non-binary mask is formed from the estimated weights and is multiplied to the mixture spectrogram to extract the sound. Head-related transfer functions (HRTFs) are used to simulate convolutive sound mixtures perceived by listeners. Simulation results show our proposed method outperforms convolutive independent component analysis and degenerate unmixing and estimation technique methods in almost all test conditions.
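
    A sketch of the masking step, assuming the contributive weights have already been estimated (the paper derives them from location cues via a nonlinear function); the non-binary masks are normalized so the extracted sources sum back to the mixture.

        import numpy as np

        def apply_soft_masks(mixture_spec, weights):
            """Extract sources by non-binary time-frequency masking.

            mixture_spec : (freq, time) complex STFT of the mixture.
            weights      : (n_sources, freq, time) estimated contributive weights.
            """
            masks = weights / np.maximum(weights.sum(axis=0, keepdims=True), 1e-12)
            return masks * mixture_spec[None, :, :]  # one spectrogram per source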

  17. MetaABC--an integrated metagenomics platform for data adjustment, binning and clustering.

    Science.gov (United States)

    Su, Chien-Hao; Hsu, Ming-Tsung; Wang, Tse-Yi; Chiang, Sufeng; Cheng, Jen-Hao; Weng, Francis C; Kao, Cheng-Yan; Wang, Daryi; Tsai, Huai-Kuang

    2011-08-15

    MetaABC is a metagenomic platform that integrates several binning tools coupled with methods for removing artifacts, analyzing unassigned reads and controlling sampling biases. It allows users to arrive at a better interpretation via a series of distinct combinations of analysis tools. After execution, MetaABC provides outputs in various visual formats such as tables, pie and bar charts as well as clustering result diagrams. MetaABC source code and documentation are available at http://bits2.iis.sinica.edu.tw/MetaABC/. Contact: dywang@gate.sinica.edu.tw; hktsai@iis.sinica.edu.tw. Supplementary data are available at Bioinformatics online.

  18. Optical Extinction Measurements of Dust Density in the GMRO Regolith Test Bin

    Science.gov (United States)

    Lane, J.; Mantovani, J.; Mueller, R.; Nugent, M.; Nick, A.; Schuler, J.; Townsend, I.

    2016-01-01

    A regolith simulant test bin was constructed and completed in the Granular Mechanics and Regolith Operations (GMRO) Lab in 2013. This Planetary Regolith Test Bed (PRTB) is a 64 sq m x 1 m deep test bin, housed in a climate-controlled facility, containing 120 MT of lunar-regolith simulant, called Black Point-1 or BP-1, from Black Point, AZ. One of the current uses of the test bin is to study the effects of difficult lighting and dust conditions on telerobotic perception systems, to better assess and refine regolith operations for asteroid, Mars and polar lunar missions. Low illumination and low angle-of-incidence lighting pose significant problems for computer vision and human perception. Levitated dust on asteroids interferes with imaging and degrades depth perception. Dust storms on Mars pose a significant problem. Due to these factors, the likely performance of telerobotics is poorly understood for future missions. Current space telerobotic systems are only operated in bright lighting and dust-free conditions. This technology development testing will identify: (1) the impact of degraded lighting and environmental dust on computer vision and operator perception, (2) potential methods and procedures for mitigating these impacts, and (3) requirements for telerobotic perception systems for asteroid capture, Mars dust storm and lunar regolith ISRU missions. In order to solve some of these telerobotic perception problems, a plume erosion sensor (PES) was developed in the Lunar Regolith Simulant Bin (LRSB), containing 2 MT of JSC-1a lunar simulant. PES is simply a laser and digital camera with a white target. Two modes of operation have been investigated: (1) single laser spot - the brightness of the spot depends on the optical extinction due to dust and is thus an indirect measure of particle number density, and (2) side-scatter - the camera images the laser from the side, showing beam entrance into the dust cloud and the boundary between dust and void.

  19. Diviner lunar radiometer gridded brightness temperatures from geodesic binning of modeled fields of view

    Science.gov (United States)

    Sefton-Nash, E.; Williams, J.-P.; Greenhagen, B. T.; Aye, K.-M.; Paige, D. A.

    2017-12-01

    An approach is presented to efficiently produce high-quality gridded data records from the large, global point-based dataset returned by the Diviner Lunar Radiometer Experiment aboard NASA's Lunar Reconnaissance Orbiter. The need to minimize data volume and processing time in the production of science-ready map products is increasingly important with the growth in data volume of planetary datasets. Diviner makes on average >1400 observations per second of radiance that is reflected and emitted from the lunar surface, using 189 detectors divided into 9 spectral channels. Data management and processing bottlenecks are amplified by modeling every observation as a probability distribution function over the field of view, which can increase the required processing time by 2-3 orders of magnitude. Geometric corrections, such as projection of data points onto a digital elevation model, are numerically intensive and therefore it is desirable to perform them only once. Our approach reduces bottlenecks through parallel binning and efficient storage of a pre-processed database of observations. Database construction is via subdivision of a geodesic icosahedral grid, with a spatial resolution that can be tailored to suit the field of view of the observing instrument. Global geodesic grids with high spatial resolution are normally impractically memory intensive. We therefore demonstrate a minimum-storage and highly parallel method to bin very large numbers of data points onto such a grid. A database of the pre-processed and binned points is then used for production of mapped data products that is significantly faster than if unprocessed points were used. We explore quality controls in the production of gridded data records by conditional interpolation, allowed only where data density is sufficient. The resultant effects on the spatial continuity and uncertainty in maps of lunar brightness temperatures are illustrated. We identify four binning regimes based on trades between the…
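
    The core accumulation step can be sketched as below, assuming each observation has already been assigned a grid-cell index (the geodesic icosahedral indexing and the field-of-view probability modeling are not reproduced); vectorized bincount calls make the pass over very large point sets cheap and trivially parallelizable across chunks of the index array.

        import numpy as np

        def bin_observations(cell_idx, values, n_cells):
            """Accumulate point observations onto a fixed grid in one pass."""
            counts = np.bincount(cell_idx, minlength=n_cells)
            sums = np.bincount(cell_idx, weights=values, minlength=n_cells)
            # Mean per cell, leaving empty cells at zero
            mean = np.divide(sums, counts.astype(float),
                             out=np.zeros(n_cells), where=counts > 0)
            return mean, counts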

  20. Tradable permit allocations and sequential choice

    Energy Technology Data Exchange (ETDEWEB)

    MacKenzie, Ian A. [Centre for Economic Research, ETH Zuerich, Zurichbergstrasse 18, 8092 Zuerich (Switzerland)

    2011-01-15

    This paper investigates initial allocation choices in an international tradable pollution permit market. For two sovereign governments, we compare allocation choices that are either simultaneously or sequentially announced. We show sequential allocation announcements result in higher (lower) aggregate emissions when announcements are strategic substitutes (complements). Whether allocation announcements are strategic substitutes or complements depends on the relationship between the follower's damage function and governments' abatement costs. When the marginal damage function is relatively steep (flat), allocation announcements are strategic substitutes (complements). For quadratic abatement costs and damages, sequential announcements provide a higher level of aggregate emissions. (author)

  1. Cost allocation review : staff discussion paper

    International Nuclear Information System (INIS)

    2005-09-01

    This report addressed the need for updated cost allocation studies filed by local electricity distribution companies because they ensure that distribution rates for each customer class remain just and reasonable. According to the 2001 Electricity Distribution Rate Handbook, the Ontario Energy Board requires new cost allocation studies before implementing any future incentive regulation plans. A review of cost allocations allows the Board to consider the need for adjustments to the current share of distribution costs paid by different classes of ratepayers. This report included 14 sections to facilitate consultations with stakeholders on financial information requirements for cost allocation; directly assignable costs; functionalization; categorization; allocation methods; allocation of other costs; load data requirements; cost allocation implementation issues; addition of new rate class and rate design for scattered unmetered loads; addition of new rate class for larger users; rates to charge embedded distributors; treatment of the rate sub-classification identified as time-of-use; and, rate design implementation issues. 1 fig., 7 appendices

  2. Breast lesion characterization using whole-lesion histogram analysis with stretched-exponential diffusion model.

    Science.gov (United States)

    Liu, Chunling; Wang, Kun; Li, Xiaodan; Zhang, Jine; Ding, Jie; Spuhler, Karl; Duong, Timothy; Liang, Changhong; Huang, Chuan

    2018-06-01

    Diffusion-weighted imaging (DWI) has been studied in breast imaging and can provide more information about diffusion, perfusion and other physiological properties of interest than standard pulse sequences. The stretched-exponential model has previously been shown to be more reliable than conventional DWI techniques, but different diagnostic sensitivities were found from study to study. This work investigated the characteristics of whole-lesion histogram parameters derived from the stretched-exponential diffusion model for benign and malignant breast lesions, compared them with the conventional apparent diffusion coefficient (ADC), and further determined which histogram metrics can best be used to differentiate malignant from benign lesions. This prospective study included 70 females. Multi-b-value DWI was performed on a 1.5T scanner. Histogram parameters of whole lesions for the distributed diffusion coefficient (DDC), heterogeneity index (α), and ADC were calculated by two radiologists and compared among benign lesions, ductal carcinoma in situ (DCIS), and invasive carcinoma confirmed by pathology. Nonparametric tests were performed for comparisons among invasive carcinoma, DCIS, and benign lesions. Comparisons of receiver operating characteristic (ROC) curves were performed to show the ability to discriminate malignant from benign lesions. The majority of histogram parameters (mean/min/max, skewness/kurtosis, 10th-90th percentile values) from DDC, α, and ADC were significantly different among invasive carcinoma, DCIS, and benign lesions. DDC 10% (area under the curve [AUC] = 0.931), ADC 10% (AUC = 0.893), and αmean (AUC = 0.787) were found to be the best metrics in differentiating benign from malignant tumors among all histogram parameters derived from DDC, ADC, and α, respectively. The combination of DDC 10% and αmean, using logistic regression, yielded the highest sensitivity (90.2%) and specificity (95.5%). DDC 10% and αmean derived from the stretched-exponential diffusion model therefore show promise for differentiating malignant from benign breast lesions.
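
    Whole-lesion histogram metrics of the kind reported here can be computed generically for any voxel-wise parameter map (ADC, DDC, or α); this sketch assumes a 3-D map and a boolean lesion mask.

        import numpy as np
        from scipy import stats

        def lesion_histogram_metrics(param_map, lesion_mask):
            """Whole-lesion histogram metrics for a voxel-wise parameter map."""
            v = param_map[lesion_mask].ravel()
            return {
                "mean": v.mean(), "min": v.min(), "max": v.max(),
                "p10": np.percentile(v, 10), "p90": np.percentile(v, 90),
                "skewness": stats.skew(v), "kurtosis": stats.kurtosis(v),
            }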

  3. Measuring kinetics of complex single ion channel data using mean-variance histograms.

    Science.gov (United States)

    Patlak, J B

    1993-07-01

    The measurement of single ion channel kinetics is difficult when those channels exhibit subconductance events. When the kinetics are fast, and when the current magnitudes are small, as is the case for Na+, Ca2+, and some K+ channels, these difficulties can lead to serious errors in the estimation of channel kinetics. I present here a method, based on the construction and analysis of mean-variance histograms, that can overcome these problems. A mean-variance histogram is constructed by calculating the mean current and the current variance within a brief "window" (a set of N consecutive data samples) superimposed on the digitized raw channel data. Systematic movement of this window over the data produces large numbers of mean-variance pairs which can be assembled into a two-dimensional histogram. Defined current levels (open, closed, or sublevel) appear in such plots as low-variance regions. The total number of events in such low-variance regions is estimated by curve fitting and plotted as a function of window width. This function decreases with the same time constants as the original dwell time probability distribution for each of the regions. The method can therefore be used: 1) to present a qualitative summary of the single channel data from which the signal-to-noise ratio, open channel noise, steadiness of the baseline, and number of conductance levels can be quickly determined; 2) to quantify the dwell time distribution in each of the levels exhibited. In this paper I present the analysis of a Na+ channel recording that had a number of complexities. The signal-to-noise ratio was only about 8 for the main open state; open channel noise and fast flickers to other states were present, as were a substantial number of subconductance states. "Standard" half-amplitude threshold analysis of these data produced open and closed time histograms that were well fitted by the sum of two exponentials, but with apparently erroneous time constants, whereas the mean-variance analysis avoided these errors.
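
    A minimal sketch of the histogram construction itself, assuming a 1-D array of digitized current samples; a window of N consecutive samples is slid one sample at a time, and each position contributes one (mean, variance) pair. The curve fitting of the low-variance regions is not shown.

        import numpy as np

        def mean_variance_histogram(current, window, bins=(100, 100)):
            """2-D mean-variance histogram of a single-channel current trace."""
            w = np.lib.stride_tricks.sliding_window_view(current, window)
            means = w.mean(axis=1)
            variances = w.var(axis=1)
            # Defined current levels appear as low-variance regions of H.
            H, mean_edges, var_edges = np.histogram2d(means, variances, bins=bins)
            return H, mean_edges, var_edges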

  4. Whole-tumor apparent diffusion coefficient (ADC) histogram analysis to differentiate benign peripheral neurogenic tumors from soft tissue sarcomas.

    Science.gov (United States)

    Nakajo, Masanori; Fukukura, Yoshihiko; Hakamada, Hiroto; Yoneyama, Tomohide; Kamimura, Kiyohisa; Nagano, Satoshi; Nakajo, Masayuki; Yoshiura, Takashi

    2018-02-22

    Apparent diffusion coefficient (ADC) histogram analyses have been used to differentiate tumor grades and predict therapeutic responses in various anatomic sites with moderate success. To determine the ability of diffusion-weighted imaging (DWI) with whole-tumor ADC histogram analysis to differentiate benign peripheral neurogenic tumors (BPNTs) from soft tissue sarcomas (STSs). Retrospective study, single institution. In all, 25 BPNTs and 31 STSs. Two-b-value DWI (b = 0, 1000 s/mm²) was performed at 3.0T. Whole-tumor ADC histogram parameters were calculated by two radiologists and compared between BPNTs and STSs. Nonparametric tests were performed for comparisons between BPNTs and STSs. All histogram parameters except kurtosis and entropy differed significantly between BPNTs and STSs.

  5. Aerial radiometric and magnetic reconnaissance survey of the Delta Quadrangle, Utah. Volume 2. Maps, profiles, and histograms. Final report

    International Nuclear Information System (INIS)

    1978-11-01

    Results of the interpretation of the gamma-ray spectrometric data in the form of a preferred anomaly map, along with significance-factor profile maps, stacked profiles, and histograms are presented in Volume 2

  6. Steganalysis Method for LSB Replacement Based on Local Gradient of Image Histogram

    Directory of Open Access Journals (Sweden)

    M. Mahdavi

    2008-10-01

    Full Text Available In this paper we present a new accurate steganalysis method for the LSB replacement steganography. The suggested method is based on the changes that occur in the histogram of an image after the embedding of data. Every pair of neighboring bins of a histogram are either inter-related or unrelated depending on whether embedding of a bit of data in the image could affect both bins or not. We show that the overall behavior of all inter-related bins, when compared with that of the unrelated ones, could give an accurate measure for the amount of the embedded data. Both analytical analysis and simulation results show the accuracy of the proposed method. The suggested method has been implemented and tested for over 2000 samples and compared with the RS steganalysis method. Mean and variance of error were 0.0025 and 0.0037 for the suggested method, where these quantities were 0.0070 and 0.0182 for the RS steganalysis. Using 4800 samples, we showed that the performance of the suggested method is comparable with that of the RS steganalysis for JPEG filtered images. The new approach is applicable for the detection of both random and sequential LSB embedding.
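
    The key observation is that LSB replacement only moves samples within histogram pairs (2k, 2k+1), so those bins are inter-related while adjacent bins across pairs are not; the toy statistic below illustrates this effect and is a hypothetical indicator for exposition, not the authors' exact estimator.

        import numpy as np

        def lsb_pair_statistic(image):
            """Contrast within-pair and across-pair bin differences.

            As more data is embedded, counts inside each pair (2k, 2k+1)
            equalize, so the within/across ratio shrinks toward zero.
            """
            h = np.bincount(image.ravel(), minlength=256).astype(float)
            within = np.abs(h[0::2] - h[1::2]).sum()    # bins 2k vs 2k+1
            across = np.abs(h[1:-1:2] - h[2::2]).sum()  # bins 2k+1 vs 2k+2
            return within / max(across, 1.0)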

  7. Designing a power supply for Nim-bin formatted equipment; Diseno de una fuente de alimentacion para equipos con formato Nim-bin

    Energy Technology Data Exchange (ETDEWEB)

    Banuelos G, L. E.; Hernandez D, V. M.; Vega C, H. R., E-mail: lebluis2012@hotmail.com [Universidad Autonoma de Zacatecas, Unidad Academica de Estudios Nucleares, Cipres No. 10, Fracc. La Penuela, 98060 Zacatecas, Zac. (Mexico)

    2016-09-15

    From an old Nuclear Chicago power supply that was practically in the trash, we were able to recover the 19-inch casing, the rear connectors and the housing where the circuits were. From there, all mechanical parts were cleaned and the electronic design was started to replace the original voltage and current functions of this equipment. The cards for the ±6, ±12 and ±24 V outputs were designed, simulated and tested with circuitry that does not rely on specialized components or components sold only by the equipment manufacturer. The current capacity at each output voltage was matched to the specifications of manufacturers like Ortec or Canberra, whose power supply models deliver 160 Watts. Basic tests, such as the full-load regulation index and the noise level in the supply voltages, were performed to show that the behavior is very similar to that of commercial equipment. Our Nim-bin voltage source is therefore viable for use in our institution's laboratories. (Author)

  8. Potential of MR histogram analyses for prediction of response to chemotherapy in patients with colorectal hepatic metastases.

    Science.gov (United States)

    Liang, He-Yue; Huang, Ya-Qin; Yang, Zhao-Xia; Ying-Ding; Zeng, Meng-Su; Rao, Sheng-Xiang

    2016-07-01

    To determine if magnetic resonance imaging (MRI) histogram analyses can help predict response to chemotherapy in patients with colorectal hepatic metastases, using the response evaluation criteria in solid tumours (RECIST 1.1) as the reference standard. Standard MRI including diffusion-weighted imaging (b = 0, 500 s/mm²) was performed before chemotherapy in 53 patients with colorectal hepatic metastases. Histograms were computed for apparent diffusion coefficient (ADC) maps and arterial and portal venous phase images; thereafter, mean, percentiles (1st, 10th, 50th, 90th, 99th), skewness, kurtosis, and variance were generated. Quantitative histogram parameters were compared between responders (partial and complete response, n = 15) and non-responders (progressive and stable disease, n = 38). Receiver operating characteristic (ROC) analyses were further performed for the significant parameters. The mean and the 1st, 10th, 50th, 90th and 99th percentiles of the ADC maps were significantly lower in the responding group than in the non-responding group (p = 0.000-0.002), with areas under the ROC curve (AUCs) of 0.76-0.82. The histogram parameters of the arterial and portal venous phases showed no significant difference (p > 0.05) between the two groups. Histogram-derived parameters for ADC maps seem to be a promising tool for predicting response to chemotherapy in patients with colorectal hepatic metastases. • ADC histogram analyses can potentially predict chemotherapy response in colorectal liver metastases. • Lower histogram-derived parameters (mean, percentiles) for ADC tend to indicate good response. • MR enhancement histogram analyses are not reliable for predicting response.

  9. Allocation decisions in network industries

    Energy Technology Data Exchange (ETDEWEB)

    Bolle, Friedel [Europa-Universitaet Viadrina Frankfurt, Lehrstuhl Volkswirtschaftslehre, insbesondere Wirtschaftstheorie (Mikrooekonomie), Postfach 1786 15207 Frankfurt (Germany)

    2008-01-15

    In this paper, I want to propagate a new analytical tool: the usage of Menu Auctions for modelling complicated auctions, negotiations, rent seeking, etc. is advocated because, contrary to 'normal' auctions and bargaining models, an arbitrary number of additional aspects can be taken into account. Concentrating on 'Truthful Equilibria' [Bernheim, B.D., Whinston, M.D., 1986. Menu auctions, resource allocation, and economic influence, Quarterly Journal of Economics, 1-31], I show that a certain broad class of Menu Auctions yields unique and efficient allocations. Under an additional concavity condition, even the equilibrium bids are unique. Two examples are discussed: the privatisation of a state-owned industry and the buying of wholesale electricity (concluding contracts with a number of producers) by a utility. These examples also serve to trace the sources of 'non-concavities', which can prevent the uniqueness of bids and can provide the auctioneer with incentives to exclude bidders from the competition. (author)

  10. Allocation - the Howe measurement challenges

    Energy Technology Data Exchange (ETDEWEB)

    Tierney, Jim; Moksnes, Paul Ove

    2005-07-01

    The Howe Field is located in Central North Sea Block 22/12a, approximately 160 km east of Aberdeen in a water depth of 85 m. The reservoir lies some 12 km east of the Shell-operated Nelson Platform, which is situated in adjacent Block 22/11. The Howe project was initiated by Shell Exploration and Production to augment the operating life and production capacity of the Nelson platform, involving the development of additional subsea infrastructure and the installation of topside facilities. The owners of the Howe Field are Enterprise Oil PLC, Intrepid Energy and OMV. The Howe well fluids are commingled with Nelson fluids. It is therefore necessary to measure the Howe well fluids to differentiate between the fields and to determine how much money each partner is allocated. The commercial agreements stipulated that Howe fluids must be measured within an accuracy of +- 5% of reading. In addition to accuracy constraints, it was important to minimise capex to ensure the development was economically viable. Given this, multiphase metering was considered to be a solution for allocation between the different ownerships, as opposed to traditional separator metering. This paper presents the journey of the project through selection criteria, flow loop testing, installation, commissioning and the first 3 months of operation of the MPFM, including verification against the Nelson test separator, detailing how careful management and engineering support lead to success with this type of application. (author) (tk)

  11. Metagenomic binning of a marine sponge microbiome reveals unity in defense but metabolic specialization.

    Science.gov (United States)

    Slaby, Beate M; Hackl, Thomas; Horn, Hannes; Bayer, Kristina; Hentschel, Ute

    2017-11-01

    Marine sponges are ancient metazoans that are populated by distinct and highly diverse microbial communities. In order to obtain deeper insights into the functional gene repertoire of the Mediterranean sponge Aplysina aerophoba, we combined Illumina short-read and PacBio long-read sequencing followed by un-targeted metagenomic binning. We identified a total of 37 high-quality bins representing 11 bacterial phyla and two candidate phyla. Statistical comparison of symbiont genomes with selected reference genomes revealed a significant enrichment of genes related to bacterial defense (restriction-modification systems, toxin-antitoxin systems) as well as genes involved in host colonization and extracellular matrix utilization in sponge symbionts. A within-symbiont genome comparison revealed a nutritional specialization of at least two symbiont guilds, where one appears to metabolize carnitine and the other sulfated polysaccharides, both of which are abundant molecules in the sponge extracellular matrix. A third guild of symbionts may be viewed as nutritional generalists that perform largely the same metabolic pathways but lack such extraordinary numbers of the relevant genes. This study characterizes the genomic repertoire of sponge symbionts at an unprecedented resolution and provides greater insight into the molecular mechanisms underlying microbial-sponge symbiosis.

  12. Likelihood functions for the analysis of single-molecule binned photon sequences

    Energy Technology Data Exchange (ETDEWEB)

    Gopich, Irina V., E-mail: irinag@niddk.nih.gov [Laboratory of Chemical Physics, National Institute of Diabetes and Digestive and Kidney Diseases, National Institutes of Health, Bethesda, MD 20892 (United States)

    2012-03-02

    Graphical abstract: Folding of a protein with attached fluorescent dyes, the underlying conformational trajectory of interest, and the observed binned photon trajectory. Highlights: A sequence of photon counts can be analyzed using a likelihood function. The exact likelihood function for a two-state kinetic model is provided. Several approximations are considered for an arbitrary kinetic model. Improved likelihood functions are obtained to treat sequences of FRET efficiencies. - Abstract: We consider the analysis of a class of experiments in which the number of photons in consecutive time intervals is recorded. Sequences of photon counts or, alternatively, of FRET efficiencies can be studied using likelihood-based methods. For a kinetic model of the conformational dynamics and state-dependent Poisson photon statistics, the formalism to calculate the exact likelihood that this model describes such sequences of photons or FRET efficiencies is developed. Explicit analytic expressions for the likelihood function for a two-state kinetic model are provided. The important special case when conformational dynamics are so slow that at most a single transition occurs in a time bin is considered. By making a series of approximations, we eventually recover the likelihood function used in hidden Markov models. In this way, not only is insight gained into the range of validity of this procedure, but also an improved likelihood function can be obtained.
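
    The hidden-Markov-style approximation mentioned at the end of the abstract can be sketched as a standard forward recursion with state-dependent Poisson emissions; the rate-matrix parameterization below is a generic assumption, and the at-most-one-transition-per-bin refinement of the exact likelihood is simplified away.

        import numpy as np
        from scipy.linalg import expm
        from scipy.stats import poisson

        def binned_photon_loglik(counts, rates, K, dt):
            """Log-likelihood of binned photon counts for a two-state model.

            counts : photon counts per time bin.
            rates  : (2,) emission rates of the two conformational states.
            K      : (2, 2) rate matrix (generator) of the state dynamics.
            dt     : bin width.
            """
            T = expm(K * dt)                          # bin-to-bin transitions
            w, v = np.linalg.eig(K.T)                 # equilibrium distribution
            p = np.real(v[:, np.argmin(np.abs(w))])
            p /= p.sum()
            p = p * poisson.pmf(counts[0], rates * dt)
            loglik = np.log(p.sum())
            p /= p.sum()
            for n in counts[1:]:
                p = poisson.pmf(n, rates * dt) * (T.T @ p)  # propagate, emit
                s = p.sum()
                loglik += np.log(s)
                p /= s
            return loglik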

  13. Efficient Entanglement Concentration of Nonlocal Two-Photon Polarization-Time-Bin Hyperentangled States

    Science.gov (United States)

    Wang, Zi-Hang; Yu, Wen-Xuan; Wu, Xiao-Yuan; Gao, Cheng-Yan; Alzahrani, Faris; Hobiny, Aatef; Deng, Fu-Guo

    2018-03-01

    We present two different hyperentanglement concentration protocols (hyper-ECPs) for two-photon systems in nonlocal polarization-time-bin hyperentangled states with known parameters, including Bell-like and cluster-like states, resorting to the parameter-splitting method. They require only one of the two parties in quantum communication to operate on her photon in the process of entanglement concentration, not both, and they have the maximal success probability. They work with linear optical elements and have good feasibility in experiment, especially when a large amount of quantum data is exchanged, as the parties can obtain the information about the parameters of the nonlocal hyperentangled states by sampling a subset of the nonlocal hyperentangled two-photon systems and measuring them. As the quantum state of photons in the time-bin degree of freedom suffers less noise in an optical-fiber channel, these hyper-ECPs may have good applications in practical long-distance quantum communication in the future.

  14. Development of Seismic Demand for Chang-Bin Offshore Wind Farm in Taiwan Strait

    Directory of Open Access Journals (Sweden)

    Yu-Kai Wang

    2016-12-01

    Full Text Available Taiwan is located on the Pacific seismic belt, and the soil conditions of Taiwan's offshore wind farms are softer than those in Europe. To ensure safety and stability of offshore wind turbine supporting structures, it is important to assess seismic forces at offshore wind farms reasonably. In this paper, the relevant seismic and geological data are obtained for the Chang-Bin offshore wind farm in the Taiwan Strait, a probabilistic seismic hazard analysis (PSHA) is carried out, and the first uniform hazard response spectrum for the Chang-Bin offshore wind farm is obtained. Compared with the existing design response spectrum in the local regulation, this site-specific seismic hazard analysis influences the seismic force considered in the design of supporting structures and therefore affects their cost. The results show that a site-specific seismic hazard analysis is required for high-seismicity areas. The paper highlights the importance of seismic hazard analysis for assessing seismic forces at offshore wind farms. Follow-up recommendations and research directions are given for Taiwan's offshore wind turbine supporting structures under seismic force considerations.

  15. Conformational Smear Characterization and Binning of Single-Molecule Conductance Measurements for Enhanced Molecular Recognition.

    Science.gov (United States)

    Korshoj, Lee E; Afsari, Sepideh; Chatterjee, Anushree; Nagpal, Prashant

    2017-11-01

    Electronic conduction or charge transport through single molecules depends primarily on molecular structure and anchoring groups and forms the basis for a wide range of studies from molecular electronics to DNA sequencing. Several high-throughput nanoelectronic methods such as mechanical break junctions, nanopores, conductive atomic force microscopy, scanning tunneling break junctions, and static nanoscale electrodes are often used for measuring single-molecule conductance. In these measurements, "smearing" due to conformational changes and other entropic factors leads to large variances in the observed molecular conductance, especially in individual measurements. Here, we show a method for characterizing smear in single-molecule conductance measurements and demonstrate how binning measurements according to smear can significantly enhance the use of individual conductance measurements for molecular recognition. Using quantum point contact measurements on single nucleotides within DNA macromolecules, we demonstrate that the distance over which molecular junctions are maintained is a measure of smear, and the resulting variance in unbiased single measurements depends on this smear parameter. Our ability to identify individual DNA nucleotides at 20× coverage increases from 81.3% accuracy without smear analysis to 93.9% with smear characterization and binning (SCRIB). Furthermore, merely 7 conductance measurements (7× coverage) are needed to achieve 97.8% accuracy for DNA nucleotide recognition when only low molecular smear measurements are used, which represents a significant improvement over contemporary sequencing methods. These results have important implications in a broad range of molecular electronics applications from designing robust molecular switches to nanoelectronic DNA sequencing.

  16. Evaluation of methods for selecting the midventilation bin in 4DCT scans of lung cancer patients

    DEFF Research Database (Denmark)

    Nygaard, Ditte Eklund; Persson, Gitte Fredberg; Brink, Carsten

    2013-01-01

    based on: 1) visual evaluation of tumour displacement; 2) rigid registration of tumour position; 3) diaphragm displacement in the CC direction; and 4) carina displacement in the CC direction. Determination of the MidV bin based on the displacement of the manually delineated gross tumour volume (GTV) served as the reference method. Geometric MidV errors were … (0.4-5.4) mm, 1.9 (0.5-6.9) mm, 2.0 (0.5-12.3) mm and 1.1 (0.4-5.4) mm for the visual, rigid registration, diaphragm, carina, and reference methods. Median (range) absolute difference between the geometric MidV error for the evaluated methods and the reference method was 0.0 (0.0-1.2) mm, 0.0 (0.0-1.7) mm, 0.7 (0.0-3.9) mm and 1.0 (0.0-6.9) mm for the visual, rigid registration, diaphragm and carina methods. Conclusion. The visual and semi-automatic rigid registration methods were equivalent in accuracy for selecting the MidV bin of a 4DCT scan. The methods based on diaphragm and carina displacement cannot be recommended as substitutes.

  17. The Islamic Ethics in the poetry of ‘Abdullah bin al-Mubarak (Arabic

    Directory of Open Access Journals (Sweden)

    Dr. Muhammad Ismail Bin Abdul Salam

    2017-01-01

    Full Text Available Abstract ‘Abdullah bin al-Mubarak was born in Marw, one of the prime cities in Khurasan (nowadays in the surroundings of Afghanistan and Central Asia), in the year 118 AH. In addition to his many talents, achievements and abilities, ‘Abdullah bin al-Mubarak was also gifted in literacy, particularly in the art of poetry. He held an eloquent tongue which was recognized by all who conversed with him, and his language displayed the nature of someone who had been taught well. Most of the poetry recorded from him is actually his advice to others, whether they were close friends or high-ranking Caliphs and Rulers. The topics concerned the common issues of his time (e.g. matters pertaining to theology, politics, the worldview, the community, etc.) and, as always, they contained much wisdom, and hence the books of history have sealed and recorded them. This research article discusses the biography of Abdullah ibn Al Mubarak and the Islamic ethics in his poetry, as well as the impact of rhetoric on his poetry, with special concentration on four kinds of influence: citation, and the impact of Quranic words, Quranic imagery and Quranic style on his poetry.

  18. Frequency-bin entanglement of ultra-narrow band non-degenerate photon pairs

    Science.gov (United States)

    Rieländer, Daniel; Lenhard, Andreas; Jiménez Farías, Osvaldo; Máttar, Alejandro; Cavalcanti, Daniel; Mazzera, Margherita; Acín, Antonio; de Riedmatten, Hugues

    2018-01-01

    We demonstrate frequency-bin entanglement between ultra-narrowband photons generated by cavity enhanced spontaneous parametric down conversion. Our source generates photon pairs in widely non-degenerate discrete frequency modes, with one photon resonant with a quantum memory material based on praseodymium doped crystals and the other photon at telecom wavelengths. Correlations between the frequency modes are analyzed using phase modulators and narrowband filters before detection. We show high-visibility two photon interference between the frequency modes, allowing us to infer a coherent superposition of the modes. We develop a model describing the state that we create and use it to estimate optimal measurements to achieve a violation of the Clauser-Horne (CH) Bell inequality under realistic assumptions. With these settings we perform a Bell test and show a significant violation of the CH inequality, thus proving the entanglement of the photons. Finally we demonstrate the compatibility with a quantum memory material by using a spectral hole in the praseodymium (Pr) doped crystal as spectral filter for measuring high-visibility two-photon interference. This demonstrates the feasibility of combining frequency-bin entangled photon pairs with Pr-based solid state quantum memories.

  19. Measuring Device for Air Speed in Macroporous Media and Its Application Inside Apple Storage Bins

    Directory of Open Access Journals (Sweden)

    Martin Geyer

    2018-02-01

    Full Text Available In cold storage facilities for fruit and vegetables, airflow is necessary for heat removal. The design of storage facilities influences the air speed in the surroundings of the product. Therefore, knowledge about airflow next to the product is important to plan the layout of cold stores adapted to the requirements of the products. A new sensing device (ASL, air speed logger) is developed for omnidirectional measurement of air speed between fruit or vegetables inside storage bins or in bulk. It consists of four interconnected plastic spheres of 80 mm diameter each, adapted to the size of apple fruit. In the free space between the spheres, silicon diodes are fixed for the airflow measurement based on a calorimetric principle. Battery and data logger are mounted inside the spheres. The device is calibrated in a wind tunnel over a measuring range of 0-1.3 m/s. Air speed measurements in fruit bulks on a laboratory scale and in an industrial fruit store show air speeds in gaps between fruit with high stability at different airflow levels. Several devices can be placed between stored products for determination of the air speed distribution inside bulks or bin stacks in a storage room.

  20. Subtype Differentiation of Small (≤ 4 cm) Solid Renal Mass Using Volumetric Histogram Analysis of DWI at 3-T MRI.

    Science.gov (United States)

    Li, Anqin; Xing, Wei; Li, Haojie; Hu, Yao; Hu, Daoyu; Li, Zhen; Kamel, Ihab R

    2018-05-29

    The purpose of this article is to evaluate the utility of volumetric histogram analysis of the apparent diffusion coefficient (ADC) derived from reduced-FOV DWI for small (≤ 4 cm) solid renal mass subtypes at 3-T MRI. This retrospective study included 38 clear cell renal cell carcinomas (RCCs), 16 papillary RCCs, 18 chromophobe RCCs, 13 minimal fat angiomyolipomas (AMLs), and seven oncocytomas evaluated with preoperative MRI. Volumetric ADC maps were generated using all slices of the reduced-FOV DW images to obtain histogram parameters, including mean, median, 10th percentile, 25th percentile, 75th percentile, 90th percentile, and SD ADC values, as well as skewness, kurtosis, and entropy. Comparisons of these parameters were made by one-way ANOVA, t test, and ROC curve analysis. ADC histogram parameters differentiated eight of 10 pairs of renal tumors. Three subtype pairs (clear cell RCC vs papillary RCC, clear cell RCC vs chromophobe RCC, and clear cell RCC vs minimal fat AML) were differentiated by mean ADC. However, five other subtype pairs (clear cell RCC vs oncocytoma, papillary RCC vs minimal fat AML, papillary RCC vs oncocytoma, chromophobe RCC vs minimal fat AML, and chromophobe RCC vs oncocytoma) were differentiated by histogram distribution parameters exclusively (all p < 0.05). A combination of histogram parameters yielded the highest AUC (0.851; sensitivity, 80.0%; specificity, 86.1%). Quantitative volumetric ADC histogram analysis may help differentiate various subtypes of small solid renal tumors, including benign and malignant lesions.

  1. Gliomas: Application of Cumulative Histogram Analysis of Normalized Cerebral Blood Volume on 3 T MRI to Tumor Grading

    Science.gov (United States)

    Kim, Hyungjin; Choi, Seung Hong; Kim, Ji-Hoon; Ryoo, Inseon; Kim, Soo Chin; Yeom, Jeong A.; Shin, Hwaseon; Jung, Seung Chai; Lee, A. Leum; Yun, Tae Jin; Park, Chul-Kee; Sohn, Chul-Ho; Park, Sung-Hye

    2013-01-01

    Background Glioma grading assumes significant importance in that low- and high-grade gliomas display different prognoses and are treated with dissimilar therapeutic strategies. The objective of our study was to retrospectively assess the usefulness of a cumulative normalized cerebral blood volume (nCBV) histogram for glioma grading based on 3 T MRI. Methods From February 2010 to April 2012, 63 patients with astrocytic tumors underwent 3 T MRI with dynamic susceptibility contrast perfusion-weighted imaging. Regions of interest containing the entire tumor volume were drawn on every section of the co-registered relative CBV (rCBV) maps and T2-weighted images. The percentile values from the cumulative nCBV histograms and the other histogram parameters were correlated with tumor grades. Cochran's Q test and the McNemar test were used to compare the diagnostic accuracies of the histogram parameters after receiver operating characteristic curve analysis. Using the parameter offering the highest diagnostic accuracy, a validation process was performed with an independent test set of nine patients. Results The 99th percentile of the cumulative nCBV histogram (nCBV C99), mean and peak height differed significantly between low- and high-grade gliomas. Conclusions Cumulative histogram analysis of nCBV using 3 T MRI can be a useful method for preoperative glioma grading. The nCBV C99 value is helpful in distinguishing high- from low-grade gliomas and grade IV from grade III gliomas. PMID:23704910

  2. Enhancing tumor apparent diffusion coefficient histogram skewness stratifies the postoperative survival in recurrent glioblastoma multiforme patients undergoing salvage surgery.

    Science.gov (United States)

    Zolal, Amir; Juratli, Tareq A; Linn, Jennifer; Podlesek, Dino; Sitoci Ficici, Kerim Hakan; Kitzler, Hagen H; Schackert, Gabriele; Sobottka, Stephan B; Rieger, Bernhard; Krex, Dietmar

    2016-05-01

    Objective To determine the value of apparent diffusion coefficient (ADC) histogram parameters for the prediction of individual survival in patients undergoing surgery for recurrent glioblastoma (GBM) in a retrospective cohort study. Methods Thirty-one patients who underwent surgery for first recurrence of a known GBM between 2008 and 2012 were included. The following parameters were collected: age, sex, enhancing tumor size, mean ADC, median ADC, ADC skewness, ADC kurtosis and fifth percentile of the ADC histogram, initial progression-free survival (PFS), extent of second resection and further adjuvant treatment. The association of these parameters with survival and PFS after second surgery was analyzed using the log-rank test and Cox regression. Results Using the log-rank test, ADC histogram skewness of the enhancing tumor was significantly associated with both survival (p = 0.001) and PFS after second surgery (p = 0.005). Further parameters associated with prolonged survival after second surgery were: gross total resection at second surgery (p = 0.026), tumor size (p = 0.040) and third surgery (p = 0.003). In the multivariate Cox analysis, ADC histogram skewness was shown to be an independent prognostic factor for survival after second surgery. Conclusion ADC histogram skewness of the enhancing lesion, enhancing lesion size, third surgery, as well as gross total resection have been shown to be associated with survival following second surgery. ADC histogram skewness was an independent prognostic factor for survival in the multivariate analysis.

  3. Transmission usage cost allocation schemes

    International Nuclear Information System (INIS)

    Abou El Ela, A.A.; El-Sehiemy, R.A.

    2009-01-01

    This paper presents different suggested transmission usage cost allocation (TCA) schemes for system individuals. Different independent system operator (ISO) visions are presented using the pro rata and flow-based TCA methods. Two flow-based TCA schemes (FTCA) are proposed. The first FTCA scheme generalizes the equivalent bilateral exchanges (EBE) concepts to lossy networks through a two-stage procedure. The second FTCA scheme is based on modified sensitivity factors (MSF). These factors are developed from the actual measurements of power flows in transmission lines and the power injections at different buses. The proposed schemes exhibit desirable apportioning properties and are easy to implement and understand. Case studies for different loading conditions are carried out to show the capability of the proposed schemes for solving the TCA problem. (author)

  4. Intelligent tactical asset allocation support system

    OpenAIRE

    Hiemstra, Y.

    1995-01-01

    This paper presents an advanced support system for Tactical Asset Allocation. Asset allocation explains over 90% of portfolio performance (Brinson, Hood and Beebower, 1988). Tactical asset allocation adjusts a strategic portfolio on the basis of short-term market outlooks. The system includes a prediction model that forecasts quarterly excess returns on the S&P 500, an optimization model that adjusts a user-specified strategic portfolio on the basis of the excess return forecast, and a compo...

  5. CHOBS: Color Histogram of Block Statistics for Automatic Bleeding Detection in Wireless Capsule Endoscopy Video.

    Science.gov (United States)

    Ghosh, Tonmoy; Fattah, Shaikh Anowarul; Wahid, Khan A

    2018-01-01

    Wireless capsule endoscopy (WCE) is the most advanced technology to visualize the whole gastrointestinal (GI) tract in a non-invasive way. Its major disadvantage is the long reviewing time, which is very laborious, as continuous manual intervention is necessary. In order to reduce the burden on the clinician, an automatic bleeding detection method for WCE video is proposed in this paper based on the color histogram of block statistics, namely CHOBS. A single pixel in a WCE image may be distorted due to the capsule motion in the GI tract. Instead of considering individual pixel values, a block surrounding each pixel is chosen for extracting local statistical features. By combining local block features of the three color planes of RGB color space, an index value is defined. A color histogram extracted from those index values provides a distinguishable color texture feature. A feature reduction technique utilizing the color histogram pattern and principal component analysis is proposed, which can drastically reduce the feature dimension. For bleeding zone detection, blocks are classified using the extracted local features, which adds no extra computational burden for feature extraction. From extensive experimentation on several WCE videos and 2300 images collected from a publicly available database, very satisfactory bleeding frame and zone detection performance is achieved in comparison to that obtained by some of the existing methods. In the case of bleeding frame detection, the accuracy, sensitivity, and specificity obtained from the proposed method are 97.85%, 99.47%, and 99.15%, respectively, and in the case of bleeding zone detection, 95.75% precision is achieved. The proposed method offers not only a low feature dimension but also highly satisfactory bleeding detection performance, and can even effectively detect bleeding frames and zones in continuous WCE video data.
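
    An illustrative variant of the block-statistics feature: each pixel is replaced by its local block mean in every RGB plane, the three quantized means are packed into one index, and the histogram of indices is the feature vector. The block size, quantization level and index packing are assumptions of this sketch, not the paper's exact definition.

        import numpy as np
        from scipy.ndimage import uniform_filter

        def chobs_features(rgb, block=5, levels=8):
            """Color histogram of block statistics (illustrative variant)."""
            rgb = rgb.astype(float)
            q = []
            for c in range(3):
                m = uniform_filter(rgb[..., c], size=block)  # local block mean
                q.append(np.clip((m / 256.0 * levels).astype(int), 0, levels - 1))
            index = q[0] * levels * levels + q[1] * levels + q[2]
            hist = np.bincount(index.ravel(), minlength=levels ** 3).astype(float)
            return hist / hist.sum()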

  6. An intelligent allocation algorithm for parallel processing

    Science.gov (United States)

    Carroll, Chester C.; Homaifar, Abdollah; Ananthram, Kishan G.

    1988-01-01

    The problem of allocating nodes of a program graph to processors in a parallel processing architecture is considered. The algorithm is based on critical path analysis, some allocation heuristics, and the execution granularity of nodes in a program graph. These factors, and the structure of interprocessor communication network, influence the allocation. To achieve realistic estimations of the executive durations of allocations, the algorithm considers the fact that nodes in a program graph have to communicate through varying numbers of tokens. Coarse and fine granularities have been implemented, with interprocessor token-communication duration, varying from zero up to values comparable to the execution durations of individual nodes. The effect on allocation of communication network structures is demonstrated by performing allocations for crossbar (non-blocking) and star (blocking) networks. The algorithm assumes the availability of as many processors as it needs for the optimal allocation of any program graph. Hence, the focus of allocation has been on varying token-communication durations rather than varying the number of processors. The algorithm always utilizes as many processors as necessary for the optimal allocation of any program graph, depending upon granularity and characteristics of the interprocessor communication network.

  7. Cognitive radio networks dynamic resource allocation schemes

    CERN Document Server

    Wang, Shaowei

    2014-01-01

    This SpringerBrief presents a survey of dynamic resource allocation schemes in Cognitive Radio (CR) systems, focusing on spectral efficiency and energy efficiency in wireless networks. It also introduces a variety of dynamic resource allocation schemes for CR networks and provides a concise introduction to the landscape of CR technology. The author covers the dynamic resource allocation problem in detail, including the motivations and challenges in CR systems. Spectral- and energy-efficient resource allocation schemes are comprehensively investigated, including new insights into the trade-off

  8. Yet Another Method for Image Segmentation based on Histograms and Heuristics

    Directory of Open Access Journals (Sweden)

    Horia-Nicolai L. Teodorescu

    2012-07-01

    Full Text Available We introduce a method for image segmentation that requires little computation, yet provides results comparable to those of other methods. While the proposed method resembles the known histogram-based ones, it differs in how it uses the gray-level distribution. When several heuristic rules are added to the basic procedure, the method produces results that, in some cases, may outperform those of the known methods. The paper reports preliminary results. More details on the method, improvements, and results will be presented in a future paper.
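
    The paper does not spell out its heuristics, so as a reference point the sketch below implements the classic histogram-based threshold selection (Otsu's method) that this family of methods builds on; it is not the authors' procedure.

```python
import numpy as np

def otsu_threshold(gray):
    """Standard histogram-based threshold: pick the gray level maximizing
    the between-class variance of the two resulting pixel populations."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    p = hist / hist.sum()
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (np.arange(t) * p[:t]).sum() / w0
        mu1 = (np.arange(t, 256) * p[t:]).sum() / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_t, best_var = t, var_between
    return best_t   # pixels >= best_t form one segment
```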

  9. Content Based Radiographic Images Indexing and Retrieval Using Pattern Orientation Histogram

    Directory of Open Access Journals (Sweden)

    Abolfazl Lakdashti

    2008-06-01

    Full Text Available Introduction: Content Based Image Retrieval (CBIR) is a method of image searching and retrieval in a database. In medical applications, CBIR is a tool used by physicians to compare previous and current medical images associated with patients' pathological conditions. As the volume of pictorial information stored in medical image databases keeps growing, efficient image indexing and retrieval is increasingly becoming a necessity. Materials and Methods: This paper presents a new content-based radiographic image retrieval approach based on a histogram of pattern orientations, namely the pattern orientation histogram (POH). POH represents the spatial distribution of five different pattern orientations: vertical, horizontal, diagonal down/left, diagonal down/right and non-orientation. In this method, a given image is first divided into image-blocks and the frequency of each type of pattern is determined in each image-block. Then, local pattern histograms for each of these image-blocks are computed. Results: The method was compared to two well-known texture-based image retrieval methods: Tamura and the Edge Histogram Descriptors (EHD) of the MPEG-7 standard. Experimental results based on the 10000-image IRMA radiography dataset demonstrate that POH provides better precision and recall rates than Tamura and EHD. For some images, the recall and precision rates obtained by POH are, respectively, 48% and 18% better than the best of the two above-mentioned methods. Discussion and Conclusion: Since we exploit the absolute location of the pattern in the image as well as its global composition, the proposed matching method can retrieve semantically similar medical images.
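
    A minimal sketch of a POH-style descriptor follows: each pixel is labeled with one of the five orientations from its gradient, and per-block label histograms are concatenated into the feature vector. The gradient test, angle thresholds and block grid are assumptions for illustration, not the paper's exact pattern definitions.

```python
import numpy as np

def pattern_orientation_histogram(gray, grid=4):
    """POH-style sketch: per-block histograms of five orientation labels.
    0 = non-oriented, 1 = horizontal, 2 = vertical, 3/4 = the two diagonals."""
    gy, gx = np.gradient(gray.astype(float))
    mag = np.hypot(gx, gy)
    a = np.degrees(np.arctan2(gy, gx)) % 180
    labels = np.zeros(gray.shape, dtype=int)
    strong = mag > mag.mean()                      # weak gradients stay non-oriented
    labels[strong & ((a < 22.5) | (a >= 157.5))] = 1
    labels[strong & (a >= 67.5) & (a < 112.5)] = 2
    labels[strong & (a >= 112.5) & (a < 157.5)] = 3
    labels[strong & (a >= 22.5) & (a < 67.5)] = 4
    h, w = gray.shape
    feats = []
    for by in range(grid):                         # blocks keep spatial layout
        for bx in range(grid):
            blk = labels[by * h // grid:(by + 1) * h // grid,
                         bx * w // grid:(bx + 1) * w // grid]
            hist = np.bincount(blk.ravel(), minlength=5).astype(float)
            feats.append(hist / hist.sum())
    return np.concatenate(feats)
```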

  10. Improved dose–volume histogram estimates for radiopharmaceutical therapy by optimizing quantitative SPECT reconstruction parameters

    International Nuclear Information System (INIS)

    Cheng Lishui; Hobbs, Robert F; Sgouros, George; Frey, Eric C; Segars, Paul W

    2013-01-01

    In radiopharmaceutical therapy, an understanding of the dose distribution in normal and target tissues is important for optimizing treatment. Three-dimensional (3D) dosimetry takes into account patient anatomy and the nonuniform uptake of radiopharmaceuticals in tissues. Dose–volume histograms (DVHs) provide a useful summary representation of the 3D dose distribution and have been widely used for external beam treatment planning. Reliable 3D dosimetry requires an accurate 3D radioactivity distribution as the input. However, activity distribution estimates from SPECT are corrupted by noise and partial volume effects (PVEs). In this work, we systematically investigated OS-EM based quantitative SPECT (QSPECT) image reconstruction in terms of its effect on DVH estimates. A modified 3D NURBS-based Cardiac-Torso (NCAT) phantom that incorporated a non-uniform kidney model and clinically realistic organ activities and biokinetics was used. Projections were generated using a Monte Carlo (MC) simulation; noise effects were studied using 50 noise realizations with clinical count levels. Activity images were reconstructed using QSPECT with compensation for attenuation, scatter and collimator–detector response (CDR). Dose rate distributions were estimated by convolution of the activity image with a voxel S kernel. Cumulative DVHs were calculated from the phantom and QSPECT images and compared both qualitatively and quantitatively. We found that noise, PVEs, and ringing artifacts due to CDR compensation all degraded histogram estimates. Low-pass filtering and early termination of the iterative process were needed to reduce the effects of noise and ringing artifacts on DVHs, but resulted in increased degradations due to PVEs. Large objects with few features, such as the liver, had more accurate histogram estimates and required fewer iterations and more smoothing for optimal results. Smaller objects with fine details, such as the kidneys, required more iterations and less smoothing.
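
    For reference, a cumulative DVH of the kind compared in this work can be computed from a dose map and an organ mask as below; this is the generic definition (fraction of organ volume receiving at least each dose level), not the paper's QSPECT pipeline.

```python
import numpy as np

def cumulative_dvh(dose, mask, n_bins=100):
    """Cumulative dose-volume histogram for one organ.
    dose: 3D array of dose values; mask: same-shape organ mask (>0 inside)."""
    d = dose[mask > 0].ravel()
    edges = np.linspace(0.0, d.max(), n_bins + 1)
    counts, _ = np.histogram(d, bins=edges)
    # survival function: volume fraction with dose >= each lower bin edge
    cum = 1.0 - np.concatenate(([0.0], np.cumsum(counts)[:-1])) / d.size
    return edges[:-1], cum
```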

  11. SUPERVISED AUTOMATIC HISTOGRAM CLUSTERING AND WATERSHED SEGMENTATION. APPLICATION TO MICROSCOPIC MEDICAL COLOR IMAGES

    Directory of Open Access Journals (Sweden)

    Olivier Lezoray

    2011-05-01

    Full Text Available In this paper, an approach to the segmentation of microscopic color images is addressed and applied to medical images. The approach combines a clustering method and a region-growing method. Each color plane is segmented independently, relying on a watershed-based clustering of the plane's histogram. The marginal segmentation maps are intersected into a label concordance map. The latter is simplified based on the assumption that the color planes are correlated, producing a simplified label concordance map containing labeled and unlabeled pixels. The former are used as an image of seeds for a color watershed. This fast and robust segmentation scheme is applied to several types of medical images.

  12. Improved dose-volume histogram estimates for radiopharmaceutical therapy by optimizing quantitative SPECT reconstruction parameters

    Science.gov (United States)

    Cheng, Lishui; Hobbs, Robert F.; Segars, Paul W.; Sgouros, George; Frey, Eric C.

    2013-06-01

    In radiopharmaceutical therapy, an understanding of the dose distribution in normal and target tissues is important for optimizing treatment. Three-dimensional (3D) dosimetry takes into account patient anatomy and the nonuniform uptake of radiopharmaceuticals in tissues. Dose-volume histograms (DVHs) provide a useful summary representation of the 3D dose distribution and have been widely used for external beam treatment planning. Reliable 3D dosimetry requires an accurate 3D radioactivity distribution as the input. However, activity distribution estimates from SPECT are corrupted by noise and partial volume effects (PVEs). In this work, we systematically investigated OS-EM based quantitative SPECT (QSPECT) image reconstruction in terms of its effect on DVH estimates. A modified 3D NURBS-based Cardiac-Torso (NCAT) phantom that incorporated a non-uniform kidney model and clinically realistic organ activities and biokinetics was used. Projections were generated using a Monte Carlo (MC) simulation; noise effects were studied using 50 noise realizations with clinical count levels. Activity images were reconstructed using QSPECT with compensation for attenuation, scatter and collimator-detector response (CDR). Dose rate distributions were estimated by convolution of the activity image with a voxel S kernel. Cumulative DVHs were calculated from the phantom and QSPECT images and compared both qualitatively and quantitatively. We found that noise, PVEs, and ringing artifacts due to CDR compensation all degraded histogram estimates. Low-pass filtering and early termination of the iterative process were needed to reduce the effects of noise and ringing artifacts on DVHs, but resulted in increased degradations due to PVEs. Large objects with few features, such as the liver, had more accurate histogram estimates and required fewer iterations and more smoothing for optimal results. Smaller objects with fine details, such as the kidneys, required more iterations and less smoothing.

  13. Mobile Visual Search Based on Histogram Matching and Zone Weight Learning

    Science.gov (United States)

    Zhu, Chuang; Tao, Li; Yang, Fan; Lu, Tao; Jia, Huizhu; Xie, Xiaodong

    2018-01-01

    In this paper, we propose a novel image retrieval algorithm for mobile visual search. First, a short visual codebook is generated based on the descriptor database to represent the statistical information of the dataset. Then, an accurate local descriptor similarity score is computed by merging tf-idf weighted histogram matching with the weighting strategy of compact descriptors for visual search (CDVS). Finally, the global descriptor matching score and the local descriptor similarity score are summed to rerank the retrieval results according to the learned zone weights. The results show that the proposed approach outperforms the state-of-the-art image retrieval method in CDVS.
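
    A toy version of tf-idf weighted histogram matching over visual words is sketched below; the weighted-intersection score and codebook handling are illustrative stand-ins, not the CDVS weighting itself.

```python
import numpy as np

def tfidf_histogram_scores(query_words, db_histograms):
    """Rank database images by a tf-idf weighted histogram intersection.
    query_words: codebook indices of the query's descriptors.
    db_histograms: {image_name: visual-word count histogram (np.array)}."""
    vocab = len(next(iter(db_histograms.values())))
    df = np.zeros(vocab)
    for h in db_histograms.values():
        df += (h > 0)                       # document frequency per word
    idf = np.log((1 + len(db_histograms)) / (1 + df))
    q = np.bincount(query_words, minlength=vocab).astype(float)
    q /= max(q.sum(), 1)
    scores = {}
    for name, h in db_histograms.items():
        hn = h / max(h.sum(), 1)
        scores[name] = float(np.sum(idf * np.minimum(q, hn)))
    return sorted(scores.items(), key=lambda kv: -kv[1])
```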

  14. Frontal Face Detection using Haar Wavelet Coefficients and Local Histogram Correlation

    Directory of Open Access Journals (Sweden)

    Iwan Setyawan

    2011-12-01

    Full Text Available Face detection is the main building block on which all automatic systems dealing with human faces are built. For example, a face recognition system must rely on face detection to process an input image and determine which areas contain human faces. These areas then become the input to the face recognition system for further processing. This paper presents a face detection system designed to detect frontal faces. The system uses Haar wavelet coefficients and local histogram correlation as differentiating features. Our proposed system is trained using 100 training images. Our experiments show that the proposed system performed well during testing, achieving a detection rate of 91.5%.
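
    The abstract does not define the correlation feature; one plausible, hedged reading is sketched below: the normalized correlation between a candidate window's grey-level histogram and a template histogram averaged over training faces. Both the binning and the template idea are assumptions for illustration.

```python
import numpy as np

def local_histogram_correlation(window, template_hist, bins=32):
    """Normalized correlation between a window's grey-level histogram and a
    learned template histogram (assumed formulation, not the paper's)."""
    h, _ = np.histogram(window, bins=bins, range=(0, 256))
    h = h.astype(float) / max(h.sum(), 1)
    a, b = h - h.mean(), template_hist - template_hist.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0
```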

  15. Discovering bin-Laden’s Replacement in al-Qaeda, using Social Network Analysis: A Methodological Investigation

    Directory of Open Access Journals (Sweden)

    Edith Wu

    2014-02-01

    Full Text Available The removal of Osama bin-Laden created a leadership void within al-Qaeda. Despite the group’s autonomous cell structure, an authoritative figure remains essential for promoting and disseminating al-Qaeda’s ideology. An appropriate replacement should exhibit traits comparable to bin-Laden and have similar positioning within the structure of the group. Using a media-based sample and social network analysis, this study attempts to uncover the most probable successor to bin-Laden by examining the dynamics within al-Qaeda. The results indicate how the differential embeddedness of al-Qaeda members affects social capital, which in turn provides insights into leadership potential.

  16. Numerical simulation of turbulent flows past the RoBin helicopter with a four-bladed rotor

    Energy Technology Data Exchange (ETDEWEB)

    Xu, H.; Mamou, M.; Khalid, M. [National Research Council, Inst. for Aerospace Research, Ottawa, Ontario (Canada)]. E-mail: Hongyi.Xu@nrc.ca

    2003-07-01

    The current paper presents a turbulent flow simulation study of the generic RoBin helicopter with a four-bladed rotor using the Chimera moving-grid approach. The aerodynamic performance of the rotor blades and their interactions with the RoBin fuselage are investigated using the k-ω SST turbulence model contained in the WIND code. The rotor is configured as a Chimera moving grid in a quasi-steady flow field. The rotor blades are rectangular, untapered, linearly twisted and built from the NACA 0012 airfoil profile. The blade motion (rotation and cyclic pitching) schedule is that specified in the NASA wind tunnel testing of the generic RoBin helicopter. The radial aerodynamic load distributions in the rotor plane are generated by integrating the pressure on each blade surface along the chordwise direction. The rotor flow interacts strongly with the flow coming off the fuselage and thus has a significant impact on helicopter aerodynamic performance. (author)

  17. Using Growing Self-Organising Maps to Improve the Binning Process in Environmental Whole-Genome Shotgun Sequencing

    Science.gov (United States)

    Chan, Chon-Kit Kenneth; Hsu, Arthur L.; Tang, Sen-Lin; Halgamuge, Saman K.

    2008-01-01

    Metagenomic projects using whole-genome shotgun (WGS) sequencing produce many unassembled DNA sequences and small contigs. The step of clustering these sequences, based on biological and molecular features, is called binning. A reported strategy for binning that combines oligonucleotide frequency and self-organising maps (SOM) shows high potential. We improve this strategy by identifying suitable training features, implementing a better clustering algorithm, and defining quantitative measures for assessing results. We investigated the suitability of each of di-, tri-, tetra-, and pentanucleotide frequencies. The results show that dinucleotide frequency is not a sufficiently strong signature for binning 10 kb long DNA sequences, compared to the other three. Furthermore, we observed that an increased order of oligonucleotide frequency may deteriorate the assignment result in some cases, which indicates the possible existence of an optimal species-specific oligonucleotide frequency. We replaced SOM with the growing self-organising map (GSOM), obtaining comparable results while gaining a 7%–15% speed improvement. PMID:18288261
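
    The training feature itself is routine to reconstruct: a normalized k-mer frequency vector (tetranucleotide for k = 4) per sequence, as sketched below, which can then be fed to a SOM/GSOM or any other clustering algorithm.

```python
from itertools import product
import numpy as np

def oligo_frequency(seq, k=4):
    """Normalized k-mer frequency vector of a DNA sequence
    (tetranucleotide frequency for the default k = 4)."""
    kmers = ["".join(p) for p in product("ACGT", repeat=k)]
    index = {m: i for i, m in enumerate(kmers)}
    counts = np.zeros(len(kmers))
    seq = seq.upper()
    for i in range(len(seq) - k + 1):
        j = index.get(seq[i:i + k])
        if j is not None:        # skip k-mers containing N or other symbols
            counts[j] += 1
    return counts / max(counts.sum(), 1)
```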

  18. An adaptive-binning method for generating constant-uncertainty/constant-significance light curves with Fermi-LAT data

    International Nuclear Information System (INIS)

    Lott, B.; Escande, L.; Larsson, S.; Ballet, J.

    2012-01-01

    Here, we present a method enabling the creation of constant-uncertainty/constant-significance light curves with data from the Fermi Large Area Telescope (LAT). The adaptive-binning method encapsulates more information in the light curve than the fixed-binning method. Although primarily developed for blazar studies, it can be applied to any source. The method allows the starting and ending times of each interval to be calculated in a simple and quick first step; the mean flux and spectral index (assuming the spectrum is a power-law distribution) in each interval are then calculated via the standard LAT analysis in a second step. The absence of major caveats associated with this method has been established with Monte-Carlo simulations. We present the performance of this method in determining duty cycles as well as power-density spectra relative to the traditional fixed-binning method.
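
    A toy version of the idea is sketched below: each bin is closed once it accumulates a fixed number of events, so for a Poisson process the relative flux uncertainty per bin (roughly 1/sqrt(N)) stays constant and bins widen when the source is faint. A real LAT light curve additionally needs exposure handling and likelihood fits, which are omitted here.

```python
import numpy as np

def adaptive_bins(event_times, events_per_bin):
    """Bin edges such that each bin holds ~events_per_bin events,
    giving roughly constant relative uncertainty per bin."""
    t = np.sort(np.asarray(event_times))
    edges, count = [t[0]], 0
    for ti in t:
        count += 1
        if count >= events_per_bin:   # enough counts: close this bin
            edges.append(ti)
            count = 0
    if edges[-1] < t[-1]:
        edges.append(t[-1])           # remainder goes into a final bin
    return np.array(edges)
```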

  19. Three-dimensional volumetric gray-scale uterine cervix histogram prediction of days to delivery in full term pregnancy.

    Science.gov (United States)

    Kim, Ji Youn; Kim, Hai-Joong; Hahn, Meong Hi; Jeon, Hye Jin; Cho, Geum Joon; Hong, Sun Chul; Oh, Min Jeong

    2013-09-01

    Our aim was to determine whether the volumetric gray-scale histogram difference between the anterior and posterior cervix can indicate the extent of cervical consistency. We collected data from 95 patients at 36 to 37 weeks of gestational age who were appropriate for vaginal delivery, from September 2010 to October 2011 in the Department of Obstetrics and Gynecology, Korea University Ansan Hospital. Patients were excluded if they had any of the following: Cesarean section, labor induction, or premature rupture of membranes. Thirty-four patients were finally enrolled. The patients underwent evaluation of the cervix by Bishop score, cervical length, cervical volume and three-dimensional (3D) cervical volumetric gray-scale histogram, and the interval in days from the cervix evaluation to the delivery day was counted. We compared the 3D cervical volumetric gray-scale histogram, Bishop score, cervical length and cervical volume with the interval from the evaluation of the cervix to the delivery. The gray-scale histogram difference between the anterior and posterior cervix was significantly correlated with days to delivery, with a correlation coefficient (R) of 0.500 (P = 0.003). The cervical length was also significantly related to the days to delivery, with a correlation coefficient (R) of 0.421 (P = 0.013). However, the anterior lip histogram, posterior lip histogram, total cervical volume and Bishop score were not associated with days to delivery (P > 0.05). Both the gray-scale histogram difference between the anterior and posterior cervix and the cervical length correlated with the days to delivery; these methods can be utilized to help predict cervical consistency.

  20. Modeling Early Postnatal Brain Growth and Development with CT: Changes in the Brain Radiodensity Histogram from Birth to 2 Years.

    Science.gov (United States)

    Cauley, K A; Hu, Y; Och, J; Yorks, P J; Fielden, S W

    2018-04-01

    The majority of brain growth and development occurs in the first 2 years of life. This study investigated these changes by analysis of the brain radiodensity histogram of head CT scans from the clinical population, 0-2 years of age. One hundred twenty consecutive head CTs with normal findings meeting the inclusion criteria, from children from birth to 2 years, were retrospectively identified from 3 different CT scan platforms. Histogram analysis was performed on brain-extracted images, and the histogram mean, mode, full width at half maximum, skewness, kurtosis, and SD were correlated with subject age. The effects of scan platform were investigated, and normative curves were fitted by polynomial regression analysis. Average total brain volume was 360 cm³ at birth, 948 cm³ at 1 year, and 1072 cm³ at 2 years. Total brain tissue density showed an 11% increase in mean density at 1 year and 19% at 2 years. Brain radiodensity histogram skewness was positive at birth, declining logarithmically in the first 200 days of life; the histogram kurtosis also decreased in the first 200 days to approach a normal distribution. Direct segmentation of the CT images showed that the changes in brain radiodensity histogram skewness correlated with, and can be explained by, a relative increase in gray matter volume and an increase in gray and white matter tissue density during this period of brain maturation. Normative metrics of the brain radiodensity histogram derived from routine clinical head CT images can be used to develop a model of normal brain development. © 2018 by American Journal of Neuroradiology.
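
    The histogram metrics tracked in the study are standard and straightforward to reproduce; the sketch below computes them from brain-extracted voxel HU values (the FWHM estimate from the binned histogram is a deliberately crude assumption).

```python
import numpy as np
from scipy.stats import skew, kurtosis

def radiodensity_histogram_metrics(hu_values):
    """Summary metrics of a brain radiodensity histogram:
    mean, mode, FWHM, skewness, kurtosis and SD of the HU distribution."""
    hist, edges = np.histogram(hu_values, bins=256)
    centers = 0.5 * (edges[:-1] + edges[1:])
    mode = centers[np.argmax(hist)]
    above = centers[hist >= hist.max() / 2.0]   # bins above half maximum
    return {
        "mean": float(np.mean(hu_values)),
        "mode": float(mode),
        "fwhm": float(above.max() - above.min()),
        "skewness": float(skew(hu_values)),
        "kurtosis": float(kurtosis(hu_values)),
        "sd": float(np.std(hu_values)),
    }
```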

  1. DNA IMAGE CYTOMETRY IN PROGNOSTICATION OF COLORECTAL CANCER: PRACTICAL CONSIDERATIONS OF THE TECHNIQUE AND INTERPRETATION OF THE HISTOGRAMS

    Directory of Open Access Journals (Sweden)

    Abdelbaset Buhmeida

    2011-05-01

    Full Text Available The role of DNA content as a prognostic factor in colorectal cancer (CRC) is highly controversial. Some of these controversies are due to purely technical reasons, e.g. variable practices in interpreting the DNA histograms, which is problematic particularly in advanced cases. In this report, we give a detailed account of various options for how these histograms could be optimally interpreted, with the idea of establishing the potential value of DNA image cytometry in prognosis and in the selection of proper treatment. The material consists of nuclei isolated from 50 µm paraffin sections from 160 patients with stage II, III or IV CRC diagnosed, treated and followed up in our clinic. The nuclei were stained with the Feulgen stain. Nuclear DNA was measured using computer-assisted image cytometry. We applied 4 different approaches to analyse the DNA histograms: (1) appearance of the histogram (ABCDE approach), (2) range of DNA values, (3) peak evaluation, and (4) events present at high DNA values. The intra-observer reproducibility of these four histogram interpretations was 89%, 95%, 96%, and 100%, respectively. We depict selected histograms to illustrate the four analytical approaches in cases with different stages of CRC and variable disease outcome. In our analysis, the range of DNA values was the best prognosticator, i.e., the tumours with the widest histograms had the most ominous prognosis. These data implicate that DNA cytometry based on isolated nuclei is valuable in predicting the prognosis of CRC. The different interpretation techniques differed in their reproducibility, but the method showing the best prognostic value also had high reproducibility in our analysis.

  2. Modified strip packing heuristics for the rectangular variable-sized bin packing problem

    Directory of Open Access Journals (Sweden)

    FG Ortmann

    2010-06-01

    Full Text Available Two packing problems are considered in this paper, namely the well-known strip packing problem (SPP) and the variable-sized bin packing problem (VSBPP). A total of 252 strip packing heuristics (and variations thereof) from the literature, as well as novel heuristics proposed by the authors, are compared statistically by means of 1170 SPP benchmark instances in order to identify the best heuristics in various classes. A combination of new heuristics with a new sorting method yields the best results. These heuristics are combined with a previous heuristic for the VSBPP by the authors to find good feasible solutions to 1357 VSBPP benchmark instances. To the best of the authors' knowledge, this is the largest statistical comparison of algorithms for the SPP and the VSBPP.
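
    As a point of reference for the problem class, the sketch below gives a first-fit-decreasing heuristic for a one-dimensional variable-sized bin packing analogue, where each bin type has a capacity and a cost; the paper's heuristics are two-dimensional (rectangular) and considerably more elaborate.

```python
def first_fit_decreasing_vsbpp(items, bin_types):
    """1D variable-sized bin packing baseline: place items in decreasing
    size order into the first open bin that fits; otherwise open the
    cheapest feasible bin type. items: sizes; bin_types: (capacity, cost).
    Raises ValueError if some item fits in no bin type."""
    bins = []                                # [remaining_capacity, cost]
    for item in sorted(items, reverse=True):
        for b in bins:
            if b[0] >= item:
                b[0] -= item
                break
        else:
            cap, cost = min((bt for bt in bin_types if bt[0] >= item),
                            key=lambda bt: bt[1])
            bins.append([cap - item, cost])
    return len(bins), sum(b[1] for b in bins)

# e.g. first_fit_decreasing_vsbpp([0.6, 0.5, 0.4], [(1.0, 3), (0.5, 2)])
# -> 2 bins with total cost 5
```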

  3. Face Image Retrieval of Efficient Sparse Code words and Multiple Attribute in Binning Image

    Directory of Open Access Journals (Sweden)

    Suchitra S

    2017-08-01

    Full Text Available ABSTRACT: In photography, face recognition and face retrieval play an important role in many applications such as security, criminology and image forensics. Advances in face recognition make it easier to match the identity of an individual using attributes, and the latest developments in computer vision enable us to extract facial attributes from an input image and return similar image results. In this paper, we propose a novel method combining local octal patterns (LOP) and sparse codewords to provide matching results similar to the input query image. To improve the accuracy of the results with respect to the input image and dynamic facial attributes, the LOP algorithm and sparse codewords are applied in both offline and online procedures, with the face images binned using the sparse codes. Experimental results on the PubFig dataset show that the proposed LOP along with sparse codewords is able to provide matching results with an increased accuracy of 90%.

  4. Hyperentanglement concentration for polarization-spatial-time-bin hyperentangled photon systems with linear optics

    Science.gov (United States)

    Wang, Hong; Ren, Bao-Cang; Alzahrani, Faris; Hobiny, Aatef; Deng, Fu-Guo

    2017-10-01

    Hyperentanglement has significant applications in quantum information processing. Here we present an efficient hyperentanglement concentration protocol (hyper-ECP) for partially hyperentangled Bell states simultaneously entangled in polarization, spatial-mode and time-bin degrees of freedom (DOFs) with the parameter-splitting method, where the parameters of the partially hyperentangled Bell states are known to the remote parties. In this hyper-ECP, only one remote party is required to perform some local operations on the three DOFs of a photon, only the linear optical elements are considered, and the success probability can achieve the maximal value. Our hyper-ECP can be easily generalized to concentrate the N-photon partially hyperentangled Greenberger-Horne-Zeilinger states with known parameters, where the multiple DOFs have largely improved the channel capacity of long-distance quantum communication. All of these make our hyper-ECP more practical and useful in high-capacity long-distance quantum communication.

  5. An adaptive bin framework search method for a beta-sheet protein homopolymer model

    Directory of Open Access Journals (Sweden)

    Hoos Holger H

    2007-04-01

    Full Text Available Abstract Background The problem of protein structure prediction consists of predicting the functional or native structure of a protein given its linear sequence of amino acids. This problem has played a prominent role in the fields of biomolecular physics and algorithm design for over 50 years. Additionally, its importance increases continually as a result of an exponential growth over time in the number of known protein sequences in contrast to a linear increase in the number of determined structures. Our work focuses on the problem of searching an exponentially large space of possible conformations as efficiently as possible, with the goal of finding a global optimum with respect to a given energy function. This problem plays an important role in the analysis of systems with complex search landscapes, and particularly in the context of ab initio protein structure prediction. Results In this work, we introduce a novel approach for solving this conformation search problem based on the use of a bin framework for adaptively storing and retrieving promising locally optimal solutions. Our approach provides a rich and general framework within which a broad range of adaptive or reactive search strategies can be realized. Here, we introduce adaptive mechanisms for choosing which conformations should be stored, based on the set of conformations already stored in memory, and for biasing choices when retrieving conformations from memory in order to overcome search stagnation. Conclusion We show that our bin framework combined with a widely used optimization method, Monte Carlo search, achieves significantly better performance than state-of-the-art generalized ensemble methods for a well-known protein-like homopolymer model on the face-centered cubic lattice.

  6. Peer-Allocated Instant Response (PAIR): Computional allocation of peer tutors in learning communities

    NARCIS (Netherlands)

    Westera, Wim

    2009-01-01

    Westera, W. (2007). Peer-Allocated Instant Response (PAIR): Computational allocation of peer tutors in learning communities. Journal of Artificial Societies and Social Simulation, http://jasss.soc.surrey.ac.uk/10/2/5.html

  7. Online Data Monitoring Framework Based on Histogram Packaging in Network Distributed Data Acquisition Systems

    International Nuclear Information System (INIS)

    Konno, T; Ishitsuka, M; Kuze, M; Cabarera, A; Sakamoto, Y

    2011-01-01

    'Online monitor framework' is a new general software framework for online data monitoring, which provides a way to collect information from online systems, including data acquisition, and to display it to shifters far from experimental sites. 'Monitor Server', a core system in this framework, gathers the monitoring information from the online subsystems; the information is handled as collections of histograms named 'Histogram Packages'. The Monitor Server broadcasts the histogram packages to 'Monitor Viewers', the graphical user interfaces of the framework. We developed two types of viewers based on different technologies: Java and web browser. We adopted XML-based files for the configuration of GUI components in the windows and of graphical objects on the canvases, so a Monitor Viewer creates its GUI automatically from the configuration files. This monitoring framework has been developed for the Double Chooz reactor neutrino oscillation experiment in France, but can be extended for general application in other experiments. This document reports the structure of the online monitor framework with some examples from its adaptation to the Double Chooz experiment.

  8. Radial polar histogram: obstacle avoidance and path planning for robotic cognition and motion control

    Science.gov (United States)

    Wang, Po-Jen; Keyawa, Nicholas R.; Euler, Craig

    2012-01-01

    In order to achieve highly accurate motion control and path planning for a mobile robot, an obstacle avoidance algorithm that provides a desired instantaneous turning radius and velocity was developed. This obstacle avoidance algorithm, implemented in California State University Northridge's Intelligent Ground Vehicle (IGV), is known as the Radial Polar Histogram (RPH). The RPH algorithm utilizes raw data in the form of a polar histogram read from a Laser Range Finder (LRF) and a camera. A desired open block is determined from the raw data utilizing a navigational heading and an elliptical approximation. The leftmost and rightmost radii are determined from the calculated edges of the open block and provide the range of possible radial paths the IGV can travel through. In addition, the calculated obstacle edge positions allow the IGV to recognize complex obstacle arrangements and to slow down accordingly. A radial path optimization function calculates the best radial path between the leftmost and rightmost radii and sends it to motion control for speed determination. Overall, the RPH algorithm allows the IGV to autonomously travel at average speeds of 3 mph while avoiding all obstacles, with a processing time of approximately 10 ms.
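
    The core gap-finding step can be sketched as below: blocked sectors are marked from the laser ranges and the widest open angular block gives the desired heading. The elliptical approximation and the radial path optimization described above are omitted, and the clearance threshold is an assumption.

```python
import numpy as np

def widest_open_block(ranges_m, clearance_m=1.0, fov_deg=180):
    """Return a heading (degrees, 0 = straight ahead) toward the centre of
    the widest open angular block in a laser scan, or None if fully blocked."""
    n = len(ranges_m)
    angles = np.linspace(-fov_deg / 2, fov_deg / 2, n)
    open_sector = np.asarray(ranges_m) > clearance_m
    best, cur_start, best_span = None, None, -1
    for i, ok in enumerate(list(open_sector) + [False]):  # sentinel closes a run
        if ok and cur_start is None:
            cur_start = i
        elif not ok and cur_start is not None:
            if i - cur_start > best_span:
                best_span, best = i - cur_start, (cur_start, i - 1)
            cur_start = None
    if best is None:
        return None                     # fully blocked: slow down or stop
    return (angles[best[0]] + angles[best[1]]) / 2.0
```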

  9. AN ILLUMINATION INVARIANT FACE RECOGNITION BY ENHANCED CONTRAST LIMITED ADAPTIVE HISTOGRAM EQUALIZATION

    Directory of Open Access Journals (Sweden)

    A. Thamizharasi

    2016-05-01

    Full Text Available Face recognition systems are gaining importance in social networks and surveillance. The face recognition task is complex due to variations in illumination, expression, occlusion, aging and pose. Illumination variations in an image are due to changes in lighting conditions, poor illumination, low contrast or increased brightness; they adversely affect image quality and recognition accuracy, and therefore have to be pre-processed prior to face recognition. Contrast Limited Adaptive Histogram Equalization (CLAHE) is an image enhancement technique popular for enhancing medical images. The proposed work creates an illumination-invariant face recognition system by enhancing the CLAHE technique; this method is termed "Enhanced CLAHE". The efficiency of Enhanced CLAHE is tested using a fuzzy k-nearest-neighbour classifier and the Fisherface subspace projection method. The face recognition accuracy rate, the Equal Error Rate and the False Acceptance Rate at 1% are calculated, and the performance of the CLAHE and Enhanced CLAHE methods is compared on three public face databases: AR, Yale and ORL. Enhanced CLAHE achieves a much higher recognition accuracy rate than CLAHE.
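
    The baseline technique is available directly in OpenCV; the snippet below shows plain CLAHE as an illumination pre-processing step. The paper's "Enhanced CLAHE" modification is not specified in this abstract, so only the baseline is shown, and the file names are placeholders.

```python
import cv2

# Baseline CLAHE for illumination normalization before face recognition.
img = cv2.imread("face.png", cv2.IMREAD_GRAYSCALE)     # placeholder path
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
normalized = clahe.apply(img)                          # equalized face image
cv2.imwrite("face_clahe.png", normalized)
```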

  10. Characterization of Diffusion Metric Map Similarity in Data From a Clinical Data Repository Using Histogram Distances

    Science.gov (United States)

    Warner, Graham C.; Helmer, Karl G.

    2018-01-01

    As the sharing of data is mandated by funding agencies and journals, reuse of data has become more prevalent. It becomes imperative, therefore, to develop methods to characterize the similarity of data. While users can group data based on the acquisition parameters stored in the file headers, this gives no indication of whether a file can be combined with other data without increasing the variance in the data set. Methods have been implemented that characterize the signal-to-noise ratio or identify signal drop-outs in the raw image files, but potential users of data often have access only to calculated metric maps, and these are more difficult to characterize and compare. Here we describe a histogram-distance-based method applied to diffusion metric maps of fractional anisotropy and mean diffusivity that were generated using data extracted from a repository of clinically acquired MRI data. We describe the generation of the data set, the pitfalls specific to diffusion MRI data, and the results of the histogram distance analysis. We find that, in general, data from GE scanners are less similar than are data from Siemens scanners. We also find that the distribution of distance metric values is not Gaussian at any selection of the acquisition parameters considered here (field strength, number of gradient directions, b-value, and vendor). PMID:29568257
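
    As an example of the approach, the sketch below compares two fractional-anisotropy maps via the chi-squared distance between their normalized histograms; the particular distance, masking and binning are illustrative choices rather than the study's exact protocol.

```python
import numpy as np

def fa_histogram_distance(fa_map_a, fa_map_b, bins=100):
    """Chi-squared distance between the normalized FA histograms of two
    metric maps (FA lies in [0, 1]; zero voxels treated as background)."""
    h1, edges = np.histogram(fa_map_a[fa_map_a > 0], bins=bins, range=(0, 1))
    h2, _ = np.histogram(fa_map_b[fa_map_b > 0], bins=edges)
    p = h1 / max(h1.sum(), 1)
    q = h2 / max(h2.sum(), 1)
    denom = p + q
    mask = denom > 0
    return 0.5 * float(np.sum((p[mask] - q[mask]) ** 2 / denom[mask]))
```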

  11. Predicting the nodal status in gastric cancers: The role of apparent diffusion coefficient histogram characteristic analysis.

    Science.gov (United States)

    Liu, Song; Zhang, Yujuan; Xia, Jie; Chen, Ling; Guan, Wenxian; Guan, Yue; Ge, Yun; He, Jian; Zhou, Zhengyang

    2017-10-01

    To explore the application of histogram analysis in the preoperative T and N staging of gastric cancers, with a focus on characteristic parameters of apparent diffusion coefficient (ADC) maps. Eighty-seven patients with gastric cancers underwent diffusion-weighted magnetic resonance imaging (b = 0, 1000 s/mm²), which generated ADC maps. Whole-volume histogram analysis was performed on the ADC maps and 7 characteristic parameters were obtained. All patients underwent surgery, and postoperative pathologic T and N stages were determined. Four parameters, including skew, kurtosis, s-sD av and sample number, showed significant differences among gastric cancers at different T and N stages. Most parameters correlated significantly with T and N stages and were effective in differentiating gastric cancers at different T or N stages. In particular, skew yielded a sensitivity of 0.758, a specificity of 0.810, and an area under the curve (AUC) of 0.802 for differentiating gastric cancers with and without lymph node metastasis (P < 0.05). ADC histogram analysis could thus help in assessing the preoperative T and N stages of gastric cancers. Copyright © 2017. Published by Elsevier Inc.

  12. Conductance of single-atom platinum contacts: Voltage dependence of the conductance histogram

    DEFF Research Database (Denmark)

    Nielsen, S.K.; Noat, Y.; Brandbyge, Mads

    2003-01-01

    The conductance of a single-atom contact is sensitive to the coupling of this contact atom to the atoms in the leads. Notably for the transition metals this gives rise to a considerable spread in the observed conductance values. The mean conductance value and spread can be obtained from the first peak in conductance histograms recorded from a large set of contact-breaking cycles. In contrast to the monovalent metals, this mean value for Pt depends strongly on the applied voltage bias and other experimental conditions, and values ranging from about 1 G0 to 2.5 G0 (G0 = 2e²/h) have been reported. We find that at low bias the first peak in the conductance histogram is centered around 1.5 G0. However, as the bias increases past 300 mV the peak shifts to 1.8 G0. Here we show that this bias dependence is due to a geometric effect whereby monatomic chains are replaced by single-atom contacts

  13. Tools for the analysis of dose optimization: I. Effect-volume histogram

    International Nuclear Information System (INIS)

    Alber, M.; Nuesslin, F.

    2002-01-01

    With the advent of dose optimization algorithms, predominantly for intensity-modulated radiotherapy (IMRT), computer software has progressed beyond the point of being merely a tool at the hands of an expert and has become an active, independent mediator of the dosimetric conflicts between treatment goals and risks. To understand and control the internal decision-making, as well as to provide means to influence it, a tool for the analysis of the dose distribution is presented which reveals the decision-making process performed by the algorithm. The internal trade-offs between partial volumes receiving high or low doses are driven by functions which attribute a weight to each volume element. The statistics of the distribution of these weights is cast into an effect-volume histogram (EVH) in analogy to dose-volume histograms. The analysis of the EVH reveals which traits of the optimum dose distribution result from the defined objectives, and which are a random consequence of under- or misspecification of treatment goals. The EVH can further assist in the process of finding suitable objectives and balancing conflicting objectives. If biologically inspired objectives are used, the EVH shows the distribution of local dose effect relative to the prescribed level. (author)

  14. Two non-parametric methods for derivation of constraints from radiotherapy dose–histogram data

    International Nuclear Information System (INIS)

    Ebert, M A; Kennedy, A; Joseph, D J; Gulliford, S L; Buettner, F; Foo, K; Haworth, A; Denham, J W

    2014-01-01

    Dose constraints based on histograms provide a convenient and widely used method for informing and guiding radiotherapy treatment planning, yet methods for deriving such constraints are often poorly described. Two non-parametric methods for the derivation of constraints are described and investigated in the context of determining dose-specific cut-points: values of the free parameter (e.g., percentage volume of the irradiated organ) which best reflect resulting changes in complication incidence. A method based on receiver operating characteristic (ROC) analysis and one based on a maximally-selected standardized rank sum are described and compared using rectal toxicity data from a prostate radiotherapy trial. Multiple-test corrections are applied using a free step-down resampling algorithm, which accounts for the large number of tests undertaken to search for optimal cut-points and the inherent correlation between dose–histogram points. Both methods provide consistent significant cut-point values, with the rank sum method displaying some sensitivity to the underlying data. The ROC method is simple to implement and can utilize a complication atlas, though an advantage of the rank sum method is the ability to incorporate all complication grades without the need for grade dichotomization. (note)
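
    A minimal sketch of the ROC-based variant follows: it scans candidate cut-points of the free parameter and keeps the one maximizing the Youden index (sensitivity + specificity - 1). The paper's free step-down resampling correction for multiple testing is omitted.

```python
import numpy as np

def roc_cutpoint(volume_fraction, complication):
    """Best cut-point of a dose-histogram parameter separating patients
    with and without a complication, by maximum Youden index."""
    x = np.asarray(volume_fraction, dtype=float)
    y = np.asarray(complication, dtype=bool)
    best_t, best_j = None, -1.0
    for t in np.unique(x):
        pred = x >= t                      # "high-dose-volume" patients
        sens = (pred & y).sum() / max(y.sum(), 1)
        spec = (~pred & ~y).sum() / max((~y).sum(), 1)
        j = sens + spec - 1.0
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j
```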

  15. Histogram analysis of greyscale sonograms to differentiate between the subtypes of follicular variant of papillary thyroid cancer.

    Science.gov (United States)

    Kwon, M-R; Shin, J H; Hahn, S Y; Oh, Y L; Kwak, J Y; Lee, E; Lim, Y

    2018-06-01

    To evaluate the diagnostic value of histogram analysis using ultrasound (US) to differentiate between the subtypes of follicular variant of papillary thyroid carcinoma (FVPTC). The present study included 151 patients with surgically confirmed FVPTC diagnosed between January 2014 and May 2016. Their preoperative US features were reviewed retrospectively. Histogram parameters (mean, maximum, minimum, range, root mean square, skewness, kurtosis, energy, entropy, and correlation) were obtained for each nodule. The 152 nodules in 151 patients comprised 48 non-invasive follicular thyroid neoplasm with papillary-like nuclear features (NIFTPs; 31.6%), 60 invasive encapsulated FVPTCs (EFVPTCs; 39.5%), and 44 infiltrative FVPTCs (28.9%). The US features differed significantly between the subtypes of FVPTC. Discrimination was achieved between NIFTPs and infiltrative FVPTC, and between invasive EFVPTC and infiltrative FVPTC using histogram parameters; however, the parameters were not significantly different between NIFTP and invasive EFVPTC. It is feasible to use greyscale histogram analysis to differentiate between NIFTP and infiltrative FVPTC, but not between NIFTP and invasive EFVPTC. Histograms can be used as a supplementary tool to differentiate the subtypes of FVPTC. Copyright © 2017 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.

  16. Histogram analysis of apparent diffusion coefficient for monitoring early response in patients with advanced cervical cancers undergoing concurrent chemo-radiotherapy.

    Science.gov (United States)

    Meng, Jie; Zhu, Lijing; Zhu, Li; Ge, Yun; He, Jian; Zhou, Zhengyang; Yang, Xiaofeng

    2017-11-01

    Background: Apparent diffusion coefficient (ADC) histogram analysis has been widely used in determining tumor prognosis. Purpose: To investigate the dynamic changes of ADC histogram parameters during concurrent chemo-radiotherapy (CCRT) in patients with advanced cervical cancers. Material and Methods: This prospective study enrolled 32 patients with advanced cervical cancers undergoing CCRT who received diffusion-weighted (DW) magnetic resonance imaging (MRI) before CCRT, at the end of the second and fourth weeks during CCRT, and one month after CCRT completion. The ADC histogram for the entire tumor volume was generated, and a series of histogram parameters was obtained. Dynamic changes of those parameters in cervical cancers were investigated as early biomarkers of treatment response. Results: All histogram parameters except AUClow showed significant changes during CCRT (all P < 0.05). Histogram parameters of cervical cancers changed significantly at the early stage of CCRT, indicating their potential in monitoring early tumor response to therapy.

  17. Time allocation of disabled individuals.

    Science.gov (United States)

    Pagán, Ricardo

    2013-05-01

    Although some studies have analysed the disability phenomenon and its effect on, for example, labour force participation, wages, job satisfaction, or the use of disability pension, the empirical evidence on how disability steals time (e.g. hours of work) from individuals is very scarce. This article examines how disabled individuals allocate their time to daily activities as compared to their non-disabled counterparts. Using time diary information from the Spanish Time Use Survey (last quarter of 2002 and the first three quarters of 2003), we estimate the determinants of time (minutes per day) spent on four aggregate categories (market work, household production, tertiary activities and leisure) for a sample of 27,687 non-disabled and 5250 disabled individuals and decompose the observed time differential by using the Oaxaca-Blinder methodology. The results show that disabled individuals devote less time to market work (especially females), and more time to household production (e.g. cooking, cleaning, child care), tertiary activities (e.g., sleeping, personal care, medical treatment) and leisure activities. We also find a significant effect of age on the time spent on daily activities and important differences by gender and disability status. The results are consistent with the hypothesis that disability steals time, and reiterate the fact that more public policies are needed to balance working life and health concerns among disabled individuals. Copyright © 2013 Elsevier Ltd. All rights reserved.

  18. Multiobjective optimal allocation problem with probabilistic non ...

    African Journals Online (AJOL)

    user

    In multivariate stratified sampling, where more than one characteristic is to be estimated, an allocation which is optimum for one characteristic may not be optimum for the other characteristics. In such situations a compromise criterion is needed to work out a usable allocation which is optimum for all characteristics in some ...

  19. Bounds in the location-allocation problem

    DEFF Research Database (Denmark)

    Juel, Henrik

    1981-01-01

    Develops a family of stronger lower bounds on the objective function value of the location-allocation problem. Solution methods proposed to solve problems in location-allocation; Efforts to develop a more efficient bound solution procedure; Determination of the locations of the sources....

  20. A Time Allocation Study of University Faculty

    Science.gov (United States)

    Link, Albert N.; Swann, Christopher A.; Bozeman, Barry

    2008-01-01

    Many previous time allocation studies treat work as a single activity and examine trade-offs between work and other activities. This paper investigates the at-work allocation of time among teaching, research, grant writing and service by science and engineering faculty at top US research universities. We focus on the relationship between tenure…

  1. Obtaining a Proportional Allocation by Deleting Items

    NARCIS (Netherlands)

    Dorn, B.; de Haan, R.; Schlotter, I.; Röthe, J.

    2017-01-01

    We consider the following control problem on fair allocation of indivisible goods. Given a set I of items and a set of agents, each having strict linear preference over the items, we ask for a minimum subset of the items whose deletion guarantees the existence of a proportional allocation in the

  2. Directed networks, allocation properties and hierarchy formation

    NARCIS (Netherlands)

    Slikker, M.; Gilles, R.P.; Norde, H.W.; Tijs, S.H.

    2005-01-01

    We investigate properties for allocation rules on directed communication networks and the formation of such networks under these payoff properties. We study allocation rules satisfying two appealing properties, Component Efficiency (CE) and the Hierarchical Payoff Property (HPP). We show that such

  3. Nash social welfare in multiagent resource allocation

    NARCIS (Netherlands)

    Ramezani, S.; Endriss, U.; David, E.; Gerding, E.; Sarne, D.; Shehory, O.

    2010-01-01

    We study different aspects of the multiagent resource allocation problem when the objective is to find an allocation that maximizes Nash social welfare, the product of the utilities of the individual agents. The Nash solution is an important welfare criterion that combines efficiency and fairness
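
    For small instances the Nash solution can be found by brute force, as sketched below under the assumption of additive utilities; this is illustrative only, since the search space grows as |agents|^|items|.

```python
from itertools import product

def max_nash_welfare(utilities):
    """Exhaustive search for the allocation of indivisible items that
    maximizes the product of agent utilities (additive utilities assumed).
    utilities[a][i] = value of item i to agent a."""
    n_agents, n_items = len(utilities), len(utilities[0])
    best, best_w = None, -1.0
    for assign in product(range(n_agents), repeat=n_items):
        w = 1.0
        for a in range(n_agents):
            w *= sum(utilities[a][i] for i in range(n_items) if assign[i] == a)
        if w > best_w:
            best_w, best = w, assign
    return best, best_w

# e.g. max_nash_welfare([[3, 1, 2], [1, 4, 1]])
# -> ((0, 1, 0), 20.0): items 0 and 2 to agent 0, item 1 to agent 1
```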

  4. Risk and reliability allocation to risk control

    International Nuclear Information System (INIS)

    Vojnovic, D.; Kozuh, M.

    1992-01-01

    The risk allocation procedure is used as an analytical model to support optimal decision making for reliability/availability improvement planning. Both levels of decision criteria, the plant risk measures and the plant performance indices, are used in the risk allocation procedure. The decision support system uses the multi-objective decision-making concept. (author)

  5. Bidding for surplus in network allocation problems

    NARCIS (Netherlands)

    Slikker, M.

    2007-01-01

    In this paper we study non-cooperative foundations of network allocation rules. We focus on three allocation rules: the Myerson value, the position value and the component-wise egalitarian solution. For any of these three rules we provide a characterization based on component efficiency and some

  6. Cost Allocation and Convex Data Envelopment

    DEFF Research Database (Denmark)

    Hougaard, Jens Leth; Tind, Jørgen

    such as Data Envelopment Analysis (DEA). The convexity constraint of the BCC model introduces a non-zero slack in the objective function of the multiplier problem and we show that the cost allocation rules discussed in this paper can be used as candidates to allocate this slack value on to the input (or output...

  7. Optimal allocation of resources in systems

    International Nuclear Information System (INIS)

    Derman, C.; Lieberman, G.J.; Ross, S.M.

    1975-01-01

    In the design of a new system, or the maintenance of an old system, allocation of resources is of prime consideration. In allocating resources it is often beneficial to develop a solution that yields an optimal value of the system measure of desirability. In the context of the problems considered in this paper the resources to be allocated are components already produced (assembly problems) and money (allocation in the construction or repair of systems). The measure of desirability for system assembly will usually be maximizing the expected number of systems that perform satisfactorily and the measure in the allocation context will be maximizing the system reliability. Results are presented for these two types of general problems in both a sequential (when appropriate) and non-sequential context

  8. Correlation between surrogates of bladder dosimetry and dose–volume histograms of the bladder wall defined on MRI in prostate cancer radiotherapy

    International Nuclear Information System (INIS)

    Carillo, Viviana; Cozzarini, Cesare; Chietera, Andreina; Perna, Lucia; Gianolini, Stefano; Maggio, Angelo; Botti, Andrea; Rancati, Tiziana; Valdagni, Riccardo; Fiorino, Claudio

    2012-01-01

    The correlation between bladder dose–wall-histogram (DWH) and dose–volume-histogram (DVH), dose–surface-histogram (DSH), and DVH-5/10 was investigated in a group of 28 patients; bladder walls were drawn on T2-MRI. DVH showed the poorest correlation with DWH; DSH or DVH-5/10 should be preferred in planning; absolute DVH may be used for radical patients, although less robust.

  9. Single-Cell-Genomics-Facilitated Read Binning of Candidate Phylum EM19 Genomes from Geothermal Spring Metagenomes.

    Science.gov (United States)

    Becraft, Eric D; Dodsworth, Jeremy A; Murugapiran, Senthil K; Ohlsson, J Ingemar; Briggs, Brandon R; Kanbar, Jad; De Vlaminck, Iwijn; Quake, Stephen R; Dong, Hailiang; Hedlund, Brian P; Swingley, Wesley D

    2016-02-15

    The vast majority of microbial life remains uncatalogued due to the inability to cultivate these organisms in the laboratory. This "microbial dark matter" represents a substantial portion of the tree of life and of the populations that contribute to chemical cycling in many ecosystems. In this work, we leveraged an existing single-cell genomic data set representing the candidate bacterial phylum "Calescamantes" (EM19) to calibrate machine learning algorithms and define metagenomic bins directly from pyrosequencing reads derived from Great Boiling Spring in the U.S. Great Basin. Compared to other assembly-based methods, taxonomic binning with a read-based machine learning approach yielded final assemblies with the highest predicted genome completeness of any method tested. Read-first binning was subsequently used to extract Calescamantes bins from all metagenomes with abundant Calescamantes populations, including metagenomes from Octopus Spring and Bison Pool in Yellowstone National Park and Gongxiaoshe Spring in Yunnan Province, China. Metabolic reconstruction suggests that Calescamantes are heterotrophic, facultative anaerobes, which can utilize oxidized nitrogen sources as terminal electron acceptors for respiration in the absence of oxygen and use proteins as their primary carbon source. Despite their phylogenetic divergence, the geographically separate Calescamantes populations were highly similar in their predicted metabolic capabilities and core gene content, respiring O2 or oxidized nitrogen species for energy conservation in distant but chemically similar hot springs. Copyright © 2016, American Society for Microbiology. All Rights Reserved.

  10. Odor volatiles associated with microflora in damp ventilated and non-ventilated bin-stored bulk wheat.

    Science.gov (United States)

    Tuma, D; Sinha, R N; Muir, W E; Abramson, D

    1989-05-01

    Western hard red spring wheat, stored at 20 and 25% moisture contents for 10 months during 1985-86, was monitored for biotic and abiotic variables in 10 unheated bins in Winnipeg, Manitoba. The major odor volatiles identified were 3-methyl-1-butanol, 3-octanone and 1-octen-3-ol. The production of these volatiles was associated and correlated with microfloral infection. Ventilation, used for cooling and drying of grain, disrupted microfloral growth patterns and production of volatiles. The highest levels of 3-methyl-1-butanol occurred in 25% moisture content wheat infected with bacteria, Penicillium spp. and Fusarium spp. In non-ventilated (control) bins with 20% moisture content wheat, 3-methyl-1-butanol was correlated with infection by members of the Aspergillus glaucus group and bacteria. In control bins, 1-octen-3-ol production was correlated with infection of wheat of both moisture contents by Penicillium spp. The fungal species, isolated from damp bin-stored wheat and tested for production of odor volatiles on wheat substrate, included Alternaria alternata (Fr.) Keissler, Aspergillus repens (Corda) Saccardo, A. flavus Link ex Fries, A. versicolor (Vuill.) Tiraboschi, Penicillium chrysogenum Thom, P. cyclopium Westling, Fusarium moniliforme Sheldon, F. semitectum (Cooke) Sacc. In the laboratory, fungus-inoculated wheat produced 3-methyl-1-butanol; 3-octanone and 1-octen-3-ol were also produced, but less frequently. Two unidentified bacterial species isolated from damp wheat and inoculated on agar produced 3-methyl-1-butanol.

  11. APT Blanket Detailed Bin Model Based on Initial Plate-Type Design -3D FLOWTRAN-TF Model

    International Nuclear Information System (INIS)

    Hamm, L.L.

    1998-01-01

    This report provides background information for a series of reports documenting accident scenario simulations for the Accelerator Production of Tritium (APT) blanket heat removal systems. The simulations were performed in support of the Preliminary Safety Analysis Report for the APT. This report gives a brief description of the FLOWTRAN-TF code which was used for detailed blanket bin modeling

  12. Test Anxiety & Its Relation to Perceived Academic Self-Efficacy among Al Hussein Bin Talal University Students

    Science.gov (United States)

    sa'ad alzboon, Habis

    2016-01-01

    This study aimed to identify the degree of perceived academic self-efficacy and the nature of the relationship between test anxiety and perceived academic self-efficacy among students of Al Hussein Bin Talal University (AHU), and to identify statistically significant differences attributable to gender, college and…

  13. Internet Addiction and Its Relationship with Self-Efficacy Level among Al-Hussein Bin Talal University Students

    Science.gov (United States)

    Alrekebat, Amjad Farhan

    2016-01-01

    The aim of this study was to identify Internet addiction and its relationship to the level of self-efficacy among Al-Hussein Bin Talal University students. The study sample consisted of 300 female and male students, who were selected randomly. The participants completed a questionnaire that consisted of two scales: Internet addiction, which was…

  14. What do you do for your health? / Ly Jagor, Ülle Mihhailova, Ene Pill, Katrin Käbin...[et al.

    Index Scriptorium Estoniae

    2008-01-01

    Answering the question are Ly Jagor, psychologist at the Pärnu Educational Counselling Centre; Ülle Mihhailova, deputy head of the Maasika kindergarten; Ene Pill, psychologist at the Tallinn Family Centre and the Tähetorn children's centre; Katrin Käbin, leader of the toddlers' group at the Tallinn Nõmme Youth House; and Silva Kolk, teacher at Tääksi Basic School

  15. Condition monitoring of face milling tool using K-star algorithm and histogram features of vibration signal

    Directory of Open Access Journals (Sweden)

    C.K. Madhusudana

    2016-09-01

    Full Text Available This paper deals with fault diagnosis of a face milling tool based on a machine learning approach using histogram features and the K-star algorithm. Vibration signals of the milling tool under healthy and different fault conditions are acquired during machining of steel alloy 42CrMo4. Histogram features are extracted from the acquired signals. A decision tree is used to select the salient features out of all the extracted features, and these selected features are used as input to the classifier. The K-star algorithm is used as the classifier, and the output of the model is utilised to study and classify the different conditions of the face milling tool. Based on the experimental results, the K-star algorithm provided classification accuracy in the range of 94% to 96% with histogram features, which is acceptable for fault diagnosis.
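
    The feature step is straightforward to reproduce: a fixed-range histogram of each vibration segment serves as the feature vector, as sketched below. K-star itself is a Weka classifier, so a k-nearest-neighbour stand-in is used here purely to make the pipeline runnable; the toy signals and labels are assumptions.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def histogram_features(signal, bins=20):
    """Fixed-range amplitude histogram of a vibration segment, normalized."""
    hist, _ = np.histogram(signal, bins=bins, range=(-10, 10))
    return hist / max(hist.sum(), 1)

# Toy training data: low-variance "healthy" vs high-variance "faulty" segments.
X = np.array([histogram_features(np.random.randn(1024) * s) for s in (1, 1, 3, 3)])
y = np.array([0, 0, 1, 1])                      # 0 = healthy, 1 = faulty
clf = KNeighborsClassifier(n_neighbors=1).fit(X, y)   # stand-in for K-star
print(clf.predict([histogram_features(np.random.randn(1024) * 3)]))
```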

  16. Expression robust 3D face recognition via mesh-based histograms of multiple order surface differential quantities

    KAUST Repository

    Li, Huibin

    2011-09-01

    This paper presents a mesh-based approach for 3D face recognition using a novel local shape descriptor and a SIFT-like matching process. Both maximum and minimum curvatures estimated in the 3D Gaussian scale space are employed to detect salient points. To comprehensively characterize 3D facial surfaces and their variations, we calculate weighted statistical distributions of multiple order surface differential quantities, including histogram of mesh gradient (HoG), histogram of shape index (HoS) and histogram of gradient of shape index (HoGS) within a local neighborhood of each salient point. The subsequent matching step then robustly associates corresponding points of two facial surfaces, leading to much more matched points between different scans of a same person than the ones of different persons. Experimental results on the Bosphorus dataset highlight the effectiveness of the proposed method and its robustness to facial expression variations. © 2011 IEEE.

  17. Whole-tumour diffusion kurtosis MR imaging histogram analysis of rectal adenocarcinoma: Correlation with clinical pathologic prognostic factors.

    Science.gov (United States)

    Cui, Yanfen; Yang, Xiaotang; Du, Xiaosong; Zhuo, Zhizheng; Xin, Lei; Cheng, Xintao

    2018-04-01

    To investigate potential relationships between diffusion kurtosis imaging (DKI)-derived parameters using whole-tumour volume histogram analysis and clinicopathological prognostic factors in patients with rectal adenocarcinoma. 79 consecutive patients with rectal adenocarcinoma who underwent MRI examination were retrospectively evaluated. Parameters D, K and conventional ADC were measured using whole-tumour volume histogram analysis. Student's t-test or Mann-Whitney U-test, receiver operating characteristic curves and Spearman's correlation were used for statistical analysis. Almost all the percentile metrics of K were correlated positively with nodal involvement, higher histological grades, the presence of lymphangiovascular invasion (LVI) and circumferential margin (CRM) involvement (p < 0.05). DKI parameters derived from whole-tumour volume histogram analysis, especially the K parameters, were associated with important prognostic factors of rectal cancer. • K correlated positively with some important prognostic factors of rectal cancer. • K mean showed higher AUC and specificity for differentiation of nodal involvement. • DKI metrics with whole-tumour volume histogram analysis depicted tumour heterogeneity.
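
    Whole-tumour volume histogram analysis of this kind reduces a parametric map (here D, K or ADC) over a tumour mask to percentile statistics. A minimal sketch, assuming the map and a binary mask are numpy arrays of equal shape; the percentile list is typical for such studies rather than this paper's exact set.

      import numpy as np

      def whole_tumour_percentiles(param_map, mask, qs=(5, 10, 25, 50, 75, 90, 95)):
          """Percentile metrics of a parameter map over a tumour mask."""
          voxels = param_map[mask > 0]
          stats = {f"p{q}": np.percentile(voxels, q) for q in qs}
          stats["mean"] = voxels.mean()
          return stats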

  18. Influence of Sampling Practices on the Appearance of DNA Image Histograms of Prostate Cells in FNAB Samples

    Directory of Open Access Journals (Sweden)

    Abdelbaset Buhmeida

    1999-01-01

    Twenty-one fine needle aspiration biopsies (FNAB) of the prostate, diagnostically classified as definitely malignant, were studied. The Papanicolaou or H&E stained samples were destained and then stained for DNA with the Feulgen reaction. DNA cytometry was applied after different sampling rules. The histograms varied according to the sampling rule applied. Because free cells between cell groups were easier to measure than cells in the cell groups, two sampling rules were tested in all samples: (i) cells in the cell groups were measured, and (ii) free cells between cell groups were measured. Abnormal histograms were more common after the sampling rule based on free cells, suggesting that abnormal patterns are best revealed through the free cells in these samples. The conclusions were independent of the applied histogram interpretation method.

  19. Histogram analysis of diffusion kurtosis imaging derived maps may distinguish between low and high grade gliomas before surgery.

    Science.gov (United States)

    Qi, Xi-Xun; Shi, Da-Fa; Ren, Si-Xie; Zhang, Su-Ya; Li, Long; Li, Qing-Chang; Guan, Li-Ming

    2018-04-01

    To investigate the value of histogram analysis of diffusion kurtosis imaging (DKI) maps in the evaluation of glioma grading. A total of 39 glioma patients who underwent preoperative magnetic resonance imaging (MRI) were classified into low-grade (13 cases) and high-grade (26 cases) glioma groups. Parametric DKI maps were derived, and histogram metrics between low- and high-grade gliomas were analysed. The optimum diagnostic thresholds of the parameters, area under the receiver operating characteristic curve (AUC), sensitivity, and specificity were achieved using a receiver operating characteristic (ROC) analysis. Significant differences were observed in 12 metrics of the histogram DKI parameters between low- and high-grade gliomas (P < 0.05). Histogram analysis of DKI may be more effective in glioma grading.

  20. Assessment of histological differentiation in gastric cancers using whole-volume histogram analysis of apparent diffusion coefficient maps.

    Science.gov (United States)

    Zhang, Yujuan; Chen, Jun; Liu, Song; Shi, Hua; Guan, Wenxian; Ji, Changfeng; Guo, Tingting; Zheng, Huanhuan; Guan, Yue; Ge, Yun; He, Jian; Zhou, Zhengyang; Yang, Xiaofeng; Liu, Tian

    2017-02-01

    To investigate the efficacy of histogram analysis of the entire tumor volume in apparent diffusion coefficient (ADC) maps for differentiating between histological grades in gastric cancer. Seventy-eight patients with gastric cancer were enrolled in a retrospective 3.0T magnetic resonance imaging (MRI) study. ADC maps were obtained at two different b values (0 and 1000 sec/mm²) for each patient. Tumors were delineated on each slice of the ADC maps, and a histogram for the entire tumor volume was subsequently generated. A series of histogram parameters (eg, skew and kurtosis) were calculated and correlated with the histological grade of the surgical specimen. The diagnostic performance of each parameter for distinguishing poorly from moderately well-differentiated gastric cancers was assessed by using the area under the receiver operating characteristic curve (AUC). There were significant differences in the 5th, 10th, 25th, and 50th percentiles, skew, and kurtosis between poorly and well-differentiated gastric cancers (P < 0.05). The histological grade was correlated with several histogram parameters, including the 10th percentile, skew, kurtosis, and max frequency; the correlation coefficients were 0.273, -0.361, -0.339, and -0.370, respectively. Among all the histogram parameters, the max frequency had the largest AUC value, which was 0.675. Histogram analysis of the ADC maps on the basis of the entire tumor volume can be useful in differentiating between histological grades for gastric cancer. Level of Evidence: 4. J. Magn. Reson. Imaging 2017;45:440-449. © 2016 International Society for Magnetic Resonance in Medicine.
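
    The shape descriptors used here (skew, kurtosis, max frequency) can be sketched with scipy on the voxel values inside the tumour delineation. The bin count behind the modal-bin estimate is an assumption for illustration.

      import numpy as np
      from scipy import stats

      def adc_histogram_shape(adc_map, mask, n_bins=128):
          """Skew, kurtosis and modal bin centre of ADC values in a mask."""
          vals = adc_map[mask > 0]
          counts, edges = np.histogram(vals, bins=n_bins)
          peak = counts.argmax()
          return {
              "skew": stats.skew(vals),
              "kurtosis": stats.kurtosis(vals),  # excess kurtosis by default
              "mode_centre": 0.5 * (edges[peak] + edges[peak + 1]),
          }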

  1. Histogram Analysis of CT Perfusion of Hepatocellular Carcinoma for Predicting Response to Transarterial Radioembolization: Value of Tumor Heterogeneity Assessment.

    Science.gov (United States)

    Reiner, Caecilia S; Gordic, Sonja; Puippe, Gilbert; Morsbach, Fabian; Wurnig, Moritz; Schaefer, Niklaus; Veit-Haibach, Patrick; Pfammatter, Thomas; Alkadhi, Hatem

    2016-03-01

    To evaluate, in patients with hepatocellular carcinoma (HCC), whether assessment of tumor heterogeneity by histogram analysis of computed tomography (CT) perfusion helps predict response to transarterial radioembolization (TARE). Sixteen patients (15 male; mean age 65 years; age range 47-80 years) with HCC underwent CT liver perfusion for treatment planning prior to TARE with Yttrium-90 microspheres. Arterial perfusion (AP) derived from CT perfusion was measured in the entire tumor volume, and heterogeneity was analyzed voxel-wise by histogram analysis. Response to TARE was evaluated on follow-up imaging (median follow-up, 129 days) based on modified Response Evaluation Criteria in Solid Tumors (mRECIST). Results of histogram analysis and mean AP values of the tumor were compared between responders and non-responders. Receiver operating characteristics were calculated to determine the parameters' ability to discriminate responders from non-responders. According to mRECIST, 8 patients (50%) were responders and 8 (50%) non-responders. Comparing responders and non-responders, the 50th and 75th percentiles of AP derived from histogram analysis were significantly different [AP 43.8/54.3 vs. 27.6/34.3 mL min⁻¹ 100 mL⁻¹; p < 0.05], whereas the mean AP of the tumor (p > 0.05) was not. Further heterogeneity parameters from histogram analysis (skewness, coefficient of variation, and 25th percentile) did not differ between responders and non-responders (p > 0.05). If the cut-off for the 75th percentile was set to an AP of 37.5 mL min⁻¹ 100 mL⁻¹, therapy response could be predicted with a sensitivity of 88% (7/8) and specificity of 75% (6/8). Voxel-wise histogram analysis of pretreatment CT perfusion indicating tumor heterogeneity of HCC improves the pretreatment prediction of response to TARE.

  2. The histogram analysis of diffusion-weighted intravoxel incoherent motion (IVIM) imaging for differentiating the gleason grade of prostate cancer.

    Science.gov (United States)

    Zhang, Yu-Dong; Wang, Qing; Wu, Chen-Jiang; Wang, Xiao-Ning; Zhang, Jing; Liu, Hui; Liu, Xi-Sheng; Shi, Hai-Bin

    2015-04-01

    To evaluate histogram analysis of intravoxel incoherent motion (IVIM) for discriminating the Gleason grade of prostate cancer (PCa). A total of 48 patients pathologically confirmed as having clinically significant PCa (size > 0.5 cm) underwent preoperative DW-MRI (b = 0-900 s/mm²). Data were post-processed with monoexponential and IVIM models for quantitation of apparent diffusion coefficients (ADCs), perfusion fraction f, diffusivity D and pseudo-diffusivity D*. Histogram analysis was performed by outlining entire-tumour regions of interest (ROIs) from histological-radiological correlation. The ability of imaging indices to differentiate low-grade (LG, Gleason score (GS) ≤6) from intermediate/high-grade (HG, GS > 6) PCa was analysed by ROC regression. Eleven patients had LG tumours (18 foci) and 37 patients had HG tumours (42 foci) on pathology examination. HG tumours had significantly lower ADCs and D in terms of mean, median, 10th and 75th percentiles, combined with higher histogram kurtosis and skewness for ADCs, D and f, than LG PCa (p < 0.05). Histogram D showed relatively higher correlations (ρ = 0.641-0.668 vs. ADCs: 0.544-0.574) with ordinal GS of PCa; and its mean, median and 10th percentile performed better than ADCs did in distinguishing LG from HG PCa. It is feasible to stratify the pathological grade of PCa by IVIM with histogram metrics. D performed better in distinguishing LG from HG tumour than conventional ADCs. • GS had relatively higher correlation with tumour D than ADCs. • Difference of histogram D among two-grade tumours was statistically significant. • D yielded better individual features in demonstrating tumour grade than ADC. • D* and f failed to determine tumour grade of PCa.
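
    For context, the IVIM model fits the DWI signal as S(b)/S0 = f·exp(−b·D*) + (1 − f)·exp(−b·D), and the histogram metrics are then computed from the fitted voxel-wise maps. A single-voxel least-squares sketch with scipy; the starting values, bounds and noise-free synthetic signal are illustrative assumptions, not the study's fitting pipeline.

      import numpy as np
      from scipy.optimize import curve_fit

      def ivim(b, f, d_star, d):
          """Biexponential IVIM decay, normalized to S(b=0) = 1."""
          return f * np.exp(-b * d_star) + (1.0 - f) * np.exp(-b * d)

      b_values = np.array([0, 50, 100, 200, 400, 600, 900], dtype=float)
      signal = ivim(b_values, 0.10, 5e-3, 0.8e-3)   # synthetic voxel

      p0 = [0.1, 1e-2, 1e-3]                        # f, D*, D initial guesses
      bounds = ([0, 1e-3, 1e-5], [0.5, 0.1, 3e-3])  # keeps D* above D
      (f, d_star, d), _ = curve_fit(ivim, b_values, signal, p0=p0, bounds=bounds)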

  3. Measuring the apparent diffusion coefficient in primary rectal tumors: is there a benefit in performing histogram analyses?

    Science.gov (United States)

    van Heeswijk, Miriam M; Lambregts, Doenja M J; Maas, Monique; Lahaye, Max J; Ayas, Z; Slenter, Jos M G M; Beets, Geerard L; Bakers, Frans C H; Beets-Tan, Regina G H

    2017-06-01

    The apparent diffusion coefficient (ADC) is a potential prognostic imaging marker in rectal cancer. Typically, mean ADC values are used, derived from precise manual whole-volume tumor delineations by experts. The aim was first to explore whether non-precise circular delineation combined with histogram analysis can be a less cumbersome alternative to acquire similar ADC measurements and second to explore whether histogram analyses provide additional prognostic information. Thirty-seven patients who underwent a primary staging MRI including diffusion-weighted imaging (DWI; b0, 25, 50, 100, 500, 1000; 1.5 T) were included. Volumes-of-interest (VOIs) were drawn on b1000-DWI: (a) precise delineation, manually tracing tumor boundaries (2 expert readers), and (b) non-precise delineation, drawing circular VOIs with a wide margin around the tumor (2 non-experts). Mean ADC and histogram metrics (mean, min, max, median, SD, skewness, kurtosis, 5th-95th percentiles) were derived from the VOIs and delineation time was recorded. Measurements were compared between the two methods and correlated with prognostic outcome parameters. Median delineation time reduced from 47-165 s (precise) to 21-43 s (non-precise). The 45th percentile of the non-precise delineation showed the best correlation with the mean ADC from the precise delineation as the reference standard (ICC 0.71-0.75). None of the mean ADC or histogram parameters showed significant prognostic value; only the total tumor volume (VOI) was significantly larger in patients with positive clinical N stage and mesorectal fascia involvement. When performing non-precise tumor delineation, histogram analysis (in specific 45th ADC percentile) may be used as an alternative to obtain similar ADC values as with precise whole tumor delineation. Histogram analyses are not beneficial to obtain additional prognostic information.

  4. Support vector machine for breast cancer classification using diffusion-weighted MRI histogram features: Preliminary study.

    Science.gov (United States)

    Vidić, Igor; Egnell, Liv; Jerome, Neil P; Teruel, Jose R; Sjøbakk, Torill E; Østlie, Agnes; Fjøsne, Hans E; Bathen, Tone F; Goa, Pål Erik

    2018-05-01

    Diffusion-weighted MRI (DWI) is currently one of the fastest developing MRI-based techniques in oncology. Histogram properties from model fitting of DWI are useful features for differentiation of lesions, and classification can potentially be improved by machine learning. To evaluate classification of malignant and benign tumors and breast cancer subtypes using support vector machine (SVM). Prospective. Fifty-one patients with benign (n = 23) and malignant (n = 28) breast tumors (26 ER+, of which six were HER2+). Patients were imaged with DW-MRI (3T) using twice refocused spin-echo echo-planar imaging with repetition time/echo time (TR/TE) = 9000/86 msec, 90 × 90 matrix size, 2 × 2 mm in-plane resolution, 2.5 mm slice thickness, and 13 b-values. Apparent diffusion coefficient (ADC), relative enhanced diffusivity (RED), and the intravoxel incoherent motion (IVIM) parameters diffusivity (D), pseudo-diffusivity (D*), and perfusion fraction (f) were calculated. The histogram properties (median, mean, standard deviation, skewness, kurtosis) were used as features in SVM (10-fold cross-validation) for differentiation of lesions and subtyping. Accuracies of the SVM classifications were calculated to find the combination of features with highest prediction accuracy. Mann-Whitney tests were performed for univariate comparisons. For benign versus malignant tumors, univariate analysis found 11 histogram properties to be significant differentiators. Using SVM, the highest accuracy (0.96) was achieved from a single feature (mean of RED), or from three feature combinations of IVIM or ADC. Combining features from all models gave perfect classification. No single feature predicted HER2 status of ER+ tumors (univariate or SVM), although high accuracy (0.90) was achieved with SVM combining several features. Importantly, these features had to include higher-order statistics (kurtosis and skewness), indicating the importance of accounting for heterogeneity. Our
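
    The classification step can be sketched with scikit-learn: the five histogram properties per parameter map form the feature vector, and accuracy is estimated with 10-fold cross-validation as in the study. The synthetic lesion data below are placeholders, not the patient data.

      import numpy as np
      from scipy import stats
      from sklearn.svm import SVC
      from sklearn.model_selection import cross_val_score

      def lesion_features(values):
          """Median, mean, SD, skewness, kurtosis of one parameter map ROI."""
          return [np.median(values), values.mean(), values.std(),
                  stats.skew(values), stats.kurtosis(values)]

      rng = np.random.default_rng(1)
      # Synthetic stand-ins: 23 benign and 28 malignant lesions
      X = np.array([lesion_features(rng.normal(1.6, 0.3, 500)) for _ in range(23)]
                   + [lesion_features(rng.normal(1.1, 0.4, 500)) for _ in range(28)])
      y = np.array([0] * 23 + [1] * 28)

      acc = cross_val_score(SVC(kernel="linear"), X, y, cv=10).mean()
      print(f"10-fold CV accuracy: {acc:.2f}")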

  5. PROCESS PERFORMANCE EVALUATION USING HISTOGRAM AND TAGUCHI TECHNIQUE IN LOCK MANUFACTURING COMPANY

    Directory of Open Access Journals (Sweden)

    Hagos Berhane

    2013-12-01

    Process capability analysis is a vital part of an overall quality improvement program. It is a technique that has application in many segments of the product cycle, including product and process design, vendor sourcing, production or manufacturing planning, and manufacturing. Frequently, a process capability study involves observing a quality characteristic of the product. Since this information usually pertains to the product rather than the process, this analysis should strictly speaking be called a product analysis study. A true process capability study in this context would involve collecting data that relate to process parameters so that remedial actions can be identified on a timely basis. The present study attempts to analyze the performance of drilling, pressing, and reaming operations carried out for the manufacturing of two major lock components, viz. the handle and lever plate, at Gaurav International, Aligarh (India). The data collected for depth of hole on handle, central hole diameter, and key hole diameter are used to construct histograms. Next, the information available in the frequency distribution table, the process mean, the process capability from calculations and the specification limits provided by the manufacturing concern are used with the Taguchi technique. The data obtained from the histogram and Taguchi technique combined are used to evaluate the performance of the manufacturing process. Results of this study indicated that all the processes used to produce depth of hole on handle, key hole diameter, and central hole diameter are potentially incapable, as the process capability indices were found to be 0.54, 0.54 and 0.76 respectively. The numbers of nonconforming parts expressed in parts per million (ppm) that fell outside the specification limits were found to be 140000, 26666.66, and 146666.66 for depth of hole on handle, central hole diameter, and key hole diameter respectively. As a result, the total loss incurred
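
    For reference, the capability indices reported above follow the standard definitions Cp = (USL − LSL)/(6σ) and Cpk = min(USL − μ, μ − LSL)/(3σ), with the nonconforming fraction read off the normal tails. A minimal sketch; the specification limits and sample statistics below are illustrative, not the plant's data (they are chosen merely to land near the reported Cp of 0.54).

      from statistics import NormalDist

      def capability(mean, sigma, lsl, usl):
          cp = (usl - lsl) / (6 * sigma)
          cpk = min(usl - mean, mean - lsl) / (3 * sigma)
          nd = NormalDist(mean, sigma)
          ppm = (nd.cdf(lsl) + 1 - nd.cdf(usl)) * 1e6  # parts per million out of spec
          return cp, cpk, ppm

      cp, cpk, ppm = capability(mean=10.02, sigma=0.031, lsl=9.95, usl=10.05)
      print(f"Cp={cp:.2f} Cpk={cpk:.2f} nonconforming={ppm:.0f} ppm")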

  6. A 222 energy-bin response matrix for a ⁶LiI scintillator Bss system

    Energy Technology Data Exchange (ETDEWEB)

    Lacerda, M. A. S. [Centro de Desenvolvimento da Tecnologia Nuclear, Laboratorio de Calibracao de Dosimetros, Av. Pte. Antonio Carlos 6627, 31270-901 Pampulha, Belo Horizonte, Minas Gerais (Brazil); Vega C, H. R. [Universidad Autonoma de Zacatecas, Unidad Academica de Estudios Nucleares, Cipres No. 10, Fracc. La Penuela, 98068 Zacatecas, Zac. (Mexico); Mendez V, R. [Centro de Investigaciones Energeticas, Medioambientales y Tecnologicas, Laboratorio de Patrones Neutronicos, Av. Complutense 22, 28040 Madrid (Spain); Lorente F, A.; Ibanez F, S.; Gallego D, E., E-mail: masl@cdtn.br [Universidad Politecnica de Madrid, Departamento de Ingenieria Nuclear, 28006 Madrid (Spain)

    2016-10-15

    A new response matrix was calculated for a Bonner Sphere Spectrometer (Bss) with a ⁶LiI(Eu) scintillator. We utilized the Monte Carlo N-particle radiation transport code MCNPX, version 2.7.0, with the ENDF/B-VII.0 nuclear data library to calculate the responses for 6 spheres and the bare detector, for energies varying from 9.441×10⁻¹⁰ MeV to 105.9 MeV, with 20 equal-log(E)-width bins per energy decade, totalling 222 energy groups. A Bss like the one modeled in this work was utilized to measure the neutron spectrum generated by the ²⁴¹AmBe source of the Universidad Politecnica de Madrid. From the count rates obtained with this Bss system we unfolded the neutron spectrum utilizing the BUNKIUT code for 31 energy bins (UTA-4 response matrix) and the MAXED code with the newly calculated response functions. We compared spectra obtained with these Bss system/unfolding codes with those obtained from measurements performed with a Bss system consisting of 12 spheres with a spherical ³He SP-9 counter (Centronic Ltd., UK) and the MAXED code with the system-specific response functions (Bss-CIEMAT). A relatively good agreement was observed between our response matrix and those calculated by other authors. In general, we observed an improvement in the agreement as the energy increases. However, higher discrepancies were observed for energies close to 1×10⁻⁸ MeV and, mainly, for energies above 20 MeV. These discrepancies were mainly attributed to the differences in the cross-section libraries employed. The ambient dose equivalent (H*(10)) calculated with the ⁶LiI-MAXED showed a good agreement with values measured with the Berthold LB 6411 neutron area monitor and within 12% of the value obtained with another Bss system (Bss-CIEMAT). The response matrix calculated in this work can be utilized together with the MAXED code to generate neutron spectra with a good energy resolution up to 20 MeV. Some additional tests are being done to validate this response matrix and improve the
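
    The 222-group structure quoted above (20 equal-log(E)-width bins per decade between 9.441×10⁻¹⁰ MeV and 105.9 MeV) can be reproduced with a log-spaced grid. A minimal sketch using only the range and group count from the abstract:

      import numpy as np

      e_min, e_max = 9.441e-10, 105.9   # MeV, range quoted in the abstract
      n_groups = 222                    # group count quoted in the abstract
      edges = np.logspace(np.log10(e_min), np.log10(e_max), n_groups + 1)

      decades = np.log10(e_max) - np.log10(e_min)
      print(n_groups / decades)  # ~20.1 bins per energy decade, as described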

  7. A 222 energy-bin response matrix for a ⁶LiI scintillator Bss system

    International Nuclear Information System (INIS)

    Lacerda, M. A. S.; Vega C, H. R.; Mendez V, R.; Lorente F, A.; Ibanez F, S.; Gallego D, E.

    2016-10-01

    A new response matrix was calculated for a Bonner Sphere Spectrometer (Bss) with a ⁶LiI(Eu) scintillator. We utilized the Monte Carlo N-particle radiation transport code MCNPX, version 2.7.0, with the ENDF/B-VII.0 nuclear data library to calculate the responses for 6 spheres and the bare detector, for energies varying from 9.441×10⁻¹⁰ MeV to 105.9 MeV, with 20 equal-log(E)-width bins per energy decade, totalling 222 energy groups. A Bss like the one modeled in this work was utilized to measure the neutron spectrum generated by the ²⁴¹AmBe source of the Universidad Politecnica de Madrid. From the count rates obtained with this Bss system we unfolded the neutron spectrum utilizing the BUNKIUT code for 31 energy bins (UTA-4 response matrix) and the MAXED code with the newly calculated response functions. We compared spectra obtained with these Bss system/unfolding codes with those obtained from measurements performed with a Bss system consisting of 12 spheres with a spherical ³He SP-9 counter (Centronic Ltd., UK) and the MAXED code with the system-specific response functions (Bss-CIEMAT). A relatively good agreement was observed between our response matrix and those calculated by other authors. In general, we observed an improvement in the agreement as the energy increases. However, higher discrepancies were observed for energies close to 1×10⁻⁸ MeV and, mainly, for energies above 20 MeV. These discrepancies were mainly attributed to the differences in the cross-section libraries employed. The ambient dose equivalent (H*(10)) calculated with the ⁶LiI-MAXED showed a good agreement with values measured with the Berthold LB 6411 neutron area monitor and within 12% of the value obtained with another Bss system (Bss-CIEMAT). The response matrix calculated in this work can be utilized together with the MAXED code to generate neutron spectra with a good energy resolution up to 20 MeV. Some additional tests are being done to validate this response matrix and improve the results for energies

  8. Identifying Memory Allocation Patterns in HEP Software

    Science.gov (United States)

    Kama, S.; Rauschmayr, N.

    2017-10-01

    HEP applications perform an excessive amount of allocations/deallocations within short time intervals, which results in memory churn, poor locality and performance degradation. These issues have been known for a decade, but due to the complexity of software frameworks and billions of allocations for a single job, until recently no efficient mechanism was available to correlate these issues with source code lines. However, with the advent of the Big Data era, many tools and platforms are now available to do large scale memory profiling. This paper presents a prototype program developed to track and identify each single (de-)allocation. The CERN IT Hadoop cluster is used to compute key memory metrics, like locality, variation, lifetime and density of allocations. The prototype further provides a web based visualization back-end that allows the user to explore the results generated on the Hadoop cluster. Plotting these metrics for every single allocation over time gives a new insight into an application's memory handling. For instance, it shows which algorithms cause which kind of memory allocation patterns, which function flow causes how many short-lived objects, what the most commonly allocated sizes are, etc. The paper gives an insight into the prototype and shows profiling examples for the LHC reconstruction, digitization and simulation jobs.
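
    The prototype itself instruments C++ HEP frameworks and is not reproduced here; as an illustration of the underlying idea (attributing allocation counts and volume to source lines), Python's built-in tracemalloc offers a per-line view:

      import tracemalloc

      def churny():
          # Deliberately allocation-heavy: many short-lived objects
          return [str(i) * 8 for i in range(100_000)]

      tracemalloc.start()
      data = churny()
      snapshot = tracemalloc.take_snapshot()

      # Top source lines by allocated size -- a per-line view of memory use
      for stat in snapshot.statistics("lineno")[:3]:
          print(stat)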

  9. The Effectiveness of the Curriculum Biography of the Prophet in the Development of Social Intelligence Skills of Al-Hussein Bin Talal University Students

    Science.gov (United States)

    Al-Khateeb, Omar; Alrub, Mohammad Abo

    2015-01-01

    This study aimed to determine the effectiveness of the curriculum on the biography of the Prophet in developing the social intelligence skills of Al-Hussein Bin Talal University students. The study sample consisted of 365 students from Al-Hussein Bin Talal University for the first semester of 2014-2015; students were selected by convenience sampling.…

  10. Worst-case analysis of heap allocations

    DEFF Research Database (Denmark)

    Puffitsch, Wolfgang; Huber, Benedikt; Schoeberl, Martin

    2010-01-01

    the worst-case heap allocations of tasks. The analysis builds upon techniques that are well established for worst-case execution time analysis. The difference is that the cost function is not the execution time of instructions in clock cycles, but the allocation in bytes. In contrast to worst-case execution time analysis, worst-case heap allocation analysis is not processor dependent. However, the cost function depends on the object layout of the runtime system. The analysis is evaluated with several real-time benchmarks to establish the usefulness of the analysis, and to compare the memory consumption

  11. Dynamic Allocation of Sugars in Barley

    Science.gov (United States)

    Cumberbatch, L. C.; Crowell, A. S.; Fallin, B. A.; Howell, C. R.; Reid, C. D.; Weisenberger, A. G.; Lee, S. J.; McKisson, J. E.

    2014-03-01

    Allocation of carbon and nitrogen is a key factor for plant productivity. Measurements are carried out by tracing ¹¹C-tagged sugars using positron emission tomography and coincidence counting. We study the mechanisms of carbon allocation and transport from carbohydrate sources (leaves) to sinks (stem, shoot, roots) under various environmental conditions such as soil nutrient levels and atmospheric CO₂ concentration. The data are analyzed using a transfer function analysis technique to model transport and allocation in barley plants. The experimental technique will be described and preliminary results presented. This work was supported in part by USDOE Grant No. DE-FG02-97-ER41033 and DE-SC0005057.

  12. Modular routing interface for simultaneous list mode and histogramming mode storage of coincident data

    International Nuclear Information System (INIS)

    D'Achard van Eschut, J.F.M.; Nationaal Inst. voor Kernfysica en Hoge-Energiefysica

    1985-01-01

    A routing interface has been developed and built for successive storage of the digital output of four 13-bit ADCs, within 6 μs, into selected parts of two 16K CAMAC histogramming modules and, if an event trigger is applied, simultaneously into four 64-word-deep (16-bit) first-in first-out (FIFO) CAMAC modules. In this way it is possible to accumulate on-line single spectra and, at the same time, write coincident data in list mode to magnetic tape under the control of a computer. Additional routing interfaces can be used in parallel so that extensive data-collecting systems can be set up to store multi-parameter events. (orig.)

  13. Wavelength-Adaptive Dehazing Using Histogram Merging-Based Classification for UAV Images

    Directory of Open Access Journals (Sweden)

    Inhye Yoon

    2015-03-01

    Since incoming light to an unmanned aerial vehicle (UAV) platform can be scattered by haze and dust in the atmosphere, the acquired image loses the original color and brightness of the subject. Enhancement of hazy images is an important task in improving the visibility of various UAV images. This paper presents a spatially-adaptive dehazing algorithm that merges color histograms with consideration of the wavelength-dependent atmospheric turbidity. Based on the wavelength-adaptive hazy image acquisition model, the proposed dehazing algorithm consists of three steps: (i) image segmentation based on geometric classes; (ii) generation of the context-adaptive transmission map; and (iii) intensity transformation for enhancing a hazy UAV image. The major contribution of the research is a novel hazy UAV image degradation model by considering the wavelength of light sources. In addition, the proposed transmission map provides a theoretical basis to differentiate visually important regions from others based on the turbidity and merged classification results.
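
    Dehazing methods in this family invert the atmospheric scattering model I(x) = J(x)·t(x) + A·(1 − t(x)): once a transmission map t and airlight A are estimated, the scene radiance J is recovered per pixel. A sketch of the recovery step only; the wavelength-adaptive estimation of t and A is the paper's contribution and is not reproduced here.

      import numpy as np

      def recover_radiance(hazy, transmission, airlight, t_min=0.1):
          """Invert I = J*t + A*(1-t); clamp t to avoid amplifying noise."""
          t = np.clip(transmission, t_min, 1.0)[..., np.newaxis]
          return np.clip((hazy - airlight) / t + airlight, 0.0, 1.0)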

  14. REAL-TIME FACE RECOGNITION BASED ON OPTICAL FLOW AND HISTOGRAM EQUALIZATION

    Directory of Open Access Journals (Sweden)

    D. Sathish Kumar

    2013-05-01

    Face recognition is one of the intensive areas of research in computer vision and pattern recognition, much of it focused on recognition of faces under varying facial expressions and pose variation. A constrained optical flow algorithm discussed in this paper recognizes facial images involving various expressions based on motion vector computation. In this paper, an optical flow computation algorithm which computes the frames of varying facial gestures and integrates them with a synthesized image in a probabilistic environment is proposed. A histogram equalization technique is also used to overcome the effect of illumination while capturing the input data using camera devices; it also enhances the contrast of the image for better processing. The experimental results confirm that the proposed face recognition system is more robust and recognizes the facial images under varying expressions and pose variations more accurately.
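
    Histogram equalization, used here to normalize illumination before recognition, remaps intensities through the normalized cumulative histogram. A minimal numpy sketch for an 8-bit grayscale image (library routines such as OpenCV's equalizeHist do the same thing):

      import numpy as np

      def equalize_histogram(gray):
          """Map each gray level through the normalized CDF (8-bit input)."""
          counts = np.bincount(gray.ravel(), minlength=256)
          cdf = counts.cumsum().astype(float)
          cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())
          lut = np.round(255 * cdf).astype(np.uint8)
          return lut[gray]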

  15. Underwater Image Enhancement by Adaptive Gray World and Differential Gray-Levels Histogram Equalization

    Directory of Open Access Journals (Sweden)

    WONG, S.-L.

    2018-05-01

    Most underwater images tend to be dominated by a single color cast. This paper presents a solution to remove the color cast and improve the contrast in underwater images. However, after the removal of the color cast using the Gray World (GW) method, the resultant image is not visually pleasing. Hence, we propose an integrated approach using Adaptive GW (AGW) and Differential Gray-Levels Histogram Equalization (DHE) that operate in parallel. The AGW is applied to remove the color cast while DHE is used to improve the contrast of the underwater image. The outputs of both the chromaticity components of AGW and the intensity components of DHE are combined to form the enhanced image. The results of the proposed method are compared with three existing methods using qualitative and quantitative measures. The proposed method increased the visibility of underwater images and in most cases produces better quantitative scores when compared to the three existing methods.
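
    The classical Gray World step assumes the average scene colour is achromatic and rescales each channel toward the global mean. A minimal sketch of that baseline (the paper's adaptive variant, AGW, modifies this step and is not reproduced); the input is assumed to be an RGB float image in [0, 1]:

      import numpy as np

      def gray_world(img):
          """Scale each channel so its mean matches the overall gray mean."""
          channel_means = img.reshape(-1, 3).mean(axis=0)
          gray_mean = channel_means.mean()
          return np.clip(img * (gray_mean / channel_means), 0.0, 1.0)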

  16. Differentiation of adrenal adenomas from nonadenomas using CT histogram analysis method: A prospective study

    International Nuclear Information System (INIS)

    Halefoglu, Ahmet Mesrur; Bas, Nagihan; Yasar, Ahmet; Basak, Muzaffer

    2010-01-01

    Objective: The objective of our study was to prospectively evaluate the effectiveness of the computed tomography (CT) histogram analysis method in the differentiation of benign and malignant adrenal masses. Materials and Methods: Between March 2007 and June 2008, 94 patients (46 males, 48 females, age range: 30-79 years, mean age: 57.7 years) with 113 adrenal masses (mean diameter: 3.03 cm, range: 1.07-8.02 cm) were prospectively evaluated. These included 66 adenomas, 45 metastases and 2 pheochromocytomas. The histogram analysis method was performed using a circular region of interest (ROI), and mean attenuation, total number of pixels, number of negative pixels and subsequent percentage of negative pixels were detected on both unenhanced and delayed contrast-enhanced CT images for each adrenal mass. A mean attenuation threshold of 10 Hounsfield units (HU) for unenhanced CT and 5% and 10% negative pixel thresholds for both unenhanced and delayed contrast-enhanced CT were calculated by a consensus of at least two reviewers, and the correlation between mean attenuation and percentage of negative pixels was determined. Final diagnoses were based on imaging follow-up of minimum 6 months, biopsy, surgery and adrenal washout study. Results: 51 of 66 adenomas (77.3%) showed attenuation values of ≤10 HU and 15 (22.7%) adenomas showed more than 10 HU on unenhanced CT. All of these adenomas contained negative pixels on unenhanced CT. Eight of 66 (12.1%) adenomas showed a mean attenuation value of ≤10 HU on delayed contrast-enhanced scans and 45 adenomas (68.2%) persisted in containing negative pixels. All metastases had an attenuation value of greater than 10 HU on unenhanced CT images. 21 of 45 (46.6%) metastases contained negative pixels on unenhanced images but only seven metastases (15.5%) had negative pixels on delayed contrast-enhanced images. Two pheochromocytomas had negative pixels on both unenhanced and delayed contrast-enhanced CT images. Increase in the percentage of
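
    The key histogram metric here, the percentage of negative-attenuation pixels within the ROI, is simple to state in code. A minimal sketch, assuming a Hounsfield-unit numpy array and a boolean ROI mask:

      import numpy as np

      def negative_pixel_percentage(hu_image, roi_mask):
          """Percentage of ROI pixels below 0 HU (fat-containing voxels)."""
          roi = hu_image[roi_mask]
          return 100.0 * (roi < 0).sum() / roi.size

      # e.g. compare against the 5% or 10% thresholds evaluated in the study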

  17. Histogram analysis for age change of human lung with computed tomography

    International Nuclear Information System (INIS)

    Shirabe, Ichiju

    1990-01-01

    In order to evaluate physiological changes of the normal lung with aging by computed tomography (CT), the peak position (PP) and full width at half maximum (FWHM) of the CT histogram were studied in 77 normal human lungs. Above 30 years of age, PP tended to shift to lower attenuation values with advancing age, and the following equation was obtained: CT attenuation value of PP = -0.87 × age - 815. The peak position shifted to the range of higher CT attenuation in the 30s. FWHM did not change with advancing age. There were no differences in peak value or FWHM among the upper, middle and lower lung fields. In this study, physiological changes of the lung were evaluated quantitatively. Furthermore, this study is considered to be useful for the diagnosis and treatment of lung diseases. (author)

  18. Wavelength-adaptive dehazing using histogram merging-based classification for UAV images.

    Science.gov (United States)

    Yoon, Inhye; Jeong, Seokhwa; Jeong, Jaeheon; Seo, Doochun; Paik, Joonki

    2015-03-19

    Since incoming light to an unmanned aerial vehicle (UAV) platform can be scattered by haze and dust in the atmosphere, the acquired image loses the original color and brightness of the subject. Enhancement of hazy images is an important task in improving the visibility of various UAV images. This paper presents a spatially-adaptive dehazing algorithm that merges color histograms with consideration of the wavelength-dependent atmospheric turbidity. Based on the wavelength-adaptive hazy image acquisition model, the proposed dehazing algorithm consists of three steps: (i) image segmentation based on geometric classes; (ii) generation of the context-adaptive transmission map; and (iii) intensity transformation for enhancing a hazy UAV image. The major contribution of the research is a novel hazy UAV image degradation model by considering the wavelength of light sources. In addition, the proposed transmission map provides a theoretical basis to differentiate visually important regions from others based on the turbidity and merged classification results.

  19. A Novel Histogram Region Merging Based Multithreshold Segmentation Algorithm for MR Brain Images

    Directory of Open Access Journals (Sweden)

    Siyan Liu

    2017-01-01

    Multithreshold segmentation algorithms are time-consuming, and the time complexity increases exponentially with the number of thresholds. In order to reduce the time complexity, a novel multithreshold segmentation algorithm is proposed in this paper. First, all gray levels are used as thresholds, so the histogram of the original image is divided into 256 small regions, and each region corresponds to one gray level. Then, two adjacent regions are merged in each iteration by a newly designed scheme, and a threshold is removed each time. To improve the accuracy of the merger operation, variance and probability are used as the energy. No matter how many thresholds there are, the time complexity of the algorithm is stable at O(L). Finally, experiments are conducted on many MR brain images to verify the performance of the proposed algorithm. Experimental results show that our method can reduce the running time effectively and obtain segmentation results with high accuracy.
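
    The merging idea can be sketched directly on the histogram: start with one region per gray level and repeatedly merge the adjacent pair whose merge adds the least probability-weighted variance. This straightforward version recomputes costs each pass and is O(L²); the paper's scheme achieves O(L). The energy below (probability-weighted variance) is a plausible reading of "variance and probability", not necessarily the paper's exact formula.

      import numpy as np

      def merge_multithreshold(hist, n_thresholds):
          """Greedy merging of adjacent histogram regions; returns thresholds."""
          p = np.asarray(hist, dtype=float) / np.sum(hist)
          levels = np.arange(len(p), dtype=float)
          # Region state: probability, mean, second moment, rightmost level
          regions = [[p[i], levels[i], levels[i] ** 2, i] for i in range(len(p))]

          def merged(a, b):
              w = a[0] + b[0]
              if w == 0.0:
                  return [0.0, 0.0, 0.0, b[3]]
              mean = (a[0] * a[1] + b[0] * b[1]) / w
              m2 = (a[0] * a[2] + b[0] * b[2]) / w
              return [w, mean, m2, b[3]]

          def cost(a, b):
              w, mean, m2, _ = merged(a, b)
              return w * (m2 - mean ** 2)  # probability-weighted variance

          while len(regions) > n_thresholds + 1:
              i = min(range(len(regions) - 1),
                      key=lambda k: cost(regions[k], regions[k + 1]))
              regions[i:i + 2] = [merged(regions[i], regions[i + 1])]

          return [r[3] for r in regions[:-1]]  # right edges of merged regions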

  20. Detection of License Plate using Sliding Window, Histogram of Oriented Gradient, and Support Vector Machines Method

    Science.gov (United States)

    Astawa, INGA; Gusti Ngurah Bagus Caturbawa, I.; Made Sajayasa, I.; Dwi Suta Atmaja, I. Made Ari

    2018-01-01

    License plate recognition is usually used as part of a larger system, such as a parking system. License plate detection is considered the most important step in a license plate recognition system. We propose methods that can be used to detect the vehicle plate on a mobile phone. In this paper, we used the Sliding Window, Histogram of Oriented Gradients (HOG), and Support Vector Machine (SVM) methods for license plate detection, which raises the detection rate even when the image is of poor quality. The image is processed by the sliding window method in order to find the plate position. At each window position, HOG features are extracted and classified with the SVM. Good results were obtained in this research, with an accuracy of 96%.
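
    A sketch of the detection loop: slide a fixed-size window over the image, compute HOG features for each patch, and keep the windows the SVM scores as plates. The window size, stride and HOG parameters are illustrative assumptions, and clf is assumed to be a classifier (e.g. a scikit-learn LinearSVC) already trained on plate/non-plate patches.

      from skimage.feature import hog

      def sliding_windows(image, win=(64, 128), step=16):
          """Yield (y, x, patch) for each window over a grayscale image."""
          h, w = win
          for y in range(0, image.shape[0] - h + 1, step):
              for x in range(0, image.shape[1] - w + 1, step):
                  yield y, x, image[y:y + h, x:x + w]

      def detect_plates(image, clf, win=(64, 128), step=16):
          """Score every window with HOG features and a trained SVM."""
          hits = []
          for y, x, patch in sliding_windows(image, win, step):
              features = hog(patch, orientations=9, pixels_per_cell=(8, 8),
                             cells_per_block=(2, 2))
              if clf.decision_function([features])[0] > 0:
                  hits.append((y, x))
          return hits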