WorldWideScience

Sample records for bins histograms allocates

  1. Bin recycling strategy for improving the histogram precision on GPU

    Science.gov (United States)

    Cárdenas-Montes, Miguel; Rodríguez-Vázquez, Juan José; Vega-Rodríguez, Miguel A.

    2016-07-01

    Histograms are an easily comprehensible way to present data and analyses. In the current scientific context, with access to large volumes of data, the processing time for building histograms has increased dramatically. For this reason, parallel construction is necessary to alleviate the impact of processing time on analysis activities. In this scenario, GPU computing is becoming widely used to reduce the processing time of histogram construction to affordable levels. Alongside this reduction of processing time, the implementations are stressed with respect to bin-count accuracy. Accuracy issues arising from the particularities of the implementations are not usually taken into consideration when building histograms from very large data sets. In this work, a bin recycling strategy to create an accuracy-aware implementation for building histograms on GPU is presented. In order to evaluate the approach, this strategy was applied to the computation of the three-point angular correlation function, a relevant function in Cosmology for the study of the Large Scale Structure of the Universe. As a consequence of the study, a high-accuracy implementation for histogram construction on GPU is proposed.
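
    A minimal sketch of the general idea behind parallel histogramming (not the authors' bin recycling strategy itself): each thread or block fills a private sub-histogram, and the partial counts are merged in a final reduction, avoiding contention on shared bins. NumPy stands in here for the per-thread work a GPU kernel would do; names and chunking are illustrative.

```python
import numpy as np

def parallel_histogram(data, n_bins, n_threads=8, lo=0.0, hi=1.0):
    """Split data into chunks, histogram each privately, then reduce."""
    edges = np.linspace(lo, hi, n_bins + 1)
    chunks = np.array_split(data, n_threads)           # one chunk per "thread"
    partials = [np.histogram(c, bins=edges)[0] for c in chunks]
    return np.sum(partials, axis=0), edges             # final reduction step

counts, edges = parallel_histogram(np.random.rand(1_000_000), n_bins=64)
assert counts.sum() == 1_000_000                       # no counts lost
```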

  2. Allocation of solid waste collection bins and route optimisation using geographical information system: A case study of Dhanbad City, India.

    Science.gov (United States)

    Khan, D; Samadder, S R

    2016-07-01

    Collection of municipal solid waste is one of the most important elements of municipal waste management and consumes the largest share of the funds allocated for waste management. The cost of collection and transportation can be reduced relative to the present scenario if the solid waste collection bins are located at suitable places so that the collection routes become minimal. This study presents a method for allocating solid waste collection bins at appropriate, uniformly spaced and easily accessible locations so that the collection vehicle routes become minimal, for the city of Dhanbad, India. The network analyst tool set available in ArcGIS was used to find the optimised route for solid waste collection, considering all the parameters required for efficient solid waste collection. These parameters include the positions of solid waste collection bins, the road network, the population density, waste collection schedules, truck capacities and their characteristics. The present study also demonstrates the significant cost reductions that can be obtained compared with the current practices in the study area. The vehicle routing problem solver tool of ArcGIS was used to identify the cost-effective scenario for waste collection, to estimate its running costs and to simulate its application considering both travel time and travel distance simultaneously. © The Author(s) 2016.

  3. Models and Algorithms for the Integrated Planning of Bin Allocation and Vehicle Routing in Solid Waste Management

    NARCIS (Netherlands)

    Hemmelmayr, V.C.; Doerner, K.F.; Hartl, R.F.; Vigo, D.

    2014-01-01

    The efficient organization of waste collection systems based on bins located along the streets involves the solution of several tactical optimization problems. In particular, the bin configuration and sizing at each collection site as well as the service frequency over a given planning horizon have

  4. Improved Figure of Merit for Feynman Histograms

    Energy Technology Data Exchange (ETDEWEB)

    Arthur, Jennifer Ann [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-10-06

    Typically, a FOM is used that weighs the differences between each bin of the histograms against the magnitude of the combination of the corresponding uncertainties. Because the uncertainties corresponding to the larger multiplet bins are inherently larger than those of the smaller multiplet bins, this type of FOM puts more weight on differences between smaller multiplet bins. The newly proposed FOM also takes into account the sensitivity of leakage multiplication (which is most sensitive to the higher multiplet bins) to each bin in the histogram.
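
    A hedged illustration of this kind of FOM: a chi-square-like sum of squared bin differences over combined uncertainties, optionally weighted by a per-bin sensitivity. The exact weighting scheme below is an assumption for illustration, not the report's definition.

```python
import numpy as np

def fom(sim, meas, sig_sim, sig_meas, sensitivity=None):
    """Per-bin squared difference over combined variance, optionally
    weighted by a sensitivity term (e.g. d(leakage mult.)/d(bin))."""
    diff2 = (sim - meas) ** 2
    var = sig_sim ** 2 + sig_meas ** 2
    w = np.ones_like(sim) if sensitivity is None else sensitivity
    return np.sum(w * diff2 / var) / len(sim)
```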

  5. Bi-Histogram Equalization with Brightness Preservation Using Contrast Enhancement

    OpenAIRE

    A. Anitha Rani; Gowthami Rajagopal; A. Jagadeswaran

    2014-01-01

    Contrast enhancement is an important factor in the image preprocessing step. One of the widely accepted contrast enhancement methods is histogram equalization. Although histogram equalization achieves comparatively better performance on almost all types of image, global histogram equalization sometimes produces excessive visual deterioration. A new extension of bi-histogram equalization called Bi-Histogram Equalization with Neighborhood Metric (BHENM) is presented. First, large histogram bins that cause w...

  6. Complexity of possibly gapped histogram and analysis of histogram

    Science.gov (United States)

    Fushing, Hsieh; Roy, Tania

    2018-02-01

    We demonstrate that gaps and distributional patterns embedded within real-valued measurements are inseparable biological and mechanistic information contents of the system. Such patterns are discovered through a data-driven, possibly gapped histogram, which further leads to the geometry-based analysis of histogram (ANOHT). Constructing a possibly gapped histogram is a complex problem of statistical mechanics, since the ensemble of candidate histograms is captured by a two-layer Ising model. It is also a distinctive problem of information theory from the perspective of data compression via uniformity. By defining a Hamiltonian (or energy) as the sum of the total coding lengths of boundaries and the total decoding errors within bins, this problem of computing the minimum-energy macroscopic states is surprisingly resolved by applying the hierarchical clustering algorithm. Thus, a possibly gapped histogram corresponds to a macro-state. The first phase of ANOHT is then developed for simultaneous comparison of multiple treatments, while the second phase is developed, based on classical empirical process theory, for a tree geometry that can check the authenticity of branches of the treatment tree. The well-known Iris data are used to illustrate our technical developments. Also, a large baseball pitching dataset and a heavily right-censored divorce dataset are analysed to showcase the existential gaps and the utilities of ANOHT.
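
    A minimal sketch of the "possibly gapped" idea, not the paper's Hamiltonian-minimizing construction: single-linkage hierarchical clustering of 1-D measurements with a distance cut splits the data wherever a gap wider than a chosen threshold occurs, and each resulting cluster gets its own histogram block.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def gapped_histogram(x, gap, bins_per_block=10):
    """Split 1-D data at gaps wider than `gap`; histogram each block."""
    xs = np.sort(x)
    Z = linkage(xs.reshape(-1, 1), method='single')
    labels = fcluster(Z, t=gap, criterion='distance')
    blocks = []
    for lab in np.unique(labels):
        seg = xs[labels == lab]
        blocks.append(np.histogram(seg, bins=bins_per_block))
    return blocks  # list of (counts, edges), one per gap-separated block
```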

  7. Subtracting and Fitting Histograms using Profile Likelihood

    CERN Document Server

    D'Almeida, F M L

    2008-01-01

    It is known that many interesting signals expected at the LHC are of unknown shape and strongly contaminated by background events. These signals will be difficult to detect during the first years of LHC operation due to the initial low luminosity. In this work, a method of subtracting histograms based on the profile likelihood function is presented, for the case where the background has previously been estimated from Monte Carlo events and statistics are low. Estimators for the signal in each bin of the histogram difference are calculated, as well as limits for the signals at 68.3% Confidence Level, in a low-statistics case with an exponential background and a Gaussian signal. The method can also be used to fit histograms when the signal shape is known. Our results show good performance and avoid the problem of negative values when subtracting histograms.
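
    A sketch of a per-bin profile likelihood, with an assumed Gaussian constraint on the Monte Carlo background estimate: the background nuisance parameter b is profiled out numerically, and a 68.3% interval for the signal s is read off from the half-unit rise of the profiled negative log-likelihood. Names and the scan grid are illustrative, not the paper's code.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import poisson

def nll(s, b, n_obs, b_mc, sigma_b):
    mu = max(s + b, 1e-9)                       # expected count in the bin
    return -poisson.logpmf(n_obs, mu) + 0.5 * ((b - b_mc) / sigma_b) ** 2

def profile_nll(s, n_obs, b_mc, sigma_b):
    res = minimize_scalar(lambda b: nll(s, b, n_obs, b_mc, sigma_b),
                          bounds=(0.0, b_mc + 10 * sigma_b), method='bounded')
    return res.fun                              # b profiled out

n_obs, b_mc, sigma_b = 12, 5.0, 1.5
s_grid = np.linspace(0, 30, 301)
curve = np.array([profile_nll(s, n_obs, b_mc, sigma_b) for s in s_grid])
inside = s_grid[curve - curve.min() <= 0.5]     # Delta(NLL) <= 1/2
print(f"s_hat = {s_grid[curve.argmin()]:.1f}, "
      f"68.3% interval = [{inside.min():.1f}, {inside.max():.1f}]")
```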

  8. The grumpy bin

    DEFF Research Database (Denmark)

    Altarriba, Ferran; Funk, Mathias; Lanzani, Stefano Eugenio

    2017-01-01

    Domestic food waste is a world-wide problem that is complex and difficult to tackle, as it touches diverse habits and social behaviors. This paper introduces the Grumpy Bin, a smart food waste bin designed for the context of student housing. The Grumpy Bin contributes to the state of the art...

  9. Mean shift trackers with cross-bin metrics.

    Science.gov (United States)

    Leichter, Ido

    2012-04-01

    Cross-bin metrics have been shown to be more suitable than bin-by-bin metrics for measuring the distance between histograms in various applications. In particular, a visual tracker that minimizes the earth mover's distance (EMD) between the candidate and reference feature histograms has recently been proposed. This tracker was shown to be more robust than the Mean Shift tracker, which employs a bin-by-bin metric. In each frame, the former tracker iteratively shifts the candidate location by one pixel in the direction opposite to the EMD's gradient until no improvement is made. This optimization process involves the clustering of the candidate feature density in feature space, as well as the computation of the EMD between the candidate and reference feature histograms after each shift of the candidate location. In this paper, alternative trackers that employ cross-bin metrics as well, but that are based on Mean Shift (MS) iterations, are derived. The proposed trackers are simpler and faster due to 1) the use of MS-based optimization, which is not restricted to single pixel shifts, 2) abstention from any clustering of feature densities, and 3) abstention from EMD computations in multidimensional spaces.
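
    A quick illustration of why cross-bin metrics matter: two histograms whose mass sits in adjacent bins look maximally different bin-by-bin, yet the 1-D earth mover's distance correctly reports them as close. This toy comparison is mine, not from the paper.

```python
import numpy as np
from scipy.stats import wasserstein_distance

h1 = np.array([0., 1., 0., 0.])   # all mass in bin 1
h2 = np.array([0., 0., 1., 0.])   # all mass in the neighbouring bin
centers = np.arange(4.0)

bin_by_bin = 0.5 * np.abs(h1 - h2).sum()               # L1/2 distance: 1.0 (maximal)
emd = wasserstein_distance(centers, centers, h1, h2)   # EMD: 1.0 bin width (small)
print(bin_by_bin, emd)
```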

  10. Thresholding histogram equalization.

    Science.gov (United States)

    Chuang, K S; Chen, S; Hwang, I M

    2001-12-01

    The drawbacks of adaptive histogram equalization techniques are the loss of definition on the edges of the object and overenhancement of noise in the images. These drawbacks can be avoided if the noise is excluded in the equalization transformation function computation. A method has been developed to separate the histogram into zones, each with its own equalization transformation. This method can be used to suppress the nonanatomic noise and enhance only certain parts of the object. This method can be combined with other adaptive histogram equalization techniques. Preliminary results indicate that this method can produce images with superior contrast.

  11. Infrared Contrast Enhancement Through Log-Power Histogram Modification

    NARCIS (Netherlands)

    Toet, A.; Wu, T.

    2015-01-01

    A simple power-logarithm histogram modification operator is proposed to enhance infrared (IR) image contrast. The algorithm combines a logarithm operator that smoothes the input image histogram while retaining the relative ordering of the original bins, with a power operator that restores the

  12. Mohammed A Bin Hussain

    Indian Academy of Sciences (India)

    Bulletin of Materials Science. Mohammed A Bin Hussain. Articles written in Bulletin of Materials Science. Volume 38 Issue 7 December 2015 pp 1731-1736. Sintered gahnite–cordierite glass-ceramic based on raw materials with different fluorine sources · Esmat M A Hamzawy Mohammed A Bin Hussain.

  13. Contrast enhancement using histogram equalization based on logarithmic mapping

    Science.gov (United States)

    Kim, Wonkyun; You, Jongmin; Jeong, Jechang

    2012-06-01

    A widely used contrast enhancement method, the histogram equalization (HE) often produces images with unnatural appearances and visually disturbing artifacts because the HE compels the enhanced image to follow the uniform distribution. An adaptive histogram equalization using logarithmic mapping is presented, with a proposed algorithm based on a bin underflow and overflow method that achieves contrast enhancement by putting constraints on each histogram component differently. To incorporate characteristics of the human visual system, the logarithmic mapping function is used as constraint function, while the rate of contrast enhancement is controlled by determining the control parameters with the characteristics of the original image. The experimental results show that the proposed algorithm not only keeps the original histogram shape features, but also enhances the contrast effectively. Due to its simplicity, the proposed algorithm can be applied by simple hardware and processed in a real-time system.

  14. DPAK and HPAK: a versatile display and histogramming package

    International Nuclear Information System (INIS)

    Logg, C.A.; Boyarski, A.M.; Cook, A.J.; Cottrell, R.L.A.; Sund, S.

    1979-07-01

    The features of a display and histogram package which requires a minimal number of subroutine calls in order to generate graphic output in many flavors on a variety of devices are described. Default options are preset to values that are generally most wanted, but the default values may be readily changed to the user's needs. The description falls naturally into two parts, namely, the set of routines (DPAK) for displaying data on some device, and the set of routines (HPAK) for generating histograms. HPAK provides a means of allocating memory for histograms, accumulating data into histograms, and subsequently displaying the histograms via calls to the DPAK routines. Histograms and displays of either one or two independent variables can be made

  15. Linear interpolation of histograms

    CERN Document Server

    Read, A L

    1999-01-01

    A prescription is defined for the interpolation of probability distributions that are assumed to have a linear dependence on a parameter of the distributions. The distributions may be in the form of continuous functions or histograms. The prescription is based on the weighted mean of the inverses of the cumulative distributions between which the interpolation is made. The result is particularly elegant for a certain class of distributions, including the normal and exponential distributions, and is useful for the interpolation of Monte Carlo simulation results which are time-consuming to obtain.
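
    A sketch of the horizontal (inverse-CDF) interpolation described above: for each quantile level, the interpolated quantile is the weighted mean of the two input quantiles. The sampling grid and the linear inversion via np.interp are illustrative simplifications.

```python
import numpy as np

def interpolate_histograms(h1, h2, edges, frac, n_levels=10_000):
    """frac=0 reproduces h1's distribution, frac=1 reproduces h2's."""
    centers = 0.5 * (edges[:-1] + edges[1:])
    q = np.linspace(0.001, 0.999, n_levels)
    cdf1 = np.cumsum(h1) / h1.sum()
    cdf2 = np.cumsum(h2) / h2.sum()
    inv1 = np.interp(q, cdf1, centers)      # F1^{-1}(q)
    inv2 = np.interp(q, cdf2, centers)      # F2^{-1}(q)
    x = (1 - frac) * inv1 + frac * inv2     # weighted mean of the inverses
    return np.histogram(x, bins=edges)[0]
```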

  16. Histogram equalization of CT images.

    Science.gov (United States)

    Lehr, J L; Capek, P

    1985-01-01

    Histogram equalization for display of clinical CT images was evaluated. In theory, histogram equalization makes optimal use of an available grey scale to display an image, and its use could circumvent the problem of selecting specific window settings for each image. In several clinical images, the use of a spatially variable histogram equalization technique limited to that portion of the CT image occupied by the patient did appear to increase the visibility of anatomic structures. However, using the technique also increased displayed image noise and artifacts. Although radiologists found this to be objectionable, it did not decrease the detectability of simulated low-contrast liver metastases. Further evaluation of histogram equalization for displaying CT images is being pursued.

  17. Fast tracking using edge histograms

    Science.gov (United States)

    Rokita, Przemyslaw

    1997-04-01

    This paper proposes a new algorithm for tracking objects and object boundaries. The algorithm was developed and applied in a system used for compositing computer-generated images and real-world video sequences, but it can be applied in general to all tracking systems where accuracy and high processing speed are required. The algorithm is based on the analysis of histograms obtained by summing the pixels of edge-segmented images along chosen axes. Edge segmentation is done by spatial convolution using a gradient operator. The advantage of such an approach is that it can be performed in real time using commercially available hardware convolution filters. After edge extraction and histogram computation, the respective positions of the maxima in the edge intensity histograms in the current and previous frames are compared and matched. The information about the displacement of the histogram maxima obtained this way can be converted directly into information about the changes of the target boundary positions along the chosen axes.
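
    A minimal sketch of the projection-histogram idea: sum an edge image along each axis and track the shift of the profile maxima between consecutive frames. Matching only the strongest maximum, as done here, is a simplification of the matching step the paper describes.

```python
import numpy as np

def edge_projections(edge_img):
    """Column and row profiles of a 2-D edge-magnitude image."""
    return edge_img.sum(axis=0), edge_img.sum(axis=1)

def displacement(prev_edges, curr_edges):
    px, py = edge_projections(prev_edges)
    cx, cy = edge_projections(curr_edges)
    # Compare positions of the strongest edge response along each axis.
    dx = int(np.argmax(cx)) - int(np.argmax(px))
    dy = int(np.argmax(cy)) - int(np.argmax(py))
    return dx, dy
```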

  18. Image Enhancement via Subimage Histogram Equalization Based on Mean and Variance.

    Science.gov (United States)

    Zhuang, Liyun; Guan, Yepeng

    2017-01-01

    This paper puts forward a novel image enhancement method via Mean and Variance based Subimage Histogram Equalization (MVSIHE), which effectively increases the contrast of the input image with brightness and details well preserved compared with some other methods based on histogram equalization (HE). Firstly, the histogram of input image is divided into four segments based on the mean and variance of luminance component, and the histogram bins of each segment are modified and equalized, respectively. Secondly, the result is obtained via the concatenation of the processed subhistograms. Lastly, the normalization method is deployed on intensity levels, and the integration of the processed image with the input image is performed. 100 benchmark images from a public image database named CVG-UGR-Database are used for comparison with other state-of-the-art methods. The experiment results show that the algorithm can not only enhance image information effectively but also well preserve brightness and details of the original image.

  19. Live histograms in moving windows

    International Nuclear Information System (INIS)

    Zhil'tsov, V.E.

    1989-01-01

    The application of computer graphics for specific hardware testing is discussed. The hardware is a position-sensitive detector (multiwire proportional chamber) used in high energy physics experiments, together with its read-out electronics. The testing program (XPERT), which utilises a multi-window user interface, is described. Data are represented as histograms in windows. The windows on the screen may be moved, reordered, and resized. Histograms may be put into any window, and hardcopies may be made. Some program internals are discussed. The computer environment is quite simple: MS-DOS IBM PC/XT, 256 KB RAM, CGA, 5.25'' FD, Epson MX. 4 refs.; 7 figs

  20. The Maximum Resource Bin Packing Problem

    DEFF Research Database (Denmark)

    Boyar, J.; Epstein, L.; Favrholdt, L.M.

    2006-01-01

    Usually, for bin packing problems, we try to minimize the number of bins used or in the case of the dual bin packing problem, maximize the number or total size of accepted items. This paper presents results for the opposite problems, where we would like to maximize the number of bins used...... algorithms, First-Fit-Increasing and First-Fit-Decreasing for the maximum resource variant of classical bin packing. For the on-line variant, we define maximum resource variants of classical and dual bin packing. For dual bin packing, no on-line algorithm is competitive. For classical bin packing, we find...
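
    First-Fit-Increasing, one of the two algorithms named above, sketched in its standard greedy form: items are considered in increasing size order and each goes into the first bin that can hold it (bins of capacity 1). The maximum-resource analysis of this heuristic is in the paper; the code only shows the rule itself.

```python
def first_fit_increasing(items, capacity=1.0):
    bins = []
    for size in sorted(items):            # increasing size order
        for b in bins:
            if sum(b) + size <= capacity:
                b.append(size)            # first bin that fits wins
                break
        else:
            bins.append([size])           # no bin fits: open a new one
    return bins

print(len(first_fit_increasing([0.5, 0.7, 0.3, 0.2, 0.4])))
```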

  1. A novel 3-D color histogram equalization method with uniform 1-D gray scale histogram.

    Science.gov (United States)

    Han, Ji-Hee; Yang, Sejung; Lee, Byung-Uk

    2011-02-01

    The majority of color histogram equalization methods do not yield a uniform histogram in gray scale. After converting a color histogram equalized image into gray scale, the contrast of the converted image is worse than that of a 1-D gray scale histogram equalized image. We propose a novel 3-D color histogram equalization method that produces a uniform distribution in the gray scale histogram by defining a new cumulative probability density function in 3-D color space. Test results with natural and synthetic images are presented to compare and analyze various color histogram equalization algorithms based upon 3-D color histograms. We also present a theoretical analysis of the nonideal performance of existing methods.

  2. LHCb: Machine assisted histogram classification

    CERN Multimedia

    Somogyi, P; Gaspar, C

    2009-01-01

    LHCb is one of the four major experiments under completion at the Large Hadron Collider (LHC). Monitoring the quality of the acquired data is important, because it allows the verification of the detector performance. Anomalies, such as missing values or unexpected distributions can be indicators of a malfunctioning detector, resulting in poor data quality. Spotting faulty components can be either done visually using instruments such as the LHCb Histogram Presenter, or by automated tools. In order to assist detector experts in handling the vast monitoring information resulting from the sheer size of the detector, a graph-theoretic based clustering tool, combined with machine learning algorithms is proposed and demonstrated by processing histograms representing 2D event hitmaps. The concept is proven by detecting ion feedback events in the LHCb RICH subdetector.

  3. Radio Frequency Identification (RFID) and communication technologies for solid waste bin and truck monitoring system.

    Science.gov (United States)

    Hannan, M A; Arebey, Maher; Begum, R A; Basri, Hassan

    2011-12-01

    This paper deals with the integration of Radio Frequency Identification (RFID) and communication technologies for a solid waste bin and truck monitoring system. RFID, GPS, GPRS and GIS, along with camera technologies, have been integrated to develop an intelligent bin and truck monitoring system. A new kind of integrated theoretical framework, hardware architecture and interface algorithm has been introduced between the technologies for the successful implementation of the proposed system. In this system, the bin and truck databases have been developed in such a way that information on bin and truck ID, date and time of waste collection, bin status, amount of waste, and bin and truck GPS coordinates is compiled and stored for monitoring and management activities. The results showed that the real-time image processing, histogram analysis, waste estimation and other bin information are displayed in the GUI of the monitoring system. The real-time tests and experimental results showed that the performance of the developed system was stable and satisfied the requirements of the monitoring system with high practicability and validity. Copyright © 2011 Elsevier Ltd. All rights reserved.

  4. Fast histogram equalization for medical image enhancement.

    Science.gov (United States)

    Wang, Qian; Chen, Liya; Shen, Dinggang

    2008-01-01

    To overcome the problem that histogram equalization can fail for discrete images, a local-mean-based strict pixel ordering method was proposed recently, although it is impractical for 3D medical image enhancement due to its complex computation. In this paper, a novel histogram mapping method is proposed. It uses a fast local feature generation technique to establish a combined histogram that represents voxels' local means as well as grey levels. Different sections of the combined histogram, separated by individual peaks, are independently mapped into the target histogram scale under the constraint that the final overall histogram should be as uniform as possible. By using this method, the speed of histogram equalization is dramatically improved, and satisfactory enhancement results are also achieved.

  5. System for histogram entry, retrieval, and plotting

    International Nuclear Information System (INIS)

    Kellogg, M.; Gallup, J.M.; Shlaer, S.; Spencer, N.

    1977-10-01

    This manual describes the systems for producing histograms and dot plots that were designed for use in connection with the Q general-purpose data-acquisition system. These systems allow for the creation of histograms; the entry, retrieval, and plotting of data in the form of histograms; and the dynamic display of scatter plots as data are acquired. Although the systems are designed for use with Q, they can also be used as a part of other applications. 3 figures

  6. Improved taxation rate for bin packing games

    NARCIS (Netherlands)

    Kern, Walter; Qui, X.; Marchetti-Spaccamela, A.; Segal, M.

    2011-01-01

    A cooperative bin packing game is an $N$-person game, where the player set $N$ consists of $k$ bins of capacity 1 each and $n$ items of sizes $a_1,\dots,a_n$. The value of a coalition of players is defined to be the maximum total size of items in the coalition that can be packed into the bins of the

  7. Information granules in image histogram analysis.

    Science.gov (United States)

    Wieclawek, Wojciech

    2017-05-10

    A concept of granular computing employed in intensity-based image enhancement is discussed. First, a weighted granular computing idea is introduced. Then, the implementation of this term in the image processing area is presented. Finally, multidimensional granular histogram analysis is introduced. The proposed approach is dedicated to digital images, especially to medical images acquired by Computed Tomography (CT). As the histogram equalization approach, this method is based on image histogram analysis. Yet, unlike the histogram equalization technique, it works on a selected range of the pixel intensity and is controlled by two parameters. Performance is tested on anonymous clinical CT series. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Combining Vector Quantization and Histogram Equalization.

    Science.gov (United States)

    Cosman, Pamela C.; And Others

    1992-01-01

    Discussion of contrast enhancement techniques focuses on the use of histogram equalization with a data compression technique, i.e., tree-structured vector quantization. The enhancement technique of intensity windowing is described, and the use of enhancement techniques for medical images is explained, including adaptive histogram equalization.…

  9. Asymptotics for generalized piecewise linear histograms

    Czech Academy of Sciences Publication Activity Database

    Berlinet, A.; Hobza, Tomáš; Vajda, Igor

    2002-01-01

    Roč. 34, č. 3 (2002), s. 3-19 ISSN 0041-9184 R&D Projects: GA ČR GA102/99/1137 Institutional research plan: CEZ:AV0Z1075907 Keywords : nonparametric density estimation * histogram * piecewise linear histogram Subject RIV: BB - Applied Statistics, Operational Research

  10. Computationally efficient multidimensional analysis of complex flow cytometry data using second order polynomial histograms.

    Science.gov (United States)

    Zaunders, John; Jing, Junmei; Leipold, Michael; Maecker, Holden; Kelleher, Anthony D; Koch, Inge

    2016-01-01

    Many methods have been described for automated clustering analysis of complex flow cytometry data, but so far the goal to efficiently estimate multivariate densities and their modes for a moderate number of dimensions and potentially millions of data points has not been attained. We have devised a novel approach to describing modes using second order polynomial histogram estimators (SOPHE). The method divides the data into multivariate bins and determines the shape of the data in each bin based on second order polynomials, which is an efficient computation. These calculations yield local maxima and allow joining of adjacent bins to identify clusters. The use of second order polynomials also optimally uses wide bins, such that in most cases each parameter (dimension) need only be divided into 4-8 bins, again reducing computational load. We have validated this method using defined mixtures of up to 17 fluorescent beads in 16 dimensions, correctly identifying all populations in data files of 100,000 beads in analysis, and up to 65 subpopulations of PBMC in 33-dimensional CyTOF data, showing its usefulness in discovery research. SOPHE has the potential to greatly increase efficiency of analysing complex mixtures of cells in higher dimensions. © 2015 International Society for Advancement of Cytometry.

  11. Automatic histogram threshold using fuzzy measures.

    Science.gov (United States)

    Vieira Lopes, Nuno; Mogadouro do Couto, Pedro A; Bustince, Humberto; Melo-Pinto, Pedro

    2010-01-01

    In this paper, an automatic histogram threshold approach based on a fuzziness measure is presented. This work is an improvement of an existing method. Using fuzzy logic concepts, the problems involved in finding the minimum of a criterion function are avoided. Similarity between gray levels is the key to find an optimal threshold. Two initial regions of gray levels, located at the boundaries of the histogram, are defined. Then, using an index of fuzziness, a similarity process is started to find the threshold point. A significant contrast between objects and background is assumed. Previous histogram equalization is used in small contrast images. No prior knowledge of the image is required.

  12. COMPARISON AND ANALYSIS OF VARIOUS HISTOGRAM EQUALIZATION TECHNIQUES

    OpenAIRE

    MADKI.M.R; RUBINA KHAN

    2012-01-01

    The intensity histogram gives information which can be used for contrast enhancement. The histogram equalization could be flat for levels less than the total number of levels, which could deteriorate the image. This problem can be overcome by various techniques. This paper gives a comparison of the Bi-Histogram Equalization, Recursive Mean Separated Histogram Equalization, Multipeak Histogram Equalization and Brightness Preserving Dynamic Histogram Equalization techniques by using these techniqu...

  13. Color Histogram Diffusion for Image Enhancement

    Science.gov (United States)

    Kim, Taemin

    2011-01-01

    Various color histogram equalization (CHE) methods have been proposed to extend grayscale histogram equalization (GHE) to color images. In this paper a new method called histogram diffusion, which extends the GHE method to arbitrary dimensions, is proposed. Ranges in a histogram are specified as overlapping bars of uniform heights and variable widths which are proportional to their frequencies. This diagram is called the vistogram. As an alternative approach to GHE, the squared error of the vistogram from the uniform distribution is minimized. Each bar in the vistogram is approximated by a Gaussian function. The Gaussian particles in the vistogram diffuse as a nonlinear autonomous system of ordinary differential equations. CHE results on color images showed that the approach is effective.

  14. A New Method of Histogram Computation for Efficient Implementation of the HOG Algorithm

    Directory of Open Access Journals (Sweden)

    Mariana-Eugenia Ilas

    2018-03-01

    In this paper we introduce a new histogram computation method to be used within the histogram of oriented gradients (HOG) algorithm. The new method replaces the arctangent with a slope computation, and the classical interpolation-based magnitude allocation with a simpler algorithm. The new method allows a more efficient implementation of HOG in general, and particularly in field-programmable gate arrays (FPGAs), by considerably reducing the area (thus increasing the level of parallelism), while maintaining classification accuracy very close to that of the original algorithm. Thus, the new method is attractive for many applications, including car detection and classification.
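
    A sketch of the arctangent-free binning idea: over (-90°, 90°] the mapping from angle to slope gy/gx is monotone, so a pixel's orientation bin can be found by comparing its slope against precomputed tangent boundaries, with no arctan evaluation. The bin count and the L1 magnitude below are illustrative choices, not the paper's exact design.

```python
import numpy as np

N_BINS = 9
bounds = np.tan(np.linspace(-np.pi / 2, np.pi / 2, N_BINS + 1))
bounds[0], bounds[-1] = -np.inf, np.inf        # guard the open interval ends

def orientation_bins(gx, gy):
    safe_gx = np.where(gx != 0, gx, 1)
    slope = np.where(gx != 0, gy / safe_gx, np.inf)    # vertical -> last bin
    bins = np.searchsorted(bounds, slope, side='right') - 1
    magnitude = np.abs(gx) + np.abs(gy)        # cheap L1 stand-in for sqrt
    return np.clip(bins, 0, N_BINS - 1), magnitude
```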

  15. Letters from Abbottabad: Bin Ladin Sidelined?

    Science.gov (United States)

    2012-05-03

    warned Bin Ladin. In other words, Yunis warned Bin Ladin that unless the enthusiasts and religious extremists are brought in line, they would be a...his family adhered to such strict measures, precluding his children from playing outdoors without the supervision of an adult who could keep their

  16. Bioinformatics and Astrophysics Cluster (BinAc)

    Science.gov (United States)

    Krüger, Jens; Lutz, Volker; Bartusch, Felix; Dilling, Werner; Gorska, Anna; Schäfer, Christoph; Walter, Thomas

    2017-09-01

    BinAC provides central high performance computing capacities for bioinformaticians and astrophysicists from the state of Baden-Württemberg. The bwForCluster BinAC is part of the implementation concept for scientific computing for the universities in Baden-Württemberg. Community specific support is offered through the bwHPC-C5 project.

  17. Time-bin quantum RAM

    Science.gov (United States)

    Moiseev, E. S.; Moiseev, S. A.

    2016-11-01

    We have proposed a compact scheme of quantum random access memory (qRAM) based on impedance-matched multi-qubit photon echo quantum memory incorporating a control four-level atom in two coupled QED cavities. A set of matching conditions for the basic physical parameters of the qRAM scheme that provides efficient quantum control of fast single photon storage and readout has been found. In particular, it has been discovered that the efficient qRAM operations are determined by the specific properties of the excited photonic molecule coupling the two QED cavities. Herein, the maximal efficiency of the qRAM is realized when the cooperativity parameter of the photonic molecule equals unity, which is experimentally achievable. We have also elaborated a new quantum addressing scheme where a multi-time-bin photon state is used for the control of the four-level atom during the readout of the photonic qubits from the quantum memory. The scheme reduces the required number of logical elements to one. Experimental implementation by means of current quantum technologies in the optical and microwave domains is also discussed.

  18. The Research of Histogram Enhancement Technique Based on Matlab Software

    Directory of Open Access Journals (Sweden)

    Li Kai

    2014-08-01

    Histogram enhancement has been widely applied as a typical technique in digital image processing. Based on Matlab software, this paper applies the two techniques of histogram equalization and histogram specification to darker images, using partial equalization and histogram mapping to transform the original histograms and thereby enhance the image information. The results show that both techniques can significantly improve image quality and enhance image features.
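
    Minimal NumPy versions of the two operations the paper exercises in Matlab, shown here as a hedged sketch rather than the paper's code: histogram equalization maps intensities through the image's own CDF, and histogram specification matches the CDF of a reference image.

```python
import numpy as np

def equalize(img):
    """Histogram equalization for a uint8 image."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum() / img.size
    return (cdf[img] * 255).astype(np.uint8)

def specify(img, ref):
    """Histogram specification: match img's histogram to ref's."""
    cdf_img = np.bincount(img.ravel(), minlength=256).cumsum() / img.size
    cdf_ref = np.bincount(ref.ravel(), minlength=256).cumsum() / ref.size
    mapping = np.interp(cdf_img, cdf_ref, np.arange(256))
    return mapping[img].astype(np.uint8)
```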

  19. Regionally adaptive histogram equalization of the chest.

    Science.gov (United States)

    Sherrier, R H; Johnson, G A

    1987-01-01

    Advances in the area of digital chest radiography have resulted in the acquisition of high-quality images of the human chest. With these advances, there arises a genuine need for image processing algorithms specific to the chest, in order to fully exploit this digital technology. We have implemented the well-known technique of histogram equalization, noting the problems encountered when it is adapted to chest images. These problems have been successfully solved with our regionally adaptive histogram equalization method. With this technique histograms are calculated locally and then modified according to both the mean pixel value of that region as well as certain characteristics of the cumulative distribution function. This process, which has allowed certain regions of the chest radiograph to be enhanced differentially, may also have broader implications for other image processing tasks.

  20. ACTION RECOGNITION USING SALIENT NEIGHBORING HISTOGRAMS

    DEFF Research Database (Denmark)

    Ren, Huamin; Moeslund, Thomas B.

    2013-01-01

    Combining spatio-temporal interest points with Bag-of-Words models achieves state-of-the-art performance in action recognition. However, existing methods based on “bag-of-words” models either are too local to capture the variance in space/time or fail to solve the ambiguity problem in spatial...... and temporal dimensions. Instead, we propose a salient vocabulary construction algorithm to select visual words from a global point of view, and form compact descriptors to represent discriminative histograms in the neighborhoods. Those salient neighboring histograms are then trained to model different actions...

  1. Combining resummed Higgs predictions across jet bins

    Energy Technology Data Exchange (ETDEWEB)

    Boughezal, Radja [Argonne National Laboratory, IL (United States). High Energy Physics Division; Liu, Xiaohui; Petriello, Frank [Argonne National Laboratory, IL (United States). High Energy Physics Division; Northwestern Univ., Evanston, IL (United States). Dept. of Physics and Astronomy; Tackmann, Frank J. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Walsh, Jonathan R. [California Univ., Berkeley, CA (United States). Ernest Orlando Lawrence Berkeley Laboratory; California Univ., Berkeley, CA (United States). Center for Theoretical Physics

    2013-12-15

    Experimental analyses often use jet binning to distinguish between different kinematic regimes and separate contributions from background processes. To accurately model theoretical uncertainties in these measurements, a consistent description of the jet bins is required. We present a complete framework for the combination of resummed results for production processes in different exclusive jet bins, focusing on Higgs production in gluon fusion as an example. We extend the resummation of the H+1-jet cross section into the challenging low transverse momentum region, lowering the uncertainties considerably. We provide combined predictions with resummation for cross sections in the H+0-jet and H+1-jet bins, and give an improved theory covariance matrix for use in experimental studies. We estimate that the relevant theoretical uncertainties on the signal strength in the H→WW* analysis are reduced by nearly a factor of 2 compared to the current value.

  2. Segmentation and Location Computation of Bin Objects

    Directory of Open Access Journals (Sweden)

    C.R. Hema

    2008-11-01

    In this paper we present a stereo vision based system for segmentation and location computation of partially occluded objects in bin picking environments. Algorithms to segment partially occluded objects and to find the object location (midpoint: x, y and z coordinates) with respect to the bin area are proposed. The z coordinate is computed using stereo images and neural networks. The proposed algorithms are tested using two neural network architectures, namely Radial Basis Function nets and simple feedforward nets. The training results of the feedforward nets are found to be more suitable for the current application. The proposed stereo vision system is interfaced with an Adept SCARA Robot to perform bin picking operations. The vision system is found to be effective for partially occluded objects, in the absence of albedo effects. The results are validated through real-time bin picking experiments on the Adept Robot.

  3. Method of infrared image enhancement based on histogram

    Science.gov (United States)

    Wang, Liang; Yan, Jie

    2011-05-01

    Aiming at the problem of infrared image enhancement, a new histogram-based method is given. Using the gray characteristics of the target, an upper-bound threshold is selected adaptively and the histogram is processed with this threshold. After choosing the gray transform function based on the gray level distribution of the image, the gray transformation is done during histogram equalization. Finally, the enhanced image is obtained. Compared with histogram equalization (HE), histogram double equalization (HDE) and plateau histogram equalization (PE), the simulation results demonstrate that the image enhancement effect of this method is clearly superior; at the same time, its operation speed is fast and its real-time capability is excellent.
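
    A sketch of plateau histogram equalization (PE), one of the reference methods named above: the histogram is clipped at a plateau value before the equalizing CDF is built, which stops large uniform backgrounds from dominating the mapping. The plateau selection rule is left to the caller; the paper's adaptive threshold is not reproduced here.

```python
import numpy as np

def plateau_equalize(img, plateau):
    """Clip dominant histogram bins at `plateau`, then equalize (uint8)."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    hist = np.minimum(hist, plateau)          # suppress dominant bins
    cdf = hist.cumsum() / hist.sum()
    return (cdf[img] * 255).astype(np.uint8)
```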

  4. Bin Set 1 Calcine Retrieval Feasibility Study

    Energy Technology Data Exchange (ETDEWEB)

    R. D. Adams; S. M. Berry; K. J. Galloway; T. A. Langenwalter; D. A. Lopez; C. M. Noakes; H. K. Peterson; M. I. Pope; R. J. Turk

    1999-10-01

    At the Department of Energy's Idaho Nuclear Technology and Engineering Center, as an interim waste management measure, both mixed high-level liquid waste and sodium bearing waste have been solidified by a calcination process and are stored in the Calcine Solids Storage Facilities. This calcined product will eventually be treated to allow final disposal in a national geologic repository. The Calcine Solids Storage Facilities comprise seven ''bin sets.'' Bin Set 1, the first to be constructed, was completed in 1959, and has been in service since 1963. It is the only bin set that does not meet current safe-shutdown earthquake seismic criteria. In addition, it is the only bin set that lacks built-in features to aid in calcine retrieval. One option to alleviate the seismic compliance issue is to transport the calcine from Bin Set 1 to another bin set which has the required capacity and which is seismically qualified. This report studies the feasibility of retrieving the calcine from Bin Set 1 and transporting it into Bin Set 6, which is located approximately 650 feet away. Because Bin Set 1 was not designed for calcine retrieval, and because of the high radiation levels and potential contamination spread from the calcined material, this is a challenging engineering task. This report presents preconceptual design studies for remotely-operated, low-density, pneumatic vacuum retrieval and transport systems and equipment that are based on past work performed by the Raytheon Engineers and Constructors architectural engineering firm. The designs presented are considered feasible; however, future development work will be needed in several areas during the subsequent conceptual design phase.

  5. Bin Set 1 Calcine Retrieval Feasibility Study

    International Nuclear Information System (INIS)

    Adams, R.D.; Berry, S.M.; Galloway, K.J.; Langenwalter, T.A.; Lopez, D.A.; Noakes, C.M.; Peterson, H.K.; Pope, M.I.; Turk, R.J.

    1999-01-01

    At the Department of Energy's Idaho Nuclear Technology and Engineering Center, as an interim waste management measure, both mixed high-level liquid waste and sodium bearing waste have been solidified by a calcination process and are stored in the Calcine Solids Storage Facilities. This calcined product will eventually be treated to allow final disposal in a national geologic repository. The Calcine Solids Storage Facilities comprise seven ''bin sets.'' Bin Set 1, the first to be constructed, was completed in 1959, and has been in service since 1963. It is the only bin set that does not meet current safe-shutdown earthquake seismic criteria. In addition, it is the only bin set that lacks built-in features to aid in calcine retrieval. One option to alleviate the seismic compliance issue is to transport the calcine from Bin Set 1 to another bin set which has the required capacity and which is seismically qualified. This report studies the feasibility of retrieving the calcine from Bin Set 1 and transporting it into Bin Set 6, which is located approximately 650 feet away. Because Bin Set 1 was not designed for calcine retrieval, and because of the high radiation levels and potential contamination spread from the calcined material, this is a challenging engineering task. This report presents preconceptual design studies for remotely-operated, low-density, pneumatic vacuum retrieval and transport systems and equipment that are based on past work performed by the Raytheon Engineers and Constructors architectural engineering firm. The designs presented are considered feasible; however, future development work will be needed in several areas during the subsequent conceptual design phase

  6. Control system of hexacopter using color histogram footprint and convolutional neural network

    Science.gov (United States)

    Ruliputra, R. N.; Darma, S.

    2017-07-01

    The development of unmanned aerial vehicles (UAV) has been growing rapidly in recent years. Logical decision-making implemented in the program algorithms is needed to make a smart system. Using visual input from a camera, a UAV is able to fly autonomously by detecting a target. However, some weaknesses arise outdoors, as the environment might change the target's color intensity. The color histogram footprint overcomes this problem because it divides color intensity into separate bins, which makes the detection tolerant to slight changes of color intensity. Template matching compares the detection result with a template of the reference image to determine the target position, and uses it to position the vehicle in the middle of the target with visual feedback control based on a Proportional-Integral-Derivative (PID) controller. The color histogram footprint method localizes the target by calculating the back projection of its histogram. It has an average success rate of 77% from a distance of 1 meter. It can position the vehicle in the middle of the target by using visual feedback control, with an average positioning time of 73 seconds. After the hexacopter is in the middle of the target, a Convolutional Neural Network (CNN) classifies a number contained in the target image to determine a task depending on the classified number: either landing, yawing, or return to launch. The recognition result shows an optimum success rate of 99.2%.
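
    A sketch of the histogram back-projection step using OpenCV; the paper's own implementation details are not given, so the channel choice, bin counts, and peak-picking below are assumptions. The target's hue/saturation histogram is back-projected onto each frame, and the brightest response gives the likely target location.

```python
import cv2
import numpy as np

def make_footprint(target_bgr):
    """Hue/saturation histogram of the reference target image."""
    hsv = cv2.cvtColor(target_bgr, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0, 1], None, [30, 32], [0, 180, 0, 256])
    cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)
    return hist

def localize(frame_bgr, hist):
    """Back-project the footprint onto a frame; return the peak location."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    backproj = cv2.calcBackProject([hsv], [0, 1], hist, [0, 180, 0, 256], 1)
    _, _, _, max_loc = cv2.minMaxLoc(backproj)   # brightest response
    return max_loc                               # (x, y) of likely target
```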

  7. Robust histogram-based image retrieval

    Czech Academy of Sciences Publication Activity Database

    Höschl, Cyril; Flusser, Jan

    2016-01-01

    Roč. 69, č. 1 (2016), s. 72-81 ISSN 0167-8655 R&D Projects: GA ČR GA15-16928S Institutional support: RVO:67985556 Keywords : Image retrieval * Noisy image * Histogram * Convolution * Moments * Invariants Subject RIV: JD - Computer Applications, Robotics Impact factor: 1.995, year: 2016 http://library.utia.cas.cz/separaty/2015/ZOI/hoschl-0452147.pdf

  8. A Psychological Profile of Osama bin Laden.

    Science.gov (United States)

    Ross, Colin A

    2015-01-01

    Understanding Osama bin Laden's personal history illuminates his motivation, inner conflicts, decisions and behaviors. His relationships with his mother, father, country and religion set the stage for his conflicted choices as an adolescent and then as an adult. Although only a cursory psychological profile is possible based on public domain information, the profile constructed here could be useful in setting future foreign policy. Perhaps the crucial mistake in U.S. foreign policy was abandoning bin Laden as an asset when Russian forces were expelled from Afghanistan in 1989: this act by the U.S. set the stage for the World Trade Center attacks on September 11, 2001.

  9. Efficient contrast enhancement through log-power histogram modification

    NARCIS (Netherlands)

    Wu, T.; Toet, A.

    2014-01-01

    A simple power-logarithm histogram modification operator is proposed to enhance digital image contrast. First a logarithm operator reduces the effect of spikes and transforms the image histogram into a smoothed one that approximates a uniform histogram while retaining the relative size ordering of

  10. Binning metagenomic contigs by coverage and composition

    NARCIS (Netherlands)

    Alneberg, J.; Bjarnason, B.S.; Bruijn, de I.; Schirmer, M.; Quick, J.; Ijaz, U.Z.; Lahti, L.M.; Loman, N.J.; Andersson, A.F.; Quince, C.

    2014-01-01

    Shotgun sequencing enables the reconstruction of genomes from complex microbial communities, but because assembly does not reconstruct entire genomes, it is necessary to bin genome fragments. Here we present CONCOCT, a new algorithm that combines sequence composition and coverage across multiple
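
    A hedged sketch of the composition-plus-coverage idea, not CONCOCT itself: each contig is described by its tetranucleotide frequencies concatenated with its per-sample coverages, and the joint feature space is clustered. The Gaussian mixture below stands in for CONCOCT's own model; all names are illustrative.

```python
import numpy as np
from itertools import product
from sklearn.mixture import GaussianMixture

KMERS = {''.join(k): i for i, k in enumerate(product('ACGT', repeat=4))}

def features(contig_seq, coverages):
    """Tetranucleotide composition + log coverage feature vector."""
    counts = np.zeros(len(KMERS))
    for i in range(len(contig_seq) - 3):
        idx = KMERS.get(contig_seq[i:i + 4])
        if idx is not None:                  # skip k-mers containing N, etc.
            counts[idx] += 1
    comp = counts / max(counts.sum(), 1)
    return np.concatenate([comp, np.log1p(coverages)])

def bin_contigs(seqs, coverage_matrix, n_bins):
    X = np.array([features(s, c) for s, c in zip(seqs, coverage_matrix)])
    return GaussianMixture(n_components=n_bins).fit_predict(X)
```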

  11. The World According to Usama Bin Laden

    Science.gov (United States)

    2001-01-01

    little-known manifesto, Al-Faridah Al-Gha’ibah (the neglected duty), by Muhammad Abdel Salam Al-Farag. The work of this Egyptian Islamic radical, who was...mother, a progressive woman who was his father’s fourth wife. Bin Laden attended King Abdul Aziz University in Jeddah and in 1979 earned a degree—in

  12. ESG Allocations

    Data.gov (United States)

    Department of Housing and Urban Development — This report displays the Emergency Solutions Grants (ESG), formerly Emergency Shelter Grants, allocation by jurisdiction. The website allows users to look at...

  13. A histogram-free multicanonical Monte Carlo algorithm for the construction of analytical density of states

    Energy Technology Data Exchange (ETDEWEB)

    Eisenbach, Markus [ORNL; Li, Ying Wai [ORNL

    2017-06-01

    We report a new multicanonical Monte Carlo (MC) algorithm to obtain the density of states (DOS) for physical systems with continuous state variables in statistical mechanics. Our algorithm is able to obtain an analytical form for the DOS expressed in a chosen basis set, instead of a numerical array of finite resolution as in previous variants of this class of MC methods such as the multicanonical (MUCA) sampling and Wang-Landau (WL) sampling. This is enabled by storing the visited states directly in a data set and avoiding the explicit collection of a histogram. This practice also has the advantage of avoiding undesirable artificial errors caused by the discretization and binning of continuous state variables. Our results show that this scheme is capable of obtaining converged results with a much reduced number of Monte Carlo steps, leading to a significant speedup over existing algorithms.

  14. Pakistan gives up in the hunt for Osama bin Laden / Heiki Suurkask

    Index Scriptorium Estoniae

    Suurkask, Heiki, 1972-

    2004-01-01

    According to Pakistani President Pervez Musharraf, the search for Osama bin Laden in South Waziristan has been fruitless. The search will henceforth focus on North Waziristan; it is speculated that bin Laden may also be hiding in the Tora Bora cave complex. Sidebar: the toothless U.S. intelligence service

  15. MaxBin 2.0: an automated binning algorithm to recover genomes from multiple metagenomic datasets

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Yu-Wei [Joint BioEnergy Inst. (JBEI), Emeryville, CA (United States); Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Simmons, Blake A. [Joint BioEnergy Inst. (JBEI), Emeryville, CA (United States); Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Singer, Steven W. [Joint BioEnergy Inst. (JBEI), Emeryville, CA (United States); Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-10-29

    The recovery of genomes from metagenomic datasets is a critical step to defining the functional roles of the underlying uncultivated populations. We previously developed MaxBin, an automated binning approach for high-throughput recovery of microbial genomes from metagenomes. Here, we present an expanded binning algorithm, MaxBin 2.0, which recovers genomes from co-assembly of a collection of metagenomic datasets. Tests on simulated datasets revealed that MaxBin 2.0 is highly accurate in recovering individual genomes, and the application of MaxBin 2.0 to several metagenomes from environmental samples demonstrated that it could achieve two complementary goals: recovering more bacterial genomes compared to binning a single sample as well as comparing the microbial community composition between different sampling environments. Availability and implementation: MaxBin 2.0 is freely available at http://sourceforge.net/projects/maxbin/ under BSD license. Supplementary information: Supplementary data are available at Bioinformatics online.

  16. Survey of Contrast Enhancement Techniques based on Histogram Equalization

    OpenAIRE

    Manpreet Kaur,; Jasdeep Kaur; Jappreet Kaur

    2011-01-01

    Contrast enhancement is frequently referred to as one of the most important issues in image processing. Histogram equalization (HE) is one of the common methods used for improving contrast in digital images, and has proved to be a simple and effective image contrast enhancement technique. However, conventional histogram equalization methods usually result in excessive contrast enhancement, which causes an unnatural look and visual artifacts in the processed i...

  17. An adaptive brightness preserving bi-histogram equalization

    Science.gov (United States)

    Shen, Hongying; Sun, Shuifa; Lei, Bangjun; Zheng, Sheng

    2011-11-01

    Based on brightness preserving bi-histogram equalization (BBHE), an adaptive image histogram equalization algorithm for contrast enhancement is proposed. A threshold is obtained through adaptive iterative steps and used to divide the original image into two sub-images. The proposed Iterative Brightness Bi-Histogram Equalization overcomes the over-enhancement phenomenon of conventional histogram equalization. The simulation results show that the algorithm can not only preserve the mean brightness, but also keep the enhanced image's information effective from the standpoint of visual perception, and obtain a better edge detection result.
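
    A compact BBHE sketch: split the histogram at the mean intensity and equalize the two halves independently over their own ranges, which keeps the output mean close to the input mean. The paper's adaptive iterative threshold would replace the plain mean used here.

```python
import numpy as np

def bbhe(img, threshold=None):
    """Bi-histogram equalization of a uint8 image, split at the mean."""
    t = int(img.mean()) if threshold is None else threshold
    out = np.empty_like(img)
    for lo, hi in ((0, t), (t + 1, 255)):
        mask = (img >= lo) & (img <= hi)
        if not mask.any():
            continue
        hist = np.bincount(img[mask], minlength=256)[lo:hi + 1]
        cdf = hist.cumsum() / hist.sum()
        # equalize each half only over its own intensity range [lo, hi]
        out[mask] = (lo + cdf[img[mask] - lo] * (hi - lo)).astype(img.dtype)
    return out
```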

  18. Improvement of digital mammogram images using histogram equalization, histogram stretching and median filter.

    Science.gov (United States)

    Langarizadeh, M; Mahmud, R; Ramli, A R; Napis, S; Beikzadeh, M R; Rahman, W E Z W A

    2011-02-01

    Breast cancer is one of the most important diseases among females worldwide. According to the Malaysian Oncological Society, about 4% of women aged 40 and above have breast cancer. Masses and microcalcifications are two important signs for breast cancer diagnosis on mammography. Enhancement techniques, i.e. histogram equalization, histogram stretching and median filters, were used to provide better visualization for radiologists in order to help early detection of breast abnormalities. In this research, 60 digital mammogram images, including 20 normal and 40 confirmed cancerous cases, were selected and manipulated using the mentioned techniques. The original and manipulated images were scored by three expert radiologists. Results showed that the selected methods have a positive significant effect on image quality.

  19. Multiple histogram method and static Monte Carlo sampling

    NARCIS (Netherlands)

    Inda, M.A.; Frenkel, D.

    2004-01-01

    We describe an approach to use multiple-histogram methods in combination with static, biased Monte Carlo simulations. To illustrate this, we computed the force-extension curve of an athermal polymer from multiple histograms constructed in a series of static Rosenbluth Monte Carlo simulations. From

  20. Contrast enhancement via texture region based histogram equalization

    Science.gov (United States)

    Singh, Kuldeep; Vishwakarma, Dinesh K.; Singh Walia, Gurjit; Kapoor, Rajiv

    2016-08-01

    This paper presents two novel contrast enhancement approaches using texture-region-based histogram equalization (HE). In HE-based contrast enhancement methods, the enhanced image often contains undesirable artefacts because an excessive number of pixels in the non-textured areas heavily bias the histogram. The novel idea presented in this paper is to suppress the impact of pixels in non-textured areas and to exploit texture features for the computation of the histogram in the process of HE. The first algorithm, named Dominant Orientation-based Texture Histogram Equalization (DOTHE), constructs the histogram of the image using only those image patches having dominant orientation. DOTHE categorizes image patches into smooth, dominant or non-dominant orientation patches by using the image variance and the singular value decomposition algorithm, and utilizes only dominant orientation patches in the process of HE. The second method, termed Edge-based Texture Histogram Equalization, calculates significant edges in the image and constructs the histogram using the grey levels present in the neighbourhood of the edges. The cumulative density function of the histogram formed from texture features is mapped onto the entire dynamic range of the input image to produce the contrast-enhanced image. Subjective as well as objective performance assessments of the proposed methods are conducted and compared with other existing HE methods. The performance assessment in terms of visual quality, contrast improvement index, entropy and measure of enhancement reveals that the proposed methods outperform the existing HE methods.

  1. Automatic Grasp Generation and Improvement for Industrial Bin-Picking

    DEFF Research Database (Denmark)

    Kraft, Dirk; Ellekilde, Lars-Peter; Rytz, Jimmy Alison

    2014-01-01

    and achieve comparable results and that our learning approach can improve system performance significantly. Automatic bin-picking is an important industrial process that can lead to significant savings and potentially keep production in countries with high labour cost rather than outsourcing it. The presented...... work allows to minimize cycle time as well as setup cost, which are essential factors in automatic bin-picking. It therefore leads to a wider applicability of bin-picking in industry....

  2. Efficient binning for bitmap indices on high-cardinality attributes

    Energy Technology Data Exchange (ETDEWEB)

    Rotem, Doron; Stockinger, Kurt; Wu, Kesheng

    2004-11-17

    Bitmap indexing is a common technique for indexing high-dimensional data in data warehouses and scientific applications. Though efficient for low-cardinality attributes, query processing can be rather costly for high-cardinality attributes due to the large storage requirements of the bitmap indices. Binning is a common technique for reducing the storage costs of bitmap indices. This technique partitions the attribute values into a number of ranges, called bins, and uses bitmap vectors to represent bins (attribute ranges) rather than distinct values. Although binning may reduce storage costs, it may increase the access costs of queries that do not fall on exact bin boundaries (edge bins). For such queries, the original data values associated with edge bins must be accessed in order to check them against the query constraints. In this paper we study the problem of finding optimal locations for the bin boundaries in order to minimize these access costs subject to storage constraints. We propose a dynamic programming algorithm for optimal partitioning of attribute values into bins that takes into account query access patterns as well as data distribution statistics. Mathematical analysis and experiments on real-life data sets show that the optimal partitioning achieved by this algorithm can lead to a significant improvement in the access costs of bitmap indexing systems for high-cardinality attributes.
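
    To make the edge-bin cost concrete, here is a small illustrative sketch (equi-depth bins on synthetic data; not the paper's dynamic programming algorithm) showing which bins of a range query can be answered from bitmaps alone and which force candidate checks:

```python
# Range query over binned values: bins fully inside [lo, hi) need no raw-data
# access, while straddling "edge bins" require checking the original values.
import numpy as np

rng = np.random.default_rng(0)
values = rng.integers(0, 10_000, size=100_000)

n_bins = 16
edges = np.quantile(values, np.linspace(0, 1, n_bins + 1))  # equi-depth bounds
bin_of = np.clip(np.searchsorted(edges, values, side='right') - 1, 0, n_bins - 1)

lo, hi = 2_500, 7_300                                       # query: lo <= v < hi
inner = [b for b in range(n_bins) if edges[b] >= lo and edges[b + 1] <= hi]
edge = [b for b in range(n_bins)
        if b not in inner and edges[b + 1] > lo and edges[b] < hi]
candidates = np.isin(bin_of, edge)                          # values to re-check
hits = np.isin(bin_of, inner) | (candidates & (values >= lo) & (values < hi))
print(len(edge), "edge bins;", int(candidates.sum()), "candidate checks")
```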

  3. IMPLEMENTATION OF THE HISTOGRAM EQUALIZATION METHOD TO IMPROVE DIGITAL IMAGE QUALITY

    Directory of Open Access Journals (Sweden)

    Isa Akhlis

    2012-02-01

    Full Text Available Radiography can be used to help diagnose diseases in the medical field. Radiographic images generally still appear blurred, so processing is required to remove or reduce the blur. The aim of this research is to design software that improves the quality of digital X-ray images by increasing their contrast. One method for increasing the contrast of a digital image is histogram equalization, which spreads the grey levels of the image evenly across all grey levels. The results show that the histogram equalization method can be used to increase image contrast, as can be seen directly on a monitor screen. Keywords: radiographic image, histogram equalization
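
    For reference, histogram equalization as described in this and several of the surrounding records amounts to remapping grey levels through the image's cumulative distribution; a minimal sketch for an 8-bit image:

```python
# Global histogram equalization: map each grey level through the CDF so the
# output grey levels are spread (approximately) evenly over the full range.
import numpy as np

def equalize(img):
    """img: 2-D uint8 array. Returns the equalized uint8 image."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    cdf = np.cumsum(hist) / hist.sum()          # cumulative distribution
    lut = np.round(255 * cdf).astype(np.uint8)  # grey-level lookup table
    return lut[img]
```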

  4. HPLOT: the graphics interface package for the HBOOK histogramming package

    International Nuclear Information System (INIS)

    Watkins, H.

    1978-01-01

    The subroutine package HPLOT described in this report enables the CERN histogramming package HBOOK to produce high-quality pictures by means of high-resolution devices such as plotters. HPLOT can be implemented on any scientific computing system with a Fortran IV compiler and can be interfaced with any graphics package; special routines in addition to the basic ones enable users to embellish their histograms. Examples are also given of the use of HPLOT as a graphics package for plotting simple pictures without histograms. (Auth.)

  5. A variational model for histogram transfer of color images.

    Science.gov (United States)

    Papadakis, N; Provenzi, E; Caselles, V

    2011-06-01

    In this paper, we propose a variational formulation for histogram transfer of two or more color images. We study an energy functional composed of three terms: one tends to approach the cumulative histograms of the transformed images, while the other two tend to maintain the colors and geometry of the original images. By minimizing this energy, we obtain an algorithm that balances equalization and the conservation of features of the original images. As a result, the images evolve while approaching an intermediate histogram between them. This intermediate histogram does not need to be specified in advance, but is a natural result of the model. Finally, we provide experiments showing that the proposed method compares well with the state of the art.

  6. A Modified Image Comparison Algorithm Using Histogram Features

    OpenAIRE

    Al-Oraiqat, Anas M.; Kostyukova, Natalya S.

    2018-01-01

    This article discusses the problem of color image content comparison. In particular, methods of image content comparison are analyzed, the restrictions of the color histogram are described, and a modified method of image content comparison is proposed. This method uses color histograms and considers color locations. Testing and analysis of the base and modified algorithms are performed. The modified method shows 97% average precision for a collection containing about 700 images without loss of the adv...
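
    The record does not give the method's details, so the following is only a hedged sketch of one common way to make color-histogram comparison location-aware: split both images into a grid of blocks and average the per-block histogram intersections (the grid size and bin count are assumptions):

```python
# Location-aware color histogram comparison via per-block intersection.
import numpy as np

def block_hist(block, bins=8):
    """Normalized 3-D color histogram of one block (block: hxwx3 uint8)."""
    h, _ = np.histogramdd(block.reshape(-1, 3), bins=(bins,) * 3,
                          range=((0, 256),) * 3)
    return h.ravel() / h.sum()

def spatial_similarity(a, b, grid=4):
    """a, b: HxWx3 uint8 images of equal size; 1.0 = identical block histograms."""
    h, w = a.shape[0] // grid, a.shape[1] // grid
    scores = [np.minimum(block_hist(a[i*h:(i+1)*h, j*w:(j+1)*w]),
                         block_hist(b[i*h:(i+1)*h, j*w:(j+1)*w])).sum()
              for i in range(grid) for j in range(grid)]
    return float(np.mean(scores))
```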

  7. Online Variable-Sized Bin Packing with Conflicts

    DEFF Research Database (Denmark)

    Epstein, Leah; Favrholdt, Lene Monrad; Levin, Asaf

    2011-01-01

    We study a new kind of on-line bin packing with conflicts, motivated by a problem arising when scheduling jobs on the Grid. In this bin packing problem, the set of items is given at the beginning, together with a set of conflicts on pairs of items. A conflict on a pair of items implies...

  8. Mobile garbage bins and hand injuries in older people.

    Science.gov (United States)

    Niu, Rui; Woodbridge, Adam B; Smith, Belinda J; Ruff, Stephen J; Lawson, Richard D

    2013-10-07

    To conduct a database search, chart and literature review of open extensor tendon and proximal interphalangeal joint injuries incurred while handling mobile garbage bins. A review of medical records at a Sydney tertiary referral hospital and a NSW rural Level 2 trauma hospital from 1 January 2006 to 31 December 2010, identified through database searches of appropriate medical record codes and followed by a chart review. We identified 11 patients with finger injuries from handling mobile garbage bins that necessitated hospital-based treatments. Their average age was 75 years. Eight patients required surgery. Patients typically fell while maintaining their grip on mobile garbage bin handles, causing abrasive injury to the dorsal aspect of the proximal interphalangeal joint. Older patients are at risk of significant injuries to the dorsal side of their fingers when manoeuvring mobile garbage bins. This risk could be reduced by providing older members of the community with help to move their bins, or by modifying the design of bin handles. We propose a simple modification to the design of bin handles.

  9. Design and Development of a Smart Waste Bin

    Directory of Open Access Journals (Sweden)

    Michael E.

    2017-10-01

    Full Text Available For years, the waste bin has been part of our lives; this has necessitated many inventions and innovations to make it automated. In this light, much research was channeled towards opening and closing the bin when the presence of a human is sensed. However, this may be considered less smart, since the bin will operate whenever a human presence is sensed even if there is no intention to use it. To address this, this paper presents the design and development of a smart waste bin. The objective of this paper is to develop a smart waste bin that detects the presence of a person at a particular distance (1 meter) for usage, so as not to spill the dirt, and that obeys voice commands to open or close the lid. This is achieved by the use of a PIR sensor, an ultrasonic module, a voice recognition module, an Arduino and a servo motor. Results obtained after testing show that the developed waste bin attains a better level of smartness compared to existing waste bins.

  10. A Comparative Study of Histogram Equalization Based Image Enhancement Techniques for Brightness Preservation and Contrast Enhancement

    OpenAIRE

    Patel, Omprakash; Maravi, Yogendra P. S.; Sharma, Sanjeev

    2013-01-01

    Histogram equalization is a contrast enhancement technique in image processing which uses the histogram of the image. However, histogram equalization is not the best method for contrast enhancement because the mean brightness of the output image is significantly different from that of the input image. Several extensions of histogram equalization have been proposed to overcome the brightness preservation cha...

  11. Adaptive histogram subsection modification for infrared image enhancement

    Science.gov (United States)

    Qu, Hui-ming; Chen, Qian

    2006-05-01

    Firstly, the drawbacks of infrared image histogram equalization and its improved algorithms are analyzed. A novel technique is then presented, called adaptive histogram subsection modification in this paper, which can not only enhance the contrast but also preserve the detail information of an infrared image. The properties of the infrared image histogram are used to determine the subsection position adaptively. The second-order differential coefficient of the gray-level probability density curve is calculated from the top-down direction, and the first inflexion is chosen as the subsection point between high-probability-density and low-probability-density gray levels in the histogram of the infrared image. Then the histograms of the low- and high-probability-density sections are mapped and modified respectively. Finally, the subsection images are combined and an output infrared image is reconstructed. The contrast is enhanced while the original gray levels are mostly preserved during the extension of the dynamic range of the infrared image. Meanwhile, a suitable distance is kept between gray levels to avoid large isolated grains, defined as patchiness, in the image. Several infrared images are adopted to demonstrate the performance of this method. Experimental results show that infrared image quality is greatly improved by this approach. Furthermore, the proposed algorithm is simple and easy to implement.
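
    A rough sketch of the adaptive subsection step, under the assumption that the density curve is scanned in descending order of probability (the paper's exact procedure may differ):

```python
# Find a density threshold splitting high- and low-probability-density grey
# levels: take the first inflexion (sign change of the second difference) of
# the sorted density curve.
import numpy as np

def split_density(img):
    """img: 2-D uint8 array. Returns a probability-density threshold."""
    p = np.bincount(img.ravel(), minlength=256) / img.size
    p_sorted = np.sort(p)[::-1]              # density curve, from the top down
    d2 = np.diff(p_sorted, n=2)              # second-order difference
    k = int(np.argmax(np.diff(np.sign(d2)) != 0))  # first sign change
    return p_sorted[k + 1]

# Grey levels g with p[g] >= split_density(img) form the high-density section;
# the remaining levels form the low-density section, modified separately.
```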

  12. Discrimination of paediatric brain tumours using apparent diffusion coefficient histograms

    International Nuclear Information System (INIS)

    Bull, Jonathan G.; Clark, Christopher A.; Saunders, Dawn E.

    2012-01-01

    To determine if histograms of apparent diffusion coefficients (ADC) can be used to differentiate paediatric brain tumours. Imaging of histologically confirmed tumours with pre-operative ADC maps was reviewed (54 cases, 32 male, mean age 6.1 years; range 0.1-15.8 years) comprising 6 groups. Whole tumour ADC histograms were calculated and normalised for volume. Stepwise logistic regression analysis was used to differentiate tumour types using histogram metrics, initially for all groups and then for specific subsets. All 6 groups (5 dysembryoplastic neuroectodermal tumours, 22 primitive neuroectodermal tumours (PNET), 5 ependymomas, 7 choroid plexus papillomas, 4 atypical teratoid rhabdoid tumours (ATRT) and 9 juvenile pilocytic astrocytomas (JPA)) were compared. 74% (40/54) were correctly classified using logistic regression of ADC histogram parameters. In the analysis of posterior fossa tumours, 80% of ependymomas, 100% of astrocytomas and 94% of PNET-medulloblastomas were classified correctly. All PNETs were discriminated from ATRTs (22 PNET and 4 supratentorial ATRTs) (100%). ADC histograms are useful in differentiating paediatric brain tumours, in particular the common posterior fossa tumours of childhood. PNETs were differentiated from supratentorial ATRTs in all cases, which has important implications in terms of clinical management. (orig.)

  13. Defect detection based on extreme edge of defective region histogram

    Directory of Open Access Journals (Sweden)

    Zouhir Wakaf

    2018-01-01

    Full Text Available Automatic thresholding has been used by many applications in image processing and pattern recognition systems. Specific attention has been given to inspection for quality control purposes in various industries like steel processing and textile manufacturing. The automatic thresholding problem has been addressed well by the commonly used Otsu method, which provides suitable results for thresholding images with a histogram of bimodal distribution. However, the Otsu method fails when the histogram is unimodal or close to unimodal. Defects have different shapes and sizes, ranging from very small to large, and the gray-level distribution of the image histogram can vary between unimodal and multimodal. Furthermore, revised Otsu methods, like the valley-emphasis method and the background histogram mode extents, which overcome the drawbacks of the Otsu method, require preprocessing steps and fail to provide a general threshold for multimodal defects. This study proposes a new automatic thresholding algorithm based on the acquisition of the defective region histogram and the selection of its extreme edge as the threshold value to segment all defective objects in the foreground from the image background. To evaluate the proposed defect-detection method, common standard images were used for experimentation. Experimental results show that the proposed method outperforms current methods in terms of defect detection.
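
    For context, a compact version of the baseline Otsu method that the proposed extreme-edge approach improves on (the paper's own threshold selection is not reproduced here):

```python
# Otsu's threshold: choose the cut that maximizes between-class variance.
import numpy as np

def otsu(img):
    """img: 2-D uint8 array. Returns the Otsu threshold as an int."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    w = np.cumsum(p)                        # class-0 weight for each cut
    mu = np.cumsum(p * np.arange(256))      # class-0 cumulative mean
    mu_t = mu[-1]                           # global mean
    with np.errstate(divide='ignore', invalid='ignore'):
        between = (mu_t * w - mu) ** 2 / (w * (1.0 - w))
    return int(np.nanargmax(between))
```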

  14. Bin-packing problems with load balancing and stability constraints

    DEFF Research Database (Denmark)

    Trivella, Alessio; Pisinger, David

    ... realistic constraints related to e.g. load balancing, cargo stability and weight limits, in the multi-dimensional BPP. The BPP poses additional challenges compared to the CLP due to the supplementary objective of minimizing the number of bins. In particular, in section 2 we discuss how to integrate bin-packing and load balancing of items. The problem has only been considered in the literature in simplified versions, e.g. balancing a single bin or introducing a feasible region for the barycenter. In section 3 we generalize the problem to handle cargo stability and weight constraints.

  15. Contrast enhancement of portal images by selective histogram equalization.

    Science.gov (United States)

    Crooks, I; Fallone, B G

    1993-01-01

    Because of the high energy of the treatment beam, the contrast of portal verification films is very poor. A simple contrast enhancement technique, which we have labeled selective histogram equalization (SHE), is described to improve visualization of double-exposure portal images and thus facilitate the beam verification process. The technique performs separate histogram equalization on the treatment-field and open-field sections of double-exposure portal images. Delineation of the treatment field edge and separation into two regions is performed automatically for off-line portal radiographs by a strategic combination of Sobel filtration and morphological processes. Analyses of images processed by SHE and other adaptive histogram equalization techniques indicate that SHE produces improved contrast enhancement with minimal addition of noise or artifacts, thus simplifying the beam verification procedure. The simple implementation of an automatic SHE process with on-line portal systems is also discussed.
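
    A simplified sketch of the SHE idea, assuming the treatment-field mask has already been derived (the paper obtains it automatically via Sobel filtration and morphological processing):

```python
# Selective histogram equalization: equalize the treatment-field and
# open-field regions independently, given a boolean field mask.
import numpy as np

def equalize_region(img, mask, out):
    vals = img[mask]
    hist = np.bincount(vals, minlength=256).astype(float)
    cdf = np.cumsum(hist) / hist.sum()
    lut = np.round(255 * cdf).astype(np.uint8)
    out[mask] = lut[vals]

def selective_he(img, field_mask):
    """img: 2-D uint8; field_mask: boolean array marking the treatment field."""
    out = np.empty_like(img)
    equalize_region(img, field_mask, out)    # treatment-field section
    equalize_region(img, ~field_mask, out)   # open-field section
    return out
```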

  16. THE EFFECT OF HISTOGRAM EQUALIZATION ON DIGITAL IMAGE QUALITY IMPROVEMENT

    Directory of Open Access Journals (Sweden)

    Sisilia Daeng Bakka Mau

    2016-04-01

    Full Text Available This research discusses the use of the histogram equalization method for image quality improvement. Image enhancement is one of the initial processes in improving image quality. Quality improvement is needed because the images under consideration often have poor quality, for example images that are noisy, blurred, too dark or too bright, or insufficiently sharp. Image quality improvement is the process of clarifying and sharpening certain features of an image so that the image is easier to perceive and to analyze more carefully. The results of this research prove that the histogram equalization method can be used to increase image contrast and improve image quality, so that the information contained in the image is more clearly visible. Keywords: image quality improvement, histogram equalization, digital image

  17. Wavelet-based histogram equalization enhancement of gastric sonogram images.

    Science.gov (United States)

    Fu, J C; Lien, H C; Wong, S T

    2000-01-01

    The gray levels of gastric sonogram images are usually concentrated at the zero end of the spectrum, making the image too low in contrast and too dark for the naked eye. Though histogram equalization can enhance the contrast by redistributing the gray levels, it has the drawback that it reduces the information in the processed image. In this paper, a wavelet-based enhancement algorithm post-processor is used to further enhance the image and compensate for the information loss during histogram equalization. Experimental results show that the wavelet-based enhancement algorithm can enhance the contrast and significantly increase the informational entropy of the image. Because the combination of the histogram equalization and wavelet approach can dramatically increase the contrast and maintain information rate in gastric sonograms, it has the potential to improve clinical diagnosis and research.

  18. Residency Allocation Database

    Data.gov (United States)

    Department of Veterans Affairs — The Residency Allocation Database is used to determine allocation of funds for residency programs offered by Veterans Affairs Medical Centers (VAMCs). Information...

  19. Simulating storm electrification with bin and bulk microphysics

    Science.gov (United States)

    Mansell, E. R.

    2013-12-01

    Simulated storm electrification can be highly dependent on the parameterizations of microphysical processes, particularly those involving ice particles. Commonly used bulk microphysics schemes assume a functional form of the particle size distribution and predict one or more moments of the distribution, such as total mass, number concentration, and reflectivity. Bin schemes, on the other hand, allow the particle spectrum to evolve by predicting the number of particles in discrete size ranges (bins). Bin schemes are often promoted as benchmark solutions, but have much greater computational expense and can have other disadvantages. Only a few studies have compared results for bin and bulk schemes within the same model framework, which controls for differences in model numerics and other physics. Here, the bin microphysics scheme of Takahashi has been incorporated into the COMMAS model for comparison with the 2-3-moment bulk scheme. The resulting electrification, charge structure and lightning are compared as well. Charge separation and transfer have been newly added to the bin scheme, along with some updates to the physics, such as improved ice melting. Thus the same laboratory-based charging schemes from previous work can be used with both microphysics packages. The bulk and bin schemes generally produce similar microphysical features in the simulations. Differences can result as much from differences in the parameterizations of particle interactions (and particle types) as from the simple difference in size distributions. For example, both the bin and bulk schemes are sensitive to the concentration of cloud condensation nuclei, as shown in recent work with the bulk scheme. Results will be presented for idealized 2-dimensional cases and for fully 3D simulations of small multicell thunderstorms.

  20. Benefits of a Hospital Two-Bin Kanban System

    Science.gov (United States)

    2014-09-01

    ... supply chain management market. The application of RFID technologies was seen from high-valued item traceability to asset tracking, or Real Time... Figure captions from the report describe stocks split between primary (in front) and secondary bins (directly behind), with RFID tags placed on the front of each bin (photos taken at WRNMMC on 16DEC13), and an example of RFID tags placed on the RFID board.

  1. The volatile compound BinBase mass spectral database.

    Science.gov (United States)

    Skogerson, Kirsten; Wohlgemuth, Gert; Barupal, Dinesh K; Fiehn, Oliver

    2011-08-04

    Volatile compounds comprise diverse chemical groups with wide-ranging sources and functions. These compounds originate from major pathways of secondary metabolism in many organisms and play essential roles in chemical ecology in both plant and animal kingdoms. In past decades, sampling methods and instrumentation for the analysis of complex volatile mixtures have improved; however, design and implementation of database tools to process and store the complex datasets have lagged behind. The volatile compound BinBase (vocBinBase) is an automated peak annotation and database system developed for the analysis of GC-TOF-MS data derived from complex volatile mixtures. The vocBinBase DB is an extension of the previously reported metabolite BinBase software developed to track and identify derivatized metabolites. The BinBase algorithm uses deconvoluted spectra and peak metadata (retention index, unique ion, spectral similarity, peak signal-to-noise ratio, and peak purity) from the Leco ChromaTOF software, and annotates peaks using a multi-tiered filtering system with stringent thresholds. The vocBinBase algorithm assigns the identity of compounds existing in the database. Volatile compound assignments are supported by the Adams mass spectral-retention index library, which contains over 2,000 plant-derived volatile compounds. Novel molecules that are not found within vocBinBase are automatically added using strict mass spectral and experimental criteria. Users obtain fully annotated data sheets with quantitative information for all volatile compounds for studies that may consist of thousands of chromatograms. The vocBinBase database may also be queried across different studies, comprising currently 1,537 unique mass spectra generated from 1.7 million deconvoluted mass spectra of 3,435 samples (18 species). Mass spectra with retention indices and volatile profiles are available as free download under the CC-BY agreement (http://vocbinbase.fiehnlab.ucdavis.edu). The Bin

  2. The volatile compound BinBase mass spectral database

    Directory of Open Access Journals (Sweden)

    Barupal Dinesh K

    2011-08-01

    Full Text Available Abstract Background Volatile compounds comprise diverse chemical groups with wide-ranging sources and functions. These compounds originate from major pathways of secondary metabolism in many organisms and play essential roles in chemical ecology in both plant and animal kingdoms. In past decades, sampling methods and instrumentation for the analysis of complex volatile mixtures have improved; however, design and implementation of database tools to process and store the complex datasets have lagged behind. Description The volatile compound BinBase (vocBinBase) is an automated peak annotation and database system developed for the analysis of GC-TOF-MS data derived from complex volatile mixtures. The vocBinBase DB is an extension of the previously reported metabolite BinBase software developed to track and identify derivatized metabolites. The BinBase algorithm uses deconvoluted spectra and peak metadata (retention index, unique ion, spectral similarity, peak signal-to-noise ratio, and peak purity) from the Leco ChromaTOF software, and annotates peaks using a multi-tiered filtering system with stringent thresholds. The vocBinBase algorithm assigns the identity of compounds existing in the database. Volatile compound assignments are supported by the Adams mass spectral-retention index library, which contains over 2,000 plant-derived volatile compounds. Novel molecules that are not found within vocBinBase are automatically added using strict mass spectral and experimental criteria. Users obtain fully annotated data sheets with quantitative information for all volatile compounds for studies that may consist of thousands of chromatograms. The vocBinBase database may also be queried across different studies, comprising currently 1,537 unique mass spectra generated from 1.7 million deconvoluted mass spectra of 3,435 samples (18 species). Mass spectra with retention indices and volatile profiles are available as free download under the CC-BY agreement (http

  3. Comments on 'Reconsidering the definition of a dose-volume histogram'--dose-mass histogram (DMH) versus dose-volume histogram (DVH) for predicting radiation-induced pneumonitis.

    Science.gov (United States)

    Mavroidis, Panayiotis; Plataniotis, Georgios A; Górka, Magdalena Adamus; Lind, Bengt K

    2006-12-21

    In a recently published paper (Nioutsikou et al 2005 Phys. Med. Biol. 50 L17) the authors showed that the dose-mass histogram (DMH) concept is a more accurate descriptor of the dose delivered to lung than the traditionally used dose-volume histogram (DVH) concept. Furthermore, they state that if a functional imaging modality could also be registered to the anatomical imaging modality, providing a functional weighting across the organ (functional mass), then the more general and realistic concept of the dose-functioning mass histogram (D[F]MH) could be an even more appropriate descriptor. The comments of the present letter to the editor are in line with the basic arguments of that work, since their general conclusions appear to be supported by the comparison of the DMH and DVH concepts using radiobiological measures. In this study, it is examined whether the dose-mass histogram (DMH) concept deviates significantly from the widely used dose-volume histogram (DVH) concept regarding the expected lung complications, and whether there are clinical indications supporting these results. The problem was investigated theoretically by applying two hypothetical dose distributions (Gaussian and semi-Gaussian shaped) to two lungs of uniform and varying densities. The influence of the deviation between DVHs and DMHs on the treatment outcome was estimated by using the relative seriality and LKB models with the Gagliardi et al (2000 Int. J. Radiat. Oncol. Biol. Phys. 46 373) and Seppenwoolde et al (2003 Int. J. Radiat. Oncol. Biol. Phys. 55 724) parameter sets for radiation pneumonitis, respectively. Furthermore, the biological equivalent of their difference was estimated by the biologically effective uniform dose (D) and equivalent uniform dose (EUD) concepts, respectively. It is shown that the relation between the DVHs and DMHs varies depending on the underlying cell density distribution and the applied dose distribution. However, the range of their deviation in terms of the

  4. Fuzzy Logic-Based Histogram Equalization for Image Contrast Enhancement

    Directory of Open Access Journals (Sweden)

    V. Magudeeswaran

    2013-01-01

    Full Text Available Fuzzy logic-based histogram equalization (FHE) is proposed for image contrast enhancement. The FHE consists of two stages. First, a fuzzy histogram is computed based on fuzzy set theory to handle the inexactness of gray level values in a better way compared to classical crisp histograms. In the second stage, the fuzzy histogram is divided into two sub-histograms based on the median value of the original image, and these are then equalized independently to preserve image brightness. The qualitative and quantitative analyses of the proposed FHE algorithm are evaluated using two well-known parameters, the average information contents (AIC) and the natural image quality evaluator (NIQE) index, for various images. From the qualitative and quantitative measures, it is interesting to see that this proposed method provides optimum results by giving better contrast enhancement and preserving the local information of the original image. Experimental results show that the proposed method can effectively and significantly eliminate the washed-out appearance and adverse artifacts induced by several existing methods. The proposed method has been tested using several images and gives better visual quality as compared to the conventional methods.
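
    A hedged sketch of the second stage only, using crisp rather than fuzzy histograms: split the histogram at the image median and equalize each sub-histogram into its own half of the grey-level range, which helps preserve mean brightness:

```python
# Median-split sub-histogram equalization (crisp analogue of FHE stage 2).
import numpy as np

def median_split_he(img):
    """img: 2-D uint8 array. Returns a brightness-preserving equalized image."""
    m = int(np.median(img))
    out = np.empty_like(img)
    for lo, hi, mask in ((0, m, img <= m), (m + 1, 255, img > m)):
        vals = img[mask]
        if vals.size == 0:
            continue
        # histogram of this sub-range, shifted to start at zero
        hist = np.bincount(vals - lo, minlength=hi - lo + 1).astype(float)
        cdf = np.cumsum(hist) / hist.sum()
        # equalize into [lo, hi] only, so the two halves stay separated
        out[mask] = lo + np.round((hi - lo) * cdf[vals - lo]).astype(np.uint8)
    return out
```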

  5. Adaptive multi-histogram equalization using human vision thresholding

    Science.gov (United States)

    Wharton, Eric; Panetta, Karen; Agaian, Sos

    2007-02-01

    Image enhancement is the task of applying certain alterations to an input image so as to obtain a more visually pleasing image. The alteration usually requires interpretation of, and feedback on, the resulting output image by a human evaluator. Therefore, image enhancement is considered a difficult task when attempting to automate the analysis process and eliminate human intervention. Furthermore, images that do not have uniform brightness pose a challenging problem for image enhancement systems. Different kinds of histogram equalization techniques have been employed for enhancing images that have overall improper illumination or are over/under-exposed. However, these techniques perform poorly for images that contain various regions of improper illumination or improper exposure. In this paper, we introduce new automatic image enhancement techniques based on a human vision model: multi-histogram equalization as well as local and adaptive algorithms. These enhancement algorithms address the previously mentioned shortcomings. We present a comparison of our results against many current local and adaptive histogram equalization methods. Computer simulations are presented showing that the proposed algorithms outperform the other algorithms in two important areas. First, they perform better, in terms of both subjective and objective evaluations, than currently used algorithms on a series of poorly illuminated images as well as images with uniform and non-uniform illumination and images with improper exposure. Second, they adapt better to local features in an image, in comparison to histogram equalization methods which treat images globally.

  6. The effect of illumination compensation methods with histogram ...

    African Journals Online (AJOL)

    This paper presents the results of a factorial experiment performed to determine the effect of illumination compensation methods with histogram back projection for the object tracking algorithm continuous adaptive mean-shift (Camshift). Since Camshift tracking can be used for distance approximation of an object, ...
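
    For readers unfamiliar with the back-projection ingredient, a small OpenCV example (the file names are placeholders; the study's illumination compensation step is not shown):

```python
# Histogram back projection: score each frame pixel by how well its colour
# matches the hue-saturation histogram of the tracked object.
import cv2
import numpy as np

roi = cv2.imread('target.png')       # image of the object to track
frame = cv2.imread('scene.png')      # frame in which to locate it

hsv_roi = cv2.cvtColor(roi, cv2.COLOR_BGR2HSV)
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)

# hue-saturation histogram of the object, normalized to [0, 255]
hist = cv2.calcHist([hsv_roi], [0, 1], None, [30, 32], [0, 180, 0, 256])
cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)

# probability map usable as the Camshift input
backproj = cv2.calcBackProject([hsv], [0, 1], hist, [0, 180, 0, 256], scale=1)
```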

  7. Solar Radiation Pressure Binning for the Geosynchronous Orbit

    Science.gov (United States)

    Hejduk, M. D.; Ghrist, R. W.

    2011-01-01

    Orbital maintenance parameters for individual satellites or groups of satellites have traditionally been set by examining orbital parameters alone, such as through apogee and perigee height binning; this approach ignored the other factors that governed an individual satellite's susceptibility to non-conservative forces. In the atmospheric drag regime, this problem has been addressed by the introduction of the "energy dissipation rate," a quantity that represents the amount of energy being removed from the orbit; such an approach is able to consider both atmospheric density and satellite frontal area characteristics and thus serve as a mechanism for binning satellites of similar behavior. The geo-synchronous orbit (of broader definition than the geostationary orbit -- here taken to be from 1300 to 1800 minutes in orbital period) is not affected by drag; rather, its principal non-conservative force is that of solar radiation pressure -- the momentum imparted to the satellite by solar radiometric energy. While this perturbation is solved for as part of the orbit determination update, no binning or division scheme, analogous to the drag regime, has been developed for the geo-synchronous orbit. The present analysis has begun such an effort by examining the behavior of geosynchronous rocket bodies and non-stabilized payloads as a function of solar radiation pressure susceptibility. A preliminary examination of binning techniques used in the drag regime gives initial guidance regarding the criteria for useful bin divisions. Applying these criteria to the object type, solar radiation pressure, and resultant state vector accuracy for the analyzed dataset, a single division of "large" satellites into two bins for the purposes of setting related sensor tasking and orbit determination (OD) controls is suggested. When an accompanying analysis of high area-to-mass objects is complete, a full set of binning recommendations for the geosynchronous orbit will be available.

  8. Wildfire Detection using a Multi-Dimensional Histogram in Boreal Forest

    Science.gov (United States)

    Honda, K.; Kimura, K.; Honma, T.

    2008-12-01

    Early detection of wildfires is an issue for reducing damage to the environment and humans. There have been some attempts to detect wildfires using satellite imagery, mainly classified into three methods: the Dozier method (1981-), threshold methods (1986-) and contextual methods (1994-). However, the accuracy of these methods is not sufficient: the detected results include commission and omission errors. In addition, it is not easy to analyze satellite imagery with high accuracy because of insufficient ground truth data. Kudoh and Hosoi (2003) developed a detection method using a three-dimensional (3D) histogram built from past fire data with NOAA-AVHRR imagery, but their method is impractical because it depends on handwork to pick up past fire data from huge datasets. Therefore, the purpose of this study is to collect fire points as hot spots efficiently from satellite imagery and to improve the method for detecting wildfires with the collected data. As our method, we collect past fire data from the Alaska Fire History data obtained by the Alaska Fire Service (AFS). We select points that are expected to be wildfires and pick up the points inside the fire areas of the AFS data. Next, we make a 3D histogram from the past fire data; in this study, we use Bands 1, 21 and 32 of MODIS. We then calculate the likelihood of wildfire with the three-dimensional histogram. As our result, we select wildfires effectively with the 3D histogram: we can detect a toroidally spreading wildfire, which is evidence of good wildfire detection. However, areas surrounding glaciers tend to show elevated brightness temperatures, producing false alarms; burnt areas and bare ground are also sometimes indicated as false alarms, so the method needs improvement. Additionally, we are trying various combinations of MODIS bands as a better method to detect wildfires effectively. So as to adjust our method to other areas, we are applying our method to tropical
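
    A schematic sketch of the 3D-histogram scoring idea (the band value ranges and bin counts below are illustrative assumptions, not the authors' settings):

```python
# Bin known fire pixels in a three-band space, then score new pixels by the
# fire density of the histogram bin they fall into.
import numpy as np

def fire_histogram(fire_pixels, bins=32,
                   value_range=((0, 1), (250, 350), (250, 350))):
    """fire_pixels: (N, 3) array of [band1, band21, band32] for known fires."""
    h, edges = np.histogramdd(fire_pixels, bins=(bins,) * 3, range=value_range)
    return h / h.sum(), edges

def fire_likelihood(pixels, h, edges):
    """pixels: (M, 3) array. Returns a per-pixel fire-likeness score."""
    idx = [np.clip(np.searchsorted(e, pixels[:, i], side='right') - 1,
                   0, h.shape[i] - 1) for i, e in enumerate(edges)]
    return h[idx[0], idx[1], idx[2]]     # higher = more fire-like
```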

  9. Multi-dimensional Bin Packing Problems with Guillotine Constraints

    DEFF Research Database (Denmark)

    Amossen, Rasmus Resen; Pisinger, David

    2010-01-01

    The problem addressed in this paper is the decision problem of determining if a set of multi-dimensional rectangular boxes can be orthogonally packed into a rectangular bin while satisfying the requirement that the packing should be guillotine cuttable. That is, there should exist a series of face-parallel straight cuts that can recursively cut the bin into pieces so that each piece contains a box and no box has been intersected by a cut. The unrestricted problem is known to be NP-hard. In this paper we present a generalization of a constructive algorithm for the multi-dimensional bin packing problem, with and without the guillotine constraint, based on constraint programming.

  10. Grasp Densities for Grasp Refinement in Industrial Bin Picking

    DEFF Research Database (Denmark)

    Hupfauf, Benedikt; Hahn, Heiko; Bodenhagen, Leon

    ... generated in industrial bin-picking for grasp learning. This aim is achieved by using the novel concept of grasp densities (Detry et al., 2010). Grasp densities can describe the full variety of grasps that apply to specific objects using specific grippers. They represent the likelihood of grasp success in terms of object-relative gripper pose, can be learned from empirical experience, and allow the automatic choice of optimal grasps in a given scene context (object pose, workspace constraints, etc.). We will show grasp densities extracted from empirical data in a real industrial bin picking context.

  11. Histogram Equalization to Model Adaptation for Robust Speech Recognition

    Directory of Open Access Journals (Sweden)

    Suh Youngjoo

    2010-01-01

    Full Text Available We propose a new model adaptation method based on the histogram equalization technique for providing robustness in noisy environments. The trained acoustic mean models of a speech recognizer are adapted into environmentally matched conditions by using the histogram equalization algorithm on a single utterance basis. For more robust speech recognition in the heavily noisy conditions, trained acoustic covariance models are efficiently adapted by the signal-to-noise ratio-dependent linear interpolation between trained covariance models and utterance-level sample covariance models. Speech recognition experiments on both the digit-based Aurora2 task and the large vocabulary-based task showed that the proposed model adaptation approach provides significant performance improvements compared to the baseline speech recognizer trained on the clean speech data.

  12. Adaptive image contrast enhancement using generalizations of histogram equalization.

    Science.gov (United States)

    Stark, J A

    2000-01-01

    This paper proposes a scheme for adaptive image-contrast enhancement based on a generalization of histogram equalization (HE). HE is a useful technique for improving image contrast, but its effect is too severe for many purposes. However, dramatically different results can be obtained with relatively minor modifications. A concise description of adaptive HE is set out, and this framework is used in a discussion of past suggestions for variations on HE. A key feature of this formalism is a "cumulation function," which is used to generate a grey level mapping from the local histogram. By choosing alternative forms of cumulation function one can achieve a wide variety of effects. A specific form is proposed. Through the variation of one or two parameters, the resulting process can produce a range of degrees of contrast enhancement, at one extreme leaving the image unchanged, at another yielding full adaptive equalization.

  13. Histogram Equalization to Model Adaptation for Robust Speech Recognition

    Science.gov (United States)

    Suh, Youngjoo; Kim, Hoirin

    2010-12-01

    We propose a new model adaptation method based on the histogram equalization technique for providing robustness in noisy environments. The trained acoustic mean models of a speech recognizer are adapted into environmentally matched conditions by using the histogram equalization algorithm on a single utterance basis. For more robust speech recognition in the heavily noisy conditions, trained acoustic covariance models are efficiently adapted by the signal-to-noise ratio-dependent linear interpolation between trained covariance models and utterance-level sample covariance models. Speech recognition experiments on both the digit-based Aurora2 task and the large vocabulary-based task showed that the proposed model adaptation approach provides significant performance improvements compared to the baseline speech recognizer trained on the clean speech data.

  14. Multifractal analysis of three-dimensional histogram from color images

    International Nuclear Information System (INIS)

    Chauveau, Julien; Rousseau, David; Richard, Paul; Chapeau-Blondeau, Francois

    2010-01-01

    Natural images, especially color or multicomponent images, are complex information-carrying signals. To contribute to the characterization of this complexity, we investigate the possibility of multiscale organization in the colorimetric structure of natural images. This is realized by means of a multifractal analysis applied to the three-dimensional histogram of natural color images. The observed behaviors are compared with those of reference models with known multifractal properties. We use for this purpose synthetic random images with trivial monofractal behavior, and multidimensional multiplicative cascades known for their actual multifractal behavior. The behaviors observed on natural images exhibit similarities with those of the multifractal multiplicative cascades and display the signature of elaborate multiscale organizations stemming from the histograms of natural color images. This type of characterization of colorimetric properties can be helpful for various tasks of digital image processing, for instance modeling, classification and indexing.

  15. Histogram analysis for smartphone-based rapid hematocrit determination

    Science.gov (United States)

    Jalal, Uddin M.; Kim, Sang C.; Shim, Joon S.

    2017-01-01

    A novel and rapid analysis technique using histograms has been proposed for the colorimetric quantification of blood hematocrits. A smartphone-based “Histogram” app for the detection of hematocrits has been developed, integrating the smartphone-embedded camera with a microfluidic chip via a custom-made optical platform. The developed histogram analysis shows its effectiveness in the automatic detection of the sample channel, including auto-calibration, and can analyze single-channel as well as multi-channel images. Furthermore, the analysis method is advantageous for quantifying blood hematocrit under both uniform and varying optical conditions. The rapid determination of blood hematocrits carries a wealth of information regarding physiological disorders, and the use of such reproducible, cost-effective and standard techniques may effectively help with the diagnosis and prevention of a number of human diseases. PMID:28717569

  16. Independent histogram pursuit for segmentation of skin lesions

    DEFF Research Database (Denmark)

    Gomez, D.D.; Butakoff, C.; Ersbøll, Bjarne Kjær

    2008-01-01

    In this paper, an unsupervised algorithm, called Independent Histogram Pursuit (IHP), for segmenting dermatological lesions is proposed. The algorithm estimates a set of linear combinations of image bands that enhance different structures embedded in the image. In particular, the first estima... to deal with different types of dermatological lesions. The boundary detection precision using k-means segmentation was close to 97%. The proposed algorithm can be easily combined with the majority of classification algorithms.

  17. Wood Species Recognition Based on SIFT Keypoint Histogram

    OpenAIRE

    Hu, Shuaiqi; Li, Ke; Bao, Xudong

    2015-01-01

    Traditionally, only experts who are equipped with professional knowledge and rich experience are able to recognize different species of wood. Applying image processing techniques to wood species recognition can not only reduce the expense of training qualified identifiers, but also increase the recognition accuracy. In this paper, a wood species recognition technique based on Scale Invariant Feature Transformation (SIFT) keypoint histograms is proposed. We first use the SIFT algorithm to extract ...

  18. Color and Contrast Enhancement by Controlled Piecewise Affine Histogram Equalization

    Directory of Open Access Journals (Sweden)

    Jose-Luis Lisani

    2012-10-01

    Full Text Available This paper presents a simple contrast enhancement algorithm based on histogram equalization (HE). The proposed algorithm performs a piecewise affine transform of the intensity levels of a digital image such that the new cumulative distribution function will be approximately uniform (as with HE), but where the stretching of the range is locally controlled to avoid brutal noise enhancement. We call this algorithm Piecewise Affine Equalization (PAE). Several experiments show that, in general, the new algorithm improves HE results.

  19. Flood detection/monitoring using adjustable histogram equalization technique.

    Science.gov (United States)

    Nazir, Fakhera; Riaz, Muhammad Mohsin; Ghafoor, Abdul; Arif, Fahim

    2014-01-01

    A flood monitoring technique using adjustable histogram equalization is proposed. The technique overcomes the limitations (over-enhancement, artifacts, and unnatural look) of the existing technique by adjusting the contrast of images. The proposed technique takes pre- and post-event images and applies different processing steps to generate a flood map without user interaction. The resultant flood maps can be used for flood monitoring and detection. Simulation results show that the proposed technique provides better output quality compared to the state-of-the-art existing technique.

  20. [A fast iterative algorithm for adaptive histogram equalization].

    Science.gov (United States)

    Cao, X; Liu, X; Deng, Z; Jiang, D; Zheng, C

    1997-01-01

    In this paper, we propose an iterative algorithm called FAHE, which is based on the correlation between the current local histogram and the one before the sliding window moves. Compared with basic AHE, the computing time of FAHE is decreased from 5 hours to 4 minutes on a 486dx/33-compatible computer, when using a 65 x 65 sliding window for a 512 x 512 image with an 8-bit gray-level range.
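
    The speed-up comes from updating the local histogram incrementally as the window slides, rather than recounting the whole window; a sketch for a single image row (window size and border handling are simplified assumptions):

```python
# Sliding-window adaptive HE with an incrementally updated local histogram.
import numpy as np

def ahe_row(img, y, half=32):
    """Equalize row y of a 2-D uint8 image using (2*half+1)-wide local windows."""
    h, w = img.shape
    y0, y1 = max(0, y - half), min(h, y + half + 1)
    # histogram of the window centred at x = 0 (clipped at the border)
    hist = np.bincount(img[y0:y1, 0:half + 1].ravel(), minlength=256)
    out = np.empty(w, dtype=np.uint8)
    for x in range(w):
        if x > 0:
            left, right = x - half - 1, x + half
            if left >= 0:                     # column leaving the window
                np.subtract.at(hist, img[y0:y1, left], 1)
            if right < w:                     # column entering the window
                np.add.at(hist, img[y0:y1, right], 1)
        cdf = np.cumsum(hist)
        out[x] = np.round(255 * cdf[img[y, x]] / cdf[-1])
    return out
```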

  1. Breast density pattern characterization by histogram features and texture descriptors

    OpenAIRE

    Carneiro, Pedro Cunha; Franco, Marcelo Lemos Nunes; Thomaz, Ricardo de Lima; Patrocinio, Ana Claudia

    2017-01-01

    Abstract Introduction Breast cancer is the leading cause of death among women in Brazil as well as in most countries in the world. Due to the relation between breast density and the risk of breast cancer, in medical practice the breast density classification is merely visual and dependent on professional experience, making this task very subjective. The purpose of this paper is to investigate image features based on histograms and Haralick texture descriptors so as to separate mammo...

  2. Contrast Enhancement Algorithm Based on Gap Adjustment for Histogram Equalization

    Directory of Open Access Journals (Sweden)

    Chung-Cheng Chiu

    2016-06-01

    Full Text Available Image enhancement methods have been widely used to improve the visual effects of images. Owing to its simplicity and effectiveness, histogram equalization (HE) is one of the methods used for enhancing image contrast. However, HE may result in over-enhancement and feature loss problems that lead to an unnatural look and loss of details in the processed images. Researchers have proposed various HE-based methods to solve the over-enhancement problem; however, they have largely ignored the feature loss problem. Therefore, a contrast enhancement algorithm based on gap adjustment for histogram equalization (CegaHE) is proposed. It refers to a visual contrast enhancement algorithm based on histogram equalization (VCEA), which generates visually pleasing enhanced images, and improves the enhancement effects of VCEA. CegaHE adjusts the gaps between two gray values based on the adjustment equation, which takes the properties of human visual perception into consideration, to solve the over-enhancement problem. Besides, it also alleviates the feature loss problem and further enhances the textures in the dark regions of the images to improve the quality of the processed images for human visual perception. Experimental results demonstrate that CegaHE is a reliable method for contrast enhancement and that it significantly outperforms VCEA and other methods.

  3. Contrast Enhancement Algorithm Based on Gap Adjustment for Histogram Equalization.

    Science.gov (United States)

    Chiu, Chung-Cheng; Ting, Chih-Chung

    2016-06-22

    Image enhancement methods have been widely used to improve the visual effects of images. Owing to its simplicity and effectiveness histogram equalization (HE) is one of the methods used for enhancing image contrast. However, HE may result in over-enhancement and feature loss problems that lead to unnatural look and loss of details in the processed images. Researchers have proposed various HE-based methods to solve the over-enhancement problem; however, they have largely ignored the feature loss problem. Therefore, a contrast enhancement algorithm based on gap adjustment for histogram equalization (CegaHE) is proposed. It refers to a visual contrast enhancement algorithm based on histogram equalization (VCEA), which generates visually pleasing enhanced images, and improves the enhancement effects of VCEA. CegaHE adjusts the gaps between two gray values based on the adjustment equation, which takes the properties of human visual perception into consideration, to solve the over-enhancement problem. Besides, it also alleviates the feature loss problem and further enhances the textures in the dark regions of the images to improve the quality of the processed images for human visual perception. Experimental results demonstrate that CegaHE is a reliable method for contrast enhancement and that it significantly outperforms VCEA and other methods.

  4. Texture enhanced histogram equalization using TV-L¹ image decomposition.

    Science.gov (United States)

    Ghita, Ovidiu; Ilea, Dana E; Whelan, Paul F

    2013-08-01

    Histogram transformation defines a class of image processing operations that are widely applied in the implementation of data normalization algorithms. In this paper, we present a new variational approach for image enhancement that is constructed to alleviate the intensity saturation effects introduced by standard contrast enhancement (CE) methods based on histogram equalization. We initially apply total variation (TV) minimization with an L¹ fidelity term to decompose the input image into cartoon and texture components. Contrary to previous papers that rely solely on the information encompassed in the intensity distribution, the texture information is also employed here to emphasize the contribution of the local textural features in the CE process. This is achieved by implementing a nonlinear histogram warping CE strategy that is able to maximize the information content in the transformed image. Our experimental study addresses the CE of a wide variety of image data, and comparative evaluations are provided to illustrate that our method produces better results than conventional CE strategies.
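
    As a loose approximation of the decomposition step, scikit-image's Chambolle TV denoiser (a TV-L² model, not the paper's TV-L¹) can be used to make the cartoon-plus-texture split concrete:

```python
# Structure-texture decomposition: the TV-denoised image is the piecewise
# smooth "cartoon"; the residual carries the oscillatory "texture".
import numpy as np
from skimage.restoration import denoise_tv_chambolle

def cartoon_texture(img, weight=0.1):
    """img: 2-D uint8 array. Returns (cartoon, texture) as float arrays."""
    g = img.astype(float) / 255.0
    cartoon = denoise_tv_chambolle(g, weight=weight)  # piecewise-smooth part
    texture = g - cartoon                             # oscillatory residual
    return cartoon, texture
```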

  5. Breast density pattern characterization by histogram features and texture descriptors

    Directory of Open Access Journals (Sweden)

    Pedro Cunha Carneiro

    2017-04-01

    Full Text Available Abstract Introduction Breast cancer is the leading cause of death among women in Brazil as well as in most countries in the world. Due to the relation between breast density and the risk of breast cancer, in medical practice the breast density classification is merely visual and dependent on professional experience, making this task very subjective. The purpose of this paper is to investigate image features based on histograms and Haralick texture descriptors so as to separate mammographic images into categories of breast density using an Artificial Neural Network. Methods We used 307 mammographic images from the INbreast digital database, extracting histogram features and texture descriptors from all mammograms and selecting them with the K-means technique. Then, these groups of selected features were used as inputs to an Artificial Neural Network to classify the images automatically into the four categories reported by radiologists. Results An average accuracy of 92.9% was obtained in a few tests using only some of the Haralick texture descriptors. Also, the accuracy rate increased to 98.95% when texture descriptors were mixed with some histogram-based features. Conclusion Texture descriptors have proven to be better than gray-level features at differentiating breast densities in mammographic images. From this paper, it was possible to automate the feature selection and the classification with acceptable error rates, since the extraction of the features is suited to the characteristics of the images involved in the problem.
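
    A hedged sketch of the kind of feature extraction described, combining simple histogram statistics with a few Haralick-style descriptors from a grey-level co-occurrence matrix (the exact feature set and parameters are assumptions, not the authors' configuration):

```python
# Histogram statistics + GLCM (Haralick-style) descriptors for one region.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def density_features(img):
    """img: 2-D uint8 mammogram region. Returns a small feature vector."""
    hist = np.bincount(img.ravel(), minlength=256) / img.size
    levels = np.arange(256)
    mean = (hist * levels).sum()
    std = np.sqrt((hist * (levels - mean) ** 2).sum())
    glcm = graycomatrix(img, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    haralick = [graycoprops(glcm, p).mean()
                for p in ('contrast', 'homogeneity', 'energy', 'correlation')]
    return np.array([mean, std] + haralick)
```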

  6. Slope histogram distribution-based parametrisation of Martian geomorphic features

    Science.gov (United States)

    Balint, Zita; Székely, Balázs; Kovács, Gábor

    2014-05-01

    The application of geomorphometric methods to the large Martian digital topographic datasets paves the way to analysing Martian areomorphic processes in more detail. One such method is the analysis of local slope distributions. For this purpose a visualization program was developed that calculates local slope histograms and compares them based on a Kolmogorov distance criterion. As input data we used digital terrain models (DTMs) derived from HRSC high-resolution stereo camera images of various Martian regions. The Kolmogorov-criterion-based discrimination produces classes of slope histograms that are displayed using colouring, obtaining an image map in which the distributions can be visualized by the different colours representing the various classes. Our goal is to create a local slope histogram based classification for large Martian areas in order to obtain information about the general morphological characteristics of a region. This is a contribution of the TMIS.ascrea project, financed by the Austrian Research Promotion Agency (FFG). The present research was partly realized in the framework of the TÁMOP 4.2.4.A/2-11-1-2012-0001 high-priority "National Excellence Program - Elaborating and Operating an Inland Student and Researcher Personal Support System" convergence program project's scholarship support, using Hungarian state and European Union funds and co-financing from the European Social Fund.
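
    The comparison criterion itself is simple: the Kolmogorov distance between two slope histograms is the largest absolute difference between their cumulative distributions. A minimal sketch:

```python
# Kolmogorov distance between two histograms over the same bin edges.
import numpy as np

def kolmogorov_distance(hist_a, hist_b):
    """hist_a, hist_b: 1-D count arrays with identical binning."""
    cdf_a = np.cumsum(hist_a) / np.sum(hist_a)
    cdf_b = np.cumsum(hist_b) / np.sum(hist_b)
    return float(np.max(np.abs(cdf_a - cdf_b)))
```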

  7. Proposed first-generation WSQ bit allocation procedure

    Energy Technology Data Exchange (ETDEWEB)

    Bradley, J.N.; Brislawn, C.M.

    1993-09-08

    The Wavelet/Scalar Quantization (WSQ) gray-scale fingerprint image compression algorithm involves a symmetric wavelet transform (SWT) image decomposition followed by uniform scalar quantization of each subband. The algorithm is adaptive insofar as the bin widths for the scalar quantizers are image-specific and are included in the compressed image format. Since the decoder requires only the actual bin width values -- but not the method by which they were computed -- the standard allows for future refinements of the WSQ algorithm by improving the method used to select the scalar quantizer bin widths. This report proposes a bit allocation procedure for use with the first-generation WSQ encoder. In previous work a specific formula was provided for the relative sizes of the scalar quantizer bin widths in terms of the variances of the SWT subbands; an explicit specification for the constant of proportionality, q, that determines the absolute bin widths was not given. The actual compression ratio produced by the WSQ algorithm will generally vary from image to image depending on the amount of coding gain obtained by the run-length and Huffman coding stages of the algorithm, but testing performed by the FBI established that WSQ compression produces archival quality images at compression ratios of around 20 to 1. The bit allocation procedure described in this report possesses a control parameter, r, that can be set by the user to achieve a predetermined amount of lossy compression, effectively giving the user control over the amount of distortion introduced by quantization noise. The variability observed in final compression ratios is thus due only to differences in lossless coding gain from image to image, chiefly a result of the varying amounts of blank background surrounding the print area in the images. Experimental results are presented that demonstrate the proposed method's effectiveness.
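
    For illustration, a generic dead-zone uniform scalar quantizer parameterized by bin width q, of the kind applied per subband (the report's actual bin-width formula and zero-bin convention are not reproduced here):

```python
# Dead-zone uniform scalar quantizer: a (possibly wider) zero bin of width z,
# and uniform bins of width q elsewhere; dequantization returns bin centres.
import numpy as np

def quantize(subband, q, z=None):
    z = q if z is None else z
    s = np.sign(subband)
    mag = np.abs(subband)
    idx = np.where(mag <= z / 2, 0, np.floor((mag - z / 2) / q) + 1)
    return (s * idx).astype(int)

def dequantize(idx, q, z=None):
    z = q if z is None else z
    return np.sign(idx) * np.where(idx == 0, 0.0,
                                   z / 2 + (np.abs(idx) - 0.5) * q)
```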

  8. Rumsfeld: Osama bin Laden had no ties to Saddam / Heiki Suurkask

    Index Scriptorium Estoniae

    Suurkask, Heiki, 1972-

    2004-01-01

    The claim that there was close cooperation between Saddam Hussein and bin Laden has proven to be false. According to the author, Richard Cheney, Donald Rumsfeld and the current US Deputy Secretary of Defense Paul Wolfowitz were already exploring possibilities for a new war against Iraq in the mid-1990s. Addendum: Bremer blames the White House

  9. Binning sequences using very sparse labels within a metagenome

    Directory of Open Access Journals (Sweden)

    Halgamuge Saman K

    2008-04-01

    Full Text Available Abstract Background In metagenomic studies, a process called binning is necessary to assign contigs that belong to multiple species to their respective phylogenetic groups. Most of the current methods of binning, such as BLAST, k-mer and PhyloPythia, involve assigning sequence fragments by comparing sequence similarity or sequence composition with already-sequenced genomes that are still far from comprehensive. We propose a semi-supervised seeding method for binning that does not depend on knowledge of completed genomes. Instead, it extracts the flanking sequences of highly conserved 16S rRNA from the metagenome and uses them as seeds (labels to assign other reads based on their compositional similarity. Results The proposed seeding method is implemented on an unsupervised Growing Self-Organising Map (GSOM, and called Seeded GSOM (S-GSOM. We compared it with four well-known semi-supervised learning methods in a preliminary test, separating random-length prokaryotic sequence fragments sampled from the NCBI genome database. We identified the flanking sequences of the highly conserved 16S rRNA as suitable seeds that could be used to group the sequence fragments according to their species. S-GSOM showed superior performance compared to the semi-supervised methods tested. Additionally, S-GSOM may also be used to visually identify some species that do not have seeds. The proposed method was then applied to simulated metagenomic datasets using two different confidence threshold settings and compared with PhyloPythia, k-mer and BLAST. At the reference taxonomic level Order, S-GSOM outperformed all k-mer and BLAST results and showed comparable results with PhyloPythia for each of the corresponding confidence settings, where S-GSOM performed better than PhyloPythia in the ≥ 10 reads datasets and comparable in the ≥ 8 kb benchmark tests. Conclusion In the task of binning using semi-supervised learning methods, results indicate S-GSOM to be the best of

  10. Decomposed multi-objective bin-packing for virtual machine consolidation

    Directory of Open Access Journals (Sweden)

    Eli M. Dow

    2016-02-01

    Full Text Available In this paper, we describe a novel solution to the problem of virtual machine (VM) consolidation, otherwise known as VM packing, as applicable to Infrastructure-as-a-Service cloud data centers. Our solution relies on the observation that virtual machines are not infinitely variable in resource consumption: cloud compute providers generally offer them in fixed resource allocations. Effectively, this makes all VMs of a given allocation type (or instance type) interchangeable for the purposes of consolidation, from a cloud compute provider's viewpoint. The main contribution of this work is to demonstrate the advantages of deconstructing the VM consolidation problem into a two-step process of multidimensional bin packing. The first step determines the optimal, but abstract, solution composed of finite groups of equivalent VMs that should reside on each host. The second step selects concrete VMs from the managed compute pool to satisfy the optimal abstract solution while enforcing anti-colocation and preferential colocation of the virtual machines through VM contracts. We demonstrate high-performance, deterministic generation of packing solutions, with over 7,500 VMs packed in under 2 min. We demonstrate runtimes comparable to other VM management solutions published in the literature, allowing favorable extrapolation of prior work in the field to the larger VM-management problem sizes our solution scales to.
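
    A minimal sketch of the two-step idea under the assumption of fixed instance types; the instance-type table, host capacity and function names below are made up for illustration, and the second (concrete-VM substitution) step is only indicated in a comment:

```python
from collections import Counter

# Hypothetical instance types: (cpu, ram_gb) footprint per VM of that type.
INSTANCE_TYPES = {"small": (1, 2), "medium": (2, 4), "large": (4, 8)}
HOST_CAPACITY = (16, 32)

def pack_abstract(vm_counts):
    # Step 1: pack *counts* of interchangeable VMs (first-fit decreasing
    # on resource footprint), returning one type-multiset per host.
    items = []
    for t, n in vm_counts.items():
        items += [t] * n
    items.sort(key=lambda t: INSTANCE_TYPES[t], reverse=True)
    hosts = []  # each host: [used_cpu, used_ram, Counter of types]
    for t in items:
        cpu, ram = INSTANCE_TYPES[t]
        for h in hosts:
            if h[0] + cpu <= HOST_CAPACITY[0] and h[1] + ram <= HOST_CAPACITY[1]:
                h[0] += cpu
                h[1] += ram
                h[2][t] += 1
                break
        else:
            hosts.append([cpu, ram, Counter({t: 1})])
    return [h[2] for h in hosts]

# Step 2 (not shown): substitute concrete VM IDs into each abstract slot,
# honouring anti-colocation / preferential-colocation contracts.
print(pack_abstract({"small": 5, "medium": 3, "large": 2}))
```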

  11. Membership function modification of fuzzy logic controllers with histogram equalization.

    Science.gov (United States)

    Zhuang, H; Wu, X

    2001-01-01

    In most fuzzy logic controllers (FLCs), initial membership functions (MFs) are normally laid evenly all across the universes of discourse (UD) that represent fuzzy control inputs. However, for evenly distributed MFs, there exists a potential problem that may adversely affect the control performance; that is, if the actual inputs are not equally distributed, but instead concentrate within a certain interval that is only part of the entire input area, this will result in two negative effects. On one hand, the MFs staying in the dense-input area will not be sufficient to react precisely to the inputs, because these inputs are too close to each other compared to the MFs in this area. The same fuzzy control output could be triggered for several different inputs. On the other hand, some of the MFs assigned for the sparse-input area are "wasted". In this paper we argue that, if we arrange the placement of these MFs according to a statistical study of feedback errors in a closed-loop system, we can expect a better control performance. To this end, we introduce a new mechanism to modify the evenly distributed MFs with the help of a technique termed histogram equalization. The histogram of the errors is actually the spatial distribution of real-time errors of the control system. To illustrate the proposed MF modification approach, a computer simulation of a simple system that has a known mathematical model is first analyzed, leading to our understanding of how this histogram-based modification mechanism functions. We then apply this method to an experimental laser tracking system to demonstrate that in real-world applications, a better control performance can be obtained by using this proposed technique.
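
    The core of the idea can be sketched as placing membership-function centers at equally spaced quantiles of the observed errors, which is what equalizing the error histogram amounts to; this is an illustrative reading, not the authors' exact procedure:

```python
import numpy as np

def mf_centers_from_errors(errors, n_mfs):
    # Place MF centers at equally spaced quantiles of the error
    # distribution: dense input regions receive dense MFs.
    qs = np.linspace(0.0, 1.0, n_mfs)
    return np.quantile(errors, qs)

# Errors concentrated near zero -> centers cluster there too.
rng = np.random.default_rng(1)
errors = rng.normal(0.0, 0.2, 5000)
print(np.round(mf_centers_from_errors(errors, 7), 3))
```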

  12. Adaptive histogram equalization in digital radiography of destructive skeletal lesions.

    Science.gov (United States)

    Braunstein, E M; Capek, P; Buckwalter, K; Bland, P; Meyer, C R

    1988-03-01

    Adaptive histogram equalization, an image-processing technique that distributes pixel values of an image uniformly throughout the gray scale, was applied to 28 plain radiographs of bone lesions, after they had been digitized. The non-equalized and equalized digital images were compared by two skeletal radiologists with respect to lesion margins, internal matrix, soft-tissue mass, cortical breakthrough, and periosteal reaction. Receiver operating characteristic (ROC) curves were constructed on the basis of the responses. Equalized images were superior to nonequalized images in determination of cortical breakthrough and presence or absence of periosteal reaction. ROC analysis showed no significant difference in determination of margins, matrix, or soft-tissue masses.

  13. High capacity, high speed histogramming data acquisition memory

    Energy Technology Data Exchange (ETDEWEB)

    Epstein, A.; Boulin, C. [European Molecular Biology Lab., Heidelberg (Germany). Cell Biophysics Programme

    1996-02-01

    A double width CAMAC DRAM store module was developed for use as a histogramming memory in fast time-resolved synchrotron radiation applications to molecular biology. High speed direct memory modify (3 MHz) is accomplished by using a discrete DRAM controller and fast page mode access. The module can be configured using standard SIMMs to sizes of up to 64M-words. The word width is 16 bit and the module can handle overflows by storing the overflow addresses in a dedicated FIFO. Simultaneous front panel DMM/DMI access and CAMAC readout of the overflow addresses is supported.

  14. Text-Independent Speaker Identification Using the Histogram Transform Model

    DEFF Research Database (Denmark)

    Ma, Zhanyu; Yu, Hong; Tan, Zheng-Hua

    2016-01-01

    In this paper, we propose a novel probabilistic method for the task of text-independent speaker identification (SI). In order to capture dynamic information during SI, we design super-MFCC features by cascading three neighboring Mel-frequency cepstral coefficient (MFCC) frames together.... These super-MFCC vectors are utilized for probabilistic model training such that the speaker's characteristics can be sufficiently captured. The probability density function (PDF) of the aforementioned super-MFCC features is estimated by the recently proposed histogram transform (HT) method....

  15. WASP (Write a Scientific Paper) using Excel - 4: Histograms.

    Science.gov (United States)

    Grech, Victor

    2018-01-12

    Plotting data into graphs is a crucial step in data analysis as part of an initial descriptive statistics exercise since it gives the researcher an overview of the shape and nature of the data. Outlier values may also be identified, and these may be incorrect data, or true and important outliers. This paper explains how to access Microsoft Excel's Analysis Toolpak and provides some pointers for the utilisation of the histogram tool within the Toolpak.

  16. Benchmarking motion planning algorithms for bin-picking applications

    DEFF Research Database (Denmark)

    Iversen, Thomas Fridolin; Ellekilde, Lars-Peter

    2017-01-01

    ... planning algorithms to identify which are most suited in the given context. Design/methodology/approach: The paper presents a selection of motion planning algorithms and defines benchmarks based on three different bin-picking scenarios. The evaluation is done on a fixed set of tasks, which are planned and executed on a real and a simulated robot. Findings: The benchmarking shows a clear difference between the planners and generally indicates that algorithms integrating optimization, despite longer planning times, perform better due to faster execution. Originality/value: The originality of this work lies in the selected set of planners and the specific choice of application. Most new planners are only compared to existing methods for specific applications chosen to demonstrate their advantages; with the specifics of another application, such as bin picking, it is not obvious which planner to choose.

  17. Vision guided robot bin picking of cylindrical objects

    DEFF Research Database (Denmark)

    Christensen, Georg Kronborg; Dyhr-Nielsen, Carsten

    1997-01-01

    In order to achieve increased flexibility on robotic production lines, an investigation of the robot bin-picking problem is presented. In the paper, the limitations of previous attempts to solve the problem are pointed out and a set of innovative methods is presented. The main elements h...-pixel precision methods to make a working system. Actual tests are presented using an ASEA 2000 robot.

  18. Osama bin Laden may be trapped in a new encirclement / Heiki Suurkask

    Index Scriptorium Estoniae

    Suurkask, Heiki, 1972-

    2004-01-01

    According to the British press, bin Laden's location has been pinpointed. On the manhunt for bin Laden and for Mir Aimal Kasi, the Pakistani who killed two US intelligence agents in front of the CIA headquarters in 1993. See also: The Taliban gathers new strength

  19. Killing Osama bin Laden would have unleashed a nuclear inferno / Kaivo Kopli

    Index Scriptorium Estoniae

    Kopli, Kaivo

    2011-01-01

    Notes from the interrogations of Guantánamo prisoners have been made public through Wikileaks. According to one al-Qaida commander, the terrorists have a nuclear bomb hidden somewhere in Europe that would have been detonated had Osama bin Laden been captured or killed. It turns out that about 150 of the Guantánamo detainees were completely innocent

  1. Visual Contrast Enhancement Algorithm Based on Histogram Equalization.

    Science.gov (United States)

    Ting, Chih-Chung; Wu, Bing-Fei; Chung, Meng-Liang; Chiu, Chung-Cheng; Wu, Ya-Ching

    2015-07-13

    Image enhancement techniques primarily improve the contrast of an image to lend it a better appearance. One of the popular enhancement methods is histogram equalization (HE) because of its simplicity and effectiveness. However, it is rarely applied to consumer electronics products because it can cause excessive contrast enhancement and feature loss problems. These problems make the images processed by HE look unnatural and introduce unwanted artifacts in them. In this study, a visual contrast enhancement algorithm (VCEA) based on HE is proposed. VCEA considers the requirements of the human visual perception in order to address the drawbacks of HE. It effectively solves the excessive contrast enhancement problem by adjusting the spaces between two adjacent gray values of the HE histogram. In addition, VCEA reduces the effects of the feature loss problem by using the obtained spaces. Furthermore, VCEA enhances the detailed textures of an image to generate an enhanced image with better visual quality. Experimental results show that images obtained by applying VCEA have higher contrast and are more suited to human visual perception than those processed by HE and other HE-based methods.
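
    For reference, the plain histogram equalization that VCEA modifies can be sketched as a lookup table built from the normalized cumulative histogram; this is a textbook sketch of HE itself, not of the VCEA spacing adjustment:

```python
import numpy as np

def histogram_equalize(img):
    # Classic global HE for an 8-bit grayscale image: map each gray
    # level through the normalized cumulative distribution function.
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = np.cumsum(hist).astype(float)
    cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())  # normalize to [0, 1]
    lut = np.round(255 * cdf).astype(np.uint8)
    return lut[img]

rng = np.random.default_rng(2)
low_contrast = rng.integers(90, 140, size=(64, 64), dtype=np.uint8)
eq = histogram_equalize(low_contrast)
print(low_contrast.min(), low_contrast.max(), "->", eq.min(), eq.max())
```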

  2. Visual Contrast Enhancement Algorithm Based on Histogram Equalization

    Directory of Open Access Journals (Sweden)

    Chih-Chung Ting

    2015-07-01

    Full Text Available Image enhancement techniques primarily improve the contrast of an image to lend it a better appearance. One of the popular enhancement methods is histogram equalization (HE) because of its simplicity and effectiveness. However, it is rarely applied to consumer electronics products because it can cause excessive contrast enhancement and feature loss problems. These problems make the images processed by HE look unnatural and introduce unwanted artifacts in them. In this study, a visual contrast enhancement algorithm (VCEA) based on HE is proposed. VCEA considers the requirements of the human visual perception in order to address the drawbacks of HE. It effectively solves the excessive contrast enhancement problem by adjusting the spaces between two adjacent gray values of the HE histogram. In addition, VCEA reduces the effects of the feature loss problem by using the obtained spaces. Furthermore, VCEA enhances the detailed textures of an image to generate an enhanced image with better visual quality. Experimental results show that images obtained by applying VCEA have higher contrast and are more suited to human visual perception than those processed by HE and other HE-based methods.

  3. Variational Histogram Equalization for Single Color Image Defogging

    Directory of Open Access Journals (Sweden)

    Li Zhou

    2016-01-01

    Full Text Available Foggy images taken in bad weather inevitably suffer from contrast loss and color distortion. Existing defogging methods merely resort to digging out an accurate scene transmission, ignoring the unpleasant distortion and high complexity this entails. Different from previous works, we propose a simple but powerful method based on histogram equalization and the physical degradation model. By revising two constraints in a variational histogram equalization framework, the intensity component of a fog-free image can be estimated in HSI color space, since the airlight is inferred through a color attenuation prior in advance. To cut down the time consumption, a general variation filter is proposed to obtain a numerical solution from the revised framework. After getting the estimated intensity component, it is easy to infer the saturation component from the physical degradation model in the saturation channel. Accordingly, the fog-free image can be restored with the estimated intensity and saturation components. In the end, the proposed method is tested on several foggy images and assessed by two no-reference indexes. Experimental results reveal that our method compares favorably with three groups of relevant, state-of-the-art defogging methods.

  4. Image Compression Using Moving Average Histogram and RBF Network

    Directory of Open Access Journals (Sweden)

    Sandar khowaja

    2016-04-01

    Full Text Available Modernization and globalization have made multimedia technology one of the fastest-growing fields in recent times, but optimal use of bandwidth and storage remains a topic that attracts the research community. Considering that images have a lion's share in multimedia communication, efficient image compression techniques have become a basic need for optimal use of bandwidth and space. This paper proposes a novel method for image compression based on the fusion of a moving average histogram and an RBF (Radial Basis Function) network. The proposed technique reduces the number of color intensity levels using the moving average histogram technique, and then corrects the color intensity levels using RBF networks at the reconstruction phase. Existing methods have used low-resolution images for testing, but the proposed method has been tested on various image resolutions to give a clear assessment of the technique. The proposed method has been tested on 35 images of varying resolution and has been compared with existing algorithms in terms of CR (Compression Ratio), MSE (Mean Square Error), PSNR (Peak Signal-to-Noise Ratio) and computational complexity. The outcome shows that the proposed methodology is a better trade-off in terms of compression ratio, PSNR (which determines the quality of the image) and computational complexity.

  5. Dynamic Channel Allocation

    Science.gov (United States)

    2003-09-01

    [Fragmentary table-of-contents and body-text residue; only partial content is recoverable.] The document surveys dynamic channel allocation on demand, hybrid channel allocation in wireless networks, and Beowulf Ethernet channel bonding; the last of these arose as a by-product of using older computers in a NASA research lab [Ref 32].

  6. Scintigraphic image contrast-enhancement techniques: global and local area histogram equalization.

    Science.gov (United States)

    Verdenet, J; Cardot, J C; Baud, M; Chervet, H; Duvernoy, J; Bidet, R

    1981-01-01

    This article develops two contrast-modification techniques for the display of scintigraphic images. Among histogram-modification techniques, histogram equalization, in which each level of gray is used to the same extent, gives maximum entropy. The first technique applies histogram equalization to the whole image. To avoid attenuating contrast in small but important portions of the gray-scale histogram, local-area histogram equalization has been applied to images with large differences in intensity. Both techniques were tested using a phantom with known characteristics. The global equalization technique is more suitable for bone scintigraphies, and some well-chosen boundaries improved the difference between two comparable areas. For liver scintigraphies, where intensity is nearly equal in every pixel, local-area equalization was chosen, allowing the detection of heterogeneous structures. The images resulting from histogram-equalization techniques improve the readability of the data, but are often far from the usual images and require some training on the physician's part.

  7. Implementing a 3D histogram version of the Energy-Test in ROOT

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, E.O., E-mail: cohen.erez7@gmail.com [School of Physics and Astronomy, Tel Aviv University, Tel Aviv 69978 (Israel); Reid, I.D., E-mail: ivan.reid@brunel.ac.uk [College of Engineering, Design and Physical Sciences, Brunel University London, Uxbridge UB8 3PH (United Kingdom); Piasetzky, E., E-mail: eip@tauphy.tau.ac.il [School of Physics and Astronomy, Tel Aviv University, Tel Aviv 69978 (Israel)

    2016-08-21

    Comparing simulation and data histograms is of interest in nuclear and particle physics experiments; however, the leading three-dimensional histogram comparison tool available in ROOT, the 3D Kolmogorov–Smirnov test, exhibits shortcomings. Throughout the following, we present and discuss the implementation of an alternative comparison test for three-dimensional histograms, based on the Energy-Test by Aslan and Zech. The software package can be found at (http://www-nuclear.tau.ac.il/ecohen/).
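
    The underlying two-sample statistic of Aslan and Zech can be sketched as follows for unbinned points, using the logarithmic distance weighting R(r) = -ln(r); the ROOT class applies the same statistic to binned 3D histograms, which amounts to weighting points by bin contents:

```python
import numpy as np

def energy_test_statistic(x, y):
    # Two-sample Energy-Test statistic (Aslan and Zech) for unbinned
    # d-dimensional points, with distance weighting R(r) = -ln(r).
    def sum_R(a, b, same):
        d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
        if same:
            d = d[np.triu_indices(len(a), k=1)]   # count each pair once
        return np.sum(-np.log(np.maximum(d, 1e-12)))
    n, m = len(x), len(y)
    return (sum_R(x, x, True) / (n * n) + sum_R(y, y, True) / (m * m)
            - sum_R(x, y, False) / (n * m))

rng = np.random.default_rng(3)
a = rng.normal(size=(200, 3))
print(energy_test_statistic(a, rng.normal(size=(200, 3))))        # near zero
print(energy_test_statistic(a, rng.normal(1.0, 1.0, (200, 3))))   # larger
```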

  8. Design of Auto Exposure Unit Based On 2-Way Histogram Equalization

    OpenAIRE

    Junghwan Choi; Seongsoo Lee

    2013-01-01

    Histogram equalization is often used in image enhancement, but it can also be used in auto exposure. However, conventional histogram equalization does not work well when many pixels are concentrated in a narrow luminance range. This paper proposes an auto exposure method based on 2-way histogram equalization. Two cumulative distribution functions are used, where one runs from dark to bright and the other from bright to dark. In this paper, the proposed auto exposure method is also designed an...

  9. Using histograms to introduce randomization in the generation of ensembles of decision trees

    Science.gov (United States)

    Kamath, Chandrika; Cantu-Paz, Erick; Littau, David

    2005-02-22

    A system for decision tree ensembles that includes a module to read the data, a module to create a histogram, a module to evaluate a potential split according to some criterion using the histogram, a module to select a split point randomly in an interval around the best split, a module to split the data, and a module to combine multiple decision trees in ensembles. The decision tree method includes the steps of reading the data; creating a histogram; evaluating a potential split according to some criterion using the histogram; selecting a split point randomly in an interval around the best split; splitting the data; and combining multiple decision trees in ensembles.
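
    A minimal sketch of the randomized, histogram-based split selection described above, for a single numeric feature with class labels; the bin count, the impurity criterion (Gini) and the width of the interval around the best split are illustrative choices, not those of the patent:

```python
import numpy as np

def histogram_random_split(feature, labels, n_bins=32, window=1, rng=None):
    # Evaluate candidate thresholds on histogram bin edges, then draw the
    # actual split uniformly from an interval around the best edge.
    rng = rng or np.random.default_rng()
    edges = np.histogram_bin_edges(feature, bins=n_bins)
    def gini(mask):
        out = 0.0
        for side in (mask, ~mask):
            if side.any():
                p = np.bincount(labels[side]).astype(float) / side.sum()
                out += side.mean() * (1.0 - np.sum(p * p))
        return out
    scores = [gini(feature <= e) for e in edges[1:-1]]
    best = int(np.argmin(scores)) + 1
    lo = edges[max(best - window, 0)]
    hi = edges[min(best + window, len(edges) - 1)]
    return rng.uniform(lo, hi)   # randomized threshold near the best split

rng = np.random.default_rng(4)
x = np.concatenate([rng.normal(0, 1, 500), rng.normal(3, 1, 500)])
y = np.concatenate([np.zeros(500, int), np.ones(500, int)])
print(histogram_random_split(x, y, rng=rng))  # ~1.5, varies run to run
```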

  10. Steam leak detection method in pipeline using histogram analysis

    International Nuclear Information System (INIS)

    Kim, Se Oh; Jeon, Hyeong Seop; Son, Ki Sung; Chae, Gyung Sun; Park, Jong Won

    2015-01-01

    Leak detection in a pipeline usually involves acoustic emission sensors, i.e., contact-type sensors. These pose difficulties for installation and cannot operate in areas with high temperature and radiation. Therefore, many researchers have recently studied leak detection using a camera. Camera-based leak detection has the advantages of long-distance monitoring and wide-area surveillance. However, the conventional leak detection method using difference images often mistakes the vibration of a structure for a leak. In this paper, we propose a method for steam leakage detection using the moving average of difference images and histogram analysis. The proposed method can separate leakage from structural vibration. Its performance is verified by comparison with experimental results.
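
    The moving-average-of-difference-images idea can be sketched as follows; the window length and threshold are illustrative values, not the authors':

```python
import numpy as np

def leak_mask(frames, window=5, threshold=4.0):
    # Average `window` consecutive difference images, then threshold.
    # Vibration alternates sign between frames and cancels in the
    # average; steam gives a persistent one-sided change that survives.
    frames = np.asarray(frames, dtype=float)
    diffs = np.diff(frames, axis=0)          # frame-to-frame differences
    avg = diffs[-window:].mean(axis=0)       # moving average of differences
    return avg > threshold                   # candidate leak pixels

# Synthetic check: a steadily brightening patch (steam) over sensor noise.
rng = np.random.default_rng(5)
frames = rng.normal(128, 2, size=(8, 32, 32))
frames[:, 4:8, 4:8] += np.arange(8)[:, None, None] * 6.0
print(leak_mask(frames).sum(), "pixels flagged")  # the 16 patch pixels
```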

  11. TSimpleAnalysis: histogramming many trees in parallel

    CERN Document Server

    Giommi, Luca

    2016-01-01

    I worked inside the ROOT team of the EP-SFT group. My project focused on writing a ROOT class that creates histograms from a TChain. The name of the class is TSimpleAnalysis and it is already integrated in ROOT. The work that I did was to write the source and header files of the class, as well as a python script that allows the user to use the class from the command line. This represents a great improvement with respect to the usual user code, which takes lines and lines of code to do the same thing. (Link for the class: https://root.cern.ch/doc/master/classTSimpleAnalysis.html)

  12. Histogram plots and cutoff energies for nuclear discrete levels

    International Nuclear Information System (INIS)

    Belgya, T.; Molnar, G.; Fazekas, B.; Oestoer, J.

    1997-05-01

    Discrete level schemes for 1277 nuclei, from 6Li through 251Es, extracted from the Evaluated Nuclear Structure Data File were analyzed. Cutoff energies (Umax), indicating the upper limit of level-scheme completeness, were deduced from the inspection of histograms of the cumulative number of levels. Parameters of the constant-temperature level density formula (nuclear temperature T and energy shift U0) were obtained by means of a least-squares fit of the formula to the known levels below the cutoff energy. The results are tabulated for all 1277 nuclei, allowing an easy and reliable application of the constant-temperature level density approach. A complete set of cumulative plots of discrete levels is also provided. (author). 5 figs, 2 tabs

  13. A psychophysical comparison of two methods for adaptive histogram equalization.

    Science.gov (United States)

    Zimmerman, J B; Cousins, S B; Hartzell, K M; Frisse, M E; Kahn, M G

    1989-05-01

    Adaptive histogram equalization (AHE) is a method for adaptive contrast enhancement of digital images. It is an automatic, reproducible method for the simultaneous viewing of contrast within a digital image with a large dynamic range. Recent experiments have shown that in specific cases, there is no significant difference in the ability of AHE and linear intensity windowing to display gray-scale contrast. More recently, a variant of AHE which limits the allowed contrast enhancement of the image has been proposed. This contrast-limited adaptive histogram equalization (CLAHE) produces images in which the noise content of an image is not excessively enhanced, but in which sufficient contrast is provided for the visualization of structures within the image. Images processed with CLAHE have a more natural appearance and facilitate the comparison of different areas of an image. However, the reduced contrast enhancement of CLAHE may hinder the ability of an observer to detect the presence of some significant gray-scale contrast. In this report, a psychophysical observer experiment was performed to determine if there is a significant difference in the ability of AHE and CLAHE to depict gray-scale contrast. Observers were presented with computed tomography (CT) images of the chest processed with AHE and CLAHE. Subtle artificial lesions were introduced into some images. The observers were asked to rate their confidence regarding the presence of the lesions; this rating-scale data was analyzed using receiver operating characteristic (ROC) curve techniques. These ROC curves were compared for significant differences in the observers' performances. In this report, no difference was found in the abilities of AHE and CLAHE to depict contrast information.

  14. Adaptive Motion Planning in Bin-Picking with Object Uncertainties

    DEFF Research Database (Denmark)

    Iversen, Thomas Fridolin; Ellekilde, Lars-Peter; Miró, Jaime Valls

    2017-01-01

    Doing motion planning for bin-picking with object uncertainties requires either a re-grasp of picked objects or an online sensor system. Using the latter is advantageous in terms of computational time, as no time is wasted doing an extra pick-and-place action. It does, however, put extra ... in which the system receives an object pose update while moving towards the place position, and another in which the update includes the object type being grasped, out of a fixed number of options, with each class to be deposited in a different place. When an online POMDP solver is utilized, the state-adjusting POMDP...

  15. CPD Allocations and Awards

    Data.gov (United States)

    Department of Housing and Urban Development — The CPD Allocation and Award database provides filterable on-screen and exportable reports on select programs, such as the Community Development Block Grant Program,...

  16. Two-state theory of binned photon statistics for a large class of waiting time distributions and its application to quantum dot blinking

    Energy Technology Data Exchange (ETDEWEB)

    Volkán-Kacsó, Sándor [Noyes Laboratory of Chemical Physics, California Institute of Technology, 1200 East California Boulevard, Pasadena, California 91125 (United States)

    2014-06-14

    A theoretical method is proposed for the calculation of the photon counting probability distribution during a bin time. Two-state fluorescence and steady excitation are assumed. A key feature is a kinetic scheme that allows for an extensive class of stochastic waiting time distribution functions, including power laws, expanded as a sum of weighted decaying exponentials. The solution is analytic in certain conditions, and an exact and simple expression is found for the integral contribution of “bright” and “dark” states. As an application for power law kinetics, theoretical results are compared with experimental intensity histograms from a number of blinking CdSe/ZnS quantum dots. The histograms are consistent with distributions of intensity states around a “bright” and a “dark” maximum. A gap of states is also revealed in the more-or-less flat inter-peak region. The slope and to some extent the flatness of the inter-peak feature are found to be sensitive to the power-law exponents. Possible models consistent with these findings are discussed, such as the combination of multiple charging and fluctuating non-radiative channels or the multiple recombination center model. A fitting of the latter to experiment provides constraints on the interaction parameter between the recombination centers. Further extensions and applications of the photon counting theory are also discussed.

  17. Two-state theory of binned photon statistics for a large class of waiting time distributions and its application to quantum dot blinking.

    Science.gov (United States)

    Volkán-Kacsó, Sándor

    2014-06-14

    A theoretical method is proposed for the calculation of the photon counting probability distribution during a bin time. Two-state fluorescence and steady excitation are assumed. A key feature is a kinetic scheme that allows for an extensive class of stochastic waiting time distribution functions, including power laws, expanded as a sum of weighted decaying exponentials. The solution is analytic in certain conditions, and an exact and simple expression is found for the integral contribution of "bright" and "dark" states. As an application for power law kinetics, theoretical results are compared with experimental intensity histograms from a number of blinking CdSe/ZnS quantum dots. The histograms are consistent with distributions of intensity states around a "bright" and a "dark" maximum. A gap of states is also revealed in the more-or-less flat inter-peak region. The slope and to some extent the flatness of the inter-peak feature are found to be sensitive to the power-law exponents. Possible models consistent with these findings are discussed, such as the combination of multiple charging and fluctuating non-radiative channels or the multiple recombination center model. A fitting of the latter to experiment provides constraints on the interaction parameter between the recombination centers. Further extensions and applications of the photon counting theory are also discussed.

  18. Guitar Chords Classification Using Uncertainty Measurements of Frequency Bins

    Directory of Open Access Journals (Sweden)

    Jesus Guerrero-Turrubiates

    2015-01-01

    Full Text Available This paper presents a method to perform chord classification from recorded audio. The signal harmonics are obtained by using the Fast Fourier Transform, and timbral information is suppressed by spectral whitening. A multiple fundamental frequency estimation of whitened data is achieved by adding attenuated harmonics by a weighting function. This paper proposes a method that performs feature selection by using a thresholding of the uncertainty of all frequency bins. Those measurements under the threshold are removed from the signal in the frequency domain. This allows a reduction of 95.53% of the signal characteristics, and the other 4.47% of frequency bins are used as enhanced information for the classifier. An Artificial Neural Network was utilized to classify four types of chords: major, minor, major 7th, and minor 7th. Those, played in the twelve musical notes, give a total of 48 different chords. Two reference methods (based on Hidden Markov Models) were compared with the method proposed in this paper by having the same database for the evaluation test. In most of the performed tests, the proposed method achieved a reasonably high performance, with an accuracy of 93%.

  19. Smart Bin: Internet-of-Things Garbage Monitoring System

    Directory of Open Access Journals (Sweden)

    Mustafa M.R

    2017-01-01

    Full Text Available This work introduces the design and development of a garbage monitoring system for a smarter, greener environment; the system measures the garbage level in real time and alerts the municipality whenever a bin is full, according to the type of garbage. The proposed system consists of ultrasonic sensors, which measure the garbage level, and an ARM microcontroller, which controls system operation, with everything connected to ThingSpeak. The system allows waste management to be monitored based on the garbage depth inside each dustbin. It shows the status of four different types of garbage (domestic waste, paper, glass and plastic) through an LCD and ThingSpeak in real time, storing the data for future use and analysis, such as prediction of the peak level of garbage bin fullness. It is expected that this system can create a greener environment by monitoring and controlling the collection of garbage smartly through the Internet-of-Things.

  20. Best-fit bin-packing with random order

    Energy Technology Data Exchange (ETDEWEB)

    Kenyon, C. [CNRS, Lyon (France)

    1996-12-31

    Best-fit is the best known algorithm for on-line bin-packing, in the sense that no algorithm is known to behave better both in the worst case (when Best-fit has performance ratio 1.7) and in the average uniform case, with items drawn uniformly in the interval [0, 1] (then Best-fit has expected wasted space O(n^(1/2) (log n)^(3/4))). In practical applications, Best-fit appears to perform within a few percent of optimal. In this paper, in the spirit of previous work in computational geometry, we study the expected performance ratio, taking the worst-case multiset of items L and assuming that the elements of L are inserted in random order, with all permutations equally likely. We show a lower bound of 1.08... and an upper bound of 1.5 on the random-order performance ratio of Best-fit. The upper bound contrasts with the result that, in the worst case, any (deterministic or randomized) on-line bin-packing algorithm has performance ratio at least 1.54.
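
    For concreteness, on-line Best-fit itself is short; below is a linear-scan sketch (efficient implementations use a balanced tree for the tightest-fit lookup):

```python
def best_fit(items, capacity=1.0):
    # Online Best-fit: place each item into the feasible bin with the
    # least residual space, opening a new bin when none fits.
    bins = []                       # residual capacities of open bins
    for x in items:
        feasible = [i for i, r in enumerate(bins) if r >= x]
        if feasible:
            i = min(feasible, key=lambda i: bins[i])   # tightest fit
            bins[i] -= x
        else:
            bins.append(capacity - x)
    return len(bins)

import random
random.seed(6)
items = [random.random() for _ in range(10000)]
print(best_fit(items), "bins for total size", round(sum(items), 1))
```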

  1. Constant-complexity stochastic simulation algorithm with optimal binning

    Energy Technology Data Exchange (ETDEWEB)

    Sanft, Kevin R., E-mail: kevin@kevinsanft.com [Department of Computer Science, University of North Carolina Asheville, Asheville, North Carolina 28804 (United States); Othmer, Hans G., E-mail: othmer@math.umn.edu [School of Mathematics, University of Minnesota, Minneapolis, Minnesota 55455 (United States); Digital Technology Center, University of Minnesota, Minneapolis, Minnesota 55455 (United States)

    2015-08-21

    At the molecular level, biochemical processes are governed by random interactions between reactant molecules, and the dynamics of such systems are inherently stochastic. When the copy numbers of reactants are large, a deterministic description is adequate, but when they are small, such systems are often modeled as continuous-time Markov jump processes that can be described by the chemical master equation. Gillespie’s Stochastic Simulation Algorithm (SSA) generates exact trajectories of these systems, but the amount of computational work required for each step of the original SSA is proportional to the number of reaction channels, leading to computational complexity that scales linearly with the problem size. The original SSA is therefore inefficient for large problems, which has prompted the development of several alternative formulations with improved scaling properties. We describe an exact SSA that uses a table data structure with event time binning to achieve constant computational complexity with respect to the number of reaction channels for weakly coupled reaction networks. We present a novel adaptive binning strategy and discuss optimal algorithm parameters. We compare the computational efficiency of the algorithm to existing methods and demonstrate excellent scaling for large problems. This method is well suited for generating exact trajectories of large weakly coupled models, including those that can be described by the reaction-diffusion master equation that arises from spatially discretized reaction-diffusion processes.
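
    A related constant-time selection trick, composition-rejection over propensity bins, can be sketched as below; note that the paper's method bins event times in a table rather than propensities, so this is a neighbouring technique shown only to illustrate how binning buys O(1) channel selection:

```python
import numpy as np

class BinnedSelector:
    # Rejection sampling over propensity bins (composition-rejection).
    # Assumes propensities change only locally between steps, so the
    # bin totals can be maintained incrementally in a full simulator.
    def __init__(self, propensities, n_bins=16):
        self.a = np.asarray(propensities, dtype=float)
        self.groups = np.array_split(np.arange(len(self.a)), n_bins)
        self.totals = np.array([self.a[g].sum() for g in self.groups])

    def sample(self, rng):
        g = self.groups[rng.choice(len(self.totals),
                                   p=self.totals / self.totals.sum())]
        amax = self.a[g].max()
        while True:                  # rejection sampling within the bin
            j = g[rng.integers(len(g))]
            if rng.random() * amax <= self.a[j]:
                return int(j)

rng = np.random.default_rng(7)
sel = BinnedSelector(rng.random(1000))
print([sel.sample(rng) for _ in range(5)])
```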

  2. Test Plan: WIPP bin-scale CH TRU waste tests

    International Nuclear Information System (INIS)

    Molecke, M.A.

    1990-08-01

    This WIPP Bin-Scale CH TRU Waste Test program described herein will provide relevant composition and kinetic rate data on gas generation and consumption resulting from TRU waste degradation, as impacted by synergistic interactions due to multiple degradation modes, waste form preparation, long-term repository environmental effects, engineered barrier materials, and, possibly, engineered modifications to be developed. Similar data on waste-brine leachate compositions and potentially hazardous volatile organic compounds released by the wastes will also be provided. The quantitative data output from these tests and associated technical expertise are required by the WIPP Performance Assessment (PA) program studies, and for the scientific benefit of the overall WIPP project. This Test Plan describes the necessary scientific and technical aspects, justifications, and rational for successfully initiating and conducting the WIPP Bin-Scale CH TRU Waste Test program. This Test Plan is the controlling scientific design definition and overall requirements document for this WIPP in situ test, as defined by Sandia National Laboratories (SNL), scientific advisor to the US Department of Energy, WIPP Project Office (DOE/WPO). 55 refs., 16 figs., 19 tabs

  3. Thresholding using two-dimensional histogram and watershed algorithm in the luggage inspection system

    International Nuclear Information System (INIS)

    Chen Jingyun; Cong Peng; Song Qi

    2006-01-01

    The authors present a new DR image segmentation method based on a two-dimensional histogram and the watershed algorithm. The watershed algorithm is used to locate the threshold on the vertical projection plane of the two-dimensional histogram. This method is applied to the segmentation of DR images produced by a luggage inspection system with DR-CT. The advantage of this method is also analyzed. (authors)

  4. Hand Vein Images Enhancement Based on Local Gray-level Information Histogram

    Directory of Open Access Journals (Sweden)

    Jun Wang

    2015-06-01

    Full Text Available Based on histogram equalization theory, this paper presents a novel histogram concept to realize contrast enhancement of hand vein images while avoiding the loss of topological vein structure and the introduction of fake vein information. Firstly, we propose the concept of the gray-level information histogram, whose fundamental characteristic is that the amplitudes of its components objectively reflect the contribution of the gray levels and information to the representation of image information. Then, we propose a histogram equalization method composed of an automatic histogram separation module and an intensity transformation module. The histogram separation module combines the proposed prompt multiple-threshold procedure with an optimum peak signal-to-noise ratio (PSNR) calculation to separate the histogram into small-scale sub-histograms; the intensity transformation module then enhances the vein images while preserving the topological vein structure and gray information of each generated sub-histogram. Experimental results show that the proposed method can achieve an extremely good contrast enhancement effect.

  5. Symbol recognition via statistical integration of pixel-level constraint histograms: a new descriptor.

    Science.gov (United States)

    Yang, Su

    2005-02-01

    A new descriptor for symbol recognition is proposed. 1) A histogram is constructed for every pixel to figure out the distribution of the constraints among the other pixels. 2) All the histograms are statistically integrated to form a feature vector with fixed dimension. The robustness and invariance were experimentally confirmed.

  6. A COMPARATIVE ANALYSIS OF HISTOGRAM EQUALIZATION AND THE LOGARITHMIC IMAGE PROCESSING (LIP) MODEL FOR IMAGE ENHANCEMENT

    Directory of Open Access Journals (Sweden)

    Murinto ,

    2012-05-01

    Full Text Available One class of image enhancement methods comprises histogram equalization and the Logarithmic Image Processing (LIP) model. The two methods use different algorithms, and the strengths and weaknesses of each were not previously known. This study compares the performance of histogram equalization and LIP in improving image brightness quality. The images used are 24-bit *.bmp (bitmap) files with no restriction on pixel dimensions. Each image is loaded into the program and then processed with histogram equalization and LIP. The parameters used are the resulting image, its histogram, the running time and the signal-to-noise ratio (SNR). Testing was carried out using black-box and alpha tests. Results on several sample images show that distributing pixel intensity values with LIP yields visually better image quality, although it requires a longer processing time than histogram equalization; in terms of SNR, the Logarithmic Image Processing method is also superior.

  7. Automatic exact histogram specification for contrast enhancement and visual system based quantitative evaluation.

    Science.gov (United States)

    Sen, Debashis; Pal, Sankar K

    2011-05-01

    Histogram equalization, which aims at information maximization, is widely used in different ways to perform contrast enhancement in images. In this paper, an automatic exact histogram specification technique is proposed and used for global and local contrast enhancement of images. The desired histogram is obtained by first subjecting the image histogram to a modification process and then by maximizing a measure that represents increase in information and decrease in ambiguity. A new method of measuring image contrast based upon local band-limited approach and center-surround retinal receptive field model is also devised in this paper. This method works at multiple scales (frequency bands) and combines the contrast measures obtained at different scales using L(p)-norm. In comparison to a few existing methods, the effectiveness of the proposed automatic exact histogram specification technique in enhancing contrasts of images is demonstrated through qualitative analysis and the proposed image contrast measure based quantitative analysis.
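
    A minimal sketch of exact histogram specification: induce a strict pixel ordering (here by intensity with random tie-breaking, whereas the paper derives the ordering from local information), then distribute the target histogram's gray levels in that order:

```python
import numpy as np

def exact_histogram_specification(img, target_hist):
    # Strictly order pixels, then hand out the target histogram's
    # levels in that order, so the output histogram matches exactly.
    flat = img.ravel().astype(float)
    rng = np.random.default_rng(0)
    order = np.lexsort((rng.random(flat.size), flat))   # strict ordering
    levels = np.repeat(np.arange(len(target_hist)), target_hist)
    out = np.empty_like(flat)
    out[order] = levels
    return out.reshape(img.shape).astype(np.uint8)

rng = np.random.default_rng(8)
img = rng.integers(0, 256, size=(16, 16), dtype=np.uint8)
uniform = np.full(256, img.size // 256)       # 256 levels, 1 pixel each
res = exact_histogram_specification(img, uniform)
print(np.bincount(res.ravel(), minlength=256).min())  # exactly 1 per level
```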

  8. Risk capital allocation

    DEFF Research Database (Denmark)

    Hougaard, Jens Leth; Smilgins, Aleksandrs

    Risk capital allocation problems have been widely discussed in the academic literature. We consider a company with multiple subunits having individual portfolios. Hence, when portfolios of subunits are merged, a diversification benefit arises: the risk of the company as a whole is smaller than the sum of the risks of the individual subunits. The question is how to allocate the risk capital of the company among the subunits in a fair way. In this paper we propose to use the Lorenz set as an allocation method. We show that the Lorenz set is operational and coherent. Moreover, we propose a set of new axioms related directly to the problem of risk capital allocation and show that the Lorenz set satisfies these new axioms, in contrast to other well-known coherent methods. Finally, we discuss how to deal with non-uniqueness of the Lorenz set.

  9. Bas-relief generation using adaptive histogram equalization.

    Science.gov (United States)

    Sun, Xianfang; Rosin, Paul L; Martin, Ralph R; Langbein, Frank C

    2009-01-01

    An algorithm is presented to automatically generate bas-reliefs based on adaptive histogram equalization (AHE), starting from an input height field. A mesh model may alternatively be provided, in which case a height field is first created via orthogonal or perspective projection. The height field is regularly gridded and treated as an image, enabling a modified AHE method to be used to generate a bas-relief with a user-chosen height range. We modify the original image-contrast-enhancement AHE method to use gradient weights also to enhance the shape features of the bas-relief. To effectively compress the height field, we limit the height-dependent scaling factors used to compute relative height variations in the output from height variations in the input; this prevents any height differences from having too great effect. Results of AHE over different neighborhood sizes are averaged to preserve information at different scales in the resulting bas-relief. Compared to previous approaches, the proposed algorithm is simple and yet largely preserves original shape features. Experiments show that our results are, in general, comparable to and in some cases better than the best previously published methods.

  10. Histogram-based ionogram displays and their application to autoscaling

    Science.gov (United States)

    Lynn, Kenneth J. W.

    2018-03-01

    A simple method is described for displaying and autoscaling the basic ionogram parameters foF2 and h'F2, as well as some additional layer parameters, from digital ionograms. The technique employed is based on forming frequency and height histograms for each ionogram. This technique has now been applied specifically to ionograms produced by the IPS5D ionosonde developed and operated by the Australian Space Weather Service (SWS). The SWS ionograms are archived in a cleaned format and readily available from the SWS internet site; however, the method is applicable to any ionosonde that produces ionograms in a digital format at a useful signal-to-noise level. The most novel feature of the technique for autoscaling is its simplicity and its avoidance of the mathematical imaging and line-fitting techniques often used. The program arose from the need to display many days of ionogram output to allow the location of specific types of ionospheric events, such as ionospheric storms, travelling ionospheric disturbances and repetitive ionospheric height changes, for further investigation and measurement. Examples and applications of the method are given, including the removal of sporadic E and spread F.

  11. Position of Imam Malik bin Anas on Deviant Sects

    Directory of Open Access Journals (Sweden)

    Allaa Eddin Muhammad Esmail

    2014-06-01

    Full Text Available The study attempts to highlight the position of Imam Malik bin Anas on the deviant sects which emerged and developed during his time. It discusses the views of Imam Malik on other Muslim sects such as the Shi'ites and their schools of thought; the Qadarites, against whose central teachings he authored a specific book and a chapter in his "Muwatta"; the Kharijites; the Murji'ites; and the Mu'tazilites. The paper also discusses the views of Imam Malik and his response to important issues related to those sects, such as "al-mutasyabihat" and the creation of the Quran. Imam Malik's ways and methods in dealing with those sects are also highlighted.

  12. Low-Light Image Enhancement Using Adaptive Digital Pixel Binning

    Directory of Open Access Journals (Sweden)

    Yoonjong Yoo

    2015-06-01

    Full Text Available This paper presents an image enhancement algorithm for low-light scenes in an environment with insufficient illumination. Simple amplification of intensity exhibits various undesired artifacts: noise amplification, intensity saturation, and loss of resolution. In order to enhance low-light images without undesired artifacts, a novel digital binning algorithm is proposed that considers brightness, context, noise level, and anti-saturation of a local region in the image. The proposed algorithm does not require any modification of the image sensor or additional frame-memory; it needs only two line-memories in the image signal processor (ISP. Since the proposed algorithm does not use an iterative computation, it can be easily embedded in an existing digital camera ISP pipeline containing a high-resolution image sensor.
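
    The baseline, non-adaptive operation that the paper builds on is plain digital binning, sketched below for 2x2 blocks; the adaptive weighting by brightness, context and noise level is the paper's contribution and is not reproduced here:

```python
import numpy as np

def binning2x2(img, gain=4.0, max_val=255):
    # Plain 2x2 digital binning: sum each 2x2 block to trade resolution
    # for brightness/SNR, then clip against saturation.
    h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
    blocks = img[:h, :w].astype(float).reshape(h // 2, 2, w // 2, 2)
    summed = blocks.sum(axis=(1, 3)) * (gain / 4.0)
    return np.clip(summed, 0, max_val).astype(np.uint8)

dark = np.random.default_rng(9).integers(0, 30, (64, 64), dtype=np.uint8)
print(dark.mean(), "->", binning2x2(dark).mean())   # roughly 4x brighter
```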

  13. BinAligner: a heuristic method to align biological networks.

    Science.gov (United States)

    Yang, Jialiang; Li, Jun; Grünewald, Stefan; Wan, Xiu-Feng

    2013-01-01

    The advances in high throughput omics technologies have made it possible to characterize molecular interactions within and across various species. Alignments and comparison of molecular networks across species will help detect orthologs and conserved functional modules and provide insights on the evolutionary relationships of the compared species. However, such analyses are not trivial due to the complexity of networks and high computational cost. Here we develop a mixture of global and local algorithm, BinAligner, for network alignments. Based on the hypotheses that the similarity between two vertices across networks would be context dependent and that the information from the edges and the structures of subnetworks can be more informative than vertices alone, two scoring schema, 1-neighborhood subnetwork and graphlet, were introduced to derive the scoring matrices between networks, besides the commonly used scoring scheme from vertices. Then the alignment problem is formulated as an assignment problem, which is solved by a combinatorial optimization algorithm, such as the Hungarian method. The proposed algorithm was applied and validated in aligning the protein-protein interaction network of Kaposi's sarcoma associated herpesvirus (KSHV) and that of varicella zoster virus (VZV). Interestingly, we identified several putative functional orthologous proteins with similar functions but very low sequence similarity between the two viruses. For example, KSHV open reading frame 56 (ORF56) and VZV ORF55 are helicase-primase subunits with sequence identity 14.6%, and KSHV ORF75 and VZV ORF44 are tegument proteins with sequence identity 15.3%. These functional pairs cannot be identified if one restricts the alignment to orthologous protein pairs. In addition, BinAligner identified a conserved pathway between the two viruses, which consists of 7 orthologous protein pairs connected by conserved links. This pathway might be crucial for virus packing and
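
    The assignment step can be sketched with SciPy's Hungarian-method solver; the score matrix below is made up for illustration, whereas BinAligner derives it from the vertex, 1-neighborhood-subnetwork and graphlet scoring described above:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def align_networks(score_matrix):
    # Given a cross-network similarity score for every vertex pair,
    # find the one-to-one matching of maximum total score.
    row, col = linear_sum_assignment(score_matrix, maximize=True)
    return list(zip(row, col)), score_matrix[row, col].sum()

scores = np.array([[0.9, 0.1, 0.3],
                   [0.2, 0.8, 0.1],
                   [0.3, 0.2, 0.7]])
pairs, total = align_networks(scores)
print(pairs, total)   # [(0, 0), (1, 1), (2, 2)] 2.4
```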

  14. Quadrant Dynamic with Automatic Plateau Limit Histogram Equalization for Image Enhancement

    Directory of Open Access Journals (Sweden)

    P. Jagatheeswari

    2014-01-01

    Full Text Available The fundamental and important preprocessing stage in image processing is the image contrast enhancement technique. Histogram equalization is an effective contrast enhancement technique. In this paper, a histogram equalization based technique called quadrant dynamic with automatic plateau limit histogram equalization (QDAPLHE) is introduced. In this method, a hybrid of dynamic and clipped histogram equalization methods is used to increase brightness preservation and to reduce overenhancement. Initially, the proposed QDAPLHE algorithm passes the input image through a median filter to remove the noise present in the image. Then the histogram of the filtered image is divided into four subhistograms while maintaining the second separation point at the mean brightness. Then the clipping process is implemented by automatically calculating the plateau limit as the clipping level. The clipped portion of the histogram is modified to reduce the loss of image intensity values. Finally, the clipped portion is redistributed uniformly to the entire dynamic range and conventional histogram equalization is executed in each subhistogram independently. Based on the qualitative and quantitative analysis, the QDAPLHE method outperforms some existing methods in the literature.
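
    The plateau-limit (clipping) step at the heart of such methods can be sketched as follows; the four-way histogram split and the automatic plateau computation of QDAPLHE are omitted, and the plateau here is passed in by hand:

```python
import numpy as np

def clipped_histogram_equalize(img, plateau):
    # Cap each histogram bin at `plateau`, redistribute the clipped
    # mass uniformly, then equalize with the modified histogram.
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    excess = np.maximum(hist - plateau, 0).sum()
    hist = np.minimum(hist, plateau) + excess / 256.0   # redistribute
    cdf = np.cumsum(hist)
    lut = np.round(255 * cdf / cdf[-1]).astype(np.uint8)
    return lut[img]

img = np.random.default_rng(10).integers(100, 130, (64, 64), dtype=np.uint8)
print(clipped_histogram_equalize(img, plateau=50).std())
```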

  15. Automated compensation of light attenuation in confocal microscopy by exact histogram specification.

    Science.gov (United States)

    Stanciu, Stefan G; Stanciu, George A; Coltuc, Dinu

    2010-03-01

    Confocal laser scanning microscopy (CLSM) enables us to capture images representing optical sections through the volume of a specimen. The images acquired from different layers have different contrast: images obtained from the deeper layers of the specimen have lower contrast than those obtained from the topmost layers. The main causes of this effect are light absorption and scattering by the atoms and molecules contained in the volume through which the light passes. Light attenuation can also be caused by the inclination of the observed surface: for surfaces with a steep inclination, the reflected light travels in a direction away from the detector. We propose a digital image processing technique that compensates for the effects of light attenuation based on histogram operations. We process the image series obtained by CLSM by exact histogram specification and equalization. In this case, a strict ordering among pixels must be induced in order to achieve exact histogram modeling. The processed images end up having exactly the specified histogram, and not a histogram whose shape merely resembles the specified one, as with classical histogram specification algorithms. Experimental results and theoretical aspects of the induced ordering are discussed, as well as a comparison between several histogram modeling techniques with respect to the processing of image series obtained by confocal microscopy.

  16. Parameterization of the Age-Dependent Whole Brain Apparent Diffusion Coefficient Histogram

    Directory of Open Access Journals (Sweden)

    Uwe Klose

    2015-01-01

    Full Text Available Purpose. The distribution of apparent diffusion coefficient (ADC values in the brain can be used to characterize age effects and pathological changes of the brain tissue. The aim of this study was the parameterization of the whole brain ADC histogram by an advanced model with influence of age considered. Methods. Whole brain ADC histograms were calculated for all data and for seven age groups between 10 and 80 years. Modeling of the histograms was performed for two parts of the histogram separately: the brain tissue part was modeled by two Gaussian curves, while the remaining part was fitted by the sum of a Gaussian curve, a biexponential decay, and a straight line. Results. A consistent fitting of the histograms of all age groups was possible with the proposed model. Conclusions. This study confirms the strong dependence of the whole brain ADC histograms on the age of the examined subjects. The proposed model can be used to characterize changes of the whole brain ADC histogram in certain diseases under consideration of age effects.

  17. Using the Bootstrap Method for a Statistical Significance Test of Differences between Summary Histograms

    Science.gov (United States)

    Xu, Kuan-Man

    2006-01-01

    A new method is proposed to compare statistical differences between summary histograms, which are the histograms summed over a large ensemble of individual histograms. It consists of choosing a distance statistic for measuring the difference between summary histograms and using a bootstrap procedure to calculate the statistical significance level. Bootstrapping is an approach to statistical inference that makes few assumptions about the underlying probability distribution that describes the data. Three distance statistics are compared in this study. They are the Euclidean distance, the Jeffries-Matusita distance and the Kuiper distance. The data used in testing the bootstrap method are satellite measurements of cloud systems called cloud objects. Each cloud object is defined as a contiguous region/patch composed of individual footprints or fields of view. A histogram of measured values over footprints is generated for each parameter of each cloud object and then summary histograms are accumulated over all individual histograms in a given cloud-object size category. The results of statistical hypothesis tests using all three distances as test statistics are generally similar, indicating the validity of the proposed method. The Euclidean distance is determined to be most suitable after comparing the statistical tests of several parameters with distinct probability distributions among three cloud-object size categories. Impacts on the statistical significance levels resulting from differences in the total lengths of satellite footprint data between two size categories are also discussed.
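
    A sketch of the proposed procedure using the Euclidean distance as the test statistic; the per-object histograms below are synthetic, and the resampling scheme is a plain pooled bootstrap:

```python
import numpy as np

def bootstrap_hist_test(hists_a, hists_b, n_boot=2000, rng=None):
    # Each input is a stack of per-object histograms; the summary
    # histogram is their normalized sum. Returns a bootstrap p-value.
    rng = rng or np.random.default_rng()
    def summary(h):
        s = h.sum(axis=0).astype(float)
        return s / s.sum()
    observed = np.linalg.norm(summary(hists_a) - summary(hists_b))
    pooled = np.vstack([hists_a, hists_b])
    n = len(hists_a)
    count = 0
    for _ in range(n_boot):
        idx = rng.integers(len(pooled), size=len(pooled))  # resample objects
        d = np.linalg.norm(summary(pooled[idx[:n]]) - summary(pooled[idx[n:]]))
        count += d >= observed
    return count / n_boot

rng = np.random.default_rng(11)
a = rng.poisson(10, size=(40, 20))   # 40 objects, 20-bin histograms
b = rng.poisson(12, size=(40, 20))
print(bootstrap_hist_test(a, b, rng=rng))
```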

  18. Link Monotonic Allocation Schemes

    NARCIS (Netherlands)

    Slikker, M.

    1999-01-01

    A network is a graph where the nodes represent players and the links represent bilateral interaction between the players. A reward game assigns a value to every network on a fixed set of players. An allocation scheme specifies how to distribute the worth of every network among the players. This

  19. Histogram-driven cupping correction (HDCC) in CT

    Science.gov (United States)

    Kyriakou, Y.; Meyer, M.; Lapp, R.; Kalender, W. A.

    2010-04-01

    Typical cupping correction methods are pre-processing methods which require either pre-calibration measurements or simulations of standard objects to approximate and correct for beam hardening and scatter. Some of them require the knowledge of spectra, detector characteristics, etc. The aim of this work was to develop a practical histogram-driven cupping correction (HDCC) method to post-process the reconstructed images. We use a polynomial representation of the raw-data generated by forward projection of the reconstructed images; forward and backprojection are performed on graphics processing units (GPU). The coefficients of the polynomial are optimized using a simplex minimization of the joint entropy of the CT image and its gradient. The algorithm was evaluated using simulations and measurements of homogeneous and inhomogeneous phantoms. For the measurements a C-arm flat-detector CT (FD-CT) system with a 30×40 cm2 detector, a kilovoltage on board imager (radiation therapy simulator) and a micro-CT system were used. The algorithm reduced cupping artifacts both in simulations and measurements using a fourth-order polynomial and was in good agreement to the reference. The minimization algorithm required less than 70 iterations to adjust the coefficients only performing a linear combination of basis images, thus executing without time consuming operations. HDCC reduced cupping artifacts without the necessity of pre-calibration or other scan information enabling a retrospective improvement of CT image homogeneity. However, the method can work with other cupping correction algorithms or in a calibration manner, as well.

  20. Real-time rotation estimation using histograms of oriented gradients.

    Science.gov (United States)

    Bratanič, Blaž; Pernuš, Franjo; Likar, Boštjan; Tomaževič, Dejan

    2014-01-01

This paper focuses on real-time rotation estimation for model-based automated visual inspection. In the case of model-based inspection, spatial alignment is essential to distinguish visual defects from normal appearance variations. Defects are detected by comparing the inspected object with its spatially aligned ideal reference model. Rotation estimation is crucial for the inspection of rotationally symmetric objects where mechanical manipulation is unable to ensure the correct object rotation. We propose a novel method for in-plane rotation estimation. Rotation is estimated with an ensemble of nearest-neighbor estimators. Each estimator contains a spatially local representation of an object in a feature space for all rotation angles and is constructed with a semi-supervised self-training approach from a set of unlabeled training images. An individual representation in a feature space is obtained by calculating the Histograms of Oriented Gradients (HOG) over a spatially local region. Each estimator votes separately for the estimated angle; all votes are weighted and accumulated. The final estimate is the angle with the most votes. The method was evaluated on several datasets of pharmaceutical tablets varying in size, shape, and color. The results show that the proposed method is superior in robustness, with speed and accuracy comparable to previously proposed methods for rotation estimation of pharmaceutical tablets. Furthermore, all evaluations were performed with the same set of parameters, which implies that the method requires minimal human intervention. Although the evaluation focused on pharmaceutical tablets, we consider the method useful for any application that requires robust real-time in-plane rotation estimation.
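
    A hypothetical, minimal reduction of the approach: a single global nearest-neighbor estimator over HOG features of rotated templates. The paper's ensemble of spatially local estimators, weighted voting, and self-training are omitted; scikit-image is assumed for HOG and rotation, and the HOG parameters are illustrative.

```python
import numpy as np
from skimage.feature import hog
from skimage.transform import rotate

HOG_ARGS = dict(orientations=9, pixels_per_cell=(8, 8), cells_per_block=(2, 2))

def build_templates(reference, step=2):
    # reference: 2-D grayscale image of the correctly oriented object
    angles = np.arange(0, 360, step)
    feats = np.array([hog(rotate(reference, a), **HOG_ARGS) for a in angles])
    return angles, feats

def estimate_rotation(image, angles, feats):
    f = hog(image, **HOG_ARGS)
    return angles[np.argmin(np.linalg.norm(feats - f, axis=1))]
```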

  1. An evaluation of an improved method for computing histograms in dynamic tracer studies using positron-emission tomography

    International Nuclear Information System (INIS)

    Ollinger, J.M.; Snyder, D.L.

    1986-01-01

A method for computing approximate minimum-mean-square-error estimates of histograms from list-mode data for use in dynamic tracer studies is evaluated. Parameters estimated from these histograms are significantly more accurate than those estimated from histograms computed by a commonly used method.

  2. An Adaptive Histogram Equalization Algorithm on the Image Gray Level Mapping

    Science.gov (United States)

    Zhu, Youlian; Huang, Cheng

The conventional histogram equalization algorithm easily causes information loss. This paper presents an adaptive histogram-based algorithm that preserves information entropy. The algorithm introduces a parameter β into the gray-level mapping formula and takes the information entropy as the target function to adaptively adjust the spacing of two adjacent gray levels in the new histogram. It thus avoids excessive merging of gray pixels and overly bright local areas in the image. Experiments show that the improved algorithm effectively improves visual quality while keeping the information entropy unchanged. It is useful in CT image processing.
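
    For reference, a sketch of the baseline histogram-equalization gray-level mapping that the paper's β-parameterized variant modifies; the adaptive, entropy-preserving adjustment itself is not reproduced here.

```python
import numpy as np

def equalize(img, levels=256):
    # img: integer-valued grayscale image with values in [0, levels)
    hist = np.bincount(img.ravel(), minlength=levels)
    cdf = np.cumsum(hist) / img.size
    mapping = np.round((levels - 1) * cdf).astype(img.dtype)
    return mapping[img]
```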

  3. An alternative to γ histograms for ROI-based quantitative dose comparisons

    International Nuclear Information System (INIS)

    Dvorak, P

    2009-01-01

    An alternative to gamma (γ) histograms for ROI-based quantitative comparisons of dose distributions using the γ concept is proposed. The method provides minimum values of dose difference and distance-to-agreement such that a pre-set fraction of the region of interest passes the γ test. Compared to standard γ histograms, the method provides more information in terms of pass rate per γ calculation. This is achieved at negligible additional calculation cost and without loss of accuracy. The presented method is proposed as a useful and complementary alternative to standard γ histograms, increasing both the quantity and quality of information for use in acceptance or rejection decisions. (note)

  4. Potential fitting biases resulting from grouping data into variable width bins

    International Nuclear Information System (INIS)

    Towers, S.

    2014-01-01

When reading peer-reviewed scientific literature describing any analysis of empirical data, it is natural and correct to proceed with the underlying assumption that the experimenters have made good-faith efforts to ensure that their analyses yield unbiased results. However, particle physics experiments are expensive and time-consuming to carry out, so if an analysis has inherent bias (even if unintentional), much money and effort can be wasted trying to replicate or understand the results, particularly if the analysis is fundamental to our understanding of the universe. In this note we discuss the significant biases that can result from data binning schemes. As we will show, if data are binned such that they provide the best comparison to a particular (but incorrect) model, the resulting model parameter estimates when fitting to the binned data can be significantly biased, leading us to accept the model hypothesis too often when it is not in fact true. When using binned likelihood or least squares methods there is of course no a priori requirement that bin sizes be constant, but we show that fitting to data grouped into variable-width bins is particularly prone to produce biased results if the bin boundaries are chosen to optimize the comparison of the binned data to a wrong model. The degree of bias that can be achieved simply with variable binning can be surprisingly large. Fitting the data with an unbinned likelihood method, when possible, is the best way for researchers to show that their analyses are not biased by binning effects. Failing that, equal bin widths should be employed as a cross-check of the fitting analysis whenever possible.
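
    A small illustrative experiment in this spirit, on synthetic exponential data: the unbinned maximum-likelihood estimate of the scale is compared with a least-squares fit to counts in arbitrary variable-width bins. The bin edges here are merely arbitrary, not adversarially optimized against a wrong model as in the note, but the comparison machinery is the same.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)
data = rng.exponential(scale=2.0, size=5000)

print("unbinned MLE scale:", data.mean())  # MLE of an exponential scale

edges = np.array([0.0, 0.3, 1.0, 2.5, 5.0, 10.0])  # arbitrary variable bins
counts, _ = np.histogram(data, bins=edges)
centers = 0.5 * (edges[:-1] + edges[1:])
widths = np.diff(edges)

def model(x, n, s):
    # expected counts: total * bin width * exponential density at bin center
    return n * widths * np.exp(-x / s) / s

popt, _ = curve_fit(model, centers, counts, p0=[len(data), 1.0])
print("binned LSQ scale:", popt[1])
```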

  5. A decision-analytic approach to the optimal allocation of resources for endangered species consultation

    Science.gov (United States)

    Converse, Sarah J.; Shelley, Kevin J.; Morey, Steve; Chan, Jeffrey; LaTier, Andrea; Scafidi, Carolyn; Crouse, Deborah T.; Runge, Michael C.

    2011-01-01

    The resources available to support conservation work, whether time or money, are limited. Decision makers need methods to help them identify the optimal allocation of limited resources to meet conservation goals, and decision analysis is uniquely suited to assist with the development of such methods. In recent years, a number of case studies have been described that examine optimal conservation decisions under fiscal constraints; here we develop methods to look at other types of constraints, including limited staff and regulatory deadlines. In the US, Section Seven consultation, an important component of protection under the federal Endangered Species Act, requires that federal agencies overseeing projects consult with federal biologists to avoid jeopardizing species. A benefit of consultation is negotiation of project modifications that lessen impacts on species, so staff time allocated to consultation supports conservation. However, some offices have experienced declining staff, potentially reducing the efficacy of consultation. This is true of the US Fish and Wildlife Service's Washington Fish and Wildlife Office (WFWO) and its consultation work on federally-threatened bull trout (Salvelinus confluentus). To improve effectiveness, WFWO managers needed a tool to help allocate this work to maximize conservation benefits. We used a decision-analytic approach to score projects based on the value of staff time investment, and then identified an optimal decision rule for how scored projects would be allocated across bins, where projects in different bins received different time investments. We found that, given current staff, the optimal decision rule placed 80% of informal consultations (those where expected effects are beneficial, insignificant, or discountable) in a short bin where they would be completed without negotiating changes. The remaining 20% would be placed in a long bin, warranting an investment of seven days, including time for negotiation. For formal

  6. Reliability Study Regarding the Use of Histogram Similarity Methods for Damage Detection

    Directory of Open Access Journals (Sweden)

    Nicoleta Gillich

    2013-01-01

Full Text Available The paper analyses the reliability of three dissimilarity estimators for comparing histograms, as support for a frequency-based damage detection method able to identify structural changes in beam-like structures. First, a brief presentation of the authors' damage detection method is given, with a focus on damage localization. It consists of comparing a histogram derived from measurement results with a large series of calculated histograms, namely the damage location indexes for all locations along the beam. We tested dissimilarity estimators such as the Minkowski-form distances, the Kullback-Leibler divergence and the histogram intersection, and found the Minkowski distance to provide the best results. It was tested for numerous locations, using real measurement results as well as results artificially degraded by noise, proving its reliability.
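
    The three dissimilarity estimators named above, in minimal form for normalized histograms (assumed to sum to 1); the exact variants and parameters used in the paper may differ.

```python
import numpy as np

def minkowski(p, q, r=2):
    return np.sum(np.abs(p - q) ** r) ** (1.0 / r)

def kullback_leibler(p, q, eps=1e-12):
    return np.sum(p * np.log((p + eps) / (q + eps)))

def histogram_intersection(p, q):
    # a similarity in [0, 1]; 1 - intersection serves as a dissimilarity
    return np.sum(np.minimum(p, q))
```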

  7. Infrared image gray adaptive adjusting enhancement algorithm based on gray redundancy histogram-dealing technique

    Science.gov (United States)

    Hao, Zi-long; Liu, Yong; Chen, Ruo-wang

    2016-11-01

Building on the histogram equalization algorithm for image enhancement in digital image processing, an infrared image gray adaptive adjusting enhancement algorithm based on a gray-redundancy histogram-dealing technique is proposed. The algorithm first determines the overall gray value of the image, raises or lowers it by adding appropriate gray points, and then uses a gray-level-redundancy histogram equalization method to compress the gray scale of the image. The algorithm can enhance image detail information. Through MATLAB simulation, this paper compares the algorithm with the histogram equalization method and with the algorithm based on the gray-redundancy histogram-dealing technique alone, and verifies the effectiveness of the algorithm.

  8. Adaptive Histogram Equalization Based Image Forensics Using Statistics of DC DCT Coefficients

    Directory of Open Access Journals (Sweden)

    Neetu Singh

    2018-01-01

Full Text Available Digital images are increasingly vulnerable to manipulation, which has motivated an area of research dealing with digital image forgeries. Certifying the origin and content of digital images is an open problem in the multimedia world. One way to establish the truth of an image is to detect the presence of any type of contrast enhancement. In this work, a novel and simple machine learning tool is proposed to detect the presence of histogram equalization using statistical parameters of DC Discrete Cosine Transform (DCT) coefficients. The statistical parameters of the Gaussian Mixture Model (GMM) fitted to the DC DCT coefficients are used as features for classifying original and histogram-equalized images. An SVM classifier has been developed to classify original and histogram-equalized images; it can detect histogram-equalized images with accuracy greater than 95% at a false rate below 5%.
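
    A hypothetical sketch of such a feature pipeline: blockwise DC DCT coefficients per image, a GMM fitted to them, and the GMM parameters fed to an SVM. The component count, block size, and SVM settings are illustrative assumptions, not the paper's values.

```python
import numpy as np
from scipy.fft import dctn
from sklearn.mixture import GaussianMixture
from sklearn.svm import SVC

def dc_dct_coefficients(img, block=8):
    # DC (0,0) coefficient of the 2-D DCT of each non-overlapping block
    h, w = (s - s % block for s in img.shape)
    return np.array([dctn(img[i:i + block, j:j + block], norm='ortho')[0, 0]
                     for i in range(0, h, block) for j in range(0, w, block)])

def gmm_features(img, n_components=3):
    gmm = GaussianMixture(n_components=n_components, random_state=0)
    gmm.fit(dc_dct_coefficients(img).reshape(-1, 1))
    return np.hstack([gmm.weights_, gmm.means_.ravel(), gmm.covariances_.ravel()])

# features = np.array([gmm_features(im) for im in images])
# clf = SVC().fit(features, labels)  # labels: 0 = original, 1 = equalized
```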

  9. Color Histograms Adapted to Query-Target Images for Object Recognition across Illumination Changes

    Directory of Open Access Journals (Sweden)

    Jack-Gérard Postaire

    2005-08-01

Full Text Available Most object recognition schemes fail in the case of illumination changes between the color image acquisitions. One of the most widely used solutions to this problem is to compare the images by means of the intersection between invariant color histograms. The main originality of our approach is to cope with the problem of illumination changes by analyzing each pair of query and target images constructed during the retrieval, instead of considering each image of the database independently. In this paper, we propose a new approach which determines color histograms adapted to each pair of images. These adapted color histograms are obtained so that their intersection is higher when the two images are similar than when they are different. The computation of the adapted color histograms is based on an original model of illumination changes that uses rank measures of the pixels within the color component images.

  10. Molecular analysis of Culex quinquefasciatus larvae responses to Lysinibacillus sphaericus Bin toxin.

    Science.gov (United States)

    Tangsongcharoen, Chontida; Jupatanakul, Natapong; Promdonkoy, Boonhiang; Dimopoulos, George; Boonserm, Panadda

    2017-01-01

    Lysinibacillus sphaericus produces the mosquito larvicidal binary toxin consisting of BinA and BinB, which are both required for toxicity against Culex and Anopheles larvae. The molecular mechanisms behind Bin toxin-induced damage remain unexplored. We used whole-genome microarray-based transcriptome analysis to better understand how Culex larvae respond to Bin toxin treatment at the molecular level. Our analyses of Culex quinquefasciatus larvae transcriptome changes at 6, 12, and 18 h after Bin toxin treatment revealed a wide range of transcript signatures, including genes linked to the cytoskeleton, metabolism, immunity, and cellular stress, with a greater number of down-regulated genes than up-regulated genes. Bin toxin appears to mainly repress the expression of genes involved in metabolism, the mitochondrial electron transport chain, and the protein transporter of the outer/inner mitochondrial membrane. The induced genes encode proteins linked to mitochondrial-mediated apoptosis and cellular detoxification including autophagic processes and lysosomal compartments. This study is, to our knowledge, the first microarray analysis of Bin toxin-induced transcriptional responses in Culex larvae, providing a basis for an in-depth understanding of the molecular nature of Bin toxin-induced damage.

  11. A compost bin for handling privy wastes: its fabrication and use

    Science.gov (United States)

    R.E. Leonard; S.C. Fay

    1978-01-01

A 24-ft³ (0.68-m³) fiberglass bin was constructed and tested for its effectiveness in composting privy wastes. A mixture of ground hardwood bark and raw sewage was used for composting. Temperatures in excess of 60°C for 36 hours were produced in the bin by aerobic, thermophilic composting. This temperature is...

  12. 29 CFR 1917.49 - Spouts, chutes, hoppers, bins, and associated equipment.

    Science.gov (United States)

    2010-07-01

    ... ADMINISTRATION, DEPARTMENT OF LABOR (CONTINUED) MARINE TERMINALS Cargo Handling Gear and Equipment § 1917.49... gear used as a permanent part of spouts, chutes or similar devices shall be inspected before each use... until the employee has left the bin. (j) Bin top openings that present a hazard to employees shall be...

  13. Waste Isolation Pilot Plant Dry Bin-Scale Integrated Systems Checkout Plan

    International Nuclear Information System (INIS)

    1991-04-01

In order to determine the long-term performance of the Waste Isolation Pilot Plant (WIPP) disposal system, in accordance with the requirements of the US Environmental Protection Agency (EPA) Standard 40 CFR 191, Subpart B, Sections 13 and 15, two performance assessment tests will be conducted. The tests are titled WIPP Bin-Scale Contact Handled (CH) Transuranic (TRU) Waste Tests and WIPP In Situ Alcove CH TRU Waste Tests. These tests are designed to measure the gas generation characteristics of CH TRU waste. Much of the waste will be specially prepared to provide data for a better understanding of the interactions due to differing degradation modes, waste forms, and repository environmental effects. The bin-scale test is designed to emplace nominally 146 bins. The majority of the bins will contain various forms of waste. Eight bins will be used as reference bins and will contain no waste. This checkout plan exercises the systems, operating procedures, and training readiness of personnel to safely carry out those specifically dedicated activities associated with conducting the bin-scale test plan for dry bins only. The plan does not address the entire WIPP facility readiness state. 18 refs., 6 figs., 3 tabs

  14. Using a combination of binning strategies and taxonomic approaches to unravel the anaerobic digestion microbiome

    DEFF Research Database (Denmark)

    Campanaro, Stefano; Treu, Laura; Kougias, Panagiotis

of scaffolds comprising thousands of genome sequences, but the binning of these scaffolds into OTUs representative of microbial genomes is still challenging. In an attempt to obtain a deep characterization of the anaerobic digestion microbiome, different metagenomic binning approaches were integrated...

  15. An Automated Energy Detection Algorithm Based on Kurtosis-Histogram Excision

    Science.gov (United States)

    2018-01-01

US Army Research Laboratory technical report ARL-TR-8269, January 2018.

  16. Improving contig binning of metagenomic data using [Formula: see text] oligonucleotide frequency dissimilarity.

    Science.gov (United States)

    Wang, Ying; Wang, Kun; Lu, Yang Young; Sun, Fengzhu

    2017-09-20

    Metagenomics sequencing provides deep insights into microbial communities. To investigate their taxonomic structure, binning assembled contigs into discrete clusters is critical. Many binning algorithms have been developed, but their performance is not always satisfactory, especially for complex microbial communities, calling for further development. According to previous studies, relative sequence compositions are similar across different regions of the same genome, but they differ between distinct genomes. Generally, current tools have used the normalized frequency of k-tuples directly, but this represents an absolute, not relative, sequence composition. Therefore, we attempted to model contigs using relative k-tuple composition, followed by measuring dissimilarity between contigs using [Formula: see text]. The [Formula: see text] was designed to measure the dissimilarity between two long sequences or Next-Generation Sequencing data with the Markov models of the background genomes. This method was effective in revealing group and gradient relationships between genomes, metagenomes and metatranscriptomes. With many binning tools available, we do not try to bin contigs from scratch. Instead, we developed [Formula: see text] to adjust contigs among bins based on the output of existing binning tools for a single metagenomic sample. The tool is taxonomy-free and depends only on k-tuples. To evaluate the performance of [Formula: see text], five widely used binning tools with different strategies of sequence composition or the hybrid of sequence composition and abundance were selected to bin six synthetic and real datasets, after which [Formula: see text] was applied to adjust the binning results. Our experiments showed that [Formula: see text] consistently achieves the best performance with tuple length k = 6 under the independent identically distributed (i.i.d.) background model. Using the metrics of recall, precision and ARI (Adjusted Rand Index), [Formula: see

  17. Applied cost allocation

    DEFF Research Database (Denmark)

    Bogetoft, Peter; Hougaard, Jens Leth; Smilgins, Aleksandrs

    2016-01-01

    This paper deals with empirical computation of Aumann–Shapley cost shares for joint production. We show that if one uses a mathematical programing approach with its non-parametric estimation of the cost function there may be observations in the data set for which we have multiple Aumann–Shapley p...... of assumptions concerning firm behavior. These assumptions enable us to connect inefficient with efficient production and thereby provide consistent ways of allocating the costs arising from inefficiency....

  18. The load-balanced multi-dimensional bin-packing problem

    DEFF Research Database (Denmark)

    Trivella, Alessio; Pisinger, David

    2016-01-01

The bin-packing problem is one of the most investigated and applicable combinatorial optimization problems. In this paper we consider its multi-dimensional version with the practical extension of load balancing, i.e. to find the packing requiring the minimum number of bins while ensuring that the... of interval graphs, and iteratively improves the load balancing of a bin-packing solution using different search levels. The first level explores the space of transitive orientations of the complement graphs associated with the packing, the second modifies the structure itself of the interval graphs...
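
    For orientation, the classic one-dimensional first-fit-decreasing heuristic; the paper's multi-dimensional, load-balanced variant and its interval-graph search levels go far beyond this sketch.

```python
# Minimal first-fit-decreasing sketch for classic 1-D bin packing.
def first_fit_decreasing(items, capacity):
    bins = []  # each bin is a list of item sizes
    for item in sorted(items, reverse=True):
        for b in bins:
            if sum(b) + item <= capacity:
                b.append(item)
                break
        else:
            bins.append([item])
    return bins

print(first_fit_decreasing([4, 8, 1, 4, 2, 1], capacity=10))
# [[8, 2], [4, 4, 1, 1]] -> 2 bins
```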

  19. MetaBinG: using GPUs to accelerate metagenomic sequence classification.

    Directory of Open Access Journals (Sweden)

    Peng Jia

Full Text Available Metagenomic sequence classification is a procedure to assign sequences to their source genomes. It is one of the important steps in metagenomic sequence data analysis. Although many methods exist, classifying high-throughput metagenomic sequence data in a limited time is still a challenge. We present here an ultra-fast metagenomic sequence classification system (MetaBinG) using graphics processing units (GPUs). The accuracy of MetaBinG is comparable to the best existing systems, and it can classify a million 454 reads within five minutes, more than two orders of magnitude faster than existing systems. MetaBinG is publicly available at http://cbb.sjtu.edu.cn/~ccwei/pub/software/MetaBinG/MetaBinG.php.

  20. Exosomes: From Garbage Bins to Promising Therapeutic Targets.

    Science.gov (United States)

    H Rashed, Mohammed; Bayraktar, Emine; K Helal, Gouda; Abd-Ellah, Mohamed F; Amero, Paola; Chavez-Reyes, Arturo; Rodriguez-Aguayo, Cristian

    2017-03-02

Intercellular communication via cell-released vesicles is a very important process for both normal and tumor cells. Cell communication may involve exosomes, small vesicles of endocytic origin that are released by all types of cells and are found in abundance in body fluids, including blood, saliva, urine, and breast milk. Exosomes have been shown to carry lipids, proteins, mRNAs, non-coding RNAs, and even DNA out of cells. They are more than simply molecular garbage bins, however, in that the molecules they carry can be taken up by other cells. Thus, exosomes transfer biological information to neighboring cells, and through this cell-to-cell communication they are involved not only in physiological functions but also in the pathogenesis of some diseases, including tumors and neurodegenerative conditions. Our increasing understanding of why cells release exosomes and their role in intercellular communication has revealed the very complex and sophisticated contribution of exosomes to health and disease. The aim of this review is to reveal the emerging roles of exosomes in normal and pathological conditions and to describe the controversial biological role of exosomes, as it is now understood, in carcinogenesis. We also summarize what is known about exosome biogenesis, composition, functions, and pathways, and discuss the potential clinical applications of exosomes, especially as biomarkers and novel therapeutic agents.

  1. Designing a power supply for Nim-bin formatted equipment

    International Nuclear Information System (INIS)

    Banuelos G, L. E.; Hernandez D, V. M.; Vega C, H. R.

    2016-09-01

From an old Nuclear Chicago power supply that was practically in the trash, we were able to recover the 19-inch casing, the rear connectors and the housing where the circuits were mounted. All mechanical parts were cleaned, and an electronic design was started to replace the original voltage and current functions of the equipment. The cards for the ±6, ±12 and ±24 V outputs were designed, simulated and tested with circuitry that does not rely on specialized components sold only by the equipment manufacturer. The current capacity of each output voltage was matched to the specifications of manufacturers such as Ortec or Canberra, whose comparable power supply models deliver 160 Watts. Basic tests, such as the full-load regulation index and the noise level in the supply voltages, were performed to show that the behavior is very similar to commercial equipment. Our NIM-bin voltage source is therefore viable for use in our institution's laboratories. (Author)

  2. Looking at large data sets using binned data plots

    Energy Technology Data Exchange (ETDEWEB)

    Carr, D.B.

    1990-04-01

This report addresses the monumental challenge of developing exploratory analysis methods for large data sets. The goals of the report are to increase awareness of large data set problems and to contribute simple graphical methods that address some of them. The graphical methods focus on two- and three-dimensional data and common tasks such as finding outliers and tail structure, assessing central structure, and comparing central structures. The methods handle large-sample-size problems through binning, incorporate information from statistical models, and adapt image processing algorithms. Examples demonstrate the application of the methods to a variety of publicly available large data sets. The most novel application addresses the "too many plots to examine" problem by using cognostics, computer-guided diagnostics, to prioritize plots. The particular application prioritizes views of computational fluid dynamics solution sets on the fly. That is, as each time step of a solution set is generated on a parallel processor, the cognostics algorithms assess virtual plots based on the previous time step. Work in such areas is in its infancy and the examples suggest numerous challenges that remain. 35 refs., 15 figs.
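
    A modern equivalent of the report's binned scatter plots, assuming matplotlib: hexagonal binning replaces a million overplotted points with per-bin counts.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
x, y = rng.standard_normal((2, 1_000_000))
plt.hexbin(x, y, gridsize=60, bins='log')  # log-scaled counts per hexagon
plt.colorbar(label='log10(count)')
plt.show()
```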

  3. DE-STRIPING FOR TDICCD REMOTE SENSING IMAGE BASED ON STATISTICAL FEATURES OF HISTOGRAM

    Directory of Open Access Journals (Sweden)

    H.-T. Gao

    2016-06-01

Full Text Available Aiming at the striping noise caused by the non-uniform response of remote sensing TDI CCDs, a novel de-striping method based on statistical features of the image histogram is put forward. By analysing the distribution of histograms, the histogram centroid is selected as an eigenvalue representing the uniformity of ground objects. The histogram centroids of the whole image and of each detector column are calculated first, and the differences between them are regarded as rough correction coefficients. Then, to avoid the sensitivity caused by a single parameter, and considering the strong continuity and correlation of ground objects between two adjacent columns, the correlation coefficient of the histograms is introduced to reflect their similarity; a fine correction coefficient is obtained by searching around the rough correction coefficient. Additionally, in view of the influence of bright clouds on the histogram, automatic cloud detection based on multiple features, including grey level, texture, fractal dimension and edges, is used to pre-process the image. Two level-0 panchromatic images from the SJ-9A satellite with obvious stripe noise were processed by the proposed method to evaluate its performance. Results show that the visual quality of the images is improved because the stripe noise is entirely removed; we quantitatively analysed the result by calculating the non-uniformity, which reached about 1% and is better than the histogram matching method.
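
    A greatly simplified sketch of the rough-correction step, under the assumption that stripes run along detector columns: the mean gray level of each column stands in for the histogram centroid and is compared with the whole-image mean to give a per-column offset. The histogram-correlation refinement and cloud masking are omitted, and the function name is hypothetical.

```python
import numpy as np

def rough_destripe(img):
    x = img.astype(float)
    column_centroid = x.mean(axis=0)          # one value per detector column
    rough_coeff = x.mean() - column_centroid  # difference to global centroid
    return x + rough_coeff[np.newaxis, :]
```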

  4. Determination of pulp necrosis based on periapical digital radiography histogram and pulp histopathology

    Directory of Open Access Journals (Sweden)

    Emi Khoironi

    2017-11-01

Full Text Available Introduction: Radiographic examination is needed, in addition to clinical examination, to determine the diagnosis of pulp necrosis. Visual observation is limited in discerning the degree of colour change, hence the effort to assess the histogram value. The purpose of this study was to obtain the pulp chamber histogram pattern, which reveals its grey-scale value, trend, intensity average, histogram variation, and maximum histogram region of interest (ROI), through digital periapical radiography. Methods: This was a descriptive study of a total of nine pulp chamber periapical radiograph data samples. The samples were divided into three groups: the 1st group was data taken prior to tooth extraction, the 2nd group was data collected after tooth extraction, and the 3rd group was data from previously pulpless teeth. Results: There was a tendency of the histogram to shift to the left, towards the radiolucent area, in the ROI of the pulp at the apical region, whilst histopathologically a massive infiltration of round PMN cells was found in the area. This finding supported the determination of the pulp necrosis diagnosis. Conclusion: Teeth with pulp necrosis showed a tendency towards radiolucency on the periapical radiograph histogram, and histopathologic examination showed massive infiltration of round PMN cells, thus supporting the pulp necrosis diagnosis.

  5. Surface contamination of hazardous drug pharmacy storage bins and pharmacy distributor shipping containers.

    Science.gov (United States)

    Redic, Kimberly A; Fang, Kayleen; Christen, Catherine; Chaffee, Bruce W

    2018-03-01

Purpose: This study was conducted to determine whether there is contamination on exterior drug packaging, using shipping totes from the distributor and carousel storage bins as surrogate markers of external packaging contamination. Methods: A two-part study was conducted to measure the presence of 5-fluorouracil, ifosfamide, cyclophosphamide, docetaxel and paclitaxel using surrogate markers for external drug packaging. In Part I, 10 drug distributor shipping totes designated for transport of hazardous drugs provided a snapshot view of contamination from regular use and transit in and out of the pharmacy. An additional two totes designated for transport of non-hazardous drugs served as controls. In Part II, old carousel storage bins (i.e. those in use pre-study) were wiped for a snapshot view of hazardous drug contamination on storage bins. New carousel storage bins were then put into use for storage of the five tested drugs and used for routine storage and inventory maintenance activities. Carousel bins were wiped at 0, 8, 16 and 52 weeks to measure surface contamination. Results: Two of the 10 hazardous shipping totes were contaminated. Three of the five old carousel bins were contaminated with cyclophosphamide. One of the old carousel bins was also contaminated with ifosfamide. There were no detectable levels of hazardous drugs on any of the new storage bins at 0, 8 or 16 weeks. However, at Week 52 there was a detectable level of 5-FU in the 5-FU carousel bin. Conclusions: Contamination of the surrogate markers suggests that external packaging for hazardous drugs is contaminated, either during the manufacturing process or during routine chain-of-custody activities. These results demonstrate that occupational exposure may occur due to contamination from shipping totes and storage bins, and that handling practices including the use of personal protective equipment are warranted.

  6. Solid waste bin detection and classification using Dynamic Time Warping and MLP classifier

    Energy Technology Data Exchange (ETDEWEB)

    Islam, Md. Shafiqul, E-mail: shafique@eng.ukm.my [Dept. of Electrical, Electronic and Systems Engineering, Universiti Kebangsaan Malaysia, Bangi 43600, Selangore (Malaysia); Hannan, M.A., E-mail: hannan@eng.ukm.my [Dept. of Electrical, Electronic and Systems Engineering, Universiti Kebangsaan Malaysia, Bangi 43600, Selangore (Malaysia); Basri, Hassan [Dept. of Civil and Structural Engineering, Universiti Kebangsaan Malaysia, Bangi 43600, Selangore (Malaysia); Hussain, Aini; Arebey, Maher [Dept. of Electrical, Electronic and Systems Engineering, Universiti Kebangsaan Malaysia, Bangi 43600, Selangore (Malaysia)

    2014-02-15

Highlights: • Solid waste bin level detection using Dynamic Time Warping (DTW). • Gabor wavelet filter is used to extract the solid waste image features. • Multi-Layer Perceptron classifier network is used for bin image classification. • The classification performance is evaluated by ROC curve analysis. - Abstract: The increasing requirement for Solid Waste Management (SWM) has become a significant challenge for municipal authorities. A number of integrated systems and methods have been introduced to overcome this challenge. Many researchers have aimed to develop an ideal SWM system, including approaches involving software-based routing, Geographic Information Systems (GIS), Radio-Frequency Identification (RFID), or sensor-equipped intelligent bins. Image processing solutions for Solid Waste (SW) collection have also been developed; however, while capturing the bin image it is challenging to position the camera so that the bin area is centered in the image. As yet, there is no ideal system that can correctly estimate the amount of SW. This paper discusses an efficient image processing solution to overcome these problems. Dynamic Time Warping (DTW) was used for detecting and cropping the bin area, and a Gabor wavelet (GW) was introduced for feature extraction from the waste bin image. The image features were used to train the classifier. A Multi-Layer Perceptron (MLP) classifier was used to classify the waste bin level and estimate the amount of waste inside the bin. The area under the Receiver Operating Characteristic (ROC) curve was used to statistically evaluate classifier performance. The results of the developed system are comparable to previous image-processing-based systems. The system demonstration using DTW with GW for feature extraction and an MLP classifier led to promising results with respect to the accuracy of waste level estimation (98.50%). The application can be used to optimize the routing of waste collection based on the estimated bin level.

  7. Solid waste bin detection and classification using Dynamic Time Warping and MLP classifier

    International Nuclear Information System (INIS)

    Islam, Md. Shafiqul; Hannan, M.A.; Basri, Hassan; Hussain, Aini; Arebey, Maher

    2014-01-01

Highlights: • Solid waste bin level detection using Dynamic Time Warping (DTW). • Gabor wavelet filter is used to extract the solid waste image features. • Multi-Layer Perceptron classifier network is used for bin image classification. • The classification performance is evaluated by ROC curve analysis. - Abstract: The increasing requirement for Solid Waste Management (SWM) has become a significant challenge for municipal authorities. A number of integrated systems and methods have been introduced to overcome this challenge. Many researchers have aimed to develop an ideal SWM system, including approaches involving software-based routing, Geographic Information Systems (GIS), Radio-Frequency Identification (RFID), or sensor-equipped intelligent bins. Image processing solutions for Solid Waste (SW) collection have also been developed; however, while capturing the bin image it is challenging to position the camera so that the bin area is centered in the image. As yet, there is no ideal system that can correctly estimate the amount of SW. This paper discusses an efficient image processing solution to overcome these problems. Dynamic Time Warping (DTW) was used for detecting and cropping the bin area, and a Gabor wavelet (GW) was introduced for feature extraction from the waste bin image. The image features were used to train the classifier. A Multi-Layer Perceptron (MLP) classifier was used to classify the waste bin level and estimate the amount of waste inside the bin. The area under the Receiver Operating Characteristic (ROC) curve was used to statistically evaluate classifier performance. The results of the developed system are comparable to previous image-processing-based systems. The system demonstration using DTW with GW for feature extraction and an MLP classifier led to promising results with respect to the accuracy of waste level estimation (98.50%). The application can be used to optimize the routing of waste collection based on the estimated bin level.
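
    For reference, a textbook dynamic-time-warping distance in plain numpy, the alignment measure used in both records above; the Gabor-wavelet features and the MLP classifier are not reproduced here.

```python
import numpy as np

def dtw_distance(a, b):
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]
```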

  8. Cost allocation with limited information

    DEFF Research Database (Denmark)

    Hougaard, Jens Leth; Tind, Jørgen

This article investigates the progressive development of Aumann-Shapley cost allocation in a multilevel organizational or production structure. In particular, we study a linear parametric programming setup utilizing the Dantzig-Wolfe decomposition procedure. Typically, cost allocation takes place after all activities have been performed, for example after finishing all outputs. Here the allocation is made progressively with suggestions for activities. In other words, cost allocation is performed in parallel with, for example, a production planning process. This development does not require detailed...

  9. Regulation of the interaction between the neuronal BIN1 isoform 1 and Tau proteins - role of the SH3 domain.

    Science.gov (United States)

    Malki, Idir; Cantrelle, François-Xavier; Sottejeau, Yoann; Lippens, Guy; Lambert, Jean-Charles; Landrieu, Isabelle

    2017-10-01

The bridging integrator 1 (bin1) gene is a genetic determinant of Alzheimer's disease (AD) and has been reported to modulate Alzheimer's pathogenesis through pathway(s) involving Tau. The functional impact of the Tau/BIN1 interaction as well as the molecular details of this interaction are still not fully resolved. As a consequence, how BIN1 affects AD risk through its interaction with Tau is also still undetermined. To progress in this understanding, the interaction of Tau with two BIN1 isoforms was investigated using nuclear magnetic resonance spectroscopy. ¹H,¹⁵N spectra showed that the C-terminal SH3 domain of BIN1 isoform 1 (BIN1Iso1) is not mobile in solution but locked with the core of the protein. In contrast, the SH3 domain of BIN1 isoform 9 (BIN1Iso9) behaves as an independent mobile domain. This reveals an equilibrium between closed and open conformations for the SH3 domain. Interestingly, a 334-376 peptide from the clathrin and AP-2-binding (CLAP) domain of BIN1Iso1, which contains an SH3-binding site, is able to compete with the BIN1-SH3 intramolecular interaction. For both BIN1 isoforms, the SH3 domain can interact with the Tau(210-240) sequence. The Tau(210-240) peptide can indeed displace the intramolecular interaction of the BIN1-SH3 of BIN1Iso1 and form a complex with the released domain. The measured Kd values were in agreement with a stronger affinity of the Tau peptide. Both the CLAP and Tau peptides occupied the same surface on the BIN1-SH3 domain, showing that their interactions are mutually exclusive. These results emphasize an additional level of complexity in the regulation of the interaction between BIN1 and Tau, dependent on the BIN1 isoform. © 2017 Federation of European Biochemical Societies.

  10. Strain histograms are equal to strain ratios in predicting malignancy in breast tumours

    Science.gov (United States)

    Ewertsen, Caroline; Sletting, Susanne; Talman, Maj-Lis; Vejborg, Ilse; Bachmann Nielsen, Michael

    2017-01-01

Objectives: To assess whether strain histograms are equal to strain ratios in predicting breast tumour malignancy and to see if either could be used to upgrade Breast Imaging Reporting and Data System (BI-RADS) 3 tumours for immediate biopsy. Methods: Ninety-nine breast tumours were examined using B-mode BI-RADS scorings and strain elastography. Strain histograms and ratios were assessed, and the area under the receiver operating characteristic curve (AUROC) was calculated for each method. In BI-RADS 3 tumours, cut-offs for strain histogram and ratio values were calculated to see if some tumours could be upgraded for immediate biopsy. Linear regression was performed to evaluate the effect of tumour depth and size, and breast density, on strain elastography. Results: Forty-four of 99 (44.4%) tumours were malignant. AUROCs of BI-RADS, strain histograms and strain ratios were 0.949, 0.830 and 0.794, respectively. There was no significant difference between the AUROCs of strain histograms and strain ratios (P = 0.405), while they were both inferior to BI-RADS scoring (P < 0.001, P = 0.008). Four out of 26 BI-RADS 3 tumours were malignant. When cut-offs of 189 for strain histograms and 1.44 for strain ratios were used to upgrade BI-RADS 3 tumours, AUROCs were 0.961 (strain histograms and BI-RADS) and 0.941 (strain ratios and BI-RADS). Neither was significantly different from BI-RADS scoring alone (P = 0.249 and P = 0.414). Tumour size and depth, and breast density influenced neither strain histograms (P = 0.196, P = 0.115 and P = 0.321) nor strain ratios (P = 0.411, P = 0.596 and P = 0.321). Conclusion: Strain histogram analyses are reliable and easy to do in breast cancer diagnosis and perform comparably to strain ratio analyses. No significant difference in AUROCs between BI-RADS scoring and elastography combined with BI-RADS scoring was found in this study. PMID:29073170

  11. Strain histograms are equal to strain ratios in predicting malignancy in breast tumours.

    Directory of Open Access Journals (Sweden)

    Jonathan Frederik Carlsen

Full Text Available To assess whether strain histograms are equal to strain ratios in predicting breast tumour malignancy and to see if either could be used to upgrade Breast Imaging Reporting and Data System (BI-RADS) 3 tumours for immediate biopsy. Ninety-nine breast tumours were examined using B-mode BI-RADS scorings and strain elastography. Strain histograms and ratios were assessed, and the area under the receiver operating characteristic curve (AUROC) was calculated for each method. In BI-RADS 3 tumours, cut-offs for strain histogram and ratio values were calculated to see if some tumours could be upgraded for immediate biopsy. Linear regression was performed to evaluate the effect of tumour depth and size, and breast density, on strain elastography. Forty-four of 99 (44.4%) tumours were malignant. AUROCs of BI-RADS, strain histograms and strain ratios were 0.949, 0.830 and 0.794, respectively. There was no significant difference between the AUROCs of strain histograms and strain ratios (P = 0.405), while they were both inferior to BI-RADS scoring (P < 0.001, P = 0.008). Four out of 26 BI-RADS 3 tumours were malignant. When cut-offs of 189 for strain histograms and 1.44 for strain ratios were used to upgrade BI-RADS 3 tumours, AUROCs were 0.961 (strain histograms and BI-RADS) and 0.941 (strain ratios and BI-RADS). Neither was significantly different from BI-RADS scoring alone (P = 0.249 and P = 0.414). Tumour size and depth, and breast density influenced neither strain histograms (P = 0.196, P = 0.115 and P = 0.321) nor strain ratios (P = 0.411, P = 0.596 and P = 0.321). Strain histogram analyses are reliable and easy to do in breast cancer diagnosis and perform comparably to strain ratio analyses. No significant difference in AUROCs between BI-RADS scoring and elastography combined with BI-RADS scoring was found in this study.

  12. Strain histograms are equal to strain ratios in predicting malignancy in breast tumours

    DEFF Research Database (Denmark)

    Carlsen, Jonathan Frederik; Ewertsen, Caroline; Sletting, Susanne

    2017-01-01

could be upgraded for immediate biopsy. Linear regression was performed to evaluate the effect of tumour depth and size, and breast density on strain elastography. Results: Forty-four of 99 (44.4%) tumours were malignant. AUROCs of BI-RADS, strain histograms and strain ratios were 0.949, 0.830 and 0.794, respectively. There was no significant difference between the AUROCs of strain histograms and strain ratios (P = 0.405), while they were both inferior to BI-RADS scoring (P < 0.001, P = 0.008). Four out of 26 BI-RADS 3 tumours were malignant. When cut-offs of 189 for strain histograms and 1.44 for strain ratios were used to upgrade BI-RADS 3 tumours, AUROCs were 0.961 (strain histograms and BI-RADS) and 0.941 (strain ratios and BI-RADS). Neither was significantly different from BI-RADS scoring alone (P = 0.249 and P = 0.414). Tumour size and depth, and breast density influenced neither strain histograms (P = 0...

  13. Image contrast enhancement using adjacent-blocks-based modification for local histogram equalization

    Science.gov (United States)

    Wang, Yang; Pan, Zhibin

    2017-11-01

Infrared images usually have non-ideal characteristics such as weak target-to-background contrast and strong noise. Because of these characteristics, it is necessary to apply contrast enhancement algorithms to improve the visual quality of infrared images. The histogram equalization (HE) algorithm is a widely used contrast enhancement algorithm due to its effectiveness and simple implementation. A drawback of the HE algorithm, however, is that the local contrast of an image cannot be equally enhanced. Local histogram equalization algorithms have proved to be effective techniques for local image contrast enhancement. However, over-enhancement of noise and artifacts can easily be found in locally equalized images. In this paper, a new contrast enhancement technique based on local histogram equalization is proposed to overcome the drawbacks mentioned above. The input images are segmented into three kinds of overlapped sub-blocks using their gradients. To overcome the over-enhancement effect, the histograms of these sub-blocks are then modified by adjacent sub-blocks. We pay particular attention to improving the contrast of detail information while the brightness of flat regions in these sub-blocks is well preserved. It will be shown that the proposed algorithm outperforms other related algorithms by enhancing the local contrast without introducing over-enhancement effects and additional noise.
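
    For comparison, the stock contrast-limited adaptive histogram equalization (CLAHE) available in OpenCV, a standard local-HE baseline; the paper's gradient-based block segmentation and adjacent-block histogram modification are its own contribution and are not part of this call. The file name is hypothetical.

```python
import cv2

img = cv2.imread('infrared.png', cv2.IMREAD_GRAYSCALE)  # hypothetical input
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
enhanced = clahe.apply(img)
```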

  14. The L_infinity constrained global optimal histogram equalization technique for real time imaging

    Science.gov (United States)

    Ren, Qiongwei; Niu, Yi; Liu, Lin; Jiao, Yang; Shi, Guangming

    2015-08-01

Although current imaging sensors can achieve 12-bit or higher precision, current display devices and the commonly used digital image formats are still only 8 bits. This mismatch causes significant waste of sensor precision and loss of information when storing and displaying images. For better use of the precision budget, tone mapping operators have to be used to map the high-precision data into low-precision digital images adaptively. In this paper, the classic histogram equalization tone mapping operator is reexamined in the sense of optimization. We point out that the traditional histogram equalization technique and its variants are fundamentally limited by local-optimum problems. To overcome this drawback, we remodel the histogram equalization tone mapping task based on graph theory, which achieves globally optimal solutions. Another advantage of graph-based modeling is that tone continuity is also modeled as a vital constraint in our approach, which suppresses the annoying boundary artifacts of the traditional approaches. In addition, we propose a novel dynamic programming technique to solve the histogram equalization problem in real time. Experimental results show that the proposed tone-preserving globally optimal histogram equalization technique outperforms the traditional approaches by exhibiting more subtle details in the foreground while preserving the smoothness of the background.

  15. Improved Steganographic Method Preserving Pixel-Value Differencing Histogram with Modulus Function

    Directory of Open Access Journals (Sweden)

    Lee Hae-Yeoun

    2010-01-01

Full Text Available We herein advance a secure steganographic algorithm that uses a turnover policy and a novel adjusting process. Although the method of Wang et al. uses Pixel-Value Differencing (PVD) and their modulus function provides high capacity and good image quality, the embedding process causes a number of artifacts, such as abnormal increases and fluctuations in the PVD histogram, which may reveal the existence of the hidden message. In order to enhance the security of the algorithm, a turnover policy is used that prevents abnormal increases in the histogram values and a novel adjusting process is devised to remove the fluctuations at the border of the subrange in the PVD histogram. The proposed method therefore eliminates all the weaknesses of the PVD steganographic methods thus far proposed and guarantees secure communication. In the experiments described herein, the proposed algorithm is compared with other PVD steganographic algorithms by using well-known steganalysis techniques, such as RS-analysis, steganalysis for LSB matching, and histogram-based attacks. The results support our contention that the proposed method enhances security by keeping the PVD histogram similar to the cover, while also providing high embedding capacity and good imperceptibility to the naked eye.

  16. Improved Steganographic Method Preserving Pixel-Value Differencing Histogram with Modulus Function

    Directory of Open Access Journals (Sweden)

    Heung-Kyu Lee

    2010-01-01

Full Text Available We herein advance a secure steganographic algorithm that uses a turnover policy and a novel adjusting process. Although the method of Wang et al. uses Pixel-Value Differencing (PVD) and their modulus function provides high capacity and good image quality, the embedding process causes a number of artifacts, such as abnormal increases and fluctuations in the PVD histogram, which may reveal the existence of the hidden message. In order to enhance the security of the algorithm, a turnover policy is used that prevents abnormal increases in the histogram values and a novel adjusting process is devised to remove the fluctuations at the border of the subrange in the PVD histogram. The proposed method therefore eliminates all the weaknesses of the PVD steganographic methods thus far proposed and guarantees secure communication. In the experiments described herein, the proposed algorithm is compared with other PVD steganographic algorithms by using well-known steganalysis techniques, such as RS-analysis, steganalysis for LSB matching, and histogram-based attacks. The results support our contention that the proposed method enhances security by keeping the PVD histogram similar to the cover, while also providing high embedding capacity and good imperceptibility to the naked eye.

  17. Unsupervised Binning of Metagenomic Assembled Contigs Using Improved Fuzzy C-Means Method.

    Science.gov (United States)

    Liu, Yun; Hou, Tao; Kang, Bing; Liu, Fu

    2017-01-01

Metagenomic contig binning is a necessary step of metagenome analysis. After assembly, the number of contigs belonging to different genomes is usually unequal, so a metagenomic contig dataset is a kind of imbalanced dataset, which the traditional fuzzy c-means method (FCM) fails to handle well. In this paper, we introduce an improved version of the fuzzy c-means method (IFCM) for metagenomic contig binning. First, tetranucleotide frequencies are calculated for every contig. Second, the number of bins is roughly estimated from the distribution of genome lengths of a complete set of non-draft sequenced microbial genomes from NCBI. Then, IFCM is used to cluster DNA contigs with the estimated result. Finally, a clustering validity function is utilized to determine the binning result. We tested this method on one synthetic and two real datasets, and experimental results have shown the effectiveness of this method compared with other tools.
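
    A plain fuzzy c-means loop in numpy for reference; the paper's improvements for imbalanced bins (IFCM) and its validity function are not shown, and the function name and defaults are illustrative.

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, n_iter=100, seed=0):
    # X: (n_samples, n_features), e.g. tetranucleotide frequency vectors
    rng = np.random.default_rng(seed)
    u = rng.random((len(X), c))
    u /= u.sum(axis=1, keepdims=True)                 # fuzzy memberships
    for _ in range(n_iter):
        w = u ** m
        centers = (w.T @ X) / w.sum(axis=0)[:, None]  # membership-weighted means
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        d = np.fmax(d, 1e-12)                         # avoid division by zero
        p = 2.0 / (m - 1.0)
        u = d ** -p / np.sum(d ** -p, axis=1, keepdims=True)
    return centers, u
```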

  18. VL1 MARS METEOROLOGY DATA RESAMPLED DATA BINNED-P-T-V V1.0

    Data.gov (United States)

    National Aeronautics and Space Administration — This data set contains binned and splined data obtained from the Viking Meteorology Instrument System (VMIS) through portions of the Viking Lander 1 mission. The...

  19. VO1/VO2 MARS IRTM BINNED DATA AND DERIVED CLOUDS V1.0

    Data.gov (United States)

    National Aeronautics and Space Administration — This data set, derived from the Viking Orbiter Infrared Thermal Mapper (IRTM) data set, has been binned in both space and time. It consists of two complementary...

  20. Deterministically swapping frequency-bin entanglement from photon-photon to atom-photon hybrid systems

    Science.gov (United States)

    Ou, Bao-Quan; Liu, Chang; Sun, Yuan; Chen, Ping-Xing

    2018-02-01

    Inspired by the recent developments of the research on the atom-photon quantum interface and energy-time entanglement between single-photon pulses, we are motivated to study the deterministic protocol for the frequency-bin entanglement of the atom-photon hybrid system, which is analogous to the frequency-bin entanglement between single-photon pulses. We show that such entanglement arises naturally in considering the interaction between a frequency-bin entangled single-photon pulse pair and a single atom coupled to an optical cavity, via straightforward atom-photon phase gate operations. Its anticipated properties and preliminary examples of its potential application in quantum networking are also demonstrated. Moreover, we construct a specific quantum entanglement witness tool to detect such extended frequency-bin entanglement from a reasonably general set of separable states, and prove its capability theoretically. We focus on the energy-time considerations throughout the analysis.

1. MetaBAT: Metagenome Binning based on Abundance and Tetranucleotide frequency

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Dongwan; Froula, Jeff; Egan, Rob; Wang, Zhong

    2014-03-21

Grouping large fragments assembled from shotgun metagenomic sequences to deconvolute complex microbial communities, or metagenome binning, enables the study of individual organisms and their interactions. Here we developed automated metagenome binning software, called MetaBAT, which integrates empirical probabilistic distances of genome abundance and tetranucleotide frequency. On synthetic datasets MetaBAT on average achieves 98% precision and 90% recall at the strain level with 281 near-complete unique genomes. Applying MetaBAT to a human gut microbiome data set, we recovered 176 genome bins with 92% precision and 80% recall. Further analyses suggest MetaBAT is able to recover genome fragments missed in reference genomes up to 19%, while 53 genome bins are novel. In summary, we believe MetaBAT is a powerful tool to facilitate comprehensive understanding of complex microbial communities.
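
    A minimal tetranucleotide-frequency profile for a contig, the composition signal that MetaBAT combines with abundance; MetaBAT's empirical probabilistic distance itself is not reproduced, and the naive 256-k-mer indexing here is an illustrative simplification.

```python
from itertools import product
import numpy as np

KMERS = {''.join(p): i for i, p in enumerate(product('ACGT', repeat=4))}

def tetra_freq(seq):
    counts = np.zeros(len(KMERS))
    seq = seq.upper()
    for i in range(len(seq) - 3):
        k = seq[i:i + 4]
        if k in KMERS:            # skips k-mers containing N, gaps, etc.
            counts[KMERS[k]] += 1
    return counts / max(counts.sum(), 1)
```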

  2. Meeting the EU recycling targets by introducing a 2-compartment bin to households

    DEFF Research Database (Denmark)

    Jensen, Morten Bang; Scheutz, Charlotte; Møller, Jacob

A Danish municipality has introduced a 2-compartment bin in its waste collection scheme; this bin should increase recycling of dry household recyclables. An extensive waste sorting campaign was conducted and the efficiency of the bin assessed. The waste sorting campaign yielded a full waste composition with focus on the dry recyclables, and it was used to determine whether the 2-compartment bin could fulfil the EU recycling targets for 2020. Only 2 of 4 calculation methods for meeting the EU targets were applicable, and only one of these fulfilled the EU target. Even though the EU recycling targets can be fulfilled, there is still room for improvement (increased source separation), especially for hard plastics and metals.

  3. SSC accelerator availability allocation

    International Nuclear Information System (INIS)

    Dixon, K.T.; Franciscovich, J.

    1991-03-01

    Superconducting Super Collider (SSC) operational availability is an area of major concern, judged by the Central Design Group to present such risk that use of modern engineering tools would be essential to program success. Experience has shown that as accelerator beam availability falls below about 80%, efficiency of physics experiments degrades rapidly due to inability to maintain adequate coincident accelerator and detector operation. For this reason, the SSC availability goal has been set at 80%, even though the Fermi National Accelerator Laboratory accelerator, with a fraction of the SSC's complexity, has only recently approached that level. This paper describes the allocation of the top-level goal to part-level reliability and maintainability requirements, and it gives the results of parameter sensitivity studies designed to help identify the best approach to achieve the needed system availability within funding and schedule constraints. 1 ref., 12 figs., 4 tabs

  4. SU-F-T-253: Volumetric Comparison Between 4D CT Amplitude and Phase Binning Mode

    Energy Technology Data Exchange (ETDEWEB)

    Yang, G; Ma, R; Reyngold, M [Memorial Sloan-Kettering Cancer Center, Commack, NY (United States); Li, X; Xiong, W; Gewanter, R [Memorial Sloan-Kettering Cancer Center, Rockville Center, NY (United States); Yorke, E; Mageras, G; Wu, A; Deasy, J; Hunt, M [Memorial Sloan-Kettering Cancer Center, New York, NY (United States); Tang, X [Memorial Sloan-Kettering Cancer Center, West Harrison, NY (United States); Chan, M [Memorial Sloan-Kettering Cancer Center, Basking Ridge, NJ (United States)

    2016-06-15

    Purpose: Motion artifact in 4DCT images can affect radiation treatment quality. To identify the most robust and accurate binning method, we compare the volume difference between targets delineated on amplitude- and phase-binned 4DCT scans. Methods: The Varian RPM system and a CT scanner were used to acquire 4DCTs of a Quasar phantom with embedded cubic and spherical objects undergoing superior-inferior motion. Eight patients' respiration waveforms were used to drive the phantom. The 4DCT scan was reconstructed into 10 phase and 10 amplitude bins (2 mm slices). A scan of the static phantom was also acquired. For each waveform, sphere and cube volumes were generated automatically on each phase using HU thresholding. Phase (amplitude) ITVs were the union of object volumes over all phase (amplitude) binned images. The sphere and cube volumes measured in the static phantom scan were V_sphere = 4.19 cc and V_cube = 27.0 cc. Volume difference (VD) and dice similarity coefficient (DSC) of the ITVs, and mean volume error (MVE), defined as the average target volume percentage difference between each phase image and the static image, were used to evaluate the performance of amplitude and phase binning. Results: Averaged over the eight breathing traces, the VD and DSC of the internal target volume (ITV) between amplitude and phase binning were 3.4%±3.2% (mean ± std) and 95.9%±2.1% for the sphere, and 2.1%±3.3% and 98.0%±1.5% for the cube, respectively. For all waveforms, the average sphere MVE of amplitude and phase binning was 6.5%±5.0% and 8.2%±6.3%, respectively; the average cube MVE of amplitude and phase binning was 5.7%±3.5% and 12.9%±8.9%, respectively. Conclusion: ITV volume and spatial overlap as assessed by VD and DSC are similar between amplitude and phase binning. Compared to phase binning, amplitude binning results in lower MVE, suggesting it is less susceptible to motion artifact.

  5. A histogram modification framework and its application for image contrast enhancement.

    Science.gov (United States)

    Arici, Tarik; Dikbas, Salih; Altunbasak, Yucel

    2009-09-01

    A general framework based on histogram equalization for image contrast enhancement is presented. In this framework, contrast enhancement is posed as an optimization problem that minimizes a cost function. Histogram equalization is an effective technique for contrast enhancement. However, a conventional histogram equalization (HE) usually results in excessive contrast enhancement, which in turn gives the processed image an unnatural look and creates visual artifacts. By introducing specifically designed penalty terms, the level of contrast enhancement can be adjusted; noise robustness, white/black stretching and mean-brightness preservation may easily be incorporated into the optimization. Analytic solutions for some of the important criteria are presented. Finally, a low-complexity algorithm for contrast enhancement is presented, and its performance is demonstrated against a recently proposed method.
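
    A minimal sketch of the core idea, assuming the simplest quadratic penalty: the modified histogram is a convex combination of the input histogram and the uniform histogram, and the enhancement mapping is built from its CDF. The weight lam below is an illustrative stand-in for the paper's penalty parameters:

      import numpy as np

      def penalized_equalization(img, lam=1.0):
          """HE on a modified histogram: lam=0 gives plain HE (strongest
          enhancement); large lam pushes the mapping toward identity."""
          hist = np.bincount(img.ravel(), minlength=256).astype(float)
          hist /= hist.sum()
          uniform = np.full(256, 1.0 / 256)
          h_mod = (hist + lam * uniform) / (1.0 + lam)  # closed-form minimizer
          lut = np.round(255 * np.cumsum(h_mod)).astype(np.uint8)
          return lut[img]

      img = (np.random.rand(64, 64) * 128).astype(np.uint8)  # low-contrast test
      enhanced = penalized_equalization(img, lam=0.5)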

  6. ANALISA PENINGKATAN KUALITAS CITRA BAWAH AIR BERBASIS KOREKSI GAMMA dan HISTOGRAM EQUALIZATION

    Directory of Open Access Journals (Sweden)

    Aria Hendrawan

    2016-11-01

    Full Text Available Underwater images are dark, with quality depending on the water depth at the time of image acquisition. Poor image quality adversely affects the matching of underwater image pairs with the SIFT algorithm. This research applies Gamma Correction and Histogram Equalization as image preprocessing methods to improve the quality of underwater images. The results showed a 27.76% improvement when using Gamma Correction and Histogram Equalization preprocessing compared with no image enhancement. A paired t-test rejected the null hypothesis, indicating a significant difference between applying Gamma Correction with Histogram Equalization and omitting image enhancement.
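
    A minimal sketch of the two-step preprocessing described above (gamma correction, then global histogram equalization); the gamma value is an illustrative assumption, not taken from the paper:

      import numpy as np

      def gamma_correct(img, gamma=0.6):
          """Power-law transform for an 8-bit image; gamma < 1 brightens."""
          return np.round(255 * (img / 255.0) ** gamma).astype(np.uint8)

      def equalize(img):
          """Global histogram equalization for an 8-bit image."""
          hist = np.bincount(img.ravel(), minlength=256)
          cdf = np.cumsum(hist) / img.size
          return np.round(255 * cdf[img]).astype(np.uint8)

      dark = (np.random.rand(48, 48) * 60).astype(np.uint8)  # dark underwater frame
      enhanced = equalize(gamma_correct(dark))  # feed this to SIFT matching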

  7. Power-constrained contrast enhancement for emissive displays based on histogram equalization.

    Science.gov (United States)

    Lee, Chulwoo; Lee, Chul; Lee, Young-Yoon; Kim, Chang-Su

    2012-01-01

    A power-constrained contrast-enhancement algorithm for emissive displays based on histogram equalization (HE) is proposed in this paper. We first propose a log-based histogram modification scheme to reduce overstretching artifacts of the conventional HE technique. Then, we develop a power-consumption model for emissive displays and formulate an objective function that consists of the histogram-equalizing term and the power term. By minimizing the objective function based on the convex optimization theory, the proposed algorithm achieves contrast enhancement and power saving simultaneously. Moreover, we extend the proposed algorithm to enhance video sequences, as well as still images. Simulation results demonstrate that the proposed algorithm can reduce power consumption significantly while improving image contrast and perceptual quality.

  8. Introducing the Jacobian-volume-histogram of deforming organs: application to parotid shrinkage evaluation

    Science.gov (United States)

    Fiorino, Claudio; Maggiulli, Eleonora; Broggi, Sara; Liberini, Simone; Mauro Cattaneo, Giovanni; Dell'Oca, Italo; Faggiano, Elena; Di Muzio, Nadia; Calandrino, Riccardo; Rizzo, Giovanna

    2011-06-01

    The Jacobian of the deformation field of elastic registration between images taken during radiotherapy is a measure of inter-fraction local deformation. The histogram of the Jacobian values (Jac) within an organ was introduced (JVH, the Jacobian-volume-histogram) and first applied in quantifying parotid shrinkage. MVCTs of 32 patients previously treated with helical tomotherapy for head-neck cancers were collected. Parotid deformation was evaluated through elastic registration between MVCTs taken at the first and last fractions. Jac was calculated for each voxel of all parotids, and integral JVHs were calculated for each parotid; the correlation between the JVH and the planning dose-volume histogram (DVH) was investigated. On average, 82% (±17%) of the voxels shrink (Jac < 1); parotids with a large fraction of strongly shrinking voxels showed a significantly higher risk of shrinkage (OR: 7.6, p = 0.002). Jac and the JVH are promising tools for scoring/modelling toxicity and for evaluating organ/contour variations with potential applications in adaptive radiotherapy.
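
    A minimal sketch of how a JVH can be built: compute the Jacobian determinant of the deformation field voxel-by-voxel (Jac < 1 means local shrinkage) and histogram the values inside the organ mask. The toy displacement field and mask below are illustrative stand-ins for the registration output and the parotid contour:

      import numpy as np

      def jacobian_determinant(disp):
          """disp has shape (3, Z, Y, X): voxel displacements u. det(J), J = I + grad(u)."""
          grads = [np.gradient(disp[i]) for i in range(3)]  # grads[i][j] = du_i/dx_j
          J = np.empty(disp.shape[1:] + (3, 3))
          for i in range(3):
              for j in range(3):
                  J[..., i, j] = grads[i][j] + (1.0 if i == j else 0.0)
          return np.linalg.det(J)

      disp = np.random.randn(3, 10, 10, 10) * 0.05  # toy deformation field
      mask = np.ones((10, 10, 10), dtype=bool)      # stand-in organ contour
      jac = jacobian_determinant(disp)[mask]
      jvh, edges = np.histogram(jac, bins=50, range=(0.5, 1.5))
      print("fraction of shrinking voxels:", (jac < 1).mean())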

  9. An energy-based model for the image edge-histogram specification problem.

    Science.gov (United States)

    Mignotte, Max

    2012-01-01

    In this correspondence, we present an original energy-based model that achieves the edge-histogram specification of a real input image and thus extends the exact specification method of the image luminance (or gray level) distribution recently proposed by Coltuc et al. Our edge-histogram specification approach is stated as an optimization problem in which each edge of a real input image will tend iteratively toward some specified gradient magnitude values given by a target edge distribution (or a normalized edge histogram possibly estimated from a target image). To this end, a hybrid optimization scheme combining a global and deterministic conjugate-gradient-based procedure and a local stochastic search using the Metropolis criterion is proposed herein to find a reliable solution to our energy-based model. Experimental results are presented, and several applications follow from this procedure.

  10. Perceived quality of wood images influenced by the skewness of image histogram

    Science.gov (United States)

    Katsura, Shigehito; Mizokami, Yoko; Yaguchi, Hirohisa

    2015-08-01

    The shape of image luminance histograms is related to material perception. We investigated how the luminance histogram contributed to improvements in the perceived quality of wood images by examining various natural wood and adhesive vinyl sheets with printed wood grain. In the first experiment, we visually evaluated the perceived quality of wood samples. In addition, we measured the colorimetric parameters of the wood samples and calculated statistics of image luminance. The relationship between visual evaluation scores and image statistics suggested that skewness and kurtosis affected the perceived quality of wood. In the second experiment, we evaluated the perceived quality of wood images with altered luminance skewness and kurtosis using a paired comparison method. Our result suggests that wood images are more realistic if the skewness of the luminance histogram is slightly negative.
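
    A minimal sketch of the image statistics the study relates to perceived quality, computed on a toy grayscale "wood" image; the synthetic data is only a placeholder:

      import numpy as np
      from scipy.stats import skew, kurtosis

      def luminance_statistics(img):
          lum = img.astype(float).ravel()
          return {"mean": lum.mean(),
                  "skewness": skew(lum),      # slightly negative skew read as realistic
                  "kurtosis": kurtosis(lum)}

      wood = np.clip(np.random.normal(120, 30, (64, 64)), 0, 255)
      print(luminance_statistics(wood))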

  11. Independent technical review of the Bin and Alcove test programs at the Waste Isolation Pilot Plant

    International Nuclear Information System (INIS)

    1993-12-01

    This Independent Technical Review (ITR) assessed the need for and technical validity of the proposed Bin and Alcove test programs using TRU-waste at the WIPP site. The ITR Team recommends that the planned Bin and Alcove tests be abandoned, and that new activities be initiated in support of the WIPP regulatory compliance processes. Recommendations in this report offer an alternate path for expeditiously attaining disposal certification and permitting

  12. Genetic patterns in European geometrid moths revealed by the Barcode Index Number (BIN) system.

    Directory of Open Access Journals (Sweden)

    Axel Hausmann

    Full Text Available BACKGROUND: The geometrid moths of Europe are one of the best investigated insect groups in traditional taxonomy, making them an ideal model group to test the accuracy of the Barcode Index Number (BIN) system of BOLD (Barcode of Life Datasystems), a method that supports automated, rapid species delineation and identification. METHODOLOGY/PRINCIPAL FINDINGS: This study provides a DNA barcode library for 219 of the 249 European geometrid moth species (88%) in five selected subfamilies. The data set includes COI sequences for 2130 specimens. Most species (93%) were found to possess diagnostic barcode sequences at the European level, while only three species pairs (3%) were genetically indistinguishable in areas of sympatry. As a consequence, 97% of the European species we examined were unequivocally discriminated by barcodes within their natural areas of distribution. We found a 1:1 correspondence between BINs and traditionally recognized species for 67% of these species. Another 17% of the species (15 pairs, three triads) shared BINs, while specimens from the remaining species (18%) were divided among two or more BINs. Five of these species are mixtures, both sharing and splitting BINs. For 82% of the species with two or more BINs, the genetic splits involved allopatric populations, many of which have previously been hypothesized to represent distinct species or subspecies. CONCLUSIONS/SIGNIFICANCE: This study confirms the effectiveness of DNA barcoding as a tool for species identification and illustrates the potential of the BIN system to characterize formal genetic units independently of an existing classification. This suggests the system can be used to efficiently assess the biodiversity of large, poorly known assemblages of organisms. For the moths examined in this study, cases of discordance between traditionally recognized species and BINs arose from several causes, including overlooked species, synonymy, and cases where DNA barcodes revealed…

  13. Max–min Bin Packing Algorithm and its application in nano-particles filling

    International Nuclear Information System (INIS)

    Zhu, Dingju

    2016-01-01

    With regard to existing bin packing algorithms, higher packing efficiency often comes at the cost of lower packing speed, and vice versa. Packing speed and packing efficiency of existing bin packing algorithms, including NFD, NF, FF, FFD, BF and BFD, correlate negatively with each other, so existing algorithms cannot satisfy the demand of nano-particle filling for both high speed and high efficiency. This paper provides a new bin packing algorithm, the Max-min Bin Packing Algorithm (MM), which realizes both high packing speed and high packing efficiency. MM has the same packing speed as NFD (whose packing speed ranks first among existing bin packing algorithms); when the size repetition rate of the objects to be packed exceeds 5, MM realizes almost the same packing efficiency as BFD (whose packing efficiency ranks first among existing bin packing algorithms), and when the size repetition rate exceeds 500, MM achieves exactly the same packing efficiency as BFD. In nano-particle filling, the size repetition rate of the particles to be packed is usually in the thousands or ten thousands, far higher than 5 or 500; consequently, in this application the packing efficiency of MM is exactly equal to that of BFD. The conflict between packing speed and packing efficiency is thus removed by MM, giving it better packing performance than existing bin packing algorithms. In practice there are few cases where the size repetition rate of the objects to be packed is lower than 5, so MM is not limited to nano-particle filling and can be widely used in other applications. MM has particular value in nano-particle filling applications such as nano printing and nano tooth filling.
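
    For reference, a minimal sketch of First-Fit Decreasing (FFD), one of the classical baselines named above; this is not the MM algorithm itself:

      def first_fit_decreasing(sizes, capacity):
          """Pack items into bins of the given capacity; returns the bins."""
          bins, remaining = [], []  # contents and free space of open bins
          for size in sorted(sizes, reverse=True):  # 'D' = decreasing order
              for i, free in enumerate(remaining):
                  if size <= free:        # first open bin with room
                      bins[i].append(size)
                      remaining[i] -= size
                      break
              else:                       # nothing fits: open a new bin
                  bins.append([size])
                  remaining.append(capacity - size)
          return bins

      print(first_fit_decreasing([0.6, 0.5, 0.4, 0.3, 0.2], capacity=1.0))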

  14. Application of Genetic Algorithm for the Bin Packing Problem with a New Representation Scheme

    Directory of Open Access Journals (Sweden)

    N. Mohamadi

    2010-10-01

    Full Text Available The Bin Packing Problem (BPP) is to find the minimum number of bins needed to pack a given set of objects of known sizes so that they do not exceed the capacity of each bin. This problem is known to be NP-Hard [5]; hence many heuristic procedures for its solution have been suggested. In this paper we propose a new representation scheme and solve the problem by a Genetic Algorithm. Limited computational results show the efficiency of this scheme.

  15. Solid waste bin detection and classification using Dynamic Time Warping and MLP classifier.

    Science.gov (United States)

    Islam, Md Shafiqul; Hannan, M A; Basri, Hassan; Hussain, Aini; Arebey, Maher

    2014-02-01

    The increasing requirement for Solid Waste Management (SWM) has become a significant challenge for municipal authorities. A number of integrated systems and methods have been introduced to overcome this challenge, and many researchers have aimed to develop an ideal SWM system, including approaches involving software-based routing, Geographic Information Systems (GIS), Radio-Frequency Identification (RFID), and sensor-equipped intelligent bins. Image processing solutions for Solid Waste (SW) collection have also been developed; however, when capturing the bin image, it is challenging to position the camera so that the bin area is centered, and as yet there is no ideal system that can correctly estimate the amount of SW. This paper discusses an efficient image processing solution to these problems. Dynamic Time Warping (DTW) was used for detecting and cropping the bin area, and a Gabor wavelet (GW) was introduced for feature extraction from the waste bin image. Image features were used to train the classifier. A Multi-Layer Perceptron (MLP) classifier was used to classify the waste bin level and estimate the amount of waste inside the bin. The area under the Receiver Operating Characteristic (ROC) curve was used to statistically evaluate classifier performance. The results of the developed system are comparable to previous image-processing-based systems. The system demonstration using DTW with GW for feature extraction and an MLP classifier led to promising results with respect to the accuracy of waste level estimation (98.50%). The application can be used to optimize the routing of waste collection based on the estimated bin level. Copyright © 2013 Elsevier Ltd. All rights reserved.
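
    A minimal sketch of the DTW distance used for matching an observed image profile against a template (the classic O(nm) dynamic program); the profiles below are illustrative stand-ins for the bin-image features:

      import numpy as np

      def dtw_distance(a, b):
          n, m = len(a), len(b)
          D = np.full((n + 1, m + 1), np.inf)
          D[0, 0] = 0.0
          for i in range(1, n + 1):
              for j in range(1, m + 1):
                  cost = abs(a[i - 1] - b[j - 1])
                  D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
          return float(D[n, m])

      template = np.array([0, 1, 2, 3, 2, 1, 0], dtype=float)
      observed = np.array([0, 0, 1, 2, 3, 3, 2, 1, 0], dtype=float)
      print(dtw_distance(template, observed))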

  16. Afghanistan, the Taliban, and Osama bin Laden: The Background to September 11

    Science.gov (United States)

    Social Education, 2011

    2011-01-01

    On May 1, 2011, a group of U.S. soldiers boarded helicopters at a base in Afghanistan, hoping to find a man named Osama bin Laden. Bin Laden, the leader of the al Qaeda terrorist network, was responsible for a number of terrorist attacks around the world, including those of September 11, 2001, that killed nearly 3,000 people in the United States.…

  17. Two-Bin Kanban: Ordering Impact at Navy Medical Center San Diego

    Science.gov (United States)

    2016-06-01

    Barriers to lean practices in health care include poor management support, poorly aligned incentives, and poor data. This study determines what impact, if any, two-bin Kanban had on the Gastroenterology, Urology, and Oral Maxillofacial Surgery (OMFS) departments at NMCSD.

  18. Importance measures and resource allocation

    International Nuclear Information System (INIS)

    Guey, C.N.; Morgan, T.; Hughes, E.A.

    1987-01-01

    This paper discusses various importance measures and their practical relevance to allocating resources. The characteristics of importance measures are illustrated through simple examples. Important factors associated with effectively allocating resources to improve plant system performance or to prevent system degradation are discussed. It is concluded that importance measures are only indicative of and not equal to the risk significance of a component, system, or event. A decision framework is suggested to provide a comprehensive basis for resource allocation

  19. A robust and accurate binning algorithm for metagenomic sequences with arbitrary species abundance ratio.

    Science.gov (United States)

    Leung, Henry C M; Yiu, S M; Yang, Bin; Peng, Yu; Wang, Yi; Liu, Zhihua; Chen, Jingchi; Qin, Junjie; Li, Ruiqiang; Chin, Francis Y L

    2011-06-01

    With the rapid development of next-generation sequencing techniques, metagenomics, also known as environmental genomics, has emerged as an exciting research area that enables us to analyze the microbial environment in which we live. An important step for metagenomic data analysis is the identification and taxonomic characterization of DNA fragments (reads or contigs) resulting from sequencing a sample of mixed species. This step is referred to as 'binning'. Binning algorithms that are based on sequence similarity and sequence composition markers rely heavily on the reference genomes of known microorganisms or phylogenetic markers. Due to the limited availability of reference genomes and the bias and low availability of markers, these algorithms may not be applicable in all cases. Unsupervised binning algorithms which can handle fragments from unknown species provide an alternative approach. However, existing unsupervised binning algorithms only work on datasets either with balanced species abundance ratios or rather different abundance ratios, but not both. In this article, we present MetaCluster 3.0, an integrated binning method based on the unsupervised top-down separation and bottom-up merging strategy, which can bin metagenomic fragments of species with very balanced abundance ratios (say 1:1) to very different abundance ratios (e.g. 1:24) with consistently higher accuracy than existing methods. MetaCluster 3.0 can be downloaded at http://i.cs.hku.hk/~alse/MetaCluster/.

  20. Brief Communication: Contrast-stretching- and histogram-smoothness-based synthetic aperture radar image enhancement for flood map generation

    Science.gov (United States)

    Nazir, F.; Riaz, M. M.; Ghafoor, A.; Arif, F.

    2015-02-01

    Synthetic-aperture-radar-image-based flood map generation is usually a challenging task due to degraded contrast. A three-step approach (based on adaptive histogram clipping, histogram remapping and smoothing) is proposed for generating a clearer flood map image. The pre- and post-flood images are adaptively histogram equalized, and the hidden details in the difference image are enhanced using contrast-based enhancement and histogram smoothing. A ready-to-use flood map is then generated from the equalized pre-, post- and difference images. Results, evaluated using different data sets, show the significance of the proposed technique.

  1. Blind image quality evaluation using the conditional histogram patterns of divisive normalization transform coefficients

    Science.gov (United States)

    Chu, Ying; Mou, Xuanqin; Yu, Hengyong

    2017-09-01

    A novel code book based framework for blind image quality assessment is developed. The code words are designed according to the image pattern of joint conditional histograms among neighboring divisive normalization transform coefficients in degraded images. By extracting high dimensional perceptual features from different subjective score levels in the sample database, and by clustering the features to their centroids, the conditional histogram based code book is constructed. Objective image quality score is calculated by comparing the distances between extracted features and the code words. Experiments are performed on most current databases, and the results confirm the effectiveness and feasibility of the proposed approach.

  2. Quick cytogenetic screening of breeding bulls using flow cytometric sperm DNA histogram analysis.

    Science.gov (United States)

    Nagy, Szabolcs; Polgár, Péter J; Andersson, Magnus; Kovács, András

    2016-09-01

    The aim of the present study was to test the FXCycle PI/RNase kit for routine DNA analyses in order to detect breeding bulls and/or insemination doses carrying cytogenetic aberrations. In a series of experiments we first established basic DNA histogram parameters of cytogenetically healthy breeding bulls by measuring the intraspecific genome size variation of three animals, then we compared the histogram profiles of bulls carrying cytogenetic defects to the baseline values. With the exception of one case the test was able to identify bulls with cytogenetic defects. Therefore, we conclude that the assay could be incorporated into the laboratory routine where flow cytometry is applied for semen quality control.

  3. Efficient Metropolitan Resource Allocation

    Directory of Open Access Journals (Sweden)

    Richard Arnott

    2016-05-01

    Full Text Available Over the past 30 years Calgary has doubled in size, from a population of 640,645 in 1985 to 1,230,915 in 2015. During that time the City has had five different mayors, hosted the Winter Olympics, and expanded the C-Train from 25 platforms to 45. Calgary's Metropolitan Area has grown too, with Airdrie, Chestermere, Okotoks and Cochrane growing into full-fledged cities, ripe with inter-urban commuters. And with changes to provincial legislation in the mid-'90s, rural Rocky View County and the Municipal District of Foothills are now real competitors for residential, commercial and industrial development that in the past would have been considered urban. In this metropolitan system, where people live, their household structure, and their place of work inform the services they need to conduct their daily lives, and directly impact the spatial character of the City and the broader region. In sum, Metropolitan Calgary is increasingly complex. Calgary and the broader metropolitan area will continue to grow, even with the current economic slowdown. Frictions within Calgary, between the various municipalities in the metropolitan area, and the priorities of other local authorities (such as the School Boards and Alberta Health Services) will continue to impact the agendas of local politicians and their ability to answer to the needs of their residents. How resources – whether hard infrastructure, affordable housing, classrooms, or hospital beds – are allocated over space, and how these resources are funded, directly impacts these relationships. This technical paper provides my perspective as an urban economist on the efficient allocation of resources within a metropolitan system in general, with reference to Calgary where appropriate, and serves as a companion to the previously released "Reflections on Calgary's Spatial Structure: An Urban Economist's Critique of Municipal Planning in Calgary." It is hoped that the concepts reviewed…

  4. Risk allocation under liquidity constraints

    NARCIS (Netherlands)

    Csóka, P.; Herings, P.J.J.

    2013-01-01

    Risk allocation games are cooperative games that are used to attribute the risk of a financial entity to its divisions. In this paper, we extend the literature on risk allocation games by incorporating liquidity considerations. A liquidity policy specifies state-dependent liquidity requirements that…

  5. Unsupervised binning of environmental genomic fragments based on an error robust selection of l-mers.

    Science.gov (United States)

    Yang, Bin; Peng, Yu; Leung, Henry Chi-Ming; Yiu, Siu-Ming; Chen, Jing-Chi; Chin, Francis Yuk-Lun

    2010-04-16

    With the rapid development of genome sequencing techniques, traditional research methods based on the isolation and cultivation of microorganisms are being gradually replaced by metagenomics, which is also known as environmental genomics. The first step, which is still a major bottleneck, of metagenomics is the taxonomic characterization of DNA fragments (reads) resulting from sequencing a sample of mixed species. This step is usually referred as "binning". Existing binning methods are based on supervised or semi-supervised approaches which rely heavily on reference genomes of known microorganisms and phylogenetic marker genes. Due to the limited availability of reference genomes and the bias and instability of marker genes, existing binning methods may not be applicable in many cases. In this paper, we present an unsupervised binning method based on the distribution of a carefully selected set of l-mers (substrings of length l in DNA fragments). From our experiments, we show that our method can accurately bin DNA fragments with various lengths and relative species abundance ratios without using any reference and training datasets. Another feature of our method is its error robustness. The binning accuracy decreases by less than 1% when the sequencing error rate increases from 0% to 5%. Note that the typical sequencing error rate of existing commercial sequencing platforms is less than 2%. We provide a new and effective tool to solve the metagenome binning problem without using any reference datasets or markers information of any known reference genomes (species). The source code of our software tool, the reference genomes of the species for generating the test datasets and the corresponding test datasets are available at http://i.cs.hku.hk/~alse/MetaCluster/.

  6. Utilization of deletion bins to anchor and order sequences along the wheat 7B chromosome.

    Science.gov (United States)

    Belova, Tatiana; Grønvold, Lars; Kumar, Ajay; Kianian, Shahryar; He, Xinyao; Lillemo, Morten; Springer, Nathan M; Lien, Sigbjørn; Olsen, Odd-Arne; Sandve, Simen R

    2014-09-01

    A total of 3,671 sequence contigs and scaffolds were mapped to deletion bins on wheat chromosome 7B providing a foundation for developing high-resolution integrated physical map for this chromosome. Bread wheat (Triticum aestivum L.) has a large, complex and highly repetitive genome which is challenging to assemble into high quality pseudo-chromosomes. As part of the international effort to sequence the hexaploid bread wheat genome by the international wheat genome sequencing consortium (IWGSC) we are focused on assembling a reference sequence for chromosome 7B. The successful completion of the reference chromosome sequence is highly dependent on the integration of genetic and physical maps. To aid the integration of these two types of maps, we have constructed a high-density deletion bin map of chromosome 7B. Using the 270 K Nimblegen comparative genomic hybridization (CGH) array on a set of cv. Chinese spring deletion lines, a total of 3,671 sequence contigs and scaffolds (~7.8 % of chromosome 7B physical length) were mapped into nine deletion bins. Our method of genotyping deletions on chromosome 7B relied on a model-based clustering algorithm (Mclust) to accurately predict the presence or absence of a given genomic sequence in a deletion line. The bin mapping results were validated using three different approaches, viz. (a) PCR-based amplification of randomly selected bin mapped sequences (b) comparison with previously mapped ESTs and (c) comparison with a 7B genetic map developed in the present study. Validation of the bin mapping results suggested a high accuracy of the assignment of 7B sequence contigs and scaffolds to the 7B deletion bins.

  7. Novel Colitis Immunotherapy Targets Bin1 and Improves Colon Cell Barrier Function.

    Science.gov (United States)

    Thomas, Sunil; Mercado, Joanna M; DuHadaway, James; DiGuilio, Kate; Mullin, James M; Prendergast, George C

    2016-02-01

    Ulcerative colitis (UC) is associated with defects in colonic epithelial barriers as well as inflammation of the colon mucosa resulting from the recruitment of lymphocytes and neutrophils in the lamina propria. Patients afflicted with UC are at increased risk of colorectal cancer. Currently, UC management employs general anti-inflammatory strategies associated with a variety of side effects, including heightened risk of infection, in patients where the therapy is variably effective. Thus, second-generation drugs that can more effectively and selectively limit UC are desired. Building on genetic evidence that attenuation of the Bin1 (Bridging integrator 1) gene can limit UC pathogenicity in the mouse, we pursued Bin1 targeting as a therapeutic option. Mice were injected with a single dose of Bin1 mAb followed by oral administration of 3% DSS in water for 7 days. In this study, we offer preclinical proof of concept for a monoclonal antibody (mAb) targeting the Bin1 protein that blunts UC pathogenicity in a mouse model of experimental colitis. Administration of Bin1 mAb reduced colitis morbidity in mice, whereas unprotected mice were characterized by severe lesions throughout the mucosa, rupture of the lymphoid follicle, high-level neutrophil and lymphocyte infiltration into the mucosal and submucosal areas, and loss of surface crypts. In vitro studies in human Caco-2 cells showed that the Bin1 antibody altered the expression of tight junction proteins and improved barrier function. Our results suggest that a therapy based on a Bin1 monoclonal antibody, supporting mucosal barrier function and protecting the integrity of the lymphoid follicle, could offer a novel strategy to treat UC and possibly limit the risk of colorectal cancer.

  8. Charge flipping combined with histogram matching to solve complex crystal structures from powder diffraction data

    Czech Academy of Sciences Publication Activity Database

    Baerlocher, Ch.; McCusker, L.B.; Palatinus, Lukáš

    2007-01-01

    Roč. 222, - (2007), s. 47-53 ISSN 0044-2968 Institutional research plan: CEZ:AV0Z10100521 Keywords : charge flipping * histogram-matching * polycrystalline materials * powder diffraction structure analysis Subject RIV: BM - Solid Matter Physics ; Magnetism Impact factor: 1.338, year: 2007

  9. HEp-2 Cell Classification Using Shape Index Histograms With Donut-Shaped Spatial Pooling

    DEFF Research Database (Denmark)

    Larsen, Anders Boesen Lindbo; Vestergaard, Jacob Schack; Larsen, Rasmus

    2014-01-01

    We introduce a spatial decomposition scheme which is radially symmetric and suitable for cell images. The spatial decomposition is performed using donut-shaped pooling regions of varying sizes when gathering histogram contributions. We evaluate our method using both the ICIP 2013 and the ICPR 2012 competition datasets…

  10. Likelihood-based object detection and object tracking using color histograms and EM

    NARCIS (Netherlands)

    Withagen, P.J.; Schutte, K.; Groen, F.

    2002-01-01

    The topic of this paper is the integration of Expectation Maximization (EM) background modeling and template matching, using color histograms as templates, to improve person tracking for surveillance applications. The tracked objects are humans, which are not rigid bodies; as such, shape deformations…

  11. A Concise Guide to Feature Histograms with Applications to LIDAR-Based Spacecraft Relative Navigation

    Science.gov (United States)

    Rhodes, Andrew P.; Christian, John A.; Evans, Thomas

    2017-12-01

    With the availability and popularity of 3D sensors, it is advantageous to re-examine the use of point cloud descriptors for the purpose of pose estimation and spacecraft relative navigation. One popular descriptor is the oriented unique repeatable clustered viewpoint feature histogram (OUR-CVFH), which is most often utilized in personal and industrial robotics to simultaneously recognize and navigate relative to an object. Recent research into using the OUR-CVFH descriptor for spacecraft navigation has produced favorable results. Since OUR-CVFH is the most recent innovation in a large family of feature histogram point cloud descriptors, discussions of parameter settings and insights into its functionality are spread among various publications and online resources. This paper organizes the history of feature histogram point cloud descriptors for a straightforward explanation of their evolution. This article compiles all the requisite information needed to implement OUR-CVFH into one location and provides useful suggestions on how to tune the generation parameters. This work is beneficial for anyone interested in using this histogram descriptor for object recognition or navigation, be it personal robotics or spacecraft navigation.

  12. Robust Face Recognition by Computing Distances from Multiple Histograms of Oriented Gradients

    NARCIS (Netherlands)

    Karaaba, Mahir; Surinta, Olarik; Schomaker, Lambertus; Wiering, Marco

    2015-01-01

    The Single Sample per Person Problem is a challenging problem for face recognition algorithms. Patch-based methods have obtained some promising results for this problem. In this paper, we propose a new face recognition algorithm that is based on a combination of different histograms of oriented gradients…

  13. Stochastic learning of multi-instance dictionary for earth mover’s distance-based histogram comparison

    KAUST Repository

    Fan, Jihong

    2016-09-17

    Dictionary plays an important role in multi-instance data representation. It maps bags of instances to histograms. Earth mover's distance (EMD) is the most effective histogram distance metric for the application of multi-instance retrieval. However, up to now, there have been no multi-instance dictionary learning methods designed for EMD-based histogram comparison. To fill this gap, we develop the first EMD-optimal dictionary learning method using stochastic optimization. In the stochastic learning framework, we have one triplet of bags, including one basic bag, one positive bag, and one negative bag. These bags are mapped to histograms using a multi-instance dictionary. We argue that the EMD between the basic histogram and the positive histogram should be smaller than that between the basic histogram and the negative histogram. Based on this condition, we design a hinge loss. By minimizing this hinge loss and some regularization terms of the dictionary, we update the dictionary instances. Experiments on multi-instance retrieval applications show its effectiveness when compared to other dictionary learning methods on the problems of medical image retrieval and natural language relation classification. © 2016 The Natural Computing Applications Forum
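
    A minimal sketch of the triplet hinge condition described above, using SciPy's one-dimensional earth mover's distance on three already-coded histograms; the margin and the toy histograms are illustrative assumptions:

      import numpy as np
      from scipy.stats import wasserstein_distance

      def triplet_hinge_loss(h_basic, h_pos, h_neg, margin=0.1):
          """Zero when EMD(basic, pos) + margin <= EMD(basic, neg)."""
          bins = np.arange(len(h_basic))
          d_pos = wasserstein_distance(bins, bins, h_basic, h_pos)
          d_neg = wasserstein_distance(bins, bins, h_basic, h_neg)
          return max(0.0, margin + d_pos - d_neg)

      h_basic = np.array([0.5, 0.3, 0.2])
      h_pos = np.array([0.45, 0.35, 0.2])
      h_neg = np.array([0.1, 0.2, 0.7])
      print(triplet_hinge_loss(h_basic, h_pos, h_neg))  # 0.0 if triplet satisfied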

  14. Diagnostic Accuracy of Ultrasonic Histogram Features to Evaluate Radiation Toxicity of the Parotid Glands

    Science.gov (United States)

    Yang, Xiaofeng; Tridandapani, Srini; Beitler, Jonathan J.; Yu, David S.; Chen, Zhengjia; Kim, Sungjin; Bruner, Deborah W.; Curran, Walter J.; Liu, Tian

    2015-01-01

    Rationale and Objectives: To investigate the diagnostic accuracy of ultrasound histogram features in the quantitative assessment of radiation-induced parotid gland injury and to identify potential imaging biomarkers for radiation-induced xerostomia (dry mouth), the most common and debilitating side effect after head-and-neck radiotherapy (RT). Materials and Methods: Thirty-four patients who had developed xerostomia after RT for head-and-neck cancer were enrolled. Radiation-induced xerostomia was defined by the Radiation Therapy Oncology Group/European Organization for Research and Treatment of Cancer morbidity scale. Ultrasound scans were performed on each patient's parotids bilaterally. The 34 patients were stratified into an acute-toxicity group (16 patients, ≤3 months after treatment) and a late-toxicity group (18 patients, >3 months after treatment). A separate control group of 13 healthy volunteers underwent similar ultrasound scans of their parotid glands. Six sonographic features were derived from the echo-intensity histograms to assess acute and late toxicity of the parotid glands. The quantitative assessments were compared to a radiologist's clinical evaluations. The diagnostic accuracy of these ultrasonic histogram features was evaluated with the receiver operating characteristic (ROC) curve. Results: With an area under the ROC curve greater than 0.90, several histogram features demonstrated excellent diagnostic accuracy for evaluation of acute and late toxicity of parotid glands. Significant differences (P < 0.05) were observed between the healthy and the irradiated parotid glands. Conclusions: We demonstrated that ultrasound histogram features could be used to measure acute and late toxicity of the parotid glands after head-and-neck cancer RT, which may be developed into a low-cost imaging method for xerostomia monitoring and assessment. PMID:25088832

  15. Principal component analysis of the CT density histogram to generate parametric response maps of COPD

    Science.gov (United States)

    Zha, N.; Capaldi, D. P. I.; Pike, D.; McCormack, D. G.; Cunningham, I. A.; Parraga, G.

    2015-03-01

    Pulmonary x-ray computed tomography (CT) may be used to characterize emphysema and airways disease in patients with chronic obstructive pulmonary disease (COPD). One analysis approach, parametric response mapping (PRM), utilizes registered inspiratory and expiratory CT image volumes and CT-density-histogram thresholds, but there is no consensus regarding the threshold values used, or their clinical meaning. Principal component analysis (PCA) of the CT density histogram can be exploited to quantify emphysema using data-driven CT-density-histogram thresholds. Thus, the objective of this proof-of-concept demonstration was to develop a PRM approach using PCA-derived thresholds in COPD patients and ex-smokers without airflow limitation. Methods: Fifteen COPD ex-smokers and 5 normal ex-smokers were evaluated. Thoracic CT images were acquired at full inspiration and full expiration, and these images were non-rigidly co-registered. PCA was performed on the CT density histograms, from which the components with eigenvalues greater than one were summed. Since the values of the principal component curve correlate directly with the variability in the sample, the maximum and minimum points on the curve were used as threshold values for the PCA-adjusted PRM technique. Results: A significant correlation was determined between conventional and PCA-adjusted PRM with 3He MRI apparent diffusion coefficient (p<0.001), with CT RA950 (p<0.0001), as well as with 3He MRI ventilation defect percent, a measurement of both small airways disease (p=0.049 and p=0.06, respectively) and emphysema (p=0.02). Conclusions: PRM generated using PCA thresholds of the CT density histogram showed significant correlations with CT and 3He MRI measurements of emphysema, but not airways disease.
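
    A minimal sketch of the threshold-selection step on toy data: PCA over a matrix of CT density histograms, summation of the components with eigenvalue greater than one, and extraction of the curve extrema as candidate PRM thresholds. The synthetic histograms and HU axis are placeholders:

      import numpy as np
      from sklearn.decomposition import PCA

      histograms = np.random.poisson(50, (20, 100)).astype(float)  # subjects x HU bins
      hu_bins = np.linspace(-1000, 0, 100)

      pca = PCA().fit(histograms)
      keep = pca.explained_variance_ > 1.0           # eigenvalue-greater-than-one rule
      curve = pca.components_[keep].sum(axis=0)      # summed principal components

      lower = hu_bins[curve.argmin()]                # extrema of the curve serve as
      upper = hu_bins[curve.argmax()]                # the data-driven PRM thresholds
      print(lower, upper)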

  16. Fungal volatiles associated with moldy grain in ventilated and non-ventilated bin-stored wheat.

    Science.gov (United States)

    Sinha, R N; Tuma, D; Abramson, D; Muir, W E

    1988-01-01

    The fungal odor compounds 3-methyl-1-butanol, 1-octen-3-ol and 3-octanone were monitored in nine experimental bins in Winnipeg, Manitoba containing a hard red spring wheat during the autumn, winter and summer seasons of 1984-85. Quality changes were associated with seed-borne microflora and moisture content in both ventilated and non-ventilated bins containing wheat of 15.6 and 18.2% initial moisture content. All three odor compounds occurred in considerably greater amounts in bulk wheat in non-ventilated than in ventilated bins, particularly in those with wheat having 18.2% moisture content. The presence of these compounds usually coincided with infection of the seeds by the fungi Alternaria alternata (Fr.) Keissler, Aspergillus repens DeBarry, A. versicolor (Vuill.) Tiraboschi, Penicillium crustosum Thom, P. oxalicum Currie and Thom, P. aurantiogriesum Dierckx, and P. citrinum Thom. High production of all three odor compounds in damp wheat stored in non-ventilated bins was associated with heavy fungal infection of the seeds and reduction in seed germinability. High initial moisture content of the harvested grain accelerated the production of all three fungal volatiles in non-ventilated bins.

  17. Bin mapping of genomic and EST-derived SSRs in melon (Cucumis melo L.).

    Science.gov (United States)

    Fernandez-Silva, I; Eduardo, I; Blanca, J; Esteras, C; Picó, B; Nuez, F; Arús, P; Garcia-Mas, J; Monforte, Antonio José

    2008-12-01

    We report the development of 158 primer pairs flanking SSR motifs in genomic (gSSR) and EST (EST-SSR) melon sequences, all yielding polymorphic bands in melon germplasm, except one that was polymorphic only in Cucurbita species. A similar polymorphism level was found among EST-SSRs and gSSRs, between dimeric and trimeric EST-SSRs, and between EST-SSRs placed in the open reading frame or any of the 5'- or 3'-untranslated regions. Correlation between SSR length and polymorphism was only found for dinucleotide EST-SSRs located within the untranslated regions, but not for trinucleotide EST-SSRs. Transferability of EST-SSRs to Cucurbita species was assayed and 12.7% of the primer pairs amplified at least in one species, although only 5.4% were polymorphic. A set of 14 double haploid lines from the cross between the cultivar "Piel de Sapo" and the accession PI161375 were selected for the bin mapping approach in melon. One hundred and twenty-one SSR markers were newly mapped. The position of 46 SSR loci was also verified by genotyping the complete population. A final bin-map was constructed including 80 RFLPs, 212 SSRs, 3 SNPs and the Nsv locus, distributed in 122 bins with an average bin length of 10.2 cM and a maximum bin length of 33 cM. Map density was 4.2 cM/marker or 5.9 cM/SSR.

  18. Robotic vision system for random bin picking with dual-arm robots

    Directory of Open Access Journals (Sweden)

    Kang Sangseung

    2016-01-01

    Full Text Available Random bin picking is one of the most challenging industrial robotics applications available. It constitutes a complicated interaction between the vision system, robot, and control system. For a packaging operation requiring a pick-and-place task, the robot system utilized should be able to perform certain functions for recognizing the applicable target object from randomized objects in a bin. In this paper, we introduce a robotic vision system for bin picking using industrial dual-arm robots. The proposed system recognizes the best object from randomized target candidates based on stereo vision, and estimates the position and orientation of the object. It then sends the result to the robot control system. The system was developed for use in the packaging process of cell phone accessories using dual-arm robots.

  19. Fast discriminative latent Dirichlet allocation

    Data.gov (United States)

    National Aeronautics and Space Administration — This is the code for fast discriminative latent Dirichlet allocation, which is an algorithm for topic modeling and text classification. The related paper is at...

  20. FY12 CPD Formula Allocation

    Data.gov (United States)

    Department of Housing and Urban Development — The Fiscal Year (FY) 2012 budget for the Department of Housing and Urban Development has been enacted. This spreadsheet provides full-year allocations for the Office...

  1. Dose-volume histogram and dose-surface histogram analysis for skin reactions to carbon ion radiotherapy for bone and soft tissue sarcoma.

    Science.gov (United States)

    Yanagi, Takeshi; Kamada, Tadashi; Tsuji, Hiroshi; Imai, Reiko; Serizawa, Itsuko; Tsujii, Hirohiko

    2010-04-01

    To evaluate the usefulness of the dose-volume histogram (DVH) and dose-surface histogram (DSH) as clinically relevant and available parameters that help to identify bone and soft tissue sarcoma patients at risk of developing late skin reactions, including ulceration, when treated with carbon ion radiotherapy. Thirty-five patients with bone and soft tissue sarcoma treated with carbon ion beams were studied. The clinical skin reactions were evaluated, and selected pretreatment variables were compared with the grade of late skin reactions. Average DVH and DSH were established in accordance with the grading of the skin reactions. Prescribed dose, the difference in depths between the skin surface and the proximal extent of the tumor, and several DVH/DSH parameters were correlated with late skin reaction (grade 3 or higher) according to univariate analysis. Furthermore, the area irradiated with over 60 GyE (S60 > 20 cm²) on the DSH was the most important factor by multivariate analysis, and was found to be a useful predictor of late skin reactions. Copyright 2009 Elsevier Ireland Ltd. All rights reserved.
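
    For orientation, a minimal sketch of how a cumulative DVH is obtained from a dose grid and a structure mask; a DSH is built the same way over surface voxels and areas. The toy dose grid and mask are placeholders:

      import numpy as np

      def cumulative_dvh(dose, mask, levels):
          """Fraction of the structure receiving at least each dose level."""
          d = dose[mask]
          return np.array([(d >= level).mean() for level in levels])

      dose = np.random.rand(20, 20, 20) * 70.0   # toy dose grid in GyE
      mask = np.zeros(dose.shape, dtype=bool)
      mask[5:15, 5:15, 5:15] = True              # stand-in skin/target structure
      levels = np.arange(0, 71, 1.0)
      dvh = cumulative_dvh(dose, mask, levels)
      print("fraction above 60 GyE:", dvh[60])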

  2. International Development Aid Allocation Determinants

    OpenAIRE

    Tapas Mishra; Bazoumana Ouattara; Mamata Parhi

    2012-01-01

    This paper investigates the factors explaining aid allocation by bilateral and multilateral donors. We use data for 146 aid recipient countries over the period 1990-2007 and employ the Bayesian Averaging of Classical Estimates (BACE) approach, finding that both the recipient-need and donor-interest motives are 'significant' determinants of the bilateral and multilateral aid allocation process. Our results also indicate that the measures of recipient need and donor interest vary from bilateral…

  3. Application of an allocation methodology

    International Nuclear Information System (INIS)

    Youngblood, R.

    1989-01-01

    This paper presents a method for allocating resources to elements of a system for the purpose of achieving prescribed levels of defense-in-depth at minimal cost. The method makes extensive use of logic modelling. An analysis of a simplified high-level waste repository is used as an illustrative application of the method. It is shown that it is possible to allocate quality control costs (or demonstrate performance) in an optimal way over elements of a conceptual design

  4. The Opening of the Hamad Bin Khalifa Civilisation Center in Copenhagen

    DEFF Research Database (Denmark)

    Jacobsen, Brian Arly

    2014-01-01

    In Nørrebro – a large white building with a cupola and a 20-meter high minaret topped with a small crescent marks the site of the newly built mosque, the Hamad Bin Khalifa Civilisation Center in Copenhagen, which is now the largest Scandinavian mosque. This is the story about the opening of the…

  5. Solar-Powered Compaction Garbage Bins in Public Areas: A Preliminary Economic and Environmental Evaluation

    Directory of Open Access Journals (Sweden)

    Long Duc Nghiem

    2010-02-01

    Full Text Available An Excel-based model was developed to evaluate the economic and environmental benefits of solar-powered compaction garbage bins in public areas in Australia. Input data were collected from the Brisbane and Wollongong City councils, and Sydney Olympic Park. The results demonstrate that solar-powered compaction garbage bins would provide environmental benefits in all scenarios. However, the results of the economic analysis of the three studied areas varied significantly; the unique situation of Sydney Olympic Park made implementation in that facility particularly appealing. A lower monthly rental cost is needed for the implementation of this novel waste management practice.

  6. Propagation and survival of frequency-bin entangled photons in metallic nanostructures

    Directory of Open Access Journals (Sweden)

    Olislager Laurent

    2015-01-01

    Full Text Available We report on the design of two plasmonic nanostructures and the propagation of frequency-bin entangled photons through them. The experimental findings clearly show the robustness of frequency-bin entanglement, which survives after interactions with both a hybrid plasmo-photonic structure, and a nano-pillar array. These results confirm that quantum states can be encoded into the collective motion of a many-body electronic system without demolishing their quantum nature, and pave the way towards applications of plasmonic structures in quantum information.

  7. How should INGOs allocate resources?

    Directory of Open Access Journals (Sweden)

    Scott Wisor

    2012-02-01

    Full Text Available International Non-governmental Organizations (INGOs) face difficult choices when choosing to allocate resources. Given that the resources made available to INGOs fall far short of what is needed to reduce massive human rights deficits, any chosen scheme of resource allocation requires failing to reach other individuals in great need. Facing these moral opportunity costs, what moral reasons should guide INGO resource allocation? Two reasons that clearly matter, and are recognized by philosophers and development practitioners, are the consequences (or benefit or harm reduction) of any given resource allocation and the need (or priority) of individual beneficiaries. If accepted, these reasons should lead INGOs to allocate resources to a limited number of countries where the most prioritarian-weighted harm reduction will be achieved. I make three critiques against this view. First, on grounds the consequentialist accepts, I argue that INGOs ought to maintain a reasonably wide distribution of resources. Second, I argue that even if one is a consequentialist, consequentialism ought not act as an action-guiding principle for INGOs. Third, I argue that additional moral reasons should influence decision making about INGO resource allocation. Namely, INGO decision making should attend to relational reasons, desert, respect for agency, concern for equity, and the importance of expressing a view of moral wrongs.

  8. Generalized multidimensional dynamic allocation method.

    Science.gov (United States)

    Lebowitsch, Jonathan; Ge, Yan; Young, Benjamin; Hu, Feifang

    2012-12-10

    Dynamic allocation has received considerable attention since it was first proposed in the 1970s as an alternative means of allocating treatments in clinical trials that helps to secure the balance of prognostic factors across treatment groups. The purpose of this paper is to present a generalized multidimensional dynamic allocation method that simultaneously balances treatment assignments at three key levels: within the overall study, within each level of each prognostic factor, and within each stratum, that is, each combination of levels of different factors. Further, it offers capabilities for unbalanced and adaptive designs for trials. The treatment balancing performance of the proposed method is investigated through simulations which compare multidimensional dynamic allocation with traditional stratified block randomization and the Pocock-Simon method. On the basis of these results, we conclude that this generalized multidimensional dynamic allocation method is an improvement over conventional dynamic allocation methods and is flexible enough to be applied in most trial settings, including Phase I, II and III trials. Copyright © 2012 John Wiley & Sons, Ltd.

  9. Contrast Enhancement Using Brightness Preserving Histogram Equalization Technique for Classification of Date Varieties

    Directory of Open Access Journals (Sweden)

    G Thomas

    2014-06-01

    Full Text Available Computer vision technique is becoming popular for quality assessment of many products in food industries. Image enhancement is the first step in analyzing the images in order to obtain detailed information for the determination of quality. In this study, Brightness preserving histogram equalization technique was used to enhance the features of gray scale images to classify three date varieties (Khalas, Fard and Madina. Mean, entropy, kurtosis and skewness features were extracted from the original and enhanced images. Mean and entropy from original images and kurtosis from the enhanced images were selected based on Lukka's feature selection approach. An overall classification efficiency of 93.72% was achieved with just three features. Brightness preserving histogram equalization technique has great potential to improve the classification in various quality attributes of food and agricultural products with minimum features.
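
    A minimal sketch of brightness-preserving bi-histogram equalization (BBHE), a standard formulation of the technique named above: the histogram is split at the image mean and each half is equalized within its own range, keeping the output mean near the input mean:

      import numpy as np

      def bbhe(img):
          mean = int(img.mean())
          out = np.empty_like(img)

          def equalize_range(lo, hi):
              sel = (img >= lo) & (img <= hi)
              if not sel.any():
                  return
              hist = np.bincount(img[sel] - lo, minlength=hi - lo + 1)
              cdf = np.cumsum(hist) / hist.sum()
              out[sel] = (lo + np.round((hi - lo) * cdf[img[sel] - lo])).astype(img.dtype)

          equalize_range(0, mean)        # lower half mapped into [0, mean]
          equalize_range(mean + 1, 255)  # upper half mapped into [mean+1, 255]
          return out

      dates = (np.random.rand(64, 64) * 200).astype(np.uint8)  # toy grayscale image
      print(abs(float(bbhe(dates).mean()) - float(dates.mean())))  # small mean shift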

  10. ANALISIS KOMPARASI METODE PERBAIKAN KONTRAS BERBASIS HISTOGRAM EQUALIZATION PADA CITRA MEDIS

    Directory of Open Access Journals (Sweden)

    Aditya Akbar Riadi

    2017-04-01

    Full Text Available An image is a representation of the characteristics of an object under certain variable conditions. Image processing aims to improve image quality so that it is easy for humans or machines (in this case, computers) to interpret. Image processing comprises several operations, one of which is contrast enhancement, which is typically used to bring out hidden features in an image. X-ray images do not always have good quality; for example, an x-ray image may be too dark, or some bones may appear faint, so the picture is unclear. In this study, image enhancement was performed using contrast-improvement methods based on Histogram Equalization on medical images. Error measurements using the Mean Square Error show that the Contrast Limited Adaptive Histogram Equalization method performs better than the Histogram Equalization and Adaptive Histogram Equalization methods.

  11. Improving the convergence rate in affine registration of PET and SPECT brain images using histogram equalization.

    Science.gov (United States)

    Salas-Gonzalez, D; Górriz, J M; Ramírez, J; Padilla, P; Illán, I A

    2013-01-01

    A procedure to improve the convergence rate for affine registration methods of medical brain images when the images differ greatly from the template is presented. The methodology is based on a histogram matching of the source images with respect to the reference brain template before proceeding with the affine registration. The preprocessed source brain images are spatially normalized to a template using a general affine model with 12 parameters. A sum of squared differences between the source images and the template is considered as objective function, and a Gauss-Newton optimization algorithm is used to find the minimum of the cost function. Using histogram equalization as a preprocessing step improves the convergence rate in the affine registration algorithm of brain images as we show in this work using SPECT and PET brain images.
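
    A minimal sketch of the preprocessing step, assuming plain quantile (CDF) mapping of the source image onto the template's intensity distribution; the synthetic images stand in for SPECT/PET volumes:

      import numpy as np

      def match_histogram(source, template):
          """Map source intensities so their distribution matches the template."""
          s_vals, s_counts = np.unique(source.ravel(), return_counts=True)
          t_vals, t_counts = np.unique(template.ravel(), return_counts=True)
          s_cdf = np.cumsum(s_counts) / source.size
          t_cdf = np.cumsum(t_counts) / template.size
          matched = np.interp(s_cdf, t_cdf, t_vals)   # quantile-to-quantile lookup
          return matched[np.searchsorted(s_vals, source)]

      source = np.random.gamma(2.0, 20.0, (32, 32))     # toy functional image
      template = np.random.normal(100.0, 25.0, (32, 32))
      preprocessed = match_histogram(source, template)  # then run affine registration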

  12. Multi-stream LSTM-HMM decoding and histogram equalization for noise robust keyword spotting.

    Science.gov (United States)

    Wöllmer, Martin; Marchi, Erik; Squartini, Stefano; Schuller, Björn

    2011-09-01

    Highly spontaneous, conversational, and potentially emotional and noisy speech is known to be a challenge for today's automatic speech recognition (ASR) systems, which highlights the need for advanced algorithms that improve speech features and models. Histogram Equalization is an efficient method to reduce the mismatch between clean and noisy conditions by normalizing all moments of the probability distribution of the feature vector components. In this article, we propose to combine histogram equalization and multi-condition training for robust keyword detection in noisy speech. To better cope with conversational speaking styles, we show how contextual information can be effectively exploited in a multi-stream ASR framework that dynamically models context-sensitive phoneme estimates generated by a long short-term memory neural network. The proposed techniques are evaluated on the SEMAINE database, a corpus containing emotionally colored conversations with a cognitive system for "Sensitive Artificial Listening".

  13. Real time object localization based on histogram of s-RGB

    Science.gov (United States)

    Mudjirahardjo, Panca; Suyono, Hadi; Setyawan, Raden Arief

    2017-09-01

    Object localization is the first task in pattern detection and recognition. This task is important because it reduces the time spent searching for the object of interest. In this paper we introduce a novel method for object localization based on a color feature: a histogram of s-RGB. This histogram is used in the training phase to determine the dominant color in the initial Region of Interest (ROI); this information is then used to label the object of interest. To reduce noise and localize the object, we apply row and column pixel density functions. Compared with several other approaches, our system gives the best result and takes a short computation time of 26.56 ms at a video rate of 15 frames per second (fps).

  14. 3D facial expression recognition based on histograms of surface differential quantities

    KAUST Repository

    Li, Huibin

    2011-01-01

    3D face models accurately capture facial surfaces, making precise description of facial activities possible. In this paper, we present a novel mesh-based method for 3D facial expression recognition using two local shape descriptors. To characterize the shape information of the local neighborhood of facial landmarks, we calculate weighted statistical distributions of surface differential quantities, including the histogram of mesh gradient (HoG) and the histogram of shape index (HoS). A normal-cycle-theory-based curvature estimation method is employed on the 3D face models, along with the common cubic-fitting curvature estimation method for comparison. Based on the basic fact that different expressions involve different local shape deformations, an SVM classifier with both linear and RBF kernels outperforms the state-of-the-art results on the subset of the BU-3DFE database with the same experimental setting. © 2011 Springer-Verlag.

  15. Three-dimensional bead position histograms reveal single-molecule nanomechanics

    Science.gov (United States)

    Becker, Nils B.; Altmann, Stephan M.; Scholz, Tim; Hörber, J. K. Heinrich; Stelzer, Ernst H. K.; Rohrbach, Alexander

    2005-02-01

    We describe a method to investigate the structure and elasticity of macromolecules by a combination of single molecule experiments and kinematic modeling. With a photonic force microscope, we recorded spatial position histograms of a fluctuating microsphere tethered to full-length myosin-II. Assuming only that the molecule consists of concatenated rigid segments, a model derived from robot kinematics allows us to relate these histograms to the molecule's segment lengths and bending stiffnesses. Both our calculated position distributions and the experimental data show an asymmetry characteristic of a mixed entropic-enthalpic spring. Our model that fits best to experimental line profiles has two intramolecular hinges, one at the bound head domain, and another about 50 nm down the myosin tail, with a summed bending stiffness of about 3 kBT/rad.

  16. An improved contrast enhancement algorithm for infrared images based on adaptive double plateaus histogram equalization

    Science.gov (United States)

    Li, Shuo; Jin, Weiqi; Li, Li; Li, Yiyang

    2018-05-01

    Infrared thermal images can reflect the thermal-radiation distribution of a particular scene. However, the contrast of the infrared images is usually low. Hence, it is generally necessary to enhance the contrast of infrared images in advance to facilitate subsequent recognition and analysis. Based on the adaptive double plateaus histogram equalization, this paper presents an improved contrast enhancement algorithm for infrared thermal images. In the proposed algorithm, the normalized coefficient of variation of the histogram, which characterizes the level of contrast enhancement, is introduced as feedback information to adjust the upper and lower plateau thresholds. The experiments on actual infrared images show that compared to the three typical contrast-enhancement algorithms, the proposed algorithm has better scene adaptability and yields better contrast-enhancement results for infrared images with more dark areas or a higher dynamic range. Hence, it has high application value in contrast enhancement, dynamic range compression, and digital detail enhancement for infrared thermal images.

  17. A new adaptive contrast enhancement algorithm for infrared images based on double plateaus histogram equalization

    Science.gov (United States)

    Liang, Kun; Ma, Yong; Xie, Yue; Zhou, Bo; Wang, Rui

    2012-07-01

    In infrared images, detail pixels are easily immersed in a large quantity of low-contrast background pixels. In view of these characteristics, an adaptive contrast enhancement algorithm based on double-plateaus histogram equalization for infrared images is presented in this paper. The traditional double-plateaus histogram equalization algorithm uses constant thresholds and cannot adapt them to different scenes, which limits its practical use. In the proposed algorithm, the upper and lower threshold values are calculated by searching for local maxima and predicting the minimum gray interval, and are updated in real time. With the proposed algorithm, the background of the infrared image is constrained while the details are enhanced. Experimental results prove that the proposed algorithm can effectively enhance the contrast of infrared images, especially their details.
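
    A minimal sketch of the double-plateaus scheme underlying both of the records above follows; here the upper and lower plateau thresholds are plain function arguments, whereas the papers compute them adaptively from the scene.

    ```python
    import numpy as np

    def double_plateau_he(img, t_low, t_up):
        """Double-plateaus histogram equalization (sketch, 8-bit input).

        Counts above t_up are clipped down, limiting over-enhancement of
        large background areas; nonzero counts below t_low are raised,
        protecting sparse detail gray levels. The modified histogram is
        then equalized as usual.
        """
        hist, _ = np.histogram(img, bins=256, range=(0, 256))
        mod = hist.astype(np.float64)
        mod[mod > t_up] = t_up
        mod[(mod > 0) & (mod < t_low)] = t_low
        cdf = mod.cumsum() / mod.sum()
        lut = np.round(255.0 * cdf).astype(np.uint8)
        return lut[np.asarray(img, dtype=np.uint8)]
    ```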

  18. Underwater range-gated laser imaging enhancement based on contrast-limited adaptive histogram equalization

    Science.gov (United States)

    Sun, Liang; Wang, Xinwei; Liu, Xiaoquan; Ren, Pengdao; Lei, Pingshun; You, Ruirong; He, Jun; Zhou, Yan; Liu, Yuliang

    2016-10-01

    Underwater range-gated laser imaging (URGLI) still suffers from problems such as non-uniform illumination and low brightness and contrast. To address these problems, a variant of adaptive histogram equalization called contrast limited adaptive histogram equalization (CLAHE) is applied in this paper. In the experiment, CLAHE and HE were used to enhance the images, and the quality of the enhanced images was evaluated by peak signal-to-noise ratio (PSNR) and contrast. The results show that HE over-enhances the images, while CLAHE provides good enhancement, suppressing over-enhancement and the influence of non-uniform illumination. The experimental results demonstrate that CLAHE gives good image enhancement for target detection with an underwater range-gated laser imaging system.

  19. Distribution pattern of density in lumbar vertebra studied with computed tomography; A study of histogram plot

    Energy Technology Data Exchange (ETDEWEB)

    Tanno, Munehiko; Yamada, Hideo; Endou, Kazuo; Hayashida, Ko-ichi; Ide, Hiroshi; Kurihara, Norimitsu; Mashima, Yasuoki; Chiba, Kazuo (Tokyo Metropolitan Geriatric Hospital (Japan))

    1989-07-01

    The bone mineral status of the cancellous bone in the lumbar vertebrae was evaluated by analyzing density histograms and measuring the mean density by computed tomography. The results obtained were as follows: (a) the distribution pattern of bone density in the lumbar vertebrae followed a normal distribution; (b) high correlation coefficients between peak density (r=-0.79) or mean density (r=-0.77) and age were obtained in males, whereas in females peak densities were well maintained before 50 years of age and decreased abruptly thereafter. Osteoporotic vertebrae, in which multiple osteosclerotic changes were observed, had several density peaks and did not show a normal density distribution pattern. These results indicate that our method, combining analysis of density histograms and measurement of mean density, is useful for evaluating bone mineral status. (author).

  20. Moderated histogram equalization, an automatic means of enhancing the contrast in digital light micrographs reversibly.

    Science.gov (United States)

    Entwistle, A

    2004-06-01

    A means for improving the contrast in the images produced from digital light micrographs is described that requires no intervention by the experimenter: zero-order, scaling, tonally independent, moderated histogram equalization. It is based upon histogram equalization, which often results in digital light micrographs that contain regions that appear to be saturated, negatively biased or very grainy. Here a non-decreasing monotonic function is introduced into the process, which moderates the changes in contrast that are generated. This method is highly effective for all three of the main types of contrast found in digital light micrography: bright objects viewed against a dark background, e.g. fluorescence and dark-ground or dark-field image data sets; bright and dark objects sets against a grey background, e.g. image data sets collected with phase or Nomarski differential interference contrast optics; and darker objects set against a light background, e.g. views of absorbing specimens. Moreover, it is demonstrated that there is a single fixed moderating function, whose actions are independent of the number of elements of image data, which works well with all types of digital light micrographs, including multimodal or multidimensional image data sets. The use of this fixed function is very robust as the appearance of the final image is not altered discernibly when it is applied repeatedly to an image data set. Consequently, moderated histogram equalization can be applied to digital light micrographs as a push-button solution, thereby eliminating biases that those undertaking the processing might have introduced during manual processing. Finally, moderated histogram equalization yields a mapping function and so, through the use of look-up tables, indexes or palettes, the information present in the original data file can be preserved while an image with the improved contrast is displayed on the monitor screen.
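
    One plausible reading of the moderated scheme is sketched below in Python/NumPy: a fixed non-decreasing monotonic function is applied to the bin counts before the cumulative mapping is built, tempering the contrast changes of plain HE. The choice of np.sqrt as the moderating function is an assumption made purely for illustration; the record does not give the paper's exact function.

    ```python
    import numpy as np

    def moderated_he(img, moderate=np.sqrt):
        """Moderated histogram equalization (sketch).

        `moderate` stands in for the paper's fixed non-decreasing
        monotonic function; applying it to the bin counts damps the
        contrast changes that plain HE would generate.
        """
        hist, _ = np.histogram(img, bins=256, range=(0, 256))
        mod = moderate(hist.astype(np.float64))
        cdf = mod.cumsum() / mod.sum()
        lut = np.round(255.0 * cdf).astype(np.uint8)
        # The LUT itself is the mapping function, so the original data
        # can be preserved while the enhanced image is displayed.
        return lut[np.asarray(img, dtype=np.uint8)]
    ```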

  1. Chest CT window settings with multiscale adaptive histogram equalization: pilot study.

    Science.gov (United States)

    Fayad, Laura M; Jin, Yinpeng; Laine, Andrew F; Berkmen, Yahya M; Pearson, Gregory D; Freedman, Benjamin; Van Heertum, Ronald

    2002-06-01

    Multiscale adaptive histogram equalization (MAHE), a wavelet-based algorithm, was investigated as a method of automatic simultaneous display of the full dynamic contrast range of a computed tomographic image. Interpretation times were significantly lower for MAHE-enhanced images compared with those for conventionally displayed images. Diagnostic accuracy, however, was insufficient in this pilot study to allow recommendation of MAHE as a replacement for conventional window display.

  2. Conductance histogram evolution of an EC-MCBJ fabricated Au atomic point contact

    International Nuclear Information System (INIS)

    Yang Yang; Liu Junyang; Chen Zhaobin; Tian Jinghua; Jin Xi; Liu Bo; Yang Fangzu; Tian Zhongqun; Li Xiulan; Tao Nongjian; Luo Zhongzi; Lu Miao

    2011-01-01

    This work presents a study of Au conductance quantization based on a combined electrochemical deposition and mechanically controllable break junction (MCBJ) method. We describe the microfabrication process and discuss improved features of our microchip structure compared to the previous one. The improved structure prolongs the available life of the microchip and also increases the success rate of the MCBJ experiment. Stepwise changes in the current were observed at the last stage of atomic point contact breakdown and conductance histograms were constructed. The evolution of 1G₀ peak height in conductance histograms was used to investigate the probability of formation of an atomic point contact. It has been shown that the success rate in forming an atomic point contact can be improved by decreasing the stretching speed and the degree that the two electrodes are brought into contact. The repeated breakdown and formation over thousands of cycles led to a distinctive increase of 1G₀ peak height in the conductance histograms, and this increased probability of forming a single atomic point contact is discussed.

  3. Conductance histogram evolution of an EC-MCBJ fabricated Au atomic point contact

    Energy Technology Data Exchange (ETDEWEB)

    Yang Yang; Liu Junyang; Chen Zhaobin; Tian Jinghua; Jin Xi; Liu Bo; Yang Fangzu; Tian Zhongqun [State Key Laboratory of Physical Chemistry of Solid Surfaces and Department of Chemistry, College of Chemistry and Chemical Engineering, Xiamen University, Xiamen 361005 (China); Li Xiulan; Tao Nongjian [Center for Bioelectronics and Biosensors, Biodesign Institute, Department of Electrical Engineering, Arizona State University, Tempe, AZ 85287-6206 (United States); Luo Zhongzi; Lu Miao, E-mail: zqtian@xmu.edu.cn [Micro-Electro-Mechanical Systems Research Center, Pen-Tung Sah Micro-Nano Technology Institute, Xiamen University, Xiamen 361005 (China)

    2011-07-08

    This work presents a study of Au conductance quantization based on a combined electrochemical deposition and mechanically controllable break junction (MCBJ) method. We describe the microfabrication process and discuss improved features of our microchip structure compared to the previous one. The improved structure prolongs the available life of the microchip and also increases the success rate of the MCBJ experiment. Stepwise changes in the current were observed at the last stage of atomic point contact breakdown and conductance histograms were constructed. The evolution of 1G₀ peak height in conductance histograms was used to investigate the probability of formation of an atomic point contact. It has been shown that the success rate in forming an atomic point contact can be improved by decreasing the stretching speed and the degree that the two electrodes are brought into contact. The repeated breakdown and formation over thousands of cycles led to a distinctive increase of 1G₀ peak height in the conductance histograms, and this increased probability of forming a single atomic point contact is discussed.

  4. Whole brain magnetization transfer histogram analysis of pediatric acute lymphoblastic leukemia patients receiving intrathecal methotrexate therapy

    International Nuclear Information System (INIS)

    Yamamoto, Akira; Miki, Yukio; Adachi, Souichi

    2006-01-01

    Background and purpose: The purpose of this prospective study was to evaluate the hypothesis that magnetization transfer ratio (MTR) histogram analysis of the whole brain could detect early and subtle brain changes nonapparent on conventional magnetic resonance imaging (MRI) in children with acute lymphoblastic leukemia (ALL) receiving methotrexate (MTX) therapy. Materials and methods: Subjects in this prospective study comprised 10 children with ALL (mean age, 6 years; range, 0-16 years). In addition to conventional MRI, magnetization transfer images were obtained before and after intrathecal and intravenous MTX therapy. MTR values were calculated and plotted as a histogram, and peak height and location were calculated. Differences in peak height and location between pre- and post-MTX therapy scans were statistically analyzed. Conventional MRI was evaluated for abnormal signal area in white matter. Results: MTR peak height was significantly lower on post-MTX therapy scans than on pre-MTX therapy scans (p = 0.002). No significant differences in peak location were identified between pre- and post-chemotherapy imaging. No abnormal signals were noted in white matter on either pre- or post-MTX therapy conventional MRI. Conclusions: This study demonstrates that MTR histogram analysis allows better detection of early and subtle brain changes in ALL patients who receive MTX therapy than conventional MRI.

  5. Compensating Acoustic Mismatch Using Class-Based Histogram Equalization for Robust Speech Recognition

    Directory of Open Access Journals (Sweden)

    Hoirin Kim

    2007-01-01

    Full Text Available A new class-based histogram equalization method is proposed for robust speech recognition. The proposed method aims not only to compensate for the acoustic mismatch between training and test environments, but also to reduce two fundamental limitations of the conventional histogram equalization method: the discrepancy between the phonetic distributions of training and test speech data, and the nonmonotonic transformation caused by the acoustic mismatch. The algorithm employs multiple class-specific reference and test cumulative distribution functions, classifies noisy test features into their corresponding classes, and equalizes the features using their corresponding class reference and test distributions. Minimum mean-square error log-spectral amplitude (MMSE-LSA)-based speech enhancement is added just prior to the baseline feature extraction to reduce corruption by additive noise. Experiments on the Aurora2 database proved the effectiveness of the proposed method, which reduced relative errors by 62% over mel-cepstral features and by 23% over the conventional histogram equalization method.

  6. RGB Color Cube-Based Histogram Specification for Hue-Preserving Color Image Enhancement

    Directory of Open Access Journals (Sweden)

    Kohei Inoue

    2017-07-01

    Full Text Available A large number of color image enhancement methods are based on methods for grayscale image enhancement, whose main interest is contrast enhancement. However, since colors have three attributes, hue, saturation and intensity, rather than the single attribute of grayscale values, naive application of grayscale methods to color images often gives unsatisfactory results. Conventional hue-preserving color image enhancement methods utilize histogram equalization (HE) for enhancing contrast. However, they cannot always enhance saturation simultaneously. In this paper, we propose a histogram specification (HS) method for enhancing saturation in hue-preserving color image enhancement. The proposed method computes the target histogram for HS on the basis of the geometry of the RGB (red, green and blue) color space, whose shape is a cube with unit side length. Therefore, the proposed method includes no parameters to be set by users. Experimental results show that the proposed method achieves higher color saturation than recent parameter-free methods for hue-preserving color image enhancement. As a result, the proposed method can be used as an alternative to HE in hue-preserving color image enhancement.

  7. Histogram-based DNA analysis for the visualization of chromosome, genome and species information.

    Science.gov (United States)

    Costa, António M; Machado, José T; Quelhas, Maria D

    2011-05-01

    We describe a novel approach to explore DNA nucleotide sequence data, aiming to produce high-level categorical and structural information about the underlying chromosomes, genomes and species. The article starts by analyzing chromosomal data through histograms using fixed length DNA sequences. After creating the DNA-related histograms, a correlation between pairs of histograms is computed, producing a global correlation matrix. These data are then used as input to several data processing methods for information extraction and tabular/graphical output generation. A set of 18 species is processed and the extensive results reveal that the proposed method is able to generate significant and diversified outputs, in good accordance with current scientific knowledge in domains such as genomics and phylogenetics. Source code freely available for download at http://www4.dei.isep.ipp.pt/etc/dnapaper2010, implemented in Free Pascal and UNIX scripting tools. Study input data available online for download at University of California at Santa Cruz Genome Bioinformatics, http://hgdownload.cse.ucsc.edu/downloads.html.
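
    A toy version of the pipeline, computing count histograms over fixed-length DNA words and correlating them, might look as follows; the sequences and the choice k=4 are illustrative assumptions, not values from the paper.

    ```python
    import numpy as np
    from itertools import product

    def kmer_histogram(seq, k=4):
        """Normalized k-mer count histogram of one DNA sequence."""
        kmers = {"".join(p): i for i, p in enumerate(product("ACGT", repeat=k))}
        h = np.zeros(len(kmers))
        s = seq.upper()
        for i in range(len(s) - k + 1):
            j = kmers.get(s[i:i + k])
            if j is not None:      # skip windows containing N or other codes
                h[j] += 1
        return h / max(h.sum(), 1.0)

    # Pairwise correlation of the per-sequence histograms gives the global
    # correlation matrix the method feeds into further analysis.
    seqs = {"chr_a": "ACGTTGCA" * 2000, "chr_b": "AATTGGCC" * 2000}
    H = np.array([kmer_histogram(s) for s in seqs.values()])
    corr = np.corrcoef(H)
    print(corr.round(3))
    ```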

  8. Image indexing using color histogram and k-means clustering for optimization CBIR in image database

    Science.gov (United States)

    Rejito, Juli; Setiawan Abdullahi, Atje; Akmal; Setiana, Deni; Nurani Ruchjana, Budi

    2017-10-01

    Retrieving visually similar images from an image database requires high speed and accuracy. Various text- and content-based image retrieval techniques are being investigated by researchers in order to match image features exactly. In this paper, a content-based image retrieval (CBIR) system that computes color similarity among images is presented. CBIR is a set of techniques for retrieving semantically relevant images from an image database based on automatically derived image features. Color is one important visual feature of an image. This document briefly describes a system developed for retrieving images similar to a query image from a large set of distinct images, using a histogram color feature and an image index. The histogram color features are extracted and then clustered with K-Means to produce the image index. The image index is compared with the histogram color feature of the query image, and the image database is thus sorted in decreasing order of similarity. The results obtained by the proposed system confirm that partitioning image objects helps optimize the retrieval of similar images from the database. The proposed CBIR method is compared with our previously existing methodologies and found to have better retrieval accuracy than previous works on CBIR systems.
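
    The indexing scheme can be sketched in a few lines with NumPy and scikit-learn: extract a color histogram per image, cluster the histograms with K-Means to form the index, and compare a query only against its nearest cluster. All sizes and data below are synthetic placeholders.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    def color_histogram(img, bins=8):
        """3-D RGB histogram flattened to a feature vector (img: HxWx3 uint8)."""
        h, _ = np.histogramdd(img.reshape(-1, 3), bins=(bins,) * 3,
                              range=((0, 256),) * 3)
        h = h.ravel()
        return h / h.sum()

    # Index the database: cluster histograms so a query is compared only
    # against images in its nearest cluster instead of the whole database.
    rng = np.random.default_rng(1)
    database = rng.integers(0, 256, size=(200, 32, 32, 3), dtype=np.uint8)
    feats = np.array([color_histogram(im) for im in database])
    index = KMeans(n_clusters=8, n_init=10, random_state=0).fit(feats)

    query = database[0]
    qf = color_histogram(query)[None, :]
    cluster = index.predict(qf)[0]
    members = np.where(index.labels_ == cluster)[0]
    # Rank the cluster members by histogram distance to the query.
    order = members[np.argsort(np.linalg.norm(feats[members] - qf, axis=1))]
    ```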

  9. Local Histogram of Figure/Ground Segmentations for Dynamic Background Subtraction

    Directory of Open Access Journals (Sweden)

    Bineng Zhong

    2010-01-01

    Full Text Available We propose a novel feature, the local histogram of figure/ground segmentations, for robust and efficient background subtraction (BGS) in dynamic scenes (e.g., waving trees, ripples in water, illumination changes, camera jitter). We represent each pixel as a local histogram of figure/ground segmentations, which aims at combining several candidate solutions produced by simple BGS algorithms to obtain a more reliable and robust feature for BGS. The background model of each pixel is constructed as a group of weighted adaptive local histograms of figure/ground segmentations, which describe the structural properties of the surrounding region. This is a natural fusion, because multiple complementary BGS algorithms can be used to build background models for scenes. Moreover, the correlation of image variations at neighboring pixels is explicitly exploited to achieve robust detection performance, since neighboring pixels tend to be similarly affected by environmental effects (e.g., dynamic scenes). Experimental results demonstrate the robustness and effectiveness of the proposed method by comparison with four representatives of the state of the art in BGS.

  10. Local Histogram of Figure/Ground Segmentations for Dynamic Background Subtraction

    Directory of Open Access Journals (Sweden)

    Yuan Xiaotong

    2010-01-01

    Full Text Available Abstract We propose a novel feature, the local histogram of figure/ground segmentations, for robust and efficient background subtraction (BGS) in dynamic scenes (e.g., waving trees, ripples in water, illumination changes, camera jitter). We represent each pixel as a local histogram of figure/ground segmentations, which aims at combining several candidate solutions produced by simple BGS algorithms to obtain a more reliable and robust feature for BGS. The background model of each pixel is constructed as a group of weighted adaptive local histograms of figure/ground segmentations, which describe the structural properties of the surrounding region. This is a natural fusion, because multiple complementary BGS algorithms can be used to build background models for scenes. Moreover, the correlation of image variations at neighboring pixels is explicitly exploited to achieve robust detection performance, since neighboring pixels tend to be similarly affected by environmental effects (e.g., dynamic scenes). Experimental results demonstrate the robustness and effectiveness of the proposed method by comparison with four representatives of the state of the art in BGS.

  11. ULTRASOUND HISTOGRAM ASSESSMENT OF PAROTID GLAND INJURY FOLLOWING HEAD-AND-NECK RADIOTHERAPY: A FEASIBILITY STUDY

    Science.gov (United States)

    Yang, Xiaofeng; Tridandapani, Srini; Beitler, Jonathan J.; Yu, David S.; Yoshida, Emi J.; Curran, Walter J.; Liu, Tian

    2012-01-01

    Xerostomia (dry mouth), resulting from radiation damage to the parotid glands, is one of the most common and distressing side effects of head-and-neck cancer radiotherapy. A noninvasive, objective imaging method to assess parotid injury is lacking, but much needed in the clinic. Therefore, we investigated echo histograms to quantitatively evaluate the morphologic and microstructural integrity of the parotid glands. Six sonographic features were derived from the echo-intensity histograms to assess the echogenicity, homogeneity and heterogeneity of the parotid gland: (1) the peak intensity value (Ipeak), (2) the −3-dB intensity width (W3-dB), (3) the low (<50% Ipeak) intensity width (Wlow), (4) the high (>50% Ipeak) intensity width (Whigh), (5) the area of low intensity (Alow) and (6) the area of high intensity (Ahigh). In this pilot study, 12 post-radiotherapy patients and seven healthy volunteers were enrolled. Significant differences (p < 0.05) were observed between the post-radiotherapy and healthy parotid glands. In summary, we developed a family of sonographic features derived from echo histograms and demonstrated the feasibility of quantitative evaluation of radiation-induced parotid-gland injury. PMID:22766120

  12. Whole brain magnetization transfer histogram analysis of pediatric acute lymphoblastic leukemia patients receiving intrathecal methotrexate therapy

    Energy Technology Data Exchange (ETDEWEB)

    Yamamoto, Akira [Department of Diagnostic Imaging and Nuclear Medicine, Graduate School of Medicine, Kyoto University, 54 Kawahara-cho, Shogoin, Sakyo-ku, Kyoto-shi Kyoto 606-8507 (Japan)]. E-mail: yakira@kuhp.kyoto-u.ac.jp; Miki, Yukio [Department of Diagnostic Imaging and Nuclear Medicine, Graduate School of Medicine, Kyoto University, 54 Kawahara-cho, Shogoin, Sakyo-ku, Kyoto-shi Kyoto 606-8507 (Japan)]. E-mail: mikiy@kuhp.kyoto-u.ac.jp; Adachi, Souichi [Department of Pediatrics, Graduate School of Medicine, Kyoto University, 54 Kawahara-cho, Shogoin, Sakyo-ku, Kyoto-shi Kyoto 606-8507 (Japan)]. E-mail: sadachi@kuhp.kyoto-u.ac.jp (and others)

    2006-03-15

    Background and purpose: The purpose of this prospective study was to evaluate the hypothesis that magnetization transfer ratio (MTR) histogram analysis of the whole brain could detect early and subtle brain changes nonapparent on conventional magnetic resonance imaging (MRI) in children with acute lymphoblastic leukemia (ALL) receiving methotrexate (MTX) therapy. Materials and methods: Subjects in this prospective study comprised 10 children with ALL (mean age, 6 years; range, 0-16 years). In addition to conventional MRI, magnetization transfer images were obtained before and after intrathecal and intravenous MTX therapy. MTR values were calculated and plotted as a histogram, and peak height and location were calculated. Differences in peak height and location between pre- and post-MTX therapy scans were statistically analyzed. Conventional MRI was evaluated for abnormal signal area in white matter. Results: MTR peak height was significantly lower on post-MTX therapy scans than on pre-MTX therapy scans (p = 0.002). No significant differences in peak location were identified between pre- and post-chemotherapy imaging. No abnormal signals were noted in white matter on either pre- or post-MTX therapy conventional MRI. Conclusions: This study demonstrates that MTR histogram analysis allows better detection of early and subtle brain changes in ALL patients who receive MTX therapy than conventional MRI.

  13. Statistical Analysis of Photopyroelectric Signals using Histogram and Kernel Density Estimation for differentiation of Maize Seeds

    Science.gov (United States)

    Rojas-Lima, J. E.; Domínguez-Pacheco, A.; Hernández-Aguilar, C.; Cruz-Orea, A.

    2016-09-01

    Considering the need for alternative photothermal approaches to characterize nonhomogeneous materials such as maize seeds, the objective of this research was to statistically analyze the amplitude variations of photopyroelectric signals, by means of nonparametric techniques such as the histogram and the kernel density estimator, and to estimate the probability density function of the amplitude variations for two genotypes of maize seed with different pigmentations and structural components: crystalline and floury. To determine whether the probability density function had a known parametric form, the histogram was computed first; it did not correspond to any known parametric form, so the kernel density estimator with a Gaussian kernel, with an efficiency of 95% in density estimation, was used to obtain the probability density function. The results indicated that maize seeds can be differentiated in terms of the statistical values for floury and crystalline seeds, such as the mean (93.11, 159.21), variance (1.64×10³, 1.48×10³) and standard deviation (40.54, 38.47) obtained from the amplitude variations of the photopyroelectric signals in the histogram approach. For the kernel density estimator, the seeds can be differentiated by the kernel bandwidth (smoothing constant) h of 9.85 and 6.09 for floury and crystalline seeds, respectively.
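
    The two estimators compared in the study are both one-liners in NumPy/SciPy, as the sketch below shows; the synthetic normal draws reuse the reported means and standard deviations purely as stand-ins for real photopyroelectric signals.

    ```python
    import numpy as np
    from scipy.stats import gaussian_kde

    # Synthetic stand-ins for the photopyroelectric amplitude series; the
    # means and standard deviations echo the reported values.
    rng = np.random.default_rng(2)
    signals = {
        "floury": rng.normal(93.11, 40.54, size=2000),
        "crystalline": rng.normal(159.21, 38.47, size=2000),
    }

    for name, x in signals.items():
        hist, edges = np.histogram(x, bins="auto", density=True)  # nonparametric
        kde = gaussian_kde(x)        # Gaussian kernel, Scott's-rule bandwidth
        grid = np.linspace(x.min(), x.max(), 256)
        density = kde(grid)          # smooth probability density estimate
        print(name, "approx. bandwidth:", round(kde.factor * x.std(ddof=1), 2))
    ```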

  14. 25 miljoni dollari küsimus : kus on bin Laden? / Kaivo Kopli

    Index Scriptorium Estoniae

    Kopli, Kaivo

    2006-01-01

    One of the reasons why bin Laden has not been captured is the difficult cooperation between Pakistan and Afghanistan. Officials from the United States, Afghanistan and Pakistan mostly agree that the closest anyone has come to capturing the al-Qaida leader was at Tora Bora.

  15. Performance evaluation of an ox-drawn ridging plough in a soil-bin ...

    African Journals Online (AJOL)

    An ox-drawn ridging plough was developed using the Godwin-Spoor narrow tine soil force prediction model. The plough was evaluated in a sandy loam soil in the soil-bin at Cranfield University, Silsoe. The objectives were to compare predicted with measured draught and vertical forces, and cross-sectional area of soil ...

  16. VizieR Online Data Catalog: WASP-80b wavelength-binned light curves (Kirk+, 2018)

    Science.gov (United States)

    Kirk, J.; Wheatley, P. J.; Louden, T.; Skillen, I.; King, G. W.; McCormac, J.; Irwin, P. G. J.

    2018-02-01

    This table contains the wavelength binned light curves of the two transits of WASP-80b observed with the ACAM instrument on the William Herschel Telescope on the nights of the 2016 August 18 and 2016 August 21. (2 data files).

  17. Bin-Picking based on Harmonic Shape Contexts and Graph-Based Matching

    DEFF Research Database (Denmark)

    Moeslund, Thomas B.; Kirkegaard, Jakob

    2006-01-01

    In this work we address the general bin-picking problem where 3D data is available. We apply Harmonic Shape Contexts (HSC) features since these are invariant to translation, scale, and 3D rotation. Each object is divided into a number of sub-models each represented by a number of HSC features. Th...

  18. ANALISIS TINGKAT OPTIMASI ALGORITMA GENETIKA DALAM HUKUM KETETAPAN HARDY-WEINBERG PADA BIN PACKING PROBLEM

    Directory of Open Access Journals (Sweden)

    Terry Noviar Panggabean

    2016-08-01

    Full Text Available Abstract: Because they are abstract representations of real decision-making systems in everyday life, combinatorial optimization problems are generally very difficult to solve. The bin packing problem is a classic combinatorial optimization problem, used to pack a finite set of objects optimally. A series of hybrid approaches has been developed to solve the bin packing problem. Metaheuristics are high-level approaches that guide the modification of other heuristic methods in search of better optimization. The genetic algorithm is a metaheuristic method used to solve a variety of problems by improving optimization. There are many variants of genetic algorithms. This research presents the taxonomy of parallel genetic algorithms, which have better performance and scalability than conventional genetic algorithms, but are suited only to heterogeneous computer networks and distributed systems. Based on previous research and the above considerations, the authors investigate how to apply the Hardy-Weinberg equilibrium law from biology within a genetic algorithm and analyze the resulting level of optimization of the bin packing problem. Keywords: Genetic Algorithm, Hardy-Weinberg, Bin Packing Problem.
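
    For orientation, the heuristic core that genetic-algorithm encodings for bin packing commonly build on is the first-fit decoder sketched below; random orderings stand in for an evolved population, and the bin count serves as the (inverse) fitness. This is illustrative scaffolding, not the Hardy-Weinberg variant the paper studies.

    ```python
    import random

    def first_fit(items, capacity):
        """Decode an ordering of item sizes into bins with the first-fit
        rule; the number of bins used acts as the (inverse) fitness."""
        bins = []
        for item in items:
            for b in bins:
                if sum(b) + item <= capacity:
                    b.append(item)
                    break
            else:
                bins.append([item])
        return bins

    random.seed(0)
    items = [random.randint(1, 10) for _ in range(60)]
    # Random orderings stand in for a GA population; a real GA would
    # evolve these orderings with selection, crossover and mutation.
    best = min((first_fit(random.sample(items, len(items)), 10)
                for _ in range(500)), key=len)
    print("bins used:", len(best))
    ```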

  19. The Peril of Hasty Triumphalism and Osama bin Laden’s Death

    Directory of Open Access Journals (Sweden)

    Eugenio Lilli

    2011-05-01

    Full Text Available On May 1, 2011 the headlines of a large number of newspapers and TV channels around the world were saying “justice has been done”. Those were the words used by the US President Barack Obama to announce to the world the killing of Osama bin Laden, the number one terrorist on the US most-wanted list.

  20. The effect of stocking density and bin feeder space on performance ...

    African Journals Online (AJOL)

    Unknown

    Pigs housed individually have been shown to have higher feed intakes and consequently better performance than grouped pigs (Nielsen et al., 1996). This experiment was designed to determine the effect of feed bin number on growth and feed intake.

  1. Research and Development of a New Waste Collection Bin to Facilitate Education in Plastic Recycling

    Science.gov (United States)

    Chow, Cheuk-fai; So, Wing-Mui Winnie; Cheung, Tsz-Yan

    2016-01-01

    Plastic recycling has been an alternative method for solid waste management apart from landfill and incineration. However, recycling quality is affected when all plastics are discarded into a single recycling bin that increases cross contaminations and operation cost to the recycling industry. Following the engineering design process, a new…

  2. Fisher Matrix-based Predictions for Measuring the z= 3.35 Binned ...

    Indian Academy of Sciences (India)

    Journal of Astrophysics and Astronomy, Volume 38, Issue 1. Fisher Matrix-based Predictions for Measuring the z = 3.35 Binned 21-cm Power Spectrum using the Ooty Wide Field Array (OWFA). Anjan Kumar Sarkar, Somnath Bharadwaj, Sk. Saiyad Ali. Review Article, Volume 38, Issue 1, March 2017, Article ID ...

  3. The effect of stocking density and bin feeder space on performance ...

    African Journals Online (AJOL)

    The effect of stocking density and bin feeder space on performance in pigs. G.A. Lavers, N.S. Ferguson. (South African J of Animal Science, 2000, 30, Supplement 1: 70-71).

  4. The effect of stocking density and bin feeder space on performance ...

    African Journals Online (AJOL)

    Unknown

    G.A. Lavers and N.S. Ferguson, School of Agricultural Sciences & Agribusiness, University of Natal, P Bag X01, Scottsville 3209. Pigs housed individually have been shown to have higher feed intakes ...

  5. Effects of Outside Air Temperature on Movement of Phosphine Gas in Concrete Elevator Bins

    Science.gov (United States)

    Studies that measured the movement and concentration of phosphine gas in upright concrete bins over time indicated that fumigant movement was dictated by air currents, which in turn, were a function of the difference between the average grain temperature and the average outside air temperature durin...

  6. Effects of Number and Location of Bins on Plastic Recycling at a University

    Science.gov (United States)

    O'Connor, Ryan T.; Lerman, Dorothea C.; Fritz, Jennifer N.; Hodde, Henry B.

    2010-01-01

    The proportion of plastic bottles that consumers placed in appropriate recycling receptacles rather than trash bins was examined across 3 buildings on a university campus. We extended previous research on interventions to increase recycling by controlling the number of recycling receptacles across conditions and by examining receptacle location…

  7. Cost allocation in distribution planning

    Energy Technology Data Exchange (ETDEWEB)

    Engevall, S.

    1996-12-31

    This thesis concerns cost allocation problems in distribution planning. The cost allocation problems we study are illustrated using the distribution planning situation at the Logistics department of Norsk Hydro Olje AB. The planning situation is modeled as a Traveling Salesman Problem and a Vehicle Routing Problem with an inhomogeneous fleet. The cost allocation problems are the problems of how to divide the transportation costs among the customers served in each problem. The cost allocation problems are formulated as cooperative games, in characteristic function form, where the customers are defined to be the players. The games contain five and 21 players respectively. Game theoretical solution concepts such as the core, the nucleolus, the Shapley value and the τ-value are discussed. From the empirical results we can, among other things, conclude that the core of the Traveling Salesman Game is large, and that the core of the Vehicle Routing Game is empty. In the accounting of Norsk Hydro the cost per m³ can be found for each tour. We conclude that for a certain definition of the characteristic function, a cost allocation according to this principle will not be included in the core of the Traveling Salesman Game. The models and methods presented in this thesis can be applied to transportation problems similar to that of Norsk Hydro, independent of the type of products that are delivered. 96 refs, 11 figs, 26 tabs

  8. Cost allocation in distribution planning

    International Nuclear Information System (INIS)

    Engevall, S.

    1996-01-01

    This thesis concerns cost allocation problems in distribution planning. The cost allocation problems we study are illustrated using the distribution planning situation at the Logistics department of Norsk Hydro Olje AB. The planning situation is modeled as a Traveling Salesman Problem and a Vehicle Routing Problem with an inhomogeneous fleet. The cost allocation problems are the problems of how to divide the transportation costs among the customers served in each problem. The cost allocation problems are formulated as cooperative games, in characteristic function form, where the customers are defined to be the players. The games contain five and 21 players respectively. Game theoretical solution concepts such as the core, the nucleolus, the Shapley value and the τ-value are discussed. From the empirical results we can, among other things, conclude that the core of the Traveling Salesman Game is large, and that the core of the Vehicle Routing Game is empty. In the accounting of Norsk Hydro the cost per m³ can be found for each tour. We conclude that for a certain definition of the characteristic function, a cost allocation according to this principle will not be included in the core of the Traveling Salesman Game. The models and methods presented in this thesis can be applied to transportation problems similar to that of Norsk Hydro, independent of the type of products that are delivered. 96 refs, 11 figs, 26 tabs

  9. The binning of metagenomic contigs for microbial physiology of mixed cultures.

    Science.gov (United States)

    Strous, Marc; Kraft, Beate; Bisdorf, Regina; Tegetmeyer, Halina E

    2012-01-01

    So far, microbial physiology has dedicated itself mainly to pure cultures. In nature, cross feeding and competition are important aspects of microbial physiology and these can only be addressed by studying complete communities such as enrichment cultures. Metagenomic sequencing is a powerful tool to characterize such mixed cultures. In the analysis of metagenomic data, well established algorithms exist for the assembly of short reads into contigs and for the annotation of predicted genes. However, the binning of the assembled contigs or unassembled reads is still a major bottleneck and required to understand how the overall metabolism is partitioned over different community members. Binning consists of the clustering of contigs or reads that apparently originate from the same source population. In the present study eight metagenomic samples from the same habitat, a laboratory enrichment culture, were sequenced. Each sample contained 13-23 Mb of assembled contigs and up to eight abundant populations. Binning was attempted with existing methods but they were found to produce poor results, were slow, dependent on non-standard platforms or produced errors. A new binning procedure was developed based on multivariate statistics of tetranucleotide frequencies combined with the use of interpolated Markov models. Its performance was evaluated by comparison of the results between samples with BLAST and in comparison to existing algorithms for four publicly available metagenomes and one previously published artificial metagenome. The accuracy of the new approach was comparable to or higher than that of existing methods. Further, it was up to 100 times faster. It was implemented in Java Swing as a complete open source graphical binning application available for download and further development (http://sourceforge.net/projects/metawatt).
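
    The first stage of such a binning pipeline can be sketched compactly: compute a tetranucleotide-frequency vector per contig, reduce it with multivariate statistics (PCA here), and cluster. The interpolated-Markov-model refinement the authors add is omitted, and the toy contigs below are placeholders.

    ```python
    import numpy as np
    from itertools import product
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans

    TETRAMERS = {"".join(p): i for i, p in enumerate(product("ACGT", repeat=4))}

    def tetranucleotide_freq(contig):
        # Normalized tetranucleotide frequency vector of one contig.
        v = np.zeros(len(TETRAMERS))
        s = contig.upper()
        for i in range(len(s) - 3):
            j = TETRAMERS.get(s[i:i + 4])
            if j is not None:      # skip windows containing ambiguous bases
                v[j] += 1
        return v / max(v.sum(), 1.0)

    # Multivariate statistics of the frequencies followed by clustering
    # approximates the first stage of the described binning procedure.
    contigs = ["ACGT" * 500, "AATT" * 500, "ACGG" * 500, "AATA" * 500]
    X = np.array([tetranucleotide_freq(c) for c in contigs])
    Z = PCA(n_components=2).fit_transform(X)
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(Z)
    ```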

  10. The binning of metagenomic contigs for microbial physiology of mixed cultures

    Directory of Open Access Journals (Sweden)

    Marc eStrous

    2012-12-01

    Full Text Available So far, microbial physiology has dedicated itself mainly to pure cultures. In nature, cross feeding and competition are important aspects of microbial physiology and these can only be addressed by studying complete communities such as enrichment cultures. Metagenomic sequencing is a powerful tool to characterize such mixed cultures. In the analysis of metagenomic data, well established algorithms exist for the assembly of short reads into contigs and for the annotation of predicted genes. However, the binning of the assembled contigs or unassembled reads is still a major bottleneck and required to understand how the overall metabolism is partitioned over different community members. Binning consists of the clustering of contigs or reads that apparently originate from the same source population. In the present study eight metagenomic samples originating from the same habitat, a laboratory enrichment culture, were sequenced. Each sample contained 13-23 Mb of assembled contigs and up to eight abundant populations. Binning was attempted with existing methods but they were found to produce poor results, were slow, dependent on non-standard platforms or produced errors. A new binning procedure was developed based on multivariate statistics of tetranucleotide frequencies combined with the use of interpolated Markov models. Its performance was evaluated by comparison of the results between samples with BLAST and in comparison to existing algorithms for four publicly available metagenomes and one previously published artificial metagenome. The accuracy of the new approach was comparable to or higher than that of existing methods. Further, it was up to a hundred times faster. It was implemented in Java Swing as a complete open source graphical binning application available for download and further development (http://sourceforge.net/projects/metawatt).

  11. Centralized Allocation in Multiple Markets

    DEFF Research Database (Denmark)

    Monte, Daniel; Tumennasan, Norovsambuu

The problem of allocating indivisible objects to different agents, where each individual is assigned at most one object, has been widely studied. Pápai (2000) shows that the set of strategy-proof, nonbossy, Pareto optimal and reallocation-proof rules are hierarchical exchange rules, generalizations of Gale's Top Trading Cycles mechanism. We study the centralized allocation that takes place in multiple markets: for example, the assignment of multiple types of indivisible objects, or the assignment of objects in successive periods. We show that the set of strategy-proof, Pareto efficient...

  12. Development of a new bin filler for apple harvesting and infield sorting with a review of existing technologies

    Science.gov (United States)

    The bin filler, which receives apples from the sorting system and then places them in the bin evenly without causing bruise damage, plays a critical role for the self-propelled apple harvest and infield sorting (HIS) machine that is being developed in our laboratory. Two major technical challenges ...

  13. Solving the non-oriented three-dimensional bin packing problem with stability and load bearing constraints

    DEFF Research Database (Denmark)

    Hansen, Jesper

    2003-01-01

    The three-dimensional bin packing problem is concerned with packing a given set of rectangular items into rectangular bins. We are interested in solving real-life problems where rotations of items are allowed and the packings must be packable and stable. Load bearing of items is taken into account...

  14. Preserving the allocation ratio at every allocation with biased coin randomization and minimization in studies with unequal allocation.

    Science.gov (United States)

    Kuznetsova, Olga M; Tymofyeyev, Yevgen

    2012-04-13

    The demand for unequal allocation in clinical trials is growing. Most commonly, the unequal allocation is achieved through permuted block randomization. However, other allocation procedures might be required to better approximate the allocation ratio in small samples, reduce the selection bias in open-label studies, or balance on baseline covariates. When these allocation procedures are generalized to unequal allocation, special care is to be taken to preserve the allocation ratio at every allocation step. This paper offers a way to expand the biased coin randomization to unequal allocation that preserves the allocation ratio at every allocation. The suggested expansion works with biased coin randomization that balances only on treatment group totals and with covariate-adaptive procedures that use a random biased coin element at every allocation. Balancing properties of the allocation ratio preserving biased coin randomization and minimization are described through simulations. It is demonstrated that these procedures are asymptotically protected against the shift in the rerandomization distribution identified for some examples of minimization with 1:2 allocation. The asymptotic shift in the rerandomization distribution of the difference in treatment means for an arbitrary unequal allocation procedure is explicitly derived in the paper. Copyright © 2011 John Wiley & Sons, Ltd.
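
    To make the balancing idea concrete, here is a small, illustrative deficit-weighted biased coin for unequal (e.g., 1:2) allocation; it keeps the realized counts near the target ratio at every step while staying randomized. It is a sketch in the spirit of such procedures, not the exact rule derived in the paper.

    ```python
    import numpy as np

    def biased_coin_sequence(n, ratio=(1, 2), bias=2.0, rng=None):
        """Deficit-weighted biased coin for unequal allocation (sketch).

        At each step, any arm lagging its target share gets its target
        probability multiplied by `bias` (then renormalized), so the
        realized counts track the target ratio without the per-step
        allocation ever becoming deterministic.
        """
        rng = rng or np.random.default_rng()
        w = np.asarray(ratio, float) / sum(ratio)
        counts = np.zeros(len(w))
        assignments = []
        for t in range(1, n + 1):
            deficit = w * t - counts          # how far each arm lags its target
            p = np.where(deficit > 0, w * bias, w)
            p = p / p.sum()
            arm = rng.choice(len(w), p=p)
            counts[arm] += 1
            assignments.append(arm)
        return assignments, counts

    seq, counts = biased_coin_sequence(300, ratio=(1, 2))
    print(counts)   # close to 100 / 200
    ```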

  15. A novel approach to find and optimize bin locations and collection routes using a geographic information system.

    Science.gov (United States)

    Erfani, Seyed Mohammad Hassan; Danesh, Shahnaz; Karrabi, Seyed Mohsen; Shad, Rouzbeh

    2017-07-01

    One of the major challenges in big cities is planning and implementation of an optimized, integrated solid waste management system. This optimization is crucial if environmental problems are to be prevented and the expenses to be reduced. A solid waste management system consists of many stages including collection, transfer and disposal. In this research, an integrated model was proposed and used to optimize two functional elements of municipal solid waste management (storage and collection systems) in the Ahmadabad neighbourhood located in the City of Mashhad - Iran. The integrated model was performed by modelling and solving the location allocation problem and capacitated vehicle routing problem (CVRP) through Geographic Information Systems (GIS). The results showed that the current collection system is not efficient owing to its incompatibility with the existing urban structure and population distribution. Application of the proposed model could significantly improve the storage and collection system. Based on the results of minimizing facilities analyses, scenarios with 100, 150 and 180 m walking distance were considered to find optimal bin locations for Alamdasht, C-metri and Koohsangi. The total number of daily collection tours was reduced to seven as compared to the eight tours carried out in the current system (12.50% reduction). In addition, the total number of required crews was minimized and reduced by 41.70% (24 crews in the current collection system vs 14 in the system provided by the model). The total collection vehicle routing was also optimized such that the total travelled distances during night and day working shifts was cut back by 53%.

  16. Allocation Problems and Market Design

    DEFF Research Database (Denmark)

    Smilgins, Aleksandrs

The thesis contains six independent papers with a common theme: allocation problems and market design. The first paper is concerned with fair allocation of risk capital, where independent autonomous subunits have risky activities and together constitute the entity's total risk, whose associated risk... Another paper extends the one-to-one matching model by including a set of objects, such that a matching consists of two agents from disjoint sets and an object. Agents' preference lists consist of all possible pairs of objects and agents from the other set, and thus contain important information about agent-object tradeoffs. The notion...

  17. Regulating nutrient allocation in plants

    Science.gov (United States)

    Udvardi, Michael; Yang, Jiading; Worley, Eric

    2014-12-09

    The invention provides coding and promoter sequences for a VS-1 and AP-2 gene, which affects the developmental process of senescence in plants. Vectors, transgenic plants, seeds, and host cells comprising heterologous VS-1 and AP-2 genes are also provided. Additionally provided are methods of altering nutrient allocation and composition in a plant using the VS-1 and AP-2 genes.

  18. Operational risk economic capital allocation

    Science.gov (United States)

    Nikonov, Oleg I.; Vlasov, Vladimir E.; Medvedeva, Marina A.

    2013-10-01

    In the paper we describe a model for the estimation and allocation of operational risk economic capital based on the Loss Distribution Approach (LDA). The bank's total losses are modeled through Monte-Carlo simulations of its business units' losses. This allows the corresponding capital to be distributed fairly between business units in order to assess and manage their risk-adjusted performance.
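
    A bare-bones version of the described LDA simulation and allocation might look as follows; the frequency and severity parameters are invented for illustration, and the Euler-style tail allocation is one common choice rather than the authors' exact rule.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n_sims, units = 20_000, 3

    # Illustrative LDA inputs: Poisson event frequency and lognormal
    # severity per business unit (parameters are invented).
    freq = [5.0, 2.0, 8.0]
    mu, sigma = [10.0, 11.0, 9.5], [0.8, 1.0, 0.6]

    unit_losses = np.zeros((n_sims, units))
    for u in range(units):
        n_events = rng.poisson(freq[u], size=n_sims)
        for i, k in enumerate(n_events):
            if k:
                unit_losses[i, u] = rng.lognormal(mu[u], sigma[u], size=k).sum()

    total = unit_losses.sum(axis=1)
    capital = np.quantile(total, 0.999)      # economic capital at 99.9% VaR

    # Euler-style allocation: average unit contribution in the scenarios
    # whose total loss lies near the capital quantile, rescaled to capital.
    near = np.abs(total - capital) <= np.quantile(np.abs(total - capital), 0.01)
    shares = unit_losses[near].mean(axis=0)
    allocation = shares / shares.sum() * capital
    print(dict(zip(["unit_a", "unit_b", "unit_c"], allocation.round(0))))
    ```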

  19. Planning and Resource Allocation Management.

    Science.gov (United States)

    Coleman, Jack W.

    1986-01-01

    Modern scientific management techniques provide college administrators with valuable planning and resource allocation insights and enhance the decision process. The planning model should incorporate assessment, strategic planning, dynamic and long-term budgeting, operational planning, and feedback and control for actual operations. (MSE)

  20. Designing for dynamic task allocation

    NARCIS (Netherlands)

    Dongen, C.J.G. van; Maanen, P.P. van

    2005-01-01

    Future platforms are envisioned in which human-machine teams are able to share and trade tasks as demands in situations change. It seems that human-machine coordination has not received the attention it deserves by past and present approaches to task allocation. In this paper a simple way to make

  1. Governance and Foreign Aid Allocation

    Science.gov (United States)

    2006-07-01

    In addition, this chapter develops a microeconomic model to explore donors' aid allocation decisions and their potential impact on aid effectiveness...the promotion of market-based principles to restructure macroeconomic policies in developing countries. The greater focus on...be explained by the successful development efforts of a number of countries in the region, such as the Republic of Korea, Malaysia, Singapore, and Peoples

  2. Comparison of an alternative and existing binning methods to reduce the acquisition duration of 4D PET/CT

    Energy Technology Data Exchange (ETDEWEB)

    Didierlaurent, David, E-mail: dadidierlaurent@gmail.com; Ribes, Sophie; Caselles, Olivier [SIMAD, LU 50, Université Toulouse III Paul Sabatier, Toulouse 31062 (France); Jaudet, Cyril; Dierickx, Lawrence O.; Zerdoud, Slimane; Brillouet, Severine; Weits, Kathleen [Institut Claudius Regaud, 20-24 Rue du Pont Saint-Pierre, Toulouse 31052 (France); Batatia, Hadj [IRIT-INPT, Université Toulouse III Paul Sabatier, Toulouse 31071 (France); Courbon, Frédéric [GCS CHU-CLCC, Université Toulouse III Paul Sabatier, Toulouse 31052 (France)

    2014-11-01

    Purpose: Respiratory motion is a source of artifacts that reduce image quality in PET. Four-dimensional (4D) PET/CT is one approach to overcoming this problem. Existing techniques for limiting the effects of respiratory motion are based on prospective phase binning, which requires a long acquisition (15–25 min). This duration is uncomfortable for patients and limits the clinical exploitation of 4D PET/CT. In this work, the authors evaluated an existing method and an alternative retrospective binning method to reduce the acquisition duration of 4D PET/CT. Methods: The authors studied an existing mixed-amplitude binning (MAB) method and an alternative mixed-phase binning (MPhB) method. Before implementing MPhB, they analyzed the regularity of the breathing patterns in patients. They studied the breathing-signal drift and the missing CT slices that can make implementing MAB challenging. They compared the performance of MAB and MPhB with current binning methods in measuring the maximum uptake, internal volume, and maximal range of tumor motion. Results: MPhB can be implemented around an optimal phase (on average, the exhalation peak phase −4.1% of the entire breathing cycle duration). The signal drift of patients was on average 35% relative to the breathing amplitude. Even after correcting this drift, MAB was feasible in 4D CT for only 64% of patients. No significant differences appeared between the different binning methods in measuring the maximum uptake, internal volume, and maximal range of tumor motion. The authors also determined the inaccuracies of MAB and MPhB in measuring the maximum amplitude of tumor motion with three bins (less than 3 mm for movement inferior to 12 mm, up to 6.4 mm for a 21 mm movement). Conclusions: The authors proposed an alternative mixed-phase binning method that halves the acquisition duration of 4D PET/CT. Mixed-amplitude binning was challenging because of signal drift and missing CT slices. They showed that more
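
    Retrospective phase binning of the kind evaluated here reduces, at its core, to assigning each list-mode event a respiratory phase between consecutive breathing peaks. The sketch below shows that step only, with hypothetical inputs; it is not the authors' MPhB implementation.

    ```python
    import numpy as np

    def phase_bin(event_times, peak_times, n_bins=3):
        """Assign each event a respiratory phase bin (sketch).

        The phase of an event is its fractional position between the two
        surrounding breathing peaks; events outside the recorded peaks
        are flagged invalid rather than binned.
        """
        event_times = np.asarray(event_times, float)
        peak_times = np.asarray(peak_times, float)
        idx = np.searchsorted(peak_times, event_times, side="right") - 1
        ok = (idx >= 0) & (idx < len(peak_times) - 1)
        t, i = event_times[ok], idx[ok]
        phase = (t - peak_times[i]) / (peak_times[i + 1] - peak_times[i])
        return np.floor(phase * n_bins).astype(int), ok

    # Hypothetical breathing peaks every ~4 s and random event timestamps.
    rng = np.random.default_rng(6)
    peaks = np.cumsum(rng.normal(4.0, 0.3, size=50))
    events = rng.uniform(peaks[0], peaks[-1], size=10_000)
    bins, ok = phase_bin(events, peaks)
    ```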

  3. CHOBS: Color Histogram of Block Statistics for Automatic Bleeding Detection in Wireless Capsule Endoscopy Video.

    Science.gov (United States)

    Ghosh, Tonmoy; Fattah, Shaikh Anowarul; Wahid, Khan A

    2018-01-01

    Wireless capsule endoscopy (WCE) is the most advanced technology for visualizing the whole gastrointestinal (GI) tract in a non-invasive way. Its major disadvantage, however, is the long reviewing time, which is laborious because continuous manual intervention is necessary. In order to reduce the burden on the clinician, this paper proposes an automatic bleeding detection method for WCE video based on the color histogram of block statistics, namely CHOBS. A single pixel in a WCE image may be distorted due to capsule motion in the GI tract, so instead of considering individual pixel values, a block surrounding each pixel is chosen for extracting local statistical features. By combining the local block features of the three color planes of the RGB color space, an index value is defined. A color histogram extracted from these index values provides a distinguishable color texture feature. A feature reduction technique utilizing the color histogram pattern and principal component analysis is proposed, which can drastically reduce the feature dimension. For bleeding zone detection, blocks are classified using the already extracted local features, which adds no extra computational burden for feature extraction. From extensive experimentation on several WCE videos and 2300 images collected from a publicly available database, very satisfactory bleeding frame and zone detection performance is achieved in comparison with existing methods. For bleeding frame detection, the accuracy, sensitivity, and specificity obtained by the proposed method are 97.85%, 99.47%, and 99.15%, respectively, and for bleeding zone detection a precision of 95.75% is achieved. The proposed method offers not only a low feature dimension but also highly satisfactory bleeding detection performance, and can effectively detect bleeding frames and zones in continuous WCE video data.
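
    A minimal sketch of the block-statistics idea described above, assuming a 7×7 block, the local mean as the block statistic, and 8 quantization levels per RGB plane; the paper's exact block size, statistics, and index construction are not reproduced here:

```python
import numpy as np

def chobs_feature(img, block=7, levels=8):
    """CHOBS-style color histogram of block statistics (sketch).
    img: HxWx3 uint8 RGB image. Block size, statistic, and
    quantization are illustrative assumptions."""
    h, w, _ = img.shape
    r = block // 2
    index = np.zeros((h, w), dtype=np.int64)
    for c in range(3):                      # R, G, B planes
        plane = img[:, :, c].astype(np.float64)
        pad = np.pad(plane, r, mode="edge")
        mean = np.zeros_like(plane)
        for dy in range(block):             # box filter = local block mean
            for dx in range(block):
                mean += pad[dy:dy + h, dx:dx + w]
        mean /= block * block
        q = np.minimum((mean * levels / 256.0).astype(np.int64), levels - 1)
        index = index * levels + q          # pack the three plane codes
    hist = np.bincount(index.ravel(), minlength=levels ** 3)
    return hist / hist.sum()                # normalized histogram feature
```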

  4. Random variation and correlation of the weather data series – evaluation and simulation using bounded histograms

    Directory of Open Access Journals (Sweden)

    Konecny Petr

    2017-01-01

    Full Text Available This contribution focuses on the random variation and correlation of input parameters for the description of climate data. Climate data such as ambient temperature, solar intensity, wind speed and wind direction are random in nature. The description of ambient temperature can be based on climate data time series in the form of a climatic “load duration curve”. Particular input parameters, such as ambient temperature and solar radiation, exhibit significant correlation. Attention is paid especially to the evaluation of the histograms and to their application in Monte Carlo simulation that accounts for the correlation of the particular parameters.
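
    As a rough illustration of Monte Carlo simulation from a bounded histogram, the sketch below fits a histogram to observed data and draws samples by bin selection plus uniform sampling within the bin; the bin count is an assumption, and the correlation handling discussed in the paper is omitted for brevity:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_bounded_histogram(data, n_samples, bins=24):
    """Draw Monte Carlo samples from a bounded histogram fitted to data:
    pick a bin with probability proportional to its count, then sample
    uniformly inside that bin (sketch; bin count is an assumption)."""
    counts, edges = np.histogram(data, bins=bins)
    probs = counts / counts.sum()
    chosen = rng.choice(bins, size=n_samples, p=probs)
    return rng.uniform(edges[chosen], edges[chosen + 1])

# usage with synthetic hourly temperatures (placeholder data)
temps = rng.normal(12.0, 8.0, size=8760).clip(-20.0, 38.0)
simulated = sample_bounded_histogram(temps, 10_000)
```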

  5. SUPERVISED AUTOMATIC HISTOGRAM CLUSTERING AND WATERSHED SEGMENTATION. APPLICATION TO MICROSCOPIC MEDICAL COLOR IMAGES

    Directory of Open Access Journals (Sweden)

    Olivier Lezoray

    2011-05-01

    Full Text Available In this paper, an approach to the segmentation of microscopic color images is addressed and applied to medical images. The approach combines a clustering method and a region growing method. Each color plane is segmented independently, relying on a watershed-based clustering of the plane histogram. The marginal segmentation maps then intersect in a label concordance map, which is simplified based on the assumption that the color planes are correlated. This produces a simplified label concordance map containing labeled and unlabeled pixels; the former are used as an image of seeds for a color watershed. This fast and robust segmentation scheme is applied to several types of medical images.

  6. An evaluation of the effectiveness of adaptive histogram equalization for contrast enhancement.

    Science.gov (United States)

    Zimmerman, J B; Pizer, S M; Staab, E V; Perry, J R; McCartney, W; Brenton, B C

    1988-01-01

    Adaptive histogram equalization (AHE) and intensity windowing have been compared using psychophysical observer studies. Experienced radiologists were shown clinical CT (computerized tomographic) images of the chest. Into some of the images, appropriate artificial lesions were introduced; the physicians were then shown the images processed with both AHE and intensity windowing. They were asked to assess the probability that a given image contained the artificial lesion, and their accuracy was measured. The results of these experiments show that for this particular diagnostic task, there was no significant difference in the ability of the two methods to depict luminance contrast; thus, further evaluation of AHE using controlled clinical trials is indicated.

  7. Mobile Visual Search Based on Histogram Matching and Zone Weight Learning

    Science.gov (United States)

    Zhu, Chuang; Tao, Li; Yang, Fan; Lu, Tao; Jia, Huizhu; Xie, Xiaodong

    2018-01-01

    In this paper, we propose a novel image retrieval algorithm for mobile visual search. First, a short visual codebook is generated from the descriptor database to represent the statistical information of the dataset. Then, an accurate local descriptor similarity score is computed by merging tf-idf-weighted histogram matching with the weighting strategy of compact descriptors for visual search (CDVS). Finally, the global descriptor matching score and the local descriptor similarity score are summed to rerank the retrieval results according to the learned zone weights. The results show that the proposed approach outperforms the state-of-the-art CDVS image retrieval method.

  8. A quantitative measure based infrared image enhancement algorithm using plateau histogram

    Science.gov (United States)

    Lai, Rui; Yang, Yin-tang; Wang, Bing-jian; Zhou, Hui-xin

    2010-11-01

    A quantitative-measure-based, scene-adaptive contrast enhancement algorithm for infrared (IR) images is proposed. The method first regulates the probability density function (PDF) of the raw image, and then applies an improved plateau histogram equalization method, whose plateau threshold is determined by the concavity of the regulated PDF, to enhance the raw IR image. In the stepped parameter-tuning process of the algorithm, the quantitative measure EME is used as the criterion for determining the optimal PDF regulation factor and plateau threshold. These improvements raise the performance of the proposed algorithm, whose effectiveness is validated by a final assessment using visual quality and quantitative measures.
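
    The core plateau equalization step can be sketched as follows for an 8-bit image; the paper's concavity-based plateau selection and PDF regulation are replaced here by a plain clipping parameter:

```python
import numpy as np

def plateau_equalize(img, plateau):
    """Plateau histogram equalization of an 8-bit image (sketch): clip
    the histogram at `plateau` so dominant background bins do not
    swamp the mapping, then equalize via the clipped CDF."""
    hist = np.bincount(img.ravel(), minlength=256).astype(np.float64)
    clipped = np.minimum(hist, float(plateau))
    cdf = np.cumsum(clipped)
    lut = np.round(255.0 * cdf / cdf[-1]).astype(np.uint8)
    return lut[img]
```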

  9. Using Relational Histogram Features and Action Labelled Data to Learn Preconditions for Means-End Actions

    DEFF Research Database (Denmark)

    Fichtl, Severin; Kraft, Dirk; Krüger, Norbert

    2015-01-01

    …that capture and represent the spatial relationships in an easily accessible way. We are interested in learning to predict the success of “means-end” actions that manipulate two objects at once, from exploratory actions and the observed sensorimotor contingencies. In this paper, we use relational histogram features and illustrate their effect on learning to predict the outcomes of a variety of “means-end” actions. The results show that our vision features can make the learning problem significantly easier, leading to increased learning rates and higher maximum performance. This work is particularly important

  10. Yet Another Method for Image Segmentation based on Histograms and Heuristics

    Directory of Open Access Journals (Sweden)

    Horia-Nicolai L. Teodorescu

    2012-07-01

    Full Text Available We introduce a method for image segmentation that requires few computations yet provides results comparable to other methods. While the proposed method resembles known histogram-based methods, it differs in its use of the gray-level distribution. When several heuristic rules are added to the basic procedure, the method produces results that, in some cases, may outperform those of the known methods. The paper reports preliminary results; more details on the method, improvements, and results will be presented in a future paper.

  11. Frontal Face Detection using Haar Wavelet Coefficients and Local Histogram Correlation

    Directory of Open Access Journals (Sweden)

    Iwan Setyawan

    2011-12-01

    Full Text Available Face detection is the main building block on which all automatic systems dealing with human faces are built. For example, a face recognition system must rely on face detection to process an input image and determine which areas contain human faces; these areas then become the input to the face recognition system for further processing. This paper presents a face detection system designed to detect frontal faces. The system uses Haar wavelet coefficients and local histogram correlation as differentiating features. The proposed system is trained using 100 training images. Our experiments show that the proposed system performed well during testing, achieving a detection rate of 91.5%.

  12. Content Based Radiographic Images Indexing and Retrieval Using Pattern Orientation Histogram

    Directory of Open Access Journals (Sweden)

    Abolfazl Lakdashti

    2008-06-01

    Full Text Available Introduction: Content-Based Image Retrieval (CBIR) is a method of image searching and retrieval in a database. In medical applications, CBIR is a tool used by physicians to compare previous and current medical images associated with patients' pathological conditions. As the volume of pictorial information stored in medical image databases grows, efficient image indexing and retrieval are increasingly becoming a necessity. Materials and Methods: This paper presents a new content-based radiographic image retrieval approach based on a histogram of pattern orientations, namely the pattern orientation histogram (POH). POH represents the spatial distribution of five different pattern orientations: vertical, horizontal, diagonal down/left, diagonal down/right and non-orientation. In this method, a given image is first divided into image blocks and the frequency of each type of pattern is determined in each block. Then, local pattern histograms for each of these image blocks are computed. Results: The method was compared to two well-known texture-based image retrieval methods: Tamura and the Edge Histogram Descriptor (EHD) in the MPEG-7 standard. Experimental results based on the 10,000-image IRMA radiography dataset demonstrate that POH provides better precision and recall rates than Tamura and EHD. For some images, the recall and precision rates obtained by POH are, respectively, 48% and 18% better than the best of the two above-mentioned methods. Discussion and Conclusion: Since we exploit the absolute location of the pattern in the image as well as its global composition, the proposed matching method can retrieve semantically similar medical images.
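
    A simplified sketch of a POH-style descriptor follows; it assigns each image block to one of the five orientation classes by its mean gradient direction. The block size, the magnitude threshold for the non-orientation class, and the angle ranges are illustrative assumptions, and the paper's per-block local histograms are collapsed into a single global histogram for brevity:

```python
import numpy as np

def poh(gray, block=16, mag_thresh=10.0):
    """Five-bin pattern orientation histogram (sketch): vertical,
    horizontal, diagonal down/left, diagonal down/right, and
    non-orientation. Each block votes once by mean gradient direction."""
    gy, gx = np.gradient(gray.astype(np.float64))
    h, w = gray.shape
    hist = np.zeros(5)
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            mx = gx[y:y + block, x:x + block].mean()
            my = gy[y:y + block, x:x + block].mean()
            if np.hypot(mx, my) < mag_thresh:
                hist[4] += 1                        # non-oriented block
                continue
            ang = np.degrees(np.arctan2(my, mx)) % 180.0
            if ang < 22.5 or ang >= 157.5:
                hist[0] += 1                        # vertical edges
            elif 67.5 <= ang < 112.5:
                hist[1] += 1                        # horizontal edges
            elif 22.5 <= ang < 67.5:
                hist[2] += 1                        # diagonal down/left
            else:
                hist[3] += 1                        # diagonal down/right
    return hist / max(hist.sum(), 1.0)
```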

  13. Allocating application to group of consecutive processors in fault-tolerant deadlock-free routing path defined by routers obeying same rules for path selection

    Science.gov (United States)

    Leung, Vitus J [Albuquerque, NM; Phillips, Cynthia A [Albuquerque, NM; Bender, Michael A [East Northport, NY; Bunde, David P [Urbana, IL

    2009-07-21

    In a multiple processor computing apparatus, directional routing restrictions and a logical channel construct permit fault-tolerant, deadlock-free routing. Processor allocation can be performed by creating a linear ordering of the processors based on the routing rules used for communications between the processors. The linear ordering can assume a loop configuration, and bin-packing is applied to this loop configuration. The interconnection of the processors can be conceptualized as a generally rectangular 3-dimensional grid, and the MC allocation algorithm is applied with respect to this grid.
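
    The record does not spell out the patented MC allocation algorithm, but the underlying idea of bin-packing jobs onto a linear processor ordering treated as a loop can be sketched as follows (first-fit placement; the job model and all names are assumptions):

```python
from typing import List

def allocate(job_sizes: List[int], free: List[bool]) -> List[List[int]]:
    """First-fit allocation of jobs to runs of consecutive free
    processors on a linear ordering treated as a loop (sketch only,
    not the patented MC allocation algorithm)."""
    n = len(free)
    placements = []
    for size in job_sizes:
        for start in range(n):
            span = [(start + k) % n for k in range(size)]
            if all(free[i] for i in span):          # contiguous free run
                for i in span:
                    free[i] = False
                placements.append(span)
                break
        else:
            raise RuntimeError(f"no free run of {size} processors")
    return placements

# usage: a ring of 8 processors with two already busy
free = [True] * 8
free[2] = free[3] = False
print(allocate([3, 2], free))   # [[4, 5, 6], [0, 1]]
```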

  14. Binary, zinc-rich phases of the elements rhodium, ruthenium and osmium.

    OpenAIRE

    Allio, Céline

    2010-01-01

    The exploration of the zinc-rich regions of binary noble-metal–Zn phase diagrams has recently brought to light several structurally highly differentiated compounds. Some of the phases reveal subtle element-specific influences on phase and structure formation, which is otherwise governed by the valence electron concentration. To learn how this influence varies with the d-electron count of the minority component...

  15. Morphological and Strength Properties of Tanjung Bin Coal Ash Mixtures for Applied in Geotechnical Engineering Work

    OpenAIRE

    Awang, Abd. Rahim; Marto, Aminaton; Makhtar, Ahmad Maher

    2012-01-01

    In Malaysia, coal has been used as a raw material to generate electricity since 1988. In the past, most of the waste from coal burning, especially the bottom ash, was not managed properly: it was dumped in waste ponds, where it accumulated rapidly. This paper focuses on some properties of coal ash mixtures (fly ash and bottom ash mixtures) from the Tanjung Bin power plant. The characteristics studied were morphological properties, compaction behaviour and strength properties. Strength properties...

  16. Efficient use of design-based binning methodology in a DRAM fab

    Science.gov (United States)

    Karsenti, Laurent; Wehner, Arno; Fischer, Andreas; Seifert, Uwe; Goeckeritz, Jens; Geshel, Mark; Gscheidlen, Dieter; Bartov, Avishai

    2009-03-01

    It is a well-established fact that as design rules and printed features shrink, sophisticated techniques are required to ensure the design intent is indeed printed on the wafer. Techniques of this kind are Optical Proximity Correction (OPC), Resolution Enhancement Techniques (RET) and Design for Manufacturing (DFM). As these methods are applied to the overall chip and rely on complex modeling and simulations, they increase the risk of creating local areas or layouts with a limiting process window. Hence, it is necessary to verify the manufacturability (sufficient depth of focus) of the overall die and not only of a pre-defined set of metrology structures. The verification process is commonly based on full-chip defect density inspection of a Focus Exposure Matrix (FEM) wafer, combined with appropriate post-processing of the inspection data. This is necessary to avoid a time-consuming search for the Defects of Interest (DOIs), as defect counts are usually too high to be handled by manual SEM review. One way to post-process defect density data is so-called design-based binning (DBB). The Litho Qualification Monitor (LQM) system allows defects to be classified and binned based on design information. In this paper we present an efficient way to combine classification and binning in order to check design rules and to determine the marginal features (layouts with low depth of focus). The design-based binning has been connected to the Yield Management System (YMS) to allow new process monitoring approaches towards design-based SPC. This can dramatically cut the time needed to detect systematic defects inline.

  17. Bioinformatics strategies for taxonomy independent binning and visualization of sequences in shotgun metagenomics

    Directory of Open Access Journals (Sweden)

    Karel Sedlar

    2017-01-01

    Full Text Available One of the main steps in the study of microbial communities is resolving their composition, diversity and function. In the past, these issues were mostly addressed through amplicon sequencing of a target gene, because of its reasonable price and the easier computational post-processing of the bioinformatic data. With the advancement of sequencing techniques, the main focus has shifted to whole-metagenome shotgun sequencing, which allows much more detailed analysis of metagenomic data, including the reconstruction of novel microbial genomes, and provides knowledge about the genetic potential and metabolic capacities of whole environments. On the other hand, the output of whole-metagenome shotgun sequencing is a mixture of short DNA fragments belonging to various genomes, so this approach requires more sophisticated computational algorithms for clustering related sequences, commonly referred to as sequence binning. There are currently two types of binning methods: taxonomy-dependent and taxonomy-independent. The first type classifies the DNA fragments by performing standard homology inference against a reference database, while the latter performs reference-free binning by applying clustering techniques to features extracted from the sequences. In this review, we describe the strategies within the second approach. Although these strategies do not require prior knowledge, they place higher demands on sequence length. Besides their basic principles, an overview of particular methods and tools is provided. Furthermore, the review covers the utilization of the methods in the context of sequence length and discusses the need for metagenomic data preprocessing, in the form of initial assembly, prior to binning.
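
    As a toy example of taxonomy-independent binning, the sketch below clusters contigs by tetranucleotide composition; real binning tools additionally use coverage profiles and more elaborate statistical models, so k-means and the fixed bin count here are simplifying assumptions:

```python
import itertools
import numpy as np
from sklearn.cluster import KMeans

KMERS = {"".join(p): i for i, p in enumerate(itertools.product("ACGT", repeat=4))}

def tetra_profile(seq):
    """Normalized tetranucleotide frequency vector of one contig."""
    counts = np.zeros(len(KMERS))
    for i in range(len(seq) - 3):
        j = KMERS.get(seq[i:i + 4])
        if j is not None:            # skip windows with ambiguous bases
            counts[j] += 1
    s = counts.sum()
    return counts / s if s else counts

def bin_contigs(contigs, n_bins):
    """Cluster contigs into bins by composition (sketch)."""
    X = np.array([tetra_profile(c) for c in contigs])
    return KMeans(n_clusters=n_bins, n_init=10, random_state=0).fit_predict(X)
```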

  18. Quantum secret sharing based on modulated high-dimensional time-bin entanglement

    International Nuclear Information System (INIS)

    Takesue, Hiroki; Inoue, Kyo

    2006-01-01

    We propose a scheme for quantum secret sharing (QSS) that uses modulated high-dimensional time-bin entanglement. By randomly modulating the relative phase by {0, π}, a sender with the entanglement source can randomly change the sign of the correlation of the measurement outcomes obtained by two distant recipients. The two recipients must cooperate if they are to obtain the sign of the correlation, which is used as a secret key. We show that our scheme is secure against intercept-and-resend (IR) and beam-splitting attacks by an outside eavesdropper thanks to the nonorthogonality of high-dimensional time-bin entangled states. We also show that a cheating attempt based on an IR attack by one of the recipients can be detected by randomly changing the dimension of the time-bin entanglement and inserting two 'vacant' slots between the packets; cheating attempts can then be detected by monitoring the count rate in the vacant slots. The proposed scheme has better experimental feasibility than previously proposed entanglement-based QSS schemes

  19. A study of volatile organic compounds evolved in urban waste disposal bins

    Science.gov (United States)

    Statheropoulos, M.; Agapiou, A.; Pallis, G.

    Volatile organic compounds (VOCs) evolved in urban waste disposal bins were studied in different situations. Waste loads of various sizes (full, empty, and partially filled bins) remained uncollected in the containers for variable times and under different weather conditions. Analysis of VOCs was carried out by thermal desorption/gas chromatography/mass spectrometry (TD/GC/MS). Over 150 compounds were identified and the 30 most abundant were quantified. Generally, VOCs were present in the range of micrograms per cubic meter. Median concentrations of the most prominent VOCs were: decane (694.9 μg m⁻³), acetic acid ethyl ester (353.1 μg m⁻³), limonene (334.9 μg m⁻³), nonane (257.4 μg m⁻³), ethanol (216.1 μg m⁻³), 1,2,4-trimethylbenzene (212.6 μg m⁻³) and undecane (159.1 μg m⁻³). High levels of alkanes, alkylbenzenes and terpenes are responsible for undesirable odours. The variety and concentration of the VOCs evolved depend on the prevailing conditions, such as the duration of waste exposure, the load and the weather. When waste accumulates in bins under unforeseen circumstances, some of the compounds produced may exceed olfactory and safety thresholds, representing a potential health impact.

  20. Subinteger Range-Bin Alignment Method for ISAR Imaging of Noncooperative Targets

    Directory of Open Access Journals (Sweden)

    Pérez-Martínez F

    2010-01-01

    Full Text Available Inverse Synthetic Aperture Radar (ISAR) is a coherent radar technique capable of generating images of noncooperative targets. ISAR may perform better in adverse meteorological conditions than traditional imaging sensors. Unfortunately, ISAR images are usually blurred because of the relative motion between radar and target. To improve the quality of ISAR products, motion compensation is necessary. In this context, range-bin alignment is the first step of translational motion compensation. In this paper, we propose a subinteger range-bin alignment method based on envelope correlation and reference profiles. The technique, which makes use of a carefully designed optimization stage, is robust against noise, clutter, target scintillation, and error accumulation, and it provides very fine translational motion compensation. Comparisons with state-of-the-art range-bin alignment methods are included and the advantages of the proposal are highlighted. Simulated and live data from a high-resolution linear-frequency-modulated continuous-wave radar are included to perform the pertinent comparisons.

  1. Entanglement between a Photonic Time-Bin Qubit and a Collective Atomic Spin Excitation

    Science.gov (United States)

    Farrera, Pau; Heinze, Georg; de Riedmatten, Hugues

    2018-03-01

    Entanglement between light and matter combines the advantage of long distance transmission of photonic qubits with the storage and processing capabilities of atomic qubits. To distribute photonic states efficiently over long distances several schemes to encode qubits have been investigated—time-bin encoding being particularly promising due to its robustness against decoherence in optical fibers. Here, we demonstrate the generation of entanglement between a photonic time-bin qubit and a single collective atomic spin excitation (spin wave) in a cold atomic ensemble, followed by the mapping of the atomic qubit onto another photonic qubit. A magnetic field that induces a periodic dephasing and rephasing of the atomic excitation ensures the temporal distinguishability of the two time bins and plays a central role in the entanglement generation. To analyze the generated quantum state, we use largely imbalanced Mach-Zehnder interferometers to perform projective measurements in different qubit bases and verify the entanglement by violating a Clauser-Horne-Shimony-Holt Bell inequality.

  2. Modeling Early Postnatal Brain Growth and Development with CT: Changes in the Brain Radiodensity Histogram from Birth to 2 Years.

    Science.gov (United States)

    Cauley, K A; Hu, Y; Och, J; Yorks, P J; Fielden, S W

    2018-02-15

    The majority of brain growth and development occurs in the first 2 years of life. This study investigated these changes by analysis of the brain radiodensity histogram of head CT scans from the clinical population, 0-2 years of age. One hundred twenty consecutive head CTs with normal findings meeting the inclusion criteria, from children from birth to 2 years, were retrospectively identified from 3 different CT scan platforms. Histogram analysis was performed on brain-extracted images, and the histogram mean, mode, full width at half maximum, skewness, kurtosis, and SD were correlated with subject age. The effects of scan platform were investigated. Normative curves were fitted by polynomial regression analysis. Average total brain volume was 360 cm³ at birth, 948 cm³ at 1 year, and 1072 cm³ at 2 years. Mean total brain tissue density increased by 11% at 1 year and 19% at 2 years. Brain radiodensity histogram skewness was positive at birth, declining logarithmically in the first 200 days of life. The histogram kurtosis also decreased in the first 200 days, approaching a normal distribution. Direct segmentation of the CT images showed that the changes in brain radiodensity histogram skewness correlated with, and can be explained by, a relative increase in gray matter volume and an increase in gray and white matter tissue density during this period of brain maturation. Normative metrics of the brain radiodensity histogram derived from routine clinical head CT images can be used to develop a model of normal brain development. © 2018 by American Journal of Neuroradiology.
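
    The histogram metrics used in the study (mean, mode, SD, skewness, kurtosis) can be computed from brain-extracted voxel values along the following lines; the Hounsfield-unit range used to exclude non-brain voxels is an illustrative assumption:

```python
import numpy as np
from scipy import stats

def radiodensity_metrics(hu, lo=0, hi=100):
    """Summary metrics of a brain radiodensity histogram from
    brain-extracted CT voxels in Hounsfield units (sketch)."""
    brain = hu[(hu >= lo) & (hu <= hi)]          # drop air/bone voxels
    counts, edges = np.histogram(brain, bins=hi - lo)
    return {
        "mean": float(brain.mean()),
        "mode": float(edges[np.argmax(counts)]),
        "sd": float(brain.std()),
        "skewness": float(stats.skew(brain)),
        "kurtosis": float(stats.kurtosis(brain)),  # excess kurtosis
    }
```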

  3. DNA IMAGE CYTOMETRY IN PROGNOSTICATION OF COLORECTAL CANCER: PRACTICAL CONSIDERATIONS OF THE TECHNIQUE AND INTERPRETATION OF THE HISTOGRAMS

    Directory of Open Access Journals (Sweden)

    Abdelbaset Buhmeida

    2011-05-01

    Full Text Available The role of DNA content as a prognostic factor in colorectal cancer (CRC) is highly controversial. Some of these controversies are due to purely technical reasons, e.g., variable practices in interpreting the DNA histograms, which is problematic particularly in advanced cases. In this report, we give a detailed account of the various options for optimal interpretation of these histograms, with the idea of establishing the potential value of DNA image cytometry in prognosis and in the selection of proper treatment. The material consists of nuclei isolated from 50 μm paraffin sections from 160 patients with stage II, III or IV CRC diagnosed, treated and followed up in our clinic. The nuclei were stained with the Feulgen stain. Nuclear DNA was measured using computer-assisted image cytometry. We applied four different approaches to analyse the DNA histograms: (1) appearance of the histogram (the ABCDE approach), (2) range of DNA values, (3) peak evaluation, and (4) events present at high DNA values. Intra-observer reproducibility of these four histogram interpretations was 89%, 95%, 96%, and 100%, respectively. We depict selected histograms to illustrate the four analytical approaches in cases of different stages of CRC with variable disease outcome. In our analysis, the range of DNA values was the best prognosticator, i.e., the tumours with the widest histograms had the most ominous prognosis. These data indicate that DNA cytometry based on isolated nuclei is valuable in predicting the prognosis of CRC. The different interpretation techniques differed in their reproducibility, but the method showing the best prognostic value also had high reproducibility in our analysis.

  4. AN ILLUMINATION INVARIANT FACE RECOGNITION BY ENHANCED CONTRAST LIMITED ADAPTIVE HISTOGRAM EQUALIZATION

    Directory of Open Access Journals (Sweden)

    A. Thamizharasi

    2016-05-01

    Full Text Available Face recognition systems are gaining importance in social networks and surveillance. The face recognition task is complex due to variations in illumination, expression, occlusion, aging and pose. Illumination variations in an image are due to changes in lighting conditions, poor illumination, low contrast or increased brightness, and they adversely affect image quality and recognition accuracy. Illumination variations in a face image therefore have to be corrected by pre-processing prior to face recognition. Contrast Limited Adaptive Histogram Equalization (CLAHE) is an image enhancement technique popular for enhancing medical images. The proposed work creates an illumination-invariant face recognition system by enhancing the CLAHE technique; this method is termed "Enhanced CLAHE". The efficiency of Enhanced CLAHE is tested using a Fuzzy K-Nearest-Neighbour classifier and the Fisherface subspace projection method. The face recognition accuracy rate, the Equal Error Rate and the False Acceptance Rate at 1% are calculated, and the performance of the CLAHE and Enhanced CLAHE methods is compared. The efficiency of the Enhanced CLAHE method is tested with three public face databases: AR, Yale and ORL. Enhanced CLAHE achieves a much higher recognition accuracy than CLAHE.
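
    The record does not specify the "Enhanced CLAHE" modifications, so the sketch below shows only the baseline CLAHE pre-processing step using OpenCV; the clip limit and tile grid are typical but assumed values:

```python
import cv2

# Baseline CLAHE as an illumination-normalization step before recognition.
gray = cv2.imread("face.png", cv2.IMREAD_GRAYSCALE)
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
normalized = clahe.apply(gray)     # tile-wise, contrast-limited equalization
cv2.imwrite("face_clahe.png", normalized)
```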

  5. Fault classification approaches using empirical mode decomposition and histogram of power signals

    Science.gov (United States)

    Sharma, K. K.; Samad, Abdul

    2013-01-01

    This paper presents two new algorithms for fault classification in power signals. The first algorithm is based on empirical mode decomposition (EMD) of the power signals, which decomposes a signal into intrinsic mode functions (IMFs). In the proposed technique, we obtain the IMFs of the power signals, compute higher-order statistical (HOS) parameters of each IMF, and prepare a dictionary of feature vectors for the different types of faults. To classify the fault in a given signal, its feature vector is computed and classified with the nearest-neighbor rule, using its Euclidean distance to the feature vectors stored in the dictionary. The simulation results show that the HOS-based approach classifies faults accurately even at a signal-to-noise ratio (SNR) of 10 dB, which is much lower than the SNR values reported in the literature. The second method is based on computing histograms of the different types of fault signals and computing their distances to the histograms of signals stored in the dictionary. Above an SNR of 30 dB, this method classifies all types of faults accurately and is computationally less demanding.
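
    The second, histogram-based method can be sketched as a nearest-neighbor classifier over amplitude histograms; the bin count, amplitude range, and dictionary layout are assumptions:

```python
import numpy as np

def signal_histogram(x, bins=64, rng=(-1.0, 1.0)):
    """Normalized amplitude histogram of a scaled power signal."""
    h, _ = np.histogram(x, bins=bins, range=rng)
    return h / h.sum()

def classify(signal, dictionary):
    """Nearest-neighbor fault classification by Euclidean distance
    between amplitude histograms (sketch)."""
    h = signal_histogram(signal)
    return min(dictionary, key=lambda k: np.linalg.norm(h - dictionary[k]))

# usage: the dictionary maps fault labels to reference histograms
gen = np.random.default_rng(1)
dictionary = {
    "normal": signal_histogram(np.sin(np.linspace(0.0, 50.0, 5000))),
    "fault_a": signal_histogram(gen.uniform(-1.0, 1.0, 5000)),
}
noisy = np.sin(np.linspace(0.0, 50.0, 5000)) + 0.05 * gen.standard_normal(5000)
print(classify(noisy, dictionary))   # -> "normal"
```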

  6. Conductance of single-atom platinum contacts: Voltage dependence of the conductance histogram

    DEFF Research Database (Denmark)

    Nielsen, S.K.; Noat, Y.; Brandbyge, Mads

    2003-01-01

    The conductance of a single-atom contact is sensitive to the coupling of this contact atom to the atoms in the leads. Notably for the transition metals, this gives rise to a considerable spread in the observed conductance values. The mean conductance value and spread can be obtained from the first peak in conductance histograms recorded from a large set of contact-breaking cycles. In contrast to the monovalent metals, this mean value for Pt depends strongly on the applied voltage bias and other experimental conditions, and values ranging from about 1 G₀ to 2.5 G₀ (G₀ = 2e²/h) have been reported. We find that at low bias the first peak in the conductance histogram is centered around 1.5 G₀. However, as the bias increases past 300 mV the peak shifts to 1.8 G₀. Here we show that this bias dependence is due to a geometric effect where monatomic chains are replaced by single-atom contacts

  7. From image processing to computational neuroscience: a neural model based on histogram equalization.

    Science.gov (United States)

    Bertalmío, Marcelo

    2014-01-01

    There are many ways in which the human visual system works to reduce the inherent redundancy of the visual information in natural scenes, coding it in an efficient way. The non-linear response curves of photoreceptors and the spatial organization of the receptive fields of visual neurons both work toward this goal of efficient coding. A related, very important aspect is that of the existence of post-retinal mechanisms for contrast enhancement that compensate for the blurring produced in early stages of the visual process. And alongside mechanisms for coding and wiring efficiency, there is neural activity in the human visual cortex that correlates with the perceptual phenomenon of lightness induction. In this paper we propose a neural model that is derived from an image processing technique for histogram equalization, and that is able to deal with all the aspects just mentioned: this new model is able to predict lightness induction phenomena, and improves the efficiency of the representation by flattening both the histogram and the power spectrum of the image signal.
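
    For reference, the image-processing technique the model is derived from, plain histogram equalization of an 8-bit image, can be sketched as:

```python
import numpy as np

def equalize(img):
    """Histogram equalization of an 8-bit grayscale image: map gray
    levels through the normalized cumulative histogram so the output
    histogram is approximately flat."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = np.cumsum(hist).astype(np.float64)
    cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())
    return np.round(255.0 * cdf).astype(np.uint8)[img]
```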

  8. From Image Processing to Computational Neuroscience: A Neural Model Based on Histogram Equalization

    Directory of Open Access Journals (Sweden)

    Marcelo eBertalmío

    2014-07-01

    Full Text Available There are many ways in which the human visual system works to reduce the inherent redundancy of the visual information in natural scenes, coding it in an efficient way. The non-linear response curves of photoreceptors and the spatial organization of the receptive fields of visual neurons both work towards this goal of efficient coding. A related, very important aspect is that of the existence of post-retinal mechanisms for contrast enhancement that compensate for the blurring produced in early stages of the visual process. And alongside mechanisms for coding and wiring efficiency, there is neural activity in the human visual cortex that correlates with the perceptual phenomenon of lightness induction. In this paper we propose a neural model that is derived from an image processing technique for histogram equalization, and that is able to deal with all the aspects just mentioned: this new model is able to predict lightness induction phenomena, and improves the efficiency of the representation by flattening both the histogram and the power spectrum of the image signal.

  9. Uniform enhancement of optical micro-angiography images using Rayleigh contrast-limited adaptive histogram equalization.

    Science.gov (United States)

    Yousefi, Siavash; Qin, Jia; Zhi, Zhongwei; Wang, Ruikang K

    2013-02-01

    Optical microangiography is an imaging technology capable of providing detailed functional blood flow maps within microcirculatory tissue beds in vivo. Some practical issues, however, arise when displaying and quantifying the microcirculation that perfuses the scanned tissue volume. These issues include: (I) the probing light is subject to specular reflection when it shines onto the sample, and the unevenness of the tissue surface makes the light energy entering the tissue non-uniform over the entire scanned volume; (II) biological tissue is heterogeneous in nature, meaning the scattering and absorption properties of the tissue attenuate the probe beam. These physical limitations can result in local contrast degradation and non-uniform micro-angiogram images. In this paper, we propose a post-processing method that uses Rayleigh contrast-limited adaptive histogram equalization to increase the contrast and improve the overall appearance and uniformity of optical micro-angiograms without saturating the vessel intensity or changing the physical meaning of the micro-angiograms. The qualitative and quantitative performance of the proposed method is compared with that of common histogram equalization and contrast enhancement methods, and we demonstrate that the proposed method outperforms these existing approaches. The proposed method is not limited to optical microangiography and can be used with other imaging modalities such as photo-acoustic tomography and scanning laser confocal microscopy.

  10. Quantification of sonographic echogenicity by the gray-level histogram in patients with supraspinatus tendinopathy.

    Science.gov (United States)

    Tsai, Yao-Hung; Huang, Kuo-Chin; Shen, Shih-Hsun; Yang, Tien-Yu; Huang, Tsung-Jen; Hsu, Robert Wen-Wei

    2014-07-01

    The aim of this study was to compare the gray-level value of the supraspinatus tendon of a painful shoulder with that of a normal shoulder measured by ultrasonography, and to investigate whether a low mean gray-level value of the supraspinatus tendon could indicate a partial-thickness or incomplete full-thickness tear. Two hundred and ten patients had significant unilateral shoulder pain with clinical suspicion of rotator cuff tendinopathy. They underwent bilateral shoulder ultrasonography, and the mean echogenicity of the histogram was calculated on screen. The mean gray-level value of each patient's contralateral asymptomatic shoulder was compared with that of the painful shoulder. Based on scans of the transverse and longitudinal planes, a significant difference existed between the symptomatic shoulder and the contralateral asymptomatic shoulder (p …); the symptomatic shoulders showed no statistically significant difference between the patients who underwent surgery and the patients who underwent conservative treatment. We demonstrated that the ultrasound gray-level histogram is a promising tool for detecting the hypoechogenic appearance of supraspinatus tendinopathy. A decrease in the mean gray-level value on the symptomatic shoulder may be used as an alternative sonographic indicator of rotator cuff partial-thickness tear or tendinopathy. Diagnostic level III.

  11. Infrared image enhancement based on atmospheric scattering model and histogram equalization

    Science.gov (United States)

    Li, Yi; Zhang, Yunfeng; Geng, Aihui; Cao, Lihua; Chen, Juan

    2016-09-01

    Infrared images are fuzzy due to the special imaging technology of infrared sensors. In order to achieve contrast enhancement and obtain clear edge details from a fuzzy infrared image, we propose an efficient enhancement method based on an atmospheric scattering model and histogram equalization. The algorithm adapts the visible-image haze-removal method to the characteristics of fuzzy infrared images. First, an average filtering operation is used to obtain a coarse estimate of the transmission rate. The de-hazed image is then obtained through a self-adaptive transmission rate calculated from the statistics of the original infrared image. Finally, to deal with the low-lighting problem of the de-hazed image, we propose a sectional plateau histogram equalization method capable of background suppression. Experimental results show that the performance and efficiency of the proposed algorithm are satisfactory compared with four other algorithms, in both subjective observation and objective quantitative evaluation. In addition, the proposed algorithm is suitable for enhancing infrared images for different applications under various circumstances.

  12. Two non-parametric methods for derivation of constraints from radiotherapy dose–histogram data

    International Nuclear Information System (INIS)

    Ebert, M A; Kennedy, A; Joseph, D J; Gulliford, S L; Buettner, F; Foo, K; Haworth, A; Denham, J W

    2014-01-01

    Dose constraints based on histograms provide a convenient and widely-used method for informing and guiding radiotherapy treatment planning. Methods of derivation of such constraints are often poorly described. Two non-parametric methods for derivation of constraints are described and investigated in the context of determination of dose-specific cut-points—values of the free parameter (e.g., percentage volume of the irradiated organ) which best reflect resulting changes in complication incidence. A method based on receiver operating characteristic (ROC) analysis and one based on a maximally-selected standardized rank sum are described and compared using rectal toxicity data from a prostate radiotherapy trial. Multiple test corrections are applied using a free step-down resampling algorithm, which accounts for the large number of tests undertaken to search for optimal cut-points and the inherent correlation between dose–histogram points. Both methods provide consistent significant cut-point values, with the rank sum method displaying some sensitivity to the underlying data. The ROC method is simple to implement and can utilize a complication atlas, though an advantage of the rank sum method is the ability to incorporate all complication grades without the need for grade dichotomization. (note)
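
    The ROC-based derivation of a cut-point can be sketched as an exhaustive search maximizing Youden's index over candidate thresholds of a single dose-histogram parameter; the multiple-testing correction via free step-down resampling described above is omitted:

```python
import numpy as np

def roc_cut_point(values, had_complication):
    """Pick the cut-point of one dose-histogram parameter that
    maximizes Youden's index (sensitivity + specificity - 1); a
    simplified sketch of the ROC approach."""
    v = np.asarray(values, dtype=float)
    y = np.asarray(had_complication, dtype=bool)
    best_cut, best_j = None, -np.inf
    for cut in np.unique(v):
        pred = v >= cut                      # high volume flags risk
        sens = (pred & y).sum() / max(y.sum(), 1)
        spec = (~pred & ~y).sum() / max((~y).sum(), 1)
        if sens + spec - 1.0 > best_j:
            best_cut, best_j = cut, sens + spec - 1.0
    return best_cut, best_j
```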

  13. Online Data Monitoring Framework Based on Histogram Packaging in Network Distributed Data Acquisition Systems

    International Nuclear Information System (INIS)

    Konno, T; Ishitsuka, M; Kuze, M; Cabarera, A; Sakamoto, Y

    2011-01-01

    The online monitor framework is a new general software framework for online data monitoring, which provides a way to collect information from online systems, including data acquisition, and to display it to shifters far from the experimental site. 'Monitor Server', the core system in this framework, gathers the monitoring information from the online subsystems; the information is handled as collections of histograms named 'Histogram Packages'. Monitor Server broadcasts the histogram packages to 'Monitor Viewers', the graphical user interfaces of the framework. We developed two types of viewers with different technologies: Java and web browser. We adopted XML-based files for the configuration of GUI components in the windows and graphical objects on the canvases; Monitor Viewer creates its GUIs automatically from these configuration files. This monitoring framework has been developed for the Double Chooz reactor neutrino oscillation experiment in France, but it can be extended for general use in other experiments. This document reports the structure of the online monitor framework with some examples from its adaptation to the Double Chooz experiment.

  14. Incremental Prognostic Value of ADC Histogram Analysis over MGMT Promoter Methylation Status in Patients with Glioblastoma.

    Science.gov (United States)

    Choi, Yoon Seong; Ahn, Sung Soo; Kim, Dong Wook; Chang, Jong Hee; Kang, Seok-Gu; Kim, Eui Hyun; Kim, Se Hoon; Rim, Tyler Hyungtaek; Lee, Seung-Koo

    2016-10-01

    Purpose To investigate the incremental prognostic value of apparent diffusion coefficient (ADC) histogram analysis over O6-methylguanine-DNA methyltransferase (MGMT) promoter methylation status in patients with glioblastoma, and the correlation between ADC parameters and MGMT status. Materials and Methods This retrospective study was approved by the institutional review board, and informed consent was waived. A total of 112 patients with glioblastoma were divided into training (74 patients) and test (38 patients) sets. Overall survival (OS) and progression-free survival (PFS) were analyzed with ADC parameters, MGMT status, and other clinical factors. Multivariate Cox regression models with and without ADC parameters were constructed. Model performance was assessed with the c index and receiver operating characteristic curve analyses for 12- and 16-month OS and 12-month PFS in the training set and validated in the test set. ADC parameters were compared according to MGMT status for the entire cohort. Results By using ADC parameters, the c indices and diagnostic accuracies for 12- and 16-month OS and 12-month PFS in the models showed significant improvement, with the exception of the c indices in the models for PFS (P … MGMT status. Conclusion ADC histogram analysis had incremental prognostic value over MGMT promoter methylation status in patients with glioblastoma. © RSNA, 2016. Online supplemental material is available for this article.

  15. Tools for the analysis of dose optimization: I. Effect-volume histogram

    International Nuclear Information System (INIS)

    Alber, M.; Nuesslin, F.

    2002-01-01

    With the advent of dose optimization algorithms, predominantly for intensity-modulated radiotherapy (IMRT), computer software has progressed beyond the point of being merely a tool at the hands of an expert and has become an active, independent mediator of the dosimetric conflicts between treatment goals and risks. To understand and control the internal decision finding as well as to provide means to influence it, a tool for the analysis of the dose distribution is presented which reveals the decision-making process performed by the algorithm. The internal trade-offs between partial volumes receiving high or low doses are driven by functions which attribute a weight to each volume element. The statistics of the distribution of these weights is cast into an effect-volume histogram (EVH) in analogy to dose-volume histograms. The analysis of the EVH reveals which traits of the optimum dose distribution result from the defined objectives, and which are a random consequence of under- or misspecification of treatment goals. The EVH can further assist in the process of finding suitable objectives and balancing conflicting objectives. If biologically inspired objectives are used, the EVH shows the distribution of local dose effect relative to the prescribed level. (author)

  16. Two non-parametric methods for derivation of constraints from radiotherapy dose-histogram data

    Science.gov (United States)

    Ebert, M. A.; Gulliford, S. L.; Buettner, F.; Foo, K.; Haworth, A.; Kennedy, A.; Joseph, D. J.; Denham, J. W.

    2014-07-01

    Dose constraints based on histograms provide a convenient and widely-used method for informing and guiding radiotherapy treatment planning. Methods of derivation of such constraints are often poorly described. Two non-parametric methods for derivation of constraints are described and investigated in the context of determination of dose-specific cut-points—values of the free parameter (e.g., percentage volume of the irradiated organ) which best reflect resulting changes in complication incidence. A method based on receiver operating characteristic (ROC) analysis and one based on a maximally-selected standardized rank sum are described and compared using rectal toxicity data from a prostate radiotherapy trial. Multiple test corrections are applied using a free step-down resampling algorithm, which accounts for the large number of tests undertaken to search for optimal cut-points and the inherent correlation between dose-histogram points. Both methods provide consistent significant cut-point values, with the rank sum method displaying some sensitivity to the underlying data. The ROC method is simple to implement and can utilize a complication atlas, though an advantage of the rank sum method is the ability to incorporate all complication grades without the need for grade dichotomization.

  17. Moleculo Long-Read Sequencing Facilitates Assembly and Genomic Binning from Complex Soil Metagenomes

    Energy Technology Data Exchange (ETDEWEB)

    White, Richard Allen; Bottos, Eric M.; Roy Chowdhury, Taniya; Zucker, Jeremy D.; Brislawn, Colin J.; Nicora, Carrie D.; Fansler, Sarah J.; Glaesemann, Kurt R.; Glass, Kevin; Jansson, Janet K.; Langille, Morgan

    2016-06-28

    Soil metagenomics has been touted as the “grand challenge” for metagenomics, as the high microbial diversity and spatial heterogeneity of soils make them unamenable to current assembly platforms. Here, we aimed to improve soil metagenomic sequence assembly by applying the Moleculo synthetic long-read sequencing technology. In total, we obtained 267 Gbp of raw sequence data from a native prairie soil; these data included 109.7 Gbp of short-read data (~100 bp) from the Joint Genome Institute (JGI), an additional 87.7 Gbp of rapid-mode read data (~250 bp), plus 69.6 Gbp (>1.5 kbp) from Moleculo sequencing. The Moleculo data alone yielded over 5,600 reads of >10 kbp in length, and over 95% of the unassembled reads mapped to contigs of >1.5 kbp. Hybrid assembly of all data resulted in more than 10,000 contigs over 10 kbp in length. We mapped three replicate metatranscriptomes derived from the same parent soil to the Moleculo subassembly and found that 95% of the predicted genes, based on their assignments to Enzyme Commission (EC) numbers, were expressed. The Moleculo subassembly also enabled binning of >100 microbial genome bins. We obtained via direct binning the first complete genome, that of “Candidatus Pseudomonas sp. strain JKJ-1”, from a native soil metagenome. By mapping metatranscriptome sequence reads back to the bins, we found that several bins corresponding to low-relative-abundance Acidobacteria were highly transcriptionally active, whereas bins corresponding to high-relative-abundance Verrucomicrobia were not. These results demonstrate that Moleculo sequencing provides a significant advance for resolving complex soil microbial communities.

    IMPORTANCE: Soil microorganisms carry out key processes for life on our planet, including cycling of carbon and other nutrients and supporting growth of plants. However, there is poor molecular-level understanding of their

  18. Integer Programming Models for Sales Resource Allocation

    OpenAIRE

    Andris A. Zoltners; Prabhakant Sinha

    1980-01-01

    A practical conceptual framework for sales resource allocation modeling is presented in this paper. A literature review of sales resource allocation models is described in terms of this framework. The conceptual framework also lends itself to several integer programming models which may be used to address the variety of sales resource allocation decisions faced by every sales organization. A general model for sales resource allocation is developed which incorporates multiple sales resources, ...

  19. Intelligent tactical asset allocation support system

    NARCIS (Netherlands)

    Hiemstra, Y.

    1995-01-01

    This paper presents an advanced support system for Tactical Asset Allocation. Asset allocation explains over 90% of portfolio performance (Brinson, Hood and Beebower, 1988). Tactical asset allocation adjusts a strategic portfolio on the basis of short term market outlooks. The system includes

  20. Longitudinal Acceleration Test of Overhead Luggage Bins and Auxiliary Fuel Tank in a Transport Airplane Airframe Section, Part 2

    National Research Council Canada - National Science Library

    McGuire, Robert

    2000-01-01

    ...). The purpose of the tests was to measure the structural responses and interaction between the fuselage, overhead stowage bins, and auxiliary fuel tank under simulated, potentially survivable, crash conditions...

  1. Longitudinal Acceleration Tests of Overhead Luggage Bins and Auxiliary Fuel Tank in a Transport Airplane Airframe Section

    National Research Council Canada - National Science Library

    McGuire, Robert

    1999-01-01

    ...). The purpose of the tests was to measure the structural responses and interaction between the fuselage, overhead stowage bins, and auxiliary fuel tank under simulated, potentially survivable, crash conditions...

  2. VL1/VL2 MARS METEOROLOGY RESAMPLED DATA BINNED-P-T-V V1.0

    Data.gov (United States)

    National Aeronautics and Space Administration — This data set contains binned and splined data obtained from the Viking Meteorology Instrument System (VMIS) through most of the Viking Lander 2 mission and the...

  3. Massachusetts Bay - Internal Wave Packets Extracted from SAR Imagery Binned in 1x1 minute grid cells

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This feature class contains internal wave packets extracted from SAR imagery that were binned in 1x1 minute latitude/longitude polygon grid cells. Statistics were...

  4. Methylation of the BIN1 gene promoter CpG island associated with breast and prostate cancer

    Directory of Open Access Journals (Sweden)

    Khomyakova Anastasiya

    2007-01-01

    Full Text Available Abstract Background: Loss of BIN1 tumor suppressor expression is abundant in human cancer and its frequency exceeds that of genetic alterations, suggesting a role for epigenetic regulators (DNA methylation). BIN1 re-expression in the DU145 prostate cancer cell line after 5-aza-2'-deoxycytidine treatment was recently reported, but no methylation of the BIN1 promoter CpG island was found in DU145. Methods: Methylation-sensitive arbitrarily-primed PCR was used to detect genomic loci abnormally methylated in breast cancer. A BIN1 CpG island fragment was identified among the differentially methylated loci by direct sequencing of the methylation-sensitive arbitrarily-primed PCR product and a subsequent BLAST alignment. Cancer-related methylation of the BIN1 CpG island in breast and prostate cancers was confirmed by bisulphite sequencing, and its methylation frequency was evaluated by methylation-sensitive PCR. Loss of heterozygosity analysis of the BIN1 region was performed with two intragenic and one closely adjacent extragenic microsatellite markers. BIN1 expression was evaluated by real-time RT-PCR. Results: We identified the 3' part of the BIN1 promoter CpG island among the genomic loci abnormally methylated in breast cancer. The fragment proved to be methylated in 18/99 (18%) breast and 4/46 (9%) prostate tumors, as well as in the MCF7 and T47D breast cancer cell lines, but was never methylated in normal tissues and lymphocytes or in the DU145 and LNCaP prostate cancer cell lines. The 5' part of the CpG island revealed no methylation in any of the samples tested. BIN1 expression losses were detected in MCF7 and T47D cells and were characteristic of primary breast tumors (10/13; 77%), while loss of heterozygosity was a rare event in tissue samples (2/22 informative cases; 9%) and was ruled out for MCF7. Conclusion: The BIN1 promoter CpG island is composed of two parts differing drastically in their methylation patterns in cancer. This appears to be a

  5. Model-based implementation of self-configurable intellectual property modules for image histogram calculation in FPGAs

    Directory of Open Access Journals (Sweden)

    Luis Manuel Garcés Socarrás

    2017-05-01

    Full Text Available This work presents the development of self-modifiable Intellectual Property (IP) modules for histogram calculation using the model-based design technique provided by Xilinx System Generator. An analysis and a comparison of histogram calculation architectures are presented, selecting the best solution for the design flow used. The paper also emphasizes the use of generic architectures capable of being adjusted by a self-configurable procedure to ensure a processing flow adequate to the application requirements. In addition, the implementation of a configurable IP module for histogram calculation using a model-based design flow is described, and some implementation results are shown on a Xilinx Spartan-6 LX45 FPGA.

  6. A real-time automatic contrast adjustment method for high-bit-depth cameras based on histogram variance analysis

    Science.gov (United States)

    Zhao, Jun; Lu, Jun

    2015-10-01

    In this paper we propose an efficient method to enhance contrast in real time in digital video streams by exploiting histogram variances and adaptively adjusting gamma curves. The proposed method aims to overcome the limitations of the conventional histogram equalization method, which often produces noisy, unrealistic effects in images. To improve visual quality, we use a gamma correction technique and choose different gamma curves according to the histogram variance of the images. With this scheme, the details of an image can be enhanced while the mean brightness level is preserved. Experimental results demonstrate that our method is simple, efficient and robust for both low- and high-dynamic-range scenes, and hence well suited for real-time, high-bit-depth video acquisition.
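
    A sketch of the overall idea, with an assumed (not the authors') mapping from histogram variance to gamma: a peaky, high-variance histogram is taken to indicate poor contrast and receives a stronger curve, while an already flat histogram is left nearly untouched:

```python
import numpy as np

def adaptive_gamma(frame, bit_depth=16):
    """Variance-driven gamma adjustment for a high-bit-depth frame
    (sketch). The variance-to-gamma rule below is a heuristic
    assumption, not the paper's exact mapping."""
    maxval = float((1 << bit_depth) - 1)
    x = frame.astype(np.float64) / maxval
    hist, _ = np.histogram(x, bins=256, range=(0.0, 1.0), density=True)
    var = float(np.var(hist))
    gamma = 1.0 - 0.6 * var / (var + 1.0)   # heuristic mapping in (0.4, 1]
    return ((x ** gamma) * maxval).astype(frame.dtype)
```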

  7. Expression robust 3D face recognition via mesh-based histograms of multiple order surface differential quantities

    KAUST Repository

    Li, Huibin

    2011-09-01

    This paper presents a mesh-based approach for 3D face recognition using a novel local shape descriptor and a SIFT-like matching process. Both maximum and minimum curvatures estimated in the 3D Gaussian scale space are employed to detect salient points. To comprehensively characterize 3D facial surfaces and their variations, we calculate weighted statistical distributions of multiple-order surface differential quantities, including the histogram of mesh gradient (HoG), the histogram of shape index (HoS) and the histogram of gradient of shape index (HoGS), within a local neighborhood of each salient point. The subsequent matching step then robustly associates corresponding points of two facial surfaces, leading to many more matched points between different scans of the same person than between scans of different persons. Experimental results on the Bosphorus dataset highlight the effectiveness of the proposed method and its robustness to facial expression variations. © 2011 IEEE.

  8. Condition monitoring of face milling tool using K-star algorithm and histogram features of vibration signal

    Directory of Open Access Journals (Sweden)

    C.K. Madhusudana

    2016-09-01

    Full Text Available This paper deals with fault diagnosis of a face milling tool based on a machine learning approach, using histogram features and the K-star algorithm. Vibration signals of the milling tool under healthy and different fault conditions are acquired during machining of the steel alloy 42CrMo4. Histogram features are extracted from the acquired signals. A decision tree is used to select the salient features from all the extracted features, and these selected features are used as input to the classifier. The K-star algorithm is used as the classifier, and the output of the model is used to study and classify the different conditions of the face milling tool. Based on the experimental results, the K-star algorithm with histogram features provided a classification accuracy in the range of 94% to 96%, which is acceptable for fault diagnosis.

  9. Influence of Sampling Practices on the Appearance of DNA Image Histograms of Prostate Cells in FNAB Samples

    Directory of Open Access Journals (Sweden)

    Abdelbaset Buhmeida

    1999-01-01

    Full Text Available Twenty-one fine needle aspiration biopsies (FNAB) of the prostate, diagnostically classified as definitely malignant, were studied. The Papanicolaou- or H&E-stained samples were destained and then stained for DNA with the Feulgen reaction. DNA cytometry was applied following different sampling rules, and the histograms varied according to the sampling rule applied. Because free cells between cell groups were easier to measure than cells within the cell groups, two sampling rules were tested in all samples: (i) cells in the cell groups were measured, and (ii) free cells between cell groups were measured. Abnormal histograms were more common with the sampling rule based on free cells, suggesting that abnormal patterns are best revealed through the free cells in these samples. The conclusions were independent of the applied histogram interpretation method.

  10. Support vector machine for breast cancer classification using diffusion-weighted MRI histogram features: Preliminary study.

    Science.gov (United States)

    Vidić, Igor; Egnell, Liv; Jerome, Neil P; Teruel, Jose R; Sjøbakk, Torill E; Østlie, Agnes; Fjøsne, Hans E; Bathen, Tone F; Goa, Pål Erik

    2018-05-01

    Diffusion-weighted MRI (DWI) is currently one of the fastest developing MRI-based techniques in oncology. Histogram properties from model fitting of DWI are useful features for differentiation of lesions, and classification can potentially be improved by machine learning. To evaluate classification of malignant and benign tumors and breast cancer subtypes using support vector machine (SVM). Prospective. Fifty-one patients with benign (n = 23) and malignant (n = 28) breast tumors (26 ER+, of which six were HER2+). Patients were imaged with DW-MRI (3T) using twice refocused spin-echo echo-planar imaging with repetition time / echo time (TR/TE) = 9000/86 msec, 90 × 90 matrix size, 2 × 2 mm in-plane resolution, 2.5 mm slice thickness, and 13 b-values. Apparent diffusion coefficient (ADC), relative enhanced diffusivity (RED), and the intravoxel incoherent motion (IVIM) parameters diffusivity (D), pseudo-diffusivity (D*), and perfusion fraction (f) were calculated. The histogram properties (median, mean, standard deviation, skewness, kurtosis) were used as features in SVM (10-fold cross-validation) for differentiation of lesions and subtyping. Accuracies of the SVM classifications were calculated to find the combination of features with highest prediction accuracy. Mann-Whitney tests were performed for univariate comparisons. For benign versus malignant tumors, univariate analysis found 11 histogram properties to be significant differentiators. Using SVM, the highest accuracy (0.96) was achieved from a single feature (mean of RED), or from three feature combinations of IVIM or ADC. Combining features from all models gave perfect classification. No single feature predicted HER2 status of ER+ tumors (univariate or SVM), although high accuracy (0.90) was achieved with SVM combining several features. Importantly, these features had to include higher-order statistics (kurtosis and skewness), indicating the importance of accounting for heterogeneity. Our
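
    As a rough sketch of the feature pipeline (random stand-ins replace the ADC/IVIM voxel values, and the linear kernel is an assumption), the five histogram properties per parameter map can be computed and cross-validated with scikit-learn:

        import numpy as np
        from scipy.stats import kurtosis, skew
        from sklearn.model_selection import cross_val_score
        from sklearn.svm import SVC

        def histogram_properties(voxels):
            # The five per-lesion histogram properties used as SVM features.
            return [np.median(voxels), np.mean(voxels), np.std(voxels),
                    skew(voxels), kurtosis(voxels)]

        rng = np.random.default_rng(0)
        X = np.array([histogram_properties(rng.normal(size=500)) for _ in range(51)])
        y = rng.integers(0, 2, size=51)          # 0 = benign, 1 = malignant (stand-in)
        acc = cross_val_score(SVC(kernel="linear"), X, y, cv=10).mean()  # 10-fold CV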

  11. Using visible SNR (vSNR) to compare the image quality of pixel binning and digital resizing

    Science.gov (United States)

    Farrell, Joyce; Okincha, Mike; Parmar, Manu; Wandell, Brian

    2010-01-01

    We introduce a new metric, the visible signal-to-noise ratio (vSNR), to analyze how pixel-binning and resizing methods influence noise visibility in uniform areas of an image. The vSNR is the inverse of the standard deviation of the S-CIELAB representation of a uniform field; its units are 1/ΔE. The vSNR metric can be used in simulations to predict how imaging system components affect noise visibility. We use simulations to evaluate two image rendering methods: pixel binning and digital resizing. We show that vSNR increases with scene luminance, pixel size, and viewing distance, and decreases with read noise. Under low illumination conditions and for pixels with relatively high read noise, images generated with the binning method have less noise (higher vSNR) than resized images, though the binning method has noticeably lower spatial resolution. The binning method also reduces demands on the ADC rate and channel throughput. When comparing binning and resizing, there is an image quality tradeoff between noise and blur; depending on the application, users may prefer one error over the other.
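
    The metric itself reduces to one line once the S-CIELAB error image of the rendered uniform patch is available (computing the S-CIELAB representation itself is outside this sketch):

        import numpy as np

        def vsnr(scielab_map):
            # visible SNR of a uniform field: inverse of the standard deviation
            # of its S-CIELAB representation; units are 1/DeltaE.
            return 1.0 / np.std(scielab_map)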

  12. Cost Allocation and Convex Data Envelopment

    DEFF Research Database (Denmark)

    Hougaard, Jens Leth; Tind, Jørgen

    This paper considers allocation rules. First, we demonstrate that costs allocated by the Aumann-Shapley and the Friedman-Moulin cost allocation rules are easy to determine in practice using convex envelopment of registered cost data and parametric programming. Second, from the linear programming...... such as Data Envelopment Analysis (DEA). The convexity constraint of the BCC model introduces a non-zero slack in the objective function of the multiplier problem and we show that the cost allocation rules discussed in this paper can be used as candidates to allocate this slack value on to the input (or output...

  13. Cost allocation review : staff discussion paper

    International Nuclear Information System (INIS)

    2005-09-01

    This report addressed the need for updated cost allocation studies filed by local electricity distribution companies because they ensure that distribution rates for each customer class remain just and reasonable. According to the 2001 Electricity Distribution Rate Handbook, the Ontario Energy Board requires new cost allocation studies before implementing any future incentive regulation plans. A review of cost allocations allows the Board to consider the need for adjustments to the current share of distribution costs paid by different classes of ratepayers. This report included 14 sections to facilitate consultations with stakeholders on financial information requirements for cost allocation; directly assignable costs; functionalization; categorization; allocation methods; allocation of other costs; load data requirements; cost allocation implementation issues; addition of new rate class and rate design for scattered unmetered loads; addition of new rate class for larger users; rates to charge embedded distributors; treatment of the rate sub-classification identified as time-of-use; and, rate design implementation issues. 1 fig., 7 appendices

  14. A DNA-based registry for all animal species: the barcode index number (BIN) system.

    Directory of Open Access Journals (Sweden)

    Sujeevan Ratnasingham

    Full Text Available Because many animal species are undescribed, and because the identification of known species is often difficult, interim taxonomic nomenclature has often been used in biodiversity analysis. By assigning individuals to presumptive species, called operational taxonomic units (OTUs), these systems speed investigations into the patterning of biodiversity and enable studies that would otherwise be impossible. Although OTUs have conventionally been separated through their morphological divergence, DNA-based delineations are not only feasible, but have important advantages. OTU designation can be automated, data can be readily archived, and results can be easily compared among investigations. This study exploits these attributes to develop a persistent, species-level taxonomic registry for the animal kingdom based on the analysis of patterns of nucleotide variation in the barcode region of the cytochrome c oxidase I (COI) gene. It begins by examining the correspondence between groups of specimens identified to a species through prior taxonomic work and those inferred from the analysis of COI sequence variation using one new (RESL) and four established (ABGD, CROP, GMYC, jMOTU) algorithms. It subsequently describes the implementation and structural attributes of the Barcode Index Number (BIN) system. Aside from a pragmatic role in biodiversity assessments, BINs will aid revisionary taxonomy by flagging possible cases of synonymy, and by collating geographical information, descriptive metadata, and images for specimens that are likely to belong to the same species, even if it is undescribed. More than 274,000 BIN web pages are now available, creating a biodiversity resource that is positioned for rapid growth.

  15. Histogram analysis for age change of human lung with computed tomography

    International Nuclear Information System (INIS)

    Shirabe, Ichiju

    1990-01-01

    In order to evaluate physiological changes of normal lung with aging by computed tomography (CT), the peak position (PP) and full width at half maximum (FWHM) of the CT histogram were studied in 77 normal human lungs. Above 30 years of age, PP tended to occur at lower attenuation values with advancing age, yielding the following equation: CT attenuation value of PP = -0.87 × age - 815. The peak position shifted to the range of higher CT attenuation in the 30s. FWHM did not change with advancing age. There were no differences in peak value or FWHM among the upper, middle, and lower lung fields. In this study, physiological changes of the lung were evaluated quantitatively. Furthermore, this study was considered to be useful for diagnosis and treatment of lung diseases. (author)
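
    A minimal sketch of extracting the two histogram descriptors from segmented lung CT numbers; the 10-HU bin width, the -1000..0 HU range, and a single-peaked histogram are assumptions for illustration.

        import numpy as np

        def peak_and_fwhm(hu_values, bin_width=10):
            # Histogram of lung CT numbers over -1000..0 HU.
            edges = np.arange(-1000, 1, bin_width)
            counts, _ = np.histogram(hu_values, bins=edges)
            centers = 0.5 * (edges[:-1] + edges[1:])
            i_peak = int(np.argmax(counts))
            # FWHM from the outermost bins above half maximum (assumes one peak).
            above = np.where(counts >= counts[i_peak] / 2.0)[0]
            fwhm = centers[above[-1]] - centers[above[0]]
            return centers[i_peak], fwhm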

  16. Preprocessing with image denoising and histogram equalization for endoscopy image analysis using texture analysis.

    Science.gov (United States)

    Hiroyasu, Tomoyuki; Hayashinuma, Katsutoshi; Ichikawa, Hiroshi; Yagi, Nobuaki

    2015-08-01

    A preprocessing method for endoscopy image analysis using texture analysis is proposed. In a previous study, we proposed a feature value that combines a co-occurrence matrix and a run-length matrix to analyze the extent of early gastric cancer in images taken with narrow-band imaging endoscopy. However, the obtained feature value does not identify lesion zones correctly due to the influence of noise and halation. Therefore, we propose a new preprocessing method with a non-local means filter for de-noising and contrast limited adaptive histogram equalization. We have confirmed that the pattern of gastric mucosa in images can be improved by the proposed method. Furthermore, the lesion zone is shown more correctly in the resulting color map.
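
    With scikit-image, the two preprocessing stages can be sketched as follows; the filter and clip-limit parameters are illustrative, not the paper's tuned values, and a single-channel image is assumed.

        from skimage import exposure, restoration, util

        def preprocess(image):
            img = util.img_as_float(image)
            # Non-local means de-noising, then contrast limited adaptive
            # histogram equalization (CLAHE).
            denoised = restoration.denoise_nl_means(img, h=0.05,
                                                    patch_size=5, patch_distance=6)
            return exposure.equalize_adapthist(denoised, clip_limit=0.02)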

  17. Illumination compensation using oriented local histogram equalization and its application to face recognition.

    Science.gov (United States)

    Lee, Ping-Han; Wu, Szu-Wei; Hung, Yi-Ping

    2012-09-01

    Illumination compensation and normalization play a crucial role in face recognition. Existing algorithms either compensate low-frequency illumination or capture high-frequency edges; however, the orientations of edges have not been well exploited. In this paper, we propose oriented local histogram equalization (OLHE), which compensates illumination while encoding rich information on edge orientations, and we claim that edge orientation is useful for face recognition. Three OLHE feature combination schemes were proposed for face recognition: the first encodes the most edge orientations; the second is more compact with good edge-preserving capability; and the third performs exceptionally well when extreme lighting conditions occur. The proposed algorithm yielded state-of-the-art performance on AR, CMU PIE, and Extended Yale B using standard protocols. We further evaluated the average performance of the proposed algorithm on images observed under differing lighting, and it yielded promising results.

  18. REAL-TIME FACE RECOGNITION BASED ON OPTICAL FLOW AND HISTOGRAM EQUALIZATION

    Directory of Open Access Journals (Sweden)

    D. Sathish Kumar

    2013-05-01

    Full Text Available Face recognition is one of the intensive areas of research in computer vision and pattern recognition, much of it focused on recognizing faces under varying facial expressions and pose variation. The constrained optical flow algorithm discussed in this paper recognizes facial images involving various expressions based on motion vector computation. We propose an optical flow computation algorithm that processes frames of varying facial gestures and integrates them with a synthesized image in a probabilistic framework. A histogram equalization technique is used to overcome the effect of illumination when capturing the input data with camera devices; it also enhances the contrast of the image for better processing. The experimental results confirm that the proposed face recognition system is more robust and recognizes facial images under varying expressions and pose variations more accurately.

  19. A fast and effective model for wavelet subband histograms and its application in texture image retrieval.

    Science.gov (United States)

    Pi, Ming Hong; Tong, C S; Choy, Siu Kai; Zhang, Hong

    2006-10-01

    This paper presents a novel, effective, and efficient characterization of wavelet subbands by bit-plane extractions. Each bit plane is associated with a probability that represents the frequency of 1-bit occurrence, and the concatenation of all the bit-plane probabilities forms our new image signature. Such a signature can be extracted directly from the code-block code-stream, rather than from the de-quantized wavelet coefficients, making our method particularly adaptable for image retrieval in the compression domain, such as JPEG2000 format images. Our signatures have a smaller storage requirement and lower computational complexity, and yet experimental results on texture image retrieval show that our proposed signatures are much more cost-effective than current state-of-the-art methods, including generalized Gaussian density signatures and histogram signatures.
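
    The signature construction is easy to sketch from de-quantized subband coefficients (the paper extracts it directly from the JPEG2000 code-block code-stream instead); the number of planes is an assumption.

        import numpy as np

        def bitplane_signature(coeffs, n_planes=8):
            # Probability of a 1-bit in each bit plane of the coefficient
            # magnitudes, concatenated LSB-first into the image signature.
            mags = np.abs(np.asarray(coeffs)).astype(np.uint32)
            return np.array([np.mean((mags >> b) & 1) for b in range(n_planes)])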

  20. Gender Perception From Faces Using Boosted LBPH (Local Binary Pattern Histograms)

    Directory of Open Access Journals (Sweden)

    U. U. Tariq

    2013-06-01

    Full Text Available Automatic gender classification from faces has several applications, such as surveillance, human-computer interaction, and targeted advertising. Humans can recognize gender from faces quite accurately, but for computer vision it is a difficult task. Many studies have targeted this problem, but most used images of faces taken under constrained conditions. Real-world applications, however, require processing real-world images, which have significant variation in lighting and pose, making the gender classification task very difficult. We have examined the problem of automatic gender classification from faces in real-world images. Faces are extracted from the images using a face detector, aligned, and represented using local binary pattern histograms. Discriminative features are selected using AdaBoost, and the boosted LBP features are used to train a support vector machine that provides a recognition rate of 93.29%.
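
    The base descriptor can be sketched with scikit-image (the AdaBoost feature selection and SVM training stages are omitted; P, R, and the uniform mapping are standard choices assumed here, not necessarily the paper's):

        import numpy as np
        from skimage.feature import local_binary_pattern

        def lbp_histogram(face, P=8, R=1):
            # Uniform LBP codes take values 0..P+1, hence P+2 histogram bins.
            codes = local_binary_pattern(face, P, R, method="uniform")
            n_bins = P + 2
            hist, _ = np.histogram(codes, bins=n_bins, range=(0, n_bins))
            return hist / hist.sum()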

  1. A powerful, low-cost histogramming memory for digital radiography with multi-wire proportional counters

    International Nuclear Information System (INIS)

    Bateman, J.E.; Locke, C.E.R.; Ferrari, C.A.

    1986-01-01

    A powerful, low-cost histogramming memory for digital radiography with multi-wire proportional counters is described. The memory is based on a commercial video display device coupled to an Apple II microcomputer which, at a total cost of around £2500, gives a system with 512 x 512 pixel resolution and a counting range of 4095 counts per pixel. The system can take data at rates of up to 5000 Hz while providing a live-time display. No hardware modifications are necessary, the comprehensive storage and display facilities being implemented in a combined package of BASIC and ASSEMBLER software. An ACCELERATOR coprocessor card is used to enhance the performance of the system. (author)

  2. Wavelength-adaptive dehazing using histogram merging-based classification for UAV images.

    Science.gov (United States)

    Yoon, Inhye; Jeong, Seokhwa; Jeong, Jaeheon; Seo, Doochun; Paik, Joonki

    2015-03-19

    Since incoming light to an unmanned aerial vehicle (UAV) platform can be scattered by haze and dust in the atmosphere, the acquired image loses the original color and brightness of the subject. Enhancement of hazy images is an important task in improving the visibility of various UAV images. This paper presents a spatially-adaptive dehazing algorithm that merges color histograms with consideration of the wavelength-dependent atmospheric turbidity. Based on the wavelength-adaptive hazy image acquisition model, the proposed dehazing algorithm consists of three steps: (i) image segmentation based on geometric classes; (ii) generation of the context-adaptive transmission map; and (iii) intensity transformation for enhancing a hazy UAV image. The major contribution of the research is a novel hazy UAV image degradation model by considering the wavelength of light sources. In addition, the proposed transmission map provides a theoretical basis to differentiate visually important regions from others based on the turbidity and merged classification results.

  3. Detection of License Plate using Sliding Window, Histogram of Oriented Gradient, and Support Vector Machines Method

    Science.gov (United States)

    Astawa, INGA; Gusti Ngurah Bagus Caturbawa, I.; Made Sajayasa, I.; Dwi Suta Atmaja, I. Made Ari

    2018-01-01

    License plate recognition is usually used as part of a larger system, such as a parking system, and license plate detection is considered the most important step in a license plate recognition system. We propose methods that can be used to detect the vehicle plate on a mobile phone. In this paper, we used the sliding window, Histogram of Oriented Gradients (HOG), and Support Vector Machine (SVM) methods for license plate detection, to increase detection performance even when image quality is poor. The image is processed by the sliding window method to find the plate position, and feature extraction and classification at every window position are done with HOG and SVM. Good results were obtained in this research, with 96% accuracy.
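
    A bare-bones sketch of the detection loop follows; the window size, stride, HOG parameters, and the pre-trained scikit-learn classifier clf are all assumptions for illustration.

        import numpy as np
        from skimage.feature import hog

        def detect_plate(gray, clf, win=(48, 144), step=16):
            # Slide a fixed-size window, score each HOG descriptor with the
            # SVM, and keep the highest-scoring position.
            best, best_score = None, -np.inf
            for y in range(0, gray.shape[0] - win[0], step):
                for x in range(0, gray.shape[1] - win[1], step):
                    patch = gray[y:y + win[0], x:x + win[1]]
                    feat = hog(patch, orientations=9, pixels_per_cell=(8, 8),
                               cells_per_block=(2, 2)).reshape(1, -1)
                    score = clf.decision_function(feat)[0]
                    if score > best_score:
                        best, best_score = (x, y), score
            return best, best_score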

  4. Wavelength-Adaptive Dehazing Using Histogram Merging-Based Classification for UAV Images

    Directory of Open Access Journals (Sweden)

    Inhye Yoon

    2015-03-01

    Full Text Available Since incoming light to an unmanned aerial vehicle (UAV) platform can be scattered by haze and dust in the atmosphere, the acquired image loses the original color and brightness of the subject. Enhancement of hazy images is an important task in improving the visibility of various UAV images. This paper presents a spatially-adaptive dehazing algorithm that merges color histograms with consideration of the wavelength-dependent atmospheric turbidity. Based on the wavelength-adaptive hazy image acquisition model, the proposed dehazing algorithm consists of three steps: (i) image segmentation based on geometric classes; (ii) generation of the context-adaptive transmission map; and (iii) intensity transformation for enhancing a hazy UAV image. The major contribution of the research is a novel hazy UAV image degradation model by considering the wavelength of light sources. In addition, the proposed transmission map provides a theoretical basis to differentiate visually important regions from others based on the turbidity and merged classification results.

  5. Segmentation of phase contrast microscopy images based on multi-scale local Basic Image Features histograms.

    Science.gov (United States)

    Jaccard, N; Szita, N; Griffin, L D

    2017-09-03

    Phase contrast microscopy (PCM) is routinely used for the inspection of adherent cell cultures in all fields of biology and biomedicine. Key decisions for experimental protocols are often taken by an operator based on typically qualitative observations. However, automated processing and analysis of PCM images remain challenging due to the low contrast between foreground objects (cells) and background as well as various imaging artefacts. We propose a trainable pixel-wise segmentation approach whereby image structures and symmetries are encoded in the form of multi-scale Basic Image Features local histograms, and classification of them is learned by random decision trees. This approach was validated for segmentation of cell versus background, and discrimination between two different cell types. Performance close to that of state-of-the-art specialised algorithms was achieved despite the general nature of the method. The low processing time ( images) is suitable for batch processing of experimental data as well as for interactive segmentation applications.

  6. Optical Extinction Measurements of Dust Density in the GMRO Regolith Test Bin

    Science.gov (United States)

    Lane, J.; Mantovani, J.; Mueller, R.; Nugent, M.; Nick, A.; Schuler, J.; Townsend, I.

    2016-01-01

    A regolith simulant test bin was constructed and completed in the Granular Mechanics and Regolith Operations (GMRO) Lab in 2013. This Planetary Regolith Test Bed (PRTB), a 64 sq m x 1 m deep test bin housed in a climate-controlled facility, contains 120 MT of lunar-regolith simulant, called Black Point-1 or BP-1, from Black Point, AZ. One of the current uses of the test bin is to study the effects of difficult lighting and dust conditions on telerobotic perception systems, to better assess and refine regolith operations for asteroid, Mars, and polar lunar missions. Low illumination and low angle of incidence lighting pose significant problems for computer vision and human perception. Levitated dust on asteroids interferes with imaging and degrades depth perception, and dust storms on Mars pose a significant problem. Due to these factors, the likely performance of telerobotics is poorly understood for future missions; current space telerobotic systems are only operated in bright lighting and dust-free conditions. This technology development testing will identify: (1) the impact of degraded lighting and environmental dust on computer vision and operator perception, (2) potential methods and procedures for mitigating these impacts, and (3) requirements for telerobotic perception systems for asteroid capture, Mars dust storm, and lunar regolith ISRU missions. In order to solve some of the telerobotic perception system problems, a plume erosion sensor (PES) was developed in the Lunar Regolith Simulant Bin (LRSB), containing 2 MT of JSC-1a lunar simulant. PES is simply a laser and digital camera with a white target. Two modes of operation have been investigated: (1) single laser spot, where the brightness of the spot depends on the optical extinction due to dust and is thus an indirect measure of particle number density, and (2) side-scatter, where the camera images the laser from the side, showing beam entrance into the dust cloud and the boundary between dust and void. Both

  7. Diviner lunar radiometer gridded brightness temperatures from geodesic binning of modeled fields of view

    Science.gov (United States)

    Sefton-Nash, E.; Williams, J.-P.; Greenhagen, B. T.; Aye, K.-M.; Paige, D. A.

    2017-12-01

    An approach is presented to efficiently produce high quality gridded data records from the large, global point-based dataset returned by the Diviner Lunar Radiometer Experiment aboard NASA's Lunar Reconnaissance Orbiter. The need to minimize data volume and processing time in production of science-ready map products is increasingly important with the growth in data volume of planetary datasets. Diviner makes on average >1400 observations per second of radiance that is reflected and emitted from the lunar surface, using 189 detectors divided into 9 spectral channels. Data management and processing bottlenecks are amplified by modeling every observation as a probability distribution function over the field of view, which can increase the required processing time by 2-3 orders of magnitude. Geometric corrections, such as projection of data points onto a digital elevation model, are numerically intensive and therefore it is desirable to perform them only once. Our approach reduces bottlenecks through parallel binning and efficient storage of a pre-processed database of observations. Database construction is via subdivision of a geodesic icosahedral grid, with a spatial resolution that can be tailored to suit the field of view of the observing instrument. Global geodesic grids with high spatial resolution are normally impractically memory intensive. We therefore demonstrate a minimum storage and highly parallel method to bin very large numbers of data points onto such a grid. A database of the pre-processed and binned points is then used for production of mapped data products that is significantly faster than if unprocessed points were used. We explore quality controls in the production of gridded data records by conditional interpolation, allowed only where data density is sufficient. The resultant effects on the spatial continuity and uncertainty in maps of lunar brightness temperatures is illustrated. We identify four binning regimes based on trades between the

  8. Adaptive Kalman filtering for histogram-based appearance learning in infrared imagery.

    Science.gov (United States)

    Venkataraman, Vijay; Fan, Guoliang; Havlicek, Joseph P; Fan, Xin; Zhai, Yan; Yeary, Mark B

    2012-11-01

    Targets of interest in video acquired from imaging infrared sensors often exhibit profound appearance variations due to a variety of factors, including complex target maneuvers, ego-motion of the sensor platform, background clutter, etc., making it difficult to maintain a reliable detection process and track lock over extended time periods. Two key issues in overcoming this problem are how to represent the target and how to learn its appearance online. In this paper, we adopt a recent appearance model that estimates the pixel intensity histograms as well as the distribution of local standard deviations in both the foreground and background regions for robust target representation. Appearance learning is then cast as an adaptive Kalman filtering problem where the process and measurement noise variances are both unknown. We formulate this problem using both covariance matching and, for the first time in a visual tracking application, the recent autocovariance least-squares (ALS) method. Although convergence of the ALS algorithm is guaranteed only for the case of globally wide sense stationary process and measurement noises, we demonstrate for the first time that the technique can often be applied with great effectiveness under the much weaker assumption of piecewise stationarity. The performance advantages of the ALS method relative to the classical covariance matching are illustrated by means of simulated stationary and nonstationary systems. Against real data, our results show that the ALS-based algorithm outperforms the covariance matching as well as the traditional histogram similarity-based methods, achieving sub-pixel tracking accuracy against the well-known AMCOM closure sequences and the recent SENSIAC automatic target recognition dataset.

  9. Dose-volume histograms associated to long-term colorectal functions in patients receiving pelvic radiotherapy

    International Nuclear Information System (INIS)

    Fokdal, Lars; Honore, Henriette; Hoyer, Morten; Maase, Hans von der

    2005-01-01

    Background and purpose: To correlate long-term colorectal dysfunction following radical radiotherapy for bladder or prostate cancer with clinical parameters and dose-volume histogram parameters of the small intestine, rectum, and anal canal volume. Materials and methods: Seventy-one patients previously treated for bladder or prostate cancer with CT-based radiotherapy of 60-70 Gy were interviewed with questions concerning long-term colorectal dysfunction. Median follow-up time was 30 months (range 12-109 months). Clinical parameters and parameters from the dose-volume histograms were correlated with colorectal dysfunction (Spearman's test). Median and quartile values of all parameters were used as cut-off values for statistical analyses. A logistic regression model was used for analysis of urgency and incontinence in relation to the median or maximum radiation dose to the anal canal volume. Results: Rectum length, volume, and several dose-volume parameters of the anal canal volume and rectal volume were correlated with late organ dysfunction. In a logistic model, fecal urgency and incontinence were dependent on dose-volume parameters of the anal canal volume. No relation between age or follow-up time and late effects was found. Dose-volume parameters of the small intestine were not related to any late dysfunction. Conclusions: A relationship between several late anorectal dysfunctions and dose-volume parameters of the rectum and anal canal volume was demonstrated. It is recommended to exclude the anal canal volume from the high-dose volume and to apply rectal shielding whenever possible to prevent late anorectal dysfunction.

  10. Quantitatively assessed CT imaging measures of pulmonary interstitial pneumonia: Effects of reconstruction algorithms on histogram parameters

    International Nuclear Information System (INIS)

    Koyama, Hisanobu; Ohno, Yoshiharu; Yamazaki, Youichi; Nogami, Munenobu; Kusaka, Akiko; Murase, Kenya; Sugimura, Kazuro

    2010-01-01

    This study aimed to assess the influence of the reconstruction algorithm on quantitative assessments in interstitial pneumonia patients. A total of 25 collagen vascular disease patients (nine male and 16 female; mean age, 57.2 years; age range, 32-77 years) underwent thin-section MDCT examinations, and the MDCT data were reconstructed with three reconstruction algorithms (two high-frequency [A and B] and one standard [C]). In reconstruction algorithm B, the effect of low- and middle-frequency space was suppressed compared with reconstruction algorithm A. As quantitative CT parameters, kurtosis, skewness, and mean lung density (MLD) were acquired from a frequency histogram of the whole lung parenchyma for each reconstruction algorithm. To determine the differences in quantitative CT parameters caused by the reconstruction algorithms, these parameters were compared statistically. To determine the relationships with disease severity, these parameters were correlated with PFTs. All the histogram parameter values differed significantly from each other (p < 0.0001), and those of reconstruction algorithm C were the highest. All MLDs had fair or moderate correlation with all parameters of PFT (-0.64 < r < -0.45, p < 0.05). Although kurtosis and skewness in high-frequency reconstruction algorithm A had significant correlations with all parameters of PFT (-0.61 < r < -0.45, p < 0.05), there were significant correlations only with diffusing capacity of carbon monoxide (DLco) and total lung capacity (TLC) in reconstruction algorithm C, and with forced expiratory volume in 1 s (FEV1), DLco, and TLC in reconstruction algorithm B. In conclusion, the reconstruction algorithm influences quantitative assessments on chest thin-section MDCT examination in interstitial pneumonia patients.
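
    Given the segmented whole-lung voxel values, the three histogram parameters reduce to a few lines; SciPy's default excess-kurtosis convention is an assumption, since papers differ on whether 3 is subtracted.

        import numpy as np
        from scipy.stats import kurtosis, skew

        def lung_histogram_parameters(hu_lung):
            # hu_lung: 1-D array of CT numbers from the segmented lung parenchyma.
            return {"MLD": float(np.mean(hu_lung)),        # mean lung density
                    "kurtosis": float(kurtosis(hu_lung)),  # excess kurtosis
                    "skewness": float(skew(hu_lung))}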

  11. Fast analysis of molecular dynamics trajectories with graphics processing units-Radial distribution function histogramming

    International Nuclear Information System (INIS)

    Levine, Benjamin G.; Stone, John E.; Kohlmeyer, Axel

    2011-01-01

    The calculation of radial distribution functions (RDFs) from molecular dynamics trajectory data is a common and computationally expensive analysis task. The rate limiting step in the calculation of the RDF is building a histogram of the distance between atom pairs in each trajectory frame. Here we present an implementation of this histogramming scheme for multiple graphics processing units (GPUs). The algorithm features a tiling scheme to maximize the reuse of data at the fastest levels of the GPU's memory hierarchy and dynamic load balancing to allow high performance on heterogeneous configurations of GPUs. Several versions of the RDF algorithm are presented, utilizing the specific hardware features found on different generations of GPUs. We take advantage of larger shared memory and atomic memory operations available on state-of-the-art GPUs to accelerate the code significantly. The use of atomic memory operations allows the fast, limited-capacity on-chip memory to be used much more efficiently, resulting in a fivefold increase in performance compared to the version of the algorithm without atomic operations. The ultimate version of the algorithm running in parallel on four NVIDIA GeForce GTX 480 (Fermi) GPUs was found to be 92 times faster than a multithreaded implementation running on an Intel Xeon 5550 CPU. On this multi-GPU hardware, the RDF between two selections of 1,000,000 atoms each can be calculated in 26.9 s per frame. The multi-GPU RDF algorithms described here are implemented in VMD, a widely used and freely available software package for molecular dynamics visualization and analysis.
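
    The rate-limiting histogramming step is easy to state in NumPy; this CPU stand-in for the paper's tiled GPU kernels (no periodic boundaries, and a dense pairwise-distance matrix, so only modest selection sizes) shows what is being accelerated.

        import numpy as np

        def distance_histogram(pos_a, pos_b, r_max, n_bins):
            # pos_a: (Na, 3) and pos_b: (Nb, 3) atom coordinates for one frame.
            # Builds the pair-distance histogram underlying g(r).
            d = np.linalg.norm(pos_a[:, None, :] - pos_b[None, :, :], axis=-1)
            counts, edges = np.histogram(d, bins=n_bins, range=(0.0, r_max))
            return counts, edges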

  12. Designing a power supply for Nim-bin formatted equipment; Diseno de una fuente de alimentacion para equipos con formato Nim-bin

    Energy Technology Data Exchange (ETDEWEB)

    Banuelos G, L. E.; Hernandez D, V. M.; Vega C, H. R., E-mail: lebluis2012@hotmail.com [Universidad Autonoma de Zacatecas, Unidad Academica de Estudios Nucleares, Cipres No. 10, Fracc. La Penuela, 98060 Zacatecas, Zac. (Mexico)

    2016-09-15

    From an old Nuclear Chicago power supply that was practically in the trash, it was possible to recover the 19-inch casing, the rear connectors, and the housing where the circuits were. All mechanical parts were then cleaned, and the electronic design was started to replace the original voltage and current functions of this equipment. The cards for the ±6, ±12 and ±24 V outputs were designed, simulated, and tested with circuitry that does not rely on specialized components or parts sold only by the equipment manufacturer. In the current handling of each output voltage, it was possible to match the specifications of manufacturers such as Ortec or Canberra, whose power supply models deliver 160 Watts. Basic tests were performed to show that the behavior is very similar to commercial equipment, such as the full-load regulation index and the noise level in the supply voltages, so our NIM-bin power supply is viable for use in our institution's laboratories. (Author)

  13. Fast Multispectral Imaging by Spatial Pixel-Binning and Spectral Unmixing.

    Science.gov (United States)

    Pan, Zhi-Wei; Shen, Hui-Liang; Li, Chunguang; Chen, Shu-Jie; Xin, John H

    2016-08-01

    Multispectral imaging systems are widely applied in relevant fields for their capability of acquiring spectral information of scenes. Their limitation is that, due to the large number of spectral channels, the imaging process can be quite time-consuming when capturing high-resolution (HR) multispectral images. To resolve this limitation, this paper proposes a fast multispectral imaging framework based on image-sensor pixel binning and spectral unmixing techniques. The framework comprises a fast imaging stage and a computational reconstruction stage. In the imaging stage, only a few spectral images are acquired in HR, while most spectral images are acquired in low resolution (LR). The LR images are captured by applying pixel binning on the image sensor, so that the exposure time can be greatly reduced. In the reconstruction stage, an optimal number of basis spectra are computed and the signal-dependent noise statistics are estimated. The unknown HR images are then efficiently reconstructed by solving a closed-form cost function that models the spatial and spectral degradations. The effectiveness of the proposed framework is evaluated using real-scene multispectral images. Experimental results validate that, in general, the method outperforms the state of the art in terms of reconstruction accuracy, with an additional 20× or more improvement in computational efficiency.
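
    The sensor-side binning step itself is simple to model; a 2x2 charge-summing bin on an even-sized frame is assumed in this sketch.

        import numpy as np

        def bin2x2(frame):
            # Sum each non-overlapping 2x2 block: quarter the resolution,
            # but read noise is paid once per binned (output) pixel.
            h, w = frame.shape
            return frame.reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))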

  14. Evaluation of methods for selecting the midventilation bin in 4DCT scans of lung cancer patients

    DEFF Research Database (Denmark)

    Nygaard, Ditte Eklund; Persson, Gitte Fredberg; Brink, Carsten

    2013-01-01

    based on: 1) visual evaluation of tumour displacement; 2) rigid registration of tumour position; 3) diaphragm displacement in the CC direction; and 4) carina displacement in the CC direction. Determination of the MidV bin based on the displacement of the manually delineated gross tumour volume (GTV.......4-5.4) mm, 1.9 (0.5-6.9) mm, 2.0 (0.5-12.3) mm and 1.1 (0.4-5.4) mm for the visual, rigid registration, diaphragm, carina, and reference method. Median (range) absolute difference between geometric MidV error for the evaluated methods and the reference method was 0.0 (0.0-1.2) mm, 0.0 (0.0-1.7) mm, 0.7 (0.......0-3.9) mm and 1.0 (0.0-6.9) mm for the visual, rigid registration, diaphragm and carina method. Conclusion. The visual and semi-automatic rigid registration methods were equivalent in accuracy for selecting the MidV bin of a 4DCT scan. The methods based on diaphragm and carina displacement cannot...

  15. Frequency-bin entanglement of ultra-narrow band non-degenerate photon pairs

    Science.gov (United States)

    Rieländer, Daniel; Lenhard, Andreas; Jiménez Farías, Osvaldo; Máttar, Alejandro; Cavalcanti, Daniel; Mazzera, Margherita; Acín, Antonio; de Riedmatten, Hugues

    2018-01-01

    We demonstrate frequency-bin entanglement between ultra-narrowband photons generated by cavity enhanced spontaneous parametric down conversion. Our source generates photon pairs in widely non-degenerate discrete frequency modes, with one photon resonant with a quantum memory material based on praseodymium doped crystals and the other photon at telecom wavelengths. Correlations between the frequency modes are analyzed using phase modulators and narrowband filters before detection. We show high-visibility two photon interference between the frequency modes, allowing us to infer a coherent superposition of the modes. We develop a model describing the state that we create and use it to estimate optimal measurements to achieve a violation of the Clauser-Horne (CH) Bell inequality under realistic assumptions. With these settings we perform a Bell test and show a significant violation of the CH inequality, thus proving the entanglement of the photons. Finally we demonstrate the compatibility with a quantum memory material by using a spectral hole in the praseodymium (Pr) doped crystal as spectral filter for measuring high-visibility two-photon interference. This demonstrates the feasibility of combining frequency-bin entangled photon pairs with Pr-based solid state quantum memories.

  16. The Islamic Ethics in the poetry of ‘Abdullah bin al-Mubarak (Arabic)

    Directory of Open Access Journals (Sweden)

    Dr. Muhammad Ismail Bin Abdul Salam

    2017-01-01

    Full Text Available Abstract ‘Abdullah bin al-Mubarak was born in Marw, one of the prime cities of Khurasan (nowadays in the surroundings of Afghanistan and Central Asia), in the year 118 AH. In addition to his many talents, achievements, and abilities, ‘Abdullah bin al-Mubarak was also gifted in literacy, particularly in the art of poetry. He held an eloquent tongue, which was recognized by all who conversed with him, and his language displayed the nature of someone who had been taught well. Most of the poetry recorded from him is actually his advice to others, whether they were close friends or high-ranking Caliphs and Rulers. The topics spoken of concerned the common issues of his time (e.g., matters pertaining to theology, politics, the worldview, and the community), and as always they contained much wisdom; hence the books of history have preserved and recorded them. This research article discusses the biography of Abdullah ibn al-Mubarak, the Islamic ethics in his poetry, and the impact of rhetoric on his poetry, with special concentration on four kinds: citation, and the impact of Quranic words, Quranic imagery, and Quranic style on his poetry.

  17. Solid State Spin-Wave Quantum Memory for Time-Bin Qubits.

    Science.gov (United States)

    Gündoğan, Mustafa; Ledingham, Patrick M; Kutluer, Kutlu; Mazzera, Margherita; de Riedmatten, Hugues

    2015-06-12

    We demonstrate the first solid-state spin-wave optical quantum memory with on-demand read-out. Using the full atomic frequency comb scheme in a Pr(3+):Y2SiO5 crystal, we store weak coherent pulses at the single-photon level with a signal-to-noise ratio >10. Narrow-band spectral filtering based on spectral hole burning in a second Pr(3+):Y2SiO5 crystal is used to filter out the excess noise created by control pulses to reach an unconditional noise level of (2.0±0.3)×10(-3) photons per pulse. We also report spin-wave storage of photonic time-bin qubits with conditional fidelities higher than achievable by a measure and prepare strategy, demonstrating that the spin-wave memory operates in the quantum regime. This makes our device the first demonstration of a quantum memory for time-bin qubits, with on-demand read-out of the stored quantum information. These results represent an important step for the use of solid-state quantum memories in scalable quantum networks.

  18. Development of Seismic Demand for Chang-Bin Offshore Wind Farm in Taiwan Strait

    Directory of Open Access Journals (Sweden)

    Yu-Kai Wang

    2016-12-01

    Full Text Available Taiwan is located on the Pacific seismic belt, and the soils of Taiwan's offshore wind farms are softer than those in Europe. To ensure the safety and stability of offshore wind turbine supporting structures, it is important to assess offshore wind farm seismic forces reasonably. In this paper, the relevant seismic and geological data are obtained for the Chang-Bin offshore wind farm in the Taiwan Strait, a probabilistic seismic hazard analysis (PSHA) is carried out, and the first uniform hazard response spectrum for the Chang-Bin offshore wind farm is achieved. Compared with the existing design response spectrum in the local regulation, this site-specific seismic hazard analysis influences the seismic force considered in the design of supporting structures and therefore affects their cost. The results show that a site-specific seismic hazard analysis is required for high-seismicity areas. The paper highlights the importance of seismic hazard analysis in assessing offshore wind farm seismic forces. Follow-up recommendations and research directions are given for Taiwan's offshore wind turbine supporting structures under seismic force considerations.

  19. Measuring Device for Air Speed in Macroporous Media and Its Application Inside Apple Storage Bins

    Directory of Open Access Journals (Sweden)

    Martin Geyer

    2018-02-01

    Full Text Available In cold storage facilities for fruit and vegetables, airflow is necessary for heat removal. The design of storage facilities influences the air speed in the surroundings of the product; therefore, knowledge about airflow next to the product is important when planning the layout of cold stores adapted to the requirements of the products. A new sensing device (ASL, air speed logger) was developed for omnidirectional measurement of air speed between fruit or vegetables inside storage bins or in bulk. It consists of four interconnected plastic spheres of 80 mm diameter each, adapted to the size of apple fruit. In the free space between the spheres, silicon diodes are fixed for airflow measurement based on a calorimetric principle. The battery and data logger are mounted inside the spheres. The device was calibrated in a wind tunnel over a measuring range of 0-1.3 m/s. Air speed measurements in fruit bulks at laboratory scale and in an industrial fruit store show air speeds in gaps between fruit with high stability at different airflow levels. Several devices can be placed between stored products to determine the air speed distribution inside bulks or bin stacks in a storage room.

  20. Efficient Entanglement Concentration of Nonlocal Two-Photon Polarization-Time-Bin Hyperentangled States

    Science.gov (United States)

    Wang, Zi-Hang; Yu, Wen-Xuan; Wu, Xiao-Yuan; Gao, Cheng-Yan; Alzahrani, Faris; Hobiny, Aatef; Deng, Fu-Guo

    2018-03-01

    We present two different hyperentanglement concentration protocols (hyper-ECPs) for two-photon systems in nonlocal polarization-time-bin hyperentangled states with known parameters, including Bell-like and cluster-like states, resorting to the parameter-splitting method. They require only one of the two parties in quantum communication to operate on her photon in the process of entanglement concentration, not both, and they have the maximal success probability. They work with linear optical elements and have good feasibility in experiment, especially when a large amount of quantum data is exchanged, as the parties can obtain the information about the parameters of the nonlocal hyperentangled states by sampling a subset of nonlocal hyperentangled two-photon systems and measuring them. As the quantum state of photons in the time-bin degree of freedom suffers less noise in an optical-fiber channel, these hyper-ECPs may have good applications in practical long-distance quantum communication in the future.

  1. Disrupted membrane structure and intracellular Ca²⁺ signaling in adult skeletal muscle with acute knockdown of Bin1.

    Directory of Open Access Journals (Sweden)

    Andoria Tjondrokoesoemo

    Full Text Available Efficient intracellular Ca²⁺ ([Ca²⁺]i) homeostasis in skeletal muscle requires intact triad junctional complexes comprised of t-tubule invaginations of plasma membrane and terminal cisternae of sarcoplasmic reticulum. Bin1 contains a specialized BAR domain that is associated with t-tubule development in skeletal muscle and is involved in tethering the dihydropyridine receptors (DHPR) to the t-tubule. Here, we show that Bin1 is important for Ca²⁺ homeostasis in adult skeletal muscle. Since systemic ablation of Bin1 in mice results in postnatal lethality, an in vivo electroporation-mediated transfection method was used to deliver an RFP-tagged plasmid producing short-hairpin RNA targeting Bin1 (shRNA-Bin1), to study the effect of Bin1 knockdown in adult mouse FDB skeletal muscle. Upon confirming the reduction of endogenous Bin1 expression, we showed that shRNA-Bin1 muscle displayed swollen t-tubule structures, indicating that Bin1 is required for the maintenance of intact membrane structure in adult skeletal muscle. Reduced Bin1 expression led to disruption of t-tubule structure that was linked with alterations to intracellular Ca²⁺ release. Voltage-induced Ca²⁺ release in isolated single muscle fibers of shRNA-Bin1 showed that both the mean amplitude of the Ca²⁺ current and the SR Ca²⁺ transient were reduced when compared to shRNA-control, indicating compromised coupling between DHPR and ryanodine receptor 1. The mean frequency of osmotic-stress-induced Ca²⁺ sparks was reduced in shRNA-Bin1, indicating compromised DHPR activation. ShRNA-Bin1 fibers also displayed reduced Ca²⁺ spark amplitude, attributed to decreased total Ca²⁺ stores in the shRNA-Bin1 fibers. Human mutation of Bin1 is associated with centronuclear myopathy, and the SH3 domain of Bin1 is important for sarcomeric protein organization in skeletal muscle. Our study shows the importance of Bin1 in the maintenance of intact t-tubule structure and ([Ca

  2. A statistic to estimate the variance of the histogram-based mutual information estimator based on dependent pairs of observations

    NARCIS (Netherlands)

    Moddemeijer, R

    In the case of two signals with independent pairs of observations (x(n), y(n)), a statistic to estimate the variance of the histogram-based mutual information estimator has been derived earlier. We present such a statistic for dependent pairs. To derive this statistic it is necessary to avail of a

  3. TaBoo SeArch Algorithm with a Modified Inverse Histogram for Reproducing Biologically Relevant Rare Events of Proteins.

    Science.gov (United States)

    Harada, Ryuhei; Takano, Yu; Shigeta, Yasuteru

    2016-05-10

    The TaBoo SeArch (TBSA) algorithm [Harada et al. J. Comput. Chem. 2015, 36, 763-772 and Harada et al. Chem. Phys. Lett. 2015, 630, 68-75] was recently proposed as an enhanced conformational sampling method for reproducing biologically relevant rare events of a given protein. In TBSA, an inverse histogram of the original distribution, mapped onto a set of reaction coordinates, is constructed from trajectories obtained by multiple short-time molecular dynamics (MD) simulations. Rarely occurring states of a given protein are statistically selected as new initial states based on the inverse histogram, and resampling is performed by restarting the MD simulations from the new initial states to promote the conformational transition. In this process, the definition of the inverse histogram, which characterizes the rarely occurring states, is crucial for the efficiency of TBSA. In this study, we propose a simple modification of the inverse histogram to further accelerate the convergence of TBSA. As demonstrations of the modified TBSA, we applied it to (a) hydrogen bonding rearrangements of Met-enkephalin, (b) large-amplitude domain motions of Glutamine-Binding Protein, and (c) folding processes of the B domain of Staphylococcus aureus Protein A. All demonstrations numerically proved that the modified TBSA reproduced these biologically relevant rare events with nanosecond-order simulation times, although a set of microsecond-order, canonical MD simulations failed to reproduce the rare events, indicating the high efficiency of the modified TBSA.

  4. Transmission usage cost allocation schemes

    International Nuclear Information System (INIS)

    Abou El Ela, A.A.; El-Sehiemy, R.A.

    2009-01-01

    This paper presents several suggested transmission usage cost allocation (TCA) schemes for system participants. Different independent system operator (ISO) visions are presented using the pro rata and flow-based TCA methods. Two flow-based TCA schemes (FTCA) are proposed. The first FTCA scheme generalizes the equivalent bilateral exchanges (EBE) concepts to lossy networks through a two-stage procedure. The second FTCA scheme is based on modified sensitivity factors (MSF), developed from actual measurements of power flows in transmission lines and power injections at different buses. The proposed schemes exhibit desirable apportioning properties and are easy to implement and understand. Case studies for different loading conditions are carried out to show the capability of the proposed schemes for solving the TCA problem. (author)

  5. Legitimate Allocation of Public Healthcare

    DEFF Research Database (Denmark)

    Lippert-Rasmussen, Kasper; Lauridsen, Sigurd

    2009-01-01

    Citizens' consent to political decisions is often regarded as a necessary condition of political legitimacy. Consequently, legitimate allocation of healthcare has seemed almost unattainable in contemporary pluralistic societies. The problem is that citizens do not agree on any single principle...... governing priorities among groups of patients. The Accountability for Reasonableness (A4R) framework suggests an ingenious solution to this problem of moral disagreement. Rather than advocating any substantive distributive principle, its advocates propose a feasible set of conditions, which, if met...... by decision makers at the institutional level, provide, so it is promised, legitimate decisions. While we agree that A4R represents an important contribution to the priority-setting debate, we challenge the framework in two respects. First, we argue that A4R, and more specifically the relevance condition of A...

  6. Theory of stable allocations II

    Directory of Open Access Journals (Sweden)

    Pantelić Svetlana

    2015-01-01

    Full Text Available The Swedish Royal Academy awarded the 2012 Nobel Prize in Economics to Lloyd Shapley and Alvin Roth for the theory of stable allocations and the practice of market design. These two American researchers worked independently of each other, combining basic theory and empirical investigation. Through their experiments and practical design they generated a flourishing field of research and improved the performance of many markets. Shapley provided the fundamental theoretical contribution to this field of research, whereas Roth, a professor at Harvard University in Boston, developed and extended these theoretical investigations by applying them to the American market for medical doctors. Namely, their research helps explain the market processes at work, for instance, when doctors are assigned to hospitals, students to schools, and human organs for transplant to recipients.

  7. Cognitive radio networks dynamic resource allocation schemes

    CERN Document Server

    Wang, Shaowei

    2014-01-01

    This SpringerBrief presents a survey of dynamic resource allocation schemes in Cognitive Radio (CR) Systems, focusing on the spectral-efficiency and energy-efficiency in wireless networks. It also introduces a variety of dynamic resource allocation schemes for CR networks and provides a concise introduction of the landscape of CR technology. The author covers in detail the dynamic resource allocation problem for the motivations and challenges in CR systems. The Spectral- and Energy-Efficient resource allocation schemes are comprehensively investigated, including new insights into the trade-off

  8. Analytical utility of mass spectral binning in proteomic experiments by SPectral Immonium Ion Detection (SPIID)

    DEFF Research Database (Denmark)

    Kelstrup, Christian D; Freese, Christian; Heck, Albert J R

    2014-01-01

    Unambiguous identification of tandem mass spectra is a cornerstone in mass spectrometry (MS)-based proteomics. As the study of post-translational modifications (PTMs) by shotgun proteomics progresses in depth and coverage, the ability to correctly identify PTM-bearing peptides is essential, increasing the demand for advanced data interpretation. Several PTMs are known to generate unique fragment ions during tandem mass spectrometry (MS/MS), the so-called diagnostic ions, which unequivocally identify that a given mass spectrum relates to a specific PTM. Although such ions hold tremendous ..., formylation and lysine acetylation containing samples. Using the developed software tool we are able to identify known diagnostic ions by comparing histograms of modified and unmodified peptide spectra. Since the investigated tandem mass spectra are acquired with high mass accuracy, unambiguous ...

  9. Evaluation of dose-volume histograms after prostate seed implantation. 4-year experience

    International Nuclear Information System (INIS)

    Hoinkis, C.; Lehmann, D.; Winkler, C.; Herrmann, T.; Hakenberg, O.W.; Wirth, M.P.

    2004-01-01

    Background and purpose: permanent interstitial brachytherapy by seed implantation is a treatment alternative for low-volume low-risk prostate cancer and a complex interdisciplinary treatment with a learning curve. Dose-volume histograms are used to assess postimplant quality. The authors evaluated their learning curve based on dose-volume histograms and analyzed factors influencing implantation quality. Patients and methods: since 1999, 38 patients with a minimum follow-up of 6 months were treated at the authors' institution with seed implantation using palladium-103 or iodine-125, initially using the preplan method and later real-time planning. Postimplant CT was performed after 4 weeks. The dose-volume indices D90, V100, V150, the Dmax of pre- and postplans, and the size and position of the volume receiving the prescribed dose (high-dose volume) of the postplans were evaluated. In six patients, postplan imaging by both CT and MRI was used, and prostate volumes were compared with preimplant transrectal ultrasound volumes. The first five patients were treated under external supervision. Results: patients were divided into three consecutive groups for analysis of the learning curve (group 1: n = 5 patients treated under external supervision; group 2: n = 13 patients; group 3: n = 20 patients). D90,post for the three groups was 79.3%, 74.2%, and 99.9%, and V100,post was 78.6%, 73.5%, and 88.2%, respectively. The relationship between high-dose volume and prostate volume showed a similar increase as the D90, while the relationship between the high-dose volume lying outside the prostate and prostate volume remained constant. The ratio between prostate volumes from transrectal ultrasound and CT imaging decreased with increasing D90,post, while the preplanning D90 and V100 remained constant. The isotope used, the method of planning, and the implanted activity per prostate volume did not influence results. Conclusion: a learning curve characterized by an increase

  10. Registration for Optical Multimodal Remote Sensing Images Based on FAST Detection, Window Selection, and Histogram Specification

    Directory of Open Access Journals (Sweden)

    Xiaoyang Zhao

    2018-04-01

    Full Text Available In recent years, digital frame cameras have been increasingly used for remote sensing applications. However, it is always a challenge to align or register images captured with different cameras or different imaging sensor units. In this research, a novel registration method is proposed. Coarse registration was first applied to approximately align the sensed and reference images. Window selection was then used to reduce the search space, and histogram specification was applied to optimize the grayscale similarity between the images. After comparison with other commonly used detectors, the fast corner detector FAST (Features from Accelerated Segment Test) was selected to extract the feature points. The matching point pairs were then detected between the images, the outliers were eliminated, and geometric transformation was performed. The appropriate window size was determined to be one-tenth of the image width. Images acquired by a two-camera system, a camera with five imaging sensors, and a camera with replaceable filters, mounted on a manned aircraft, an unmanned aerial vehicle, and a ground-based platform, respectively, were used to evaluate the performance of the proposed method. The image analysis results showed that, through appropriate window selection and histogram specification, the number of correctly matched point pairs increased by a factor of 11.30 and the correct matching rate increased by 36%, compared with the results based on FAST alone. The root mean square error (RMSE) in the x and y directions was generally within 0.5 pixels. In comparison with binary robust invariant scalable keypoints (BRISK), curvature scale space (CSS), Harris, speeded-up robust features (SURF), and the commercial software ERDAS and ENVI, this method resulted in larger numbers of correct matching pairs and smaller, more consistent RMSE. Furthermore, it was not necessary to choose any tie control points manually before registration.
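    Of the three stages, histogram specification is the most self-contained. The sketch below matches the grey-level distribution of the sensed image to that of the reference via the standard inverse-CDF lookup; it assumes 8-bit single-band images and is not the authors' exact implementation.

```python
import numpy as np

def histogram_specification(sensed, reference):
    """Remap the grey levels of `sensed` (uint8 array) so that its histogram
    approximates that of `reference`, via inverse-CDF matching."""
    bins = np.arange(257)
    s_hist, _ = np.histogram(sensed, bins=bins)
    r_hist, _ = np.histogram(reference, bins=bins)
    s_cdf = np.cumsum(s_hist) / sensed.size
    r_cdf = np.cumsum(r_hist) / reference.size
    # Look-up table: for each sensed grey level, the reference grey level
    # whose cumulative probability is closest.
    lut = np.interp(s_cdf, r_cdf, np.arange(256)).astype(np.uint8)
    return lut[sensed]
```

    In the pipeline above, such a remapping would be applied inside each selected window before FAST feature extraction and matching.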

  11. Peer-Allocated Instant Response (PAIR): Computational allocation of peer tutors in learning communities

    NARCIS (Netherlands)

    Westera, Wim

    2009-01-01

    Westera, W. (2007). Peer-Allocated Instant Response (PAIR): Computational allocation of peer tutors in learning communities. Journal of Artificial Societies and Social Simulation, http://jasss.soc.surrey.ac.uk/10/2/5.html

  12. Binary Numbers: From School Knowledge to Scientific Knowledge (Dissertation)

    OpenAIRE

    Mendes, Herman do Lago

    2015-01-01

    Binary numbers are currently a necessary and fundamental element in the communication between digital technological artifacts, being used as representations of numbers (sequences of 0s and 1s) in the encoding of characters, images, sounds, and any other kind of information. Starting from this social application afforded by the scientific knowledge of binary numbers, we propose to investigate the locus of binary numbers as school knowledge, knowledge to be taught, and …

  13. An adaptive-binning method for generating constant-uncertainty/constant-significance light curves with Fermi-LAT data

    International Nuclear Information System (INIS)

    Lott, B.; Escande, L.; Larsson, S.; Ballet, J.

    2012-01-01

    Here, we present a method enabling the creation of constant-uncertainty/constant-significance light curves with the data of the Fermi Large Area Telescope (LAT). The adaptive-binning method enables more information to be encapsulated within the light curve than with the fixed-binning method. Although primarily developed for blazar studies, it can be applied to any source. Furthermore, this method allows the starting and ending times of each interval to be calculated in a simple and quick way during a first step. The mean flux and spectral index (assuming the spectrum is a power-law distribution) reported for each interval are calculated via the standard LAT analysis during a second step. The absence of major caveats associated with this method has been established with Monte Carlo simulations. We present the performance of this method in determining duty cycles as well as power-density spectra relative to the traditional fixed-binning method.
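    A counts-only caricature of the first step: since the relative Poisson uncertainty of a bin with N photons scales as 1/sqrt(N), constant-uncertainty bins can be grown until N reaches (1/target)^2. The real method works on LAT fluxes with backgrounds and exposure, so the sketch below (pure event times, no weighting) only illustrates the principle.

```python
import numpy as np

def adaptive_bins(event_times, target_rel_err=0.25):
    """Return bin edges such that each bin holds enough photons for a
    Poisson relative flux uncertainty ~1/sqrt(N) below `target_rel_err`.
    Toy stand-in for the adaptive-binning step that fixes interval
    start/stop times before the full likelihood analysis."""
    n_required = int(np.ceil(1.0 / target_rel_err**2))
    t = np.sort(np.asarray(event_times))
    edges = [t[0]]
    count = 0
    for ti in t:
        count += 1
        if count >= n_required:
            edges.append(ti)
            count = 0
    if edges[-1] < t[-1]:
        edges.append(t[-1])  # last, possibly under-populated bin
    return np.array(edges)

# Usage: a flaring light curve gets short bins during the flare, long ones outside.
rng = np.random.default_rng(2)
quiet = rng.uniform(0, 100, 200)
flare = rng.uniform(45, 50, 400)
print(adaptive_bins(np.concatenate([quiet, flare]), target_rel_err=0.2))
```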

  14. Development of an aerosol microphysical module: Aerosol Two-dimensional bin module for foRmation and Aging Simulation (ATRAS)

    Science.gov (United States)

    Matsui, H.; Koike, M.; Kondo, Y.; Fast, J. D.; Takigawa, M.

    2014-09-01

    Number concentrations, size distributions, and mixing states of aerosols are essential parameters for accurate estimations of aerosol direct and indirect effects. In this study, we develop an aerosol module, designated the Aerosol Two-dimensional bin module for foRmation and Aging Simulation (ATRAS), that can explicitly represent these parameters by considering new particle formation (NPF), black carbon (BC) aging, and secondary organic aerosol (SOA) processes. A two-dimensional bin representation is used for particles with dry diameters from 40 nm to 10 μm to resolve both aerosol sizes (12 bins) and BC mixing states (10 bins) for a total of 120 bins. The particles with diameters between 1 and 40 nm are resolved using an additional eight size bins to calculate NPF. The ATRAS module is implemented in the WRF-Chem model and applied to examine the sensitivity of simulated mass, number, size distributions, and optical and radiative parameters of aerosols to NPF, BC aging, and SOA processes over East Asia during the spring of 2009. The BC absorption enhancement by coating materials is about 50% over East Asia during the spring, and the contribution of SOA processes to the absorption enhancement is estimated to be 10-20% over northern East Asia and 20-35% over southern East Asia. A clear north-south contrast is also found between the impacts of NPF and SOA processes on cloud condensation nuclei (CCN) concentrations: NPF increases CCN concentrations at higher supersaturations (smaller particles) over northern East Asia, whereas SOA increases CCN concentrations at lower supersaturations (larger particles) over southern East Asia. The application of ATRAS in East Asia also shows that the impact of each process on each optical and radiative parameter depends strongly on the process and the parameter in question. The module can be used in the future as a benchmark model to evaluate the accuracy of simpler aerosol models and examine interactions between NPF, BC aging, and SOA processes.

  15. Development of an aerosol microphysical module: Aerosol Two-dimensional bin module for foRmation and Aging Simulation (ATRAS)

    Energy Technology Data Exchange (ETDEWEB)

    Matsui, H.; Koike, Makoto; Kondo, Yutaka; Fast, Jerome D.; Takigawa, M.

    2014-09-30

    Number concentrations, size distributions, and mixing states of aerosols are essential parameters for accurate estimation of aerosol direct and indirect effects. In this study, we developed an aerosol module, designated Aerosol Two-dimensional bin module for foRmation and Aging Simulation (ATRAS), that can represent these parameters explicitly by considering new particle formation (NPF), black carbon (BC) aging, and secondary organic aerosol (SOA) processes. A two-dimensional bin representation is used for particles with dry diameters from 40 nm to 10 µm to resolve both aerosol size (12 bins) and BC mixing state (10 bins) for a total of 120 bins. The particles with diameters from 1 to 40 nm are resolved using an additional 8 size bins to calculate NPF. The ATRAS module was implemented in the WRF-Chem model and applied to examine the sensitivity of simulated mass, number, size distributions, and optical and radiative parameters of aerosols to NPF, BC aging and SOA processes over East Asia during the spring of 2009. BC absorption enhancement by coating materials was about 50% over East Asia during the spring, and the contribution of SOA processes to the absorption enhancement was estimated to be 10-20% over northern East Asia and 20-35% over southern East Asia. A clear north-south contrast was also found between the impacts of NPF and SOA processes on cloud condensation nuclei (CCN) concentrations: NPF increased CCN concentrations at higher supersaturations (smaller particles) over northern East Asia, whereas SOA increased CCN concentrations at lower supersaturations (larger particles) over southern East Asia. Application of ATRAS to East Asia also showed that the impact of each process on each optical and radiative parameter depended strongly on the process and the parameter in question. The module can be used in the future as a benchmark model to evaluate the accuracy of simpler aerosol models and examine interactions between NPF, BC aging, and SOA processes.

  16. Estimating bulk density of compacted grains in storage bins and modifications of Janssen's load equations as affected by bulk density.

    Science.gov (United States)

    Haque, Ekramul

    2013-03-01

    Janssen created a classical theory based on calculus to estimate static vertical and horizontal pressures within beds of bulk corn. Even today, his equations are widely used to calculate the static loadings imposed by granular materials stored in bins. Many standards, such as American Concrete Institute (ACI) 313, American Society of Agricultural and Biological Engineers EP 433, German DIN 1055, the Canadian Farm Building Code (CFBC), the European Code (ENV 1991-4), and Australian Code AS 3774, incorporate Janssen's equations as the standard for static load calculations on bins. One of the main drawbacks of Janssen's equations is the assumption that the bulk density of the stored product remains constant throughout the entire bin. While this is true for all practical purposes in small bins, in modern commercial-size bins the bulk density of grains increases substantially due to compressive and hoop stresses. Overpressure factors are applied to Janssen loadings to account for practical situations such as dynamic loads due to bin filling and emptying, but there are limited theoretical methods available that include the effects of increased bulk density on the grain loadings transmitted to the storage structures. This article develops a mathematical equation relating the specific weight to the location and other variables of the materials and storage. It was found that the bulk density of stored granular materials increases with depth according to a mathematical equation relating the two variables; applying this bulk-density function, Janssen's equations for vertical and horizontal pressures were modified as presented in this article. The validity of this specific weight function was tested using the principles of mathematics. As expected, load calculations based on the modified equations were consistently higher than the Janssen loadings based on noncompacted bulk densities for all grain depths and types, accounting for the effects of increased bulk densities.
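    For concreteness, the classical Janssen solution and a depth-dependent-density variant follow directly from the slice force balance dPv/dz = γ(z) − μkPv/R. The sketch below uses generic parameter values and a made-up linear compaction law; the article's own specific-weight function is not reproduced here.

```python
import numpy as np

def janssen_pressures(depths, gamma, R=4.0, mu=0.36, k=0.5):
    """Classical Janssen vertical pressure for constant bulk density gamma:
    Pv(z) = (gamma * R / (mu * k)) * (1 - exp(-mu * k * z / R)),
    with R the hydraulic radius (m), mu the grain-wall friction coefficient,
    and k the lateral-to-vertical pressure ratio. gamma in kN/m^3 -> Pv in kPa."""
    return (gamma * R / (mu * k)) * (1.0 - np.exp(-mu * k * depths / R))

def janssen_variable_density(depths, gamma_fn, R=4.0, mu=0.36, k=0.5):
    """Euler integration of dPv/dz = gamma(z) - (mu*k/R)*Pv for a
    depth-dependent bulk density gamma(z); a generic sketch of the kind of
    modification the article motivates."""
    pv = np.zeros_like(depths)
    for i in range(1, len(depths)):
        dz = depths[i] - depths[i - 1]
        pv[i] = pv[i - 1] + dz * (gamma_fn(depths[i - 1]) - mu * k / R * pv[i - 1])
    return pv

z = np.linspace(0, 30, 301)  # 30 m deep commercial bin
flat = janssen_pressures(z, gamma=7.5)
compacted = janssen_variable_density(z, lambda d: 7.5 * (1 + 0.004 * d))
print(f"Pv at 30 m: constant density {flat[-1]:.1f} kPa, "
      f"compacting density {compacted[-1]:.1f} kPa")
```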

  17. Modified strip packing heuristics for the rectangular variable-sized bin packing problem

    Directory of Open Access Journals (Sweden)

    FG Ortmann

    2010-06-01

    Full Text Available Two packing problems are considered in this paper, namely the well-known strip packing problem (SPP) and the variable-sized bin packing problem (VSBPP). A total of 252 strip packing heuristics (and variations thereof) from the literature, as well as novel heuristics proposed by the authors, are compared statistically by means of 1170 SPP benchmark instances in order to identify the best heuristics in various classes. A combination of new heuristics with a new sorting method yields the best results. These heuristics are combined with a previous heuristic for the VSBPP by the authors to find good feasible solutions to 1357 VSBPP benchmark instances. To the best knowledge of the authors, this is the largest statistical comparison of algorithms for the SPP and the VSBPP.
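    As a flavour of the heuristic family compared above, here is the classic Next-Fit Decreasing-Height shelf heuristic for the SPP, one of the simplest members of that family (it is not any particular heuristic from the paper's set of 252).

```python
def nfdh_strip_packing(rects, strip_width):
    """Next-Fit Decreasing-Height shelf heuristic for strip packing:
    sort rectangles by height, fill shelves left to right, open a new
    shelf when the current one is full. Returns placements (x, y, w, h)
    and the total strip height used. rects: list of (width, height)."""
    rects = sorted(rects, key=lambda wh: wh[1], reverse=True)
    placements = []
    shelf_y, shelf_h, x = 0.0, rects[0][1], 0.0
    for w, h in rects:
        if w > strip_width:
            raise ValueError("rectangle wider than the strip")
        if x + w > strip_width:           # shelf full: open a new one
            shelf_y += shelf_h
            shelf_h, x = h, 0.0
        placements.append((x, shelf_y, w, h))
        x += w
    return placements, shelf_y + shelf_h

rects = [(4, 3), (3, 3), (5, 2), (2, 2), (6, 1), (2, 1)]
layout, height = nfdh_strip_packing(rects, strip_width=8)
print(f"strip height used: {height}")
```

    NFDH is known to use at most roughly twice the optimal height asymptotically, which is why such shelf heuristics serve as common baselines in comparisons like the one above.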

  18. Face Image Retrieval of Efficient Sparse Code words and Multiple Attribute in Binning Image

    Directory of Open Access Journals (Sweden)

    Suchitra S

    2017-08-01

    Full Text Available ABSTRACT In photography, face recognition and face retrieval play an important role in many applications such as security, criminology and image forensics. Advancements in face recognition make identity matching of an individual with attributes easier. The latest developments in computer vision technologies enable us to extract facial attributes from the input image and provide similar image results. In this paper, we propose a novel LOP and sparse codewords method to provide matching results similar to the input query image. To improve the accuracy of image results for an input image with dynamic facial attributes, the Local Octal Pattern algorithm [LOP] and sparse codewords are applied in both offline and online procedures. The offline and online procedures of the face image binning technique are applied with sparse codewords. Experimental results with the Pubfig dataset show that the proposed LOP along with sparse codewords is able to provide matching results with an increased accuracy of 90%.

  19. Underwater image quality enhancement of sea cucumbers based on improved histogram equalization and wavelet transform

    Directory of Open Access Journals (Sweden)

    Xi Qiao

    2017-09-01

    Full Text Available Sea cucumbers usually live in an environment where lighting and visibility are generally not controllable, which causes the underwater images of sea cucumbers to be distorted, blurred, and severely attenuated. Therefore, the valuable information in such an image cannot be fully extracted for further processing. To solve the problems mentioned above and improve the quality of underwater images of sea cucumbers, pre-processing of sea cucumber images is attracting increasing interest. This paper presents a new method based on contrast-limited adaptive histogram equalization and wavelet transform (CLAHE-WT) to enhance sea cucumber image quality. CLAHE was used to process the underwater image to increase contrast based on the Rayleigh distribution, and WT was used for de-noising based on a soft threshold. Qualitative analysis indicated that the proposed method exhibited better performance in enhancing quality and retaining image details. For quantitative analysis, a test with 120 underwater images showed that for the proposed method, the mean square error (MSE), peak signal-to-noise ratio (PSNR), and entropy were 49.2098, 13.3909, and 6.6815, respectively. The proposed method outperformed three established methods in enhancing the visual quality of sea cucumber underwater grayscale images.
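    A minimal sketch of the CLAHE-WT idea using OpenCV and PyWavelets: contrast-limited adaptive histogram equalization followed by wavelet soft-threshold de-noising with a universal threshold. The clip limit, tile grid, wavelet, and level are illustrative guesses rather than the paper's settings (which include a Rayleigh-distribution variant of CLAHE).

```python
import cv2
import numpy as np
import pywt

def clahe_wt_enhance(gray, clip_limit=2.0, tiles=(8, 8), wavelet="db4", level=2):
    """CLAHE contrast enhancement followed by wavelet soft-threshold
    de-noising; `gray` is an 8-bit single-channel image."""
    clahe = cv2.createCLAHE(clipLimit=clip_limit, tileGridSize=tiles)
    enhanced = clahe.apply(gray).astype(np.float64)

    coeffs = pywt.wavedec2(enhanced, wavelet, level=level)
    # Universal threshold from the noise estimate of the finest diagonal band.
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    thresh = sigma * np.sqrt(2 * np.log(enhanced.size))
    denoised = [coeffs[0]] + [
        tuple(pywt.threshold(band, thresh, mode="soft") for band in detail)
        for detail in coeffs[1:]
    ]
    out = pywt.waverec2(denoised, wavelet)
    return np.clip(out, 0, 255).astype(np.uint8)

# Usage: img = cv2.imread("sea_cucumber.png", cv2.IMREAD_GRAYSCALE)
#        cv2.imwrite("enhanced.png", clahe_wt_enhance(img))
```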

  20. Sliding window adaptive histogram equalization of intraoral radiographs: effect on image quality.

    Science.gov (United States)

    Sund, T; Møystad, A

    2006-05-01

    To investigate whether contrast enhancement by non-interactive, sliding window adaptive histogram equalization (SWAHE) can enhance the image quality of intraoral radiographs in the dental clinic. Three dentists read 22 periapical and 12 bitewing storage phosphor (SP) radiographs. For the periapical readings they graded the quality of the examination with regard to visually locating the root apex. For the bitewing readings they registered all occurrences of approximal caries on a confidence scale. Each reading was first done on an unprocessed radiograph ("single-view"), and then re-done with the image processed with SWAHE displayed beside the unprocessed version ("twin-view"). The processing parameters for SWAHE were the same for all the images. For the periapical examinations, twin-view was judged to raise the image quality for 52% of those cases where the single-view quality was below the maximum. For the bitewing radiographs, there was a change of caries classification (both positive and negative) with twin-view in 19% of the cases, but with only a 3% net increase in the total number of caries registrations. For both examinations interobserver variance was unaffected. Non-interactive SWAHE applied to dental SP radiographs produces a supplemental contrast enhanced image which in twin-view reading improves the image quality of periapical examinations. SWAHE also affects caries diagnosis of bitewing images, and further study using a gold standard is warranted.

  1. Parameters of proteome evolution from histograms of amino-acid sequence identities of paralogous proteins

    Directory of Open Access Journals (Sweden)

    Yan Koon-Kiu

    2007-11-01

    Full Text Available Abstract Background The evolution of the full repertoire of proteins encoded in a given genome is mostly driven by gene duplications, deletions, and sequence modifications of existing proteins. Indirect information about relative rates and other intrinsic parameters of these three basic processes is contained in the proteome-wide distribution of sequence identities of pairs of paralogous proteins. Results We introduce a simple mathematical framework based on a stochastic birth-and-death model that allows one to extract some of this information and apply it to the set of all pairs of paralogous proteins in H. pylori, E. coli, S. cerevisiae, C. elegans, D. melanogaster, and H. sapiens. It was found that the histogram of sequence identities p generated by an all-to-all alignment of all protein sequences encoded in a genome is well fitted with a power-law form ~ p^-γ with the value of the exponent γ around 4 for the majority of organisms used in this study. This implies that the intra-protein variability of substitution rates is best described by the Gamma distribution with the exponent α ≈ 0.33. Different features of the shape of such histograms allow us to quantify the ratio between the genome-wide average deletion/duplication rates and the amino-acid substitution rate. Conclusion We separately measure the short-term ("raw") duplication and deletion rates r*_dup and r*_del …
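    Estimating the exponent γ from a histogram of pairwise identities amounts to a straight-line fit in log-log space. The sketch below recovers γ ≈ 4 from synthetic data drawn from p^-4; the identity range and binning are arbitrary choices, not the paper's protocol.

```python
import numpy as np

def fit_identity_exponent(identities, p_min=0.3, p_max=0.9, n_bins=30):
    """Fit the histogram of pairwise sequence identities p to ~ p**(-gamma)
    by linear regression in log-log space."""
    hist, edges = np.histogram(identities, bins=np.linspace(p_min, p_max, n_bins + 1))
    centres = 0.5 * (edges[:-1] + edges[1:])
    keep = hist > 0
    slope, _ = np.polyfit(np.log(centres[keep]), np.log(hist[keep]), 1)
    return -slope  # gamma

# Synthetic check: sample p with density ~ p**-4 on [0.3, 0.9] by inverse transform.
rng = np.random.default_rng(3)
a, b, g = 0.3, 0.9, 4.0
u = rng.uniform(size=100_000)
p = (a**(1 - g) + u * (b**(1 - g) - a**(1 - g))) ** (1 / (1 - g))
print(f"recovered gamma = {fit_identity_exponent(p):.2f}")  # ~4
```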

  2. Estimating Selectivity for Current Query of Moving Objects Using Index-Based Histogram

    Science.gov (United States)

    Chi, Jeong Hee; Kim, Sang Ho

    Selectivity estimation is one of the query optimization techniques. Previous selectivity estimation techniques for moving objects have difficulty reflecting the location changes of moving objects in the synopsis. Therefore, they produce large errors when estimating selectivity for queries, because they are based on an extended spatial synopsis that does not consider the properties of moving objects. To reduce the estimation error, the existing techniques must rebuild the synopsis frequently. Consequently, a problem occurs: the whole database must be read frequently. In this paper, we propose a moving-object histogram method based on a quadtree to develop a selectivity estimation technique for moving-object queries. We then analyze the performance of the proposed method through its implementation and evaluation. Our method can be used in various location management systems, such as vehicle location tracking systems, location-based services, telematics services, and emergency rescue services, in which the location information of moving objects changes over time.

  3. Classification of amyloid status using machine learning with histograms of oriented 3D gradients

    Directory of Open Access Journals (Sweden)

    Liam Cattell

    2016-01-01

    Full Text Available Brain amyloid burden may be quantitatively assessed from positron emission tomography imaging using standardised uptake value ratios. Using these ratios as an adjunct to visual image assessment has been shown to improve inter-reader reliability, however, the amyloid positivity threshold is dependent on the tracer and specific image regions used to calculate the uptake ratio. To address this problem, we propose a machine learning approach to amyloid status classification, which is independent of tracer and does not require a specific set of regions of interest. Our method extracts feature vectors from amyloid images, which are based on histograms of oriented three-dimensional gradients. We optimised our method on 133 18F-florbetapir brain volumes, and applied it to a separate test set of 131 volumes. Using the same parameter settings, we then applied our method to 209 11C-PiB images and 128 18F-florbetaben images. We compared our method to classification results achieved using two other methods: standardised uptake value ratios and a machine learning method based on voxel intensities. Our method resulted in the largest mean distances between the subjects and the classification boundary, suggesting that it is less likely to make low-confidence classification decisions. Moreover, our method obtained the highest classification accuracy for all three tracers, and consistently achieved above 96% accuracy.

  4. Single-channel blind separation using pseudo-stereo mixture and complex 2-D histogram.

    Science.gov (United States)

    Tengtrairat, N; Gao, Bin; Woo, W L; Dlay, S S

    2013-11-01

    A novel single-channel blind source separation (SCBSS) algorithm is presented. The proposed algorithm yields at least three benefits of the SCBSS solution: 1) resemblance of a stereo signal concept given by one microphone; 2) independent of initialization and a priori knowledge of the sources; and 3) it does not require iterative optimization. The separation process consists of two steps: 1) estimation of source characteristics, where the source signals are modeled by the autoregressive process and 2) construction of masks using only the single-channel mixture. A new pseudo-stereo mixture is formulated by weighting and time-shifting the original single-channel mixture. This creates an artificial mixing system whose parameters will be estimated through our proposed weighted complex 2-D histogram. In this paper, we derive the separability of the proposed mixture model. Conditions required for unique mask construction based on maximum likelihood are also identified. Finally, experimental testing on both synthetic and real-audio sources is conducted to verify that the proposed algorithm yields superior performance and is computationally very fast compared with existing methods.
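    The pseudo-stereo construction itself is a one-liner: the second channel is a weighted, delayed copy of the single-channel mixture, y(t) = α·x(t − d). A minimal sketch follows; α and d are arbitrary illustration values (the paper estimates the artificial mixing parameters via its weighted complex 2-D histogram).

```python
import numpy as np

def pseudo_stereo(x, alpha=0.6, delay=8):
    """Build the artificial second channel by weighting and time-shifting
    the single-channel mixture, y(t) = alpha * x(t - delay), so that the
    pair (x, y) can be treated like a stereo observation."""
    y = np.zeros_like(x)
    y[delay:] = alpha * x[:-delay]
    return np.stack([x, y])

# Usage with a toy noisy mixture:
rng = np.random.default_rng(4)
t = np.arange(16000) / 16000.0
mixture = np.sin(2 * np.pi * 440 * t) + 0.5 * rng.standard_normal(t.size)
stereo = pseudo_stereo(mixture)
print(stereo.shape)  # (2, 16000)
```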

  5. Face Recognition Performance Improvement using a Similarity Score of Feature Vectors based on Probabilistic Histograms

    Directory of Open Access Journals (Sweden)

    SRIKOTE, G.

    2016-08-01

    Full Text Available This paper proposes an improved face recognition algorithm to identify mismatched face pairs in cases of incorrect decisions. The primary feature of this method is to deploy a similarity score with respect to Gaussian components between two previously unseen faces. Unlike conventional classical vector distance measurement, our algorithms also consider the plot of the summation of the similarity index versus the face feature vector distance. A mixture of Gaussian models of labeled faces is also widely applicable to different biometric system parameters. Comparative evaluations show that the efficiency of the proposed algorithm is superior to that of the conventional algorithm by an average accuracy of up to 1.15% and 16.87% when compared with 3x3 Multi-Region Histogram (MRH) direct-bag-of-features and Principal Component Analysis (PCA)-based face recognition systems, respectively. The experimental results show that similarity score consideration is more discriminative for face recognition than feature distance. Experimental results on the Labeled Faces in the Wild (LFW) data set demonstrate that our algorithms are suitable for real probe-to-gallery identification applications in face recognition systems. Moreover, this proposed method can also be applied to other recognition systems, further improving their recognition scores.

  6. Dose Volume Histogram analysis for rectum and urethral reaction of prostate cancer

    International Nuclear Information System (INIS)

    Yanagi, Takeshi; Tsuji, Hiroshi; Kamada, Tadashi; Tsujii, Hirohiko

    2005-01-01

    The aim of this study is to evaluate the clinically relevant parameters for rectal and urethral reactions using DVHs (dose-volume histograms) in carbon ion radiotherapy of prostate cancer. This year, we studied mainly the urinary reaction. 35 patients with prostate cancer were treated with carbon ion beams between June 1995 and December 1997. The applied dose was escalated from 54.0 GyE to 72.0 GyE in a fixed 20 fractions. Clinical urinary and rectal reactions were reviewed using the Radiation Therapy Oncology Group (RTOG) scoring system for acute reactions and the RTOG/European Organization for Research and Treatment of Cancer (EORTC) scoring system for late reactions. As the ROI (region of interest) for the urethral DVH, we used a surrogate derived from the observation of MR images. 35 patients were analyzed for acute urinary reaction and 34 for late urinary reaction in this year's study. DVH analysis suggested differences among the grades for acute and late reactions. This analysis appears to be a useful tool for predicting urinary reactions. (author)

  7. Parallel implementation of the adaptive neighborhood contrast enhancement technique using histogram-based image partitioning

    Science.gov (United States)

    Rangayyan, Rangaraj M.; Alto, Hilary; Gavrilov, Dmitri

    2001-07-01

    An adaptive neighborhood contrast enhancement (ANCE) technique was developed to improve the perceptibility of features in digitized mammographic images for use in breast cancer screening. The computationally intensive algorithm was implemented on a cluster of 30 COMPAQ Alpha processors using the message passing interface. The parallel implementation of the ANCE technique utilizes histogram-based image partitioning, with each partition consisting of a list of gray-level values. The master processor allots one set of gray-level values to each slave processor. Each slave locates all possible seed pixels in the image with the designated gray-level values, grows a region around each pixel, enhances the contrast of the seed and any redundant seed pixels if required, and returns the results to the master. The master then sends a new set of gray-level values to the slave for processing. The subdivision of the original image based on gray-level values guarantees that slave processors do not process the same pixel, and is particularly well suited to the characteristics of the ANCE algorithm. The parallelism value of the problem is in the range of 16-20; the performance did not improve significantly when more than 16 processors were used, and it declined when more than 20 processors were used. The result is a substantial improvement in processing time, leading to the enhancement of 4K x 4K pixel images in the range of 30-90 s.

  8. Brachytherapy dose-volume histogram computations using optimized stratified sampling methods

    International Nuclear Information System (INIS)

    Karouzakis, K.; Lahanas, M.; Milickovic, N.; Giannouli, S.; Baltas, D.; Zamboglou, N.

    2002-01-01

    A stratified sampling method for the efficient repeated computation of dose-volume histograms (DVHs) in brachytherapy is presented, as used for anatomy-based brachytherapy optimization methods. The aim of the method is to reduce the number of sampling points required for the calculation of DVHs for the body and the PTV. Quantities such as the conformity index COIN and COIN integrals are derived from the DVHs. This is achieved by using partially uniform distributed sampling points, with a density in each region obtained from a survey of the gradients or the variance of the dose distribution in these regions. The shape of the sampling regions is adapted to the patient anatomy and the shape and size of the implant. For the application of this method, a single preprocessing step is necessary, which requires only a few seconds. Ten clinical implants were used to study the appropriate number of sampling points, given a required accuracy for quantities such as cumulative DVHs, COIN indices and COIN integrals. We found that DVHs of very large tissue volumes surrounding the PTV, and also COIN distributions, can be obtained using a number of sampling points 5-10 times smaller compared with uniformly distributed points.
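    The idea of variance-driven stratified sampling for DVHs can be sketched as follows: strata with higher dose variance receive proportionally more sampling points, and each sample is re-weighted by its stratum's volume share. This is a generic illustration, not the paper's gradient-survey allocation rule.

```python
import numpy as np

def stratified_dvh(dose, region_labels, n_points, dose_bins):
    """Cumulative DVH from stratified sampling: each stratum gets sampling
    points in proportion to its dose variance (times its volume share), so
    flat regions are sampled sparsely and high-gradient regions densely."""
    rng = np.random.default_rng(0)
    dose, labels = dose.ravel(), region_labels.ravel()
    regions = np.unique(labels)
    var = np.array([dose[labels == r].var() + 1e-12 for r in regions])
    share = np.array([(labels == r).mean() for r in regions])  # volume share
    alloc = np.maximum((n_points * var * share / (var * share).sum()).astype(int), 1)

    values, weights = [], []
    for r, n, s in zip(regions, alloc, share):
        values.append(rng.choice(dose[labels == r], n))
        weights.append(np.full(n, s / n))  # re-weight by volume / points drawn
    values, weights = np.concatenate(values), np.concatenate(weights)
    # Cumulative DVH: fraction of volume receiving at least each dose level.
    return np.array([(weights * (values >= d)).sum() for d in dose_bins])

# Toy usage: steep-gradient shell around a hot core inside a cold background.
z = np.linalg.norm(np.indices((40, 40, 40)) - 20, axis=0)
dose3d = 100.0 * np.exp(-z / 8.0)
labels = (z < 10).astype(int) + (z < 20).astype(int)  # 3 radial strata
print(stratified_dvh(dose3d, labels, 2000, dose_bins=np.arange(0, 101, 10)))
```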

  9. Histogram-Based Thresholding for Detection and Quantification of Hemorrhages in Retinal Images

    Directory of Open Access Journals (Sweden)

    Hussain Fadhel Hamdan Jaafar

    2016-12-01

    Full Text Available Retinal image analysis is commonly used for the detection and quantification of diabetic retinopathy. In retinal images, dark lesions, including hemorrhages and microaneurysms, are the earliest warnings of vision loss. In this paper, a new algorithm for the extraction and quantification of hemorrhages in fundus images is presented. Hemorrhage candidates are extracted in a preliminary step as a coarse segmentation, followed by a fine segmentation step. Local variation processes are applied in the coarse segmentation step to determine the boundaries of all candidates with distinct edges. Fine segmentation processes are based on histogram thresholding to extract real hemorrhages from the segmented candidates locally. The proposed method was trained and tested using a dataset of 153 manually labeled retinal images. At the pixel level, the proposed method could identify abnormal retinal images with 90.7% sensitivity and 85.1% predictive value. Due to its distinctive performance measurements, this technique could be used for computer-aided mass screening of retinal diseases.

  10. An adaptive bin framework search method for a beta-sheet protein homopolymer model

    Directory of Open Access Journals (Sweden)

    Hoos Holger H

    2007-04-01

    Full Text Available Abstract Background The problem of protein structure prediction consists of predicting the functional or native structure of a protein given its linear sequence of amino acids. This problem has played a prominent role in the fields of biomolecular physics and algorithm design for over 50 years. Additionally, its importance increases continually as a result of an exponential growth over time in the number of known protein sequences in contrast to a linear increase in the number of determined structures. Our work focuses on the problem of searching an exponentially large space of possible conformations as efficiently as possible, with the goal of finding a global optimum with respect to a given energy function. This problem plays an important role in the analysis of systems with complex search landscapes, and particularly in the context of ab initio protein structure prediction. Results In this work, we introduce a novel approach for solving this conformation search problem based on the use of a bin framework for adaptively storing and retrieving promising locally optimal solutions. Our approach provides a rich and general framework within which a broad range of adaptive or reactive search strategies can be realized. Here, we introduce adaptive mechanisms for choosing which conformations should be stored, based on the set of conformations already stored in memory, and for biasing choices when retrieving conformations from memory in order to overcome search stagnation. Conclusion We show that our bin framework combined with a widely used optimization method, Monte Carlo search, achieves significantly better performance than state-of-the-art generalized ensemble methods for a well-known protein-like homopolymer model on the face-centered cubic lattice.

  11. Credit allocation for research institutes

    Science.gov (United States)

    Wang, J.-P.; Guo, Q.; Yang, K.; Han, J.-T.; Liu, J.-G.

    2017-05-01

    Assessing the research performance of multiple institutes is a challenging task. Considering that it is unfair to divide credit evenly among institutes that appear in different positions in a paper's institute list, we present a credit allocation method (CAM) with a weighted order coefficient for multiple institutes. The results for the APS dataset, with 18987 institutes, show that the top-ranked institutes obtained by the CAM method correspond to well-known universities and research labs with high reputations in physics. Moreover, we evaluate the performance of the CAM method when citation links are added or rewired randomly, quantified by Kendall's tau and the Jaccard index. The experimental results indicate that the CAM method is more robust than the total number of citations (TC) method and Shen's method. Finally, we give the top 20 Chinese universities in physics obtained by the CAM method; the method is, however, valid for any branch of science, not just physics. The proposed method also provides universities and policy makers with an effective tool to quantify and balance the academic performance of universities.

  12. Time allocation of disabled individuals.

    Science.gov (United States)

    Pagán, Ricardo

    2013-05-01

    Although some studies have analysed the disability phenomenon and its effect on, for example, labour force participation, wages, job satisfaction, or the use of disability pension, the empirical evidence on how disability steals time (e.g. hours of work) from individuals is very scarce. This article examines how disabled individuals allocate their time to daily activities as compared to their non-disabled counterparts. Using time diary information from the Spanish Time Use Survey (last quarter of 2002 and the first three quarters of 2003), we estimate the determinants of time (minutes per day) spent on four aggregate categories (market work, household production, tertiary activities and leisure) for a sample of 27,687 non-disabled and 5250 disabled individuals and decompose the observed time differential by using the Oaxaca-Blinder methodology. The results show that disabled individuals devote less time to market work (especially females), and more time to household production (e.g. cooking, cleaning, child care), tertiary activities (e.g., sleeping, personal care, medical treatment) and leisure activities. We also find a significant effect of age on the time spent on daily activities and important differences by gender and disability status. The results are consistent with the hypothesis that disability steals time, and reiterate the fact that more public policies are needed to balance working life and health concerns among disabled individuals. Copyright © 2013 Elsevier Ltd. All rights reserved.

  13. Resource allocation based on cost efficiency

    DEFF Research Database (Denmark)

    Dehnokhalaji, Akram; Ghiyasi, Mojtaba; Korhonen, Pekka

    2017-01-01

    In this paper, we consider a resource allocation (RA) problem and develop an approach based on cost (overall) efficiency. The aim is to allocate some inputs among decision making units (DMUs) in such a way that their cost efficiencies improve or stay unchanged after RA. We formulate a multi… Examples and an empirical illustration are also provided.

  14. 'Unconscionable and irrational' SAPS human resource allocation ...

    African Journals Online (AJOL)

    These areas also suffer some of the highest rates of murder and serious violent crime in the province. The allocation of human resources to policing impinges on various constitutional rights. Given the inequity and irrationality apparent in the allocation of police personnel, the Khayelitsha Commission recommended that this …

  15. Nash Social Welfare in Multiagent Resource Allocation

    NARCIS (Netherlands)

    S. Ramezani (Sara); U. Endriss; E. David; E.H. Gerding (Enrico); D. Sarne; O. Shehory (Onn)

    2010-01-01

    We study different aspects of the multiagent resource allocation problem when the objective is to find an allocation that maximizes Nash social welfare, the product of the utilities of the individual agents. The Nash solution is an important welfare criterion that combines efficiency and …

  16. Obtaining a Proportional Allocation by Deleting Items

    NARCIS (Netherlands)

    Dorn, B.; de Haan, R.; Schlotter, I.; Röthe, J.

    2017-01-01

    We consider the following control problem on the fair allocation of indivisible goods. Given a set I of items and a set of agents, each having a strict linear preference order over the items, we ask for a minimum subset of the items whose deletion guarantees the existence of a proportional allocation in the …

  17. Risk and reliability allocation to risk control

    International Nuclear Information System (INIS)

    Vojnovic, D.; Kozuh, M.

    1992-01-01

    The risk allocation procedure is used as an analytical model to support optimal decision making for reliability/availability improvement planning. Both levels of decision criteria, the plant risk measures and the plant performance indices, are used in the risk allocation procedure. The decision support system uses the multi-objective decision-making concept. (author)

  18. Optimal Allocation in Stratified Randomized Response Model

    Directory of Open Access Journals (Sweden)

    Javid Shabbir

    2005-07-01

    Full Text Available A Warner (1965) randomized response model based on stratification is used to determine the allocation of samples. Both linear and log-linear cost functions are discussed under single and double stratification. It is observed that by using a log-linear cost function, one can obtain better allocations.

  19. Bounds in the location-allocation problem

    DEFF Research Database (Denmark)

    Juel, Henrik

    1981-01-01

    Develops a family of stronger lower bounds on the objective function value of the location-allocation problem. Solution methods proposed to solve problems in location-allocation; Efforts to develop a more efficient bound solution procedure; Determination of the locations of the sources....

  20. Histogram Analysis of CT Perfusion of Hepatocellular Carcinoma for Predicting Response to Transarterial Radioembolization: Value of Tumor Heterogeneity Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Reiner, Caecilia S., E-mail: caecilia.reiner@usz.ch; Gordic, Sonja; Puippe, Gilbert; Morsbach, Fabian; Wurnig, Moritz [University Hospital Zurich, Institute of Diagnostic and Interventional Radiology (Switzerland); Schaefer, Niklaus; Veit-Haibach, Patrick [University Hospital Zurich, Division of Nuclear Medicine (Switzerland); Pfammatter, Thomas; Alkadhi, Hatem [University Hospital Zurich, Institute of Diagnostic and Interventional Radiology (Switzerland)

    2016-03-15

    Purpose: To evaluate, in patients with hepatocellular carcinoma (HCC), whether assessment of tumor heterogeneity by histogram analysis of computed tomography (CT) perfusion helps predict response to transarterial radioembolization (TARE). Materials and Methods: Sixteen patients (15 male; mean age 65 years; age range 47-80 years) with HCC underwent CT liver perfusion for treatment planning prior to TARE with Yttrium-90 microspheres. Arterial perfusion (AP) derived from CT perfusion was measured in the entire tumor volume, and heterogeneity was analyzed voxel-wise by histogram analysis. Response to TARE was evaluated on follow-up imaging (median follow-up, 129 days) based on modified Response Evaluation Criteria in Solid Tumors (mRECIST). Results of histogram analysis and mean AP values of the tumor were compared between responders and non-responders. Receiver operating characteristics were calculated to determine the parameters' ability to discriminate responders from non-responders. Results: According to mRECIST, 8 patients (50%) were responders and 8 (50%) non-responders. Comparing responders and non-responders, the 50th and 75th percentiles of AP derived from histogram analysis were significantly different (AP 43.8/54.3 vs. 27.6/34.3 mL min⁻¹ 100 mL⁻¹; p < 0.05), while the mean AP of the HCCs (43.5 vs. 27.9 mL min⁻¹ 100 mL⁻¹; p > 0.05) was not. Further heterogeneity parameters from histogram analysis (skewness, coefficient of variation, and 25th percentile) did not differ between responders and non-responders (p > 0.05). If the cut-off for the 75th percentile was set to an AP of 37.5 mL min⁻¹ 100 mL⁻¹, therapy response could be predicted with a sensitivity of 88% (7/8) and a specificity of 75% (6/8). Conclusion: Voxel-wise histogram analysis of pretreatment CT perfusion indicating tumor heterogeneity of HCC improves the pretreatment prediction of response to TARE.
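    The histogram features used above reduce to a few NumPy/SciPy calls on the voxel-wise AP values of the segmented tumour. The 75th-percentile cut-off of 37.5 mL min⁻¹ 100 mL⁻¹ is taken from the abstract; everything else below is a generic sketch.

```python
import numpy as np
from scipy.stats import skew

def perfusion_histogram_features(ap_voxels):
    """Voxel-wise histogram features of arterial perfusion (AP) inside the
    tumour volume: mean, percentiles, skewness and coefficient of variation."""
    ap = np.asarray(ap_voxels, dtype=float)
    return {
        "mean": ap.mean(),
        "p25": np.percentile(ap, 25),
        "p50": np.percentile(ap, 50),
        "p75": np.percentile(ap, 75),
        "skewness": skew(ap),
        "cv": ap.std() / ap.mean(),
    }

def predict_response(ap_voxels, p75_cutoff=37.5):
    """Rule of thumb reported above: predict response when the 75th
    percentile of AP exceeds 37.5 mL/min/100 mL."""
    return perfusion_histogram_features(ap_voxels)["p75"] >= p75_cutoff
```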

  1. Optimal allocation of reviewers for peer feedback

    DEFF Research Database (Denmark)

    Wind, David Kofoed; Jensen, Ulf Aslak; Jørgensen, Rasmus Malthe

    2017-01-01

    Peer feedback is the act of letting students give feedback to each other on submitted work. There are multiple reasons to use peer feedback, including students getting more feedback, time savings for teachers, and increased learning by letting students reflect on work by others. In order for peer feedback to be effective, students should give and receive useful feedback. A key challenge in peer feedback is allocating the feedback givers in a good way. It is important that reviewers are allocated to submissions such that the feedback distribution is fair - meaning that all students receive good … to submissions in such a way that all submissions receive feedback of similar quality, and that we are able to significantly outperform simple random allocation of reviewers. Additionally, we investigate the effect of pre-allocating reviews in comparison to allocating reviewers live during the review process …

  2. Optimal allocation of resources in systems

    International Nuclear Information System (INIS)

    Derman, C.; Lieberman, G.J.; Ross, S.M.

    1975-01-01

    In the design of a new system, or the maintenance of an old system, allocation of resources is of prime consideration. In allocating resources it is often beneficial to develop a solution that yields an optimal value of the system measure of desirability. In the context of the problems considered in this paper the resources to be allocated are components already produced (assembly problems) and money (allocation in the construction or repair of systems). The measure of desirability for system assembly will usually be maximizing the expected number of systems that perform satisfactorily and the measure in the allocation context will be maximizing the system reliability. Results are presented for these two types of general problems in both a sequential (when appropriate) and non-sequential context

  3. Online: a program to display histograms and control monitor processes on the WA62 VAX data acquisition system at CERN

    International Nuclear Information System (INIS)

    Hand, R.P.

    1981-02-01

    ONLINE is a program which can be launched from any terminal on the WA62 experiment's DEC VAX 11/780 computer when the Native mode data acquisition system is running. It is used to display histograms produced by the various experiment monitor processes running under the system and can establish links with such processes to allow the user to issue monitor commands and change internal monitor process parameters. This report describes the criteria used in the design of ONLINE and shows some of the features of the VAX/VMS Operating System which are used to access histograms produced by monitor processes, to establish communications links with monitor processes and to provide the user with an easy to learn system for the examination of online experimental data in a graphical form. Also given, is a brief account of the way monitor processes are structured and how this structure facilitates user-monitor dialogue. (author)

  4. ADC histograms predict response to anti-angiogenic therapy in patients with recurrent high-grade glioma

    Energy Technology Data Exchange (ETDEWEB)

    Nowosielski, Martha; Tinkhauser, Gerd; Stockhammer, Guenther [Innsbruck Medical University, Department of Neurology, Innsbruck (Austria); Recheis, Wolfgang; Schocke, Michael; Gotwald, Thaddaeus [Innsbruck Medical University, Department of Radiology, Innsbruck (Austria); Goebel, Georg [Innsbruck Medical University, Department of Medical Statistics, Informatics and Health Economics, Innsbruck (Austria); Gueler, Oezguer [Innsbruck Medical University, 4D Visualization Laboratory, University Clinic of Oto-, Rhino- and Laryngology, Innsbruck (Austria); Kostron, Herwig [Innsbruck Medical University, Department of Neurosurgery, Innsbruck (Austria); Hutterer, Markus [Innsbruck Medical University, Department of Neurology, Innsbruck (Austria); Paracelsus Medical University Salzburg-Christian Doppler Hospital, Department of Neurology, Salzburg (Austria)

    2011-04-15

    The purpose of this study is to evaluate apparent diffusion coefficient (ADC) maps to distinguish anti-vascular and anti-tumor effects in the course of anti-angiogenic treatment of recurrent high-grade gliomas (rHGG), as compared to standard magnetic resonance imaging (MRI). This retrospective study analyzed ADC maps from diffusion-weighted MRI in 14 rHGG patients during bevacizumab/irinotecan (B/I) therapy. Applying image segmentation, the volumes of contrast-enhanced lesions in T1 sequences and of hyperintense T2 lesions (hT2) were calculated. hT2 were defined as regions of interest (ROI) and registered to the corresponding ADC maps (hT2-ADC). Histograms were calculated from the hT2-ADC ROIs. Thereafter, the histogram asymmetry, termed "skewness", was calculated and compared to progression-free survival (PFS) as defined by the Response Assessment in Neuro-Oncology (RANO) Working Group criteria. At the 8-12 weeks follow-up, seven (50%) patients showed a partial response, three (21.4%) patients were stable, and four (28.6%) patients progressed according to RANO criteria. The hT2-ADC histograms demonstrated statistically significant changes in skewness in relation to PFS at 6 months. Patients with increasing skewness (n = 11) following B/I therapy had significantly shorter PFS than patients with decreasing or stable skewness values (n = 3; median percentage change in skewness 54% versus -3%, p = 0.04). In rHGG patients, the change in ADC histogram skewness may be predictive of treatment response early in the course of anti-angiogenic therapy and more sensitive than treatment assessment based solely on RANO criteria. (orig.)

  5. Comparison of dose length, area, and volume histograms as quantifiers of urethral dose in prostate brachytherapy

    International Nuclear Information System (INIS)

    Butler, Wayne M.; Merrick, Gregory S.; Dorsey, Anthony T.; Hagedorn, Brenda M.

    2000-01-01

    Purpose: To determine the magnitude of the differences between urethral dose-volume, dose-area, and dose-length histograms (DVH, DAH, and DLH, respectively, or DgH generically). Methods and Materials: Six consecutive iodine-125 (125I) patients and 6 consecutive palladium-103 (103Pd) patients implanted via a modified uniform planning approach were evaluated with day-0 computed tomography (CT)-based dosimetry. The urethra was identified by the presence of a urinary catheter and was hand drawn on the CT images with a mean radius of 3.3 ± 0.7 mm. A 0.1-mm calculation matrix was employed for the urethral volume and surface analysis, and urethral dose points were placed at the centroid of the urethra on each 5-mm CT slice. Results: Although individual patient DLHs were step-like, due to the sparseness of the data points, the composite urethral DLH, DAH, and DVHs were qualitatively similar. The DAH curve indicated more radiation than the other two curves at all doses greater than 90% of the prescribed minimum peripheral dose (mPD) to the prostate. In addition, the DVH curve was consistently higher than the DLH curve at most points throughout that range. Differences between the DgH curves were analyzed by integrating the difference curves between 0 and 200% of the mPD. The area-length, area-volume, and volume-length difference curves integrated in the ratio of 3:2:1. The differences were most pronounced near the inflection point of the DgH curves, with mean A125, V125, and L125 values of 36.6%, 31.4%, and 23.0%, respectively, of the urethra. Quantifiers of urethral hot spots such as D10, defined as the minimal dose delivered to the hottest 10% of the urethra, followed the same ranking: area analysis indicated the highest dose and length analysis the lowest. D10 was 148% and 136% of mPD for area and length evaluations, respectively. Comparing the two isotopes in terms of the amount of urethra receiving a given dose, 103Pd implants were significantly …

  6. Identifying Memory Allocation Patterns in HEP Software

    Science.gov (United States)

    Kama, S.; Rauschmayr, N.

    2017-10-01

    HEP applications perform an excessive number of allocations/deallocations within short time intervals, which results in memory churn, poor locality and performance degradation. These issues have been known for a decade, but due to the complexity of software frameworks and the billions of allocations in a single job, until recently no efficient mechanism was available to correlate these issues with source code lines. However, with the advent of the Big Data era, many tools and platforms are now available for large-scale memory profiling. This paper presents a prototype program developed to track and identify every single (de-)allocation. The CERN IT Hadoop cluster is used to compute key memory metrics, such as locality, variation, lifetime and density of allocations. The prototype further provides a web-based visualization back-end that allows the user to explore the results generated on the Hadoop cluster. Plotting these metrics for every single allocation over time gives new insight into an application's memory handling. For instance, it shows which algorithms cause which kinds of memory allocation patterns, which function flows cause how many short-lived objects, what the most commonly allocated sizes are, etc. The paper gives an insight into the prototype and shows profiling examples for LHC reconstruction, digitization and simulation jobs.
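    The paper's prototype instruments C++ HEP frameworks and analyses the records on Hadoop, but the core idea of correlating allocations with source lines can be illustrated with Python's standard tracemalloc module (an analogy, not the prototype itself).

```python
import tracemalloc

def churn(n=100_000):
    # Many small allocations in a tight loop -- the churn pattern at issue.
    return [[i] * 4 for i in range(n)]

tracemalloc.start(10)          # keep up to 10 stack frames per allocation
live = churn()                 # keep a reference so the snapshot sees the blocks
snapshot = tracemalloc.take_snapshot()

# Correlate live allocations with source lines, the core idea of the prototype.
for stat in snapshot.statistics("lineno")[:5]:
    print(stat)  # e.g. "demo.py:5: size=21.4 MiB, count=200001, average=112 B"
```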

  7. Intravoxel incoherent motion (IVIM) histogram biomarkers for prediction of neoadjuvant treatment response in breast cancer patients

    Directory of Open Access Journals (Sweden)

    Gene Y. Cho

    Full Text Available Objective: To examine the prognostic capabilities of intravoxel incoherent motion (IVIM metrics and their ability to predict response to neoadjuvant treatment (NAT. Additionally, to observe changes in IVIM metrics between pre- and post-treatment MRI. Methods: This IRB-approved, HIPAA-compliant retrospective study observed 31 breast cancer patients (32 lesions. Patients underwent standard bilateral breast MRI along with diffusion-weighted imaging before and after NAT. Six patients underwent an additional IVIM-MRI scan 12–14 weeks after initial scan and 2 cycles of treatment. In addition to apparent diffusion coefficients (ADC from monoexponential decay, IVIM mean values (tissue diffusivity Dt, perfusion fraction fp, and pseudodiffusivity Dp and histogram metrics were derived using a biexponential model. An additional filter identified voxels of highly vascular tumor tissue (VTT, excluding necrotic or normal tissue. Clinical data include histology of biopsy and clinical response to treatment through RECIST assessment. Comparisons of treatment response were made using Wilcoxon rank-sum tests. Results: Average, kurtosis, and skewness of pseudodiffusion Dp significantly differentiated RECIST responders from nonresponders. ADC and Dt values generally increased (∼70% and VTT% values generally decreased (∼20% post-treatment. Conclusion: Dp metrics showed prognostic capabilities; slow and heterogeneous pseudodiffusion offer poor prognosis. Baseline ADC/Dt parameters were not significant predictors of response. This work suggests that IVIM mean values and heterogeneity metrics may have prognostic value in the setting of breast cancer NAT. Keywords: Breast cancer, Diffusion weighted MRI, Intravoxel incoherent motion, Neoadjuvant treatment, Response evaluation criteria in solid tumors

  8. Dose evaluation of organs at risk (OAR) cervical cancer using dose volume histogram (DVH) on brachytherapy

    Science.gov (United States)

    Arif Wibowo, R.; Haris, Bambang; Inganatul Islamiyah

    2017-05-01

    Brachytherapy is one way to cure cervical cancer. It works by placing a radioactive source near the tumor. However, some healthy tissues or organs at risk (OAR), such as the bladder and rectum, receive radiation as well. This study aims to evaluate the radiation dose to the bladder and rectum. Twelve total radiation dose datasets for the bladder and rectum were obtained from patients' brachytherapy. The dose to the cervix for all patients was 6 Gy. The two-dimensional calculation of the radiation dose was based on the International Commission on Radiation Units and Measurements (ICRU) reference points (DICRU), while the three-dimensional calculation was derived from the dose-volume histogram (DVH) at a volume of 2 cc (D2cc). The radiation doses to the bladder and rectum from both methods were analysed using an independent t-test. The mean DICRU of the bladder was 4.33730 Gy and its D2cc was 4.78090 Gy. DICRU and D2cc for the bladder did not differ significantly (p = 0.144). The mean DICRU of the rectum was 3.57980 Gy and its D2cc 4.58670 Gy. The mean DICRU of the rectum differed significantly from its D2cc (p = 0.000). The three-dimensional radiation dose to the bladder and rectum was higher than the two-dimensional one, with ratios of 1.10227 for the bladder and 1.28127 for the rectum. The radiation doses to the bladder and rectum were still below the tolerance dose. The two-dimensional calculation of the bladder and rectum dose was lower than the three-dimensional one, which is more accurate because it considers the whole volume of the organs.

  9. Bleeding detection in wireless capsule endoscopy using adaptive colour histogram model and support vector classification

    Science.gov (United States)

    Mackiewicz, Michal W.; Fisher, Mark; Jamieson, Crawford

    2008-03-01

    Wireless Capsule Endoscopy (WCE) is a colour imaging technology that enables detailed examination of the interior of the gastrointestinal tract. A typical WCE examination takes ~8 hours and captures ~40,000 useful images. After the examination, the images are viewed as a video sequence, which generally takes a clinician over an hour to analyse. The manufacturers of the WCE provide certain automatic image analysis functions; for example, Given Imaging offers the Suspected Blood Indicator (SBI) in their Rapid Reader software, which is designed to report the locations in the video of areas of active bleeding. However, this tool has been reported to have insufficient specificity and sensitivity; it therefore does not free the specialist from reviewing the entire footage and has been suggested only as a fast screening tool. In this paper, we propose a method of bleeding detection that in its first stage uses Hue-Saturation-Intensity (HSI) colour histograms to track moving background and bleeding colour distributions over time. This approach addresses the problem caused by drastic changes in blood colour distribution that occur when blood is altered by gastrointestinal fluids, and it allows detection of other red lesions: although these are usually "less red" than fresh bleeding, they can still be detected when the difference between their colour distributions and the background is large enough. In the second stage of our method, we analyse all candidate blood frames by extracting colour (HSI) and texture (LBP) features from the suspicious image regions (obtained in the first stage) and their neighbourhoods, and classifying them with a Support Vector Classifier into Bleeding, Lesion and Normal classes. We show that our algorithm compares favourably with the SBI on a test set of 84 full-length videos.

  10. Assessment of Autonomic Function by Phase Rectification of RR-Interval Histogram Analysis in Chagas Disease

    Directory of Open Access Journals (Sweden)

    Olivassé Nasari Junior

    2015-06-01

    Full Text Available Background: In chronic Chagas disease (ChD), impairment of cardiac autonomic function bears prognostic implications. Phase-rectification of RR-interval series isolates the sympathetic, acceleration phase (AC) and parasympathetic, deceleration phase (DC) influences on cardiac autonomic modulation. Objective: This study investigated heart rate variability (HRV) as a function of RR-interval to assess autonomic function in healthy and ChD subjects. Methods: Control (n = 20) and ChD (n = 20) groups were studied. All underwent a 60-min head-up tilt table test under ECG recording. A histogram of the RR-interval series was calculated, with 100 ms classes ranging from 600-1100 ms. In each class, the mean RR-interval (MNN) and the root-mean-squared difference (RMSNN) of consecutive normal RR-intervals that fell in that class were calculated. The average of all RMSNN values in each class was analyzed as a function of MNN, in the whole series (RMSNNT), and in the AC (RMSNNAC) and DC (RMSNNDC) phases. Slopes of linear regression lines were compared between groups using Student's t-test. Correlation coefficients were tested before comparisons. RMSNN was log-transformed (α < 0.05). Results: The correlation coefficient was significant in all regressions (p < 0.05). In the control group, RMSNNT, RMSNNAC, and RMSNNDC increased linearly with MNN (p < 0.05). In ChD, only RMSNNAC showed a significant increase as a function of MNN, whereas RMSNNT and RMSNNDC did not. Conclusion: HRV increases in proportion to the RR-interval in healthy subjects. This behavior is lost in ChD, particularly in the DC phase, indicating cardiac vagal incompetence.
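
    The class-wise statistic described in Methods can be reproduced in a few lines; this is a sketch on synthetic RR intervals, not the study's code (the AC and DC phases would simply restrict the successive differences to negative or positive values):

        import numpy as np

        rng = np.random.default_rng(1)
        rr = rng.normal(850, 80, 3000)          # hypothetical RR intervals (ms)
        diffs = np.diff(rr)                     # successive differences

        edges = np.arange(600, 1101, 100)       # 100 ms classes, 600-1100 ms
        for lo, hi in zip(edges[:-1], edges[1:]):
            idx = np.where((rr[:-1] >= lo) & (rr[:-1] < hi))[0]
            mnn = rr[idx].mean()                       # mean RR interval (MNN)
            rmsnn = np.sqrt(np.mean(diffs[idx] ** 2))  # RMSNN in this class
            print(f"{lo}-{hi} ms: MNN={mnn:.0f}, RMSNN={rmsnn:.1f}")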

  11. Variance of a potential of mean force obtained using the weighted histogram analysis method.

    Science.gov (United States)

    Cukier, Robert I

    2013-11-27

    A potential of mean force (PMF) that provides the free energy of a thermally driven system along some chosen reaction coordinate (RC) is a useful descriptor of systems characterized by complex, high dimensional potential energy surfaces. Umbrella sampling window simulations use potential energy restraints to provide more uniform sampling along a RC so that potential energy barriers that would otherwise make equilibrium sampling computationally difficult can be overcome. Combining the results from the different biased window trajectories can be accomplished using the Weighted Histogram Analysis Method (WHAM). Here, we provide an analysis of the variance of a PMF along the reaction coordinate. We assume that the potential restraints used for each window lead to Gaussian distributions for the window reaction coordinate densities and that the data sampling in each window is from an equilibrium ensemble sampled so that successive points are statistically independent. Also, we assume that neighbor window densities overlap, as required in WHAM, and that further-than-neighbor window density overlap is negligible. Then, an analytic expression for the variance of the PMF along the reaction coordinate at a desired level of spatial resolution can be generated. The variance separates into a sum over all windows with two kinds of contributions: One from the variance of the biased window density normalized by the total biased window density and the other from the variance of the local (for each window's coordinate range) PMF. Based on the desired spatial resolution of the PMF, the former variance can be minimized relative to that from the latter. The method is applied to a model system that has features of a complex energy landscape evocative of a protein with two conformational states separated by a free energy barrier along a collective reaction coordinate. The variance can be constructed from data that is already available from the WHAM PMF construction.
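
    For orientation, the abstract builds on the standard WHAM estimator, which can be stated compactly as follows (notation assumed here, not taken from the paper: n_i(ξ) are the biased histogram counts in window i, N_i the number of samples, V_i the window restraint, and F_i the window free energy obtained self-consistently):

        \begin{aligned}
        P(\xi) &= \frac{\sum_{i=1}^{W} n_i(\xi)}
                       {\sum_{i=1}^{W} N_i\, e^{-\beta\,[\,V_i(\xi)-F_i\,]}},
        \qquad
        e^{-\beta F_i} = \int \! d\xi\; e^{-\beta V_i(\xi)}\, P(\xi), \\[4pt]
        \mathrm{PMF}(\xi) &= -k_B T \,\ln P(\xi).
        \end{aligned}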

  12. Test Anxiety & Its Relation to Perceived Academic Self-Efficacy among Al Hussein Bin Talal University Students

    Science.gov (United States)

    sa'ad alzboon, Habis

    2016-01-01

    This study aimed to identify the degree of perceived academic self-efficacy and the nature of the relationship between test anxiety and perceived academic self-efficacy among students of Al Hussein Bin Talal University (AHU), and to identify statistically significant differences attributable to gender, college and…

  13. Metagenomic analysis and functional characterization of the biogas microbiome using high throughput shotgun sequencing and a novel binning strategy

    DEFF Research Database (Denmark)

    Campanaro, Stefano; Treu, Laura; Kougias, Panagiotis

    2016-01-01

    dissect the biome involved in anaerobic digestion by means of high-throughput Illumina sequencing (~51 gigabases of sequence data), disclosing nearly one million genes and extracting 106 microbial genomes with a novel strategy combining two binning processes. Microbial phylogeny and putative taxonomy...

  14. What do you do for your own health? / Ly Jagor, Ülle Mihhailova, Ene Pill, Katrin Käbin... [et al.]

    Index Scriptorium Estoniae

    2008-01-01

    The question is answered by Ly Jagor, psychologist at the Pärnu Educational Counselling Centre; Ülle Mihhailova, deputy head of the Maasika kindergarten; Ene Pill, psychologist at the Tallinn Family Centre and the Tähetorn children's centre; Katrin Käbin, leader of the toddlers' activity group at the Tallinn Nõmme Youth Centre; and Silva Kolk, teacher at Tääksi Basic School.

  15. Control of the Oriental Fruit Moth, Grapholita molesta, Using Entomopathogenic Nematodes in Laboratory and Fruit Bin Assays.

    Science.gov (United States)

    Riga, E; Lacey, L A; Guerra, N; Headrick, H L

    2006-03-01

    The oriental fruit moth (OFM), Grapholita molesta (Busck), which is among the most important insect pests of peaches and nectarines, has developed resistance to a wide range of insecticides. We investigated the ability of the entomopathogenic nematodes (EPN) Steinernema carpocapsae (Weiser), S. feltiae (Filipjev), S. riobrave (Cabanillas et al.), and Heterorhabditis marelatus (Liu and Berry) to control OFM under laboratory and fruit bin conditions. At a dosage of 10 infective juveniles (IJ)/cm2 in the laboratory, S. carpocapsae caused 63%, S. feltiae 87.8%, S. riobrave 75.6%, and H. marelatus 67.1% OFM mortality. All four nematode species caused significant OFM larval mortality in comparison to the nontreated controls. Steinernema feltiae was used for the bin assays because it caused higher OFM mortality than the other EPN species tested and because of its ability to locate OFM in cryptic habitats. Diapausing cocooned OFM larvae in miniature fruit bins were susceptible to IJ of S. feltiae in infested corner supports and cardboard strips. Treatment of bins with suspensions of 10 or 25 S. feltiae IJ/ml water with a wetting agent (Silwet L77) resulted in 33.3 to 59% and 77.7 to 81.6% OFM mortality in corner supports and cardboard strips, respectively. This paper presents new information on the use of EPN, specifically S. feltiae, as a nonchemical means of OFM control.

  16. Neoadjuvant chemotherapy with bleomycin, ifosfamide and nedaplatin (NAC-BIN) followed by radiotherapy in locoregionally advanced uterine cervical cancer

    Energy Technology Data Exchange (ETDEWEB)

    Toita, Takafumi; Ogawa, Kazuhiko; Kakinohana, Yasumasa; Adachi, Genki; Nishikuramori, Yukiko; Murayama, Sadayuki [Ryukyus Univ., Nishihara, Okinawa (Japan). School of Medicine

    2000-06-01

    Twelve patients with locoregionally advanced uterine cervical cancer (IIIB: 10, IVA: 2) were treated with neoadjuvant chemotherapy consisting of bleomycin, ifosfamide, and nedaplatin (NAC-BIN) and full-dose radical radiotherapy. NAC-BIN achieved one complete response and seven partial responses, for a response rate of 67%. Hematologic toxicity was the most common side effect: five patients experienced grade ≥3 leukopenia, and three had grade ≥3 anemia. With a mean follow-up of 25 months (range: 9-52 months), nine of 12 patients developed recurrence; eight had pelvic recurrence alone, and one had both pelvic recurrence and distant metastases. The 2-year pelvic control rate, disease-free survival rate (DFS), and absolute survival rate (AS) were 25%, 25%, and 42%, respectively. The 2-year DFS and AS for patients who responded well to NAC-BIN (CR+PR) were 38% and 63%, whereas for those with a poor response (NC) they were 0%. From these results, we consider that preoperative NAC-BIN should not be indicated for patients with unresectable (stage ≥III) uterine cervical cancer, because poor responders must subsequently be treated with definitive radiotherapy and may suffer a poor prognosis. (author)

  17. Case report of intrafamilial variability in autosomal recessive centronuclear myopathy associated to a novel BIN1 stop mutation

    Directory of Open Access Journals (Sweden)

    Kurul Semra

    2010-12-01

    Full Text Available Abstract Centronuclear myopathies (CNM) describe a group of rare muscle diseases typically presenting an abnormal positioning of nuclei in muscle fibers. To date, three genes are known to be associated with a classical CNM phenotype. The X-linked neonatal form (XLCNM) is due to mutations in MTM1 and involves severe and generalized muscle weakness at birth. The autosomal dominant form results from DNM2 mutations and has been described with early childhood and adult onset (ADCNM). Autosomal recessive centronuclear myopathy (ARCNM) is less well characterized and has recently been associated with mutations in BIN1, encoding amphiphysin 2. Here we present the first clinical description of intrafamilial variability in two first-degree cousins with a novel BIN1 stop mutation. In addition to skeletal muscle defects, both patients have mild mental retardation, and the more severely affected male also displays abnormal ventilation and cardiac arrhythmia, thus expanding the phenotypic spectrum of BIN1-related CNM to non-skeletal muscle defects. We provide an up-to-date review of all previous cases with ARCNM and BIN1 mutations.

  18. Odor volatiles associated with microflora in damp ventilated and non-ventilated bin-stored bulk wheat.

    Science.gov (United States)

    Tuma, D; Sinha, R N; Muir, W E; Abramson, D

    1989-05-01

    Western hard red spring wheat, stored at 20 and 25% moisture contents for 10 months during 1985-86, was monitored for biotic and abiotic variables in 10 unheated bins in Winnipeg, Manitoba. The major odor volatiles identified were 3-methyl-1-butanol, 3-octanone and 1-octen-3-ol. The production of these volatiles was associated and correlated with microfloral infection. Ventilation, used for cooling and drying of grain, disrupted microfloral growth patterns and production of volatiles. The highest levels of 3-methyl-1-butanol occurred in 25% moisture content wheat infected with bacteria, Penicillium spp. and Fusarium spp. In non-ventilated (control) bins with 20% moisture content wheat, 3-methyl-1-butanol was correlated with infection by members of the Aspergillus glaucus group and bacteria. In control bins, 1-octen-3-ol production was correlated with infection of wheat of both moisture contents by Penicillium spp. The fungal species, isolated from damp bin-stored wheat and tested for production of odor volatiles on wheat substrate, included Alternaria alternata (Fr.) Keissler, Aspergillus repens (Corda) Saccardo, A. flavus Link ex Fries, A. versicolor (Vuill.) Tiraboschi, Penicillium chrysogenum Thom, P. cyclopium Westling, Fusarium moniliforme Sheldon, F. semitectum (Cooke) Sacc. In the laboratory, fungus-inoculated wheat produced 3-methyl-1-butanol; 3-octanone and 1-octen-3-ol were also produced, but less frequently. Two unidentified bacterial species isolated from damp wheat and inoculated on agar produced 3-methyl-1-butanol.

  19. Internet Addiction and Its Relationship with Self-Efficacy Level among Al-Hussein Bin Talal University Students

    Science.gov (United States)

    Alrekebat, Amjad Farhan

    2016-01-01

    The aim of this study was to identify the Internet addiction and its relationship to self-efficacy level among Al-Hussein Bin Talal University students. The study sample consisted of 300 female and male students, who were selected randomly. The participants completed a questionnaire that consisted of two scales: Internet addiction which was…

  20. Investigating the Role of Global Histogram Equalization Technique for 99mTechnetium-Methylene Diphosphonate Bone Scan Image Enhancement.

    Science.gov (United States)

    Pandey, Anil Kumar; Sharma, Param Dev; Dheer, Pankaj; Parida, Girish Kumar; Goyal, Harish; Patel, Chetan; Bal, Chandrashekhar; Kumar, Rakesh

    2017-01-01

    99mTechnetium-methylene diphosphonate (99mTc-MDP) bone scan images have a limited number of counts per pixel and hence inferior image quality compared to X-rays. Theoretically, the global histogram equalization (GHE) technique can improve the contrast of a given image, though the practical benefits of doing so have gained only limited acceptance. In this study, we investigated the effect of the GHE technique on 99mTc-MDP bone scan images. A set of 89 low-contrast 99mTc-MDP whole-body bone scan images was included in this study. These images were acquired with parallel-hole collimation on a Symbia E gamma camera and then processed with the histogram equalization technique. The image quality of the input and processed images was reviewed by two nuclear medicine physicians on a 5-point scale, where a score of 1 is very poor and 5 is the best image quality. A statistical test was applied to assess the significance of the difference between the mean scores assigned to input and processed images. The technique improves the contrast of the images; however, oversaturation was noticed in the processed images. Student's t-test was applied, and a statistically significant difference between input and processed image quality was found. The global histogram equalization technique in combination with some other postprocessing technique is useful.
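
    A minimal sketch of the GHE remapping the study evaluates: a lookup table built from the normalized cumulative histogram, applied here to a synthetic low-count image (the oversaturation the authors observed is exactly what this unbounded remapping can produce):

        import numpy as np

        def global_histogram_equalization(img, levels=65536):
            # Remap grey levels through the normalized cumulative histogram.
            hist = np.bincount(img.ravel(), minlength=levels)
            cdf = hist.cumsum()
            cdf_min = cdf[cdf > 0][0]                    # first occupied bin
            lut = (cdf - cdf_min) / float(cdf[-1] - cdf_min) * (levels - 1)
            return np.clip(np.round(lut), 0, levels - 1).astype(np.uint16)[img]

        rng = np.random.default_rng(0)
        counts = rng.poisson(40, (256, 256)).astype(np.uint16)  # toy count image
        enhanced = global_histogram_equalization(counts)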

  1. Countering Negative Effects of Terrain Slope on Airborne Laser Scanner Data Using Procrustean Transformation and Histogram Matching

    Directory of Open Access Journals (Sweden)

    Endre Hofstad Hansen

    2017-10-01

    Full Text Available Forest attributes such as tree heights, diameter distribution, volumes, and biomass can be modeled utilizing the relationship between remotely sensed metrics, as predictor variables, and measurements of forest attributes on the ground. The quality of the models relies on the actual relationship between the forest attributes and the remotely sensed metrics. The processing of airborne laser scanning (ALS) point clouds acquired under heterogeneous terrain conditions introduces a distortion of the three-dimensional shape and structure of the ALS data for tree crowns, and thus errors in the derived metrics. In the present study, Procrustean transformation and histogram matching were proposed as means of countering the distortion of the ALS data. The transformations were tested on a dataset consisting of 192 field plots of 250 m2 in size located on a gradient from gentle to steep terrain slopes in western Norway. Regression models with predictor variables derived from (1) Procrustean-transformed and (2) histogram-matched point clouds were compared to models with variables derived from untransformed point clouds. Models for timber volume, basal area, dominant height, Lorey's mean height, basal area weighted mean diameter, and number of stems were assessed. The results indicate that both (1) Procrustean transformation and (2) histogram matching can be used to counter crown distortion in ALS point clouds. Furthermore, both techniques are simple and can easily be implemented in the traditional processing chain of ALS metrics extraction.
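
    Histogram matching itself is a generic quantile-mapping operation; the sketch below assumes one simply maps slope-plot values onto the distribution of reference values (the paper's actual pipeline operates on ALS point-cloud metrics, and the data here are synthetic):

        import numpy as np

        def histogram_match(source, reference):
            # Map 'source' values so their distribution matches 'reference'.
            src_sorted = np.sort(source)
            ref_sorted = np.sort(reference)
            q = (np.searchsorted(src_sorted, source) + 0.5) / source.size
            return np.interp(q, np.linspace(0, 1, ref_sorted.size), ref_sorted)

        rng = np.random.default_rng(0)
        slope_heights = rng.gamma(4.0, 2.5, 1000)   # hypothetical canopy heights
        flat_heights = rng.gamma(5.0, 2.0, 1500)    # reference distribution
        matched = histogram_match(slope_heights, flat_heights)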

  2. A Reversible Data Hiding Scheme for 3D Polygonal Models Based on Histogram Shifting with High Embedding Capacity

    Science.gov (United States)

    Huang, Yao-Hsien; Tsai, Yuan-Yu

    2015-06-01

    Reversibility is the ability to recover the cover media from the stego media without any error after correctly extracting the secret message. This study proposes a reversible data hiding scheme for 3D polygonal models based on histogram shifting. Specifically, the histogram construction is based on the geometric similarity between neighboring vertices: the distances from neighboring vertices to a common reference point in 3D space are usually similar, especially for a high-resolution 3D model, so the difference between these distances is small with high probability. This study uses a modified breadth-first search to traverse each vertex once in sequential order and determine a unique referencing neighbor for each vertex. The histogram is then constructed from the normalized distance differences of neighboring vertices. This approach significantly increases embedding capacity. Experimental results show that the proposed algorithm achieves higher embedding capacity than existing algorithms while still maintaining acceptable model distortion. The algorithm also provides greater robustness against similarity transformation attacks and vertex reordering attacks. The proposed technique is feasible for 3D reversible data hiding.
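
    The underlying histogram-shifting step is classical; the sketch below embeds a few bits into a peaked sequence of small integers standing in for the paper's normalized distance differences. True reversibility additionally requires the chosen zero bin to be empty and the (peak, zero) pair to be available at extraction:

        import numpy as np

        def hs_embed(values, bits):
            hist = np.bincount(values)
            peak = int(hist.argmax())    # most populated bin carries the payload
            zero = int(hist.argmin())    # rarest (ideally empty) bin
            assert zero > peak, "sketch assumes the zero bin lies right of the peak"
            out = values.copy()
            out[(values > peak) & (values < zero)] += 1   # open a gap at peak+1
            carriers = np.where(values == peak)[0][:len(bits)]
            out[carriers] += np.asarray(bits, dtype=out.dtype)  # 0->peak, 1->peak+1
            return out, peak, zero

        rng = np.random.default_rng(0)
        diffs = rng.poisson(2, 500)      # peaked, small non-negative integers
        stego, peak, zero = hs_embed(diffs, [1, 0, 1, 1, 0])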

  3. Novel Variants of a Histogram Shift-Based Reversible Watermarking Technique for Medical Images to Improve Hiding Capacity

    Directory of Open Access Journals (Sweden)

    Vishakha Kelkar

    2017-01-01

    Full Text Available In telemedicine systems, critical medical data is shared on a public communication channel. This increases the risk of unauthorised access to patient’s information. This underlines the importance of secrecy and authentication for the medical data. This paper presents two innovative variations of classical histogram shift methods to increase the hiding capacity. The first technique divides the image into nonoverlapping blocks and embeds the watermark individually using the histogram method. The second method separates the region of interest and embeds the watermark only in the region of noninterest. This approach preserves the medical information intact. This method finds its use in critical medical cases. The high PSNR (above 45 dB) obtained for both techniques indicates imperceptibility of the approaches. Experimental results illustrate superiority of the proposed approaches when compared with other methods based on histogram shifting techniques. These techniques improve embedding capacity by 5–15% depending on the image type, without affecting the quality of the watermarked image. Both techniques also enable lossless reconstruction of the watermark and the host medical image. A higher embedding capacity makes the proposed approaches attractive for medical image watermarking applications without compromising the quality of the image.

  4. RelMon: A general approach to QA, validation and physics analysis through comparison of large sets of histograms

    CERN Document Server

    Piparo, Danilo

    2012-01-01

    The estimation of the compatibility of large amounts of histogram pairs is a recurrent problem in high energy physics. The issue is common to several different areas, from software quality monitoring to data certification, preservation and analysis. Given two sets of histograms, it is very important to be able to scrutinize the outcome of several goodness of fit tests, obtain a clear answer about the overall compatibility, easily spot the single anomalies and directly access the concerned histogram pairs. This procedure must be automated in order to reduce the human workload, therefore improving the process of identification of differences which is usually carried out by a trained human mind. Some solutions to this problem have been proposed, but they are experiment specific. RelMon depends only on ROOT and offers several goodness of fit tests (e.g. chi-squared or Kolmogorov-Smirnov). It produces highly readable web reports, in which aggregations of the comparisons rankings are available as well as all the pl...
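
    As a rough sketch of the kind of per-pair test RelMon automates, here is a chi-squared comparison of two equally binned histograms, rescaling the expected counts so the totals match as scipy requires (a Kolmogorov-Smirnov variant would compare the cumulative distributions instead); the counts are invented:

        import numpy as np
        from scipy.stats import chisquare

        # Two hypothetical histograms of the same quantity, e.g. produced by
        # a reference and a new software release.
        ref  = np.array([120., 340., 560., 410., 180., 60.])
        test = np.array([115., 350., 545., 430., 170., 65.])

        stat, pvalue = chisquare(test, f_exp=ref * test.sum() / ref.sum())
        print(f"chi2 = {stat:.2f}, p = {pvalue:.3f}")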

   5. A 222 energy bins response matrix for a 6LiI scintillator Bss system

    International Nuclear Information System (INIS)

    Lacerda, M. A. S.; Vega C, H. R.; Mendez V, R.; Lorente F, A.; Ibanez F, S.; Gallego D, E.

    2016-10-01

    A new response matrix was calculated for a Bonner Sphere Spectrometer (Bss) with a 6LiI(Eu) scintillator. We utilized the Monte Carlo N-Particle radiation transport code MCNPX, version 2.7.0, with the Endf/B-VII.0 nuclear data library to calculate the responses for 6 spheres and the bare detector, for energies varying from 9.441E-10 MeV to 105.9 MeV, with 20 equal-log(E)-width bins per energy decade, totaling 222 energy groups. A Bss like the one modeled in this work was utilized to measure the neutron spectrum generated by the 241AmBe source of the Universidad Politecnica de Madrid. From the count rates obtained with this Bss system we unfolded the neutron spectrum utilizing the BUNKIUT code for 31 energy bins (UTA-4 response matrix) and the MAXED code with the newly calculated response functions. We compared the spectra obtained with these Bss system / unfolding codes with that obtained from measurements performed with a Bss system constituted of 12 spheres with a spherical 3He Sp-9 counter (Centronic Ltd., UK) and the MAXED code with the system-specific response functions (Bss-CIEMAT). A relatively good agreement was observed between our response matrix and those calculated by other authors. In general, we observed an improvement in the agreement as the energy increases. However, higher discrepancies were observed for energies close to 1E-8 MeV and, mainly, for energies above 20 MeV. These discrepancies were mainly attributed to the differences in the cross-section libraries employed. The ambient dose equivalent H*(10) calculated with 6LiI-MAXED showed good agreement with values measured with the Berthold LB 6411 neutron area monitor and was within 12% of the value obtained with another Bss system (Bss-CIEMAT). The response matrix calculated in this work can be utilized together with the MAXED code to generate neutron spectra with good energy resolution up to 20 MeV. Some additional tests are being done to validate this response matrix and improve the results for energies
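
    The 222-group structure can be reproduced directly from the stated limits; this sketch assumes the quoted energies are the extreme bin edges (about 11.05 decades at 20 equal-log-width bins per decade gives the 222 groups):

        import numpy as np

        e_min, e_max, groups = 9.441e-10, 105.9, 222        # MeV
        edges = np.logspace(np.log10(e_min), np.log10(e_max), groups + 1)

        bins_per_decade = groups / (np.log10(e_max) - np.log10(e_min))
        print(f"{bins_per_decade:.1f} bins per decade")     # ~20.1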

   6. A 222 energy bins response matrix for a 6LiI scintillator Bss system

    Energy Technology Data Exchange (ETDEWEB)

    Lacerda, M. A. S. [Centro de Desenvolvimento da Tecnologia Nuclear, Laboratorio de Calibracao de Dosimetros, Av. Pte. Antonio Carlos 6627, 31270-901 Pampulha, Belo Horizonte, Minas Gerais (Brazil); Vega C, H. R. [Universidad Autonoma de Zacatecas, Unidad Academica de Estudios Nucleares, Cipres No. 10, Fracc. La Penuela, 98068 Zacatecas, Zac. (Mexico); Mendez V, R. [Centro de Investigaciones Energeticas, Medioambientales y Tecnologicas, Laboratorio de Patrones Neutronicos, Av. Complutense 22, 28040 Madrid (Spain); Lorente F, A.; Ibanez F, S.; Gallego D, E., E-mail: masl@cdtn.br [Universidad Politecnica de Madrid, Departamento de Ingenieria Nuclear, 28006 Madrid (Spain)

    2016-10-15

    A new response matrix was calculated for a Bonner Sphere Spectrometer (Bss) with a 6LiI(Eu) scintillator. We utilized the Monte Carlo N-Particle radiation transport code MCNPX, version 2.7.0, with the Endf/B-VII.0 nuclear data library to calculate the responses for 6 spheres and the bare detector, for energies varying from 9.441E-10 MeV to 105.9 MeV, with 20 equal-log(E)-width bins per energy decade, totaling 222 energy groups. A Bss like the one modeled in this work was utilized to measure the neutron spectrum generated by the 241AmBe source of the Universidad Politecnica de Madrid. From the count rates obtained with this Bss system we unfolded the neutron spectrum utilizing the BUNKIUT code for 31 energy bins (UTA-4 response matrix) and the MAXED code with the newly calculated response functions. We compared the spectra obtained with these Bss system / unfolding codes with that obtained from measurements performed with a Bss system constituted of 12 spheres with a spherical 3He Sp-9 counter (Centronic Ltd., UK) and the MAXED code with the system-specific response functions (Bss-CIEMAT). A relatively good agreement was observed between our response matrix and those calculated by other authors. In general, we observed an improvement in the agreement as the energy increases. However, higher discrepancies were observed for energies close to 1E-8 MeV and, mainly, for energies above 20 MeV. These discrepancies were mainly attributed to the differences in the cross-section libraries employed. The ambient dose equivalent H*(10) calculated with 6LiI-MAXED showed good agreement with values measured with the Berthold LB 6411 neutron area monitor and was within 12% of the value obtained with another Bss system (Bss-CIEMAT). The response matrix calculated in this work can be utilized together with the MAXED code to generate neutron spectra with good energy resolution up to 20 MeV. Some additional tests are being done to validate this response matrix and improve the

   7. Improvements to the Wilson-Devinney code for eclipsing binaries

    Science.gov (United States)

    Vieira, L. A.; Vaz, L. P. R.

    2003-08-01

    The analysis of light curves and radial velocities of eclipsing binary systems can be carried out with several models, one of which is the Wilson-Devinney (WD) model. Over the years, this model has undergone several changes in its main codes in order to make it more consistent both physically and numerically. The WD model has been improved in several ways in its two codes: one for predicting the theoretical light and radial-velocity curves, and another for solving these curves. On the physics side, we introduced the possibility of taking apsidal-motion effects into account. On the numerical side, we introduced the possibility of using the SIMPLEX method in the solution procedure as an alternative to the already implemented Least Squares method. These modifications, together with others previously introduced by our group, make the code more efficient in solving the light and radial-velocity curves of eclipsing binaries. Since the model has been used to analyse systems with pre-main-sequence components (TY CrA, Casey et al. 1998, Vaz et al. 1998; SM 790, Stassun et al. 2003), this improvement will benefit those cases as well. We present the results obtained with the modified WD code using data for the star GL Carinae, showing (1) that the orbital parameters we calculated are consistent with those previously obtained in the literature (Giménez & Clausen, 1986) and with those obtained by Faria (1987), and (2) that the implementation of the SIMPLEX method makes the code slower but fully internally consistent, avoiding the problems generated by the use of the Least Squares method, such as imprecision in the calculation of the partial derivatives and convergence to local minima.
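
    A toy illustration of the numerical change described above: swapping a derivative-based fit for the derivative-free Nelder-Mead simplex via scipy.optimize. The "light curve" model and data below are invented placeholders, not the WD code:

        import numpy as np
        from scipy.optimize import minimize

        def model_flux(phase, params):
            # Hypothetical eclipse profile: a Gaussian dip in the flux.
            depth, width, offset = params
            return offset - depth * np.exp(-0.5 * (phase / width) ** 2)

        def chi2(params, phase, flux, sigma):
            return np.sum(((flux - model_flux(phase, params)) / sigma) ** 2)

        rng = np.random.default_rng(0)
        phase = np.linspace(-0.1, 0.1, 200)
        flux = model_flux(phase, (0.3, 0.02, 1.0)) + rng.normal(0, 0.01, 200)

        # The simplex search needs no partial derivatives, avoiding the
        # imprecision the authors report for the least-squares approach.
        fit = minimize(chi2, x0=(0.1, 0.05, 0.9), args=(phase, flux, 0.01),
                       method="Nelder-Mead")
        print(fit.x)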

   8. Pengaruh Pola MACD Histogram IHSG Terhadap Pola MACD Histogram Perusahaan Dari Daftar Indeks LQ45 (Periode Februari s.d Juli 2015 Bursa Efek Jakarta) [Effect of IHSG MACD Histogram Patterns on the MACD Histogram Patterns of Companies Listed in the LQ-45 Index of the Jakarta Stock Exchange (Period February to July 2015)]

    Directory of Open Access Journals (Sweden)

    Heri Fatkhurrokhim

    2015-09-01

    Full Text Available The purpose of this study is to determine the effect of the IHSG MACD histogram pattern on the MACD patterns of companies in the LQ45 index in the period of February to July 2015 on the Indonesia Stock Exchange. This study also aims to help investors make investment decisions in the stock market. It offers benefits to capital market investors, especially stock investors on the Indonesia Stock Exchange, as a means of enhancing their insight into technical analysis for investing, and it shows the general public that investing in the stock market differs from gambling, since there are analyses that are easy to apply. In addition, this research aims to strengthen the reader's interest in investing in the stock market. The samples were the closing prices of the IHSG and of LQ45 shares in the period of February to July 2015 on the Indonesia Stock Exchange. Based on the hypothesis testing, the IHSG MACD histogram has a significant effect on 38 issuers listed in the LQ45, while the differences between the MACD histogram effects on individual companies are very small. Among the 38 stocks, the top three were Summarecon Agung Tbk at 98.3348%, Alam Sutera Realty Tbk at 98.2376%, and Adhi Karya (Persero) Tbk at 98.1320%. These three shares have MACD histogram movements that closely track the IHSG MACD histogram.
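
    The MACD histogram used throughout the study follows the standard 12/26/9 construction; here is a sketch on synthetic closing prices (the correlation at the end is one plausible way to quantify how closely a constituent's pattern tracks the index, not necessarily the authors' exact measure):

        import numpy as np
        import pandas as pd

        def macd_histogram(close: pd.Series) -> pd.Series:
            # MACD = EMA12 - EMA26; histogram = MACD minus its 9-period EMA.
            macd = (close.ewm(span=12, adjust=False).mean()
                    - close.ewm(span=26, adjust=False).mean())
            return macd - macd.ewm(span=9, adjust=False).mean()

        rng = np.random.default_rng(0)
        ihsg = pd.Series(5000 + rng.normal(0, 25, 120).cumsum())   # toy closes
        stock = pd.Series(1500 + rng.normal(0, 12, 120).cumsum())

        print(macd_histogram(ihsg).corr(macd_histogram(stock)))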

  9. The Effectiveness of the Curriculum Biography of the Prophet in the Development of Social Intelligence Skills of Al-Hussein Bin Talal University Students

    Science.gov (United States)

    Al-Khateeb, Omar; Alrub, Mohammad Abo

    2015-01-01

    This study aimed to identify the effectiveness of the curriculum on the biography of the Prophet in developing the social intelligence skills of Al-Hussein Bin Talal University students. The study sample consisted of 365 students from Al-Hussein Bin Talal University in the first semester of 2014-2015, selected by convenience sampling.…

  10. Spatial Allocator for air quality modeling

    Science.gov (United States)

    The Spatial Allocator is a set of tools that helps users manipulate and generate data files related to emissions and air quality modeling without requiring the use of a commercial Geographic Information System.

  11. U.S. Army Recruiter Allocation Model

    National Research Council Canada - National Science Library

    Brence, John

    2004-01-01

Our methodology will build on both the new and old schools of recruiting by conducting stakeholder interviews that will lead us to a model that is an efficient starting point for the Recruiter Mission Allocation (RMA...

  12. Worst-case analysis of heap allocations

    DEFF Research Database (Denmark)

    Puffitsch, Wolfgang; Huber, Benedikt; Schoeberl, Martin

    2010-01-01

    In object oriented languages, dynamic memory allocation is a fundamental concept. When using such a language in hard real-time systems, it becomes important to bound both the worst-case execution time and the worst-case memory consumption. In this paper, we present an analysis to determine the worst-case heap allocations of tasks. The analysis builds upon techniques that are well established for worst-case execution time analysis. The difference is that the cost function is not the execution time of instructions in clock cycles, but the allocation in bytes. In contrast to worst-case execution time analysis, worst-case heap allocation analysis is not processor dependent. However, the cost function depends on the object layout of the runtime system. The analysis is evaluated with several real-time benchmarks to establish the usefulness of the analysis, and to compare the memory consumption...
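
    A toy rendering of the analysis idea: with loop bounds resolved, the control-flow graph is acyclic and the worst-case allocation is a longest-path computation in which the cost is bytes rather than cycles. The graph below is a made-up example, not the paper's tool:

        from functools import lru_cache

        # Hypothetical CFG: node -> (bytes allocated in the block, successors).
        CFG = {
            "entry": (0,   ["a", "b"]),
            "a":     (64,  ["join"]),    # e.g. allocates one 64-byte object
            "b":     (256, ["join"]),    # e.g. allocates a small array
            "join":  (16,  []),
        }

        @lru_cache(maxsize=None)
        def worst_case_alloc(node):
            cost, succs = CFG[node]
            return cost + max((worst_case_alloc(s) for s in succs), default=0)

        print(worst_case_alloc("entry"))   # -> 272 bytes on the worst path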

  13. Resource allocation criteria in a hospital

    OpenAIRE

    Bodina, A.; Pavan, A.; Castaldi, S.

    2017-01-01

    Summary Introduction. Allocating fixed resources among competing users is a challenge in hospital management when seeking the best performance with respect to strategic objectives. To address this need, an evaluation system was designed in a large research and teaching hospital. This study describes resource allocation criteria in a hospital, focusing on the evaluation system and the methodology developed for its application. Methods. The indicator system allows the strategic ...

  14. Cost allocation. Combined heat and power production

    International Nuclear Information System (INIS)

    Sidzikauskas, V.

    2002-01-01

    The benefits of Combined Heat and Power (CHP) generation are discussed. These include an improvement in energy intensity of 1% by 2010 and an efficiency of 85-90% versus 40-50% for condensing power, among others. The share of CHP in the electricity production of ERRA countries is presented. Solutions for developing CHP cost allocation are considered, and conclusions are presented on CHP production cost allocation. (R.P.)

  15. Macroeconomic Influences on Optimal Asset Allocation

    OpenAIRE

    Flavin, Thomas; Wickens, Michael R.

    2002-01-01

    We develop a tactical asset allocation strategy that incorporates the effects of macroeconomic variables. The joint distribution of financial asset returns and the macroeconomic variables is modelled using a VAR with a multivariate GARCH (M-GARCH) error structure. As a result, the portfolio frontier is time varying and subject to contagion from the macroeconomic variable. Optimal asset allocation requires that this be taken into account. We illustrate how to do this using three ri...

  16. Optimality versus stability in water resource allocation.

    Science.gov (United States)

    Read, Laura; Madani, Kaveh; Inanloo, Bahareh

    2014-01-15

    Water allocation is a growing concern in a developing world where limited resources like fresh water are in greater demand by more parties. Negotiations over allocations often involve multiple groups with disparate social, economic, and political status and needs, who are seeking a management solution for a wide range of demands. Optimization techniques for identifying the Pareto-optimal (social planner) solution to multi-criteria, multi-participant problems are commonly implemented, although reaching agreement on this solution is often difficult. In negotiations with multiple decision-makers, parties who base decisions on individual rationality may find the social planner solution unfair, creating a need to evaluate the willingness to cooperate and the practicality of a cooperative allocation solution, i.e., the solution's stability. This paper suggests seeking solutions for multi-participant resource allocation problems through an economics-based power index allocation method, which can identify allocation schemes that quantify a party's willingness to participate in a negotiation rather than opt for no agreement. Through comparison of the suggested method with a range of distance-based multi-criteria decision making rules, namely least squares, MAXIMIN, MINIMAX, and compromise programming, this paper shows that optimality and stability can produce different allocation solutions. The mismatch between the socially optimal alternative and the most stable alternative can potentially result in parties leaving the negotiation if they are too dissatisfied with their resource share. This finding has important policy implications, as it explains why stakeholders may not accept the socially optimal solution in practice, and it underlines the necessity of considering stability, where it may be more appropriate to give up an unstable Pareto-optimal solution for an inferior stable one. The authors suggest assessing the stability of an allocation solution as an
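
    Two of the distance-based rules the paper compares can be sketched in a few lines on a hypothetical payoff matrix (rows are allocation alternatives, columns are each party's normalized satisfaction; the numbers are illustrative only):

        import numpy as np

        payoff = np.array([
            [0.9, 0.4, 0.6],
            [0.7, 0.7, 0.7],
            [0.5, 0.9, 0.5],
        ])

        ideal = payoff.max(axis=0)                   # per-party ideal point
        least_squares = ((payoff - ideal) ** 2).sum(axis=1)
        maximin = payoff.min(axis=1)                 # welfare of worst-off party

        print("least-squares choice:", least_squares.argmin())
        print("MAXIMIN choice:", maximin.argmax())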

  17. Discrete Tolerance Allocation for Product Families

    OpenAIRE

    2011-01-01

    Abstract This paper extends earlier research on the discrete tolerance allocation problem in order to optimize an entire product family simultaneously. The methodology enables a top-down tolerancing approach in which assembly-level requirements on products within a family are allocated to single-part requirements. The proposed solution has been implemented as an interface with an optimization algorithm coupled with variation simulation software. The paper also consists of an exten...

  18. Revenue Allocation and Economic Development in Nigeria

    Directory of Open Access Journals (Sweden)

    Dagwom Yohanna Dang

    2013-09-01

    Full Text Available This study empirically examines the impact of revenue allocation on economic development in Nigeria. Specifically, the study looks at how the various revenue allocations to the three tiers of government affect real gross domestic product (RGDP) in Nigeria, using time series data for the period 1993 to 2012. An error correction model (ECM) and the Pairwise Granger Causality test are used in analyzing the data. The study tests the stationarity of the variables using the Augmented Dickey-Fuller unit root test and the long-run relationship among the variables using the Johansen cointegration test. The study's findings show that revenue allocations have a significant causal relationship with economic development in Nigeria, with only the revenue allocation to states having a significant negative relationship. Unidirectional causality runs from revenue allocations to real GDP in Nigeria. All variables of the study are cointegrated and have a long-run relationship, with 87.62% of the short-run disequilibrium corrected yearly. The study recommends, among other things, that more financial control and value-for-money audits be carried out to minimize waste and corruption in the states of the federation, so as to change the direction of influence of states' revenue allocation on economic development.
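
    A sketch of the econometric screening steps named above (ADF unit-root test, Johansen cointegration) using statsmodels on synthetic series; the column names and data are hypothetical stand-ins for the 1993-2012 allocation and RGDP series:

        import numpy as np
        import pandas as pd
        from statsmodels.tsa.stattools import adfuller
        from statsmodels.tsa.vector_ar.vecm import coint_johansen

        rng = np.random.default_rng(0)
        trend = rng.normal(0.5, 1.0, 20).cumsum()    # shared stochastic trend
        df = pd.DataFrame({
            "rgdp": trend + rng.normal(0, 0.3, 20),
            "fed_alloc": 0.8 * trend + rng.normal(0, 0.3, 20),
            "state_alloc": 0.5 * trend + rng.normal(0, 0.3, 20),
        }, index=range(1993, 2013))

        for col in df:                               # unit-root screening
            stat, pvalue = adfuller(df[col])[:2]
            print(f"{col}: ADF = {stat:.2f}, p = {pvalue:.3f}")

        johansen = coint_johansen(df, det_order=0, k_ar_diff=1)
        print(johansen.lr1)                          # trace statistics
        print(johansen.cvt)                          # 90/95/99% critical values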

  19. Michael Jackson, Bin Laden and I: functions of positive and negative, public and private flashbulb memories.

    Science.gov (United States)

    Demiray, Burcu; Freund, Alexandra M

    2015-01-01

    This study examined the perceived psychosocial functions of flashbulb memories, comparing positive and negative public flashbulb memories (positive: Bin Laden's death; negative: Michael Jackson's death) with private ones (positive: pregnancy; negative: death of a loved one). A sample of n = 389 young and n = 176 middle-aged adults answered canonical category questions used to identify flashbulb memories and rated the personal significance, the psychological temporal distance, and the functions of each memory (i.e., self-continuity, social-bonding, and directive functions). Hierarchical regressions showed that, in general, private memories were rated as more functional than public memories. Positive and negative private memories were comparable in self-continuity and directionality, but the positive private memory more strongly served social functions. In line with the positivity bias in autobiographical memory, positive flashbulb memories felt psychologically closer than negative ones. Finally, middle-aged adults rated their memories as less functional regarding self-continuity and social-bonding than young adults. Results are discussed with regard to the tripartite model of autobiographical memory functions.

  20. [Transformation of sainfoin by Agrobacterium rhizogenes LBA9402 Bin19 and regeneration of transgenic plants].

    Science.gov (United States)

    Xu, Z Q; Ma, H J; Hao, J G; Jia, J F

    2000-03-01

    Hypocotyl segments of Onobrychis viciaefolia were transformed by Agrobacterium rhizogenes LBA9402 harbouring pBin19 and pRi1855. Seedling age and preculture time of the hypocotyl segments influenced the transformation frequency. Paper electrophoresis revealed that 70% of single hairy root cultures could synthesize agropine. Calli were first induced from hairy root segments on MS medium containing 0-9.05 μmol/L 2,4-D and 0-2.22 μmol/L 6-BA, and were then transferred onto MS0 medium without kanamycin for regeneration. The constitution and concentration of phytohormones in the callus induction media markedly affected the subsequent regeneration of calluses on MS0 medium. The regeneration frequency and shoot number per callus declined as the 2,4-D concentration in the induction media increased from 4.52 to 9.05 μmol/L, and rose as 6-BA increased from 0 to 2.22 μmol/L. On MS medium supplemented with 4.52 μmol/L 2,4-D and 2.22 μmol/L 6-BA, only 14.2% of hairy root segments produced calluses, but the regeneration frequency reached 58.1% and the shoot number per callus was 37.2. Of 32 analysed plants regenerated from 8 kanamycin-resistant hairy root lines, 25 were nptII-positive and showed different copy numbers.

   1. Maghrebi transits in Istanbul. Trajectories, profiles and strategies

    Directory of Open Access Journals (Sweden)

    Jean-François Pérouse

    2012-03-01

    Full Text Available Since the beginning of the 1990s, Istanbul has become an important hub, at once polarizing and redistributive, in the complex system of international migration, of which transit migration from poor countries to rich ones is one of the most salient dimensions. Within this framework, we examine the case of the Maghrebis who, while not numerically the most visible "group", nonetheless present an interesting configuration, in terms of the types of trajectories that bring them to Istanbul, the strategies of survival and of "making community" in the metropolis, and the ways in which they move on toward the destinations they seek. We emphasize the various modalities of transit, the weight of national identities, the irreplaceable effectiveness of family ties, the weak insertion into the metropolitan labour market, and the redefinitions of gender relations that these circumstances bring about.

  2. DIGITAL WORKFLOW FOR THE CONSERVATION OF BAHRAIN BUILT HERITAGE: THE SHEIK ISA BIN ALI HOUSE

    Directory of Open Access Journals (Sweden)

    L. Barazzetti

    2017-08-01

    Full Text Available Currently, the commercial market offers several tools for digital documentation of historic sites and buildings. Photogrammetry and laser scanning play a fundamental role in the acquisition of metric information, which is then processed to generate reliable records particularly useful also in the built heritage conservation field. Although potentially very fast and accurate, such techniques require expert operators to produce reliable results, especially in the case of complex and large sites. The aim of this paper is to present the digital workflow developed for data acquisition and processing of the Shaikh Isa Bin Ali house in Muharraq, Bahrain. This historic structure is an outstanding example of Bahrain architecture as well as a tangible memory of the country's history, with strong connotations in the Bahrain cultural identity. The building has been documented employing several digital techniques, including: aerial (drone) and terrestrial photogrammetry, rectifying photography, total station and laser scanning. The documentation project has been developed for the Bahrain Authority for Culture and Antiquities (BACA) by a multidisciplinary team of experts from Carleton Immersive Media Studio (CIMS, Carleton University, Canada) and Gicarus Lab (Politecnico di Milano, Italy).

   3. VIOLATIONS OF THE GENDER EQUALITY PRINCIPLES REVEALED IN CARMEN BIN LADIN’S INSIDE THE KINGDOM

    Directory of Open Access Journals (Sweden)

    Rinawati Rinawati

    2017-04-01

    Full Text Available This paper analyzed the theme of violations of the gender equality principles in Carmen Bin Ladin's Inside the Kingdom. The best-selling novel was based on the true story of the author's life in Saudi Arabia under the gender prohibitions of Wahhabi custom. The analytical perspective adopted in this study is shaped by the idea of Islamic feminism. The analysis resulted in the finding that the gender problem revealed in the novel was due to violations of the gender equality principles. These violations included the practice of honor killing, women's face covering, the construction of women's inferiority, the prohibition of entering mosques for women, segregation of the sexes, divorced women getting no child custody rights, no obligation to educate women, forced marriage, temporary marriage, female genital mutilation, and improper polygamy practice. In conclusion, the women depicted in the novel are not truly treated according to the gender equality principles.

  4. Physical and Chemical Properties of Coal Bottom Ash (CBA) from Tanjung Bin Power Plant

    Science.gov (United States)

    Izzati Raihan Ramzi, Nurul; Shahidan, Shahiron; Zulkhairi Maarof, Mohamad; Ali, Noorwirdawati

    2016-11-01

    The objective of this study is to determine the physical and chemical characteristics of Coal Bottom Ash (CBA) obtained from Tanjung Bin Power Plant Station and compare them with the characteristics of natural river sand (as a replacement for fine aggregates). Bottom ash is a by-product of coal combustion during the electricity generating process. Excess bottom ash production due to the high production of electricity in Malaysia has caused several environmental problems. Therefore, several tests were conducted in order to determine the physical and chemical properties of bottom ash, such as specific gravity, density, particle size distribution, Scanning Electron Microscopy (SEM) and X-Ray Fluorescence (XRF), in the attempt to produce sustainable material from waste. The results indicated that the natural fine aggregate and coal bottom ash have very different physical and chemical properties. The bottom ash was classified as Class C ash. The porous structure and the angular, rough texture of the bottom ash affected its specific gravity and particle density. From the tests, it was found that bottom ash can be recommended for use in concrete as a replacement for fine aggregates.

  5. Digital Workflow for the Conservation of Bahrain Built Heritage: the Sheik Isa Bin ALI House

    Science.gov (United States)

    Barazzetti, L.; Mezzino, D.; Santana Quintero, M.

    2017-08-01

    Currently, the commercial market offers several tools for digital documentation of historic sites and buildings. Photogrammetry and laser scanning play a fundamental role in the acquisition of metric information, which is then processed to generate reliable records particularly useful also in the built heritage conservation field. Although potentially very fast and accurate, such techniques require expert operators to produce reliable results, especially in the case of complex and large sites. The aim of this paper is to present the digital workflow developed for data acquisition and processing of the Shaikh Isa Bin Ali house in Muharraq, Bahrain. This historic structure is an outstanding example of Bahrain architecture as well as tangible memory of the country history, with strong connotations in the Bahrain cultural identity. The building has been documented employing several digital techniques, including: aerial (drone) and terrestrial photogrammetry, rectifying photography, total station and laser scanning. The documentation project has been developed for the Bahrain Authority for Culture and Antiquities (BACA) by a multidisciplinary team of experts from Carleton Immersive Media Studio (CIMS, Carleton University, Canada) and Gicarus Lab (Politecnico di Milano, Italy).

  6. Apparent diffusion coefficient histogram analysis of neonatal hypoxic-ischemic encephalopathy

    Energy Technology Data Exchange (ETDEWEB)

    Cauley, Keith A. [University of Massachusetts Medical School, Department of Radiology, Worcester, MA (United States); New York Presbyterian Hospital, Columbia University Medical Center, Department of Radiology, New York, NY (United States); Filippi, Christopher G. [New York Presbyterian Hospital, Columbia University Medical Center, Department of Radiology, New York, NY (United States)

    2014-06-15

    Diffusion-weighted imaging is a valuable tool in the assessment of the neonatal brain, and changes in diffusion are seen in normal development as well as in pathological states such as hypoxic-ischemic encephalopathy (HIE). Various methods of quantitative assessment of diffusion values have been reported. Global ischemic injury occurring during the time of rapid developmental changes in brain myelination can complicate the imaging diagnosis of neonatal HIE. The aim of this work was to compare a quantitative method of histographic analysis of brain apparent diffusion coefficient (ADC) maps to the qualitative interpretation of routine brain MR imaging studies. We correlate changes in diffusion values with gestational age in radiographically normal neonates, and we investigate the sensitivity of the method as a quantitative measure of hypoxic-ischemic encephalopathy. We reviewed all brain MRI studies from the neonatal intensive care unit (NICU) at our university medical center over a 4-year period to identify cases that were radiographically normal (23 cases) and those with diffuse, global hypoxic-ischemic encephalopathy (12 cases). We histographically displayed the ADC values of a single brain slice at the level of the basal ganglia and correlated the peak (s-sDav) and lowest histogram values (s-sDlowest) with gestational age. Normative s-sDav values correlated significantly with gestational age and declined linearly through the neonatal period (r^2 = 0.477, P < 0.01). Six of 12 cases of known HIE demonstrated significantly lower s-sDav and s-sDlowest ADC values than were reflected in the normative distribution; several cases of HIE fell within a 95% confidence interval for normative studies, and one case demonstrated higher-than-normal s-sDav. Single-slice histographic display of ADC values is a rapid and clinically feasible method of quantitative analysis of diffusion. In this study normative values derived from consecutive neonates without radiographic evidence of

  7. Dose-volume histogram parameters for predicting radiation pneumonitis using receiver operating characteristic curve

    International Nuclear Information System (INIS)

    Wang Dongqing; Zhang Jian; Li Baosheng; Sun Hongfu

    2012-01-01

    Objective: To assess the accuracy (ACC), sensitivity (SEN), and specificity (SPE) of dose-volume histogram (DVH) parameters in predicting radiation pneumonitis (RP) using receiver operating characteristic (ROC) curves. Methods: Complete clinical data of 118 non-small cell lung cancer patients treated with three-dimensional conformal and intensity-modulated radiotherapy plus chemotherapy were included. Chi-square and logistic regression analyses were retrospectively applied to assess the correlations between DVH parameters [relative lung volume receiving ≥5 Gy (V5), 10 Gy (V10), 13 Gy (V13), 20 Gy (V20) and 30 Gy (V30), and mean lung dose (MLD)] and grade 2 or higher RP as defined by the National Cancer Institute Common Terminology Criteria for Adverse Events, version 3.0. ROC curves were adopted to investigate the predictive ACC, SEN and SPE of the DVH parameters associated with RP. Results: Total lung V5, V10, V13, V20 and MLD were all correlated with the development of RP (χ2 = 4.786, 5.771, 6.366, 7.367 and 6.945; P < 0.05) in univariate analysis. Total lung V30, patient characteristics (age, sex, KPS, tumor location, pathology) and treatment factors (prescription dose, radiotherapy technique, chemotherapy method and timing) were not contributors to RP. Logistic regression showed that V20 of both lungs remained tightly associated with RP (χ2 = 10.96, OR = 4.16, 95% CI 1.40-12.36, P < 0.05), although significant collinearity was found between V20 and the other DVH parameters (r = 0.767-0.902, P < 0.05). The ROC curve confirmed that V20 of both lungs could act as a predictor for RP (Z = 2.038, P < 0.05). The predictive ACC, SEN, and SPE were 0.645 (95% CI 0.498-0.793), 0.650 (95% CI 0.408-0.864), and 0.674 (95% CI 0.571-0.765), respectively; however, the positive predictive value was only 28.9%. Conclusions: V20 of both lungs was correlated with the development of RP and could act as a predictor for RP, though its predictive power is limited
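
    The ROC step reads directly off scikit-learn primitives; here is a sketch with invented V20/RP pairs (not the study's data), including the Youden-index cut-off such analyses typically report:

        import numpy as np
        from sklearn.metrics import roc_auc_score, roc_curve

        v20 = np.array([18, 25, 31, 22, 35, 28, 40, 20, 33, 27])  # % lung volume
        rp  = np.array([0,  0,  1,  0,  1,  0,  1,  0,  1,  0])   # grade >=2 RP

        auc = roc_auc_score(rp, v20)
        fpr, tpr, thresholds = roc_curve(rp, v20)
        best = thresholds[(tpr - fpr).argmax()]      # Youden-optimal cut-off
        print(f"AUC = {auc:.3f}, best V20 cut-off = {best}")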

  8. The enigma of sex allocation in Selaginella.

    Science.gov (United States)

    Petersen, Kurt B; Burd, Martin

    2018-02-12

    The division of resource investment between male and female functions is poorly known for land plants other than angiosperms. The ancient lycophyte genus Selaginella is similar in some ways to angiosperms (in heterospory and in having sex allocation occur in the sporophyte generation, for example) but lacks the post-fertilization maternal investments that angiosperms make via fruit and seed tissues. One would therefore expect Selaginella to have sex allocation values less female-biased than in flowering plants and closer to the theoretical prediction of equal investment in male and female functions. Nothing is currently known of sex allocation in the genus, so even the simplest predictions have not been tested. Volumetric measurements of microsporangial and megasporangial investment were made in 14 species of Selaginella from four continents. In five of these species the length of the main above-ground axis of each plant was measured to determine whether sex allocation is related to plant size. Of the 14 species, 13 showed male-biased allocations, often extreme, in population means and among the great majority of individual plants. There was some indication from the five species with axis length measurements that relative male allocation might be related to the release height of spores, but this evidence is preliminary. Sex allocation in Selaginella provides a phylogenetic touchstone showing how the innovations of fruit and seed investment in the angiosperm life cycle lead to typically female-biased allocations in that lineage. Moreover, the male bias we found in Selaginella requires an evolutionary explanation. The bias was often greater than what would occur from the mere absence of seed and fruit investments, and thus poses a challenge to sex allocation theory. It is possible that differences between microspores and megaspores in their dispersal ecology create selective effects that favour male-biased sexual allocation. This hypothesis remains tentative.

  9. Whole Tumor Histogram-profiling of Diffusion-Weighted Magnetic Resonance Images Reflects Tumorbiological Features of Primary Central Nervous System Lymphoma.

    Science.gov (United States)

    Schob, Stefan; Münch, Benno; Dieckow, Julia; Quäschling, Ulf; Hoffmann, Karl-Titus; Richter, Cindy; Garnov, Nikita; Frydrychowicz, Clara; Krause, Matthias; Meyer, Hans-Jonas; Surov, Alexey

    2018-04-01

    Diffusion weighted imaging (DWI) quantifies the motion of hydrogen nuclei in biological tissues and has thereby been used to assess the underlying tissue microarchitecture. Histogram-profiling of DWI provides more detailed information on the diffusion characteristics of a lesion than the standardly calculated apparent diffusion coefficient (ADC) values: minimum, mean and maximum. Hence, the aim of our study was to investigate which parameters of histogram-profiling of DWI in primary central nervous system lymphoma can be used to specifically predict features like cellular density, chromatin content and proliferative activity. Pre-treatment ADC maps of 21 PCNSL patients (8 female, 13 male, 28-89 years) from a 1.5T system were used for Matlab-based histogram profiling. Results of histopathology (H&E staining) and immunohistochemistry (Ki-67 expression) were quantified. Correlations between histogram-profiling parameters and the neuropathologic examination were calculated using SPSS 23.0. The lower percentiles (p10 and p25) showed significant correlations with structural parameters of the neuropathologic examination (cellular density, chromatin content). The highest percentile, p90, correlated significantly with Ki-67 expression, resembling proliferative activity. Kurtosis of the ADC histogram correlated significantly with cellular density. Histogram-profiling of DWI in PCNSL provides a comprehensible set of parameters, which reflect distinct tumor-architectural and tumor-biological features, and hence are promising biomarkers for treatment response and prognosis.
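
    The histogram profile itself is a handful of numpy calls; here is a sketch on synthetic ADC values standing in for the voxels of a segmented tumour ROI:

        import numpy as np
        from scipy.stats import kurtosis

        rng = np.random.default_rng(0)
        adc = rng.normal(0.8, 0.15, 5000)  # hypothetical ADC values (10^-3 mm^2/s)

        profile = {
            "p10": np.percentile(adc, 10),   # lower percentiles: cellularity,
            "p25": np.percentile(adc, 25),   # chromatin content
            "p90": np.percentile(adc, 90),   # upper percentile: Ki-67 expression
            "kurtosis": kurtosis(adc),       # shape of the ADC histogram
        }
        print(profile)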

  10. Amina Bin Qarrish of Tetuan: records of the life of a Moroccan woman of the 19th century

    Directory of Open Access Journals (Sweden)

    Nadia Erzijni

    2008-06-01

    Full Text Available This article records the attempt to write a biography of a woman of 19th-century Tetuan, Amina Bin Qarrish (d. 1889), wife of 'Abdalkrim ar-Razini (d. 1909). She was a wealthy woman, the last surviving member of the family of the Zawiyat Ben Qarrish, a Sufi fraternity founded in Tetuan in the late 17th or early 18th century. The principal sources for the biography are a small collection of unpublished documents (legal documents, letters and jottings in notebooks) in the Razini archive in Tetuan. These documents deal with the inheritance, marriage, divorce, property, power of attorney, gifts, awqaf, experience of war, and family life of Amina Bin Qarrish, and those of contemporary women. It is also possible to use oral history to help interpret these documents.

  11. Evaluation of low-grade glioma structural changes after chemotherapy using DTI-based histogram analysis and functional diffusion maps

    Energy Technology Data Exchange (ETDEWEB)

    Castellano, Antonella; Iadanza, Antonella; Falini, Andrea [San Raffaele Scientific Institute and Vita-Salute San Raffaele University, Neuroradiology Unit and CERMAC, Milano (Italy); Donativi, Marina [University of Salento, Department of Mathematics and Physics "Ennio De Giorgi" and A.D.A.M. (Advanced Data Analysis in Medicine), Lecce (Italy); Ruda, Roberta; Bertero, Luca; Soffietti, Riccardo [University of Torino, Department of Neuro-oncology, Turin (Italy); De Nunzio, Giorgio [University of Salento, Department of Mathematics and Physics "Ennio De Giorgi" and A.D.A.M. (Advanced Data Analysis in Medicine), Lecce (Italy); INFN (National Institute of Nuclear Physics), Lecce (Italy); Riva, Marco; Bello, Lorenzo [Universita degli Studi di Milano, Milan, and Humanitas Research Hospital, Department of Medical Biotechnology and Translational Medicine, Rozzano, MI (Italy); Rucco, Matteo [University of Camerino, School of Science and Technology, Computer Science Division, Camerino, MC (Italy)

    2016-05-15

    To explore the role of diffusion tensor imaging (DTI)-based histogram analysis and functional diffusion maps (fDMs) in evaluating structural changes of low-grade gliomas (LGGs) receiving temozolomide (TMZ) chemotherapy. Twenty-one LGG patients underwent 3T-MR examinations before and after three and six cycles of dose-dense TMZ, including 3D-fluid-attenuated inversion recovery (FLAIR) sequences and DTI (b = 1000 s/mm², 32 directions). Mean diffusivity (MD), fractional anisotropy (FA), and tensor-decomposition DTI maps (p and q) were obtained. Histogram and fDM analyses were performed on co-registered baseline and post-chemotherapy maps. DTI changes were compared with modifications of tumour area and volume [according to Response Assessment in Neuro-Oncology (RANO) criteria], and seizure response. After three cycles of TMZ, 20/21 patients were stable according to RANO criteria, but DTI changes were observed in all patients (Wilcoxon test, P ≤ 0.03). After six cycles, DTI changes were more pronounced (P ≤ 0.005). Seventy-five percent of patients had early seizure response with significant improvement of DTI values, maintaining stability on FLAIR. Early changes of the 25th percentiles of p and MD predicted final volume change (R² = 0.614 and 0.561, P < 0.0005, respectively). TMZ-related changes were located mainly at tumour borders on p and MD fDMs. DTI-based histogram and fDM analyses are useful techniques to evaluate the early effects of TMZ chemotherapy in LGG patients. (orig.)
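
    Functional diffusion maps of the kind used in this study are, at their core, a voxel-wise comparison of co-registered baseline and follow-up maps. A minimal sketch in Python, assuming numpy, already co-registered arrays, and a hypothetical significance threshold `delta` (the thresholds in the actual study were derived from the data):

    ```python
    import numpy as np

    def functional_diffusion_map(baseline, followup, tumor_mask, delta=0.4):
        """Classify voxels by the change in a diffusion map (e.g. MD or p).

        baseline, followup : co-registered 3-D maps (same units)
        tumor_mask         : boolean array marking the tumour
        delta              : minimum change treated as significant (assumed value)
        """
        change = followup - baseline
        increased = (change > delta) & tumor_mask   # e.g. cell loss / oedema
        decreased = (change < -delta) & tumor_mask  # e.g. increased cellularity
        stable = tumor_mask & ~increased & ~decreased
        return increased, decreased, stable
    ```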

  12. Multiscale molecular dynamics simulations of membrane remodeling by Bin/Amphiphysin/Rvs family proteins

    Science.gov (United States)

    Chun, Chan; Haohua, Wen; Lanyuan, Lu; Jun, Fan

    2016-01-01

    Membrane curvature is no longer thought of as a passive property of the membrane; rather, it is considered an active, regulated state that serves various purposes in the cell, such as communication between cells and organelle definition. While transport is usually mediated by tiny membrane bubbles known as vesicles or membrane tubules, such communication requires complex interplay between the lipid bilayers and cytosolic proteins such as members of the Bin/Amphiphysin/Rvs (BAR) superfamily of proteins. With rapid developments in novel experimental techniques, membrane remodeling has grown into a rapidly expanding field in recent years. Molecular dynamics (MD) simulations are important tools for obtaining atomistic information regarding the structural and dynamic aspects of biological systems and for understanding the underlying physics. The availability of more sophisticated experimental data poses challenges to the theoretical community to develop novel theoretical and computational techniques that can better interpret the experimental results and yield further functional insights. In this review, we summarize the general mechanisms underlying membrane remodeling controlled or mediated by proteins. Studies combining experiments and molecular dynamics simulations both support existing mechanistic models and extend the known roles of different BAR domain proteins in membrane remodeling processes. We review these recent findings, focusing on how multiscale molecular dynamics simulations aid in understanding the physical basis of BAR domain proteins, as a representative of membrane-remodeling proteins. Project supported by the National Natural Science Foundation of China (Grant No. 21403182) and the Research Grants Council of Hong Kong, China (Grant No. CityU 21300014).

  13. SOME NOTES ON COST ALLOCATION IN MULTICASTING

    Directory of Open Access Journals (Sweden)

    Darko Skorin-Kapov

    2012-12-01

    We analyze cost allocation strategies associated with the problem of broadcasting information from a source to a number of communication network users. A multicast routing chooses a minimum cost tree network that spans the source and all the receivers. The cost of such a network is distributed among its receivers, who may be individuals or organizations with possibly conflicting interests. Providing network developers, users and owners with practical, computable, 'fair' cost allocation solution procedures is of great importance for network management. Consequently, this multidisciplinary problem has been extensively studied by operational researchers, economists, mathematicians and computer scientists. The fairness of various proposed solutions has even been argued in US courts. This presentation reviews some previously published results, as well as some recent ones, on the development of algorithmic mechanisms to efficiently compute 'attractive' cost allocation solutions for multicast networks. Specifically, we analyze cooperative game theory based cost allocation models that avoid cross subsidies and/or are distance and population monotonic. We also present some related open cost allocation problems and the potential contribution that such models might make to this problem in the future.
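
    As one concrete illustration of a core-style allocation on a multicast tree, the sketch below builds the minimum cost spanning tree and charges each receiver the cost of the edge connecting it toward the source (Bird's rule, a classic allocation for minimum cost spanning tree games). This is a generic textbook rule offered for illustration, not one of the specific mechanisms surveyed here; networkx and the toy graph are assumptions.

    ```python
    import networkx as nx

    def bird_allocation(graph, source):
        """Charge each receiver the cost of the edge to its parent in the
        minimum spanning tree rooted at the source (Bird's rule)."""
        mst = nx.minimum_spanning_tree(graph, weight="cost")
        parent = nx.dfs_predecessors(mst, source)  # node -> parent toward source
        return {node: mst[node][par]["cost"] for node, par in parent.items()}

    # Toy network: the source 's' and three receivers.
    g = nx.Graph()
    g.add_edge("s", "a", cost=4)
    g.add_edge("s", "b", cost=5)
    g.add_edge("a", "b", cost=2)
    g.add_edge("b", "c", cost=3)

    print(bird_allocation(g, "s"))  # {'a': 4, 'b': 2, 'c': 3}, summing to the tree cost
    ```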

  14. Methodology for allocating reliability and risk

    International Nuclear Information System (INIS)

    Cho, N.Z.; Papazoglou, I.A.; Bari, R.A.

    1986-05-01

    This report describes a methodology for reliability and risk allocation in nuclear power plants. The work investigates the technical feasibility of allocating reliability and risk, which are expressed in a set of global safety criteria and which may not necessarily be rigid, to various reactor systems, subsystems, components, operations, and structures in a consistent manner. The report also provides general discussions on the problem of reliability and risk allocation. The problem is formulated as a multiattribute decision analysis paradigm. The work mainly addresses the first two steps of a typical decision analysis, i.e., (1) identifying alternatives, and (2) generating information on outcomes of the alternatives, by performing a multiobjective optimization on a PRA model and reliability cost functions. The multiobjective optimization serves as the guiding principle to reliability and risk allocation. The concept of "noninferiority" is used in the multiobjective optimization problem. Finding the noninferior solution set is the main theme of the current approach. The final step of decision analysis, i.e., assessment of the decision maker's preferences, could then be performed more easily on the noninferior solution set. Some results of the methodology applications to a nontrivial risk model are provided, and several outstanding issues such as generic allocation, preference assessment, and uncertainty are discussed. 29 refs., 44 figs., 39 tabs
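
    The noninferior (Pareto) set at the heart of this approach is easy to state in code. A minimal sketch, assuming each alternative is scored by two attributes to be minimized, say risk and cost (the toy alternatives are illustrative):

    ```python
    def noninferior(alternatives):
        """Return the noninferior (Pareto) subset of (risk, cost) tuples:
        an alternative is kept unless some other alternative is at least as
        good in both attributes and strictly better in one."""
        def dominates(a, b):
            return a[0] <= b[0] and a[1] <= b[1] and a != b
        return [b for b in alternatives
                if not any(dominates(a, b) for a in alternatives)]

    points = [(0.10, 50), (0.08, 70), (0.10, 60), (0.05, 90)]
    print(noninferior(points))  # [(0.1, 50), (0.08, 70), (0.05, 90)]
    ```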

  15. Risk capital allocation with autonomous subunits

    DEFF Research Database (Denmark)

    Hougaard, Jens Leth; Smilgins, Aleksandrs

    2016-01-01

    Risk capital allocation problems have been widely discussed in the academic literature. We consider a set of independent subunits collaborating in order to reduce risk: that is, when subunit portfolios are merged a diversification benefit arises and the risk of the group as a whole is smaller than the sum of the risks of the individual subunits. The question is how to allocate the risk capital of the group among the subunits in a fair way. In this paper we propose to use the Lorenz set as an allocation method. We show that the Lorenz set is operational and coherent. Moreover, we propose three fairness tests related directly to the problem of risk capital allocation and show that the Lorenz set satisfies all three tests in contrast to other well-known coherent methods. Finally, we discuss how to deal with non-uniqueness of the Lorenz set.

  16. Cognitive allocation and the control room

    International Nuclear Information System (INIS)

    Paradies, M.W.

    1985-01-01

    One of the weakest links in the design of nuclear power plants is the inattention to the needs and capabilities of the operators. This flaw causes decreased plant reliability and reduced plant safety. To solve this problem the designer must, in the earliest stages of the design process, consider the operator's abilities. After the system requirements have been established, the designer must consider what functions to allocate to each part of the system. The human must be considered as part of this system. The allocation of functions needs to consider not only the mechanical tasks to be performed, but also the control requirements and the overall control philosophy. In order for the designers to consider the control philosophy, they need to know what control decisions should be automated and what decisions should be made by an operator. They also need to know how these decisions will be implemented: by an operator or by automation. "Cognitive Allocation" is the allocation of the decision making process between operators and machines. It defines the operator's role in the system. When designing a power plant, a cognitive allocation starts the process of considering the operator's abilities. This is the first step to correcting the weakest link in the current plant design

  17. Curve Evolution in Subspaces and Exploring the Metameric Class of Histogram of Gradient Orientation based Features using Nonlinear Projection Methods

    DEFF Research Database (Denmark)

    Tatu, Aditya Jayant

    tracking interfaces, active contour based segmentation methods and others. It can also be used to study shape spaces, as deforming a shape can be thought of as evolving its boundary curve. During curve evolution a curve traces out a path in the infinite dimensional space of curves. Due to application ... defined subspace, the N-links bicycle chain space, i.e. the space of curves with equidistant neighboring landmark points. This in itself is a useful shape space for medical image analysis applications. The Histogram of Gradient orientation based features are many in number and are widely used ...

  18. DOES OSAMA BIN LADEN READ JEAN FRANCOIS LYOTARD? REFLECTIONS AROUND THE POSSIBLE RELATIONS BETWEEN STRATEGY AND POSTMODERNITY.

    Directory of Open Access Journals (Sweden)

    Gustavo Martín Fragachán

    2011-07-01

    Has Osama Bin Laden been reading Jean-Francois Lyotard's books? Probably not. But the new models of strategy (asymmetric war, fourth-generation war, unrestricted war, cybernetic war) are organized around some of the criteria that many authors (Jean-Francois Lyotard among them) have presented as belonging to the so-called postmodern condition. In these pages I attempt to show how close the new concepts of strategy and postmodernity are.

  19. Mutations in BIN1 associated with centronuclear myopathy disrupt membrane remodeling by affecting protein density and oligomerization.

    Directory of Open Access Journals (Sweden)

    Tingting Wu

    The regulation of membrane shapes is central to many cellular phenomena. Bin/Amphiphysin/Rvs (BAR) domain-containing proteins are key players for membrane remodeling during endocytosis, cell migration, and endosomal sorting. BIN1, which contains an N-BAR domain, is assumed to be essential for the biogenesis of plasma membrane invaginations (T-tubules) in muscle tissues. Three mutations, K35N, D151N and R154Q, have been discovered so far in the BAR domain of BIN1 in patients with centronuclear myopathy (CNM), where impaired organization of T-tubules has been reported. However, the molecular mechanisms behind this malfunction have remained elusive. None of the BIN1 disease mutants displayed a significantly compromised curvature sensing ability. However, two mutants showed impaired membrane tubulation both in vivo and in vitro, and displayed characteristically different behaviors. R154Q generated smaller membrane curvature compared to WT N-BAR. Quantification of protein density on membranes revealed a lower membrane-bound density for R154Q compared to WT and the other mutants, which appeared to be the primary reason for the observed impairment of deformation capacity. The D151N mutant was unable to tubulate liposomes under certain experimental conditions. At medium protein concentrations we found 'budding' structures on liposomes for all variants except the D151N mutant; we hypothesize these to be intermediates in the tubulation process. Chemical crosslinking assays suggested that the D151N mutation impaired protein oligomerization upon membrane binding. Although we found an insignificant difference between WT and K35N N-BAR in in vitro assays, depolymerizing actin in live cells allowed tubulation of plasma membranes through the K35N mutant. Our results provide insights into the membrane-involved pathophysiological mechanisms leading to human disease.

  20. Development of an aerosol microphysical module: Aerosol Two-dimensional bin module for foRmation and Aging Simulation (ATRAS)

    OpenAIRE

    H. Matsui; M. Koike; Y. Kondo; J. D. Fast; M. Takigawa

    2014-01-01

    Number concentrations, size distributions, and mixing states of aerosols are essential parameters for accurate estimation of aerosol direct and indirect effects. In this study, we develop an aerosol module, designated Aerosol Two-dimensional bin module for foRmation and Aging Simulation (ATRAS), that can represent these parameters explicitly by considering new particle formation (NPF), black carbon (BC) aging, and secondary organic aerosol (SOA) processes. A...
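
    The "two-dimensional bin" representation means that particles are binned jointly by size and by a mixing-state coordinate (e.g. BC mass fraction) rather than by size alone. A toy sketch of such a 2-D binning with numpy; the particle population and bin edges are illustrative assumptions, not the ATRAS configuration:

    ```python
    import numpy as np

    # Illustrative particle population: dry diameter (um) and BC mass fraction.
    rng = np.random.default_rng(0)
    diameter = rng.lognormal(mean=np.log(0.1), sigma=0.6, size=10_000)
    bc_fraction = rng.uniform(0.0, 1.0, size=10_000)

    # 2-D bins: logarithmic size bins x linear mixing-state bins.
    size_edges = np.logspace(-2, 1, 13)      # 12 size bins, 0.01-10 um
    mix_edges = np.linspace(0.0, 1.0, 11)    # 10 BC-fraction bins

    counts, _, _ = np.histogram2d(diameter, bc_fraction,
                                  bins=[size_edges, mix_edges])
    print(counts.shape)  # (12, 10): number concentration per 2-D bin
    ```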

  1. Calculation of rectal dose surface histograms in the presence of time varying deformations

    International Nuclear Information System (INIS)

    Roeske, John C.; Spelbring, Danny R.; Vijayakumar, S.; Forman, Jeffrey D.; Chen, George T.Y.

    1996-01-01

    Purpose: Dose-volume histograms (DVH) and dose-surface histograms (DSH) of the bladder and rectum are usually calculated from a single treatment planning scan. These DVHs and DSHs will eventually be correlated with complications to determine parameters for normal tissue complication probabilities (NTCP). However, from day to day, the size and shape of the rectum and bladder may vary. The purpose of this study is to compare a more accurate estimate of the time-integrated DVHs and DSHs of the rectum (in the presence of daily variations in rectal shape) to initial DVHs/DSHs. Methods: 10 patients were scanned once per week during the course of fractionated radiotherapy, typically accumulating a total of six scans. The rectum and bladder were contoured on each of the studies. The model used to assess the effects of rectal contour deformation is as follows: the contour on a given axial slice is boxed within a rectangle. A line drawn parallel to the AP axis through the rectangle equally partitions the box. Starting at the intersection of the vertical line and the rectal contour, points are marked off along the contour so that corresponding numbered points represent the same rectal dose point, even in the presence of distortion. The corresponding points are used to sample the dose matrix and create a composite DSH. The model assumes uniform stretching of the rectal contour for any given axial cut, and no twist of the structure or vertical displacement. A similar model is developed for the bladder with spherical symmetry. Results: Normalized DSHs (nDSH) for each CT scan were calculated, as well as the time-averaged nDSH over all scans. These were compared with the nDSH from the initial planning scan. Individual nDSHs differed by 8% of surface area irradiated at the 80% dose level, and by as much as 20% of surface area in the 70-100% dose range. DSH variations are due to position and shape changes of the rectum between CT scans. The spatial distribution of dose is highly variable, and depends on the field
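
    The deformation model described above amounts to resampling each rectal contour into the same number of points equally spaced in arc length, so that point k on one week's contour corresponds to point k on another's. A minimal numpy sketch of that resampling step (the contour array and point count are illustrative assumptions):

    ```python
    import numpy as np

    def resample_closed_contour(points, n):
        """Resample a closed 2-D contour into n points equally spaced in
        arc length, starting from points[0] (e.g. the intersection with
        the AP axis, as in the model described above)."""
        closed = np.vstack([points, points[:1]])          # close the loop
        seg = np.linalg.norm(np.diff(closed, axis=0), axis=1)
        s = np.concatenate([[0.0], np.cumsum(seg)])       # cumulative arc length
        targets = np.linspace(0.0, s[-1], n, endpoint=False)
        x = np.interp(targets, s, closed[:, 0])
        y = np.interp(targets, s, closed[:, 1])
        return np.column_stack([x, y])

    # Corresponding point indices across weekly scans can then be used to
    # sample each week's dose grid and accumulate a composite DSH per index.
    ```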

  2. Crash Simulation of a Vertical Drop Test of a B737 Fuselage Section with Overhead Bins and Luggage

    Science.gov (United States)

    Jackson, Karen E.; Fasanella, Edwin L.

    2004-01-01

    The focus of this paper is to describe a crash simulation of a 30-ft/s vertical drop test of a Boeing 737 (B737) fuselage section. The drop test of the 10-ft.-long fuselage section of a B737 aircraft was conducted in November of 2000 at the FAA Technical Center in Atlantic City, NJ. The fuselage section was outfitted with two different commercial overhead stowage bins. In addition, 3,229 lbs of luggage were packed in the cargo hold to represent a maximum take-off weight condition. The main objective of the test was to evaluate the response and failure modes of the overhead stowage bins in a narrow-body transport fuselage section when subjected to a severe, but survivable, impact. A secondary objective of the test was to generate experimental data for correlation with the crash simulation. A full-scale 3-dimensional finite element model of the fuselage section was developed and a crash simulation was conducted using the explicit, nonlinear transient dynamic code MSC.Dytran. Pre-test predictions of the fuselage and overhead bin responses were generated for correlation with the drop test data. A description of the finite element model and an assessment of the analytical/experimental correlation are presented. In addition, suggestions for modifications to the model to improve correlation are proposed.

  3. Exact Bayesian bin classification: a fast alternative to Bayesian classification and its application to neural response analysis.

    Science.gov (United States)

    Endres, D; Földiák, P

    2008-02-01

    We investigate the general problem of signal classification and, in particular, that of assigning stimulus labels to neural spike trains recorded from single cortical neurons. Finding efficient ways of classifying neural responses is especially important in experiments involving rapid presentation of stimuli. We introduce a fast, exact alternative to Bayesian classification. Instead of estimating the class-conditional densities p(x|y) (where x is a scalar function of the feature[s], y the class label) and converting them to P(y|x) via Bayes' theorem, this probability is evaluated directly and without the need for approximations. This is achieved by integrating over all possible binnings of x with an upper limit on the number of bins. Computational time is quadratic in both the number of observed data points and the number of bins. The algorithm also allows for the computation of feedback signals, which can be used as input to subsequent stages of inference, e.g. neural network training. Responses of single neurons from high-level visual cortex (area STSa) to rapid sequences of complex visual stimuli are analysed. Information latency and response duration increase nonlinearly with presentation duration, suggesting that neural processing speeds adapt to presentation speeds.
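
    For contrast with the exact method, the following is the ordinary binned Bayes classifier that such approaches generalize: estimate p(x|y) per class on one fixed binning and invert via Bayes' theorem. A minimal sketch (the fixed bin count and add-one smoothing are assumptions; the method described above instead integrates over all binnings up to a maximum bin number):

    ```python
    import numpy as np

    class HistogramBayesClassifier:
        """Binned Bayes classifier: one fixed binning, Laplace smoothing."""

        def __init__(self, n_bins=10):
            self.n_bins = n_bins

        def fit(self, x, y):
            self.classes = np.unique(y)
            self.edges = np.histogram_bin_edges(x, bins=self.n_bins)
            self.prior = np.array([(y == c).mean() for c in self.classes])
            # Class-conditional bin probabilities with add-one smoothing.
            self.pxy = np.array([
                np.histogram(x[y == c], bins=self.edges)[0] + 1.0
                for c in self.classes])
            self.pxy /= self.pxy.sum(axis=1, keepdims=True)
            return self

        def predict_proba(self, x):
            b = np.clip(np.digitize(x, self.edges) - 1, 0, self.n_bins - 1)
            joint = self.prior[:, None] * self.pxy[:, b]   # P(y) p(x|y)
            return (joint / joint.sum(axis=0)).T           # P(y|x) via Bayes

    # Toy usage: two classes with shifted distributions.
    rng = np.random.default_rng(1)
    x = np.concatenate([rng.normal(0, 1, 500), rng.normal(2, 1, 500)])
    y = np.array([0] * 500 + [1] * 500)
    model = HistogramBayesClassifier().fit(x, y)
    print(model.predict_proba(np.array([-1.0, 1.0, 3.0])).round(2))
    ```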

  4. PDE-Foam - a probability-density estimation method using self-adapting phase-space binning

    CERN Document Server

    Dannheim, Dominik; Voigt, Alexander; Grahn, Karl-Johan; Speckmayer, Peter

    2009-01-01

    Probability-Density Estimation (PDE) is a multivariate discrimination technique based on sampling signal and background densities defined by event samples from data or Monte-Carlo (MC) simulations in a multi-dimensional phase space. To efficiently use large event samples to estimate the probability density, a binary search tree (range searching) is used in the PDE-RS implementation. It is a generalisation of standard likelihood methods and a powerful classification tool for problems with highly non-linearly correlated observables. In this paper, we present an innovative improvement of the PDE method that uses a self-adapting binning method to divide the multi-dimensional phase space in a finite number of hyper-rectangles (cells). The binning algorithm adjusts the size and position of a predefined number of cells inside the multidimensional phase space, minimizing the variance of the signal and background densities inside the cells. The binned density information is stored in binary trees, allowing for a very ...
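
    The self-adapting idea (split cells where the sampled density varies most) can be illustrated with a deliberately simplified 1-D toy: repeatedly bisect the cell whose contents contribute the largest variance until the cell budget is exhausted. This sketch only illustrates the principle and is not the TMVA PDE-Foam algorithm:

    ```python
    import numpy as np

    def adaptive_bins(x, n_cells=8):
        """Toy self-adapting binning: repeatedly split the cell whose
        contents contribute the largest total variance, at its median."""
        cells = [np.sort(np.asarray(x, float))]
        while len(cells) < n_cells:
            scores = [len(c) * c.var() if len(c) > 1 else 0.0 for c in cells]
            i = int(np.argmax(scores))
            if scores[i] == 0.0:
                break                        # nothing left worth splitting
            c = cells.pop(i)
            mid = len(c) // 2
            cells[i:i] = [c[:mid], c[mid:]]  # equal-occupancy split
        return [(c[0], c[-1]) for c in sorted(cells, key=lambda c: c[0])]

    rng = np.random.default_rng(2)
    sample = np.concatenate([rng.normal(0, 0.3, 800), rng.normal(3, 1.5, 200)])
    for lo, hi in adaptive_bins(sample):
        print(f"cell [{lo:6.2f}, {hi:6.2f}]")   # narrow cells where density varies
    ```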

  5. 12 CFR 347.303 - Allocated transfer risk reserve.

    Science.gov (United States)

    2010-01-01

    Title 12, Banks and Banking; General Policy; International Banking; International Lending. § 347.303 Allocated transfer risk reserve. (a) Establishment of Allocated Transfer Risk Reserve. A banking institution shall establish an allocated transfer...

  6. 40 CFR 60.4142 - Hg allowance allocations.

    Science.gov (United States)

    2010-07-01

    Title 40, Protection of Environment; Coal-Fired Electric Steam Generating Units; Hg Allowance Allocations. § 60.4142 Hg allowance allocations. (a)(1) The baseline heat input (in MMBtu) used with respect to Hg allowance allocations under...

  7. Allocating environmental liabilities within a facilities agreement

    International Nuclear Information System (INIS)

    Harvie, A.

    2000-01-01

    Some environmental issues at jointly owned oil and gas facilities in Alberta are examined, including ways to allocate liability for those issues among the facility's owners, and between the facility's owners and custom users. Causes of environmental contamination, the scope of clean-up costs and some industry initiatives to sort out the question of who pays the environmental costs are also discussed. Some aspects of the legislation in Alberta imposing environmental liabilities on parties to a construction, ownership and operation (CO and O) agreement, and relevant provisions of the Petroleum Joint Venture Association (PJVA)'s Model Operating Procedure are also explained. The author concludes by regretting the industry's failure to develop adequate mechanisms to allocate the costs of environmental damage resulting from operations, and by recommending that agreements pertaining to joint ownership of a facility should address the issues involved in allocating environmental liabilities

  8. Optimal allocation of reviewers for peer feedback

    DEFF Research Database (Denmark)

    Wind, David Kofoed; Jensen, Ulf Aslak; Jørgensen, Rasmus Malthe

    2017-01-01

    Peer feedback is the act of letting students give feedback to each other on submitted work. There are multiple reasons to use peer feedback, including students getting more feedback, time saving for teachers and increased learning by letting students reflect on work by others. In order for peer feedback to be effective, students should give and receive useful feedback. A key challenge in peer feedback is allocating the feedback givers in a good way. It is important that reviewers are allocated to submissions such that the feedback distribution is fair - meaning that all students receive good feedback. In this paper we present a novel way to intelligently allocate reviewers for peer feedback. We train a statistical model to infer the quality of feedback based on a dataset of feedback quality evaluations. This dataset contains more than 20,000 reviews where the receiver of the feedback has...
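
    One simple way to make an allocation 'fair' in the sense described, with every submission receiving comparable expected feedback quality, is a greedy balancing pass over predicted reviewer quality. A toy sketch under that assumption; the quality scores and the two-reviewers-per-submission policy are hypothetical and not the paper's model:

    ```python
    def allocate_reviewers(quality, n_submissions, per_submission=2):
        """Greedy fair allocation: assign the best remaining reviewer to the
        submission whose current expected feedback quality is lowest.

        quality : dict reviewer -> predicted feedback quality (assumed given,
                  e.g. from a statistical model trained on past evaluations)
        """
        totals = {s: 0.0 for s in range(n_submissions)}
        assignment = {s: [] for s in range(n_submissions)}
        # Hand out reviewers from strongest to weakest.
        for reviewer in sorted(quality, key=quality.get, reverse=True):
            open_subs = [s for s in totals if len(assignment[s]) < per_submission]
            if not open_subs:
                break
            weakest = min(open_subs, key=totals.get)
            assignment[weakest].append(reviewer)
            totals[weakest] += quality[reviewer]
        return assignment

    scores = {"r1": 0.9, "r2": 0.8, "r3": 0.55, "r4": 0.5, "r5": 0.3, "r6": 0.2}
    print(allocate_reviewers(scores, n_submissions=3))
    ```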

  9. Robust resource allocation in future wireless networks

    CERN Document Server

    Parsaeefard, Saeedeh; Mokari, Nader

    2017-01-01

    This book presents state-of-the-art research on robust resource allocation in current and future wireless networks. The authors describe the nominal resource allocation problems in wireless networks and explain why introducing robustness in such networks is desirable. Then, depending on the objectives of the problem, namely maximizing the social utility or the per-user utility, cooperative or competitive approaches are explained and their corresponding robust problems are considered in detail. For each approach, the costs and benefits of robust schemes are discussed and the algorithms for reducing their costs and improving their benefits are presented. Considering the fact that such problems are inherently non-convex and intractable, a taxonomy of different relaxation techniques is presented, and applications of such techniques are shown via several examples throughout the book. Finally, the authors argue that resource allocation continues to be an important issue in future wireless networks, and propose spec...

  10. SU-F-J-135: Tumor Displacement-Based Binning for Respiratory-Gated Time-Independent 5DCT Treatment Planning

    International Nuclear Information System (INIS)

    Yang, L; O’Connell, D; Lee, P; Shaverdian, N; Kishan, A; Lewis, J; Dou, T; Thomas, D; Qi, X; Low, D

    2016-01-01

    Purpose: A published 5DCT breathing motion model enables image reconstruction at any user-selected breathing phase, defined by the model as a specific amplitude (v) and rate (f). Generation of reconstructed phase-specific CT scans will be required for time-independent radiation dose distribution simulations. This work answers the question: how many amplitude and rate bins are required to describe the tumor motion with a specific spatial resolution? Methods: 19 lung-cancer patients with 21 tumors were scanned using a free-breathing 5DCT protocol, employing an abdominally positioned pneumatic-bellows breathing surrogate and yielding voxel-specific motion model parameters α and β, corresponding to motion as a function of amplitude and rate, respectively. Tumor GTVs were contoured on the first (reference) of 25 successive free-breathing fast helical CT image sets. The tumor displacements were binned into widths of 1 mm to 5 mm in 1 mm steps and the total required number of bins recorded. The simulation evaluated the number of bins needed to encompass 100% of the breathing amplitude, and the range between the 5th and 95th percentile amplitudes to exclude breathing outliers. Results: The mean respiration-induced tumor motion was 9.90 mm ± 7.86 mm, with a maximum of 25 mm. The number of bins required was a strong function of the spatial resolution and varied widely between patients. For example, for 2 mm bins, between 1-13 amplitude bins and 1-9 rate bins were required to encompass 100% of the breathing amplitude, while 1-6 amplitude bins and 1-3 rate bins were required to encompass 90% of the breathing amplitude. Conclusion: The strong relationship between the number of bins and the spatial resolution, as well as the large variation between patients, implies that time-independent radiation dose distribution simulations should be conducted using patient-specific data and that the breathing conditions will have to be carefully considered. This work will lead to the assessment of the
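
    The bin-count question reduces to simple arithmetic once the tumor displacements are known: covering a displacement range with bins of width w takes ceil(range / w) bins. A minimal sketch, assuming a hypothetical 1-D array of per-scan tumor displacements in millimetres:

    ```python
    import numpy as np

    def bins_required(displacements, width_mm, lo_pct=0.0, hi_pct=100.0):
        """Number of amplitude bins of the given width needed to cover the
        displacement range between two percentiles (use 5/95 to exclude
        breathing outliers, 0/100 for full coverage)."""
        lo, hi = np.percentile(displacements, [lo_pct, hi_pct])
        return max(1, int(np.ceil((hi - lo) / width_mm)))

    disp = np.array([0.0, 1.2, 2.8, 4.1, 5.5, 7.0, 8.3, 9.9, 12.5, 24.0])
    for w in range(1, 6):  # 1 mm to 5 mm bin widths, as in the study design
        print(w, bins_required(disp, w), bins_required(disp, w, 5, 95))
    ```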

  11. How to Optimize the Commission Allocation

    Directory of Open Access Journals (Sweden)

    Yi Lan

    2017-01-01

    In this paper, the commission allocation mechanism between the mobile application store and the mobile application developer is studied using game theory. First, based on non-cooperative game theory, the paper derives the equilibrium solution of the inter-firm game under different sales scales, and then extends the study to the setting of an infinitely repeated game. In addition, the paper analyzes the Pareto improvement achieved when the parties choose the cooperative strategy of a strategic alliance. Finally, the commission allocation problem is resolved.

  12. Allocation of Decommissioning and Waste Liabilities

    International Nuclear Information System (INIS)

    Varley, Geoff

    2011-11-01

    The work demonstrates that there are a number of methods available for cost allocation, and the pros and cons of each are examined. The study investigates potential proportional and incremental methods in some depth, and a recommendation in principle to use the latter methodology is given. It is concluded that a 'fair assumption' is that the potential allocation of costs for 'the RMA Leaching Hall' is probably small in relation to the total costs, estimated at not more than about 175 kSEK, plus any costs associated with the decommissioning and disposal of a number of small pieces of equipment added by the current operator.

  13. Type monotonic allocation schemes for multi-glove games

    OpenAIRE

    Brânzei, R.; Solymosi, T.; Tijs, S.H.

    2007-01-01

    Multiglove markets and corresponding games are considered. For this class of games we introduce the notion of type monotonic allocation scheme. Allocation rules for multiglove markets based on weight systems are introduced and characterized. These allocation rules generate type monotonic allocation schemes for multiglove games and are also helpful in proving that each core element of the corresponding game is extendable to a type monotonic allocation scheme. The T-value turns out to generate a ty...

  14. Enhancement of Edge-based Image Quality Measures Using Entropy for Histogram Equalization-based Contrast Enhancement Techniques

    Directory of Open Access Journals (Sweden)

    H. T. R. Kurmasha

    2017-12-01

    An edge-based image quality measure (IQM) technique for the assessment of histogram equalization (HE)-based contrast enhancement techniques has been proposed that outperforms the Absolute Mean Brightness Error (AMBE) and Entropy, which are the most commonly used IQMs to evaluate histogram equalization based techniques, as well as two prominent fidelity-based IQMs, the Multi-Scale Structural Similarity (MSSIM) and Information Fidelity Criterion-based (IFC) measures. The statistical evaluation results show that the edge-based IQM, which was designed for detecting noise artifact distortion, has a Pearson Correlation Coefficient (PCC) > 0.86, while the others have poor or fair correlation to human opinion, considering human visual perception (HVP). Based on HVP, this paper proposes an enhancement to the classic edge-based IQM by taking into account brightness saturation distortion, which is the most prominent distortion in HE-based contrast enhancement techniques. The enhanced measure is tested and found to correlate significantly well with human opinion (PCC > 0.87, Spearman rank order correlation coefficient (SROCC) > 0.92, Root Mean Squared Error (RMSE) < 0.1054, and Outlier Ratio (OR) = 0%).
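
    The agreement statistics quoted here (PCC, SROCC, RMSE and outlier ratio) are standard and easy to reproduce for any candidate IQM. A minimal sketch with scipy, assuming paired arrays of IQM scores and subjective scores, and one common definition of the outlier ratio:

    ```python
    import numpy as np
    from scipy import stats

    def iqm_agreement(iqm_scores, human_scores, outlier_sd=2.0):
        """Standard agreement statistics between an image-quality measure
        and subjective (human) opinion scores."""
        pcc, _ = stats.pearsonr(iqm_scores, human_scores)
        srocc, _ = stats.spearmanr(iqm_scores, human_scores)
        err = np.asarray(iqm_scores) - np.asarray(human_scores)
        rmse = np.sqrt(np.mean(err ** 2))
        # Outlier ratio: errors beyond a multiple of the error spread
        # (definitions vary across studies; this is one common choice).
        outlier_ratio = np.mean(np.abs(err) > outlier_sd * err.std())
        return {"PCC": pcc, "SROCC": srocc, "RMSE": rmse, "OR": outlier_ratio}

    rng = np.random.default_rng(3)
    human = rng.uniform(0, 1, 100)
    measure = human + rng.normal(0, 0.05, 100)  # a well-correlated IQM
    print({k: round(v, 3) for k, v in iqm_agreement(measure, human).items()})
    ```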

  15. Detection of simulated microcalcifications in fixed mammary tissue: An ROC study of the effect of local versus global histogram equalization.

    Science.gov (United States)

    Sund, T; Olsen, J B

    2006-09-01

    To investigate whether sliding window adaptive histogram equalization (SWAHE) of digital mammograms improves the detection of simulated calcifications, as compared to images normalized by global histogram equalization (GHE). Direct digital mammograms were obtained from mammary tissue phantoms superimposed with different frames. Each frame was divided into forty squares by a wire mesh, and contained granular calcifications randomly positioned in about 50% of the squares. Three radiologists read the mammograms on a display monitor. They classified their confidence in the presence of microcalcifications in each square on a scale of 1 to 5. Images processed with GHE were first read and used as a reference. In a later session, the same images processed with SWAHE were read. The results were compared using ROC methodology. When the total ROC areas Az were compared, the results were completely equivocal. When comparing the high-specificity partial ROC area Az,0.2 below a false-positive fraction (FPF) of 0.20, two of the three observers performed best with the images processed with SWAHE. The difference was not statistically significant. When the reader's confidence threshold in malignancy is set at a high level, increasing the contrast of mammograms with SWAHE may enhance the visibility of microcalcifications without adversely affecting the false-positive rate. When the reader's confidence threshold is set at a low level, the effect of SWAHE is an increase of false positives. Further investigation is needed to confirm the validity of the conclusions.
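
    Both processing steps compared in this study have widely available open-source counterparts. A sketch with scikit-image, using its contrast-limited adaptive histogram equalization (CLAHE) as a stand-in for the sliding-window method; SWAHE itself, and the study's window settings, are not part of skimage:

    ```python
    from skimage import data, exposure

    image = data.camera() / 255.0  # any grayscale image scaled to [0, 1]

    # Global histogram equalization (GHE): one transfer curve for the image.
    ghe = exposure.equalize_hist(image)

    # Adaptive equalization: local histograms per tile (CLAHE here, used as
    # a stand-in for sliding-window adaptive histogram equalization).
    ahe = exposure.equalize_adapthist(image, kernel_size=64, clip_limit=0.01)

    for name, img in [("original", image), ("GHE", ghe), ("adaptive", ahe)]:
        print(f"{name:9s} mean={img.mean():.3f} std={img.std():.3f}")
    ```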

  16. Properties of the histogram location approach and the extent and change of downward nominal wage rigidity in the EU

    Directory of Open Access Journals (Sweden)

    Andreas Behr

    2006-06-01

    The histogram location approach was proposed by Kahn (1997) to estimate the fraction of wage cuts prevented by downward nominal wage rigidity. In this paper, we analyze the validity of the approach by means of a simulation study, which yielded evidence of unbiasedness but also of potential underestimation of the rigidity parameter uncertainty, and therefore of potentially anticonservative inference. We apply the histogram location approach to estimate the extent of downward nominal wage rigidity across the EU for 1995-2001. Our data base is the User Data Base (UDB) of the European Community Household Panel (ECHP). The results show wide variation in the fraction of wage cuts prevented by nominal wage rigidity across the EU. The lowest rigidity parameters are found for the UK, Spain and Ireland, the largest for Portugal and Italy. Analyzing the change of rigidity between the sub-periods 1995-1997 and 1999-2001 even shows a widening of the differences in nominal wage rigidity. Given the finding of large differences across the EU, the results imply that the costs of low inflation policies differ substantially across the EU.

  17. Dose-volume histogram evaluation of prone and supine patient position in external beam radiotherapy for cervical and endometrial cancer

    International Nuclear Information System (INIS)

    Pinkawa, Michael; Gagel, Bernd; Demirel, Cengiz; Schmachtenberg, Axel; Asadpour, Branka; Eble, Michael J.

    2003-01-01

    Background and purpose: To evaluate the influence of patient positioning on dose-volume histograms of organs at risk in external beam radiotherapy for cervical and endometrial cancer. Materials and methods: In 20 patients scheduled for definitive (7) or postoperative (13) external beam radiotherapy of the pelvis, treatment planning CT scans were performed in supine and prone (belly board) positions. After volume definition of the target and organs at risk, treatment plans were calculated applying the four-field box technique. The dose-volume histograms of the organs at risk were compared. Results: Radiotherapy in the prone position causes a reduction of the bladder portion (mean 15%, p<0.001) and an increase of the rectum portion (mean 11%, p<0.001) within the 90% isodose. A reduction of the bowel portion could only be observed in postoperatively treated patients (mean 13%, p<0.001). In definitive radiotherapy the target volume increases in the supine position (mean 7%, p=0.02) due to an anterior tumour/uterus movement, so that bowel portions within the 90% isodose are similar. The bladder filling correlates with a reduction of bladder dose and, in postoperatively treated patients, of bowel dose. Conclusions: External beam radiotherapy of the pelvis should be performed in the prone position in postoperative patients because it offers the best bowel protection. Considering the additional HDR brachytherapy, rectum protection takes the highest priority in definitive treatment: the requirements are best met in the supine position. An adequate bladder filling is important to reduce the irradiated bladder and bowel volumes.
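
    A dose-volume histogram of the kind compared here is simply the cumulative distribution of dose over the voxels of a structure. A minimal numpy sketch; the dose grid, structure mask and dose levels are illustrative assumptions:

    ```python
    import numpy as np

    def cumulative_dvh(dose, mask, bin_gy=0.1):
        """Cumulative DVH: fraction of the structure receiving at least
        each dose level.

        dose : 3-D array of dose values in Gy
        mask : boolean array of the same shape selecting the organ at risk
        """
        d = dose[mask]
        levels = np.arange(0.0, d.max() + bin_gy, bin_gy)
        volume = np.array([(d >= lv).mean() for lv in levels])
        return levels, volume

    # Toy example: fraction of an organ-at-risk volume receiving >= 45 Gy.
    rng = np.random.default_rng(4)
    dose = rng.uniform(0.0, 55.0, size=(40, 40, 40))
    mask = np.zeros(dose.shape, dtype=bool)
    mask[10:20, 10:20, 10:20] = True
    levels, volume = cumulative_dvh(dose, mask)
    print(f"V(45 Gy) = {volume[np.searchsorted(levels, 45.0)]:.2f}")
    ```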

  18. ANALYSE DATA IN THE FORM OF HISTOGRAM WITH COMPARISON BETWEEN KOLMOGOROV-SMIRNOV TEST AND CHI SQUARE TEST

    CERN Document Server

    Pg Haji Mohd Ariffin, Ak Muhamad Amirul Irfan

    2015-01-01

    This paper presents the project I was tasked with while attending a three-month Summer Programme at CERN. The project specification is to analyse weekly data produced by the Compact Muon Solenoid (CMS) in the form of histograms. CMS is a multi-purpose detector operating at the Large Hadron Collider (LHC) at CERN, which yields head-on collisions of two proton (ion) beams of 7 TeV (2.75 TeV per nucleon) each, with a design luminosity of 10^34 cm^-2 s^-1. A comparison of the results is then made using two methods, namely the Kolmogorov-Smirnov statistical test and the chi-squared test, which are elaborated in the subsequent paragraphs. To execute this project, I first had to study basic computer programming, in particular C++ and the ROOT framework, to ensure the tasks could be completed within the given time. A program was subsequently written to produce histogram output and calculate the Kolmogorov-Smirnov test and Ch...
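
    Comparing two binned histograms with these two tests can be sketched in plain Python/scipy (the project itself used ROOT; the toy counts below are assumptions). The chi-square statistic is the standard two-sample form on bin counts, and the KS distance is taken between the normalized cumulative histograms, the usual approximation for binned data:

    ```python
    import numpy as np
    from scipy.stats import chi2

    def compare_histograms(h1, h2):
        """Two-sample chi-square and KS statistics for binned data.

        h1, h2 : arrays of bin counts over identical binning
        """
        h1, h2 = np.asarray(h1, float), np.asarray(h2, float)
        n1, n2 = h1.sum(), h2.sum()
        # Two-sample chi-square on bins with at least one entry; the
        # sqrt factors rescale for unequal total counts.
        k1, k2 = np.sqrt(n2 / n1), np.sqrt(n1 / n2)
        keep = (h1 + h2) > 0
        chi2_stat = np.sum((k1 * h1[keep] - k2 * h2[keep]) ** 2
                           / (h1[keep] + h2[keep]))
        p_chi2 = chi2.sf(chi2_stat, df=keep.sum() - 1)
        # KS distance between normalized cumulative histograms.
        ks_stat = np.max(np.abs(np.cumsum(h1) / n1 - np.cumsum(h2) / n2))
        return chi2_stat, p_chi2, ks_stat

    a = [12, 45, 80, 60, 22, 6]
    b = [10, 50, 75, 65, 25, 5]
    print(compare_histograms(a, b))
    ```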

  19. Experimental assessment of an automatic breast density classification algorithm based on principal component analysis applied to histogram data

    Science.gov (United States)

    Angulo, Antonio; Ferrer, Jose; Pinto, Joseph; Lavarello, Roberto; Guerrero, Jorge; Castaneda, Benjamín.

    2015-01-01

    Breast parenchymal density is considered a strong indicator of cancer risk. However, measures of breast density are often qualitative and require the subjective judgment of radiologists. This work proposes a supervised algorithm to automatically assign a BI-RADS breast density score to a digital mammogram. The algorithm applies principal component analysis to the histograms of a training dataset of digital mammograms to create four different spaces, one for each BI-RADS category. Scoring is achieved by projecting the histogram of the image to be classified onto the four spaces and assigning it to the closest class. In order to validate the algorithm, a training set of 86 images and a separate testing database of 964 images were built. All mammograms were acquired in the craniocaudal view from female patients without any visible pathology. Eight experienced radiologists categorized the mammograms according to a BI-RADS score, and the mode of their evaluations was considered as ground truth. Results show better agreement between the algorithm and ground truth for the training set (kappa=0.74) than for the test set (kappa=0.44), which suggests the method may be used for BI-RADS classification but that better training is required.
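
    The scheme described above, one PCA space per BI-RADS category with assignment to the closest class, can be sketched compactly. A minimal version with scikit-learn, where 'closest' is implemented as smallest reconstruction error; the component count and the synthetic data are assumptions:

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    class PerClassPCAClassifier:
        """Fit one PCA subspace per class; classify by reconstruction error."""

        def __init__(self, n_components=5):
            self.n_components = n_components

        def fit(self, histograms, labels):
            self.models = {}
            for c in np.unique(labels):
                pca = PCA(n_components=self.n_components)
                pca.fit(histograms[labels == c])
                self.models[c] = pca
            return self

        def predict(self, histograms):
            classes = list(self.models)
            # Distance to each class subspace = reconstruction error.
            err = np.stack([
                np.linalg.norm(
                    histograms - pca.inverse_transform(pca.transform(histograms)),
                    axis=1)
                for pca in self.models.values()])
            return np.array(classes)[err.argmin(axis=0)]

    # Toy data: 256-bin gray-level histograms for 4 density classes.
    rng = np.random.default_rng(5)
    X = rng.random((86, 256))
    y = rng.integers(1, 5, 86)
    clf = PerClassPCAClassifier().fit(X, y)
    print(clf.predict(X[:5]))
    ```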

  20. Adaptive resource allocation for efficient patient scheduling

    NARCIS (Netherlands)

    Vermeulen, Ivan B.; Bohte, Sander M.; Elkhuizen, Sylvia G.; Lameris, Han; Bakker, Piet J. M.; La Poutré, Han

    2009-01-01

    Efficient scheduling of patient appointments on expensive resources is a complex and dynamic task. A resource is typically used by several patient groups. To service these groups, resource capacity is often allocated per group, explicitly or implicitly. Importantly, due to fluctuations in demand,