Hamiltonian replica-exchange in GROMACS: a flexible implementation
Bussi, Giovanni
2013-01-01
A simple and general implementation of Hamiltonian replica exchange for the popular molecular-dynamics software GROMACS is presented. In this implementation, arbitrarily different Hamiltonians can be used for the different replicas without incurring any significant performance penalty. The implementation was validated on a simple toy model - alanine dipeptide in water - and applied to study the rearrangement of an RNA tetraloop, where it was used to compare recently proposed force-field corrections.
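The exchange step common to the H-REMD schemes collected here is a Metropolis test on the cross-evaluated potential energies of a pair of replicas. A minimal sketch, assuming both replicas run at the same temperature (the function name and unit-free energies are illustrative, not from the paper):

```python
import math

def hrex_accept(u_i_xi, u_i_xj, u_j_xi, u_j_xj, beta=1.0):
    """Metropolis acceptance probability for swapping the configurations
    of replicas i and j that run under different Hamiltonians U_i and U_j
    at the same inverse temperature beta.  u_a_xb is the potential energy
    of configuration x_b evaluated under Hamiltonian a."""
    delta = beta * ((u_i_xj + u_j_xi) - (u_i_xi + u_j_xj))
    if delta <= 0.0:
        return 1.0  # downhill swap: always accepted
    return math.exp(-delta)
```

Because only four single-point energies enter the criterion, arbitrarily different Hamiltonians can be exchanged with no cost beyond those cross-evaluations.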
Hamiltonian replica exchange molecular dynamics using soft-core interactions.
Hritz, Jozef; Oostenbrink, Chris
2008-04-14
To overcome the problem of insufficient conformational sampling in biomolecular simulations, we have developed a novel Hamiltonian replica exchange molecular dynamics (H-REMD) scheme that uses soft-core interactions between those parts of the system that contribute most to high energy barriers. The advantage of this approach over other H-REMD schemes is the possibility of using a relatively small number of replicas with locally larger differences between the individual Hamiltonians. Because soft-core potentials are almost identical to regular ones at longer distances, most interactions between atoms of the perturbed parts are only slightly changed. Rather, it is the strong repulsion between atoms that are close in space, which in many cases produces high energy barriers, that is weakened in the higher replicas of our proposed scheme. In addition to the soft-core interactions, we propose including multiple replicas that use the same Hamiltonian/level of softness. We tested the new protocol on the GTP and 8-Br-GTP molecules, which are known to have high energy barriers between the anti and syn conformations of the base with respect to the sugar moiety. During two 25 ns MD simulations of both systems, the transition from the more stable to the less stable (but still experimentally observed) conformation was not seen at all. Temperature REMD over 50 replicas for 1 ns likewise showed no transitions at room temperature. In contrast, more than 20 such transitions were observed in H-REMD using six replicas (at three different Hamiltonians) during 6.8 ns per replica for GTP, and 12 replicas (at six different Hamiltonians) during 8.7 ns per replica for 8-Br-GTP. The large increase in sampling efficiency was obtained from an optimized H-REMD scheme involving soft-core potentials, with multiple simulations using the same level of softness. The optimization of the scheme was performed by fast mimicking [J. Hritz and C. Oostenbrink, J. Chem. Phys. 127, 204104 (2007)].
Wang, Kai; Chodera, John D; Yang, Yanzhi; Shirts, Michael R
2013-12-01
We present a method to identify small molecule ligand binding sites and poses within a given protein crystal structure using GPU-accelerated Hamiltonian replica exchange molecular dynamics simulations. The Hamiltonians used vary from the physical end state of protein interacting with the ligand to an unphysical end state where the ligand does not interact with the protein. As replicas explore the space of Hamiltonians interpolating between these states, the ligand can rapidly escape local minima and explore potential binding sites. Geometric restraints keep the ligands from leaving the vicinity of the protein and an alchemical pathway designed to increase phase space overlap between intermediates ensures good mixing. Because of the rigorous statistical mechanical nature of the Hamiltonian exchange framework, we can also extract binding free energy estimates for all putative binding sites. We present results of this methodology applied to the T4 lysozyme L99A model system for three known ligands and one non-binder as a control, using an implicit solvent. We find that our methodology identifies known crystallographic binding sites consistently and accurately for the small number of ligands considered here and gives free energies consistent with experiment. We are also able to analyze the contribution of individual binding sites to the overall binding affinity. Our methodology points to near term potential applications in early-stage structure-guided drug discovery.
Ostermeir, Katja; Zacharias, Martin
2014-01-15
A Hamiltonian Replica-Exchange Molecular Dynamics (REMD) simulation method has been developed that employs a two-dimensional backbone and one-dimensional side chain biasing potential specifically to promote conformational transitions in peptides. To exploit the replica framework optimally, the level of the biasing potential in each replica was appropriately adapted during the simulations. This resulted in both high exchange rates between neighboring replicas and improved occupancy/flow of all conformers in each replica. The performance of the approach was tested on several peptide and protein systems and compared with regular MD simulations and previous REMD studies. Improved sampling of relevant conformational states was observed for unrestrained protein and peptide folding simulations as well as for refinement of a loop structure with restricted mobility of loop flanking protein regions.
Enhanced conformational sampling of carbohydrates by Hamiltonian replica-exchange simulation.
Mishra, Sushil Kumar; Kara, Mahmut; Zacharias, Martin; Koca, Jaroslav
2014-01-01
Knowledge of the structure and conformational flexibility of carbohydrates in an aqueous solvent is important to improving our understanding of how carbohydrates function in biological systems. In this study, we extend a variant of the Hamiltonian replica-exchange molecular dynamics (MD) simulation to improve the conformational sampling of saccharides in an explicit solvent. During the simulations, a biasing potential along the glycosidic-dihedral linkage between the saccharide monomer units in an oligomer is applied at various levels along the replica runs to enable effective transitions between various conformations. One reference replica runs under the control of the original force field. The method was tested on disaccharide structures and further validated on biologically relevant blood group B, Lewis X and Lewis A trisaccharides. The biasing potential-based replica-exchange molecular dynamics (BP-REMD) method provided a significantly improved sampling of relevant conformational states compared with standard continuous MD simulations, with modest computational costs. Thus, the proposed BP-REMD approach adds a new dimension to existing carbohydrate conformational sampling approaches by enhancing conformational sampling in the presence of solvent molecules explicitly at relatively low computational cost.
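The replica-dependent biasing along a glycosidic dihedral can be sketched as a level-scaled torsional term. The generic cosine form and parameter values below are illustrative assumptions, not the specific BP-REMD biasing potential of the paper:

```python
import math

def biased_dihedral_energy(phi, k=8.0, n=3, delta=0.0, level=0.0):
    """Illustrative torsional energy with a replica-dependent biasing
    level in [0, 1]: level = 0 is the unmodified force-field term
    (the reference replica), level = 1 removes the torsional barrier
    entirely.  phi and delta in radians, k in kJ/mol; the functional
    form is a generic cosine torsion, assumed for illustration."""
    v_orig = k * (1.0 + math.cos(n * phi - delta))
    return (1.0 - level) * v_orig
```

Intermediate levels along the replica ladder progressively flatten the glycosidic barrier, enabling the transitions between conformers that are exchanged back into the unbiased reference replica.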
Han, Nanyu; Mu, Yuguang
2013-01-01
Neuraminidase (NA) of influenza is a key target for antiviral inhibitors, and the 150-cavity in group-1 NA provides new insight for treating this disease. However, the NA of the 2009 pandemic influenza (09N1) was found to lack this cavity in a crystal structure. To address the flexibility of the 150-loop, Hamiltonian replica exchange molecular dynamics simulations were performed on NAs from different groups. The free energy landscape calculated from the volume of the 150-cavity indicates that 09N1 prefers open forms of the 150-loop. Turn A (residues 147-150) of the 150-loop emerges as the most dynamic motif, driving the inter-conversion of the loop among different conformations. Within turn A, the backbone dynamics of residue 149 are strongly correlated with the shape of the 150-loop and can thus serve as a marker for its conformation. In contrast, the closed conformation of the 150-loop is more energetically favorable in N2, one of the group-2 NAs. The D147-H150 salt bridge is found to have no correlation with the conformation of the 150-loop; instead, the intimate salt-bridge interaction between the 150- and 430-loops in the N2 variant provides the stabilizing factor for the closed form. Clustering analysis illustrates the structural plasticity of the loop. This enhanced-sampling simulation provides additional information for structure-based drug discovery against influenza virus.
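A free-energy landscape along a collective variable such as the 150-cavity volume follows from the sampled distribution via F(x) = -kT ln P(x). A toy histogram-based sketch (bin width, kT value, and function name are illustrative):

```python
import math
from collections import Counter

def free_energy_profile(samples, bin_width, kT=2.494):
    """Free-energy profile along a collective variable from a histogram
    of sampled values: F(x) = -kT ln P(x), shifted so that the global
    minimum is zero.  kT defaults to ~2.494 kJ/mol (300 K).  Returns a
    dict mapping the left bin edge to the relative free energy."""
    counts = Counter(int(x // bin_width) for x in samples)
    total = sum(counts.values())
    f = {b * bin_width: -kT * math.log(c / total) for b, c in counts.items()}
    fmin = min(f.values())
    return {x: v - fmin for x, v in f.items()}
```

With well-converged H-REMD sampling of the reference replica, such a profile distinguishes, e.g., open from closed 150-loop states by their relative free energies.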
Zacharias, Martin
2008-03-01
Coarse-grained elastic network models (ENM) of proteins can be used efficiently to explore the global mobility of a protein around a reference structure. A new Hamiltonian-replica exchange molecular dynamics (H-RexMD) method has been designed that effectively combines information extracted from an ENM analysis with atomic-resolution MD simulations. The ENM analysis is used to construct a distance-dependent penalty (flooding or biasing) potential that can drive the structure away from its current conformation in directions compatible with the ENM model. Various levels of the penalty or biasing potential are added to the force field description of the MD simulation along the replica coordinate. One replica runs at the original force field. By focusing the penalty potential on the relevant soft degrees of freedom the method avoids the rapid increase of the replica number with increasing system size to cover a desired temperature range in conventional (temperature) RexMD simulations. The application to domain motions in lysozyme of bacteriophage T4 and to peptide folding indicates significantly improved conformational sampling compared to conventional MD simulations.
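The ENM analysis underlying such a biasing potential can be illustrated with a standard Gaussian network model, whose low-frequency eigenvectors approximate the soft global motions that the penalty potential is designed to promote. This is a textbook GNM sketch, not the paper's exact ENM variant:

```python
import numpy as np

def gnm_modes(coords, cutoff=7.0):
    """Gaussian-network-model analysis: build the Kirchhoff
    (connectivity) matrix from C-alpha coordinates, with -1 for each
    pair within the cutoff (in Angstrom) and row sums on the diagonal,
    then diagonalize it.  The eigenvectors with the smallest nonzero
    eigenvalues describe the softest collective degrees of freedom."""
    n = len(coords)
    kirchhoff = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            if np.linalg.norm(coords[i] - coords[j]) <= cutoff:
                kirchhoff[i, j] = kirchhoff[j, i] = -1.0
    np.fill_diagonal(kirchhoff, -kirchhoff.sum(axis=1))
    evals, evecs = np.linalg.eigh(kirchhoff)  # ascending eigenvalues
    return evals, evecs
```

Focusing the biasing potential on these few soft modes is what lets the replica count stay small compared with temperature REMD, where the number of replicas grows rapidly with system size.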
Zeller, Fabian; Zacharias, Martin
2014-02-11
The accurate calculation of potentials of mean force (PMFs) for ligand-receptor binding is one of the most important applications of molecular simulation techniques. Typically, the separation distance between ligand and receptor is chosen as a reaction coordinate along which a PMF can be calculated with the aid of umbrella sampling (US) techniques. In addition, restraints can be applied on the relative position and orientation of the partner molecules to reduce the accessible phase space. An approach combining such phase-space reduction with flattening of the free energy landscape and configurational exchanges has been developed, which significantly improves the convergence of PMF calculations in comparison with standard umbrella sampling. The free energy surface along the reaction coordinate is smoothed by iteratively adapting biasing potentials corresponding to previously calculated PMFs. Configurations are allowed to exchange between the umbrella simulation windows via the Hamiltonian replica exchange method. The application to a DNA molecule in complex with a minor-groove-binding ligand indicates significantly improved convergence and complete reversibility of the sampling along the pathway. The calculated binding free energy is in excellent agreement with experimental results. In contrast, the application of standard US resulted in large differences between PMFs calculated for association and dissociation pathways. The approach could be a useful alternative to standard US for computational studies of biomolecular recognition processes.
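The combination of a harmonic umbrella window with an iteratively adapted flattening bias can be sketched as follows (function and parameter names are illustrative; the paper's exact update scheme is not reproduced):

```python
def umbrella_energy(xi, xi0, k_umb, pmf_estimate=None):
    """Total biasing energy in one umbrella window along the reaction
    coordinate xi: a harmonic restraint centred at xi0 plus, in the
    adaptive scheme sketched here, the negative of a previously
    estimated PMF.  Subtracting the PMF flattens the free-energy
    surface, so Hamiltonian replica exchanges between neighbouring
    windows are accepted more often.  pmf_estimate is a callable
    xi -> PMF(xi), or None on the first iteration."""
    bias = 0.5 * k_umb * (xi - xi0) ** 2
    if pmf_estimate is not None:
        bias -= pmf_estimate(xi)
    return bias
```

Each iteration refines pmf_estimate from the reweighted samples, and convergence is reached when the landscape seen by the replicas is nearly flat along the coordinate.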
Protein-ligand docking using hamiltonian replica exchange simulations with soft core potentials.
Luitz, Manuel P; Zacharias, Martin
2014-06-23
Molecular dynamics (MD) simulations in explicit solvent allow studying receptor-ligand binding processes including full flexibility of the binding partners and an explicit inclusion of solvation effects. However, in MD simulations, the search for an optimal ligand-receptor complex geometry is frequently trapped in locally stable non-native binding geometries. A Hamiltonian replica-exchange (H-REMD)-based protocol has been designed to enhance the sampling of putative ligand-receptor complexes. It is based on softening nonbonded ligand-receptor interactions along the replicas and one reference replica under the control of the original force field. The efficiency of the method has been evaluated on two receptor-ligand systems and one protein-peptide complex. Starting from misplaced initial docking geometries, the H-REMD method reached in each case the known binding geometry significantly faster than a standard MD simulation. The approach could also be useful to identify and evaluate alternative binding geometries in a given binding region with small relative differences in binding free energy.
Meng, Yilin; Dashti, Danial Sabri; Roitberg, Adrian E
2011-09-13
Alchemical free energy calculations play a very important role in the field of molecular modeling, and efforts have been made to improve their accuracy and precision. One such effort is to employ a Hamiltonian replica exchange molecular dynamics (H-REMD) method to enhance conformational sampling. In this paper, we demonstrate that the H-REMD method not only improves convergence in alchemical free energy calculations but can also be used to compute free energy differences directly via the free energy perturbation (FEP) algorithm. We show a direct mapping between the H-REMD and the usual FEP equations, which are then used directly to compute free energies. The H-REMD alchemical free energy calculation (replica exchange free energy perturbation, REFEP) was tested on predicting the pK(a) value of the buried Asp26 in thioredoxin. We compare the results of REFEP with thermodynamic integration (TI) and regular FEP simulations. REFEP calculations converged faster than those from TI and regular FEP simulations. The final predicted pK(a) value from the H-REMD simulation was also very accurate, only 0.4 pK(a) units above the experimental value. Utilizing the REFEP algorithm significantly improves conformational sampling, which in turn improves the convergence of alchemical free energy simulations.
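The FEP estimator referred to above is the Zwanzig relation, dG = -kT ln <exp(-dU/kT)>_0, averaged over configurations of the reference state; in REFEP the same formula is applied to energy differences collected across neighbouring H-REMD replicas. A minimal sketch with kT in kcal/mol at 300 K (values illustrative):

```python
import math

def fep_delta_g(delta_u_samples, kT=0.596):
    """Zwanzig free-energy perturbation estimate:
    dG = -kT ln < exp(-(U_1 - U_0) / kT) >_0,
    where delta_u_samples are energy differences U_1 - U_0 evaluated
    on configurations sampled in state 0.  kT defaults to ~0.596
    kcal/mol (300 K)."""
    avg = sum(math.exp(-du / kT) for du in delta_u_samples) / len(delta_u_samples)
    return -kT * math.log(avg)
```

The exponential average is dominated by rare low-dU configurations, which is precisely why the enhanced conformational sampling from H-REMD improves the estimator's convergence.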
Protein-Ligand Binding from Distancefield Distances and Hamiltonian Replica Exchange Simulations.
de Ruiter, Anita; Oostenbrink, Chris
2013-02-12
The calculation of protein-ligand binding free energies is an important goal in the field of computational chemistry. Applying path-sampling methods for this purpose involves calculating the associated potential of mean force (PMF) and gives insight into the binding free energy along the binding process. Without a priori knowledge about the binding path, sampling reversible binding can be difficult to achieve. To alleviate this problem, we introduce the distancefield (DF) as a reaction coordinate for such calculations. DF is a grid-based method in which the shortest distance between the binding site and a ligand is determined avoiding routes that pass through the protein. Combining this reaction coordinate with Hamiltonian replica exchange molecular dynamics (HREMD) allows for the reversible binding of the ligand to the protein. A comparison is made between umbrella sampling using regular distance restraints and HREMD with DF restraints to study aspirin binding to the protein phospholipase A2. Although the free energies of binding are similar for both methods, the increased sampling with HREMD has a significant influence on the shape of the PMF. A remarkable agreement between the calculated binding free energies from the PMF and the experimental estimate is obtained.
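The core of the distancefield idea, a grid distance that routes around the protein rather than through it, can be sketched with a breadth-first search on a voxel grid. This is a simplified sketch; the published method uses a finer grid and further details assumed absent here:

```python
from collections import deque

def distancefield(grid, start):
    """Grid-based shortest-path distances from a binding-site cell.
    grid is a 2D list where 1 marks impassable (protein interior)
    cells and 0 marks solvent; start is a (row, col) tuple.  BFS
    yields, for every reachable cell, the length of the shortest
    path that avoids the protein, unlike a plain Euclidean distance."""
    rows, cols = len(grid), len(grid[0])
    dist = {start: 0}
    queue = deque([start])
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in dist:
                dist[(nr, nc)] = dist[(r, c)] + 1
                queue.append((nr, nc))
    return dist
```

Restraining the ligand on this distance rather than on the direct protein-ligand separation is what allows reversible unbinding and rebinding paths in the HREMD simulations.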
Roe, Daniel R; Bergonzo, Christina; Cheatham, Thomas E
2014-04-03
Many problems studied via molecular dynamics require accurate estimates of various thermodynamic properties, such as the free energies of different states of a system, which in turn requires well-converged sampling of the ensemble of possible structures. Enhanced sampling techniques are often applied to provide faster convergence than is possible with traditional molecular dynamics simulations. Hamiltonian replica exchange molecular dynamics (H-REMD) is a particularly attractive method, as it allows the incorporation of a variety of enhanced sampling techniques through modifications to the various Hamiltonians. In this work, we study the enhanced sampling of the RNA tetranucleotide r(GACC) provided by H-REMD combined with accelerated molecular dynamics (aMD), where a boosting potential is applied to torsions, and compare this to the enhanced sampling provided by H-REMD in which torsion potential barrier heights are scaled down to lower force constants. We show that H-REMD and multidimensional REMD (M-REMD) combined with aMD does indeed enhance sampling for r(GACC), and that the addition of the temperature dimension in the M-REMD simulations is necessary to efficiently sample rare conformations. Interestingly, we find that the rate of convergence can be improved in a single H-REMD dimension by simply increasing the number of replicas from 8 to 24 without increasing the maximum level of bias. The results also indicate that factors beyond replica spacing, such as round trip times and time spent at each replica, must be considered in order to achieve optimal sampling efficiency.
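The torsional boosting potential used in the aMD dimension follows the standard Hamelberg et al. form: below a threshold energy E the landscape is raised, flattening barriers, while regions above E are untouched. A minimal sketch in arbitrary energy units:

```python
def amd_boost(v, e_thresh, alpha):
    """Accelerated-MD boost potential (standard aMD form), applied
    here to dihedral energies: when the instantaneous torsional
    energy v lies below the threshold e_thresh, the boost
    dV = (E - v)^2 / (alpha + E - v) is added, smoothing minima and
    lowering barriers; alpha controls how aggressively the landscape
    is flattened.  Above the threshold the boost is zero."""
    if v >= e_thresh:
        return 0.0
    diff = e_thresh - v
    return diff * diff / (alpha + diff)
```

In the H-REMD ladder described above, replicas differ in the strength of this boost (or in the scaled torsion barriers), with the unbiased Hamiltonian as the reference replica.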
Vreede, Jocelyne; Wolf, Maarten G; de Leeuw, Simon W; Bolhuis, Peter G
2009-05-07
Hydrogen bonds play an important role in stabilizing (meta-)stable states in protein folding. Hence, they can potentially be used as a way to bias these states in molecular simulation methods. Previously, Wolf et al. showed that applying repulsive and attractive hydrogen bond biasing potentials in an alternating way significantly accelerates the folding process (Wolf, M. G.; de Leeuw, S. W. Biophys. J. 2008, 94, 3742). As the biasing potentials are only active during a fixed time interval, this alternating scheme does not represent a thermodynamic equilibrium. In this work, we present a Hamiltonian replica exchange molecular dynamics (REMD) scheme that aims to shuffle and reorder hydrogen bonds in the protein backbone. We therefore apply adapted hydrogen bond potentials in a Hamiltonian REMD scheme, which we call hydrogen bond switching (HS). To compare the performance of the HS to a standard REMD method, we performed HS and temperature REMD simulations of a beta-heptapeptide in methanol. Both methods sample the conformational space to a similar extent. As the HS simulation required only five replicas, while the REMD simulation required 20 replicas, the HS method is significantly more efficient. We tested the HS method also on a larger system, 16-residue polyalanine in water. Both of the simulations starting from a completely unfolded and a folded conformation resulted in an ensemble with, apart from the starting structure, similar conformational minima. We can conclude that the HS method provides an efficient way to sample the conformational space of a protein, without requiring knowledge of the folded states beforehand. In addition, these simulations revealed that convergence was hampered by replicas having a preference for specific biasing potentials. As this sorting effect is inherent to any Hamiltonian REMD method, finding a solution will result in an additional increase in the efficiency of Hamiltonian REMD methods in general.
Mentes, Ahmet; Deng, Nan-Jie; Vijayan, R S K; Xia, Junchao; Gallicchio, Emilio; Levy, Ronald M
2016-05-10
Molecular dynamics modeling of complex biological systems is limited by finite simulation time. The simulations are often trapped close to local energy minima separated by high energy barriers. Here, we introduce Hamiltonian replica exchange (H-REMD) with torsional flattening in the Binding Energy Distribution Analysis Method (BEDAM), to reduce energy barriers along torsional degrees of freedom and accelerate sampling of intramolecular degrees of freedom relevant to protein-ligand binding. The method is tested on a standard benchmark (T4 Lysozyme/L99A/p-xylene complex) and on a library of HIV-1 integrase complexes derived from the SAMPL4 blind challenge. We applied the torsional flattening strategy to 26 of the 53 known binders to the HIV Integrase LEDGF site found to have a binding energy landscape funneled toward the crystal structure. We show that our approach samples the conformational space more efficiently than the original method without flattening when starting from a poorly docked pose with incorrect ligand dihedral angle conformations. In these unfavorable cases convergence to a binding pose within 2-3 Å from the crystallographic pose is obtained within a few nanoseconds of the Hamiltonian replica exchange simulation. We found that torsional flattening is insufficient in cases where trapping is due to factors other than torsional energy, such as the formation of incorrect intramolecular hydrogen bonds and stacking. Work is in progress to generalize the approach to handle these cases and thereby make it more widely applicable.
Guardiani, Carlo; Procacci, Piero
2013-06-21
The inhibitors of the Tumor Necrosis Factor-α Converting Enzyme (TACE) represent promising tools for the treatment of rheumatoid arthritis, multiple sclerosis, and other autoimmune diseases. In this work, using Hamiltonian replica exchange molecular dynamics simulations and an atomistic force field, we perform an accurate structural characterization of a group of tartrate-based inhibitors. The simulations highlight a correlation between the conformational landscape in bulk solvent and inhibition potency. Since the structures in bulk solvent are much more compact than the crystallographic bound state, we formulate the hypothesis of a two-step docking mechanism: (i) formation of an intermediate between the compact, hydroxyl-exposing conformations in solution and the catalytic zinc ion; (ii) structural rearrangement of the zinc-tethered drug in the active site of TACE into the final binding conformation.
Ostermeir, Katja; Zacharias, Martin
2014-12-01
Coarse-grained elastic network models (ENM) of proteins offer a low-resolution representation of protein dynamics and directions of global mobility. A Hamiltonian-replica exchange molecular dynamics (H-REMD) approach has been developed that combines information extracted from an ENM analysis with atomistic explicit solvent MD simulations. Based on a set of centers representing rigid segments (centroids) of a protein, a distance-dependent biasing potential is constructed by means of an ENM analysis to promote and guide centroid/domain rearrangements. The biasing potentials are added with different magnitude to the force field description of the MD simulation along the replicas with one reference replica under the control of the original force field. The magnitude and the form of the biasing potentials are adapted during the simulation based on the average sampled conformation to reach a near constant biasing in each replica after equilibration. This allows for canonical sampling of conformational states in each replica. The application of the methodology to a two-domain segment of the glycoprotein 130 and to the protein cyanovirin-N indicates significantly enhanced global domain motions and improved conformational sampling compared with conventional MD simulations.
Jiang, Wei; Roux, Benoît
2010-07-01
Free Energy Perturbation with Replica Exchange Molecular Dynamics (FEP/REMD) offers a powerful strategy to improve the convergence of free energy computations. In particular, it has been shown previously that a FEP/REMD scheme allowing random moves within an extended replica ensemble of thermodynamic coupling parameters "lambda" can improve the statistical convergence in calculations of absolute binding free energy of ligands to proteins [J. Chem. Theory Comput. 2009, 5, 2583]. In the present study, FEP/REMD is extended and combined with an accelerated MD simulation method based on Hamiltonian replica-exchange MD (H-REMD) to overcome the additional problems arising from the existence of kinetically trapped conformations within the protein receptor. In the combined strategy, each system with a given thermodynamic coupling factor lambda in the extended ensemble is further coupled with a set of replicas evolving on a biased energy surface with boosting potentials used to accelerate the inter-conversion among different rotameric states of the side chains in the neighborhood of the binding site. Exchanges are allowed to occur alternately along the axes corresponding to the thermodynamic coupling parameter lambda and the boosting potential, in an extended dual array of coupled lambda- and H-REMD simulations. The method is implemented on the basis of new extensions to the REPDSTR module of the biomolecular simulation program CHARMM. As an illustrative example, the absolute binding free energy of p-xylene to the nonpolar cavity of the L99A mutant of T4 lysozyme was calculated. The tests demonstrate that the dual lambda-REMD and H-REMD simulation scheme greatly accelerates the configurational sampling of the rotameric states of the side chains around the binding pocket, thereby improving the convergence of the FEP computations.
Meli, Massimiliano; Colombo, Giorgio
2013-06-06
Herein, we present a novel Hamiltonian replica exchange protocol for classical molecular dynamics simulations of protein folding/unfolding. The scheme starts from the analysis of the energy-networks responsible for the stabilization of the folded conformation, by means of the energy-decomposition approach. In this framework, the compact energetic map of the native state is generated by a preliminary short molecular dynamics (MD) simulation of the protein in explicit solvent. This map is simplified by means of an eigenvalue decomposition. The highest components of the eigenvector associated with the lowest eigenvalue indicate which sites, named "hot spots", are likely to be responsible for the stability and correct folding of the protein. In the Hamiltonian replica exchange protocol, we use modified force-field parameters to treat the interparticle non-bonded potentials of the hot spots within the protein and between protein and solvent atoms, leaving unperturbed those relative to all other residues, as well as solvent-solvent interactions. We show that it is possible to reversibly simulate the folding/unfolding behavior of two test proteins, namely Villin HeadPiece HP35 (35 residues) and Protein A (62 residues), using a limited number of replicas. We next discuss possible implications for the study of folding mechanisms via all atom simulations.
Fan, Hao; Periole, Xavier; Mark, Alan E
2012-07-01
The efficiency of using a variant of Hamiltonian replica-exchange molecular dynamics (Chaperone H-replica-exchange molecular dynamics [CH-REMD]) for the refinement of protein structural models generated de novo is investigated. In CH-REMD, the interaction between the protein and its environment, specifically, the electrostatic interaction between the protein and the solvating water, is varied leading to cycles of partial unfolding and refolding mimicking some aspects of folding chaperones. In 10 of the 15 cases examined, the CH-REMD approach sampled structures in which the root-mean-square deviation (RMSD) of secondary structure elements (SSE-RMSD) with respect to the experimental structure was more than 1.0 Å lower than the initial de novo model. In 14 of the 15 cases, the improvement was more than 0.5 Å. The ability of three different statistical potentials to identify near-native conformations was also examined. Little correlation between the SSE-RMSD of the sampled structures with respect to the experimental structure and any of the scoring functions tested was found. The most effective scoring function tested was the DFIRE potential. Using the DFIRE potential, the SSE-RMSD of the best scoring structures was on average 0.3 Å lower than the initial model. Overall the work demonstrates that targeted enhanced-sampling techniques such as CH-REMD can lead to the systematic refinement of protein structural models generated de novo but that improved potentials for the identification of near-native structures are still needed.
To investigate the dose difference by respiratory motion using overlap volume histogram
Energy Technology Data Exchange (ETDEWEB)
Shin, Dong Seok; Kang, Seong Hee; Kim, Dong Su; Kim, Tae Ho; Kim, Kyeong Hyeon; Cho, Min Seok; Suh, Tae Suk [Dept. of Biomedical Engineering, Research Institute of Biomedical Engineering, College of Medicine, The Catholic University of Korea, Seoul (Korea, Republic of)
2015-04-15
Four-dimensional dose (4D dose), which uses a deformable image registration (DIR) algorithm on four-dimensional computed tomography (4DCT) images, can account for the dosimetric effects of respiratory motion. The difference between the 3D dose and the 4D dose can vary according to the geometrical relationship between a planning target volume (PTV) and an organ at risk (OAR). The purpose of this study is to investigate the dose difference between 3D and 4D dose using the overlap volume histogram (OVH), an indicator that quantifies the geometrical relationship between a PTV and an OAR. The tendency of the dose-difference variation was verified according to the OVH. The OVH appears to be an indicator with the potential to predict the dose difference between 4D and 3D dose.
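The OVH concept, for each expansion distance the fraction of OAR volume falling inside the PTV expanded by that distance, can be sketched with a brute-force voxel computation. Names and the point-based representation are illustrative, not the clinical implementation:

```python
def overlap_volume_histogram(oar_points, ptv_points, expansions):
    """Toy overlap-volume histogram: for each expansion distance r,
    the fraction of OAR voxels lying within distance r of any PTV
    voxel, i.e. inside the PTV uniformly expanded by r.  Points are
    (x, y, z) tuples on a voxel grid; the OVH curve as a function of
    r encodes how close the OAR sits to the target."""
    def min_dist(p):
        return min(((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
                    + (p[2] - q[2]) ** 2) ** 0.5 for q in ptv_points)
    dists = [min_dist(p) for p in oar_points]
    return {r: sum(d <= r for d in dists) / len(dists) for r in expansions}
```

An OAR whose OVH rises at small r sits close to the PTV, which is the regime where the study observed the largest 3D/4D dose differences.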
Energy Technology Data Exchange (ETDEWEB)
Shin, D; Kang, S; Kim, D; Kim, T; Kim, K; Cho, M; Suh, T [The Catholic University of Korea, Seoul (Korea, Republic of)
2015-06-15
Purpose: The dose difference between the three-dimensional dose (3D dose) and the 4D dose, which accounts for respiratory motion, can vary according to the geometrical relationship between the planning target volume (PTV) and an organ at risk (OAR). The purpose of this study is to investigate the dose difference between 3D and 4D dose using the overlap volume histogram (OVH), an indicator that quantifies the geometrical relationship between a PTV and an OAR. Methods: Five liver cancer patients previously treated with stereotactic body radiotherapy (SBRT) were investigated. Four-dimensional computed tomography (4DCT) images were acquired for all patients, and ITV-based treatment planning was performed. The 3D dose was calculated on the end-exhale phase image as the reference phase image. 4D dose accumulation was implemented from all phase images using a dose-warping technique based on the deformable image registration (DIR) algorithm (Horn and Schunck optical flow) in DIRART. The OVH was used to quantify the geometrical relationship between the PTV and a selected OAR; it was generated for each patient case and compared across all cases. The dose difference between 3D and 4D dose for the normal organ was calculated and compared across all cases according to the OVH. Results: The 3D and 4D dose difference for the OAR was analyzed using dose-volume histograms (DVH). At the point where 10% of the OAR volume overlapped with the expanded PTV, the mean dose difference was 34.56% in the case with the minimum OVH distance and 13.36% in the case with the maximum OVH distance. As the OVH distance increased, the mean dose difference between 4D and 3D dose decreased. Conclusion: The tendency of the dose difference to vary with the OVH was verified. The OVH appears to be an indicator with the potential to predict the dose difference between 4D and 3D dose. This work was supported by the Radiation Technology R&D program (No. 2013M2A2A7043498) and the Mid-career Researcher Program (2014R1A2A1A10050270) through
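The OVH described above records, for each expansion distance, what fraction of the OAR volume falls inside the PTV expanded by that distance. A minimal sketch of that geometry metric (illustrative only; the function name is our own, distances are in voxel units, and NumPy/SciPy are assumed):

```python
import numpy as np
from scipy import ndimage

def overlap_volume_histogram(ptv_mask, oar_mask, distances):
    """Fraction of OAR volume inside the PTV expanded by each distance.

    ptv_mask, oar_mask: boolean 3D arrays on the same voxel grid.
    distances: 1D array of expansion distances in voxel units.
    """
    # Distance of every voxel to the nearest PTV voxel (0 inside the PTV).
    d_to_ptv = ndimage.distance_transform_edt(~ptv_mask)
    oar_d = d_to_ptv[oar_mask]
    return np.array([(oar_d <= r).mean() for r in distances])

# Toy example: cubic PTV and a nearby slab-shaped OAR.
ptv = np.zeros((40, 40, 40), dtype=bool)
ptv[15:25, 15:25, 15:25] = True
oar = np.zeros((40, 40, 40), dtype=bool)
oar[15:25, 15:25, 27:35] = True
ovh = overlap_volume_histogram(ptv, oar, np.arange(0, 15))
```

The resulting curve is nondecreasing; a larger "OVH distance" (how far the PTV must be expanded before it engulfs the OAR) corresponds to a more favorable geometry.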
Vreede, J.; Wolf, M.G.; de Leeuw, S.W.; Bolhuis, P.G.
2009-01-01
Hydrogen bonds play an important role in stabilizing (meta-)stable states in protein folding. Hence, they can potentially be used as a way to bias these states in molecular simulation methods. Previously, Wolf et al. showed that applying repulsive and attractive hydrogen bond biasing potentials in a
Cukier, Robert I
2011-01-28
Leucine zippers consist of alpha helical monomers dimerized (or oligomerized) into alpha superhelical structures known as coiled coils. Forming the correct interface of a dimer from its monomers requires an exploration of configuration space focused on the side chains of one monomer that must interdigitate with sites on the other monomer. The aim of this work is to generate good interfaces in short simulations starting from separated monomers. Methods are developed to accomplish this goal based on an extension of a previously introduced [Su and Cukier, J. Phys. Chem. B 113, 9595, (2009)] hamiltonian temperature replica exchange method (HTREM), which scales the hamiltonian in both potential and kinetic energies that was used for the simulation of dimer melting curves. The new method, HTREM_MS (MS designates mean square), focused on interface formation, adds restraints to the hamiltonians for all but the physical system, which is characterized by the normal molecular dynamics force field at the desired temperature. The restraints in the nonphysical systems serve to prevent the monomers from separating too far, and have the dual aims of enhancing the sampling of close in configurations and breaking unwanted correlations in the restrained systems. The method is applied to a 31-residue truncation of the 33-residue leucine zipper (GCN4-p1) of the yeast transcriptional activator GCN4. The monomers are initially separated by a distance that is beyond their capture length. HTREM simulations show that the monomers oscillate between dimerlike and monomerlike configurations, but do not form a stable interface. HTREM_MS simulations result in the dimer interface being faithfully reconstructed on a 2 ns time scale. A small number of systems (one physical and two restrained with modified potentials and higher effective temperatures) are sufficient. 
An in silico mutant that should not dimerize because it lacks charged residues that provide electrostatic stabilization of the dimer does not with HTREM_MS, giving confidence in the method. The interface formation time scale is sufficiently short that using HTREM_MS as a screening tool to validate leucine zipper design methods may be feasible.
Jo, Sunhwan; Chipot, Christophe; Roux, Benoît
2015-05-12
The performance and accuracy of different simulation schemes for estimating the entropy inferred from free energy calculations are tested. The results obtained from replica-exchange molecular dynamics (REMD) simulations based on a simplified toy model are compared to exact numerically derived ones to assess accuracy and convergence. It is observed that the error in entropy estimation decreases by at least an order of magnitude and the quantities of interest converge much faster when the simulations are coupled via a temperature REMD algorithm and the trajectories from different temperatures are combined. Simulations with the infinite-swapping method and its variants show some improvement over the traditional nearest-neighbor REMD algorithms, but they are more computationally expensive. To test the methodologies further, the free energy profile for the reversible association of two methane molecules in explicit water was calculated and decomposed into its entropic and enthalpic contributions. Finally, a strategy based on umbrella sampling computations carried out via simultaneous temperature and Hamiltonian REMD simulations is shown to yield the most accurate entropy estimation. The entropy profile between the two methane molecules displays the characteristic signature of a hydrophobic interaction.
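The entropy decomposition described above rests on the thermodynamic identity S = -dG/dT and H = G + TS. A minimal finite-difference sketch (illustrative only; function names are our own) applied to free-energy profiles obtained at two REMD temperatures:

```python
import numpy as np

def entropy_enthalpy_decomposition(g_lo, g_hi, t_lo, t_hi):
    """Finite-difference decomposition of a free-energy profile G(x; T).

    g_lo, g_hi: free-energy profiles on the same grid at t_lo < t_hi.
    Returns (S, H) at the mean temperature, using S = -dG/dT and H = G + T*S.
    """
    t_mid = 0.5 * (t_lo + t_hi)
    s = -(g_hi - g_lo) / (t_hi - t_lo)
    g_mid = 0.5 * (g_lo + g_hi)
    return s, g_mid + t_mid * s

# Toy profile with a purely entropic barrier: G(x; T) = T * f(x),
# so the decomposition should return S = -f and H = 0 everywhere.
x = np.linspace(0.0, 1.0, 50)
f = np.sin(np.pi * x) ** 2
s, h = entropy_enthalpy_decomposition(300.0 * f, 320.0 * f, 300.0, 320.0)
```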
Curuksu, Jeremy; Zacharias, Martin
2009-03-14
Although molecular dynamics (MD) simulations have been applied frequently to study flexible molecules, the sampling of conformational states separated by barriers is limited due to currently possible simulation time scales. Replica-exchange (Rex)MD simulations that allow for exchanges between simulations performed at different temperatures (T-RexMD) can achieve improved conformational sampling. However, in the case of T-RexMD the computational demand grows rapidly with system size. A Hamiltonian RexMD method that specifically enhances coupled dihedral angle transitions has been developed. The method employs added biasing potentials as replica parameters that destabilize available dihedral substates and was applied to study coupled dihedral transitions in nucleic acid molecules. The biasing potentials can be either fixed at the beginning of the simulation or optimized during an equilibration phase. The method was extensively tested and compared to conventional MD simulations and T-RexMD simulations on an adenine dinucleotide system and on a DNA abasic site. The biasing potential RexMD method showed improved sampling of conformational substates compared to conventional MD simulations similar to T-RexMD simulations but at a fraction of the computational demand. It is well suited to study systematically the fine structure and dynamics of large nucleic acids under realistic conditions including explicit solvent and ions and can be easily extended to other types of molecules.
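Hamiltonian replica-exchange schemes such as the one above all share the same exchange step: configurations of two replicas are swapped with a Metropolis probability built from the cross-evaluated potentials. A minimal sketch (function and argument names are our own) for two replicas with different Hamiltonians at the same temperature:

```python
import numpy as np

def hrex_swap_prob(u_i_xi, u_i_xj, u_j_xi, u_j_xj, beta=1.0):
    """Metropolis acceptance probability for exchanging configurations
    between replicas i and j that run different Hamiltonians U_i, U_j
    at the same inverse temperature beta.

    u_a_xb is the potential of Hamiltonian a evaluated on configuration x_b.
    """
    delta = beta * ((u_i_xj + u_j_xi) - (u_i_xi + u_j_xj))
    return min(1.0, np.exp(-delta))

# A biasing potential that destabilizes the configuration it currently
# holds favors the swap (delta < 0 gives acceptance probability 1).
p = hrex_swap_prob(u_i_xi=2.0, u_i_xj=1.0, u_j_xi=0.5, u_j_xj=1.0)
# An unfavorable swap is accepted only with Boltzmann probability.
p2 = hrex_swap_prob(u_i_xi=1.0, u_i_xj=2.0, u_j_xi=1.0, u_j_xj=0.5)
```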
2010-10-21
transformation, the hybrid potential V_AB remains well-defined. Therefore, the associated reversible work can be computed. This reversible work formally ... corresponds to changing the potential of the system V_AB from the state that represents the original molecule A to the state that corresponds to molecule ... constructed. Typically, the hybrid Hamiltonian of the system (H_AB) that includes the potential V_AB is linearly interpolated between the end points, molecules
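The linear interpolation of the hybrid Hamiltonian mentioned in this fragment, and the reversible work it yields via thermodynamic integration, can be sketched as follows (toy numbers; the function name and end-state values are our own):

```python
import numpy as np

def hybrid_potential(v_a, v_b, lam):
    """Linearly interpolated hybrid potential between end states:
    V_AB(lambda) = (1 - lambda) * V_A + lambda * V_B."""
    return (1.0 - lam) * v_a + lam * v_b

# Thermodynamic integration: W = integral over lambda of <dV_AB/dlambda>.
# For a linear interpolation, dV_AB/dlambda = V_B - V_A.
lams = np.linspace(0.0, 1.0, 11)
v_a, v_b = 2.0, 5.0                    # toy end-state potentials
dv = np.full_like(lams, v_b - v_a)     # constant integrand in this toy case
# Trapezoidal quadrature over the lambda grid.
work = float(np.sum(0.5 * (dv[1:] + dv[:-1]) * np.diff(lams)))
```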
Directory of Open Access Journals (Sweden)
Ting Song
2014-03-01
Purpose: We developed a plan-quality classification model to assess the IMRT plan quality of prostate cancer patients for automatic plan quality control. Methods: For hollow organs such as the rectum and bladder, the dose-wall histogram (DWH) is used in our institution to evaluate OAR dose sparing. Correspondingly, we propose a new descriptor, the overlap-wall histogram (OWH), to describe the complex spatial relationship between the PTV and a hollow organ. Two metrics calculated from the OWH and DWH are introduced to quantitatively evaluate, respectively, the difficulty of the patient geometry for planning and the plan quality in terms of OAR sparing. A linear correlation between these two metrics was observed after plotting the plan-quality metric as a function of the geometry-difficulty metric for a database of prostate cases treated in our institution with acceptable plan quality. A fitted line was therefore built, acting as the boundary between high-quality and poor-quality plans: a query plan falling above the boundary is assessed as high quality, and otherwise as poor quality. Results: 15 prostate IMRT plans were used to test our model. One was identified as poor quality and the others as common-level. After re-planning all plans, the dose constraints for the bladder wall W75 (percentage of the wall receiving more than 75 Gy), W70, W65 and W60 could be reduced by 3.34%, 3%, 6.99% and 6.54% for the poor-quality plan, and by 1.11%, 0.95%, 1.45% and 1.81% on average for the common-level group, without sacrificing PTV coverage or rectum dose sparing. Conclusion: An effective model was built to provide automatic IMRT plan quality control by evaluating hollow-OAR dose sparing for prostate cancer patients. Furthermore, for a query plan of poor quality, the potential improvement in plan quality can be estimated, and a good reference plan with similar or harder geometry can be automatically chosen from our database to help guide re-planning if necessary.
Minh, David D L
2015-01-01
A binding potential of mean force (BPMF) is a free energy of noncovalent association in which one binding partner is flexible and the other is rigid. I have developed a method to calculate BPMFs for protein-ligand systems. The method is based on replica exchange sampling from multiple thermodynamic states at different temperatures and protein-ligand interaction strengths. Protein-ligand interactions are represented by interpolating precomputed electrostatic and van der Waals grids. Using a simple estimator for thermodynamic length, thermodynamic states are initialized at approximately equal intervals. The method is demonstrated on the Astex diverse set, a database of 85 protein-ligand complexes relevant to pharmacy or agriculture. Fifteen independent simulations of each complex were started using poses from crystallography, docking, or the lowest-energy pose observed in the other simulations. Benchmark simulations completed within three days on a single processor. Overall, protocols initialized using the ther...
Color Histogram Diffusion for Image Enhancement
Kim, Taemin
2011-01-01
Various color histogram equalization (CHE) methods have been proposed to extend grayscale histogram equalization (GHE) to color images. In this paper a new method called histogram diffusion, which extends the GHE method to arbitrary dimensions, is proposed. Ranges in a histogram are specified as overlapping bars of uniform height and variable width proportional to their frequencies; this diagram is called the vistogram. As an alternative approach to GHE, the squared error of the vistogram from the uniform distribution is minimized. Each bar in the vistogram is approximated by a Gaussian function, and the Gaussian particles in the vistogram diffuse as a nonlinear autonomous system of ordinary differential equations. CHE results on color images showed that the approach is effective.
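The GHE baseline that such CHE methods generalize maps each gray level through the normalized cumulative histogram. A minimal sketch (function name is our own; assumes 8-bit images and NumPy):

```python
import numpy as np

def equalize(gray):
    """Classical grayscale histogram equalization (GHE) via the CDF.

    gray: 2D uint8 array; returns a uint8 array with a flattened histogram.
    """
    hist = np.bincount(gray.ravel(), minlength=256)
    cdf = np.cumsum(hist).astype(float)
    cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())  # normalize to [0, 1]
    lut = np.round(255 * cdf).astype(np.uint8)         # per-level lookup table
    return lut[gray]

# Low-contrast ramp image confined to gray levels [100, 150].
img = np.tile(np.linspace(100, 150, 64).astype(np.uint8), (64, 1))
out = equalize(img)
```

After equalization the output spans nearly the full [0, 255] range, which is exactly the contrast stretch GHE is meant to deliver.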
Structure Size Enhanced Histogram
Wesarg, Stefan; Kirschner, Matthias
Direct volume visualization requires the definition of transfer functions (TFs) for the assignment of opacity and color. Multi-dimensional TFs are based on at least two image properties, and are specified by means of 2D histograms. In this work we propose a new type of a 2D histogram which combines gray value with information about the size of the structures. This structure size enhanced (SSE) histogram is an intuitive approach for representing anatomical features. Clinicians — the users we are focusing on — are much more familiar with selecting features by their size than by their gradient magnitude value. As a proof of concept, we employ the SSE histogram for the definition of two-dimensional TFs for the visualization of 3D MRI and CT image data.
Fast tracking using edge histograms
Rokita, Przemyslaw
1997-04-01
This paper proposes a new algorithm for tracking objects and object boundaries. The algorithm was developed and applied in a system for compositing computer-generated images and real-world video sequences, but it can be applied in general to all tracking systems where accuracy and high processing speed are required. The algorithm is based on the analysis of histograms obtained by summing, along chosen axes, the pixels of edge-segmented images. Edge segmentation is done by spatial convolution using a gradient operator; the advantage of this approach is that it can be performed in real time using commercially available hardware convolution filters. After edge extraction and histogram computation, the respective positions of the maxima in the edge-intensity histograms of the current and previous frames are compared and matched. The resulting information about the displacement of the histogram maxima can be converted directly into information about changes of the target boundary positions along the chosen axes.
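The projection-histogram idea above can be sketched in a few lines (illustrative only; a simple finite-difference gradient stands in for the hardware convolution filter, and function names are our own):

```python
import numpy as np

def edge_projections(img):
    """Edge magnitudes summed along each axis (projection histograms)."""
    gx = np.abs(np.diff(img.astype(float), axis=1)).sum(axis=0)  # column profile
    gy = np.abs(np.diff(img.astype(float), axis=0)).sum(axis=1)  # row profile
    return gx, gy

def estimate_shift(prev, curr):
    """Displacement of the strongest edge between two frames, read off
    as the shift of the projection-histogram maxima."""
    px, py = edge_projections(prev)
    cx, cy = edge_projections(curr)
    return int(np.argmax(cx) - np.argmax(px)), int(np.argmax(cy) - np.argmax(py))

# A bright square moves 3 px right and 2 px down between frames.
f0 = np.zeros((64, 64)); f0[10:20, 10:20] = 1.0
f1 = np.zeros((64, 64)); f1[12:22, 13:23] = 1.0
dx, dy = estimate_shift(f0, f1)
```

Matching only the 1D histogram maxima, rather than full 2D image patches, is what makes this approach fast enough for real-time use.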
Multispectral histogram normalization contrast enhancement
Soha, J. M.; Schwartz, A. A.
1979-01-01
A multispectral histogram normalization or decorrelation enhancement which achieves effective color composites by removing interband correlation is described. The enhancement procedure employs either linear or nonlinear transformations to equalize principal component variances. An additional rotation to any set of orthogonal coordinates is thus possible, while full histogram utilization is maintained by avoiding the reintroduction of correlation. For the three-dimensional case, the enhancement procedure may be implemented with a lookup table. An application of the enhancement to Landsat multispectral scanning imagery is presented.
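The decorrelation enhancement above, rotating to principal components, equalizing their variances, and rotating back, can be sketched as follows (illustrative only; function names and the synthetic bands are our own):

```python
import numpy as np

def decorrelation_stretch(bands):
    """Principal-component decorrelation enhancement.

    bands: array of shape (n_bands, n_pixels). The bands are rotated to
    principal components, each component is scaled to unit variance
    (equalizing the principal-component variances), and the result is
    rotated back to the original band coordinates.
    """
    mean = bands.mean(axis=1, keepdims=True)
    centered = bands - mean
    cov = np.cov(centered)
    evals, evecs = np.linalg.eigh(cov)
    # Whiten in PC space, then rotate back: C^(-1/2) applied symmetrically.
    transform = evecs @ np.diag(1.0 / np.sqrt(evals)) @ evecs.T
    return transform @ centered + mean

rng = np.random.default_rng(1)
# Three strongly correlated synthetic bands.
base = rng.normal(size=10000)
bands = np.stack([base + 0.1 * rng.normal(size=10000) for _ in range(3)])
out = decorrelation_stretch(bands)
```

After the stretch the interband covariance is the identity, i.e., the correlation that washes out color composites has been removed.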
LHCb: Machine assisted histogram classification
Somogyi, P; Gaspar, C
2009-01-01
LHCb is one of the four major experiments under completion at the Large Hadron Collider (LHC). Monitoring the quality of the acquired data is important because it allows the verification of the detector performance. Anomalies, such as missing values or unexpected distributions, can be indicators of a malfunctioning detector, resulting in poor data quality. Spotting faulty components can be done either visually, using instruments such as the LHCb Histogram Presenter, or by automated tools. In order to assist detector experts in handling the vast monitoring information resulting from the sheer size of the detector, a graph-theoretic clustering tool combined with machine learning algorithms is proposed and demonstrated by processing histograms representing 2D event hitmaps. The concept is proven by detecting ion-feedback events in the LHCb RICH subdetector.
Visualizing Contour Trees within Histograms
DEFF Research Database (Denmark)
Kraus, Martin
2010-01-01
Many of the topological features of the isosurfaces of a scalar volume field can be compactly represented by its contour tree. Unfortunately, the contour trees of most real-world volume data sets are too complex to be visualized by dot-and-line diagrams. Therefore, we propose a new visualization that is suitable for large contour trees and efficiently conveys the topological structure of the most important isosurface components. This visualization is integrated into a histogram of the volume data; thus, it offers strictly more information than a traditional histogram. We present algorithms to automatically compute the graph layout and to calculate appropriate approximations of the contour tree and the surface area of the relevant isosurface components. The benefits of this new visualization are demonstrated with the help of several publicly available volume data sets.
Quantitative histogram analysis of images
Holub, Oliver; Ferreira, Sérgio T.
2006-11-01
A routine for histogram analysis of images has been written in the object-oriented, graphical development environment LabVIEW. The program converts an RGB bitmap image into an intensity-linear greyscale image according to selectable conversion coefficients. This greyscale image is subsequently analysed by plots of the intensity histogram and probability distribution of brightness, and by calculation of various parameters, including average brightness, standard deviation, variance, minimal and maximal brightness, mode, skewness and kurtosis of the histogram and the median of the probability distribution. The program allows interactive selection of specific regions of interest (ROI) in the image and definition of lower and upper threshold levels (e.g., to permit the removal of a constant background signal). The results of the analysis of multiple images can be conveniently saved and exported for plotting in other programs, which allows fast analysis of relatively large sets of image data. The program file accompanies this manuscript together with a detailed description of two application examples: the analysis of fluorescence microscopy images, specifically of tau-immunofluorescence in primary cultures of rat cortical and hippocampal neurons, and the quantification of protein bands by Western blot. The possibilities and limitations of this kind of analysis are discussed. Program summary: Title of program: HAWGC. Catalogue identifier: ADXG_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXG_v1_0. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. Computers: Mobile Intel Pentium III, AMD Duron. Installations: no installation necessary; executable file together with necessary files for the LabVIEW Run-time engine. Operating systems under which the program has been tested: Windows ME/2000/XP. Programming language used: LabVIEW 7.0. Memory required to execute with typical data: ~16 MB for starting and ~160 MB used for
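The descriptive quantities listed above (mean, variance, mode, skewness, kurtosis, median) are straightforward to compute from an intensity histogram. A minimal NumPy sketch (not the LabVIEW program itself; function name is our own):

```python
import numpy as np

def histogram_stats(gray):
    """Descriptive statistics of an 8-bit image's intensity distribution."""
    v = gray.ravel().astype(float)
    mean, std = v.mean(), v.std()
    hist = np.bincount(gray.ravel(), minlength=256)
    return {
        "mean": mean,
        "std": std,
        "variance": v.var(),
        "min": v.min(),
        "max": v.max(),
        "mode": int(np.argmax(hist)),                      # most frequent level
        "skewness": ((v - mean) ** 3).mean() / std ** 3,
        "kurtosis": ((v - mean) ** 4).mean() / std ** 4 - 3.0,  # excess kurtosis
        "median": float(np.median(v)),
    }

img = np.full((8, 8), 100, dtype=np.uint8)
img[0, 0] = 200  # one bright outlier skews the distribution to the right
stats = histogram_stats(img)
```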
ACTION RECOGNITION USING SALIENT NEIGHBORING HISTOGRAMS
DEFF Research Database (Denmark)
Ren, Huamin; Moeslund, Thomas B.
2013-01-01
and temporal dimensions. Instead, we propose a salient vocabulary construction algorithm to select visual words from a global point of view, and form compact descriptors to represent discriminative histograms in the neighborhoods. Those salient neighboring histograms are then trained to model different actions...
Interpreting Histograms. As Easy as It Seems?
Lem, Stephanie; Onghena, Patrick; Verschaffel, Lieven; Van Dooren, Wim
2014-01-01
Histograms are widely used, but recent studies have shown that they are not as easy to interpret as it might seem. In this article, we report on three studies on the interpretation of histograms in which we investigated (1) whether the misinterpretation by university students can be considered to be the result of heuristic reasoning, (2)…
Spline smoothing of histograms by linear programming
Bennett, J. O.
1972-01-01
An algorithm is presented for obtaining an approximating function to the frequency distribution from a sample of size n. To obtain the approximating function, a histogram is first made from the data. Next, Euclidean-space approximations to the graph of the histogram, using central B-splines as basis elements, are obtained by linear programming. The approximating function has area one and is nonnegative.
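A sketch in the spirit of this idea (not the paper's exact formulation): nonnegative coefficients on central cubic B-splines guarantee a nonnegative fit, a single equality constraint fixes the area to one, and a linear program minimizes the maximum deviation from the histogram. Function names, the knot layout, and the use of SciPy's `linprog` are our own assumptions.

```python
import numpy as np
from scipy.optimize import linprog

def b3(x):
    """Central cubic B-spline: support |x| < 2, unit integral."""
    x = np.abs(x)
    return np.where(x < 1, 2 / 3 - x ** 2 + x ** 3 / 2,
                    np.where(x < 2, (2 - x) ** 3 / 6, 0.0))

def spline_smooth(hist, centers, n_knots=8):
    """Fit a nonnegative, area-one B-spline combination to a density
    histogram by minimizing the maximum deviation with an LP."""
    knots = np.linspace(centers[0], centers[-1], n_knots)
    h = knots[1] - knots[0]
    basis = b3((centers[:, None] - knots[None, :]) / h)   # (bins, knots)
    m, k = basis.shape
    # Variables: k coefficients c >= 0, then the deviation bound t.
    cost = np.zeros(k + 1)
    cost[-1] = 1.0                                        # minimize t
    ones = np.ones((m, 1))
    a_ub = np.block([[basis, -ones], [-basis, -ones]])    # |B c - hist| <= t
    b_ub = np.concatenate([hist, -hist])
    # Each scaled spline integrates to h over the real line, so
    # h * sum(c) = 1 enforces unit area (edge truncation neglected).
    a_eq = np.concatenate([np.full(k, h), [0.0]])[None, :]
    res = linprog(cost, A_ub=a_ub, b_ub=b_ub, A_eq=a_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (k + 1))
    c = res.x[:k]
    return basis @ c, c, h

rng = np.random.default_rng(2)
sample = rng.normal(size=2000)
hist, edges = np.histogram(sample, bins=30, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
fitted, coeffs, width = spline_smooth(hist, centers)
```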
Histogram-kernel Error and Its Application for Bin Width Selection in Histograms
Institute of Scientific and Technical Information of China (English)
Xiu-xiang Wang; Jian-fang Zhang
2012-01-01
Histogram and kernel estimators are usually regarded as the two main classical data-based nonparametric tools for estimating the underlying density function of a given data set. In this paper we integrate them and define a histogram-kernel error based on the integrated squared error between the histogram and the binned kernel density estimator, and then exploit its asymptotic properties. As shown in this paper, the histogram-kernel error depends only on the choice of bin width and on the data for a given prior kernel density. The asymptotically optimal bin width is derived by minimizing the mean histogram-kernel error. By comparison with Scott's optimal bin width formula for a histogram, a new method is proposed to construct a data-based histogram without knowledge of the underlying density function. A Monte Carlo study verifies the usefulness of the method for different kinds of density functions and sample sizes.
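The classical benchmark mentioned above, Scott's rule, chooses the bin width as h = 3.49 * sigma * n^(-1/3). A minimal sketch (function name is our own):

```python
import numpy as np

def scott_bin_width(data):
    """Scott's asymptotically optimal bin width: h = 3.49 * sigma * n**(-1/3)."""
    data = np.asarray(data, dtype=float)
    return 3.49 * data.std(ddof=1) * data.size ** (-1 / 3)

rng = np.random.default_rng(3)
data = rng.normal(size=1000)          # sigma ~ 1, n = 1000
h = scott_bin_width(data)             # ~ 3.49 * 1 * 0.1 = 0.349
n_bins = int(np.ceil((data.max() - data.min()) / h))
```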
STUDY OF RBC HISTOGRAM IN VARIOUS ANEMIAS
Directory of Open Access Journals (Sweden)
Sandhya
2014-12-01
Over the past few years, complete blood count (CBC) by automated hematology analyzers and microscopic examination of the peripheral smear have complemented each other to provide a comprehensive report on a patient's blood sample. Numerous classifications for anemia have been established; the important parameters involved are Hb, HCT, MCV, RDW, MCH, MCHC, reticulocytes and IRF, many of which are obtained only by automated hematology analyzers. One histogram graph is worth a thousand numbers: a large collection of data, displayed as a visual image, can convey information with far more impact than the numbers alone. In hematology, these data take several forms, one of which is the RBC histogram. Therefore, a study of the variation of RBC histograms in various anemias was undertaken. Histogram patterns often show varying features when a simultaneous peripheral smear is reported. Manual peripheral smear reporting also has limitations: the reports are subjective, labor intensive and statistically unreliable, although microscopic peripheral smear examination has its own advantages. This study intends to create a guide for laboratory personnel and clinicians with sufficient accuracy to presumptively diagnose morphological classes of anemia directly from the automated hematology cell counter forms and to correlate these with the morphological features of peripheral smear examination. OBJECTIVES: 1. To assess the utility and advantages of red cell histograms. 2. To study the automated histogram patterns together with the morphological features noticed on peripheral smear examination. SOURCE OF DATA: all anemic patients from the Central Diagnostic Laboratory of A.J.IMS. METHOD OF COLLECTION OF DATA: a total of about 100 patients were included in the study. Complete blood counts including Hb, TC, DC, platelet count, hematocrit value and RBC indices were obtained
Network histograms and universality of blockmodel approximation
Olhede, Sofia C.; Wolfe, Patrick J.
2014-01-01
In this paper we introduce the network histogram, a statistical summary of network interactions to be used as a tool for exploratory data analysis. A network histogram is obtained by fitting a stochastic blockmodel to a single observation of a network dataset. Blocks of edges play the role of histogram bins and community sizes that of histogram bandwidths or bin sizes. Just as standard histograms allow for varying bandwidths, different blockmodel estimates can all be considered valid representations of an underlying probability model, subject to bandwidth constraints. Here we provide methods for automatic bandwidth selection, by which the network histogram approximates the generating mechanism that gives rise to exchangeable random graphs. This makes the blockmodel a universal network representation for unlabeled graphs. With this insight, we discuss the interpretation of network communities in light of the fact that many different community assignments can all give an equally valid representation of such a network. To demonstrate the fidelity-versus-interpretability tradeoff inherent in considering different numbers and sizes of communities, we analyze two publicly available networks—political weblogs and student friendships—and discuss how to interpret the network histogram when additional information related to node and edge labeling is present. PMID:25275010
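The "bins" of a network histogram are blockwise edge densities. Given community labels, the density matrix is easy to compute; the following sketch (function names and the planted two-community example are our own, and real use would also estimate the labels) illustrates the idea:

```python
import numpy as np

def network_histogram(adj, labels):
    """Blockmodel edge-density matrix: the bins of a network histogram.

    adj: symmetric 0/1 adjacency matrix without self-loops.
    labels: community index per node.
    Returns the matrix of empirical edge densities between label pairs.
    """
    groups = np.unique(labels)
    dens = np.zeros((len(groups), len(groups)))
    for a, ga in enumerate(groups):
        for b, gb in enumerate(groups):
            block = adj[np.ix_(labels == ga, labels == gb)]
            if ga == gb:
                n = block.shape[0]
                # Diagonal block: each undirected edge is counted twice.
                dens[a, b] = block.sum() / (n * (n - 1)) if n > 1 else 0.0
            else:
                dens[a, b] = block.mean()
    return dens

# Two planted communities: dense within (p = 0.8), sparse between (q = 0.1).
rng = np.random.default_rng(4)
labels = np.repeat([0, 1], 50)
p = np.where(labels[:, None] == labels[None, :], 0.8, 0.1)
adj = (rng.random((100, 100)) < p).astype(int)
adj = np.triu(adj, 1)
adj = adj + adj.T                      # symmetric, no self-loops
dens = network_histogram(adj, labels)
```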
Local histograms and image occlusion models
Massar, Melody L; Fickus, Matthew; Kovacevic, Jelena
2011-01-01
The local histogram transform of an image is a data cube that consists of the histograms of the pixel values that lie within a fixed neighborhood of any given pixel location. Such transforms are useful in image processing applications such as classification and segmentation, especially when dealing with textures that can be distinguished by the distributions of their pixel intensities and colors. We, in particular, use them to identify and delineate biological tissues found in histology images obtained via digital microscopy. In this paper, we introduce a mathematical formalism that rigorously justifies the use of local histograms for such purposes. We begin by discussing how local histograms can be computed as systems of convolutions. We then introduce probabilistic image models that can emulate textures one routinely encounters in histology images. These models are rooted in the concept of image occlusion. A simple model may, for example, generate textures by randomly speckling opaque blobs of one color on ...
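The observation that local histograms are systems of convolutions can be sketched directly: for each gray level, averaging the indicator image over a neighborhood gives the local relative frequency of that level at every pixel (function names are our own; SciPy's `uniform_filter` plays the role of the box convolution):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_histograms(img, levels, size=5):
    """Local histogram transform computed as a system of convolutions.

    For each gray level, the indicator image (img == level) is box-averaged
    over a size x size neighborhood, yielding the local relative frequency
    of that level at every pixel location.
    """
    return np.stack([uniform_filter((img == lv).astype(float), size=size)
                     for lv in levels])

# Two-texture image: constant 0 on the left, constant 1 on the right.
img = np.zeros((32, 32), dtype=int)
img[:, 16:] = 1
cube = local_histograms(img, levels=[0, 1], size=5)
```

Deep inside each region the local histogram is a point mass on that region's gray level, which is exactly what makes the transform useful for texture segmentation.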
The Online Histogram Presenter for the ATLAS experiment a modular system for histogram visualization
Dotti, A; Vitillo, R A
2010-01-01
The Online Histogram Presenter (OHP) is the ATLAS tool to display histograms produced by the online monitoring system. In spite of the name, the Online Histogram Presenter is much more than just a histogram display. To cope with the large amount of data, the application has been designed to minimise the network traffic; sophisticated caching, hashing and filtering algorithms reduce memory and CPU usage. The system uses Qt and ROOT for histogram visualisation and manipulation. In addition, histogram visualisation can be extensively customised through configuration files. Finally, its very modular architecture features a lightweight plug-in system, allowing extensions to accommodate specific user needs. After an architectural overview of the application, the paper is going to present in detail the solutions adopted to increase the performance and a description of the plug-in system.
Dotti, Andrea; Adragna, Paolo; Vitillo, Roberto A.
2010-04-01
The Online Histogram Presenter (OHP) is the ATLAS tool to display histograms produced by the online monitoring system. In spite of the name, the Online Histogram Presenter is much more than just a histogram display. To cope with the large amount of data, the application has been designed to minimise the network traffic; sophisticated caching, hashing and filtering algorithms reduce memory and CPU usage. The system uses Qt and ROOT for histogram visualisation and manipulation. In addition, histogram visualisation can be extensively customised through configuration files. Finally, its very modular architecture features a lightweight plug-in system, allowing extensions to accommodate specific user needs. After an architectural overview of the application, the paper is going to present in detail the solutions adopted to increase the performance and a description of the plug-in system.
HEBS: Histogram Equalization for Backlight Scaling
Iranli, Ali; Pedram, Massoud
2011-01-01
In this paper, a method is proposed for finding a pixel transformation function that maximizes backlight dimming while maintaining a pre-specified image distortion level for a liquid crystal display. This is achieved by finding a pixel transformation function, which maps the original image histogram to a new histogram with lower dynamic range. Next the contrast of the transformed image is enhanced so as to compensate for brightness loss that would arise from backlight dimming. The proposed approach relies on an accurate definition of the image distortion which takes into account both the pixel value differences and a model of the human visual system and is amenable to highly efficient hardware realization. Experimental results show that the histogram equalization for backlight scaling method results in about 45% power saving with an effective distortion rate of 5% and 65% power saving for a 20% distortion rate. This is significantly higher power savings compared to previously reported backlight dimming approa...
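A much-simplified sketch of the backlight-scaling idea (not the paper's distortion model; function name, the clip-fraction parameter, and the plain percentile clipping are our own assumptions): clip the brightest pixels within a distortion budget, stretch the remainder to full scale, and dim the backlight by the same factor.

```python
import numpy as np

def backlight_scaling(img, max_clip_fraction=0.05):
    """Clip the brightest pixels (bounded distortion), stretch the remaining
    range to full scale, and dim the backlight by the same factor so the
    perceived brightness is roughly preserved while power drops."""
    v = np.sort(img.ravel())
    # Highest value we keep; everything above it gets clipped (distorted).
    keep = v[int(np.ceil((1 - max_clip_fraction) * (v.size - 1)))]
    transformed = np.minimum(img.astype(float), keep) * (255.0 / keep)
    backlight = keep / 255.0            # relative backlight level, < 1 saves power
    return transformed.astype(np.uint8), backlight

rng = np.random.default_rng(5)
img = rng.integers(0, 180, size=(64, 64), dtype=np.uint8)  # dark-ish image
out, backlight = backlight_scaling(img)
```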
Oriented Shape Index Histograms for Cell Classification
DEFF Research Database (Denmark)
Larsen, Anders Boesen Lindbo; Dahl, Anders Bjorholm; Larsen, Rasmus
2015-01-01
We propose a novel extension to the shape index histogram feature descriptor in which the orientation of the second-order curvature is included in the histograms. The orientation of the shape index is reminiscent of, but not equal to, gradient orientation, which is widely used for feature description. We evaluate our new feature descriptor using a public dataset consisting of HEp-2 cell images from indirect immunofluorescence lighting. Our results show that we can improve classification performance significantly when including the shape index orientation. Notably, we show that shape index orientation…
The Development of Cluster and Histogram Methods
Swendsen, Robert H.
2003-11-01
This talk will review the history of both cluster and histogram methods for Monte Carlo simulations. Cluster methods are based on the famous exact mapping by Fortuin and Kasteleyn from general Potts models onto a percolation representation. I will discuss the Swendsen-Wang algorithm, as well as its improvement and extension to more general spin models by Wolff. The Replica Monte Carlo method further extended cluster simulations to deal with frustrated systems. The history of histograms is quite extensive, and can only be summarized briefly in this talk. It goes back at least to work by Salsburg et al. in 1959. Since then, it has been forgotten and rediscovered several times. The modern use of the method has exploited its ability to efficiently determine the location and height of peaks in various quantities, which is of prime importance in the analysis of critical phenomena. The extensions of this approach to the multiple histogram method and multicanonical ensembles have allowed information to be obtained over a broad range of parameters. Histogram simulations and analyses have become standard techniques in Monte Carlo simulations.
Method of infrared image enhancement based on histogram
Institute of Scientific and Technical Information of China (English)
WANG Liang; YAN Jie
2011-01-01
Aiming at the problem of infrared image enhancement, a new histogram-based method is presented. Using the gray characteristics of the target, an upper-bound threshold is selected adaptively and the histogram is processed with this threshold. After choosing a gray transform function based on the gray-level distribution of the image, the gray transformation is performed during histogram equalization, yielding the enhanced image. Compared with histogram equalization (HE), histogram double equalization (HDE) and plateau histogram equalization (PE), the simulation results demonstrate that the enhancement effect of this method is clearly superior. At the same time, it is fast and well suited to real-time operation.
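The threshold-plus-transform pipeline described in this abstract can be illustrated with a minimal sketch. This is not the authors' exact method: reading "processed with the threshold" as plateau-style clipping, and the fixed threshold and synthetic image below, are assumptions for illustration only.

```python
import numpy as np

def clipped_histogram_equalization(image, upper_bound):
    """Equalize an 8-bit image after clipping histogram counts at a threshold.

    Clipping limits the influence of dominant gray levels (e.g. a uniform
    background), a common remedy for over-enhancement in infrared imagery.
    """
    hist = np.bincount(image.ravel(), minlength=256).astype(float)
    hist = np.minimum(hist, upper_bound)                # clip dominant bins
    cdf = np.cumsum(hist)
    cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())   # normalize to [0, 1]
    lut = np.round(255 * cdf).astype(np.uint8)          # gray transform function
    return lut[image]

# Example: a low-contrast synthetic image (illustrative assumption)
img = np.clip(np.random.default_rng(0).normal(100, 10, (64, 64)), 0, 255).astype(np.uint8)
out = clipped_histogram_equalization(img, upper_bound=200)  # dynamic range is stretched
```

The clipped cumulative distribution plays the role of the adaptively chosen gray transform function; in the paper the threshold is selected from the gray characteristics of the target rather than fixed.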
Underwater sonar image recognition based on gray-spatial histograms
Institute of Scientific and Technical Information of China (English)
LIU Zhuo-fu; SANG En-fang
2003-01-01
A new gray-spatial histogram is proposed, which incorporates spatial information with gray compositions without sacrificing the robustness of traditional gray histograms. The purpose is to consider the representation role of gray compositions and spatial information simultaneously. Each entry in the gray-spatial histogram is the gray frequency and corresponding position information of images. In the experiments of sonar image recognition, the results show that the gray-spatial histogram is effective in practical use.
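The idea of augmenting each gray entry with position information can be sketched as follows. The choice of per-bin centroids as the "position information" is an assumption for illustration, since the abstract does not specify the exact encoding.

```python
import numpy as np

def gray_spatial_histogram(image, bins=16):
    """Toy gray-spatial histogram: for each gray bin, the pixel count plus the
    mean position (centroid) of the pixels falling in that bin.

    This couples gray composition with coarse spatial information while
    keeping the robustness of an ordinary gray histogram.
    """
    rows, cols = np.indices(image.shape)
    idx = (image.astype(int) * bins) // 256          # gray bin index per pixel
    counts = np.bincount(idx.ravel(), minlength=bins).astype(float)
    sum_r = np.bincount(idx.ravel(), weights=rows.ravel(), minlength=bins)
    sum_c = np.bincount(idx.ravel(), weights=cols.ravel(), minlength=bins)
    with np.errstate(invalid="ignore"):              # empty bins give NaN centroids
        centroids = np.stack([sum_r, sum_c], axis=1) / counts[:, None]
    return counts, centroids                         # shapes (bins,), (bins, 2)

img = np.zeros((32, 32), dtype=np.uint8)
img[:, 16:] = 200                                    # bright right half
counts, centroids = gray_spatial_histogram(img)
```

Two images with identical gray histograms but different layouts now produce different descriptors, which is the point of incorporating spatial information.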
Efficient CBIR Using Color Histogram Processing
Directory of Open Access Journals (Sweden)
Neetu Sharma. S
2011-03-01
Full Text Available The need for efficient content-based image retrieval systems has increased hugely. Efficient and effective retrieval techniques for images are desired because of the explosive growth of digital images. Content-based image retrieval (CBIR) is a promising approach because of its automatic indexing and retrieval based on semantic features and visual appearance. The similarity of images depends on the feature representation. However, users have difficulties in representing their information needs in queries to content-based image retrieval systems. In this paper we investigate two methods for describing the contents of images. The first characterizes images by global descriptor attributes, while the second is based on a color histogram approach. Computing feature vectors for the global descriptor requires much less time than computing the color histogram. Hence the cross-correlation value and image descriptor attributes are calculated prior to the histogram implementation to make the CBIR system more efficient. The performance of this approach is measured and results are shown. The aim of this paper is to compare various global descriptor attributes and to make the CBIR system more efficient. It is found that further modifications are needed to produce better performance in searching images.
Research on scale effect of histogram
Institute of Scientific and Technical Information of China (English)
张颢; 焦子锑; 杨华; 李小文; 王锦地; 苏理宏; 阎广建; 赵红蕊
2002-01-01
To describe the spatial relationship among earth objects compactly, we introduce in this paper the concept of the histo-variogram, based on an analysis of the characteristics of other spatial analysis methods such as the variogram and information entropy. We also present a new spatial analysis method of histogram decomposition, based on the definitions of standing pixels and contour pixels. Finally, we demonstrate the characteristics of the histo-variogram with two experiments, one on spatial analysis and the other on image fusion.
OVERLAPPING VIRTUAL CADASTRAL DOCUMENTATION
Directory of Open Access Journals (Sweden)
Madalina - Cristina Marian
2013-12-01
Full Text Available Two cadastral plans of buildings can overlap virtually. The overlap becomes apparent on digital reception of the documentation. According to Law no. 7/1996, as amended and supplemented, such problems are solved by updating the graphical database through repositioning. This paper addresses the issue of virtual cadastral overlap over the period 1999-2012.
Text Anomalies Detection Using Histograms of Words
Directory of Open Access Journals (Sweden)
Abdulwahed Faraj Almarimi
2016-01-01
Full Text Available Authors of written texts can mainly be characterized by a collection of attributes obtained from their texts. Texts by the same author are very similar from the stylistic point of view. We can assume that the attributes of a full text are very similar to the attributes of parts of the same text, and in the same spirit different parts of the same text can be compared. In this paper, we describe an algorithm based on histograms of a text mapped to an interval; the mapping keeps the word order of the text. The histograms are analyzed from a clustering point of view. If the cluster dispersion is not large, the text was probably written by a single author. If the cluster dispersion is large, the text is split into two or more parts and the same analysis is applied to each part. The experiments were done on English and Arabic texts. For combined English texts our algorithm detects that the texts were not written by one author, and we obtained similar results for combined Arabic texts. Our algorithm can be used for basic analysis of whether a text was written by one author.
Liu, Jun
2010-01-01
The group Lasso is an extension of the Lasso for feature selection on (predefined) non-overlapping groups of features. The non-overlapping group structure limits its applicability in practice. There have been several recent attempts to study a more general formulation, where groups of features are given, potentially with overlaps between the groups. The resulting optimization is, however, much more challenging to solve due to the group overlaps. In this paper, we consider the efficient optimization of the overlapping group Lasso penalized problem. We reveal several key properties of the proximal operator associated with the overlapping group Lasso, and compute the proximal operator by solving the smooth and convex dual problem, which allows the use of the gradient descent type of algorithms for the optimization. We have performed empirical evaluations using the breast cancer gene expression data set, which consists of 8,141 genes organized into (overlapping) gene sets. Experimental results demonstrate the eff...
Contrast Enhancement of Color Images with Bi-Histogram
Directory of Open Access Journals (Sweden)
Paramjit Singh,
2014-06-01
Full Text Available Histogram equalization is a widely used scheme for contrast enhancement in a variety of applications due to its simple function and effectiveness. One possible drawback of histogram equalization is that it can change the mean brightness of an image significantly as a consequence of histogram flattening. Clearly, this is not a desirable property when preserving the original mean brightness of a given image is necessary. Bi-histogram equalization is able to overcome this drawback for gray-scale images. In this paper, we explore the use of a bi-histogram-equalization-based technique for enhancing RGB color images. The technique is based on the cumulative density function of a quantized image. From the results it is concluded that bi-histogram equalization is able to improve the contrast of colored images significantly.
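A minimal sketch of the underlying bi-histogram idea, applied to a single channel in the style of brightness-preserving bi-histogram equalization: the split point at the mean and the sub-range mapping below are the standard textbook choices, not necessarily the exact variant used in the paper.

```python
import numpy as np

def bi_histogram_equalization(channel):
    """Bi-histogram equalization sketch for one 8-bit channel.

    The histogram is split at the mean gray level; each half is equalized
    independently into its own sub-range, so pixels below the mean stay
    below it and pixels above stay above, limiting the brightness shift
    that plain histogram equalization can cause.
    """
    mean = int(channel.mean())
    hist = np.bincount(channel.ravel(), minlength=256).astype(float)
    lut = np.zeros(256, dtype=np.uint8)

    lo, hi = hist[: mean + 1], hist[mean + 1 :]
    if lo.sum() > 0:                                  # map lower half onto [0, mean]
        cdf = np.cumsum(lo) / lo.sum()
        lut[: mean + 1] = np.round(mean * cdf)
    if hi.sum() > 0:                                  # map upper half onto [mean+1, 255]
        cdf = np.cumsum(hi) / hi.sum()
        lut[mean + 1 :] = mean + 1 + np.round((254 - mean) * cdf)
    return lut[channel]

# Bright synthetic channel (illustrative assumption)
rng = np.random.default_rng(1)
img = np.clip(rng.normal(180, 15, (64, 64)), 0, 255).astype(np.uint8)
out = bi_histogram_equalization(img)
```

For an RGB image, the paper's approach would apply this per-channel logic via the cumulative density function of the quantized image; the sketch above only shows the single-channel core.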
COLOR PERCEPTION HISTOGRAM FOR IMAGE RETRIEVAL USING MULTIPLE SIMILARITY MEASURES
Directory of Open Access Journals (Sweden)
R. Malini
2014-01-01
Full Text Available This study aims to increase the retrieval efficiency of proposed image retrieval system on the basis of color content. A new idea of feature extraction based on color perception histogram is proposed. First, the color histogram is constructed for HSV image. Secondly, the true color and grey color components are identified based on hue and intensity. The weight for true and grey color components is calculated using NBS distance. An updated histogram is constructed using weighted true and grey color values. The color features extracted from the updated histogram of query image and for all the images in image database are compared with existing color histogram based technique by using multiple similarity measures. Experimental results show that proposed image retrieval based on the color perception histogram gives higher retrieval performance in terms of high average precision and average recall with less computational complexity.
Calculating Method of Moments Uniform Bin Width Histograms
James S. Weber
2016-01-01
A clear articulation of Method of Moments (MOM) histograms is instructive and has waited 121 years, since 1895. Also of interest are enabling uniform bin width (UBW) shape level sets. Mean-variance MOM uniform-bin-width frequency and density histograms are not unique; however, ranking them by histogram skewness relative to data skewness helps. Although theoretical issues rarely take second place to calculations, here calculations based on shape level sets are central and challenge uncritically ...
Numerical study of QCD phase diagram at high temperature and density by a histogram method
Ejiri, Shinji; Hatsuda, Tetsuo; Kanaya, Kazuyuki; Nakagawa, Yoshiyuki; Ohno, Hiroshi; Saito, Hana; Umeda, Takashi
2012-01-01
We study the QCD phase structure at high temperature and density adopting a histogram method. Because the quark determinant is complex at finite density, the Monte-Carlo method cannot be applied directly. We use a reweighting method and try to solve the problems which arise in the reweighting method, i.e. the sign problem and the overlap problem. We discuss the chemical potential dependence of the probability distribution function in the heavy quark mass region and examine the applicability of the approach in the light quark region.
Devnagari document segmentation using histogram approach
Dongre, Vikas J; 10.5121/ijcseit.2011.1305
2011-01-01
Document segmentation is one of the critical phases in machine recognition of any language. Correct segmentation of individual symbols decides the accuracy of character recognition technique. It is used to decompose image of a sequence of characters into sub images of individual symbols by segmenting lines and words. Devnagari is the most popular script in India. It is used for writing Hindi, Marathi, Sanskrit and Nepali languages. Moreover, Hindi is the third most popular language in the world. Devnagari documents consist of vowels, consonants and various modifiers. Hence proper segmentation of Devnagari word is challenging. A simple histogram based approach to segment Devnagari documents is proposed in this paper. Various challenges in segmentation of Devnagari script are also discussed.
Contrast enhancement via texture region based histogram equalization
Singh, Kuldeep; Vishwakarma, Dinesh K.; Singh Walia, Gurjit; Kapoor, Rajiv
2016-08-01
This paper presents two novel contrast enhancement approaches using texture region-based histogram equalization (HE). In HE-based contrast enhancement methods, the enhanced image often contains undesirable artefacts because an excessive number of pixels in the non-textured areas heavily bias the histogram. The novel idea presented in this paper is to suppress the impact of pixels in non-textured areas and to exploit texture features for the computation of the histogram in the process of HE. The first algorithm, named Dominant Orientation-based Texture Histogram Equalization (DOTHE), constructs the histogram of the image using only those image patches having dominant orientation. DOTHE categorizes image patches into smooth, dominant or non-dominant orientation patches by using the image variance and the singular value decomposition algorithm, and utilizes only dominant orientation patches in the process of HE. The second method, termed Edge-based Texture Histogram Equalization, calculates significant edges in the image and constructs the histogram using the grey levels present in the neighbourhood of edges. The cumulative density function of the histogram formed from texture features is mapped onto the entire dynamic range of the input image to produce the contrast-enhanced image. Subjective as well as objective performance assessment of the proposed methods is conducted and compared with other existing HE methods. The performance assessment in terms of visual quality, contrast improvement index, entropy and measure of enhancement reveals that the proposed methods outperform the existing HE methods.
Optimization Algorithms Testing and Convergence by Using a Stacked Histogram
Directory of Open Access Journals (Sweden)
ZAPLATILEK, K.
2011-02-01
Full Text Available The article describes an original method for testing optimization algorithms and their convergence. The method is based on a so-called stacked histogram: a histogram with its features marked by a chosen colour scheme, so that the histogram retains information about the input digital sequence. This approach enables easy identification of hidden defects in the statistical distribution of a random process. The stacked histogram is used to test the convergence quality of various optimization techniques; its width, position and colour scheme provide enough information about the optimization trajectory of the chosen algorithm. Both classic iteration techniques and a stochastic optimization algorithm with adaptation are used as examples.
DPCube: Differentially Private Histogram Release through Multidimensional Partitioning
Xiao, Yonghui; Fan, Liyue; Goryczka, Slawomir
2012-01-01
Differential privacy is a strong notion for protecting individual privacy in privacy preserving data analysis or publishing. In this paper, we study the problem of differentially private histogram release for random workloads. We study two multidimensional partitioning strategies including: 1) a baseline cell-based partitioning strategy for releasing an equi-width cell histogram, and 2) an innovative 2-phase kd-tree based partitioning strategy for releasing a v-optimal histogram. We formally analyze the utility of the released histograms and quantify the errors for answering linear queries such as counting queries. We formally characterize the property of the input data that will guarantee the optimality of the algorithm. Finally, we implement and experimentally evaluate several applications using the released histograms, including counting queries, classification, and blocking for record linkage and show the benefit of our approach.
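The baseline cell-based strategy can be sketched with the standard Laplace mechanism; the data, bin count and privacy budget below are illustrative assumptions, and the paper's kd-tree v-optimal phase is omitted.

```python
import numpy as np

def dp_cell_histogram(data, bins, epsilon, rng):
    """Baseline cell-based DP histogram: true counts plus Laplace noise.

    Each individual affects exactly one cell, so the counting query has
    sensitivity 1, and adding Laplace(1/epsilon) noise to every cell
    satisfies epsilon-differential privacy.
    """
    counts, edges = np.histogram(data, bins=bins)
    noisy = counts + rng.laplace(scale=1.0 / epsilon, size=counts.size)
    return np.maximum(noisy, 0.0), edges              # clip to non-negative counts

# Hypothetical attribute (ages) and budget, for illustration only
rng = np.random.default_rng(7)
ages = rng.integers(0, 100, size=10_000)
noisy_counts, edges = dp_cell_histogram(ages, bins=10, epsilon=0.5, rng=rng)
```

Counting queries are then answered from `noisy_counts` instead of the raw data; the error analysis in the paper quantifies how this per-cell noise propagates into range queries, motivating the coarser kd-tree partitioning for sparse regions.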
Directory of Open Access Journals (Sweden)
Fariba Rezaeetalab
2016-12-01
Full Text Available Overlap syndrome, which is known as the coexistence of chronic obstructive pulmonary disease (COPD) and obstructive sleep apnea (OSA), was first defined by Flenley. Although it can refer to the concomitant occurrence of any pulmonary disease with OSA, overlap syndrome is commonly considered the coexistence of OSA and COPD. This disease has unique adverse health consequences distinct from either condition alone. Given the high prevalence of each solitary disease, overlap syndrome is also likely to be common and clinically relevant. Despite the fact that overlap syndrome has been described in the literature for nearly 30 years, a paucity of evaluations and studies has limited the discussion on the diagnosis, prevalence, pathophysiology, treatment, and outcomes of this disease. This review article addresses these issues by reviewing several recent studies conducted in Iran or other countries. This review suggests that overlap syndrome has worse outcomes than either disease alone. Our findings accentuate the urgent need for further studies on overlap syndrome and all overlaps between OSA and chronic pulmonary disease to provide a deeper insight into the diagnosis and non-invasive treatment of this disease.
Contrast Enhancement through Clustered Histogram Equalization
Directory of Open Access Journals (Sweden)
Shen-Chuan Tai
2012-09-01
Full Text Available This study proposes a contrast enhancement algorithm. Some methods enhance images depending only on global or only on local information, which often causes over-enhancement and makes the image look unnatural. The proposed method enhances the image based on both global and local information. For the global part, we propose mapping curves that find the new average, maximum and minimum intensities so as to suit the concept of the Human Visual System (HVS) and obtain perceptually better results. For the local part, we use the fuzzy c-means clustering algorithm to group the image, obtaining the intensity distribution and pixel count of each group. We then calculate weights from this information and enhance the image by Histogram Equalization (HE) according to the weights. The experimental results show that our method enhances image contrast consistently and causes over-enhancement with lower probability than other methods. The whole image not only looks natural but also shows detailed texture more clearly after applying our method.
A Learning Framework for Self-Tuning Histograms
Viswanathan, Raajay; Laxman, Srivatsan; Arasu, Arvind
2011-01-01
We propose a general learning-theoretic formulation for estimating self-tuning histograms. Our formulation uses query feedback from a workload as training data to estimate a histogram that minimizes the expected error on future queries. The formulation is flexible in the sense that it allows the design and comparison of different methods (possibly specialized for different settings). We first study the simple class of equi-width histograms in our learning framework and present a learning algorithm (EquiHist) that is competitive in many settings and that has formal error guarantees. We then go beyond equi-width histograms and present a novel learning algorithm (SpHist) for estimating general histograms. Here we use Haar wavelets to reduce the problem of learning histograms to a sparse vector recovery problem. Both algorithms have multiple advantages over existing methods: 1) simple and scalable extensions to multi-dimensional data, 2) scaling with the number of histogram buckets and the size of query feedback, 3) natur...
Size histograms of gold nanoparticles measured by gravitational sedimentation.
Alexander, Colleen M; Goodisman, Jerry
2014-03-15
Sedimentation curves of gold nanoparticles in water were obtained by measuring the optical density of a suspension over time. The results are not subject to sampling errors, and refer to the particles in situ. Curves obtained simultaneously at several wave lengths were analyzed together to derive the size histogram of the sedimenting particles. The bins in the histogram were 5 nm wide and centered at diameters 60, 65, …, 110 nm. To get the histogram, we weighted previously calculated solutions to the Mason-Weaver sedimentation-diffusion equation for various particle diameters with absorption/scattering coefficients and size (diameter) abundances {c(j)}, and found the {c(j)} which gave the best fit to all the theoretical sedimentation curves. The effects of changing the number of bins and the wave lengths used were studied. Going to smaller bins would mean determining more parameters and require more wave lengths. The histograms derived from sedimentation agreed quite well in general with the histogram derived from TEM. Differences found for the smallest particle diameters are partly due to statistical fluctuations (TEM found only 1-2 particles out of 103 with these diameters). More important is that the TEM histogram indicates 12% of the particles have diameters of 75±2.5 nm, and the sedimentation histogram shows none. We show that this reflects the difference between the particles in situ, which possess a low-density shell about 1 nm thick, and the bare particles on the TEM stage. Correcting for this makes agreement between the two histograms excellent. Comparing sedimentation-derived with TEM-derived histograms thus shows differences between the particles in situ and on the TEM stage.
Data analysis recipes: Choosing the binning for a histogram
Hogg, David W
2008-01-01
Data points are placed in bins when a histogram is created, but there is always a decision to be made about the number or width of the bins. This decision is often made arbitrarily or subjectively, but it need not be. A jackknife or leave-one-out cross-validation likelihood is defined and employed as a scalar objective function for optimization of the locations and widths of the bins. The objective is justified as being related to the histogram's usefulness for predicting future data. The method works for data or histograms of any dimensionality.
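A minimal sketch of the leave-one-out idea for equal-width bins: each point is scored under the histogram density estimated from the remaining points, and the bin count with the highest total log-likelihood wins. The synthetic two-cluster data set is an assumption for illustration.

```python
import numpy as np

def loo_cv_log_likelihood(data, n_bins, lo, hi):
    """Leave-one-out cross-validation score for an equal-width histogram.

    Each point is scored under the density estimated from the other N-1
    points; a higher total log-likelihood means the histogram predicts
    future data better. A bin holding a single point gives -inf, which
    automatically penalizes over-fine binnings.
    """
    n = data.size
    width = (hi - lo) / n_bins
    counts, _ = np.histogram(data, bins=n_bins, range=(lo, hi))
    k = np.minimum(((data - lo) / width).astype(int), n_bins - 1)
    dens = (counts[k] - 1) / ((n - 1) * width)        # density with the point held out
    if np.any(dens <= 0):
        return -np.inf
    return float(np.log(dens).sum())

# Two uniform clusters: too few bins blur them, too many bins overfit.
rng = np.random.default_rng(2)
data = np.concatenate([rng.uniform(0.1, 0.3, 1000), rng.uniform(0.6, 1.0, 1000)])
scores = {b: loo_cv_log_likelihood(data, b, 0.0, 1.0) for b in (1, 2, 5, 10, 20, 50)}
best = max(scores, key=scores.get)
```

The sketch fixes equal-width bins over a fixed range; the paper's objective also allows optimizing bin locations, not just their number.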
2000 fps multi-object tracking based on color histogram
Gu, Qingyi; Takaki, Takeshi; Ishii, Idaku
2012-06-01
In this study, we develop a real-time, color histogram-based tracking system for multiple color-patterned objects in a 512×512 image at 2000 fps. Our system can simultaneously extract the positions, areas, orientation angles, and color histograms of multiple objects in an image using the hardware implementation of a multi-object, color histogram extraction circuit module on a high-speed vision platform. It can both label multiple objects in an image consisting of connected components and calculate their moment features and 16-bin hue-based color histograms using cell-based labeling. We demonstrate the performance of our system by showing several experimental results: (1) tracking of multiple color-patterned objects on a plate rotating at 16 rps, and (2) tracking of human hand movement with two color-patterned drinking bottles.
Illusion induced overlapped optics.
Zang, XiaoFei; Shi, Cheng; Li, Zhou; Chen, Lin; Cai, Bin; Zhu, YiMing; Zhu, HaiBin
2014-01-13
The traditional transformation-based cloak seems able only to hide objects, by bending the incident electromagnetic waves around the hidden region. In this paper, we prove that invisibility cloaks can be applied to realize overlapped optics. No matter how many in-phase point sources are located in the hidden region, all of them can overlap each other (this can be considered an illusion effect), leading to a perfect optical interference effect. In addition, a singular-parameter-independent cloak is also designed to obtain quasi-overlapped optics. Even more striking, if N identical separated in-phase point sources are covered with the illusion media, the total power outside the transformation region is N²I₀ (not NI₀), where I₀ is the power of a single point source and N is the number of point sources, which seems to violate the law of conservation of energy. A theoretical model based on the interference effect is proposed to interpret the total power of these two kinds of overlapped-optics effects. Our investigation may have wide applications in high-power coherent laser beams, multiple laser diodes, and so on.
Content-based Image Retrieval Using Color Histogram
Institute of Scientific and Technical Information of China (English)
HUANG Wen-bei; HE Liang; GU Jun-zhong
2006-01-01
This paper introduces the principles of using color histograms to match images in CBIR, and a prototype CBIR system with a color matching function is designed. A new method using a 2-dimensional color histogram based on hue and saturation to extract and represent the color information of an image is presented. We also improve the Euclidean-distance algorithm by adding a Center of Color to it. The experiments show that the modifications made to the Euclidean distance significantly improve the quality and efficiency of retrieval.
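The 2-D hue-saturation histogram and Euclidean-distance matching described here can be sketched as follows; the bin counts and synthetic images are assumptions, and the paper's "Center of Color" refinement is omitted.

```python
import colorsys
import numpy as np

def hs_histogram(rgb_image, h_bins=8, s_bins=4):
    """2-D hue-saturation histogram of an RGB image with values in [0, 1]."""
    hist = np.zeros((h_bins, s_bins))
    for r, g, b in rgb_image.reshape(-1, 3):
        h, s, _ = colorsys.rgb_to_hsv(r, g, b)       # drop value: illumination-robust
        hist[min(int(h * h_bins), h_bins - 1), min(int(s * s_bins), s_bins - 1)] += 1
    return hist / hist.sum()                          # normalize to a distribution

def euclidean_distance(h1, h2):
    """Plain Euclidean distance between two normalized histograms."""
    return float(np.sqrt(((h1 - h2) ** 2).sum()))

# Synthetic reddish and bluish test patches (illustrative assumptions)
rng = np.random.default_rng(3)
reddish = np.clip(rng.normal([0.8, 0.2, 0.2], 0.05, (16, 16, 3)), 0, 1)
reddish2 = np.clip(rng.normal([0.8, 0.2, 0.2], 0.05, (16, 16, 3)), 0, 1)
bluish = np.clip(rng.normal([0.2, 0.2, 0.8], 0.05, (16, 16, 3)), 0, 1)
d_same = euclidean_distance(hs_histogram(reddish), hs_histogram(reddish2))
d_diff = euclidean_distance(hs_histogram(reddish), hs_histogram(bluish))
```

Ranking database images by this distance to the query histogram is the basic retrieval loop; the paper's contribution is the hue-saturation binning plus the weighting added to the plain Euclidean comparison.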
Bin recycling strategy for improving the histogram precision on GPU
Cárdenas-Montes, Miguel; Rodríguez-Vázquez, Juan José; Vega-Rodríguez, Miguel A.
2016-07-01
Histograms are an easily comprehensible way to present data and analyses. In the current scientific context, with access to large volumes of data, the processing time for building histograms has dramatically increased. For this reason, parallel construction is necessary to alleviate the impact of the processing time on analysis activities. In this scenario, GPU computing is becoming widely used to reduce the processing time of histogram construction to affordable levels. Alongside this reduction in processing time, implementations are stressed on bin-count accuracy. Accuracy issues due to the particularities of the implementations are not usually taken into consideration when building histograms with very large data sets. In this work, a bin recycling strategy to create an accuracy-aware implementation for building histograms on GPU is presented. To evaluate the approach, this strategy was applied to the computation of the three-point angular correlation function, a relevant function in cosmology for the study of the large-scale structure of the Universe. As a consequence of the study, a high-accuracy implementation for histogram construction on GPU is proposed.
Overlap among Environmental Databases.
Miller, Betty
1981-01-01
Describes the methodology and results of a study comparing the overlap of Enviroline, Pollution, and the Environmental Periodicals Bibliography files through searches on acid rain, asbestos and water, diesel, glass recycling, Lake Erie, Concorde, reverse osmosis wastewater treatment cost, and Calspan. Nine tables are provided. (RBF)
Discrimination of paediatric brain tumours using apparent diffusion coefficient histograms
Energy Technology Data Exchange (ETDEWEB)
Bull, Jonathan G.; Clark, Christopher A. [UCL Institute of Child Health, Imaging and Biophysics Unit, London (United Kingdom); Saunders, Dawn E. [Great Ormond Street Hospital for Children NHS Trust, Department of Radiology, London (United Kingdom)
2012-02-15
To determine if histograms of apparent diffusion coefficients (ADC) can be used to differentiate paediatric brain tumours. Imaging of histologically confirmed tumours with pre-operative ADC maps was reviewed (54 cases, 32 male, mean age 6.1 years; range 0.1-15.8 years) comprising 6 groups. Whole-tumour ADC histograms were calculated and normalised for volume. Stepwise logistic regression analysis was used to differentiate tumour types using histogram metrics, initially for all groups and then for specific subsets. All 6 groups (5 dysembryoplastic neuroectodermal tumours, 22 primitive neuroectodermal tumours (PNET), 5 ependymomas, 7 choroid plexus papillomas, 4 atypical teratoid rhabdoid tumours (ATRT) and 9 juvenile pilocytic astrocytomas (JPA)) were compared. 74% (40/54) were correctly classified using logistic regression of ADC histogram parameters. In the analysis of posterior fossa tumours, 80% of ependymomas, 100% of astrocytomas and 94% of PNET-medulloblastoma were classified correctly. All PNETs were discriminated from ATRTs (22 PNET and 4 supratentorial ATRTs) (100%). ADC histograms are useful in differentiating paediatric brain tumours, in particular the common posterior fossa tumours of childhood. PNETs were differentiated from supratentorial ATRTs in all cases, which has important implications in terms of clinical management. (orig.)
Schmerer, Hans-Jörg; Capuano, Stella; Egger, Hartmut; Koch, Michael
2015-01-01
We set up a model of offshoring with heterogeneous producers that captures two empirical regularities on offshoring firms: larger, more productive firms are more likely to make use of the offshoring opportunity; the fraction of firms that engages in offshoring is positive and smaller than one in any size or revenue category. These patterns generate an overlap of offshoring and non-offshoring firms, which is non-monotonic in the costs of offshoring. In an empirical exercise, we employ firm-lev...
Energy Technology Data Exchange (ETDEWEB)
Iancu, Costin; Husbands, Parry; Hargrove, Paul
2005-07-08
Hiding communication latency is an important optimization for parallel programs. Programmers or compilers achieve this by using non-blocking communication primitives and overlapping communication with computation or other communication operations. Using non-blocking communication raises two issues: performance and programmability. In terms of performance, optimizers need to find a good communication schedule and are sometimes constrained by lack of full application knowledge. In terms of programmability, efficiently managing non-blocking communication can prove cumbersome for complex applications. In this paper we present the design principles of HUNT, a runtime system designed to search and exploit some of the available overlap present at execution time in UPC programs. Using virtual memory support, our runtime implements demand-driven synchronization for data involved in communication operations. It also employs message decomposition and scheduling heuristics to transparently improve the non-blocking behavior of applications. We provide a user level implementation of HUNT on a variety of modern high performance computing systems. Results indicate that our approach is successful in finding some of the overlap available at execution time. While system and application characteristics influence performance, perhaps the determining factor is the time taken by the CPU to execute a signal handler. Demand driven synchronization at execution time eliminates the need for the explicit management of non-blocking communication. Besides increasing programmer productivity, this feature also simplifies compiler analysis for communication optimizations.
Should unfolded histograms be used to test hypotheses?
Cousins, Robert D; Sun, Yipeng
2016-01-01
In many analyses in high energy physics, attempts are made to remove the effects of detector smearing in data by techniques referred to as "unfolding" histograms, thus obtaining estimates of the true values of histogram bin contents. Such unfolded histograms are then compared to theoretical predictions, either to judge the goodness of fit of a theory, or to compare the abilities of two or more theories to describe the data. When doing this, even informally, one is testing hypotheses. However, a more fundamentally sound way to test hypotheses is to smear the theoretical predictions by simulating detector response and then comparing to the data without unfolding; this is also frequently done in high energy physics, particularly in searches for new physics. One can thus ask: to what extent does hypothesis testing after unfolding data materially reproduce the results obtained from testing by smearing theoretical predictions? We argue that this "bottom-line-test" of unfolding methods should be studied more commonl...
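The alternative the authors advocate, smearing the theoretical predictions and comparing them to the unsmeared data, can be sketched with a toy response matrix; the tridiagonal response, the spectra and the chi-square statistic below are illustrative assumptions.

```python
import numpy as np

def make_response(n):
    """Toy detector response: each true bin leaks 20% into each neighbour."""
    R = np.eye(n) * 0.6
    for i in range(n - 1):
        R[i, i + 1] = R[i + 1, i] = 0.2
    R[0, 0] += 0.2
    R[-1, -1] += 0.2                              # no leakage past the edges
    return R                                      # columns sum to 1

def chi2(observed, expected):
    """Pearson chi-square between an observed and an expected spectrum."""
    return float(((observed - expected) ** 2 / np.maximum(expected, 1e-9)).sum())

n = 6
R = make_response(n)
truth = np.array([100.0, 300.0, 800.0, 700.0, 250.0, 80.0])
rng = np.random.default_rng(5)
data = rng.poisson(R @ truth).astype(float)       # observed (smeared) counts

theory_good = truth                               # hypothesis equal to the truth
theory_bad = np.full(n, truth.mean())             # a flat, mismatched hypothesis
# Forward-fold each theory through the response, then compare to the raw data.
chi2_good = chi2(data, R @ theory_good)
chi2_bad = chi2(data, R @ theory_bad)
```

The "bottom-line test" asks whether testing unfolded data against the bare theories would rank the hypotheses the same way as this forward-folded comparison does.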
Histogram bin width selection for time-dependent Poisson processes
Energy Technology Data Exchange (ETDEWEB)
Koyama, Shinsuke; Shinomoto, Shigeru [Department of Physics, Graduate School of Science, Kyoto University, Sakyo-ku, Kyoto 606-8502 (Japan)
2004-07-23
In constructing a time histogram of the event sequences derived from a nonstationary point process, we wish to determine the bin width such that the mean squared error of the histogram from the underlying rate of occurrence is minimized. We find that the optimal bin widths obtained for a doubly stochastic Poisson process and a sinusoidally regulated Poisson process exhibit different scaling relations with respect to the number of sequences, time scale and amplitude of rate modulation, but both diverge under similar parametric conditions. This implies that under these conditions, no determination of the time-dependent rate can be made. We also apply the kernel method to these point processes, and find that the optimal kernels do not exhibit any critical phenomena, unlike the time histogram method.
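The bin-width selection problem can be illustrated with the closely related cost-function recipe the same authors later popularized for rate estimation: choose the width minimizing C(w) = (2k - v) / w^2, where k and v are the mean and variance of the bin counts. A minimal sketch, with function and variable names that are illustrative rather than taken from the paper:

```python
import numpy as np

def optimal_bin_width(events, t_max, widths):
    """Choose the bin width minimizing the MISE-style cost
    C(w) = (2*k - v) / w**2, where k and v are the mean and
    variance of the bin counts (Shimazaki-Shinomoto selection rule)."""
    best_w, best_cost = widths[0], np.inf
    for w in widths:
        n_bins = max(1, int(round(t_max / w)))
        counts, _ = np.histogram(events, bins=n_bins, range=(0.0, t_max))
        aw = t_max / n_bins                      # realized bin width
        cost = (2.0 * counts.mean() - counts.var()) / aw ** 2
        if cost < best_cost:
            best_w, best_cost = w, cost
    return best_w

# toy usage: events pooled from a homogeneous Poisson process
rng = np.random.default_rng(0)
events = rng.uniform(0.0, 10.0, size=2000)
w = optimal_bin_width(events, 10.0, [0.05, 0.1, 0.2, 0.5, 1.0, 2.0])
```

For a constant-rate process the variance of the counts tracks the mean, the cost decreases with the width, and the selected width drifts toward the largest candidate, consistent with the divergence of the optimal bin width that the abstract describes when the rate modulation cannot be resolved.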
Face recognition with histograms of fractional differential gradients
Yu, Lei; Ma, Yan; Cao, Qi
2014-05-01
It has been proved that fractional differentiation can enhance the edge information and nonlinearly preserve textural detail in an image. This paper investigates its ability for face recognition and presents a local descriptor called histograms of fractional differential gradients (HFDG) to extract facial visual features. HFDG encodes a face image into gradient patterns using multi-orientation fractional differential masks, from which histograms of gradient directions are computed as the face representation. Experimental results on the Yale, face recognition technology (FERET), Carnegie Mellon University pose, illumination, and expression (CMU PIE), and A. Martinez and R. Benavente (AR) databases validate the feasibility of the proposed method and show that HFDG outperforms local binary patterns (LBP), histograms of oriented gradients (HOG), enhanced local directional patterns (ELDP), and Gabor feature-based methods.
Click-iT assay with improved DNA distribution histograms.
Hamelik, Ronald M; Krishan, Awtar
2009-10-01
The Click-iT Assay, developed and commercialized by Invitrogen, is based on incorporation of a new 5-bromo-2'-deoxyuridine analog, 5-ethynyl-2'-deoxyuridine (EdU), into newly synthesized DNA and its recognition by azide dyes via a copper-mediated "click" reaction. This relatively convenient and useful procedure depends on fixation of cells with paraformaldehyde and staining of the DNA with 7-aminoactinomycin-D (7-AAD). Both of these procedures result in DNA histograms with broad coefficients of variation (CVs). In this report, we have shown that after EdU incorporation, nuclei isolated by lysis can be incubated with the Click-iT Assay and stained with propidium iodide for generation of DNA histograms with low CVs. This modified procedure results in better DNA histograms by replacing 7-AAD with propidium iodide and also saves processing time by eliminating the fixation and permeabilization steps.
Blind Digital Image Watermarking Robust Against Histogram Equalization
Directory of Open Access Journals (Sweden)
H. Sadawarti
2012-01-01
Problem statement: Piracy in the era of the internet and computers has proved to be a major source of damage to the industry. Easy editing and copying of images causes great loss to the owner, as original images can be distributed through the internet very easily. To reduce the piracy and duplication of digital multimedia files, digital watermarking is dominating over the other available techniques. There are certain methods or attacks which are used to damage the watermark; one of the major attacks is histogram equalization and reduction of the number of histogram-equalized levels. Thus, there is a need to develop a method so that the watermark can be protected after histogram equalization. Approach: A blind digital watermarking algorithm is presented which embeds the watermark in the frequency domain. Firstly, DWT is applied on the original image and then DCT on the 4x4 blocks, to target the particular frequencies of the image for embedding the watermark which are not much affected by histogram equalization. Also, to enhance the security of the watermark, a dual encryption technique is deployed. Results: The algorithm was applied to four images: Lena, Cameraman, Baboon and Peppers. The evaluation of the algorithm is given in terms of peak signal-to-noise ratio and normalized correlation. The results prove that the algorithm is robust to the histogram equalization attack up to 2 grey levels. Conclusion/Recommendations: The developed algorithm proved its performance against histogram equalization, but it should also be checked against other attacks, such as addition of white noise, Gaussian noise, and filtering.
Remote logo detection using angle-distance histograms
Youn, Sungwook; Ok, Jiheon; Baek, Sangwook; Woo, Seongyoun; Lee, Chulhee
2016-05-01
Among the various computer vision applications, automatic logo recognition has drawn great interest from industry as well as academic institutions. In this paper, we propose an angle-distance map, which we use to develop a robust logo detection algorithm. The proposed angle-distance histogram is invariant to scale and rotation. The proposed method first uses shape information and color characteristics to find candidate regions and then applies the angle-distance histogram. Experiments show that the proposed method detects logos of various sizes and orientations.
Resolving Histogram Binning Dilemmas with Binless and Binfull Algorithms
Krislock, Abram
2014-01-01
The histogram is an analysis tool in widespread use within many sciences, with high energy physics as a prime example. However, there exists an inherent bias in the choice of binning for the histogram, with different choices potentially leading to different interpretations. This paper aims to eliminate this bias using two "debinning" algorithms. Both algorithms generate an observed cumulative distribution function from the data and use it to construct a representation of the underlying probability distribution function. The strengths and weaknesses of these two algorithms are compared and contrasted. The applicability and future prospects of these algorithms are also discussed.
Integral Histogram with Random Projection for Pedestrian Detection
Liu, Chang-Hua; Lin, Jian-Kun
2015-01-01
In this paper, we present a systematic study reporting several insights into HOG, one of the most widely used features in modern computer vision and image processing applications. We first show that its gradient magnitudes can be randomly projected with a random matrix. To handle over-fitting, an integral histogram based on the differences of randomly selected blocks is proposed. The experiments show that both the random projection and the integral histogram clearly outperform the original HOG feature. Finally, the two ideas are combined into a new descriptor termed IHRP, which outperforms the HOG feature with fewer dimensions and higher speed. PMID:26569486
Packing ellipsoids with overlap
Uhler, Caroline
2012-01-01
The problem of packing ellipsoids of different sizes and shapes into an ellipsoidal container so as to minimize a measure of overlap between ellipsoids is considered. A bilevel optimization formulation is given, together with an algorithm for the general case and a simpler algorithm for the special case in which all ellipsoids are in fact spheres. Convergence results are proved and computational experience is described and illustrated. The motivating application - chromosome organization in the human cell nucleus - is discussed briefly, and some illustrative results are presented.
Histogram-Based Calibration Method for Pipeline ADCs
Son, Hyeonuk; Jang, Jaewon; Kim, Heetae; Kang, Sungho
2015-01-01
Measurement and calibration of an analog-to-digital converter (ADC) using a histogram-based method requires a large volume of data and a long test duration, especially for a high-resolution ADC. A fast and accurate calibration method for pipelined ADCs is proposed in this research. The proposed calibration method composes histograms from the outputs of each stage and calculates the error sources. The digitized outputs of a stage are influenced directly by the operation of the prior stage, so the resulting histograms provide information about the errors in the prior stage. The composed histograms reduce the required number of samples and thus the calibration time, and the method can be implemented with simple modules. For a 14-bit pipelined ADC, the measured maximum integral non-linearity (INL) is improved from 6.78 to 0.52 LSB, and the spurious-free dynamic range (SFDR) and signal-to-noise-and-distortion ratio (SNDR) are improved from 67.0 to 106.2 dB and from 65.6 to 84.8 dB, respectively. PMID:26070196
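The classical code-density (histogram) test that motivates such calibration can be sketched in a few lines. This is the generic ramp-histogram DNL/INL estimate, not the stage-wise method proposed in the paper, and the names are illustrative:

```python
import numpy as np

def histogram_linearity(codes, n_bits):
    """Estimate DNL and INL (in LSB) from a code-density histogram,
    assuming a uniform ramp input that exercises every code equally."""
    n_codes = 2 ** n_bits
    counts = np.bincount(codes, minlength=n_codes).astype(float)
    inner = counts[1:-1]          # end codes absorb over/under-range
    ideal = inner.sum() / inner.size
    dnl = inner / ideal - 1.0     # per-code width error
    inl = np.cumsum(dnl)          # accumulated transfer-curve error
    return dnl, inl

# toy usage: an ideal 8-bit ADC digitizing a dense uniform ramp
rng = np.random.default_rng(1)
ramp = rng.uniform(0.0, 1.0, size=200_000)
codes = np.clip((ramp * 256).astype(int), 0, 255)
dnl, inl = histogram_linearity(codes, 8)
```

For an ideal converter both estimates stay near zero up to sampling noise; a real pipeline's stage errors show up as structured DNL/INL patterns, which is the information the paper's per-stage histograms exploit.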
Color and Edge Histograms Based Medicinal Plants' Image Retrieval
Directory of Open Access Journals (Sweden)
Basavaraj S. Anami
2012-08-01
In this paper, we propose a methodology for color and edge histogram based medicinal plant image retrieval. Medicinal plants are divided into herbs, shrubs and trees, and are used in ayurvedic medicines. Manual identification of medicinal plants requires a priori knowledge, so automatic recognition of medicinal plants is useful. Medicinal plant species such as Papaya, Neem, Tulasi and Aloevera are considered for identification and retrieval. The color histograms are obtained in the RGB, HSV and YCbCr color spaces, and the numbers of valleys and peaks in the color histograms are used as features. These features alone, however, are not sufficient to discriminate plant images, since the majority of plant images are green in color. We therefore also use edge and edge-direction histograms to capture edges in the stem and leafy parts. Finally, these features are used in the retrieval of medicinal plant images. Absolute distance, Euclidean distance and mean square error similarity measures are deployed in this work. The results show an average retrieval efficiency of 94% and 98% for the edge and edge-direction features respectively.
Reversible Watermarking with a Histogram Modification Method on the Difference Image
Directory of Open Access Journals (Sweden)
Yustina Retno
2011-07-01
Reversible watermarking schemes are widely used to maintain the authenticity of digital images. This research discusses a method based on histogram modification of the difference image, where the difference image is created from the difference values of adjacent pixels of the grayscale image. The embedding process begins by dividing the host image and watermark into b blocks, followed by making a difference image of each host image block. From the difference image histogram, the peak value is determined and the histogram is modified based on that peak value. Then each block of the watermark is inserted into the modified difference image, which is transformed back into a grayscale image. The extraction and recovery process is the inverse of the embedding stage: it begins by dividing the watermarked image into b blocks, followed by making a difference image of each block; the data are then extracted and the difference image histogram is shifted back using the peak value, after which the difference image is transformed back into a grayscale image. Experimental results demonstrate that the average insertion capacity is 14% greater than Xue's, with PSNR values over 48 dB for 4 x 4 pixel blocks, and 23% greater, with PSNR over 46 dB, for 8 x 8 pixel blocks. A comparison of robustness to line noise and salt-and-pepper noise at a density of 0.05 shows that a watermark with ECC is more robust than one without ECC. JPEG compression in lossless mode may be applied to the watermarked image. Multiple insertions of the watermark can be done, with the consequence that more insertions result in lower PSNR values. Keywords: difference image, histogram modification, reversible watermarking
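The peak-point shifting idea at the core of such schemes can be sketched on a plain value sequence. This is a deliberate simplification: the paper works block-wise on a difference image, while this sketch embeds directly into one histogram, and all names are illustrative:

```python
import numpy as np

def embed(values, bits):
    """Embed bits at the histogram peak p after shifting the levels in
    (p, z) right by one to free the level p+1, where z is the first
    empty level above the peak (classic histogram-shifting)."""
    v = np.asarray(values).copy()
    counts = np.bincount(v, minlength=v.max() + 2)
    p = int(counts.argmax())
    z = p + 1 + int(np.where(counts[p + 1:] == 0)[0][0])
    assert counts[p] >= len(bits), "not enough capacity"
    v[(v > p) & (v < z)] += 1            # make room at level p+1
    peak_idx = np.where(v == p)[0]
    for i, b in zip(peak_idx, bits):
        v[i] = p + b                     # bit 0 -> p, bit 1 -> p+1
    return v, p, z

def extract(v, p, z, n_bits):
    """Recover the bits and restore the original values exactly."""
    v = v.copy()
    idx = np.where((v == p) | (v == p + 1))[0][:n_bits]
    bits = [int(v[i] == p + 1) for i in idx]
    v[v == p + 1] = p                    # undo the embedded ones
    v[(v > p + 1) & (v <= z)] -= 1       # undo the shift
    return bits, v

# round-trip demo on a toy grayscale-like sequence
rng = np.random.default_rng(2)
vals = rng.integers(50, 60, size=500)
bits = [1, 0, 1, 1, 0, 0, 1]
wm, p, z = embed(vals, bits)
out_bits, restored = extract(wm, p, z, len(bits))
```

The reversibility comes entirely from the shift being invertible: every modified level moves by exactly one, so extraction can undo it without side information beyond p and z.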
Content Based Image Retrieval Using Local Color Histogram
Directory of Open Access Journals (Sweden)
Metty Mustikasari; Eri Prasetyo; Suryadi Harmanto
2014-01-01
This paper proposes a technique to retrieve images based on color features using local histograms. The image is divided into nine sub-blocks of equal size, and the color of each sub-block is extracted by quantizing the HSV color space into a 12x6x6 histogram. In this retrieval system, Euclidean distance and City block distance are used to measure the similarity of images. The algorithm is tested using the Corel image database. The performance of the retrieval system is measured in terms of its recall and precision, and its effectiveness is also measured based on AVRR (Average Rank of Relevant Images) and IAVRR (Ideal Average Rank of Relevant Images), which were proposed by Faloutsos. The experimental results show that the retrieval system has a good performance, and that the City block distance achieved a higher retrieval performance than the Euclidean distance.
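A minimal sketch of the block-wise HSV histogram feature and the two distance measures; the 3x3 block grid, array shapes, and random stand-in images are assumptions for illustration:

```python
import numpy as np

def local_color_histogram(hsv, blocks=3, bins=(12, 6, 6)):
    """Split an HSV image (channel values in [0,1)) into blocks x blocks
    sub-images and concatenate one normalized 12x6x6 histogram each."""
    h, w, _ = hsv.shape
    feats = []
    for i in range(blocks):
        for j in range(blocks):
            sub = hsv[i * h // blocks:(i + 1) * h // blocks,
                      j * w // blocks:(j + 1) * w // blocks]
            hist, _ = np.histogramdd(sub.reshape(-1, 3),
                                     bins=bins, range=[(0, 1)] * 3)
            feats.append(hist.ravel() / hist.sum())
    return np.concatenate(feats)

def euclidean(a, b):
    return float(np.sqrt(((a - b) ** 2).sum()))

def city_block(a, b):
    return float(np.abs(a - b).sum())

# toy usage with random arrays standing in for HSV images
rng = np.random.default_rng(6)
img_a = rng.random((48, 48, 3))
img_b = rng.random((48, 48, 3))
fa, fb = local_color_histogram(img_a), local_color_histogram(img_b)
```

Ranking a database then amounts to sorting by either distance to the query's feature vector; the City block distance is the cheaper of the two, which matches its use in the paper.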
Metamorphic Virus Variants Classification Using Opcode Frequency Histogram
Rad, Babak Bashari
2011-01-01
In order to prevent detection and evade signature-based scanning methods, which are normally exploited by antivirus software, metamorphic viruses use various obfuscation approaches. They transform their code in each new instance so that it looks entirely or partly different and contains dissimilar string sequences, while their behavior and function remain unchanged. This obfuscation process allows them to stay away from string-based signature detection. In this research, we use a statistical technique to compare the similarity between two files infected by two morphed versions of a given metamorphic virus. Our proposed solution is based on static analysis and uses the histogram of machine instruction frequencies in various offspring of obfuscated viruses. We use the Euclidean histogram distance metric to compare a pair of portable executable (PE) files. The aim of this study is to show that for some particular obfuscation methods, the presented solution can be exploited to detect morphed varieties of a file. Hence,...
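The core comparison can be sketched as follows, with a hypothetical opcode vocabulary and toy "disassemblies" standing in for real PE files:

```python
import numpy as np

def opcode_histogram(opcodes, vocab):
    """Normalized frequency histogram of machine instructions."""
    counts = np.array([opcodes.count(op) for op in vocab], float)
    return counts / counts.sum()

def histogram_distance(a, b):
    """Euclidean distance between two opcode histograms; a small
    distance suggests two morphed variants of the same program."""
    return float(np.sqrt(((a - b) ** 2).sum()))

vocab = ["mov", "push", "pop", "add", "jmp", "call", "nop"]
# hypothetical disassembly of two obfuscated variants and a benign file
variant_a = ["mov", "push", "mov", "call", "nop", "jmp", "mov", "pop"]
variant_b = ["mov", "nop", "push", "mov", "jmp", "call", "pop", "mov"]
benign    = ["add", "add", "mov", "add", "push", "add", "add", "add"]

d_same = histogram_distance(opcode_histogram(variant_a, vocab),
                            opcode_histogram(variant_b, vocab))
d_diff = histogram_distance(opcode_histogram(variant_a, vocab),
                            opcode_histogram(benign, vocab))
```

Instruction reordering, a common metamorphic trick, leaves the frequency histogram unchanged, which is exactly why the two toy variants compare as identical while the unrelated file does not.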
Contrast Enhancement Using Brightness Preserving Histogram Plateau Limit Technique
Directory of Open Access Journals (Sweden)
K. Santhi
2014-07-01
Contrast enhancement is an important factor for gray scale images. One of the widely accepted contrast enhancement methods is histogram equalization (HE). HE achieves comparatively good performance on almost all types of image, but sometimes produces excessive visual deterioration. The proposed Contrast Enhancement using Brightness Preserving Histogram Plateau Limit (CEBPHPL) method provides better brightness preservation without allowing an excessive degree of contrast enhancement. This method decomposes the input image by computing the local maxima of the smoothed image, using a Gaussian filter which reduces the noise. Then a clipping process is applied, which provides a better enhancement rate than the conventional methods. The experimental results of the proposed CEBPHPL are better than those of the existing methods.
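The plateau-limit idea, clipping the histogram before equalization so that no gray level dominates the CDF, can be sketched as follows. This is a generic clipped-HE sketch, not the exact CEBPHPL pipeline, and the plateau fraction is an assumed parameter:

```python
import numpy as np

def plateau_limited_equalization(img, plateau=0.02):
    """Histogram equalization with the histogram clipped at a plateau
    limit (a fraction of the pixel count) before building the CDF,
    which tempers the excessive stretch of plain HE."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    limit = plateau * img.size
    clipped = np.minimum(hist, limit)
    # redistribute the clipped mass uniformly so the CDF still ends at 1
    clipped += (hist - clipped).sum() / 256.0
    cdf = np.cumsum(clipped) / clipped.sum()
    lut = np.round(255.0 * cdf).astype(np.uint8)
    return lut[img]

# toy usage: stretch a narrow-range (low-contrast) image
rng = np.random.default_rng(7)
low_contrast = rng.integers(100, 140, (64, 64)).astype(np.uint8)
stretched = plateau_limited_equalization(low_contrast)
```

Lowering the plateau fraction pushes the mapping toward the identity (less enhancement); raising it recovers plain HE, so the single parameter trades contrast gain against the over-enhancement artifacts the abstract mentions.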
Head Tracking Using Shapes and Adaptive Color Histograms
Institute of Scientific and Technical Information of China (English)
刘青山; 马颂德; 卢汉青
2002-01-01
A new method is presented for tracking a person's head in real-time. The head is shaped as an ellipse, and an adaptively modified RGB color histogram is used to represent the tracked object (head). The method is composed of two parts. First, a robust nonparametric technique, called the mean shift algorithm, is adopted for histogram matching to estimate the head's location in the current frame. Second, a local search is performed after histogram matching to maximize the normalized gradient magnitude around the boundary of the elliptical head, so that a more accurate location and the best scale size of the head can be obtained. The method is demonstrated to be a real-time tracker and robust to clutter, scale variation, occlusion, rotation and camera motion, for several test sequences.
Accelerated weight histogram method for exploring free energy landscapes
Lindahl, Viveca; Hess, Berk
2014-01-01
Calculating free energies is an important and notoriously difficult task for molecular simulations. The rapid increase in computational power has made it possible to probe increasingly complex systems, yet extracting accurate free energies from these simulations remains a major challenge. Fully exploring the free energy landscape of, say, a biological macromolecule typically requires sampling large conformational changes and slow transitions. Often, the only feasible way to study such a system is to simulate it using an enhanced sampling method. The accelerated weight histogram (AWH) method is a new, efficient extended ensemble sampling technique which adaptively biases the simulation to promote exploration of the free energy landscape. The AWH method uses a probability weight histogram which allows for efficient free energy updates and results in an easy discretization procedure. A major advantage of the method is its general formulation, making it a powerful platform for developing further extensions and an...
Histogram Equalization to Model Adaptation for Robust Speech Recognition
Directory of Open Access Journals (Sweden)
Hoirin Kim
2010-01-01
We propose a new model adaptation method based on the histogram equalization technique for providing robustness in noisy environments. The trained acoustic mean models of a speech recognizer are adapted into environmentally matched conditions by using the histogram equalization algorithm on a single utterance basis. For more robust speech recognition in heavily noisy conditions, trained acoustic covariance models are efficiently adapted by signal-to-noise-ratio-dependent linear interpolation between trained covariance models and utterance-level sample covariance models. Speech recognition experiments on both the digit-based Aurora2 task and a large vocabulary task showed that the proposed model adaptation approach provides significant performance improvements compared to the baseline speech recognizer trained on clean speech data.
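The underlying histogram equalization operation, mapping the empirical distribution of noisy features onto a clean reference distribution by quantile matching, can be sketched as follows. This is a feature-space illustration under assumed Gaussian toy data; the paper applies the idea to acoustic model means on a per-utterance basis:

```python
import numpy as np

def histogram_equalize(test, reference, n_q=100):
    """Map test feature values onto the reference (training)
    distribution by matching empirical quantiles, the core
    operation of HEQ-style compensation."""
    qs = np.linspace(0.0, 1.0, n_q)
    test_q = np.quantile(test, qs)
    ref_q = np.quantile(reference, qs)
    # monotone piecewise-linear map from test quantiles to reference ones
    return np.interp(test, test_q, ref_q)

rng = np.random.default_rng(3)
clean = rng.normal(0.0, 1.0, 5000)               # "training" distribution
noisy = 0.5 * rng.normal(0.0, 1.0, 400) + 2.0    # shifted/scaled by noise
equalized = histogram_equalize(noisy, clean)
```

Because the map is monotone, feature ordering within the utterance is preserved while the global shift and scale introduced by the noise are removed, which is what makes the compensated features match the trained models again.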
Accelerating Monte Carlo Renderers by Ray Histogram Fusion
Directory of Open Access Journals (Sweden)
Mauricio Delbracio
2015-03-01
This paper details the recently introduced Ray Histogram Fusion (RHF) filter for accelerating Monte Carlo renderers [M. Delbracio et al., Boosting Monte Carlo Rendering by Ray Histogram Fusion, ACM Transactions on Graphics, 33 (2014)]. In this filter, each pixel in the image is characterized by the colors of the rays that reach its surface. Pixels are compared using a statistical distance on the associated ray color distributions. Based on this distance, the filter decides whether two pixels can share their rays or not. The RHF filter is consistent: as the number of samples increases, more evidence is required to average two pixels. The algorithm provides a significant gain in PSNR, or equivalently accelerates the rendering process by using many fewer Monte Carlo samples without observable bias. Since the RHF filter depends only on the Monte Carlo samples' color values, it can be naturally combined with all rendering effects.
Histogram method in finite density QCD with phase quenched simulations
Nakagawa, Y; Aoki, S; Kanaya, K; Ohno, H; Saito, H; Hatsuda, T; Umeda, T
2011-01-01
We propose a new approach to finite density QCD based on a histogram method with phase quenched simulations at finite chemical potential. Integrating numerically the derivatives of the logarithm of the quark determinant with respect to the chemical potential, we calculate the reweighting factor and the complex phase of the quark determinant. The complex phase is handled with a cumulant expansion to avoid the sign problem. We examine the applicability of this method.
The FLIC Overlap Quark Propagator
Kamleh, Waseem; Bowman, Patrick O.; Leinweber, Derek B.; Williams, Anthony G.; Zhang, Jianbo
2004-01-01
FLIC overlap fermions are a variant of the standard (Wilson) overlap action, with the FLIC (Fat Link Irrelevant Clover) action as the overlap kernel rather than the Wilson action. The structure of the FLIC overlap fermion propagator in momentum space is studied, and a comparison against previous studies of the Wilson overlap propagator in quenched QCD is performed. To explore the scaling properties of the propagator for the two actions, numerical calculations are performed in Landau Gauge across three lattices with different lattice spacing $a$ and similar physical volumes. We find that at light quark masses the actions agree in both the infrared and the ultraviolet, but at heavier masses some disagreement in the ultraviolet appears. This is attributed to the two actions having different discretisation errors, with the FLIC overlap providing superior performance in this regime. Both actions scale reasonably, but some scaling violations are observed.
Extraction of Facial Feature Points Using Cumulative Histogram
Paul, Sushil Kumar; Bouakaz, Saida
2012-01-01
This paper proposes a novel adaptive algorithm to automatically extract facial feature points, such as the eyebrow corners, eye corners, nostrils, nose tip, and mouth corners, in frontal-view faces; it is based on a cumulative histogram approach with varying threshold values. At first, the method adopts the Viola-Jones face detector to detect the location of the face and crop the face region in an image. Based on the structure of the human face, six relevant regions, namely the right eyebrow, left eyebrow, right eye, left eye, nose, and mouth areas, are cropped from the face image. Then the histogram of each cropped region is computed, and its cumulative histogram is thresholded at varying values to create a new filtered image in an adaptive way. The connected component of the area of interest in each filtered image indicates the respective feature region. A simple linear search algorithm for the eyebrow, eye and mouth filtered images and a contour algorithm for nos...
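The thresholding step can be sketched as follows: binarize a cropped region at the intensity where its cumulative histogram reaches a chosen fraction of the pixels. The region contents and the fraction are illustrative, not taken from the paper:

```python
import numpy as np

def cumulative_threshold(region, fraction):
    """Binarize a grayscale region at the intensity where the
    cumulative histogram first reaches the given fraction of pixels;
    dark facial features fall below this adaptive threshold."""
    hist = np.bincount(region.ravel(), minlength=256)
    cdf = np.cumsum(hist) / region.size
    level = int(np.searchsorted(cdf, fraction))
    return region <= level

# toy usage: a bright region containing one dark "feature" (e.g. an eye)
rng = np.random.default_rng(4)
region = rng.integers(120, 200, (40, 60)).astype(np.uint8)
region[15:20, 20:35] = 30
mask = cumulative_threshold(region, 0.05)
```

Sweeping the fraction plays the role of the paper's varying threshold values: at small fractions only the darkest structure survives, and the connected component of that mask localizes the feature.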
Independent histogram pursuit for segmentation of skin lesions
DEFF Research Database (Denmark)
Gomez, D.D.; Butakoff, C.; Ersbøll, Bjarne Kjær;
2008-01-01
In this paper, an unsupervised algorithm, called the Independent Histogram Pursuit (HIP), for segmenting dermatological lesions is proposed. The algorithm estimates a set of linear combinations of image bands that enhance different structures embedded in the image. In particular, the first estimated combination enhances the contrast of the lesion to facilitate its segmentation. Given an N-band image, this first combination corresponds to a line in N dimensions, such that the separation between the two main modes of the histogram obtained by projecting the pixels onto this line is maximized. The algorithm is able to deal with different types of dermatological lesions. The boundary detection precision using k-means segmentation was close to 97%. The proposed algorithm can be easily combined with the majority of classification algorithms.
Conformational thermodynamics of biomolecular complexes: The histogram-based method
Das, Amit; Sikdar, Samapan; Ghosh, Mahua; Chakrabarti, J.
2015-09-01
Conformational changes in biomacromolecules govern the majority of biological processes. Complete characterization of the conformational contributions to the thermodynamics of complexation of biomacromolecules has been challenging. Although advances in NMR relaxation experiments and several computational studies have revealed important aspects of conformational entropy changes, efficient and large-scale estimation still remains an intriguing facet. The recent histogram-based method (HBM) offers a simple yet rigorous route to estimate both conformational entropy and free energy changes from the same set of histograms in an efficient manner. The HBM utilizes the power of histograms, which can be generated as accurately as desired from an arbitrarily large sample space drawn from atomistic simulation trajectories. Here we discuss some recent applications of the HBM, using dihedral angles of amino acid residues as conformational variables, which provide a good measure of the conformational thermodynamics of several protein-peptide complexes obtained from NMR, of metal-ion binding to an important metalloprotein, of interfacial changes in a protein-protein complex, and of insight into protein function coupled with conformational changes. We conclude the paper with a few future directions worth pursuing.
Topological susceptibility from overlap fermion
Institute of Scientific and Technical Information of China (English)
应和平; 张剑波
2003-01-01
We numerically calculate the topological charge of the gauge configurations on a finite lattice by the fermionic method with overlap fermions. By using the lattice index theorem, we identify the index of the massless overlap fermion operator with the topological charge of the background gauge configuration. The resulting topological susceptibility χ is in good agreement with the anticipation made by Witten and Veneziano.
Umanodan, Tomokazu; Fukukura, Yoshihiko; Kumagae, Yuichi; Shindo, Toshikazu; Nakajo, Masatoyo; Takumi, Koji; Nakajo, Masanori; Hakamada, Hiroto; Umanodan, Aya; Yoshiura, Takashi
2017-04-01
To determine the diagnostic performance of apparent diffusion coefficient (ADC) histogram analysis in diffusion-weighted (DW) magnetic resonance imaging (MRI) for differentiating adrenal adenoma from pheochromocytoma, we retrospectively evaluated 52 adrenal tumors (39 adenomas and 13 pheochromocytomas) in 47 patients (21 men, 26 women; mean age, 59.3 years; range, 16-86 years) who underwent DW 3.0T MRI. Histogram parameters of ADC (b-values of 0 and 200 [ADC200], 0 and 400 [ADC400], and 0 and 800 s/mm(2) [ADC800]) - mean, variance, coefficient of variation (CV), kurtosis, skewness, and entropy - were compared between adrenal adenomas and pheochromocytomas using the Mann-Whitney U-test. Receiver operating characteristic (ROC) curves for the histogram parameters were generated to differentiate adrenal adenomas from pheochromocytomas. Sensitivity and specificity were calculated by using a threshold criterion that maximizes the average of sensitivity and specificity. Variance and CV of ADC800 were significantly higher in pheochromocytomas than in adrenal adenomas (P < 0.001 and P = 0.001, respectively). With all b-value combinations, the entropy of ADC was significantly higher in pheochromocytomas than in adrenal adenomas (all P ≤ 0.001), and showed the highest area under the ROC curve among the ADC histogram parameters for diagnosing adrenal adenomas (ADC200, 0.82; ADC400, 0.87; and ADC800, 0.92), with sensitivity of 84.6% and specificity of 84.6% (cutoff, ≤2.82) for ADC200; sensitivity of 89.7% and specificity of 84.6% (cutoff, ≤2.77) for ADC400; and sensitivity of 94.9% and specificity of 92.3% (cutoff, ≤2.67) for ADC800. ADC histogram analysis of DW MRI can help differentiate adrenal adenoma from pheochromocytoma. J. Magn. Reson. Imaging 2017;45:1195-1203. © 2016 International Society for Magnetic Resonance in Medicine.
Monte Carlo estimation of the number of tatami tilings
Kimura, Kenji; Higuchi, Saburo
2016-04-01
Motivated by the way Japanese tatami mats are placed on the floor, we consider domino tilings with a constraint and estimate the number of such tilings of plane regions. We map the system onto a monomer-dimer model with a novel local interaction on the dual lattice. We make use of a variant of the Hamiltonian replica exchange Monte Carlo method where data for ferromagnetic and anti-ferromagnetic models are combined to make a single family of histograms. The properties of the density of states are studied beyond exact enumeration and combinatorial methods. The logarithm of the number of tilings is linear in the boundary length of the region for all the regions studied.
Histogram Modification and Wavelet Transform for High Performance Watermarking
Directory of Open Access Journals (Sweden)
Ying-Shen Juang
2012-01-01
This paper proposes a reversible watermarking technique for natural images. Owing to the similarity of neighboring coefficients' values in the wavelet domain, most differences between two adjacent pixels are close to zero, and the histogram is built from these difference statistics. As more peak points can be used for secret data hiding, the hiding capacity is improved compared with conventional methods. Moreover, as the concentration of the differences around zero is increased, the transparency of the host image can be improved. Experimental results and comparisons show that the proposed method has advantages in both hiding capacity and transparency.
Colorfulness Enhancement Using Image Classifier Based on Chroma-histogram
Institute of Scientific and Technical Information of China (English)
Moon-cheol KIM; Kyoung-won LIM
2010-01-01
The paper proposes a colorfulness enhancement of pictorial images using an image classifier based on the chroma histogram. This approach first estimates the strength of colorfulness of images and their types. With this information, the algorithm automatically adjusts image colorfulness for a more natural image look. With the help of an additional detection of skin colors and pixel-chroma-adaptive local processing, the algorithm produces a more natural image look. The algorithm's performance was tested in an image quality judgment experiment with 20 persons. The experimental results indicate a better image preference.
Text-Independent Speaker Identification Using the Histogram Transform Model
DEFF Research Database (Denmark)
Ma, Zhanyu; Yu, Hong; Tan, Zheng-Hua;
2017-01-01
In this paper, we propose a novel probabilistic method for the task of text-independent speaker identification (SI). In order to capture dynamic information during SI, we design super-MFCC features by cascading three neighboring Mel-frequency cepstral coefficient (MFCC) frames together. These super-MFCC vectors are utilized for probabilistic model training such that the speaker's characteristics can be sufficiently captured. The probability density function (PDF) of the aforementioned super-MFCC features is estimated by the recently proposed histogram transform (HT) method...
An Improved Double-Threshold Method Based on Gradient Histogram
Institute of Scientific and Technical Information of China (English)
YANG Shen; CHEN Shu-zhen; ZHANG Bing
2004-01-01
This paper analyzes the characteristics of the output gradient histogram and the shortcomings of several traditional automatic threshold methods for segmenting gradient images. An improved double-threshold method is then proposed, which combines the maximum between-class variance method, the estimating-area method, and the double-threshold method. This method can automatically select two different thresholds to segment gradient images. Computer simulations comparing the traditional methods with this algorithm show that the method obtains satisfying results.
Phase structure of finite density QCD with a histogram method
Nakagawa, Yoshiyuki; Ejiri, Shinji; Hatsuda, Tetsuo; Kanaya, Kazuyuki; Ohno, Hiroshi; Saito, Hana; Umeda, Takashi
2012-01-01
We study the phase structure of QCD in the $T$-$\mu$ plane using a histogram method and the reweighting technique, performing phase-quenched simulations of two-flavor QCD with an RG-improved gauge action and an O($a$)-improved Wilson quark action. Taking into account the effects of the complex phase of the quark determinant via the cumulant expansion method, we calculate the probability distribution function of the plaquette and the phase-quenched determinant as a function of $T$ and $\mu$. We discuss the order of the QCD phase transition by examining the shape of the probability distribution function.
Efficient local statistical analysis via integral histograms with discrete wavelet transform.
Lee, Teng-Yok; Shen, Han-Wei
2013-12-01
Histograms computed from local regions are commonly used in many visualization applications, and allowing the user to query histograms interactively in regions of arbitrary locations and sizes plays an important role in feature identification and tracking. Computing histograms in regions with arbitrary location and size, nevertheless, can be time consuming for large data sets since it involves expensive I/O and scans of data elements. To achieve both performance- and storage-efficient query of local histograms, we present a new algorithm called WaveletSAT, which utilizes integral histograms, an extension of summed area tables (SAT), and the discrete wavelet transform (DWT). Similar to a SAT, an integral histogram is the histogram computed from the area between each grid point and the grid origin, which can be pre-computed to support fast query. Nevertheless, because one histogram contains multiple bins, it would be very expensive to store one integral histogram at each grid point. To reduce the storage cost for large integral histograms, WaveletSAT treats the integral histograms of all grid points as multiple SATs, each of which can be converted into a sparse representation via DWT, allowing the reconstruction of axis-aligned region histograms of arbitrary sizes from a limited number of wavelet coefficients. In addition, we present an efficient wavelet transform algorithm for SATs that can operate on each grid point separately in logarithmic time complexity, which can be extended to a parallel GPU-based implementation. With theoretical and empirical demonstration, we show that WaveletSAT can achieve fast preprocessing and smaller storage overhead than the conventional integral histogram approach with comparable query performance.
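The integral-histogram (SAT-of-histograms) structure that WaveletSAT compresses can be sketched in a few lines of NumPy; the function names and the toy bin layout below are illustrative, not from the paper.

```python
import numpy as np

def integral_histogram(labels, n_bins):
    """Per-bin summed-area tables: ih[y, x, b] = count of bin b in labels[:y, :x]."""
    H, W = labels.shape
    onehot = np.zeros((H, W, n_bins))
    onehot[np.arange(H)[:, None], np.arange(W)[None, :], labels] = 1
    ih = onehot.cumsum(axis=0).cumsum(axis=1)
    # pad with a zero row/column so queries touching the origin work
    return np.pad(ih, ((1, 0), (1, 0), (0, 0)))

def region_histogram(ih, y0, x0, y1, x1):
    """Histogram of labels[y0:y1, x0:x1] from 4 lookups (inclusion-exclusion)."""
    return ih[y1, x1] - ih[y0, x1] - ih[y1, x0] + ih[y0, x0]

labels = np.array([[0, 1],
                   [2, 1]])            # bin index of each grid point
ih = integral_histogram(labels, 3)
full = region_histogram(ih, 0, 0, 2, 2)   # histogram of the whole grid
```

Each arbitrary-size region query costs four lookups regardless of region size, which is exactly the property the paper pays for in storage and then recovers via DWT sparsification.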
On the Neuberger overlap operator
Boriçi, Artan
1999-04-01
We compute Neuberger's overlap operator by the Lanczos algorithm applied to the Wilson-Dirac operator. Locality of the operator for quenched QCD data and its eigenvalue spectrum in an instanton background are studied.
Variational Histogram Equalization for Single Color Image Defogging
Directory of Open Access Journals (Sweden)
Li Zhou
2016-01-01
Full Text Available Foggy images taken in bad weather inevitably suffer from contrast loss and color distortion. Existing defogging methods merely try to recover an accurate scene transmission, ignoring the unpleasing distortion they introduce and their high complexity. Different from previous works, we propose a simple but powerful method based on histogram equalization and the physical degradation model. By revising two constraints in a variational histogram equalization framework, the intensity component of a fog-free image can be estimated in HSI color space, with the airlight inferred in advance through a color attenuation prior. To cut down the time consumption, a general variation filter is proposed to obtain a numerical solution from the revised framework. Given the estimated intensity component, it is straightforward to infer the saturation component from the physical degradation model in the saturation channel. Accordingly, the fog-free image can be restored from the estimated intensity and saturation components. Finally, the proposed method is tested on several foggy images and assessed by two no-reference indexes. Experimental results reveal that our method is superior to three groups of relevant, state-of-the-art defogging methods.
Accelerated weight histogram method for exploring free energy landscapes
Lindahl, V.; Lidmar, J.; Hess, B.
2014-07-01
Calculating free energies is an important and notoriously difficult task for molecular simulations. The rapid increase in computational power has made it possible to probe increasingly complex systems, yet extracting accurate free energies from these simulations remains a major challenge. Fully exploring the free energy landscape of, say, a biological macromolecule typically requires sampling large conformational changes and slow transitions. Often, the only feasible way to study such a system is to simulate it using an enhanced sampling method. The accelerated weight histogram (AWH) method is a new, efficient extended ensemble sampling technique which adaptively biases the simulation to promote exploration of the free energy landscape. The AWH method uses a probability weight histogram which allows for efficient free energy updates and results in an easy discretization procedure. A major advantage of the method is its general formulation, making it a powerful platform for developing further extensions and analyzing its relation to already existing methods. Here, we demonstrate its efficiency and general applicability by calculating the potential of mean force along a reaction coordinate for both a single dimension and multiple dimensions. We make use of a non-uniform, free energy dependent target distribution in reaction coordinate space so that computational efforts are not wasted on physically irrelevant regions. We present numerical results for molecular dynamics simulations of lithium acetate in solution and chignolin, a 10-residue long peptide that folds into a β-hairpin. We further present practical guidelines for setting up and running an AWH simulation.
Accelerated weight histogram method for exploring free energy landscapes
Energy Technology Data Exchange (ETDEWEB)
Lindahl, V.; Lidmar, J.; Hess, B. [Department of Theoretical Physics and Swedish e-Science Research Center, KTH Royal Institute of Technology, 10691 Stockholm (Sweden)
2014-07-28
Calculating free energies is an important and notoriously difficult task for molecular simulations. The rapid increase in computational power has made it possible to probe increasingly complex systems, yet extracting accurate free energies from these simulations remains a major challenge. Fully exploring the free energy landscape of, say, a biological macromolecule typically requires sampling large conformational changes and slow transitions. Often, the only feasible way to study such a system is to simulate it using an enhanced sampling method. The accelerated weight histogram (AWH) method is a new, efficient extended ensemble sampling technique which adaptively biases the simulation to promote exploration of the free energy landscape. The AWH method uses a probability weight histogram which allows for efficient free energy updates and results in an easy discretization procedure. A major advantage of the method is its general formulation, making it a powerful platform for developing further extensions and analyzing its relation to already existing methods. Here, we demonstrate its efficiency and general applicability by calculating the potential of mean force along a reaction coordinate for both a single dimension and multiple dimensions. We make use of a non-uniform, free energy dependent target distribution in reaction coordinate space so that computational efforts are not wasted on physically irrelevant regions. We present numerical results for molecular dynamics simulations of lithium acetate in solution and chignolin, a 10-residue long peptide that folds into a β-hairpin. We further present practical guidelines for setting up and running an AWH simulation.
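The adaptive flat-histogram idea behind AWH can be illustrated with a Wang-Landau-style sketch on a discrete toy landscape. This is a simplified stand-in (single decreasing update size, nearest-neighbor moves, uniform target distribution), not the AWH update rule or its implementation; the energies below are arbitrary.

```python
import numpy as np

# Toy landscape: 8 discrete states along a "reaction coordinate" (beta = 1).
E = np.array([0.0, 1.0, 2.0, 1.0, 0.5, 3.0, 2.5, 1.5])
n = len(E)
logw = -E              # log Boltzmann weight of each state
G = np.zeros(n)        # adaptive log-bias; converges to logw + const
f = 1.0                # update size, halved each time the histogram is flat
rng = np.random.default_rng(1)
x = 0
while f > 1e-4:
    hist = np.zeros(n)
    while True:
        y = (x + rng.choice([-1, 1])) % n       # nearest-neighbor proposal
        # Metropolis step targeting the biased (approximately flat) distribution
        if np.log(rng.random()) < (logw[y] - G[y]) - (logw[x] - G[x]):
            x = y
        G[x] += f                               # push down the visited state
        hist[x] += 1
        if hist.min() > 0.8 * hist.mean():      # flatness criterion reached
            break
    f *= 0.5

# Free energy estimate up to a constant; here each state is a single
# configuration, so F should approximate E itself.
F = -(G - G.mean())
```

The probability-weight histogram in AWH plays the role of `G` here: it is updated on the fly so that sampling along the coordinate becomes flat, and the free energy is read off from the converged bias.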
Lean histogram of oriented gradients features for effective eye detection
Sharma, Riti; Savakis, Andreas
2015-11-01
Reliable object detection is very important in computer vision and robotics applications. The histogram of oriented gradients (HOG) is established as one of the most popular hand-crafted features, which along with support vector machine (SVM) classification provides excellent performance for object recognition. We investigate dimensionality reduction on HOG features in combination with SVM classifiers to obtain efficient feature representation and improved classification performance. In addition to lean HOG features, we explore descriptors resulting from dimensionality reduction on histograms of binary descriptors. We consider three dimensionality reduction techniques: standard principal component analysis; random projections, a computationally efficient, data-independent linear mapping; and locality preserving projections (LPP), which learns the manifold structure of the data. Our methods focus on the application of eye detection and were tested on an eye database created using the BioID and FERET face databases. Our results indicate that manifold learning is beneficial to classification utilizing HOG features. To demonstrate the broader usefulness of lean HOG features for object class recognition, we evaluated our system's classification performance on the CalTech-101 dataset with favorable outcomes.
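Of the three techniques, random projection is the simplest to sketch, precisely because it is data independent. A hedged illustration (the dimensions, 3780-dim HOG-like vectors reduced to 64, are illustrative choices, not the paper's settings):

```python
import numpy as np

def random_projection(X, d, seed=0):
    """Project rows of X (n_samples x n_features) to d dimensions with a
    Gaussian random matrix; scaling by 1/sqrt(d) roughly preserves
    Euclidean norms and pairwise distances (Johnson-Lindenstrauss)."""
    rng = np.random.default_rng(seed)
    R = rng.normal(size=(X.shape[1], d)) / np.sqrt(d)
    return X @ R

# e.g. 10 HOG-like descriptors of length 3780 reduced to 64 dimensions
X = np.random.default_rng(1).normal(size=(10, 3780))
Y = random_projection(X, 64)
```

Because the projection matrix never looks at the data, the same `R` can be fixed at training time and reused at detection time at negligible cost, which is the appeal over PCA or LPP.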
The Transition Matrix in Flat-histogram Sampling
Brown, Gregory; Eisenbach, M.; Li, Y. W.; Stocks, G. M.; Nicholson, D. M.; Odbadrakh, Kh.; Rikvold, P. A.
2015-03-01
Calculating the thermodynamic density of states (DOS) via flat-histogram sampling is a powerful numerical method for characterizing the temperature-dependent properties of materials. Since the calculated DOS is refined directly from the statistics of the sampling, methods of accelerating the sampling, e.g. through windowing and slow forcing, skew the resulting DOS. Calculating the infinite-temperature transition matrix during the flat-histogram sampling decouples the sampling from estimating the DOS, and allows the techniques of Transition Matrix Monte Carlo to be applied. This enables the calculation of the properties for very large system sizes and thus finite-size scaling analysis of the specific heat, magnetic susceptibility, and cumulant crossings at critical points. We discuss these developments in the context of models for magnetocaloric and spin-crossover materials. This work was performed at the Oak Ridge National Laboratory, which is managed by UT-Battelle for the U.S. Department of Energy. It was sponsored by the U.S. Department of Energy, Office of Basic Energy Sciences, Office of Advanced Scientific Computing Research, and the Oak Ridge Leadership Computing Facility. PAR is supported by the National Science Foundation.
Adaptive edge histogram descriptor for landmine detection using GPR
Frigui, Hichem; Fadeev, Aleksey; Karem, Andrew; Gader, Paul
2009-05-01
The Edge Histogram Detector (EHD) is a landmine detection algorithm for sensor data generated by ground penetrating radar (GPR). It uses edge histograms for feature extraction and a possibilistic K-Nearest Neighbors (K-NN) rule for confidence assignment. To reduce the computational complexity of the EHD and improve its generalization, the K-NN classifier uses a few prototypes that capture the variations of the signatures within each class. Each of these prototypes is assigned a label in the class of mines and a label in the class of clutter to capture its degree of sharing between these classes. The EHD has been tested extensively. It has demonstrated excellent performance on large real-world data sets, and has been implemented in real-time versions on hand-held and vehicle-mounted GPR. In this paper, we propose two modifications to the EHD to improve its performance and adaptability. First, instead of using a fixed threshold to decide whether the edge at a certain location is strong enough, we use an adaptive threshold that is learned from the background surrounding the target. This modification makes the EHD more adaptive to different terrains and to mines buried at different depths. Second, we introduce an additional training component that tunes the prototype features and labels to different environments. Results on large and diverse GPR data collections show that the proposed adaptive EHD outperforms the baseline EHD. We also show that the edge threshold can vary significantly according to the edge type, alarm depth, and soil conditions.
Alves, Nelson A; Rizzi, Leandro G
2015-01-01
Microcanonical thermostatistics analysis has become an important tool to reveal essential aspects of phase transitions in complex systems. An efficient way to estimate the microcanonical inverse temperature $\beta(E)$ and the microcanonical entropy $S(E)$ is achieved with the statistical temperature weighted histogram analysis method (ST-WHAM). The strength of this method lies in its flexibility, as it can be used to analyse data produced by algorithms with generalised sampling weights. However, for any sampling weight, ST-WHAM requires the calculation of derivatives of energy histograms $H(E)$, which leads to non-trivial and tedious binning tasks for models with a continuous energy spectrum, such as those for biomolecular and colloidal systems. Here, we discuss two alternative methods that avoid the need for such energy binning and obtain continuous estimates of $H(E)$ for evaluating $\beta(E)$ via ST-WHAM: (i) a series expansion to estimate probability densities from the empirical cumulative distrib...
Chi-Squared Distance Metric Learning for Histogram Data
Directory of Open Access Journals (Sweden)
Wei Yang
2015-01-01
Full Text Available Learning a proper distance metric for histogram data plays a crucial role in many computer vision tasks. The chi-squared distance is a nonlinear metric and is widely used to compare histograms. In this paper, we show how to learn a general form of chi-squared distance based on the nearest neighbor model. In our method, the margin of a sample is first defined with respect to the nearest hits (nearest neighbors from the same class) and the nearest misses (nearest neighbors from different classes), and then the simplex-preserving linear transformation is trained by maximizing the margin while minimizing the distance between each sample and its nearest hits. With the iterative projected gradient method for optimization, we naturally introduce the l2,1-norm regularization into the proposed method for sparse metric learning. Comparative studies with state-of-the-art approaches on five real-world datasets verify the effectiveness of the proposed method.
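The symmetric chi-squared distance the abstract starts from, together with the simplex-preserving linear map idea, can be sketched as follows. The particular matrix `L` below is an illustrative stand-in for the learned transformation, not the paper's trained result.

```python
import numpy as np

def chi2_distance(p, q, eps=1e-12):
    """Symmetric chi-squared distance between two histograms."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return 0.5 * np.sum((p - q) ** 2 / (p + q + eps))

def generalized_chi2(p, q, L):
    """Chi-squared distance after a simplex-preserving linear map L.
    L has nonnegative entries and columns summing to 1, so L @ p is
    again a valid histogram whenever p is."""
    return chi2_distance(L @ p, L @ q)

p = np.array([0.5, 0.5])
q = np.array([1.0, 0.0])
d = chi2_distance(p, q)   # 0.5 * (0.25/1.5 + 0.25/0.5) = 1/3

# A column-stochastic L keeps histograms on the probability simplex.
L = np.array([[0.7, 0.2],
              [0.3, 0.8]])
dL = generalized_chi2(p, q, L)
```

The column-stochastic constraint is what the paper's "simplex-preserving" projection step enforces during the gradient iterations.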
Illumination Invariant Face Recognition using SQI and Weighted LBP Histogram
Directory of Open Access Journals (Sweden)
Mohsen Biglari
2014-12-01
Full Text Available Face recognition under uneven illumination is still an open problem. One of the main challenges in real-world face recognition systems is illumination variation. In this paper, a novel illumination-invariant face recognition approach based on the Self Quotient Image (SQI) and weighted Local Binary Pattern (WLBP) histograms is proposed. In this system, performance is increased by using different sigma values of the SQI for training and testing. Furthermore, using two multi-region uniform LBP operators simultaneously for feature extraction makes the system more robust to illumination variation. This approach gathers image information at different local and global levels. The weighted Chi-square statistic is used for histogram comparison, and nearest neighbor (1-NN) is used as the classifier. The weighted approach emphasizes the more important regions of the face. The proposed approach is compared with some recent and traditional methods, such as QI, SQI, QIR, MQI, DMQI, DSFQI, PCA, and LDA, on the Yale face database B and the CMU-PIE database. The experimental results show that the proposed method outperforms the other tested methods.
Visual Contrast Enhancement Algorithm Based on Histogram Equalization
Directory of Open Access Journals (Sweden)
Chih-Chung Ting
2015-07-01
Full Text Available Image enhancement techniques primarily improve the contrast of an image to lend it a better appearance. One of the popular enhancement methods is histogram equalization (HE) because of its simplicity and effectiveness. However, it is rarely applied to consumer electronics products because it can cause excessive contrast enhancement and feature loss problems. These problems make the images processed by HE look unnatural and introduce unwanted artifacts in them. In this study, a visual contrast enhancement algorithm (VCEA) based on HE is proposed. VCEA considers the requirements of human visual perception in order to address the drawbacks of HE. It effectively solves the excessive contrast enhancement problem by adjusting the spaces between two adjacent gray values of the HE histogram. In addition, VCEA reduces the effects of the feature loss problem by using the obtained spaces. Furthermore, VCEA enhances the detailed textures of an image to generate an enhanced image with better visual quality. Experimental results show that images obtained by applying VCEA have higher contrast and are better suited to human visual perception than those processed by HE and other HE-based methods.
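The global HE baseline that VCEA modifies can be written in a few lines; this is the standard CDF gray-level mapping, not VCEA itself.

```python
import numpy as np

def equalize(img):
    """Global histogram equalization for an 8-bit grayscale image."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = np.cumsum(hist) / img.size            # cumulative distribution in [0, 1]
    lut = np.floor(255 * cdf).astype(np.uint8)  # monotone gray-level lookup table
    return lut[img]

img = np.array([[0, 0], [255, 255]], dtype=np.uint8)
out = equalize(img)   # dark pixels map to 127, bright pixels stay at 255
```

The over-enhancement VCEA targets comes from large jumps in `lut` wherever a histogram bin is heavily populated; VCEA's contribution is to redistribute the spacing between adjacent output gray values.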
Implementing a 3D histogram version of the Energy-Test in ROOT
Energy Technology Data Exchange (ETDEWEB)
Cohen, E.O., E-mail: cohen.erez7@gmail.com [School of Physics and Astronomy, Tel Aviv University, Tel Aviv 69978 (Israel); Reid, I.D., E-mail: ivan.reid@brunel.ac.uk [College of Engineering, Design and Physical Sciences, Brunel University London, Uxbridge UB8 3PH (United Kingdom); Piasetzky, E., E-mail: eip@tauphy.tau.ac.il [School of Physics and Astronomy, Tel Aviv University, Tel Aviv 69978 (Israel)
2016-08-21
Comparing simulation and data histograms is of interest in nuclear and particle physics experiments; however, the leading three-dimensional histogram comparison tool available in ROOT, the 3D Kolmogorov–Smirnov test, exhibits shortcomings. In the following, we present and discuss the implementation of an alternative comparison test for three-dimensional histograms, based on the Energy-Test by Aslan and Zech. The software package can be found at http://www-nuclear.tau.ac.il/ecohen/.
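The Energy-Test is built on a pairwise-distance "energy" statistic between two samples. A hedged sketch of the closely related Székely-style energy statistic (Aslan and Zech use a logarithmic distance kernel and bin weights from the 3D histograms; the structure is the same):

```python
import numpy as np

def energy_statistic(x, y):
    """Two-sample energy statistic with the |.| kernel; zero iff the
    empirical distributions coincide, positive otherwise."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    a = np.abs(x[:, None] - y[None, :]).mean()  # cross-sample mean distance
    b = np.abs(x[:, None] - x[None, :]).mean()  # within-sample x
    c = np.abs(y[:, None] - y[None, :]).mean()  # within-sample y
    return 2 * a - b - c

same = energy_statistic([1, 2, 3], [1, 2, 3])      # 0 for identical samples
diff = energy_statistic([1, 2, 3], [10, 20, 30])   # positive when they differ
```

In the histogram setting, each populated 3D bin contributes a point weighted by its content, and the statistic's null distribution is obtained by permutation.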
Hand Vein Images Enhancement Based on Local Gray-level Information Histogram
Jun Wang; Guoqing Wang; Ming Li; Wenkai Du; Wenhui Yu
2015-01-01
Based on histogram equalization theory, this paper presents a novel concept of histogram to realize contrast enhancement of hand vein images, avoiding the loss of topological vein structure or the introduction of fake vein information. Firstly, we propose the concept of a gray-level information histogram, the fundamental characteristic of which is that the amplitudes of the components can objectively reflect the contribution of the gray levels and information to the representation of image in...
A CMOS VLSI IC for real-time opto-electronic two-dimensional histogram generation
1993-01-01
Approved for public release; distribution is unlimited. Histogram generation, a standard image processing operation, produces a record of the intensity distribution in an image. Histogram generation has straightforward implementations on digital computers using high-level languages. A prototype of an optical-electronic histogram generator has been designed and tested for 1-D objects using wire-wrapped MSI TTL components. The system has been shown to be fairly modular in design. The aspects of the ex...
Maximum-likelihood fits to histograms for improved parameter estimation
Fowler, Joseph W
2013-01-01
Straightforward methods for adapting the familiar chi^2 statistic to histograms of discrete events and other Poisson distributed data generally yield biased estimates of the parameters of a model. The bias can be important even when the total number of events is large. For the case of estimating a microcalorimeter's energy resolution at 6 keV from the observed shape of the Mn K-alpha fluorescence spectrum, a poor choice of chi^2 can lead to biases of at least 10% in the estimated resolution when up to thousands of photons are observed. The best remedy is a Poisson maximum-likelihood fit, through a simple modification of the standard Levenberg-Marquardt algorithm for chi^2 minimization. Where the modification is not possible, another approach allows iterative approximation of the maximum-likelihood fit.
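The bias the abstract warns about is easiest to see with the simplest possible model, a constant expected rate per bin (a model chosen here for illustration, not the paper's microcalorimeter application). Minimizing Neyman's chi-squared, sum of (n_i - a)^2 / n_i, yields the harmonic mean of the counts, which is biased low for Poisson data, while the Poisson maximum-likelihood estimate is the arithmetic mean.

```python
import numpy as np

counts = np.array([1.0, 4.0])   # observed events per histogram bin

# Neyman chi^2: sum (n_i - a)^2 / n_i. Setting the derivative to zero gives
# a = N / sum(1/n_i), the harmonic mean -- systematically below the true rate.
neyman_fit = len(counts) / np.sum(1.0 / counts)

# Poisson maximum likelihood for the same constant-rate model: the
# arithmetic mean, which is unbiased.
ml_fit = counts.mean()
```

For these two bins the chi-squared fit returns 1.6 while the maximum-likelihood fit returns 2.5; the same mechanism, downweighting of upward fluctuations, produces the resolution biases quoted in the abstract.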
Fast Graph Partitioning Active Contours for Image Segmentation Using Histograms
Directory of Open Access Journals (Sweden)
Nath SumitK
2009-01-01
Full Text Available Abstract We present a method to improve the accuracy and speed, as well as significantly reduce the memory requirements, of the recently proposed Graph Partitioning Active Contours (GPAC) algorithm for image segmentation in the work of Sumengen and Manjunath (2006). Instead of computing an approximate but still expensive dissimilarity matrix of quadratic size for a 2D image partitioned into regular image tiles, we use fixed-length histograms and an intensity-based symmetric-centrosymmetric extensor matrix to jointly compute the terms associated with the complete dissimilarity matrix. This computationally efficient reformulation of GPAC with a very small memory footprint offers two distinct advantages over the original implementation. It speeds up convergence of the evolving active contour and seamlessly extends the performance of GPAC to multidimensional images.
TSimpleAnalysis: histogramming many trees in parallel
Giommi, Luca
2016-01-01
I worked in the ROOT team of the EP-SFT group. My project focused on writing a ROOT class that creates histograms from a TChain. The name of the class is TSimpleAnalysis and it is already integrated in ROOT. My work was to write the source and header files of the class, as well as a Python script that allows the user to use the class through the command line. This represents a great improvement with respect to the usual user code, which requires many lines of code to do the same thing. (Link for the class: https://root.cern.ch/doc/master/classTSimpleAnalysis.html)
HISTOGRAM TECHNIQUE WITH PIXEL INDICATOR FOR HIGH FIDELITY STEGANOGRAPHY
Directory of Open Access Journals (Sweden)
V.Meiamai
2013-06-01
Full Text Available In the current world of increasing technology trends and the "internet age", the security of our personal information has become more important than ever; there are media reports of identity theft and fraud, and the number of innocent victims is increasing exponentially. Steganography plays an important role in preventing such information loss by implementing the principle of imperceptible secret sharing. Security can thus be established by embedding data in such a way that the quality of the image is not affected. The existing methodology is based on a pixel indicator, and the number of data bits to be embedded is determined by the pixel value differencing technique. A limitation of this methodology is that the pixel indicator channel is selected manually. The proposed methodology selects the pixel indicator channel using a histogram technique, and the secret message is embedded in the plane with the highest color intensity.
Robust Reversible Watermarking Using Integer Wavelet Transform and Histogram Shifting
Directory of Open Access Journals (Sweden)
Deepali T. Biradar
2014-02-01
Full Text Available Along with the rapid development of multimedia technology, digital images have become an important vector of network information communication. Meanwhile, however, the security and reliability of this information are increasingly hard to guarantee. Images often suffer a certain degree of destruction during transmission, which hinders the correct extraction of hidden information as well as image authentication. As a good digital media copyright protection method, information hiding technology has become a research hotspot. In medical, legal, and other sensitive fields, however, even slight modifications to the carrier image may cause irreparable damage. Consequently, robust reversible watermarking (RRW) technology is all the more important. Conventional RRW methods, however, have drawbacks such as unsatisfactory reversibility, limited robustness, and poor invisibility of watermarked images. It is therefore necessary to address these problems, so we propose a novel method using the integer wavelet transform (IWT), histogram shifting, and clustering for watermark embedding and extraction.
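The histogram-shifting step at the core of many reversible watermarking schemes can be sketched with the classic peak/zero-bin method. This fragile minimal version (it assumes an empty histogram bin exists above the peak, and it is not robust to attacks) is a stand-in illustration, not the paper's IWT-based scheme.

```python
import numpy as np

def hs_embed(img, bits):
    """Reversible embedding by histogram shifting (peak/zero-bin scheme)."""
    img = img.copy()
    hist = np.bincount(img.ravel(), minlength=256)
    P = int(hist.argmax())                        # peak bin: embedding capacity
    Z = P + 1 + int(np.argmin(hist[P + 1:]))      # assumed empty bin above it
    assert hist[Z] == 0 and hist[P] >= len(bits)
    img[(img > P) & (img < Z)] += 1               # shift to free bin P+1
    flat = img.ravel()
    idx = np.flatnonzero(flat == P)[:len(bits)]   # embed at peak pixels: bit 1 -> P+1
    flat[idx] += np.asarray(bits, dtype=img.dtype)
    return img, P, Z

def hs_extract(img, P, Z, n_bits):
    """Recover the bits and restore the original image exactly."""
    img = img.copy()
    flat = img.ravel()
    carriers = np.flatnonzero((flat == P) | (flat == P + 1))[:n_bits]
    bits = (flat[carriers] == P + 1).astype(int).tolist()
    flat[flat == P + 1] = P                       # undo embedding first...
    img[(img > P + 1) & (img <= Z)] -= 1          # ...then undo the shift
    return bits, img

orig = np.array([[2, 2, 2], [5, 3, 2]], dtype=np.uint8)
marked, P, Z = hs_embed(orig, [1, 0, 1])
bits, restored = hs_extract(marked, P, Z, 3)
```

The scheme is "reversible" because the shift direction and bin bounds are fully determined by `(P, Z)`, so the cover image is recovered bit-exactly; robustness, the paper's actual concern, requires moving the shifting into a transform domain such as the IWT.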
Topological susceptibility from the overlap
Del Debbio, L; Debbio, Luigi Del; Pica, Claudio
2004-01-01
The chiral symmetry at finite lattice spacing of Ginsparg-Wilson fermionic actions constrains the renormalization of the lattice operators; in particular, the topological susceptibility does not require any renormalization, when using a fermionic estimator to define the topological charge. Therefore, the overlap formalism appears as an appealing candidate to study the continuum limit of the topological susceptibility while keeping the systematic errors under theoretical control. We present results for the SU(3) pure gauge theory using the index of the overlap Dirac operator to study the topology of the gauge configurations. The topological charge is obtained from the zero modes of the overlap and using a new algorithm for the spectral flow analysis. A detailed comparison with cooling techniques is presented. Particular care is taken in assessing the systematic errors. Relatively high statistics (500 to 1000 independent configurations) yield an extrapolated continuum limit with errors that are comparable with ...
Topological susceptibility from the overlap
DEFF Research Database (Denmark)
Del Debbio, Luigi; Pica, Claudio
2003-01-01
The chiral symmetry at finite lattice spacing of Ginsparg-Wilson fermionic actions constrains the renormalization of the lattice operators; in particular, the topological susceptibility does not require any renormalization, when using a fermionic estimator to define the topological charge....... Therefore, the overlap formalism appears as an appealing candidate to study the continuum limit of the topological susceptibility while keeping the systematic errors under theoretical control. We present results for the SU(3) pure gauge theory using the index of the overlap Dirac operator to study...... the topology of the gauge configurations. The topological charge is obtained from the zero modes of the overlap and using a new algorithm for the spectral flow analysis. A detailed comparison with cooling techniques is presented. Particular care is taken in assessing the systematic errors. Relatively high...
Modified Dynamic Histogram Equalization For Image Enhancement in Gray-scale Images
Directory of Open Access Journals (Sweden)
R.M. Meenal
2012-12-01
Full Text Available To improve the quality of digital images captured in low-light environments by consumer electronics devices such as cell phone cameras, Modified Dynamic Histogram Equalization (MDHE) is proposed. Initially, the proposed method divides the histogram into two sub-histograms based on the median. The second sub-histogram is then further divided into two sub-histograms based on its own median. The resulting sub-histograms are clipped independently according to the mean intensity occurrence of the input image, and a new dynamic range is allocated to each sub-histogram. The first sub-histogram is equalized independently using the Global Histogram Equalization (GHE) method. GHE provides better enhancement for lower gray levels but suffers from over-enhancement at higher gray levels. The proposed method utilizes the advantages of both GHE and QDHE: GHE for enhancing lower gray levels and QDHE for enhancing higher gray levels. Simulation results show that the proposed method yields better-quality images in terms of discrete entropy compared with other conventional methods.
Hand Vein Images Enhancement Based on Local Gray-level Information Histogram
Directory of Open Access Journals (Sweden)
Jun Wang
2015-06-01
Full Text Available Based on histogram equalization theory, this paper presents a novel concept of histogram to realize contrast enhancement of hand vein images, avoiding the loss of topological vein structure or the introduction of fake vein information. Firstly, we propose the concept of a gray-level information histogram, the fundamental characteristic of which is that the amplitudes of the components can objectively reflect the contribution of the gray levels and information to the representation of image information. Then, we propose a histogram equalization method composed of an automatic histogram separation module and an intensity transformation module. The separation module combines the proposed prompt multiple-threshold procedure with an optimum peak signal-to-noise ratio (PSNR) calculation to separate the histogram into small-scale sub-histograms, and the intensity transformation module enhances the vein images for each generated sub-histogram while preserving the vein topological structure and gray information. Experimental results show that the proposed method achieves an extremely good contrast enhancement effect.
Yang, Su
2005-02-01
A new descriptor for symbol recognition is proposed. 1) A histogram is constructed for every pixel to capture the distribution of the constraints among the other pixels. 2) All the histograms are statistically integrated to form a feature vector of fixed dimension. The robustness and invariance of the descriptor were experimentally confirmed.
Yang, Q. X.; Li, S. H.; Smith, M. B.
The magnetic field homogeneity of a radiofrequency coil is very important in both magnetic resonance imaging and spectroscopy. In this report, a method is proposed for quantitatively evaluating the RF magnetic field homogeneity from its histogram, which is obtained by either experimental measurement or theoretical calculation. The experimental histogram and theoretical histogram can be compared directly to verify the theoretical findings. The RF field homogeneities of the bird-cage coil, slotted-tube resonator, cosine wire coil, and a new radial plate coil design were evaluated using this method. The results showed that the experimental histograms and the corresponding theoretical histograms are consistent. This method provides an easy and sensitive way of evaluating the magnetic field homogeneity and facilitates the design and evaluation of new RF coil configurations.
An automated detection of glaucoma using histogram features
Institute of Scientific and Technical Information of China (English)
Karthikeyan; Sakthivel; Rengarajan; Narayanan
2015-01-01
Glaucoma is a chronic, progressive optic neurodegenerative disease leading to vision deterioration, in most cases producing increased pressure within the eye. This is due to a backup of fluid in the eye, which causes damage to the optic nerve. Hence, early detection, diagnosis, and treatment help to prevent vision loss. In this paper, a novel method is proposed for the early detection of glaucoma using a combination of magnitude and phase features from digital fundus images. Local binary patterns (LBP) and Daugman's algorithm are used for feature extraction. The histogram features are computed for both the magnitude and phase components. The Euclidean distance between the feature vectors is analyzed to predict glaucoma. The performance of the proposed method is compared with higher-order spectra (HOS) features in terms of sensitivity, specificity, classification accuracy, and execution time. The proposed system achieves 95.45% sensitivity, specificity, and classification accuracy. Also, the proposed method requires less execution time than the existing HOS-based method. Hence, the proposed system is more accurate, reliable, and robust than the existing approach for predicting glaucoma.
Extended census transform histogram for land-use scene classification
Yuan, Baohua; Li, Shijin
2017-04-01
With the popular use of high-resolution satellite images, more and more research efforts have been focused on land-use scene classification. In scene classification, effective visual features can significantly boost the final performance. As a typical texture descriptor, the census transform histogram (CENTRIST) has emerged as a very powerful tool due to its effective representation ability. However, the most prominent limitation of CENTRIST is its small spatial support area, which may not necessarily be adept at capturing the key texture characteristics. We propose an extended CENTRIST (eCENTRIST), which is made up of three subschemes in a greater neighborhood scale. The proposed eCENTRIST not only inherits the advantages of CENTRIST but also encodes the more useful information of local structures. Meanwhile, multichannel eCENTRIST, which can capture the interactions from multichannel images, is developed to obtain higher categorization accuracy rates. Experimental results demonstrate that the proposed method can achieve competitive performance when compared to state-of-the-art methods.
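To make the census transform concrete, here is a minimal sketch of CENTRIST's core step on a toy 3×3 image; the eCENTRIST extension to larger neighborhood scales and multichannel images is not sketched, and the tiny image and bit ordering are illustrative assumptions.

```python
# Census transform: each pixel becomes an 8-bit code from comparisons
# with its 8 neighbours; CENTRIST is the histogram of these codes.

def census_transform(img):
    h, w = len(img), len(img[0])
    codes = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            c, bits = img[y][x], 0
            for dy, dx in [(-1, -1), (-1, 0), (-1, 1), (0, -1),
                           (0, 1), (1, -1), (1, 0), (1, 1)]:
                bits = (bits << 1) | (1 if img[y + dy][x + dx] >= c else 0)
            codes.append(bits)
    return codes

def centrist(img):
    hist = [0] * 256
    for code in census_transform(img):
        hist[code] += 1
    return hist

img = [[10, 20, 30],
       [20, 25, 30],
       [30, 40, 50]]
print(census_transform(img))  # one code for the single interior pixel
```

On a real image the 256-bin code histogram, rather than the raw codes, is what serves as the texture descriptor.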
Histogram Planimetry Method for the Measurement of Irregular Wounds.
Yesiloglu, Nebil; Yildiz, Kemalettin; Cem Akpinar, Ali; Gorgulu, Tahsin; Sirinoglu, Hakan; Ozcan, Arzu
2016-09-01
Irregularly shaped wounds or flap borders usually require specified software or devices to measure their area and follow-up wound healing. In this study, an easy way of area measurement called histogram planimetry (HP) for wounds with irregular geometric shapes is defined and compared to conventional millimetric wound measurement. Ten irregularly bordered geometric shapes were measured by 4 different individuals working as surgical assistants using both HP and manual millimetric measurement tools. The amount of time for each wound shape calculation as well as the measurements of the wound areas were noted. All measurements were compared for each method and between each individual using the Wilcoxon signed-rank test. There was no statistically significant difference between 2 measurement methods by means of measured areas; however, measurement time was significantly lower when the HP method was used. There also was no significant difference between the individuals' measurements and calculation times. These results indicated that HP is useful as a conventional millimetric square wound measurement technique with significantly lower measurement times. Due to the development of photo-editor software technologies, measurements in the surgical field have become more accurate and rapid than conventional manual methods without consuming the time and energy needed for other studies. A future study including comparisons between the presented method and complex computerized measurement methods, in terms of duration and accuracy, may provide additional supportive data for the authors' method.
Using color histograms and SPA-LDA to classify bacteria.
de Almeida, Valber Elias; da Costa, Gean Bezerra; de Sousa Fernandes, David Douglas; Gonçalves Dias Diniz, Paulo Henrique; Brandão, Deysiane; de Medeiros, Ana Claudia Dantas; Véras, Germano
2014-09-01
In this work, a new approach is proposed to verify the differentiating characteristics of five bacteria (Escherichia coli, Enterococcus faecalis, Streptococcus salivarius, Streptococcus oralis, and Staphylococcus aureus) by using digital images obtained with a simple webcam and variable selection by the Successive Projections Algorithm associated with Linear Discriminant Analysis (SPA-LDA). Color histograms in the red-green-blue (RGB), hue-saturation-value (HSV), and grayscale channels and their combinations were used as input data and statistically evaluated using different multivariate classifiers (Soft Independent Modeling by Class Analogy (SIMCA), Principal Component Analysis-Linear Discriminant Analysis (PCA-LDA), Partial Least Squares Discriminant Analysis (PLS-DA) and Successive Projections Algorithm-Linear Discriminant Analysis (SPA-LDA)). The bacterial strains were cultivated in a nutritive blood agar base layer for 24 h following the Brazilian Pharmacopoeia, maintaining the status of cell growth and the nature of nutrient solutions under the same conditions. The best classification result was obtained by using RGB and SPA-LDA, which reached 94% and 100% classification accuracy in the training and test sets, respectively. This result is extremely positive from the viewpoint of routine clinical analyses, because it avoids bacterial identification based on phenotypic identification of the causative organism using Gram staining, culture, and biochemical proofs. Therefore, the proposed method presents inherent advantages, promoting a simpler, faster, and lower-cost alternative for bacterial identification.
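The input-data construction in the abstract above, per-channel color histograms concatenated into one feature vector, can be sketched as follows; the 4-bin setup and the toy "image" (a list of RGB triples) are illustrative assumptions, and the SPA-LDA classifier itself is not sketched.

```python
# Concatenated per-channel RGB histograms as a feature vector.
# Bin count (4) and pixel data are assumptions for illustration.

def channel_hist(values, bins=4):
    counts = [0] * bins
    for v in values:
        counts[min(v * bins // 256, bins - 1)] += 1
    return counts

def rgb_feature(pixels):
    r, g, b = zip(*pixels)
    return channel_hist(r) + channel_hist(g) + channel_hist(b)

pixels = [(250, 10, 10), (240, 20, 5), (10, 250, 10)]
print(rgb_feature(pixels))  # -> [1, 0, 0, 2, 2, 0, 0, 1, 3, 0, 0, 0]
```

Vectors like this, one per bacterial culture image, are what a variable-selection step such as SPA would prune before LDA classification.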
Clique graphs and overlapping communities
Evans, T. S.
2010-12-01
It is shown how to construct a clique graph in which properties of cliques of a fixed order in a given graph are represented by vertices in a weighted graph. Various definitions and motivations for these weights are given. The detection of communities or clusters is used to illustrate how a clique graph may be exploited. In particular a benchmark network is shown where clique graphs find the overlapping communities accurately while vertex partition methods fail.
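The construction described above can be sketched for the smallest interesting case, 3-cliques (triangles); the weighting rule used here (shared vertices divided by k-1) is one simple choice among the several definitions the abstract alludes to, and the toy graph is an illustrative assumption.

```python
# Clique graph of order k = 3: vertices are triangles of the input graph,
# edges connect triangles sharing vertices, weighted by overlap / (k - 1).

from itertools import combinations

def triangles(edges):
    adj = {}
    for a, b in edges:
        adj.setdefault(a, set()).add(b)
        adj.setdefault(b, set()).add(a)
    nodes = sorted(adj)
    return [c for c in combinations(nodes, 3)
            if all(v in adj[u] for u, v in combinations(c, 2))]

def clique_graph(edges, k=3):
    cliques = triangles(edges)
    weighted = {}
    for c1, c2 in combinations(cliques, 2):
        shared = len(set(c1) & set(c2))
        if shared:
            weighted[(c1, c2)] = shared / (k - 1)
    return cliques, weighted

# two triangles sharing the edge (2, 3)
edges = [(1, 2), (1, 3), (2, 3), (2, 4), (3, 4)]
print(clique_graph(edges))
```

Community detection run on the weighted clique graph can then assign the same original vertex (here 2 and 3) to several communities, which is how overlap arises.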
Energy Technology Data Exchange (ETDEWEB)
Yu Chunshui [Department of Radiology, Xuanwu Hospital of Capital Medical University, Beijing (China); Education Ministry Key Laboratory for Neurodegenerative Disease, Beijing (China); Lin Fuchun [State Key Laboratory of Magnetic Resonance and Atomic and Molecular Physics, Wuhan Institute of Physics and Mathematics, Chinese Academy of Sciences, Wuhan (China); Liu Yaou; Duan Yunyun [Department of Radiology, Xuanwu Hospital of Capital Medical University, Beijing (China); Lei Hao [State Key Laboratory of Magnetic Resonance and Atomic and Molecular Physics, Wuhan Institute of Physics and Mathematics, Chinese Academy of Sciences, Wuhan (China); Li Kuncheng [Department of Radiology, Xuanwu Hospital of Capital Medical University, Beijing (China); Education Ministry Key Laboratory for Neurodegenerative Disease, Beijing (China)], E-mail: kunchengli1955@yahoo.com.cn
2008-11-15
Objective: The purposes of our study were to employ diffusion tensor imaging (DTI)-based histogram analysis to determine the presence of occult damage in clinically isolated syndrome (CIS), to compare its severity with relapsing-remitting multiple sclerosis (RRMS), and to determine correlations between DTI histogram measures and clinical and MRI indices in these two diseases. Materials and methods: DTI scans were performed in 19 CIS and 19 RRMS patients and 19 matched healthy volunteers. Histogram analyses of mean diffusivity and fractional anisotropy were performed in normal-appearing brain tissue (NABT), normal-appearing white matter (NAWM) and gray matter (NAGM). Correlations were analyzed between these measures and expanded disability status scale (EDSS) scores, T{sub 2}WI lesion volumes (LV) and normalized brain tissue volumes (NBTV) in CIS and RRMS patients. Results: Significant differences were found among CIS, RRMS and control groups in the NBTV and most of the DTI histogram measures of the NABT, NAWM and NAGM. In CIS patients, some DTI histogram measures showed significant correlations with LV and NBTV, but none of them with EDSS. In RRMS patients, however, some DTI histogram measures were significantly correlated with LV, NBTV and EDSS. Conclusion: Occult damage occurs in both NAGM and NAWM in CIS, but the severity is milder than that in RRMS. In CIS and RRMS, the occult damage might be related to both T2 lesion load and brain tissue atrophy. Some DTI histogram measures might be useful for assessing the disease progression in RRMS patients.
Quadrant Dynamic with Automatic Plateau Limit Histogram Equalization for Image Enhancement
Directory of Open Access Journals (Sweden)
P. Jagatheeswari
2014-01-01
The fundamental preprocessing stage in image processing is the image contrast enhancement technique, and histogram equalization is an effective contrast enhancement technique. In this paper, a histogram equalization based technique called quadrant dynamic with automatic plateau limit histogram equalization (QDAPLHE) is introduced. In this method, a hybrid of dynamic and clipped histogram equalization methods is used to increase brightness preservation and to reduce over-enhancement. Initially, the proposed QDAPLHE algorithm passes the input image through a median filter to remove the noise present in the image. The histogram of the filtered image is then divided into four sub-histograms, with the second separation point fixed at the mean brightness. The clipping process is implemented by automatically calculating the plateau limit as the clipping level. The clipped portion of the histogram is modified to reduce the loss of image intensity values. Finally, the clipped portion is redistributed uniformly over the entire dynamic range and conventional histogram equalization is executed in each sub-histogram independently. Based on qualitative and quantitative analysis, the QDAPLHE method outperforms several existing methods in the literature.
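The clip-and-redistribute step described above can be sketched in a few lines; the plateau choice here (the mean bin count) and the 8-level toy histogram are illustrative assumptions, and the quadrant splitting of the full method is not reproduced.

```python
# Clipped histogram equalization: counts above a plateau limit are clipped,
# the excess is redistributed uniformly, then the usual CDF mapping is built.
# Plateau = mean bin count is an assumed simplification.

def clipped_equalization_map(hist, levels=256):
    plateau = sum(hist) / len(hist)
    clipped = [min(h, plateau) for h in hist]
    excess = sum(hist) - sum(clipped)
    clipped = [h + excess / len(hist) for h in clipped]
    total = sum(clipped)
    acc, mapping = 0.0, []
    for h in clipped:
        acc += h
        mapping.append(round((levels - 1) * acc / total))
    return mapping

hist = [100, 0, 0, 0, 0, 0, 0, 4]   # highly peaked 8-bin histogram
print(clipped_equalization_map(hist, levels=8))
```

Without clipping, the dominant first bin would grab almost the entire output range; clipping moderates that, which is the over-enhancement reduction the abstract refers to.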
Parameterization of the Age-Dependent Whole Brain Apparent Diffusion Coefficient Histogram
Directory of Open Access Journals (Sweden)
Uwe Klose
2015-01-01
Purpose. The distribution of apparent diffusion coefficient (ADC) values in the brain can be used to characterize age effects and pathological changes of the brain tissue. The aim of this study was the parameterization of the whole brain ADC histogram by an advanced model with influence of age considered. Methods. Whole brain ADC histograms were calculated for all data and for seven age groups between 10 and 80 years. Modeling of the histograms was performed for two parts of the histogram separately: the brain tissue part was modeled by two Gaussian curves, while the remaining part was fitted by the sum of a Gaussian curve, a biexponential decay, and a straight line. Results. A consistent fitting of the histograms of all age groups was possible with the proposed model. Conclusions. This study confirms the strong dependence of the whole brain ADC histograms on the age of the examined subjects. The proposed model can be used to characterize changes of the whole brain ADC histogram in certain diseases under consideration of age effects.
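The brain-tissue part of the model above, a sum of two Gaussian curves, can be sketched directly; the amplitudes, means, and widths below are illustrative assumptions, not fitted values from the study.

```python
# Two-Gaussian model of the tissue part of an ADC histogram.
# Parameter values (and the 1e-3 mm^2/s axis) are assumptions.

from math import exp

def gauss(x, amp, mu, sigma):
    return amp * exp(-((x - mu) ** 2) / (2 * sigma ** 2))

def tissue_model(x, p1, p2):
    return gauss(x, *p1) + gauss(x, *p2)

# two tissue-like peaks on an assumed ADC axis
print(tissue_model(0.8, (100, 0.8, 0.1), (60, 1.0, 0.15)))
```

In the study this model (plus the Gaussian, biexponential, and linear terms for the non-tissue part) is fitted to each age group's histogram, and the fitted parameters summarize the age dependence.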
Xu, Kuan-Man
2006-01-01
A new method is proposed to compare statistical differences between summary histograms, which are the histograms summed over a large ensemble of individual histograms. It consists of choosing a distance statistic for measuring the difference between summary histograms and using a bootstrap procedure to calculate the statistical significance level. Bootstrapping is an approach to statistical inference that makes few assumptions about the underlying probability distribution that describes the data. Three distance statistics are compared in this study. They are the Euclidean distance, the Jeffries-Matusita distance and the Kuiper distance. The data used in testing the bootstrap method are satellite measurements of cloud systems called cloud objects. Each cloud object is defined as a contiguous region/patch composed of individual footprints or fields of view. A histogram of measured values over footprints is generated for each parameter of each cloud object and then summary histograms are accumulated over all individual histograms in a given cloud-object size category. The results of statistical hypothesis tests using all three distances as test statistics are generally similar, indicating the validity of the proposed method. The Euclidean distance is determined to be most suitable after comparing the statistical tests of several parameters with distinct probability distributions among three cloud-object size categories. Impacts on the statistical significance levels resulting from differences in the total lengths of satellite footprint data between two size categories are also discussed.
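The procedure above, a distance statistic between summary histograms plus a bootstrap significance level, can be sketched on toy data; the tiny "cloud object" histograms, the resampling scheme details, and the significance threshold are illustrative assumptions.

```python
# Bootstrap test on the Euclidean distance between summary histograms:
# resample the pooled individual histograms and count how often the
# resampled distance reaches the observed one.

import random

def summary(hists):
    return [sum(col) for col in zip(*hists)]

def euclidean(h1, h2):
    return sum((a - b) ** 2 for a, b in zip(h1, h2)) ** 0.5

def bootstrap_p(group_a, group_b, n_boot=1000, seed=0):
    rng = random.Random(seed)
    observed = euclidean(summary(group_a), summary(group_b))
    pooled = group_a + group_b
    hits = 0
    for _ in range(n_boot):
        sample = [rng.choice(pooled) for _ in pooled]
        a, b = sample[:len(group_a)], sample[len(group_a):]
        if euclidean(summary(a), summary(b)) >= observed:
            hits += 1
    return hits / n_boot

a = [[5, 1, 0], [6, 2, 0], [5, 1, 1]]   # assumed individual histograms
b = [[0, 1, 5], [1, 2, 6], [0, 2, 5]]
print(bootstrap_p(a, b))
```

Swapping `euclidean` for a Jeffries-Matusita or Kuiper distance changes only the statistic, which is how the paper compares the three candidates.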
HEp-2 Cell Classification Using Shape Index Histograms With Donut-Shaped Spatial Pooling
DEFF Research Database (Denmark)
Larsen, Anders Boesen Lindbo; Vestergaard, Jacob Schack; Larsen, Rasmus
2014-01-01
We present a new method for automatic classification of indirect immunoflourescence images of HEp-2 cells into different staining pattern classes. Our method is based on a new texture measure called shape index histograms that captures second-order image structure at multiple scales. Moreover, we...... datasets. Our results show that shape index histograms are superior to other popular texture descriptors for HEp-2 cell classification. Moreover, when comparing to other automated systems for HEp-2 cell classification we show that shape index histograms are very competitive; especially considering...
CUDA implementation of histogram stretching function for improving X-ray image.
Lee, Yong H; Kim, Kwan W; Kim, Soon S
2013-01-01
This paper presents a method to improve the contrast of digital X-ray images using a CUDA program on a GPU. The histogram is commonly used to obtain the statistical distribution of contrast in image processing. To increase the visibility of the image in real time, we use the histogram stretching function. Implementing this function on a GPU is not straightforward, because the CUDA program has to handle the transfer of the source data and the processed results between GPU memory and the host system. We show that the histogram stretching function can nevertheless be executed quickly on the GPU with the CUDA program.
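The histogram stretching function itself is simple; a plain-Python sketch follows (the CUDA kernel and host/device transfers from the paper are not shown, and the 8-bit output range is an assumption).

```python
# Linear histogram stretching: remap pixel values so the observed
# min/max span the full output range.

def stretch(pixels, out_max=255):
    lo, hi = min(pixels), max(pixels)
    if hi == lo:
        return list(pixels)   # flat image: nothing to stretch
    return [round((p - lo) * out_max / (hi - lo)) for p in pixels]

print(stretch([100, 110, 120, 130]))  # -> [0, 85, 170, 255]
```

On a GPU the min/max reduction and the per-pixel remap each parallelize naturally; the cost the abstract highlights is moving the image to device memory and back.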
Histogram-driven cupping correction (HDCC) in CT
Kyriakou, Y.; Meyer, M.; Lapp, R.; Kalender, W. A.
2010-04-01
Typical cupping correction methods are pre-processing methods which require either pre-calibration measurements or simulations of standard objects to approximate and correct for beam hardening and scatter. Some of them require knowledge of spectra, detector characteristics, etc. The aim of this work was to develop a practical histogram-driven cupping correction (HDCC) method to post-process the reconstructed images. We use a polynomial representation of the raw data generated by forward projection of the reconstructed images; forward and backprojection are performed on graphics processing units (GPU). The coefficients of the polynomial are optimized using a simplex minimization of the joint entropy of the CT image and its gradient. The algorithm was evaluated using simulations and measurements of homogeneous and inhomogeneous phantoms. For the measurements, a C-arm flat-detector CT (FD-CT) system with a 30×40 cm² detector, a kilovoltage on-board imager (radiation therapy simulator) and a micro-CT system were used. The algorithm reduced cupping artifacts both in simulations and measurements using a fourth-order polynomial and was in good agreement with the reference. The minimization algorithm required fewer than 70 iterations to adjust the coefficients, only performing a linear combination of basis images and thus executing without time-consuming operations. HDCC reduced cupping artifacts without the necessity of pre-calibration or other scan information, enabling a retrospective improvement of CT image homogeneity. The method can also be combined with other cupping correction algorithms or used in a calibration manner.
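The cost function minimized above, the joint entropy of the image and its gradient, can be sketched from a joint histogram; the quantization to a few bins and the toy image/gradient values are illustrative assumptions (the simplex optimization loop is not shown).

```python
# Joint entropy of two discrete signals via a 2-D (joint) histogram.
# Bin count and toy data are assumptions for illustration.

from math import log2

def joint_entropy(xs, ys, bins=4, lo=0, hi=256):
    counts = {}
    width = (hi - lo) / bins
    for x, y in zip(xs, ys):
        key = (min(int((x - lo) / width), bins - 1),
               min(int((y - lo) / width), bins - 1))
        counts[key] = counts.get(key, 0) + 1
    n = len(xs)
    return -sum(c / n * log2(c / n) for c in counts.values())

image = [10, 10, 200, 200, 10, 200]
gradient = [0, 0, 190, 0, 190, 190]   # assumed toy gradient magnitudes
print(joint_entropy(image, gradient))
```

Cupping spreads intensity values of a homogeneous object across bins, raising this entropy, so driving it down with the polynomial coefficients flattens the cupped profile.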
Zhang, Weiming; Hu, Xiaocheng; Li, Xiaolong; Yu, Nenghai
2013-07-01
State-of-the-art schemes for reversible data hiding (RDH) usually consist of two steps: first construct a host sequence with a sharp histogram via prediction errors, and then embed messages by modifying the histogram with methods, such as difference expansion and histogram shift. In this paper, we focus on the second stage, and propose a histogram modification method for RDH, which embeds the message by recursively utilizing the decompression and compression processes of an entropy coder. We prove that, for independent identically distributed (i.i.d.) gray-scale host signals, the proposed method asymptotically approaches the rate-distortion bound of RDH as long as perfect compression can be realized, i.e., the entropy coder can approach entropy. Therefore, this method establishes the equivalency between reversible data hiding and lossless data compression. Experiments show that this coding method can be used to improve the performance of previous RDH schemes and the improvements are more significant for larger images.
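The paper's recursive entropy-coding construction is too involved for a short sketch, so the example below illustrates the classical histogram-shift baseline it improves upon: bins between the peak and an empty (zero) bin are shifted by one, and the peak bin carries the message bits. The assumption peak < zero and the toy pixel values are illustrative.

```python
# Classical histogram-shift RDH (baseline, not the paper's recursive coder).

def embed(pixels, bits, peak, zero):
    out, it = [], iter(bits)
    for p in pixels:
        if peak < p < zero:
            out.append(p + 1)              # shift to free up bin peak+1
        elif p == peak:
            out.append(p + next(it, 0))    # bit 1 -> peak+1, bit 0 -> peak
        else:
            out.append(p)
    return out

def extract(stego, peak, zero):
    bits, restored = [], []
    for p in stego:
        if p == peak + 1:
            bits.append(1); restored.append(peak)
        elif p == peak:
            bits.append(0); restored.append(peak)
        elif peak + 1 < p <= zero:
            restored.append(p - 1)         # undo the shift
        else:
            restored.append(p)
    return bits, restored

pixels = [5, 5, 5, 6, 7, 9]               # peak bin 5, bin 8 empty
stego = embed(pixels, [1, 0, 1], peak=5, zero=8)
print(stego)                               # -> [6, 5, 6, 7, 8, 9]
```

Extraction recovers both the bits and the exact original pixels, which is the reversibility requirement; the paper's contribution is a modification rule that approaches the rate-distortion bound instead of this fixed shift.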
Bridge recognition of medium-resolution SAR images using Pun histogram entropy
Institute of Scientific and Technical Information of China (English)
Wenyu Wu; Dong Yin; Rong Zhang; Yan Liu; Jia Pan
2009-01-01
A novel algorithm for bridge recognition in medium-resolution synthetic aperture radar (SAR) images using the histogram entropy presented by Pun is proposed. First, a Lee filter and histogram proportion are used to denoise the original image and to make the target evident. Then, water regions are obtained through histogram segmentation and the contours of the water regions are extracted. Next, potential bridge targets are obtained based on the spatial relationship between bridges and water regions using an improved contour search. Finally, bridges are recognized by extracting the Pun histogram entropy (PHE) feature of these potential bridge targets. Experimental results show the good qualities of the algorithm, such as fast speed, a high recognition rate, and a low false-target rate.
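The histogram-segmentation step above relies on entropy-based thresholding; the sketch below uses a Kapur-style maximum-entropy criterion as a stand-in for Pun's measure (a named simplification), with a toy bimodal histogram as an assumed input.

```python
# Entropy-based threshold selection: pick the split that maximizes the
# summed entropies of the two histogram partitions (Kapur-style stand-in
# for Pun's entropy measure).

from math import log

def entropy_threshold(hist):
    best_t, best_h = 0, -1.0
    for t in range(1, len(hist)):
        lo, hi = sum(hist[:t]), sum(hist[t:])
        if lo == 0 or hi == 0:
            continue
        h = 0.0
        for part, mass in ((hist[:t], lo), (hist[t:], hi)):
            h -= sum(c / mass * log(c / mass) for c in part if c)
        if h > best_h:
            best_t, best_h = t, h
    return best_t

# bimodal toy histogram: dark water region vs brighter land
hist = [9, 8, 1, 0, 0, 1, 7, 9]
print(entropy_threshold(hist))
```

In the SAR setting the lower mode corresponds to water, so the threshold returned here is what separates water regions from the rest before contour extraction.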
Color Histograms Adapted to Query-Target Images for Object Recognition across Illumination Changes
Directory of Open Access Journals (Sweden)
Jack-Gérard Postaire
2005-08-01
Most object recognition schemes fail in case of illumination changes between the color image acquisitions. One of the most widely used solutions to cope with this problem is to compare the images by means of the intersection between invariant color histograms. The main originality of our approach is to cope with the problem of illumination changes by analyzing each pair of query and target images constructed during the retrieval, instead of considering each image of the database independently of the others. In this paper, we propose a new approach which determines color histograms adapted to each pair of images. These adapted color histograms are obtained so that their intersection is higher when the two images are similar than when they are different. The computation of the adapted color histograms is based on an original model of illumination changes that uses rank measures of the pixels within the color component images.
Face verification system for Android mobile devices using histogram based features
Sato, Sho; Kobayashi, Kazuhiro; Chen, Qiu
2016-07-01
This paper proposes a face verification system that runs on Android mobile devices. In this system, a facial image is first captured by a built-in camera on the Android device, and then face detection is implemented using Haar-like features and the AdaBoost learning algorithm. The proposed system verifies the detected face using histogram based features, which are generated by a binary Vector Quantization (VQ) histogram using DCT coefficients in low frequency domains, as well as an Improved Local Binary Pattern (Improved LBP) histogram in the spatial domain. Verification results with the different types of histogram based features are first obtained separately and then combined by weighted averaging. We evaluate our proposed algorithm using the publicly available ORL database and facial images captured by an Android tablet.
Background Extraction Method Based on Block Histogram Analysis for Video Image
Institute of Scientific and Technical Information of China (English)
Li Hua; Peng Qiang
2005-01-01
A novel method of histogram analysis for background extraction in video images is proposed, which is derived from pixel-based histogram analysis. Not only the statistical property of pixels between temporal frames, but also the correlation of local pixels within a single frame is exploited in this method. When carrying out histogram analysis for background extraction, the proposed method operates not on a single pixel but on a 2×2 block, which requires much less computation and can simultaneously extract a sound background image from the video sequence. A comparative experiment between the proposed method and pixel-based histogram analysis shows that the proposed method is faster in background extraction and that the obtained background image is of better quality.
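The underlying idea, taking the most frequent value across frames at each position as background, can be sketched per pixel for brevity; the paper's 2×2-block variant aggregates four neighbours before voting, and the toy frames below are illustrative assumptions.

```python
# Histogram-based background extraction: for each position, the mode of
# the values across frames is taken as the background value.

from collections import Counter

def background(frames):
    h, w = len(frames[0]), len(frames[0][0])
    return [[Counter(f[y][x] for f in frames).most_common(1)[0][0]
             for x in range(w)] for y in range(h)]

frames = [
    [[10, 10], [20, 20]],
    [[10, 99], [20, 20]],   # 99: a transient foreground object
    [[10, 10], [20, 20]],
]
print(background(frames))  # -> [[10, 10], [20, 20]]
```

The transient value 99 loses the vote, so the recovered background is clean; grouping pixels into 2×2 blocks reduces the number of histograms to maintain by a factor of four.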
Adaptive gamma correction based on cumulative histogram for enhancing near-infrared images
Huang, Zhenghua; Zhang, Tianxu; Li, Qian; Fang, Hao
2016-11-01
Histogram-based methods have proven their ability in image enhancement. To improve low contrast while preserving details and high brightness in near-infrared images, a novel method called adaptive gamma correction based on cumulative histogram (AGCCH) is studied in this paper. This image enhancement method improves the contrast of local pixels through adaptive gamma correction (AGC), which is formed by incorporating a cumulative histogram or cumulative sub-histogram into the weighting distribution. Experimental results demonstrate, both qualitatively and quantitatively, that the proposed AGCCH method performs well in brightness preservation, contrast enhancement, and detail preservation, and that it is superior to previous state-of-the-art methods.
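A common form of cumulative-histogram-driven adaptive gamma correction sets the per-level gamma from the cumulative distribution; the sketch below uses gamma = 1 - cdf as an assumed weighting, which may differ from the exact AGCCH formulation in the paper.

```python
# Adaptive gamma correction lookup table driven by the cumulative
# histogram; gamma = 1 - cdf(level) is an assumed weighting choice.

def agc_lut(hist, levels=256):
    total = float(sum(hist))
    lut, cum = [], 0.0
    for level, count in enumerate(hist):
        cum += count / total
        gamma = max(1.0 - cum, 1e-6)       # keep gamma positive
        lut.append(round((levels - 1) * (level / (levels - 1)) ** gamma))
    return lut

lut = agc_lut([1] * 256)                   # flat histogram for illustration
print(lut[0], lut[128], lut[255])
```

Because gamma shrinks as the cumulative mass grows, dark levels (low cdf) are corrected gently while mid levels are brightened, which is the brightness-preserving behaviour the abstract claims.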
Hao, Zi-long; Liu, Yong; Chen, Ruo-wang
2016-11-01
To improve on the histogram equalization algorithm for image enhancement in digital image processing, an infrared image gray-level adaptive adjustment enhancement algorithm based on a gray-redundancy histogram-dealing technique is proposed. Based on a determination of the overall gray value of the image, the algorithm raises or lowers the image's overall gray value by adding appropriate gray points, and then uses the gray-level-redundancy histogram equalization method to compress the gray scale of the image. The algorithm can enhance image detail information. Through MATLAB simulation, this paper compares the algorithm with the histogram equalization method and with the algorithm based on the gray-redundancy histogram-dealing technique, and verifies the effectiveness of the proposed algorithm.
Overlap in Facebook Profiles Reflects Relationship Closeness.
Castañeda, Araceli M; Wendel, Markie L; Crockett, Erin E
2015-01-01
We assessed the association between self-reported Inclusion of Other in the Self (IOS) and Facebook overlap. Ninety-two participants completed online measures of IOS and investment model constructs. Researchers then recorded Facebook data from participants' profile pages. Results from multilevel models revealed that IOS predicted Facebook overlap. Furthermore, Facebook overlap was associated with commitment and investment in ways comparable to self-reported IOS. These findings suggest that overlap in Facebook profiles can be used to measure relationship closeness.
The equivalent Histograms in clinical practice; Los histogramas equivalentes en la practica clinica
Energy Technology Data Exchange (ETDEWEB)
Pizarro Trigo, F.; Teijeira Garcia, M.; Zaballos Carrera, S.
2013-07-01
The tolerances established for organs at risk [1] under standard fractionation schedules (2 Gy/session, 5 sessions per week) are frequently misapplied to dose-volume histograms of non-standard schedules. The purpose of this work is to establish when this misuse may be most significant and to carry out a transformation of non-standard-fractionation dose-volume histograms. A case that can be useful for making clinical decisions is presented. (Author)
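The conversion implied above is usually done with the linear-quadratic EQD2 formula, which maps a dose delivered in fractions of size d to the equieffective dose in 2 Gy fractions; the alpha/beta default of 3 Gy below is a common assumption for late-responding normal tissue, not a value from this report.

```python
# EQD2 per the linear-quadratic model:
#   EQD2 = D * (d + alpha/beta) / (2 + alpha/beta)
# applied per dose bin, this transforms a DVH to standard fractionation.

def eqd2(total_dose, dose_per_fraction, alpha_beta=3.0):
    return total_dose * (dose_per_fraction + alpha_beta) / (2.0 + alpha_beta)

print(eqd2(50.0, 2.5))  # -> 55.0 (hypofractionation raises the equivalent dose)
```

Applying this bin by bin to a non-standard-schedule DVH yields an "equivalent histogram" that can legitimately be compared against 2 Gy/fraction organ-at-risk tolerances.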
Generation of non-overlapping fiber architecture
DEFF Research Database (Denmark)
Chapelle, Lucie; Lévesque, M.; Brøndsted, Povl
2015-01-01
of overlapping sphero-cylinders. At the end of the first step, a system of overlapping fibers is obtained. In order to obtain a hard-core configuration where fibers cannot overlap other fibers, we use an iterative method called the force-biased algorithm. It applies virtual forces on each point of the fiber...
Region of Interest Detection Based on Histogram Segmentation for Satellite Image
Kiadtikornthaweeyot, Warinthorn; Tatnall, Adrian R. L.
2016-06-01
High-resolution satellite imaging is considered an outstanding candidate for extracting information about the Earth's surface. Feature extraction from an image is very difficult, because appropriate image segmentation techniques must be found and different methods combined to detect the region of interest (ROI) most effectively. This paper proposes techniques to classify objects in satellite images by using image processing methods on high-resolution satellite images. The system identifies ROIs corresponding to forest, urban, and agricultural areas. The proposed system classifies objects by thresholding histograms of the image, where the thresholds are chosen by relating the behaviour of the histogram to particular regions in the satellite image. The proposed model is based on histogram segmentation and morphology techniques, with five main steps supporting each other: histogram classification, histogram segmentation, morphological dilation, morphological filling of image areas and holes, and ROI management. The methods to detect the ROI of satellite images based on histogram classification have been studied, implemented, and tested. The algorithm is able to detect forest, urban, and agricultural areas separately. The image segmentation methods can detect the ROI and reduce the size of the original image by discarding the unnecessary parts.
Scintigraphic image contrast-enhancement techniques: Global and local area histogram equalization
Energy Technology Data Exchange (ETDEWEB)
Verdenet, J.; Cardot, J.C.; Baud, M.; Chervet, H.; Bidet, R.; Duvernoy, J.
1981-06-01
This article develops two contrast-modification techniques for the display of scintigraphic images. Based on histogram-modification techniques, histogram equalization, where each level of gray is used to the same extent, gives maximum entropy. The first technique applies histogram equalization to the whole image. To eliminate contrast attenuation in image areas that represent a statistically small but important portion of the gray scale histogram, local area histogram equalization has been applied to images with differences in intensity. Both techniques were tested using a phantom with known characteristics. The global equalization technique is more suitable for bone scintigraphies, and some well-chosen boundaries improved the difference between two comparable areas. For liver scintigraphies, where intensity is nearly equal in every pixel, local area equalization was chosen, which allowed detection of heterogeneous structures. The images resulting from histogram-equalization techniques improve the readability of the data, but are often far from the usual images and necessitate an apprenticeship for the physician.
DE-STRIPING FOR TDICCD REMOTE SENSING IMAGE BASED ON STATISTICAL FEATURES OF HISTOGRAM
Directory of Open Access Journals (Sweden)
H.-T. Gao
2016-06-01
To address the striping noise caused by the non-uniform response of a remote sensing TDI CCD, a novel de-striping method based on statistical features of the image histogram is put forward. By analysing the distribution of the histograms, the histogram centroid is selected as an eigenvalue representing the uniformity of ground objects. The histogram centroids of the whole image and of each pixel column are calculated first, and the differences between them are regarded as rough correction coefficients. Then, to avoid the sensitivity caused by a single parameter, and considering the strong continuity and correlation of ground objects between two adjacent pixel columns, the correlation coefficient of their histograms is introduced to reflect their similarity; a fine correction coefficient is obtained by searching around the rough correction coefficient. Additionally, in view of the influence of bright clouds on the histogram, automatic cloud detection based on multiple features, including grey level, texture, fractal dimension, and edges, is used to pre-process the image. Two level-0 panchromatic images from the SJ-9A satellite with obvious stripe noise were processed by the proposed method to evaluate its performance. The results show that the visual quality of the images is improved because the stripe noise is entirely removed. We quantitatively analyse the result by calculating the non-uniformity, which reaches about 1% and is better than that of the histogram matching method.
De-Striping for Tdiccd Remote Sensing Image Based on Statistical Features of Histogram
Gao, Hui-ting; Liu, Wei; He, Hong-yan; Zhang, Bing-xian; Jiang, Cheng
2016-06-01
To address the striping noise caused by the non-uniform response of a remote sensing TDI CCD, a novel de-striping method based on statistical features of the image histogram is put forward. By analysing the distribution of the histograms, the histogram centroid is selected as an eigenvalue representing the uniformity of ground objects. The histogram centroids of the whole image and of each pixel column are calculated first, and the differences between them are regarded as rough correction coefficients. Then, to avoid the sensitivity caused by a single parameter, and considering the strong continuity and correlation of ground objects between two adjacent pixel columns, the correlation coefficient of their histograms is introduced to reflect their similarity; a fine correction coefficient is obtained by searching around the rough correction coefficient. Additionally, in view of the influence of bright clouds on the histogram, automatic cloud detection based on multiple features, including grey level, texture, fractal dimension, and edges, is used to pre-process the image. Two level-0 panchromatic images from the SJ-9A satellite with obvious stripe noise were processed by the proposed method to evaluate its performance. The results show that the visual quality of the images is improved because the stripe noise is entirely removed. We quantitatively analyse the result by calculating the non-uniformity, which reaches about 1% and is better than that of the histogram matching method.
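The rough correction step described above can be sketched with the simplest centroid statistic, the per-column mean level, compared against the whole-image mean; the correlation-based refinement and cloud masking of the full method are not reproduced, and the toy image is an illustrative assumption.

```python
# Rough de-striping: additive per-column offsets from the difference
# between the global histogram centroid (mean) and each column's mean.

def destripe(img):
    cols = list(zip(*img))
    global_mean = sum(sum(c) for c in cols) / (len(cols) * len(cols[0]))
    offsets = [global_mean - sum(c) / len(c) for c in cols]
    return [[round(v + offsets[x]) for x, v in enumerate(row)] for row in img]

img = [[10, 30, 10],
       [12, 32, 12],
       [11, 31, 11]]   # middle column carries a +20 stripe
print(destripe(img))
```

After correction each row is flat, i.e. the vertical stripe is gone; the paper's fine coefficients refine exactly these offsets using inter-column histogram correlation.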
On Multigrid for Overlapping Grids
Energy Technology Data Exchange (ETDEWEB)
Henshaw, W
2004-01-13
The solution of elliptic partial differential equations on composite overlapping grids using multigrid is discussed. An approach is described that provides a fast and memory efficient scheme for the solution of boundary value problems in complex geometries. The key aspects of the new scheme are an automatic coarse grid generation algorithm, an adaptive smoothing technique for adjusting residuals on different component grids, and the use of local smoothing near interpolation boundaries. Other important features include optimizations for Cartesian component grids, the use of over-relaxed Red-Black smoothers and the generation of coarse grid operators through Galerkin averaging. Numerical results in two and three dimensions show that very good multigrid convergence rates can be obtained for both Dirichlet and Neumann/mixed boundary conditions. A comparison to Krylov based solvers shows that the multigrid solver can be much faster and require significantly less memory.
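The red-black smoothers mentioned above can be sketched for the 1-D model problem -u'' = f on a uniform grid; the overlapping-grid machinery, interpolation boundaries, and over-relaxation factor are omitted, so this is only the basic sweep under those assumptions.

```python
# One red-black Gauss-Seidel sweep for -u'' = f on a uniform grid:
# update odd ("red") interior points, then even ("black") points.

def rb_sweep(u, f, h):
    n = len(u)
    for start in (1, 2):
        for i in range(start, n - 1, 2):
            u[i] = 0.5 * (u[i - 1] + u[i + 1] + h * h * f[i])
    return u

# boundary values 0 and 1; repeated sweeps converge to the linear solution
u = rb_sweep([0.0, 0.0, 0.0, 0.0, 1.0], [0.0] * 5, 0.25)
print(u)
```

Because all red points depend only on black neighbours (and vice versa), each half-sweep parallelizes trivially, which is one reason red-black ordering suits the GPU/Cartesian optimizations mentioned in the abstract.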
Overlapping Structures in Sensory-Motor Mappings
Earland, Kevin; Lee, Mark; Shaw, Patricia; Law, James
2014-01-01
This paper examines a biologically-inspired representation technique designed for the support of sensory-motor learning in developmental robotics. An interesting feature of the many topographic neural sheets in the brain is that closely packed receptive fields must overlap in order to fully cover a spatial region. This raises interesting scientific questions with engineering implications: e.g. is overlap detrimental? does it have any benefits? This paper examines the effects and properties of overlap between elements arranged in arrays or maps. In particular we investigate how overlap affects the representation and transmission of spatial location information on and between topographic maps. Through a series of experiments we determine the conditions under which overlap offers advantages and identify useful ranges of overlap for building mappings in cognitive robotic systems. Our motivation is to understand the phenomena of overlap in order to provide guidance for application in sensory-motor learning robots. PMID:24392118
Evaluation of Prostatic Blood Flow with Color Histogram
Institute of Scientific and Technical Information of China (English)
陈亚青; 周永昌; 黄慕民; 唐天雪; 张惠箴; 陈洁晴
2001-01-01
Objective: To establish reference data for normal and abnormal prostatic color flow with the color histogram and to discuss its role in the diagnosis of prostatic diseases. Methods: Twenty-five patients with benign prostatic hypertrophy (BPH) and twenty-five with prostatic carcinoma (PCa), confirmed by needle biopsy, were measured by color histogram with two types of transrectal probe (end-fire and biplane); twenty-two normal cases served as the control group. The percentage of color-flow area within the prostate area (BCR) was measured with color histogram software. Results: There were significant differences in mean BCR among the three groups (end-fire P<0.05; biplane P<0.01), and there was no significant difference between the two types of probe (P>0.05). However, there was overlap among individuals in each group: the overlap range was 17%-26%, with 74%-83% showing no overlap. Conclusions: Measurement of prostatic blood flow with the color histogram can serve as an objective parameter for the diagnosis of prostatic diseases.
Directory of Open Access Journals (Sweden)
Ali Maleki
2015-12-01
Background: Presently, the graphical data of blood cells (histograms and cytograms/scattergrams) that are available in all modern automated hematology analyzers are an integral part of the automated complete blood count (CBC). Laboratory staff can use these data for quality control of results, to find incorrect results from the automated hematology analyzer, to identify samples that require additional analysis, and to assist in the identification of complex and troublesome cases. Methods: In this descriptive analytic study, erythrocyte graphs from a number of patients referred between March 2013 and February 2014 to our clinical laboratory (Zagros Hospital, Kermanshah, Iran) were examined, together with papers published in the relevant literature and available published manuals of automated blood cell counters. Articles related to the keywords erythrocyte graphs, erythrogram, erythrocyte histogram, and hematology analyzer graphs in the diagnosis of hematological disorders were searched in valid databases such as SpringerLink, Google Scholar, PubMed, and ScienceDirect, and relevant articles were selected for this study. Results: Histograms and other automated CBC parameters become abnormal in various pathologic conditions and can present important clues for the diagnosis and treatment of hematologic and non-hematologic disorders. In several instances, these histograms have characteristic appearances in a wide range of pathological conditions. In some hematologic disorders, such as iron deficiency or megaloblastic anemia, sequential histograms can clearly show the progress of treatment and management. Conclusion: These graphical data are often accompanied by other automated CBC parameters and microscopic examination of peripheral blood smears (PBS), and can help in monitoring and
Digital image modification detection using color information and its histograms.
Zhou, Haoyu; Shen, Yue; Zhu, Xinghui; Liu, Bo; Fu, Zigang; Fan, Na
2016-09-01
The rapid development of many open source and commercial image editing software packages makes the authenticity of digital images questionable. Copy-move forgery is one of the most widely used tampering techniques to create desirable objects or conceal undesirable objects in a scene. Existing techniques reported in the literature to detect such tampering aim to improve robustness against the use of JPEG compression, blurring, noise, or other types of post-processing operations. These post-processing operations are frequently used with the intention of concealing tampering and reducing tampering clues. A robust method based on color moments and five other image descriptors is proposed in this paper. The method divides the image into fixed-size overlapping blocks. A clustering operation divides the entire search space into smaller pieces with similar color distribution. Blocks from the tampered regions will reside within the same cluster, since both copied and moved regions have similar color distributions. Five image descriptors are used to extract block features, which makes the method more robust to post-processing operations. An ensemble of deep compositional pattern-producing neural networks is trained with these extracted features. Similarity among feature vectors in clusters indicates possible forged regions. Experimental results show that the proposed method can detect copy-move forgery even if an image was distorted by gamma correction, additive white Gaussian noise, JPEG compression, or blurring.
A global survey of aerosol-liquid water cloud overlap based on four years of CALIPSO-CALIOP data
Directory of Open Access Journals (Sweden)
A. Devasthale
2010-09-01
The presence of aerosols over highly reflective liquid water cloud tops poses a big challenge in simulating their radiative impacts. In particular, absorbing aerosols, such as smoke, may have significant impact in such situations and even change the sign of the net radiative forcing. Until now, it was not possible to obtain information on such overlap events realistically from the existing passive satellite sensors. However, the CALIOP instrument onboard NASA's CALIPSO satellite allows us to examine these events with unprecedented accuracy.
Using four years of collocated CALIPSO 5 km Aerosol and Cloud Layer Version 3 Products (June 2006–May 2010), we quantify, for the first time, the macrophysical characteristics of overlapping aerosol and water cloud layers globally. We investigate seasonal variability in these characteristics over six latitude bands to understand the hemispheric differences. We compute (a) the percentage of cases when such overlap is seen globally and seasonally when all aerosol types are included in the analysis (AAO case), (b) the joint histograms of aerosol layer base height and cloud layer top height, and (c) the joint histograms of aerosol and cloud geometrical thicknesses in such overlap cases. We also investigate the frequency of smoke aerosol-cloud overlap (SAO case).
The results show a distinct seasonality in overlap frequency in both AAO and SAO cases. Globally, the frequency is highest during the JJA months in the AAO case, while for the SAO case it is highest in the SON months. The seasonal mean overlap frequency can regionally exceed 20% in the AAO case and 10% in the SAO case. There is a tendency for the vertical separation between aerosol and cloud layers to increase from high to low latitude regions in both hemispheres. In about 5–10% of cases the vertical distance between aerosol and cloud layers is less than 100 m, while in about 45–60% of cases it is less than a kilometer in the annual means for the different latitudinal bands.
Improved Steganographic Method Preserving Pixel-Value Differencing Histogram with Modulus Function
Directory of Open Access Journals (Sweden)
Heung-Kyu Lee
2010-01-01
We herein advance a secure steganographic algorithm that uses a turnover policy and a novel adjusting process. Although the method of Wang et al., which uses Pixel-Value Differencing (PVD) and a modulus function, provides high capacity and good image quality, the embedding process causes a number of artifacts, such as abnormal increases and fluctuations in the PVD histogram, which may reveal the existence of the hidden message. In order to enhance the security of the algorithm, a turnover policy is used that prevents abnormal increases in the histogram values, and a novel adjusting process is devised to remove the fluctuations at the border of the subrange in the PVD histogram. The proposed method therefore eliminates all the weaknesses of the PVD steganographic methods thus far proposed and guarantees secure communication. In the experiments described herein, the proposed algorithm is compared with other PVD steganographic algorithms by using well-known steganalysis techniques, such as RS-analysis, steganalysis for LSB matching, and histogram-based attacks. The results support our contention that the proposed method enhances security by keeping the PVD histogram similar to the cover, while also providing high embedding capacity and good imperceptibility to the naked eye.
The L_infinity constrained global optimal histogram equalization technique for real time imaging
Ren, Qiongwei; Niu, Yi; Liu, Lin; Jiao, Yang; Shi, Guangming
2015-08-01
Although current imaging sensors can achieve 12-bit or higher precision, current display devices and the commonly used digital image formats are still only 8 bits. This mismatch causes significant waste of the sensor precision and loss of information when storing and displaying the images. For better usage of the precision budget, tone mapping operators have to be used to map the high-precision data into low-precision digital images adaptively. In this paper, the classic histogram equalization tone mapping operator is reexamined in the sense of optimization. We point out that the traditional histogram equalization technique and its variants are fundamentally improper because they suffer from local optimum problems. To overcome this drawback, we remodel the histogram equalization tone mapping task based on graph theory, which achieves globally optimal solutions. Another advantage of the graph-based modeling is that tone-continuity is also modeled as a vital constraint in our approach, which suppresses the annoying boundary artifacts of the traditional approaches. In addition, we propose a novel dynamic programming technique to solve the histogram equalization problem in real time. Experimental results show that the proposed tone-preserved globally optimal histogram equalization technique outperforms the traditional approaches by exhibiting more subtle details in the foreground while preserving the smoothness of the background.
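For reference, the classic histogram-equalization operator that this entry reexamines can be sketched as follows. This is the traditional (local-optimum) baseline, not the authors' graph-based global formulation; the pure-Python representation and the toy image are illustrative.

```python
# Classic histogram equalization for an 8-bit image, the baseline the
# paper criticizes. `image` is a list of rows of 0..255 integers.
def equalize_histogram(image, levels=256):
    # Build the gray-level histogram.
    hist = [0] * levels
    for row in image:
        for v in row:
            hist[v] += 1
    total = sum(hist)
    # Cumulative distribution function, then a lookup table that
    # stretches it back over the full [0, levels-1] range.
    cdf, running = [], 0
    for count in hist:
        running += count
        cdf.append(running)
    cdf_min = next(c for c in cdf if c > 0)
    lut = [round((c - cdf_min) / max(total - cdf_min, 1) * (levels - 1))
           for c in cdf]
    return [[lut[v] for v in row] for row in image]

flat = [[100, 101, 102, 103]] * 4          # low-contrast toy image
print(equalize_histogram(flat)[0])         # → [0, 85, 170, 255]
```

The global mapping through a single lookup table is exactly what makes the operator blind to tone-continuity, the drawback the paper's graph-based remodeling addresses.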
Solving Partial Differential Equations on Overlapping Grids
Energy Technology Data Exchange (ETDEWEB)
Henshaw, W D
2008-09-22
We discuss the solution of partial differential equations (PDEs) on overlapping grids. This is a powerful technique for efficiently solving problems in complex, possibly moving, geometry. An overlapping grid consists of a set of structured grids that overlap and cover the computational domain. By allowing the grids to overlap, grids for complex geometries can be more easily constructed. The overlapping grid approach can also be used to remove coordinate singularities by, for example, covering a sphere with two or more patches. We describe the application of the overlapping grid approach to a variety of different problems. These include the solution of incompressible fluid flows with moving and deforming geometry, the solution of high-speed compressible reactive flow with rigid bodies using adaptive mesh refinement (AMR), and the solution of the time-domain Maxwell's equations of electromagnetism.
Overlapped frequency-time division multiplexing
Institute of Scientific and Technical Information of China (English)
JIANG Hui; LI Dao-ben
2009-01-01
A technique named overlapped frequency-time division multiplexing (OVFTDM) is proposed in this article. The technique is derived from the Nyquist system and the frequency-time division multiplexing system. When the signals are compactly overlapped without orthogonality in the time domain, the technique is named overlapped time division multiplexing (OVTDM), whereas when signals are compactly overlapped without orthogonality in the frequency domain, the technique is called overlapped frequency division multiplexing (OVFDM). To further improve spectral efficiency, OVFTDM, in which signals are overlapped in both the frequency domain and the time domain, is explored. OVFTDM does not depend on orthogonality in either the time domain or the frequency domain, as the Nyquist system or OFDM system does, but on the convolutional constraint relationship among signals. Therefore, not only the spectral efficiency but also the reliability is improved. The simulations verify the validity of this theory.
Correlated Edge Overlaps in Multiplex Networks
Baxter, Gareth J; da Costa, Rui A; Dorogovtsev, Sergey N; Mendes, José F F
2016-01-01
We develop the theory of sparse multiplex networks with partially overlapping links based on their local tree-likeness. This theory enables us to find the giant mutually connected component in a two-layer multiplex network with arbitrary correlations between connections of different types. We find that correlations between the overlapping and non-overlapping links markedly change the phase diagram of the system, leading to multiple hybrid phase transitions. For assortative correlations we observe recurrent hybrid phase transitions.
Finding overlapping communities using seed set
Yang, Jin-Xuan; Zhang, Xiao-Dong
2017-02-01
The local optimization algorithm using a seed set to find overlapping communities has become an increasingly significant method, but choosing a good seed set remains a great challenge. In this paper, a new method is proposed for choosing candidate seed sets, yielding a new algorithm to find overlapping communities in complex networks. Tests on real-world and synthetic networks show that this method can successfully detect overlapping communities and outperforms other state-of-the-art overlapping community detection methods.
Overlap syndromes among autoimmune liver diseases
Institute of Scientific and Technical Information of China (English)
Christian Rust; Ulrich Beuers
2008-01-01
The three major immune disorders of the liver are autoimmune hepatitis (AIH), primary biliary cirrhosis (PBC) and primary sclerosing cholangitis (PSC). Variant forms of these diseases are generally called overlap syndromes, although there has been no standardised definition. Patients with overlap syndromes present with both hepatitic and cholestatic serum liver tests and have histological features of AIH and PBC or PSC. The AIH-PBC overlap syndrome is the most common form, affecting almost 10% of adults with AIH or PBC. Single cases of AIH and autoimmune cholangitis (AMA-negative PBC) overlap syndrome have also been reported. The AIH-PSC overlap syndrome is predominantly found in children, adolescents and young adults with AIH or PSC. Interestingly, transitions from one autoimmune disease to another have also been reported in a minority of patients, especially transitions from PBC to AIH-PBC overlap syndrome. Overlap syndromes show a progressive course towards liver cirrhosis and liver failure without treatment. Therapy for overlap syndromes is empiric, since controlled trials are not available in these rare disorders. Anticholestatic therapy with ursodeoxycholic acid is usually combined with immunosuppressive therapy with corticosteroids and/or azathioprine in both AIH-PBC and AIH-PSC overlap syndromes. In end-stage disease, liver transplantation is the treatment of choice.
Recovery of the histogram of hourly ozone distribution from weekly average concentrations
Energy Technology Data Exchange (ETDEWEB)
Olcese, Luis E. [Departamento de Fisico Quimica/INFIQC, Facultad de Ciencias Quimicas, Universidad Nacional de Cordoba, 5000 Cordoba (Argentina)]. E-mail: lolcese@fcq.unc.edu.ar; Toselli, Beatriz M. [Departamento de Fisico Quimica/INFIQC, Facultad de Ciencias Quimicas, Universidad Nacional de Cordoba, 5000 Cordoba (Argentina)
2006-05-15
A simple method is presented for estimating the hourly distribution of air pollutants, based on data collected by passive sensors on a weekly or bi-weekly basis, with no need for previous measurements at a site. For this method to be applied to locations where no hourly records are available, reference data from other sites are required to generate calibration histograms. The proposed procedure allows one to obtain the histogram of hourly ozone values during a given week with an error of about 30%, which is good considering the simplicity of the approach. This method can be a valuable tool for sites that lack previous hourly records of ambient pollutant concentrations, where it can be used to verify compliance with regulations or to estimate the AOT40 index with an acceptable degree of accuracy. - The histogram of hourly ozone distribution can be obtained based on passive sensor data.
A novel JPEG steganography method based on modulus function with histogram analysis
Directory of Open Access Journals (Sweden)
V. Banoci
2012-06-01
In this paper, we present a novel steganographic method for embedding secret data in still grayscale JPEG images. In order to provide the large capacity of the proposed method while maintaining good visual quality of the stego-image, the embedding process is performed on quantized transform coefficients of the Discrete Cosine Transform (DCT) by modifying coefficients according to a modulo function, which gives the steganography system a blind-extraction predisposition. The after-embedding histogram of the proposed Modulo Histogram Fitting (MHF) method is analyzed to secure the steganography system against steganalysis attacks. In addition, AES ciphering was implemented to increase security and improve the after-embedding histogram characteristics of the proposed steganography system, as experimental results show.
Fiorino, Claudio; Maggiulli, Eleonora; Broggi, Sara; Liberini, Simone; Cattaneo, Giovanni Mauro; Dell'oca, Italo; Faggiano, Elena; Di Muzio, Nadia; Calandrino, Riccardo; Rizzo, Giovanna
2011-06-07
The Jacobian of the deformation field of elastic registration between images taken during radiotherapy is a measure of inter-fraction local deformation. The histogram of the Jacobian values (Jac) within an organ was introduced (JVH, Jacobian-volume-histogram) and first applied in quantifying parotid shrinkage. MVCTs of 32 patients previously treated with helical tomotherapy for head-neck cancers were collected. Parotid deformation was evaluated through elastic registration between MVCTs taken at the first and last fractions. Jac was calculated for each voxel of all parotids, and integral JVHs were calculated for each parotid; the correlation between the JVH and the planning dose-volume histogram (DVH) was investigated. On average, 82% (±17%) of the voxels shrink. Jac and the JVH are promising tools for scoring/modelling toxicity and for evaluating organ/contour variations with potential applications in adaptive radiotherapy.
Energy degeneracies from Broad Histogram Method and Wang-Landau Sampling
Lima, Alexandre Pereira; Girardi, Daniel
2016-01-01
In this work, we present a comparative study of the accuracy provided by Wang-Landau sampling and the Broad Histogram method for estimating the density of states of the two-dimensional Ising ferromagnet. The microcanonical averages used to describe the thermodynamic behaviour and to apply the Broad Histogram method were obtained using single spin-flip Wang-Landau sampling, with attention to convergence issues and accuracy improvements. We compare the results provided by both techniques with the exact ones for thermodynamic properties and critical exponents. Our results reveal that, within Wang-Landau sampling, the Broad Histogram approach provides a better description of the density of states for all cases analysed.
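Single spin-flip Wang-Landau sampling, as used in this entry, can be sketched on a deliberately tiny 2×2 periodic Ising lattice, whose exact degeneracies are g(−8) = g(8) = 2 and g(0) = 12. The lattice size, the 0.8 flatness criterion, and the final modification factor are illustrative assumptions, not the authors' settings, and the Broad Histogram part is not shown.

```python
import math
import random

random.seed(2)
L = 2                                    # tiny lattice keeps the run fast
spins = [[1] * L for _ in range(L)]

def flip_cost(s, i, j):
    # Energy change if spin (i, j) is flipped (periodic boundaries, J = 1).
    nb = s[(i + 1) % L][j] + s[(i - 1) % L][j] + s[i][(j + 1) % L] + s[i][(j - 1) % L]
    return 2 * s[i][j] * nb

def total_energy(s):
    e = 0
    for i in range(L):
        for j in range(L):
            e -= s[i][j] * (s[(i + 1) % L][j] + s[i][(j + 1) % L])
    return e

ln_g, hist = {}, {}                      # running ln g(E) and visit histogram
f = 1.0                                  # ln of the modification factor
E = total_energy(spins)
while f > 1e-4:
    for _ in range(10000):
        i, j = random.randrange(L), random.randrange(L)
        dE = flip_cost(spins, i, j)
        # Accept with probability min(1, g(E)/g(E'))
        if random.random() < math.exp(ln_g.get(E, 0.0) - ln_g.get(E + dE, 0.0)):
            spins[i][j] = -spins[i][j]
            E += dE
        ln_g[E] = ln_g.get(E, 0.0) + f
        hist[E] = hist.get(E, 0) + 1
    if min(hist.values()) > 0.8 * sum(hist.values()) / len(hist):
        f /= 2                           # histogram flat enough: refine f
        hist = {}

print(round(ln_g[0] - ln_g[-8], 2))      # should approach ln 6 ≈ 1.79
```

The estimated ln g differences can then feed microcanonical averages, which is the bridge to the Broad Histogram analysis the entry describes.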
PERFORMANCE MEASUREMENTS OF FEATURE TRACKING AND HISTOGRAM BASED TRAFFIC CONGESTION ALGORITHMS
Directory of Open Access Journals (Sweden)
Ozgur Altun
2015-05-01
In this paper, feature tracking based and histogram based traffic congestion detection systems are developed. All developed systems are designed to run as real-time applications. In this work, the ORB (Oriented FAST and Rotated BRIEF) feature extraction method has been used to develop the feature tracking based traffic congestion solution. ORB is a rotation-invariant, fast, and noise-resistant method that combines the strengths of the FAST and BRIEF feature extraction methods. Also, two different approaches, standard deviation and weighted average, have been applied to derive the congestion information from the histogram of the image in the histogram based traffic congestion solution. Both systems have been tested under different weather conditions, such as cloudy, sunny, and rainy, to provide various illumination at both daytime and night. For all developed systems, performance results are examined to show the advantages and drawbacks of these systems.
HSV Color Histogram and Directional Binary Wavelet Patterns for Content Based Image Retrieval
Directory of Open Access Journals (Sweden)
P.Vijaya Bhaskar Reddy
2012-08-01
This paper presents a new image indexing and retrieval algorithm integrating color (HSV color histogram) and texture (directional binary wavelet patterns, DBWP) features. For the color feature, the RGB image is first converted to an HSV image, and then histograms are constructed from the HSV spaces. For the texture feature, an 8-bit grayscale image is divided into eight binary bit-planes, and a binary wavelet transform (BWT) is applied to each bit-plane to extract multi-resolution binary images. The local binary pattern (LBP) features are extracted from the resultant BWT sub-bands. Two experiments have been carried out to prove the worth of our algorithm. The databases considered for the experiments are the Corel 1000 database (DB1) and the MIT VisTex database (DB2). The investigated results show a significant improvement in terms of their evaluation measures as compared to the HSV histogram and DBWP.
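The HSV color-histogram half of such a feature vector can be sketched as follows. The 8×3×3 quantisation, the pixel-list representation, and the toy patch are illustrative assumptions; the paper does not specify its bin counts here.

```python
import colorsys

def hsv_histogram(pixels, h_bins=8, s_bins=3, v_bins=3):
    # Quantised HSV colour histogram of a list of (r, g, b) 0..255 pixels.
    # Each pixel is converted RGB -> HSV, then each channel is binned.
    hist = [0] * (h_bins * s_bins * v_bins)
    for r, g, b in pixels:
        h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
        hi = min(int(h * h_bins), h_bins - 1)
        si = min(int(s * s_bins), s_bins - 1)
        vi = min(int(v * v_bins), v_bins - 1)
        hist[(hi * s_bins + si) * v_bins + vi] += 1
    n = float(len(pixels))
    return [c / n for c in hist]               # normalised to sum to 1

patch = [(255, 0, 0)] * 10 + [(0, 0, 255)] * 6   # 10 red and 6 blue pixels
feat = hsv_histogram(patch)
print(feat[8])                                   # red bin share → 0.625
```

In a CBIR pipeline this vector would be concatenated with the texture (DBWP/LBP) features before similarity matching.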
Multi-radius centralized binary pattern histogram projection for face recognition
Institute of Scientific and Technical Information of China (English)
Xiaofeng Fu; Wei Wei
2009-01-01
The existing local binary pattern (LBP) operators have several disadvantages, such as rather long histograms, lower discrimination, and sensitivity to noise. Aiming at these problems, we propose the centralized binary pattern (CBP) operator. The CBP operator can significantly reduce the histograms' dimensionality, offer stronger discrimination, and decrease the influence of white noise on face images. Moreover, to increase recognition accuracy and speed, we use the multi-radius CBP histogram as the face representation and project it onto the locality preserving projection (LPP) space to obtain lower-dimensional features. Experiments on the FERET and CAS-PEAL databases demonstrate that the proposed method is superior to other modern approaches not only in recognition accuracy but also in recognition speed.
Reversible Data Hiding Based on Two-level HDWT Coefficient Histograms
Directory of Open Access Journals (Sweden)
Xu-Ren Luo
2011-05-01
In recent years, reversible data hiding has attracted much more attention than before. Reversibility signifies that the original media can be recovered without any loss from the marked media after extracting the embedded message. This paper presents a new method that adopts a two-level wavelet transform and exploits the feature of large wavelet coefficient variance to achieve the goal of high capacity with imperceptibility. Our method differs from previous ones in that the wavelet-coefficient histogram, not the gray-level histogram, is manipulated. Besides, clever shifting rules are introduced into the histogram to avoid the decimal problem in pixel values after recovery and thereby achieve reversibility. The wavelet coefficients are only slightly altered in the embedding process, and therefore low visual distortion is obtained in the marked image. In addition, an important feature of our design is that the use of the threshold differs considerably from previous studies. The results indicate that our design is superior to many other state-of-the-art reversible data hiding schemes.
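As generic background for histogram-based reversible data hiding, the classic gray-level histogram-shifting scheme (in the style of Ni et al.) can be sketched as follows. The entry above manipulates wavelet-coefficient histograms instead; the toy pixel list and the chosen peak/zero bins are illustrative.

```python
# Classic gray-level histogram-shifting reversible data hiding.
# `peak` is a high-frequency gray level, `zero` an empty bin, peak < zero.
# Capacity equals the number of peak-valued pixels; the payload must match it.

def embed(pixels, bits, peak, zero):
    out, k = [], 0
    for v in pixels:
        if peak < v < zero:
            out.append(v + 1)                  # shift to vacate bin peak+1
        elif v == peak and k < len(bits):
            out.append(v + bits[k])            # peak carries 0, peak+1 carries 1
            k += 1
        else:
            out.append(v)
    return out

def extract(pixels, peak, zero):
    bits, restored = [], []
    for v in pixels:
        if v == peak:
            bits.append(0); restored.append(peak)
        elif v == peak + 1:
            bits.append(1); restored.append(peak)
        elif peak + 1 < v <= zero:
            restored.append(v - 1)             # undo the shift
        else:
            restored.append(v)
    return bits, restored

cover = [5, 5, 5, 6, 7, 9, 3]                  # toy "image"; peak=5, empty bin=8
stego = embed(cover, [1, 0, 1], peak=5, zero=8)
bits, restored = extract(stego, peak=5, zero=8)
print(bits, restored == cover)                 # → [1, 0, 1] True
```

The wavelet-domain variant follows the same shift-and-embed idea, but the shifting rules must additionally avoid the decimal problem the abstract mentions when transforming back to pixel values.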
Glioma grade assessment by using histogram analysis of diffusion tensor imaging-derived maps.
Jakab, András; Molnár, Péter; Emri, Miklós; Berényi, Ervin
2011-07-01
Current endeavors in neuro-oncology include morphological validation of imaging methods by histology, including molecular and immunohistochemical techniques. Diffusion tensor imaging (DTI) is an up-to-date methodology of intracranial diagnostics that has gained importance in studies of neoplasia. Our aim was to assess the feasibility of discriminant analysis applied to histograms of preoperative diffusion tensor imaging-derived images for the prediction of glioma grade validated by histomorphology. Tumors of 40 consecutive patients included 13 grade II astrocytomas, seven oligoastrocytomas, six grade II oligodendrogliomas, three grade III oligoastrocytomas, and 11 glioblastoma multiformes. Preoperative DTI data comprised: unweighted (B0) images, fractional anisotropy, longitudinal and radial diffusivity maps, directionally averaged diffusion-weighted imaging, and trace images. Sampling consisted of generating histograms for gross tumor volumes; 25 histogram bins per scalar map were calculated. The histogram bins that allowed the most precise determination of low-grade (LG) or high-grade (HG) classification were selected by multivariate discriminant analysis. Accuracy of the model was defined by the success rate of the leave-one-out cross-validation. Statistical descriptors of voxel value distribution did not differ between LG and HG tumors and did not allow classification. The histogram model had 88.5% specificity and 85.7% sensitivity in the separation of LG and HG gliomas; specificity was improved when cases with oligodendroglial components were omitted. Constructing histograms of preoperative radiological images over the tumor volume allows representation of the grade and enables discrimination of LG and HG gliomas which has been confirmed by histopathology.
Histogram Bins Matching Approach for CBIR Based on Linear grouping for Dimensionality Reduction
Directory of Open Access Journals (Sweden)
H. B. Kekre
2013-11-01
This paper describes the histogram bins matching approach for CBIR. Histogram bins are reduced from 256 to 32 and to 16 by linear grouping, and the effect of this dimensionality reduction is analyzed, compared, and evaluated. The work presented in this paper contributes to all three main phases of CBIR: feature extraction, similarity matching, and performance evaluation. Feature extraction explores the idea of histogram bins matching for the three colors R, G, and B. Histogram bin contents are used to represent the feature vector in three forms. The first form of the feature is the count of pixels; the other forms are obtained by computing the total and the mean of intensities for the pixels falling in each of the histogram bins. Initially the size of the feature vector is 256 components, as a histogram with all 256 bins. The size of the feature vector is then reduced to 32 bins and then to 16 bins by simple linear grouping of the bins. The feature extraction process for each size and type of feature vector is executed over a database of 2000 BMP images in 20 different classes, preparing the feature vector databases as the preprocessing part of this work. Similarity matching between query and database image feature vectors is carried out by means of the first five orders of the Minkowski distance and also with the cosine correlation distance. The same set of 200 query images is executed for all types of feature vectors and all similarity measures. Performance of all aspects addressed in this paper is evaluated using three parameters: PRCP (Precision Recall Cross-over Point), LS (Longest String), and LSRR (Length of String to Retrieve all Relevant images).
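The linear grouping of 256 bins into 32 or 16, and the Minkowski-distance matching, can be sketched as follows; the toy histogram and the tested distance orders are illustrative.

```python
def group_bins(hist256, target_bins):
    # Linear grouping: merge equal-width runs of adjacent bins
    # (e.g. 256 -> 32 by summing groups of 8, or 256 -> 16 by groups of 16).
    step = len(hist256) // target_bins
    return [sum(hist256[i:i + step]) for i in range(0, len(hist256), step)]

def minkowski(a, b, r):
    # Minkowski distance of order r (r=1 Manhattan, r=2 Euclidean, ...).
    return sum(abs(x - y) ** r for x, y in zip(a, b)) ** (1.0 / r)

h = [1] * 256                  # toy 256-bin histogram (one pixel per bin)
h32 = group_bins(h, 32)        # 32 bins of 8 each
h16 = group_bins(h, 16)        # 16 bins of 16 each
print(h32[0], h16[0])          # → 8 16
```

In the paper's setup, one such grouped histogram is built per color channel (R, G, B), and the query is compared against every database feature vector with the first five orders of this distance plus the cosine correlation distance.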
Glioma grade assessment by using histogram analysis of diffusion tensor imaging-derived maps
Energy Technology Data Exchange (ETDEWEB)
Jakab, Andras; Berenyi, Ervin [University of Debrecen Medical and Health Science Center, Department of Biomedical Laboratory and Imaging Science, Faculty of Medicine, Debrecen (Hungary); Molnar, Peter [University of Debrecen Medical and Health Science Center, Institute of Pathology, Faculty of Medicine, Debrecen (Hungary); Emri, Miklos [University of Debrecen Medical and Health Science Center, Institute of Nuclear Medicine, Faculty of Medicine, Debrecen (Hungary)
2011-07-15
Current endeavors in neuro-oncology include morphological validation of imaging methods by histology, including molecular and immunohistochemical techniques. Diffusion tensor imaging (DTI) is an up-to-date methodology of intracranial diagnostics that has gained importance in studies of neoplasia. Our aim was to assess the feasibility of discriminant analysis applied to histograms of preoperative diffusion tensor imaging-derived images for the prediction of glioma grade validated by histomorphology. Tumors of 40 consecutive patients included 13 grade II astrocytomas, seven oligoastrocytomas, six grade II oligodendrogliomas, three grade III oligoastrocytomas, and 11 glioblastoma multiformes. Preoperative DTI data comprised: unweighted (B0) images, fractional anisotropy, longitudinal and radial diffusivity maps, directionally averaged diffusion-weighted imaging, and trace images. Sampling consisted of generating histograms for gross tumor volumes; 25 histogram bins per scalar map were calculated. The histogram bins that allowed the most precise determination of low-grade (LG) or high-grade (HG) classification were selected by multivariate discriminant analysis. Accuracy of the model was defined by the success rate of the leave-one-out cross-validation. Statistical descriptors of voxel value distribution did not differ between LG and HG tumors and did not allow classification. The histogram model had 88.5% specificity and 85.7% sensitivity in the separation of LG and HG gliomas; specificity was improved when cases with oligodendroglial components were omitted. Constructing histograms of preoperative radiological images over the tumor volume allows representation of the grade and enables discrimination of LG and HG gliomas which has been confirmed by histopathology. (orig.)
Adapting histogram for automatic noise data removal in building interior point cloud data
Shukor, S. A. Abdul; Rushforth, E. J.
2015-05-01
3D point cloud data is now preferred by researchers to generate 3D models. These models can be used throughout a variety of applications, including 3D building interior models. The rise of Building Information Modeling (BIM) for Architectural, Engineering, Construction (AEC) applications has given 3D interior modelling more attention recently. To generate a 3D model representing the building interior, a laser scanner is used to collect the point cloud data. However, this data often comes with noise. This is due to several factors, including the surrounding objects, lighting, and the specifications of the laser scanner. This paper highlights the usage of the histogram to remove the noise data. Histograms, used in statistics and probability, are regularly applied in a number of applications such as image processing, where a histogram can represent the total number of pixels in an image at each intensity level. Here, histograms represent the number of points recorded at range distance intervals in various projections. As unwanted noise data has a sparser cloud density compared to the required data and is usually situated at a notable distance from the required data, noise data will have lower frequencies in the histogram. By defining the acceptable range using the average frequency, points below this range can be removed. This research has shown that these histograms have the capability to remove unwanted data from 3D point cloud data representing building interiors automatically. This feature will aid the process of data preprocessing in producing an ideal 3D model from the point cloud data.
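The filtering idea described above can be sketched as follows. The scanner origin, the bin width, and the "keep bins at or above the average frequency" threshold are illustrative assumptions following the paper's description, not its exact parameters.

```python
import math

def filter_by_range_histogram(points, origin=(0.0, 0.0, 0.0), bin_width=0.5):
    # Histogram of point-to-scanner distances; bins whose frequency falls
    # below the average bin frequency are treated as sparse noise and dropped.
    dists = [math.dist(p, origin) for p in points]
    bins = {}
    for d in dists:
        b = int(d / bin_width)
        bins[b] = bins.get(b, 0) + 1
    avg = sum(bins.values()) / len(bins)
    keep = {b for b, n in bins.items() if n >= avg}
    return [p for p, d in zip(points, dists) if int(d / bin_width) in keep]

# Dense wall at ~1 m plus two stray returns far away.
cloud = [(1.0, 0.0, 0.0)] * 10 + [(5.0, 0.0, 0.0), (7.0, 0.0, 0.0)]
print(len(filter_by_range_histogram(cloud)))   # → 10
```

In practice the paper applies this per projection, exploiting the fact that noise is both sparse and distant from the building interior itself.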
Spline histogram method for reconstruction of probability density function of clusters of galaxies
Docenko, Dmitrijs; Berzins, Karlis
2003-01-01
We describe the spline histogram algorithm, which is useful for visualization of the probability density function when setting up a statistical hypothesis test. The spline histogram is constructed from discrete data measurements using tensioned cubic spline interpolation of the cumulative distribution function, which is then differentiated and smoothed using the Savitzky-Golay filter. The optimal width of the filter is determined by minimization of the Integrated Square Error function. The current distribution of the TCSplin algorithm, written in f77 with IDL and Gnuplot visualization scripts, is available from http://www.virac.lv/en/soft.html
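A minimal, numpy-only sketch of the pipeline just described: build the empirical CDF, interpolate it on a fine grid, differentiate, then smooth. Linear interpolation stands in for the tensioned cubic spline and a moving average for the Savitzky-Golay filter, so this approximates the idea rather than reimplementing TCSplin.

```python
import numpy as np

def spline_histogram(samples, grid_size=200, window=11):
    x = np.sort(np.asarray(samples, dtype=float))
    cdf = np.arange(1, x.size + 1) / x.size        # empirical CDF
    grid = np.linspace(x[0], x[-1], grid_size)
    cdf_i = np.interp(grid, x, cdf)                # interpolated CDF
    pdf = np.gradient(cdf_i, grid)                 # differentiate -> raw PDF
    kernel = np.ones(window) / window              # smoothing (stand-in filter)
    return grid, np.convolve(pdf, kernel, mode="same")

rng = np.random.default_rng(1)
grid, pdf = spline_histogram(rng.normal(0.0, 1.0, 2000))
area = float(pdf.sum() * (grid[1] - grid[0]))      # close to 1 for a density
```

The result is a smooth density estimate whose integral is close to one and whose mode lies near the true mean of the sample.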
Similarity Estimation Between DNA Sequences Based on Local Pattern Histograms of Binary Images
Institute of Scientific and Technical Information of China (English)
Yusei Kobori; Satoshi Mizuta
2016-01-01
Graphical representation of DNA sequences is one of the most popular techniques for alignment-free sequence comparison. Here, we propose a new method for feature extraction from DNA sequences represented as binary images, estimating the similarity between DNA sequences using frequency histograms of the local bitmap patterns of the images. Our method has linear time complexity in the length of the DNA sequences, which is practical even when long sequences, such as whole genome sequences, are compared. We tested five distance measures for the estimation of sequence similarities, and found that the histogram intersection and Manhattan distance are the most appropriate ones for phylogenetic analyses.
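The comparison step can be sketched as follows. This assumes the DNA sequences have already been rendered as binary images (the paper's specific sequence-to-image mapping is not reproduced); we count 2x2 local bitmap patterns and compare the normalized histograms with the two best-performing measures named above.

```python
def pattern_histogram(img):
    """Normalized frequency of the 16 possible 2x2 binary patterns."""
    h, w = len(img), len(img[0])
    counts = [0] * 16
    for i in range(h - 1):
        for j in range(w - 1):
            code = (img[i][j] << 3 | img[i][j + 1] << 2 |
                    img[i + 1][j] << 1 | img[i + 1][j + 1])
            counts[code] += 1
    total = (h - 1) * (w - 1)
    return [c / total for c in counts]

def histogram_intersection(p, q):   # 1 = identical, 0 = disjoint
    return sum(min(a, b) for a, b in zip(p, q))

def manhattan(p, q):                # 0 = identical
    return sum(abs(a - b) for a, b in zip(p, q))

a = [[0, 1, 0, 1], [1, 0, 1, 0], [0, 1, 0, 1]]   # checkerboard image
b = [[1, 1, 1, 1], [1, 1, 1, 1], [1, 1, 1, 1]]   # all-ones image
sim_self = histogram_intersection(pattern_histogram(a), pattern_histogram(a))
dist = manhattan(pattern_histogram(a), pattern_histogram(b))
```

Scanning every window once is what gives the linear time complexity in sequence (image) length noted in the abstract.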
Yamamoto, Takahiro; Sasaoka, Kenji; Watanabe, Satoshi
2011-05-27
Universal fluctuations in phonon transmission and other features of phonon-transmission histograms are investigated by performing numerical simulations of coherent-phonon transport in isotope-disordered carbon nanotubes. Interestingly, the phonon-transmission fluctuation in the diffusive regime is universal, irrespective of the average phonon transmission, the tube chirality, and the concentrations and masses of the isotopes. We also find that the histogram, which has a Gaussian distribution in the diffusive regime, has a log-normal distribution in the localization regime.
Overlapping Community Detection by Online Cluster Aggregation
Kozdoba, Mark
2015-01-01
We present a new online algorithm for detecting overlapping communities. The main ingredients are a modification of an online k-means algorithm and a new approach to modelling overlap in communities. An evaluation on large benchmark graphs shows that the quality of discovered communities compares favorably to several methods in the recent literature, while the running time is significantly improved.
DIMENSIONS OF SELF-AFFINE SETS WITH OVERLAPS
Institute of Scientific and Technical Information of China (English)
华苏
2003-01-01
The authors develop an algorithm to show that a class of self-affine sets with overlaps can be viewed as sofic affine-invariant sets without overlaps; thus, by using the results of [11] and [10], the Hausdorff and Minkowski dimensions are determined.
Bayesian Overlapping Community Detection in Dynamic Networks
Ghorbani, Mahsa; Khodadadi, Ali
2016-01-01
Detecting community structures in social networks has gained considerable attention in recent years. However, lack of prior knowledge about the number of communities and their overlapping nature have made community detection a challenging problem. Moreover, many of the existing methods only consider static networks, while most real-world networks are dynamic and evolve over time. Hence, finding consistent overlapping communities in dynamic networks without any prior knowledge about the number of communities is still an interesting open research problem. In this paper, we present an overlapping community detection method for dynamic networks called Dynamic Bayesian Overlapping Community Detector (DBOCD). DBOCD assumes that in every snapshot of the network, the overlapping parts of communities are dense areas, and utilizes link communities instead of common node communities. Using the Recurrent Chinese Restaurant Process and the community structure of the network in the last snapshot, DBOCD simultaneously extracts the numbe...
Neural overlap in processing music and speech.
Peretz, Isabelle; Vuvan, Dominique; Lagrois, Marie-Élaine; Armony, Jorge L
2015-03-19
Neural overlap in processing music and speech, as measured by the co-activation of brain regions in neuroimaging studies, may suggest that parts of the neural circuitries established for language may have been recycled during evolution for musicality, or vice versa that musicality served as a springboard for language emergence. Such a perspective has important implications for several topics of general interest besides evolutionary origins. For instance, neural overlap is an important premise for the possibility of music training to influence language acquisition and literacy. However, neural overlap in processing music and speech does not entail sharing neural circuitries. Neural separability between music and speech may occur in overlapping brain regions. In this paper, we review the evidence and outline the issues faced in interpreting such neural data, and argue that converging evidence from several methodologies is needed before neural overlap is taken as evidence of sharing.
A novel method for the evaluation of uncertainty in dose volume histogram computation
Cutanda-Henriquez, Francisco
2007-01-01
Dose volume histograms are a useful tool in state-of-the-art radiotherapy planning, and it is essential to be aware of their limitations. Dose distributions computed by treatment planning systems are affected by several sources of uncertainty, such as algorithm limitations, measurement uncertainty in the data used to model the beam, and residual differences between measured and computed dose once the model is optimized. In order to take the effect of uncertainty into account, a probabilistic approach is proposed and a new kind of histogram, a dose-expected volume histogram, is introduced. The expected value of the volume in the region of interest receiving an absorbed dose equal to or greater than a certain value is found using the probability distribution of the dose at each point. A rectangular probability distribution is assumed for this point dose, and a relationship is given for practical computations. This method is applied to a set of dose volume histograms for different regions of interest for 6 brain pat...
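The construction can be sketched directly from the description above: each voxel's dose is modeled as a rectangular (uniform) distribution centred on the computed value, and the expected volume receiving at least dose D is the sum over voxels of P(dose >= D). The half-width `u` of the rectangle and the example dose values are assumptions for illustration.

```python
def prob_dose_at_least(d_computed, threshold, u):
    """P(dose >= threshold) for a uniform distribution on [d-u, d+u]."""
    lo, hi = d_computed - u, d_computed + u
    if threshold <= lo:
        return 1.0
    if threshold >= hi:
        return 0.0
    return (hi - threshold) / (hi - lo)

def expected_volume_histogram(doses, thresholds, u, voxel_volume=1.0):
    """Expected volume receiving at least each threshold dose."""
    return [voxel_volume * sum(prob_dose_at_least(d, t, u) for d in doses)
            for t in thresholds]

doses = [50.0, 55.0, 60.0]          # computed voxel doses (illustrative, Gy)
evh = expected_volume_histogram(doses, [0.0, 55.0, 70.0], u=2.0)
```

At threshold 0 every voxel certainly qualifies; far above the maximum dose none do; in between, the rectangular distribution yields fractional expected volumes.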
Rhodes, Andrew P.; Christian, John A.; Evans, Thomas
2017-01-01
With the availability and popularity of 3D sensors, it is advantageous to re-examine the use of point cloud descriptors for the purpose of pose estimation and spacecraft relative navigation. One popular descriptor is the oriented unique repeatable clustered viewpoint feature histogram (OUR-CVFH), which is most often utilized in personal and industrial robotics to simultaneously recognize and navigate relative to an object. Recent research into using the OUR-CVFH descriptor for spacecraft navigation has produced favorable results. Since OUR-CVFH is the most recent innovation in a large family of feature histogram point cloud descriptors, discussions of parameter settings and insights into its functionality are spread among various publications and online resources. This paper organizes the history of feature histogram point cloud descriptors into a straightforward explanation of their evolution. This article compiles all the requisite information needed to implement OUR-CVFH into one location, as well as providing useful suggestions on how to tune the generation parameters. This work is beneficial for anyone interested in using this histogram descriptor for object recognition or navigation, be it personal robotics or spacecraft navigation.
Three-dimensional histogram visualization in different color spaces and applications
Marcu, Gabriel G.; Abe, Satoshi
1995-10-01
A visualization procedure for the 3D histogram of color images is presented. The procedure assumes that the histogram is available as a table that associates to each pixel color the number of its appearances in the image. The procedure runs for the RGB, YMC, HSV, HSL, L*a*b, and L*u*v color spaces and is easily extendable to other color spaces if the analytical form of the color transformations is available. Each histogram value is represented in the color space as a colored ball, in a position corresponding to the place of the color in the space. A simple drawing procedure is used instead of more complicated 3D rendering techniques. The 3D histogram visualization offers a clear and intuitive representation of the color distribution of the image. The procedure is applied to derive a clusterization technique for color classification and visualize its results, to display comparatively the gamut of different color devices, and to detect the misalignment of the RGB planes of a color image. Diagrams illustrating the visualization procedure are presented for each application.
Marcu, Gabriel G.; Abe, Satoshi
1995-04-01
The paper presents a dynamic visualization procedure for the 3D histogram of color images. The procedure runs for the RGB, YMC, HSV, and HSL device-dependent color spaces and for the Lab and Luv device-independent color spaces, and it is easily extendable to other color spaces if the analytical form of the color transformations is available. Each histogram value is represented in the color space as a colored ball, in a position corresponding to the place of the color in the color space. The paper presents the procedures for nonlinear ball normalization, ordering of drawing, space edges drawing, and translation, scaling, and rotation of the histogram. The 3D histogram visualization procedure can be used in the different applications described in the second part of the paper. It enables the user to get a clear representation of the range of colors of one image, to derive and compare the efficiency of different clusterization procedures for color classification, to display comparatively the gamut of different color devices, to select the color space for an optimal mapping procedure of the out-of-gamut colors that minimizes the hue error, and to detect misalignment of the RGB planes in a sequential process.
Effect of molecular organization on the image histograms of polarization SHG microscopy.
Psilodimitrakopoulos, Sotiris; Amat-Roldan, Ivan; Loza-Alvarez, Pablo; Artigas, David
2012-10-01
Based on its polarization dependency, second harmonic generation (PSHG) microscopy has been proven capable of structurally characterizing molecular architectures in different biological samples. By exploiting this polarization dependency of the SHG signal in every pixel of the image, average quantitative structural information can be retrieved in the form of PSHG image histograms. In the present study we experimentally show how PSHG image histograms can be affected by the organization of the SHG-active molecules. Our experimental scenario is grounded in two inherent properties of starch granules. Firstly, we take advantage of the radial organization of amylopectin molecules (the SHG source in starch) to attribute shifts of the image histograms to the existence of molecules tilted out of the plane. Secondly, we use the property of starch to organize upon hydration to demonstrate that the degree of structural order at the molecular level affects the width of the PSHG image histograms: the narrower the histogram, the more organized the molecules in the sample, resulting in a reliable method to measure order. The implication of this finding is crucial to the interpretation of PSHG images used, for example, in tissue diagnostics.
Histogram of oriented phase (HOP): a new descriptor based on phase congruency
Ragb, Hussin K.; Asari, Vijayan K.
2016-05-01
In this paper we present a low-level image descriptor called Histogram of Oriented Phase, based on the phase congruency concept and Principal Component Analysis (PCA). Since the phase of a signal conveys more information about its structure than the magnitude, the proposed descriptor can identify and localize image features more precisely than gradient-based techniques, especially in regions affected by illumination changes. The proposed features are formed by extracting the phase congruency information for each pixel in the image with respect to its neighborhood. Histograms of the phase congruency values of local regions in the image are computed with respect to orientation, and these histograms are concatenated to construct the Histogram of Oriented Phase (HOP) features. The dimensionality of the HOP features is reduced using the PCA algorithm to form the HOP-PCA descriptor. The dimensionless quantity of the phase congruency makes the HOP-PCA descriptor more robust to image scale variations as well as contrast and illumination changes. Several experiments were performed using the INRIA and DaimlerChrysler datasets to evaluate the performance of the HOP-PCA descriptor. The experimental results show that the proposed descriptor has better detection performance and lower error rates than a set of state-of-the-art feature extraction methodologies.
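The pipeline structure (per-cell orientation histograms, concatenation, PCA reduction) can be sketched as below. Computing true phase congruency requires a bank of log-Gabor filters, so a simple gradient orientation is used here as a stand-in for the phase-congruency orientation map; cell size and bin count are assumed parameters.

```python
import numpy as np

def hop_features(img, cell=8, n_bins=9):
    """Concatenated per-cell orientation histograms (HOP-like layout)."""
    gy, gx = np.gradient(img.astype(float))
    ori = np.mod(np.arctan2(gy, gx), np.pi)          # orientation in [0, pi)
    h, w = img.shape
    feats = []
    for i in range(0, h - cell + 1, cell):
        for j in range(0, w - cell + 1, cell):
            block = ori[i:i + cell, j:j + cell].ravel()
            hist, _ = np.histogram(block, bins=n_bins, range=(0, np.pi))
            feats.append(hist / block.size)           # normalized cell histogram
    return np.concatenate(feats)

def pca_reduce(X, k):
    """Project rows of X onto the first k principal components via SVD."""
    Xc = X - X.mean(axis=0)
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ vt[:k].T

rng = np.random.default_rng(2)
imgs = rng.random((10, 32, 32))                       # toy image batch
X = np.stack([hop_features(im) for im in imgs])       # (10, 16 cells * 9 bins)
Z = pca_reduce(X, 5)                                  # reduced descriptors
```

For 32x32 images with 8x8 cells this yields 16 cells of 9 bins each, i.e. a 144-dimensional descriptor before PCA.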
Efficient visibility-driven medical image visualisation via adaptive binned visibility histogram.
Jung, Younhyun; Kim, Jinman; Kumar, Ashnil; Feng, David Dagan; Fulham, Michael
2016-07-01
'Visibility' is a fundamental optical property that represents the proportion of the voxels in a volume observable by users during interactive volume rendering. The manipulation of this 'visibility' improves the volume rendering process, for instance by ensuring the visibility of regions of interest (ROIs) or by guiding the identification of an optimal rendering view-point. The construction of visibility histograms (VHs), which represent the distribution of the visibility of all voxels in the rendered volume, enables users to explore the volume with real-time feedback about occlusion patterns among spatially related structures during volume rendering manipulations. Volume rendered medical images have been a primary beneficiary of the VH, given the need to ensure that specific ROIs are visible relative to the surrounding structures, e.g. the visualisation of tumours that may otherwise be occluded by neighbouring structures. VH construction and its subsequent manipulations, however, are computationally expensive due to the histogram binning of the visibilities. This limits the real-time application of the VH to medical images that have large intensity ranges and volume dimensions and require a large number of histogram bins. In this study, we introduce an efficient adaptive binned visibility histogram (AB-VH) in which a smaller number of histogram bins is used to represent the visibility distribution of the full VH. We adaptively bin medical images by using a cluster analysis algorithm that groups the voxels according to their intensity similarities into a smaller subset of bins while preserving the distribution of the intensity range of the original images. We increase efficiency by exploiting the parallel computation and multiple render targets (MRT) extension of modern graphical processing units (GPUs), and this enables efficient computation of the histogram. We show the application of our method to single-modality computed tomography (CT), magnetic resonance
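The adaptive binning step can be sketched with a 1D k-means over voxel intensities, which groups similar intensities into a small number of data-driven bins; the GPU visibility computation itself is outside the scope of this sketch, and the cluster count and example intensity modes are assumptions.

```python
import numpy as np

def adaptive_bins(intensities, k=4, iters=20):
    """1D k-means: bin centres that follow the intensity distribution."""
    x = np.sort(np.asarray(intensities, dtype=float))
    centres = np.quantile(x, np.linspace(0, 1, k))   # spread initial centres
    for _ in range(iters):
        labels = np.argmin(np.abs(x[:, None] - centres[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):                  # guard empty clusters
                centres[j] = x[labels == j].mean()
    return centres, labels

rng = np.random.default_rng(5)
# Bimodal toy intensities: soft tissue around 40, bone around 400.
x = np.concatenate([rng.normal(40, 5, 300), rng.normal(400, 20, 100)])
centres, labels = adaptive_bins(x, k=4)
```

Unlike uniform-width bins, the centres concentrate where the data lies, so a few bins can stand in for a full-resolution histogram.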
Direct-space methods in phase extension and phase refinement. IV. The double-histogram method.
Refaat, L S; Tate, C; Woolfson, M M
1996-03-01
In the conventional histogram-matching technique for phase extension and refinement for proteins a simple one-to-one transformation is made in the protein region to modify calculated density so that it will have some target histogram in addition to solvent flattening. This work describes an investigation where the density modification takes into account not only the current calculated density at a grid point but also some characteristic of the environment of the grid point within some distance R. This characteristic can be one of the local maximum density, the local minimum density or the local variance of density. The grid points are divided into ten groups, each containing the same number of grid points, for ten different ranges of value of the local characteristic. The ten groups are modified to give different histograms, each corresponding to that obtained under the same circumstances from a structure similar to the one under investigation. This process is referred to as the double-histogram matching method. Other processes which have been investigated are the weighting of structure factors when calculating maps with estimated phases and also the use of a factor to dampen the change of density and so control the refinement process. Two protein structures were used in numerical trials, RNApl [Bezborodova, Ermekbaeva, Shlyapnikov, Polyakov & Bezborodov (1988). Biokhimiya, 53, 965-973] and 2-Zn insulin [Baker, Blundell, Cutfield, Cutfield, Dodson, Dodson, Hodgkin, Hubbard, Isaacs, Reynolds, Sakabe, Sakabe & Vijayan (1988). Philos. Trans. R. Soc. London Ser. B, 319, 456-469]. Comparison of the proposed procedures with the normal histogram-matching technique without structure-factor weighting or damping gives mean phase errors reduced by up to 10 degrees with map correlation coefficients improved by as much as 0.14. Compared to the normal histogram used with weighting of structure factors and damping, the improvement due to the use of the double-histogram method is
Fan, Jihong
2016-09-17
The dictionary plays an important role in multi-instance data representation, mapping bags of instances to histograms. Earth mover’s distance (EMD) is the most effective histogram distance metric for multi-instance retrieval. However, up to now, no existing multi-instance dictionary learning method has been designed for EMD-based histogram comparison. To fill this gap, we develop the first EMD-optimal dictionary learning method, using a stochastic optimization method. In the stochastic learning framework, we have one triplet of bags: one basic bag, one positive bag, and one negative bag. These bags are mapped to histograms using a multi-instance dictionary. We argue that the EMD between the basic histogram and the positive histogram should be smaller than that between the basic histogram and the negative histogram. Based on this condition, we design a hinge loss. By minimizing this hinge loss and some regularization terms of the dictionary, we update the dictionary instances. Experiments on multi-instance retrieval applications show its effectiveness compared to other dictionary learning methods on the problems of medical image retrieval and natural language relation classification.
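The triplet hinge loss at the heart of the method can be sketched as follows. For 1D histograms of equal mass, the EMD reduces to the L1 distance between cumulative sums; the margin value is an assumed hyperparameter, and the dictionary-mapping step is abstracted away (we start directly from the three histograms).

```python
import numpy as np

def emd_1d(p, q):
    """EMD between two 1D histograms of equal mass: L1 of CDF difference."""
    return float(np.abs(np.cumsum(p) - np.cumsum(q)).sum())

def triplet_hinge(basic, positive, negative, margin=1.0):
    """Penalize when the positive bag is not closer (in EMD) than the negative."""
    return max(0.0, margin + emd_1d(basic, positive) - emd_1d(basic, negative))

basic    = np.array([0.5, 0.5, 0.0, 0.0])
positive = np.array([0.4, 0.6, 0.0, 0.0])   # close to the basic histogram
negative = np.array([0.0, 0.0, 0.5, 0.5])   # far from the basic histogram
loss = triplet_hinge(basic, positive, negative)
```

Here the negative histogram is already much farther than the positive one, so the hinge is inactive and the loss is zero; the full method backpropagates a nonzero loss into the dictionary instances.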
Generation of stable overlaps between antiparallel filaments
Johann, D; Kruse, K
2015-01-01
During cell division, sister chromatids are segregated by the mitotic spindle, a bipolar assembly of interdigitating antiparallel polar filaments called microtubules. Establishing a stable overlap region is essential for maintenance of bipolarity, but the underlying mechanisms are poorly understood. Using a particle-based stochastic model, we find that the interplay of motors and passive cross linkers can robustly generate partial overlaps between antiparallel filaments. Our analysis shows that motors reduce the overlap in a length-dependent manner, whereas passive cross linkers increase it independently of the length. In addition to maintaining structural integrity, passive cross linkers can thus also have a dynamic role for size regulation.
Zha, N.; Capaldi, D. P. I.; Pike, D.; McCormack, D. G.; Cunningham, I. A.; Parraga, G.
2015-03-01
Pulmonary x-ray computed tomography (CT) may be used to characterize emphysema and airways disease in patients with chronic obstructive pulmonary disease (COPD). One analysis approach, parametric response mapping (PRM), utilizes registered inspiratory and expiratory CT image volumes and CT-density-histogram thresholds, but there is no consensus regarding the threshold values used, or their clinical meaning. Principal component analysis (PCA) of the CT density histogram can be exploited to quantify emphysema using data-driven CT-density-histogram thresholds. Thus, the objective of this proof-of-concept demonstration was to develop a PRM approach using PCA-derived thresholds in COPD patients and ex-smokers without airflow limitation. Methods: Fifteen COPD ex-smokers and 5 normal ex-smokers were evaluated. Thoracic CT images were acquired at full inspiration and full expiration and these images were non-rigidly co-registered. PCA was performed on the CT density histograms, from which the components with eigenvalues greater than one were summed. Since the values of the principal component curve correlate directly with the variability in the sample, the maximum and minimum points on the curve were used as threshold values for the PCA-adjusted PRM technique. Results: A significant correlation was determined between conventional and PCA-adjusted PRM with 3He MRI apparent diffusion coefficient (p<0.001), with CT RA950 (p<0.0001), as well as with 3He MRI ventilation defect percent, a measurement of both small airways disease (p=0.049 and p=0.06, respectively) and emphysema (p=0.02). Conclusions: PRM generated using PCA thresholds of the CT density histogram showed significant correlations with CT and 3He MRI measurements of emphysema, but not airways disease.
Lin, Yuning; Li, Hui; Chen, Ziqian; Ni, Ping; Zhong, Qun; Huang, Huijuan; Sandrasegaran, Kumar
2015-05-01
The purpose of this study was to investigate the application of histogram analysis of apparent diffusion coefficient (ADC) in characterizing pathologic features of cervical cancer and benign cervical lesions. This prospective study was approved by the institutional review board, and written informed consent was obtained. Seventy-three patients with cervical cancer (33-69 years old; 35 patients with International Federation of Gynecology and Obstetrics stage IB cervical cancer) and 38 patients (38-61 years old) with normal cervix or cervical benign lesions (control group) were enrolled. All patients underwent 3-T diffusion-weighted imaging (DWI) with b values of 0 and 800 s/mm(2). ADC values of the entire tumor in the patient group and the whole cervix volume in the control group were assessed. Mean ADC, median ADC, 25th and 75th percentiles of ADC, skewness, and kurtosis were calculated. Histogram parameters were compared between different pathologic features, as well as between stage IB cervical cancer and control groups. Mean ADC, median ADC, and 25th percentile of ADC were significantly higher for adenocarcinoma (p = 0.021, 0.006, and 0.004, respectively), and skewness was significantly higher for squamous cell carcinoma (p = 0.011). Median ADC was statistically significantly higher for well or moderately differentiated tumors (p = 0.044), and skewness was statistically significantly higher for poorly differentiated tumors (p = 0.004). No statistically significant difference of ADC histogram was observed between lymphovascular space invasion subgroups. All histogram parameters differed significantly between stage IB cervical cancer and control groups (p < 0.05). Distribution of ADCs characterized by histogram analysis may help to distinguish early-stage cervical cancer from normal cervix or cervical benign lesions and may be useful for evaluating the different pathologic features of cervical cancer.
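The whole-tumour histogram parameters used in the study above (mean, median, 25th/75th percentiles, skewness, kurtosis) can be computed from a voxel sample with plain-numpy moment formulas; the synthetic ADC values below are illustrative, and kurtosis is reported in excess form (normal distribution ≈ 0).

```python
import numpy as np

def adc_histogram_params(adc):
    """Histogram descriptors of a set of ADC voxel values."""
    adc = np.asarray(adc, dtype=float)
    m, s = adc.mean(), adc.std()
    z = (adc - m) / s                      # standardized values
    return {
        "mean": m,
        "median": np.median(adc),
        "p25": np.percentile(adc, 25),
        "p75": np.percentile(adc, 75),
        "skewness": float(np.mean(z ** 3)),
        "kurtosis": float(np.mean(z ** 4) - 3.0),   # excess kurtosis
    }

rng = np.random.default_rng(3)
# Synthetic near-normal ADC sample around 1.2e-3 mm^2/s (illustrative).
params = adc_histogram_params(rng.normal(1.2e-3, 2e-4, 5000))
```

For a roughly symmetric sample like this, skewness and excess kurtosis sit near zero; the study's point is that shifts in these descriptors track tumour pathology.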
A CMOS VLSI IC for real-time opto-electronic two-dimensional histogram generation
Richstein, James K.
1993-12-01
Histogram generation, a standard image processing operation, produces a record of the intensity distribution in the image. Histogram generation has straightforward implementations on digital computers using high-level languages. A prototype of an opto-electronic histogram generator was designed and tested for 1-D objects using wire-wrapped MSI TTL components. The system has been shown to be fairly modular in design. The aspects of the extension to two dimensions and the VLSI implementation of this design are discussed. In this paper, we report a VLSI design to be used in a two-dimensional real-time histogram generation scheme. The overall system design is such that the electronic signal obtained from the optically scanned two-dimensional semi-opaque image is processed and displayed within one cycle of the scanning process. Specifically, in the VLSI implementation of the two-dimensional histogram generator, modifications were made to the original design. For the two-dimensional application, the output controller was analyzed as a finite state machine. The process used to describe the required timing signals and translate them into a VLSI finite state machine using computer aided design tools is discussed. In addition, the circuitry for sampling, binning, and display was combined with the timing circuitry on one IC. In the original design, the pulse width of the electronically sampled photodetector is limited with an analog one-shot. The high sampling rates associated with the extension to two dimensions require a significant reduction in the original 1-D prototype's sample pulse width of approximately 75 ns. The alternate design using VLSI logic gates will provide one-shot pulse widths of approximately 3 ns.
Overlap maximum matching ratio (OMMR): a new measure to evaluate overlaps of essential modules
Institute of Scientific and Technical Information of China (English)
Xiao-xia ZHANG; Qiang-hua XIAO; Bin LI; Sai HU; Hui-jun XIONG; Bi-hai ZHAO
2015-01-01
Protein complexes are the basic units of macro-molecular organization and help us to understand the cell’s mechanisms. The development of the yeast two-hybrid, tandem affinity purification, and mass spectrometry high-throughput proteomic techniques supplies a large amount of protein-protein interaction data, which makes it possible to predict overlapping complexes through computational methods. Research shows that overlapping complexes can contribute to identifying essential proteins, which are necessary for the organism to survive and reproduce, and for life’s activities. Scholars have paid much attention to the evaluation of protein complexes; however, few of them focus on predicted overlaps. In this paper, an evaluation criterion called the overlap maximum matching ratio (OMMR) is proposed to analyze the similarity between the identified overlaps and the benchmark overlap modules. Comparison of essential proteins and gene ontology (GO) analysis are also used to assess the quality of overlaps. We perform a comprehensive comparison of several overlapping-complex prediction approaches, using three yeast protein-protein interaction (PPI) networks. We focus on the analysis of the overlaps identified by these algorithms. Experimental results indicate the importance of overlaps and reveal the relationship between overlaps and the identification of essential proteins.
Shin, Young Gyung; Yoo, Jaeheung; Kwon, Hyeong Ju; Hong, Jung Hwa; Lee, Hye Sun; Yoon, Jung Hyun; Kim, Eun-Kyung; Moon, Hee Jung; Han, Kyunghwa; Kwak, Jin Young
2016-08-01
The objective of the study was to evaluate whether texture analysis using histogram and gray-level co-occurrence matrix (GLCM) parameters can help clinicians diagnose lymphocytic thyroiditis (LT) and differentiate LT according to pathologic grade. The background thyroid pathology of 441 patients was classified into no evidence of LT, chronic LT (CLT), and Hashimoto's thyroiditis (HT). Histogram and GLCM parameters were extracted from the regions of interest on ultrasound. The diagnostic performances of the parameters for diagnosing and differentiating LT were calculated. Of the histogram and GLCM parameters, the histogram mean had the highest Az (0.63) and VUS (0.303). As the degree of LT increased, the mean decreased and the standard deviation and entropy increased. The histogram mean from gray-scale ultrasound showed the best diagnostic performance of any single parameter in differentiating LT according to pathologic grade as well as in diagnosing LT.
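A minimal sketch of the texture parameters involved: a grey-level co-occurrence matrix for a horizontal pixel offset, plus the entropy used alongside the histogram mean and standard deviation. The tiny quantized image and the single offset are assumptions for illustration; the study's region-of-interest extraction is not reproduced.

```python
import numpy as np

def glcm(img, levels=4):
    """Normalized co-occurrence matrix for horizontally adjacent pixel pairs."""
    m = np.zeros((levels, levels))
    for row in img:
        for a, b in zip(row[:-1], row[1:]):   # horizontal neighbour pairs
            m[a, b] += 1
    return m / m.sum()

def entropy(p):
    """Shannon entropy (bits) of a probability table, ignoring zero cells."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [2, 2, 3, 3]])               # image quantized to 4 grey levels
co = glcm(img)
h = entropy(co.ravel())
```

A more homogeneous texture concentrates mass on fewer co-occurrence cells and lowers the entropy, which is why entropy rises with the increasingly heterogeneous echotexture of graded thyroiditis.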
Motor protein accumulation on antiparallel microtubule overlaps
Kuan, Hui-Shun
2015-01-01
Biopolymers serve as one-dimensional tracks on which motor proteins move to perform their biological roles. Motor protein phenomena have inspired theoretical models of one-dimensional transport, crowding, and jamming. Experiments studying the motion of Xklp1 motors on reconstituted antiparallel microtubule overlaps demonstrated that motors recruited to the overlap walk toward the plus end of individual microtubules and frequently switch between filaments. We study a model of this system that couples the totally asymmetric simple exclusion process (TASEP) for motor motion with switches between antiparallel filaments and binding kinetics. We determine steady-state motor density profiles for fixed-length overlaps using exact and approximate solutions of the continuum differential equations and compare to kinetic Monte Carlo simulations. The center region, far from the overlap ends, has a constant motor density, as one would naïvely expect. However, rather than following a simple binding equilibrium, the center ...
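A toy kinetic Monte Carlo sketch of the TASEP-with-binding ingredient of the model above: motors on a 1D lattice hop toward the plus end if the next site is free, with Langmuir binding and unbinding; the filament-switching and antiparallel geometry of the full model are omitted for brevity, and all rates are illustrative rather than fitted.

```python
import random

def sweep(lattice, p_hop=1.0, p_on=0.02, p_off=0.05, rng=random.Random(4)):
    """One random-sequential Monte Carlo sweep over the lattice."""
    n = len(lattice)
    for _ in range(n):
        i = rng.randrange(n)
        if lattice[i]:
            if rng.random() < p_off:
                lattice[i] = 0                           # unbind to solution
            elif i + 1 < n and not lattice[i + 1] and rng.random() < p_hop:
                lattice[i], lattice[i + 1] = 0, 1        # hop toward plus end
        elif rng.random() < p_on:
            lattice[i] = 1                               # bind from solution

lattice = [0] * 100                                      # empty filament
for _ in range(500):
    sweep(lattice)
density = sum(lattice) / len(lattice)
```

Because the plus end is closed, motors pile up there while the bulk relaxes toward the binding/unbinding balance, a crude version of the end accumulation the paper analyzes.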
Detecting overlapping coding sequences in virus genomes
Directory of Open Access Journals (Sweden)
Brown Chris M
2006-02-01
Background: Detecting new coding sequences (CDSs) in viral genomes can be difficult for several reasons. The typically compact genomes often contain a number of overlapping coding and non-coding functional elements, which can result in unusual patterns of codon usage; conservation between related sequences can be difficult to interpret – especially within overlapping genes; and viruses often employ non-canonical translational mechanisms – e.g. frameshifting, stop codon read-through, leaky scanning and internal ribosome entry sites – which can conceal potentially coding open reading frames (ORFs). Results: In a previous paper we introduced a new statistic – MLOGD (Maximum Likelihood Overlapping Gene Detector) – for detecting and analysing overlapping CDSs. Here we present (a) an improved MLOGD statistic, (b) a greatly extended suite of software using MLOGD, (c) a database of results for 640 virus sequence alignments, and (d) a web-interface to the software and database. Tests show that, from an alignment with just 20 mutations, MLOGD can discriminate non-overlapping CDSs from non-coding ORFs with a typical accuracy of up to 98%, and can detect CDSs overlapping known CDSs with a typical accuracy of 90%. In addition, the software produces a variety of statistics and graphics, useful for analysing an input multiple sequence alignment. Conclusion: MLOGD is an easy-to-use tool for virus genome annotation, detecting new CDSs – in particular overlapping or short CDSs – and for analysing overlapping CDSs following frameshift sites. The software, web-server, database and supplementary material are available at http://guinevere.otago.ac.nz/mlogd.html.
Numerical properties of staggered overlap fermions
de Forcrand, Philippe; Panero, Marco
2010-01-01
We report the results of a numerical study of staggered overlap fermions, following the construction of Adams which reduces the number of tastes from 4 to 2 without fine-tuning. We study the sensitivity of the operator to the topology of the gauge field, its locality and its robustness to fluctuations of the gauge field. We make a first estimate of the computing cost of a quark propagator calculation, and compare with Neuberger's overlap.
Bins Approach for CBIR by Shifting the Histogram to Lower Intensities using proposed polynomials
Directory of Open Access Journals (Sweden)
H. B. Kekre
2012-09-01
Full Text Available This paper describes a novel approach to feature extraction for CBIR systems. It also suggests the use of newly designed polynomial functions to modify the image histogram so that the results of the CBIR system can be improved. To support this suggestion, multiple polynomial functions have been tried, from which the best polynomial can be selected to modify the histogram for feature extraction. This gives better performance for content-based image retrieval. Separate histograms are obtained for each of the three color planes of the image so that the color information can be handled separately. These histograms are then divided into two equal parts by calculating the centre of gravity. This division of the R, G and B histograms into two parts leads to the generation of eight bins, designed to hold different types of information such as 'Count of pixels', 'Total of intensities' and 'Average of intensities'. The work includes a set of three polynomial functions used to modify the histograms. By varying the polynomial function and the type of information used to represent the eight-bin feature vector, multiple feature vector databases were generated: two bin-content types (Total and Average of intensities) for each of the three polynomial functions and three colors give 2x3x3 = 18 databases, and Count of pixels for each polynomial function adds 3 more, so in total 18 + 3 = 21 feature vector databases are prepared for system testing. To demonstrate the performance of the system we have used a database of 2000 BMP images from 20 different classes, where each class has 100 images. 200 images are selected randomly, 10 from each of the 20 classes, to be given as queries to the system. To compare the database and query image feature vectors and facilitate retrieval, three similarity measures are used, namely Euclidean
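The bin-formation step above lends itself to a short sketch. The following is a minimal, hypothetical reading of the eight-bin scheme (the function name and the use of the per-channel intensity mean as the "centre of gravity" are assumptions, not the authors' exact procedure): each of the R, G, B histograms is split at its centre of gravity, and every pixel's three above/below flags index one of 2x2x2 = 8 bins, here filled with "Count of pixels".

```python
import numpy as np

def eight_bin_feature(img):
    """Hypothetical sketch of the 8-bin scheme: split each of the R, G, B
    histograms at its centre of gravity (here approximated by the channel
    mean), then assign every pixel to one of 2**3 = 8 bins according to
    which half of each channel histogram it falls in."""
    img = np.asarray(img, dtype=np.float64)      # H x W x 3
    cg = img.reshape(-1, 3).mean(axis=0)         # centre of gravity per channel
    above = (img >= cg).astype(int)              # 0/1 flag per channel
    bin_idx = above[..., 0] * 4 + above[..., 1] * 2 + above[..., 2]
    return np.bincount(bin_idx.ravel(), minlength=8)

# toy 2x2 RGB image
img = [[[0, 0, 0], [255, 255, 255]],
       [[0, 255, 0], [255, 0, 255]]]
feat = eight_bin_feature(img)
print(feat)
```

The "Total of intensities" and "Average of intensities" variants would replace the final `np.bincount` with per-bin sums or means of the pixel intensities.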
Temporal Proximity Promotes Integration of Overlapping Events.
Zeithamova, Dagmar; Preston, Alison R
2017-03-02
Events with overlapping elements can be encoded as two separate representations or linked into an integrated representation; yet, we know little about the conditions that promote one form of representation over the other. Here, we tested the hypothesis that the proximity of overlapping events would increase the probability of integration. Participants first established memories for house-object and face-object pairs; half of the pairs were learned 24 hr before an fMRI session, and the other half 30 min before the session. During scanning, participants encoded object-object pairs that overlapped with the initial pairs acquired on the same or prior day. Participants were also scanned as they made inference judgments about the relationships among overlapping pairs learned on the same or different day. Participants were more accurate and faster when inferring relationships among memories learned on the same day relative to those acquired across days, suggesting that temporal proximity promotes integration. Evidence for reactivation of existing memories, as measured by a visual content classifier, was equivalent during encoding of overlapping pairs from the two temporal conditions. In contrast, evidence for integration, as measured by a mnemonic strategy classifier from an independent study [Richter, F. R., Chanales, A. J. H., & Kuhl, B. A. Predicting the integration of overlapping memories by decoding mnemonic processing states during learning. Neuroimage, 124, 323-335, 2016], was greater for same-day overlapping events, paralleling the behavioral results. During inference itself, activation patterns further differentiated when participants were making inferences about events acquired on the same day versus across days. These findings indicate that temporal proximity of events promotes integration and further influences the neural mechanisms engaged during inference.
Novel 3-D Object Recognition Methodology Employing a Curvature-Based Histogram
Directory of Open Access Journals (Sweden)
Liang-Chia Chen
2013-07-01
Full Text Available In this paper, a new object recognition algorithm employing a curvature-based histogram is presented. Recognition of three-dimensional (3-D) objects from range images remains one of the most challenging problems in 3-D computer vision due to noisy and cluttered scene characteristics. The key breakthroughs for this problem mainly lie in defining unique features that distinguish the similarity among various 3-D objects. In our approach, an object detection scheme is developed to identify targets through an automated search in the range images, using an initial process of object segmentation to subdivide all possible objects in the scenes and then applying a process of object recognition based on geometric constraints and a curvature-based histogram. The developed method has been verified through experimental tests confirming its feasibility.
Saha, Satadal; Nasipuri, Mita; Basu, Dipak Kumar
2010-01-01
Automatic License Plate Recognition is a challenging area of research nowadays, and binarization is an integral and most important part of it. In real-life scenarios, most existing methods fail to properly binarize the image of a vehicle on a congested road captured through a CCD camera. In the current work we have applied the histogram equalization technique over the complete image and also over different hierarchies of image partitioning. A novel scheme is formulated for assigning a membership value to each pixel for each hierarchy of histogram equalization. The image is then binarized depending on the net membership value of each pixel. The technique is exhaustively evaluated on the vehicle image dataset as well as the license plate dataset, giving satisfactory performance.
3D facial expression recognition based on histograms of surface differential quantities
Li, Huibin
2011-01-01
3D face models accurately capture facial surfaces, making it possible for precise description of facial activities. In this paper, we present a novel mesh-based method for 3D facial expression recognition using two local shape descriptors. To characterize shape information of the local neighborhood of facial landmarks, we calculate the weighted statistical distributions of surface differential quantities, including histogram of mesh gradient (HoG) and histogram of shape index (HoS). Normal cycle theory based curvature estimation method is employed on 3D face models along with the common cubic fitting curvature estimation method for the purpose of comparison. Based on the basic fact that different expressions involve different local shape deformations, the SVM classifier with both linear and RBF kernels outperforms the state of the art results on the subset of the BU-3DFE database with the same experimental setting. © 2011 Springer-Verlag.
High Capacity Reversible Watermarking for Audio by Histogram Shifting and Predicted Error Expansion
Directory of Open Access Journals (Sweden)
Fei Wang
2014-01-01
Full Text Available Being reversible, the watermarking information embedded in audio signals can be extracted while the original audio data achieves lossless recovery. Currently, the few reversible audio watermarking algorithms are confronted with the following problems: relatively low SNR (signal-to-noise ratio) of embedded audio; a large amount of auxiliary embedded location information; and the absence of accurate capacity control capability. In this paper, we present a novel reversible audio watermarking scheme based on improved prediction error expansion and histogram shifting. First, we use a differential evolution algorithm to optimize prediction coefficients and then apply prediction error expansion to output stego data. Second, in order to reduce the location map bit length, we introduce a histogram shifting scheme. Meanwhile, the prediction error modification threshold for a given embedding capacity can be computed by our proposed scheme. Experiments show that this algorithm improves the SNR of embedded audio signals and the embedding capacity, drastically reduces the location map bit length, and enhances capacity control capability.
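To make the histogram-shifting idea concrete, here is a deliberately simplified toy version: it shifts the plain sample histogram rather than the paper's prediction errors, and it assumes the chosen value above the peak is genuinely unused, whereas a real scheme must carry a location map for overflow cases, as the abstract discusses. The function names and the 8-sample signal are invented for illustration.

```python
import numpy as np

def hs_embed(signal, bits):
    """Toy histogram-shifting embed: find the histogram peak p and a
    rarest (ideally empty) value z above it, shift samples in (p, z)
    right by 1 to free the slot p+1, then mark each peak sample with a
    bit: p encodes 0, p+1 encodes 1."""
    s = np.asarray(signal).copy()
    hist = np.bincount(s, minlength=256)
    p = int(hist.argmax())
    z = p + 1 + int(np.argmin(hist[p + 1:]))  # assumed empty slot above the peak
    s[(s > p) & (s < z)] += 1                 # open the slot p+1
    peaks = np.flatnonzero(s == p)
    for i, b in enumerate(bits):
        s[peaks[i]] += b
    return s, p, z

def hs_extract(s, p, z, n):
    """Read the first n marked samples, then undo the shift losslessly."""
    s = np.asarray(s).copy()
    marked = np.flatnonzero((s == p) | (s == p + 1))[:n]
    bits = [int(s[i] == p + 1) for i in marked]
    s[(s > p) & (s <= z)] -= 1                # undo shift and bit marks
    return bits, s

sig = np.array([10, 10, 11, 12, 10, 13, 10, 15])
stego, p, z = hs_embed(sig, [1, 0, 1])
bits, restored = hs_extract(stego, p, z, 3)
print(bits, bool(np.array_equal(restored, sig)))
```

The paper's scheme applies the same shift-and-mark idea to prediction errors (which concentrate near zero, giving a taller peak and hence more capacity) instead of raw samples.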
The GNAM monitoring system and the OHP histogram presenter for ATLAS
Zema, P F; Crosetti, G; Della Pietra, M; Dotti, A; Ferrari, R; Gaudio, G; Roda, R; Salvatore, D; Sarri, F; Vandelli, W; 14th IEEE-NPSS Real Time Conference 2005
2005-01-01
ATLAS is one of the four experiments under construction along the Large Hadron Collider at CERN. During the 2004 Combined Test Beam, GNAM and OHP were widely used for the online monitoring of both the hardware setup and the data quality. GNAM is a modular framework into which detector-specific code can be easily plugged to obtain complete online low-level monitoring applications. It is based on the monitoring tools provided by the ATLAS Trigger and Data Acquisition (TDAQ) software. OHP is a histogram presenter, capable of performing both as a configurable display and as a browser. From OHP, requests to execute simple interactive operations (such as reset, rebin or update) on histograms can be sent to GNAM.
Minimum Error Thresholding Segmentation Algorithm Based on 3D Grayscale Histogram
Directory of Open Access Journals (Sweden)
Jin Liu
2014-01-01
Full Text Available Threshold segmentation is a very important technique. The existing threshold algorithms do not work efficiently for noisy grayscale images. This paper proposes a novel algorithm called three-dimensional minimum error thresholding (3D-MET), which is used to solve the problem. The proposed approach is implemented by an optimal threshold discriminant based on relative entropy theory and the 3D histogram. The histogram comprises the gray distribution information of pixels and the relevant information of neighboring pixels in an image. Moreover, a fast recursive method is proposed to reduce the time complexity of 3D-MET from O(L^6) to O(L^3), where L stands for the number of gray levels. Experimental results demonstrate that the proposed approach can provide superior segmentation performance compared to other methods for gray image segmentation.
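The 3D criterion is easiest to grasp in its classical 1-D form. The sketch below implements the standard Kittler-Illingworth minimum-error criterion on a 1-D gray-level histogram; this is a simplification for illustration only, since the paper's contributions are the 3-D histogram and the fast recursion, neither of which is shown here.

```python
import numpy as np

def min_error_threshold(hist):
    """Classic 1-D minimum-error threshold: model each side of the
    threshold as a Gaussian and minimize the Kittler-Illingworth
    criterion J(t). hist[i] = count of pixels with gray level i."""
    hist = np.asarray(hist, dtype=np.float64)
    p = hist / hist.sum()
    levels = np.arange(len(p))
    best_t, best_J = None, np.inf
    for t in range(1, len(p) - 1):
        P1, P2 = p[:t].sum(), p[t:].sum()
        if P1 < 1e-9 or P2 < 1e-9:
            continue
        m1 = (levels[:t] * p[:t]).sum() / P1
        m2 = (levels[t:] * p[t:]).sum() / P2
        v1 = (((levels[:t] - m1) ** 2) * p[:t]).sum() / P1
        v2 = (((levels[t:] - m2) ** 2) * p[t:]).sum() / P2
        if v1 < 1e-9 or v2 < 1e-9:
            continue                      # degenerate class, skip
        J = 1 + 2 * (P1 * np.log(np.sqrt(v1)) + P2 * np.log(np.sqrt(v2))) \
              - 2 * (P1 * np.log(P1) + P2 * np.log(P2))
        if J < best_J:
            best_J, best_t = J, t
    return best_t

# synthetic bimodal histogram over 16 gray levels, modes near 3 and 12
hist = [1, 5, 20, 40, 20, 5, 1, 0, 0, 1, 6, 25, 50, 25, 6, 1]
t = min_error_threshold(hist)
print(t)
```

The threshold lands in the valley between the two modes; 3D-MET evaluates an analogous discriminant over (pixel, neighborhood-mean, neighborhood-median)-style triples instead of single gray levels.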
Kernel Learning of Histogram of Local Gabor Phase Patterns for Face Recognition
Directory of Open Access Journals (Sweden)
Bineng Zhong
2008-06-01
Full Text Available This paper proposes a new face recognition method, named kernel learning of histogram of local Gabor phase patterns (K-HLGPP), which is based on Daugman's method for iris recognition and the local XOR pattern (LXP) operator. Unlike traditional Gabor usage exploiting the magnitude part in face recognition, we encode the Gabor phase information for face classification by the quadrant bit coding (QBC) method. Two schemes are proposed for face recognition. One is based on the nearest-neighbor classifier with chi-square as the similarity measurement, and the other performs kernel discriminant analysis on HLGPP (K-HLGPP) using histogram intersection and Gaussian-weighted chi-square kernels. The comparative experiments show that K-HLGPP achieves a higher recognition rate than other well-known face recognition systems on the large-scale standard FERET, FERET200, and CAS-PEAL-R1 databases.
Multi-stream LSTM-HMM decoding and histogram equalization for noise robust keyword spotting.
Wöllmer, Martin; Marchi, Erik; Squartini, Stefano; Schuller, Björn
2011-09-01
Highly spontaneous, conversational, and potentially emotional and noisy speech is known to be a challenge for today's automatic speech recognition (ASR) systems, which highlights the need for advanced algorithms that improve speech features and models. Histogram equalization is an efficient method to reduce the mismatch between clean and noisy conditions by normalizing all moments of the probability distribution of the feature vector components. In this article, we propose to combine histogram equalization and multi-condition training for robust keyword detection in noisy speech. To better cope with conversational speaking styles, we show how contextual information can be effectively exploited in a multi-stream ASR framework that dynamically models context-sensitive phoneme estimates generated by a long short-term memory neural network. The proposed techniques are evaluated on the SEMAINE database, a corpus containing emotionally colored conversations with a cognitive system for "Sensitive Artificial Listening".
Directory of Open Access Journals (Sweden)
Dr. T. V. S. Prasad Gupta
2014-10-01
Full Text Available Due to degradation of the observed image, a noisy, blurred or distorted image can occur. For restoring image information, the sparse representations of conventional models may not be accurate enough for a faithful reconstruction of the original image. To improve the performance of sparse representation-based image restoration, the sparse coding noise is modeled, and suppressing it recovers the sparse coefficients of the original image. The so-called nonlocally centralized sparse representation (NCSR) model is as simple as the standard sparse representation model. For denoising, we use a histogram clipping method; the histogram-based sparse representation effectively reduces the noise, and a TMR filter is implemented for image quality. Various types of image restoration problems, including denoising, deblurring and super-resolution, validate the generality and state-of-the-art performance of the proposed algorithm.
Automatic Contrast Enhancement of Brain MR Images Using Hierarchical Correlation Histogram Analysis.
Chen, Chiao-Min; Chen, Chih-Cheng; Wu, Ming-Chi; Horng, Gwoboa; Wu, Hsien-Chu; Hsueh, Shih-Hua; Ho, His-Yun
Parkinson's disease is a progressive neurodegenerative disorder that has a higher probability of occurrence in middle-aged and older adults than in the young. With the use of a computer-aided diagnosis (CAD) system, abnormal cell regions can be identified, and this identification can help medical personnel to evaluate the chance of disease. This study proposes a hierarchical correlation histogram analysis based on the grayscale distribution degree of pixel intensity; by constructing a correlation histogram, it improves adaptive contrast enhancement for specific objects. The proposed method produces significant results during contrast enhancement preprocessing and facilitates subsequent CAD processes, thereby reducing recognition time and improving accuracy. The experimental results show that the proposed method is superior to existing methods according to two quantitative image estimation measures, PSNR and average gradient value. Furthermore, the edge information pertaining to specific cells can effectively increase the accuracy of the results.
Pan, W; Coatrieux, G; Cuppens, N; Cuppens, F; Roux, Ch
2011-01-01
In this article, we present a novel reversible watermarking scheme. Its originality lies in identifying parts of the image that can be watermarked additively with the most adapted lossless modulation: Pixel Histogram Shifting (PHS) or Dynamical Error Histogram Shifting (DEHS). This classification process makes use of a reference image derived from the image itself, a prediction of it, which has the property of being invariant to the watermark addition. In that way, the watermark embedder and reader remain synchronized through this reference image. DEHS is also an original contribution of this work: it shifts prediction errors between the image and its reference image while taking care of the local specificities of the image, thus dynamically. Experiments conducted on different medical image test sets from different modalities and on some natural images show that our method can insert more data with lower distortion than the most recent and efficient methods in the literature.
Recovery of the histogram of hourly ozone distribution from weekly average concentrations.
Olcese, Luis E; Toselli, Beatriz M
2006-05-01
A simple method is presented for estimating hourly distribution of air pollutants, based on data collected by passive sensors on a weekly or bi-weekly basis with no need for previous measurements at a site. In order for this method to be applied to locations where no hourly records are available, reference data from other sites are required to generate calibration histograms. The proposed procedure allows one to obtain the histogram of hourly ozone values during a given week with an error of about 30%, which is good considering the simplicity of this approach. This method can be a valuable tool for sites that lack previous hourly records of pollutant ambient concentrations, where it can be used to verify compliance with regulations or to estimate the AOT40 index with an acceptable degree of exactitude.
Histograms in heavy-quark QCD at finite temperature and density
Saito, H; Aoki, S; Kanaya, K; Nakagawa, Y; Ohno, H; Okuno, K; Umeda, T
2013-01-01
We study the phase structure of lattice QCD with heavy quarks at finite temperature and density by a histogram method. We determine the location of the critical point at which the first-order deconfining transition in the heavy-quark limit turns into a crossover at intermediate quark masses through a change of the shape of the histogram under variation of coupling parameters. We estimate the effect of the complex phase factor which causes the sign problem at finite density, and show that, in heavy-quark QCD, the effect is small around the critical point. We determine the critical surface in 2+1 flavor QCD in the heavy-quark region at all values of the chemical potential μ, including μ = ∞.
Wavelet fractal character of overlapping signal
Institute of Scientific and Technical Information of China (English)
(author not listed)
2003-01-01
A new method, based on combining wavelet theory with fractal theory and named the wavelet fractal peak position method (WFPPM), is introduced to extract the number of components and the relevant peak positions from overlapping signals in chemistry. The overlapping signal is first transformed into continuous wavelet transform values in the time domain over a certain dilation range via the continuous wavelet transform (CWT), and then changed into capacity dimensions (Dc). The number of components and the relevant positions of overlapping peaks can be identified easily according to the change of Dc. An investigation concerning the influence of different dilation ranges on the peak positions extracted by WFPPM is also provided. Studies show that WFPPM is an efficient tool for extracting peak positions and identifying the number of peaks from unresolved signals, even when the overlap is significantly serious. Relative errors of less than 1.0% in peak position are found when WFPPM is used in the processing of the cadmium(II)-indium(III) mixture system. The analytical results demonstrate that the desired peak positions can be extracted conveniently, accurately and rapidly from an unresolved signal via WFPPM. Further development and applications of WFPPM in resolving overlapping signals can be expected in the near future.
Sheck, L E; Muirhead, K A; Horan, P K
1980-09-01
Cell sorting and tritiated thymidine autoradiography were used to define the distribution of S phase cells in flow cytometric DNA histograms obtained from exponential mouse lymphoma cells (L5178Y). The numbers of labeled S phase cells, autoradiographically determined from cells sorted at 2-channel intervals in the G1/early S and late S/G2M regions of the histogram, were compared with the numbers of computed S phase cells in comparable 2-channel intervals as predicted by several computer algorithms used to extract cell cycle phase distributions from DNA histograms. Polynomial and multirectangle algorithms gave computed estimates of total %S in close agreement with the tritiated thymidine labeling index for the cell population, while multi-Gaussian algorithms underestimated %S. Interval autoradiographic and algorithm studies confirmed these results in that no significant differences were found between the autoradiographic S phase distribution and S phase distributions calculated by the polynomial and multirectangle models. However, S phase cells were significantly underestimated in G1/early S by a constrained multi-Gaussian model and in both G1/early S and late S/G2 by an unconstrained multi-Gaussian model. For the particular cell line (L5178Y), staining protocol (mithramycin following ethanol fixation) and instrumentation (Coulter TPS-2 cell sorter) used in this study, close agreement between computed %S and tritiated thymidine labeling index was found to be a reliable indicator of an algorithm's success in resolving S phase cells in the G1/S and S/G2 transition regions of the DNA histograms.
Benign-Malignant Lung Nodule Classification with Geometric and Appearance Histogram Features
Shewaye, Tizita Nesibu; Mekonnen, Alhayat Ali
2016-01-01
Lung cancer accounts for the highest number of cancer deaths globally. Early diagnosis of lung nodules is very important to reduce the mortality rate of patients by improving the diagnosis and treatment of lung cancer. This work proposes an automated system to classify lung nodules as malignant and benign in CT images. It presents extensive experimental results using a combination of geometric and histogram lung nodule image features and different linear and non-linear discriminant classifier...
Directory of Open Access Journals (Sweden)
I. O. Zharinov
2016-05-01
Full Text Available We consider the problem of evaluating chromaticity coordinate increments for an image displayed by indicating means (liquid crystal panels, etc.). The display device profile, specified by the weight matrix of the primary color components, serves as the basic data for the quantitative calculation. The research results take the form of mathematical expressions allowing calculation of the increment values of the chromaticity coordinates of the image displayed on the indicating means, and of histograms of the increment distribution.
On the characteristic form of histograms appearing at the culmination of solar eclipse
Shnoll, S E
2006-01-01
As shown in a number of our works, the form of histograms - distributions of amplitude fluctuations - varies regularly in time, with these variations being similar for processes of any nature, from biochemical reactions to noise in the gravitational antenna and all types of the radioactive decay. In particular, we have revealed basic laws, suggesting a cosmo-physical nature of these phenomena, in the time series created by the noise generators of the global GCP net. On the basis of all the results obtained, a conclusion has been made that the histogram form is determined by fluctuations of the spacetime, which depend on the movement of the measured system (laboratory) relative to the heavenly bodies. An important step to understand the nature of these phenomena was the finding that at the moments of the new Moon, a specific histogram form appears practically simultaneously at different geographical points, from Arctic to Antarctic, in middle latitudes of West and East hemispheres. This effect seems to be not ...
Image Enhancement of CCTV Screen Captures Using the Histogram Equalization Method
Directory of Open Access Journals (Sweden)
Hendro Nugroho
2017-05-01
Full Text Available Closed Circuit Television (CCTV) cameras are widely used today, especially in companies and industry, in shops, and in other strategic places. With a CCTV camera, the security of such a place can be assured. A shortcoming of CCTV cameras, however, is that when a room is poorly lit (dark), the recorded objects are not captured well. To overcome this problem, histogram equalization can be used to provide image quality improvement (image enhancement). The images used for image enhancement as test cases in this research are CCTV screen captures; each screen capture passes through image processing stages to produce a good quality improvement. Five CCTV screen capture images were collected. The image enhancement system works as follows: the CCTV screen capture is preprocessed to a size of 200x260 pixels in BMP format, the image is then converted to grayscale to obtain uniform pixel values, and next the image is histogram-equalized so that the intensities become uniform. The image quality level is tested using histogram-based texture extraction based on mean intensity and standard deviation.
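The histogram-equalization step at the core of this record is the textbook global method, sketched below for an 8-bit grayscale image (the toy 2x4 patch is invented to show the contrast stretch; the paper's actual inputs are 200x260 CCTV captures):

```python
import numpy as np

def equalize(gray):
    """Global histogram equalization for an 8-bit grayscale image: map
    each gray level through the normalized cumulative histogram so the
    intensities spread over the full 0..255 range."""
    gray = np.asarray(gray, dtype=np.uint8)
    hist = np.bincount(gray.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]                     # first nonzero CDF value
    lut = np.round((cdf - cdf_min) / (gray.size - cdf_min) * 255)
    lut = np.clip(lut, 0, 255).astype(np.uint8)
    return lut[gray]

# dark, low-contrast patch: levels clustered in 50..53
img = np.array([[50, 50, 51, 51],
                [52, 52, 53, 50]], dtype=np.uint8)
out = equalize(img)
print(out.min(), out.max())
```

After equalization the narrow 50..53 band is stretched to span the full dynamic range, which is exactly the effect sought for dark CCTV frames.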
Temporal evolution for the phase histogram of ECG during human ventricular fibrillation
Wu, Ming-Chya; Struzik, Zbigniew R.; Watanabe, Eiichi; Yamamoto, Yoshiharu; Hu, Chin-Kun
2007-07-01
A novel approach to momentary/instantaneous morphological assessment of phase histograms, extending phase statistics analysis, is used to investigate electrocardiograms during ventricular fibrillation (VF) in humans. By using empirical mode decomposition (EMD) and the Hilbert transform, we calculate the instantaneous phase of intrinsic mode functions (IMFs) in Holter data from 16 individuals, and construct the corresponding momentary phase histograms, enabling us to inspect the evolution of the waveform of the time series. A measure defined as the difference between the integrals of the probability distribution density of phase in different regions is then used to characterize the morphology of the momentary histograms and their temporal evolution. We find that the measure of morphology difference allows near perfect classification of the VF data into survivor and non-survivor groups. The technique offers a new possibility to improve the effectiveness of intervention in defibrillation treatment and limit the negative side effects of unnecessary interventions. The approach can be implemented in real time and should provide a useful method for early evaluation of (fatal) VF.
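The phase-extraction step above (Hilbert transform of an intrinsic mode function, then a phase histogram) can be sketched with a plain FFT-based analytic signal. The pure 5 Hz tone below stands in for an IMF and is not ECG data; the morphology measure on survivor/non-survivor histograms is not reproduced here.

```python
import numpy as np

def instantaneous_phase(x):
    """Instantaneous phase via the analytic signal, built directly with
    the FFT: zero the negative frequencies, double the positive ones,
    inverse-transform, and take the angle."""
    x = np.asarray(x, dtype=np.float64)
    N = len(x)
    X = np.fft.fft(x)
    h = np.zeros(N)
    h[0] = 1.0
    if N % 2 == 0:
        h[N // 2] = 1.0
        h[1:N // 2] = 2.0
    else:
        h[1:(N + 1) // 2] = 2.0
    return np.angle(np.fft.ifft(X * h))

t = np.arange(0, 2, 0.01)                      # 2 s sampled at 100 Hz
phase = instantaneous_phase(np.cos(2 * np.pi * 5 * t))
hist, _ = np.histogram(phase, bins=8, range=(-np.pi, np.pi))
dphi = np.diff(np.unwrap(phase))               # constant phase velocity for a pure tone
print(np.allclose(dphi, np.pi / 10), int(hist.sum()))
```

For a pure tone the unwrapped phase advances by a constant step per sample and the phase histogram is close to uniform; the paper's measure compares how the mass of such momentary histograms redistributes over time during VF.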
Digital image classification with the help of artificial neural network by simple histogram.
Dey, Pranab; Banerjee, Nirmalya; Kaur, Rajwant
2016-01-01
Visual image classification is a great challenge to the cytopathologist in routine day-to-day work. Artificial neural network (ANN) may be helpful in this matter. In this study, we have tried to classify digital images of malignant and benign cells in effusion cytology smear with the help of simple histogram data and ANN. A total of 404 digital images consisting of 168 benign cells and 236 malignant cells were selected for this study. The simple histogram data was extracted from these digital images and an ANN was constructed with the help of Neurointelligence software [Alyuda Neurointelligence 2.2 (577), Cupertino, California, USA]. The network architecture was 6-3-1. The images were classified as training set (281), validation set (63), and test set (60). The on-line backpropagation training algorithm was used for this study. A total of 10,000 iterations were done to train the ANN system with the speed of 609.81/s. After the adequate training of this ANN model, the system was able to identify all 34 malignant cell images and 24 out of 26 benign cells. The ANN model can be used for the identification of the individual malignant cells with the help of simple histogram data. This study will be helpful in the future to identify malignant cells in unknown situations.
Control system of hexacopter using color histogram footprint and convolutional neural network
Ruliputra, R. N.; Darma, S.
2017-07-01
The development of unmanned aerial vehicles (UAVs) has been growing rapidly in recent years. Logical decision-making implemented in the program's algorithms is needed to make a smart system. Using visual input from a camera, a UAV is able to fly autonomously by detecting a target. However, a weakness arises outdoors, where the environment may change the target's color intensity. The color histogram footprint overcomes this problem because it divides color intensity into separate bins, making the detection tolerant to slight changes of color intensity. Template matching compares the detection result with a template of the reference image to determine the target position and uses it to position the vehicle in the middle of the target, with visual feedback control based on a Proportional-Integral-Derivative (PID) controller. The color histogram footprint method localizes the target by calculating the back projection of its histogram. It has an average success rate of 77% from a distance of 1 meter. It can position itself in the middle of the target by using visual feedback control with an average positioning time of 73 seconds. After the hexacopter is in the middle of the target, a Convolutional Neural Network (CNN) classifies a number contained in the target image to determine a task depending on the classified number: either landing, yawing, or return to launch. The recognition result shows an optimum success rate of 99.2%.
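The histogram back-projection step that localizes the target can be sketched in a few lines. This is a generic illustration, not the authors' implementation: integer bin indices stand in for quantized colors, and the function name is invented.

```python
import numpy as np

def back_projection(target_bins, frame_bins, n_bins=16):
    """Color-histogram back projection: build a normalized bin histogram
    from the target (template) region, then score every frame pixel by
    its bin's weight in that histogram. High scores localize the target."""
    hist = np.bincount(target_bins.ravel(), minlength=n_bins).astype(float)
    hist /= hist.max()                    # normalize so the best bin scores 1.0
    return hist[frame_bins]               # per-pixel likelihood map

target = np.array([[3, 3], [3, 5]])       # template region, mostly color-bin 3
frame = np.array([[0, 3, 3], [5, 0, 3]])  # quantized frame
score = back_projection(target, frame)
print(score)
```

The peak of the score map (optionally smoothed) gives the target position that the PID loop then centers on.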
Efficient descriptor of histogram of salient edge orientation map for finger vein recognition.
Lu, Yu; Yoon, Sook; Xie, Shan Juan; Yang, Jucheng; Wang, Zhihui; Park, Dong Sun
2014-07-10
Finger vein images are rich in orientation and edge features. Inspired by the edge histogram descriptor proposed in MPEG-7, this paper presents an efficient orientation-based local descriptor, named histogram of salient edge orientation map (HSEOM). HSEOM is based on the fact that human vision is sensitive to edge features for image perception. For a given image, HSEOM first finds oriented edge maps according to predefined orientations using a well-known edge operator and obtains a salient edge orientation map by choosing an orientation with the maximum edge magnitude for each pixel. Then, subhistograms of the salient edge orientation map are generated from the nonoverlapping submaps and concatenated to build the final HSEOM. In the experiment of this paper, eight oriented edge maps were used to generate a salient edge orientation map for HSEOM construction. Experimental results on our available finger vein image database, MMCBNU_6000, show that the performance of HSEOM outperforms that of state-of-the-art orientation-based methods (e.g., Gabor filter, histogram of oriented gradients, and local directional code). Furthermore, the proposed HSEOM has advantages of low feature dimensionality and fast implementation for a real-time finger vein recognition system.
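A minimal pure-Python sketch of the HSEOM pipeline just described, under stated assumptions: four orientations instead of the paper's eight, simple Sobel-style 3×3 kernels standing in for the unnamed "well-known edge operator", and a 2×2 grid of non-overlapping submaps:

```python
# Illustrative HSEOM-style descriptor (assumptions: 4 orientations, 3x3 kernels).
KERNELS = [
    [[-1, -2, -1], [0, 0, 0], [1, 2, 1]],   # horizontal edge
    [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]],   # vertical edge
    [[0, 1, 2], [-1, 0, 1], [-2, -1, 0]],   # 45-degree edge
    [[-2, -1, 0], [-1, 0, 1], [0, 1, 2]],   # 135-degree edge
]

def salient_orientation_map(img):
    """Per pixel, the index of the orientation with maximal |edge response|."""
    h, w = len(img), len(img[0])
    omap = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            resp = [abs(sum(k[dy + 1][dx + 1] * img[y + dy][x + dx]
                            for dy in (-1, 0, 1) for dx in (-1, 0, 1)))
                    for k in KERNELS]
            omap[y][x] = max(range(len(KERNELS)), key=resp.__getitem__)
    return omap

def hseom(omap, n_orient=len(KERNELS), grid=2):
    """Concatenated orientation histograms of non-overlapping submaps."""
    h, w = len(omap), len(omap[0])
    feat = []
    for gy in range(grid):
        for gx in range(grid):
            hist = [0] * n_orient
            for y in range(gy * h // grid, (gy + 1) * h // grid):
                for x in range(gx * w // grid, (gx + 1) * w // grid):
                    hist[omap[y][x]] += 1
            feat.extend(hist)
    return feat
```

The final descriptor length is grid² × n_orient, which illustrates the low dimensionality the abstract highlights.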
Adaptive local backlight dimming algorithm based on local histogram and image characteristics
Nadernejad, Ehsan; Burini, Nino; Korhonen, Jari; Forchhammer, Søren; Mantel, Claire
2013-02-01
Liquid Crystal Displays (LCDs) with Light Emitting Diode (LED) backlights are a very popular display technology, used for instance in television sets, monitors and mobile phones. This paper presents a new backlight dimming algorithm that exploits the characteristics of the target image, such as the local histograms and the average pixel intensity of each backlight segment, to reduce the power consumption of the backlight and enhance image quality. The local histogram of the pixels within each backlight segment is calculated and, based on it, an adaptive quantile value is extracted. A classification into three classes based on the average luminance value is performed and, depending on the image luminance class, the extracted information on the local histogram determines the corresponding backlight value. The proposed method has been applied to two modeled screens: one with a high-resolution direct-lit backlight, and the other with 16 edge-lit backlight segments placed in two columns and eight rows. We have compared the proposed algorithm against several known backlight dimming algorithms by simulation; the results show that the proposed algorithm provides a better trade-off between power consumption and image quality preservation than the other algorithms representing the state of the art among feature-based backlight algorithms.
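The per-segment quantile idea can be sketched as follows. The class thresholds and quantile values below are illustrative placeholders, not the paper's tuned numbers:

```python
def backlight_level(segment_pixels, quantile):
    """Backlight value for one LED segment: the given quantile of the segment's
    pixel distribution (0-255), so a few bright outliers do not force full
    backlight power."""
    s = sorted(segment_pixels)
    return s[min(len(s) - 1, int(quantile * len(s)))]

def adaptive_quantile(avg_luminance):
    """Toy three-class rule (thresholds and quantiles are assumptions):
    darker segments tolerate a lower quantile, bright ones need a higher one."""
    if avg_luminance < 64:
        return 0.90
    if avg_luminance < 160:
        return 0.95
    return 0.99
```

Clipping to a quantile rather than the segment maximum is exactly where the power saving comes from; the residual clipping artifacts are what the image-quality metrics then have to weigh against that saving.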
Energy Technology Data Exchange (ETDEWEB)
Fiorino, Claudio; Maggiulli, Eleonora; Broggi, Sara; Cattaneo, Giovanni Mauro; Calandrino, Riccardo [Medical Physics, San Raffaele Scientific Institute, Milano (Italy); Liberini, Simone; Faggiano, Elena; Rizzo, Giovanna [Institute of Molecular Bioimaging and Physiology, CNR, Segrate (Italy); Dell' Oca, Italo; Di Muzio, Nadia, E-mail: fiorino.claudio@hsr.it [Radiotherapy, San Raffaele Scientific Institute, Milano (Italy)
2011-06-07
The Jacobian of the deformation field of elastic registration between images taken during radiotherapy is a measure of inter-fraction local deformation. The histogram of the Jacobian values (Jac) within an organ was introduced (JVH: Jacobian-volume-histogram) and first applied in quantifying parotid shrinkage. MVCTs of 32 patients previously treated with helical tomotherapy for head-neck cancers were collected. Parotid deformation was evaluated through elastic registration between MVCTs taken at the first and last fractions. Jac was calculated for each voxel of all parotids, and integral JVHs were calculated for each parotid; the correlation between the JVH and the planning dose-volume histogram (DVH) was investigated. On average, 82% (±17%) of the voxels shrink (Jac < 1) and 14% (±17%) show a local compression >50% (Jac < 0.5). The best correlation between the DVH and the JVH was found between V10 and V15, and Jac < 0.4-0.6 (p < 0.01). The best constraint predicting a higher number of strongly compressing voxels (Jac0.5 < 7.5%, median value) was V15 ≥ 75% (OR: 7.6, p = 0.002). Jac and the JVH are promising tools for scoring/modelling toxicity and for evaluating organ/contour variations, with potential applications in adaptive radiotherapy.
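The JVH is the deformation-field analogue of a DVH: for each threshold it reports the fraction of organ voxels whose Jacobian lies below that value. A minimal sketch:

```python
def jvh(jacobian_values, thresholds):
    """Integral Jacobian-volume-histogram: fraction of organ voxels with
    Jac below each threshold (Jac < 1 means local shrinkage)."""
    n = len(jacobian_values)
    return {t: sum(1 for j in jacobian_values if j < t) / n for t in thresholds}
```

With per-voxel Jacobians in hand, correlating e.g. the Jac < 0.5 fraction against planning-DVH points such as V15 is then an ordinary statistical exercise.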
Yamada, Kohei; Kizuka, Tokushi
2016-10-01
We observed the tensile deformation of molybdenum (Mo) nanocontacts (NCs) and simultaneously measured their conductance by in situ transmission electron microscopy. During deformation, the contact width decreased from several nanometers to a single-atom size. Mo NCs were thinned via a plastic flowlike deformation process. The process differs from the slip on lattice planes, which is frequently observed in NCs made of noble metals. We plotted histograms of the time-conductance traces measured during the tensile deformation of Mo NCs. In the conductance histograms, we observed peaks at 1.8G0 (G0 = 2e2/h, where e is the electron charge and h is Planck's constant), 3.6G0, and 4.4G0. When the minimum conductance (1.8G0) was measured, the minimum cross-sectional widths of the NCs were 3-7 atoms. These NCs exhibited relaxed structures that formed irregularly after the plastic flowlike deformation occurred in the final stage of the tensile process. We inferred that the aperiodic peaks observed in the conductance histograms originated from irregular variations in the contact areas and atomic configurations of the NCs during the plastic flowlike deformation. Moreover, the conductance value of the single-atom contacts was less than 0.1G0.
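The conductance histograms referred to here (and in break-junction work generally) are built by accumulating every sampled conductance value from every breaking trace, so that plateaus show up as peaks. A sketch, where the 0.25 G0 bin width and 5 G0 range are illustrative choices, not taken from the paper:

```python
def conductance_histogram(traces, bin_width=0.25, gmax=5.0):
    """All-trace conductance histogram in units of G0: counts from every
    sampled point of every time-conductance trace are accumulated, so
    conductance plateaus during stretching appear as histogram peaks."""
    nbins = int(gmax / bin_width)
    hist = [0] * nbins
    for trace in traces:
        for g in trace:
            if 0 <= g < gmax:
                hist[int(g / bin_width)] += 1
    return hist
```

Peaks at non-integer positions such as the 1.8 G0 value reported above then point to irregular contact geometries rather than clean single-atom quantization.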
Rojas-Lima, J. E.; Domínguez-Pacheco, A.; Hernández-Aguilar, C.; Cruz-Orea, A.
2016-09-01
Considering the necessity of photothermal alternative approaches for characterizing nonhomogeneous materials like maize seeds, the objective of this research work was to analyze statistically the amplitude variations of photopyroelectric signals, by means of nonparametric techniques such as the histogram and the kernel density estimator, and the probability density function of the amplitude variations of two genotypes of maize seeds with different pigmentations and structural components: crystalline and floury. To determine if the probability density function had a known parametric form, the histogram was determined, which did not present a known parametric form, so the kernel density estimator using the Gaussian kernel, with an efficiency of 95% in density estimation, was used to obtain the probability density function. The results obtained indicated that maize seeds could be differentiated in terms of the statistical values for floury and crystalline seeds such as the mean (93.11, 159.21), variance (1.64×10³, 1.48×10³), and standard deviation (40.54, 38.47) obtained from the amplitude variations of photopyroelectric signals in the case of the histogram approach. For the case of the kernel density estimator, seeds can be differentiated in terms of the kernel bandwidth or smoothing constant h of 9.85 and 6.09 for floury and crystalline seeds, respectively.
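The Gaussian kernel density estimator with smoothing constant h is a one-liner in principle; a self-contained sketch, with Silverman's rule of thumb included as one common bandwidth selector (the paper's exact selector is not stated in the abstract):

```python
import math

def gaussian_kde(samples, h):
    """Gaussian kernel density estimator with bandwidth (smoothing constant) h:
    a Gaussian bump of width h is centered on each sample and the bumps
    are averaged."""
    n = len(samples)
    c = 1.0 / (n * h * math.sqrt(2 * math.pi))
    def density(x):
        return c * sum(math.exp(-0.5 * ((x - s) / h) ** 2) for s in samples)
    return density

def silverman_bandwidth(samples):
    """Silverman's rule of thumb, a common way to choose h."""
    n = len(samples)
    mean = sum(samples) / n
    sd = math.sqrt(sum((s - mean) ** 2 for s in samples) / n)
    return 1.06 * sd * n ** -0.2
```

Unlike a histogram, the resulting density is smooth and has no bin-placement artifacts, which is why the two genotypes can be compared directly through their fitted bandwidths.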
Directory of Open Access Journals (Sweden)
H. B. Kekre
2012-09-01
This paper explores a novel idea for feature extraction based on a bins approach for CBIR. This work fulfills all the criteria of an efficient feature extraction method: dimensionality reduction, fast extraction and efficient retrieval. The main idea used in feature extraction is based on the histogram and the linear functions used to modify it. Each BMP image in the database is separated into R, G and B planes. We have calculated the histogram for each of them, which is modified using linear equations. These modified histograms are partitioned using the CG (center of gravity) so that the mass of intensities of the image pixels is distributed uniformly in two parts. This CG partitioning of the three image planes generates the eight bins. Information extracted from these bins is in the form of the first four statistical absolute moments, namely mean (MEAN), standard deviation (STD), skewness (SKEW) and kurtosis (KURTO). Each of these moments is computed separately for the R, G and B colors. This generates four types of feature vectors of dimension eight. A database of 2000 BMP images is used for the experimentation. Multiple feature vector databases are prepared as part of the preprocessing work of this CBIR system. Comparison of query and database feature vectors is carried out using three similarity measures, namely Euclidean distance (ED), Absolute distance (AD) and Cosine correlation distance (CD). To evaluate the retrieval efficiency of this system we have used three parameters: Precision-Recall Cross-over Point (PRCP), Longest String (LS) and Length of String to Retrieve all Relevant (LSRR).
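The CG partition and the resulting 8-bin addressing can be sketched directly; this is a simplified reading of the abstract (the histogram-modifying linear equations are omitted):

```python
def cg_split(hist):
    """Index that splits a histogram so each side holds ~half of the pixel
    mass: the 'CG' partitioning of the abstract."""
    total, cum = sum(hist), 0
    for i, c in enumerate(hist):
        cum += c
        if cum * 2 >= total:
            return i
    return len(hist) - 1

def bin_id(r, g, b, splits):
    """8-bin address: one bit per color plane, set when the pixel's value
    falls above that plane's CG split (2 x 2 x 2 = 8 bins)."""
    sr, sg, sb = splits
    return (r > sr) << 2 | (g > sg) << 1 | (b > sb)
```

Each pixel lands in one of the eight bins, and the four absolute moments of each bin's contents then form the dimension-eight feature vectors described above.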
Conductance histogram evolution of an EC-MCBJ fabricated Au atomic point contact
Energy Technology Data Exchange (ETDEWEB)
Yang Yang; Liu Junyang; Chen Zhaobin; Tian Jinghua; Jin Xi; Liu Bo; Yang Fangzu; Tian Zhongqun [State Key Laboratory of Physical Chemistry of Solid Surfaces and Department of Chemistry, College of Chemistry and Chemical Engineering, Xiamen University, Xiamen 361005 (China); Li Xiulan; Tao Nongjian [Center for Bioelectronics and Biosensors, Biodesign Institute, Department of Electrical Engineering, Arizona State University, Tempe, AZ 85287-6206 (United States); Luo Zhongzi; Lu Miao, E-mail: zqtian@xmu.edu.cn [Micro-Electro-Mechanical Systems Research Center, Pen-Tung Sah Micro-Nano Technology Institute, Xiamen University, Xiamen 361005 (China)
2011-07-08
This work presents a study of Au conductance quantization based on a combined electrochemical deposition and mechanically controllable break junction (MCBJ) method. We describe the microfabrication process and discuss improved features of our microchip structure compared to the previous one. The improved structure prolongs the available life of the microchip and also increases the success rate of the MCBJ experiment. Stepwise changes in the current were observed at the last stage of atomic point contact breakdown and conductance histograms were constructed. The evolution of the 1G0 peak height in conductance histograms was used to investigate the probability of formation of an atomic point contact. It has been shown that the success rate in forming an atomic point contact can be improved by decreasing the stretching speed and the degree to which the two electrodes are brought into contact. The repeated breakdown and formation over thousands of cycles led to a distinctive increase of the 1G0 peak height in the conductance histograms, and this increased probability of forming a single atomic point contact is discussed.
Sistem Pendeteksi Kualitas Daging Dengan Ekualisasi Histogram Dan Thresholding Berbasis Android [Android-based meat-quality detection system using histogram equalization and thresholding]
Directory of Open Access Journals (Sweden)
Anggit Sri Herlambang
2016-04-01
The increasing demand for beef is often exploited by beef sellers to commit fraud, usually with respect to the quality of the meat. Meat quality is determined by several parameters, including size, texture, color characteristics, odor, and others; these parameters are important factors in determining meat quality. Meat quality is generally judged using the sense of sight, so manual assessment remains subjective. This study aims to design a meat-quality detection application, using a sample of 20 test meat images. The Android-based meat-quality detection system with histogram equalization and thresholding was built using an Android-based programming language integrated with the Android SDK, Eclipse and the OpenCV library. The method uses histogram-equalization pre-processing and thresholding segmentation from image processing. Meat-quality detection is performed by computing statistical feature-extraction values from the meat image data of the study. The result of this research is the ability to determine the mean and standard deviation of the images processed with histogram equalization and thresholding, together with an analysis of beef image quality. Black-box testing of the application shows that all of its functions run as intended. This research is expected to support the next stage of the work.
Local Histogram of Figure/Ground Segmentations for Dynamic Background Subtraction
Directory of Open Access Journals (Sweden)
Bineng Zhong
2010-01-01
We propose a novel feature, the local histogram of figure/ground segmentations, for robust and efficient background subtraction (BGS) in dynamic scenes (e.g., waving trees, ripples in water, illumination changes, camera jitter). We represent each pixel as a local histogram of figure/ground segmentations, which aims at combining several candidate solutions produced by simple BGS algorithms to get a more reliable and robust feature for BGS. The background model of each pixel is constructed as a group of weighted adaptive local histograms of figure/ground segmentations, which describe the structural properties of the surrounding region. This is a natural fusion because multiple complementary BGS algorithms can be used to build background models for scenes. Moreover, the correlation of image variations at neighboring pixels is explicitly utilized to achieve robust detection performance, since neighboring pixels tend to be similarly affected by environmental effects (e.g., dynamic scenes). Experimental results demonstrate the robustness and effectiveness of the proposed method by comparison with four representatives of the state of the art in BGS.
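The per-pixel feature can be sketched as follows: each candidate BGS algorithm contributes a binary foreground mask, and the pixel is described by the foreground fraction of each mask in its neighborhood. This is a simplified reading of the abstract (the adaptive weighting of histograms is omitted):

```python
def local_fg_histogram(masks, y, x, radius=1):
    """Per-pixel fusion feature: for each candidate BGS algorithm's binary
    foreground mask, the fraction of foreground pixels in the window
    around (y, x). Concatenating the fractions fuses the candidates."""
    feat = []
    for m in masks:
        h, w = len(m), len(m[0])
        cnt = tot = 0
        for dy in range(-radius, radius + 1):
            for dx in range(-radius, radius + 1):
                yy, xx = y + dy, x + dx
                if 0 <= yy < h and 0 <= xx < w:
                    tot += 1
                    cnt += m[yy][xx]
        feat.append(cnt / tot)
    return feat
```

Because the window pools over neighboring pixels, isolated single-pixel disagreements between the candidate segmentations are smoothed away, which is the robustness argument made above.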
Energy Technology Data Exchange (ETDEWEB)
Yamamoto, Akira [Department of Diagnostic Imaging and Nuclear Medicine, Graduate School of Medicine, Kyoto University, 54 Kawahara-cho, Shogoin, Sakyo-ku, Kyoto-shi Kyoto 606-8507 (Japan)]. E-mail: yakira@kuhp.kyoto-u.ac.jp; Miki, Yukio [Department of Diagnostic Imaging and Nuclear Medicine, Graduate School of Medicine, Kyoto University, 54 Kawahara-cho, Shogoin, Sakyo-ku, Kyoto-shi Kyoto 606-8507 (Japan)]. E-mail: mikiy@kuhp.kyoto-u.ac.jp; Adachi, Souichi [Department of Pediatrics, Graduate School of Medicine, Kyoto University, 54 Kawahara-cho, Shogoin, Sakyo-ku, Kyoto-shi Kyoto 606-8507 (Japan)]. E-mail: sadachi@kuhp.kyoto-u.ac.jp (and others)
2006-03-15
Background and purpose: The purpose of this prospective study was to evaluate the hypothesis that magnetization transfer ratio (MTR) histogram analysis of the whole brain could detect early and subtle brain changes nonapparent on conventional magnetic resonance imaging (MRI) in children with acute lymphoblastic leukemia (ALL) receiving methotrexate (MTX) therapy. Materials and methods: Subjects in this prospective study comprised 10 children with ALL (mean age, 6 years; range, 0-16 years). In addition to conventional MRI, magnetization transfer images were obtained before and after intrathecal and intravenous MTX therapy. MTR values were calculated and plotted as a histogram, and peak height and location were calculated. Differences in peak height and location between pre- and post-MTX therapy scans were statistically analyzed. Conventional MRI was evaluated for abnormal signal area in white matter. Results: MTR peak height was significantly lower on post-MTX therapy scans than on pre-MTX therapy scans (p = 0.002). No significant differences in peak location were identified between pre- and post-chemotherapy imaging. No abnormal signals were noted in white matter on either pre- or post-MTX therapy conventional MRI. Conclusions: This study demonstrates that MTR histogram analysis allows better detection of early and subtle brain changes in ALL patients who receive MTX therapy than conventional MRI.
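The two whole-brain MTR histogram metrics compared pre- and post-therapy reduce to a simple computation; a sketch (the bin edges below are hypothetical):

```python
def histogram_peak(hist, bin_edges=None):
    """Peak height and peak location of an MTR histogram: height is the
    maximum (often volume-normalized) count, location the MTR value of the
    bin at which it occurs (bin midpoint when edges are given)."""
    peak = max(range(len(hist)), key=lambda i: hist[i])
    loc = peak if bin_edges is None else (bin_edges[peak] + bin_edges[peak + 1]) / 2
    return hist[peak], loc
```

A drop in peak height with an unchanged peak location, as reported here, suggests a diffuse broadening of the MTR distribution rather than a shift of normal tissue values.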
A Social Network Model Exhibiting Tunable Overlapping Community Structure
Liu, D.; Blenn, N.; Van Mieghem, P.F.A.
2012-01-01
Social networks, as well as many other real-world networks, exhibit overlapping community structure. In this paper, we present formulas which facilitate the computation for characterizing the overlapping community structure of networks. A hypergraph representation of networks with overlapping
The QCD vacuum probed by overlap fermions
Weinberg, V; Koller, K; Koma, Y; Schierholz, G; Streuer, T
2006-01-01
We summarize different uses of the eigenmodes of the Neuberger overlap operator for the analysis of the QCD vacuum, here applied to quenched configurations simulated by means of the Luescher-Weisz action. We describe the localization and chiral properties of the lowest modes. The overlap-based topological charge density (with and without UV-filtering) is compared with the results of UV-filtering for the field strength tensor. The latter allows us to identify domains of good (anti-)selfduality. All these techniques together lead to a dual picture of the vacuum, unifying the infrared instanton picture with the presence of singular defects co-existing at different scales.
Schwinger model simulations with dynamical overlap fermions
Bietenholz, W; Volkholz, J
2007-01-01
We present simulation results for the 2-flavour Schwinger model with dynamical overlap fermions. In particular we apply the overlap hypercube operator at seven light fermion masses. In each case we collect sizable statistics in the topological sectors 0 and 1. Since the chiral condensate Σ vanishes in the chiral limit, we observe densities for the microscopic Dirac spectrum which have not yet been addressed by Random Matrix Theory (RMT). Nevertheless, by confronting the averages of the lowest eigenvalues in different topological sectors with chiral RMT in the unitary ensemble we obtain -- for the very light fermion masses -- values for Σ that follow closely the analytical predictions in the continuum.
Schwinger model simulations with dynamical overlap fermions
Energy Technology Data Exchange (ETDEWEB)
Bietenholz, W. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Shcheredin, S. [Bielefeld Univ. (Germany). Fakultaet fuer Physik; Volkholz, J. [Humboldt-Universitaet, Berlin (Germany). Inst. fuer Physik
2007-11-15
We present simulation results for the 2-flavour Schwinger model with dynamical overlap fermions. In particular we apply the overlap hypercube operator at seven light fermion masses. In each case we collect sizable statistics in the topological sectors 0 and 1. Since the chiral condensate Σ vanishes in the chiral limit, we observe densities for the microscopic Dirac spectrum which have not yet been addressed by Random Matrix Theory (RMT). Nevertheless, by confronting the averages of the lowest eigenvalues in different topological sectors with chiral RMT in the unitary ensemble we obtain - for the very light fermion masses - values for Σ that follow closely the analytical predictions in the continuum.
On the SOR method with overlapping subsystems
Maleev, A. A.
2006-06-01
A description is given of the iterative Jacobi method with overlapping subsystems and the corresponding Gauss-Seidel method. Similarly to the classical case, a generalized SOR method with overlapping subsystems is constructed by introducing a relaxation parameter. The concept of an ω-consistent matrix is defined. It is shown that, with the optimal choice of the parameter, the theory developed by Young remains valid for ω-consistent matrices. This implies certain results for ω-consistent H-matrices. The theoretical conclusions obtained in the paper are supported by numerical results.
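For reference, the classical (non-overlapping) SOR iteration that this work generalizes can be sketched as follows; each unknown is moved a fraction ω past its Gauss-Seidel update:

```python
def sor(A, b, omega, iters=200, x=None):
    """Classical SOR sweep for A x = b (dense, pure Python): each unknown is
    relaxed using the newest available values, overshooting the Gauss-Seidel
    update by the relaxation parameter omega (0 < omega < 2)."""
    n = len(b)
    x = [0.0] * n if x is None else x[:]
    for _ in range(iters):
        for i in range(n):
            s = sum(A[i][j] * x[j] for j in range(n) if j != i)
            gs = (b[i] - s) / A[i][i]          # Gauss-Seidel value
            x[i] += omega * (gs - x[i])        # over-relaxed step
    return x
```

The overlapping-subsystem variant of the paper replaces the scalar updates by solves over (possibly shared) blocks of unknowns, with Young's optimal-ω theory carrying over for ω-consistent matrices.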
Overlapping community detection using weighted consensus clustering
Indian Academy of Sciences (India)
LINTAO YANG; ZETAI YU; JING QIAN; SHOUYIN LIU
2016-10-01
Many overlapping community detection algorithms have been proposed. Most of them are unstable and behave non-deterministically. In this paper, we use weighted consensus clustering for combining multiple base covers obtained by classic non-deterministic algorithms to improve the quality of the results. We first evaluate a reliability measure for each community in all base covers and assign a proportional weight to each one. Then we redefine the consensus matrix that takes into account not only the common membership of nodes, but also the reliability of the communities. Experimental results on both artificial and real-world networks show that our algorithm can find overlapping communities accurately.
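The weighted consensus matrix at the heart of this approach can be sketched as follows. For simplicity the sketch assigns one reliability weight per base cover rather than per community, a deliberate simplification of the paper's scheme:

```python
def weighted_consensus(covers, weights):
    """Weighted consensus matrix D: D[i][j] is the weighted fraction of base
    covers in which nodes i and j share at least one community. Reliability
    weights (here one per cover, a simplification of the per-community
    weights in the paper) damp the influence of unreliable runs."""
    nodes = sorted({v for cover in covers for com in cover for v in com})
    idx = {v: k for k, v in enumerate(nodes)}
    n, wtot = len(nodes), sum(weights)
    D = [[0.0] * n for _ in range(n)]
    for cover, w in zip(covers, weights):
        paired = set()
        for com in cover:                  # a cover is a list of communities
            for a in com:
                for b in com:
                    paired.add((a, b))     # count each pair once per cover
        for a, b in paired:
            D[idx[a]][idx[b]] += w / wtot
    return nodes, D
```

Running a final clustering on D (rather than on any single non-deterministic cover) is what stabilizes the detected overlapping communities.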
Using Model-based Overlapping Seed Expansion to detect highly overlapping community structure
McDaid, Aaron F
2010-01-01
As research into community finding in social networks progresses, there is a need for algorithms capable of detecting overlapping community structure. Many algorithms have been proposed in recent years that are capable of assigning each node to more than a single community. The performance of these algorithms tends to degrade when the ground-truth contains a more highly overlapping community structure, with nodes assigned to more than two communities. Such highly overlapping structure is likely to exist in many social networks, such as Facebook friendship networks. In this paper we present a scalable algorithm, MOSES, based on a statistical model of community structure, which is capable of detecting highly overlapping community structure, especially when there is variance in the number of communities each node is in. In evaluation on synthetic data MOSES is found to be superior to existing algorithms, especially at high levels of overlap. We demonstrate MOSES on real social network data by analyzing the netwo...
Zeng, Bangze; Zhu, Youpan; Li, Zemin; Hu, Dechao; Luo, Lin; Zhao, Deli; Huang, Juan
2014-11-01
Due to the low contrast, high noise and unclear visual effect of infrared images, targets are very difficult to observe and identify. This paper presents an improved infrared image detail enhancement algorithm based on adaptive histogram statistical stretching and gradient filtering (AHSS-GF). Based on the fact that the human eye is very sensitive to edges and lines, the authors propose to extract the details and textures using gradient filtering. A new histogram is acquired by summing the original histogram over a fixed window. With the minimum value as the cut-off point, histogram statistical stretching is carried out. After proper weights are given to the details and the background, the detail-enhanced result is acquired. The results indicate that image contrast can be improved and that details and textures can be enhanced effectively as well.
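The histogram-statistical-stretching half of AHSS-GF can be sketched as below. The window size is an assumption, and the gradient-filtering/weighting stages are omitted:

```python
def windowed_histogram(hist, win=4):
    """'Statistical' histogram: each entry is the sum of the raw histogram
    over a fixed window around it, smoothing isolated empty bins."""
    n = len(hist)
    return [sum(hist[max(0, i - win // 2):min(n, i + win // 2 + 1)])
            for i in range(n)]

def stretch_lut(hist, out_max=255):
    """Lookup table mapping the occupied grey-level range [lo, hi] linearly
    onto [0, out_max]: the statistical stretching step."""
    occupied = [i for i, c in enumerate(hist) if c > 0]
    lo, hi = occupied[0], occupied[-1]
    span = max(1, hi - lo)
    return [min(out_max, max(0, (i - lo) * out_max // span))
            for i in range(len(hist))]
```

Infrared scenes typically occupy a narrow slice of the grey-level range, so stretching that slice over the full output range is what restores global contrast before the gradient-extracted details are weighted back in.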
Improving Inversions of the Overlap Operator
Energy Technology Data Exchange (ETDEWEB)
Krieg, S. [Department of Physics, Universitaet Wuppertal, Gaussstrasse 19, Wuppertal (Germany); Cundy, N. [Department of Physics, Universitaet Wuppertal, Gaussstrasse 19, Wuppertal (Germany); Eshof, J. van den [Department of Mathematics, University of Duesseldorf (Germany); Frommer, A. [Department of Mathematics, Univeritaet Wuppertal (Germany); Lippert, Th. [John von Neumann Institute for Computing, Juelich Research Centre, 52425 Juelich (Germany); Schaefer, K. [Department of Mathematics, Univeritaet Wuppertal (Germany)
2005-03-15
We present relaxation and preconditioning techniques which accelerate the inversion of the overlap operator by a factor of four on small lattices, with larger gains as the lattice size increases. These improvements can be used in both propagator calculations and dynamical simulations.
Parallelizing SLPA for Scalable Overlapping Community Detection
Directory of Open Access Journals (Sweden)
Konstantin Kuzmin
2015-01-01
Communities in networks are groups of nodes whose connections to the nodes in a community are stronger than with the nodes in the rest of the network. Quite often nodes participate in multiple communities; that is, communities can overlap. In this paper, we first analyze what other researchers have done to utilize high performance computing to perform efficient community detection in social, biological, and other networks. We note that detection of overlapping communities is more computationally intensive than disjoint community detection, and the former presents new challenges that algorithm designers have to face. Moreover, the efficiency of many existing algorithms grows superlinearly with the network size making them unsuitable to process large datasets. We use the Speaker-Listener Label Propagation Algorithm (SLPA) as the basis for our parallel overlapping community detection implementation. SLPA provides near linear time overlapping community detection and is well suited for parallelization. We explore the benefits of a multithreaded programming paradigm and show that it yields a significant performance gain over sequential execution while preserving the high quality of community detection. The algorithm was tested on four real-world datasets with up to 5.5 million nodes and 170 million edges. In order to assess the quality of community detection, at least 4 different metrics were used for each of the datasets.
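A serial sketch of the SLPA core that the paper parallelizes; iteration count and threshold below are illustrative, not the paper's settings:

```python
import random
from collections import Counter

def slpa(adj, iters=20, threshold=0.1, seed=0):
    """Serial SLPA sketch: every node keeps a memory of labels; each round,
    each listener hears one randomly chosen remembered label from every
    neighbor (the speakers) and memorizes the most popular one. Finally,
    label frequencies in each memory are thresholded, so a node can belong
    to several communities."""
    rng = random.Random(seed)
    memory = {v: [v] for v in adj}           # every node starts as own label
    for _ in range(iters):
        for v in adj:                        # v acts as listener
            if not adj[v]:
                continue
            heard = [rng.choice(memory[u]) for u in adj[v]]
            memory[v].append(Counter(heard).most_common(1)[0][0])
    comms = {}
    for v, mem in memory.items():
        for label, cnt in Counter(mem).items():
            if cnt / len(mem) >= threshold:
                comms.setdefault(label, set()).add(v)
    return list(comms.values())
```

Each round touches every edge a constant number of times, which is the near-linear behavior that makes SLPA a good parallelization target.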
Overlapping Community Detection based on Network Decomposition
Ding, Zhuanlian; Zhang, Xingyi; Sun, Dengdi; Luo, Bin
2016-04-01
Community detection in complex network has become a vital step to understand the structure and dynamics of networks in various fields. However, traditional node clustering and relatively new proposed link clustering methods have inherent drawbacks to discover overlapping communities. Node clustering is inadequate to capture the pervasive overlaps, while link clustering is often criticized due to the high computational cost and ambiguous definition of communities. So, overlapping community detection is still a formidable challenge. In this work, we propose a new overlapping community detection algorithm based on network decomposition, called NDOCD. Specifically, NDOCD iteratively splits the network by removing all links in derived link communities, which are identified by utilizing node clustering technique. The network decomposition contributes to reducing the computation time and noise link elimination conduces to improving the quality of obtained communities. Besides, we employ node clustering technique rather than link similarity measure to discover link communities, thus NDOCD avoids an ambiguous definition of community and becomes less time-consuming. We test our approach on both synthetic and real-world networks. Results demonstrate the superior performance of our approach both in computation time and accuracy compared to state-of-the-art algorithms.
Current status of Dynamical Overlap project
Cundy, N
2006-01-01
We discuss the adaptation of the Hybrid Monte Carlo algorithm to overlap fermions. We derive a method which can be used to account for the delta function in the fermionic force caused by the differential of the sign function. We discuss the algorithmic difficulties that have been overcome, and mention those that still need to be solved.
Using character overlap to improve language transformation
Wubben, S.; Krahmer, E.; Bosch, A.P.J. van den
2013-01-01
Language transformation can be defined as translating between diachronically distinct language variants. We investigate the transformation of Middle Dutch into Modern Dutch by means of machine translation. We demonstrate that by using character overlap the performance of the machine translation proce…
Chromosome Segregation: Organizing Overlap at the Midzone
Janson, M.E.; Tran, P.T.
2008-01-01
Sets of overlapping microtubules support the segregation of chromosomes by linking the poles of mitotic spindles. Recent work examines the effect of putting these linkages under pressure by the activation of dicentric chromosomes and sheds new light on the structural role of several well-known spind
Autism and ADHD: Overlapping and Discriminating Symptoms
Mayes, Susan Dickerson; Calhoun, Susan L.; Mayes, Rebecca D.; Molitoris, Sarah
2012-01-01
Children with ADHD and autism have some similar features, complicating a differential diagnosis. The purpose of our study was to determine the degree to which core ADHD and autistic symptoms overlap in and discriminate between children 2-16 years of age with autism and ADHD. Our study demonstrated that 847 children with autism were easily…
Mori, Toshifumi; Hamers, Robert J; Pedersen, Joel A; Cui, Qiang
2014-07-17
Motivated by specific applications and the recent work of Gao and co-workers on integrated tempering sampling (ITS), we have developed a novel sampling approach referred to as integrated Hamiltonian sampling (IHS). IHS is straightforward to implement and complementary to existing methods for free energy simulation and enhanced configurational sampling. The method carries out sampling using an effective Hamiltonian constructed by integrating the Boltzmann distributions of a series of Hamiltonians. By judiciously selecting the weights of the different Hamiltonians, one achieves rapid transitions among the energy landscapes that underlie different Hamiltonians and therefore an efficient sampling of important regions of the conformational space. Along this line, IHS shares similar motivations as the enveloping distribution sampling (EDS) approach of van Gunsteren and co-workers, although the ways that distributions of different Hamiltonians are integrated are rather different in IHS and EDS. Specifically, we report efficient ways for determining the weights using a combination of histogram flattening and weighted histogram analysis approaches, which make it straightforward to include many end-state and intermediate Hamiltonians in IHS so as to enhance its flexibility. Using several relatively simple condensed phase examples, we illustrate the implementation and application of IHS as well as potential developments for the near future. The relation of IHS to several related sampling methods such as Hamiltonian replica exchange molecular dynamics and λ-dynamics is also briefly discussed.
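The core of an integrated-sampling scheme of this kind is an effective Hamiltonian built from a weighted sum of Boltzmann factors. A minimal sketch of that construction (the paper's weight determination via histogram flattening and WHAM is omitted; the weights here are simply given):

```python
import math

def ihs_potential(U, betas, weights, beta0):
    """Integrated-Hamiltonian-style effective potential (sketch): the
    Boltzmann factors of several Hamiltonians/temperatures are summed with
    weights w_k, and sampling runs on -ln of that sum at reference
    inverse temperature beta0. Broad terms in the sum flatten barriers of
    the original landscape."""
    def U_eff(x):
        z = sum(w * math.exp(-b * U(x)) for w, b in zip(weights, betas))
        return -math.log(z) / beta0
    return U_eff
```

With a single term at b = beta0 the effective potential reduces to U itself; adding high-temperature (small-beta) terms lowers the effective barriers, which is what produces the rapid transitions among landscapes described above.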
On the acoustics of overlapping laughter in conversational speech
Truong, Khiet P.; Trouvain, Jürgen
2012-01-01
The social nature of laughter invites people to laugh together. This joint vocal action often results in overlapping laughter. In this paper, we show that the acoustics of overlapping laughs are different from those of non-overlapping laughs. We found that overlapping laughs are more strongly prosodically marked
Directory of Open Access Journals (Sweden)
H. B. Kekre
2012-03-01
Full Text Available In this paper we have introduced three simple feature vector extraction ideas to retrieve images from a database of 2000 images comprising 20 different classes. The feature extraction process is mainly based on splitting the image into three planes; for each plane an equalized histogram is calculated, which is divided into two, three and four equal parts to form the 8, 27 and 64 bins respectively. Three simple ways are used to extract the information in these three different sizes of bin sets. The first is the 'Count' of the pixels falling in a specific range of the histogram of each plane into its destination bin. Second, the 'Total' intensity of these pixels in each of these bins is taken into consideration, and in the third variation the 'Mean' of these intensities is considered to represent the feature vector. Determination of the destination bin address for each pixel under process depends upon the R, G, B values of that pixel, each of which falls in one part of the equalized partitioned histogram; based on this, a 3-digit flag is assigned to that pixel with respect to its R, G, and B values. In this way, six feature vector databases are prepared for 2000 images with three variable sizes and three variations in the extraction methods. We have maintained a separate set of bins for each plane, giving three more variations in databases. Thus in all we have 18 feature vector databases, that is, six databases for each of the Red, Green and Blue planes. Experimentation uses an image database of 20 classes having 100 images of each of the following classes: Flower, Sunset, Mountain, Building, Bus, Dinosaur, Elephant, Barbie, Mickey, Horses, Kingfisher, Dove, Crow, Rainbowrose, Pyramids, Plates, Car, Trees, Ship and Waterfall. Performance of our approaches is evaluated using two parameters, LIRS and LSRR, and results are refined and combined using three criteria, Criterion 1, 2 and 3.
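The Count/Total/Mean bin features described above can be sketched for a single plane as follows. The equalized-partition cut and the bin addressing are simplified stand-ins for the paper's R, G, B flag mechanism:

```python
import numpy as np

def bin_features(plane, n_parts):
    """Count / Total / Mean features for one color plane: the equalized
    histogram is cut into n_parts of roughly equal pixel mass, and each
    pixel contributes to the bin its intensity falls in."""
    vals = plane.ravel().astype(float)
    edges = np.quantile(vals, np.linspace(0.0, 1.0, n_parts + 1))
    edges[-1] += 1.0                       # make the top bin inclusive
    idx = np.digitize(vals, edges) - 1     # destination bin per pixel
    count = np.bincount(idx, minlength=n_parts).astype(float)
    total = np.bincount(idx, weights=vals, minlength=n_parts)
    mean = np.divide(total, count, out=np.zeros_like(total), where=count > 0)
    return count, total, mean

rng = np.random.default_rng(1)
plane = rng.integers(0, 256, size=(32, 32))   # one synthetic 8-bit plane
count, total, mean = bin_features(plane, 4)
```

Concatenating such vectors over the three planes gives feature vectors of the lengths discussed above (e.g. 3 planes x 4 parts).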
Reduction of Striping Noise in Overlapping LIDAR Intensity Data by Radiometric Normalization
Yan, Wai Yeung; Shaker, Ahmed
2016-06-01
To enable seamless mapping, airborne LiDAR data are usually collected as multiple parallel strips with one or two cross strip(s). Nevertheless, the overlapping regions of LiDAR data strips are usually found to have unbalanced intensity values, resulting in the appearance of striping noise. Although physical intensity correction methods have recently been proposed, some of the system and environmental parameters are assumed constant or are not disclosed, leading to such intensity discrepancies. This paper presents a new normalization technique to adjust the radiometric misalignment found in overlapping LiDAR data strips. The normalization technique is built upon a second-order polynomial function fitted on the joint histogram plot, which is generated with a set of pairwise closest data points identified within the overlapping region. The method was tested on Teledyne Optech's Gemini dataset (at 1064 nm wavelength), where the LiDAR intensity data were first radiometrically corrected based on the radar (range) equation. Five land cover features were selected to evaluate the coefficient of variation (cv) of the intensity values before and after implementing the proposed method. The cv was reduced by 19% to 59% in the Gemini dataset, where the striping noise was significantly reduced in the radiometrically corrected and normalized intensity data. The Gemini dataset was also used to conduct land cover classification, and the overall accuracy yielded a notable improvement of 9% to 18%. As a result, LiDAR intensity data should be pre-processed with radiometric correction and normalization prior to any data manipulation.
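The core of the normalization step, fitting a second-order polynomial to paired intensities from the overlap region, can be sketched as below. The synthetic sensor response is invented for illustration; the paper derives its pairs from a joint histogram of pairwise closest points:

```python
import numpy as np

def fit_normalization(adjust_vals, ref_vals):
    """Second-order polynomial mapping the strip to be adjusted onto the
    reference strip, fitted from matched intensity pairs in the overlap."""
    return np.poly1d(np.polyfit(adjust_vals, ref_vals, deg=2))

# Synthetic overlap: the reference responds quadratically to the raw values.
rng = np.random.default_rng(0)
raw = rng.uniform(10.0, 200.0, 500)          # strip to be adjusted
ref = 1.2 * raw + 0.001 * raw**2 + 5.0       # reference strip intensities
correct = fit_normalization(raw, ref)
```

Applying `correct` to the whole adjusted strip then removes the radiometric misalignment in the overlap, which is what suppresses the striping noise.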
DEFF Research Database (Denmark)
Fichtl, Severin; Kraft, Dirk; Krüger, Norbert;
2015-01-01
that capture and represent the spatial relationships in an easily accessible way. We are interested in learning to predict the success of “means end” actions that manipulate two objects at once, from exploratory actions and the observed sensorimotor contingencies. In this paper, we use relational histogram features and illustrate their effect on learning to predict a variety of “means end” actions’ outcomes. The results show that our vision features can make the learning problem significantly easier, leading to increased learning rates and higher maximum performance. This work is in particular important…
Cheng, Lishui; Hobbs, Robert F.; Segars, Paul W.; Sgouros, George; Frey, Eric C.
2013-06-01
In radiopharmaceutical therapy, an understanding of the dose distribution in normal and target tissues is important for optimizing treatment. Three-dimensional (3D) dosimetry takes into account patient anatomy and the nonuniform uptake of radiopharmaceuticals in tissues. Dose-volume histograms (DVHs) provide a useful summary representation of the 3D dose distribution and have been widely used for external beam treatment planning. Reliable 3D dosimetry requires an accurate 3D radioactivity distribution as the input. However, activity distribution estimates from SPECT are corrupted by noise and partial volume effects (PVEs). In this work, we systematically investigated OS-EM based quantitative SPECT (QSPECT) image reconstruction in terms of its effect on DVHs estimates. A modified 3D NURBS-based Cardiac-Torso (NCAT) phantom that incorporated a non-uniform kidney model and clinically realistic organ activities and biokinetics was used. Projections were generated using a Monte Carlo (MC) simulation; noise effects were studied using 50 noise realizations with clinical count levels. Activity images were reconstructed using QSPECT with compensation for attenuation, scatter and collimator-detector response (CDR). Dose rate distributions were estimated by convolution of the activity image with a voxel S kernel. Cumulative DVHs were calculated from the phantom and QSPECT images and compared both qualitatively and quantitatively. We found that noise, PVEs, and ringing artifacts due to CDR compensation all degraded histogram estimates. Low-pass filtering and early termination of the iterative process were needed to reduce the effects of noise and ringing artifacts on DVHs, but resulted in increased degradations due to PVEs. Large objects with few features, such as the liver, had more accurate histogram estimates and required fewer iterations and more smoothing for optimal results. Smaller objects with fine details, such as the kidneys, required more iterations and less
Frontal Face Detection using Haar Wavelet Coefficients and Local Histogram Correlation
Directory of Open Access Journals (Sweden)
Iwan Setyawan
2011-12-01
Full Text Available Face detection is the main building block on which all automatic systems dealing with human faces are built. For example, a face recognition system must rely on face detection to process an input image and determine which areas contain human faces. These areas then become the input for the face recognition system for further processing. This paper presents a face detection system designed to detect frontal faces. The system uses Haar wavelet coefficients and local histogram correlation as differentiating features. Our proposed system is trained using 100 training images. Our experiments show that the proposed system performed well during testing, achieving a detection rate of 91.5%.
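One plausible form of the local histogram correlation feature is plain normalized (Pearson) correlation between two local histograms; the numbers below are illustrative, and the paper's exact feature definition may differ:

```python
import numpy as np

def histogram_correlation(h1, h2):
    """Normalized (Pearson) correlation between two local histograms:
    1.0 for identical shapes, lower for dissimilar ones."""
    a = np.asarray(h1, float) - np.mean(h1)
    b = np.asarray(h2, float) - np.mean(h2)
    return float(a @ b / np.sqrt((a @ a) * (b @ b)))

h_face = np.array([2.0, 8.0, 15.0, 9.0, 3.0])   # illustrative local histogram
h_same = 2.0 * h_face                           # same shape, different scale
h_flat = np.array([7.0, 7.0, 7.5, 7.0, 7.0])    # featureless region
```

Scale invariance (identical shape at different brightness still scores 1.0) is the usual reason for choosing correlation over raw histogram distance.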
Fitting simulated random events to experimental histograms by means of parametric models
Energy Technology Data Exchange (ETDEWEB)
Kortner, Oliver (E-mail: oliver.kortner@cern.ch; kortner@mppmu.mpg.de); Zupancic, Crtomir
2003-05-11
Classical chi-square quantities are appropriate tools for fitting analytical parameter-dependent models to (multidimensional) measured histograms. In contrast, this article proposes a family of special chi-squares suitable for fits with models which simulate experimental data by Monte Carlo methods, thus introducing additional randomness. We investigate the dependence of such chi-squares on the number of experimental and simulated events in each bin, and on the theoretical parameter-dependent weight linking the two kinds of events. We identify the unknown probability distributions of the weights and their inter-bin correlations as the main obstacle to a general performance analysis of the proposed chi-square quantities.
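A generic chi-square of this kind, with the Poisson variances of both the experimental and the weighted simulated counts in the denominator, can be sketched as follows. This is a standard textbook choice, not the specific family of chi-squares proposed in the article:

```python
import numpy as np

def mc_chi_square(n_exp, n_sim, weight=1.0):
    """Chi-square between an experimental histogram and a Monte Carlo
    histogram whose simulated events each carry `weight`; the denominator
    sums the Poisson variances of both samples."""
    n_exp = np.asarray(n_exp, float)
    n_sim = np.asarray(n_sim, float)
    expected = weight * n_sim
    var = n_exp + weight**2 * n_sim      # Var(w * N_sim) = w^2 * N_sim
    ok = var > 0                         # skip bins with no information
    return float(np.sum((n_exp[ok] - expected[ok]) ** 2 / var[ok]))
```

The article's central point is precisely that, once `weight` depends on the theoretical parameters and fluctuates from event to event, the distribution of such quantities is no longer a classical chi-square.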
Content Based Radiographic Images Indexing and Retrieval Using Pattern Orientation Histogram
Directory of Open Access Journals (Sweden)
Abolfazl Lakdashti
2008-06-01
Full Text Available Introduction: Content Based Image Retrieval (CBIR) is a method of image searching and retrieval in a database. In medical applications, CBIR is a tool used by physicians to compare the previous and current medical images associated with patients' pathological conditions. As the volume of pictorial information stored in medical image databases continues to grow, efficient image indexing and retrieval is increasingly becoming a necessity. Materials and Methods: This paper presents a new content based radiographic image retrieval approach based on a histogram of pattern orientations, namely the pattern orientation histogram (POH). POH represents the spatial distribution of five different pattern orientations: vertical, horizontal, diagonal down/left, diagonal down/right and non-orientation. In this method, a given image is first divided into image-blocks and the frequency of each type of pattern is determined in each image-block. Then, local pattern histograms for each of these image-blocks are computed. Results: The method was compared to two well known texture-based image retrieval methods: Tamura and Edge Histogram Descriptors (EHD) in the MPEG-7 standard. Experimental results based on the 10000-image IRMA radiography dataset demonstrate that POH provides better precision and recall rates compared to Tamura and EHD. For some images, the recall and precision rates obtained by POH are, respectively, 48% and 18% better than the best of the two above mentioned methods. Discussion and Conclusion: Since we exploit the absolute location of the pattern in the image as well as its global composition, the proposed matching method can retrieve semantically similar medical images.
Yet Another Method for Image Segmentation based on Histograms and Heuristics
Directory of Open Access Journals (Sweden)
Horia-Nicolai L. Teodorescu
2012-07-01
Full Text Available We introduce a method for image segmentation that requires little computation, yet provides results comparable to those of other methods. While the proposed method resembles known histogram-based ones, it differs in its use of the gray level distribution. When several heuristic rules are added to the basic procedure, the method produces results that, in some cases, may outperform those produced by the known methods. The paper reports preliminary results. More details on the method, improvements, and results will be presented in a future paper.
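As a baseline for histogram-driven segmentation of the kind discussed above, the classic Otsu threshold can be computed directly from a gray-level histogram. It is shown here only as the standard reference point, not as the authors' heuristic method:

```python
import numpy as np

def otsu_threshold(hist):
    """Otsu's threshold from a gray-level histogram: pick the cut that
    maximizes the between-class variance of the two resulting classes."""
    p = np.asarray(hist, float)
    p = p / p.sum()
    levels = np.arange(p.size)
    best_t, best_var = 0, -1.0
    for t in range(1, p.size):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        m0 = (levels[:t] * p[:t]).sum() / w0    # class means
        m1 = (levels[t:] * p[t:]).sum() / w1
        var = w0 * w1 * (m0 - m1) ** 2          # between-class variance
        if var > best_var:
            best_t, best_var = t, var
    return best_t

hist = np.zeros(256)          # synthetic bimodal histogram
hist[40] = 100.0
hist[200] = 100.0
t = otsu_threshold(hist)
```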
Non-Rigid Medical Image Registration with Joint Histogram Estimation Based on Mutual Information
Institute of Scientific and Technical Information of China (English)
LU Xuesong; ZHANG Su; SU He; CHEN Yazhu
2007-01-01
A mutual information-based non-rigid medical image registration algorithm is presented. An approximate function of the Hanning windowed sinc is used as the kernel function of partial volume (PV) interpolation to estimate the joint histogram, which is the key to calculating the mutual information. A new method is also proposed to compute the gradient of mutual information with respect to the model parameters. The transformation of the object is modeled by a free-form deformation (FFD) based on B-splines. Experiments on 3D synthetic and real image data show that the algorithm can converge at the global optimum and restrain the emergence of local extrema.
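The quantity being maximized, mutual information computed from a joint intensity histogram, is compact to write down. The sketch below uses plain binning rather than the Hanning-windowed-sinc PV interpolation of the paper:

```python
import numpy as np

def mutual_information(joint_hist):
    """Mutual information of two images from their 2D joint intensity
    histogram: sum_ij p_ij * log( p_ij / (p_i * p_j) )."""
    p = np.asarray(joint_hist, float)
    p = p / p.sum()
    px = p.sum(axis=1, keepdims=True)   # marginal of image A
    py = p.sum(axis=0, keepdims=True)   # marginal of image B
    nz = p > 0                          # 0 * log 0 contributes nothing
    return float(np.sum(p[nz] * np.log(p[nz] / (px @ py)[nz])))

aligned = np.diag([50.0, 50.0])                   # intensities co-occur exactly
independent = np.outer([20.0, 80.0], [50.0, 50.0])  # no statistical relation
```

Registration moves the FFD control points so as to maximize this value; the perfectly aligned case reaches the joint entropy, while independent images score zero.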
Histogram Monte Carlo position-space renormalization group: Applications to the site percolation
Hu, Chin-Kun; Chen, Chi-Ning; Wu, F. Y.
1996-02-01
We study site percolation on the square lattice and show that, when augmented with histogram Monte Carlo simulations for large lattices, the cell-to-cell renormalization group approach can be used to determine the critical probability accurately. Unlike the cell-to-site method and an alternate renormalization group approach proposed recently by Sahimi and Rassamdana, both of which rely on ab initio numerical inputs, the cell-to-cell scheme is free of prior knowledge and thus can be applied more widely.
MOHCS: Towards Mining Overlapping Highly Connected Subgraphs
Lin, Xiahong; Chen, Kefei; Chiu, David K Y
2008-01-01
Many networks in real-life typically contain parts in which some nodes are more highly connected to each other than the other nodes of the network. The collection of such nodes are usually called clusters, communities, cohesive groups or modules. In graph terminology, it is called highly connected graph. In this paper, we first prove some properties related to highly connected graph. Based on these properties, we then redefine the highly connected subgraph which results in an algorithm that determines whether a given graph is highly connected in linear time. Then we present a computationally efficient algorithm, called MOHCS, for mining overlapping highly connected subgraphs. We have evaluated experimentally the performance of MOHCS using real and synthetic data sets from computer-generated graph and yeast protein network. Our results show that MOHCS is effective and reliable in finding overlapping highly connected subgraphs.
Overlap Removal Methods for Data Projection Algorithms
Spicker, Marc
2011-01-01
Projection algorithms map high dimensional data points to lower dimensions. However, when adding arbitrarily shaped objects as representatives for these data points, they may intersect. The positions of these representatives have to be modified in order to remove existing overlaps. There are multiple algorithms designed to deal with this layout adjustment problem, which lead to very different results. These adjustment strategies are evaluated according to different measures for comparison: euclide...
Overlap fermions on a twisted mass sea
Bär, O; Schäefer, S; Scorzato, L; Shindler, A
2006-01-01
We present first results of a mixed action project. We analyze gauge configurations generated with two flavors of dynamical twisted mass fermions. Neuberger's overlap Dirac operator is used for the valence sector. The various choices in the setup of the simulation are discussed. We employ chiral perturbation theory to describe the effects of using different actions in the sea and valence sector at non-zero lattice spacing.
The overlap between cyberbullying and traditional bullying.
Waasdorp, Tracy E; Bradshaw, Catherine P
2015-05-01
Cyberbullying appears to be on the rise among adolescents due in part to increased access to electronic devices and less online supervision. Less is known about how cyberbullying differs from traditional bullying which occurs in person and the extent to which these two forms overlap. Our first aim was to examine the overlap of traditional bullying (relational, verbal, and physical) with cyberbullying. The second aim examined student- and school-level correlates of cyber victimization as compared to traditional victims. The final aim explored details of the cyberbullying experience (e.g., who sent the message, how was the message sent, and what was the message about). Data came from 28,104 adolescents (grades, 9-12) attending 58 high schools. Approximately 23% of the youth reported being victims of any form of bullying (cyber, relational, physical, and verbal) within the last month, with 25.6% of those victims reporting being cyberbullied. The largest proportion (50.3%) of victims reported they were victimized by all four forms, whereas only 4.6% reported being only cyberbullied. Multilevel analyses indicated that as compared to those who were only traditionally bullied, those who were cyberbullied were more likely to have externalizing (odds ratio = 1.44) and internalizing symptoms (odds ratio = 1.25). Additional analyses examined detailed characteristics of the cyberbullying experiences, indicating a relatively high level of overlap between cyber and traditional bullying. Implications for preventive interventions targeting youth involved with cyberbullying and its overlap with other forms of bullying are discussed. Copyright © 2015 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.
Separating cyclostationary signals from spectrally overlapping interference
Institute of Scientific and Technical Information of China (English)
GUO Jie; LIU Yun; YE Zhi-hui; SONG Tie-cheng; SHEN Lian-feng
2006-01-01
This paper studies an algorithm for separating spectrally overlapping signals using the cyclostationary properties of signals. On the basis of a direct sequence spread spectrum (DSSS) system, a frequency shift filter is added to the receiver of the communication system. Although the structure of the frequency shift filter is more complicated than that of a time-domain filter, it uses both time correlations and frequency spectrum correlations, so it can achieve better performance in separating the overlapping signals. After analysis of the cyclostationary characteristics and frequency spectrum correlations, the structure of the frequency shift filter can be obtained. Then, a self-adaptive algorithm is utilized to achieve optimum multidimensional tap weights of the frequency shift components. The simulation results indicate that this method can efficiently separate overlapping signals, and its error rate is lower than that of the time-domain filter or DSSS system by two orders of magnitude when high-power interference is added to the system.
Function approximation using adaptive and overlapping intervals
Energy Technology Data Exchange (ETDEWEB)
Patil, R.B.
1995-05-01
A problem common to many disciplines is to approximate a function given only the values of the function at various points in input variable space. A method is proposed for approximating a function of several to one variable. The model takes the form of weighted averaging of overlapping basis functions defined over intervals. The number of such basis functions and their parameters (widths and centers) are automatically determined using given training data and a learning algorithm. The proposed algorithm can be seen as placing a nonuniform multidimensional grid in the input domain with overlapping cells. The non-uniformity and overlap of the cells is achieved by a learning algorithm to optimize a given objective function. This approach is motivated by the fuzzy modeling approach and a learning algorithms used for clustering and classification in pattern recognition. The basics of why and how the approach works are given. Few examples of nonlinear regression and classification are modeled. The relationship between the proposed technique, radial basis neural networks, kernel regression, probabilistic neural networks, and fuzzy modeling is explained. Finally advantages and disadvantages are discussed.
Burnout-depression overlap: a review.
Bianchi, Renzo; Schonfeld, Irvin Sam; Laurent, Eric
2015-03-01
Whether burnout is a form of depression or a distinct phenomenon is an object of controversy. The aim of the present article was to provide an up-to-date review of the literature dedicated to the question of burnout-depression overlap. A systematic literature search was carried out in PubMed, PsycINFO, and IngentaConnect. A total of 92 studies were identified as informing the issue of burnout-depression overlap. The current state of the art suggests that the distinction between burnout and depression is conceptually fragile. It is notably unclear how the state of burnout (i.e., the end stage of the burnout process) is conceived to differ from clinical depression. Empirically, evidence for the distinctiveness of the burnout phenomenon has been inconsistent, with the most recent studies casting doubt on that distinctiveness. The absence of consensual diagnostic criteria for burnout and burnout research's insufficient consideration of the heterogeneity of depressive disorders constitute major obstacles to the resolution of the raised issue. In conclusion, the epistemic status of the seminal, field-dominating definition of burnout is questioned. It is suggested that systematic clinical observation should be given a central place in future research on burnout-depression overlap.
Energy Technology Data Exchange (ETDEWEB)
Ovchinnikov, Mikhail [Pacific Northwest National Laboratory, Richland Washington USA; Lim, Kyo-Sun Sunny [Pacific Northwest National Laboratory, Richland Washington USA; Korea Atomic Energy Research Institute, Daejeon Republic of Korea; Larson, Vincent E. [Department of Mathematical Sciences, University of Wisconsin-Milwaukee, Milwaukee Wisconsin USA; Wong, May [Pacific Northwest National Laboratory, Richland Washington USA; National Center for Atmospheric Research, Boulder Colorado USA; Thayer-Calder, Katherine [National Center for Atmospheric Research, Boulder Colorado USA; Ghan, Steven J. [Pacific Northwest National Laboratory, Richland Washington USA
2016-11-05
Coarse-resolution climate models increasingly rely on probability density functions (PDFs) to represent subgrid-scale variability of prognostic variables. While PDFs characterize the horizontal variability, a separate treatment is needed to account for the vertical structure of clouds and precipitation. When sub-columns are drawn from these PDFs for microphysics or radiation parameterizations, appropriate vertical correlations must be enforced via PDF overlap specifications. This study evaluates the representation of PDF overlap in the Subgrid Importance Latin Hypercube Sampler (SILHS) employed in the assumed PDF turbulence and cloud scheme called the Cloud Layers Unified By Binormals (CLUBB). PDF overlap in CLUBB-SILHS simulations of continental and tropical oceanic deep convection is compared with the overlap of PDFs of various microphysics variables in cloud-resolving model (CRM) simulations of the same cases that explicitly predict the 3D structure of cloud and precipitation fields. CRM results show that PDF overlap varies significantly between different hydrometeor types, as well as between PDFs of mass and number mixing ratios for each species, a distinction that the current SILHS implementation does not make. In CRM simulations that explicitly resolve cloud and precipitation structures, faster-falling species, such as rain and graupel, exhibit significantly higher coherence in their vertical distributions than slower-falling cloud liquid and ice. These results suggest that to improve the overlap treatment in the sub-column generator, the PDF correlations need to depend on hydrometeor properties, such as fall speeds, in addition to the currently implemented dependency on the turbulent convective length scale.
Directory of Open Access Journals (Sweden)
Abdelbaset Buhmeida
2011-05-01
Full Text Available The role of DNA content as a prognostic factor in colorectal cancer (CRC) is highly controversial. Some of these controversies are due to purely technical reasons, e.g. variable practices in interpreting the DNA histograms, which is problematic particularly in advanced cases. In this report, we give a detailed account of the various options for optimally interpreting these histograms, with the idea of establishing the potential value of DNA image cytometry in prognosis and in selection of proper treatment. The material consists of nuclei isolated from 50 μm paraffin sections from 160 patients with stage II, III or IV CRC diagnosed, treated and followed up in our clinic. The nuclei were stained with the Feulgen stain. Nuclear DNA was measured using computer-assisted image cytometry. We applied 4 different approaches to analyse the DNA histograms: 1) appearance of the histogram (ABCDE approach), 2) range of DNA values, 3) peak evaluation, and 4) events present at high DNA values. Intra-observer reproducibility of these four histogram interpretations was 89%, 95%, 96%, and 100%, respectively. We depicted selected histograms to illustrate the four analytical approaches in cases with different stages of CRC, with variable disease outcome. In our analysis, the range of DNA values was the best prognosticator, i.e., the tumours with the widest histograms had the most ominous prognosis. These data implicate that DNA cytometry based on isolated nuclei is valuable in predicting the prognosis of CRC. Different interpretation techniques differed in their reproducibility, but the method showing the best prognostic value also had high reproducibility in our analysis.
The application of age distribution theory in the analysis of cytofluorimetric DNA histogram data.
Watson, J V
1977-03-01
Age distribution theory has been employed in a model to analyse a variety of histograms of the DNA content of single cells in samples from experimental tumours growing in tissue culture. The method has produced satisfactory correspondence with the experimental data in which there was a wide variation in the proportions of cells in the intermitotic phases, and generally good agreement between the 3H-thymidine labelling index and the computed proportion in S phase. The model has the capacity to analyse data from populations which contain a proportion of non-cycling cells. However, it is concluded that reliable results for the growth fraction and also for the relative durations of the intermitotic phase times cannot be obtained for the data reported here from the DNA histograms alone. To obtain reliable estimates of the growth fraction the relative durations of the phase time must be known, and conversely, reliable estimates of the relative phase durations can only be obtained if the growth fraction is known.
Predicting low-temperature free energy landscapes with flat-histogram Monte Carlo methods
Mahynski, Nathan A.; Blanco, Marco A.; Errington, Jeffrey R.; Shen, Vincent K.
2017-02-01
We present a method for predicting the free energy landscape of fluids at low temperatures from flat-histogram grand canonical Monte Carlo simulations performed at higher ones. We illustrate our approach for both pure and multicomponent systems using two different sampling methods as a demonstration. This allows us to predict the thermodynamic behavior of systems which undergo both first order and continuous phase transitions upon cooling using simulations performed only at higher temperatures. After surveying a variety of different systems, we identify a range of temperature differences over which the extrapolation of high temperature simulations tends to quantitatively predict the thermodynamic properties of fluids at lower ones. Beyond this range, extrapolation still provides a reasonably well-informed estimate of the free energy landscape; this prediction then requires less computational effort to refine with an additional simulation at the desired temperature than reconstruction of the surface without any initial estimate. In either case, this method significantly increases the computational efficiency of these flat-histogram methods when investigating thermodynamic properties of fluids over a wide range of temperatures. For example, we demonstrate how a binary fluid phase diagram may be quantitatively predicted for many temperatures using only information obtained from a single supercritical state.
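The basic mechanism behind predicting one temperature from another is histogram reweighting. The minimal single-histogram version below (exact for this two-level toy system) illustrates the idea, though the flat-histogram extrapolation above is considerably more powerful:

```python
import numpy as np

def reweight_histogram(energies, hist, beta_from, beta_to):
    """Single-histogram temperature reweighting: estimate the energy
    distribution at beta_to from a histogram sampled at beta_from."""
    energies = np.asarray(energies, float)
    hist = np.asarray(hist, float)
    logw = np.full_like(hist, -np.inf)
    nz = hist > 0
    logw[nz] = np.log(hist[nz]) + (beta_from - beta_to) * energies[nz]
    logw -= logw.max()                 # stabilize the exponentials
    p = np.exp(logw)
    return p / p.sum()

# Two-level system sampled exactly at beta = 0.5, reweighted to beta = 1.0.
E = np.array([0.0, 1.0])
h_05 = np.exp(-0.5 * E)                      # exact Boltzmann weights at 0.5
p_10 = reweight_histogram(E, h_05, 0.5, 1.0)
exact = np.exp(-1.0 * E) / np.exp(-1.0 * E).sum()
```

Reweighting degrades as the sampled and target distributions stop overlapping, which is why a temperature range over which extrapolation stays quantitative has to be established, as the abstract describes.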
Color Histogram and DBC Co-Occurrence Matrix for Content Based Image Retrieval
Directory of Open Access Journals (Sweden)
K. Prasanthi Jasmine
2014-12-01
Full Text Available This paper presents the integration of a color histogram and a DBC co-occurrence matrix for content based image retrieval. The existing DBC collects the directional edges which are calculated by applying the first-order derivatives in the 0º, 45º, 90º and 135º directions. The feature vector length of DBC for a particular direction is 512, which is too large for image retrieval. To avoid this problem, we collect the directional edges by excluding the center pixel and further apply the rotation invariant property. Further, we calculate the co-occurrence matrix to form the feature vector. Finally, the HSV color histogram and the DBC co-occurrence matrix are integrated to form the feature database. The retrieval results of the proposed method have been tested by conducting three experiments on the Brodatz and MIT VisTex texture databases and the Corel-1000 natural database. The results after being investigated show a significant improvement in terms of their evaluation measures as compared to LBP, DBC and other transform domain features.
Contrast enhancement based on layered difference representation of 2D histograms.
Lee, Chulwoo; Lee, Chul; Kim, Chang-Su
2013-12-01
A novel contrast enhancement algorithm based on the layered difference representation of 2D histograms is proposed in this paper. We attempt to enhance image contrast by amplifying the gray-level differences between adjacent pixels. To this end, we obtain the 2D histogram h(k, k + l ) from an input image, which counts the pairs of adjacent pixels with gray-levels k and k + l , and represent the gray-level differences in a tree-like layered structure. Then, we formulate a constrained optimization problem based on the observation that the gray-level differences, occurring more frequently in the input image, should be more emphasized in the output image. We first solve the optimization problem to derive the transformation function at each layer. We then combine the transformation functions at all layers into the unified transformation function, which is used to map input gray-levels to output gray-levels. Experimental results demonstrate that the proposed algorithm enhances images efficiently in terms of both objective quality and subjective quality.
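The 2D histogram h(k, k + l) that drives the method is simply a count over adjacent pixel pairs. A minimal sketch for horizontal neighbors follows (vertical pairs and the layered tree structure are omitted):

```python
import numpy as np

def adjacent_pair_histogram(img, levels=256):
    """2D histogram h where h[k, m] counts horizontally adjacent pixel
    pairs with gray levels k and m; the layer-l entries h(k, k + l) of
    the method above are its diagonals."""
    a = img[:, :-1].ravel()            # left pixel of each pair
    b = img[:, 1:].ravel()             # right pixel of each pair
    h = np.zeros((levels, levels), dtype=np.int64)
    np.add.at(h, (a, b), 1)            # unbuffered add handles repeats
    return h

img = np.array([[0, 1, 1],
                [2, 2, 3]])
h = adjacent_pair_histogram(img, levels=4)
```

The enhancement then emphasizes the frequently occurring diagonals of this histogram when building the gray-level transformation function.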
Wang, Po-Jen; Keyawa, Nicholas R.; Euler, Craig
2012-01-01
In order to achieve highly accurate motion control and path planning for a mobile robot, an obstacle avoidance algorithm that provided a desired instantaneous turning radius and velocity was generated. This type of obstacle avoidance algorithm, which has been implemented in California State University Northridge's Intelligent Ground Vehicle (IGV), is known as Radial Polar Histogram (RPH). The RPH algorithm utilizes raw data in the form of a polar histogram that is read from a Laser Range Finder (LRF) and a camera. A desired open block is determined from the raw data utilizing a navigational heading and an elliptical approximation. The left and right most radii are determined from the calculated edges of the open block and provide the range of possible radial paths the IGV can travel through. In addition, the calculated obstacle edge positions allow the IGV to recognize complex obstacle arrangements and to slow down accordingly. A radial path optimization function calculates the best radial path between the left and right most radii and is sent to motion control for speed determination. Overall, the RPH algorithm allows the IGV to autonomously travel at average speeds of 3mph while avoiding all obstacles, with a processing time of approximately 10ms.
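The open-block search at the center of RPH can be sketched as finding the widest run of low-density sectors in the polar histogram. This simplified version ignores wrap-around and the elliptical approximation described above:

```python
import numpy as np

def widest_open_block(polar_hist, threshold):
    """Return the (first, last) sector indices of the widest run of
    consecutive polar-histogram sectors whose obstacle density is below
    `threshold` -- the candidate 'open block' the planner steers through."""
    is_open = np.asarray(polar_hist) < threshold
    best = (-1, -1)
    best_len = run = start = 0
    for i, sector_open in enumerate(is_open):
        if sector_open:
            if run == 0:
                start = i           # a new open run begins here
            run += 1
            if run > best_len:
                best_len, best = run, (start, i)
        else:
            run = 0
    return best

gap = widest_open_block([5, 0, 0, 0, 5, 0, 5], threshold=1)
```

The edges of the returned block correspond to the left- and right-most radii from which the radial path optimization then chooses a turning radius.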
User Aligned Histogram Stacks for Visualization of Abdominal Organs via MRI
Özdemir, M.; Akay, O.; Güzeliş, C.; Dicle, O.; Selver, M. A.
2016-08-01
Multi-dimensional transfer functions (MDTF) are occasionally designed as two-step approaches. At the first step, the constructed domain is modelled coarsely using global volume statistics and an initial transfer function (TF) is designed. Then, a finer classification is performed using local information to refine the TF design. In this study, both a new TF domain and a novel two-step MDTF strategy are proposed for visualization of abdominal organs. The proposed domain is generated by aligning the histograms of the slices, which are reconstructed based on user aligned majority axis/regions through an interactive Multi-Planar Reconstruction graphical user interface. It is shown that these user aligned histogram stacks (UAHS) exploit more a priori information by providing tissue specific inter-slice spatial domain knowledge. For initial TF design, UAHS are approximated using a multi-scale hierarchical Gaussian mixture model, which is designed to work in quasi real time. Then, a finer classification step is carried out for refinement of the initial result. Applications to several MRI data sets acquired with various sequences demonstrate improved visualization of abdomen.
Decoding brain cancer dynamics: a quantitative histogram-based approach using temporal MRI
Zhou, Mu; Hall, Lawrence O.; Goldgof, Dmitry B.; Russo, Robin; Gillies, Robert J.; Gatenby, Robert A.
2015-03-01
Brain tumor heterogeneity remains a challenge for probing brain cancer evolutionary dynamics. In light of evolution, it is a priority to inspect the cancer system from a time-domain perspective since it explicitly tracks the dynamics of cancer variations. In this paper, we study the problem of exploring brain tumor heterogeneity from temporal clinical magnetic resonance imaging (MRI) data. Our goal is to discover evidence-based knowledge from such temporal imaging data, where multiple clinical MRI scans from Glioblastoma multiforme (GBM) patients are generated during therapy. In particular, we propose a quantitative histogram-based approach that builds a prediction model to measure the difference in histograms obtained from pre- and post-treatment. The study could significantly assist radiologists by providing a metric to identify distinctive patterns within each tumor, which is crucial for the goal of providing patient-specific treatments. We examine the proposed approach for a practical application - clinical survival group prediction. Experimental results show that our approach achieved 90.91% accuracy.
Beyond histograms: Efficiently estimating radial distribution functions via spectral Monte Carlo
Patrone, Paul N.; Rosch, Thomas W.
2017-03-01
Despite more than 40 years of research in condensed-matter physics, state-of-the-art approaches for simulating the radial distribution function (RDF) g(r) still rely on binning pair-separations into a histogram. Such methods suffer from undesirable properties, including subjectivity, high uncertainty, and slow rates of convergence. Moreover, such problems go undetected by the metrics often used to assess RDFs. To address these issues, we propose (I) a spectral Monte Carlo (SMC) quadrature method that yields g(r) as an analytical series expansion and (II) a Sobolev norm that assesses the quality of RDFs by quantifying their fluctuations. Using the latter, we show that, relative to histogram-based approaches, SMC reduces by orders of magnitude both the noise in g(r) and the number of pair separations needed for acceptable convergence. Moreover, SMC reduces subjectivity and yields simple, differentiable formulas for the RDF, which are useful for tasks such as coarse-grained force-field calibration via iterative Boltzmann inversion.
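The conventional histogram-binned RDF estimate that SMC is benchmarked against can be sketched as follows (a minimal numpy illustration under simplifying assumptions: cubic periodic box, minimum-image convention, equal-width bins; not the paper's code):

```python
import numpy as np

def rdf_histogram(positions, box_length, n_bins=50, r_max=None):
    """Estimate g(r) by binning pair separations into a histogram
    (the conventional approach the SMC method aims to replace)."""
    n = len(positions)
    r_max = r_max if r_max is not None else box_length / 2
    # all pair separations with the minimum-image convention
    deltas = positions[:, None, :] - positions[None, :, :]
    deltas -= box_length * np.round(deltas / box_length)
    dists = np.sqrt((deltas ** 2).sum(-1))
    dists = dists[np.triu_indices(n, k=1)]          # unique pairs only
    counts, edges = np.histogram(dists, bins=n_bins, range=(0, r_max))
    # normalize by the ideal-gas expectation in each spherical shell
    shell_vol = 4.0 / 3.0 * np.pi * (edges[1:] ** 3 - edges[:-1] ** 3)
    density = n / box_length ** 3
    g = counts / (shell_vol * density * n / 2)
    return 0.5 * (edges[1:] + edges[:-1]), g
```

For an ideal (uniform random) system, g(r) fluctuates around 1; the bin-to-bin noise visible in such an estimate is exactly what the Sobolev-norm diagnostic in the paper quantifies.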
Takemoto, Seiji; Yamaoka, Kiyoshi; Nishikawa, Makiya; Takakura, Yoshinobu
2006-12-01
A bootstrap method is proposed for assessing statistical histograms of pharmacokinetic parameters (AUC, MRT, CL and V(ss)) from one-point sampling data in animal experiments. A computer program, MOMENT(BS), written in Visual Basic on Microsoft Excel, was developed for the bootstrap calculation and the construction of histograms. MOMENT(BS) was applied to one-point sampling data of the blood concentration of three physiologically active proteins ((111)In labeled Hsp70, Suc(20)-BSA and Suc(40)-BSA) administered in different doses to mice. The histograms of AUC, MRT, CL and V(ss) were close to a normal (Gaussian) distribution with the bootstrap resampling number (200), or more, considering the skewness and kurtosis of the histograms. A good agreement of means and SD was obtained between the bootstrap and Bailer's approaches. The hypothesis test based on the normal distribution clearly demonstrated that the disposition of (111)In-Hsp70 and Suc(20)-BSA was almost independent of dose, whereas that of (111)In-Suc(40)-BSA was definitely dose-dependent. In conclusion, the bootstrap method was found to be an efficient method for assessing the histogram of pharmacokinetic parameters of blood or tissue disposition data by one-point sampling.
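MOMENT(BS) itself is a Visual Basic program on Excel; the sketch below only illustrates the general bootstrap idea for building an AUC histogram from one-point sampling data, using a hypothetical trapezoidal AUC over per-time-point resamples (an assumption, not the program's algorithm):

```python
import numpy as np

def bootstrap_auc(time_points, conc_by_time, n_boot=200, rng=None):
    """Bootstrap histogram of AUC from one-point sampling data:
    each animal contributes one concentration at one time point,
    so each resampled curve is built from per-time-point resamples."""
    rng = np.random.default_rng(rng)
    t = np.asarray(time_points, dtype=float)
    aucs = np.empty(n_boot)
    for b in range(n_boot):
        # resample animals with replacement within each time point
        m = np.array([rng.choice(c, size=len(c), replace=True).mean()
                      for c in conc_by_time])
        # trapezoidal AUC over the resampled mean curve
        aucs[b] = np.sum(np.diff(t) * (m[1:] + m[:-1]) / 2)
    return aucs
```

The skewness and kurtosis of the resulting array can then be checked for approximate normality, as the study does with 200 or more resamples.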
Energy Technology Data Exchange (ETDEWEB)
Zhang, Yu-Dong; Wang, Qing; Wu, Chen-Jiang; Wang, Xiao-Ning; Zhang, Jing; Liu, Xi-Sheng; Shi, Hai-Bin [First Affiliated Hospital with Nanjing Medical University, Department of Radiology, Nanjing, Jiangsu Province (China); Liu, Hui [Siemens Healthcare, MR Collaborations NE Asia, Shanghai (China)
2015-04-01
To evaluate histogram analysis of intravoxel incoherent motion (IVIM) for discriminating the Gleason grade of prostate cancer (PCa). A total of 48 patients pathologically confirmed as having clinically significant PCa (size > 0.5 cm) underwent preoperative DW-MRI (b = 0-900 s/mm²). Data were post-processed with monoexponential and IVIM models for quantitation of apparent diffusion coefficients (ADCs), perfusion fraction f, diffusivity D and pseudo-diffusivity D*. Histogram analysis was performed by outlining entire-tumour regions of interest (ROIs) from histological-radiological correlation. The ability of imaging indices to differentiate low-grade (LG, Gleason score (GS) ≤ 6) from intermediate/high-grade (HG, GS > 6) PCa was analysed by ROC regression. Eleven patients had LG tumours (18 foci) and 37 patients had HG tumours (42 foci) on pathology examination. HG tumours had significantly lower ADCs and D in terms of mean, median, 10th and 75th percentiles, combined with higher histogram kurtosis and skewness for ADCs, D and f, than LG PCa (p < 0.05). Histogram D showed relatively higher correlations (r = 0.641-0.668 vs. ADCs: 0.544-0.574) with ordinal GS of PCa, and its mean, median and 10th percentile performed better than ADCs in distinguishing LG from HG PCa. It is feasible to stratify the pathological grade of PCa by IVIM with histogram metrics. D performed better than conventional ADCs in distinguishing LG from HG tumours. (orig.)
Overlapping constraint for variational surface reconstruction
DEFF Research Database (Denmark)
Aanæs, Henrik; Solem, J.E.
2005-01-01
In this paper a counter example, illustrating a shortcoming in most variational formulations for 3D surface estimation, is presented. The nature of this shortcoming is a lack of an overlapping constraint. A remedy for this shortcoming is presented in the form of a penalty function, with an analysis of the effects of this function on surface motion. For practical purposes, this will only have minor influence on current methods. However, the insight provided in the analysis is likely to influence future developments in the field of variational surface reconstruction.
Overlap Quark Propagator in Coulomb Gauge QCD
Mercado, Ydalia Delgado; Schröck, Mario
2014-01-01
The chirally symmetric Overlap quark propagator is explored in Coulomb gauge. This gauge is well suited for studying the relation between confinement and chiral symmetry breaking, since confinement can be attributed to the infrared divergent Lorentz-vector dressing function. Using quenched gauge field configurations on a $20^4$ lattice, the quark propagator dressing functions are evaluated, the dynamical quark mass is extracted and the chiral limit of these quantities is discussed. By removing the low-lying modes of the Dirac operator, chiral symmetry is artificially restored. Its effect on the dressing functions is discussed.
THE PHASE-OFFSET OVERLAPPED WAVE TECHNIQUE
Institute of Scientific and Technical Information of China (English)
Liang Dequn; Liang Weihua; Sun Changnian
2003-01-01
A new digital communication technology based on Phase-Offset Overlapped Waves (POOW) is introduced in this letter. The waves can be considered a special multicarrier scheme different from traditional ones. The sub-waves within a code word's period of the POOW are sine waves with the same frequency but different starting phases. Their most important characteristic is that these sub-waves are piecewise functions and are not orthogonal within a code word period. Decoding can be implemented by solving a system of linear equations. This code has very high efficiency, and thus the data transmission rate is increased greatly.
Technology initiatives with government/business overlap
Knapp, Robert H., Jr.
2015-03-01
Three important present-day technology development settings involve significant overlap between government and private sectors. The Advanced Research Project Agency for Energy (ARPA-E) supports a wide range of "high risk, high return" projects carried out in academic, non-profit or private business settings. The Materials Genome Initiative (MGI), based in the White House, aims at radical acceleration of the development process for advanced materials. California public utilities such as Pacific Gas & Electric operate under a structure of financial returns and political program mandates that make them arms of public policy as much as independent businesses.
$B_{K}$ from quenched overlap QCD
Garron, N; Hölbling, C; Lellouch, L P; Rebbi, C
2003-01-01
We present an exploratory calculation of the standard model Delta S=2 matrix element relevant for indirect CP violation in K -> pi pi decays. The computation is performed with overlap fermions in the quenched approximation at beta=6.0 on a 16^3x32 lattice. The resulting bare matrix element is renormalized non-perturbatively. Our preliminary result is B_K^{NDR}(2 GeV)=0.61(7), where the error does not yet include an estimate of systematic uncertainties.
Li, Huibin
2011-09-01
This paper presents a mesh-based approach for 3D face recognition using a novel local shape descriptor and a SIFT-like matching process. Both maximum and minimum curvatures estimated in the 3D Gaussian scale space are employed to detect salient points. To comprehensively characterize 3D facial surfaces and their variations, we calculate weighted statistical distributions of multiple-order surface differential quantities, including the histogram of mesh gradient (HoG), histogram of shape index (HoS) and histogram of gradient of shape index (HoGS) within a local neighborhood of each salient point. The subsequent matching step then robustly associates corresponding points of two facial surfaces, leading to many more matched points between different scans of the same person than between scans of different persons. Experimental results on the Bosphorus dataset highlight the effectiveness of the proposed method and its robustness to facial expression variations. © 2011 IEEE.
Directory of Open Access Journals (Sweden)
Abdelbaset Buhmeida
1999-01-01
Twenty-one fine needle aspiration biopsies (FNAB) of the prostate, diagnostically classified as definitely malignant, were studied. The Papanicolaou- or H&E-stained samples were destained and then stained for DNA with the Feulgen reaction. DNA cytometry was applied after different sampling rules. The histograms varied according to the sampling rule applied. Because free cells between cell groups were easier to measure than cells within the cell groups, two sampling rules were tested in all samples: (i) cells in the cell groups were measured, and (ii) free cells between cell groups were measured. Abnormal histograms were more common after the sampling rule based on free cells, suggesting that abnormal patterns are best revealed through the free cells in these samples. The conclusions were independent of the applied histogram interpretation method.
Directory of Open Access Journals (Sweden)
C.K. Madhusudana
2016-09-01
This paper deals with fault diagnosis of a face milling tool based on a machine learning approach using histogram features and the K-star algorithm. Vibration signals of the milling tool under healthy and different fault conditions are acquired during machining of the steel alloy 42CrMo4. Histogram features are extracted from the acquired signals. A decision tree is used to select the salient features out of all the extracted features, and these selected features are used as input to the classifier. The K-star algorithm is used as the classifier, and the output of the model is utilised to study and classify the different conditions of the face milling tool. Based on the experimental results, the K-star algorithm with histogram features provided classification accuracy in the range of 94% to 96%, which is acceptable for fault diagnosis.
Moriya, Tomohisa; Saito, Kazuhiro; Tajima, Yu; Harada, Taiyo L; Araki, Yoichi; Sugimoto, Katsutoshi; Tokuuye, Koichi
2017-01-05
To retrospectively evaluate the usefulness of three-dimensional (3D) analysis of apparent diffusion coefficient (ADC) histograms for differentiating the histological grade of hepatocellular carcinoma (HCC). The subjects consisted of 53 patients with 56 HCCs, including 12 well-differentiated, 35 moderately differentiated, and nine poorly differentiated HCCs. Diffusion-weighted imaging (b-values of 100 and 800 s/mm²) was obtained within 3 months before surgery. Regions of interest (ROIs) covered the entire tumor. The data acquired from each slice were summated to derive voxel-by-voxel ADCs for the entire tumor. The following parameters were derived from the ADC histogram: mean, standard deviation, minimum, maximum, mode, percentiles (5th, 10th, 25th, 50th, 75th, and 90th), skew, and kurtosis. These parameters were analyzed according to histological grade. After eliminating steatosis lesions, these parameters were re-analyzed. A weak correlation was observed in minimum ADC and 5th percentile for each histological grade (r = -0.340 and r = -0.268, respectively). The minimum ADCs of well, moderately, and poorly differentiated HCC were 585 ± 388, 411 ± 278, and 235 ± 102 × 10⁻⁶ mm²/s, respectively. Minimum ADC showed significant differences among tumor histological grades (P = 0.009). The minimum ADC of poorly differentiated HCC and that of combined well and moderately differentiated HCC were 236 ± 102 and 437 ± 299 × 10⁻⁶ mm²/s. The minimum ADC of poorly differentiated HCC was significantly lower than that of combined well and moderately differentiated HCC (P = 0.001). The sensitivity and specificity, when a minimum ADC of 400 × 10⁻⁶ mm²/s or lower was considered to indicate poorly differentiated HCC, were 100 and 54%, respectively. After exclusion of the effect of steatosis, the sensitivity and specificity did not change, although the statistical differences became stronger (P < 0
Heterogeneity of asthma–COPD overlap syndrome
Joo, Hyonsoo; Han, Deokjae; Lee, Jae Ha; Rhee, Chin Kook
2017-01-01
Many patients suffering from asthma or COPD have overlapping features of both diseases. However, a phenotypical approach for evaluating asthma–COPD overlap syndrome (ACOS) has not been established. In this report, we examined the phenotypes in patients with ACOS. Patients diagnosed with ACOS between 2011 and 2015 were identified and classified into four phenotype groups. Group A was composed of patients who smoked <10 pack years and had blood eosinophil counts ≥300. Group B was composed of patients who smoked <10 pack years and had blood eosinophil counts <300. Group C was composed of patients who smoked ≥10 pack years and had blood eosinophil counts ≥300. Group D was composed of patients who smoked ≥10 pack years and had blood eosinophil counts <300. Clinical characteristics were analyzed and compared among groups. Comparisons were made among 103 ACOS patients. Patients in group D were oldest, while patients in group A were youngest. There were relatively more female patients in groups A and B; the majority of patients in groups C and D were male. The degree of airflow obstruction was most severe in group C. The rate of being free of severe exacerbation was significantly lower in group C than in the other groups. In this study, each ACOS phenotype showed different characteristics. The proportion of patients free of severe exacerbation differed significantly among groups. At this time, further studies on the phenotypes of ACOS are required.
Adaptive overlapped sub-blocks contrast enhancement
Chen, Anqiu; Yuan, Fei; Liu, Jing; Liu, Siqi; Li, An; Zheng, Zhenrong
2016-09-01
In this paper, an overlapped sub-block gray-level average method for contrast enhancement, called AOSCE, is presented. Correcting uneven illumination in transmitted-light microscope images is a recurring problem in digital image processing, and images in dark regions also need such correction. Traditional mask methods and background-fitting methods are restricted to particular application scenarios, and background fitting yields corrected images of low brightness, so their applicability is limited; a new correction method is therefore proposed that combines the mask method with sub-block gray-level averaging. The image is divided into many overlapping sub-blocks; the average gray level of the whole image, M, and of each block, m_i, are computed, and each block is shifted by d = m_i - M to obtain a corrected image. The minimum gray level of each block is then collected into a matrix to estimate the background, which is rescaled to the image size by bilinear interpolation and smoothed by fitting in MATLAB, and the background is subtracted to yield the contrast-enhanced image.
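The block-wise mean correction at the core of the method can be sketched as follows (a simplified numpy illustration using non-overlapping blocks and omitting the background-fitting step; `block_mean_correction` is a hypothetical name, not the paper's code):

```python
import numpy as np

def block_mean_correction(img, block=8):
    """Even out illumination: subtract each block's deviation
    from the global mean gray level (non-overlapping blocks here;
    the paper uses overlapped blocks plus background fitting)."""
    img = img.astype(float)
    global_mean = img.mean()
    out = img.copy()
    h, w = img.shape
    for i in range(0, h, block):
        for j in range(0, w, block):
            patch = out[i:i + block, j:j + block]  # view into `out`
            patch -= patch.mean() - global_mean    # d = m_i - M
    return np.clip(out, 0, 255)
```

On an image whose left half is darker than its right half, this shifts both halves toward the global mean, which is the uneven-illumination correction the abstract describes.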
Symptom overlap in anxiety and multiple sclerosis.
LENUS (Irish Health Repository)
O Donnchadha, Seán
2013-02-14
BACKGROUND: The validity of self-rated anxiety inventories in people with multiple sclerosis (pwMS) is unclear. However, the appropriateness of self-reported depression scales has been widely examined. Given somatic symptom overlap between depression and MS, research emphasises caution when using such scales. OBJECTIVE: This study evaluates symptom overlap between anxiety and MS in a group of 33 individuals with MS, using the Beck Anxiety Inventory (BAI). METHODS: Participants underwent a neurological examination and completed the BAI. RESULTS: A novel procedure using hierarchical cluster analysis revealed three distinct symptom clusters. Cluster one ('wobbliness' and 'unsteady') grouped separately from all other BAI items. These symptoms are well-recognised MS-related symptoms and we question whether their endorsement in pwMS can be considered to reflect anxiety. A modified 19-item BAI (mBAI) was created which excludes cluster one items. This removal reduced the number of MS participants considered 'anxious' by 21.21% (low threshold) and altered the level of anxiety severity for a further 27.27%. CONCLUSION: Based on these data, it is suggested that, as with depression measures, researchers and clinicians should exercise caution when using brief screening measures for anxiety in pwMS.
Diffuse interstitial lung disease: overlaps and uncertainties
Energy Technology Data Exchange (ETDEWEB)
Walsh, Simon L.F.; Hansell, David M. [Royal Brompton Hospital, Department of Radiology, London (United Kingdom)
2010-08-15
Histopathological analysis of lung biopsy material allows the diagnosis of idiopathic interstitial pneumonias; however, the strength of this diagnosis is sometimes subverted by interobserver variation and sampling. The American Thoracic Society and European Respiratory Society recommendations of 2002 provide a framework for the diagnosis of interstitial lung disease (ILD) and proposed an integrated clinical, radiological and histopathological approach. These recommendations represent a break with tradition by replacing the 'gold standard' of histopathology with the combined 'silver standards' of clinical, imaging and histopathological information. One of the pitfalls of a rigid classification system for the diagnosis of interstitial lung disease is its failure to accommodate the phenomenon of overlapping disease patterns. This article reviews the various ways that interstitial lung disease may be classified and discusses their applicability. In addition the issue of overlap disease patterns is considered in the context of histopathological interobserver variation and sampling error and how a pigeonhole approach to disease classification may overlook these hybrid entities. (orig.)
Activation of words with phonological overlap
Directory of Open Access Journals (Sweden)
Claudia K. Friedrich
2013-08-01
Multiple lexical representations overlapping with the input (cohort neighbors) are temporarily activated in the listener's mental lexicon when speech unfolds in time. Activation for cohort neighbors appears to rapidly decline as soon as there is mismatch with the input. However, it is a matter of debate whether or not they are completely excluded from further processing. We recorded behavioral data and event-related brain potentials (ERPs) in auditory-visual word onset priming during a lexical decision task. As primes we used the first two syllables of spoken German words. In a carrier word condition, the primes were extracted from spoken versions of the target words (ano- from ANORAK 'anorak'). In a cohort neighbor condition, the primes were taken from words that overlap with the target word up to the second nucleus (ana- taken from ANANAS 'pineapple'). Relative to a control condition, where primes and targets were unrelated, lexical decision responses for cohort neighbors were delayed. This reveals that cohort neighbors are disfavored by the decision processes at the behavioral front end. In contrast, left-anterior ERPs reflected long-lasting facilitated processing of cohort neighbors. We interpret these results as evidence for extended parallel processing of cohort neighbors. That is, in parallel to the preparation and elicitation of delayed lexical decision responses to cohort neighbors, aspects of the processing system appear to keep track of those less efficient candidates.
Directory of Open Access Journals (Sweden)
Shapovalov S. N.
2009-10-01
We have found that the shape of the histograms constructed from the results of radioactivity measurements changes in correlation with the distortions of the lunar Keplerian orbit (due to the gravitational influence of the Sun). Taking into account that the phenomenon of "macroscopic fluctuations" (regular changes in the fine structure of histograms constructed from the results of measurements of natural processes) does not depend on the nature of the process under study, one can consider the correlation of the histogram shape with the Moon's deviations from the Keplerian orbit to be independent of the nature of the process the histograms were obtained from.
Wavelength-Adaptive Dehazing Using Histogram Merging-Based Classification for UAV Images
Directory of Open Access Journals (Sweden)
Inhye Yoon
2015-03-01
Since incoming light to an unmanned aerial vehicle (UAV) platform can be scattered by haze and dust in the atmosphere, the acquired image loses the original color and brightness of the subject. Enhancement of hazy images is an important task in improving the visibility of various UAV images. This paper presents a spatially-adaptive dehazing algorithm that merges color histograms with consideration of the wavelength-dependent atmospheric turbidity. Based on the wavelength-adaptive hazy image acquisition model, the proposed dehazing algorithm consists of three steps: (i) image segmentation based on geometric classes; (ii) generation of the context-adaptive transmission map; and (iii) intensity transformation for enhancing a hazy UAV image. The major contribution of the research is a novel hazy UAV image degradation model by considering the wavelength of light sources. In addition, the proposed transmission map provides a theoretical basis to differentiate visually important regions from others based on the turbidity and merged classification results.
A 124 Mpixels/s VLSI design for histogram-based joint bilateral filtering.
Tseng, Yu-Cheng; Hsu, Po-Hsiung; Chang, Tian-Sheuan
2011-11-01
This paper presents an efficient and scalable design for histogram-based bilateral filtering (BF) and joint BF (JBF) by memory reduction methods and architecture design techniques to solve the problems of high memory cost, high computational complexity, high bandwidth, and large range table. The presented memory reduction methods exploit the progressive computing characteristics to reduce the memory cost to 0.003%-0.020%, as compared with the original approach. Furthermore, the architecture design techniques adopt range domain parallelism and take advantage of the computing order and the numerical properties to solve the complexity, bandwidth, and range-table problems. The example design with a 90-nm complementary metal-oxide-semiconductor process can deliver the throughput to 124 Mpixels/s with 356-K gate counts and 23-KB on-chip memory.
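The histogram formulation of the bilateral filter that the VLSI design accelerates can be illustrated with a naive software sketch (a simplified illustration that omits the spatial kernel and all hardware techniques; this is an assumption about the general formulation, not the paper's architecture):

```python
import numpy as np

def bilateral_hist(img, radius=2, sigma_r=10.0):
    """Naive histogram-based bilateral filter for 8-bit images:
    each output pixel is a range-weighted average over the local
    gray-level histogram (spatial kernel omitted for brevity)."""
    levels = np.arange(256)
    # precomputed range kernel, analogous to the hardware range table
    kernel = np.exp(-0.5 * ((levels[:, None] - levels[None, :]) / sigma_r) ** 2)
    h, w = img.shape
    out = np.zeros((h, w))
    for y in range(h):
        for x in range(w):
            y0, y1 = max(0, y - radius), min(h, y + radius + 1)
            x0, x1 = max(0, x - radius), min(w, x + radius + 1)
            hist = np.bincount(img[y0:y1, x0:x1].ravel(), minlength=256)
            wgt = kernel[img[y, x]] * hist
            out[y, x] = (wgt * levels).sum() / wgt.sum()
    return out
```

Computing the output from a local histogram rather than from individual neighbor pixels is what makes the progressive-computation and memory-reduction techniques of the chip possible.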
Ship detection and extraction using visual saliency and histogram of oriented gradient
Xu, Fang; Liu, Jing-hong
2016-11-01
A novel unsupervised ship detection and extraction method is proposed. A combination model based on visual saliency is constructed for searching the ship target regions and suppressing false alarms. The salient target regions are extracted and marked through segmentation. A Radon transform is applied to confirm the suspected ship targets with symmetric profiles. Then, a new descriptor, an improved histogram of oriented gradients (HOG), is introduced to discriminate the real ships. The experimental results on real optical remote sensing images demonstrate that numerous ships can be extracted and located successfully, and the number of ships can be accurately acquired. Furthermore, the proposed method is superior to the comparison methods in terms of both accuracy rate and false alarm rate.
Scale and Orientation-Based Background Weighted Histogram for Human Tracking
Laaroussi, Khadija; Saaidi, Abderrahim; Masrar, Mohamed; Satori, Khalid
2016-09-01
The Mean Shift procedure is a popular object tracking algorithm since it is fast, easy to implement and performs well in a range of conditions. However, the classic Mean Shift tracking algorithm fixes the size and orientation of the tracking window, which limits performance when the target's orientation and scale change. In this paper, we present a new human tracking algorithm based on the Mean Shift technique that estimates the position, scale and orientation changes of the target. This work combines moment features of the weight image with background information to design a robust tracking algorithm entitled Scale and Orientation-based Background Weighted Histogram (SOBWH). The experimental results show that the proposed SOBWH approach presents a good compromise between tracking precision and calculation time, and they validate its robustness, especially to large background variation, scale and orientation changes and similar background scenes.
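The background-weighted histogram ingredient can be sketched using the classic formulation of Comaniciu et al., in which target-model bins that are prominent in the surrounding background are down-weighted (the paper's SOBWH additionally estimates scale and orientation from moments of the weight image; the function below is an illustrative assumption, with pixels already quantized to bin indices):

```python
import numpy as np

def background_weighted_histogram(target_bins, background_bins, n_bins=16):
    """Down-weight target-model histogram bins that are also
    prominent in the surrounding background region."""
    t_hist = np.bincount(target_bins, minlength=n_bins).astype(float)
    b_hist = np.bincount(background_bins, minlength=n_bins).astype(float)
    t_hist /= t_hist.sum()
    b_hist /= b_hist.sum()
    nonzero = b_hist > 0
    o_star = b_hist[nonzero].min()       # smallest nonzero background bin
    v = np.ones(n_bins)                  # per-bin weights v_u = min(o*/o_u, 1)
    v[nonzero] = np.minimum(o_star / b_hist[nonzero], 1.0)
    q = v * t_hist
    return q / q.sum()
```

Colors shared with the background thus contribute less to the target model, which sharpens the weight image that Mean Shift climbs.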
Ho, K. C.; Gader, P. D.; Frigui, H.; Wilson, J. N.
2007-04-01
This paper examines the confidence-level fusion of several promising algorithms for the vehicle-mounted ground penetrating radar landmine detection system. The detection algorithms considered here include the Edge Histogram Descriptor (EHD), Hidden Markov Model (HMM), Spectral Correlation Feature (SCF) and NUKEv6. We first form a confidence vector by collecting the confidence values from the four individual detectors. The fused confidence is assigned to be the difference between the square of the Mahalanobis distance to the non-mine class and the square of the Mahalanobis distance to the mine class. Experimental results on a data collection that contains over 1500 mine encounters indicate that the proposed fusion technique can reduce the false alarm rate by a factor of two at 90% probability of detection when compared to the best individual detector.
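The stated fusion rule, the squared Mahalanobis distance to the non-mine class minus that to the mine class, can be sketched directly (illustrative numpy code; the class means and covariances would be estimated from training data, and in the paper the confidence vector has four components, one per detector):

```python
import numpy as np

def fused_confidence(conf_vec, mean_mine, cov_mine, mean_clutter, cov_clutter):
    """Fuse detector confidences: difference of squared Mahalanobis
    distances to the non-mine and mine classes (larger => more mine-like)."""
    def d2(x, mu, cov):
        diff = x - mu
        return diff @ np.linalg.inv(cov) @ diff
    return d2(conf_vec, mean_clutter, cov_clutter) - d2(conf_vec, mean_mine, cov_mine)
```

A confidence vector lying exactly at the mine-class mean gets a positive score equal to its squared distance from the clutter class, so thresholding this score trades detections against false alarms.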
Directory of Open Access Journals (Sweden)
Manoj Kumar Sharma,
2014-05-01
Steganography is the art of hiding the fact that communication is taking place, by hiding information inside other information. The motivation for this work is to protect information during transmission without the information being detected. We propose a new technique, "A New Steganography Technique Based on Difference Scheme of RGB Channels and Text Using Histogram Analysis". The proposed technique can encode any colored image file in order to protect confidential text data from unauthorized access, and can be applied to very small images (24 × 24) as well as large images (1024 × 1024). We use the image-quality metrics peak signal-to-noise ratio (PSNR) and mean squared error (MSE): the proposed method has a PSNR value greater than 50 and an MSE value below 1.
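The reported quality metrics, MSE and PSNR, are standard and can be computed as follows (a minimal sketch for 8-bit images):

```python
import numpy as np

def mse_psnr(cover, stego, peak=255.0):
    """Mean squared error and peak signal-to-noise ratio between
    the cover image and the stego image (8-bit range assumed)."""
    err = (cover.astype(float) - stego.astype(float)) ** 2
    mse = err.mean()
    psnr = float('inf') if mse == 0 else 10 * np.log10(peak ** 2 / mse)
    return mse, psnr
```

A PSNR above roughly 50 dB (MSE below 1) is generally taken to mean the embedding is visually imperceptible, which is the regime the abstract reports.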
Context-sensitive patch histograms for detecting rare events in histopathological data
Diaz, Kristians; Baust, Maximilian; Navab, Nassir
2017-03-01
Assessment of histopathological data is difficult not only due to its varying appearance, e.g. caused by staining artifacts, but also due to its sheer size: common whole-slide images feature a resolution of 6000 × 4000 pixels. Finding rare events in such data sets is therefore a challenging and tedious task, and developing sophisticated computerized tools is not easy, especially when no or little training data is available. In this work, we propose a learning-free yet effective approach based on context-sensitive patch histograms for finding extramedullary hematopoiesis events in Hematoxylin-Eosin-stained images. When combined with a simple nucleus detector, this approach achieves a sensitivity of 0.7146, a specificity of 0.8476 and an accuracy of 0.8353, which is well comparable to a recently published approach based on random forests.
Divakaran, Sajilal
2012-01-01
The success rates of optical character recognition (OCR) systems for printed Malayalam documents are quite impressive, with state-of-the-art accuracy levels in the range of 85-95%. However, real applications require further enhancement of these accuracy levels. One of the bottlenecks to further improvement is identified as close-matching characters. In this paper, we delineate the close-matching characters in Malayalam and report the development of a specialised classifier for them. The output of a state-of-the-art OCR system is taken, and characters falling into the close-matching character set are fed into this specialised classifier to enhance accuracy. The classifier is based on the support vector machine algorithm and uses feature vectors derived from spectral coefficients of projection histogram signals of the close-matching characters.
A hardware-oriented histogram of oriented gradients algorithm and its VLSI implementation
Zhang, Xiangyu; An, Fengwei; Nakashima, Ikki; Luo, Aiwen; Chen, Lei; Ishii, Idaku; Jürgen Mattausch, Hans
2017-04-01
A challenging and important issue for object recognition is feature extraction on embedded systems. We report a hardware implementation of the histogram of oriented gradients (HOG) algorithm for real-time object recognition, which is known to provide high efficiency and accuracy. The developed hardware-oriented algorithm exploits the cell-based scan strategy which enables image-sensor synchronization and extraction-speed acceleration. Furthermore, buffers for image frames or integral images are avoided. An image-size scalable hardware architecture with an effective bin-decoder and a parallelized voting element (PVE) is developed and used to verify the hardware-oriented HOG implementation with the application of human detection. The fabricated test chip in 180 nm CMOS technology achieves fast processing speed and large flexibility for different image resolutions with substantially reduced hardware cost and energy consumption.
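The per-cell orientation histogram at the heart of HOG can be sketched in a few lines. This is a simplified software version (no block normalization or bin interpolation), not the cell-based hardware scan described in the paper:

```python
import numpy as np

def cell_hog(cell, n_bins=9):
    """Unsigned (0-180 degree) orientation histogram for a single HOG cell:
    each pixel votes its gradient magnitude into an orientation bin."""
    gy, gx = np.gradient(cell.astype(np.float64))
    mag = np.hypot(gx, gy)
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180.0
    bin_idx = (ang / (180.0 / n_bins)).astype(int) % n_bins
    hist = np.zeros(n_bins)
    np.add.at(hist, bin_idx, mag)     # magnitude-weighted voting
    return hist

cell = np.tile(np.arange(8.0), (8, 1))   # horizontal intensity ramp
print(cell_hog(cell))                     # all energy lands in the 0-degree bin
```

Production HOG implementations additionally interpolate votes between adjacent bins and normalize histograms over overlapping blocks; the hardware-oriented algorithm in the paper restructures exactly this voting stage for a cell-based scan.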
REAL-TIME FACE RECOGNITION BASED ON OPTICAL FLOW AND HISTOGRAM EQUALIZATION
Directory of Open Access Journals (Sweden)
D. Sathish Kumar
2013-05-01
Face recognition is one of the intensive areas of research in computer vision and pattern recognition, but many studies focus on recognizing faces under varying facial expressions and pose variation. The constrained optical flow algorithm discussed in this paper recognizes facial images involving various expressions based on motion-vector computation. We propose an optical flow computation algorithm that processes frames of varying facial gestures and integrates them with a synthesized image in a probabilistic framework. Histogram equalization is used to overcome the effect of illumination when capturing the input data with camera devices; it also enhances the contrast of the image for better processing. The experimental results confirm that the proposed face recognition system is robust and recognizes facial images under varying expressions and pose variations more accurately.
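The histogram equalization step mentioned in the abstract is standard: the cumulative histogram is turned into a lookup table that remaps intensities toward a uniform distribution. A minimal sketch for 8-bit grayscale frames:

```python
import numpy as np

def equalize_histogram(img):
    """Global histogram equalization for an 8-bit grayscale image
    (assumes at least two distinct intensity levels are present)."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]                  # CDF at the darkest occupied level
    scale = 255.0 / (img.size - cdf_min)
    lut = np.clip(np.round((cdf - cdf_min) * scale), 0, 255).astype(np.uint8)
    return lut[img]

# A low-contrast frame with intensities squeezed into [100, 120]
frame = (np.arange(1024) % 21 + 100).reshape(32, 32).astype(np.uint8)
out = equalize_histogram(frame)
print(out.min(), out.max())   # → 0 255
```

The low-contrast input occupies only 21 gray levels; after remapping, the output spans the full 0-255 range, which is exactly the illumination-robustness effect the abstract relies on.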
Gender Perception From Faces Using Boosted LBPH (Local Binary Pattern Histograms)
Directory of Open Access Journals (Sweden)
U. U. Tariq
2013-06-01
Automatic gender classification from faces has several applications, such as surveillance, human-computer interaction, and targeted advertisement. Humans can recognize gender from faces quite accurately, but for computer vision it is a difficult task. Many studies have targeted this problem, but most used images of faces taken under constrained conditions. Real-world applications, however, must process real-world images, which have significant variation in lighting and pose; this makes the gender classification task very difficult. We have examined the problem of automatic gender classification from faces in real-world images. Using a face detector, faces are extracted from images, aligned, and represented using local binary pattern histograms. Discriminative features are selected using AdaBoost, and the boosted LBP features are used to train a support vector machine that provides a recognition rate of 93.29%.
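The LBPH representation mentioned above can be sketched as follows: each pixel's eight neighbors are thresholded against the center pixel to form an 8-bit code, and the codes are histogrammed. This is the basic single-cell variant, for illustration only (practical pipelines use uniform patterns and per-region histograms):

```python
import numpy as np

def lbp_histogram(img):
    """Basic 3x3 local binary pattern histogram (256 bins, L1-normalized).
    Each pixel's 8 neighbors are thresholded against the center to form a byte."""
    img = img.astype(np.int32)
    c = img[1:-1, 1:-1]                      # interior (center) pixels
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros_like(c)
    for bit, (dy, dx) in enumerate(offsets):
        neigh = img[1 + dy:img.shape[0] - 1 + dy, 1 + dx:img.shape[1] - 1 + dx]
        code |= (neigh >= c).astype(np.int32) << bit
    hist = np.bincount(code.ravel(), minlength=256).astype(np.float64)
    return hist / hist.sum()

flat = np.full((10, 10), 7, dtype=np.uint8)  # uniform patch: every code is 255
print(lbp_histogram(flat)[255])              # → 1.0
```

In a full face pipeline the image is divided into cells, one such histogram is computed per cell, and the concatenated histograms form the feature vector that boosting then selects from.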
Monitoring pig movement at the slaughterhouse using optical flow and modified angular histograms
DEFF Research Database (Denmark)
Gronskyte, Ruta; Clemmensen, Line Katrine Harder; Hviid, Marchen Sonja
2016-01-01
The OF (optical flow) vectors are used to describe points of movement on all pigs and thereby analyse the herd movement. Subsequently, the OF vectors are used to identify abnormal movements of individual pigs. The OF vectors obtained from the pigs point in multiple directions rather than in one movement direction. To accommodate the multiple directions of the OF vectors, we propose to quantify OF by summing the vectors into bins according to their angles, which we call modified angular histograms. Sequential feature selection is used to select angle ranges that identify pigs moving abnormally in the herd. The vector lengths from the selected angle ranges are compared to the corresponding median, 25th and 75th percentiles from a training set that contains only normally moving pigs. We show that the method is capable of locating stationary pigs in the recordings regardless of the number of pigs.
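The modified angular histogram described above can be sketched as follows, assuming optical-flow components vx, vy: vector lengths, rather than counts, are accumulated into angular bins (a sketch of the idea, not the authors' exact binning):

```python
import numpy as np

def angular_histogram(vx, vy, n_bins=8):
    """Sum optical-flow vector lengths into bins by angle: each bin accumulates
    the lengths (not the counts) of the vectors whose direction falls in it."""
    angles = np.arctan2(vy, vx) % (2.0 * np.pi)
    lengths = np.hypot(vx, vy)
    bins = (angles / (2.0 * np.pi / n_bins)).astype(int) % n_bins
    hist = np.zeros(n_bins)
    np.add.at(hist, bins, lengths)
    return hist

# Two vectors pointing right (lengths 3 and 1) and one pointing up (length 2)
vx = np.array([3.0, 1.0, 0.0])
vy = np.array([0.0, 0.0, 2.0])
print(angular_histogram(vx, vy))   # bin 0 holds 4.0, bin 2 holds 2.0
```

Because lengths are summed, a bin captures both how many vectors point in that direction and how strong the motion is, which is what lets selected angle ranges separate stationary from normally moving pigs.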
Zhu, Fangqiang; Hummer, Gerhard
2012-02-05
The weighted histogram analysis method (WHAM) has become the standard technique for the analysis of umbrella sampling simulations. In this article, we address the challenges (1) of obtaining fast and accurate solutions of the coupled nonlinear WHAM equations, (2) of quantifying the statistical errors of the resulting free energies, (3) of diagnosing possible systematic errors, and (4) of optimally allocating the computational resources. Traditionally, the WHAM equations are solved by a fixed-point direct iteration method, despite poor convergence and possible numerical inaccuracies in the solutions. Here, we instead solve the mathematically equivalent problem of maximizing a target likelihood function, by using superlinear numerical optimization algorithms with a significantly faster convergence rate. To estimate the statistical errors in one-dimensional free energy profiles obtained from WHAM, we note that for densely spaced umbrella windows with harmonic biasing potentials, the WHAM free energy profile can be approximated by a coarse-grained free energy obtained by integrating the mean restraining forces. The statistical errors of the coarse-grained free energies can be estimated straightforwardly and then used for the WHAM results. A generalization to multidimensional WHAM is described. We also propose two simple statistical criteria to test the consistency between the histograms of adjacent umbrella windows, which help identify inadequate sampling and hysteresis in the degrees of freedom orthogonal to the reaction coordinate. Together, the estimates of the statistical errors and the diagnostics of inconsistencies in the potentials of mean force provide a basis for the efficient allocation of computational resources in free energy simulations.
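For reference, the traditional fixed-point WHAM iteration that the paper improves upon can be sketched for discrete histograms. The function name and the fixed iteration count are illustrative; the paper's contribution is to replace exactly this slow direct iteration with superlinear maximization of the equivalent likelihood:

```python
import numpy as np

def wham(hists, biases, kT=1.0, n_iter=2000):
    """Traditional fixed-point WHAM for K umbrella windows over M bins.

    hists[k, m]  : histogram counts from window k in bin m
    biases[k, m] : biasing potential of window k evaluated at bin center m
    Returns the unbiased, normalized probabilities p[m].
    """
    K, M = hists.shape
    N = hists.sum(axis=1)            # number of samples per window
    f = np.zeros(K)                  # window free energies, to self-consistency
    c = np.exp(-biases / kT)         # Boltzmann factors of the biases
    for _ in range(n_iter):
        denom = (N[:, None] * np.exp(f / kT)[:, None] * c).sum(axis=0)
        p = hists.sum(axis=0) / denom                     # WHAM equation for p
        f = -kT * np.log((c * p[None, :]).sum(axis=1))    # update of f_k
    return p / p.sum()

# Sanity check: a single window with zero bias must return the raw histogram
p = wham(np.array([[10.0, 30.0, 60.0]]), np.zeros((1, 3)))
print(p)  # ≈ [0.1, 0.3, 0.6]
```

The two coupled updates inside the loop are the nonlinear WHAM equations; direct iteration converges only linearly, which is why the authors recast the problem as likelihood maximization and apply superlinear optimizers.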
Energy Technology Data Exchange (ETDEWEB)
Koyama, Hisanobu [Department of Radiology, Hyogo Kaibara Hospital, 5208-1 Kaibara, Kaibara-cho, Tanba 669-3395 (Japan)], E-mail: hisanobu19760104@yahoo.co.jp; Ohno, Yoshiharu [Department of Radiology, Kobe University Graduate School of Medicine, 7-5-2 Kusunoki-cho, Chuo-ku, Kobe 650-0017 (Japan)], E-mail: yosirad@kobe-u.ac.jp; Yamazaki, Youichi [Department of Medical Physics and Engineering, Faculty of Health Sciences, Graduate School of Medicine, Osaka University, 1-7 Yamadaoka, Suita 565-0871 (Japan)], E-mail: y.yamazk@sahs.med.osaka-u.ac.jp; Nogami, Munenobu [Division of PET, Institute of Biomedical Research and Innovation, 2-2 Minamimachi, Minatojima, Chuo-ku, Kobe 650-0047 (Japan)], E-mail: aznogami@fbri.org; Kusaka, Akiko [Division of Radiology, Kobe University Hospital, 7-5-2 Kusunoki-cho, Chuo-ku, Kobe 650-0017 (Japan)], E-mail: a.kusaka@hosp.kobe-u.ac.jp; Murase, Kenya [Department of Medical Physics and Engineering, Faculty of Health Sciences, Graduate School of Medicine, Osaka University, 1-7 Yamadaoka, Suita 565-0871 (Japan)], E-mail: murase@sahs.med.osaka-u.ac.jp; Sugimura, Kazuro [Department of Radiology, Kobe University Graduate School of Medicine, 7-5-2 Kusunoki-cho, Chuo-ku, Kobe 650-0017 (Japan)], E-mail: sugimura@med.kobe-u.ac.jp
2010-04-15
This study aimed to assess the influence of the reconstruction algorithm on quantitative assessments in interstitial pneumonia patients. A total of 25 collagen vascular disease patients (nine male and 16 female; mean age, 57.2 years; age range, 32-77 years) underwent thin-section MDCT examinations, and the MDCT data were reconstructed with three reconstruction algorithms (two high-frequency [A and B] and one standard [C]). In reconstruction algorithm B, the effect of low- and middle-frequency space was suppressed compared with reconstruction algorithm A. As quantitative CT parameters, kurtosis, skewness, and mean lung density (MLD) were acquired from a frequency histogram of the whole lung parenchyma for each reconstruction algorithm. To determine how the quantitative CT parameters were affected by the reconstruction algorithms, these parameters were compared statistically. To determine the relationships with disease severity, these parameters were correlated with pulmonary function tests (PFTs). In the results, all histogram parameter values differed significantly from each other (p < 0.0001), and those of reconstruction algorithm C were the highest. All MLDs had fair or moderate correlation with all PFT parameters (-0.64 < r < -0.45, p < 0.05). Although kurtosis and skewness in high-frequency reconstruction algorithm A correlated significantly with all PFT parameters (-0.61 < r < -0.45, p < 0.05), there were significant correlations only with diffusing capacity of carbon monoxide (DLco) and total lung capacity (TLC) in reconstruction algorithm C, and with forced expiratory volume in 1 s (FEV1), DLco and TLC in reconstruction algorithm B. In conclusion, the reconstruction algorithm influences quantitative assessments on chest thin-section MDCT examination in interstitial pneumonia patients.
Seismic remote sensing image segmentation based on spectral histogram and dynamic region merging
Wang, Peng; Sun, Genyun; Wang, Zhenjie
2015-12-01
Image segmentation is the foundation of seismic information extraction from high-resolution remote sensing images, but the complexity of seismic images brings great challenges to segmentation. Compared with traditional pixel-level approaches, region-level approaches prevail in dealing with this complexity. This paper addresses the seismic image segmentation problem in a region-merging style: starting from many over-segmented regions, segmentation is performed by iteratively merging neighboring regions. In the proposed algorithm, the merging criterion and the merging order are the two essential issues. An effective merging criterion largely depends on the region feature and the neighbor homogeneity measure. A region's spectral histogram represents its global feature and enhances the discriminability of neighboring regions, so we use it in the merging criterion. Under a given merging criterion, better performance is obtained if the most similar regions are always merged first, which can be cast as a least-cost problem. Rather than predefining an order queue, we solve the order problem with a dynamic scheme. The proposed approach contains three parts. First, starting from the over-segmented regions, spectral histograms are constructed to represent each region. Then, a homogeneity measure that combines distance and shape is used to construct the merging criterion. Finally, neighboring regions are dynamically merged following dynamic programming (DP) and a breadth-first strategy. Experiments are conducted on earthquake images, including collapsed buildings and seismic secondary geological disasters. The experimental results show that the proposed method segments seismic images more accurately.
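The spectral-histogram part of a region-merging criterion can be illustrated with a toy example. The chi-squared histogram distance used here is a common choice and stands in for the paper's combined distance-and-shape homogeneity measure; all names are illustrative:

```python
import numpy as np

def spectral_histogram(region_pixels, n_bins=16, value_range=(0, 256)):
    """Normalized intensity histogram of a region (one band shown for
    simplicity; multispectral data would concatenate per-band histograms)."""
    h, _ = np.histogram(region_pixels, bins=n_bins, range=value_range)
    return h / max(h.sum(), 1)

def merge_cost(region_a, region_b):
    """Chi-squared distance between region spectral histograms; the most
    similar (lowest-cost) neighboring pair is merged first."""
    ha, hb = spectral_histogram(region_a), spectral_histogram(region_b)
    denom = ha + hb
    mask = denom > 0
    return 0.5 * np.sum((ha[mask] - hb[mask]) ** 2 / denom[mask])

a = np.full(100, 100)          # homogeneous dark region
b = np.full(100, 104)          # nearly identical neighbor
c = np.full(100, 200)          # very different region
print(merge_cost(a, b) < merge_cost(a, c))  # → True
```

In an actual region-merging loop, these pairwise costs would feed a dynamically updated priority structure so that the globally cheapest merge is always performed next.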
2D matrix based indexing with color spectral histogram for efficient image retrieval
Institute of Scientific and Technical Information of China (English)
Maruthamuthu Ramasamy; John Sanjeev Kumar Athisayam
2016-01-01
A novel content-based image retrieval (CBIR) algorithm using relevance feedback is presented. The proposed framework has three major contributions: a novel feature descriptor called the color spectral histogram (CSH) to measure the similarity between images; a two-dimensional matrix-based indexing approach proposed for short-term learning (STL); and long-term learning (LTL). In general, image similarities are measured from feature representations that include color quantization, texture, color, shape and edges; CSH, however, can describe the image feature with the histogram alone. Typically the image retrieval process starts by finding the similarity between the query image and the images in the database; the major computation involved is that selecting the top-ranking images requires a sorting algorithm with at least a lower bound of O(n log n). A 2D matrix-based indexing of images can enormously reduce the search time in STL. The same structure is used for LTL, with the aim of reducing the amount of log information to be maintained. The performance of the proposed framework is analyzed and compared with existing approaches; the quantified results indicate that the proposed feature descriptor is more effective than existing feature descriptors originally developed for CBIR. In terms of STL, the proposed 2D matrix-based indexing minimizes the computational effort for retrieving similar images, and for LTL, the proposed algorithm requires less log information than existing approaches.
Development of a Quality Assurance Procedure for Dose Volume Histogram Analysis
Davenport, David A.
The role of the dose-volume histogram (DVH) is rapidly expanding in radiation oncology treatment planning. DVHs are already relied upon to differentiate between two similar plans and to evaluate organ-at-risk dosage. Their role will become even more important as progress continues towards implementing biologically based treatment planning systems. It is therefore imperative that the accuracy of DVHs is evaluated and reappraised after any major software or hardware upgrade affecting a treatment planning system (TPS). The purpose of this work is to create and implement a comprehensive quality assurance procedure evaluating dose-volume histograms to ensure their accuracy while satisfying American College of Radiology guidelines. Virtual phantoms of known volumes were created in the Pinnacle TPS and exposed to different beam arrangements. Variables including grid size and slice thickness were varied and their effects analyzed. The resulting DVHs were evaluated by comparison to the commissioned percent depth dose values using a custom Excel spreadsheet. After determining the uncertainty of the DVH based on these variables, multiple second-check calculations were performed using the MIM Maestro and Matlab software packages. The uncertainties of the DVHs were shown to be less than +/- 3%, with an average uncertainty of less than +/- 1%. The second-check procedures resulted in mean percent differences of less than 1%, which confirms the accuracy of DVH calculation in Pinnacle and the effectiveness of the quality assurance template. The importance of knowing the limits of accuracy of the DVHs, which are routinely used to assess the quality of clinical treatment plans, cannot be overstated. The developed comprehensive QA procedure evaluating the accuracy of the DVH statistical analysis will become part of our clinical arsenal for periodic tests of the treatment planning system. It will also be performed at the time of commissioning and after any major software or hardware upgrade.
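A cumulative DVH of the kind being QA'd here is conceptually simple, which is what makes an independent second-check implementation feasible. A minimal sketch (equal voxel volumes assumed; real TPS DVHs also handle partial-volume effects and grid interpolation):

```python
import numpy as np

def cumulative_dvh(dose_voxels, bin_width=0.1):
    """Cumulative dose-volume histogram: for each dose level D, the percent of
    the structure volume receiving at least D (equal voxel volumes assumed)."""
    dose = np.asarray(dose_voxels, dtype=np.float64)
    levels = np.arange(0.0, dose.max() + bin_width, bin_width)
    volume = np.array([(dose >= d).mean() * 100.0 for d in levels])
    return levels, volume

# Toy structure: half the voxels receive 2 Gy, the other half 4 Gy
doses = np.array([2.0] * 50 + [4.0] * 50)
levels, volume = cumulative_dvh(doses, bin_width=1.0)
print(volume)   # 100% of the volume down through 2 Gy, then 50% up to 4 Gy
```

Comparing such an independently computed curve against the TPS output on a phantom of known volume is exactly the style of second check the abstract describes with MIM Maestro and Matlab.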
Phonological and Orthographic Overlap Effects in Fast and Masked Priming
Frisson, Steven; Bélanger, Nathalie N.; Rayner, Keith
2014-01-01
We investigated how orthographic and phonological information is activated during reading, using a fast priming task, and during single word recognition, using masked priming. Specifically, different types of overlap between prime and target were contrasted: high orthographic and high phonological overlap (track-crack), high orthographic and low phonological overlap (bear-gear), or low orthographic and high phonological overlap (fruit-chute). In addition, we examined whether (orthographic) beginning overlap (swoop-swoon) yielded the same priming pattern as end (rhyme) overlap (track-crack). Prime durations were 32 and 50ms in the fast priming version, and 50ms in the masked priming version, and mode of presentation (prime and target in lower case) was identical. The fast priming experiment showed facilitatory priming effects when both orthography and phonology overlapped, with no apparent differences between beginning and end overlap pairs. Facilitation was also found when prime and target only overlapped orthographically. In contrast, the masked priming experiment showed inhibition for both types of end overlap pairs (with and without phonological overlap), and no difference for begin overlap items. When prime and target only shared principally phonological information, facilitation was only found with a long prime duration in the fast priming experiment, while no differences were found in the masked priming version. These contrasting results suggest that fast priming and masked priming do not necessarily tap into the same type of processing. PMID:24365065
[Syndrome overlap: autoimmune hepatitis and autoimmune cholangitis].
Guerra Montero, Luis; Ortega Alvarez, Félix; Marquez Teves, Maguin; Asato Higa, Carmen; Sumire Umeres, Julia
2016-01-01
Autoimmune hepatitis, primary biliary cirrhosis, primary sclerosing cholangitis and autoimmune cholangitis are chronic autoimmune liver diseases that usually present separately; cases in which characteristics of two of these liver diseases are observed are commonly referred to as overlap syndromes (OS). Although there is no consensus on specific criteria for the diagnosis of OS, identification of this association is important for initiating appropriate treatment and preventing progression to cirrhosis, or at least the complications of cirrhosis and death. We report the case of a 22-year-old cirrhotic woman who presented with edematous ascites, severe asthenia and jaundice, fulfilled the diagnostic criteria for OS, and initially responded to treatment with ursodeoxycholic acid and oral corticosteroids, but ultimately underwent orthotopic liver transplantation.
Recombining overlapping BACs into single large BACs.
Kotzamanis, George; Kotsinas, Athanassios
2015-01-01
BAC clones containing the entire genomic region of a gene, including the long-range regulatory elements, are very useful for gene functional analysis. However, large genes often span more than the insert of a BAC clone, and single BACs covering the entire region of interest are not available. Here, we describe a general system for linking two or more overlapping BACs into a single clone. Two rounds of homologous recombination are used. In the first, the BAC inserts are subcloned into the pBACLink vectors. In the second, the two BACs are combined together. Multiple BACs in a contig can be combined by alternating use of the pBACLink vectors, resulting in several BAC clones containing as much of the genomic region of a gene as required. Such BACs can then be used in gene expression studies and/or gene therapy applications.
Social externalities, overlap and the poverty trap.
Kim, Young-Chul; Loury, Glenn C
2014-12-01
Previous studies find that some social groups are stuck in poverty traps because of network effects. However, these studies do not carefully analyze how these groups overcome low human capital investment activities. Unlike previous studies, the model in this paper includes network externalities in both the human capital investment stage and the subsequent career stages. This implies that not only the current network quality, but also the expectations about future network quality affect the current investment decision. Consequently, the coordinated expectation among the group members can play a crucial role in the determination of the final state. We define "overlap" for some initial skill ranges, whereby the economic performance of a group can be improved simply by increasing expectations of a brighter future. We also define "poverty trap" for some ranges, wherein a disadvantaged group is constrained by its history, and we explore the egalitarian policies to mobilize the group out of the trap.
Geometrical constraint experimental determination of Raman lidar overlap profile.
Li, Jian; Li, Chengcai; Zhao, Yiming; Li, Jing; Chu, Yiqi
2016-06-20
A simple experimental method to determine the overlap profile of Raman lidar is presented in this paper. Based on Mie and Raman backscattering signals and a geometrically constrained condition, the overlap profile of a Raman lidar system can be determined. Our approach simultaneously retrieves the lidar ratio of aerosols, which is one of the most important sources of uncertainty in the overlap profile determination. The results indicate that the overlap factor is significantly influenced by the lidar ratio in experimental methods. A representative case study indicates that the correction of the overlap profile obtained by this method is practical and feasible.
Cooper, Linda L.; Shore, Felice S.
2008-01-01
This paper identifies and discusses misconceptions that students have in making judgments of center and variability when data are presented graphically. An assessment addressing interpreting center and variability in histograms and stem-and-leaf plots was administered to, and follow-up interviews were conducted with, undergraduates enrolled in…
Harada, Ryuhei; Takano, Yu; Shigeta, Yasuteru
2016-05-10
The TaBoo SeArch (TBSA) algorithm [ Harada et al. J. Comput. Chem. 2015 , 36 , 763 - 772 and Harada et al. Chem. Phys. Lett. 2015 , 630 , 68 - 75 ] was recently proposed as an enhanced conformational sampling method for reproducing biologically relevant rare events of a given protein. In TBSA, an inverse histogram of the original distribution, mapped onto a set of reaction coordinates, is constructed from trajectories obtained by multiple short-time molecular dynamics (MD) simulations. Rarely occurring states of a given protein are statistically selected as new initial states based on the inverse histogram, and resampling is performed by restarting the MD simulations from the new initial states to promote the conformational transition. In this process, the definition of the inverse histogram, which characterizes the rarely occurring states, is crucial for the efficiency of TBSA. In this study, we propose a simple modification of the inverse histogram to further accelerate the convergence of TBSA. As demonstrations of the modified TBSA, we applied it to (a) hydrogen bonding rearrangements of Met-enkephalin, (b) large-amplitude domain motions of Glutamine-Binding Protein, and (c) folding processes of the B domain of Staphylococcus aureus Protein A. All demonstrations numerically proved that the modified TBSA reproduced these biologically relevant rare events with nanosecond-order simulation times, although a set of microsecond-order, canonical MD simulations failed to reproduce the rare events, indicating the high efficiency of the modified TBSA.
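The inverse-histogram selection at the core of TBSA can be illustrated in one dimension: samples in sparsely populated bins of the distribution receive proportionally larger weights when choosing restart states. A sketch of the idea, not the authors' exact definition (which this paper in fact modifies):

```python
import numpy as np

def inverse_histogram_weights(samples, n_bins=20):
    """Weight each sample inversely to the population of its histogram bin,
    so rarely visited states are preferentially chosen as restart seeds
    (a one-dimensional sketch of the inverse-histogram idea in TBSA)."""
    counts, edges = np.histogram(samples, bins=n_bins)
    idx = np.clip(np.digitize(samples, edges[1:-1]), 0, n_bins - 1)
    w = 1.0 / counts[idx]            # inverse of the local bin population
    return w / w.sum()

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=10_000)     # samples along a reaction coordinate
w = inverse_histogram_weights(x)
seed = rng.choice(x, p=w)                 # restart state, biased toward the tails
```

Restarting short MD runs from such rarely visited states is what promotes the conformational transitions; the modification proposed in the paper changes how this inverse histogram is defined to speed up convergence.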
Sarria, D.; Blelly, P.-L.; Briggs, M. S.; Forme, F.
2016-05-01
Terrestrial gamma-ray flashes are bursts of X/gamma photons, correlated to thunderstorms. By interacting with the atmosphere, the photons produce a substantial number of electrons and positrons. Some of these reach a sufficiently high altitude that their interactions with the atmosphere become negligible, and they are then guided by geomagnetic field lines, forming a Terrestrial Electron Beam. On 9 December 2009, the Gamma-Ray Burst Monitor (GBM) instrument on board the Fermi Space Telescope made a particularly interesting measurement of such an event. To study this type of event in detail, we perform Monte-Carlo simulations and focus on the resulting time histograms. In agreement with previous work, we show that the histogram measured by Fermi GBM is reproducible from a simulation. We then show that the time histogram resulting from this simulation is only weakly dependent on the production altitude, duration, beaming angle, and spectral shape of the associated terrestrial gamma-ray flash. Finally, we show that the time histogram can be decomposed into three populations of leptons, coming from the opposite hemisphere, and mirroring back to the satellite with or without interacting with the atmosphere, and that these populations can be clearly distinguished by their pitch angles.
Burzotta, Francesco; Siviglia, Massimo; Altamura, Luca; Trani, Carlo; Leone, Antonio Maria; Romagnoli, Enrico; Mazzari, Mario Attilio; Mongiardo, Rocco; Niccoli, Giampaolo; Brancati, Marta; Biondi-Zoccai, Giuseppe; Rebuzzi, Antonio Giuseppe; Schiavoni, Giovanni; Crea, Filippo
2007-02-01
Overlapping homogenous drug-eluting stents (DESs) may be used instead of overlapping bare metal stents (BMSs) to treat coronary lesions longer than available stents. Yet, no data are available on patients treated with overlapping heterogenous DESs or DESs and BMSs. We prospectively assessed 9-month clinical outcome and 6-month angiographic late loss (evaluated at 5 different lesion segments) in a consecutive series of 40 patients who received overlapping homogenous DESs (sirolimus-eluting stent [SES] or paclitaxel-eluting stent [PES]), heterogenous DESs (SES + PES), or overlapping DESs and BMSs. In 8 patients (7 with angiographic follow-up) with overlapping heterogenous DESs, no angiographic or clinical adverse event was observed. Moreover, in-segment late loss was similar to that of patients who received homogenous DESs. In 8 patients (7 with angiographic follow-up) with overlapping DESs and BMSs, there was a higher incidence of major adverse events (3 repeat percutaneous coronary interventions and 1 death, 50% adverse event rate) and worse in-segment binary restenosis rate compared with patients treated with homogenous or heterogenous DESs (p = 0.02 and 0.012, respectively). Late lumen loss at the site of stent overlap showed significant differences according to type of overlapped stent (1.00 +/- 0.76 mm in DES-BMS overlap, 0.32 +/- 0.55 mm in PES-PES overlap, 0.13 +/- 0.11 in SES-PES overlap, and 0.08 +/- 0.10 mm in SES-SES overlap, p = 0.005). In conclusion, the present study suggests that overlap of DESs and BMSs should be avoided because the antirestenotic effect of DESs is skewed by contiguous BMS implantation. Overlap between SESs and PESs in this very preliminary report was associated with no specific adverse event.
Jaikuna, Tanwiwat; Khadsiri, Phatchareewan; Chawapun, Nisa; Saekho, Suwit; Tharavichitkul, Ekkasit
2017-02-01
The aim was to develop an in-house software program able to calculate and generate biological dose distributions and biological dose-volume histograms by converting physical dose using the linear-quadratic-linear (LQL) model. The Isobio software was developed using MATLAB version 2014b to calculate and generate the biological dose distributions and biological dose-volume histograms. The physical dose of each voxel in the treatment plan was extracted through the Computational Environment for Radiotherapy Research (CERR), and the accuracy was verified by comparing the dose-volume histogram from CERR with that of the treatment planning system (TPS). The equivalent dose in 2 Gy fractions (EQD2) was calculated from the biologically effective dose (BED) based on the LQL model. The software calculation and a manual calculation were compared to verify the EQD2, using paired t-test statistical analysis in IBM SPSS Statistics version 22 (64-bit). Two- and three-dimensional biological dose distributions and biological dose-volume histograms were displayed correctly by the Isobio software. Differences in physical dose were found between CERR and the TPS in Oncentra, with 3.33% in the high-risk clinical target volume (HR-CTV) determined by D90%, 0.56% in the bladder and 1.74% in the rectum when determined by D2cc, and less than 1% in Pinnacle. The difference in EQD2 between the software calculation and the manual calculation was not significant (0.00%, with p-values of 0.820, 0.095, and 0.593 for external beam radiation therapy (EBRT) and 0.240, 0.320, and 0.849 for brachytherapy (BT) in the HR-CTV, bladder, and rectum, respectively). The Isobio software is a feasible tool for generating biological dose distributions and biological dose-volume histograms for treatment plan evaluation in both EBRT and BT.
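The EQD2 conversion mentioned in the abstract follows directly from the BED. The sketch below uses the plain linear-quadratic (LQ) form; the LQL model used by Isobio additionally switches to a linear term above a threshold dose, which is omitted here for brevity:

```python
def bed(n_fractions, dose_per_fraction, alpha_beta):
    """Biologically effective dose under the linear-quadratic (LQ) model:
    BED = n * d * (1 + d / (alpha/beta))."""
    return n_fractions * dose_per_fraction * (1.0 + dose_per_fraction / alpha_beta)

def eqd2(n_fractions, dose_per_fraction, alpha_beta):
    """Equivalent dose in 2 Gy fractions: EQD2 = BED / (1 + 2 / (alpha/beta))."""
    return bed(n_fractions, dose_per_fraction, alpha_beta) / (1.0 + 2.0 / alpha_beta)

# Example: 4 brachytherapy fractions of 7 Gy to a tumor with alpha/beta = 10 Gy
print(round(eqd2(4, 7.0, 10.0), 2))  # → 39.67
```

Applying this conversion voxel by voxel to the physical dose grid extracted through CERR is what turns a physical dose distribution into the biological one that the software displays.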
Depression-Burnout Overlap in Physicians
Wurm, Walter; Vogel, Katrin; Holl, Anna; Ebner, Christoph; Bayer, Dietmar; Mörkl, Sabrina; Szilagyi, Istvan-Szilard; Hotter, Erich; Kapfhammer, Hans-Peter; Hofmann, Peter
2016-01-01
Background Whether burnout is a distinct phenomenon rather than a type of depression and whether it is a syndrome, limited to three “core” components (emotional exhaustion, depersonalization and low personal accomplishment) are subjects of current debate. We investigated the depression-burnout overlap, and the pertinence of these three components in a large, representative sample of physicians. Methods In a cross-sectional study, all Austrian physicians were invited to answer a questionnaire that included the Major Depression Inventory (MDI), the Hamburg Burnout Inventory (HBI), as well as demographic and job-related parameters. Of the 40093 physicians who received an invitation, a total of 6351 (15.8%) participated. The data of 5897 participants were suitable for analysis. Results Of the participants, 10.3% were affected by major depression. Our study results suggest that potentially 50.7% of the participants were affected by symptoms of burnout. Compared to physicians unaffected by burnout, the odds ratio of suffering from major depression was 2.99 (95% CI 2.21–4.06) for physicians with mild, 10.14 (95% CI 7.58–13.59) for physicians with moderate, 46.84 (95% CI 35.25–62.24) for physicians with severe burnout and 92.78 (95% CI 62.96–136.74) for the 3% of participants with the highest HBI_sum (sum score of all ten HBI components). The HBI components Emotional Exhaustion, Personal Accomplishment and Detachment (representing depersonalization) tend to correlate more highly with the main symptoms of major depression (sadness, lack of interest and lack of energy) than with each other. A combination of the HBI components Emotional Exhaustion, Helplessness, Inner Void and Tedium (adj.R2 = 0.92) explained more HBI_sum variance than the three “core” components (adj.R2 = 0.85) of burnout combined. Cronbach’s alpha for Emotional Exhaustion, Helplessness, Inner Void and Tedium combined was 0.90 compared to α = 0.54 for the combination of the three “core” components.
Hybrid lattice Boltzmann method on overlapping grids.
Di Ilio, G; Chiappini, D; Ubertini, S; Bella, G; Succi, S
2017-01-01
In this work, a hybrid lattice Boltzmann method (HLBM) is proposed, in which the standard lattice Boltzmann implementation based on the Bhatnagar-Gross-Krook (LBGK) approximation is combined with an unstructured finite-volume lattice Boltzmann model. The method is constructed on an overlapping grid system, which allows the coexistence of a uniform lattice-node spacing and a coordinate-free lattice structure. The natural adaptivity of the hybrid grid system makes the method particularly suitable for handling problems involving complex geometries. Moreover, the scheme ensures a high-accuracy solution near walls, given the capability of the unstructured submodel to achieve the desired level of refinement in a very flexible way. For these reasons, the HLBM represents a prospective tool for solving multiscale problems. The proposed method is applied here to the benchmark problem of two-dimensional flow past a circular cylinder for a wide range of Reynolds numbers, and its numerical performance is measured and compared with that of the standard LBGK method.
Overlapping Neural Endophenotypes in Addiction and Obesity
Michaud, Andréanne; Vainik, Uku; Garcia-Garcia, Isabel; Dagher, Alain
2017-01-01
Impulsivity refers to a tendency to act rapidly without full consideration of consequences. The trait is thought to result from the interaction between high arousal responses to potential rewards and poor self-control. Studies have suggested that impulsivity confers vulnerability to both addiction and obesity. However, results in this area are unclear, perhaps due to the high phenotypic complexity of addictions and obesity. Focusing on impulsivity, the aim of this review is to tackle the putative overlaps between addiction and obesity in four domains: (1) personality research, (2) neurocognitive tasks, (3) brain imaging, and (4) clinical evidence. We suggest that three impulsivity-related domains are particularly relevant for our understanding of similarities between addiction and obesity: lower self-control (high Disinhibition/low Conscientiousness), reward sensitivity (high Extraversion/Positive Emotionality), and negative affect (high Neuroticism/Negative Emotionality). Neurocognitive studies have shown that obesity and addiction are both associated with increased impulsive decision-making and attention bias in response to drug or food cues, respectively. Mirroring this, obesity and different forms of addiction seem to exhibit similar alterations in functional MRI brain activity in response to reward processing and during self-control tasks. Overall, our review provides an integrative approach to understand those facets of obesity that present similarities to addictive behaviors. In addition, we suggest that therapeutic interventions targeting inhibitory control may represent a promising approach for the prevention and/or treatment of obesity. PMID:28659866
Strange quark momentum fraction from overlap fermion
Sun, Mingyang; Liu, Keh-Fei; Gong, Ming
2015-01-01
We present a calculation of $\langle x \rangle_s$ for the strange quark in the nucleon. We also report the ratio of the strange $\langle x \rangle$ to that of $u/d$ in the disconnected insertion, which will be useful in constraining the global fit of parton distribution functions at small $x$. We adopt the overlap fermion action on $2+1$-flavor domain-wall fermion configurations on the $24^3 \times 64$ lattice with a light sea quark mass corresponding to $m_{\pi} = 330$ MeV. Smeared grid $Z_3$ sources are deployed to calculate the nucleon propagator with low-mode substitution. Even-odd grid sources and a time-dilution technique with stochastic noises are used to calculate the high-mode contribution to the quark loop. Low-mode averaging (LMA) for the quark loop is applied to reduce the statistical error of the disconnected-insertion calculation. We find the ratio $\langle x \rangle_s/\langle x \rangle_{u/d}^{\mathrm{DI}} = 0.78(3)$ in this study.
BI-LEVEL CLASSIFICATION OF COLOR INDEXED IMAGE HISTOGRAMS FOR CONTENT BASED IMAGE RETRIEVAL
Directory of Open Access Journals (Sweden)
Karpagam Vilvanathan
2013-01-01
This dissertation proposes content based image classification and retrieval with Classification and Regression Tree (CART). A simple CBIR system (WH) is designed and proved to be efficient even in the presence of distorted and noisy images. WH exhibits good performance in terms of precision, without using any intensive image processing feature extraction techniques. Unique indexed color histogram and wavelet decomposition based horizontal, vertical and diagonal image attributes have been chosen as the primary attributes in the design of the retrieval system. The output feature vectors of the WH method serve as input to the proposed decision tree based image classification and retrieval system. The performance of the proposed content based image classification and retrieval system is evaluated with the standard SIMPLIcity dataset, which has been used in several previous works. The performance of the system is measured with precision as the metric. Holdout validation and k-fold cross validation are used to validate the results. The proposed system clearly outperforms SIMPLIcity and all the other compared methods.
Directory of Open Access Journals (Sweden)
Yan Koon-Kiu
2007-11-01
Background The evolution of the full repertoire of proteins encoded in a given genome is mostly driven by gene duplications, deletions, and sequence modifications of existing proteins. Indirect information about relative rates and other intrinsic parameters of these three basic processes is contained in the proteome-wide distribution of sequence identities of pairs of paralogous proteins. Results We introduce a simple mathematical framework based on a stochastic birth-and-death model that allows one to extract some of this information and apply it to the set of all pairs of paralogous proteins in H. pylori, E. coli, S. cerevisiae, C. elegans, D. melanogaster, and H. sapiens. It was found that the histogram of sequence identities $p$ generated by an all-to-all alignment of all protein sequences encoded in a genome is well fitted with a power-law form $\sim p^{-\gamma}$, with the value of the exponent $\gamma$ around 4 for the majority of organisms used in this study. This implies that the intra-protein variability of substitution rates is best described by a Gamma distribution with exponent $\alpha \approx 0.33$. Different features of the shape of such histograms allow us to quantify the ratio between the genome-wide average deletion/duplication rates and the amino-acid substitution rate. Conclusion We separately measure the short-term ("raw") duplication and deletion rates $r^*_{\mathrm{dup}}$ and $r^*_{\mathrm{del}}$ …
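A power-law fit of this kind can be sketched in a few lines of NumPy. The identities below are synthetic, drawn from an assumed $p^{-\gamma}$ law with $\gamma = 4$ (the value reported for most organisms), not real paralog data; the fit simply recovers the exponent from a log-log histogram.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for pairwise sequence identities of paralog pairs:
# sample from P(p) ~ p^(-gamma) with gamma = 4 on [0.2, 0.9] via
# inverse-transform sampling.
gamma = 4.0
lo, hi = 0.2, 0.9
u = rng.random(200_000)
a = lo ** (1.0 - gamma)
b = hi ** (1.0 - gamma)
p = (a + u * (b - a)) ** (1.0 / (1.0 - gamma))

# Histogram the identities and fit log(count) vs log(p) by least squares;
# the slope estimates -gamma.
counts, edges = np.histogram(p, bins=40, range=(lo, hi))
centers = 0.5 * (edges[:-1] + edges[1:])
mask = counts > 0
slope, intercept = np.polyfit(np.log(centers[mask]), np.log(counts[mask]), 1)
gamma_hat = -slope
```

With 200,000 samples the recovered exponent lands close to the input value of 4.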
Bernas, Tytus; Starosolski, Roman; Robinson, J Paul; Rajwa, Bartlomiej
2012-11-01
Modern applications of biological microscopy such as high-content screening (HCS), 4D imaging, and multispectral imaging may involve collection of thousands of images in every experiment making efficient image-compression techniques necessary. Reversible compression algorithms, when used with biological micrographs, provide only a moderate compression ratio, while irreversible techniques obtain better ratios at the cost of removing some information from images and introducing artifacts. We construct a model of noise, which is a function of signal in the imaging system. In the next step insignificant intensity levels are discarded using intensity binning. The resultant images, characterized by sparse intensity histograms, are coded reversibly. We evaluate compression efficiency of combined reversible coding and intensity depth-reduction using single-channel 12-bit light micrographs of several subcellular structures. We apply local and global measures of intensity distribution to estimate maximum distortions introduced by the proposed algorithm. We demonstrate that the algorithm provides efficient compression and does not introduce significant changes to biological micrographs. The algorithm preserves information content of these images and therefore offers better fidelity than standard irreversible compression method JPEG2000. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
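The intensity-binning step described above can be illustrated with a toy noise model; the gain and read-noise values, the synthetic image, and the one-sigma bin rule below are hypothetical, chosen only to show how a signal-dependent noise sigma yields variable-width bins and a sparse intensity histogram.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical noise model for a photon-limited detector: the standard
# deviation at intensity I is sqrt(g*I + r^2) (shot noise + read noise).
def noise_sigma(i, gain=4.0, read=2.0):
    return np.sqrt(gain * i + read**2)

# Build variable-width intensity bins: each bin spans about one noise
# standard deviation, so levels merged into one bin are statistically
# indistinguishable.
levels = []
edge = 0.0
max_level = 4095  # 12-bit micrograph
while edge <= max_level:
    levels.append(edge)
    edge += noise_sigma(edge)
edges = np.array(levels + [max_level + 1])

# Quantize a synthetic 12-bit image to bin centers (the irreversible step);
# the result has far fewer distinct intensities, so reversible coding
# afterwards compresses much better.
img = rng.integers(0, max_level + 1, size=(256, 256))
idx = np.clip(np.searchsorted(edges, img, side="right") - 1, 0, len(edges) - 2)
centers = 0.5 * (edges[:-1] + edges[1:])
binned = centers[idx]

n_before = len(np.unique(img))
n_after = len(np.unique(binned))
```

For this noise model the roughly 4096 input levels collapse to a few dozen bins while keeping every quantization error within about one noise standard deviation.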
A Histogram-Based Static-Error Correction Technique for Flash ADCs
Institute of Scientific and Technical Information of China (English)
Armin Jalili; J Jacob Wikner; Sayed Masoud Sayedi; Rasoul Dehghani
2011-01-01
High-speed, high-accuracy data converters are attractive for use in most RF applications. Such converters allow direct conversion to occur between the digital baseband and the antenna. However, high speed and high accuracy make the analog components in a converter more complex, and this complexity causes more power to be dissipated than if a traditional approach were taken. A static calibration technique for flash analog-to-digital converters (ADCs) is discussed in this paper. The calibration is based on histogram test methods, and equivalent errors in the flash ADC comparators are estimated in the digital domain without any significant changes being made to the ADC comparators. In the trimming process, reference voltages are adjusted to compensate for static errors. Behavioral-level simulations of a moderate-resolution 8-bit flash ADC show that, for typical errors, ADC performance is considerably improved by the proposed technique. As a result of calibration, the differential nonlinearities (DNLs) are reduced on average from 4 LSB to 0.5 LSB, and the integral nonlinearities (INLs) are reduced on average from 4.2 LSB to 0.35 LSB. Implementation issues for this proposed technique are discussed in our subsequent paper, "A Histogram-Based Static-Error Correction Technique for Flash ADCs: Implementation Aspects."
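The histogram test underlying such calibration can be sketched as follows. The 4-bit converter, offset magnitudes, and ramp stimulus are illustrative assumptions, not the paper's setup; the point is that a uniform ramp makes each output code's histogram count proportional to its decision-level spacing, so comparator threshold errors can be recovered digitally from the cumulative histogram.

```python
import numpy as np

rng = np.random.default_rng(2)

n_bits = 4
n_codes = 2**n_bits
full_scale = 1.0
lsb = full_scale / n_codes

# Ideal comparator thresholds plus hypothetical static offset errors.
ideal_thresholds = lsb * np.arange(1, n_codes)
offsets = rng.normal(0.0, 0.2 * lsb, size=n_codes - 1)
thresholds = np.sort(ideal_thresholds + offsets)

# Histogram test: drive the ADC with a dense uniform ramp and count how
# often each output code occurs.
ramp = np.linspace(0.0, full_scale, 200_000, endpoint=False)
codes = np.searchsorted(thresholds, ramp, side="right")
hist = np.bincount(codes, minlength=n_codes)

# Each code's actual width (in LSB) follows from its histogram share;
# DNL is the deviation from the ideal width of exactly 1 LSB.
widths_lsb = hist / (len(ramp) / n_codes)
dnl = widths_lsb - 1.0

# Thresholds recovered from the cumulative histogram: this digital estimate
# is what reference-voltage trimming would then act on.
est_thresholds = full_scale * np.cumsum(hist)[:-1] / len(ramp)
```

The recovered thresholds agree with the true (offset) ones to within the ramp's quantization step.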
Facial expression recognition and histograms of oriented gradients: a comprehensive study.
Carcagnì, Pierluigi; Del Coco, Marco; Leo, Marco; Distante, Cosimo
2015-01-01
Automatic facial expression recognition (FER) is a topic of growing interest, mainly due to the rapid spread of assistive technology applications such as human-robot interaction, where robust emotional awareness is a key point for best accomplishing the assistive task. This paper proposes a comprehensive study on the application of the histogram of oriented gradients (HOG) descriptor to the FER problem, highlighting how this powerful technique can be effectively exploited for this purpose. In particular, this paper shows that a proper setting of the HOG parameters can make this descriptor one of the most suitable for characterizing facial expression peculiarities. A large experimental session, which can be divided into three phases, was carried out exploiting a consolidated algorithmic pipeline. The first experimental phase was aimed at proving the suitability of the HOG descriptor to characterize facial expression traits and, to do this, a successful comparison with the most commonly used FER frameworks was carried out. In the second experimental phase, different publicly available facial datasets were used to test the system on images acquired in different conditions (e.g. image resolution, lighting conditions, etc.). As a final phase, a test on continuous data streams was carried out on-line in order to validate the system in real-world operating conditions that simulated a real-time human-machine interaction.
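A minimal HOG descriptor (without the block normalization used in full implementations) can be sketched in NumPy; the cell size, bin count, and toy edge patch are illustrative choices, not the parameter setting studied in the paper.

```python
import numpy as np

def hog_descriptor(img, cell=8, n_bins=9):
    """Minimal histogram-of-oriented-gradients sketch: per-cell orientation
    histograms weighted by gradient magnitude, no block normalization."""
    img = img.astype(float)
    gy, gx = np.gradient(img)
    mag = np.hypot(gx, gy)
    # Unsigned orientation in [0, 180) degrees, as is common for HOG.
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180.0
    h, w = img.shape
    feats = []
    for y in range(0, h - cell + 1, cell):
        for x in range(0, w - cell + 1, cell):
            a = ang[y:y + cell, x:x + cell].ravel()
            m = mag[y:y + cell, x:x + cell].ravel()
            hist, _ = np.histogram(a, bins=n_bins, range=(0, 180), weights=m)
            feats.append(hist)
    return np.concatenate(feats)

# Toy 32x32 "face patch" with a vertical edge: gradients point horizontally,
# so the energy concentrates in the orientation bin around 0 degrees.
patch = np.zeros((32, 32))
patch[:, 16:] = 1.0
desc = hog_descriptor(patch)  # 4x4 cells x 9 bins = 144 features
```

Production systems would add block normalization and finer parameter tuning, which is precisely the parameter study the paper undertakes.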
Zaunders, John; Jing, Junmei; Leipold, Michael; Maecker, Holden; Kelleher, Anthony D; Koch, Inge
2016-01-01
Many methods have been described for automated clustering analysis of complex flow cytometry data, but so far the goal of efficiently estimating multivariate densities and their modes for a moderate number of dimensions and potentially millions of data points has not been attained. We have devised a novel approach to describing modes using second order polynomial histogram estimators (SOPHE). The method divides the data into multivariate bins and determines the shape of the data in each bin based on second order polynomials, which is an efficient computation. These calculations yield local maxima and allow joining of adjacent bins to identify clusters. The use of second order polynomials also optimally uses wide bins, such that in most cases each parameter (dimension) need only be divided into 4-8 bins, again reducing computational load. We have validated this method using defined mixtures of up to 17 fluorescent beads in 16 dimensions, correctly identifying all populations in data files of 100,000 beads, and up to 65 subpopulations of PBMC in 33-dimensional CyTOF data, showing its usefulness in discovery research. SOPHE has the potential to greatly increase the efficiency of analysing complex mixtures of cells in higher dimensions.
Assessing the hydrologic alteration of the Yangtze River using the histogram matching approach
Huang, F.; Zhang, N.; Guo, L. D.; Xia, Z. Q.
2016-08-01
Hydrologic changes of the Yangtze River, an important river with abundant water resources in China, were investigated using the Histogram Matching Approach. Daily streamflow data spanning the time interval from 1955 to 2013 were collected from the Yichang and Datong stations, which monitor the hydrologic processes of the upper and lower reaches of the Yangtze River, respectively. The Gezhouba Dam, the first dam constructed on the main stream of the Yangtze River, started operation in 1981, which was therefore used to differentiate the pre-dam (1955-1980) and post-dam (1981-2013) hydrologic regimes. The hydrologic regime was quantified by the Indicators of Hydrologic Alteration. The overall alteration degree of the upper Yangtze River was 31%, and the alteration degree of individual hydrologic indicators ranged from 10% to 81%. Only 1, 5 and 26 hydrologic indicators were altered at high, moderate and low degrees, respectively. The overall alteration degree of the lower Yangtze River was 30%, and the alteration degree of individual hydrologic indicators ranged from 8% to 49%. No high alteration degree was detected at the Datong station. Ten hydrologic indicators were altered at moderate degrees and 22 hydrologic indicators were altered at low degrees. Significant increases could be observed for the low-flow relevant indicators, including the monthly flow from January-March, the annual minimum 1, 3, 7, 30 and 90-day flows, and the base flow index.
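A simplified stand-in for histogram-based alteration scoring can be sketched as follows. The flow series below are synthetic, and the score uses one minus the histogram intersection of the pre- and post-dam frequency histograms rather than the exact distance measure of the Histogram Matching Approach.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical annual minimum 7-day flows (m^3/s) before and after dam
# operation; the post-dam series is shifted upward, echoing the increase
# reported for the Yangtze low-flow indicators (values are invented).
pre = rng.normal(4000, 400, size=26)    # 1955-1980
post = rng.normal(4800, 400, size=33)   # 1981-2013

# Common bin edges spanning both periods.
edges = np.histogram_bin_edges(np.concatenate([pre, post]), bins=10)
h_pre, _ = np.histogram(pre, bins=edges)
h_post, _ = np.histogram(post, bins=edges)

# Normalize to frequencies; one minus the histogram intersection gives a
# simple alteration degree in percent (0% = identical regimes).
f_pre = h_pre / h_pre.sum()
f_post = h_post / h_post.sum()
degree = 100.0 * (1.0 - np.minimum(f_pre, f_post).sum())
```

A shift of two standard deviations between the two periods yields a substantial alteration degree, while identical regimes would score near zero.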
Multicomponent adsorption in mesoporous flexible materials with flat-histogram Monte Carlo methods
Mahynski, Nathan A.; Shen, Vincent K.
2016-11-01
We demonstrate an extensible flat-histogram Monte Carlo simulation methodology for studying the adsorption of multicomponent fluids in flexible porous solids. This methodology allows us to easily obtain the complete free energy landscape for the confined fluid-solid system in equilibrium with a bulk fluid of any arbitrary composition. We use this approach to study the adsorption of a prototypical coarse-grained binary fluid in "Hookean" solids, where the free energy of the solid may be described as a simple spring. However, our approach is fully extensible to solids with arbitrarily complex free energy profiles. We demonstrate that by tuning the fluid-solid interaction ranges, the inhomogeneous fluid structure inside the pore can give rise to enhanced selective capture of a larger species through cooperative adsorption with a smaller one. The maximum enhancement in selectivity is observed at low to intermediate pressures and is especially pronounced when the larger species is very dilute in the bulk. This suggests a mechanism by which the selective capture of a minor component from a bulk fluid may be enhanced.
Classification of amyloid status using machine learning with histograms of oriented 3D gradients
Directory of Open Access Journals (Sweden)
Liam Cattell
2016-01-01
Brain amyloid burden may be quantitatively assessed from positron emission tomography imaging using standardised uptake value ratios. Using these ratios as an adjunct to visual image assessment has been shown to improve inter-reader reliability; however, the amyloid positivity threshold is dependent on the tracer and the specific image regions used to calculate the uptake ratio. To address this problem, we propose a machine learning approach to amyloid status classification, which is independent of tracer and does not require a specific set of regions of interest. Our method extracts feature vectors from amyloid images, which are based on histograms of oriented three-dimensional gradients. We optimised our method on 133 18F-florbetapir brain volumes, and applied it to a separate test set of 131 volumes. Using the same parameter settings, we then applied our method to 209 11C-PiB images and 128 18F-florbetaben images. We compared our method to classification results achieved using two other methods: standardised uptake value ratios and a machine learning method based on voxel intensities. Our method resulted in the largest mean distances between the subjects and the classification boundary, suggesting that it is less likely to make low-confidence classification decisions. Moreover, our method obtained the highest classification accuracy for all three tracers, and consistently achieved above 96% accuracy.
Cho, Gene Y; Gennaro, Lucas; Sutton, Elizabeth J; Zabor, Emily C; Zhang, Zhigang; Giri, Dilip; Moy, Linda; Sodickson, Daniel K; Morris, Elizabeth A; Sigmund, Eric E; Thakur, Sunitha B
2017-01-01
To examine the prognostic capabilities of intravoxel incoherent motion (IVIM) metrics and their ability to predict response to neoadjuvant treatment (NAT). Additionally, to observe changes in IVIM metrics between pre- and post-treatment MRI. This IRB-approved, HIPAA-compliant retrospective study observed 31 breast cancer patients (32 lesions). Patients underwent standard bilateral breast MRI along with diffusion-weighted imaging before and after NAT. Six patients underwent an additional IVIM-MRI scan 12-14 weeks after initial scan and 2 cycles of treatment. In addition to apparent diffusion coefficients (ADC) from monoexponential decay, IVIM mean values (tissue diffusivity Dt, perfusion fraction fp, and pseudodiffusivity Dp) and histogram metrics were derived using a biexponential model. An additional filter identified voxels of highly vascular tumor tissue (VTT), excluding necrotic or normal tissue. Clinical data include histology of biopsy and clinical response to treatment through RECIST assessment. Comparisons of treatment response were made using Wilcoxon rank-sum tests. Average, kurtosis, and skewness of pseudodiffusion Dp significantly differentiated RECIST responders from nonresponders. ADC and Dt values generally increased (∼70%) and VTT% values generally decreased (∼20%) post-treatment. Dp metrics showed prognostic capabilities; slow and heterogeneous pseudodiffusion offer poor prognosis. Baseline ADC/Dt parameters were not significant predictors of response. This work suggests that IVIM mean values and heterogeneity metrics may have prognostic value in the setting of breast cancer NAT.
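A biexponential IVIM decay and a common segmented fitting strategy can be sketched as follows; the b-values and parameter values are illustrative, not patient data, and the two-step fit (high-b log-linear fit for Dt and fp, then a grid search for Dp) is one of several possible estimation schemes rather than the study's exact pipeline.

```python
import numpy as np

# Units: b in s/mm^2, diffusivities in mm^2/s (illustrative values).
b = np.array([0, 10, 20, 50, 100, 200, 400, 600, 800], dtype=float)
Dt_true, Dp_true, fp_true = 1.0e-3, 20e-3, 0.12

# Biexponential IVIM model: fast pseudodiffusion (perfusion) term plus a
# slow tissue-diffusion term.
signal = fp_true * np.exp(-b * Dp_true) + (1 - fp_true) * np.exp(-b * Dt_true)

# Step 1: at high b the perfusion (Dp) term has decayed away, so a
# log-linear fit over b >= 200 estimates the tissue diffusivity Dt, and
# the intercept gives the perfusion fraction fp.
high = b >= 200
slope, intercept = np.polyfit(b[high], np.log(signal[high]), 1)
Dt_est = -slope
fp_est = 1.0 - np.exp(intercept)

# Step 2: grid-search the pseudodiffusivity Dp against the full signal,
# holding Dt and fp fixed.
Dp_grid = np.linspace(2e-3, 60e-3, 400)
model = fp_est * np.exp(-np.outer(Dp_grid, b)) \
      + (1 - fp_est) * np.exp(-b * Dt_est)
Dp_est = Dp_grid[np.argmin(((model - signal) ** 2).sum(axis=1))]
```

Voxel-wise maps of Dt, fp, and Dp obtained this way are what the histogram metrics (mean, skewness, kurtosis) are then computed from.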
Directory of Open Access Journals (Sweden)
SRIKOTE, G.
2016-08-01
This paper proposes an algorithm that improves face recognition performance in identifying mismatched face pairs in cases of incorrect decisions. The primary feature of this method is to deploy the similarity score with respect to Gaussian components between two previously unseen faces. Unlike conventional classical vector distance measurement, our algorithms also consider the plot of the summation of the similarity index versus face feature vector distance. A mixture of Gaussian models of labeled faces is also widely applicable to different biometric system parameters. Comparative evaluations show that the efficiency of the proposed algorithm is superior to that of the conventional algorithm by an average accuracy of up to 1.15% and 16.87% when compared with 3x3 Multi-Region Histogram (MRH) direct-bag-of-features and Principal Component Analysis (PCA)-based face recognition systems, respectively. The experimental results show that similarity score consideration is more discriminative for face recognition than feature distance. Experimental results on the Labeled Faces in the Wild (LFW) data set demonstrate that our algorithms are suitable for real probe-to-gallery identification applications in face recognition systems. Moreover, the proposed method can also be applied to other recognition systems, further improving their recognition scores.
Energy Technology Data Exchange (ETDEWEB)
Reiner, Caecilia S., E-mail: caecilia.reiner@usz.ch; Gordic, Sonja; Puippe, Gilbert; Morsbach, Fabian; Wurnig, Moritz [University Hospital Zurich, Institute of Diagnostic and Interventional Radiology (Switzerland); Schaefer, Niklaus; Veit-Haibach, Patrick [University Hospital Zurich, Division of Nuclear Medicine (Switzerland); Pfammatter, Thomas; Alkadhi, Hatem [University Hospital Zurich, Institute of Diagnostic and Interventional Radiology (Switzerland)
2016-03-15
Purpose: To evaluate, in patients with hepatocellular carcinoma (HCC), whether assessment of tumor heterogeneity by histogram analysis of computed tomography (CT) perfusion helps predict response to transarterial radioembolization (TARE). Materials and Methods: Sixteen patients (15 male; mean age 65 years; age range 47–80 years) with HCC underwent CT liver perfusion for treatment planning prior to TARE with Yttrium-90 microspheres. Arterial perfusion (AP) derived from CT perfusion was measured in the entire tumor volume, and heterogeneity was analyzed voxel-wise by histogram analysis. Response to TARE was evaluated on follow-up imaging (median follow-up, 129 days) based on the modified Response Evaluation Criteria in Solid Tumors (mRECIST). Results of histogram analysis and mean AP values of the tumor were compared between responders and non-responders. Receiver operating characteristics were calculated to determine the parameters' ability to discriminate responders from non-responders. Results: According to mRECIST, 8 patients (50%) were responders and 8 (50%) non-responders. Comparing responders and non-responders, the 50th and 75th percentiles of AP derived from histogram analysis were significantly different (AP 43.8/54.3 vs. 27.6/34.3 mL min^-1 100 mL^-1; p < 0.05), while the mean AP of HCCs (43.5 vs. 27.9 mL min^-1 100 mL^-1; p > 0.05) was not. Further heterogeneity parameters from histogram analysis (skewness, coefficient of variation, and 25th percentile) did not differ between responders and non-responders (p > 0.05). If the cut-off for the 75th percentile was set to an AP of 37.5 mL min^-1 100 mL^-1, therapy response could be predicted with a sensitivity of 88% (7/8) and a specificity of 75% (6/8). Conclusion: Voxel-wise histogram analysis of pretreatment CT perfusion, indicating tumor heterogeneity of HCC, improves the pretreatment prediction of response to TARE.
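The percentile-based discrimination can be mimicked with synthetic perfusion maps; the distributions below only loosely echo the reported group means and the 37.5 mL min^-1 100 mL^-1 cut-off, and are in no way the study data.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical voxel-wise arterial-perfusion (AP) maps for 8 responders and
# 8 non-responders (mL min^-1 100 mL^-1); group means loosely mimic the
# reported values, the rest is invented.
responders = [rng.normal(44, 15, size=5000).clip(min=0) for _ in range(8)]
nonresponders = [rng.normal(28, 10, size=5000).clip(min=0) for _ in range(8)]

# Histogram metric: the 75th percentile of each tumor's AP distribution.
p75_resp = np.array([np.percentile(v, 75) for v in responders])
p75_non = np.array([np.percentile(v, 75) for v in nonresponders])

# Classify with a fixed cut-off, as in the study.
cutoff = 37.5
sensitivity = np.mean(p75_resp > cutoff)   # responders correctly flagged
specificity = np.mean(p75_non <= cutoff)   # non-responders correctly passed
```

With well-separated synthetic groups the cut-off classifies every case correctly; real data, as the abstract reports, yield somewhat lower sensitivity and specificity.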
On the interpretation of wave function overlaps in quantum dots
DEFF Research Database (Denmark)
Stobbe, Søren; Hvam, Jørn Märcher; Lodahl, Peter
2011-01-01
The spontaneous emission rate of excitons strongly confined in quantum dots (QDs) is proportional to the overlap integral of electron and hole envelope wave functions. A common and intuitive interpretation of this result is that the spontaneous emission rate is proportional to the probability that the electron and the hole are located at the same point or region in space, i.e., they must coincide spatially to recombine. Here, we show that this interpretation is not correct even loosely speaking. By general mathematical considerations we compare the envelope wave function overlap, the exchange overlap … compare our qualitative predictions with recent measurements of the wave function overlap and find good agreement.
Do Neutrino Wave Functions Overlap and Does it Matter?
Li, Cheng-Hsien
2016-01-01
Studies of neutrinos commonly ignore anti-symmetrization of their wave functions. This implicitly assumes that either spatial wave functions for neutrinos with approximately the same momentum do not overlap or their overlapping has no measurable consequences. We examine these assumptions by considering the evolution of three-dimensional neutrino wave packets (WPs). We find that it is perfectly adequate to treat accelerator and reactor neutrinos as separate WPs for typical experimental setups. While solar and supernova neutrinos correspond to overlapping WPs, they can be treated effectively as non-overlapping for analyses of their detection.
DEFF Research Database (Denmark)
Tatu, Aditya Jayant
This thesis deals with two unrelated issues, restricting curve evolution to subspaces and computing image patches in the equivalence class of Histogram of Gradient orientation based features using nonlinear projection methods. Curve evolution is a well known method used in various applications like … specific requirements like shape priors or a given data model, and due to limitations of the computer, the computed curve evolution forms a path in some finite dimensional subspace of the space of curves. We give methods to restrict the curve evolution to a finite dimensional linear or implicitly defined subspace, the N-links bicycle chain space, i.e. the space of curves with equidistant neighboring landmark points. This in itself is a useful shape space for medical image analysis applications. The Histogram of Gradient orientation based features are many in number and are widely used …
Histogramming in the LATOME-firmware for the Phase-1 upgrade of the ATLAS LAr calorimeter readout
Energy Technology Data Exchange (ETDEWEB)
Horn, Philipp; Hentges, Rainer; Straessner, Arno [Institut fuer Kern- und Teilchenphysik, Dresden (Germany)
2016-07-01
Due to the increased luminosity and the higher effective event rate after the Phase-1 upgrade, the ATLAS LAr detector needs new trigger electronics. The so-called LATOME board was designed as a LAr Digital Processing Blade (LPDB) to reconstruct the energy deposited by the particles and is an important part of the readout system. A prototype has already been built, and the firmware for the on-board FPGA is under development. The insertion of a histogram-builder in this device gives the unique opportunity to look at untriggered data. This talk provides an insight into the LATOME firmware and shows the different possibilities for implementing the histogram-builder.
Contrast enhancement of hand vein images based on histogram equalization
Institute of Scientific and Technical Information of China (English)
蔡超峰; 任景英
2013-01-01
The hand vein image tends to be of low contrast, which affects the recognition accuracy of the whole hand vein recognition system. The effective area of the hand vein image was extracted first, and then the Histogram Equalization (HE) algorithm and its improved versions were employed to enhance the contrast of the extracted image. The results indicate that Partially Overlapped Sub-block Histogram Equalization (POSHE) enhances not only the overall contrast but also the contrast between details and background; meanwhile, its high efficiency makes it suitable for hand vein image contrast enhancement.
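Global histogram equalization, the building block that POSHE applies per partially overlapped sub-block, can be sketched as follows; the low-contrast test image is synthetic, not a vein image.

```python
import numpy as np

rng = np.random.default_rng(6)

def equalize(img, levels=256):
    """Global histogram equalization of an 8-bit image: map each intensity
    through the normalized cumulative histogram."""
    hist = np.bincount(img.ravel(), minlength=levels)
    cdf = np.cumsum(hist) / img.size
    lut = np.round((levels - 1) * cdf).astype(np.uint8)
    return lut[img]

# Low-contrast synthetic "vein image": intensities squeezed into [100, 140).
img = rng.integers(100, 140, size=(64, 64)).astype(np.uint8)
out = equalize(img)

contrast_before = int(img.max()) - int(img.min())
contrast_after = int(out.max()) - int(out.min())
```

POSHE runs this same mapping on overlapping sub-blocks and blends the results, which boosts local detail contrast while limiting the blocking artifacts of purely local equalization.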
Energy Technology Data Exchange (ETDEWEB)
Nowosielski, Martha; Tinkhauser, Gerd; Stockhammer, Guenther [Innsbruck Medical University, Department of Neurology, Innsbruck (Austria); Recheis, Wolfgang; Schocke, Michael; Gotwald, Thaddaeus [Innsbruck Medical University, Department of Radiology, Innsbruck (Austria); Goebel, Georg [Innsbruck Medical University, Department of Medical Statistics, Informatics and Health Economics, Innsbruck (Austria); Gueler, Oezguer [Innsbruck Medical University, 4D Visualization Laboratory, University Clinic of Oto-, Rhino- and Laryngology, Innsbruck (Austria); Kostron, Herwig [Innsbruck Medical University, Department of Neurosurgery, Innsbruck (Austria); Hutterer, Markus [Innsbruck Medical University, Department of Neurology, Innsbruck (Austria); Paracelsus Medical University Salzburg-Christian Doppler Hospital, Department of Neurology, Salzburg (Austria)
2011-04-15
The purpose of this study is to evaluate apparent diffusion coefficient (ADC) maps to distinguish anti-vascular and anti-tumor effects in the course of anti-angiogenic treatment of recurrent high-grade gliomas (rHGG), as compared to standard magnetic resonance imaging (MRI). This retrospective study analyzed ADC maps from diffusion-weighted MRI in 14 rHGG patients during bevacizumab/irinotecan (B/I) therapy. Applying image segmentation, volumes of contrast-enhanced lesions in T1 sequences and of hyperintense T2 lesions (hT2) were calculated. hT2 were defined as regions of interest (ROI) and registered to corresponding ADC maps (hT2-ADC). Histograms were calculated from hT2-ADC ROIs. Thereafter, the histogram asymmetry termed "skewness" was calculated and compared to progression-free survival (PFS) as defined by the Response Assessment in Neuro-Oncology (RANO) Working Group criteria. At 8-12 weeks follow-up, seven (50%) patients showed a partial response, three (21.4%) patients were stable, and four (28.6%) patients progressed according to RANO criteria. hT2-ADC histograms demonstrated statistically significant changes in skewness in relation to PFS at 6 months. Patients with increasing skewness (n = 11) following B/I therapy had significantly shorter PFS than did patients with decreasing or stable skewness values (n = 3, median percentage change in skewness 54% versus -3%, p = 0.04). In rHGG patients, the change in ADC histogram skewness may be predictive of treatment response early in the course of anti-angiogenic therapy and more sensitive than treatment assessment based solely on RANO criteria. (orig.)
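The skewness measure tracked above can be reproduced with a short routine: given the ADC values inside an hT2 ROI at two time points, compute the sample skewness of each histogram and take the change. A pure-Python sketch under those assumptions (this is not the study's software, and the variable names are illustrative):

```python
# Adjusted Fisher-Pearson sample skewness of ROI values; a positive
# shift between baseline and follow-up mirrors the "increasing
# skewness" criterion in the abstract. Requires n > 2 values.

def skewness(values):
    """Sample skewness g1 with the small-sample bias correction."""
    n = len(values)
    mean = sum(values) / n
    m2 = sum((v - mean) ** 2 for v in values) / n  # 2nd central moment
    m3 = sum((v - mean) ** 3 for v in values) / n  # 3rd central moment
    g1 = m3 / m2 ** 1.5
    return ((n * (n - 1)) ** 0.5 / (n - 2)) * g1

def skewness_change(baseline_roi, followup_roi):
    """Follow-up minus baseline histogram skewness."""
    return skewness(followup_roi) - skewness(baseline_roi)
```

A symmetric ADC distribution gives skewness 0; a long right tail gives a positive value, so an increase over therapy shifts the histogram mass toward low ADC with a high-ADC tail.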
Institute of Scientific and Technical Information of China (English)
郭迎春; 吴鹏; 袁浩杰
2012-01-01
For intelligent video surveillance systems, a moving-object retrieval and matching approach is proposed that combines the gray histogram of the target with its self-projection histogram, enabling fast detection of pedestrians moving in an abnormal direction in a video sequence. The method uses the combined gray and self-projection histogram features to retrieve and match targets among crowds, maintains a motion history record list of object centroids to continuously record each target's centroid and motion direction, and compares the motion directions of individual targets to find the abnormal one in a moving crowd. Experimental results show that, compared with using gray-histogram information alone, detection accuracy is higher after introducing the target's self-projection histogram, and the motion history record list is fully adequate for recording moving-target information. The method has a small computational cost, can predict target motion in real time from the recorded centroid speeds, and shows some robustness to mutual occlusion of moving targets.
Nanothermodynamics of large iron clusters by means of a flat histogram Monte Carlo method
Basire, M.; Soudan, J.-M.; Angelié, C.
2014-09-01
The thermodynamics of iron clusters of various sizes, from 76 to 2452 atoms, typical of the catalyst particles used for carbon nanotubes growth, has been explored by a flat histogram Monte Carlo (MC) algorithm (called the σ-mapping), developed by Soudan et al. [J. Chem. Phys. 135, 144109 (2011), Paper I]. This method provides the classical density of states, gp(Ep) in the configurational space, in terms of the potential energy of the system, with good and well controlled convergence properties, particularly in the melting phase transition zone which is of interest in this work. To describe the system, an iron potential has been implemented, called "corrected EAM" (cEAM), which approximates the MEAM potential of Lee et al. [Phys. Rev. B 64, 184102 (2001)] with an accuracy better than 3 meV/at, and a five times larger computational speed. The main simplification concerns the angular dependence of the potential, with a small impact on accuracy, while the screening coefficients Sij are exactly computed with a fast algorithm. With this potential, ergodic explorations of the clusters can be performed efficiently in a reasonable computing time, at least in the upper half of the solid zone and above. Problems of ergodicity exist in the lower half of the solid zone but routes to overcome them are discussed. The solid-liquid (melting) phase transition temperature Tm is plotted in terms of the cluster atom number Nat. The standard N_{at}^{-1/3} linear dependence (Pawlow law) is observed for Nat > 300, allowing an extrapolation up to the bulk metal at 1940 ± 50 K. For Nat < 150, a strong divergence is observed compared to the Pawlow law. The melting transition, which begins at the surface, is stated by a Lindemann-Berry index and an atomic density analysis. Several new features are obtained for the thermodynamics of cEAM clusters, compared to the Rydberg pair potential clusters studied in Paper I.
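The Pawlow-law extrapolation mentioned above amounts to a straight-line fit of T_m against N_at^(-1/3), with the intercept giving the bulk melting temperature. A sketch of that step with invented data points (the study's actual T_m values are not reproduced here):

```python
# Fit T_m(N) = T_bulk + a * N**(-1/3) by ordinary least squares and
# read the bulk melting temperature off the intercept (N -> infinity).
# All numbers in the demo are illustrative, not taken from the paper.

def fit_line(xs, ys):
    """Least squares for y = intercept + slope * x; returns (intercept, slope)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return my - slope * mx, slope

def bulk_melting_temperature(sizes, t_melt):
    """Extrapolate cluster melting temperatures to the bulk limit."""
    xs = [n ** (-1 / 3) for n in sizes]
    intercept, _slope = fit_line(xs, t_melt)
    return intercept

if __name__ == "__main__":
    sizes = [336, 531, 1100, 2452]             # hypothetical cluster sizes
    t_melt = [1500.0, 1560.0, 1650.0, 1740.0]  # hypothetical T_m in K
    print(bulk_melting_temperature(sizes, t_melt))
```

The abstract's caveat applies equally to the sketch: the linear law only holds above roughly 300 atoms, so small clusters must be excluded from the fit.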
Naud, Richard; Gerstner, Wulfram
2012-01-01
The response of a neuron to a time-dependent stimulus, as measured in a Peri-Stimulus-Time-Histogram (PSTH), exhibits an intricate temporal structure that reflects potential temporal coding principles. Here we analyze the encoding and decoding of PSTHs for spiking neurons with arbitrary refractoriness and adaptation. As a modeling framework, we use the spike response model, also known as the generalized linear neuron model. Because of refractoriness, the effect of the most recent spike on the spiking probability a few milliseconds later is very strong. The influence of the last spike needs therefore to be described with high precision, while the rest of the neuronal spiking history merely introduces an average self-inhibition or adaptation that depends on the expected number of past spikes but not on the exact spike timings. Based on these insights, we derive a 'quasi-renewal equation' which is shown to yield an excellent description of the firing rate of adapting neurons. We explore the domain of validity of the quasi-renewal equation and compare it with other rate equations for populations of spiking neurons. The problem of decoding the stimulus from the population response (or PSTH) is addressed analogously. We find that for small levels of activity and weak adaptation, a simple accumulator of the past activity is sufficient to decode the original input, but when refractory effects become large decoding becomes a non-linear function of the past activity. The results presented here can be applied to the mean-field analysis of coupled neuron networks, but also to arbitrary point processes with negative self-interaction.
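For readers unfamiliar with the PSTH itself, it is simply a trial-averaged, binned spike count normalized to a firing rate. A minimal sketch, with illustrative spike trains and bin width (not the paper's model):

```python
# Build a Peri-Stimulus-Time-Histogram: bin spike times across
# repeated trials of the same stimulus and normalize to a rate in Hz.

def psth(trials, t_max, bin_ms=10.0):
    """Per-bin firing rate (Hz) averaged over trials.

    trials : list of spike-time lists, in milliseconds
    t_max  : duration of each trial in milliseconds
    """
    n_bins = int(t_max / bin_ms)
    counts = [0] * n_bins
    for spikes in trials:
        for t in spikes:
            b = int(t / bin_ms)
            if 0 <= b < n_bins:
                counts[b] += 1
    # counts / (number of trials * bin width in seconds) -> Hz
    scale = 1.0 / (len(trials) * bin_ms / 1000.0)
    return [c * scale for c in counts]
```

The refractoriness discussed in the abstract shows up in such a histogram as a dip immediately after bins with high rate, which is what the quasi-renewal description captures analytically.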
Arif Wibowo, R.; Haris, Bambang; Inganatul Islamiyah, dan
2017-05-01
Brachytherapy is one way to cure cervical cancer. It works by placing a radioactive source near the tumor. However, some healthy tissues or organs at risk (OAR), such as the bladder and rectum, receive radiation as well. This study aims to evaluate the radiation dose to the bladder and rectum. Twelve total radiation dose data sets for the bladder and rectum were obtained from patients' brachytherapy. The dose to the cervix for all patients was 6 Gy. The two-dimensional calculation of the radiation dose was based on the International Commission on Radiation Units and Measurements (ICRU) points, called DICRU, while the three-dimensional calculation derived from the Dose Volume Histogram (DVH) on a volume of 2 cc (D2cc). The radiation doses of the bladder and rectum from both methods were analysed using an independent t test. The mean DICRU of the bladder was 4.33730 Gy and its D2cc was 4.78090 Gy; DICRU and D2cc of the bladder did not differ significantly (p = 0.144). The mean DICRU of the rectum was 3.57980 Gy and 4.58670 Gy for D2cc; the mean DICRU of the rectum differed significantly from its D2cc (p = 0.000). The three-dimensional radiation dose of the bladder and rectum was higher than the two-dimensional one, with ratios of 1.10227 for the bladder and 1.28127 for the rectum. The radiation dose of the bladder and rectum was still below the tolerance dose. The two-dimensional calculation of the bladder and rectum dose was lower than the three-dimensional one, which is more accurate because it is calculated over the whole volume of the organs.
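The independent t test used above can be sketched as follows. Only the t statistic and Welch-Satterthwaite degrees of freedom are computed, since a dependency-free p-value lookup would lengthen the example; the dose samples fed to it would be the DICRU and D2cc series, and everything shown is illustrative rather than the study's analysis.

```python
# Welch's t statistic for two independent samples (e.g., DICRU vs
# D2cc doses); does not assume equal variances. Requires >= 2 values
# per sample.

def welch_t(a, b):
    """Return (t statistic, Welch-Satterthwaite degrees of freedom)."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)  # sample variance
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    se2 = va / na + vb / nb                        # squared standard error
    t = (ma - mb) / se2 ** 0.5
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df
```

The t statistic is then compared against the t distribution with df degrees of freedom to obtain the p-values quoted in the abstract (0.144 and 0.000).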
Directory of Open Access Journals (Sweden)
Olivassé Nasari Junior
2015-06-01
Full Text Available Background: In chronic Chagas disease (ChD), impairment of cardiac autonomic function bears prognostic implications. Phase-rectification of RR-interval series isolates the sympathetic, acceleration phase (AC) and parasympathetic, deceleration phase (DC) influences on cardiac autonomic modulation. Objective: This study investigated heart rate variability (HRV) as a function of RR-interval to assess autonomic function in healthy and ChD subjects. Methods: Control (n = 20) and ChD (n = 20) groups were studied. All underwent a 60-min head-up tilt table test under ECG recording. The histogram of the RR-interval series was calculated, with 100 ms classes ranging from 600–1100 ms. In each class, the mean RR-interval (MNN) and the root-mean-squared difference (RMSNN) of consecutive normal RR-intervals that suited a particular class were calculated. The average of all RMSNN values in each class was analyzed as a function of MNN, in the whole series (RMSNNT) and in the AC (RMSNNAC) and DC (RMSNNDC) phases. Slopes of the linear regression lines were compared between groups using Student's t-test. Correlation coefficients were tested before comparisons. RMSNN was log-transformed (α < 0.05). Results: The correlation coefficient was significant in all regressions (p < 0.05). In the control group, RMSNNT, RMSNNAC, and RMSNNDC significantly increased linearly with MNN (p < 0.05). In ChD, only RMSNNAC showed a significant increase as a function of MNN, whereas RMSNNT and RMSNNDC did not. Conclusion: HRV increases in proportion with the RR-interval in healthy subjects. This behavior is lost in ChD, particularly in the DC phase, indicating cardiac vagal incompetence.
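The class-binned analysis above can be sketched as: assign each pair of consecutive RR intervals to a 100 ms class by its mean, then compute the RMS of successive differences within each class. The binning rule and the toy series below are assumptions for illustration, not the study's exact pipeline.

```python
# Bin consecutive RR-interval pairs into 100 ms classes by their mean
# (MNN proxy) and compute the RMS of successive differences (RMSNN)
# per class, mirroring the RMSNN-vs-MNN analysis described above.

def rmsnn_by_class(rr_ms, lo=600, hi=1100, width=100):
    """Map class lower bound (ms) -> RMSNN (ms) for pairs in [lo, hi)."""
    classes = {}
    for a, b in zip(rr_ms, rr_ms[1:]):
        mnn = (a + b) / 2
        if lo <= mnn < hi:
            k = int((mnn - lo) // width)
            classes.setdefault(k, []).append((b - a) ** 2)
    return {lo + k * width: (sum(d) / len(d)) ** 0.5
            for k, d in sorted(classes.items())}
```

Fitting a regression line to the resulting (class, RMSNN) points, after log-transforming RMSNN, gives the slope that the study compares between control and ChD groups.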
Nepal, Kumud; Fine, Adam; Imam, Nabil; Pietrocola, David; Robertson, Neil; Ahlgren, David J.
2009-01-01
Q is an unmanned ground vehicle designed to compete in the Autonomous and Navigation Challenges of the AUVSI Intelligent Ground Vehicle Competition (IGVC). Built on a base platform of a modified PerMobil Trax off-road wheelchair frame, and running off a Dell Inspiron D820 laptop with an Intel T7400 Core 2 Duo processor, Q gathers information from a SICK laser range finder (LRF), video cameras, differential GPS, and digital compass to localize its behavior and map out its navigational path. This behavior is handled by intelligent closed-loop speed control and robust sensor data processing algorithms. In the Autonomous Challenge, data taken from two IEEE 1394 cameras and the LRF are integrated and plotted on a custom-defined occupancy grid and converted into a histogram which is analyzed for openings between obstacles. The image processing algorithm consists of a series of steps involving plane extraction, normalizing of the image histogram for an effective dynamic thresholding, texture and morphological analysis, and particle filtering to allow optimum operation at varying ambient conditions. In the Navigation Challenge, a modified Vector Field Histogram (VFH) algorithm is combined with an auto-regressive path planning model for obstacle avoidance and better localization. Also, Q features Joint Architecture for Unmanned Systems (JAUS) Level 3 compliance. All algorithms are developed and implemented using National Instruments (NI) hardware and LabVIEW software. The paper will focus on explaining the various algorithms that make up Q's intelligence and the different ways and modes of their implementation.
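The polar-histogram step behind VFH-style navigation can be sketched as follows: accumulate an obstacle density per angular sector from range readings, then report low-density sectors as candidate openings. Sector size, distance weighting, and threshold are illustrative assumptions, not Q's actual parameters (Q is implemented in LabVIEW).

```python
# Vector-field-histogram-style opening detection: build a polar
# obstacle density histogram from (angle_deg, distance_m) readings
# and return the start angles of sectors below a density threshold.

def find_openings(readings, sector_deg=15, max_range=10.0, threshold=0.2):
    n = 360 // sector_deg
    density = [0.0] * n
    for angle, dist in readings:
        s = int(angle % 360) // sector_deg
        # nearer obstacles contribute more to the sector's density
        density[s] += max(0.0, 1.0 - dist / max_range)
    return [s * sector_deg for s in range(n) if density[s] < threshold]
```

A steering controller would then pick the opening closest to the goal heading, which is the role the auto-regressive path planner plays in the abstract.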
Piparo, Danilo
2012-01-01
The estimation of the compatibility of large amounts of histogram pairs is a recurrent problem in high energy physics. The issue is common to several different areas, from software quality monitoring to data certification, preservation and analysis. Given two sets of histograms, it is very important to be able to scrutinize the outcome of several goodness of fit tests, obtain a clear answer about the overall compatibility, easily spot the single anomalies and directly access the concerned histogram pairs. This procedure must be automated in order to reduce the human workload, therefore improving the process of identification of differences which is usually carried out by a trained human mind. Some solutions to this problem have been proposed, but they are experiment specific. RelMon depends only on ROOT and offers several goodness of fit tests (e.g. chi-squared or Kolmogorov-Smirnov). It produces highly readable web reports, in which aggregations of the comparisons rankings are available as well as all the pl...
Tateno, K; Glass, L
2001-11-01
The paper describes a method for the automatic detection of atrial fibrillation, an abnormal heart rhythm, based on the sequence of intervals between heartbeats. The RR interval is the interbeat interval, and deltaRR is the difference between two successive RR intervals. Standard density histograms of the RR and deltaRR intervals were prepared as templates for atrial fibrillation detection. As the coefficients of variation of the RR and deltaRR intervals were approximately constant during atrial fibrillation, the coefficients of variation in the test data could be compared with the standard coefficients of variation (CV test). Further, the similarities between the density histograms of the test data and the standard density histograms were estimated using the Kolmogorov-Smirnov test. The CV test based on the RR intervals showed a sensitivity of 86.6% and a specificity of 84.3%. The CV test based on the deltaRR intervals showed that the sensitivity and the specificity were both approximately 84%. The Kolmogorov-Smirnov test based on the RR intervals did not improve on the result of the CV test. In contrast, the Kolmogorov-Smirnov test based on the deltaRR intervals showed a sensitivity of 94.4% and a specificity of 97.2%.
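The two tests described above reduce to small routines on an RR-interval series: a coefficient of variation, and a two-sample Kolmogorov-Smirnov statistic comparing the test data against samples drawn from the template histogram. A dependency-free sketch (the paper's thresholds are not reproduced, and the series below are toys):

```python
# Building blocks for the CV test and the KS test on RR / deltaRR
# interval series.

def coeff_variation(xs):
    """Standard deviation divided by the mean (population SD)."""
    m = sum(xs) / len(xs)
    sd = (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5
    return sd / m

def delta(xs):
    """deltaRR series: differences of successive RR intervals."""
    return [b - a for a, b in zip(xs, xs[1:])]

def ks_statistic(a, b):
    """Two-sample KS statistic: max distance between empirical CDFs."""
    a, b = sorted(a), sorted(b)
    grid = sorted(set(a) | set(b))
    def cdf(s, x):
        return sum(1 for v in s if v <= x) / len(s)
    return max(abs(cdf(a, x) - cdf(b, x)) for x in grid)
```

In the paper's scheme, a test segment is flagged as atrial fibrillation when its CV falls near the standard value and its histogram lies close (small KS distance) to the fibrillation template.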
Overlapping and non-overlapping functions of condensins I and II in neural stem cell divisions.
Directory of Open Access Journals (Sweden)
Kenji Nishide
2014-12-01
Full Text Available During development of the cerebral cortex, neural stem cells (NSCs) divide symmetrically to proliferate and asymmetrically to generate neurons. Although faithful segregation of mitotic chromosomes is critical for NSC divisions, its fundamental mechanism remains unclear. A class of evolutionarily conserved protein complexes, known as condensins, is thought to be central to chromosome assembly and segregation among eukaryotes. Here we report the first comprehensive genetic study of mammalian condensins, demonstrating that two different types of condensin complexes (condensins I and II) are both essential for NSC divisions and survival in mice. Simultaneous depletion of both condensins leads to severe defects in chromosome assembly and segregation, which in turn cause DNA damage and trigger p53-induced apoptosis. Individual depletions of condensins I and II lead to slower loss of NSCs compared to simultaneous depletion, but they display distinct mitotic defects: chromosome missegregation was observed more prominently in NSCs depleted of condensin II, whereas mitotic delays were detectable only in condensin I-depleted NSCs. Remarkably, NSCs depleted of condensin II display hyperclustering of pericentric heterochromatin and nucleoli, indicating that condensin II, but not condensin I, plays a critical role in establishing interphase nuclear architecture. Intriguingly, these defects are taken over to postmitotic neurons. Our results demonstrate that condensins I and II have overlapping and non-overlapping functions in NSCs, and also provide evolutionary insight into intricate balancing acts of the two condensin complexes.
Simple Evaluation of Chiral Jacobian with Overlap Dirac Operator
Suzuki, H
1999-01-01
The chiral Jacobian, which is defined with Neuberger's overlap Dirac operator of the lattice fermion, is explicitly evaluated in the continuum limit without expanding it in the gauge coupling constant. Our calculational scheme is simple and straightforward. We determine the coefficient of the chiral anomaly for general values of the bare mass parameter and the Wilson parameter of the overlap Dirac operator.
Influence of line isolation overlappings on formation of lightning overvoltages
Directory of Open Access Journals (Sweden)
Antropov I. M.
2015-12-01
Full Text Available A model of substation protection against lightning waves that takes into account multiple overlappings of line isolation is presented. The influence of multiple overlappings of isolation on line supports on the formation of lightning overvoltages is shown. The ambiguity of determining dangerous lightning current parameters at a fixed front length is revealed.
An Exposition of Fischer's Model of Overlapping Contracts.
Fields, T. Windsor; Hart, William R.
1992-01-01
Suggests how the classic model of overlapping contracts can be incorporated into the contract wage model of aggregate supply. Illustrates dynamics of macroeconomic adjustment following a shock to aggregate demand. Concludes that overlapping contracts do not prolong the adjustment process; rather, the longest remaining contract determines the time…
Nested Genetic Algorithm for Resolving Overlapped Spectral Bands
Institute of Scientific and Technical Information of China (English)
Anonymous
2000-01-01
A nested genetic algorithm, comprising a genetic parameter level and a genetic implementation level for peak parameters, was proposed and applied to resolve overlapped spectral bands. At the genetic parameter level, the parameters of the genetic algorithm were optimized and the number of overlapped peaks was determined simultaneously; the parameters of the individual peaks were then computed at the genetic implementation level.
Overlaps of Partial Neel States and Bethe States
Foda, O
2015-01-01
Partial Neel states are generalizations of the ordinary Neel (classical anti-ferromagnet) state that can have arbitrary integer spin. We study overlaps of these states with Bethe states. We first identify this overlap with a partial version of reflecting-boundary domain-wall partition function, and then derive various determinant representations for off-shell and on-shell Bethe states.
Shake for Sigma, Pray for Pi: Classroom Orbital Overlap Analogies
Dicks, Andrew P.
2011-01-01
An introductory organic classroom demonstration is discussed where analogies are made between common societal hand contact and covalent bond formation. A handshake signifies creation of a σ bond ("head-on" orbital overlap), whereas the action of praying illustrates "sideways" overlap and generation of a π bond. The nature of orbital and…
Direct and indirect effects in the regulation of overlapping promoters
DEFF Research Database (Denmark)
Bendtsen, Kristian Moss; Erdossy, Janos; Csiszovski, Zsolt;
2011-01-01
promoter database we found that ~14% of the identified 'forward' promoters overlap with a promoter oriented in the opposite direction. In this article we combine a mathematical model with experimental analysis of synthetic regulatory regions to investigate interference of overlapping promoters. We find...
Distribution and content of dust in overlapping galaxy systems
White, R E; Conselice, C J; White, Raymond E; Keel, William C; Conselice, Christopher J
1996-01-01
Partially overlapping galaxies are used to directly determine the effective absorption in spiral galaxy disks. The non-overlapping parts of the galaxies and symmetry considerations are used to reconstruct, via differential photometry, how much background galaxy light is lost in passing through the foreground disks.
Compressed Sensing Inspired Image Reconstruction from Overlapped Projections
Directory of Open Access Journals (Sweden)
Lin Yang
2010-01-01
Full Text Available The key idea discussed in this paper is to reconstruct an image from overlapped projections so that the data acquisition process can be shortened while the image quality remains essentially uncompromised. To perform image reconstruction from overlapped projections, the conventional reconstruction approach (e.g., filtered backprojection (FBP) algorithms) cannot be directly used because of two problems. First, overlapped projections represent an imaging system in terms of summed exponentials, which cannot be transformed into a linear form. Second, the overlapped measurement carries less information than the traditional line integrals. To meet these challenges, we propose a compressive sensing (CS)-based iterative algorithm for reconstruction from overlapped data. This algorithm starts with a good initial guess, relies on adaptive linearization, and minimizes the total variation (TV). Then, we demonstrated the feasibility of this algorithm in numerical tests.
Segmentation of Overlapping Shapes using Test Ray Intersections
DEFF Research Database (Denmark)
Rasmusson, Allan
Segmenting the shapes in an image can in itself be a major task, but in bioimaging and tissue quantification it is often complicated further by the need for segmenting images of overlapping particles, for instance neurons. One approach to segmenting overlapping particles is to oversegment the image into many small regions which are then combined into the correct shapes in a postprocessing step. The postprocessing step is unfortunately often both difficult and computationally expensive. Another approach is to incorporate descriptions of the overlapping shapes into a segmentation algorithm which normally only segments the union of all particle profiles. This may, however, quickly lead to the implementation of complex descriptions of any possible configuration the overlapping shapes may appear in. Presented here is a new approach to segment overlapping shapes which utilizes information gained from probing the image with test rays. Test ray intersections...
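A toy version of the ray-probing idea: cast a test ray across one row of a binary mask and count boundary crossings. A single convex profile yields two crossings, while overlapping or multiple profiles along the ray yield more, so the crossing counts carry information about how many shapes the ray traversed. The mask representation and function name are assumptions for illustration, not the thesis' algorithm.

```python
# Count 0->1 and 1->0 transitions along one row of a binary mask,
# i.e., the number of times a horizontal test ray crosses a shape
# boundary.

def ray_crossings(row):
    """Number of boundary crossings along one image row."""
    crossings = 0
    prev = 0
    for v in list(row) + [0]:  # pad so a profile touching the edge closes
        if v != prev:
            crossings += 1
            prev = v
    return crossings
```

Probing with many rays at different positions and orientations, and aggregating the crossing statistics, is what lets such a method distinguish the union of profiles from the individual overlapping shapes.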
Compression of flow can reveal overlapping modular organization in networks
Esquivel, Alcides Viamontes
2011-01-01
To better understand the overlapping modular organization of large networks with respect to flow, here we introduce the map equation for overlapping modules. In this information-theoretic framework, we use the correspondence between compression and regularity detection. The generalized map equation measures how well we can compress a description of flow in the network when we partition it into modules with possible overlaps. When we minimize the generalized map equation over overlapping network partitions, we detect modules that capture flow and determine which nodes at the boundaries between modules should be classified in multiple modules and to what degree. With a novel greedy search algorithm, we find that some networks, for example, the neural network of C. elegans, are best described by modules dominated by hard boundaries, but that others, for example, the sparse road network of California, have a highly overlapping modular organization. To compare our approach with other clustering algorithms, we sugg...
Directory of Open Access Journals (Sweden)
Davide Colombi
Full Text Available To describe changes over time in the extent of idiopathic pulmonary fibrosis (IPF) at multidetector computed tomography (MDCT) assessed by semi-quantitative visual scores (VSs) and fully automatic histogram-based quantitative evaluation, and to test the relationship between these two methods of quantification. Forty IPF patients (median age: 70 y, interquartile: 62-75 years; M:F, 33:7) that underwent 2 MDCT at different time points with a median interval of 13 months (interquartile: 10-17 months) were retrospectively evaluated. In-house software YACTA automatically quantified the lung density histogram (10th-90th percentile in 5th percentile steps). Longitudinal changes in VSs and in the percentiles of the attenuation histogram were obtained in 20 untreated patients and 20 patients treated with pirfenidone. Pearson correlation analysis was used to test the relationship between VSs and selected percentiles. In follow-up MDCT, the visual overall extent of parenchymal abnormalities (OE) increased in median by 5%/year (interquartile: 0%/y; +11%/y). A substantial difference was found between treated and untreated patients in the HU changes of the 40th and of the 80th percentiles of the density histogram. Correlation analysis between VSs and selected percentiles showed a higher correlation between the changes (Δ) in OE and the Δ 40th percentile (r=0.69; p<0.001) as compared to the Δ 80th percentile (r=0.58; p<0.001); a closer correlation was found between the Δ ground-glass extent and the Δ 40th percentile (r=0.66, p<0.001) as compared to the Δ 80th percentile (r=0.47, p=0.002), while the Δ reticulations correlated better with the Δ 80th percentile (r=0.56, p<0.001) in comparison to the Δ 40th percentile (r=0.43, p=0.003). There is a relevant and fully automatically measurable difference at MDCT in VSs and in histogram analysis at one-year follow-up of IPF patients, whether treated or untreated: the Δ 40th percentile might reflect the change in overall extent of lung abnormalities, notably of the ground-glass pattern.
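The automatic measure above boils down to tracking a fixed percentile of the lung attenuation histogram across time points. A nearest-rank sketch (the YACTA software's exact percentile rule is not specified in the abstract, so this is an assumption for illustration):

```python
import math

# Track the change in a chosen percentile of the attenuation (HU)
# histogram between baseline and follow-up scans.

def percentile(values, q):
    """Nearest-rank percentile, q in (0, 100]."""
    s = sorted(values)
    rank = max(1, math.ceil(q / 100 * len(s)))
    return s[rank - 1]

def delta_percentile(baseline_hu, followup_hu, q=40):
    """Follow-up minus baseline q-th percentile of HU values."""
    return percentile(followup_hu, q) - percentile(baseline_hu, q)
```

A positive Δ 40th percentile means the lower-attenuation half of the histogram shifted toward denser lung, consistent with the abstract's reading of it as increased ground-glass extent.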
An extension to artifact-free projection overlaps
Energy Technology Data Exchange (ETDEWEB)
Lin, Jianyu, E-mail: jianyulin@hotmail.com [Department of Electrical and Computer Engineering, Curtin University, GPO Box U1987, Perth, Western Australia 6845 (Australia)
2015-05-15
Purpose: In multipinhole single photon emission computed tomography, the overlapping of projections has been used to increase sensitivity. Avoiding artifacts in the reconstructed image associated with projection overlaps (multiplexing) is a critical issue. In our previous report, two types of artifact-free projection overlaps, i.e., projection overlaps that do not lead to artifacts in the reconstructed image, were formally defined and proved, and were validated via simulations. In this work, a new proposition is introduced to extend the previously defined type-II artifact-free projection overlaps so that a broader range of artifact-free overlaps is accommodated. One practical purpose of the new extension is to design a baffle window multipinhole system with artifact-free projection overlaps. Methods: First, the extended type-II artifact-free overlap was theoretically defined and proved. The new proposition accommodates the situation where the extended type-II artifact-free projection overlaps can be produced with incorrectly reconstructed portions in the reconstructed image. Next, to validate the theory, the extended-type-II artifact-free overlaps were employed in designing the multiplexing multipinhole spiral orbit imaging systems with a baffle window. Numerical validations were performed via simulations, where the corresponding 1-pinhole nonmultiplexing reconstruction results were used as the benchmark for artifact-free reconstructions. The mean square error (MSE) was the metric used for comparisons of noise-free reconstructed images. Noisy reconstructions were also performed as part of the validations. Results: Simulation results show that for noise-free reconstructions, the MSEs of the reconstructed images of the artifact-free multiplexing systems are very similar to those of the corresponding 1-pinhole systems. No artifacts were observed in the reconstructed images. Therefore, the testing results for artifact-free multiplexing systems designed using the
Magner, Abram; Grama, Ananth
2016-01-01
Algorithms for detecting clusters (including overlapping clusters) in graphs have received significant attention in the research community. A closely related important aspect of the problem, quantification of the statistical significance of cluster overlap, remains relatively unexplored. This paper presents the first theoretical and practical results on quantifying statistically significant interactions between clusters in networks. Such problems commonly arise in diverse applications, ranging from social network analysis to systems biology. The paper addresses the problem of quantifying the statistical significance of the observed overlap of two clusters in an Erdős-Rényi graph model. The analytical framework presented in the paper assigns a p-value to overlapping subgraphs by combining information about both the sizes of the subgraphs and their edge densities in comparison to the corresponding values for their overlapping component. This p-value is demonstrated to have excellent discriminati...
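A size-only baseline for overlap significance is the hypergeometric tail: the probability that two uniformly random subsets of sizes m and n, drawn from N nodes, share at least k nodes. The paper's framework additionally incorporates edge densities, which this sketch deliberately omits; it is a baseline for comparison, not the paper's p-value.

```python
from math import comb

# Tail probability P(overlap >= k) for two uniform random node
# subsets of sizes m and n in a graph with N nodes (hypergeometric).

def overlap_pvalue(N, m, n, k):
    """Size-based p-value for observing an overlap of at least k nodes."""
    total = comb(N, n)
    tail = sum(comb(m, i) * comb(N - m, n - i)
               for i in range(k, min(m, n) + 1))
    return tail / total
```

A small p-value says the observed node sharing is unlikely under random placement alone; the paper's density-aware statistic then sharpens this by asking whether the shared part is also unusually well connected.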
National Research Council Canada - National Science Library
Kang, Yusuhn; Choi, Seung Hong; Kim, Young-Jae; Kim, Kwang Gi; Sohn, Chul-Ho; Kim, Ji-Hoon; Yun, Tae Jin; Chang, Kee-Hyun
2011-01-01
To explore the role of histogram analysis of apparent diffusion coefficient (ADC) maps based on entire tumor volume data in determining glioma grade and to evaluate the diagnostic performance of ADC maps at standard...
Zenchenko, K I; Zenchenko, T A; Kuzhevskiĭ, B M; Vilken, B; Axford, Y; Shnol', S E
2001-01-01
In joint experiments performed at the Max Planck Institute of Aeronomy (Germany) and the Institute of Theoretical and Experimental Biophysics in Pushchino, the main manifestations of the phenomenon of macroscopic fluctuations were confirmed. An increased probability of similarity between synchronous histograms in independent measurements, performed by two installations in one laboratory and in two laboratories separated by a distance of 2000 km, was shown. In the latter case, the similarity of histograms is most probable at the same local time.
Warming effect of dust aerosols modulated by overlapping clouds below
Xu, Hui; Guo, Jianping; Wang, Yuan; Zhao, Chuanfeng; Zhang, Zhibo; Min, Min; Miao, Yucong; Liu, Huan; He, Jing; Zhou, Shunwu; Zhai, Panmao
2017-10-01
Due to the substantial warming effect of dust aerosols overlying clouds and its poor representation in climate models, it is imperative to accurately quantify the direct radiative forcing (DRF) of above-cloud dust aerosols. When absorbing aerosol layers are located above clouds, the warming effect of the aerosols strongly depends on the macro- and micro-physical properties of the clouds underneath, such as cloud optical depth and cloud fraction at visible wavelengths. A larger aerosol-cloud overlap is believed to cause a larger warming effect of absorbing aerosols, but the influence of overlapping cloud fraction and cloud optical depth remains to be explored. In this study, the impact of overlapping cloud properties on the shortwave all-sky DRF due to springtime above-cloud dust aerosols is quantified over the northern Pacific Ocean based on 10 years of satellite measurements. On average, the DRF is roughly 0.62 W m-2. Furthermore, the warming effect of dust aerosols increases linearly with both overlapping cloud fraction and cloud optical depth. An increase of 1% in overlapping cloud fraction amplifies this warming effect by 1.11 W m-2 τ-1. For the springtime northern Pacific Ocean, top-of-atmosphere cooling by dust aerosols turns into warming when the overlapping cloud fraction exceeds 0.20. The critical cloud optical depth beyond which dust aerosols switch from exerting a net cooling to a net warming effect depends on the concurrent overlapping cloud fraction: when the overlapping cloud fraction range increases from 0.2-0.4 to 0.6-0.8, the corresponding critical cloud optical depth drops from 6.92 to 1.16. Our results demonstrate the importance of overlapping cloud properties for determining the springtime warming effect of dust aerosols.
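The cooling-to-warming threshold reported above is simple arithmetic once the linear dependence on cloud fraction is accepted: the critical fraction is the zero crossing of the DRF line. The linear form matches the abstract; the sample numbers below are illustrative assumptions, not values from the study.

```python
# If DRF(f) = drf_clear + slope * f, with f the overlapping cloud
# fraction (as the abstract's linear dependence suggests), the dust
# effect flips from net cooling to net warming at the zero crossing.
# Both demo numbers are invented for illustration.

def critical_cloud_fraction(drf_clear, slope):
    """Cloud fraction at which DRF crosses zero (cooling -> warming)."""
    if slope <= 0:
        raise ValueError("warming must increase with cloud fraction")
    return -drf_clear / slope

if __name__ == "__main__":
    # e.g., a clear-sky cooling of -2 W m^-2 with a slope of
    # 10 W m^-2 per unit cloud fraction gives a critical fraction of 0.2
    print(critical_cloud_fraction(-2.0, 10.0))
```

The same zero-crossing logic, applied along the optical-depth axis instead, yields the critical cloud optical depths (6.92 down to 1.16) quoted in the abstract.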
Nanothermodynamics of large iron clusters by means of a flat histogram Monte Carlo method
Energy Technology Data Exchange (ETDEWEB)
Basire, M.; Soudan, J.-M.; Angelié, C., E-mail: christian.angelie@cea.fr [Laboratoire Francis Perrin, CNRS-URA 2453, CEA/DSM/IRAMIS/LIDyL, F-91191 Gif-sur-Yvette Cedex (France)
2014-09-14
The thermodynamics of iron clusters of various sizes, from 76 to 2452 atoms, typical of the catalyst particles used for carbon nanotube growth, has been explored by a flat histogram Monte Carlo (MC) algorithm (called the σ-mapping), developed by Soudan et al. [J. Chem. Phys. 135, 144109 (2011), Paper I]. This method provides the classical density of states, g{sub p}(E{sub p}) in the configurational space, in terms of the potential energy of the system, with good and well controlled convergence properties, particularly in the melting phase transition zone which is of interest in this work. To describe the system, an iron potential has been implemented, called “corrected EAM” (cEAM), which approximates the MEAM potential of Lee et al. [Phys. Rev. B 64, 184102 (2001)] with an accuracy better than 3 meV/at, and a five times larger computational speed. The main simplification concerns the angular dependence of the potential, with a small impact on accuracy, while the screening coefficients S{sub ij} are exactly computed with a fast algorithm. With this potential, ergodic explorations of the clusters can be performed efficiently in a reasonable computing time, at least in the upper half of the solid zone and above. Problems of ergodicity exist in the lower half of the solid zone, but routes to overcome them are discussed. The solid-liquid (melting) phase transition temperature T{sub m} is plotted in terms of the cluster atom number N{sub at}. The standard N{sub at}{sup −1/3} linear dependence (Pawlow law) is observed for N{sub at} >300, allowing an extrapolation up to the bulk metal at 1940 ± 50 K. For N{sub at} <150, a strong divergence is observed compared to the Pawlow law. The melting transition, which begins at the surface, is characterized by a Lindemann-Berry index and an atomic density analysis. Several new features are obtained for the thermodynamics of cEAM clusters, compared to the Rydberg pair potential clusters studied in Paper I.
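The flat-histogram idea behind the σ-mapping can be illustrated with the closely related Wang-Landau scheme on a toy system. This is a sketch only: the spin toy model, batch size, and flatness criterion below are illustrative choices, not the paper's algorithm or parameters.

```python
import math
import random

def wang_landau(n_spins=10, flatness=0.8, ln_f_final=1e-4, seed=1):
    """Flat-histogram (Wang-Landau) estimate of the density of states g(E)
    for a toy system where E = number of 'up' spins among n_spins
    independent spins, so the exact answer is the binomial C(n, E)."""
    rng = random.Random(seed)
    spins = [rng.choice([0, 1]) for _ in range(n_spins)]
    E = sum(spins)
    ln_g = [0.0] * (n_spins + 1)   # log density of states, one bin per energy
    hist = [0] * (n_spins + 1)     # visit histogram for the current stage
    ln_f = 1.0                     # modification factor, halved each stage
    while ln_f > ln_f_final:
        for _ in range(2000):
            i = rng.randrange(n_spins)           # propose a single spin flip
            E_new = E + (1 - 2 * spins[i])
            # accept with min(1, g(E)/g(E_new)) -> drives a flat histogram in E
            if math.log(rng.random() + 1e-300) < ln_g[E] - ln_g[E_new]:
                spins[i] ^= 1
                E = E_new
            ln_g[E] += ln_f
            hist[E] += 1
        if min(hist) > flatness * sum(hist) / len(hist):
            ln_f /= 2.0                          # histogram flat: refine
            hist = [0] * (n_spins + 1)
    base = ln_g[0]                               # normalise so g(0) = 1
    return [x - base for x in ln_g]
```

For this toy system the exact density of states is the binomial coefficient C(n, E), so the estimate can be checked directly against ln C(10, 5) = ln 252.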
Apparent diffusion coefficient histogram analysis of neonatal hypoxic-ischemic encephalopathy
Energy Technology Data Exchange (ETDEWEB)
Cauley, Keith A. [University of Massachusetts Medical School, Department of Radiology, Worcester, MA (United States); New York Presbyterian Hospital, Columbia University Medical Center, Department of Radiology, New York, NY (United States); Filippi, Christopher G. [New York Presbyterian Hospital, Columbia University Medical Center, Department of Radiology, New York, NY (United States)
2014-06-15
Diffusion-weighted imaging is a valuable tool in the assessment of the neonatal brain, and changes in diffusion are seen in normal development as well as in pathological states such as hypoxic-ischemic encephalopathy (HIE). Various methods of quantitative assessment of diffusion values have been reported. Global ischemic injury occurring during the time of rapid developmental changes in brain myelination can complicate the imaging diagnosis of neonatal HIE. To compare a quantitative method of histographic analysis of brain apparent diffusion coefficient (ADC) maps to the qualitative interpretation of routine brain MR imaging studies. We correlate changes in diffusion values with gestational age in radiographically normal neonates, and we investigate the sensitivity of the method as a quantitative measure of hypoxic-ischemic encephalopathy. We reviewed all brain MRI studies from the neonatal intensive care unit (NICU) at our university medical center over a 4-year period to identify cases that were radiographically normal (23 cases) and those with diffuse, global hypoxic-ischemic encephalopathy (12 cases). We histographically displayed ADC values of a single brain slice at the level of the basal ganglia and correlated peak (s-sD{sub av}) and lowest histogram values (s-sD{sub lowest}) with gestational age. Normative s-sD{sub av} values correlated significantly with gestational age and declined linearly through the neonatal period (r{sup 2} = 0.477, P < 0.01). Six of 12 cases of known HIE demonstrated significantly lower s-sD{sub av} and s-sD{sub lowest} ADC values than were reflected in the normative distribution; several cases of HIE fell within a 95% confidence interval for normative studies, and one case demonstrated higher-than-normal s-sD{sub av}. Single-slice histographic display of ADC values is a rapid and clinically feasible method of quantitative analysis of diffusion. In this study normative values derived from consecutive neonates without radiographic evidence of
A Guide to Using STITCHER for Overlapping Assembly PCR Applications.
O'Halloran, Damien M
2017-01-01
Overlapping PCR is commonly used in many molecular applications that include stitching PCR fragments together, generating fluorescent transcriptional and translational fusions, inserting mutations, making deletions, and PCR cloning. Overlapping PCR is also used for genotyping and in detection experiments using techniques such as loop-mediated isothermal amplification (LAMP). STITCHER is a web tool providing a central resource for researchers conducting all types of overlapping assembly PCR experiments with an intuitive interface for automated primer design that's fast, easy to use, and freely available online.
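The junction-primer construction at the heart of overlapping assembly PCR can be sketched in a few lines. This is a generic illustration: the function names and the tail/anneal lengths are illustrative and do not reproduce STITCHER's actual design rules.

```python
def revcomp(seq):
    """Reverse complement of a DNA string (uppercase A/C/G/T only)."""
    comp = {"A": "T", "T": "A", "G": "C", "C": "G"}
    return "".join(comp[b] for b in reversed(seq))

def stitch_primers(frag_a, frag_b, tail=10, anneal=12):
    """Chimeric primers for joining frag_a (upstream) to frag_b
    (downstream) by overlap-extension PCR: each primer anneals to one
    fragment and carries a 5' tail copied from the other, so the two
    first-round amplicons share the junction sequence."""
    # forward primer into frag_b, tailed with the 3' end of frag_a
    fwd_b = frag_a[-tail:] + frag_b[:anneal]
    # reverse primer on frag_a, tailed with the start of frag_b
    rev_a = revcomp(frag_a[-anneal:] + frag_b[:tail])
    return fwd_b, rev_a
```

Both primers contain the junction sequence, so the two first-round products overlap and can be fused in the second (assembly) PCR.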
Overlap distribution of the three-dimensional Ising model.
Berg, Bernd A; Billoire, Alain; Janke, Wolfhard
2002-10-01
We study the Parisi overlap probability density P(L)(q) for the three-dimensional Ising ferromagnet by means of Monte Carlo (MC) simulations. At the critical point, P(L)(q) is peaked around q=0, in contrast with the double-peaked magnetic probability density. We give particular attention to the tails of the overlap distribution at the critical point, which we follow over up to 500 orders of magnitude by using the multioverlap MC algorithm. Below the critical temperature, interface tension estimates from the overlap probability density are given, and their approach to the infinite volume limit appears to be smoother than for estimates from the magnetization.
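The overlap itself is a one-line quantity, q = (1/N) Σ_i s_i⁽¹⁾ s_i⁽²⁾. A toy sketch with two independent replicas of uncorrelated ±1 spins (not an actual Ising simulation; just enough to produce a P(q) peaked around q = 0 as described):

```python
import random

def overlap(s1, s2):
    """Parisi overlap q = (1/N) * sum_i s1[i]*s2[i] of two spin configs."""
    return sum(a * b for a, b in zip(s1, s2)) / len(s1)

def overlap_histogram(n_spins=64, n_samples=20000, n_bins=17, seed=2):
    """Histogram of q over pairs of independent random +/-1 configurations;
    q ranges over [-1, 1], binned into n_bins equal-width bins."""
    rng = random.Random(seed)
    counts = [0] * n_bins
    for _ in range(n_samples):
        s1 = [rng.choice((-1, 1)) for _ in range(n_spins)]
        s2 = [rng.choice((-1, 1)) for _ in range(n_spins)]
        b = int((overlap(s1, s2) + 1) / 2 * n_bins)  # map q in [-1,1] to a bin
        counts[min(b, n_bins - 1)] += 1
    return counts
```

In a real study the two configurations come from two independent replicas of the same Ising simulation rather than from pure noise, but the accumulation of P(q) is the same.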
Computation of overlap integrals over STOs with mathematica
Yükçü, S. A.; Yükçü, N.
2017-02-01
Overlap integrals, which are encountered in molecular structure calculations, are the most basic of molecular integrals; other molecular integrals can be expressed in terms of them. Overlap integrals can be calculated using Slater Type Orbitals (STOs). In this work, we develop algorithms for two-center overlap integrals, which are calculated over STOs in ellipsoidal coordinates using auxiliary functions due to S. M. Mekelleche's group. The Mathematica programming language has been used to implement the algorithms. Numerical results for some quantum numbers are presented in tables. Finally, our numerical results are compared with others, and some details of the evaluation method are discussed.
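For the simplest special case the two-center overlap integral has a closed form; a sketch for two 1s STOs with equal exponents ζ at separation R (a textbook result, far simpler than the general quantum-number machinery the paper develops):

```python
import math

def sto_1s_overlap(zeta, R):
    """Closed-form two-center overlap <1s_A|1s_B> for Slater 1s orbitals
    with equal exponents zeta, separated by distance R (atomic units):
    S = exp(-rho) * (1 + rho + rho^2/3), with rho = zeta * R.
    General (n, l, unequal-exponent) cases require the auxiliary-function
    approach described in the abstract."""
    rho = zeta * R
    return math.exp(-rho) * (1.0 + rho + rho * rho / 3.0)
```

S tends to 1 as R tends to 0 and decays exponentially at large R, which also makes this case a handy sanity check for general-purpose overlap codes.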
Overlapping Communities Detection Based on Link Partition in Directed Networks
Directory of Open Access Journals (Sweden)
Qingyu Zou
2013-09-01
Full Text Available Many complex systems can be described as networks to comprehend both their structure and function. Community structure is one of the most important properties of complex networks. Detecting overlapping communities in networks has received more attention in recent years, but most approaches to this problem have been applied to undirected networks. This paper presents a novel approach based on link partition to detect overlapping community structure in directed networks. In contrast to previous research focused on grouping nodes, our algorithm defines communities as groups of directed links rather than nodes, so that nodes naturally belong to more than one community. This approach can identify a suitable number of overlapping communities without any prior knowledge about the communities in directed networks. We evaluate our algorithm on a simple artificial network and several real networks. Experimental results demonstrate that the proposed algorithm is efficient for detecting overlapping communities in directed networks.
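The paper's algorithm is not reproduced here, but the general link-partition idea can be sketched with a deliberately simplified scheme (hypothetical similarity rule and threshold): merge links that share an endpoint when the neighbourhoods of their other endpoints are similar, then let nodes inherit the communities of all their links.

```python
from collections import defaultdict
from itertools import combinations

def link_communities(edges, threshold=0.3):
    """Toy link partition: links are the objects being clustered, so a node
    overlaps communities whenever its links end up in different clusters.
    Assumes no self-loops; the Jaccard rule and threshold are illustrative."""
    nb = defaultdict(set)                   # inclusive, direction-ignoring
    for u, v in edges:
        nb[u].add(v)
        nb[v].add(u)
    for n in nb:
        nb[n].add(n)

    parent = {e: e for e in edges}          # union-find over links
    def find(e):
        while parent[e] != e:
            parent[e] = parent[parent[e]]
            e = parent[e]
        return e

    for e1, e2 in combinations(edges, 2):
        shared = set(e1) & set(e2)
        if len(shared) == 2:                # reciprocal links: always merge
            parent[find(e1)] = find(e2)
        elif len(shared) == 1:
            a = (set(e1) - shared).pop()    # the two non-shared endpoints
            b = (set(e2) - shared).pop()
            sim = len(nb[a] & nb[b]) / len(nb[a] | nb[b])
            if sim >= threshold:
                parent[find(e1)] = find(e2)

    comms = defaultdict(set)                # community root -> its links
    for e in edges:
        comms[find(e)].add(e)
    node_comms = defaultdict(set)           # node -> communities it touches
    for root, members in comms.items():
        for u, v in members:
            node_comms[u].add(root)
            node_comms[v].add(root)
    return dict(comms), dict(node_comms)
```

For two triangles sharing a single node, that node inherits both link communities while every other node belongs to exactly one, which is the overlap behaviour the abstract describes.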
ORIGINAL ARTICLE Farber disease overlapping with stiff skin ...
African Journals Online (AJOL)
salah
Background: Farber Disease (MIM 228000)1 is a rare AR disorder first described by Sidney ... Aim of the Study: Diagnosis and clarification of overlapping in the clinical .... pochromic anemia with anisocytosis, .... a new approach to treatment.
Genetic overlap among intelligence and other candidate endophenotypes for schizophrenia
Aukes, Maartje F; Alizadeh, Behrooz Z; Sitskoorn, Margriet M; Kemner, Chantal; Ophoff, Roel A; Kahn, René S
2009-01-01
BACKGROUND: A strategy to improve genetic studies of schizophrenia involves the use of endophenotypes. Information on overlapping genetic contributions among endophenotypes may provide additional power, reveal biological pathways, and have practical implications for genetic research. Several cogniti
Overlap Dirac Operator, Eigenvalues and Random Matrix Theory
Edwards, Robert G.; Heller, Urs M.; Kiskis, Joe; Narayanan, Rajamani
1999-01-01
The properties of the spectrum of the overlap Dirac operator and their relation to random matrix theory are studied. In particular, the predictions from chiral random matrix theory in topologically non-trivial gauge field sectors are tested.
Some new results on the central overlap problem in astrometry
Rapaport, M.
1998-07-01
The central overlap problem in astrometry has been revisited in recent years by Eichhorn (1988), who explicitly inverted the matrix of a constrained least squares problem. In this paper, the general explicit solution of the unconstrained central overlap problem is given. We also give the explicit solution for another set of constraints; this result confirms a conjecture expressed by Eichhorn (1988). We also consider the use of iterative methods to solve the central overlap problem. A surprising result is obtained when the classical Gauss-Seidel method is used: the iterations converge immediately to the general solution of the equations; we explain this property by writing the central overlap problem in a new set of variables.
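The Gauss-Seidel iteration referred to above has the generic form x_i ← (b_i − Σ_{j≠i} a_ij x_j) / a_ii; a sketch on a small diagonally dominant system (illustrating the method only, not the astrometric normal equations):

```python
def gauss_seidel(A, b, iters=100):
    """Solve A x = b by Gauss-Seidel sweeps: each component is updated in
    place, so newly computed values are used immediately within a sweep.
    Converges for diagonally dominant (and symmetric positive definite)
    systems."""
    n = len(b)
    x = [0.0] * n
    for _ in range(iters):
        for i in range(n):
            s = sum(A[i][j] * x[j] for j in range(n) if j != i)
            x[i] = (b[i] - s) / A[i][i]
    return x
```

The immediate reuse of updated components is what distinguishes Gauss-Seidel from Jacobi iteration, and it is the mechanism behind the one-step convergence the abstract reports in its transformed variables.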
Overlapping features of frontotemporal dementia and amyotrophic lateral sclerosis
National Research Council Canada - National Science Library
Lillo, Patricia; Matamala, José Manuel; Valenzuela, Daniel; Verdugo, Renato; Castillo, José Luis; Ibáñez, Agustín; Slachevsky, Andrea
2014-01-01
...) and amyotrophic lateral sclerosis (ALS) are overlapping multisystem disorders. While 10-15% of ALS patients fulfil criteria for FTD, features of motor neuron disease appear in approximately 15...
Continuum-limit scaling of overlap fermions as valence quarks
Energy Technology Data Exchange (ETDEWEB)
Cichy, Krzysztof [Adam Mickiewicz Univ., Poznan (Poland). Faculty of Physics; Herdoiza, Gregorio; Jansen, Karl [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC
2009-10-15
We present the results of a mixed action approach, employing dynamical twisted mass fermions in the sea sector and overlap valence fermions, with the aim of testing the continuum limit scaling behaviour of physical quantities, taking the pion decay constant as an example. To render the computations practical, we impose for this purpose a fixed finite volume with lattice size L{approx}1.3 fm. We also briefly review the techniques we have used to deal with overlap fermions. (orig.)
A novel symbol overlapping FFH-OCDMA system
Institute of Scientific and Technical Information of China (English)
Chengbin Shen(沈成彬); Chen Wu(吴琛); Jinhui Yu(于金辉); Ge Fan(范戈)
2004-01-01
A novel symbol overlapping optical fast frequency-hop code-division multiple access (FFH-OCDMA) system is proposed, and its bit error rate (BER) performance is investigated under consideration of avalanche photodiode (APD) noise and thermal noise. An experimental symbol overlapping (SO) FFH-OCDMA testbed is developed and some experimental results are given. The theoretical and experimental results show that the system is easy to implement and has larger throughput.
Topological summation of observables measured with dynamical overlap fermions
Energy Technology Data Exchange (ETDEWEB)
Bietenholz, W. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Hip, I. [Zagreb Univ. (Croatia). Faculty of Geothechnical Engineering
2008-10-15
HMC histories for light dynamical overlap fermions tend to stay in a fixed topological sector for many trajectories, so that the different sectors are not sampled properly. Therefore the suitable summation of observables, which have been measured in separate sectors, is a major challenge. We explore several techniques for this issue, based on data for the chiral condensate and the (analogue of the) pion mass in the 2-flavour Schwinger model with dynamical overlap-hypercube fermions. (orig.)
Overlap Areas of a Square Box on a Square Mesh
2017-04-01
ARL-TN-0818 ● APR 2017 US Army Research Laboratory Overlap Areas of a Square Box on a Square Mesh by James U Cazamias
Gene network interconnectedness and the generalized topological overlap measure
Directory of Open Access Journals (Sweden)
Horvath Steve
2007-01-01
Full Text Available Abstract Background Network methods are increasingly used to represent the interactions of genes and/or proteins. Genes or proteins that are directly linked may have a similar biological function or may be part of the same biological pathway. Since the information on the connection (adjacency) between 2 nodes may be noisy or incomplete, it can be desirable to consider alternative measures of pairwise interconnectedness. Here we study a class of measures that are proportional to the number of neighbors that a pair of nodes share in common. For example, the topological overlap measure by Ravasz et al. [1] can be interpreted as a measure of agreement between the m = 1 step neighborhoods of 2 nodes. Several studies have shown that two proteins having a higher topological overlap are more likely to belong to the same functional class than proteins having a lower topological overlap. Here we address the question whether a measure of topological overlap based on higher-order neighborhoods could give rise to a more robust and sensitive measure of interconnectedness. Results We generalize the topological overlap measure from m = 1 step neighborhoods to m ≥ 2 step neighborhoods. This allows us to define the m-th order generalized topological overlap measure (GTOM) by (i) counting the number of m-step neighbors that a pair of nodes share and (ii) normalizing it to take a value between 0 and 1. Using theoretical arguments, a yeast co-expression network application, and a fly protein network application, we illustrate the usefulness of the proposed measure for module detection and gene neighborhood analysis. Conclusion Topological overlap can serve as an important filter to counter the effects of spurious or missing connections between network nodes. The m-th order topological overlap measure allows one to trade-off sensitivity versus specificity when it comes to defining pairwise interconnectedness and network modules.
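The two-step recipe, (i) count shared m-step neighbours and (ii) normalise into [0, 1], can be sketched directly. The normalisation below follows the standard TOM-style formula t_ij = (|N_m(i) ∩ N_m(j)| + a_ij) / (min(|N_m(i)|, |N_m(j)|) + 1 − a_ij); the paper's exact definition may differ in detail.

```python
def gtom(adj, m=1):
    """m-th order generalized topological overlap for an unweighted,
    undirected network given as a 0/1 adjacency matrix (no self-links).
    N_m(i) = nodes reachable from i in at most m steps, excluding i."""
    n = len(adj)
    step = [set(j for j in range(n) if adj[i][j]) for i in range(n)]
    reach = [set(s) for s in step]
    for _ in range(m - 1):                  # grow neighbourhoods to m steps
        new = []
        for i in range(n):
            r = set(reach[i])
            for k in reach[i]:
                r |= step[k]
            r.discard(i)
            new.append(r)
        reach = new
    T = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            shared = len(reach[i] & reach[j])
            t = (shared + adj[i][j]) / (
                min(len(reach[i]), len(reach[j])) + 1 - adj[i][j])
            T[i][j] = T[j][i] = t
    return T
```

For a 3-node path, GTOM with m = 1 already gives the two end nodes overlap 0.5 through their shared middle neighbour, even though they are not adjacent.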
Arai, Kazuhiro; Kadoya, Noriyuki; Kato, Takahiro; Endo, Hiromitsu; Komori, Shinya; Abe, Yoshitomo; Nakamura, Tatsuya; Wada, Hitoshi; Kikuchi, Yasuhiro; Takai, Yoshihiro; Jingu, Keiichi
2017-01-01
The aim of this study was to confirm On-Board Imager cone-beam computed tomography (CBCT) using the histogram-matching algorithm as a useful method for proton dose calculation. We studied one head and neck phantom, one pelvic phantom, and ten patients with head and neck cancer treated using intensity-modulated radiation therapy (IMRT) and proton beam therapy. We modified Hounsfield unit (HU) values of CBCT and generated two modified CBCTs (mCBCT-RR, mCBCT-DIR) using the histogram-matching algorithm: modified CBCT with rigid registration (mCBCT-RR) and that with deformable image registration (mCBCT-DIR). Rigid and deformable image registration were applied to match the CBCT to planning CT. To evaluate the accuracy of the proton dose calculation, we compared dose differences in the dosimetric parameters (D2% and D98%) for clinical target volume (CTV) and planning target volume (PTV). We also evaluated the accuracy of the dosimetric parameters (Dmean and D2%) for some organs at risk, and compared the proton ranges (PR) between planning CT (reference) and CBCT or mCBCTs, and the gamma passing rates of CBCT and mCBCTs. For patients, the average dose and PR differences of mCBCTs were smaller than those of CBCT. Additionally, the average gamma passing rates of mCBCTs were larger than those of CBCT (e.g., 94.1±3.5% in mCBCT-DIR vs. 87.8±7.4% in CBCT). We evaluated the accuracy of the proton dose calculation in CBCT and mCBCTs for two phantoms and ten patients. Our results showed that HU modification using the histogram-matching algorithm could improve the accuracy of the proton dose calculation.
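The HU modification step can be sketched as a generic quantile (histogram) mapping; an illustration under simplifying assumptions (flat value lists, no registration step, no proton dose engine), not the study's implementation:

```python
import bisect

def histogram_match(source, reference):
    """Map each source value to the reference value at the same empirical
    quantile (monotone quantile mapping) -- the generic form of matching a
    CBCT HU histogram to that of the planning CT."""
    src_sorted = sorted(source)
    ref_sorted = sorted(reference)
    n, m = len(src_sorted), len(ref_sorted)
    matched = []
    for v in source:
        # empirical quantile of v within the source distribution
        p = bisect.bisect_left(src_sorted, v) / (n - 1) if n > 1 else 0.0
        matched.append(ref_sorted[min(m - 1, round(p * (m - 1)))])
    return matched
```

After the mapping, the CBCT-like values follow the planning-CT value distribution, which is the property the modified CBCTs rely on for dose calculation.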
Energy Technology Data Exchange (ETDEWEB)
Kim, Joo Young; Lee, Ik Jae; Keum, Ki Chang; Kim, Yong Bae; Shim, Su Jung; Jeong, Kyoung Keun; Kim, Jong Dae; Suh, Chang Ok [Yonsei University College of Medicine, Seoul (Korea, Republic of)
2007-12-15
Purpose: To evaluate the association between radiation pneumonitis and dose-volume histogram parameters and to provide practical guidelines to prevent radiation pneumonitis following radiotherapy administered for breast cancer including internal mammary lymph nodes. Materials and Methods: Twenty patients with early breast cancer who underwent a partial mastectomy were involved in this study. The entire breast, supraclavicular lymph nodes, and internal mammary lymph nodes were irradiated with a dose of 50.4 Gy in 28 fractions. Radiation pneumonitis was assessed by both radiological pulmonary change (RPC) and by evaluation of symptomatic radiation pneumonitis. Dose-volume histogram parameters were compared between patients with grade <2 RPC and those with grade {>=}2 RPC. The parameters were the mean lung dose, V10 (percent lung volume receiving equal to and more than 10 Gy), V20, V30, V40, and normal tissue complication probability (NTCP). Results: Of the 20 patients, 9 (45%) developed grade 2 RPC and 11 (55%) did not develop RPC (grade 0). Only one patient developed grade 1 symptomatic radiation pneumonitis. Univariate analysis showed that among the dose-volume histogram parameters, NTCP was significantly different between the two RPC grade groups (p <0.05). Fisher's exact test indicated that an NTCP value of 45% was appropriate as an RPC threshold level. Conclusion: This study shows that NTCP can be used as a predictor of RPC after radiotherapy of the internal mammary lymph nodes in breast cancer. Clinically, it indicates that an RPC is likely to develop when the NTCP is greater than 45%.
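The dose-volume histogram parameters compared above are simple reductions of the per-voxel dose; a sketch assuming equal voxel volumes (NTCP itself additionally requires a dose-response model, e.g. Lyman-Kutcher-Burman, and is not shown):

```python
def dvh_metrics(doses, thresholds=(10, 20, 30, 40)):
    """Mean dose and V_x values (percentage of the organ volume receiving
    at least x Gy) from a flat list of per-voxel doses in Gy, assuming
    equal voxel volumes."""
    n = len(doses)
    mean_dose = sum(doses) / n
    vx = {"V%d" % t: 100.0 * sum(d >= t for d in doses) / n
          for t in thresholds}
    return mean_dose, vx
```

V10 through V40 here correspond to the percent-volume parameters the study compares between the two radiological pulmonary change groups.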
Energy Technology Data Exchange (ETDEWEB)
Castellano, Antonella; Iadanza, Antonella; Falini, Andrea [San Raffaele Scientific Institute and Vita-Salute San Raffaele University, Neuroradiology Unit and CERMAC, Milano (Italy); Donativi, Marina [University of Salento, Department of Mathematics and Physics ' ' Ennio De Giorgi' ' and A.D.A.M. (Advanced Data Analysis in Medicine), Lecce (Italy); Ruda, Roberta; Bertero, Luca; Soffietti, Riccardo [University of Torino, Department of Neuro-oncology, Turin (Italy); De Nunzio, Giorgio [University of Salento, Department of Mathematics and Physics ' ' Ennio De Giorgi' ' and A.D.A.M. (Advanced Data Analysis in Medicine), Lecce (Italy); INFN (National Institute of Nuclear Physics), Lecce (Italy); Riva, Marco; Bello, Lorenzo [Universita degli Studi di Milano, Milan, and Humanitas Research Hospital, Department of Medical Biotechnology and Translational Medicine, Rozzano, MI (Italy); Rucco, Matteo [University of Camerino, School of Science and Technology, Computer Science Division, Camerino, MC (Italy)
2016-05-15
To explore the role of diffusion tensor imaging (DTI)-based histogram analysis and functional diffusion maps (fDMs) in evaluating structural changes of low-grade gliomas (LGGs) receiving temozolomide (TMZ) chemotherapy. Twenty-one LGG patients underwent 3T-MR examinations before and after three and six cycles of dose-dense TMZ, including 3D-fluid-attenuated inversion recovery (FLAIR) sequences and DTI (b = 1000 s/mm{sup 2}, 32 directions). Mean diffusivity (MD), fractional anisotropy (FA), and tensor-decomposition DTI maps (p and q) were obtained. Histogram and fDM analyses were performed on co-registered baseline and post-chemotherapy maps. DTI changes were compared with modifications of tumour area and volume [according to Response Assessment in Neuro-Oncology (RANO) criteria], and seizure response. After three cycles of TMZ, 20/21 patients were stable according to RANO criteria, but DTI changes were observed in all patients (Wilcoxon test, P ≤ 0.03). After six cycles, DTI changes were more pronounced (P ≤ 0.005). Seventy-five percent of patients had early seizure response with significant improvement of DTI values, maintaining stability on FLAIR. Early changes of the 25th percentiles of p and MD predicted final volume change (R{sup 2} = 0.614 and 0.561, P < 0.0005, respectively). TMZ-related changes were located mainly at tumour borders on p and MD fDMs. DTI-based histogram and fDM analyses are useful techniques to evaluate the early effects of TMZ chemotherapy in LGG patients. (orig.)
No association between plant mating system and geographic range overlap.
Grossenbacher, Dena; Briscoe Runquist, Ryan D; Goldberg, Emma E; Brandvain, Yaniv
2016-01-01
Automatic self-fertilization may influence the geography of speciation, promote reproductive isolation between incipient species, and lead to ecological differentiation. As such, selfing taxa are predicted to co-occur more often with their closest relatives than are outcrossing taxa. Despite suggestions that this pattern may be general, the extent to which mating system influences range overlap in close relatives has not been tested formally across a diverse group of plant species pairs. We tested for a difference in range overlap between species pairs for which zero, one, or both species are selfers, using data from 98 sister species pairs in 20 genera across 15 flowering plant families. We also used divergence time estimates from time-calibrated phylogenies to ask how range overlap changes with divergence time and whether this effect depends on mating system. We found no evidence that automatic self-fertilization influenced range overlap of closely related plant species. Sister pairs with more recent divergence times had modestly greater range overlap, but this effect did not depend on mating system. The absence of a strong influence of mating system on range overlap suggests that mating system plays a minor or inconsistent role compared with many other mechanisms potentially influencing the co-occurrence of close relatives. © 2016 Botanical Society of America.
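For concreteness, range overlap in such studies is commonly quantified as the shared area divided by the area of the smaller range; a sketch of that convention on gridded ranges (the paper's exact metric may differ):

```python
def range_overlap(range_a, range_b):
    """Fraction of the smaller range that is shared, with each species'
    range given as a set of occupied grid cells: 0 = fully allopatric,
    1 = the smaller range lies entirely inside the larger one."""
    shared = len(range_a & range_b)
    return shared / min(len(range_a), len(range_b))
```

Normalising by the smaller range keeps the measure meaningful when one species is far more widespread than the other.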
A Bayesian variable selection procedure to rank overlapping gene sets
Directory of Open Access Journals (Sweden)
Skarman Axel
2012-05-01
Full Text Available Abstract Background Genome-wide expression profiling using microarrays or sequence-based technologies allows us to identify genes and genetic pathways whose expression patterns influence complex traits. Different methods to prioritize gene sets, such as the genes in a given molecular pathway, have been described. In many cases, these methods test one gene set at a time, and therefore do not consider overlaps among the pathways. Here, we present a Bayesian variable selection method to prioritize gene sets that overcomes this limitation by considering all gene sets simultaneously. We applied Bayesian variable selection to differential expression to prioritize the molecular and genetic pathways involved in the responses to Escherichia coli infection in Danish Holstein cows. Results We used a Bayesian variable selection method to prioritize Kyoto Encyclopedia of Genes and Genomes pathways. We used our data to study how the variable selection method was affected by overlaps among the pathways. In addition, we compared our approach to another that ignores the overlaps, and studied the differences in the prioritization. The variable selection method was robust to a change in prior probability and stable given a limited number of observations. Conclusions Bayesian variable selection is a useful way to prioritize gene sets while considering their overlaps. Ignoring the overlaps gives different and possibly misleading results. Additional procedures may be needed in cases of highly overlapping pathways that are hard to prioritize.
Management Model of Resources Equilibrium Distribution among Overlapping-Generations
Institute of Scientific and Technical Information of China (English)
Jiang Xuemin; Li Ling
2004-01-01
The overlapping-generations models that western scholars have designed from various perspectives to address different kinds of issues do not reflect China's emerging political and economic problems, and cannot be applied wholesale to China's practical situation. In this paper the authors incorporate some western scholars' research results into their own findings to present overlapping-generations model theory from a new perspective, establishing an overlapping-generations theory of population that articulates the concepts and theorems of biological generation, economic generation and social generation, the overlapping periods in biological generation, and two overlapping periods in economic generation among three generations. This management model with equilibrium distribution of resource wealth includes an overlapping-generations length model (δ), an equilibrium transfer model (θ), and a complete model of equilibrium distribution among generations (δ-θ). The model provides a quantitative basis for the creation of a resource management system and fills a theoretical gap in this discipline in China. Besides, it furnishes a new methodology and a manipulable tool for the Chinese government to establish a comprehensive management information bank for sectors such as economic trade, population, science and technology, education, human resources, natural resources and environment, agriculture, forestry, industry, mining and energy.
Characterizing Computation-Communication Overlap in Message-Passing Systems
Energy Technology Data Exchange (ETDEWEB)
David E. Bernholdt; Jarek Nieplocha; P. Sadayappan; Aniruddha G. Shet; Vinod Tipparaju
2008-01-31
Effective overlap of computation and communication is a well understood technique for latency hiding and can yield significant performance gains for applications on high-end computers. In this report, we describe an instrumentation framework developed for message-passing systems to characterize the degree of overlap of communication with computation in the execution of parallel applications. The inability to obtain precise time-stamps for pertinent communication events is a significant problem, and is addressed by generation of minimum and maximum bounds on achieved overlap. The overlap measures can aid application developers and system designers in investigating scalability issues. The approach has been used to instrument two MPI implementations as well as the ARMCI system. The implementation resides entirely within the communication library and thus integrates well with existing approaches that operate outside the library. The utility of the framework is demonstrated by analyzing communication-computation overlap for micro-benchmarks and the NAS benchmarks, and the insights obtained are used to modify the NAS SP benchmark, resulting in improved overlap.
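Why only minimum and maximum bounds are obtainable can be seen with a small calculation. Suppose that for one nonblocking operation we know the length of the window between posting and completion, the CPU time spent computing inside it, and the time the network needed, but not where within the window the transfer actually ran (all quantities hypothetical):

```python
def overlap_bounds(span, t_comp, t_net):
    """Bounds on achieved computation-communication overlap within one
    post-to-wait window of length `span`, given t_comp seconds of
    computation and t_net seconds of network activity somewhere in it.
    Without precise timestamps only these bounds can be reported."""
    max_overlap = min(t_comp, t_net)                 # perfectly aligned case
    min_overlap = max(0.0, t_comp + t_net - span)    # forced by pigeonhole
    return min_overlap, max_overlap
```

With a 10 s window, 6 s of computation and 7 s of network activity, at least 3 s and at most 6 s of the communication was overlapped; instrumentation can narrow, but not eliminate, this interval.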
Guo, Yuan; Kong, Qing-Cong; Zhu, Ye-Qing; Liu, Zhen-Zhen; Peng, Ling-Rong; Tang, Wen-Jie; Yang, Rui-Meng; Xie, Jia-Jun; Liu, Chun-Ling
2017-06-22
To evaluate the utility of the whole-lesion histogram apparent diffusion coefficient (ADC) for characterizing the heterogeneity of mucinous breast carcinoma (MBC) and to determine which ADC metrics may help to best differentiate subtypes of MBC. This retrospective study involved 52 MBC patients, including 37 pure MBC (PMBC) and 15 mixed MBC (MMBC). The PMBC patients were subtyped into PMBC-A (20 cases) and PMBC-B (17 cases) groups. All patients underwent preoperative diffusion-weighted imaging (DWI) at 1.5T, and whole-lesion ADC assessments were generated. Histogram-derived ADC parameters were compared between PMBC vs. MMBC and PMBC-A vs. PMBC-B, and receiver operating characteristic (ROC) curve analysis was used to determine optimal histogram parameters for differentiating these groups. The PMBC group exhibited significantly higher ADC values for the mean (P = 0.004), 25th (P = 0.004), 50th (P = 0.004), 75th (P = 0.006), and 90th percentiles (P = 0.013) and skewness (P = 0.021) than did the MMBC group. The 25th percentile of ADC values achieved the highest area under the curve (AUC) (0.792), with a cutoff value of 1.345 × 10⁻³ mm²/s, in distinguishing PMBC and MMBC. The PMBC-A group showed significantly higher ADC values for the mean (P = 0.049), 25th (P = 0.015), and 50th (P = 0.026) percentiles and skewness (P = 0.004) than did the PMBC-B group. The 25th percentile ADC cutoff value (1.476 × 10⁻³ mm²/s) demonstrated the best AUC (0.837) among the ADC values for distinguishing PMBC-A and PMBC-B. Whole-lesion ADC histogram analysis enables comprehensive evaluation of an MBC in its entirety and differentiation of MBC subtypes. Thus, it may be a helpful adjunct to conventional MRI. Level of Evidence: 4. Technical Efficacy: Stage 2. J. Magn. Reson. Imaging 2017. © 2017 International Society for Magnetic Resonance in Medicine.
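The histogram metrics compared above reduce to a few lines of arithmetic over the pooled per-voxel ADC values; a generic sketch (linear-interpolation percentiles and population skewness; the study's software may define these slightly differently):

```python
def adc_histogram_metrics(adc_values):
    """Whole-lesion histogram metrics of the kind compared in such studies:
    mean, 25th/50th/75th/90th percentiles, and skewness of the pooled
    per-voxel ADC values of a segmented lesion."""
    vals = sorted(adc_values)
    n = len(vals)

    def pct(p):  # linear-interpolation percentile
        k = (n - 1) * p / 100.0
        lo = int(k)
        hi = min(lo + 1, n - 1)
        return vals[lo] + (k - lo) * (vals[hi] - vals[lo])

    mean = sum(vals) / n
    sd = (sum((v - mean) ** 2 for v in vals) / n) ** 0.5
    skew = (sum((v - mean) ** 3 for v in vals) / n / sd ** 3) if sd else 0.0
    return {"mean": mean, "p25": pct(25), "p50": pct(50),
            "p75": pct(75), "p90": pct(90), "skewness": skew}
```

Low percentiles such as the 25th emphasise the most restricted-diffusion voxels in the lesion, which is why they can outperform the mean for heterogeneous tumours.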
Presentation of dynamically overlapping auditory messages in user interfaces
Energy Technology Data Exchange (ETDEWEB)
Papp, III, Albert Louis [Univ. of California, Davis, CA (United States)]
1997-09-01
This dissertation describes a methodology and example implementation for the dynamic regulation of temporally overlapping auditory messages in computer-user interfaces. The regulation mechanism exists to schedule numerous overlapping auditory messages in such a way that each individual message remains perceptually distinct from all others. The method is based on the research conducted in the area of auditory scene analysis. While numerous applications have been engineered to present the user with temporally overlapped auditory output, they have generally been designed without any structured method of controlling the perceptual aspects of the sound. The method of scheduling temporally overlapping sounds has been extended to function in an environment where numerous applications can present sound independently of each other. The Centralized Audio Presentation System is a global regulation mechanism that controls all audio output requests made from all currently running applications. The notion of multimodal objects is explored in this system as well. Each audio request that represents a particular message can include numerous auditory representations, such as musical motives and voice. The Presentation System scheduling algorithm selects the best representation according to the current global auditory system state, and presents it to the user within the request constraints of priority and maximum acceptable latency. The perceptual conflicts between temporally overlapping audio messages are examined in depth through the Computational Auditory Scene Synthesizer. At the heart of this system is a heuristic-based auditory scene synthesis scheduling method. Different schedules of overlapped sounds are evaluated and assigned penalty scores. High scores represent presentations that include perceptual conflicts between overlapping sounds. Low scores indicate fewer and less serious conflicts. A user study was conducted to validate that the perceptual difficulties predicted by
Benke, Stephan; Nettels, Daniel; Hofmann, Hagen; Schuler, Benjamin
2017-03-17
Single-molecule fluorescence spectroscopy is a powerful approach for probing biomolecular structure and dynamics, including protein folding. For the investigation of nonequilibrium kinetics, Förster resonance energy transfer combined with confocal multiparameter detection has proven particularly versatile, owing to the large number of observables and the broad range of accessible timescales, especially in combination with rapid microfluidic mixing. However, a comprehensive kinetic analysis of the resulting time series of transfer efficiency histograms and complementary observables can be challenging owing to the complexity of the data. Here we present and compare three different methods for the analysis of such kinetic data: singular value decomposition, multivariate curve resolution with alternating least square fitting, and model-based peak fitting, where an explicit model of both the transfer efficiency histogram of each species and the kinetic mechanism of the process is employed. While each of these methods has its merits for specific applications, we conclude that model-based peak fitting is most suitable for a quantitative analysis and comparison of kinetic mechanisms.
Angulo, Antonio; Ferrer, Jose; Pinto, Joseph; Lavarello, Roberto; Guerrero, Jorge; Castaneda, Benjamín
2015-01-01
Breast parenchymal density is considered a strong indicator of cancer risk. However, measures of breast density are often qualitative and require the subjective judgment of radiologists. This work proposes a supervised algorithm to automatically assign a BI-RADS breast density score to a digital mammogram. The algorithm applies principal component analysis to the histograms of a training dataset of digital mammograms to create four different spaces, one for each BI-RADS category. Scoring is achieved by projecting the histogram of the image to be classified onto the four spaces and assigning it to the closest class. In order to validate the algorithm, a training set of 86 images and a separate testing database of 964 images were built. All mammograms were acquired in the craniocaudal view from female patients without any visible pathology. Eight experienced radiologists categorized the mammograms according to a BI-RADS score, and the mode of their evaluations was considered as ground truth. Results show better agreement between the algorithm and ground truth for the training set (kappa=0.74) than for the test set (kappa=0.44), which suggests the method may be used for BI-RADS classification but that better training is required.
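The per-class PCA scheme this abstract describes (one subspace per BI-RADS category, assignment to the closest space) can be sketched with plain numpy. The function names are mine, and measuring "closest" by the residual distance to each subspace is one reasonable reading of the abstract, not necessarily the authors' exact rule.

```python
import numpy as np

def fit_class_spaces(histograms_by_class, n_components=1):
    """One PCA subspace per BI-RADS category, fit from training histograms."""
    spaces = {}
    for label, hists in histograms_by_class.items():
        X = np.asarray(hists, dtype=float)
        mean = X.mean(axis=0)
        # Principal axes of the centered training histograms via SVD
        _, _, vt = np.linalg.svd(X - mean, full_matrices=False)
        spaces[label] = (mean, vt[:n_components])
    return spaces

def classify_histogram(hist, spaces):
    """Assign the class whose subspace reconstructs the histogram best."""
    h = np.asarray(hist, dtype=float)
    best_label, best_err = None, np.inf
    for label, (mean, comps) in spaces.items():
        c = h - mean
        recon = comps.T @ (comps @ c)        # projection onto the subspace
        err = np.linalg.norm(c - recon)      # residual distance to the space
        if err < best_err:
            best_label, best_err = label, err
    return best_label
```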
Pg Haji Mohd Ariffin, Ak Muhamad Amirul Irfan
2015-01-01
This paper presents the project that I was tasked with while attending a three-month Summer Programme at CERN. The project specification is to analyse the weekly data produced by the Compact Muon Solenoid (CMS) in the form of histograms. CMS is a multi-purpose detector operated at the Large Hadron Collider (LHC) at CERN. It yields head-on collisions of two proton (ion) beams of 7 TeV (2.75 TeV per nucleon) each, with a design luminosity of 10^34 cm^-2 s^-1. A comparison of the results is then made using two methods, namely the Kolmogorov-Smirnov statistic test and the chi-squared test. These tests are elaborated in the subsequent paragraphs. To execute this project, I first had to study basic computer programming, in particular C++ and the ROOT framework, to ensure the tasks given could be resolved within the given time. A program was subsequently written to produce the output histograms and the calculation of the Kolmogorov-Smirnov test and Ch...
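The two comparison methods this record names, a Kolmogorov-Smirnov test and a chi-squared test on binned counts, can be sketched with scipy. This is a generic illustration of the statistics, not the ROOT/C++ code used in the project; the function name and binning choices are mine.

```python
import numpy as np
from scipy import stats

def compare_histograms(sample_a, sample_b, bins=10):
    """Compare two samples as in weekly data-quality checks: a
    Kolmogorov-Smirnov test on the raw values and a chi-squared test
    on binned counts over a common range."""
    ks_stat, ks_p = stats.ks_2samp(sample_a, sample_b)
    lo = min(np.min(sample_a), np.min(sample_b))
    hi = max(np.max(sample_a), np.max(sample_b))
    counts_a, _ = np.histogram(sample_a, bins=bins, range=(lo, hi))
    counts_b, _ = np.histogram(sample_b, bins=bins, range=(lo, hi))
    mask = counts_b > 0                      # drop empty expected bins
    obs = counts_a[mask]
    # Scale expected counts so both totals match, as chisquare requires
    exp = counts_b[mask] * obs.sum() / counts_b[mask].sum()
    chi2_stat, chi2_p = stats.chisquare(obs, exp)
    return {"ks_p": ks_p, "chi2_p": chi2_p}
```

A high p-value means the two histograms are statistically compatible; a low one flags a change worth inspecting.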
Directory of Open Access Journals (Sweden)
Andreas Behr
2006-06-01
The histogram location approach was proposed by Kahn (1997) to estimate the fraction of wage cuts prevented by downward nominal wage rigidity. In this paper, we analyze the validity of the approach by means of a simulation study, which yielded evidence of unbiasedness but also of potential underestimation of rigidity parameter uncertainty and therefore of potentially anticonservative inference. We apply the histogram location approach to estimate the extent of downward nominal wage rigidity across the EU for 1995-2001. Our data base is the User Data Base (UDB) of the European Community Household Panel (ECHP). The results show wide variation in the fraction of wage cuts prevented by nominal wage rigidity across the EU. The lowest rigidity parameters are found for the UK, Spain and Ireland, the largest for Portugal and Italy. Analyzing the change of rigidity between the sub-periods 1995-1997 and 1999-2001 even shows a widening of the differences in nominal wage rigidity. Given these large differences, the results imply that the costs of low-inflation policies differ substantially across the EU.
Bartels, P C; Schoorl, M; Lombarts, A J
1997-11-01
Screening for pseudothrombocytopenia caused by in vitro platelet clumping has been performed in 45,000 subjects attending a general hospital. In our region, the observed prevalence of EDTA-induced pseudothrombocytopenia in blood samples with an initial platelet count below 150 x 10(9)/l was estimated to amount to 0.1%. EDTA-induced pseudothrombocytopenia was confirmed by detection of platelet aggregates by means of microscopic evaluation from the blood smear. In routine investigations, pseudothrombocytopenia could be highly suspected when the Sysmex NE 8000 showed characteristic peculiarities in the white blood cell (WBC) scattergram and histogram. Platelet aggregation is avoided in such cases by the use of citrate as an anticoagulant instead of EDTA. Pseudothrombocytopenia was detected in 46 subjects. As a screening test for pseudothrombocytopenia, increased cut-off values derived from the WBC histogram demonstrated 90% sensitivity and 100% specificity. Automated flagging for platelet clumps, deviations reflecting MPV, or PDW abnormalities revealed lower scores with respect to sensitivity.
Domengie, F.; Morin, P.; Bauza, D.
2015-07-01
We propose a model for dark current induced by metallic contamination in a CMOS image sensor. Based on Shockley-Read-Hall kinetics, the expression of dark current proposed accounts for the electric field enhanced emission factor due to the Poole-Frenkel barrier lowering and phonon-assisted tunneling mechanisms. To that aim, we considered the distribution of the electric field magnitude and metal atoms in the depth of the pixel. Poisson statistics were used to estimate the random distribution of metal atoms in each pixel for a given contamination dose. Then, we performed a Monte-Carlo-based simulation for each pixel to set the number of metal atoms the pixel contained and the enhancement factor each atom underwent, and obtained a histogram of the number of pixels versus dark current for the full sensor. Excellent agreement with the dark current histogram measured on an ion-implanted gold-contaminated imager has been achieved, in particular, for the description of the distribution tails due to the pixel regions in which the contaminant atoms undergo a large electric field. The agreement remains very good when increasing the temperature by 15 °C. We demonstrated that the amplification of the dark current generated for the typical electric fields encountered in the CMOS image sensors, which depends on the nature of the metal contaminant, may become very large at high electric field. The electron and hole emissions and the resulting enhancement factor are described as a function of the trap characteristics, electric field, and temperature.
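The Poisson-per-pixel, per-atom-enhancement structure of this model lends itself to a compact Monte Carlo. In the sketch below, the log-normal factor is only a stand-in for the field-dependent Poole-Frenkel/tunneling enhancement of the paper, chosen to reproduce a heavy-tailed dark-current histogram; all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

def dark_current_histogram(n_pixels=50_000, mean_atoms=0.5,
                           base_rate=1.0, tail=2.0):
    """Per-pixel Monte Carlo: a Poisson number of contaminant atoms per
    pixel, and per atom a random emission-enhancement factor (log-normal
    here, standing in for the field-dependent enhancement)."""
    atoms = rng.poisson(mean_atoms, size=n_pixels)
    dark = np.zeros(n_pixels)
    for i in np.nonzero(atoms)[0]:
        # Sum the enhanced emission of every atom in this pixel
        factors = rng.lognormal(mean=0.0, sigma=tail, size=atoms[i])
        dark[i] = base_rate * factors.sum()
    counts, edges = np.histogram(dark, bins=50)
    return dark, counts, edges
```

Pixels with zero atoms contribute the spike at zero; the distribution tail comes from pixels whose atoms draw large enhancement factors, mirroring the high-field regions discussed above.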
Hierarchical Overlapping Clustering of Network Data Using Cut Metrics
Gama, Fernando; Ribeiro, Alejandro
2016-01-01
A novel method to obtain hierarchical and overlapping clusters from network data (i.e., a set of nodes endowed with pairwise dissimilarities) is presented. The introduced method is hierarchical in the sense that it outputs a nested collection of groupings of the node set depending on the resolution or degree of similarity desired, and it is overlapping since it allows nodes to belong to more than one group. Our construction is rooted in the facts that a hierarchical (non-overlapping) clustering of a network can be equivalently represented by a finite ultrametric space and that a convex combination of ultrametrics results in a cut metric. By applying a hierarchical (non-overlapping) clustering method to multiple dithered versions of a given network and then convexly combining the resulting ultrametrics, we obtain a cut metric associated to the network of interest. We then show how to extract a hierarchical overlapping clustering structure from the aforementioned cut metric. Furthermore, the so-called overlappi...
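The dither-and-average construction can be sketched with scipy's hierarchical clustering: the cophenetic distances of a dendrogram form an ultrametric, and a convex (here uniform) combination of ultrametrics from dithered inputs yields a cut metric. Single linkage and the dither amplitude below are assumed illustrative choices, not the paper's.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, cophenet
from scipy.spatial.distance import squareform

rng = np.random.default_rng(seed=0)

def dithered_cut_metric(dissim, n_dither=20, noise=0.05):
    """Average the ultrametrics (cophenetic distances) of hierarchical
    clusterings of dithered copies of a dissimilarity matrix."""
    d = squareform(np.asarray(dissim, dtype=float), checks=False)
    ultrametrics = []
    for _ in range(n_dither):
        jitter = rng.uniform(0.0, noise, size=d.shape)
        Z = linkage(d + jitter, method="single")   # dithered clustering
        ultrametrics.append(cophenet(Z))           # its ultrametric
    # Uniform convex combination of the ultrametrics -> a cut metric
    return squareform(np.mean(ultrametrics, axis=0))
```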
Energy Technology Data Exchange (ETDEWEB)
Song, Yong Sub; Choi, Seung Hong; Park, Chul Kee [Seoul National University College of Medicine, Seoul (Korea, Republic of); and others
2013-08-15
The purpose of this study was to differentiate true progression from pseudoprogression of glioblastomas treated with concurrent chemoradiotherapy (CCRT) with temozolomide (TMZ) by using histogram analysis of apparent diffusion coefficient (ADC) and normalized cerebral blood volume (nCBV) maps. Twenty patients with histopathologically proven glioblastoma who had received CCRT with TMZ underwent perfusion-weighted imaging and diffusion-weighted imaging (b = 0, 1000 sec/mm^2). The corresponding nCBV and ADC maps for the newly visible, entirely enhancing lesions were calculated after the completion of CCRT with TMZ. Two observers independently measured the histogram parameters of the nCBV and ADC maps. The histogram parameters between the true progression group (n = 10) and the pseudoprogression group (n = 10) were compared by use of an unpaired Student's t test and subsequent multivariable stepwise logistic regression analysis to determine the best predictors for the differential diagnosis between the two groups. Receiver operating characteristic analysis was employed to determine the best cutoff values for the histogram parameters that proved to be significant predictors for differentiating true progression from pseudoprogression. The intraclass correlation coefficient was used to determine the level of inter-observer reliability for the histogram parameters. The 5th percentile value (C5) of the cumulative ADC histograms was a significant predictor for the differential diagnosis between true progression and pseudoprogression (p = 0.044 for observer 1; p = 0.011 for observer 2). Optimal cutoff values of 892 x 10^-6 mm^2/sec for observer 1 and 907 x 10^-6 mm^2/sec for observer 2 could help differentiate between the two groups with a sensitivity of 90% and 80%, respectively, a specificity of 90% and 80%, respectively, and an area under the curve of 0.880 and 0.840, respectively. There was no other significant differentiating parameter on the n
A Bayesian variable selection procedure for ranking overlapping gene sets
DEFF Research Database (Denmark)
Skarman, Axel; Mahdi Shariati, Mohammad; Janss, Luc
2012-01-01
... described. In many cases, these methods test one gene set at a time, and therefore do not consider overlaps among the pathways. Here, we present a Bayesian variable selection method to prioritize gene sets that overcomes this limitation by considering all gene sets simultaneously. We applied Bayesian variable selection to differential expression to prioritize the molecular and genetic pathways involved in the responses to Escherichia coli infection in Danish Holstein cows. Results: We used a Bayesian variable selection method to prioritize Kyoto Encyclopedia of Genes and Genomes pathways. We used our data to study how the variable selection method was affected by overlaps among the pathways. In addition, we compared our approach to another that ignores the overlaps, and studied the differences in the prioritization. The variable selection method was robust to a change in prior probability...
Overlapping community detection in weighted networks via a Bayesian approach
Chen, Yi; Wang, Xiaolong; Xiang, Xin; Tang, Buzhou; Chen, Qingcai; Fan, Shixi; Bu, Junzhao
2017-02-01
Complex networks as a powerful way to represent complex systems have been widely studied during the past several years. One of the most important tasks of complex network analysis is to detect communities embedded in networks. In the real world, weighted networks are very common and may contain overlapping communities where a node is allowed to belong to multiple communities. In this paper, we propose a novel Bayesian approach, called the Bayesian mixture network (BMN) model, to detect overlapping communities in weighted networks. The advantages of our method are (i) providing soft-partition solutions in weighted networks; (ii) providing soft memberships, which quantify 'how strongly' a node belongs to a community. Experiments on a large number of real and synthetic networks show that our model has the ability in detecting overlapping communities in weighted networks and is competitive with other state-of-the-art models at shedding light on community partition.
Iterative methods for overlap and twisted mass fermions
Energy Technology Data Exchange (ETDEWEB)
Chiarappa, T. [Univ. di Milano Bicocca (Italy); Jansen, K.; Shindler, A.; Wetzorke, I. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Nagai, K.I. [Wuppertal Univ. (Gesamthochschule) (Germany). Fachbereich Physik; Papinutto, M. [INFN Sezione di Roma Tre, Rome (Italy); Scorzato, L. [European Centre for Theoretical Studies in Nuclear Physics and Related Areas (ECT), Villazzano (Italy); Urbach, C. [Liverpool Univ. (United Kingdom). Dept. of Mathematical Sciences; Wenger, U. [ETH Zuerich (Switzerland). Inst. fuer Theoretische Physik
2006-09-15
We present a comparison of a number of iterative solvers of linear systems of equations for obtaining the fermion propagator in lattice QCD. In particular, we consider chirally invariant overlap and chirally improved Wilson (maximally) twisted mass fermions. The comparison of both formulations of lattice QCD is performed at four fixed values of the pion mass between 230 MeV and 720 MeV. For overlap fermions we address adaptive precision and low mode preconditioning while for twisted mass fermions we discuss even/odd preconditioning. Taking the best available algorithms in each case we find that calculations with the overlap operator are by a factor of 30-120 more expensive than with the twisted mass operator. (orig.)
Piles, Tabs and Overlaps in Navigation among Documents
DEFF Research Database (Denmark)
Jakobsen, Mikkel Rønne; Hornbæk, Kasper Anders Søren
2010-01-01
Navigation among documents is a frequent, but ill supported activity. Overlapping or tabbed documents are widespread, but they offer limited visibility of their content. We explore variations on navigation support: arranging documents with tabs, as overlapping windows, and in piles. In an experiment we compared 11 participants' navigation with these variations and found strong task effects. Overall, overlapping windows were preferred and their structured layout worked well with some tasks. Surprisingly, tabbed documents were efficient in tasks requiring simply finding a document. Piled documents worked well for tasks that involved visual features of the documents, but the utility of recency or stable ordering of documents was task dependent. Based on the results, we discuss the effects of spatial arrangement, visibility, and task-dependency, and suggest areas for future research.
Climate-induced range overlap among closely related species
Krosby, Meade; Wilsey, Chad B.; McGuire, Jenny L.; Duggan, Jennifer M.; Nogeire, Theresa M.; Heinrichs, Julie A.; Tewksbury, Joshua J.; Lawler, Joshua J.
2015-09-01
Contemporary climate change is causing large shifts in biotic distributions, which has the potential to bring previously isolated, closely related species into contact. This has led to concern that hybridization and competition could threaten species persistence. Here, we use bioclimatic models to show that future range overlap by the end of the century is predicted for only 6.4% of isolated, congeneric species pairs of New World birds, mammals and amphibians. Projected rates of climate-induced overlap are higher for birds (11.6%) than for mammals (4.4%) or amphibians (3.6%). As many species will have difficulty tracking shifting climates, actual rates of future overlap are likely to be far lower, suggesting that hybridization and competition impacts may be relatively modest.
Stitching interferometry of a full cylinder without using overlap areas
Peng, Junzheng; Chen, Dingfu; Yu, Yingjie
2017-08-01
Traditional stitching interferometry requires finding out the overlap correspondence and computing the discrepancies in the overlap regions, which makes it complex and time-consuming to obtain the 360° form map of a cylinder. In this paper, we develop a cylinder stitching model based on a new set of orthogonal polynomials, termed Legendre Fourier (LF) polynomials. With these polynomials, individual subaperture data can be expanded as a composition of the inherent form of a partial cylinder surface and additional misalignment parameters. Then the 360° form map can be acquired by simultaneously fitting all subaperture data with the LF polynomials. A metal shaft was measured to experimentally verify the proposed method. In contrast to traditional stitching interferometry, our technique does not require overlapping of adjacent subapertures, thus significantly reducing the measurement time and making the stitching algorithm simple.
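The joint fit of all subaperture samples against a product basis can be sketched as follows. The simplified basis below (Legendre polynomials in the axial coordinate times Fourier harmonics in azimuth) and the plain least-squares solve show the mechanics only; the paper's actual LF polynomials and their treatment of per-subaperture misalignment terms may differ, and all names here are mine.

```python
import numpy as np

def lf_basis(z, theta, n_legendre=3, n_fourier=2):
    """Products of Legendre polynomials in the (scaled) axial coordinate
    z in [-1, 1] and Fourier harmonics in the azimuth theta."""
    cols = []
    for n in range(n_legendre):
        Pn = np.polynomial.legendre.Legendre.basis(n)(z)
        cols.append(Pn)                            # m = 0 term
        for m in range(1, n_fourier + 1):
            cols.append(Pn * np.cos(m * theta))
            cols.append(Pn * np.sin(m * theta))
    return np.column_stack(cols)

def fit_cylinder(z, theta, heights):
    """One simultaneous least-squares fit of all subaperture samples."""
    A = lf_basis(z, theta)
    coeffs, *_ = np.linalg.lstsq(A, heights, rcond=None)
    return coeffs, A @ coeffs
```

Because every sample enters one global fit, no overlap regions between adjacent subapertures are needed, which is the point of the method described above.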
Overlap syndromes of autoimmune hepatitis: an open question.
Durazzo, Marilena; Premoli, Alberto; Paschetta, Elena; Belci, Paola; Spandre, Maurizio; Bo, Simona
2013-02-01
The term "overlap syndromes" of liver diseases covers the coexistence of autoimmune hepatitis, primary biliary cirrhosis, and primary sclerosing cholangitis. These syndromes often represent a diagnostic and therapeutic challenge for hepatologists; it remains unclear whether these overlap syndromes form distinct entities or are only variants of the major autoimmune liver diseases. The most frequently reported association occurs between autoimmune hepatitis and primary biliary cirrhosis, whereas the overlap between autoimmune hepatitis and primary sclerosing cholangitis is less frequent, typically occurs at a young age, and is often associated with inflammatory bowel disease. The therapy of choice is based on ursodeoxycholic acid and immunosuppressive drugs, used at the same time or consecutively, according to the course of disease. The diagnostic scores for autoimmune hepatitis can help with diagnosis, even though their definitive soundness is lacking.
Overlapped flowers yield detection using computer-based interface
Directory of Open Access Journals (Sweden)
Anuradha Sharma
2016-09-01
Precision agriculture has always depended on accurate and timely information about agricultural products. With the help of computer hardware and software technology, a decision support system that generates flower yield information can serve as a base for management and planning of flower marketing. Despite such technologies, some problems still arise; for example, the colour homogeneity of a specimen cannot be obtained similar to the actual colour of the image, and flowers in the image overlap. This paper discusses the implementation of a new 'counting algorithm' for overlapped flowers. The algorithm uses techniques and operations such as colour image segmentation in the HSV colour space and morphological operations. Two of the most popular colour spaces are used: RGB and HSV. The HSV colour space decouples brightness from the chromatic component in the image, which provides better results in cases of occlusion and overlap.
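The segmentation-and-morphology part of such a pipeline can be sketched with scipy. The hue band, saturation/value thresholds, and minimum blob size below are assumed values for illustration; splitting touching blobs, the overlap problem the paper targets, would need a further step such as watershed on the distance transform.

```python
import numpy as np
from scipy import ndimage

def count_flowers_hsv(hsv_img, hue_range=(0.9, 1.0), min_size=20):
    """Threshold an HSV image in an assumed flower hue band, clean the
    mask with a morphological opening, and count connected components."""
    h, s, v = hsv_img[..., 0], hsv_img[..., 1], hsv_img[..., 2]
    mask = (h >= hue_range[0]) & (h <= hue_range[1]) & (s > 0.3) & (v > 0.2)
    mask = ndimage.binary_opening(mask, structure=np.ones((3, 3)))
    labels, n = ndimage.label(mask)
    sizes = ndimage.sum(mask, labels, range(1, n + 1))
    return int(np.sum(sizes >= min_size))          # drop small specks
```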
Detecting highly overlapping community structure by greedy clique expansion
Lee, Conrad; McDaid, Aaron; Hurley, Neil
2010-01-01
In complex networks it is common for each node to belong to several communities, implying a highly overlapping community structure. Recent advances in benchmarking indicate that existing community assignment algorithms, capable of detecting overlapping communities, perform well only when the extent of community overlap is kept to modest levels. To overcome this limitation, we introduce a new community assignment algorithm called Greedy Clique Expansion (GCE). The algorithm identifies distinct cliques as seeds and expands these seeds by greedily optimizing a local fitness function. We perform extensive benchmarks on synthetic data to demonstrate that GCE's good performance is robust across diverse graph topologies. Significantly, GCE is the only algorithm to perform well on these synthetic graphs, in which every node belongs to multiple communities. Furthermore, when put to the task of identifying functional modules in protein interaction data, and college dorm assignments in Facebook friendship data, we find ...
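The seed-and-expand idea behind GCE can be shown in a toy form: take triangles as clique seeds and greedily grow each one while a local fitness improves. The fitness f(C) = k_in / (k_in + k_out)^alpha follows the common local-community form; real GCE also filters seeds by size and prunes near-duplicate communities, which this sketch omits.

```python
from itertools import combinations

def gce(adj, alpha=1.0):
    """Toy Greedy Clique Expansion on an adjacency dict {node: set(nbrs)}."""
    def fitness(C):
        k_in = sum(1 for u, v in combinations(C, 2) if v in adj[u])
        k_out = sum(1 for u in C for v in adj[u] if v not in C)
        return k_in / (k_in + k_out) ** alpha if k_in + k_out else 0.0

    # Seeds: all triangles (3-cliques) in the graph
    seeds = {frozenset(t) for t in combinations(adj, 3)
             if all(b in adj[a] for a, b in combinations(t, 2))}
    communities = set()
    for seed in seeds:
        C = set(seed)
        while True:
            frontier = {v for u in C for v in adj[u]} - C
            best = max(frontier, key=lambda v: fitness(C | {v}), default=None)
            if best is None or fitness(C | {best}) <= fitness(C):
                break                      # no neighbor improves the fitness
            C.add(best)
        communities.add(frozenset(C))
    return communities
```

Because every seed expands independently, a node shared by several seeds can end up in several communities, which is how the overlap arises.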
The eigSUMR inverter for overlap fermion
Cundy, Nigel
2015-01-01
We discuss the usage and applicability of deflation methods for the overlap lattice Dirac operator, focussing on calculating the eigenvalues using a method similar to the eigCG algorithm used for other Dirac operators. The overlap operator, which has several theoretical advantages over other formulations of lattice Quantum Chromodynamics, is more computationally expensive because it requires the computation of the matrix sign function. The principal change made compared to deflation methods for other formulations of lattice QCD is that it is necessary for best performance to tune or relax the accuracy of the matrix sign function as the computation proceeds. We adapt the eigCG algorithm for two inversion algorithms for overlap fermions, GMRESR(relCG) and GMRESR(relSUMR). Before deflation, the rate of convergence of these routines in terms of iterations is similar, but, since the Shifted Unitary Minimal Residual (SUMR) algorithm only requires one call to the matrix sign function compared to the two calls r...
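For orientation, the matrix sign function at the heart of the overlap operator is exactly computable for a tiny dense Hermitian matrix, as below. Lattice-scale codes cannot afford this and instead use rational or polynomial approximations plus deflation of the low modes, and it is the accuracy of that approximation that the relaxation strategies above tune; this numpy sketch only illustrates the defining property sign(H)^2 = 1.

```python
import numpy as np

def matrix_sign(H):
    """Exact matrix sign of a Hermitian H via eigendecomposition:
    sign(H) = V sign(Lambda) V^H."""
    w, V = np.linalg.eigh(H)
    return (V * np.sign(w)) @ V.conj().T   # rescale columns by sign(eigval)
```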
Toxic epidermal necrolysis, DRESS, AGEP: Do overlap cases exist?
Directory of Open Access Journals (Sweden)
Bouvresse Sophie
2012-09-01
Background: Severe cutaneous adverse reactions to drugs (SCARs) include acute generalized exanthematous pustulosis (AGEP), drug reaction with eosinophilia and systemic symptoms (DRESS) and epidermal necrolysis (Stevens-Johnson syndrome–toxic epidermal necrolysis [SJS-TEN]). Because of the varied initial presentation of such adverse drug reactions, diagnosis may be difficult and suggests overlap among SCARs. Overlapping SCARs are defined as cases fulfilling the criteria for definite or probable diagnosis of at least 2 ADRs according to scoring systems for AGEP, DRESS and SJS-TEN. We aimed to evaluate the prevalence of overlap among SCARs among cases in the referral hospital in France. Methods: We retrospectively analyzed data for 216 patients hospitalized in the referral centre over 7 years with a discharge diagnosis of AGEP (n = 45), DRESS (n = 47), SJS-TEN (n = 80) or "drug rash" (n = 44). Each case with detailed clinical data and a skin biopsy specimen was scored for AGEP, DRESS and SJS-TEN by use of diagnostic scores elaborated by the RegiSCAR group. Results: In total, 45 of 216 cases (21%) had at least 2 possible diagnoses: 35 had a single predominant diagnosis (definite or probable), 7 had several possible diagnoses and 3 (2.1% of 145 confirmed SCARs) were overlap SCARs. Conclusions: Despite ambiguities among SCARs, confirmed overlap cases are rare. This study did not avoid pitfalls linked to its retrospective nature and selection bias. In the acute stage of disease, early identification of severe ADRs can be difficult because of clinical or biologic overlapping features and missing data on histology, biology and evolution. Retrospectively analyzing cases by use of diagnostic algorithms can lead to reliable discrimination among AGEP, DRESS and SJS-TEN.
Overlapping illusions by transformation optics without any negative refraction material
Sun, Fei; He, Sailing
2016-01-01
A novel method to achieve an overlapping illusion without any negative refraction index material is introduced with the help of the optic-null medium (ONM) designed by an extremely stretching spatial transformation. Unlike the previous methods to achieve such an optical illusion by transformation optics (TO), our method can achieve a power combination and reshape the radiation pattern at the same time. Unlike the overlapping illusion with some negative refraction index material, our method is not sensitive to the loss of the materials. Other advantages over existing methods are discussed. Numerical simulations are given to verify the performance of the proposed devices.
Overlaps after quantum quenches in the sine-Gordon model
Horváth, D. X.; Takács, G.
2017-08-01
We present a numerical computation of overlaps in mass quenches in sine-Gordon quantum field theory using truncated conformal space approach (TCSA). To improve the cut-off dependence of the method, we use a novel running coupling definition which has a general applicability in free boson TCSA. The numerical results for the first breather overlaps are compared with the analytic continuation of a previously proposed analytical Ansatz for the initial state in a related sinh-Gordon quench, and good agreement is found between the numerical data and the analytical prediction in a large energy range.
QCD thermodynamics with continuum extrapolated dynamical overlap fermions
Borsanyi, Sz; Lippert, T; Nogradi, D; Pittler, F; Szabo, K K; Toth, B C
2015-01-01
We study the finite temperature transition in QCD with two flavors of dynamical fermions at a pseudoscalar pion mass of about 350 MeV. We use lattices with temporal extent of $N_t$=8, 10 and 12. For the first time in the literature a continuum limit is carried out for several observables with dynamical overlap fermions. These findings are compared with results obtained within the staggered fermion formalism at the same pion masses and extrapolated to the continuum limit. The presented results correspond to fixed topology and its effect is studied in the staggered case. Nice agreement is found between the overlap and staggered results.
Neurofibromatosis type 1 with overlap Turner syndrome and Klinefelter syndrome.
Hatipoglu, Nihal; Kurtoglu, Selim; Kendirci, Mustafa; Keskin, Mehmet; Per, Hüseyin
2010-02-01
Turner's syndrome is a sex chromosome disorder. Klinefelter's syndrome is one of the most severe genetic diseases. Neurofibromatosis is an autosomal dominant disorder characterized by café-au-lait spots and fibromatous tumors of the skin. In this article, we report the overlap of neurofibromatosis type 1 with Turner and Klinefelter syndromes, showing that these disorders may overlap within the same patient. Based on these cases, we suggest that each patient with Turner-like or Klinefelter-like symptoms be carefully examined for café-au-lait macules before the initiation of hormone replacement treatment.
Characteristics and self-rated health of overlap syndrome
Directory of Open Access Journals (Sweden)
Chung JW
2014-07-01
Jung Wha Chung,1 Kyoung Ae Kong,2 Jin Hwa Lee,1 Seok Jeong Lee,1 Yon Ju Ryu,1 Jung Hyun Chang1; 1Division of Pulmonary and Critical Care Medicine, Department of Internal Medicine, School of Medicine, Ewha Womans University, 2Clinical Trial Center, Ewha Womans University Mokdong Hospital, Seoul, Korea. Background and objective: Overlap syndrome shares features of both asthma and chronic obstructive pulmonary disease (COPD). The aim of this study was to investigate characteristics of overlap syndrome and their effect on self-rated health (SRH). Methods: We analyzed data from the Fourth Korea National Health and Nutrition Examination Survey of 2007–2009. Subjects with acceptable spirometry and available wheezing history were included. Subjects were classified into four groups based on forced expiratory volume in one second (FEV1)/forced vital capacity (FVC) results and the presence or absence of self-reported wheezing for the previous 12 months: 1) COPD group, defined as having FEV1/FVC <0.7 without self-reported wheezing history; 2) asthma group, defined as having self-reported wheezing history without FEV1/FVC <0.7; 3) overlap syndrome group, having both FEV1/FVC <0.7 and wheezing history; and 4) non-obstructive disease (NOD) group, having neither FEV1/FVC <0.7 nor self-reported wheezing. SRH was categorized as better or lower based on responses to a questionnaire. Results: From a total of 9,104 subjects, 700 were assigned to the COPD group, 560 to the asthma group, 210 to the overlap syndrome group, and 7,634 to the NOD group. Compared to the other groups, subjects in the overlap syndrome group were more likely to have low lung function, a high proportion of smokers, low socioeconomic status, short education duration, lower SRH, and a past diagnosis of pulmonary tuberculosis or bronchiectasis. Multiple logistic regression analysis revealed that both the overlap syndrome and asthma groups were independently associated with lower SRH after adjustment for age, sex
Experimental characterization of Raman overlaps between mode-groups
DEFF Research Database (Denmark)
Christensen, Erik Nicolai; Koefoed, Jacob Gade; Friis, Søren Michael Mørk;
2016-01-01
…-equalized gain. In this paper, we present an experimental characterization of the intermodal Raman intensity overlaps of a few-mode fiber using backward-pumped Raman amplification. By varying the input pump power and the degree of higher-order mode excitation for the pump and the signal in a 10 km long two-mode fiber, we are able to characterize all intermodal Raman intensity overlaps. Using these results, we perform a Raman amplification measurement and demonstrate a mode-differential gain of only 0.25 dB per 10 dB overall gain. This is, to the best of our knowledge, the lowest mode-differential gain achieved…
Detect overlapping and hierarchical community structure in networks
Shen, Huawei; Cai, Kai; Hu, Mao-Bin
2008-01-01
Clustering and community structure are crucial for many network systems and the related dynamic processes. It has been shown that communities are usually overlapping and hierarchical. However, previous methods investigate these two properties of community structure separately. This paper proposes an algorithm (EAGLE) to detect both the overlapping and hierarchical properties of complex community structure together. The algorithm deals with the set of maximal cliques and adopts an agglomerative framework. The quality function of modularity is extended to evaluate the goodness of a cover. Applications to real-world networks give excellent results.
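Illustrative sketch, not taken from the paper: a cover (a set of possibly overlapping communities) is commonly scored with an extended modularity in which each node pair's contribution is down-weighted by the number of communities each node belongs to. A minimal stdlib implementation of that quality function, under the assumption that this is the form of extension meant:

```python
from itertools import product

def extended_modularity(adj, cover):
    """Extended modularity EQ of a cover of an undirected graph.

    adj   -- dict mapping node -> set of neighbours
    cover -- list of sets of nodes (communities may overlap)

    Each pair contribution [A_ij - k_i*k_j/(2m)] is divided by
    O_i*O_j, where O_v counts the communities containing v.
    """
    m = sum(len(nbrs) for nbrs in adj.values()) / 2  # edge count
    deg = {v: len(nbrs) for v, nbrs in adj.items()}
    O = {v: sum(v in c for c in cover) for v in adj}
    eq = 0.0
    for c in cover:
        for i, j in product(c, repeat=2):
            a_ij = 1.0 if j in adj[i] else 0.0
            eq += (a_ij - deg[i] * deg[j] / (2 * m)) / (O[i] * O[j])
    return eq / (2 * m)
```

On two triangles sharing one node, with the natural cover of the two triangles, this evaluates to 1/6.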
Institute of Scientific and Technical Information of China (English)
吉书鹏; 丁小青
2003-01-01
Image enhancement methods are typically aimed at improving the overall visibility of features. Although histogram equalization can enhance contrast by redistributing the gray levels, it has the drawback of reducing the information content of the processed image. In this paper, we present a new image enhancement algorithm. After histogram equalization is carried out, morphological filters and a wavelet-based enhancement algorithm are used to clean out unwanted details, further enhance the image, and compensate for the information loss incurred during histogram equalization. Experimental results show that the combined morphological-filter and wavelet-based histogram equalization algorithm can significantly enhance contrast and increase the information entropy of the image.
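For reference, the histogram equalization step the abstract builds on can be sketched in a few lines (textbook CDF remapping; the function name and 256-level default are illustrative):

```python
def equalize_histogram(pixels, levels=256):
    """Classic histogram equalization on a flat list of gray
    values in [0, levels): map each level through the normalized
    cumulative histogram so output levels spread over the range."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    n = len(pixels)
    # cumulative distribution function of the gray levels
    cdf, total = [], 0
    for h in hist:
        total += h
        cdf.append(total)
    cdf_min = next(c for c in cdf if c > 0)
    # standard remapping: scale the CDF to the full output range
    def remap(p):
        return round((cdf[p] - cdf_min) / max(n - cdf_min, 1) * (levels - 1))
    return [remap(p) for p in pixels]
```

A narrow band of input levels (e.g. 100-102) is stretched across the full 0-255 range, which is exactly the contrast gain, and information loss, the abstract discusses.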
Shnoll, Simon E.; Zenchenko, Konstantin I.; Berulis, Iosas I.; Udaltsova, Natalia V.; Rubinstein, Ilia A.
2004-01-01
The fine structure of histograms of measurements of 239Pu alpha-activity varies periodically, with a period equal to the sidereal day (1436 minutes). The periodicity is not observed in experiments with a collimator that restricts the flow of alpha particles to those oriented toward the Pole Star. Based on this study and other independent data, such as measurements conducted by an Arctic expedition and the similarity of histograms of processes observed at different locations at the same local time, we conclude that the fine structure of the statistical distributions of the observed processes depends on the celestial sphere.
Energy Technology Data Exchange (ETDEWEB)
Souza, Rafael T.F.; Lemke, Ney; Hormaza, Joel Mesa; Alvarez, Matheus, E-mail: rafael@ibb.unesp.b [Universidade Estadual Paulista Julio de Mesquisa Filho (DFB/IB/UNESP), Botucatu, SP (Brazil). Inst. de Biociencias. Dept. de Fisica e Biofisica; Pina, Diana R.; Teixeira, Altamir S. [Universidade Estadual Paulista Julio de Mesquisa Filho (HC/FM/UNESP), Botucatu, SP (Brazil). Hospital de Clinicas. Dept. de Doencas Tropicais e Diagnostico por Imagem
2010-06-15
An algorithm is proposed for determining the equivalent thickness of biological tissue by removing Gaussians from histograms. The algorithm classifies the different biological tissues using histograms constructed from CT scans in DICOM format and calculates the average thickness of these tissues. The results are consistent with the literature, with discrepancies of up to 21.6% for bone, as analyzed on the anthropomorphic phantom (RANDO). These results support the use of this method on living tissues for the construction of homogeneous chest phantoms of newborn and suckling patients, which are subsequently used in the optimization of pediatric radiographic images. (author)
Locality and topology with fat link overlap actions
Kovács, T G
2003-01-01
We study the locality and topological properties of fat-link clover overlap (FCO) actions. We find that a small amount of fattening (2-4 steps of APE or 1 step of HYP) already results in greatly improved properties compared to the Wilson overlap (WO). We present a detailed study of the localisation of the FCO and its connection to the density of low modes of $A^\dagger A$. In contrast to the Wilson overlap, on quenched gauge backgrounds we do not find any dependence of the localisation of the FCO on the gauge coupling. This suggests that the FCO remains local in the continuum limit. The FCO also faithfully reproduces the zero-mode wave functions of typical lattice instantons, unlike the Wilson overlap. After a general discussion of different lattice definitions of the topological charge, we show that the FCO together with the Boulder charge is likely to satisfy the index theorem in the continuum limit. Finally, we present a high-statistics computation of the quenched topological susceptibility with the…
Anxiety, fainting and gagging in dentistry: Separate or overlapping constructs?
Houtem, C.M.H.H. van
2016-01-01
This thesis aimed to increase the knowledge about severe forms of anxiety, gagging and fainting in dentistry and to investigate whether these phenomena are overlapping or separate constructs. In Chapter 2 a literature review of twin studies showed that the estimated heritability of specific phobias
Correlation functions at small quark masses with overlap fermions
Energy Technology Data Exchange (ETDEWEB)
Giusti, L. [CNRS Luminy, Marseille (France). Centre de Physique Theorique; Hernandez, P. [Edificio Institutos Investigacion, Valencia (Spain). Dpto. Fisica Teorica and IFIC; Laine, M. [Bielefeld Univ. (Germany). Fakultaet fuer Physik; Pena, C.; Wennekers, J.; Wittig, H.; Weisz, P. [Max-Planck-Institut fuer Physik, Muenchen (Germany)
2004-09-01
We report on recent work on the determination of low-energy constants describing ΔS = 1 weak transitions, in order to investigate the origins of the ΔI = 1/2 rule. We focus on numerical techniques designed to enhance the statistical signal in three-point correlation functions computed with overlap fermions near the chiral limit. (orig.)
Correlation functions at small quark masses with overlap fermions
Energy Technology Data Exchange (ETDEWEB)
Giusti, L. [Centre de Physique Theorique, CNRS Luminy, F-13288 Marseille Cedex 9 (France); Hernandez, P. [Dpto. Fisica Teorica and IFIC, Edificio Institutos Investigacion, E-46071 Valencia (Spain); Laine, M. [Faculty of Physics, University of Bielefeld, D-33501 Bielefeld (Germany); Pena, C. [Deutsches Elektronen-Synchrotron, DESY, Notkestr. 85, D-22603 Hamburg (Germany); Weisz, P. [Max-Planck-Institut fuer Physik, Foehringer Ring 6, D-80805 Munich (Germany); Wennekers, J. [Deutsches Elektronen-Synchrotron, DESY, Notkestr. 85, D-22603 Hamburg (Germany); Wittig, H. [Deutsches Elektronen-Synchrotron, DESY, Notkestr. 85, D-22603 Hamburg (Germany)
2005-03-15
We report on recent work on the determination of low-energy constants describing ΔS = 1 weak transitions, in order to investigate the origins of the ΔI = 1/2 rule. We focus on numerical techniques designed to enhance the statistical signal in three-point correlation functions computed with overlap fermions near the chiral limit.
Correlation functions at small quark masses with overlap fermions
Giusti, Leonardo; Laine, Mikko; Peña, C; Weisz, P; Wennekers, J; Wittig, H
2005-01-01
We report on recent work on the determination of low-energy constants describing ΔS = 1 weak transitions, in order to investigate the origins of the ΔI = 1/2 rule. We focus on numerical techniques designed to enhance the statistical signal in three-point correlation functions computed with overlap fermions near the chiral limit.
Quality Assurance in the Determination of Overlapping Peak Areas
DEFF Research Database (Denmark)
Christensen, L.H.; Heydorn, K.
1987-01-01
The ability of different computer programs to yield accurate peak areas in statistical control in the case of partially overlapping photopeaks has been tested by the Analysis of Precision. A modified Covell method, two commercially available peak-fitting programs from Nuclear Data and Ortec, and ...
On the geometry of coating layers formed by overlap
Ocelik, V.; Nenadl, O.; Palavra, A.; De Hosson, J. Th. M.
2014-01-01
A recursive model is presented for the prediction of the profile of a coating layer formed by single track overlap. A known shape of single track is assumed and on the base of simple physical assumptions the recursive sequence is deduced to construct an entire profile of such coatings. Calculations
Track with overlapping links for dry coal extrusion pumps
Saunders, Timothy; Brady, John D
2014-01-21
A chain for a particulate material extrusion pump includes a plurality of links, each of the plurality of links having a link body and a link ledge, wherein each link ledge of the plurality of links at least partially overlaps the link body of an adjacent one of the plurality of links.
Coeliac disease and autoimmune disease-genetic overlap and screening
Lundin, Knut E. A.; Wijmenga, Cisca
Coeliac disease is a treatable, gluten-induced disease that often occurs concurrently with other autoimmune diseases. In genetic studies since 2007, a partial genetic overlap between these diseases has been revealed and further insights into the pathophysiology of coeliac disease and autoimmunity
Two Efficient Techniques to Find Approximate Overlaps between Sequences
2017-01-01
Next-generation sequencing (NGS) technology outputs a huge number of sequences (reads) that require further processing. After applying prefiltering techniques to eliminate redundancy and correct erroneous reads, an overlap-based assembler typically finds the longest exact suffix-prefix match between each ordered pair of the input reads. However, another trend has been evolving for the purpose of solving an approximate version of the overlap problem. The main benefit of this direction is the ability to skip the time-consuming error-detecting techniques applied in the prefiltering stage. In this work, we present and compare two techniques for solving the approximate overlap problem. The first adapts a compact prefix tree to efficiently solve the approximate all-pairs suffix-prefix problem, while the other uses a well-known principle, the pigeonhole principle, to identify a potential overlap match and ultimately solve the same problem. Our results show that the pigeonhole-based solution has lower space and time consumption than an FM-index-based solution, while the prefix-tree-based solution has the lowest space consumption of all three. The number of mismatches (Hamming distance) is used to define approximate matching between strings in our work.
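As a toy illustration of the pigeonhole idea (not the authors' implementation): if an overlap of length L is allowed at most k mismatches, then splitting it into k+1 disjoint pieces guarantees at least one piece matches exactly, so candidate overlaps can be filtered by an exact piece match before a full Hamming-distance check. A minimal sketch, with invented helper names:

```python
def hamming(a, b):
    """Mismatch count between equal-length strings."""
    return sum(x != y for x, y in zip(a, b))

def longest_approx_overlap(r1, r2, k, min_len=3):
    """Longest suffix of r1 matching a prefix of r2 with at most
    k mismatches. The pigeonhole principle is used as a filter:
    an overlap with <= k mismatches must contain an exact match
    of one of its first k+1 equal-length pieces."""
    for L in range(min(len(r1), len(r2)), min_len - 1, -1):
        suf, pre = r1[-L:], r2[:L]
        piece = L // (k + 1)
        if piece >= 1:
            # at least one aligned piece must match exactly
            ok = any(suf[i*piece:(i+1)*piece] == pre[i*piece:(i+1)*piece]
                     for i in range(k + 1))
            if not ok:
                continue
        if hamming(suf, pre) <= k:  # full verification
            return L
    return 0
```

This brute-force version is only meant to show the filter; the paper's prefix-tree and FM-index variants make the candidate search sublinear.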
Australia's National Research Collection: Overlap, Uniqueness, and Distribution
Genoni, Paul; Wright, Janette
2011-01-01
This paper reports on the results of an overlap study of Australian research library collections. The study used OCLC's WorldCat Collection Analysis software to mine data recording Australian holdings on the WorldCat database. The data is analysed according to the results obtained for six "groups" which represent various coalitions of…
9 CFR 121.4 - Overlap select agents and toxins.
2010-01-01
... OF AGRICULTURE VIRUSES, SERUMS, TOXINS, AND ANALOGOUS PRODUCTS; ORGANISMS AND VECTORS POSSESSION, USE...; Hendra virus; Nipah virus; Rift Valley fever virus; Venezuelan equine encephalitis virus. (c) Genetic... infectious forms of any of the overlap select agent viruses listed in paragraph (b) of this section. 4 4...
42 CFR 73.4 - Overlap select agents and toxins.
2010-10-01
... pseudomallei (formerly Pseudomonas pseudomallei) Hendra virus Nipah virus Rift Valley fever virus Venezuelan Equine Encephalitis virus (c) Genetic Elements, Recombinant Nucleic Acids, and Recombinant Organisms: (1) Nucleic acids that can produce infectious forms of any of the overlap select agent viruses listed...
A fuzzy approach to the Weighted Overlap Dominance model
DEFF Research Database (Denmark)
Franco de los Rios, Camilo Andres; Hougaard, Jens Leth; Nielsen, Kurt
2013-01-01
in an interactive way, where input data can take the form of uniquely-graded or interval-valued information. Here we explore the Weighted Overlap Dominance (WOD) model from a fuzzy perspective and its outranking approach to decision support and multidimensional interval analysis. Firstly, imprecision measures...
Two-fractal overlap time series: Earthquakes and market crashes
Indian Academy of Sciences (India)
Bikas K Chakrabarti; Arnab Chatterjee; Pratip Bhattacharyya
2008-08-01
We find prominent similarities between the features of the time series for the overlap of two Cantor sets (model earthquakes), when one set moves with uniform relative velocity over the other, and the time series of stock prices. An anticipation method for some of the crashes has been proposed here, based on these observations.
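A minimal sketch of the underlying construction (illustrative only; the finite iteration depth and cyclic sliding are assumptions): represent the middle-thirds Cantor set at iteration n as a binary mask over 3^n cells, slide one copy over the other, and record the overlap magnitude at each shift.

```python
def cantor_mask(n):
    """Occupancy mask of the middle-thirds Cantor set at
    iteration n over 3**n cells: a cell is occupied iff its
    base-3 index contains no digit 1."""
    mask = []
    for i in range(3 ** n):
        x, keep = i, True
        while x:
            if x % 3 == 1:
                keep = False
                break
            x //= 3
        mask.append(1 if keep else 0)
    return mask

def overlap_series(n):
    """Overlap (count of coinciding occupied cells) of two
    identical Cantor masks under cyclic shifts: the 'model
    earthquake' time series."""
    m = cantor_mask(n)
    size = len(m)
    return [sum(m[i] & m[(i + s) % size] for i in range(size))
            for s in range(size)]
```

The shift-zero overlap equals the number of occupied cells (2^n), and the magnitudes drop off as the sets slide out of register.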
Market positioning: the shifting effects of niche overlap
Bruggeman, J.; Grunow, D.; Leenders, M.A.A.M.; Vermeulen, I.; Kuilman, J.G.
2012-01-01
Organizational ecology models of market dynamics emphasize the competition-inducing role of inter-organizational niche overlap: targeting similar market niches increases competitive pressure and thus reduces organizations' fitness. Recent studies, however, have suggested that moderate niche overlap m…
THE ROLE OF PREPARATION IN OVERLAPPING-TASK PERFORMANCE
de Jong, Robert (Rob)
1995-01-01
We examined how performance of two overlapping discrete tasks is organized and controlled. Experiment 1 showed that when stimuli were presented in an unexpected order, expectations rather than actual presentation order determined the order in which the two stimuli were processed. In Experiment 2, wh
Interference management with partial uplink/downlink spectrum overlap
Randrianantenaina, Itsikiantsoa
2016-07-26
Simultaneous reuse of spectral resources by uplink and downlink, denoted in-band full-duplex (FD) communication, is promoted to double the spectral efficiency compared to its half-duplex (HD) counterpart. Interference management, however, remains challenging in FD cellular networks, especially when a high disparity between uplink and downlink transmission powers exists. Uplink performance can deteriorate markedly when operating on channels that are simultaneously occupied by downlink transmission. This paper considers a cellular wireless system with partial spectrum overlap between downlink and uplink. The performance of the system is therefore a function of the overlap fraction as well as the power levels of both uplink and downlink transmissions. The paper formulates the problem of maximizing an overall network utility over the uplink/downlink transmission powers and the spectrum overlap fraction in each cell, and proposes solving it with an interior-point method. Simulation results confirm the vulnerability of uplink performance under FD operation and show the superiority of the proposed scheme over the FD and HD schemes. The results further show that explicit uplink and downlink performance should be considered for efficient design of cellular networks with overlapping uplink/downlink resources. © 2016 IEEE.
Room acoustic transition time based on reflection overlap
DEFF Research Database (Denmark)
Jeong, Cheol-Ho; Brunskog, Jonas; Jacobsen, Finn
2010-01-01
A transition time is defined based on the temporal overlap of reflected pulses in room impulse responses. Assuming specular reflections only, the temporal distance between adjacent reflections, which is proportional to the volume of a room, is compared with the characteristic width of a pulse...
Room acoustic transition time based on reflection overlap
DEFF Research Database (Denmark)
Jeong, Cheol-Ho; Brunskog, Jonas; Jacobsen, Finn
2013-01-01
A transition time is defined based on the temporal overlap of reflected pulses in room impulse responses. Assuming specular reflections only, the temporal distance between adjacent reflections, which is proportional to the volume of a room, is compared with the characteristic width of a pulse...
[Impact of the Overlap Region Between Acoustic and Electric Stimulation].
Baumann, Uwe; Mocka, Moritz
2017-06-01
Patients with residual hearing at low frequencies and ski-slope hearing loss with partial deafness at medium and high frequencies receive cochlear implant treatment with combined electric-acoustic stimulation (EAS, "hybrid" stimulation). In the border region between electric and acoustic stimulation, a superposition of the two types of stimulation is expected. The area of overlap is determined by the insertion depth of the stimulating electrode and the lower starting point of the signal transmission provided by the CI speech processor. The study examined the influence of varying the electric-acoustic overlap on speech perception in noise, whereby the width of the "transmission gap" between the two stimulus modalities was varied by two different methods. The results, derived from 9 experienced users of the MED-EL Duet 2 speech processor, show that the electric-acoustic overlap, and with it the crossover frequency between the acoustic part and the CI, should be adjusted individually. Overall, speech reception thresholds (SRT) varied widely between subjects. Further studies shall investigate whether generalized procedures for setting the overlap between electric and acoustic stimulation are reasonable; for this, a larger number of subjects and a longer period of acclimatization prior to the hearing tests are deemed necessary. © Georg Thieme Verlag KG Stuttgart · New York.
Peer Network Overlap in Twin, Sibling, and Friend Dyads
McGuire, Shirley; Segal, Nancy L.
2013-01-01
Research suggests that sibling–peer connections are important for understanding adolescent problem behaviors. Using a novel behavioral genetic design, the current study investigated peer network overlap in 300 child–child pairs (aged 7-13 years) in 5 dyad types: monozygotic (MZ), dizygotic twins, full siblings (FSs), friend pairs, and virtual…
Directory of Open Access Journals (Sweden)
Lei Zhao
2017-01-01
This paper proposes a novel framework for facial expression analysis using dynamic and static information in video sequences. First, based on an incremental formulation, a discriminative deformable face alignment method is adapted to locate facial points, correct in-plane head rotation, and separate the facial region from the background. Then, a spatial-temporal motion local binary pattern (LBP) feature is extracted and integrated with a Gabor multi-orientation fusion histogram to give descriptors that reflect the static and dynamic texture information of facial expressions. Finally, a one-versus-one multiclass support vector machine (SVM) classifier is applied to classify facial expressions. Experiments on the Cohn-Kanade (CK+) facial expression dataset illustrate that the integrated framework outperforms methods using single descriptors. Compared with other state-of-the-art methods on the CK+, MMI, and Oulu-CASIA VIS datasets, our proposed framework performs better.
Holloway, Lois Charlotte; Miller, Julie-Anne; Kumar, Shivani; Whelan, Brendan M; Vinod, Shalini K
2012-01-01
Treatment planning studies often require the calculation of a large number of dose and radiobiological metrics. To streamline these calculations, a computer program called Comp Plan was developed using MATLAB. Comp Plan calculates common metrics, including equivalent uniform dose, tumor control probability, and normal tissue complication probability from dose-volume histogram data. The dose and radiobiological metrics can be calculated for the original data or for an adjusted fraction size using the linear quadratic model. A homogeneous boost dose can be added to a given structure if desired. The final output is written to an Excel file in a format convenient for further statistical analysis. Comp Plan was verified by independent calculations. A lung treatment planning study comparing 45 plans for 7 structures using up to 6 metrics for each structure was successfully analyzed within approximately 5 minutes with Comp Plan. The code is freely available from the authors on request.
Purwanti, Endah; Calista, Evelyn
2017-05-01
Leukemia is a type of cancer caused by malignant neoplasms of leukocyte cells. Acute lymphocytic leukemia (ALL) is a type of leukemia that can cause death quickly in the sufferer. In this study, we propose automatic detection of lymphocytic leukemia through classification of lymphocyte cell images obtained from single-cell peripheral blood smears. There are two main objectives. The first is to extract cell features. The second is to classify the lymphocyte cells into two classes, normal and abnormal. We use a combination of shape features and histogram features, and the classification algorithm is k-Nearest Neighbour with k varied over 1, 3, 5, 7, 9, 11, 13, and 15. The best accuracy, sensitivity, and specificity in this study are 90%, 90%, and 90%, obtained from the combined features area-perimeter-mean-standard deviation with k=7.
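A generic sketch of the classification step (illustrative; the feature order follows the abstract's area-perimeter-mean-standard deviation combination, and the sample vectors are invented):

```python
import math
from collections import Counter

def knn_classify(train, query, k=7):
    """k-Nearest-Neighbour vote: train is a list of
    (feature_vector, label) pairs; the query is assigned the
    majority label among its k closest training vectors under
    Euclidean distance."""
    dists = sorted(
        (math.dist(vec, query), label) for vec, label in train)
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]
```

In the paper's setting each vector would hold (area, perimeter, mean, standard deviation) of a segmented lymphocyte, and k is swept over the odd values listed above.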
Directory of Open Access Journals (Sweden)
Lifeng Zhang
2014-01-01
In recent years, smart devices equipped with imaging functions have spread widely among consumers. These devices make it very convenient to record information: for example, one can photograph a page of a book in a library, or capture an interesting piece of news from a bulletin board while walking down the street. Sometimes, however, a single full-area shot cannot provide sufficient resolution for OCR software or for human visual recognition. People would therefore prefer to take several partial character images of readable size and then stitch them together efficiently. In this study, we propose a print-document acquisition method using a device with a video camera. A one-dimensional histogram-based self-calibration algorithm is developed for calibration. Because its calculation cost is low, it can be installed on a smartphone. Simulation results show that the calibration and stitching are performed well.
Energy Technology Data Exchange (ETDEWEB)
Heo, Min Suk; Kavitha, Muthu Subash [Dept. of Oral and Maxillofacial Radiology and Dental Research Institute, School of Dentistry, Seoul National University, Seoul (Korea, Republic of); Asano, Akira [Graduate School of Engineering, Hiroshima University, Hiroshima (Japan); Taguchi, Akira [Dept. of Oral and Maxillofacial Radiology, Matsumoto Dental University, Nagano (Japan)
2013-09-15
To prevent low bone mineral density (BMD), that is, osteoporosis, in postmenopausal women, it is essential to diagnose osteoporosis more precisely. This study presents an automatic approach utilizing a histogram-based automatic clustering (HAC) algorithm with a support vector machine (SVM) to analyse dental panoramic radiographs (DPRs) and thus improve diagnostic accuracy by identifying postmenopausal women with low BMD or osteoporosis. We integrated our newly proposed HAC algorithm with our previously designed computer-aided diagnosis system. The extracted moment-based features (mean, variance, skewness, and kurtosis) of the mandibular cortical width were fed to a radial basis function (RBF) SVM classifier. We also compared the diagnostic efficacy of the SVM model with a back-propagation (BP) neural network model. DPRs and BMD measurements of 100 postmenopausal women (aged >50 years), with no previous record of osteoporosis, were randomly selected for inclusion. The accuracy, sensitivity, and specificity of our HAC-SVM model in identifying women with low BMD were 93.0% (88.0%-98.0%), 95.8% (91.9%-99.7%), and 86.6% (79.9%-93.3%), respectively, at the lumbar spine; and 89.0% (82.9%-95.1%), 96.0% (92.2%-99.8%), and 84.0% (76.8%-91.2%), respectively, at the femoral neck. Our experimental results suggest that the proposed HAC-SVM combination applied to DPRs could be useful to assist dentists in early diagnosis and help reduce the morbidity and mortality associated with low BMD and osteoporosis.
Directory of Open Access Journals (Sweden)
Aniza Othman
2016-12-01
Today, most images are digitized and kept in digital libraries for better organization and management. With the growth of information and communication technology, collection holders such as museums and cultural institutions have become increasingly interested in making their collections available anytime and anywhere for image retrieval (IR) activities such as browsing and searching. In a colour image retrieval application, images are retrieved according to users' specifications of what they want, which can be based on many concepts. We suggest an approach to categorize the colour appearance of whole-scene landscape painting images based on human colour perception. The colour features of an image are represented by a colour histogram. We then find the quantization bins that generate optimum colour histograms for all categories of colour appearance, selected on the basis of the harmonic mean of precision and recall, also known as the F-score. Colour appearance attributes in the CIELab colour model (L: lightness; a and b: colour-opponent dimensions) are used to generate colour appearance feature vectors, namely the saturation metric, lightness metric, and multicoloured metric. For the categorization, we use the nearest neighbour (NN) method to detect the classes using the predefined colour appearance descriptor measures and preset thresholds. The experimental results show that quantizing the CIELab colour model into 11 uniform bins for each component achieved the optimum result for all colour appearance categories.
Energy Technology Data Exchange (ETDEWEB)
Domengie, F., E-mail: florian.domengie@st.com; Morin, P. [STMicroelectronics Crolles 2 (SAS), 850 Rue Jean Monnet, 38926 Crolles Cedex (France); Bauza, D. [CNRS, IMEP-LAHC - Grenoble INP, Minatec: 3, rue Parvis Louis Néel, CS 50257, 38016 Grenoble Cedex 1 (France)
2015-07-14
We propose a model for dark current induced by metallic contamination in a CMOS image sensor. Based on Shockley-Read-Hall kinetics, the expression of dark current proposed accounts for the electric field enhanced emission factor due to the Poole-Frenkel barrier lowering and phonon-assisted tunneling mechanisms. To that aim, we considered the distribution of the electric field magnitude and metal atoms in the depth of the pixel. Poisson statistics were used to estimate the random distribution of metal atoms in each pixel for a given contamination dose. Then, we performed a Monte-Carlo-based simulation for each pixel to set the number of metal atoms the pixel contained and the enhancement factor each atom underwent, and obtained a histogram of the number of pixels versus dark current for the full sensor. Excellent agreement with the dark current histogram measured on an ion-implanted gold-contaminated imager has been achieved, in particular, for the description of the distribution tails due to the pixel regions in which the contaminant atoms undergo a large electric field. The agreement remains very good when increasing the temperature by 15 °C. We demonstrated that the amplification of the dark current generated for the typical electric fields encountered in the CMOS image sensors, which depends on the nature of the metal contaminant, may become very large at high electric field. The electron and hole emissions and the resulting enhancement factor are described as a function of the trap characteristics, electric field, and temperature.
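A toy sketch of the simulation loop described above (all parameters and the enhancement-factor distribution are invented for illustration): draw a Poisson number of contaminant atoms per pixel, draw a field-enhancement factor per atom, and histogram the resulting per-pixel dark current.

```python
import math
import random

def dark_current_histogram(n_pixels, mean_atoms, base_current,
                           bins=20, seed=1):
    """Monte-Carlo sketch: each pixel holds a Poisson-distributed
    number of metal atoms; each atom contributes the base
    generation current times a random enhancement factor
    (log-uniform over 1x-100x here, purely illustrative).
    Returns the histogram of pixel counts versus dark current."""
    rng = random.Random(seed)
    currents = []
    for _ in range(n_pixels):
        # Poisson draw via Knuth's method (fine for small means)
        limit = math.exp(-mean_atoms)
        n_atoms, prod = 0, rng.random()
        while prod > limit:
            n_atoms += 1
            prod *= rng.random()
        # each atom's emission is enhanced by its local field
        i = sum(base_current * 10 ** rng.uniform(0.0, 2.0)
                for _ in range(n_atoms))
        currents.append(i)
    top = max(currents) or 1.0
    hist = [0] * bins
    for i in currents:
        hist[min(int(i / top * bins), bins - 1)] += 1
    return hist
```

The long tail of the resulting histogram comes from the rare pixels that combine several atoms with large enhancement factors, mirroring the distribution tails the paper attributes to high-field regions.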
Individual foraging strategies reveal niche overlap between endangered galapagos pinnipeds.
Directory of Open Access Journals (Sweden)
Stella Villegas-Amtmann
Most competition studies between species are conducted with a population-level approach. Few studies have examined inter-specific competition in conjunction with intra-specific competition with an individual-based approach, and to our knowledge none has been conducted on marine top predators. Sympatric Galapagos fur seals (Arctocephalus galapagoensis) and sea lions (Zalophus wollebaeki) share similar geographic habitats and potentially compete. We studied their foraging niche overlap at Cabo Douglas, Fernandina Island, from simultaneously collected dive and movement data to examine spatial and temporal inter- and intra-specific competition. Sea lions exhibited three foraging strategies (shallow, intermediate, and deep), indicating intra-specific competition. Fur seals exhibited one foraging strategy, diving predominantly at night, between 0-80 m depth and mostly at 19-22 h. Most sea lion dives also occurred at night (63%), between 0-40 m, within the fur seals' diving depth range. 34% of sea lion night dives occurred at 19-22 h, when fur seals dived the most, but most occurred at dawn and dusk, when fur seals exhibited the fewest dives. Fur seal and sea lion foraging behavior overlapped at 19 and 21 h between 0-30 m depths. Sea lions from the deep diving strategy exhibited the greatest foraging overlap with fur seals in time (19 h), depth during overlapping time (21-24 m), and foraging range (37.7%). The fur seals' foraging range was larger. The northwest coastal area of Cabo Douglas, the region of highest diving density, is a foraging "hot spot" for both species. Fur seal and sea lion foraging niches overlapped, but segregation also occurred: fur seals primarily dived at night, while sea lions exhibited night and day diving, and both species exploited depths and areas exclusive to their species. Niche breadth generally increases with environmental uncertainty and decreased productivity. Potential competition between these species could be greater during
LENUS (Irish Health Repository)
Coleman, Linda
2013-11-01
This study investigates the impact of systematic multileaf collimator (MLC) positional errors on the gamma analysis results used for quality assurance (QA) of RapidArc treatments. In addition, it evaluates the relationship between these gamma analysis results and clinical dose volume histogram (DVH) metrics for RapidArc treatment plans.
On Evaluation of Overlap Integrals with Noninteger Principal Quantum Numbers
Institute of Scientific and Technical Information of China (English)
I.I.Guseinov; B.A.Mamedov
2004-01-01
Using complete orthonormal sets of ψα exponential-type orbitals (ψα-ETOs, α = 1, 0, -1, -2, ...), series expansion formulas for the noninteger n Slater-type orbitals (NISTOs) in terms of integer n Slater-type orbitals (ISTOs) are derived. These formulas enable us to express overlap integrals over NISTOs through overlap integrals over ISTOs with the same and different screening constants. By computing concrete cases, the convergence of the series is tested for arbitrary values of the noninteger principal quantum numbers, the screening constants of the NISTOs, and the internuclear distances. The accuracy of the results is quite high over this range of quantum numbers, screening constants, and STO locations.
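For orientation, the quantity being generalized has a well-known closed form in the integer-n special case: the two-center overlap of two normalized 1s STOs with equal screening constant. The sketch below is that standard textbook result, not the paper's NISTO expansion.

```python
import math

def slater_1s_overlap(zeta, R):
    # Overlap <1s_A | 1s_B> of two normalized 1s Slater-type orbitals
    # with equal screening constant zeta, centers a distance R apart
    # (standard closed form; the paper generalizes to noninteger n).
    t = zeta * R
    return math.exp(-t) * (1.0 + t + t * t / 3.0)
```

At R = 0 the two orbitals coincide and the overlap is exactly 1; it then decays exponentially with the internuclear distance.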
Functional overlap of the Arabidopsis leaf and root microbiota.
Bai, Yang; Müller, Daniel B; Srinivas, Girish; Garrido-Oter, Ruben; Potthoff, Eva; Rott, Matthias; Dombrowski, Nina; Münch, Philipp C; Spaepen, Stijn; Remus-Emsermann, Mitja; Hüttel, Bruno; McHardy, Alice C; Vorholt, Julia A; Schulze-Lefert, Paul
2015-12-17
Roots and leaves of healthy plants host taxonomically structured bacterial assemblies, and members of these communities contribute to plant growth and health. We established Arabidopsis leaf- and root-derived microbiota culture collections representing the majority of bacterial species that are reproducibly detectable by culture-independent community sequencing. We found an extensive taxonomic overlap between the leaf and root microbiota. Genome drafts of 400 isolates revealed a large overlap of genome-encoded functional capabilities between leaf- and root-derived bacteria, with few significant differences at the level of individual functional categories. Using defined bacterial communities and a gnotobiotic Arabidopsis plant system, we show that the isolates form assemblies resembling natural microbiota on their cognate host organs, but are also capable of ectopic leaf or root colonization. While this raises the possibility of reciprocal relocation between root and leaf microbiota members, genome information and recolonization experiments also provide evidence for microbiota specialization to their respective niche.
Hausdorff dimension of self-similar sets with overlaps
Institute of Scientific and Technical Information of China (English)
DENG QiRong; John HARDING; HU TianYou
2009-01-01
We provide a simple formula to compute the Hausdorff dimension of the attractor of an overlapping iterated function system of contractive similarities satisfying a certain collection of assumptions. This formula is obtained by associating a non-overlapping infinite iterated function system to an iterated function system satisfying our assumptions and using the results of Moran to compute the Hausdorff dimension of the attractor of this infinite iterated function system, thus showing that the Hausdorff dimension of the attractor of this infinite iterated function system agrees with that of the attractor of the original iterated function system. Our methods are applicable to some iterated function systems that do not satisfy the finite type condition recently introduced by Ngai and Wang.
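In the non-overlapping setting invoked here, Moran's classical result gives the dimension as the unique root s of the equation sum(r_i^s) = 1 over the contraction ratios r_i. A minimal bisection solver for that equation (illustrative only; the paper's contribution is extending this to overlapping systems):

```python
def moran_dimension(ratios, tol=1e-12):
    # Solve sum(r_i ** s) = 1 for s by bisection; the left-hand side
    # is strictly decreasing in s for contraction ratios 0 < r_i < 1.
    f = lambda s: sum(r ** s for r in ratios) - 1.0
    lo, hi = 0.0, 1.0
    while f(hi) > 0.0:        # grow the bracket until f changes sign
        hi *= 2.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(mid) > 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

For the Sierpinski triangle (three maps with ratio 1/2) this yields log 3 / log 2 ≈ 1.585, and for the middle-thirds Cantor set (two maps with ratio 1/3) it yields log 2 / log 3 ≈ 0.631.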
Aceclofenac induced Stevens-Johnson/toxic epidermal necrolysis overlap syndrome
Directory of Open Access Journals (Sweden)
Kaderthambi Hajamohideen Nooru Ameen
2013-01-01
The purpose of this paper is to report a rare occurrence of Stevens-Johnson syndrome/toxic epidermal necrolysis (SJS/TEN) overlap syndrome after the use of aceclofenac. A 38-year-old previously healthy adult male presented with a rapidly evolving rash over the face and upper body, with ulceration of the buccal mucosa and breathlessness, after taking an aceclofenac tablet. The Naranjo score for this adverse drug event was six, making it a probable adverse drug reaction. Despite aggressive fluid resuscitation and the use of antihistamines and systemic steroids, the patient's condition rapidly worsened and he died within six hours of presentation. Aceclofenac-induced SJS/TEN overlap is an extremely rare clinical association, previously reported only once in the medical literature. To the best of our knowledge, this is the first case report of such an association in the Indian population. We present this case to highlight the serious adverse reactions possible from a routinely prescribed drug.
Overlapping coalition formation games in wireless communication networks
Wang, Tianyu; Saad, Walid; Han, Zhu
2017-01-01
This brief introduces overlapping coalition formation games (OCF games), a novel mathematical framework from cooperative game theory that can be used to model, design and analyze cooperative scenarios in future wireless communication networks. The concepts of OCF games are explained, and several algorithmic aspects are studied. In addition, several major application scenarios are discussed. These applications are drawn from a variety of fields that include radio resource allocation in dense wireless networks, cooperative spectrum sensing for cognitive radio networks, and resource management for crowdsourcing. For each application, the use of OCF games is discussed in detail in order to show how this framework can be used to solve relevant wireless networking problems. Overlapping Coalition Formation Games in Wireless Communication Networks provides researchers, students and practitioners with a concise overview of existing works in this emerging area, exploring the relevant fundamental theories, key techniqu...
On-the-fly Overlapping of Sparse Generations
DEFF Research Database (Denmark)
Sørensen, Chres Wiant; Roetter, Daniel Enrique Lucani; Fitzek, Frank
2014-01-01
Traditionally, the idea of overlapping generations in network coding research has focused on reducing the complexity of decoding large data files while maintaining the delay performance expected of a system that combines all data packets. However, the effort for encoding and decoding individual generations can still be quite high compared to other sparse coding approaches. This paper focuses on an inherently different approach that combines (i) sparsely coded generations configured on-the-fly based on (ii) controllable and infrequent feedback that allows the system to remove some original packets from the pool of packets to be mixed in the linear combinations. The latter is key to maintaining a high impact of the coded packets received during the entire process while maintaining very sparsely coded generations. Interestingly, our proposed approach naturally bridges the idea of overlapping...
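The core mechanics of a coded generation over GF(2), sparse XOR combining at the encoder and Gaussian elimination at the decoder, can be sketched as follows. This is a generic illustration under simplifying assumptions (integer payloads, bitmask coefficient vectors); the paper's on-the-fly generation configuration and feedback protocol are not modeled.

```python
import random

def encode(packets, n_coded, density=0.3, seed=0):
    # Produce sparse coded packets: each is (coefficient bitmask over the
    # generation, XOR of the selected original packets). Payloads are ints.
    rng = random.Random(seed)
    k = len(packets)
    coded = []
    while len(coded) < n_coded:
        mask, pay = 0, 0
        for i in range(k):
            if rng.random() < density:
                mask |= 1 << i
                pay ^= packets[i]
        if mask:  # discard the all-zero combination
            coded.append((mask, pay))
    return coded

def decode(coded, k):
    # Gaussian elimination over GF(2); returns the k original packets,
    # or None if the received combinations do not have full rank.
    rows = []  # independent rows (mask, payload) with distinct lowest set bits
    for mask, pay in coded:
        # eliminate existing pivots in ascending order of their lead bit
        for pmask, ppay in sorted(rows, key=lambda r: r[0] & -r[0]):
            if mask & (pmask & -pmask):
                mask ^= pmask
                pay ^= ppay
        if mask:
            rows.append((mask, pay))
    if len(rows) < k:
        return None
    # back-substitute, highest pivot first: once rank is full, every
    # non-lead bit of a row is the lead of an already-solved row
    solved = {}
    for mask, pay in sorted(rows, key=lambda r: r[0] & -r[0], reverse=True):
        lead = (mask & -mask).bit_length() - 1
        rest = mask ^ (1 << lead)
        while rest:
            bit = rest & -rest
            pay ^= solved[bit.bit_length() - 1]
            rest ^= bit
        solved[lead] = pay
    return [solved[i] for i in range(k)]
```

Lower densities make encoding and decoding cheaper (fewer XORs per packet) at the cost of needing more received packets before the system reaches full rank, which is the trade-off the paper's feedback mechanism manages.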
Results from overlap valence quarks on a twisted mass sea
Garron, N
2007-01-01
We present results of lattice computations using overlap fermions on a twisted mass background. $N_f=2$ full QCD gauge configurations have been produced by the ETM Collaboration with very light pions (down to less than 300 MeV), small lattice spacing ($a \approx 0.09$ fm), and large volumes ($V/a^4 = 24^3 \times 48$). By exploiting the good chiral properties of the overlap operator for the valence quarks, it is possible to obtain a precise (and unquenched) determination of those physical quantities for which the chiral properties are crucial. In order to have unquenched results, we match the valence quark mass with the sea quark mass. We also perform computations with different quark masses in order to simulate (partially quenched) strange and charm quarks. A typical application is the computation of $B_K$, for which we present first results.
DEMON: a Local-First Discovery Method for Overlapping Communities
Coscia, Michele; Giannotti, Fosca; Pedreschi, Dino
2012-01-01
Community discovery in complex networks is an interesting problem with a number of applications, especially in the knowledge extraction task in social and information networks. However, many large networks often lack a particular community organization at a global level. In these cases, traditional graph partitioning algorithms fail to let the latent knowledge embedded in modular structure emerge, because they impose a top-down global view of a network. We propose here a simple local-first approach to community discovery, able to unveil the modular organization of real complex networks. This is achieved by democratically letting each node vote for the communities it sees surrounding it in its limited view of the global system, i.e. its ego neighborhood, using a label propagation algorithm; finally, the local communities are merged into a global collection. We tested this intuition against the state-of-the-art overlapping and non-overlapping community discovery methods, and found that our new method clearly ou...
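The label-propagation step at the heart of this approach can be sketched as follows. This is a generic asynchronous variant run on a whole graph with deterministic tie-breaking, for illustration only; the paper applies label propagation inside each node's ego neighborhood and then merges the resulting local communities.

```python
import random
from collections import Counter

def label_propagation(adj, seed=0, max_iter=100):
    # adj: dict mapping node -> list of neighbors (undirected graph).
    # Each node repeatedly adopts the most frequent label among its
    # neighbors (ties broken toward the smallest label) until stable.
    rng = random.Random(seed)
    labels = {v: v for v in adj}
    nodes = list(adj)
    for _ in range(max_iter):
        rng.shuffle(nodes)  # random visiting order each sweep
        changed = False
        for v in nodes:
            if not adj[v]:
                continue
            counts = Counter(labels[u] for u in adj[v])
            top = max(counts.values())
            best = min(l for l, c in counts.items() if c == top)
            if labels[v] != best:
                labels[v] = best
                changed = True
        if not changed:
            break
    return labels
```

On a graph made of two disjoint triangles, each triangle converges to a single shared label, and the two components end up with different labels, since labels can only spread along edges.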
Efficient Method for Histogram Generation on GPU
Institute of Scientific and Technical Information of China (English)
狄鹏; 胡长军; 李建江
2012-01-01
Histogram generation is an inherently sequential loop computation with irregular data dependence, which has a full range of applications in diverse fields. However, the irregular memory access in the histogram loop nest poses an obstacle to parallel execution by a massive number of fine-grained threads, due to the access latency caused by bank conflicts. It is non-trivial to accelerate a histogram generation algorithm on a parallel platform, particularly on the state-of-the-art parallel platform, the graphics processing unit (GPU). To reduce bank conflicts, a padding technique can evenly distribute the shared memory accesses of multiple threads to different banks and largely exploit the GPU's potential for accelerating histogram generation. Moreover, an efficient near-optimal configuration search model can guide programmers in choosing appropriate GPU execution parameters for higher performance. Experimental results demonstrate that the improved histogram generation algorithm achieves speedups of approximately 42% to 88% over the traditional histogram generation algorithm on the GPU.
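The effect of padding on bank conflicts can be illustrated without a GPU. With per-thread sub-histograms laid out at a fixed stride in shared memory, thread t updating bin b touches bank (t*stride + b) mod 32; a stride co-prime with the bank count spreads simultaneous same-bin updates across banks. The bank count of 32 and this particular layout are illustrative assumptions, not details taken from the paper.

```python
from collections import Counter

N_BANKS = 32  # banks on current NVIDIA GPUs (illustrative assumption)

def worst_case_conflict(stride, bins_accessed):
    # bins_accessed[t] is the bin thread t updates in one warp-wide access;
    # returns the maximum number of threads hitting the same bank
    # (1 = conflict-free, 32 = fully serialized).
    banks = Counter((t * stride + b) % N_BANKS
                    for t, b in enumerate(bins_accessed))
    return max(banks.values())

# 32 threads all updating the same bin of their private 32-bin histogram:
# stride 32 (no padding): every thread hits the same bank -> 32-way conflict
# stride 33 (one padding word per sub-histogram): bank (t + b) mod 32 is
# distinct for each thread -> conflict-free
```

This is exactly why padding each per-thread sub-histogram by one word (stride 33 instead of 32) removes the serialization in the worst case of all threads hitting the same bin.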
Node-Centric Detection of Overlapping Communities in Social Networks
Cohen, Yehonatan; Rubin, Amir
2016-01-01
We present NECTAR, a community detection algorithm that generalizes the Louvain method's local search heuristic to overlapping community structures. NECTAR chooses dynamically which objective function to optimize based on the network on which it is invoked. Our experimental evaluation on both synthetic benchmark graphs and real-world networks, based on ground-truth communities, shows that NECTAR provides excellent results compared with state-of-the-art community detection algorithms.
Bilateral coxitis in scleroderma-polymyositis overlap syndrome
Berrada, Khadija; Abourazzak, Fatima Ezzahra; Houssaini, Ghita Sqalli; Kadi, Nadira; Tahiri, Latifa; Amrani, Kawthar; Khammar, Zineb; Lahlou, Meriam; Berrady, Rhizlane; Rabhi, Samira; Tizniti, Siham; Bono, Wafaa; Harzy, Taoufik
2014-01-01
Joint manifestations in scleroderma (Scl) and polymyositis (PM) are dominated by inflammatory arthralgia. Arthritis is less common and preferentially affects the hands, wrists, knees, and ankles. Involvement of the hip has rarely been reported in the literature. We report a case of coxitis diagnosed in a patient suffering from scleroderma-polymyositis overlap syndrome, successfully treated by ultrasound-guided infiltration of triamcinolone hexacetonide. PMID:27708891
Unquenching the topological susceptibility with an overlap action
Kovács, T G
2002-01-01
We estimate the quark-mass dependence of the topological susceptibility with dynamical overlap and clover fermions. Unquenching effects on the susceptibility turn out to be well approximated by reweighting a quenched ensemble with a low-eigenmode truncation of the fermionic determinant. We find that it is most likely the explicit chiral symmetry breaking of the fermion action that prevents present-day dynamical simulations from showing the expected suppression of the topological susceptibility.