WorldWideScience

Sample records for source count map

  1. A matrix-inversion method for gamma-source mapping from gamma-count data - 59082

    International Nuclear Information System (INIS)

    Bull, Richard K.; Adsley, Ian; Burgess, Claire

    2012-01-01

    Gamma ray counting is often used to survey the distribution of active waste material in various locations. Ideally the output from such surveys would be a map of the activity of the waste. In this paper a simple matrix-inversion method is presented. This allows an array of gamma-count data to be converted to an array of source activities. For each survey area the response matrix is computed using the gamma-shielding code Microshield [1]. This matrix links the activity array to the count array. The activity array is then obtained via matrix inversion. The method was tested on artificially-created arrays of count-data onto which statistical noise had been added. The method was able to reproduce, quite faithfully, the original activity distribution used to generate the dataset. The method has been applied to a number of practical cases, including the distribution of activated objects in a hot cell and to activated Nimonic springs amongst fuel-element debris in vaults at a nuclear plant. (authors)
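The scheme described in this record can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code: the response matrix below is mocked up with a simple 1/(1+d²) falloff rather than computed with Microshield, and the grid geometry and activities are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 4x4 survey grid: one count measurement above each cell.
# response[i, j] = counts at position i per unit activity in cell j.
# In practice each column comes from a shielding code such as Microshield;
# here an inverse-square-like falloff stands in for it.
n = 4
cells = np.array([(x, y) for x in range(n) for y in range(n)], dtype=float)
d2 = np.sum((cells[:, None, :] - cells[None, :, :]) ** 2, axis=2)
response = 1.0 / (1.0 + d2)

activity_true = rng.uniform(0.0, 10.0, n * n)   # invented activity array
counts = response @ activity_true               # noise-free count array

# Add counting (Poisson) noise, then recover activities by inversion.
counts_noisy = rng.poisson(counts * 100.0) / 100.0
activity_est = np.linalg.solve(response, counts_noisy)
```

With no noise the inversion is exact; with Poisson noise the recovered array is only approximate, mirroring the paper's test on artificially created count arrays with added statistical noise.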

  2. Comparing MapReduce and Pipeline Implementations for Counting Triangles

    Directory of Open Access Journals (Sweden)

    Edelmira Pasarella

    2017-01-01

A common method of defining a parallel solution to a computational problem is to find a way to apply the Divide and Conquer paradigm, so that processors act on their own data and are scheduled in a parallel fashion. MapReduce is a programming model that follows this paradigm and allows efficient solutions to be defined, both by decomposing a problem into steps on subsets of the input data and by combining the results of each step to produce the final results. Although used to implement a wide variety of computational problems, MapReduce performance can suffer whenever the replication factor grows or the size of the input exceeds the resources available at each processor. In this paper we show an alternative approach to implementing the Divide and Conquer paradigm, named the dynamic pipeline. The main features of dynamic pipelines are illustrated on a parallel implementation of the well-known problem of counting triangles in a graph. This problem is especially interesting when the input graph does not fit in memory or is generated dynamically. To evaluate the properties of the dynamic pipeline, a dynamic pipeline of processes and an ad hoc version of MapReduce are implemented in the language Go, exploiting its ability to deal with channels and spawned processes. An empirical evaluation is conducted on graphs of different topologies, sizes, and densities. The observed results suggest that dynamic pipelines allow for an efficient implementation of triangle counting, particularly in large, dense graphs, drastically reducing the execution time with respect to the MapReduce implementation.
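For reference, the underlying combinatorial task is easy to state. The serial sketch below (plain Python, not the paper's Go pipeline or MapReduce code) counts each triangle once per vertex and divides by three.

```python
from itertools import combinations

def count_triangles(edges):
    """Count triangles: for each node, count the connected pairs among its
    neighbours; every triangle is found three times, once per vertex."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    found = 0
    for node, nbrs in adj.items():
        for a, b in combinations(nbrs, 2):
            if b in adj[a]:
                found += 1
    return found // 3

# The complete graph on four vertices contains four triangles.
k4 = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3)]
```

The dynamic-pipeline and MapReduce versions parallelize essentially this per-node work, which is why dense graphs (many neighbour pairs per node) are the stressful case discussed in the abstract.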

  3. Bayesian analysis of energy and count rate data for detection of low count rate radioactive sources.

    Science.gov (United States)

    Klumpp, John; Brandl, Alexander

    2015-03-01

A particle counting and detection system is proposed that searches for elevated count rates in multiple energy regions simultaneously. The system analyzes time-interval data (e.g., time between counts), as this has been shown to be a more sensitive technique for detecting low count rate sources than analyzing counts per unit interval (Luo et al. 2013). Two distinct versions of the detection system are developed. The first is intended for situations in which the sample is fixed and can be measured for an unlimited amount of time. The second version is intended to detect sources that are physically moving relative to the detector, such as a truck moving past a fixed roadside detector or a waste storage facility surveyed from an airplane. In both cases, the detection system is expected to be active indefinitely; i.e., it is an online detection system. Both versions of the multi-energy detection system are compared to their respective gross count rate detection systems in terms of Type I and Type II error rates and sensitivity.
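A toy simulation (rates invented, none of the paper's Bayesian machinery) shows why time-interval data carry the signal: a source shortens the typical gap between counts, so a test on intervals can react to a run of short gaps without waiting out a full counting interval.

```python
import numpy as np

rng = np.random.default_rng(1)

bkg_rate, src_rate = 5.0, 2.0   # counts per second, illustrative only

# For a Poisson counting process the times between counts are
# exponential with mean 1/rate; adding a source shifts the whole
# interval distribution toward shorter gaps.
gaps_bkg = rng.exponential(1.0 / bkg_rate, size=5000)
gaps_src = rng.exponential(1.0 / (bkg_rate + src_rate), size=5000)
```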

  4. A simulator for airborne laser swath mapping via photon counting

    Science.gov (United States)

    Slatton, K. C.; Carter, W. E.; Shrestha, R.

    2005-06-01

Commercially marketed airborne laser swath mapping (ALSM) instruments currently use laser rangers with sufficient energy per pulse to work with return signals of thousands of photons per shot. The resulting high signal-to-noise level virtually eliminates spurious range values caused by noise, such as background solar radiation and sensor thermal noise. However, the high signal level approach requires laser repetition rates of hundreds of thousands of pulses per second to obtain contiguous coverage of the terrain at sub-meter spatial resolution, and with currently available technology, affords little scalability for significantly downsizing the hardware or reducing the costs. A photon-counting ALSM sensor has been designed by the University of Florida and Sigma Space, Inc. for improved topographic mapping with lower power requirements and weight than traditional ALSM sensors. Major elements of the sensor design are presented along with preliminary simulation results. The simulator is being developed so that data phenomenology and target detection potential can be investigated before the system is completed. Early simulations suggest that precise estimates of terrain elevation and target detection will be possible with the sensor design.

  5. DAWN GRAND MAP CERES TPE NEUTRON COUNTS V1.0

    Data.gov (United States)

National Aeronautics and Space Administration — A global map of thermal+epithermal neutron counting rates binned on twenty-degree quasi-equal-area pixels is provided. The map was determined from a time series of the...

  6. Preparation of source mounts for 4π counting

    International Nuclear Information System (INIS)

    Johnson, E.P.

    1991-01-01

    The 4πβ/γ counter in the ANSTO radioisotope standards laboratory at Lucas Heights constitutes part of the Australian national standard for radioactivity. Sources to be measured in the counter must be mounted on a substrate which is strong enough to withstand careful handling and transport. The substrate must also be electrically conducting to minimise counting errors caused by charging of the source, and it must have very low superficial density so that little or none of the radiation is absorbed. The entire process of fabrication of VYNS films, coating them with gold/palladium and transferring them to source mount rings, as carried out in the radioisotope standards laboratory, is documented. 3 refs., 2 tabs., 6 figs

  7. Calculation of the counting efficiency for extended sources

    International Nuclear Information System (INIS)

    Korun, M.; Vidmar, T.

    2002-01-01

A computer program for calculation of efficiency calibration curves for extended samples counted on gamma- and X-ray spectrometers is described. The program calculates efficiency calibration curves for homogeneous cylindrical samples placed coaxially with the symmetry axis of the detector. The method of calculation is based on integration, over the sample volume, of the efficiencies for point sources measured in free space on an equidistant grid of points. The attenuation of photons within the sample is taken into account using the self-attenuation function calculated with a two-dimensional detector model. (author)
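The volume-integration step can be sketched numerically. The point-source efficiency function below is an invented placeholder for the measured free-space grid, and the self-attenuation correction is omitted.

```python
import numpy as np

def point_efficiency(r, z):
    """Stand-in for measured point-source efficiencies at radial offset r
    and height z above the detector (cm); purely illustrative numbers."""
    return 0.1 * np.exp(-0.05 * np.hypot(r, z))

# Homogeneous cylindrical sample, coaxial with the detector symmetry
# axis: average the point efficiencies over the sample volume, weighting
# by the cylindrical shell volume element (proportional to r).
R, H, gap = 3.0, 5.0, 1.0          # cm: sample radius, height, standoff
r = np.linspace(0.0, R, 200)
z = np.linspace(gap, gap + H, 200)
rr, zz = np.meshgrid(r, z)
eff = np.sum(point_efficiency(rr, zz) * rr) / np.sum(rr)
```

In the actual program each grid value would additionally be weighted by the self-attenuation function before averaging.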

  8. Observations of the Hubble Deep Field with the Infrared Space Observatory .3. Source counts and P(D) analysis

    DEFF Research Database (Denmark)

    Oliver, S.J.; Goldschmidt, P.; Franceschini, A.

    1997-01-01

We present source counts at 6.7 and 15 μm from our maps of the Hubble Deep Field (HDF) region, reaching 38.6 μJy at 6.7 μm and 255 μJy at 15 μm. These are the first ever extragalactic number counts to be presented at 6.7 μm, and are three decades fainter than IRAS at 12 μm. Both...

  9. Bayesian analysis of energy and count rate data for detection of low count rate radioactive sources

    Energy Technology Data Exchange (ETDEWEB)

    Klumpp, John [Colorado State University, Department of Environmental and Radiological Health Sciences, Molecular and Radiological Biosciences Building, Colorado State University, Fort Collins, Colorado, 80523 (United States)

    2013-07-01

    We propose a radiation detection system which generates its own discrete sampling distribution based on past measurements of background. The advantage to this approach is that it can take into account variations in background with respect to time, location, energy spectra, detector-specific characteristics (i.e. different efficiencies at different count rates and energies), etc. This would therefore be a 'machine learning' approach, in which the algorithm updates and improves its characterization of background over time. The system would have a 'learning mode,' in which it measures and analyzes background count rates, and a 'detection mode,' in which it compares measurements from an unknown source against its unique background distribution. By characterizing and accounting for variations in the background, general purpose radiation detectors can be improved with little or no increase in cost. The statistical and computational techniques to perform this kind of analysis have already been developed. The necessary signal analysis can be accomplished using existing Bayesian algorithms which account for multiple channels, multiple detectors, and multiple time intervals. Furthermore, Bayesian machine-learning techniques have already been developed which, with trivial modifications, can generate appropriate decision thresholds based on the comparison of new measurements against a nonparametric sampling distribution. (authors)
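A minimal stand-in for the "learning mode" / "detection mode" split described above (the proposed system is Bayesian and multi-channel; this sketch uses a single channel and a plain empirical p-value, with invented rates):

```python
import numpy as np

rng = np.random.default_rng(2)

# Learning mode: accumulate background counts per interval and keep the
# empirical (nonparametric) sampling distribution itself, so any
# location- or detector-specific quirks are baked into it.
background = rng.poisson(lam=12.0, size=10_000)

def p_value(observed):
    """Detection mode: fraction of learned background intervals at least
    as large as the new measurement (one-sided empirical test)."""
    return float(np.mean(background >= observed))
```

A clearly elevated measurement then looks improbable against the learned distribution, while a typical one does not; the paper's Bayesian algorithms replace this simple tail fraction with proper decision thresholds over multiple channels, detectors, and time intervals.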

  10. Mapping the layer count of few-layer hexagonal boron nitride at high lateral spatial resolutions

    Science.gov (United States)

    Mohsin, Ali; Cross, Nicholas G.; Liu, Lei; Watanabe, Kenji; Taniguchi, Takashi; Duscher, Gerd; Gu, Gong

    2018-01-01

    Layer count control and uniformity of two dimensional (2D) layered materials are critical to the investigation of their properties and to their electronic device applications, but methods to map 2D material layer count at nanometer-level lateral spatial resolutions have been lacking. Here, we demonstrate a method based on two complementary techniques widely available in transmission electron microscopes (TEMs) to map the layer count of multilayer hexagonal boron nitride (h-BN) films. The mass-thickness contrast in high-angle annular dark-field (HAADF) imaging in the scanning transmission electron microscope (STEM) mode allows for thickness determination in atomically clean regions with high spatial resolution (sub-nanometer), but is limited by surface contamination. To complement, another technique based on the boron K ionization edge in the electron energy loss spectroscopy spectrum (EELS) of h-BN is developed to quantify the layer count so that surface contamination does not cause an overestimate, albeit at a lower spatial resolution (nanometers). The two techniques agree remarkably well in atomically clean regions with discrepancies within  ±1 layer. For the first time, the layer count uniformity on the scale of nanometers is quantified for a 2D material. The methodology is applicable to layer count mapping of other 2D layered materials, paving the way toward the synthesis of multilayer 2D materials with homogeneous layer count.

  11. Radiation Source Mapping with Bayesian Inverse Methods

    Science.gov (United States)

    Hykes, Joshua Michael

We present a method to map the spectral and spatial distributions of radioactive sources using a small number of detectors. Locating and identifying radioactive materials is important for border monitoring, accounting for special nuclear material in processing facilities, and in clean-up operations. Most methods to analyze these problems make restrictive assumptions about the distribution of the source. In contrast, the source-mapping method presented here allows an arbitrary three-dimensional distribution in space and a flexible group and gamma peak distribution in energy. To apply the method, the system's geometry and materials must be known. A probabilistic Bayesian approach is used to solve the resulting inverse problem (IP) since the system of equations is ill-posed. The probabilistic approach also provides estimates of the confidence in the final source map prediction. A set of adjoint-flux discrete ordinates solutions, obtained in this work with the Denovo code, is required to efficiently compute detector responses from a candidate source distribution. These adjoint fluxes are then used to form the linear model to map the state space to the response space. The test for the method is simultaneously locating a set of 137Cs and 60Co gamma sources in an empty room. This test problem is solved using synthetic measurements generated by a Monte Carlo (MCNP) model and using experimental measurements that we collected for this purpose. With the synthetic data, the predicted source distributions identified the locations of the sources to within tens of centimeters, in a room with an approximately four-by-four meter floor plan. Most of the predicted source intensities were within a factor of ten of their true value. The chi-square value of the predicted source was within a factor of five of the expected value based on the number of measurements employed. With a favorable uniform initial guess, the predicted source map was nearly identical to the true distribution.

  12. The Herschel Virgo Cluster Survey. XVII. SPIRE point-source catalogs and number counts

    Science.gov (United States)

    Pappalardo, Ciro; Bendo, George J.; Bianchi, Simone; Hunt, Leslie; Zibetti, Stefano; Corbelli, Edvige; di Serego Alighieri, Sperello; Grossi, Marco; Davies, Jonathan; Baes, Maarten; De Looze, Ilse; Fritz, Jacopo; Pohlen, Michael; Smith, Matthew W. L.; Verstappen, Joris; Boquien, Médéric; Boselli, Alessandro; Cortese, Luca; Hughes, Thomas; Viaene, Sebastien; Bizzocchi, Luca; Clemens, Marcel

    2015-01-01

    Aims: We present three independent catalogs of point-sources extracted from SPIRE images at 250, 350, and 500 μm, acquired with the Herschel Space Observatory as a part of the Herschel Virgo Cluster Survey (HeViCS). The catalogs have been cross-correlated to consistently extract the photometry at SPIRE wavelengths for each object. Methods: Sources have been detected using an iterative loop. The source positions are determined by estimating the likelihood to be a real source for each peak on the maps, according to the criterion defined in the sourceExtractorSussextractor task. The flux densities are estimated using the sourceExtractorTimeline, a timeline-based point source fitter that also determines the fitting procedure with the width of the Gaussian that best reproduces the source considered. Afterwards, each source is subtracted from the maps, removing a Gaussian function in every position with the full width half maximum equal to that estimated in sourceExtractorTimeline. This procedure improves the robustness of our algorithm in terms of source identification. We calculate the completeness and the flux accuracy by injecting artificial sources in the timeline and estimate the reliability of the catalog using a permutation method. Results: The HeViCS catalogs contain about 52 000, 42 200, and 18 700 sources selected at 250, 350, and 500 μm above 3σ and are ~75%, 62%, and 50% complete at flux densities of 20 mJy at 250, 350, 500 μm, respectively. We then measured source number counts at 250, 350, and 500 μm and compare them with previous data and semi-analytical models. We also cross-correlated the catalogs with the Sloan Digital Sky Survey to investigate the redshift distribution of the nearby sources. From this cross-correlation, we select ~2000 sources with reliable fluxes and a high signal-to-noise ratio, finding an average redshift z ~ 0.3 ± 0.22 and 0.25 (16-84 percentile). Conclusions: The number counts at 250, 350, and 500 μm show an increase in

  13. Degree of polarization and source counts of faint radio sources from Stacking Polarized intensity

    International Nuclear Information System (INIS)

    Stil, J. M.; George, S. J.; Keller, B. W.; Taylor, A. R.

    2014-01-01

    We present stacking polarized intensity as a means to study the polarization of sources that are too faint to be detected individually in surveys of polarized radio sources. Stacking offers not only high sensitivity to the median signal of a class of radio sources, but also avoids a detection threshold in polarized intensity, and therefore an arbitrary exclusion of sources with a low percentage of polarization. Correction for polarization bias is done through a Monte Carlo analysis and tested on a simulated survey. We show that the nonlinear relation between the real polarized signal and the detected signal requires knowledge of the shape of the distribution of fractional polarization, which we constrain using the ratio of the upper quartile to the lower quartile of the distribution of stacked polarized intensities. Stacking polarized intensity for NRAO VLA Sky Survey (NVSS) sources down to the detection limit in Stokes I, we find a gradual increase in median fractional polarization that is consistent with a trend that was noticed before for bright NVSS sources, but is much more gradual than found by previous deep surveys of radio polarization. Consequently, the polarized radio source counts derived from our stacking experiment predict fewer polarized radio sources for future surveys with the Square Kilometre Array and its pathfinders.

  14. How Fred Hoyle Reconciled Radio Source Counts and the Steady State Cosmology

    Science.gov (United States)

    Ekers, Ron

    2012-09-01

    In 1969 Fred Hoyle invited me to his Institute of Theoretical Astronomy (IOTA) in Cambridge to work with him on the interpretation of the radio source counts. This was a period of extreme tension with Ryle just across the road using the steep slope of the radio source counts to argue that the radio source population was evolving and Hoyle maintaining that the counts were consistent with the steady state cosmology. Both of these great men had made some correct deductions but they had also both made mistakes. The universe was evolving, but the source counts alone could tell us very little about cosmology. I will try to give some indication of the atmosphere and the issues at the time and look at what we can learn from this saga. I will conclude by briefly summarising the exponential growth of the size of the radio source counts since the early days and ask whether our understanding has grown at the same rate.

  15. Reconstruction of multiple line source attenuation maps

    International Nuclear Information System (INIS)

    Celler, A.; Sitek, A.; Harrop, R.

    1996-01-01

A simple transmission-source configuration for single photon emission computed tomography (SPECT) was proposed, which utilizes a series of collimated line sources parallel to the axis of rotation of the camera. The detector is equipped with a standard parallel-hole collimator. We have demonstrated that this type of source configuration can be used to generate sufficient data for the reconstruction of the attenuation map when using 8-10 line sources spaced by 3.5-4.5 cm for a 30 × 40 cm detector at 65 cm distance from the sources. Transmission data for a nonuniform thorax phantom were simulated, then binned and reconstructed using filtered backprojection (FBP) and iterative methods. The optimum maps are obtained with data binned into 2-3 bins and FBP reconstruction. The activity in the source was investigated for uniform and exponential activity distributions, as well as the effect of gaps and overlaps of the neighboring fan beams. A prototype of the line source has been built and experimental verification of the technique has started.

  16. 2π proportional counting chamber for large-area-coated β sources

    Indian Academy of Sciences (India)

Pramana – Journal of Physics, Volume 86, Issue 6. 2π proportional counting chamber for large-area-coated β sources ... A provision is made for change of the source and immediate measurement of source activity. These sources are used to calibrate the efficiency of contamination monitors at radiological ...

  17. Radiation measurement practice for understanding statistical fluctuation of radiation count using natural radiation sources

    International Nuclear Information System (INIS)

    Kawano, Takao

    2014-01-01

It is known that radiation is detected at random and that radiation counts fluctuate statistically. In the present study, a radiation measurement experiment was performed to understand the randomness and statistical fluctuation of radiation counts. In the measurement, three natural radiation sources were used. The sources were fabricated from potassium chloride chemicals, chemical fertilizers and kelps. These materials contain naturally occurring potassium-40, which is a radionuclide. Nine teachers from high schools, junior high schools and elementary schools participated in the radiation measurement experiment. Each participant measured the 1-min integration counts of radiation five times using GM survey meters, and 45 sets of data were obtained for the respective natural radiation sources. It was found that the frequency of occurrence of radiation counts was distributed according to a Gaussian distribution curve, although the 45 data sets of radiation counts superficially appeared to fluctuate without pattern. (author)
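The classroom exercise is easy to reproduce in simulation (the count rate below is an assumed value standing in for the KCl, fertilizer and kelp sources): individual one-minute counts look erratic, but their spread matches counting statistics, with a standard deviation near the square root of the mean.

```python
import numpy as np

rng = np.random.default_rng(3)

# 9 participants x 5 one-minute measurements = 45 counts per source.
counts = rng.poisson(lam=100.0, size=45)

mean = counts.mean()
std = counts.std(ddof=1)
# For Poisson data the standard deviation is close to sqrt(mean); for a
# mean this large the histogram of counts also approaches the Gaussian
# curve observed in the exercise.
```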

  18. Cosmology from angular size counts of extragalactic radio sources

    International Nuclear Information System (INIS)

    Kapahi, V.K.

    1975-01-01

The cosmological implications of the observed angular sizes of extragalactic radio sources are investigated using (i) the log N–log θ relation, where N is the number of sources with an angular size greater than a value θ, for the complete sample of 3CR sources, and (ii) the θ_median vs flux density (S) relation derived from the 3CR, the All-sky, and the Ooty occultation surveys, spanning a flux density range of about 300:1. The method of estimating the expected N(θ) and θ_m(S) relations for a uniform distribution of sources in space is outlined. Since values of θ > ~100 arcsec in the 3C sample arise from sources of small z, the slope of the N(θ) relation in this range is practically independent of the world model and the distribution of source sizes, but depends strongly on the radio luminosity function (RLF). From the observed slope the RLF is derived, in the luminosity range of about 10^23 to 10^26 W Hz^-1 sr^-1 at 178 MHz, to be of the form ρ(P)dP ∝ P^-2.1 dP. It is shown that the angular size data provide independent evidence of evolution in source properties with epoch. It is difficult to explain the data with the simple steady-state theory even if identified QSOs are excluded from the source samples and a local deficiency of strong sources is postulated. The simplest evolutionary scheme that fits the data in the Einstein-de Sitter cosmology indicates that (a) the local RLF steepens considerably at high luminosities, (b) the comoving density of high-luminosity sources increases with z in a manner similar to that implied by the log N–log S data and by the V/V_m test for QSOs, and (c) the mean physical sizes of radio sources evolve with z approximately as (1+z)^-1. Similar evolutionary effects appear to be present for QSOs as well as radio galaxies. (author)

  19. Mapping of the extinction in Giant Molecular Clouds using optical star counts

    OpenAIRE

    Cambresy, L.

    1999-01-01

This paper presents large scale extinction maps of most nearby Giant Molecular Clouds of the Galaxy (Lupus, rho-Ophiuchus, Scorpius, Coalsack, Taurus, Chamaeleon, Musca, Corona Australis, Serpens, IC 5146, Vela, Orion, Monoceros R1 and R2, Rosette, Carina) derived from a star count method using an adaptive grid and a wavelet decomposition applied to the optical data provided by the USNO-Precision Measuring Machine. The distribution of the extinction in the clouds leads to an estimate of their total...

  20. Testing the count rate performance of the scintillation camera by exponential attenuation: Decaying source; Multiple filters

    International Nuclear Information System (INIS)

    Adams, R.; Mena, I.

    1988-01-01

An algorithm and two FORTRAN programs have been developed to evaluate the count rate performance of scintillation cameras from count rates reduced exponentially, either by a decaying source or by filtration. The first method is used with short-lived radionuclides such as 191mIr or 191mAu. The second implements a National Electrical Manufacturers Association (NEMA) protocol in which the count rate from a 191mTc source is attenuated by a varying number of copper filters stacked over it. The count rate at each data point is corrected for deadtime loss after assigning an arbitrary deadtime (tau). A second-order polynomial is fitted to the logarithms of the net count rate values: ln(R) = A + BT + CT^2, where R is the net corrected count rate (cps) and T is the elapsed time (or the filter thickness in the NEMA method). Depending on C, tau is incremented or decremented iteratively, and the count rate corrections and curve fittings are repeated until C approaches zero, indicating a correct value of the deadtime (tau). The program then plots the measured count rate versus the corrected count rate values.
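The fit-and-iterate loop can be reconstructed as follows. This is a hedged sketch with invented numbers, assuming a non-paralyzable deadtime model and using bisection on tau in place of whatever increment/decrement rule the original programs use.

```python
import numpy as np

# Synthetic decaying-source run; deadtime, initial rate and half-life
# are invented for illustration.
tau_true = 5e-6                       # s, detector deadtime to recover
half_life = 4940.0                    # s
lam = np.log(2) / half_life
t = np.linspace(0.0, 10000.0, 40)     # s, elapsed time of each point
true_rate = 2e5 * np.exp(-lam * t)                     # cps, ideal decay
measured = true_rate / (1.0 + true_rate * tau_true)    # cps, with losses

def quad_coeff(tau):
    """Quadratic coefficient C of ln(R) = A + B*T + C*T^2 after
    correcting the measured rates with a trial deadtime tau."""
    corrected = measured / (1.0 - measured * tau)
    return np.polyfit(t, np.log(corrected), 2)[0]

# Under-correction leaves the early (high-rate) points low, bending the
# fit one way (C < 0); over-correction bends it the other way. Bisect on
# tau until the quadratic term vanishes, i.e. ln(R) is a straight line.
lo, hi = 0.0, 9e-6
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if quad_coeff(mid) < 0.0:
        lo = mid
    else:
        hi = mid
tau_est = 0.5 * (lo + hi)
```

At the correct tau the corrected decay curve is exactly exponential, so ln(R) versus T is a straight line and the fitted C vanishes, which is the stopping criterion described in the abstract.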

  1. Sources of uncertainty in flood inundation maps

    Science.gov (United States)

    Bales, J.D.; Wagner, C.R.

    2009-01-01

    Flood inundation maps typically have been used to depict inundated areas for floods having specific exceedance levels. The uncertainty associated with the inundation boundaries is seldom quantified, in part, because all of the sources of uncertainty are not recognized and because data available to quantify uncertainty seldom are available. Sources of uncertainty discussed in this paper include hydrologic data used for hydraulic model development and validation, topographic data, and the hydraulic model. The assumption of steady flow, which typically is made to produce inundation maps, has less of an effect on predicted inundation at lower flows than for higher flows because more time typically is required to inundate areas at high flows than at low flows. Difficulties with establishing reasonable cross sections that do not intersect and that represent water-surface slopes in tributaries contribute additional uncertainties in the hydraulic modelling. As a result, uncertainty in the flood inundation polygons simulated with a one-dimensional model increases with distance from the main channel.

  2. The European large area ISO survey - III. 90-mu m extragalactic source counts

    DEFF Research Database (Denmark)

    Efstathiou, A.; Oliver, S.; Rowan-Robinson, M.

    2000-01-01

We present results and source counts at 90 μm extracted from the preliminary analysis of the European Large Area ISO Survey (ELAIS). The survey covered about 12 deg^2 of the sky in four main areas and was carried out with the ISOPHOT instrument onboard the Infrared Space Observatory (ISO) ... or small groups of galaxies, suggesting that the sample may include a significant fraction of luminous infrared galaxies. The source counts extracted from a reliable subset of the detected sources are in agreement with strongly evolving models of the starburst galaxy population.

  3. Performance of 3DOSEM and MAP algorithms for reconstructing low count SPECT acquisitions

    Energy Technology Data Exchange (ETDEWEB)

    Grootjans, Willem [Radboud Univ. Medical Center, Nijmegen (Netherlands). Dept. of Radiology and Nuclear Medicine; Leiden Univ. Medical Center (Netherlands). Dept. of Radiology; Meeuwis, Antoi P.W.; Gotthardt, Martin; Visser, Eric P. [Radboud Univ. Medical Center, Nijmegen (Netherlands). Dept. of Radiology and Nuclear Medicine; Slump, Cornelis H. [Univ. Twente, Enschede (Netherlands). MIRA Inst. for Biomedical Technology and Technical Medicine; Geus-Oei, Lioe-Fee de [Radboud Univ. Medical Center, Nijmegen (Netherlands). Dept. of Radiology and Nuclear Medicine; Univ. Twente, Enschede (Netherlands). MIRA Inst. for Biomedical Technology and Technical Medicine; Leiden Univ. Medical Center (Netherlands). Dept. of Radiology

    2016-07-01

Low count single photon emission computed tomography (SPECT) is becoming more important in view of whole-body SPECT and reduction of radiation dose. In this study, we investigated the performance of several 3D ordered subset expectation maximization (3DOSEM) and maximum a posteriori (MAP) algorithms for reconstructing low count SPECT images. Phantom experiments were conducted using the National Electrical Manufacturers Association (NEMA) NU2 image quality (IQ) phantom. The background compartment of the phantom was filled with varying concentrations of pertechnetate and indium chloride, simulating various clinical imaging conditions. Images were acquired using a hybrid SPECT/CT scanner and reconstructed with 3DOSEM and MAP reconstruction algorithms implemented in Siemens Syngo MI.SPECT (Flash3D) and Hermes Hybrid Recon Oncology (Hybrid Recon 3DOSEM and MAP). Image analysis was performed by calculating the contrast recovery coefficient (CRC), percentage background variability (N%), and contrast-to-noise ratio (CNR), defined as the ratio between CRC and N%. Furthermore, image distortion was characterized by calculating the aspect ratio (AR) of ellipses fitted to the hot spheres. Additionally, the performance of these algorithms in reconstructing clinical images was investigated. Images reconstructed with 3DOSEM algorithms demonstrated superior image quality in terms of contrast and resolution recovery when compared to images reconstructed with filtered back-projection (FBP), OSEM, and 2DOSEM. However, the occurrence of correlated noise patterns and image distortions significantly deteriorated the quality of 3DOSEM reconstructed images. The mean AR for the 37, 28, 22, and 17 mm spheres was 1.3, 1.3, 1.6, and 1.7, respectively. The mean N% increased in high and low count Flash3D and Hybrid Recon 3DOSEM from 5.9% and 4.0% to 11.1% and 9.0%, respectively. Similarly, the mean CNR decreased in high and low count Flash3D and Hybrid Recon 3DOSEM from 8.7 and 8.8 to 3.6 and 4

  4. Extragalactic sources in Cosmic Microwave Background maps

    Energy Technology Data Exchange (ETDEWEB)

    Zotti, G. De; Castex, G. [SISSA, via Bonomea 265, 34136 Trieste (Italy); González-Nuevo, J. [Departamento de Física, Universidad de Oviedo, C. Calvo Sotelo s/n, 33007 Oviedo (Spain); Lopez-Caniego, M. [European Space Agency, ESAC, Planck Science Office, Camino bajo del Castillo, s/n, Urbanización Villafranca del Castillo, Villanueva de la Cañada, Madrid (Spain); Negrello, M.; Clemens, M. [INAF-Osservatorio Astronomico di Padova, vicolo dell' Osservatorio 5, I-35122 Padova (Italy); Cai, Z.-Y. [CAS Key Laboratory for Research in Galaxies and Cosmology, Department of Astronomy, University of Science and Technology of China, Hefei, Anhui 230026 (China); Delabrouille, J. [APC, 10, rue Alice Domon et Léonie Duquet, 75205 Paris Cedex 13 (France); Herranz, D.; Bonavera, L. [Instituto de Física de Cantabria (CSIC-UC), avda. los Castros s/n, 39005 Santander (Spain); Melin, J.-B. [DSM/Irfu/SPP, CEA-Saclay, F-91191 Gif-sur-Yvette Cedex (France); Tucci, M. [Département de Physique Théorique and Center for Astroparticle Physics, Université de Genève, 24 quai Ansermet, CH-1211 Genève 4 (Switzerland); Serjeant, S. [Department of Physical Sciences, The Open University, Walton Hall, Milton Keynes MK7 6AA (United Kingdom); Bilicki, M. [Astrophysics, Cosmology and Gravity Centre, Department of Astronomy, University of Cape Town, Private Bag X3, Rondebosch (South Africa); Andreani, P., E-mail: gianfranco.dezotti@oapd.inaf.it, E-mail: gcastex@sissa.it, E-mail: gnuevo@uniovi.es, E-mail: marcos.lopez.caniego@sciops.esa.int [European Southern Observatory, Karl-Schwarzschild-Straße 2, D-85748, Garching (Germany); and others

    2015-06-01

    We discuss the potential of a next generation space-borne CMB experiment for studies of extragalactic sources with reference to COrE+, a project submitted to ESA in response to the call for a Medium-size mission (M4). We consider three possible options for the telescope size: 1 m, 1.5 m and 2 m (although the last option is probably impractical, given the M4 boundary conditions). The proposed instrument will be far more sensitive than Planck and will have a diffraction-limited angular resolution. These properties imply that even the 1 m telescope option will perform substantially better than Planck for studies of extragalactic sources. The source detection limits as a function of frequency have been estimated by means of realistic simulations taking into account all the relevant foregrounds. Predictions for the various classes of extragalactic sources are based on up-to-date models. The most significant improvements over Planck results are presented for each option. COrE+ will provide much larger samples of truly local star-forming galaxies (by about a factor of 8 for the 1 m telescope, of 17 for 1.5 m, of 30 for 2 m), making possible analyses of the properties of galaxies (luminosity functions, dust mass functions, star formation rate functions, dust temperature distributions, etc.) across the Hubble sequence. Even more interestingly, COrE+ will detect, at |b| > 30°, thousands of strongly gravitationally lensed galaxies (about 2,000, 6,000 and 13,000 for the 1 m, 1.5 m and 2 m options, respectively). Such large samples are of extraordinary astrophysical and cosmological value in many fields. Moreover, COrE+ high frequency maps will be optimally suited to pick up proto-clusters of dusty galaxies, i.e. to investigate the evolution of large scale structure at larger redshifts than can be reached by other means. Thanks to its high sensitivity COrE+ will also yield a spectacular advance in the blind detection of extragalactic sources in polarization: we expect that

  5. Extragalactic sources in Cosmic Microwave Background maps

    Science.gov (United States)

    De Zotti, G.; Castex, G.; González-Nuevo, J.; Lopez-Caniego, M.; Negrello, M.; Cai, Z.-Y.; Clemens, M.; Delabrouille, J.; Herranz, D.; Bonavera, L.; Melin, J.-B.; Tucci, M.; Serjeant, S.; Bilicki, M.; Andreani, P.; Clements, D. L.; Toffolatti, L.; Roukema, B. F.

    2015-06-01

    We discuss the potential of a next generation space-borne CMB experiment for studies of extragalactic sources with reference to COrE+, a project submitted to ESA in response to the call for a Medium-size mission (M4). We consider three possible options for the telescope size: 1 m, 1.5 m and 2 m (although the last option is probably impractical, given the M4 boundary conditions). The proposed instrument will be far more sensitive than Planck and will have a diffraction-limited angular resolution. These properties imply that even the 1 m telescope option will perform substantially better than Planck for studies of extragalactic sources. The source detection limits as a function of frequency have been estimated by means of realistic simulations taking into account all the relevant foregrounds. Predictions for the various classes of extragalactic sources are based on up-to-date models. The most significant improvements over Planck results are presented for each option. COrE+ will provide much larger samples of truly local star-forming galaxies (by about a factor of 8 for the 1 m telescope, of 17 for 1.5 m, of 30 for 2 m), making possible analyses of the properties of galaxies (luminosity functions, dust mass functions, star formation rate functions, dust temperature distributions, etc.) across the Hubble sequence. Even more interestingly, COrE+ will detect, at |b| > 30°, thousands of strongly gravitationally lensed galaxies (about 2,000, 6,000 and 13,000 for the 1 m, 1.5 m and 2 m options, respectively). Such large samples are of extraordinary astrophysical and cosmological value in many fields. Moreover, COrE+ high frequency maps will be optimally suited to pick up proto-clusters of dusty galaxies, i.e. to investigate the evolution of large scale structure at larger redshifts than can be reached by other means. 
Thanks to its high sensitivity COrE+ will also yield a spectacular advance in the blind detection of extragalactic sources in polarization: we expect that it

  6. Presurgical mapping with magnetic source imaging. Comparisons with intraoperative findings

    International Nuclear Information System (INIS)

    Roberts, T.P.L.; Ferrari, P.; Perry, D.; Rowley, H.A.; Berger, M.S.

    2000-01-01

    We compare noninvasive preoperative mapping with magnetic source imaging to intraoperative cortical stimulation mapping. These techniques were directly compared in 17 patients who underwent preoperative and postoperative somatosensory mapping of a total of 22 comparable anatomic sites (digits, face). Our findings are presented in the context of previous studies that used magnetic source imaging and functional magnetic resonance imaging as noninvasive surrogates of intraoperative mapping for the identification of sensorimotor and language-specific brain functional centers in patients with brain tumors. We found that magnetic source imaging results were reasonably concordant with intraoperative mapping findings in over 90% of cases, and that concordance could be defined as 'good' in 77% of cases. Magnetic source imaging therefore provides a viable, if coarse, identification of somatosensory areas and, consequently, can guide and reduce the time taken for intraoperative mapping procedures. (author)

  7. Two-part zero-inflated negative binomial regression model for quantitative trait loci mapping with count trait.

    Science.gov (United States)

    Moghimbeigi, Abbas

    2015-05-07

    Poisson regression models provide a standard framework for quantitative trait locus (QTL) mapping of count traits. In practice, however, count traits are often over-dispersed relative to the Poisson distribution. In these situations, zero-inflated Poisson (ZIP), zero-inflated generalized Poisson (ZIGP), and zero-inflated negative binomial (ZINB) regressions may be useful for QTL mapping of count traits. Genetic variables added to the negative binomial part of the model may also affect the extra zeros. In this study, to overcome these challenges, I apply a two-part ZINB model. An EM algorithm, with the Newton-Raphson method in the M-step, is used to estimate the parameters. An application of the two-part ZINB model for QTL mapping is considered, to detect associations between gallstone formation and the genotypes of markers. Copyright © 2015 Elsevier Ltd. All rights reserved.
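
For readers unfamiliar with the ZINB distribution underlying such models, a minimal sketch of its probability mass function is given below, using the mean/dispersion parameterisation with pi as the extra-zero mixing weight. This is a generic textbook form, not the author's estimation code.

```python
from math import lgamma, exp, log

def zinb_pmf(k, mu, r, pi):
    """Zero-inflated negative binomial pmf (illustrative sketch).
    pi: probability of a structural zero; mu: NB mean; r: dispersion."""
    # log NB(k; r, mu) in the mean/dispersion parameterisation
    log_nb = (lgamma(k + r) - lgamma(r) - lgamma(k + 1)
              + r * log(r / (r + mu)) + k * log(mu / (r + mu)))
    nb = exp(log_nb)
    # zeros come from both the inflation component and the NB itself
    return pi + (1.0 - pi) * nb if k == 0 else (1.0 - pi) * nb
```

The excess-zero feature is visible directly: with mu = 2, r = 1, pi = 0.3, the probability of observing zero is 0.3 plus 0.7 times the NB zero mass.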

  8. Optimizing the calculation of point source count-centroid in pixel size measurement

    International Nuclear Information System (INIS)

    Zhou Luyi; Kuang Anren; Su Xianyu

    2004-01-01

    Pixel size is an important parameter of gamma cameras and SPECT. A number of methods are used for its accurate measurement. In the original count-centroid method, where the image of a point source (PS) is acquired and its count-centroid calculated to represent the PS position in the image, background counts are inevitable. Thus the measured count-centroid (Xm) is an approximation of the true count-centroid (Xp) of the PS, i.e. Xm = Xp + (Xb - Xp)/(1 + Rp/Rb), where Rp is the net counting rate of the PS, Xb the background count-centroid, and Rb the background counting rate. To get an accurate measurement, Rp must be very large, which is impractical, resulting in variation of the measured pixel size. An Rp-independent calculation of the PS count-centroid is desired. Methods: The proposed method attempted to eliminate the effect of the term (Xb - Xp)/(1 + Rp/Rb) by bringing Xb closer to Xp and by reducing Rb. In the acquired PS image, a circular ROI was generated to enclose the PS, the pixel with the maximum count being the center of the ROI. To choose the diameter (D) of the ROI, a Gaussian count distribution was assumed for the PS; accordingly, K = 1 - (0.5)^(D/R) of the total PS counts was in the ROI, R being the full width at half maximum of the PS count distribution. D was set to 6R to enclose most (K = 98.4%) of the PS counts. The count-centroid of the ROI was calculated to represent Xp. The proposed method was tested in measuring the pixel size of a well-tuned SPECT, whose pixel size was estimated to be 3.02 mm according to its mechanical and electronic settings (128 x 128 matrix, 387 mm UFOV, ZOOM = 1). For comparison, the original method, which was used in former versions of some commercial SPECT software, was also tested. 12 PSs were prepared and their images acquired and stored. The net counting rate of the PSs increased from 10 cps to 1183 cps.
    Results: Using the proposed method, the measured pixel size (in mm) varied only between 3.00 and 3.01 (mean
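
The ROI-restricted centroid described in the Methods can be sketched as follows. This is an illustrative reimplementation, not the authors' code: it assumes a 2D image array whose hottest pixel marks the point source, and sets the ROI diameter to 6 times the FWHM as the record prescribes.

```python
import numpy as np

def roi_count_centroid(image, fwhm):
    """Count-centroid of a point source restricted to a circular ROI
    centred on the hottest pixel, with diameter D = 6 * FWHM."""
    cy, cx = np.unravel_index(np.argmax(image), image.shape)
    radius = 3.0 * fwhm  # D = 6*R  ->  radius = 3*R
    ys, xs = np.indices(image.shape)
    mask = (ys - cy) ** 2 + (xs - cx) ** 2 <= radius ** 2
    w = np.where(mask, image, 0.0).astype(float)
    total = w.sum()
    # weighted mean pixel coordinates inside the ROI
    return (w * xs).sum() / total, (w * ys).sum() / total
```

Because the ROI is symmetric about the source pixel, a flat background inside it no longer biases the centroid, which is the mechanism the record exploits to make the result Rp-independent.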

  9. Optimizing the calculation of point source count-centroid in pixel size measurement

    International Nuclear Information System (INIS)

    Zhou Luyi; Kuang Anren; Su Xianyu

    2004-01-01

    Purpose: Pixel size is an important parameter of gamma cameras and SPECT. A number of methods are used for its accurate measurement. In the original count-centroid method, where the image of a point source (PS) is acquired and its count-centroid calculated to represent the PS position in the image, background counts are inevitable. Thus the measured count-centroid (Xm) is an approximation of the true count-centroid (Xp) of the PS, i.e. Xm = Xp + (Xb - Xp)/(1 + Rp/Rb), where Rp is the net counting rate of the PS, Xb the background count-centroid and Rb the background counting rate. To get an accurate measurement, Rp must be very large, which is impractical, resulting in variation of the measured pixel size. An Rp-independent calculation of the PS count-centroid is desired. Methods: The proposed method attempted to eliminate the effect of the term (Xb - Xp)/(1 + Rp/Rb) by bringing Xb closer to Xp and by reducing Rb. In the acquired PS image, a circular ROI was generated to enclose the PS, the pixel with the maximum count being the center of the ROI. To choose the diameter (D) of the ROI, a Gaussian count distribution was assumed for the PS; accordingly, K = 1 - (0.5)^(D/R) of the total PS counts was in the ROI, R being the full width at half maximum of the PS count distribution. D was set to 6R to enclose most (K = 98.4%) of the PS counts. The count-centroid of the ROI was calculated to represent Xp. The proposed method was tested in measuring the pixel size of a well-tuned SPECT, whose pixel size was estimated to be 3.02 mm according to its mechanical and electronic settings (128 x 128 matrix, 387 mm UFOV, ZOOM = 1). For comparison, the original method, which was used in former versions of some commercial SPECT software, was also tested. 12 PSs were prepared and their images acquired and stored. The net counting rate of the PSs increased from 10 cps to 1183 cps.
    Results: Using the proposed method, the measured pixel size (in mm) varied only between 3.00 and 3.01 (mean = 3.01±0.00) as Rp increased

  10. Frequency Count Attribute Oriented Induction of Corporate Network Data for Mapping Business Activity

    Directory of Open Access Journals (Sweden)

    Tanutama Lukas

    2014-03-01

    Full Text Available Companies increasingly rely on the Internet for effective and efficient business communication. As the Information Technology infrastructure backbone for business activities, the corporate network connects the company to the Internet and enables its activities globally. It carries data packets generated by the activities of the users performing their business tasks. Traditionally, infrastructure operations mainly maintain data-carrying capacity and network device performance. It would be advantageous if a company knew what activities are running in its network. This research provides a simple method of mapping the business activity reflected by the network data. To map corporate users’ activities, a slightly modified Attribute Oriented Induction (AOI) approach to mining the network data was applied. The frequency with which each protocol was invoked was counted to show what the user intended to do. The collected data were samples taken within a certain sampling period; sampling was necessary because of the enormous number of data packets generated. Only Internet-related protocols were of interest, while intranet protocols were ignored. It can be concluded that the method can provide management with a general overview of the usage of its infrastructure and lead to an efficient, effective and secure ICT infrastructure.
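
The frequency-count step of the method reduces, in essence, to generalising each packet record to its protocol attribute and tallying occurrences, keeping only Internet-facing protocols. A toy sketch (the packet data and protocol set are invented for illustration):

```python
from collections import Counter

# Toy packet log: (source_ip, protocol) pairs such as a capture tool might emit.
packets = [
    ("10.0.0.5", "HTTP"), ("10.0.0.5", "HTTPS"), ("10.0.0.7", "SMTP"),
    ("10.0.0.5", "HTTP"), ("10.0.0.7", "DNS"), ("10.0.0.9", "HTTP"),
]

# Generalise each record to its protocol attribute and count frequencies,
# keeping only Internet-related protocols as the record describes.
internet_protocols = {"HTTP", "HTTPS", "SMTP", "DNS"}
freq = Counter(proto for _, proto in packets if proto in internet_protocols)
print(freq.most_common())  # HTTP dominates this toy sample
```

The resulting frequency table is the "map" of business activity: each protocol stands in for a class of user intent (browsing, mail, name lookup, and so on).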

  11. Frequency Count Attribute Oriented Induction of Corporate Network Data for Mapping Business Activity

    Science.gov (United States)

    Tanutama, Lukas

    2014-03-01

    Companies increasingly rely on the Internet for effective and efficient business communication. As the Information Technology infrastructure backbone for business activities, the corporate network connects the company to the Internet and enables its activities globally. It carries data packets generated by the activities of the users performing their business tasks. Traditionally, infrastructure operations mainly maintain data-carrying capacity and network device performance. It would be advantageous if a company knew what activities are running in its network. This research provides a simple method of mapping the business activity reflected by the network data. To map corporate users' activities, a slightly modified Attribute Oriented Induction (AOI) approach to mining the network data was applied. The frequency with which each protocol was invoked was counted to show what the user intended to do. The collected data were samples taken within a certain sampling period; sampling was necessary because of the enormous number of data packets generated. Only Internet-related protocols were of interest, while intranet protocols were ignored. It can be concluded that the method can provide management with a general overview of the usage of its infrastructure and lead to an efficient, effective and secure ICT infrastructure.

  12. Air Emissions Sources, Charts and Maps

    Data.gov (United States)

    U.S. Environmental Protection Agency — Air Emissions provides (1) interactive charts supporting national, state, or county charts, (2) county maps of criteria air pollutant emissions for a state, and (3)...

  13. Statistical measurement of the gamma-ray source-count distribution as a function of energy

    Science.gov (United States)

    Zechlin, H.-S.; Cuoco, A.; Donato, F.; Fornengo, N.; Regis, M.

    2017-01-01

    Photon count statistics have recently been proven to provide a sensitive observable for characterizing gamma-ray source populations and for measuring the composition of the gamma-ray sky. In this work, we generalize the use of the standard 1-point probability distribution function (1pPDF) to decompose the high-latitude gamma-ray emission observed with Fermi-LAT into: (i) point-source contributions, (ii) the Galactic foreground contribution, and (iii) a diffuse isotropic background contribution. We analyze gamma-ray data in five adjacent energy bands between 1 and 171 GeV. We measure the source-count distribution dN/dS as a function of energy, and demonstrate that our results extend current measurements from source catalogs to the regime of so far undetected sources. Our method improves the sensitivity for resolving point-source populations by about one order of magnitude in flux. The dN/dS distribution as a function of flux is found to be compatible with a broken power law. We derive upper limits on further possible breaks as well as the angular power of unresolved sources. We discuss the composition of the gamma-ray sky and capabilities of the 1pPDF method.
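
A broken power law for dN/dS, the functional form this work finds compatible with the data, can be written down directly. The sketch below uses a single break and continuity at the break; the slope values in the test are hypothetical, purely for illustration.

```python
def broken_power_law_dnds(s, s_break, n1, n2, norm=1.0):
    """dN/dS with a single break (sketch): power-law index n1 above
    s_break, n2 below, continuous and equal to norm at the break."""
    if s >= s_break:
        return norm * (s / s_break) ** (-n1)
    return norm * (s / s_break) ** (-n2)
```

Fitting such a form per energy band, as done here, turns the measured 1pPDF into a flux-resolved census of the point-source population.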

  14. Systematic management of sealed source and nucleonic counting system in field service

    International Nuclear Information System (INIS)

    Mahadi Mustapha; Mohd Fitri Abdul Rahman; Jaafar Abdullah

    2005-01-01

    The PAT group has received many service requests from oil and gas plants. All of these services use sealed sources and nucleonic counting systems. This paper describes the details of the management carried out before going into field service. This management is important to make sure the job is done smoothly and safely for radiation workers and the public. Furthermore, this management is in line with the regulations from LPTA. (Author)

  15. HIGH-RESOLUTION IMAGING OF THE ATLBS REGIONS: THE RADIO SOURCE COUNTS

    Energy Technology Data Exchange (ETDEWEB)

    Thorat, K.; Subrahmanyan, R.; Saripalli, L.; Ekers, R. D., E-mail: kshitij@rri.res.in [Raman Research Institute, C. V. Raman Avenue, Sadashivanagar, Bangalore 560080 (India)

    2013-01-01

    The Australia Telescope Low-brightness Survey (ATLBS) regions have been mosaic imaged at a radio frequency of 1.4 GHz with 6'' angular resolution and 72 μJy beam⁻¹ rms noise. The images (centered at R.A. 00h35m00s, decl. -67°00'00'' and R.A. 00h59m17s, decl. -67°00'00'', J2000 epoch) cover 8.42 deg² of sky area and have no artifacts or imaging errors above the image thermal noise. Multi-resolution radio and optical r-band images (made using the 4 m CTIO Blanco telescope) were used to recognize multi-component sources and prepare a source list; the detection threshold was 0.38 mJy in a low-resolution radio image made with a beam FWHM of 50''. Radio source counts in the flux density range 0.4-8.7 mJy are estimated, with corrections applied for noise bias, effective area, and resolution bias. The resolution bias is mitigated using low-resolution radio images, while effects of source confusion are removed by using high-resolution images for identifying blended sources. Below 1 mJy the ATLBS counts are systematically lower than previous estimates. Showing no evidence for an upturn down to 0.4 mJy, they do not require any changes in the radio source population down to the limit of the survey. The work suggests that automated image analysis for counts may be dependent on the ability of the imaging to reproduce connecting emission with low surface brightness and on the ability of the algorithm to recognize sources, which may require that source finding algorithms effectively work with multi-resolution and multi-wavelength data. The work underscores the importance of using source lists (as opposed to component lists) and of correcting for the noise bias in order to precisely estimate counts close to the image noise and determine the upturn at sub-mJy flux density.
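
The basic step behind such an estimate, before any of the bias corrections the record describes, is binning a source list in flux and normalising by bin width and survey area. A minimal sketch (the function name and units are our own):

```python
import numpy as np

def differential_counts(fluxes_mjy, bins_mjy, area_deg2):
    """Differential source counts dN/dS per unit survey area from a
    source list. Real analyses, as in this record, further correct for
    noise bias, effective area, and resolution bias."""
    counts, edges = np.histogram(fluxes_mjy, bins=bins_mjy)
    widths = np.diff(edges)
    return counts / (widths * area_deg2)  # sources deg^-2 mJy^-1
```

The record's point about source lists versus component lists enters before this step: multi-component sources must be merged into single entries of summed flux, or the bright bins are over-counted.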

  16. DEEP GALEX OBSERVATIONS OF THE COMA CLUSTER: SOURCE CATALOG AND GALAXY COUNTS

    International Nuclear Information System (INIS)

    Hammer, D.; Hornschemeier, A. E.; Miller, N.; Jenkins, L.; Mobasher, B.; Smith, R.; Arnouts, S.; Milliard, B.

    2010-01-01

    We present a source catalog from a deep 26 ks Galaxy Evolution Explorer (GALEX) observation of the Coma cluster in the far-UV (FUV; 1530 Å) and near-UV (NUV; 2310 Å) wavebands. The observed field is centered ∼0.9° (1.6 Mpc) southwest of the Coma core in a well-studied region of the cluster known as 'Coma-3'. The entire field is located within the apparent virial radius of the Coma cluster, and has optical photometric coverage with the Sloan Digital Sky Survey (SDSS) and deep spectroscopic coverage to r ∼ 21. We detect GALEX sources to NUV = 24.5 and FUV = 25.0, which corresponds to a star formation rate of ∼10⁻³ M_sun yr⁻¹ for galaxies at the distance of Coma. We have assembled a catalog of 9700 galaxies with GALEX and SDSS photometry, including 242 spectroscopically confirmed Coma member galaxies that span a large range of galaxy types, from giant spirals and elliptical galaxies to dwarf irregular and early-type galaxies. The full multi-wavelength catalog (cluster plus background galaxies) is ∼80% complete to NUV = 23 and FUV = 23.5. The GALEX images presented here are very deep and include detections of many resolved cluster members superposed on a dense field of unresolved background galaxies. This required a two-fold approach to generating a source catalog: we used a Bayesian deblending algorithm to measure faint and compact sources (using SDSS coordinates as a position prior), and used the GALEX pipeline catalog for bright and/or extended objects. We performed simulations to assess the importance of systematic effects (e.g., object blends, source confusion, Eddington bias) that influence the source detection and photometry when using both methods. The Bayesian deblending method roughly doubles the number of source detections and provides reliable photometry to a few magnitudes deeper than the GALEX pipeline catalog. This method is free from source confusion over the UV magnitude range studied here; we estimate that the GALEX pipeline catalogs are

  17. Deep Galex Observations of the Coma Cluster: Source Catalog and Galaxy Counts

    Science.gov (United States)

    Hammer, D.; Hornschemeier, A. E.; Mobasher, B.; Miller, N.; Smith, R.; Arnouts, S.; Milliard, B.; Jenkins, L.

    2010-01-01

    We present a source catalog from deep 26 ks GALEX observations of the Coma cluster in the far-UV (FUV; 1530 Å) and near-UV (NUV; 2310 Å) wavebands. The observed field is centered 0.9° (1.6 Mpc) southwest of the Coma core, and has full optical photometric coverage by SDSS and spectroscopic coverage to r ∼ 21. The catalog consists of 9700 galaxies with GALEX and SDSS photometry, including 242 spectroscopically confirmed Coma member galaxies that range from giant spirals and elliptical galaxies to dwarf irregular and early-type galaxies. The full multi-wavelength catalog (cluster plus background galaxies) is 80% complete to NUV = 23 and FUV = 23.5, and has a limiting depth at NUV = 24.5 and FUV = 25.0, which corresponds to a star formation rate of 10⁻³ solar masses yr⁻¹ at the distance of Coma. The GALEX images presented here are very deep and include detections of many resolved cluster members superposed on a dense field of unresolved background galaxies. This required a two-fold approach to generating a source catalog: we used a Bayesian deblending algorithm to measure faint and compact sources (using SDSS coordinates as a position prior), and used the GALEX pipeline catalog for bright and/or extended objects. We performed simulations to assess the importance of systematic effects (e.g. object blends, source confusion, Eddington bias) that influence source detection and photometry when using both methods. The Bayesian deblending method roughly doubles the number of source detections and provides reliable photometry to a few magnitudes deeper than the GALEX pipeline catalog. This method is also free from source confusion over the UV magnitude range studied here; conversely, we estimate that the GALEX pipeline catalogs are confusion limited at NUV approximately 23 and FUV approximately 24. We have measured the total UV galaxy counts using our catalog and report a 50% excess of counts across FUV=22-23.5 and NUV=21.5-23 relative to previous GALEX

  18. Determining ²⁵²Cf source strength by absolute passive neutron correlation counting

    Energy Technology Data Exchange (ETDEWEB)

    Croft, S. [Oak Ridge National Laboratory, Oak Ridge, TN 37831-6166 (United States); Henzlova, D., E-mail: henzlova@lanl.gov [Los Alamos National Laboratory, Los Alamos, NM 87545 (United States)

    2013-06-21

    Physically small, lightly encapsulated, radionuclide sources containing ²⁵²Cf are widely used for a vast variety of industrial, medical, educational and research applications requiring a convenient source of neutrons. For many quantitative applications, such as detector efficiency calibrations, the absolute strength of the neutron emission is needed. In this work we show how, by using a neutron multiplicity counter, the neutron emission rate can be obtained with high accuracy. This provides an independent and alternative way to create reference sources in-house for laboratories such as ours engaged in international safeguards metrology. The method makes use of the unique and well known properties of the ²⁵²Cf spontaneous fission system and applies advanced neutron correlation counting methods. We lay out the foundation of the method and demonstrate it experimentally. We show that accuracy comparable to the best methods currently used by national bodies to certify neutron source strengths is possible.
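
The point-model algebra behind this kind of measurement can be illustrated for a bare, non-multiplying 252Cf source. The sketch below is a textbook-style inversion, not the authors' procedure: it assumes singles S = F·ε·ν₁ and doubles D = F·ε²·f_d·ν₂/2 (F the fission rate, ε the detection efficiency, f_d the doubles gate fraction) and inverts them for ε and the emission rate S/ε, using commonly tabulated 252Cf multiplicity moments.

```python
# First two factorial moments of the 252Cf spontaneous-fission neutron
# multiplicity distribution (commonly tabulated values).
NU1 = 3.757   # <nu>
NU2 = 11.962  # <nu*(nu-1)>

def cf252_emission_rate(singles, doubles, fd):
    """Return (efficiency, neutron emission rate in n/s) from measured
    singles and doubles rates. Schematic point model: dead time,
    backgrounds and gate details are ignored."""
    # D/S = eps * fd * NU2 / (2 * NU1)  =>  solve for eps
    eps = 2.0 * doubles * NU1 / (singles * fd * NU2)
    # total neutron emission rate is F * NU1 = S / eps
    return eps, singles / eps
```

Because the 252Cf moments are known independently, the efficiency drops out of the final rate only through the measured ratio D/S, which is why the method can be absolute.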

  19. Limits to source counts and cosmic microwave background fluctuations at 10.6 GHz

    International Nuclear Information System (INIS)

    Seielstad, G.A.; Masson, C.R.; Berge, G.L.

    1981-01-01

    We have determined the distribution of deflections due to sky temperature fluctuations at 10.6 GHz. If all the deflections are due to fine structure in the cosmic microwave background, we limit these fluctuations to ΔT/T ≲ 10⁻⁴ on an angular scale of 11 arcmin. If, on the other hand, all the deflections are due to confusion among discrete radio sources, the areal density of these sources is calculated for various slopes of the differential source count relationship and for various cutoff flux densities. If, for example, the slope is 2.1 and the cutoff is 10 mJy, we find (0.25-3.3) x 10⁶ sources sr⁻¹ Jy⁻¹

  20. Nature or Nurture in finger counting: a review on the determinants of the direction of number-finger mapping

    Directory of Open Access Journals (Sweden)

    Paola Previtali

    2011-12-01

    Full Text Available The spontaneous use of finger counting has long been recognised as critical to the acquisition of number skills. Recently, the great interest in space-number associations has shifted attention to the practice of finger counting itself, and specifically to its spatial components. Besides general cross-cultural differences in mapping numbers onto fingers, contrasting results have been reported with regard to the directional features of this mapping. The key issue we address is to what extent directionality is culturally mediated, i.e., linked to the conventional direction of the reading-writing system, and/or biologically determined, i.e., linked to hand dominance. Although the preferred starting hand for counting seems to depend on the surveyed population, even within the same population high inter-individual variability minimises the role of cultural factors. Even if so far largely overlooked, handedness represents a sound candidate for shaping finger counting direction. Here we discuss adult and developmental evidence in support of this view and reconsider the plausibility of multiple and coexistent number-space mappings in physical and representational space.

  1. One, Two, Three, Four, Nothing More: An Investigation of the Conceptual Sources of the Verbal Counting Principles

    Science.gov (United States)

    Le Corre, Mathieu; Carey, Susan

    2007-01-01

    Since the publication of [Gelman, R., & Gallistel, C. R. (1978). "The child's understanding of number." Cambridge, MA: Harvard University Press.] seminal work on the development of verbal counting as a representation of number, the nature of the ontogenetic sources of the verbal counting principles has been intensely debated. The present…

  2. An avalanche counter and encoder system for counting and mapping radioactive specimens

    International Nuclear Information System (INIS)

    Britten, R.J.

    1988-01-01

    A parallel plate counter utilizes avalanche event counting over a large area with the ability to locate radioactive sources in two dimensions. One novel embodiment comprises a gas-filled chamber formed by a stretched stainless steel window cathode spaced from a flat semiconductive anode surface, between which a high voltage is applied. When a beta ray, for example, enters the chamber, an ionization event occurs; the avalanche effect multiplies the event and results in charge collection on the anode surface for a limited period of time before the charge leaks away. An encoder system, comprising a symmetrical array of planar conductive surfaces separated from the anode by a dielectric material, couples charge currents whose amplitudes define the relative position of the ionization event. A number of preferred encoder system embodiments are disclosed, including a novel matrix or grid pattern of electrical paths connected to voltage dividers and charge-sensitive integrating amplifiers. The amplitude of the coupled current delivered to the amplifiers defines the location of the event, and the spatial resolution for a given signal-to-noise ratio can be controlled by changing the number of such amplifiers. (author) 11 figs
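
The charge-division idea behind such encoders can be sketched with an Anger-style position estimate from four symmetric pads. This toy model (the pad layout and normalisation are our own assumptions, not the patent's circuit) returns coordinates in [-1, 1]:

```python
def charge_division_position(charges):
    """Estimate the (x, y) position of an avalanche event from the
    charges coupled onto four symmetric encoder pads, using normalised
    charge differences along each axis (illustrative toy model)."""
    qa, qb, qc, qd = charges  # left, right, bottom, top pad charges
    x = (qb - qa) / (qa + qb)  # -1 at far left, +1 at far right
    y = (qd - qc) / (qc + qd)  # -1 at bottom,   +1 at top
    return x, y
```

Adding more amplifier channels subdivides each axis further, which is the mechanism by which the record says spatial resolution can be traded against channel count.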

  3. Gating circuit for single photon-counting fluorescence lifetime instruments using high repetition pulsed light sources

    International Nuclear Information System (INIS)

    Laws, W.R.; Potter, D.W.; Sutherland, J.C.

    1984-01-01

    We have constructed a circuit that permits conventional timing electronics to be used in single photon-counting fluorimeters with high repetition rate excitation sources (synchrotrons and mode-locked lasers). Most commercial time-to-amplitude and time-to-digital converters introduce errors when processing very short time intervals and when subjected to high-frequency signals. This circuit reduces the frequency of signals representing the pulsed light source (stops) to the rate of detected fluorescence events (starts). Precise timing between the start/stop pair is accomplished by using the second stop pulse after a start pulse. Important features of our design are that the circuit is insensitive to the simultaneous occurrence of start and stop signals and that the reduction in the stop frequency allows the start/stop time interval to be placed in linear regions of the response functions of commercial timing electronics

  4. The optimal on-source region size for detections with counting-type telescopes

    Energy Technology Data Exchange (ETDEWEB)

    Klepser, Stefan

    2017-01-15

    The on-source region is typically a circular area with radius θ in which the signal is expected to appear with the shape of the instrument point spread function (PSF). This paper addresses the question of what is the θ that maximises the probability of detection for a given PSF width and background event density. In the high count number limit and assuming a Gaussian PSF profile, the optimum is found to be at ζ²_∞ ∼ 2.51 times the squared PSF width σ²_PSF39. While this number is shown to be a good choice in many cases, a dynamic formula for cases of lower count numbers, which favour larger on-source regions, is given. The recipe to get to this parametrisation can also be applied to cases with a non-Gaussian PSF. This result can standardise and simplify analysis procedures, reduce trials, and eliminate the need for experience-based ad hoc cut definitions or expensive case-by-case Monte Carlo simulations.
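
The quoted optimum can be reproduced by maximising the figure of merit S/√B for a Gaussian PSF (signal fraction 1 - exp(-u/2), with u = θ²/σ²) over a flat background (B ∝ θ²). Setting the derivative to zero gives exp(-u/2)(u + 1) = 1; the bisection below is our own numerical check of that condition, consistent with the quoted 2.51.

```python
from math import exp, sqrt

def signal_over_sqrt_background(u):
    """Detection figure of merit S/sqrt(B), up to constants, for a cut
    radius theta with u = theta^2 / sigma_PSF^2, assuming a Gaussian
    PSF and a flat background."""
    return (1.0 - exp(-u / 2.0)) / sqrt(u)

# Bisect the stationarity condition exp(-u/2) * (u + 1) = 1,
# which follows from d/du [S/sqrt(B)] = 0. The left side decreases
# through 1 between u = 1 and u = 5.
lo, hi = 1.0, 5.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if exp(-mid / 2.0) * (mid + 1.0) > 1.0:
        lo = mid
    else:
        hi = mid
print(round(0.5 * (lo + hi), 2))  # ~2.51, matching the quoted zeta^2
```

This high-count limit is exactly the regime in which the record's ζ²_∞ applies; the paper's dynamic formula then enlarges the region when counts are low.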

  5. The optimal on-source region size for detections with counting-type telescopes

    International Nuclear Information System (INIS)

    Klepser, Stefan

    2017-01-01

    The on-source region is typically a circular area with radius θ in which the signal is expected to appear with the shape of the instrument point spread function (PSF). This paper addresses the question of what is the θ that maximises the probability of detection for a given PSF width and background event density. In the high count number limit and assuming a Gaussian PSF profile, the optimum is found to be at ζ²_∞ ∼ 2.51 times the squared PSF width σ²_PSF39. While this number is shown to be a good choice in many cases, a dynamic formula for cases of lower count numbers, which favour larger on-source regions, is given. The recipe to get to this parametrisation can also be applied to cases with a non-Gaussian PSF. This result can standardise and simplify analysis procedures, reduce trials and eliminate the need for experience-based ad hoc cut definitions or expensive case-by-case Monte Carlo simulations.

  6. Mapping of auroral kilometric radiation sources to the aurora

    International Nuclear Information System (INIS)

    Huff, R.L.; Calvert, W.; Craven, J.D.; Frank, L.A.; Gurnett, D.A.

    1988-01-01

    Auroral kilometric radiation (AKR) and optical auroral emissions are observed simultaneously using plasma wave instrumentation and auroral imaging photometers carried on the DE 1 spacecraft. The DE 1 plasma wave instrument measures the relative phase of signals from orthogonal electric dipole antennas, and from these measurements, apparent source directions can be determined with a high degree of precision. Wave data are analyzed for several strong AKR events, and source directions are determined for several emission frequencies. By assuming that the AKR originates at cyclotron resonant altitudes, a candidate source field line is identified. When the selected source field line is traced down to auroral altitudes on the concurrent DE 1 auroral image, a striking correspondence between the AKR source field line and localized auroral features is produced. The magnetic mapping study provides strong evidence that AKR sources occur on field lines associated with discrete auroral arcs, and it provides confirmation that AKR is generated near the electron cyclotron frequency.

  7. Sources and magnitude of sampling error in redd counts for bull trout

    Science.gov (United States)

    Jason B. Dunham; Bruce Rieman

    2001-01-01

    Monitoring of salmonid populations often involves annual redd counts, but the validity of this method has seldom been evaluated. We conducted redd counts of bull trout Salvelinus confluentus in two streams in northern Idaho to address four issues: (1) relationships between adult escapements and redd counts; (2) interobserver variability in redd...

  8. Measurement of uranium and plutonium in solid waste by passive photon or neutron counting and isotopic neutron source interrogation

    Energy Technology Data Exchange (ETDEWEB)

    Crane, T.W.

    1980-03-01

    A summary of the status and applicability of nondestructive assay (NDA) techniques for the measurement of uranium and plutonium in 55-gal barrels of solid waste is reported. The NDA techniques reviewed include passive gamma-ray and x-ray counting with scintillator, solid state, and proportional gas photon detectors, passive neutron counting, and active neutron interrogation with neutron and gamma-ray counting. The active neutron interrogation methods are limited to those employing isotopic neutron sources. Three generic neutron sources ((α,n), photoneutron, and ²⁵²Cf) are considered. The neutron detectors reviewed for both prompt and delayed fission neutron detection with the above sources include thermal (³He, ¹⁰BF₃) and recoil (⁴He, CH₄) proportional gas detectors and liquid and plastic scintillator detectors. The instrument found to be best suited for low-level measurements (< 10 nCi/g) is the ²⁵²Cf Shuffler. The measurement technique consists of passive neutron counting followed by cyclic activation using a ²⁵²Cf source and delayed neutron counting with the source withdrawn. It is recommended that a waste assay station composed of a ²⁵²Cf Shuffler, a gamma-ray scanner, and a screening station be tested and evaluated at a nuclear waste site. 34 figures, 15 tables.

  9. PRECISE ORTHO IMAGERY AS THE SOURCE FOR AUTHORITATIVE AIRPORT MAPPING

    Directory of Open Access Journals (Sweden)

    H. Howard

    2016-06-01

    As the aviation industry moves from paper maps and charts to the digital cockpit and electronic flight bag, producers of these products need current and accurate data to ensure flight safety. The FAA (Federal Aviation Administration) and ICAO (International Civil Aviation Organization) require certified suppliers to follow a defined protocol to produce authoritative map data for the aerodrome. Typical airport maps have been produced to meet 5 m accuracy requirements. The new digital aviation world is moving to 1 m accuracy maps to provide better situational awareness on the aerodrome. The commercial availability of 0.5 m satellite imagery combined with accurate ground control is enabling the production of avionics-certified 0.85 m orthophotos of airports around the globe. CompassData maintains an archive of over 400 airports as source data to support producers of 1 m certified Aerodrome Mapping Database (AMDB) data critical to flight safety and automated situational awareness. CompassData is a DO200A certified supplier of authoritative orthoimagery, and attendees will learn how to utilize current airport imagery to build digital aviation mapping products.

  10. Precise Ortho Imagery as the Source for Authoritative Airport Mapping

    Science.gov (United States)

    Howard, H.; Hummel, P.

    2016-06-01

    As the aviation industry moves from paper maps and charts to the digital cockpit and electronic flight bag, producers of these products need current and accurate data to ensure flight safety. The FAA (Federal Aviation Administration) and ICAO (International Civil Aviation Organization) require certified suppliers to follow a defined protocol to produce authoritative map data for the aerodrome. Typical airport maps have been produced to meet 5 m accuracy requirements. The new digital aviation world is moving to 1 m accuracy maps to provide better situational awareness on the aerodrome. The commercial availability of 0.5 m satellite imagery combined with accurate ground control is enabling the production of avionics-certified 0.85 m orthophotos of airports around the globe. CompassData maintains an archive of over 400 airports as source data to support producers of 1 m certified Aerodrome Mapping Database (AMDB) data critical to flight safety and automated situational awareness. CompassData is a DO200A certified supplier of authoritative orthoimagery, and attendees will learn how to utilize current airport imagery to build digital aviation mapping products.

  11. Preschool children use space, rather than counting, to infer the numerical magnitude of digits: Evidence for a spatial mapping principle.

    Science.gov (United States)

    Sella, Francesco; Berteletti, Ilaria; Lucangeli, Daniela; Zorzi, Marco

    2017-01-01

    A milestone in numerical development is the acquisition of counting principles which allow children to exactly determine the numerosity of a given set. Moreover, a canonical left-to-right spatial layout for representing numbers also emerges during preschool. These foundational aspects of numerical competence have been extensively studied, but there is sparse knowledge about the interplay between the acquisition of the cardinality principle and spatial mapping of numbers in early numerical development. The present study investigated how these skills concurrently develop before formal schooling. Preschool children were classified according to their performance in Give-a-Number and Number-to-position tasks. Experiment 1 revealed three qualitatively different groups: (i) children who did not master the cardinality principle and lacked any consistent spatial mapping for digits, (ii) children who mastered the cardinality principle and yet failed in spatial mapping, and (iii) children who mastered the cardinality principle and displayed consistent spatial mapping. This suggests that mastery of the cardinality principle does not entail the emergence of spatial mapping. Experiment 2 confirmed the presence of these three developmental stages and investigated their relation with a digit comparison task. Crucially, only children who displayed a consistent spatial mapping of numbers showed the ability to compare digits by numerical magnitude. A congruent (i.e., numerically ordered) positioning of numbers onto a visual line as well as the concept that moving rightwards (in Western cultures) conveys an increase in numerical magnitude mark the mastery of a spatial mapping principle. Children seem to rely on this spatial organization to achieve a full understanding of the magnitude relations between digits. Copyright © 2016 Elsevier B.V. All rights reserved.

  12. Effects of the thickness of gold deposited on a source backing film in the 4πβ-counting

    International Nuclear Information System (INIS)

    Miyahara, Hiroshi; Yoshida, Makoto; Watanabe, Tamaki

    1976-01-01

    A gold deposited VYNS film as a source backing in the 4πβ-counting has generally been used for reducing the absorption of β-rays. The thickness of the film with the gold is usually a few times that of the VYNS film itself. However, because the appropriate thickness of gold has not yet been determined, the effects of gold thickness on electrical resistivity, plateau characteristics and β-ray counting efficiency were studied. ¹⁹⁸Au (960 keV), ⁶⁰Co (315 keV), ⁵⁹Fe (273 keV) and ⁹⁵Nb (160 keV), which were prepared as sources by the aluminium chloride treatment method, were used. Gold was evaporated at a deposition rate of 1-5 μg/cm²/min at a pressure of less than 1 × 10⁻⁵ Torr. Results show that gold deposition on the side opposite the source after source preparation is essential. In this case, a maximum counting efficiency is obtained at a mean thickness of 2 μg/cm². When gold is deposited only on the same side as the source, a maximum counting efficiency, which is less than that in the former case, is obtained at a mean thickness of 20 μg/cm². (Evans, J.)

  13. SEDS: THE SPITZER EXTENDED DEEP SURVEY. SURVEY DESIGN, PHOTOMETRY, AND DEEP IRAC SOURCE COUNTS

    International Nuclear Information System (INIS)

    Ashby, M. L. N.; Willner, S. P.; Fazio, G. G.; Huang, J.-S.; Hernquist, L.; Hora, J. L.; Arendt, R.; Barmby, P.; Barro, G.; Faber, S.; Guhathakurta, P.; Bell, E. F.; Bouwens, R.; Cattaneo, A.; Croton, D.; Davé, R.; Dunlop, J. S.; Egami, E.; Finlator, K.; Grogin, N. A.

    2013-01-01

    The Spitzer Extended Deep Survey (SEDS) is a very deep infrared survey within five well-known extragalactic science fields: the UKIDSS Ultra-Deep Survey, the Extended Chandra Deep Field South, COSMOS, the Hubble Deep Field North, and the Extended Groth Strip. SEDS covers a total area of 1.46 deg² to a depth of 26 AB mag (3σ) in both of the warm Infrared Array Camera (IRAC) bands at 3.6 and 4.5 μm. Because of its uniform depth of coverage in so many widely separated fields, SEDS is subject to roughly 25% smaller errors due to cosmic variance than a single-field survey of the same size. SEDS was designed to detect and characterize galaxies from intermediate to high redshifts (z = 2-7) with a built-in means of assessing the impact of cosmic variance on the individual fields. Because the full SEDS depth was accumulated in at least three separate visits to each field, typically with six-month intervals between visits, SEDS also furnishes an opportunity to assess the infrared variability of faint objects. This paper describes the SEDS survey design, processing, and publicly available data products. Deep IRAC counts for the more than 300,000 galaxies detected by SEDS are consistent with models based on known galaxy populations. Discrete IRAC sources contribute 5.6 ± 1.0 and 4.4 ± 0.8 nW m⁻² sr⁻¹ at 3.6 and 4.5 μm to the diffuse cosmic infrared background (CIB). IRAC sources cannot contribute more than half of the total CIB flux estimated from DIRBE data. Barring an unexpected error in the DIRBE flux estimates, half the CIB flux must therefore come from a diffuse component.

  14. Mapping thunder sources by inverting acoustic and electromagnetic observations

    Science.gov (United States)

    Anderson, J. F.; Johnson, J. B.; Arechiga, R. O.; Thomas, R. J.

    2014-12-01

    We present a new method of locating current flow in lightning strikes by inversion of thunder recordings constrained by Lightning Mapping Array observations. First, radio frequency (RF) pulses are connected to reconstruct conductive channels created by leaders. Then, acoustic signals that would be produced by current flow through each channel are forward modeled. The recorded thunder is considered to consist of a weighted superposition of these acoustic signals. We calculate the posterior distribution of acoustic source energy for each channel with a Markov Chain Monte Carlo inversion that fits power envelopes of modeled and recorded thunder; these results show which parts of the flash carry current and produce thunder. We examine the effects of RF pulse location imprecision and atmospheric winds on quality of results and apply this method to several lightning flashes over the Magdalena Mountains in New Mexico, USA. This method will enable more detailed study of lightning phenomena by allowing researchers to map current flow in addition to leader propagation.

  15. AKARI/IRC source catalogues and source counts for the IRAC Dark Field, ELAIS North and the AKARI Deep Field South

    Science.gov (United States)

    Davidge, H.; Serjeant, S.; Pearson, C.; Matsuhara, H.; Wada, T.; Dryer, B.; Barrufet, L.

    2017-12-01

    We present the first detailed analysis of three extragalactic fields (IRAC Dark Field, ELAIS-N1, ADF-S) observed by the infrared satellite, AKARI, using a data analysis toolkit optimized specifically for the processing of extragalactic point sources. The InfraRed Camera (IRC) on AKARI complements the Spitzer Space Telescope via its comprehensive coverage between 8 and 24 μm, filling the gap between the Spitzer/IRAC and MIPS instruments. Source counts in the AKARI bands at 3.2, 4.1, 7, 11, 15 and 18 μm are presented. At near-infrared wavelengths, our source counts are consistent with counts made in other AKARI fields and in general with Spitzer/IRAC (except at 3.2 μm, where our counts lie above). In the mid-infrared (11-18 μm), we find our counts are consistent with both previous surveys by AKARI and the Spitzer peak-up imaging survey with the InfraRed Spectrograph (IRS). Using our counts to constrain contemporary evolutionary models, we find that although the models and counts are in agreement at mid-infrared wavelengths, there are inconsistencies at wavelengths shortward of 7 μm, suggesting either a problem with stellar subtraction or the need for refinement of the stellar population models. We have also investigated the AKARI/IRC filters, and find an active galactic nucleus selection criterion out to z < 2 on the basis of AKARI 4.1, 11, 15 and 18 μm colours.

  16. Airborne system for mapping and tracking extended gamma ray sources

    International Nuclear Information System (INIS)

    Stuart, T.P.; Hendricks, T.J.; Wallace, G.G.; Cleland, J.R.

    1976-01-01

    An airborne system was developed for mapping and tracking extended sources of airborne or terrestrially distributed γ-ray emitters. The system records 300-channel γ-ray spectral data every three seconds on magnetic tape. Computer programs have been written to isolate the contribution from the particular radionuclide of interest. Aircraft position, as sensed by a microwave ranging system, is recorded every second on magnetic tape. Measurements of ⁴¹Ar concentrations in airborne stack releases versus time or aircraft position agree well with computer code predictions.

  17. An open-source java platform for automated reaction mapping.

    Science.gov (United States)

    Crabtree, John D; Mehta, Dinesh P; Kouri, Tina M

    2010-09-27

    This article presents software applications that have been built upon a modular, open-source, reaction mapping library that can be used in both cheminformatics and bioinformatics research. We first describe the theoretical underpinnings and modular architecture of the core software library. We then describe two applications that have been built upon that core. The first is a generic reaction viewer and mapper, and the second classifies reactions according to rules that can be modified by end users with little or no programming skills.

  18. The SCUBA-2 Cosmology Legacy Survey: 850 μm maps, catalogues and number counts

    NARCIS (Netherlands)

    Geach, J. E.; Dunlop, J. S.; Halpern, M.; Smail, Ian; van der Werf, P.; Alexander, D. M.; Almaini, O.; Aretxaga, I.; Arumugam, V.; Asboth, V.; Banerji, M.; Beanlands, J.; Best, P. N.; Blain, A. W.; Birkinshaw, M.; Chapin, E. L.; Chapman, S. C.; Chen, C.-C.; Chrysostomou, A.; Clarke, C.; Clements, D. L.; Conselice, C.; Coppin, K. E. K.; Cowley, W. I.; Danielson, A. L. R.; Eales, S.; Edge, A. C.; Farrah, D.; Gibb, A.; Harrison, C. M.; Hine, N. K.; Hughes, D.; Ivison, R. J.; Jarvis, M.; Jenness, T.; Jones, S. F.; Karim, A.; Koprowski, M.; Knudsen, K. K.; Lacey, C. G.; Mackenzie, T.; Marsden, G.; McAlpine, K.; McMahon, R.; Meijerink, R.; Michałowski, M. J.; Oliver, S. J.; Page, M. J.; Peacock, J. A.; Rigopoulou, D.; Robson, E. I.; Roseboom, I.; Rotermund, K.; Scott, Douglas; Serjeant, S.; Simpson, C.; Simpson, J. M.; Smith, D. J. B.; Spaans, M.; Stanley, F.; Stevens, J. A.; Swinbank, A. M.; Targett, T.; Thomson, A. P.; Valiante, E.; Wake, D. A.; Webb, T. M. A.; Willott, C.; Zavala, J. A.; Zemcov, M.

    2017-01-01

    We present a catalogue of ~3000 submillimetre sources detected (≥3.5σ) at 850 μm over ~5 deg² surveyed as part of the James Clerk Maxwell Telescope (JCMT) SCUBA-2 Cosmology Legacy Survey (S2CLS). This is the largest survey of its kind at 850 μm, increasing the sample size of 850 μm selected

  19. 52 Million Points and Counting: A New Stratification Approach for Mapping Global Marine Ecosystems

    Science.gov (United States)

    Wright, D. J.; Sayre, R.; Breyer, S.; Butler, K. A.; VanGraafeiland, K.; Goodin, K.; Kavanaugh, M.; Costello, M. J.; Cressie, N.; Basher, Z.; Harris, P. T.; Guinotte, J. M.

    2016-12-01

    We report progress on the Ecological Marine Units (EMU) project, a new undertaking commissioned by the Group on Earth Observations (GEO) as a means of developing a standardized and practical global ecosystems classification and map for the oceans, and thus a key outcome of the GEO Biodiversity Observation Network (GEO BON). The project is one of four components of the new GI-14 GEO Ecosystems Initiative within the GEO 2016 Transitional Work Plan, and is intended for eventual use by the Global Earth Observation System of Systems (GEOSS). The project is also the follow-on to a comprehensive Ecological Land Units (ELU) project, also commissioned by GEO. The EMU is comprised of a global point mesh framework created from 52,487,233 points from the NOAA World Ocean Atlas; spatial resolution is ¼° by ¼° by varying depth; temporal resolution is currently decadal; each point has x, y, z, as well as six attributes of chemical and physical oceanographic structure (temperature, salinity, dissolved oxygen, nitrate, silicate, phosphate) that are likely drivers of many ecosystem responses. We implemented a k-means statistical clustering of the point mesh (using the pseudo-F statistic to help determine the number of clusters), allowing us to identify and map 37 environmentally distinct 3D regions (candidate 'ecosystems') within the water column. These units can be attributed according to their productivity, direction and velocity of currents, species abundance, global seafloor geomorphology (from Harris et al.), and much more. A series of data products for open access will share the 3D point mesh and EMU clusters at the surface, bottom, and within the water column, as well as 2D and 3D web apps for exploration of the EMUs and the original World Ocean Atlas data. Future plans include a global delineation of Ecological Coastal Units (ECU) at a much finer spatial resolution (not yet commenced), as well as global ecological freshwater ecosystems (EFUs; in earliest planning stages). We will
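    The k-means step at the heart of the EMU classification can be sketched on synthetic data. The real project clusters 52 million World Ocean Atlas points and uses the pseudo-F statistic to choose the number of clusters; the three synthetic "water masses", sizes, seeds, and deterministic initialisation below are illustrative assumptions only:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the EMU workflow: describe each point by six attributes
# (temperature, salinity, dissolved oxygen, nitrate, silicate, phosphate)
# and group the points into k candidate "ecological units" with k-means.
# Three well-separated synthetic water masses, 2000 points each:
centres = rng.normal(0.0, 10.0, size=(3, 6))
points = np.vstack([c + rng.normal(0.0, 0.5, size=(2000, 6)) for c in centres])

k = 3
centroids = points[[0, 2000, 4000]]      # one seed point per water mass
for _ in range(20):                      # Lloyd's algorithm: assign, update
    labels = ((points[:, None, :] - centroids) ** 2).sum(-1).argmin(1)
    centroids = np.array([points[labels == j].mean(0) for j in range(k)])

print(np.bincount(labels, minlength=k).tolist())  # points per cluster
```

With well-separated inputs the algorithm recovers the three generating water masses exactly; on real oceanographic data the cluster count and separation are what the pseudo-F statistic is used to assess.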

  20. Radio source counts: comments on their convergence and assessment of the contribution to fluctuations of the microwave background

    International Nuclear Information System (INIS)

    Danese, L.; De Zotti, G.; Mandolesi, N.

    1982-01-01

    We point out that statistically estimated high frequency counts at milli-Jansky levels exhibit a slower convergence than expected on the basis of extrapolations of counts at higher flux densities and at longer wavelengths. This seems to demand a substantial cosmological evolution for at least a sub-population of flat-spectrum sources different from QSO's, a fact that might have important implications also in connection with the problem of the origin of the X-ray background. We also compute the discrete source contributions to small scale fluctuations in the Rayleigh-Jeans region of the cosmic microwave background and we show that they set a serious limit to the searches for truly primordial anisotropies using conventional radio-astronomical techniques

  1. Open-Source Automated Mapping Four-Point Probe

    Directory of Open Access Journals (Sweden)

    Handy Chandra

    2017-01-01

    Scientists have begun using self-replicating rapid prototyper (RepRap) 3-D printers to manufacture open source digital designs of scientific equipment. This approach is refined here to develop a novel instrument capable of performing automated large-area four-point probe measurements. The designs for conversion of a RepRap 3-D printer to a 2-D open source four-point probe (OS4PP) measurement device are detailed for the mechanical and electrical systems. Free and open source software and firmware are developed to operate the tool. The OS4PP was validated against a wide range of discrete resistors and indium tin oxide (ITO) samples of different thicknesses both pre- and post-annealing. The OS4PP was then compared to two commercial proprietary systems. Results for resistors from 10 Ω to 1 MΩ show errors of less than 1% for the OS4PP. The 3-D mapping of sheet resistance of ITO samples successfully demonstrated the automated capability to measure non-uniformities in large-area samples. The results indicate that all measured values are within the same order of magnitude when compared to two proprietary measurement systems. In conclusion, the OS4PP system, which costs less than 70% of manual proprietary systems, is comparable electrically while offering automated 100 micron positional accuracy for measuring sheet resistance over larger areas.

  2. Open-Source Automated Mapping Four-Point Probe.

    Science.gov (United States)

    Chandra, Handy; Allen, Spencer W; Oberloier, Shane W; Bihari, Nupur; Gwamuri, Jephias; Pearce, Joshua M

    2017-01-26

    Scientists have begun using self-replicating rapid prototyper (RepRap) 3-D printers to manufacture open source digital designs of scientific equipment. This approach is refined here to develop a novel instrument capable of performing automated large-area four-point probe measurements. The designs for conversion of a RepRap 3-D printer to a 2-D open source four-point probe (OS4PP) measurement device are detailed for the mechanical and electrical systems. Free and open source software and firmware are developed to operate the tool. The OS4PP was validated against a wide range of discrete resistors and indium tin oxide (ITO) samples of different thicknesses both pre- and post-annealing. The OS4PP was then compared to two commercial proprietary systems. Results for resistors from 10 Ω to 1 MΩ show errors of less than 1% for the OS4PP. The 3-D mapping of sheet resistance of ITO samples successfully demonstrated the automated capability to measure non-uniformities in large-area samples. The results indicate that all measured values are within the same order of magnitude when compared to two proprietary measurement systems. In conclusion, the OS4PP system, which costs less than 70% of manual proprietary systems, is comparable electrically while offering automated 100 micron positional accuracy for measuring sheet resistance over larger areas.
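    The quantity such an instrument maps can be illustrated with the standard geometry correction for a collinear, equally spaced four-point probe on a thin film that is laterally much larger than the probe spacing. This is textbook theory, not code from the OS4PP project:

```python
import math

# Sheet resistance for a collinear four-point probe on a thin, laterally
# large film: R_s = (pi / ln 2) * (V / I), in ohms per square.
def sheet_resistance(voltage_v, current_a):
    return (math.pi / math.log(2.0)) * voltage_v / current_a

# Example: force 1 mA through the outer pins, read 10 mV on the inner pair.
rs = sheet_resistance(10e-3, 1e-3)
print(round(rs, 1))  # → 45.3 ohm/sq
```

Automated mapping then amounts to repeating this measurement on a grid of probe positions and plotting R_s as a function of position.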

  3. OpenCFU, a new free and open-source software to count cell colonies and other circular objects.

    Science.gov (United States)

    Geissmann, Quentin

    2013-01-01

    Counting circular objects such as cell colonies is an important source of information for biologists. Although this task is often time-consuming and subjective, it is still predominantly performed manually. The aim of the present work is to provide a new tool to enumerate circular objects from digital pictures and video streams. Here, I demonstrate that the created program, OpenCFU, is very robust, accurate and fast. In addition, it provides control over the processing parameters and is implemented in an intuitive and modern interface. OpenCFU is a cross-platform and open-source software freely available at http://opencfu.sourceforge.net.
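    The counting task OpenCFU automates can be illustrated with a deliberately simple sketch: threshold an image and count connected blobs. The synthetic "colonies" and the 4-connected flood fill below are illustrative only; OpenCFU's actual processing is considerably more robust:

```python
import numpy as np

# Draw three synthetic circular "colonies" on a blank binary image.
img = np.zeros((60, 60), dtype=bool)
yy, xx = np.mgrid[0:60, 0:60]
for cy, cx, r in [(15, 15, 5), (40, 20, 7), (30, 45, 4)]:
    img |= (yy - cy) ** 2 + (xx - cx) ** 2 <= r ** 2

def count_blobs(mask):
    """Count 4-connected foreground components with an iterative flood fill."""
    seen = np.zeros_like(mask, dtype=bool)
    n = 0
    for sy, sx in zip(*np.nonzero(mask)):
        if seen[sy, sx]:
            continue
        n += 1
        seen[sy, sx] = True
        stack = [(sy, sx)]
        while stack:
            y, x = stack.pop()
            for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                        and mask[ny, nx] and not seen[ny, nx]):
                    seen[ny, nx] = True
                    stack.append((ny, nx))
    return n

print(count_blobs(img))  # → 3
```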

  4. Quantifying the sources of variability in equine faecal egg counts: implications for improving the utility of the method.

    Science.gov (United States)

    Denwood, M J; Love, S; Innocent, G T; Matthews, L; McKendrick, I J; Hillary, N; Smith, A; Reid, S W J

    2012-08-13

    The faecal egg count (FEC) is the most widely used means of quantifying the nematode burden of horses, and is frequently used in clinical practice to inform treatment and prevention. The statistical process underlying the FEC is complex, comprising a Poisson counting error process for each sample, compounded with an underlying continuous distribution of means between samples. Being able to quantify the sources of variability contributing to this distribution of means is a necessary step towards providing estimates of statistical power for future FEC and FECRT studies, and may help to improve the usefulness of the FEC technique by identifying and minimising unwanted sources of variability. Obtaining such estimates require a hierarchical statistical model coupled with repeated FEC observations from a single animal over a short period of time. Here, we use this approach to provide the first comparative estimate of multiple sources of within-horse FEC variability. The results demonstrate that a substantial proportion of the observed variation in FEC between horses occurs as a result of variation in FEC within an animal, with the major sources being aggregation of eggs within faeces and variation in egg concentration between faecal piles. The McMaster procedure itself is associated with a comparatively small coefficient of variation, and is therefore highly repeatable when a sufficiently large number of eggs are observed to reduce the error associated with the counting process. We conclude that the variation between samples taken from the same animal is substantial, but can be reduced through the use of larger homogenised faecal samples. Estimates are provided for the coefficient of variation (cv) associated with each within animal source of variability in observed FEC, allowing the usefulness of individual FEC to be quantified, and providing a basis for future FEC and FECRT studies. Copyright © 2012 Elsevier B.V. All rights reserved.
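    The compound error structure described above, a Poisson counting process on top of a continuous distribution of underlying means, can be demonstrated with a small simulation. The gamma distribution for the between-sample means and all parameter values here are assumptions for illustration, not the paper's fitted estimates:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical parameters: a horse's true egg count per sample varies
# between faecal piles (gamma-distributed), and the McMaster count then
# adds Poisson counting error on top of that underlying mean.
n, shape, mean_eggs = 100_000, 4.0, 200.0
true_means = rng.gamma(shape, mean_eggs / shape, n)  # between-sample variation
counts = rng.poisson(true_means)                     # counting error per sample

cv_counting = 1.0 / np.sqrt(mean_eggs)  # Poisson-only cv at the mean count
cv_between = 1.0 / np.sqrt(shape)       # cv of the underlying gamma means
cv_total = counts.std() / counts.mean()

# When enough eggs are counted, the counting error is minor and the total
# cv is dominated by the between-sample component, matching the study's
# conclusion that larger homogenised samples reduce observed variation.
print(round(cv_counting, 3), round(cv_between, 3), round(cv_total, 3))
```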

  5. Deep far infrared ISOPHOT survey in "Selected Area 57" - I. Observations and source counts

    DEFF Research Database (Denmark)

    Linden-Vornle, M.J.D.; Nørgaard-Nielsen, Hans Ulrik; Jørgensen, H.E.

    2000-01-01

    We present here the results of a deep survey of a 0.4 deg² blank field in Selected Area 57 conducted with the ISOPHOT instrument aboard ESA's Infrared Space Observatory (ISO) at both 60 μm and 90 μm. The resulting sky maps have a spatial resolution of 15 × 23 arcsec² per pixel, which is much...

  6. Micro-electrodeposition techniques for the preparation of small actinide counting sources for ultra-high resolution alpha spectrometry by microcalorimetry

    International Nuclear Information System (INIS)

    Plionis, A.A.; Hastings, E.P.; LaMont, S.P.; Dry, D.E.; Bacrania, M.K.; Rabin, M.W.; Rim, J.H.

    2009-01-01

    Special considerations and techniques are desired for the preparation of small actinide counting sources. Counting sources have been prepared on metal disk substrates (planchets) with an active area of only 0.079 mm². This represents a 93.75% reduction in deposition area from standard electrodeposition methods. The actinide distribution upon the smaller planchet must remain thin and uniform to allow alpha particle emissions to escape the counting source with a minimal amount of self-attenuation. This work describes the development of micro-electrodeposition methods and optimization of the technique with respect to deposition time and current density for various planchet sizes. (author)

  7. Combining disparate data sources for improved poverty prediction and mapping.

    Science.gov (United States)

    Pokhriyal, Neeti; Jacques, Damien Christophe

    2017-11-14

    More than 330 million people are still living in extreme poverty in Africa. Timely, accurate, and spatially fine-grained baseline data are essential to determining policy in favor of reducing poverty. The potential of "Big Data" to estimate socioeconomic factors in Africa has been proven. However, most current studies are limited to using a single data source. We propose a computational framework to accurately predict the Global Multidimensional Poverty Index (MPI) at the finest spatial granularity and coverage of 552 communes in Senegal, using environmental data (related to food security, economic activity, and accessibility to facilities) and call data records (capturing individualistic, spatial, and temporal aspects of people). Our framework is based on Gaussian Process regression, a Bayesian learning technique, providing uncertainty associated with predictions. We perform model selection using elastic net regularization to prevent overfitting. Our results empirically prove the superior accuracy of using disparate data (Pearson correlation of 0.91). Our approach is used to accurately predict important dimensions of poverty: health, education, and standard of living (Pearson correlation of 0.84-0.86). All predictions are validated using deprivations calculated from census. Our approach can be used to generate poverty maps frequently, and its diagnostic nature is likely to assist policy makers in designing better interventions for poverty eradication. Copyright © 2017 the Author(s). Published by PNAS.
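    The core of the prediction machinery, Gaussian Process regression with per-prediction uncertainty, can be sketched in a few lines. The RBF kernel, 1-D toy data, and hyperparameters below are illustrative assumptions, not the authors' model of the Senegal data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Squared-exponential (RBF) covariance between two sets of 1-D inputs.
def rbf(a, b, length=1.0, var=1.0):
    return var * np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length**2)

# Toy 1-D data standing in for commune-level covariates vs. a poverty index.
x_train = np.linspace(0.0, 5.0, 20)
y_train = np.sin(x_train) + rng.normal(0.0, 0.1, x_train.size)
x_test = np.array([1.0, 2.5, 4.0])

noise = 0.1 ** 2
K = rbf(x_train, x_train) + noise * np.eye(x_train.size)
K_s = rbf(x_train, x_test)

# Standard GP posterior via Cholesky factorisation: a mean prediction and
# a variance (uncertainty) at every test input.
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
mean = K_s.T @ alpha                       # posterior predictive mean
v = np.linalg.solve(L, K_s)
var = np.maximum(rbf(x_test, x_test).diagonal() - (v * v).sum(0), 0.0)

print(np.round(mean, 2), np.round(np.sqrt(var), 3))
```

The predictive variance is what makes this approach attractive for poverty mapping: each estimate comes with a statement of how much it should be trusted.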

  8. Modification history of the Harmakhis Vallis outflow channel, Mars, based on CTX-scale photogeologic mapping and crater count dating

    Science.gov (United States)

    Kukkonen, S.; Kostama, V.-P.

    2018-01-01

    Harmakhis Vallis is one of the four major outflow channel systems (Dao, Niger, Harmakhis, and Reull Valles) that cut the eastern rim region of the Hellas basin, the largest well-preserved impact structure on Mars. The structure of Harmakhis Vallis and the volume of its head depression, as well as earlier dating studies of the region, suggest that the outflow channel formed in the Hesperian period by collapse, when a large amount of subsurface fluid was released. Thus Harmakhis Vallis, like the other nearby outflow channels, represents a significant stage of the fluvial activity in the regional history. On the other hand, the outflow channel lies in the Martian mid-latitude zone, where there are several geomorphologic indicators of past, and possibly also contemporary, ground ice. The floor of Harmakhis also displays evidence of later-stage ice-related activity, as the outflow channel has been covered by lineated valley fill deposits and debris apron material. The eastern rim region of the Hellas impact basin has been the subject of numerous geologic mapping studies at various scales and based on different imaging data sets. However, Harmakhis Vallis itself has received less attention, and studies of the outflow channel have focused only on limited parts of it or on separate geologic events. In this work, the Harmakhis Vallis floor is mapped and dated from the head depression to the beginning of the terminus based on the Mars Reconnaissance Orbiter's ConTeXt camera images (CTX; ~6 m/pixel). Our results show that Harmakhis Vallis has been modified by several processes after its formation. Age determinations on the small uncovered parts of the outflow channel, which possibly represent the original floor of Harmakhis, imply that Harmakhis may have experienced fluvial activity as recently as 780-850 (±400-600) Ma ago. The discovered terrace structure instead shows that the on-surface activity of the outflow channel has been periodic

  9. Analysis of the emission characteristics of ion sources for high-quality optical coating processes

    International Nuclear Information System (INIS)

    Beermann, Nils

    2009-01-01

    The production of complex high-quality thin film systems requires a detailed understanding of all partial processes. One of the most relevant of these is the condensation of the coating material on the substrate surface. The optical and mechanical material properties can be adjusted by the well-defined impingement of energetic ions during deposition, and a variety of ion sources has accordingly been developed. With respect to present and future challenges in the production of precisely fabricated high-performance optical coatings, however, the ion emission of these sources has so far not been sufficiently characterized. This question is addressed in the present work, which is thematically situated in the field of process development and control for ion-assisted deposition processes. In a first step, a Faraday cup measurement system was developed which allows the spatially resolved determination of the ion energy distribution as well as the ion current distribution. Subsequently, the ion emission profiles of six ion sources were determined as a function of the relevant operating parameters, making available a data pool for process planning and supplementary process analysis. On the basis of the acquired results, the basic correlations between the operating parameters and the ion emission are demonstrated. The specific properties of the individual sources, as well as the respective control strategies, are pointed out with regard to the thin film properties and production yield. Finally, a synthesis of the results and perspectives for future activities are given. (orig.)

  10. Alpha-particle autoradiography by solid state track detectors applied to the spatial distribution of radioactivity in alpha-counting sources

    International Nuclear Information System (INIS)

    Ishigure, Nobuhito; Nakano, Takashi; Enomoto, Hiroko; Koizumi, Akira; Miyamoto, Katsuhiro

    1989-01-01

    A technique of autoradiography using solid state track detectors is described by which the spatial distribution of radioactivity in an alpha-counting source can easily be visualized. Polymerized allyl diglycol carbonate was used as the solid state track detector. A demonstrated advantage of the present technique is that the alpha-emitters, which require special care from the standpoint of radiation protection, can be handled in the light throughout the whole course of autoradiography, whereas in conventional autoradiography they must be handled, with difficulty, in the dark. The technique was applied to a rough examination of the self-absorption of plutonium sources prepared by different methods: source (A) was prepared by drying at room temperature, (B) by drying under an infrared lamp, (C) by drying in an ammonia atmosphere after redissolving in a drop of distilled water following complete evaporation under an infrared lamp, and (D) by drying under an infrared lamp after adding a drop of diluted neutral detergent. The differences in the spatial distributions of radioactivity could clearly be observed on the autoradiographs. For example, source (C) showed the most diffuse distribution, which suggested that its self-absorption was the smallest. The autoradiographic observations were consistent with the results of alpha-spectrometry with a silicon surface-barrier detector. (author)

  11. What Is the Role of Manual Preference in Hand-Digit Mapping During Finger Counting? A Study in a Large Sample of Right- and Left-Handers.

    Science.gov (United States)

    Zago, Laure; Badets, Arnaud

    2016-01-01

    The goal of the present study was to test whether there is a relationship between manual preference and hand-digit mapping in 369 French adults with similar numbers of right- and left-handers. Manual laterality was evaluated with the finger tapping test to evaluate hand motor asymmetry, and the Edinburgh handedness inventory was used to assess manual preference strength (MPS) and direction. Participants were asked to spontaneously "count on their fingers from 1 to 10" without indications concerning the hand(s) to be used. The results indicated that both MPS and hand motor asymmetry affect the hand-starting preference for counting. Left-handers with a strong left-hand preference (sLH) or left-hand motor asymmetry largely started to count with their left hand (left-starter), while right-handers with a strong right-hand preference (sRH) or right-hand motor asymmetry largely started to count with their right hand (right-starter). Notably, individuals with weak MPS did not show a hand-starting preference. These findings demonstrated that manual laterality contributes to finger counting directionality. Lastly, the results showed a higher proportion of sLH left-starter individuals compared with sRH right-starters, indicating an asymmetric bias of MPS on hand-starting preference. We hypothesize that the higher proportion of sLH left-starters could be explained by the congruence between left-to-right hand-digit mapping and left-to-right mental number line representation that has been largely reported in the literature. Taken together, these results indicate that finger-counting habits integrate biological and cultural information. © The Author(s) 2015.

  12. Detection of anomalies in radio tomography of asteroids: Source count and forward errors

    Science.gov (United States)

    Pursiainen, S.; Kaasalainen, M.

    2014-09-01

    The purpose of this study was to advance numerical methods for radio tomography, in which an asteroid's internal electric permittivity distribution is to be recovered from radio frequency data gathered by an orbiter. The focus was on signal generation via multiple sources (transponders), one potential, or even essential, scenario for implementation in a challenging in situ measurement environment and within tight payload limits. As a novel feature, the effects of forward errors, including noise and a priori uncertainty of the forward (data) simulation, were examined through a combination of the iterative alternating sequential (IAS) inverse algorithm and finite-difference time-domain (FDTD) simulation of time evolution data. Single and multiple source scenarios were compared in two-dimensional localization of permittivity anomalies. Three different anomaly strengths and four levels of total noise were tested. The results suggest, among other things, that multiple sources can be necessary to obtain appropriate results, for example, to distinguish three separate anomalies with permittivity less than or equal to half of the background value, a case relevant to the recovery of internal cavities.

  13. Optimal Matched Filter in the Low-number Count Poisson Noise Regime and Implications for X-Ray Source Detection

    Science.gov (United States)

    Ofek, Eran O.; Zackay, Barak

    2018-04-01

    Detection of templates (e.g., sources) embedded in low-number count Poisson noise is a common problem in astrophysics. Examples include source detection in X-ray images, γ-rays, UV, neutrinos, and search for clusters of galaxies and stellar streams. However, the solutions in the X-ray-related literature are sub-optimal in some cases by considerable factors. Using the lemma of Neyman–Pearson, we derive the optimal statistics for template detection in the presence of Poisson noise. We demonstrate that, for known template shape (e.g., point sources), this method provides higher completeness, for a fixed false-alarm probability value, compared with filtering the image with the point-spread function (PSF). In turn, we find that filtering by the PSF is better than filtering the image using the Mexican-hat wavelet (used by wavdetect). For some background levels, our method improves the sensitivity of source detection by more than a factor of two over the popular Mexican-hat wavelet filtering. This filtering technique can also be used for fast PSF photometry and flare detection; it is efficient and straightforward to implement. We provide an implementation in MATLAB. The development of a complete code that works on real data, including the complexities of background subtraction and PSF variations, is deferred for future publication.
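
Under the Neyman–Pearson lemma, detecting a known template T on a flat Poisson background B reduces to thresholding a log-likelihood ratio. The sketch below is a minimal illustration of that statistic under our own assumptions (a flat background and all names are ours; it is not the authors' MATLAB implementation):

```python
import numpy as np

def poisson_matched_filter_llr(counts, template, background):
    """Neyman-Pearson log-likelihood ratio for a known template in Poisson noise.

    H1: pixel i ~ Poisson(background + template[i])
    H0: pixel i ~ Poisson(background)
    LLR = sum_i [ counts_i * ln(1 + template_i / background) - template_i ]
    """
    counts = np.asarray(counts, dtype=float)
    template = np.asarray(template, dtype=float)
    return float(np.sum(counts * np.log1p(template / background) - template))

# Toy demo: a 1-D Gaussian "PSF" template on a flat background of 1 count/pixel.
psf = np.exp(-0.5 * (np.arange(-3, 4) / 1.0) ** 2)
rng = np.random.default_rng(0)
with_source = rng.poisson(1.0 + 5.0 * psf)          # source-present image
background_only = rng.poisson(1.0, size=psf.size)   # background-only image
stat_1 = poisson_matched_filter_llr(with_source, 5.0 * psf, 1.0)
stat_0 = poisson_matched_filter_llr(background_only, 5.0 * psf, 1.0)
```

Note that the statistic weights each pixel by ln(1 + T/B) rather than by the template itself, which is where it departs from plain PSF filtering in the low-count regime.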

  14. Three-Level AC-DC-AC Z-Source Converter Using Reduced Passive Component Count

    DEFF Research Database (Denmark)

    Loh, Poh Chiang; Gao, Feng; Tan, Pee-Chin

    2009-01-01

    This paper presents a three-level ac-dc-ac Z-source converter with output voltage buck-boost capability. The converter is implemented by connecting a low-cost front-end diode rectifier to a neutral-point-clamped inverter through a single X-shaped LC impedance network. The inverter is controlled...... to switch with a three-level output voltage, where the middle neutral potential is uniquely tapped from the star-point of a wye-connected capacitive filter placed before the front-end diode rectifier for input current filtering. Through careful control, the resulting converter can produce the correct volt...

  15. Reverberation Mapping of the Continuum Source in Active Galactic Nuclei

    Science.gov (United States)

    Fausnaugh, Michael Martin

    I present results from a monitoring campaign of 11 active galactic nuclei (AGN) conducted in Spring of 2014. I use the reverberation mapping method to probe the interior structures of the AGN, specifically the broad line regions (BLRs) and accretion disks. One of these AGN, NGC 5548, was also subject to multi-wavelength (X-ray, UV, optical, and near-IR) monitoring using 25 ground-based telescopes and four space-based facilities. For NGC 5548, I detect lags between the continuum emission at different wavelengths that follow a trend consistent with the prediction for continuum reprocessing by an accretion disk with temperature profile T ∝ R^(-3/4). However, the lags imply a disk radius that is 3 times larger than the prediction from standard thin-disk models. The lags at wavelengths longer than the V band are also equal to or greater than the lags of high-ionization-state emission lines (such as He II λ1640 and λ4686), suggesting that the continuum-emitting source is of a physical size comparable to the inner broad-line region. Using optical spectra from the Large Binocular Telescope, I estimate the bias of the interband continuum lags due to BLR emission observed in the filters, and I find that the bias for filters with high levels of BLR contamination (˜20%) can be important for the shortest continuum lags. This likely has a significant impact on the u and U bands owing to Balmer continuum emission. I then develop a new procedure for the internal (night-to-night) calibration of time series spectra that can reach precisions of ˜1 millimagnitude and improves traditional techniques by up to a factor of 5. At this level, other systematic issues (e.g., the nightly sensitivity functions and Fe II contamination) limit the final precision of the observed light curves. Using the new calibration method, I next present the data and first results from the optical spectroscopic monitoring component of the reverberation mapping campaign. Five AGN were sufficiently

  16. Standards-Based Open-Source Planetary Map Server: Lunaserv

    Science.gov (United States)

    Estes, N. M.; Silva, V. H.; Bowley, K. S.; Lanjewar, K. K.; Robinson, M. S.

    2018-04-01

    Lunaserv is a planetary-capable Web Map Service developed by the LROC SOC. It enables researchers to serve their own planetary data to a wide variety of GIS clients without additional processing or download steps.

  17. A Predictive Model for Microbial Counts on Beaches where Intertidal Sand is the Primary Source

    Science.gov (United States)

    Feng, Zhixuan; Reniers, Ad; Haus, Brian K.; Solo-Gabriele, Helena M.; Wang, John D.; Fleming, Lora E.

    2015-01-01

    Human health protection at recreational beaches requires accurate and timely information on microbiological conditions to issue advisories. The objective of this study was to develop a new numerical mass balance model for enterococci levels on nonpoint source beaches. The significant advantage of this model is its easy implementation, and it provides a detailed description of the cross-shore distribution of enterococci that is useful for beach management purposes. The performance of the balance model was evaluated by comparing predicted exceedances of a beach advisory threshold value to field data and to a traditional regression model. Both the balance model and the regression equation predicted approximately 70% of the advisories correctly at knee depth and over 90% at waist depth. The balance model has the advantage over the regression equation in its ability to simulate spatiotemporal variations of microbial levels, and it is recommended for making more informed management decisions. PMID:25840869
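
The abstract does not reproduce the model equations; as a generic illustration of a mass-balance formulation (function names and all coefficient values here are hypothetical, not taken from the paper), a single well-mixed nearshore cell with a sand-derived loading term and first-order decay can be stepped forward like this:

```python
def step_enterococci(concentration, dt, loading_rate, decay_rate):
    """One explicit-Euler step of a minimal mass balance:
    dC/dt = loading_rate - decay_rate * C
    (C in CFU/100 mL; rates per hour; all values hypothetical).
    """
    return concentration + dt * (loading_rate - decay_rate * concentration)

C = 10.0                                    # initial concentration
for _ in range(24):                         # 24 one-hour steps
    C = step_enterococci(C, dt=1.0, loading_rate=20.0, decay_rate=0.3)
# C approaches the steady state loading_rate / decay_rate ~ 66.7
```

An advisory decision would then compare C against the regulatory threshold, just as the paper compares predicted and observed exceedances.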

  18. Practical biosafety in the tuberculosis laboratory: containment at the source is what truly counts.

    Science.gov (United States)

    van Soolingen, D; Wisselink, H J; Lumb, R; Anthony, R; van der Zanden, A; Gilpin, C

    2014-08-01

    In industrialised countries, sufficient resources for establishing and maintaining fully equipped biosafety level 3 (BSL-3) laboratories according to international standards are generally available. BSL-3 laboratories are designed to provide several layers of containment to protect the laboratory worker as well as the outside environment and community from risk of exposure in case of local contamination. However, such facilities are scarce in high-burden settings, primarily due to the high financial burden and complexity of the initial construction and/or regular maintenance. Preventing unintended exposure to Mycobacterium tuberculosis during laboratory manipulation of specimens and cultures is the first, and by far the most important, aspect of containment. This paper focuses on the need for risk containment at source. Assuming that in many settings the establishment of BSL-3 laboratories with all the required features is not achievable, this paper also discusses the minimum requirements necessary to mitigate risks associated with particular laboratory procedures. The term 'TB containment laboratory' is used throughout this paper to describe the minimum requirements for a laboratory suitable for high-risk procedures. The TB containment laboratory has many, but not all, of the features of a BSL-3 laboratory.

  19. Preverbal and verbal counting and computation.

    Science.gov (United States)

    Gallistel, C R; Gelman, R

    1992-08-01

    We describe the preverbal system of counting and arithmetic reasoning revealed by experiments on numerical representations in animals. In this system, numerosities are represented by magnitudes, which are rapidly but inaccurately generated by the Meck and Church (1983) preverbal counting mechanism. We suggest the following. (1) The preverbal counting mechanism is the source of the implicit principles that guide the acquisition of verbal counting. (2) The preverbal system of arithmetic computation provides the framework for the assimilation of the verbal system. (3) Learning to count involves, in part, learning a mapping from the preverbal numerical magnitudes to the verbal and written number symbols and the inverse mappings from these symbols to the preverbal magnitudes. (4) Subitizing is the use of the preverbal counting process and the mapping from the resulting magnitudes to number words in order to generate rapidly the number words for small numerosities. (5) The retrieval of the number facts, which plays a central role in verbal computation, is mediated via the inverse mappings from verbal and written numbers to the preverbal magnitudes and the use of these magnitudes to find the appropriate cells in tabular arrangements of the answers. (6) This model of the fact retrieval process accounts for the salient features of the reaction time differences and error patterns revealed by experiments on mental arithmetic. (7) The application of verbal and written computational algorithms goes on in parallel with, and is to some extent guided by, preverbal computations, both in the child and in the adult.

  20. Mapping world cultures: Cluster formation, sources and implications

    OpenAIRE

    Simcha Ronen; Oded Shenkar

    2013-01-01

    This paper extends and builds on Ronen and Shenkar’s synthesized cultural clustering of countries based on similarity and dissimilarity in work-related attitudes. The new map uses an updated dataset, and expands coverage to world areas that were non-accessible at the time. Cluster boundaries are drawn empirically rather than intuitively, and the plot obtained is triple nested, indicating three levels of similarity across given country pairs. Also delineated are cluster adjacency and cluster c...

  1. Influence of MAP and Multi-layer Flexible Pouches on Clostridium Count of Smoked Kutum Fish (Rutilus frisii kutum)

    Directory of Open Access Journals (Sweden)

    Nazanin Zand

    2016-11-01

    Full Text Available In this study, the effects of different concentrations of a three-gas mixture (carbon dioxide, nitrogen, oxygen), as well as vacuum conditions and flexible multi-layer films, were evaluated on the Clostridium count of smoked kutum fish (Rutilus frisii kutum) at ambient temperature (T = 25 °C). Ordinary-condition (control) packaging was compared with four types of modified atmosphere packaging: (70% N2 + 30% CO2), (30% N2 + 70% CO2), (45% CO2 + 45% N2 + 10% O2) and vacuum conditions. Smoked kutum fish were packaged in three kinds of flexible pouches: 3-layer (PET(12)/AL(12)/LLD(100)), 4-layer (PET(12)/AL(7)/PET(12)/LLD(100)), and 3-layer (PET(12)/AL(7)/LLD(100)). Packed samples underwent microbial tests (Clostridium count) at different times during 60 days, with 15 treatments and 3 runs; statistical analysis and comparison of the data were done with the software SAS (ver. 9.1) and Duncan's new multiple range test at a confidence level of 95% (P < 0.05). The shelf life of the samples (according to Clostridium count) was 60, 58, 45 and 40 days for the 4-layer pouches under conditions 1, 2, 3 and vacuum, respectively; 55, 50 and 40 days for the 3-layer (AL: 12) pouches under conditions 1, 2 and 3, and about 35 days under vacuum; and 45, 40, 35 and 30 days for the 3-layer (AL: 7) pouches under conditions 1, 2, 3 and vacuum. The Clostridium counts showed that increasing the CO2 concentration prolonged shelf life, and the differences among samples under the various conditions were significant. It can be concluded that the best condition was the modified atmosphere with 70% CO2 in the 4-layer container, owing to that container's thickness (131 μm), its low water vapour permeability, and the antimicrobial effect of the higher CO2 percentage.

  2. Precise Mapping Of A Spatially Distributed Radioactive Source

    International Nuclear Information System (INIS)

    Beck, A.; Caras, I.; Piestum, S.; Sheli, E.; Melamud, Y.; Berant, S.; Kadmon, Y.; Tirosh, D.

    1999-01-01

    Spatial distribution measurement of radioactive sources is a routine task in the nuclear industry. The precision of each measurement depends upon the specific application; however, the technological edge of this precision is driven by the production of standards for calibration. Within this definition, the most demanding field is the calibration of standards for medical equipment. In this paper, a semi-empirical method for controlling the measurement precision is demonstrated, using a relatively simple laboratory apparatus. The spatial distribution of the source radioactivity is measured as part of the quality assurance tests during the production of flood sources. These sources are further used in the calibration of medical gamma cameras. A typical flood source is a 40 × 60 cm² plate with an activity of 10 mCi (or more) of the ⁵⁷Co isotope. The measurement set-up is based on a single NaI(Tl) scintillator with a photomultiplier tube, moving on an X-Y table which scans the flood source. In this application the source is required to have a uniform activity distribution over its surface.
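
One way to turn such a scan into a quality-assurance number is an integral-uniformity metric over the count grid, in the style of gamma-camera QA. The metric choice and the numbers below are illustrative assumptions, not the authors' stated acceptance criterion:

```python
import numpy as np

def integral_uniformity(counts):
    """Integral uniformity of a 2-D count map: (max - min) / (max + min).

    0.0 means a perfectly flat map; larger values mean less uniform activity.
    """
    c = np.asarray(counts, dtype=float)
    return (c.max() - c.min()) / (c.max() + c.min())

# Hypothetical 3 x 3 grid of counts from scanning a flood source:
scan = np.array([[980.0, 1005.0, 995.0],
                 [1010.0, 1000.0, 990.0],
                 [1000.0, 985.0, 1015.0]])
u = integral_uniformity(scan)   # (1015 - 980) / (1015 + 980) ~ 0.0175
```

A real scan would also smooth the grid and correct for counting statistics before applying a pass/fail threshold.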

  3. A radiation protection initiative to map old radium sources

    International Nuclear Information System (INIS)

    Risica, S.; Grisanti, G.; Masi, R.; Melfi, A.

    2008-01-01

    Due to a legacy of past events, the Technology and Health Department of the Istituto Superiore di Sanità (ISS) has preserved an old, large archive of the allocation of radium sources in public hospitals. These sources were purchased by the Ministry of Interior first, then by the Ministry of Health, and provided to hospitals for cancer brachytherapy. After a retrieval initiative - organised in the 1980s, but discontinued some years later owing to the saturation of the temporary storage site - a considerable number of these sources remained in the hospitals. As a result of an incomplete transfer of the retrieval data, some events connected with the Second World War, and the decision of some hospitals to dispose directly of their sources without informing the ISS, the archive was not complete, and a series of initiatives were undertaken by the ISS to update it. On the other hand, following the concerns that arose after September 11th, 2001 about the possible criminal use of radioactive sources, the Carabinieri Environmental Care Command (CCTA) was required by the Minister of Environment to carry out a thorough investigation into all possible nuclear sources and waste in the country. Special attention was devoted to radium sources because of the high risk their loss or theft entails. For this reason, in 2004, the CCTA made an agreement with the ISS to acquire a final, updated picture of the distribution of these radium sources. In March 2007 a comprehensive report on this collaborative action and its conclusions was officially sent to both the Ministry of Health and the Ministry of the Environment. The paper describes the involvement of these two bodies in the issue, their collaborative action and the most relevant results. (author)

  4. MASHUP SCHEME DESIGN OF MAP TILES USING LIGHTWEIGHT OPEN SOURCE WEBGIS PLATFORM

    Directory of Open Access Journals (Sweden)

    T. Hu

    2018-04-01

    Full Text Available To address the difficulty of integrating multi-source image data with existing commercial Geographic Information System platforms, this research proposes the loading of multi-source local tile data based on CesiumJS and examines the tile data organization mechanisms and spatial reference differences of the CesiumJS platform, as well as various tile data sources, such as Google Maps, Map World, and Bing Maps. Two types of tile data loading schemes have been designed for the mashup of tiles: the single data source loading scheme and the multi-data source loading scheme. The multiple sources of digital map tiles used in this paper cover two different but mainstream spatial references, the WGS84 coordinate system and the Web Mercator coordinate system. According to the experimental results, the single data source loading scheme and the multi-data source loading scheme with the same spatial coordinate system showed favorable visualization effects; however, the multi-data source loading scheme was prone to tile image deformation when loading multi-source tile data with different spatial references. The resulting method provides a low-cost and highly flexible solution for small and medium-scale GIS programs and has a certain practical application potential. The problem of deformation during the transition between different spatial references is an important topic for further research.
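
The two spatial references at issue differ in how geographic coordinates map to tile indices. For the Web Mercator convention shared by Google Maps and Bing Maps, the standard "slippy map" formula can be sketched as follows (a generic formula, not code from the paper):

```python
import math

def latlon_to_tile(lat_deg, lon_deg, zoom):
    """WGS84 lat/lon (degrees) -> Web Mercator ('slippy map') tile indices."""
    n = 2 ** zoom                            # tiles per axis at this zoom level
    x = int((lon_deg + 180.0) / 360.0 * n)   # longitude maps linearly
    # latitude is stretched by the Mercator projection before tiling
    y = int((1.0 - math.asinh(math.tan(math.radians(lat_deg))) / math.pi) / 2.0 * n)
    return x, y

# Longitude of exactly +180 and the polar cut-offs (|lat| > 85.0511)
# need clamping in production code.
```

A plate carrée (WGS84) tile scheme instead divides latitude linearly as well, which is exactly why mixing the two schemes without reprojection produces the tile deformation the paper reports.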

  5. Calculation of total counting efficiency of a NaI(Tl) detector by hybrid Monte-Carlo method for point and disk sources

    Energy Technology Data Exchange (ETDEWEB)

    Yalcin, S. [Education Faculty, Kastamonu University, 37200 Kastamonu (Turkey)], E-mail: yalcin@gazi.edu.tr; Gurler, O.; Kaynak, G. [Department of Physics, Faculty of Arts and Sciences, Uludag University, Gorukle Campus, 16059 Bursa (Turkey); Gundogdu, O. [Department of Physics, School of Engineering and Physical Sciences, University of Surrey, Guildford GU2 7XH (United Kingdom)

    2007-10-15

    This paper presents results on the total gamma counting efficiency of a NaI(Tl) detector from point and disk sources. The directions of photons emitted from the source were determined by Monte-Carlo techniques and the photon path lengths in the detector were determined by analytic equations depending on photon directions. This is called the hybrid Monte-Carlo method where analytical expressions are incorporated into the Monte-Carlo simulations. A major advantage of this technique is the short computation time compared to other techniques on similar computational platforms. Another advantage is the flexibility for inputting detector-related parameters (such as source-detector distance, detector radius, source radius, detector linear attenuation coefficient) into the algorithm developed, thus making it an easy and flexible method to apply to other detector systems and configurations. The results of the total counting efficiency model put forward for point and disc sources were compared with the previous work reported in the literature.

  6. Calculation of total counting efficiency of a NaI(Tl) detector by hybrid Monte-Carlo method for point and disk sources

    International Nuclear Information System (INIS)

    Yalcin, S.; Gurler, O.; Kaynak, G.; Gundogdu, O.

    2007-01-01

    This paper presents results on the total gamma counting efficiency of a NaI(Tl) detector from point and disk sources. The directions of photons emitted from the source were determined by Monte-Carlo techniques and the photon path lengths in the detector were determined by analytic equations depending on photon directions. This is called the hybrid Monte-Carlo method where analytical expressions are incorporated into the Monte-Carlo simulations. A major advantage of this technique is the short computation time compared to other techniques on similar computational platforms. Another advantage is the flexibility for inputting detector-related parameters (such as source-detector distance, detector radius, source radius, detector linear attenuation coefficient) into the algorithm developed, thus making it an easy and flexible method to apply to other detector systems and configurations. The results of the total counting efficiency model put forward for point and disc sources were compared with the previous work reported in the literature
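
The hybrid scheme described above, Monte-Carlo sampling of emission directions combined with analytic chord lengths, can be illustrated for the simplest geometry: an on-axis point source facing a cylindrical crystal. This is a minimal reconstruction of the idea under our own assumptions (illustrative geometry and attenuation coefficient), not the authors' code:

```python
import math
import random

def total_efficiency(mu, radius, thickness, distance, n_photons=200_000, seed=1):
    """Hybrid Monte-Carlo estimate of total counting efficiency for an
    on-axis point source and a cylindrical detector.

    Directions are sampled isotropically (Monte Carlo); for each photon that
    strikes the front face, the chord length L through the crystal follows
    from analytic geometry, and the interaction probability is 1 - exp(-mu*L).
    mu: linear attenuation coefficient (1/cm); lengths in cm.
    """
    rng = random.Random(seed)
    detected = 0.0
    for _ in range(n_photons):
        cos_t = rng.uniform(-1.0, 1.0)          # isotropic: cos(theta) uniform
        if cos_t <= 0.0:
            continue                            # emitted away from the detector
        sin_t = math.sqrt(1.0 - cos_t * cos_t)
        if distance * sin_t >= radius * cos_t:  # i.e. d * tan(theta) >= R
            continue                            # misses the front face
        # exit through the back face or the side wall, whichever comes first
        path = thickness / cos_t
        if sin_t > 0.0:
            path = min(path, (radius - distance * sin_t / cos_t) / sin_t)
        detected += 1.0 - math.exp(-mu * path)
    return detected / n_photons
```

For example, for a 7.62 cm diameter by 7.62 cm crystal 10 cm from the source, the estimate is bounded above by the geometric solid-angle fraction (1 - d / sqrt(d² + R²)) / 2 ≈ 3.3%; only the attenuated fraction of that cone is counted.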

  7. NetMap - Creating a Map of Application Layer QoS Metrics of Mobile Networks Using Crowd Sourcing

    DEFF Research Database (Denmark)

    Mikkelsen, Lars Møller; Thomsen, Steffen Riber; Pedersen, Michael Sølvkjær

    2014-01-01

    Based on the continuous increase in network traffic on mobile networks, the large increase in smart devices, and the ever ongoing development of Internet enabled services, we argue for the need of a network performance map. In this paper NetMap is presented, which is a measurement system based...... on crowd sourcing, that utilizes end user smart devices in automatically measuring and gathering network performance metrics on mobile networks. Metrics measured include throughput, round trip times, connectivity, and signal strength, and are accompanied by a wide range of context information about...

  8. Mapping of low temperature heat sources in Denmark

    DEFF Research Database (Denmark)

    Bühler, Fabian; Holm, Fridolin Müller; Huang, Baijia

    2015-01-01

    Low temperature heat sources are available in many applications, ranging from waste heat from industrial processes and buildings to geothermal and solar heat sources. Technical advancements, such as heat pumps with novel cycle design and multi-component working fluids, make the utilisation of many … heat. The total accessible waste heat potential is found to be approximately 266 PJ per year, with 58% of it below 100 °C. In the natural heat category, temperatures below 20 °C originate from ambient air, sea water and shallow geothermal energy, and temperatures up to 100 °C are found for solar … and deep geothermal energy. The theoretical solar thermal potential alone would be above 500 PJ per year. For the development of advanced thermodynamic cycles for the integration of heat sources in the Danish energy system, several areas of interest are determined. In the maritime transport sector a high …

  9. Offshore dredger sounds: Source levels, sound maps, and risk assessment

    NARCIS (Netherlands)

    Jong, C.A.F. de; Ainslie, M.A.; Heinis, F.; Janmaat, J.

    2016-01-01

    The underwater sound produced during construction of the Port of Rotterdam harbor extension (Maasvlakte 2) was measured, with emphasis on the contribution of the trailing suction hopper dredgers during their various activities: dredging, transport, and discharge of sediment. Measured source levels

  10. Open Source Projects in Software Engineering Education: A Mapping Study

    Science.gov (United States)

    Nascimento, Debora M. C.; Almeida Bittencourt, Roberto; Chavez, Christina

    2015-01-01

    Context: It is common practice in academia to have students work with "toy" projects in software engineering (SE) courses. One way to make such courses more realistic and reduce the gap between academic courses and industry needs is getting students involved in open source projects (OSP) with faculty supervision. Objective: This study…

  11. Land cover change map comparisons using open source web mapping technologies

    Science.gov (United States)

    Erik Lindblom; Ian Housman; Tony Guay; Mark Finco; Kevin. Megown

    2015-01-01

    The USDA Forest Service is evaluating the status of current landscape change maps and assessing gaps in their information content. These activities have been occurring under the auspices of the Landscape Change Monitoring System (LCMS) project, which is a joint effort between USFS Research, USFS Remote Sensing Applications Center (RSAC), USGS Earth Resources...

  12. Preparation of Films and Sources for 4{pi} Counting; Prigotovlenie dlya 4{pi}-scheta plenok i istochnikov

    Energy Technology Data Exchange (ETDEWEB)

    Konstantinov, A. A.; Sazonova, T. E. [Vsesojuznyj Nauchno - Isledovatel' skij Institut Im. D.I. Mendeleeva, Leningrad, SSSR (Russian Federation)

    1967-03-15

    To obtain a high degree of accuracy in determining the specific activity of sources by the absolute counting of particles with a 4π counter, attention must be paid to the preparation of the radioactive sources. At the Mendeleev Institute of Metrology, celluloid films (surface density 8-10 μg/cm²) coated on both sides with gold (10-15 μg/cm²) or palladium (5-6 μg/cm²) are used as the bases of the radioactive sources. In order to reduce the correction for absorption of beta particles in the radioactive deposit, the base is specially treated with insulin. The authors present an extremely sensitive and effective method, employing the electron-capture nuclide ⁵⁴Mn (⁵⁴Cr), for determining the uniformity of distribution of the active layer over the entire insulin-treated surface. A solution of a ⁵⁴Mn (⁵⁴Cr) salt was applied to the insulin-treated film, and the source of ⁵⁴Cr (⁵⁴Mn) Auger K electrons thus obtained was investigated with the help of a proportional 4π counter. The total number of ⁵⁴Cr (⁵⁴Mn) Auger K electrons from the source was 8-12% less than expected from the fluorescence coefficient (calculated from the number of ⁵⁴Cr (⁵⁴Mn) K X-quanta emitted by the source) and the number of K electrons absorbed in the film (determined by the 'sandwich' method). From the differences, for insulin-treated and untreated ⁵⁴Mn (⁵⁴Cr) sources, between the calculated and recorded numbers of Auger electrons it is possible to reach a definite conclusion regarding the quality of the insulin treatment. (author)

  13. Determining random counts in liquid scintillation counting

    International Nuclear Information System (INIS)

    Horrocks, D.L.

    1979-01-01

    During measurements involving coincidence counting techniques, errors can arise due to the detection of chance or random coincidences in the multiple detectors used. A method and the electronic circuits necessary are here described for eliminating this source of error in liquid scintillation detectors used in coincidence counting. (UK)
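The correction described above can be sketched numerically. In the standard textbook model (an assumption here; the abstract does not give the circuit details), two detectors with singles rates R1 and R2 and a coincidence resolving time tau register chance coincidences at a rate of approximately 2*tau*R1*R2:

```python
def random_coincidence_rate(r1, r2, tau):
    """Classical estimate of the chance (accidental) coincidence rate for
    two detectors with singles rates r1, r2 (counts/s) and a coincidence
    resolving time tau (s): R_acc ~= 2 * tau * r1 * r2."""
    return 2.0 * tau * r1 * r2

def true_coincidence_rate(measured, r1, r2, tau):
    """Subtract the estimated accidental rate from the measured coincidence rate."""
    return measured - random_coincidence_rate(r1, r2, tau)

# Example: 1e4 counts/s in each channel, 20 ns resolving time
print(random_coincidence_rate(1e4, 1e4, 20e-9))  # 4.0 accidental counts/s
```

The electronic circuits in the paper eliminate this contribution in hardware; the arithmetic above is the equivalent software-side correction.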

  14. High Temporal Resolution Mapping of Seismic Noise Sources Using Heterogeneous Supercomputers

    Science.gov (United States)

    Paitz, P.; Gokhberg, A.; Ermert, L. A.; Fichtner, A.

    2017-12-01

    The time- and space-dependent distribution of seismic noise sources is becoming a key ingredient of modern real-time monitoring of various geo-systems like earthquake fault zones, volcanoes, geothermal and hydrocarbon reservoirs. We present results of an ongoing research project conducted in collaboration with the Swiss National Supercomputing Centre (CSCS). The project aims at building a service providing seismic noise source maps for Central Europe with high temporal resolution. We use source imaging methods based on the cross-correlation of seismic noise records from all seismic stations available in the region of interest. The service is hosted on the CSCS computing infrastructure; all computationally intensive processing is performed on the massively parallel heterogeneous supercomputer "Piz Daint". The solution architecture is based on the Application-as-a-Service concept to provide the interested researchers worldwide with regular access to the noise source maps. The solution architecture includes the following sub-systems: (1) data acquisition responsible for collecting, on a periodic basis, raw seismic records from the European seismic networks, (2) high-performance noise source mapping application responsible for the generation of source maps using cross-correlation of seismic records, (3) back-end infrastructure for the coordination of various tasks and computations, (4) front-end Web interface providing the service to the end-users and (5) data repository. The noise source mapping itself rests on the measurement of logarithmic amplitude ratios in suitably pre-processed noise correlations, and the use of simplified sensitivity kernels. During the implementation we addressed various challenges, in particular, selection of data sources and transfer protocols, automation and monitoring of daily data downloads, ensuring the required data processing performance, design of a general service-oriented architecture for coordination of various sub-systems, and
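The "logarithmic amplitude ratios" measured on pre-processed noise correlations can be illustrated with a minimal sketch: compare the energy in the causal (positive-lag) and acausal (negative-lag) branches of a cross-correlation trace. The windowing scheme below is an assumption made for illustration, not the project's actual processing:

```python
import numpy as np

def log_amplitude_ratio(cc, lag_window):
    """Logarithmic energy ratio between the causal and acausal branches of a
    noise cross-correlation. `cc` is a correlation trace of odd length centred
    on zero lag; `lag_window` is the number of lag samples compared on each
    side. The exact windowing used in the project is not specified in the
    abstract; this is a generic choice."""
    n = len(cc) // 2                         # index of the zero-lag sample
    causal = cc[n + 1 : n + 1 + lag_window]
    acausal = cc[n - lag_window : n][::-1]   # mirror negative lags
    e_plus = np.sum(causal ** 2)
    e_minus = np.sum(acausal ** 2)
    return 0.5 * np.log(e_plus / e_minus)

# A symmetric correlation (sources evenly distributed) gives a ratio of zero
lags = np.arange(-100, 101)
sym = np.exp(-((np.abs(lags) - 30.0) ** 2) / 50.0)
print(log_amplitude_ratio(sym, 60))  # 0.0
```

An asymmetric correlation, with more energy on one branch, yields a nonzero ratio whose sign indicates the dominant source direction along the station pair.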

  15. RBC count

    Science.gov (United States)

    ... by kidney disease) RBC destruction ( hemolysis ) due to transfusion, blood vessel injury, or other cause Leukemia Malnutrition Bone ... slight risk any time the skin is broken) Alternative Names Erythrocyte count; Red blood cell count; Anemia - RBC count Images Blood test ...

  16. Disease Mapping and Regression with Count Data in the Presence of Overdispersion and Spatial Autocorrelation: A Bayesian Model Averaging Approach

    Science.gov (United States)

    Mohebbi, Mohammadreza; Wolfe, Rory; Forbes, Andrew

    2014-01-01

    This paper applies the generalised linear model for modelling geographical variation to esophageal cancer incidence data in the Caspian region of Iran. The data have a complex and hierarchical structure that makes them suitable for hierarchical analysis using Bayesian techniques, but with care required to deal with problems arising from counts of events observed in small geographical areas when overdispersion and residual spatial autocorrelation are present. These considerations lead to nine regression models derived from using three probability distributions for count data: Poisson, generalised Poisson and negative binomial, and three different autocorrelation structures. We employ the framework of Bayesian variable selection and a Gibbs sampling based technique to identify significant cancer risk factors. The framework deals with situations where the number of possible models based on different combinations of candidate explanatory variables is large enough such that calculation of posterior probabilities for all models is difficult or infeasible. The evidence from applying the modelling methodology suggests that modelling strategies based on the use of generalised Poisson and negative binomial with spatial autocorrelation work well and provide a robust basis for inference. PMID:24413702
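The choice among the three count distributions hinges on overdispersion (variance exceeding the mean). A minimal moment-based check, not the paper's Bayesian machinery, is sketched below; the negative binomial parametrization Var = mu + mu^2/k is an assumption of this sketch:

```python
import numpy as np

def overdispersion_ratio(counts):
    """Variance-to-mean ratio: ~1 for Poisson data, >1 indicates overdispersion."""
    counts = np.asarray(counts, dtype=float)
    return counts.var(ddof=1) / counts.mean()

def nb_moment_estimates(counts):
    """Method-of-moments estimates (mu, size k) for a negative binomial with
    Var = mu + mu**2 / k. Returns k = inf when there is no overdispersion."""
    counts = np.asarray(counts, dtype=float)
    mu, var = counts.mean(), counts.var(ddof=1)
    if var <= mu:
        return mu, float("inf")
    return mu, mu ** 2 / (var - mu)

rng = np.random.default_rng(0)
poisson_like = rng.poisson(5.0, size=5000)
# NB with n=2, p=2/7: mean 5, variance 17.5 -- clearly overdispersed
overdispersed = rng.negative_binomial(2, 2 / 7, size=5000)
```

In the paper this decision is made formally through Bayesian model averaging; the ratio above is only the quick diagnostic that motivates considering the generalised Poisson and negative binomial alternatives.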

  17. Web GIS in practice IV: publishing your health maps and connecting to remote WMS sources using the Open Source UMN MapServer and DM Solutions MapLab

    Directory of Open Access Journals (Sweden)

    Honda Kiyoshi

    2006-01-01

    Full Text Available Abstract Open Source Web GIS software systems have reached a stage of maturity, sophistication, robustness and stability, and usability and user friendliness rivalling that of commercial, proprietary GIS and Web GIS server products. The Open Source Web GIS community is also actively embracing OGC (Open Geospatial Consortium standards, including WMS (Web Map Service. WMS enables the creation of Web maps that have layers coming from multiple different remote servers/sources. In this article we present one easy to implement Web GIS server solution that is based on the Open Source University of Minnesota (UMN MapServer. By following the accompanying step-by-step tutorial instructions, interested readers running mainstream Microsoft® Windows machines and with no prior technical experience in Web GIS or Internet map servers will be able to publish their own health maps on the Web and add to those maps additional layers retrieved from remote WMS servers. The 'digital Asia' and 2004 Indian Ocean tsunami experiences in using free Open Source Web GIS software are also briefly described.

  18. IMPLEMENTATION OF OPEN-SOURCE WEB MAPPING TECHNOLOGIES TO SUPPORT MONITORING OF GOVERNMENTAL SCHEMES

    Directory of Open Access Journals (Sweden)

    B. R. Pulsani

    2015-10-01

    Full Text Available Several schemes are undertaken by the government to uplift the social and economic condition of people. The monitoring of these schemes is done through information technology, where the involvement of Geographic Information Systems (GIS) is lacking. To demonstrate the benefits of thematic mapping as a tool for assisting officials in making decisions, a web mapping application was built for three government programs: the Mother and Child Tracking System (MCTS), Telangana State Housing Corporation Limited (TSHCL) and Ground Water Quality Mapping (GWQM). The three applications depict the distribution of various parameters thematically and helped in identifying areas with higher and weaker distributions. Based on the three applications, the study finds that many government schemes share this thematic-mapping character and concludes that the approach could be implemented for other schemes as well. The applications were developed using the SharpMap C# library, a free and open-source mapping library for developing geospatial applications. The study highlights the cost benefits of SharpMap, brings out the advantages of this library over proprietary vendors, and further discusses its advantages over other open-source libraries as well.

  19. Parametric normalization for full-energy peak efficiency of HPGe γ-ray spectrometers at different counting positions for bulky sources.

    Science.gov (United States)

    Peng, Nie; Bang-Fa, Ni; Wei-Zhi, Tian

    2013-02-01

    Application of effective interaction depth (EID) principle for parametric normalization of full energy peak efficiencies at different counting positions, originally for quasi-point sources, has been extended to bulky sources (within ∅30 mm×40 mm) with arbitrary matrices. It is also proved that the EID function for quasi-point source can be directly used for cylindrical bulky sources (within ∅30 mm×40 mm) with the geometric center as effective point source for low atomic number (Z) and low density (D) media and high energy γ-rays. It is also found that in general EID for bulky sources is dependent upon Z and D of the medium and the energy of the γ-rays in question. In addition, the EID principle was theoretically verified by MCNP calculations. Copyright © 2012 Elsevier Ltd. All rights reserved.
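The EID idea can be sketched as follows: if the detector response is treated as if all interactions occur at a single effective depth inside the crystal, a full-energy-peak efficiency measured at one counting position can be transferred to another by an inverse-square law about that depth. The specific functional form below is an illustrative reading of the principle, not the exact parametrization of the paper:

```python
def transfer_efficiency(eff_ref, h_ref, h_new, eid):
    """Scale a full-energy-peak efficiency measured with the source at
    distance h_ref (cm) from the detector face to a new distance h_new,
    assuming a point-like response at an effective interaction depth
    `eid` (cm) inside the crystal:

        eff(h) ~ 1 / (h + eid)**2

    This inverse-square form is an assumption for illustration; in the
    paper the EID itself depends on the matrix Z, density and gamma energy."""
    return eff_ref * ((h_ref + eid) / (h_new + eid)) ** 2

# Tripling the source distance with a 5 cm EID quarters the efficiency here
print(transfer_efficiency(0.04, 5.0, 15.0, 5.0))  # 0.01
```

For bulky sources the abstract's key result is that, for low-Z, low-density media and high-energy gamma rays, the geometric center of the cylinder may serve as the effective point source in such a transfer.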

  20. Automated detection of extended sources in radio maps: progress from the SCORPIO survey

    Science.gov (United States)

    Riggi, S.; Ingallinera, A.; Leto, P.; Cavallaro, F.; Bufano, F.; Schillirò, F.; Trigilio, C.; Umana, G.; Buemi, C. S.; Norris, R. P.

    2016-08-01

    Automated source extraction and parametrization represents a crucial challenge for the next-generation radio interferometer surveys, such as those performed with the Square Kilometre Array (SKA) and its precursors. In this paper, we present a new algorithm, called CAESAR (Compact And Extended Source Automated Recognition), to detect and parametrize extended sources in radio interferometric maps. It is based on a pre-filtering stage, allowing image denoising, compact source suppression and enhancement of diffuse emission, followed by an adaptive superpixel clustering stage for final source segmentation. A parametrization stage provides source flux information and a wide range of morphology estimators for post-processing analysis. We developed CAESAR in a modular software library, also including different methods for local background estimation and image filtering, along with alternative algorithms for both compact and diffuse source extraction. The method was applied to real radio continuum data collected at the Australian Telescope Compact Array (ATCA) within the SCORPIO project, a pathfinder of the Evolutionary Map of the Universe (EMU) survey at the Australian Square Kilometre Array Pathfinder (ASKAP). The source reconstruction capabilities were studied over different test fields in the presence of compact sources, imaging artefacts and diffuse emission from the Galactic plane and compared with existing algorithms. When compared to a human-driven analysis, the designed algorithm was found capable of detecting known target sources and regions of diffuse emission, outperforming alternative approaches over the considered fields.
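CAESAR itself cannot be reproduced from the abstract alone, but the role of its local background estimation and thresholding stages can be illustrated with a generic sigma-clipping detector. This is a common stand-in used by many source finders, not the CAESAR algorithm:

```python
import numpy as np

def sigma_clipped_stats(image, kappa=3.0, iters=5):
    """Estimate background level and noise rms by iteratively clipping pixels
    more than kappa*rms from the mean, so bright sources do not bias the
    background estimate. A simple stand-in for the local background
    estimators mentioned in the abstract."""
    data = np.asarray(image, dtype=float).ravel()
    for _ in range(iters):
        mu, sigma = data.mean(), data.std()
        keep = np.abs(data - mu) < kappa * sigma
        if keep.all():
            break
        data = data[keep]
    return data.mean(), data.std()

def detect_mask(image, nsigma=5.0):
    """Binary detection mask: pixels brighter than background + nsigma*rms."""
    bkg, rms = sigma_clipped_stats(image)
    return image > bkg + nsigma * rms

rng = np.random.default_rng(1)
img = rng.normal(0.0, 1.0, size=(64, 64))
img[30:34, 30:34] += 50.0          # one bright compact "source"
mask = detect_mask(img)
```

Extended and diffuse emission is precisely where this simple recipe fails, which is why CAESAR adds pre-filtering and superpixel clustering on top of such a detection stage.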

  1. Spatial resolution limits for the localization of noise sources using direct sound mapping

    DEFF Research Database (Denmark)

    Comesana, D. Fernandez; Holland, K. R.; Fernandez Grande, Efren

    2016-01-01

One of the main challenges arising from noise and vibration problems is how to identify the areas of a device, machine or structure that produce significant acoustic excitation, i.e. the localization of main noise sources. The direct visualization of sound, in particular sound intensity, has extensively been used for many years to locate sound sources. However, it is not yet well defined when two sources should be regarded as resolved by means of direct sound mapping. This paper derives the limits of the direct representation of sound pressure, particle velocity and sound intensity by exploring the relationship between spatial resolution, noise level and geometry. The proposed expressions are validated via simulations and experiments. It is shown that particle velocity mapping yields better results for identifying closely spaced sound sources than sound pressure or sound intensity, especially...

  2. Multi-Sensor Integration to Map Odor Distribution for the Detection of Chemical Sources

    Directory of Open Access Journals (Sweden)

    Xiang Gao

    2016-07-01

    Full Text Available This paper addresses the problem of mapping the odor distribution derived from a chemical source using multi-sensor integration and reasoning system design. Odor localization is the problem of finding the source of an odor or other volatile chemical. Most localization methods require a mobile vehicle to follow an odor plume along its entire path, which is time consuming and may be especially difficult in a cluttered environment. To address both of these challenges, this paper proposes a novel algorithm that combines data from odor and anemometer sensors, and combines sensors' data at different positions. Initially, a multi-sensor integration method, together with the path of airflow, was used to map the pattern of odor particle movement. Then, more sensors are introduced at specific regions to determine the probable location of the odor source. Finally, the results of an odor source location simulation and a real experiment are presented.

  3. Crowd-Sourced Mobility Mapping for Location Tracking Using Unlabeled Wi-Fi Simultaneous Localization and Mapping

    Directory of Open Access Journals (Sweden)

    Mu Zhou

    2015-01-01

    Full Text Available Due to the increasing requirements of seamless and round-the-clock location-based services (LBSs), a growing interest in Wi-Fi network aided location tracking has been witnessed over the past decade. One of the significant problems of conventional Wi-Fi location tracking approaches based on received signal strength (RSS) fingerprinting is the time-consuming and labor-intensive work involved in location fingerprint calibration. To solve this problem, a novel unlabeled Wi-Fi simultaneous localization and mapping (SLAM) approach is developed that avoids location fingerprinting and additional inertial or vision sensors. In this approach, an unlabeled mobility map of the coverage area is first constructed by crowd-sourcing from a batch of sporadically recorded Wi-Fi RSS sequences based on spectral cluster assembling. Then, a sequence alignment algorithm is applied to conduct location tracking and mobility map updating. Finally, the effectiveness of this approach is verified by extensive experiments carried out in a campus-wide area.
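The sequence-alignment step can be illustrated with dynamic time warping (DTW), a standard way to align RSS sequences recorded at different walking speeds. The abstract does not name the paper's exact alignment algorithm, so DTW here is an assumed stand-in:

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two RSS sequences (dBm).
    Classic O(n*m) dynamic program: each cell holds the minimal cumulative
    alignment cost ending at that sample pair."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Same walk at two speeds aligns far better than a different path
walk = [-40, -42, -45, -50, -60, -70]
same_path_slower = [-40, -41, -42, -45, -50, -55, -60, -70]
other_path = [-80, -78, -75, -70, -60, -50]
```

A new RSS sequence can then be matched against the crowd-sourced mobility map by finding the stored trajectory with the smallest alignment distance.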

  4. Prototype of interactive Web Maps: an approach based on open sources

    Directory of Open Access Journals (Sweden)

    Jürgen Philips

    2004-07-01

    Full Text Available To explore the potential available in the World Wide Web (WWW), a prototype of an interactive Web map was elaborated using standardized codes and open sources, such as eXtensible Markup Language (XML), Scalable Vector Graphics (SVG), the Document Object Model (DOM), the script languages ECMAScript/JavaScript and "PHP: Hypertext Preprocessor", and PostgreSQL with its extension PostGIS, to disseminate information related to the urban real estate register. Data from the City Hall of São José, Santa Catarina, were used, referring to the Campinas district. Using the client/server model, a prototype of a Web map with standardized codes and open sources was implemented, allowing a user to visualize Web maps in his/her browser using only Adobe's plug-in Viewer 3.0. Aiming at a good cartographic project for the Web, rules of graphic translation were obeyed and different interaction functionalities were implemented, such as interactive legends, symbolization and dynamic scale. From the results, the use of standardized codes and open sources in interactive Web mapping projects can be recommended. With the use of open source code in public and private administration, the possibility of technological development is amplified and, consequently, expenses for the acquisition of computer programs are reduced. Besides, it stimulates the development of computer applications targeting specific demands and requirements.

  5. High temporal resolution mapping of seismic noise sources using heterogeneous supercomputers

    Science.gov (United States)

    Gokhberg, Alexey; Ermert, Laura; Paitz, Patrick; Fichtner, Andreas

    2017-04-01

    Time- and space-dependent distribution of seismic noise sources is becoming a key ingredient of modern real-time monitoring of various geo-systems. Significant interest in seismic noise source maps with high temporal resolution (days) is expected to come from a number of domains, including natural resources exploration, analysis of active earthquake fault zones and volcanoes, as well as geothermal and hydrocarbon reservoir monitoring. Currently, knowledge of noise sources is insufficient for high-resolution subsurface monitoring applications. Near-real-time seismic data, as well as advanced imaging methods to constrain seismic noise sources, have recently become available. These methods are based on the massive cross-correlation of seismic noise records from all available seismic stations in the region of interest and are therefore very computationally intensive. Heterogeneous massively parallel supercomputing systems introduced in recent years combine conventional multi-core CPUs with GPU accelerators and provide an opportunity for a manifold increase in computing performance. Therefore, these systems represent an efficient platform for implementation of a noise source mapping solution. We present the first results of an ongoing research project conducted in collaboration with the Swiss National Supercomputing Centre (CSCS). The project aims at building a service that provides seismic noise source maps for Central Europe with high temporal resolution (days to few weeks depending on frequency and data availability). The service is hosted on the CSCS computing infrastructure; all computationally intensive processing is performed on the massively parallel heterogeneous supercomputer "Piz Daint". The solution architecture is based on the Application-as-a-Service concept in order to provide interested external researchers with regular access to the noise source maps. The solution architecture includes the following sub-systems: (1) data acquisition responsible for

  6. Counting carbohydrates

    Science.gov (United States)

    Carb counting; Carbohydrate-controlled diet; Diabetic diet; Diabetes-counting carbohydrates ... Many foods contain carbohydrates (carbs), including: Fruit and fruit juice Cereal, bread, pasta, and rice Milk and milk products, soy milk Beans, legumes, ...

  7. Large-region acoustic source mapping using a movable array and sparse covariance fitting.

    Science.gov (United States)

    Zhao, Shengkui; Tuna, Cagdas; Nguyen, Thi Ngoc Tho; Jones, Douglas L

    2017-01-01

    Large-region acoustic source mapping is important for city-scale noise monitoring. Approaches using a single-position measurement scheme to scan large regions using small arrays cannot provide clean acoustic source maps, while deploying large arrays spanning the entire region of interest is prohibitively expensive. A multiple-position measurement scheme is applied to scan large regions at multiple spatial positions using a movable array of small size. Based on the multiple-position measurement scheme, a sparse-constrained multiple-position vectorized covariance matrix fitting approach is presented. In the proposed approach, the overall sample covariance matrix of the incoherent virtual array is first estimated using the multiple-position array data and then vectorized using the Khatri-Rao (KR) product. A linear model is then constructed for fitting the vectorized covariance matrix and a sparse-constrained reconstruction algorithm is proposed for recovering source powers from the model. The user parameter settings are discussed. The proposed approach is tested on a 30 m × 40 m region and a 60 m × 40 m region using simulated and measured data. Much cleaner acoustic source maps and lower sound pressure level errors are obtained compared to the beamforming approaches and the previous sparse approach [Zhao, Tuna, Nguyen, and Jones, Proc. IEEE Intl. Conf. on Acoustics, Speech and Signal Processing (ICASSP) (2016)].
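The covariance-vectorization step can be sketched for a single small array. The paper stacks measurements from multiple array positions into an incoherent virtual array and uses a dedicated sparse-constrained solver; the uniform linear array, coarse direction grid, and projected-gradient nonnegative least squares below are all simplifying assumptions of this sketch:

```python
import numpy as np

def khatri_rao(A, B):
    """Column-wise Khatri-Rao product: column j is kron(A[:, j], B[:, j])."""
    return np.einsum("ij,kj->ikj", A, B).reshape(A.shape[0] * B.shape[0], -1)

def fit_powers(R, A, iters=3000):
    """Fit nonnegative source powers p in R ~ A diag(p) A^H by vectorizing:
    vec(R) = khatri_rao(conj(A), A) @ p, then solving the nonnegative
    least-squares problem with projected gradient descent (a simple
    stand-in for the paper's sparse-constrained reconstruction)."""
    B = khatri_rao(A.conj(), A)
    y = R.reshape(-1, order="F")                # column-major vec(R)
    BhB, Bhy = B.conj().T @ B, B.conj().T @ y
    step = 1.0 / np.linalg.norm(BhB, 2)         # 1 / largest eigenvalue
    p = np.zeros(A.shape[1])
    for _ in range(iters):
        p = np.maximum(p - step * (BhB @ p - Bhy).real, 0.0)
    return p

# 8-element half-wavelength ULA, coarse grid of candidate directions
M = 8
angles = np.deg2rad(np.arange(-60, 61, 10))     # 13 candidates
A = np.exp(1j * np.pi * np.outer(np.arange(M), np.sin(angles)))
p_true = np.zeros(len(angles))
p_true[3], p_true[10] = 1.0, 0.5                # sources at -30 and +40 deg
R = (A * p_true) @ A.conj().T + 0.01 * np.eye(M)
p_hat = fit_powers(R, A)
```

The recovered power vector peaks at the two true grid indices; in the multiple-position scheme the same linear model is simply built from the block-diagonal covariance of the virtual array.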

  8. Model-based analysis and optimization of the mapping of cortical sources in the spontaneous scalp EEG

    NARCIS (Netherlands)

    Sazonov, A.; Bergmans, J.W.M.; Cluitmans, P.J.M.; Griep, P.A.M.; Arends, J.B.A.M.; Boon, P.A.J.M.

    2007-01-01

    The mapping of brain sources into the scalp electroencephalogram (EEG) depends on volume conduction properties of the head and on an electrode montage involving a reference. Mathematically, this source mapping (SM) is fully determined by an observation function (OF) matrix. This paper analyses the

  9. Mapping correlation of a simulated dark matter source and a point source in the gamma-ray sky - Oral Presentation

    Energy Technology Data Exchange (ETDEWEB)

    Gibson, Alexander [SLAC National Accelerator Lab., Menlo Park, CA (United States)

    2015-08-23

    In my research, I analyzed how two gamma-ray source models interact with one another when optimized to fit data. This is important because it becomes hard to distinguish between the two point sources when they are close together or when looking at low-energy photons. The reason for the first is obvious; the reason they become harder to distinguish at lower photon energies is that the resolving power of the Fermi Gamma-ray Space Telescope gets worse at lower energies. When the two point sources are highly correlated (hard to distinguish between), we need to change our method of statistical analysis. What I did was show that highly correlated sources have larger uncertainties associated with them, caused by an optimizer not knowing which point source's parameters to optimize. I also mapped out where there is high correlation for two theoretical dark matter point sources of different masses, so that people analyzing them in the future know where they have to use more sophisticated statistical analysis.

  10. Identifying fecal pollution sources using 3M(™) Petrifilm (™) count plates and antibiotic resistance analysis in the Horse Creek Watershed in Aiken County, SC (USA).

    Science.gov (United States)

    Harmon, S Michele; West, Ryan T; Yates, James R

    2014-12-01

    Sources of fecal coliform pollution in a small South Carolina (USA) watershed were identified using inexpensive methods and commonly available equipment. Samples from the upper reaches of the watershed were analyzed with 3M(™) Petrifilm(™) count plates. We were able to narrow down the study's focus to one particular tributary, Sand River, that was the major contributor of the coliform pollution (both fecal and total) to a downstream reservoir that is heavily used for recreation purposes. Concentrations of total coliforms ranged from 2,400 to 120,333 cfu/100 mL, with sharp increases in coliform counts observed in samples taken after rain events. Positive correlations between turbidity and fecal coliform counts suggested a relationship between fecal pollution and stormwater runoff. Antibiotic resistance analysis (ARA) compared antibiotic resistance profiles of fecal coliform isolates from the stream to those of a watershed-specific fecal source library (equine, waterfowl, canines, and untreated sewage). Known fecal source isolates and unknown isolates from the stream were exposed to six antibiotics at three concentrations each. Discriminant analysis grouped known isolates with an overall average rate of correct classification (ARCC) of 84.3 %. A total of 401 isolates from the first stream location were classified as equine (45.9 %), sewage (39.4 %), waterfowl (6.2 %), and feline (8.5 %). A similar pattern was observed at the second sampling location, with 42.6 % equine, 45.2 % sewage, 2.8 % waterfowl, 0.6 % canine, and 8.8 % feline. While there were slight weather-dependent differences, the vast majority of the coliform pollution in this stream appeared to be from two sources, equine and sewage. This information will contribute to better land use decisions and further justify implementation of low-impact development practices within this urban watershed.

  11. Pixel-based parametric source depth map for Cerenkov luminescence imaging

    International Nuclear Information System (INIS)

    Altabella, L.; Spinelli, A.E.; Boschi, F.

    2016-01-01

    Optical tomography represents a challenging problem in optical imaging because of the intrinsically ill-posed inverse problem due to photon diffusion. Cerenkov luminescence tomography (CLT) for optical photons produced in tissues by several radionuclides (i.e.: 32P, 18F, 90Y) has been investigated using both a 3D multispectral approach and multiview methods. Difficulties in the convergence of 3D algorithms can discourage the use of this technique for obtaining source depth and intensity information. For these reasons, we developed a faster, corrected 2D approach based on multispectral acquisitions to obtain the source depth and its intensity using a pixel-based fitting of source intensity. Monte Carlo simulations and experimental data were used to develop and validate the method for obtaining the parametric map of source depth. With this approach we obtain parametric source depth maps with a precision between 3% and 7% for MC simulations and 5-6% for experimental data. Using this method we are able to obtain reliable information about the source depth of Cerenkov luminescence with a simple and flexible procedure
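The pixel-based fit can be sketched with a simple spectral attenuation model. Assuming Beer-Lambert attenuation I(lambda) = S(lambda) * exp(-mu(lambda) * d) per pixel, with a known emission spectrum S and effective attenuation coefficients mu per band, the depth d follows from a linear fit of ln(S/I) against mu. This linearized model is an illustrative stand-in; the paper's actual fitting procedure and tissue optics are more involved:

```python
import numpy as np

def depth_from_multispectral(I, S, mu):
    """Per-pixel source depth from multispectral intensities under a
    Beer-Lambert model I(lambda) = S(lambda) * exp(-mu(lambda) * d).
    `I` has shape (n_bands, ny, nx); `S` (unattenuated spectrum) and
    `mu` (effective attenuation, 1/cm) have shape (n_bands,). The depth
    is the least-squares slope of ln(S/I) against mu (fit through the
    origin): d = sum(mu * ln(S/I)) / sum(mu**2)."""
    y = np.log(S[:, None, None] / I)          # equals mu * d per band
    return np.tensordot(mu, y, axes=1) / np.dot(mu, mu)

# Synthetic check: a uniform source plane at 0.8 cm depth, 4 spectral bands
mu = np.array([10.0, 6.0, 3.0, 1.5])          # bluer bands attenuate more
S = np.array([1.0, 0.9, 0.7, 0.5])
d_true = 0.8
I = S[:, None, None] * np.exp(-mu[:, None, None] * d_true) * np.ones((4, 8, 8))
depth_map = depth_from_multispectral(I, S, mu)
```

The same per-pixel regression, applied to real multispectral Cerenkov images, yields the parametric depth map; the source intensity then follows from the fitted intercept.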

  12. Review of single particle dynamics for third generation light sources through frequency map analysis

    Directory of Open Access Journals (Sweden)

    L. Nadolski

    2003-11-01

    Full Text Available Frequency map analysis [J. Laskar, Icarus 88, 266 (1990)] is used here to analyze the transverse dynamics of four third generation synchrotron light sources: the ALS, the ESRF, the SOLEIL project, and Super-ACO. Time variations of the betatron tunes give additional information about the global dynamics of the beam. The main resonances are revealed; a one-to-one correspondence between the configuration space and the frequency space can be performed. We stress that the frequency maps, and therefore the dynamics optimization, are highly sensitive to sextupole strengths and vary greatly from one machine to another. The frequency maps can thus be used to characterize the different machines.
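The tune-based diagnostic can be illustrated with a toy tracking signal. Laskar's NAFF algorithm refines the frequency estimate far beyond the plain FFT resolution of 1/N used below, and real frequency maps track nonlinear lattices rather than a pure cosine; both simplifications are assumptions of this sketch:

```python
import numpy as np

def fft_tune(x):
    """Fractional betatron tune from turn-by-turn data, taken as the
    frequency of the largest FFT peak (resolution ~1/N; NAFF does much
    better). The mean is removed to suppress the DC bin."""
    spec = np.abs(np.fft.rfft(x - x.mean()))
    return np.argmax(spec) / len(x)

def tune_diffusion(x):
    """Frequency-map diffusion indicator: change in the measured tune
    between the first and second halves of the tracking data. Large
    values flag chaotic regions of phase space."""
    n = len(x) // 2
    return abs(fft_tune(x[:n]) - fft_tune(x[n:2 * n]))

# Linear (integrable) motion: the tune is constant, so diffusion is ~0
turns = np.arange(1024)
x = np.cos(2 * np.pi * 0.31 * turns)
```

Plotting the diffusion indicator over a grid of initial amplitudes, against the measured tune pair, produces the frequency map in which resonance lines appear.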

  13. THE USE OF MULTIPLE DATA SOURCES IN THE PROCESS OF TOPOGRAPHIC MAPS UPDATING

    Directory of Open Access Journals (Sweden)

    A. Cantemir

    2016-06-01

    Full Text Available The methods used in the process of updating maps have evolved and become more complex, especially with the development of digital technology. At the same time, the development of technology has led to an abundance of available data that can be used in the updating process. The data sources come in a great variety of forms and formats from different acquisition sensors. Satellite images provided by certain satellite missions are now available on space agencies' portals. Images stored in the archives of satellite missions such as Sentinel, Landsat and others can be downloaded free of charge. The main advantages are the large coverage area and rather good spatial resolution, which enable the use of these images for map updating at an appropriate scale. In our study we focused our research of these images on the 1:50,000 scale map. Globally available DEMs could represent an appropriate input for watershed delineation and stream network generation, which can be used as support for updating the hydrography thematic layer. If, in addition to remote sensing, aerial photogrammetry and LiDAR data are used, the accuracy of the data sources is enhanced. Orthophotoimages and digital terrain models are the main products that can be used for feature extraction and update. On the other hand, the use of georeferenced analogue basemaps represents a significant addition to the process. Concerning thematic maps, the classic representation of the terrain by contour lines derived from the DTM remains the best method of portraying the earth's surface on a map; nevertheless, correlation with other layers such as hydrography is mandatory.
In the context of the current national coverage of the Digital Terrain Model, one of the main concerns of the National Center of Cartography, through the Cartography and Photogrammetry Department, is the exploitation of the available data in order to update the layers of the Topographic Reference Map 1:5000, known as

  14. Open Source Software Development Experiences on the Students' Resumes: Do They Count?--Insights from the Employers' Perspectives

    Science.gov (United States)

    Long, Ju

    2009-01-01

    Open Source Software (OSS) is a major force in today's Information Technology (IT) landscape. Companies are increasingly using OSS in mission-critical applications. The transparency of the OSS technology itself with openly available source codes makes it ideal for students to participate in the OSS project development. OSS can provide unique…

  15. Impact source location on composite CNG storage tank using acoustic emission energy based signal mapping method

    Energy Technology Data Exchange (ETDEWEB)

    Han, Byeong Hee; Yoon, Dong Jin; Park, Chun Soo [Korea Research Institute of Standards and Science, Center for Safety Measurement, Daejeon (Korea, Republic of); Lee, Young Shin [Dept. of Mechanical Design Engineering, Chungnam National University, Daejeon (Korea, Republic of)

    2016-10-15

    Acoustic emission (AE) is one of the most powerful techniques for detecting damage and identifying damage locations during operation. However, the conventional AE source location technique has limitations because it strongly depends on the wave speed in the corresponding structure, which here comprises heterogeneous composite materials. A compressed natural gas (CNG) pressure vessel is usually wrapped on the outside with a carbon fiber composite for the purpose of strengthening. In this type of composite material, locating impact damage sources exactly using the conventional time-of-arrival method is difficult. To overcome this limitation, this study applied the previously developed Contour D/B map technique to four types of CNG storage tanks to identify the source locations of damage caused by external shocks. The results of the source location identification for the different types were compared.

  16. Calorie count - fast food

    Science.gov (United States)


  17. Counting cormorants

    DEFF Research Database (Denmark)

    Bregnballe, Thomas; Carss, David N; Lorentsen, Svein-Håkon

    2013-01-01

    This chapter focuses on Cormorant population counts for both summer (i.e. breeding) and winter (i.e. migration, winter roosts) seasons. It also explains differences in the data collected from undertaking ‘day’ versus ‘roost’ counts, gives some definitions of the term ‘numbers’, and presents two...

  18. High-frequency maximum observable shaking map of Italy from fault sources

    KAUST Repository

    Zonno, Gaetano; Basili, Roberto; Meroni, Fabrizio; Musacchio, Gemma; Mai, Paul Martin; Valensise, Gianluca

    2012-01-01

    We present a strategy for obtaining fault-based maximum observable shaking (MOS) maps, which represent an innovative concept for assessing deterministic seismic ground motion at a regional scale. Our approach uses the fault sources supplied for Italy by the Database of Individual Seismogenic Sources, and particularly by its composite seismogenic sources (CSS), a spatially continuous simplified 3-D representation of a fault system. For each CSS, we consider the associated Typical Fault, i. e., the portion of the corresponding CSS that can generate the maximum credible earthquake. We then compute the high-frequency (1-50 Hz) ground shaking for a rupture model derived from its associated maximum credible earthquake. As the Typical Fault floats within its CSS to occupy all possible positions of the rupture, the high-frequency shaking is updated in the area surrounding the fault, and the maximum from that scenario is extracted and displayed on a map. The final high-frequency MOS map of Italy is then obtained by merging 8,859 individual scenario-simulations, from which the ground shaking parameters have been extracted. To explore the internal consistency of our calculations and validate the results of the procedure we compare our results (1) with predictions based on the Next Generation Attenuation ground-motion equations for an earthquake of M w 7.1, (2) with the predictions of the official Italian seismic hazard map, and (3) with macroseismic intensities included in the DBMI04 Italian database. We then examine the uncertainties and analyse the variability of ground motion for different fault geometries and slip distributions. © 2012 Springer Science+Business Media B.V.
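The final merging step described above, extracting the maximum shaking over 8,859 scenario simulations, reduces to a pixel-wise maximum over the scenario grids. The NaN convention for pixels a scenario does not cover is an assumption of this sketch:

```python
import numpy as np

def merge_mos(scenario_maps):
    """Merge per-scenario shaking grids into a maximum-observable-shaking
    (MOS) map by taking the pixel-wise maximum across scenarios, as in
    the merging of the individual scenario simulations. NaN marks pixels
    a given scenario does not cover (assumed convention)."""
    stack = np.stack(scenario_maps)
    return np.nanmax(stack, axis=0)

# Two toy 2x2 scenario grids with partial coverage
a = np.array([[0.1, 0.3], [np.nan, 0.2]])
b = np.array([[0.2, 0.1], [0.4, np.nan]])
mos = merge_mos([a, b])
```

In the paper, each grid is itself the peak ground shaking of one Typical Fault position within its composite seismogenic source; the MOS map is the envelope of all of them.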

  20. Mapping of potential heat sources for heat pumps for district heating in Denmark

    International Nuclear Information System (INIS)

    Lund, Rasmus; Persson, Urban

    2016-01-01

    The ambitious policy in Denmark of having a 100% renewable energy supply in 2050 requires radical changes to the energy systems to avoid an extensive and unsustainable use of biomass resources. Currently, wind power is being expanded and the increasing supply of electricity is slowly pushing the CHP (combined heat and power) plants out of operation, reducing the energy efficiency of the DH (district heating) supply. Here, large heat pumps for district heating are a frequently mentioned solution, offering a flexible demand for electricity and an energy-efficient heat producer. The idea is to have the heat pumps use a low-temperature waste or ambient heat source, but it has so far been very unclear which heat sources are actually available for this purpose. In this study eight categories of heat sources are analysed for the case of Denmark and included in a detailed spatial analysis where the identified heat sources are put in relation to the district heating areas and the corresponding demands. The analysis shows that potential heat sources are present near almost all district heating areas and that sea water will most likely have to play a substantial role as a heat source in future energy systems in Denmark. - Highlights: • The availability of heat sources for heat pumps in Denmark is mapped and quantified. • A novel methodology for assessment of low-temperature industrial excess heat is presented. • There are heat sources available for 99% of district heating networks in Denmark. • The concentration of heat sources is generally greater around larger cities than smaller ones. • Ambient-temperature heat sources will be more needed in district heating of big cities.

  1. An Open Source Web Map Server Implementation For California and the Digital Earth: Lessons Learned

    Science.gov (United States)

    Sullivan, D. V.; Sheffner, E. J.; Skiles, J. W.; Brass, J. A.; Condon, Estelle (Technical Monitor)

    2000-01-01

    This paper describes an Open Source implementation of the Open GIS Consortium's Web Map interface. It is based on the very popular Apache WWW Server, the Sun Microsystems Java Servlet Development Kit, and a C language shared library interface to a spatial datastore. This server was initially written as a proof of concept, to support a National Aeronautics and Space Administration (NASA) Digital Earth test bed demonstration. It will also find use in the California Land Science Information Partnership (CaLSIP), a joint program between NASA and the state of California. At least one Web Map-enabled server will be installed in every one of the state's 58 counties. This server will form a basis for a simple, easily maintained installation for those entities that do not yet require one of the larger, more expensive, commercial offerings.

  2. Application of the Frequency Map Analysis to the Study of the Beam Dynamics of Light Sources

    International Nuclear Information System (INIS)

    Nadolski, Laurent

    2001-01-01

    The topic of this thesis is the study of beam dynamics in storage rings with a restriction to single-particle transverse dynamics. In a first part, tools (Frequency Map Analysis, Hamiltonian, integrator) are presented for studying and exploring the dynamics. Numerical simulations of four synchrotron radiation sources (the ALS, the ESRF, SOLEIL and Super-ACO) are performed. We construct a tracking code based on a new class of symplectic integrators (Laskar and Robutel, 2000). These integrators with only positive steps are more precise by an order of magnitude than the standard Forest and Ruth scheme. Comparisons with the BETA, DESPOT and MAD codes are carried out. Frequency Map Analysis (Laskar, 1990) is our main analysis tool. This is a numerical method for analysing a conservative dynamical system. Based on a refined Fourier technique, it enables us to compute frequency maps which are real footprints of the beam dynamics of an accelerator. We stress the high sensitivity of the dynamics to magnetic errors and sextupolar strengths. The second part of this work is dedicated to the analysis of experimental results from two light sources. Together with the ALS accelerator team (Berkeley), we succeeded in obtaining the first experimental frequency map of an accelerator. The agreement with the machine model is very impressive. At the Super-ACO ring, the study of the tune shift with amplitude enabled us to highlight a strong octupolar-like component related to the quadrupole fringe field. The consequences for the beam dynamics are important and give us a better understanding of the measured ring performance. All these results are based on turn-by-turn measurements. Many closely related phenomena are treated, such as response matrix analysis and beam decoherence. (author) [fr]
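
    The core operation of Frequency Map Analysis, extracting the oscillation frequencies (tunes) from turn-by-turn tracking data, can be illustrated with a plain FFT peak search. Laskar's refined Fourier technique (NAFF) achieves far better precision than the ~1/N of a raw FFT, so the sketch below is only conceptual:

```python
import numpy as np

def estimate_tune(x):
    """Crude tune estimate: frequency of the largest FFT peak (cycles/turn).

    NAFF refines this peak by iterative interpolation; a plain FFT is only
    accurate to about 1/N but shows the idea.
    """
    n = len(x)
    spectrum = np.abs(np.fft.rfft(x - np.mean(x)))
    peak = np.argmax(spectrum[1:]) + 1   # skip the DC bin
    return peak / n

# Synthetic turn-by-turn position data with a known tune of 0.31
turns = np.arange(1024)
x = np.cos(2 * np.pi * 0.31 * turns)
print(estimate_tune(x))
```

    Applying this to every tracked initial condition, in horizontal and vertical planes, yields the points of a frequency map.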

  3. MAP-Based Underdetermined Blind Source Separation of Convolutive Mixtures by Hierarchical Clustering and ℓ1-Norm Minimization

    Directory of Open Access Journals (Sweden)

    Kellermann Walter

    2007-01-01

    Full Text Available We address the problem of underdetermined BSS. While most previous approaches are designed for instantaneous mixtures, we propose a time-frequency-domain algorithm for convolutive mixtures. We adopt a two-step method based on a general maximum a posteriori (MAP) approach. In the first step, we estimate the mixing matrix based on hierarchical clustering, assuming that the source signals are sufficiently sparse. The algorithm works directly on the complex-valued data in the time-frequency domain and shows better convergence than algorithms based on self-organizing maps. The assumption of Laplacian priors for the source signals in the second step leads to an algorithm for estimating the source signals. It involves the ℓ1-norm minimization of complex numbers because of the use of the time-frequency-domain approach. We compare a combinatorial approach initially designed for real numbers with a second-order cone programming (SOCP) approach designed for complex numbers. We found that although the former approach is not theoretically justified for complex numbers, its results are comparable to, or even better than, the SOCP solution. The advantage is a lower computational cost for problems with low input/output dimensions.
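
    The second step, recovering sparse sources once the mixing matrix is known, reduces at each time-frequency point to minimizing the ℓ1-norm of the source vector subject to the mixing constraint. A sketch of the real-valued case as a linear program (the complex-valued case treated in the paper needs SOCP instead; the matrix and signal values here are illustrative):

```python
import numpy as np
from scipy.optimize import linprog

def l1_recover(A, x):
    """Minimize ||s||_1 subject to A s = x (real-valued simplification).

    Split s = u - v with u, v >= 0 and solve the equivalent linear program.
    """
    m, n = A.shape
    c = np.ones(2 * n)                           # objective: sum(u) + sum(v)
    A_eq = np.hstack([A, -A])                    # A u - A v = x
    res = linprog(c, A_eq=A_eq, b_eq=x, bounds=[(0, None)] * (2 * n))
    u, v = res.x[:n], res.x[n:]
    return u - v

# Two mixtures of three sources; only one source is active, so the sparse
# (minimum-l1) solution should recover it exactly.
A = np.array([[1.0, 0.5, 0.2],
              [0.3, 1.0, 0.8]])
s_true = np.array([0.0, 2.0, 0.0])
x = A @ s_true
print(np.round(l1_recover(A, x), 3))
```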

  4. Novel Family of Single-Phase Modified Impedance-Source Buck-Boost Multilevel Inverters with Reduced Switch Count

    DEFF Research Database (Denmark)

    Husev, Oleksandr; Strzelecki, Ryszard; Blaabjerg, Frede

    2016-01-01

    This paper describes novel single-phase solutions with increased inverter voltage levels derived by means of a nonstandard inverter configuration and impedance-source networks. Operation principles based on special modulation techniques are presented. Detailed component design guidelines along with simulation and experimental verification are also provided. Possible application fields are discussed, as well as advantages and disadvantages. Finally, future studies are addressed for the new solutions.

  5. Counting the dead to determine the source and transmission of the marine herpesvirus OsHV-1 in Crassostrea gigas.

    Science.gov (United States)

    Whittington, Richard J; Paul-Pont, Ika; Evans, Olivia; Hick, Paul; Dhand, Navneet K

    2018-04-10

    Marine herpesviruses are responsible for epizootics in economically, ecologically and culturally significant taxa. The recent emergence of microvariants of Ostreid herpesvirus 1 (OsHV-1) in Pacific oysters Crassostrea gigas has resulted in socioeconomic losses in Europe, New Zealand and Australia; however, there is no information on their origin or mode of transmission. These factors need to be understood because they influence the way the disease may be prevented and controlled. Mortality data obtained from experimental populations of C. gigas during natural epizootics of OsHV-1 disease in Australia were analysed qualitatively. In addition we compared actual mortality data with those from a Reed-Frost model of direct transmission and analysed incubation periods using Sartwell's method to test for the type of epizootic, point source or propagating. We concluded that outbreaks were initiated from an unknown environmental source which is unlikely to be farmed oysters in the same estuary. While direct oyster-to-oyster transmission may occur in larger oysters if they are in close proximity (< 40 cm), it did not explain the observed epizootics, point source exposure and indirect transmission being more common and important. A conceptual model is proposed for OsHV-1 index case source and transmission, leading to endemicity with recurrent seasonal outbreaks. The findings suggest that prevention and control of OsHV-1 in C. gigas will require multiple interventions. OsHV-1 in C. gigas, which is a sedentary animal once beyond the larval stage, is an informative model when considering marine host-herpesvirus relationships.
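
    The Reed-Frost chain-binomial model used above as the benchmark for direct transmission is simple to simulate: each susceptible individual escapes infection by every current infective independently. A hypothetical sketch (the population size and transmission probability are illustrative, not values from the study):

```python
import random

def reed_frost(s0, i0, p, generations):
    """Chain-binomial Reed-Frost model of direct transmission.

    Each susceptible escapes infection by all current infectives
    independently, so P(infection) = 1 - (1 - p) ** I.
    """
    s, i = s0, i0
    cases = [i0]
    for _ in range(generations):
        p_inf = 1.0 - (1.0 - p) ** i
        new_i = sum(1 for _ in range(s) if random.random() < p_inf)
        s, i = s - new_i, new_i
        cases.append(new_i)
    return cases

random.seed(1)
# 99 susceptible oysters, 1 index case, 5% pairwise transmission probability
print(reed_frost(99, 1, 0.05, 10))
```

    Comparing such simulated epidemic curves against observed mortality is one way to judge whether direct transmission alone can explain an outbreak.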

  6. TRAM (Transcriptome Mapper): database-driven creation and analysis of transcriptome maps from multiple sources

    Directory of Open Access Journals (Sweden)

    Danieli Gian

    2011-02-01

    Full Text Available Abstract Background Several tools have been developed to perform global gene expression profile data analysis, to search for specific chromosomal regions whose features meet defined criteria as well as to study neighbouring gene expression. However, most of these tools are tailored for a specific use in a particular context (e.g. they are species-specific, or limited to a particular data format and they typically accept only gene lists as input. Results TRAM (Transcriptome Mapper) is a new general tool that allows the simple generation and analysis of quantitative transcriptome maps, starting from any source listing gene expression values for a given gene set (e.g. expression microarrays), implemented as a relational database. It includes a parser able to assign unambiguous and updated gene symbols to gene identifiers from different data sources. Moreover, TRAM is able to perform intra-sample and inter-sample data normalization, including an original variant of quantile normalization (scaled quantile), useful to normalize data from platforms with highly different numbers of investigated genes. When in 'Map' mode, the software generates a quantitative representation of the transcriptome of a sample (or of a pool of samples) and identifies if segments of defined lengths are over/under-expressed compared to the desired threshold. When in 'Cluster' mode, the software searches for a set of over/under-expressed consecutive genes. Statistical significance for all results is calculated with respect to genes localized on the same chromosome or to all genome genes. Transcriptome maps, showing differential expression between two sample groups, relative to two different biological conditions, may be easily generated. We present the results of a biological model test, based on a meta-analysis comparison between a sample pool of human CD34+ hematopoietic progenitor cells and a sample pool of megakaryocytic cells.
Biologically relevant chromosomal segments and gene
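
    The inter-sample normalization step can be illustrated with standard quantile normalization, which forces every sample (column) onto a common reference distribution. TRAM's 'scaled quantile' variant for platforms with different gene counts differs in detail, so this is only the textbook form:

```python
import numpy as np

def quantile_normalize(X):
    """Standard quantile normalization of a genes x samples matrix.

    Each sample (column) is mapped onto the same reference distribution:
    the mean of the sorted columns.
    """
    ranks = np.argsort(np.argsort(X, axis=0), axis=0)   # per-column ranks
    mean_sorted = np.sort(X, axis=0).mean(axis=1)       # reference distribution
    return mean_sorted[ranks]

# Toy expression matrix: 4 genes (rows) x 3 samples (columns)
X = np.array([[5.0, 4.0, 3.0],
              [2.0, 1.0, 4.0],
              [3.0, 4.0, 6.0],
              [4.0, 2.0, 8.0]])
print(quantile_normalize(X))
```

    After normalization every column holds the same set of values, only permuted according to each sample's original ranking.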

  7. Tower counts

    Science.gov (United States)

    Woody, Carol Ann; Johnson, D.H.; Shrier, Brianna M.; O'Neal, Jennifer S.; Knutzen, John A.; Augerot, Xanthippe; O'Neal, Thomas A.; Pearsons, Todd N.

    2007-01-01

    Counting towers provide an accurate, low-cost, low-maintenance, low-technology, and easily mobilized escapement estimation program compared to other methods (e.g., weirs, hydroacoustics, mark-recapture, and aerial surveys) (Thompson 1962; Siebel 1967; Cousens et al. 1982; Symons and Waldichuk 1984; Anderson 2000; Alaska Department of Fish and Game 2003). Counting tower data have been found to be consistent with digital video counts (Edwards 2005). Counting towers do not interfere with natural fish migration patterns, nor are fish handled or stressed; however, their use is generally limited to clear rivers that meet specific site selection criteria. The data provided by counting tower sampling allow fishery managers to determine reproductive population size, estimate total return (escapement + catch) and its uncertainty, evaluate population productivity and trends, set harvest rates, determine spawning escapement goals, and forecast future returns (Alaska Department of Fish and Game 1974-2000 and 1975-2004). The number of spawning fish is determined by subtracting subsistence, sport-caught fish, and prespawn mortality from the total estimated escapement. The methods outlined in this protocol for tower counts can be used to provide reasonable estimates (±6-10%) of reproductive salmon population size and run timing in clear rivers.
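
    The spawner calculation described above is a straightforward subtraction; a sketch with hypothetical counts:

```python
def spawning_escapement(total_escapement, subsistence, sport_catch, prespawn_mortality):
    """Spawners = total estimated escapement minus known removals."""
    return total_escapement - subsistence - sport_catch - prespawn_mortality

# Hypothetical counts for one river and season
print(spawning_escapement(120_000, 8_000, 5_500, 2_300))  # → 104200
```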

  8. GIS based optimal impervious surface map generation using various spatial data for urban nonpoint source management.

    Science.gov (United States)

    Lee, Cholyoung; Kim, Kyehyun; Lee, Hyuk

    2018-01-15

    Impervious surfaces are mainly artificial structures such as rooftops, roads, and parking lots that are covered by impenetrable materials. These surfaces are becoming the major causes of nonpoint source (NPS) pollution in urban areas. The rapid progress of urban development is increasing the total amount of impervious surfaces and NPS pollution. Therefore, many cities worldwide have adopted a stormwater utility fee (SUF) that generates funds needed to manage NPS pollution. The amount of SUF is estimated based on the impervious ratio, which is calculated by dividing the total impervious surface area by the net area of an individual land parcel. Hence, in order to identify the exact impervious ratio, large-scale impervious surface maps (ISMs) are necessary. This study proposes and assesses various methods for generating large-scale ISMs for urban areas by using existing GIS data. Bupyeong-gu, a district in the city of Incheon, South Korea, was selected as the study area. Spatial data that were freely offered by national/local governments in S. Korea were collected. First, three types of ISMs were generated by using the land-cover map, digital topographic map, and orthophotographs, to validate three methods that had been proposed conceptually by Korea Environment Corporation. Then, to generate an ISM of higher accuracy, an integration method using all data was proposed. Error matrices were made and Kappa statistics were calculated to evaluate the accuracy. Overlay analyses were performed to examine the distribution of misclassified areas. From the results, the integration method delivered the highest accuracy (Kappa statistic of 0.99) compared to the three methods that use a single type of spatial data. However, a longer production time and higher cost were limiting factors. Among the three methods using a single type of data, the land-cover map showed the highest accuracy with a Kappa statistic of 0.91. Thus, it was judged that the mapping method using the land
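
    The accuracy assessment above relies on the Kappa statistic computed from an error (confusion) matrix; a sketch with a hypothetical 2×2 matrix (the study's own matrices are not reproduced here):

```python
import numpy as np

def cohens_kappa(cm):
    """Cohen's kappa from a square confusion (error) matrix."""
    cm = np.asarray(cm, dtype=float)
    n = cm.sum()
    p_observed = np.trace(cm) / n                               # overall agreement
    p_expected = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n**2  # chance agreement
    return (p_observed - p_expected) / (1.0 - p_expected)

# Hypothetical error matrix: impervious vs. pervious, map vs. reference
cm = [[90, 10],
      [5, 95]]
print(round(cohens_kappa(cm), 3))  # → 0.85
```

    Kappa discounts the agreement expected by chance, which is why it is preferred over raw accuracy for comparing classification methods.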

  9. Lessons learned from a pilot implementation of the UMLS information sources map.

    Science.gov (United States)

    Miller, P L; Frawley, S J; Wright, L; Roderer, N K; Powsner, S M

    1995-01-01

    OBJECTIVE: To explore the software design issues involved in implementing an operational information sources map (ISM) knowledge base (KB) and system of navigational tools that can help medical users access network-based information sources relevant to a biomedical question. DESIGN: A pilot biomedical ISM KB and associated client-server software (ISM/Explorer) have been developed to help students, clinicians, researchers, and staff access network-based information sources, as part of the National Library of Medicine's (NLM) multi-institutional Unified Medical Language System (UMLS) project. The system allows the user to specify and constrain a search for a biomedical question of interest. The system then returns a list of sources matching the search. At this point the user may request 1) further information about a source, 2) that the list of sources be regrouped by different criteria to allow the user to get a better overall appreciation of the set of retrieved sources as a whole, or 3) automatic connection to a source. RESULTS: The pilot system operates in client-server mode and currently contains coded information for 121 sources. It is in routine use from approximately 40 workstations at the Yale School of Medicine. The lessons that have been learned are that: 1) it is important to make access to different versions of a source as seamless as possible, 2) achieving seamless, cross-platform access to heterogeneous sources is difficult, 3) significant differences exist between coding the subject content of an electronic information resource versus that of an article or a book, 4) customizing the ISM to multiple institutions entails significant complexities, and 5) there are many design trade-offs between specifying searches and viewing sets of retrieved sources that must be taken into consideration. CONCLUSION: An ISM KB and navigational tools have been constructed. 
In the process, much has been learned about the complexities of development and evaluation in this

  10. Lessons learned from a pilot implementation of the UMLS information sources map.

    Science.gov (United States)

    Miller, P L; Frawley, S J; Wright, L; Roderer, N K; Powsner, S M

    1995-01-01

    To explore the software design issues involved in implementing an operational information sources map (ISM) knowledge base (KB) and system of navigational tools that can help medical users access network-based information sources relevant to a biomedical question. A pilot biomedical ISM KB and associated client-server software (ISM/Explorer) have been developed to help students, clinicians, researchers, and staff access network-based information sources, as part of the National Library of Medicine's (NLM) multi-institutional Unified Medical Language System (UMLS) project. The system allows the user to specify and constrain a search for a biomedical question of interest. The system then returns a list of sources matching the search. At this point the user may request 1) further information about a source, 2) that the list of sources be regrouped by different criteria to allow the user to get a better overall appreciation of the set of retrieved sources as a whole, or 3) automatic connection to a source. The pilot system operates in client-server mode and currently contains coded information for 121 sources. It is in routine use from approximately 40 workstations at the Yale School of Medicine. The lessons that have been learned are that: 1) it is important to make access to different versions of a source as seamless as possible, 2) achieving seamless, cross-platform access to heterogeneous sources is difficult, 3) significant differences exist between coding the subject content of an electronic information resource versus that of an article or a book, 4) customizing the ISM to multiple institutions entails significant complexities, and 5) there are many design trade-offs between specifying searches and viewing sets of retrieved sources that must be taken into consideration. An ISM KB and navigational tools have been constructed. In the process, much has been learned about the complexities of development and evaluation in this new environment, which are different

  11. Mapping and defining sources of variability in bioavailable strontium isotope ratios in the Eastern Mediterranean

    Science.gov (United States)

    Hartman, Gideon; Richards, Mike

    2014-02-01

    The relative contributions of bedrock and atmospheric sources to bioavailable strontium (Sr) pools in local soils were studied in Northern Israel and the Golan regions through intensive systematic sampling of modern plants and invertebrates, to produce a map of modern bioavailable strontium isotope ratios (87Sr/86Sr) for regional reconstructions of human and animal mobility patterns. The study investigates sources of variability in bioavailable 87Sr/86Sr ratios, in particular the intra- and inter-site range of variation in plant 87Sr/86Sr ratios, the range of 87Sr/86Sr ratios of plants growing on marine sedimentary versus volcanic geologies, the differences between ligneous and non-ligneous plants with varying growth and water utilization strategies, and the relative contribution of atmospheric Sr sources from different soil and vegetation types and climatic zones. Results indicate predictable variation in 87Sr/86Sr ratios. Inter- and intra-site differences in bioavailable 87Sr/86Sr ratios average 0.00025, while the range of 87Sr/86Sr ratios measured regionally in plants and invertebrates spans from 0.7074 on mid-Pleistocene volcanic pyroclast to 0.7090 on Pleistocene calcareous sandstone. The 87Sr/86Sr ratios measured in plants growing on volcanic bedrock show time-dependent increases in atmospheric deposition relative to bedrock weathering. The 87Sr/86Sr ratios measured in plants growing on rendzina soils depend on precipitation: the spacing between bedrock 87Sr/86Sr ratios and plants is highest in wet conditions and decreases in dry conditions. The 87Sr/86Sr ratios measured in plants growing on terra rossa soils are relatively constant (0.7085) regardless of precipitation. Ligneous plants are typically closer to bedrock 87Sr/86Sr ratios than non-ligneous plants. Since the bioavailable 87Sr/86Sr ratios currently measured in the region reflect a mix of both exogenous and endogenous sources, changes in the relative contribution of exogenous sources can cause variation
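
    The closing point, that measured ratios reflect a mix of endogenous (bedrock) and exogenous (atmospheric) Sr, corresponds to concentration-weighted two-endmember mixing. A sketch with hypothetical endmember values (not measurements from the study):

```python
def mixed_sr_ratio(r_bedrock, c_bedrock, r_atmos, c_atmos, f_atmos):
    """Concentration-weighted two-endmember mixing of 87Sr/86Sr ratios.

    f_atmos is the mass fraction of material from the atmospheric
    endmember; each endmember's ratio is weighted by its Sr concentration.
    """
    w_atm = f_atmos * c_atmos
    w_bed = (1.0 - f_atmos) * c_bedrock
    return (w_atm * r_atmos + w_bed * r_bedrock) / (w_atm + w_bed)

# Hypothetical endmembers: basaltic bedrock (0.7040) vs. sea-spray/dust (0.7090)
print(round(mixed_sr_ratio(0.7040, 100.0, 0.7090, 50.0, 0.5), 4))
```

    Because the mixture is concentration-weighted, a Sr-poor atmospheric input shifts the bioavailable ratio less than an equal mass of Sr-rich input would.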

  12. INLINING 3D RECONSTRUCTION, MULTI-SOURCE TEXTURE MAPPING AND SEMANTIC ANALYSIS USING OBLIQUE AERIAL IMAGERY

    Directory of Open Access Journals (Sweden)

    D. Frommholz

    2016-06-01

    Full Text Available This paper proposes an in-line method for the simplified reconstruction of city buildings from nadir and oblique aerial images that at the same time are being used for multi-source texture mapping with minimal resampling. Further, the resulting unrectified texture atlases are analyzed for façade elements like windows to be reintegrated into the original 3D models. Tests on real-world data of Heligoland/Germany comprising more than 800 buildings exposed a median positional deviation of 0.31 m at the façades compared to the cadastral map, a correctness of 67% for the detected windows and good visual quality when being rendered with GPU-based perspective correction. As part of the process building reconstruction takes the oriented input images and transforms them into dense point clouds by semi-global matching (SGM). The point sets undergo local RANSAC-based regression and topology analysis to detect adjacent planar surfaces and determine their semantics. Based on this information the roof, wall and ground surfaces found get intersected and limited in their extension to form a closed 3D building hull. For texture mapping the hull polygons are projected into each possible input bitmap to find suitable color sources regarding the coverage and resolution. Occlusions are detected by ray-casting a full-scale digital surface model (DSM) of the scene and stored in pixel-precise visibility maps. These maps are used to derive overlap statistics and radiometric adjustment coefficients to be applied when the visible image parts for each building polygon are being copied into a compact texture atlas without resampling whenever possible. The atlas bitmap is passed to a commercial object-based image analysis (OBIA) tool running a custom rule set to identify windows on the contained façade patches. Following multi-resolution segmentation and classification based on brightness and contrast differences, potential window objects are evaluated against geometric

  13. Stability of rotors and focal sources for human atrial fibrillation: focal impulse and rotor mapping (FIRM) of AF sources and fibrillatory conduction.

    Science.gov (United States)

    Swarup, Vijay; Baykaner, Tina; Rostamian, Armand; Daubert, James P; Hummel, John; Krummen, David E; Trikha, Rishi; Miller, John M; Tomassoni, Gery F; Narayan, Sanjiv M

    2014-12-01

    Several groups report electrical rotors or focal sources that sustain atrial fibrillation (AF) after it has been triggered. However, it is difficult to separate stable from unstable activity in prior studies that examined only seconds of AF. We applied phase-based focal impulse and rotor mapping (FIRM) to study the dynamics of rotors/sources in human AF over prolonged periods of time. We prospectively mapped AF in 260 patients (169 persistent, 61 ± 12 years) at 6 centers in the FIRM registry, using baskets with 64 contact electrodes per atrium. AF was phase mapped (RhythmView, Topera, Menlo Park, CA, USA). AF propagation movies were interpreted by each operator to assess the source stability/dynamics over tens of minutes before ablation. Sources were identified in 258 of 260 of patients (99%), for 2.8 ± 1.4 sources/patient (1.8 ± 1.1 in left, 1.1 ± 0.8 in right atria). While AF sources precessed in stable regions, emanating activity including spiral waves varied from collision/fusion (fibrillatory conduction). Each source lay in stable atrial regions for 4,196 ± 6,360 cycles, with no differences between paroxysmal versus persistent AF (4,290 ± 5,847 vs. 4,150 ± 6,604; P = 0.78), or right versus left atrial sources (P = 0.26). Rotors and focal sources for human AF mapped by FIRM over prolonged time periods precess ("wobble") but remain within stable regions for thousands of cycles. Conversely, emanating activity such as spiral waves disorganize and collide with the fibrillatory milieu, explaining difficulties in using activation mapping or signal processing analyses at fixed electrodes to detect AF rotors. These results provide a rationale for targeted ablation at AF sources rather than fibrillatory spiral waves. © 2014 Wiley Periodicals, Inc.
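
    Phase mapping of the kind FIRM builds on assigns each electrode signal an instantaneous phase, commonly obtained from the analytic signal; a rotor appears as a point around which the phase spans the full cycle. A toy sketch with a synthetic periodic "electrogram" (clinical phase mapping involves far more preprocessing than this):

```python
import numpy as np
from scipy.signal import hilbert

# Toy electrogram: a 4 Hz activation signal sampled at 200 Hz for 2 s
fs, f = 200, 4
t = np.arange(0, 2, 1 / fs)
egm = np.sin(2 * np.pi * f * t)

analytic = hilbert(egm)              # analytic signal via Hilbert transform
phase = np.angle(analytic)           # instantaneous phase in [-pi, pi]

# The phase advances ~2*pi per cycle: 8 cycles over the 2 s window
unwrapped = np.unwrap(phase)
print(round((unwrapped[-1] - unwrapped[0]) / (2 * np.pi)))  # → 8
```

    Repeating this for all 64 electrodes and tracking where the phase wraps a full cycle around a point is the basic idea behind locating a rotor core.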

  14. Dose-rate mapping and search of radioactive sources in Estonia

    International Nuclear Information System (INIS)

    Ylaetalo, S.; Karvonen, J.; Ilander, T.; Honkamaa, T.; Toivonen, H.

    1996-12-01

    The Estonian Ministry of Environment and the Finnish Centre for Radiation and Nuclear Safety (STUK) agreed in 1995 on a radiation mapping project in Estonia. The country was searched to find potential man-made radioactive sources. Another goal of the project was to produce a background dose-rate map over the whole country. The measurements provided an excellent opportunity to test new in-field measuring systems that are useful in a nuclear disaster. The basic idea was to monitor road sides, cities, domestic waste storage places and former military or rocket bases from a moving vehicle by measuring gamma spectrum and dose rate. The measurements were carried out using vehicle-installed systems consisting of a pressurised ionisation chamber (PIC) in 1995 and a combination of a scintillation spectrometer (NaI(Tl)) and Geiger-Müller counter (GM) in 1996. All systems utilised GPS-satellite navigation signals to relate the measured dose rates and gamma-spectra to current geographical location. The data were recorded for further computer analysis. The dose rate varied usually between 0.03-0.17 μSv/h in the whole country, excluding a few nuclear material storage places (in Saku and in Sillamae). Enhanced dose rates of natural origin (0.17-0.5 μSv/h) were measured near granite statues, buildings and bridges. No radioactive sources were found on road sides or in towns or villages. (orig.) (14 refs.)

  15. Activity measurements of radioactive solutions by liquid scintillation counting and pressurized ionization chambers and Monte Carlo simulations of source-detector systems for metrology

    International Nuclear Information System (INIS)

    Amiot, Marie-Noelle

    2013-01-01

    The research work 'Activity measurements of radioactive solutions by liquid scintillation and pressurized ionization chambers and Monte Carlo simulations of source-detector systems' was presented for the French degree 'Habilitation à diriger des recherches'. The common thread of the two themes, liquid scintillation counting and pressurized ionization chambers, is the improvement of radionuclide activity measurement techniques. Metrology of ionizing radiation is applied in numerous domains, in research and in industry, including the environment and health, subjects of constant concern for the world population in recent years. This wide variety of applications involves a large number of radionuclides with diverse decay schemes and in varied physical forms. The work presented, carried out at the Laboratoire National Henri Becquerel, aims to ensure the traceability of detector calibrations and to improve activity measurement methods within the framework of research and development projects. Improving the primary and secondary activity measurement methods consists in refining the accuracy of the measurements, in particular through a better knowledge of the parameters influencing the detector yield. The development work on liquid scintillation counting mainly concerns the response of liquid scintillators to low-energy electrons, as well as their linear absorption coefficients measured using synchrotron radiation. The research on pressurized ionization chambers studies their response to photons and electrons by comparing experimental measurements with Monte Carlo simulations of the source-detector system. In addition, the design of a new type of ionization chamber with variable pressure is presented; this project was developed to guarantee the accuracy of the activity injected into patients in diagnostic examinations.

  16. Infrared-faint radio sources: a cosmological view. AGN number counts, the cosmic X-ray background and SMBH formation

    Science.gov (United States)

    Zinn, P.-C.; Middelberg, E.; Ibar, E.

    2011-07-01

    Context. Infrared-faint radio sources (IFRS) are extragalactic emitters clearly detected at radio wavelengths but barely detected or undetected at optical and infrared wavelengths, with 5σ sensitivities as low as 1 μJy. Aims: Spectral energy distribution (hereafter SED) modelling and analyses of their radio properties indicate that IFRS are consistent with a population of (potentially extremely obscured) high-redshift AGN at 3 ≤ z ≤ 6. We demonstrate some astrophysical implications of this population and compare them to predictions from models of galaxy evolution and structure formation. Methods: We compiled a list of IFRS from four deep extragalactic surveys and extrapolated the IFRS number density to a survey-independent value of (30.8 ± 15.0) deg^-2. We computed the IFRS contribution to the total number of AGN in the Universe to account for the cosmic X-ray background. By estimating the black hole mass contained in IFRS, we present conclusions for the SMBH mass density in the early universe and compare it to relevant simulations of structure formation after the Big Bang. Results: The number density of AGN derived from the IFRS density was found to be ~310 deg^-2, which is equivalent to a SMBH mass density of the order of 10^3 M⊙ Mpc^-3 in the redshift range 3 ≤ z ≤ 6. This produces an X-ray flux of 9 × 10^-16 W m^-2 deg^-2 in the 0.5-2.0 keV band and 3 × 10^-15 W m^-2 deg^-2 in the 2.0-10 keV band, in agreement with the missing unresolved components of the cosmic X-ray background. To address SMBH formation after the Big Bang we invoke a scenario involving both halo gas accretion and major mergers.
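
    The extrapolation step in the abstract (surface density to full-sky numbers) is simple enough to sketch. Only the quoted densities come from the record; the only other quantity assumed is the ~41 253 deg^2 area of the celestial sphere.

```python
# Back-of-the-envelope extrapolation of the quoted surface densities to the
# full sky. The densities come from the abstract above; the full-sky area is
# a standard constant.

FULL_SKY_DEG2 = 41_253  # approximate area of the celestial sphere in deg^2

def total_over_sky(density_per_deg2: float) -> float:
    """Scale a surface density (deg^-2) to a full-sky count."""
    return density_per_deg2 * FULL_SKY_DEG2

print(f"IFRS over the full sky: {total_over_sky(30.8):.3g}")   # survey-independent IFRS density
print(f"AGN over the full sky:  {total_over_sky(310.0):.3g}")  # derived AGN density
```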

  17. Rugged: an operational, open-source solution for Sentinel-2 mapping

    Science.gov (United States)

    Maisonobe, Luc; Seyral, Jean; Prat, Guylaine; Guinet, Jonathan; Espesset, Aude

    2015-10-01

    When you map the entire Earth every 5 days with the aim of generating high-quality time series over land, there is no room for geometrical error: the algorithms have to be stable, reliable, and precise. Rugged, a new open-source library for pixel geolocation, is at the geometrical heart of the operational processing for Sentinel-2. Rugged performs sensor-to-terrain mapping taking into account ground Digital Elevation Models, Earth rotation with all its small irregularities, on-board sensor pixel individual lines-of-sight, spacecraft motion and attitude, and all significant physical effects. It provides direct and inverse location, i.e. it allows the accurate computation of which ground point is viewed from a specific pixel in a spacecraft instrument, and conversely which pixel will view a specified ground point. Direct and inverse location can be used to perform full ortho-rectification of images and correlation between sensors observing the same area. Implemented as an add-on for Orekit (Orbits Extrapolation KIT; a low-level space dynamics library), Rugged also offers the possibility of simulating satellite motion and attitude auxiliary data using Orekit's full orbit propagation capability. This is a considerable advantage for test data generation and mission simulation activities. Together with the Orfeo ToolBox (OTB) image processing library, Rugged provides the algorithmic core of Sentinel-2 Instrument Processing Facilities. The S2 complex viewing model - with 12 staggered push-broom detectors and 13 spectral bands - is built using Rugged objects, enabling the computation of rectification grids for mapping between cartographic and focal plane coordinates. These grids are passed to the OTB library for further image resampling, thus completing the ortho-rectification chain. 
Sentinel-2's stringent operational requirement to process several terabytes of data per week represented a tough challenge, though one that was well met by Rugged in terms of the robustness and
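
    Direct location, as described above, is at heart a ray-terrain intersection. The sketch below is not Rugged's actual API (which handles DEMs, Earth rotation and spacecraft attitude); it assumes a spherical Earth and a known line-of-sight purely to illustrate the geometry.

```python
import numpy as np

R_EARTH = 6_378_137.0  # equatorial radius in metres (spherical Earth assumed)

def direct_location(position, los):
    """Intersect a pixel line-of-sight with a spherical Earth.

    position : spacecraft position in an Earth-centred frame, metres
    los      : line-of-sight direction of one detector pixel
    Returns the ground intersection point, or None if the ray misses Earth.
    """
    p = np.asarray(position, dtype=float)
    d = np.asarray(los, dtype=float)
    d /= np.linalg.norm(d)
    # Solve |p + t d|^2 = R^2 for the smallest positive t (first intersection).
    b = 2.0 * p.dot(d)
    c = p.dot(p) - R_EARTH ** 2
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None
    t = (-b - np.sqrt(disc)) / 2.0
    return p + t * d if t > 0.0 else None

# A satellite 700 km above the surface looking straight down (nadir) must hit
# the sub-satellite point.
sat = [R_EARTH + 700_000.0, 0.0, 0.0]
ground = direct_location(sat, [-1.0, 0.0, 0.0])
```

Inverse location would invert this mapping numerically; Rugged additionally iterates against the DEM so that rugged terrain, not just the ellipsoid, is honoured.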

  18. Farmer data sourcing. The case study of the spatial soil information maps in South Tyrol.

    Science.gov (United States)

    Della Chiesa, Stefano; Niedrist, Georg; Thalheimer, Martin; Hafner, Hansjörg; La Cecilia, Daniele

    2017-04-01

    The north-Italian region of South Tyrol is Europe's largest apple-growing area, exporting ca. 15% within Europe and 2% worldwide; its vineyards represent ca. 1% of Italian production. In order to deliver high-quality food, most farmers in South Tyrol follow sustainable farming practices. One of the key practices is sustainable soil management, in which farmers regularly (every 5 years) collect soil samples and send them for analysis to improve cultivation management, yield and, ultimately, profitability. However, such data generally remain inaccessible. In this regard, private interests and the public administration in South Tyrol have established a long tradition of collaboration with the local farming industry. This has enabled the collection of a large spatial and temporal database of soil analyses covering all cultivated areas. Thanks to this best practice, information on soil properties is centralized and geocoded. The large dataset consists mainly of soil information on texture, humus content, pH and the availability of microelements such as K, Mg, B, Mn, Cu and Zn. These data were finally spatialized by means of geostatistical methods, and several high-resolution digital maps were created. In this contribution, we present the best practice by which farmers source soil information in South Tyrol, show the capability of a large spatial-temporal geocoded soil dataset to produce detailed digital soil property maps and to assess long-term changes in soil properties, and finally discuss implications and potential applications.
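
    The spatialization step described above can be illustrated with inverse-distance weighting, one of the simplest geostatistical interpolators. The record does not state the exact method used (kriging would be a common alternative), so this is a generic stand-in with made-up sample values.

```python
import numpy as np

def idw(sample_xy, sample_vals, query_xy, power=2.0):
    """Inverse-distance-weighted interpolation of point samples at query points."""
    sample_xy = np.asarray(sample_xy, dtype=float)
    vals = np.asarray(sample_vals, dtype=float)
    out = np.empty(len(query_xy))
    for i, q in enumerate(np.asarray(query_xy, dtype=float)):
        d = np.linalg.norm(sample_xy - q, axis=1)
        if np.any(d == 0.0):              # query coincides with a sample point
            out[i] = vals[d == 0.0][0]
        else:
            w = 1.0 / d ** power          # closer samples weigh more
            out[i] = np.sum(w * vals) / np.sum(w)
    return out

# soil pH at three hypothetical sample locations, interpolated at one grid node
ph = idw([(0, 0), (1, 0), (0, 1)], [6.5, 7.1, 6.9], [(0.5, 0.5)])
```

Applied over a dense grid of query points, the same routine yields the kind of continuous soil-property map the record describes.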

  19. Determination, Source Identification and GIS Mapping for Nitrate Concentration in Groundwater from Bara Aquifer

    Energy Technology Data Exchange (ETDEWEB)

    Elami, G. M.; Sam, A. K.; Yagob, T. I.; Siddeeg, S. E.M.B.; Hatim, E.; Hajo, I. [Sudan Atomic Energy Commission, Sudan, Khartoum (Sudan)

    2013-07-15

    This study was carried out to determine the level of nitrate concentration in well water from the Bara aquifer in North Kordofan state (west-central Sudan). The analysis covered 69 wells from different villages within the Bara basin. Spectrophotometric analysis was used to determine nitrate, nitrite and ammonia. Results revealed that nitrate concentrations in the sampled wells ranged from 9.68 to 891 mg/L, with 81% exceeding the maximum permissible limits set for drinking water by the WHO and SSMO. Animal waste and organic soil nitrogen were found to be the sources of nitrate in these wells, as indicated by {sup 15}N. The GIS predictive map shows that the majority of wells with high nitrate lie in the northern and north-eastern parts of the study area. (author)
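
    The exceedance figure quoted above is a simple statistic to reproduce: count the wells above the guideline value. The sketch below assumes the WHO nitrate guideline of 50 mg/L and uses hypothetical well values, not the survey data.

```python
# Fraction of wells whose nitrate concentration exceeds a drinking-water
# guideline. The 50 mg/L figure is the WHO nitrate guideline; the sample
# values are illustrative only.

WHO_NITRATE_LIMIT = 50.0  # mg/L

def fraction_exceeding(concentrations, limit=WHO_NITRATE_LIMIT):
    """Share of samples strictly above the guideline value."""
    over = sum(1 for c in concentrations if c > limit)
    return over / len(concentrations)

# four hypothetical wells spanning the reported 9.68-891 mg/L range
frac = fraction_exceeding([9.68, 60.0, 250.0, 891.0])
```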

  20. Mapping human health risks from exposure to trace metal contamination of drinking water sources in Pakistan

    International Nuclear Information System (INIS)

    Bhowmik, Avit Kumar; Alamdar, Ambreen; Katsoyiannis, Ioannis; Shen, Heqing; Ali, Nadeem; Ali, Syeda Maria; Bokhari, Habib; Schäfer, Ralf B.; Eqani, Syed Ali Musstjab Akber Shah

    2015-01-01

    The consumption of contaminated drinking water is one of the major causes of mortality and many severe diseases in developing countries. The principal drinking water sources in Pakistan, i.e. ground and surface water, are subject to geogenic and anthropogenic trace metal contamination. However, water quality monitoring activities have been limited to a few administrative areas and a nationwide human health risk assessment from trace metal exposure is lacking. Using geographically weighted regression (GWR) and eight relevant spatial predictors, we calculated nationwide human health risk maps by predicting the concentration of 10 trace metals in the drinking water sources of Pakistan and comparing them to guideline values. GWR incorporated local variations of trace metal concentrations into prediction models and hence mitigated effects of large distances between sampled districts due to data scarcity. Predicted concentrations mostly exhibited high accuracy and low uncertainty, and were in good agreement with observed concentrations. Concentrations for Central Pakistan were predicted with higher accuracy than for the North and South. A maximum 150–200 fold exceedance of guideline values was observed for predicted cadmium concentrations in ground water and arsenic concentrations in surface water. In more than 53% (4 and 100% for the lower and upper boundaries of 95% confidence interval (CI)) of the total area of Pakistan, the drinking water was predicted to be at risk of contamination from arsenic, chromium, iron, nickel and lead. The area with elevated risks is inhabited by more than 74 million (8 and 172 million for the lower and upper boundaries of 95% CI) people. Although these predictions require further validation by field monitoring, the results can inform disease mitigation and water resources management regarding potential hot spots. - Highlights: • Predictions of trace metal concentration use geographically weighted regression • Human health risk

  1. Mapping human health risks from exposure to trace metal contamination of drinking water sources in Pakistan

    Energy Technology Data Exchange (ETDEWEB)

    Bhowmik, Avit Kumar [Institute for Environmental Sciences, University of Koblenz-Landau, Fortstrasse 7, D-76829 Landau in der Pfalz (Germany); Alamdar, Ambreen [Key Lab of Urban Environment and Health, Institute of Urban Environment, Chinese Academy of Sciences, Xiamen 361021 (China); Katsoyiannis, Ioannis [Aristotle University of Thessaloniki, Department of Chemistry, Division of Chemical Technology, Box 116, Thessaloniki 54124 (Greece); Shen, Heqing [Key Lab of Urban Environment and Health, Institute of Urban Environment, Chinese Academy of Sciences, Xiamen 361021 (China); Ali, Nadeem [Department of Environmental Sciences, FBAS, International Islamic University, Islamabad (Pakistan); Ali, Syeda Maria [Center of Excellence in Environmental Studies, King Abdulaziz University, Jeddah (Saudi Arabia); Bokhari, Habib [Public Health and Environment Division, Department of Biosciences, COMSATS Institute of Information Technology, Islamabad (Pakistan); Schäfer, Ralf B. [Institute for Environmental Sciences, University of Koblenz-Landau, Fortstrasse 7, D-76829 Landau in der Pfalz (Germany); Eqani, Syed Ali Musstjab Akber Shah, E-mail: ali_ebl2@yahoo.com [Key Lab of Urban Environment and Health, Institute of Urban Environment, Chinese Academy of Sciences, Xiamen 361021 (China); Public Health and Environment Division, Department of Biosciences, COMSATS Institute of Information Technology, Islamabad (Pakistan)

    2015-12-15

    The consumption of contaminated drinking water is one of the major causes of mortality and many severe diseases in developing countries. The principal drinking water sources in Pakistan, i.e. ground and surface water, are subject to geogenic and anthropogenic trace metal contamination. However, water quality monitoring activities have been limited to a few administrative areas and a nationwide human health risk assessment from trace metal exposure is lacking. Using geographically weighted regression (GWR) and eight relevant spatial predictors, we calculated nationwide human health risk maps by predicting the concentration of 10 trace metals in the drinking water sources of Pakistan and comparing them to guideline values. GWR incorporated local variations of trace metal concentrations into prediction models and hence mitigated effects of large distances between sampled districts due to data scarcity. Predicted concentrations mostly exhibited high accuracy and low uncertainty, and were in good agreement with observed concentrations. Concentrations for Central Pakistan were predicted with higher accuracy than for the North and South. A maximum 150–200 fold exceedance of guideline values was observed for predicted cadmium concentrations in ground water and arsenic concentrations in surface water. In more than 53% (4 and 100% for the lower and upper boundaries of 95% confidence interval (CI)) of the total area of Pakistan, the drinking water was predicted to be at risk of contamination from arsenic, chromium, iron, nickel and lead. The area with elevated risks is inhabited by more than 74 million (8 and 172 million for the lower and upper boundaries of 95% CI) people. Although these predictions require further validation by field monitoring, the results can inform disease mitigation and water resources management regarding potential hot spots. - Highlights: • Predictions of trace metal concentration use geographically weighted regression • Human health risk
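
    The core of GWR, as used in the two records above, is weighted least squares with a spatial kernel: observations near the prediction site count more. A minimal single-site sketch follows (Gaussian kernel, fixed bandwidth; real studies select the bandwidth by cross-validation, and the data here are synthetic).

```python
import numpy as np

def gwr_predict(x, y, coords, x0, coord0, bandwidth):
    """Geographically weighted regression prediction at one location.

    Observations are down-weighted by a Gaussian kernel of their distance to
    the prediction site coord0, a local weighted least-squares fit is made,
    and the fitted model is evaluated at predictor value x0.
    """
    X = np.column_stack([np.ones(len(x)), np.asarray(x, dtype=float)])
    d = np.linalg.norm(np.asarray(coords, dtype=float)
                       - np.asarray(coord0, dtype=float), axis=1)
    w = np.exp(-0.5 * (d / bandwidth) ** 2)      # Gaussian spatial kernel
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ np.asarray(y, dtype=float))
    return np.array([1.0, x0]) @ beta

# concentration rises linearly along a transect; predict midway between stations
pred = gwr_predict(
    x=[0.0, 1.0, 2.0, 3.0], y=[0.0, 2.0, 4.0, 6.0],
    coords=[(0, 0), (1, 0), (2, 0), (3, 0)],
    x0=1.5, coord0=(1.5, 0), bandwidth=10.0)
```

Because each prediction site gets its own local fit, coefficients can vary across space, which is what lets GWR bridge the large gaps between sampled districts mentioned in the abstract.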

  2. Counting probe

    International Nuclear Information System (INIS)

    Matsumoto, Haruya; Kaya, Nobuyuki; Yuasa, Kazuhiro; Hayashi, Tomoaki

    1976-01-01

    An electron counting method has been devised and tested for measuring electron temperature and density, the most fundamental quantities characterizing plasma conditions. Electron counting is a method of counting the electrons in a plasma directly, using a probe equipped with a secondary electron multiplier. It has three advantages: adjustable sensitivity, the high sensitivity of the secondary electron multiplier, and directionality. Sensitivity is adjusted by changing the size of the collecting hole (pinhole) on the entrance face of the multiplier. The probe is usable as a direct-reading thermometer of electron temperature because it needs to collect only a very small number of electrons, so it does not disturb the surrounding plasma, and a narrow sweep width of the probe voltage suffices. It can therefore measure anisotropy more sensitively than a Langmuir probe, and it can be used for very low-density plasma. Although many problems concerning anisotropy remain, computer simulations have been carried out. It is also planned to install a Helmholtz coil in the vacuum chamber to eliminate the effect of the Earth's magnetic field. In practical experiments, measurements with a Langmuir probe and an emission probe mounted on a movable structure, comparison with results obtained in a reversed magnetic field using the Helmholtz coil, and measurement of ion acoustic waves are scheduled. (Wakatsuki, Y.)

  3. Using open source data for flood risk mapping and management in Brazil

    Science.gov (United States)

    Whitley, Alison; Malloy, James; Chirouze, Manuel

    2013-04-01

    Worldwide, the frequency and severity of major natural disasters, particularly flooding, have increased. Concurrently, countries such as Brazil are experiencing rapid socio-economic development, with growing and increasingly concentrated populations, particularly in urban areas. It is therefore unsurprising that Brazil has experienced a number of major floods in the past 30 years, such as the January 2011 floods which killed 900 people and caused significant economic losses of approximately 1 billion US dollars. Understanding, mitigating and even preventing flood risk is a high priority. There is demand for flood models in many developing economies worldwide for a range of uses, including risk management, emergency planning and the provision of insurance solutions. Developing them, however, can be expensive. With an increasing supply of freely available, open-source data, the costs can be significantly reduced, making the tools required for natural-hazard risk assessment more accessible. By presenting a flood model developed for eight urban areas of Brazil as part of a collaboration between JBA Risk Management and Guy Carpenter, we explore the value of open-source data and demonstrate its usability in a business context within the insurance industry. We begin by detailing the open-source data available and compare its suitability with commercially available equivalents for datasets including digital terrain models and river-gauge records. We present flood simulation outputs to demonstrate the impact of the choice of dataset on the results obtained and its use in a business context. Via use of the 2D hydraulic model JFlow+, our examples also show how advanced modelling techniques can be used on relatively crude datasets to obtain robust, good-quality results. In combination with accessible, standard-specification GPU technology and open-source data, use of JFlow+ has enabled us to produce large-scale hazard maps

  4. Gas source localization and gas distribution mapping with a micro-drone

    Energy Technology Data Exchange (ETDEWEB)

    Neumann, Patrick P.

    2013-07-01

    The objective of this Ph.D. thesis is the development and validation of a VTOL-based (Vertical Take Off and Landing) micro-drone to measure gas concentrations, to locate gas emission sources, and to build gas distribution maps. Gas distribution mapping and the localization of a static gas source are complex tasks due to the turbulent nature of gas transport under natural conditions, and they become even more challenging when airborne. This is especially so when using a VTOL-based micro-drone, which induces disturbances through its rotors that heavily affect the gas distribution. Besides the adaptation of a micro-drone for gas concentration measurements, a novel method for the determination of the wind vector in real time is presented. The on-board sensors for the flight control of the micro-drone provide the basis for the wind vector calculation. Furthermore, robot operating software for controlling the micro-drone autonomously is developed and used to validate the algorithms developed within this Ph.D. thesis in simulations and real-world experiments. Three biologically inspired algorithms for locating gas sources are adapted and developed for use with the micro-drone: the surge-cast algorithm (a variant of the silkworm moth algorithm), the zigzag / dung beetle algorithm, and a newly developed algorithm called ''pseudo gradient algorithm''. The latter extracts, from two spatially separated measuring positions, the information necessary (concentration gradient and mean wind direction) to follow a gas plume to its emission source. The performance of the algorithms is evaluated in simulations and real-world experiments. The distance overhead and the gas source localization success rate are used as the main performance criteria for comparing the algorithms. Next, a new method for gas source localization (GSL) based on a particle filter (PF) is presented. Each particle represents a weighted hypothesis of the gas source position. As a first step, the PF
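
    The PF-based GSL idea can be sketched in a few lines: particles are hypotheses of the source position, re-weighted by how well a dispersion model explains each sensor reading, then resampled. The isotropic dispersion model and all numbers below are illustrative simplifications of ours, not the thesis's wind-driven plume model.

```python
import numpy as np

rng = np.random.default_rng(0)

def expected_concentration(source, sensor):
    """Toy isotropic dispersion model: concentration ~ 1 / (1 + r^2).
    (The thesis uses a wind-driven plume model; this simplification is ours.)"""
    r2 = np.sum((np.asarray(source) - np.asarray(sensor)) ** 2, axis=-1)
    return 1.0 / (1.0 + r2)

def pf_update(particles, weights, sensor_pos, measured, noise=0.05):
    """One PF step: re-weight source hypotheses by measurement likelihood,
    resample, and jitter to avoid particle collapse."""
    pred = expected_concentration(particles, sensor_pos)
    z2 = ((measured - pred) / noise) ** 2
    w = weights * np.exp(-0.5 * (z2 - z2.min()))   # shifted for stability
    w /= w.sum()
    idx = rng.choice(len(particles), size=len(particles), p=w)
    particles = particles[idx] + rng.normal(0.0, 0.05, particles.shape)
    return particles, np.full(len(particles), 1.0 / len(particles))

# demo: hidden source at (2, 3), noiseless sensor readings on a 5 m x 5 m patch
true_src = np.array([2.0, 3.0])
particles = rng.uniform(0.0, 5.0, size=(2000, 2))
weights = np.full(len(particles), 1.0 / len(particles))
for sensor in rng.uniform(0.0, 5.0, size=(30, 2)):
    particles, weights = pf_update(
        particles, weights, sensor, expected_concentration(true_src, sensor))
estimate = particles.mean(axis=0)
```

After enough measurements the particle cloud concentrates around the true source, so its mean serves as the position estimate.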

  5. Gas source localization and gas distribution mapping with a micro-drone

    International Nuclear Information System (INIS)

    Neumann, Patrick P.

    2013-01-01

    The objective of this Ph.D. thesis is the development and validation of a VTOL-based (Vertical Take Off and Landing) micro-drone to measure gas concentrations, to locate gas emission sources, and to build gas distribution maps. Gas distribution mapping and the localization of a static gas source are complex tasks due to the turbulent nature of gas transport under natural conditions, and they become even more challenging when airborne. This is especially so when using a VTOL-based micro-drone, which induces disturbances through its rotors that heavily affect the gas distribution. Besides the adaptation of a micro-drone for gas concentration measurements, a novel method for the determination of the wind vector in real time is presented. The on-board sensors for the flight control of the micro-drone provide the basis for the wind vector calculation. Furthermore, robot operating software for controlling the micro-drone autonomously is developed and used to validate the algorithms developed within this Ph.D. thesis in simulations and real-world experiments. Three biologically inspired algorithms for locating gas sources are adapted and developed for use with the micro-drone: the surge-cast algorithm (a variant of the silkworm moth algorithm), the zigzag / dung beetle algorithm, and a newly developed algorithm called ''pseudo gradient algorithm''. The latter extracts, from two spatially separated measuring positions, the information necessary (concentration gradient and mean wind direction) to follow a gas plume to its emission source. The performance of the algorithms is evaluated in simulations and real-world experiments. The distance overhead and the gas source localization success rate are used as the main performance criteria for comparing the algorithms. Next, a new method for gas source localization (GSL) based on a particle filter (PF) is presented. Each particle represents a weighted hypothesis of the gas source position. As a first step, the PF-based GSL algorithm

  7. Substance Flow Analysis and Source Mapping of Chemical UV-filters

    International Nuclear Information System (INIS)

    Eriksson, E.; Andersen, H. R.; Ledin, A.

    2008-01-01

    Chemical ultraviolet (UV)-filters are used in sunscreens to protect the skin from harmful UV radiation which may otherwise cause sunburns and skin cancer. Commonly used chemical UV-filters are known to cause endocrine disrupting effects in both aquatic and terrestrial animals as well as in human skin cells. Here, source mapping and substance flow analysis were applied to find the sources of six UV-filters (oxybenzone, avobenzone, 4-methylbenzylidene camphor, octyl methoxycinnamate, octyl dimethyl PABA and homosalate) and to identify the most dominant flows of these substances in Denmark. Urban water, composed of wastewater and surface waters, was found to be the primary recipient of UV-filters, whereby wastewater received an estimated 8.5-65 tonnes and surface waters received 7.1-51 tonnes in 2005. In wastewater treatment plants, their sorption onto sludge is perceived to be an important process and presence in effluents can be expected due to a lack of biodegradability. In addition, the use of UV-filters is expected to continue to increase significantly. Not all filters (e.g., octyl dimethyl PABA and homosalate) are used in Denmark. For example, 4-MBC is mainly associated with self-tanning liquids and private import of sunscreens

  8. Substance Flow Analysis and Source Mapping of Chemical UV-filters

    Energy Technology Data Exchange (ETDEWEB)

    Eriksson, E., E-mail: eve@env.dtu.dk; Andersen, H. R.; Ledin, A. [Technical University of Denmark, Department of Environmental Engineering (Denmark)

    2008-12-15

    Chemical ultraviolet (UV)-filters are used in sunscreens to protect the skin from harmful UV radiation which may otherwise cause sunburns and skin cancer. Commonly used chemical UV-filters are known to cause endocrine disrupting effects in both aquatic and terrestrial animals as well as in human skin cells. Here, source mapping and substance flow analysis were applied to find the sources of six UV-filters (oxybenzone, avobenzone, 4-methylbenzylidene camphor, octyl methoxycinnamate, octyl dimethyl PABA and homosalate) and to identify the most dominant flows of these substances in Denmark. Urban water, composed of wastewater and surface waters, was found to be the primary recipient of UV-filters, whereby wastewater received an estimated 8.5-65 tonnes and surface waters received 7.1-51 tonnes in 2005. In wastewater treatment plants, their sorption onto sludge is perceived to be an important process and presence in effluents can be expected due to a lack of biodegradability. In addition, the use of UV-filters is expected to continue to increase significantly. Not all filters (e.g., octyl dimethyl PABA and homosalate) are used in Denmark. For example, 4-MBC is mainly associated with self-tanning liquids and private import of sunscreens.
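
    Substance flow analysis ultimately reduces to bookkeeping of flows between compartments. A minimal sketch using the 2005 ranges quoted above (tonnes per year); the aggregation into a single urban-water total is ours.

```python
# Minimal substance-flow bookkeeping: sum the per-recipient flow ranges into a
# total for the urban-water compartment. Low/high values are the 2005 figures
# quoted in the abstract, in tonnes.

flows = {
    "wastewater":    (8.5, 65.0),
    "surface water": (7.1, 51.0),
}

def total_range(flows):
    """Combine independent (low, high) flow estimates into an overall range."""
    low = sum(lo for lo, _ in flows.values())
    high = sum(hi for _, hi in flows.values())
    return low, high

low, high = total_range(flows)
print(f"UV-filters reaching urban water in 2005: {low:.1f}-{high:.1f} t")
```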

  9. Application of Open Source Software by the Lunar Mapping and Modeling Project

    Science.gov (United States)

    Ramirez, P.; Goodale, C. E.; Bui, B.; Chang, G.; Kim, R. M.; Law, E.; Malhotra, S.; Rodriguez, L.; Sadaqathullah, S.; Mattmann, C. A.; Crichton, D. J.

    2011-12-01

    The Lunar Mapping and Modeling Project (LMMP), led by the Marshall Space Flight Center (MSFC), is responsible for the development of an information system to support lunar exploration, decision analysis, and the release of lunar data to the public. The data available through the lunar portal are predominantly derived from present lunar missions (e.g., the Lunar Reconnaissance Orbiter (LRO)) and from historical missions (e.g., Apollo). This project has created a gold source of data, models, and tools for lunar explorers to exercise and incorporate into their activities. At the Jet Propulsion Laboratory (JPL), we focused on engineering and building the infrastructure to support cataloguing, archiving, accessing, and delivering lunar data. We decided to use a RESTful service-oriented architecture, allowing us to abstract away from the underlying technology choices and focus on the interfaces to be used internally and externally. This decision allowed us to leverage several open-source software components and integrate them either by writing a thin REST service layer or by relying on the APIs they provided; the approach chosen depended on the targeted consumer of a given interface. We will discuss our varying experience using open-source products, namely Apache OODT, Oracle Berkeley DB XML, Apache Solr, and Oracle OpenSSO (now named OpenAM). Apache OODT, developed at NASA's Jet Propulsion Laboratory and recently migrated to Apache, provided the means for ingestion and cataloguing of products within the infrastructure. Its use was based on the team's experience with it and the benefits seen on other projects internal and external to JPL. Berkeley DB XML, distributed by Oracle for both commercial and open-source use, was the storage technology chosen for our metadata. This decision was based in part on our use of Federal Geographic Data Committee (FGDC) Metadata, which is expressed in XML, and the desire to keep it in its native form and exploit other technologies built on

  10. Improving flood risk mapping in Italy: the FloodRisk open-source software

    Science.gov (United States)

    Albano, Raffaele; Mancusi, Leonardo; Craciun, Iulia; Sole, Aurelia; Ozunu, Alexandru

    2017-04-01

    Time and again, floods around the world illustrate the devastating impact they can have on societies. Furthermore, the expectation that flood damages may increase over time with climate and land-use change and social growth in flood-prone areas has raised the awareness of the public and other stakeholders (governments, international organizations, re-insurance companies and emergency responders) of the need to manage risks in order to mitigate their causes and consequences. In this light, the choice of appropriate measures, the assessment of the costs and effects of such measures, and their prioritization are crucial for decision makers. As a result, a priori flood risk assessment has become a key part of flood management practices, with the aim of minimizing the total costs related to the risk management cycle. In this context, the EU Flood Directive 2007/60 requires the delineation of flood risk maps on the basis of the most appropriate and advanced tools, with particular attention to limiting the required economic effort. The main aim of these risk maps is to provide the knowledge required for the development of flood risk management plans (FRMPs), considering both the costs and benefits of alternatives and the results of consultation with all interested parties. In this context, this research project developed a free and open-source (FOSS) GIS software package, called FloodRisk, to operatively support stakeholders in their compliance with the FRMPs. FloodRisk aims to facilitate the development of risk maps and the evaluation and management of current and future flood risk for multi-purpose applications. This new approach overcomes the limits of the expert-driven qualitative (EDQ) approach currently adopted in several European countries, such as Italy, which does not permit a suitable evaluation of the effectiveness of risk mitigation strategies, because the vulnerability component cannot be properly assessed. Moreover, FloodRisk is also able to involve the citizens in the flood
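
    A quantitative flood-risk map of the kind described above typically encodes expected annual damage (EAD): damage integrated over annual exceedance probability. A minimal sketch with illustrative numbers (not taken from the paper):

```python
# Expected annual damage from a set of return-period damage estimates, via
# trapezoidal integration over annual exceedance probability.

def expected_annual_damage(return_periods, damages):
    """EAD from damages at given return periods.

    return_periods : years, ascending (e.g. [10, 100, 500])
    damages        : damage associated with the event of that return period
    """
    probs = [1.0 / t for t in return_periods]   # annual exceedance probabilities
    ead = 0.0
    for i in range(len(probs) - 1):
        dp = probs[i] - probs[i + 1]            # probabilities are descending
        ead += 0.5 * (damages[i] + damages[i + 1]) * dp
    return ead

# hypothetical damages for 10-, 100- and 500-year floods in one district
ead = expected_annual_damage([10, 100, 500], [1.0e6, 5.0e6, 2.0e7])
```

Computed per cell or per district and drawn on a map, this is the quantity that lets alternative mitigation measures be compared by their risk reduction.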

  11. Near-Infrared Imaging for Spatial Mapping of Organic Content in Petroleum Source Rocks

    Science.gov (United States)

    Mehmani, Y.; Burnham, A. K.; Vanden Berg, M. D.; Tchelepi, H.

    2017-12-01

    Natural gas from unconventional petroleum source rocks (shales) plays a key role in our transition towards sustainable low-carbon energy production. The potential for carbon storage (in adsorbed state) in these formations further aligns with efforts to mitigate climate change. Optimizing production and development from these resources requires knowledge of the hydro-thermo-mechanical properties of the rock, which are often strong functions of organic content. This work demonstrates the potential of near-infrared (NIR) spectral imaging in mapping the spatial distribution of organic content with O(100µm) resolution on cores that can span several hundred feet in depth (Mehmani et al., 2017). We validate our approach for the immature oil shale of the Green River Formation (GRF), USA, and show its applicability potential in other formations. The method is a generalization of a previously developed optical approach specialized to the GRF (Mehmani et al., 2016a). The implications of this work for spatial mapping of hydro-thermo-mechanical properties of excavated cores, in particular thermal conductivity, are discussed (Mehmani et al., 2016b). References:Mehmani, Y., A.K. Burnham, M.D. Vanden Berg, H. Tchelepi, "Quantification of organic content in shales via near-infrared imaging: Green River Formation." Fuel, (2017). Mehmani, Y., A.K. Burnham, M.D. Vanden Berg, F. Gelin, and H. Tchelepi. "Quantification of kerogen content in organic-rich shales from optical photographs." Fuel, (2016a). Mehmani, Y., A.K. Burnham, H. Tchelepi, "From optics to upscaled thermal conductivity: Green River oil shale." Fuel, (2016b).
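
    The calibration idea behind NIR organic-content mapping can be sketched as a two-step procedure: fit a regression from an absorption feature to measured organic content on calibration samples, then apply it pixel by pixel. The linear model and all numbers below are illustrative; the papers' actual band selection and regression are specific to the Green River Formation.

```python
import numpy as np

def fit_calibration(absorbance, toc):
    """Least-squares line mapping a NIR band absorbance to organic content."""
    A = np.column_stack([np.ones(len(absorbance)), np.asarray(absorbance, float)])
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(toc, dtype=float), rcond=None)
    return coeffs                       # (intercept, slope)

def predict_map(coeffs, absorbance_image):
    """Apply the calibration to every pixel of an absorbance image."""
    b0, b1 = coeffs
    return b0 + b1 * np.asarray(absorbance_image, dtype=float)

# synthetic calibration samples (absorbance vs. measured organic content, wt%)
coeffs = fit_calibration([0.1, 0.2, 0.3, 0.4], [2.0, 4.0, 6.0, 8.0])
toc_map = predict_map(coeffs, [[0.15, 0.25], [0.35, 0.05]])
```

Run over every pixel of a core image, this yields the O(100 µm)-resolution organic-content maps the abstract describes.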

  12. Counting Possibilia

    Directory of Open Access Journals (Sweden)

    Alfredo Tomasetta

    2010-06-01

    Full Text Available Timothy Williamson supports the thesis that every possible entity necessarily exists, and so he needs to explain how a possible son of Wittgenstein’s, for example, exists in our world: he exists as a merely possible object (MPO), a pure locus of potential. Williamson presents a short argument for the existence of MPOs: how many knives can be made by fitting together two blades and two handles? Four: at most two are concrete objects, the others being merely possible knives and merely possible objects. This paper defends the idea that one can avoid reference and ontological commitment to MPOs. My proposal is that MPOs can be dispensed with by using the notion of rules of knife-making. I first present a solution according to which we count lists of instructions - selected by the rules - describing physical combinations between components. This account, however, has its own difficulties, and I eventually suggest that one can find a way out by admitting possible worlds, entities which are more commonly accepted - at least by philosophers - than MPOs. I maintain that, in answering Williamson’s questions, we count classes of physically possible worlds in which the same instance of a general rule is applied.

  13. Point source detection using the Spherical Mexican Hat Wavelet on simulated all-sky Planck maps

    Science.gov (United States)

    Vielva, P.; Martínez-González, E.; Gallegos, J. E.; Toffolatti, L.; Sanz, J. L.

    2003-09-01

    We present an estimation of the point source (PS) catalogue that could be extracted from the forthcoming ESA Planck mission data. We have applied the Spherical Mexican Hat Wavelet (SMHW) to simulated all-sky maps that include cosmic microwave background (CMB), Galactic emission (thermal dust, free-free and synchrotron), thermal Sunyaev-Zel'dovich effect and PS emission, as well as instrumental white noise. This work is an extension of the one presented in Vielva et al. We have developed an algorithm focused on a fast local optimal-scale determination, which is crucial to achieve a PS catalogue with a large number of detections and a low flux limit. A significant effort has also been made to reduce the CPU time required for the spherical harmonic transformations, in order to perform the PS detection in a reasonable time. The presented algorithm is able to provide a PS catalogue above the following fluxes: 0.48 Jy (857 GHz), 0.49 Jy (545 GHz), 0.18 Jy (353 GHz), 0.12 Jy (217 GHz), 0.13 Jy (143 GHz), 0.16 Jy (100 GHz HFI), 0.19 Jy (100 GHz LFI), 0.24 Jy (70 GHz), 0.25 Jy (44 GHz) and 0.23 Jy (30 GHz). We detect around 27 700 PS at the highest-frequency Planck channel and 2900 at the 30-GHz one. The completeness levels are: 70 per cent (857 GHz), 75 per cent (545 GHz), 70 per cent (353 GHz), 80 per cent (217 GHz), 90 per cent (143 GHz), 85 per cent (100 GHz HFI), 80 per cent (100 GHz LFI), 80 per cent (70 GHz), 85 per cent (44 GHz) and 80 per cent (30 GHz). In addition, we can find several PS at different channels, allowing the study of their spectral behaviour and the physical processes acting on them. We also present the basic procedure to apply the method to maps convolved with asymmetric beams. 
    The algorithm takes ~72 h for the most CPU-time-demanding channel (857 GHz) on a Compaq HPC320 (Alpha EV68 1-GHz processor) and requires 4 GB of RAM; the CPU time scales as O[N_Ro N_pix^(3/2) log(N_pix)], where N_pix is the number of pixels in the map and N_Ro is the number of optimal scales needed.
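    The wavelet-filtering idea can be illustrated with a flat-sky, single-scale numpy sketch (the real method operates on the sphere with the SMHW and searches for a local optimal scale; the map, beam, noise level and source position below are synthetic assumptions):

```python
import numpy as np

# Flat-sky sketch of Mexican hat wavelet point-source detection.
rng = np.random.default_rng(0)
N = 128
y, x = np.mgrid[0:N, 0:N]

# Injected point source: a narrow Gaussian beam at a known position.
x0, y0 = 80, 40
beam = 2.0
source = 5.0 * np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2 * beam ** 2))
sky = source + rng.normal(0, 1.0, (N, N))   # white instrumental noise

# 2D Mexican hat kernel at a fixed scale R (here chosen near the beam width).
R = 2.0
r2 = (x - N // 2) ** 2 + (y - N // 2) ** 2
mhw = (2 - r2 / R ** 2) * np.exp(-r2 / (2 * R ** 2))
mhw -= mhw.mean()   # zero-mean filter suppresses large-scale background

# FFT convolution of the map with the wavelet amplifies beam-like peaks.
filtered = np.real(np.fft.ifft2(np.fft.fft2(sky) * np.fft.fft2(np.fft.ifftshift(mhw))))

peak = np.unravel_index(np.argmax(filtered), filtered.shape)
print(peak)   # close to (y0, x0)
```

    The zero-mean bandpass shape of the wavelet is what removes the smooth CMB and Galactic backgrounds while boosting structure at the beam scale.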

  14. Tsunami hazard maps of spanish coast at national scale from seismic sources

    Science.gov (United States)

    Aniel-Quiroga, Íñigo; González, Mauricio; Álvarez-Gómez, José Antonio; García, Pablo

    2017-04-01

    Tsunamis are a moderately frequent phenomenon in the NEAM (North East Atlantic and Mediterranean) region, and consequently in Spain, as historic and recent events have affected this area. For example, the 1755 earthquake and tsunami affected the Spanish Atlantic coasts of Huelva and Cádiz, and the 2003 Boumerdès earthquake triggered a tsunami that reached the coast of the Balearic Islands in less than 45 minutes. The risk in Spain is real, and its population and tourism rate make it vulnerable to this kind of catastrophic event. The Indian Ocean tsunami in 2004 and the tsunami in Japan in 2011 launched the worldwide development and application of tsunami risk reduction measures, which have become a priority in this field. On November 20th 2015 the directive of the Spanish civil protection agency on tsunami emergency planning was presented. As part of the Spanish National Security strategy, this document specifies the structure of the action plans at different levels: national, regional and local. In this sense, the first step is the proper evaluation of the tsunami hazard at national scale. This work deals with the assessment of tsunami hazard in Spain by means of numerical simulations, focused on the elaboration of tsunami hazard maps at national scale. To achieve this, following a deterministic approach, the seismic structures whose earthquakes could generate the worst tsunamis affecting the coast of Spain have been compiled and characterized. These worst-case sources have been propagated numerically over a reconstructed bathymetry, built from the best-resolution data available. This high-resolution bathymetry was joined with a 25-m resolution DTM to generate a continuous offshore-onshore space, allowing the calculation of the flooded areas caused by each selected source. The numerical model applied for the calculation of the tsunami propagations was COMCOT. 
The maps resulting from the numerical simulations show not only the tsunami amplitude at coastal areas but

  15. Evaluation of the influence of source and spatial resolution of DEMs on derivative products used in landslide mapping

    Directory of Open Access Journals (Sweden)

    Rubini Mahalingam

    2016-11-01

    Full Text Available Landslides are a major geohazard, which result in significant human, infrastructure, and economic losses. Landslide susceptibility mapping can help communities plan and prepare for these damaging events. Digital elevation models (DEMs) are one of the most important datasets used in landslide hazard assessment. Despite their frequent use, limited research has been completed to date on how the DEM source and spatial resolution can influence the accuracy of the produced landslide susceptibility maps. The aim of this paper is to analyse the influence of the spatial resolution and source of DEMs on landslide susceptibility mapping. For this purpose, Advanced Spaceborne Thermal Emission and Reflection (ASTER), National Elevation Dataset (NED), and Light Detection and Ranging (LiDAR) DEMs were obtained for two study sections of approximately 140 km² in north-west Oregon. Each DEM was resampled to 10, 30, and 50 m, and slope and aspect grids were derived for each resolution. A set of nine spatial databases was constructed in a geographic information system (GIS) for each spatial resolution and source. Additional factors, such as distance-to-river and fault maps, were included. An analytical hierarchical process (AHP), a fuzzy logic model, and a likelihood ratio-AHP, representing qualitative, quantitative, and hybrid landslide mapping techniques, were used for generating landslide susceptibility maps. The results from each of the techniques were verified with Cohen's kappa index, a confusion matrix, and a validation index based on agreement with detailed landslide inventory maps. The 10 m spatial resolution derived from the LiDAR dataset showed higher predictive accuracy in all three techniques used for producing landslide susceptibility maps. At a resolution of 10 m, the output maps based on NED and ASTER had higher misclassification compared to the LiDAR-based outputs. Further, the 30-m LiDAR output showed improved results over the 10-m NED and 10-m
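    The slope and aspect derivation applied to each resampled DEM can be sketched as follows (the DEM and cell size below are synthetic; a real workflow would use GIS raster tools on the actual grids):

```python
import numpy as np

# Sketch: derive slope and aspect grids from a DEM at a given cell size,
# as done here for the 10/30/50 m resampled DEMs.
def slope_aspect(dem, cell_size):
    """Return slope (degrees) and aspect (degrees clockwise from north)."""
    dz_dy, dz_dx = np.gradient(dem, cell_size)            # elevation gradients
    slope = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
    aspect = np.degrees(np.arctan2(-dz_dx, dz_dy)) % 360  # downslope direction
    return slope, aspect

# Tiny synthetic DEM: an inclined plane rising 1 m per 10 m toward the east.
cell = 10.0                      # 10 m resolution
x = np.arange(0, 200, cell)
dem = np.tile(0.1 * x, (20, 1))  # constant 10% grade

slope, aspect = slope_aspect(dem, cell)
print(round(float(slope.mean()), 2))   # ~5.71 degrees for a 10% grade
```

    Resampling the DEM to 30 or 50 m before this step smooths the gradients, which is exactly why the derived factor grids, and hence the susceptibility maps, depend on resolution.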

  16. Mapping human health risks from exposure to trace metal contamination of drinking water sources in Pakistan.

    Science.gov (United States)

    Bhowmik, Avit Kumar; Alamdar, Ambreen; Katsoyiannis, Ioannis; Shen, Heqing; Ali, Nadeem; Ali, Syeda Maria; Bokhari, Habib; Schäfer, Ralf B; Eqani, Syed Ali Musstjab Akber Shah

    2015-12-15

    The consumption of contaminated drinking water is one of the major causes of mortality and many severe diseases in developing countries. The principal drinking water sources in Pakistan, i.e. ground and surface water, are subject to geogenic and anthropogenic trace metal contamination. However, water quality monitoring activities have been limited to a few administrative areas and a nationwide human health risk assessment from trace metal exposure is lacking. Using geographically weighted regression (GWR) and eight relevant spatial predictors, we calculated nationwide human health risk maps by predicting the concentration of 10 trace metals in the drinking water sources of Pakistan and comparing them to guideline values. GWR incorporated local variations of trace metal concentrations into prediction models and hence mitigated effects of large distances between sampled districts due to data scarcity. Predicted concentrations mostly exhibited high accuracy and low uncertainty, and were in good agreement with observed concentrations. Concentrations for Central Pakistan were predicted with higher accuracy than for the North and South. A maximum 150-200 fold exceedance of guideline values was observed for predicted cadmium concentrations in ground water and arsenic concentrations in surface water. In more than 53% (4 and 100% for the lower and upper boundaries of 95% confidence interval (CI)) of the total area of Pakistan, the drinking water was predicted to be at risk of contamination from arsenic, chromium, iron, nickel and lead. The area with elevated risks is inhabited by more than 74 million (8 and 172 million for the lower and upper boundaries of 95% CI) people. Although these predictions require further validation by field monitoring, the results can inform disease mitigation and water resources management regarding potential hot spots. Copyright © 2015 Elsevier B.V. All rights reserved.
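    A minimal sketch of the geographically weighted regression idea used above, assuming a Gaussian distance kernel and fully synthetic district data (the bandwidth, predictors and coefficients are illustrative, not those of the study):

```python
import numpy as np

# GWR sketch: at each prediction point, fit weighted least squares with
# weights that decay with distance, so coefficients vary locally.
rng = np.random.default_rng(1)
n = 200
coords = rng.uniform(0, 100, (n, 2))                   # sampled locations
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # intercept + predictor

# True predictor effect drifts from west to east (a spatial process).
beta1 = 0.5 + 0.02 * coords[:, 0]
y = X[:, 0] * 1.0 + X[:, 1] * beta1 + rng.normal(0, 0.1, n)

def gwr_coef(point, coords, X, y, bandwidth=20.0):
    d = np.linalg.norm(coords - point, axis=1)
    w = np.exp(-0.5 * (d / bandwidth) ** 2)   # Gaussian kernel weights
    W = np.diag(w)
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

west = gwr_coef(np.array([5.0, 50.0]), coords, X, y)
east = gwr_coef(np.array([95.0, 50.0]), coords, X, y)
print(west[1] < east[1])   # the local slope increases eastward
```

    Fitting locally in this way is what lets GWR bridge large gaps between sampled districts while still reflecting regional differences.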

  17. Mapping landslide source and transport areas in VHR images with Object-Based Analysis and Support Vector Machines

    Science.gov (United States)

    Heleno, Sandra; Matias, Magda; Pina, Pedro

    2015-04-01

    Visual interpretation of satellite imagery remains extremely demanding in terms of resources and time, especially when dealing with numerous multi-scale landslides affecting wide areas, as is the case for rainfall-induced shallow landslides. Applying automated methods can contribute to more efficient landslide mapping and updating of existing inventories, and in recent years the number and variety of approaches has been increasing rapidly. Very High Resolution (VHR) images, acquired by space-borne sensors with sub-metric precision, such as Ikonos, Quickbird, Geoeye and Worldview, are increasingly being considered the best option for landslide mapping, but these new levels of spatial detail also present new challenges to state-of-the-art image analysis tools, calling for automated methods specifically suited to mapping landslide events on VHR optical images. In this work we develop and test a methodology for semi-automatic landslide recognition and mapping of landslide source and transport areas. The method combines object-based image analysis and a Support Vector Machine supervised learning algorithm, and was tested using a GeoEye-1 multispectral image, sensed 3 days after a damaging landslide event in Madeira Island, together with a pre-event LiDAR DEM. Our approach proved successful in the recognition of landslides over a 15 km² study area, with 81 out of 85 landslides detected in its validation regions. The classifier also showed reasonable performance (true positive rate above 60% and false positive rate below 36% in both validation regions) in the internal mapping of landslide source and transport areas, in particular on the sunnier east-facing slopes. In the less illuminated areas the classifier is still able to accurately map the source areas, but performs poorly in the mapping of landslide transport areas.
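    The supervised-classification step can be illustrated with a toy linear SVM trained by the Pegasos subgradient method on synthetic object features (the study used an SVM on object-based image features; the two features, class statistics and hyperparameters below are invented stand-ins):

```python
import numpy as np

# Toy linear SVM (Pegasos) separating "landslide" from "background" objects
# described by two synthetic features (brightness, slope).
rng = np.random.default_rng(2)
n = 400
X_pos = rng.normal([0.7, 30.0], [0.1, 5.0], (n // 2, 2))  # bright, steep
X_neg = rng.normal([0.4, 12.0], [0.1, 5.0], (n // 2, 2))  # darker, flatter
X = np.vstack([X_pos, X_neg])
y = np.concatenate([np.ones(n // 2), -np.ones(n // 2)])

# Standardize features, append a bias column.
X = (X - X.mean(0)) / X.std(0)
Xb = np.column_stack([X, np.ones(n)])

w, lam = np.zeros(3), 0.01
for t in range(1, 2001):                 # Pegasos: SGD on the hinge loss
    i = rng.integers(n)
    eta = 1.0 / (lam * t)
    w = (1 - eta * lam) * w
    if y[i] * (Xb[i] @ w) < 1:
        w += eta * y[i] * Xb[i]

acc = np.mean(np.sign(Xb @ w) == y)
print(acc)   # high accuracy on this well-separated toy set
```

    In the actual workflow the feature vectors come from image segmentation (spectral, textural and DEM-derived object attributes) rather than from two synthetic columns.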

  18. A procedure for merging land cover/use data from Landsat, aerial photography, and map sources - Compatibility, accuracy and cost

    Science.gov (United States)

    Enslin, W. R.; Tilmann, S. E.; Hill-Rowley, R.; Rogers, R. H.

    1977-01-01

    A method is developed to merge land cover/use data from Landsat, aerial photography and map sources into a grid-based geographic information system. The method basically involves: computer-assisted categorization of Landsat data to provide certain user-specified land cover categories; manual interpretation of aerial photography to identify other selected land cover/use categories that cannot be obtained from Landsat data; identification of special features from aerial photography or map sources; merging of the interpreted data from all the sources into a computer-compatible file under a standardized coding structure; and the production of land cover/use maps, thematic maps, and tabular data. The specific tasks accomplished in producing the merged land cover/use data file and subsequent output products are identified and discussed. It is shown that effective implementation of the merging method is critically dependent on selecting the 'best' data source for each user-specified category in terms of accuracy and time/cost tradeoffs.

  19. Assessment of self-organizing maps to analyze sole-carbon source utilization profiles.

    Science.gov (United States)

    Leflaive, Joséphine; Céréghino, Régis; Danger, Michaël; Lacroix, Gérard; Ten-Hage, Loïc

    2005-07-01

    The use of community-level physiological profiles obtained with Biolog microplates is widely employed to assess the functional diversity of bacterial communities. Biolog produces a great amount of data, the analysis of which has been the subject of many studies. In most cases, after some transformations, these data have been investigated with classical multivariate analyses. Here we provide an alternative to this approach: the use of an artificial intelligence technique, the Self-Organizing Map (SOM, an unsupervised neural network). We used data from a microcosm study of algae-associated bacterial communities placed in various nutritive conditions. Analyses were carried out on the net absorbances at two incubation times for each substrate and on the chemical guild categorization of the total bacterial activity. Compared to Principal Components Analysis and cluster analysis, the SOM appeared to be a valuable tool for community classification and for establishing clear relationships between clusters of bacterial communities and sole-carbon source utilization. Specifically, SOMs offered a clear two-dimensional projection of a relatively large volume of data and were easier to interpret than the plots commonly obtained with multivariate analyses. They can be recommended for tracking the temporal evolution of communities' functional diversity.
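    A minimal numpy sketch of the SOM idea applied to substrate-utilization profiles (the grid size, learning schedule and the two synthetic community types are assumptions for illustration, not the study's configuration):

```python
import numpy as np

# Minimal self-organizing map: similar utilization profiles should end up
# on nearby cells of a small 2D grid.
rng = np.random.default_rng(3)

# Synthetic Biolog-like profiles: 2 community types x 31 substrates.
type_a = rng.normal(0.8, 0.05, (20, 31))     # high overall activity
type_b = rng.normal(0.2, 0.05, (20, 31))     # low overall activity
data = np.vstack([type_a, type_b])

grid = 4                                      # 4x4 output map
wts = rng.uniform(0, 1, (grid * grid, 31))
gy, gx = np.divmod(np.arange(grid * grid), grid)

for epoch in range(50):
    sigma = 2.0 * np.exp(-epoch / 20)         # shrinking neighbourhood
    lr = 0.5 * np.exp(-epoch / 20)            # decaying learning rate
    for v in rng.permutation(len(data)):
        bmu = np.argmin(((wts - data[v]) ** 2).sum(1))   # best-matching unit
        d2 = (gy - gy[bmu]) ** 2 + (gx - gx[bmu]) ** 2   # grid distance
        h = np.exp(-d2 / (2 * sigma ** 2))[:, None]      # neighbourhood weight
        wts += lr * h * (data[v] - wts)

bmu_a = np.argmin(((wts - type_a.mean(0)) ** 2).sum(1))
bmu_b = np.argmin(((wts - type_b.mean(0)) ** 2).sum(1))
print(bmu_a != bmu_b)   # the two community types occupy different map cells
```

    The resulting map gives the kind of readable two-dimensional projection the abstract describes: each community lands on a grid cell, and cluster structure is read directly off the map.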

  20. New resources for smart food retail mapping a GIS and the open source perspective

    Directory of Open Access Journals (Sweden)

    Eric Vaz

    2016-12-01

    Full Text Available In this paper it is demonstrated that open-source GIS software can help nonprofit organizations and local food retailers to strategically locate food shops, which also impacts realtors and other businesses. Areas are covered and clients served while avoiding food deserts and increasing security in the health sector (Barnes et al., 2016). The methodology demonstrates how mapping may be carried out, allowing people to gain a good understanding of the food distribution. Decision making at the corporate level also improves, thanks to better connections with local production and organic retailers and better outreach to local consumption. A further outcome of this exercise is to educate users on the negative impacts of food deserts on health and to improve awareness, supporting the design and integration of sustainable and healthy lifestyles (Vaz and Zhao, 2016). This novel proposal, which combines spatial and locational data visualization (McIver, 2003) with the sharing of information on healthy food retailers within the urban nexus (Morgan and Sonnino, 2010), engages communities to participate actively in the integration of new consumer behaviours and to express them clearly.

  1. Determination, source identification and GIS mapping for nitrate concentration in ground water from Bara aquifer

    International Nuclear Information System (INIS)

    Elfaki Taha, G. M. E.

    2010-09-01

    The study was carried out to determine the level of nitrate concentration in well water from the Bara aquifer in North Kordofan State. The analysis was conducted on 69 wells from different villages within the Bara basin. Physical characteristics were measured, including pH, electrical conductivity and dissolved oxygen. Spectrophotometric analysis was used to determine nitrate, nitrite and ammonia. Chloride and hardness were determined titrimetrically, and a flame photometer was used for the major elements sodium and potassium, whereas atomic absorption spectroscopy was used for the trace elements iron, manganese, zinc and copper. Results revealed that nitrate concentrations ranged from 9.68 to 891 mg/l in the sampled wells, with 81% exceeding the maximum permissible limits set for drinking water by the WHO and SSMO. Animal waste and organic soil nitrogen were found to be the sources of nitrate in these wells, as indicated by the ¹⁵N content. The majority of wells with high nitrate are located in the northern and north-eastern parts of the study area, as shown by the GIS predictive map. On average, the concentrations of sodium, potassium, calcium, magnesium, iron, manganese, zinc and copper were found to be within WHO limits for drinking water. (Author)

  2. Source, propagation and site effects: impact on mapping strong ground motion in Bucharest area

    International Nuclear Information System (INIS)

    Radulian, R.; Kuznetsov, I.; Panza, G.F.

    2004-01-01

    Achievements within the framework of the NATO SfP project 972266, focused on the impact of Vrancea earthquakes on the security of the Bucharest urban area, are presented. The problem of Bucharest city security with respect to Vrancea earthquakes is discussed in terms of numerical modelling of seismic motion and intermediate-term earthquake prediction. A hybrid numerical scheme developed by Faeh et al. (1990; 1993), valid for frequencies up to 1 Hz, is applied for the realistic modelling of the seismic ground motion in Bucharest. The method combines modal summation for the 1D bedrock model with finite differences for the 2D local structure model. All the factors controlling the ground motion at a site are considered: source, propagation and site effects, respectively. The input data include the recent records provided by the digital accelerometer network developed within the Romanian-German CRC461 cooperation programme and the CALIXTO'99, VRANCEA'99 and VRANCEA2001 experiments. The numerical simulation proves to be a powerful tool for mapping strong ground motion in realistic structures, reproducing the observations acceptably from an engineering point of view. A new model of Vrancea earthquake scaling is obtained, and its implications for the determination of seismic motion parameters are analyzed. The roles of the focal mechanism and of attenuation properties in the amplitude and spectral content of the ground motion are outlined. The CN algorithm is applied for predicting Vrancea earthquakes. Finally, implications for the disaster management strategy are discussed. (authors)

  3. Novel data sources for women's health research: mapping breast screening online information seeking through Google trends.

    Science.gov (United States)

    Fazeli Dehkordy, Soudabeh; Carlos, Ruth C; Hall, Kelli S; Dalton, Vanessa K

    2014-09-01

    Millions of people use online search engines every day to find health-related information and voluntarily share their personal health status and behaviors on various Web sites. Thus, data from the tracking of online information seekers' behavior offer potential opportunities for use in public health surveillance and research. Google Trends is a feature of Google that allows Internet users to graph the frequency of searches for a single term or phrase over time or by geographic region. We used Google Trends to describe patterns of information-seeking behavior on the subject of dense breasts and to examine their correlation with the passage or introduction of dense breast notification legislation. To capture the temporal variations of information seeking about dense breasts, the Web search query "dense breast" was entered in the Google Trends tool. We then mapped the dates of legislative actions regarding dense breasts that received widespread coverage in the lay media onto information-seeking trends about dense breasts over time. Newsworthy events and legislative actions appear to correlate well with peaks in the search volume of "dense breast". Geographic regions with the highest search volumes have passed, denied, or are currently considering dense breast legislation. Our study demonstrated that legislative actions and the respective news coverage correlate with increases in information seeking for "dense breast" on Google, suggesting that Google Trends has the potential to serve as a data source for policy-relevant research. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.

  4. An Offline-Online Android Application for Hazard Event Mapping Using WebGIS Open Source Technologies

    Science.gov (United States)

    Olyazadeh, Roya; Jaboyedoff, Michel; Sudmeier-Rieux, Karen; Derron, Marc-Henri; Devkota, Sanjaya

    2016-04-01

    Nowadays, Free and Open Source Software (FOSS) plays an important role in better understanding and managing disaster risk reduction around the world. National and local governments, NGOs and other stakeholders are increasingly seeking and producing data on hazards. Most hazard event inventories and land use mapping are based on remote sensing data, with little ground truthing, creating difficulties depending on the terrain and accessibility. Open Source WebGIS tools offer an opportunity for quicker and easier ground truthing of critical areas in order to analyse hazard patterns and triggering factors. This study presents a secure mobile-mapping application for hazard event mapping built on Open Source WebGIS technologies such as a PostgreSQL/PostGIS database, Leaflet, Cordova and PhoneGap. The objectives of this prototype are: 1. An offline-online Android mobile application with advanced geospatial visualisation; 2. Easy collection and storage of hazard event information; 3. Centralized data storage accessible by all services (smartphone, standard web browser); 4. Improved data management through active participation in hazard event mapping and storage. The application has been implemented as a low-cost, rapid and participatory method for recording impacts from hazard events and includes geolocation (via GPS and the Internet), map visualization with satellite image overlays, viewing of uploaded images and of events as clustered points, and drawing and adding event information. The data can be recorded offline (on an Android device) or online (in any browser) and subsequently uploaded to the server whenever an internet connection is available. All events and records can be visualized by an administrator and made public after approval. Different user levels can be defined to control access to the data for communicating the information. 
This application was tested for landslides in post-earthquake Nepal but can be used for any other type of hazards such as flood, avalanche

  5. Can Probability Maps of Swept-Source Optical Coherence Tomography Predict Visual Field Changes in Preperimetric Glaucoma?

    Science.gov (United States)

    Lee, Won June; Kim, Young Kook; Jeoung, Jin Wook; Park, Ki Ho

    2017-12-01

    To determine the usefulness of swept-source optical coherence tomography (SS-OCT) probability maps in detecting locations with significant reduction in visual field (VF) sensitivity, or in predicting future VF changes, in patients with classically defined preperimetric glaucoma (PPG). In this longitudinal study, 43 eyes of 43 PPG patients followed up every 6 months for at least 2 years were analyzed. The patients underwent wide-field SS-OCT scanning and standard automated perimetry (SAP) at the time of enrollment. With this wide-scan protocol, probability maps originating from the corresponding thickness map and overlapping with SAP VF test points could be generated. We evaluated the vulnerable VF points with SS-OCT probability maps, as well as the prevalence of locations with significant VF reduction or subsequent VF changes observed in the corresponding damaged areas of the probability maps. The vulnerable VF points were arranged in superior and inferior arcuate patterns near central fixation. In 19 of 43 PPG eyes (44.2%), significant reduction in baseline VF was detected within the areas of structural change on the SS-OCT probability maps. In 16 of 43 PPG eyes (37.2%), subsequent VF changes within the areas of SS-OCT probability map change were observed over the course of the follow-up. Structural changes on SS-OCT probability maps could detect or predict VF changes on SAP in a considerable number of PPG eyes. Careful comparison of probability maps with SAP results could be useful in diagnosing and monitoring PPG patients in the clinical setting.

  6. Categorical counting.

    Science.gov (United States)

    Fetterman, J Gregor; Killeen, P Richard

    2010-09-01

    Pigeons pecked on three keys, responses to one of which could be reinforced after a few pecks, to a second key after a somewhat larger number of pecks, and to a third key after the maximum pecking requirement. The values of the pecking requirements and the proportion of trials ending with reinforcement were varied. Transits among the keys were an orderly function of peck number, and showed approximately proportional changes with changes in the pecking requirements, consistent with Weber's law. Standard deviations of the switch points between successive keys increased more slowly within a condition than across conditions. Changes in reinforcement probability produced changes in the location of the psychometric functions that were consistent with models of timing. Analyses of the number of pecks emitted and the duration of the pecking sequences demonstrated that peck number was the primary determinant of choice, but that passage of time also played some role. We capture the basic results with a standard model of counting, which we qualify to account for the secondary experiments. Copyright 2010 Elsevier B.V. All rights reserved.
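    The scalar-variability account referenced above (switch-point spread growing in proportion to the pecking requirement, consistent with Weber's law) can be illustrated with a small simulation; the coefficient of variation and trial count are arbitrary choices for the sketch:

```python
import numpy as np

# Scalar (Weber-like) counting: the remembered count of n pecks is noisy
# with standard deviation proportional to n, so switch points between keys
# scale with the requirement and their spread grows proportionally.
rng = np.random.default_rng(5)

def switch_points(requirement, coeff_var=0.15, trials=10000):
    # Perceived switching threshold: scalar noise around the requirement.
    return rng.normal(requirement, coeff_var * requirement, trials)

small = switch_points(8)
large = switch_points(32)
ratio_sd = large.std() / small.std()
print(round(ratio_sd, 1))   # close to 4: SD scales with the requirement
```

    A 4x larger requirement yields roughly a 4x larger standard deviation, which is the proportional (Weber) scaling the psychometric functions in the experiment exhibit.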

  7. Open Source Web Tool for Tracking in a Lowcost Mobile Mapping System

    Science.gov (United States)

    Fissore, F.; Pirotti, F.; Vettore, A.

    2017-11-01

    During the last decade, several Mobile Mapping Systems (MMSs), i.e. systems able to efficiently acquire three-dimensional data using moving sensors (Guarnieri et al., 2008, Schwarz and El-Sheimy, 2004), have been developed. Research and commercial products have been implemented on terrestrial, aerial and marine platforms, and even on human-carried equipment, e.g. backpacks (Lo et al., 2015, Nex and Remondino, 2014, Ellum and El-Sheimy, 2002, Leica Pegasus backpack, 2016, Masiero et al., 2017, Fissore et al., 2018). Such systems are composed of an integrated array of time-synchronised navigation and imaging sensors mounted on a mobile platform (Puente et al., 2013, Tao and Li, 2007). Usually an MMS integrates different types of sensors, such as GNSS, IMU, video cameras and/or laser scanners, that allow accurate and quick mapping (Li, 1997, Petrie, 2010, Tao, 2000). The typical requirement of high-accuracy 3D georeferenced reconstruction often makes such systems quite expensive. Indeed, at the time of writing, most of the terrestrial MMSs on the market cost more than 50000, which might be too expensive for certain applications (Ellum and El-Sheimy, 2002, Piras et al., 2008). In order to achieve the best performance, sensors have to be properly calibrated (Dong et al., 2007, Ellum and El-Sheimy, 2002). Sensors in MMSs are usually integrated and managed through dedicated software, developed ad hoc for the devices mounted on the mobile platform and hence tailored to the specific sensors used. Although commercial solutions are complete, very specific and closely tied to the typology of survey, their price restricts the number of users and of potentially interested sectors. This paper describes a (relatively low-cost) terrestrial Mobile Mapping System developed at the University of Padua (TESAF, Department of Land, Environment, Agriculture and Forestry) by the research team at CIRGEO, in order to test an

  8. NeuroMap: A spline-based interactive open-source software for spatiotemporal mapping of 2D and 3D MEA data

    Directory of Open Access Journals (Sweden)

    Oussama Abdoun

    2011-01-01

    Full Text Available A major characteristic of neural networks is the complexity of their organization at various spatial scales, from microscopic local circuits to macroscopic brain-scale areas. Understanding how neural information is processed thus entails the ability to study them at multiple scales simultaneously. This is made possible using microelectrode array (MEA) technology. Indeed, high-density MEAs provide large-scale coverage (several mm²) of whole neural structures combined with microscopic resolution (about 50 µm) of unit activity. Yet, current options for spatiotemporal representation of MEA-collected data remain limited. Here we present NeuroMap, a new interactive Matlab-based software for spatiotemporal mapping of MEA data. NeuroMap uses thin plate spline interpolation, which provides several assets with respect to conventional mapping methods used currently. First, any MEA design can be considered, including 2D or 3D, regular or irregular, arrangements of electrodes. Second, spline interpolation allows the estimation of activity across the tissue with local extrema not necessarily at recording sites. Finally, this interpolation approach provides a straightforward analytical estimation of the spatial Laplacian for better current source localization. In this software, coregistration of 2D MEA data on the anatomy of the neural tissue is made possible by fine matching of anatomical data with electrode positions, using rigid-deformation-based correction of anatomical pictures. Overall, NeuroMap provides substantial material for detailed spatiotemporal analysis of MEA data. The package is distributed under the GNU General Public License (GPL) and available at http://sites.google.com/site/neuromapsoftware.

  9. NeuroMap: A Spline-Based Interactive Open-Source Software for Spatiotemporal Mapping of 2D and 3D MEA Data.

    Science.gov (United States)

    Abdoun, Oussama; Joucla, Sébastien; Mazzocco, Claire; Yvert, Blaise

    2011-01-01

    A major characteristic of neural networks is the complexity of their organization at various spatial scales, from microscopic local circuits to macroscopic brain-scale areas. Understanding how neural information is processed thus entails the ability to study them at multiple scales simultaneously. This is made possible using microelectrodes array (MEA) technology. Indeed, high-density MEAs provide large-scale coverage (several square millimeters) of whole neural structures combined with microscopic resolution (about 50 μm) of unit activity. Yet, current options for spatiotemporal representation of MEA-collected data remain limited. Here we present NeuroMap, a new interactive Matlab-based software for spatiotemporal mapping of MEA data. NeuroMap uses thin plate spline interpolation, which provides several assets with respect to conventional mapping methods used currently. First, any MEA design can be considered, including 2D or 3D, regular or irregular, arrangements of electrodes. Second, spline interpolation allows the estimation of activity across the tissue with local extrema not necessarily at recording sites. Finally, this interpolation approach provides a straightforward analytical estimation of the spatial Laplacian for better current source localization. In this software, coregistration of 2D MEA data on the anatomy of the neural tissue is made possible by fine matching of anatomical data with electrode positions using rigid-deformation-based correction of anatomical pictures. Overall, NeuroMap provides substantial material for detailed spatiotemporal analysis of MEA data. The package is distributed under GNU General Public License and available at http://sites.google.com/site/neuromapsoftware.
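    The thin plate spline interpolation at the core of NeuroMap can be sketched from its standard linear system (the electrode positions and potentials below are synthetic; NeuroMap itself is Matlab-based, so this numpy version is only an illustration of the technique):

```python
import numpy as np

def _u(d):
    # TPS radial basis U(r) = r^2 log r, with U(0) = 0.
    safe = np.where(d > 0, d, 1.0)
    return d ** 2 * np.log(safe)

def tps_fit(pts, vals):
    # Solve the standard thin plate spline system [[K, P], [P^T, 0]].
    n = len(pts)
    K = _u(np.linalg.norm(pts[:, None] - pts[None, :], axis=2))
    P = np.column_stack([np.ones(n), pts])   # affine part: 1, x, y
    A = np.block([[K, P], [P.T, np.zeros((3, 3))]])
    return np.linalg.solve(A, np.concatenate([vals, np.zeros(3)]))

def tps_eval(coef, pts, q):
    U = _u(np.linalg.norm(q[:, None] - pts[None, :], axis=2))
    return U @ coef[:len(pts)] + coef[len(pts)] + q @ coef[len(pts) + 1:]

# Irregular "electrode" positions and a smooth potential sampled on them.
rng = np.random.default_rng(4)
pts = rng.uniform(0, 1, (30, 2))
vals = np.sin(2 * np.pi * pts[:, 0]) * np.cos(2 * np.pi * pts[:, 1])

coef = tps_fit(pts, vals)
grid = np.stack(np.meshgrid(np.linspace(0, 1, 25),
                            np.linspace(0, 1, 25)), -1).reshape(-1, 2)
field = tps_eval(coef, pts, grid)         # interpolated activity map
print(np.allclose(tps_eval(coef, pts, pts), vals))   # exact at electrodes
```

    Because the interpolant is an analytic sum of r² log r terms plus an affine part, its Laplacian can also be written in closed form, which is the property the abstract exploits for current source localization.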

  10. [Corrected count].

    Science.gov (United States)

    1991-11-27

    Data from the 1991 census indicated that the population count of Brazil fell short of a former estimate by 3 million people. The population reached 150 million people with an annual increase of 2%, while projections made in the previous decade expected an increase of 2.48%, to 153 million people. This reduction indicates more widespread use of family planning (FP) and control of fertility among families of lower social status as more information is being provided to them. However, the Ministry of Health ordered an investigation of foreign family planning organizations because it was suspected that women were forced to undergo tubal ligation during vaccination campaigns. A strange alliance of left-wing politicians and the Roman Catholic Church alleges a conspiracy of international FP organizations receiving foreign funds. The FP strategies of Bemfam and Pro-Pater offer women who have little alternative the opportunity to undergo tubal ligation or to receive oral contraceptives to control fertility. The ongoing government program of distributing booklets on FP is feeble and is not backed up by an education campaign. Charges of foreign interference are leveled while the government hypocritically ignores the grave problem of 4 million abortions a year. The population is expected to continue to grow until the year 2040 and then to stabilize at a low growth rate of 0.4%. In 1980, the number of children per woman was 4.4, whereas the 1991 census figures indicate this has dropped to 3.5. The excess population is associated with poverty and a forsaken caste in the interior. The population actually has decreased in the interior and in cities with 15,000 people. The phenomenon of the drop in fertility associated with rural exodus is contrasted with cities and villages where the population is 20% less than expected.

  11. Mapping groundwater dynamics using multiple sources of exhaustive high resolution data

    NARCIS (Netherlands)

    Finke, P.A.; Brus, D.J.; Bierkens, M.F.P.; Hoogland, T.; Knotters, M.; Vries, de F.

    2004-01-01

    Existing groundwater table (GWT) class maps, available at full coverage for the Netherlands at 1:50,000 scale, no longer satisfy user demands. Groundwater levels have changed due to strong human impact, so the maps are partially outdated. Furthermore, a more dynamic description of groundwater table

  12. Developing Coastal Surface Roughness Maps Using ASTER and QuickBird Data Sources

    Science.gov (United States)

    Spruce, Joe; Berglund, Judith; Davis, Bruce

    2006-01-01

    This viewgraph presentation concerns one element of a larger project on the integration of NASA science models and data into the Hazards U.S. Multi-Hazard (HAZUS-MH) Hurricane module for hurricane damage and loss risk assessment. HAZUS-MH is a decision support tool being developed by the National Institute of Building Sciences for the Federal Emergency Management Agency (FEMA). It includes the Hurricane Module, which employs surface roughness maps made from National Land Cover Data (NLCD) maps to estimate coastal hurricane wind damage and loss. NLCD maps are produced and distributed by the U.S. Geological Survey. This presentation discusses an effort to improve upon current HAZUS surface roughness maps by employing ASTER multispectral classifications with QuickBird "ground reference" imagery.
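    The core of deriving a roughness map from a land-cover classification is a per-pixel class-to-roughness-length lookup. The sketch below illustrates that general idea; the class codes and z₀ values are illustrative assumptions, not the actual NLCD/HAZUS tables.

    ```python
    import numpy as np

    # Hypothetical class -> roughness length z0 (m) lookup
    Z0 = {11: 0.001,   # open water
          21: 0.10,    # developed, open space
          23: 0.40,    # developed, medium intensity
          41: 0.80,    # deciduous forest
          71: 0.03}    # grassland

    # A tiny classified scene (class codes per pixel)
    landcover = np.array([[11, 21, 41],
                          [23, 71, 41]])

    # Vectorized lookup: replace each class code with its roughness length
    roughness = np.vectorize(Z0.get)(landcover)
    print(roughness)
    ```

    A real workflow would apply the same lookup to a full classified raster and typically aggregate roughness over neighborhoods before feeding it to the wind-damage model.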

  13. DC KIDS COUNT e-Databook Indicators

    Science.gov (United States)

    DC Action for Children, 2012

    2012-01-01

    This report presents indicators that are included in DC Action for Children's 2012 KIDS COUNT e-databook, their definitions and sources and the rationale for their selection. The indicators for DC KIDS COUNT represent a mix of traditional KIDS COUNT indicators of child well-being, such as the number of children living in poverty, and indicators of…

  14. Sources to the landscape - detailed spatiotemporal analysis of 200 years Danish landscape dynamics using unexploited historical maps and aerial photos

    DEFF Research Database (Denmark)

    Svenningsen, Stig Roar; Christensen, Andreas Aagaard; Dupont, Henrik

    to declassification of military maps and aerial photos from the cold war, only relatively few sources have been made available to researchers due to a lack of digitization efforts and related services. And even though the digitizing of cartographic material has been accelerated, the digitally available materials...... or to the commercial photo series from the last 20 years. This poster outlines a new research project focusing on the potential of unexploited cartographic sources for detailed analysis of the dynamics of the Danish landscape between 1800 and 2000. The project draws on cartographic sources available in Danish archives...... of material in landscape change studies, giving a high temporal and spatial resolution. The project also deals with the opportunities and constraints of comparing different cartographic sources with diverse purposes and times of production, e.g. different scales and quality of aerial photos or the difference between...

  15. Model-Based Analysis and Optimization of the Mapping of Cortical Sources in the Spontaneous Scalp EEG

    Directory of Open Access Journals (Sweden)

    Andrei V. Sazonov

    2007-01-01

    Full Text Available The mapping of brain sources into the scalp electroencephalogram (EEG) depends on volume conduction properties of the head and on an electrode montage involving a reference. Mathematically, this source mapping (SM) is fully determined by an observation function (OF) matrix. This paper analyses the OF-matrix for a generation model for the desynchronized spontaneous EEG. The model involves a four-shell spherical volume conductor containing dipolar sources that are mutually uncorrelated so as to reflect the desynchronized EEG. The reference is optimized in order to minimize the impact on the SM of the sources located distant from the electrodes. The resulting reference is called the localized reference (LR). The OF-matrix is analyzed in terms of the relative power contribution of the sources and the cross-channel correlation coefficient for five existing references as well as for the LR. It is found that the Hjorth Laplacian reference is a fair approximation of the LR, and thus is close to optimum for practical intents and purposes. The other references have a significantly poorer performance. Furthermore, the OF-matrix is analyzed for limits to the spatial resolution of the EEG. These are estimated to be around 2 cm.
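    The paper's setup can be made concrete with a toy linear model: scalp signals x = A s, where A is the OF-matrix and the sources s are mutually uncorrelated with unit variance, so the channel covariance is C = A Aᵀ and the cross-channel correlations follow from it. The 3×4 matrix below is an arbitrary assumption for illustration, not the paper's four-shell head model.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.normal(size=(3, 4))            # 3 electrodes, 4 dipolar sources

    C = A @ A.T                            # model channel covariance (unit-variance sources)
    d = np.sqrt(np.diag(C))
    R = C / np.outer(d, d)                 # cross-channel correlation matrix

    # Empirical check: simulate many uncorrelated source samples
    s = rng.normal(size=(4, 200_000))
    x = A @ s
    C_emp = np.cov(x)                      # sample covariance of the channels
    print(np.abs(C - C_emp).max())         # small: C_emp approaches A @ A.T
    ```

    In the paper, quantities like the relative power contribution of each source and the cross-channel correlation coefficients are read directly off such matrices for each candidate reference.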

  16. Following the money: Mapping the sources and funding flows of alcohol and other drug treatment in Australia.

    Science.gov (United States)

    Chalmers, Jenny; Ritter, Alison; Berends, Lynda; Lancaster, Kari

    2016-05-01

    The structures of health systems impact on patient outcomes. We present and analyse the first detailed mapping of who funds alcohol and other drug (AOD) treatment and the channels and intermediaries through which funding flows from the funding sources to treatment providers. The study involved a literature review of AOD treatment financing and existing diagrammatic representations of the structure of the Australian health system. We interviewed 190 key informants to particularise the AOD treatment sector, and undertook two case examples of government funded non-government organisations providing AOD treatment. Funding sources include the Australian and state and territory governments, philanthropy, fund-raising and clients themselves. While funding sources align with the health sector generally and the broader social services sector, the complexity of flows from source to treatment service and the number of intermediaries are noteworthy. So too are the many sources of funding drawn on by some treatment providers. Diversification is both beneficial and disadvantageous for non-government treatment providers, adding to administrative workloads, but smoothing the risk of funding shortfalls. Government funders benefit from sharing risk. Circuitous funding flows multiply the funding sources drawn on by services and put distance between the funding source and the service provider. This leads to concerns over lack of transparency about what is being purchased and challenges for the multiply funded service provider in maintaining programs and service models amid multiple and sometimes competing funding and accountability frameworks. [Chalmers J, Ritter A, Berends L, Lancaster K. Following the money: Mapping the sources and funding flows of alcohol and other drug treatment in Australia. Drug Alcohol Rev 2016;35:255-262]. © 2015 Australasian Professional Society on Alcohol and other Drugs.

  17. Verifying mapping, monitoring and modeling of fine sediment pollution sources in West Maui, Hawai'i, USA

    Science.gov (United States)

    Cerovski-Darriau, C.; Stock, J. D.

    2017-12-01

    Coral reef ecosystems, and the fishing and tourism industries they support, depend on clean waters. Fine sediment pollution from nearshore watersheds threatens these enterprises in West Maui, Hawai'i. To effectively mitigate sediment pollution, we first have to know where the sediment is coming from, and how fast it erodes. In West Maui, we know that nearshore sediment plumes originate from erosion of fine sand- to silt-sized air fall deposits where they are exposed by grazing, agriculture, or other disturbances. We identified and located these sediment sources by mapping watershed geomorphological processes using field traverses, historic air photos, and modern orthophotos. We estimated bank lowering rates using erosion pins, and other surface erosion rates were extrapolated from data collected elsewhere on the Hawaiian Islands. These measurements and mapping led to a reconnaissance sediment budget which showed that annual loads are dominated by bank erosion of legacy terraces. Field observations during small storms confirm that nearshore sediment plumes are sourced from bank erosion of in-stream, legacy agricultural deposits. To further verify this sediment budget, we used geochemical fingerprinting to uniquely identify each potential source (e.g. stream banks, agricultural fields, roads, other human-modified soils, and hillslopes) from the Wahikuli watershed (10 km²) and analyzed the fine fraction using ICP-MS for elemental geochemistry. We propose to apply the fingerprinting results to nearshore suspended sediment samples taken during storms to identify the proportion of sediment coming from each source. By combining traditional geomorphic mapping, monitoring and geochemistry, we hope to provide a powerful tool to verify the primary source of sediment reaching the nearshore.
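    The unmixing step behind fingerprinting can be sketched as a small inverse problem: given tracer concentrations for each candidate source and for a storm sample, estimate the non-negative mixture proportions. The tracer values below are invented; real studies use ICP-MS data and more careful mixing models with uncertainty analysis.

    ```python
    import numpy as np
    from scipy.optimize import nnls

    # Rows: tracers (e.g. elements); columns: sources (banks, fields, roads)
    S = np.array([[10.0,  2.0, 5.0],
                  [ 1.0,  8.0, 3.0],
                  [ 4.0,  4.0, 9.0],
                  [ 7.0,  1.0, 2.0]])

    true_p = np.array([0.6, 0.3, 0.1])   # hidden mixture proportions
    mix = S @ true_p                     # observed storm-sample chemistry

    # Non-negative least squares recovers the proportions
    p, _ = nnls(S, mix)
    p /= p.sum()                         # normalize to fractions summing to 1
    print(np.round(p, 3))
    ```

    With noise-free synthetic data the recovered vector matches the hidden proportions; with real samples the fit residual indicates how well the chosen sources explain the mixture.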

  18. An open-source software tool for the generation of relaxation time maps in magnetic resonance imaging

    International Nuclear Information System (INIS)

    Messroghli, Daniel R; Rudolph, Andre; Abdel-Aty, Hassan; Wassmuth, Ralf; Kühne, Titus; Dietz, Rainer; Schulz-Menger, Jeanette

    2010-01-01

    In magnetic resonance (MR) imaging, T1, T2 and T2* relaxation times represent characteristic tissue properties that can be quantified with the help of specific imaging strategies. While there are basic software tools for specific pulse sequences, until now there is no universal software program available to automate pixel-wise mapping of relaxation times from various types of images or MR systems. Such a software program would allow researchers to test and compare new imaging strategies and thus would significantly facilitate research in the area of quantitative tissue characterization. After defining requirements for a universal MR mapping tool, a software program named MRmap was created using a high-level graphics language. Additional features include a manual registration tool for source images with motion artifacts and a tabular DICOM viewer to examine pulse sequence parameters. MRmap was successfully tested on three different computer platforms with image data from three different MR system manufacturers and five different sorts of pulse sequences: multi-image inversion recovery T1; Look-Locker/TOMROP T1; modified Look-Locker (MOLLI) T1; single-echo T2/T2*; and multi-echo T2/T2*. Computing times varied between 2 and 113 seconds. Estimates of relaxation times compared favorably to those obtained from non-automated curve fitting. Completed maps were exported in DICOM format and could be read in standard software packages used for analysis of clinical and research MR data. MRmap is a flexible cross-platform research tool that enables accurate mapping of relaxation times from various pulse sequences. The software allows researchers to optimize quantitative MR strategies in a manufacturer-independent fashion. The program and its source code were made available as open-source software on the internet.

  19. An open-source software tool for the generation of relaxation time maps in magnetic resonance imaging

    Directory of Open Access Journals (Sweden)

    Kühne Titus

    2010-07-01

    Full Text Available Abstract Background In magnetic resonance (MR) imaging, T1, T2 and T2* relaxation times represent characteristic tissue properties that can be quantified with the help of specific imaging strategies. While there are basic software tools for specific pulse sequences, until now there is no universal software program available to automate pixel-wise mapping of relaxation times from various types of images or MR systems. Such a software program would allow researchers to test and compare new imaging strategies and thus would significantly facilitate research in the area of quantitative tissue characterization. Results After defining requirements for a universal MR mapping tool, a software program named MRmap was created using a high-level graphics language. Additional features include a manual registration tool for source images with motion artifacts and a tabular DICOM viewer to examine pulse sequence parameters. MRmap was successfully tested on three different computer platforms with image data from three different MR system manufacturers and five different sorts of pulse sequences: multi-image inversion recovery T1; Look-Locker/TOMROP T1; modified Look-Locker (MOLLI) T1; single-echo T2/T2*; and multi-echo T2/T2*. Computing times varied between 2 and 113 seconds. Estimates of relaxation times compared favorably to those obtained from non-automated curve fitting. Completed maps were exported in DICOM format and could be read in standard software packages used for analysis of clinical and research MR data. Conclusions MRmap is a flexible cross-platform research tool that enables accurate mapping of relaxation times from various pulse sequences. The software allows researchers to optimize quantitative MR strategies in a manufacturer-independent fashion. The program and its source code were made available as open-source software on the internet.
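    The per-pixel fit that a tool like MRmap automates is, at its core, a mono-exponential regression per voxel. The sketch below shows that fit for multi-echo T2 data with synthetic, noise-free signals; MRmap itself reads DICOM series and supports the additional sequence types listed above.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def t2_decay(te, s0, t2):
        """Mono-exponential signal model S(TE) = S0 * exp(-TE / T2)."""
        return s0 * np.exp(-te / t2)

    te = np.array([10.0, 20.0, 40.0, 60.0, 80.0])   # echo times (ms)

    fits = []
    for s0_true, t2_true in [(1000.0, 45.0), (800.0, 90.0)]:  # two "pixels"
        sig = t2_decay(te, s0_true, t2_true)        # noise-free magnitudes
        (s0_fit, t2_fit), _ = curve_fit(t2_decay, te, sig, p0=(sig[0], 50.0))
        fits.append(t2_fit)

    print([round(f, 1) for f in fits])              # → [45.0, 90.0]
    ```

    A mapping tool repeats this fit for every pixel and writes the fitted T2 values back out as an image; T1 mapping uses the same idea with an inversion-recovery model instead.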

  20. Can Twitter Be a Source of Information on Allergy? Correlation of Pollen Counts with Tweets Reporting Symptoms of Allergic Rhinoconjunctivitis and Names of Antihistamine Drugs.

    Science.gov (United States)

    Gesualdo, Francesco; Stilo, Giovanni; D'Ambrosio, Angelo; Carloni, Emanuela; Pandolfi, Elisabetta; Velardi, Paola; Fiocchi, Alessandro; Tozzi, Alberto E

    2015-01-01

    Pollen forecasts are in use everywhere to inform therapeutic decisions for patients with allergic rhinoconjunctivitis (ARC). We exploited data derived from Twitter in order to identify tweets reporting a combination of symptoms consistent with a case definition of ARC and those reporting the name of an antihistamine drug. In order to increase the sensitivity of the system, we applied an algorithm aimed at automatically identifying jargon expressions related to medical terms. We compared weekly Twitter trends with National Allergy Bureau weekly pollen counts derived from US stations, and found a high correlation of the sum of the total pollen counts from all stations with tweets reporting ARC symptoms (Pearson's correlation coefficient: 0.95) and with tweets reporting antihistamine drug names (Pearson's correlation coefficient: 0.93). Longitude and latitude of the pollen stations affected the strength of the correlation. Twitter and other social networks may play a role in allergic disease surveillance and in signaling drug consumption trends.
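    The comparison the study reports is a plain Pearson correlation between two weekly time series. A toy version, with invented weekly counts (the paper's actual coefficients are 0.95 and 0.93):

    ```python
    import numpy as np

    # Invented weekly totals over an 8-week pollen season
    pollen = np.array([120, 340, 800, 1500, 2200, 1700, 900, 300])
    tweets = np.array([ 15,  40,  95,  180,  260,  210, 110,  42])

    # Pearson's correlation coefficient between the two series
    r = np.corrcoef(pollen, tweets)[0, 1]
    print(round(r, 3))
    ```

    A strong positive r, as here, is what would support using symptom tweets as a proxy signal for pollen exposure.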

  1. DAFNE: A Matlab toolbox for Bayesian multi-source remote sensing and ancillary data fusion, with application to flood mapping

    Science.gov (United States)

    D'Addabbo, Annarita; Refice, Alberto; Lovergine, Francesco P.; Pasquariello, Guido

    2018-03-01

    High-resolution, remotely sensed images of the Earth's surface have been proven to be of help in producing detailed flood maps, thanks to their synoptic overview of the flooded area and frequent revisits. However, flood scenarios can be complex situations, requiring the integration of different data in order to provide accurate and robust flood information. Several processing approaches have been recently proposed to efficiently combine and integrate heterogeneous information sources. In this paper, we introduce DAFNE, a Matlab®-based, open source toolbox, conceived to produce flood maps from remotely sensed and other ancillary information, through a data fusion approach. DAFNE is based on Bayesian Networks, and is composed of several independent modules, each one performing a different task. Multi-temporal and multi-sensor data can be easily handled, with the possibility of following the evolution of an event through multi-temporal output flood maps. Each DAFNE module can be easily modified or upgraded to meet different user needs. The DAFNE suite is presented together with an example of its application.
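    The Bayesian-fusion idea underlying a toolbox like DAFNE can be shown in miniature: combine independent pieces of evidence about a pixel (say, dark SAR backscatter and low-lying terrain) into a posterior probability of "flooded". All probability values below are invented for illustration and are not DAFNE's actual network parameters.

    ```python
    prior = 0.30                       # P(flooded) before looking at any data

    # (P(observation | flooded), P(observation | not flooded)) per evidence source
    likelihoods = [(0.80, 0.20),       # dark SAR backscatter
                   (0.70, 0.40)]       # low-lying terrain (ancillary data)

    post = prior
    for l_f, l_nf in likelihoods:      # naive-Bayes style sequential update
        num = l_f * post
        post = num / (num + l_nf * (1.0 - post))

    print(round(post, 3))              # → 0.75
    ```

    A full Bayesian network generalizes this by modeling dependencies between evidence sources instead of assuming independence, and by doing the update for every pixel and time step.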

  2. Mapping nanoscale effects of localized noise-source activities on photoconductive charge transports in polymer-blend films

    Science.gov (United States)

    Shekhar, Shashank; Cho, Duckhyung; Cho, Dong-Guk; Yang, Myungjae; Hong, Seunghun

    2018-05-01

    We developed a method to directly image the nanoscale effects of localized noise-source activities on photoconducting charge transports in domain structures of phase-separated polymer-blend films of Poly(9,9-di-n-octylfluorenyl-2,7-diyl) and Poly(9,9-di-n-octylfluorene-alt-benzothiadiazole). For the imaging, current and noise maps of the polymer blend were recorded using a conducting nanoprobe in contact with the surface, enabling the mapping of conductivity (σ) and noise-source density (N_T) under an external stimulus. The blend films exhibited phase separation between the constituent polymers at the domain level. Within a domain, high σ (low N_T) and low σ (high N_T) regions were observed, which could be associated with the ordered and disordered regions of a domain. In the N_T maps, we observed that noise sources strongly affected the conduction mechanism, resulting in a scaling behavior of σ ∝ N_T^(-0.5) in both ordered and disordered regions. When a blend film was under the influence of an external stimulus such as a high bias or an illumination, an increase in σ was observed, but that also resulted in an increase in N_T as a trade-off. Interestingly, the Δσ versus ΔN_T plot exhibited an unusual scaling behavior of Δσ ∝ ΔN_T^(0.5), which is attributed to the de-trapping of carriers from deep traps by the external stimuli. In addition, we found that an external stimulus increased the conductivity at the interfaces without significantly increasing their N_T, which can be the origin of the superior performance of polymer-blend-based devices. These results provide valuable insight into the effects of noise sources on nanoscale optoelectronic properties in polymer-blend films, which can be an important guideline for improving devices based on polymer blends.
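    A power law such as σ ∝ N_T^(-0.5) is usually verified by fitting a line to the data on log-log axes, where the slope gives the exponent. The sketch below does that on synthetic points generated to follow the law exactly (the paper's maps, of course, are measured data):

    ```python
    import numpy as np

    N_T = np.logspace(0, 3, 30)          # noise-source densities (arbitrary units)
    sigma = 5.0 * N_T ** -0.5            # conductivities obeying sigma ∝ N_T^(-0.5)

    # On log-log axes the power law is a straight line; the slope is the exponent
    slope, intercept = np.polyfit(np.log(N_T), np.log(sigma), 1)
    print(round(slope, 3))               # → -0.5
    ```

    On real maps the fitted slope would carry scatter, and comparing it between ordered and disordered regions is how a common exponent such as -0.5 is established.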

  3. External knowledge sourcing in the Spanish archaeological sector: Mapping the emergent stage of a business activity

    Directory of Open Access Journals (Sweden)

    Eva Parga-Dans

    2017-03-01

    Full Text Available Recent studies of innovation highlight the importance of external knowledge sourcing. Existing empirical works are based on national surveys and specific industries. The present study contributes to the analysis of strategies for sourcing external knowledge, based on a specific case study and moment in time: the Spanish archaeological sector and its emergence as a new business activity. Our results show that external knowledge sourcing involves diverse mechanisms, agents and two main strategies: cooperation and knowledge acquisition. In an expanding knowledge-based sector emerging in an uncertain context and whose sources of knowledge are scattered, innovation strategy should focus on the search for external knowledge (cooperation and acquisition strategies), rather than on internal sources.

  4. Mapping and ablating stable sources for atrial fibrillation: summary of the literature on Focal Impulse and Rotor Modulation (FIRM).

    Science.gov (United States)

    Baykaner, Tina; Lalani, Gautam G; Schricker, Amir; Krummen, David E; Narayan, Sanjiv M

    2014-09-01

    Atrial fibrillation (AF) is the most common sustained arrhythmia and the most common indication for catheter ablation. However, despite substantial technical advances in mapping and energy delivery, ablation outcomes remain suboptimal. A major limitation to AF ablation is that the areas targeted for ablation are rarely of proven mechanistic importance, in sharp contrast to other arrhythmias in which ablation targets demonstrated mechanisms in each patient. Focal impulse and rotor modulation (FIRM) is a new approach to demonstrate the mechanisms that sustain AF ("substrates") in each patient that can be used to guide ablation then confirm elimination of each mechanism. FIRM mapping reveals that AF is sustained by 2–3 rotors and focal sources, with a greater number in patients with persistent than paroxysmal AF, lying within spatially reproducible 2.2 ± 1.4 cm² areas in diverse locations. This temporospatial reproducibility, now confirmed by several groups using various methods, changes the concepts regarding AF-sustaining mechanisms, enabling localized rather than widespread ablation. Mechanistically, the role of rotors and focal sources in sustaining AF has been demonstrated by the acute and chronic success of source (FIRM) ablation alone. Clinically, adding FIRM to conventional ablation substantially improves arrhythmia freedom compared with conventional ablation alone, and ongoing randomized trials are comparing FIRM-ablation with and without conventional ablation to conventional ablation alone. In conclusion, ablation of patient-specific AF-sustaining mechanisms (substrates), as exemplified by FIRM, may be central to substantially improving AF ablation outcomes.

  5. Lessons in modern digital field geology: Open source software, 3D techniques, and the new world of digital mapping

    Science.gov (United States)

    Pavlis, Terry; Hurtado, Jose; Langford, Richard; Serpa, Laura

    2014-05-01

    Although many geologists refuse to admit it, it is time to put paper-based geologic mapping into the historical archives and move to the full potential of digital mapping techniques. For our group, flat map digital geologic mapping is now a routine operation in both research and instruction. Several software options are available, and basic proficiency with the software can be learned in a few hours of instruction and practice. The first practical field GIS software, ArcPad, remains a viable, stable option on Windows-based systems. However, the vendor seems to be moving away from ArcPad in favor of mobile software solutions that are difficult to implement without GIS specialists. Thus, we have pursued a second software option based on the open source program QGIS. Our QGIS system uses the same shapefile-centric data structure as our ArcPad system, including similar pop-up data entry forms and generic graphics for easy data management in the field. The advantage of QGIS is that the same software runs on virtually all common platforms except iOS, although the Android version remains unstable as of this writing. A third software option we are experimenting with for flat map-based field work is Fieldmove, a derivative of the 3D-capable program Move developed by Midland Valley. Our initial experiments with Fieldmove are positive, particularly with the new, inexpensive (potential for communicating the complexity of key exposures. For example, in studies of metamorphic structures we often search for days to find "Rosetta Stone" outcrops that display key geometric relationships. While conventional photographs rarely can capture the essence of the field exposure, capturing a true 3D representation of the exposure with multiple photos from many orientations can solve this communication problem. As spatial databases evolve these 3D models should be readily importable into the database.

  6. Preparedness for response to the challenges from orphan sources: nationwide environmental radiation mapping with state of the art monitoring systems

    International Nuclear Information System (INIS)

    Saindane, Shashank S.; Pradeepkumar, K.S.; Suri, M.M.K.; Sharma, D.N.

    2008-01-01

    Based on various international reports on orphan sources, the potential for radiological emergencies in the public domain is recognized as a cause of concern. To detect the presence of any such orphan sources and to strengthen the preparedness for response to any radiological emergencies in the public domain, a nationwide radiation mapping programme was initiated in India. Various radiation monitoring systems, a few of them integrated with the Global Positioning System (GPS) and installed in mobile monitoring vans, were used for this purpose. This monitoring also helped in generating baseline dose rate data for the cities and in demonstrating the methodology of environmental monitoring for locating the presence of orphan sources, if any. During the detailed monitoring of various cities of the country, different systems such as the GSM-based Radiation Monitoring System (GRaMS), the Compact Radiation Monitoring System, the Portable Mobile Gamma Spectrometry System, the Gamma Tracer System etc., installed in a vehicle, were made to continuously acquire data at acquisition times varying from 10 seconds to 1 minute. These systems can measure dose rates in the range 0.01–100 μGy h⁻¹ and can detect 7.4 MBq (200 μCi) of ⁶⁰Co and 25 MBq (675 μCi) of ¹³⁷Cs from a distance of 5 metres. The average dose rate recorded during this environmental monitoring was 81 ± 7 nGy h⁻¹, with a maximum of 210 ± 11 nGy h⁻¹ at Bangalore (attributed to the presence of ⁴⁰K). The digital topographic map and the data acquired from the radiation mapping are used to generate a terrestrial radiation map. This radiation profile, stored in the database, can be used as a reference while carrying out impact assessments following any nuclear/radiological emergency. These systems also help to tag the radiation levels along with positional coordinates online onto the GIS map of the area. GRaMS also demonstrated its capability for online transmission of the data to the centralized data acquisition base station.
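    Detection claims like "7.4 MBq of ⁶⁰Co from 5 metres" rest on the inverse-square law for a point source. The back-of-envelope sketch below uses an assumed textbook-style dose-rate constant for ⁶⁰Co (about 0.35 μGy·h⁻¹ per MBq at 1 m); this value is an illustration, not a figure taken from the monitoring systems described above.

    ```python
    GAMMA_CO60 = 0.35          # uGy * h^-1 * MBq^-1 at 1 m (assumed constant)

    def dose_rate(activity_mbq, distance_m, gamma=GAMMA_CO60):
        """Unshielded point-source dose rate via the inverse-square law."""
        return gamma * activity_mbq / distance_m ** 2

    # 7.4 MBq of Co-60 seen from 5 m
    d5 = dose_rate(7.4, 5.0)
    print(round(d5, 3))        # → 0.104 uGy/h, i.e. ~100 nGy/h above background
    ```

    Comparing such an estimate with the measured background (around 81 nGy h⁻¹ here) indicates whether a source of a given activity would stand out at a given standoff distance.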

  7. Mapping Forest Canopy Height over Continental China Using Multi-Source Remote Sensing Data

    Directory of Open Access Journals (Sweden)

    Xiliang Ni

    2015-06-01

    Full Text Available Spatially-detailed forest height data are useful to monitor the local, regional and global carbon cycle. LiDAR remote sensing can measure three-dimensional forest features, but generating spatially-contiguous forest height maps at a large scale (e.g., continental and global) is problematic because existing LiDAR instruments are still data-limited and expensive. This paper proposes a new approach based on an artificial neural network (ANN) for modeling of forest canopy heights over the China continent. Our model ingests spaceborne LiDAR metrics and multiple geospatial predictors including climatic variables (temperature and precipitation), forest type, tree cover percent and land surface reflectance. The spaceborne LiDAR instrument used in the study is the Geoscience Laser Altimeter System (GLAS), which can provide within-footprint forest canopy heights. The ANN was trained with pairs between spatially discrete LiDAR metrics and fully gridded geo-predictors. This generates valid conjugations to predict heights over the China continent. The ANN-modeled heights were evaluated with three different reference data. First, field-measured tree heights from three experiment sites were used to validate the ANN model predictions. The observed tree heights at the site scale agreed well with the modeled forest heights (R = 0.827 and RMSE = 4.15 m). Second, spatially discrete GLAS observations and a continuous map from the interpolation of GLAS-derived tree heights were separately used to evaluate the ANN model. We obtained R of 0.725 and RMSE of 7.86 m, and R of 0.759 and RMSE of 8.85 m, respectively. Further, inter-comparisons were also performed with two existing forest height maps. Our model showed moderate agreement with the existing satellite-based forest height maps (R = 0.738 and RMSE = 7.65 m; R² = 0.52 and RMSE = 8.99 m). Our results showed that the ANN model developed in this paper is capable of estimating forest heights over the China continent with a
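    The approach (a feed-forward ANN mapping gridded geo-predictors such as temperature, precipitation and tree cover to LiDAR-derived canopy height) can be sketched with a pure-NumPy one-hidden-layer network. The data, layer sizes and training settings below are invented for illustration, unlike the GLAS-trained model described above.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    X = rng.normal(size=(200, 4))                  # 200 samples, 4 geo-predictors
    y = 10 + X @ np.array([3.0, -2.0, 1.5, 0.5])   # synthetic "canopy heights" (m)

    # One hidden layer of 8 tanh units, linear output
    W1 = rng.normal(scale=0.1, size=(4, 8)); b1 = np.zeros(8)
    W2 = rng.normal(scale=0.1, size=(8, 1)); b2 = np.zeros(1)

    losses, lr = [], 0.05
    for _ in range(500):                           # plain batch gradient descent
        h = np.tanh(X @ W1 + b1)
        pred = (h @ W2 + b2).ravel()
        err = pred - y
        losses.append(np.mean(err ** 2))           # mean squared error
        gW2 = h.T @ err[:, None] / len(y); gb2 = err.mean(keepdims=True)
        gh = err[:, None] @ W2.T * (1 - h ** 2)    # backprop through tanh
        gW1 = X.T @ gh / len(y); gb1 = gh.mean(axis=0)
        W2 -= lr * gW2; b2 -= lr * gb2
        W1 -= lr * gW1; b1 -= lr * gb1

    print(losses[0] > losses[-1])                  # training reduces the MSE
    ```

    The real model replaces the toy inputs with GLAS footprint heights as targets and gridded climate, forest-type and reflectance layers as predictors, then applies the trained network to every grid cell to produce the continuous height map.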

  8. A source classification framework supporting pollutant source mapping, pollutant release prediction, transport and load forecasting, and source control planning for urban environments

    DEFF Research Database (Denmark)

    Lützhøft, Hans-Christian Holten; Donner, Erica; Wickman, Tonie

    2012-01-01

    for this purpose. Methods: Existing source classification systems were examined by a multidisciplinary research team, and an optimised SCF was developed. The performance and usability of the SCF were tested using a selection of 25 chemicals listed as priority pollutants in Europe. Results: The SCF is structured...... in the form of a relational database and incorporates both qualitative and quantitative source classification and release data. The system supports a wide range of pollution monitoring and management applications. The SCF functioned well in the performance test, which also revealed important gaps in priority...

  9. WEB MAPPING ARCHITECTURES BASED ON OPEN SPECIFICATIONS AND FREE AND OPEN SOURCE SOFTWARE IN THE WATER DOMAIN

    Directory of Open Access Journals (Sweden)

    C. Arias Muñoz

    2017-09-01

    Full Text Available The availability of water-related data and information across different geographical and jurisdictional scales is of critical importance for the conservation and management of water resources in the 21st century. Today information assets are often found fragmented across multiple agencies that use incompatible data formats and procedures for data collection, storage, maintenance, analysis, and distribution. The growing adoption of Web mapping systems in the water domain is reducing the gap between data availability and its practical use and accessibility. Nevertheless, more attention must be given to the design and development of these systems to achieve high levels of interoperability and usability while fulfilling different end user informational needs. This paper first presents a brief overview of technologies used in the water domain, and then presents three examples of Web mapping architectures based on free and open source software (FOSS) and the use of open specifications (OS) that address different users' needs for data sharing, visualization, manipulation, scenario simulations, and map production. The purpose of the paper is to illustrate how the latest developments in OS for geospatial and water-related data collection, storage, and sharing, combined with the use of mature FOSS projects, facilitate the creation of sophisticated interoperable Web-based information systems in the water domain.

  10. Web Mapping Architectures Based on Open Specifications and Free and Open Source Software in the Water Domain

    Science.gov (United States)

    Arias Muñoz, C.; Brovelli, M. A.; Kilsedar, C. E.; Moreno-Sanchez, R.; Oxoli, D.

    2017-09-01

    The availability of water-related data and information across different geographical and jurisdictional scales is of critical importance for the conservation and management of water resources in the 21st century. Today information assets are often found fragmented across multiple agencies that use incompatible data formats and procedures for data collection, storage, maintenance, analysis, and distribution. The growing adoption of Web mapping systems in the water domain is reducing the gap between data availability and its practical use and accessibility. Nevertheless, more attention must be given to the design and development of these systems to achieve high levels of interoperability and usability while fulfilling different end user informational needs. This paper first presents a brief overview of technologies used in the water domain, and then presents three examples of Web mapping architectures based on free and open source software (FOSS) and the use of open specifications (OS) that address different users' needs for data sharing, visualization, manipulation, scenario simulations, and map production. The purpose of the paper is to illustrate how the latest developments in OS for geospatial and water-related data collection, storage, and sharing, combined with the use of mature FOSS projects facilitate the creation of sophisticated interoperable Web-based information systems in the water domain.

  11. Phenotypic characterization, genetic mapping and candidate gene analysis of a source conferring reduced plant height in sunflower.

    Science.gov (United States)

    Ramos, María Laura; Altieri, Emiliano; Bulos, Mariano; Sala, Carlos A

    2013-01-01

    Reduced height germplasm has the potential to increase stem strength, standability, and also yield potential of the sunflower crop (Helianthus annuus L. var. macrocarpus Ckll.). In this study, we report on the inheritance, mapping, and phenotypic and molecular characterization of a reduced plant height trait in inbred lines derived from the source DDR. This trait is controlled by a semidominant allele, Rht1, which maps on linkage group 12 of the sunflower public consensus map. Phenotypic effects of this allele include shorter height and internode length, insensitivity to exogenous gibberellin application, a normal skotomorphogenetic response, and reduced seed set under self-pollination conditions. This latter effect is presumably related to the reduced pollen viability observed in all DDR-derived lines studied. Rht1 completely cosegregated with a haplotype of the HaDella1 gene sequence. This haplotype consists of a point mutation converting a leucine residue into a proline within the conserved DELLA domain. Taken together, the phenotypic, genetic, and molecular results reported here indicate that Rht1 in sunflower likely encodes an altered DELLA protein. Whether the DELPA motif of the HaDELLA1 sequence in the Rht1-encoded protein by itself determines the observed reduction in height remains to be investigated.

  12. Road-map for smart grids and electricity systems integrating renewable energy sources

    International Nuclear Information System (INIS)

    Rebec, Gaelle; Moisan, Francois; Gioria, Michel

    2009-12-01

    The vision of smart grids and electricity systems elaborated in this road-map was drawn up on the basis of consultation and talks with a group of experts from industry (EDF, AREVA, GDF-Suez), public research bodies (SUPELEC, Ecole des Mines, INES, universities), grid operators (ERDF, RTE), local authorities' groups (FNCCR) and ADEME. In the course of these working sessions the experts expressed their opinions intuitu personae. The views outlined in this road-map should not be equated with the official positions of the corporations or research organisations to which the members of the group belong. The visions of smart electricity grids and systems integrating renewable energies in 2020 and in 2050 are in sharp contrast. This contrast was deliberately sought, for two reasons: - to offer the most exhaustive panorama possible of imaginable futures; - to avoid neglecting a critical technological, organisational or socioeconomic bottleneck that might be associated with a possible scenario left out of the discussion. Accordingly, in seeking contrasting visions the group arrived at extreme representations and even caricatures of the future, which nonetheless help define the outer limit of possibilities, and the scope within which the actual situation will most likely be situated in 2020 and in 2050.

  13. Imaginal discs--a new source of chromosomes for genome mapping of the yellow fever mosquito Aedes aegypti.

    Directory of Open Access Journals (Sweden)

    Maria V Sharakhova

    2011-10-01

    Full Text Available The mosquito Aedes aegypti is the primary global vector for dengue and yellow fever viruses. Sequencing of the Ae. aegypti genome has stimulated research in vector biology and insect genomics. However, the current genome assembly is highly fragmented, with only ~31% of the genome being assigned to chromosomes. A lack of a reliable source of chromosomes for physical mapping has been a major impediment to improving the genome assembly of Ae. aegypti. In this study we demonstrate the utility of mitotic chromosomes from imaginal discs of 4th-instar larvae for cytogenetic studies of Ae. aegypti. High numbers of mitotic divisions on each slide preparation, large sizes, and reproducible banding patterns of the individual chromosomes simplify cytogenetic procedures. Based on the banding structure of the chromosomes, we have developed idiograms for each of the three Ae. aegypti chromosomes and placed 10 BAC clones and an 18S rDNA probe at precise chromosomal positions. The study identified imaginal discs of 4th-instar larvae as a superior source of mitotic chromosomes for Ae. aegypti. The proposed approach allows precise mapping of DNA probes to chromosomal positions and can be utilized for obtaining a high-quality genome assembly of the yellow fever mosquito.

  14. Mapping sources, sinks, and connectivity using a simulation model of Northern Spotted Owls

    Science.gov (United States)

    This is a study of source-sink dynamics at a landscape scale. In conducting the study, we make use of a mature simulation model for the northern spotted owl (Strix occidentalis caurina) that was developed as part of the US Fish and Wildlife Service’s most recent recovery plannin...

  15. Offshore dredger sound: source levels, sound maps and risk assessment (abstract)

    NARCIS (Netherlands)

    Jong, C.A.F. de; Ainslie, M.A.; Heinis, F.; Janmaat, J.

    2013-01-01

    The Port of Rotterdam is expanding to meet the growing demand to accommodate large cargo vessels. One of the licensing conditions was the monitoring of the underwater sound produced during its construction, with an emphasis on the establishment of acoustic source levels of the Trailing Suction

  16. Do your syringes count?

    International Nuclear Information System (INIS)

    Brewster, K.

    2002-01-01

    Full text: This study was designed to investigate anecdotal evidence that residual Sestamibi (MIBI) activity varied in certain situations. For rest studies, different brands of syringes were tested to see if the residuals varied. The period of time MIBI doses remained in the syringe between dispensing and injection was also considered as a possible source of increased residual counts. Stress MIBI syringe residual activities were measured to assess whether the method of stress test affected residual activity. MIBI was reconstituted using 13 GBq of technetium in 3 ml of normal saline, then boiled for 10 minutes. Doses were dispensed according to department protocol and injected via cannula. Residual syringes were collected for three syringe types. In each case the barrel and plunger were measured separately. As the syringe is flushed during the exercise stress test but not the pharmacological stress test, the chosen method was recorded. No relationship was demonstrated between the time MIBI remained in a syringe prior to injection and residual activity. Residual activity was not affected by the method of stress test used. Actual injected activity can be calculated if the amount of activity remaining in the syringe post injection is known. Imaging time can be adjusted for residual activity to optimise count statistics. Preliminary results in this study indicate there is no difference in residual activity between syringe brands. Copyright (2002) The Australian and New Zealand Society of Nuclear Medicine Inc
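The correction described in the abstract — subtracting the measured syringe residual from the dispensed dose, optionally decaying the dose to injection time — can be sketched as follows. The function name, the half-life constant, and the example numbers are illustrative assumptions, not values from the study:

```python
import math

# Tc-99m half-life in minutes (physical constant; the study used Tc-99m MIBI)
TC99M_HALF_LIFE_MIN = 6.0067 * 60

def injected_activity(dispensed_mbq, residual_mbq, minutes_elapsed=0.0):
    """Net injected activity: the dispensed dose decayed to injection time,
    minus the activity measured in the residual syringe (barrel + plunger)."""
    decayed = dispensed_mbq * math.exp(
        -math.log(2) * minutes_elapsed / TC99M_HALF_LIFE_MIN)
    return decayed - residual_mbq

# Illustrative numbers: 900 MBq dispensed, injected 30 min later,
# 40 MBq left in the syringe after injection.
net = injected_activity(900.0, 40.0, minutes_elapsed=30.0)
```

Imaging time could then be scaled by the ratio of planned to net activity to keep count statistics comparable between patients.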

  17. Project and construction of counting system for neutron probe

    International Nuclear Information System (INIS)

    Monteiro, W.P.

    1985-01-01

    A counting system was developed for coupling to a neutron probe, aiming to register pulses produced by slow-neutron interactions in the detector. The neutron probe consists of a fast neutron source, a thermal neutron detector, an amplifier circuit and a pulse counting circuit. The counting system is composed of a counting circuit, a timer and a signal circuit. (M.C.K.)

  18. From vineyard to winery: a source map of microbial diversity driving wine fermentation.

    Science.gov (United States)

    Morrison-Whittle, Peter; Goddard, Matthew R

    2018-01-01

    Humans have been making wine for thousands of years and microorganisms play an integral part in this process, as they not only drive fermentation but also significantly influence the flavour, aroma and quality of finished wines. Since fruits are ephemeral, they cannot comprise a permanent microbial habitat; thus, an age-old unanswered question concerns the origin of fruit- and ferment-associated microbes. Here we use next-generation sequencing approaches to examine and quantify the roles of native forest, vineyard soil, bark and fruit habitats as sources of fungal diversity in ferments. We show that microbial communities in harvested juice and ferments vary significantly across regions, and that while vineyard fungi account for ∼40% of the source of this diversity, uncultivated ecosystems outside of vineyards also prove to be a significant source. We also show that while communities in harvested juice resemble those found on grapes, they increasingly resemble fungi present on vine bark as the ferment proceeds. © 2017 The Authors. Environmental Microbiology published by Society for Applied Microbiology and John Wiley & Sons Ltd.

  19. Measuring trace gas emission from multi-distributed sources using vertical radial plume mapping (VRPM) and backward Lagrangian stochastic (bLS) techniques

    Science.gov (United States)

    Two micrometeorological techniques for measuring trace gas emission rates from distributed area sources were evaluated using a variety of synthetic area sources. The accuracy of the vertical radial plume mapping (VRPM) and the backward Lagrangian (bLS) techniques with an open-path optical spectrosco...

  20. Exploring cosmic origins with CORE: Extragalactic sources in cosmic microwave background maps

    Science.gov (United States)

    De Zotti, G.; González-Nuevo, J.; Lopez-Caniego, M.; Negrello, M.; Greenslade, J.; Hernández-Monteagudo, C.; Delabrouille, J.; Cai, Z.-Y.; Bonato, M.; Achúcarro, A.; Ade, P.; Allison, R.; Ashdown, M.; Ballardini, M.; Banday, A. J.; Banerji, R.; Bartlett, J. G.; Bartolo, N.; Basak, S.; Bersanelli, M.; Biesiada, M.; Bilicki, M.; Bonaldi, A.; Bonavera, L.; Borrill, J.; Bouchet, F.; Boulanger, F.; Brinckmann, T.; Bucher, M.; Burigana, C.; Buzzelli, A.; Calvo, M.; Carvalho, C. S.; Castellano, M. G.; Challinor, A.; Chluba, J.; Clements, D. L.; Clesse, S.; Colafrancesco, S.; Colantoni, I.; Coppolecchia, A.; Crook, M.; D'Alessandro, G.; de Bernardis, P.; de Gasperis, G.; Diego, J. M.; Di Valentino, E.; Errard, J.; Feeney, S. M.; Fernández-Cobos, R.; Ferraro, S.; Finelli, F.; Forastieri, F.; Galli, S.; Génova-Santos, R. T.; Gerbino, M.; Grandis, S.; Hagstotz, S.; Hanany, S.; Handley, W.; Hervias-Caimapo, C.; Hills, M.; Hivon, E.; Kiiveri, K.; Kisner, T.; Kitching, T.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lamagna, L.; Lasenby, A.; Lattanzi, M.; Le Brun, A.; Lesgourgues, J.; Lewis, A.; Liguori, M.; Lindholm, V.; Luzzi, G.; Maffei, B.; Mandolesi, N.; Martinez-Gonzalez, E.; Martins, C. J. A. P.; Masi, S.; Massardi, M.; Matarrese, S.; McCarthy, D.; Melchiorri, A.; Melin, J.-B.; Molinari, D.; Monfardini, A.; Natoli, P.; Notari, A.; Paiella, A.; Paoletti, D.; Partridge, R. B.; Patanchon, G.; Piat, M.; Pisano, G.; Polastri, L.; Polenta, G.; Pollo, A.; Poulin, V.; Quartin, M.; Remazeilles, M.; Roman, M.; Rossi, G.; Roukema, B. F.; Rubiño-Martín, J.-A.; Salvati, L.; Scott, D.; Serjeant, S.; Tartari, A.; Toffolatti, L.; Tomasi, M.; Trappe, N.; Triqueneaux, S.; Trombetti, T.; Tucci, M.; Tucker, C.; Väliviita, J.; van de Weygaert, R.; Van Tent, B.; Vennin, V.; Vielva, P.; Vittorio, N.; Young, K.; Zannoni, M.

    2018-04-01

    We discuss the potential of a next generation space-borne Cosmic Microwave Background (CMB) experiment for studies of extragalactic sources. Our analysis has particular bearing on the definition of the future space project, CORE, that has been submitted in response to ESA's call for a Medium-size mission opportunity as the successor of the Planck satellite. Even though the effective telescope size will be somewhat smaller than that of Planck, CORE will have a considerably better angular resolution at its highest frequencies, since, in contrast with Planck, it will be diffraction limited at all frequencies. The improved resolution implies a considerable decrease of the source confusion, i.e. substantially fainter detection limits. In particular, CORE will detect thousands of strongly lensed high-z galaxies distributed over the full sky. The extreme brightness of these galaxies will make it possible to study them, via follow-up observations, in extraordinary detail. Also, the CORE resolution matches the typical sizes of high-z galaxy proto-clusters much better than the Planck resolution, resulting in a much higher detection efficiency; these objects will be caught in an evolutionary phase beyond the reach of surveys in other wavebands. Furthermore, CORE will provide unique information on the evolution of the star formation in virialized groups and clusters of galaxies up to the highest possible redshifts. Finally, thanks to its very high sensitivity, CORE will detect the polarized emission of thousands of radio sources and, for the first time, of dusty galaxies, at mm and sub-mm wavelengths, respectively.

  1. Clean Hands Count

    Medline Plus


  2. The Big Pumpkin Count.

    Science.gov (United States)

    Coplestone-Loomis, Lenny

    1981-01-01

    Pumpkin seeds are counted after students convert pumpkins to jack-o'-lanterns. Among the activities involved, pupils learn to count by 10s, make estimates, and construct a visual representation of 1,000. (MP)

  3. Map of decentralised energy potential based on renewable energy sources in Croatia

    International Nuclear Information System (INIS)

    Schneider, D. R.; Ban, M.; Duic, N.; Bogdan, Z.

    2005-01-01

    Although the Republic of Croatia is almost completely electrified, there are still regions where the electricity network is not in place or network capacity is insufficient. These regions usually include areas of special state care (underdeveloped, war-affected or depopulated areas), islands, and mountainous areas. However, they often have good renewable energy potential. Decentralised energy generation based on renewable energy sources (wind power, hydropower, solar energy, biomass) has the potential to ensure energy supply to users in remote and often isolated rural areas (off-grid applications). Such applications will primarily be related to the tourism business in mountainous, rural and island/coastal regions. Also, the agriculture, wood-processing and food-processing industries will potentially be interested in the application of decentralised energy generation systems, most likely those using biomass as fuel (for example cogeneration facilities, connected on-grid). (author)

  4. 2013 Kids Count in Colorado! Community Matters

    Science.gov (United States)

    Colorado Children's Campaign, 2013

    2013-01-01

    "Kids Count in Colorado!" is an annual publication of the Children's Campaign, providing state and county level data on child well-being factors including child health, education, and economic status. Since its first release 20 years ago, "Kids Count in Colorado!" has become the most trusted source for data and information on…

  5. Measurement of the 60 GHz ECR ion source using megawatt magnets - SEISM magnetic field map

    International Nuclear Information System (INIS)

    Marie-Jeanne, M.; Jacob, J.; Lamy, T.; Latrasse, L.; Debray, F.; Matera, J.; Pfister, R.; Trophine, C.

    2012-01-01

    LPSC has developed a 60 GHz Electron Cyclotron Resonance (ECR) Ion Source prototype called SEISM. The magnetic structure uses resistive poly-helix coils designed in collaboration with the French National High Magnetic Fields Facility (LNCMI) to produce a CUSP magnetic configuration. A dedicated test bench and appropriate electrical and water-cooling environments were built to study the validity of the mechanics, the thermal behaviour and the magnetic field characteristics obtained at various current intensities. During the last months, measurements were performed for several magnetic configurations, with up to 7000 A applied on the injection and extraction coil sets. The magnetic field achieved at 13000 A is expected to allow the 28 GHz ECR condition, so by extrapolation 60 GHz should be possible at about 28000 A. However, cavitation issues that appeared around 7000 A must be solved before carrying on with the tests. This contribution will recall some of the crucial steps in the prototype fabrication and show preliminary results from the measurements at 7000 A. Possible explanations for the differences observed between the results and the simulation will be given. The paper is followed by the slides of the presentation. (authors)
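The extrapolation quoted in the abstract follows from the linear scaling of the ECR frequency with magnetic field, and of the field with coil current in a resistive magnet. A back-of-envelope check (variable names are illustrative):

```python
# ECR resonance: f = e*B / (2*pi*m_e), so frequency scales linearly with B,
# and B scales linearly with coil current in a resistive magnet.
f_ref_ghz = 28.0      # ECR condition expected at the reference current
i_ref_amp = 13000.0   # reference coil current, from the abstract

def current_for_frequency(f_ghz):
    """Coil current needed for a target ECR frequency, by linear scaling."""
    return i_ref_amp * f_ghz / f_ref_ghz

i_60ghz = current_for_frequency(60.0)  # ~27857 A, i.e. "about 28000 A"
```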

  6. Radiation counting statistics

    Energy Technology Data Exchange (ETDEWEB)

    Suh, M. Y.; Jee, K. Y.; Park, K. K.; Park, Y. J.; Kim, W. H

    1999-08-01

    This report is intended to describe the statistical methods necessary to design and conduct radiation counting experiments and evaluate the data from the experiment. The methods are described for the evaluation of the stability of a counting system and the estimation of the precision of counting data by application of probability distribution models. The methods for the determination of the uncertainty of the results calculated from the number of counts, as well as various statistical methods for the reduction of counting error are also described. (Author). 11 refs., 8 tabs., 8 figs.

  7. Radiation counting statistics

    Energy Technology Data Exchange (ETDEWEB)

    Suh, M. Y.; Jee, K. Y.; Park, K. K. [Korea Atomic Energy Research Institute, Taejon (Korea)

    1999-08-01

    This report is intended to describe the statistical methods necessary to design and conduct radiation counting experiments and evaluate the data from the experiments. The methods are described for the evaluation of the stability of a counting system and the estimation of the precision of counting data by application of probability distribution models. The methods for the determination of the uncertainty of the results calculated from the number of counts, as well as various statistical methods for the reduction of counting error are also described. 11 refs., 6 figs., 8 tabs. (Author)

  8. Radiation counting statistics

    International Nuclear Information System (INIS)

    Suh, M. Y.; Jee, K. Y.; Park, K. K.; Park, Y. J.; Kim, W. H.

    1999-08-01

    This report is intended to describe the statistical methods necessary to design and conduct radiation counting experiments and evaluate the data from the experiment. The methods are described for the evaluation of the stability of a counting system and the estimation of the precision of counting data by application of probability distribution models. The methods for the determination of the uncertainty of the results calculated from the number of counts, as well as various statistical methods for the reduction of counting error are also described. (Author). 11 refs., 8 tabs., 8 figs
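The Poisson statistics these reports describe imply the familiar square-root rule, sigma = sqrt(N), and standard error propagation for a background-subtracted count rate. A minimal sketch (function names and numbers are illustrative, not taken from the reports):

```python
import math

def count_uncertainty(counts):
    """1-sigma uncertainty of a Poisson-distributed number of counts."""
    return math.sqrt(counts)

def net_rate(gross_counts, t_gross, bkg_counts, t_bkg):
    """Net count rate and its 1-sigma uncertainty from a gross and a
    background measurement, by standard error propagation."""
    rate = gross_counts / t_gross - bkg_counts / t_bkg
    sigma = math.sqrt(gross_counts / t_gross**2 + bkg_counts / t_bkg**2)
    return rate, sigma

# Illustrative numbers: 10000 gross counts in 100 s, 400 background counts in 100 s
rate, sigma = net_rate(10000, 100.0, 400, 100.0)
```

For these numbers the net rate is 96 counts/s with roughly 1% relative uncertainty — the kind of precision estimate the reports derive from probability distribution models.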

  9. Thermal mapping on male genital and skin tissues of laptop thermal sources and electromagnetic interaction.

    Science.gov (United States)

    Safari, Mahdi; Mosleminiya, Navid; Abdolali, Ali

    2017-10-01

    Since the development of communication devices and the expansion of their applications, there have been concerns about their harmful health effects. The main aims of this study were to investigate laptop thermal effects caused by simultaneous exposure to electromagnetic fields and thermal sources; to propose a nondestructive, replicable process that is less expensive than clinical measurements; and to study the effects of positioning any new device near the human body in steady state conditions to ensure safety by U.S. and European standard thresholds. A computer simulation was designed to obtain laptop heat flux from a SolidWorks flow simulation. The increase in body temperature due to heat flux was calculated, and antenna radiation was calculated using Computer Simulation Technology (CST) Microwave Studio software. Steady state temperature and specific absorption rate (SAR) distributions in the user's body, and heat flux beneath the laptop, were obtained from the simulations. The laptop in its high performance mode caused a peak two-dimensional heat flux of 420 W/m² beneath it. The cumulative effect of the laptop in high performance mode and 1 W antenna radiation resulted in temperatures of 42.9, 38.1, and 37.2 °C in lap skin, scrotum, and testis, that is, increases of 5.6, 2.1, and 1.4 °C, respectively. Also, 1 W antenna radiation caused peak three-dimensional SARs of 0.37 × 10⁻³ and 0.13 × 10⁻¹ W/kg at 2.4 and 5 GHz, respectively, which can be neglected relative to the standards and to the temperature rise due to laptop use. Bioelectromagnetics. 38:550-558, 2017. © 2017 Wiley Periodicals, Inc.

  10. Computation of probabilistic hazard maps and source parameter estimation for volcanic ash transport and dispersion

    Energy Technology Data Exchange (ETDEWEB)

    Madankan, R. [Department of Mechanical and Aerospace Engineering, University at Buffalo (United States); Pouget, S. [Department of Geology, University at Buffalo (United States); Singla, P., E-mail: psingla@buffalo.edu [Department of Mechanical and Aerospace Engineering, University at Buffalo (United States); Bursik, M. [Department of Geology, University at Buffalo (United States); Dehn, J. [Geophysical Institute, University of Alaska, Fairbanks (United States); Jones, M. [Center for Computational Research, University at Buffalo (United States); Patra, A. [Department of Mechanical and Aerospace Engineering, University at Buffalo (United States); Pavolonis, M. [NOAA-NESDIS, Center for Satellite Applications and Research (United States); Pitman, E.B. [Department of Mathematics, University at Buffalo (United States); Singh, T. [Department of Mechanical and Aerospace Engineering, University at Buffalo (United States); Webley, P. [Geophysical Institute, University of Alaska, Fairbanks (United States)

    2014-08-15

    Volcanic ash advisory centers are charged with forecasting the movement of volcanic ash plumes for aviation, health and safety preparation. Deterministic mathematical equations model the advection and dispersion of these plumes. However, initial plume conditions – height, profile of particle location, volcanic vent parameters – are known only approximately at best, and other features of the governing system, such as the windfield, are stochastic. These uncertainties make forecasting plume motion difficult. As a result, ash advisories based on a deterministic approach tend to be conservative and often over- or underestimate the extent of a plume. This paper presents an end-to-end framework for a probabilistic approach to ash plume forecasting. The framework uses an ensemble of solutions, guided by the Conjugate Unscented Transform (CUT) method for evaluating expectation integrals. This ensemble is used to construct a polynomial chaos expansion that can be sampled cheaply to provide a probabilistic model forecast. The CUT method is then combined with a minimum variance condition to provide a full posterior pdf of the uncertain source parameters, based on observed satellite imagery. The April 2010 eruption of the Eyjafjallajökull volcano in Iceland is employed as a test example. The puff advection/dispersion model is used to hindcast the motion of the ash plume through time, concentrating on the period 14–16 April 2010. Variability in the height and particle loading of that eruption is introduced through a volcano column model called bent. Output uncertainty due to the assumed uncertain input parameter probability distributions, and a probabilistic spatial-temporal estimate of ash presence, are computed.
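The ensemble idea behind such a framework can be illustrated with a deliberately minimal Monte Carlo sketch: sample an uncertain wind, advect a plume centre, and turn the ensemble into a probability. All names, numbers and the 1-D advection model are illustrative assumptions, not the paper's puff/CUT/polynomial-chaos machinery:

```python
import random

random.seed(0)  # fixed seed so the toy ensemble is reproducible

def plume_centre(x0_km, wind_mean_kmh, wind_sigma_kmh, hours):
    """Toy 1-D plume centre after `hours` of advection with a wind speed
    drawn from a Gaussian distribution (the uncertain input)."""
    wind = random.gauss(wind_mean_kmh, wind_sigma_kmh)
    return x0_km + wind * hours

# Monte Carlo ensemble of plume positions under wind uncertainty
ensemble = [plume_centre(0.0, 30.0, 8.0, 24.0) for _ in range(5000)]

def prob_beyond(threshold_km):
    """Probability that the plume centre travels past a given distance --
    conceptually, one pixel of a probabilistic hazard map."""
    return sum(x > threshold_km for x in ensemble) / len(ensemble)

p = prob_beyond(720.0)  # threshold at the mean 24 h travel distance
```

Methods like CUT and polynomial chaos exist precisely to replace this brute-force sampling with far fewer, carefully chosen model runs.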

  11. Application of EM holographic methods to borehole vertical electric source data to map a fuel oil spill

    International Nuclear Information System (INIS)

    Bartel, L.C.

    1993-01-01

    The multifrequency, multisource holographic method used in the analysis of seismic data is extended to electromagnetic (EM) data within the audio frequency range. The method is applied to the secondary magnetic fields produced by a borehole vertical electric source (VES). The holographic method is a numerical reconstruction procedure based on the double focusing principle for both the source array and the receiver array. The approach used here is to Fourier transform the constructed image from frequency space to time space and set time equal to zero. The image is formed when the in-phase (real) part is a maximum or the out-of-phase (imaginary) part is a minimum; i.e., the EM wave is phase coherent at its origination. In the application here the secondary magnetic fields are treated as scattered fields. In the numerical reconstruction, the seismic analog of the wave vector is used; i.e., the imaginary part of the actual wave vector is ignored. The multifrequency, multisource holographic method is applied to calculated model data and to actual field data acquired to map a diesel fuel oil spill

  12. Applications of EM holographic methods to borehole vertical electric source data to map a fuel oil spill

    International Nuclear Information System (INIS)

    Bartel, L.C.

    1993-01-01

    The multifrequency, multisource holographic method used in the analysis of seismic data is extended to electromagnetic (EM) data within the audio frequency range. The method is applied to the secondary magnetic fields produced by a borehole vertical electric source (VES). The holographic method is a numerical reconstruction procedure based on the double focusing principle for both the source array and the receiver array. The approach used here is to Fourier transform the constructed image from frequency space to time space and set time equal to zero. The image is formed when the in-phase (real) part is a maximum or the out-of-phase (imaginary) part is a minimum; i.e., the EM wave is phase coherent at its origination. In the application here the secondary magnetic fields are treated as scattered fields. In the numerical reconstruction, the seismic analog of the wave vector is used; i.e., the imaginary part of the actual wave vector is ignored. The multifrequency, multisource holographic method is applied to calculated model data and to actual field data acquired to map a diesel fuel oil spill
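The phase-coherence criterion described in these abstracts — the real part of the reconstructed image peaks where the back-propagated waves are in phase, i.e. at the source — can be illustrated with a toy, dimensionless, single-scatterer sketch. This is a generic multifrequency focusing illustration under assumed names and units, not the paper's reconstruction of VES magnetic-field data:

```python
import cmath

def record_phase(rx, src, freq):
    """Field phase recorded at receiver position rx from a point source at
    src (dimensionless units, unit phase velocity, so k = 2*pi*freq)."""
    k = 2 * cmath.pi * freq
    return cmath.exp(-1j * k * abs(rx - src))

def focus(point, receivers, data, freqs):
    """Back-propagate the multifrequency data to a trial image point and sum.
    The real (in-phase) part is maximal where all contributions cohere,
    i.e. at the true source position."""
    total = 0.0
    for f in freqs:
        k = 2 * cmath.pi * f
        for rx in receivers:
            total += (data[(rx, f)] * cmath.exp(1j * k * abs(rx - point))).real
    return total

receivers = [0.0, 0.5, 1.2]
freqs = [float(n) for n in range(1, 11)]
true_src = 4.0
data = {(rx, f): record_phase(rx, true_src, f) for rx in receivers for f in freqs}

# Scan trial points: the focus value peaks at the true source location.
scores = {x: focus(x, receivers, data, freqs) for x in (3.5, 4.0, 4.5)}
```

At the true position every frequency and receiver contributes exactly 1 to the real part; away from it the phases disagree and the contributions partially cancel.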

  13. Mining the Geophysical Research Abstracts Corpus: Mapping the impact of Free and Open Source Software on the EGU Divisions

    Science.gov (United States)

    Löwe, Peter; Klump, Jens; Robertson, Jesse

    2015-04-01

    Text mining is commonly employed as a tool in data science to investigate and chart emergent information from corpora of research abstracts, such as the Geophysical Research Abstracts (GRA) published by Copernicus. In this context, current standards, such as persistent identifiers like DOI and ORCID, allow us to trace, cite and map links between journal publications, the underlying research data and scientific software. This network can be expressed as a directed graph, which enables us to chart networks of cooperation and innovation, thematic foci and the locations of research communities in time and space. However, this approach of data science, focusing on the research process in a self-referential manner rather than on the topical work, is still at a developing stage. Scientific work presented at the EGU General Assembly is often the first step towards new approaches and innovative ideas in the geospatial community. It represents a rich, deep and heterogeneous source of geoscientific thought. This corpus is a significant data source for data science which has not been analysed on this scale previously. In this work, the corpus of the Geophysical Research Abstracts is used for the first time as a database for analyses of topical text mining. For this, we used a sturdy and customizable software framework based on the work of Schmitt et al. [1]. For the analysis we used the High Performance Computing infrastructure of the German Research Centre for Geosciences GFZ in Potsdam, Germany. Here, we report on the first results from the analysis of the continuous spread of the use of Free and Open Source Software tools (FOSS) within the EGU communities, mapping the general increase of FOSS-themed GRA articles in the last decade and the developing spatial patterns of the involved parties and FOSS topics. References: [1] Schmitt, L. M., Christianson, K. T., Gupta, R.: Linguistic Computing with UNIX Tools, in Kao, A., Poteet, S. R. (Eds.): Natural Language Processing and Text

  14. Integrating pipeline data management application and Google maps dataset on web based GIS application using open source technology Sharp Map and Open Layers

    Energy Technology Data Exchange (ETDEWEB)

    Wisianto, Arie; Sania, Hidayatus [PT PERTAMINA GAS, Bontang (Indonesia); Gumilar, Oki [PT PERTAMINA GAS, Jakarta (Indonesia)

    2010-07-01

    PT Pertamina Gas operates 3 pipe segments carrying natural gas from producers to PT Pupuk Kaltim in the Kalimantan area. The company wants to build a pipeline data management system consisting of pipeline facilities, inspections and risk assessments which would run on a Geographic Information Systems (GIS) platform. The aim of this paper is to present the integration of the pipeline data management system with GIS. A web based GIS application is developed using a combination of Google Maps datasets with local spatial datasets. In addition, Open Layers is used to integrate the pipeline data model and the Google Maps dataset into a single map display on Sharp Map. The GIS based pipeline data management system developed herein constitutes a low cost, powerful and efficient web based GIS solution.

  15. MEAPA - Integrated methodology for alternative energy sources mapping in the State of Para - Brazil; MEAPA - Metodologias integradas para o mapeamento de energias alternativas no Estado do Para

    Energy Technology Data Exchange (ETDEWEB)

    Monteiro, Claudio; Lopes, J. Pecas; Va, Kowk P.; Herold, Helmut [Instituto de Engenharia de Sistemas e Computadores (INESC), Porto (Portugal). Power System Unit. E-mail: cmonteiro@inescn.pt; Rocha, Brigida; Pinheiro, Helten; Rocha, Olavo [Para Univ., Belem, PA (Brazil). Dept. de Engenharia Eletrica. E-mail: rocha@interconect.com.br; Silva, Isa O. [Para Univ., Belem, PA (Brazil). Dept. de Meteorologia; Moraes, Sinfronio [Para Univ., Belem, PA (Brazil). Dept. de Mecanica

    1999-07-01

    This paper describes the MEAPA project for the development of methodologies which support the integration of renewable energy sources on Marajo island - Para - Brazil. The methodologies used are described, including the construction of a geographical database, renewable resource (wind, solar and biomass) mapping, transport costs, the cost of electric power produced by various systems, electrification scenarios, and the comparison of electrification solutions.

  16. Mapping of wind energy potential over the Gobi Desert in Northwest China based on multiple sources of data

    Science.gov (United States)

    Li, Li; Wang, Xinyuan; Luo, Lei; Zhao, Yanchuang; Zong, Xin; Bachagha, Nabil

    2018-06-01

    In recent years, wind energy has been a fast-growing alternative source of electrical power due to its sustainability. In this paper, the wind energy potential over the Gobi Desert in Northwest China is assessed at the patch scale using geographic information systems (GIS). Data on land cover, topography, and administrative boundaries and 11 years (2000‒2010) of wind speed measurements were collected and used to map and estimate the region's wind energy potential. Based on the results, it was found that continuous regions of geographical potential (GeoP) are located in the middle of the research area (RA), with scattered areas of similar GeoP found in other regions. The results also show that the technical potential (TecP) levels are about 1.72‒2.67 times (2.20 times on average) higher than the actual levels. It was found that the GeoP patches can be divided into four classes: unsuitable regions, suitable regions, more suitable regions, and the most suitable regions. The GeoP estimation shows that 0.41 billion kW of wind energy is potentially available in the RA. The suitable regions account for 25.49% of the RA, the more suitable regions for 24.45%, and the most suitable regions for more than half. It is also shown that Xinjiang and Gansu are more suitable for wind power development than Ningxia.

  17. Animal movement network analysis as a tool to map farms serving as contamination source in cattle cysticercosis

    Directory of Open Access Journals (Sweden)

    Samuel C. Aragão

    Full Text Available ABSTRACT: Bovine cysticercosis is a problem distributed worldwide that results in economic losses mainly due to the condemnation of infected carcasses. One of the difficulties in applying control measures is the identification of the source of infection, especially because cattle are typically acquired from multiple farms. Here, we tested the utility of an animal movement network, constructed with data from a farm that acquires cattle from several different farms, to map the major contributors to cysticercosis propagation. Additionally, based on the results of the network analysis, we deployed a sanitary management and drug treatment scheme to decrease the occurrence of cysticercosis on the farm. Six farms that had commercial trades were identified by the animal movement network and characterized as the main contributors to the occurrence of cysticercosis on the studied farm. The identification of farms with a putative risk of Taenia saginata infection using the animal movement network, along with proper sanitary management and drug treatment, resulted in a gradual decrease in cysticercosis prevalence, from 25% in 2010 to 3.7% in 2011 and 1.8% in 2012. These results suggest that the animal movement network can contribute towards controlling bovine cysticercosis, thus minimizing economic losses and preventing human taeniasis.

  18. Sources

    International Nuclear Information System (INIS)

    Duffy, L.P.

    1991-01-01

    This paper discusses the sources of radiation in the narrow perspective of radioactivity, and the even narrower perspective of those sources that concern environmental management and restoration activities at DOE facilities, as well as a few related sources: sources of irritation, sources of inflammatory jingoism, and sources of information. First, the sources of irritation fall into three categories: there is no reliable scientific ombudsman to speak without bias and prejudice for the public good; technical jargon with unclear definitions exists within the radioactive nomenclature; and the scientific community keeps a low profile with regard to public information. The next area of personal concern is the sources of inflammation. These include such things as: plutonium being described as the most dangerous substance known to man; the amount of plutonium required to make a bomb; talk of transuranic waste containing plutonium and its health effects; TMI-2 and Chernobyl being described as Siamese twins; inadequate information on low-level disposal sites and current regulatory requirements under 10 CFR 61; and enhanced engineered waste disposal not being presented to the public accurately. Finally, there are numerous sources of disinformation regarding low-level and high-level radiation, the elusive nature of the scientific community, the resources of Federal and State health agencies to address comparative risk, and regulatory agencies speaking out without the support of the scientific community

  19. Platelet Count and Plateletcrit

    African Journals Online (AJOL)

    demonstrated that neonates with late onset sepsis (bacteremia after 3 days of age) had a dramatic increase in MPV and PDW [18]. We hypothesize that as the MPV and PDW increase and the platelet count and PCT decrease in sick children, intuitively, the ratio of MPV to PCT, MPV to platelet count, PDW to PCT, PDW to platelet ...

  20. EcoCount

    Directory of Open Access Journals (Sweden)

    Phillip P. Allen

    2014-05-01

    Full Text Available Techniques that analyze biological remains from sediment sequences for environmental reconstructions are well established and widely used. Yet, identifying, counting, and recording biological evidence such as pollen grains remains a highly skilled, demanding, and time-consuming task. Standard procedure requires the classification and recording of between 300 and 500 pollen grains from each representative sample. Recording the data from a pollen count requires significant effort and focused resources from the palynologist. However, when an adaptation to the recording procedure is utilized, efficiency and time economy improve. We describe EcoCount, which represents a development in environmental data recording procedure. EcoCount is a voice-activated, fully customizable digital count sheet that allows the investigator to interact continuously with the field of view during data recording. Continuous viewing allows the palynologist to remain engaged with the essential task, identification, for longer, making pollen counting more efficient and economical. EcoCount is a versatile software package that can be used to record a variety of environmental evidence and can be installed on different computer platforms, making adoption by users and laboratories simple and inexpensive. The user-friendly format of EcoCount allows any novice to become competent and functional in a very short time.

  1. Clean Hands Count

    Medline Plus


  2. Counting It Twice.

    Science.gov (United States)

    Schattschneider, Doris

    1991-01-01

    Provided are examples from many domains of mathematics that illustrate the Fubini Principle in its discrete version: the value of a summation over a rectangular array is independent of the order of summation. Included are: counting using partitions as in proof by pictures, combinatorial arguments, indirect counting as in the inclusion-exclusion…
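
    The discrete Fubini Principle stated above can be checked directly in code. The following is an illustrative sketch only (not taken from the article): summing a rectangular array row-first or column-first yields the same total.

```python
# Discrete Fubini principle: the value of a summation over a
# rectangular array is independent of the order of summation.
array = [
    [1, 2, 3],
    [4, 5, 6],
    [7, 8, 9],
]

# Sum each row, then add the row totals.
row_first = sum(sum(row) for row in array)

# Sum each column, then add the column totals.
col_first = sum(sum(row[j] for row in array) for j in range(len(array[0])))

print(row_first, col_first)  # both orders give 45
```

    Counting the same set two ways, as in the puzzles of the article, is exactly this identity applied to an incidence array.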

  3. sources

    Directory of Open Access Journals (Sweden)

    Shu-Yin Chiang

    2002-01-01

    Full Text Available In this paper, we study simplified models of the ATM (Asynchronous Transfer Mode) multiplexer network with Bernoulli random traffic sources. Based on the model, the performance measures are analyzed under different output service schemes.
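
    As a rough illustration of this kind of model (all parameters here are assumptions for the sketch, not values from the paper), a discrete-time multiplexer fed by independent Bernoulli sources and serving one cell per slot can be simulated in a few lines:

```python
import random

def simulate_multiplexer(n_sources=4, p=0.2, slots=20000, seed=1):
    """Discrete-time multiplexer: each slot, every source emits one
    cell with probability p; the output link serves one cell per slot.
    Returns (observed arrival rate, time-averaged queue length)."""
    rng = random.Random(seed)
    queue = 0
    total_arrivals = 0
    queue_sum = 0
    for _ in range(slots):
        arrivals = sum(rng.random() < p for _ in range(n_sources))
        total_arrivals += arrivals
        queue += arrivals
        if queue > 0:          # serve one cell in this slot
            queue -= 1
        queue_sum += queue
    return total_arrivals / slots, queue_sum / slots

rate, mean_queue = simulate_multiplexer()
```

    The queue is stable only when the offered load n_sources * p is below one cell per slot; here the load is 0.8, so the observed arrival rate settles near that value while the queue stays bounded.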

  4. Connecting smoke plumes to sources using Hazard Mapping System (HMS) smoke and fire location data over North America

    Science.gov (United States)

    Brey, Steven J.; Ruminski, Mark; Atwood, Samuel A.; Fischer, Emily V.

    2018-02-01

    Fires represent an air quality challenge because they are large, dynamic and transient sources of particulate matter and ozone precursors. Transported smoke can deteriorate air quality over large regions. Fire severity and frequency are likely to increase in the future, exacerbating an existing problem. Using the National Environmental Satellite, Data, and Information Service (NESDIS) Hazard Mapping System (HMS) smoke data for North America for the period 2007 to 2014, we examine a subset of fires that are confirmed to have produced sufficient smoke to warrant the initiation of a U.S. National Weather Service smoke forecast. We find that gridded HMS-analyzed fires are well correlated (r = 0.84) with emissions from the Global Fire Emissions Inventory Database 4s (GFED4s). We define a new metric, smoke hours, by linking observed smoke plumes to active fires using ensembles of forward trajectories. This work shows that the Southwest, Northwest, and Northwest Territories initiate the most air quality forecasts and produce more smoke than any other North American region by measure of the number of HYSPLIT points analyzed, the duration of those HYSPLIT points, and the total number of smoke hours produced. The average number of days with smoke plumes overhead is largest over the north-central United States. Only Alaska, the Northwest, the Southwest, and Southeast United States regions produce the majority of smoke plumes observed over their own borders. This work moves a new dataset from a daily operational setting to a research context, and it demonstrates how changes to the frequency or intensity of fires in the western United States could impact other regions.

  5. Novel Data Sources for Women’s Health Research: Mapping Breast Screening Online Information Seeking Through Google Trends

    Science.gov (United States)

    Dehkordy, Soudabeh Fazeli; Carlos, Ruth C.; Hall, Kelli S.; Dalton, Vanessa K.

    2015-01-01

    Rationale and Objectives: Millions of people use online search engines every day to find health-related information, and voluntarily share their personal health status and behaviors on various Web sites. Thus, data from tracking of online information seekers' behavior offer potential opportunities for use in public health surveillance and research. Google Trends is a feature of Google which allows internet users to graph the frequency of searches for a single term or phrase over time or by geographic region. We used Google Trends to describe patterns of information-seeking behavior on the subject of dense breasts and to examine their correlation with the passage or introduction of dense breast notification legislation. Materials and Methods: In order to capture the temporal variations of information seeking about dense breasts, the web search query “dense breast” was entered in the Google Trends tool. We then mapped the dates of legislative actions regarding dense breasts that received widespread coverage in the lay media to information-seeking trends about dense breasts over time. Results: Newsworthy events and legislative actions appear to correlate well with peaks in search volume of “dense breast”. Geographic regions with the highest search volumes have either passed, denied, or are currently considering dense breast legislation. Conclusions: Our study demonstrated that legislative action and the respective news coverage correlate with increases in information seeking for “dense breast” on Google, suggesting that Google Trends has the potential to serve as a data source for policy-relevant research. PMID:24998689

  6. NO2 and SO2 dispersion modeling and relative roles of emission sources over Map Ta Phut industrial area, Thailand.

    Science.gov (United States)

    Chusai, Chatinai; Manomaiphiboon, Kasemsan; Saiyasitpanich, Phirun; Thepanondh, Sarawut

    2012-08-01

    Map Ta Phut industrial area (MA) is the largest industrial complex in Thailand. There has been concern about many air pollutants over this area. Air quality management for the area is known to be difficult, due to lack of understanding of how emissions from different sources or sectors (e.g., industrial, power plant, transportation, and residential) contribute to air quality degradation in the area. In this study, a dispersion study of NO2 and SO2 was conducted using the AERMOD model. Area-specific emission inventories of NOx and SO2 were prepared, including both stack and nonstack sources, and divided into 11 emission groups. Annual simulations were performed for the year 2006. Modeled concentrations were evaluated against observations. Underestimation of both pollutants was found, and stack emission estimates were scaled to improve the modeled results before quantifying the relative roles of individual emission groups in the ambient concentration over four selected impacted areas (two are residential and the others are highly industrialized). Two concentration measures (i.e., annual average area-wide concentration, AC, and area-wide robust highest concentration, AR) were used to aggregately represent mean and high-end concentrations for each individual area, respectively. For AC-NO2, on-road mobile emissions were found to be the largest contributor in the two residential areas (36-38% of total AC-NO2), while petrochemical-industry emissions play the most important role in the two industrialized areas (34-51%). For AR-NO2, biomass burning has the most influence in all impacted areas (>90%) except for one residential area where on-road mobile is the largest (75%). For AC-SO2, the petrochemical industry contributes most in all impacted areas (38-56%). For AR-SO2, the results vary. Since the petrochemical industry was often identified as the major contributor despite not being the largest emitter, air quality workers should pay special attention to this emission group.

  7. Mapping Henry: Synchrotron-sourced X-ray fluorescence mapping and ultra-high-definition scanning of an early Tudor portrait of Henry VIII

    Energy Technology Data Exchange (ETDEWEB)

    Dredge, Paula; Ives, Simon [Art Gallery of New South Wales (AGNSW), Sydney, NSW (Australia); Howard, Daryl L.; Spiers, Kathryn M. [Australian Synchrotron, Clayton, VIC (Australia); Yip, Andrew [Art Gallery of New South Wales (AGNSW), Sydney, NSW (Australia); University of New South Wales, Laboratory for Innovation in Galleries, Libraries, Archives and Museums (iGLAM), National Institute for Experimental Arts, Sydney, NSW (Australia); Kenderdine, Sarah [University of New South Wales, Laboratory for Innovation in Galleries, Libraries, Archives and Museums (iGLAM), National Institute for Experimental Arts, Sydney, NSW (Australia)

    2015-11-15

    A portrait of Henry VIII on oak panel c. 1535 has recently undergone technical examination to inform questions regarding authorship and the painting's relationship to a group of similar works in the collections of the National Portrait Gallery, London, and the Society of Antiquaries. Due to previous conservation treatments of the painting, the conventional transmission X-radiograph image was difficult to interpret. As a result, the painting underwent high-definition X-ray fluorescence (XRF) elemental mapping on the X-ray fluorescence microscopy beamline of the Australian Synchrotron. Scans were conducted at 12.6 and 18.5 keV, below and above the lead (Pb) L edges, respectively. Typical scan parameters were 120 μm pixel size at 7 ms dwell time, with the largest scan covering an area of 545 x 287 mm² collected in 23 h (10.8 MP). XRF mapping of the panel has guided the conservation treatment of the painting and the revelation of previously obscured features. It has also provided insight into the process of making the painting. The informative and detailed elemental maps, alongside ultra-high-definition scans of the painting undertaken before and after varnish and over-paint removal, have assisted in comparison of the finely painted details with the London paintings. The resolution offered by the combination of imaging techniques identifies pigment distribution at an extremely fine scale, enabling a new understanding of the artist's paint application. (orig.)

  8. Mapping Henry: Synchrotron-sourced X-ray fluorescence mapping and ultra-high-definition scanning of an early Tudor portrait of Henry VIII

    International Nuclear Information System (INIS)

    Dredge, Paula; Ives, Simon; Howard, Daryl L.; Spiers, Kathryn M.; Yip, Andrew; Kenderdine, Sarah

    2015-01-01

    A portrait of Henry VIII on oak panel c. 1535 has recently undergone technical examination to inform questions regarding authorship and the painting's relationship to a group of similar works in the collections of the National Portrait Gallery, London, and the Society of Antiquaries. Due to previous conservation treatments of the painting, the conventional transmission X-radiograph image was difficult to interpret. As a result, the painting underwent high-definition X-ray fluorescence (XRF) elemental mapping on the X-ray fluorescence microscopy beamline of the Australian Synchrotron. Scans were conducted at 12.6 and 18.5 keV, below and above the lead (Pb) L edges, respectively. Typical scan parameters were 120 μm pixel size at 7 ms dwell time, with the largest scan covering an area of 545 x 287 mm² collected in 23 h (10.8 MP). XRF mapping of the panel has guided the conservation treatment of the painting and the revelation of previously obscured features. It has also provided insight into the process of making the painting. The informative and detailed elemental maps, alongside ultra-high-definition scans of the painting undertaken before and after varnish and over-paint removal, have assisted in comparison of the finely painted details with the London paintings. The resolution offered by the combination of imaging techniques identifies pigment distribution at an extremely fine scale, enabling a new understanding of the artist's paint application. (orig.)

  9. Clean Hands Count

    Medline Plus


  10. Clean Hands Count

    Medline Plus


  11. Housing Inventory Count

    Data.gov (United States)

    Department of Housing and Urban Development — This report displays the data communities reported to HUD about the nature of their dedicated homeless inventory, referred to as their Housing Inventory Count (HIC)....

  12. Scintillation counting apparatus

    International Nuclear Information System (INIS)

    Noakes, J.E.

    1978-01-01

    Apparatus is described for the accurate measurement of radiation by means of scintillation counters and in particular for the liquid scintillation counting of both soft beta radiation and gamma radiation. Full constructional and operating details are given. (UK)

  13. Allegheny County Traffic Counts

    Data.gov (United States)

    Allegheny County / City of Pittsburgh / Western PA Regional Data Center — Traffic sensors at over 1,200 locations in Allegheny County collect vehicle counts for the Pennsylvania Department of Transportation. Data included in the Health...

  14. Counting Knights and Knaves

    Science.gov (United States)

    Levin, Oscar; Roberts, Gerri M.

    2013-01-01

    To understand better some of the classic knights and knaves puzzles, we count them. Doing so reveals a surprising connection between puzzles and solutions, and highlights some beautiful combinatorial identities.

  15. Clean Hands Count

    Medline Plus


  16. Clean Hands Count

    Medline Plus


  17. The Kruskal Count

    OpenAIRE

    Lagarias, Jeffrey C.; Rains, Eric; Vanderbei, Robert J.

    2001-01-01

    The Kruskal Count is a card trick invented by Martin J. Kruskal in which a magician "guesses" a card selected by a subject according to a certain counting procedure. With high probability the magician can correctly "guess" the card. The success of the trick is based on a mathematical principle related to coupling methods for Markov chains. This paper analyzes in detail two simplified variants of the trick and estimates the probability of success. The model predictions are compared with simula...
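
    A minimal simulation of the trick shows the coupling empirically. This sketch assumes the common counting convention (aces count 1, cards 2-10 their face value, court cards 5); the abstract does not fix these details.

```python
import random

def card_value(rank):
    """Steps to advance: ace = 1, 2-10 at face value, J/Q/K = 5."""
    return rank if rank <= 10 else 5

def key_card(deck, start):
    """Follow the Kruskal counting procedure from position `start`:
    repeatedly jump forward by the value of the current card, and
    return the index of the last card reached (the key card)."""
    i = start
    while i + card_value(deck[i]) < len(deck):
        i += card_value(deck[i])
    return i

def coupling_rate(trials=2000, seed=0):
    """Fraction of shuffles where the magician's chain (starting at the
    first card) ends on the same key card as the subject's chain
    (starting at a secret position among the first ten cards)."""
    rng = random.Random(seed)
    ranks = [r for r in range(1, 14) for _ in range(4)]  # 52-card deck
    hits = 0
    for _ in range(trials):
        deck = ranks[:]
        rng.shuffle(deck)
        hits += key_card(deck, 0) == key_card(deck, rng.randrange(10))
    return hits / trials

rate = coupling_rate()
```

    With these conventions the empirical success rate comes out well above one half, consistent with the "high probability" coupling of Markov chains that the abstract describes.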

  18. Analysis of spatial count data using Kalman smoothing

    DEFF Research Database (Denmark)

    Dethlefsen, Claus

    2007-01-01

    We consider spatial count data from an agricultural field experiment. Counts of weed plants in a field have been recorded in a project on precision farming. Interest is in mapping the weed intensity so that the dose of herbicide applied at any location can be adjusted to the amount of weed present...
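
    The abstract does not give the model details, so the following is only a generic sketch of the Kalman smoothing idea: a scalar random-walk intensity observed in noise, filtered forward and then refined with a Rauch-Tung-Striebel (RTS) backward pass. All noise levels here are assumptions for illustration.

```python
import random

def rts_smooth(obs, q=0.01, r=1.0):
    """Kalman filter + RTS smoother for a scalar random walk
    x_t = x_{t-1} + w_t observed as y_t = x_t + v_t, with process
    variance q and observation variance r."""
    n = len(obs)
    xf, pf = [0.0] * n, [0.0] * n      # filtered mean / variance
    xp, pp = [0.0] * n, [0.0] * n      # one-step predictions
    x, p = obs[0], r
    for t in range(n):
        xp[t], pp[t] = x, p + q        # predict
        k = pp[t] / (pp[t] + r)        # Kalman gain
        x = xp[t] + k * (obs[t] - xp[t])
        p = (1.0 - k) * pp[t]
        xf[t], pf[t] = x, p
    xs = xf[:]                          # backward smoothing pass
    for t in range(n - 2, -1, -1):
        g = pf[t] / pp[t + 1]
        xs[t] = xf[t] + g * (xs[t + 1] - xp[t + 1])
    return xs

# Synthetic example: recover a slowly varying intensity from noisy data.
rng = random.Random(3)
truth, x = [], 0.0
for _ in range(300):
    x += rng.gauss(0.0, 0.1)
    truth.append(x)
obs = [t + rng.gauss(0.0, 1.0) for t in truth]
smoothed = rts_smooth(obs, q=0.01, r=1.0)

def mse(est):
    return sum((e - t) ** 2 for e, t in zip(est, truth)) / len(truth)
```

    Because the smoother uses observations on both sides of each location, its error is far below that of the raw observations, which is what makes mapping a smooth intensity field from noisy counts feasible.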

  19. Complete Mapping of Complex Disulfide Patterns with Closely-Spaced Cysteines by In-Source Reduction and Data-Dependent Mass Spectrometry

    DEFF Research Database (Denmark)

    Cramer, Christian N; Kelstrup, Christian D; Olsen, Jesper V

    2017-01-01

    Mapping of disulfide bonds is an essential part of protein characterization to ensure correct cysteine pairings. For this, mass spectrometry (MS) is the most widely used technique due to fast and accurate characterization. However, MS-based disulfide mapping is challenged when multiple disulfide bonds are present in complicated patterns. This includes the presence of disulfide bonds in nested patterns and closely spaced cysteines. Unambiguous mapping of such disulfide bonds typically requires advanced MS approaches. In this study, we exploited in-source reduction (ISR) of disulfide bonds during the electrospray ionization process to facilitate disulfide bond assignments. We successfully developed a LC-ISR-MS/MS methodology to use as an online and fully automated partial reduction procedure. Postcolumn partial reduction by ISR provided fast and easy identification of peptides involved in disulfide bonding.

  20. Identification and characterization of natural-source scenarios and their representation by means of radiological maps

    International Nuclear Information System (INIS)

    Dominguez Ley, Orlando; Capote Ferrera, Eduardo; Alonso Abad, Dolores; Gil Castillo, Reynaldo; Castillo Gomez, Rafael; Ramos Viltre, Enma O.

    2008-01-01

    The National Network of Environmental Radiological Surveillance of the Republic of Cuba (NNERS) has among its main functions the permanent radiological control of the environment over the whole national territory, monitoring, among other indicators, the environmental gamma dose rate. This monitoring is carried out at 18 radiological posts distributed across the country, which give an idea of the atmospheric gamma background of a given region. The influence of natural-source scenarios on the measured gamma dose rate results in variability within a single region; even over distances as short as 500 meters the variation can be 30 to 40 nGy/h. Consequently, when estimating the exposure dose to which the population is subjected, a large error would be introduced if only the regional average gamma dose rate were used. A study of the environmental gamma background was carried out in Havana City using a mobile gamma dose rate system with high sensitivity and a very short response time. The system registers the position and the gamma dose rate in real time. The monitoring was carried out from an automobile, using a measurement interval of 10 seconds. The obtained results were classified into measurement ranges, associated with different color codes, and displayed on a map for better interpretation and visualization; computer tools were developed for this purpose. The average dose rate in Havana City (taking into account the contribution of the cosmic radiation) is 55.6 nGy/h, similar to the national historical average reported by the NNERS, which is 55.3 nGy/h. The municipality with the highest dose rate was Cerro, with 61.3 nGy/h, and that with the lowest value was San Miguel del Padron, with 51.9 nGy/h. If we discard the contribution of the cosmic radiation, the dose rate in air in Havana City is below 50 % of the world average. (author)

  1. Low White Blood Cell Count

    Science.gov (United States)

    Symptoms: Low white blood cell count. By Mayo Clinic Staff. A low white blood cell count (leukopenia) is a decrease ... of white blood cell (neutrophil). The definition of a low white blood cell count varies from one medical ...

  2. A procedure for merging land cover/use data from LANDSAT, aerial photography, and map sources: Compatibility, accuracy, and cost. Remote Sensing Project

    Science.gov (United States)

    Enslin, W. R.; Tilmann, S. E.; Hill-Rowley, R.; Rogers, R. H.

    1977-01-01

    Regional planning agencies are currently expressing a need for detailed land cover/use information to effectively meet the requirements of various federal programs. Individual data sources have advantages and limitations in fulfilling this need, both in terms of time/cost and technological capability. A methodology has been developed to merge land cover/use data from LANDSAT, aerial photography and map sources to maximize the effective use of a variety of data sources in the provision of an integrated information system for regional analysis. A test of the proposed inventory method is currently under way in four central Michigan townships. This test will evaluate the compatibility, accuracy and cost of the integrated method with reference to inventories developed from a single data source, and determine both the technological feasibility and analytical potential of such a system.

  3. Absolute nuclear material assay using count distribution (LAMBDA) space

    Science.gov (United States)

    Prasad, Manoj K [Pleasanton, CA]; Snyderman, Neal J [Berkeley, CA]; Rowland, Mark S [Alamo, CA]

    2012-06-05

    A method of absolute nuclear material assay of an unknown source comprising counting neutrons from the unknown source and providing an absolute nuclear material assay utilizing a model to optimally compare to the measured count distributions. In one embodiment, the step of providing an absolute nuclear material assay comprises utilizing a random sampling of analytically computed fission chain distributions to generate a continuous time-evolving sequence of event-counts by spreading the fission chain distribution in time.
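
    The patent abstract gives no implementation details. As background intuition only, the count-distribution idea rests on the fact that fission chains make the neutron count distribution wider than Poisson, so the shape of the gate-count distribution carries information a mean rate alone does not. The toy model below (all distributions assumed here for illustration, not the patented method) compares the variance-to-mean ratio of gate counts for a random source and for a crude burst-emitting stand-in for chains:

```python
import math
import random

def poisson(rng, lam):
    """Sample a Poisson variate (Knuth's multiplication algorithm)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def gate_counts(burst_size, gates=50000, mean_events=3.0, seed=2):
    """Counts per time gate when each source event contributes
    `burst_size` detected neutrons (burst_size=1: uncorrelated source;
    burst_size>1: crude stand-in for correlated fission chains)."""
    rng = random.Random(seed)
    return [burst_size * poisson(rng, mean_events) for _ in range(gates)]

def var_to_mean(counts):
    m = sum(counts) / len(counts)
    v = sum((c - m) ** 2 for c in counts) / len(counts)
    return v / m

random_ratio = var_to_mean(gate_counts(burst_size=1))  # near 1: Poisson
chain_ratio = var_to_mean(gate_counts(burst_size=3))   # above 1: over-dispersed
```

    Fitting a model of the full count distribution to measured over-dispersion of this kind is, loosely, what an absolute count-distribution assay does; the real method samples analytically computed fission chain distributions rather than fixed-size bursts.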

  4. Liquid scintillation counting system with automatic gain correction

    International Nuclear Information System (INIS)

    Frank, R.B.

    1976-01-01

    An automatic liquid scintillation counting apparatus is described including a scintillating medium in the elevator ram of the sample changing apparatus. An appropriate source of radiation, which may be the external source for standardizing samples, produces reference scintillations in the scintillating medium which may be used for correction of the gain of the counting system

  5. Alpha scintillation radon counting

    International Nuclear Information System (INIS)

    Lucas, H.F. Jr.

    1977-01-01

    Radon counting chambers which utilize the alpha-scintillation properties of silver-activated zinc sulfide are simple to construct, have a high efficiency, and, with proper design, may be relatively insensitive to variations in the pressure or purity of the counter filling. Chambers which were constructed from glass, metal, or plastic in a wide variety of shapes and sizes were evaluated for the accuracy and the precision of the radon counting. The principles affecting the design of alpha-scintillation radon counting chambers and an analytic system suitable for a large-scale study of the ²²²Rn and ²²⁶Ra content of either air or other environmental samples are described. Particular note is taken of those factors which affect the accuracy and the precision of the method for monitoring radioactivity around uranium mines

  6. Principles of correlation counting

    International Nuclear Information System (INIS)

    Mueller, J.W.

    1975-01-01

    A review is given of the various applications which have been made of correlation techniques in the field of nuclear physics, in particular for absolute counting. Whereas in most cases the usual coincidence method will be preferable for its simplicity, correlation counting may be the only possible approach in cases where the two radiations of the cascade cannot be well separated or when there is a long-lived intermediate state. The measurement of half-lives and of count rates of spurious pulses is also briefly discussed. The various experimental situations lead to different ways in which the correlation method is best applied (covariance technique with one or with two detectors, application of correlation functions, etc.). Formulae are given for some simple model cases, neglecting dead-time corrections

  7. Interpretation of galaxy counts

    International Nuclear Information System (INIS)

    Tinsley, B.M.

    1980-01-01

    New models are presented for the interpretation of recent counts of galaxies to 24th magnitude, and predictions are shown to 28th magnitude for future comparison with data from the Space Telescope. The results supersede earlier, more schematic models by the author. Tyson and Jarvis found in their counts a "local" density enhancement at 17th magnitude, on comparison with the earlier models; the excess is no longer significant when a more realistic mixture of galaxy colors is used. Bruzual and Kron's conclusion that Kron's counts show evidence for evolution at faint magnitudes is confirmed, and it is predicted that some 23rd magnitude galaxies have redshifts greater than unity. These may include spheroidal systems, elliptical galaxies, and the bulges of early-type spirals and S0's, seen during their primeval rapid star formation

  8. Computerized radioautographic grain counting

    International Nuclear Information System (INIS)

    McKanna, J.A.; Casagrande, V.A.

    1985-01-01

    In recent years, radiolabeling techniques have become fundamental assays in physiology and biochemistry experiments. They also have assumed increasingly important roles in morphologic studies. Characteristically, radioautographic analysis of structure has been qualitative rather than quantitative, however, microcomputers have opened the door to several methods for quantifying grain counts and density. The overall goal of this chapter is to describe grain counting using the Bioquant, an image analysis package based originally on the Apple II+, and now available for several popular microcomputers. The authors discuss their image analysis procedures by applying them to a study of development in the central nervous system

  9. Rainflow counting revisited

    Energy Technology Data Exchange (ETDEWEB)

Soeker, H. [Deutsches Windenergie-Institut (Germany)]

    1996-09-01

As a state-of-the-art method, the rainflow counting technique is presently applied everywhere in fatigue analysis. However, the author feels that the potential of the technique is not fully recognized in the wind energy industry, as it is used, most of the time, as a mere data reduction technique, disregarding some of the inherent information in the rainflow counting results. The ideas described in the following aim at exploiting this information and making it available for use in the design and verification process. (au)
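The abstract does not spell out the counting procedure itself. As a rough illustration, a minimal sketch of the classic ASTM E1049-style three-point rainflow algorithm (function names and structure are my own, not taken from this paper) might look like:

```python
def reversals(series):
    """Reduce a load history to its turning points (peaks and valleys)."""
    out = list(series[:1])
    for x in series[1:]:
        if len(out) < 2:
            if x != out[-1]:
                out.append(x)
        elif (out[-1] - out[-2]) * (x - out[-1]) < 0:
            out.append(x)          # slope changed sign: keep a new turning point
        else:
            out[-1] = x            # same direction: extend the last excursion
    return out

def rainflow(series):
    """Count (range, cycles) pairs following the ASTM E1049 three-point rule."""
    stack, cycles = [], []
    for point in reversals(series):
        stack.append(point)
        while len(stack) >= 3:
            x = abs(stack[-1] - stack[-2])   # most recent range
            y = abs(stack[-2] - stack[-3])   # previous range
            if x < y:
                break
            if len(stack) == 3:              # range touches the start: half cycle
                cycles.append((y, 0.5))
                stack.pop(0)
            else:                            # enclosed range: one full cycle
                cycles.append((y, 1.0))
                del stack[-3:-1]
    for a, b in zip(stack, stack[1:]):       # leftover residue counted as half cycles
        cycles.append((abs(b - a), 0.5))
    return cycles
```

On the standard example history [-2, 1, -3, 5, -1, 3, -4, 4, -2] this yields ranges 3, 4, 6, 8 and 9 with 0.5, 1.5, 0.5, 1.0 and 0.5 cycles respectively; the (range, cycles) pairs are exactly the "inherent information" the author argues should be carried into design, rather than being reduced to a single damage number.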

  10. EXTINCTION MAP OF THE SMALL MAGELLANIC CLOUD BASED ON THE SIRIUS AND 6X 2MASS POINT SOURCE CATALOGS

    International Nuclear Information System (INIS)

    Dobashi, Kazuhito; Egusa, Fumi; Bernard, Jean-Philippe; Paradis, Deborah; Kawamura, Akiko; Hughes, Annie; Bot, Caroline; Reach, William T.

    2009-01-01

In this paper, we present the first extinction map of the Small Magellanic Cloud (SMC) constructed using the color excess at near-infrared wavelengths. Using a new technique named the X percentile method, which we developed recently to measure the color excess of dark clouds embedded within a star distribution, we have derived an E(J - H) map based on the SIRIUS and 6X Two Micron All Sky Survey (2MASS) star catalogs. Several dark clouds are detected in the map derived from the SIRIUS star catalog, which is deeper than the 6X 2MASS catalog. We have compared the E(J - H) map with a model calculation in order to infer the locations of the clouds along the line of sight, and found that many of them are likely to be located in or elongated toward the far side of the SMC. Most of the dark clouds found in the E(J - H) map have counterparts in the CO clouds detected by Mizuno et al. with the NANTEN telescope. A comparison of the E(J - H) map with the virial mass derived from the CO data indicates that the dust-to-gas ratio in the SMC varies in the range A_V/N_H = 1-2 x 10^-22 mag H^-1 cm^2, with a mean value of ∼1.5 x 10^-22 mag H^-1 cm^2. If the virial mass underestimates the true cloud mass by a factor of ∼2, as recently suggested by Bot et al., the mean value would decrease to ∼8 x 10^-23 mag H^-1 cm^2, in good agreement with the value reported by Gordon et al., 7.59 x 10^-23 mag H^-1 cm^2.

  11. Clean Hands Count

    Medline Plus

    Full Text Available ... Queue __count__/__total__ It’s YouTube. Uninterrupted. Loading... Want music and videos with zero ads? Get YouTube Red. ... 824 views 1:36 Wash 'Em - Hand Hygiene Music Video - Duration: 5:46. Jefferson Health 409,492 ...

  2. Detection and counting systems

    International Nuclear Information System (INIS)

    Abreu, M.A.N. de

    1976-01-01

Detection devices based on gaseous ionization are analysed, such as electroscopes, ionization chambers, proportional counters and Geiger-Mueller counters. Scintillation methods are also commented on. A revision of the basic concepts in electronics is given and the main counting equipment is detailed. In the study of gamma spectrometry, scintillation and semiconductor detectors are analysed [pt]

  6. Reticulocyte Count Test

    Science.gov (United States)

    ... htm. (2004 Summer). Immature Reticulocyte Fraction(IRF). The Pathology Center Newsletter v9(1). [On-line information]. Available ... Company, Philadelphia, PA [18th Edition]. Levin, M. (2007 March 8, Updated). Reticulocyte Count. MedlinePlus Medical Encyclopedia [On- ...

  7. Clean Hands Count

    Medline Plus

Full Text Available ... empower patients to play a role in their care by asking or reminding healthcare providers to clean ...

  8. Radiation intensity counting system

    International Nuclear Information System (INIS)

    Peterson, R.J.

    1982-01-01

A method is described of excluding the natural dead time of the radiation detector (e.g. a Geiger-Mueller counter) in a ratemeter counting circuit, thus eliminating the need for dead-time corrections. Using a pulse generator, an artificial dead time is introduced which is longer than the natural dead time of the detector. (U.K.)
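The abstract gives no formulas. For context, the correction that such a scheme avoids is the standard non-paralyzable dead-time relation n = m / (1 - m·τ); the sketch below (notation my own) shows why a precisely known artificial dead time simplifies things:

```python
def true_rate(measured_rate, dead_time):
    """Non-paralyzable dead-time correction: n = m / (1 - m * tau).

    measured_rate: observed count rate in counts per second (m)
    dead_time: dead time per event in seconds (tau)
    """
    loss_fraction = measured_rate * dead_time   # fraction of time the counter is dead
    if loss_fraction >= 1.0:
        raise ValueError("count rate saturates the detector")
    return measured_rate / (1.0 - loss_fraction)
```

With a generator-imposed artificial dead time longer than the detector's natural one, tau in this formula is known exactly rather than being an uncertain detector property; the circuit in the record goes one step further and removes the need for the correction altogether.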

  10. Searching for Orphan radiation sources

    International Nuclear Information System (INIS)

    Bystrov, Evgenij; Antonau, Uladzimir; Gurinovich, Uladzimir; Kazhamiakin, Valery; Petrov, Vitaly; Shulhovich, Heorhi; Tischenko, Siarhei

    2008-01-01

Full text: The problem of orphan sources cannot be left unaddressed, due to the high probability of accidental exposure and the potential use of such sources for terrorism. The search for objects of this kind is complex, particularly when the search territory is large. This requires devices capable of detecting sources, identifying their radionuclide composition, correlating scan results to geographical coordinates and displaying results on a map. The spectral radiation scanner AT6101C can fulfill the objectives of searching for gamma and neutron radiation sources, identifying radionuclide composition, correlating results to geographical coordinates and displaying results on a map. The scanner consists of a gamma radiation scintillation detection unit based on a NaI(Tl) crystal, a neutron detection unit based on two He-3 counters, a GPS receiver and a portable ruggedized computer. Built-in and application software automates the entire scan process, saving all results to memory for further analysis, with visual representation of results as spectral information diagrams, count rate profiles and gamma radiation dose rates on a geographical map. The scanner informs the operator with voice messages on detection of radiation sources, identification results and other events. Scanner detection units and accessories are packed in a backpack. Weighing 7 kg, the scanner is human portable and can be used for scanning inside cars. The scanner can also be used for radiation mapping and inspections. (author)

  11. Fast counting electronics for neutron coincidence counting

    International Nuclear Information System (INIS)

    Swansen, J.E.

    1987-01-01

This patent describes a high speed circuit for accurate neutron coincidence counting comprising: neutron detecting means for providing an above-threshold signal upon neutron detection; amplifying means inputted by the neutron detecting means for providing a pulse output having a pulse width of about 0.5 microseconds upon the input of each above-threshold signal; digital processing means inputted by the pulse output of the amplifying means for generating a pulse responsive to each input pulse from the amplifying means and having a pulse width of about 50 nanoseconds, effective for processing an expected neutron event rate of about 1 Mpps; pulse stretching means inputted by the digital processing means for producing a pulse having a pulse width of several milliseconds for each pulse received from the digital processing means; visual indicating means inputted by the pulse stretching means for producing a visual output for each pulse received from the digital processing means; and derandomizing means effective to receive the 50 ns neutron event pulses from the digital processing means for storage at a rate up to the neutron event rate of 1 Mpps and having first counter means for storing the input neutron event pulses.

  12. Mapping Prehistoric, Historic, and Channel Sediment Distribution, South Fork Noyo River: A Tool For Understanding Sources, Storage, and Transport

    Science.gov (United States)

    Rich D. Koehler; Keith I. Kelson; Graham Matthews; K.H. Kang; Andrew D. Barron

    2007-01-01

    The South Fork Noyo River (SFNR) watershed in coastal northern California contains large volumes of historic sediment that were delivered to channels in response to past logging operations. This sediment presently is stored beneath historic terraces and in present-day channels. We conducted geomorphic mapping on the SFNR valley floor to assess the volume and location...

  13. World Spatiotemporal Analytics and Mapping Project (WSTAMP): Discovering, Exploring, and Mapping Spatiotemporal Patterns across the World s Largest Open Source Geographic Data Sets

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, Robert N [ORNL; Piburn, Jesse O [ORNL; Sorokine, Alexandre [ORNL; Myers, Aaron T [ORNL; White, Devin A [ORNL

    2015-01-01

    The application of spatiotemporal (ST) analytics to integrated data from major sources such as the World Bank, United Nations, and dozens of others holds tremendous potential for shedding new light on the evolution of cultural, health, economic, and geopolitical landscapes on a global level. Realizing this potential first requires an ST data model that addresses challenges in properly merging data from multiple authors, with evolving ontological perspectives, semantical differences, and changing attributes, as well as content that is textual, numeric, categorical, and hierarchical. Equally challenging is the development of analytical and visualization approaches that provide a serious exploration of this integrated data while remaining accessible to practitioners with varied backgrounds. The WSTAMP project at Oak Ridge National Laboratory has yielded two major results in addressing these challenges: 1) development of the WSTAMP database, a significant advance in ST data modeling that integrates 10,000+ attributes covering over 200 nation states spanning over 50 years from over 30 major sources and 2) a novel online ST exploratory and analysis tool providing an array of modern statistical and visualization techniques for analyzing these data temporally, spatially, and spatiotemporally under a standard analytic workflow. We discuss the status of this work and report on major findings. Acknowledgment Prepared by Oak Ridge National Laboratory, P.O. Box 2008, Oak Ridge, Tennessee 37831-6285, managed by UT-Battelle, LLC for the U. S. Department of Energy under contract no. DEAC05-00OR22725. Copyright This manuscript has been authored by employees of UT-Battelle, LLC, under contract DE-AC05-00OR22725 with the U.S. Department of Energy. Accordingly, the United States Government retains and the publisher, by accepting the article for publication, acknowledges that the United States Government retains a non-exclusive, paid-up, irrevocable, world-wide license to publish or

  14. The Mapping X-ray Fluorescence Spectrometer (MapX)

    Science.gov (United States)

    Sarrazin, P.; Blake, D. F.; Marchis, F.; Bristow, T.; Thompson, K.

    2017-12-01

    Many planetary surface processes leave traces of their actions as features in the size range 10s to 100s of microns. The Mapping X-ray Fluorescence Spectrometer (MapX) will provide elemental imaging at 100 micron spatial resolution, yielding elemental chemistry at a scale where many relict physical, chemical, or biological features can be imaged and interpreted in ancient rocks on planetary bodies and planetesimals. MapX is an arm-based instrument positioned on a rock or regolith with touch sensors. During an analysis, an X-ray source (tube or radioisotope) bombards the sample with X-rays or alpha-particles / gamma-rays, resulting in sample X-ray Fluorescence (XRF). X-rays emitted in the direction of an X-ray sensitive CCD imager pass through a 1:1 focusing lens (X-ray micro-pore Optic (MPO)) that projects a spatially resolved image of the X-rays onto the CCD. The CCD is operated in single photon counting mode so that the energies and positions of individual X-ray photons are recorded. In a single analysis, several thousand frames are both stored and processed in real-time. Higher level data products include single-element maps with a lateral spatial resolution of 100 microns and quantitative XRF spectra from ground- or instrument- selected Regions of Interest (ROI). XRF spectra from ROI are compared with known rock and mineral compositions to extrapolate the data to rock types and putative mineralogies. When applied to airless bodies and implemented with an appropriate radioisotope source for alpha-particle excitation, MapX will be able to analyze biogenic elements C, N, O, P, S, in addition to the cations of the rock-forming elements >Na, accessible with either X-ray or gamma-ray excitation. The MapX concept has been demonstrated with a series of lab-based prototypes and is currently under refinement and TRL maturation.
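MapX's single-photon counting mode amounts to binning photon events by position and energy to build per-element maps. A toy sketch of that accumulation step follows; the element lines, energy windows, and map size are illustrative assumptions, not instrument values:

```python
import numpy as np

# Illustrative K-alpha energy windows in keV; a real instrument calibrates these.
WINDOWS = {"Si": (1.6, 1.9), "Ca": (3.5, 3.9), "Fe": (6.2, 6.6)}

def element_maps(events, shape=(64, 64)):
    """Accumulate single-photon events (x, y, energy_keV) into per-element count maps."""
    maps = {el: np.zeros(shape, dtype=int) for el in WINDOWS}
    for x, y, energy in events:
        for el, (lo, hi) in WINDOWS.items():
            if lo <= energy < hi:
                maps[el][y, x] += 1   # one count at this pixel for this element
                break                  # each photon belongs to at most one window
    return maps
```

Photons whose energies fall outside every window (scatter, continuum) are simply dropped; summing each element's map over a ground-selected region of interest would give the per-element totals behind the quantitative XRF spectra described above.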

15. Measuring Trace Gas Emission from Multi-Distributed Sources Using Vertical Radial Plume Mapping (VRPM) and Backward Lagrangian Stochastic (bLS) Techniques

    Directory of Open Access Journals (Sweden)

    Thomas K. Flesch

    2011-09-01

Full Text Available Two micrometeorological techniques for measuring trace gas emission rates from distributed area sources were evaluated using a variety of synthetic area sources. The vertical radial plume mapping (VRPM) and the backward Lagrangian stochastic (bLS) techniques with an open-path optical spectroscopic sensor were evaluated for relative accuracy for multiple emission-source and sensor configurations. The relative accuracy was calculated by dividing the measured emission rate by the actual emission rate; thus, a relative accuracy of 1.0 represents a perfect measure. For a single area emission source, the VRPM technique yielded a somewhat high relative accuracy of 1.38 ± 0.28. The bLS technique resulted in a relative accuracy close to unity, 0.98 ± 0.24. Relative accuracies for dual source emissions for the VRPM and bLS techniques were somewhat similar to single source emissions, 1.23 ± 0.17 and 0.94 ± 0.24, respectively. When the bLS technique was used with vertical point concentrations, the relative accuracy was unacceptably low,
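The relative-accuracy statistic quoted above is a simple ratio summary; a minimal stdlib sketch (my own naming) of how such mean ± standard deviation figures are computed:

```python
import statistics

def relative_accuracy(measured_rates, actual_rates):
    """Mean and standard deviation of measured/actual emission-rate ratios.

    A mean of 1.0 represents a perfect measure, as defined in the study.
    """
    ratios = [m / a for m, a in zip(measured_rates, actual_rates)]
    return statistics.mean(ratios), statistics.stdev(ratios)
```

For example, three trials measuring 1.2, 1.0 and 0.8 units against a true release of 1.0 unit give a relative accuracy of 1.0 ± 0.2.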

  16. Complete blood count - slideshow

    Science.gov (United States)


  17. Cardiac MOLLI T1 mapping at 3.0 T: comparison of patient-adaptive dual-source RF and conventional RF transmission.

    Science.gov (United States)

    Rasper, Michael; Nadjiri, Jonathan; Sträter, Alexandra S; Settles, Marcus; Laugwitz, Karl-Ludwig; Rummeny, Ernst J; Huber, Armin M

    2017-06-01

To prospectively compare image quality and myocardial T1 relaxation times of modified Look-Locker inversion recovery (MOLLI) imaging at 3.0 T acquired with patient-adaptive dual-source (DS) and conventional single-source (SS) radiofrequency (RF) transmission. Pre- and post-contrast MOLLI T1 mapping using SS and DS was acquired in 27 patients. Patient-wise and segment-wise analysis of T1 times was performed. The correlation of DS MOLLI measurements with a reference spin echo sequence was analysed in phantom experiments. DS MOLLI imaging reduced the T1 standard deviation in 14 out of 16 myocardial segments (87.5%). Significant reduction of T1 variance could be obtained in 7 segments (43.8%). DS significantly reduced myocardial T1 variance in 16 out of 25 patients (64.0%). With conventional RF transmission, dielectric shading artefacts occurred in six patients, causing diagnostic uncertainty. No such artefacts were found on DS images. DS image findings were in accordance with conventional T1 mapping and late gadolinium enhancement (LGE) imaging. Phantom experiments demonstrated good correlation of myocardial T1 times between DS MOLLI and spin echo imaging. Dual-source RF transmission enhances myocardial T1 homogeneity in MOLLI imaging at 3.0 T. The reduction of signal inhomogeneities and artefacts due to dielectric shading is likely to enhance diagnostic confidence.

  18. Combined 60° Wide-Field Choroidal Thickness Maps and High-Definition En Face Vasculature Visualization Using Swept-Source Megahertz OCT at 1050 nm.

    Science.gov (United States)

    Mohler, Kathrin J; Draxinger, Wolfgang; Klein, Thomas; Kolb, Jan Philip; Wieser, Wolfgang; Haritoglou, Christos; Kampik, Anselm; Fujimoto, James G; Neubauer, Aljoscha S; Huber, Robert; Wolf, Armin

    2015-10-01

    To demonstrate ultrahigh-speed swept-source optical coherence tomography (SS-OCT) at 1.68 million A-scans/s for choroidal imaging in normal and diseased eyes over a ∼60° field of view. To investigate and correlate wide-field three-dimensional (3D) choroidal thickness (ChT) and vascular patterns using ChT maps and coregistered high-definition en face images extracted from a single densely sampled Megahertz-OCT (MHz-OCT) dataset. High-definition, ∼60° wide-field 3D datasets consisting of 2088 × 1024 A-scans were acquired using a 1.68 MHz prototype SS-OCT system at 1050 nm based on a Fourier-domain mode-locked laser. Nine subjects (nine eyes) with various chorioretinal diseases or without ocular pathology are presented. Coregistered ChT maps, choroidal summation maps, and depth-resolved en face images referenced to either the retinal pigment epithelium or the choroidal-scleral interface were generated using manual segmentation. Wide-field ChT maps showed a large inter- and intraindividual variance in peripheral and central ChT. In only four of the nine eyes, the location with the largest ChT was coincident with the fovea. The anatomy of the large lumen vessels of the outer choroid seems to play a major role in determining the global ChT pattern. Focal ChT changes with large thickness gradients were observed in some eyes. Different ChT and vascular patterns could be visualized over ∼60° in patients for the first time using OCT. Due to focal ChT changes, a high density of thickness measurements may be favorable. High-definition depth-resolved en face images are complementary to cross sections and thickness maps and enhance the interpretation of different ChT patterns.

  19. The World Spatiotemporal Analytics and Mapping Project (WSTAMP): Further Progress in Discovering, Exploring, and Mapping Spatiotemporal Patterns Across the World's Largest Open Source Data Sets

    Science.gov (United States)

    Piburn, J.; Stewart, R.; Myers, A.; Sorokine, A.; Axley, E.; Anderson, D.; Burdette, J.; Biddle, C.; Hohl, A.; Eberle, R.; Kaufman, J.; Morton, A.

    2017-10-01

Spatiotemporal (ST) analytics applied to major data sources such as the World Bank and World Health Organization has shown tremendous value in shedding light on the evolution of cultural, health, economic, and geopolitical landscapes on a global level. WSTAMP engages this opportunity by situating analysts, data, and analytics together within a visually rich and computationally rigorous online analysis environment. Since introducing WSTAMP at the First International Workshop on Spatiotemporal Computing, several transformative advances have occurred. Collaboration with human computer interaction experts led to a complete interface redesign that deeply immerses the analyst within a ST context, significantly increases visual and textual content, provides navigational crosswalks for attribute discovery, substantially reduces mouse and keyboard actions, and supports user data uploads. Secondly, the database has been expanded to include over 16,000 attributes, 50 years of time, and 200+ nation states and redesigned to support non-annual, non-national, city, and interaction data. Finally, two new analytics are implemented for analyzing large portfolios of multi-attribute data and measuring the behavioral stability of regions along different dimensions. These advances required substantial new approaches in design, algorithmic innovations, and increased computational efficiency. We report on these advances and describe how others may freely access the tool.

20. The World Spatiotemporal Analytics and Mapping Project (WSTAMP): Further Progress in Discovering, Exploring, and Mapping Spatiotemporal Patterns Across the World’s Largest Open Source Data Sets

    Directory of Open Access Journals (Sweden)

    J. Piburn

    2017-10-01

Full Text Available Spatiotemporal (ST) analytics applied to major data sources such as the World Bank and World Health Organization has shown tremendous value in shedding light on the evolution of cultural, health, economic, and geopolitical landscapes on a global level. WSTAMP engages this opportunity by situating analysts, data, and analytics together within a visually rich and computationally rigorous online analysis environment. Since introducing WSTAMP at the First International Workshop on Spatiotemporal Computing, several transformative advances have occurred. Collaboration with human computer interaction experts led to a complete interface redesign that deeply immerses the analyst within a ST context, significantly increases visual and textual content, provides navigational crosswalks for attribute discovery, substantially reduces mouse and keyboard actions, and supports user data uploads. Secondly, the database has been expanded to include over 16,000 attributes, 50 years of time, and 200+ nation states and redesigned to support non-annual, non-national, city, and interaction data. Finally, two new analytics are implemented for analyzing large portfolios of multi-attribute data and measuring the behavioral stability of regions along different dimensions. These advances required substantial new approaches in design, algorithmic innovations, and increased computational efficiency. We report on these advances and describe how others may freely access the tool.

  1. Rcount: simple and flexible RNA-Seq read counting.

    Science.gov (United States)

    Schmid, Marc W; Grossniklaus, Ueli

    2015-02-01

    Analysis of differential gene expression by RNA sequencing (RNA-Seq) is frequently done using feature counts, i.e. the number of reads mapping to a gene. However, commonly used count algorithms (e.g. HTSeq) do not address the problem of reads aligning with multiple locations in the genome (multireads) or reads aligning with positions where two or more genes overlap (ambiguous reads). Rcount specifically addresses these issues. Furthermore, Rcount allows the user to assign priorities to certain feature types (e.g. higher priority for protein-coding genes compared to rRNA-coding genes) or to add flanking regions. Rcount provides a fast and easy-to-use graphical user interface requiring no command line or programming skills. It is implemented in C++ using the SeqAn (www.seqan.de) and the Qt libraries (qt-project.org). Source code and 64 bit binaries for (Ubuntu) Linux, Windows (7) and MacOSX are released under the GPLv3 license and are freely available on github.com/MWSchmid/Rcount. marcschmid@gmx.ch Test data, genome annotation files, useful Python and R scripts and a step-by-step user guide (including run-time and memory usage tests) are available on github.com/MWSchmid/Rcount. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
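Rcount's handling of ambiguous reads via feature-type priorities is only described in prose above; the following is a hedged sketch of the general idea (the function name, data shapes, and tie-break rule are my own illustration, not Rcount's actual C++ implementation):

```python
def assign_read(features_hit, priority):
    """Assign an ambiguous read to the overlapping feature with the best priority.

    features_hit: list of (feature_id, feature_type) pairs a read aligns to.
    priority: dict mapping feature_type -> rank (lower rank = higher priority,
              e.g. protein-coding genes ranked above rRNA genes).
    Returns a feature_id, or None if the read remains ambiguous.
    """
    if not features_hit:
        return None
    rank = lambda feat: priority.get(feat[1], len(priority))  # unknown types rank last
    best = min(features_hit, key=rank)
    winners = [f for f in features_hit if rank(f) == rank(best)]
    return winners[0][0] if len(winners) == 1 else None  # ties stay ambiguous
```

Under this scheme a read overlapping both a protein-coding gene and an rRNA gene is counted for the protein-coding gene, while a read tied between two features of the same priority is left unassigned rather than double-counted.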

  2. Color quench correction for low level Cherenkov counting.

    Science.gov (United States)

    Tsroya, S; Pelled, O; German, U; Marco, R; Katorza, E; Alfassi, Z B

    2009-05-01

    The Cherenkov counting efficiency varies strongly with color quenching, thus correction curves must be used to obtain correct results. The external (152)Eu source of a Quantulus 1220 liquid scintillation counting (LSC) system was used to obtain a quench indicative parameter based on spectra area ratio. A color quench correction curve for aqueous samples containing (90)Sr/(90)Y was prepared. The main advantage of this method over the common spectra indicators is its usefulness also for low level Cherenkov counting.
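Applying such a correction curve amounts to interpolating counting efficiency against the quench-indicative parameter and dividing it out of the net count rate. A toy sketch follows; the calibration numbers are invented for illustration, not values from this paper:

```python
import numpy as np

# Hypothetical calibration curve: quench-indicative parameter (from the external
# source spectrum) vs Cherenkov counting efficiency, from quenched standards.
SQP = np.array([0.6, 0.7, 0.8, 0.9, 1.0])
EFF = np.array([0.20, 0.32, 0.42, 0.50, 0.55])

def activity_bq(net_cpm, sample_sqp):
    """Color-quench-corrected activity (Bq) from a net Cherenkov count rate (cpm)."""
    efficiency = float(np.interp(sample_sqp, SQP, EFF))  # linear interpolation on the curve
    return net_cpm / 60.0 / efficiency                   # cpm -> cps, then divide by efficiency
```

More strongly colored samples (lower quench parameter) thus receive a larger correction, which is why an efficiency curve rather than a single fixed efficiency is needed for low-level counting.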

  3. Dual-source dual-energy CT angiography with virtual non-enhanced images and iodine map for active gastrointestinal bleeding: Image quality, radiation dose and diagnostic performance

    International Nuclear Information System (INIS)

    Sun, Hao; Hou, Xin-Yi; Xue, Hua-Dan; Li, Xiao-Guang; Jin, Zheng-Yu; Qian, Jia-Ming; Yu, Jian-Chun; Zhu, Hua-Dong

    2015-01-01

    Highlights: • GIB is a common gastrointestinal emergency with a high mortality rate. • Detection and localization of GIB source are important for imaging modality. • DSDECTA using a dual-phase scan protocol is clinically feasible. • DSDECTA with VNE and iodine map images can diagnose the active GIB source accurately. • DSDECTA can reduce radiation dose compared with conventional CT examination in GIB. - Abstract: Objectives: To evaluate the clinical feasibility of dual-source dual-energy CT angiography (DSDECTA) with virtual non-enhanced images and iodine map for active gastrointestinal bleeding (GIB). Methods: From June 2010 to December 2012, 112 consecutive patients with clinical signs of active GIB underwent DSDECTA with true non-enhanced (TNE), arterial phase with single-source mode, and portal-venous phase with dual-energy mode (100 kVp/230 mAs and Sn 140 kVp/178 mAs). Virtual non-enhanced CT (VNE) image sets and iodine map were reformatted from ‘Liver VNC’ software. The mean CT number, noise, signal to noise ratio (SNR), image quality and radiation dose were compared between TNE and VNE image sets. Two radiologists, blinded to clinical data, interpreted images from DSDECTA with TNE (protocol 1), and DSDECTA with VNE and iodine map (protocol 2) respectively, with discordant interpretation resolved by consensus. The standards of reference included digital subtraction angiography, endoscopy, surgery, or final pathology reports. Receiver–operating characteristic (ROC) analysis was undertaken and the area under the curve (AUC) calculated for CT protocols 1 and 2, respectively. Results: There was no significant difference in mean CT numbers of all organs (including liver, pancreas, spleen, kidney, abdominal aorta, and psoas muscle) (P > 0.05). Lower noise and higher SNR were found on VNE images than TNE images (P < 0.05). Image quality of VNE was lower than that of TNE without significant difference (P > 0.05). The active GIB source was identified

  4. Dual-source dual-energy CT angiography with virtual non-enhanced images and iodine map for active gastrointestinal bleeding: Image quality, radiation dose and diagnostic performance

    Energy Technology Data Exchange (ETDEWEB)

    Sun, Hao, E-mail: sunhao_robert@126.com [Department of Radiology, Peking Union Medical College Hospital, Peking Union Medical College, Chinese Academy of Medical Sciences, Shuaifuyuan No. 1, Wangfujing Street, Dongcheng District, Beijing 100730 (China); Hou, Xin-Yi, E-mail: hxy_pumc@126.com [Department of Radiology, Beijing Tiantan Hospital, Capital Medical University, Beijing (China); Xue, Hua-Dan, E-mail: bjdanna95@hotmail.com [Department of Radiology, Peking Union Medical College Hospital, Peking Union Medical College, Chinese Academy of Medical Sciences, Shuaifuyuan No. 1, Wangfujing Street, Dongcheng District, Beijing 100730 (China); Li, Xiao-Guang, E-mail: xglee88@126.com [Department of Radiology, Peking Union Medical College Hospital, Peking Union Medical College, Chinese Academy of Medical Sciences, Shuaifuyuan No. 1, Wangfujing Street, Dongcheng District, Beijing 100730 (China); Jin, Zheng-Yu, E-mail: zhengyu_jin@126.com [Department of Radiology, Peking Union Medical College Hospital, Peking Union Medical College, Chinese Academy of Medical Sciences, Shuaifuyuan No. 1, Wangfujing Street, Dongcheng District, Beijing 100730 (China); Qian, Jia-Ming, E-mail: qjiaming57@gmail.com [Department of Gastroenterology, Peking Union Medical College Hospital, Peking Union Medical College, Chinese Academy of Medical Sciences, Beijing (China); Yu, Jian-Chun, E-mail: yu-jch@163.com [Department of General Surgery, Peking Union Medical College Hospital, Peking Union Medical College, Chinese Academy of Medical Sciences, Beijing (China); Zhu, Hua-Dong, E-mail: huadongzhu@hotmail.com [Department of Emergency, Peking Union Medical College Hospital, Peking Union Medical College, Chinese Academy of Medical Sciences, Beijing (China)

    2015-05-15

    Highlights: • GIB is a common gastrointestinal emergency with a high mortality rate. • Detection and localization of GIB source are important for imaging modality. • DSDECTA using a dual-phase scan protocol is clinically feasible. • DSDECTA with VNE and iodine map images can diagnose the active GIB source accurately. • DSDECTA can reduce radiation dose compared with conventional CT examination in GIB. - Abstract: Objectives: To evaluate the clinical feasibility of dual-source dual-energy CT angiography (DSDECTA) with virtual non-enhanced images and iodine map for active gastrointestinal bleeding (GIB). Methods: From June 2010 to December 2012, 112 consecutive patients with clinical signs of active GIB underwent DSDECTA with true non-enhanced (TNE), arterial phase with single-source mode, and portal-venous phase with dual-energy mode (100 kVp/230 mAs and Sn 140 kVp/178 mAs). Virtual non-enhanced CT (VNE) image sets and iodine map were reformatted from ‘Liver VNC’ software. The mean CT number, noise, signal to noise ratio (SNR), image quality and radiation dose were compared between TNE and VNE image sets. Two radiologists, blinded to clinical data, interpreted images from DSDECTA with TNE (protocol 1), and DSDECTA with VNE and iodine map (protocol 2) respectively, with discordant interpretation resolved by consensus. The standards of reference included digital subtraction angiography, endoscopy, surgery, or final pathology reports. Receiver–operating characteristic (ROC) analysis was undertaken and the area under the curve (AUC) calculated for CT protocols 1 and 2, respectively. Results: There was no significant difference in mean CT numbers of all organs (including liver, pancreas, spleen, kidney, abdominal aorta, and psoas muscle) (P > 0.05). Lower noise and higher SNR were found on VNE images than TNE images (P < 0.05). Image quality of VNE was lower than that of TNE without significant difference (P > 0.05). The active GIB source was identified

  5. Accuracy of Combined Computed Tomography Colonography and Dual-Energy Iodine Map Imaging for Detecting Colorectal Masses Using High-pitch Dual-source CT.

    Science.gov (United States)

    Sun, Kai; Han, Ruijuan; Han, Yang; Shi, Xuesen; Hu, Jiang; Lu, Bin

    2018-02-28

    To evaluate the diagnostic accuracy of combined computed tomography colonography (CTC) and dual-energy iodine map imaging for detecting colorectal masses using high-pitch dual-source CT, compared with optical colonoscopy (OC) and histopathologic findings. Twenty-eight consecutive patients were prospectively enrolled in this study. All patients underwent contrast-enhanced CTC acquisition in dual-energy mode as well as OC and pathologic examination. The size of the space-occupying mass, the CT value after contrast enhancement, and the iodine value were measured and statistically compared. The sensitivity, specificity, accuracy rate, and positive and negative predictive values of dual-energy contrast-enhanced CTC were calculated and compared between conventional CTC and dual-energy iodine images. The iodine value of stool was significantly lower than that of colonic neoplasia (P < 0.05). The sensitivity of combined CTC and dual-energy iodine map imaging was 95.6% (95% CI = 77.9%-99.2%). The specificity of the two methods was 42.8% (95% CI = 15.4%-93.5%) and 100% (95% CI = 47.9%-100%; P = 0.02), respectively. Compared with optical colonoscopy and histopathology, combined CTC and dual-energy iodine map imaging can distinguish stool from colonic neoplasia, initially distinguish between benign and malignant tumors, and improve the diagnostic accuracy of CTC for colorectal cancer screening.
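The sensitivity, specificity and predictive values quoted above are simple functions of the 2×2 table comparing the test against the reference standard, and interval estimates like the reported 95% CIs can be obtained with the Wilson score method. A minimal sketch (the counts below are hypothetical illustrations, not the study's raw data):

```python
import math

def wilson_ci(successes, n, z=1.96):
    """95% Wilson score confidence interval for a proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV with Wilson 95% CIs,
    all read off the 2x2 table of a diagnostic test vs. the reference."""
    return {
        "sensitivity": (tp / (tp + fn), wilson_ci(tp, tp + fn)),
        "specificity": (tn / (tn + fp), wilson_ci(tn, tn + fp)),
        "ppv": (tp / (tp + fp), wilson_ci(tp, tp + fp)),
        "npv": (tn / (tn + fn), wilson_ci(tn, tn + fn)),
    }

# Hypothetical counts for illustration only (not the study's data):
# 22 of 23 masses detected gives sensitivity ~95.6%
m = diagnostic_metrics(tp=22, fp=0, fn=1, tn=5)
```

Note that with small denominators (here only 5 true negatives) the Wilson intervals become very wide, which is consistent with the broad CIs the abstract reports.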

  6. CalCOFI Egg Counts

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Fish egg counts and standardized counts for eggs captured in CalCOFI ichthyoplankton nets (primarily vertical [Calvet or Pairovet], oblique [bongo or ring nets], and...

  7. Gully Erosion Mapping and Monitoring at Multiple Scales Based on Multi-Source Remote Sensing Data of the Sancha River Catchment, Northeast China

    Directory of Open Access Journals (Sweden)

    Ranghu Wang

    2016-11-01

    This research is focused on gully erosion mapping and monitoring at multiple spatial scales using multi-source remote sensing data of the Sancha River catchment in Northeast China, where gullies extend over a vast area. A high-resolution satellite image (Pleiades 1A, 0.7 m) was used to obtain the spatial distribution of the gullies of the overall basin. Image visual interpretation with field verification was employed to map the geometric gully features and evaluate gully erosion as well as the topographic differentiation characteristics. Unmanned Aerial Vehicle (UAV) remote sensing data and the 3D photo-reconstruction method were employed for detailed gully mapping at a site scale. The results showed that: (1) the sub-meter image showed a strong ability in the recognition of various gully types and obtained satisfactory results, and the topographic factors of elevation, slope and slope aspects exerted significant influence on the gully spatial distribution at the catchment scale; and (2) at a more detailed site scale, UAV imagery combined with 3D photo-reconstruction provided a Digital Surface Model (DSM) and ortho-image at the centimeter level as well as a detailed 3D model. The resulting products revealed the area of agricultural utilization and its shaping by human agricultural activities and water erosion in detail, and also provided the gully volume. The present study indicates that using multi-source remote sensing data, including satellite and UAV imagery simultaneously, results in an effective assessment of gully erosion over multiple spatial scales. The combined approach should be continued to regularly monitor gully erosion to understand the erosion process and its relationship with the environment from a comprehensive perspective.

  8. Tank Information System (TIS): A Case Study in Migrating a Web Mapping Application from Flex to Dojo for ArcGIS Server and then to Open Source

    Science.gov (United States)

    Pulsani, B. R.

    2017-11-01

    Tank Information System is a web application which provides comprehensive information about minor irrigation tanks of Telangana State. As part of the program, a web mapping application using Flex and ArcGIS Server was developed to make the data available to the public. In course of time, as Flex became outdated, a migration of the client interface to the latest JavaScript-based technologies was carried out. Initially, the Flex-based application was migrated to the ArcGIS JavaScript API using the Dojo Toolkit. Both client applications used published services from ArcGIS Server. To check the migration pattern from proprietary to open source, the JavaScript-based ArcGIS application was later migrated to OpenLayers and the Dojo Toolkit, using published services from GeoServer. The migration pattern noticed in the study especially emphasizes the use of the Dojo Toolkit and a PostgreSQL database for ArcGIS Server so that migration to open source can be performed effortlessly. The current application provides a case study which could assist organizations in migrating their proprietary ArcGIS web applications to open source. Furthermore, the study reveals the cost benefits of adopting open source against commercial software.

  9. TANK INFORMATION SYSTEM (TIS): A CASE STUDY IN MIGRATING WEB MAPPING APPLICATION FROM FLEX TO DOJO FOR ARCGIS SERVER AND THEN TO OPEN SOURCE

    Directory of Open Access Journals (Sweden)

    B. R. Pulsani

    2017-11-01

    Tank Information System is a web application which provides comprehensive information about minor irrigation tanks of Telangana State. As part of the program, a web mapping application using Flex and ArcGIS Server was developed to make the data available to the public. In course of time, as Flex became outdated, a migration of the client interface to the latest JavaScript-based technologies was carried out. Initially, the Flex-based application was migrated to the ArcGIS JavaScript API using the Dojo Toolkit. Both client applications used published services from ArcGIS Server. To check the migration pattern from proprietary to open source, the JavaScript-based ArcGIS application was later migrated to OpenLayers and the Dojo Toolkit, using published services from GeoServer. The migration pattern noticed in the study especially emphasizes the use of the Dojo Toolkit and a PostgreSQL database for ArcGIS Server so that migration to open source can be performed effortlessly. The current application provides a case study which could assist organizations in migrating their proprietary ArcGIS web applications to open source. Furthermore, the study reveals the cost benefits of adopting open source against commercial software.

  10. Counting statistics in radioactivity measurements

    International Nuclear Information System (INIS)

    Martin, J.

    1975-01-01

    The application of statistical methods to radioactivity measurement problems is analyzed in several chapters devoted successively to: the statistical nature of radioactivity counts; the application to radioactive counting of two theoretical probability distributions, the Poisson distribution and the Laplace-Gauss law; true counting laws; corrections related to the nature of the apparatus; and statistical techniques in gamma spectrometry.
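The central fact this kind of treatment builds on, that the number of counts registered in a fixed interval follows Poisson's law so the variance equals the mean (and the standard deviation of a count N is ≈ √N), is easy to check by simulation. A sketch with invented rates, using Knuth's standard multiplicative sampler:

```python
import math
import random

def poisson(lam, rng):
    """Knuth's multiplicative Poisson sampler (fine for moderate lam)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

# Counts registered in repeated 10 s intervals from a 5 cps source
rng = random.Random(42)
lam = 5.0 * 10.0  # expected counts per interval
sample = [poisson(lam, rng) for _ in range(20000)]
mean = sum(sample) / len(sample)
var = sum((c - mean) ** 2 for c in sample) / len(sample)
# Poisson signature: variance ~= mean ~= lam, so sigma(N) ~= sqrt(N)
```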

  11. Protecting count queries in study design.

    Science.gov (United States)

    Vinterbo, Staal A; Sarwate, Anand D; Boxwala, Aziz A

    2012-01-01

    Today's clinical research institutions provide tools for researchers to query their data warehouses for counts of patients. To protect patient privacy, counts are perturbed before reporting; this compromises their utility for increased privacy. The goal of this study is to extend current query answer systems to guarantee a quantifiable level of privacy and allow users to tailor perturbations to maximize the usefulness according to their needs. A perturbation mechanism was designed in which users are given options with respect to scale and direction of the perturbation. The mechanism translates the true count, user preferences, and a privacy level within administrator-specified bounds into a probability distribution from which the perturbed count is drawn. Users can significantly impact the scale and direction of the count perturbation and can receive more accurate final cohort estimates. Strong and semantically meaningful differential privacy is guaranteed, providing for a unified privacy accounting system that can support role-based trust levels. This study provides an open source web-enabled tool to investigate visually and numerically the interaction between system parameters, including required privacy level and user preference settings. Quantifying privacy allows system administrators to provide users with a privacy budget and to monitor its expenditure, enabling users to control the inevitable loss of utility. While current measures of privacy are conservative, this system can take advantage of future advances in privacy measurement. The system provides new ways of trading off privacy and utility that are not provided in current study design systems.
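The paper's mechanism lets users tailor the scale and direction of the perturbation; the untailored baseline it generalizes is the Laplace mechanism, which for a counting query (sensitivity 1, since one patient changes the count by at most 1) adds Laplace(1/ε) noise to guarantee ε-differential privacy. A minimal sketch of that baseline only, not of the paper's tailored mechanism:

```python
import random

def dp_count(true_count, epsilon, rng):
    """Laplace mechanism for a counting query: the query has
    sensitivity 1, so adding Laplace(1/epsilon) noise yields
    epsilon-differential privacy. A Laplace draw is generated here
    as the difference of two independent Exp(1) draws."""
    noise = (rng.expovariate(1.0) - rng.expovariate(1.0)) / epsilon
    return max(0, round(true_count + noise))

rng = random.Random(7)
# Release the same (hypothetical) cohort count of 120 many times
released = [dp_count(120, epsilon=1.0, rng=rng) for _ in range(2000)]
avg = sum(released) / len(released)
# The noise is zero-mean, so repeated releases average out near the truth,
# while any single release only reveals the count up to Laplace noise.
```

Smaller ε means stronger privacy and wider noise, which is exactly the privacy/utility trade-off the abstract describes exposing to users as a budget.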

  12. Multi-dimensional water quality assessment of an urban drinking water source elucidated by high resolution underwater towed vehicle mapping.

    Science.gov (United States)

    Lock, Alan; Spiers, Graeme; Hostetler, Blair; Ray, James; Wallschläger, Dirk

    2016-04-15

    Spatial surveys of Ramsey Lake, Sudbury, Ontario water quality were conducted using an innovative underwater towed vehicle (UTV) equipped with a multi-parameter probe providing real-time water quality data. The UTV revealed underwater vent sites through high resolution monitoring of different spatial chemical characteristics using common sensors (turbidity, chloride, dissolved oxygen, and oxidation/reduction sensors) that would not be feasible with traditional water sampling methods. Multi-parameter probe vent site identification is supported by elevated alkalinity and silica concentrations at these sites. The identified groundwater vent sites appear to be controlled by bedrock fractures that transport water from different sources with different contaminants of concern. Elevated contaminants, such as, arsenic and nickel and/or nutrient concentrations are evident at the vent sites, illustrating the potential of these sources to degrade water quality. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. Modal Logics with Counting

    Science.gov (United States)

    Areces, Carlos; Hoffmann, Guillaume; Denis, Alexandre

    We present a modal language that includes explicit operators to count the number of elements that a model might include in the extension of a formula, and we discuss how this logic has been previously investigated under different guises. We show that the language is related to graded modalities and to hybrid logics. We illustrate a possible application of the language to the treatment of plural objects and queries in natural language. We investigate the expressive power of this logic via bisimulations, discuss the complexity of its satisfiability problem, define a new reasoning task that retrieves the cardinality bound of the extension of a given input formula, and provide an algorithm to solve it.

  14. Digital coincidence counting

    International Nuclear Information System (INIS)

    Buckman, S.M.; Ius, D.

    1996-01-01

    This paper reports on the development of a digital coincidence-counting system which comprises a custom-built data acquisition card and associated PC software. The system has been designed to digitise the pulse-trains from two radiation detectors at a rate of 20 MSamples/s with 12-bit resolution. Through hardware compression of the data, the system can continuously record both individual pulse-shapes and the time intervals between pulses. Software-based circuits are used to process the stored pulse trains. These circuits are constructed simply by linking together icons representing various components such as coincidence mixers, time delays, single-channel analysers, deadtimes and scalers. This system enables a pair of pulse trains to be processed repeatedly using any number of different methods. Some preliminary results are presented in order to demonstrate the versatility and efficiency of this new method. (orig.)
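The software "coincidence mixer" such a system applies to its stored pulse trains can be reduced, in its simplest form, to a two-pointer scan over the two recorded timestamp trains: an event from each detector counts as a coincidence when their times differ by no more than the resolving time. A toy sketch (the timestamps and resolving time below are invented, and real mixers also handle deadtime and multiple-hit ambiguity):

```python
def coincidences(t1, t2, tau):
    """Minimal software coincidence mixer: count pairs of events, one
    from each sorted timestamp train, whose times differ by at most the
    resolving time tau. Two-pointer scan, O(len(t1) + len(t2))."""
    i = j = n = 0
    while i < len(t1) and j < len(t2):
        dt = t1[i] - t2[j]
        if abs(dt) <= tau:
            n += 1          # coincidence: consume both events
            i += 1
            j += 1
        elif dt > tau:
            j += 1          # detector-2 event too early, advance it
        else:
            i += 1          # detector-1 event too early, advance it
    return n

# Invented timestamp trains (seconds) from two detectors, 1 us window
a = [0.10, 0.20, 0.30, 0.4000008, 0.55]
b = [0.1000005, 0.25, 0.40, 0.70]
n_coinc = coincidences(a, b, tau=1e-6)
```

Because the raw trains are stored, the same data can be re-scanned with different resolving times, which is the key advantage the paper claims over a fixed hardware circuit.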

  15. Digital coincidence counting

    Science.gov (United States)

    Buckman, S. M.; Ius, D.

    1996-02-01

    This paper reports on the development of a digital coincidence-counting system which comprises a custom-built data acquisition card and associated PC software. The system has been designed to digitise the pulse-trains from two radiation detectors at a rate of 20 MSamples/s with 12-bit resolution. Through hardware compression of the data, the system can continuously record both individual pulse-shapes and the time intervals between pulses. Software-based circuits are used to process the stored pulse trains. These circuits are constructed simply by linking together icons representing various components such as coincidence mixers, time delays, single-channel analysers, deadtimes and scalers. This system enables a pair of pulse trains to be processed repeatedly using any number of different methods. Some preliminary results are presented in order to demonstrate the versatility and efficiency of this new method.

  16. Mapping carbon storage in urban trees with multi-source remote sensing data: Relationships between biomass, land use, and demographics in Boston neighborhoods

    International Nuclear Information System (INIS)

    Raciti, Steve M.; Hutyra, Lucy R.; Newell, Jared D.

    2014-01-01

    High resolution maps of urban vegetation and biomass are powerful tools for policy-makers and community groups seeking to reduce rates of urban runoff, moderate urban heat island effects, and mitigate the effects of greenhouse gas emissions. We developed a very high resolution map of urban tree biomass, assessed the scale sensitivities in biomass estimation, compared our results with lower resolution estimates, and explored the demographic relationships in biomass distribution across the City of Boston. We integrated remote sensing data (including LiDAR-based tree height estimates) and field-based observations to map canopy cover and aboveground tree carbon storage at ∼ 1 m spatial scale. Mean tree canopy cover was estimated to be 25.5 ± 1.5% and carbon storage was 355 Gg (28.8 Mg C ha⁻¹) for the City of Boston. Tree biomass was highest in forest patches (110.7 Mg C ha⁻¹), but residential (32.8 Mg C ha⁻¹) and developed open (23.5 Mg C ha⁻¹) land uses also contained relatively high carbon stocks. In contrast with previous studies, we did not find significant correlations between tree biomass and the demographic characteristics of Boston neighborhoods, including income, education, race, or population density. The proportion of households that rent was negatively correlated with urban tree biomass (R² = 0.26, p = 0.04) and correlated with Priority Planting Index values (R² = 0.55, p = 0.001), potentially reflecting differences in land management among rented and owner-occupied residential properties. We compared our very high resolution biomass map to lower resolution biomass products from other sources and found that those products consistently underestimated biomass within urban areas. This underestimation became more severe as spatial resolution decreased. This research demonstrates that 1) urban areas contain considerable tree carbon stocks; 2) canopy cover and biomass may not be related to the demographic characteristics of Boston

  17. Mapping carbon storage in urban trees with multi-source remote sensing data: Relationships between biomass, land use, and demographics in Boston neighborhoods

    Energy Technology Data Exchange (ETDEWEB)

    Raciti, Steve M., E-mail: Steve.M.Raciti@Hofstra.edu [Department of Biology, Hofstra University, Gittleson Hall, Hempstead, NY 11549 (United States); Department of Earth and Environment, Boston University, 685 Commonwealth Ave., Boston, MA 02215 (United States); Hutyra, Lucy R.; Newell, Jared D. [Department of Earth and Environment, Boston University, 685 Commonwealth Ave., Boston, MA 02215 (United States)

    2014-12-01

    High resolution maps of urban vegetation and biomass are powerful tools for policy-makers and community groups seeking to reduce rates of urban runoff, moderate urban heat island effects, and mitigate the effects of greenhouse gas emissions. We developed a very high resolution map of urban tree biomass, assessed the scale sensitivities in biomass estimation, compared our results with lower resolution estimates, and explored the demographic relationships in biomass distribution across the City of Boston. We integrated remote sensing data (including LiDAR-based tree height estimates) and field-based observations to map canopy cover and aboveground tree carbon storage at ∼ 1 m spatial scale. Mean tree canopy cover was estimated to be 25.5 ± 1.5% and carbon storage was 355 Gg (28.8 Mg C ha⁻¹) for the City of Boston. Tree biomass was highest in forest patches (110.7 Mg C ha⁻¹), but residential (32.8 Mg C ha⁻¹) and developed open (23.5 Mg C ha⁻¹) land uses also contained relatively high carbon stocks. In contrast with previous studies, we did not find significant correlations between tree biomass and the demographic characteristics of Boston neighborhoods, including income, education, race, or population density. The proportion of households that rent was negatively correlated with urban tree biomass (R² = 0.26, p = 0.04) and correlated with Priority Planting Index values (R² = 0.55, p = 0.001), potentially reflecting differences in land management among rented and owner-occupied residential properties. We compared our very high resolution biomass map to lower resolution biomass products from other sources and found that those products consistently underestimated biomass within urban areas. This underestimation became more severe as spatial resolution decreased. This research demonstrates that 1) urban areas contain considerable tree carbon stocks; 2) canopy cover and biomass may not be related to the demographic

  18. Mapping of the flux and estimate of the radiation source term of neutron fields generated by the GE PETtrace-8 cyclotron

    International Nuclear Information System (INIS)

    Benavente Castillo, Jhonny Antonio

    2017-01-01

    The use of spectrometric techniques in a cyclotron facility is strongly advised for the complete characterization of the neutron radiation field. In recent years, several studies of neutron spectrometry have been carried out at the Cyclotron of the Development Center of Nuclear Technology (CDTN). The main objective of this work is to propose a methodology for mapping of the flux and estimate of the radiation source term of neutron fields generated by the GE PETtrace-8 cyclotron. The method of neutron activation analysis with gold, indium and nickel activation foils was used to measure the activities induced at specific points in the cyclotron bunker. The irradiations of the activation foils were performed using the intermittent irradiation method to optimize the radiation field during ¹⁸F production. The study of the neutron spectrum was performed using three radiation source terms. The first source term was constructed based on data provided by the cyclotron manufacturer using the neutron cross sections of the ENDF/B-VII library. The other two were proposed considering the irradiation process used in the routine of ¹⁸F production. Both radiation source terms used the LA150H proton cross sections, and for ¹⁸O the cross sections of the physical model CEM03 (Cascade-exciton model) and of TENDL (TALYS-based Evaluated Nuclear Data Library) were used. The results of the source terms in relation to the experimental results, in terms of neutron fluence rates, reaction rates and dose equivalent rates, showed that they are of the same order of magnitude as those obtained by Ogata et al., Fujibuchi et al., and Gallerani et al. for the same cyclotron, and by Mendez et al. for a cyclotron from a different manufacturer. The models of the proposed radiation source terms were validated to obtain the spectra generated during ¹⁸F production when water enriched in ¹⁸O is bombarded with a proton beam of 16.5 MeV.
Finally, the model of the LA150H - TENDL - 2015 radiation source term is
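The activation-foil method above rests on the activation equation: after irradiation time t in a flux φ, a foil with N target atoms of cross section σ carries activity A(t) = N·σ·φ·(1 − e^(−λt)). A sketch with nominal round numbers (the foil mass, flux and irradiation time are illustrative, not the study's values; gold's thermal capture cross section ≈ 98.7 b and the 2.695 d Au-198 half-life are textbook figures):

```python
import math

BARN = 1e-24  # cm^2

def induced_activity(n_atoms, sigma_cm2, flux, t_irr, half_life):
    """Activation equation A(t) = N * sigma * phi * (1 - exp(-lambda*t)):
    induced activity (Bq) of a foil after irradiation time t_irr (s)
    in a flux phi (n cm^-2 s^-1)."""
    lam = math.log(2) / half_life
    return n_atoms * sigma_cm2 * flux * (1 - math.exp(-lam * t_irr))

# Nominal illustration: a 100 mg gold foil, thermal 197Au(n,g)198Au
# cross section ~98.7 b, Au-198 half-life 2.695 d, flux 1e6 n/cm^2/s,
# 1 h irradiation
N_AVOGADRO = 6.022e23
n_au = 0.100 / 196.97 * N_AVOGADRO
activity = induced_activity(n_au, 98.7 * BARN, 1e6, 3600.0, 2.695 * 86400)
```

Inverting this relation for the measured foil activity is what turns the counted activity into a neutron fluence rate at the foil position.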

  19. Counting statistics in low level radioactivity measurements with fluctuating counting efficiency

    International Nuclear Information System (INIS)

    Pazdur, M.F.

    1976-01-01

    A divergence between the probability distribution of the number of nuclear disintegrations and the number of observed counts, caused by counting efficiency fluctuation, is discussed. The negative binomial distribution is proposed to describe the probability distribution of the number of counts, instead of the Poisson distribution, which is assumed to hold for the number of nuclear disintegrations only. From actual measurements the r.m.s. amplitude of counting efficiency fluctuation is estimated. Some consequences of counting efficiency fluctuation are investigated and the corresponding formulae are derived: (1) for the detection limit as a function of the number of partial measurements and the relative amplitude of counting efficiency fluctuation, and (2) for the optimum allocation of the number of partial measurements between sample and background. (author)
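The abstract's central point, that a fluctuating efficiency makes observed counts overdispersed relative to Poisson and so motivates a negative-binomial description, follows from the variance decomposition Var(k) = E[Var(k|ε)] + Var(E[k|ε]) = λε̄ + (λε̄s)², where s is the relative r.m.s. efficiency fluctuation. It can be checked by simulation (all rates below are invented):

```python
import math
import random

def poisson(lam, rng):
    """Knuth's multiplicative Poisson sampler."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

# Invented rates: lam disintegrations per measurement, detected with an
# efficiency fluctuating around eff_mean with relative r.m.s. rel_sd
rng = random.Random(1)
lam, eff_mean, rel_sd, trials = 100.0, 0.5, 0.1, 10000
obs = []
for _ in range(trials):
    eff = min(1.0, max(0.0, rng.gauss(eff_mean, rel_sd * eff_mean)))
    n = poisson(lam, rng)                      # true disintegrations
    obs.append(sum(1 for _ in range(n) if rng.random() < eff))
mean = sum(obs) / trials
var = sum((x - mean) ** 2 for x in obs) / trials
# Expected: Var(k) = lam*eff_mean + (lam*eff_mean*rel_sd)**2 = 50 + 25,
# i.e. clearly overdispersed relative to the Poisson value of 50
```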

  20. Set of counts by scintillations for atmospheric samplings

    International Nuclear Information System (INIS)

    Appriou, D.; Doury, A.

    1962-01-01

    The authors report the development of a scintillation-based counting assembly with the following characteristics: a photomultiplier with a wide photocathode, a thin plastic scintillator for beta + alpha counting (with the possibility of mounting an alpha scintillator), an intrinsic background that is relatively small with respect to the activities to be counted, and a weakly varying efficiency. The authors discuss the counting objective and present equipment tests (counter, proportional amplifier and pre-amplifier, input drawer). They describe the operation of the apparatus, discuss the selection of scintillators, report the study of the intrinsic background (electron-induced background noise, total background noise, background noise reduction), and discuss counts (influence of the external source, sensitivity to alpha radiation, counting homogeneity, minimum detectable activity) and efficiencies.

  1. a Free and Open Source Tool to Assess the Accuracy of Land Cover Maps: Implementation and Application to Lombardy Region (italy)

    Science.gov (United States)

    Bratic, G.; Brovelli, M. A.; Molinari, M. E.

    2018-04-01

    The availability of thematic maps has significantly increased over the last few years. Validation of these maps is a key factor in assessing their suitability for different applications. The evaluation of the accuracy of classified data is carried out through a comparison with a reference dataset and the generation of a confusion matrix from which many quality indexes can be derived. In this work, an ad hoc free and open source Python tool was implemented to automatically compute all the confusion matrix-derived accuracy indexes proposed in the literature. The tool was integrated into the GRASS GIS environment and successfully applied to evaluate the quality of three high-resolution global datasets (GlobeLand30, Global Urban Footprint, Global Human Settlement Layer Built-Up Grid) in the Lombardy Region area (Italy). In addition to the most commonly used accuracy measures, e.g. overall accuracy and Kappa, the tool allowed to compute and investigate less known indexes such as the Ground Truth and the Classification Success Index. The promising tool will be further extended with spatial autocorrelation analysis functions and made available to the researcher and user community.
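The core of any such validation tool is the confusion matrix and the indexes derived from it; the two measures the abstract names first, overall accuracy and Cohen's kappa, reduce to a few lines. A sketch (the 2×2 matrix is a made-up example, not Lombardy data):

```python
def accuracy_indexes(cm):
    """Overall accuracy and Cohen's kappa from a square confusion
    matrix, cm[i][j] = pixels of reference class i mapped to class j."""
    k = len(cm)
    n = sum(sum(row) for row in cm)
    diag = sum(cm[i][i] for i in range(k))
    oa = diag / n
    # expected chance agreement from the row/column marginals
    pe = sum(sum(cm[i]) * sum(row[i] for row in cm) for i in range(k)) / n**2
    kappa = (oa - pe) / (1 - pe)
    return oa, kappa

# Made-up 2-class matrix (e.g. built-up vs. not built-up)
cm = [[90, 10],
      [20, 80]]
oa, kappa = accuracy_indexes(cm)  # -> oa = 0.85, kappa = 0.70
```

Kappa discounts the agreement expected by chance from the marginals, which is why it is lower than overall accuracy whenever the class distribution is not degenerate.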

  2. Seeing voices: High-density electrical mapping and source-analysis of the multisensory mismatch negativity evoked during the McGurk illusion.

    Science.gov (United States)

    Saint-Amour, Dave; De Sanctis, Pierfilippo; Molholm, Sophie; Ritter, Walter; Foxe, John J

    2007-02-01

    Seeing a speaker's facial articulatory gestures powerfully affects speech perception, helping us overcome noisy acoustical environments. One particularly dramatic illustration of visual influences on speech perception is the "McGurk illusion", where dubbing an auditory phoneme onto video of an incongruent articulatory movement can often lead to illusory auditory percepts. This illusion is so strong that even in the absence of any real change in auditory stimulation, it activates the automatic auditory change-detection system, as indexed by the mismatch negativity (MMN) component of the auditory event-related potential (ERP). We investigated the putative left hemispheric dominance of McGurk-MMN using high-density ERPs in an oddball paradigm. Topographic mapping of the initial McGurk-MMN response showed a highly lateralized left hemisphere distribution, beginning at 175 ms. Subsequently, scalp activity was also observed over bilateral fronto-central scalp with a maximal amplitude at approximately 290 ms, suggesting later recruitment of right temporal cortices. Strong left hemisphere dominance was again observed during the last phase of the McGurk-MMN waveform (350-400 ms). Source analysis indicated bilateral sources in the temporal lobe just posterior to primary auditory cortex. While a single source in the right superior temporal gyrus (STG) accounted for the right hemisphere activity, two separate sources were required, one in the left transverse gyrus and the other in STG, to account for left hemisphere activity. These findings support the notion that visually driven multisensory illusory phonetic percepts produce an auditory-MMN cortical response and that left hemisphere temporal cortex plays a crucial role in this process.

  3. Let's Make Data Count

    Science.gov (United States)

    Budden, A. E.; Abrams, S.; Chodacki, J.; Cruse, P.; Fenner, M.; Jones, M. B.; Lowenberg, D.; Rueda, L.; Vieglais, D.

    2017-12-01

    The impact of research has traditionally been measured by citations to journal publications and used extensively for evaluation and assessment in academia, but this process misses the impact and reach of data and software as first-class scientific products. For traditional publications, Article-Level Metrics (ALM) capture the multitude of ways in which research is disseminated and used, such as references and citations within social media and other journal articles. Here we present on the extension of usage and citation metrics collection to include other artifacts of research, namely datasets. The Make Data Count (MDC) project will enable measuring the impact of research data in a manner similar to what is currently done with publications. Data-level metrics (DLM) are a multidimensional suite of indicators measuring the broad reach and use of data as legitimate research outputs. By making data metrics openly available for reuse in a number of different ways, the MDC project represents an important first step on the path towards the full integration of data metrics into the research data management ecosystem. By assuring researchers that their contributions to scholarly progress represented by data corpora are acknowledged, data level metrics provide a foundation for streamlining the advancement of knowledge by actively promoting desirable best practices regarding research data management, publication, and sharing.

  4. LAWRENCE RADIATION LABORATORY COUNTING HANDBOOK

    Energy Technology Data Exchange (ETDEWEB)

    Group, Nuclear Instrumentation

    1966-10-01

    The Counting Handbook is a compilation of operational techniques and performance specifications on counting equipment in use at the Lawrence Radiation Laboratory, Berkeley. Counting notes have been written from the viewpoint of the user rather than that of the designer or maintenance man. The only maintenance instructions that have been included are those that can easily be performed by the experimenter to assure that the equipment is operating properly.

  5. Mapping carbon storage in urban trees with multi-source remote sensing data: relationships between biomass, land use, and demographics in Boston neighborhoods.

    Science.gov (United States)

    Raciti, Steve M; Hutyra, Lucy R; Newell, Jared D

    2014-12-01

    High resolution maps of urban vegetation and biomass are powerful tools for policy-makers and community groups seeking to reduce rates of urban runoff, moderate urban heat island effects, and mitigate the effects of greenhouse gas emissions. We developed a very high resolution map of urban tree biomass, assessed the scale sensitivities in biomass estimation, compared our results with lower resolution estimates, and explored the demographic relationships in biomass distribution across the City of Boston. We integrated remote sensing data (including LiDAR-based tree height estimates) and field-based observations to map canopy cover and aboveground tree carbon storage at ~1 m spatial scale. Mean tree canopy cover was estimated to be 25.5 ± 1.5% and carbon storage was 355 Gg (28.8 Mg C ha⁻¹) for the City of Boston. Tree biomass was highest in forest patches (110.7 Mg C ha⁻¹), but residential (32.8 Mg C ha⁻¹) and developed open (23.5 Mg C ha⁻¹) land uses also contained relatively high carbon stocks. In contrast with previous studies, we did not find significant correlations between tree biomass and the demographic characteristics of Boston neighborhoods, including income, education, race, or population density. The proportion of households that rent was negatively correlated with urban tree biomass (R² = 0.26, p = 0.04) and correlated with Priority Planting Index values (R² = 0.55, p = 0.001), potentially reflecting differences in land management among rented and owner-occupied residential properties. We compared our very high resolution biomass map to lower resolution biomass products from other sources and found that those products consistently underestimated biomass within urban areas. This underestimation became more severe as spatial resolution decreased. This research demonstrates that 1) urban areas contain considerable tree carbon stocks; 2) canopy cover and biomass may not be related to the demographic characteristics of Boston neighborhoods; and 3) that recent advances

  6. Mapping Carbon Storage in Urban Trees with Multi-source Remote Sensing Data: Relationships between Biomass, Land Use, and Demographics in Boston Neighborhoods

    Science.gov (United States)

    Raciti, S. M.; Hutyra, L.

    2014-12-01

    High resolution maps of urban vegetation and biomass are powerful tools for policy-makers and community groups seeking to reduce rates of urban runoff, moderate urban heat island effects, and mitigate the effects of greenhouse gas emissions. We developed a very high resolution map of urban tree biomass, assessed the scale sensitivities in biomass estimation, compared our results with lower resolution estimates, and explored the demographic relationships in biomass distribution across the City of Boston. We integrated remote sensing data (including LiDAR-based tree height estimates) and field-based observations to map canopy cover and aboveground tree carbon storage at ~1 m spatial scale. Mean tree canopy cover was estimated to be 25.5±1.5% and carbon storage was 355 Gg (28.8 Mg C ha-1) for the City of Boston. Tree biomass was highest in forest patches (110.7 Mg C ha-1), but residential (32.8 Mg C ha-1) and developed open (23.5 Mg C ha-1) land uses also contained relatively high carbon stocks. In contrast with previous studies, we did not find significant correlations between tree biomass and the demographic characteristics of Boston neighborhoods, including income, education, race, or population density. The proportion of households that rent was negatively correlated with urban tree biomass (R2=0.26, p=0.04) and correlated with Priority Planting Index values (R2=0.55, p=0.001), potentially reflecting differences in land management between rented and owner-occupied residential properties. We compared our very high resolution biomass map to lower resolution biomass products from other sources and found that those products consistently underestimated biomass within urban areas. This underestimation became more severe as spatial resolution decreased. This research demonstrates that 1) urban areas contain considerable tree carbon stocks; 2) canopy cover and biomass may not be related to the demographic characteristics of Boston neighborhoods; and 3) that recent advances in

  7. SUMS Counts-Related Projects

    Data.gov (United States)

    Social Security Administration — Staging Instance for all SUMs Counts related projects including: Redeterminations/Limited Issue, Continuing Disability Resolution, CDR Performance Measures, Initial...

  8. Liquid scintillation counting

    International Nuclear Information System (INIS)

    Bunge, F.

    1986-01-01

    The methods applied for quench corrections in LSC rely on radioactivity standard sources as reference materials. For determining the quenching effect, there are methods based on the sample channel ratio, the spectral quench parameter, or the H-number. Currently available LSC equipment is briefly described. (DG) [de

  9. Compton suppression gamma-counting: The effect of count rate

    Science.gov (United States)

    Millard, H.T.

    1984-01-01

    Past research has shown that anti-coincidence shielded Ge(Li) spectrometers enhance the signal-to-background ratios for gamma-photopeaks situated on high Compton backgrounds. Ordinarily, an anti- or non-coincidence spectrum (A) and a coincidence spectrum (C) are collected simultaneously with these systems. To be useful in neutron activation analysis (NAA), the fractions of the photopeak counts routed to the two spectra must be constant from sample to sample, or the variations must be corrected quantitatively. Most Compton suppression counting has been done at low count rates, but in NAA applications, count rates may be much higher. To operate over the wider dynamic range, the effect of count rate on the ratio of the photopeak counts in the two spectra (A/C) was studied. It was found that as the count rate increases, A/C decreases for gammas not coincident with other gammas from the same decay. For gammas coincident with other gammas, A/C increases to a maximum and then decreases. These results suggest that calibration curves are required to correct photopeak areas so that quantitative data can be obtained at higher count rates.

  10. Mapping out Map Libraries

    Directory of Open Access Journals (Sweden)

    Ferjan Ormeling

    2008-09-01

    Full Text Available Discussing the requirements for map data quality, map users and their library/archives environment, the paper focuses on the metadata the user would need for a correct and efficient interpretation of the map data. For such a correct interpretation, knowledge of the rules and guidelines according to which the topographers/cartographers work (such as the kinds of data categories to be collected), and the degree to which these rules and guidelines were indeed followed, is essential. This is not only valid for the old maps stored in our libraries and archives, but perhaps even more so for the new digital files, as these are the format in which we now have to access our geospatial data. As this would be too much to ask from map librarians/curators, some sort of web 2.0 environment is sought where comments about data quality, completeness and up-to-dateness from knowledgeable map users, regarding the specific maps or map series studied, can be collected and tagged to scanned versions of these maps on the web. In order not to be subject to the same disadvantages as Wikipedia, where the ‘communis opinio’, rather than scholarship, seems to be decisive, some checking by map curators of this tagged map-use information would still be needed. Cooperation to this end between map curators and the International Cartographic Association (ICA) map and spatial data use commission is suggested.

  11. The Chandra Source Catalog: Algorithms

    Science.gov (United States)

    McDowell, Jonathan; Evans, I. N.; Primini, F. A.; Glotfelty, K. J.; McCollough, M. L.; Houck, J. C.; Nowak, M. A.; Karovska, M.; Davis, J. E.; Rots, A. H.; Siemiginowska, A. L.; Hain, R.; Evans, J. D.; Anderson, C. S.; Bonaventura, N. R.; Chen, J. C.; Doe, S. M.; Fabbiano, G.; Galle, E. C.; Gibbs, D. G., II; Grier, J. D.; Hall, D. M.; Harbo, P. N.; He, X.; Lauer, J.; Miller, J. B.; Mitschang, A. W.; Morgan, D. L.; Nichols, J. S.; Plummer, D. A.; Refsdal, B. L.; Sundheim, B. A.; Tibbetts, M. S.; van Stone, D. W.; Winkelman, S. L.; Zografou, P.

    2009-09-01

    Creation of the Chandra Source Catalog (CSC) required adjustment of existing pipeline processing, adaptation of existing interactive analysis software for automated use, and development of entirely new algorithms. Data calibration was based on the existing pipeline, but more rigorous data cleaning was applied and the latest calibration data products were used. For source detection, a local background map was created including the effects of ACIS source readout streaks. The existing wavelet source detection algorithm was modified and a set of post-processing scripts used to correct the results. To analyse the source properties, we ran the SAOTrace ray-trace code for each source to generate a model point spread function, allowing us to find encircled energy correction factors and estimate source extent. Further algorithms were developed to characterize the spectral, spatial and temporal properties of the sources and to estimate the confidence intervals on count rates and fluxes. Finally, sources detected in multiple observations were matched, and best estimates of their merged properties derived. In this paper we present an overview of the algorithms used, with more detailed treatment of some of the newly developed algorithms presented in companion papers.

  12. Shallow Groundwater Temperatures and the Urban Heat Island Effect: the First U.K City-wide Geothermal Map to Support Development of Ground Source Heating Systems Strategy

    Science.gov (United States)

    Patton, Ashley M.; Farr, Gareth J.; Boon, David P.; James, David R.; Williams, Bernard; Newell, Andrew J.

    2015-04-01

    The first UK city-wide heat map is described, based on measurements of groundwater from a shallow superficial aquifer in the coastal city of Cardiff, Wales, UK. The UK Government has a target of reducing greenhouse gas emissions by 80% by 2050 (Climate Change Act 2008), and low carbon technologies are key to achieving this. To support the use of ground source heating, we characterised the shallow heat potential of an urban aquifer to produce a baseline dataset intended to inform developers and to underpin planning and regulation. We exploited an existing network of 168 groundwater monitoring boreholes across the city, recording the water temperature in each borehole at 1 m depth intervals down to a depth of 20 m. We recorded groundwater temperatures during the coldest part of 2014, and repeat profiling of the boreholes in different seasons has added a fourth dimension to our results and allowed us to characterise the maximum depth of seasonal temperature fluctuation. The temperature profiles were used to create a 3D model of heat potential within the aquifer using GOCAD®, and the average borehole temperatures were contoured using Surfer® 10 to generate a 2D thermal resource map to support prospective assessment of urban ground source heat pumps. The average groundwater temperature in Cardiff was found to be above the average for England and Wales (11.3°C), with 90% of boreholes exceeding this figure by up to 4°C. The subsurface temperature profiles were also found to be higher than forecast by the predicted geothermal gradient for the area. Potential heat sources include conduction from buildings, basements and sub-surface infrastructure; the insulating effects of the urban area and of the geology; and convection from leaking sewers. Other factors include recharge inhibition by drains, localised confinement, and rock-water interaction in specific geology. It is likely to be a combination of multiple factors which we are hoping

  13. Manifold-Based Visual Object Counting.

    Science.gov (United States)

    Wang, Yi; Zou, Yuexian; Wang, Wenwu

    2018-07-01

    Visual object counting (VOC) is an emerging area in computer vision which aims to estimate the number of objects of interest in a given image or video. Recently, object-density-based estimation methods have been shown to be promising for object counting as well as for rough instance localization. However, the performance of these methods tends to degrade when dealing with new objects and scenes. To address this limitation, we propose a manifold-based method for visual object counting (M-VOC), based on the manifold assumption that similar image patches share similar object densities. First, the local geometry of a given image patch is represented linearly by its neighbors using a predefined patch training set, and the object density of the patch is reconstructed by preserving the local geometry using locally linear embedding. To improve the characterization of local geometry, additional constraints such as sparsity and non-negativity are also considered via regularization, nonlinear mapping, and the kernel trick. Compared with state-of-the-art VOC methods, our proposed M-VOC methods achieve competitive performance on seven benchmark datasets. Experiments verify that the proposed M-VOC methods have several favorable properties, such as robustness to variation in the size of the training dataset and in image resolution, as often encountered in real-world VOC applications.
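The locally-linear-embedding step described above can be sketched numerically: a query patch's density is reconstructed as a weighted combination of its nearest training patches, with weights that best reconstruct the patch's feature vector under a sum-to-one constraint. This is a simplified illustration of the manifold assumption, not the authors' implementation; the function name, the k-nearest-neighbor selection, and the trace-based regularizer are assumptions of this sketch:

```python
import numpy as np

def lle_density_estimate(x, train_feats, train_densities, k=5, reg=1e-3):
    """Estimate the object density of a patch with feature vector x as a
    locally linear combination of its k nearest training patches."""
    # Find the k nearest neighbors of x in feature space
    d2 = np.sum((train_feats - x) ** 2, axis=1)
    idx = np.argsort(d2)[:k]
    Z = train_feats[idx] - x                # neighbors shifted to the query
    C = Z @ Z.T                             # local Gram (covariance) matrix
    C += reg * np.trace(C) * np.eye(k)      # regularize for stability
    w = np.linalg.solve(C, np.ones(k))      # standard LLE weight solve
    w /= w.sum()                            # enforce the sum-to-one constraint
    return w @ train_densities[idx]         # combine neighbor densities
```

Because the weights sum to one and reconstruct the query feature, any density that varies (locally) affinely with the features is reproduced by the same weights, which is the essence of the manifold assumption.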

  14. Track counting in radon dosimetry

    International Nuclear Information System (INIS)

    Fesenbeck, Ingo; Koehler, Bernd; Reichert, Klaus-Martin

    2013-01-01

    The newly developed, computer-controlled track counting system is capable of imaging and analyzing the entire area of nuclear track detectors. The high optical resolution allows a new analysis approach to automated counting using digital image processing technologies. In this way, more highly exposed detectors can also be evaluated reliably by an automated process. (orig.)

  15. The application of GBS markers for extending the dense genetic map of rye (Secale cereale L.) and the localization of the Rfc1 gene restoring male fertility in plants with the C source of sterility-inducing cytoplasm.

    Science.gov (United States)

    Milczarski, Paweł; Hanek, Monika; Tyrka, Mirosław; Stojałowski, Stefan

    2016-11-01

    Genotyping by sequencing (GBS) is an efficient method of genotyping in numerous plant species. One of the crucial steps toward the application of GBS markers in crop improvement is anchoring them on particular chromosomes. In rye (Secale cereale L.), chromosomal localization of GBS markers has not yet been reported. In this paper, the application of GBS markers generated by the DArTseq platform for extending the high-density map of rye is presented. Additionally, they are applied to the localization of the Rfc1 gene, which restores male fertility in plants with the C source of sterility-inducing cytoplasm. The total number of markers anchored on the current version of the map is 19,081, of which 18,132 were obtained from the DArTseq platform. Numerous markers co-segregated within the studied mapping population, so, finally, only 3397 unique positions were located on the map of all seven rye chromosomes. The total length of the map is 1593 cM and the average distance between markers is 0.47 cM. Although the resolution of the map is not very high, it should be a useful tool for further studies of the Secale cereale genome because of the presence of numerous GBS markers anchored for the first time on rye chromosomes. The Rfc1 gene was located on high-density maps of the long arm of the 4R chromosome obtained for two mapping populations. Genetic maps were composed of DArT, DArTseq, and PCR-based markers. Consistent mapping results were obtained, and DArTs tightly linked to the Rfc1 gene were successfully applied to the development of six new PCR-based markers useful in marker-assisted selection.

  16. Galaxy number counts: Pt. 2

    International Nuclear Information System (INIS)

    Metcalfe, N.; Shanks, T.; Fong, R.; Jones, L.R.

    1991-01-01

    Using the Prime Focus CCD Camera at the Isaac Newton Telescope we have determined the form of the B and R galaxy number-magnitude count relations in 12 independent fields for 21 m ccd m and 19 m ccd m 5. The average galaxy count relations lie in the middle of the wide range previously encompassed by photographic data. The field-to-field variation of the counts is small enough to define the faint (B m 5) galaxy count to ±10 per cent and this variation is consistent with that expected from galaxy clustering considerations. Our new data confirm that the B, and also the R, galaxy counts show evidence for strong galaxy luminosity evolution, and that the majority of the evolving galaxies are of moderately blue colour. (author)

  17. Hanford whole body counting manual

    International Nuclear Information System (INIS)

    Palmer, H.E.; Brim, C.P.; Rieksts, G.A.; Rhoads, M.C.

    1987-05-01

    This document, a reprint of the Whole Body Counting Manual, was compiled to train personnel, document operation procedures, and outline quality assurance procedures. The current manual contains information on: the location, availability, and scope of services of Hanford's whole body counting facilities; the administrative aspect of the whole body counting operation; Hanford's whole body counting facilities; the step-by-step procedure involved in the different types of in vivo measurements; the detectors, preamplifiers and amplifiers, and spectroscopy equipment; the quality assurance aspect of equipment calibration and recordkeeping; data processing, record storage, results verification, report preparation, count summaries, and unit cost accounting; and the topics of minimum detectable amount and measurement accuracy and precision. 12 refs., 13 tabs

  18. Linking high-resolution geomorphic mapping, sediment sources, and channel types in a formerly glaciated basin of northeastern Alto-Adige/Sudtirol, Italy

    Science.gov (United States)

    Brardinoni, F.; Perina, E.; Bonfanti, G.; Falsitta, G.; Agliardi, F.

    2012-04-01

    To characterize channel-network morphodynamics and response potential in the Gadria-Strimm basin (14.8 km^2), we conduct a concerted effort entailing: (i) high-resolution mapping of landforms, channel reaches, and sediment sources; and (ii) reconstruction of the historical evolution of colluvial channel disturbance through sequential aerial photosets (1945-59-69-82-90-00-06-11). The mapping was carried out via stereographic inspection of aerial photographs, examination of 2.5-m gridded DTMs and DSMs, and extensive field work. The study area is a formerly glaciated basin characterized by peculiar landform assemblages imposed by a combination of tectonic and glacial first-order structures. The most striking feature in Strimm Creek is a structurally-controlled valley step separating an upper hanging valley, dominated by periglacial and fluvial processes, from a V-notched lower part in which lateral colluvial channels are directly connected to Strimm's main stem. In Gadria Creek, massive kame terraces located in proximity to the headwaters provide virtually unlimited sediment supply to frequent debris-flow activity, making this sub-catchment an ideal site for monitoring, and hence studying, the mechanics of these processes. Preliminary results point to a high spatial variability of the colluvial channel network, in which some sub-sectors have remained consistently active during the study period while others have become progressively dormant, with notable forest re-growth. In an attempt to link sediment flux to topography and substrate type, future work will involve photogrammetric analysis across the sequential aerial photosets as well as a morphometric/geomechanical characterization of the surficial materials.

  19. A Method for Counting Moving People in Video Surveillance Videos

    Directory of Open Access Journals (Sweden)

    Mario Vento

    2010-01-01

    Full Text Available People counting is an important problem in video surveillance applications. This problem has been faced either by trying to detect people in the scene and then counting them or by establishing a mapping between some scene feature and the number of people (avoiding the complex detection problem). This paper presents a novel method, following this second approach, that is based on the use of SURF features and of an ϵ-SVR regressor to provide an estimate of this count. The algorithm takes specifically into account problems due to partial occlusions and to perspective. In the experimental evaluation, the proposed method has been compared with the algorithm by Albiol et al., winner of the PETS 2009 contest on people counting, using the same PETS 2009 database. The provided results confirm that the proposed method yields an improved accuracy, while retaining the robustness of Albiol's algorithm.

  20. A Method for Counting Moving People in Video Surveillance Videos

    Directory of Open Access Journals (Sweden)

    Conte Donatello

    2010-01-01

    Full Text Available People counting is an important problem in video surveillance applications. This problem has been faced either by trying to detect people in the scene and then counting them or by establishing a mapping between some scene feature and the number of people (avoiding the complex detection problem). This paper presents a novel method, following this second approach, that is based on the use of SURF features and of an ϵ-SVR regressor to provide an estimate of this count. The algorithm takes specifically into account problems due to partial occlusions and to perspective. In the experimental evaluation, the proposed method has been compared with the algorithm by Albiol et al., winner of the PETS 2009 contest on people counting, using the same PETS 2009 database. The provided results confirm that the proposed method yields an improved accuracy, while retaining the robustness of Albiol's algorithm.

  1. A Method for Counting Moving People in Video Surveillance Videos

    Science.gov (United States)

    Conte, Donatello; Foggia, Pasquale; Percannella, Gennaro; Tufano, Francesco; Vento, Mario

    2010-12-01

    People counting is an important problem in video surveillance applications. This problem has been faced either by trying to detect people in the scene and then counting them or by establishing a mapping between some scene feature and the number of people (avoiding the complex detection problem). This paper presents a novel method, following this second approach, that is based on the use of SURF features and of an ϵ-SVR regressor to provide an estimate of this count. The algorithm takes specifically into account problems due to partial occlusions and to perspective. In the experimental evaluation, the proposed method has been compared with the algorithm by Albiol et al., winner of the PETS 2009 contest on people counting, using the same PETS 2009 database. The provided results confirm that the proposed method yields an improved accuracy, while retaining the robustness of Albiol's algorithm.

  2. Accuracy in activation analysis: count rate effects

    International Nuclear Information System (INIS)

    Lindstrom, R.M.; Fleming, R.F.

    1980-01-01

    The accuracy inherent in activation analysis is ultimately limited by the uncertainty of counting statistics. When careful attention is paid to detail, several workers have shown that all systematic errors can be reduced to an insignificant fraction of the total uncertainty, even when the statistical limit is well below one percent. A matter of particular importance is the reduction of errors due to high counting rates. The loss of counts due to random coincidence (pulse pileup) in the amplifier and to digitization time in the ADC may be treated as a series combination of extending and non-extending dead times, respectively. The two effects are experimentally distinct. Live timer circuits in commercial multi-channel analyzers compensate properly for ADC dead time for long-lived sources, but not for pileup. Several satisfactory solutions are available, including pileup rejection and dead time correction circuits, loss-free ADCs, and computed corrections in a calibrated system. These methods are sufficiently reliable and well understood that a decaying source can be measured routinely with acceptably small errors at a dead time as high as 20 percent.
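The two dead-time models mentioned above have standard closed-form rate relations, which a computed correction can apply directly. A minimal sketch (the function names and the 10 µs dead time are illustrative assumptions, not values from the paper):

```python
import math

def true_rate_nonextending(measured, tau):
    # Non-extending (non-paralyzable) dead time, e.g. ADC digitization:
    #   n = m / (1 - m * tau)
    return measured / (1.0 - measured * tau)

def measured_rate_extending(true, tau):
    # Extending (paralyzable) dead time, e.g. pulse pileup in the amplifier:
    #   m = n * exp(-n * tau)
    return true * math.exp(-true * tau)

# Illustrative numbers: a 10 microsecond dead time at a measured rate of
# 20,000 counts/s gives a 20% dead-time fraction (m * tau = 0.2), so the
# corrected true rate is 25,000 counts/s.
tau = 10e-6
m = 20_000.0
n = true_rate_nonextending(m, tau)
```

The extending relation cannot be inverted in closed form, which is one reason real systems prefer hardware pileup rejection or live-time correction over pure computation.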

  3. Geophysical Modelling and Multi-Scale Studies in the Arctic Seiland Igneous Province: Millimeter to Micrometer Scale Mapping of the Magnetic Sources by High Resolution Magnetic Microscopy

    Science.gov (United States)

    Pastore, Z.; Church, N. S.; McEnroe, S. A.; Oda, H.; ter Maat, G. W.

    2017-12-01

    Rock samples can have a wide range of magnetic properties depending on composition, the amount of ferromagnetic minerals, grain sizes, and microstructures. These influence magnetic anomalies from the micro to the global scale, making the study of magnetic properties relevant to multiple applications. Later geological processes such as serpentinization can significantly alter these properties and change the nature of the magnetic anomalies. In particular, magnetic properties such as remanent magnetization and magnetic susceptibility are directly linked to the magnetic mineralogy and grain size and can provide useful information about the geological history of the source. Scanning magnetic microscopy is a highly sensitive, high-resolution magnetometric technique for mapping the magnetic field over a planar surface of a rock sample. The device measures the vertical component of the field above the thin sections, and the technique offers a spatial resolution down to tens of micrometers; it can thus be used to investigate discrete magnetic mineral grains, magnetic textures and structures, and the magnetic history of the sample. This technique allows a direct correlation between the mineral chemistry (through both electron and optical microscopy) and the magnetic properties. As a case study, we present magnetic scans of three thin sections from two dunite samples from the Reinfjord Ultramafic complex in northern Norway. The selected thin sections show different magnetic properties which reflect the magnetic petrology. One of the thin sections is from a pristine dunite sample; the other two are highly serpentinized, with newly formed magnetite found in multiple veins a few micrometers thick. We present the preliminary results obtained by applying a forward modelling approach to the magnetic anomaly maps acquired over the thin sections. Modelling consists of uniformly-magnetized polygonal bodies whose geometry is constrained by the thickness of the thin section

  4. Coincidence and noncoincidence counting (81Rb and 43K): a comparative study

    International Nuclear Information System (INIS)

    Ikeda, S.; Duken, H.; Tillmanns, H.; Bing, R.J.

    1975-01-01

    The accuracy of imaging and resolution obtained with 81 Rb and 43 K using coincidence and noncoincidence counting was compared. Phantoms and isolated infarcted dog hearts were used. The results clearly show the superiority of coincidence counting with a resolution of 0.5 cm. Noncoincidence counting failed to reveal even sizable defects in the radioactive source. (U.S.)

  5. Recommended methods for monitoring change in bird populations by counting and capture of migrants

    Science.gov (United States)

    David J. T. Hussell; C. John Ralph

    2005-01-01

    Counts and banding captures of spring or fall migrants can generate useful information on the status and trends of the source populations. To do so, the counts and captures must be taken and recorded in a standardized and consistent manner. We present recommendations for field methods for counting and capturing migrants at intensively operated sites, such as bird...

  6. In vivo counting of uranium

    International Nuclear Information System (INIS)

    Palmer, H.E.

    1985-03-01

    A state-of-the-art radiation detector system consisting of six individually mounted intrinsic germanium planar detectors, each 20 cm² in area by 13 mm thick, mounted together such that the angle of the whole system can be changed to match the slope of the chest of the person being counted, is described. The sensitivity of the system for counting uranium and plutonium in vivo and the procedures used in calibrating the system are also described. Some results of counts done on uranium mill workers are presented. 15 figs., 2 tabs

  7. YouGenMap: a web platform for dynamic multi-comparative mapping and visualization of genetic maps

    Science.gov (United States)

    Keith Batesole; Kokulapalan Wimalanathan; Lin Liu; Fan Zhang; Craig S. Echt; Chun Liang

    2014-01-01

    Comparative genetic maps are used in examination of genome organization, detection of conserved gene order, and exploration of marker order variations. YouGenMap is an open-source web tool that offers dynamic comparative mapping of users' own genetic maps between two or more map sets. Users' genetic map data and optional gene annotations are...

  8. Application of historical, topographic maps and remote sensing data for reconstruction of gully network development as source of information for gully erosion modeling

    Science.gov (United States)

    Belyaev, Vladimir; Kuznetsova, Yulia

    2017-04-01

    Central parts of European Russia are characterized by a relatively shorter history of intensive agriculture than Western Europe. As a result, a significant part of the period of large-scale cultivation is covered by different types of historical documents, and reasonably good-quality maps are available for roughly the last 150 years. The history of gully erosion in European Russia is more or less well established, with known peaks of activity associated with initial cultivation (400-200 years ago for the territory of the Central Russian Upland) and with the change of land ownership in 1861, which split large landlord-owned fields into numerous small parcels owned by individual peasant families. The latter was the most important trigger for the dramatic growth of gully erosion intensity, as most of such parcels were oriented downslope. It is believed that detailed reconstructions of gully network development, using all the available information sources, can yield information suitable for testing gully erosion models. Such models can later be applied to predicting further development of the existing gully networks under several different land use and climate change scenarios. Reconstructions for two case study areas located in different geographic and historical settings will be presented.

  9. Complete Blood Count (For Parents)

    Science.gov (United States)


  10. Make My Trip Count 2015

    Data.gov (United States)

    Allegheny County / City of Pittsburgh / Western PA Regional Data Center — The Make My Trip Count (MMTC) commuter survey, conducted in September and October 2015 by GBA, the Pittsburgh 2030 District, and 10 other regional transportation...

  11. Counting Triangles to Sum Squares

    Science.gov (United States)

    DeMaio, Joe

    2012-01-01

    Counting complete subgraphs of three vertices in complete graphs yields combinatorial arguments for identities for sums of squares of integers, odd integers, and even integers, and for sums of the triangular numbers.
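One identity of this family can be checked directly: every 3-vertex subset of K_n is a triangle, and the sum of the first n squares equals C(n+1,3) + C(n+2,3). A quick numerical check (a sketch; the identity shown is a standard member of this family, not necessarily the paper's exact statement):

```python
from itertools import combinations
from math import comb

def triangles_in_complete_graph(n):
    # In K_n every 3-vertex subset forms a triangle, so this equals C(n, 3)
    return sum(1 for _ in combinations(range(n), 3))

def sum_of_squares(n):
    return sum(i * i for i in range(1, n + 1))

# 1^2 + 2^2 + ... + n^2 = C(n+1, 3) + C(n+2, 3)
identity_holds = all(sum_of_squares(n) == comb(n + 1, 3) + comb(n + 2, 3)
                     for n in range(1, 30))
```

The combinatorial reading is that the squares on the left count certain triangles in a complete graph, partitioned so that the two binomial coefficients on the right count the same triangles.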

  12. Theory of overdispersion in counting statistics caused by fluctuating probabilities

    International Nuclear Information System (INIS)

    Semkow, Thomas M.

    1999-01-01

    It is shown that random Lexis fluctuations of probabilities, such as the probability of decay or detection, cause the counting statistics to be overdispersed with respect to the classical binomial, Poisson, or Gaussian distributions. The generating and distribution functions for the overdispersed counting statistics are derived. Applications to radioactive decay with detection and to more complex experiments are given, as well as to distinguishing between source and background in the presence of overdispersion. Monte-Carlo verifications are provided.
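The effect is easy to reproduce numerically: if the Poisson rate itself fluctuates from count to count (here gamma-distributed, giving a negative-binomial marginal), the sample variance exceeds the sample mean. A Monte-Carlo sketch, not the paper's derivation; the gamma mixing, the seed, and the parameter values are illustrative assumptions:

```python
import math
import random

random.seed(42)

def poisson_sample(lam):
    # Knuth's multiplicative method (adequate for modest lam)
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

def overdispersed_counts(n, shape, scale):
    # Each count's rate is drawn anew: the Lexis fluctuation of probabilities
    return [poisson_sample(random.gammavariate(shape, scale)) for _ in range(n)]

counts = overdispersed_counts(20_000, 4.0, 2.5)   # mean rate = 4.0 * 2.5 = 10
mean = sum(counts) / len(counts)
var = sum((c - mean) ** 2 for c in counts) / len(counts)
# For a pure Poisson source var ≈ mean; here var ≈ mean * (1 + scale) > mean
```

With a fixed rate the same sampler gives var ≈ mean, so the excess variance isolates the contribution of the fluctuating probability.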

  13. Counting Word Frequencies with Python

    Directory of Open Access Journals (Sweden)

    William J. Turkel

    2012-07-01

    Full Text Available Your list is now clean enough that you can begin analyzing its contents in meaningful ways. Counting the frequency of specific words in the list can provide illustrative data. Python has an easy way to count frequencies, but it requires the use of a new type of variable: the dictionary. Before you begin working with a dictionary, consider the processes used to calculate frequencies in a list.
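The dictionary-based frequency count the lesson describes can be as short as the following (the sample word list is an illustrative stand-in for the lesson's cleaned list):

```python
# Count how often each word occurs using a dictionary
wordlist = ['it', 'was', 'the', 'best', 'of', 'times',
            'it', 'was', 'the', 'worst', 'of', 'times']

frequencies = {}
for word in wordlist:
    # get() returns the current count, or 0 if the word is new
    frequencies[word] = frequencies.get(word, 0) + 1

# Sort the word-frequency pairs, most frequent first
by_count = sorted(frequencies.items(), key=lambda pair: pair[1], reverse=True)
```

The standard library's `collections.Counter` wraps exactly this pattern, but building it by hand makes the dictionary mechanics explicit.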

  14. Liquid scintillation counting of chlorophyll

    International Nuclear Information System (INIS)

    Fric, F.; Horickova, B.; Haspel-Horvatovic, E.

    1975-01-01

    A precise and reproducible method of liquid scintillation counting was worked out for measuring the radioactivity of 14 C-labelled chlorophyll a and chlorophyll b solutions without previous bleaching. The spurious count rate caused by luminescence of the scintillant-chlorophyll system is eliminated by using a suitable scintillant and by measuring the radioactivity at 4 to 8 °C after an appropriate time of dark adaptation. Bleaching of the chlorophyll solutions is necessary only for measuring very low radioactivity. (author)

  15. Finger Counting Habits in Middle Eastern and Western Individuals: An Online Survey

    NARCIS (Netherlands)

    Lindemann, O.; Alipour, A.; Fischer, M.H.

    2011-01-01

    The current study documents the presence of cultural differences in the development of finger counting strategies. About 900 Middle Eastern (i.e., Iranian) and Western (i.e., European and American) individuals reported in an online survey how they map numbers onto their fingers when counting from 1

  16. Equivalence of truncated count mixture distributions and mixtures of truncated count distributions.

    Science.gov (United States)

    Böhning, Dankmar; Kuhnert, Ronny

    2006-12-01

    This article is about modeling count data with zero truncation. A parametric count density family is considered. The truncated mixture of densities from this family is different from the mixture of truncated densities from the same family. Whereas the former model is more natural to formulate and to interpret, the latter model is theoretically easier to treat. It is shown that for any mixing distribution leading to a truncated mixture, a (usually different) mixing distribution can be found so that the associated mixture of truncated densities equals the truncated mixture, and vice versa. This implies that the likelihood surfaces for both situations agree, and in this sense both models are equivalent. Zero-truncated count data models are used frequently in the capture-recapture setting to estimate population size, and it can be shown that the two Horvitz-Thompson estimators, associated with the two models, agree. In particular, it is possible to achieve strong results for mixtures of truncated Poisson densities, including reliable, global construction of the unique NPMLE (nonparametric maximum likelihood estimator) of the mixing distribution, implying a unique estimator for the population size. The benefit of these results lies in the fact that it is valid to work with the mixture of truncated count densities, which is less appealing for the practitioner but theoretically easier. Mixtures of truncated count densities form a convex linear model, for which a developed theory exists, including global maximum likelihood theory as well as algorithmic approaches. Once the problem has been solved in this class, it might readily be transformed back to the original problem by means of an explicitly given mapping. Applications of these ideas are given, particularly in the case of the truncated Poisson family.
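As a numerical illustration of the objects discussed (a sketch, not code from the article), the zero-truncated Poisson density and the Horvitz-Thompson population-size estimate it feeds can be written down directly; the rate value 2.0 and the observed count 173 are made-up inputs:

```python
import math

# Zero-truncated Poisson pmf: P(K = k | K > 0) for k >= 1.
def zt_poisson_pmf(k, lam):
    return math.exp(-lam) * lam**k / (math.factorial(k) * (1.0 - math.exp(-lam)))

# Horvitz-Thompson population-size estimate in the capture-recapture
# setting: n observed individuals, estimated P(never captured) = exp(-lam).
def horvitz_thompson(n_observed, lam):
    p0 = math.exp(-lam)
    return n_observed / (1.0 - p0)

# The truncated pmf sums to 1 over k = 1, 2, ...
total = sum(zt_poisson_pmf(k, 2.0) for k in range(1, 50))
print(total)
print(horvitz_thompson(173, 2.0))  # scales the observed count up by 1/(1 - p0)
```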

  17. Source of errors and accuracy of a two-dimensional/three-dimensional fusion road map for endovascular aneurysm repair of abdominal aortic aneurysm.

    Science.gov (United States)

    Kauffmann, Claude; Douane, Frédéric; Therasse, Eric; Lessard, Simon; Elkouri, Stephane; Gilbert, Patrick; Beaudoin, Nathalie; Pfister, Marcus; Blair, Jean François; Soulez, Gilles

    2015-04-01

    To evaluate the accuracy and source of errors using a two-dimensional (2D)/three-dimensional (3D) fusion road map for endovascular aneurysm repair (EVAR) of abdominal aortic aneurysm. A rigid 2D/3D road map was tested in 16 patients undergoing EVAR. After 3D/3D manual registration of preoperative multidetector computed tomography (CT) and cone beam CT, abdominal aortic aneurysm outlines were overlaid on live fluoroscopy/digital subtraction angiography (DSA). Patient motion was evaluated using bone landmarks. The misregistration of renal and internal iliac arteries was estimated by 3 readers along head-feet and right-left coordinates (z-axis and x-axis, respectively) before and after bone and DSA corrections centered on the lowest renal artery. Iliac deformation was evaluated by comparing centerlines before and during intervention. A score of clinical added value was estimated as high (z-axis 5 mm). Interobserver reproducibility was calculated by the intraclass correlation coefficient. The lowest renal artery misregistration was estimated at x-axis = 10.6 mm ± 11.1 and z-axis = 7.4 mm ± 5.3 before correction and at x-axis = 3.5 mm ± 2.5 and z-axis = 4.6 mm ± 3.7 after bone correction (P = .08), and at 0 after DSA correction (P artery was estimated at x-axis = 2.4 mm ± 2.0 and z-axis = 2.2 mm ± 2.0. Score of clinical added value was low (n = 11), good (n = 0), and high (n = 5) before correction and low (n = 5), good (n = 4), and high (n = 7) after bone correction. Interobserver intraclass correlation coefficient for misregistration measurements was estimated at 0.99. Patient motion before stent graft delivery was estimated at x-axis = 8 mm ± 5.8 and z-axis = 3.0 mm ± 2.7. The internal iliac artery misregistration measurements were estimated at x-axis = 6.1 mm ± 3.5 and z-axis = 5.6 mm ± 4.0, and iliac centerline deformation was estimated at 38.3 mm ± 15.6. Rigid registration is feasible and fairly accurate. Only a partial reduction of vascular

  18. Caustics, counting maps and semi-classical asymptotics

    International Nuclear Information System (INIS)

    Ercolani, N M

    2011-01-01

    This paper develops a deeper understanding of the structure and combinatorial significance of the partition function for Hermitian random matrices. The coefficients of the large N expansion of the logarithm of this partition function, also known as the genus expansion (and its derivatives), are generating functions for a variety of graphical enumeration problems. The main results are to prove that these generating functions are, in fact, specific rational functions of a distinguished irrational (algebraic) function, z0(t). This distinguished function is itself the generating function for the Catalan numbers (or generalized Catalan numbers, depending on the choice of weight of the parameter t). It is also a solution of the inviscid Burgers equation for certain initial data. The shock formation, or caustic, of the Burgers characteristic solution is directly related to the poles of the rational forms of the generating functions. As an intriguing application, one gains new insights into the relation between certain derivatives of the genus expansion, in a double-scaling limit, and the asymptotic expansion of the first Painlevé transcendent. This provides a precise expression of the Painlevé asymptotic coefficients directly in terms of the coefficients of the partial fractions expansion of the rational form of the generating functions established in this paper. Moreover, these insights point towards a more general program relating the first Painlevé hierarchy to the higher order structure of the double-scaling limit through the specific rational structure of generating functions in the genus expansion. The paper closes with a discussion of the relation of this work to recent developments in understanding the asymptotics of graphical enumeration.
As a by-product, these results also yield new information about the asymptotics of recurrence coefficients for orthogonal polynomials with respect to exponential weights, the calculation of correlation functions for certain tied random walks on a 1D lattice, and the large time asymptotics of random matrix partition functions.

  19. Caustics, counting maps and semi-classical asymptotics

    Science.gov (United States)

    Ercolani, N. M.

    2011-02-01

    This paper develops a deeper understanding of the structure and combinatorial significance of the partition function for Hermitian random matrices. The coefficients of the large N expansion of the logarithm of this partition function, also known as the genus expansion (and its derivatives), are generating functions for a variety of graphical enumeration problems. The main results are to prove that these generating functions are, in fact, specific rational functions of a distinguished irrational (algebraic) function, z0(t). This distinguished function is itself the generating function for the Catalan numbers (or generalized Catalan numbers, depending on the choice of weight of the parameter t). It is also a solution of the inviscid Burgers equation for certain initial data. The shock formation, or caustic, of the Burgers characteristic solution is directly related to the poles of the rational forms of the generating functions. As an intriguing application, one gains new insights into the relation between certain derivatives of the genus expansion, in a double-scaling limit, and the asymptotic expansion of the first Painlevé transcendent. This provides a precise expression of the Painlevé asymptotic coefficients directly in terms of the coefficients of the partial fractions expansion of the rational form of the generating functions established in this paper. Moreover, these insights point towards a more general program relating the first Painlevé hierarchy to the higher order structure of the double-scaling limit through the specific rational structure of generating functions in the genus expansion. The paper closes with a discussion of the relation of this work to recent developments in understanding the asymptotics of graphical enumeration. 
As a by-product, these results also yield new information about the asymptotics of recurrence coefficients for orthogonal polynomials with respect to exponential weights, the calculation of correlation functions for certain tied random walks on a 1D lattice, and the large time asymptotics of random matrix partition functions.

  20. Applied categorical and count data analysis

    CERN Document Server

    Tang, Wan; Tu, Xin M

    2012-01-01

    Introduction; Discrete Outcomes; Data Source; Outline of the Book; Review of Key Statistical Results; Software; Contingency Tables; Inference for One-Way Frequency Table; Inference for 2 x 2 Table; Inference for 2 x r Tables; Inference for s x r Table; Measures of Association; Sets of Contingency Tables; Confounding Effects; Sets of 2 x 2 Tables; Sets of s x r Tables; Regression Models for Categorical Response; Logistic Regression for Binary Response; Inference about Model Parameters; Goodness of Fit; Generalized Linear Models; Regression Models for Polytomous Response; Regression Models for Count Response; Poisson Regression Model

  1. Standardization of 241Am by digital coincidence counting, liquid scintillation counting and defined solid angle counting

    International Nuclear Information System (INIS)

    Balpardo, C.; Capoulat, M.E.; Rodrigues, D.; Arenillas, P.

    2010-01-01

    The nuclide 241Am decays by alpha emission to 237Np. Most of the decays (84.6%) populate the excited level of 237Np with energy of 59.54 keV. Digital coincidence counting was applied to standardize a solution of 241Am by alpha-gamma coincidence counting with efficiency extrapolation. Electronic discrimination was implemented with a pressurized proportional counter, and the results were compared with two other independent techniques: liquid scintillation counting using the logical sum of double coincidences in a TDCR array, and defined solid angle counting taking into account activity inhomogeneity in the active deposit. The results of the three methods are consistent within 0.3%. An ampoule of this solution will be sent to the International Reference System (SIR) during 2009. Uncertainties were analysed and compared in detail for the three applied methods.

  2. Hanford whole body counting manual

    International Nuclear Information System (INIS)

    Palmer, H.E.; Rieksts, G.A.; Lynch, T.P.

    1990-06-01

    This document describes the Hanford Whole Body Counting Program as it is administered by Pacific Northwest Laboratory (PNL) in support of the US Department of Energy--Richland Operations Office (DOE-RL) and its Hanford contractors. Program services include providing in vivo measurements of internally deposited radioactivity in Hanford employees (or visitors). Specific chapters of this manual deal with the following subjects: program operational charter, authority, administration, and practices, including interpreting applicable DOE Orders, regulations, and guidance into criteria for in vivo measurement frequency, etc., for the plant-wide whole body counting services; state-of-the-art facilities and equipment used to provide the best in vivo measurement results possible for the approximately 11,000 measurements made annually; procedures for performing the various in vivo measurements at the Whole Body Counter (WBC) and related facilities including whole body counts; operation and maintenance of counting equipment, quality assurance provisions of the program, WBC data processing functions, statistical aspects of in vivo measurements, and whole body counting records and associated guidance documents. 16 refs., 48 figs., 22 tabs

  3. Temporal trends in sperm count

    DEFF Research Database (Denmark)

    Levine, Hagai; Jørgensen, Niels; Martino-Andrade, Anderson

    2017-01-01

    BACKGROUND: Reported declines in sperm counts remain controversial today and recent trends are unknown. A definitive meta-analysis is critical given the predictive value of sperm count for fertility, morbidity and mortality. OBJECTIVE AND RATIONALE: To provide a systematic review and meta-regression... Following a predefined protocol, 7518 abstracts were screened and 2510 full articles reporting primary data on SC were reviewed. A total of 244 estimates of SC and TSC from 185 studies of 42 935 men who provided semen samples in 1973-2011 were extracted for meta-regression analysis, as well as information on years... (...006, respectively). WIDER IMPLICATIONS: This comprehensive meta-regression analysis reports a significant decline in sperm counts (as measured by SC and TSC) between 1973 and 2011, driven by a 50-60% decline among men unselected by fertility from North America, Europe, Australia and New Zealand. Because...

  4. Photon-counting image sensors

    CERN Document Server

    Teranishi, Nobukazu; Theuwissen, Albert; Stoppa, David; Charbon, Edoardo

    2017-01-01

    The field of photon-counting image sensors is advancing rapidly with the development of various solid-state image sensor technologies including single photon avalanche detectors (SPADs) and deep-sub-electron read noise CMOS image sensor pixels. This foundational platform technology will enable opportunities for new imaging modalities and instrumentation for science and industry, as well as new consumer applications. Papers discussing various photon-counting image sensor technologies and selected new applications are presented in this all-invited Special Issue.

  5. High rate 4π β-γ coincidence counting system

    International Nuclear Information System (INIS)

    Johnson, L.O.; Gehrke, R.J.

    1978-01-01

    A high count rate 4π β-γ coincidence counting system for the determination of absolute disintegration rates of short half-life radionuclides is described. With this system the dead time per pulse is minimized by not stretching any pulses beyond the width necessary to satisfy overlap coincidence requirements. The equations used to correct for the β, γ, and coincidence channel dead times and for accidental coincidences are presented but not rigorously developed. Experimental results are presented for a decaying source of 56Mn initially at 2 x 10^6 d/s and a set of 60Co sources of accurately known source strengths varying from 10^3 to 2 x 10^6 d/s. A check of the accidental coincidence equation for the case of two independent sources with varying source strengths is presented.
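For context, the accidental-coincidence correction mentioned at the end can be illustrated with the standard textbook relation n_acc ≈ 2·τ·n_β·n_γ (the paper's own equations, which also fold in channel dead times, are not reproduced in the abstract; the rates and resolving time below are made-up numbers):

```python
# Standard chance-coincidence estimate for a beta-gamma coincidence system:
# with singles rates n_beta and n_gamma (s^-1) and coincidence resolving
# time tau (s), uncorrelated pulses overlap at roughly
#     n_acc = 2 * tau * n_beta * n_gamma
# (a textbook relation; channel dead-time terms are omitted here).
def accidental_rate(n_beta, n_gamma, tau):
    return 2.0 * tau * n_beta * n_gamma

print(accidental_rate(1e4, 1e3, 1e-6))  # accidental coincidences per second
```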

  6. Low Count Anomaly Detection at Large Standoff Distances

    Science.gov (United States)

    Pfund, David Michael; Jarman, Kenneth D.; Milbrath, Brian D.; Kiff, Scott D.; Sidor, Daniel E.

    2010-02-01

    Searching for hidden illicit sources of gamma radiation in an urban environment is difficult. Background radiation profiles are variable and cluttered with transient acquisitions from naturally occurring radioactive materials and medical isotopes. Potentially threatening sources likely will be nearly hidden in this noise and encountered at high standoff distances and low threat count rates. We discuss an anomaly detection algorithm that characterizes low count sources as threatening or non-threatening and operates well in the presence of high benign source variability. We discuss the algorithm parameters needed to reliably find sources both close to the detector and far away from it. These parameters include the cutoff frequencies of background tracking filters and the integration time of the spectrometer. This work is part of the development of the Standoff Radiation Imaging System (SORIS) as part of DNDO's Standoff Radiation Detection System Advanced Technology Demonstration (SORDS-ATD) program.

  7. Liquid scintillation, counting, and compositions

    International Nuclear Information System (INIS)

    Sena, E.A.; Tolbert, B.M.; Sutula, C.L.

    1975-01-01

    The emissions of radioactive isotopes in both aqueous and organic samples can be measured by liquid scintillation counting in micellar systems. The micellar systems are made up of a scintillation solvent, a scintillation solute, and a mixture of surfactants, preferably at least one of which is relatively oil-soluble and water-insoluble and another of which is relatively water-soluble and oil-insoluble.

  8. Phase space quark counting rule

    International Nuclear Information System (INIS)

    Wei-gin, C.; Lo, S.

    1980-01-01

    A simple quark counting rule based on phase-space considerations, suggested previously, is used to fit all 39 recent experimental data points on inclusive reactions. Parameter-free relations are found to agree with experiment. Excellent detailed fits are obtained for 11 inclusive reactions.

  9. Counting a Culture of Mealworms

    Science.gov (United States)

    Ashbrook, Peggy

    2007-01-01

    Math is not the only topic that will be discussed when young children are asked to care for and count "mealworms," a type of insect larvae (just as caterpillars are the babies of butterflies, these larvae are babies of beetles). The following activity can take place over two months as the beetles undergo metamorphosis from larvae to adults. As the…

  10. Counting problems for number rings

    NARCIS (Netherlands)

    Brakenhoff, Johannes Franciscus

    2009-01-01

    In this thesis we look at three counting problems connected to orders in number fields. First we study the probability that for a random polynomial f in Z[X] the ring Z[X]/f is the maximal order in Q[X]/f. Connected to this is the probability that a random polynomial has a squarefree

  11. On Counting the Rational Numbers

    Science.gov (United States)

    Almada, Carlos

    2010-01-01

    In this study, we show how to construct a function from the set N of natural numbers that explicitly counts the set Q[superscript +] of all positive rational numbers using a very intuitive approach. The function has the appeal of Cantor's function and it has the advantage that any high school student can understand the main idea at a glance…

  12. Logistic regression for dichotomized counts.

    Science.gov (United States)

    Preisser, John S; Das, Kalyan; Benecha, Habtamu; Stamm, John W

    2016-12-01

    Sometimes there is interest in a dichotomized outcome indicating whether a count variable is positive or zero. Under this scenario, the application of ordinary logistic regression may result in efficiency loss, which is quantifiable under an assumed model for the counts. In such situations, a shared-parameter hurdle model is investigated for more efficient estimation of regression parameters relating to overall effects of covariates on the dichotomous outcome, while handling count data with many zeroes. One model part provides a logistic regression containing marginal log odds ratio effects of primary interest, while an ancillary model part describes the mean count of a Poisson or negative binomial process in terms of nuisance regression parameters. Asymptotic efficiency of the logistic model parameter estimators of the two-part models is evaluated with respect to ordinary logistic regression. Simulations are used to assess the properties of the models with respect to power and Type I error, the latter investigated under both misspecified and correctly specified models. The methods are applied to data from a randomized clinical trial of three toothpaste formulations to prevent incident dental caries in a large population of Scottish schoolchildren. © The Author(s) 2014.

  13. Vote Counting as Mathematical Proof

    DEFF Research Database (Denmark)

    Schürmann, Carsten; Pattinson, Dirk

    2015-01-01

    then consists of a sequence (or tree) of rule applications and provides an independently checkable certificate of the validity of the result. This reduces the need to trust, or otherwise verify, the correctness of the vote counting software once the certificate has been validated. Using a rule...

  14. Counting SET-free sets

    OpenAIRE

    Harman, Nate

    2016-01-01

    We consider the following counting problem related to the card game SET: How many $k$-element SET-free sets are there in an $n$-dimensional SET deck? Through a series of algebraic reformulations and reinterpretations, we show the answer to this question satisfies two polynomiality conditions.

  15. The determination by irradiation with a pulsed neutron generator and delayed neutron counting of the amount of fissile material present in a sample; Determination de la quantite de matiere fissile presente dans un echantillon par irradiation au moyen d'une source pulsee de neutrons et comptage des neutrons retardes

    Energy Technology Data Exchange (ETDEWEB)

    Beliard, L; Janot, P [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1967-07-01

    A preliminary study was conducted to determine the amount of fissile material present in a sample. The method used consisted in irradiating the sample by means of a pulsed neutron generator and delayed neutron counting. Results show the validity of this method provided some experimental precautions are taken. Checking on the residual proportion of fissile material in leached hulls seems possible. (authors) [French] Ce rapport rend compte d'une etude preliminaire effectuee en vue de determiner la quantite de matiere fissile presente dans un echantillon. La methode utilisee consiste a irradier l'echantillon considere au moyen d'une source puisee de neutrons et a compter les neutrons retardes produits. Les resultats obtenus permettent de conclure a la validite de la methode moyennant certaines precautions. Un controle de la teneur residuelle en matiere fissile des gaines apres traitement semble possible. (auteurs)

  16. Repeatability of differential goat bulk milk culture and associations with somatic cell count, total bacterial count, and standard plate count

    OpenAIRE

    Koop, G.; Dik, N.; Nielen, M.; Lipman, L.J.A.

    2010-01-01

    The aims of this study were to assess how different bacterial groups in bulk milk are related to bulk milk somatic cell count (SCC), bulk milk total bacterial count (TBC), and bulk milk standard plate count (SPC) and to measure the repeatability of bulk milk culturing. On 53 Dutch dairy goat farms, 3 bulk milk samples were collected at intervals of 2 wk. The samples were cultured for SPC, coliform count, and staphylococcal count and for the presence of Staphylococcus aureus. Furthermore, SCC ...

  17. Crowd counting via scale-adaptive convolutional neural network

    OpenAIRE

    Zhang, Lu; Shi, Miaojing; Chen, Qiaobo

    2017-01-01

    The task of crowd counting is to automatically estimate the pedestrian number in crowd images. To cope with the scale and perspective changes that commonly exist in crowd images, state-of-the-art approaches employ multi-column CNN architectures to regress density maps of crowd images. Multiple columns have different receptive fields corresponding to pedestrians (heads) of different scales. We instead propose a scale-adaptive CNN (SaCNN) architecture with a backbone of fixed small receptive fi...

  18. Study on advancement of in vivo counting using mathematical simulation

    Energy Technology Data Exchange (ETDEWEB)

    Kinase, Sakae [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2003-05-01

    To assess the committed effective dose, individual monitoring for the estimation of intakes of radionuclides is required. For such monitoring, direct measurement of radionuclides in the body - in vivo counting - is very useful. To achieve precise in vivo counting that fulfills the requirements of the ICRP 1990 recommendations, several problems have been studied, such as the investigation of uncertainties in body burdens estimated by in vivo counting and the selection of ways to improve the precision. In the present study, a calibration technique for in vivo counting applications using Monte Carlo simulation was developed. The advantage of the technique is that counting efficiency can be obtained for various shapes and sizes that are very difficult to realize with physical phantoms. To validate the calibration technique, the response functions and counting efficiencies of a whole-body counter installed at JAERI were evaluated using the simulation and measurements. The calculations are in good agreement with the measurements. A method for determining counting-efficiency curves as a function of energy was developed using the present technique, and a physique correction equation was derived from the relationship between parameters of the correction factor and counting efficiencies of the JAERI whole-body counter. The uncertainties in body burdens of 137Cs estimated with the JAERI whole-body counter were also investigated using Monte Carlo simulation and measurements. It was found that the uncertainties of body burdens estimated with the whole-body counter depend strongly on various sources of uncertainty, such as the radioactivity distribution within the body and counting statistics. Furthermore, an evaluation method for the peak efficiencies of a Ge semiconductor detector was developed by Monte Carlo simulation for optimum arrangement of Ge semiconductor detectors for

  19. Participatory mapping of target areas to enable operational larval source management to suppress malaria vector mosquitoes in Dar es Salaam, Tanzania

    Directory of Open Access Journals (Sweden)

    Dongus Stefan

    2007-09-01

    Full Text Available Background: Half of the population of Africa will soon live in towns and cities where it can be protected from malaria by controlling the aquatic stages of mosquitoes. Rigorous but affordable and scaleable methods for mapping and managing mosquito habitats are required to enable effective larval control in urban Africa. Methods: A simple community-based mapping procedure that requires no electronic devices in the field was developed to facilitate routine larval surveillance in Dar es Salaam, Tanzania. The mapping procedure included (1) community-based development of sketch maps and (2) verification of sketch maps by technical teams using laminated aerial photographs in the field, which were later digitized and analysed using Geographical Information Systems (GIS). Results: Three urban wards of Dar es Salaam were comprehensively mapped, covering an area of 16.8 km². Over thirty percent of this area was not included in preliminary community-based sketch mapping, mostly because these were areas that do not appear on local government residential lists. The use of aerial photographs and basic GIS allowed rapid identification and inclusion of these key areas, as well as a more equal distribution of the workload of malaria control field staff. Conclusion: The procedure developed enables complete coverage of targeted areas with larval control through comprehensive spatial coverage with community-derived sketch maps. The procedure is practical, affordable, and requires minimal technical skills. This approach can be readily integrated into malaria vector control programmes, scaled up to towns and cities all over Tanzania and adapted to urban settings elsewhere in Africa.

  20. The Chandra Source Catalog: Source Properties and Data Products

    Science.gov (United States)

    Rots, Arnold; Evans, Ian N.; Glotfelty, Kenny J.; Primini, Francis A.; Zografou, Panagoula; Anderson, Craig S.; Bonaventura, Nina R.; Chen, Judy C.; Davis, John E.; Doe, Stephen M.; Evans, Janet D.; Fabbiano, Giuseppina; Galle, Elizabeth C.; Gibbs, Danny G., II; Grier, John D.; Hain, Roger; Hall, Diane M.; Harbo, Peter N.; He, Xiang Qun (Helen); Houck, John C.; Karovska, Margarita; Kashyap, Vinay L.; Lauer, Jennifer; McCollough, Michael L.; McDowell, Jonathan C.; Miller, Joseph B.; Mitschang, Arik W.; Morgan, Douglas L.; Mossman, Amy E.; Nichols, Joy S.; Nowak, Michael A.; Plummer, David A.; Refsdal, Brian L.; Siemiginowska, Aneta L.; Sundheim, Beth A.; Tibbetts, Michael S.; van Stone, David W.; Winkelman, Sherry L.

    2009-09-01

    The Chandra Source Catalog (CSC) is breaking new ground in several areas. There are two aspects that are of particular interest to the users: its evolution and its contents. The CSC will be a living catalog that becomes richer, bigger, and better in time while still remembering its state at each point in time. This means that users will be able to take full advantage of new additions to the catalog, while retaining the ability to back-track and return to what was extracted in the past. The CSC sheds the limitations of flat-table catalogs. Its sources will be characterized by a large number of properties, as usual, but each source will also be associated with its own specific data products, allowing users to perform mini custom analysis on the sources. Source properties fall in the spatial (position, extent), photometric (fluxes, count rates), spectral (hardness ratios, standard spectral fits), and temporal (variability probabilities) domains, and are all accompanied by error estimates. Data products cover the same coordinate space and include event lists, images, spectra, and light curves. In addition, the catalog contains data products covering complete observations: event lists, background images, exposure maps, etc. This work is supported by NASA contract NAS8-03060 (CXC).

  1. Counting statistics and loss corrections for the APS

    International Nuclear Information System (INIS)

    Lee, W.K.; Mills, D.M.

    1992-01-01

    It has been suggested that for timing experiments, it might be advantageous to arrange the bunches in the storage ring in an asymmetrical mode. In this paper, we determine the counting losses from pulsed x-ray sources from basic probabilistic arguments and from Poisson statistics. In particular the impact on single-photon counting losses of a variety of possible filling modes for the Advanced Photon Source (APS) is examined. For bunches of equal current, a loss of 10% occurs whenever the count rate exceeds 21% of the bunch repetition rate. This changes slightly when bunches containing unequal numbers of particles are considered. The results are applied to several common detector/electronics systems
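The 10%-loss figure quoted in the abstract is consistent with a simple model, assumed here for illustration rather than taken from the paper, in which photons per bunch are Poisson distributed and the detector registers at most one photon per bunch:

```python
import math

# Single-photon counting from a pulsed source: at most one photon is
# registered per bunch. With mu = (true count rate) / (bunch repetition
# rate) photons per bunch on average (Poisson), the registered rate per
# bunch is 1 - exp(-mu), so the fractional counting loss is:
def loss_fraction(mu):
    return 1.0 - (1.0 - math.exp(-mu)) / mu

# The abstract's figure: roughly 10% loss when the count rate reaches
# about 21% of the bunch repetition rate (equal-current bunches).
print(loss_fraction(0.21))  # ~ 0.098
```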

  3. Relationship between γ detection dead-time and count correction factor

    International Nuclear Information System (INIS)

    Wu Huailong; Zhang Jianhua; Chu Chengsheng; Hu Guangchun; Zhang Changfan; Hu Gen; Gong Jian; Tian Dongfeng

    2015-01-01

    The relationship between dead-time and count correction factor was investigated by using interference source for purpose of high γ activity measurement. The count rates maintain several 10 s"-"l with γ energy of 0.3-1.3 MeV for 10"4-10"5 Bq radioactive source. It is proved that the relationship between count loss and dead-time is unconcerned at various energy and various count intensities. The same correction formula can be used for any nuclide measurement. (authors)

  4. Coincidence counting corrections for dead time losses and accidental coincidences

    International Nuclear Information System (INIS)

    Wyllie, H.A.

    1987-04-01

    An equation is derived for the calculation of the radioactivity of a source from the results of coincidence counting taking into account the dead-time losses and accidental coincidences. The derivation is an extension of the method of J. Bryant [Int. J. Appl. Radiat. Isot., 14:143, 1963]. The improvement on Bryant's formula has been verified by experiment
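
    The classical coincidence-counting relation that such derivations refine is N0 = Nβ·Nγ/Nc, with accidental coincidences subtracted from the measured coincidence rate. A simplified sketch of that baseline (ignoring the dead-time refinements that are the paper's actual contribution):

```python
def coincidence_activity(n_beta, n_gamma, n_coinc, tau):
    """Source activity from beta, gamma, and coincidence count rates (s^-1).

    Simplified textbook form: accidental coincidences (~2*tau*n_beta*n_gamma
    for resolving time tau) are subtracted before applying N0 = Nb*Ng/Nc.
    Dead-time losses, treated in detail in the paper, are ignored here.
    """
    accidental = 2.0 * tau * n_beta * n_gamma
    true_coinc = n_coinc - accidental
    return n_beta * n_gamma / true_coinc

# Source of 1000 Bq seen with 50% beta and 30% gamma efficiency:
print(coincidence_activity(500.0, 300.0, 150.3, 1e-6))  # → ~1000.0
```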

  5. Theory of photoelectron counting statistics

    International Nuclear Information System (INIS)

    Blake, J.

    1980-01-01

    The purpose of the present essay is to provide a detailed analysis of those theoretical aspects of photoelectron counting which are capable of experimental verification. Most of our interest is in the physical phenomena themselves, while part is in the mathematical techniques. Many of the mathematical methods used in the analysis of the photoelectron counting problem are generally unfamiliar to physicists interested in the subject. For this reason we have developed the essay in such a fashion that, although primary interest is focused on the physical phenomena, we have also taken pains to carry out enough of the analysis so that the reader can follow the main details. We have chosen to present a consistently quantum mechanical version of the subject, in that we follow the Glauber theory throughout. (orig./WL)

  6. Bayesian Kernel Mixtures for Counts.

    Science.gov (United States)

    Canale, Antonio; Dunson, David B

    2011-12-01

    Although Bayesian nonparametric mixture models for continuous data are well developed, there is a limited literature on related approaches for count data. A common strategy is to use a mixture of Poissons, which unfortunately is quite restrictive in not accounting for distributions having variance less than the mean. Other approaches include mixing multinomials, which requires finite support, and using a Dirichlet process prior with a Poisson base measure, which does not allow smooth deviations from the Poisson. As a broad class of alternative models, we propose to use nonparametric mixtures of rounded continuous kernels. An efficient Gibbs sampler is developed for posterior computation, and a simulation study is performed to assess performance. Focusing on the rounded Gaussian case, we generalize the modeling framework to account for multivariate count data, joint modeling with continuous and categorical variables, and other complications. The methods are illustrated through applications to a developmental toxicity study and marketing data. This article has supplementary material online.
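
    The rounded-kernel idea can be illustrated in a few lines: counts are produced by thresholding a latent continuous draw, which easily yields variance below the mean, exactly the regime a Poisson mixture cannot reach. A toy sketch (the paper's threshold choice and priors differ):

```python
import random
import statistics

def rounded_gaussian_sample(mu, sigma, n, seed=1):
    """Draw counts by flooring a latent Gaussian, truncated at zero.

    A toy version of the rounded-kernel idea; unlike a Poisson, it can
    have variance well below the mean (underdispersion).
    """
    rng = random.Random(seed)
    return [max(0, int(rng.gauss(mu, sigma) // 1)) for _ in range(n)]

ys = rounded_gaussian_sample(mu=5.0, sigma=0.3, n=20000)
print(statistics.mean(ys), statistics.pvariance(ys))  # variance << mean
```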

  7. Card counting in continuous time

    OpenAIRE

    Andersson, Patrik

    2012-01-01

    We consider the problem of finding an optimal betting strategy for a house-banked casino card game that is played for several coups before reshuffling. The sampling without replacement makes it possible to take advantage of the changes in the expected value as the deck is depleted, making large bets when the game is advantageous. Using such a strategy, which is easy to implement, is known as card counting. We consider the case of a large number of decks, making an approximat...

  8. Monitoring Milk Somatic Cell Counts

    Directory of Open Access Journals (Sweden)

    Gheorghe Şteţca

    2014-11-01

    Full Text Available The presence of somatic cells in milk is a widely disputed issue in the milk production sector. The somatic cell count in raw milk is a marker for specific cow diseases such as mastitis (swollen udder). A high level of somatic cells causes physical and chemical changes to milk composition and nutritional value, and likewise to milk products. Mastitic milk is also not proper for human consumption, due to its contribution to the spreading of certain diseases and food poisoning. In view of these effects, EU regulations established the maximum admitted somatic cell count in raw milk as 400,000 cells/mL starting with 2014. This study was carried out in order to examine raw milk samples provided from small farms, industrial-type farms and milk processing units. There are several ways to count somatic cells in milk, but the accepted reference method is the microscopic method described by SR EN ISO 13366-1/2008. Generally, samples registered values in accordance with the admissible limit. Periodical monitoring of the somatic cell count avoids certain technological process issues and ensures consumer health.

  9. Motorcycle detection and counting using stereo camera, IR camera, and microphone array

    Science.gov (United States)

    Ling, Bo; Gibson, David R. P.; Middleton, Dan

    2013-03-01

    Detection, classification, and characterization are the key to enhancing motorcycle safety, motorcycle operations and motorcycle travel estimation. Average motorcycle fatalities per Vehicle Mile Traveled (VMT) are currently estimated at 30 times those of auto fatalities. Although it has been an active research area for many years, motorcycle detection still remains a challenging task. Working with FHWA, we have developed a hybrid motorcycle detection and counting system using a suite of sensors including a stereo camera, a thermal IR camera and a unidirectional microphone array. The IR thermal camera can capture the unique thermal signatures associated with the motorcycle's exhaust pipes, which often show as bright elongated blobs in IR images. The stereo camera in the system is used to detect the motorcyclist, who can be easily windowed out in the stereo disparity map: if the motorcyclist is detected through 3D body recognition, the motorcycle is detected. Microphones are used to detect motorcycles, which often produce low-frequency acoustic signals. All three microphones in the microphone array are placed in strategic locations on the sensor platform to minimize interference from background noise sources such as rain and wind. Field test results show that this hybrid motorcycle detection and counting system has an excellent performance.

  10. MEASURING PRIMORDIAL NON-GAUSSIANITY THROUGH WEAK-LENSING PEAK COUNTS

    International Nuclear Information System (INIS)

    Marian, Laura; Hilbert, Stefan; Smith, Robert E.; Schneider, Peter; Desjacques, Vincent

    2011-01-01

    We explore the possibility of detecting primordial non-Gaussianity of the local type using weak-lensing peak counts. We measure the peak abundance in sets of simulated weak-lensing maps corresponding to three models: f_NL = 0, -100, and 100. Using survey specifications similar to those of EUCLID and without assuming any knowledge of the lens and source redshifts, we find the peak functions of the non-Gaussian models with f_NL = ±100 to differ by up to 15% from the Gaussian peak function at the high-mass end. For the assumed survey parameters, the probability of fitting an f_NL = 0 peak function to the f_NL = ±100 peak functions is less than 0.1%. Assuming the other cosmological parameters are known, f_NL can be measured with an error Δf_NL ∼ 13. It is therefore possible that future weak-lensing surveys like EUCLID and LSST may detect primordial non-Gaussianity from the abundance of peak counts, and provide information complementary to that obtained from the cosmic microwave background.

  11. A system for mapping radioactive specimens

    International Nuclear Information System (INIS)

    Britten, R.J.; Davidson, E.H.

    1988-01-01

    A system for mapping radioactive specimens comprises an avalanche counter, an encoder, pre-amplifier circuits, sample and hold circuits and a programmed computer. The parallel plate counter utilizes avalanche event counting over a large area with the ability to locate radioactive sources in two dimensions. When a beta ray, for example, enters a chamber, an ionization event occurs and the avalanche effect multiplies the event and results in charge collection on the anode surface for a limited period of time before the charge leaks away. The encoder comprises a symmetrical array of planar conductive surfaces separated from the anode by a dielectric material. The encoder couples charge currents, the amplitudes of which define the relative position of the ionization event. The amplitude of coupled current, delivered to pre-amplifiers, defines the location of the event. (author) 12 figs

  12. Utilizing a Multi-Source Forest Inventory Technique, MODIS Data and Landsat TM Images in the Production of Forest Cover and Volume Maps for the Terai Physiographic Zone in Nepal

    Directory of Open Access Journals (Sweden)

    Kalle Eerikäinen

    2012-12-01

    Full Text Available An approach based on the nearest neighbors techniques is presented for producing thematic maps of forest cover (forest/non-forest and total stand volume for the Terai region in southern Nepal. To create the forest cover map, we used a combination of Landsat TM satellite data and visual interpretation data, i.e., a sample grid of visual interpretation plots for which we obtained the land use classification according to the FAO standard. These visual interpretation plots together with the field plots for volume mapping originate from an operative forest inventory project, i.e., the Forest Resource Assessment of Nepal (FRA Nepal project. The field plots were also used in checking the classification accuracy. MODIS satellite data were used as a reference in a local correction approach conducted for the relative calibration of Landsat TM images. This study applied a non-parametric k-nearest neighbor technique (k-NN to the forest cover and volume mapping. A tree height prediction approach based on a nonlinear, mixed-effects (NLME modeling procedure is presented in the Appendix. The MODIS image data performed well as reference data for the calibration approach applied to make the Landsat image mosaic. The agreement between the forest cover map and the field observed values of forest cover was substantial in Western Terai (KHAT 0.745 and strong in Eastern Terai (KHAT 0.825. The forest cover and volume maps that were estimated using the k-NN method and the inventory data from the FRA Nepal project are already appropriate and valuable data for research purposes and for the planning of forthcoming forest inventories. Adaptation of the methods and techniques was carried out using Open Source software tools.

  13. Upgradation of automatic liquid scintillation counting system

    International Nuclear Information System (INIS)

    Bhattacharya, Sadhana; Behere, Anita; Sonalkar, S.Y.; Vaidya, P.P.

    2001-01-01

    This paper describes the upgradation of Microprocessor based Automatic Liquid Scintillation Counting systems (MLSC). This system was developed in the 1980s and subsequently many systems were manufactured and supplied to Environment Survey labs at various Nuclear Power Plants. Recently this system has been upgraded to a more sophisticated one by using PC add-on hardware and developing Windows based software. The software implements a more intuitive graphical user interface and also enhances the features, making it comparable with commercially available systems. It implements data processing using full spectrum analysis, as against the channel ratio method adopted earlier, improving the accuracy of the results. It also facilitates qualitative as well as quantitative analysis of the β-spectrum, making it possible to analyze a sample containing an unknown β-source. (author)

  14. Errors associated with moose-hunter counts of occupied beaver Castor fiber lodges in Norway

    OpenAIRE

    Parker, Howard; Rosell, Frank; Gustavsen, Per Øyvind

    2002-01-01

    In Norway, Sweden and Finland moose Alces alces hunting teams are often employed to survey occupied beaver (Castor fiber and C. canadensis) lodges while hunting. Results may be used to estimate population density or trend, or for issuing harvest permits. Despite the method's increasing popularity, the errors involved have never been identified. In this study we 1) compare hunting-team counts of occupied lodges with total counts, 2) identify the sources of error between counts and 3) evaluate ...

  15. Some target assay uncertainties for passive neutron coincidence counting

    International Nuclear Information System (INIS)

    Ensslin, N.; Langner, D.G.; Menlove, H.O.; Miller, M.C.; Russo, P.A.

    1990-01-01

    This paper provides some target assay uncertainties for passive neutron coincidence counting of plutonium metal, oxide, mixed oxide, and scrap and waste. The target values are based in part on past user experience and in part on the estimated results from new coincidence counting techniques that are under development. The paper summarizes assay error sources and the new coincidence techniques, and recommends the technique that is likely to yield the lowest assay uncertainty for a given material type. These target assay uncertainties are intended to be useful for NDA instrument selection and assay variance propagation studies for both new and existing facilities. 14 refs., 3 tabs

  16. Repeatability of differential goat bulk milk culture and associations with somatic cell count, total bacterial count, and standard plate count

    NARCIS (Netherlands)

    Koop, G.; Dik, N.; Nielen, M.; Lipman, L.J.A.

    2010-01-01

    The aims of this study were to assess how different bacterial groups in bulk milk are related to bulk milk somatic cell count (SCC), bulk milk total bacterial count (TBC), and bulk milk standard plate count (SPC) and to measure the repeatability of bulk milk culturing. On 53 Dutch dairy goat farms,

  17. Nearest neighbors by neighborhood counting.

    Science.gov (United States)

    Wang, Hui

    2006-06-01

    Finding nearest neighbors is a general idea that underlies many artificial intelligence tasks, including machine learning, data mining, natural language understanding, and information retrieval. This idea is explicitly used in the k-nearest neighbors algorithm (kNN), a popular classification method. In this paper, this idea is adopted in the development of a general methodology, neighborhood counting, for devising similarity functions. We turn our focus from neighbors to neighborhoods, a region in the data space covering the data point in question. To measure the similarity between two data points, we consider all neighborhoods that cover both data points. We propose to use the number of such neighborhoods as a measure of similarity. Neighborhood can be defined for different types of data in different ways. Here, we consider one definition of neighborhood for multivariate data and derive a formula for such similarity, called neighborhood counting measure or NCM. NCM was tested experimentally in the framework of kNN. Experiments show that NCM is generally comparable to VDM and its variants, the state-of-the-art distance functions for multivariate data, and, at the same time, is consistently better for relatively large k values. Additionally, NCM consistently outperforms HEOM (a mixture of Euclidean and Hamming distances), the "standard" and most widely used distance function for multivariate data. NCM has a computational complexity in the same order as the standard Euclidean distance function and NCM is task independent and works for numerical and categorical data in a conceptually uniform way. The neighborhood counting methodology is proven sound for multivariate data experimentally. We hope it will work for other types of data.
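
    As a toy illustration of moving from a distance to a counting-based similarity inside kNN, the sketch below uses a simple attribute-overlap count in place of the paper's NCM formula (which is not reproduced here). Note that with a similarity, neighbors are ranked in decreasing order rather than increasing:

```python
from collections import Counter

def overlap_similarity(a, b):
    """Toy counting-style similarity for categorical vectors: the number of
    attributes on which the two points agree. (A stand-in illustration, not
    the paper's neighborhood counting measure.)"""
    return sum(x == y for x, y in zip(a, b))

def knn_classify(query, data, labels, k=3):
    """k-nearest-neighbors using a similarity (larger = closer) instead of
    a distance, as required when plugging in a counting-based measure."""
    ranked = sorted(range(len(data)),
                    key=lambda i: overlap_similarity(query, data[i]),
                    reverse=True)
    votes = Counter(labels[i] for i in ranked[:k])
    return votes.most_common(1)[0][0]

data = [("red", "s"), ("red", "m"), ("blue", "s"), ("blue", "l")]
labels = ["A", "A", "B", "B"]
print(knn_classify(("red", "s"), data, labels, k=3))  # → "A"
```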

  18. CalCOFI Egg Counts Positive Tows

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Fish egg counts and standardized counts for eggs captured in CalCOFI icthyoplankton nets (primarily vertical [Calvet or Pairovet], oblique [bongo or ring nets], and...

  19. CalCOFI Larvae Counts Positive Tows

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Fish larvae counts and standardized counts for eggs captured in CalCOFI icthyoplankton nets (primarily vertical [Calvet or Pairovet], oblique [bongo or ring nets],...

  20. Alaska Steller Sea Lion Pup Count Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This database contains counts of Steller sea lion pups on rookeries in Alaska made between 1961 and 2015. Pup counts are conducted in late June-July. Pups are...

  1. Optimization of Uranium Molecular Deposition for Alpha-Counting Sources

    Energy Technology Data Exchange (ETDEWEB)

    Monzo, Ellen [Univ. of Minnesota, Duluth, MN (United States); Parsons-Moss, Tashi [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Genetti, Victoria [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Knight, Kimberly [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-12-12

    Method development for molecular deposition of uranium onto aluminum 1100 plates was conducted with custom plating cells at Lawrence Livermore National Laboratory. The method development focused primarily on variation of electrode type, which was expected to directly influence plated sample homogeneity. Solid disc platinum and mesh platinum anodes were compared, and the data revealed that solid disc platinum anodes produced more homogeneous uranium oxide films. However, the activity distribution also depended on the orientation of the platinum electrode relative to the aluminum cathode, starting current, and material composition of the plating cell. Experiments demonstrated these variables were difficult to control under the conditions available. Variation of plating parameters among a series of ten deposited plates yielded variations up to 30% in deposition efficiency. Teflon particles were observed on samples plated in Teflon cells, which poses a problem for alpha activity measurements of the plates. Preliminary electropolishing and chemical polishing studies were also conducted on the aluminum 1100 cathode plates.

  2. Analyzing thematic maps and mapping for accuracy

    Science.gov (United States)

    Rosenfield, G.H.

    1982-01-01

    Two problems which exist while attempting to test the accuracy of thematic maps and mapping are: (1) evaluating the accuracy of thematic content, and (2) evaluating the effects of the variables on thematic mapping. Statistical analysis techniques are applicable to both these problems and include techniques for sampling the data and determining their accuracy. In addition, techniques for hypothesis testing, or inferential statistics, are used when comparing the effects of variables. A comprehensive and valid accuracy test of a classification project, such as thematic mapping from remotely sensed data, includes the following components of statistical analysis: (1) sample design, including the sample distribution, sample size, size of the sample unit, and sampling procedure; and (2) accuracy estimation, including estimation of the variance and confidence limits. Careful consideration must be given to the minimum sample size necessary to validate the accuracy of a given classification category. The results of an accuracy test are presented in a contingency table, sometimes called a classification error matrix. Usually the rows represent the interpretation, and the columns represent the verification. The diagonal elements represent the correct classifications. The remaining elements of the rows represent errors by commission, and the remaining elements of the columns represent the errors of omission. For tests of hypothesis that compare variables, the general practice has been to use only the diagonal elements from several related classification error matrices. These data are arranged in the form of another contingency table. The columns of the table represent the different variables being compared, such as different scales of mapping. The rows represent the blocking characteristics, such as the various categories of classification.
The values in the cells of the tables might be the counts of correct classification or the binomial proportions of these counts divided by

  3. SPERM COUNT DISTRIBUTIONS IN FERTILE MEN

    Science.gov (United States)

    Sperm concentration and count are often used as indicators of environmental impacts on male reproductive health. Existing clinical databases may be biased towards subfertile men with low sperm counts and less is known about expected sperm count distributions in cohorts of fertil...

  4. Platelet counting using the Coulter electronic counter.

    Science.gov (United States)

    Eggleton, M J; Sharp, A A

    1963-03-01

    A method for counting platelets in dilutions of platelet-rich plasma using the Coulter electronic counter is described.(1) The results obtained show that such platelet counts are at least as accurate as the best methods of visual counting. The various technical difficulties encountered are discussed.

  5. Count-doubling time safety circuit

    International Nuclear Information System (INIS)

    Keefe, D.J.; McDowell, W.P.; Rusch, G.K.

    1981-01-01

    There is provided a nuclear reactor count-factor-increase time monitoring circuit which includes a pulse-type neutron detector, and means for counting the number of detected pulses during specific time periods. Counts are compared and the comparison is utilized to develop a reactor scram signal, if necessary

  6. Count-doubling time safety circuit

    Science.gov (United States)

    Rusch, Gordon K.; Keefe, Donald J.; McDowell, William P.

    1981-01-01

    There is provided a nuclear reactor count-factor-increase time monitoring circuit which includes a pulse-type neutron detector, and means for counting the number of detected pulses during specific time periods. Counts are compared and the comparison is utilized to develop a reactor scram signal, if necessary.
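
    The comparison logic described above can be sketched in software: count detected pulses in successive equal time periods and flag a trip when the count grows faster than an allowed factor. This is only a minimal analogue of the idea, not the patented circuit, whose trip criteria are more involved:

```python
def scram_needed(counts, factor=2.0):
    """Compare pulse counts from successive equal time periods and flag a
    reactor trip when the count increases by more than `factor` between
    consecutive periods (a software analogue of the count-factor-increase
    monitoring circuit; illustrative only)."""
    return any(later > factor * earlier
               for earlier, later in zip(counts, counts[1:])
               if earlier > 0)

print(scram_needed([100, 150, 210]))  # growth below 2x per period: False
print(scram_needed([100, 150, 320]))  # 150 -> 320 exceeds 2x: True
```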

  7. Investigation of internal conversion electron lines by track counting technique

    CERN Document Server

    Islamov, T A; Kambarova, N T; Muminov, T M; Lebedev, N A; Solnyshkin, A A; Aleshin, Yu D; Kolesnikov, V V; Silaev, V I; Niipf-Tashgu, T

    2001-01-01

    The methodology of counting the tracks of internal conversion electrons (ICE) in nuclear photoemulsion is described, together with results of ICE track counting on photoplates for ¹⁶¹Ho, ¹⁶³Tm, ¹⁶⁶Tm and ¹³⁵Ce. The results were obtained with the MBI-9 microscope and the MAS-1 automated facility. ICE track counting on photoplates provides essentially higher sensitivity than the photometric method, making it possible to carry out measurements with sources 1000 times weaker than those required for measurements of blackening density.

  8. Automatic quench compensation for liquid scintillation counting system

    International Nuclear Information System (INIS)

    Nather, R.E.

    1978-01-01

    A method of automatic quench compensation is provided, in which a reference measure of quench is taken on a sample prior to taking a sample count. The measure of quench is compared with a reference voltage source, established to vary in proportion to the variation of the measure of quench with the level of a system parameter required to restore at least one isotope spectral energy endpoint substantially to a selected counting-window discriminator level, in order to determine the amount of adjustment of the system parameter required to restore the endpoint. The system parameter is then adjusted accordingly, restoring the relative position of the discriminator windows and the sample spectrum, after which the sample count is taken.
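
    The compensation step reduces to a proportional rescaling: if quenching has pulled the spectral endpoint below the level established for an unquenched reference, the gain (or equivalent system parameter) is scaled by the ratio of the two. A hypothetical sketch, with channel numbers standing in for the analog levels in the patent:

```python
def compensated_gain(current_gain, measured_endpoint, reference_endpoint):
    """Scale the system gain so the quenched spectrum's energy endpoint
    returns to the reference discriminator level. (Illustrative sketch:
    the patent adjusts an unspecified system parameter via a reference
    voltage source; endpoint channels here are hypothetical.)"""
    return current_gain * reference_endpoint / measured_endpoint

# Quenching pulled the endpoint from channel 800 down to channel 600:
print(compensated_gain(1.0, 600.0, 800.0))  # gain raised to ~1.333
```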

  9. Moving from proprietary to open-source solutions for academic research in remote sensing: Example with semi-automated land cover mapping

    OpenAIRE

    Grippa, Taïs

    2017-01-01

    GRASS GIS has recently experienced significant improvements for Object-Based Image Analysis. At ULB the choice was made to combine GRASS GIS and Python in a semi-automated processing chain for land-cover mapping. The latter proved its ability to be quickly customized in order to match the requirements of different projects. In order to promote the OSGEO software, we decided to make it freely available, allowing anyone interested to review, reuse and/or enhance it for further studies.

  10. The Atacama Cosmology Telescope: Extragalactic Sources at 148 GHz in the 2008 Survey

    Science.gov (United States)

    Marriage, T. A.; Juin, J. B.; Lin, Y. T.; Marsden, D.; Nolta, M. R.; Partridge, B.; Ade, P. A. R.; Aguirre, P.; Amiri, M.; Appel, J. W.; hide

    2011-01-01

    We report on extragalactic sources detected in a 455 square-degree map of the southern sky made with data at a frequency of 148 GHz from the Atacama Cosmology Telescope 2008 observing season. We provide a catalog of 157 sources with flux densities spanning two orders of magnitude: from 15 mJy to 1500 mJy. Comparison to other catalogs shows that 98% of the ACT detections correspond to sources detected at lower radio frequencies. Three of the sources appear to be associated with the brightest cluster galaxies of low redshift X-ray selected galaxy clusters. Estimates of the radio to mm-wave spectral indices and differential counts of the sources further bolster the hypothesis that they are nearly all radio sources, and that their emission is not dominated by re-emission from warm dust. In a bright (> 50 mJy) 148 GHz-selected sample with complete cross-identifications from the Australia Telescope 20 GHz survey, we observe an average steepening of the spectra between 5, 20, and 148 GHz with median spectral indices of α_(5-20) = -0.07 ± 0.06, α_(20-148) = -0.39 ± 0.04, and α_(5-148) = -0.20 ± 0.03. When the measured spectral indices are taken into account, the 148 GHz differential source counts are consistent with previous measurements at 30 GHz in the context of a source count model dominated by radio sources. Extrapolating with an appropriately rescaled model for the radio source counts, the Poisson contribution to the spatial power spectrum from synchrotron-dominated sources with flux density less than 20 mJy is C^Sync = (2.8 ± 0.3) × 10⁻⁶ μK².
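
    The spectral indices quoted use the standard convention S ∝ ν^α, so α follows directly from two flux densities. A quick sketch, reproducing the median 20-148 GHz index from a hypothetical flux pair:

```python
import math

def spectral_index(s1, s2, nu1, nu2):
    """Two-point spectral index alpha, defined by S ~ nu**alpha."""
    return math.log(s2 / s1) / math.log(nu2 / nu1)

# A source with S(20 GHz) = 1.0 Jy falling to ~0.457 Jy at 148 GHz has
# alpha ~ -0.39, the median ACT value between 20 and 148 GHz:
print(round(spectral_index(1.0, 0.457, 20.0, 148.0), 2))  # → -0.39
```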

  11. Digital intelligence sources transporter

    International Nuclear Information System (INIS)

    Zhang Zhen; Wang Renbo

    2011-01-01

    Starting from particle-ray counting, infrared data communication, real-time monitoring and alarming, and GPRS communication, the digital management of radioactive sources is realized, with real-time monitoring of all aspects of storing, transporting and using the sources. An intelligent radioactive-source transporter is constructed, thereby achieving reliable security supervision of radioactive sources. (authors)

  12. How much do women count if they are not counted?

    Directory of Open Access Journals (Sweden)

    Federica Taddia

    2006-01-01

    Full Text Available The condition of women throughout the world is marked by countless injustices and violations of the most fundamental rights established by the Universal Declaration of human rights and every culture is potentially prone to commit discrimination against women in various forms. Women are worse fed, more exposed to physical violence, more exposed to diseases and less educated; they have less access to, or are excluded from, vocational training paths; they are the most vulnerable among prisoners of conscience, refugees and immigrants and the least considered within ethnic minorities; from their very childhood, women are humiliated, undernourished, sold, raped and killed; their work is generally less paid compared to men’s work and in some countries they are victims of forced marriages. Such condition is the result of old traditions that implicit gender-differentiated education has long promoted through cultural models based on theories, practices and policies marked by discrimination and structured differentially for men and women. Within these cultural models, the basic educational institutions have played and still play a major role in perpetuating such traditions. Nevertheless, if we want to overcome inequalities and provide women with empowerment, we have to start right from the educational institutions and in particular from school, through the adoption of an intercultural approach to education: an approach based on active pedagogy and on methods of analysis, exchange and enhancement typical of socio-educational animation. The intercultural approach to education is attentive to promote the realisation of each individual and the dignity and right of everyone to express himself/herself in his/her own way. Such an approach will give women the opportunity to become actual agents of collective change and to get the strength and wellbeing necessary to count and be counted as human beings entitled to freedom and equality, and to have access to all

  13. Quantitative Compton suppression spectrometry at elevated counting rates

    International Nuclear Information System (INIS)

    Westphal, G.P.; Joestl, K.; Schroeder, P.; Lauster, R.; Hausch, E.

    1999-01-01

    For quantitative Compton suppression spectrometry the decrease of coincidence efficiency with counting rate should be made negligible to avoid a virtual increase of relative peak areas of coincident isomeric transitions with counting rate. To that aim, a separate amplifier and discriminator has been used for each of the eight segments of the active shield of a new well-type Compton suppression spectrometer, together with an optimized, minimum dead-time design of the anticoincidence logic circuitry. Chance coincidence losses in the Compton suppression spectrometer are corrected instrumentally by comparing the chance coincidence rate to the counting rate of the germanium detector in a pulse-counting Busy circuit (G.P. Westphal, J. Rad. Chem. 179 (1994) 55) which is combined with the spectrometer's LFC counting loss correction system. The normally not observable chance coincidence rate is reconstructed from the rates of germanium detector and scintillation detector in an auxiliary coincidence unit, after the destruction of true coincidences by delaying one of the coincidence partners. Quantitative system response has been tested in two-source measurements with a fixed reference source of ⁶⁰Co at 14 kc/s, and various samples of ¹³⁷Cs, up to aggregate counting rates of 180 kc/s for the well-type detector, and more than 1400 kc/s for the BGO shield. In these measurements, the net peak areas of the 1173.3 keV line of ⁶⁰Co remained constant at typical values of 37 000 with and 95 000 without Compton suppression, with maximum deviations from the average of less than 1.5%
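
    The chance (accidental) coincidence rate being reconstructed here follows the standard two-detector approximation R_ch ≈ 2τ·R1·R2 for uncorrelated rates R1, R2 and coincidence resolving time τ. A sketch with an assumed τ of 100 ns (the abstract does not quote the actual resolving time):

```python
def chance_coincidence_rate(rate_1, rate_2, resolving_time):
    """Expected accidental coincidence rate between two detectors with
    uncorrelated count rates and resolving time tau: R_ch ~ 2*tau*R1*R2
    (standard approximation, valid while 2*tau*R << 1). The spectrometer
    described above reconstructs this rate in hardware by delaying one
    branch to destroy the true coincidences."""
    return 2.0 * resolving_time * rate_1 * rate_2

# 180 kc/s in the Ge detector, 1.4 Mc/s in the BGO shield, tau = 100 ns:
print(chance_coincidence_rate(1.8e5, 1.4e6, 1e-7))  # → ~5.04e4 per second
```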

  14. EEG in Silent Small Vessel Disease : sLORETA Mapping Reveals Cortical Sources of Vascular Cognitive Impairment No Dementia in the Default Mode Network

    NARCIS (Netherlands)

    Sheorajpanday, Rishi V. A.; Marien, Peter; Weeren, Arie J. T. M.; Nagels, Guy; Saerens, Jos; van Putten, Michel J. A. M.; De Deyn, Peter P.

    Introduction: Vascular cognitive impairment, no dementia (vCIND) is a prevalent and potentially preventable disorder. Clinical presentation of the small vessel subcortical subtype may be insidious and difficult to diagnose in the initial stage. We investigated electroencephalographic sources of subcortical

  15. Radon counting statistics - a Monte Carlo investigation

    International Nuclear Information System (INIS)

    Scott, A.G.

    1996-01-01

    Radioactive decay is a Poisson process, and so the coefficient of variation (COV) of "n" counts of a single nuclide is usually estimated as 1/√n. This is only true if the count duration is much shorter than the half-life of the nuclide. At longer count durations, the COV is smaller than the Poisson estimate. Most radon measurement methods count the alpha decays of ²²²Rn plus those of the progeny ²¹⁸Po and ²¹⁴Po, and estimate the ²²²Rn activity from the sum of the counts. At long count durations, the chain decay of these nuclides means that every ²²²Rn decay must be followed by two other alpha decays. The total number of decays is "3N", where N is the number of radon decays, and the true COV of the radon concentration estimate is 1/√N, a factor √3 larger than the Poisson total-count estimate of 1/√(3N). Most count periods are comparable to the half-lives of the progeny, so the relationship between COV and count time is complex. A Monte Carlo estimate of the ratio of true COV to Poisson estimate was carried out for a range of count periods from 1 min to 16 h and three common radon measurement methods: liquid scintillation, scintillation cell, and electrostatic precipitation of progeny. The Poisson approximation underestimates the COV by less than 20% for count durations of less than 60 min
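    The long-count-duration limit quoted above (true COV a factor √3 larger than the naive Poisson estimate) is easy to check numerically. A minimal sketch, modelling only that limiting case in which every ²²²Rn decay contributes exactly three counts; the finite progeny half-lives and count windows simulated in the paper are ignored, and the rate and sample size are illustrative:

    ```python
    import numpy as np

    # Long-duration limit: each 222Rn decay is followed by one 218Po and
    # one 214Po alpha decay, so total alpha counts = 3N with N Poisson.
    rng = np.random.default_rng(0)
    lam = 10_000                       # mean radon decays per count period
    N = rng.poisson(lam, size=200_000) # many repeated count periods
    counts = 3 * N                     # total alpha counts recorded

    # True COV of the radon estimate vs. naive Poisson estimate 1/sqrt(3N)
    true_cov = counts.std() / counts.mean()     # ~ 1/sqrt(lam)
    poisson_cov = 1.0 / np.sqrt(counts.mean())  # ~ 1/sqrt(3*lam)
    ratio = true_cov / poisson_cov
    print(ratio)  # ~ sqrt(3) ≈ 1.73
    ```

    The chain multiplies the counts by 3 but adds no independent information, so the apparent √(3λ) precision of the total count overstates the real precision by √3.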

  16. Discrete calculus methods for counting

    CERN Document Server

    Mariconda, Carlo

    2016-01-01

    This book provides an introduction to combinatorics, finite calculus, formal series, recurrences, and approximations of sums. Readers will find not only coverage of the basic elements of the subjects but also deep insights into a range of less common topics rarely considered within a single book, such as counting with occupancy constraints, a clear distinction between algebraic and analytical properties of formal power series, an introduction to discrete dynamical systems with a thorough description of Sarkovskii’s theorem, symbolic calculus, and a complete description of the Euler-Maclaurin formulas and their applications. Although several books touch on one or more of these aspects, precious few cover all of them. The authors, both pure mathematicians, have attempted to develop methods that will allow the student to formulate a given problem in a precise mathematical framework. The aim is to equip readers with a sound strategy for classifying and solving problems by pursuing a mathematically rigorous yet ...

  17. Counting paths with Schur transitions

    Energy Technology Data Exchange (ETDEWEB)

    Díaz, Pablo [Department of Physics and Astronomy, University of Lethbridge, Lethbridge, Alberta, T1K 3M4 (Canada); Kemp, Garreth [Department of Physics, University of Johannesburg, P.O. Box 524, Auckland Park 2006 (South Africa); Véliz-Osorio, Alvaro, E-mail: aveliz@gmail.com [Mandelstam Institute for Theoretical Physics, University of the Witwatersrand, WITS 2050, Johannesburg (South Africa); School of Physics and Astronomy, Queen Mary, University of London, Mile End Road, London E1 4NS (United Kingdom)

    2016-10-15

    In this work we explore the structure of the branching graph of the unitary group using Schur transitions. We find that these transitions suggest a new combinatorial expression for counting paths in the branching graph. This formula, which is valid for any rank of the unitary group, reproduces known asymptotic results. We proceed to establish the general validity of this expression by a formal proof. The form of this equation strongly hints towards a quantum generalization. Thus, we introduce a notion of quantum relative dimension and subject it to the appropriate consistency tests. This new quantity finds its natural environment in the context of RCFTs and fractional statistics, where the already established notion of quantum dimension has proven to be of great physical importance.

  18. Mass counting of radioactivity samples

    International Nuclear Information System (INIS)

    Oesterlin, D.L.; Obrycki, R.F.

    1977-01-01

    A method and apparatus for concurrently counting a plurality of radioactive samples is claimed. The position sensitive circuitry of a scintillation camera is employed to sort electrical pulses resulting from scintillations according to the geometrical locations of scintillations causing those pulses. A scintillation means, in the form of a scintillating crystal material or a liquid scintillator, is positioned proximate to an array of radioactive samples. Improvement in the accuracy of pulse classification may be obtained by employing collimating means. If a plurality of scintillation crystals are employed to measure the iodine-125 content of samples, a method and means are provided for correcting for variations in crystal light transmission properties, sample volume, and sample container radiation absorption. 2 claims, 7 drawing figures

  19. Source SDK development essentials

    CERN Document Server

    Bernier, Brett

    2014-01-01

    The Source Authoring Tools are the pieces of software used to create custom content for games made with Valve's Source engine. Creating mods and maps for your games without any programming knowledge can be time consuming. These tools allow you to create your own maps and levels without the need for any coding knowledge. All the tools that you need to start creating your own levels are built-in and ready to go! This book will teach you how to use the Authoring Tools provided with Source games and will guide you in creating your first maps and mods (modifications) using Source. You will learn ho

  20. Mapping Mixed Methods Research: Methods, Measures, and Meaning

    Science.gov (United States)

    Wheeldon, J.

    2010-01-01

    This article explores how concept maps and mind maps can be used as data collection tools in mixed methods research to combine the clarity of quantitative counts with the nuance of qualitative reflections. Based on more traditional mixed methods approaches, this article details how the use of pre/post concept maps can be used to design qualitative…

  1. Aerial Survey Counts of Harbor Seals in Lake Iliamna, Alaska, 1984-2013 (NODC Accession 0123188)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset provides counts of harbor seals from aerial surveys over Lake Iliamna, Alaska, USA. The data have been collated from three previously published sources...

  2. A Dataset of Aerial Survey Counts of Harbor Seals in Iliamna Lake, Alaska: 1984-2013

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset provides counts of harbor seals from aerial surveys over Iliamna Lake, Alaska, USA. The data have been collated from three previously published sources...

  3. Solvation Structure and Thermodynamic Mapping (SSTMap): An Open-Source, Flexible Package for the Analysis of Water in Molecular Dynamics Trajectories.

    Science.gov (United States)

    Haider, Kamran; Cruz, Anthony; Ramsey, Steven; Gilson, Michael K; Kurtzman, Tom

    2018-01-09

    We have developed SSTMap, a software package for mapping structural and thermodynamic water properties in molecular dynamics trajectories. The package introduces automated analysis and mapping of local measures of frustration and enhancement of water structure. The thermodynamic calculations are based on Inhomogeneous Fluid Solvation Theory (IST), which is implemented using both site-based and grid-based approaches. The package also extends the applicability of solvation analysis calculations to multiple molecular dynamics (MD) simulation programs by using existing cross-platform tools for parsing MD parameter and trajectory files. SSTMap is implemented in Python and contains both command-line tools and a Python module to facilitate flexibility in setting up calculations and for automated generation of large data sets involving analysis of multiple solutes. Output is generated in formats compatible with popular Python data science packages. This tool will be used by the molecular modeling community for computational analysis of water in problems of biophysical interest such as ligand binding and protein function.

  4. Set of counts by scintillations for atmospheric samplings; Ensemble de comptages par scintillations pour prelevements atmospheriques

    Energy Technology Data Exchange (ETDEWEB)

    Appriou, D.; Doury, A.

    1962-07-01

    The authors report the development of a scintillation-based counting assembly with the following characteristics: a photomultiplier with a wide photocathode, a thin plastic scintillator for beta + alpha counting (with the possibility of mounting an alpha scintillator), an intrinsic background that is relatively small with respect to the activities to be counted, and a weakly varying efficiency. The authors discuss the counting objective and present equipment tests (counter, proportional amplifier and pre-amplifier, input drawer). They describe the operation of the apparatus, discuss the selection of scintillators, report the study of the intrinsic background (electron-induced background noise, total background noise, background-noise reduction), and discuss counts (influence of the external source, sensitivity to alpha radiation, counting homogeneity, minimum detectable activity) and efficiencies.

  5. Toward unstained cytology and complete blood counts at the point of care (Conference Presentation)

    Science.gov (United States)

    Zuluaga, Andres F.; Pierce, Mark C.; MacAulay, Calum E.

    2017-02-01

    Cytology tests, whether performed on body fluids, aspirates, or scrapings, are commonly used to detect, diagnose, and monitor a wide variety of health conditions. Complete blood counts (CBCs) quantify the number of red and white blood cells in a blood volume, as well as the different types of white blood cells. There is a critical unmet need for an instrument that can perform CBCs at the point of care (POC), and there is currently no product in the US that can perform this test at the bedside. We have developed a system capable of tomographic imaging with sub-cellular resolution using consumer-grade broadband (LED) sources and CMOS detectors suitable for POC implementation of CBC tests. The system consists of cascaded static Michelson and Sagnac interferometers that map phase (encoding depth) and a transverse spatial dimension onto a two-dimensional output plane. Our approach requires a 5 microliter sample, can be performed in 5 minutes or less, and does not require staining or other processing as it relies on intrinsic contrast. We will show results directly imaging and differentiating unstained blood cells using supercontinuum fiber lasers and LEDs as sources and CMOS cameras as sensors. We will also lay out the follow-up steps needed, including image segmentation, analysis, and classification, to verify performance and advance toward CBCs that can be performed at the bedside and do not require CLIA-certified laboratories.

  6. Correction for intrinsic and set dead-time losses in radioactivity counting

    International Nuclear Information System (INIS)

    Wyllie, H.A.

    1992-12-01

    Equations are derived for the determination of the intrinsic dead time of the components which precede the paralysis unit in a counting system for measuring radioactivity. The determination depends on the extension of the set dead time by the intrinsic dead time. Improved formulae are given for the dead-time correction of the count rate of a radioactive source in a single-channel system. A variable in the formulae is the intrinsic dead time which is determined concurrently with the counting of the source. The only extra equipment required in a conventional system is a scaler. 5 refs., 2 tabs., 21 figs
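    The abstract does not reproduce the paper's specific correction formulae, so as background, here is a sketch of the two standard textbook dead-time models that such corrections build on. The dead time and rates are illustrative values, not taken from the paper:

    ```python
    import math

    def nonparalyzable_true_rate(m, tau):
        """Recover the true event rate n from the measured rate m under a
        non-paralyzable dead time tau, inverting m = n / (1 + n*tau)."""
        return m / (1.0 - m * tau)

    def paralyzable_measured_rate(n, tau):
        """Measured rate under a paralyzable (extending) dead time:
        m = n * exp(-n*tau); each event restarts the dead period."""
        return n * math.exp(-n * tau)

    tau = 2e-6      # 2 microsecond dead time (illustrative)
    n_true = 1e5    # true event rate, counts/s (illustrative)

    # Non-paralyzable model: forward then inverse recovers the true rate
    m = n_true / (1.0 + n_true * tau)
    print(nonparalyzable_true_rate(m, tau))   # recovers ~1e5 counts/s
    ```

    At the same true rate, the paralyzable model loses more counts than the non-paralyzable one, which is why identifying the correct model (and, as in the paper, the intrinsic dead time of the electronics) matters before applying a correction.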

  7. Results of LA-ICP-MS sulfide mapping from Algoma-type BIF gold systems with implications for the nature of mineralizing fluids, metal sources, and deposit models

    Science.gov (United States)

    Gourcerol, B.; Kontak, D. J.; Thurston, P. C.; Petrus, J. A.

    2018-01-01

    Quantitative laser ablation inductively coupled plasma-mass spectrometry (LA-ICP-MS) element distribution maps combined with traverse-mode analyses have been acquired on various sulfides (pyrite, pyrrhotite, arsenopyrite) from three Canadian Algoma-type BIF-hosted gold deposits (4 Moz Au Meadowbank, ≥2.8 Moz Au Meliadine district, 6 Moz Au Musselwhite). These data, in conjunction with detailed petrographic and SEM-EDS observations, provide insight into the nature and relative timing of gold events, the presence and implications of trace-element zoning for crystallization processes, and the elemental associations that fingerprint gold events. Furthermore, the use of an innovative method of processing the LA-ICP-MS data in map and traverse modes, whereby the results are fragmented into time-slice data, to generate various binary plots (Ag versus Ni) provides a means to identify elemental associations (Te, Bi) not otherwise apparent. This integrated means of treating geochemical data, along with petrography, allows multiple gold events and remobilization processes to be recognized and their elemental associations determined. The main gold event in each of these deposits is characterized by the coupling of an As-Se-Te-Ag element association coincident with intense stratabound sulfide replacement of the Fe-rich host rock. Additionally, the data indicate the presence of a later remobilization event, which upgraded the Au tenor, as either non-refractory or refractory type, along fracture networks due to the ingress of subsequent base metal-bearing metamorphic fluids (mainly a Pb-Bi association). Furthermore, the data reveal a stratigraphic influence, as reflected in the observed elemental associations and enrichments and in the nature of the sulfide phase hosting the gold mineralization (arsenopyrite versus pyrite).

  8. Topographic mapping

    Science.gov (United States)

    ,

    2008-01-01

    The U.S. Geological Survey (USGS) produced its first topographic map in 1879, the same year it was established. Today, more than 100 years and millions of map copies later, topographic mapping is still a central activity for the USGS. The topographic map remains an indispensable tool for government, science, industry, and leisure. Much has changed since early topographers traveled the unsettled West and carefully plotted the first USGS maps by hand. Advances in survey techniques, instrumentation, and design and printing technologies, as well as the use of aerial photography and satellite data, have dramatically improved mapping coverage, accuracy, and efficiency. Yet cartography, the art and science of mapping, may never before have undergone change more profound than today.

  9. Counting Zero: Rethinking Feminist Epistemologies

    Directory of Open Access Journals (Sweden)

    Xin Liu

    2017-10-01

    Full Text Available This article concerns feminist engagements with epistemologies. Feminist epistemologies have revealed and challenged the exclusions and denigrations at work in knowledge production processes. And yet, the emphasis on the partiality of knowledge and the non-innocence of any subject position also casts doubt on the possibility of feminist political communities. In view of this, it has been argued that the very parameter of epistemology poses limitations for feminism, for it leads to either political paralysis or a prescriptive politics that in fact undoes the political of politics. From a different perspective, decolonial feminists argue for radical epistemic disobedience and delinking, a move beyond the confines of Western systems of knowledge and its extractive knowledge economy. Nevertheless, an oppositional logic informs both feminist epistemologies and their critiques, which I argue is symptomatic of the epistemic habits of academic feminism. This article ends with a preliminary reconsideration of the question of origin through the figure of zero. It asks whether it might be possible to conceive of feminist epistemologies as performing the task of counting zero – accounting for origin, wholeness, and universality – that takes into account specificities without forfeiting coalition and claims to knowledge.

  10. Pulse-duration discrimination for increasing counting characteristic plateau and for improving counting rate stability of a scintillation counter

    International Nuclear Information System (INIS)

    Kuz'min, M.G.

    1977-01-01

    To improve the operating stability of scintillation counters, the possibility of extending the counting plateau and reducing its slope is discussed. A circuit is presented for discriminating the signal pulses among the input pulses of a photomultiplier. The counting characteristics have been measured with the scintillation detectors irradiated by different gamma sources (⁶⁰Co, ¹³⁷Cs, ²⁴¹Am) and without a source, with the scintillation detector shielded by a tungsten cylinder with a wall thickness of 23 mm. The comparison has revealed that discrimination by pulse duration extends the plateau and reduces its slope. From a comparison of the noise characteristics, a relationship is found between the number of noise pulses and the gamma-radiation energy. For better stability of the counting rate it is suggested to introduce into the scintillation counter a circuit for duration discrimination of the photomultiplier output pulses.

  11. Fast radio burst event rate counts - I. Interpreting the observations

    Science.gov (United States)

    Macquart, J.-P.; Ekers, R. D.

    2018-02-01

    The fluence distribution of the fast radio burst (FRB) population (the `source count' distribution, N(>F) ∝ F^α) is a crucial diagnostic of its distance distribution, and hence of the progenitor evolutionary history. We critically reanalyse current estimates of the FRB source count distribution. We demonstrate that the Lorimer burst (FRB 010724) is subject to discovery bias, and should be excluded from all statistical studies of the population. We re-examine the evidence for flat, α > -1, source count estimates based on the ratio of single-beam to multiple-beam detections with the Parkes multibeam receiver, and show that current data imply only a very weak constraint of α ≲ -1.3. A maximum-likelihood analysis applied to the portion of the Parkes FRB population detected above the observational completeness fluence of 2 Jy ms yields α = -2.6^{+0.7}_{-1.3}. Uncertainties in the location of each FRB within the Parkes beam render estimates of the Parkes event rate uncertain in both the normalizing survey area and the estimated post-beam-corrected completeness fluence; this uncertainty needs to be accounted for when comparing the event rate against event rates measured at other telescopes.
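    A maximum-likelihood slope estimate of this kind can be sketched with the standard closed-form estimator for a pure power law above a completeness limit. This is a generic Hill-type estimator applied to simulated fluences, not Macquart & Ekers' exact likelihood; the completeness fluence `f_min` and the sample are illustrative:

    ```python
    import numpy as np

    def ml_alpha(fluences, f_min):
        """ML estimate of the cumulative slope alpha in N(>F) ∝ F**alpha,
        using only events above the completeness fluence f_min."""
        f = np.asarray(fluences)
        f = f[f >= f_min]
        return -len(f) / np.sum(np.log(f / f_min))

    # Simulate fluences from N(>F) ∝ F**alpha with alpha = -2.6 by
    # inverse-CDF sampling: P(>F) = (F/f_min)**alpha, so F = f_min*U**(1/alpha)
    rng = np.random.default_rng(1)
    alpha_true = -2.6
    f_min = 2.0                                 # Jy ms, completeness limit
    u = rng.uniform(size=50_000)
    fluences = f_min * u ** (1.0 / alpha_true)

    print(ml_alpha(fluences, f_min))  # close to -2.6
    ```

    The estimator's statistical uncertainty scales as |α|/√n, which is why the small Parkes sample above completeness leaves the broad error bars quoted in the abstract.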

  12. Heterogeneous counting on filter support media

    International Nuclear Information System (INIS)

    Long, E.; Kohler, V.; Kelly, M.J.

    1976-01-01

    Many investigators in the biomedical research area have used filter paper as the support for radioactive samples. This means that heterogeneous counting of the sample sometimes results. The count rate of a sample on a filter will be affected by positioning, degree of dryness, sample application procedure, the type of filter, and the type of cocktail used. Positioning of the filter (up or down) in the counting vial can cause a variation of 35% or more when counting tritiated samples on filter paper. Samples of varying degrees of dryness when added to the counting cocktail can cause nonreproducible counts if handled improperly. Count rates starting at 2400 CPM initially can become 10,000 CPM in 24 hours for ³H-DNA (deoxyribonucleic acid) samples dried on standard cellulose acetate membrane filters. Data on cellulose nitrate filters show a similar trend. Sample application procedures in which the sample is applied to the filter in a small spot or on a large amount of the surface area can cause nonreproducible or very low counting rates. A tritiated DNA sample, when applied topically, gives a count rate of 4,000 CPM. When the sample is spread over the whole filter, 13,400 CPM are obtained with a much better coefficient of variation (5% versus 20%). Adding protein carrier (bovine serum albumin, BSA) to the sample to trap more of the tritiated DNA on the filter during the filtration process causes a serious beta absorption problem. Count rates which are one-fourth the count rate applied to the filter are obtained on calibrated runs. Many of the problems encountered can be alleviated by a proper choice of filter and the use of a liquid scintillation cocktail which dissolves the filter. Filter-Solv has been used to dissolve cellulose nitrate filters and filters which are a combination of cellulose nitrate and cellulose acetate. Count rates obtained for these dissolved samples are very reproducible and highly efficient

  13. Leveraging multiple datasets for deep leaf counting

    OpenAIRE

    Dobrescu, Andrei; Giuffrida, Mario Valerio; Tsaftaris, Sotirios A

    2017-01-01

    The number of leaves a plant has is one of the key traits (phenotypes) describing its development and growth. Here, we propose an automated, deep learning based approach for counting leaves in model rosette plants. While state-of-the-art results on leaf counting with deep learning methods have recently been reported, they obtain the count as a result of leaf segmentation and thus require per-leaf (instance) segmentation to train the models (a rather strong annotation). Instead, our method tre...

  14. Multiplicity counting from fission detector signals with time delay effects

    Science.gov (United States)

    Nagy, L.; Pázsit, I.; Pál, L.

    2018-03-01

    In recent work, we have developed the theory of using the first three auto- and joint central moments of the currents of up to three fission chambers to extract the singles, doubles and triples count rates of traditional multiplicity counting (Pázsit and Pál, 2016; Pázsit et al., 2016). The objective is to elaborate a method for determining the fissile mass, neutron multiplication, and (α, n) neutron emission rate of an unknown assembly of fissile material from the statistics of the fission chamber signals, analogous to the traditional multiplicity counting methods with detectors in the pulse mode. Such a method would be an alternative to He-3 detector systems, which would be free from the dead time problems that would be encountered in high counting rate applications, for example the assay of spent nuclear fuel. A significant restriction of our previous work was that all neutrons born in a source event (spontaneous fission) were assumed to be detected simultaneously, which is not fulfilled in reality. In the present work, this restriction is eliminated, by assuming an independent, identically distributed random time delay for all neutrons arising from one source event. Expressions are derived for the same auto- and joint central moments of the detector current(s) as in the previous case, expressed with the singles, doubles, and triples (S, D and T) count rates. It is shown that if the time-dispersion of neutron detections is of the same order of magnitude as the detector pulse width, as they typically are in measurements of fast neutrons, the multiplicity rates can still be extracted from the moments of the detector current, although with more involved calibration factors. The presented formulae, and hence also the performance of the proposed method, are tested by both analytical models of the time delay as well as with numerical simulations. Methods are suggested also for the modification of the method for large time delay effects (for thermalised neutrons).

  15. Validation and uncertainty quantification of detector response functions for a 1″×2″ NaI collimated detector intended for inverse radioisotope source mapping applications

    Science.gov (United States)

    Nelson, N.; Azmy, Y.; Gardner, R. P.; Mattingly, J.; Smith, R.; Worrall, L. G.; Dewji, S.

    2017-11-01

    Detector response functions (DRFs) are often used for inverse analysis. We compute the DRF of a sodium iodide (NaI) nuclear material holdup field detector using the code named g03, developed by the Center for Engineering Applications of Radioisotopes (CEAR) at NC State University. Three measurement campaigns were performed in order to validate the DRFs constructed by g03: on-axis detection of calibration sources, off-axis measurements of a highly enriched uranium (HEU) disk, and on-axis measurements of the HEU disk with steel plates inserted between the source and the detector to provide attenuation. Furthermore, this work quantifies the uncertainty of the Monte Carlo simulations used in and with g03, as well as the uncertainties associated with each semi-empirical model employed in the full DRF representation. Overall, for the calibration source measurements, the response computed by the DRF for the prediction of the full-energy peak region of responses was good, i.e. within two standard deviations of the experimental response. In contrast, the DRF tended to overestimate the Compton continuum by about 45-65% due to inadequate tuning of the electron range multiplier fit variable that empirically represents physics associated with electron transport that is not modeled explicitly in g03. For the HEU disk measurements, computed DRF responses tended to significantly underestimate (by more than 20%) the secondary full-energy peaks (any peak of lower energy than the highest-energy peak computed) due to scattering in the detector collimator and aluminum can, which is not included in the g03 model. We ran a sufficiently large number of histories to ensure, for all of the Monte Carlo simulations, that the statistical uncertainties were lower than their experimental counterparts' Poisson uncertainties. The uncertainties associated with least-squares fits to the experimental data tended to have parameter relative standard deviations lower than the peak channel relative standard

  16. Resonance ionization spectroscopy: Counting noble gas atoms

    International Nuclear Information System (INIS)

    Hurst, G.S.; Payne, M.G.; Chen, C.H.; Willis, R.D.; Lehmann, B.E.; Kramer, S.D.

    1981-01-01

    The purpose of this paper is to describe new work on the counting of noble gas atoms, using lasers for the selective ionization and detectors for counting individual particles (electrons or positive ions). When positive ions are counted, various kinds of mass analyzers (magnetic, quadrupole, or time-of-flight) can be incorporated to provide A selectivity. We show that a variety of interesting and important applications can be made with atom-counting techniques which are both atomic number (Z) and mass number (A) selective. (orig./FKS)

  17. Whole-body counting 1990

    International Nuclear Information System (INIS)

    Strand, P.; Selnaes, T.D.

    1990-01-01

    In order to determine the doses from radiocesium in foods after the Chernobyl accident, four groups were chosen in 1987. Two groups, presumed to have a large consumption of food items with a high radiocesium content, were selected: Lapp reindeer breeders from central parts of Norway, and hunters and others from the municipality of Oeystre Slidre. Two other groups were randomly selected, one from the municipality of Sel and one from Oslo; the persons in these two groups were presumed to have an average diet. The fall-out in Sel was fairly large (100 kBq/m²), whereas in Oslo the fall-out level was low (2 kBq/m²). The persons in each group were monitored once a year with whole-body counters, and in connection with these countings dietary surveys were performed. In 1990 the Sel group and the Lapps in central parts of Norway were followed. The average whole-body activity in each group is compared to earlier years' results, and an average yearly effective dose equivalent is computed. The Sel group has an average whole-body activity of 2800 Bq for men and 690 Bq for women. Compared to earlier years, there is a steady but slow decrease in whole-body activities. The yearly dose is calculated to be 0.06 mSv for 1990. The Lapps in central parts of Norway have an average whole-body content of 23800 Bq for men and 13600 Bq for women. This results in an average yearly dose of 0.9 mSv for the individuals in the group. Compared to earlier years, the Lapp group shows a decrease in whole-body contents since 1988. This decrease is larger among men than women. 5 refs., 8 figs., 6 tabs

  18. Dendrochronology as a source of data for landslide activity maps – an example from Beskid Żywiecki Mountains (Western Carpathians, Poland)

    Directory of Open Access Journals (Sweden)

    Łuszczyńska Katarzyna

    2017-09-01

    Full Text Available We applied dendrochronological methods for dating landslide activity in the study area (3.75 km²) on the slopes of Sucha Mountain (1040 m a.s.l.) in the Beskid Żywiecki Mountains, in the Western Carpathians. 46 sampling sites were distributed throughout the study area. At each site we sampled 1-3 coniferous trees: Norway spruces (Picea abies Karst.) and/or silver firs (Abies alba Mill.). From each tree 2 cores were sampled: one from the upslope and the other from the downslope side of the stem. Based on tree-ring widths measured on opposite sides of the stems we calculated eccentricity index values and dated past landslide events. The mean frequency of landslides was obtained for each sampling site. Finally, the data were interpolated into a map of landslide activity; Inverse Distance Weighting (IDW) interpolation was applied. For most of the study area we found medium (19 sites) and low (23 sites) levels of landslide activity. The highest level of activity was recorded for the largest landslide slope and for one small landslide. The study conducted on Sucha Mountain has shown that dendrochronology can be an effective method for analysing landslide activity and may be useful in further studies, including those for landslide hazard and risk assessments.

  19. AN ANNOTATED BIBLIOGRAPHY OF CLIMATIC MAPS OF ANGOLA,

    Science.gov (United States)

    Contents: Map of political divisions of Africa; Map of Angola; Sources with abstracts listed alphabetically by author; Alphabetical author index; Subject heading index with period of record; Subject heading index with map scales.

  20. Multiplicity counting from fission chamber signals in the current mode

    Energy Technology Data Exchange (ETDEWEB)

    Pázsit, I. [Chalmers University of Technology, Department of Physics, Division of Subatomic and Plasma Physics, SE-412 96 Göteborg (Sweden); Pál, L. [Centre for Energy Research, Hungarian Academy of Sciences, 114, POB 49, H-1525 Budapest (Hungary); Nagy, L. [Chalmers University of Technology, Department of Physics, Division of Subatomic and Plasma Physics, SE-412 96 Göteborg (Sweden); Budapest University of Technology and Economics, Institute of Nuclear Techniques, H-1111 Budapest (Hungary)

    2016-12-11

    In nuclear safeguards, estimation of sample parameters using neutron-based non-destructive assay methods is traditionally based on multiplicity counting with thermal neutron detectors in the pulse mode. These methods in general require multi-channel analysers and various dead time correction methods. This paper proposes and elaborates on an alternative method, which is based on fast neutron measurements with fission chambers in the current mode. A theory of “multiplicity counting” with fission chambers is developed by incorporating Böhnel's concept of superfission [1] into a master equation formalism, developed recently by the present authors for the statistical theory of fission chamber signals [2,3]. Explicit expressions are derived for the first three central auto- and cross moments (cumulants) of the signals of up to three detectors. These constitute the generalisation of the traditional Campbell relationships for the case when the incoming events represent a compound Poisson distribution. Because now the expressions contain the factorial moments of the compound source, they contain the same information as the singles, doubles and triples rates of traditional multiplicity counting. The results show that in addition to the detector efficiency, the detector pulse shape also enters the formulas; hence, the method requires a more involved calibration than the traditional method of multiplicity counting. However, the method has some advantages by not needing dead time corrections, as well as having a simpler and more efficient data processing procedure, in particular for cross-correlations between different detectors, than the traditional multiplicity counting methods.
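    For a plain (non-compound) Poisson pulse train, the generalised relationships described above reduce to the classic Campbell theorem: the mean of the detector current is s₀∫f(t)dt and its variance is s₀∫f(t)²dt, where s₀ is the event rate and f(t) the detector pulse shape. A minimal numerical check of those two classic relations (an illustrative exponential pulse and rate; the compound-source and multiplication effects treated in the paper are deliberately left out):

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    s0 = 2000.0   # event rate, 1/s (illustrative)
    tau = 1e-3    # pulse decay time, s
    dt = 1e-4     # sampling step, s
    T = 200.0     # simulated duration, s
    n_steps = int(T / dt)

    # Exponential pulse shape f(t) = exp(-t/tau), sampled on the grid
    t = np.arange(0.0, 10 * tau, dt)
    pulse = np.exp(-t / tau)

    # Poisson arrivals per time step, filtered by the pulse shape
    arrivals = rng.poisson(s0 * dt, size=n_steps)
    signal = np.convolve(arrivals, pulse)[:n_steps]

    # Campbell's theorem, with the integrals taken as discrete sums
    mean_pred = s0 * pulse.sum() * dt        # s0 * integral of f
    var_pred = s0 * (pulse ** 2).sum() * dt  # s0 * integral of f^2
    print(signal.mean(), mean_pred)
    print(signal.var(), var_pred)
    ```

    Higher-order cumulants follow the same pattern (the k-th cumulant is s₀∫f(t)ᵏdt for a plain Poisson source); the paper's contribution is the generalisation of these relations to compound Poisson sources, where the factorial moments of the multiplicity distribution enter as well.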

  1. The Hausdorff and box-counting dimensions of a class of recurrent sets

    Energy Technology Data Exchange (ETDEWEB)

    Dai Meifeng [Nonlinear Scientific Research Center, Faculty of Science, Jiangsu University, Zhenjiang 212013 (China)], E-mail: daimf@ujs.edu.cn; Liu Xi [Nonlinear Scientific Research Center, Faculty of Science, Jiangsu University, Zhenjiang 212013 (China)], E-mail: liuxi2001@etang.com

    2008-05-15

    It is well known that many familiar fractal sets can be generated by the recurrent method. Conclusions under a similitude linear map are straightforward. In this paper, we study the upper and lower bounds for the Hausdorff dimension and box-counting dimension of recurrent sets. In particular, we focus our attention on the non-similitude case.
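
The box-counting dimension bounded in this paper is straightforward to estimate numerically: cover the set with boxes of side ε and regress log N(ε) on log(1/ε). A small illustration on the middle-thirds Cantor set, a similitude example with known dimension log 2 / log 3 (sample size and box levels are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)

# sample the middle-thirds Cantor set: random base-3 digits restricted to {0, 2}
digits = 2 * rng.integers(0, 2, size=(20_000, 20))
points = digits @ (3.0 ** -np.arange(1, 21))

# count occupied boxes N(eps) for eps = 3**-k, then fit log N against log(1/eps)
ks = np.arange(1, 9)
n_boxes = [np.unique(np.floor(points * 3.0**k)).size for k in ks]
dim = np.polyfit(ks * np.log(3.0), np.log(n_boxes), 1)[0]

print(f"estimated box-counting dimension: {dim:.3f}")  # theory: log 2 / log 3 ≈ 0.631
```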

  2. Participatory Maps

    DEFF Research Database (Denmark)

    Salovaara-Moring, Inka

    2016-01-01

    practice. In particular, mapping environmental damage, endangered species, and human-made disasters has become one focal point for environmental knowledge production. This type of digital map has been highlighted as a processual turn in critical cartography, whereas in related computational journalism...... of a geo-visualization within information mapping that enhances embodiment in the experience of the information. InfoAmazonia is defined as a digitally created map-space within which journalistic practice can be seen as dynamic, performative interactions between journalists, ecosystems, space, and species...

  3. TESTING THE GLOBAL STAR FORMATION RELATION: AN HCO+ (3-2) MAPPING STUDY OF RED MSX SOURCES IN THE BOLOCAM GALACTIC PLANE SURVEY

    International Nuclear Information System (INIS)

    Schenck, David E.; Shirley, Yancy L.; Reiter, Megan; Juneau, Stephanie

    2011-01-01

    We present an analysis of the relation between the star formation rate (SFR) and mass of dense gas in Galactic clumps and nearby galaxies. Using the bolometric luminosity as a measure of SFR and the molecular line luminosity of HCO+ (3-2) as a measure of dense gas mass, we find that the relation between SFR and M_dense is approximately linear. This is similar to published results derived using HCN (1-0) as a dense gas tracer. HCO+ (3-2) and HCN (1-0) have similar conditions for excitation. Our work includes 16 Galactic clumps that are in both the Bolocam Galactic Plane Survey and the Red MSX Source Survey, 27 water maser sources from the literature, and the aforementioned HCN (1-0) data. Our results agree qualitatively with predictions of recent theoretical models, which state that the nature of the relation should depend on how the critical density of the tracer compares with the mean density of the gas.

  4. System and method of liquid scintillation counting

    International Nuclear Information System (INIS)

    Rapkin, E.

    1977-01-01

    A method of liquid scintillation counting utilizing a combustion step to overcome quenching effects comprises novel features of automatic sequential introduction of samples into a combustion zone and automatic sequential collection and delivery of combustion products into a counting zone. 37 claims, 13 figures

  5. Is It Counting, or Is It Adding?

    Science.gov (United States)

    Eisenhardt, Sara; Fisher, Molly H.; Thomas, Jonathan; Schack, Edna O.; Tassell, Janet; Yoder, Margaret

    2014-01-01

    The Common Core State Standards for Mathematics (CCSSI 2010) expect second grade students to "fluently add and subtract within 20 using mental strategies" (2.OA.B.2). Most children begin with number word sequences and counting approximations and then develop greater skill with counting. But do all teachers really understand how this…

  6. Current status of liquid scintillation counting

    International Nuclear Information System (INIS)

    Klingler, G.W.

    1981-01-01

    Scintillation counting of alpha particles has been used since the turn of the century. The advent of pulse shape discrimination has made this method of detection accurate and reliable. The history, concepts and development of scintillation counting and pulse shape discrimination are discussed. A brief look at the ongoing work in the consolidation of components now used for pulse shape discrimination is included

  7. Lazy reference counting for the Microgrid

    NARCIS (Netherlands)

    Poss, R.; Grelck, C.; Herhut, S.; Scholz, S.-B.

    2012-01-01

    This paper revisits non-deferred reference counting, a common technique to ensure that potentially shared large heap objects can be reused safely when they are both input and output to computations. Traditionally, thread-safe reference counting exploits implicit memory-based communication of counter
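
Non-deferred reference counting, as revisited in the paper, frees an object the instant its count reaches zero, which is what makes safe in-place reuse possible. A toy thread-safe sketch using a per-object lock; the paper's point is precisely how to avoid such memory-based counters on the Microgrid, so this shows only the traditional baseline:

```python
import threading

class RefCounted:
    """Toy non-deferred reference counter: the object is reclaimed
    the moment its count drops to zero (no deferred sweep)."""

    def __init__(self, payload, on_free):
        self._count = 1
        self._lock = threading.Lock()
        self.payload = payload
        self._on_free = on_free

    def incref(self):
        with self._lock:
            self._count += 1

    def decref(self):
        with self._lock:
            self._count -= 1
            dead = self._count == 0
        if dead:
            self._on_free(self)   # safe reuse: no live references remain

freed = []
obj = RefCounted("large heap object", on_free=lambda o: freed.append(o.payload))
obj.incref()      # a second computation takes a reference
obj.decref()      # ...and releases it
obj.decref()      # original owner releases: count hits zero, object is reused
print(freed)      # ['large heap object']
```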

  8. A Word Count of Modern Arabic Prose.

    Science.gov (United States)

    Landau, Jacob M.

    This book presents a word count of Arabic prose based on 60 twentieth-century Egyptian books. The text is divided into an alphabetical list and a word frequency list. This word count is intended as an aid in the: (1) writing of primers and the compilation of graded readers, (2) examination of the vocabulary selection of primers and readers…

  9. A New Method for Calculating Counts in Cells

    Science.gov (United States)

    Szapudi, István

    1998-04-01

    In the near future, a new generation of CCD-based galaxy surveys will enable high-precision determination of the N-point correlation functions. The resulting information will help to resolve the ambiguities associated with two-point correlation functions, thus constraining theories of structure formation, biasing, and Gaussianity of initial conditions independently of the value of Ω. As one of the most successful methods of extracting the amplitude of higher order correlations is based on measuring the distribution of counts in cells, this work presents an advanced way of measuring it with unprecedented accuracy. Szapudi & Colombi identified the main sources of theoretical errors in extracting counts in cells from galaxy catalogs. One of these sources, termed measurement error, stems from the fact that conventional methods use a finite number of sampling cells to estimate counts in cells. This effect can be circumvented by using an infinite number of cells. This paper presents an algorithm which, in practice, achieves this goal; that is, it is equivalent to throwing an infinite number of sampling cells in finite time. The errors associated with sampling cells are completely eliminated by this procedure, which will be essential for the accurate analysis of future surveys.
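
The "measurement error" referred to here comes from estimating the counts-in-cells distribution P(N) with finitely many sampling cells. A naive finite-cell estimator is easy to sketch; Szapudi's algorithm achieves the infinite-cell limit that this version only approaches as the number of cells grows (all numbers below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)

# toy 'galaxy catalogue': a Poisson sample in the unit square
galaxies = rng.uniform(0, 1, size=(5000, 2))

def counts_in_cells(points, cell_size, n_cells, rng):
    """Estimate P(N): throw n_cells random square cells and histogram
    the number of points falling in each."""
    corners = rng.uniform(0, 1 - cell_size, size=(n_cells, 2))
    counts = np.empty(n_cells, dtype=int)
    for i, (x0, y0) in enumerate(corners):
        inside = ((points[:, 0] >= x0) & (points[:, 0] < x0 + cell_size) &
                  (points[:, 1] >= y0) & (points[:, 1] < y0 + cell_size))
        counts[i] = inside.sum()
    return np.bincount(counts) / n_cells

p_n = counts_in_cells(galaxies, cell_size=0.05, n_cells=2000, rng=rng)
mean_n = sum(n * p for n, p in enumerate(p_n))
print(f"mean count per cell: {mean_n:.2f}")   # expect ~ 5000 * 0.05**2 = 12.5
```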

  10. Concept Mapping

    Science.gov (United States)

    Technology & Learning, 2005

    2005-01-01

    Concept maps are graphical ways of working with ideas and presenting information. They reveal patterns and relationships and help students to clarify their thinking, and to process, organize and prioritize. Displaying information visually--in concept maps, word webs, or diagrams--stimulates creativity. Being able to think logically teaches…

  11. Deep 3 GHz number counts from a P(D) fluctuation analysis

    Science.gov (United States)

    Vernstrom, T.; Scott, Douglas; Wall, J. V.; Condon, J. J.; Cotton, W. D.; Fomalont, E. B.; Kellermann, K. I.; Miller, N.; Perley, R. A.

    2014-05-01

    Radio source counts constrain galaxy populations and evolution, as well as the global star formation history. However, there is considerable disagreement among the published 1.4-GHz source counts below 100 μJy. Here, we present a statistical method for estimating the μJy and even sub-μJy source count using new deep wide-band 3-GHz data in the Lockman Hole from the Karl G. Jansky Very Large Array. We analysed the confusion amplitude distribution P(D), which provides a fresh approach in the form of a more robust model, with a comprehensive error analysis. We tested this method on a large-scale simulation, incorporating clustering and finite source sizes. We discuss in detail our statistical methods for fitting using Markov chain Monte Carlo, handling correlations, and systematic errors from the use of wide-band radio interferometric data. We demonstrated that the source count can be constrained down to 50 nJy, a factor of 20 below the rms confusion. We found the differential source count near 10 μJy to have a slope of -1.7, decreasing to about -1.4 at fainter flux densities. At 3 GHz, the rms confusion in an 8-arcsec full width at half-maximum beam is ≈1.2 μJy beam⁻¹, and the radio background temperature is ≈14 mK. Our counts are broadly consistent with published evolutionary models. With these results, we were also able to constrain the peak of the Euclidean normalized differential source count of any possible new radio populations that would contribute to the cosmic radio background down to 50 nJy.
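
A P(D) analysis infers the source count from the histogram of map pixel values rather than from individually detected sources. A toy simulation of the forward direction (power-law counts in, pixel histogram out); the slope, flux limits and beam are arbitrary illustrative values, not those of the 3-GHz data, and the inverse MCMC fit is omitted:

```python
import numpy as np

rng = np.random.default_rng(4)

npix = 512
beam_sigma = 2.0          # Gaussian beam width in pixels
n_src = 20_000            # mean number of sources in the map
s_min, s_max, gamma = 0.1, 10.0, 1.7   # dN/dS ∝ S**-gamma on [s_min, s_max]

# draw fluxes from the power law by inverse-transform sampling
u = rng.uniform(size=rng.poisson(n_src))
a = 1.0 - gamma
fluxes = (s_min**a + u * (s_max**a - s_min**a)) ** (1.0 / a)

# scatter sources onto the map and convolve with the beam via FFT
sky = np.zeros((npix, npix))
xy = rng.integers(0, npix, size=(fluxes.size, 2))
np.add.at(sky, (xy[:, 0], xy[:, 1]), fluxes)

yy, xx = np.indices((npix, npix))
r2 = (xx - npix // 2) ** 2 + (yy - npix // 2) ** 2
beam = np.exp(-0.5 * r2 / beam_sigma**2)
image = np.real(np.fft.ifft2(np.fft.fft2(sky) * np.fft.fft2(np.fft.ifftshift(beam))))

# P(D): the histogram of pixel values, including confusion from unresolved sources
p_of_d, edges = np.histogram(image, bins=100, density=True)
print(f"rms confusion: {image.std():.3f} flux units")
```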

  12. Comment on: 'A Poisson resampling method for simulating reduced counts in nuclear medicine images'

    DEFF Research Database (Denmark)

    de Nijs, Robin

    2015-01-01

    In order to be able to calculate half-count images from already acquired data, White and Lawson published their method based on Poisson resampling. They verified their method experimentally by measurements with a Co-57 flood source. In this comment their results are reproduced and confirmed...... by a direct numerical simulation in Matlab. Not only Poisson resampling, but also two direct redrawing methods were investigated. Redrawing methods were based on a Poisson and a Gaussian distribution. Mean, standard deviation, skewness and excess kurtosis half-count/full-count ratios were determined for all...... methods, and compared to the theoretical values for a Poisson distribution. Statistical parameters showed the same behavior as in the original note and showed the superiority of the Poisson resampling method. Rounding off before saving of the half count image had a severe impact on counting statistics...

  13. Maps Suggest Transport and Source Processes of PM2.5 at 1 km x 1 km for the Whole San Joaquin Valley, Winter 2011 (Generalizations from DISCOVER-AQ)

    Science.gov (United States)

    Chatfield, R. B.

    2016-12-01

    We present an interpreted data analysis using MAIAC (Multiangle Implementation of Atmospheric Correction) retrievals and appropriate Rapid Update Cycle (RAP) meteorology to map respirable aerosol (PM2.5) for January and February 2011. The San Joaquin Valley is one of the unhealthiest regions in the USA for PM2.5 and related morbidity. The methodology evaluated can be used for the entire moderate-resolution imaging spectrometer (MODIS, VIIRS) data record. Other difficult areas of the West (Riverside, CA; Salt Lake City, UT; and Doña Ana County, NM) share similar difficulties and solutions. The maps of boundary layer depth for 11-16 hr local time from RAP allow us to interpret aerosol optical thickness (AOT) as a concentration of particles in a nearly well-mixed box capped by clean air. That mixing is demonstrated by DISCOVER-AQ data and afternoon samples from the airborne measurements, P3B (on-board) and B200 (HSRL2 lidar). These data and the PM2.5 gathered at the deployment sites allowed us to estimate and then evaluate the consistency and daily variation of the AOT-to-PM2.5 relationship. Mixed-effects modeling allowed a refinement of that relation from day to day; RAP mixed layers explain the success of previous mixed-effects modeling. Compositional, size-distribution, and MODIS angle-of-regard effects seem to account for the need for residual daily correction beyond mixed-layer depth. We report on a method for extending the analysis to the entire San Joaquin Valley for all days with MODIS imagery, using the permanent PM2.5 stations, evaluated for representativeness. The resulting map movies show distinct sources, particularly Interstate-5 (at 1 km x 1 km resolution) and the broader Bakersfield area. Accompanying winds suggest transport effects and variable pathways of pollution cleanout. Such estimates should allow morbidity/mortality studies. They should also be useful for actual model assimilations, where composition and sources are uncertain. We conclude with a description of new work to

  14. SU-E-T-375: Evaluation of a MapCHECK2(tm) Planar 2-D Diode Array for High-Dose-Rate Brachytherapy Treatment Delivery Verifications

    Energy Technology Data Exchange (ETDEWEB)

    Macey, N; Siebert, M; Shvydka, D; Parsai, E [University of Toledo Medical Center, Toledo, OH (United States)

    2015-06-15

    Purpose: Despite improvements in HDR brachytherapy delivery systems, verification of source position is still typically based on the length of the wire reeled out relative to the parked position. Yet, the majority of errors leading to medical events in HDR treatments continue to be classified as missed targets or wrong treatment sites. We investigate the feasibility of using dose maps acquired with a two-dimensional diode array to independently verify the source locations, dwell times, and dose during an HDR treatment. Methods: Custom correction factors were integrated into frame-by-frame raw counts recorded for a Varian VariSource™ HDR afterloader Ir-192 source located at various distances in air and in solid water from a MapCHECK2™ diode array. The resultant corrected counts were analyzed to determine the dwell position locations and doses delivered. The local maxima of polynomial equations fitted to the extracted dwell dose profiles provided the X and Y coordinates, while the distance to the source was determined from evaluation of the full width at half maximum (FWHM). To verify the approach, the experiment was repeated as the source was moved through dwell positions at various distances along an inclined plane, mimicking a vaginal cylinder treatment. Results: Dose map analysis was utilized to provide the coordinates of the source and the dose delivered over each dwell position. Source dwell positions were determined to within ±1.0 mm of the preset values, and doses to within ±3% of those calculated by the BrachyVision™ treatment planning system for all measured distances. Conclusion: Frame-by-frame data furnished by a 2-D diode array can be used to verify the dwell positions and doses delivered by the HDR source over the course of treatment. Our studies have verified that measurements provided by the MapCHECK2™ can be used as a routine QA tool for HDR treatment delivery verification.
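
The position-and-distance recovery described in the abstract can be illustrated with a 1-D toy model: for a point source at perpendicular distance d above the array, the in-plane dose profile is approximately D(x) = k/(d² + (x − x₀)²), whose fitted maximum gives x₀ and whose FWHM equals 2d. A hedged sketch with invented numbers, not the authors' calibration:

```python
import numpy as np

rng = np.random.default_rng(5)

# simulated diode readings: inverse-square falloff from a source at (x0, d)
x = np.arange(-50.0, 50.0, 1.0)       # diode positions (mm)
x0_true, d_true, k = 3.7, 12.0, 1e4   # source position, distance, strength
dose = k / (d_true**2 + (x - x0_true) ** 2)
dose *= 1 + rng.normal(0, 0.01, x.size)   # 1% measurement noise

# locate the maximum from a local polynomial fit (as in the abstract)
i = int(np.argmax(dose))
sel = slice(i - 5, i + 6)
coef = np.polyfit(x[sel], dose[sel], 2)
x0_est = -coef[1] / (2 * coef[0])          # vertex of the fitted parabola

# distance from the FWHM: dose falls to half its peak at |x - x0| = d
half = dose.max() / 2
above = x[dose >= half]
d_est = (above[-1] - above[0]) / 2

print(f"source at x = {x0_est:.1f} mm, distance {d_est:.1f} mm")
```

With a coarse 1 mm diode pitch the FWHM estimate is quantised; the abstract's sub-millimetre accuracy comes from fitting the full 2-D dose map rather than this single profile.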

  15. Standardization of ²⁴¹Am by digital coincidence counting, liquid scintillation counting and defined solid angle counting

    Energy Technology Data Exchange (ETDEWEB)

    Balpardo, C., E-mail: balpardo@cae.cnea.gov.a [Laboratorio de Metrologia de Radioisotopos, CNEA, Buenos Aires (Argentina); Capoulat, M.E.; Rodrigues, D.; Arenillas, P. [Laboratorio de Metrologia de Radioisotopos, CNEA, Buenos Aires (Argentina)

    2010-07-15

    The nuclide ²⁴¹Am decays by alpha emission to ²³⁷Np. Most of the decays (84.6%) populate the excited level of ²³⁷Np with an energy of 59.54 keV. Digital coincidence counting was applied to standardize a solution of ²⁴¹Am by alpha-gamma coincidence counting with efficiency extrapolation. Electronic discrimination was implemented with a pressurized proportional counter and the results were compared with two other independent techniques: liquid scintillation counting using the logical sum of double coincidences in a TDCR array, and defined solid angle counting taking into account activity inhomogeneity in the active deposit. The results show consistency between the three methods within 0.3%. An ampoule of this solution will be sent to the International Reference System (SIR) during 2009. Uncertainties were analysed and compared in detail for the three applied methods.
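
The alpha-gamma coincidence method rests on a simple identity: with activity A and independent detection efficiencies, the singles rates are Nα = A·εα and Nγ = A·εγ while the coincidence rate is Nc = A·εα·εγ, so A = Nα·Nγ/Nc with both efficiencies cancelling; efficiency extrapolation then corrects the residual efficiency dependence of real detectors. A toy numerical check of the ideal case (all values invented):

```python
import numpy as np

rng = np.random.default_rng(10)

activity = 5.0e4                   # true decay rate (1/s)
eff_alpha, eff_gamma = 0.30, 0.05  # detector efficiencies (unknown in practice)
T = 20.0                           # measurement time (s)

decays = rng.poisson(activity * T)
seen_a = rng.uniform(size=decays) < eff_alpha   # alpha detected?
seen_g = rng.uniform(size=decays) < eff_gamma   # gamma detected?

n_alpha = seen_a.sum() / T
n_gamma = seen_g.sum() / T
n_coinc = (seen_a & seen_g).sum() / T

estimate = n_alpha * n_gamma / n_coinc          # efficiencies cancel
print(f"estimated activity: {estimate:.0f} /s (true {activity:.0f} /s)")
```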

  16. An Adaptive Smoother for Counting Measurements

    International Nuclear Information System (INIS)

    Kondrasovs Vladimir; Coulon Romain; Normand Stephane

    2013-06-01

    Counting measurements with nuclear instruments are tricky to carry out because of the stochastic nature of radioactivity. Event counts have to be processed and filtered in order to display a stable count rate value and to allow monitoring of variations in the measured activity. Smoothers (such as the moving average) are adjusted by a time constant defined as a compromise between stability and response time. A new approach has been developed that improves the response time while maintaining count rate stability. It combines a smoother with a detection filter. A memory of counting data is processed to calculate several count rate estimates using several integration times. These estimates are then sorted in the memory from short to long integration times. A measurement position, in terms of integration time, is then chosen from this memory after a detection test. An inhomogeneity in the Poisson counting process is detected by comparing the current position estimate with the other estimates in the memory, with respect to the associated statistical variance calculated under the homogeneity assumption. The measurement position (historical time), and the decision to forget obsolete data or to keep useful data in memory, are managed using the detection test result. The proposed smoother is thus an adaptive, learning algorithm that optimizes the response time while maintaining counting stability and converges efficiently to the best count estimate after an actual change in activity. The algorithm is also only weakly recursive and thus easily embedded into DSP electronics based on FPGAs or micro-controllers, meeting 'real life' time requirements. (authors)
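
The scheme described above can be caricatured in a few lines: compute rate estimates over several integration windows, and keep extending the window only while the estimates stay statistically consistent with Poisson fluctuations. The window lengths and the 3-sigma threshold below are illustrative guesses, not the authors' values:

```python
import numpy as np

def adaptive_rate(counts, dt=1.0, windows=(4, 8, 16, 32, 64), k=3.0):
    """Estimate the current count rate from the tail of `counts`: extend the
    integration window only while the estimate stays within k sigma
    (Poisson variance) of the shortest-window estimate."""
    best = counts[-windows[0]:].sum() / (windows[0] * dt)
    for w in windows[1:]:
        if w > len(counts):
            break
        est = counts[-w:].sum() / (w * dt)
        sigma = np.sqrt(max(best, 1.0) / (w * dt))  # approx. std of the w-window mean
        if abs(est - best) > k * sigma:
            break          # inhomogeneity detected: older data are obsolete
        best = est         # longer window accepted: better statistics
    return best

rng = np.random.default_rng(6)
# a step change in activity: 10 counts/s for 200 s, then 100 counts/s for 50 s
counts = np.concatenate([rng.poisson(10, 200), rng.poisson(100, 50)])
rate_now = adaptive_rate(counts)
print(f"estimated rate just after the step: {rate_now:.1f} counts/s")
```

Only windows lying entirely after the step pass the consistency test, so the estimate tracks the new 100 counts/s level instead of averaging over the obsolete 10 counts/s history.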

  17. The effect of volume and quenching on estimation of counting efficiencies in liquid scintillation counting

    International Nuclear Information System (INIS)

    Knoche, H.W.; Parkhurst, A.M.; Tam, S.W.

    1979-01-01

    The effect of volume on the liquid scintillation counting performance of ¹⁴C samples has been investigated. A decrease in counting efficiency was observed for samples with volumes below about 6 ml and those above about 18 ml when unquenched samples were assayed. Two quench-correction methods (sample channels ratio and external standard channels ratio) and three different liquid scintillation counters were used to determine the magnitude of the error in predicting counting efficiencies when small-volume samples (2 ml) with different levels of quenching were assayed. The 2 ml samples exhibited slightly greater standard deviations of the difference between predicted and determined counting efficiencies than did 15 ml samples. Nevertheless, the magnitude of the errors indicates that if the sample channels ratio method of quench correction is employed, 2 ml samples may be counted in conventional counting vials with little loss in counting precision. (author)

  18. Radiation Counting System Software Using Visual Basic

    International Nuclear Information System (INIS)

    Nanda Nagara; Didi Gayani

    2009-01-01

    A gamma radiation counting system has been created using an interface card paired with a personal computer (PC) and operated by a Visual Basic program. The program is controlled through menu selections such as "Multi Counting", "Counting and Record" and "View Data". The interface card for data acquisition was built using AMD9513 components as programmable counters and timers. The counting system was tested and used at the waste facility at PTNBR, and the results are quite good. (author)

  19. Algorithm for counting large directed loops

    Energy Technology Data Exchange (ETDEWEB)

    Bianconi, Ginestra [Abdus Salam International Center for Theoretical Physics, Strada Costiera 11, 34014 Trieste (Italy); Gulbahce, Natali [Theoretical Division and Center for Nonlinear Studies, Los Alamos National Laboratory, NM 87545 (United States)

    2008-06-06

    We derive a Belief-Propagation algorithm for counting large loops in a directed network. We evaluate the distribution of the number of small loops in a directed random network with given degree sequence. We apply the algorithm to a few characteristic directed networks of various network sizes and loop structures and compare the algorithm with exhaustive counting results when possible. The algorithm is adequate in estimating loop counts for large directed networks and can be used to compare the loop structure of directed networks and their randomized counterparts.

  20. Counts and colors of faint galaxies

    International Nuclear Information System (INIS)

    Kron, R.G.

    1980-01-01

    The color distribution of faint galaxies is an observational dimension which has not yet been fully exploited, despite the important constraints obtainable for galaxy evolution and cosmology. Number-magnitude counts alone contain very diluted information about the state of things because galaxies from a wide range in redshift contribute to the counts at each magnitude. The most-frequently-seen type of galaxy depends on the luminosity function and the relative proportions of galaxies of different spectral classes. The addition of color as a measured quantity can thus considerably sharpen the interpretation of galaxy counts since the apparent color depends on the redshift and rest-frame spectrum. (Auth.)

  1. The National Map - Orthoimagery

    Science.gov (United States)

    Mauck, James; Brown, Kim; Carswell, William J.

    2009-01-01

    Orthorectified digital aerial photographs and satellite images of 1-meter (m) pixel resolution or finer make up the orthoimagery component of The National Map. The process of orthorectification removes feature displacements and scale variations caused by terrain relief and sensor geometry. The result is a combination of the image characteristics of an aerial photograph or satellite image and the geometric qualities of a map. These attributes allow users to measure distances, calculate areas, determine shapes of features, calculate directions, determine accurate coordinates, determine land cover and use, perform change detection, and update maps. The standard digital orthoimage is a 1-m or finer resolution, natural color or color infra-red product. Most are now produced as GeoTIFFs and accompanied by a Federal Geographic Data Committee (FGDC)-compliant metadata file. The primary source for 1-m data is the National Agriculture Imagery Program (NAIP) leaf-on imagery. The U.S. Geological Survey (USGS) utilizes NAIP imagery as the image layer on its 'Digital-Map', a new generation of USGS topographic maps (http://nationalmap.gov/digital_map). However, many Federal, State, and local governments and organizations require finer resolutions to meet a myriad of needs. Most of these images are leaf-off, natural-color products at resolutions of 1-foot (ft) or finer.

  2. A hard X-ray scanning microprobe for fluorescence imaging and microdiffraction at the Advanced Photon Source

    International Nuclear Information System (INIS)

    Cai, L.; Lai, B.; Yun, W.; Ilinski, P.; Legnini, D.; Maser, J.; Rodrigues, W.

    1999-01-01

    A hard x-ray scanning microprobe based on zone plate optics and undulator radiation, in the energy region from 6 to 20 keV, has reached a focal spot size (FWHM) of 0.15 μm (v) × 0.6 μm (h) and a photon flux of 4 × 10⁹ photons/sec/0.01%BW. Using a slit 44 meters upstream to create a virtual source, a circular beam spot of 0.15 μm in diameter can be obtained with a photon flux one order of magnitude less. During fluorescence mapping of trace elements in a single human ovarian cell, the microprobe exhibited an imaging sensitivity for Pt (Lα line) of 80 attograms/μm² for a count rate of 10 counts per second. The x-ray microprobe has been used to map crystallographic strain and multiquantum well thickness in micro-optoelectronic devices produced with the selective area growth technique

  3. Accommodating Binary and Count Variables in Mediation: A Case for Conditional Indirect Effects

    Science.gov (United States)

    Geldhof, G. John; Anthony, Katherine P.; Selig, James P.; Mendez-Luck, Carolyn A.

    2018-01-01

    The existence of several accessible sources has led to a proliferation of mediation models in the applied research literature. Most of these sources assume that endogenous variables (e.g., M and Y) have normally distributed residuals, precluding models of binary and/or count data. Although a growing body of literature has expanded mediation models to…

  4. Application of a self-organizing map and positive matrix factorization to investigate the spatial distributions and sources of polycyclic aromatic hydrocarbons in soils from Xiangfen County, northern China.

    Science.gov (United States)

    Tao, Shi-Yang; Zhong, Bu-Qing; Lin, Yan; Ma, Jin; Zhou, Yongzhang; Hou, Hong; Zhao, Long; Sun, Zaijin; Qin, Xiaopeng; Shi, Huading

    2017-07-01

    The concentrations of 16 priority polycyclic aromatic hydrocarbons (PAHs) were measured in 128 surface soil samples from Xiangfen County, northern China. The total mass concentration of these PAHs ranged from 52 to 10,524 ng/g, with a mean of 723 ng/g. Four-ring PAHs contributed almost 50% of the total PAH burden. A self-organizing map and positive matrix factorization were applied to investigate the spatial distribution and source apportionment of PAHs. Three emission sources of PAHs were identified, namely, coking ovens (21.9%), coal/biomass combustion (60.1%), and anthracene oil (18.0%). High concentrations of low-molecular-weight PAHs were particularly apparent in the coking plant zone in the region around Gucheng Town. High-molecular-weight PAHs mainly originated from coal/biomass combustion around Gucheng Town, Xincheng Town, and Taosi Town. PAHs in the soil of Xiangfen County are unlikely to pose a significant cancer risk for the population. Copyright © 2017 Elsevier Inc. All rights reserved.
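
Positive matrix factorization, as used here for source apportionment, approximates the non-negative samples-by-species concentration matrix X by G·F with non-negative factors (source contributions G, source profiles F). A minimal sketch using plain NMF multiplicative updates on synthetic data; PMF proper additionally weights residuals by measurement uncertainties:

```python
import numpy as np

rng = np.random.default_rng(7)

# synthetic data: 128 samples x 16 PAH species from 3 hidden sources
n_samples, n_species, n_sources = 128, 16, 3
G_true = rng.gamma(2.0, 1.0, (n_samples, n_sources))   # source contributions
F_true = rng.gamma(2.0, 1.0, (n_sources, n_species))   # source profiles
X = G_true @ F_true

# NMF by multiplicative updates (Lee & Seung); non-negativity is preserved
G = rng.uniform(0.1, 1.0, (n_samples, n_sources))
F = rng.uniform(0.1, 1.0, (n_sources, n_species))
for _ in range(500):
    F *= (G.T @ X) / (G.T @ G @ F + 1e-12)
    G *= (X @ F.T) / (G @ F @ F.T + 1e-12)

rel_err = np.linalg.norm(X - G @ F) / np.linalg.norm(X)
print(f"relative reconstruction error: {rel_err:.4f}")
```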

  5. Mapping racism.

    Science.gov (United States)

    Moss, Donald B

    2006-01-01

    The author uses the metaphor of mapping to illuminate a structural feature of racist thought, locating the degraded object along vertical and horizontal axes. These axes establish coordinates of hierarchy and of distance. With the coordinates in place, racist thought begins to seem grounded in natural processes. The other's identity becomes consolidated, and parochialism results. The use of this kind of mapping is illustrated via two patient vignettes. The author presents Freud's (1905, 1927) views in relation to such a "mapping" process, as well as Adorno's (1951) and Baldwin's (1965). Finally, the author conceptualizes the crucial status of primitivity in the workings of racist thought.

  6. 10C survey of radio sources at 15.7 GHz - II. First results

    Science.gov (United States)

    AMI Consortium; Davies, Mathhew L.; Franzen, Thomas M. O.; Waldram, Elizabeth M.; Grainge, Keith J. B.; Hobson, Michael P.; Hurley-Walker, Natasha; Lasenby, Anthony; Olamaie, Malak; Pooley, Guy G.; Riley, Julia M.; Rodríguez-Gonzálvez, Carmen; Saunders, Richard D. E.; Scaife, Anna M. M.; Schammel, Michel P.; Scott, Paul F.; Shimwell, Timothy W.; Titterington, David J.; Zwart, Jonathan T. L.

    2011-08-01

    In a previous paper (Paper I), the observational, mapping and source-extraction techniques used for the Tenth Cambridge (10C) Survey of Radio Sources were described. Here, the first results from the survey, carried out using the Arcminute Microkelvin Imager Large Array (LA) at an observing frequency of 15.7 GHz, are presented. The survey fields cover an area of ≈27 deg² to a flux-density completeness of 1 mJy. Results for some deeper areas, covering ≈12 deg², wholly contained within the total areas and complete to 0.5 mJy, are also presented. The completeness for both areas is estimated to be at least 93 per cent. The 10C survey is the deepest radio survey of any significant extent (≳0.2 deg²) above 1.4 GHz. The 10C source catalogue contains 1897 entries and is available online. The source catalogue has been combined with that of the Ninth Cambridge Survey to calculate the 15.7-GHz source counts. A broken power law is found to provide a good parametrization of the differential count between 0.5 mJy and 1 Jy. The measured source count has been compared with that predicted by de Zotti et al. - the model is found to display good agreement with the data at the highest flux densities. However, over the entire flux-density range of the measured count (0.5 mJy to 1 Jy), the model is found to underpredict the integrated count by ≈30 per cent. Entries from the source catalogue have been matched with those contained in the catalogues of the NRAO VLA Sky Survey and the Faint Images of the Radio Sky at Twenty-cm survey (both of which have observing frequencies of 1.4 GHz). This matching provides evidence for a shift in the typical 1.4-GHz spectral index to 15.7-GHz spectral index of the 15.7-GHz-selected source population with decreasing flux density towards sub-mJy levels - the spectra tend to become less steep. Automated methods for detecting extended sources, developed in Paper I, have been applied to the data; ≈5 per cent of the sources are found to be extended

  7. Reduction of CMOS Image Sensor Read Noise to Enable Photon Counting.

    Science.gov (United States)

    Guidash, Michael; Ma, Jiaju; Vogelsang, Thomas; Endsley, Jay

    2016-04-09

    Recent activity in photon counting CMOS image sensors (CIS) has been directed to reduction of read noise. Many approaches and methods have been reported. This work is focused on providing sub-1 e⁻ read noise by design and operation of the binary and small-signal readout of photon counting CIS. Compensation of transfer gate feed-through was used to provide substantially reduced CDS time and source follower (SF) bandwidth. SF read noise was reduced by a factor of 3 with this method. This method can be applied broadly to CIS devices to reduce the read noise for small signals and enable use as a photon counting sensor.

  8. Comment on: 'A Poisson resampling method for simulating reduced counts in nuclear medicine images'.

    Science.gov (United States)

    de Nijs, Robin

    2015-07-21

    In order to be able to calculate half-count images from already acquired data, White and Lawson published their method based on Poisson resampling. They verified their method experimentally by measurements with a Co-57 flood source. In this comment their results are reproduced and confirmed by a direct numerical simulation in Matlab. Not only Poisson resampling, but also two direct redrawing methods were investigated. Redrawing methods were based on a Poisson and a Gaussian distribution. Mean, standard deviation, skewness and excess kurtosis half-count/full-count ratios were determined for all methods, and compared to the theoretical values for a Poisson distribution. Statistical parameters showed the same behavior as in the original note and showed the superiority of the Poisson resampling method. Rounding off before saving of the half count image had a severe impact on counting statistics for counts below 100. Only Poisson resampling was not affected by this, while Gaussian redrawing was less affected by it than Poisson redrawing. Poisson resampling is the method of choice, when simulating half-count (or less) images from full-count images. It simulates correctly the statistical properties, also in the case of rounding off of the images.
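
Poisson resampling of an already-acquired image amounts to binomial thinning: each recorded count is independently kept with probability 1/2, so a Poisson pixel stays exactly Poisson at half the mean. A sketch contrasting it with the two redrawing variants discussed in the comment (pixel values and image size are invented):

```python
import numpy as np

rng = np.random.default_rng(8)

full = rng.poisson(100.0, size=(128, 128))   # a full-count 'image'

# Poisson resampling: thin each pixel binomially - the half-count image stays Poisson
half_resampled = rng.binomial(full, 0.5)

# direct redrawing alternatives, for comparison
half_poisson = rng.poisson(full / 2.0)
half_gauss = np.clip(np.rint(rng.normal(full / 2.0, np.sqrt(full / 2.0))), 0, None)

for name, img in [("resampled", half_resampled),
                  ("Poisson redraw", half_poisson),
                  ("Gaussian redraw", half_gauss)]:
    print(f"{name:>15}: mean {img.mean():6.2f}, var {img.var():6.2f}")
```

In this toy setup the Poisson redraw roughly adds the variance of the original image on top of the Poisson variance (≈75 instead of ≈50 here), which is one way to see why resampling preserves counting statistics and redrawing does not.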

  9. A measurement technique for counting processes

    International Nuclear Information System (INIS)

    Cantoni, V.; Pavia Univ.; De Lotto, I.; Valenziano, F.

    1980-01-01

    A technique for the estimation of first and second order properties of a stationary counting process is presented here which uses standard instruments for analysis of a continuous stationary random signal. (orig.)

  10. Why calories count: from science to politics

    National Research Council Canada - National Science Library

    Nestle, Marion; Nesheim, Malden C

    2012-01-01

    .... They are also hard to understand. In Why Calories Count, Marion Nestle and Malden Nesheim explain in clear and accessible language what calories are and how they work, both biologically and politically...

  11. CoC Housing Inventory Count Reports

    Data.gov (United States)

    Department of Housing and Urban Development — Continuum of Care (CoC) Homeless Assistance Programs Housing Inventory Count Reports are a snapshot of a CoC’s housing inventory, available at the national and state...

  12. White Blood Cell Counts and Malaria

    National Research Council Canada - National Science Library

    McKenzie, F. E; Prudhomme, Wendy A; Magill, Alan J; Forney, J. R; Permpanich, Barnyen; Lucas, Carmen; Gasser, Jr., Robert A; Wongsrichanalai, Chansuda

    2005-01-01

    White blood cells (WBCs) were counted in 4697 individuals who presented to outpatient malaria clinics in Maesod, Tak Province, Thailand, and Iquitos, Peru, between 28 May and 28 August 1998 and between 17 May and 9 July 1999...

  13. VSRR Provisional Drug Overdose Death Counts

    Data.gov (United States)

    U.S. Department of Health & Human Services — This data contains provisional counts for drug overdose deaths based on a current flow of mortality data in the National Vital Statistics System. National...

  14. Low white blood cell count and cancer

    Science.gov (United States)

... gov/ency/patientinstructions/000675.htm Low white blood cell count and cancer ... White blood cells (WBCs) fight infections from bacteria, viruses, fungi, and ...

  15. Regression Models For Multivariate Count Data.

    Science.gov (United States)

    Zhang, Yiwen; Zhou, Hua; Zhou, Jin; Sun, Wei

    2017-01-01

    Data with multivariate count responses frequently occur in modern applications. The commonly used multinomial-logit model is limiting due to its restrictive mean-variance structure. For instance, analyzing count data from the recent RNA-seq technology by the multinomial-logit model leads to serious errors in hypothesis testing. The ubiquity of over-dispersion and complicated correlation structures among multivariate counts calls for more flexible regression models. In this article, we study some generalized linear models that incorporate various correlation structures among the counts. Current literature lacks a treatment of these models, partly due to the fact that they do not belong to the natural exponential family. We study the estimation, testing, and variable selection for these models in a unifying framework. The regression models are compared on both synthetic and real RNA-seq data.

  16. Count rate effect in proportional counters

    International Nuclear Information System (INIS)

    Bednarek, B.

    1980-01-01

A new concept is presented explaining changes in the spectrometric parameters of proportional counters which occur due to varying count rate. The basic feature of this concept is that the gas gain of the counter remains constant over a wide range of count rates and that the observed decrease in pulse amplitude and deterioration of the energy resolution are the results of changes in the shape of the original current pulses generated in the active volume of the counter. In order to confirm the validity of this statement, measurements of the gas amplification factor have been made over a wide count rate range. It is shown that above a certain critical value the gas gain depends on both the operating voltage and the count rate. (author)

  17. Atom counting with accelerator mass spectrometry

    International Nuclear Information System (INIS)

    Kutschera, Walter

    1995-01-01

    A brief review of the current status and some recent applications of accelerator mass spectrometry (AMS) are presented. Some connections to resonance ionization mass spectroscopy (RIS) as the alternate atom counting method are discussed

  18. Determination of efficiency curves for HPGE detector in different counting geometries

    International Nuclear Information System (INIS)

    Rodrigues, Josianne L.; Kastner, Geraldo F.; Ferreira, Andrea V.

    2011-01-01

This paper presents the first experimental results related to the determination of efficiency curves for an HPGe detector in different counting geometries. The detector is a GX2520 Canberra belonging to CDTN/CNEN. Efficiency curves for point sources were determined using a certified set of gamma sources. These curves were determined for three counting geometries. Following that, efficiency curves for non-point samples were determined using standard solutions of radionuclides in 500 ml and 1000 ml Marinelli beakers
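Each point of such an efficiency curve reduces to net peak counts divided by the number of photons of that energy emitted during the live time; a minimal sketch (the activity, gamma yield and counts below are hypothetical illustration values, not data from the paper):

```python
def peak_efficiency(net_counts, activity_bq, gamma_yield, live_time_s):
    """Full-energy-peak efficiency for one calibration point:
    net photopeak counts divided by the number of photons of that
    energy emitted by the source during the live time."""
    emitted = activity_bq * gamma_yield * live_time_s
    return net_counts / emitted

# hypothetical calibration point: 10 kBq source, 85.1 % gamma yield,
# 600 s live time, 51060 net counts in the photopeak
eff = peak_efficiency(51060, 1.0e4, 0.851, 600)
```

Repeating this for each certified gamma line and each geometry yields the efficiency-versus-energy curves the record describes.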

  19. Genetic Mapping

    Science.gov (United States)

    ... greatly advanced genetics research. The improved quality of genetic data has reduced the time required to identify a ... cases, a matter of months or even weeks. Genetic mapping data generated by the HGP's laboratories is freely accessible ...

  20. Internal dosimetry by whole body counting techniques

    International Nuclear Information System (INIS)

    Sharma, R.C.

    1995-01-01

    Over decades, whole body counting and bioassay - the two principal methods of internal dosimetry have been most widely used to assess, limit and control the intakes of radioactive materials and the consequent internal doses by the workers in nuclear industry. This paper deals with the whole body counting techniques. The problems inherent in the interpretation of monitoring data and likely future directions of development in the assessments of internal doses by direct methods are outlined. (author). 14 refs., 9 figs., 1 tab

  1. How to count an introduction to combinatorics

    CERN Document Server

    Allenby, RBJT

    2010-01-01

What's It All About? What Is Combinatorics? Classic Problems; What You Need to Know; Are You Sitting Comfortably? Permutations and Combinations: The Combinatorial Approach; Permutations; Combinations; Applications to Probability Problems; The Multinomial Theorem; Permutations and Cycles; Occupancy Problems; Counting the Solutions of Equations; New Problems from Old; A "Reduction" Theorem for the Stirling Numbers; The Inclusion-Exclusion Principle; Double Counting; Derangements; A Formula for the Stirling Numbers; Stirling and Catalan Numbers: Stirling Numbers; Permutations and Stirling Numbers; Catalan Numbers; Pa

  2. Accuracy and precision in activation analysis: counting

    International Nuclear Information System (INIS)

    Becker, D.A.

    1974-01-01

Accuracy and precision in activation analysis were investigated with regard to counting of induced radioactivity. The various parameters discussed include configuration, positioning, density, homogeneity, intensity, radioisotopic purity, peak integration, and nuclear constants. Experimental results are presented for many of these parameters. The results obtained indicate that counting errors often contribute significantly to the inaccuracy and imprecision of analyses. The magnitude of these errors ranges from less than 1 percent to 10 percent or more in many cases

  3. Remote system for counting of nuclear pulses

    International Nuclear Information System (INIS)

    Nieves V, J.A.; Garcia H, J.M.; Aguilar B, M.A.

    1999-01-01

This work gives a technical description of the remote system for counting of nuclear pulses, an integral part of the radiological monitoring project in a petroleum distillation tower. The system acquires the count of nuclear particles incident on a detector, processes this information and sends it in serial form over RS-485 to a remote receiver, which can be a personal computer or any other device capable of interpreting the communication protocol. (Author)

  4. Recursive algorithms for phylogenetic tree counting.

    Science.gov (United States)

    Gavryushkina, Alexandra; Welch, David; Drummond, Alexei J

    2013-10-28

In Bayesian phylogenetic inference we are interested in distributions over a space of trees. The number of trees in a tree space is an important characteristic of the space and is useful for specifying prior distributions. When all samples come from the same time point and no prior information is available on divergence times, the tree counting problem is easy. However, when fossil evidence is used in the inference to constrain the tree or data are sampled serially, new tree spaces arise and counting the number of trees is more difficult. We describe an algorithm that is polynomial in the number of sampled individuals for counting resolutions of a constraint tree, assuming that the number of constraints is fixed. We generalise this algorithm to counting resolutions of a fully ranked constraint tree. We describe a quadratic algorithm for counting the number of possible fully ranked trees on n sampled individuals. We introduce a new type of tree, called a fully ranked tree with sampled ancestors, and describe a cubic time algorithm for counting the number of such trees on n sampled individuals. These algorithms should be employed for Bayesian Markov chain Monte Carlo inference when fossil data are included or data are serially sampled.
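For the easy case the abstract mentions, where all samples are contemporaneous and unconstrained, the number of fully ranked (labelled-history) binary trees has a classical product form; a short sketch of that baseline (not the paper's constrained-resolution algorithms):

```python
from math import comb, factorial

def ranked_tree_count(n):
    """Number of fully ranked (labelled-history) binary trees on n
    contemporaneous tips: going back in time, the k-th coalescence
    joins one of C(k, 2) pairs, so the choices multiply."""
    total = 1
    for k in range(2, n + 1):
        total *= comb(k, 2)
    return total

def ranked_tree_count_closed(n):
    """Equivalent closed form: n! (n-1)! / 2^(n-1)."""
    return factorial(n) * factorial(n - 1) // 2 ** (n - 1)
```

The two forms agree term by term, since the product of k over k = 2..n is n!, the product of (k-1) is (n-1)!, and each factor contributes one 1/2.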

  5. Seed counting system evaluation using arduino microcontroller

    Directory of Open Access Journals (Sweden)

    Paulo Fernando Escobar Paim

    2018-01-01

The development of automated systems has been highlighted in the most diverse productive sectors, among them the agricultural sector. These systems aim to optimize activities by increasing operational efficiency and quality of work. In this sense, the present work has the objective of evaluating a prototype developed for seed counting in the laboratory, using an Arduino microcontroller. The prototype of the system for seed counting was built using a dosing mechanism commonly used in seeders, an electric motor, an Arduino Uno, a light-dependent resistor and a light-emitting diode. To test the prototype, a completely randomized design (CRD) was used in a two-factorial scheme composed of three groups defined according to the number of seeds tested (500, 1000 and 1500 seeds) and three speeds of the dosing disc that allowed distribution at 17, 21 and 32 seeds per second, with 40 repetitions, evaluating the seed counting prototype's performance at different speeds. The prototype of the bench counter showed moderate variability in the number of seeds counted within the nine tests and high precision in the seed count at distribution speeds of 17 and 21 seeds per second for up to 1500 seeds tested. Therefore, based on the observed results, the developed prototype presents itself as an excellent tool for counting seeds in the laboratory.

  6. Reference analysis of the signal + background model in counting experiments

    Science.gov (United States)

    Casadei, D.

    2012-01-01

The model representing two independent Poisson processes, labelled as "signal" and "background" and both contributing additively to the total number of counted events, is considered from a Bayesian point of view. This is a widely used model for searches of rare or exotic events in the presence of a background source, as for example in the searches performed by high-energy physics experiments. In the assumption of prior knowledge about the background yield, a reference prior is obtained for the signal alone and its properties are studied. Finally, the properties of the full solution, the marginal reference posterior, are illustrated with a few examples.
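A toy grid version of the signal-plus-background posterior, using a flat prior on the signal as a simple stand-in for the reference prior actually derived in the paper (all numbers illustrative):

```python
import math

def poisson_pmf(n, mu):
    """P(N = n) for a Poisson variable with mean mu."""
    return math.exp(-mu) * mu ** n / math.factorial(n)

def signal_posterior(n_obs, b, s_grid):
    """Normalised grid posterior for the signal yield s, given n_obs
    total counts and a known background expectation b, under a flat
    prior on s (toy stand-in for the paper's reference prior)."""
    weights = [poisson_pmf(n_obs, s + b) for s in s_grid]
    norm = sum(weights)
    return [w / norm for w in weights]

s_grid = [0.1 * i for i in range(301)]             # s in [0, 30]
post = signal_posterior(n_obs=10, b=3.0, s_grid=s_grid)
mean_s = sum(s * p for s, p in zip(s_grid, post))  # posterior mean
```

With 10 observed counts over an expected background of 3, the posterior peaks near s = 7, as expected from the additive Poisson model.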

  7. Seismic maps foster landmark legislation

    Science.gov (United States)

    Borcherdt, Roger D.; Brown, Robert B.; Page, Robert A.; Wentworth, Carl M.; Hendley, James W.

    1995-01-01

    When a powerful earthquake strikes an urban region, damage concentrates not only near the quake's source. Damage can also occur many miles from the source in areas of soft ground. In recent years, scientists have developed ways to identify and map these areas of high seismic hazard. This advance has spurred pioneering legislation to reduce earthquake losses in areas of greatest hazard.

  8. Photon counting and fluctuation of molecular movement

    International Nuclear Information System (INIS)

    Inohara, Koichi

    1978-01-01

The direct measurement of the fluctuation of molecular motions, which provides useful information on molecular movement, was conducted by introducing a photon-counting method. The utilization of photon counting makes it possible to treat a molecular system consisting of a small number of molecules like a radioisotope in the detection of a small number of atoms, which is significant in biological systems. This method is based on counting the number of photons of a definite polarization emitted in a definite time interval from the fluorescent molecules excited by pulsed light, which are bound to the marked large molecules found in a definite spatial region. Using the probability of finding a number of molecules oriented in a definite direction in the definite spatial region, the probability of counting a number of photons in a definite time interval can be calculated. Thus the measurable count rate of photons can be related to the fluctuation of molecular movement. The measurement was carried out under the condition in which the probability of the simultaneous arrival of more than two photons at a detector is less than 1/100. As the experimental results, the resolving power of the photon-counting apparatus and the frequency distribution of the number of photons of some definite polarization counted for 1 nanosecond are shown. In the solution, the variance of the number of molecules, 500 on average, is 1200, which was estimated from the experimental data by assuming a normal distribution. This departure from the Poisson distribution means that a certain correlation does exist in molecular movement. In solid solution, no significant deviation was observed. The correlation existing in molecular movement can be expressed in terms of the fluctuation of the number of molecules. (Nakai, Y.)
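The departure from Poisson statistics reported above (variance 1200 for a mean of 500 molecules) amounts to a variance-to-mean ratio greater than one; a minimal helper for that check (illustrative sketch, not the paper's analysis):

```python
def dispersion_index(samples):
    """Variance-to-mean ratio of a set of counts: close to 1 for
    Poisson statistics, larger when fluctuations are correlated."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / (n - 1)  # sample variance
    return var / mean
```

With the abstract's figures the index is 1200/500 = 2.4, signalling correlation in molecular movement in solution, while an index near 1, as in the solid-solution case, indicates no significant deviation from Poisson behaviour.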

  9. Noun Countability; Count Nouns and Non-count Nouns, What are the Syntactic Differences Between them?

    Directory of Open Access Journals (Sweden)

    Azhar A. Alkazwini

    2016-11-01

Words that function as the subjects of verbs, objects of verbs or prepositions, and which can have a plural form and possessive ending, are known as nouns. They are described as referring to persons, places, things, states, or qualities, and might also be used as attributive modifiers. In this paper, classes and subclasses of nouns shall be presented; then noun countability, branching into count and non-count nouns, shall be discussed. A number of examples illustrating differences between count and non-count nouns shall be discussed, including determiner-head co-occurrence restrictions of number and subject-verb agreement, in addition to some exceptions to this agreement rule. Also, the lexically inherent number in nouns and how inherently plural nouns are classified in terms of (+/- count) are illustrated. This research will discuss the partitive construction of count and non-count nouns and nouns as attributive modifiers, and finally conclude with the fact that there are syntactic differences between count and non-count nouns in the English language.

  10. Benjamin Thompson, Count Rumford Count Rumford on the nature of heat

    CERN Document Server

    Brown, Sanborn C

    1967-01-01

Men of Physics: Benjamin Thompson - Count Rumford: Count Rumford on the Nature of Heat covers the significant contributions of Count Rumford in the field of physics. Count Rumford was born with the name Benjamin Thompson on March 23, 1753, in Woburn, Massachusetts. This book is composed of two parts encompassing 11 chapters, and begins with a presentation of Benjamin Thompson's biography and his interest in physics, particularly as an advocate of an "anti-caloric" theory of heat. The subsequent chapters are devoted to his many discoveries that profoundly affected physical thought

  11. Automatic airline baggage counting using 3D image segmentation

    Science.gov (United States)

    Yin, Deyu; Gao, Qingji; Luo, Qijun

    2017-06-01

The number of bags needs to be checked automatically during baggage self-check-in. A fast airline baggage counting method is proposed in this paper, using image segmentation based on a height map projected from the scanned baggage 3D point cloud. There is a height drop at the actual edge of a bag, so edges can be detected by an edge-detection operator. Closed edge chains are then formed from edge lines linked by morphological processing. Finally, the number of connected regions segmented by the closed chains is taken as the baggage count. Multi-bag experiments performed under different placement modes prove the validity of the method.
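The final step of the pipeline, counting connected regions of a segmented height map, can be sketched with a plain flood fill (a toy grid stands in for the projected point cloud; the threshold and sizes are invented):

```python
def count_regions(mask):
    """Count 4-connected foreground regions in a binary grid -- the last
    step of the described pipeline, after edges have been closed."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                count += 1                      # found a new region
                stack = [(r, c)]
                seen[r][c] = True
                while stack:                    # flood-fill this region
                    y, x = stack.pop()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
    return count

# toy height map (metres above the belt); two separated bags
height_map = [
    [0.0, 0.4, 0.4, 0.0, 0.0, 0.3],
    [0.0, 0.4, 0.4, 0.0, 0.0, 0.3],
    [0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
]
mask = [[h > 0.1 for h in row] for row in height_map]
bags = count_regions(mask)
```

In the paper the regions come from closed edge chains rather than a simple threshold, but the counting step is the same.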

  12. Blood Count Tests: MedlinePlus Health Topic

    Science.gov (United States)

... WBC count (Medical Encyclopedia) Also in Spanish ... Related Health Topics: Bleeding Disorders; Blood; Laboratory Tests ...

  13. Development of counting system for wear measurements using Thin Layer Activation and the Wearing Apparatus

    Energy Technology Data Exchange (ETDEWEB)

    França, Michel de A.; Suita, Julio C.; Salgado, César M., E-mail: mchldante@gmail.com, E-mail: suita@ien.gov.br, E-mail: otero@ien.gov.br [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil)

    2017-07-01

This paper focuses on developing a counting system for the Wearing Apparatus, a device previously built to generate measurable wear on a given surface (Main Source) and to carry the filings from it to a filter (second source). Thin Layer Activation is a technique used to produce activity on one of the Wearing Apparatus' pieces; this activity is proportional to the amount of material worn, or scraped, from the piece's surface. Thus, by measuring the activity at those two points it is possible to measure the produced wear. The methodology used in this work is based on simulations with the MCNP-X code to find the best specifications for shielding, solid angles, detector dimensions and collimation for the counting system. By simulating several scenarios, each one different from the others, and analyzing the results in the form of counts per second, the ideal counting system specifications and geometry to measure the activity in the Main Source and the Filter (second source) are chosen. After that, a set of previously activated stainless steel foils was used to reproduce the conditions of the real experiment, which consists of using TLA and the Wearing Apparatus; the results demonstrate that the counting system and methodology are adequate for such experiments. (author)

  14. Mapping of renewable energies

    International Nuclear Information System (INIS)

    Boulanger, V.

    2013-01-01

    Germany is the champion of green energy in Europe: the contribution of renewable energies to electricity generation reached about 20% in 2011. This article describes the situation of renewable energies in Germany in 2011 with the help of 2 maps, the first one gives the installed electrical generation capacity for each region and for each renewable energy source (wind power, hydro-electricity, biomass, photovoltaic energy and biogas) and the second one details the total number of jobs (direct and indirect) for each renewable energy source and for each region. In 2011 about 372000 people worked in the renewable energy sector in Germany. (A.C.)

  15. It counts who counts: an experimental evaluation of the importance of observer effects on spotlight count estimates

    DEFF Research Database (Denmark)

    Sunde, Peter; Jessen, Lonnie

    2013-01-01

    observers with respect to their ability to detect and estimate distance to realistic animal silhouettes at different distances. Detection probabilities were higher for observers experienced in spotlighting mammals than for inexperienced observers, higher for observers with a hunting background compared...... with non-hunters and decreased as function of age but were independent of sex or educational background. If observer-specific detection probabilities were applied to real counting routes, point count estimates from inexperienced observers without a hunting background would only be 43 % (95 % CI, 39...

  16. Delta count-rate monitoring system

    International Nuclear Information System (INIS)

    Van Etten, D.; Olsen, W.A.

    1985-01-01

    A need for a more effective way to rapidly search for gamma-ray contamination over large areas led to the design and construction of a very sensitive gamma detection system. The delta count-rate monitoring system was installed in a four-wheel-drive van instrumented for environmental surveillance and accident response. The system consists of four main sections: (1) two scintillation detectors, (2) high-voltage power supply amplifier and single-channel analyzer, (3) delta count-rate monitor, and (4) count-rate meter and recorder. The van's 6.5-kW generator powers the standard nuclear instrument modular design system. The two detectors are mounted in the rear corners of the van and can be run singly or jointly. A solid-state bar-graph count-rate meter mounted on the dashboard can be read easily by both the driver and passenger. A solid-state strip chart recorder shows trends and provides a permanent record of the data. An audible alarm is sounded at the delta monitor and at the dashboard count-rate meter if a detected radiation level exceeds the set background level by a predetermined amount
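The alarm logic described, triggering when a detected radiation level exceeds the set background level by a predetermined amount, reduces to a one-line comparison; a hedged sketch with invented count rates:

```python
def delta_alarms(readings_cps, background_cps, delta_cps):
    """Indices of readings whose count rate exceeds the set background
    level by more than the predetermined delta, as the van-mounted
    monitor's audible alarm does."""
    return [i for i, r in enumerate(readings_cps)
            if r - background_cps > delta_cps]

# background 120 cps, alarm if a reading is more than 30 cps above it
alarms = delta_alarms([118, 125, 160, 122, 151], 120, 30)
```

In the real system the background level would be set from a quiet stretch of the survey route rather than fixed in advance.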

  17. Projective mapping

    DEFF Research Database (Denmark)

    Dehlholm, Christian; Brockhoff, Per B.; Bredie, Wender Laurentius Petrus

    2012-01-01

    by the practical testing environment. As a result of the changes, a reasonable assumption would be to question the consequences caused by the variations in method procedures. Here, the aim is to highlight the proven or hypothetic consequences of variations of Projective Mapping. Presented variations will include...... instructions and influence heavily the product placements and the descriptive vocabulary (Dehlholm et.al., 2012b). The type of assessors performing the method influences results with an extra aspect in Projective Mapping compared to more analytical tests, as the given spontaneous perceptions are much dependent......Projective Mapping (Risvik et.al., 1994) and its Napping (Pagès, 2003) variations have become increasingly popular in the sensory field for rapid collection of spontaneous product perceptions. It has been applied in variations which sometimes are caused by the purpose of the analysis and sometimes...

  18. Secure count query on encrypted genomic data.

    Science.gov (United States)

    Hasan, Mohammad Zahidul; Mahdi, Md Safiur Rahman; Sadat, Md Nazmus; Mohammed, Noman

    2018-05-01

    Human genomic information can yield more effective healthcare by guiding medical decisions. Therefore, genomics research is gaining popularity as it can identify potential correlations between a disease and a certain gene, which improves the safety and efficacy of drug treatment and can also develop more effective prevention strategies [1]. To reduce the sampling error and to increase the statistical accuracy of this type of research projects, data from different sources need to be brought together since a single organization does not necessarily possess required amount of data. In this case, data sharing among multiple organizations must satisfy strict policies (for instance, HIPAA and PIPEDA) that have been enforced to regulate privacy-sensitive data sharing. Storage and computation on the shared data can be outsourced to a third party cloud service provider, equipped with enormous storage and computation resources. However, outsourcing data to a third party is associated with a potential risk of privacy violation of the participants, whose genomic sequence or clinical profile is used in these studies. In this article, we propose a method for secure sharing and computation on genomic data in a semi-honest cloud server. In particular, there are two main contributions. Firstly, the proposed method can handle biomedical data containing both genotype and phenotype. Secondly, our proposed index tree scheme reduces the computational overhead significantly for executing secure count query operation. In our proposed method, the confidentiality of shared data is ensured through encryption, while making the entire computation process efficient and scalable for cutting-edge biomedical applications. We evaluated our proposed method in terms of efficiency on a database of Single-Nucleotide Polymorphism (SNP) sequences, and experimental results demonstrate that the execution time for a query of 50 SNPs in a database of 50,000 records is approximately 5 s, where each record

  19. Affective Maps

    DEFF Research Database (Denmark)

    Salovaara-Moring, Inka

    . In particular, mapping environmental damage, endangered species, and human made disasters has become one of the focal point of affective knowledge production. These ‘more-than-humangeographies’ practices include notions of species, space and territory, and movement towards a new political ecology. This type...... of digital cartographies has been highlighted as the ‘processual turn’ in critical cartography, whereas in related computational journalism it can be seen as an interactive and iterative process of mapping complex and fragile ecological developments. This paper looks at computer-assisted cartography as part...

  20. Modeling and Mapping of Human Source Data

    Science.gov (United States)

    2011-03-08

Fusion Workshop hosted by the Center for Multisource Information Fusion, University at Buffalo/CUBRC, October 2008 • The general idea is to conduct... Information Fusion Workshop (via the Center for Multisource Information Fusion, University at Buffalo/CUBRC) in October 2008, at which 40 scientists from

  1. BMI in relation to sperm count

    DEFF Research Database (Denmark)

    Sermondade, N; Faure, C; Fezeu, L

    2013-01-01

    BACKGROUND The global obesity epidemic has paralleled a decrease in semen quality. Yet, the association between obesity and sperm parameters remains controversial. The purpose of this report was to update the evidence on the association between BMI and sperm count through a systematic review...... with meta-analysis. METHODS A systematic review of available literature (with no language restriction) was performed to investigate the impact of BMI on sperm count. Relevant studies published until June 2012 were identified from a Pubmed and EMBASE search. We also included unpublished data (n = 717 men...... studies were included in the meta-analysis, resulting in a sample of 13 077 men from the general population and attending fertility clinics. Data were stratified according to the total sperm count as normozoospermia, oligozoospermia and azoospermia. Standardized weighted mean differences in sperm...

  2. Multiple sample, radioactive particle counting apparatus

    International Nuclear Information System (INIS)

    Reddy, R.R.V.; Kelso, D.M.

    1978-01-01

An apparatus is described for determining the radioactive particle count emitted from each of a number of radioactive samples. It includes means for modulating the information on the radioactive particles being emitted from the samples, coded detecting means for sequentially detecting different respective coded combinations of the radioactive particles emitted from more than one but less than all of the samples, and means for processing the modulated information to derive the sample count for each sample. It includes a single light-emitting crystal next to a number of samples and an encoder belt sequentially movable between the crystal and the samples. The encoder belt has a coded array of apertures to provide correspondingly modulated light pulses from the crystal, and a photomultiplier tube converts the modulated light pulses to decodable electrical signals for deriving the respective sample counts
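One way such coded combination measurements could be decoded is by solving the linear system defined by the aperture pattern; a toy sketch (the 3×3 code and the counts are invented, not the patent's actual encoding):

```python
def solve(a, b):
    """Gaussian elimination with partial pivoting, adequate for the
    small dense systems a coded-aperture measurement produces."""
    n = len(a)
    m = [row[:] + [bv] for row, bv in zip(a, b)]   # augmented matrix
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):                 # back substitution
        x[r] = (m[r][n] - sum(m[r][c] * x[c]
                              for c in range(r + 1, n))) / m[r][r]
    return x

# each row says which samples were exposed during that reading
code = [[1, 1, 0],
        [0, 1, 1],
        [1, 0, 1]]
true_counts = [10.0, 20.0, 30.0]                   # hypothetical per-sample counts
readings = [sum(c * t for c, t in zip(row, true_counts)) for row in code]
recovered = solve(code, readings)
```

This is the same recover-sources-from-combined-counts idea as the matrix-inversion gamma-mapping method, shrunk to a three-sample toy.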

  3. CERN_DxCTA counting mode chip

    CERN Document Server

    Moraes, D; Nygård, E

    2008-01-01

    This ASIC is a counting mode front-end electronic optimized for the readout of CdZnTe/CdTe and silicon sensors, for possible use in applications where the flux of ionizing radiation is high. The chip is implemented in 0.25 μm CMOS technology. The circuit comprises 128 channels equipped with a transimpedance amplifier followed by a gain shaper stage with 21 ns peaking time, two discriminators and two 18-bit counters. The channel architecture is optimized for the detector characteristics in order to achieve the best energy resolution at counting rates of up to 5 M counts/second. The amplifier shows a linear sensitivity of 118 mV/fC and an equivalent noise charge of about 711 e−, for a detector capacitance of 5 pF. Complete evaluation of the circuit is presented using electronic pulses and pixel detectors.

  4. CERNDxCTA counting mode chip

    International Nuclear Information System (INIS)

    Moraes, D.; Kaplon, J.; Nygard, E.

    2008-01-01

This ASIC is a counting mode front-end electronic optimized for the readout of CdZnTe/CdTe and silicon sensors, for possible use in applications where the flux of ionizing radiation is high. The chip is implemented in 0.25 μm CMOS technology. The circuit comprises 128 channels equipped with a transimpedance amplifier followed by a gain shaper stage with 21 ns peaking time, two discriminators and two 18-bit counters. The channel architecture is optimized for the detector characteristics in order to achieve the best energy resolution at counting rates of up to 5 M counts/second. The amplifier shows a linear sensitivity of 118 mV/fC and an equivalent noise charge of about 711 e−, for a detector capacitance of 5 pF. Complete evaluation of the circuit is presented using electronic pulses and pixel detectors

  5. Detection limits for radioanalytical counting techniques

    International Nuclear Information System (INIS)

    Hartwell, J.K.

    1975-06-01

    In low-level radioanalysis it is usually necessary to test the sample net counts against some ''Critical Level'' in order to determine if a given result indicates detection. This is an interpretive review of the work by Nicholson (1963), Currie (1968) and Gilbert (1974). Nicholson's evaluation of three different computational formulas for estimation of the ''Critical Level'' is discussed. The details of Nicholson's evaluation are presented along with a basic discussion of the testing procedures used. Recommendations are presented for calculation of confidence intervals, for reporting of analytical results, and for extension of the derived formula to more complex cases such as multiple background counts, multiple use of a single background count, and gamma spectrometric analysis
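Currie's (1968) formulas reviewed here give, for a paired blank with B background counts, a critical level L_C = k√(2B) and an a-priori detection limit L_D = k² + 2L_C; a small sketch (k = 1.645 corresponds to 5 % false-positive and false-negative rates):

```python
import math

def critical_level(background_counts, k=1.645):
    """Currie's critical level L_C for a paired blank: the net-count
    threshold above which a result is reported as 'detected'."""
    return k * math.sqrt(2.0 * background_counts)

def detection_limit(background_counts, k=1.645):
    """Currie's a-priori detection limit L_D = k^2 + 2 L_C: the true
    net signal that will be detected with probability 1 - beta."""
    return k * k + 2.0 * critical_level(background_counts, k)
```

For k = 1.645 these reduce to the familiar L_C ≈ 2.33√B and L_D ≈ 2.71 + 4.65√B; extensions to multiple background counts or spectrometric analysis, as discussed in the review, replace 2B by the appropriate blank variance.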

  6. Neutron counting and gamma spectroscopy with PVT detectors

    International Nuclear Information System (INIS)

    Mitchell, Dean James; Brusseau, Charles A.

    2011-01-01

Radiation portals normally incorporate a dedicated neutron counter and a gamma-ray detector with at least some spectroscopic capability. This paper describes the design and presents characterization data for a detection system called PVT-NG, which uses large polyvinyl toluene (PVT) detectors to monitor both types of radiation. The detector material is surrounded by polyvinyl chloride (PVC), which emits high-energy gamma rays following neutron capture reactions. Assessments based on high-energy gamma rays are well suited for the detection of neutron sources, particularly in border security applications, because few isotopes in the normal stream of commerce have significant gamma ray yields above 3 MeV. Therefore, an increased count rate for high-energy gamma rays is a strong indicator for the presence of a neutron source. The sensitivity of the PVT-NG sensor to bare 252 Cf is 1.9 counts per second per nanogram (cps/ng) and the sensitivity for 252 Cf surrounded by 2.5 cm of polyethylene is 2.3 cps/ng. The PVT-NG sensor is a proof-of-principle sensor that was not fully optimized. The neutron detector sensitivity could be improved, for instance, by using additional moderator. The PVT-NG detectors and associated electronics are designed to provide improved resolution, gain stability, and performance at high count rates relative to PVT detectors in typical radiation portals. As well as addressing the needs for neutron detection, these characteristics are also desirable for analysis of the gamma-ray spectra. Accurate isotope identification results were obtained despite the common impression that the absence of photopeaks makes data collected by PVT detectors unsuitable for spectroscopic analysis. The PVT detectors in the PVT-NG unit are used for both gamma-ray and neutron detection, so the sensitive volume exceeds the volume of the detection elements in portals that use dedicated components to detect each type of radiation.

  7. Effects of plutonium redistribution on lung counting

    International Nuclear Information System (INIS)

    Swinth, K.L.

    1976-01-01

    Early counts of Pu deposition in the lungs will tend to overestimate lung content, since calibrations are performed with a uniform distribution and a more favorable geometry exists in contaminated subjects because the activity is closer to the periphery of the lungs. Although concentration in the outer regions of the lungs continues, as evidenced by autopsy studies, counts based on L X-rays will probably underestimate the lung content because, simplistically, the geometry several years after exposure consists of a spherical shell with a point of activity at its center. This point of activity represents concentration in the lymph nodes, from which the 60 keV gamma of 241 Am will be counted but from which few of the L X-rays will be counted (an example of interorgan distribution). When a correction is made to the L X-ray intensity, the lymph node contribution will tend to increase the amount subtracted while correcting for 241 Am X-rays. It is doubtful that the relative increase in X-ray intensity from concentration in the pleural and sub-pleural regions will compensate for this effect. This will make the plutonium burden appear to vanish while the 241 Am can still be detected. This effect has been observed in a case where counts with an intraesophageal probe indicated a substantial lymph node burden. To improve the accuracy of in vivo plutonium measurements, an improved understanding of pulmonary distribution and of distribution effects on in vivo counting is required.

  8. Count-to-count time interval distribution analysis in a fast reactor

    International Nuclear Information System (INIS)

    Perez-Navarro Gomez, A.

    1973-01-01

    The most important kinetic parameters have been measured at the zero-power fast reactor CORAL-I by means of reactor noise analysis in the time domain, using measurements of count-to-count time intervals. (Author) 69 refs
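
    Count-to-count interval analysis of the kind used at CORAL-I starts from a list of detector timestamps and histograms the intervals between successive counts. The sketch below uses synthetic Poisson-distributed counts, for which the interval distribution is exponential; correlated fission chains in a real reactor would show an excess at short intervals. The rate and binning are illustrative assumptions.

    ```python
    import random

    def interval_histogram(timestamps, bin_width, n_bins):
        """Histogram of time intervals between successive counts."""
        ts = sorted(timestamps)
        hist = [0] * n_bins
        for a, b in zip(ts, ts[1:]):
            idx = int((b - a) / bin_width)
            if idx < n_bins:
                hist[idx] += 1
        return hist

    # Synthetic, purely random (Poisson) count sequence: successive
    # intervals are exponential, so the histogram should fall off
    # monotonically; chain-correlated counts would add a short-interval excess.
    random.seed(1)
    rate = 100.0  # counts per second, assumed
    t, times = 0.0, []
    for _ in range(10000):
        t += random.expovariate(rate)
        times.append(t)

    hist = interval_histogram(times, bin_width=0.002, n_bins=10)
    ```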

  9. Compositions and process for liquid scintillation counting

    International Nuclear Information System (INIS)

    1974-01-01

    Liquid scintillation counting compositions which include certain polyethoxylated poly(oxypropylene) emulsifiers allow stable dispersion of aqueous or other samples merely by shaking. Preferred are mixtures of such emulsifiers which give homogeneous monophasic-appearing dispersions over wide ranges of temperature and aqueous sample content. Certain of these emulsifiers, without being mixed, are of particular advantage when used in analysis of samples obtained through radioimmunoassay techniques, which are extremely difficult to disperse. Certain of these emulsifiers, also without being mixed, uniformly give homogeneous monophasic-appearing aqueous counting samples over much wider ranges of aqueous sample content and temperature than prior sample emulsifiers.

  10. Counting dyons in N=4 string theory

    CERN Document Server

    Dijkgraaf, R; Verlinde, Herman L

    1997-01-01

    We present a microscopic index formula for the degeneracy of dyons in four-dimensional N=4 string theory. This counting formula is manifestly symmetric under the duality group, and its asymptotic growth reproduces the macroscopic Bekenstein-Hawking entropy. We give a derivation of this result in terms of the type II five-brane compactified on K3, by assuming that its fluctuations are described by a closed string theory on its world-volume. We find that the degeneracies are given in terms of the denominator of a generalized super Kac-Moody algebra. We also discuss the correspondence of this result with the counting of D-brane states.

  11. Energetic map

    International Nuclear Information System (INIS)

    2012-01-01

    This report presents the energy map of Uruguay, as well as the different systems that delimit political frontiers in the region. The importance of the electrical system rests on the study of the potential of electricity, oil and derivatives, natural gas, biofuels, and wind and solar energy.

  12. Necklace maps

    NARCIS (Netherlands)

    Speckmann, B.; Verbeek, K.A.B.

    2010-01-01

    Statistical data associated with geographic regions is nowadays globally available in large amounts and hence automated methods to visually display these data are in high demand. There are several well-established thematic map types for quantitative data on the ratio-scale associated with regions:

  13. Participatory maps

    DEFF Research Database (Denmark)

    Salovaara-Moring, Inka

    towards a new political ecology. This type of digital cartographies has been highlighted as the ‘processual turn’ in critical cartography, whereas in related computational journalism it can be seen as an interactive and iterative process of mapping complex and fragile ecological developments. This paper...

  14. Isospectral discrete and quantum graphs with the same flip counts and nodal counts

    Science.gov (United States)

    Juul, Jonas S.; Joyner, Christopher H.

    2018-06-01

    The existence of non-isomorphic graphs which share the same Laplace spectrum (to be referred to as isospectral graphs) leads naturally to the following question: what additional information is required in order to resolve isospectral graphs? It was suggested by Band, Shapira and Smilansky that this might be achieved by either counting the number of nodal domains or the number of times the eigenfunctions change sign (the so-called flip count) (Band et al 2006 J. Phys. A: Math. Gen. 39 13999–4014 Band and Smilansky 2007 Eur. Phys. J. Spec. Top. 145 171–9). Recent examples of (discrete) isospectral graphs with the same flip count and nodal count have been constructed by Ammann by utilising Godsil–McKay switching (Ammann private communication). Here, we provide a simple alternative mechanism that produces systematic examples of both discrete and quantum isospectral graphs with the same flip and nodal counts.
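
    For a discrete graph, the two quantities compared here are straightforward to compute from a Laplacian eigenvector: the flip count is the number of edges across which the eigenvector changes sign, and the nodal domain count is the number of connected same-sign components. A minimal sketch on a toy path graph (not one of the isospectral examples from the paper):

    ```python
    import numpy as np

    def laplacian(n_vertices, edges):
        """Combinatorial Laplacian L = D - A of an undirected graph."""
        L = np.zeros((n_vertices, n_vertices))
        for u, v in edges:
            L[u, u] += 1
            L[v, v] += 1
            L[u, v] -= 1
            L[v, u] -= 1
        return L

    def flip_count(vec, edges):
        """Number of edges across which the eigenvector changes sign."""
        return sum(1 for u, v in edges if vec[u] * vec[v] < 0)

    def nodal_domain_count(vec, edges, n_vertices):
        """Connected components of the subgraph on same-sign edges
        (union-find with path compression)."""
        parent = list(range(n_vertices))
        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]
                x = parent[x]
            return x
        for u, v in edges:
            if vec[u] * vec[v] > 0:
                parent[find(u)] = find(v)
        return len({find(i) for i in range(n_vertices)})

    # Path graph on 4 vertices as a toy example.
    edges = [(0, 1), (1, 2), (2, 3)]
    L = laplacian(4, edges)
    vals, vecs = np.linalg.eigh(L)  # eigenvalues in ascending order
    v1 = vecs[:, 1]  # first excited eigenvector of the path
    ```

    For the path, the first excited eigenvector changes sign exactly once, giving a flip count of 1 and two nodal domains; the counts are unchanged if the eigenvector is negated.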

  15. Map of the Physical Sciences

    Energy Technology Data Exchange (ETDEWEB)

    Boyack, Kevin W.

    1999-07-02

    Various efforts to map the structure of science have been undertaken over the years. Using a new tool, VxInsight™, we have mapped and displayed 3000 journals in the physical sciences. This map is navigable and interactively reveals the structure of science at many different levels. Science mapping studies are typically focused at either the macro- or micro-level. At a macro-level such studies seek to determine the basic structural units of science and their interrelationships. The majority of studies are performed at the discipline or specialty level, and seek to inform science policy and technical decision makers. Studies at both levels probe the dynamic nature of science, and the implications of the changes. A variety of databases and methods have been used for these studies. Primary among databases are the citation indices (SCI and SSCI) from the Institute for Scientific Information, which have gained widespread acceptance for bibliometric studies. Maps are most often based on computed similarities between journal articles (co-citation), keywords or topics (co-occurrence or co-classification), or journals (journal-journal citation counts). Once the similarity matrix is defined, algorithms are used to cluster the data.
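
    The journal-journal citation approach mentioned above reduces to building a similarity matrix from citation count vectors before clustering. A minimal sketch using cosine similarity on an invented 4-journal citation matrix (the toy data, and the choice of cosine similarity, are illustrative assumptions, not details of the VxInsight study):

    ```python
    import numpy as np

    # Toy journal-journal citation counts: rows cite columns.
    # Journals 0 and 1 both cite journal 2 heavily (same field);
    # journal 3 has a different citation profile.
    cites = np.array([
        [0,  2, 50,  0],
        [3,  0, 45,  0],
        [10, 10, 0,  1],
        [0, 40,  1,  0],
    ], dtype=float)

    # Cosine similarity between the journals' outgoing citation profiles.
    norms = np.linalg.norm(cites, axis=1, keepdims=True)
    profiles = cites / norms
    sim = profiles @ profiles.T  # sim[i, j] in [0, 1], sim[i, i] = 1
    ```

    In a real mapping study the similarity matrix would then be fed to a clustering or layout algorithm; here the matrix alone already shows that journals 0 and 1 are far more similar to each other than to journal 3.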

  16. Nevada Kids Count Data Book, 1997.

    Science.gov (United States)

    We Can, Inc., Las Vegas, NV.

    This Kids Count data book is the first to examine statewide indicators of the well being of Nevada's children. The statistical portrait is based on 15 indicators of child well being: (1) percent low birth-weight babies; (2) infant mortality rate; (3) percent of children in poverty; (4) percent of children in single-parent families; (5) percent of…

  17. Isospectral graphs with identical nodal counts

    International Nuclear Information System (INIS)

    Oren, Idan; Band, Ram

    2012-01-01

    According to a recent conjecture, isospectral objects have different nodal count sequences (Gnutzmann et al 2005 J. Phys. A: Math. Gen. 38 8921–33). We study generalized Laplacians on discrete graphs, and use them to construct the first non-trivial counterexamples to this conjecture. In addition, these examples demonstrate a surprising connection between isospectral discrete and quantum graphs. (paper)

  18. Phase-space quark counting rule

    Energy Technology Data Exchange (ETDEWEB)

    Wei-Gin, Chao; Lo, Shui-Yin [Academia Sinica, Beijing (China). Inst. of High Energy Physics

    1981-05-21

    A simple quark counting rule based on phase-space considerations suggested previously is used to fit all 39 recent experimental data points on inclusive reactions. Parameter-free relations are found to agree with experiments. Excellent detailed fits are obtained for 11 inclusive reactions.

  19. Statistical tests to compare motif count exceptionalities

    Directory of Open Access Journals (Sweden)

    Vandewalle Vincent

    2007-03-01

    Background: Finding over- or under-represented motifs in biological sequences is now a common task in genomics. Thanks to p-value calculation for motif counts, exceptional motifs are identified and represent candidate functional motifs. The present work addresses the related question of comparing the exceptionality of one motif in two different sequences. Simply comparing the motif count p-values in each sequence is not sufficient to decide whether this motif is significantly more exceptional in one sequence than in the other; a statistical test is required. Results: We develop and analyze two statistical tests, an exact binomial one and an asymptotic likelihood ratio test, to decide whether the exceptionality of a given motif is equivalent or significantly different in two sequences of interest. For that purpose, motif occurrences are modeled by Poisson processes, with special care for overlapping motifs. Both tests can take the sequence compositions into account. As an illustration, we compare the octamer exceptionalities in the Escherichia coli K-12 backbone versus variable strain-specific loops. Conclusion: The exact binomial test is particularly adapted for small counts. For large counts, we advise using the likelihood ratio test, which is asymptotic but strongly correlated with the exact binomial test and very simple to use.
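
    The exact binomial test described in the abstract conditions on the total count: if the motif counts N1 and N2 in the two sequences are modeled as Poisson with expectations E1 and E2, then under the null hypothesis of equal exceptionality, N1 given N1 + N2 = n is Binomial(n, E1/(E1+E2)). A minimal one-sided sketch (the paper's actual test handles overlapping motifs and sequence composition, which this toy version does not; the counts and expectations below are hypothetical):

    ```python
    import math

    def binom_pmf(k, n, p):
        """Binomial probability mass function."""
        return math.comb(n, k) * p**k * (1 - p)**(n - k)

    def exact_binomial_test(n1, e1, n2, e2):
        """One-sided exact p-value that the motif is more exceptional in
        sequence 1: conditional on n = n1 + n2, n1 ~ Binomial(n, p) with
        p = e1 / (e1 + e2) under the null of equal exceptionality."""
        n = n1 + n2
        p = e1 / (e1 + e2)
        return sum(binom_pmf(k, n, p) for k in range(n1, n + 1))  # P(X >= n1)

    # Hypothetical counts and Poisson expectations, not from the paper:
    # the motif appears 30 times in sequence 1 and 12 times in sequence 2,
    # with equal expected counts of 10 in both.
    pval = exact_binomial_test(n1=30, e1=10.0, n2=12, e2=10.0)
    ```

    A small p-value here indicates the motif is significantly more exceptional in sequence 1 than in sequence 2, which is exactly the comparison the abstract says raw per-sequence p-values cannot settle.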

  20. Photon counting with small pore microchannel plates

    International Nuclear Information System (INIS)

    Martindale, A.; Lapington, J.S.; Fraser, G.W.

    2007-01-01

    We describe the operation of microchannel plates (MCPs) with 3.2 μm diameter channels as photon-counting detectors of soft X-rays. Gain and temporal resolution measurements are compared with theoretical scaling laws for channel diameter. A minimum pulse width of 264 ps is observed for a two-stage multiplier at a total bias voltage of ∼1930 V